
Exploring the J2ME Mobile Media APIs

As J2ME gains prominence, the range of things that developers can do with Java on devices is expanding. This month I'll cover the Mobile Media API (JSR-135), which is an optional J2ME package. The Mobile Media APIs enable playing and recording audio, taking pictures, playing videos, and more.

The Mobile Media APIs find their way onto devices through several different means. They are designed to be compatible with both the CDC and CLDC configurations, which makes them available to any J2ME device regardless of whether the application profile is MIDP, Personal Profile, or PDA Profile. The Mobile Media APIs are also included as part of the Java Technology for the Wireless Industry (JTWI) specification (JSR-185), which serves to unify a number of related specifications around MIDP 2.0.

In either case the Mobile Media APIs are optional. A device can become JTWI-certified without support for the Mobile Media APIs. Furthermore, a device may choose to support a subset of the Mobile Media API capabilities.

What Optional Means in the Mobile Media APIs
The optional nature of the Mobile Media APIs is based on the reality that different devices have different needs and capabilities. It is unrealistic to require every device to support the entire Mobile Media API specification. Allowing many of the Mobile Media features to be optional allows implementers to better support the features that make sense. However, if a device supports a multimedia feature, such as audio capture, the implementation must follow the Mobile Media API specification. For application developers this means that although multimedia support will vary between devices, if a feature is supported it will be done in a standard way.

Mobile Media APIs and MIDP 2.0
There is a strong relationship between the Mobile Media API specification and the MIDP 2.0 specification. This is to provide compatibility between the media capabilities of MIDP 2.0 devices and devices that include the more advanced capabilities of the Mobile Media APIs. MIDP 2.0 contains a direct subset of the Mobile Media APIs in order to provide capabilities such as audio playback and tone generation. Table 1 lists features of the Mobile Media APIs included by MIDP 2.0.

Table 1. Subset of Mobile Media APIs included in MIDP 2.0.

  • Tone generation
  • Audio playback
  • Media flow controls
  • Volume control
  • Ability to query for supported media features

The Mobile Media API specification expands on the MIDP 2.0 subset to include features listed in Table 2.

Table 2. Mobile Media API features that expand on features also defined in MIDP 2.0.

  • Audio capture
  • Video capture and playback
  • Custom protocol support (i.e., you can define your own data sources)
  • Richer sets of content and protocol types

The relationship of the Mobile Media APIs and MIDP 2.0 is unique in that other application specifications, such as the Personal Profile, do not include a subset of Mobile Media capabilities.

Mobile Media Basics
So now that I’ve covered how the Mobile Media APIs may find their way onto devices, I’ll dig into what the APIs are composed of and what developers can do with them. Regardless of whether you are using MIDP 2.0 features or the more advanced features of the Mobile Media API specification, the basic behavior and functionality for controlling media is the same. The media framework comprises the following participants:

  • Manager
  • Player
  • Control

The Manager class is a factory used to create Players. Player itself is an interface. The underlying implementation of a player is specific to the content type being requested. For example, if the requested media is in audio WAV format, a WAV Player implementation is returned to handle the content. If the media format is MPEG, a different Player is returned to handle the specifics of MPEG media.

Once a Player is created, an application can request Control objects that encapsulate behaviors to control the media playback, capture, and so forth. The Control objects available vary depending on the media supported by the player. Figure 1 shows how the Manager, Player, and Control classes relate.

Figure 1. Manager, Player, Control: Relationship of the framework classes and interfaces used to manage multimedia using the Mobile Media APIs.

Simple Audio Playback Example
Before delving into more concepts on controlling media I’ll write some code that makes use of the core framework. The following example uses the Manager to create a Player that handles WAV audio format. The Manager.createPlayer() method determines the type of player to return by parsing the media locator string passed as a parameter. In this case, the player must read the WAV media using the HTTP network protocol. Once the Player instance is created, Player.start() is called to play the media on the device.

try {
  String wav = "http://java.sun.com/products/java-" +
    "media/mma/media/test-wav.wav";
  Player p = Manager.createPlayer(wav);
  p.start();
} catch (IOException x) {
  x.printStackTrace();
} catch (MediaException x) {
  x.printStackTrace();
}

Player Lifecycle
Regardless of the protocol or media type involved, the Player moves through the same discrete states during its lifecycle. These states are listed in Table 3.

Table 3. Lifecycle states of a Player instance.

  • Unrealized: The initial state when a Player is created. In this state the player does not have enough information to acquire the resources needed to process the media.
  • Realized: The Player moves into the Realized state once it has obtained the information necessary to acquire its resources. In this state most of the resources it needs to function have likely been acquired. However, some resources may not have been acquired at this point, especially where system dependencies are involved, such as an audio or video driver to which exclusive access must be obtained.
  • Prefetched: The Player moves into the Prefetched state once all resources have been acquired, including scarce and system-dependent resources. Once in the Prefetched state, the Player has everything necessary to perform its tasks.
  • Started: A Player in the Started state is processing the content associated with it.
  • Closed: A Player moves to the Closed state at the end of its lifecycle. A Player in the Closed state must not be used again.

The Player interface contains methods to explicitly move the Player through these various lifecycle states. However, in some cases, it is not necessary to call each one explicitly. In the previous code example, the player was created using Manager.createPlayer(), putting the player into the Unrealized state. The next line of code called Player.start(), which implicitly moved the player through the Realized and Prefetched states. The lifecycle methods of the Player interface are described in Table 4.

Table 4. Lifecycle methods of the Player interface.

  • realize(): Explicitly moves the player from the Unrealized state to the Realized state.
  • prefetch(): Explicitly moves the player from the Realized state to the Prefetched state.
  • start(): Tells the player to start processing the media. Moves the Player to the Started state.
  • stop(): Tells a player in the Started state to pause. Moves the player from the Started state back to Prefetched.
  • close(): Moves a player to the Closed state. This method may be called from any of the other states.
  • deallocate(): Tells the player to release scarce or exclusive resources (the types of resources usually acquired by a call to prefetch()).

Many of the Player methods listed in Table 4 are valid only when the player is in certain states. For example, calling stop() when the player is in the Realized state makes no sense. In some cases the method call may be ignored; in other cases an IllegalStateException is thrown. State changes can be monitored by adding a PlayerListener to a player. A PlayerListener receives state-change events as the Player moves through its lifecycle. Figure 2 shows a state diagram of the Player lifecycle.
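Lifecycle events arrive at PlayerListener.playerUpdate(Player player, String event, Object eventData) as plain strings; the PlayerListener constants STARTED, STOPPED, and CLOSED are defined as the strings "started", "stopped", and "closed". As a sketch, the hypothetical tracker below shows the kind of string-comparison logic a playerUpdate() implementation might delegate to:

```java
// Hypothetical helper illustrating how lifecycle events delivered to
// PlayerListener.playerUpdate() can be filtered. The event strings
// match the PlayerListener constants: STARTED is "started", STOPPED
// is "stopped", and CLOSED is "closed".
class LifecycleTracker {
    private String lastEvent = "none";

    // Called from playerUpdate(); remembers only lifecycle events and
    // ignores others such as "endOfMedia" or "volumeChanged".
    public void onEvent(String event) {
        if ("started".equals(event)
                || "stopped".equals(event)
                || "closed".equals(event)) {
            lastEvent = event;
        }
    }

    public String getLastEvent() {
        return lastEvent;
    }
}
```

A real application would forward events to such a helper from a PlayerListener registered with player.addPlayerListener().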

Figure 2. Life of a Player: The state diagram shows the Player lifecycle.

More Media Examples
There are a number of ways to interact with the player and media. The following examples demonstrate some of these options.

Media can be stored as a resource within the JAR file. To access media bundled into the JAR file do something like the following:

InputStream is =
  getClass().getResourceAsStream("mymusic.mp3");
Player p = Manager.createPlayer(is, "audio/mpeg");
p.start();

Media can also be stored in a RecordStore and accessed by doing something like this:

InputStream is = new ByteArrayInputStream(
  myRecordStore.getRecord(id));
Player player = Manager.createPlayer(is, "audio/mpeg");
player.start();

Simple tone generation is also possible. The Manager class provides a convenience method, Manager.playTone(), that generates tones without the need to explicitly create a Player. This method takes three parameters: the note, which is an integer from 0 to 127; the duration in milliseconds for which the tone should be generated; and the volume to use when generating the tone. The following example generates a sequence of random notes.

try {
  for (int ccnt = 0; ccnt < 100; ccnt++) {
    int note = (int) (System.currentTimeMillis() % 127);
    Manager.playTone(note, 100, 100);
    // let the clock move ahead a bit
    Thread.sleep(1);
  }
} catch (MediaException x) {
  x.printStackTrace();
} catch (InterruptedException x) {
  x.printStackTrace();
}

Adding Control
Up to this point I've focused mostly on the Player and Manager. However, for each type of media there may be capabilities for controlling the flow and behavior of how the content is processed. To provide more content-specific behaviors, players make use of the Control interface. For example, if the media is a video, there may be a need to control the size of the display. Or, if an application is capturing audio, there is a need to start recording and pipe the input to memory or a RecordStore. Control instances can be obtained by calling the factory method Player.getControl() and passing in a control type. The control type parameter is the class name of the control to be created. If the control type parameter is not a fully qualified class name, the package javax.microedition.media.control is assumed.
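The name resolution just described is easy to sketch in plain Java. The helper below is hypothetical (the Mobile Media implementation performs this resolution internally), but it captures the rule:

```java
// Hypothetical illustration of how a control type passed to
// Player.getControl() resolves to a class name: names without a
// package are assumed to live in javax.microedition.media.control.
class ControlNames {
    public static String qualify(String controlType) {
        if (controlType.indexOf('.') >= 0) {
            return controlType; // already fully qualified
        }
        return "javax.microedition.media.control." + controlType;
    }
}
```

In other words, getControl("VideoControl") and getControl("javax.microedition.media.control.VideoControl") request the same control.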

There are two standard controls defined as part of the MIDP 2.0 Mobile Media subset: ToneControl, which is used to define a sequence of tones, and VolumeControl, which can be used to control the volume as the media is processed. Within the Mobile Media APIs many more Control interfaces are defined, such as VideoControl, TempoControl, RecordControl, and MIDIControl, to name a few. Implementers can also provide additional controls as needed.

Playing More Complex Media
Playing audio can be fun, but it's pretty basic. So far, all the code examples shown are supported within the MIDP 2.0 specification. Now I'll show you something only the Mobile Media API specification can provide. The next example demonstrates how to play video on a device. For the most part, showing video is similar to audio playback. The primary difference is that you need something to display the video on. This is where the Control interface comes into play.

String url = "http://java.sun.com/" +
  "products/java-media/mma/media/test-mpeg.mpg";
Player p = Manager.createPlayer(url);
// must do this before getting the control
p.realize();
// get the video controller
VideoControl video = (VideoControl) p.getControl("VideoControl");
// get a GUI item to display the video
Item videoItem = (Item) video.initDisplayMode(
  VideoControl.USE_GUI_PRIMITIVE, null);
// append the GUI item to a form
videoForm.append(videoItem);
// start the video
p.start();

This code retrieves an MPEG video using HTTP, obtains an Item from the VideoControl that can be placed onto a Form, and runs the video.

The Mobile Media APIs provide a rich and extensible framework for playing and capturing media content on mobile devices. I am only able to skim the surface in this article; however, there are interfaces and protocols defined for doing a lot of interesting things in your applications such as real-time streaming of Internet radio content, streaming other RTP (Real-time Transport Protocol) content, capturing audio, and taking pictures. However, an application can only do these things if the underlying content types and protocols are supported. To find out what is supported on a device, the Mobile Media APIs provide some query mechanisms to help developers discover capabilities and limitations of a device.

Knowing What Is Available
Let's face it: While the flexibility and power provided by the Mobile Media API is exciting, this excitement is tempered by the realization that support for Mobile Media features is likely to vary from one device to another. Even applications that use only the media features in the MIDP 2.0 specification must be able to adapt to the absence of certain media features and capabilities. For example, while audio playback may be supported, an application cannot assume that a device supports the WAV, MP3, and AU formats; the specification does not require any specific audio format. Furthermore, an application cannot assume that, just because the WAV format is supported, HTTP can be used to stream or download the content.

To address these problems, applications can gain insight into the supported capabilities of a device by calling a couple of methods on the Manager class:

  • getSupportedContentTypes(), which accepts a protocol identifier, such as "http", and returns a list of the content types supported over that protocol.
  • getSupportedProtocols(), which accepts a content type identifier, such as "audio/mpeg", and returns a list of the protocols (such as http) that can deliver that content type.
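Both methods return a String array, so checking for a particular format before attempting playback is a simple search. The helper below is a hypothetical sketch of that check:

```java
// Hypothetical helper: checks whether a value appears in the array
// returned by Manager.getSupportedContentTypes(protocol) or
// Manager.getSupportedProtocols(contentType).
class MediaSupport {
    public static boolean contains(String[] supported, String wanted) {
        for (int i = 0; i < supported.length; i++) {
            if (wanted.equals(supported[i])) {
                return true;
            }
        }
        return false;
    }
}
```

An application might call MediaSupport.contains(Manager.getSupportedContentTypes("http"), "audio/x-wav") and fall back to another format when it returns false.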

In addition to content and protocol support, there are other multimedia landmines to avoid. For example, the methods above may indicate that the WAV format is supported over HTTP, but can the device play 16-bit sound samples recorded at 44100 Hz? To gain insight into some of these details, a few calls to System.getProperty() may be necessary. Table 5 lists some of the properties available.

Table 5. Mobile Media API system properties that can be accessed by calling System.getProperty().

  • supports.mixing: Returns true or false. True indicates that at least two tones can be played simultaneously with Manager.playTone() and that at least two players can play audio simultaneously.
  • supports.audio.capture: Returns true or false, indicating whether audio capture is supported by the device.
  • supports.video.capture: Returns true or false, indicating whether video capture is supported by the device.
  • supports.recording: Returns true or false, indicating whether audio recording is supported.
  • audio.encodings: Returns the details of the audio formats supported, such as bits, channels, rates, endianness, etc.
  • video.encodings: Returns the details of the video formats supported.
  • video.snapshot.encodings: Returns the details of the video snapshot formats supported when VideoControl.getSnapshot() is called.
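Checking the boolean-valued properties is straightforward. The helper below is a hypothetical convenience; on platforms without the Mobile Media APIs these properties are simply absent, and System.getProperty() returns null:

```java
// Hypothetical convenience for the boolean-valued Mobile Media system
// properties; an absent property (null) counts as unsupported.
class MediaProperties {
    public static boolean isSupported(String property) {
        return "true".equals(System.getProperty(property));
    }
}
```

An application could then guard a feature with a test such as if (MediaProperties.isSupported("supports.audio.capture")) { ... } before creating a capture player.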

Device Support Is the Developer's Challenge
The Mobile Media API provides a rich and extensible framework for incorporating media into J2ME applications. While there are a number of verification steps an application will want to perform in order to ensure a device can handle specific media content, the Mobile Media API does a good job of providing developers with enough information to make informed decisions regarding a device's capabilities. Fortunately, the competitive mobile device market that currently exists puts the odds in the developer's favor that most of the popular media types will be supported.
