Mobile Media API (JSR-135)

JCP Specification


This document, the Mobile Media API (JSR-135) Specification, defines the Multimedia API for the Java™ 2 Platform, Micro Edition (J2ME™).

Revision History

MMAPI, 1st Draft
Initial MMAPI draft proposed for the Aug. 30-31 Expert Group meeting
MMAPI, 1st EG Draft
Incorporated changes agreed during the Aug. 30-31 Expert Group meeting
MMAPI, Community Draft, v. 0.8
MMAPI Community review draft version
MMAPI, EG Draft v. 0.85
Incorporated changes agreed during the Nov. 5-6 Expert Group meeting
MMAPI, EG Draft v. 0.86
Incorporated changes made in the EG mailing list discussion
MMAPI, Public Draft, v. 0.9
MMAPI Public review draft version
MMAPI, Revised Public Draft, v. 0.95 (11-March-2002)
MMAPI revised public review draft version
MMAPI, EG Draft, v. 0.96 (26-April-2002)
Incorporated changes agreed in the EG meeting and proposed in public review
MMAPI, EG Draft, v. 0.97 (02-May-2002)
Incorporated changes agreed in the EG meeting of 30 April 2002
MMAPI, EG Draft, v. 0.98 (09-May-2002)
Incorporated changes agreed in the EG meeting of 7 May 2002
MMAPI, Proposed Final Draft, v. 1.0 (23-May-2002)
MMAPI proposed final draft version
MMAPI, v. 1.0 (24-June-2002)
MMAPI final version

Who Should Use This Specification

The audience for this document includes the public Java community reviewing this specification, the Java Community Process (JCP) expert group defining it, implementors of the Multimedia API, and application developers targeting the J2ME platform.
The MMAPI specification is an optional package. An optional package can be adopted by existing J2ME profiles. A profile of J2ME defines a device-type-specific set of APIs for a particular vertical market or industry. Profiles are defined more precisely in the related publication, Configurations and Profiles Architecture Specification, Sun Microsystems, Inc.

Related Literature

The Java Language Specification by James Gosling, Bill Joy, and Guy L. Steele. Addison-Wesley, 1996, ISBN 0-201-63451-1
Connected, Limited Device Configuration (JSR-30), Sun Microsystems, Inc.
Connected Device Configuration (JSR-36), Sun Microsystems, Inc.
Mobile Information Device Profile (JSR-37), Sun Microsystems, Inc.
Mobile Information Device Profile 2.0 (JSR-118), Sun Microsystems, Inc.
Java Media Framework, Sun Microsystems, Inc.


Many multimedia types and formats exist in today's market, and new types and formats are being introduced all the time. There are also many, diverse methods to store and deliver these various media types. For example, there are traditional storage devices (such as disk file systems, CDs, and DVDs), wired protocols (UDP, HTTP, etc.) and wireless protocols (WAP, etc.).

J2ME™ devices range from cell phones with simple tone generation to PDAs and web tablets with advanced audio and video rendering capabilities. To accommodate diverse configurations and multimedia processing capabilities, an API with a high level of abstraction is needed.

The MMAPI Expert Group has also contributed to the ongoing JSR-118 (Mobile Information Device Profile 2.0) JCP specification, and the target is to make MMAPI a direct superset of the JSR-118 MIDP 2.0 Media API.

Expert Group

The MMAPI Expert Group consists of the following companies:


MMAPI features the following:

Support for Tone Generation, Playback and Recording of Time-Based Media
The API supports any time-based audio and video content by offering tools to control the flow of the media stream. Tone generation is a special media type that is characterized by frequency and duration.

CLDC Target
The main target for the API is a CLDC-based device. Other environments (e.g. CDC) are not excluded, but CLDC is the lowest common denominator.

Note: MMAPI references IllegalStateException, which is not present in CLDC 1.0. As such, the underlying platform (configuration/profile), such as CLDC 1.0/MIDP 1.0, is required to include that exception.

Small Footprint
CLDC as the target configuration sets strict limits on memory consumption. The design of the API emphasizes a small footprint as much as possible.

Protocol and Content Agnostic
The design of the API is not biased towards any specific protocol or content type.

Subsettable: Audio-Only vs. General Multimedia
It is possible to separate out a subset of the API in order to support only some types of content (e.g. basic audio). This allows profiles that cannot support all the features of the full API to take only the parts that are needed.

Extensible
The API is designed in a way that allows new features to be added later without breaking the old functionality.

Optionality for Implementations
The API offers a wide range of features for different purposes. The design of the API allows implementations that cannot provide real support for all of the features to leave some of them unimplemented.

Overview of MMAPI

This section provides a high-level overview of MMAPI. Short code examples in the Usage Scenarios section illustrate practical use of the API.

Basic Concepts: Protocol and Content Handling

Basically, multimedia processing can be broken into two parts:
  1. Handling the data delivery protocol
  2. Handling the data content
Protocol handling refers to reading data from a source (such as a file, capture device, or streaming server) into a media processing system. Content handling usually requires processing the media data (parsing or decoding, for example) and rendering the media to output devices such as an audio speaker or video display.

Two high-level objects are used in this API: DataSource and Player. Each object encapsulates the two parts of multimedia processing:

DataSource and Player
  1. DataSource for protocol handling
  2. Player for content handling
A DataSource encapsulates protocol handling. It hides the details of how the data is read from its source, whether the data is coming from a file, streaming server, or proprietary delivery mechanism. DataSource provides a set of methods to allow a Player to read data from it for processing.

A Player reads from the DataSource, processes the data, and renders the media to the output device. It provides a set of methods to control media playback and basic synchronization. Players also provide some type-specific controls to access features for specific media types.

A factory mechanism, the Manager, creates Players from DataSources. For convenience, Manager also provides methods to create Players from locators and InputStreams.

DataSource, Player and Manager

API Details

The createPlayer method is the top-level entry point to the API:
Player Manager.createPlayer(String urlString)
The urlString fully specifies the protocol and the content of the data:
<protocol>:<content location>
The Manager parses the URL and creates a DataSource to handle the specified data delivery protocol. The DataSource derives the content type from the data. The Manager then takes this content type and creates a Player to handle the presentation of the data. The resulting Player is returned for use by the application.
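The locator split that Manager performs can be illustrated with a small helper. This is purely illustrative plain Java; LocatorUtil and its methods are not part of MMAPI:

```java
// Hypothetical helper illustrating the <protocol>:<content location>
// form that Manager.createPlayer(String) parses.
public class LocatorUtil {

    // Returns the protocol part of a media locator, e.g. "http" or "capture".
    public static String protocolOf(String locator) {
        int colon = locator.indexOf(':');
        if (colon < 0) {
            throw new IllegalArgumentException("not a locator: " + locator);
        }
        return locator.substring(0, colon);
    }

    // Returns the content-location part, e.g. "//webserver/music.mp3".
    public static String contentLocationOf(String locator) {
        int colon = locator.indexOf(':');
        if (colon < 0) {
            throw new IllegalArgumentException("not a locator: " + locator);
        }
        return locator.substring(colon + 1);
    }
}
```

For example, "http://webserver/music.mp3" splits into the protocol "http" and the content location "//webserver/music.mp3", and "capture://video" into "capture" and "//video".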

The Player provides general methods to control data flow and presentation, for example:

Player.setMediaTime(long time)
Fine-grained control is an important feature of the API. Therefore, each Player also provides type-specific controls with the getControls and getControl methods:
Control[] Player.getControls()
Control Player.getControl(String controlType)
Since each type of media will yield different types of controls from its corresponding Player, the getControls and getControl methods expose features that are unique to a particular media type. For example, for the MIDI type, you can receive a MIDIControl from the Player's getControl method.

System Properties

MMAPI has some properties that can be queried by System.getProperty(String key). Currently, the following keys are defined:

Key Description
supports.mixing Query for whether audio mixing is supported. The string returned is either "true" or "false". If mixing is supported, the following conditions are true:
  • At least two tones can be played with Manager.playTone simultaneously.
  • Manager.playTone can be used while at least one Player is playing back audio.
  • At least two Players can be used to play back audio simultaneously.
supports.audio.capture Query for whether audio capture is supported. The string returned is either "true" or "false".
supports.video.capture Query for whether video capture is supported. The string returned is either "true" or "false".
supports.recording Query for whether recording is supported. The string returned is either "true" or "false".
audio.encodings The string returned specifies the supported audio capture formats. Each format is specified in a special syntax. The formats are delimited by at least one space.
video.encodings The string returned specifies the supported video capture formats. Each format is specified in a special syntax. The formats are delimited by at least one space.
video.snapshot.encodings Supported video snapshot formats for the getSnapshot method in VideoControl. The string returned specifies the supported image capture formats. Each format is specified in a special syntax. The formats are delimited by at least one space.
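A capability query along these lines is a common pattern. The sketch below is plain Java (the helper name is illustrative, not part of MMAPI); on a platform without MMAPI, System.getProperty returns null for these keys, which the helper treats as "not supported":

```java
// Illustrative sketch of querying the MMAPI system properties.
public class MediaCapabilities {

    // True only if the property is present and equals "true".
    public static boolean supports(String key) {
        return "true".equals(System.getProperty(key));
    }

    public static void main(String[] args) {
        if (supports("supports.mixing")) {
            System.out.println("Audio mixing available");
        }
        if (supports("supports.recording")) {
            System.out.println("Recording available");
        }
        // audio.encodings lists capture formats, delimited by spaces.
        String formats = System.getProperty("audio.encodings");
        System.out.println("Capture formats: " + (formats == null ? "none" : formats));
    }
}
```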

Tone Generation

Tone generation is important for games and other audio applications. On very small devices, it is particularly important since it is likely to be the only form of multimedia capability. In its simplest form, tone generation reduces to a single buzzer or simple monophonic tone generation. The Manager class provides a top-level method to handle this simple form of single-tone generation:
Manager.playTone(int note, int duration, int volume)
The implementation of this method can be mapped directly to a device's hardware tone generator to provide the most responsive sound generation.
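The note parameter of playTone uses a MIDI-style semitone numbering in which note 69 corresponds to A 440 Hz. The mapping between note numbers and frequencies can be sketched in plain Java (ToneMath is an illustrative helper, not part of MMAPI):

```java
// Sketch of the semitone-to-frequency mapping behind Manager.playTone's
// note parameter: note 69 is A 440 Hz, and each step is one semitone.
public class ToneMath {

    // Frequency in Hz for a given note number (0-127).
    public static double frequencyOf(int note) {
        return 440.0 * Math.pow(2.0, (note - 69) / 12.0);
    }

    // Nearest note number for a given frequency in Hz.
    public static int noteOf(double freqHz) {
        return (int) Math.round(69 + 12 * (Math.log(freqHz / 440.0) / Math.log(2.0)));
    }
}
```

For instance, note 69 maps to 440 Hz and note 81 (one octave up) to 880 Hz.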

In addition, the API provides a way to create a specific type of Player for synthesizing tone sequences:

Player p = Manager.createPlayer(Manager.TONE_DEVICE_LOCATOR);
The Player created provides a special type of Control, ToneControl, which can be used for programming a tone sequence. This enables more sophisticated applications written for slightly more powerful devices.


Camera

MMAPI provides support for a camera. From the API point of view, a camera attached to a device does not really differ from any other video content. A camera has a special locator, "capture://video", that is used to create its Player. An application uses VideoControl to display the viewfinder on the screen and take pictures.

VideoControl.getSnapshot(String imageType) is used to capture a camera's picture. By default, the picture is returned in PNG format. However, if the implementation supports other formats, the application can use imageType to select another format. The video.snapshot.encodings key can be used to query supported formats from the system.

Scenario 11 shows how to use the camera, and Manager documents different locators for creating the camera's Player.

Optionality and Implementation Requirements

Not all implementations of MMAPI support all multimedia types and input protocols. Some implementations may support only a few selected types and protocols. For example, some may support only playback of local audio files. The design of MMAPI allows implementations to provide optional support for different media types and protocols.

Manager provides getSupportedContentTypes and getSupportedProtocols to query for supported media types and protocols. An application can also attempt to create a Player from Manager given the media input. If the content type or the protocol specified is not supported, the Player creation methods throw an Exception.
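Before attempting Player creation, an application can scan the arrays returned by getSupportedContentTypes and getSupportedProtocols for the entry it needs. The membership test itself is plain Java; in this sketch, SupportCheck is an illustrative helper and the array stands in for what Manager would return on a real device:

```java
// Hypothetical defensive check over a supported-types array, such as the
// one returned by Manager.getSupportedContentTypes(null).
public class SupportCheck {

    // Case-insensitive membership test; a null array means nothing is supported.
    public static boolean contains(String[] supported, String wanted) {
        if (supported == null) {
            return false;
        }
        for (int i = 0; i < supported.length; i++) {
            if (supported[i].equalsIgnoreCase(wanted)) {
                return true;
            }
        }
        return false;
    }
}
```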

Since MMAPI is an optional package, it does not mandate any particular media types or protocols. Required types and protocols are to be determined by the appropriate profiles adopting MMAPI (e.g. JSR 118). However, an implementation must guarantee support of at least one media type and protocol.

MMAPI also allows implementations to support optional features. For example, some implementations may support only media playback, but not recording. Or some devices may support volume control while others may not. MMAPI allows implementations to expose these optional features as applicable.

Optional features are organized as Controls. A Player can be queried for all of its supported Controls by using getControls or a particular type of Control by using getControl.

MMAPI mandates support of some features for some media types; other features are recommended, and still others are entirely optional. This guarantees some uniformity across different implementations. Feature sets are organized as described below. For the purpose of this discussion, the following definitions, based upon those specified in RFC 2119, are used:

MUST: The associated definition is an absolute requirement of this specification.
MUST NOT: The definition is an absolute prohibition of this specification.
SHOULD: Indicates a recommended practice. There may exist valid reasons in particular circumstances to ignore this recommendation, but the full implications must be understood and carefully weighed before choosing a different course.
SHOULD NOT: Indicates a non-recommended practice. There may exist valid reasons in particular circumstances when the particular behavior is acceptable or even useful, but the full implications should be understood and the case carefully weighed before implementing any behavior described with this label.
MAY: Indicates that an item is truly optional.

Feature Sets

The following feature sets are defined for five different types of media. When a Player is created for a particular type, it must follow these guidelines and implement the appropriate Control types:

Feature Set Implementation Requirements
Sampled Audio
  • Should implement VolumeControl, StopTimeControl.
MIDI File Playback
  • Should implement VolumeControl, MIDIControl, TempoControl, PitchControl, StopTimeControl.
Tone Sequence (Player for TONE_DEVICE_LOCATOR)
  • Must implement ToneControl.
  • Should implement VolumeControl, StopTimeControl.
Interactive MIDI (Player for MIDI_DEVICE_LOCATOR)
  • Must implement MIDIControl.
Video Playback
  • Must implement VideoControl.
  • Should implement FramePositioningControl, StopTimeControl, VolumeControl (if audio is also available).

The following controls belong to none of the above feature sets. Implementations may implement them when applicable: GUIControl, MetaDataControl, RateControl, RecordControl.

Testing and Runtime Requirements

A Player has five states: UNREALIZED, REALIZED, PREFETCHED, STARTED and CLOSED. A Player implementation must allow successful state transitions to each of these states using the six state-transition methods: realize, prefetch, start, stop, deallocate and close. This means that the implementation must guarantee that these methods succeed under normal runtime conditions. This is to ensure that an implementation provides Players that are functional.
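The life cycle above can be sketched as a plain-Java state machine. The state names and integer values below match the constants defined on the Player interface; the transition logic is a simplified illustration that ignores blocking behavior and error cases, not the MMAPI implementation:

```java
// Simplified sketch of the Player state chart: five states,
// six state-transition methods.
public class PlayerStates {
    public static final int CLOSED     = 0;
    public static final int UNREALIZED = 100;
    public static final int REALIZED   = 200;
    public static final int PREFETCHED = 300;
    public static final int STARTED    = 400;

    private int state = UNREALIZED;

    public int getState() { return state; }

    // Each method moves forward (or back) along the state chart.
    public void realize()    { if (state == UNREALIZED) state = REALIZED; }
    public void prefetch()   { realize(); if (state == REALIZED) state = PREFETCHED; }
    public void start()      { prefetch(); if (state == PREFETCHED) state = STARTED; }
    public void stop()       { if (state == STARTED) state = PREFETCHED; }
    public void deallocate() { stop(); if (state == PREFETCHED) state = REALIZED; }
    public void close()      { state = CLOSED; }
}
```

As in the real API, start implies realize and prefetch when called early, stop returns a STARTED player to PREFETCHED, and deallocate releases resources back to the REALIZED state.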

Security Considerations

MMAPI does not define any security mechanism. Rather, implementations of MMAPI are subject to the security mechanisms provided by the underlying profile and configuration, e.g. MIDP 2.0. A few methods in MMAPI are defined such that a SecurityException will be thrown when called without the appropriate security permissions from the caller. An implementation must guarantee that:
  1. the exception is thrown when the caller does not have the appropriate security permissions; and
  2. the method can be used when the appropriate permissions are granted.
The implementation must also provide documentation on the proper setup procedures to obtain or deny the required security permissions, such that these methods can be properly tested.

The MIDP 2.0 Sound Subset

Some J2METM devices are very resource constrained. It may not be feasible for a device to support a full range of multimedia types, such as video, on some cell phones. As a result, not all devices are required to support the full generality of a multimedia API, such as extensibility to support custom protocols and content types.

For the MID profile, version 2.0 (JSR 118), the size of the API is another determining factor. A special subset of MMAPI has been derived to address the specific needs of MIDP; this subset fulfills the requirements set by the MIDP 2.0 Expert Group.

It is important to note that the MIDP 2.0 subset is a proper subset of the full MMAPI specification and is fully forward compatible with MMAPI.

Future Extensions

Currently, MMAPI targets general media playback and, as such, does not cover every area of multimedia. It is our intent that by staying with a high level of abstraction, we will be able to extend the API to support additional features. This has been demonstrated by the Java™ Media Framework, which employs a similar paradigm.

Usage Scenarios

This section provides some common scenarios of how the API could be used.

Scenario 1: Single-Tone Generation

try {
    Manager.playTone(ToneControl.C4, 5000 /* millisec */, 100 /* max vol */);
} catch (MediaException e) { }

Scenario 2: Simple Media Playback with Looping

try {
    Player p = Manager.createPlayer("http://webserver/music.mp3");
    p.setLoopCount(3); // play the content three times
    p.start();
} catch (IOException ioe) {
} catch (MediaException me) { }

Scenario 3: Fine-Grained Playback Control

static final long SECS_TO_MICROSECS = 1000000L;

Player p;
VolumeControl vc;

try {
    p = Manager.createPlayer("http://webserver/music.mp3");
    p.realize();

    // Set a listener.
    p.addPlayerListener(new Listener());

    // Grab volume control for the player.
    // Set volume to max.
    vc = (VolumeControl)p.getControl("VolumeControl");
    if (vc != null)
        vc.setLevel(100);

    // Set a start time.
    p.setMediaTime(5 * SECS_TO_MICROSECS);

    // Guarantee that the player can start with the smallest latency.
    p.prefetch();

    // Non-blocking start.
    p.start();
} catch (IOException ioe) {
} catch (MediaException me) { }

class Listener implements PlayerListener {

    public void playerUpdate(Player p, String event, Object eventData) {

        if (event == END_OF_MEDIA || event == STOP_AT_TIME) {
            System.out.println("Done processing");
            try {
                // Rewind back to the start time.
                p.setMediaTime(5 * SECS_TO_MICROSECS);
            } catch (MediaException me) { }
        }
    }
}

Scenario 4: MIDI Playback with Some Fine-Grained Control

Player p;
TempoControl tc;

try {
    p = Manager.createPlayer("http://webserver/tune.mid");
    p.realize();

    // Grab the tempo control.
    tc = (TempoControl)p.getControl("TempoControl");
    if (tc != null)
        tc.setTempo(120000); // 120 beats/min

    p.start();
} catch (IOException ioe) {
} catch (MediaException me) { }

Scenario 5: Video Playback

Player p;
VideoControl vc;

try {
    p = Manager.createPlayer("http://webserver/movie.mpg");
    p.realize();

    // Grab the video control and set it to the current display.
    vc = (VideoControl)p.getControl("VideoControl");
    if (vc != null) {
        Form form = new Form("video");
        form.append((Item)vc.initDisplayMode(vc.USE_GUI_PRIMITIVE, null));
        Display.getDisplay(midlet).setCurrent(form); // midlet: the application's MIDlet
    }

    p.start();
} catch (IOException ioe) {
} catch (MediaException me) { }

Scenario 6: Playing Back from Media Stored in RMS

RecordStore rs;
int recordID;
   :  // code to set up the record store.

try {
    InputStream is = new ByteArrayInputStream(rs.getRecord(recordID));
    Player p = Manager.createPlayer(is, "audio/X-wav");
    p.start();
} catch (IOException ioe) {
} catch (RecordStoreException rse) {
} catch (MediaException me) { }

Scenario 7: Playing Back from Media Stored in JAR

/** Notice that in MIDP 2.0, the wav format is mandatory only */
/** in the case that the device supports sampled audio.       */

try {
    InputStream is = getClass().getResourceAsStream("audio.wav");
    Player p = Manager.createPlayer(is, "audio/X-wav");
    p.start();
} catch (IOException ioe) {
} catch (MediaException me) { }

Scenario 8: Synchronization of Different Players

Player p1, p2;

try {
    p1 = Manager.createPlayer("http://webserver/tune.mid");
    p1.realize();
    p2 = Manager.createPlayer("http://webserver/movie.mpg");
    p2.realize();

    // Synchronize the two players by giving them the same time base.
    p2.setTimeBase(p1.getTimeBase());

    p1.start();
    p2.start();
} catch (IOException ioe) {
} catch (MediaException me) { }

Scenario 9: Tone Sequence Generation

/**
 * "Mary Had A Little Lamb" has an "ABAC" structure.
 * Use a block to repeat the "A" section.
 */
byte tempo = 30; // set tempo to 120 bpm
byte d = 8;      // eighth-note

byte C4 = ToneControl.C4;
byte D4 = (byte)(C4 + 2); // a whole step
byte E4 = (byte)(C4 + 4); // a major third
byte G4 = (byte)(C4 + 7); // a fifth
byte rest = ToneControl.SILENCE; // rest

byte[] mySequence = {
    ToneControl.VERSION, 1,   // version 1
    ToneControl.TEMPO, tempo, // set tempo
    ToneControl.BLOCK_START, 0,   // start define "A" section
    E4,d, D4,d, C4,d, E4,d,       // content of "A" section
    E4,d, E4,d, E4,d, rest,d,
    ToneControl.BLOCK_END, 0,     // end define "A" section
    ToneControl.PLAY_BLOCK, 0,    // play "A" section
    D4,d, D4,d, D4,d, rest,d,     // play "B" section
    E4,d, G4,d, G4,d, rest,d,
    ToneControl.PLAY_BLOCK, 0,    // repeat "A" section
    D4,d, D4,d, E4,d, D4,d, C4,d  // play "C" section
};

try {
    Player p = Manager.createPlayer(Manager.TONE_DEVICE_LOCATOR);
    p.realize();
    ToneControl c = (ToneControl)p.getControl("ToneControl");
    c.setSequence(mySequence);
    p.start();
} catch (IOException ioe) {
} catch (MediaException me) { }

Scenario 10: Capture and Recording

try {
    // Create a Player that captures live audio.
    Player p = Manager.createPlayer("capture://audio");
    p.realize();

    // Get the RecordControl, set the record stream,
    // start the Player and record for 5 seconds.
    RecordControl rc = (RecordControl)p.getControl("RecordControl");
    ByteArrayOutputStream output = new ByteArrayOutputStream();
    rc.setRecordStream(output);
    rc.startRecord();
    p.start();
    Thread.sleep(5000);
    rc.commit();
    p.close();
} catch (IOException ioe) {
} catch (MediaException me) {
} catch (InterruptedException e) { }

Scenario 11: Camera

Player p;
VideoControl vc;

// initialize camera 
try {
    p = Manager.createPlayer("capture://video");
    p.realize();

    // Grab the video control and set it to the current display.
    vc = (VideoControl)p.getControl("VideoControl");
    if (vc != null) {
        Form form = new Form("video");
        form.append((Item)vc.initDisplayMode(vc.USE_GUI_PRIMITIVE, null));
        Display.getDisplay(midlet).setCurrent(form); // midlet: the application's MIDlet
    }

    p.start();
} catch (IOException ioe) {
} catch (MediaException me) { }

// now take a picture
try {
    byte[] pngImage = vc.getSnapshot(null);

    // do something with the image ...
} catch (MediaException me) { }