JMF 2.0 API Guide



2  Understanding JMF

Java™ Media Framework (JMF) provides a unified architecture and messaging protocol for managing the acquisition, processing, and delivery of time-based media data. JMF is designed to support most standard media content types, such as AIFF, AU, AVI, GSM, MIDI, MPEG, QuickTime, RMF, and WAV.

By exploiting the advantages of the Java platform, JMF delivers the promise of "Write Once, Run Anywhere™" to developers who want to use media such as audio and video in their Java programs. JMF provides a common cross-platform Java API for accessing underlying media frameworks. JMF implementations can leverage the capabilities of the underlying operating system, while developers can easily create portable Java programs that feature time-based media by writing to the JMF API.

With JMF, you can easily create applets and applications that present, capture, manipulate, and store time-based media. The framework enables advanced developers and technology providers to perform custom processing of raw media data and seamlessly extend JMF to support additional content types and formats, optimize handling of supported formats, and create new presentation mechanisms.

High-Level Architecture

Devices such as tape decks and VCRs provide a familiar model for recording, processing, and presenting time-based media. When you play a movie using a VCR, you provide the media stream to the VCR by inserting a video tape. The VCR reads and interprets the data on the tape and sends appropriate signals to your television and speakers.


Figure 2-1: Recording, processing, and presenting time-based media.

JMF uses this same basic model. A data source encapsulates the media stream much like a video tape and a player provides processing and control mechanisms similar to a VCR. Playing and capturing audio and video with JMF requires the appropriate input and output devices such as microphones, cameras, speakers, and monitors.

Data sources and players are integral parts of JMF's high-level API for managing the capture, presentation, and processing of time-based media. JMF also provides a lower-level API that supports the seamless integration of custom processing components and extensions. This layering provides Java developers with an easy-to-use API for incorporating time-based media into Java programs while maintaining the flexibility and extensibility required to support advanced media applications and future media technologies.


Figure 2-2: High-level JMF architecture.

Time Model

JMF keeps time to nanosecond precision. A particular point in time is typically represented by a Time object, though some classes also support the specification of time in nanoseconds.

Classes that support the JMF time model implement Clock to keep track of time for a particular media stream. The Clock interface defines the basic timing and synchronization operations that are needed to control the presentation of media data.


Figure 2-3: JMF time model.

A Clock uses a TimeBase to keep track of the passage of time while a media stream is being presented. A TimeBase provides a constantly ticking time source, much like a crystal oscillator in a watch. The only information that a TimeBase provides is its current time, which is referred to as the time-base time. The time-base time cannot be stopped or reset. Time-base time is often based on the system clock.

A Clock object's media time represents the current position within a media stream--the beginning of the stream is media time zero; the end of the stream is the maximum media time for the stream. The duration of the media stream is the elapsed time from start to finish--the length of time that it takes to present the media stream. (Media objects implement the Duration interface if they can report a media stream's duration.)

To keep track of the current media time, a Clock uses:

  - The time-base start time--the time that its TimeBase reports when the presentation begins.
  - The media start time--the position in the media stream where the presentation begins.
  - The playback rate--how fast the Clock is running in relation to its TimeBase. A rate of 1.0 represents the normal playback rate for the media stream; a rate of 2.0 presents the stream at twice the normal speed. The rate can also be negative, indicating that the media is played in reverse.

When presentation begins, the media time is mapped to the time-base time and the advancement of the time-base time is used to measure the passage of time. During presentation, the current media time is calculated using the following formula:

    MediaTime = MediaStartTime + Rate * (TimeBaseTime - TimeBaseStartTime)

When the presentation stops, the media time stops, but the time-base time continues to advance. If the presentation is restarted, the media time is remapped to the current time-base time.
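
This calculation can be expressed directly in Java. The following is an illustrative sketch only; the parameter names mirror the formula above and are not JMF API names, and all times are in nanoseconds to match JMF's precision:

    // Illustrative helper: computes the current media time from the
    // values a Clock tracks while a presentation is in progress.
    long mediaTime(long mediaStartTime, float rate,
                   long timeBaseTime, long timeBaseStartTime) {
        return mediaStartTime
            + (long) (rate * (timeBaseTime - timeBaseStartTime));
    }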

Managers

The JMF API consists mainly of interfaces that define the behavior and interaction of objects used to capture, process, and present time-based media. Implementations of these interfaces operate within the structure of the framework. By using intermediary objects called managers, JMF makes it easy to integrate new implementations of key interfaces that can be used seamlessly with existing classes.

JMF uses four managers:

  - Manager--handles the construction of Players, Processors, DataSources, and DataSinks. This level of indirection allows new implementations to be integrated seamlessly with JMF: from the client perspective, these objects are always created the same way, whether the requested object is constructed from a default implementation or a custom one.
  - PackageManager--maintains a registry of packages that contain JMF classes, such as custom Players, Processors, DataSources, and DataSinks.
  - CaptureDeviceManager--maintains a registry of available capture devices.
  - PlugInManager--maintains a registry of available JMF plug-in processing components, such as Multiplexers, Demultiplexers, Codecs, Effects, and Renderers.

To write programs based on JMF, you'll need to use the Manager create methods to construct the Players, Processors, DataSources, and DataSinks for your application. If you're capturing media data from an input device, you'll use the CaptureDeviceManager to find out what devices are available and access information about them. If you're interested in controlling what processing is performed on the data, you might also query the PlugInManager to determine what plug-ins have been registered.

If you extend JMF functionality by implementing a new plug-in, you can register it with the PlugInManager to make it available to Processors that support the plug-in API. To use a custom Player, Processor, DataSource, or DataSink with JMF, you register your unique package prefix with the PackageManager.
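
For example, a minimal sketch of constructing and starting a Player through the Manager (the file URL is a placeholder):

    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Player;

    // Manager locates an appropriate handler for the content type and
    // constructs the Player. createPlayer throws IOException and
    // NoPlayerException if no suitable handler can be found.
    Player player = Manager.createPlayer(
        new MediaLocator("file:///home/duke/media/sample.wav"));
    player.start();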

Event Model

JMF uses a structured event reporting mechanism to keep JMF-based programs informed of the current state of the media system and to enable them to respond to media-driven error conditions, such as out-of-data and resource-unavailable conditions. Whenever a JMF object needs to report on current conditions, it posts a MediaEvent. MediaEvent is subclassed to identify many particular types of events. These objects follow the established Java Beans patterns for events.

For each type of JMF object that can post MediaEvents, JMF defines a corresponding listener interface. To receive notification when a MediaEvent is posted, you implement the appropriate listener interface and register your listener class with the object that posts the event by calling its addListener method.

Controller objects (such as Players and Processors) and certain Control objects such as GainControl post media events.
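
For example, a minimal sketch of registering for events posted by a Player (assuming a Player obtained elsewhere):

    import javax.media.ControllerEvent;
    import javax.media.ControllerListener;
    import javax.media.EndOfMediaEvent;

    // Register to be notified whenever the Player posts a ControllerEvent.
    player.addControllerListener(new ControllerListener() {
        public void controllerUpdate(ControllerEvent event) {
            if (event instanceof EndOfMediaEvent) {
                // the media stream has finished playing
            }
        }
    });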


Figure 2-4: JMF event model.

RTPSessionManager objects also post events. For more information, see RTP Events.

Data Model

JMF media players usually use DataSources to manage the transfer of media content. A DataSource encapsulates both the location of media and the protocol and software used to deliver the media. Once obtained, the source cannot be reused to deliver other media.

A DataSource is identified by either a JMF MediaLocator or a URL (uniform resource locator). A MediaLocator is similar to a URL and can be constructed from a URL, but can be constructed even if the corresponding protocol handler is not installed on the system. (In Java, a URL can only be constructed if the corresponding protocol handler is installed on the system.)
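
For example (the RTP address below is a placeholder):

    import javax.media.MediaLocator;

    // A MediaLocator can be constructed even when no protocol handler
    // for rtp is installed; constructing a java.net.URL from the same
    // string could fail with MalformedURLException.
    MediaLocator locator = new MediaLocator("rtp://224.144.251.104:49150/audio/1");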

A DataSource manages a set of SourceStream objects. A standard data source uses a byte array as the unit of transfer. A buffer data source uses a Buffer object as its unit of transfer. JMF defines several types of DataSource objects:

  - PullDataSource and PullBufferDataSource, from which the client pulls data.
  - PushDataSource and PushBufferDataSource, which push data to the client.
  - Specialty data sources, such as cloneable and merging data sources, which are described later in this section.


Figure 2-5: JMF data model.
Push and Pull Data Sources

Media data can be obtained from a variety of sources, such as local or network files and live broadcasts. JMF data sources can be categorized according to how data transfer is initiated:

  - Pull data-source--the client initiates the data transfer and controls the flow of data. Established protocols for this type of data include Hypertext Transfer Protocol (HTTP) and FILE. JMF defines two types of pull data sources: PullDataSource and PullBufferDataSource, which uses a Buffer object as its unit of transfer.
  - Push data-source--the server initiates the data transfer and controls the flow of data. Push data-sources include broadcast media, multicast media, and video-on-demand (VOD). JMF defines two types of push data sources: PushDataSource and PushBufferDataSource, which uses a Buffer object as its unit of transfer.

The degree of control that a client program can extend to the user depends on the type of data source being presented. For example, an MPEG file can be repositioned and a client program could allow the user to replay the video clip or seek to a new position in the video. In contrast, broadcast media is under server control and cannot be repositioned. Some VOD protocols might support limited user control--for example, a client program might be able to allow the user to seek to a new position, but not fast forward or rewind.

Specialty DataSources

JMF defines two types of specialty data sources, cloneable data sources and merging data sources.

A cloneable data source can be used to create clones of either a pull or push DataSource. To create a cloneable DataSource, you call the Manager createCloneableDataSource method and pass in the DataSource you want to clone. Once a DataSource has been passed to createCloneableDataSource, you should only interact with the cloneable DataSource and its clones; the original DataSource should no longer be used directly.

Cloneable data sources implement the SourceCloneable interface, which defines one method, createClone. By calling createClone, you can create any number of clones of the DataSource that was used to construct the cloneable DataSource. The clones can be controlled through the cloneable DataSource used to create them--when connect, disconnect, start, or stop is called on the cloneable DataSource, the method calls are propagated to the clones.

The clones don't necessarily have the same properties as the cloneable data source used to create them or the original DataSource. For example, a cloneable data source created for a capture device might function as a master data source for its clones--in this case, unless the cloneable data source is used, the clones won't produce any data. If you hook up both the cloneable data source and one or more clones, the clones will produce data at the same rate as the master.
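
For example, a minimal sketch of cloning a DataSource (assuming a DataSource named original obtained elsewhere):

    import javax.media.Manager;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.SourceCloneable;

    // Wrap the original DataSource so that it can be cloned. From this
    // point on, only the cloneable DataSource and its clones should be used.
    DataSource cloneable = Manager.createCloneableDataSource(original);
    DataSource clone = ((SourceCloneable) cloneable).createClone();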

A MergingDataSource can be used to combine the SourceStreams from several DataSources into a single DataSource. This enables a set of DataSources to be managed from a single point of control--when connect, disconnect, start, or stop is called on the MergingDataSource, the method calls are propagated to the merged DataSources.

To construct a MergingDataSource, you call the Manager createMergingDataSource method and pass in an array that contains the data sources you want to merge. To be merged, all of the DataSources must be of the same type; for example, you cannot merge a PullDataSource and a PushDataSource. The duration of the merged DataSource is the maximum of the merged DataSource objects' durations. The ContentType is application/mixed-media.
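
For example, a minimal sketch of merging two DataSources of the same type (both obtained elsewhere):

    import javax.media.Manager;
    import javax.media.protocol.DataSource;

    // createMergingDataSource throws IncompatibleSourceException if the
    // DataSources are not all of the same type.
    DataSource merged = Manager.createMergingDataSource(
        new DataSource[] { audioSource, videoSource });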

Data Formats

The exact media format of an object is represented by a Format object. The format itself carries no encoding-specific parameters or global timing information; it describes the format's encoding name and the type of data the format requires.

JMF extends Format to define audio- and video-specific formats.


Figure 2-6: JMF media formats.

An AudioFormat describes the attributes specific to an audio format, such as sample rate, bits per sample, and number of channels. A VideoFormat encapsulates information relevant to video data. Several formats are derived from VideoFormat to describe the attributes of common video formats, including:

  - IndexedColorFormat
  - RGBFormat
  - YUVFormat
  - JPEGFormat
  - H261Format
  - H263Format
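
For example, a minimal sketch of describing 16-bit, 44.1 kHz stereo linear audio with an AudioFormat:

    import javax.media.format.AudioFormat;

    AudioFormat format = new AudioFormat(
        AudioFormat.LINEAR, // encoding
        44100,              // sample rate in Hz
        16,                 // bits per sample
        2);                 // number of channels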

To receive notification of format changes from a Controller, you implement the ControllerListener interface and listen for FormatChangeEvents. (For more information, see Responding to Media Events.)

Controls

The JMF Control interface provides a mechanism for setting and querying attributes of an object. A Control often provides access to a corresponding user interface component that enables user control over an object's attributes. Many JMF objects expose Controls, including Controller objects, DataSource objects, DataSink objects, and JMF plug-ins.

Any JMF object that wants to provide access to its corresponding Control objects can implement the Controls interface. Controls defines methods for retrieving associated Control objects. DataSource and PlugIn use the Controls interface to provide access to their Control objects.

Standard Controls

JMF defines the standard Control interfaces shown in Figure 2-8, "JMF controls."

CachingControl enables download progress to be monitored and displayed. If a Player or Processor can report its download progress, it implements this interface so that a progress bar can be displayed to the user.

GainControl enables audio volume adjustments such as setting the level and muting the output of a Player or Processor. It also supports a listener mechanism for volume changes.


Figure 2-7: Gain control.

Figure 2-8: JMF controls.

DataSink or Multiplexer objects that read media from a DataSource and write it out to a destination such as a file can implement the StreamWriterControl interface. This Control enables the user to limit the size of the stream that is created.

FramePositioningControl and FrameGrabbingControl export frame-based capabilities for Players and Processors. FramePositioningControl enables precise frame positioning within a Player or Processor object's media stream. FrameGrabbingControl provides a mechanism for grabbing a still video frame from the video stream. The FrameGrabbingControl can also be supported at the Renderer level.

Objects that have a Format can implement the FormatControl interface to provide access to the Format. FormatControl also provides methods for querying and setting the format.

A TrackControl is a type of FormatControl that provides the mechanism for controlling what processing a Processor object performs on a particular track of media data. With the TrackControl methods, you can specify what format conversions are performed on individual tracks and select the Effect, Codec, or Renderer plug-ins that are used by the Processor. (For more information about processing media data, see Processing Time-Based Media with JMF.)

Two controls, PortControl and MonitorControl, enable user control over the capture process. PortControl defines methods for controlling the output of a capture device. MonitorControl enables media data to be previewed as it is captured or encoded.

BufferControl enables user-level control over the buffering done by a particular object.

JMF also defines several codec controls to enable control over hardware or software encoders and decoders:

  - BitRateControl--controls the bit rate of an encoded stream or reports the bit rate of an incoming stream.
  - FrameProcessingControl--specifies frame-processing parameters, such as instructing the codec to drop frames when it cannot keep up.
  - FrameRateControl--controls the frame rate.
  - H261Control--controls the still-image transmission mode of an H.261 codec.
  - H263Control--controls H.263 parameters, such as the optional coding modes.
  - KeyFrameControl--controls the interval between key frames.
  - MpegAudioControl--exports an MPEG audio codec's capabilities and controls its encoding parameters.
  - QualityControl--controls the trade-off between output quality and processing requirements.
  - SilenceSuppressionControl--controls silence suppression for audio codecs.

User Interface Components

A Control can provide access to a user interface Component that exposes its control behavior to the end user. To get the default user interface component for a particular Control, you call getControlComponent. This method returns an AWT Component that you can add to your applet's presentation space or application window.

A Controller might also provide access to user interface Components. For example, a Player provides access to both a visual component and a control panel component--to retrieve these components, you call the Player methods getVisualComponent and getControlPanelComponent.
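
For example, a minimal sketch that displays a Player object's components in an AWT frame (the Player must already be Realized; otherwise these methods throw NotRealizedError):

    import java.awt.BorderLayout;
    import java.awt.Component;
    import java.awt.Frame;

    Frame frame = new Frame("JMF Player");
    Component visual = player.getVisualComponent();   // null for audio-only media
    Component controls = player.getControlPanelComponent();
    if (visual != null) frame.add(visual, BorderLayout.CENTER);
    if (controls != null) frame.add(controls, BorderLayout.SOUTH);
    frame.pack();
    frame.setVisible(true);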

If you don't want to use the default control components provided by a particular implementation, you can implement your own and use the event listener mechanism to determine when they need to be updated. For example, you might implement your own GUI components that support user interaction with a Player. Actions on your GUI components would trigger calls to the appropriate Player methods, such as start and stop. By registering your custom GUI components as ControllerListeners for the Player, you can also update your GUI in response to changes in the Player object's state.

Extensibility

Advanced developers and technology providers can extend JMF functionality in two ways:

  - By implementing custom processing components (plug-ins) that can be interchanged with the standard processing components used by a Processor.
  - By directly implementing the Controller, Player, Processor, DataSource, or DataSink interfaces.

Implementing a JMF plug-in enables you to customize or extend the capabilities of a Processor without having to implement one from scratch. Once a plug-in is registered with JMF, it can be selected as a processing option for any Processor that supports the plug-in API. JMF plug-ins can be used to:

  - Parse (demultiplex) or construct (multiplex) additional media container formats
  - Decode and encode additional media formats
  - Apply effects to media data
  - Render media data to additional destinations

In situations where an even greater degree of flexibility and control is required, custom implementations of the JMF Controller, Player, Processor, DataSource, or DataSink interfaces can be developed and used seamlessly with existing implementations. For example, if you have a hardware MPEG decoder, you might want to implement a Player that takes input from a DataSource and uses the decoder to perform the parsing, decoding, and rendering all in one step. Custom Players and Processors can also be implemented to integrate media engines such as Microsoft's Media Player, Real Network's RealPlayer, and IBM's HotMedia with JMF.

Note: JMF Players and Processors are not required to support plug-ins. Plug-ins won't work with JMF 1.0-based Players and some Processor implementations might choose not to support them. The reference implementation of JMF 2.0 provided by Sun Microsystems, Inc. and IBM Corporation fully supports the plug-in API.

Presentation

In JMF, the presentation process is modeled by the Controller interface. Controller defines the basic state and control mechanism for an object that controls, presents, or captures time-based media. It defines the phases that a media controller goes through and provides a mechanism for controlling the transitions between those phases. A number of the operations that must be performed before media data can be presented can be time consuming, so JMF allows programmatic control over when they occur.

A Controller posts a variety of controller-specific MediaEvents to provide notification of changes in its status. To receive events from a Controller such as a Player, you implement the ControllerListener interface. For more information about the events posted by a Controller, see Controller Events.

The JMF API defines two types of Controllers: Players and Processors. A Player or Processor is constructed for a particular data source and is normally not re-used to present other media data.


Figure 2-9: JMF controllers.

Players

A Player processes an input stream of media data and renders it at a precise time. A DataSource is used to deliver the input media stream to the Player. The rendering destination depends on the type of media being presented.


Figure 2-10: JMF player model.

A Player does not provide any control over the processing that it performs or how it renders the media data.

Player supports standardized user control and relaxes some of the operational restrictions imposed by Clock and Controller.


Figure 2-11: JMF players.
Player States

A Player can be in one of six states. The Clock interface defines the two primary states: Stopped and Started. To facilitate resource management, Controller breaks the Stopped state down into five standby states: Unrealized, Realizing, Realized, Prefetching, and Prefetched.


Figure 2-12: Player states.

In normal operation, a Player steps through each state until it reaches the Started state:

  - A Player in the Unrealized state has been instantiated, but does not yet know anything about its media. When a media Player is first created, it is Unrealized.
  - When realize is called, a Player moves from the Unrealized state into the Realizing state. A Realizing Player is in the process of determining its resource requirements. During realization, a Player acquires the resources that it only needs to acquire once. These might include rendering resources other than exclusive-use resources. (Exclusive-use resources are limited resources, such as particular hardware devices, that can only be used by one Player at a time; such resources are acquired during Prefetching.) A Realizing Player often downloads assets over the network.
  - When a Player finishes Realizing, it moves into the Realized state. A Realized Player knows what resources it needs and information about the type of media it is to present. Because a Realized Player knows how to render its data, it can provide visual components and controls. Its connections to other objects in the system are in place, but it does not own any resources that would prevent another Player from starting.
  - When prefetch is called, a Player moves from the Realized state into the Prefetching state. A Prefetching Player is preparing to present its media. During this phase, the Player preloads its media data, obtains exclusive-use resources, and does whatever else it needs to do to prepare itself to play. Prefetching might have to recur if a Player object's media presentation is repositioned, or if a change in the Player object's rate requires that additional buffers be acquired or alternate processing be performed.
  - When a Player finishes Prefetching, it moves into the Prefetched state. A Prefetched Player is ready to be started.
  - Calling start puts a Player into the Started state. A Started Player object's time-base time and media time are mapped and its clock is running, though the Player might be waiting for a particular time to begin presenting its media data.

A Player posts TransitionEvents as it moves from one state to another. The ControllerListener interface provides a way for your program to determine what state a Player is in and to respond appropriately. For example, when your program calls an asynchronous method on a Player or Processor, it needs to listen for the appropriate event to determine when the operation is complete.

Using this event reporting mechanism, you can manage a Player object's start latency by controlling when it begins Realizing and Prefetching. It also enables you to determine whether or not the Player is in an appropriate state before calling methods on the Player.
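
For example, a common pattern is to call an asynchronous method and block until the corresponding TransitionEvent arrives. The following is a sketch only; production code would also watch for error events:

    import javax.media.ControllerEvent;
    import javax.media.ControllerListener;
    import javax.media.Player;
    import javax.media.RealizeCompleteEvent;

    final Object stateLock = new Object();
    player.addControllerListener(new ControllerListener() {
        public void controllerUpdate(ControllerEvent event) {
            if (event instanceof RealizeCompleteEvent) {
                synchronized (stateLock) { stateLock.notifyAll(); }
            }
        }
    });
    player.realize();   // asynchronous; returns immediately
    synchronized (stateLock) {
        while (player.getState() < Player.Realized) {
            stateLock.wait();   // InterruptedException handled by the caller
        }
    }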

Methods Available in Each Player State

To prevent race conditions, not all methods can be called on a Player in every state. The following table identifies the restrictions imposed by JMF. If you call a method that is illegal in a Player object's current state, the Player throws an error or exception.

Method                     Unrealized Player      Realized Player        Prefetched Player      Started Player
addController              NotRealizedError       legal                  legal                  ClockStartedError
deallocate                 legal                  legal                  legal                  ClockStartedError
getControlPanelComponent   NotRealizedError       legal                  legal                  legal
getGainControl             NotRealizedError       legal                  legal                  legal
getStartLatency            NotRealizedError       legal                  legal                  legal
getTimeBase                NotRealizedError       legal                  legal                  legal
getVisualComponent         NotRealizedError       legal                  legal                  legal
mapToTimeBase              ClockStoppedException  ClockStoppedException  ClockStoppedException  legal
removeController           NotRealizedError       legal                  legal                  ClockStartedError
setMediaTime               NotRealizedError       legal                  legal                  legal
setRate                    NotRealizedError       legal                  legal                  legal
setStopTime                NotRealizedError       legal                  legal                  StopTimeSetError if previously set
setTimeBase                NotRealizedError       legal                  legal                  ClockStartedError
syncStart                  NotPrefetchedError     NotPrefetchedError     legal                  ClockStartedError

Table 2-1: Method restrictions for players.

Processors

Processors can also be used to present media data. A Processor is just a specialized type of Player that provides control over what processing is performed on the input media stream. A Processor supports all of the same presentation controls as a Player.


Figure 2-13: JMF processor model.

In addition to rendering media data to presentation devices, a Processor can output media data through a DataSource so that it can be presented by another Player or Processor, further manipulated by another Processor, or delivered to some other destination, such as a file.

For more information about Processors, see Processing.

Presentation Controls

In addition to the standard presentation controls defined by Controller, a Player or Processor might also provide a way to adjust the playback volume. If so, you can retrieve its GainControl by calling getGainControl. A GainControl object posts a GainChangeEvent whenever the gain is modified. By implementing the GainChangeListener interface, you can respond to gain changes. For example, you might want to update a custom gain control Component.

Additional custom Control types might be supported by a particular Player or Processor implementation to provide other control behaviors and expose custom user interface components. You access these controls through the getControls method.

For example, the CachingControl interface extends Control to provide a mechanism for displaying a download progress bar. If a Player can report its download progress, it implements this interface. To find out if a Player supports CachingControl, you can call getControl("javax.media.CachingControl") or use getControls to get a list of all the supported Controls.
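
A minimal sketch of such a query:

    import javax.media.CachingControl;
    import javax.media.Control;

    // getControl returns null if the named Control is not supported.
    Control control = player.getControl("javax.media.CachingControl");
    if (control != null) {
        CachingControl caching = (CachingControl) control;
        long bytesDownloaded = caching.getContentProgress();
        long bytesTotal = caching.getContentLength();
    }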

Standard User Interface Components

A Player or Processor generally provides two standard user interface components, a visual component and a control-panel component. You can access these Components directly through the getVisualComponent and getControlPanelComponent methods.

You can also implement custom user interface components, and use the event listener mechanism to determine when they need to be updated.

Controller Events

The ControllerEvents posted by a Controller such as a Player or Processor fall into three categories: change notifications, closed events, and transition events.

  - Change notification events, such as RateChangeEvent, DurationUpdateEvent, and StopTimeChangeEvent, indicate that some attribute of the Controller has changed, often in response to a method call. For example, a Player posts a RateChangeEvent when its rate is changed by a call to setRate.
  - TransitionEvents allow your program to respond to changes in a Controller object's state. A Player posts transition events whenever it moves from one state to another.
  - ControllerClosedEvents are posted by a Controller when it shuts down. When a Controller posts a ControllerClosedEvent, it is no longer usable. A ControllerErrorEvent is a special case of ControllerClosedEvent that you can listen for so that your program can respond to Controller malfunctions and minimize the impact on the user.


Figure 2-14: JMF events.

Processing

A Processor is a Player that takes a DataSource as input, performs some user-defined processing on the media data, and then outputs the processed media data.


Figure 2-15: JMF processors.

A Processor can send the output data to a presentation device or to a DataSource. If the data is sent to a DataSource, that DataSource can be used as the input to another Player or Processor, or as the input to a DataSink.

While the processing performed by a Player is predefined by the implementor, a Processor allows the application developer to define the type of processing that is applied to the media data. This enables the application of effects, mixing, and compositing in real time.

The processing of the media data is split into several stages:

  - Demultiplexing--the input stream is parsed. If the stream contains multiple tracks, they are extracted and output separately.
  - Pre-processing--effect algorithms are applied to the tracks extracted from the input stream.
  - Transcoding--each track is converted from one format to another. Converting from a compressed type to an uncompressed type is referred to as decoding; converting from an uncompressed type to a compressed type is referred to as encoding.
  - Post-processing--effect algorithms are applied to the decoded tracks.
  - Multiplexing--the tracks are interleaved into a single output stream. For example, separate audio and video tracks might be multiplexed into a single MPEG-1 stream. You can specify the data type of the output stream with the Processor setContentDescriptor method.
  - Rendering--the media in each track is presented to the user.


Figure 2-16: Processor stages.

The processing at each stage is performed by a separate processing component. These processing components are JMF plug-ins. If the Processor supports TrackControls, you can select which plug-ins you want to use to process a particular track. There are five types of JMF plug-ins:

  - Demultiplexer--parses media streams such as WAV, MPEG, or QuickTime. If the stream is multiplexed, the individual tracks are extracted.
  - Codec--performs media-data encoding and decoding.
  - Effect--filters the media data.
  - Multiplexer--combines multiple tracks of input data into a single interleaved output stream and delivers the resulting stream as an output DataSource.
  - Renderer--processes the media data in a track and delivers it to a destination such as a screen or speaker.

Processor States

A Processor has two additional standby states, Configuring and Configured, which occur before the Processor enters the Realizing state.


Figure 2-17: Processor states.

While a Processor is in the Configured state, getTrackControls can be called to get the TrackControl objects for the individual tracks in the media stream. These TrackControl objects enable you to specify the media processing operations that you want the Processor to perform.

Calling realize directly on an Unrealized Processor automatically transitions it through the Configuring and Configured states to the Realized state. When you do this, you cannot configure the processing options through the TrackControls--the default Processor settings are used.

Calls to the TrackControl methods once the Processor is in the Realized state will typically fail, though some Processor implementations might support them.
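
For example, a minimal sketch of selecting track formats through the TrackControls (the MediaLocator is assumed to be obtained elsewhere; waiting for the Configured state is shown schematically, and a real program would use a ControllerListener as described under Player States):

    import javax.media.Manager;
    import javax.media.Processor;
    import javax.media.control.TrackControl;
    import javax.media.format.AudioFormat;

    Processor processor = Manager.createProcessor(locator);
    processor.configure();
    // ... wait for the ConfigureCompleteEvent ...
    TrackControl[] tracks = processor.getTrackControls();
    for (int i = 0; i < tracks.length; i++) {
        if (tracks[i].getFormat() instanceof AudioFormat) {
            // Request that this audio track be transcoded to GSM.
            tracks[i].setFormat(new AudioFormat(AudioFormat.GSM));
        }
    }
    processor.realize();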

Methods Available in Each Processor State

Since a Processor is a type of Player, the restrictions on when methods can be called on a Player also apply to Processors. Some of the Processor-specific methods also are restricted to particular states. The following table shows the restrictions that apply to a Processor. If you call a method that is illegal in the current state, the Processor throws an error or exception.

Method                         Unrealized Processor   Configuring Processor  Configured Processor   Realized Processor
addController                  NotRealizedError       NotRealizedError       NotRealizedError       legal
deallocate                     legal                  legal                  legal                  legal
getControlPanelComponent       NotRealizedError       NotRealizedError       NotRealizedError       legal
getControls                    legal                  legal                  legal                  legal
getDataOutput                  NotRealizedError       NotRealizedError       NotRealizedError       legal
getGainControl                 NotRealizedError       NotRealizedError       NotRealizedError       legal
getOutputContentDescriptor     NotConfiguredError     NotConfiguredError     legal                  legal
getStartLatency                NotRealizedError       NotRealizedError       NotRealizedError       legal
getSupportedContentDescriptors legal                  legal                  legal                  legal
getTimeBase                    NotRealizedError       NotRealizedError       NotRealizedError       legal
getTrackControls               NotConfiguredError     NotConfiguredError     legal                  FormatChangeException
getVisualComponent             NotRealizedError       NotRealizedError       NotRealizedError       legal
mapToTimeBase                  ClockStoppedException  ClockStoppedException  ClockStoppedException  ClockStoppedException
realize                        legal                  legal                  legal                  legal
removeController               NotRealizedError       NotRealizedError       NotRealizedError       legal
setOutputContentDescriptor     NotConfiguredError     NotConfiguredError     legal                  FormatChangeException
setMediaTime                   NotRealizedError       NotRealizedError       NotRealizedError       legal
setRate                        NotRealizedError       NotRealizedError       NotRealizedError       legal
setStopTime                    NotRealizedError       NotRealizedError       NotRealizedError       legal
setTimeBase                    NotRealizedError       NotRealizedError       NotRealizedError       legal
syncStart                      NotPrefetchedError     NotPrefetchedError     NotPrefetchedError     NotPrefetchedError

Table 2-2: Method restrictions for processors.

Processing Controls

You can control what processing operations the Processor performs on a track through the TrackControl for that track. You call Processor getTrackControls to get the TrackControl objects for all of the tracks in the media stream.

Through a TrackControl, you can explicitly select the Effect, Codec, and Renderer plug-ins you want to use for the track. To find out what options are available, you can query the PlugInManager to find out what plug-ins are installed.

To control the transcoding that's performed on a track by a particular Codec, you can get the Controls associated with the track by calling the TrackControl getControls method. This method returns the codec controls available for the track, such as BitRateControl and QualityControl. (For more information about the codec controls defined by JMF, see Controls.)

If you know the output data format that you want, you can use the setFormat method to specify the Format and let the Processor choose an appropriate codec and renderer. Alternatively, you can specify the output format when the Processor is created by using a ProcessorModel. A ProcessorModel defines the input and output requirements for a Processor. When a ProcessorModel is passed to the appropriate Manager create method, the Manager does its best to create a Processor that meets the specified requirements.
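
For example, a minimal sketch that uses a ProcessorModel to request a realized Processor that transcodes the input (identified by a MediaLocator obtained elsewhere) to GSM audio in a WAV file:

    import javax.media.Format;
    import javax.media.Manager;
    import javax.media.Processor;
    import javax.media.ProcessorModel;
    import javax.media.format.AudioFormat;
    import javax.media.protocol.FileTypeDescriptor;

    ProcessorModel model = new ProcessorModel(
        locator,                                            // input
        new Format[] { new AudioFormat(AudioFormat.GSM) },  // track formats
        new FileTypeDescriptor(FileTypeDescriptor.WAVE));   // output content type
    // Blocks until the Processor is realized; throws
    // CannotRealizeException if the requirements cannot be met.
    Processor processor = Manager.createRealizedProcessor(model);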

Data Output

The getDataOutput method returns a Processor object's output as a DataSource. This DataSource can be used as the input to another Player or Processor or as the input to a data sink. (For more information about data sinks, see Media Data Storage and Transmission.)

A Processor object's output DataSource can be of any type: PushDataSource, PushBufferDataSource, PullDataSource, or PullBufferDataSource.

Not all Processor objects output data--a Processor can render the processed data instead of outputting the data to a DataSource. A Processor that renders the media data is essentially a configurable Player.

Capture

A multimedia capturing device can act as a source for multimedia data delivery. For example, a microphone can capture raw audio input or a digital video capture board might deliver digital video from a camera. Such capture devices are abstracted as DataSources. For example, a device that provides timely delivery of data can be represented as a PushDataSource. Any type of DataSource can be used as a capture DataSource: PushDataSource, PushBufferDataSource, PullDataSource, or PullBufferDataSource.

Some devices deliver multiple data streams--for example, an audio/video conferencing board might deliver both an audio and a video stream. The corresponding DataSource can contain multiple SourceStreams that map to the data streams provided by the device.
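
For example, a minimal sketch of locating an audio capture device and creating a DataSource for it:

    import java.util.Vector;
    import javax.media.CaptureDeviceInfo;
    import javax.media.CaptureDeviceManager;
    import javax.media.Manager;
    import javax.media.format.AudioFormat;
    import javax.media.protocol.DataSource;

    // Query the registry for devices that can capture linear audio.
    Vector devices = CaptureDeviceManager.getDeviceList(
        new AudioFormat(AudioFormat.LINEAR));
    if (!devices.isEmpty()) {
        CaptureDeviceInfo info = (CaptureDeviceInfo) devices.elementAt(0);
        DataSource source = Manager.createDataSource(info.getLocator());
    }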

Media Data Storage and Transmission

A DataSink is used to read media data from a DataSource and render the media to some destination--generally a destination other than a presentation device. A particular DataSink might write data to a file, write data across the network, or function as an RTP broadcaster. (For more information about using a DataSink as an RTP broadcaster, see Transmitting RTP Data With a Data Sink.)

Like Players, DataSink objects are constructed through the Manager using a DataSource. A DataSink can use a StreamWriterControl to provide additional control over how data is written to a file. See Writing Media Data to a File for more information about how DataSink objects are used.

Storage Controls

A DataSink posts a DataSinkEvent to report on its status. A DataSinkEvent can be posted with a reason code, or the DataSink can post one of the following DataSinkEvent subtypes:

  - DataSinkErrorEvent--indicates that an error occurred while the DataSink was writing data.
  - EndOfStreamEvent--indicates that the entire stream has been successfully written.

To respond to events posted by a DataSink, you implement the DataSinkListener interface.
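
For example, a minimal sketch that writes a Processor object's output to a file and listens for DataSinkEvents (the destination URL is a placeholder):

    import javax.media.DataSink;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.datasink.DataSinkEvent;
    import javax.media.datasink.DataSinkListener;
    import javax.media.datasink.EndOfStreamEvent;
    import javax.media.protocol.DataSource;

    DataSource output = processor.getDataOutput();
    DataSink sink = Manager.createDataSink(
        output, new MediaLocator("file:///tmp/capture.wav"));
    sink.addDataSinkListener(new DataSinkListener() {
        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                // the entire stream has been written
            }
        }
    });
    sink.open();    // open and start both throw IOException on failure
    sink.start();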

Extensibility

You can extend JMF by implementing custom plug-ins, media handlers, and data sources.

Implementing Plug-Ins

By implementing one of the JMF plug-in interfaces, you can directly access and manipulate the media data associated with a Processor:

Note: The JMF Plug-In API is part of the official JMF API, but JMF Players and Processors are not required to support plug-ins. Plug-ins won't work with JMF 1.0-based Players and some Processor implementations might choose not to support them. The reference implementation of JMF 2.0 provided by Sun Microsystems, Inc. and IBM Corporation fully supports the plug-in API.

Custom Codec, Effect, and Renderer plug-ins are available to a Processor through the TrackControl interface. To make a plug-in available to a default Processor or a Processor created with a ProcessorModel, you need to register it with the PlugInManager. Once you've registered your plug-in, it is included in the list of plug-ins returned by the PlugInManager getPlugInList method and can be accessed by the Manager when it constructs a Processor object.
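
For example, a minimal sketch of registering a hypothetical codec (the class name and formats are placeholders):

    import javax.media.Format;
    import javax.media.PlugInManager;
    import javax.media.format.AudioFormat;

    // Register a custom codec so that Processors can find it.
    Format[] in  = new Format[] { new AudioFormat(AudioFormat.LINEAR) };
    Format[] out = new Format[] { new AudioFormat(AudioFormat.GSM) };
    PlugInManager.addPlugIn("com.example.media.codec.MyCodec",
                            in, out, PlugInManager.CODEC);
    PlugInManager.commit();   // persist the registration; throws IOException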

Implementing MediaHandlers and DataSources

If the JMF Plug-In API doesn't provide the degree of flexibility that you need, you can directly implement several of the key JMF interfaces: Controller, Player, Processor, DataSource, and DataSink. For example, you might want to implement a high-performance Player that is optimized to present a single media format or a Controller that manages a completely different type of time-based media.

The Manager mechanism used to construct Player, Processor, DataSource, and DataSink objects enables custom implementations of these JMF interfaces to be used seamlessly with JMF. When one of the create methods is called, the Manager uses a well-defined mechanism to locate and construct the requested object. Your custom class can be selected and constructed through this mechanism once you register a unique package prefix with the PackageManager and put your class in the appropriate place in the predefined package hierarchy.
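
For example, a minimal sketch of registering the hypothetical package prefix com.example so that Manager will search for com.example.media.protocol.<protocol>.DataSource:

    import java.util.Vector;
    import javax.media.PackageManager;

    Vector prefixes = PackageManager.getProtocolPrefixList();
    prefixes.addElement("com.example");
    PackageManager.setProtocolPrefixList(prefixes);
    PackageManager.commitProtocolPrefixList();  // persist across sessions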

MediaHandler Construction

Players, Processors, and DataSinks are all types of MediaHandlers--they all read data from a DataSource. A MediaHandler is always constructed for a particular DataSource, which can be either identified explicitly or with a MediaLocator. When one of the createMediaHandler methods is called, Manager uses the content-type name obtained from the DataSource to find and create an appropriate MediaHandler object.


Figure 2-18: JMF media handlers.

JMF also supports another type of MediaHandler, MediaProxy. A MediaProxy processes content from one DataSource to create another. Typically, a MediaProxy reads a text configuration file that contains all of the information needed to make a connection to a server and obtain media data. To create a Player from a MediaProxy, Manager:

  1. Constructs a DataSource for the protocol described by the MediaLocator.
  2. Uses the content-type of the DataSource to construct a MediaProxy to read the configuration file.
  3. Gets a new DataSource from the MediaProxy.
  4. Uses the content-type of the new DataSource to construct a Player.

The mechanism that Manager uses to locate and instantiate an appropriate MediaHandler for a particular DataSource is basically the same for all types of MediaHandlers:

  1. Manager generates a search list of class names from the list of installed package prefixes and information about the media, such as its content type or protocol.
  2. Manager steps through the search list until it finds a class that it can instantiate and to which it can attach the DataSource.

When constructing Players and Processors, Manager generates the search list of available handler classes from the list of installed content package-prefixes and the content-type name of the DataSource. To search for Players, Manager looks for classes of the form:

    <content package-prefix>.media.content.<content-type>.Handler

To search for Processors, Manager looks for classes of the form:

    <content package-prefix>.media.processor.<content-type>.Handler

If the located MediaHandler is a MediaProxy, Manager gets a new DataSource from the MediaProxy and repeats the search process.

If no appropriate MediaHandler can be found, the search process is repeated, substituting unknown for the content-type name. The unknown content type is supported by generic Players that are capable of handling a large variety of media types, often in a platform-dependent way.

Because a DataSink renders the data it reads from its DataSource to an output destination, when a DataSink is created the destination must also be taken into account. When constructing DataSinks, Manager uses the list of content package-prefixes and the protocol from the MediaLocator that identifies the destination. For each content package-prefix, Manager adds to the search list a class name of the form:

    <content package-prefix>.media.datasink.protocol.Handler

If the located MediaHandler is a DataSink, Manager instantiates it, sets its DataSource and MediaLocator, and returns the resulting DataSink object. If the handler is a DataSinkProxy, Manager retrieves the content type of the proxy and generates a list of DataSink classes that support the protocol of the destination MediaLocator and the content type returned by the proxy:

    <content package-prefix>.media.datasink.protocol.<content-type>.Handler

The process continues until an appropriate DataSink is located or the Manager has iterated through all of the content package-prefixes.

DataSource Construction

Manager uses the same mechanism to construct DataSources that it uses to construct MediaHandlers, except that it generates the search list of DataSource class names from the list of installed protocol package-prefixes.

For each protocol package-prefix, Manager adds to the search list a class name of the form:

    <protocol package-prefix>.media.protocol.<protocol>.DataSource 

Manager steps through each class in the list until it finds a DataSource that it can instantiate and to which it can attach the MediaLocator.




Copyright © 1998-1999 Sun Microsystems, Inc. All Rights Reserved.