


6

Extending JMF

You can extend JMF by implementing one of the plug-in interfaces to perform custom processing on a Track, or by implementing completely new DataSources and MediaHandlers.

Note: JMF Players and Processors are not required to support plug-ins--plug-ins won't work with JMF 1.0-based Players and some 2.0-based implementations might choose not to support plug-ins. The reference implementation of JMF 2.0 provided by Sun Microsystems, Inc. and IBM Corporation fully supports the plug-in API.

Implementing JMF Plug-Ins

Custom JMF plug-ins can be used seamlessly with Processors that support the plug-in API. After you implement your plug-in, you need to install it and register it with the PlugInManager to make it available to plug-in compatible Processors.

Implementing a Demultiplexer Plug-In

A Demultiplexer parses media streams such as WAV, MPEG or QuickTime. If the stream is multiplexed, the separate tracks are extracted. You might want to implement a Demultiplexer plug-in to support a new file format or provide a higher-performance demultiplexer. If you implement a custom DataSource, you can implement a Demultiplexer plug-in that works with your custom DataSource to enable playback through an existing Processor.

A Demultiplexer is a single-input, multi-output processing component. It reads data from a push or pull DataSource, extracts the individual tracks, and outputs each track separately.

A Demultiplexer is a type of MediaHandler, so it must implement the MediaHandler setSource method. The Processor uses this method to locate a Demultiplexer that can handle its DataSource: it goes through the list of registered Demultiplexers until it finds one that does not throw an exception when setSource is called.

The main work performed by a Demultiplexer is done in the implementation of the getTracks method, which returns an array of the tracks extracted from the input DataSource.

A complete example of a GSM demultiplexer is provided in Demultiplexer Plug-In. When you implement a Demultiplexer, you need to:

  1. Implement getSupportedInputContentDescriptors to advertise what input formats the demultiplexer supports. For example, the GSM demultiplexer needs to advertise that it supports GSM files.
Example 6-1: Implementing getSupportedInputContentDescriptors.
     private static ContentDescriptor[] supportedFormat =
         new ContentDescriptor[] {new ContentDescriptor("audio.x_gsm")};
 
     public ContentDescriptor [] getSupportedInputContentDescriptors() {
         return supportedFormat;
     }

  2. Implement the MediaHandler setSource method to check the DataSource and determine whether or not the Demultiplexer can handle that type of source. For example, the GSM demultiplexer supports PullDataSources:
Example 6-2: Implementing setSource for a Demultiplexer.
     public void setSource(DataSource source)
         throws IOException, IncompatibleSourceException {
 
         if (!(source instanceof PullDataSource)) {
             throw new IncompatibleSourceException(
                 "DataSource not supported: " + source);
         } else {
             streams = ((PullDataSource) source).getStreams();
         }
 
         if (streams == null) {
             throw new IOException("Got a null stream from the DataSource");
         }
 
         if (streams.length == 0) {
             throw new IOException(
                 "Got an empty stream array from the DataSource");
         }
 
         this.source = source;
         this.streams = streams;
 
         positionable = (streams[0] instanceof Seekable);
         seekable = positionable &&
                    ((Seekable) streams[0]).isRandomAccess();
 
         if (!supports(streams))
             throw new IncompatibleSourceException(
                 "DataSource not supported: " + source);
     }

  3. Implement getTracks to parse the header and extract the individual tracks from the stream if it is multiplexed. In the GSM demultiplexer, a readHeader method is implemented to parse the header. The getTracks method returns an array of GsmTrack objects. (See Demultiplexer Plug-In for the implementation of GsmTrack.)
Example 6-3: Implementing getTracks for a Demultiplexer.
     public Track[] getTracks() throws IOException, BadHeaderException {
 
         if (tracks[0] != null)
             return tracks;        
         stream = (PullSourceStream) streams[0];
         readHeader();
         bufferSize = bytesPerSecond;
         tracks[0] = new GsmTrack((AudioFormat) format,
                                 /*enabled=*/ true,
                                  new Time(0),
                                  numBuffers,
                                  bufferSize,
                                  minLocation,
                                  maxLocation
                                  );
         return tracks;
     }
 // ...
 
     private void readHeader()
         throws IOException, BadHeaderException {
 
         minLocation = getLocation(stream); // Should be zero
 
         long contentLength = stream.getContentLength();
         if ( contentLength != SourceStream.LENGTH_UNKNOWN ) {
            double durationSeconds = (double) contentLength / bytesPerSecond;
             duration = new Time(durationSeconds);
             maxLocation = contentLength;
         } else {
             maxLocation = Long.MAX_VALUE;
         }
         // ...

Implementing a Codec or Effect Plug-In

Codec plug-ins are used to decode compressed media data, convert media data from one format to another, or encode raw media data into a compressed format. You might want to implement a Codec to provide performance enhancements over existing solutions, support new compressed or uncompressed data formats, or convert data from a custom format to a standard format that can be easily processed and rendered.

A Codec is a single-input, single-output processing component. It reads data for an individual track, processes the data, and outputs the results.

A Codec plug-in can enable the user to control the processing it performs through EncodingControl or DecodingControl objects. These controls provide a way to adjust attributes such as the frame rate, bit rate, and compression ratio. Codec controls are accessed through the getControls method. If a particular Codec control provides a user-interface component, it is accessed by calling getControlComponent.

When you implement a Codec, you need to:

  1. Implement getSupportedInputFormats and getSupportedOutputFormats to advertise what input and output formats the codec supports.
  2. Enable the selection of those formats by implementing setInputFormat and setOutputFormat.
  3. Implement process to actually perform the compression or decompression of the input Track. (A minimal skeleton illustrating these steps follows.)
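
If you need a starting point, the following is a minimal sketch of a pass-through codec. The class name SkeletonCodec, the restriction to linear audio, and the assumption that the data is carried in byte arrays are illustrative choices, not part of the JMF API; a real codec would replace the body of process with its actual compression or decompression logic.

 import javax.media.*;
 import javax.media.format.*;
 
 public class SkeletonCodec implements Codec {
 
     protected Format inputFormat;
     protected Format outputFormat;
 
     // Advertise the formats this codec can accept and produce.
     public Format[] getSupportedInputFormats() {
         return new Format[] {new AudioFormat(AudioFormat.LINEAR)};
     }
 
     public Format[] getSupportedOutputFormats(Format input) {
         if (input == null || input instanceof AudioFormat)
             return new Format[] {new AudioFormat(AudioFormat.LINEAR)};
         return new Format[0];
     }
 
     // Record the formats negotiated by the Processor.
     public Format setInputFormat(Format format) {
         inputFormat = format;
         return inputFormat;
     }
 
     public Format setOutputFormat(Format format) {
         outputFormat = format;
         return outputFormat;
     }
 
     // Transcode one input Buffer into one output Buffer.
     public int process(Buffer in, Buffer out) {
         // A real codec would compress or decompress here; this skeleton
         // just copies the input bytes to the output buffer unchanged.
         byte[] inData = (byte[]) in.getData();
         byte[] outData = new byte[in.getLength()];
         System.arraycopy(inData, in.getOffset(), outData, 0, in.getLength());
         out.setData(outData);
         out.setOffset(0);
         out.setLength(in.getLength());
         out.setFormat(outputFormat);
         return BUFFER_PROCESSED_OK;
     }
 
     public String getName() { return "Skeleton Codec"; }
     public void open() { }
     public void close() { }
     public void reset() { }
     public Object[] getControls() { return new Control[0]; }
     public Object getControl(String controlType) { return null; }
 }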
Effect Plug-ins

An Effect plug-in is actually a specialized type of Codec that performs some processing on the input Track other than compression or decompression. For example, you might implement a gain effect that adjusts the volume of an audio track. Like a Codec, an Effect is a single-input, single-output processing component and the data manipulation that the Effect performs is implemented in the process method.

An Effect plug-in can be used as either a pre-processing effect or a post-processing effect. For example, if a Processor is being used to render a compressed media stream, the Effect would typically be used as a post-processing effect and applied after the stream has been decoded. Conversely, if the Processor was being used to output a compressed media stream, the Effect would typically be applied as a pre-processing effect before the stream is encoded.

When you implement an Effect, you need to:

  1. Implement getSupportedInputFormats and getSupportedOutputFormats to advertise what input and output formats the effect supports.
  2. Enable the selection of those formats by implementing setInputFormat and setOutputFormat.
  3. Implement process to actually perform the effect processing.

Note that there's no mechanism for specifying what a particular Effect does--the name of an Effect plug-in class should provide some indication of its intended use.

Example: GainEffect Plug-In

In this example, the Effect interface is implemented to create an effect that adjusts the gain on the incoming audio data and outputs the modified data. By default, the GainEffect process method increases the gain by a factor of 2.

Example 6-4: Implementing a gain effect plug-in.
 import javax.media.*;
 import javax.media.format.*;
 
 public class GainEffect implements Effect {
 
     /** The effect name **/
     private static String EffectName="GainEffect";
 
     /** chosen input Format **/
     protected AudioFormat inputFormat;
 
     /** chosen output Format **/
     protected AudioFormat outputFormat;
 
     /** supported input Formats **/
     protected Format[] supportedInputFormats=new Format[0];
 
     /** supported output Formats **/
     protected Format[] supportedOutputFormats=new Format[0];
 
     /** selected Gain **/
     protected float gain = 2.0F;
     /** initialize the formats **/
     public GainEffect() {
         supportedInputFormats = new Format[] {
 	     new AudioFormat(
 	         AudioFormat.LINEAR,
                 Format.NOT_SPECIFIED,
                 16,
                 Format.NOT_SPECIFIED,
                 AudioFormat.LITTLE_ENDIAN,
                 AudioFormat.SIGNED,
                 16,
                 Format.NOT_SPECIFIED,
                 Format.byteArray
 	     )
 	 };
         supportedOutputFormats = new Format[] {
 	     new AudioFormat(
 	         AudioFormat.LINEAR,
                 Format.NOT_SPECIFIED,
                 16,
                 Format.NOT_SPECIFIED,
                 AudioFormat.LITTLE_ENDIAN,
                 AudioFormat.SIGNED,
                 16,
                 Format.NOT_SPECIFIED,
                 Format.byteArray
 	     )
 	 };
     }
     /** get the resources needed by this effect **/
     public void open() throws ResourceUnavailableException {
     }
 
     /** free the resources allocated by this codec **/
     public void close() {
     }
 
     /** reset the codec **/
     public void reset() {
     }
 
     /** no controls for this simple effect **/
     public Object[] getControls() {
         return (Object[]) new Control[0];
     }
 
     /**
      * Return the control based on a control type for the effect.
      **/
     public Object getControl(String controlType) {
         try {
             Class cls = Class.forName(controlType);
             Object cs[] = getControls();
             for (int i = 0; i < cs.length; i++) {
                 if (cls.isInstance(cs[i]))
                 return cs[i];
             }
             return null;
         } catch (Exception e) { // no such controlType or such control
             return null;
         }
     }
     /************** format methods *************/
     /** set the input format **/
     public Format setInputFormat(Format input) {
         // the following code assumes valid Format
         inputFormat = (AudioFormat)input;
         return (Format)inputFormat;
     }
     /** set the output format **/
     public Format setOutputFormat(Format output) {
         // the following code assumes valid Format
         outputFormat = (AudioFormat)output;
         return (Format)outputFormat;
     }
     /** get the input format **/
     protected Format getInputFormat() {
         return inputFormat;
     }
     /** get the output format **/
     protected Format getOutputFormat() {
         return outputFormat;
     }
 
     /** supported input formats **/
     public Format [] getSupportedInputFormats() {
         return supportedInputFormats;
     }
 
     /** output Formats for the selected input format **/
     public Format [] getSupportedOutputFormats(Format in) {
         if (! (in instanceof AudioFormat) )
             return new Format[0];
 
         AudioFormat iaf=(AudioFormat) in;
 
         if (!iaf.matches(supportedInputFormats[0]))
             return new Format[0];
 
 	 AudioFormat oaf= new AudioFormat(
 	         AudioFormat.LINEAR,
                 iaf.getSampleRate(),
                 16,
                 iaf.getChannels(),
                 AudioFormat.LITTLE_ENDIAN,
                 AudioFormat.SIGNED,
                 16,
                 Format.NOT_SPECIFIED,
                 Format.byteArray
         );
 
         return new Format[] {oaf};
     }
 
     /** gain accessor method **/
     public void setGain(float newGain){
         gain=newGain;
     }
     /** return effect name **/
     public String getName() {
         return EffectName;
     }
 
     /** do the processing **/
     public int process(Buffer inputBuffer, Buffer outputBuffer){
 
         // == prolog
         byte[] inData = (byte[])inputBuffer.getData();
         int inLength = inputBuffer.getLength();
         int inOffset = inputBuffer.getOffset();
         byte[] outData = validateByteArraySize(outputBuffer, inLength);
         int outOffset = outputBuffer.getOffset();
 
 	 int samplesNumber = inLength / 2 ;
 
         // == main
 
         for (int i=0; i< samplesNumber;i++) {
             int tempL = inData[inOffset ++];
             int tempH = inData[inOffset ++];
             // assemble the little-endian 16-bit sample: low byte first
             int sample = (tempH << 8) | (tempL & 255);
 
             sample = (int)(sample * gain);
 
             if (sample>32767) // saturate
                 sample = 32767;
             else if (sample < -32768)
                 sample = -32768;
 
             outData[outOffset ++]=(byte) (sample &  255);
             outData[outOffset ++]=(byte) (sample >> 8);
 
         }
 
         // == epilog
         // length and offset are in bytes for byte-array data
         updateOutput(outputBuffer, outputFormat, inLength, 0);
         return BUFFER_PROCESSED_OK;
     }
     /**
      * Utility: validate that the Buffer object's data size is at least
      * newSize bytes.
      * @return array with sufficient capacity
      **/
     protected byte[] validateByteArraySize(Buffer buffer,int newSize) {
         Object objectArray=buffer.getData();
         byte[] typedArray;
         if (objectArray instanceof byte[]) { // is correct type AND not null
             typedArray=(byte[])objectArray;
             if (typedArray.length >= newSize ) { // is sufficient capacity
                 return typedArray;
             }
         }
         System.out.println(getClass().getName()+
 	                  " : allocating byte["+newSize+"] ");
         typedArray = new byte[newSize];
         buffer.setData(typedArray);
         return typedArray;
     }
     /** utility: update the output buffer fields **/
     protected void updateOutput(Buffer outputBuffer,
                                 Format format,int length, int offset) {
         outputBuffer.setFormat(format);
         outputBuffer.setLength(length);
         outputBuffer.setOffset(offset);
     }
 }

Implementing a Multiplexer Plug-In

A Multiplexer is essentially the opposite of a Demultiplexer: it takes individual tracks of media data and merges them into a single multiplexed media stream such as an MPEG or QuickTime file. You might want to implement a Multiplexer plug-in to support a custom DataSource or provide a higher-performance multiplexer. However, it's not always necessary to implement a separate Multiplexer plug-in--multiplexing can also be performed by a DataSink.

A Multiplexer is a multi-input, single-output processing component. It reads data from a set of tracks and outputs a DataSource.

The main work performed by a Multiplexer is done in the implementation of the process method. The getDataSource method returns the DataSource generated by the Multiplexer.

When you implement a Multiplexer, you need to:

  1. Implement getSupportedOutputContentDescriptors to advertise what output formats the Multiplexer supports.
  2. Enable the selection of the output format by implementing setOutputContentDescriptor.
  3. Implement process to actually merge the individual tracks into an output stream of the selected format.

Unlike a Codec, a Multiplexer provides no separate mechanism for querying which combinations of track formats it supports. Instead, its initializeTracks method should return false if any of the specified track formats is not supported.

Implementing a Renderer Plug-In

A Renderer delivers media data in its final processed state. It is a single-input processing component with no output. Renderer plug-ins take the processed media data and typically present it to the user, but they can also be used to provide access to the processed media data for use by another application or device. For example, you might implement a Renderer plug-in if you want to render a video to a location other than the screen.

If you're implementing a video renderer, you should implement the VideoRenderer interface, which extends Renderer to define video-specific attributes such as the Component where the video will be rendered.

The main work performed by a Renderer is done in the implementation of the process method. When you implement a Renderer, you need to:

  1. Implement getSupportedInputFormats to advertise what input formats the Renderer supports.
  2. Enable the selection of the input format by implementing setInputFormat.
  3. Implement process to actually process the data and render it to the output device that this Renderer represents.
Example: AWTRenderer

This example implements the VideoRenderer interface to create a Renderer for RGB images that uses the AWT Image classes.

Example 6-5: Implementing a Renderer plug-in.
 import javax.media.*;
 import javax.media.renderer.VideoRenderer;
 import javax.media.format.VideoFormat;
 import javax.media.format.RGBFormat;
 import java.awt.*;
 import java.awt.image.*;
 import java.awt.event.*;
 import java.util.Vector;
 /*******************************************************************
  * Renderer for RGB images using AWT Image.
  *******************************************************************/
 public class SampleAWTRenderer implements VideoRenderer {
     /**
     * Variables and Constants
     **/
 
     // The descriptive name of this renderer
     private static final String name = "Sample AWT Renderer";
 
     protected RGBFormat inputFormat;
     protected RGBFormat supportedRGB;
     protected Format [] supportedFormats;
 
     protected MemoryImageSource sourceImage;
     protected Image     destImage;
     protected Buffer    lastBuffer = null;
 
     protected int       inWidth = 0;
     protected int       inHeight = 0;
     protected Component component = null;
     protected Rectangle reqBounds = null;
     protected Rectangle bounds = new Rectangle();
     protected boolean started = false;
     
     /**
     * Constructor                
     **/
     
     public SampleAWTRenderer() {
         // Prepare supported input formats and preferred format
         int rMask = 0x000000FF;
         int gMask = 0x0000FF00;
         int bMask = 0x00FF0000;
 
         supportedRGB = new RGBFormat(null,                  // size
                                      Format.NOT_SPECIFIED,  // maxDataLength
                                      int[].class,           // buffer type
                                      Format.NOT_SPECIFIED,  // frame rate
                                      32,                    // bitsPerPixel
                                      rMask, gMask, bMask,   // component masks
                                      1,                     // pixel stride
                                      Format.NOT_SPECIFIED,  // line stride
                                      Format.FALSE,          // flipped
                                      Format.NOT_SPECIFIED   // endian
                                      );
         supportedFormats = new VideoFormat[1];
         supportedFormats[0] = supportedRGB;
     }
    /**
    * Controls implementation              
     **/
     
     // Returns an array of supported controls
 
    public Object[] getControls() {
 	 // No controls
         return (Object[]) new Control[0];
     }
 
     /**
      * Return the control based on a control type for the PlugIn.
      */
     public Object getControl(String controlType) {
        try {
           Class  cls = Class.forName(controlType);
           Object cs[] = getControls();
          for (int i = 0; i < cs.length; i++) {
              if (cls.isInstance(cs[i]))
                 return cs[i];
           }
           return null;
        } catch (Exception e) {   // no such controlType or such control
          return null;
        }
     }
 
     /**
      * PlugIn implementation
      **/
 
     public String getName() {
 	 return name;
     }
     
     // Opens the plugin
     public void open() throws ResourceUnavailableException {
 	 sourceImage = null;
 	 destImage   = null;
 	 lastBuffer    = null;
     }
     /** Resets the state of the plug-in. Typically at end of media or 
     * when media is repositioned.
      */
     public void reset() {
 	 // Nothing to do
     }
 
     public void close() {
 	 // Nothing to do
     }
     /** 
      * Renderer implementation
      **/
 
     public void start() {
 	 started = true;
     }
 
     public void stop() {
 	 started = false;
     }
     
     // Lists the possible input formats supported by this plug-in.
 
     public Format [] getSupportedInputFormats() {
 	 return supportedFormats;
     }
     // Set the data input format.
 
     public Format setInputFormat(Format format) {
 	 if ( format != null && format instanceof RGBFormat &&
 	      format.matches(supportedRGB)) {
 	     
 	     inputFormat = (RGBFormat) format;
 	     Dimension size = inputFormat.getSize();
 	     inWidth = size.width;
 	     inHeight = size.height;
 	     return format;
 	 } else
 	     return null;
     }
 
     // Processes the data and renders it to a component
 
     public synchronized int process(Buffer buffer) {
 	 if (component == null)
 	     return BUFFER_PROCESSED_FAILED;
 
 	 Format inf = buffer.getFormat();
 	 if (inf == null)
 	     return BUFFER_PROCESSED_FAILED;
 	 if (inf != inputFormat || !buffer.getFormat().equals(inputFormat)) {
 	     // Renegotiate the input format; fail if it is not supported.
 	     if (setInputFormat(inf) == null)
 	         return BUFFER_PROCESSED_FAILED;
 	 }
 
 	 Object data = buffer.getData();
 	 if (!(data instanceof int[]))
 	     return BUFFER_PROCESSED_FAILED;
 	 if (lastBuffer != buffer) {
 	     lastBuffer = buffer;
 	     newImage(buffer);
 	 }
 
 	 sourceImage.newPixels(0, 0, inWidth, inHeight);
 
 	 Graphics g = component.getGraphics();
 	 if (g != null) {
 	     if (reqBounds == null) {
 	 	 bounds = component.getBounds();
 	 	 bounds.x = 0;
 	 	 bounds.y = 0;
 	     } else
 	 	     bounds = reqBounds;
 	     g.drawImage(destImage, bounds.x, bounds.y,
 	 	 	 bounds.width, bounds.height,
 	 	 	 0, 0, inWidth, inHeight, component);
 	 }
 	 
 	 return BUFFER_PROCESSED_OK;
     }
 
     /**
      * VideoRenderer implementation
      **/
 
     /**
      * Returns an AWT component that it will render to. Returns null
      * if it is not rendering to an AWT component.
      */
     public java.awt.Component getComponent() {
 	 if (component == null) {
 	     component = new Canvas() {
 	 	 public Dimension getPreferredSize() {
 	 	     return new Dimension(getInWidth(), getInHeight());
 	 	 }
 
 	 	 public void update(Graphics g) {
 	 	 }
 
 	 	 public void paint(Graphics g) {
 	 	     // Need to repaint image if the movie is in paused state
 	 	 }
 
 	     };
 	 }
 
 	 return component;
     }
     /**
      * Requests the renderer to draw into a specified AWT component.
      * Returns false if the renderer cannot draw into the specified
      * component.
      */
     public boolean setComponent(java.awt.Component comp) {
 	 component = comp;
 	 return true;
     }
 
    /**
      * Sets the region in the component where the video is to be
      * rendered to. Video is to be scaled if necessary. If
     * <code>rect</code> is null, then the video occupies the entire 
     * component.
      */
     public void setBounds(java.awt.Rectangle rect) {
 	 reqBounds = rect;
     }
 
     /**
      * Returns the region in the component where the video will be
      * rendered to. Returns null if the entire component is being used.
      */
     public java.awt.Rectangle getBounds() {
 	 return reqBounds;
     }
     
     /**
      * Local methods
     **/
 
     int getInWidth() {
 	 return inWidth;
     }
 
     int getInHeight() {
 	 return inHeight;
     }
     
     private void newImage(Buffer buffer) {
 	 Object data = buffer.getData();
 	 if (!(data instanceof int[]))
 	     return;
 	 RGBFormat format = (RGBFormat) buffer.getFormat();
 
 	 DirectColorModel dcm = new DirectColorModel(format.getBitsPerPixel(),
 	                                             format.getRedMask(),
 	                                             format.getGreenMask(),
 	                                             format.getBlueMask());
 	 sourceImage = new MemoryImageSource(format.getLineStride(),
 	                                     format.getSize().height,
 	                                     dcm,
 	                                     (int[])data, 0,
 	                                     format.getLineStride());
 	 sourceImage.setAnimated(true);
 	 sourceImage.setFullBufferUpdates(true);
 	 if (component != null) {
 	     destImage = component.createImage(sourceImage);
 	     component.prepareImage(destImage, component);
 	 }
     }
 }

Registering a Custom Plug-In With the Plug-In Manager

To make a custom plug-in available to a Processor through the TrackControl interface, you need to register it with the PlugInManager. (The default plug-ins are registered automatically.)

To register a new plug-in, you use the PlugInManager addPlugIn method. You must call commit to make the addition permanent. For example, to register the GainEffect plug-in from Example 6-4:

Example 6-6: Registering a new plug-in.
 // Name of the new plug-in
 String GainPlugin = "COM.mybiz.media.GainEffect";
 
 // Supported input Formats
 Format[] supportedInputFormats = new Format[] {
 	     new AudioFormat(
 	         AudioFormat.LINEAR,
                 Format.NOT_SPECIFIED,
                 16,
                 Format.NOT_SPECIFIED,
                 AudioFormat.LITTLE_ENDIAN,
                 AudioFormat.SIGNED,
                 16,
                 Format.NOT_SPECIFIED,
                 Format.byteArray
 	     )
 };
 
 // Supported output Formats 
 Format[] supportedOutputFormats = new Format[] {
 	     new AudioFormat(
 	         AudioFormat.LINEAR,
                 Format.NOT_SPECIFIED,
                 16,
                 Format.NOT_SPECIFIED,
                 AudioFormat.LITTLE_ENDIAN,
                 AudioFormat.SIGNED,
                 16,
                 Format.NOT_SPECIFIED,
                 Format.byteArray
 	     )
 };
 
 // Add the new plug-in to the plug-in registry
 PlugInManager.addPlugIn(GainPlugin, supportedInputFormats,
                          supportedOutputFormats, PlugInManager.EFFECT);
 
 // Save the changes to the plug-in registry
 PlugInManager.commit();

If you want to make your plug-in available to other users, you should create a Java applet or application that performs this registration process and distribute it with your plug-in.

You can remove a plug-in either temporarily or permanently with the removePlugIn method. To make the change permanent, you call commit.
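
For example, the following sketch removes the GainEffect plug-in registered in Example 6-6 (the class name is carried over from that example; note that commit can throw an IOException):

 // Remove the plug-in from the registry for this session.
 PlugInManager.removePlugIn("COM.mybiz.media.GainEffect", PlugInManager.EFFECT);
 
 // Make the removal permanent.
 PlugInManager.commit();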

Note: The reference implementation of JMF 2.0 provided by Sun Microsystems, Inc. and IBM Corporation provides a utility application, JMFRegistry, that you can use to register plug-ins interactively.

Implementing Custom Data Sources and Media Handlers

Custom DataSources and MediaHandlers such as Players and Processors can be used seamlessly with JMF to support new formats and integrate existing media engines with JMF.

Implementing a Protocol Data Source

A DataSource is an abstraction of a media protocol-handler. You can implement new types of DataSources to support additional protocols by extending PullDataSource, PullBufferDataSource, PushDataSource, or PushBufferDataSource. If you implement a custom DataSource, you can implement Demultiplexer and Multiplexer plug-ins that work with your custom DataSource to enable playback through an existing Processor, or you can implement a completely custom MediaHandler for your DataSource.

A DataSource manages a collection of SourceStreams of the corresponding type. For example, a PullDataSource only supports pull data-streams; it manages a collection of PullSourceStreams. Similarly, a PushDataSource only supports push data-streams; it manages a collection of PushSourceStreams. When you implement a new DataSource, you also need to implement the corresponding source stream: PullSourceStream, PullBufferStream, PushSourceStream, or PushBufferStream.

If your DataSource supports changing the media position within the stream to a specified time, it should implement the Positionable interface. If the DataSource supports seeking to a particular point in the stream, the corresponding SourceStream should implement the Seekable interface.

So that the Manager can construct your custom DataSource, the name and package hierarchy for the DataSource must follow certain conventions. The fully qualified name of your custom DataSource should be:

 <protocol package-prefix>.media.protocol.<protocol>.DataSource 

The protocol package-prefix is a unique identifier for your code (for example, COM.mybiz) that you register with the JMF PackageManager as a protocol package-prefix. The <protocol> portion of the name identifies the protocol that your new DataSource supports. For more information, see Integrating a Custom Data Source with JMF.
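
As a rough sketch of these conventions, a pull DataSource for a hypothetical xxx protocol might be laid out as follows. The package name uses COM.mybiz as the package prefix, and the method bodies are placeholders rather than working protocol code:

 package COM.mybiz.media.protocol.xxx;
 
 import java.io.IOException;
 import javax.media.Time;
 import javax.media.protocol.*;
 
 public class DataSource extends PullDataSource {
 
     protected PullSourceStream[] streams;
     protected boolean connected = false;
 
     // Report the content type of the media this source delivers.
     public String getContentType() {
         return ContentDescriptor.CONTENT_UNKNOWN;
     }
 
     // Open a connection to the media identified by the MediaLocator.
     public void connect() throws IOException {
         // A real implementation would open the protocol connection here
         // and create one PullSourceStream per stream of media data.
         connected = true;
     }
 
     public void disconnect() {
         connected = false;
     }
 
     // Start and stop the transfer of data.
     public void start() throws IOException { }
     public void stop() throws IOException { }
 
     // Return the streams managed by this DataSource.
     public PullSourceStream[] getStreams() {
         return streams;
     }
 
     public Time getDuration() {
         return DURATION_UNKNOWN;
     }
 
     public Object[] getControls() {
         return new Object[0];
     }
 
     public Object getControl(String controlType) {
         return null;
     }
 }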

Example: Creating an FTP DataSource

The example in Sample Data Source Implementation demonstrates how to support an additional protocol by implementing a custom DataSource and SourceStream. This DataSource, FTPDataSource, implements PullDataSource.

Integrating a Custom Data Source with JMF

To integrate a custom DataSource implementation with JMF you need to:

  1. Name your DataSource class and its package according to the naming conventions described above and install the package on the user's system.
  2. Register your package prefix with the PackageManager as a protocol package-prefix and commit the change.

For example, to integrate a new DataSource for the protocol type xxx, you would create and install a package called:

 <protocol package-prefix>.media.protocol.xxx.DataSource

that contains the new DataSource class. You also need to add your package prefix (an identifier for your code, such as COM.mybiz) to the protocol package-prefix list managed by the PackageManager.

Example 6-7: Registering a protocol package-prefix.
 Vector packagePrefix = PackageManager.getProtocolPrefixList();
 String myPackagePrefix = "COM.mybiz";
 // Add the new package prefix to the end of the package prefix list.
 packagePrefix.addElement(myPackagePrefix);
 PackageManager.setProtocolPrefixList(packagePrefix);
 // Save the changes to the package prefix list.
 PackageManager.commitProtocolPrefixList();

If you want to make your new DataSource available to other users, you should create a Java applet or application that performs this registration process and distribute it with your DataSource.

Implementing a Basic Controller

Controllers can be implemented to present time-based media other than audio or video data. For example, you might want to create a Controller that manages a slide-show presentation of still images.

Example: Creating a Timeline Controller

The sample in Sample Controller Implementation illustrates how a simple time-line Controller can be implemented in JMF. This Controller, TimeLineController, takes an array of time values (representing a time line) and keeps track of which segment of the time line is current.

TimeLineController uses a custom media event, TimeLineEvent, to indicate when the segment in the time line changes.

Implementing a DataSink

JMF provides a default DataSink that can be used to write data to a file. Other types of DataSink classes can be implemented to facilitate writing data to the network or to other destinations.

To create a custom DataSink, you implement the DataSink interface. A DataSink is a type of MediaHandler, so you must also implement the MediaHandler setSource method.
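
A bare-bones sketch of such a class might look like the following. The class name, the restriction to PushDataSource input, and the empty method bodies are illustrative assumptions; a real DataSink would transfer the stream data to its destination while started:

 import java.io.IOException;
 import javax.media.*;
 import javax.media.datasink.DataSinkListener;
 import javax.media.protocol.*;
 
 public class SampleDataSink implements DataSink {
 
     protected DataSource source;
     protected MediaLocator outputLocator;
 
     // MediaHandler method: accept only sources this DataSink can handle.
     public void setSource(DataSource source)
         throws IOException, IncompatibleSourceException {
         if (!(source instanceof PushDataSource))
             throw new IncompatibleSourceException(
                 "DataSource not supported: " + source);
         this.source = source;
     }
 
     public void setOutputLocator(MediaLocator output) {
         outputLocator = output;
     }
 
     public MediaLocator getOutputLocator() {
         return outputLocator;
     }
 
     public String getContentType() {
         return source.getContentType();
     }
 
     // Open the destination, transfer data while started, then clean up.
     public void open() throws IOException, SecurityException { }
     public void start() throws IOException { source.start(); }
     public void stop() throws IOException { source.stop(); }
     public void close() { }
 
     public void addDataSinkListener(DataSinkListener listener) { }
     public void removeDataSinkListener(DataSinkListener listener) { }
 }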

To use your DataSink with JMF, you need to add your package-prefix to the content package-prefix list maintained by the PackageManager. For more information, see "Integrating a Custom Media Handler with JMF".

Integrating a Custom Media Handler with JMF

To integrate a new MediaHandler with JMF, you need to:

  1. Name your MediaHandler class and its package according to the JMF naming conventions and install the package on the user's system.
  2. Register your package prefix with the PackageManager as a content package-prefix and commit the change.

For example, to integrate a new Player for the content type mpeg.sys, you would create and install a package called:

 <content package-prefix>.media.content.mpeg.sys.Handler

that contains the new Player class. The package prefix is an identifier for your code, such as COM.mybiz. You also need to add your package prefix to the content package-prefix list managed by the PackageManager.

Example 6-8: Registering a content package-prefix.
 Vector packagePrefix = PackageManager.getContentPrefixList();
 String myPackagePrefix = "COM.mybiz";
 // Add the new package prefix to the end of the package prefix list.
 packagePrefix.addElement(myPackagePrefix);
 PackageManager.setContentPrefixList(packagePrefix);
 // Save the changes to the package prefix list.
 PackageManager.commitContentPrefixList();

If you want to make your new MediaHandler available to other users, you should create a Java applet or application that performs this registration process and distribute it with your MediaHandler.

Registering a Capture Device with JMF

The implementor of a device is responsible for defining a CaptureDeviceInfo object for the device. When the device is installed, it must be registered with the CaptureDeviceManager by calling addDevice.
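
For example, an installer for a hypothetical audio capture device might register it along these lines (the device name, the locator string, and the format array are illustrative assumptions):

 // Describe the capture device: a descriptive name, a MediaLocator that
 // JMF can use to create a DataSource for the device, and the output
 // formats the device supports.
 CaptureDeviceInfo info = new CaptureDeviceInfo(
     "Sample Audio Capture Device",
     new MediaLocator("samplecapture://audio0"),
     new Format[] {new AudioFormat(AudioFormat.LINEAR)});
 
 // Register the device with the capture device registry.
 CaptureDeviceManager.addDevice(info);
 
 // Save the changes to the registry.
 CaptureDeviceManager.commit();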




Copyright © 1998-1999 Sun Microsystems, Inc. All Rights Reserved.