JMF 2.0 API Guide



5  Capturing Time-Based Media with JMF

You can use JMF to capture media data from a capture device such as a microphone or video camera. Captured media data can be processed and rendered or stored for future use.

To capture media data, you:

  1. Locate the capture device you want to use by querying the CaptureDeviceManager.
  2. Get the CaptureDeviceInfo object for the device.
  3. Get a MediaLocator from the CaptureDeviceInfo object and use it to create a DataSource (or a Player or Processor directly).
  4. Start the Player or Processor to initiate the capture process.

When you use a capture DataSource with a Player, you can only render the captured media data. To explicitly process or store the captured media data, you need to use a Processor.

Accessing Capture Devices

You access capture devices through the CaptureDeviceManager. The CaptureDeviceManager is the central registry for all of the capture devices available to JMF. You can get a list of the available capture devices by calling the CaptureDeviceManager.getDeviceList method.

Each device is represented by a CaptureDeviceInfo object. To get the CaptureDeviceInfo object for a particular device, you call CaptureDeviceManager.getDevice:

 CaptureDeviceInfo deviceInfo = CaptureDeviceManager.getDevice("deviceName");
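You can also query for devices rather than naming one. As a sketch, the following fragment lists every registered device and then narrows the search to devices that support a particular audio format (the format arguments are illustrative):

```java
import java.util.Vector;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.format.AudioFormat;

// Passing null to getDeviceList returns every capture device
// registered with JMF.
Vector allDevices = CaptureDeviceManager.getDeviceList(null);
for (int i = 0; i < allDevices.size(); i++) {
    CaptureDeviceInfo info = (CaptureDeviceInfo) allDevices.elementAt(i);
    System.out.println(info.getName());
}

// Passing a Format restricts the list to devices that can
// capture data in that format.
Vector audioDevices = CaptureDeviceManager.getDeviceList(
    new AudioFormat(AudioFormat.LINEAR));
```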

Capturing Media Data

To capture media data from a particular device, you need to get the device's MediaLocator from its CaptureDeviceInfo object. You can either use this MediaLocator to construct a Player or Processor directly, or use the MediaLocator to construct a DataSource that you can use as the input to a Player or Processor. To initiate the capture process, you start the Player or Processor.
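The two construction paths described above might look like this (error handling omitted; assumes `deviceInfo` was obtained from the CaptureDeviceManager as shown earlier):

```java
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Player;
import javax.media.protocol.DataSource;

MediaLocator locator = deviceInfo.getLocator();

// Either construct the Player directly from the MediaLocator...
Player player = Manager.createPlayer(locator);

// ...or create a DataSource first, which you could instead pass
// to a Processor for explicit processing or storage.
DataSource source = Manager.createDataSource(locator);
Player player2 = Manager.createPlayer(source);

// Starting the Player initiates the capture process.
player2.start();
```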

Allowing the User to Control the Capture Process

A capture device generally has a set of implementation-specific attributes that can be used to control the device. Two control types are defined to enable programmatic control of capture devices: PortControl and MonitorControl. You access these controls by calling getControl on the capture DataSource and passing in the name of the control you want.

A PortControl provides a way to select the port from which data will be captured. A MonitorControl provides a means for displaying the device's capture monitor.

Like other Control objects, if there's a visual component that corresponds to the PortControl or MonitorControl, you can get it by calling getControlComponent. Adding the Component to your applet or application window will enable users to interact with the capture control.
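Retrieving these controls and their visual components can be sketched as follows (`captureSource` is assumed to be a capture DataSource obtained as described above):

```java
import java.awt.Component;
import javax.media.control.MonitorControl;
import javax.media.control.PortControl;

// Ask the capture DataSource for the controls by name.
// getControl returns null if the control isn't supported.
PortControl portControl = (PortControl)
    captureSource.getControl("javax.media.control.PortControl");
MonitorControl monitorControl = (MonitorControl)
    captureSource.getControl("javax.media.control.MonitorControl");

// If the control exists and has a visual component, add it to the
// applet or application window so users can interact with it.
if (monitorControl != null) {
    Component c = monitorControl.getControlComponent();
    if (c != null)
        add(c);
}
```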

You can also display the standard control-panel component and visual component associated with the Player or Processor you're using.

Example 5-1: Displaying GUI components for a processor.
 Component controlPanel, visualComponent; 
 if ((controlPanel = p.getControlPanelComponent()) != null)
     add(controlPanel);  
 if ((visualComponent = p.getVisualComponent()) != null) 
     add(visualComponent);  

Storing Captured Media Data

If you want to save captured media data to a file, you need to use a Processor instead of a Player. You use a DataSink to read media data from the Processor object's output DataSource and render the data to a file.

  1. Get the output DataSource from the Processor by calling getDataOutput.
  2. Construct a file writer DataSink by calling Manager.createDataSink. Pass in the output DataSource and a MediaLocator that specifies the location of the file to which you want to write.
  3. Call open on the DataSink to open the file.
  4. Call start on the DataSink.
  5. Call start on the Processor to begin capturing data.
  6. Wait for an EndOfMediaEvent, a particular media time, or a user event.
  7. Call stop on the Processor to end the data capture.
  8. Call close on the Processor.
  9. When the Processor is closed and the DataSink posts an EndOfStreamEvent, call close on the DataSink.
Example 5-2: Saving captured media data to a file.
 DataSink sink;
 MediaLocator dest = new MediaLocator("file://newfile.wav");
 try {
     sink = Manager.createDataSink(p.getDataOutput(), dest);
     sink.open();
     sink.start();
 } catch (Exception e) {
     // Handle the error: createDataSink can throw NoDataSinkException;
     // open and start can throw IOException or SecurityException.
 }
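Step 9 above requires waiting for the DataSink to post an EndOfStreamEvent before closing it. One way to do that, sketched here, is to register a DataSinkListener on the sink:

```java
import javax.media.datasink.DataSinkEvent;
import javax.media.datasink.DataSinkListener;
import javax.media.datasink.EndOfStreamEvent;

// Register a listener so we know when the DataSink has written the
// last of the data and can safely be closed.
sink.addDataSinkListener(new DataSinkListener() {
    public void dataSinkUpdate(DataSinkEvent e) {
        if (e instanceof EndOfStreamEvent)
            e.getSourceDataSink().close();
    }
});
```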

Example: Capturing and Playing Live Audio Data

To capture live audio data from a microphone and present it, you need to:

  1. Get the CaptureDeviceInfo object for the microphone.
  2. Create a Player using the MediaLocator retrieved from the CaptureDeviceInfo object. (You can create the Player by calling createPlayer(MediaLocator) or create a DataSource with the MediaLocator and use createPlayer(DataSource) to construct the Player.)
Example 5-3: Capturing and playing audio from a microphone.
 // Get the CaptureDeviceInfo for the live audio capture device
 CaptureDeviceInfo di = null;
 Vector deviceList = CaptureDeviceManager.getDeviceList(new 
                        AudioFormat("linear", 44100, 16, 2)); 
 if (deviceList.size() > 0)
     di = (CaptureDeviceInfo)deviceList.firstElement();
 else
     // Exit if we can't find a device that does linear, 44100Hz,
     // 16-bit, stereo audio.
     System.exit(-1); 
 
 // Create a Player for the capture device: 
 try {
     Player p = Manager.createPlayer(di.getLocator()); 
 } catch (IOException e) { 
 } catch (NoPlayerException e) {} 

Example: Writing Captured Audio Data to a File

You can write captured media data to a file using a DataSink. To capture and store audio data, you need to:

  1. Get a CaptureDeviceInfo object for the audio capture device.
  2. Create a Processor using the MediaLocator retrieved from the CaptureDeviceInfo object.
  3. Get the output DataSource from the Processor.
  4. Create a MediaLocator for the file where you want to write the captured data.
  5. Create a file writer DataSink using the output DataSource.
  6. Start the file writer and the Processor.

This example uses a helper class, StateHelper.java, to manage the state of the Processor. The complete source for StateHelper is included in the appendix.

Example 5-4: Writing captured audio to a file with a DataSink. (1 of 2)
 	 CaptureDeviceInfo di = null;
 	 Processor p = null;
 	 StateHelper sh = null;
 	 Vector deviceList = CaptureDeviceManager.getDeviceList(new
 	 	 	 	 AudioFormat(AudioFormat.LINEAR, 44100, 16, 2));
 	 if (deviceList.size() > 0)
 	     di = (CaptureDeviceInfo)deviceList.firstElement();
 	 else
 	     // Exit if we can't find a device that does linear, 
          // 44100Hz, 16 bit,
 	     // stereo audio.
 	     System.exit(-1);
 	 try {
 	     p = Manager.createProcessor(di.getLocator());
 	     sh = new StateHelper(p);
 	 } catch (IOException e) {
 	     System.exit(-1);
 	 } catch (NoProcessorException e) {
 	     System.exit(-1);
 	 }
 	 // Configure the processor
 	 if (!sh.configure(10000))
 	     System.exit(-1);
 	 // Set the output content type and realize the processor
 	 p.setContentDescriptor(new
                  FileTypeDescriptor(FileTypeDescriptor.WAVE));
 	 if (!sh.realize(10000))
 	     System.exit(-1);
 	 // get the output of the processor
 	 DataSource source = p.getDataOutput();
 	 // create a File protocol MediaLocator with the location of the
 	 // file to which the data is to be written
 	 MediaLocator dest = new MediaLocator("file://foo.wav");
 	 // create a datasink to do the file writing & open the sink to
 	 // make sure we can write to it.
 	 DataSink filewriter = null;
 	 try {
 	     filewriter = Manager.createDataSink(source, dest);
 	     filewriter.open();
 	 } catch (NoDataSinkException e) {
 	     System.exit(-1);
 	 } catch (IOException e) {
 	     System.exit(-1);
 	 } catch (SecurityException e) {
 	     System.exit(-1);
 	 }
 	 // if the Processor implements StreamWriterControl, we can
 	 // call setStreamSizeLimit
 	 // to set a limit on the size of the file that is written.
 	 StreamWriterControl swc = (StreamWriterControl)
 	     p.getControl("javax.media.control.StreamWriterControl");
 	 //set limit to 5MB
 	 if (swc != null)
 	     swc.setStreamSizeLimit(5000000);
 
 	 // now start the filewriter and processor
 	 try {
 	     filewriter.start();
 	 } catch (IOException e) {
 	     System.exit(-1);
 	 }
 	 // Capture for 5 seconds
 	 sh.playToEndOfMedia(5000);
 	 sh.close();
 	 // Wait for an EndOfStream from the DataSink and close it...
 	 filewriter.close();

Example: Encoding Captured Audio Data

You can configure a Processor to transcode captured media data before presenting, transmitting, or storing the data. To encode captured audio data in the IMA4 format before saving it to a file:

  1. Get the MediaLocator for the capture device and construct a Processor.
  2. Call configure on the Processor.
  3. Once the Processor is in the Configured state, call getTrackControls.
  4. Call setFormat on each track until you find one that can be converted to IMA4. (For setFormat to succeed, appropriate codec plug-ins must be available to perform the conversion.)
  5. Realize the Processor and use its output DataSource to construct a DataSink that writes the data to a file.
Example 5-5: Encoding captured audio data.  
 	 // Configure the processor
 	 if (!sh.configure(10000))
 	     System.exit(-1);
 	 // Set the output content type
 	 p.setContentDescriptor(new 
                 FileTypeDescriptor(FileTypeDescriptor.WAVE));
 
 	 // Get the track control objects
 	 TrackControl track[] = p.getTrackControls();
 	 boolean encodingPossible = false;
 	 // Go through the tracks and try to program one of them
 	 // to output ima4 data.
 	 for (int i = 0; i < track.length; i++) {
 	     try {
 	 	 track[i].setFormat(new AudioFormat(AudioFormat.IMA4_MS));
 	 	 encodingPossible = true;
 	     } catch (Exception e) {
 	 	 // cannot convert to ima4
 	 	 track[i].setEnabled(false);
 	     }
 	 }
 
 	 if (!encodingPossible) {
 	     sh.close();
 	     System.exit(-1);
 	 }
 	 // Realize the processor
 	 if (!sh.realize(10000))
 	     System.exit(-1);
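Example 5-5 treats a failed conversion as an exception. Alternatively, since TrackControl.setFormat returns the format that was actually set, or null if the track does not support the requested format, you can check the return value directly. A sketch of that variation:

```java
import javax.media.Format;
import javax.media.control.TrackControl;
import javax.media.format.AudioFormat;

TrackControl track[] = p.getTrackControls();
boolean encodingPossible = false;
for (int i = 0; i < track.length; i++) {
    // setFormat returns null if this track can't be converted
    // to the requested format; disable tracks we can't convert.
    Format set = track[i].setFormat(new AudioFormat(AudioFormat.IMA4_MS));
    if (set != null)
        encodingPossible = true;
    else
        track[i].setEnabled(false);
}
```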

Example: Capturing and Saving Audio and Video Data

In this example, a ProcessorModel is used to create a Processor to capture live audio and video data, encode the data as IMA4 and Cinepak tracks, interleave the tracks, and save the interleaved media stream to a QuickTime file.

When you construct a ProcessorModel by specifying the track formats and output content type and then use that model to construct a Processor, the Processor is automatically connected to the capture device that meets the format requirements, if there is one.

Example 5-6: Creating a capture Processor with ProcessorModel.  
 	 Format formats[] = new Format[2];
 	 formats[0] = new AudioFormat(AudioFormat.IMA4);
 	 formats[1] = new VideoFormat(VideoFormat.CINEPAK);
 	 FileTypeDescriptor outputType =
 	     new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME);
 	 Processor p = null;
 	 
 	 try {
 	     p = Manager.createRealizedProcessor(new ProcessorModel(formats,
 	 	 	 	 	 	 	 	    outputType));
 	 } catch (IOException e) {
 	     System.exit(-1);
 	 } catch (NoProcessorException e) {
 	     System.exit(-1);
 	 } catch (CannotRealizeException e) {
 	     System.exit(-1);
 	 }
 	 // get the output of the processor
 	 DataSource source = p.getDataOutput();
 	 // create a File protocol MediaLocator with the location
      // of the file to
 	 // which bits are to be written
 	 MediaLocator dest = new MediaLocator("file://foo.mov");
 	 // create a datasink to do the file writing & open the 
      // sink to make sure
 	 // we can write to it.
 	 DataSink filewriter = null;
 	 try {
 	     filewriter = Manager.createDataSink(source, dest);
 	     filewriter.open();
 	 } catch (NoDataSinkException e) {
 	     System.exit(-1);
 	 } catch (IOException e) {
 	     System.exit(-1);
 	 } catch (SecurityException e) {
 	     System.exit(-1);
 	 }
 	 // now start the filewriter and processor
 	 try {
 	     filewriter.start();
 	 } catch (IOException e) {
 	     System.exit(-1);
 	 }
 	 p.start();
 	 // stop and close the processor when done capturing...
 	 // close the datasink when EndOfStream event is received...

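The trailing comments in Example 5-6 can be filled in along these lines: close the DataSink when it posts an EndOfStreamEvent, and stop and close the Processor when you are done capturing. A sketch (the 5-second capture interval is illustrative):

```java
import javax.media.datasink.DataSinkEvent;
import javax.media.datasink.DataSinkListener;
import javax.media.datasink.EndOfStreamEvent;

// Close the DataSink once it reports that the stream has ended.
filewriter.addDataSinkListener(new DataSinkListener() {
    public void dataSinkUpdate(DataSinkEvent e) {
        if (e instanceof EndOfStreamEvent)
            e.getSourceDataSink().close();
    }
});

// Capture for a fixed interval, then stop and close the Processor.
// Closing the Processor ends its output DataSource, which in turn
// causes the DataSink to post the EndOfStreamEvent.
try {
    Thread.sleep(5000);
} catch (InterruptedException e) {}
p.stop();
p.close();
```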



Copyright © 1998-1999 Sun Microsystems, Inc. All Rights Reserved.