JMF-J3D Interactions: Prototype for a Streaming 3D Chat Room

Video texturing is an important technique that lets the developer put live or prerecorded digital video on surfaces in the 3D environment. Typical applications include a 3D chat room, a 3D shopping mall, or a video game with dynamic, naturalistic texturing.

From a JMF standpoint, we need to write a class that implements the VideoRenderer interface. The question is where. Recall that a VideoRenderer is a subinterface of Renderer, which is in turn a subinterface of Plug-In. A Plug-In accepts data in a particular format and performs some processing on it, possibly producing output. In the case of our Plug-In, we want to output the data to the screen.
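In outline, a class implementing VideoRenderer must supply the methods of the whole PlugIn, Renderer, and VideoRenderer chain (plus the Controls methods inherited through PlugIn). The following is only a sketch of those obligations with placeholder bodies; Listing 14.1 later in the chapter supplies real implementations.

import java.awt.Component;
import java.awt.Rectangle;
import javax.media.Buffer;
import javax.media.Control;
import javax.media.Format;
import javax.media.ResourceUnavailableException;
import javax.media.renderer.VideoRenderer;

// Skeleton only: stub bodies that show which methods the
// PlugIn -> Renderer -> VideoRenderer hierarchy requires.
class VideoRendererSkeleton implements VideoRenderer {

    // PlugIn methods
    public String getName() { return "VideoRendererSkeleton"; }
    public void open() throws ResourceUnavailableException { }
    public void close() { }
    public void reset() { }

    // Renderer methods: format negotiation and per-frame processing
    public Format[] getSupportedInputFormats() { return new Format[0]; }
    public Format setInputFormat(Format format) { return format; }
    public void start() { }
    public void stop() { }
    public int process(Buffer buffer) { return BUFFER_PROCESSED_OK; }

    // VideoRenderer methods: where the rendered frames end up
    public Component getComponent() { return null; }
    public boolean setComponent(Component comp) { return false; }
    public void setBounds(Rectangle rect) { }
    public Rectangle getBounds() { return null; }

    // Controls methods (inherited through PlugIn)
    public Object[] getControls() { return new Control[0]; }
    public Object getControl(String controlType) { return null; }
}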

The avenue for creating a dynamic texture in Java 3D was explored in AnimatedTextureJ3D.java (from Listing 11.8 of Chapter 11, “Creating and Viewing the Virtual World”). Recall that we created a simple Graphics2D object, placed it in a BufferedImage, and added that image to a Texture2D. Although it was not necessary for performance reasons in that particular example, we also specified the image as byReference. With the video stream that we are about to use, the byReference option is practically mandatory because of the performance demands of dynamic digital video.
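A minimal sketch of that byReference arrangement, adapted for a dynamic texture, follows. The backing byte array of the BufferedImage is retrieved once, and the ImageComponent2D is constructed with byReference (and yUp) set to true so that later writes to the array can reach the texture without an extra copy. The class name, the 128x128 size, and the TYPE_3BYTE_BGR choice are illustrative only.

import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import javax.media.j3d.ImageComponent;
import javax.media.j3d.ImageComponent2D;
import javax.media.j3d.Texture2D;

public class DynamicTextureSketch {
    public static Texture2D createDynamicTexture() {
        int size = 128;  // power-of-two texture dimensions

        // The image whose pixels will be overwritten for every video frame
        BufferedImage bi =
            new BufferedImage(size, size, BufferedImage.TYPE_3BYTE_BGR);

        // Keep a handle to the raw byte array behind the image; each new
        // frame will later be written into this array
        byte[] textureData =
            ((DataBufferByte) bi.getRaster().getDataBuffer()).getData();

        // byReference = true, yUp = true: Java 3D references our image
        // rather than copying it
        ImageComponent2D ic = new ImageComponent2D(
                ImageComponent.FORMAT_RGB, size, size, true, true);
        ic.set(bi);

        Texture2D tex = new Texture2D(
                Texture2D.BASE_LEVEL, Texture2D.RGB, size, size);
        tex.setCapability(Texture2D.ALLOW_IMAGE_WRITE);
        tex.setImage(0, ic);
        return tex;
    }
}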

In short, we need to put the data stored in the Buffer of the VideoRenderer into the BufferedImage of an ImageComponent2D for every frame of video. Recall from our studies of JMF that the process() method of Renderer is called repeatedly once the Processor controlling the track is started. This is the natural place to exchange the video data with the texture data.
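Conceptually, each call to process() hands us one decoded frame in the Buffer, and our job is to move those bytes into the array backing the texture's BufferedImage and tell Java 3D the image has changed. A minimal sketch of that per-frame hand-off is shown below; it ignores the scaling and platform-specific byte orders that Listing 14.1 handles, and the class and field names are for illustration only.

import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import javax.media.Buffer;
import javax.media.PlugIn;
import javax.media.j3d.ImageComponent;
import javax.media.j3d.ImageComponent2D;
import javax.media.j3d.Texture2D;

// Sketch of the per-frame hand-off only; assumes the incoming RGB frame
// already matches the texture's size and byte order, and that the
// Texture2D has the ALLOW_IMAGE_WRITE capability set.
class FrameToTextureSketch {
    private final Texture2D texture;
    private final BufferedImage bi;
    private final byte[] textureData;

    FrameToTextureSketch(Texture2D texture, BufferedImage bi) {
        this.texture = texture;
        this.bi = bi;
        this.textureData =
            ((DataBufferByte) bi.getRaster().getDataBuffer()).getData();
    }

    // Called by JMF for every decoded video frame
    public int process(Buffer buffer) {
        byte[] frame = (byte[]) buffer.getData();

        // Overwrite the texture's backing array with the new frame
        System.arraycopy(frame, 0, textureData, 0,
                         Math.min(frame.length, textureData.length));

        // Wrap the BufferedImage by reference (and y-up) and hand it back
        // to the texture so the renderer picks up the new pixels
        ImageComponent2D ic = new ImageComponent2D(
                ImageComponent.FORMAT_RGB, bi, true, true);
        texture.setImage(0, ic);

        return PlugIn.BUFFER_PROCESSED_OK;
    }
}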

The conceptual challenge is in realizing that the VideoRenderer is not putting the video directly on the screen, but instead is passing the data to a BufferedImage. After the video data is fed to a Texture2D object, the Java 3D renderer can make the necessary transformations to put it in the 3D environment. The result is a video playing in the environment that can be seen from different angles and distances (see Figure 14.1).

Figure 14.1. Screen shot from the VideoCubes application.


Extending Texture2D by Implementing VideoRenderer

In order to keep the code as simple as possible, we will hold off on the details of swapping the video data into the BufferedImage until later. We first want to concentrate on controlling the texture through the TrackControl object and the ByReference option of ImageComponent2D.

Recall from Chapter 11 that the ImageComponent2D class encapsulates the image data for a texture applied to a Shape3D Node. The ImageComponent class works together with the Texture and Appearance classes to enable the mapping of textures onto objects. Any ImageComponent object comes in one of two flavors, ByReference and ByCopy. The fundamental difference is that ByReference establishes a reference to a RenderedImage (of which BufferedImage is the primary example) rather than copying it.

Under certain conditions (see Table 14.1), ByReference can be used. With ByReference, image data can be swapped into the ImageComponent without making a new copy of the image. The necessity of this becomes clear when we consider a video texture running at 15 frames per second; a 128x128 texture at 3 bytes per pixel refreshed that often amounts to over 700KB of image data per second. Without ByReference, we would quickly run into memory and garbage-collection problems because of the sheer number of images being created and destroyed.

The implementing class, JMFTexture2D, which extends Texture2D and implements VideoRenderer, is shown in Listing 14.1.

Listing 14.1 JMFTexture.java
import javax.media.*;
import javax.media.renderer.VideoRenderer;
import javax.media.control.*;
import javax.media.Format;
import javax.media.format.VideoFormat;
import javax.media.format.RGBFormat;
import java.awt.*;
import java.awt.image.*;
import java.awt.event.*;
import java.util.Vector;
import java.util.Random;
import java.awt.geom.*;
// Java 3D packages
import com.sun.j3d.utils.universe.*;
import com.sun.j3d.utils.geometry.Box;
import com.sun.j3d.utils.geometry.Sphere;
import com.sun.j3d.utils.geometry.Cylinder;

import javax.media.j3d.*;
import javax.vecmath.*;
import com.sun.j3d.utils.behaviors.mouse.*;
import javax.media.protocol.DataSource;

class JMFTexture2D extends Texture2D implements VideoRenderer {
    Processor p;
    protected RGBFormat inputFormat;
    protected RGBFormat supportedRGB;
    protected Format [] supportedFormats;

    boolean stateTransOK = true;
    private int textureHeight, textureWidth;
    private int videoHeight, videoWidth;

    boolean YUPFlag = true;

    int platformSpecificImageType;
    int[] waitSync = new int[0];
    BufferedImage bi;
    ImageComponent2D ic;

    byte[] textureData, jmfData;

    protected int scaledSize;

    protected Component component = null;
    protected Rectangle reqBounds = null;
    protected Rectangle bounds = new Rectangle();
    protected boolean started = false;
    protected Object lastData = null;
    Canvas3D c;

    boolean firstFrame = true;
    DataSource ds;

    public JMFTexture2D(Canvas3D c,
                        int textureHeight,
                        int textureWidth,
                        int videoHeight,
                        int videoWidth,
                        int platformSpecificImageType,
                        boolean YUPFlag) {


           super(Texture2D.BASE_LEVEL,
                 Texture2D.RGB,
                 textureHeight,
                 textureWidth);

           this.c = c;

           this.textureHeight = textureHeight;
           this.textureWidth = textureWidth;

           this.videoHeight = videoHeight;
           this.videoWidth = videoWidth;

           this.platformSpecificImageType = platformSpecificImageType;
           this.YUPFlag = YUPFlag;

           //must be able to read and write texture for image processing
           //and data swapping.
           this.setCapability(Texture2D.ALLOW_IMAGE_WRITE);
           this.setCapability(Texture2D.ALLOW_IMAGE_READ);

           // the only input format we advertise: 24-bit RGB in a byte
           // array, BGR byte order (red mask 3, green 2, blue 1)
           supportedRGB = new RGBFormat(null,
                                        Format.NOT_SPECIFIED,
                                        Format.byteArray,
                                        Format.NOT_SPECIFIED,
                                        24,
                                        3, 2, 1,
                                        3, Format.NOT_SPECIFIED,
                                        Format.TRUE,
                                        Format.NOT_SPECIFIED);

           supportedFormats = new VideoFormat[] { supportedRGB };

           // create a BufferedImage to hold the texture data; use the
           // platform-specific image type so the backing array has the
           // right number of bytes per pixel, and keep a reference to it
           bi = new BufferedImage(textureHeight,
                                  textureWidth,
                                  platformSpecificImageType);

           textureData =
               ((DataBufferByte) bi.getRaster().getDataBuffer()).getData();
       //instantiate a new ImageComponent; choose byRef and yUp
        if (platformSpecificImageType == BufferedImage.TYPE_3BYTE_BGR) {

          ic = new ImageComponent2D(ImageComponent2D.FORMAT_RGB,
                                    textureHeight,
                                    textureWidth,
                                    true,
                                    YUPFlag);

        } else {
          ic = new ImageComponent2D(ImageComponent2D.FORMAT_RGBA,
                                    textureHeight,
                                    textureWidth,
                                    true,
                                    YUPFlag);
        }

          // set bi as the ImageComponent2D's image
          ic.set(bi);

    this.setImage(0, ic);

        this.start();
        System.out.println("JMFTexture2D constructor");
        // Prepare supported input formats and preferred format

    }

   public void setMedia(DataSource ds) {
     this.ds= ds;
   }

   public boolean openMedia() {

    try {
            System.out.println("Opening media ...");
        p = Manager.createProcessor(ds);
    } catch (Exception ex) {
          System.out.println("failed to create a processor for videostream " + ds);

      return false;
    }

    System.out.println("done opening; try to configure");
    p.configure();
    System.out.println("done configuring");

        if ( !waitForState(Processor.Configured)) {
        System.out.println("Failed to configure the processor");
        return false;
    } else {
            System.out.println("waiting for state");
        }

    System.out.println("setting content descriptor");
    // use processor as a player
    p.setContentDescriptor(null);
    System.out.println("done setting content descriptor");

    // obtain the track control

        TrackControl[] tc = p.getTrackControls();

    if ( tc == null ) {
        System.out.println("Failed to get the track control from processor");
        return false;
    }

    TrackControl vtc = null;

    for ( int i =0; i < tc.length; i++ ) {
        if (tc[i].getFormat() instanceof VideoFormat ) {
        vtc = tc[i];
        break;
        }

    }

    if ( vtc == null ) {
         System.out.println("can't find video track");
        return false;
    }

    try {
        vtc.setRenderer(this);
    } catch ( Exception ex) {
        ex.printStackTrace();
        System.out.println("the processor does not support effect");
        return false;
    }
    p.setContentDescriptor(null);

    // prefetch
    p.prefetch();

    return true;
    }

    public void init() {

        p.start();

    System.out.println("start transmission");

    }

   public void movieOff() {
        p.stop();
     System.out.println("stop transmission");
    }

     public void movieOn() {
        p.start();
     System.out.println("start transmission");
    }

    public void swapAndScaleRGB() {
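        // Nearest-neighbor scale of the current video frame into the
        // texture's backing array.  The >> 7 divides by 128, the texture
        // dimension used in this example, mapping each texture pixel back
        // to a source pixel; pixels that fall outside the video frame are
        // filled with black.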

        int op, ip, x, y;

    op = 0;
    int lineStride = 3 * videoWidth;
    for ( int i = 0; i < textureHeight; i++ ) {
        for ( int j = 0; j < textureWidth; j++) {
            x = (videoWidth*j) >> 7;
            y = (videoHeight*i) >> 7;

            if ( x >= videoWidth || y >= videoHeight ) {
            textureData[op++]  = 0;
            textureData[op++]  = 0;
            textureData[op++]  = 0;
            } else {
            ip = y*lineStride + x*3;
            textureData[op++] = jmfData[ip++];
            textureData[op++] = jmfData[ip++];
            textureData[op++] = jmfData[ip++];
            }
        }
         }
    }


    public void swapAndScaleARGB() {
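        // Same nearest-neighbor scaling as swapAndScaleRGB, but for the
        // four-byte case: an opaque alpha byte (0xff) is written ahead of
        // the three color bytes for each texture pixel.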

        int op, ip, x, y;

    op = 0;
    int lineStride = 3 * videoWidth;
        for ( int i = 0; i < textureWidth; i++ )
        for ( int j = 0; j < textureHeight; j++) {
            x = (videoWidth*j) >> 7;
            y = (videoHeight*i) >> 7;

           if ( x >= videoWidth || y >= videoHeight ) {
            textureData[op++] = (byte)0xff;
            textureData[op++]  = 0;
            textureData[op++]  = 0;
            textureData[op++]  = 0;
            } else {
            ip = y*lineStride + x*3;
            textureData[op++] = (byte)0xff;
            textureData[op++] = jmfData[ip++];
            textureData[op++] = jmfData[ip++];
            textureData[op++] = jmfData[ip++];
            }
        }
    }

    public int process(Buffer buffer) {


    //get the video data
        jmfData =(byte[])(buffer.getData());

        if (platformSpecificImageType == BufferedImage.TYPE_3BYTE_BGR) {
           swapAndScaleRGB();
        } else {
           swapAndScaleARGB();
        }

        // wrap the updated BufferedImage by reference (and y-up) in a new
        // ImageComponent2D, using the format that matches the image type
        ic = new ImageComponent2D(
                 (platformSpecificImageType == BufferedImage.TYPE_3BYTE_BGR)
                     ? ImageComponent.FORMAT_RGB
                     : ImageComponent.FORMAT_RGBA,
                 bi, true, true);

    this.setImage(0, ic);
    return BUFFER_PROCESSED_OK;

    }

  //the following methods must be implemented

      public java.lang.Object[] getControls() {
        // No controls
        return (Object[]) new Control[0];
    }

    /**
     * Return the control based on a control type for the PlugIn.
     */
    public Object getControl(String controlType) {
       try {
          Class  cls = Class.forName(controlType);
          Object cs[] = getControls();
          for (int i = 0; i < cs.length; i++) {
             if (cls.isInstance(cs[i]))
                return cs[i];
          }
          return null;
       } catch (Exception e) {   // no such controlType or such control
         return null;
       }
    }

    /*************************************************************************
     * PlugIn implementation
     *************************************************************************/

    public java.lang.String getName() {
        return "JMFTexture";
    }

       public java.awt.Rectangle getBounds() {
        return reqBounds;
    }

     public javax.media.Format[] getSupportedInputFormats() {
        return supportedFormats;
    }

    public boolean waitForState(int state) {
        synchronized (waitSync) {
            try {
                // poll the processor until it reaches the requested state;
                // waiting briefly on each pass avoids a busy spin
                while ( p.getState() != state && stateTransOK ) {
                    waitSync.wait(100);
                }
            } catch (Exception ex) {}

            return stateTransOK;
        }
    }

   public boolean setComponent(java.awt.Component comp) {
        component = comp;
        return true;
    }

  public void open() throws javax.media.ResourceUnavailableException {

  }

  public Format setInputFormat(Format format) {
        if ( format != null && format instanceof RGBFormat &&
             format.matches(supportedRGB)) {

            inputFormat = (RGBFormat) format;
            Dimension size = inputFormat.getSize();
           // inWidth = size.width;
          //  inHeight = size.height;
            return format;
        } else
            return null;
    }

  public void setBounds(java.awt.Rectangle rectangle) {
  }
  public synchronized void close() {
  }

 public java.awt.Component getComponent() {
    return c;
    }
 public void reset() {
        // Nothing to do
    }
  public void start() {
      System.out.println("start called");
  }

  public void stop() {
      System.out.println("stop called");
  }


}
						

Note the considerable amount of code that goes into setting up the imaging type and data format parameters for the different operating systems. This would seem to be distinctly non–cross-platform and, in fact, it is. The reason is that Java 3D does not treat all image formats the same when texturing by reference; various formats are handled differently depending on the platform (Windows, Solaris, and so on). There is a very large number of BufferedImage formats (including custom ones), and the low-level 3D APIs (for example, OpenGL and Direct3D) do not support all of them. For unsupported formats, Java 3D therefore converts the image into an intermediate format that is compatible with OpenGL and keeps that intermediate copy around in case the texture does not change.

For performance reasons, we do not want a situation in which Java 3D is making multiple copies of an image; therefore we need to determine which platform the code is being run on and set up the images so that no copy is made.

Table 14.1 summarizes the combinations of the BufferedImage format and ImageComponent2D in which a second copy is not made. Therefore, these are the conditions that must be met in order to effectively use a dynamic texture.

Table 14.1. Image Types That Support By-Reference Under Different Platforms and Low-Level APIs

Platform/Low-Level API Version         Format
OpenGL extension GL_EXT_abgr / D3D     BufferedImage.TYPE_4BYTE_ABGR + ImageComponent.FORMAT_RGBA8
OpenGL version 1.2 and above / D3D     BufferedImage.TYPE_3BYTE_BGR + ImageComponent.FORMAT_RGB
All others                             BufferedImage.TYPE_BYTE_GRAY + ImageComponent.FORMAT_CHANNEL8

The second condition that must be met in order to avoid making image copies is to specify Y-UP as true. It is necessary to include this parameter regardless of the target platform.
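Putting Table 14.1 and the Y-UP requirement together, the setup code needs to pick a BufferedImage type and a matching ImageComponent format from the operating system name, and always pass yUp as true. The following sketch mirrors the os.name test used in Listing 14.2; the class name and the printout are illustrative only.

import java.awt.image.BufferedImage;
import javax.media.j3d.ImageComponent;

// Sketch of the platform test; the Windows/Solaris split mirrors the one
// used in Listing 14.2, and yUp is always true per the text above.
public class PlatformImageFormat {
    public static void main(String[] args) {
        String os = System.getProperty("os.name");

        int imageType;       // BufferedImage type for the backing image
        int componentFormat; // matching ImageComponent2D format

        if (os.toLowerCase().startsWith("w")) {
            // Windows: OpenGL 1.2 / D3D row of Table 14.1
            imageType = BufferedImage.TYPE_3BYTE_BGR;
            componentFormat = ImageComponent.FORMAT_RGB;
        } else if (os.toLowerCase().startsWith("s")) {
            // Solaris: GL_EXT_abgr row of Table 14.1
            imageType = BufferedImage.TYPE_4BYTE_ABGR;
            componentFormat = ImageComponent.FORMAT_RGBA8;
        } else {
            // fall back to the RGB pairing
            imageType = BufferedImage.TYPE_3BYTE_BGR;
            componentFormat = ImageComponent.FORMAT_RGB;
        }

        boolean yUp = true;  // required in all cases to avoid a copy

        System.out.println(os + ": imageType=" + imageType
                + " componentFormat=" + componentFormat + " yUp=" + yUp);
    }
}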

To create a JMFTexture2D and apply it to an object, you have to enable texture mapping and set up either RTP streaming or simple playback code. Listing 14.2 illustrates the RTP streaming mechanism.
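For the simple-playback case, all that is needed is a DataSource built from a MediaLocator, which is then handed to the texture. The following is a minimal sketch of that path; the movie URL is hypothetical, and it assumes a JMFTexture2D constructed as in Listing 14.1.

import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.protocol.DataSource;

// Sketch of the simple-playback path: the file URL is hypothetical, and
// jmfTexture is an already-constructed JMFTexture2D (Listing 14.1).
public class SimplePlaybackSketch {
    public static void playOnTexture(JMFTexture2D jmfTexture) throws Exception {
        MediaLocator locator = new MediaLocator("file:///movies/clip.mov");
        DataSource ds = Manager.createDataSource(locator);

        jmfTexture.setMedia(ds);   // give the texture its video source
        jmfTexture.openMedia();    // configure the processor, attach the renderer
        jmfTexture.init();         // start the processor
    }
}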

Listing 14.2 VideoCubes.java
import javax.media.control.*;
import com.sun.j3d.utils.behaviors.mouse.MouseRotate;
import javax.media.*;
import java.applet.Applet;
import java.awt.*;
import java.awt.event.*;
import com.sun.j3d.utils.applet.MainFrame;

import java.net.URL;
import com.sun.j3d.utils.universe.*;
import javax.media.j3d.*;
import javax.vecmath.*;
import java.awt.image.*;
import com.sun.j3d.utils.geometry.Box;
import java.awt.image.BufferedImage;
import javax.media.j3d.ImageComponent2D;
import java.awt.geom.*;

import javax.media.format.*;
import java.io.File;

import com.sun.j3d.utils.picking.behaviors.*;
import com.sun.j3d.utils.picking.*;
import java.awt.BorderLayout;
import java.awt.Component;
import java.awt.Point;
import javax.swing.*;
import javax.swing.border.BevelBorder;

import java.net.*;
import java.io.*;
import java.util.Vector;
import javax.media.rtp.*;
import javax.media.rtp.event.*;
import javax.media.rtp.rtcp.*;
import javax.media.protocol.*;
import javax.media.protocol.DataSource;
import javax.media.format.AudioFormat;
import javax.media.format.VideoFormat;
import javax.media.Format;
import javax.media.format.FormatChangeEvent;
import javax.media.control.BufferControl;


/**
 * VideoCubes to receive RTP transmission using the new RTP API.
 */
public class VideoCubes extends Applet
      implements ReceiveStreamListener, SessionListener, ControllerListener {
   String sessions[] = null;
   RTPManager mgrs[] = null;

   boolean stateTransOK = true;
   boolean dataReceived = false;

   Object dataSync = new Object();

   private View view = null;
   private PickRotateBehavior behavior1;
   private PickZoomBehavior   behavior2;
   private PickTranslateBehavior behavior3;

   Canvas3D c;

   TransformGroup objScale;
   BranchGroup scene;
   BoundingSphere bounds;
   private VirtualUniverse universe;
   Locale locale;

   int platformSpecificImageType;
   boolean platformSpecificYUP = true;  // y-up must always be true to avoid an extra image copy

   int iter;
   int[] waitSync = new int[0];

    public VideoCubes(String sessions[]) {
this.sessions = sessions;

            String os = System.getProperty("os.name");
            System.out.println("running on " + os);



            if ( os.startsWith("W") || os.startsWith("w")) {
               platformSpecificImageType = BufferedImage.TYPE_3BYTE_BGR;
        } else if (os.startsWith("S") || os.startsWith("s")){
           platformSpecificImageType = BufferedImage.TYPE_4BYTE_ABGR;
        } else {
           platformSpecificImageType = BufferedImage.TYPE_3BYTE_BGR;
            }

        init3d();
    }

    public BranchGroup createViewGraph() {

         BranchGroup objRoot = new BranchGroup();

     Transform3D t = new Transform3D();
     t.setTranslation(new Vector3f(0.0f, 0.f,0.0f));
     ViewPlatform vp = new ViewPlatform();
     TransformGroup vpTrans = new TransformGroup();
         vpTrans.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
     vpTrans.setCapability(TransformGroup.ALLOW_TRANSFORM_READ);
     vpTrans.setTransform(t);
     vpTrans.addChild(vp);
     view.attachViewPlatform(vp);
         view.setBackClipDistance(200.f);
         NavigationBehavior nav = new NavigationBehavior(vpTrans);
         vpTrans.addChild(nav);

         nav.setSchedulingBounds(bounds);

         objRoot.addChild(vpTrans);
         return objRoot;

   }



    public void init3d() {
        setLayout(new BorderLayout());

        bounds =
              new BoundingSphere(new Point3d(0.0,0.0,0.0), 100.0);

        universe = new VirtualUniverse();
    locale = new Locale(universe);

        GraphicsConfigTemplate3D g3d = new GraphicsConfigTemplate3D();
        GraphicsConfiguration gc =
            GraphicsEnvironment.getLocalGraphicsEnvironment().
                getDefaultScreenDevice().getBestConfiguration(g3d);

        // assign to the Canvas3D field (not a local) so that createVCube()
        // can pass it to the JMFTexture2D constructor
        c = new Canvas3D(gc);
        add("Center", c);

        PhysicalBody body = new PhysicalBody();
    PhysicalEnvironment environment = new PhysicalEnvironment();
    view = new View();

    view.addCanvas3D(c);
    view.setPhysicalBody(body);
        view.setPhysicalEnvironment(environment);
    // Create a simple scene and attach it to the virtual universe

        bounds = new BoundingSphere(new Point3d(0.0,0.0,0.0), 100.0);
        scene = createSceneGraph(c);

        scene.setCapability(Group.ALLOW_CHILDREN_EXTEND);
        BranchGroup vgraph = createViewGraph();

        locale.addBranchGraph(vgraph);
    locale.addBranchGraph(scene);


    // Create a scene and attach it to the virtual universe


    }

    protected boolean initialize() {
        try {
        InetAddress ipAddr;
        SessionAddress localAddr = new SessionAddress();
        SessionAddress destAddr;

        mgrs = new RTPManager[sessions.length];

        SessionLabel session;


            // Open the RTP sessions.
        for (int i = 0; i < sessions.length; i++) {

         // Parse the session addresses.
        try {
            session = new SessionLabel(sessions[i]);
        } catch (IllegalArgumentException e) {
            System.err.println("Failed to parse the session address given: " + 
sessions[i]);
            return false;
        }

        System.err.println("  - Open RTP session for: addr: "
                             + session.addr + " port: "
                              + session.port + " ttl: " + session.ttl);

        mgrs[i] = (RTPManager) RTPManager.newInstance();
        mgrs[i].addSessionListener(this);
        mgrs[i].addReceiveStreamListener(this);

        ipAddr = InetAddress.getByName(session.addr);

        if( ipAddr.isMulticastAddress()) {
            // local and remote address pairs are identical:
            localAddr= new SessionAddress( ipAddr,
                           session.port,
                           session.ttl);
            destAddr = new SessionAddress( ipAddr,
                           session.port,
                           session.ttl);
        } else {
            localAddr= new SessionAddress( InetAddress.getLocalHost(),
                                 session.port);
                    destAddr = new SessionAddress( ipAddr, session.port);
        }

        mgrs[i].initialize( localAddr);

        // You can try out some other buffer size to see
        // if you can get better smoothness.
        BufferControl bc = (BufferControl) mgrs[i].getControl(
                               "javax.media.control.BufferControl");
        if (bc != null)
            bc.setBufferLength(350);

            mgrs[i].addTarget(destAddr);
        }

        } catch (Exception e){
            System.err.println("Cannot create the RTP Session: " + e.getMessage());
            return false;
        }

    // Wait for data to arrive before moving on.

    long then = System.currentTimeMillis();
    long waitingPeriod = 30000;

    try{
        synchronized (dataSync) {
        while (!dataReceived &&
            System.currentTimeMillis() - then < waitingPeriod) {
            if (!dataReceived)
            System.err.println("  - Waiting for RTP data to arrive...");
            dataSync.wait(1000);
        }
        }
    } catch (Exception e) {}

    if (!dataReceived) {
        System.err.println("No RTP data was received.");
        //close();
        return false;
    }

        return true;
    }


    public synchronized void update(SessionEvent evt) {
    if (evt instanceof NewParticipantEvent) {
        Participant p = ((NewParticipantEvent)evt).getParticipant();
        System.err.println("  - A new participant had just joined: "
                              + p.getCNAME());
    }
    }




   private BranchGroup createVCube(DataSource ds,
                                   double scale,
                                   double xpos,
                                   double ypos,
                                   double zpos){

     BranchGroup newG = new BranchGroup();

    Transform3D t = new Transform3D();
    t.set(scale, new Vector3d(xpos, ypos, zpos));

    TransformGroup objTrans = new TransformGroup(t);
    objTrans.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
    objTrans.setCapability(TransformGroup.ALLOW_TRANSFORM_READ);
    objTrans.setCapability(TransformGroup.ENABLE_PICK_REPORTING);

    // Create a second transform group node and initialize it to the
    // identity.  Enable the TRANSFORM_WRITE capability so that
    // our behavior code can modify it at runtime.
    TransformGroup spinTg = new TransformGroup();
    spinTg.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
    spinTg.setCapability(TransformGroup.ALLOW_TRANSFORM_READ);
    spinTg.setCapability(TransformGroup.ENABLE_PICK_REPORTING);


    Appearance app = new Appearance();
    app.setCapability(Appearance.ALLOW_TEXTURE_WRITE);

    // The texture is 128x128; the incoming video is QCIF (176x144).
    // platformSpecificImageType was chosen in the VideoCubes constructor,
    // and platformSpecificYUP is always true.
    JMFTexture2D jtex = new JMFTexture2D(c,
                                         128,   // textureHeight
                                         128,   // textureWidth
                                         144,   // videoHeight
                                         176,   // videoWidth
                                         platformSpecificImageType,
                                         platformSpecificYUP);

    jtex.setMedia(ds);

    jtex.openMedia();
    jtex.init();


    app.setTexture(jtex);

    BoundingSphere b = new BoundingSphere(new Point3d(xpos,ypos,zpos), 1.0);

    Box stream = new Box(1.f,1.f,1.f,
                          Box.GENERATE_TEXTURE_COORDS |
                          Box.ENABLE_GEOMETRY_PICKING, app);


    spinTg.addChild(stream);

       // add it to the scene graph.
    objTrans.addChild(spinTg);
    newG.addChild(objTrans);
    return newG;
  }


   // end of createVCube
    public synchronized void update( ReceiveStreamEvent evt) {

    RTPManager mgr = (RTPManager)evt.getSource();
    Participant participant = evt.getParticipant();    // could be null.
    ReceiveStream stream = evt.getReceiveStream();  // could be null.

    if (evt instanceof RemotePayloadChangeEvent) {

        System.err.println("  - Received an RTP PayloadChangeEvent.");
        System.err.println("Sorry, cannot handle payload change.");
        System.exit(0);

    }

    else if (evt instanceof NewReceiveStreamEvent) {

        try {
        stream = ((NewReceiveStreamEvent)evt).getReceiveStream();
        DataSource ds = stream.getDataSource();
                iter=iter+10;

        // Find out the formats.
        RTPControl ctl = (RTPControl)ds.
                          getControl("javax.media.rtp.RTPControl");
        if (ctl != null){
            System.err.println("  - Received new RTP stream: "
                                 + ctl.getFormat());
        } else
            System.err.println("  - Received new RTP stream");

        if (participant == null)
            System.err.println(" The sender of this stream had yet to be identified.");
        else {
            System.err.println(" The stream comes from: " +
                                 participant.getCNAME());
        }


                 scene.addChild(createVCube(ds, 0.5, 0.0, 0.0,iter));
                  System.out.println("creating cube " );

        synchronized (dataSync) {
            dataReceived = true;
            dataSync.notifyAll();
        }

        } catch (Exception e) {
        System.err.println("NewReceiveStreamEvent exception " + e.getMessage());
        return;
        }

    }

    else if (evt instanceof StreamMappedEvent) {

         if (stream != null && stream.getDataSource() != null) {
        DataSource ds = stream.getDataSource();
        // Find out the formats.
        RTPControl ctl = (RTPControl)ds.
                           getControl("javax.media.rtp.RTPControl");
        System.err.println("  - The previously unidentified stream ");
        if (ctl != null)
            System.err.println("      " + ctl.getFormat());
        System.err.println("  has now been identified as sent by: "
                            + participant.getCNAME());
         }
    }

    else if (evt instanceof ByeEvent) {

         System.err.println("  - Got "bye" from: " + participant.getCNAME());

    }

    }

    private Group createStructuralElement(double scale,
                                          Vector3d pos, Color3f color,
                                          float xdim, float ydim, float zdim,
                                                int tnumber) {
    // Create a transform group node to scale and position the object.
    Transform3D t = new Transform3D();
    t.set(scale, pos);
    TransformGroup objTrans = new TransformGroup(t);

    Appearance app = new Appearance();
        ColoringAttributes ca =
                   new ColoringAttributes(color,ColoringAttributes.
                   SHADE_GOURAUD);
        app.setColoringAttributes(ca);

        Box structelem = new Box(xdim, ydim, zdim, app);

        objTrans.addChild(structelem);
    return objTrans;
    }


   public BranchGroup createSceneGraph(Canvas3D canvas) {

    // Create the root of the branch graph
    BranchGroup objRoot = new BranchGroup();

    //add walls, floors etc.

    Group rightwall =
        createStructuralElement(1.f,
                                new Vector3d( 50.0, 0.0, 0.0),
                                new Color3f(1.f,0.f,0.f),
                                2.0f, 14.0f, 100.0f, 1);
    objRoot.addChild(rightwall);

. . .



    behavior1 = new PickRotateBehavior(objRoot, canvas, bounds);
    objRoot.addChild(behavior1);

    behavior2 = new PickZoomBehavior(objRoot, canvas, bounds);
    objRoot.addChild(behavior2);

    behavior3 = new PickTranslateBehavior(objRoot, canvas, bounds);
    objRoot.addChild(behavior3);

    // Let Java 3D perform optimizations on this scene graph.
    objRoot.compile();

    return objRoot;
  }

 public synchronized void controllerUpdate(ControllerEvent evt) {
    if ( evt instanceof ConfigureCompleteEvent ||
         evt instanceof RealizeCompleteEvent ||
         evt instanceof PrefetchCompleteEvent ) {
        synchronized (waitSync) {
        stateTransOK = true;
        waitSync.notifyAll();
        }
    } else if ( evt instanceof ResourceUnavailableEvent) {
        synchronized (waitSync) {
        stateTransOK = false;
        waitSync.notifyAll();
        }
    }
    }

    /*
      A utility class to parse the session addresses.
     */
    class SessionLabel {

    public String addr = null;
    public int port;
    public int ttl = 1;

    SessionLabel(String session) throws IllegalArgumentException {

        int off;
        String portStr = null, ttlStr = null;

        if (session != null && session.length() > 0) {
        while (session.length() > 1 && session.charAt(0) == '/')
            session = session.substring(1);

        // Now see if there's a addr specified.
        off = session.indexOf('/');
        if (off == -1) {
            if (!session.equals(""))
            addr = session;
        } else {
            addr = session.substring(0, off);
            session = session.substring(off + 1);
            // Now see if there's a port specified
            off = session.indexOf('/');
            if (off == -1) {
            if (!session.equals(""))
                portStr = session;
            } else {
            portStr = session.substring(0, off);
            session = session.substring(off + 1);
            // Now see if there's a ttl specified
            off = session.indexOf('/');
            if (off == -1) {
                if (!session.equals(""))
                ttlStr = session;
            } else {
                ttlStr = session.substring(0, off);
            }
            }
        }
        }

        if (addr == null)
        throw new IllegalArgumentException();

        if (portStr != null) {
        try {
            Integer integer = Integer.valueOf(portStr);
            if (integer != null)
            port = integer.intValue();
        } catch (Throwable t) {
            throw new IllegalArgumentException();
        }
        } else
        throw new IllegalArgumentException();

        if (ttlStr != null) {
        try {
            Integer integer = Integer.valueOf(ttlStr);
            if (integer != null)
            ttl = integer.intValue();
        } catch (Throwable t) {
            throw new IllegalArgumentException();
        }
        }
    }
    }

    public static void main(String argv[]) {
        if (argv.length == 0)
            prUsage();

        VideoCubes cubes = new VideoCubes(argv);
        new MainFrame(cubes, 750, 550);

        if (!cubes.initialize()) {
            System.err.println("Failed to initialize the sessions.");
            System.exit(-1);
        }
    }


    static void prUsage() {
    System.err.println("Usage: VideoCubes <session> <session> ...");
    System.err.println("     <session>: <address>/<port>/<ttl>");
    System.exit(0);
    }

}// end of VideoCubes
						
