WebSocket

The web has been largely built around the request/response paradigm of HTTP: a client loads a web page, and then nothing happens until the user clicks through to the next page. Around 2005, AJAX started to make the web feel more dynamic. Still, all HTTP communication was initiated by the client, which required user interaction or periodic polling to load new data from the server.

Technologies that enable the server to send data to the client the moment it knows new data is available have been around for quite some time. They go by names such as "Push" or "Comet". One of the most common hacks to create the illusion of a server-initiated connection is called long polling. With long polling, the client opens an HTTP connection to the server, and the server keeps it open until it actually has new data, at which point it sends the response (other techniques involve Flash, XHR multipart requests and so-called htmlfiles). Long polling and the other techniques work quite well; you use them every day in applications such as Gmail chat. However, all of these workarounds share one problem: they carry the overhead of HTTP, which makes them poorly suited for low-latency applications. Think multiplayer first-person shooter games in the browser, or any other online game with a realtime component.
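As a rough illustration, a minimal long-polling loop might look like the following sketch (the /poll endpoint is hypothetical; it is assumed to block on the server until data is available):

function poll() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/poll"); // hypothetical endpoint that blocks until data arrives
    xhr.onload = function () {
        console.log("New data: " + xhr.responseText);
        poll(); // immediately reopen the connection and wait again
    };
    xhr.onerror = function () {
        setTimeout(poll, 1000); // back off briefly before retrying
    };
    xhr.send();
}
poll();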

WebSocket is a protocol providing full-duplex communication channels over a single TCP connection. The WebSocket protocol was standardized by the IETF as RFC 6455 in 2011, and the WebSocket API in Web IDL is being standardized by the W3C.

In addition, the communication is done over TCP port 80, which benefits environments that block non-web Internet connections using a firewall. The WebSocket protocol is currently supported in several browsers, including Google Chrome, Internet Explorer, Firefox, Safari and Opera. WebSocket also requires the web application on the server to support it.
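In the browser, a connection is created through the WebSocket constructor. The attribute, event, and method descriptions below assume a Socket object created like this (the URL is a placeholder):

var Socket = new WebSocket("ws://example.com/wsocket");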

WebSocket Attributes:

Following are the attributes of a WebSocket object, assuming we created the Socket object as shown above:

Socket.readyState
The read-only attribute readyState represents the state of the connection. It can have the following values:
0: the connection has not yet been established.
1: the connection is established and communication is possible.
2: the connection is going through the closing handshake.
3: the connection has been closed or could not be opened.

Socket.bufferedAmount
The read-only attribute bufferedAmount represents the number of bytes of UTF-8 text that have been queued using the send() method.
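For example, a client can consult these attributes before sending, so it never writes to a connection that is not open (a small sketch using the Socket object from above):

if (Socket.readyState === 1) { // 1 = connection established
    Socket.send("ping");
    console.log(Socket.bufferedAmount + " bytes still queued for sending");
} else {
    console.log("Connection not open, readyState = " + Socket.readyState);
}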

WebSocket Events:

Following are the events associated with a WebSocket object, assuming we created the Socket object as shown above:

Event     Event Handler      Description
open      Socket.onopen      This event occurs when the socket connection is established.
message   Socket.onmessage   This event occurs when the client receives data from the server.
error     Socket.onerror     This event occurs when there is any error in communication.
close     Socket.onclose     This event occurs when the connection is closed.
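Wiring up all four handlers on the Socket object might look like this minimal sketch (the full example below assigns onopen and onmessage the same way):

Socket.onopen = function () {
    console.log("Connection established");
};
Socket.onmessage = function (event) {
    console.log("Received: " + event.data); // server data arrives in event.data
};
Socket.onerror = function () {
    console.log("Communication error");
};
Socket.onclose = function () {
    console.log("Connection closed");
};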

WebSocket Methods:

Following are the methods associated with a WebSocket object, assuming we created the Socket object as shown above:

Method           Description
Socket.send()    The send(data) method transmits data using the connection.
Socket.close()   The close() method is used to terminate any existing connection.
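Used together, a sketch of the typical send-then-close sequence (the JSON payload here is arbitrary):

Socket.send(JSON.stringify({ type: "chat", text: "hello" }));
Socket.close(); // initiates the closing handshake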

Example:
This is an advanced example that takes a snapshot on the client side and sends the data to the server using a WebSocket.
To capture the client snapshot we use the html2canvas library:
http://html2canvas.hertzen.com/
https://github.com/niklasvh/html2canvas
// Please download the html2canvas source files from the links above for the screen-capture functionality and put them in the WebContent folder with your HTML files.

Create a dynamic web project in Eclipse.

1) wsocket.html
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<!-- Arun HTML File -->
<html>
<head>
<meta charset="utf-8">
<title>Tomcat web socket</title>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.js"></script>
<script type="text/javascript" src="html2canvas.js?rev032"></script>
<script type="text/javascript">
var ws = new WebSocket("ws://localhost:8080/WebSocketSample/wsocket");
ws.onopen = function () {
    console.log("Web Socket Open");
};

ws.onmessage = function (message) {
    console.log("MSG from Server :" + message.data);
    //document.getElementById("msgArea").textContent += message.data + "\n";
    document.getElementById("msgArea").textContent += " Data Sent\n";
};

function postToServerNew(data) {
    ws.send(JSON.stringify(data));
    document.getElementById("msg").value = "";
}

//Set Interval
setInterval(function () {
    var target = $('body');
    html2canvas(target, {
        onrendered: function (canvas) {
            var data = canvas.toDataURL();
            var jsonData = {
                type: 'video',
                data: data,
                duration: 5,
                timestamp: 0,    // set in worker
                currentFolder: 0 // set in worker
            };
            postToServerNew(jsonData);
        }
    });
}, 9000);

function closeConnect() {
    ws.close();
    console.log("Web Socket Closed: Bye TC");
}
</script>
</head>

<body>
  <div>
    <textarea rows="18" cols="150" id="msgArea" readonly></textarea>
  </div>
  <div>
    <input id="msg" type="text"/>
    <button type="submit" id="sendButton" onclick="postToServerNew('Arun')">Send MSG</button>
  </div>
</body>
</html>

2) MyWebSocketServlet.java
package sample;

import java.io.File;
import java.util.concurrent.ConcurrentHashMap;

import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpSession;

import org.apache.catalina.websocket.StreamInbound;
import org.apache.catalina.websocket.WebSocketServlet;

/**
 * WebSocketServlet is contained in catalina.jar. It also needs servlet-api.jar
 * on the build path.
 *
 * @author Arun
 *
 */
@WebServlet("/wsocket")
public class MyWebSocketServlet extends WebSocketServlet {

    private static final long serialVersionUID = 1L;

    // for new clients, <sessionId, streamInBound>
    private static ConcurrentHashMap<String, StreamInbound> clients = new ConcurrentHashMap<String, StreamInbound>();

    @Override
    protected StreamInbound createWebSocketInbound(String protocol, HttpServletRequest httpServletRequest) {

        // Check if exists
        HttpSession session = httpServletRequest.getSession();

        // find client
        StreamInbound client = clients.get(session.getId());
        if (null != client) {
            return client;

        } else {
            System.out.println(" session.getId() :" + session.getId());
            String targetLocation = "C:/Users/arsingh/Desktop/AnupData/DATA/" + session.getId();
            System.out.println(targetLocation);
            File fs = new File(targetLocation);
            boolean bool = fs.mkdirs();
            System.out.println(" Folder created :" + bool);
            client = new MyInBound(httpServletRequest, targetLocation + "/Output.txt");
            clients.put(session.getId(), client);
            clients.put(session.getId(), client);
        }

        return client;
    }

    /*public StreamInbound getClient(String sessionId) {
        return clients.get(sessionId);
    }

    public void addClient(String sessionId, StreamInbound streamInBound) {
        clients.put(sessionId, streamInBound);
    }*/
}

3) MyInBound.java
package sample;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.AsynchronousFileChannel;
import java.nio.channels.FileChannel;
import java.nio.charset.Charset;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;

import javax.servlet.http.HttpServletRequest;

import org.apache.catalina.websocket.MessageInbound;
import org.apache.catalina.websocket.WsOutbound;

/**
 * Needs tomcat-coyote.jar on the class path, otherwise there is a compile error:
 * "the hierarchy of the type ... is inconsistent"
 *
 * @author Arun
 *
 */
public class MyInBound extends MessageInbound {

    private String name;

    private WsOutbound myoutbound;

    private String targetLocation;

    public MyInBound(HttpServletRequest httpServletRequest, String targetLocation) {
        this.targetLocation = targetLocation;
    }

    @Override
    public void onOpen(WsOutbound outbound) {
        System.out.println("Web Socket Opened..");
        /*this.myoutbound = outbound;
        try {
            this.myoutbound.writeTextMessage(CharBuffer.wrap("Web Socket Opened.."));

        } catch (Exception e) {
            throw new RuntimeException(e);
        }*/

    }

    @Override
    public void onClose(int status) {
        System.out.println("Close client");
        // remove from list
    }

    @Override
    protected void onBinaryMessage(ByteBuffer arg0) throws IOException {
        System.out.println("onBinaryMessage Data");
        try {
            // copy the buffer's remaining bytes; arg0.toString() would only print buffer metadata
            byte[] bytes = new byte[arg0.remaining()];
            arg0.get(bytes);
            writeToFileNIOWay(new File(targetLocation), new String(bytes) + "\n");
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            //this.myoutbound.flush();
        }
    }// end of onBinaryMessage

    @Override
    protected void onTextMessage(CharBuffer inChar) throws IOException {
        System.out.println("onTextMessage Data");
        try {
            writeToFileNIOWay(new File(targetLocation), inChar.toString() + "\n");
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            //this.myoutbound.flush();
        }
    }// end of onTextMessage

    public void writeToFileNIOWay(File file, String messageToWrite) throws IOException {
        System.out.println("Data Location:" + file + "            Size:" + messageToWrite.length());

        byte[] messageBytes = messageToWrite.getBytes();
        RandomAccessFile raf = new RandomAccessFile(file, "rw");
        raf.seek(raf.length()); // append by positioning at the current end of the file
        FileChannel fc = raf.getChannel();
        // map a region at the end of the file just large enough for the message
        MappedByteBuffer mbf = fc.map(FileChannel.MapMode.READ_WRITE, fc.position(), messageBytes.length);
        mbf.put(messageBytes);
        fc.close();
        raf.close();
    }//end of method

 }

Xuggler Java

Xuggler Java Implementation

Xuggler is an easy way to uncompress, modify, and re-compress any media file (or stream) from Java. If you're a Java developer who needs to programmatically manipulate video files, either pre-recorded or live, then Xuggler is for you.

Download the SDK.
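Before the full programs below, the basic mediatool pattern they all build on is worth seeing in isolation. This is a minimal sketch (the file names are placeholders): a reader decodes the source file, and a writer attached as a listener re-encodes everything the reader produces.

package com.javacodegeeks.xuggler;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.ToolFactory;

public class SimpleTranscoder {
    public static void main(String[] args) {
        // the reader decodes packets from the source file
        IMediaReader reader = ToolFactory.makeReader("input.mp4");
        // the writer listens to the reader and re-encodes what it sees
        reader.addListener(ToolFactory.makeWriter("output.flv", reader));
        // readPacket() returns null until end-of-file or an error
        while (reader.readPacket() == null) ;
    }
}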

Program 1: Video Generator
package com.javacodegeeks.xuggler;

import java.awt.Dimension;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.File;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.TimeUnit;

import javax.imageio.ImageIO;

import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.IAudioSamples;
import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IPacket;
import com.xuggle.xuggler.IStream;
import com.xuggle.xuggler.IStreamCoder;
import com.xuggle.xuggler.IVideoPicture;

public class VideoGenerator {
    private static final String imagePath = "C:/Users/arsingh/Desktop/tempo/video";
    private static final String onlyVideoFile = "C:/Users/arsingh/Desktop/tempo/imageVideo.webm";
    private static final String audioFile = "C:/Users/arsingh/Desktop/tempo/audioFile/audio.wmv"; // audio file on your disk
    private static final String finalVideoFile = "C:/Users/arsingh/Desktop/tempo/finalVideo.mp4";

    private static Dimension screenBounds;
    private static Map<String, File> imageMap = new HashMap<String, File>();

    public static void main(String[] args) {
        screenBounds = Toolkit.getDefaultToolkit().getScreenSize();
        // createImageVideo();

        String audioVideoFile = mergeAudioVideo(onlyVideoFile, audioFile, finalVideoFile);
        // System.out.println("Audio Video Created in flv");

        // doWork(audioVideoFile,finalVideoFile);
        System.out.println("Video converted to MP4");
    }

    public static void createImageVideo() {
        final IMediaWriter writer = ToolFactory.makeWriter(onlyVideoFile);

        writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_VP8, screenBounds.width / 2, screenBounds.height / 2);

        File folder = new File(imagePath);
        File[] listOfFiles = folder.listFiles();

        int indexVal = 0;
        for (File file : listOfFiles) {
            if (file.isFile()) {
                indexVal++;
                System.out.println("file.getName() :" + file.getName());
                imageMap.put(file.getName(), file);
            }
        }

        // for (int index = 0; index < SECONDS_TO_RUN_FOR * FRAME_RATE; index++)
        // {
        for (int index = 1; index <= listOfFiles.length; index++) {
            BufferedImage screen = getImage(index);
            // BufferedImage bgrScreen = convertToType(screen,
            // BufferedImage.TYPE_3BYTE_BGR);
            BufferedImage bgrScreen = convertToType(screen, BufferedImage.TYPE_3BYTE_BGR);
            writer.encodeVideo(0, bgrScreen, 300 * index, TimeUnit.MILLISECONDS);

        }
        // tell the writer to close and write the trailer if needed
        writer.close();
        System.out.println("Image Video Created");
    }

    public static BufferedImage convertToType(BufferedImage sourceImage, int targetType) {
        BufferedImage image;
        if (sourceImage.getType() == targetType) {
            image = sourceImage;
        } else {
            image = new BufferedImage(sourceImage.getWidth(), sourceImage.getHeight(), targetType);
            image.getGraphics().drawImage(sourceImage, 0, 0, null);
        }
        return image;
    }

    private static BufferedImage getImage(int index) {

        try {
            String fileName = index + ".jpg";
            System.out.println("fileName :" + fileName);
            File img = imageMap.get(fileName);

            BufferedImage in = null;
            if (img != null) {
                System.out.println("img :" + img.getName());
                in = ImageIO.read(img);
            } else {
                System.out.println("++++++++++++++++++++++++++++++++++++++index :" + index);
                // fall back to the first frame; the map is keyed by file name, so use "1.jpg"
                img = imageMap.get("1.jpg");
                in = ImageIO.read(img);
            }
            return in;

        }

        catch (Exception e) {

            e.printStackTrace();

            return null;

        }

    }

    public static String mergeAudioVideo(String filenamevideo, String filenameaudio, String outputFile) {

        // String filenamevideo = "f:/testvidfol/video.mp4"; // this is the input file for video. you can change extension
        // String filenameaudio = "f:/testvidfol/audio.wav"; // this is the input file for audio. you can change extension

        IMediaWriter mWriter = ToolFactory.makeWriter(outputFile); // output file

        IContainer containerVideo = IContainer.make();
        IContainer containerAudio = IContainer.make();

        if (containerVideo.open(filenamevideo, IContainer.Type.READ, null) < 0)
            throw new IllegalArgumentException("Can't find " + filenamevideo);

        if (containerAudio.open(filenameaudio, IContainer.Type.READ, null) < 0)
            throw new IllegalArgumentException("Can't find " + filenameaudio);

        int numStreamVideo = containerVideo.getNumStreams();
        int numStreamAudio = containerAudio.getNumStreams();

        System.out.println("Number of video streams: " + numStreamVideo + "\n" + "Number of audio streams: " + numStreamAudio);

        int videostreamt = -1; // this is the video stream id
        int audiostreamt = -1;

        IStreamCoder videocoder = null;

        for (int i = 0; i < numStreamVideo; i++) {
            IStream stream = containerVideo.getStream(i);
            IStreamCoder code = stream.getStreamCoder();

            if (code.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
                videostreamt = i;
                videocoder = code;
                break;
            }

        }

        for (int i = 0; i < numStreamAudio; i++) {
            IStream stream = containerAudio.getStream(i);
            IStreamCoder code = stream.getStreamCoder();

            if (code.getCodecType() == ICodec.Type.CODEC_TYPE_AUDIO) {
                audiostreamt = i;
                break;
            }

        }

        if (videostreamt == -1)
            throw new RuntimeException("No video stream found");
        if (audiostreamt == -1)
            throw new RuntimeException("No audio stream found");

        if (videocoder.open() < 0)
            throw new RuntimeException("Can't open video coder");
        IPacket packetvideo = IPacket.make();

        IStreamCoder audioCoder = containerAudio.getStream(audiostreamt).getStreamCoder();

        if (audioCoder.open() < 0)
            throw new RuntimeException("Can't open audio coder");
        mWriter.addAudioStream(1, 1, audioCoder.getChannels(), audioCoder.getSampleRate());

        mWriter.addVideoStream(0, 0, videocoder.getWidth(), videocoder.getHeight());
        // mWriter.addVideoStream(0, 0,
        // ICodec.ID.CODEC_ID_MPEG4,screenBounds.width / 2, screenBounds.height
        // / 2);
        // mWriter.addVideoStream(0, 0,
        // ICodec.ID.CODEC_ID_VP8,screenBounds.width / 2, screenBounds.height /
        // 2);

        IPacket packetaudio = IPacket.make();

        while (containerVideo.readNextPacket(packetvideo) >= 0 || containerAudio.readNextPacket(packetaudio) >= 0) {

            if (packetvideo.getStreamIndex() == videostreamt) {

                // video packet
                //IVideoPicture picture = IVideoPicture.make(videocoder.getPixelType(), videocoder.getWidth(), videocoder.getHeight());
                IVideoPicture picture = IVideoPicture.make(videocoder.getPixelType(), screenBounds.width * 2, screenBounds.height * 2);
                int offset = 0;
                while (offset < packetvideo.getSize()) {
                    int bytesDecoded = videocoder.decodeVideo(picture, packetvideo, offset);
                    if (bytesDecoded < 0)
                        throw new RuntimeException("bytesDecoded not working");
                    offset += bytesDecoded;

                    if (picture.isComplete()) {
                        System.out.println("picture.getPixelType() :" + picture.getPixelType());
                    //    mWriter.encodeVideo(0, picture);

                    }
                }
            }

            if (packetaudio.getStreamIndex() == audiostreamt) {
                // audio packet

                IAudioSamples samples = IAudioSamples.make(512, audioCoder.getChannels(), IAudioSamples.Format.FMT_S32);
                int offset = 0;
                while (offset < packetaudio.getSize()) {
                    int bytesDecodedaudio = audioCoder.decodeAudio(samples, packetaudio, offset);
                    if (bytesDecodedaudio < 0)
                        throw new RuntimeException("could not detect audio");
                    offset += bytesDecodedaudio;

                    if (samples.isComplete()) {
                        mWriter.encodeAudio(1, samples);

                    }
                }
            }
        }

        mWriter.close();
        return outputFile;
    }

}

Program 2: Video Merger
package com.javacodegeeks.xuggler;
import static java.lang.System.out;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaViewer;
import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.MediaToolAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.AudioSamplesEvent;
import com.xuggle.mediatool.event.IAddStreamEvent;
import com.xuggle.mediatool.event.IAudioSamplesEvent;
import com.xuggle.mediatool.event.ICloseCoderEvent;
import com.xuggle.mediatool.event.ICloseEvent;
import com.xuggle.mediatool.event.IOpenCoderEvent;
import com.xuggle.mediatool.event.IOpenEvent;
import com.xuggle.mediatool.event.IVideoPictureEvent;
import com.xuggle.mediatool.event.VideoPictureEvent;
import com.xuggle.xuggler.IAudioSamples;
import com.xuggle.xuggler.IVideoPicture;
//import com.xuggle.mediatool.IMediaReader;

/**
 * A very simple media transcoder which uses {@link IMediaReader}, {@link
 * IMediaWriter} and {@link IMediaViewer}.
 */

public class MergingVideos
{
  /**
   * Concatenate three files.
   *
   * @param args ignored; the input and output file paths are hard-coded below.
   */

  public static void main(String[] args)
  {

    String file1 = "C:/Users/arsingh/Desktop/AnupData/myvideo1.mp4"; // change accordingly
    String file2 = "C:/Users/arsingh/Desktop/AnupData/myvideo2.mp4"; // change accordingly
    String file3 = "C:/Users/arsingh/Desktop/AnupData/myvideo1.mp4"; // change accordingly
    // String file3 = "/home/naveen/workspace/video/s4.mp4";
    // String mergefile = "/home/naveen/workspace/converted/threefile.mp4";
    String mergefile = "C:/Users/arsingh/Desktop/AnupData/myvideo12.mp4"; // change accordingly
    concatenateThreeFiles(file1, file2, file3, mergefile);
    // concatenate(file1, file2, mergefile);

  }


  public static void concatenateThreeFiles(String sourceUrl1, String sourceUrl2,String sourceUrl3,String destinationUrl)
  {
   System.out.println("transcoding starts");

   //video parameters
   final int videoStreamIndex = 0;
   final int videoStreamId = 0;
   final int width = 480 ;
   final int height = 272;

   //audio parameters

   final int audioStreamIndex = 1;
   final int audioStreamId = 0;
   final int channelCount = 2;
   final int sampleRate = 44100; //Hz

   IMediaReader reader1 = ToolFactory.makeReader(sourceUrl1);
   IMediaReader reader2 = ToolFactory.makeReader(sourceUrl2);
   IMediaReader reader3 = ToolFactory.makeReader(sourceUrl3);

   MediaConcatenator concatenator = new MediaConcatenator(audioStreamIndex,videoStreamIndex);
   reader1.addListener(concatenator);
   reader2.addListener(concatenator);
   reader3.addListener(concatenator);

   IMediaWriter writer = ToolFactory.makeWriter(destinationUrl);
   concatenator.addListener(writer);
   writer.addVideoStream(videoStreamIndex, videoStreamId, width,height);
   writer.addAudioStream(audioStreamIndex, audioStreamId, channelCount, sampleRate);

   while(reader1.readPacket() == null);

   while(reader2.readPacket() == null);

   while(reader3.readPacket() == null);

    writer.close();
   System.out.println("\nfinished merging");
  }

  /**
   * Concatenate two source files into one destination file.
   *
   * @param sourceUrl1 the file which will appear first in the output
   * @param sourceUrl2 the file which will appear second in the output
   * @param destinationUrl the file which will be produced
   */

  public static void concatenate(String sourceUrl1, String sourceUrl2,String destinationUrl)
  {
    out.printf("\ntranscode %s + %s -> %s\n", sourceUrl1, sourceUrl2,
      destinationUrl);

    //////////////////////////////////////////////////////////////////////
    //                                                                  //
    // NOTE: be sure that the audio and video parameters match those of //
    // your input media                                                 //
    //                                                                  //
    //////////////////////////////////////////////////////////////////////

    // video parameters

    final int videoStreamIndex = 0;
    final int videoStreamId = 0;
    final int width = 480;
    final int height = 272;

    // audio parameters

    final int audioStreamIndex = 1;
    final int audioStreamId = 0;
    final int channelCount = 2;
    final int sampleRate = 44100; // Hz

    // create the first media reader

    IMediaReader reader1 = ToolFactory.makeReader(sourceUrl1);

    // create the second media reader

    IMediaReader reader2 = ToolFactory.makeReader(sourceUrl2);

    // create the media concatenator

    MediaConcatenator concatenator = new MediaConcatenator(audioStreamIndex,
      videoStreamIndex);

    // concatenator listens to both readers

    reader1.addListener(concatenator);
    reader2.addListener(concatenator);

    // create the media writer which listens to the concatenator

    IMediaWriter writer = ToolFactory.makeWriter(destinationUrl);
    concatenator.addListener(writer);

    // add the video stream

    writer.addVideoStream(videoStreamIndex, videoStreamId, width, height);

    // add the audio stream

  //  writer.addAudioStream(audioStreamIndex, audioStreamId, channelCount,sampleRate);

    // read packets from the first source file until done

    while (reader1.readPacket() != null)
    {
        //writer.onWritePacket(reader1.readPacket());
    }

    // read packets from the second source file until done

    while (reader2.readPacket() != null)
    {

    }

    // close the writer

    writer.close();
    System.out.println("\nfinish");
  }

  static class MediaConcatenator extends MediaToolAdapter
  {
    // the current offset

    private long mOffset = 0;

    // the next video timestamp

    private long mNextVideo = 0;

    // the next audio timestamp

    private long mNextAudio = 0;

    // the index of the audio stream

    private final int mAudioStreamIndex;

    // the index of the video stream

    private final int mVideoStreamIndex;

    /**
     * Create a concatenator.
     *
     * @param audioStreamIndex index of audio stream
     * @param videoStreamIndex index of video stream
     */

    public MediaConcatenator(int audioStreamIndex, int videoStreamIndex)
    {
      mAudioStreamIndex = audioStreamIndex;
      mVideoStreamIndex = videoStreamIndex;
    }

    public void onAudioSamples(IAudioSamplesEvent event)
    {
      IAudioSamples samples = event.getAudioSamples();

      // set the new time stamp to the original plus the offset established
      // for this media file

      long newTimeStamp = samples.getTimeStamp() + mOffset;

      // keep track of predicted time of the next audio samples; if the end
      // of the media file is encountered, then the offset will be adjusted
      // to this time.

      mNextAudio = samples.getNextPts();

      // set the new timestamp on audio samples

      samples.setTimeStamp(newTimeStamp);

      // create a new audio samples event with the one true audio stream
      // index

      super.onAudioSamples(new AudioSamplesEvent(this, samples,
        mAudioStreamIndex));
    }

    public void onVideoPicture(IVideoPictureEvent event)
    {
      IVideoPicture picture = event.getMediaData();
      long originalTimeStamp = picture.getTimeStamp();

      // set the new time stamp to the original plus the offset established
      // for this media file

      long newTimeStamp = originalTimeStamp + mOffset;

      // keep track of predicted time of the next video picture; if the end
      // of the media file is encountered, then the offset will be adjusted
      // to this time.
      //
      // You'll note in the audio samples listener above we used
      // a method called getNextPts().  Video pictures don't have
      // a similar method because frame-rates can be variable, so
      // we don't know.  The minimum thing we do know though (since
      // all media containers require media to have monotonically
      // increasing time stamps), is that the next video timestamp
      // should be at least one tick ahead.  So, we fake it.

      mNextVideo = originalTimeStamp + 1;

      // set the new timestamp on video samples

      picture.setTimeStamp(newTimeStamp);

      // create a new video picture event with the one true video stream
      // index

      super.onVideoPicture(new VideoPictureEvent(this, picture,
        mVideoStreamIndex));
    }

    public void onClose(ICloseEvent event)
    {
      // update the offset by the larger of the next expected audio or video
      // frame time

      mOffset = Math.max(mNextVideo, mNextAudio);

      if (mNextAudio < mNextVideo)
      {
        // In this case we know that there is more video in the
        // last file that we read than audio. Technically you
        // should pad the audio in the output file with enough
        // samples to fill that gap, as many media players (e.g.
        // Quicktime, Microsoft Media Player, MPlayer) actually
        // ignore audio time stamps and just play audio sequentially.
        // If you don't pad, in those players it may look like
        // audio and video is getting out of sync.

        // However kiddies, this is demo code, so that code
        // is left as an exercise for the readers. As a hint,
        // see the IAudioSamples.defaultPtsToSamples(…) methods.
      }
    }

    public void onAddStream(IAddStreamEvent event)
    {
      // overridden to ensure that add stream events are not passed down
      // the tool chain to the writer, which could cause problems
    }

    public void onOpen(IOpenEvent event)
    {
      // overridden to ensure that open events are not passed down the tool
      // chain to the writer, which could cause problems
    }

    public void onOpenCoder(IOpenCoderEvent event)
    {
      // overridden to ensure that open coder events are not passed down the
      // tool chain to the writer, which could cause problems
    }

    public void onCloseCoder(ICloseCoderEvent event)
    {
      // overridden to ensure that close coder events are not passed down the
      // tool chain to the writer, which could cause problems
    }
  }
}

Program 3: Video Info
package com.javacodegeeks.xuggler;

import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IStream;
import com.xuggle.xuggler.IStreamCoder;

public class VideoInfo {

    private static final String filename1 = "C:/Users/arsingh/Desktop/tempo/ww.webm";
    // private static final String filename2 = "C:/Users/arsingh/Desktop/AnupData/myvideo2.mp4";
    // private static final String filenameOutput = "C:/Users/arsingh/Desktop/AnupData/myvideo2.mp4";

    public static void main(String[] args) {

        // first we create a Xuggler container object
        IContainer container = IContainer.make();
    //    IContainer container2 = IContainer.make();
        IContainer containerOutput = IContainer.make();
        // we attempt to open up the container
        int result1 = container.open(filename1, IContainer.Type.READ, null);
      //  int result2 = container2.open(filename2, IContainer.Type.READ, null);
        int resultOutput = containerOutput.open(filename1, IContainer.Type.WRITE, null);

        // check if the operation was successful
        if (result1 < 0)
            throw new RuntimeException("Failed to open media file 1");
        /*if (result2 < 0)
            throw new RuntimeException("Failed to open media file 2");
        */

        // query how many streams the call to open found
        int numStreams = container.getNumStreams();
       // int numStreams2 = container2.getNumStreams();

        // query for the total duration
        long duration = container.getDuration();

        // query for the file size
        long fileSize = container.getFileSize();

        // query for the bit rate
        long bitRate = container.getBitRate();

        System.out.println("Number of streams: " + numStreams);
        System.out.println("Duration (microseconds): " + duration);
        System.out.println("File Size (bytes): " + fileSize);
        System.out.println("Bit Rate: " + bitRate);

        // iterate through the streams to print their meta data
        for (int i=0; i<numStreams; i++) {

            // find the stream object
            IStream stream = container.getStream(i);

            // get the pre-configured decoder that can decode this stream;
            IStreamCoder coder = stream.getStreamCoder();
            containerOutput.addNewStream(coder);

            System.out.println("*** Start of Stream Info ***");

            System.out.printf("stream %d: ", i);
            System.out.printf("type: %s; ", coder.getCodecType());
            System.out.printf("codec: %s; ", coder.getCodecID());
            System.out.printf("duration: %s; ", stream.getDuration());
            System.out.printf("start time: %s; ", container.getStartTime());
            System.out.printf("timebase: %d/%d; ",
                 stream.getTimeBase().getNumerator(),
                 stream.getTimeBase().getDenominator());
            System.out.printf("coder tb: %d/%d; ",
                 coder.getTimeBase().getNumerator(),
                 coder.getTimeBase().getDenominator());
            System.out.println();

            if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_AUDIO) {
                System.out.printf("sample rate: %d; ", coder.getSampleRate());
                System.out.printf("channels: %d; ", coder.getChannels());
                System.out.printf("format: %s", coder.getSampleFormat());
            }
            else if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
                System.out.printf("width: %d; ", coder.getWidth());
                System.out.printf("height: %d; ", coder.getHeight());
                System.out.printf("format: %s; ", coder.getPixelType());
                System.out.printf("frame-rate: %5.2f; ", coder.getFrameRate().getDouble());
            }

            System.out.println();
            System.out.println("*** End of Stream Info ***");

        }
        containerOutput.close();

    }

}