Just moozing

Before you can check your notes, you must make them…

gstreamer and HTML5


It is fairly trivial to have gstreamer retrieve video using HTTP and send it through sockets. I want to use the HTML5 video tag to show the gstreamer video.

HTTP server

I didn’t find an HTTP server as part of gstreamer. Suggestions include shout2send and tcpserversink, but I wanted a trivial HTTP server. I implemented a simple server in Python.

import subprocess # for piping
from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler

class RequestHandler(BaseHTTPRequestHandler):
    def _writeheaders(self):
        self.send_response(200) # 200 OK HTTP response
        self.send_header('Content-type', 'video/ogg')
        self.end_headers()

    def do_HEAD(self):
        self._writeheaders()

    def do_GET(self):
        self._writeheaders()

        DataChunkSize = 10000

        command = '(echo "--video boundary--"; curl -s http://trendnetcam/video.cgi;) | gst-launch fdsrc do-timestamp=true ! multipartdemux boundary="video boundary--" ! jpegdec ! videorate ! video/x-raw-yuv,framerate=4/1 ! theoraenc ! oggmux ! filesink location=/dev/stdout'
        print("running command: %s" % (command, ))
        p = subprocess.Popen(command, stdout=subprocess.PIPE, bufsize=-1, shell=True)

        print("starting polling loop.")
        while p.poll() is None:
            print("looping... ")
            stdoutdata = p.stdout.read(DataChunkSize)
            self.wfile.write(stdoutdata) # send the encoded video to the browser

        print("Done Looping")

        print("dumping last data, if any")
        stdoutdata = p.stdout.read(DataChunkSize)
        self.wfile.write(stdoutdata)

if __name__ == '__main__':
    serveraddr = ('', 8765) # listen on port 8765
    srvr = HTTPServer(serveraddr, RequestHandler)
    srvr.serve_forever()

The basic idea is to have an webserver running and when a browser connects, gstreamer is started and data is piped as the file content.
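The Ogg container makes this easy to sanity-check: every Ogg page begins with the 4-byte capture pattern "OggS", so the first bytes the server sends should match it. A small helper for that check (the function name is my own invention, not part of the server above):

```python
def looks_like_ogg(first_bytes):
    # Every Ogg page starts with the 4-byte capture pattern "OggS".
    # If the server is really piping theora/ogg from gstreamer, the
    # very first bytes of the HTTP body should match it.
    return first_bytes.startswith(b"OggS")

# Example: fetch the first bytes from the running server and check them.
# import urllib2
# print(looks_like_ogg(urllib2.urlopen("http://localhost:8765").read(4)))
```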

The gst-launch command in do_GET() is described in a previous blog entry.

The webpage

The page is accessible on http://localhost:8765 and whatever other IP the machine has (using port 8765). The HTML part could look like this.

    <title>HTML5 video test page</title>
    Connection to localhost port 8765 for video stuff. <br />
    <a href="http://www.w3schools.com/html5/html5_video.asp">something about HTML5 video</a>

    <video autoplay>
        <source src="http://localhost:8765" />
        Your browser does not support the video tag.
    </video>

This is a basic HTML page using a video tag. It will show the video in an embedded player. The w3schools link above has more information about the video tag.

More to come?

There is a lot that could be improved:

  • Security! I just copied from the first example I found. do_POST() might still be implemented and could be used to read files.
  • Error handling is non-existent.
  • Currently only one stream is served. It would be cool to have a dynamic list of streams that could be accessed.
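The last point could start as nothing more than a lookup table mapping request paths to pipelines, chosen in do_GET(). A rough sketch; the paths and the '/webcam' pipeline are made-up examples, not tested code from this post:

```python
# Hypothetical registry of streams: map a request path to the shell
# command whose stdout do_GET() would pipe to the browser.
PIPELINES = {
    '/trendnet': '(echo "--video boundary--"; curl -s http://trendnetcam/video.cgi;) | '
                 'gst-launch fdsrc do-timestamp=true ! multipartdemux boundary="video boundary--" ! '
                 'jpegdec ! videorate ! video/x-raw-yuv,framerate=4/1 ! theoraenc ! oggmux ! '
                 'filesink location=/dev/stdout',
    '/webcam': 'gst-launch v4l2src device=/dev/video0 ! theoraenc ! oggmux ! '
               'filesink location=/dev/stdout',
}

def pipeline_for(path):
    # Look up the command for a request path; None would mean a 404.
    return PIPELINES.get(path)
```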

Written by moozing

December 29, 2011 at 09:00

Posted in Tech


12 Responses


  1. Hello,

    Your example is quite amazing. I want to adapt it for my case: I want to stream my camera output in a web browser. I succeeded in sending my camera feed to the following address using this command line.

    gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480,framerate=30/1' ! ffmpegcolorspace ! jpegenc ! multipartmux ! tcpserversink host=localhost port=8080

    The problem is I can't display it in my browser using the HTML5 video tag. I also tried with the theora encoder and oggmux options, but it still doesn't work.

    In your example, the video comes from the URL "http://trendnetcam/video.cgi". In my case, should I try to read directly from /dev/video0, or create a tcpserversink and read from localhost:8080?

    Looking forward to hearing from you,



    January 5, 2012 at 12:22

    • Thanks. It is nice to know that people are actually reading what I write.

      In my example gstreamer receives the data from stdin using fdsrc; you should be able to use v4l2src instead. tcpserversink sends a stream but lacks all the HTTP stuff, so it is not useful for this.

      This works for me
      gst-launch v4l2src device=/dev/video0 ! video/x-raw-yuv,width=640,height=480,framerate=10/1 ! theoraenc ! oggmux ! filesink location=/dev/stdout

      Note that the framerate is changed, my camera couldn’t do 30/1 🙂

      The entry referred to above, about the gst-launch command, includes some debugging tips. I find the debug option --gst-debug-level=2 or 3 especially useful.


      January 5, 2012 at 21:18

  2. Thank you for your answer.

    Yesterday, I succeeded in displaying the stream from my camera with the following command lines.

    Firstly, in a terminal:

    gst-launch v4l2src device=/dev/video0 ! ffmpegcolorspace ! 'video/x-raw-yuv,width=640,height=480,framerate=30/1' ! jpegenc ! multipartmux boundary="video boundary--" ! tcpserversink host=localhost port=8080

    Secondly, as the command in your Python server:
    '(echo "--video boundary--"; curl -s http://localhost:8080;) | gst-launch fdsrc do-timestamp=true ! multipartdemux boundary="video boundary--" ! jpegdec ! videorate ! video/x-raw-yuv,framerate=30/1 ! theoraenc ! oggmux ! filesink location=/dev/stdout'

    The problem is I had a big delay, like 4-5 sec minimum.
    I tried your command this morning: it works and the delay is lower, but it's still around 2 sec. Any idea how to reduce it? Maybe another encoding?



    January 6, 2012 at 10:31

  3. I tried to encode with webmmux and vp8enc. The problem is I can't write to /dev/stdout.
    From the debugger:
    gst_file_sink_event: error: Error while seeking in file "/dev/stdout".

    Same with x264enc.

    I tried several parameters in ogg encoding, like:
    - quality
    - bitrate
    - speed-level
    - keyframe-freq
    The 2 seconds of delay are still here…
    Is there good documentation for gstreamer other than http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-theoraenc.html ?
    The parameters are not really well described.

    Have a nice week-end



    January 6, 2012 at 17:24

    • I suspect that a big part of the delay is some sort of browser buffering. Maybe we have to ditch the HTTP part if we want more real time data.

      And no, the gstreamer homepage is the best resource I have located. The alternative is finding examples on the internet, but they tend to be very specific, like mine 🙂


      September 5, 2012 at 22:02

  4. Hi, it's a very good guide, but I just can't make it work. I made a copy-paste of your code, changing only the camera address, but I get no video and no error or anything. I tried sending the data to a file, but the file never grows; it always has a size of 0 bytes. Any idea?



    August 30, 2012 at 20:54

    • I just realized that the camera had login enabled, so gstreamer was not able to connect. Once I disabled the security it worked, thank you. =)


      August 30, 2012 at 21:47

      • I have elaborated a lot on this in my CamProxy project. Still WIP, but it could give you some ideas.
        For me the big problem is that, yes, the cameras do MJPEG, but they "forget" newlines, headers or whatever gstreamer needs to work.


        September 5, 2012 at 22:00

      • http://login:password@ , you are telling me this isn't working?


        September 13, 2012 at 00:44

      • I am saying "maybe" 🙂

        I found that some of the MJPEG cameras are not strictly conformant, and that gstreamer had a problem reading the stream.

        Using something like wireshark to see the exact traffic helped me to see the problem.


        September 16, 2012 at 14:00

  5. Hi moozing, benoit and everyone else,

    I was really interested in this guide.
    Like Benoit, I tried to stream my webcam to an HTML5 page, but unfortunately without success 😦

    I ran the following command in a shell:

    gst-launch --gst-debug-level=2 v4l2src device=/dev/video0 ! ffmpegcolorspace ! 'video/x-raw-yuv,width=640,height=480,framerate=30/1' ! jpegenc ! multipartmux boundary="video boundary--" ! tcpserversink host=localhost port=8080

    in order to stream my webcam to the address http://localhost:8080

    then I modified the server as follows:

    command = '(echo "--video boundary--"; curl -s http://localhost:8080;) | gst-launch --gst-debug-level=2 fdsrc do-timestamp=true ! video/x-raw-yuv,width=640,height=480,framerate=30/1 ! theoraenc ! oggmux ! filesink location=/dev/stdout'
    print("running command: %s" % (command, ))
    p = subprocess.Popen(command, stdout=subprocess.PIPE, bufsize=-1, shell=True)

    the final result is that when I open the page the screen remains black, and in the two terminals I get some warnings…
    In the first one I have the following:

    0:00:00.143727710 8985 0x10a6cd0 WARN bin gstbin.c:2399:gst_bin_do_latency_func: failed to query latency
    New clock: GstSystemClock
    0:00:01.868448569 8985 0x10b1940 WARN bin gstbin.c:2395:gst_bin_do_latency_func: did not really configure latency of 0:00:00.033333333
    0:00:10.235723274 8985 0xe946d0 WARN GST_POLL gstpoll.c:1040:gst_poll_fd_has_closed: 0xe94680: couldn’t find fd !
    0:00:10.235779776 8985 0xe946d0 WARN GST_POLL gstpoll.c:1089:gst_poll_fd_has_error: 0xe94680: couldn’t find fd !
    0:00:10.235793744 8985 0xe946d0 WARN GST_POLL gstpoll.c:1117:gst_poll_fd_can_read_unlocked: 0xe94680: couldn’t find fd !
    0:00:10.235805617 8985 0xe946d0 WARN GST_POLL gstpoll.c:1189:gst_poll_fd_can_write: 0xe94680: couldn’t find fd !

    and in the other one:

    ERROR: the pipeline does not want to preroll.
    0:00:00.044157294 8996 0x27a78f0 WARN basetransform gstbasetransform.c:1739:gst_base_transform_prepare_output_buffer: pad-alloc failed: error
    0:00:00.044208278 8996 0x27a78f0 WARN basetransform gstbasetransform.c:2548:gst_base_transform_handle_buffer: could not get buffer from pool: error
    0:00:00.044309966 8996 0x27a78f0 WARN basesrc gstbasesrc.c:2625:gst_base_src_loop: error: Internal data flow error.
    0:00:00.044338182 8996 0x27a78f0 WARN basesrc gstbasesrc.c:2625:gst_base_src_loop: error: streaming task paused, reason error (-5)
    Done Looping
    dumping last data, if any

    Any idea how to solve it?


    February 7, 2013 at 18:23

    • hello

      this morning I ran another test in which I made the server print what it read from stdout.
      In code:

      print("starting polling loop.")
      while(p.poll() is None):
          print("looping... ")
          stdoutdata = p.stdout.read(DataChunkSize)
          print("%s" % (stdoutdata, ))

      so I understand that the server receives the chunks from the webcam, but the browser (chrome-stable (24.0.1312.69) release for debian) continues to be black :(:(:(


      February 8, 2013 at 12:41
