Just moozing

Before you can check your notes, you must make them…

IP camera, gstreamer and virtual video devices

I have an old IP camera that I want to use for surveillance. The first step is to get its video into the computer in some usable form. It ended up as a virtual video device created with GStreamer. Really cool.

The camera

It is a Trendnet TV-IP100. Being a couple of years old, it is probably not the thing to use for professional setups, but it can send MJPEG video, and that is what I am going to use.

The usual nmap prodding to see what the device offers:

# nmap -O -T4 trendnetcam -p0-65535 -v

Starting Nmap 5.21 ( http://nmap.org ) at 2011-12-25 19:32 CET
Nmap wishes you a merry Christmas! Specify -sX for Xmas Scan (http://nmap.org/book/man-port-scanning-techniques.html).
Initiating ARP Ping Scan at 19:32
Scanning trendnetcam (192.168.1.154) [1 port]
Completed ARP Ping Scan at 19:32, 0.01s elapsed (1 total hosts)
Initiating Parallel DNS resolution of 1 host. at 19:32
Completed Parallel DNS resolution of 1 host. at 19:32, 0.00s elapsed
Initiating SYN Stealth Scan at 19:32
Scanning trendnetcam (192.168.1.154) [65536 ports]
Discovered open port 80/tcp on 192.168.1.154
Discovered open port 21/tcp on 192.168.1.154
Discovered open port 8465/tcp on 192.168.1.154
Discovered open port 8481/tcp on 192.168.1.154
Completed SYN Stealth Scan at 19:32, 52.19s elapsed (65536 total ports)
Initiating OS detection (try #1) against trendnetcam (192.168.1.154)
Nmap scan report for trendnetcam (192.168.1.154)
Host is up (0.0011s latency).
rDNS record for 192.168.1.154: trendnetcam.lan
Not shown: 65532 closed ports
PORT     STATE SERVICE
21/tcp   open  ftp
80/tcp   open  http
8465/tcp open  unknown
8481/tcp open  unknown
MAC Address: 00:14:D1:xx:xx:xx (Trendware International)
Device type: webcam
Running: TRENDnet embedded
OS details: TRENDnet TV-IP100 Internet camera
Network Distance: 1 hop
TCP Sequence Prediction: Difficulty=17 (Good luck!)
IP ID Sequence Generation: Incremental

Read data files from: /usr/share/nmap
OS detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 55.43 seconds
Raw packets sent: 67395 (2.967MB) | Rcvd: 65551 (2.622MB)

Nmap detects that it is running the OS “TRENDnet TV-IP100 Internet camera”. How cool is that? I thought that if it was that well known, Metasploit might have some exploits for it too, but I didn’t find any. Too bad, that could have been fun.

So we have HTTP running – I knew that. Ports 8465 and 8481 probably relate to UPnP, which I currently have no use for. The FTP part could be interesting, but not today.

The web interface is so ugly that I wanted to take some screenshots, but I don’t have the administrator password. I was going to do a factory reset by pressing the reset button for 5 seconds, but that requires a paper clip of some sort, which I don’t have at hand. It will just have to get wiped some other time.

Getting video

GStreamer needs to know where to get the MJPEG stream. The Motion homepage gives the exact URLs to use; in our case it is http://trendnetcam/VIDEO.CGI. By the way, Motion is a really usable piece of surveillance software (to be installed later).

Testing it as described in this blog entry:

$ gst-launch -vet souphttpsrc location=http://trendnetcam/video.cgi timeout=5 ! jpegdec ! autovideosink

It works and opens a window with the video. I didn’t find an exact value, but the frame rate seemed rather high.

Complete output (should you be interested):

$ gst-launch -vet souphttpsrc location=http://trendnetcam/video.cgi timeout=5 ! jpegdec ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 3540986465 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = NULL
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:src: caps = NULL
Setting pipeline to NULL ...
Freeing pipeline ...

GStreamer works with pipelines. The command above reads from the HTTP source souphttpsrc, decodes the data with the JPEG decoder jpegdec, and finally shows it using the default graphical sink autovideosink.

GStreamer quickly gets complicated, and I will look into doing it from Python instead of the command line.
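As a small first step toward scripting it: a pipeline description is just element names joined by “!”, so it can be composed in Python before being handed to gst-launch (or, later, to the real GStreamer bindings). A minimal sketch; the helper function pipeline and its shape are my own invention, while the element names and properties are the ones used above.

```python
# Compose a gst-launch pipeline description from (element, properties) pairs.
# Building the string in Python makes it easy to swap sources and sinks later.
def pipeline(*elements):
    parts = []
    for name, props in elements:
        parts.append(" ".join([name] + [f"{k}={v}" for k, v in props.items()]))
    return " ! ".join(parts)

desc = pipeline(
    ("souphttpsrc", {"location": "http://trendnetcam/video.cgi", "timeout": 5}),
    ("jpegdec", {}),
    ("autovideosink", {}),
)
print(desc)
# souphttpsrc location=http://trendnetcam/video.cgi timeout=5 ! jpegdec ! autovideosink
```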

Virtual camera device

I located this blog entry, which describes how to get your streams shown in Skype or other v4l/v4l2-only programs.

It requires the v4l2loopback device. This is a kernel module that is not part of the mainline kernel, so we need to compile it ourselves. Luckily, it is packaged in Debian and uses the DKMS framework (which is also really cool).

With some hints from this blog, I succeeded in compiling and installing the module using these commands:

# apt-get install v4l2loopback-dkms
# apt-get install linux-headers-3.1.0-1-686-pae
# dpkg-reconfigure v4l2loopback-dkms
# modprobe -v v4l2loopback
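After the modprobe, it is easy to check that the module actually got loaded: lsmod just reads /proc/modules. A quick sketch; the parsing function takes the file contents as a string so it can be tried without the module loaded, and the sample lines below are made up.

```python
# Sketch: confirm that a kernel module shows up in /proc/modules, which is
# what lsmod reads. The first whitespace-separated token on each line is
# the module name.
def module_loaded(proc_modules_text, name):
    return any(line.split()[0] == name
               for line in proc_modules_text.splitlines() if line.strip())

# On a real system:
#   with open("/proc/modules") as f:
#       print(module_loaded(f.read(), "v4l2loopback"))
sample = ("v4l2loopback 65536 0 - Live 0x0000000000000000\n"
          "snd 114688 8 snd_hda_intel, Live 0x0000000000000000")
print(module_loaded(sample, "v4l2loopback"))  # True
print(module_loaded(sample, "uvcvideo"))      # False
```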

This gives me a v4l2 loopback device on /dev/video1, since my regular built-in USB camera is on /dev/video0.
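To see where the loopback device landed, one can simply list the video nodes. A tiny sketch; the directory is a parameter only so the function can be demonstrated on a throwaway directory instead of /dev.

```python
# Sketch: list the video device nodes, to see where the loopback device
# ended up (e.g. /dev/video0 for the built-in camera, /dev/video1 for
# the loopback).
import glob
import os
import tempfile

def video_devices(dev_dir="/dev"):
    """Return the sorted video device nodes found in dev_dir."""
    return sorted(glob.glob(os.path.join(dev_dir, "video*")))

# Demo on a throwaway directory standing in for /dev:
with tempfile.TemporaryDirectory() as d:
    for name in ("video1", "video0", "null"):
        open(os.path.join(d, name), "w").close()
    print([os.path.basename(p) for p in video_devices(d)])  # ['video0', 'video1']
```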

Testing it with GStreamer’s built-in test signal:

gst-launch -v videotestsrc ! navigationtest ! v4l2sink

Using Cheese, I am able to see the test signal. Why Cheese? It is installed by default and gives easy access to the v4l2 devices.

And to see the live MJPEG stream from the IP camera, run this:

gst-launch -vet souphttpsrc location=http://trendnetcam/video.cgi timeout=5 ! jpegdec ! v4l2sink

In summary, going through the above gives me access to the MJPEG stream of my IP camera. It should work the same for other cameras, as long as you can locate the correct URL.
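For the curious, the MJPEG stream itself is easy to pick apart without GStreamer: each JPEG frame starts with the marker FFD8 and ends with FFD9. The sketch below scans a byte stream for those markers. Note that a real MJPEG-over-HTTP stream also carries multipart boundary headers, which this simplification just skips past, and the demo bytes are made up.

```python
# Minimal sketch: pull JPEG frames out of an MJPEG byte stream by scanning
# for the JPEG start-of-image (FFD8) and end-of-image (FFD9) markers.
import io

SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

def iter_jpeg_frames(stream, chunk_size=4096):
    """Yield complete JPEG frames from a file-like byte stream."""
    buf = b""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        buf += chunk
        while True:
            start = buf.find(SOI)
            if start < 0:
                buf = b""
                break
            end = buf.find(EOI, start + 2)
            if end < 0:
                buf = buf[start:]  # keep the partial frame for the next chunk
                break
            yield buf[start:end + 2]
            buf = buf[end + 2:]

# Demo with an in-memory stream standing in for the camera:
fake = b"--boundary\r\n" + SOI + b"frame-one" + EOI + b"\r\n" + SOI + b"frame-two" + EOI
frames = list(iter_jpeg_frames(io.BytesIO(fake)))
print(len(frames))  # 2
```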

Even when no programs are using the device, gst-launch uses about 10% CPU. An on-demand solution could be nice, but this is going to run on a server with motion detection anyway, so it might not be important.
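Since the pipeline is going to run unattended on the server, a tiny supervisor loop that restarts gst-launch whenever it exits might be enough. A sketch; max_restarts and the delay are arbitrary choices of mine, and the demo swaps in a harmless stand-in command so it can run anywhere.

```python
# Sketch of a tiny supervisor: keep the gst-launch pipeline running,
# restarting it whenever it exits. The pipeline is the one from the post.
import subprocess
import time

CMD = ["gst-launch", "-vet",
       "souphttpsrc", "location=http://trendnetcam/video.cgi", "timeout=5",
       "!", "jpegdec", "!", "v4l2sink"]

def supervise(cmd, max_restarts=3, delay=1.0):
    """Run cmd repeatedly until it has been started max_restarts times."""
    runs = 0
    while runs < max_restarts:
        runs += 1
        subprocess.call(cmd)  # blocks until the pipeline exits
        time.sleep(delay)
    return runs

# Demo with a harmless stand-in command instead of gst-launch:
print(supervise(["true"], max_restarts=2, delay=0))  # 2
```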

Written by moozing

December 26, 2011 at 09:00

Posted in Tech

  2. “but this is going to run on a server with motion detection running”
    How do you think this method of yours and the face detection that I found could work together? – http://www.smartjava.org/content/face-detection-using-html5-javascript-webrtc-websockets-jetty-and-javacvopencv

    Justs (@MrJusts)

    August 20, 2012 at 03:00

    • Interesting find.
The way I understand it is that he interfaces HTML5 with Java through a socket on localhost. That could work – you need something special if you don’t want to code all the image processing in JavaScript 🙂

Piping image data through a localhost socket lets you use every image processing library out there. Still, the focus for me is to have simple devices show what the servers find, so I am primarily working on moving the processing away from the client – HTML5 is just the front-end.

      And yes, you should be able to use the virtual v4l2 device as you would any other v4l2 device (like the built-in USB webcam)

      moozing

      September 5, 2012 at 21:52

      • Thanks for the reply,

        “That could work – you need something special if you don’t want to code all the image processing in JavaScript” – any suggestions 😀?

        Justs (@MrJusts)

        September 13, 2012 at 01:29

      • My current plan includes gstreamer and opencv. And having the processing done on a server.
        This one caught my eye 🙂

        moozing

        September 16, 2012 at 14:05

  3. Hi, what is your e-mail address?

    Justs (@MrJusts)

    October 6, 2012 at 02:59

