Tuesday, October 25, 2016

Raspberry Pi Robot with PiShield, Part 3: Fast video streaming

I fiddled for a good part of the day trying to find a low-latency local-network video streaming solution to implement FPV-like control for the robot using an RPi camera module. It turns out a bunch of existing solutions (like VLC streaming, etc.) have really high latencies. This is fine for things like a security camera, but not so much for real-time control.

Finally, this rather hacky solution using netcat appeared to work best:

1. On the client (viewer/controller, running on a Mac) side, run:

 nc -l 5000 | mplayer -fps 30 -cache 1024 - -x 640 -y 360

2. On the RPi side, run:

 raspivid -n -t 999999 -fps 12 -rot 180 -o - | nc IP_OF_VIEWER 5000

netcat handles the network part on both ends: raspivid grabs the video and pipes it into netcat on the Pi, and on the other side netcat pipes the received data into mplayer.

This was the only way I could get under 1 s of latency on the video feed, although it still wasn't great. Having a good USB Wi-Fi dongle also helped a bit. Finally, changing the dimensions of the video using raspivid actually made it much slower, probably due to resizing each frame on the fly compared to just sending out the raw feed from the camera.
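For reference, the resized variant looked roughly like the following (the 640x360 here is just an example rather than the exact numbers I used; -w and -h set the output dimensions of the stream):

 raspivid -n -t 999999 -fps 12 -rot 180 -w 640 -h 360 -o - | nc IP_OF_VIEWER 5000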

Next steps: get some more interesting sensor info from the PiShield, and get a separate power source since there seem to be random system freezes when the motor and Pi are both running off the same power bank.

UPDATE Oct 26:

On the advice of a very helpful redditor who suggested gstreamer, I gave it a shot this evening.

On Raspbian, gstreamer1.0 is currently already in the official repos, so there's no need to add custom sources as these earlier instructions describe.
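Installing it should just be a matter of something like:

sudo apt-get update
sudo apt-get install gstreamer1.0-tools

(gstreamer1.0-tools is the package that provides gst-launch-1.0; depending on what's already on the image, some of the gstreamer1.0-plugins-* packages may also be needed for the pipeline elements used below.)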

For OSX, I just downloaded and installed the latest compiled version from the official source. It also handily tells you that by default the commands are located at

/Library/Frameworks/GStreamer.framework/Commands/

From here, we can either run a TCP server on the Pi that a client connects to, in which case:

RPi:

raspivid -t 0 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse !  rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=minibian.local port=5000

Mac (note the explicit path to the gst-launch-1.0 command, since I haven't added it to my PATH):

/Library/Frameworks/GStreamer.framework/Commands/gst-launch-1.0 -v tcpclientsrc host=IP_OF_RPi port=5000  ! gdpdepay !  rtph264depay ! avdec_h264 ! videoconvert ! osxvideosink sync=false
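If you'd rather not type the full path every time, adding the Commands directory to your shell's PATH works too:

export PATH="/Library/Frameworks/GStreamer.framework/Commands:$PATH"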

Now, UDP should be faster, and in this case it's similar to the netcat example: the RPi defines the UDP "sink" using the IP address and port of the viewer (on the Mac), which gives the following:

RPi (note that this time the destination is explicitly defined on the sending side):

raspivid -t 0 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse !  rtph264pay config-interval=1 pt=96 ! gdppay ! udpsink host=IP_OF_VIEWER port=5000

Mac:


/Library/Frameworks/GStreamer.framework/Commands/gst-launch-1.0 -v udpsrc port=5000  ! gdpdepay !  rtph264depay ! avdec_h264 ! videoconvert ! osxvideosink sync=false
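(Side note: if the receiving pipeline shows nothing, a quick way to sanity-check the Mac-side GStreamer install on its own, independent of the Pi, is to render a local test pattern:

/Library/Frameworks/GStreamer.framework/Commands/gst-launch-1.0 videotestsrc ! videoconvert ! osxvideosink

If a window with the test pattern pops up, the install and the video sink are working.)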

Eyeballing the two versions, I feel like the UDP version is slightly snappier, but considering that my network is not very congested, I suspect the difference would be bigger if I had a lot of traffic running. Regardless, this is a HUGE improvement over the netcat solution! I should try to do some timing tests to get a better measure of the actual latency - perhaps using a photo of a stopwatch like this guy did (which was also one of the many sources I originally consulted for the netcat solution above)!
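A rough way to do that would be to run a millisecond counter in a terminal on the Mac, point the camera at the screen, and photograph the counter next to the streamed video. A quick-and-dirty counter (shelling out to python, since the stock OSX date command has no sub-second field) could be something like:

while :; do printf '\r%s' "$(python -c 'import time; print(int(time.time()*1000))')"; done

It only updates every few tens of milliseconds because of the process-spawn overhead, but that should be enough resolution here.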

Finally, here's a test of the actual video latency of the setup above: it looks like it's somewhere between 150 and 200 ms.



FINAL UPDATE (Jan 2017)

Yet another option is to use RPiCamWebInterface. In the end I didn't get around to doing the latency measurement, but it feels *almost* as fast as the previous solution. The other bonus is that the client user interface is far nicer!
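If I remember right, setup is roughly just cloning the project and running its install script (check the project's README for the current steps, since they may have changed):

git clone https://github.com/silvanmelchior/RPi_Cam_Web_Interface.git
cd RPi_Cam_Web_Interface
./install.sh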

Here's a sample video showing the stream in action:

