Sunday, May 10, 2015

Video Streaming Speeds Compared on the Original Raspberry Pi and the Raspberry Pi 2

Having managed to get reasonable video streaming working on a Raspberry Pi with minimal lag, I thought it would be a good idea to test it on a Raspberry Pi 2.

Using the Original Raspberry Pi

Using UV4L and its built-in MJPEG streaming server I was able to put together a pretty reasonable solution with a lag of about 0.5 seconds. However, I had to use a video size of 320x240 at 15 FPS, which is pretty low resolution. See the video below.
I pointed the camera at a stopwatch on a phone held up against my tablet, which is running the video app developed for the robot (see earlier post). If you pause the video at any time you can see the lag between the two clocks of roughly 0.5 seconds. If I set the video to the default 1920x1080 at 30 FPS the lag is 3-4 seconds.

Using the Raspberry Pi 2

The Pi 2 being 6 times faster, you'd hope that video rendering would be much faster too, and you'd be right!
The video below shows streaming using MJPEG at 1920x1080 at 30 FPS, and we're getting about 0.4 to 0.5 seconds of lag, which is pretty good.
I wasn't able to improve the speed by very much by dropping the frame rate and resolution. Even at 320x240 at 15 FPS I was still getting a lag of about 0.3 seconds. I think there is a certain amount of lag that cannot be overcome: the speed of the WiFi and the rendering on the tablet both add to the delay.


Using WebRTC

UV4L on a Raspberry Pi 2 supports WebRTC, which is not available on the original Pi because the required APIs are not bundled in Wheezy for the original Pi. WebRTC is a new protocol and set of APIs for real-time communication in browsers; media streaming is just one of its many uses.
Although it works very nicely, the lag was very similar to MJPEG.

So, near real-time video streaming at full resolution is possible with the Raspberry Pi 2 and UV4L.

Wednesday, April 22, 2015

Video of Robot

Video Demonstration of Robot

Useful things to see in the video are some more views of the construction and how the app interacts with the robot. You can also get a feel for the lag on the video streaming, which is not bad, probably less than a second.

Saturday, April 18, 2015

Building the Robot

For a full list of required parts, see the Parts List.

The Robot Arm

Build the arm as per the instructions provided in the kit.

The Track Base

The base was not an off-the-shelf kit but was instead assembled from parts, mainly from Tamiya. I wasn't entirely certain whether it would be suitable. Tamiya make a single tank track base kit, but it was not wide enough to accommodate the arm.

I have had issues with the tracks sometimes coming off when the base is rotating on the spot (although I have improved this - see later). However, it works and it's much cheaper than many of the other tank chassis available on the market.


To construct the track base, first build the two motor gearboxes. There are three ratios to choose from. I used the lowest ratio (203:1), which gives the tracks more torque. This is the slowest speed but is suitable for this kind of slow-moving robot.

Don't worry at this stage about the position of the main shaft, because this will be adjusted later.

Use the bolts included in the Universal Plate Set to attach the two gearboxes. They should be attached 6 holes in from the rear.


Attach the front axle brackets 5 holes in from the front.



Attach the wheel brackets underneath the main plate as shown below:


Now the wheels and tracks can be attached. The sprocketed wheels are attached to the gear box hex shaft.

At this point, adjust the gearbox hex shaft position so that the front and rear wheels are aligned. Rotate the shaft until the grub screw in the gear hub is visible, loosen it, and adjust the shaft length.

Use all the lengths of track supplied to make four track loops. Loop the tracks onto the wheels.

The hex standoffs can now be attached, one in each corner. If using the screws as in the parts list, use one of the bolts as a spacer.

DC Motors

Solder two wires to each motor and color-code the cables so they are easily identified. I left the cables quite long so that there's enough slack to remove the top plate without having to disconnect anything.

For doing maintenance on the base, the DC motors can be slipped out of the gearboxes. This allows you to leave the cables in place on the motors while completely detaching the base from the top.

Attaching the Arm

Using spare screws from the Universal Plate Set, the arm can be bolted to the plate. Pre-drill two holes in the front feet for the bolts to go through.

The USB cable for the arm is longer than it needs to be, so coil it up and attach it to the top plate using wire or tape, leaving enough length to connect to the Pi.

Power Supply

The gap between the bottom and top plates can be used to house the battery pack for the Pi. I found there wasn't quite enough space to fit the motor power supply between the plates, so this is mounted alongside the arm.

A short (15cm) micro USB cable is used to connect the battery and the Pi.

Assembling the RTK Motor Controller

Full instructions for assembling the RTK Motor Controller are included with the board.

Lego Pi Platform

I used various lego bits to assemble a mount for the Pi. Lego is of course great because you can prototype different positions and layouts. 

Alternatively, hex standoffs and more plate sets can be used to create a platform.

The lego pieces are attached to the plate using the lego piece shown below:



By chance, the shaft fits into the holes in the Universal Plate Set, which allows a cross piece to be attached. A lego axle is inserted into the holes and then connected to round axle-receiving pieces that the platform can be mounted on:


This picture also shows the DC motor wires attached to the RTK Motor Controller going through the holes in the universal plate.


This rear shot shows the lego platform and the cables taped together under the top platform.

The Pi is attached to the platform using two left-over plastic motherboard pegs, which are glued to two lego 2x2 squares, as can be seen in the picture below.

I tried to minimise the amount of glueing at all stages so that it was easier to prototype different configurations.

Make sure you leave the Robot Arm on/off switch exposed.

The battery pack for the DC motors is mounted on the side.



I'm using a Model B, so there are only two USB ports, which is just enough for the WiFi dongle and the robot arm USB cable.




Issue with Tracks Coming Off

Unfortunately, when the robot was doing a full rotation, i.e. one track going forward and one track in reverse, the track going in reverse had a tendency to come off. This was minimised by making sure the wheels were all in line. I also stapled the tracks together, which helped. Using the spare small sprocket wheels from the wheel and track set for the outside bottom wheels has also helped.

It might be that this configuration cannot deal with the weight of the arm. Any suggestions of how to improve this are gratefully received.

Camera Mount

The camera mount is made by attaching a lego hinge to the camera. The hinge is then attached to a piece of lego that is glued to the arm.


Summary

I hope this is enough to get you going building a robot controlled by a Raspberry Pi. The Tamiya kits are great low-cost educational kits; however, they are not as robust as some platforms, and this robot may be pushing them to their limits with the weight of the robot arm.


Sunday, April 12, 2015

Raspberry Pi Video Streaming to Android or Web Browser with Minimal Lag

Video streaming on the Raspberry Pi is a commonly asked question on the forums and there are many different methodologies. Although many of them work OK, the biggest difference between them is the amount of lag in the video stream. My application uses the Raspberry Pi camera attached to a robot arm. As I moved the arm I wanted to see the video in real time on my Android app. Any lag would be a killer blow to this working nicely.

Ultimately I was constrained in my possible solutions because I wanted to stream to an Android app and Android only supports a specific set of streaming protocols. See http://developer.android.com/guide/appendix/media-formats.html, specifically RTSP and HTTP.

With Android development, the MediaPlayer class is used to play video. For streaming, it needs to take its source from either an RTSP or HTTP URL. It cannot take a stream directly from a socket, because the file descriptor API of MediaPlayer expects a seekable file, which a socket obviously isn't.
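For illustration only, here is a minimal sketch of what playing a network source with MediaPlayer looks like; the RTSP URL is a made-up placeholder and the SurfaceHolder would come from a SurfaceView in your own layout:

// Requires: import android.media.MediaPlayer; import android.view.SurfaceHolder; import java.io.IOException;
private void playStream(SurfaceHolder holder) {
    MediaPlayer player = new MediaPlayer();
    try {
        // MediaPlayer will only accept an RTSP or HTTP URL as a streaming source.
        player.setDataSource("rtsp://192.168.0.10:8554/stream");   // placeholder address
    } catch (IOException e) {
        return; // handle/log the error in a real app
    }
    player.setDisplay(holder);                 // render into the SurfaceView's holder
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start();                        // start playback once prepared
        }
    });
    player.prepareAsync();                     // prepare without blocking the UI thread
}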

After a lot of trial and error I managed to get a working solution using UV4L and a WebView in Android. To dive straight into my solution, have a look at UV4L below. However, before we get there I thought it might be useful to do a quick recap of the other solutions.

Things I Tried

When trying different solutions the two main things I was looking at were video lag and ease of integration with an Android app.

The Simplest - raspivid and nc

Using raspivid and piping it into nc. See https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=87903
On the Pi:
raspivid -t 0 -fps 24 -o - | nc -k -l 8554
and on the client:
nc <pi-ip-address> 8554 | vlc --file-caching=1024 file/h264:///dev/stdin
This is certainly the simplest and easiest. However, this is only supported by a limited number of media players and not by Android. The lag with this method was 2-3 seconds.

VLC

Most of the posts suggest using VLC with something along the lines of:

raspivid -o - -t 0 -hf -w 640 -h 360 -fps 5|cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8090}' :demux=h264

Although this played in VLC on Windows, Linux and on VLC for Android, it was not in a native format supported by the Android MediaPlayer.
The lag was also terrible at about 5-6 seconds. This could not be reduced even by using small video resolutions.
VLC is doing a certain amount of buffering. If you need real time this is not the way to go.

GStreamer

Many posts talked about getting better performance with minimal lag using GStreamer. GStreamer is a framework for piping video and audio through various filters to do transformations and conversions. raspivid is piped into GStreamer, which in turn is piped into an RTSP server that clients can connect to. RTSP can also be streamed directly by MediaPlayer in Android, so this seemed like a promising solution.

However, the biggest problem is getting a recent build installed on your Raspberry Pi. You can install GStreamer 1.0 on Wheezy using apt-get, but you have to compile the RTSP server since it's not part of the package install. Unfortunately, version 1.2.3 of the RTSP server will not compile against the 1.2 version in the Wheezy repository. At this point I was faced with either trying GStreamer 0.10, which has lag, or compiling the whole of GStreamer 1.2.3 myself. GStreamer 1.4 is going into the next version of Debian (Jessie) but will not be available on Wheezy. This was all starting to get a little too much like hard work.

GStreamer might still end up being a good solution but it's tricky to get installed at the moment. However, here is a useful post on the subject:

Detailed description of setting up GStreamer (includes full build instructions): http://www.z25.org/static/_rd_/videostreaming_intro_plab/index.html

mjpeg-streamer


Instead of streaming video, how about capturing still images and sending them to a web server one after another? This is called Motion JPEG (MJPEG) and is commonly used by webcams in surveillance systems. There is an mjpeg-streamer implementation for the Pi; see https://www.raspberrypi.org/forums/viewtopic.php?t=48597

Although this worked quite nicely and was easy to set up, it still had a lag of 1-2 seconds.

UV4L


I finally stumbled upon a clear winner: UV4L. UV4L was easy to set up and I could get a solution with very minimal lag. It comes with a server that can stream MJPEG. Since MJPEG is supported by web browsers, there is an easy way to integrate a web page into an Android app using a WebView. It's as simple as adding the WebView to your layout and then setting its URL to the server running on the Pi. See below.

For setup, follow the instructions on: http://www.linux-projects.org/uv4l/installation/

Install all modules including the optional ones.

For streaming, follow the instructions here: http://www.linux-projects.org/uv4l/tutorials/streaming-server/ and you're done!

I had to set the frame rate and the video resolution to be quite low to reduce the lag.

To configure UV4L edit the configuration file in /etc/uv4l/uv4l-raspicam.conf. The key part of the file is:

encoding = mjpeg
width = 320
height = 240
framerate = 10

I also specified some server options to allow more connections:

server-option = --max-streams=5
server-option = --max-threads=10
server-option = --thread-idle-time=5

As soon as I get a Raspberry Pi 2 I'm going to give the WebRTC option a go and check out its performance.

Android App

The simplest method is to use a WebView: if you can view it in a web browser, you can view it in a WebView. First add the WebView to your layout, then in your activity call loadUrl() on the WebView to start streaming.
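A minimal sketch of the idea looks something like this (the Pi's IP address, the WebView id and the exact stream URL and query parameters are placeholders; adjust them for your own UV4L setup):

// Requires: import android.webkit.WebView;
// Inside onCreate(), after setContentView(); the id and URL below are placeholders.
final WebView videoView = (WebView) findViewById(R.id.video_view);
videoView.post(new Runnable() {
    @Override
    public void run() {
        // By the time this runs the layout has been drawn, so the WebView
        // knows its real size for the current layout (phone/tablet, portrait/landscape).
        int w = videoView.getWidth();
        int h = videoView.getHeight();
        // Point this at the UV4L streaming server running on the Pi.
        videoView.loadUrl("http://192.168.0.10:8080/stream/video.mjpeg?width=" + w + "&height=" + h);
    }
});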



Note: the URL is loaded in a call to post(). This anonymous method is called after the layout has been drawn. This is important because I'm passing the width and height of the WebView into the URL, and the dimensions are only known after the layout has been drawn. The size of the WebView is different for the different layouts (landscape/portrait) used on phones and tablets.

There is an MJPEG view class available from https://bitbucket.org/neuralassembly/simplemjpegview but I could not get that app to work with my UV4L stream, so I was not tempted to integrate its classes into my app.

The WebView worked fine and in particular was able to deal with the different window sizes used in the app's multiple layouts.

Raspberry Pi controlled Robot Arm on Tank Tracks


This is a robot arm on a tank track base, controlled by a Raspberry Pi. Movement is controlled using an Android app specially written for this project. All code is available on GitHub (see links).

The Android app running on a Nexus 10 is shown below.


The buttons on the left control the arm. The buttons on the right control the base. The image in the middle is a real-time video feed from the camera. The app supports multi-touch, so it's possible to press multiple buttons at once to move multiple parts of the arm and the base simultaneously.
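As a rough sketch of how that kind of control can be wired up (the button id and the sendCommand() helper below are hypothetical stand-ins, not the app's actual code), each button gets a touch listener that starts a movement on press and stops it on release; because Android delivers split motion events to different child views by default, several buttons can be held down at the same time:

// Requires: import android.view.MotionEvent; import android.view.View;
View.OnTouchListener leftTrackListener = new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                sendCommand("LEFT_TRACK_FORWARD");   // hypothetical helper that sends a command to the Pi
                return true;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_CANCEL:
                sendCommand("LEFT_TRACK_STOP");      // hypothetical helper
                return true;
        }
        return false;
    }
};
findViewById(R.id.left_track_button).setOnTouchListener(leftTrackListener);  // placeholder id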

I've created a parts list for anyone wanting to build something similar.