(W-ROV) Webserver Remotely Operated Vehicle, Observation-Class I, Deep Learning Enabled, Python, GitHub Code Included

Mozahid Sufiyan
8 min read · Jul 3, 2020


Fig 1. 4X4 Remotely Operated Vehicle with Pan Tilt Camera

In this tutorial we explore:

https://github.com/mozahidsufiyan221/WebServered-Remotely-Operated-Vehicle

In this tutorial, we will combine what we have learned before to control our camera position through the internet, as shown in the example below:

The above gif shows the camera controlled by buttons, pre-programmed with fixed Pan/Tilt angles. We will also explore other alternatives to control the camera position through the internet.

Fig 2. Block diagram of the project

Step 1: Instruments Used

Main parts:

Fig 3. Pan Tilt Camera

Step 2: Installing the PiCam

1. With your RPi turned off, install the camera on its dedicated (CSI) port as shown below:

2. Turn on your Pi, go to the Raspberry Pi Configuration Tool in the main menu, and verify that the Camera Interface is enabled:

Fig 4. Enable Camera in Raspberry Pi Configuration

If you needed to enable it, press [OK] and reboot your Pi. Run a simple test to verify that everything is OK:

raspistill -o ~/Desktop/image.png

You will notice that an image icon appears on your RPi desktop. Click on it to open it. If an image appears, your Pi is ready to stream video! If you want to know more about the camera, visit the link: Getting started with picamera.
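If you prefer to test from Python instead, the short sketch below captures a still image using the picamera library. This snippet is only an illustration and is not part of the project files; the output path is an assumption:

from time import sleep
from picamera import PiCamera

camera = PiCamera()              # open the camera
camera.resolution = (1024, 768)  # set a modest still resolution
camera.start_preview()
sleep(2)                         # give the sensor time to adjust exposure
camera.capture('/home/pi/Desktop/image.jpg')  # save a still to the desktop
camera.stop_preview()
camera.close()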

Fig 5. Pan Tilt Angle Positions

Step 3: Installing Flask

There are several ways to stream video. The best (and "lightest") way I found to do it is with Flask, as developed by Miguel Grinberg. For a detailed explanation of how Flask does this, please see his great tutorial: flask-video-streaming-revisited.

In my tutorial Python WebServer With Flask and Raspberry Pi, we learned in more detail how Flask works and how to implement a web server to capture data from sensors and show their status on a web page. Here, in the first part of this tutorial, we will do the same, except that the data sent to our front end will be a video stream.

Creating a web-server environment:

The first thing to do is to install Flask on your Raspberry Pi. If you do not have it yet, go to the Pi Terminal and enter:

sudo apt-get install python3-flask
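Before moving on, you can confirm that Flask was installed correctly with a minimal test app. This is just a sketch; the file name helloFlask.py and port 8080 are arbitrary choices, not part of the project:

from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello():
    # simple sanity check: open http://YOUR_RPI_IP:8080 in a browser
    return 'Flask is up and running!'

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)

Save it, run python3 helloFlask.py, and point a browser on your network to the Pi's IP on port 8080.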

When you start a new project, it is best to create a folder to keep your files organized. For example:

From home, go to your working directory:

cd Documents

Create a new folder, for example:

mkdir camWebServer

The above command will create a folder named “camWebServer”, where we will save our python scripts:

/home/pi/Documents/camWebServer

Now, inside this folder, let's create two sub-folders: static for CSS (and eventually JavaScript) files, and templates for HTML files. Go to your newly created folder:

cd camWebServer

And create the 2 new sub-folders:

mkdir static

and

mkdir templates

The final directory "tree" will look like:

├── Documents
│   └── camWebServer
│       ├── templates
│       └── static

OK! With our environment in place let’s create our Python WebServer Application to stream video.

Step 4: Creating the Video Streaming Server

First, download Miguel Grinberg's Pi camera module, camera_pi.py, and save it in the camWebServer directory we created. This is the heart of our project; Miguel did a fantastic job!

Now, using Flask, let's adapt Miguel's original web server application (app.py), creating a specific Python script to render our video. We will call it appCam.py:

from flask import Flask, render_template, Response

# Raspberry Pi camera module (requires picamera package, developed by Miguel Grinberg)
from camera_pi import Camera

app = Flask(__name__)


@app.route('/')
def index():
    """Video streaming home page."""
    return render_template('index.html')


def gen(camera):
    """Video streaming generator function."""
    while True:
        frame = camera.get_frame()
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n')


@app.route('/video_feed')
def video_feed():
    """Video streaming route. Put this in the src attribute of an img tag."""
    return Response(gen(Camera()),
                    mimetype='multipart/x-mixed-replace; boundary=frame')


if __name__ == '__main__':
    app.run(host='0.0.0.0', port=80, debug=True, threaded=True)

The above script streams your camera video on an index.html page as below:

<html>
  <head>
    <link rel="stylesheet" href='../static/style.css'/>
  </head>
  <body>
    <h1>MJRoBot Lab Live Streaming</h1>
    <h3><img src="{{ url_for('video_feed') }}" width="90%"></h3>
    <hr>
  </body>
</html>

The most important line of index.html is:

<img src="{{ url_for('video_feed') }}" width="90%">

That is where the video is "fed" into our web page.

You must also include the style.css file in the static directory to get the above result in terms of style.

All the files can be downloaded from my GitHub: camWebServer

Just to be sure that everything is in the right place, let's check our environment after all updates:

├── Documents
│   └── camWebServer
│       ├── camera_pi.py
│       ├── appCam.py
│       ├── templates
│       │   └── index.html
│       └── static
│           └── style.css

Now, run the python script on the Terminal:

sudo python3 appCam.py

Go to any browser on your network and enter http://YOUR_RPI_IP (for example, in my case: 10.0.1.27).

NOTE: If you are not sure about your RPi IP address, run on your terminal:

ifconfig

You will find it in the wlan0: section.
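Alternatively, the command below prints only the assigned IP addresses, which can be quicker to read:

hostname -I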

The results:

That's it! From here on, it is only a matter of refining the page, embedding your video in another page, etc.

Step 5: The Pan Tilt Mechanism

Now that we have the camera working and our Flask WebServer streaming its video, let’s install our Pan/tilt mechanism to position the camera remotely.

Fig 6. Electronic Circuit Diagram

The servos should be connected to an external 5V supply, with their data pins (in my case, the yellow wiring) connected to the Raspberry Pi GPIOs as below:

  • GPIO 17 ==> Tilt Servo
  • GPIO 27 ==> Pan Servo

Do not forget to connect all the GNDs together: Raspberry Pi, servos, and the external power supply.

As an option, you can add a 1K ohm resistor in series between each Raspberry Pi GPIO and the servo data input pin. This would protect your RPi in case of a servo problem.

Let's also take the opportunity to test our servos inside our virtual Python environment.

Let's use a Python script to execute some tests with our servos:

from time import sleep
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)


def setServoAngle(servo, angle):
    # start a 50 Hz PWM signal on the given GPIO pin
    pwm = GPIO.PWM(servo, 50)
    pwm.start(8)
    # convert the angle (0 to 180 degrees) to an equivalent duty cycle
    dutyCycle = angle / 18. + 3.
    pwm.ChangeDutyCycle(dutyCycle)
    sleep(0.3)
    pwm.stop()


if __name__ == '__main__':
    # usage: python3 <script> <servo GPIO> <angle>
    import sys
    servo = int(sys.argv[1])
    GPIO.setup(servo, GPIO.OUT)
    setServoAngle(servo, int(sys.argv[2]))
    GPIO.cleanup()

The core of the above code is the function setServoAngle(servo, angle). This function receives as arguments a servo GPIO number and the angle to which the servo must be positioned. Since the input of this function is an angle, we must convert it to an equivalent duty cycle.
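For example, with the mapping above, a 90-degree command gives a duty cycle of 90/18 + 3 = 8%, which at 50 Hz corresponds to a pulse of roughly 1.6 ms. Assuming you saved the script as angleServoCtrl.py (the name used later in this tutorial), you could test both servos from the terminal:

sudo python3 angleServoCtrl.py 17 30
sudo python3 angleServoCtrl.py 27 90

The first command moves the tilt servo (GPIO 17) to 30 degrees; the second centres the pan servo (GPIO 27).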

Step 6: Using Incremental / Decremental Angle Buttons

Sometimes what we only need is a few buttons to move our servos in steps:

  • Pan: Left / Right
  • Tilt: Up / Down

We can also use +/- buttons (incremental/decremental angle); it is your choice. Let's create a new directory:

mkdir PanTiltControl2

In this directory we should have the following environment and files:

├── Documents
│   └── PanTiltControl2
│       ├── camera_pi.py
│       ├── angleServoCtrl.py
│       ├── appCamPanTilt2.py
│       ├── templates
│       │   └── index.html
│       └── static
│           └── style.css

The files camera_pi.py and angleServoCtrl.py are the same ones used before. You can download both from my GitHub by clicking on the corresponding links, or use the ones that you downloaded before.

Now we need appCamPanTilt2.py, index.html and style.css. You can download those files from my GitHub by clicking on the corresponding links. Pay attention to their correct position in your directory.

Let’s see the NEW index.html:

<html>
  <head>
    <title>MJRoBot Lab Live Streaming</title>
    <link rel="stylesheet" href='../static/style.css'/>
  </head>
  <body>
    <h3><img src="{{ url_for('video_feed') }}" width="80%"></h3>
    <hr>
    <h4> PAN Angle: <a href="/pan/-" class="button">-</a> [ {{ panServoAngle }} ] <a href="/pan/+" class="button">+</a> </h4>
    <h4> TILT Angle: <a href="/tilt/-" class="button">-</a> [ {{ tiltServoAngle }} ] <a href="/tilt/+" class="button">+</a> </h4>
    <hr>
  </body>
</html>

The index.html is very similar to the previous one. The block of lines used in the last index.html is replaced by only two lines, giving us just four buttons: Pan [+], Pan [-], Tilt [+] and Tilt [-].

Let’s analyze one of the 4 buttons:

<a href="/pan/-" class="button">-</a>

This is a simple HTML hyperlink tag that we have styled as a button (the button style is defined in style.css). When we click on this link, we generate a GET /<servo>/<angle> request, where <servo> is "pan" and the "-" means "decrease angle". Those parameters are passed to the web server app (appCamPanTilt2.py).
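Because these are plain GET requests, you can also drive the servos without the web page, for example with curl from any machine on the network (assuming the server is running on 10.0.1.27, as in my case):

curl http://10.0.1.27/pan/+
curl http://10.0.1.27/tilt/-

Each call returns the rendered index.html and moves the corresponding servo by 10 degrees.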

Let’s see this part of code on appCamPanTilt2.py:

@app.route("/<servo>/<angle>")
def move(servo, angle):
    global panServoAngle
    global tiltServoAngle
    if servo == 'pan':
        if angle == '+':
            panServoAngle = panServoAngle + 10
        else:
            panServoAngle = panServoAngle - 10
        os.system("python3 angleServoCtrl.py " + str(panPin) + " " + str(panServoAngle))
    if servo == 'tilt':
        if angle == '+':
            tiltServoAngle = tiltServoAngle + 10
        else:
            tiltServoAngle = tiltServoAngle - 10
        os.system("python3 angleServoCtrl.py " + str(tiltPin) + " " + str(tiltServoAngle))

    templateData = {
        'panServoAngle': panServoAngle,
        'tiltServoAngle': tiltServoAngle
    }
    return render_template('index.html', **templateData)
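For this snippet to work, the top of appCamPanTilt2.py must import os and define the GPIO pins and starting angles. Below is a minimal sketch of that header, based on the wiring described above; the exact values in the GitHub file may differ:

import os
from flask import Flask, render_template, Response
from camera_pi import Camera

app = Flask(__name__)

panPin = 27           # pan servo data pin (GPIO 27)
tiltPin = 17          # tilt servo data pin (GPIO 17)
panServoAngle = 90    # start both servos roughly centred
tiltServoAngle = 90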

In this example, where "servo" is equal to "pan", the following lines will be executed:

if angle == '+':
    panServoAngle = panServoAngle + 10
else:
    panServoAngle = panServoAngle - 10
os.system("python3 angleServoCtrl.py " + str(panPin) + " " + str(panServoAngle))

Since "angle" is equal to "-", we subtract 10 from panServoAngle and pass this parameter to our command. Suppose that the current panServoAngle is 90; the new value will be 80.

So, panPin will be translated to "27" and panServoAngle to "80", and the app will generate the command: python3 angleServoCtrl.py 27 80.

Step 7: Conclusion

As always, I hope this project can help others find their way into the exciting world of electronics!

For details and final code, please visit my GitHub repository:

https://github.com/mozahidsufiyan221/WebServered-Remotely-Operated-Vehicle


Mozahid Sufiyan

I have been employed in the Oil and Gas Industry for five years and am currently a Chartered Member of the Institution of Mechanical Engineers.