Controlling a Pair of Robotic Eyes

by thomas9363



The objective of this project is to build a pair of robotic eyes and create an Android app to control them. Using the joystick and buttons on the app, commands are sent via a Bluetooth connection, allowing the eyeballs to rotate and the eyelids to open or close. Two popular microcontroller/microprocessor platforms, the Arduino UNO R3 and the Raspberry Pi 4, are used for control.

 

At the end of this project, I also implemented a small program using the Raspberry Pi and computer vision to enable the eyes to track and follow a human face. The eyeballs move in response to the position of the face in front of them, creating the illusion that the robotic eyes are actively observing and following the person.

Supplies

Materials to build the eyes:

  • 40mm lottery ball (1)
  • Fake eyeballs (2)
  • Small servos and servo horns (6)
  • Universal joints (2)
  • Acrylic glass (3mm and 6mm)
  • Aluminum bar (1mm x 10mm)
  • 2mm and 3mm bolts and nuts
  • 1mm brass wire

Electronics used:

  • Sensor Shield 5.0 (1)
  • Arduino UNO R3 (1)
  • Bluetooth module (1)
  • Raspberry Pi 4 (1)
  • PCA9685 16-Channel Module (1)
  • Power supply
  • Smartphone for Android-based Bluetooth control

Building the Eyes

[Image: eye_build.jpg]

The eyeball is connected to a universal joint, allowing it to move in any direction. The upper and lower eyelids are made from halves of a 40mm lottery ball. They are hinged together using brass strips for opening and closing. The pushrods connecting the servo horn to the eye are made of 1mm brass wire. Before connecting them, the servo angles need to be adjusted to match the initial angles in the program, which, in my case, are all set to 90 degrees. Two eyes are built as shown in the figure above.

App Design:

[Image: EyeControlApp.jpg]

The app is created using MIT App Inventor. It uses two clocks: one with a 1000-millisecond interval to manage the Bluetooth connection and one with a 5-millisecond interval to transmit data. A joystick represents the eyeball; dragging it moves the eyes in the pan (x) and tilt (y) directions, generating x and y values between 0 and 200, with 100 as the center. Five buttons control the movement of the eyeballs, and three buttons are dedicated to blinking the eyes. Each packet is structured as [255, button state, x, y], where 255 separates the data packets and the button state dictates the action taken. The x and y values are processed at the receiving end and converted into movement angles. You can download the *.aia file from my GitHub repository.
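
App Inventor blocks are graphical rather than textual, but conceptually each 5-millisecond tick sends one four-byte packet. Here is a hypothetical Python rendering of the framing (the function name and parameter names are mine, for illustration only):

def make_packet(button_state, x, y):
    # Frame one control packet as the app sends it: [255, button, x, y].
    # x and y are joystick positions in 0..200 (100 = centered); 255 is
    # reserved as the packet delimiter, so the other bytes stay below it.
    return bytes([255, button_state, x, y])

print(list(make_packet(0, 100, 100)))  # joystick centered -> [255, 0, 100, 100]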

Controlling Using Arduino:

[Image: eye_wiring.JPEG]

The Arduino, a microcontroller, provides real-time control and operates without an operating system. It's essential that the baud rate of the Bluetooth module matches the one used in the sketch's serial connection; in my setup, the Bluetooth module is configured for 38400 baud.

To simplify the connection process, I've mounted a Sensor Shield 5.0 on top of the UNO, as illustrated above. The shield lets the servo connectors and the Bluetooth module plug in directly. I use the six PWM pins (3, 5, 6, 9, 10, and 11) for this setup.

Arduino programming is based on C/C++, and the program uses the Servo.h library, which makes controlling servos straightforward. Below is a snippet showing how incoming data is read and converted into movement angles:

if (Serial.available() > 0) {
  in_byte = Serial.read();               // read the incoming byte
  if (in_byte == 255) {                  // 255 marks the start of a packet
    array_index = 0;                     // reset the array index
  }
  dataIn[array_index] = in_byte;         // store the byte in the dataIn array
  array_index = array_index + 1;         // advance to the next slot
}

// Shift x and y from (0 ~ 200) to (-100 ~ +100), then map onto
// the servo's travel (-servoRange ~ +servoRange degrees)
pan_servopos  = map(dataIn[2] - 100, -100, 100, -servoRange, servoRange);
tilt_servopos = map(dataIn[3] - 100, -100, 100, -servoRange, servoRange);

Running the program is straightforward. After uploading the program, follow these steps:

  • Provide the shield with a 5V power supply.
  • Start the app on your phone.
  • Click "Turn ON" within the app.
  • Select the Bluetooth module from the available options.
  • Click a button within the app.
  • Move the joystick to control the robotic eyes.

You can download the program from my GitHub repository for further details.

Controlling Using Raspberry Pi

The Raspberry Pi, with its full-fledged operating system, offers versatility for complex computations and image processing tasks, making it suitable for computer vision and AI-related projects. However, its real-time control capabilities are not as robust as those of dedicated microcontrollers like the Arduino.

Setting up the operating system, currently Debian 12 (Bookworm), on the Raspberry Pi is straightforward. Configuring Bluetooth, however, requires additional steps. The steps outlined here also work on earlier versions of the operating system, such as Buster and Bullseye. Note that Buster still defaults to Python 2, while I'm using Python 3 for this project. To set Python 3 as the default on Buster, follow these steps:

In a terminal, edit the .bashrc file: nano ~/.bashrc
Add this line: alias python='/usr/bin/python3.7'
Reload it: . ~/.bashrc

Configuring Bluetooth on the Pi:

Since the Raspberry Pi has built-in Bluetooth capability, there is no need for an external Bluetooth module. However, you need to install the necessary Bluetooth packages before using it. Open a terminal and run the following commands:

sudo apt-get update
sudo apt-get upgrade
sudo apt-get install bluetooth blueman bluez

Install the Python 3 library for Bluetooth communication:

sudo apt-get install python3-bluetooth 

 The next step is to configure the Bluetooth adapter on the Raspberry Pi and make it discoverable:

sudo bluetoothctl
[bluetooth]# power on
[bluetooth]# agent on
[bluetooth]# discoverable on

You should see your Raspberry Pi's Bluetooth address, for example, DC:A6:32:9D:5C:AC. You can find your phone's Bluetooth address under Settings > About phone > Status.

[bluetooth]# pairable on 
[bluetooth]# scan on

Your Pi will discover many Bluetooth devices. One of them will be your phone, listed by its Bluetooth address and model, for example, C0:9F:05:4A:BE:C9 OPPO F1s. You can now pair with your phone:

[bluetooth]# pair C0:9F:05:4A:BE:C9

You should receive a prompt on your phone asking you to accept the pairing request. You can verify that pairing succeeded in your phone's Bluetooth settings.

[bluetooth]# quit

Next, specify the channel on which the Bluetooth serial service will operate by adding a Service Discovery Protocol (SDP) record for the Serial Port (SP) profile on that channel. RFCOMM offers channels 1 through 30; here, I choose channel 27:

sudo sdptool add --channel=27 SP
sudo sdptool browse local

If you get the error "Failed to connect to SDP server on FF:FF:FF:00:00:00: No such file or directory", you need to edit this file:

sudo nano /lib/systemd/system/bluetooth.service

Find the line "ExecStart=/usr/lib/bluetooth/bluetoothd" and append --compat to it:

ExecStart=/usr/lib/bluetooth/bluetoothd --compat

Then reload systemd and restart the Bluetooth service:

sudo systemctl daemon-reload
sudo systemctl restart bluetooth
 
sudo sdptool add --channel=27 SP
sudo sdptool browse local

Now you should see channel 27 in the list. Remember that every time you power on the Pi, you need to issue:

sudo sdptool add --channel=27 SP

If you find this annoying, you can register channel 27 automatically at startup by editing the service file again:

sudo nano /lib/systemd/system/bluetooth.service
ExecStart=/usr/lib/bluetooth/bluetoothd --compat
ExecStartPost=/usr/bin/sdptool add --channel=27 SP

Wiring:

To streamline the wiring process, I employ a PCA9685 servo driver. This component controls multiple servos or PWM devices through a single I2C interface, minimizing the number of GPIO pins needed on the Raspberry Pi. Additionally, I use the adafruit_servokit library to move the servos in my code. The wiring configuration is depicted in the connection diagram provided earlier.
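
If you haven't used the PCA9685 with adafruit_servokit before, a minimal test looks like the sketch below. The channel number and pulse widths are assumptions; adjust them to match your wiring and servos:

from adafruit_servokit import ServoKit

# The PCA9685 is a 16-channel PWM driver on the I2C bus
kit = ServoKit(channels=16)

# Typical hobby servos expect roughly 500-2500 microsecond pulses;
# widen or narrow this if a servo doesn't reach its full range
kit.servo[0].set_pulse_width_range(500, 2500)

# Center the servo on channel 0, matching the 90-degree neutral
# position used when the pushrods were attached
kit.servo[0].angle = 90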

Program:

The program is written in Python 3 and can be downloaded from my GitHub repository. The snippet below shows how data is transferred from the phone to the Pi:

import bluetooth

# Create a Bluetooth RFCOMM server socket
server_sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
# Bind to the channel registered with sdptool (27 here)
port = 27
server_sock.bind(("", port))
# Listen for an incoming connection from the phone
server_sock.listen(1)
# Accept the client connection
client_sock, address = server_sock.accept()
print("connected at:", address)

# Holds one packet: [255, button state, x, y]
dataIn = [255, 0, 100, 100]
# Index of the next slot to fill in dataIn
array_index = 0

try:
    while True:
        # Receive data from the client; one recv may deliver several bytes
        data = client_sock.recv(1024)
        if not data:  # an empty read means the client disconnected
            break
        for in_byte in data:  # iterating over bytes yields ints in Python 3
            # If the received byte is 255, a new packet is starting
            if in_byte == 255:
                array_index = 0
            dataIn[array_index] = in_byte
            array_index += 1
            if array_index == 4:  # a full packet has arrived
                print("divider:", dataIn[0], "button:", dataIn[1],
                      "X:", dataIn[2], "Y:", dataIn[3])
            # Ensure array_index does not exceed the length of dataIn
            array_index %= len(dataIn)
finally:
    # Close the client and server sockets
    client_sock.close()
    server_sock.close()
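
The snippet above only prints the packets. To actually drive the eyes, the received x and y values still have to be converted into angles, mirroring the Arduino map() call shown earlier. Below is a minimal sketch of that conversion, assuming a ±20 degree servo range and pan/tilt servos on PCA9685 channels 0 and 1 (the channel numbers are my assumption, not necessarily those in the full program):

from adafruit_servokit import ServoKit

SERVO_RANGE = 20              # degrees of travel on each side of center
kit = ServoKit(channels=16)

def apply_packet(dataIn):
    # x and y arrive as 0..200 with 100 at center; shift to -100..+100,
    # scale to -SERVO_RANGE..+SERVO_RANGE, then offset by the 90-degree
    # neutral position set during assembly
    pan = 90 + (dataIn[2] - 100) * SERVO_RANGE / 100
    tilt = 90 + (dataIn[3] - 100) * SERVO_RANGE / 100
    kit.servo[0].angle = pan   # pan servo (assumed channel 0)
    kit.servo[1].angle = tilt  # tilt servo (assumed channel 1)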

Controlling the eyes:

Once channel 27 is registered (manually with sdptool or automatically via bluetooth.service):

  • Run the Python script.
  • Start your Bluetooth app and connect to the Pi.
  • Select a button to initiate the desired action. 

Tracking a Face

[Video: Controlling a Pair of Robotic Eyes]

The goal of this step is to make a person interacting with the robotic eyes feel as though they are being watched. To achieve this, I positioned a CSI camera module between the eyes. Using the Face Detection solution of the MediaPipe framework, the camera tracks the location of the face and prompts the robotic eyes to follow it. This is implemented with the picamera2, OpenCV, and adafruit_servokit libraries.
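
The core of the tracking loop is roughly the sketch below. It assumes a 640x480 preview, MediaPipe's face-detection solution, and the same ±20 degree range and channel assignments as before; all of these are illustrative assumptions rather than a copy of the actual program:

import cv2
import mediapipe as mp
from picamera2 import Picamera2
from adafruit_servokit import ServoKit

SERVO_RANGE = 20
kit = ServoKit(channels=16)

# Start the CSI camera; picamera2's "RGB888" format actually delivers
# BGR-ordered arrays, which is what OpenCV expects
picam2 = Picamera2()
picam2.configure(picam2.create_preview_configuration(
    main={"format": "RGB888", "size": (640, 480)}))
picam2.start()

face_detection = mp.solutions.face_detection.FaceDetection(
    model_selection=0, min_detection_confidence=0.5)

while True:
    frame = picam2.capture_array()
    # MediaPipe expects RGB input
    results = face_detection.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.detections:
        # Take the first detected face; coordinates are relative (0..1)
        box = results.detections[0].location_data.relative_bounding_box
        cx = box.xmin + box.width / 2
        cy = box.ymin + box.height / 2
        # Map the face center to pan/tilt angles around 90 degrees;
        # pan is mirrored so the eyes turn toward the face
        kit.servo[0].angle = 90 + (0.5 - cx) * 2 * SERVO_RANGE
        kit.servo[1].angle = 90 + (cy - 0.5) * 2 * SERVO_RANGE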

For those interested in replicating or exploring further, the program is available for download on my GitHub repository. The video below showcases both the manual control of the eyes via the app and the autonomous tracking of a human face.