WiFi Controlled Self Balancing Robot With Pan-Tilt Camera

Self-balancing mobile robots (SBMRs) are unique among mobile robots because of their ability to balance on two wheels at a fixed spot. A typical SBMR measures its tilt angle using a combination of accelerometer and gyroscope readings and corrects itself by moving forward or backward to drive that angle to zero. A Proportional-Integral-Derivative (PID) controller decides how much movement is required, and at what rate, to quickly correct angle drift and stabilize the robot. Things get trickier when such a robot must also be driven (translated or rotated) manually from a remote location.
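The sensor-fusion and correction loop described above can be sketched in a few lines. The following is an illustrative Python sketch, not the CASP implementation used in this project; the filter coefficient and PID gains are placeholder values.

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse the gyro path (fast but drifts) with the accelerometer
    tilt angle (noisy but drift-free). alpha weights the gyro path.
    Angles in degrees, gyro_rate in deg/s."""
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

class PID:
    """Textbook PID controller; the gains passed in are placeholders,
    not the tuning used on the actual robot."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

On the robot this loop runs every few milliseconds: the filtered angle feeds the PID, and the PID output sets the direction and magnitude of wheel motion.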

This project shows how to build a basic two-wheel, WiFi remote-controlled SBMR with a WiFi camera mounted on a servo-driven pan-tilt assembly. It is manually controlled by the user from a host computer over WiFi. Thanks to the live video feed from the onboard camera, the SBMR does not need to stay within the user’s line of sight.


Step 1: Build the hardware as described in the Hardware Development section.

Step 2: Install the latest version of CASP from this link: https://aadhuniklabs.com/?page_id=550. For video tutorials on CASP, please visit this link: https://aadhuniklabs.com/?page_id=554. Please note that CASP version 19.09.2022 or later is required for this project.

Step 3: Download the example project “Mobile Self-balancing Remote Controlled Robot with Built-in Camera” from this link: https://aadhuniklabs.com/casp-examples/#robotics and follow the steps in the “Software Development” section.

Step 4: Make the modifications described in the “Modifications” section to adapt the software to your particular hardware. You can also improve the robot’s performance by modifying the source code of the custom blocks.

Step 5: Finally, the keyboard and mouse controls for controlling the SBMR are described in the ‘Control Methodology’ section.

Please write to us at https://aadhuniklabs.com/contact for any inquiries or suggestions related to this project.

Hardware development

Two DC motors with wheels are mounted on a suitable base frame. The pan-tilt assembly (with two small servos) is fitted onto the frame. A 12V battery is installed on the base frame. The required electronic modules are placed on the base frame and connected according to the connection diagram shown in the “Schematic” section. A typical arrangement is shown in the figure below.

This project was built and tested separately on two controllers, the Arduino RP2040 Connect and the Raspberry Pi Pico. These small controller boards serve as the robot’s main control unit. They run in dual-core mode and are clocked up to 200MHz (Arduino RP2040 Connect) and 250MHz (Raspberry Pi Pico) to meet the response times required to balance the robot while simultaneously communicating with the host computer over WiFi.

The Arduino RP2040 Connect has an IMU and WiFi on board, so it needs few external components. Core-0 handles the WiFi connection and all PWM blocks that drive the wheel motors and the camera pan-tilt servos. Core-1 runs the IMU block and the blocks required for balancing the robot.

For the Raspberry Pi Pico, Core-0 runs the blocks required for balancing the robot, while Core-1 runs the blocks involved in remote communication and manual control of the robot and camera assembly. An MPU-6050 (or MPU-9250) inertial measurement unit (IMU) senses the robot’s tilt angle (here the roll angle, due to the IMU’s mounting orientation). It communicates with the onboard microcontroller over an I2C interface. An ESP8266 WiFi module handles communication with the remote host computer over WiFi; it is connected to the onboard microcontroller through a serial interface.

The ESP32-CAM module is mounted on the pan-tilt assembly to capture live video and stream it to the host computer. The flashlight LED on the ESP32-CAM module can be manually controlled from the host computer in low-light conditions.

A 9V to 12V battery powers the entire electrical circuit on the robot. A 9V/12V-to-5V DC step-down converter provides the 5V supply required by the microcontroller, servos, and ESP32 camera module.

Software development

a) ESP32 camera configuration

The ESP32 camera must be properly programmed with a valid IP address before it can be used in the project. Please refer to the ESP32-CAM example on our website for details on how to program the module. Users can also refer to the abundant material available on the internet on this topic.

b) Programs for the target and native models

CASP software is used to quickly develop the models and generate binary code for the target microcontroller board and the host computer. The software lets users visualize signals graphically at any point in the model in real time. This feature is used extensively while tuning the PID controllers.

In addition to regular CASP blocks, two custom blocks (one in the target model and one in the native model) are used to implement logic that is not possible with the standard blocks and to reduce the total number of blocks. The source code of the custom blocks is available to the user.

The custom block in the native model running on the host computer generates the required control signals when the user presses certain keys to control the robot’s movements.

The custom block in the target model modifies the tilt-angle setpoint (fed to the error block) to navigate the robot based on user commands from the host computer, and also conditions the PID controllers’ output signals before they drive the motors. The user can inspect the configuration and source code of these custom blocks to see how such blocks are integrated into CASP models alongside other blocks. Our video tutorial on how to create a custom block is available at this link.
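The setpoint-shifting idea can be illustrated with a small sketch. This is a hypothetical Python illustration of the concept, not the custom block's actual source code; the function name, command strings, and the 2-degree lean value are all assumptions.

```python
def navigation_setpoint(base_offset_deg, command, lean_deg=2.0):
    """Map a drive command to a tilt-angle setpoint. Commanding a
    slight forward lean makes the balance loop drive the wheels
    forward to 'catch' the robot, producing net forward motion
    (and vice versa for backward). base_offset_deg compensates for
    a shifted center of gravity or a sloped floor."""
    lean = {"forward": +lean_deg, "backward": -lean_deg, "stop": 0.0}[command]
    return base_offset_deg + lean
```

The balance PID never needs to know the robot is navigating; it simply chases the shifted setpoint.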

Two models were developed to achieve the desired goal.

B.1) The target model, running on the Arduino RP2040 Connect or Raspberry Pi Pico, consists of:

1) LED-blinking logic that indicates the system is running and whether the IMU is working properly.

2) The IMU block, PID blocks, and other supporting blocks that keep the robot balanced.

3) WiFi and supporting blocks that receive the required control signals from the host computer.

4) PWM and servo blocks mapped to the microcontroller pins.
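The interaction between items 2 and 4 above is a closed loop: the IMU angle feeds the PID, whose output drives the wheel PWM, which in turn changes the angle. A toy simulation of that loop, using a crude linearized inverted-pendulum plant with made-up constants (purely illustrative, not the robot's dynamics), shows why the feedback stabilizes an otherwise unstable system:

```python
def simulate_balance(theta0_deg=5.0, kp=200.0, kd=20.0, dt=0.005, steps=400):
    """Toy balance loop: a PD controller commands wheel acceleration
    against an unstable linearized pendulum. All constants (plant
    gain 50, PD gains, time step) are illustrative placeholders."""
    theta, theta_dot = theta0_deg, 0.0
    for _ in range(steps):
        u = -(kp * theta + kd * theta_dot)   # PD controller output
        theta_ddot = 50.0 * theta + u        # unstable plant + actuation
        theta_dot += theta_ddot * dt         # semi-implicit Euler step
        theta += theta_dot * dt
    return theta
```

With the gains zeroed the tilt diverges; with feedback enabled the tilt decays toward zero within a couple of simulated seconds.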

Here are the steps to properly program the target board.

1) Connect the target to the host computer via a USB cable.

2) From the host operating system, note the serial port to which the board is connected.

3) Run CASP and load the ‘rc_target_arduino’ project for the Arduino RP2040 target or the ‘rc_target_rpi’ project for the Raspberry Pi target. Open the workspace file and set the various block parameters as described in the Modifications section.

4) The WiFi connection block is set to station mode. The user may need to enter the SSID and password of the network to which the device must connect. The local IP address parameter must be configured to match the address assigned by the network’s DHCP server.

5) Open Home->Simulation->Setup Simulation Parameters menu item. Under TargetHW->General tabs, set the “Target Hardware Programmer Port” parameter to the serial port that the board is connected to.

6) Build the model and program the board by clicking the play button.

B.2) The native model, running on the host PC, consists of:

1) A camera block that receives live video from the ESP32 camera. The IP address of the ESP32 camera must be entered in this block’s parameters.

2) An image display block to view the live video received from the camera. It is also configured to output keyboard and mouse signals.

3) RC control block: a custom block that receives keyboard and mouse signals from the image display block and generates the appropriate control signals for the robot’s movement and the camera head movement (pan and tilt).

4) GPIO blocks that map to the target model over a WiFi communication channel.

Here are the steps to run the native model on the host computer:

1) Before proceeding, the host computer must be connected to the same network as the robot, and the robot must be powered on and running.

2) Load the “rc_native” project.

3) Click Home-> Simulation-> Configure IO Simulation menu item.

4) The Configure Simulators window will open. Under Nodes and GPIO Device Nodes, change the IP addresses indicated in the figure below (by double-clicking the item) to your local IP address and device IP address.

5) Click the Connect Device button and select the Online Data check box. The program should now communicate with the target with a cycle time of about 30 milliseconds. The target board is now available to the native model as an ‘EP0’ endpoint, which the native model can use to communicate with the respective IOs on the target.

6) Click the Save button to save the configuration and close the window.

7) Run the native model by clicking the Run button. A simulation panel window should open and communicate with the target board.

8) A screenshot of the simulation panel running on the host computer is shown below.


Modifications: onboard microcontroller target model

1) The IMU sensor offset value (sen_offset block) needs to be adjusted if the robot’s center of gravity is shifted to one side and/or the floor has some slope.

2) The proportional, integral, and derivative parameters of the PID controllers must be tuned based on the robot’s wheel diameter, wheel backlash, drive motor gear ratio, IMU sensor location, weight, height, and center of gravity.

3) The “Minimum Displacement PWM” parameter of the corresponding block must be set to the minimum PWM value required just to turn the motors under load.

4) Any other logic needed to improve the robot’s performance can be implemented in the custom block source code.

5) The wheel drive connections may need to be swapped so that the robot moves forward when W is pressed and backward when S is pressed.
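Item 3 above addresses motor deadband: below a certain duty cycle the motors do not turn at all under load, so the PID output is offset past that threshold. A minimal sketch of that mapping, with assumed names and an assumed threshold of 60/255 (tune per your hardware):

```python
def pid_to_pwm(pid_out, min_pwm=60, max_pwm=255):
    """Map a signed PID output to (duty, direction), offsetting by the
    minimum PWM needed to overcome static friction and gearbox load.
    min_pwm=60 is a placeholder for the 'Minimum Displacement PWM'."""
    if pid_out == 0:
        return 0, "stop"
    direction = "forward" if pid_out > 0 else "reverse"
    duty = min(min_pwm + abs(pid_out), max_pwm)  # jump over the deadband
    return int(duty), direction
```

Without the offset, small PID corrections near the balance point would produce no wheel motion and the robot would oscillate.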

Modifications: native model

1) The logic that generates the forward and reverse motor drive signals was developed for the motor driver IC used in this project, i.e. the TA6586. The user can modify the logic appropriately if a different driver IC is used.

2) The servo motors may require some trimming so that the camera faces forward at the default angles specified in the rc_control block parameters.

3) The base speed, speed limits, and other navigation-related parameters can be set from the rc_control block parameters.

Control methodology

1) The simulation panel window shown above accepts keyboard and mouse input when it is active.

2) The user can use the keys W – to move forward, S – to move backward, A – to rotate left in place, and D – to rotate right in place.

3) A combination of keys W/S & A/D can be used to take left and right turns while moving forward or backward.

4) The speed can be temporarily increased by using the Shift key in combination with the above keys. The base speed can be set using the Page Up and Page Down keys.

5) The K and M keys can be used to adjust the tilt-angle offset while the robot rests at a particular spot. This adjustment is needed if the robot’s center of gravity is shifted to one side and/or the ground has some slope.

6) The vertical and horizontal servo angles (from -90 to +90 degrees), and thus the position of the camera head, are controlled by mouse movements.

7) The ‘G’ key returns both servos to their default position.

8) The “L” key is used to turn on/off the LED flash light of the ESP32.
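The key bindings above can be summarized in a small dispatch sketch. This is a hypothetical Python illustration of how the rc_control block might translate held keys into a drive command; the function, tuple layout, and speed numbers are assumptions, not the block's actual code.

```python
def keys_to_command(pressed, base_speed=40, boost=20):
    """Translate the set of currently held keys into
    (drive, turn, speed): drive is +1 forward / -1 backward,
    turn is +1 right / -1 left, per the Control Methodology list.
    Speed values are illustrative placeholders."""
    drive = (1 if "W" in pressed else 0) - (1 if "S" in pressed else 0)
    turn = (1 if "D" in pressed else 0) - (1 if "A" in pressed else 0)
    speed = base_speed + (boost if "SHIFT" in pressed else 0)
    return drive, turn, speed
```

Combining W/S with A/D naturally yields arcing turns, since both the drive and turn components are nonzero at once.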

