F17: Optimus
Optimus is an Android-app-controlled, self-navigating car powered by the SJOne (LPC1758) microcontroller. Optimus maneuvers through the selected routes using LIDAR and GPS sensors. This wiki page is the detailed report on how Optimus was built by Team Optimus.
Contents
- 1 Abstract
- 2 Objectives & Introduction
- 3 Team Members & Responsibilities
- 4 Project Schedule
- 5 Parts List & Cost
- 6 CAN Communication
- 7 Hardware & Software Architecture
- 8 Master Controller
- 9 Motor Controller
- 10 Sensor Controller
- 10.1 Introduction
- 10.2 Hardware Implementation
- 10.3 Software Implementation
- 10.3.1 Approach for obtaining the data from the LIDAR
- 10.3.2 Algorithm for interfacing LIDAR to SJOne board and obtaining the obstacle info
- 10.3.3 Flowchart for Communicating with the LIDAR and receiving obstacle information
- 10.3.4 Testing the data obtained from the LIDAR
- 10.3.5 CAN DBC messages sent from the Sensor Controller
- 11 Android Application
- 12 Bluetooth Controller
- 13 Geographical Controller
- 14 Package Design
- 15 Git Project Management
- 16 Technical Challenges
- 17 Project Videos
- 18 Conclusion
- 19 Project Source Code
- 20 References
- 21 Acknowledgement
Abstract
Embedded systems are omnipresent, and one of their unique yet powerful applications is the self-driving car. In this project we built a self-navigating car named Optimus that navigates from a source location to a selected destination while avoiding obstacles in its path.
The key features the system supports are:
1. Android application with a customized map and dashboard information.
2. LIDAR-powered obstacle avoidance.
3. Route calculation and maneuvering to the selected destination.
4. Self-adjusting the speed of the car on a ramp.
The system is built on FreeRTOS running on the LPC1758 SJOne controller, together with an Android application. The building blocks of Optimus are five controllers communicating over a high-speed CAN network, each handling a dedicated task. The controllers integrate the various sensors used to navigate the car.
1. Master Controller - handles route maneuvering and obstacle avoidance
2. Sensor Controller - detects the surrounding objects
3. Geo Controller - provides the current location
4. Drive Controller - controls the ESC
5. Bridge Controller - interfaces the system to the Android app
Objectives & Introduction
Our objective is to build and integrate the functionality of these five controllers to develop a fully functioning self-driving system.
Sensor Controller: The sensor controller uses the RPLIDAR to scan its 360-degree environment within a 6-meter range. It sends the scanned obstacle data to the master controller and the bridge controller.
Geo Controller: The geo controller uses the NAZA GPS module, which provides the car's current GPS location and compass angle. It calculates the heading and bearing angles that let the car turn toward the destination.
Drive Controller: The drive controller drives the motor based on the commands it receives from the Master.
Bridge Controller: The bridge controller works as a gateway between the Android application and the self-driving car and passes information between them.
Master Controller: The master controller coordinates all the other controllers and makes the driving decisions.
Android Application: The Android application communicates with the car through the bridge controller. It sends the selected destination to the geo controller and also displays the car's debugging information, such as:
1. Obstacles information around the car
2. Car's turning angle
3. Compass value
4. Bearing angle
5. Car's GPS location
6. Destination reached status
7. Total checkpoints in the route
8. Current checkpoint indication
Team Members & Responsibilities
- Master Controller:
- Revathy
- Motor Controller:
- Android and Communication Bridge:
- Geographical Controller:
- Integration Testing:
- Revathy
- Kripanand Jha
- Unnikrishnan
- PCB Design:
Project Schedule
Week# | Date | Planned Task | Actual | Status |
---|---|---|---|---|
1 | 9/23/2017 | | | Complete |
2 | 9/30/2017 | | | Complete |
3 | 10/14/2017 | | | Complete |
4 | 10/21/2017 | | | Complete |
5 | 10/28/2017 | | | Complete |
6 | 11/07/2017 | | | Complete |
7 | 11/14/2017 | | | Complete |
8 | 11/21/2017 | | | Complete |
9 | 11/28/2017 | | | Complete |
10 | 12/5/2017 | | | Complete |
11 | 12/12/2017 | | | Complete |
Parts List & Cost
The Project bill of materials is as listed in the table below.
SNo. | Component | Units | Total Cost |
---|---|---|---|
General System Components | |||
1 | SJOne Board (LPC1758) | 5 | $400 |
2 | Traxxas RC Car | 1 | From Prof. Kaikai Liu |
3 | CAN Transceivers | 15 | $55 |
4 | PCAN Dongle | 1 | From Preet |
5 | PCB Manufacturing | 5 | $70 |
6 | 3D Printing | 2 | From Marvin |
7 | General Hardware Components (connectors, standoffs, soldering kits) | 1 | $40 |
8 | Power Bank | 1 | $41.50 |
9 | LED Digital Display | 1 | From Preet |
10 | Acrylic Board | 1 | $12.53 |
Sensor/IO Controller Components | |||
11 | RP LIDAR | 1 | $449 |
Geo Controller Components | |||
12 | GPS Module | 1 | $69 |
Bluetooth Bridge Controller Component | |||
13 | Bluetooth Module | 1 | $34.95 |
Drive Controller Component | |||
14 | RPM Sensor | 1 | $20 |
CAN Communication
The controllers are connected on a CAN bus running at 100 kbps. System nodes: MASTER, MOTOR, BLE, SENSOR, GEO.
SNo. | Message ID | Message from Source Node | Receivers |
---|---|---|---|
Master Controller Message | |||
1 | 2 | System Stop command to stop motor | Motor |
2 | 17 | Target Speed-Steer Signal to Motor | Motor |
3 | 194 | Telemetry Message to Display it on Android | BLE |
Sensor Controller Message | |||
4 | 3 | Lidar Detections of obstacles in 360 degree grouped as sectors | Master,BLE |
5 | 36 | Heartbeat | Master |
Geo Controller Message | |||
6 | 195 | Compass, Destination Reached flag, Checkpoint id signals | Master,BLE |
7 | 196 | GPS Lock | Master,BLE |
8 | 4 | Turning Angle | Master,BLE |
9 | 214 | Current Coordinate | Master,BLE |
10 | 37 | Heartbeat | Master |
Bluetooth Bridge Controller Message | |||
11 | 1 | System start/stop command | Master |
12 | 38 | Heartbeat | Master |
13 | 213 | Checkpoint Count from AndroidApp | Geo |
14 | 212 | Checkpoints (Lat, Long) from Android App | Geo |
Drive Controller Message | |||
15 | 193 | Telemetry Message | BLE |
16 | 35 | Heartbeat | Master |
DBC File
The CAN message IDs transmitted and received by all the controllers are assigned based on message priority; since a lower CAN ID wins bus arbitration, the most critical messages get the lowest IDs. The priority levels are as follows:
Priority Level 1 - User Commands
Priority Level 2 - Sensor data
Priority Level 3 - Status Signals
Priority Level 4 - Heartbeat
Priority Level 5 - Telemetry signals to display in I/O
BU_: DBG DRIVER IO MOTOR SENSOR MASTER GEO BLE
BO_ 1 BLE_START_STOP_CMD: 1 BLE
 SG_ BLE_START_STOP_CMD_start : 0|4@1+ (1,0) [0|1] "" MASTER
 SG_ BLE_START_STOP_CMD_reset : 4|4@1+ (1,0) [0|1] "" MASTER

BO_ 2 MASTER_SYS_STOP_CMD: 1 MASTER
 SG_ MASTER_SYS_STOP_CMD_stop : 0|8@1+ (1,0) [0|1] "" MOTOR

BO_ 212 BLE_GPS_DATA: 8 BLE
 SG_ BLE_GPS_long : 0|32@1- (0.000001,0) [0|0] "" GEO
 SG_ BLE_GPS_lat : 32|32@1- (0.000001,0) [0|0] "" GEO

BO_ 213 BLE_GPS_DATA_CNT: 1 BLE
 SG_ BLE_GPS_COUNT : 0|8@1+ (1,0) [0|0] "" GEO,SENSOR

BO_ 214 GEO_CURRENT_COORD: 8 GEO
 SG_ GEO_CURRENT_COORD_LONG : 0|32@1- (0.000001,0) [0|0] "" MASTER,BLE
 SG_ GEO_CURRENT_COORD_LAT : 32|32@1- (0.000001,0) [0|0] "" MASTER,BLE

BO_ 195 GEO_TELECOMPASS: 6 GEO
 SG_ GEO_TELECOMPASS_compass : 0|12@1+ (0.1,0) [0|360.0] "" MASTER,BLE
 SG_ GEO_TELECOMPASS_bearing_angle : 12|12@1+ (0.1,0) [0|360.0] "" MASTER,BLE
 SG_ GEO_TELECOMPASS_distance : 24|12@1+ (0.1,0) [0|0] "" MASTER,BLE
 SG_ GEO_TELECOMPASS_destination_reached : 36|1@1+ (1,0) [0|1] "" MASTER,BLE
 SG_ GEO_TELECOMPASS_checkpoint_id : 37|8@1+ (1,0) [0|0] "" MASTER,BLE

BO_ 194 MASTER_TELEMETRY: 3 MASTER
 SG_ MASTER_TELEMETRY_gps_mia : 0|1@1+ (1,0) [0|1] "" BLE
 SG_ MASTER_TELEMETRY_sensor_mia : 1|1@1+ (1,0) [0|1] "" BLE
 SG_ MASTER_TELEMETRY_sensor_heartbeat : 2|1@1+ (1,0) [0|1] "" BLE
 SG_ MASTER_TELEMETRY_ble_heartbeat : 3|1@1+ (1,0) [0|1] "" BLE
 SG_ MASTER_TELEMETRY_motor_heartbeat : 4|1@1+ (1,0) [0|1] "" BLE
 SG_ MASTER_TELEMETRY_geo_heartbeat : 5|1@1+ (1,0) [0|1] "" BLE
 SG_ MASTER_TELEMETRY_sys_status : 6|2@1+ (1,0) [0|3] "" BLE
 SG_ MASTER_TELEMETRY_gps_tele_mia : 8|1@1+ (1,0) [0|1] "" BLE

BO_ 196 GEO_TELEMETRY_LOCK: 1 GEO
 SG_ GEO_TELEMETRY_lock : 0|8@1+ (1,0) [0|0] "" MASTER,SENSOR,BLE

BO_ 3 SENSOR_LIDAR_OBSTACLE_INFO: 6 SENSOR
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR0 : 0|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR1 : 4|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR2 : 8|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR3 : 12|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR4 : 16|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR5 : 20|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR6 : 24|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR7 : 28|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR8 : 32|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR9 : 36|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR10 : 40|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR11 : 44|4@1+ (1,0) [0|12] "" MASTER,BLE

BO_ 4 GEO_TURNING_ANGLE: 2 GEO
 SG_ GEO_TURNING_ANGLE_degree : 0|9@1- (1,0) [-180|180] "" MASTER,BLE
The CAN DBC is available at the Gitlab link below
https://gitlab.com/optimus_prime/optimus/blob/master/_can_dbc/243.dbc
CAN Bus Debugging
We used the PCAN dongle connected to a host PC to monitor the CAN bus traffic with the BusMaster tool. A screenshot of the BusMaster log is shown below.
Hardware & Software Architecture
Master Controller
Software Architecture Design
The Master Controller integrates the functionality of all the other controllers and acts as the central control unit of the self-navigating car. Two of the major functionalities handled by the Master Controller are obstacle avoidance and route maneuvering.
The overview of the Master Controller software architecture is shown in the figure below.
As an analogy to human driving, the master receives inputs from the sensors to determine the car's surroundings and takes decisions based on the environment and the car's current location. The inputs received and outputs sent by the Master are listed below:
Input to Master:
1. Lidar Object Detection information - To determine if there is an obstacle in the path of navigation
2. GPS and Compass Reading - To understand the Heading and Bearing angle to decide the direction of movement
3. User command from Android - To stop or Navigate to the Destination
Output from Master:
1. Motor control information - sends the target Speed and Steering direction to the Motor.
Software Implementation
The Master controller runs two major algorithms: route maneuvering and obstacle avoidance. System start/stop is handled by the master based on specific commands. When the user selects a destination, the route is calculated and the checkpoints of the route are sent from the Android app through the bridge controller to the Geo controller. Once the Geo controller has received a complete set of checkpoints, the master starts the system based on the "Checkpoint ID" signal: a non-zero value means the route has started, and the Master controller runs the route maneuvering algorithm.
The overall control flow of the master controller is shown in the figure below.
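As an illustration only, the following is a minimal C++ sketch of such a periodic decision loop. The types and helper names (MotorCommand, run_route_maneuvering, run_obstacle_avoidance, obstacle_in_path, send_motor_command) are hypothetical placeholders, not the project's actual functions.

// Hypothetical sketch of the master controller's periodic decision loop.
// Names and types are illustrative only, not the project's actual code.
#include <stdint.h>
#include <stdbool.h>

struct MotorCommand { int8_t speed; int8_t steer; };   // -100..+100 each

// Fed by the CAN receive handlers in the real system.
extern uint8_t checkpoint_id;       // from GEO_TELECOMPASS
extern float   current_heading;     // compass, degrees
extern float   current_bearing;     // bearing to next checkpoint, degrees
extern uint8_t lidar_sectors[12];   // track number per sector, 0 = clear

extern MotorCommand run_route_maneuvering(float heading, float bearing);
extern MotorCommand run_obstacle_avoidance(const uint8_t sectors[12], MotorCommand base);
extern bool         obstacle_in_path(const uint8_t sectors[12]);
extern void         send_motor_command(int8_t speed, int8_t steer);  // CAN message 17

static bool system_started = false;

void master_periodic_task(void)
{
    // A non-zero checkpoint ID means the Android app has delivered a route.
    if (!system_started) {
        system_started = (checkpoint_id != 0);
        if (!system_started) return;
    }

    // Default command: steer toward the next checkpoint.
    MotorCommand cmd = run_route_maneuvering(current_heading, current_bearing);

    // Obstacle avoidance overrides the route decision when a near track is occupied.
    if (obstacle_in_path(lidar_sectors)) {
        cmd = run_obstacle_avoidance(lidar_sectors, cmd);
    }

    send_motor_command(cmd.speed, cmd.steer);
}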
Unit Testing
The obstacle avoidance algorithm is unit tested using the Cgreen unit testing framework. The complete unit test code is included in the Git project.
Ensure(test_obstacle_avoidance)
{
    // Obstacle Avoidance Algorithm
    pmaster->set_target_steer(MC::steer_right);
    mock_obstacle_detections(MC::steer_right, MC::steer_right, false, false, false, false, false, false, true);
    assert_that(pmaster->RunObstacleAvoidanceAlgo(obs_status), is_equal_to(expected_steer));
    assert_that(pmaster->get_forward(), is_equal_to(true));
    assert_that(pmaster->get_target_speed(), is_equal_to(MC::speed_slow));
}

Ensure(test_obstacle_detection)
{
    // Obstacle Detection Algorithm
    mock_CAN_Rx_Lidar_Info(2, 2, 6, 0, 2, 2, 4, 0, 2, 0, 5, 0);
    set_expected_detection(true, false, true, false, true, false, false);
    actual_detections = psensor->RunObstacleDetectionAlgo();
    assert_that(compare_detections(actual_detections), is_equal_to(7));
}

TestSuite* master_controller_suite()
{
    TestSuite* master_suite = create_test_suite();
    add_test(master_suite, test_obstacle_avoidance);
    add_test(master_suite, test_obstacle_detection);
    return master_suite;
}
On board debug indications
Sr.No | LED Number | Debug Signal |
---|---|---|
1 | LED 1 | Sensor Heartbeat, Sensor Data Mia |
2 | LED 2 | Geo Heartbeat, Turning Angle Signal Mia |
3 | LED 3 | Bridge Heartbeat mia |
4 | LED 4 | Motor Heartbeat mia |
Design Challenges
The critical part of the obstacle avoidance design is twofold: 1. obstacle detection and 2. obstacle avoidance. Since we get a 360-degree view of obstacles, we group the zones into sectors and tracks to process the 360-degree detections and take decisions accordingly.
Motor Controller
Design & Implementation
The Motor Controller is responsible for the movement and steering of the car. It includes two motors: a DC motor for movement and a servo motor for steering. The motor has a built-in driver called an ESC (Electronic Speed Control) circuit, used to manipulate the speed and steering of the car; it has a PWM input for both the servo motor and the DC motor. We use an RPM sensor as feedback from the motor to monitor the speed.
Hardware Design
SJOne Pin Diagram | ||
---|---|---|
Sr.No | Pin Number | Pin Function |
1 | P0.1 | HEADLIGHTS |
2 | P1.19 | BRAKELIGHTS |
3 | P1.20 | LEFT INDICATORS |
4 | P1.22 | RIGHT INDICATORS |
5 | P0.26 | RPM SENSOR |
6 | P2.0 | SERVO PWM |
7 | P2.1 | MOTOR PWM |
Hardware Specifications
- 1. DC Motor, Servo and ESC
This is a Traxxas Titan 380 18-turn brushed motor. The DC motor comes with an Electronic Speed Control (ESC) module. The ESC module can control both the servo and the DC motor using Pulse Width Modulation (PWM). The ESC also requires an initial calibration:
The ESC is operated using PWM signals. The DC motor command is expressed in the range -100% to +100%, where -100% means "reverse at full speed", +100% means "forward at full speed", and 0 means "stop / neutral".
The servo is likewise operated safely using PWM.
Servo motors are used wherever a controlled 0 to 180 degree motion is needed, as in robot arms, humanoids, and steering. A servo is essentially a normal motor with a potentiometer coupled to its shaft; the potentiometer provides analog position feedback, and the servo adjusts its angle to match the input signal.
How is it operated? A servo usually requires 5-6 V for VCC (industrial servos require more), ground, and a signal to set its position. The signal is a PWM waveform with a frequency of about 50-200 Hz (refer to the datasheet); at 50 Hz the period is 20 ms. Within that 20 ms period, an on-time of 1 ms (and 19 ms off) gives the 0 degree position, and increasing the on-time from 1 ms to 2 ms sweeps the angle from 0 to 180 degrees. Things that can go wrong:
Power: we generally use higher-voltage batteries regulated down to 5 V. The servo must not be powered from the microcontroller, because it draws far more current than the microcontroller can supply. If the servo is powered directly from the battery the microcontroller is safe, but a supply of, say, 12 V will destroy the servo.
PWM: the pulse width should stay strictly within the 1 ms to 2 ms range (refer to the datasheet). If it drifts outside this range, the servo starts jittering, overheats, and eventually burns itself out. The problem is easy to identify by the jitter noise; if you hear it when the servo is powered on, turn it off immediately and fix the code.
Load: hobby servos do not have a high load-bearing capacity, yet by design they keep trying to reach the commanded angle. Under excessive load the servo cannot move further while the signal keeps forcing it, so it overheats and fails. Keep the load well within the servo's safety margin.
- 2. RPM Sensor
The RPM sensor above requires a specific kind of installation. The installation steps are:
Once the installation is done, the RPM can be read using the magnetic RPM sensor described above. It gives one high pulse per rotation of the wheel. To calculate the RPM, the sensor output is fed to a GPIO pin of the SJOne board.
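A minimal sketch of how such a pulse count can be converted to RPM, assuming a hypothetical interrupt handler attached to the sensor's GPIO pin (not the project's actual driver):

// Hypothetical sketch: count RPM-sensor pulses on a GPIO interrupt and
// convert the count to wheel RPM once per second. Not the project's code.
#include <stdint.h>

static volatile uint32_t pulse_count = 0;

// Attached to the rising edge of the GPIO pin (P0.26) wired to the RPM sensor.
void rpm_sensor_isr(void)
{
    pulse_count++;   // one pulse per wheel rotation (one magnet on the wheel)
}

// Called once per second from a periodic task.
uint32_t compute_wheel_rpm(void)
{
    uint32_t pulses = pulse_count;
    pulse_count = 0;
    return pulses * 60;   // rotations per second -> rotations per minute
}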
Motor Module Hardware Interface
The hardware connections of the motor module are shown in the schematic above. The motor receives commands over the CAN bus from the Master, and the current speed of the car, measured by the RPM sensor, is sent back to the Master as feedback. The speed published on the CAN bus is also used by the I/O module and the BLE module to show the value on the LED display and the Android app.
Software Design
The following diagram describes the flow of the software implementation for the motor driver and speed feedback mechanism.
Motor Module Implementation
The motor controller is operated based on the CAN messages received from the Master. The Master sends drive and steer commands; the motor controller converts the received values (-100 to +100 for drive speed percent, and -100 to +100 for steer, mapped onto the servo's 1 to 180 degree range) into the specific PWM values required by the DC motor and the servo.
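A minimal sketch of this conversion, assuming a standard servo-style ESC input of 50 Hz where 1.0 ms is full reverse / full left, 1.5 ms is neutral / straight, and 2.0 ms is full forward / full right; the helper names are hypothetical:

// Hypothetical sketch of mapping the Master's -100..+100 commands to servo/ESC
// PWM duty cycles at 50 Hz (20 ms period). Not the project's actual driver.
#include <stdint.h>

static const float kPwmFrequencyHz = 50.0f;   // 20 ms period
static const float kNeutralMs      = 1.5f;    // neutral / straight
static const float kRangeMs        = 0.5f;    // +/- 0.5 ms swing gives 1.0..2.0 ms

// Convert a -100..+100 percentage into a duty-cycle percentage of the 20 ms period.
float command_to_duty_percent(int8_t command_percent)
{
    float pulse_ms  = kNeutralMs + (command_percent / 100.0f) * kRangeMs;
    float period_ms = 1000.0f / kPwmFrequencyHz;
    return (pulse_ms / period_ms) * 100.0f;   // e.g. 0 -> 7.5 %, +100 -> 10 %
}

// Example usage with a hypothetical PWM API:
// set_motor_pwm(command_to_duty_percent(50));   // forward at half speed
// set_servo_pwm(command_to_duty_percent(20));   // slight right turn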
- Speed Regulation:
On an uphill slope the pulse frequency from the RPM sensor drops, which means the car is slowing down; in that case the car is accelerated (PWM increased) to maintain the required speed. Similarly, on a downhill slope the pulse frequency increases, which means the car is speeding up, so braking (reduced PWM) is applied to compensate.
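A minimal sketch of this feedback, assuming a simple proportional correction applied each time a new RPM reading is available (the actual tuning on the car may differ):

// Hypothetical sketch of ramp compensation: nudge the PWM command up or down
// so the measured RPM tracks the target RPM. Not the project's actual code.
#include <stdint.h>

float regulate_speed(float current_pwm_percent, uint32_t measured_rpm, uint32_t target_rpm)
{
    const float kGain = 0.05f;   // proportional gain (tuning value)
    float error = (float)target_rpm - (float)measured_rpm;

    // Uphill: measured RPM drops -> error positive -> increase PWM.
    // Downhill: measured RPM rises -> error negative -> decrease PWM (brake).
    float adjusted = current_pwm_percent + kGain * error;

    // Clamp to the valid forward command range.
    if (adjusted > 100.0f) adjusted = 100.0f;
    if (adjusted < 0.0f)   adjusted = 0.0f;
    return adjusted;
}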
Sensor Controller
The sensor controller detects the obstacles used for obstacle avoidance. For this purpose we used the RPLIDAR by SLAMTEC.
Introduction
The RPLIDAR A2 is a 360-degree 2D laser scanner (LIDAR) developed by SLAMTEC. It can take up to 4000 laser-ranging samples per second at a high rotation speed. Equipped with SLAMTEC's patented OPTMAG technology, it overcomes the lifetime limitation of traditional LIDAR systems and can work stably for a long time. The system performs a 2D 360-degree scan within a 6-meter range. The generated 2D point-cloud data can be used in mapping, localization, and object/environment modeling. The typical scanning frequency of the RPLIDAR A2 is 10 Hz (600 rpm), at which the resolution is 0.9°; the scanning frequency can be adjusted within a 5-15 Hz range according to the user's requirements. The RPLIDAR A2 adopts the low-cost laser triangulation measurement system developed by SLAMTEC, which gives it excellent performance in all kinds of indoor environments and in outdoor environments without direct sunlight exposure.
The LIDAR consists of a range-scanner core and a mechanical part that rotates the core at high speed. When functioning normally, the scanner rotates and scans clockwise. Users get the range scan data via the RPLIDAR's communication interface (UART) and control the start, stop, and rotation speed of the motor via PWM.
A laser beam is emitted by the transmitter and its reflection is captured by the receiver; from the reflection the distance to the obstacle is calculated. If there is no obstacle within range, no reflection is received.
Hardware Implementation
Specifications of the LIDAR
The specifications of the LIDAR as mentioned in the datasheet are as follows:
Power Supply: 5V
Serial Communication interface: UART
Baud Rate for the UART: 115200
Working mode of the UART: 8N1
PWM Maximum Voltage: 5V (Typical 3.3V)
PWM frequency: 25KHz
Duty Cycle of the PWM wave: 60% - 100%
Connections to the SJOne Board
The LIDAR uses a UART interface and so is connected to the UART3 pins of the SJOne board, i.e. P4.28 and P4.29. As the LIDAR needs a 5 V supply, it is powered from the PCB (which is powered by a power bank) instead of the SJOne board, which can supply only 3.3 V. The connections are shown in the figure below.
Software Implementation
Approach for obtaining the data from the LIDAR
The LIDAR senses all the obstacles around it (360 degrees, up to a range of 6 m), one degree at a time. This means that for one rotation of the LIDAR we get 360 values, i.e. 360 angles with their corresponding obstacle information. It takes 180 ms for the LIDAR to complete one 360-degree scan. Since we do not need obstacle information for each and every angle, we group a few angles together into "sectors" and consider the nearest object present in a sector as the obstacle. To identify how far an obstacle is located, the distance values are grouped into "tracks", i.e. 0 cm to 25 cm is track 1, 25 cm to 50 cm is track 2, and so on. The motor action depends on the track in which an obstacle is present.
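A minimal sketch of this grouping, assuming 12 sectors of 30 degrees each and the 25 cm track width described above (the sector boundaries and track width actually used on the car may differ):

// Hypothetical sketch of grouping a single LIDAR sample (angle, distance)
// into a sector and a track. Assumes 12 sectors x 30 degrees and 25 cm tracks.
#include <stdint.h>

static const float   kSectorWidthDeg = 30.0f;   // 360 / 12 sectors
static const float   kTrackWidthCm   = 25.0f;   // track 1 = 0-25 cm, track 2 = 25-50 cm, ...
static const uint8_t kNumSectors     = 12;

uint8_t angle_to_sector(float angle_deg)
{
    uint8_t sector = (uint8_t)(angle_deg / kSectorWidthDeg);
    return (sector < kNumSectors) ? sector : (uint8_t)(kNumSectors - 1);
}

uint8_t distance_to_track(float distance_cm)
{
    if (distance_cm <= 0.0f) return 0;                  // 0 = no obstacle / invalid
    return (uint8_t)(distance_cm / kTrackWidthCm) + 1;  // 10 cm -> track 1, 30 cm -> track 2
}

// For each sector, only the nearest obstacle (smallest track number) is kept
// and reported in the SENSOR_LIDAR_OBSTACLE_INFO CAN message.
void update_sector_tracks(uint8_t sector_track[12], float angle_deg, float distance_cm)
{
    uint8_t s = angle_to_sector(angle_deg);
    uint8_t t = distance_to_track(distance_cm);
    if (t != 0 && (sector_track[s] == 0 || t < sector_track[s])) {
        sector_track[s] = t;
    }
}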
Algorithm for interfacing LIDAR to SJOne board and obtaining the obstacle info
Step 1: Send a GET_HEALTH (0XA5 0X52) Request. If the receive times out it is a communication error.
Step 2: Check whether a ‘protection stop’ has occurred. If it has, send a RESET (0XA5 0X40). Check for ‘protection stop’ again and, if it is still set, send another RESET. If ‘protection stop’ remains set even after sending RESET multiple times, there is a hardware defect. If there is no hardware defect, proceed to the next step.
Step 3: Send a START_SCAN (0XA5 0X20) request. Wait for the response header. If there is no timeout, read the measurement sample. Otherwise check HEALTH_STATUS and MOTOR_STATUS again. Send START_SCAN again.
Step 4: Continuously read the measurement samples. The data sent from the LIDAR contains a start bit, angle, distance, and quality. The start bit is set to 1 once per 360-degree scan. The angle and distance correspond to the scan angle and the distance to the obstacle at that angle. The quality represents the strength of the reflected beam; if the quality is zero, there is no obstacle in that direction. This data is processed to group the information into sectors and tracks.
Step 5: If we wish to stop the readings, send a STOP (0XA5 0X25) request. This is the end of operation. A sketch of this request/response sequence is shown below.
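A minimal sketch of the start-up part of this sequence over UART; uart_write/uart_read are hypothetical stand-ins for the SJOne UART3 driver, and only the request codes listed in the steps above are taken from the RPLIDAR protocol:

// Hypothetical sketch of the RPLIDAR start-up sequence described above.
// uart_write()/uart_read() stand in for the SJOne UART3 driver.
#include <stdint.h>
#include <stdbool.h>

extern bool uart_write(const uint8_t *data, uint32_t len);
extern bool uart_read(uint8_t *data, uint32_t len, uint32_t timeout_ms);

static const uint8_t kGetHealth[2] = { 0xA5, 0x52 };
static const uint8_t kReset[2]     = { 0xA5, 0x40 };
static const uint8_t kStartScan[2] = { 0xA5, 0x20 };
static const uint8_t kStop[2]      = { 0xA5, 0x25 };

bool lidar_start(void)
{
    uint8_t descriptor[7] = { 0 };   // response descriptor preceding the data
    uint8_t health[3]     = { 0 };   // health payload

    // Step 1: GET_HEALTH -- a receive timeout here means a communication error.
    uart_write(kGetHealth, sizeof(kGetHealth));
    if (!uart_read(descriptor, sizeof(descriptor), 1000)) return false;
    if (!uart_read(health, sizeof(health), 1000)) return false;

    // Step 2: if the health payload reports "protection stop", send RESET and
    // repeat the health check; repeated failures indicate a hardware defect.
    if (health[0] != 0 /* assumed status byte: non-zero = warning/protection stop */) {
        uart_write(kReset, sizeof(kReset));
        // ... re-issue GET_HEALTH here.
    }

    // Step 3: START_SCAN, then wait for the response descriptor before
    // continuously reading measurement samples (step 4).
    uart_write(kStartScan, sizeof(kStartScan));
    return uart_read(descriptor, sizeof(descriptor), 1000);
}

void lidar_stop(void)
{
    uart_write(kStop, sizeof(kStop));   // Step 5: end of operation
}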
Flowchart for Communicating with the LIDAR and receiving obstacle information
The entire flowchart for communicating with the LIDAR and receiving data from it is shown in the figure below:
Testing the data obtained from the LIDAR
To perform the initial testing of the LIDAR and to check that we were getting correct obstacle information, we made a setup enclosing the LIDAR on all four sides. By plotting the distance values reported by the LIDAR in Microsoft Excel, we can visualize a map of the obstacles as detected by the LIDAR. The map plotted in Excel after closing almost all four sides around the LIDAR is shown in the figure below.
CAN DBC messages sent from the Sensor Controller
The data received from the LIDAR is grouped into sectors and tracks and is sent over the CAN bus. The CAN DBC messages in the DBC file will be as follows
BO_ 3 SENSOR_LIDAR_OBSTACLE_INFO: 6 SENSOR
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR0 : 0|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR1 : 4|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR2 : 8|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR3 : 12|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR4 : 16|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR5 : 20|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR6 : 24|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR7 : 28|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR8 : 32|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR9 : 36|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR10 : 40|4@1+ (1,0) [0|12] "" MASTER,BLE
 SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR11 : 44|4@1+ (1,0) [0|12] "" MASTER,BLE
Android Application
Description
An Android mobile application to navigate and power the self-driving car "OPTIMUS".
The Optimus app plays an important role in the system: it integrates the UI along with remote controls such as "Power" and "Navigation" over a Bluetooth channel with the self-navigating car.
The app uses the RFCOMM Bluetooth communication protocol and a BLE transceiver to communicate with the car and exchange information at a baud rate of 9600 bps.
The Optimus mobile platform needs to connect to a specific device address based on the BLE chip type in use.
- OPTIMUS HOME
Features
1. BLUETOOTH
The Android mobile application includes support for the Bluetooth network stack, which allows the device to wirelessly exchange data with the HC-05 Bluetooth module.
The Android application framework provides access to Bluetooth functionality through the Android Bluetooth APIs. These APIs let applications wirelessly connect to other Bluetooth devices, enabling point-to-point wireless features.
Using the Bluetooth APIs, the Android application performs the following:
- Scan for other Bluetooth device HC-05 on RC Car [00:**:91:D9:14:**]
- Query the local Bluetooth adapter for paired Bluetooth devices
- Establish RFCOMM channels
- Connect to other devices through service discovery
- Transfer data to and from other devices
- Manage multiple connections
2. MAPS
The OPTIMUS app uses Google Maps for setting up the routing map information and for deciding on the next checkpoint for the car, computing the appropriate shortest route over the checkpoints using an adjacency matrix and shortest-path algorithms.
Google Maps is used along with other features to improve the navigation experience, since plotting routes and mapping checkpoints on the winding paths around campus is difficult to plan and route using the Google APIs alone.
- MAPS :: ANDROID - BLE COMMUNICATION JSON SCHEMA
The app was also upgraded with a live-tracking feature for the car's location: each crossed checkpoint marker is recolored with YELLOW_HUE to distinguish the traversed path from the remaining path.
As soon as the car crosses a checkpoint marker, the marker color changes from its original blue to yellow to indicate that the checkpoint has been crossed.
The Optimus app uses interpolation schemes to calculate intermediate checkpoints, and a draggable-marker mechanism to set the destination and plot the route path to it.
The JSON format shown has various tags for extracting checkpoint information using a JSON reader and plotting the points on the map. The fields of the JSON data packet are:
- Feature Properties:
* Name : Description of the route Start Point
* Description [optional] : Custom Description of the route
* LineString : Signifies the route type eg. Line Plot
* coordinates : List of Lat-Long Coordinates till Next major Check point
3. DASHBOARD
The dashboard was designed to give an at-a-glance view and to project a UI similar to a car dashboard in the app: it shows compass values, bearing and heading angles, and LIDAR maps that mirror the data obtained from the LIDAR, which also helps in debugging the features and the values sent from the respective sensor modules.
- OPTIMUS DASHBOARD
- DASHBOARD JSON SCHEMA
- Dashboard Information:
* JSON_ID_GPS_LOCK_STAT : Signifies the current Status of GPS LOCK on the car
* JSON_ID_COMPASS_HEADING : Signifies current Heading Angle from COMPASS
* JSON_ID_COMPASS_BEARING : Signifies current Bearing Angle from COMPASS
* JSON_ID_TURNING_ANGLE : Signifies current TurningAngle from COMPASS
* JSON_ID_DIST_TO_DEST : Signifies distance from Current Location to Destination or Absolute Displacement of Car relative to Destination Checkpoint
* JSON_ID_DEST_REACHED : Signifies whether the car has reached Destination or not!
- LIDAR Information:
* JSON_ID_SENSOR_LIDAR_OBSTACLE_INFO_SEC0 : Signifies Track position of the Obstacles detected on multiple Sectors by LIDAR
For example: track 9, sector 1 means an obstacle is detected in sector 1 at 450 centimeters (4.50 meters) from the car's current position, within an angle range of 20-45 degrees from the LIDAR/car front line of sight at that time instant.
Android: LIDAR PLOT | Android: LIDAR PLOT |
Bluetooth Controller
Hardware Implementation
Bluetooth Module Pin Configuration:
We are using the HC-05 Bluetooth module to send and receive data from our Android application.
The bridge controller is connected to the Bluetooth module through the UART serial interface (UART3) at 9600 baud, with 8-bit data and 1 stop bit.
Software Implementation
Pseudocode of the Bridge controller (a sketch of this loop follows the list):
1. Turn on the bridge controller.
2. Initialize the Bluetooth module with the UART3 settings.
3. Initialize the CAN bus at 100 kbps.
4. Handle incoming I/O messages received from the Geo and Sensor controllers over the CAN bus.
5. Send the received CAN data to the Android app over Bluetooth every second.
6. Send a heartbeat message to the Master controller every second.
7. Read the Bluetooth messages received from the Android app.
8. Forward the Android message to the Geo controller if it contains checkpoints; otherwise forward it to the Master.
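A minimal C++ sketch of this loop, with hypothetical stand-ins (can_receive, can_send, bt_uart_readline, bt_uart_write, format_telemetry_as_json, parse_android_command) for the real drivers and parsers:

// Hypothetical sketch of the bridge controller's main loop, not the actual
// project code. All helper functions are illustrative placeholders.
#include <stdint.h>
#include <stdbool.h>

struct CanMsg { uint32_t id; uint8_t data[8]; uint8_t dlc; };

extern bool can_receive(CanMsg *msg);
extern void can_send(const CanMsg *msg);
extern bool bt_uart_readline(char *buf, uint32_t len);            // from the Android app
extern void bt_uart_write(const char *json);                      // to the Android app
extern const char *format_telemetry_as_json(const CanMsg *msg);   // hypothetical formatter
extern CanMsg parse_android_command(const char *line);            // hypothetical parser

// Runs once per second from a periodic task (steps 4-8 of the pseudocode).
void bridge_one_second_task(void)
{
    CanMsg msg;

    // Steps 4-5: collect Geo/Sensor telemetry from the CAN bus and push it
    // to the Android app over Bluetooth as JSON.
    while (can_receive(&msg)) {
        bt_uart_write(format_telemetry_as_json(&msg));
    }

    // Step 6: heartbeat (CAN message ID 38) to the Master.
    CanMsg heartbeat = { 38, { 1 }, 1 };
    can_send(&heartbeat);

    // Steps 7-8: commands from the Android app. Checkpoint data (IDs 212/213)
    // goes to the Geo controller; start/stop (ID 1) goes to the Master.
    char line[128];
    if (bt_uart_readline(line, sizeof(line))) {
        CanMsg cmd = parse_android_command(line);
        can_send(&cmd);
    }
}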
DBC format for the messages sent from the Bluetooth controller:
BO_ 1 BLE_START_STOP_CMD: 1 BLE
 SG_ BLE_START_STOP_CMD_start : 0|4@1+ (1,0) [0|1] "" MASTER
 SG_ BLE_START_STOP_CMD_reset : 4|4@1+ (1,0) [0|1] "" MASTER

BO_ 38 BLE_HEARTBEAT: 1 BLE
 SG_ BLE_HEARTBEAT_signal : 0|8@1+ (1,0) [0|255] "" MASTER

BO_ 212 BLE_GPS_DATA: 8 BLE
 SG_ BLE_GPS_long : 0|32@1- (0.000001,0) [0|0] "" GEO
 SG_ BLE_GPS_lat : 32|32@1- (0.000001,0) [0|0] "" GEO

BO_ 213 BLE_GPS_DATA_CNT: 1 BLE
 SG_ BLE_GPS_COUNT : 0|8@1+ (1,0) [0|0] "" GEO,SENSOR
Geographical Controller
Introduction
GPS and Compass Module:
GPS:
GPS is a global navigation satellite system that provides geo location and time information to a GPS receiver anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites.
Compass:
A compass is an instrument used for navigation and orientation that shows direction relative to the geographic cardinal directions (or points). Usually, a diagram called a compass rose shows the directions north, south, east, and west on the compass face as abbreviated initials. When the compass is used, the rose can be aligned with the corresponding geographic directions; for example, the "N" mark on the rose really points northward. Compasses often display markings for angles in degrees in addition to (or sometimes instead of) the rose. North corresponds to 0°, and the angles increase clockwise, so east is 90°, south is 180°, and west is 270°. These numbers allow the compass to show azimuths or bearings, which are commonly stated in this notation. We are using DJI's NAZA GPS/compass module to get the GPS coordinates and heading angle. The diagram of the module is as follows:
Message Structure:
- GPS:
The 0x10 message contains GPS data. The message structure is as follows:
- Compass:
The 0x20 message contains compass data. The structure of the message is as follows:
- Calibration:
Why calibrate the compass?
Ferromagnetic substances placed on multi-rotor or around its working environment affect the reading of earth’s magnetic field for the digital compass. It also reduces the accuracy of the multi-rotor control, or even reads an incorrect heading. Calibration will eliminate such influences, and ensure MC system performs well in a non-ideal magnetic environment.
When to do it?
• The first time you install Naza compass.
• When the multi-rotor mechanical setup has changed.
• If the GPS/Compass module is re-positioned.
• If electronic devices are added/removed/re-positioned.
Hardware Connection
The Pin Configuration is as follows:
Software Design
Algorithm: Distance calculation:
We are using the ‘haversine’ formula to calculate the great-circle distance between two points, that is, the shortest distance over the earth's surface.
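A minimal C++ implementation of the haversine distance (a generic sketch, not necessarily the project's exact code):

// Haversine great-circle distance between two latitude/longitude points, in meters.
#include <cmath>

static const double kPi = 3.14159265358979323846;
static double deg_to_rad(double deg) { return deg * kPi / 180.0; }

double haversine_distance_m(double lat1, double lon1, double lat2, double lon2)
{
    const double kEarthRadiusM = 6371000.0;   // mean earth radius
    double dlat = deg_to_rad(lat2 - lat1);
    double dlon = deg_to_rad(lon2 - lon1);

    double a = std::sin(dlat / 2) * std::sin(dlat / 2) +
               std::cos(deg_to_rad(lat1)) * std::cos(deg_to_rad(lat2)) *
               std::sin(dlon / 2) * std::sin(dlon / 2);
    double c = 2 * std::atan2(std::sqrt(a), std::sqrt(1 - a));
    return kEarthRadiusM * c;
}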
Bearing Angle calculation:
The bearing of a point is the number of degrees in the angle measured in a clockwise direction from the north line to the line joining the centre of the compass with the point. A bearing is used to represent the direction of one point relative to another point. The bearing angle is calculated by using the following formula:
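A minimal C++ sketch of the standard forward-azimuth calculation, which we believe is the formula referenced here, together with the turning-angle normalization used to compare the bearing against the compass heading (a generic sketch, not necessarily the project's exact code):

// Standard forward-azimuth (bearing) from point 1 to point 2, and the turning
// angle relative to the current compass heading.
#include <cmath>

static const double kPi2 = 3.14159265358979323846;
static double to_rad(double d) { return d * kPi2 / 180.0; }
static double to_deg(double r) { return r * 180.0 / kPi2; }

double bearing_deg(double lat1, double lon1, double lat2, double lon2)
{
    double dlon = to_rad(lon2 - lon1);
    double y = std::sin(dlon) * std::cos(to_rad(lat2));
    double x = std::cos(to_rad(lat1)) * std::sin(to_rad(lat2)) -
               std::sin(to_rad(lat1)) * std::cos(to_rad(lat2)) * std::cos(dlon);
    double bearing = to_deg(std::atan2(y, x));
    return std::fmod(bearing + 360.0, 360.0);   // normalize to 0..360 degrees
}

// Turning angle the car needs: positive = turn right, negative = turn left.
double turning_angle_deg(double bearing, double heading)
{
    return std::fmod(bearing - heading + 540.0, 360.0) - 180.0;   // -180..180 degrees
}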
Package Design
PCB Design
3D Printed Sensor Mounts
We designed 3D-printed mounts for holding the LIDAR sensor and the GPS using the OpenSCAD software.
Git Project Management
The GitLab project is managed by working on different branches for different controllers, with merge access to the master branch restricted. For easy notification of all Git activity, we created a webhook posting Git notifications to the CMPE243 Slack channel. Useful Git features such as the issues list and milestone tracking are used for easy management.
The project git repository is below.
https://gitlab.com/optimus_prime/optimus/tree/master
Technical Challenges
Motor Technical Challenges
1) ESC Calibration
We messed up the calibration on the ESC.
The XL-5 has a long-press option to calibrate the ESC, where the ESC will:
a) After the long press, glow green and start taking the PWM signal for neutral (1.5 ms).
b) Glow green once again, at which point we feed in the PWM signal for forward (2 ms).
c) Glow green twice, at which point we feed in the PWM signal for reverse (1 ms).
We wrote code to calibrate using an external interrupt (EINT3) on P0.1, so that a switch press calibrates the ESC this way.
2) ESC Reverse
The ESC was not engaging reverse when we directly applied the value described in the documentation (there is no formal datasheet; only the XL-5 forums, which mention a 1 ms pulse width at 50 Hz for reverse).
We figured out that reverse actually requires a sequence of steps:
a) goNeutral()
b) goReverse()
c) goNeutral()
d) goReverse()
3) RPM Sensor Installation:
After following the installation steps above, the RPM sensor was not detecting the rotation (magnet) of the wheel.
The reason was the machined-steel pinion gear and slipper clutch. The pinion gear and slipper clutch that came with the RC car were large, which increased the distance between the magnet and the RPM sensor; that is why we were not able to detect the wheel RPM.
We even verified the signal activity using a digital oscilloscope.
We then switched to a smaller machined-steel pinion gear and slipper clutch, reinstalled the RPM sensor, and it worked.
Android Issues Undergone
- MAPS: Plotting Routes and Offline Check Points Calculation
With our initial implementation using the Google Android API we were able to plot routes, but soon during testing of the route navigation we faced a couple of issues:
1. For straight-line routes, the intermediate checkpoints were often not provided, since the Google APIs only generate checkpoints at intersections where the route bends.
2. Due to this drawback, straight routes were hard to navigate, and interpolation was required to make sure the Geo controller has enough checkpoints to update the heading angle before the car drifts too far from its intended straight path.
3. Google routes are calculated from any point on the ground to the nearest offset point on the pre-drawn custom Google polyline path. As a result, routes from certain locations ended up with sharp edges rather than smooth curves, which also made them slightly longer, and the car ended up on sidewalks or in bushes while correcting its course to rejoin the main route.
- Application Compatibility
One of the issues faced during implementation was the Android security model and the permissions to use geolocation and app storage.
After every fresh app installation, the permissions had to be revisited and enabled for the app to access them, something that can still be improved further.
Testing and Procedures to Overcome Challenges
MAP DEBUGGING & ROUTE CALCULATION
To overcome the problem of placing routes and calculating the shortest path, we decided to interpolate routes within the university premises.
Steps involved:
- Draw polyline routes over saved checkpoint coordinates by reading and parsing a JSON file at the app level to get the next checkpoint coordinates.
- Use Dijkstra's Algorithm to calculate shortest path between those routes.
- For longer routes two approaches could be taken to calculate the intermediate checkpoints:
- a. Straightline Approach
- b. Geodesy Engineering Approach.
- a. Straightline Approach
The geodesy approach is more complex and can be implemented using the haversine technique to calculate the intermediate points between two points along the geographic surface of the earth, but since the demo distances were not long enough to be significantly affected by the earth's curvature, we decided to go with the straight-line approach.
We used the Vincenty formulae to compute the interpolated points between two checkpoints: when the distance between them exceeds roughly 10±5 meters, the algorithm interpolates the route to produce intermediate checkpoints, which are marked on the map with blue markers.
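As an illustration of the idea only (the app itself is written in Java and may use the full Vincenty formulae), a simplified C++ sketch that inserts intermediate checkpoints by straight-line interpolation whenever two checkpoints are farther apart than a threshold:

// Simplified illustration: insert intermediate checkpoints between two GPS
// coordinates whenever they are farther apart than a threshold. Linear
// interpolation is a reasonable approximation over tens of meters.
#include <cmath>
#include <vector>

struct LatLng { double lat; double lon; };

extern double haversine_distance_m(double lat1, double lon1, double lat2, double lon2);

std::vector<LatLng> interpolate_checkpoints(LatLng a, LatLng b, double max_gap_m = 10.0)
{
    std::vector<LatLng> points;
    double dist = haversine_distance_m(a.lat, a.lon, b.lat, b.lon);
    int segments = (int)std::ceil(dist / max_gap_m);

    for (int i = 1; i < segments; ++i) {
        double t = (double)i / segments;
        points.push_back({ a.lat + t * (b.lat - a.lat),
                           a.lon + t * (b.lon - a.lon) });
    }
    points.push_back(b);   // the original next checkpoint closes the segment
    return points;
}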
For an easy user view, we added the hybrid map type in the app so that the user gets a 3D feel of the route.
MARKERS
- We also added colored markers to denote the following:
>>> START/STOP : Custom Markers
>>> CAR LOCATION : Yellow Markers
>>> INTERMEDIATE CHECKPOINTS : HUE_BLUE Markers
Sensor Controller
1. The LIDAR sometimes cannot detect black objects, as the laser light is completely absorbed by black surfaces and nothing is reflected back.
LIDAR doesn't detect black objects
2. LIDAR detection happens in the plane in which the LIDAR is mounted, so an object shorter than the mounting height will not be detected.
LIDAR doesn't detect objects lower than its height
3. If a ramp is steep enough, it enters the LIDAR's scan plane and is treated as an obstacle.
4. Exposing the LIDAR to direct sunlight introduces noise into the obstacle detection.
Geo Technical Challenges
The first and major issue we faced with the Geo module was selecting the proper hardware for the GPS and compass. We tried the SparkFun, Adafruit, and u-blox GPS modules, but they took a long time to get a lock and their error was high. We then switched to the DJI Naza GPS and found it quite accurate, with a lock-up time of barely a minute. The software issue we faced with the Naza GPS was the lack of proper documentation; we studied the message packets and went through forums to understand the message layout, after which we were able to integrate the module successfully.
The Naza GPS module comes with a built-in compass, which simplified our setup as we did not have to integrate two separate modules.
We faced one more hardware issue: when the RX pin of the GPS module was accidentally connected to the ground pin, the GPS started drawing a lot of current. To avoid this kind of mistake, we added a fuse to the GPS supply, so that even if excess current is drawn, the fuse prevents it from affecting the entire system.
Also, the car was drifting toward the edges even when the path ran along the middle of the road according to Google Maps. After developing an app to map the checkpoints, we found that the plotted path actually ran inside buildings, so we had to find a different solution: we created a database of all the routes on campus and processed the route through the Android app.
Project Videos
Conclusion
As a team we were able to achieve the set of goals and requirements within the required time frame. Over the course of this project, we learnt cutting edge industry standards and techniques such as:
- Team Work: Working in a team with so many people gave us a real sense of what happens in the industry when a large number of people work together.
- GIT: Our source code versioning, code review sessions, and test management were done using Git.
- CAN: A simple and robust broadcast bus which works with a pair of differential signals. We were able to use the CAN bus to interconnect five LPC1758 micro controllers powered by FreeRTOS.
- Accountability: Dealing with both software and hardware is not an easy task and nothing can be taken for granted, especially the hardware.
- Hardware issues:
- Power Issues: Initially we were using a single port of the power bank to power everything (all the boards), including the LIDAR. This caused the LIDAR to stop working due to insufficient current, and it took us a while to figure this out.
- GPS: Calibrating the GPS and getting accurate data from the GPS was a challenging task.
- Android Application: Using Google Maps to obtain checkpoints did not work out, as Google Maps returned only a single checkpoint, so we created a database of checkpoints for navigating the car across the SJSU campus.
- Debugging: Connecting the PCAN dongle to the car and moving around with it is a difficult way to debug, so we created a dashboard in the Android application to view all the useful information on the tablet without any hassle.
To the teams that are designing their car:
- If using a LIDAR for obstacle avoidance make sure to test it in all lighting conditions.
- It is better to have PCB instead of soldering everything on a wire-wrapping board.
- Start with the implementation for the Geo module early.
Project Source Code
The source code is available at the GitLab link below.
https://gitlab.com/optimus_prime/optimus
References
- ESC Calibration
- ESC PWM information
- ESC XL-5 PWM information
- GEO Bearing information
- LIDAR SDK which helped us in coding for the LIDAR
- LIDAR Data Sheet
- LIDAR User Manual
Acknowledgement
We are thankful for the guidance and support of
Professor
- Preetpal Kang
ISA
- Prashant Aithal
- Saurabh Ravindra Deshmukh
- Purvil Kamdar
- Shruthi Narayan
- Parth Pachchigar
- Abhishek Singh
For 3D printing
- Our sincere thanks to Marvin Flores <marvin.flores@sjsu.edu> for printing our 3D print models.
For Sponsoring R/C car
- Professor Kaikai Liu