SkyNet
Abstract
SkyNet is a tracking camera mount that follows a given target using computer vision. The system uses two stepper motors driven by a Raspberry Pi 3, which runs the open-source OpenCV library to calculate how far the tracked object has deviated from the center of the camera's view. The system then commands the motors to correct the camera position so that the target stays centered in the video. The completed mount holds any standard 5-inch smartphone or small point-and-shoot camera for video recording, and it allows limited pitch rotation and unlimited yaw rotation. The tracking solution uses OpenCV's Haar cascade object detector; it was adequate for following a slowly moving target but had trouble when the background was a similar color to the tracked person.
Objectives & Introduction
The objective of this project was to create a tracking camera mount that keeps a target in the center of the camera's view at all times. To accomplish this, a custom frame was built to house a stepper motor system, and an OpenCV-based tracking solution was developed to drive it. The frame was modeled in Autodesk Fusion 360 and fabricated on a 3D printer. The motor system consists of two stepper motors, one for yaw and one for pitch. The OpenCV tracking software runs on a Raspberry Pi 3 and uses the Raspberry Pi camera as its video input.
Team Members & Responsibilities
The project had four team members, split into two teams of two, one for each half of the project. One team focused on the OpenCV-based tracking solution; the other focused on the physical system that the tracking software controls. The breakdown of the teams is shown below.
OpenCV tracking solution
- Steven Hwu
- Jason Tran
Motor System and frame
- Andrew Herbst
- Vince Ly
Motor System
The objective of the motor system was to allow both yaw and pitch rotation. The system was designed to be placed on a table or a tripod at chest level. The frame that holds the system was designed to use as little material as possible while allowing unobstructed rotation about either axis. To accomplish this, one stepper motor was mounted horizontally and one vertically. Stepper motors are well suited to a system like this because of their simple control scheme and their accuracy: they rotate in discrete steps and can be microstepped to reduce jitter in the motion. A Raspberry Pi 3 was chosen as the processing platform; it runs Linux and has just enough performance to run the OpenCV tracking solution. A power circuit was built to supply both the motors and the Raspberry Pi. Ideally the system would be battery powered, but due to lack of time a wall power adapter supplied the system, which restricted rotation on the horizontal axis.
The minimum requirements for the motor system are summarized below.
Specifications
- Allow for yaw and pitch rotation
- Small footprint of the entire system frame
- A compute platform capable of running OpenCV tracking code
- Power circuit to power two stepper motors/compute platform
Tracking System
The tracking system runs on the Raspberry Pi 3 and uses the OpenCV library with the Raspberry Pi camera as its video source. Its minimum requirements were to detect the target in each captured frame, determine the target's offset from the center of the frame, and issue pan/tilt commands to the motor system to keep the target centered.
Schedule
Week# | Due Date | Task | Completed | Notes
---|---|---|---|---
1 | 3/29 | Create parts list and place order (motors, cameras, etc.); compile OpenCV C++ code and run examples on Raspberry Pi 3 | Completed | Ordered parts on 3/27; OpenCV library builds on both development PCs (Steven/Jason); TODO: run OpenCV on Raspberry Pi 3
2 | 4/5 | Create motorized unit; create the CAD model for 3D printing; create the breakout board for the motor controller; track an object in frame (highlight object) | Completed | Successfully tracked an object in HSV color space; successfully tracked a human among various other objects; human tracking not yet able to isolate one person, trying a combination with HSV tracking; initial CAD model created with Autodesk Fusion 360; created PCB for the TI chip, looking for fab houses
3 | 4/12 | Test various motors for behavior/control; extrapolate movement of object | Completed | Comparing stepper vs. servo vs. brushless for optimal control/smoothness; looking into face tracking as an option; researching how to narrow human tracking down to one specific person
4 | 4/19 | Sync up on how to command motors (scaling, etc.); create API interface to control motors; create communication tasks to control motors | | Moving forward with the stepper motor implementation; using a 12V wall adapter as the power source, with a battery system if time permits; aiming to finish the frame by the weekend (horizontal part prints Saturday, vertical part prints Sunday); developing HSV as fallback tracking; moving forward with the machine-learning implementation
5 | 4/26 | Integration of control system and motor unit | Completed | Pushed integration back to 5/3; tested stepper motor drivers
6 | 5/3 | Create motor system circuit design/wiring layout | Completed | Pushed integration to 5/10
7 | 5/10 | Finish motor system wiring; integration between motor system and tracking system | Completed | Pushed optimization back to 5/17
8 | 5/17 | Motor system optimization | Completed | Finished motor system optimization; tracking optimization moved to 5/22
9 | 5/22 | Tracking optimization | Completed | Backgrounds of a similar color to the person cause tracking issues; turning on the smartphone light increases contrast between the user and the background; motor system runs extremely hot, possibly due to current draw
Parts List & Cost
ECU:
Raspberry Pi 3 Model B ~$40 - https://www.raspberrypi.org/products/raspberry-pi-3-model-b/
Tracking Camera:
Raspberry Pi Camera Board Module ~$20 - http://www.amazon.com/Raspberry-5MP-Camera-Board-Module/dp/B00E1GGE40
Stepper Motors:
Nema 14 Stepper Motor (Pitch Motor) ~$24 - http://www.amazon.com/0-9deg-Stepper-Bipolar-36x19-5mm-4-wires/dp/B00W91K3T6/ref=pd_sim_60_2?ie=UTF8&dpID=41CZcFZwJJL&dpSrc=sims&preST=_AC_UL160_SR160%2C160_&refRID=0ZZRP157RFT10QDHHKN4
Nema 17 Stepper Motor (Yaw Motor) ~$13 - http://www.amazon.com/Stepping-Motor-26Ncm-36-8oz-Printer/dp/B00PNEQ9T4/ref=sr_1_4?s=industrial&ie=UTF8&qid=1463981462&sr=1-4&keywords=stepper+motor+nema+17
Controllers:
x2 DRV8825 Stepper Motor Driver Carrier, High Current ~$10 each - https://www.pololu.com/product/2133
Phone Mount:
x1 Tripod phone mount ~$5
Enclosure:
3D printed custom frame
Total: ~$123
Design & Implementation
Hardware Design
A custom frame was created to hold the motors and camera in place. The initial design used brushless motors to control the camera's motion while tracking. This proved difficult because brushless motors typically have a high KV rating; KV is the motor's speed constant in RPM per volt, essentially the ratio that converts applied voltage into rotational speed. The motor initially used was rated at 70 KV.
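As a rough illustration of the problem (assuming the 12 V supply used elsewhere in this design), even a comparatively low 70 KV motor has a no-load speed of roughly 70 × 12 ≈ 840 RPM, which makes slow, fine camera pointing hard to achieve by direct drive without gearing or closed-loop control.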
Notes from 4/4
Other motor types were considered for this project, including servos and stepper motors. Servo motors tend to move in jittery, abrupt steps, which is not ideal for recording video and would defeat the purpose of the camera system. Stepper motors are also not ideal because they are heavy and require a larger frame for the system to be structurally sound. Experiments continued to determine which motor type was best suited to the task.
Power Unit:
A +12V supply was needed to power both motor drivers. A +12V wall adapter was used, along with a pair of decoupling capacitors, to supply power to the DRV8825 ICs. In addition, a +5V supply was required for the Raspberry Pi 3 board, which controls the motor drivers over GPIO. To step the 12 V from the wall adapter down to 5 V, a switching regulator was chosen because it could handle the heat and power levels required while running the motors.
Motor Driver:
The TI DRV8825 IC can drive a bipolar stepper motor, so two of these chips were used, one for the pitch motor and one for the yaw motor. Each driver was powered from the 12V supply, as shown in the figure. This driver was chosen primarily for its simple STEP/DIR interface: the driver receives a direction signal along with a PWM (step) signal, and the frequency of that signal dictates how fast the stepper motor turns in the direction specified by the DIR pin. The driver also supports microstepping, so the steps per rotation of each motor could be tuned for the smoothest possible motion. The breakout board carrying the motor driver made it easy to connect to the Raspberry Pi and the PWM driver.
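As a rough sanity check on the STEP/DIR interface (the exact step angle and microstep setting are not stated in this section, so assume a 200-step-per-revolution motor at 1/4 microstepping): the rotation speed is RPM = 60 × f_STEP / (200 × 4), so a 1000 Hz STEP signal turns the motor at about 75 RPM, and halving the frequency halves the speed. The 0.9° Nema 14 pitch motor in the parts list (400 full steps per revolution) would turn at half that rate for the same step frequency.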
PWM Driver:
Because the Raspberry Pi 3 has only one hardware PWM output pin, an external PWM driver was used to generate the PWM signals that control the two stepper motors. The PCA9685 16-channel PWM controller was chosen as the interface between the Raspberry Pi and the two motor drivers; it communicates with the Raspberry Pi over the I2C bus using just two wires. With this device, the PWM frequency can be adjusted between 24 Hz and 1526 Hz, an acceptable range for this application. The frequency is set by writing the frequency prescaler register according to the equation in figure 4.
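Figure 4 is not reproduced here, but the PCA9685 datasheet gives the relation as prescale = round(25 MHz / (4096 × f_PWM)) − 1, where 25 MHz is the chip's internal oscillator. For example, a requested 1000 Hz output gives a prescale value of round(25,000,000 / 4,096,000) − 1 = 5.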
Hardware Interface
Motor System Class:
To communicate from the higher-level OpenCV motion-tracking module down to the motors, a Motor System class was developed as the intermediate layer. It provides the functions that the motion-tracking module calls to control the individual motors' movement. Its main purpose is to pass the speed and direction commanded by the OpenCV module to the individual motors and to determine the motor step size that produces the most refined movement. A summary of the basic functions the Motor System provides is below, followed by a short usage sketch.
// Motor System Class
void power_off(void);
void power_on(void);

/**
 * Sets the speed for both the yaw motor and pitch motor
 *
 * @param x_speed speed of horizontal motor
 * @param y_speed speed of vertical motor
 */
void set_x_y_speed(float x_speed, float y_speed);

/**
 * Checks if either motor is faulted
 *
 * @return true if either motor is faulted
 */
bool is_faulted();

/*
 * Sets motor step size for maximum smoothness given minimum rotation time
 */
void set_smoothness();
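A brief usage sketch of how the tracking module might drive these functions is shown below. The class name Motor_system, the gain constants, and the surrounding function are assumptions for illustration; the report does not show the actual call site.

// Illustrative only: convert the target's pixel offset from the frame
// center into signed speeds and hand them to the Motor System.
void command_motors(Motor_system &motor_system,
                    int target_x, int target_y,
                    int frame_center_x, int frame_center_y)
{
    const float kGainX = 1.0f, kGainY = 1.0f;   // assumed tuning constants
    if (motor_system.is_faulted())
        return;                                 // do not drive faulted motors
    motor_system.set_x_y_speed(kGainX * (target_x - frame_center_x),
                               kGainY * (target_y - frame_center_y));
}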
Through its constructor, the Motor System is also responsible for initializing the pitch and yaw motors by assigning each GPIO pin to its corresponding function in the driver class.
// Motor Initialization:
pitch_motor = new Motor_driver(PITCH_PHYSICAL_STEPS, PWM_0, PIN_2, PIN_3, PIN_21,
                               PIN_22, PIN_23, PIN_24, PIN_0);
yaw_motor   = new Motor_driver(YAW_PHYSICAL_STEPS, PWM_1, PIN_4, PIN_5, PIN_6,
                               PIN_25, PIN_27, PIN_28, PIN_1);
Motor Driver Class:
The Motor Driver class has two instances, one for each motor, and receives information such as speed and direction from the Motor System functions. Below is a list of its important functions. The most vital is rotate(), which takes the speed information passed down from the Motor System, sets the corresponding DIR pin, and sets the frequency of the PWM signal sent to the motor to control its speed. A hedged sketch of rotate() follows the listing.
// Motor Driver Class
/**
 * Sets the rotation of the motor; can be
 * NEGATIVE, POSITIVE, or HOLD
 *
 * @param motion specifies the type of motion
 * @param speed  sets the speed of the motor, 0 <= speed <= 1000
 */
void rotate(motion_e motion, int speed);

/**
 * Sets the step size of the motor
 *
 * @param step_size step size for motor
 */
void set_step_size(step_size_e step_size);

/**
 * Sets the direction of the motor, positive or negative
 *
 * @param motion direction to move
 */
void set_dir(motion_e motion);

/**
 * Holds the motor's current position; used when speed == 0
 */
void hold_position(void);
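The sketch below shows what rotate() might do internally, based on the description above. The member names, the linear speed-to-frequency scaling, and the PWM-driver helper are assumptions; the project's actual implementation is not reproduced in this report.

// Sketch only -- not the project's actual source.
void Motor_driver::rotate(motion_e motion, int speed)
{
    if (speed == 0 || motion == HOLD) {
        hold_position();                  // keep the coils energized at the current step
        return;
    }
    set_dir(motion);                      // drive the DIR pin for POSITIVE/NEGATIVE motion
    // Map the 0..1000 speed command onto a STEP (PWM) frequency within the
    // PCA9685's 24-1526 Hz range; the linear scaling here is assumed.
    float step_hz = 24.0f + (1526.0f - 24.0f) * (speed / 1000.0f);
    pwm_driver->set_frequency(pwm_channel, step_hz);   // assumed PWM-driver helper
}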
When the motors are initialized, the constructor carries out the following tasks (a brief sketch follows the list):
1. Set the steps per rotation
2. Set the current motion to HOLD
3. Set the step size to 1
4. Set the current speed to zero
5. Initialize the control pins of the motor driver to the corresponding GPIO pins of the Raspberry Pi
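A minimal sketch of such a constructor is shown below, assuming member and helper names that mirror the functions listed above; the project's actual code is not reproduced here.

// Sketch only: illustrates the five initialization steps listed above.
Motor_driver::Motor_driver(int physical_steps, int pwm_channel,
                           int dir_pin /* remaining GPIO pin arguments elided */)
{
    steps_per_rotation = physical_steps;  // 1. steps per rotation from the caller
    current_motion     = HOLD;            // 2. start in HOLD so the mount does not move
    current_step_size  = FULL_STEP;       // 3. step size of 1 (full step)
    current_speed      = 0;               // 4. speed of zero until commanded
    // 5. Map the DRV8825 control pins (STEP/DIR/MODE/ENABLE/FAULT) to the
    //    Raspberry Pi GPIO numbers passed in by the Motor System.
    init_control_pins(dir_pin /* , ... */);   // assumed helper
}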
PWM Driver:
The Raspberry Pi communicates with the PCA9685 PWM driver over the I2C bus through two wires, SDA and SCL. It configures the output PWM frequency and duty cycle sent to the motor drivers by writing to the PCA9685's control registers. For example, the sequence to set an output PWM frequency is as follows (a brief code sketch follows the list):
1. Check that the requested frequency is within the supported range (24-1526 Hz) and clamp the value if necessary
2. Calculate the prescaler from the requested frequency using the formula from figure 4
3. Set the sleep mode bit by writing 0x10 to the Mode 1 register at address 0x00
4. Write the calculated prescaler value to the prescaler register at address 0xFE
5. Clear the sleep bit by writing 0x00 to the Mode 1 register at address 0x00
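A hedged sketch of that sequence is shown below. The i2c_write_register() helper is an assumption standing in for the project's I2C wrapper, which is not shown in this report; the 25 MHz oscillator constant and register addresses come from the PCA9685 datasheet.

// Sketch only -- follows the five steps above; not the project's actual code.
#include <cstdint>
#include <cmath>

void i2c_write_register(uint8_t reg, uint8_t value);   // assumed low-level I2C helper

static const float   PCA9685_OSC_HZ = 25000000.0f;     // internal oscillator (datasheet)
static const uint8_t REG_MODE1      = 0x00;
static const uint8_t REG_PRESCALE   = 0xFE;

void set_pwm_frequency(float freq_hz)
{
    // 1. Clamp the requested frequency to the supported 24-1526 Hz range.
    if (freq_hz < 24.0f)   freq_hz = 24.0f;
    if (freq_hz > 1526.0f) freq_hz = 1526.0f;

    // 2. Prescale from the datasheet formula: round(25 MHz / (4096 * f)) - 1.
    uint8_t prescale = (uint8_t)(std::lround(PCA9685_OSC_HZ / (4096.0f * freq_hz)) - 1);

    // 3-5. The prescaler may only be written while the chip is in sleep mode.
    i2c_write_register(REG_MODE1, 0x10);                // set the SLEEP bit
    i2c_write_register(REG_PRESCALE, prescale);         // write the prescale value
    i2c_write_register(REG_MODE1, 0x00);                // clear SLEEP, resume PWM output
}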
Software Design
Implementation
Tracking Application using Haar Cascade Object Detection
The top-level software using Haar cascade object detection consists of three major steps (a minimal code sketch follows the list):
- Initialization
  - Create a classifier object, then load a classifier file
    - A classifier is an XML file that describes a particular object, for example, the full body of a person.
  - Initialize the video source
    - The video source is an object that represents the video capture device, for example, a USB camera. Real-world visual data can be obtained from this object as a matrix.
- Capture an instance from the video source (also known as a frame), then perform object detection on it
  - Object detection is performed by searching the given frame for objects that match the previously loaded classifier file.
  - After analysis, a list of found objects is returned; found objects are represented as Rectangle objects.
- Determine whether the tripod's camera needs to be rotated based on the target's location in the frame
  - Using the first detected object in the found-objects list as the target, determine whether the tripod should pan or tilt.
  - Currently, the tripod can pan/tilt in the following directions: left, right, up, down.
  - If a panning/tilting action is required, activate the appropriate stepper motor (x-axis or y-axis).
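A minimal sketch of this flow using the standard OpenCV C++ API is shown below. The classifier path, the dead-band value, and the pan/tilt stub functions are placeholders for illustration only; the project's actual source is not reproduced in this report.

// Sketch only -- illustrates the three steps above, not the project's code.
#include <opencv2/opencv.hpp>
#include <vector>

// Placeholders for the motor interface described in the Hardware Interface section.
void pan_left()   { /* command yaw motor in the negative direction */ }
void pan_right()  { /* command yaw motor in the positive direction */ }
void tilt_up()    { /* command pitch motor in the positive direction */ }
void tilt_down()  { /* command pitch motor in the negative direction */ }

int main()
{
    // 1. Initialization: load a classifier and open the video source.
    cv::CascadeClassifier classifier;
    if (!classifier.load("haarcascade_fullbody.xml"))    // path is illustrative
        return -1;
    cv::VideoCapture camera(0);                          // Raspberry Pi camera via V4L2
    if (!camera.isOpened())
        return -1;

    cv::Mat frame, gray;
    while (camera.read(frame)) {
        // 2. Capture a frame and detect objects matching the classifier.
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> objects;
        classifier.detectMultiScale(gray, objects);
        if (objects.empty())
            continue;

        // 3. Use the first detection as the target and compare it to the frame center.
        const cv::Rect &target = objects[0];
        int dx = (target.x + target.width / 2)  - frame.cols / 2;
        int dy = (target.y + target.height / 2) - frame.rows / 2;

        const int dead_band = 20;                        // pixels, illustrative
        if (dx >  dead_band) pan_right();
        if (dx < -dead_band) pan_left();
        if (dy >  dead_band) tilt_down();
        if (dy < -dead_band) tilt_up();
    }
    return 0;
}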
Testing & Technical Challenges
Conclusion
Project Video
Project Source Code
References
Acknowledgement
References Used
Appendix