S16: SkyNet

From Embedded Systems Learning Academy

Grading Criteria

  • How well is Software & Hardware Design described?
  • How well can this report be used to reproduce this project?
  • Code Quality
  • Overall Report Quality:
    • Software Block Diagrams
    • Hardware Block Diagrams
    • Schematic Quality
    • Quality of technical challenges and solutions adopted.

SkyNet

Meeting notes

This section is temporary and will only exist on the wiki during development.

3/27 Meeting Notes

Tracking:
   C++ for OpenCV
       Have target stand dead center and let the PC choose what to target
           - Green for go 
           - Red for lost
   Graceful halt, error compensation in OpenCV layer

Motor System:

   API for % speed for x and y axis
   x_axis_speed(int speed_percent)
   y_axis_speed(int speed_percent)
   "Dumb motor system": should not know about error or destination, only how fast to go in a given direction (see the sketch below)

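A minimal sketch of what this "dumb" motor API could look like, reusing the x_axis_speed/y_axis_speed names from the notes above; set_pwm_duty() and the channel numbers are hypothetical placeholders for whatever PWM driver ends up being used:

    // Sketch only: the motor layer knows direction and speed, never error or destination.
    void set_pwm_duty(int channel, int duty_percent);    // hypothetical PWM driver call

    void x_axis_speed(int speed_percent)   // -100..100, sign selects direction
    {
        if (speed_percent > 100)  speed_percent = 100;   // clamp so callers cannot overdrive
        if (speed_percent < -100) speed_percent = -100;
        set_pwm_duty(0, speed_percent);                  // channel 0 assumed for the x axis
    }

    void y_axis_speed(int speed_percent)
    {
        if (speed_percent > 100)  speed_percent = 100;
        if (speed_percent < -100) speed_percent = -100;
        set_pwm_duty(1, speed_percent);                  // channel 1 assumed for the y axis
    }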
Enclosure:

   Visible framework
   3D printing
   Autodesk Fusion 360
4/3 Meeting Notes

PCB Design

   - Completed Design for PCB
   - Initial quote was $18 per board
   - Looking into the price and seeing other fab house prices
   - Aiming to get PCB in 2 weeks time

Motor controller

   - Going to use TI controller as base
   - Fallback is Servo Motors
   - Motors used in robotic arms? Slow and precise
   - Will be using EVM from TI to test out TI behavior
   - Looking into other controllers (e.g., the L6234) to get the desired result; these would be driven with SPWM (sine-wave PWM) signals (see the sketch below)
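
The rough idea behind SPWM is to drive each motor phase with a sine-weighted duty cycle; a small C++ sketch follows (the table size and 8-bit duty scaling are arbitrary assumptions, not project values):

    #include <cmath>
    #include <cstdint>

    // Lookup table of sine-weighted PWM duty cycles (0..255). Driving the three
    // BLDC phases from this table, offset by 120 degrees, rotates the field slowly.
    static const int TABLE_SIZE = 256;
    static const double PI = 3.14159265358979323846;
    uint8_t sine_table[TABLE_SIZE];

    void build_sine_table(void)
    {
        for (int i = 0; i < TABLE_SIZE; i++) {
            double angle = 2.0 * PI * i / TABLE_SIZE;
            sine_table[i] = (uint8_t)(127.5 * (1.0 + std::sin(angle)));
        }
    }

    // Phases are one third of the table (120 degrees) apart; advancing 'pos'
    // slowly steps the rotor.
    int phase_a(int pos) { return sine_table[pos % TABLE_SIZE]; }
    int phase_b(int pos) { return sine_table[(pos + TABLE_SIZE / 3) % TABLE_SIZE]; }
    int phase_c(int pos) { return sine_table[(pos + 2 * TABLE_SIZE / 3) % TABLE_SIZE]; }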

CAD Frame

   - Re-adjust Raspberry Pi Cubby on Horizontal Frame
   - Add more support to Horizontal Frame arms
   - Aiming to get a test print by next meeting to test durability

OpenCV

   - Trained model for people detection already exists
       - How to differentiate a person? The premade function gives back a list of everyone detected
       - We CAN detect people; we need to find out how to narrow down the targets (see the sketch after this list)
           - Use rectangles to get coordinates combined with HSV
           - Combine coordinates, analyze with HSV
   Haar face tracking
       - possibly can single out a target
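
A minimal OpenCV C++ sketch of the idea above: detect people with the library's pre-trained HOG people detector, then use the returned rectangles together with an HSV mask to narrow the list down. The camera index, HSV range, and match threshold are placeholder assumptions.

    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture cap(0);                       // camera index is an assumption
        cv::HOGDescriptor hog;
        hog.setSVMDetector(cv::HOGDescriptor::getDefaultPeopleDetector());

        cv::Mat frame, hsv, mask;
        while (cap.read(frame)) {
            // The pre-trained detector returns one rectangle per person found.
            std::vector<cv::Rect> people;
            hog.detectMultiScale(frame, people);

            // Narrow the list down by checking the color content inside each rectangle.
            cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
            for (const cv::Rect& r : people) {
                cv::inRange(hsv(r), cv::Scalar(35, 50, 50), cv::Scalar(85, 255, 255), mask); // green-ish HSV range (assumed)
                double matchRatio = cv::countNonZero(mask) / (double)r.area();
                if (matchRatio > 0.2)                  // threshold is an assumption
                    cv::rectangle(frame, r, cv::Scalar(0, 255, 0), 2);
            }
            cv::imshow("people", frame);
            if (cv::waitKey(1) == 27) break;           // ESC to quit
        }
        return 0;
    }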
4/10 Meeting Notes

Hardware Side:

   Project Frame
       - Successfully printed out Horizontal frame, approximately 9 hours to complete
           - Used a converter to get an STL file from the 3D model in Fusion 360
           - Used a slicer tool to create a sliced version of the model to use in the 3D printer 
       - Successfully drove a motor slowly
           - Used method detailed in this article http://www.berryjam.eu/2015/04/driving-bldc-gimbals-at-super-slow-speeds-with-arduino/
   Raspberry Pi PWM
       - Found two libraries to implement PWM control (see the PWM example after this list)
   Circuit diagram
       - Started on full circuit diagram
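
One candidate Raspberry Pi PWM library is pigpio; a hedged sketch of starting hardware PWM with it is shown below (the GPIO pin, frequency, and duty cycle are assumptions, and the final design may use a different library):

    #include <pigpio.h>   // assumes the pigpio library is installed on the Pi

    int main()
    {
        if (gpioInitialise() < 0)
            return 1;                         // library failed to start

        // GPIO18 is one of the Pi's hardware-PWM-capable pins (pin choice assumed).
        // 25 kHz carrier, 50% duty (pigpio expresses duty as 0..1,000,000).
        gpioHardwarePWM(18, 25000, 500000);

        gpioDelay(5000000);                   // run for 5 seconds
        gpioHardwarePWM(18, 0, 0);            // stop the PWM output
        gpioTerminate();
        return 0;
    }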

Tracking:

   New ways of tracking
       - Used frame difference tracking
       - Pattern tracking
   Template Matching and frame difference
       - Look for frame differences, then choose a target from the objects that changed between frames
       - Use template matching to then track the object that was chosen (see the sketch after this list)
   Raspberry Pi Camera implementation
       - Developed on laptop first and then moved to Raspberry Pi
       - Heating issues on the Raspberry Pi
       - Very poor capture performance at 20-30 frames per second
       - Will get metrics of performance in next run
       - Reduce the 
   Template Matching
       - It can dictate anything as an object
           - car, hand, etc.
       - We have to select it to target
       - Fast moving objects were still trackable
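
A rough OpenCV C++ sketch of the frame-difference-then-template-match flow (the threshold, the way the first moving contour is promoted to the template, and the camera index are simplifying assumptions):

    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture cap(0);
        cv::Mat prev, curr, gray, diff, thresh, templ;

        cap.read(prev);
        cv::cvtColor(prev, prev, cv::COLOR_BGR2GRAY);

        while (cap.read(curr)) {
            cv::cvtColor(curr, gray, cv::COLOR_BGR2GRAY);

            // 1) Frame difference: anything that moved shows up in 'diff'.
            cv::absdiff(gray, prev, diff);
            cv::threshold(diff, thresh, 25, 255, cv::THRESH_BINARY);   // threshold assumed

            // 2) Pick a moving region and save it as the template (selection simplified).
            std::vector<std::vector<cv::Point>> contours;
            cv::findContours(thresh, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
            if (templ.empty() && !contours.empty())
                templ = gray(cv::boundingRect(contours[0])).clone();

            // 3) Template matching: track the chosen object in later frames.
            if (!templ.empty()) {
                cv::Mat result;
                cv::matchTemplate(gray, templ, result, cv::TM_CCOEFF_NORMED);
                cv::Point maxLoc;
                cv::minMaxLoc(result, nullptr, nullptr, nullptr, &maxLoc);
                cv::rectangle(curr, cv::Rect(maxLoc, templ.size()), cv::Scalar(0, 255, 0), 2);
            }

            cv::imshow("tracking", curr);
            if (cv::waitKey(1) == 27) break;
            gray.copyTo(prev);
        }
        return 0;
    }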
4/17 Meeting Notes

Critical Meeting

Hardware Side:

   Stepper motor control
       - Accurate because of the way the motor is built
       - Motors can be bought at various steps per rotation
       - Microstepping can increase the number of steps per rotation (see the step-count example after this list)
   Why switch from brushless motors?
       - Brushless motor control is too inaccurate for camera control
       - Voltage/current draw too large while using brushless
   Power circuit
       - Circuit to supply the power to the system
       - The motors themselves have a max draw of about 1 amp
       - The Raspberry Pi needs a minimum of 5V and at most 1A
       - During the development phase, use a 12V 3A wall adapter
       - If time permits, run the system on battery power
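
To illustrate the step-count arithmetic: the 0.9 deg/step motor in the parts list gives 400 full steps per revolution, and microstepping multiplies that (for example, 1/16 microstepping gives 6400 steps per revolution). A tiny sketch, where the 1/16 setting is only an illustrative example:

    // Convert a desired pan angle into motor steps.
    // 0.9 deg/step matches the NEMA 14 motor in the parts list; the 1/16
    // microstep setting below is an assumption, not a project decision.
    const double FULL_STEP_DEG = 0.9;
    const int    MICROSTEPS    = 16;

    int steps_for_angle(double degrees)
    {
        return (int)(degrees / (FULL_STEP_DEG / MICROSTEPS) + 0.5);   // round to nearest step
    }
    // Example: steps_for_angle(45.0) == 800 steps for a 45-degree pan.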

Tracking:

   Template Matching inadequate 
       - Takes too much processing power
   Machine Learning Training
       - Offline training
       - Online training (Real time training)
   Fallback HSV tracking
       - Will develop until it can be feasibly used
   Min spec 
       - Tracking a student walking across the classroom

Abstract

SkyNet is a tracking tripod mount that will follow a given target using computer vision technologies. The system utilizes two brushless motors that are controlled by inputs given from a Raspberry Pi 3. The Raspberry Pi 3 utilizes the OpenCV open source library to calculate the deviation of a tracked object from the center of its view. It will then control the motors to correct the camera position such that the target always stays in the center of the video. The mount will be able to hold any standard 5-inch phone (a universal mount is the goal) for video recording.

Objectives & Introduction

Show a list of your objectives. This section includes the high level details of your project. You can write about the various sensors or peripherals you used to get your project completed.

Team Members & Responsibilities

  • Steven Hwu
    • OpenCV
  • Jason Tran
    • OpenCV
  • Andrew Herbst
    • Brushless Motor system
  • Vince Ly
    • Brushless Motor system

Schedule

Show a simple table or figures that show your schedule as planned before you started working on the project. Then in another table column, write down the actual schedule so that readers can see the planned vs. actual goals. The point of the schedule is for readers to assess how to pace themselves if they are doing a similar project.

Week 1 (Due 3/29) - Completed
   Tasks:
       - Create parts list and place order (Motors, Cameras, etc.)
       - Compile OpenCV C++ code and run examples on Raspberry Pi 3
   Notes:
       - Ordered parts on 3/27
       - OpenCV library is building on both development PCs (Steven/Jason)
       - TODO: Run OpenCV on Raspberry Pi 3

Week 2 (Due 4/5) - Completed
   Tasks:
       - Create motorized unit
       - Create the CAD model for 3D printing
       - Create the breakout board for the motor controller
       - Be able to track an object in frame (highlight object)
   Notes:
       - Successfully tracked an object in HSV color space
       - Successfully tracked a human along with various other objects; human tracking alone is not enough to single out one person, so a combination with HSV tracking is being tried
       - Initial CAD model created with Autodesk Fusion 360
       - Created PCB for the TI chip, looking for fab houses

Week 3 (Due 4/12) - Completed
   Tasks:
       - Test various motors for behavior/control
       - Extrapolate movement of object
   Notes:
       - Looking into Stepper vs Servo vs Brushless for optimal control/smoothness
       - Looking into face tracking as an option
       - Researching how to narrow down human tracking to 1 specific person

Week 4 (Due 4/19)
   Tasks:
       - Sync up on how to command motors (scaling, etc.)
       - Create API interface to control motors
       - Create communication tasks to control motors
   Notes:
       - Moving forward with Stepper motor implementation
       - Using 12V wall adapter as power source; if time permits, add a battery system
       - Aim to finish the frame by this weekend: horizontal part prints on Saturday and vertical part prints on Sunday
       - Developing HSV as fallback tracking
       - Moving forward with Machine learning implementation

Week 5 (Due 4/26)
   Tasks:
       - Integration of control system and motor unit

Week 6 (Due 5/3)
   Tasks:
       - Control calibration
       - Use case test

Week 7 (Due 5/10)
   Tasks:
       - Finish Report/Slide deck(?)

Parts List & Cost

ECU:

   Raspberry Pi 3 Model B

Tracking Camera:

   Raspberry Pi Camera Board Module ~$20
       - http://www.amazon.com/Raspberry-5MP-Camera-Board-Module/dp/B00E1GGE40

Stepper Motors:

   NEMA 14 Stepper Motor ~$24
       - http://www.amazon.com/0-9deg-Stepper-Bipolar-36x19-5mm-4-wires/dp/B00W91K3T6/ref=pd_sim_60_2?ie=UTF8&dpID=41CZcFZwJJL&dpSrc=sims&preST=_AC_UL160_SR160%2C160_&refRID=0ZZRP157RFT10QDHHKN4

Controllers:

   DRV8711 Stepper Motor Controller IC
       - http://www.ti.com/lit/ds/symlink/drv8711.pdf

Phone mount:

   Vince's inexpensive phone mount ~Free

Whole Enclosure:

   3D printed

Design & Implementation

The design section can go over your hardware and software design. Organize this section using sub-sections that go over your design and implementation.

Hardware Design

A custom frame was created to hold the motors and cameras in place. The initial design of the system was to use brushless motors to control the motion of the camera as an object was being tracked. This proved to be difficult because brushless motors have a very high KV (RPM constant). This value is RPM per volt, essentially a ratio to convert voltage to RPM. The motor initially being used was rated at 70 KV.
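At 12 V, for example, a 70 KV motor spins at roughly 70 × 12 ≈ 840 RPM with no load, far faster than the slow, smooth panning a camera mount requires.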

Notes from 4/4

Other motors being considered for this project include servos and stepper motors. Servo motors have the problem of being too jittery/abrupt, which is not ideal for recording video; this would defeat the purpose of the camera system. Stepper motors are also not ideal because they are very heavy and require a large frame for the system to be structurally sound. Experiments will continue to verify which motor is ideal for the task.

Hardware Interface

In this section, you can describe how your hardware communicates, such as which BUSes used. You can discuss your driver implementation here, such that the Software Design section is isolated to talk about high level workings rather than inner working of your project.

Software Design

Show your software design. For example, if you are designing an MP3 Player, show the tasks that you are using, and what they are doing at a high level. Do not show the details of the code. For example, do not show exact code, but you may show pseudocode and fragments of code. Keep in mind that you are showing DESIGN of your software, not the inner workings of it.

Tracking using Haar Cascade Object Detection

The top-level software using Haar Cascade object detection consists of 3 major steps (a code sketch follows the list):

  1. Initialization
    • Create a classifier object, then load a classifier file
      • A classifier is an XML file that describes a particular object, for example, the full body of a person.
    • Initialize the video source
      • The video source is an object that represents the video capture device, for example, a USB camera. Real-world visual data as a matrix can be obtained from this object.
  2. Capture an instance from the video source (also known as a frame), then perform object detection on it
    • Object detection is performed by searching the given frame for objects that match the previously loaded classifier file.
    • After analysis, a list of found objects is returned; each found object is represented as a Rectangle.
  3. Determine if the tripod’s camera needs to be rotated based on the target’s location in the frame.
    • Using the first detected object in the objects found list as the target, determine whether the tripod should pan horizontally or vertically.
    • Currently, the tripod has the capability to pan in the following directions:
      • Left
      • Right
      • Up
      • Down
    • If a panning action is required, activate the appropriate stepper motors.
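
A minimal sketch of these three steps in OpenCV C++ (the classifier file name, camera index, and pixel dead-band are assumptions rather than the project's actual values):

    #include <opencv2/opencv.hpp>

    int main()
    {
        // 1) Initialization: load a classifier and open the video source.
        cv::CascadeClassifier classifier;
        classifier.load("haarcascade_fullbody.xml");   // classifier file name assumed
        cv::VideoCapture camera(0);                    // camera index assumed

        cv::Mat frame, gray;
        while (camera.read(frame)) {
            // 2) Capture a frame and detect objects matching the classifier.
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            std::vector<cv::Rect> objects;
            classifier.detectMultiScale(gray, objects);
            if (objects.empty()) continue;

            // 3) Use the first detection as the target and decide how to pan.
            cv::Rect target = objects[0];
            cv::Point targetCenter(target.x + target.width / 2,
                                   target.y + target.height / 2);
            cv::Point frameCenter(frame.cols / 2, frame.rows / 2);
            const int deadband = 40;                   // pixels of tolerance, assumed

            if (targetCenter.x < frameCenter.x - deadband)      { /* pan left  */ }
            else if (targetCenter.x > frameCenter.x + deadband) { /* pan right */ }
            if (targetCenter.y < frameCenter.y - deadband)      { /* pan up    */ }
            else if (targetCenter.y > frameCenter.y + deadband) { /* pan down  */ }
        }
        return 0;
    }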

Implementation

This section includes implementation, but again, not the details, just the high level. For example, you can list the steps it takes to communicate over a sensor, or the steps needed to write a page of memory onto SPI Flash. You can include sub-sections for each of your component implementation.

Testing & Technical Challenges

Describe the challenges of your project. What advice would you give yourself or someone else if your project could be started from scratch again? Make a smooth transition to the testing section and describe what it took to test your project.

Include sub-sections that list out a problem and solution, such as:

My Issue #1

Discuss the issue and resolution.

Conclusion

Conclude your project here. You can recap your testing and problems. You should address the "so what" part here to indicate what you ultimately learnt from this project. How has this project increased your knowledge?

Project Video

Upload a video of your project and post the link here.

Project Source Code

References

Acknowledgement

Any acknowledgement that you may wish to provide can be included here.

References Used

List any references used in the project.

Appendix

You can list the references you used.