S15: Connect Four - Robotic Player
Design & Implementation
System Diagram
The following graphic provides a high-level representation of the system's various software/hardware/human interfaces:
Hardware Design
The hardware centers on the SJSU One board, which drives a stepper and a servo to place the robot's chips, reads the game board through a Pixy camera over SPI, and talks to the Connect 4 AI on a laptop over Bluetooth.
Hardware Interface
The hardware communicates over the physical-layer interfaces listed below. Driver-level details are covered here so that the Software Design section can focus on high-level behavior rather than the inner workings of each peripheral.
- Physical Layer Interfaces
  - SPI - Pixy camera (a read sketch follows the list)
  - Bluetooth - laptop communication
  - PWM (stepper) - 1-wire PWM with 2 control signals
  - PWM (servo) - 2-wire PWM
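As a hedged illustration of the SPI link, the sketch below shows roughly how one Pixy block can be read. The function xExchangeByte() is a hypothetical stand-in for the board's SSP driver call, while the 0xaa55 sync word and the checksum/signature/x/y/width/height block layout follow the published Pixy serial protocol.
<syntaxhighlight lang="cpp">
#include <cstdint>

// Hypothetical stand-in for the board's SSP/SPI driver call; the Pixy
// clocks data out MSB-first while we clock dummy bytes in.
extern uint8_t xExchangeByte(uint8_t xByte);

const uint16_t PIXY_SYNC_WORD = 0xaa55; // marks the start of a block

// One detected blob, per the Pixy serial protocol.
struct Block_t
{
    uint16_t usSignature, usX, usY, usWidth, usHeight;
};

// Read one 16-bit word from the Pixy (MSB first).
uint16_t usReadWord()
{
    uint16_t usMsb = xExchangeByte(0x00);
    uint16_t usLsb = xExchangeByte(0x00);
    return (uint16_t)((usMsb << 8) | usLsb);
}

// Scan for the sync word, then unpack one block and verify its checksum.
bool xReadBlock(Block_t& xBlock)
{
    while (usReadWord() != PIXY_SYNC_WORD)
    {
        ; // not yet aligned to a block boundary
    }
    uint16_t usChecksum = usReadWord();
    xBlock.usSignature  = usReadWord();
    xBlock.usX          = usReadWord();
    xBlock.usY          = usReadWord();
    xBlock.usWidth      = usReadWord();
    xBlock.usHeight     = usReadWord();
    uint16_t usSum = xBlock.usSignature + xBlock.usX + xBlock.usY
                   + xBlock.usWidth + xBlock.usHeight;
    return usSum == usChecksum;
}
</syntaxhighlight>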
Software Design
The system's software interfaces consist of the following components, organized by their respective hardware hosts:
- Laptop
  - Bluetooth Socket Interface
  - Connect 4 AI
- SJSU One
  - GameTask
  - MotorTask
  - PixyTask
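As a rough sketch of how the SJSU One side might register its three tasks (the task constructor signatures and priorities here are assumptions; scheduler_add_task() and scheduler_start() come from the SJSU One's scheduler_task framework):
<syntaxhighlight lang="cpp">
#include "scheduler_task.hpp"

int main(void)
{
    // Board-side tasks from the list above; the Bluetooth Socket Interface
    // and Connect 4 AI run on the laptop as a separate process.
    scheduler_add_task(new PixyTask(PRIORITY_HIGH));
    scheduler_add_task(new GameTask(PRIORITY_MEDIUM));
    scheduler_add_task(new MotorTask(PRIORITY_MEDIUM));
    scheduler_start(); // begins the FreeRTOS scheduler; does not return
    return 0;
}
</syntaxhighlight>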
Drivers for the Pixy were designed using Ozmekian system principles (every machine characteristic/action has a corresponding human analogue). The Pixy's primary task as part of Konnector is to send the coordinates of human-inserted Connect 4 game chips to the SJSU One board. This involves recording location/coordinate information for incoming blobs, performing statistical analysis on those blobs, and notifying other system components once a human chip insertion has been detected. The Pixy's software interface therefore consists of the classes PixyBrain, PixyEyes, and PixyMouth, each encompassing the facilities that would be required of a human assuming the same role.
Following the same analogy, PixyTask can be thought of as the "consciousness" of the system, interfacing directly only with the 'Pixy' class, which defines actions for the individual components to take in response to recorded state/external input:
<syntaxhighlight lang="cpp">
#include <functional>
#include <map>

// Maps a key (e.g. a system state) to the handler function that services it.
template<typename KEY_T, typename FUN_T, typename ... ARG_T>
struct FuncMap_t
{
    std::map<KEY_T, std::function<FUN_T(ARG_T ... xArgs)>> fpMap;

    // Register (or replace) the handler for xElem.
    void vSetHandler(KEY_T xElem, std::function<FUN_T(ARG_T ... xArgs)> fnHandler)
    {
        fpMap[xElem] = fnHandler;
    }

    // Look up the handler for xElem; callers invoke the returned function.
    std::function<FUN_T(ARG_T ... xArgs)>& vResponse(KEY_T xElem)
    {
        return fpMap[xElem];
    }
};
</syntaxhighlight>
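As a hypothetical usage example (the key, handler body, and function name below are illustrative, not taken from the project):
<syntaxhighlight lang="cpp">
void vDemo()
{
    // Illustrative only: map integer event codes to void() handlers.
    FuncMap_t<int, void> xFuncMap;
    xFuncMap.vSetHandler(1, [] () { /* respond to event 1 */ });
    xFuncMap.vResponse(1)(); // look up and invoke the handler for event 1
}
</syntaxhighlight>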
A function is defined to respond to each of the following system states (GitHub):
<syntaxhighlight lang="cpp">
enum State_t
{
    // The first four values double as the board's switch bitmask.
    RESET             = 0x01, // SW(1)
    EMA_ALPHA_UP      = 0x02, // SW(2)
    EMA_ALPHA_DOWN    = 0x04, // SW(3)
    SW_4              = 0x08, // SW(4)
    // The remaining states are sequential and lie outside the bitmask range.
    CALIB             = 16,
    WAITING_FOR_HUMAN = 17,
    WAITING_FOR_BOT   = 18,
    ERROR             = 19
} eState, eLastState;
</syntaxhighlight>
The system typically alternates between the states 'WAITING_FOR_HUMAN' and 'WAITING_FOR_BOT'; the handler for 'WAITING_FOR_BOT' is defined as:
<syntaxhighlight lang="cpp">
xFuncMap->vSetHandler(WAITING_FOR_BOT, [&] ()
{
    PixyCmd_t xBotInsertCmd;
    // Non-blocking receive: remain in WAITING_FOR_BOT until a command arrives.
    if (xQueueReceive(getSharedObject(shared_PixyQueue), &xBotInsertCmd, 0))
    {
        pPixyBrain->lBotInsert(xBotInsertCmd);
        eState = WAITING_FOR_HUMAN;
    }
    else
    {
        eState = WAITING_FOR_BOT;
    }
});
</syntaxhighlight>
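The producer side of this queue is not shown on this page; a minimal sketch, assuming GameTask forwards the AI's chosen column in a PixyCmd_t (the field and method names here are assumptions):
<syntaxhighlight lang="cpp">
// Hypothetical producer: GameTask queues the bot's chosen column so that
// PixyTask can watch for the corresponding chip to land.
void GameTask::vSendBotMove(int lChosenColumn)
{
    PixyCmd_t xBotInsertCmd;
    xBotInsertCmd.lColumn = lChosenColumn; // column selected by the Connect 4 AI
    xQueueSend(getSharedObject(shared_PixyQueue), &xBotInsertCmd, portMAX_DELAY);
}
</syntaxhighlight>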
Handlers are invoked on each iteration of the task's FreeRTOS-scheduled 'run()' method, which dispatches on the current switch values:
<syntaxhighlight lang="cpp">
bool run(void *p)
{
    // Dispatch the handler registered for the current switch/state input.
    pPixy->vAction(SW.getSwitchValues());
    return true;
}
</syntaxhighlight>
PixyBrain
PixyBrain functions handle processing of incoming data from PixyEyes, converting raw samples into messages of interest for the higher-level (Python) AI code. The handler set for the 'WAITING_FOR_HUMAN' state exhibits this relationship, with the 'pPixyBrain' pointer dereferencing 'pPixyEyes' to use the blocks it has 'seen':
<syntaxhighlight lang="cpp">
xFuncMap->vSetHandler(WAITING_FOR_HUMAN, [&] ()
{
    pPixyBrain->lSampleChips(pPixyEyes.get());
    if (/* Human inserts chip */)
    {
        eState = WAITING_FOR_BOT;
    }
    else /* Insertion not detected */
    {
        eState = WAITING_FOR_HUMAN;
    }
});
</syntaxhighlight>
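The EMA_ALPHA_UP/EMA_ALPHA_DOWN switch states above tune the smoothing constant of an exponential moving average, which suggests the kind of filtering lSampleChips() applies to the sampled blob coordinates. A minimal sketch of such a filter (the struct and member names are assumptions):
<syntaxhighlight lang="cpp">
// Exponential moving average over sampled blob coordinates; a human chip
// is declared once the smoothed position settles near a known cell center.
struct Ema_t
{
    float fAlpha;   // smoothing constant, nudged at runtime by SW(2)/SW(3)
    float fAverage; // running average of the sampled coordinate

    float fUpdate(float fSample)
    {
        // Larger fAlpha weights new samples; smaller fAlpha smooths noise.
        fAverage = fAlpha * fSample + (1.0f - fAlpha) * fAverage;
        return fAverage;
    }
};
</syntaxhighlight>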
PixyEyes
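PixyEyes is the "seeing" half of the analogy: it pulls raw blocks off the SPI link described under Hardware Interface and exposes them for PixyBrain to sample. A sketch of the interface this implies, reusing the Block_t layout and xReadBlock() helper from the earlier SPI sketch (the member name and signature are assumptions; the full implementation is on GitHub):
<syntaxhighlight lang="cpp">
#include <vector>

// Sketch of PixyEyes' role: drain up to usMax blocks from the current
// Pixy frame into xBlocks and report how many were seen.
class PixyEyes
{
  public:
    int lSeeBlocks(std::vector<Block_t>& xBlocks, size_t usMax = 16)
    {
        xBlocks.clear();
        Block_t xBlock;
        while (xBlocks.size() < usMax && xReadBlock(xBlock))
        {
            xBlocks.push_back(xBlock); // one detected blob per block
        }
        return (int)xBlocks.size();
    }
};
</syntaxhighlight>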
PixyMouth
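PixyMouth is the reporting half of the analogy: once PixyBrain confirms a human insertion, PixyMouth "speaks" the result to the rest of the system, ultimately reaching the Connect 4 AI on the laptop. A hedged sketch of that notification (the class members and the ASCII message framing are assumptions, not the project's actual wire format):
<syntaxhighlight lang="cpp">
#include <cstdio>

// Sketch of PixyMouth reporting a confirmed human move.
class PixyMouth
{
  public:
    // Emit the column of the newly detected human chip.
    bool xEmitUpdate(int lColumn)
    {
        char pcMsg[16];
        snprintf(pcMsg, sizeof(pcMsg), "HUMAN:%d\n", lColumn);
        // ... transmit pcMsg over the Bluetooth link to the laptop ...
        return true;
    }
};
</syntaxhighlight>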
Implementation
At a high level, one full round of play flows through the system as follows:
1. PixyEyes continuously samples blocks from the Pixy over SPI.
2. PixyBrain filters the samples (via the tunable EMA) until a new human chip is confirmed.
3. PixyMouth reports the detected move, which reaches the Connect 4 AI on the laptop via the Bluetooth Socket Interface.
4. The AI chooses a response column and sends it back to the SJSU One.
5. GameTask and MotorTask drive the stepper and servo to drop the bot's chip, and PixyTask returns to WAITING_FOR_HUMAN.