Goals
Our software goal was to create Raspberry Pi and Arduino scripts that would work together (over a serial connection) to process user input and control both the stepper motor and LED matrices in order to display the proper output.
Process
Raspberry Pi
The Raspberry Pi is the brains of the CoSign. It runs the Python script, connects to the internet, and interfaces with the microphone, LED matrix, and Arduino. We had to adjust some of the Pi's internal settings for it to work with the microphone and USB sound card we had ordered: we made ALSA treat our USB sound card as the default device instead of the built-in audio hardware, which could not capture audio at the quality we needed. We also needed the Pi to run our code on startup so that a monitor would not be necessary to start the software, limiting the number of wires attached to the sign. To do this, we edited the Pi's rc.local file to run a bash script that executes our main program files.
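The exact files depend on the install, but the startup hook can look roughly like this (the /home/pi/cosign paths here are placeholders, not our actual ones):

```shell
# /etc/rc.local -- add before the final "exit 0".
# The trailing "&" backgrounds the script so boot is not blocked.
/home/pi/cosign/start.sh &

# start.sh itself (placeholder path) just launches the main program:
#   #!/bin/bash
#   cd /home/pi/cosign
#   python main.py
```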
Python
The Python script runs on the Raspberry Pi and handles looking up locations on the internet, calculating their relative distance and direction, and printing to the LED matrix. The libraries and APIs that it uses are:
- freegeoip.net: This allows us to look up the sign's location using the Raspberry Pi's IP address.
- serial: This allows us to interface with our Arduino.
- speech_recognition: This library gives us access to Google's speech recognition API; it allows us to recognize what the user says into the microphone.
- geopy: This assists with calculating the distance between two GPS coordinates using the Vincenty method.
- googleplaces: This gets the GPS coordinates for the user-specified location.
- rpi-rgb-led-matrix: This handles interfacing with the LED matrix display through the Raspberry Pi's GPIO pins.
- threading: This allows us to print to the LED matrix and process things in the background at the same time.
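As one concrete example of how the serial library is used, the Pi's messages to the Arduino can be framed as simple newline-terminated ASCII lines. The `DIR <degrees>` format below is an illustration, not our verbatim protocol:

```python
def encode_direction(angle_degrees):
    """Frame a pointing angle as one ASCII line for the serial link.

    The "DIR <degrees>\n" format is illustrative; any newline-terminated
    text line is easy to read back on the Arduino side. The angle is
    rounded and wrapped into [0, 360).
    """
    return "DIR {}\n".format(int(round(angle_degrees)) % 360).encode("ascii")

# With pyserial this would then be written out along the lines of:
#   port = serial.Serial("/dev/ttyACM0", 9600)  # port name is an assumption
#   port.write(encode_direction(bearing))
```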
The script runs in four main steps:
- Startup Step: This is the first step, called when the sign is supplied power. It prints a "Loading..." message to the display while it connects to the Arduino, gets the magnetometer data from the Arduino, and finds its own location using freegeoip.
- Waiting for User Step: After the sign is all set up, the script prints a "Press the Button" message to the display while listening for the Arduino to tell it that the button was pressed.
- Listening and Calculating Step: After the button is pressed, the script prints a "Listening..." message to the display while it listens and recognizes the words the user says into the microphone. After getting the user input, it looks up that location and gets its GPS coordinates. It then calculates the location's relative distance and direction away from the sign.
- Output Step: After we have the location's name, distance, and direction, we are ready to act on them. This step prints the location name and distance to the display and sends the direction angle to the Arduino for it to move the motor accordingly.
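In the calculating step, geopy gives us the distance between the two coordinates, while the direction can be computed with the standard initial-bearing formula for a great circle. A stdlib-only sketch of that formula (not our exact code, which used the library helpers):

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Compass bearing in degrees (0 = North, 90 = East) from point 1 toward point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    # atan2 gives (-180, 180]; wrap into [0, 360) for a compass bearing.
    return math.degrees(math.atan2(x, y)) % 360
```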
Arduino
The Arduino is responsible for controlling four major components: the motor, the button, the limit switch, and the magnetometer:
- Stepper Motor: This actually spins the sign. The Arduino tells it how much to turn the sign in order to point at the correct angle.
- Button: This is what the user presses when they want to give the sign a new location to search for. The Arduino is constantly checking if the button was pressed so it can tell the Raspberry Pi to start listening for a new location.
- Limit Switch: This is pressed when the sign has hit its angle limit. There are a bunch of wires running up the sign that get twisted when the sign turns, and they could possibly break if they are twisted too far. Having a limit switch allows us to know if the sign has been turned too far so we can reset it.
- Magnetometer: This measures the sign's compass direction. If the sign needs to point East, it will turn a different angle depending on if it's facing North or South. Having a magnetometer allows us to know what direction the sign is pointing.
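The magnetometer correction described in the last bullet boils down to a signed difference between the target bearing and the current heading, wrapped into ±180° so the sign always takes the shorter turn. Our actual firmware is Arduino code, but the arithmetic is the same as this sketch:

```python
def turn_angle(target_bearing, current_heading):
    """Signed rotation in degrees needed to face target_bearing.

    Positive means clockwise. Wrapping into [-180, 180) picks the shorter
    turn: facing North (0) and targeting East (90) gives +90, while
    facing South (180) and targeting East gives -90.
    """
    return (target_bearing - current_heading + 180) % 360 - 180
```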
Pivots and Challenges
We originally were planning to run the LED matrix displays off of the Arduino, but quickly realized that the Arduino cannot run the displays when they are daisy-chained together. Instead, it mirrors the display, which is not what we wanted. After a bit of research we found that we could run the daisy-chained displays using the Raspberry Pi. This was a relatively easy fix, but we did waste time learning how to run the displays off of the Arduino.
The display needs to be constantly updated (in a while loop), so we had to figure out how to print information to the user while also doing background computations. To fix this, we ended up using the python threading library to be able to run two functions in parallel. This also meant that everything in our code had to be in functions, so that required a lot of rearranging.
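The split described above — one loop refreshing the display while the slow work runs elsewhere — can be sketched with the threading library. The display and lookup bodies here are stand-ins for our real functions:

```python
import threading
import time

def background_lookup(result):
    # Stand-in for the slow work (speech recognition, API calls).
    time.sleep(0.1)
    result["text"] = "Listening done"

def run_with_display(message):
    """Redraw the display in a loop until the background thread finishes."""
    result = {}
    worker = threading.Thread(target=background_lookup, args=(result,))
    worker.start()
    while worker.is_alive():
        _ = message  # stand-in for refreshing the LED matrix each pass
        time.sleep(0.01)
    worker.join()
    return result["text"]
```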
Reflection
Transitioning from running the display off the Arduino to running it from the Raspberry Pi turned out to be far more difficult than we anticipated; what we expected to be quick and straightforward ended up being much more complicated. It was very gratifying, however, when we finally got it to work. This project taught all of us new things about software, and it was rewarding to learn useful tools such as interfacing between a Raspberry Pi and an Arduino and running multiple functions in parallel in Python.