This $249 device has been around for years, provides a serial interface, and has a mature API for accessing speed data up to ~150 mph.

Since the Raspberry Pi would be running in a "headless" state, I wanted to have some active feedback while it's working. I chose to use a simple seven-segment display to report on inference results and gathered speed data.

When making an IoT project portable, there are two main considerations:

- How are you gonna power it?
- How are you gonna get the data off of it?

To solve the power issue, I used a 30,000 mAh USB-C battery pack. On another project I was able to get 41 hours of unoptimized run time out of a Raspberry Pi with this power bank, which is ideal for this project.

To solve the data issue, I went with the Blues Wireless Notecard. The Notecard (and its companion Notecarrier-Pi HAT) provide not only cellular data access but also integration with the Blues Wireless cloud service, Notehub.io, to securely relay collected data to my cloud application for reporting purposes.

With the construction of the Edge Impulse ML model complete, it was time to deploy the model to the Raspberry Pi.

NOTE: Edge Impulse explicitly states support only for the Raspberry Pi 4. Your mileage may vary if you try this on an older RPi model!

First, I installed Edge Impulse for Linux on the RPi with the following commands:

```shell
curl -sL | sudo bash -
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm
```

To verify the installation was successful, I ran the `edge-impulse-linux` command to log in to Edge Impulse and register my RPi as a device.

Next, I needed the Edge Impulse Linux Python SDK installed:

```shell
sudo apt-get install libatlas-base-dev libportaudio0 libportaudio2 libportaudiocpp0 portaudio19-dev
pip3 install edge_impulse_linux -i
```

With Edge Impulse dependencies installed (drumroll please…) I downloaded my model file using this command:

```shell
edge-impulse-linux-runner --download model.eim
```

NOTE: If you want to skip the play-by-play, you can find the completed Python project on GitHub.

With my hardware assembled and ML model in place, it was time to write some Python! I decided to segment my Python code into five discrete tasks:

1. Use the Edge Impulse ML model to identify a vehicle.
2. Immediately measure that vehicle's speed.
3. Display the speed on the seven-segment display.
4. Send the event over cellular to the cloud.
5. Create a cloud-based dashboard for reporting.

Let's see how these problems were solved.

With the Edge Impulse Python SDK installed, utilizing the ML model couldn't have been much easier. By borrowing sample Python code provided by the Edge Impulse team, I was able to piece together a solution that took video frames and output an inference result.

During testing it became fun to watch the processing in action. An inference confidence of >= 60% was enough to continue on to the next step.

At this point my program could be reasonably confident it was looking at a vehicle, so it was now time to check the speed from the OPS243 doppler radar module. I refactored some sample Python code provided by the OmniPreSense team to gather the speed reported by the OPS243 module:

```python
def ops_get_speed():
    """ capture speed reading from OPS module """
    while True:
        speed_available = False
        Ops_rx_bytes = ser.readline()
        Ops_rx_bytes_length = len(Ops_rx_bytes)
        if Ops_rx_bytes_length != 0:
            Ops_rx_str = str(Ops_rx_bytes)
            if Ops_rx_str.find('
```
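The `ops_get_speed()` sample above is cut off mid-condition in this excerpt. Here is one way such a serial read loop might be completed — a minimal sketch, not the original OmniPreSense code. The `parse_speed` helper, the `{` filter (to skip the module's JSON-style status replies), and the float parsing are my assumptions:

```python
# Hypothetical completion of an OPS243 speed-reading loop.
# parse_speed, the "{" filter, and the float parsing are assumptions,
# not the original OmniPreSense sample.

def parse_speed(raw_line: bytes):
    """Return a speed from one serial line, or None if the line is
    empty or a non-numeric (e.g. JSON status) message."""
    if len(raw_line) == 0:
        return None
    text = raw_line.decode("utf-8", errors="ignore").strip()
    # Config/status replies look like JSON; speed readings are bare numbers
    if not text or text.startswith("{"):
        return None
    try:
        return float(text)
    except ValueError:
        return None


def ops_get_speed(ser):
    """Block until the OPS module reports a speed, then return it."""
    while True:
        speed = parse_speed(ser.readline())
        if speed is not None:
            return speed
```

With pyserial, `ser` would be an open port, e.g. `serial.Serial("/dev/ttyACM0", timeout=1)`; splitting the parsing out of the loop also makes it easy to unit-test without hardware.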
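The ">= 60% was enough to continue" gate described above can be sketched as a tiny helper. The shape of the classification result (a dict of label → score) and the label name `"vehicle"` are assumptions for illustration, not the author's exact code:

```python
CONFIDENCE_THRESHOLD = 0.60  # the article's bar for moving to the next step

def confident_vehicle(classification: dict, label: str = "vehicle") -> bool:
    """Return True when the given label's score meets the threshold.

    `classification` maps labels to scores (dict shape and the
    "vehicle" label are assumed here for illustration).
    """
    return classification.get(label, 0.0) >= CONFIDENCE_THRESHOLD
```

In the main loop this would sit between steps 1 and 2: only when `confident_vehicle(...)` returns True does the program go on to read the radar.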
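Step 4 (send the event over cellular) maps naturally onto a Notecard `note.add` request. A `note.add` request targeting an outbound `.qo` Notefile is standard Notecard usage, but the Notefile name `speed.qo` and the body fields below are hypothetical — the excerpt doesn't show the author's actual payload:

```python
import time

def build_speed_event(speed_mph: float) -> dict:
    """Build a Notecard `note.add` request carrying one speed reading.

    The "speed.qo" Notefile name and body fields are hypothetical;
    only the note.add request shape comes from the Notecard API.
    """
    return {
        "req": "note.add",
        "file": "speed.qo",  # outbound queue, synced to Notehub.io
        "body": {
            "speed_mph": speed_mph,
            "captured_at": int(time.time()),
        },
    }
```

With the note-python library, a request like this would be passed to `card.Transaction(...)` after opening the Notecard over serial or I2C.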