Machine Turning: Improving Bike Safety With Machine Learning
March 25, 2020
- Author: Kyle Stevens
- Topics: Machine Learning, Mobile Innovation
Imagine a world where machine learning allows cyclists to effectively tell everyone on the road what their next action will be, whether or not motorists know hand signals. It’s not just a fun idea. It’s life-saving! According to the National Highway Traffic Safety Administration, 857 bicyclists were killed in traffic crashes in the United States in 2018.
Another recent study, conducted by researchers at UCSF in 2017, found that over the 17-year period from 1997 to 2013, the medical costs of bicycle injuries to adults in the United States, both fatal and non-fatal, amounted to $237 billion, with 3.8 million non-fatal adult bicycle injuries and 9,839 deaths reported during the study period.
Generally, the roads would be much safer for cyclists if everyone just obeyed the rules of the road and operated with caution around each other. But that isn’t always easy. Not all cyclists signal their next action, and not all motorists even know what those signals mean. What if we could take those challenges off the table?
Using Machine Learning to Find a Solution
CapTech leadership put forth a company-wide challenge embracing our culture of innovation. We were tasked with creating something new and interesting using wearable technology, or shall we say “awareable” technology. We weren’t necessarily trying to build new wearables; rather, we were challenged to maximize the potential of wearable sensors combined with the computational power packed into these tiny devices.
To answer the challenge, our [city] team created an automatic visual turn signal system for cyclists that activates when the rider performs standard hand signal gestures. We combined sensor data from an Apple Watch with a specially trained machine learning model to detect hand signals in real time. When a signal is detected, the watch communicates wirelessly with a rear-mounted iPhone to show visual turn signals on the rider’s back. Signals are displayed until the rider completes their turn. The result is a closed system that operates automatically without adding cognitive overhead.
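The post doesn’t include source code, but the watch-to-phone link and the compass-based deactivation map naturally onto Apple’s WatchConnectivity and Core Location frameworks. Below is a minimal, hypothetical sketch of the phone side: it receives a signal message from the watch, notes the rider’s heading at that moment, and deactivates the signal once the heading has swung far enough. The class name, the message format, and the 60-degree threshold are our illustrative assumptions, not details from the actual app.

```swift
import WatchConnectivity
import CoreLocation

/// Hypothetical phone-side controller: receives turn-signal messages from the
/// watch and uses the compass to decide when the turn is complete.
final class SignalDisplayController: NSObject, WCSessionDelegate, CLLocationManagerDelegate {
    private let location = CLLocationManager()
    private var headingAtActivation: CLLocationDirection?
    private(set) var activeSignal: String?   // "left", "right", or nil

    override init() {
        super.init()
        WCSession.default.delegate = self
        WCSession.default.activate()
        location.delegate = self
        location.startUpdatingHeading()
    }

    // The watch sends e.g. ["signal": "left"] when it recognizes a gesture
    // (an assumed message format). Messages arrive on a background queue.
    func session(_ session: WCSession, didReceiveMessage message: [String: Any]) {
        guard let signal = message["signal"] as? String else { return }
        DispatchQueue.main.async {
            self.activeSignal = signal
            self.headingAtActivation = self.location.heading?.trueHeading
            // ...start flashing the on-screen turn signal here...
        }
    }

    // Deactivate the signal once the rider's heading has changed enough
    // to indicate a completed turn.
    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        guard let start = headingAtActivation, activeSignal != nil else { return }
        let delta = abs(newHeading.trueHeading - start)
        let turned = min(delta, 360 - delta)   // shortest angular distance
        if turned > 60 {                        // illustrative threshold
            activeSignal = nil
            headingAtActivation = nil
            // ...stop flashing here...
        }
    }

    // Required WCSessionDelegate plumbing on iOS.
    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) {}
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) {}
}
```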
Putting Wearables to Work in New Ways
To accomplish our goal, we created two different apps. The first app captures motion data and tags it in real time, which enabled us to quickly collect a lot of real-world training data to feed into the machine learning process. The second app incorporates our trained machine learning model to recognize gestures and display turn signals.
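A capture-and-tag app like the first one is straightforward to sketch with Core Motion. The hypothetical recorder below samples device motion, stamps each row with whatever label the rider has currently toggled (say, from a button in the UI), and accumulates CSV rows for export. The GestureRecorder name, the 50 Hz rate, and the label set are our assumptions for illustration.

```swift
import CoreMotion
import Foundation

/// Hypothetical capture-and-tag recorder for building a gesture training set.
final class GestureRecorder {
    private let motion = CMMotionManager()
    private var rows = ["timestamp,label,accX,accY,accZ,rotX,rotY,rotZ"]

    /// Label toggled from the UI while riding: "left", "right", "stop", or "none".
    var currentLabel = "none"

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 50.0   // 50 Hz (assumed rate)
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let d = data else { return }
            let a = d.userAcceleration, r = d.rotationRate
            self.rows.append(
                "\(d.timestamp),\(self.currentLabel),\(a.x),\(a.y),\(a.z),\(r.x),\(r.y),\(r.z)"
            )
        }
    }

    /// Stop sampling and return the tagged session as CSV for export.
    func stop() -> String {
        motion.stopDeviceMotionUpdates()
        return rows.joined(separator: "\n")
    }
}
```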
Our apps run simultaneously on the iPhone and Apple Watch. The watch is worn on the rider’s left wrist so that they can perform proper traffic turn signals with their left arm. The phone is mounted on the rider’s back with a harness so the flashing turn signals are visible to other commuters. The Apple Watch component processes real-time motion data through our Core ML model to recognize arm gestures, and the iPhone component displays turn signals on the screen whenever the watch detects a turn signal gesture. We also use the phone’s compass to detect completed turns and deactivate the turn signals.
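On the watch side, a real-time recognizer along these lines would buffer a sliding window of motion samples, run the window through the Core ML model, and forward confident detections to the phone. The sketch below assumes a Create ML-style activity classifier; the model class, the input and output names ("window", "label"), and the window size all depend on how the model was trained, so treat them as placeholders rather than our actual interface.

```swift
import CoreMotion
import CoreML
import WatchConnectivity

/// Hypothetical watch-side recognizer: motion samples -> Core ML -> phone.
final class GestureRecognizer {
    private let motion = CMMotionManager()
    private let windowSize = 100            // ~2 s at 50 Hz (assumed)
    private var window: [[Double]] = []
    private let model: MLModel              // e.g. a compiled gesture classifier

    init(model: MLModel) { self.model = model }

    func start() {
        // Assumes WCSession was already activated elsewhere in the watch app.
        motion.deviceMotionUpdateInterval = 1.0 / 50.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let d = data else { return }
            self.window.append([d.userAcceleration.x, d.userAcceleration.y, d.userAcceleration.z,
                                d.rotationRate.x, d.rotationRate.y, d.rotationRate.z])
            if self.window.count >= self.windowSize {
                self.classifyWindow()
                self.window.removeFirst(self.windowSize / 2)   // 50% window overlap
            }
        }
    }

    private func classifyWindow() {
        // Pack the sample window into the shape the (assumed) model expects.
        guard let input = try? MLMultiArray(shape: [NSNumber(value: windowSize), 6],
                                            dataType: .double) else { return }
        for (i, sample) in window.prefix(windowSize).enumerated() {
            for (j, value) in sample.enumerated() {
                input[[NSNumber(value: i), NSNumber(value: j)]] = NSNumber(value: value)
            }
        }
        guard let provider = try? MLDictionaryFeatureProvider(dictionary: ["window": input]),
              let output = try? model.prediction(from: provider),
              let label = output.featureValue(for: "label")?.stringValue,
              label != "none" else { return }
        // Forward the detected hand signal to the phone for display.
        if WCSession.default.isReachable {
            WCSession.default.sendMessage(["signal": label], replyHandler: nil, errorHandler: nil)
        }
    }
}
```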
The result of these two components working together in real time is a passive, interaction-free app that amplifies the rider’s intent without adding cognitive load, leaving them free to focus solely on proper hand signaling and safely completing their turns.
We believe that wearable technology is at its best when operating automatically in response to a user’s actions and environment. Reducing or removing cognitive load while acting on a user’s behalf is a critical aspect of wearable technology. The result is something helpful—and in this case, it’s beyond helpful. It’s life-saving. And that isn’t something that has to remain imagined.
Our Team
We were grateful to have the opportunity to deliver a truly meaningful solution using emerging technology. Our team was formed from two groups within CapTech: Data Science and Mobile Applications. Our data science group included Gabriel Jessee and Connor McKenna. Our mobile applications group included Kyle Stevens, Ben Nowak, and Sean Moran.