
Does OKO Have the Green Light in the Blindness Community?
By Dezman Jackson
The world around us is designed for a sighted consumer base. Therefore, our approaches to navigating are generally centered on using alternative senses like hearing and touch, modifying the environment, and/or developing low- and high-tech devices such as the long white cane. Smartphone technology has revolutionized daily life for the general public, including the blind. OKO, an app developed by the Belgian company Ayes, has recently garnered much interest.
OKO assists users in crossing at light-controlled intersections by using the smartphone's camera to detect the typical phases of traffic lights. I was introduced to the app recently when Ayes visited the Jernigan Institute.
I am a blind person with no usable vision. I travel in various environments and conditions with relative ease. I have also been an orientation and mobility instructor holding National Orientation and Mobility Certification (NOMC) for a handful of years.
Background on Accessible Traffic Signals
According to the US Access Board, Accessible Pedestrian Signals (APS) technology has existed in the United States for twenty-five years. APS give pedestrians nonvisual information on the state of traffic signals (e.g., walk/don't walk, green/red). This information is typically provided through patterns of audible tones and tactile/vibration feedback. APS on opposing corners are sometimes synchronized to help travelers align themselves when crossing the street. Although not available at every signalized intersection, they have become increasingly prevalent, particularly when new signals are installed or the general infrastructure is upgraded. A desire to fill in the gaps has fueled Ayes's endeavor to develop a solution: OKO.
The Inner Workings and My Experience
OKO is currently available only on Apple's iOS platform for iPhones. Downloading and installation are free of charge. Upon opening the app, you are presented with first-time-use instructions.
I found the main interface fairly simple and minimalist, with few items on the screen. Although the buttons and other elements are labeled and spoken by VoiceOver, some of the labels were not intuitive. For instance, one of the buttons is labeled “Account.” Upon activation, it presents options for configuring the camera, audio and haptic feedback, and so on. A more intuitive name for the button might be “Settings.”
In using the app at several intersections, I found that the best results come from holding the phone in portrait mode at chest level while pointing the back camera in the direction one is facing. OKO uses artificial intelligence to seek out “walk,” “countdown,” and “don’t walk” signals for each traffic light. It gives feedback through sound, vibration, and/or a visual display of the camera’s view on the phone’s screen. When the app detects the “don’t walk” phase of a cycle, it produces a beeping/vibrating pattern approximately once per second. When it detects a walk signal, the pattern speeds up a bit. When it detects a countdown signal, the user hears and feels a different rhythmic pattern.
Ayes quite appropriately recommends using audio gear that doesn’t cover or go in the ears. I typically used my Shokz OpenRun bone-conduction headset.
I found it challenging to line up so that the phone could detect the traffic signal. When I did achieve proper alignment, the app did a fine job of informing me of changes in the signal. I would like to see Ayes implement some way to inform users of the number of seconds remaining on the countdown timer. For now, I suppose this could be estimated from the countdown pattern.
Ayes really touts the app’s ability to assist travelers in making straight crossings through feedback (or the lack thereof) while crossing. As long as you are making a straight crossing, you should continue to hear and/or feel feedback. If you stop receiving feedback, you are veering and should adjust until your phone’s camera reacquires the signal and the feedback resumes.
I found it difficult to maintain this continuous feedback. Even so, I made more or less straight crossings, judging by the usual parallel traffic and by where I landed on the other side of the street. I therefore deduce that my difficulty keeping the phone-holding hand steady, rather than veering, likely caused the interruptions in feedback.
Final Thoughts
Creating technology to deal with traffic signals is quite a feat, particularly in one of the most critical aspects of nonvisual independent travel. I applaud Ayes for leveraging the power of AI in a good first run at it. I would caution travelers against outsourcing their crossing decisions solely to OKO in lieu of best practices in orientation and mobility as their foundational skills. This app should be used as a supplement. Ayes says as much in the app and requires users to agree that they understand.
The company also recommends that users get familiar with the app by using it with a companion. While I champion users getting the help they need to feel comfortable with this technology, I believe the app should be intuitive enough for users to get going without a companion. The company also states that the app should be used only during daylight hours. I assume this is due to a technical limitation of signal detection. Obviously, independent travel happens after dark, and I hope any limitations in this aspect of the app will be addressed. I would also like to see versions of this app developed for other platforms, such as Android and its derivatives like BlindShell.
Finally, while OKO is a great resource to have in our pockets, not everyone will have access to the technology that supports it. Our blind and deafblind community should continue to advocate for more universal access to pedestrian signals and for involving us in their implementation. Does OKO have the green light in the blindness community? I say yes, with the caveats above. Thank you to Ayes for your innovation and for including us in the process.