Oculis
26 August 2025
An iOS application designed to enhance public transit accessibility in Singapore, making buses easier for visually impaired commuters to identify.
The Problem
In Singapore, between 3 and 7 out of every 1,000 people live with blindness, and an estimated 1.5% experience low vision. Although existing navigation apps provide routing information, they are often costly and not designed with accessibility as a priority.
The visually impaired community faces particular challenges when navigating crowded bus stops where multiple buses may arrive at once. Without visual confirmation, users must rely heavily on bystanders to identify the correct bus, creating dependency and uncertainty in their daily commutes. This lack of accessible public transit information limits independent mobility and undermines confidence when navigating Singapore’s complex transport network.
Our Solution
Oculis improves public transit accessibility by integrating on-device machine learning with live data from Singapore’s Land Transport Authority (LTA). The app uses computer vision to identify buses in real time while delivering audio feedback optimized for Apple’s VoiceOver and other screen readers.
By simply pointing their device's camera toward the road, users receive instant voice announcements identifying approaching buses, removing the guesswork at busy stops. The app integrates seamlessly with Singapore's transport infrastructure; because detection runs on-device, it works even without a network connection and requires no costly subscriptions.
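The announcement step described above can be sketched in Swift. This is an illustrative sketch, not the app's actual code: the `BusAnnouncer` class is a hypothetical name, and the detected `serviceNumber` is assumed to come from the on-device computer-vision model.

```swift
import AVFoundation
import UIKit

// Hypothetical announcer for a detected bus service. The service
// number would be produced by the on-device detection model.
final class BusAnnouncer {
    private let synthesizer = AVSpeechSynthesizer()

    func announce(serviceNumber: String) {
        let message = "Bus \(serviceNumber) approaching"

        if UIAccessibility.isVoiceOverRunning {
            // Route the message through VoiceOver so it respects the
            // user's configured voice, speaking rate, and language.
            UIAccessibility.post(notification: .announcement,
                                 argument: message)
        } else {
            // Fall back to speech synthesis for users not running
            // VoiceOver, so audio feedback is still available.
            let utterance = AVSpeechUtterance(string: message)
            utterance.rate = AVSpeechUtteranceDefaultSpeechRate
            synthesizer.speak(utterance)
        }
    }
}
```

Posting an accessibility announcement rather than always synthesizing speech directly is the usual design choice here, since it avoids the app's audio talking over the screen reader.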
Oculis empowers visually impaired Singaporeans to travel independently and confidently, fostering a more inclusive society where mobility barriers are eliminated.
Journey
When we first began, our goal was to build an all-encompassing navigation system for the visually impaired community in Singapore. We imagined a solution that could address everything from obstacle detection to signage reading, believing that covering as many needs as possible would be the most valuable approach.
However, as sighted individuals, we quickly realised how hard it was to understand the subtle everyday challenges of real navigation. Many of our early prototypes, while ambitious, didn't align with what users actually needed. So we turned to the Singapore Association of the Visually Handicapped (SAVH) and the community itself. Hearing their personal accounts and witnessing their struggles firsthand revealed critical pain points we had overlooked.
These experiences reshaped the way we built Oculis. Rather than spreading ourselves thin across many features, we committed to solving one urgent problem well: helping users identify buses reliably in crowded, hectic environments. This shift in focus not only sharpened our product direction but also transformed our approach to user research, emphasizing real-world testing and co-creation with the community.
The collaboration has been invaluable. We are deeply thankful to Lyn, Choon Hwee, Ben, and the entire SAVH team for guiding us and helping accelerate our pilot efforts, as well as to our mentor Jin from TicTag, whose steady support kept us on track. This journey has shown us that technology alone isn't enough; it's the partnership with users that makes solutions meaningful, impactful, and truly accessible.
Impact
Oculis partnered with the Singapore Association of the Visually Handicapped (SAVH) to launch pilot testing programs with the visually impaired community.
During this pilot phase, Oculis has successfully:
• Onboarded 11 visually impaired users, including volunteers and employees at SAVH
• Logged 21 navigation sessions with active user engagement over the span of one week
• Delivered bus detection only 3–4 seconds slower than a sighted user actively looking for the bus service
• Demonstrated early signs of increased independence and greater access to public transportation for users
While some technical challenges and opportunities for improvement remain, users are already finding meaningful value in their daily navigation experiences through Oculis.
Pilot Findings & Feedback
Pilot Testing Approach
To date, we've run pilot tests with several members of SAVH, which fall into two categories:
• Guided tests: We physically met up with members of SAVH to introduce the prototype and guide them on how to use the app at the nearby Lighthouse School bus stop.
• Independent tests: Members tried the app on their own, via TestFlight, during their regular bus rides.
Pilot Feedback
"I've been waiting for a solution like this for years! Other apps I've tried just weren't reliable. It's incredible that a team fresh out of poly has created something so accurate that it has genuinely made my daily commute easier."
Key Outcomes
• Received positive feedback on the accuracy of the bus detection model, with users noting its reliability even under challenging lighting conditions.
• Users appreciated the simplicity of the interface and the unobtrusive design of the audio feedback system.
• The most frequently requested feature was haptic feedback for directional guidance, which was successfully implemented to help users orient their phones toward approaching buses.
• These insights guided our development roadmap, where we prioritized enhanced haptic feedback, faster loading times, and more refined audio cues.
Roadmap
Over the next six months, the Oculis team plans to further develop the application by adding a few new key features:
• Road Navigation: Providing turn-by-turn guidance.
• Enhanced Bus Experience: Improving the current bus detection features.
• Bus Service Navigation: Guiding users to specific bus services.
• Indoor Navigation: Expanding the app's functionality for use in indoor spaces.
Try our platform here: https://oculis.vision/
For more details, email us: hello@oculis.vision