What is yaR?
Imagine having a friend who can describe the world around you - from the smile on your grandchild's face to the words on a street sign. That's yaR. It's not cold, clinical tech. It's a companion that whispers the world into your ear.
How does it work?
Users press a button and ask a question about their surroundings. yaR immediately snaps a photo, capturing the scene. Its AI then processes this image along with the user's question. Within seconds, yaR responds in a clear voice, providing the information the user needs.
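Curious how those pieces fit together? Here's a minimal sketch of that capture-ask-speak loop - not yaR's actual code (that lives in our GitHub repo), just an illustration assuming a webcam via OpenCV, GPT-4o as the vision-language model, and pyttsx3 for the voice:

```python
# A minimal sketch of the loop described above. NOT yaR's real code -
# it assumes OpenCV for the camera, OpenAI's GPT-4o for the AI step,
# and pyttsx3 for text-to-speech.
import base64

import cv2                 # pip install opencv-python
import pyttsx3             # pip install pyttsx3
from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY set


def snap_photo() -> str:
    """Capture one frame from the default camera, base64-encoded as JPEG."""
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Could not capture a photo")
    _, jpeg = cv2.imencode(".jpg", frame)
    return base64.b64encode(jpeg.tobytes()).decode("ascii")


def describe(image_b64: str, question: str) -> str:
    """Send the photo plus the user's question to a vision-language model."""
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content


def speak(text: str) -> None:
    """Read the answer aloud through the speakers."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


if __name__ == "__main__":
    # On the device, a physical button press and a microphone replace this.
    question = input("Ask about your surroundings: ")
    speak(describe(snap_photo(), question))
```

On the device itself, a button and a microphone stand in for the keyboard, but the idea is the same: one press, one photo, one question, one spoken answer.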
From our users
We developed yaR hand-in-hand with the Singapore Association of the Visually Handicapped. Their members tested it in real situations - describing scenes, reading labels, you name it. It wasn't always smooth sailing, but their feedback was gold. They pointed out the good, the bad, and the "needs work." That's how yaR got better. We're still learning, still improving, thanks to them.
Be Part of yaR's Journey
We've open-sourced yaR's code on GitHub. Developers, your skills can help refine yaR. Not a coder? No worries - our Kickstarter is coming soon. Whether you're tweaking algorithms or backing the project, you're helping make the world more accessible. Ready to join us?
Can I see yaR in action? Sure.
Our Story
We're not a big tech company. We're a group of friends who believed that everyone deserves to experience the world in full color. yaR was born in late-night coding sessions, countless cups of coffee, and the unwavering belief that technology can bridge the gap between sight and insight.
From left to right: Yajat, Manas, Shrivardhan and Sparsh.
The Future is What We Make It
yaR isn't finished. It never will be. Because as long as there are new experiences to be had, new sights to be described, new worlds to be explored, we'll be here, coding, dreaming, and believing in a world where everyone can see - in their own unique way.