Charles Leclercq, CEO of ARxVision, outlines the conception of ARxVision's collaboration with the Seeing AI and NaviLens apps and details the most popular uses for the headset so far.
Those with low vision and blindness now have a new way to experience the world around them with ARxVision's new ARx AI Gen 1.5 headset. Paired with the Seeing AI and NaviLens apps, the headset gives users audio descriptions of their surroundings while keeping their hands free. Charles Leclercq, ARxVision CEO, details the uses of the headset in an exclusive interview with Optometry Times.
Editor's note: This transcript has been edited for clarity.
Jordana Joy:
Hi everyone. I'm here today with Charles Leclercq, CEO of ARxVision. Welcome, it's great to have you.
Charles Leclercq:
Hi Jordana, it's nice to be here.
Joy:
Great. So tell me a bit about the development of the ARxVision headset.
Leclercq:
We created the ARxVision ARx AI headset to extend the capabilities of smartphones, and clearly, over the past 18 months, generative AI has been booming. The value AI-enabled apps can deliver to the community is tremendous. The only downside is that they still require individuals to hold their smartphones in front of them to scan their surroundings, which is not ideal in terms of user experience. And on the science side of things, a number of published research papers explain that holding a smartphone increases cognitive load and can increase the risk of accidents and injuries. What we've done is create the ARx AI headset to extend the smartphone: when you connect it to a smartphone, you can put the phone in your pocket and still benefit from the power of all these wonderful apps, like the Microsoft Seeing AI app and the NaviLens app.
Joy:
Great. So tell me a bit about how day-to-day life changes for people with low vision who may be using ARxVision's headset.
Leclercq:
I think it's really about ease of use and convenience. What I like most about the interaction ARx AI creates is that you could be having a conversation with someone, be part of the action, part of what's happening, and still benefit from the AI augmentation through the audio of the bone conduction speakers. Obviously, people have been using it at home to read the newspaper, letters they've received, recipes, and things like that. We've seen a lot of people using it for shopping: finding the right aisle in a shop, or making sure they're purchasing the right variety of their favorite cereal, or whatever it may be. We've also run trials in hospitals, and the feedback was that ARx AI was actually enhancing access to healthcare, because patients felt more confident using transportation networks on their own and then navigating inside the hospital. That's huge, because it means people can access health care, but also that nurses and doctors in the hospital can spend more time focusing on what matters. And finally, we've seen a lot of use cases for work. The blind community sometimes uses apps that provide remote visual assistance, but those still require you to hold a smartphone in front of you. So for the first time, this community was telling ARx they were excited to get the same assistance while still being able to use both hands, which makes them more employable. That's extremely meaningful and very motivating for them.