Smart Glasses Meet AI: Making Technology Accessible to All

James Sweetlove
|  Created: October 9, 2025  |  Updated: November 13, 2025

Discover how AI-powered smart glasses are expanding accessibility and independence for people with visual impairments, the elderly, and those with cognitive disabilities. In this episode, Karthik Kannan, Founder and CTO of Envision, returns to the show to share the incredible journey of developing Ally—an AI assistant that works hands-free through affordable smart glasses at just $399. Learn about the evolution of wearable technology, conversational AI, and how this breakthrough is making advanced technology accessible to everyone who needs it most.

From reading menus at restaurants to navigating train stations independently, these AI glasses are transforming daily life for millions. Karthik discusses partnerships, the future of conversational AI, and why smart glasses are the perfect form factor for AI applications. 

Resources from this episode:

Listen to the Episode

Watch the Episode

Transcript

James: Hi everyone, this is James from the Ctrl+Listen Podcast, brought to you by Octopart. Today we have a special guest back on the show for the second time. This is Karthik Kannan. He is the CTO of Envision. They’re a fantastic company that has been on the show before and makes some really amazing technology in the vision and smart glasses space. Thank you so much for coming on the show. It’s great to have you back.

Karthik: Thanks so much, James. Super excited to be here. Thanks for having me again.

James: Of course, because I feel like your story is not finished, it just keeps developing. I love keeping up with what you’re doing. Maybe for anyone who didn’t catch the first episode, can you tell us a little bit about the company as a whole and your journey?

Karthik: Sure. My name is Karthik and I am the co-founder and CTO of Envision. Envision is a company that builds tools for people with a visual impairment so they can live more independently. We use AI to help them do all kinds of tasks. We help them read text, we help them recognize faces of their friends and family, their personal belongings, and so much more.

Envision is a company that basically does all of this with smart glasses. We have our AI glasses that a blind person wears like a regular pair of sunglasses. The glasses have a camera on them and a speaker located right next to your ear, and people simply talk to the glasses and get things done.

For example, at a restaurant they can read a menu by themselves. They don’t have to call the waiter or expect somebody else to do that for them. If they’re at a supermarket, they don’t need someone to help them with their shopping. Blind people can shop independently. If they’re at a train station, they can navigate, get to the right platform, know if the train is arriving, all of that with the help of our glasses.

James: That’s fantastic. I love that. What was your background before this? Where did you come from previously to founding Envision?

Karthik: I started Envision about eight years ago. I was super young. I think I was 23 when I started Envision, and my background is primarily software engineering and AI research. I was working with multiple companies in the very early days of AI. I was working in research and applied AI engineering about nine years ago.

I went to a blind school in India to talk to kids there about what a researcher does and what an engineer does. That conversation with the kids sparked an interest in trying to see how we could use AI to help people who are blind or low vision, because AI, even eight or nine years ago, was starting to get much better than humans at a lot of tasks.

I thought, this is a tool that doesn’t expect the world around us to change. When you use ChatGPT today to ask for a description of an image, you just take the picture, and it gives you a description. You don’t have to set the scene in a particular way, and even if the image is dark or blurred, the AI is intelligent enough to tell you. That was roughly what it was like eight years ago as well, more primitive, but still there.

I took a bet that this technology was going to improve, and that was the genesis of Envision.

James: You were a bit ahead of the time. I think you saw what was coming when other people hadn’t quite envisioned it yet.

Karthik: Yeah, it was crazy. Many people did not know about AI, or AI was this thing you only heard about in sci-fi novels, especially the kind of AI we see today. Honestly, even people who worked in the industry were caught off guard. I was really knee-deep in this; it was my bread and butter day in and day out. But I think the whole inflection point that happened with ChatGPT and generative AI caught even the people behind ChatGPT by surprise. That's the real story.

It’s a crazy time to be in the AI space right now. It’s absolutely amazing.

James: Definitely, and I want to come back to that a little later in the episode. There’s a whole discussion I want to have around conversational AI and how important that is to your product. But maybe we can start with looking at the wearable space itself.

Obviously wearables are a whole sector of the industry that has been advancing rapidly in the last five years. Do you want to tell us a little bit about what’s been happening in that space?

Karthik: Sure. When we first started building AI for blind or low vision people, we built an app called Envision AI, which was basically a Swiss army knife tool for people who are blind or low vision. It had a button where you could tap and it would read text, another button that would describe a scene for you, another that would detect objects in your environment. It was these discrete things that people could trigger.

Even back then, a lot of blind or low vision people felt we could improve the experience if, through some miracle, we could make all of this hands-free. For a blind person there is a cane in one hand, a guide dog in the other hand, and on top of that if you ask them to hold a phone and navigate like this, that’s another obstacle for them.

So we started looking into smart glasses in 2017. The wearables of that time, if you've seen the movie "RoboCop," looked exactly like that. You looked like RoboCop wearing them. They were bulky and they weren't very powerful.

We had to find a sweet spot: something that was not stigmatizing but still powerful. For a blind or low vision person, the moment they enter a room they are aware that people tend to look at them or that attention is drawn toward them. They don’t want to wear something that draws even more attention than necessary. So we had to find something powerful enough but with a design that was almost like a regular pair of glasses.

That’s when we encountered Google Glass. A lot of people know that Google put out Google Glass and discontinued it, but not many know they actually brought it back from the dead. In 2019 they brought back the Google Glass, and that’s what we put our first version of software on and called it the Envision Glasses.

But even those glasses, although they were powerful standalone glasses, we couldn’t get them for less than around $1,500. Then you add shipping, our own margins, software costs—the cost ballooned to about $2,500 to $3,000 for a pair of those glasses. For the longest time, for the last five years, that has been the story.

Over the last couple of years, as AI started to take off, more people in the industry started to realize that a wearable is the best form factor for AI. I know Zuckerberg is famous for saying that, but I’ve been yelling it in my own little circle for the last five years.

When you experience AI through a pair of glasses and realize you can have a conversation with it and have your hands free, it opens the door to so many more interesting applications and helps you a lot more.

That’s what’s happening right now. Smart glasses are becoming mainstream. You’ve got the Ray-Ban Meta glasses becoming popular, Google has announced their smart glasses as well, and there is an entire ecosystem of independent glasses manufacturers. I’ll talk about one of them in a bit as well.

All of them are starting to make more mainstream glasses because the technology is there, and the technology now has a very strong application with AI. It has a very strong partnership that was lacking before. A lot of people wondered, “What would I use smart glasses for?” but today with AI people are realizing, “This is what smart glasses have been waiting for.”

That’s the renaissance happening in the industry right now.

James: And now they’re truly smart. I’d love to talk more about the glasses manufacturer you mentioned. That is Solos, yeah?

Karthik: Yeah.

James: I’d love to hear about the partnership you formed with them with your Ally Glasses.

Karthik: We’re super excited about it. Ally is our AI assistant that is available today and already helping lots of people through smartphone apps and the web. It’s a really helpful assistant for blind or low vision people that understands their question and then picks the right tool to solve it.

Currently people are using it to read text, get descriptions of what’s around them, identify objects, and answer questions for all kinds of daily tasks. We thought, wouldn’t it be amazing if we could take Ally and give it to you in a hands-free way so you can take Ally everywhere with you?

We teamed up with Solos, a company based out of Hong Kong that makes really great smart glasses, and we partnered with them to create Ally Solos Glasses. They’re available for pre-order right now.

There’s no screen and no complicated buttons. You put on the glasses and talk to Ally like you would talk normally. These glasses are super lightweight—about 42 grams, around an ounce and a half. They come with a 10-hour battery life, HD cameras, and you can basically wear them all day.

They’re for people who are blind or low vision, and also for elderly adults who feel smartphones are too complicated, and for people with cognitive disabilities. So they have a much wider impact.

What’s also very interesting is that it’s a very powerful piece of technology at just $399. We were able to slash the price by almost 10x from what we had before.

James: Fantastic. I remember the price discussion we had last time you were on. It’s crazy how much you’ve managed to cut that price down. Very impressive.

Karthik: Yeah, it’s been incredible in terms of how we are able to capitalize on all these small improvements. These small things add up over the years and suddenly, before you know it, there is a big shift in the industry.

James: Yeah, I think it speaks a lot to how far technology has come and the fact that it’s reducing cost that much from what it was even three years ago.

Karthik: Yeah. I think it’s the impetus. When we approached Solos and said, “This is the right partnership, this is the right kind of tool that gives meaning to your glasses,” they really understood that.

This is just our first smart glasses platform. There are many other smart glasses that will come up in the next few years, and Ally will be on all of them. We are going to have smart glasses hopefully as prevalent as smartphones, if not more. It’s going to be a very common class of device that people carry around.

James: I want to talk a little bit about the app and the companion, because that can be very important to the full function of the glasses. Can you run us through what those are?

Karthik: The Ally app is the core companion to the Ally Solos Glasses. When people buy a pair of Ally Solos Glasses, they also get access to Ally.

Once they pair the glasses with the Ally app on their smartphone—it’s available on iOS and Android and also on the web—we’ll be bringing Ally to more places. There’s a WhatsApp client in the works, a Telegram client in the works, and so on. We’re going to have Ally across all platforms possible.

Once you put on your Ally Solos Glasses and pair them with the Ally app on your phone, you can simply initiate a call and have a conversation with Ally hands-free.

One of the nice things about Ally is that you can personalize it. We picked the name because it felt apt: you can create multiple allies, and each ally can have a different personality and access to different tools.

For example, my ally that I use outside of work is called Captain Jack. He has a very Jack Sparrow–like personality and has access to the web. I’ve got a more serious personality for work who I call Alfred, like Alfred from Batman. I use that on the desktop, and he has access to my calendar, my email, and all of these different things.

I can switch between these different allies, and when I use them with the glasses they have different personalities and abilities. You can change all of that in the Ally app as well. There is a lot you can do with the app on your phone, and when you combine it with the glasses it lets you use all those tools hands-free.

James: Wow. And when you say hands-free, for example if we’re talking about calling someone through WhatsApp, would that be completely voice-activated? You would just say “Call this person’s name,” and it would automatically go through the preferred app you’ve set?

Karthik: At the moment we don’t have direct connections to third-party apps, but it will work that way once we introduce that. You’ll have the option to open specific apps.

These glasses also behave like headphones, so you can use them when you’re having conversations outside of the Ally app as well. You can use them to pick up voice calls, or if you have them connected to Siri or Google Assistant you can respond to messages hands-free.

You can also listen to music on them. They’ve got these Whisper-technology earphones built in, so you can listen to music in HD. It’s perfectly possible to use these glasses as a regular pair of headphones and also with the Ally app to use AI functions hands-free.

James: That’s fantastic. I love that.

Can we talk a bit about some of the primary users? You’ve mentioned people who have low vision and elderly people. Is there anyone else that would benefit from these?

Karthik: Ally Solos Glasses and the app itself are built for people who are blind or low vision, because the app and the glasses are completely accessible. They work with screen readers on your phone, like VoiceOver or TalkBack. They also work with JAWS and other screen readers on your desktop.

Apart from blind and low vision folks, we have made Ally very usable for the elderly, for older adults who would like to use technology to help them but don’t necessarily want to set up accounts and do all the steps people often find difficult.

In fact, to use Ally you don’t even need to create an account. You can hit a button as soon as you install the app and start having a conversation with it.

That’s one of the nice things: because conversational AI has gotten so good, we can easily teach Ally to understand the user’s intent and pick the right tools to solve it.

So when you show Ally a piece of paper and say, “Can you scan this receipt and tell me how much I should pay?” or “Can you translate this?” or “Can you read what’s on this product packaging?” Ally understands that you’re looking to read something or get a description and picks the right tool.

It’s also for people who would love to use technology but find technology complicated in its current form.

We have also tested Ally to help people with cognitive disabilities. There are people with dyslexia and dementia who use Ally. It’s an accessibility or assistive tech tool they can use as well.

We used to build our products primarily for people who are blind or low vision, but because this technology has matured so much over the last couple of years, we can open it up to a much broader audience.

So if you have parents at home who you’d like to help get more done with AI, and you think AI can help them but they find things like ChatGPT too complicated, Ally Solos Glasses are the right tool for you.

James: That’s fantastic. I love how accessible it’s become. We had a guest on recently who talked about the fact that ease of use is sometimes overlooked by companies and can be a key failing. They might have the best product in the world, but if the average person can’t access it or operate it correctly, it could completely fail.

I love that you’ve gone in that direction so thoroughly—that it is about ease of use and about the person being able just to put something on and operate it.

Karthik: When we first got access to GPT-4 Vision, that was the model that really changed it all. Many people know about ChatGPT, but the original version ran on GPT-3.5. After that came GPT-4, which gave the AI stronger conversational abilities and, with the Vision variant, the ability to see.

When we first encountered that in 2023, it changed a lot for us. We realized that the whole idea of software with menus and buttons and layers of options was going to go away. Currently we are living in an age of user interface that is almost 30 or 40 years old. The whole windowed system came with the Mac and so on, and it has run its course.

Over the next few years we will be moving to a more fluid conversational interface. That was a wave we wanted to capitalize on.

Given that we have our hearts in accessibility, what excited us a lot was the fact that we could get rid of all these menus and layers. We could even get rid of the user manual. Ally does not come with a user manual. You just put it on and it walks you through what it can do. At any point you can ask Ally, “What are the things you can do?” and Ally will tell you.

We’ve reached the phase where the software can talk about itself and inform the user about what it can and cannot do. That makes it tremendously useful for older adults.

James: I love that.

I wanted to touch back on something we glossed over: you mentioned the Whisper earphones. Say someone is in a meeting wearing them; no one else would hear what the glasses are saying to that person, correct? How does that technology work exactly?

Karthik: I don’t know all the specifics, but I’ve tested this in quieter environments. Because of the way they position the speakers in your ears, they’ve developed a way of positioning them so that it offers more privacy.

If you’re in a room full of people, you can be the only person to hear this instead of everybody around you. We wanted to work with someone like that because Ally is also something people can use in their workspace. A lot of people use Envision’s products at work.

One of my ambitions is to get more blind and low vision people involved in the workplace and have them be just as productive as sighted people. That’s very empowering, and we wanted to pick the right tool to help us do that.

James: Definitely. I think a lot more people are going to be willing to use those features in the workspace if they know that no one else is going to hear them and they’re not going to disrupt anything.

So, conversational AI—coming back to that—what do you see as the next steps in that space? How do you see it advancing in the next few years?

Karthik: In the very short term, we’re going to get more reliability. Even though conversational AI tools are quite reliable today, there are points where they fail, and they don’t fail gracefully; they just fail. As someone building these tools, you have to put a lot of effort into making sure there are more graceful failure states.

Conversational AI is going to get really reliable. It’s going to get more accurate. I think we’ll move more into what people are calling omni-modal AI—AI that can understand text, audio, and video, and is capable of outputting text, audio, and video, all in real time.

Some models like ChatGPT and Gemini can do that today. I think that will be the norm in the near future.

In the next three to five years I definitely think embodied AI is going to become a thing, where the intelligence that used to sit on a server in a data center somewhere in the US or China moves into the real world. People are going to start building robots, household helpers, and so on.

We’ll see more mechanized labor coming into the workforce over the next three to five years and hopefully removing us from the drudgery of everyday tasks we don’t want to do.

From an accessibility point of view, it’s going to be amazing. It’s going to be the golden age for accessibility, because all these tools will have some impact on able-bodied humans, but they’ll have a tremendous impact—ten times more—on someone with a disability. That’s the future I’d love to be a part of.

James: I’ve spoken to a few people on the show about this, and it’s the speed of evolution in AI that has caught a lot of people off guard. If you look back even one or two years at what was available versus now, it’s a completely different playing field.

Karthik: Yeah. Even for somebody who’s really in the industry—in a small way shaping it and working in it—it still catches me off guard to see how far things are evolving.

I think that will continue for the next one or two years. Then there will come a point where the current technology will hit a ceiling and stabilize.

I encourage everybody around me to lean heavily on AI. My dad is 65 years old and I’m trying to get him to use as much AI in his work as possible. He’s seeing a lot of benefits from it.

I encourage people around me, even in the company, to use AI because I think that’s what the norm will be.

It’s like how we use the internet today. We don’t think it’s strange to have this conversation with you in one place and me in another, because we’re used to it. I think that’s where it will get with AI as well.

James: It’s the new standard. It’s like when the Microsoft Office suite came around originally—you were just expected to have some skills in that space and a basic understanding of how it operates. I think AI has definitely become the norm.

Karthik: Just being able to talk to AI and get things done is going to be a key skill.

I’ve been thinking about how my own communication skills have slightly improved since I learned to talk to AI. With AI you have to be so clear. You need a lot of clarity about what you’re looking for, and you need to express that clearly; otherwise you get gibberish output—garbage in, garbage out.

If you can articulate your thoughts well, you can get a lot done with the current state of technology.

Work is going to be about having natural conversations with an AI agent rather than doing arcane programming and all of that. That’s probably going to go away in the next few years.

James: What would you say to people who are afraid of AI and think it’s just there to take jobs?

Karthik: I keep reading this everywhere, and I believe in it too: if you’re a developer, you will not be replaced by AI, but you will be replaced by another developer who knows how to use AI. That makes a lot of sense to me.

Lately I’ve been reading books that talk about these inflection points of technology and how people reacted to them. It’s interesting how people had similar opinions about technologies that came before. People thought the printing press was going to ruin minds because people could suddenly access all these books and nobody had control over what was being written and published.

Similarly, when the internet came out, many people assumed their jobs would go away.

With AI, I definitely think you should embrace it. As a single person you can have the output of multiple people if you know how to use this correctly, and that is incredibly valuable for employers.

As a startup founder myself, we heavily prioritize people who know how to use AI and who are comfortable with it, because the output from such a person is going to be a lot more than from someone who does not.

We have developers in the company who do more work, not because they are putting in more hours, but because their hours count for more. They work the same number of hours or even fewer, but the output we see from a single developer is much higher than before.

It gives small companies like us a chance to compete with the big players.

Anyone who is afraid of AI—once they start using it and realize what the tool is about and how to use it effectively—they are going to be infinitely more valuable than they are today, to anyone who’s hiring and to themselves in terms of productivity.

James: The other thing is that it takes away the knowledge ceiling. If you were someone who struggled with research or finding information, that’s no longer an issue. All you have to do is ask the right questions now to get the answer you need.

Karthik: Exactly. It induces a certain clarity of thought and discipline. It’s fast-changing, but because it’s fast-changing, the amount of impact you can have in your workplace, your life, and society as an individual is going to be far more than before.

There’s a lot of talk in the startup industry about a one-person billion-dollar company with all these AI tools. I think that’s going to happen. Many things in Silicon Valley are an exaggeration, but I think this is not. At some point in the coming decade, because these tools will help automate a lot of work, it’s possible.

If not one person, maybe ten people. Today you can have a $100 million company with ten people in it. It’s not unimaginable that in the same way, ten people could do a billion dollars’ worth of business.

James: That’s very exciting. I think that brings us to time. Is there anything we should be looking out for from your company in the near future? Anything we should be expecting?

Karthik: We’re going to have Ally come onto more glasses. Right now we’re starting off with the Ally Solos Glasses, and that’s something I’m very excited about because I believe this is the moment we’ve been waiting for.

I often talk in the company about how we are like people on surfboards paddling for eight years into the ocean to catch the right wave, and this is the right wave.

Anyone who wants to use AI effectively can do so with the Ally Solos Glasses. I’m quite excited about that.

People can expect affordable AI at scale to help blind or low vision people, the elderly, and people with cognitive disabilities live more independently. That’s what we’re trying to do: independence at scale.

James: Amazing vision. If people want to keep up and view the product, what are the best places to do so?

Karthik: They can visit our website, ally.me. They can also check out the Ally Solos Glasses at ally.me/glasses. We do weekly webinars, we’ve got demo videos, and if they like it they can pre-order from the website.

James: Fantastic. And social media?

Karthik: On social media you can follow us @ally.me on all platforms. We’re on Twitter, Facebook, and others with the handle ally.me.

James: Well, thank you so much for coming on again. We’re going to have you back on in another few years to see where you’re at then. I’m sure there will be some amazing advancements. I appreciate your time; it was a great conversation.

Karthik: Thank you so much, James. I really appreciate you having me here as well.

About Author

James Sweetlove is the Social Media Manager for Altium, where he manages all social accounts and paid social advertising for Altium as well as the Octopart and Nexar brands, and hosts the CTRL+Listen Podcast series. James comes from a background in government, having worked as a commercial and legislative analyst in Australia before moving to the US and shifting into the digital marketing sector in 2020. He holds a bachelor’s degree in Anthropology and History from USQ (Australia) and a postgraduate degree in political science from the University of Otago (New Zealand). Outside of Altium, James manages a successful website, podcast, and non-profit record label, and lives in San Diego, California.
