Date: Mar 01, 2017 | Source: Nanalyze
When thinking about the potential of emerging technologies, it's often good to try to visualize the most extreme endpoint you can think of, where the technology is fully matured. We've done this before when we thought about what real-time virtual reality might look like, or when we thought about what a fully functioning autonomous taxi company would need to operate. Now we'd like to do the same for augmented reality (AR), a technology that may present one of the biggest investing opportunities ever.
With augmented reality, it all comes down to the hardware. We saw that Google Glass had problems with adoption, mainly because people are uncomfortable with the idea of others walking around possibly recording everything. We know that there are at least 13 other pieces of hardware being developed that all promise to spur the adoption of augmented reality. We need to think beyond glasses though. We need to think about what augmented reality hardware will look like at the peak of its maturity. Ideally we can do away with glasses entirely and go right to smart contact lenses that look something like this:
Source: Popular Science
The idea of smart contact lenses isn't as far away as you might think. The first problem that crops up is how exactly to power the electronics in a set of "smart" contact lenses. As it turns out, we can harvest kinetic energy: every time the eye blinks, we get a little bit of power (a rough back-of-envelope estimate follows the list below). With the power problem addressed, here are several applications we can think of, ordered from easiest to hardest:
Level 1 - Multifocal contact lenses like these from Visioneering Technologies, Inc. (VTI), or lenses that correct color blindness like these smart contact lenses called Colormax
Level 2 - Gathering information from your body - like glucose monitoring for diabetics
Level 3 - Augmenting your vision with digital overlay
Level 4 - Complete virtual reality (not sure if this is possible based on the eye symmetry but we can dream a dream)
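To get a feel for just how little energy blinking provides, here's the rough back-of-envelope estimate mentioned above. Every input value (eyelid mass, blink speed, blink rate, harvester efficiency) is an illustrative assumption rather than a measured figure, so treat the result as an order-of-magnitude sketch only.

```python
# Back-of-envelope estimate of power harvested from blinking.
# All input values below are rough assumptions for illustration only.

eyelid_mass_kg = 0.001        # assume ~1 gram of moving tissue
blink_speed_m_s = 0.1         # assume the lid moves ~10 cm/s at peak
blinks_per_minute = 15        # typical resting blink rate
harvester_efficiency = 0.05   # assume 5% of mechanical energy is captured

# Kinetic energy of one blink: E = 1/2 * m * v^2
energy_per_blink_joules = 0.5 * eyelid_mass_kg * blink_speed_m_s ** 2

# Average harvested power = energy per blink * blinks per second * efficiency
harvested_power_watts = (
    energy_per_blink_joules * (blinks_per_minute / 60) * harvester_efficiency
)

print(f"Energy per blink: {energy_per_blink_joules:.2e} J")
print(f"Harvested power:  {harvested_power_watts:.2e} W")
# With these assumptions the budget lands in the nanowatt range,
# which is why any on-lens electronics would have to be extremely low power.
```

The exact numbers don't matter much; the point is that whatever ends up on a lens has to sip power, which shapes what each of the levels above can realistically do.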
So when we ask the question "how far away are we from having smart contact lenses?", the answer isn't that simple. Level 1 we have already achieved. Now let's look at companies that are addressing Level 2 smart contact lenses, which gather information about the human body using an approach we like to call "EYE-O-T".
The most notable player in the smart contact lens game is Alphabet (Google's parent company). Just last month, its life sciences division, Verily, took in an $800 million investment. Navigating to their website shows a plethora of projects they're working on, one of which is a glucose-sensing "smart lens" for diabetics:
Verily had been working with Novartis to test the product on people last year, but that testing didn't happen, which led to speculation that the technology may not even be feasible. If you're at all interested in some controversy around the topic of "smart contact lenses", then give this article from STAT a read. Here's a quote, along with a really cool term you can throw around now, "slideware":
But a former Verily manager recently called the lens "slideware" -- a Silicon Valley term for breakthroughs that exist only on PowerPoint images. The company indeed produced a prototype, but it didn't work, the former manager told STAT.
Why didn't it work? Some experts say it's because you cannot measure blood glucose levels from tears. We call that a "showstopper". Here's another excerpt from the very interesting STAT investigation:
Smith has evaluated more than 30 "noninvasive" technologies that measure glucose from sweat, saliva, and tears. "I saw people working on this, and time after time after time, failing in the same ways or in entirely new ones," he said in an interview. They all faced a problem no technical advance can overcome, Smith said. None of those fluids offers glucose readings that reflect the levels of glucose in blood.
The Smith referenced above is a chemist and former chief scientific officer of the LifeScan division of Johnson & Johnson, so that's definitely subject-matter-expert material. This makes us wonder just what data you can gather from the human body using smart contact lenses. Verily isn't the only company trying to answer that question.
Founded in 2013, Canadian startup Medella Health took in a first round of $1.4 million in the summer of last year to fund a 15-person team that wants to build the same glucose monitoring solution as Verily. They claim their smart contact lens will cost roughly $25 per lens to create, whereas, based on Google's public disclosures, Verily's will cost somewhere between $200 and $300 per lens. The CEO, Harry Gandhi, is a Thiel Fellow, which means that Peter Thiel gave him $100,000 to drop out of college and run with this idea. According to an interview last year with Communitech News, Mr. Gandhi stated that the company plans to have their smart contact lens ready for testing around November of this year.
To conclude, it's still debatable whether Level 2 smart contact lenses are technically feasible for glucose monitoring, and it remains unclear what else they might be able to monitor.
Moving on to Level 3 (AR) and Level 4 (VR) smart contact lenses, we're running into a few problems understanding how the hardware side of this technology might work. In the case of augmented reality, you have two types: location-based and image-based. Location-based AR can navigate with GPS, which is how it knows to display things like street directions or a map. Image-based AR needs a video camera, because it has to "see" whatever it is overlaying digital content onto. There's no way you could get that sort of hardware onto a contact lens - could you? Maybe you could place it in a pendant that the user wears around their neck. Or in eyeglasses?
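To make that distinction concrete, here's a minimal sketch of how the two modes of AR decide what to draw. All the names, data structures, and thresholds are hypothetical, and a real image-based pipeline would run actual computer vision rather than the placeholder brightness check used here.

```python
# Minimal sketch contrasting location-based and image-based AR.
# All names, data structures, and thresholds here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Overlay:
    label: str
    x: float  # normalized screen position, 0..1
    y: float

def location_based_overlays(lat: float, lon: float, heading_deg: float) -> list[Overlay]:
    """Location-based AR: GPS plus a compass decide what to draw (directions, map pins)."""
    # A real app would query a map service; we hard-code one point of interest
    # and ignore longitude in this deliberately crude "is it in front of me?" check.
    poi = {"label": "Coffee shop", "lat": 47.6101}
    facing_north = heading_deg < 45 or heading_deg > 315
    if facing_north and poi["lat"] > lat:
        return [Overlay(poi["label"], x=0.5, y=0.3)]
    return []

def image_based_overlays(camera_frame: list[list[int]]) -> list[Overlay]:
    """Image-based AR: the camera frame itself is analyzed to find what to augment."""
    # Placeholder for marker/feature detection -- a real system would run computer vision here.
    bright_pixels = sum(pixel > 200 for row in camera_frame for pixel in row)
    if bright_pixels > 10:  # pretend a bright patch is a detected marker
        return [Overlay("Detected marker", x=0.5, y=0.5)]
    return []

# The location path needs only GPS and a compass; the image path needs a camera,
# which is exactly the hardware that is hard to fit onto a contact lens.
print(location_based_overlays(47.6062, -122.3321, heading_deg=10))
print(image_based_overlays([[250] * 5 for _ in range(5)]))
```

The takeaway is that location-based AR only needs positioning data, while image-based AR needs a camera feed, and that camera is the piece that doesn't obviously shrink onto a lens.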
One startup developing eyeglasses that project an image onto smart contact lenses is Innovega, a Bellevue, Washington startup which just took in a $3 million seed funding round from Chinese company Tencent. Here's a look at their hardware:
Note that they'll need to get FDA approval before they can sell the solution because it will support prescription eyewear, and obviously Tencent thinks that's going to happen. eMACULA has entered the regulatory approval process and expects to receive market clearance by early 2018. The contact lenses are expected to cost about 20% more than regular disposable contacts, and the glasses will cost a little more than regular designer spectacles and sunglasses.
When we move from augmented reality to virtual reality, we don't necessarily need that camera anymore, but we will need motion sensors. You'll need them not only to know where the user's gaze is in the virtual world, but also for foveated rendering, a technique that makes VR less computationally intensive by tracking where the eyeball is looking. How are you going to do that without an external piece of hardware? Can motion sensors actually be shrunk small enough to fit on a pair of contact lenses? Nanotechnology can probably make that happen - eventually. Until then, it looks like the Innovega "smart contact lenses" with "smart glasses" solution is going to be the most viable for AR/VR applications.
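As a rough illustration of why gaze tracking matters so much for foveated rendering, here's a toy sketch that spends full rendering detail only in a small region around the gaze point and coarser detail everywhere else. The grid size, foveal radius, and per-tile "cost" units are arbitrary assumptions, not figures from any real headset.

```python
# Toy illustration of foveated rendering: spend full detail only near the gaze point.
# Grid size, foveal radius, and "cost" units are arbitrary assumptions.

WIDTH, HEIGHT = 64, 36   # pretend framebuffer, measured in tiles
FOVEAL_RADIUS = 6        # tiles rendered at full resolution around the gaze point

def render_cost(gaze_x: int, gaze_y: int) -> tuple[int, int]:
    """Return (foveated cost, uniform full-detail cost) in arbitrary shading units."""
    foveated = 0
    for y in range(HEIGHT):
        for x in range(WIDTH):
            dist = ((x - gaze_x) ** 2 + (y - gaze_y) ** 2) ** 0.5
            # Full detail costs 4 units per tile, peripheral tiles cost 1 unit.
            foveated += 4 if dist <= FOVEAL_RADIUS else 1
    uniform = 4 * WIDTH * HEIGHT
    return foveated, uniform

foveated, uniform = render_cost(gaze_x=32, gaze_y=18)
print(f"Foveated cost: {foveated}, uniform cost: {uniform}")
# The savings depend entirely on knowing where the eye is looking,
# which is why some form of gaze/motion sensing is unavoidable for VR.
```

With these made-up numbers the foveated pass costs roughly a quarter of the uniform pass, and the whole trick collapses if the system can't tell where the eye is pointed, hence the need for sensors somewhere, whether on the lens or in an external frame.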
As for "Level 2" where the smart contact lenses read vital information from the human body, like glucose monitoring, the jury is still out as to whether or not that will work. We'll just have to continue waiting.
Looking to buy shares in companies before they IPO? A company called Motif Investing lets you buy shares in pre-IPO offerings led by JP Morgan. You can open an account with Motif with no deposit required so that you're ready to buy pre-IPO shares when they are offered.