Will Lasers Save Intel’s Google Glass Clones From Sucking?


Intel thinks it’s figured it out. The company behind the CPUs in most of our computers has built a pair of “smart glasses” that could do what Google Glass could not, and what Magic Leap desperately wants to: put a display in front of your eye while looking like something you might actually wear.

Well, not necessarily in public: Intel’s Vaunt glasses look more like the safety glasses people wear around heavy machinery than the corrective ones currently resting on my nose. But these new glasses shoot lasers into your eyes to deliver the information that Google needed a big, ugly arm (and Magic Leap needs a full set of goggles) to transmit. While that’s tremendously exciting, after reading The Verge’s big story on the Intel Vaunt glasses, I don’t know if it’s enough.

Let’s be clear: I want a personal PC that filters information directly into my eyeball. I want to look at a restaurant and know if the food will be good, or shake someone’s hand and instantly know their name. I want to be able to know who’s calling with a single glance to the side, and to check my Twitter feed with a flicker of my eye. I am the arsehole gleefully excited for my cyberpunk future, but I also don’t want to see people screw it up again, because Google Glass bungled the job so badly it set personal HUDs back at least half a decade.

Intel’s take appears to be a supremely conservative one. Instead of a multicolour display that augments reality, the Vaunt glasses reportedly shine a small red display into your eyeball using a vertical-cavity surface-emitting laser (VCSEL): the kind of laser found in your printer, your mouse, and the dot projector on the iPhone X. According to Intel, the version it uses is low-powered enough to cause zero damage to your retinas.

Which is great, because you don’t want to risk blindness for the relatively small amount of data the Vaunt glasses would provide. They only give you the kinds of notifications you’re accustomed to getting from your smartwatch or the lockscreen of your phone. Here’s a screengrab from The Verge’s video that gives you an idea of what it would look like.

This is supposed to show the laser’s image “painted” onto your retina, alongside how the text would appear to your eye while actually looking at it. (Screenshot: YouTube/The Verge)

That’s definitely not Magic Leap levels of AR—that’s not even the same level of AR as found on your smartphone. It’s more like the stuff Pontiac rolled out in some of its cars more than a decade ago. Because it’s so simple, Intel can get away with packing it into a much smaller package. So if you wear the Vaunt glasses, most people wouldn’t immediately know you had a HUD strapped to your face.

That’s a critical first step in HUD adoption. Despite what stuff like Netflix’s Altered Carbon promises, no one actually wants to look like a big nerd as they interact with invisible computers. But Intel’s conservatism could harm it if something like Magic Leap takes off. The Vaunt glasses’ low-resolution (400×150 pixels) monochrome display may disappear from view when not in use and require no adjustment for people with bad eyesight, but it also doesn’t have any way of gathering data beyond what it can pull from your phone. So it can’t perceive the world around it and adjust accordingly. You won’t be able to look at a plate of food and have the glasses ID the pasta type.

You won’t even, necessarily, be able to interact with the glasses except through your phone. According to The Verge, how one interacts with the glasses is still up in the air. The current test models made available to The Verge had a compass and accelerometer built in, but nothing else.

However, Itai Vonshak, head of products for Intel’s New Devices Group, did give an example of using the glasses with Alexa, sort of like the glasses Vuzix was showing off at CES this year. That strongly suggests a microphone and speaker could be added at a later date.

But is shouting at one’s glasses really the future of interaction with wearable tech? Will we really talk to Alexa as we ride the train or saunter through the mall? Or will we use a controller, as Magic Leap suggests? Or gestures, as with Microsoft’s HoloLens?

Intel’s Vaunt seems to take us a step closer to personal HUDs, but the biggest question still remains: How the heck are we supposed to interact with these computers of the future? “We really believe that it can’t have any social cost,” Vonshak told The Verge. “So if it’s weird, if you look geeky, if you’re tapping and fiddling—then we’ve lost.”

Too bad Vonshak hasn’t explained how Intel will “win.” The personal computer didn’t become common until the mouse. The smartphone didn’t grow popular until Apple came up with pinch-to-zoom. It’s not enough to build new tech; one has to work out how we interact with that tech in the most natural way possible. According to The Verge, Intel still doesn’t know what that interaction will look like.

One can only hope Intel figures it out soon. The Vaunt glasses will be made available to developers later this year and will work with Android and iOS devices. The only thing potential users will need is their pupillary distance, which anyone with eyeglasses will already have on hand from their optician. People with perfect vision will probably need to make an appointment.