Space Glasses

Wearable Technology, for your face

Once computers got small enough that “wearable technology” was a thing we could talk about with a straight face, glasses were an obvious form factor. Eyeglasses were already the world’s oldest wearable technology! But glasses are tricky. For starters, they’re small. But also, they already work great at what they do; they’re a nearly peerless piece of accessibility technology. They last for years, work on all kinds of faces, work in essentially any environment you can think of, and can seamlessly treat any number of conditions simultaneously. It’s not immediately obvious what value there is in adding electricity and computers. My glasses already work great, why should I need to charge them, exactly? Plus, if you need glasses, you need them. I can drive home if my watch crashes; I can’t go anywhere if my glasses break.

There’s a bit in The Hitchhiker’s Guide to the Galaxy, which has sort of lost its context now, about how goofy digital watches were, considering they didn’t do anything that clockwork watches couldn’t do except “need new batteries.” Digital glasses have that problem, but more so.

So instead smartphones happened, and then smart watches.

But still, any number of companies have tried to sell you a computer you strap to your head and over your eyes. Mostly, these exist on an axis between 3D headsets, a form factor that mostly froze somewhere around the Virtual Boy in the mid-90s, and Google Glass, which sounded amazing if you never saw or wore one. Now it looks like Apple is “finally” going to lift the curtain on their version of a VR/AR glasses headset.

A couple of lifetimes ago, I worked with smart glasses. Specifically, I was on the team that shipped Level Smart Glasses, along with a bunch of much more interesting stuff that was never released. For a while, I was a major insurance company’s “Lead Engineer for Smart Glasses”. (“Hey, what can I tell ya? It was the best of times, it was the worst of times. Truthfully, I don’t think about those guys that much since all that stuff went down.”)

I spent a lot of time thinking about what a computer inside your glasses could do. The terminology slid around a lot. “Smart Glasses.” “Wearable Tech.” “Digital Eyewear.” “Smart Systems.” “VR headsets.” “Reality Goggles.”

I needed a name that encompassed the whole universe of head-mounted wearable computing devices. I called them Space Glasses. Internally at least, the name stuck.

Let me tell you about Space Glasses.

Let’s Recap

Traditionally there have been two approaches to a head-mounted computer.

First, you have the VR Headset. This broke out into the mainstream in the mid-90s with products like Nintendo’s Virtual Boy, but also all those “VR movies” (Johnny Mnemonic, Disclosure, Lawnmower Man, Virtuosity) and a whole host of game initiatives lost to time. (Who else remembers that System Shock had a VR mode? Or Magic Carpet?)

On the other hand, you have the Heads-Up Display, which from a pop-culture perspective goes back to mid-80s movies like The Terminator or RoboCop, and maybe all the way back to razorgirl Molly in Neuromancer. These stayed fictional while the VR goggles thrashed around. And then Google Glass happened.

Google Glass was a fantastic pitch followed up by a genuinely terrible product. I was at CES a couple years back, and there was an entire cottage industry of people trying to ship a product that matched the original marketing for Glass.

Glass managed to be the best and worst thing that could have happened to the industry. It demonstrated that such a thing was possible, but did it in a way that massively repulsed most of the population.

My Glass story goes like this: I was at a convention somewhere in the greater Silicon Valley area, probably the late lamented O’Reilly Velocity. I’m getting coffee before the keynote. It’s the usual scrum of folks milling around a convention center lobby, up too early, making small talk with strangers. And there’s this guy. Very valley software engineer type, pasty, button-down shirt. Bit big, a real husky guy. And he’s staring at me. Right at me, eyes drilling in. He’s got this look. This look. I have no idea who he is. I look up, make eye contact. He keeps staring with that expression. And for a split second, I think, “Well, huh, I guess I’m about to get into a fistfight at a convention.” Because everything about this guy’s expression says he’s about to take a swing. Then he reaches up and taps his Google Glass. And I realize that he had no idea I was there; he was reading email. And that’s when I knew that product was doomed. Because pulling out your phone and staring at it serves as an incredibly valuable social signal that you’re using a device. With a seamless heads-up display like Glass, there was no way to tell when you were reading Twitter as opposed to staring down a stranger.

Which is a big part of why everyone wearing them became glassholes.

Plus, you looked like a massive, unredeemable dork. To misquote a former boss of mine, no product is going to work if it makes it harder for you to get laid, and Glass was the most effective form of birth control known to lifekind.

Underreported between the nuclear-level dorkiness and the massive privacy concerns was the fact that Glass was incredibly uncomfortable to wear for more than a couple of minutes at a time.

Despite that, the original Glass pitch is compelling, and there’s clearly a desire to find an incarnation of the idea that doesn’t set off the social immune system.

Glass and better-made Virtual Boys aren’t the only ways to go, though.

Spectrums of Possibilities

There are a lot of ways to mount a microprocessor to someone’s head. I thought of all the existing space glasses form factors operating on two main orthogonal axes, or spectrums. I’ll spare you the 2x2 consultant chart, and just describe them:

  • With a screen, or without. There are plenty of other sensors or ways to share information with the wearer, but “does it have a screen or heads-up-display” is a key differentiator.
  • All Day wear vs Single Task wear. Do you wear them all the time, like prescription spectacles, or do you put them on for a specific time and reason, like sunglasses?

There are also two lesser dimensions I mention for completeness:

  • Headset-style design vs “normal” glasses design. This is more a factor of the current state of miniaturization than a real design choice. Big headsets are big only because they can’t fit all that hardware in a package that looks like a pair of Ray-Ban Wayfarers. Yet. You can bet that the second the PS VR can look like the Blues Brothers’ sunglasses, it will.
  • VR vs AR. If you have a screen, does the picture replace the real world completely, or merge with it? While this is a pretty major difference now—think VR headset vs Google Glass—like the above, this is clearly a quirk of an immature technology. It won’t take long before any mature product can do both, and swap between them seamlessly.

What do we use them for, though?

This is all well and good, but what are the use cases, really?

On the “no screen” side of the house: not much. Those are, fundamentally, regular “dumb” non-electric glasses. Head-mounted sensors are interesting, but not interesting enough on their own to remember to charge another device. People did some interesting things using sound instead of vision (Bose, for example), but ultimately, the correct form factor for an audio augmented reality device is AirPods.

Head-mounted sensors, on their own, are interesting. You get very different, and much cleaner, data than from a watch or a phone in a pocket, mostly because you have a couple million years of biological stabilization working for you, instead of against you. Plus, they’re open to the air, they have the same “sight-lines” as the operator, and they have direct skin contact.

But not interesting enough to get someone to plug their glasses in every night.

With a screen, then, or some kind of heads-up display.

For all-day wear, it’s hard to imagine something compelling enough to be successful. Folks who need prescriptions have already hired their glasses to do something very specific, and folks who don’t need corrective eyewear will, rounding to the nearest significant digit, never wear spectacles all day if they don’t need to.

Some kind of heads-up display is, again, sort of interesting, but does anyone really want their number of unread emails hovering in their peripheral vision at all times?

I saw a very cool demo once where the goggles used the video camera, some face recognition technology, and a database to essentially overlay people’s business cards—their name & title—under their faces. “Great for people who can’t remember names!” And, like, that’s a cool demo, and great you could pull that off, but buddy, I think you might be mistaking your own social anxiety for a product market just a little bit. And man, if you think you’re awkward at social events when you can’t remember someone’s name, I hate to break it to you, but reading their name off your cyber goggles is not going to help things.

For task-based wear, the obvious use remains games. Games, and game-like “experiences”—see what this couch looks like in your own living room, and the like. There are some interesting cases around 3D design, being able to interact with an object under design as if it were really there.

So, essentially, we’ve landed on VR goggles, which have been sputtering right on the edge of success for close to 30 years now, assuming we only start counting with the Virtual Boy.

There are currently at least three flavors of game-focused headwear—Meta’s Quest (the artist formerly known as the Oculus), Sony’s PlayStation VR, and Valve’s Index. Nearby, you have things like Microsoft’s HoloLens and Magic Leap, which are the same thing but “For Business,” and a host of similar devices I’m forgetting. (Google Cardboard! Nintendo Labo VR!)

But, fundamentally, these are all the same—strap some screens directly to your eyes and complete a task.

And that’s a pretty decent model! VR goggles are fun, certainly in short bursts. Superhot VR is a great game!

Let’s briefly recap the still-unsolved challenges.

First, they’re all heavy, uncomfortable, and expensive. These are the sort of problems that Moore’s Law and economies of scale will solve, assuming people keep pouring money in, so we can largely write those off.

Second, you look like a dork when you wear these. In addition to having half a robot face, reacting to things no one else can see looks deeply, deeply silly. There is no less-attractive person than a person playing a VR game.

Which brings us to the third, and hardest problem: VR goggles as they exist today are fundamentally isolating.

An insufficiently acknowledged truth is that, at their core, computers and their derivatives are fundamentally social devices. Despite the pop-culture archetype of the lone hacker, people are constantly waving others over to look at what’s on their screen, passing their phone around, trading the controller back and forth. Console games might be “single player,” but they’re rarely played by one person.

VR goggles deeply break this. When someone has the headwear on, you can’t drop in and look over their shoulder, easily pass the controller back and forth, or have a casual game night.

Four friends on a couch playing split screen Mario Kart is a very, very different game than four friends each with a headset strapped over their eyes.

Not an unsolvable set of problems, but space glasses that don’t solve for these will never break out past a niche market.

AR helps this a lot. The most compelling use for AR to date is still Pokemon Go, using the phone’s camera to show Pokemon out in the real world. Pokemon Go was a deeply social activity at its peak, nearly sidestepping all the isolating qualities AR/VR tends to have.

Where do they fit?

At this point, it’s probably worth stepping back and looking at a slightly bigger picture. What role do space glasses fill, or fill better than the other computing technology we have?

Everyone likes to compare the introduction of new products to the smartphone, but that isn’t a terribly useful comparison; the big breakthrough there was to realize that it was possible to demote “making phone calls” to an app instead of a whole device, and then make a computer with that app on it small enough to hold in your hand.

The watch is a better example. Wristwatches are, fundamentally, information radiators. Classic clockwork based watches radiated a small set of information all the time. The breakthrough was to take that idea and run with it, and use the smart part of smart watches to radiate more and different kinds of information. Then, as a bonus, pack some extra human-facing sensors in there. Largely, anything that tried to expand the watch past an information radiator has not gone so well, but adding new kinds of information has.

What about glasses, then? Regular eyeglasses help you see things you couldn’t otherwise see. In the case of prescription glasses, they bring things into focus. Sunglasses help you see in environments you otherwise couldn’t. Successful smart glasses will take this and run with it, adding more and different things you can see.

Grasping towards Conclusions

Which all (conveniently) leads us to what I think is the best theoretical model for space glasses—Tony Stark’s sunglasses.

They essentially solve for all of the above problems. They look good—ostentatious but not unattractive. It’s obvious when he’s using them. While on, they offer the wearer an unobstructed view of the world with a detailed display overlaid. Voice controlled.

And, most critically, they’re presented as an interface to a “larger” computer somewhere else—in the cloud, or back at HQ. They’re a terminal. They don’t replace the computer, they replace the monitor.

And that’s where we sit today. Some expensive game hardware, and a bunch of other startups and prototypes. What’s next?

Space Glasses, Apple Style

What, then, about Apple?

From the rumor mill, it seems clear that they had multiple form factors in play over the course of their headset project, but they seem to have settled on the larger VR goggles/headset style that most everyone else has also landed on.

It also seems clear that this has been in the works for a while, with various hints and seemingly imminent announcements. Personally, I was convinced that this was going to be announced in 2020, and there were a bunch of talks at WWDC that year that seemed to have an empty space where “and you can do this on the goggles!” was supposed to go.

And of course that tracks with the rumor that Apple was all-in on a VR headset, which then got shot down by Jony Ive, and they pivoted to AR. Which jibes with the fact that Apple made a big developer play into AR/VR back in 2017, and then just kinda... let it sit. And now Ive is out and they seem to be back to a headset?

What will they be able to do?

Famously, Apple never tells people what's coming... but they do often send signals out to the developer community so it can get ready ahead of time. (The definitive example was the year they rolled out the ability for iOS apps to support multiple screen sizes six months before they shipped a second size of phone.)

So. Some signals from over the last couple of years that seem to be hinting at what their space glasses can do. (In the parlance of our times, it's time for some Apple glasses kremlinology game theory!)

ARKit's location detection. ARKit can now use a combination of the camera, Apple Maps data, and the iPad's LIDAR to get a crazy-accurate physical location in real space. There's no reason to get hyper-accurate device location for an iPad. But for a head-mounted display, with a HUD...?

Not to mention some very accurate People Occlusion and body detection in AR video.

RealityKit, meanwhile, has some insane AR composition tools, which also leverage the LIDAR camera from the iPad, and can render essentially photo-real objects into the “real world.”

Meanwhile, there are some really interesting features on the AirPods, like spatial audio in AirPods Pro. Spatial audio has been out for a while now, and seems like the sort of thing you try once and then forget about? A cool demo. But it seems like a way better idea if, when you turn your head, you can also see what’s making the sounds.
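The world-anchoring trick at the heart of spatial audio is simple enough to sketch. This is a toy model, not Apple’s implementation—the function names and the equal-power panning are my own stand-ins—but it shows the core idea: a sound stays put in the world, so its apparent direction is just the source angle minus your head’s yaw.

```swift
import Foundation

// Apparent direction of a world-anchored sound, relative to the
// listener's nose. Angles in radians; 0 = straight ahead,
// positive = to the listener's left.
func relativeAngle(source: Double, headYaw: Double) -> Double {
    var a = (source - headYaw).truncatingRemainder(dividingBy: 2 * .pi)
    if a > .pi { a -= 2 * .pi }
    if a < -.pi { a += 2 * .pi }
    return a
}

// Equal-power stereo panning from that relative angle: straight
// ahead gives equal gains; off to one side favors that ear.
func stereoGains(relative: Double) -> (left: Double, right: Double) {
    // Map [-π/2, π/2] onto a pan position, clamping sounds behind you.
    let pan = max(-1.0, min(1.0, relative / (.pi / 2)))
    let theta = (pan + 1) * .pi / 4     // 0 … π/2
    return (left: sin(theta), right: cos(theta))
}

// A source 90° to the listener's left: nearly all signal in the left ear.
let gains = stereoGains(relative: relativeAngle(source: .pi / 2, headYaw: 0))
// left ≈ 1.0, right ≈ 0.0
```

Turn your head to face the source and the gains re-balance to equal—which is exactly the “the concert stays where it is” effect AirPods sell.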

Opening up the AirPods API: "AirPods Pro Motion API provides developers with access to orientation, user acceleration, and rotational rates for AirPods Pro — ideal for fitness apps, games, and more." Did anyone make apps for AirPods? But as a basic API for head-tracking?
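To give a flavor of what that head-tracking data looks like: on Apple platforms, CMHeadphoneMotionManager reports the head’s attitude as a quaternion, and the first thing a HUD or spatial-audio layer would do with it is pull out a yaw angle. Here’s a sketch of that math—the Quaternion struct is a hypothetical stand-in for CMQuaternion so the snippet is self-contained.

```swift
import Foundation

// A minimal stand-in for CMQuaternion (the real value comes from
// CMHeadphoneMotionManager's deviceMotion.attitude.quaternion).
struct Quaternion {
    var x, y, z, w: Double
}

// Extract yaw (rotation about the vertical axis) from an attitude
// quaternion — the "which way is the head pointing" number.
func headYaw(_ q: Quaternion) -> Double {
    atan2(2 * (q.w * q.z + q.x * q.y),
          1 - 2 * (q.y * q.y + q.z * q.z))
}

// A head turned 90° about the vertical axis: rotation of π/2 about z.
let quarterTurn = Quaternion(x: 0, y: 0, z: sin(.pi / 4), w: cos(.pi / 4))
print(headYaw(quarterTurn)) // ≈ π/2 (≈ 1.5708)
```

Strip away the fitness-app framing and that’s a head-tracking API, sitting in everyone’s ears, waiting for something to point at.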

Widgets! A few versions back, Apple rolled out a way to do Konfabulator-esque (or, if you’d rather, Android-style) widgets for the iOS home screen. There are some strong indications that these came out of the Apple Watch team (codenamed Chrono, built around SwiftUI) and may have been intended as a framework for custom watch faces. But! A lightweight way to take a slice of an app and “project” a minimal UI as part of a larger screen? That’s perfect for a glasses-based HUD. I can easily see Apple allowing iOS widgets to run on the glasses with no extra modifications on top of what the developer had to do to get them running on the home screen. Day 1 of the product and you have a whole App Store full of ready-to-go HUD components.

App Clips! On the one hand, it’s “QR codes, but by Apple!” On the other hand, what we have here is a way to load up an entire app experience just by looking at a picture. Seems invaluable for a HUD+camera form factor? Especially a headset with a strong AR component—looking at elements in AR space downloads new features?

Hand and pose tracking. As part of the greater ML/Vision frameworks, they rolled out crazy-accurate hand tracking, using their on-device ML. Check out the demo at 6:40 in this developer talk.

Which is pretty cool on its own, except they ALSO rolled out:

Handwriting detection. Scribble is the new-and-improved iPad+Pencil handwriting detector, and there’s room for a whole bunch of Newton jokes here. But mixed with the hand tracking? That’s a terribly compelling interaction paradigm for a HUD-based device. Just write in the air in front of you, and the space glasses turn that into text on the fly.

And related, iOS 14 added ML detection and real time translation of sign language. (?!)

Finally, there's a strong case to be made that the visual overhaul they gave macOS 11 and iOS 14 is about making them more “AR-friendly,” which would be right about the last time the goggles were rumored to be close to shipping.

In short, this points to a device:

  1. Extremely aware of its location in physical space, more so than just GPS, via both LIDAR and vision processing.
  2. Able to project UI from phone apps onto a HUD.
  3. Able to download new apps by looking at a visual code.
  4. Hand tracking and handwriting recognition as a primary input paradigm.
  5. Spatial audio.
  6. Able to render near-photoreal "things" onto a HUD blended with their environment.
  7. Able to do real-time translation of languages, including sign language.

From a developer’s perspective, this seems likely to operate like the Watch did at first: tethered to a phone, which drives most of the processing and projects the UI elements onto the glasses’ screen.

What are they For?

What they can do is all well and good, but what’s the pitch? Those are all features, or parts of features. Speeds and feeds, which isn’t Apple’s style. What will Apple say they’re for?

The modern-era (post-NeXT) Apple doesn’t ship anything without a story. Which is good; more companies should spend the effort to build a story about why you need this, what this new thing is for, how it fits into your life, what problems it solves.

The iPod was “carry all your music with you all the time”.

The iPhone was the classic “three devices” in one.

The iPod Touch struggled with “the iPhone, but without a phone!”, but landed on “the thing you buy your kids to play games before you’re willing to buy them their own phone.”

The iPad was “your phone, but bigger!”

The Watch halfheartedly tried to sell itself as an enhanced communication device (remember the heartbeat thing?) before realizing it was a fitness device.

AirPods were “how great would it be if your earbuds didn’t have wires? Also, check out this background noise reduction.”

The HomePod is “a speaker you can yell requests at.”

So, what will the Space Glasses be?

For anyone else, the obvious play would be games, but games just aren’t a thing Apple is willing to be good at. There’s pretty much a straight line from letting Halo, made by Mac developers, become a huge hit as an Xbox exclusive to this story from Panic’s Cabel Sasser about why Untitled Goose Game is on every platform except the Mac App Store.

This is not unlike their failures to get their pro audio/video apps out into the Hollywood ecosystem. Both require a level of coöperation with other companies that Apple has never been willing to do.

Presumably, they’ll announce some VR games to go on the Apple Glasses. The No Man’s Sky team is strongly hinting they’ll be there, so, okay? That’s a great game, but a popular VR-compatible game from six years ago is table stakes. Everyone else already has that. What’s new?

They’ve never treated games as a primary feature of a new platform. Games are always a “oh yeah, them too” feature.

What, then?

I suspect they’ll center on “Experiences.” VR/AR environments. Attend a live concert like you’re really there! Music is the one media type Apple is really, really good at, so I expect them to lean heavily into that. VR combined with AirPods-style spatial audio could be compelling? (This would be easier to believe if they were announcing the goggles at their music event in September instead of WWDC.)

Presumably this will have a heavily social component as well—attend concerts with your family from out of town. Hang out in cyberspace! Explore the Pyramids with your friends!

There’s probably also going to be a remote-but-together shared workspace thing. Do your Zoom meetings in VR instead of staring at the Brady Bunch credits on your laptop.

There’s probably also going to be a whole “exciting new worlds of productivity” pitch, where basic desktop use gets translated to VR/AR. Application windows floating in air around your monitor! Model 3D objects with your hands over your desk!

Like the Touch Bar before it, what’s really going to be interesting here is which first-party apps get headset support on day one. What’s the big demo from the built-in apps? Presumably, Final Cut gets a way to edit 360 video in 360, but what else? Can I spread my desktop throughout the volume of my office? Can I write an email by waving my hands in empty space?

Anyway.

The whole time I was being paid to think about Space Glasses, Apple was the Big Wave. The Oncoming Storm. We knew they were going to release something, and if anyone could make it work, it would be them. I spent hours on hours trying to guess what they would do, so we could either get out ahead or get out of the way.

I’m so looking forward to finding out what they were really building all that time.
