I’ve been a moderate enthusiast of the facebook faceputer (aka the Meta Ray-Ban smart glasses). At first, I was very excited by the possibility of being able to take a picture anywhere and ask an AI about it. It hasn’t yet met my expectations for what I hoped it could do, but I feel like the hardware is all there.

I’ve ended up using it to ask random questions while I’m walking down the street. Recently I sent off a query: it traveled over Bluetooth/Wi-Fi to my phone, then over the cellular network to cloud infrastructure, where it was parsed into tokens a neural network could understand, which fired off a mixture of models to gather relevant vectors from training data and web search and formulate a response of tokens to be evaluated, and eventually I got a pretty good answer back to my question: “what was the name of the band in KPop Demon Hunters?” [aside: the answer is HUNTR/X btw and they’re on Spotify :D ]

[Image: three K-pop singers on stage, two of them wearing VR headsets]
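Just for fun, here’s roughly how I picture that round trip, as a toy Python sketch. Every function in it is a hypothetical stub I made up to stand in for one hop of the pipeline; none of this is a real Meta API.

```python
# A toy sketch of the glasses-to-cloud round trip described above.
# Every function is a hypothetical stub for one hop; nothing here
# is a real Meta API.

def phone_relay(audio: str) -> str:
    # Glasses -> phone, over Bluetooth/Wi-Fi.
    return audio

def cloud_ingest(request: str) -> str:
    # Phone -> cloud, over the cellular network.
    return request

def tokenize(text: str) -> list[str]:
    # Parse the query into tokens a model can understand.
    return text.lower().split()

def gather_context(tokens: list[str]) -> list[str]:
    # Stand-in for a mixture of models pulling relevant vectors
    # from training data plus a web search.
    return ["HUNTR/X is the girl group in KPop Demon Hunters"]

def generate(tokens: list[str], context: list[str]) -> str:
    # Stand-in for formulating and decoding the response tokens.
    return context[0]

def answer_query(audio: str) -> str:
    request = phone_relay(audio)
    payload = cloud_ingest(request)
    tokens = tokenize(payload)
    context = gather_context(tokens)
    return generate(tokens, context)

print(answer_query("what was the name of the band in kpop demon hunters?"))
# -> HUNTR/X is the girl group in KPop Demon Hunters
```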

I can’t imagine the cost of that one interaction, but today I’m getting it practically for free. It reminded me of when there were a lot of search engines out there, before Google’s algorithm proved to be the most successful. And then they realized they needed to make money to be a viable business. So, I believe it was Eric Schmidt who was brought in and helped monetize the search engine, or rather the traffic and eyeballs that their search engine was attracting.

So what does that look like for AI search engines today? Will we see ads in the results of our queries? The next time I ask about KPop Demon Hunters, will it first try to sell me on similar shows before I get my answer? On a screen, you can put ads off to the side where they’re fairly non-intrusive, but in a voice/ear interface, that kind of ad can be really annoying.

But there is still one thing Meta can gather from this: intent. One of the most valuable things Google Search achieves is understanding what people want to do, want to buy, or are curious about. That intent is ultimately valuable to advertisers and for spotting trends and correlations. So maybe that is one of the things a faceputer hooked up to a neural network can benefit from. I give up a bit of my information and data through random questions, and I get back an instant answer machine. Not a bad trade-off.

I can see why other companies are taking note of what Meta accomplished by partnering with Luxottica to make something people would actually wear and find useful. Snap was close, but most people wouldn’t put those Spectacles on their faces. Google Glass made you really stand out, almost defiantly, as a tech nerd. As other companies try to capture the smart glasses market, they’re at the very least trying to capture user intent. And for that you need to be a market leader.

Now, when these smart glasses get AR (augmented/mixed reality) displays and are able to highlight (or hide!?) certain stores as I’m walking down the street, or pop up ads for a sale on Krispy Kreme donuts… that will be scary. Hiding elements from my view is straight out of a dystopian Black Mirror episode. I wonder how that monetization framework will play out. Will making more queries increase the number of ads that get thrown at my retinas? Or will I have a slider that lets me turn the pervasiveness of the virtual ads my glasses show me up or down? I think good AR in a glasses form factor is still 5 or 10 years away, but who knows what technological breakthroughs might be accelerated by AI?
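If that slider ever exists, I picture the core of it being something as simple as a threshold filter. A minimal sketch, assuming a made-up `VirtualAd` type and a user-set pervasiveness value; none of these names come from any real AR platform.

```python
# Hypothetical: gate virtual ads behind a user-set "pervasiveness" slider.
from dataclasses import dataclass

@dataclass
class VirtualAd:
    label: str
    intrusiveness: float  # 0.0 = subtle overlay, 1.0 = full-view popup

def ads_to_render(ads: list[VirtualAd], pervasiveness: float) -> list[VirtualAd]:
    # Render only the ads no more intrusive than the slider allows;
    # slide to 0.0 and nothing gets thrown at your retinas.
    return [ad for ad in ads if ad.intrusiveness <= pervasiveness]

ads = [
    VirtualAd("Krispy Kreme sale banner", 0.3),
    VirtualAd("full-view donut popup", 0.9),
]
print(ads_to_render(ads, pervasiveness=0.4))  # only the subtle banner survives
```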

For now, I’ll just enjoy the ad-free random stupid questions I get to ask my glasses.