Apple is not waiting for the AI wearables market to come to it. According to Bloomberg's Mark Gurman, the company is accelerating development on three AI-powered wearables simultaneously: smart glasses, a wearable pendant, and AirPods with built-in cameras. All three are designed around Visual Intelligence -- the feature Tim Cook has signaled will define Apple's next chapter.
This is not a concept deck. Apple has reportedly solved the glasses' battery problem, moving beyond external battery packs to frame-integrated power, and is embedding dual camera lenses for both computer vision and photo capture. The pendant and camera AirPods are progressing on parallel tracks. A product event is reportedly planned for March 4, and supply chain sources indicate the first of these devices could ship by late 2026 or early 2027.
For businesses that build products, serve customers, or manage teams, this is a signal worth paying attention to right now.
Three Devices, One Strategy
The three wearables share a common thread: they all put AI sensors on the user's body in a way that gives Apple Intelligence persistent awareness of the physical world.
Smart glasses will include two camera lenses -- one for real-time computer vision (object recognition, text translation, contextual overlays) and another for photos and video. Supply chain analyst Ming-Chi Kuo reports that Apple is working with Asian lens and display suppliers on a heads-up display that prioritizes all-day comfort over the immersive approach of Vision Pro. Think lightweight eyewear with subtle notification overlays, not a ski-goggle headset.
An AI pendant would give users a camera-and-microphone wearable that clips to clothing, providing continuous audio and visual context without requiring glasses. This fills a gap for users who do not want to wear something on their face but still want ambient AI capabilities.
AirPods with cameras round out the trio by turning Apple's most popular wearable into a visual AI device. With cameras embedded in the earbuds or their case, these could enable features like real-time scene description, translation of visual text, or AI-powered audio cues based on what the user is looking at.
The common denominator across all three is Apple's acquisition of Q.ai -- the $2 billion deal for silent speech AI that we covered last month. Q.ai's technology for interpreting micro facial movements and sub-vocalized speech would let users interact with Siri without speaking aloud, solving the social awkwardness problem that has held back voice-first wearables for a decade.
Why Visual Intelligence Changes the Equation
Apple Intelligence debuted as an on-device AI framework for iPhone, iPad, and Mac. But executives have reportedly acknowledged that a phone sitting in your pocket is not the ideal platform for AI that benefits from seeing and hearing what you experience. A device on your face, in your ear, or on your lapel can process environmental context continuously.
Visual Intelligence is the specific feature set that makes this work. According to 9to5Mac's reporting, Apple is prototyping use cases like real-time foreign language translation overlaid in peripheral vision, automatic identification of people you are meeting at conferences, and proactive suggestions based on what you are looking at. All of this is processed on-device using Apple's custom silicon -- no server round-trip required.
That on-device processing is the privacy play. Meta's Ray-Ban glasses send data to Meta's servers for AI processing. Apple's approach would keep visual and audio data local, which is a significant differentiator for business users handling sensitive information.
The Meta Race Is Already On
Apple is not entering an empty category. Meta's Ray-Ban smart glasses have been a commercial hit since late 2023, and Zuckerberg has confirmed a next-generation model with a built-in display is in development. At $299, the Ray-Ban Metas are priced for impulse purchase. Apple's glasses are expected to land between $800 and $1,500.
Meanwhile, OpenAI is developing its own AI hardware push with Jony Ive -- including smart glasses targeted for 2028. Samsung rebooted Bixby as an AI agent with Perplexity search built into its device stack. Google is expected to reveal its own AI glasses later this year.
The convergence is unmistakable: every major platform company has concluded that AI needs to escape the phone screen. The wearable form factor is where they all agree the next interface lives.
What Small Businesses Should Watch For
You are not going to buy Apple AI glasses for your team this quarter. But three things are happening right now that matter for your planning.
First, customer expectations are about to shift. When millions of consumers start wearing AI-powered devices that can instantly identify products, compare prices, read reviews, and place orders through a glance and a whisper, your storefront -- physical or digital -- needs to be ready. Product images, business listings, and service descriptions will need to be machine-readable and AI-friendly, not just human-friendly.
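One concrete step available today is publishing structured data that machines can already parse. As a minimal sketch (the helper function and the example product below are illustrative, not anything Apple has specified), schema.org Product markup in JSON-LD is the format that current search engines and AI crawlers read from web pages:

```python
import json

def product_jsonld(name, description, price, currency="USD", image_url=None):
    """Build a schema.org Product entry as JSON-LD, suitable for
    embedding in a page inside a <script type="application/ld+json"> tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "description": description,
        "offers": {
            "@type": "Offer",
            "price": f"{price:.2f}",
            "priceCurrency": currency,
            "availability": "https://schema.org/InStock",
        },
    }
    if image_url:
        # An image URL helps visual AI match what a camera sees to your listing.
        data["image"] = image_url
    return json.dumps(data, indent=2)

print(product_jsonld(
    "House Blend Coffee, 12 oz",
    "Medium-roast whole-bean coffee, small-batch roasted weekly.",
    14.00,
    image_url="https://example.com/img/house-blend.jpg",
))
```

Listings marked up this way are readable by today's crawlers, and a device that identifies products by sight will almost certainly lean on the same structured data to resolve what it is looking at.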
Second, voice and visual search will accelerate. If Apple ships even one of these three devices successfully, Siri becomes a visual assistant that your customers use while looking at your products, your signage, or your competitors. Optimizing for visual AI discovery is going to be as important as SEO was in 2010.
Third, workplace wearables are coming. The pendant form factor in particular suggests Apple sees enterprise use cases -- warehouse workers, field technicians, healthcare staff -- where hands-free AI assistance has obvious value. If your business involves physical work environments, start thinking about how ambient AI wearables could change training, safety monitoring, and workflow management.
The Design Lesson Apple Learned
The most telling detail in all of this reporting is Apple's insistence that the glasses look like normal eyewear. Vision Pro was technically brilliant but socially isolating. Google Glass became a cultural joke. Apple's leadership has internalized these failures: the technology only wins if people are willing to wear it in public without feeling self-conscious.
That is why Apple is reportedly working with luxury eyewear designers and spending heavily on materials science to make the frames indistinguishable from high-end prescription glasses. The AI is meant to be invisible. The device is meant to look like fashion.
For businesses building AI products and experiences: take that lesson seriously. The AI that succeeds at scale is the one that disappears into normal life, not the one that demands attention.
The Bottom Line
Apple accelerating three AI wearables at once is one of the strongest signals yet that the phone-first era of AI interaction has an expiration date. When the company that defined the smartphone decides the future of AI lives on your face, in your ear, and on your collar, the rest of the industry is going to follow.
The question is not whether ambient AI wearables will become mainstream. It is how fast your business adapts to a world where your customers are wearing them.
Preparing your business for the next wave of AI interfaces? Barista Labs helps small businesses build AI strategies that stay ahead of platform shifts -- before they become urgent.
