Zhitong Finance App has learned that Meta Platforms (META.US), the parent company of Facebook and Instagram, said in a recently published blog post that, owing to limited inventory and unprecedentedly strong demand in the US market, it is substantially delaying the international rollout of its "Ray-Ban Display" AI smart glasses.
Since ChatGPT's debut ignited the global AI wave, Meta under Zuckerberg's leadership has been working to integrate its AI software and hardware ecosystem end to end, aiming to become the leading player in AI models, intelligent AI applications, and AR+AI smart glasses. In particular, the company is betting heavily that smart glasses will be the best on-device carrier for cutting-edge artificial intelligence technology.
Mark Zuckerberg, Meta's founder and CEO, is doubling down on a future of "on-device AI" beyond the smartphone, with AI smart glasses widely regarded as the most suitable AI carrier, serving as the "AI gateway of the post-smartphone era." Zuckerberg plans to tightly integrate advanced AI models with intelligent wearable devices, challenging Apple's (AAPL.US) dominance across a wide range of consumer electronics, from iPhones to lightweight wearables. The "Ray-Ban Display" AI smart glasses, with their miniature screen, undoubtedly mark another milestone in Meta's grand blueprint for true "augmented reality (AR) + AI" smart glasses.
The notion of a "post-smartphone era" mainly captures a trend: smartphone demand growth keeps slowing, while the penetration of new AI-equipped wearable devices such as AI smart glasses keeps surging. As global smartphone demand gradually peaks and various smart consumer electronics built around large AI models rise, the smartphone is no longer the only, or even the central, interaction hub. AI assistants need a closer-fitting, real-time, low-friction entry point, and smart glasses (especially models with cameras, microphones, or even displays) are naturally suited to letting a large AI model "see what you see, hear what you hear, and offer prompts or answers at any time."
In a public statement in the second half of last year, Zuckerberg laid out his concept of "personal superintelligence": AR+AI smart glasses that can "see, hear, and deeply interact with users around the clock" and may eventually replace smartphones entirely as the main entry point to the AI-driven digital world.
Ray-Ban Display became an instant hit in the US upon launch
"Since we officially launched this product last fall, we've seen an overwhelming level of buying interest, so the product waitlist has now been extended to 2026. Due to this unprecedented demand in the US market and limited inventory, we have decided to suspend our international expansion to the UK, France, Italy, and Canada, which was originally planned for early 2026. We will continue to focus on fulfilling orders in the US market while re-evaluating our strategy for international availability," Meta said in its latest blog post.
Meta, Facebook's parent company, has long partnered with Ray-Ban manufacturer Luxottica on a series of smart glasses products. Meta CEO Mark Zuckerberg unveiled the $799 Meta Ray-Ban Display AI smart glasses in September of last year, and they quickly became popular across the United States.
These AI smart glasses are camera- and audio-equipped devices that look like standard Ray-Ban frames but pioneered the addition of hands-free photo and video capture, open-ear speakers, microphones, and a fully integrated Meta AI assistant. The latest generation integrates dual 12 MP cameras for photos and up to 60 seconds of 1080p video recording, with a visible recording indicator to alert people nearby; an open-ear speaker and multi-microphone array enable music playback, calls, and voice input while keeping ambient sound clearly audible. On-body controls include a touchpad on the temple for adjusting volume or playback with taps and swipes, plus a physical shutter button for quickly snapping photos or recording clips.
According to available information, this "Meta Ray-Ban Display" version, more advanced than other AR+AI smart glasses on the market, embeds a small full-color microdisplay (with a compact field of view) in the right lens to show AI responses, navigation prompts, messages, and other visual cues. It can also be paired with the wrist-worn Neural Band, which reads fine hand-muscle signals to enable micro-gesture control, pointing toward richer AR-style interaction in a relatively conventional glasses form factor.
The display plus wristband gestures elevate the product from a wearable that "can take a photo or run a voice assistant" to an everyday tool that "lets you view, reply, and navigate without pulling out the phone": message and social media previews, walking navigation maps, real-time captions and translation, and camera framing and zoom are all shown directly on the lens and operated via the wristband. During CES 2026, which opened on January 6, Meta also showcased more productivity-oriented features such as a "teleprompter" and "EMG handwriting input," further expanding the imaginable range of non-smartphone use cases.
Compared with competing smart glasses, Meta's Ray-Ban Display is rare in delivering the full "mainstream eyewear form + real display + real input" trio. Many AI smart glasses rivals either lack a display (audio-assistant only) or have one but are bulky or phone-dependent. Meta's distinctive edge is that the in-lens display carries contextual information (navigation, messages, AI response cards, etc.), while the Neural Band's EMG wristband gestures turn input and control into discreet hand movements, even extending to productivity use cases such as handwriting input and teleprompter page-turning.
Smart glasses may be the best on-device carrier for artificial intelligence technology
Elsewhere in the AI smart glasses industry, Google's parent company Alphabet (GOOG.US) announced a $150 million deep partnership with Warby Parker (WRBY.US) last year, and media reports say OpenAI, developer of the globally popular ChatGPT, is cooperating with Apple (AAPL.US) on an unprecedented AI smart glasses model.
With the rapid development of edge computing, 5G networks, and artificial intelligence technology, on-device consumer electronics are increasingly capable of processing data in real time and interacting seamlessly with large cloud AI models. Smart glasses can thus not only collect real-time environmental data (vision, sound, location, etc.) but also deliver real-time generative AI features through local preprocessing: complex AI workloads run on cloud AI compute, while latency-sensitive or real-time tasks, such as voice interaction, real-time translation, augmented-reality navigation, and contextual information overlay, are handled locally.
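The local-versus-cloud split described above can be illustrated with a minimal dispatcher sketch. The task names, latency budgets, and handler functions below are hypothetical assumptions for illustration only, not Meta's actual glasses architecture.

```python
# Hypothetical sketch of the edge/cloud workload split described above.
# Task names, latency budgets, and handlers are illustrative assumptions.

# Latency budget (ms) within which each task must respond; tight budgets
# force on-device processing, loose ones can tolerate a cloud round trip.
LATENCY_BUDGET_MS = {
    "voice_interaction": 150,
    "live_translation": 200,
    "ar_navigation": 100,
    "context_overlay": 120,
    "image_generation": 5000,
    "long_form_qa": 3000,
}

CLOUD_ROUND_TRIP_MS = 400  # assumed network + cloud-inference latency


def run_local(task: str) -> str:
    """Placeholder for on-device (NPU) processing."""
    return f"{task}: handled on-device"


def run_cloud(task: str) -> str:
    """Placeholder for offloading to a large cloud AI model."""
    return f"{task}: offloaded to cloud"


def dispatch(task: str) -> str:
    """Route a task locally when a cloud round trip would exceed its budget."""
    budget = LATENCY_BUDGET_MS.get(task, 0)
    if budget < CLOUD_ROUND_TRIP_MS:
        return run_local(task)
    return run_cloud(task)


print(dispatch("voice_interaction"))  # tight budget -> on-device
print(dispatch("image_generation"))   # loose budget -> cloud
```

Real systems weigh more factors (battery, privacy, connectivity), but the core trade-off is the same: tasks whose latency budget cannot absorb a network round trip stay on the device.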
The upgraded Ray-Ban Meta smart glasses of 2024 were arguably the first AI smart glasses to set off a wave of consumer purchases. They feature an SoC with an integrated NPU plus advanced camera and audio components, enabling core smart functions such as photo/video capture and audio playback, and supporting a wide range of scenarios via on-device AI algorithms, smartphone-assisted edge AI, and the Llama cloud AI model.
Counterpoint said that the global popularity of Ray-Ban Meta smart glasses has triggered an unprecedented wave of AI smart glasses, from reference designs offered by supply chain companies to commercial products launched by eyewear brands, arriving from late 2024 onward. With leading smartphone makers launching their first AI smart glasses in 2025, and more smart consumer electronics companies likely to enter the market in 2025 and 2026, Counterpoint expects the overall global smart glasses market to grow 60% year on year in 2025 and to maintain a compound annual growth rate of more than 60% between 2025 and 2029.

According to a Counterpoint statistics report, global smart glasses shipments surged 210% year on year in 2024, and shipments in the first half of 2025 rose another 110% year on year from that already strong base. More notably, AI smart glasses accounted for about 78% of global smart glasses shipments in the first half of 2025, with the AI segment growing even faster (reported at over 250% year on year), reflecting the market's shift from "audio glasses for music and calls" to AI smart glasses combining "imaging + multimodal perception + on-device AI assistant."