A First Look at Samsung’s AI-First Smart Glasses
Leaked renders and specifications suggest Samsung is preparing its first entry into the smart glasses market, widely nicknamed “Galaxy Glasses” and reportedly codenamed Jinju. Unlike fully fledged AR headsets, these upcoming smart glasses appear to prioritise practicality and everyday wearability. Reports describe a lightweight frame of around 50 grams, with styling reminiscent of Ray-Ban Meta’s design language, including visible camera bumps that clearly signal the wearer is using camera-equipped eyewear. Crucially, Samsung’s smart glasses reportedly omit any integrated display, pointing to an AI-first, audio-centric approach rather than immersive augmented reality. The device is expected to serve as a bridge between smartphones and future AR hardware, offering hands-free access to voice assistants, communications and context-aware features. With an unveiling reportedly pencilled in for an upcoming Samsung Unpacked event, the company looks ready to stake a claim in the emerging post-smartphone landscape, where everyday eyewear becomes a discreet, always-on computing companion.

Leaked Specifications and Smart Glasses Features
According to the leaks, Samsung’s upcoming smart glasses will run on Qualcomm’s Snapdragon AR1 processor, a chip designed for lightweight wearables and on-device AI workloads. The device reportedly includes a 12MP Sony IMX681 camera, enabling hands-free photo and video capture from a first-person perspective. Connectivity is said to cover Wi-Fi and Bluetooth 5.3, in line with expectations for always-connected wearables. Audio and comfort appear central to the design: directional speakers and patented bone-conduction technology are expected to deliver sound without blocking the ears, helping users remain aware of their surroundings. Photochromic transition lenses, similar to optional add-ons in rival products, should make the glasses more practical outdoors by adapting automatically to changing light conditions. The platform is reportedly based on Android XR with Gemini AI integration, reinforcing Samsung’s emphasis on AI-driven, real-time assistance rather than visual overlays. While details such as video resolution and battery life remain unknown, the core feature set clearly targets daily, unobtrusive use.

How Samsung’s Glasses Stack Up Against Meta, Apple and Google
Samsung’s smart glasses are entering a competitive field defined by Meta, Apple and Google. Meta’s Ray-Ban Meta smart glasses offer cameras, audio and AI features, also without a display, and Samsung’s reported specs closely mirror that formula, including a 12MP camera sensor, the Snapdragon AR1 chipset and similar battery capacity. However, Samsung’s more visible camera bumps may be a deliberate nod to transparency amid growing privacy concerns about always-on cameras. Apple is reportedly developing its own premium smart glasses, expected to lean on deep ecosystem integration, while Google is pushing Android XR in partnership with eyewear brands such as Gentle Monster, Warby Parker and Gucci’s parent company Kering. Some of those Google-backed models are expected to include displays, making Samsung’s first-generation, screen-free approach comparatively conservative. Nonetheless, leaks suggest Samsung is already working on a second, display-equipped pair of smart glasses targeted for around 2027, which would bring it closer to Meta’s display-capable models and to Apple’s anticipated AR hardware.

Use Cases, AI Integration and Market Implications
By removing the display and centring interaction on voice, audio and AI, Samsung is clearly positioning its upcoming smart glasses as an everyday assistant rather than a niche gadget. Expected features such as microphones for voice control, AI assistant integration and bone-conduction audio suggest scenarios like hands-free calling, real-time navigation prompts, contextual reminders and on-the-go information retrieval. The camera adds first-person content capture, which could appeal to creators and professionals who need quick, unobtrusive recording. Strategically, this AI-first, screen-free design lowers the barrier to adoption by keeping the form factor close to traditional eyewear while quietly introducing users to wearable computing. It also aligns with a broader industry vision of a post-smartphone future, in which ambient, context-aware devices replace constant screen-checking. If Samsung executes well on privacy, comfort and software integration, these smart glasses could accelerate mainstream acceptance of AI-powered wearables and lay the groundwork for the company’s more advanced, display-based AR glasses planned for later in the decade.
