Two Samsung smart glasses, two very different purposes
Samsung is not betting on just one kind of smart eyewear. Instead, it is developing two distinct Samsung smart glasses lines designed for very different needs. The first line, often referred to as Galaxy Glasses AI, consists of lightweight AI assistant glasses without a visible display. Leaked renders show them looking like regular spectacles with discreet cameras at the frame edges and no screen, similar to Ray-Ban Meta smart glasses, but running Android XR with access to Google’s Gemini assistant and Google Maps for spoken navigation and translation. A second, more advanced Samsung AR HUD product is planned as a separate line. These HUD-style glasses will include an integrated display for extended reality overlays and gaming, with tighter One UI integration aimed at professional and immersive use. Together, the two lines signal a dual strategy: casual everyday AI assistant glasses first, full AR HUD computing glasses later, each optimised for its own audience.

AI assistant frames vs AR HUD: design, power and use cases
Under the hood, Samsung’s two devices differ as much as their target users. The Jinju AI glasses (Jinju being the reported codename for the Galaxy Glasses AI line) are built around Qualcomm’s Snapdragon AR1+ platform and focus on comfort, with an expected weight of around 50 grams and a 12MP camera for quick photos and short video clips. They pair over Bluetooth, integrate with One UI on Galaxy phones, and behave like open-ear headphones plus a camera, ideal for hands-free calls, social sharing and spoken AI assistance while commuting or walking. The Haean HUD glasses (Haean being the reported codename for the AR HUD line), by contrast, add a built-in display and enhanced XR processing, reportedly tying into a newer One UI version. That means higher power draw, more components and likely more weight, in exchange for heads-up overlays for navigation, training, design review or gaming. Where Jinju is about all-day wear and subtle AI help, Haean is more like a wearable AR monitor for focused tasks, less suited to being worn from breakfast to bedtime in a hot climate.
Where Samsung fits in the three smart glasses categories
Current smart glasses categories fall into three broad buckets. First are camera-first AI glasses that prioritise audio, AI and camera while skipping displays; they connect to your phone and work like voice-first assistants with built-in cameras, as seen with Ray-Ban Meta and similar models. Second are display-centric glasses that act as virtual monitors over USB-C, projecting a large screen from your phone, laptop or console. Third are true AR/XR computing glasses that run on their own processors and overlay information directly onto your real-world view. Samsung’s Galaxy Glasses AI clearly sit in the first group: AI assistant glasses with cameras, microphones and tight phone integration, but no visible display. The Samsung AR HUD line falls into the third category, aiming at independent AR computing with richer overlays and more complex apps. Understanding these smart glasses categories is crucial, because picking the wrong type — not the wrong brand — is what most often leads to buyer regret.
How Samsung’s dual strategy stacks up against rivals
Many competitors are committing to a single smart glasses category: Meta leans heavily into camera-first AI glasses, while others focus on display-only video viewers. Samsung is unusual in openly pursuing both AI-first frames and a full AR HUD device. This two-pronged roadmap has strategic advantages. The Galaxy Glasses AI can compete directly with Ray-Ban Meta by offering similar hardware but leveraging Android XR, Gemini and Google Maps, potentially delivering more capable voice assistance and better everyday utility. Samsung can refine its AI features, fit and battery life on this simpler product while building developer interest. The HUD glasses then benefit from that groundwork, inheriting ecosystem integration, user feedback and perhaps shared apps, instead of launching as an isolated niche gadget. For long-term software support, this could mean a larger combined user base across two product families, making it more attractive for third-party developers to build AR, productivity and travel applications that work across both Samsung smart glasses lines.
Malaysian use cases: who should wait for HUD and who shouldn’t
For users in Malaysia, the choice will hinge on climate, lifestyle and how deeply you already live in the Galaxy ecosystem. Galaxy Glasses AI frames should pair naturally with Galaxy smartphones, offering hands-free camera shots at cafés, on-bike voice navigation through hot city streets, and spoken translation of signs when travelling around ASEAN, all without the visual distraction of floating screens. Their lighter weight is an advantage for all-day wear in humid weather, though always-on microphones and cameras may raise privacy concerns in offices, classrooms or religious spaces. The Samsung AR HUD model will appeal more to professionals and enthusiasts who benefit from on-lens information: logistics staff needing warehouse overlays, field technicians viewing instructions, or gamers craving immersive HUD experiences at home. For most people, the first-wave AI assistant glasses are likely to be the more practical starting point, while those who specifically want visual AR overlays — and are willing to trade some comfort and discretion — may prefer to wait for the HUD version.

