Onyx AR 2.0: A Maker-Friendly Path to DIY AR Glasses
Onyx AR 2.0 is a home‑built, open source headset that proves DIY AR glasses are no longer science fiction. Created by YouTuber Mañolo Mancelli, the project embraces a modular, 3D‑printed design instead of chasing ultra‑miniaturised consumer aesthetics. The electronics are spread across a custom frame, with each section dedicated to a specific function, resulting in 3D‑printed smart glasses that are bulkier than commercial devices but still wearable for everyday tinkering. For hobbyists, the appeal is control: you can inspect every PCB, tweak every bracket, and adapt the hardware to your own needs rather than those of a mass‑market product. As a modular AR kit, Onyx AR 2.0 is less about competing with premium smart eyewear and more about offering an accessible playground where makers can learn optics, embedded electronics, and interaction design by literally wearing their experiments.

Inside the Modular AR Kit: Frames, Optics, Displays and Compute
At the heart of Onyx AR 2.0 is a custom power management board that regulates a single‑cell LiPo battery and steps its output up to a stable 5 V rail for the rest of the electronics. A tactile power switch and status LEDs give basic feedback without adding complexity. The visual system uses a tiny microdisplay driven by a dedicated board that converts composite video into a format the panel understands. A magnifier lens pushes the image out to a comfortable focal distance, while a semi‑transparent combiner reflects the picture into the wearer’s eye and still lets the real world through. The 3D‑printed frame is the glue that ties this modular AR kit together, distributing components along the arms and front section. Makers can redesign parts of the frame, swap optics, or plug in different compute modules, turning the headset into a living testbed rather than a fixed consumer product.
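The project does not publish its firmware in detail, but the power board's job — watching a single‑cell LiPo and a boosted 5 V rail, and driving status LEDs — can be sketched in a few lines. The voltage thresholds below are typical 1S LiPo values, not figures from the actual build, and the function names are illustrative stand‑ins for whatever ADC reads the board exposes:

```python
# Minimal sketch of battery and rail monitoring for a 1S LiPo power
# board like the one in Onyx AR 2.0. Thresholds are typical single-cell
# LiPo values, NOT measurements from the project; function names are
# illustrative.

def charge_state(pack_volts: float) -> str:
    """Map a 1S LiPo cell voltage to a coarse status-LED state."""
    if pack_volts >= 4.0:
        return "full"        # near the 4.2 V full-charge point
    if pack_volts >= 3.7:
        return "ok"          # nominal operating range
    if pack_volts >= 3.4:
        return "low"         # warn the wearer soon
    return "critical"        # cut power before over-discharge damage

def rail_ok(rail_volts: float, nominal: float = 5.0, tol: float = 0.05) -> bool:
    """Check that the boosted 5 V rail sits within +/-5% of nominal."""
    return abs(rail_volts - nominal) <= nominal * tol
```

In use, firmware would poll these against ADC readings and blink the status LEDs accordingly, e.g. `charge_state(3.8)` yields `"ok"` while `rail_ok(5.5)` flags a boost-converter fault.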

What DIY AR Glasses Can (and Can’t) Do Today
A build like Onyx AR 2.0 will not rival the ultra‑light, highly integrated AR glasses developed by specialist companies, which push towards record‑breaking weight reduction and polished user experiences. But DIY AR glasses excel in areas that matter to tinkerers: flexibility, transparency, and custom workflows. The microdisplay and combiner setup is well suited to overlaying heads‑up information, debugging output from microcontrollers, basic navigation prompts, or experimental interfaces driven by a small single‑board computer or retro console. You will not get the sophisticated computer vision, long‑lasting batteries, or seamless industrial design common in commercial AI smart glasses. Instead, you gain a sandbox where you can test your own UI ideas, prototype gesture or controller input, and iterate rapidly. For many makers, that trade‑off—raw capability for hackability—is precisely the point of an open source headset like this.
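The heads‑up overlay use case described above needs surprisingly little software: on a low‑resolution composite feed, a HUD is mostly a matter of laying out label/value pairs in a fixed‑width grid. The sketch below is a pure‑Python illustration of that idea; the 24‑column width and the field names are assumptions for the example, not specifications from the project:

```python
# Illustrative HUD layout for a low-resolution composite overlay:
# render label/value pairs as fixed-width lines with the value
# right-aligned. The 24-column width and field names are assumed,
# not taken from Onyx AR 2.0.

HUD_COLS = 24  # characters per line on a small microdisplay (assumed)

def hud_lines(readings: dict[str, str]) -> list[str]:
    """Format each label/value pair into one fixed-width HUD line."""
    lines = []
    for label, value in readings.items():
        # Pad between label and value so the value hugs the right edge.
        pad = HUD_COLS - len(label) - len(value)
        lines.append(label + " " * max(pad, 1) + value)
    return lines
```

Fed from a microcontroller's serial debug stream, something like `hud_lines({"BATT": "3.8V", "TEMP": "21C"})` produces lines ready to draw onto the composite frame, which is exactly the kind of lightweight experiment this hardware invites.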

Skills, Tools and Budget: What It Takes to Build Your Own Headset
Assembling a DIY AR headset is more advanced than typical beginner projects, but it is approachable for an enthusiastic maker with some experience. You will need access to a reasonably accurate 3D printer, basic hand tools, and comfort with soldering, wiring, and handling LiPo batteries safely. Familiarity with simple optics and display drivers helps when aligning the microdisplay, magnifier, and combiner so the image is sharp and comfortable to view. On the software side, you should be ready to configure the driving electronics and whatever compute board you select, whether that is streaming composite video or rendering a minimalist UI. The modular design of Onyx AR 2.0 reduces risk: you can debug the power board, then the display chain, then the frame fit, rather than committing to a monolithic build. Expect to invest time in iteration and fine‑tuning; the reward is a headset you understand down to every screw hole and trace.
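Aligning the display chain is where a little optics pays off: the magnifier obeys the thin‑lens equation, 1/f = 1/d_o + 1/d_i, and placing the microdisplay inside the lens's focal length produces the enlarged virtual image the wearer actually views. The sketch below works through that relationship with example numbers, not measurements from the Onyx AR 2.0 build:

```python
# Thin-lens estimate of where the magnifier places the virtual image
# of the microdisplay. Convention: a negative image distance means a
# virtual image on the display side of the lens. The 50 mm focal
# length and 40 mm display distance are example values, NOT
# measurements from the Onyx AR 2.0 build.

def virtual_image_distance(f_mm: float, d_obj_mm: float) -> float:
    """Return image distance d_i from 1/f = 1/d_o + 1/d_i."""
    # Rearranged: d_i = f * d_o / (d_o - f)
    return f_mm * d_obj_mm / (d_obj_mm - f_mm)

# Display 40 mm behind a 50 mm lens: inside the focal length, so the
# image is virtual and pushed out to a comfortable viewing distance.
d_i = virtual_image_distance(50.0, 40.0)  # -200.0 mm (virtual image)
```

Sliding the display a few millimetres changes the apparent image distance dramatically, which is why the frame's adjustability matters more than micron‑level precision.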

Why Open-Source AR Matters—and What’s Coming Next
Projects like Onyx AR 2.0 highlight how open source headset designs can shape the future of smart eyewear from the bottom up. Makers are free to experiment with unconventional form factors, niche use cases, or accessibility‑focused interfaces long before they are commercially attractive. As component technology advances, these builds will only get more capable. Ultra‑bright, low‑power LCoS microdisplays are being developed specifically for lightweight AR glasses, promising higher contrast and better visibility in challenging lighting without sacrificing battery life. At the same time, companies are exploring neural input devices such as wristbands that translate subtle bio‑signals into spatial interaction for AR hardware. Over time, such microdisplays and neural interfaces can trickle into DIY AR glasses, where the community can prototype new interaction paradigms and pressure‑test ideas in the open—ultimately influencing what next‑generation consumer headsets look and feel like.
