Most conversations about AR glasses still start with Meta Ray-Ban. That is the wrong reference point. Meta Ray-Ban Gen 2 has no display at all: it is a camera and a speaker in a glasses frame. It is a wearable AI assistant, not an AR device.
Snap Spectacles 5th gen is genuinely different. It has a binocular see-through display with a 46-degree diagonal field of view. When you put them on, digital objects appear in the real world at the correct scale and position, anchored in space. Both hands are tracked simultaneously. The world is being reconstructed in 3D in real time. That is a different category of device entirely, and it changes what is possible for brand experiences.
This article is a practical overview of the hardware, what Lens Studio 5.0 enables, what has already been built, and how brands and studios can access the dev kit today.
What the 5th gen hardware is
The Spectacles 5th gen is a developer kit, not a consumer product. Snap leases units to approved studios and developers while consumer Specs are prepared for a 2026 launch via Specs Inc., Snap's dedicated AR glasses subsidiary.
The display uses LCoS micro-projectors with waveguides to render digital content directly in the wearer's line of sight. At 37 pixels per degree, the visual density is comparable to Apple Vision Pro. Visible resolution is 1008 x 1398 pixels per eye. Motion-to-photon latency sits at 13ms, which keeps virtual objects feeling stable and grounded in the physical space rather than floating.
- Display: True binocular AR overlay / 46° diagonal FOV
- Tracking: 6DoF world + two-hand tracking
- Build with: Lens Studio 5.0 (TypeScript / JS)
- Battery: ~45 min continuous / USB-C
- Dev kit: $99/month (12-month commitment)
- Available: spectacles.com/build
The device weighs 226 grams and runs on a Qualcomm Snapdragon processor. Snap signed a multi-year deal with Qualcomm in April 2026 to power the consumer Specs platform. GPS was added via firmware in March 2025, extending location capability for outdoor experiences. The dev kit is available in the US, France, Germany, Spain, Italy, Austria, and the Netherlands.
Why the display changes everything for brand briefs
Phone AR puts a frame between the user and the experience. Even the best phone AR is still a rectangle you hold at arm's length. The digital content lives inside the screen, separated from the real world by a physical and perceptual boundary.
A true see-through AR display removes that boundary. Virtual objects exist in the room with you. A product can appear on the table. A character can walk through a crowd. A spatial label can hover above an exhibit at the correct reading distance. The experience does not require the user to hold anything or look in a particular direction. They just see it.
This changes how you brief an experience. The relevant question is not "what goes on screen" but "what does the user walk through." Spatial layout, physical movement, and gaze become the primary design considerations. Copy and visual design become secondary. That is a genuinely different discipline from phone AR, and it is one most brands have not encountered yet.
The Meta Ray-Ban comparison
Meta Ray-Ban is sometimes positioned as competition to Spectacles. It is not the same product. Gen 2 Ray-Ban has no AR display: it captures video and audio and delivers AI responses through a built-in speaker. There is nothing to see in the glasses themselves. It is a useful product for AI-assisted real-world queries, but it produces no spatial experiences. For brand activations where visual presence matters, Spectacles is in a different category.
What a true AR display unlocks for brand activations
When the display is real, several things become possible that phone AR cannot deliver:
- Hands-free engagement. The user can pick things up, gesture, point, collaborate, and interact with people around them while the experience runs. This transforms the social dynamic of a brand activation entirely.
- Shared spatial moments. Multiple people wearing Spectacles see the same virtual objects in the same physical location at the same time. That shared perception is something that literally cannot happen on a phone screen.
- Spatial scale. A product can be rendered at 1:1 scale in a real room. An architectural visualization can fill the space it is designed for. A character can stand at actual human height.
- Peripheral presence. AR content can sit at the edge of vision, providing ambient context without demanding attention. That is a completely different tone to the full-screen phone experience.
What Lens Studio 5.0 enables
Lens Studio is Snap's development environment for Spectacles experiences, with studios building in TypeScript or JavaScript. It is the same tool used for Snapchat Lenses, so studios already familiar with Snap's ecosystem have a foundation to build on. The platform capabilities have advanced substantially with the 5.0 release and subsequent updates.
Spatial anchors and persistent AR
The WorldAnchor API lets you pin AR content to specific physical locations. That content persists across sessions: if a user removes the glasses and puts them back on, the virtual object is still in the same position. For brand activations at fixed venues, this means experiences can be built that feel like they belong to the space, not like overlays dropped on top of it.
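The save-and-restore pattern behind persistent anchors can be sketched in plain TypeScript. The `AnchorStore` class and its method names below are illustrative assumptions for the pattern, not the actual WorldAnchor API:

```typescript
// Illustrative sketch of the anchor persistence pattern. The class and
// method names are assumptions, not the real WorldAnchor API.
type Vec3 = { x: number; y: number; z: number };

class AnchorStore {
  // Stands in for on-device persistent storage surviving across sessions.
  private storage = new Map<string, string>();

  // Pin content to a physical position and persist it by id.
  save(id: string, position: Vec3): void {
    this.storage.set(id, JSON.stringify(position));
  }

  // Restore a previously pinned position, or null if never saved.
  restore(id: string): Vec3 | null {
    const raw = this.storage.get(id);
    return raw ? (JSON.parse(raw) as Vec3) : null;
  }
}

// Session 1: pin a virtual product display to the table.
const store = new AnchorStore();
store.save("product-display", { x: 0.4, y: 0.9, z: -1.2 });

// Session 2 (glasses come off and back on): the same position returns.
const restored = store.restore("product-display");
```

The key property for venue work is the round trip: content placed once belongs to the space until someone deliberately moves it.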
Snap announced a partnership with Niantic at AWE in June 2025 that extends this capability significantly. Niantic's Visual Positioning System (VPS) brings centimeter-level accuracy AR placement at millions of real-world locations, integrated directly into Lens Studio and Spectacles. Spatial anchors stop being limited to controlled indoor environments and become viable at outdoor landmarks, public spaces, and cultural sites globally.
Colocated lenses: shared AR for up to three wearers
Colocated Lenses allow up to three Spectacles devices to share the same AR space simultaneously via Bluetooth. All wearers see the same virtual objects in the same physical positions. For brand activations, this is one of the most compelling formats available: a shared spatial moment that requires no screen, no app install, and no choreography beyond putting on the glasses.
A group of three people can collaborate on a virtual object together. They can play a spatial game in the same room. They can explore a product together from different angles simultaneously. None of that is possible on a phone.
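The underlying pattern is a small shared-state broadcast: one object pose, mirrored to every connected wearer. This toy model is illustrative only; the real Colocated Lenses sync runs over Bluetooth through Snap's stack, not a class like this:

```typescript
// Toy model of the colocated pattern: one shared object pose, mirrored to
// every connected wearer so all three see it in the same physical spot.
// Illustrative only -- not the actual Colocated Lenses API.
type Pose = { x: number; y: number; z: number };

class ColocatedSpace {
  private pose: Pose = { x: 0, y: 0, z: 0 };
  private wearers: Array<(p: Pose) => void> = [];

  join(onUpdate: (p: Pose) => void): void {
    if (this.wearers.length >= 3) throw new Error("colocated limit is three devices");
    this.wearers.push(onUpdate);
    onUpdate(this.pose); // late joiners see the current state immediately
  }

  // Any wearer moving the object updates everyone at once.
  move(pose: Pose): void {
    this.pose = pose;
    this.wearers.forEach((w) => w(pose));
  }
}

const space = new ColocatedSpace();
const seen: Pose[] = [];
space.join((p) => seen.push(p));
space.join((p) => seen.push(p));
space.move({ x: 1, y: 0.5, z: -2 }); // both wearers receive the same pose
```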
Hand tracking and the Spectacles Interaction Kit
Spectacles tracks both hands simultaneously without a controller. The Spectacles Interaction Kit (SIK) provides three interaction modes out of the box: indirect ray-based pointing (like a laser pointer from your fingertip), direct pinch (physically reaching and pinching a virtual object), and direct poke (pressing into a virtual surface). Voice input is natively supported alongside hand tracking.
This combination produces interactions that feel natural in a way phone-based AR cannot match. Reaching out and pinching a virtual product to rotate it is a physically intuitive behavior. Poking a virtual button is satisfying in a way tapping a phone screen is not. Body tracking, including the Full Body Mesh and 3D Body Tracking templates, is also available for experiences built around the wearer's full physical presence.
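The routing logic behind those three modes can be sketched as a simple dispatcher. The handler names and event plumbing below are assumptions for illustration, not the actual Spectacles Interaction Kit API:

```typescript
// Sketch of routing the three SIK interaction modes to per-object handlers.
// The mode names mirror the article; the interfaces are illustrative, not SIK's.
type InteractionMode = "ray" | "pinch" | "poke";

interface Interactable {
  onRayHover?: () => void;  // indirect ray-based pointing
  onPinchGrab?: () => void; // direct pinch: reach out and grab
  onPokePress?: () => void; // direct poke: press into a surface
}

function dispatch(target: Interactable, mode: InteractionMode): string {
  switch (mode) {
    case "ray":   target.onRayHover?.();  return "hovered";
    case "pinch": target.onPinchGrab?.(); return "grabbed";
    case "poke":  target.onPokePress?.(); return "pressed";
  }
}

// A virtual product that starts rotating while pinched.
let rotating = false;
const product: Interactable = {
  onPinchGrab: () => { rotating = true; },
};
dispatch(product, "pinch"); // rotating is now true
```

In practice a single object often supports all three modes, so the same product can be pointed at from across the room or grabbed up close.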
World mesh and surface understanding
World Mesh 2.0 in Lens Studio 4.55 provides real-time 3D reconstruction of surfaces in the environment. Virtual content can land on real tables, sit on real floors, and occlude behind real walls. For experiences that need to feel physically integrated rather than floating above the world, this is the foundation.
The WorldQueryModule lets you raycast to real surfaces without generating a full mesh, reducing the performance cost for simpler placement needs. The Depth Texture API adds another layer of environment understanding for more complex experiences.
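A surface query of this kind boils down to a raycast. The sketch below does a plain ray-versus-floor intersection to illustrate the idea; the WorldQueryModule itself casts against reconstructed real-world surfaces, and `hitTestFloor` is a hypothetical helper, not its API:

```typescript
// Illustrative ray-vs-floor hit test -- the kind of question a world query
// answers. Plain plane-intersection math; the real module uses the mesh.
type Vec3 = { x: number; y: number; z: number };

// Intersect a ray with a horizontal plane at the given height (e.g. the floor).
// Returns the hit point, or null if the plane is behind or parallel to the ray.
function hitTestFloor(origin: Vec3, dir: Vec3, floorY = 0): Vec3 | null {
  if (dir.y === 0) return null;          // ray parallel to the floor
  const t = (floorY - origin.y) / dir.y; // distance along the ray to the plane
  if (t <= 0) return null;               // plane is behind the ray origin
  return { x: origin.x + t * dir.x, y: floorY, z: origin.z + t * dir.z };
}

// Head at 1.6 m looking down-and-forward: content lands on the floor ahead.
const hit = hitTestFloor({ x: 0, y: 1.6, z: 0 }, { x: 0, y: -1, z: -1 });
// hit = { x: 0, y: 0, z: -1.6 }
```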
Snap Cloud and real-time sync
Snap Cloud, powered by Supabase, was announced at Lens Fest 2025 and brings APIs, edge functions, storage, and real-time sync directly into the Lens Studio ecosystem. This means Spectacles experiences can communicate with external data sources, persist user state across sessions, and sync between multiple users without the studio needing to build and host separate backend infrastructure. For brands that want to tie Spectacles activations to loyalty programs, product databases, or live event data, Snap Cloud removes a significant development barrier.
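The pattern this enables is lens state persisted through a backend rather than device-local storage. The `Transport` below is an in-memory stand-in so the shape of the pattern is visible; it is not the Snap Cloud or Supabase API, which a real build would call instead:

```typescript
// Hedged sketch of backend-persisted lens state. Transport is a stand-in
// abstraction, not a Snap Cloud API; swap in real service calls in production.
type Transport = {
  get(key: string): string | null;
  set(key: string, value: string): void;
};

// In-memory transport so the pattern can be exercised without a backend.
function memoryTransport(): Transport {
  const data = new Map<string, string>();
  return {
    get: (k) => data.get(k) ?? null,
    set: (k, v) => { data.set(k, v); },
  };
}

// Persist a user's progress so it survives sessions and syncs across devices.
function saveProgress(t: Transport, userId: string, level: number): void {
  t.set(`progress:${userId}`, String(level));
}

function loadProgress(t: Transport, userId: string): number {
  const raw = t.get(`progress:${userId}`);
  return raw === null ? 0 : Number(raw);
}
```

The same shape covers loyalty points, product data, or live event state: the lens reads and writes keys, and the backend handles persistence and sync.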
Commerce Kit and Snap OS browser
Commerce Kit, announced at Lens Fest 2025 and currently rolling out to select creators, enables in-Lens payments. Snap OS 2.0 includes a native browser with WebXR support. Together, these signal that Spectacles is being positioned as a commerce-capable platform, not just an experience layer. Travel Mode enables on-the-go AR experiences beyond controlled indoor environments.
What has been built
The most useful way to understand what Spectacles can do is to look at what has already shipped.
noodle at MIT Reality Hack 2026
At MIT Reality Hack 2026 in January, RBKAVIN. Immersive Studio built noodle for the Snap Spectacles category. noodle is a mixed reality creative workbench that turns the physical environment around you into an infinite canvas for generative AI. Using only hands and voice, a user can go from a 2D sketch drawn in the air to a 3D object placed in the real world. The workflow is node-based: users build generative AI pipelines spatially, without a keyboard or screen in sight. noodle won the Snap Spectacles + AI Lens category at MIT Reality Hack 2026.
The project demonstrated something we did not fully anticipate before building it: when interaction is purely physical and spatial, the act of creating becomes genuinely absorbing in a way screen-based tools are not. There is no context-switching between the tool and the thing being made. You are always inside the work. That quality, which we described as pure creative flow, is something brands can design for. See the noodle portfolio page for the full case study, or try the live Snap Camera Kit demo.
LEGO BRICKTACULAR
LEGO used Spectacles to build an interactive AR game controlled entirely by hand gestures and voice. Users can free-build or tackle LEGO sets in AR, with virtual bricks responding to physical hand interactions. It demonstrated the platform's viability for entertainment brands targeting families and younger audiences, and showed how a well-known physical product translates naturally to spatial AR interaction.
ILM Immersive: Star Wars Holocron Histories
Industrial Light and Magic's immersive division used Spectacles for a Star Wars experience bringing the Holocron lore into physical space. This is a significant validation: one of the most technically demanding IP brands in entertainment chose Spectacles as the delivery platform for a spatial narrative. The result demonstrates how licensed characters, objects, and environments can occupy real space at the correct scale with the production quality those IP owners require.
Avatar: The Last Airbender (Paramount / Nickelodeon)
Paramount and Nickelodeon built a Spectacles experience around Avatar: The Last Airbender, placing characters and elements from the series into real environments. For animation and entertainment brands, Spectacles offers a way to bring characters into the room with fans in a way that no prior consumer technology has permitted. The physicality of seeing a character standing in front of you at full scale is qualitatively different from seeing them on a screen.
SightCraft by Verse Immersive
Verse Immersive launched SightCraft, a multiplayer venue AR game playable on Spectacles across real physical locations, with rollout to more venues underway. SightCraft is the clearest example of how colocated, persistent spatial AR becomes a venue product rather than a one-off activation. Brands with venue presence, whether retail, hospitality, or entertainment, have a direct model here for how Spectacles can become a durable part of the visitor experience rather than a campaign moment.
Niantic: Peridot and Scaniverse on Spectacles
Niantic brought both Peridot (their AR creature companion game) and Scaniverse (3D scanning) to Spectacles. Peridot on Spectacles removes the phone entirely: the creature exists in your environment without mediation. Scaniverse demonstrates the device's 3D capture capability. Both products show what existing mobile AR experiences gain when moved to a true spatial display.
Specs Inc. and what the spinout signals for brands
In January 2026, Snap spun out Specs Inc. as a wholly owned subsidiary dedicated to AR glasses. The subsidiary is hiring around 100 roles and carries a distinct brand identity separate from Snapchat.
This is not a cosmetic restructuring. Snap is treating Spectacles as a standalone computing platform with its own business logic, not as a Snapchat peripheral. Snap CEO Evan Spiegel framed the device's purpose concisely in an interview with Fortune in June 2025: "We're building a computer that we hope you'll use less." That is a meaningful design philosophy. Spectacles is being built to be ambient and useful, not addictive and attention-maximizing. For brands, that means the platform will attract audiences who are specifically not seeking to be advertised at in the conventional sense. Experiences need to offer genuine value to earn attention in that environment.
The multi-year Qualcomm deal signed in April 2026 signals sustained hardware investment at the silicon level. This is not a one-generation experiment. Consumer Specs are planned for 2026, with a target retail price that Snap has not yet confirmed publicly. When consumer availability arrives, the studios with a body of Spectacles work will have a significant head start on a new platform with no established creative vocabulary and very limited competition.
How brands access the Spectacles dev kit
The primary route is the Spectacles Developer Program at spectacles.com/build. Hardware is leased at $99/month on a 12-month commitment. Student and educator pricing is $49.50/month. The program includes access to the 5th gen hardware, Lens Studio 5.0, and Snap's developer support resources. It is currently available in the US, France, Germany, Spain, Italy, Austria, and the Netherlands.
For brands that want to commission work without building an internal Spectacles capability, the Snap AR Creator Marketplace connects brands with approved Lens creators. The broader Lens Studio community counts over 400,000 developers, with a subset holding Spectacles production experience.
The third route is working directly with a studio that already holds a dev kit and has shipped Spectacles work. This is the fastest path to a brief-to-build timeline, because the hardware, toolchain, and design muscle are already in place. RBKAVIN. Immersive Studio holds a Spectacles dev kit and has shipped work on the platform, including the noodle project at MIT Reality Hack 2026.
We build for Snap Spectacles. If you are thinking about a Spectacles experience for a brand activation or event, get in touch.
Designing for 45 minutes
The Spectacles 5th gen battery provides approximately 45 minutes of continuous use. That is the most important hardware constraint for brand activations, and it shapes every brief that involves the device.
45 minutes is not a limitation if you design around it. Most successful Spectacles activations we have seen and built are structured as short-burst experiences of five to fifteen minutes, designed to be compelling within that window rather than stretched across the full battery life. The constraint actually enforces a discipline that makes experiences better: it is hard to justify padding when you only have fifteen minutes of a user's attention.
Rotation systems for events
For events where multiple users will rotate through the experience, the 45-minute battery life translates directly into a logistics plan. A set of four devices, staggered across charging cycles, can sustain continuous operation across a full event day. Each device charges via USB-C, and a standard power bank is sufficient for top-up cycles. Designing the experience duration to fit comfortably within a single charge cycle, with buffer time for device management, is straightforward at fifteen minutes per session.
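The arithmetic behind that logistics plan is simple enough to sketch. The planner below is a back-of-envelope model under the numbers stated in the text (~45 minutes of battery, 15-minute sessions, a four-device fleet); the inputs are planning assumptions, not hardware guarantees:

```typescript
// Back-of-envelope rotation planner for an event day. All numbers are
// planning inputs under the article's assumptions, not hardware guarantees.
interface RotationPlan {
  sessionsPerCharge: number; // full sessions one battery charge supports
  sessionsPerHour: number;   // throughput with the live devices running
  devicesCharging: number;   // devices resting on the charger at any moment
}

function planRotation(
  batteryMin: number, // usable minutes per charge
  sessionMin: number, // designed session length
  fleetSize: number,  // total devices on hand
  liveDevices: number // devices in use simultaneously
): RotationPlan {
  return {
    sessionsPerCharge: Math.floor(batteryMin / sessionMin),
    sessionsPerHour: Math.floor(60 / sessionMin) * liveDevices,
    devicesCharging: fleetSize - liveDevices,
  };
}

// Four devices, three live at any moment, one always on the charger.
const plan = planRotation(45, 15, 4, 3);
// plan = { sessionsPerCharge: 3, sessionsPerHour: 12, devicesCharging: 1 }
```

Three 15-minute sessions per charge, twelve sessions per hour across the fleet, and one device always recovering: that is the shape of a sustainable event day.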
Battery-aware experience design
The sharper consideration is designing experiences that do not feel truncated at their natural ending point. A five-minute experience that resolves completely is far stronger than a twenty-minute experience that gets cut off at battery death or transition. Each Spectacles session should have a defined arc with a beginning, a specific middle beat, and a clear ending. The ending should feel earned within the session, not dependent on a subsequent session the user may not have.
For Lens Studio, this means building in graceful state handling for session interruptions and making sure key content beats are front-loaded rather than held as rewards for extended engagement. Test the experience at the actual battery level you expect at the point of user handoff, not just at full charge. Spectacles performance is stable across the charge cycle, but knowing the exact behavior at different states avoids surprises at activation.
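That arc discipline can be modeled as a tiny state machine, sketched here in plain TypeScript; the beat names are illustrative, not from any Snap API:

```typescript
// Minimal sketch of the "defined arc" pattern: a session state machine that
// always reaches a valid resolution, even when interrupted mid-beat.
// Beat names are illustrative, not from any Snap API.
type Beat = "opening" | "middle" | "ending" | "resolved";

class SessionArc {
  private beat: Beat = "opening";

  // Normal progression through the session's beats.
  advance(): Beat {
    if (this.beat === "opening") this.beat = "middle";
    else if (this.beat === "middle") this.beat = "ending";
    else this.beat = "resolved";
    return this.beat;
  }

  // On battery death or handoff: jump straight to a short resolution
  // rather than cutting off mid-beat.
  interrupt(): Beat {
    this.beat = "resolved";
    return this.beat;
  }

  get current(): Beat { return this.beat; }
}

const arc = new SessionArc();
arc.advance();   // now at "middle"
arc.interrupt(); // interruption still lands on a clean "resolved"
```

The point of the `interrupt` path is that the user's last moment with the experience is a deliberate beat, never a hard cut.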
Multi-session formats
Spatial Anchors allow persistent content across sessions. For brand activations running across multiple days, this opens up multi-session experience design: a user can start an experience at one visit, and the spatial state is preserved when they return. Content can layer and evolve across visits. For venue brands or multi-day events, this is a significant creative opportunity that no phone-based activation can replicate.
Frequently asked questions
What is the Snap Spectacles developer kit?
A leased developer hardware program at $99/month on a 12-month commitment. You get access to 5th gen Spectacles hardware, Lens Studio 5.0, and Snap developer support. It is the primary way studios and brands can build Spectacles experiences before the consumer launch. Available at spectacles.com/build.
Can multiple people share a Snap Spectacles AR experience at the same time?
Yes. Colocated Lenses allow up to three Spectacles devices to share the same AR space simultaneously via Bluetooth. All wearers see virtual objects in the same physical location. This is one of the most compelling formats for brand activations: a shared spatial moment that simply cannot happen on a phone.
How do hand gestures work on Spectacles?
Spectacles tracks both hands simultaneously. The Spectacles Interaction Kit (SIK) provides three interaction modes out of the box: indirect ray-based pointing, direct pinch, and direct poke. No controller needed. Experiences feel natural and physical in a way that holding a phone never does.
What did RBKAVIN. Immersive Studio build for Snap Spectacles?
We built noodle at MIT Reality Hack 2026: a mixed reality creative workbench that turns physical surroundings into an infinite canvas for generative AI. Users move from 2D sketch to 3D reality using only hands and voice. noodle won the Snap Spectacles + AI Lens category at MIT Reality Hack 2026. You can also try the live Snap Camera Kit demo.
When are Snap Spectacles available to consumers?
Consumer Specs are planned for 2026 via Specs Inc., Snap's dedicated AR glasses subsidiary. No firm release date has been confirmed as of April 2026. Until then, the $99/month developer kit is how studios and brands access the hardware.
Build a Spectacles experience
We hold a Spectacles dev kit and have shipped work on the platform. Talk to us before the build window fills.
Start a project