When someone searches for a smart glasses developer, they usually have one of a few things in mind: a Snap Spectacles experience for a brand event, a Meta Ray-Ban integration for a content campaign, or a vague conviction that smart glasses are the next platform and they need someone who actually works on them.

All of those are reasonable starting points. But one thing is worth saying upfront: smart glasses are not one category. The two main platforms right now have almost nothing in common technically, and the build work is genuinely different. Getting that distinction right at the brief stage saves months.

Two platforms, two very different briefs

Snap Spectacles (5th generation) are AR glasses with a real display. They project digital content into a 46-degree field of view. The platform offers six-degrees-of-freedom spatial tracking, two-hand tracking, world mesh, surface detection, and the ability to anchor content to locations in physical space. Colocated multi-user experiences can run across up to three devices simultaneously. This is spatial computing. You are designing an experience that happens in a place.

Meta Ray-Ban (Gen 2, the widely available model) has no display and no AR overlay. What it does have: a camera, microphones, speakers, and an AI assistant. The Meta Wearables Device Access Toolkit (in public preview since December 2025, at developers.meta.com/wearables/) gives third-party developers access to the camera feed and audio. You can build POV content pipelines, livestream integrations with Instagram and Facebook Live, and AI-guided audio experiences. The brief here is about content and intelligence, not spatial overlay.

The Meta Ray-Ban Display, at $799, adds a 600×600 px in-lens screen with a 20-degree field of view and a Neural Band EMG wristband for control. That is closer to Spectacles in concept, but it is a newer, less widely adopted platform with a different development path.

Snap Spectacles

Spatial computing platform

True AR display. Hand tracking, world understanding, spatial anchors. Built in Lens Studio 5.0 with TypeScript. Dev kit at $99/month. The brief: design a spatial experience.

Meta Ray-Ban Gen 2

AI camera platform

No display. Camera, audio, AI assistant. Meta Wearables SDK for third-party access. The brief: build a content pipeline, audio experience, or livestream integration.

Building for Snap Spectacles

Developer using hand gestures with Snap Spectacles 5th gen AR glasses
Building for Spectacles means designing for hand gestures and spatial interaction, not touch. © Pocket-lint

Spectacles experiences are built in Snap Lens Studio 5.0. The language is TypeScript or JavaScript. If you have built Snapchat Lenses before, the tooling is familiar, but the design constraints are entirely different. Lens Studio has over 400,000 developers in its community, which means there is a wide pool of people who can write the code. The group that knows how to design and ship a good Spectacles experience is much smaller.

Hand tracking and the Spectacles Interaction Kit

The Spectacles Interaction Kit (SIK) is the primary interaction framework. It handles hand tracking, gesture recognition, and the logic that maps hand movements to UI actions. Designing for SIK means thinking through every interaction as a physical gesture rather than a tap or click. What does the user do to confirm a selection? How do they dismiss something? What happens when they reach toward an object?

The constraint is that the interaction has to be legible without a tutorial. Users at a brand event do not read instructions. The experience has to make its own affordances visible through spatial cues and immediate feedback.
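To make the shape of that work concrete, here is a minimal sketch of wiring hover feedback and a pinch-to-confirm action through SIK's Interactable component. The import path and event names follow current SIK documentation but vary between SIK versions, so treat them as illustrative rather than canonical.

```typescript
// Minimal sketch: hover feedback plus pinch-to-confirm via SIK.
// The import path and event names follow current SIK docs but differ
// between SIK versions -- verify against your project.
import { Interactable } from "SpectaclesInteractionKit/Components/Interaction/Interactable/Interactable";

@component
export class ConfirmButton extends BaseScriptComponent {
  // Scene object carrying the Interactable component.
  @input
  target: SceneObject;

  onAwake() {
    const interactable = this.target.getComponent(
      Interactable.getTypeName()
    ) as Interactable;

    // Hover: surface the affordance before the user commits.
    interactable.onHoverEnter.add(() => this.setHighlight(true));
    interactable.onHoverExit.add(() => this.setHighlight(false));

    // Pinch (trigger) confirms the selection -- no tap, no click.
    interactable.onTriggerStart.add(() => this.confirm());
  }

  private setHighlight(on: boolean) {
    // Project-specific feedback: swap a material, scale up slightly,
    // play a sound -- whatever makes the affordance read instantly.
    print(on ? "hover" : "idle");
  }

  private confirm() {
    print("selection confirmed");
  }
}
```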

Spatial anchors and world understanding

Spectacles can use spatial anchors to pin digital content to specific locations in the physical environment. A product appears on a specific shelf. A character stands at a marked point on a floor. This works reliably in controlled environments where the anchor positions can be set up before the activation runs.

World mesh and surface detection let the experience understand the geometry of the space. Content can land on surfaces, respond to walls, or react to objects the user picks up. This is where Spectacles starts to behave like a genuine spatial computing device rather than a head-mounted screen.
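As an illustration, here is a sketch of landing content on a detected surface with a world-query hit test. The module name, session API, and axis conventions follow Snap's Spectacles samples and should be verified against your Lens Studio version; Lens Studio distances are in centimeters.

```typescript
// Sketch: land content on a real surface via a world-query hit test.
// Module name, session API, and axis conventions are taken from
// Snap's Spectacles samples -- verify against your Lens Studio version.
const worldQuery = require("LensStudio:WorldQueryModule");

@component
export class SurfacePlacer extends BaseScriptComponent {
  @input
  content: SceneObject; // object to pin to the detected surface

  @input
  cameraObject: SceneObject; // camera to cast the ray from

  private hitTestSession: any;

  onAwake() {
    this.hitTestSession = worldQuery.createHitTestSession();
    this.createEvent("UpdateEvent").bind(() => this.castRay());
  }

  private castRay() {
    const t = this.cameraObject.getTransform();
    const start = t.getWorldPosition();
    // Cameras look down -Z, so "in front" is the transform's back
    // vector; 300 cm puts the ray end roughly 3 m ahead.
    const end = start.add(t.back.uniformScale(300));

    this.hitTestSession.hitTest(start, end, (hit) => {
      if (hit) {
        // Snap the content to the surface point the ray found.
        this.content.getTransform().setWorldPosition(hit.position);
      }
    });
  }
}
```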

Colocated multi-user experiences

Up to three Spectacles devices can run a shared colocated Lens simultaneously, meaning multiple users see the same spatial content in the same physical space. This is one of the most compelling things Spectacles can do that phone AR simply cannot. A group of people standing in the same room all see the same thing, anchored to the same physical points, interacting with the same content. The social dynamic of shared AR is fundamentally different from passing a phone around.

For brand activations, this opens up multiplayer formats: collaborative experiences, competitive games, or shared narratives where each participant's actions affect the shared environment.
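Snap's Connected Lenses and the Spectacles Sync Kit provide the actual session and transport APIs for this. As a platform-agnostic illustration of the underlying state model, here is a sketch in which every shared pose is expressed relative to a common spatial anchor, so each device resolves the same world positions from its own tracking. Rotation handling is omitted for brevity.

```typescript
// Platform-agnostic sketch of the colocated state model. Snap's
// Connected Lenses / Spectacles Sync Kit supplies the real session
// and transport APIs; this only illustrates the core idea: express
// shared poses relative to a common anchor so all devices agree.

type Vec3 = [number, number, number];

interface SyncMessage {
  deviceId: string;            // which colocated device sent this
  objectId: string;            // which shared object moved
  positionInAnchorSpace: Vec3; // offset from the shared spatial anchor
}

class SharedScene {
  private objects = new Map<string, Vec3>();

  // Last-writer-wins per object keeps the model simple for a group
  // of two or three colocated devices.
  applyRemote(msg: SyncMessage): void {
    this.objects.set(msg.objectId, msg.positionInAnchorSpace);
  }

  // Each device converts anchor-space offsets into its own world
  // frame using the anchor position it tracks locally.
  resolveWorldPosition(objectId: string, anchorWorld: Vec3): Vec3 | undefined {
    const local = this.objects.get(objectId);
    if (!local) return undefined;
    return [
      anchorWorld[0] + local[0],
      anchorWorld[1] + local[1],
      anchorWorld[2] + local[2],
    ];
  }
}
```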

What a finished Spectacles experience looks like

The deliverable is a Lens: an installable experience that runs on the device. For events, this typically means a session-based Lens with a defined start, a defined duration, and a defined end state. The experience needs to be battery-aware. Spectacles battery life is a real constraint that event formats have to design around.

The noodle project, which won the Snap category at MIT Reality Hack 2026, was a mixed reality creative workbench: users could sketch in 2D using hand gestures and voice, and the system converted those inputs into 3D objects anchored in the space around them. It ran on Spectacles using Lens Studio 5.0 and demonstrated what a production-quality Spectacles experience looks like. See the noodle case study.

Building for Meta Ray-Ban

Person wearing Ray-Ban Meta smart glasses using Meta AI
Ray-Ban Meta with Meta AI integration. The brief is content, audio, and AI, not AR overlays. © Meta

The Meta Wearables Device Access Toolkit is the entry point for third-party development on Meta Ray-Ban glasses. It provides programmatic access to the camera feed and audio. This is not a visual AR platform. There is no way to overlay content in the wearer's field of view on the Gen 2 model. The builds here are about what the glasses can capture and transmit, not what they can display.

Camera access and POV content pipelines

The most straightforward Meta Ray-Ban build is a POV content pipeline. The glasses capture first-person video and audio. A third-party app or backend processes that input in real time or post-capture. Use cases include creator tools that process the footage, brand activations where the wearer's POV is the content asset, and documentation workflows where the glasses function as a hands-free recording device.

The SDK build here involves authenticating the device, handling the camera stream, and whatever processing or routing happens downstream. The experience design is about what happens to the footage, not what the wearer sees.
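Here is a hypothetical backend sketch of the downstream half: a companion app built with the toolkit uploads captured clips, and a small ingestion service queues them for processing. The endpoint, header, and queue are illustrative names, not part of Meta's toolkit.

```typescript
// Hypothetical ingestion service for a POV content pipeline. The
// companion app (built with the Meta Wearables Device Access Toolkit)
// uploads clips here; the toolkit defines camera access on-device,
// not this API -- everything below is an illustrative stand-in.
import http from "node:http";
import { randomUUID } from "node:crypto";

type Clip = { id: string; wearerId: string; bytes: Buffer };
const processingQueue: Clip[] = []; // stand-in for a real job queue

const server = http.createServer((req, res) => {
  if (req.method === "POST" && req.url === "/ingest") {
    const chunks: Buffer[] = [];
    req.on("data", (c) => chunks.push(c));
    req.on("end", () => {
      const clip: Clip = {
        id: randomUUID(),
        wearerId: String(req.headers["x-wearer-id"] ?? "unknown"),
        bytes: Buffer.concat(chunks),
      };
      // Downstream: transcode, tag, publish -- whatever the brief needs.
      processingQueue.push(clip);
      res.writeHead(202).end(JSON.stringify({ id: clip.id }));
    });
    return;
  }
  res.writeHead(404).end();
});

server.listen(8080);
```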

Livestream integrations

Meta Ray-Ban glasses support Instagram Live and Facebook Live streaming directly from the device. Third-party SDK builds can integrate into this pipeline, adding overlays, processing, or routing to the stream at the platform level rather than the device level. For brands, this is a live POV activation format: the glasses wearer is on camera, live, in the first person, with the audience watching through their eyes.

The build work here involves platform API integration, stream handling, and any real-time processing that needs to happen between the glasses and the broadcast output.
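As an illustration of that middle layer, here is a sketch of a relay that pulls an incoming RTMP feed, burns in a simple overlay with ffmpeg, and pushes the result to a broadcast ingest URL. The URLs and stream key are placeholders, and the drawtext filter assumes an ffmpeg build with libfreetype.

```typescript
// Illustrative relay between the glasses-originated stream and the
// broadcast output: pull the feed, apply real-time processing (here
// just a text overlay), push to the platform's ingest URL.
import { spawn } from "node:child_process";

const INGEST_IN = "rtmp://relay.example.com/live/pov";                 // placeholder
const BROADCAST_OUT = "rtmps://live-api.example.com/stream?key=XXXX";  // placeholder

const ffmpeg = spawn("ffmpeg", [
  "-i", INGEST_IN,
  // Real-time processing hook: a burned-in overlay for this sketch.
  "-vf", "drawtext=text='LIVE POV':x=24:y=24:fontsize=36:fontcolor=white",
  "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
  "-c:a", "aac", "-b:a", "128k",
  "-f", "flv", BROADCAST_OUT,
]);

ffmpeg.stderr.on("data", (d) => process.stderr.write(d)); // ffmpeg logs to stderr
ffmpeg.on("close", (code) => console.log(`relay exited: ${code}`));
```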

Meta Ray-Ban Display builds

The Display model ($799, 600×600 px, 20-degree FOV) uses a Neural Band EMG wristband for interaction. Building for it sits closer to traditional AR development: you are designing content for a small in-lens display with muscle-signal input. It is a newer platform with a smaller install base, and the development tooling is less mature than Lens Studio. The brief for a Display build is more like a constrained AR experience than a Spectacles experience, but the FOV and interaction method are different enough that it requires its own design approach.

What the event activation layer looks like

For most brand builds, the Spectacles Lens or Meta SDK integration is only part of the work. The other part is getting it in front of people at an event, running reliably, for hours, with staff who can handle anything that goes wrong.

Hardware sourcing and setup

Spectacles are developer hardware. They are not rented from a consumer electronics shop. Getting devices for an event means sourcing through the developer program, factoring in lead time, and managing a limited pool of units. Setup involves pairing devices, staging the Lens, confirming spatial anchor positions in the actual venue, and testing the full experience on the hardware in the space before doors open.

This is where builds that have only been tested in a studio fall apart. The room lighting is different. The floor texture confuses the world mesh. An anchor that was placed in the morning has drifted by the afternoon. Hardware testing in the actual venue, ideally a full day before the activation runs, is not optional.

Session design and battery logistics

Spectacles battery life determines how long each session can run and how many sessions are possible before a charge cycle. Event design has to account for this. A 4-minute activation with 15 minutes of charging in between is a very different operational format than a continuous 90-minute installation. Session design, rotation logistics, and staff briefing are part of the deliverable for an event activation package.
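A quick worked example shows why this shapes device counts. Assuming, conservatively, that a device goes on charge after every session (the real runtime allows several sessions per charge, so treat these numbers as a floor), a 4-minute session plus 15 minutes of charging means a 19-minute cycle per device:

```typescript
// Worked example for session planning, under the illustrative
// assumption that a device charges after every session.
function devicesForContinuousStation(sessionMin: number, chargeMin: number): number {
  const cycle = sessionMin + chargeMin; // one device's full rotation
  return Math.ceil(cycle / sessionMin); // devices needed to run back-to-back
}

function sessionsPerHourPerDevice(sessionMin: number, chargeMin: number): number {
  return 60 / (sessionMin + chargeMin);
}

// 4-minute sessions with 15 minutes on the charger:
console.log(devicesForContinuousStation(4, 15));              // 5 devices per station
console.log(sessionsPerHourPerDevice(4, 15).toFixed(1));      // ~3.2 sessions/hr/device
```

Five devices per continuously running station, at roughly three sessions per device per hour, is the kind of number that drives rotation logistics, charger counts, and staffing.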

On-site support

Something will behave differently on the day than it did in testing. The studio that built the experience is the right team to handle that. On-site support for smart glasses activations is not optional staffing: it is part of the build. Studios that hand off and disappear create problems for brands and their event teams.

What to look for in a smart glasses studio

If you are briefing smart glasses work, here is what separates studios that can deliver from studios that are figuring it out on your budget.

They own hardware. You cannot build a Spectacles experience without a Spectacles device. Ask whether the studio has the hardware on hand. A studio that needs to order hardware after signing a contract is six weeks behind before they start.

Lens Studio proficiency, not just AR experience. General AR experience does not transfer directly to Spectacles. Lens Studio 5.0 with TypeScript, SIK for hand tracking, and spatial anchor logic are specific skills. Ask for shipped Lenses, not just phone AR portfolios.

Event delivery track record. Building the experience is one thing. Running it at an event with real people, under time pressure, in a venue you've never visited is another. Ask specifically about event activations, not just demos or installations in controlled spaces.

Platform clarity. A studio that treats Spectacles and Meta Ray-Ban as the same category has not shipped on both. The brief, tools, and skills are different. Clarity about which platform is right for the objective signals that the studio knows what it is doing.

On-site commitment. Confirm early whether the studio will be present during the activation. If the answer is no, weight that heavily. Smart glasses activations at events require someone who built the experience to be on the floor.

What we have built

RBKAVIN. Immersive Studio has Spectacles production credits and active Meta Ray-Ban SDK builds.

noodle won the Snap category at MIT Reality Hack 2026. It was a mixed reality creative workbench built for Spectacles: 2D sketches via hand tracking and voice, converted in real time into 3D objects anchored in space. The project ran on Lens Studio 5.0. Case study here.

The live Snap Camera Kit integration is available at ar.rbkavin.studio/demos/snap/ for reference.

If you are planning a Spectacles activation or a Meta Ray-Ban SDK build, get in touch early. Build windows that tie to event dates fill up, and testing on hardware takes time that cannot be compressed at the end of a project.

Frequently asked questions

What does a Snap Spectacles developer build?

Lenses: spatial experiences that run on Spectacles hardware, built in Lens Studio 5.0 using TypeScript or JavaScript. They can include hand-tracking interactions, world-anchored AR content that persists in a space, shared multi-user experiences, and AI-integrated generative layers. The dev kit is available at $99/month, and hardware is required for proper testing.

Can brands hire a studio to build for Meta Ray-Ban glasses?

Yes, though the work is different from AR glasses builds. Without an AR display, the brief is about content pipelines, AI-guided audio experiences, POV livestreaming integrations, and SDK-based camera access. The Meta Wearables Device Access Toolkit (in public preview since December 2025) is the route for third-party builds. Studios need SDK access and familiarity with Meta's platform requirements.

How long does it take to build a smart glasses experience?

For a Spectacles Lens activation: typically 6-10 weeks from brief to event-ready, depending on interaction complexity and whether multi-user or spatial anchor features are required. Meta Ray-Ban SDK builds vary more: a POV content tool might take 3-4 weeks; a full event activation with AI integration runs 8-12 weeks. Testing on hardware adds time that phone AR builds don't have.

What is the difference between building for Snap Spectacles and Meta Ray-Ban?

Spectacles has a true AR display and full spatial computing capabilities: hand tracking, world understanding, persistent anchors. You are designing a spatial experience. Meta Ray-Ban (Gen 2 screenless) has no display. You are building a content and AI platform: POV capture, audio interactions, livestream pipelines. The Meta Ray-Ban Display adds an in-lens screen but is a newer, less widely adopted platform. The brief, the tools, and the skills involved are substantially different.

Building for smart glasses?

Get in touch. We will confirm whether the platform fits the brief and give you an honest read on timeline and scope.
