This is the one to watch
I have built experiences for a lot of AR hardware over the years. Most of it required significant creative compromise: the display was too narrow, the battery life was too short, the interaction model was too unfamiliar, or the hardware itself was too strange for a mainstream audience to wear in public. Every platform got one or two things right and left several that needed explaining.
Meta Orion, shown publicly for the first time at Meta Connect in September 2024, is different. Not because it is perfect. It is a prototype, the compute puck is not subtle, and it is nowhere near a consumer product yet. But because it is the first time I have seen the full picture come together: a display wide enough to feel spatial rather than framed, an input method that does not require you to hold anything or wave your arms, and an AI layer integrated deeply enough to actually be useful. For the first time, I can see the brief that brands will eventually bring to studios like ours, and it is a genuinely interesting brief.
What Orion actually is
Before getting into what it means for brand work, it is worth being precise about what Orion is and what it is not. Orion is a true AR glasses prototype. It uses silicon carbide waveguide lenses to project holographic images into your field of view, so digital content appears to exist in the physical world in front of you. This is fundamentally different from Meta Ray-Ban Gen 2, which has no display, and from the Ray-Ban Display model, which has a small 20-degree LCOS screen.
The field of view is 70 degrees diagonal. That number matters enormously. The current Ray-Ban Display is 20 degrees, roughly a paperback cover held at arm's length. Snap Spectacles 5th gen is 46 degrees. Orion at 70 degrees approaches the range where you stop thinking about where the display ends and just see the content. For brand work, the difference between 20 degrees and 70 degrees is the difference between a notification and an environment.
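Field-of-view numbers are easier to feel in linear terms. A quick back-of-envelope sketch (the ~60 cm arm's length is my assumption, not a spec):

```python
import math

def visible_width_m(fov_deg: float, distance_m: float = 0.6) -> float:
    """Linear span subtended by a given field of view at a given distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

for name, fov in [("Ray-Ban Display", 20), ("Spectacles 5th gen", 46), ("Orion", 70)]:
    print(f"{name}: {fov} deg ~ {visible_width_m(fov) * 100:.0f} cm wide at arm's length")
```

Run it and the jump is obvious: 20 degrees covers about 21 cm at arm's length, 46 degrees about 51 cm, and 70 degrees about 84 cm, roughly a fourfold increase in visible width before you even account for the vertical dimension.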
Input is handled by two mechanisms working together: eye tracking built into the glasses and a Neural Band worn on the wrist. The Neural Band reads muscle signals via electromyography: imperceptible wrist and finger movements translate into interface commands without requiring you to raise or wave your hands. For everyday wear, this is the breakthrough. The act of interacting with an Orion experience looks, from the outside, like you are doing nothing. That removes the biggest social friction barrier that current AR hardware faces.
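Meta has not published a Neural Band SDK, so everything below is hypothetical, but the interaction model the paragraph describes, where an upstream EMG classifier emits discrete gesture events and eye tracking supplies the target, can be sketched simply:

```python
from typing import Callable

# Hypothetical gesture labels; Meta has published no Neural Band API.
# Assume an upstream EMG classifier emits one of these per detected gesture.
PINCH = "pinch"
DOUBLE_PINCH = "double_pinch"
THUMB_SWIPE_RIGHT = "thumb_swipe_right"

class GestureRouter:
    """Maps classified wrist gestures to app commands, combined with gaze."""
    def __init__(self) -> None:
        self._handlers: dict[str, Callable[[str], str]] = {}

    def on(self, gesture: str, handler: Callable[[str], str]) -> None:
        self._handlers[gesture] = handler

    def dispatch(self, gesture: str, gaze_target: str) -> str:
        # Eye tracking supplies the target; the wrist supplies the intent.
        handler = self._handlers.get(gesture)
        return handler(gaze_target) if handler else "ignored"

router = GestureRouter()
router.on(PINCH, lambda target: f"select:{target}")
router.on(THUMB_SWIPE_RIGHT, lambda target: f"next_panel_from:{target}")

print(router.dispatch(PINCH, "product_card_3"))  # select:product_card_3
```

The point of the sketch is the division of labor: the app never sees raw muscle signals, only small discrete events, which is why the interaction can stay invisible to bystanders.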
Why 70 degrees changes the creative brief
Field of view is not just a spec number. It determines what kind of spatial experience is possible. On a 20-degree display, you are designing a ticker or a notification layer: a single piece of information overlaid on the world. On a 46-degree display like Spectacles, you are designing a spatial interface: menus, objects, anchored UI panels, content that fills a room. At 70 degrees, you are designing an environment.
At 70 degrees, a brand can overlay an entire scene onto the world around the user. A product launch does not need a stage: the stage is the room. A retail display does not need a fixture: the fixture is the aisle the customer is walking down. A live event does not need LED walls: the walls are the walls, layered with whatever the brand wants the audience to see. The visual language shifts from informational (look at this thing) to environmental (look around you). That is a completely different brief from anything the industry has worked with before.
I want to be honest about what this means for creative work: we do not fully know yet. The Orion prototype exists; a consumer product with mass distribution does not. The experiences that will define this platform have not been made. But the studios that will make them will be the ones that already understand spatial anchoring, 6DoF content design, wrist-based interaction, and AI-first experience logic, because those are exactly the skills that Snap Spectacles and Meta Ray-Ban are building today.
The Neural Band is the most underrated part
Everyone talks about Orion's field of view. The Neural Band gets less attention, and I think that is backwards. The FOV is impressive but evolutionary: we have been waiting for wider FOV AR for years. The Neural Band is something genuinely new, and it solves a problem that no other wearable interface has solved cleanly.
The core problem with AR input is visibility. On phone AR, you tap a screen. On Snap Spectacles, you use hand gestures. Both of those are legible: a bystander can see what you are doing and understand that you are interacting with something digital. The social cost is low for a short experience, but it becomes a barrier for all-day wear. The Neural Band removes that cost entirely. The interaction is in your wrist. It is invisible. You sit in a meeting, select something in your AR view, and no one around you sees anything different.
For brand activations, this opens a category that has never really worked in consumer AR: ambient, persistent, wearable experience. A user can walk through a physical retail environment, look at products, and receive contextual overlays without pulling out a phone and without performing visible gestures. The interaction feels private, which is a completely different emotional register from anything that requires a raised hand or a tap.
What brand activations might actually look like
I spend a lot of time thinking about this. The honest answer is that the formats do not exist yet in finalized form. But based on what Orion makes technically possible and what we have learned building spatial experiences on Spectacles and Ray-Ban, here is how I think brand activations will work on consumer Orion.
Persistent spatial product layers
The most obvious application is what spatial anchoring enables: a brand can place a persistent AR object at a physical location and have it remain there for any Orion user who visits. A sneaker brand places a life-size 3D model of a new shoe on the shelf next to the physical one. A car brand places a full-size vehicle configuration tool in their showroom. A fashion label turns a store fitting room into a virtual try-on space with no screen required. The user walks in, puts on the glasses, and the brand's layer is already there. This is closer to store design than it is to app design.
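No public Orion SDK exists, so the names below are illustrative, but the core data structure is familiar from Spectacles-era spatial work: a persistent anchor pairing a world position with brand content, plus a proximity query for whatever should render around the viewer.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class SpatialAnchor:
    """Hypothetical persistent anchor: a world position paired with content."""
    anchor_id: str
    x_m: float      # position in a shared store-level coordinate frame
    y_m: float
    z_m: float
    asset_uri: str  # 3D asset the brand wants rendered at this spot

class AnchorRegistry:
    """Illustrative store-side registry of placed anchors."""
    def __init__(self) -> None:
        self._anchors: list[SpatialAnchor] = []

    def place(self, anchor: SpatialAnchor) -> None:
        self._anchors.append(anchor)

    def nearby(self, x: float, y: float, z: float, radius_m: float) -> list[SpatialAnchor]:
        """Anchors within radius of the viewer: what the glasses would render."""
        return [a for a in self._anchors
                if math.dist((x, y, z), (a.x_m, a.y_m, a.z_m)) <= radius_m]

shelf = AnchorRegistry()
shelf.place(SpatialAnchor("sneaker-launch", 2.0, 0.0, 1.2, "assets/shoe_v2.glb"))
print([a.anchor_id for a in shelf.nearby(1.5, 0.0, 1.0, radius_m=1.0)])  # ['sneaker-launch']
```

The design point is that the anchor lives with the location, not with the user's session, which is what makes this closer to store design than app design: any visitor's glasses query the same registry and see the same layer.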
AI-contextual experiences
Orion runs Meta AI natively. Point the glasses at a product and the AI can tell you what it is, compare it to alternatives, surface review data, show you how other people styled it, play a short video about how it was made. The interaction is as natural as asking a knowledgeable friend a question. For brands, this means contextual commerce is no longer a QR code you have to scan and a webpage that loads slowly. It is immediate, conversational, and ambient, triggered by what the user is already looking at.
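There is no public API for Meta AI on Orion, so this is purely a sketch of the flow the paragraph describes: a recognized object keys a contextual brand response, with every name here invented for illustration.

```python
# Illustrative only: no public Meta AI / Orion API exists.
# Sketch of the flow: recognized object -> contextual brand overlay.
PRODUCT_CONTEXT = {
    "trail_runner_v2": {
        "summary": "Lightweight trail shoe, 8 mm drop",
        "rating": 4.6,
        "styling_video": "videos/trail_runner_styling.mp4",
    },
}

def contextual_overlay(recognized_object: str) -> str:
    """What an ambient layer might surface for the product the user is viewing."""
    ctx = PRODUCT_CONTEXT.get(recognized_object)
    if ctx is None:
        return "no overlay"
    return f"{ctx['summary']}, rated {ctx['rating']}/5"

print(contextual_overlay("trail_runner_v2"))
```

The contrast with a QR code is the trigger: the lookup is keyed by what the user is already looking at, so nothing has to be scanned or loaded deliberately.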
Environmental brand moments at live events
At a concert, a brand takeover no longer needs physical installation. A spatial layer visible to Orion users in the venue can transform the environment: additional visual effects, brand-aligned spatial graphics layered over the stage, a persistent trophy or collectible that appears after the performance. The physical event remains unchanged for everyone else. For Orion users who opt in, the brand has added a layer to the experience that feels like it was always part of it.
The honest read on timeline
I want to be direct here, because there is a lot of hype around Orion and some of it is not grounded. The consumer version, internally codenamed Artemis, is targeted for approximately 2027. That is not a committed shipping date. It is a prototype roadmap ambition. Hardware product development at this level of technical complexity routinely slips by a year or more, and the silicon carbide waveguide manufacturing process is reportedly expensive enough that even if Orion ships in 2027, the price point will determine whether it reaches mass distribution or stays in the premium tier for the first few years.
None of this means brands should wait. The opposite is true. The creative and technical vocabulary for spatial AR is being written right now, on the hardware that exists today. Snap Spectacles is where you prove world-locked content, spatial anchors, hand-first interaction, and 6DoF experience design. Meta Ray-Ban is where you prove AI-first interaction logic, ambient wearable formats, and Neural Band input. Both of those proof points translate directly to Orion. The studio that arrives at the first Orion brief with three years of spatial AR work behind them is not starting from zero.
Why I am paying attention now
I have been asked a few times whether Orion changes the smart glasses picture enough to warrant attention while it is still a prototype. My answer is yes, but not because we should be building for Orion specifically. It is because Orion is the clearest signal the industry has ever sent about the direction consumer AR is heading. When Meta, with its distribution, its AI infrastructure, and its hardware manufacturing partnerships, commits to a direction (true holographic display, hands-free input, ambient AI), the rest of the industry aligns around it.
The experiences we are building on Spectacles right now for clients are not throwaway prototype work. They are a live laboratory for every creative and technical question that Orion will make mainstream: how do you anchor content in a real-world space? How do you design interaction that does not require held attention? How do you structure a spatial narrative with a beginning, middle, and end? These questions do not have obvious answers yet. The studios that have answered them in practice, on real hardware, with real users, will be the ones that can execute when Orion ships at scale.
That is the bet we are making. Not on a specific release date. On the direction of the medium.
What this means if you are a brand
If you are a brand thinking about smart glasses, here is the practical read. You are probably two years from Orion being a platform worth activating on at consumer scale. You are not two years from needing to understand spatial AR. The learning curve for this medium is steep, not technically but creatively. The instinct to port what works on phone to a spatial canvas is strong and almost always wrong. The brands that will have compelling Orion activations in 2027 will have spent 2025 and 2026 understanding what spatial means for their specific customer and category.
The current smart glasses ecosystem (Meta Ray-Ban for ambient AI activations, Snap Spectacles for immersive spatial builds) is not a placeholder for Orion. It is the proving ground. If you have not yet run a spatial activation, that is where to start: with hardware that exists, with audiences that can experience it now, with learnings that compound before the platform you are really waiting for arrives.
If you want to talk through what that looks like for a specific brief, the contact link is below.