The honest starting point

Most measurement problems with AR campaigns are not technical. They happen because nobody agreed on what success looks like before the brief was signed off. The activation launches, the numbers come in, and then everyone argues about whether those numbers are good.

AR sits in a specific place in the funnel: deeper engagement than a display ad, lighter touch than a purchase. It is primarily an awareness and engagement tool. Measuring it like direct-response media will produce disappointment. Measuring it on its own terms, and communicating that frame to your leadership team upfront, produces clarity.

The rule that prevents most post-campaign arguments: agree on three things before you brief the studio. What action do you want people to take? How will you know if it reached the right people? What does "good" look like as a number for this specific format? The answers are different for every campaign.

Three questions every campaign should answer before launch

These are not reporting questions. They are brief questions. Get them agreed at the start, and measurement becomes a formality.

  1. What action do we want people to take? Not a vague outcome like "brand awareness." Something specific: open the lens and share it, scan the QR code, watch the AR trail to the end, tap through to the product page. One primary action per campaign. Everything else is secondary data.
  2. How will we know if the activation reached the right people? Reach and impressions tell you scale, not quality. For a Snap lens, look at the age breakdown and geography in Lens Studio analytics. For WebAR at a venue, footfall data tells you whether the people who scanned were the audience you intended. For social, look at the profile of who shared it.
  3. What is a realistic success benchmark for this format? Benchmarks differ by format, by paid support, and by brand size. A lens getting 10,000 plays with zero paid promotion is a strong organic result. The same number on a campaign with heavy paid media behind it is a sign something went wrong. Set the benchmark before launch, not after the numbers land.

Metrics by format

Each AR format produces a different set of numbers from a different source. Know your platform, know your dashboard.

Snap AR (Lens Studio)

Snap provides lens analytics directly through the Lens Studio dashboard. You do not need to instrument anything yourself. The key metrics are:

  • Impressions: how many times the lens was opened, including by people who received it via share.
  • Plays: how many times the lens ran. More meaningful than impressions alone.
  • Shares: lens sent to others or posted as a Snap. The most valuable organic signal.
  • Avg. play time: how long users actively engaged. Typically 10 to 20 seconds for well-designed lenses.
  • Reach: unique users. Separates genuine audience size from repeat opens.
  • Viral impressions: shares that resulted in new opens by other users. The organic multiplier.

Viral impressions are worth paying close attention to. When a shared lens gets opened by someone who didn't receive paid media, that is genuine earned reach. It is the closest equivalent to word-of-mouth in an AR context. The HBO House of the Dragon activation we built on Snap generated over 100K impressions, with a meaningful portion driven by viral opens from shares rather than direct campaign delivery.

WebAR

WebAR does not have its own native analytics dashboard. You measure it through whatever analytics stack you already use: Google Analytics 4, Mixpanel, or a custom event setup. This gives you more flexibility but requires setup before launch. The metrics to track are:

  • Unique sessions: total individual visits to the experience URL. Your baseline audience size.
  • Camera activations: how many sessions actually launched the AR view. Separates browsers from participants.
  • Session duration: time in the experience. Long sessions mean the AR is holding attention.
  • QR scan count: if using QR entry, track scans separately to understand physical touchpoint performance.
  • Bounce / engagement rate: did people stay or leave immediately? GA4 engagement rate is more useful than bounce rate here.
  • Downstream conversions: ticket purchase, sign-up, product page visit. Set as GA4 conversion events before launch.

Camera activation rate is the metric most teams forget to track. Sessions alone tell you how many people found the experience; activations tell you how many actually used it. The gap between the two is where you find friction to fix. For the Chester Zoo Halloween WebAR trail, we tracked activations at each individual QR entry point across the site, which showed us exactly which animal locations drew the most engagement. Over 3,000 kids used the trail during Halloween week.
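As a concrete sketch of that setup: GA4's `gtag('event', ...)` call is the standard way to send custom events, but the event and parameter names below (`qr_entry`, `camera_activation`, `qr_location`, the example `?src=qr&loc=...` URL scheme) are illustrative choices of ours, not anything GA4 or Snap prescribes. The small wrapper lets the sketch run outside a live page.

```javascript
// Minimal GA4 instrumentation sketch for a WebAR entry funnel.
// Event and parameter names are illustrative assumptions, not GA4 requirements.

const sentEvents = [];

// Wrapper around gtag so events are also recorded locally; on a real page
// this forwards straight to the GA4 gtag() global.
function sendEvent(name, params) {
  sentEvents.push({ name, params });
  if (typeof gtag === 'function') gtag('event', name, params);
}

// Fired on page load when the URL carries a QR source parameter,
// e.g. ?src=qr&loc=penguin-pool (hypothetical scheme).
function trackQrEntry(qrLocation) {
  sendEvent('qr_entry', { qr_location: qrLocation });
}

// Fired only when the user grants camera access and the AR view opens.
// The gap between qr_entry and camera_activation counts is your friction.
function trackCameraActivation(qrLocation) {
  sendEvent('camera_activation', { qr_location: qrLocation });
}

trackQrEntry('penguin-pool');
trackCameraActivation('penguin-pool');
```

Registering one entry event and one activation event per QR location is what makes the per-location comparison possible after launch.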

Custom native app AR

A native app gives you the most granular analytics control of any format. With Firebase, Amplitude, or a custom SDK you can track individual feature interactions, dropoff points within a session, time spent on specific AR moments, and return visit rates. The trade-off is friction: users have to download the app first, so your conversion funnel starts earlier and has an extra step to lose people.

For native app AR, the metrics that matter most are: activation rate (app downloads that led to AR being used), median session length, feature-level engagement (which AR moment held attention longest), and return sessions if the experience is designed for repeat use.
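The funnel arithmetic behind those metrics is simple enough to sketch. The function and all input numbers below are hypothetical placeholders; in practice the raw counts would come from Firebase, Amplitude, or your own SDK.

```javascript
// Sketch of the native-app AR funnel metrics described above.
// All numbers are hypothetical placeholders for illustration.

function funnelMetrics({ downloads, arActivations, sessionLengthsSec, returnUsers }) {
  // Median session length: more robust to a few very long sessions than the mean.
  const sorted = [...sessionLengthsSec].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  const medianSessionSec = sorted.length % 2
    ? sorted[mid]
    : (sorted[mid - 1] + sorted[mid]) / 2;

  return {
    activationRate: arActivations / downloads,  // downloads that led to AR use
    medianSessionSec,
    returnRate: returnUsers / arActivations,    // repeat use among activated users
  };
}

const report = funnelMetrics({
  downloads: 12000,
  arActivations: 7800,
  sessionLengthsSec: [42, 65, 18, 90, 55],
  returnUsers: 1900,
});
// report.activationRate === 0.65, report.medianSessionSec === 55
```

The point of reporting activation rate rather than downloads is visible immediately: 12,000 downloads sounds better than a 65 percent activation rate until you ask what the other 35 percent were.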

Live event and experiential AR

Live event attribution is harder, but not impossible. You cannot always connect a person standing in front of an installation to a downstream action. What you can track:

  • Footfall through the activation, either via crowd counting tech or dwell-time sensors
  • QR and URL scans at physical entry points
  • Social mentions and UGC volume in the 24 to 48 hours following the activation
  • Media coverage, including earned pickup from press and creators
  • Pre- and post-campaign brand uplift surveys for bigger activations where budget allows

For experiential, social volume and UGC are often the most compelling numbers to put in a stakeholder report because they are visible and legible. A wall of people's posts is easier to present to a CMO than a dwell-time heatmap.

Setting realistic benchmarks

What does good look like? The honest answer is: it depends on the format, the brand scale, and the media support behind the campaign. Some orientation points based on what we typically see:

  • Snap lens, major brand launch with paid media support: 50,000 to 100,000+ impressions is achievable. Organic only, without any paid push, is highly variable and often lower by a factor of five to ten.
  • WebAR trail at a physical venue: 500 to 5,000 sessions is a strong result, depending on venue footfall. Chester Zoo achieved 3,000+ users in a single Halloween week, which represents strong activation of the physical audience.
  • Social sharing rate on a Snap lens: if 5 to 10 percent of lens players share it, that is genuinely strong organic lift. Most campaigns sit below 5 percent.
  • Average play time: anything above 10 seconds for a Snap lens is solid. Above 15 seconds is excellent. Below 5 seconds suggests the experience is not landing.
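Those orientation points can be encoded as a simple grading helper. The function below is a hypothetical illustration; the thresholds mirror the bullets above (share rate of 5 percent or more is strong, play time above 10 seconds solid, above 15 excellent, below 5 not landing), not any platform-defined standard.

```javascript
// Hypothetical helper grading a Snap lens result against the orientation
// benchmarks above. Thresholds are taken from this article, not from Snap.

function gradeLens({ plays, shares, avgPlayTimeSec }) {
  const shareRate = shares / plays;

  // Share rate: 5-10% is genuinely strong organic lift; most sit below 5%.
  const shareVerdict = shareRate >= 0.05 ? 'strong organic lift' : 'typical (below 5%)';

  // Play time: >15s excellent, >10s solid, <5s suggests the experience is not landing.
  let playVerdict;
  if (avgPlayTimeSec > 15) playVerdict = 'excellent';
  else if (avgPlayTimeSec > 10) playVerdict = 'solid';
  else if (avgPlayTimeSec < 5) playVerdict = 'not landing';
  else playVerdict = 'borderline';

  return { shareRate, shareVerdict, playVerdict };
}

const result = gradeLens({ plays: 40000, shares: 2600, avgPlayTimeSec: 13 });
// shareRate 0.065 -> 'strong organic lift'; 13s -> 'solid'
```

The useful part is not the code but the discipline: writing the thresholds down before launch means the post-campaign conversation is about the agreed scale, not a negotiation.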

See the full cost and format breakdown in our AR activation cost guide for more context on how format choice affects expected performance.

One thing worth saying plainly: a WebAR experience with 800 genuinely engaged sessions can be a better result than a Snap lens with 80,000 impressions and a 1-second average play time. Volume is easy to inflate. Engagement depth is harder to fake.

Vanity metrics to avoid

Some numbers look impressive and tell you nothing useful. Flag these in your reporting setup before the campaign launches so they don't end up anchoring the post-mortem conversation.

  • Lens opens with zero-second play time. These are accidental taps or immediate exits. They inflate impressions without representing any real engagement. Snap's dashboard distinguishes between opens and plays. Use plays.
  • Follower count changes as a primary metric. An AR campaign is not a growth hack. Follower change during a campaign is mostly noise, influenced by ten other things happening simultaneously.
  • Raw app downloads without engagement data. A download that never results in an AR activation is not a success. It is a non-event. Always pair download numbers with activation rate.
  • Total reach across all channels combined. Stacking your paid media reach, your organic reach, and your partner reach into a single "total reach" figure obscures where the activation actually performed. Report each channel separately.

Building the stakeholder report

A good AR campaign report is one page of signal, not ten pages of screenshots. Structure it as:

  1. The agreed objective and KPI. Restate what success was defined as before launch. This frames everything that follows.
  2. The primary metric result. Did you hit the KPI? By how much, in either direction? Be direct.
  3. Two or three supporting metrics. Average play time, share rate, camera activation rate. These add depth but should support the primary metric, not distract from it.
  4. One qualitative signal. A sample of UGC, a creator post, a piece of earned media coverage. This grounds the numbers in something visible and human.
  5. Attribution gap acknowledged honestly. AR sits between awareness and conversion. Direct revenue attribution is rarely clean unless you have a downstream conversion event explicitly instrumented. Say this plainly rather than letting leadership infer it wasn't tracked.

The attribution gap is worth addressing directly rather than hoping nobody asks. AR awareness activity that cannot be directly linked to a sale is still valuable, for the same reasons that OOH and sponsorship spend is valuable: it builds recognition and context for the moments where people do convert. Frame it that way rather than pretending the gap doesn't exist.

Frequently asked questions

What metrics should I report for a Snap AR lens?

Report impressions, plays, shares, average play time, and reach (unique users). Snap's Lens Studio dashboard provides all of these directly. The most powerful organic metric to highlight is viral impressions: shares that led to a new open by a different user. That number shows whether the lens had genuine word-of-mouth lift beyond paid delivery.

How do I measure WebAR campaign performance?

Connect Google Analytics 4, Mixpanel, or a custom event setup to the experience URL before launch. Track unique sessions, camera activations (the moment the AR view actually opens), session duration, and any conversion events downstream such as a ticket purchase or email sign-up. If entry is via QR code, track scan volume separately so you can measure the physical touchpoint performance independently of the digital experience.

Is AR engagement higher than traditional digital ads?

Typically yes, when you measure time spent and interaction depth. A Snap lens session typically averages between 10 and 20 seconds of active interaction. A WebAR experience at a physical venue can hold attention for 60 to 90 seconds or more. Display ad viewability benchmarks sit well below those figures. The honest caveat: AR reaches fewer people than a broad paid media buy. The comparison depends on whether you are optimising for depth of engagement or raw reach. For most brand campaigns, AR earns its place as an engagement layer alongside broader media, not as a replacement for it.

How do I justify AR spend to my leadership team?

Frame AR in the same terms you use for any brand awareness investment: cost per engaged minute, share rate, UGC generated, and earned media coverage. Acknowledge the attribution gap honestly upfront. AR sits between awareness and conversion, so direct revenue attribution is rarely clean unless you have explicitly instrumented downstream conversion events. Present engagement depth and audience quality as the primary value, with any downstream conversion events as supporting evidence. Leadership teams respond better to an honest framing than to numbers that quietly exclude what wasn't tracked. You can also compare cost per second of engagement against equivalent video formats, where AR typically compares well.
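The cost-per-engaged-second arithmetic is worth showing once. The function below is the whole calculation; the spend and engagement figures in the example are hypothetical placeholders, and the comparison against a video format should use your own media rates rather than any number here.

```javascript
// Cost per engaged second: total spend divided by total seconds of
// active engagement. Example figures are hypothetical placeholders.

function costPerEngagedSecond(totalSpend, sessions, avgEngagedSeconds) {
  return totalSpend / (sessions * avgEngagedSeconds);
}

// Hypothetical Snap lens: £30,000 spend, 60,000 plays, 15s average play time.
// 60,000 x 15 = 900,000 engaged seconds.
const lensCps = costPerEngagedSecond(30000, 60000, 15);
// ≈ £0.033 per engaged second

// For the video comparison, plug in your own buy: spend, completed views,
// and the *attended* seconds per view, not the nominal ad length.
```

Presenting spend this way puts AR on the same axis as the rest of the media plan, which is exactly the framing a leadership team can evaluate.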