NOODLE
CASE STUDY
MIT Reality Hack 2026 × Snap Spectacles

Noodle

Snap Spectacles · MIT Reality Hack · Spatial AI · Winner
36h
Built in 36 hours
2
Prizes won
MIT
Reality Hack 2026
The brief

Sketch to 3D in one
spatial flow.

MIT Reality Hack 2026 open brief: build something meaningful for spatial computing in 36 hours. The team chose to solve a real problem for creative professionals. The constraint was the hardware: Snap Spectacles Gen 5, hand tracking only, no keyboard, no mouse.

Instead of building another AR productivity tool that digitised an existing desktop workflow, we asked what a creative workflow looks like when it starts in physical space. The answer was a node-based spatial canvas where a sketch on your real desk becomes the first input, your voice becomes the prompt, and a 3D model sitting on that same desk is the output.

No app switching, no file management, no keyboard. The entire idea-to-object pipeline in one continuous spatial flow.

Event
MIT Reality Hack 2026
Platform
Snap Spectacles Gen 5
Role
Creative Director · Lead XR Developer
Awards
Founders Lab Track Prize · Best Use of Spatial AI
Year
January 2026
Deliverable
Mixed reality creative workbench with node-based spatial AI pipeline.
In the space

Spatial nodes,
real desk, AI output.

Video
Snap Spectacles
Noodle spatial AI creative tool on Snap Spectacles MIT Reality Hack 2026
Noodle node canvas for sketch to 3D workflow
AI-generated 3D model from Noodle
Noodle running on Snap Spectacles in space
Process

Problem before idea.
Interface as logic.

01
The problem before the idea
Most hackathon projects start with the technology. We started with a friction point: creative professionals switch between an average of 10 applications to take a sketch to a 3D concept. That number became the brief we gave ourselves. Every design decision after that was tested against one question: does this reduce the switches or add one?
02
The wire system as interface
The node-and-wire interaction was not just visual. It was the logic of the tool made physical. We built a custom wire renderer in TypeScript so that dragging a connection between two nodes actually triggered the pipeline. The act of connecting became the act of creating. That decision unified the UI and the backend into a single gesture.
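The principle can be sketched in a few lines. This is a hypothetical, simplified model, not the production wire renderer: node names, the `connect` signature, and the stubbed `run` functions are all illustrative. The point it demonstrates is the one above: completing a wire connection is the same act as executing the downstream pipeline step.

```typescript
// Minimal sketch: a node graph where connecting a wire IS running the pipeline.
type NodeId = string;

interface PipelineNode {
  id: NodeId;
  // Each node transforms its input. In Noodle these would be steps like
  // sketch capture -> voice prompt -> 3D generation; here they are stubs.
  run: (input: string) => string;
}

class NodeGraph {
  private nodes = new Map<NodeId, PipelineNode>();
  private wires = new Map<NodeId, NodeId>(); // from -> to

  addNode(node: PipelineNode): void {
    this.nodes.set(node.id, node);
  }

  // The key decision: connecting triggers execution immediately, so the
  // UI gesture (dragging a wire) and the backend call are one gesture.
  connect(from: NodeId, to: NodeId, payload: string): string {
    this.wires.set(from, to);
    const target = this.nodes.get(to);
    if (!target) throw new Error(`unknown node: ${to}`);
    return target.run(payload);
  }
}

const graph = new NodeGraph();
graph.addNode({ id: "sketch", run: (s) => s });
graph.addNode({ id: "prompt", run: (s) => `prompt(${s})` });
graph.addNode({ id: "model3d", run: (s) => `mesh(${s})` });

// Dragging sketch -> prompt -> model3d chains the whole pipeline.
const prompted = graph.connect("sketch", "prompt", "desk-sketch");
const mesh = graph.connect("prompt", "model3d", prompted);
```

Because `connect` both records the wire and runs the target node, there is no separate "execute" button anywhere in the model, which is what made the gesture feel like creation rather than configuration.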
03
Designing for latency
Generative AI takes 10 to 30 seconds to return an output. In a keyboard-based tool, that is a loading bar. On Spectacles with no screen to retreat to, it would have broken the experience entirely. We designed visual feedback states that made the wait feel like the system was thinking alongside you, not making you wait.
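One way to structure that feedback is to stage the wait rather than show a single loading state. The sketch below is illustrative only: the stage names and thresholds are invented for this example, not taken from the shipped build.

```typescript
// Hypothetical staged-feedback model for a 10-30 second generation wait.
// Instead of one "loading" state, the system advances through named
// stages so it reads as thinking alongside the user.
type FeedbackStage = "listening" | "sketch-read" | "shaping" | "refining" | "done";

// Elapsed-seconds thresholds at which the visible feedback advances.
const STAGES: Array<[number, FeedbackStage]> = [
  [0, "listening"],
  [2, "sketch-read"],
  [5, "shaping"],
  [15, "refining"],
];

function stageForElapsed(elapsedSeconds: number, generationDone: boolean): FeedbackStage {
  if (generationDone) return "done";
  let current: FeedbackStage = "listening";
  for (const [threshold, stage] of STAGES) {
    if (elapsedSeconds >= threshold) current = stage;
  }
  return current;
}
```

The design choice is that the stages are time-driven, not progress-driven: the generation API gives no progress signal, so the feedback narrates plausible work instead of waiting for one.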
The result

Two prizes, one flow.
Constraints that paid off.

Winner Founders Lab Track Prize at MIT Reality Hack 2026
Winner Best Use of Spatial AI, sponsored by Snap Spectacles
Winner Snap Spectacles Community Challenge
Friction, not features, produces tighter concepts
The "Toggle Tax" insight, the cost creatives pay switching between roughly ten applications to move from sketch to 3D concept, gave the whole project a spine. Every feature we considered could be evaluated against one test: does it reduce context switching or not? That constraint eliminated scope creep and kept the concept coherent under hackathon time pressure.
Hardware constraints force better design
No keyboard on Spectacles is a limitation that became a design principle. Voice-first and gesture-first interaction is not just accessible, it is faster for the right tasks. The constraint produced a more considered UI than an unconstrained brief would have.
Latency is a UX problem, not a technical one
The 10 to 30 second generation wait is a fixed technical reality. Treating it as a UX problem to solve rather than a spec to apologise for changed the output entirely. The experience felt fluid not because generation was fast, but because waiting was designed.
Physical and digital can share one logic
Grabbing a real sketch off a real desk and pulling it into a spatial node graph sounds complex. It worked because the interaction metaphor was consistent throughout. When the metaphor holds, the learning curve collapses.
Next project
House of the Dragon × Snapchat AR
View project →