In this deep-dive interview, Ernest Spicer, game developer and data scientist, shares how he built one of the first AI-integrated apps for Snap Spectacles, why AR is the real future of wearable tech, and how he's designing tools that tackle real problems in education and accessibility.
Before getting into the interview, I wanted to quickly introduce you to today’s sponsor, Gracia AI. Gracia AI is the only app that allows you to experience Gaussian Splatting volumetric videos on a standalone headset, either in VR or MR.
It is a truly impressive experience, and I recommend you try it out right now on your Quest or PC-powered headset.
Interview with Ernest Spicer
What was your initial experience like developing on the Spectacles platform?
Ernest Spicer: I've been in game development for over 20 years, so IDEs (Integrated Development Environments) are critical for me. Spectacles stood out because they were the first in AR to offer an IDE that mimics Unity, making onboarding smooth for creators like me. Instead of diving into C++ or C, you're working with JavaScript, which lowers the barrier significantly. You can build and deploy right into the Snap ecosystem; it’s seamless, and you immediately see reactions and feedback. From an innovation standpoint, they’re far ahead of other AR platforms.
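To give a flavor of what that Unity-like workflow looks like, here is a minimal sketch of a Lens Studio JavaScript component, hypothetical and not from Ernest's project: inputs declared via annotations show up as editable fields in the editor, and an update event plays the role of Unity's Update(). The input names and values are illustrative.

```javascript
// Minimal Lens Studio JavaScript sketch (illustrative, not from the interview).
// Annotated inputs appear in the editor UI, much like Unity's serialized fields.
// @input SceneObject target
// @input float degreesPerSecond = 45.0

// Runs once per frame, like Unity's Update().
script.createEvent("UpdateEvent").bind(function () {
    var transform = script.target.getTransform();
    // Convert degrees per second into radians for this frame.
    var step = (script.degreesPerSecond * Math.PI / 180.0) * getDeltaTime();
    var spin = quat.angleAxis(step, vec3.up());
    transform.setLocalRotation(transform.getLocalRotation().multiply(spin));
});
```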
Were there any downsides to developing on the Spectacles?
Ernest Spicer: Absolutely, it's not perfect. The battery life is short, and the field of view is limited. The Snapdragon processors can overheat quickly if you pack in too many 3D elements. But considering it’s a prototype, it’s forgivable. Internally, they’re addressing these issues. For me, the ability to build fast and get real-time feedback from a platform that supports creators far outweighs the limitations. It’s a promising direction, especially for early adopters.
Given those limitations, were you still able to accomplish what you set out to do?
Ernest Spicer: We did exactly what we intended. Our Pac-Man Fit game took only five weeks from concept to deployment, including building the spatial elements, the game mechanics, and the computer vision integration. The launch process took just minutes, with none of the painful debugging or third-party approvals you get with Unity or Unreal. Specs have nailed social AR deployment: they removed so much friction and made it easy to get things into the hands of users fast.
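To show how compact a core mechanic can be on the platform, here is a hypothetical sketch of a Pac-Man-style pellet pickup in Lens Studio JavaScript. The object names and radius are assumptions for illustration, not Pac-Man Fit's actual code.

```javascript
// Hypothetical pellet-pickup check (not Pac-Man Fit's actual code).
// Lens Studio world units are centimeters.
// @input SceneObject player
// @input SceneObject pellet
// @input float pickupRadius = 15.0

script.createEvent("UpdateEvent").bind(function () {
    if (!script.pellet.enabled) { return; } // already eaten
    var playerPos = script.player.getTransform().getWorldPosition();
    var pelletPos = script.pellet.getTransform().getWorldPosition();
    if (playerPos.distance(pelletPos) < script.pickupRadius) {
        script.pellet.enabled = false; // "eat" the pellet
        // Scoring, audio, and respawn logic would hook in here.
    }
});
```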
Why did you choose to develop for Spectacles if adoption is still low?
Ernest Spicer: That’s a great question. My company is edtech-focused, and I convinced them AR was worth exploring. Spectacles were the only glasses with a solid dev environment and a scalable manufacturing setup. Others like Meta Ray-Bans were nightmares in terms of support and infrastructure. Spectacles might not be everywhere yet, but they’re ahead in usability, affordability, and educational access. For $50 a month, schools can try them out. There’s real potential there.
What advice do you have for someone just starting to develop on Spectacles?
Ernest Spicer: Tap into the Snap ecosystem; it’s gold. They have a Reddit community where engineers actually respond and help troubleshoot. Also, get good at JavaScript: the entire platform runs on it, and most of your interactions will be built with it. Snap provides great starter kits, so even non-coders are putting out lenses. Understanding the file structure and how to make the parts of a lens communicate is key (see the sketch below). You don’t need to start from scratch; just build on what’s already out there.
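One concrete version of "making parts of a lens communicate" is the classic script.api pattern: one script exposes functions, and another calls them through an editor-assigned input reference. A minimal sketch with hypothetical names follows; newer Lens Studio versions also support TypeScript and module-style imports, but this remains the standard JavaScript approach.

```javascript
// scoreManager.js — exposes a function to other scripts via script.api
var score = 0;
script.api.addPoints = function (points) {
    score += points;
    print("Score: " + score); // print() is Lens Studio's logger
};
```

```javascript
// pellet.js — references the manager through an editor-assigned input
// @input Component.ScriptComponent scoreManager
script.scoreManager.api.addPoints(10);
```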
Can you walk us through the process of publishing a lens?
Ernest Spicer: Sure. While testing and debugging, you hit a button and the lens pushes right to the specs. Once it's solid, you submit through Lens Studio with a name, thumbnail, and short description. Snap reviews and typically pushes updates every one to three weeks. It's that easy. We often give Snap a heads-up since we’re pushing the limits with our features, but for most creators the process is smooth and streamlined.
Besides Pac-Man Fit, what other lenses are you working on?
Ernest Spicer: We’re building a platform called SolariDX; it’s an all-in-one education and engineering tool for solar energy. It runs AI models locally and connects to LMS systems like Canvas or Brightspace. We’ve got 3D animations, quizzes, spatial anchors for live radiation data, and multimodal interfaces that let users ask questions with images. It’s designed for both schools and solar professionals.
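SolariDX's integration code isn't public, but to make the LMS hookup concrete, here is a hedged sketch of what a call to Canvas's REST API looks like in JavaScript. The instance URL and token handling are assumptions; Canvas does expose courses at /api/v1/courses behind a bearer token.

```javascript
// Illustrative Canvas LMS call (not SolariDX's actual code). Node 18+ for built-in fetch.
const CANVAS_BASE = "https://your-school.instructure.com"; // assumption: your Canvas instance
const TOKEN = process.env.CANVAS_TOKEN; // assumption: API token kept in the environment

async function listCourses() {
    const res = await fetch(CANVAS_BASE + "/api/v1/courses", {
        headers: { Authorization: "Bearer " + TOKEN },
    });
    if (!res.ok) {
        throw new Error("Canvas API error: " + res.status);
    }
    return res.json(); // array of course objects
}

listCourses().then(function (courses) {
    console.log(courses.map(function (c) { return c.name; }));
});
```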
Where do you see AR glasses fitting into the future of education?
Ernest Spicer: AR is the only real pathway to wearable AI in education. VR headsets block spatial awareness and isolate users. AR glasses like Spectacles let students stay connected to the real world while interacting with augmented content. More importantly, they offer an antidote to what I call cognitive offloading: students relying too much on AI to do the work for them. With AR, they learn by doing, which means they’re actively engaging with and retaining the information.
How did you manage to integrate AI directly into the glasses?
Ernest Spicer: It took about two months of hard work. We had to adapt the JavaScript in experimental mode and work closely with Snap to push past some limitations. Now the specs can call any API: OpenAI, Hugging Face, or even our own. We’ve basically replicated how they call OpenAI’s API and applied that logic to any model we want. We’ve been working on that kind of infrastructure for a few years, so now we can plug in whatever backend we need.
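The portable part of that pattern is the request shape itself: the same chat-completions payload works against OpenAI or any OpenAI-compatible backend just by swapping the base URL. Here is a sketch of that idea in plain JavaScript; on Spectacles the actual call goes through Lens Studio's networking modules rather than a browser-style fetch, and the model name and key handling below are assumptions.

```javascript
// Generic chat-completion call (illustrative, not Ernest's actual code).
// Point BASE_URL at api.openai.com or at your own OpenAI-compatible server.
const BASE_URL = "https://api.openai.com/v1"; // assumption: swap for your backend
const API_KEY = process.env.API_KEY;          // assumption: key kept in the environment

async function ask(question) {
    const res = await fetch(BASE_URL + "/chat/completions", {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
        body: JSON.stringify({
            model: "gpt-4o-mini", // assumption: any model the backend serves
            messages: [{ role: "user", content: question }],
        }),
    });
    if (!res.ok) {
        throw new Error("API error: " + res.status);
    }
    const data = await res.json();
    return data.choices[0].message.content;
}
```

Self-hosted servers such as vLLM or llama.cpp expose this same endpoint shape, which is what makes the "plug in whatever backend we need" approach work.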
What are your thoughts on implementing AI in current apps?
Ernest Spicer: That’s the future. We run our entire education suite on a 4-vCPU server; no massive GPUs needed. The key is fine-tuning small language models on focused datasets: you reduce hallucinations, improve accuracy, and cut infrastructure costs. Tools like our skills-matching engine for universities run on tiny compute but deliver huge value. Bigger isn’t always better. DeepSeek just proved that by outperforming GPT-4 with a model that runs on a laptop. Focus wins.
Do not forget to check out the full interview on your favourite platform 👇
That’s it for today
See you next week