In this episode, we sit down with Gordon Midwood, co-founder of Anything World, to discuss how AI is revolutionizing 3D animation and rigging. Gordon shares insights on AI’s impact on game development, social media content production, and how even AAA studios are adopting this tech as part of their workflow. We also dive into the challenges of AI animation, the roadblocks in industry adoption, and why AI is more of a creative tool than a job killer.
Before we dive into the conversation, I wanted to let you know that, as part of my partnership with AWE, I’ve got an exciting opportunity to share with you: the AWE Builders Nexus!
AWE Builders Nexus is an all-encompassing program for new and accomplished XR builders, designed to connect and empower you to succeed, onsite at AWE USA 2025 in Long Beach, California, on June 10-12!
Build something extraordinary, get advice and funding, scale through partnerships, and attract customers.
Fill out the Builders Nexus form and get a super affordable $49 1 Day Expo pass or a 20% discount on all-access passes! 🎟✨
Interview with Gordon Midwood
What makes Anything World different from other rigging and animation solutions?
Gordon Midwood: At Anything World, we use AI and machine learning to rig and animate 3D models automatically. What makes us unique compared to tools like Mixamo is that we handle everything—not just humanoid characters. We rig and animate scorpions, insects, birds, hedgehogs, hippos—you name it. Most other tools focus solely on bipedal humans, but we set out to see if we could bring anything to life, which is why we’re called Anything World.
How does your AI-powered animation pipeline work?
Gordon Midwood: We have a seven-step machine-learning pipeline that we’ve been developing for over five years. The first step is classification—understanding what the model is. When you upload a 3D model, our system analyzes its shape and structure. If we recognize it, we generate an appropriate rig. If not, we might suggest an adjacent rig or, in rare cases, say, “Sorry, we can’t animate this.” From there, we apply animations that match the model type—so a dog will have walking and sniffing animations, a bird will have flying, and so on.
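To make that flow concrete, here is a minimal Python sketch of the classify-then-rig logic Gordon describes. Everything in it, from the classifier to the rig names and animation sets, is a hypothetical stand-in for illustration; it is not Anything World’s actual pipeline code.

```python
# Illustrative sketch only: the classifier, rig names, and animation sets
# are hypothetical stand-ins, not Anything World's actual pipeline.
from dataclasses import dataclass

# Hypothetical animation sets per recognised category.
ANIMATION_SETS = {
    "dog": ["walk", "run", "sniff", "sit"],
    "bird": ["fly", "glide", "peck"],
    "scorpion": ["walk", "strike"],
}

@dataclass
class RigResult:
    category: str          # what the model was classified as
    rig: str               # rig template chosen for that category
    animations: list[str]  # animations that fit the category

def classify(mesh_path: str) -> str | None:
    """Step 1: classification. The real system analyses the mesh's shape and
    structure with ML; this stand-in just guesses from the filename."""
    name = mesh_path.lower()
    for category in ("dog", "bird", "scorpion", "hippo"):
        if category in name:
            return category
    return None

def rig_and_animate(mesh_path: str) -> RigResult:
    category = classify(mesh_path)
    if category is None:
        # The rare case Gordon mentions: nothing suitable is found.
        raise ValueError("Sorry, we can't animate this.")
    if category not in ANIMATION_SETS:
        # Fall back to an adjacent rig, e.g. treat a hippo as a generic quadruped.
        category = "dog"
    return RigResult(category, f"{category}_rig", ANIMATION_SETS[category])

print(rig_and_animate("models/hippo_highpoly.fbx"))
```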
Once a model is rigged, what happens next?
Gordon Midwood: The simplest workflow is: you either generate a model on our site or upload your own, and within minutes, it’s rigged and animated. You can then pick it up in Unity or Unreal, where we have native plugins, or download it as an FBX and bring it into Blender for further refinement. We also have an API for bulk processing, so if a developer has 1,000 models to rig and animate, they can do that programmatically without ever opening a UI.
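As a rough illustration of what that bulk path could look like, here is a short Python sketch that submits a folder of models to a REST-style endpoint. The URL, auth header, and response fields are assumptions made up for this example, not Anything World’s documented API.

```python
# Hypothetical bulk-processing loop. The endpoint, auth header, and response
# shape are illustrative assumptions, not Anything World's documented API.
import pathlib
import requests

API_URL = "https://api.example.com/rig-and-animate"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"

def submit_model(model_path: pathlib.Path) -> str:
    """Upload one model for rigging and animation; return a job ID to poll."""
    with model_path.open("rb") as f:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"model": (model_path.name, f)},
        )
    resp.raise_for_status()
    return resp.json()["job_id"]

if __name__ == "__main__":
    # Rig and animate every model in a folder without ever opening a UI.
    jobs = [submit_model(p) for p in pathlib.Path("models").glob("*.fbx")]
    print(f"Submitted {len(jobs)} models for rigging and animation")
```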
What’s the most exciting use case you’ve seen so far?
Gordon Midwood: There’s a game on Steam called Word Warrior that’s powered by Anything World. It’s like Typing of the Dead, but instead of zombies, you type words, and those words come to life as animated 3D objects at runtime. So if the game asks for an eight-letter word starting with “U” and you type “umbrella,” an umbrella appears fully animated in real time. That’s the magic of runtime asset generation—it unlocks entirely new game mechanics that weren’t possible before.
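As a toy illustration of that mechanic, here is a short Python sketch of a typing-challenge loop in which the player’s word becomes an animated asset at runtime. The `generate_animated_model` call is a made-up stand-in for a runtime generation service, not a real SDK function.

```python
# Toy sketch of runtime asset generation in a typing game.
# `generate_animated_model` is a hypothetical stand-in, not a real SDK call.
import random

# (starting letter, word length) -> example answers, used only for the prompt
WORD_BANK = {("u", 8): ["umbrella", "unicycle"], ("s", 5): ["snake", "shark"]}

def generate_animated_model(word: str) -> dict:
    """Pretend runtime generation: return a rigged, animated asset handle."""
    return {"name": word, "animations": ["idle", "spin"]}

def challenge() -> None:
    (letter, length), _examples = random.choice(list(WORD_BANK.items()))
    guess = input(f"Type a {length}-letter word starting with '{letter}': ").strip().lower()
    if guess.startswith(letter) and len(guess) == length:
        asset = generate_animated_model(guess)  # the typed word appears in-game, animated
        print(f"Spawned '{asset['name']}' playing '{asset['animations'][0]}'")
    else:
        print("No match - try again!")

if __name__ == "__main__":
    challenge()
```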
You mentioned runtime animation. How does that work?
Gordon Midwood: Right now, we can rig and animate a model in 3 to 5 minutes. We’re aiming to get that down to under 1 minute by the end of this year and 10-20 seconds next year. The goal is instant animation at runtime. Imagine a Roblox-like game where players can generate anything they imagine, and it immediately moves realistically. That’s a paradigm shift in game design.
Can you share an example of a runtime-generated game that wouldn’t have been possible without AI?
Gordon Midwood: Imagine a puzzle game where you need to rescue a polar bear from an iceberg. You could generate killer whales to push it, helicopters to lift it, or even build a bridge out of animated 3D objects. The fact that anything can exist dynamically in the game world changes how we think about game mechanics. It’s like Scribblenauts, but in 3D.
How do you see AI shifting the role of animators?
Gordon Midwood: I totally get why some animators are skeptical about AI, but I see it as a creative tool, not a replacement. Rigging and animation are painfully slow, so if AI can automate the tedious bits, artists can spend more time on bespoke animations, storytelling, and world-building. It’s like how Photoshop didn’t replace painters—it just gave them better tools. AI will enhance creativity, not kill it.
What’s your response to people who say AI-generated content will flood the internet with junk?
Gordon Midwood: Yeah, I hear that argument a lot. AI makes it way easier to generate content, which means a lot of it will be garbage. But honestly? That’s just the internet. We already live in an attention economy, and AI isn’t going to change that—it’s just going to make content creation more accessible. If quality storytelling matters, people will still seek out human-crafted content. AI is just a tool.
What’s your take on AI-generated animations?
Gordon Midwood: Right now, AI-generated animation isn’t quite production-ready, but that’s changing fast. The dream is to type a prompt like “Make this dog run, sniff a bucket, then roll over”, and have it animate in real-time. We’re working on that for this year. The challenge is making AI understand movement like an animator does—not just mix existing animations, but truly generate new, natural-looking motion. No one’s cracked that yet at a high-quality level, but we plan to.
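To show the gap Gordon is pointing at, here is a deliberately naive Python sketch of the clip-stitching approach he distinguishes from true generation: it just maps recognised verbs in a prompt to preset clips in order. Genuine synthetic animation would produce new motion rather than sequencing presets. The clip names and keyword parser are hypothetical.

```python
# Naive clip-sequencing sketch, for contrast with true synthetic animation.
# The clip names and keyword matching are hypothetical, not a real API.
PRESET_CLIPS = {
    "run": "dog_run.anim",
    "sniff": "dog_sniff.anim",
    "roll over": "dog_roll.anim",
}

def prompt_to_clip_sequence(prompt: str) -> list[str]:
    """Map recognised verbs in the prompt to preset clips, in prompt order."""
    prompt = prompt.lower()
    hits = [(prompt.index(verb), clip) for verb, clip in PRESET_CLIPS.items() if verb in prompt]
    return [clip for _, clip in sorted(hits)]

print(prompt_to_clip_sequence("Make this dog run, sniff a bucket, then roll over"))
# -> ['dog_run.anim', 'dog_sniff.anim', 'dog_roll.anim']
```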
Are big studios adopting AI animation, or are they hesitant?
Gordon Midwood: It’s a mixed bag. Indie developers tend to jump right in, experiment, and build cool stuff. But big studios are more cautious. They worry about legal issues, especially with AI-generated content, and they also don’t want to disrupt their pipelines. Studios have been using the same animation workflows for years, so getting them to embrace AI is a slow process. That said, we’re in talks with Disney, Universal, and other major players, so it’s only a matter of time before AI becomes standard in their workflows.
What’s stopping AI from being fully integrated into AAA game development?
Gordon Midwood: Two things: quality and trust. Studios need absolute precision in their animations, and AI isn’t quite there yet. Also, there’s still this AI scepticism—people don’t want to deal with legal headaches or PR nightmares about AI replacing jobs. But once AI animation hits a certain quality threshold, there’ll be no reason not to use it.
Are there any unexpected industries using AI-generated animation?
Gordon Midwood: Yeah! We’re seeing a lot of interest in social media and marketing. People are creating AI-animated characters for TikTok, Instagram, and ads. We’d love to integrate with TikTok’s Effect House, for example, so people can generate, rig, and animate models for AR filters instantly. It’s early days, but AI-powered social content is coming fast.
What’s next for Anything World?
Gordon Midwood: This year, we want to be the first to offer true synthetic animation generation, so instead of picking from preset animations, you can just tell AI what you want, and it animates it dynamically. We’re also working on higher-fidelity rigs with facial animation, control rigs, and more bones, so we can meet the quality demands of AAA studios. The end goal? Making real-time 3D animation as easy as typing a sentence.
Final thoughts: where is AI animation headed?
Gordon Midwood: AI is changing everything. Five years ago, people laughed at text-to-3D. Now, everyone is doing it. In another five years, AI-powered animation will be the norm. If you’re a developer, embrace it now, because the tools are only getting better.
That’s it for today. If you found this interesting, don’t forget to subscribe to the newsletter.
See you next week!