If you read our last post about crafting virtual selves, you know we were just getting started. Well buckle up — because this time we didn’t just make a few avatars. We built a fully automated pipeline that converts 90+ characters into working VRM avatars for Animaze. No errors. No crashes. Just vibes and virtual people. 🎮🔥
🧠 What We Were Trying to Do
Our goal was simple in theory: take a library of 3D characters in FBX format — knights, vampires, zombies, grannies, aliens, and yes, even a unicorn 🦄 — and convert them all into VRM files that work in Animaze with full face tracking, hand tracking, and body movement.
Simple in theory. Absolutely wild in practice.
🔧 The Tech Stack
- Blender 5.0 — our workhorse for 3D processing
- VRM Addon for Blender — handles the VRM0 format export
- Animaze — the real-time avatar software we’re feeding everything into
- Python — the glue that holds the whole pipeline together
- Mixamo FBX files — our source characters (140 of them!)
😤 What Went Wrong (Everything, At First)
The White Model Problem — our first big wall. Characters were exporting fine but showing up completely white in Animaze. Turns out Animaze doesn’t read the standard glTF PBR material block — it reads a totally separate VRM-specific section called materialProperties. The Blender VRM exporter was filling that section with VRM_USE_GLTFSHADER and empty texture references. Animaze sees that and goes… blank. 🤷
The fix? Stop post-processing the materials entirely and let the VRM exporter handle it natively. Sometimes the best fix is removing code, not adding it.
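If you want to sanity-check an export without launching Animaze, you can peek at that section directly. A VRM file is a glTF binary (GLB) whose first chunk is JSON, so a short script can pull out the VRM0 `materialProperties` block. A minimal sketch, assuming VRM0 files (where the block lives under `extensions.VRM`); the helper names are ours:

```python
import json
import struct

def read_material_properties(path):
    """Return the VRM0 materialProperties list from a .vrm file
    (empty list if the section is missing)."""
    with open(path, "rb") as f:
        data = f.read()
    # GLB header: magic "glTF", version, total length (12 bytes)
    magic, _version, _length = struct.unpack_from("<4sII", data, 0)
    if magic != b"glTF":
        raise ValueError("not a GLB/VRM file")
    # First chunk header: chunk length + chunk type "JSON"
    chunk_len, chunk_type = struct.unpack_from("<I4s", data, 12)
    if chunk_type != b"JSON":
        raise ValueError("first GLB chunk is not JSON")
    gltf = json.loads(data[20:20 + chunk_len])
    # VRM0 keeps its material block under extensions.VRM
    return gltf.get("extensions", {}).get("VRM", {}).get("materialProperties", [])

def blank_white_suspects(path):
    """Names of materials that Animaze will likely render blank white."""
    return [m.get("name") for m in read_material_properties(path)
            if m.get("shader") == "VRM_USE_GLTFSHADER"]
```

If every material comes back as `VRM_USE_GLTFSHADER` with empty texture references, you’ve reproduced our white-model symptom.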
The Backwards Characters Problem — half our avatars were loading into Animaze facing the wrong direction. We spent way too long on fancy detection logic checking arm positions and bone coordinates. The real answer was embarrassingly simple: Mixamo exports characters facing -Z, and we were importing with axis_forward='-Z', so they cancelled out and everyone walked in backwards. One line fix. 😅
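The cancellation is easy to model. This is a toy sketch — the angles and names are illustrative, and the real fix was changing the `axis_forward` argument on Blender’s FBX importer — but it shows why the two 180-degree assumptions left everyone facing the wrong way:

```python
def imported_heading(source_deg: float, importer_flips: bool) -> float:
    """Toy model of the import step: the importer either applies a
    180-degree yaw correction or it doesn't."""
    return (source_deg + (180.0 if importer_flips else 0.0)) % 360.0

MIXAMO_HEADING = 180.0  # Mixamo rigs arrive facing -Z, i.e. backwards

# Bug: axis_forward='-Z' told the importer the file was already
# oriented the way we wanted, so no correction was applied.
buggy = imported_heading(MIXAMO_HEADING, importer_flips=False)  # stays 180

# Fix: let the importer apply its flip; the heading lands on 0.
fixed = imported_heading(MIXAMO_HEADING, importer_flips=True)
```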
The Subprocess Crash Problem — early on we tried running Blender as a subprocess for each character to isolate crashes. This caused more problems than it solved. We scrapped it and went back to a simple try/except loop. Less clever, more reliable.
🏗️ How the Pipeline Works Now
We ended up with a clean, simple architecture. Every character gets their own .py file based on a working template. Each one has two settings at the top:
```python
EXTRA_ROT_Z = 0      # set to 180 if character loads backwards
JOIN_MESHES = False  # set to True if character has many mesh parts
```
That’s it. No magic detection. No complex batch logic. Just two knobs per character. A single run_all.py chains all the scripts together and runs them through Blender one by one. Set it going and walk away.
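A minimal sketch of what that driver can look like. The names here are illustrative, and it assumes (as described above) that run_all.py itself is executed by Blender, which is why a plain try/except is enough to keep one bad character from killing the batch:

```python
from pathlib import Path

def run_all(script_dir: str) -> list:
    """Execute every per-character script in order and return the
    names of the ones that failed."""
    failed = []
    for script in sorted(Path(script_dir).glob("*.py")):
        try:
            # Run the character script in its own namespace so one
            # script's globals can't leak into the next.
            code = compile(script.read_text(), str(script), "exec")
            exec(code, {"__name__": "__main__"})
            print(f"{script.name}: OK")
        except Exception as exc:
            failed.append(script.name)
            print(f"{script.name}: FAILED ({exc})")
    return failed
```

One crashed unicorn shows up in the failure list; the other 89 characters still convert.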
🎯 What We Learned
Animaze reads VRM materialProperties, not glTF PBR. If your character is white, this is why. Don’t post-process — let the VRM addon handle it natively.
Rename bones before checking orientation. We had a bug where orientation detection couldn’t find LeftUpperArm because bones still had their mixamorig: prefix at that point. Order matters.
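The rename itself is mechanical. A sketch with a small sample of the mapping — this is an illustrative subset, not the full humanoid table, and in the real pipeline the renaming happens on Blender’s armature bones rather than a plain list:

```python
# Illustrative subset of the Mixamo -> VRM humanoid bone mapping.
MIXAMO_TO_VRM = {
    "mixamorig:Hips": "Hips",
    "mixamorig:Spine": "Spine",
    "mixamorig:LeftArm": "LeftUpperArm",
    "mixamorig:RightArm": "RightUpperArm",
}

def rename_bones(bone_names):
    """Map Mixamo bone names to VRM names; anything unmapped passes
    through unchanged."""
    return [MIXAMO_TO_VRM.get(name, name) for name in bone_names]

def has_left_upper_arm(bone_names):
    """Orientation-check helper: only finds the bone AFTER renaming."""
    return "LeftUpperArm" in bone_names
```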
Simple beats clever every time. We rewrote the orientation detection three times trying to be smart about it. A hardcoded 180-degree flag per character is faster, more reliable, and easier to understand six months from now.
Individual files are better than one mega-script. When one character crashes, the others keep running. When you need to tweak one character, you edit one file. Obvious in hindsight.
Auto-weights beat envelope weights for complex meshes. Learned this the hard way when the unicorn exploded into pieces. 🧩
🌟 The Results
- ✅ 90+ working VRM avatars loading in Animaze
- ✅ Full face tracking — expressions, blinks, mouth movements
- ✅ Hand tracking via Animaze’s MediaPipe open beta
- ✅ Zero import errors in Animaze
- ✅ A fully automated pipeline — drop in a new FBX, run one script, done
🚀 What’s Next
We’ve got 90 avatars ready to go — knights, vampires, zombies, soldiers, grannies, aliens, clowns, pirates. The plan is to start streaming with them, swap characters mid-session, and keep building out the collection.
This project started as “let’s try making an avatar” and turned into building a proper 3D conversion pipeline from scratch. That’s the SolarBluSeth way — start curious, go deep, figure it out. 💛
Stay tuned for videos, streams, and probably more chaotic deep dives into tech rabbit holes. The virtual world is ours. 🌐✨
— SolarBluSeth | solarblu.net