Latest News

Two days. That’s all it took to completely change how we work. In 48 hours of working with Claude by Anthropic, we shipped more real, working projects than we had in weeks of fighting with other AI tools. This is the story of what we built, what blew our minds, and why Claude has earned a permanent seat at the SolarBluSeth workstation. 🚀

⚡ What We Actually Built in 48 Hours

Let’s just get straight to the receipts. Here’s what we knocked out:

  • A full VRM avatar pipeline — 90+ Mixamo FBX characters converted to working VRM files for Animaze, complete with face tracking, hand tracking, and blend shapes
  • 96 individual Python scripts — one per character, auto-generated, each with custom settings for rotation and mesh joining
  • A custom unicorn rig — a DAZ quadruped with no human bones, given a fake humanoid skeleton from scratch so she loads and moves in Animaze
  • JSON data imports into Elementor — structured content pushed directly into WordPress page builder layouts
  • SEO reports and analysis — full site audits with actionable recommendations
  • Blog posts and articles — written, formatted, and ready to publish including the one you’re reading right now
  • A custom startup configuration file — a .md skill file that primes Claude with our project context so every session starts smart instead of from scratch
  • Blender scripting — bone renaming, orientation fixing, blend shape binding, humanoid assignment, all automated
  • VRM file analysis and debugging — binary GLB parsing, JSON inspection, material property investigation

That’s not a weekend project. That’s a full sprint — and we did it in two days. 🔥

⏱️ The Moment That Sold Us: 5 Minutes vs 6 Hours

Here’s the one that really said it all.

We had a task that we’d previously taken to ChatGPT. After over 6 hours of back and forth, regenerating, fixing errors, explaining context over and over again, we gave up and moved on.

We brought the same task to Claude. Done in 5 minutes. Working. First try.

That’s not a small difference. That’s a completely different experience. Claude understood the context, held it across the conversation, reasoned through the problem, and just… solved it. No drama. No endless loops. No “I apologize for the confusion” followed by the exact same wrong answer.

🧠 Why Claude Hits Different

After 48 hours of heavy use, here’s what actually separates Claude from the competition:

It holds context like a pro. Claude remembers what you built three messages ago, what file you’re working on, what went wrong last time. You don’t have to re-explain your entire project every few messages. This alone saves enormous amounts of time on complex multi-step projects.

It reasons, not just retrieves. When something broke in our VRM pipeline, Claude didn’t just Google-dump generic answers. It actually thought through cause and effect — “the orientation check can’t find LeftUpperArm because bones haven’t been renamed yet at that point in the script.” That’s debugging logic, not keyword matching.

It writes real, working code. Not almost-working code. Not code that looks right but fails on line 47. We ran scripts directly from Claude’s output and they worked. When they didn’t, Claude read the error, understood it, and fixed it — not just tried a random variation.

It knows when simple beats clever. Multiple times Claude suggested removing complex logic in favor of a simpler approach. That kind of judgment — knowing when NOT to over-engineer — is rare even in human developers.

It works across everything. Same session: Blender Python scripting, binary file parsing, HTML formatting, blog writing, SEO analysis, JSON structuring. No switching tools. No context loss. One conversation, many skills.

📄 The Game Changer: Custom Startup Files

One of the most powerful things we discovered was building a custom .md skill file — a markdown document that loads at the start of a session and gives Claude instant context about our project.

Instead of spending the first 10 messages re-explaining our folder structure, our Blender version, our VRM addon, and our file naming conventions, all of that lives in the skill file. Claude reads it and immediately knows:

  • Where our files live (F:\3d\3dmodels\2026unity\)
  • What version of Blender we’re running (5.0.1)
  • What our bone naming conventions are
  • What our output format needs to look like
  • What tools and addons we have installed
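
A skill file along those lines might look something like this (the section headings and wording here are illustrative, not our exact file):

```markdown
# Project Context: VRM Avatar Pipeline

## Environment
- Blender 5.0.1 with the VRM addon installed
- Project root: F:\3d\3dmodels\2026unity\

## Conventions
- Bones: strip the `mixamorig:` prefix before humanoid assignment
- Output: VRM0 files, one per character, named after the source FBX

## Per-character knobs
- EXTRA_ROT_Z: 180 if the character loads backwards, else 0
- JOIN_MESHES: True if the character has many mesh parts
```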

It’s like giving Claude a project brief before the meeting starts. Every session begins informed instead of blind. For ongoing projects this is absolutely essential — it’s the difference between a new hire who has to be trained every day and a colleague who just knows the system. 💡

🆚 Claude vs ChatGPT: Real Talk

We’ve used both extensively. Here’s the honest comparison from our experience:

  • Context retention: Claude holds a long, complex conversation together far better. ChatGPT starts drifting and forgetting after a while.
  • Code quality: Claude’s code runs. ChatGPT’s code often needs several rounds of fixing before it actually works.
  • Reasoning: Claude explains WHY something is wrong. ChatGPT often just tries a different variation and hopes.
  • Honesty: Claude will tell you when something isn’t possible or when a simpler approach is better. ChatGPT sometimes just agrees with whatever you suggest even when it’s wrong.
  • Complex projects: Claude thrives on multi-step, multi-file, multi-session projects. This is where the gap is biggest.
  • Speed: As demonstrated — 5 minutes vs 6 hours speaks for itself.

🌟 The Bottom Line

AI tools are only as powerful as the work they actually help you ship. In 48 hours with Claude we built a production-ready avatar pipeline, automated 96 character conversions, solved problems that had stumped us for days, and still had time to write about it.

The skill file system, the context awareness, the reasoning quality, the code that actually runs — these aren’t small features. They’re the difference between an AI that assists and an AI that collaborates.

We’re just getting started. The pipeline is running, the avatars are live, and the next project is already queued up. If you’re still on the fence about which AI tool to invest your time in — we hope this answers that question. 💛

Stay curious. Build fearlessly. The tools are better than ever. 🌐✨

— SolarBluSeth | solarblu.net

If you read our last post about crafting virtual selves, you know we were just getting started. Well buckle up — because this time we didn’t just make a few avatars. We built a full automated pipeline that converts 90+ characters into working VRM avatars for Animaze. No errors. No crashes. Just vibes and virtual people. 🎮🔥

🧠 What We Were Trying to Do

Our goal was simple in theory: take a library of 3D characters in FBX format — knights, vampires, zombies, grannies, aliens, and yes, even a unicorn 🦄 — and convert them all into VRM files that work in Animaze with full face tracking, hand tracking, and body movement.

Simple in theory. Absolutely wild in practice.

🔧 The Tech Stack

  • Blender 5.0 — our workhorse for 3D processing
  • VRM Addon for Blender — handles the VRM0 format export
  • Animaze — the real-time avatar software we’re feeding everything into
  • Python — the glue that holds the whole pipeline together
  • Mixamo FBX files — our source characters (140 of them!)

😤 What Went Wrong (Everything, At First)

The White Model Problem — our first big wall. Characters were exporting fine but showing up completely white in Animaze. Turns out Animaze doesn’t read the standard glTF PBR material block — it reads a totally separate VRM-specific section called materialProperties. The Blender VRM exporter was filling that section with VRM_USE_GLTFSHADER and empty texture references. Animaze sees that and goes… blank. 🤷

The fix? Stop post-processing the materials entirely and let the VRM exporter handle it natively. Sometimes the best fix is removing code, not adding it.
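The debugging that got us there is easy to reproduce: a GLB container (which is what a .vrm file is) starts with a 12-byte header followed by a JSON chunk, so a few lines of Python are enough to pull out the materialProperties block Animaze actually reads. A sketch, assuming a standard GLB 2.0 layout:

```python
import json
import struct

def read_glb_json(data: bytes) -> dict:
    """Parse the JSON chunk out of a GLB container (a .vrm file is one)."""
    magic, _version, _length = struct.unpack_from("<4sII", data, 0)
    if magic != b"glTF":
        raise ValueError("not a GLB file")
    chunk_len, chunk_type = struct.unpack_from("<I4s", data, 12)
    if chunk_type != b"JSON":
        raise ValueError("first chunk is not JSON")
    return json.loads(data[20:20 + chunk_len])

def material_properties(gltf: dict) -> list:
    """The VRM-specific block Animaze reads, as opposed to glTF PBR materials."""
    return gltf.get("extensions", {}).get("VRM", {}).get("materialProperties", [])
```

Dump that list for a white-in-Animaze character and you'll see shader names like VRM_USE_GLTFSHADER with empty texture slots, which is exactly what tipped us off.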

The Backwards Characters Problem — half our avatars were loading into Animaze facing the wrong direction. We spent way too long on fancy detection logic checking arm positions and bone coordinates. The real answer was embarrassingly simple: Mixamo exports characters facing -Z, and we were importing with axis_forward='-Z', so they cancelled out and everyone walked in backwards. One line fix. 😅
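The cancellation is just two 180-degree flips undoing each other, which plain vector math makes obvious (this illustrates the geometry only, not the actual Blender import call):

```python
import math

def rotate_z(v, degrees):
    """Rotate a 3D vector (x, y, z) about the +Z axis by the given angle."""
    a = math.radians(degrees)
    x, y, z = v
    return (round(x * math.cos(a) - y * math.sin(a), 9),
            round(x * math.sin(a) + y * math.cos(a), 9),
            z)

forward = (0.0, -1.0, 0.0)       # the direction a character faces
once = rotate_z(forward, 180)    # first flip (the source orientation)
twice = rotate_z(once, 180)      # second flip (the import setting)
# twice == forward: the two flips cancel, so nothing was corrected
```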

The Subprocess Crash Problem — early on we tried running Blender as a subprocess for each character to isolate crashes. This caused more problems than it solved. We scrapped it and went back to a simple try/except loop. Less clever, more reliable.


🏗️ How the Pipeline Works Now

We ended up with a clean, simple architecture. Every character gets their own .py file based on a working template. Each one has two settings at the top:

EXTRA_ROT_Z = 0    # set to 180 if character loads backwards
JOIN_MESHES  = False  # set to True if character has many mesh parts

That’s it. No magic detection. No complex batch logic. Just two knobs per character. A single run_all.py chains all the scripts together and runs them through Blender one by one. Set it going and walk away.
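The core of run_all.py can be sketched in a few lines. The loop is the part that matters: each character script runs inside a try/except so one crash never stops the batch. (How each script gets launched is up to you; the `run_script` callable here is a stand-in, not our exact code.)

```python
def run_all(scripts, run_script):
    """Run every character script in order; collect failures instead of dying.

    scripts     -- iterable of script paths/names
    run_script  -- callable that executes one script (e.g. via Blender's CLI)
    """
    failed = []
    for script in scripts:
        try:
            run_script(script)
            print(f"OK: {script}")
        except Exception as exc:
            print(f"FAILED: {script}: {exc}")
            failed.append(script)
    return failed
```

At the end you get a list of the characters that need attention, and everything else is already converted.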

🎯 What We Learned

Animaze reads VRM materialProperties, not glTF PBR. If your character is white, this is why. Don’t post-process — let the VRM addon handle it natively.

Rename bones before checking orientation. We had a bug where orientation detection couldn’t find LeftUpperArm because bones still had their mixamorig: prefix at that point. Order matters.
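In code, the ordering bug looks like this. The mapping below is illustrative (only the arm bones, with hypothetical target names), but the point stands: strip the prefix first, then map, and only then run any check that looks up bones by their renamed names.

```python
# Illustrative mapping from bare Mixamo bone names to humanoid names;
# the real pipeline covers the full skeleton.
MIXAMO_TO_HUMANOID = {
    "LeftArm": "LeftUpperArm",
    "RightArm": "RightUpperArm",
    "LeftForeArm": "LeftLowerArm",
    "RightForeArm": "RightLowerArm",
}

def rename_bone(name: str) -> str:
    """Strip Mixamo's 'mixamorig:' prefix FIRST, then map to the humanoid name.
    Any orientation check that searches for LeftUpperArm must run after this."""
    bare = name.removeprefix("mixamorig:")
    return MIXAMO_TO_HUMANOID.get(bare, bare)
```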

Simple beats clever every time. We rewrote the orientation detection three times trying to be smart about it. A hardcoded 180-degree flag per character is faster, more reliable, and easier to understand six months from now.

Individual files are better than one mega-script. When one character crashes, the others keep running. When you need to tweak one character, you edit one file. Obvious in hindsight.

Auto-weights beat envelope weights for complex meshes. Learned this the hard way when the unicorn exploded into pieces. 🧩

🌟 The Results

  • 90+ working VRM avatars loading in Animaze
  • Full face tracking — expressions, blinks, mouth movements
  • Hand tracking via Animaze’s Mediapipe open beta
  • Zero import errors in Animaze
  • A fully automated pipeline — drop in a new FBX, run one script, done

🚀 What’s Next

We’ve got 90 avatars ready to go — knights, vampires, zombies, soldiers, grannies, aliens, clowns, pirates. The plan is to start streaming with them, swap characters mid-session, and keep building out the collection.

This project started as “let’s try making an avatar” and turned into building a proper 3D conversion pipeline from scratch. That’s the SolarBluSeth way — start curious, go deep, figure it out. 💛

Stay tuned for videos, streams, and probably more chaotic deep dives into tech rabbit holes. The virtual world is ours. 🌐✨

— SolarBluSeth | solarblu.net

Old School Avatars, New School Growth: Our Digital Self Is Evolving

By SolarBluSeth | SolarBlu.net

Look, I'll be real with you — nobody asked for this. There's no viral trend pushing me to spend hours tweaking rigging and adjusting facial expressions on a 3D avatar. The algorithm isn't demanding it. My clients certainly aren't waiting on it.

And that's exactly why it matters.

Where We Started

A few days ago, we dove headfirst into the world of avatar creation — and I mean really dove in. We tested VRoid Studio for that anime-style 3D look, played with Ready Player Me for cross-platform VR compatibility, and fired up Blender when we needed to get into the real details. File formats, rigging, textures, animations — we touched all of it.

It wasn't glamorous. It was trial and error, export failures, weird bone deformations, and a lot of "why does her hair look like that?" moments.

But by the end of it, we had avatars. Real ones. Expressive ones. Avatars with personality — with a little sparkle in the eye and hair that actually bounces when it's supposed to.

Is Anyone Going to Care?

Probably not in the way you'd hope. Old school avatar creation doesn't trend. VRChat isn't Instagram. Nobody's going viral for a perfectly rigged FBX file.

But here's the thing — avatar culture is everywhere right now, even if people don't call it that. VTubers are pulling millions of viewers. Ready Player Me avatars are crossing platforms. The metaverse might have had a rough few years but your digital self? That concept isn't going away.

More importantly — this was never about clout.

What This Is Actually About

I've been building websites since 1996. Over 3,787 completed projects. Hosting, development, PHP, MySQL, WordPress — I've done it all for clients for decades.

But personal growth looks different. Personal breakthroughs aren't billable hours.

Getting deep into avatar creation is part of the same spirit that drove the SethAI project — six days of building a tiny personal AI from scratch just to see if I could. No giant cloud model, no preloaded data. Just curiosity, an RTX 3070 Ti, and a VB.NET control panel talking to a Python/Flask backend.

That's the SolarBluSeth way. Dive into the tech. Break something. Fix it. Learn it. Own it.

Where We Are Now

The avatars are working. They move. They emote. They export cleanly into VR environments. We've got lighting and shaders dialed in so they look vibrant across different worlds — not just our test environment.

Are they perfect? No. Is that the point? Also no.

The point is that we understand the pipeline now. VRoid → Blender → FBX/GLTF → VR world. We know where the pain points are. We know what breaks rigging. We know how to fix it.

That knowledge lives in us now. Nobody can take that.

What's Next

We're going to keep pushing this. Avatar work ties directly into the Twitch streaming side of what we do — SolarBluSeth Radio Station, live coding sessions, community building. Having a real, expressive digital self matters for that.

And honestly? It's fun. Pure, geeky, nobody-is-watching fun.

That's the best kind.


✨ Crafting Our Virtual Selves: Avatar Creation & VR Fun 🎮

Today was a super fun adventure into the world of avatars! 🧑‍🎤 We spent the day exploring, testing, and refining our very own digital personas, hunting for the best software that could bring our creations to life. 🖥️💫

We experimented with several popular avatar creation tools, including VRoid Studio for 3D anime-style avatars, Ready Player Me for cross-platform VR compatibility, and Blender for custom modeling and fine-tuning details. 🛠️ Each software brought something unique to the table, and we loved combining them to build fully personalized avatars.

One key focus was making sure our avatars could be exported smoothly into VR worlds 🌐🕶️. We tested different file formats like FBX and GLTF and adjusted rigging, textures, and animations to ensure they worked seamlessly. 🚀

While testing, we added some fun features, like facial expressions, gesture animations, and even props 🎩✨. These little touches make our avatars feel more alive and ready for interactive VR adventures. Every tweak brought us closer to avatars that are not just characters, but extensions of our creativity.

We also explored lighting, shaders, and textures to make our avatars look vibrant in different VR environments 🌈. It’s amazing how small details—like a sparkle in the eyes or a little hair bounce—can make such a huge difference in realism and personality! 💛

This hands-on day perfectly captured the SolarBluSeth spirit 🌟: diving into tech, experimenting fearlessly, and creating something uniquely ours. Our avatars are now more expressive, versatile, and ready to step into virtual worlds to meet other creators and explore endless possibilities. 🕹️👾

Stay tuned—this is just the beginning of bringing our virtual selves to life! ✨

🌟 Six Days of AI Magic: Meet SethAI, Our Tiny Learning Genius!

Brought to you by SolarBluSeth

Over the past six days, we’ve been diving deep into the world of AI, coding, and neural network wizardry — and what a wild ride it’s been! If you’ve ever wanted to peek behind the curtain of an AI in the making, here’s your chance.

We started with a bold idea: create a tiny, portable AI that learns from scratch. No preloaded data, no giant cloud models — just a small, personal AI that grows as we teach it. And yes, we named it SethAI.

🛠 The Setup

  • Built a VB.NET control panel to interact with SethAI in real-time.
  • Integrated a Python/Flask backend running distilgpt2 as a starting point.
  • Tested GPU acceleration on our NVIDIA GeForce RTX 3070 Ti because why wait?
  • Added text-to-speech, training Q&A, and live prompt generation.

🚀 The Journey

It wasn’t all smooth sailing. Module installation headaches, slow model loading, mysterious PowerShell errors… we hit nearly every classic developer snag. But each day we learned something new, tweaked the scripts, and got closer to a smooth, fun experience.

By the end of Day 6, we had:

  • A fully interactive AI panel.
  • SethAI ready to start learning from you — your questions, your prompts, your style.
  • A setup that’s small, portable, and totally personal.

🎉 What’s Next?

We’re going live on Twitch to show SethAI in action! Watch us teach it new tricks, answer questions, and see how it evolves in real-time.

If you’re curious about AI, coding, or just want to see a small project turn into a quirky, learning companion, come check it out.

💡 Fun fact: This little AI could fit in your pocket… well, digitally. And it learns from you, not the cloud. That’s what makes it truly yours.

Note: You, Seth, get full credit for the code, the vision, and six days of hands-on hacking. My role? Just a digital assistant giving a virtual high-five! 😄

✨ This AI magic brought to you by SolarBluSeth  |  Powered by ChatGPT 🤖