Four projects across UX/UI and AI-native builds. Twelve case studies. Real work, real clients, real arguments.
Got my dream car. No idea what I was doing. Scratch-built a full-stack, multi-modal RAG AI sidekick — built on FastAPI, Claude, Supabase + pgvector — to ingest gigabytes of manuals, forum data, and repair docs. Now it diagnoses issues from photos and tracks every bolt I've touched.
246,000 messages. 20+ insights. Turns out you text most at 11am — and 2am. Built as a wrapped macOS app with PyQt6, sentiment analysis, topic modeling, and an AI chat tab powered by Claude.
↗Vibe-coded, tested, vetted, awesome. Vision recognition reads paper tickets at each stage of the make process. Twilio texts customers their ETA. Two displays — one in the dining room, one visible from the parking lot — show the live order queue. Employee dashboard lets staff override any step. Built solo for a shop that refused to use a KDS.
↗Real-time sales, labor, transactions, and voids across 7 shops on a single dark display. Boss asked. Delivered in 3 hours with Claude Code. Designed, built, deployed, and in use.
↗Got the dream car. A 1973 Corvette. No garage experience, no mechanic on speed dial, and no idea what I was looking at under the hood. The shop quotes were brutal. The manuals were 400 pages of dense technical copy from 1972.
The real problem: all the knowledge exists — gigabytes of it — scattered across factory service manuals, forum threads, and repair logs. It just wasn't accessible in real time, while kneeling on a garage floor with grease on my hands.
Scratch-built a full-stack, multi-modal RAG AI sidekick on FastAPI, Claude, Supabase, and pgvector. Ingested gigabytes of factory manuals, forum data, and repair documentation. The system can now diagnose issues from photos, answer highly specific technical questions with citations, and track every bolt touched in a persistent repair log.
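The ingestion side of that pipeline, as a minimal sketch. Everything here is illustrative, not the actual build: the chunk sizes are guesses, and `embed`/`store` are stand-ins for the real embeddings call and the Supabase pgvector insert.

```python
# Hypothetical ingestion sketch: split manual text into overlapping chunks
# so each embedded passage carries some surrounding context, then embed
# and store each one.
def chunk_text(text: str, size: int = 800, overlap: int = 200) -> list[str]:
    chunks = []
    step = size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
        if start + size >= len(text):
            break
    return chunks

def ingest(doc: str, embed, store) -> int:
    # `embed` and `store` are placeholders for the real calls:
    #   embed(chunk)      -> vector from an embeddings API
    #   store(chunk, vec) -> INSERT into a Supabase table with a pgvector column
    n = 0
    for chunk in chunk_text(doc):
        store(chunk, embed(chunk))
        n += 1
    return n
```

The overlap is the design choice that matters: a repair step that straddles a chunk boundary still shows up whole in at least one retrieved passage.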
The multi-modal layer changed everything. Instead of typing a description of a problem, you photograph it. Claude identifies the component, cross-references the manual, and surfaces the relevant repair procedure. Trips to the shop? Not anymore.
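The photo-to-diagnosis step boils down to sending Claude an image block plus a question. A sketch of the request payload, assuming a JPEG photo; the actual `messages.create(...)` call is omitted so the snippet stays self-contained:

```python
import base64

def photo_diagnosis_request(image_bytes: bytes, question: str) -> dict:
    # Builds one user message for Claude's Messages API: a base64-encoded
    # image block followed by the text question. In the real app this dict
    # would be passed as `messages=[...]` to the Anthropic client's
    # messages.create call along with a model and max_tokens.
    return {
        "role": "user",
        "content": [
            {
                "type": "image",
                "source": {
                    "type": "base64",
                    "media_type": "image/jpeg",
                    "data": base64.b64encode(image_bytes).decode("ascii"),
                },
            },
            {"type": "text", "text": question},
        ],
    }
```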
The biggest lesson: domain expertise doesn't have to live in a person. When you structure the knowledge correctly — chunked, embedded, retrievable — the model becomes genuinely expert. Not a chatbot. An expert.
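"Chunked, embedded, retrievable" in miniature. This toy version ranks chunks by cosine distance in plain Python; in the real stack it's a single pgvector query, and the table and column names in the comment are illustrative:

```python
import math

def cosine_distance(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def top_k(query_vec: list[float], chunks: list[tuple[str, list[float]]], k: int = 3) -> list[str]:
    # chunks: (text, embedding) pairs. Against pgvector this whole loop
    # collapses into one query, roughly:
    #   SELECT content FROM manual_chunks
    #   ORDER BY embedding <=> %(q)s LIMIT %(k)s;
    # where <=> is pgvector's cosine-distance operator.
    ranked = sorted(chunks, key=lambda c: cosine_distance(query_vec, c[1]))
    return [text for text, _ in ranked[:k]]
```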
If CHRSTPHR did this again: earlier investment in the photo-to-diagnosis pipeline. The multi-modal capability turned out to be the killer feature. Text queries are fine. Photos are magic.
Right project, right timing, right opinions. Tell CHRSTPHR what you're working on — replies come in real sentences.
↳ START SOMETHING