AZ Droid Works
Droid design and robotics lab
Home droid workshop
AZ Droid Works is a small robotics lab building Sonny, a friendly home droid whose current brain lives on a durable VM while Raspberry Pi edge modules handle the physical world: voice capture, speaker playback, expressive eyes, sensors, and eventually the mobile base.
First build
Sonny is the first AZ Droid Works build: a practical companion-droid prototype with a VM-centered brain, MQTT messaging, local speech paths, Pi voice and face nodes, and Mission Control for operator visibility.
Build signals
Current architecture
The source of truth is the Sonny repo and build docs; this site is the public field note.
The always-on brain host owns orchestration, memory context, route selection, STT/TTS, MQTT command publishing, and high-level state for voice and face behavior.
Raspberry Pi nodes keep hardware close to the body: microphone capture, wake/capture flow, speaker playback, animated eyes, sensors, and later base safety control.
The operator dashboard exists so Sonny can be observed, tested, and debugged without guessing what the robot thinks it is doing. Charming, but not mysterious. Mostly.
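The brain-to-edge split above rests on small commands published over MQTT. A minimal sketch of what such a command might look like; the topic scheme (`sonny/<node>/<action>`) and payload fields here are hypothetical stand-ins, not the real contract, which lives in the Sonny repo and build docs.

```python
import json
import time

def build_command(node: str, action: str, params: dict) -> tuple[str, str]:
    """Build an MQTT topic and JSON payload for a droid command.

    Illustrative only: the actual Sonny topic names and payload
    shape are defined in the repo, not here.
    """
    topic = f"sonny/{node}/{action}"       # e.g. sonny/face/set
    payload = json.dumps({
        "action": action,
        "params": params,
        "ts": time.time(),                 # lets edge nodes drop stale commands
    })
    return topic, payload

# The brain host would hand the result to its MQTT client, e.g. with
# paho-mqtt:  client.publish(*build_command("face", "set", {"eyes": "happy"}))
topic, payload = build_command("face", "set", {"eyes": "happy"})
```

Keeping the payload small and timestamped means a Pi node that reconnects after a hiccup can ignore commands that are no longer current instead of replaying them.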
Build log
Public summary, not the operational runbook.
A serviceable v1 head design with expressive eye displays, camera and audio openings, Pi-based face control, and operator-adjustable face states.
Wake/capture, STT, local response routing, TTS, and inline MQTT audio playback are being hardened around a low-latency home-assistant-style voice path.
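That voice path is essentially a linear pipeline: wake/capture hands audio to STT, the result is routed to a local response or a fallback, and the reply is synthesized and published for playback. A toy sketch of the flow, assuming hypothetical routing rules and the illustrative `sonny/voice/play` topic; the real stages live in the Sonny repo.

```python
def route(text: str) -> str:
    """Pick a handler for an utterance.

    A toy routing table standing in for Sonny's real route selection:
    known intents stay local, everything else falls back.
    """
    local_intents = {
        "what time is it": "time",
        "lights on": "lights",
    }
    intent = local_intents.get(text.strip().lower())
    return f"local:{intent}" if intent else "fallback:llm"

def handle_utterance(audio: bytes, stt, tts, publish) -> None:
    """Run one captured utterance through STT -> route -> TTS -> playback."""
    text = stt(audio)                         # speech to text on the brain host
    decision = route(text)                    # local response or fallback
    reply = f"Handling {decision}"            # placeholder response generation
    publish("sonny/voice/play", tts(reply))   # inline MQTT audio playback

# Usage with stubs in place of real STT/TTS/MQTT:
handle_utterance(
    b"",
    stt=lambda a: "lights on",
    tts=lambda s: s.encode(),
    publish=lambda topic, data: print(topic, data),
)
```

Passing `stt`, `tts`, and `publish` in as callables is also what makes the path testable on a dev machine, with no microphone, speaker, or broker attached.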
The base is still a later subsystem. The current architecture keeps drive and safety control out at the edge instead of burying it inside the conversational brain.
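One payoff of keeping safety at the edge is that the base can fail safe even when the brain goes quiet. A minimal heartbeat-watchdog sketch of what a future Pi base node might run; the class, timeout, and loop shape are all hypothetical, not a description of Sonny's actual base firmware.

```python
class BaseWatchdog:
    """Refuse or cut drive when the brain's heartbeat goes stale.

    Illustrative only: timeout and interface are assumptions,
    not the real Sonny base contract.
    """
    def __init__(self, timeout_s: float = 0.5):
        self.timeout_s = timeout_s
        self.last_beat = None      # monotonic time of last brain heartbeat
        self.driving = False

    def heartbeat(self, now: float) -> None:
        """Record a heartbeat received from the brain host."""
        self.last_beat = now

    def fresh(self, now: float) -> bool:
        return self.last_beat is not None and now - self.last_beat <= self.timeout_s

    def set_drive(self, on: bool, now: float) -> bool:
        """Only allow drive while the heartbeat is fresh."""
        if on and not self.fresh(now):
            on = False             # fail safe: refuse to start driving
        self.driving = on
        return self.driving

    def tick(self, now: float) -> bool:
        """Called from the edge control loop; cuts drive on staleness."""
        if self.driving and not self.fresh(now):
            self.driving = False   # stale brain -> stop the base
        return self.driving
```

Because the watchdog lives on the Pi next to the motors, a crashed conversational brain, a dropped network link, and a wedged MQTT broker all produce the same outcome: the base stops.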
Lab direction
Readable systems. Repeatable builds. Real-world behavior.
Face, voice, attention, and timing matter. A good droid should feel understandable before it feels complicated.
Brains, sensors, displays, audio, and mobility should be testable pieces with clear contracts between them.
Mission Control-style dashboards and runbooks make the robot easier to debug, improve, and trust.