2026 / UC Berkeley (ME235)

Robogotchi

Real-time interactive robot pet with ESP32 firmware, web GUI, OLED expressions, RFID feeding, touch, IMU, and motor behaviors

Assembled Robogotchi prototype

Overview

Robogotchi is a Tamagotchi-style robot pet that responds to physical interaction in real time. The system combines an ESP32-family controller, a browser-based dashboard served by the robot, dual OLED facial expressions, motor control, RFID feeding, touch sensing, IMU shake detection, buzzer feedback, and camera-follow hooks.

Problem / Context

The project needed to demonstrate a real-time mechatronic system with multiple inputs, multiple outputs, visible state, and interruptible behavior. The design challenge was to make the technical system feel like a coherent character rather than a pile of sensors.

Role

Team project: firmware architecture, interaction logic, embedded integration support, hardware bring-up, and portfolio synthesis

Institution

UC Berkeley (ME235)

Team

Loris Emanuelli and ME235 project teammates

Tags

Robotics / Embedded Systems / Interaction Design / Prototyping

Process

  • Defined interaction goals around petting, feeding, movement, attention, health, happiness, and follow behavior
  • Mapped robot states and event priorities before integrating hardware
  • Built modular ESP-IDF / Arduino-compatible firmware wrappers for motors, touch, IMU, buzzer, displays, RFID, camera interface, telemetry, and web dashboard
  • Created hardware tests for touch, motors, OLED displays, RFID, and I2C scanning
  • Integrated the prototype into a compact wheeled robot body with separated head, body, wiring, and electronics zones

Key design decisions

  • Use a non-blocking state machine so touch, feed, and upset events can interrupt autonomous behaviors
  • Serve the GUI from the robot in AP mode so a laptop can connect directly without a separate app install
  • Keep camera and RFID modules behind thin interfaces so teammates can implement or replace them without rewriting robot behavior
  • Represent emotion through both physical motion and visual facial expressions for an immediate demo effect

Engineering details

  • ESP32 boots and serves a web GUI in AP mode with observed SSID Robogotchi-AP and local URL http://192.168.4.1/
  • State machine includes STARTUP, IDLE, HAPPY, SAD, HUNGRY, SLEEP, FOLLOW, and DEAD modes
  • Priority order: petting/touch, feed, shake, then autonomous follow/hunger/sleep/idle transitions
  • Dual OLED bring-up used separate I2C buses; touch test, OLED tests, motor test, and RFID diagnostic sketches are included
  • Telemetry outputs compact JSON-like state, motion, health, energy, attention, happiness, tracking, and uptime values

Outcomes

  • Functional assembled robot prototype with visible character form and internal electronics
  • Integration-ready firmware backbone with modular hardware wrappers and event-driven behavior
  • Web dashboard and telemetry path for live state monitoring during demonstrations
  • Clean shared bundle for teammates with firmware snapshot and hardware tests

Gallery

Assembled Robogotchi prototype
Robogotchi GUI dashboard for pet state, care, feeding, and camera feedback
Front-open body with head, wheelbase, and electronics
Internal wiring and electronics zones
Front view of the completed robot form
Dashboard-linked robot photo asset

What I would do next

  • Tighten cable management and create a more serviceable internal electronics tray
  • Replace placeholder camera hooks with a robust color-tracking pipeline
  • Tune the emotional state decay rates from longer user tests
  • Refine the shell design for easier assembly, battery access, and stronger character identity