Hardware & Embedded · IoT & Embedded Systems

Smart shoe insole — child-safety wearable prototype on ESP32-S3

Client: Confidential
Duration: ~4 months (Prototype V1)
Type: Child-safety wearable prototype (rigid-flex insole)
ESP32-S3 · C/C++ firmware on PlatformIO · Rigid-flex PCB stack · IMU (accelerometer + gyro) · Pressure / force sensors · Temperature sensor · BLE GATT services · NB-IoT / LTE-M + GPS · Cross-platform iOS + Android app · USB-C charging · Wake-on-event low-power modes

Constraint

The box they were trapped in

The client wanted a thin smart insole for kids' shoes that quietly watches movement and environment — falls, the shoe coming off, abnormal motion, temperature outliers — and sends alerts to a caregiver. The hard part wasn't any one sensor; it was fitting the MCU, IMU, pressure/force sensors, temperature sensor, BLE radio, cellular radio with GPS, and a battery into an insole envelope while keeping power low enough for a usable runtime and the data clean enough to act on.

Approach

How we attacked it

We built Prototype V1 around an ESP32-S3 module on a rigid-flex PCB stack so the board could follow the curve of the insole without splitting power planes across connectors. The C/C++ firmware, built on PlatformIO, handles sensor drivers, a filtered data pipeline, and rule-based event detection for falls, shoe removal, and temperature anomalies. BLE GATT services stream live sensor data and event labels into a cross-platform iOS/Android mobile app for configuration and ground-truth labelling. A separate, minimal NB-IoT/LTE-M + GPS path sends only the critical alerts to a test backend, with USB-C charging and a wake-on-event power strategy so the radios don't idle the battery flat.
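To make the event layer concrete, here is a minimal sketch of what a V1-style rule-based detector can look like. The thresholds, field names, and the free-fall-then-impact heuristic are illustrative assumptions for this sketch, not the shipped values.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

enum class Event : uint8_t { None, Fall, ShoeRemoved, TempAnomaly };

struct Sample {
    float ax, ay, az;   // filtered accelerometer axes, in g
    float pressure;     // summed force-sensor reading, arbitrary units
    float temp_c;       // insole temperature, degrees Celsius
};

// Illustrative thresholds -- real values come from field data, not from here.
constexpr float FREEFALL_G       = 0.4f;   // near-zero-g dip preceding a fall
constexpr float IMPACT_G         = 2.5f;   // impact spike magnitude
constexpr int   IMPACT_WINDOW    = 20;     // samples allowed between dip and spike
constexpr float PRESSURE_ON_FOOT = 50.0f;  // below this, the shoe is likely off
constexpr float TEMP_LOW_C       = 10.0f;
constexpr float TEMP_HIGH_C      = 45.0f;

struct Detector {
    int since_freefall = IMPACT_WINDOW;  // samples since the last near-zero-g dip

    Event step(const Sample& s) {
        const float mag = std::sqrt(s.ax * s.ax + s.ay * s.ay + s.az * s.az);
        since_freefall = (mag < FREEFALL_G)
            ? 0
            : std::min(since_freefall + 1, IMPACT_WINDOW);

        // A hard impact shortly after a free-fall window reads as a fall.
        if (mag > IMPACT_G && since_freefall < IMPACT_WINDOW) return Event::Fall;
        if (s.pressure < PRESSURE_ON_FOOT)                    return Event::ShoeRemoved;
        if (s.temp_c < TEMP_LOW_C || s.temp_c > TEMP_HIGH_C)  return Event::TempAnomaly;
        return Event::None;
    }
};
```

In practice the pressure and temperature rules also want debouncing or hysteresis on top of this, so a single noisy sample can't raise an alert on its own.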

Decisions

What we picked, and what we rejected

01

Rigid-flex PCB stack over a single rigid board

An insole curves under load. A single rigid board would have meant either a thicker, less comfortable footprint or splitting circuits across connectors that fail under flex cycles. Rigid-flex lets sensor placement follow the foot's contact zones while keeping power and signal planes continuous.

02

ESP32-S3 + a separate cellular module, not an integrated cellular MCU

An integrated cellular MCU forces every workload through one radio's power profile. With the S3 driving BLE and the IMU pipeline natively and a separate NB-IoT/LTE-M + GPS module only powered up for alerts, the live debug path costs almost nothing and the cellular bill stays alert-shaped, not always-on-shaped.
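As a rough illustration of that duty-cycle split, the alert path can gate the modem's power rail from the S3 the way the sketch below does. The enable pin, boot delay, and helper names are assumptions for the sketch, and the AT/socket handling is elided.

```cpp
#include "driver/gpio.h"
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"

// Hypothetical load-switch enable pin for the NB-IoT/LTE-M + GPS module.
constexpr gpio_num_t MODEM_EN = GPIO_NUM_10;

void modem_power(bool on) {
    gpio_set_direction(MODEM_EN, GPIO_MODE_OUTPUT);
    gpio_set_level(MODEM_EN, on ? 1 : 0);
    if (on) {
        // Give the modem time to boot and register before talking to it.
        vTaskDelay(pdMS_TO_TICKS(3000));
    }
}

void send_critical_alert(/* event payload */) {
    modem_power(true);   // radio is cold by default
    // ... attach, open a socket, post the alert + GPS fix to the backend ...
    modem_power(false);  // back to zero draw until the next event
}
```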

03

Rule-based event detection in V1, ML reserved for V2

On-device ML for fall detection in toddlers needs a labelled dataset that didn't exist yet. V1 is the platform that collects it: real sensor traces with parent-validated event labels through the mobile app. V2 trains the model on what V1 captured. Building the model first and the data collector second would have been the wrong order.
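For a sense of what V1 captures for V2, a labelled trace record might look like the struct below. The exact fields, scaling, and label codes are hypothetical, not the shipped wire format.

```cpp
#include <cstdint>

#pragma pack(push, 1)
struct LabelledSample {
    uint32_t t_ms;        // milliseconds since boot
    int16_t  ax, ay, az;  // raw accelerometer, sensor LSBs
    int16_t  gx, gy, gz;  // raw gyro, sensor LSBs
    uint16_t pressure[4]; // force sensors under heel/arch/ball/toe
    int16_t  temp_c10;    // temperature in 0.1 degC steps
    uint8_t  label;       // 0 = none, 1 = fall, 2 = shoe-off, ... (set via the app)
};
#pragma pack(pop)

static_assert(sizeof(LabelledSample) == 27, "keep records compact for flash and BLE");
```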

04

BLE for live data, cellular only for critical alerts

Live sensor visualization is only useful when the parent is close enough to act on it — that's BLE's natural range, and it's nearly free in power. The cellular path's job is the alerts that have to leave the device whether anyone is nearby or not. Two radios, two jobs, kept on different duty cycles.
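On the power side, here is a hedged sketch of the wake-on-event strategy on the ESP32-S3: deep sleep by default, with the IMU's motion interrupt as the wake source. The pin routing and the heartbeat interval are assumptions, not the project's actual wiring.

```cpp
#include "driver/gpio.h"
#include "esp_sleep.h"

// Hypothetical routing of the IMU's INT1 line to an RTC-capable GPIO.
constexpr gpio_num_t IMU_INT = GPIO_NUM_4;

void sleep_until_motion() {
    // The IMU is configured beforehand to drive INT1 high on a motion
    // threshold, so the S3 only wakes when there is something to classify.
    esp_sleep_enable_ext0_wakeup(IMU_INT, 1);
    // A timer wake as a periodic heartbeat (interval is illustrative).
    esp_sleep_enable_timer_wakeup(60ULL * 60 * 1000 * 1000);  // 1 hour, in microseconds
    esp_deep_sleep_start();  // CPU and radios off; only the RTC domain stays powered
}
```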

Trade-off

What we didn't build

We shipped V1 with rule-based event detection rather than an ML model on the device. There was no labelled real-world dataset of toddlers tripping or kicking shoes off — and there's no way to train one before the hardware exists to capture it. So V1 is the thing that captures the data; V2 trains the model on what V1 collected. We also kept the cellular radio cold by default and reserved it for alerts. Live debugging happens over BLE to the parent's phone. The trade-off is that an alert depends on cellular coverage at that moment — acceptable for the safety floor, not the only signal path.

Outcome

What changed after we shipped

A working V1 prototype: rigid-flex insole with ESP32-S3, IMU, pressure/force, temperature, BLE for live data, NB-IoT/LTE-M + GPS for alerts, and an iOS/Android app that visualizes the live stream and lets the team label events in the field. The client has the hardware in hand to collect the real-world dataset that V2's on-device ML will need.

Talk to us

Have a similar project in mind?

Tell us what you're working on. We'll let you know whether it's a fit.