HearoPilot: On-Device AI Meeting Assistant for Android

By Prahlad Menon

A few weeks ago I wrote about the AI meeting copilot landscape — how tools like OpenOats and Meetily are proving that real-time transcription and summarization can run locally, while cloud incumbents like Otter, Fathom, and Fireflies still route your conversations through someone else’s servers. Every tool in that roundup was either desktop-native or cloud-dependent. The mobile gap was glaring.

HearoPilot fills it. Built by de.ai (Decentralized AI), it’s an Android app that runs real-time speech-to-text and LLM-powered meeting insights entirely on your phone. No cloud calls. No meeting bots joining your Zoom. No data leaving the device. Period.

It’s available now on the Google Play Store, and the full source is on GitHub under the Apache 2.0 license.

What It Actually Does

HearoPilot offers four recording modes designed around different use cases:

  • Simple Listening — quick capture and transcription
  • Short Meeting — optimized for standups and 1:1s
  • Long Meeting — handles extended sessions with smart chunking
  • Real-Time Translation — live translation across languages as people speak
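
The "smart chunking" in long-meeting mode presumably means splitting the transcript into overlapping windows that fit a small model's context. Here's a minimal sketch of that idea — the function and parameters are hypothetical illustrations, not HearoPilot's actual (Kotlin) code:

```python
def chunk_segments(segments, max_chars=4000, overlap=1):
    """Group transcript segments into chunks that fit a small LLM's
    context window, carrying `overlap` trailing segments into the next
    chunk so context isn't lost at boundaries. Hypothetical sketch,
    not HearoPilot's actual algorithm."""
    chunks, current, size = [], [], 0
    for seg in segments:
        if current and size + len(seg) > max_chars:
            chunks.append(current)
            current = current[-overlap:]      # carry overlap forward
            size = sum(len(s) for s in current)
        current.append(seg)
        size += len(seg)
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be summarized independently and the partial summaries merged in a final pass — the standard map-reduce pattern for long transcripts.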

After recording, the on-device LLM generates structured insights: summaries, key points, action items — the same outputs you’d expect from Otter or Fathom, except nothing touches a server.
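
A common way to get structured insights out of a 1B-class model is to ask for JSON and parse defensively, since small models sometimes wrap their output in prose or code fences. A sketch of that pattern — the summary/key_points/action_items schema is my illustration, not HearoPilot's actual format:

```python
import json

def parse_insights(raw):
    """Extract a structured-insights JSON object from raw LLM output,
    tolerating surrounding prose or code fences. Schema is a
    hypothetical example, not HearoPilot's actual format."""
    start, end = raw.find("{"), raw.rfind("}")
    if start == -1 or end <= start:
        raise ValueError("no JSON object found in model output")
    data = json.loads(raw[start:end + 1])
    return {
        "summary": data.get("summary", ""),
        "key_points": data.get("key_points", []),
        "action_items": data.get("action_items", []),
    }
```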

The app supports 25 UI languages with localized LLM system prompts, meaning the AI doesn’t just translate — it thinks in the target language’s context. That’s a level of internationalization you rarely see in meeting tools, cloud or otherwise.
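
The mechanism is simple but effective: pick the system prompt that matches the UI language instead of translating English output after the fact. A toy version (the prompts below are invented for the example, not HearoPilot's actual strings):

```python
# Illustrative only: the actual localized prompts live in the app.
SYSTEM_PROMPTS = {
    "en": "You are a meeting assistant. Summarize the transcript.",
    "de": "Du bist ein Meeting-Assistent. Fasse das Transkript zusammen.",
    "es": "Eres un asistente de reuniones. Resume la transcripción.",
}

def system_prompt_for(ui_language, fallback="en"):
    """Pick the system prompt matching the UI language, so the LLM
    reasons in that language rather than translating afterwards."""
    return SYSTEM_PROMPTS.get(ui_language, SYSTEM_PROMPTS[fallback])
```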

The Technical Stack Is Serious

What makes HearoPilot genuinely impressive isn’t the feature list — it’s the engineering underneath.

Speech-to-text runs on Sherpa-ONNX with NVIDIA’s NeMo Parakeet TDT 0.6B model, quantized to Int8 (~670MB). This is a production-grade ASR pipeline running on a phone, not a toy demo.

LLM inference uses llama.cpp running Google’s Gemma 3 1B — available in Q8_0 (~1GB) or the more aggressive IQ4_NL quantization (~650MB) for devices with less headroom. The total model footprint is roughly 1.7GB of storage.
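
Those figures line up with back-of-envelope math: llama.cpp's Q8_0 stores 8-bit weights plus a per-block scale (~8.5 effective bits/weight), and IQ4_NL lands around ~4.5 bits/weight. A rough estimator:

```python
def approx_model_mb(n_params, bits_per_weight):
    """Back-of-envelope storage estimate: parameters x effective bits
    per weight (quantization scales included), in megabytes."""
    return n_params * bits_per_weight / 8 / 1e6

# Gemma 3 1B at Q8_0  (~8.5 b/w) -> ~1060 MB  (article: ~1GB)
# Gemma 3 1B at IQ4_NL (~4.5 b/w) -> ~560 MB  (article: ~650MB;
#   some tensors, e.g. embeddings, are usually kept at higher precision)
# Parakeet 0.6B at Int8 (~8 b/w)  -> ~600 MB  (article: ~670MB)
```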

The optimizations are where the real craft shows:

  • VAD (Voice Activity Detection) pipeline filters silence and noise before it hits the STT model, saving compute cycles and improving accuracy
  • KV cache reuse across LLM inference calls avoids redundant computation — critical when you’re generating insights from a long transcript
  • Thermal throttling awareness — the app monitors device thermals and adapts inference speed to prevent your phone from becoming a space heater
  • Adaptive memory management keeps the app stable on devices with varying RAM availability
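
The simplest form of the first optimization is an energy gate with a short "hangover" so word endings aren't clipped. Production VADs (Silero, WebRTC VAD) use learned or statistical models, and this is not HearoPilot's implementation — just the core idea:

```python
import math

def energy_db(frame):
    """RMS energy of an audio frame in dBFS (samples in [-1, 1])."""
    rms = math.sqrt(sum(s * s for s in frame) / len(frame))
    return 20 * math.log10(max(rms, 1e-10))

def vad_filter(frames, threshold_db=-40.0, hangover=3):
    """Keep only frames likely to contain speech; the hangover keeps
    a few trailing frames after speech ends so word tails survive.
    Energy-based sketch, not HearoPilot's actual VAD."""
    kept, remaining = [], 0
    for frame in frames:
        if energy_db(frame) > threshold_db:
            remaining = hangover
        if remaining > 0:
            kept.append(frame)
            remaining -= 1
        # silent frames past the hangover never reach the STT model
    return kept
```

Every frame dropped here is a frame the 0.6B ASR model never has to decode, which is where the compute and accuracy savings come from.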

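The KV-cache point is worth unpacking: consecutive prompts over a growing transcript share a long prefix (system prompt plus transcript-so-far), and the attention keys/values for that prefix can be kept so only the new suffix is evaluated. A toy model of the bookkeeping — llama.cpp caches actual K/V tensors; this just counts token evaluations and is not its API:

```python
def common_prefix_len(a, b):
    """Length of the shared leading run of two token sequences."""
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

class PrefixCachingLLM:
    """Toy stand-in for a runtime that reuses its KV cache: tokens
    covered by the cached prefix are skipped, so only the new suffix
    costs compute. Illustrative only."""
    def __init__(self):
        self.cached = []       # tokens whose K/V are 'in cache'
        self.evaluated = 0     # total tokens actually processed

    def prefill(self, tokens):
        reused = common_prefix_len(self.cached, tokens)
        new = len(tokens) - reused
        self.evaluated += new  # evaluate only the non-shared suffix
        self.cached = list(tokens)
        return new
```

For a meeting transcript that only ever grows at the end, nearly every token after the first call is a cache hit.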
The codebase itself is clean: Kotlin with Jetpack Compose, Hilt for dependency injection, Room for local persistence, and Material Design 3. It’s textbook Clean Architecture, which matters because it’s open source — people will actually read and fork this code (24 forks already).

Why This Matters

The cloud meeting AI market has a fundamental trust problem. When Otter transcribes your board meeting or Fireflies records your therapy-adjacent 1:1 with a direct report, that audio hits external servers. You’re trusting a startup’s security posture with your most sensitive professional conversations.

Desktop local-first tools like OpenOats and Meetily solved this for laptop users. But meetings happen everywhere — in coffee shops, on walks, in cars. Your phone is the device that’s always with you, and until now, there was no serious option for on-device meeting AI on mobile.

HearoPilot changes that. Running a 0.6B parameter STT model and a 1B parameter LLM on a phone — with thermal management, VAD, and KV cache optimization — is a legitimate engineering achievement. This isn’t a research project; it’s a usable app on the Play Store.

The Bottom Line

Requirements: Android 11+ (API 30), ~1.7GB storage for models, and a reasonably modern chipset.

License: Apache 2.0 — use it, fork it, build on it.

Links: GitHub · Google Play Store

If you’ve been waiting for a privacy-respecting meeting assistant that lives in your pocket, this is the one to watch. The local-first AI meeting space just went mobile.