Know when they finish talking.

A lightweight ML library that detects turn completion, thinking pauses, and interrupts. It works entirely in the browser: no servers, no API keys, and no network latency.

npm install @utterance/core
~2 MB model · 97.8% accuracy · Runs in-browser via WASM · v2
//Benefits

Why Utterance?

No cloud dependency

Everything runs in the browser. No servers, no API keys, no network requests for audio processing.

Zero latency

On-device inference means instant results. No round trip to a server. Decisions happen in milliseconds.

Privacy first

Audio never leaves the user’s device. No recording, no uploading, no third-party processing.

Lightweight model

Small ONNX model that loads fast and runs efficiently. It is designed for real-time performance on any device.

Framework agnostic

Works with any JavaScript framework. Use it with React, Vue, vanilla JS, or any voice SDK.

Simple event API

Just listen for turnEnd, pause, and interrupt events. Get building in minutes, not hours.

//Under the hood

Trained to understand conversations.

A hybrid convolutional + attention model trained on real conversational data. Quantized to int8 and optimized for WASM, so it runs on any device without breaking a sweat.

~2 MB: Int8-quantized ONNX model. Loads instantly, even on slow connections.
97.8%: Validation accuracy across all four turn-taking classes.
100 ms: Inference batched every 100 ms. Decisions happen before you notice.
4 classes: Speaking, thinking pause, turn complete, and interrupt intent.
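The batched decision loop described above can be pictured as a small classifier step: every 100 ms tick yields probabilities over the four classes, and an event fires only when one class wins with enough confidence. This is an illustrative sketch of the idea, not the library's actual internals; the class names and the 0.8 threshold are assumptions.

```typescript
// Illustrative sketch of one 100 ms inference tick over the four
// turn-taking classes. Threshold and shapes are assumptions, not
// the library's real internals.
type TurnClass = "speaking" | "pause" | "turnEnd" | "interrupt";

interface Decision {
  cls: TurnClass;
  confidence: number;
}

// Pick the winning class from a tick's probabilities, firing only
// when the winner clears a confidence threshold.
function decide(
  probs: Record<TurnClass, number>,
  threshold = 0.8
): Decision | null {
  let best: TurnClass = "speaking";
  for (const cls of Object.keys(probs) as TurnClass[]) {
    if (probs[cls] > probs[best]) best = cls;
  }
  return probs[best] >= threshold
    ? { cls: best, confidence: probs[best] }
    : null;
}

// Example tick: the model is confident the user's turn is complete.
const d = decide({ speaking: 0.05, pause: 0.1, turnEnd: 0.82, interrupt: 0.03 });
// d → { cls: "turnEnd", confidence: 0.82 }
```

Gating on a confidence threshold is what keeps an ambiguous tick (say, 40% speaking vs. 30% pause) from firing a premature event.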
//Quick start

Install and start detecting in seconds.

$ npm install @utterance/core

index.ts
import { Utterance } from "@utterance/core";

const detector = new Utterance();

detector.on("turnEnd", (result) => {
  console.log("User is done speaking", result.confidence);
});

detector.on("pause", (result) => {
  console.log("User is thinking...", result.duration);
});

detector.on("interrupt", () => {
  console.log("User wants to speak. Stop AI response");
});

await detector.start();
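A common next step is to drive a voice agent from these three events: respond on turnEnd, hold back on pause, and cut the AI's audio on interrupt. Here is a hedged sketch of that wiring; the `TurnDetector` and `Player` interfaces are minimal stand-ins (the real detector and your TTS player of choice would fill these roles), not part of the library's documented API.

```typescript
// Sketch: driving a voice agent from the three events shown above.
// TurnDetector mirrors the `on(...)` shape from the quick start;
// Player is a hypothetical TTS interface — swap in your own.
type Handler = (payload?: { confidence?: number; duration?: number }) => void;

interface TurnDetector {
  on(event: "turnEnd" | "pause" | "interrupt", handler: Handler): void;
}

interface Player {
  speak(text: string): void;
  stop(): void;
}

function wireAgent(detector: TurnDetector, player: Player): void {
  detector.on("turnEnd", () => {
    // User finished their turn: safe to respond.
    player.speak("Got it, one moment.");
  });
  detector.on("pause", () => {
    // Thinking pause: keep listening, don't barge in.
  });
  detector.on("interrupt", () => {
    // User wants the floor: cut the AI's audio immediately.
    player.stop();
  });
}
```

Keeping the agent logic behind small interfaces like these is also what makes the framework-agnostic claim practical: the same wiring works in React, Vue, or vanilla JS.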

Open source. Community driven.

MIT licensed. Free forever. Star us on GitHub, join the Discord, or open a PR.