masker · custom-llm proxy

Vapi / Bolna → POST /v1/chat/completions → mock OpenAI · +0 ms wedge

Patient turn

what the agent heard from the caller

Waiting for the next inbound chat completion request…
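The wedge accepts the same POST /v1/chat/completions body an OpenAI-compatible client (Vapi, Bolna) would send; the "patient turn" is simply the most recent user message in that body. A minimal sketch of extracting it, where the field names follow the OpenAI chat-completions schema but the surrounding session plumbing is assumed:

```python
def latest_user_turn(body):
    """Return the content of the most recent user message, if any."""
    for message in reversed(body.get("messages", [])):
        if message.get("role") == "user":
            return message.get("content")
    return None

# Example inbound body (contents are illustrative only).
inbound = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "system", "content": "You are a clinic intake agent."},
        {"role": "user", "content": "Hi, this is Jane Doe, DOB 04/12/1985."},
    ],
    "stream": True,
}

print(latest_user_turn(inbound))
```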

What OpenAI sees

payload that leaves your VPC
{ /* awaiting first request */ }
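Before the payload leaves the VPC, detections are swapped for placeholders and the originals are kept in a session-local vault. A regex-based sketch of that step; real deployments would use proper detectors (NER models, checksum validators), and the bracketed placeholder scheme here is an assumption:

```python
import re

# Regex stand-ins for real PII detectors; patterns are illustrative only.
DETECTORS = {
    "phone": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "dob": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
}

def redact(text, vault):
    """Replace detections with placeholders; remember originals in `vault`."""
    for kind, pattern in DETECTORS.items():
        for match in pattern.findall(text):
            placeholder = f"[{kind.upper()}_{len(vault) + 1}]"
            vault[placeholder] = match
            text = text.replace(match, placeholder)
    return text

vault = {}
wire = redact("DOB 04/12/1985, call me at 555-867-5309.", vault)
print(wire)   # what OpenAI sees
print(vault)  # placeholder → original map; never leaves the VPC
```

Only `wire` is forwarded upstream; `vault` stays with the session for rehydration.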

Streamed reply

left = redacted SSE · right = rehydrated for TTS
redacted (SSE wire)
(awaiting stream…)

rehydrated (TTS input)
(awaiting stream…)

Live audit chain

every redaction is signed and chained
kind · detector · original · placeholder · action · stage
No detections yet.
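A signed, chained audit log typically means each entry's hash covers the previous entry's hash, so tampering anywhere breaks verification downstream. A sketch with HMAC-SHA256 standing in for a real signature scheme; the entry fields mirror the audit-table columns, and the key management is assumed:

```python
import hashlib
import hmac
import json

KEY = b"demo-signing-key"  # hypothetical; real keys come from a KMS

def append_entry(chain, entry):
    """Append a signed record whose hash chains to the previous record."""
    prev = chain[-1]["hash"] if chain else "genesis"
    body = json.dumps(entry, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    record = {
        "entry": entry,
        "prev": prev,
        "hash": digest,
        "sig": hmac.new(KEY, digest.encode(), hashlib.sha256).hexdigest(),
    }
    chain.append(record)
    return record

def verify(chain):
    """Recompute every hash and signature from the genesis value."""
    prev = "genesis"
    for record in chain:
        body = json.dumps(record["entry"], sort_keys=True)
        digest = hashlib.sha256((prev + body).encode()).hexdigest()
        expected_sig = hmac.new(KEY, digest.encode(), hashlib.sha256).hexdigest()
        if digest != record["hash"] or not hmac.compare_digest(record["sig"], expected_sig):
            return False
        prev = digest
    return True

chain = []
append_entry(chain, {"kind": "phone", "detector": "regex", "original": "555-867-5309",
                     "placeholder": "[PHONE_1]", "action": "redact", "stage": "inbound"})
print(verify(chain))  # True
```

Editing any earlier entry changes its recomputed hash, so every later record's `prev` link fails verification.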

Compliance report

GET /api/v1/sessions/:id/report · SOC 2 · HIPAA Safe Harbor

The compliance report will appear here after the first call; the latest call session is auto-selected.
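The report endpoint plausibly summarizes the session's audit chain: counts per detection kind plus the chain head, so auditors can cross-check the log. The shape below is a hypothetical sketch of what GET /api/v1/sessions/:id/report might return, not a documented schema:

```python
from collections import Counter

def build_report(session_id, chain):
    """Summarize a session's audit chain into a report payload (assumed shape)."""
    kinds = Counter(rec["entry"]["kind"] for rec in chain)
    return {
        "session_id": session_id,
        "detections": dict(kinds),
        "chain_head": chain[-1]["hash"] if chain else None,
        "frameworks": ["SOC 2", "HIPAA Safe Harbor"],
    }

# Minimal stand-in chain; hashes abbreviated for the example.
chain = [
    {"entry": {"kind": "phone"}, "hash": "abc"},
    {"entry": {"kind": "dob"}, "hash": "def"},
    {"entry": {"kind": "phone"}, "hash": "0f3"},
]
report = build_report("sess_demo", chain)
print(report["detections"])  # {'phone': 2, 'dob': 1}
```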