
Real-Time Fact Checking for Zoom Meetings

VeriFact joins your Zoom call, transcribes speech as it happens, detects factual claims, and verifies them against your reference documents — surfacing cited evidence inside the meeting, live.

Live Demo

Live Transcript
Our Q3 revenue increased by 23% compared to last quarter.
This growth was driven primarily by our new product line.
We expect this trend to continue into Q4.
Verdicts will appear here when a claim is verified.

The Challenge

Meetings are noisy. Facts shouldn't be.

  • $12.9M: average annual cost of poor data quality, per organization
  • $39B: annual stock-market losses traced to misinformation
  • 70%: how much more likely a false claim is to spread than a true one

The Problem

Unverified claims derail decisions and burn hours on manual fact-checking.

  • Supported: 52% (backed by evidence)
  • Refuted: 30% (contradicted by evidence)
  • Not Enough Info: 18% (insufficient evidence)

Our Solution

Live claim detection with instant source citations, right inside your call.

  • Sub-second claim detection via fine-tuned BERT
  • LLM-verified evidence with source citations
  • Real-time verdicts delivered inside the call

Zoom App Integration

Invite the VeriFact bot to any Zoom call. It transcribes speech via Azure Speech API, detects claims with BERT, and surfaces LLM-verified evidence in the side pane — live.
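The transcription step above can be sketched as follows. This is a minimal illustration, not VeriFact's actual schema: `TranscriptSegment` and the event dictionaries are hypothetical stand-ins for a streaming STT provider such as Azure Speech, which emits partial hypotheses followed by one finalized result per utterance.

```python
from dataclasses import dataclass

# Hypothetical shape of one finalized transcript segment with speaker
# attribution. Field names are illustrative, not VeriFact's real schema.
@dataclass
class TranscriptSegment:
    speaker: str   # Zoom participant the audio stream belongs to
    text: str      # finalized (non-partial) STT result
    start_ms: int  # offset from meeting start
    end_ms: int

def collect_final_segments(stt_events):
    """Keep only finalized recognition results; partial hypotheses
    (is_final=False) are discarded because a later event supersedes them."""
    return [
        TranscriptSegment(e["speaker"], e["text"], e["start_ms"], e["end_ms"])
        for e in stt_events
        if e["is_final"]
    ]

events = [
    {"speaker": "Alice", "text": "Our Q3 rev", "start_ms": 0, "end_ms": 900,
     "is_final": False},
    {"speaker": "Alice", "text": "Our Q3 revenue increased by 23%.",
     "start_ms": 0, "end_ms": 2400, "is_final": True},
]
segments = collect_final_segments(events)
```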

Powered By Intelligence

A seamless pipeline from speech to verified truth

  1. A Zoom bot joins the meeting, captures audio per participant, and streams real-time transcripts with speaker attribution.

    Technical Details

    Built on the Zoom Meeting SDK (attendee-upstream / Django). Audio is transcribed via Azure Cognitive Services Speech-to-Text by default. The transcription provider is fully configurable — swap in any locally hosted or remote STT service without changing the rest of the pipeline.

  2. A fine-tuned BERT transformer classifies each transcript segment as a factual claim or not in real time.

    Technical Details

    Model: XiaojingEllen/bert-finetuned-claim-detection (HuggingFace). Segments are scored with a configurable confidence threshold (default 0.5). Jobs are dispatched asynchronously via BullMQ queues so the API stays non-blocking.

  3. Detected claims trigger a retrieval pass over organization documents using dense vector search and cross-encoder reranking.

    Technical Details

    Documents are chunked (800 tokens, 120-token overlap) and embedded with intfloat/e5-base-v2. Per-org LlamaIndex indices are LRU-cached in memory. A cross-encoder reranker (ms-marco-MiniLM-L-2-v2) refines the top-k hits before they reach the LLM. Source documents are stored in S3.

  4. Retrieved evidence and the claim are sent to an LLM which classifies the stance and extracts supporting highlights.

    Technical Details

    Default LLM: Qwen3-4B (self-hosted via llama-cpp). Configurable to OpenAI GPT-4o, Google Gemini, or Anthropic Claude. Outputs a structured verdict — SUPPORTS, REFUTES, or NEI — plus a confidence score, explanation, and evidence highlights.

  5. Verdicts and evidence are pushed to the Zoom side pane in real time so participants see results without leaving the call.

    Technical Details

    Frontend built with React 18 + Vite + Zoom Apps SDK. The backend broadcasts claim and verdict events over WebSocket (ws://). Users can upload reference documents, review claim history, and manage organization-level document libraries.
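Step 2's claim gating reduces to a threshold over classifier scores. A minimal sketch with stub probabilities standing in for the fine-tuned BERT model's output; only the 0.5 default comes from the pipeline description above.

```python
def detect_claims(segments, scores, threshold=0.5):
    """Pair each transcript segment with its classifier score and keep
    those at or above the confidence threshold (default 0.5)."""
    return [seg for seg, p in zip(segments, scores) if p >= threshold]

segments = [
    "Our Q3 revenue increased by 23% compared to last quarter.",
    "Let's move to the next slide.",
    "We expect this trend to continue into Q4.",
]
# Stub scores standing in for the BERT model's claim probabilities.
scores = [0.94, 0.08, 0.61]
claims = detect_claims(segments, scores)
```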
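Step 3's chunking (800-token windows with 120 tokens of overlap) can be sketched as a sliding window. Real tokenization would use the e5 tokenizer; plain integers stand in for tokens here.

```python
def chunk_tokens(tokens, size=800, overlap=120):
    """Slide a window of `size` tokens, stepping by size - overlap so
    consecutive chunks share `overlap` tokens of context."""
    step = size - overlap
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + size])
        if start + size >= len(tokens):
            break  # final window already reached the end of the document
    return chunks

tokens = list(range(2000))  # stand-in for a tokenized document
chunks = chunk_tokens(tokens)
```

With 2000 tokens this yields three chunks, and each chunk's last 120 tokens repeat as the next chunk's first 120, so no passage is split without surrounding context.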
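Step 4's structured verdict is worth validating defensively, since LLM output is not guaranteed to be well-formed JSON. A hypothetical sketch: only the SUPPORTS / REFUTES / NEI labels and the confidence, explanation, and highlights fields come from the description above; the parsing and fallback logic is an assumption.

```python
import json
from dataclasses import dataclass, field

VALID_LABELS = {"SUPPORTS", "REFUTES", "NEI"}

@dataclass
class Verdict:
    label: str
    confidence: float
    explanation: str = ""
    highlights: list = field(default_factory=list)

def parse_verdict(raw: str) -> Verdict:
    """Parse the LLM's JSON output; fall back to NEI on anything
    malformed so a bad generation never crashes the pipeline."""
    try:
        data = json.loads(raw)
        label = data["label"].upper()
        if label not in VALID_LABELS:
            raise ValueError(label)
        return Verdict(label, float(data["confidence"]),
                       data.get("explanation", ""),
                       data.get("highlights", []))
    except (ValueError, KeyError, TypeError):
        return Verdict("NEI", 0.0, "unparseable model output")

good = parse_verdict('{"label": "supports", "confidence": 0.91, '
                     '"explanation": "Matches the Q3 report.", '
                     '"highlights": ["revenue grew 23%"]}')
bad = parse_verdict("not json at all")
```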
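Step 5's WebSocket broadcast amounts to serializing a verdict event for the side pane. The field names below are illustrative assumptions, not VeriFact's actual wire format.

```python
import json

def verdict_event(claim_id, label, confidence, sources):
    """Build one broadcast message for the Zoom side pane.
    Hypothetical event shape; the real schema is internal to VeriFact."""
    return json.dumps({
        "type": "verdict",
        "claimId": claim_id,
        "label": label,            # SUPPORTS / REFUTES / NEI
        "confidence": confidence,
        "sources": sources,        # document + passage citations
    })

msg = verdict_event("c-42", "SUPPORTS", 0.91,
                    [{"doc": "q3-report.pdf", "passage": "Revenue grew 23%"}])
```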

Meet the Team

The innovators behind VeriFact

  • Alhassan Al-badri
  • Egehan Yıldız
  • İrem Damla Karagöz
  • Orhun Ege Çelik
  • Eray İşçi


FAQ

Frequently Asked Questions

About using VeriFact for real-time fact checking in Zoom meetings.

  • Does VeriFact work with Zoom?

    Yes. VeriFact is built as a Zoom App and Zoom Bot. The host installs it from the Zoom App Marketplace, and once activated in a meeting, the bot joins the call, transcribes speech in real time, and surfaces fact-checked verdicts inside Zoom’s side panel.

  • How does real-time fact checking work?

VeriFact streams meeting audio to Azure Speech for transcription, runs a fine-tuned BERT model to detect factual claims as they’re spoken, and uses retrieval-augmented generation (RAG) over your organization’s reference documents to verify each claim. Verdicts (Supported / Refuted / Not Enough Info) and supporting passages appear inside the meeting within seconds.

  • What sources does VeriFact verify claims against?

    Your own. VeriFact does not verify against the open web. Owners and admins upload reference documents (PDF, Word, Excel, PowerPoint, CSV, or plain text) that act as the source of truth for that organization. Every verdict cites the specific passage it was drawn from.

  • Is VeriFact different from Zoom's AI Companion or meeting recap tools?

    Yes. AI Companion summarizes what was said. VeriFact tells you whether what was said is supported by your trusted documents — claim by claim, with citations, while the meeting is still happening. Summaries describe the call; verdicts let you act on it.

  • What happens to meeting audio and transcripts?

    Audio is streamed to Microsoft Azure Speech for real-time transcription and is not retained after transcription completes. Transcripts, detected claims, and verdicts are stored privately for your organization for the lifetime of your account, are never shared with other organizations, and are never used to train models.

  • What does VeriFact cost?

    VeriFact is currently free while in early access. Pricing for production deployments will be announced separately.