Ask Concord

Answers from our documentation


Technical concepts

Conformal Prediction

A distribution-free statistical wrapper that gives any predictor a coverage-guaranteed prediction set. Useful for honest "I don't know" outputs.

Definition

Conformal prediction is a statistical framework for turning any underlying predictor into one that emits a prediction set with a guaranteed coverage rate, with no assumptions about the underlying distribution. Given a chosen miscoverage rate alpha (typically 0.05 for a 95% coverage target), a conformal predictor calibrated on a held-out labeled set returns a set of candidate labels that contains the true label at least a 1-alpha fraction of the time, in expectation.

For a binary entity-resolution decision the set is one of three: `{match}`, `{non-match}`, or `{match, non-match}`. The two-element set is the honest output when the underlying score sits in a region where the calibration data does not justify a confident answer. Concord by IaxaI applies conformal prediction on top of temperature-scaled Bhattacharyya scores to produce calibrated identity decisions. During calibration, the standard finite-sample correction is applied: the threshold is the `(1-alpha)(1+1/n)` empirical quantile of the `n` calibration scores.
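The split-conformal recipe above can be sketched in a few lines of Python. Everything here is an illustrative assumption, not Concord's implementation: the toy scorer stands in for the real temperature-scaled Bhattacharyya scores, and the helper names are invented for the example. The nonconformity score is simply one minus the predicted probability of a label, and the calibration threshold is the finite-sample-corrected empirical quantile from the definition.

```python
import math
import random

def conformal_quantile(cal_scores, alpha):
    """Finite-sample-corrected empirical quantile of calibration scores:
    the k-th smallest score with k = ceil((n + 1) * (1 - alpha)),
    i.e. the (1 - alpha)(1 + 1/n) empirical quantile."""
    n = len(cal_scores)
    k = math.ceil((n + 1) * (1 - alpha))
    return sorted(cal_scores)[min(k, n) - 1]

def prediction_set(p_match, qhat):
    """Return every label whose nonconformity score (1 - predicted
    probability of that label) falls at or below the threshold."""
    out = set()
    if 1 - p_match <= qhat:
        out.add("match")
    if p_match <= qhat:  # nonconformity of non-match is 1 - (1 - p_match)
        out.add("non-match")
    return out

# Synthetic held-out calibration data: (predicted match probability, true label).
# A deliberately noisy toy scorer stands in for the real model.
random.seed(0)
calibration = []
for _ in range(1000):
    is_match = random.random() < 0.5
    center = 0.8 if is_match else 0.2
    p = min(max(random.gauss(center, 0.25), 0.0), 1.0)
    calibration.append((p, is_match))

alpha = 0.05  # target: at least 95% coverage
# Nonconformity of the TRUE label on each calibration example.
cal_scores = [1 - p if y else p for p, y in calibration]
qhat = conformal_quantile(cal_scores, alpha)

print(prediction_set(0.97, qhat))  # confident region: a singleton set
print(prediction_set(0.50, qhat))  # ambiguous region: the honest two-element set
```

Because the threshold is fit on a held-out calibration set, the coverage guarantee holds for exchangeable future pairs regardless of how poorly the toy scorer itself is calibrated; a noisier scorer simply produces the two-element set more often.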

Stop reconciling. Start trusting one timeline.

30-minute walkthrough. Your tools. Your tenants. Your audit cycle. We will show you exactly where Concord earns its keep.