Ask Concord

Answers from our documentation

Ask anything about Concord. Every answer comes from our actual documentation.

Technical concepts

Sentence Transformer Embeddings

Dense vectors that turn text into geometry, used for translation alignment, entity resolution, and knowledge-graph retrieval.

Definition

Sentence transformers are neural models fine-tuned to produce a single dense vector that captures the meaning of a piece of text. Two pieces of text with similar meaning sit close together in that vector space; two pieces with different meanings sit far apart. The current Concord by IaxaI MVP runs on the `all-MiniLM-L6-v2` model (384 dimensions); the V1 upgrade re-embeds the corpus with `all-mpnet-base-v2` (768 dimensions) for higher fidelity.

The embedding is a primitive that shows up in three places. Translation alignment compares vendor field names and OCSF target names by cosine similarity in embedding space; the bidirectional alignment score, calibrated with Platt scaling, becomes the confidence on a mapping decision. Entity resolution fuses multi-modal embeddings (text plus structural specs plus categorical features) to separate entities whose text alone looks similar. Knowledge-graph retrieval embeds the query and runs a dual-path search against pre-embedded phrase and passage nodes.

Embeddings ship in the engine container; no external API call leaves the customer network at query time.
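The cosine-similarity-plus-calibration step described above can be sketched in a few lines. This is a minimal illustration, not Concord's implementation: the 4-dimensional vectors stand in for real 384-dimensional MiniLM embeddings, and the Platt parameters `a` and `b` are made-up placeholders (in practice they are fit on labeled mapping decisions).

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: dot product over norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def platt_calibrate(score, a=-4.0, b=2.0):
    # Platt scaling: a sigmoid over an affine transform of the raw score.
    # a and b are illustrative; real values come from fitting on labeled pairs.
    return 1.0 / (1.0 + math.exp(a * score + b))

# Hypothetical embeddings: a vendor field name, its likely OCSF target,
# and an unrelated field.
src = [0.10, 0.30, 0.50, 0.20]    # e.g. a vendor field name
dst = [0.12, 0.28, 0.52, 0.19]    # e.g. a semantically close OCSF name
other = [0.90, -0.20, 0.10, 0.00] # an unrelated field

sim_match = cosine_similarity(src, dst)
sim_nonmatch = cosine_similarity(src, other)
```

Because the matching pair points in nearly the same direction, its cosine similarity is close to 1, and after calibration it yields a higher confidence than the unrelated pair.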
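The dual-path retrieval idea can likewise be sketched: embed the query once, score it against both phrase nodes and passage nodes, and merge the top hits by similarity. The node structure, IDs, and toy 3-dimensional vectors below are hypothetical; the source does not specify Concord's actual storage or merge logic.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def top_k(query_vec, nodes, k):
    # Rank nodes by cosine similarity to the query vector, keep the best k.
    ranked = sorted(nodes, key=lambda n: cosine(query_vec, n["vec"]), reverse=True)
    return ranked[:k]

def dual_path_search(query_vec, phrase_nodes, passage_nodes, k=1):
    # Dual path: the same query vector searches both node types,
    # then the per-path winners are merged into one ranked list.
    hits = top_k(query_vec, phrase_nodes, k) + top_k(query_vec, passage_nodes, k)
    return sorted(hits, key=lambda n: cosine(query_vec, n["vec"]), reverse=True)

# Hypothetical pre-embedded graph nodes.
query = [1.0, 0.0, 0.0]
phrase_nodes = [
    {"id": "phrase:ocsf_field", "vec": [0.9, 0.1, 0.0]},
    {"id": "phrase:unrelated", "vec": [0.0, 1.0, 0.0]},
]
passage_nodes = [
    {"id": "passage:mapping_guide", "vec": [0.8, 0.2, 0.1]},
    {"id": "passage:changelog", "vec": [0.1, 0.0, 1.0]},
]

results = dual_path_search(query, phrase_nodes, passage_nodes, k=1)
```

Here the phrase node most aligned with the query outranks the best passage node, so the merged list interleaves both paths purely by score.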

Stop reconciling. Start trusting one timeline.

30-minute walkthrough. Your tools. Your tenants. Your audit cycle. We will show you exactly where Concord earns its keep.