
Core Engine: Universal Adapter

Every vendor signal in. Nothing on the live path that isn't deterministic.

The Universal Adapter is how data gets into Concord by IaxaI. Syslog, webhooks, file uploads, connector pulls. Known vendors hit a cached mapping at sub-millisecond latency. Unknown vendors get parked, flagged, and onboarded under analyst supervision. Never in front of a live event.

Onboarding a new client tool stack?

The Problem

Every new client brings tools you've never seen.

Onboarding is the part of MSSP work nobody puts on the website. A new bank runs a regional EDR you've never integrated. A new healthcare payer bought a niche identity tool last quarter. A scale-up has three log sources nobody on your team has touched. Engineering builds a parser. Then another. Then another. Onboarding becomes a queue.

The Impact

Or worse. Somebody slips an LLM into the live ingestion path.

Plenty of platforms now lean on a model call to parse unknown logs in real time. It demos beautifully and falls apart under audit. Non-deterministic outputs. Variable latency. A model version change quietly rewrites how every event gets interpreted. When the regulator asks why a SOC alert resolved the way it did, "the LLM said so" is not a defensible answer.

How Concord Helps

Concord splits the path in two. Known vendors flow through a deterministic cached mapping at sub-millisecond latency. Unknown vendors park in a queue, flagged for onboarding. An analyst-supervised wizard runs the LLM against the parked sample, shows the proposed OCSF mapping with calibrated confidence, and lets the analyst review, edit, accept, or reject. Once accepted, the mapping lands in the cache and the parked events flow through the deterministic path. The live pipeline never blocks on a model call.

The Outcome

Onboard new client tools in minutes. Defend every decision at audit time.

Stand a new client up without an engineering build. Hand the analyst the parked sample, walk through the proposed mapping, accept or edit. Every onboarding decision (analyst ID, model version, prompt, accepted mapping) written to the audit ledger. Deterministic on the hot path. Reviewable on the cold path. Both ledgered.

How It Works

Two paths. One entry point. The hot path is always deterministic.

Every event Concord receives gets a schema-shape hash and a cache lookup before anything else happens. What happens next depends on whether Concord has seen that shape before.
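The schema-shape hash can be sketched roughly as follows. This is an illustrative assumption, not Concord's actual scheme: it hashes the event's field layout (paths and value types) while ignoring values, so two events with the same structure collide on the same key and a renamed field produces a cache miss.

```python
import hashlib
import json

def schema_shape(event: dict, prefix: str = "") -> list[str]:
    """Flatten an event into sorted 'path:type' tokens. Values are ignored,
    so events with the same field layout yield the same shape."""
    tokens = []
    for key, value in event.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            tokens.extend(schema_shape(value, prefix=f"{path}."))
        else:
            tokens.append(f"{path}:{type(value).__name__}")
    return sorted(tokens)

def schema_hash(event: dict) -> str:
    """Deterministic hash of the event's schema shape; used as the cache key."""
    canonical = json.dumps(schema_shape(event))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Same layout, different values: same hash. Renamed field: different hash.
a = {"src_ip": "10.0.0.1", "severity": 5, "meta": {"host": "fw1"}}
b = {"src_ip": "10.9.9.9", "severity": 1, "meta": {"host": "fw2"}}
c = {"source_ip": "10.0.0.1", "severity": 5, "meta": {"host": "fw1"}}

assert schema_hash(a) == schema_hash(b)
assert schema_hash(a) != schema_hash(c)
```

Because the hash is a pure function of structure, the lookup itself is deterministic and cheap: no parsing decisions happen before the cache answers.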

Path 1: Known Vendor

Cached mapping. Sub-millisecond. Deterministic.

The schema-shape hash matches an entry in the mapping cache. Field extraction runs against a pre-computed map. Severity normalizes against a known scale. The event hands off to the Translation engine with its OCSF tuple already attached. No model call. No surprise. Same result every time, byte-for-byte.
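A cached mapping application might look like the sketch below. The mapping structure, field names, and severity scale are illustrative assumptions; the point is that extraction is a pure function of the event and the pre-computed map, so the same input always yields the same OCSF-shaped output.

```python
# Hypothetical cached mapping: vendor field paths -> OCSF field names,
# plus a severity-scale translation. Names are illustrative, not Concord's.
MAPPING = {
    "fields": {"src_ip": "src_endpoint.ip", "act": "activity_name"},
    "severity_scale": {"low": 2, "medium": 3, "high": 4, "critical": 5},
}

def apply_mapping(event: dict, mapping: dict) -> dict:
    """Deterministic field extraction against a pre-computed map.
    No model call: same event in, byte-for-byte same result out."""
    out = {}
    for vendor_field, ocsf_field in mapping["fields"].items():
        if vendor_field in event:
            out[ocsf_field] = event[vendor_field]
    if "severity" in event:
        out["severity_id"] = mapping["severity_scale"][event["severity"]]
    return out

normalized = apply_mapping(
    {"src_ip": "10.0.0.1", "act": "login", "severity": "high"}, MAPPING
)
assert normalized == {
    "src_endpoint.ip": "10.0.0.1",
    "activity_name": "login",
    "severity_id": 4,
}
```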

What ships today

  • 30+ vendor mappings in the cache library
  • 6 production-ready connectors: CrowdStrike, Okta, Microsoft Graph, Splunk, and Palo Alto, plus the connector framework itself
  • Multi-tier syslog listener with disk spillover queue at 1GB
  • Webhook endpoint with bearer auth, dedup, and rate limiting
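The webhook endpoint's dedup and rate limiting can be sketched like this. The window size, limit, and in-memory stores are assumptions for illustration, not Concord's production values: duplicates are dropped by event hash, and each source gets a fixed per-window budget.

```python
import hashlib

class WebhookGuard:
    """Illustrative event-hash dedup plus per-source rate limit.
    Window size and limits are placeholder assumptions."""
    def __init__(self, max_per_window: int, window_seconds: float = 1.0):
        self.seen: set[str] = set()                      # event-hash dedup store
        self.max_per_window = max_per_window
        self.window_seconds = window_seconds
        self.windows: dict[str, tuple[float, int]] = {}  # source -> (start, count)

    def admit(self, source: str, payload: bytes, now: float) -> bool:
        digest = hashlib.sha256(payload).hexdigest()
        if digest in self.seen:
            return False                                 # duplicate event, drop
        start, count = self.windows.get(source, (now, 0))
        if now - start >= self.window_seconds:
            start, count = now, 0                        # roll to a new window
        if count >= self.max_per_window:
            return False                                 # source over its budget
        self.windows[source] = (start, count + 1)
        self.seen.add(digest)
        return True

guard = WebhookGuard(max_per_window=2)
assert guard.admit("crowdstrike", b'{"id":1}', now=0.0)
assert not guard.admit("crowdstrike", b'{"id":1}', now=0.1)  # deduplicated
assert guard.admit("crowdstrike", b'{"id":2}', now=0.2)
assert not guard.admit("crowdstrike", b'{"id":3}', now=0.3)  # rate limited
assert guard.admit("crowdstrike", b'{"id":3}', now=1.5)      # new window
```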

Path 2: Unknown Vendor

Park. Flag. Derive offline. Analyst approves.

The schema-shape hash misses. The event gets parked in an encrypted queue with a source hint. A "needs onboarding" badge appears in the analyst console. Ingestion never blocks. When an analyst opens the wizard, Concord runs the patent-pending Universal Adapter derivation against the parked sample and presents a proposed OCSF mapping with per-field calibrated confidence and rationale.
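The park-and-flag step can be sketched as below. Encryption at rest, the onboarding-session linkage, and the console badge are elided, and all names are assumptions; the sketch shows the key behavior: parking never blocks ingestion, and accepted mappings release everything that accumulated.

```python
class ParkedQueue:
    """Illustrative park-and-flag store for unknown vendors."""
    def __init__(self):
        self.parked: dict[str, list[dict]] = {}   # schema-shape hash -> events
        self.flagged: set[str] = set()            # shapes awaiting onboarding

    def park(self, event: dict, source_hint: str, shape_hash: str) -> None:
        self.parked.setdefault(shape_hash, []).append(
            {"event": event, "source_hint": source_hint}
        )
        self.flagged.add(shape_hash)              # drives the onboarding badge

    def release(self, shape_hash: str) -> list[dict]:
        # Called on mapping acceptance: parked events replay through the
        # deterministic path, so nothing was lost while the shape was unknown.
        self.flagged.discard(shape_hash)
        return [p["event"] for p in self.parked.pop(shape_hash, [])]
```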

What the analyst sees

  • The parked sample event
  • The proposed OCSF mapping with diff preview
  • Calibrated confidence per field
  • Edit inline, accept as a unit, or reject and route to engineering

On acceptance, the mapping writes to the cache, the parked events release through the deterministic path, and the audit ledger captures the analyst ID, the model version, the prompt, and the accepted mapping. A receipt the next regulator can replay.
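A hash-chained audit record of the kind described can be sketched as follows. The chaining scheme and field names are illustrative assumptions, not Concord's ledger format; the property it demonstrates is that each entry commits to its predecessor's hash, so any after-the-fact rewrite breaks verification.

```python
import hashlib
import json

GENESIS = "0" * 64

def ledger_append(chain: list[dict], record: dict) -> dict:
    """Append an entry that commits to the previous entry's hash."""
    prev_hash = chain[-1]["entry_hash"] if chain else GENESIS
    body = json.dumps(record, sort_keys=True)
    entry = {
        "record": record,
        "prev_hash": prev_hash,
        "entry_hash": hashlib.sha256((prev_hash + body).encode()).hexdigest(),
    }
    chain.append(entry)
    return entry

def ledger_verify(chain: list[dict]) -> bool:
    """Replay the chain; any tampered record or broken link fails."""
    prev = GENESIS
    for entry in chain:
        body = json.dumps(entry["record"], sort_keys=True)
        if entry["prev_hash"] != prev:
            return False
        if entry["entry_hash"] != hashlib.sha256((prev + body).encode()).hexdigest():
            return False
        prev = entry["entry_hash"]
    return True

chain: list[dict] = []
ledger_append(chain, {"analyst_id": "a.chen", "model_version": "v3",
                      "prompt": "derive OCSF mapping", "mapping_accepted": True})
assert ledger_verify(chain)
```

This is what makes the cold path replayable: a regulator can re-verify the chain and see exactly who approved which mapping, under which model version.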

The Rule

No ML in the hot path. Rigorously enforced.

The live ingestion path is always deterministic. Receive, schema hash, cache lookup, extract, hand off. That's it. The LLM only runs in two places: the analyst-supervised onboarding wizard, and the drift-triggered re-mapping worker. Both are out of band. Neither blocks live events.

This matters for two reasons. First, latency. A cached mapping hits in under a millisecond. A model call costs hundreds to thousands of milliseconds. The throughput target (100K events per second at the syslog receiver) is unreachable with a model in the path.

Second, audit. Regulators want to know what happened and why. "The mapping cache returned this OCSF tuple for this schema hash, and here's the analyst who approved it" is a defensible answer. "The LLM said so" is not. Concord ledgers every onboarding decision with the analyst ID, the model version, and the prompt. So even the cold path is reproducible.

Why MSSPs Care

New client. New tools. No engineering build.

The unknown-vendor case is the MSSP onboarding problem. Concord turns it into a wizard, not a sprint.

Stand a new client up in a session, not a sprint

When a new bank, healthcare payer, insurance carrier, or scale-up hands you a stack with vendors you've never integrated, the unknown ones park gracefully on day one. Walk through the onboarding wizard with your senior analyst and you're live. Engineering does not have to write a parser.

Parked events never get lost

Unknown vendors don't blackhole. Their events sit encrypted in the parked queue with a source hint and a schema-shape hash. Once the mapping is approved, the parked events release through the deterministic path retroactively. You don't lose the first day of telemetry while engineering catches up.

Multi-tenant from the receiver outward

Tenant tag stamped at ingest. Per-tenant rate limits and per-tenant disk-spillover quotas so one noisy client cannot starve others on a shared Edge Gateway. TLS and mTLS termination on the syslog and webhook receivers. Per-tenant key issuance on the webhook auth path.
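Per-tenant rate limiting of this kind is commonly built as a token bucket per tenant; the sketch below is an illustrative assumption (rates, burst sizes, and class names are placeholders, not Concord's values). Each tenant draws from its own bucket, so one noisy client exhausts only its own budget.

```python
class TenantBuckets:
    """Illustrative per-tenant token buckets: rate tokens/sec, burst capacity."""
    def __init__(self, rate: float, burst: float):
        self.rate, self.burst = rate, burst
        self.state: dict[str, tuple[float, float]] = {}  # tenant -> (tokens, last_ts)

    def allow(self, tenant: str, now: float) -> bool:
        tokens, last = self.state.get(tenant, (self.burst, now))
        # Refill proportionally to elapsed time, capped at burst capacity.
        tokens = min(self.burst, tokens + (now - last) * self.rate)
        if tokens < 1.0:
            self.state[tenant] = (tokens, now)
            return False                                 # this tenant is throttled
        self.state[tenant] = (tokens - 1.0, now)
        return True

buckets = TenantBuckets(rate=1.0, burst=2.0)
assert buckets.allow("noisy-client", now=0.0)
assert buckets.allow("noisy-client", now=0.0)
assert not buckets.allow("noisy-client", now=0.0)   # burst exhausted
assert buckets.allow("quiet-client", now=0.0)        # other tenants unaffected
```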

Drift-coupled re-mapping

When a vendor renames a field or restructures a schema, Drift Detection fires a re-mapping job. The Universal Adapter proposes a diff against the existing cached mapping. Above a confidence threshold and below a change-magnitude threshold, the new mapping auto-applies with versioning. Below either, it routes back to the analyst onboarding flow as a drift review.
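The two-threshold routing described above reduces to a small decision function. The threshold values here are placeholder assumptions, not Concord's tuned settings.

```python
def route_remap(confidence: float, change_magnitude: float,
                conf_threshold: float = 0.95,
                change_threshold: float = 0.2) -> str:
    """Route a drift-triggered re-mapping proposal.
    Auto-apply only when confidence is high AND the change is small;
    otherwise send it back to the analyst as a drift review."""
    if confidence >= conf_threshold and change_magnitude <= change_threshold:
        return "auto_apply"        # applied with versioning
    return "analyst_review"        # routed to the onboarding flow

assert route_remap(0.99, 0.05) == "auto_apply"
assert route_remap(0.99, 0.50) == "analyst_review"   # change too large
assert route_remap(0.70, 0.05) == "analyst_review"   # confidence too low
```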

What Ships Today vs The V1 Work

Receivers are built. The hot-path purification is the work.

Shipped

  • Multi-tier syslog listener. UDP and TCP. RFC3164, RFC5424, CEF, LEEF, and JSON auto-detection. 100K-event in-memory queue with SQLite disk spillover at a 1GB cap. Exponential-backoff retry with circuit breaker.
  • Webhook endpoint. Bearer-token auth, per-source rate limit, event-hash dedup, 10MB payload guard.
  • File uploader. CSV, JSON, XML. Schema inference into the graph layer.
  • Connector-pull framework. OAuth2, API key, and bearer auth types. Encrypted credential storage. Cursor-based incremental fetch. Six production-ready connectors live across CrowdStrike, Okta, Microsoft Graph, Splunk, and Palo Alto.
  • Mapping cache library. 30+ vendor packs covering EDR, identity, SIEM, NDR, email security, and the major financial-services platforms. Each pack contains transforms, transports, field maps, and CEF headers for vendor identification.

V1 Work

  • Hot-path purification. Remove the synchronous model call from the live normalize path. The hot path becomes cache lookup or park-and-flag, full stop.
  • Park-and-flag queue. Encrypted parked-event store with source hints, schema-shape hashes, and onboarding session linkage.
  • Onboarding wizard. The analyst-supervised LLM run. Proposed mapping, per-field confidence, edit inline, accept or reject. Acceptance writes cache and ledger.
  • Drift-coupled re-mapping. When the Drift Detector flags schema-shape drift on a known vendor, fire a re-mapping job and route by confidence threshold.
  • Multi-tenant receiver hardening. TLS and mTLS termination. Per-tenant webhook key issuance. Tenant tagging at ingest. Per-tenant rate limits and spillover quotas.
  • Mapping versioning and rollback. Every mapping versioned, ledgered, and one-click revertible.

Built to be auditable, not just functional

Patent-pending

The Universal Adapter's LLM-driven mapping is part of the Semantic Translation patent family in active prosecution at the USPTO.

Air-gapped deployable

Edge Gateway runs on-prem in Docker. Onboarding and drift LLM calls default to a local model so client telemetry never has to leave the customer perimeter.

Ledger-backed

Every mapping derivation, every analyst acceptance, every drift re-mapping, and every parked event is written to the hash-chained Auditability Ledger.

Where It Fits

The sole entry point into the Concord engine.

Every event that reaches Translation, Entity Resolution, the Knowledge Graph, or any of the three surfaces came in through the Universal Adapter. Receivers on one side. A deterministic hot path and a supervised cold path on the other. The audit ledger underneath both.

Stop reconciling. Start trusting one timeline.

30-minute walkthrough. Your tools. Your tenants. Your audit cycle. We will show you exactly where Concord earns its keep.