Provenance, not black box
AI tools that won't show their work are a liability. When an alignment finding flags a strategic gap, the first question from the board is: why? The second is: who decided this, the AI or a person?
Atlas answers both. Every AI-produced classification, recommendation, or alignment finding stores model_version, prompt_version, confidence, and the reasoning span. If a human edits the output, the original is preserved and the edit is attributed to that person. The audit trail is the product.
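The shape of such a record can be sketched roughly as follows. This is a minimal illustration, not Atlas's actual schema: the class and field names (`AIFinding`, `FindingRecord`, `reasoning_span`, `apply_edit`) are hypothetical, chosen to mirror the provenance fields named above.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class AIFinding:
    """Immutable AI output plus its provenance (hypothetical sketch)."""
    text: str
    model_version: str
    prompt_version: str
    confidence: float
    reasoning_span: str  # the model's reasoning, captured verbatim

@dataclass
class FindingRecord:
    """Wraps an AI finding; human edits overlay it without mutating it."""
    original: AIFinding
    edited_text: Optional[str] = None
    edited_by: Optional[str] = None
    edited_at: Optional[datetime] = None

    def apply_edit(self, new_text: str, editor: str) -> None:
        # The original AIFinding is frozen; edits are stored alongside
        # it with attribution, so the audit trail survives.
        self.edited_text = new_text
        self.edited_by = editor
        self.edited_at = datetime.now(timezone.utc)

    @property
    def current_text(self) -> str:
        # Readers see the human edit if one exists, else the AI output.
        return self.edited_text if self.edited_text is not None else self.original.text
```

The key design choice the sketch encodes: the AI output is frozen at creation, and a human edit is a separate, attributed overlay rather than an in-place mutation, so "who decided this" is always answerable from the record itself.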
That's not a feature. It's a design principle — see ADR-0001 in our open architecture decisions.