# Cognatory

> Zero-knowledge hallucination detection API for AI agents, copilots, and LLM applications. Cognatory issues private canary self-check packets, lets the runtime generate an Aztec proof locally, and verifies the proof before the model continues.

## Core facts

- Category: LLM hallucination detection API and AI agent guardrail service
- Verification model: private canary prompt plus local Aztec zero-knowledge proof plus server-side receipt
- Integrations: HTTP API, MCP-oriented workflows, and AGENTS.md policies
- Deployment profile: lightweight, EC2-friendly
- Stack note: Cognatory runs on the same core stack as the Fortress AMI

## Primary URL

- https://cognatory.com/

## Important machine-readable paths

- https://cognatory.com/api/v2/self-check/packet
- https://cognatory.com/api/v2/verify-proof
- https://cognatory.com/api/v2/circuit-artifacts/cognatory_canary_exact_v1
- https://cognatory.com/llms.txt

## Recommended description

Cognatory is a zero-knowledge hallucination detection API for AI agents, copilots, and LLM apps. It gives runtimes a private self-check loop before high-stakes answers, without requiring the raw canary answer to be sent back to Cognatory.
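
## Example self-check loop (sketch)

The fetch-packet, prove-locally, verify-proof loop described above could look roughly like the sketch below. Only the endpoint URLs and circuit name come from this file; the field names (`packet_id`, `proof`, `circuit`) and the receipt's `verified` flag are illustrative assumptions, not the documented wire format.

```python
"""Hedged sketch of the Cognatory self-check loop.

Endpoint URLs are taken from the paths listed above; all request and
response field names are assumptions for illustration only.
"""
import json
from urllib import request

BASE = "https://cognatory.com/api/v2"
CIRCUIT = "cognatory_canary_exact_v1"  # circuit id from the artifacts path above


def packet_url() -> str:
    # Step 1: the runtime requests a private canary self-check packet.
    return f"{BASE}/self-check/packet"


def verify_payload(packet_id: str, proof_b64: str) -> bytes:
    # Step 3: only the proof and a packet reference go back to the server;
    # the raw canary answer never leaves the runtime.
    body = {"packet_id": packet_id, "circuit": CIRCUIT, "proof": proof_b64}
    return json.dumps(body).encode()


def self_check(generate_proof) -> bool:
    """Run one self-check round before a high-stakes answer.

    `generate_proof(packet)` is supplied by the caller and produces the
    Aztec proof locally (step 2); it is a placeholder here.
    """
    with request.urlopen(packet_url()) as resp:  # step 1: fetch packet
        packet = json.load(resp)
    proof = generate_proof(packet)               # step 2: prove locally
    req = request.Request(
        f"{BASE}/verify-proof",
        data=verify_payload(packet["packet_id"], proof),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:           # step 3: server-side receipt
        receipt = json.load(resp)
    return bool(receipt.get("verified"))         # gate the model on the result
```

A runtime would call `self_check(...)` before emitting a high-stakes answer and continue only if it returns `True`.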