The Copilot family of chat completion endpoints is used to generate contextual crypto responses using Messari's knowledge graph and specialized tools. The family offers three endpoint options:

- OpenAI-compatible: returns data in the standard OpenAI response format, making it compatible with popular SDKs such as the OpenAI Python package. It supports both streaming and synchronous response modes, allowing you to plug directly into packages that expect the OpenAI format.
- Vercel DSP-compatible: implements the Vercel Data Stream Protocol, enabling direct integration with Vercel's AI SDK library through the useChat function. It supports streaming responses only.
- Messari Format: serves as the standard option but uses Messari's proprietary format.

For most implementations, the OpenAI-compatible endpoint is recommended for its broader compatibility and flexibility.
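As a rough sketch of the OpenAI-compatible route, the standard OpenAI Python package can be pointed at the endpoint directly. The base URL, model name, and authentication parameter below are illustrative assumptions rather than documented values; consult the API reference for the real ones.

```python
from openai import OpenAI

# Assumed base URL for the OpenAI-compatible Copilot endpoint; the real value
# is documented in the API reference.
client = OpenAI(
    base_url="https://api.messari.io/ai/openai",  # assumption, not verified
    api_key="<YOUR_MESSARI_API_KEY>",
)

# Synchronous request: the response follows the standard OpenAI shape.
response = client.chat.completions.create(
    model="messari-copilot",  # placeholder model name
    messages=[
        {"role": "user", "content": "What are the main narratives in crypto this week?"}
    ],
)
print(response.choices[0].message.content)
```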

Chat Completions Quickstart

AI Copilot Chat Example

Get started quickly with our SDK example for AI Copilot Chat. This guide walks you through instantiating the client, making chat completion requests, and handling responses.
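As a hedged sketch, the example below reuses the assumed OpenAI-compatible client from the overview above rather than the Messari SDK itself, and shows response handling in streaming mode; chunk handling follows the standard OpenAI streaming shape.

```python
# Streaming request: iterate over chunks as they arrive and print the deltas.
stream = client.chat.completions.create(
    model="messari-copilot",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize recent Ethereum protocol upgrades."}],
    stream=True,
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```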

Authorization

Refer to our Authentication docs to create an API key.

Copilot Rate Limits

| User Tier  | Requests Per Day |
|------------|------------------|
| Unpaid     | 2                |
| Lite       | 10               |
| Pro        | 20               |
| Enterprise | 20               |

Credits

If you need more Copilot requests, contact api@messari.io about enrolling in our consumption-based credit system. If you run out of credits, your requests will fall back to the default rate limits defined in the table above. After both the credits and the daily rate limit are exhausted, the API responds with a 429 error and the message "rate limit exceeded. Resets in <time_till_next_message>".
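Whichever tier you are on, it helps to handle the 429 explicitly. The sketch below assumes the OpenAI-compatible client from the quickstart and simply surfaces the rate-limit message; it is not a prescribed retry policy.

```python
from openai import RateLimitError

def ask_copilot(client, prompt: str) -> str | None:
    """Send a chat completion request and surface rate-limit errors cleanly."""
    try:
        response = client.chat.completions.create(
            model="messari-copilot",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content
    except RateLimitError as err:
        # Returned once credits and the daily allowance are both exhausted:
        # 429 with "rate limit exceeded. Resets in <time_till_next_message>"
        print(f"Copilot rate limit hit: {err}")
        return None
```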