
Get Your API Key

  1. Sign up at redpill.ai
  2. Navigate to the Dashboard
  3. Generate your API key
  4. Add credits to your account
Keep your API key secure! Never commit it to version control or expose it in client-side code.
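A common way to do this is to read the key from an environment variable at runtime instead of hard-coding it. A minimal sketch (the variable name REDPILL_API_KEY is just an example, not an official convention):
import os

# Load the key from the environment so it never appears in source control
api_key = os.environ.get("REDPILL_API_KEY")
if not api_key:
    raise RuntimeError("Set the REDPILL_API_KEY environment variable first")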

Make Your First Request

RedPill is OpenAI-compatible. Just change the base URL:
curl https://api.redpill.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "openai/gpt-5",
    "messages": [
      {
        "role": "user",
        "content": "Explain how RedPill protects my privacy"
      }
    ]
  }'
All requests automatically flow through RedPill’s privacy-protected gateway, regardless of which model you choose.
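The same request with the official OpenAI Python SDK only requires pointing base_url at RedPill (a sketch using the model from the curl example above):
from openai import OpenAI

# The standard OpenAI client, pointed at RedPill's gateway
client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[{"role": "user", "content": "Explain how RedPill protects my privacy"}]
)

print(response.choices[0].message.content)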

Try Different Models

RedPill supports 50+ models from multiple providers:
curl https://api.redpill.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "anthropic/claude-sonnet-4.5",
    "messages": [{"role": "user", "content": "What are the privacy features?"}]
  }'
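Because the request format is identical across providers, switching models is just a string change. A quick sketch that sends the same prompt to two models mentioned on this page, reusing the client configured earlier:
# Compare the same prompt across two providers
for model in ["openai/gpt-5", "anthropic/claude-sonnet-4.5"]:
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "What are the privacy features?"}]
    )
    print(f"--- {model} ---")
    print(response.choices[0].message.content)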

Browse All Models

See the complete list of 50+ supported models →

Use Phala Confidential AI Models

Phala models run entirely in private GPU enclaves with cryptographic attestation:
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

# DeepSeek V3 - Confidential
response = client.chat.completions.create(
    model="phala/deepseek-chat-v3-0324",
    messages=[
        {"role": "user", "content": "Analyze this sensitive financial data: ..."}
    ]
)

print(response.choices[0].message.content)
View all six Phala confidential models on the Confidential AI Models page.

Streaming Responses

Enable streaming for real-time responses:
stream = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[{"role": "user", "content": "Write a story about privacy"}],
    stream=True
)

for chunk in stream:
    # Guard against chunks without choices or content before printing
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)

Verify Private Execution

Get cryptographic proof that your request ran in secure hardware:
# 1. Make a request and get the request ID
RESPONSE=$(curl -s https://api.redpill.ai/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"phala/qwen-2.5-7b-instruct","messages":[{"role":"user","content":"Hello"}]}')

REQUEST_ID=$(echo "$RESPONSE" | jq -r '.id')

# 2. Get the attestation report (quote the URL so the shell doesn't expand the query string)
curl "https://api.redpill.ai/v1/attestation/report?model=phala/qwen-2.5-7b-instruct" \
  -H "Authorization: Bearer YOUR_API_KEY"

# 3. Get the cryptographic signature for this request
curl "https://api.redpill.ai/v1/signature/$REQUEST_ID?model=phala/qwen-2.5-7b-instruct" \
  -H "Authorization: Bearer YOUR_API_KEY"

Learn About Attestation

Understand how to verify private execution →

List Available Models

Discover all models and their capabilities:
curl https://api.redpill.ai/v1/models \
  -H "Authorization: Bearer YOUR_API_KEY"

Function Calling

Use function calling with any supported model:
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
                },
                "required": ["location"]
            }
        }
    }
]

response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[{"role": "user", "content": "What's the weather in SF?"}],
    tools=tools
)

print(response.choices[0].message.tool_calls)
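When the model returns a tool call, your code executes the function and sends the result back as a tool message so the model can compose its final answer. A sketch of that round trip (the weather result below is a hypothetical stand-in for your own get_weather implementation):
import json

tool_call = response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

# Run your own function with the arguments the model chose (hypothetical result)
weather = {"location": args["location"], "temperature": 18, "unit": args.get("unit", "celsius")}

followup = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[
        {"role": "user", "content": "What's the weather in SF?"},
        response.choices[0].message,  # the assistant turn that requested the tool call
        {"role": "tool", "tool_call_id": tool_call.id, "content": json.dumps(weather)},
    ],
    tools=tools,
)

print(followup.choices[0].message.content)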

Advanced Function Calling

Learn more about function calling →

Next Steps

Need Help?