Overview
RedPill is fully compatible with the official OpenAI SDK. Just change the base URL and you get access to 50+ models with TEE privacy protection.
The easiest way to get started - works with your existing OpenAI code!
Installation
Python: pip install openai
TypeScript/JavaScript: npm install openai
Python Quick Start
Basic Setup
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_REDPILL_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain how TEE protects my data"}
    ]
)

print(response.choices[0].message.content)
Streaming Responses
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_REDPILL_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

stream = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[{"role": "user", "content": "Write a story about AI"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
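If you also need the complete reply once streaming finishes (for logging or post-processing), you can accumulate the deltas while printing them. A minimal variation of the loop above:

collected = []
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
        collected.append(delta)

full_text = "".join(collected)  # the whole response after the stream ends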
Function Calling
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_REDPILL_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather in a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City name, e.g. San Francisco"
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"]
                    }
                },
                "required": ["location"]
            }
        }
    }
]

response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools
)

# Check if model wants to call a function
if response.choices[0].message.tool_calls:
    tool_call = response.choices[0].message.tool_calls[0]
    print(f"Function: {tool_call.function.name}")
    print(f"Arguments: {tool_call.function.arguments}")
Embeddings
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_REDPILL_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

response = client.embeddings.create(
    model="text-embedding-3-small",
    input="RedPill protects your AI requests with hardware TEE"
)

embedding = response.data[0].embedding
print(f"Embedding dimensions: {len(embedding)}")  # 1536
Vision (Image Analysis)
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_REDPILL_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's in this image?"},
                {
                    "type": "image_url",
                    "image_url": {
                        "url": "https://example.com/image.jpg"
                    }
                }
            ]
        }
    ]
)

print(response.choices[0].message.content)
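For a local file instead of a public URL, you can send the image as a base64 data URL. A minimal sketch, reusing the client above and assuming a hypothetical photo.jpg in the working directory:

import base64

with open("photo.jpg", "rb") as f:  # hypothetical local image path
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}
                }
            ]
        }
    ]
)

print(response.choices[0].message.content)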
TypeScript/JavaScript Quick Start
Basic Setup
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.REDPILL_API_KEY,
  baseURL: 'https://api.redpill.ai/v1'
});

async function main() {
  const response = await client.chat.completions.create({
    model: 'openai/gpt-5',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'Explain how TEE protects my data' }
    ]
  });

  console.log(response.choices[0].message.content);
}

main();
Streaming in TypeScript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.REDPILL_API_KEY,
  baseURL: 'https://api.redpill.ai/v1'
});

async function main() {
  const stream = await client.chat.completions.create({
    model: 'openai/gpt-5',
    messages: [{ role: 'user', content: 'Write a story about AI' }],
    stream: true
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

main();
Function Calling in TypeScript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.REDPILL_API_KEY!,
  baseURL: 'https://api.redpill.ai/v1'
});

async function main() {
  const tools: OpenAI.Chat.ChatCompletionTool[] = [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get current weather in a location',
        parameters: {
          type: 'object',
          properties: {
            location: {
              type: 'string',
              description: 'City name'
            },
            unit: {
              type: 'string',
              enum: ['celsius', 'fahrenheit']
            }
          },
          required: ['location']
        }
      }
    }
  ];

  const response = await client.chat.completions.create({
    model: 'openai/gpt-5',
    messages: [{ role: 'user', content: "What's the weather in Paris?" }],
    tools
  });

  const toolCall = response.choices[0].message.tool_calls?.[0];
  if (toolCall) {
    console.log('Function:', toolCall.function.name);
    console.log('Arguments:', toolCall.function.arguments);
  }
}

main();
Using Multiple Models
RedPill gives you access to 50+ models through the same SDK:
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_REDPILL_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

# OpenAI GPT-5 for general tasks
gpt5_response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[{"role": "user", "content": "Summarize this article..."}]
)

# Anthropic Claude for analysis
claude_response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4.5",
    messages=[{"role": "user", "content": "Analyze this data..."}]
)

# DeepSeek for coding
deepseek_response = client.chat.completions.create(
    model="deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "Write a Python function..."}]
)

# Phala TEE model for sensitive data
phala_response = client.chat.completions.create(
    model="phala/qwen-2.5-7b-instruct",
    messages=[{"role": "user", "content": "Process this confidential..."}]
)
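Because every model sits behind the same client, switching models is just a string change. A small sketch of a task-based router, reusing the client above (the task-to-model mapping is illustrative, not an official recommendation):

MODEL_FOR_TASK = {
    "general": "openai/gpt-5",
    "analysis": "anthropic/claude-sonnet-4.5",
    "coding": "deepseek/deepseek-chat",
    "confidential": "phala/qwen-2.5-7b-instruct",
}

def ask(task, prompt):
    """Send a prompt to whichever model is configured for the task."""
    response = client.chat.completions.create(
        model=MODEL_FOR_TASK[task],
        messages=[{"role": "user", "content": prompt}]
    )
    return response.choices[0].message.content

print(ask("coding", "Write a Python function that reverses a string"))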
Environment Variables
Never hardcode API keys. Use environment variables:
# .env
REDPILL_API_KEY=sk-your-api-key-here

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ.get("REDPILL_API_KEY"),
    base_url="https://api.redpill.ai/v1"
)

// The OpenAI SDK does not read REDPILL_API_KEY on its own, so pass it explicitly
const client = new OpenAI({
  apiKey: process.env.REDPILL_API_KEY,
  baseURL: 'https://api.redpill.ai/v1'
});
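If you keep the key in a local .env file during development, a loader such as python-dotenv (assumed to be installed separately) can populate the environment before the client is created. A minimal sketch:

# pip install python-dotenv
import os
from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()  # reads .env from the current directory into os.environ

client = OpenAI(
    api_key=os.environ["REDPILL_API_KEY"],
    base_url="https://api.redpill.ai/v1"
)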
Error Handling
from openai import OpenAI, APIError, RateLimitError

client = OpenAI(
    api_key="YOUR_REDPILL_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

try:
    response = client.chat.completions.create(
        model="openai/gpt-5",
        messages=[{"role": "user", "content": "Hello!"}]
    )
    print(response.choices[0].message.content)
except RateLimitError:
    print("Rate limit exceeded. Please wait and retry.")
except APIError as e:
    print(f"API error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")
Async Support
Python Async
import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI(
    api_key="YOUR_REDPILL_API_KEY",
    base_url="https://api.redpill.ai/v1"
)

async def main():
    response = await client.chat.completions.create(
        model="openai/gpt-5",
        messages=[{"role": "user", "content": "Hello!"}]
    )
    print(response.choices[0].message.content)

asyncio.run(main())
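The async client pays off when you fan out several requests at once. A minimal sketch using asyncio.gather with the AsyncOpenAI client above (the prompts are placeholders):

async def fan_out():
    prompts = ["Summarize TEE in one sentence", "Name three uses of embeddings"]
    tasks = [
        client.chat.completions.create(
            model="openai/gpt-5",
            messages=[{"role": "user", "content": p}]
        )
        for p in prompts
    ]
    responses = await asyncio.gather(*tasks)  # run the requests concurrently
    for r in responses:
        print(r.choices[0].message.content)

asyncio.run(fan_out())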
TypeScript Async (built-in)
// All OpenAI SDK methods are async by default in TypeScript
const response = await client.chat.completions.create({
  model: 'openai/gpt-5',
  messages: [{ role: 'user', content: 'Hello!' }]
});
Migration from OpenAI
1. Update Base URL: add base_url="https://api.redpill.ai/v1" to your client initialization.
2. Update API Key: replace your OpenAI API key with your RedPill API key.
3. Add Provider Prefix: change model="gpt-5" to model="openai/gpt-5".
4. Test: run your application. Everything else stays the same, as the before-and-after sketch below shows.
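In practice the whole migration is two or three changed lines. A minimal before-and-after sketch in Python (the "before" lines are shown as comments):

# Before: talking to OpenAI directly
# client = OpenAI(api_key="YOUR_OPENAI_API_KEY")
# response = client.chat.completions.create(model="gpt-5", ...)

# After: same SDK, pointed at RedPill
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_REDPILL_API_KEY",        # RedPill key instead of OpenAI key
    base_url="https://api.redpill.ai/v1"   # new base URL
)

response = client.chat.completions.create(
    model="openai/gpt-5",                  # provider prefix added
    messages=[{"role": "user", "content": "Hello!"}]
)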
Detailed Migration Guide: full migration guide with examples →
Supported Features
| Feature | Python | TypeScript | Notes |
|---|---|---|---|
| Chat Completions | ✅ | ✅ | All 50+ models |
| Streaming | ✅ | ✅ | Real-time responses |
| Function Calling | ✅ | ✅ | Tool use support |
| Embeddings | ✅ | ✅ | Vector generation |
| Vision | ✅ | ✅ | Image analysis |
| Async/Await | ✅ | ✅ | Non-blocking calls |
Popular Models
| Model | Best For | Context |
|---|---|---|
| openai/gpt-5 | General purpose | 128K tokens |
| anthropic/claude-sonnet-4.5 | Reasoning, analysis | 200K tokens |
| deepseek/deepseek-chat | Coding tasks | 64K tokens |
| google/gemini-2.5-pro | Multimodal tasks | 2M tokens |
| phala/qwen-2.5-7b-instruct | Confidential AI (TEE) | 32K tokens |
View All Models: browse the complete model list →
Example Projects
CLI Chatbot: build a command-line AI assistant
Document Q&A: create a RAG system with embeddings
Code Assistant: build an AI pair programmer
Data Analyzer: analyze datasets with AI
Next Steps