The Tinfoil Node SDK is a wrapper around the OpenAI Node client that provides secure communication with Tinfoil enclaves. It exposes the same API as the OpenAI SDK and adds security features: automatic verification that the endpoint is running in a secure Tinfoil enclave, TLS certificate pinning, and attestation validation. All payloads are encrypted end-to-end using EHBP (Encrypted HTTP Body Protocol), which encrypts data directly to the attested enclave using HPKE (RFC 9180).
The SDK supports browser environments, allowing you to use the secure enclave-backed API directly from web applications.
Security Warning: Using API keys directly in the browser exposes them to anyone who can view your page source. For production applications, always use a backend server to handle API keys.
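For production use, a common pattern is to proxy requests through your own backend so the key never reaches the browser. Below is a minimal sketch of that pattern; the Express server, the /api/chat route, and the request body shape are illustrative assumptions and not part of the SDK.

```typescript
// server.ts — minimal sketch of a backend proxy that keeps the API key off the browser.
// Express, the /api/chat route, and the request body shape are illustrative assumptions.
import express from 'express';
import { TinfoilAI } from 'tinfoil';

const app = express();
app.use(express.json());

// The key stays server-side; the browser never sees it.
const client = new TinfoilAI({ apiKey: process.env.TINFOIL_API_KEY });

app.post('/api/chat', async (req, res) => {
  try {
    const response = await client.chat.completions.create({
      model: 'llama3-3-70b',
      messages: req.body.messages,
    });
    res.json({ content: response.choices[0]?.message?.content });
  } catch (err) {
    res.status(500).json({ error: 'Request failed' });
  }
});

app.listen(3000);
```

The browser then calls your /api/chat endpoint instead of Tinfoil directly, and the API key remains on the server.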
```typescript
import { TinfoilAI } from 'tinfoil';

// 1. Create a client
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY,
});

// 2. Use the client as you would the OpenAI client
// see https://github.com/openai/openai-node for API documentation
```
DeepSeek V3.1 Terminus

```typescript
import { TinfoilAI } from 'tinfoil';

// Configure client for DeepSeek V3.1 Terminus
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Multi-step agentic workflow
const response = await client.chat.completions.create({
  model: 'deepseek-v31-terminus',
  messages: [
    {
      role: 'user',
      content: 'Write a TypeScript function to validate email addresses using regex, then test it with several examples and debug any issues.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);
```
DeepSeek R1
```typescript
import { TinfoilAI } from 'tinfoil';

// Configure client for DeepSeek R1
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Complex reasoning task
const response = await client.chat.completions.create({
  model: 'deepseek-r1-0528',
  messages: [
    {
      role: 'user',
      content: 'Solve this step by step: If a train travels 120 miles in 2 hours, and then increases its speed by 25% for the next 3 hours, how far does it travel in total?'
    }
  ]
});

console.log(response.choices[0]?.message?.content);
```
Mistral Small 3.1 24B
```typescript
import { TinfoilAI } from 'tinfoil';

// Configure client for Mistral Small 3.1 24B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Multilingual conversation
const response = await client.chat.completions.create({
  model: 'mistral-small-3-1-24b',
  messages: [
    {
      role: 'user',
      content: 'Explain the concept of machine learning in both English and French.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);
```
Llama 3.3 70B
```typescript
import { TinfoilAI } from 'tinfoil';

// Configure client for Llama 3.3 70B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Conversational AI
const response = await client.chat.completions.create({
  model: 'llama3-3-70b',
  messages: [
    {
      role: 'user',
      content: 'What are the key differences between renewable and non-renewable energy sources?'
    }
  ]
});

console.log(response.choices[0]?.message?.content);
```
GPT-OSS 120B
```typescript
import { TinfoilAI } from 'tinfoil';

// Configure client for GPT-OSS 120B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Advanced reasoning with configurable effort levels
const response = await client.chat.completions.create({
  model: 'gpt-oss-120b',
  messages: [
    {
      role: 'user',
      content: 'Analyze the trade-offs between different database architectures for a high-traffic e-commerce platform.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);
```
Qwen3 Coder 480B
```typescript
import { TinfoilAI } from 'tinfoil';

// Configure client for Qwen3 Coder 480B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Repository-scale code understanding and agentic coding
const response = await client.chat.completions.create({
  model: 'qwen3-coder-480b',
  messages: [
    {
      role: 'user',
      content: 'Review this codebase and suggest architectural improvements for better modularity and testability.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);
```
Qwen 2.5 72B
```typescript
import { TinfoilAI } from 'tinfoil';

// Configure client for Qwen 2.5 72B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Code generation and analysis
const response = await client.chat.completions.create({
  model: 'qwen2-5-72b',
  messages: [
    {
      role: 'user',
      content: 'Write a TypeScript function to calculate the Fibonacci sequence up to n terms, then explain how it works.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);
```
This library is a drop-in replacement for the official OpenAI Node.js client, adapted for use with Tinfoil. All methods and types are identical. See the OpenAI client documentation for complete API usage.
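Because the API surface is identical, features of the OpenAI client such as streaming work unchanged. A brief sketch, using one of the model names shown above:

```typescript
import { TinfoilAI } from 'tinfoil';

const client = new TinfoilAI({ apiKey: process.env.TINFOIL_API_KEY });

// Streaming behaves the same as in the OpenAI Node client.
const stream = await client.chat.completions.create({
  model: 'llama3-3-70b',
  messages: [{ role: 'user', content: 'Summarize the benefits of confidential computing.' }],
  stream: true,
});

// Consume the stream chunk by chunk as it arrives.
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
```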
For advanced security verification and custom integrations, the SDK exposes a SecureClient class.
The SecureClient provides low-level access to the secure transport layer and detailed verification information: