Node.js SDK

Node.js SDK for Tinfoil’s secure AI inference API. GitHub: tinfoil-node

Overview

The Tinfoil Node SDK is a wrapper around the OpenAI Node client that provides secure communication with Tinfoil enclaves. It has the same API as the OpenAI SDK with additional security features including automatic verification that the endpoint is running in a secure Tinfoil enclave, TLS certificate pinning, and attestation validation. All payloads are encrypted end-to-end using EHBP (Encrypted HTTP Body Protocol), which encrypts data directly to the attested enclave using HPKE (RFC 9180).

Installation

npm install tinfoil
Requirements: Node 20+

Quick Start

import { TinfoilAI } from "tinfoil";

const client = new TinfoilAI({
  apiKey: "<YOUR_API_KEY>", // or use TINFOIL_API_KEY env var
});

const completion = await client.chat.completions.create({
  messages: [{ role: "user", content: "Hello!" }],
  model: "llama3-3-70b",
});

Migration from OpenAI

Migrating from OpenAI to Tinfoil is straightforward. The client is designed to be compatible with the OpenAI Node client:
// Before (OpenAI)
- import OpenAI from 'openai';
- 
- const client = new OpenAI({
-   apiKey: process.env.OPENAI_API_KEY,
- });

// After (Tinfoil)
+ import { TinfoilAI } from 'tinfoil';
+ 
+ const client = new TinfoilAI({
+   apiKey: process.env.TINFOIL_API_KEY,
+ });
All method signatures remain the same since TinfoilAI extends the standard OpenAI client with built-in security features.

Browser Support

The SDK supports browser environments, allowing you to use the secure enclave-backed API directly from web applications.
Security Warning: Using API keys directly in the browser exposes them to anyone who can view your page source. For production applications, always use a backend server to handle API keys.
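As one illustration, here is a minimal sketch of the recommended pattern, assuming an Express backend (the /api/chat route and request shape are hypothetical, not part of this SDK):

import express from "express";
import { TinfoilAI } from "tinfoil";

const app = express();
app.use(express.json());

// The API key stays on the server; browsers only ever talk to this route.
const client = new TinfoilAI({ apiKey: process.env.TINFOIL_API_KEY });

// Hypothetical route: the browser POSTs { messages } and never sees the key.
app.post("/api/chat", async (req, res) => {
  try {
    const completion = await client.chat.completions.create({
      model: "llama3-3-70b",
      messages: req.body.messages,
    });
    res.json(completion.choices[0]?.message);
  } catch {
    res.status(500).json({ error: "inference failed" });
  }
});

app.listen(3000);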

Browser Usage

import { TinfoilAI } from 'tinfoil';

const client = new TinfoilAI({
  apiKey: 'your-api-key',
  dangerouslyAllowBrowser: true // Required for browser usage
});

// Optional: pre-initialize
await client.ready();

const completion = await client.chat.completions.create({
  model: 'llama3-3-70b',
  messages: [{ role: 'user', content: 'Hello!' }]
});
Browser Requirements:
  • Modern browsers with ES2020 support
  • WebAssembly support for enclave verification

Running the Chat Example

To run the streaming chat example:
  1. Clone the repository
  2. Install dependencies:
npm install
  3. Create a .env file with your configuration:
TINFOIL_API_KEY=your-api-key
  4. Run the example:
cd examples/chat
npx ts-node main.ts
The example demonstrates streaming chat completions with the Tinfoil API wrapper.
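For reference, a minimal streaming sketch; stream: true is the standard OpenAI Node streaming flag, which TinfoilAI inherits as a drop-in replacement:

import { TinfoilAI } from "tinfoil";

const client = new TinfoilAI({ apiKey: process.env.TINFOIL_API_KEY });

// stream: true returns an async iterable of incremental chunks.
const stream = await client.chat.completions.create({
  model: "llama3-3-70b",
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}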

Usage

// 1. Create a client
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY,
});

// 2. Use client as you would OpenAI client 
// see https://github.com/openai/openai-node for API documentation

Model Examples

Below are specific examples for each supported model, showing its configuration and a usage example.

Chat Models

import { TinfoilAI } from 'tinfoil';

// Configure client for DeepSeek V3.1 Terminus
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Multi-step agentic workflow
const response = await client.chat.completions.create({
  model: 'deepseek-v31-terminus',
  messages: [
    {
      role: 'user',
      content: 'Write a TypeScript function to validate email addresses using regex, then test it with several examples and debug any issues.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);

import { TinfoilAI } from 'tinfoil';

// Configure client for DeepSeek R1
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Complex reasoning task
const response = await client.chat.completions.create({
  model: 'deepseek-r1-0528',
  messages: [
    {
      role: 'user',
      content: 'Solve this step by step: If a train travels 120 miles in 2 hours, and then increases its speed by 25% for the next 3 hours, how far does it travel in total?'
    }
  ]
});

console.log(response.choices[0]?.message?.content);

import { TinfoilAI } from 'tinfoil';

// Configure client for Mistral Small 3.1 24B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Multilingual conversation
const response = await client.chat.completions.create({
  model: 'mistral-small-3-1-24b',
  messages: [
    {
      role: 'user',
      content: 'Explain the concept of machine learning in both English and French.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);

import { TinfoilAI } from 'tinfoil';

// Configure client for Llama 3.3 70B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Conversational AI
const response = await client.chat.completions.create({
  model: 'llama3-3-70b',
  messages: [
    {
      role: 'user',
      content: 'What are the key differences between renewable and non-renewable energy sources?'
    }
  ]
});

console.log(response.choices[0]?.message?.content);

import { TinfoilAI } from 'tinfoil';

// Configure client for GPT-OSS 120B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Advanced reasoning with configurable effort levels
const response = await client.chat.completions.create({
  model: 'gpt-oss-120b',
  messages: [
    {
      role: 'user',
      content: 'Analyze the trade-offs between different database architectures for a high-traffic e-commerce platform.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);

import { TinfoilAI } from 'tinfoil';

// Configure client for Qwen3 Coder 480B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Repository-scale code understanding and agentic coding
const response = await client.chat.completions.create({
  model: 'qwen3-coder-480b',
  messages: [
    {
      role: 'user',
      content: 'Review this codebase and suggest architectural improvements for better modularity and testability.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);

import { TinfoilAI } from 'tinfoil';

// Configure client for Qwen 2.5 72B
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Code generation and analysis
const response = await client.chat.completions.create({
  model: 'qwen2-5-72b',
  messages: [
    {
      role: 'user',
      content: 'Write a TypeScript function to calculate the Fibonacci sequence up to n terms, then explain how it works.'
    }
  ]
});

console.log(response.choices[0]?.message?.content);
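Because TinfoilAI inherits the OpenAI client's API surface, OpenAI-style tool calling should also work where the underlying model supports it. A hedged sketch (the getWeather tool is hypothetical, and per-model tool support is an assumption to verify, not something confirmed above):

import { TinfoilAI } from 'tinfoil';

const client = new TinfoilAI({ apiKey: process.env.TINFOIL_API_KEY });

// Hypothetical tool definition; check your model's docs before relying on it.
const response = await client.chat.completions.create({
  model: 'deepseek-v31-terminus',
  messages: [{ role: 'user', content: 'What is the weather in Paris right now?' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'getWeather',
        description: 'Look up current weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
  ],
});

// If the model chose to call the tool, the arguments appear here.
console.log(response.choices[0]?.message?.tool_calls);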

Audio Models

import fs from "fs";
import { TinfoilAI } from 'tinfoil';

// Configure client for Whisper Large V3 Turbo
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Audio transcription
const audioFile = fs.createReadStream("meeting_recording.mp3");
const transcription = await client.audio.transcriptions.create({
  model: "whisper-large-v3-turbo",
  file: audioFile,
  language: "en", // Optional: specify language for better accuracy
  prompt: "This is a business meeting discussing quarterly results" // Optional: provide context
});

console.log("Transcription:", transcription.text);
import fs from "fs";
import { TinfoilAI } from 'tinfoil';

// Configure client for Kokoro TTS
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

const textToSpeak = "Welcome to Tinfoil's secure AI platform. Your data remains private and protected.";

// Example: Text-to-speech with different voices
// Single voice
const response1 = await client.audio.speech.create({
  model: "kokoro",
  voice: "af_sky",
  input: textToSpeak,
  response_format: "mp3"
});

const buffer1 = Buffer.from(await response1.arrayBuffer());
await fs.promises.writeFile("speech_single.mp3", buffer1);

// Combined voices for richer sound
const response2 = await client.audio.speech.create({
  model: "kokoro",
  voice: "af_sky+af_bella",
  input: textToSpeak,
  response_format: "mp3"
});

const buffer2 = Buffer.from(await response2.arrayBuffer());
await fs.promises.writeFile("speech_combined.mp3", buffer2);

console.log("Speech files generated successfully!");

Embedding Models

import { TinfoilAI } from 'tinfoil';

// Configure client for Nomic Embed Text
const client = new TinfoilAI({
  apiKey: process.env.TINFOIL_API_KEY
});

// Example: Generate embeddings for similarity search
const documents = [
  "Artificial intelligence is transforming modern technology.",
  "Machine learning enables computers to learn from data.",
  "The weather today is sunny and warm.",
  "Deep learning uses neural networks with multiple layers."
];

// Generate embeddings for all documents
const embeddings: number[][] = [];
for (const doc of documents) {
  const response = await client.embeddings.create({
    model: "nomic-embed-text",
    input: doc,
    encoding_format: "float"
  });
  embeddings.push(response.data[0].embedding);
}

// Calculate similarity between first two documents
const dotProduct = (a: number[], b: number[]) => 
  a.reduce((sum, val, i) => sum + val * b[i], 0);
const magnitude = (vec: number[]) => 
  Math.sqrt(vec.reduce((sum, val) => sum + val * val, 0));

const similarity = dotProduct(embeddings[0], embeddings[1]) / 
  (magnitude(embeddings[0]) * magnitude(embeddings[1]));

console.log(`Similarity between first two AI-related documents: ${similarity.toFixed(3)}`);
console.log(`Embedding dimension: ${embeddings[0].length}`);
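Note that the loop above issues one request per document. The OpenAI embeddings endpoint also accepts an array as input, so a single batched call should work here as well (assuming Tinfoil's endpoint preserves that behavior):

// Batched alternative: one request for all documents (assumed supported).
const batch = await client.embeddings.create({
  model: "nomic-embed-text",
  input: documents,
  encoding_format: "float"
});
const batchEmbeddings = batch.data.map((d) => d.embedding);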

API Documentation

This library is a drop-in replacement for the official OpenAI Node.js client, pointed at Tinfoil's secure endpoints. All methods and types are identical. See the OpenAI Node client for complete API usage and documentation.

SecureClient

For advanced security verification and custom integrations, the SDK exposes a SecureClient class, which provides low-level access to the secure transport layer and detailed verification information:
import { SecureClient } from "tinfoil";

const client = new SecureClient();

try {
  await client.ready();
  const response = await client.fetch('https://api.example.com/data');
} catch (error) {
  // Even on error, access the verification document
  const doc = await client.getVerificationDocument();

  // Document contains detailed step information:
  // - fetchDigest: GitHub release digest retrieval
  // - verifyCode: Code measurement verification
  // - verifyEnclave: Runtime attestation verification
  // - compareMeasurements: Code vs runtime measurement comparison
  // - createTransport: Transport initialization (optional)
  // - verifyHPKEKey: HPKE key verification (optional)
  // - otherError: Catch-all for unexpected errors (optional)

  console.log('Security verified:', doc.securityVerified);

  // Check individual steps
  if (doc.steps.verifyEnclave.status === 'failed') {
    console.log('Enclave verification failed:', doc.steps.verifyEnclave.error);
  }
}
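The verification document is also available on the success path, so callers can gate on it before trusting a response. A minimal sketch using only the calls shown above (assuming getVerificationDocument also resolves after a successful ready()):

const secure = new SecureClient();
await secure.ready();

// Inspect the verification document before sending sensitive data.
const doc = await secure.getVerificationDocument();
if (!doc.securityVerified) {
  throw new Error("Enclave verification failed; refusing to send data");
}

const res = await secure.fetch("https://api.example.com/data");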