# Node.js SDK

Node.js SDK for Tinfoil's secure AI inference API.

GitHub: tinfoil-node
## Installation
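The package name `tinfoil` is assumed from the import statements used throughout this README; confirm against the repository before installing:

```shell
npm install tinfoil
```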
## Migration from OpenAI
Migrating from OpenAI to Tinfoil is straightforward. The client is designed to be compatible with the OpenAI Node.js client:
```diff
// Before (OpenAI)
- import OpenAI from 'openai';
-
- const client = new OpenAI({
-   apiKey: process.env.OPENAI_API_KEY,
- });

// After (Tinfoil)
+ import { TinfoilAI } from 'tinfoil';
+
+ const client = new TinfoilAI({
+   enclave: 'enclave.example.com',
+   repo: 'org/model-repo',
+   apiKey: process.env.TINFOIL_API_KEY
+ });
```
All method signatures remain the same, since `TinfoilAI` extends the standard OpenAI client with built-in security features.
## Quick Start

```typescript
import { TinfoilAI } from 'tinfoil';
import dotenv from 'dotenv';

// Load environment variables
dotenv.config();

try {
  const client = new TinfoilAI({
    enclave: 'enclave.example.com',
    repo: 'org/model-repo',
    apiKey: process.env.TINFOIL_API_KEY
  });

  // Uses identical method calls as the OpenAI client
  const completion = await client.chat.completions.create({
    messages: [{ role: 'user', content: 'Hello!' }],
    model: 'llama3-3-70b'
  });

  console.log(completion.choices[0]?.message?.content);
} catch (error) {
  console.error('Error:', error instanceof Error ? error.message : String(error));
}
```
## Running the Chat Example

To run the streaming chat example:

1. Clone the repository.
2. Install dependencies with `npm install`.
3. Create a `.env` file with your configuration:

   ```
   TINFOIL_API_KEY=your-api-key
   ```

4. Run the example:

   ```bash
   cd examples/chat
   npx ts-node main.ts
   ```

The example demonstrates streaming chat completions with the Tinfoil API wrapper.
## Usage

```typescript
// 1. Create a client
const client = new TinfoilAI({
  enclave: 'enclave.example.com', // Enclave hostname
  repo: 'org/model-repo',         // GitHub repository
  apiKey: process.env.TINFOIL_API_KEY,
});

// 2. Use the client as you would the OpenAI client
// See https://github.com/openai/openai-node for API documentation
```
## Audio

### Speech-to-Text (Transcription)
```typescript
import fs from "fs";
import { TinfoilAI } from 'tinfoil';

const client = new TinfoilAI({
  enclave: 'audio-processing.model.tinfoil.sh',
  repo: 'tinfoilsh/confidential-audio-processing',
  apiKey: process.env.TINFOIL_API_KEY
});

const audioFile = fs.createReadStream("audio.mp3");

const transcription = await client.audio.transcriptions.create({
  model: "whisper-large-v3-turbo",
  file: audioFile,
});

console.log(transcription.text);
```
### Text-to-Speech (Synthesis)
```typescript
import fs from "fs";
import { TinfoilAI } from 'tinfoil';

const client = new TinfoilAI({
  enclave: 'audio-processing.model.tinfoil.sh',
  repo: 'tinfoilsh/confidential-audio-processing',
  apiKey: process.env.TINFOIL_API_KEY
});

const response = await client.audio.speech.create({
  model: "kokoro",
  voice: "af_sky+af_bella",
  input: "Hello world!",
});

const buffer = Buffer.from(await response.arrayBuffer());
await fs.promises.writeFile("output.mp3", buffer);
console.log("Speech saved to output.mp3");
```
## Embeddings
```typescript
import { TinfoilAI } from 'tinfoil';

const client = new TinfoilAI({
  enclave: 'nomic-embed-text.model.tinfoil.sh',
  repo: 'tinfoilsh/confidential-nomic-embed-text',
  apiKey: process.env.TINFOIL_API_KEY
});

const response = await client.embeddings.create({
  model: "nomic-embed-text",
  input: "The quick brown fox jumps over the lazy dog",
});

const embeddingVector = response.data[0].embedding;
console.log(`Embedding dimension: ${embeddingVector.length}`);
```
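Embedding vectors are typically compared with cosine similarity (e.g. for semantic search over the vectors returned above). A minimal utility sketch — this helper is not part of the `tinfoil` package:

```typescript
// Cosine similarity between two embedding vectors.
// Returns 1 for identical directions, 0 for orthogonal vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error('Vector dimensions must match');
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```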
## Streaming
```typescript
import { TinfoilAI } from 'tinfoil';
import dotenv from 'dotenv';

dotenv.config();

try {
  const client = new TinfoilAI({
    enclave: 'enclave.example.com',
    repo: 'org/model-repo',
    apiKey: process.env.TINFOIL_API_KEY
  });

  const stream = await client.chat.completions.create({
    model: "llama3-3-70b",
    messages: [{ role: "user", content: "Write a short story" }],
    stream: true,
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || "");
  }
} catch (error) {
  console.error('Streaming error:', error instanceof Error ? error.message : String(error));
}
```
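If you also need the complete response text after streaming (for logging or further processing), you can accumulate the deltas while echoing them. A sketch, assuming only the chunk shape shown above (`choices[0].delta.content`):

```typescript
// Accumulate streamed deltas into a full string while echoing them to stdout.
// `stream` is any async iterable of chat-completion chunks, such as the one
// returned by client.chat.completions.create({ ..., stream: true }).
async function collectStream(
  stream: AsyncIterable<{ choices: { delta?: { content?: string } }[] }>
): Promise<string> {
  let full = '';
  for await (const chunk of stream) {
    const delta = chunk.choices[0]?.delta?.content ?? '';
    process.stdout.write(delta);
    full += delta;
  }
  return full;
}
```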
## Version Compatibility

The configuration above has been tested with:

- Node.js 18+
- TypeScript 5.8+
- ts-node 10.9+
## API Documentation

This library is a drop-in replacement for the official OpenAI Node.js client that can be used with Tinfoil. All methods and types are identical. See the [OpenAI client](https://github.com/openai/openai-node) for complete API usage and documentation.