Swift SDK
Swift SDK for Tinfoil’s secure AI inference API
GitHub: tinfoil-swift
Overview
The Tinfoil Swift SDK is a wrapper around the MacPaw OpenAI SDK that provides secure communication with Tinfoil enclaves. It has the same API as the OpenAI SDK with additional security features including automatic verification that the endpoint is running in a secure Tinfoil enclave, TLS certificate pinning, and attestation validation.
Installation
Swift Package Manager
Add to your Package.swift:
dependencies: [
    .package(url: "https://github.com/tinfoilsh/tinfoil-swift.git", branch: "main")
]
Xcode
1. Go to File → Add Package Dependencies
2. Enter the repository URL: https://github.com/tinfoilsh/tinfoil-swift.git
3. Select the version you want to use
4. Click “Add Package”
Note: Tinfoil Swift requires the MacPaw OpenAI SDK as a dependency. When you add Tinfoil Swift through Swift Package Manager, the OpenAI SDK will be automatically included.
Requirements
iOS 17.0+ / macOS 14.0+
Swift 5.9+
Xcode 15.0+
Migration from OpenAI
Migrating from OpenAI to Tinfoil is straightforward. The client is designed to be compatible with the MacPaw OpenAI Swift client:
// Before (OpenAI)
- import OpenAI
- let client = OpenAI(
-     apiToken: ProcessInfo.processInfo.environment["OPENAI_API_KEY"] ?? ""
- )

// After (Tinfoil)
+ import TinfoilAI
+ import OpenAI
+ let client = try await TinfoilAI.create(
+     apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
+ )
All method signatures remain the same since TinfoilAI provides the same API as the OpenAI client with built-in security features.
Model Examples
Below are specific examples for each supported model, showing its configuration and usage.
Chat Models
import TinfoilAI
import OpenAI

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Complex reasoning task
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Solve this step by step: If a train travels 120 miles in 2 hours, and then increases its speed by 25% for the next 3 hours, how far does it travel in total?")))
    ],
    model: "glm-5-1"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")
import TinfoilAI
import OpenAI
import Foundation

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Image analysis
let imageData = try Data(contentsOf: URL(fileURLWithPath: "image.jpg"))
let base64Image = imageData.base64EncodedString()

let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .vision([
            .chatCompletionContentPartTextParam(.init(text: "What's in this image?")),
            .chatCompletionContentPartImageParam(.init(imageUrl: .init(url: "data:image/jpeg;base64,\(base64Image)")))
        ])))
    ],
    model: "qwen3-vl-30b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")
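The data-URL string built inline above can be factored into a small helper. Note that `dataURL(for:mimeType:)` is a hypothetical convenience function for this guide, not part of the SDK:

```swift
import Foundation

// Hypothetical helper (not part of the SDK): builds an inline data URL
// from raw bytes and a MIME type, suitable for image content parts.
func dataURL(for data: Data, mimeType: String) -> String {
    "data:\(mimeType);base64,\(data.base64EncodedString())"
}

let url = dataURL(for: Data([0x68, 0x69]), mimeType: "image/jpeg")
print(url)  // data:image/jpeg;base64,aGk=
```

Keeping the MIME type as a parameter avoids hardcoding `image/jpeg` when the source image might be a PNG.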
import TinfoilAI
import OpenAI

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Conversational AI
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("What are the key differences between renewable and non-renewable energy sources?")))
    ],
    model: "llama3-3-70b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")
import TinfoilAI
import OpenAI

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Advanced reasoning with configurable effort levels
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Analyze the trade-offs between different database architectures for a high-traffic e-commerce platform.")))
    ],
    model: "gpt-oss-120b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")
import TinfoilAI
import OpenAI

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Native multimodal agentic coding
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Review this codebase and suggest architectural improvements for better modularity and testability.")))
    ],
    model: "kimi-k2-5"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")
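The examples above return a complete response in one call. The MacPaw OpenAI client also exposes a streaming variant, `chatsStream(query:)`; since TinfoilAI provides the same API, it should be available on the client returned by `TinfoilAI.create` as well, though treat that as an assumption from the MacPaw SDK rather than something stated in this guide. A sketch:

```swift
import TinfoilAI
import OpenAI

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

let query = ChatQuery(
    messages: [.user(.init(content: .string("Write a haiku about the sea.")))],
    model: "llama3-3-70b"
)

var transcript = ""
for try await result in client.chatsStream(query: query) {
    // Each chunk carries an incremental delta of the assistant's reply.
    if let delta = result.choices.first?.delta.content {
        transcript += delta
        print(delta, terminator: "")
    }
}
```

Streaming is useful in UI code where partial output should render as tokens arrive rather than after the full completion.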
Audio Models
Transcription

import TinfoilAI
import OpenAI
import Foundation

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Audio transcription
let audioURL = URL(fileURLWithPath: "meeting_recording.mp3")
let audioData = try Data(contentsOf: audioURL)

let transcriptionQuery = AudioTranscriptionQuery(
    file: audioData,
    fileType: .mp3,
    model: "voxtral-small-24b",
    prompt: "This is a business meeting discussing quarterly results", // Optional: provide context
    language: "en" // Optional: specify language for better accuracy
)

let transcription = try await client.audioTranscriptions(query: transcriptionQuery)
print("Transcription:", transcription.text)
Audio Q&A

import TinfoilAI
import OpenAI
import Foundation

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Load audio and encode as base64
let audioURL = URL(fileURLWithPath: "audio.mp3")
let audioData = try Data(contentsOf: audioURL)
let audioBase64 = audioData.base64EncodedString()

let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .contentParts([
            .text(.init(text: "How many words are in this audio?")),
            .audio(.init(inputAudio: .init(data: audioBase64, format: .mp3)))
        ])))
    ],
    model: "voxtral-small-24b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "")
// Output: "The audio contains 600 words."
Embedding Models
import TinfoilAI
import OpenAI
import Foundation

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Generate embeddings for similarity search
let documents = [
    "Artificial intelligence is transforming modern technology.",
    "Machine learning enables computers to learn from data.",
    "The weather today is sunny and warm.",
    "Deep learning uses neural networks with multiple layers."
]

// Generate embeddings for all documents
var embeddings: [[Double]] = []
for document in documents {
    let embeddingQuery = EmbeddingsQuery(
        input: .string(document),
        model: "nomic-embed-text"
    )
    let response = try await client.embeddings(query: embeddingQuery)
    if let embedding = response.data.first?.embedding {
        embeddings.append(embedding)
    }
}

// Calculate similarity between first two documents
func dotProduct(_ a: [Double], _ b: [Double]) -> Double {
    return zip(a, b).map(*).reduce(0, +)
}

func magnitude(_ vector: [Double]) -> Double {
    return sqrt(vector.map { $0 * $0 }.reduce(0, +))
}

let similarity = dotProduct(embeddings[0], embeddings[1]) /
    (magnitude(embeddings[0]) * magnitude(embeddings[1]))

print("Similarity between first two AI-related documents: \(String(format: "%.3f", similarity))")
print("Embedding dimension: \(embeddings[0].count)")
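The dot-product and magnitude helpers above combine into a single cosine-similarity function, which is easier to reuse when ranking many documents:

```swift
import Foundation

// Cosine similarity: dot(a, b) / (|a| * |b|), in [-1, 1].
// Assumes both vectors are non-zero and have equal length.
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).map(*).reduce(0, +)
    let magA = sqrt(a.map { $0 * $0 }.reduce(0, +))
    let magB = sqrt(b.map { $0 * $0 }.reduce(0, +))
    return dot / (magA * magB)
}

print(cosineSimilarity([1, 0], [1, 0]))  // 1.0 (identical direction)
print(cosineSimilarity([1, 0], [0, 1]))  // 0.0 (orthogonal)
```

Because embedding vectors from the same model share a dimension, a single pass of `cosineSimilarity` against a query embedding is enough to rank the documents.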
Error Handling
import TinfoilAI
import OpenAI

do {
    let client = try await TinfoilAI.create(
        apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
    )
    let chatQuery = ChatQuery(
        messages: [.user(.init(content: .string("Hello")))],
        model: "llama3-3-70b"
    )
    let response = try await client.chats(query: chatQuery)
    print(response.choices.first?.message.content ?? "No response")
} catch {
    print("Error creating client or making request: \(error)")
}
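The single `catch` above treats all failures alike. One way to give users better feedback is to separate transport errors (Foundation's `URLError`) from everything else, including any verification errors thrown by `TinfoilAI.create`. The `userMessage(for:)` helper below is a hypothetical pattern for this guide, not an SDK API:

```swift
import Foundation

// Hypothetical helper: maps common failures to user-facing messages.
// URLError covers transport problems (offline, timeout); anything else
// falls through to a generic description.
func userMessage(for error: Error) -> String {
    if let urlError = error as? URLError {
        switch urlError.code {
        case .notConnectedToInternet: return "No internet connection."
        case .timedOut: return "The request timed out."
        default: return "Network error: \(urlError.code.rawValue)"
        }
    }
    return "Request failed: \(error.localizedDescription)"
}

print(userMessage(for: URLError(.timedOut)))  // The request timed out.
```

In the `catch` block you would then call `print(userMessage(for: error))` instead of printing the raw error.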
Verification Document
For advanced security verification, you can provide a callback to receive detailed information about the attestation and verification process:
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? "",
    onVerification: { document in
        guard let document = document else { return }

        // Document contains detailed step information:
        // - fetchDigest: GitHub release digest retrieval
        // - verifyCode: Code measurement verification
        // - verifyEnclave: Runtime attestation verification
        // - compareMeasurements: Code vs runtime measurement comparison
        print("Security verified:", document.securityVerified)

        // Check individual steps
        if document.steps.verifyEnclave.status == .failed {
            if let error = document.steps.verifyEnclave.error {
                print("Enclave verification failed:", error)
            }
        }
    }
)
API Documentation
This library is a drop-in replacement for the MacPaw OpenAI Swift client. The TinfoilAI client provides the same API as the OpenAI client; see the MacPaw OpenAI documentation for complete API usage.
Guides
Tool Calling: Learn how to use function calling capabilities with AI models.
Structured Outputs: Use JSON schema validation for reliable data extraction.
Image Processing: Process images with multimodal AI models.
Document Processing: Upload and process documents securely.