
Swift SDK

Swift SDK for Tinfoil’s secure AI inference API
GitHub: tinfoilsh/tinfoil-swift

Overview

The Tinfoil Swift SDK is a wrapper around the MacPaw OpenAI SDK that provides secure communication with Tinfoil enclaves. It has the same API as the OpenAI SDK with additional security features including automatic verification that the endpoint is running in a secure Tinfoil enclave, TLS certificate pinning, and attestation validation.

Installation

Swift Package Manager

Add to your Package.swift:
dependencies: [
    .package(url: "https://github.com/tinfoilsh/tinfoil-swift.git", branch: "main")
]

Xcode

  1. Go to File → Add Package Dependencies
  2. Enter the repository URL: https://github.com/tinfoilsh/tinfoil-swift.git
  3. Select the version you want to use
  4. Click “Add Package”
Note: Tinfoil Swift requires the MacPaw OpenAI SDK as a dependency. When you add Tinfoil Swift through Swift Package Manager, the OpenAI SDK will be automatically included.
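If you declare targets manually in Package.swift, also add the product to your target's dependencies. A sketch, assuming the product name TinfoilAI (matching the import used throughout this page) and a hypothetical target name MyApp:

```swift
.target(
    name: "MyApp", // your target name
    dependencies: [
        // Product name assumed from `import TinfoilAI`
        .product(name: "TinfoilAI", package: "tinfoil-swift")
    ]
)
```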

Requirements

  • iOS 17.0+ / macOS 12.0+
  • Swift 5.9+
  • Xcode 15.0+

Migration from OpenAI

Migrating from OpenAI to Tinfoil is straightforward. The client is designed to be compatible with the MacPaw OpenAI Swift client:
// Before (OpenAI)
- import OpenAI
- let client = OpenAI(
-    apiToken: ProcessInfo.processInfo.environment["OPENAI_API_KEY"] ?? ""
- )

// After (Tinfoil)
+ import TinfoilAI
+ import OpenAI
+ let client = try await TinfoilAI.create(
+     apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
+ )
All method signatures remain the same since TinfoilAI.create() returns a standard OpenAI client with built-in security features.
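Streaming carries over unchanged as well. A minimal sketch using MacPaw's chatsStream(query:) API; the model name llama3-3-70b is taken from the examples below:

```swift
import TinfoilAI
import OpenAI

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

let query = ChatQuery(
    messages: [.user(.init(content: .string("Tell me a short story.")))],
    model: "llama3-3-70b"
)

// Print tokens as they arrive, exactly as with the MacPaw client
for try await result in client.chatsStream(query: query) {
    if let delta = result.choices.first?.delta.content {
        print(delta, terminator: "")
    }
}
```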

Model Examples

Below are examples for each supported model, showing its configuration and a typical usage pattern.

Chat Models

import TinfoilAI
import OpenAI

// Configure client for DeepSeek V3.1 Terminus
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Multi-step agentic workflow
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Write a Swift function to validate email addresses using regex, then test it with several examples and debug any issues.")))
    ],
    model: "deepseek-v31-terminus"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")

import TinfoilAI
import OpenAI

// Configure client for DeepSeek R1
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Complex reasoning task
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Solve this step by step: If a train travels 120 miles in 2 hours, and then increases its speed by 25% for the next 3 hours, how far does it travel in total?")))
    ],
    model: "deepseek-r1-0528"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")

import TinfoilAI
import OpenAI

// Configure client for Mistral Small 3.1 24B
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Multilingual conversation
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Explain the concept of machine learning in both English and French.")))
    ],
    model: "mistral-small-3-1-24b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")

import TinfoilAI
import OpenAI

// Configure client for Llama 3.3 70B
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Conversational AI
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("What are the key differences between renewable and non-renewable energy sources?")))
    ],
    model: "llama3-3-70b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")

import TinfoilAI
import OpenAI

// Configure client for GPT-OSS 120B
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Advanced reasoning with configurable effort levels
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Analyze the trade-offs between different database architectures for a high-traffic e-commerce platform.")))
    ],
    model: "gpt-oss-120b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")

import TinfoilAI
import OpenAI

// Configure client for Qwen3 Coder 480B
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Repository-scale code understanding and agentic coding
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Review this codebase and suggest architectural improvements for better modularity and testability.")))
    ],
    model: "qwen3-coder-480b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")

import TinfoilAI
import OpenAI

// Configure client for Qwen 2.5 72B
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Code generation and analysis
let chatQuery = ChatQuery(
    messages: [
        .user(.init(content: .string("Write a Swift function to calculate the Fibonacci sequence up to n terms, then explain how it works.")))
    ],
    model: "qwen2-5-72b"
)

let response = try await client.chats(query: chatQuery)
print(response.choices.first?.message.content ?? "No response")

Audio Models

import TinfoilAI
import OpenAI
import Foundation

// Configure client for Whisper Large V3 Turbo
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Audio transcription
let audioURL = URL(fileURLWithPath: "meeting_recording.mp3")
let audioData = try Data(contentsOf: audioURL)

let transcriptionQuery = AudioTranscriptionQuery(
    file: audioData,
    fileType: .mp3,
    model: "whisper-large-v3-turbo",
    prompt: "This is a business meeting discussing quarterly results", // Optional: provide context
    language: "en" // Optional: specify language for better accuracy
)

let transcription = try await client.audioTranscriptions(query: transcriptionQuery)
print("Transcription:", transcription.text)

Embedding Models

import TinfoilAI
import OpenAI
import Foundation

// Configure client for Nomic Embed Text
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// Example: Generate embeddings for similarity search
let documents = [
    "Artificial intelligence is transforming modern technology.",
    "Machine learning enables computers to learn from data.",
    "The weather today is sunny and warm.",
    "Deep learning uses neural networks with multiple layers."
]

// Generate embeddings for all documents
var embeddings: [[Double]] = []
for document in documents {
    let embeddingQuery = EmbeddingsQuery(
        input: .string(document),
        model: "nomic-embed-text"
    )
    
    let response = try await client.embeddings(query: embeddingQuery)
    if let embedding = response.data.first?.embedding {
        embeddings.append(embedding)
    }
}

// Calculate similarity between first two documents
func dotProduct(_ a: [Double], _ b: [Double]) -> Double {
    return zip(a, b).map(*).reduce(0, +)
}

func magnitude(_ vector: [Double]) -> Double {
    return sqrt(vector.map { $0 * $0 }.reduce(0, +))
}

let similarity = dotProduct(embeddings[0], embeddings[1]) / 
    (magnitude(embeddings[0]) * magnitude(embeddings[1]))

print("Similarity between first two AI-related documents: \(String(format: "%.3f", similarity))")
print("Embedding dimension: \(embeddings[0].count)")

Error Handling

import TinfoilAI
import OpenAI

do {
    let client = try await TinfoilAI.create(
        apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
    )

    let chatQuery = ChatQuery(
        messages: [.user(.init(content: .string("Hello!")))],
        model: "llama3-3-70b"
    )

    let response = try await client.chats(query: chatQuery)
    print(response.choices.first?.message.content ?? "No response")
} catch {
    // Client creation throws if enclave verification fails;
    // requests can fail like any other network call.
    print("Error creating client or making request: \(error)")
}

Verification Document

For advanced security verification, you can provide a callback to receive detailed information about the attestation and verification process:
let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? "",
    onVerification: { document in
        guard let document = document else { return }

        // Document contains detailed step information:
        // - fetchDigest: GitHub release digest retrieval
        // - verifyCode: Code measurement verification
        // - verifyEnclave: Runtime attestation verification
        // - compareMeasurements: Code vs runtime measurement comparison

        print("Security verified:", document.securityVerified)

        // Check individual steps
        if document.steps.verifyEnclave.status == .failed {
            if let error = document.steps.verifyEnclave.error {
                print("Enclave verification failed:", error)
            }
        }
    }
)

API Documentation

This library is a drop-in replacement for the MacPaw OpenAI Swift client. All methods and types are identical; see the MacPaw OpenAI documentation for the complete API reference.
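Any other MacPaw API call works the same way through the verified client. For instance, listing available models; a sketch, assuming the enclave exposes the standard models endpoint:

```swift
import TinfoilAI
import OpenAI

let client = try await TinfoilAI.create(
    apiKey: ProcessInfo.processInfo.environment["TINFOIL_API_KEY"] ?? ""
)

// `models()` is MacPaw's standard endpoint for listing models
let models = try await client.models()
for model in models.data {
    print(model.id)
}
```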
