Go SDK
Go SDK for Tinfoil’s secure AI inference API
GitHub: tinfoil-go
Overview
The Tinfoil Go SDK is a wrapper around the OpenAI Go client that provides secure communication with Tinfoil enclaves. It exposes the same API as the OpenAI SDK while adding security features: automatic verification that the endpoint is running in a genuine Tinfoil enclave, TLS certificate pinning, and attestation validation.
Installation
New to Go? Start here - Project Setup
If you don’t have a Go module set up yet, initialize a new one:
# Create a new directory for your project
mkdir my-tinfoil-app
cd my-tinfoil-app
# Initialize a Go module
go mod init my-tinfoil-app
Add the Tinfoil SDK to your project:
go get github.com/tinfoilsh/tinfoil-go
Migration from OpenAI
Migrating from OpenAI to Tinfoil is straightforward. The client is designed to be compatible with the OpenAI Go client:
// Before (OpenAI)
- import (
-     "github.com/openai/openai-go/v3"
-     "github.com/openai/openai-go/v3/option"
- )
-
- client := openai.NewClient(
-     option.WithAPIKey("OPENAI_API_KEY"),
- )

// After (Tinfoil)
+ import (
+     "github.com/openai/openai-go/v3"
+     "github.com/openai/openai-go/v3/option"
+     "github.com/tinfoilsh/tinfoil-go"
+ )
+
+ client, err := tinfoil.NewClient(
+     option.WithAPIKey("TINFOIL_API_KEY"),
+ )
All method signatures remain the same since tinfoil.NewClient() returns a standard OpenAI client with built-in security features.
Usage
// 1. Create a client
client, err := tinfoil.NewClient(
    option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
)
if err != nil {
    log.Fatal(err)
}
// 2. Use client as you would openai.Client
// see https://pkg.go.dev/github.com/openai/openai-go for API documentation
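Because the returned client is a standard OpenAI client, streaming also works the same way as in the OpenAI Go SDK. A minimal sketch, assuming the openai-go streaming interface (`NewStreaming`, `stream.Next()`, `stream.Current()`); it requires a valid TINFOIL_API_KEY and network access:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/openai/openai-go/v3"
	"github.com/openai/openai-go/v3/option"
	"github.com/tinfoilsh/tinfoil-go"
)

func main() {
	client, err := tinfoil.NewClient(
		option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
	)
	if err != nil {
		log.Fatal(err)
	}

	// NewStreaming returns a server-sent-events stream of completion chunks.
	stream := client.Chat.Completions.NewStreaming(context.TODO(), openai.ChatCompletionNewParams{
		Messages: []openai.ChatCompletionMessageParamUnion{
			openai.UserMessage("Write a haiku about secure enclaves."),
		},
		Model: "llama3-3-70b",
	})

	// Print tokens as they arrive.
	for stream.Next() {
		chunk := stream.Current()
		if len(chunk.Choices) > 0 {
			fmt.Print(chunk.Choices[0].Delta.Content)
		}
	}
	if err := stream.Err(); err != nil {
		log.Fatal(err)
	}
}
```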
Model Examples
Below are specific examples for each supported model. Click on any model to see its configuration and usage example.
Chat Models
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/tinfoilsh/tinfoil-go"
)

func main() {
    client, err := tinfoil.NewClient(
        option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Example: Complex reasoning task
    chatCompletion, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
        Messages: []openai.ChatCompletionMessageParamUnion{
            openai.UserMessage("Solve this step by step: If a train travels 120 miles in 2 hours, and then increases its speed by 25% for the next 3 hours, how far does it travel in total?"),
        },
        Model: "glm-5-1",
    })
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(chatCompletion.Choices[0].Message.Content)
}
package main

import (
    "context"
    "encoding/base64"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/tinfoilsh/tinfoil-go"
)

func main() {
    client, err := tinfoil.NewClient(
        option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Example: Image analysis
    imageData, err := os.ReadFile("image.jpg")
    if err != nil {
        log.Fatal(err)
    }
    base64Image := base64.StdEncoding.EncodeToString(imageData)
    imageURL := fmt.Sprintf("data:image/jpeg;base64,%s", base64Image)

    chatCompletion, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
        Model: "qwen3-vl-30b",
        Messages: []openai.ChatCompletionMessageParamUnion{
            openai.UserMessage([]openai.ChatCompletionContentPartUnionParam{
                openai.TextContentPart("What's in this image?"),
                openai.ImageContentPart(openai.ChatCompletionContentPartImageImageURLParam{
                    URL: imageURL,
                }),
            }),
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(chatCompletion.Choices[0].Message.Content)
}
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/tinfoilsh/tinfoil-go"
)

func main() {
    client, err := tinfoil.NewClient(
        option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Example: Conversational AI
    chatCompletion, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
        Messages: []openai.ChatCompletionMessageParamUnion{
            openai.UserMessage("What are the key differences between renewable and non-renewable energy sources?"),
        },
        Model: "llama3-3-70b",
    })
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(chatCompletion.Choices[0].Message.Content)
}
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/tinfoilsh/tinfoil-go"
)

func main() {
    client, err := tinfoil.NewClient(
        option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Example: Advanced reasoning with configurable effort levels
    chatCompletion, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
        Messages: []openai.ChatCompletionMessageParamUnion{
            openai.UserMessage("Analyze the trade-offs between different database architectures for a high-traffic e-commerce platform."),
        },
        Model: "gpt-oss-120b",
    })
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(chatCompletion.Choices[0].Message.Content)
}
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/tinfoilsh/tinfoil-go"
)

func main() {
    client, err := tinfoil.NewClient(
        option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Example: Agentic coding with multi-step reasoning
    chatCompletion, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
        Messages: []openai.ChatCompletionMessageParamUnion{
            openai.UserMessage("Review this codebase and suggest architectural improvements for better modularity and testability."),
        },
        Model: "kimi-k2-5",
    })
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(chatCompletion.Choices[0].Message.Content)
}
Audio Models
Transcription
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/tinfoilsh/tinfoil-go"
)

func main() {
    client, err := tinfoil.NewClient(
        option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Example: Audio transcription
    audioFile, err := os.Open("meeting_recording.mp3")
    if err != nil {
        log.Fatal(err)
    }
    defer audioFile.Close()

    transcription, err := client.Audio.Transcriptions.New(context.TODO(), openai.AudioTranscriptionNewParams{
        Model:    "voxtral-small-24b",
        File:     audioFile,
        Language: openai.String("en"),                                                     // Optional: specify language for better accuracy
        Prompt:   openai.String("This is a business meeting discussing quarterly results"), // Optional: provide context
    })
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println("Transcription:", transcription.Text)
}
Audio Q&A
package main

import (
    "context"
    "encoding/base64"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/tinfoilsh/tinfoil-go"
)

func main() {
    client, err := tinfoil.NewClient(
        option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Load audio and encode as base64
    audioData, err := os.ReadFile("audio.mp3")
    if err != nil {
        log.Fatal(err)
    }
    audioBase64 := base64.StdEncoding.EncodeToString(audioData)

    response, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
        Model: "voxtral-small-24b",
        Messages: []openai.ChatCompletionMessageParamUnion{
            openai.UserMessage([]openai.ChatCompletionContentPartUnionParam{
                openai.TextContentPart("How many words are in this audio?"),
                openai.InputAudioContentPart(openai.ChatCompletionContentPartInputAudioInputAudioParam{
                    Data:   audioBase64,
                    Format: "mp3",
                }),
            }),
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(response.Choices[0].Message.Content)
    // Example output: "The audio contains 600 words."
}
Embedding Models
package main

import (
    "context"
    "fmt"
    "log"
    "math"
    "os"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/tinfoilsh/tinfoil-go"
)

func main() {
    client, err := tinfoil.NewClient(
        option.WithAPIKey(os.Getenv("TINFOIL_API_KEY")),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Example: Generate embeddings for similarity search
    documents := []string{
        "Artificial intelligence is transforming modern technology.",
        "Machine learning enables computers to learn from data.",
        "The weather today is sunny and warm.",
        "Deep learning uses neural networks with multiple layers.",
    }

    // Generate embeddings for all documents
    var embeddings [][]float64
    for _, doc := range documents {
        response, err := client.Embeddings.New(context.TODO(), openai.EmbeddingNewParams{
            Model:          "nomic-embed-text",
            Input:          openai.EmbeddingNewParamsInputUnion{OfArrayOfStrings: []string{doc}},
            EncodingFormat: openai.EmbeddingNewParamsEncodingFormatFloat,
        })
        if err != nil {
            log.Fatal(err)
        }
        embeddings = append(embeddings, response.Data[0].Embedding)
    }

    // Calculate similarity between first two documents
    similarity := cosineSimilarity(embeddings[0], embeddings[1])
    fmt.Printf("Similarity between first two AI-related documents: %.3f\n", similarity)
    fmt.Printf("Embedding dimension: %d\n", len(embeddings[0]))
}

func cosineSimilarity(a, b []float64) float64 {
    dotProduct := 0.0
    normA := 0.0
    normB := 0.0
    for i := 0; i < len(a); i++ {
        dotProduct += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dotProduct / (math.Sqrt(normA) * math.Sqrt(normB))
}
Advanced Functionality
For advanced use cases requiring manual verification or direct HTTP access, use the SecureClient from the verifier package:
import (
    "fmt"

    "github.com/tinfoilsh/tinfoil-go/verifier/client"
)

// Create a secure client for manual verification and HTTP access
secureClient := client.NewSecureClient(
    "inference.tinfoil.sh",
    "tinfoilsh/confidential-model-router",
)

// Verify the enclave attestation
groundTruth, err := secureClient.Verify()
if err != nil {
    return fmt.Errorf("verification failed: %w", err)
}
_ = groundTruth // holds the verified attestation measurements

// Get the verified HTTP client for custom requests
httpClient, err := secureClient.HTTPClient()
if err != nil {
    return fmt.Errorf("failed to get HTTP client: %w", err)
}
_ = httpClient

// Access enclave and repository information
fmt.Printf("Enclave: %s\n", secureClient.Enclave())
fmt.Printf("Repository: %s\n", secureClient.Repo())
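The verified HTTP client can then be used for requests that go straight to the attested endpoint. A sketch, assuming HTTPClient returns a standard *http.Client (as its use above suggests) and that the endpoint exposes an OpenAI-compatible /v1/models route; the route is an assumption for illustration, not confirmed by this page:

```go
package main

import (
	"fmt"
	"io"
	"log"

	"github.com/tinfoilsh/tinfoil-go/verifier/client"
)

func main() {
	secureClient := client.NewSecureClient(
		"inference.tinfoil.sh",
		"tinfoilsh/confidential-model-router",
	)

	// HTTPClient verifies the enclave and returns a client with the
	// enclave's certificate pinned.
	httpClient, err := secureClient.HTTPClient()
	if err != nil {
		log.Fatal(err)
	}

	// Hypothetical route on the verified endpoint.
	resp, err := httpClient.Get("https://" + secureClient.Enclave() + "/v1/models")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(body))
}
```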
API Documentation
This library is a drop-in replacement for the official OpenAI Go client; all methods and types are identical. See the OpenAI Go client documentation for complete API usage.
Guides
Tool Calling: Learn how to use function calling capabilities with AI models.
Structured Outputs: Use JSON schema validation for reliable data extraction.
Image Processing: Process images with multimodal AI models.
Document Processing: Upload and process documents securely.