## Documentation Index

Fetch the complete documentation index at: https://docs.tinfoil.sh/llms.txt. Use this file to discover all available pages before exploring further.
# LangChain integration

Tinfoil’s SDKs expose verified, attested HTTP clients that can be injected into LangChain. This lets you build chains, agents, and RAG pipelines while using Tinfoil as the backend. The integration works by passing Tinfoil’s secure transport layer into LangChain’s OpenAI provider. All LangChain features work as before.

Supported languages: Python, JavaScript/TypeScript, and Go.
## Installation
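A sketch of the per-language installs, pairing each Tinfoil SDK with the matching LangChain OpenAI package (the Tinfoil package names and module path are assumptions; confirm them on the SDK pages linked below):

```shell
# Python (Tinfoil package name assumed)
pip install tinfoil langchain-openai

# JavaScript/TypeScript (Tinfoil package name assumed)
npm install tinfoil @langchain/openai

# Go (Tinfoil module path assumed)
go get github.com/tinfoilsh/tinfoil-go
```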
## Quick start
Create a Tinfoil-verified LangChain client in a few lines.

## Streaming
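A minimal Python sketch covering both client creation and streaming. How the verified `httpx.Client` is obtained from the Tinfoil SDK is elided (see the Python SDK page); the model name, endpoint, and key are placeholders. The LangChain side (`ChatOpenAI`, `invoke`, `stream`) is standard `langchain-openai` API.

```python
# Sketch under assumptions: `http_client` must come from the Tinfoil SDK;
# placeholders below must be filled in with real values.
from langchain_openai import ChatOpenAI

# http_client = ...  # verified httpx.Client with pinned SSL context (Tinfoil SDK)

llm = ChatOpenAI(
    model="<model-name>",
    api_key="<TINFOIL_API_KEY>",
    base_url="<tinfoil-inference-endpoint>",
    http_client=http_client,  # inject Tinfoil's verified transport
)

# One-shot call over the attested connection
print(llm.invoke("Hello!").content)

# Streaming: chunks arrive incrementally over the same verified transport
for chunk in llm.stream("Explain enclave attestation in one sentence."):
    print(chunk.content, end="", flush=True)
```

Because the verified transport is injected at construction time, every later call (chains, agents, streaming) rides the same attested connection with no further changes.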
## How it works
Each Tinfoil SDK verifies the remote enclave’s hardware attestation and pins TLS certificates before any data is sent. The integration injects this verified transport into LangChain’s OpenAI provider:

| Language | Tinfoil transport | Injected via |
|---|---|---|
| Python | `httpx.Client` with pinned SSL context | `ChatOpenAI(http_client=...)` |
| JavaScript | Fetch function with EHBP encryption | `ChatOpenAI({ configuration: { fetch } })` |
| Go | `*http.Client` with pinned TLS | `openai.WithHTTPClient(...)` |
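The "pinned SSL context" in the Python row can be pictured with the standard library alone: a TLS context whose trust store contains only the certificate established during attestation. The helper below is illustrative, not Tinfoil's actual code.

```python
# Illustrative only: Tinfoil's SDK builds something equivalent internally
# after verifying the enclave attestation; this is not its actual code.
import ssl

def pinned_context(pinned_ca_pem: str) -> ssl.SSLContext:
    """TLS context that trusts ONLY the given PEM certificate."""
    # PROTOCOL_TLS_CLIENT enables hostname checking and CERT_REQUIRED by default
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # Replace the trust store with just the pinned certificate
    ctx.load_verify_locations(cadata=pinned_ca_pem)
    return ctx
```

An `httpx.Client` built on such a context (via its `verify=` parameter) is the kind of object that gets injected into LangChain through `ChatOpenAI(http_client=...)`.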
- Python SDK: full Python SDK reference and examples.
- JavaScript SDK: full JavaScript SDK reference and examples.
- Go SDK: full Go SDK reference and examples.
- Tool Calling: use function calling with LangChain agents and Tinfoil.
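As the Tool Calling page covers in depth, function calling works unchanged with a Tinfoil-backed model. A minimal sketch using LangChain's standard `bind_tools` API; `llm` is the Tinfoil-backed `ChatOpenAI` from the quick start (construction elided), and the tool itself is a toy.

```python
# Standard LangChain tool calling; `llm` construction elided (see quick start).
from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city}"  # toy implementation

llm_with_tools = llm.bind_tools([get_weather])
msg = llm_with_tools.invoke("What's the weather in Paris?")

# The model's requested invocations arrive as structured tool_calls
for call in msg.tool_calls:
    print(call["name"], call["args"])
```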

