# Getting Started
To use the Inference Grid, you can either connect directly via Spark, which lets users pay via micro-payments from non-custodial wallets, or sign up for Platform, which offers a more traditional billing experience.
## Spark
If you're interested in leveraging micro-payments and non-custodial wallets, you should connect directly to the relay. Using our client SDK and any Lightning-compatible wallet, you can pay for LLM requests in real time:
```ts
import { SparkWallet } from "@buildonspark/spark-sdk";
import { InferenceGrid, Role } from "inference-grid-sdk";

// "0000..." is a special key that is rate-limited but publicly available. To avoid
// rate limits, you can register your own private key with the Inference Grid.
const client = new InferenceGrid({
  privateKey: "0000000000000000000000000000000000000000000000000000000000000000",
});

// This is an example Spark wallet. Replace the mnemonic with your own!
const { wallet } = await SparkWallet.initialize({
  mnemonicOrSeed: "record risk crouch submit abuse strategy great category alley lend upgrade fancy",
  options: {
    network: "MAINNET",
  },
});

// Submit a request.
const { message, invoice } = await client.chat({
  model: {
    modelIds: ["openai/gpt-4o"],
  },
  messages: [
    {
      role: Role.USER,
      content: "Hello, world!",
    },
  ],
});

// Don't forget to pay!
if (invoice) {
  await wallet.payLightningInvoice({
    invoice,
    maxFeeSats: 1000,
  });
}
```
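Lightning routing fees vary with payment size, so rather than hard-coding `maxFeeSats` you may want to derive it from the amount being paid. A minimal sketch; the 1% cap, the 10-sat floor, and the `maxFeeSats` function name are illustrative assumptions, not Inference Grid or Spark defaults:

```ts
// Illustrative fee policy (assumption for this example, not an SDK default):
// allow routing fees up to 1% of the payment, with a small absolute floor so
// tiny micro-payments can still find a route.
function maxFeeSats(amountSats: number): number {
  const onePercent = Math.ceil(amountSats * 0.01);
  return Math.max(10, onePercent);
}

maxFeeSats(50);     // → 10   (floor applies for small payments)
maxFeeSats(20_000); // → 200  (1% cap for larger payments)
```

The computed value can then be passed as `maxFeeSats` in the `payLightningInvoice` call above.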
## Platform
If you'd rather not connect directly to the relay, you can sign up for an account at Platform and fund it via Spark, Lightning, or Stripe. This lets you use any OpenAI-compatible SDK to connect to the Inference Grid:
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://relay.inferencegrid.ai/",
    api_key="...",
)

client.chat.completions.create(
    model="openai/gpt-4o|anthropic/claude-3.7-sonnet",
    messages=[
        {
            "role": "user",
            "content": "Tell me a joke!",
        },
    ],
)
```
```ts
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://relay.inferencegrid.ai/",
  apiKey: "...",
});

await client.chat.completions.create({
  model: "openai/gpt-4o|anthropic/claude-3.7-sonnet",
  messages: [
    {
      role: "user",
      content: "Tell me a joke!",
    },
  ],
});
```
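The `model` string above lists two model IDs separated by `|`, which appears to let the relay fall back between candidates. If you assemble this string programmatically, a trivial helper keeps it readable; `fallbackModels` is a hypothetical name for this example, not part of any SDK:

```ts
// Hypothetical helper (not part of any SDK): join candidate model IDs into the
// pipe-separated form accepted by the relay's `model` parameter.
function fallbackModels(modelIds: string[]): string {
  return modelIds.join("|");
}

fallbackModels(["openai/gpt-4o", "anthropic/claude-3.7-sonnet"]);
// → "openai/gpt-4o|anthropic/claude-3.7-sonnet"
```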