Using LLM APIs has never been easier.

Ship AI apps with a single function that speaks Gemini and GPT today, with Claude, Groq, Ollama, and more coming next. No lock-in, just ship.

npm i use-every-llm

Save developer time and ship faster with a single function.

Indie hackers shipping fast
Perfect for indie hackers who need to ship quickly.

Avoiding lock-in
For developers who don't want to commit their clients to one LLM provider.

Product teams needing reliability
Reliability plus easy fallbacks between providers.

How it works

Get started in 3 simple steps:

1. Install

npm i use-every-llm

2. Configure

Configure your provider key(s).

3. Import & Call

useLLM({ model: "any model", prompt: "..." })
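
A minimal end-to-end sketch, assuming the package picks up provider keys from environment variables (the GEMINI_API_KEY name is an assumption; check the package docs for the exact variable names it reads):

// .env (assumed variable name, for illustration only)
// GEMINI_API_KEY=your-gemini-key

// generate.js
import useLLM from 'use-every-llm'

const result = await useLLM({
  model: "gemini-2.0-flash",   // any supported model id
  prompt: "Say hello in one short sentence",
});

console.log(result.text);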

Feature Matrix

Text generation: Gemini, GPT
Image understanding: Gemini, GPT
Audio understanding: Gemini, GPT
Code Examples

See it in action

From simple text generation to complex multimodal AI interactions

Text Generation With Text Prompt
Basic text generation.

text-generation.js

import useLLM from 'use-every-llm'

const result = await useLLM({
  model: "gemini-2.0-flash",
  prompt: "What model is it",
});

console.log(result.text);

Streaming
Real-time response streaming.

streaming-response.js

import useLLM from 'use-every-llm'

const result = await useLLM({
  model: "gemini-2.0-flash",
  prompt: "What model is it",
  streamingResponse: true,
});

for await (const chunk of result) {
  console.log(chunk.text);
}
Image + text
Vision capabilities.

text-with-image-generation.js

import useLLM from 'use-every-llm'

const result = await useLLM({
  model: "gemini-1.5-flash",
  prompt: "what image is it?",
  image: "image.png",
});

console.log(result.text);

Video + Text
Video understanding with a system prompt.

text-with-video-generation.js

import useLLM from 'use-every-llm'

const result = await useLLM({
  model: "gemini-2.0-flash",
  prompt: "what video is it?",
  systemPrompt: "You are a video describer",
  video: "My Movie.mp4",
});

console.log(result.text);
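
The feature matrix above also lists audio understanding. A minimal sketch, assuming an audio option that mirrors the image and video options (the parameter name and supported file formats are assumptions, so check the package docs before relying on it):

import useLLM from 'use-every-llm'

// Assumed: an `audio` option analogous to `image` and `video` above.
const result = await useLLM({
  model: "gemini-2.0-flash",
  prompt: "What is said in this recording?",
  audio: "recording.mp3",
});

console.log(result.text);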

Trust & Safety

Privacy

No analytics by default. No key collection. All requests go straight to the provider you choose.

License

MIT licensed for maximum flexibility and peace of mind.

Security

Keys are read from environment variables; server-only usage is recommended for maximum security.
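
One way to keep keys server-side is to call useLLM only from a backend route. A minimal sketch using Express (the framework choice and route name are illustrative, not part of the package):

// server.js: the provider key lives in a server-side environment variable;
// the browser talks only to this route, never to the LLM provider directly.
import express from 'express'
import useLLM from 'use-every-llm'

const app = express();
app.use(express.json());

// Illustrative route name.
app.post('/api/generate', async (req, res) => {
  const result = await useLLM({
    model: "gemini-2.0-flash",
    prompt: req.body.prompt,
  });
  res.json({ text: result.text });
});

app.listen(3000);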

Future Roadmap

Groq (LLaMA/Mixtral)
Claude (Anthropic)
Ollama (local)
Image/Video/Audio generation helpers
MCP Support + Tool Calling
Agent Development Kit