
LLM Models


Access Claude, Gemini, Kimi, GLM and 100+ LLMs via inference.sh CLI using OpenRouter. Models: Claude Opus 4.5, Claude Sonnet 4.5, Claude Haiku 4.5, Gemini 3...

536 downloads
$ Add to .claude/skills/

About This Skill

# LLM Models via OpenRouter

Access 100+ language models via inference.sh CLI.


## Quick Start

```bash
curl -fsSL https://cli.inference.sh | sh && infsh login

# Call Claude Sonnet
infsh app run openrouter/claude-sonnet-45 --input '{"prompt": "Explain quantum computing"}'
```

> Install note: The install script only detects your OS/architecture, downloads the matching binary from `dist.inference.sh`, and verifies its SHA-256 checksum. No elevated permissions or background processes. Manual install & verification available.

## Available Models

| Model | App ID | Best For |
|-------|--------|----------|
| Claude Opus 4.5 | `openrouter/claude-opus-45` | Complex reasoning, coding |
| Claude Sonnet 4.5 | `openrouter/claude-sonnet-45` | Balanced performance |
| Claude Haiku 4.5 | `openrouter/claude-haiku-45` | Fast, economical |
| Gemini 3 Pro | `openrouter/gemini-3-pro-preview` | Google's latest |
| Kimi K2 Thinking | `openrouter/kimi-k2-thinking` | Multi-step reasoning |
| GLM-4.6 | `openrouter/glm-46` | Open-source, coding |
| Intellect 3 | `openrouter/intellect-3` | General purpose |
| Any Model | `openrouter/any-model` | Auto-selects best option |
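The app IDs above lend themselves to side-by-side comparison: the sketch below runs one prompt against several Claude tiers and saves each reply for review. The `infsh` invocation is left commented because it requires a logged-in CLI, and the assumption that replies arrive on stdout is ours, not documented behavior.

```shell
# Hypothetical comparison loop over app IDs from the table above.
# Uncomment the infsh line after running `infsh login`.
prompt='{"prompt": "Explain the CAP theorem in two sentences"}'
for model in claude-opus-45 claude-sonnet-45 claude-haiku-45; do
  echo "=== openrouter/$model ==="
  # Assumption: the reply prints to stdout, so it can be redirected.
  # infsh app run "openrouter/$model" --input "$prompt" > "out-$model.txt"
done
```

Diffing the resulting `out-*.txt` files is a quick way to judge whether the cheaper tiers are good enough for a given task.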

## Search LLM Apps

```bash
infsh app list --search "openrouter"
infsh app list --search "claude"
```

## Examples

### Claude Opus (Best Quality)

```bash
infsh app run openrouter/claude-opus-45 --input '{"prompt": "Write a Python function to detect palindromes with comprehensive tests"}'
```

### Claude Sonnet (Balanced)

```bash
infsh app run openrouter/claude-sonnet-45 --input '{"prompt": "Summarize the key concepts of machine learning"}'
```

### Claude Haiku (Fast & Cheap)

```bash
infsh app run openrouter/claude-haiku-45 --input '{"prompt": "Translate this to French: Hello, how are you?"}'
```

### Kimi K2 (Thinking Agent)

```bash
infsh app run openrouter/kimi-k2-thinking --input '{"prompt": "Plan a step-by-step approach to build a web scraper"}'
```

### Any Model (Auto-Select)

```bash
# Automatically picks the most cost-effective model
infsh app run openrouter/any-model --input '{"prompt": "What is the capital of France?"}'
```

### With System Prompt

```bash
infsh app sample openrouter/claude-sonnet-45 --save input.json

# Edit input.json:
# {
#   "system": "You are a helpful coding assistant",
#   "prompt": "How do I read a file in Python?"
# }

infsh app run openrouter/claude-sonnet-45 --input input.json
```
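For scripted use, `input.json` can be generated rather than hand-edited. This sketch writes the same `system`/`prompt` fields shown in the commented example with a heredoc; the field names are taken from that example, and the final `infsh` call is left commented since it needs a logged-in CLI.

```shell
# Generate input.json with the fields from the commented example above.
cat > input.json <<'EOF'
{
  "system": "You are a helpful coding assistant",
  "prompt": "How do I read a file in Python?"
}
EOF
# infsh app run openrouter/claude-sonnet-45 --input input.json
cat input.json
```

The quoted heredoc delimiter (`'EOF'`) keeps the JSON literal, so no shell escaping is needed inside the body.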

## Use Cases

  • Coding: Generate, review, debug code
  • Writing: Content, summaries, translations
  • Analysis: Data interpretation, research
  • Agents: Build AI-powered workflows
  • Chat: Conversational interfaces
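The "Agents" use case above amounts to chaining calls: one model's reply becomes the next model's prompt. A minimal sketch, with the `infsh` calls commented out because they need a logged-in CLI (a placeholder string stands in for the first reply so the JSON plumbing itself is runnable):

```shell
# Agent-style chain: step 1's output feeds step 2's prompt.
step1='{"prompt": "List three risks of web scraping"}'
# risks=$(infsh app run openrouter/kimi-k2-thinking --input "$step1")
risks="placeholder output"   # stand-in for the step-1 reply
# Embed the first reply into the second prompt.
step2=$(printf '{"prompt": "Suggest a mitigation for each: %s"}' "$risks")
echo "$step2"
# infsh app run openrouter/claude-haiku-45 --input "$step2"
```

Note that a real reply would need JSON escaping before being embedded in `step2`; this sketch assumes plain text.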

## Related Skills

```bash
# Full platform skill (all 150+ apps)
npx skills add inference-sh/skills@inference-sh

# Web search (combine with LLMs for RAG)
npx skills add inference-sh/skills@web-search

# Image generation
npx skills add inference-sh/skills@ai-image-generation

# Video generation
npx skills add inference-sh/skills@ai-video-generation
```

Browse all apps: `infsh app list`

Documentation

Use Cases

  • Access Claude, Gemini, Kimi, GLM, and 100+ LLMs via inference.sh CLI
  • Route requests to different LLM providers through OpenRouter integration
  • Compare outputs across multiple AI models for quality evaluation
  • Build multi-model workflows that select the best model per task
  • Access latest models including Claude Opus 4.5, Gemini 3, and DeepSeek R1

Pros & Cons

Pros

  • Compatible with multiple platforms, including Claude Code and OpenClaw
  • Well-documented, with detailed usage instructions and examples
  • Strong community adoption, reflected in its download count
  • Automation-first design reduces manual intervention

Cons

  • No built-in analytics or usage-metrics dashboard
  • Configuration may require familiarity with AI and machine-learning concepts

FAQ

What does LLM Models do?
It provides access to Claude, Gemini, Kimi, GLM, and 100+ other LLMs via the inference.sh CLI using OpenRouter. Models include Claude Opus 4.5, Claude Sonnet 4.5, Claude Haiku 4.5, Gemini 3...
What platforms support LLM Models?
LLM Models is available on Claude Code and OpenClaw.
What are the use cases for LLM Models?
Access Claude, Gemini, Kimi, GLM, and 100+ LLMs via the inference.sh CLI; route requests to different LLM providers through the OpenRouter integration; and compare outputs across multiple AI models for quality evaluation.
