---
title: "PipeLLM Home"
route_path: "/"
canonical_url: "https://www.pipellm.ai/"
markdown_path: "/llms/home.md"
markdown_url: "https://www.pipellm.ai/llms/home.md"
content_type: "landing-page"
description: "Landing page for PipeLLM, focused on using OpenAI, Anthropic, and Gemini SDKs with any model through one endpoint."
generated_at: "2026-03-27T06:53:30.752Z"
---
# PipeLLM Home

## Purpose

This landing page introduces PipeLLM as a compatibility gateway that lets developers use OpenAI, Anthropic, or Gemini SDKs with any supported model. It is written to help both humans and machines quickly understand what PipeLLM does, who it is for, and where to go next.

## Query Intents

- What is PipeLLM?
- Does PipeLLM support OpenAI SDK compatibility?
- Can PipeLLM convert between OpenAI, Anthropic, and Gemini SDK formats?
- Can PipeLLM route one SDK to many model providers?
- Where is the PipeLLM model library?
- Where do I start building with PipeLLM?

## Core Message

- Use OpenAI, Anthropic, and Gemini SDKs with any model.
- PipeLLM converts requests and responses automatically.
- You can switch providers and models without rewriting your app.
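The core message above can be sketched with the standard library. The endpoint host comes from this page, but the request path and the model ids are illustrative assumptions, not documented values; the request is only constructed, never sent.

```python
import json

# Assumed chat path on the documented host; only "api.pipellm.ai" is
# taken from this page.
PIPELLM_CHAT_URL = "https://api.pipellm.ai/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build one OpenAI-style chat request body.

    Switching providers changes only the `model` string; the rest of
    the request shape stays the same.
    """
    return {
        "url": PIPELLM_CHAT_URL,
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# Swapping models is a one-string change, not a rewrite (ids assumed):
for model in ("gpt-4o", "claude-sonnet", "gemini-pro"):
    print(build_chat_request(model, "Hello")["url"])
```

Each iteration produces an identically shaped request; only the `model` field differs, which is the "switch providers without rewriting your app" claim in miniature.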

## What the Page Communicates

### SDK compatibility

The page positions PipeLLM as a compatibility layer for familiar developer workflows:

- OpenAI SDK compatibility
- Anthropic SDK compatibility
- Gemini SDK compatibility
- LangChain and LangGraph style integration patterns

### Model access

The page emphasizes that PipeLLM can route requests across many model families and providers from one entry point.

### Developer workflow

The intended workflow on the page is:

1. Keep your existing SDK patterns.
2. Point requests at the PipeLLM endpoint.
3. Browse models in the model library.
4. Move into the console to start building.
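Steps 1 and 2 above can be sketched with only the standard library: the request keeps its existing OpenAI-style shape and is simply aimed at the PipeLLM endpoint. The path, auth header scheme, and model id are assumptions rather than documented details, and the request is built but deliberately not sent.

```python
import json
import urllib.request

# Keep the familiar OpenAI-style body; only the host changes.
request = urllib.request.Request(
    "https://api.pipellm.ai/v1/chat/completions",  # assumed path
    data=json.dumps({
        "model": "gemini-pro",  # assumed id; see the model library
        "messages": [{"role": "user", "content": "Say hi"}],
    }).encode("utf-8"),
    headers={
        "Authorization": "Bearer $PIPELLM_API_KEY",  # placeholder key
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(request) would send it; omitted here.
print(request.full_url)
```

In a real app the same redirection is usually a one-line change: most official SDKs accept a base-URL override at client construction time.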

## Key URLs Mentioned or Linked From This Page

- API endpoint: `https://api.pipellm.ai`
- Model library: `https://www.pipellm.ai/models`
- Docs: `https://docs.pipellm.ai`
- Console: `https://console.pipellm.ai`
- Blog: `https://www.pipellm.ai/blog`

## Main Sections

### Hero

The hero section introduces PipeLLM with the message that developers can keep OpenAI, Anthropic, or Gemini SDK workflows while switching models through one endpoint.

### SDK switching section

This section demonstrates that developers can preserve existing code structure while PipeLLM converts requests and responses between SDK formats.
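The page does not document PipeLLM's conversion rules, but the general shape of such a translation can be sketched. One well-known difference between the formats: OpenAI-style chat requests carry the system prompt inside the message list, while the Anthropic Messages format takes it as a top-level `system` field and requires `max_tokens`. This is a simplified sketch only; a real converter would also handle tools, images, and streaming.

```python
def openai_to_anthropic(openai_body: dict) -> dict:
    """Sketch of one direction of the request conversion.

    Extracts system messages into Anthropic's top-level `system`
    field and supplies the required `max_tokens` (default assumed).
    """
    messages = openai_body["messages"]
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    return {
        "model": openai_body["model"],
        "max_tokens": openai_body.get("max_tokens", 1024),
        "system": "\n".join(system_parts),
        "messages": [m for m in messages if m["role"] != "system"],
    }

converted = openai_to_anthropic({
    "model": "claude-sonnet",  # assumed model id
    "messages": [
        {"role": "system", "content": "Be brief."},
        {"role": "user", "content": "Hello"},
    ],
})
print(converted["system"])  # → Be brief.
```

Because the translation is mechanical, the application code on either side never sees the other format, which is what lets existing code structure stay untouched.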

### Blog overview

The page surfaces recent blog content so users can continue into product updates, technical content, or company writing.

### Final call to action

The page closes with a clear call to action: start building and unify your LLM API stack behind one endpoint.

## Recommended Follow-Up Pages

- Use `/models` to explore the model catalog.
- Use `/blog` to read published articles.
- Use `https://docs.pipellm.ai` for product documentation.
- Use `https://console.pipellm.ai` to start building.
