---
title: "GPT-5 chat"
route_path: "/model/gpt-5-chat"
canonical_url: "https://www.pipellm.ai/model/gpt-5-chat"
markdown_path: "/llms/models/gpt-5-chat.md"
markdown_url: "https://www.pipellm.ai/llms/models/gpt-5-chat.md"
content_type: "model-detail-page"
description: "Machine-readable detail page for GPT-5 chat."
generated_at: "2026-03-27T06:53:30.752Z"
---
Canonical page: https://www.pipellm.ai/model/gpt-5-chat
Markdown mirror: https://www.pipellm.ai/llms/models/gpt-5-chat.md
Content type: model-detail-page
Generated at: 2026-03-27T06:53:30.752Z
# GPT-5 chat
## Query Intents
- Understand pricing, provider availability, context window, and capabilities for GPT-5 chat.
- Compare GPT-5 chat against other models available through PipeLLM.
- Find the canonical model identifier to use in SDK or API requests.
## Overview
GPT-5 Chat is designed for advanced, natural, multimodal, and context-aware conversations for enterprise applications.
## Model Metadata
- Display name: GPT-5 chat
- Model ID: gpt-5-chat
- Provider family: OpenAI
- Release date: Unknown
- Context window: 128K
- Max output: 16K
- Input modalities: text
- Output modalities: text
- Tool use support: Yes
- Computer use support: No
- Cache control support: Yes
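The metadata above can be used directly when constructing API requests. The sketch below is a minimal, hypothetical example of building an OpenAI-compatible chat-completion request body; only the model ID `gpt-5-chat` and the 16K max-output limit come from this page, and the request shape is an assumption, not a documented PipeLLM API.

```python
# Sketch of a chat-completion request body for GPT-5 chat.
# Assumption: an OpenAI-compatible JSON request shape. Only the model ID
# and the 16K output cap are taken from the metadata on this page.

MODEL_ID = "gpt-5-chat"
MAX_OUTPUT_TOKENS = 16_384  # "Max output: 16K" from the metadata above

def build_request(messages, max_tokens=1024):
    """Return a request-body dict, rejecting max_tokens over the model limit."""
    if max_tokens > MAX_OUTPUT_TOKENS:
        raise ValueError(f"max_tokens exceeds model limit of {MAX_OUTPUT_TOKENS}")
    return {"model": MODEL_ID, "messages": messages, "max_tokens": max_tokens}

body = build_request([{"role": "user", "content": "Hello"}])
print(body["model"])  # gpt-5-chat
```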
## Official Pricing (per 1M tokens)
| Metric | <=200K Context | >200K Context |
| --- | --- | --- |
| Input Price | $1.25 | — |
| Output Price | $10 | — |
| Cache Read | $0.125 | — |
| Image Input | $0 | — |
| Image Output | $0 | — |
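The per-1M-token prices above can be turned into a simple per-request cost estimate. The sketch below uses only the listed rates ($1.25 input, $10 output, $0.125 cache read); the token counts in the usage example are illustrative.

```python
# Sketch: estimating USD cost for one request from the per-1M-token
# prices in the table above. Token counts below are illustrative only.

PRICES_PER_M = {"input": 1.25, "output": 10.0, "cache_read": 0.125}

def estimate_cost(input_tokens=0, output_tokens=0, cache_read_tokens=0):
    """Return the USD cost for one request, given token counts."""
    return (
        input_tokens * PRICES_PER_M["input"]
        + output_tokens * PRICES_PER_M["output"]
        + cache_read_tokens * PRICES_PER_M["cache_read"]
    ) / 1_000_000

# 10K fresh input tokens, 2K output tokens, 100K tokens read from cache:
cost = estimate_cost(input_tokens=10_000, output_tokens=2_000, cache_read_tokens=100_000)
print(f"${cost:.4f}")  # $0.0450
```

Note how cache reads at $0.125 per 1M tokens cost one tenth of fresh input, which is why the cache-control support listed above matters for long, repeated prompts.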

## Provider Availability
| Provider | Region | Context Window | Max Output | Input Price | Output Price | Cache Read | Cache Write |
| --- | --- | --- | --- | --- | --- | --- | --- |
| OpenAI | — | 128K | 16K | $1.25 | $10 | $0.125 | — |
