# [Preview] v1.80.10.rc.1

## Deploy this version

**Docker**

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:v1.80.10.rc.1
```

**Pip**

```shell
pip install litellm==1.80.10
```
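Once the container is running, the proxy listens on port 4000 (per the `-p 4000:4000` mapping above). A quick sanity check, assuming the proxy is reachable on localhost, is to hit its health probes:

```shell
# Liveness probe: responds once the server process is up
curl http://localhost:4000/health/liveliness

# Readiness probe: responds once the proxy is ready to serve traffic
curl http://localhost:4000/health/readiness
```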
## Key Highlights
- Agent (A2A) Usage UI - Track and visualize agent (A2A) spend directly in the dashboard
### Agent (A2A) Usage UI
Users can now filter usage statistics by agent, with the same granular filtering already available for teams, organizations, and customers. A sketch of pulling per-agent spend via the proxy API follows the details below.
Details:
- Filter usage analytics, spend logs, and activity metrics by agent ID
- View breakdowns on a per-agent basis
- Consistent filtering experience across all usage and analytics views
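For pulling this data programmatically rather than through the dashboard, a minimal sketch against the proxy's spend-logs route is below. The `/spend/logs` endpoint and `LITELLM_MASTER_KEY` are existing proxy concepts; the `agent_id` query parameter and the example agent ID are assumptions, so verify the exact filter name in your proxy's API reference (Swagger UI) before relying on it.

```shell
# Sketch: fetch spend logs scoped to one agent via the proxy REST API.
# NOTE: the "agent_id" filter parameter is an assumption, not a confirmed
# parameter name; check the proxy's Swagger UI for the actual field.
curl -s --get "http://localhost:4000/spend/logs" \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  --data-urlencode "agent_id=my-a2a-agent"
```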
## New Providers and Endpoints

### New Providers (5 new providers)

| Provider | Supported LiteLLM Endpoints | Description |
|---|---|---|

### New LLM API Endpoints (2 new endpoints)

| Endpoint | Method | Description | Documentation |
|---|---|---|---|

## New Models / Updated Models

### New Model Support (33 new models)

| Provider | Model | Context Window | Input ($/1M tokens) | Output ($/1M tokens) | Features |
|---|---|---|---|---|---|

