
[Preview] v1.80.10.rc.1

Krrish Dholakia
CEO, LiteLLM
Ishaan Jaff
CTO, LiteLLM

Deploy this version

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:v1.80.10.rc.1
```
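Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. The sketch below builds (but does not send) a chat-completions request against it; the model name and the `sk-1234` key are placeholders, not values from this release.

```python
import json
import urllib.request

# The docker command above publishes the proxy on localhost:4000
BASE_URL = "http://localhost:4000"

payload = {
    "model": "gpt-4o",  # placeholder: use a model configured on your proxy
    "messages": [{"role": "user", "content": "Hello from LiteLLM"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-1234",  # placeholder proxy key
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the request once the container is running
```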

Key Highlights

Agent (A2A) Usage UI

Users can now filter usage statistics by agents, providing the same granular filtering capabilities available for teams, organizations, and customers.

Details:

  • Filter usage analytics, spend logs, and activity metrics by agent ID
  • View breakdowns on a per-agent basis
  • Consistent filtering experience across all usage and analytics views
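The per-agent breakdown described above can be sketched as a simple aggregation over spend-log entries. The entry shape and the `agent_id`/`spend` keys here are illustrative assumptions, not the exact proxy schema:

```python
from collections import defaultdict

def spend_by_agent(spend_logs, agent_id=None):
    """Aggregate spend per agent from a list of spend-log entries.

    spend_logs: list of dicts with hypothetical keys "agent_id" and "spend".
    If agent_id is given, restrict the breakdown to that one agent.
    """
    totals = defaultdict(float)
    for entry in spend_logs:
        aid = entry.get("agent_id")
        if aid is None:
            continue  # entry not attributed to any agent
        if agent_id is not None and aid != agent_id:
            continue  # filtered out by the requested agent ID
        totals[aid] += entry.get("spend", 0.0)
    return dict(totals)

# Illustrative data, not real proxy output
logs = [
    {"agent_id": "agent-a", "spend": 0.12},
    {"agent_id": "agent-b", "spend": 0.05},
    {"agent_id": "agent-a", "spend": 0.03},
]
print(spend_by_agent(logs))             # totals for every agent
print(spend_by_agent(logs, "agent-a"))  # filtered to a single agent
```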

New Providers and Endpoints

New Providers (5 new providers)

Provider | Supported LiteLLM Endpoints | Description

New LLM API Endpoints (2 new endpoints)

Endpoint | Method | Description | Documentation

New Models / Updated Models

New Model Support (33 new models)

Provider | Model | Context Window | Input ($/1M tokens) | Output ($/1M tokens) | Features

Features

Bug Fixes


LLM API Endpoints

Features

Bugs


Management Endpoints / UI

Features

Bugs


AI Integrations (2 new integrations)


Spend Tracking, Budgets and Rate Limiting


MCP Gateway


Agent Gateway (A2A)


Performance / Load Balancing / Reliability Improvements


Documentation Updates


Infrastructure / CI/CD


New Contributors


Full Changelog

View complete changelog on GitHub