Berri AI
The fastest way to take your LLM app to production
Repositories
Showing 10 of 62 repositories
- litellm (Public): Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, Vertex AI, Cohere, Anthropic, SageMaker, Hugging Face, vLLM, NVIDIA NIM]
- mock-token-exchange-server (Public): Mock OAuth 2.0 Token Exchange Server (RFC 8693) for testing the LiteLLM OBO (on-behalf-of) flow.
- terraform-provider-litellm (Public, forked from ncecere/terraform-provider-litellm): LiteLLM Terraform provider.
- mock-oauth2-mcp-server (Public): Mock OAuth2 + MCP (Model Context Protocol) server for testing client_credentials flows. Useful for end-to-end testing of LiteLLM proxy MCP OAuth2 machine-to-machine (M2M) authentication.
- litellm-observatory (Public): End-to-end testing suite for LiteLLM deployments: provider tests, performance metrics, and API validation.
- litellm-pgvector (Public): no description provided.
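The common thread across these repositories is the OpenAI-compatible request format that the litellm proxy routes to many providers. As a rough sketch of what such a request looks like, here is a plain-stdlib construction of an OpenAI-format chat completion call; the proxy URL, model name, and API key below are placeholder assumptions, not values from this listing.

```python
import json
import urllib.request

# Sketch of an OpenAI-format chat completion request, the interface a
# gateway like the LiteLLM proxy exposes for many providers.
# URL, model, and key are placeholders, not real deployment values.
payload = {
    "model": "gpt-4o",  # whichever model the gateway routes
    "messages": [{"role": "user", "content": "Hello"}],
}
request = urllib.request.Request(
    "http://localhost:4000/v1/chat/completions",  # assumed proxy address
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-placeholder",  # placeholder key
    },
)
# urllib.request.urlopen(request) would send it; left unexecuted here.
```

Because the format matches OpenAI's chat completions API, the same request body works whether the gateway forwards to Bedrock, Azure, Anthropic, or any other backed provider.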
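The mock-token-exchange-server exercises RFC 8693 token exchange, whose core is a single form-encoded POST to the token endpoint. A minimal sketch of that request body, using the standard RFC 8693 parameter names; the token and audience values are placeholders, and the exact parameters the mock server expects may differ:

```python
from urllib.parse import urlencode

def token_exchange_body(subject_token: str, audience: str) -> str:
    """Build the form-encoded body of an RFC 8693 token-exchange request."""
    params = {
        # Fixed grant type URN defined by RFC 8693.
        "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
        # The token being exchanged (e.g. the end user's access token
        # in an on-behalf-of flow) and its type.
        "subject_token": subject_token,
        "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
        # Who the resulting token should be usable against.
        "audience": audience,
        "requested_token_type": "urn:ietf:params:oauth:token-type:access_token",
    }
    return urlencode(params)

# Placeholder values for illustration only.
body = token_exchange_body("user-access-token", "https://llm-gateway.example")
```

The server's response is a JSON object carrying `access_token` and `issued_token_type`, which an on-behalf-of caller then presents downstream.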