Magentic: Your Open Source AI Integration Tool
Overview
Magentic is a powerful open-source tool designed to seamlessly integrate Large Language Models (LLMs) into Python code. With its user-friendly decorators, such as @prompt and @chatprompt, developers can create functions that return structured outputs from LLMs, blending traditional Python coding with advanced AI capabilities.
Features
- Structured Outputs: Utilize Pydantic models and Python types for organized data (see the sketch after this list).
- Streaming Outputs: Get real-time structured outputs and function calls as they are generated.
- LLM-Assisted Retries: Enhance adherence to complex output schemas with intelligent retries.
- Observability: Integrate with OpenTelemetry and Pydantic Logfire for robust observability.
- Type Annotations: Benefit from improved linting and IDE support.
- Multi-Provider Configuration: Easily switch between LLM providers like OpenAI, Anthropic, and Ollama.
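For example, the Structured Outputs feature works by using a Pydantic model as the return annotation of a decorated function. The sketch below is illustrative: the Superhero model, its fields, and the prompt text are invented for this example.

```python
from pydantic import BaseModel
from magentic import prompt

# Illustrative model; the fields are assumptions made up for this sketch.
class Superhero(BaseModel):
    name: str
    age: int
    power: str

# The return annotation tells magentic to parse the LLM response into a Superhero.
@prompt("Create a superhero named {name}.")
def create_superhero(name: str) -> Superhero: ...

hero = create_superhero("Garden Man")
print(hero.power)
```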
Installation
To install Magentic, use:
pip install magentic
or for uv users:
uv add magentic
Configure your OpenAI API key with the environment variable OPENAI_API_KEY.
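A minimal sketch of the same configuration done in-process (the placeholder key is hypothetical; in practice you would export the variable in your shell or load it from a .env file):

```python
import os

# Hypothetical placeholder key: magentic's OpenAI backend reads OPENAI_API_KEY
# from the environment, so it must be set before any LLM call is made.
os.environ["OPENAI_API_KEY"] = "sk-..."
```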
Usage
With decorators like @prompt, you can swiftly define LLM prompts as Python functions. For example:
from magentic import prompt

@prompt('Add more "dude"ness to: {phrase}')
def dudeify(phrase: str) -> str: ...  # No body needed: magentic calls the LLM to produce the return value

dudeify("Hello, how are you?")
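The @chatprompt decorator works the same way but lets you supply a full message template. A minimal sketch, assuming magentic's SystemMessage and UserMessage classes as shown in its documentation; the persona and topic are invented for this example:

```python
from magentic import chatprompt, SystemMessage, UserMessage

# Message templates are illustrative; {topic} is filled from the function
# argument, just as with @prompt.
@chatprompt(
    SystemMessage("You are a laid-back surfer."),
    UserMessage("Give me one sentence of advice about {topic}."),
)
def surfer_advice(topic: str) -> str: ...

surfer_advice("debugging Python")
```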
Benefits for Users
Magentic simplifies the integration of AI into applications, making it ideal for developers looking to leverage LLMs alongside traditional Python code.