Use Models Directly
This example shows how to use Athena’s language models directly when building with our Python SDK.
- Use any model available in your workspace
- Fully compatible with LangChain, a popular open-source library for LLM apps
- Get started with `athena.llm.invoke("Hello!")`
- Efficient parallel processing and support for high volumes with `llm.batch()`
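For example, here is a minimal sketch of both calls. The `from athena import llm` import path and the `.content` attribute on responses are assumptions based on the LangChain-compatible interface described above, not confirmed SDK details:

```python
# Minimal sketch: assumes the SDK exposes `athena.llm` as a
# LangChain-compatible runnable, per the calls shown above.
from athena import llm

# Single call: send one prompt, get one completion back.
response = llm.invoke("Hello!")
print(response.content)  # assumes a chat-style message with .content

# Batch call: send several prompts in parallel; results come back
# in the same order as the inputs.
responses = llm.batch([
    "Summarize the plot of Hamlet in one sentence.",
    "Translate 'good morning' into French.",
])
for r in responses:
    print(r.content)
```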
Available Models
Specify models explicitly using `with_config()`; a sketch follows the model list below. The default model is Claude 3.7 Sonnet (`claude_3_7_sonnet`).
Available models include:
- `claude_3_7_sonnet`: Claude 3.7 Sonnet (default)
- `openai_gpt_4_5`: OpenAI GPT-4.5 Preview
- `openai_gpt_4`: OpenAI GPT-4
- `openai_gpt_4_turbo`: OpenAI GPT-4 Turbo
- `openai_gpt_4_turbo_preview`: OpenAI GPT-4 Turbo Preview
- `openai_gpt_4o`: OpenAI GPT-4o
- `openai_gpt_4o_mini`: OpenAI GPT-4o Mini
- `fireworks_llama_3p1_70b`: Llama 3.1 70B (Fireworks)
- `fireworks_llama_3p1_405b`: Llama 3.1 405B (Fireworks)
- `fireworks_function_v2`: Fireworks Function v2
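As a hedged sketch, selecting a model with `with_config()` might look like the following; the `configurable={"model": ...}` key is an assumption based on LangChain's configurable-runnable pattern, so check your workspace documentation for the exact key:

```python
# Hedged sketch: choosing a model via with_config(). The exact
# configuration key ("model" under "configurable") is an assumption.
from athena import llm

gpt_4o = llm.with_config(configurable={"model": "openai_gpt_4o"})

response = gpt_4o.invoke("Hello!")
print(response.content)

# The configured runnable supports the same calls, including batch().
responses = gpt_4o.batch(["First prompt", "Second prompt"])
```

Because `with_config()` returns a new configured runnable rather than mutating the original, you can keep several configured models side by side while the default `llm` stays on Claude 3.7 Sonnet.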