Metadata-Version: 2.4
Name: strands-agents
Version: 1.28.0
Summary: A model-driven approach to building AI agents in just a few lines of code
Project-URL: Homepage, https://github.com/strands-agents/sdk-python
Project-URL: Bug Tracker, https://github.com/strands-agents/sdk-python/issues
Project-URL: Documentation, https://strandsagents.com
Author-email: AWS <opensource@amazon.com>
License: Apache-2.0
License-File: LICENSE
License-File: NOTICE
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.10
Requires-Dist: boto3<2.0.0,>=1.26.0
Requires-Dist: botocore<2.0.0,>=1.29.0
Requires-Dist: docstring-parser<1.0,>=0.15
Requires-Dist: jsonschema<5.0.0,>=4.0.0
Requires-Dist: mcp<2.0.0,>=1.23.0
Requires-Dist: opentelemetry-api<2.0.0,>=1.30.0
Requires-Dist: opentelemetry-instrumentation-threading<1.00b0,>=0.51b0
Requires-Dist: opentelemetry-sdk<2.0.0,>=1.30.0
Requires-Dist: pydantic<3.0.0,>=2.4.0
Requires-Dist: typing-extensions<5.0.0,>=4.13.2
Requires-Dist: watchdog<7.0.0,>=6.0.0
Provides-Extra: a2a
Requires-Dist: a2a-sdk<0.4.0,>=0.3.0; extra == 'a2a'
Requires-Dist: a2a-sdk[sql]<0.4.0,>=0.3.0; extra == 'a2a'
Requires-Dist: fastapi<1.0.0,>=0.115.12; extra == 'a2a'
Requires-Dist: httpx<1.0.0,>=0.28.1; extra == 'a2a'
Requires-Dist: starlette<1.0.0,>=0.46.2; extra == 'a2a'
Requires-Dist: uvicorn<1.0.0,>=0.34.2; extra == 'a2a'
Provides-Extra: all
Requires-Dist: a2a-sdk<0.4.0,>=0.3.0; extra == 'all'
Requires-Dist: a2a-sdk[sql]<0.4.0,>=0.3.0; extra == 'all'
Requires-Dist: anthropic<1.0.0,>=0.21.0; extra == 'all'
Requires-Dist: boto3-stubs[sagemaker-runtime]<2.0.0,>=1.26.0; extra == 'all'
Requires-Dist: fastapi<1.0.0,>=0.115.12; extra == 'all'
Requires-Dist: google-genai<2.0.0,>=1.32.0; extra == 'all'
Requires-Dist: httpx<1.0.0,>=0.28.1; extra == 'all'
Requires-Dist: litellm<2.0.0,>=1.75.9; extra == 'all'
Requires-Dist: llama-api-client<1.0.0,>=0.1.0; extra == 'all'
Requires-Dist: mistralai>=1.8.2; extra == 'all'
Requires-Dist: ollama<1.0.0,>=0.4.8; extra == 'all'
Requires-Dist: openai<1.110.0,>=1.68.0; extra == 'all'
Requires-Dist: openai<2.0.0,>=1.68.0; extra == 'all'
Requires-Dist: opentelemetry-exporter-otlp-proto-http<2.0.0,>=1.30.0; extra == 'all'
Requires-Dist: sphinx-autodoc-typehints<4.0.0,>=1.12.0; extra == 'all'
Requires-Dist: sphinx-rtd-theme<4.0.0,>=1.0.0; extra == 'all'
Requires-Dist: sphinx<10.0.0,>=5.0.0; extra == 'all'
Requires-Dist: starlette<1.0.0,>=0.46.2; extra == 'all'
Requires-Dist: uvicorn<1.0.0,>=0.34.2; extra == 'all'
Requires-Dist: writer-sdk<3.0.0,>=2.2.0; extra == 'all'
Provides-Extra: anthropic
Requires-Dist: anthropic<1.0.0,>=0.21.0; extra == 'anthropic'
Provides-Extra: bidi
Requires-Dist: aws-sdk-bedrock-runtime; (python_version >= '3.12') and extra == 'bidi'
Requires-Dist: smithy-aws-core>=0.0.1; (python_version >= '3.12') and extra == 'bidi'
Provides-Extra: bidi-all
Requires-Dist: a2a-sdk<0.4.0,>=0.3.0; extra == 'bidi-all'
Requires-Dist: a2a-sdk[sql]<0.4.0,>=0.3.0; extra == 'bidi-all'
Requires-Dist: aws-sdk-bedrock-runtime; (python_version >= '3.12') and extra == 'bidi-all'
Requires-Dist: fastapi<1.0.0,>=0.115.12; extra == 'bidi-all'
Requires-Dist: google-genai<2.0.0,>=1.32.0; extra == 'bidi-all'
Requires-Dist: httpx<1.0.0,>=0.28.1; extra == 'bidi-all'
Requires-Dist: opentelemetry-exporter-otlp-proto-http<2.0.0,>=1.30.0; extra == 'bidi-all'
Requires-Dist: prompt-toolkit<4.0.0,>=3.0.0; extra == 'bidi-all'
Requires-Dist: pyaudio<1.0.0,>=0.2.13; extra == 'bidi-all'
Requires-Dist: smithy-aws-core>=0.0.1; (python_version >= '3.12') and extra == 'bidi-all'
Requires-Dist: sphinx-autodoc-typehints<4.0.0,>=1.12.0; extra == 'bidi-all'
Requires-Dist: sphinx-rtd-theme<4.0.0,>=1.0.0; extra == 'bidi-all'
Requires-Dist: sphinx<10.0.0,>=5.0.0; extra == 'bidi-all'
Requires-Dist: starlette<1.0.0,>=0.46.2; extra == 'bidi-all'
Requires-Dist: uvicorn<1.0.0,>=0.34.2; extra == 'bidi-all'
Requires-Dist: websockets<17.0.0,>=15.0.0; extra == 'bidi-all'
Provides-Extra: bidi-gemini
Requires-Dist: google-genai<2.0.0,>=1.32.0; extra == 'bidi-gemini'
Provides-Extra: bidi-io
Requires-Dist: prompt-toolkit<4.0.0,>=3.0.0; extra == 'bidi-io'
Requires-Dist: pyaudio<1.0.0,>=0.2.13; extra == 'bidi-io'
Provides-Extra: bidi-openai
Requires-Dist: websockets<17.0.0,>=15.0.0; extra == 'bidi-openai'
Provides-Extra: dev
Requires-Dist: commitizen<5.0.0,>=4.4.0; extra == 'dev'
Requires-Dist: hatch<2.0.0,>=1.0.0; extra == 'dev'
Requires-Dist: moto<6.0.0,>=5.1.0; extra == 'dev'
Requires-Dist: mypy<2.0.0,>=1.15.0; extra == 'dev'
Requires-Dist: pre-commit<4.6.0,>=3.2.0; extra == 'dev'
Requires-Dist: pytest-asyncio<1.4.0,>=1.0.0; extra == 'dev'
Requires-Dist: pytest-cov<8.0.0,>=7.0.0; extra == 'dev'
Requires-Dist: pytest-timeout<3.0.0,>=2.0.0; extra == 'dev'
Requires-Dist: pytest-xdist<4.0.0,>=3.0.0; extra == 'dev'
Requires-Dist: pytest<10.0.0,>=9.0.0; extra == 'dev'
Requires-Dist: ruff<0.15.0,>=0.13.0; extra == 'dev'
Requires-Dist: tenacity<10.0.0,>=9.0.0; extra == 'dev'
Provides-Extra: docs
Requires-Dist: sphinx-autodoc-typehints<4.0.0,>=1.12.0; extra == 'docs'
Requires-Dist: sphinx-rtd-theme<4.0.0,>=1.0.0; extra == 'docs'
Requires-Dist: sphinx<10.0.0,>=5.0.0; extra == 'docs'
Provides-Extra: gemini
Requires-Dist: google-genai<2.0.0,>=1.32.0; extra == 'gemini'
Provides-Extra: litellm
Requires-Dist: litellm<2.0.0,>=1.75.9; extra == 'litellm'
Requires-Dist: openai<1.110.0,>=1.68.0; extra == 'litellm'
Provides-Extra: llamaapi
Requires-Dist: llama-api-client<1.0.0,>=0.1.0; extra == 'llamaapi'
Provides-Extra: mistral
Requires-Dist: mistralai>=1.8.2; extra == 'mistral'
Provides-Extra: ollama
Requires-Dist: ollama<1.0.0,>=0.4.8; extra == 'ollama'
Provides-Extra: openai
Requires-Dist: openai<2.0.0,>=1.68.0; extra == 'openai'
Provides-Extra: otel
Requires-Dist: opentelemetry-exporter-otlp-proto-http<2.0.0,>=1.30.0; extra == 'otel'
Provides-Extra: sagemaker
Requires-Dist: boto3-stubs[sagemaker-runtime]<2.0.0,>=1.26.0; extra == 'sagemaker'
Requires-Dist: openai<2.0.0,>=1.68.0; extra == 'sagemaker'
Provides-Extra: writer
Requires-Dist: writer-sdk<3.0.0,>=2.2.0; extra == 'writer'
Description-Content-Type: text/markdown

<div align="center">
  <div>
    <a href="https://strandsagents.com">
      <img src="https://strandsagents.com/latest/assets/logo-github.svg" alt="Strands Agents" width="55px" height="105px">
    </a>
  </div>

  <h1>
    Strands Agents
  </h1>

  <h2>
    A model-driven approach to building AI agents in just a few lines of code.
  </h2>

  <div align="center">
    <a href="https://github.com/strands-agents/sdk-python/graphs/commit-activity"><img alt="GitHub commit activity" src="https://img.shields.io/github/commit-activity/m/strands-agents/sdk-python"/></a>
    <a href="https://github.com/strands-agents/sdk-python/issues"><img alt="GitHub open issues" src="https://img.shields.io/github/issues/strands-agents/sdk-python"/></a>
    <a href="https://github.com/strands-agents/sdk-python/pulls"><img alt="GitHub open pull requests" src="https://img.shields.io/github/issues-pr/strands-agents/sdk-python"/></a>
    <a href="https://github.com/strands-agents/sdk-python/blob/main/LICENSE"><img alt="License" src="https://img.shields.io/github/license/strands-agents/sdk-python"/></a>
    <a href="https://pypi.org/project/strands-agents/"><img alt="PyPI version" src="https://img.shields.io/pypi/v/strands-agents"/></a>
    <a href="https://python.org"><img alt="Python versions" src="https://img.shields.io/pypi/pyversions/strands-agents"/></a>
  </div>
  
  <p>
    <a href="https://strandsagents.com/">Documentation</a>
    ◆ <a href="https://github.com/strands-agents/samples">Samples</a>
    ◆ <a href="https://github.com/strands-agents/sdk-python">Python SDK</a>
    ◆ <a href="https://github.com/strands-agents/tools">Tools</a>
    ◆ <a href="https://github.com/strands-agents/agent-builder">Agent Builder</a>
    ◆ <a href="https://github.com/strands-agents/mcp-server">MCP Server</a>
  </p>
</div>

Strands Agents is a simple yet powerful SDK that takes a model-driven approach to building and running AI agents. From simple conversational assistants to complex autonomous workflows, from local development to production deployment, Strands Agents scales with your needs.

## Feature Overview

- **Lightweight & Flexible**: Simple agent loop that just works and is fully customizable
- **Model Agnostic**: Support for Amazon Bedrock, Anthropic, Gemini, LiteLLM, Llama, Ollama, OpenAI, Writer, and custom providers
- **Advanced Capabilities**: Multi-agent systems, autonomous agents, and streaming support
- **Built-in MCP**: Native support for Model Context Protocol (MCP) servers, enabling access to thousands of pre-built tools

## Quick Start

```bash
# Install Strands Agents
pip install strands-agents strands-agents-tools
```

```python
from strands import Agent
from strands_tools import calculator
agent = Agent(tools=[calculator])
agent("What is the square root of 1764?")
```

> **Note**: For the default Amazon Bedrock model provider, you'll need AWS credentials configured and model access enabled for Claude 4 Sonnet in the us-west-2 region. See the [Quickstart Guide](https://strandsagents.com/) for details on configuring other model providers.
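
The default Bedrock provider picks up credentials through the standard AWS SDK credential chain, so one minimal way to satisfy the note above is exporting environment variables before running the agent (the key values below are placeholders; `aws configure`, named profiles, or SSO work just as well):

```shell
# Placeholder credentials -- any AWS SDK credential mechanism works
export AWS_ACCESS_KEY_ID="AKIA...EXAMPLE"
export AWS_SECRET_ACCESS_KEY="example-secret-key"
export AWS_DEFAULT_REGION="us-west-2"  # a region with Bedrock model access enabled
```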

## Installation

Ensure you have Python 3.10+ installed, then:

```bash
# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows use: .venv\Scripts\activate

# Install Strands and tools
pip install strands-agents strands-agents-tools
```

## Features at a Glance

### Python-Based Tools

Easily build tools using Python decorators:

```python
from strands import Agent, tool

@tool
def word_count(text: str) -> int:
    """Count words in text.

    This docstring is used by the LLM to understand the tool's purpose.
    """
    return len(text.split())

agent = Agent(tools=[word_count])
response = agent("How many words are in this sentence?")
```

**Hot Reloading from Directory:**
Enable automatic tool loading and reloading from the `./tools/` directory:

```python
from strands import Agent

# Agent will watch ./tools/ directory for changes
agent = Agent(load_tools_from_directory=True)
response = agent("Use any tools you find in the tools directory")
```
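
As a sketch of what a file in that directory might look like, here is a hypothetical `./tools/shout.py` (the file and function names are illustrative, not part of the SDK; the `try`/`except` fallback only keeps the file importable in environments without the SDK installed):

```python
# ./tools/shout.py -- a hypothetical tool module; the watching agent loads it
# automatically and reloads it whenever the file changes on disk.
try:
    from strands import tool
except ImportError:
    # No-op fallback so this example stays importable without the SDK.
    def tool(func):
        return func

@tool
def shout(text: str) -> str:
    """Upper-case the input text.

    As with any Strands tool, this docstring is what the model sees
    as the tool's description.
    """
    return text.upper()
```

Saving a change to this file while the agent is running should be picked up by the directory watcher without restarting the process.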

### MCP Support

Seamlessly integrate Model Context Protocol (MCP) servers:

```python
from strands import Agent
from strands.tools.mcp import MCPClient
from mcp import stdio_client, StdioServerParameters

aws_docs_client = MCPClient(
    lambda: stdio_client(StdioServerParameters(command="uvx", args=["awslabs.aws-documentation-mcp-server@latest"]))
)

with aws_docs_client:
    agent = Agent(tools=aws_docs_client.list_tools_sync())
    response = agent("Tell me about Amazon Bedrock and how to use it with Python")
```

### Multiple Model Providers

Support for various model providers:

```python
from strands import Agent
from strands.models import BedrockModel
from strands.models.ollama import OllamaModel
from strands.models.llamaapi import LlamaAPIModel
from strands.models.gemini import GeminiModel
from strands.models.llamacpp import LlamaCppModel

# Bedrock
bedrock_model = BedrockModel(
  model_id="us.amazon.nova-pro-v1:0",
  temperature=0.3,
  streaming=True, # Enable/disable streaming
)
agent = Agent(model=bedrock_model)
agent("Tell me about Agentic AI")

# Google Gemini
gemini_model = GeminiModel(
  client_args={
    "api_key": "your_gemini_api_key",
  },
  model_id="gemini-2.5-flash",
  params={"temperature": 0.7}
)
agent = Agent(model=gemini_model)
agent("Tell me about Agentic AI")

# Ollama
ollama_model = OllamaModel(
  host="http://localhost:11434",
  model_id="llama3"
)
agent = Agent(model=ollama_model)
agent("Tell me about Agentic AI")

# Llama API
llama_model = LlamaAPIModel(
    model_id="Llama-4-Maverick-17B-128E-Instruct-FP8",
)
agent = Agent(model=llama_model)
response = agent("Tell me about Agentic AI")
```

Built-in providers:
 - [Amazon Bedrock](https://strandsagents.com/latest/user-guide/concepts/model-providers/amazon-bedrock/)
 - [Anthropic](https://strandsagents.com/latest/user-guide/concepts/model-providers/anthropic/)
 - [Gemini](https://strandsagents.com/latest/user-guide/concepts/model-providers/gemini/)
 - [Cohere](https://strandsagents.com/latest/user-guide/concepts/model-providers/cohere/)
 - [LiteLLM](https://strandsagents.com/latest/user-guide/concepts/model-providers/litellm/)
 - [llama.cpp](https://strandsagents.com/latest/user-guide/concepts/model-providers/llamacpp/)
 - [LlamaAPI](https://strandsagents.com/latest/user-guide/concepts/model-providers/llamaapi/)
 - [MistralAI](https://strandsagents.com/latest/user-guide/concepts/model-providers/mistral/)
 - [Ollama](https://strandsagents.com/latest/user-guide/concepts/model-providers/ollama/)
 - [OpenAI](https://strandsagents.com/latest/user-guide/concepts/model-providers/openai/)
 - [SageMaker](https://strandsagents.com/latest/user-guide/concepts/model-providers/sagemaker/)
 - [Writer](https://strandsagents.com/latest/user-guide/concepts/model-providers/writer/)

Custom providers can be implemented by following the [Custom Providers](https://strandsagents.com/latest/user-guide/concepts/model-providers/custom_model_provider/) guide.

### Example Tools

Strands offers an optional `strands-agents-tools` package with pre-built tools for quick experimentation:

```python
from strands import Agent
from strands_tools import calculator
agent = Agent(tools=[calculator])
agent("What is the square root of 1764?")
```

The package is also available on GitHub at [strands-agents/tools](https://github.com/strands-agents/tools).

### Bidirectional Streaming

> **⚠️ Experimental Feature**: Bidirectional streaming is currently in experimental status. APIs may change in future releases as we refine the feature based on user feedback and evolving model capabilities.

Build real-time voice and audio conversations with persistent streaming connections. Unlike traditional request-response patterns, bidirectional streaming maintains long-running conversations where users can interrupt, provide continuous input, and receive real-time audio responses. Get started with your first `BidiAgent` by following the [Quickstart](https://strandsagents.com/latest/documentation/docs/user-guide/concepts/experimental/bidirectional-streaming/quickstart) guide.

**Supported Model Providers:**
- Amazon Nova Sonic (v1, v2)
- Google Gemini Live
- OpenAI Realtime API

**Installation:**

```bash
# Server-side only (no audio I/O dependencies)
pip install strands-agents[bidi]

# With audio I/O support (includes PyAudio dependency)
pip install strands-agents[bidi,bidi-io]
```

**Quick Example:**

```python
import asyncio
from strands.experimental.bidi import BidiAgent
from strands.experimental.bidi.models import BidiNovaSonicModel
from strands.experimental.bidi.io import BidiAudioIO, BidiTextIO
from strands.experimental.bidi.tools import stop_conversation
from strands_tools import calculator

async def main():
    # Create bidirectional agent with Nova Sonic v2
    model = BidiNovaSonicModel()
    agent = BidiAgent(model=model, tools=[calculator, stop_conversation])

    # Set up audio and text I/O (requires the bidi-io extra)
    audio_io = BidiAudioIO()
    text_io = BidiTextIO()

    # Run with real-time audio streaming
    # Say "stop conversation" to gracefully end the conversation
    await agent.run(
        inputs=[audio_io.input()],
        outputs=[audio_io.output(), text_io.output()]
    )

if __name__ == "__main__":
    asyncio.run(main())
```

> **Note**: `BidiAudioIO` and `BidiTextIO` require the `bidi-io` extra. For server-side deployments where audio I/O is handled by clients (browsers, mobile apps), install only `strands-agents[bidi]` and implement custom input/output handlers using the `BidiInput` and `BidiOutput` protocols.

**Configuration Options:**

```python
from strands.experimental.bidi.models import BidiNovaSonicModel

# Configure audio settings and turn detection (v2 only)
model = BidiNovaSonicModel(
    provider_config={
        "audio": {
            "input_rate": 16000,
            "output_rate": 16000,
            "voice": "matthew"
        },
        "turn_detection": {
            "endpointingSensitivity": "MEDIUM"  # HIGH, MEDIUM, or LOW
        },
        "inference": {
            "max_tokens": 2048,
            "temperature": 0.7
        }
    }
)

# Configure I/O devices
audio_io = BidiAudioIO(
    input_device_index=0,  # Specific microphone
    output_device_index=1,  # Specific speaker
    input_buffer_size=10,
    output_buffer_size=10
)

# Text input mode (type messages instead of speaking)
text_io = BidiTextIO()
await agent.run(
    inputs=[text_io.input()],  # Use text input
    outputs=[audio_io.output(), text_io.output()]
)

# Multi-modal: Both audio and text input
await agent.run(
    inputs=[audio_io.input(), text_io.input()],  # Speak OR type
    outputs=[audio_io.output(), text_io.output()]
)
```

## Documentation

For detailed guidance & examples, explore our documentation:

- [User Guide](https://strandsagents.com/)
- [Quick Start Guide](https://strandsagents.com/latest/user-guide/quickstart/)
- [Agent Loop](https://strandsagents.com/latest/user-guide/concepts/agents/agent-loop/)
- [Examples](https://strandsagents.com/latest/examples/)
- [API Reference](https://strandsagents.com/latest/api-reference/agent/)
- [Production & Deployment Guide](https://strandsagents.com/latest/user-guide/deploy/operating-agents-in-production/)

## Contributing ❤️

We welcome contributions! See our [Contributing Guide](CONTRIBUTING.md) for details on:
- Reporting bugs & requesting features
- Development setup
- Contributing via Pull Requests
- Code of Conduct
- Reporting security issues

## License

This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENSE) file for details.

## Security

See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information.

