@@ -0,0 +1 @@
OPENAI_API_KEY="your_openai_api_key_here"
22 changes: 22 additions & 0 deletions examples/start-agents/aws_strands_agent_starter/Dockerfile
@@ -0,0 +1,22 @@
# Use the official lightweight Python image
FROM python:3.11-slim

# Prevent Python from writing .pyc files and enable unbuffered logging
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Set the working directory
WORKDIR /usr/src/app

# Install dependencies first for Docker caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY app/ ./app/

# Expose the API port
EXPOSE 8000

# Start the FastAPI server using Uvicorn
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
143 changes: 143 additions & 0 deletions examples/start-agents/aws_strands_agent_starter/README.md
@@ -0,0 +1,143 @@
# 🌤️ AWS Strands Agent Starter (Dual-Entrypoint)

*Deploy this production-ready AI agent on [Saturn Cloud](https://saturncloud.io/).*

**Hardware:** CPU/GPU | **Resource:** Python Project & API | **Tech Stack:** AWS Strands SDK, FastAPI, OpenAI, Docker

<p align="center">
<img src="https://img.shields.io/badge/Deployed_on-Saturn_Cloud-blue?style=for-the-badge&logo=cloud" alt="Saturn Cloud">
<img src="https://img.shields.io/badge/API-FastAPI-009688?style=for-the-badge&logo=fastapi&logoColor=white" alt="FastAPI">
<img src="https://img.shields.io/badge/Framework-AWS_Strands-FF9900?style=for-the-badge&logo=amazonaws&logoColor=white" alt="AWS Strands">
<img src="https://img.shields.io/badge/Provider-OpenAI-412991?style=for-the-badge&logo=openai&logoColor=white" alt="OpenAI">
<img src="https://img.shields.io/badge/Tool-Open--Meteo_API-00B0FF?style=for-the-badge" alt="Open-Meteo">
</p>

## 📖 Overview

This template provides a dual-entrypoint implementation of a model-driven AI Agent utilizing the open-source **AWS Strands SDK**.

It features a shared core architecture (`app/agent.py`) that can be executed in two ways:
1. **Interactive CLI (`weather_agent.py`):** For rapid local prototyping and terminal-based debugging.
2. **Production Microservice (`app/main.py`):** A high-performance FastAPI backend that allows external applications to query the agent asynchronously via standard HTTP REST endpoints.

---

## 🏗️ Setup & Installation

**1. Create Virtual Environment & Install Dependencies**
```bash
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt

```

**2. Configure Environment Variables**
Create a `.env` file in the root directory.

```bash
cp .env.example .env
nano .env
# Define OPENAI_API_KEY. Save and exit.

```

---

## 💻 Method 1: Interactive CLI (Prototyping)

Use the CLI script to test new prompts, verify tool execution, and chat with the agent directly in your terminal.

```bash
python weather_agent.py

```

**Example Prompts:**

* *"What is the weather like in Tokyo right now?"*
* *"Should I wear a jacket in London today?"*

To terminate the interactive loop, input `exit`.

---

## 🌐 Method 2: FastAPI Microservice (Production)

Serve the agent as a RESTful web API for integration with frontends, mobile apps, or other microservices.

**Run the Server:**

```bash
uvicorn app.main:app --reload

```

**Test the API:**
Once the server is running, FastAPI automatically generates an interactive Swagger UI documentation page at `http://127.0.0.1:8000/docs`, allowing you to test the agent visually.

Alternatively, send a standard POST request:

```bash
curl -X 'POST' \
'http://127.0.0.1:8000/api/chat' \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"query": "What is the weather like in Tokyo right now?"
}'

```
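The same request can be issued from Python using only the standard library. Below is a minimal sketch; it assumes the server is running locally on port 8000, and the `ask_agent` helper name is ours, not part of the starter:

```python
import json
import urllib.request

def ask_agent(query: str, base_url: str = "http://127.0.0.1:8000") -> str:
    """POST a query to the /api/chat endpoint and return the agent's reply."""
    payload = json.dumps({"query": query}).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json", "accept": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        # The endpoint returns JSON matching the ChatResponse schema: {"response": "..."}
        return json.loads(resp.read().decode())["response"]

# Example (requires the server to be running):
# print(ask_agent("What is the weather like in Tokyo right now?"))
```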

---

### Install Docker (Debian/Ubuntu)

Run these commands one by one in your terminal to install and start the Docker engine:

**1. Update your package manager:**

```bash
sudo apt update

```

**2. Install Docker:**

```bash
sudo apt install -y docker.io

```

**3. Start the Docker service:**

```bash
sudo systemctl start docker
sudo systemctl enable docker

```

**4. Add your user to the Docker group (so you can run `docker` without `sudo`):**

```bash
sudo usermod -aG docker $USER
newgrp docker
```

## 🐳 Docker Deployment

To deploy this application to production environments (like AWS ECS or cloud container registries), build and run the included Dockerfile.

```bash
docker build -t strands-weather-agent .
docker run -p 8000:8000 --env-file .env strands-weather-agent

```

---

## 📚 Official Documentation & References

* **Deployment Platform:** [Saturn Cloud Documentation](https://saturncloud.io/docs/)
* **AI Agent Framework:** [AWS Strands Agents Documentation](https://strandsagents.com/latest/)
* **API Framework:** [FastAPI Documentation](https://fastapi.tiangolo.com/)
* **Weather API Routing:** [Open-Meteo API Reference](https://open-meteo.com/en/docs)
Empty file.
86 changes: 86 additions & 0 deletions examples/start-agents/aws_strands_agent_starter/app/agent.py
@@ -0,0 +1,86 @@
import os
import json
import urllib.request
import urllib.parse
from dotenv import load_dotenv

from strands import Agent, tool
from strands.models.openai import OpenAIModel

# Load environment variables
load_dotenv()

@tool
def get_realtime_weather(location: str) -> str:
"""Fetches current weather data for a specified city or location."""
print(f" [System] Strands SDK executing tool: Geocoding '{location}'...")
headers = {"User-Agent": "Strands-Agent-API/1.0"}

try:
# Step A: Convert city name to coordinates safely
safe_location = urllib.parse.quote(location)
geocode_url = f"https://geocoding-api.open-meteo.com/v1/search?name={safe_location}&count=1&format=json"

req = urllib.request.Request(geocode_url, headers=headers)
with urllib.request.urlopen(req, timeout=15.0) as response:
geo_data = json.loads(response.read().decode())

if not geo_data.get("results"):
return f"System Error: Could not find geographical coordinates for '{location}'."

lat = geo_data["results"][0]["latitude"]
lon = geo_data["results"][0]["longitude"]
country = geo_data["results"][0].get("country", "Unknown Region")

print(f" [System] Strands SDK executing tool: Fetching weather for Lat: {lat}, Lon: {lon}...")

# Step B: Fetch weather using coordinates
weather_url = f"https://api.open-meteo.com/v1/forecast?latitude={lat}&longitude={lon}&current=temperature_2m,wind_speed_10m&timezone=auto"

req2 = urllib.request.Request(weather_url, headers=headers)
with urllib.request.urlopen(req2, timeout=15.0) as response2:
weather_data = json.loads(response2.read().decode())

current = weather_data.get("current", {})
temp = current.get("temperature_2m", "Unknown")
wind = current.get("wind_speed_10m", "Unknown")

return f"Location: {location}, {country}. Temperature: {temp}°C, Wind Speed: {wind} km/h."

except Exception as e:
print(f" [Tool Error] API connection failed: {str(e)}")
return f"System Error: Network failure inside the weather tool - {str(e)}"

def initialize_agent() -> Agent:
"""Initializes and returns the Strands SDK Agent."""
if not os.getenv("OPENAI_API_KEY"):
raise ValueError("Environment Error: OPENAI_API_KEY is not defined.")

llm_provider = OpenAIModel(
client_args={
"api_key": os.getenv("OPENAI_API_KEY"),
"timeout": 60.0,
"max_retries": 3
},
model_id="gpt-4o-mini"
)

return Agent(
model=llm_provider,
tools=[get_realtime_weather],
system_prompt=(
"You are a concise, highly accurate weather assistant. "
"Use the provided tool to fetch real-time weather data for the user's requested location. "
"Extract the location from the prompt, fetch the data, and present the findings clearly."
)
)

# Singleton instance of the agent to avoid re-initializing on every API call
strands_agent = initialize_agent()

def invoke_agent(user_query: str) -> str:
"""Passes the prompt to the Strands framework and returns the string response."""
result = strands_agent(user_query)

# Convert the Strands AgentResult object into a standard Python string
return str(result)
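To see how `get_realtime_weather` assembles its answer without touching the network, here is an offline sketch of the same parsing logic run against canned Open-Meteo-style payloads (the sample values and location are hypothetical):

```python
import json

# Hypothetical canned payloads mirroring the two Open-Meteo responses the tool parses.
sample_geo = json.loads('{"results": [{"latitude": 35.69, "longitude": 139.69, "country": "Japan"}]}')
sample_weather = json.loads('{"current": {"temperature_2m": 21.4, "wind_speed_10m": 9.3}}')

# Step A: extract coordinates from the geocoding response
lat = sample_geo["results"][0]["latitude"]
lon = sample_geo["results"][0]["longitude"]
country = sample_geo["results"][0].get("country", "Unknown Region")

# Step B: extract current conditions from the forecast response
current = sample_weather.get("current", {})
temp = current.get("temperature_2m", "Unknown")
wind = current.get("wind_speed_10m", "Unknown")

summary = f"Location: Tokyo, {country}. Temperature: {temp}°C, Wind Speed: {wind} km/h."
print(summary)
```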
33 changes: 33 additions & 0 deletions examples/start-agents/aws_strands_agent_starter/app/main.py
@@ -0,0 +1,33 @@
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from app.agent import invoke_agent

# Initialize the FastAPI application
app = FastAPI(
title="AWS Strands Weather Agent API",
description="A production-ready microservice utilizing the AWS Strands SDK.",
version="1.0.0"
)

# Define the expected JSON payload schema
class ChatRequest(BaseModel):
query: str

class ChatResponse(BaseModel):
response: str

# Health check endpoint for load balancers and container orchestrators
@app.get("/health")
def health_check():
return {"status": "healthy"}

# The primary AI interaction endpoint
@app.post("/api/chat", response_model=ChatResponse)
async def chat_endpoint(request: ChatRequest):
try:
# Route the query to the Strands agent logic
agent_reply = invoke_agent(request.query)
return ChatResponse(response=agent_reply)
except Exception as e:
# Gracefully handle framework or network errors
raise HTTPException(status_code=500, detail=f"Agent Execution Error: {str(e)}")
@@ -0,0 +1,5 @@
strands-agents[openai]>=0.1.0
fastapi>=0.110.0
uvicorn>=0.29.0
python-dotenv>=1.1.0
pydantic>=2.6.0
32 changes: 32 additions & 0 deletions examples/start-agents/aws_strands_agent_starter/weather_agent.py
@@ -0,0 +1,32 @@
import os
from dotenv import load_dotenv

# Import the shared agent logic from your app package
from app.agent import invoke_agent

# Initialize environment variables
load_dotenv()

if __name__ == "__main__":
print("--- AWS Strands Agent Starter (CLI Mode) ---")
print("Framework: Strands SDK | Provider: OpenAI")
print("Agent is ready. (Type 'exit' to quit)")

if not os.getenv("OPENAI_API_KEY"):
print("❌ Error: OPENAI_API_KEY is missing from your .env file.")
exit(1)

while True:
user_query = input("\nAsk for the weather: ")

if user_query.lower() in ['exit', 'quit']:
print("Terminating CLI process.")
break

if user_query.strip():
try:
# Execute the Strands agent loop via the shared application logic
response = invoke_agent(user_query)
print(f"\nAgent: {response}")
except Exception as e:
print(f"\nExecution Error: {e}")
9 changes: 9 additions & 0 deletions examples/start-agents/camel_ai_benchmarker/.env.example
@@ -0,0 +1,9 @@
# 1. OpenAI Credentials
OPENAI_API_KEY="sk-your-openai-key-here"

# 2. Nebius Studio Credentials
NEBIUS_API_KEY="your-nebius-api-key-here"

# 3. Crusoe Inference Credentials
CRUSOE_API_KEY="your-crusoe-api-key-here"
CRUSOE_API_BASE="https://managed-inference-api-proxy.crusoecloud.com/v1"