
feat(sessions): add get_events() and filter_events() methods to Session#4958

Closed
ecanlar wants to merge 2357 commits into google:main from ecanlar:feature/add-session-filter-events

Conversation

@ecanlar

@ecanlar ecanlar commented Mar 23, 2026

Summary

This PR implements event filtering capabilities for the Session class to support rewind operations.

Changes

  • Added get_events() method: Provides a consistent API for accessing all session events
  • Added filter_events() method: Allows filtering events with options to exclude rewound events
  • Implemented rewind filtering logic: When a session is rewound, events from the rewound invocation onwards are automatically filtered out
  • Comprehensive test coverage: Added extensive unit tests covering all scenarios including edge cases
  • Updated copyright year: Updated copyright notices to 2026

Key Features

  1. get_events(): Returns all events in the session
  2. filter_events(exclude_rewound=True): Returns filtered events, excluding those invalidated by rewind operations
  3. Backward iteration algorithm: Efficiently filters rewound events by iterating backward through the event list

Testing

The implementation includes comprehensive unit tests in test_session_filter_events.py covering:

  • Normal event retrieval
  • Single and multiple rewind scenarios
  • Edge cases (rewind target not found, chronological order preservation)
  • Multiple events in the same invocation

Related Issue

Fixes #4959

didier-durand and others added 30 commits February 10, 2026 14:01
Merge google#3975

### Link to Issue or Description of Change

Fixing various typos: see commit diffs for details

**1. Link to an existing issue (if applicable):**

- Closes: N/A
- Related: N/A

**2. Or, if no issue exists, describe the change:**

Fixing various typos: see commit diffs for details

**Problem:**

Improve the quality of the repo.

**Solution:**
Merge this PR.

### Testing Plan

N/A
**Unit Tests:**

- [N/A] I have added or updated unit tests for my change.
- [X] All unit tests pass locally.

**Manual End-to-End (E2E) Tests:**

N/A

### Checklist

- [X] I have read the [CONTRIBUTING.md](https://github.com/google/adk-python/blob/main/CONTRIBUTING.md) document.
- [X] I have performed a self-review of my own code.
- [N/A] I have commented my code, particularly in hard-to-understand areas.
- [N/A] I have added tests that prove my fix is effective or that my feature works.
- [X] New and existing unit tests pass locally with my changes.
- [N/A] I have manually tested my changes end-to-end.
- [N/A] Any dependent changes have been merged and published in downstream modules.

### Additional context

N/A

COPYBARA_INTEGRATE_REVIEW=google#3975 from didier-durand:fix-typos-b ca1cba4
PiperOrigin-RevId: 868308215
Co-authored-by: Sasha Sobran <asobran@google.com>
PiperOrigin-RevId: 868324488
…ractions API integration

Related: google#4311

Co-authored-by: Xuan Yang <xygoogle@google.com>
PiperOrigin-RevId: 868340444
Merge google#4288

**Please ensure you have read the [contribution guide](https://github.com/google/adk-python/blob/main/CONTRIBUTING.md) before creating a pull request.**

### Link to Issue or Description of Change

**1. Link to an existing issue (if applicable):**

- Closes: google#4274

**2. Or, if no issue exists, describe the change:**

N/A - Issue exists

### Testing Plan

_Please describe the tests that you ran to verify your changes. This is required
for all PRs that are not small documentation or typo fixes._

**Unit Tests:**

- [X] I have added or updated unit tests for my change.
- [X] All unit tests pass locally.

$ pytest tests/unittests/cli/ -v

============================= test session starts ==============================
platform darwin -- Python 3.11.14, pytest-9.0.2, pluggy-1.6.0
collected 246 items
...
====================== 246 passed, 147 warnings in 21.38s ======================

**Manual End-to-End (E2E) Tests:**

1. Verify CLI flag is recognized:
$ adk api_server --help | grep auto_create_session
  --auto_create_session  Automatically create a session if it doesn't exist when calling /run.

2. Start server with flag enabled:
$ adk api_server --auto_create_session

3. Test /run endpoint without pre-creating session:
$ curl -X POST http://localhost:8000/run \
  -H "Content-Type: application/json" \
  -d '{"app_name": "my_agent", "user_id": "user1", "session_id": "new_session", "new_message": {"role": "user", "parts": [{"text": "Hello"}]}}'

Expected: Session auto-created, request succeeds (no 404 error).

### Checklist

- [X] I have read the [CONTRIBUTING.md](https://github.com/google/adk-python/blob/main/CONTRIBUTING.md) document.
- [X] I have performed a self-review of my own code.
- [X] I have commented my code, particularly in hard-to-understand areas.
- [X] I have added tests that prove my fix is effective or that my feature works.
- [X] New and existing unit tests pass locally with my changes.
- [X] I have manually tested my changes end-to-end.
- [X] Any dependent changes have been merged and published in downstream modules.

### Additional context
This PR exposes the existing Runner.auto_create_session functionality (added in commit 8e69a58 / ADK v1.23.0) through the adk api_server CLI command.

Files changed (3 files, ~15 lines):

src/google/adk/cli/cli_tools_click.py - Add --auto_create_session CLI option
src/google/adk/cli/fast_api.py - Pass parameter through get_fast_api_app()
src/google/adk/cli/adk_web_server.py - Store and use in _create_runner()

COPYBARA_INTEGRATE_REVIEW=google#4288 from ekimcodes:main 3c8d299
PiperOrigin-RevId: 868361303
Co-authored-by: Xiang (Sean) Zhou <seanzhougoogle@google.com>
PiperOrigin-RevId: 868370272
Sessions were being erroneously cached and reused across different asyncio event loops, causing "Event loop is closed" errors in environments with transient loops. This change makes the session caching loop-aware: before reusing a cached session, check that the stored loop matches the current loop. Additionally, if the session is disconnected and the loops do not match, discard the cached entry without calling aclose().
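The loop-aware check described above can be sketched as follows; the cache shape and function names are illustrative, not the actual ADK internals:

```python
import asyncio

# Cache maps a key to (session, the loop that created it).
_session_cache: dict[str, tuple[object, asyncio.AbstractEventLoop]] = {}


async def get_session(key: str, factory):
    loop = asyncio.get_running_loop()
    cached = _session_cache.get(key)
    if cached is not None:
        session, stored_loop = cached
        if stored_loop is loop:
            return session  # safe to reuse: same loop that created it
        # Loop mismatch: the old loop may already be closed, so calling
        # session.aclose() could raise. Discard the entry instead.
        _session_cache.pop(key, None)
    session = await factory()
    _session_cache[key] = (session, loop)
    return session
```

Each `asyncio.run()` call creates a fresh loop, so a cached session from a previous run is discarded rather than reused against a closed loop.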

Co-authored-by: Kathy Wu <wukathy@google.com>
PiperOrigin-RevId: 868380746
…ssions pagination

Merge google#4435

### Link to Issue or Description of Change

- Closes: google#4302

**Problem:**

`VertexAiSessionService.list_sessions()` only returns the first ~100 sessions. The `sessions_iterator` from `api_client.agent_engines.sessions.list()` is an `AsyncPager` — it implements `__aiter__`/`__anext__` for fetching subsequent pages, but the code uses a plain `for` loop which only calls `__iter__`/`__next__`, so it never fetches beyond the first page.

**Solution:**

Changed `for api_session in sessions_iterator` to `async for api_session in sessions_iterator` so the `AsyncPager` actually paginates. Updated the test mock to return an `AsyncIterableList` (supports both sync and async iteration) instead of a bare list, so the tests properly simulate real `AsyncPager` behaviour.
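The sync-vs-async iteration difference can be reproduced with a minimal stand-in pager. `FakeAsyncPager` here is illustrative; the real object is the `AsyncPager` returned by the google-genai SDK:

```python
import asyncio


class FakeAsyncPager:
    """Mimics the AsyncPager shape: sync iteration yields only the first
    page, while async iteration transparently fetches subsequent pages."""

    def __init__(self, pages):
        self._pages = pages

    def __iter__(self):
        return iter(self._pages[0])  # plain `for`: first page only

    async def __aiter__(self):
        for page in self._pages:
            for item in page:
                yield item  # `async for`: all pages


async def list_all(pager):
    sessions = []
    async for s in pager:  # the fix: async iteration actually paginates
        sessions.append(s)
    return sessions
```

With a plain `for` loop only `__iter__` is called, so anything beyond the first page is silently dropped; `async for` drives `__aiter__`/`__anext__` and pulls every page.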

### Testing Plan

**Unit Tests:**

```
$ pytest tests/unittests/sessions/
115 passed, 1 warning in 2.25s
```

The existing `test_list_sessions`, `test_list_sessions_with_pagination`, and `test_list_sessions_all_users` all continue to pass with the updated mock.

Co-authored-by: Liang Wu <wuliang@google.com>
COPYBARA_INTEGRATE_REVIEW=google#4435 from anmolg1997:fix/vertex-ai-session-service-pagination 14c71b6
PiperOrigin-RevId: 868466166
Previously submitted change is causing [Pyink error](https://github.com/google/adk-python/actions/runs/21884351396/job/63176122118) in repo because Pyink recently updated to 25.12.0.

Co-authored-by: Liang Wu <wuliang@google.com>
PiperOrigin-RevId: 868806575
Merge google#3926

### Link to Issue or Description of Change

**1. Link to an existing issue (if applicable):**

- Related: google#3916

**2. Or, if no issue exists, describe the change:**

**Problem:**
While `DatabaseSessionService` already supports PostgreSQL through SQLAlchemy, there is no documentation or sample code showing users how to configure and use it.

**Solution:**
Add a comprehensive sample under `contributing/samples/postgres_session_service/` that demonstrates:
- How to configure `DatabaseSessionService` with PostgreSQL
- The auto-generated database schema (sessions, events, app_states, user_states tables)
- Connection URL format and configuration options
- A working sample agent with session persistence

### Testing Plan

**Unit Tests:**

- [x] I have added or updated unit tests for my change.
- [x] All unit tests pass locally.

This is a documentation-only change (new sample), so no new unit tests are required. Existing tests continue to pass.

**Manual End-to-End (E2E) Tests:**

Tested locally with the following steps:

1. Started PostgreSQL using `docker compose up -d`
2. Set environment variables:

```bash
export POSTGRES_URL=postgresql+asyncpg://postgres:postgres@localhost:5432/adk_sessions
export GOOGLE_CLOUD_PROJECT=$(gcloud config get-value project)
export GOOGLE_CLOUD_LOCATION=us-central1
export GOOGLE_GENAI_USE_VERTEXAI=true
```
3. Ran `pip install google-adk asyncpg greenlet` to install the required packages
4. Ran `python main.py` - session created successfully
5. Ran `python main.py` again - previous session resumed with event history
6. Verified tables and rows created in PostgreSQL (sessions, events, app_states, user_states)

### Checklist

- [x] I have read the https://github.com/google/adk-python/blob/main/CONTRIBUTING.md document.
- [x] I have performed a self-review of my own code.
- [x] I have commented my code, particularly in hard-to-understand areas.
- [x] I have added tests that prove my fix is effective or that my feature works.
- [x] New and existing unit tests pass locally with my changes.
- [x] I have manually tested my changes end-to-end.
- [x] Any dependent changes have been merged and published in downstream modules.

### Additional context

This PR adds documentation and a working sample for an already-supported feature. The DatabaseSessionService class already handles PostgreSQL through its DynamicJSON type decorator which uses JSONB for PostgreSQL.

Files added:
- contributing/samples/postgres_session_service/README.md - Comprehensive guide
- contributing/samples/postgres_session_service/agent.py - Sample agent
- contributing/samples/postgres_session_service/main.py - Usage example
- contributing/samples/postgres_session_service/compose.yml - Local PostgreSQL setup
- contributing/samples/postgres_session_service/\_\_init\_\_.py - Package init

Co-authored-by: Liang Wu <wuliang@google.com>
COPYBARA_INTEGRATE_REVIEW=google#3926 from hiroakis:feat-support-pg-for-conversation a5279d4
PiperOrigin-RevId: 868816317
… Actions

Remove the release-please GitHub App config files and add an Actions-based
release pipeline with candidate branch strategy. This includes workflows for
cutting releases, running release-please, finalizing releases, publishing to
PyPI, and cherry-picking fixes to release candidates.

Co-authored-by: Wei Sun (Jack) <weisun@google.com>
PiperOrigin-RevId: 868825704
Co-authored-by: Liang Wu <wuliang@google.com>
PiperOrigin-RevId: 868837301
This makes it easier to re-use the ADK web interface in other contexts.
For example, when using your own runtime, the web interface can be exposed
for visualization during development.

Also adds documentation for the function.

PiperOrigin-RevId: 868848338
Close google#3527

Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 868875061
Merge google#4455


COPYBARA_INTEGRATE_REVIEW=google#4455 from google:release/candidate f660ec8
PiperOrigin-RevId: 868877872
Co-authored-by: Xuan Yang <xygoogle@google.com>
PiperOrigin-RevId: 868890552
This allows users to load skills from a directory and pass it into the SkillToolset constructor.

Co-authored-by: Kathy Wu <wukathy@google.com>
PiperOrigin-RevId: 868929937
This fixes the unit-test GitHub Action error by removing `--extra eval`. `--extra a2a` is not needed because it's already included in `test`. All tests still pass.

Co-authored-by: Liang Wu <wuliang@google.com>
PiperOrigin-RevId: 869014318
It's not used in the codebase.

Co-authored-by: Liang Wu <wuliang@google.com>
PiperOrigin-RevId: 869020801
…gent

Closes google#3905

Co-authored-by: Liang Wu <wuliang@google.com>
PiperOrigin-RevId: 869232930
Co-authored-by: Yifan Wang <wanyif@google.com>
PiperOrigin-RevId: 869375052
Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 869397634
Co-authored-by: Xuan Yang <xygoogle@google.com>
PiperOrigin-RevId: 869400758
Previously, live requests were relayed to a streaming tool even before the tool was called.

Co-authored-by: Xiang (Sean) Zhou <seanzhougoogle@google.com>
PiperOrigin-RevId: 869421826
Previously, only streaming tools that accept stream input were registered at the runner; now all streaming tools are registered uniformly at the runner.

Co-authored-by: Xiang (Sean) Zhou <seanzhougoogle@google.com>
PiperOrigin-RevId: 869447996
Co-authored-by: Xiang (Sean) Zhou <seanzhougoogle@google.com>
PiperOrigin-RevId: 869480442
google-genai-bot and others added 20 commits March 18, 2026 13:28
…tracing into environment simulation

PiperOrigin-RevId: 885759367
…ing to LiveConnectConfig

RunConfig.response_modalities is typed as list[str] for backward
compatibility, but LiveConnectConfig.response_modalities expects
list[Modality] (an enum). Assigning strings directly causes a Pydantic
serialization warning on every live streaming session:

  PydanticSerializationUnexpectedValue(Expected `enum` - serialized
  value may not be as expected [field_name='response_modalities',
  input_value='AUDIO', input_type=str])

Convert each modality value to types.Modality at the assignment point in
basic.py using types.Modality(m), which is a no-op for values that are
already Modality enums and converts plain strings like "AUDIO" to
Modality.AUDIO. The public RunConfig interface remains list[str] so
existing callers are unaffected.

Fixes: google#4869
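The conversion described above relies on enum value lookup being a no-op for existing members. A stdlib stand-in illustrates the behavior; the real type is `types.Modality` from the google-genai SDK, and this `Modality` enum is only a sketch:

```python
from enum import Enum


class Modality(str, Enum):  # stand-in for google.genai types.Modality
    AUDIO = "AUDIO"
    TEXT = "TEXT"


# RunConfig keeps plain strings for backward compatibility...
response_modalities = ["AUDIO"]

# ...and the assignment point converts them to real enum members, which
# is what LiveConnectConfig expects. Modality(m) is a no-op when m is
# already a Modality member, so mixed inputs are safe.
converted = [Modality(m) for m in response_modalities]
```

Because the conversion happens only at the assignment point, the public `RunConfig.response_modalities` interface stays `list[str]` and existing callers are unaffected.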

Co-authored-by: Xiang (Sean) Zhou <seanzhougoogle@google.com>
PiperOrigin-RevId: 885781014
Co-authored-by: Kathy Wu <wukathy@google.com>
PiperOrigin-RevId: 885801247
Co-authored-by: Kathy Wu <wukathy@google.com>
PiperOrigin-RevId: 885813216
Co-authored-by: Kathy Wu <wukathy@google.com>
PiperOrigin-RevId: 885814460
Co-authored-by: Xuan Yang <xygoogle@google.com>
PiperOrigin-RevId: 885829552
Merge google#4780

## Bug

`AnthropicLlm.part_to_message_block()` only serializes `FunctionResponse.response` dicts that contain a `"content"` or `"result"` key. When neither key is present the variable `content` stays as `""` and an empty `ToolResultBlockParam` is sent to Claude.

This silently drops the output of several `SkillToolset` tools:

| Tool | Keys returned | Handled before this fix? |
|---|---|---|
| `load_skill` (success) | `skill_name`, `instructions`, `frontmatter` | **No** |
| `run_skill_script` (success) | `skill_name`, `script_path`, `stdout`, `stderr`, `status` | **No** |
| Any skill tool (error) | `error`, `error_code` | **No** |
| `load_skill_resource` (success) | `skill_name`, `path`, `content` | Yes (`"content"` key) |

Because `load_skill` is the entry-point for skill instructions, Claude models using `SkillToolset` **never received skill instructions**, making the feature completely non-functional with Anthropic models.

## Fix

Added an `else` branch in `part_to_message_block()` that JSON-serializes the full response dict when neither `"content"` nor `"result"` is present:

```python
elif response_data:
    # Fallback: serialize the entire response dict as JSON so that tools
    # returning arbitrary key structures (e.g. load_skill returning
    # {"skill_name", "instructions", "frontmatter"}) are not silently
    # dropped.
    content = json.dumps(response_data)
```

This is consistent with how Gemini handles it — the Gemini integration passes `types.Part` objects directly to the Google GenAI SDK which serializes them natively, so there is no key-based filtering at all.

## Testing plan

Added 4 new unit tests to `tests/unittests/models/test_anthropic_llm.py`:

- `test_part_to_message_block_arbitrary_dict_serialized_as_json` — covers the `load_skill` response shape
- `test_part_to_message_block_run_skill_script_response` — covers the `run_skill_script` response shape
- `test_part_to_message_block_error_response_not_dropped` — covers error dict responses
- `test_part_to_message_block_empty_response_stays_empty` — ensures empty dict still produces empty content (no regression)

All 35 tests in `test_anthropic_llm.py` pass:

```
35 passed in 7.32s
```

Run with:
```bash
uv sync --extra test
pytest tests/unittests/models/test_anthropic_llm.py -v
```

Co-authored-by: Kathy Wu <wukathy@google.com>
COPYBARA_INTEGRATE_REVIEW=google#4780 from akashbangad:fix/anthropic-llm-skill-toolset-fallback c23ad37
PiperOrigin-RevId: 885831845
Renames the imported module from `agentic_sandbox` to `k8s_agent_sandbox` in `gke_code_executor.py` and updates the required version in `pyproject.toml` to `>=0.1.1.post3`.

Closes google#4883

Co-authored-by: Liang Wu <wuliang@google.com>
PiperOrigin-RevId: 885840142
- Introduced a new `AuthScheme` named `GcpIamConnectorAuth` in the `google.adk.integrations.iam_connector` package.
- The feature is currently disabled behind the newly added `GCP_IAM_CONNECTOR_AUTH` experimentation flag.
- The newly added `GcpAuthProvider` class in `adk/integrations/iam_connector/gcp_auth_provider.py` is for internal ADK use; developers should not depend on it.

PiperOrigin-RevId: 886070137
PiperOrigin-RevId: 886119834
The ADK experimental warnings can already be disabled with an environment
variable; enable the same behavior for A2A to avoid log spam.

Co-authored-by: Tim Niemueller <timdn@google.com>
PiperOrigin-RevId: 886233489
This change introduces a separate async session factory for Spanner connections configured with `read_only=True`

Close google#4771

Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 886267104
Modified compaction logic to exclude events containing function calls for which no corresponding function response has been recorded in the session. Added helper functions to identify pending function call IDs and to check whether an event contains such IDs. Applied this exclusion to both the sliding-window and token-threshold-based compaction strategies.

Close google#4740

Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 886268526
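The pending-call exclusion described in the commit above can be sketched as follows; the dict-based event shape and the helper names are illustrative assumptions, not the actual ADK event model:

```python
def pending_function_call_ids(events: list[dict]) -> set[str]:
    """IDs of function calls that have no recorded response yet."""
    call_ids: set[str] = set()
    response_ids: set[str] = set()
    for event in events:
        call_ids.update(event.get("function_call_ids", []))
        response_ids.update(event.get("function_response_ids", []))
    return call_ids - response_ids


def compactable_events(events: list[dict]) -> list[dict]:
    """Events safe to compact: those not holding a still-pending call."""
    pending = pending_function_call_ids(events)
    return [
        e for e in events
        if not (set(e.get("function_call_ids", [])) & pending)
    ]
```

Excluding pending calls from compaction keeps a call/response pair from being split, which would otherwise leave the model with an orphaned function call in its context.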
Kebab-case is the standard for skill names/directories per the Skill spec. However, to support imports within skill scripts (Python does not allow imports with kebab-case, only snake_case), snake_case is also allowed for the skill name.
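A minimal validation/mapping sketch, assuming the spec's lowercase naming rules (the regexes and function names are illustrative, not the ADK implementation):

```python
import re

# Lowercase words separated by hyphens (Skill spec) or underscores (Python).
KEBAB_RE = re.compile(r"^[a-z][a-z0-9]*(-[a-z0-9]+)*$")
SNAKE_RE = re.compile(r"^[a-z][a-z0-9]*(_[a-z0-9]+)*$")


def is_valid_skill_name(name: str) -> bool:
    """Accept kebab-case per the Skill spec, or snake_case so that
    skill scripts remain importable as Python modules."""
    return bool(KEBAB_RE.match(name) or SNAKE_RE.match(name))


def importable_module_name(skill_name: str) -> str:
    """Python identifiers can't contain '-', so map kebab to snake."""
    return skill_name.replace("-", "_")
```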

Co-authored-by: Kathy Wu <wukathy@google.com>
PiperOrigin-RevId: 886351132
DiscoveryEngineSearchTool hardcoded search_result_mode to CHUNKS, which
fails with a 400 error for structured datastores (e.g. Jira Cloud
connectors) that require DOCUMENTS mode. This broke VertexAiSearchTool
with bypass_multi_tools_limit=True for structured data.

Add auto-detection: default mode is now None, which tries CHUNKS first
and automatically falls back to DOCUMENTS when the API returns the
structured-datastore error. Users can also explicitly set the mode via
the new SearchResultMode enum to skip auto-detection.

Fixes google#3406
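The try-CHUNKS-then-fall-back flow can be sketched as follows; `StructuredDatastoreError` and the `do_search` callable stand in for the real Discovery Engine API call and its 400 response, and are assumptions of this sketch:

```python
from enum import Enum


class SearchResultMode(Enum):
    CHUNKS = "CHUNKS"
    DOCUMENTS = "DOCUMENTS"


class StructuredDatastoreError(Exception):
    """Stand-in for the 400 the API returns for structured datastores."""


def search(query, mode=None, *, do_search):
    if mode is not None:
        return do_search(query, mode)  # explicit mode: skip auto-detection
    try:
        return do_search(query, SearchResultMode.CHUNKS)
    except StructuredDatastoreError:
        # Structured datastores (e.g. Jira Cloud connectors) reject
        # CHUNKS, so retry in DOCUMENTS mode.
        return do_search(query, SearchResultMode.DOCUMENTS)
```

Defaulting `mode` to `None` preserves the old CHUNKS behavior for unstructured datastores while making structured ones work without any user configuration.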

Co-authored-by: Xiang (Sean) Zhou <seanzhougoogle@google.com>
PiperOrigin-RevId: 886354417
Co-authored-by: Xuan Yang <xygoogle@google.com>
PiperOrigin-RevId: 886446298
Support Starlette lifespan protocol in to_a2a() so users can run
async startup/shutdown logic (e.g. initializing DB connections,
loading prompt registries) without resorting to module-level globals.

The lifespan parameter accepts a standard Starlette async context
manager. Internally, a composed lifespan runs the A2A route setup
first, then delegates to the user's lifespan if provided. This also
replaces the deprecated add_event_handler("startup", ...) pattern
with the modern Starlette(lifespan=...) constructor.

Closes google#4701
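The composed-lifespan idea can be shown with plain `asynccontextmanager`, driven manually so no web framework is required; the event names and `compose_lifespan` helper are illustrative, not the actual `to_a2a()` internals:

```python
import asyncio
from contextlib import asynccontextmanager

events = []


@asynccontextmanager
async def user_lifespan(app):
    events.append("user startup")  # e.g. open DB pools, load prompt registries
    yield
    events.append("user shutdown")


def compose_lifespan(user_lifespan=None):
    @asynccontextmanager
    async def lifespan(app):
        events.append("a2a route setup")  # A2A setup always runs first
        if user_lifespan is not None:
            async with user_lifespan(app):
                yield  # serve while both contexts are open
        else:
            yield
    return lifespan
```

In the real integration the composed callable would be passed as `Starlette(lifespan=...)`, replacing the deprecated `add_event_handler("startup", ...)` pattern.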

Co-authored-by: Xiang (Sean) Zhou <seanzhougoogle@google.com>
PiperOrigin-RevId: 886892363
This change enables the conversion of ADK EventActions, serialized within A2A object metadata, back into ADK Event objects. It includes logic to parse JSON-encoded metadata values and to merge EventActions from multiple sources within an A2A Task.

Close google#3968
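The parse-then-merge flow can be sketched as follows; the dict-based EventActions shape and the merge policy (later sources win, state deltas unioned) are assumptions of this sketch, not the confirmed ADK behavior:

```python
import json


def decode_metadata_value(value):
    """A2A metadata values may arrive JSON-encoded as strings;
    fall back to the raw value when they aren't valid JSON."""
    if isinstance(value, str):
        try:
            return json.loads(value)
        except json.JSONDecodeError:
            return value
    return value


def merge_event_actions(actions_list):
    """Merge EventActions-like dicts from multiple A2A task sources."""
    merged = {"state_delta": {}}
    for actions in actions_list:
        for key, value in actions.items():
            if key == "state_delta":
                merged["state_delta"].update(value)  # union of deltas
            else:
                merged[key] = value  # later sources win on conflicts
    return merged
```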

Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 886927688
@google-cla

google-cla bot commented Mar 23, 2026

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

@adk-bot adk-bot added the services [Component] This issue is related to runtime services, e.g. sessions, memory, artifacts, etc label Mar 23, 2026
@adk-bot
Collaborator

adk-bot commented Mar 23, 2026

Response from ADK Triaging Agent

Hello @ecanlar, thank you for your contribution!

To help us track feature requests and bugs, could you please create an issue and link it to this PR? If the issue already exists, you can link it by adding "Fixes #issue_number" or "Resolves #issue_number" to the PR description.

This will help us with the review process. Thanks!

@ecanlar ecanlar force-pushed the feature/add-session-filter-events branch 2 times, most recently from 18b7b01 to cd2e412 on March 23, 2026 11:22
@ecanlar
Author

ecanlar commented Mar 23, 2026

Closing this PR because it included too many commits. Creating a new, clean PR with only the necessary changes.

@ecanlar ecanlar closed this Mar 23, 2026

Labels

services [Component] This issue is related to runtime services, e.g. sessions, memory, artifacts, etc


Successfully merging this pull request may close these issues.

Add get_events() and filter_events() methods to Session