6 changes: 6 additions & 0 deletions integrations/llms/vertex-ai.mdx
@@ -75,6 +75,12 @@ Use the Portkey instance to send requests to any models hosted on Vertex AI. You
Vertex AI uses OAuth2 to authenticate its requests, so you need to send the **access token** additionally along with the request.
</Warning>

<Note>
**Self-Hosted GKE Deployments:** When running the Portkey Gateway on GKE with [Workload Identity Federation](/self-hosting/hybrid-deployments/gcp#setting-up-iam-permission) enabled (`GCP_AUTH_MODE=workload`), the Gateway automatically acquires Vertex AI access tokens from the GCP metadata server, so no manual `Authorization` header is needed.

The Google service account (GSA) bound to the Gateway's Kubernetes service account (KSA) must have the `roles/aiplatform.user` role. See the [GCP deployment guide](/self-hosting/hybrid-deployments/gcp#setting-up-iam-permission) for setup.
</Note>
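The IAM setup referenced above can be sketched with `gcloud` roughly as follows. This is a hedged illustration, not part of the linked guide: `PROJECT_ID`, `portkey-gsa`, `portkey-namespace`, and `portkey-ksa` are placeholder names you would replace with your own.

```shell
# Grant the Google service account (GSA) permission to call Vertex AI.
# PROJECT_ID and portkey-gsa are placeholders, not names from this doc.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:portkey-gsa@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/aiplatform.user"

# Allow the Gateway's Kubernetes service account (KSA) to impersonate the GSA
# via Workload Identity Federation. portkey-namespace/portkey-ksa are placeholders.
gcloud iam service-accounts add-iam-policy-binding \
  portkey-gsa@PROJECT_ID.iam.gserviceaccount.com \
  --role="roles/iam.workloadIdentityUser" \
  --member="serviceAccount:PROJECT_ID.svc.id.goog[portkey-namespace/portkey-ksa]"
```

The KSA must also carry the `iam.gke.io/gcp-service-account` annotation pointing at the GSA for the metadata server to serve tokens for it; the linked deployment guide covers the exact steps.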

<Tabs>
<Tab title="NodeJS SDK">
```js