From ba8fcff5626fd3afdc128f4171a8a6bdf147cf2c Mon Sep 17 00:00:00 2001
From: Vrushank V
Date: Mon, 9 Feb 2026 18:21:01 -0800
Subject: [PATCH] Update vertex-ai.mdx

---
 integrations/llms/vertex-ai.mdx | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/integrations/llms/vertex-ai.mdx b/integrations/llms/vertex-ai.mdx
index 8a05cb46..3cf11666 100644
--- a/integrations/llms/vertex-ai.mdx
+++ b/integrations/llms/vertex-ai.mdx
@@ -75,6 +75,12 @@ Use the Portkey instance to send requests to any models hosted on Vertex AI. You
 
 Vertex AI uses OAuth2 to authenticate its requests, so you need to send the **access token** additionally along with the request.
 
+
+**Self-Hosted GKE Deployments:** When running the Portkey Gateway on GKE with [Workload Identity Federation](/self-hosting/hybrid-deployments/gcp#setting-up-iam-permission) enabled (`GCP_AUTH_MODE=workload`), the Gateway automatically acquires Vertex AI access tokens from the GCP metadata server. No manual `Authorization` header is needed.
+
+The Google service account (GSA) bound to the Gateway's Kubernetes service account (KSA) must have the `roles/aiplatform.user` role. See the [GCP deployment guide](/self-hosting/hybrid-deployments/gcp#setting-up-iam-permission) for setup.
+
+
 ```js