diff --git a/integrations/llms/vertex-ai.mdx b/integrations/llms/vertex-ai.mdx
index 8a05cb46..3cf11666 100644
--- a/integrations/llms/vertex-ai.mdx
+++ b/integrations/llms/vertex-ai.mdx
@@ -75,6 +75,12 @@ Use the Portkey instance to send requests to any models hosted on Vertex AI. You
Vertex AI uses OAuth2 to authenticate its requests, so you need to send the **access token** additionally along with the request.
+
+**Self-Hosted GKE Deployments:** When running the Portkey Gateway on GKE with [Workload Identity Federation](/self-hosting/hybrid-deployments/gcp#setting-up-iam-permission) enabled (`GCP_AUTH_MODE=workload`), the Gateway automatically acquires Vertex AI access tokens from the GCP metadata server. No manual `Authorization` header is needed.
+
+The Google service account (GSA) bound to the Gateway's Kubernetes service account (KSA) must have the `roles/aiplatform.user` role. See the [GCP deployment guide](/self-hosting/hybrid-deployments/gcp#setting-up-iam-permission) for setup.
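+
+As a rough sketch, the Workload Identity binding looks like the following. The GSA name (`portkey-gateway`), KSA name (`portkey-ksa`), namespace (`portkey`), and `PROJECT_ID` are all placeholders for illustration, not values prescribed by Portkey:
+
+```sh
+# Grant the GSA permission to call Vertex AI (placeholder names)
+gcloud projects add-iam-policy-binding PROJECT_ID \
+  --member="serviceAccount:portkey-gateway@PROJECT_ID.iam.gserviceaccount.com" \
+  --role="roles/aiplatform.user"
+
+# Allow the KSA to impersonate the GSA via Workload Identity
+gcloud iam service-accounts add-iam-policy-binding \
+  portkey-gateway@PROJECT_ID.iam.gserviceaccount.com \
+  --member="serviceAccount:PROJECT_ID.svc.id.goog[portkey/portkey-ksa]" \
+  --role="roles/iam.workloadIdentityUser"
+
+# Annotate the KSA so GKE serves GSA credentials from the metadata server
+kubectl annotate serviceaccount portkey-ksa -n portkey \
+  iam.gke.io/gcp-service-account=portkey-gateway@PROJECT_ID.iam.gserviceaccount.com
+```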
+
```js