diff --git a/documentation/docs/get-started/setup.mdx b/documentation/docs/get-started/setup.mdx
index 7a2cbb16..cf71b441 100644
--- a/documentation/docs/get-started/setup.mdx
+++ b/documentation/docs/get-started/setup.mdx
@@ -329,7 +329,7 @@ Using Ollama? See the [Ollama Integration](/advanced/ollama) section for more cu
    - Give the configuration a friendly name like `Gemini`. Do not configure the API base url.
 2. Create a new [chat model](http://localhost:42110/server/admin/database/chatmodel/add)
    - Set the `chat-model` field to a [Google Gemini chat model](https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models#gemini-models). Example: `gemini-1.5-flash`.
-   - Set the `model-type` field to `Gemini`.
+   - Set the `model-type` field to `Google`.
    - Set the `ai model api` field to the Gemini AI Model API you created in step 1.