Expose web, desktop settings page under /settings, not /configure
- Update references to the settings page to use the new URL across docs and code
- Rename the desktop and web settings page to settings.html instead of config[ure].html
@@ -34,4 +34,4 @@ Using LiteLLM with Khoj makes it possible to turn any LLM behind an API into you
 5. Create a new [Server Chat Setting](http://localhost:42110/server/admin/database/serverchatsettings/add/) on your Khoj admin panel
    - Default model: `<name of chat model option you created in step 4>`
    - Summarizer model: `<name of chat model option you created in step 4>`
-6. Go to [your config](http://localhost:42110/configure) and select the model you just created in the chat model dropdown.
+6. Go to [your config](http://localhost:42110/settings) and select the model you just created in the chat model dropdown.
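Before creating the server chat setting above, it can help to confirm the LiteLLM proxy actually answers OpenAI-style requests. This is not part of the diff, just a minimal sketch; it assumes the proxy runs on LiteLLM's default port 4000 and `my-litellm-model` is a placeholder for whatever model alias you configured in LiteLLM.

```python
# Sanity-check the LiteLLM proxy before pointing Khoj at it.
# Assumes: proxy at http://localhost:4000, model alias "my-litellm-model" (placeholder).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000", api_key="not-needed-locally")

response = client.chat.completions.create(
    model="my-litellm-model",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

If this round-trip works, the same base URL and model name are what you would enter in the chat model option referenced in step 4.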
@@ -27,4 +27,4 @@ LM Studio can expose an [OpenAI API compatible server](https://lmstudio.ai/docs/
 5. Create a new [Server Chat Setting](http://localhost:42110/server/admin/database/serverchatsettings/add/) on your Khoj admin panel
    - Default model: `<name of chat model option you created in step 4>`
    - Summarizer model: `<name of chat model option you created in step 4>`
-6. Go to [your config](http://localhost:42110/configure) and select the model you just created in the chat model dropdown.
+6. Go to [your config](http://localhost:42110/settings) and select the model you just created in the chat model dropdown.
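The same kind of smoke test works for LM Studio's local server. A rough sketch, assuming the server is started on LM Studio's usual port 1234; the exact model identifier to use in the Khoj chat model option is whatever the server itself reports.

```python
# List the models LM Studio's local server exposes, so the chat model option
# in Khoj can use the exact identifier. Assumes the server runs at localhost:1234.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

for model in client.models.list():
    print(model.id)  # use this id as the chat model name in step 4
```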
@@ -31,6 +31,6 @@ Ollama exposes a local [OpenAI API compatible server](https://github.com/ollama/
 5. Create a new [Server Chat Setting](http://localhost:42110/server/admin/database/serverchatsettings/add/) on your Khoj admin panel
    - Default model: `<name of chat model option you created in step 4>`
    - Summarizer model: `<name of chat model option you created in step 4>`
-6. Go to [your config](http://localhost:42110/configure) and select the model you just created in the chat model dropdown.
+6. Go to [your config](http://localhost:42110/settings) and select the model you just created in the chat model dropdown.
 
 That's it! You should now be able to chat with your Ollama model from Khoj. If you want to add additional models running on Ollama, repeat step 6 for each model.
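Ollama's OpenAI-compatible endpoint lives under `/v1` on its default port. A small sketch to confirm that the model you plan to register responds before selecting it on the settings page; the model name `llama3` is only an example, use whichever model you pulled with `ollama pull`.

```python
# Quick chat round-trip against Ollama's OpenAI-compatible API.
# Assumes Ollama is running locally on its default port 11434 and that a
# model (here "llama3", as an example) has already been pulled.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")  # key is ignored by Ollama

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Reply with a short greeting."}],
)
print(response.choices[0].message.content)
```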
@@ -34,4 +34,4 @@ For specific integrations, see our [Ollama](/advanced/ollama), [LMStudio](/advan
 5. Create a new [Server Chat Setting](http://localhost:42110/server/admin/database/serverchatsettings/add/) on your Khoj admin panel
    - Default model: `<name of chat model option you created in step 4>`
    - Summarizer model: `<name of chat model option you created in step 4>`
-6. Go to [your config](http://localhost:42110/configure) and select the model you just created in the chat model dropdown.
+6. Go to [your config](http://localhost:42110/settings) and select the model you just created in the chat model dropdown.
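Since this commit moves the settings page from /configure to /settings, one quick way to confirm a local Khoj server serves the new route is to request both URLs and compare status codes. A rough sketch, assuming Khoj runs at http://localhost:42110; an unauthenticated request may redirect to login, so only the status code is checked.

```python
# Check that the renamed settings route responds on a local Khoj server.
# Assumes the server is at http://localhost:42110; redirects are not followed,
# so a login redirect still counts as the route existing.
import requests

for path in ("/settings", "/configure"):
    resp = requests.get(f"http://localhost:42110{path}", allow_redirects=False)
    print(path, resp.status_code)
```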