Mirror of https://github.com/khoaliber/khoj.git (synced 2026-03-07 21:29:13 +00:00)
Properly close chat stream iterator even if response generation fails
Previously, the chat stream iterator wasn't closed when response streaming for the offline chat model threw an exception. This required restarting the application. Now the application doesn't hang even if the current response generation fails with an exception.
@@ -224,7 +224,7 @@ def llm_thread(g, messages: List[ChatMessage], model: Any, max_prompt_size: int
             g.send(response["choices"][0]["delta"].get("content", ""))
     finally:
         state.chat_lock.release()
-    g.close()
+        g.close()
 
 
 def send_message_to_model_offline(
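A minimal sketch of why this fix matters, under assumptions: the diff implies `g` is a queue-backed threaded generator with `send`/`close` methods, so the `ThreadedGenerator` class, the `chunks`/`fail_at` parameters, and the simplified `llm_thread` body below are illustrative stand-ins, not Khoj's actual code. If the producer thread raises mid-stream and `g.close()` only runs after the loop, the consumer blocks forever waiting on the queue; moving `close()` into `finally` guarantees the stream terminates.

```python
import queue
import threading

class ThreadedGenerator:
    # Hypothetical queue-backed generator; a sketch of what the diff's `g` implies.
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()  # blocks until the producer sends or closes
        if item is StopIteration:
            raise StopIteration
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        # Sentinel tells the consumer the stream is finished.
        self.queue.put(StopIteration)

def llm_thread(g, chunks, fail_at=None):
    # Simplified stand-in for Khoj's llm_thread: stream chunks to the consumer,
    # and (the fix) close the generator in `finally` so iteration ends even if
    # response generation raises.
    try:
        for i, chunk in enumerate(chunks):
            if fail_at is not None and i == fail_at:
                raise RuntimeError("model error")  # simulated streaming failure
            g.send(chunk)
    finally:
        g.close()  # without this, a failed stream leaves the consumer hanging

g = ThreadedGenerator()
t = threading.Thread(target=llm_thread, args=(g, ["a", "b", "c"], 2))
t.start()
received = list(g)  # terminates because close() ran despite the exception
t.join()
# received == ["a", "b"]
```

The producer's exception still surfaces in the thread's traceback, but the consumer's `for`/`list` loop now ends cleanly instead of blocking on an empty queue.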