Move prompts into LLMs module (#70)
Since the only usage of prompts is within the LLMs, it is reasonable to keep them within the LLM module. This way, the module is easier to discover and the code base stays less complicated.

Changes:
* Move prompt components into llms
* Bump version 0.3.1
* Make pip install dependencies in eager mode

---------

Co-authored-by: ian <ian@cinnamon.is>
commit 693ed39de4 (parent 8532138842)
@@ -4,8 +4,8 @@ from typing import List
 from theflow import Compose, Node, Param

 from kotaemon.base import BaseComponent
+from kotaemon.llms import BasePromptComponent
 from kotaemon.llms.chats.openai import AzureChatOpenAI
-from kotaemon.prompt.base import BasePromptComponent


 class Thought(BaseComponent):
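For downstream code, the practical effect of this hunk is an import-path change only (the class name itself is unchanged); a minimal before/after sketch:

    # Before this change: prompt components lived in their own package
    from kotaemon.prompt.base import BasePromptComponent

    # After this change: the same class is exposed through the llms module
    from kotaemon.llms import BasePromptComponent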