Move prompts into LLMs module (#70)
Since prompts are only used by the LLMs, it makes sense to keep them inside the LLM module. This makes the module easier to discover and keeps the code base simpler.

Changes:
* Move prompt components into llms
* Bump version to 0.3.1
* Make pip install dependencies in eager mode
---------
Co-authored-by: ian <ian@cinnamon.is>
commit 693ed39de4
parent 8532138842
committed by GitHub
@@ -9,11 +9,11 @@ from kotaemon.base import BaseComponent
 from kotaemon.base.schema import RetrievedDocument
 from kotaemon.docstores import InMemoryDocumentStore
 from kotaemon.embeddings import AzureOpenAIEmbeddings
+from kotaemon.llms import PromptTemplate
 from kotaemon.llms.chats.openai import AzureChatOpenAI
 from kotaemon.pipelines.agents import BaseAgent
 from kotaemon.pipelines.retrieving import RetrieveDocumentFromVectorStorePipeline
 from kotaemon.pipelines.tools import ComponentTool
-from kotaemon.prompt.template import PromptTemplate
 from kotaemon.vectorstores import InMemoryVectorStore

 from .utils import file_names_to_collection_name
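For downstream code the only visible change is the import path: PromptTemplate is now imported from kotaemon.llms instead of kotaemon.prompt.template. A minimal sketch of new-style usage follows; the template string, the constructor argument, and the populate call are assumptions about the template API, not something shown in this diff.

# Prompt components now live under kotaemon.llms (previously kotaemon.prompt.template).
from kotaemon.llms import PromptTemplate

# Hypothetical usage; the exact fill-in method may differ in your kotaemon version.
template = PromptTemplate(template="Summarize the following text:\n{text}")
prompt = template.populate(text="Prompt components were moved next to the LLMs that consume them.")
print(prompt)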