Allow users to add LLM within the UI (#6)

* Rename AzureChatOpenAI to LCAzureChatOpenAI
* Provide vanilla ChatOpenAI and AzureChatOpenAI
* Remove the "highest accuracy" and "lowest cost" criteria. These criteria are unnecessary: the users, not the pipeline creators, should choose which LLM to use. Furthermore, inputting this information is cumbersome and really degrades the user experience.
* Remove the LLM selection in the simple reasoning pipeline
* Provide a dedicated stream method to generate the output
* Return a placeholder message to the chat if the generated text is empty
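The last two bullets can be sketched together: a streaming method that yields output chunks as they arrive, and a placeholder reply when nothing was generated. This is a minimal illustration with hypothetical names (`stream_answer`, `PLACEHOLDER`), not kotaemon's actual API.

```python
from typing import Iterator

# Hypothetical placeholder text; the real message lives in the pipeline code.
PLACEHOLDER = "(Sorry, I don't have an answer for that.)"


def stream_answer(chunks: Iterator[str]) -> Iterator[str]:
    """Yield LLM output chunks as they arrive.

    If the model produced no non-empty text at all, yield a single
    placeholder message so the chat UI never shows a blank reply.
    """
    produced = False
    for chunk in chunks:
        if chunk:
            produced = True
            yield chunk
    if not produced:
        yield PLACEHOLDER
```

Streaming keeps the UI responsive for long generations, while the placeholder guards the one edge case a streaming interface otherwise hides: a model that returns only empty chunks.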
Commit a203fc0f7c (parent e187e23dd1), committed via GitHub.
@@ -5,7 +5,7 @@ from kotaemon.base import BaseComponent, Document, LLMInterface, Node, Param, la
 from kotaemon.contribs.promptui.logs import ResultLog
 from kotaemon.embeddings import AzureOpenAIEmbeddings
 from kotaemon.indices import VectorIndexing, VectorRetrieval
-from kotaemon.llms import AzureChatOpenAI
+from kotaemon.llms import LCAzureChatOpenAI
 from kotaemon.storages import ChromaVectorStore, SimpleFileDocumentStore
||||
@@ -34,7 +34,7 @@ class QuestionAnsweringPipeline(BaseComponent):
     ]

     retrieval_top_k: int = 1
-    llm: AzureChatOpenAI = AzureChatOpenAI.withx(
+    llm: LCAzureChatOpenAI = LCAzureChatOpenAI.withx(
         azure_endpoint="https://bleh-dummy-2.openai.azure.com/",
         openai_api_key=os.environ.get("OPENAI_API_KEY", "default-key"),
         openai_api_version="2023-03-15-preview",
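The `LCAzureChatOpenAI.withx(...)` call in the hunk above suggests a partial-initialization pattern: bind some constructor arguments now, construct the component later. The sketch below illustrates that pattern with stand-in classes (`LLMBase`, `FakeAzureChat`); it is an assumption about how `withx` behaves, not kotaemon's implementation.

```python
import functools


class LLMBase:
    @classmethod
    def withx(cls, **kwargs):
        """Return a factory with these keyword arguments pre-bound."""
        return functools.partial(cls, **kwargs)


class FakeAzureChat(LLMBase):
    """Stand-in for LCAzureChatOpenAI, for illustration only."""

    def __init__(self, azure_endpoint="", openai_api_version=""):
        self.azure_endpoint = azure_endpoint
        self.openai_api_version = openai_api_version


# Bind the stable settings once; supply the rest at construction time.
factory = FakeAzureChat.withx(openai_api_version="2023-03-15-preview")
llm = factory(azure_endpoint="https://example.openai.azure.com/")
```

Declaring the LLM this way lets a pipeline class state its default configuration without instantiating the client at import time.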