Install
# Create new conda env (optional)
conda create -n {{ cookiecutter.project_name }} python=3.10
conda activate {{ cookiecutter.project_name }}
# Clone and install the project
git clone "<{{ cookiecutter.project_name }}-repo>"
cd "<{{ cookiecutter.project_name }}-repo>"
pip install -e .
# Generate the project structure
cd ..
kh start-project
Usage
- Build the pipeline in pipeline.py
For supported utilities and tools, refer to https://github.com/Cinnamon/kotaemon/wiki/Utilities
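As a rough starting point, a pipeline is typically a class that chains small processing steps behind a single run() method. The sketch below is a minimal, framework-free illustration of that shape for pipeline.py; the class and step names (SimplePipeline, normalize, tokenize) are hypothetical examples, not part of kotaemon's API.

```python
# Hypothetical sketch of a pipeline class you might put in pipeline.py.
# It only illustrates the "chain of steps behind run()" structure;
# none of these names come from kotaemon itself.

class SimplePipeline:
    """Chains small text-processing steps into a single run() call."""

    def normalize(self, text: str) -> str:
        # Collapse repeated whitespace and lowercase the input.
        return " ".join(text.split()).lower()

    def tokenize(self, text: str) -> list[str]:
        # Split the normalized text into word tokens.
        return text.split(" ")

    def run(self, text: str) -> list[str]:
        # Each step feeds the next; add or swap steps as needed.
        return self.tokenize(self.normalize(text))


if __name__ == "__main__":
    print(SimplePipeline().run("  Hello   Kotaemon  "))  # ['hello', 'kotaemon']
```

In practice you would replace the toy steps with the components your project needs (retrieval, LLM calls, post-processing) while keeping the same step-by-step structure.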
Contribute
- For issues and errors in this project, please report them in this repo's issue tracker.
- For issues and errors in kotaemon, please report them or submit PR fixes at https://github.com/Cinnamon/kotaemon.git
- For issues and errors in this project's template, please report them or submit PR fixes at https://github.com/Cinnamon/kotaemon/tree/main/templates/project-default