
Project {{ cookiecutter.project_name }}

This project uses pre-commit hooks (configured in .pre-commit-config.yaml); run pre-commit install after setting up the environment to enable them.

Install

# Create new conda env (optional)
conda create -n {{ cookiecutter.project_name }} python=3.10
conda activate {{ cookiecutter.project_name }}

# Clone and install the project
git clone "<{{ cookiecutter.project_name }}-repo>"
cd "<{{ cookiecutter.project_name }}-repo>"
pip install -e .

# Generate the project structure
cd ..
kh start-project

Usage

  • Build the pipeline in pipeline.py (a minimal sketch is shown below)

For supported utilities and tools, see https://github.com/Cinnamon/kotaemon/wiki/Utilities
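
To make the first step concrete, here is a minimal, hypothetical sketch of what pipeline.py could contain. It assumes the pipeline subclasses kotaemon.base.BaseComponent, uses the ChatOpenAI wrapper from kotaemon.llms, and reads the OpenAI API key from the environment; the class name, model name, and attribute layout are illustrative, so adapt them to the scaffolding that kh start-project actually generates.

# pipeline.py -- illustrative sketch only, not the generated scaffolding
from kotaemon.base import BaseComponent
from kotaemon.llms import ChatOpenAI


class Pipeline(BaseComponent):
    """Forward the user's question to an LLM and return the answer text."""

    # Kept as a class attribute so it can be swapped for another wrapper
    # (e.g. AzureChatOpenAI) without touching the pipeline logic.
    llm: ChatOpenAI = ChatOpenAI.withx(model="gpt-3.5-turbo")

    def run(self, question: str) -> str:
        # The LLM wrapper returns a response object; .text holds the reply.
        return self.llm(question).text

Once defined, the pipeline can be exercised directly, e.g. Pipeline().run("What is kotaemon?"); the wiki page linked above lists the utilities that can be composed inside run().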

Contribute