pin llama-cpp-python to 0.2.55 due to https://github.com/abetlen/llama-cpp-python/issues/1288
@@ -28,7 +28,7 @@ call :activate_environment
 
 @rem install dependencies
 @rem ver 0.2.56 produces segment error for /embeddings on MacOS
-call python -m pip install llama-cpp-python[server]!=0.2.56
+call python -m pip install llama-cpp-python[server]==0.2.55
 
 @REM @rem start the server with passed params
 call python -m llama_cpp.server %*
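Since the batch script does not check pip's exit code, the pinned version could silently fail to install. A minimal guard (not part of this commit) could follow the install step to fail fast in that case; the variable name and parsing below are illustrative only:

@rem Sketch: verify the installed llama-cpp-python version matches the pin above.
for /f "tokens=2" %%v in ('python -m pip show llama-cpp-python ^| findstr /b "Version:"') do set LLAMA_CPP_VER=%%v
if not "%LLAMA_CPP_VER%"=="0.2.55" (
    echo Expected llama-cpp-python 0.2.55 but found %LLAMA_CPP_VER%
    exit /b 1
)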