llama-cpp-python/llama_cpp/server
Lucas Doyle b9098b0ef7 llama_cpp server: prompt is a string
Not sure why this union type was here, but looking at llama.py, prompt is only ever processed as a string for completion.

This was breaking types when generating an OpenAPI client.
2023-05-02 14:47:07 -07:00
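
The change narrows the server's `prompt` field from a union type to a plain string. Below is a minimal sketch of that kind of change, assuming a pydantic request model like the ones FastAPI servers typically use; the class names, field default, and description are illustrative, not taken from the actual diff.

```python
from typing import List, Union

from pydantic import BaseModel, Field


class CreateCompletionRequestBefore(BaseModel):
    # A Union field renders as `anyOf` in the generated OpenAPI schema,
    # which some client generators translate into awkward or broken types.
    prompt: Union[str, List[str]] = Field(
        default="", description="The prompt to generate completions for."
    )


class CreateCompletionRequestAfter(BaseModel):
    # Since llama.py only ever treats the completion prompt as a string,
    # declaring it as `str` keeps the schema (and generated clients) simple.
    prompt: str = Field(
        default="", description="The prompt to generate completions for."
    )
```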