Files
llama-cpp-python/llama_cpp
Lucas Doyle b9098b0ef7 llama_cpp server: prompt is a string
Not sure why this union type was here, but looking at llama.py, the prompt is only ever processed as a string for completion.

This was breaking the generated types when producing an OpenAPI client.
2023-05-02 14:47:07 -07:00
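A minimal sketch of the kind of change the commit describes, assuming the server declares its completion request as a pydantic model (the model and field names here are illustrative, not a verbatim copy of the repository's code):

```python
# Illustrative sketch only: shows narrowing a union-typed prompt field to a
# plain string in a pydantic request model, as the commit message describes.
from pydantic import BaseModel, Field


class CreateCompletionRequest(BaseModel):
    # Before (assumed): prompt: Union[str, List[str]] = Field(...)
    # After: a single concrete string type, matching how llama.py actually
    # consumes the prompt, so the generated OpenAPI schema is unambiguous.
    prompt: str = Field(
        default="",
        description="The prompt to generate completions for.",
    )
    max_tokens: int = 16  # other request fields omitted for brevity
```

A union field typically becomes a `oneOf`/`anyOf` construct in the generated OpenAPI schema, which some client generators translate poorly; collapsing it to one concrete type keeps the generated client types clean.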