llama-cpp-python/llama_cpp
Lucas Doyle dbbfc4ba2f llama_cpp server: fix to ChatCompletionRequestMessage
When I generate a client, it breaks because it fails to process the schema of `ChatCompletionRequestMessage`.

These changes fix that:
- I think `Union[Literal["user"], Literal["channel"], ...]` is the same as `Literal["user", "channel", ...]`
- Turns out the default value `Literal["user"]` isn't JSON serializable, so replace it with `"user"`
2023-05-01 15:38:19 -07:00
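A minimal sketch of the two claims in the commit message, using plain `typing` and `json` (the role values `"user"` and `"channel"` are taken from the message itself; the model class the commit actually touches is not reproduced here):

```python
import json
from typing import Literal, Union, get_args

# PEP 586: Literal["user", "channel"] is equivalent to
# Union[Literal["user"], Literal["channel"]].
RoleUnion = Union[Literal["user"], Literal["channel"]]
RoleFlat = Literal["user", "channel"]

# Flatten the Union's Literal members and compare with the single Literal.
union_values = tuple(v for lit in get_args(RoleUnion) for v in get_args(lit))
assert union_values == get_args(RoleFlat)

# A Literal type object is not itself a JSON value, so it cannot serve as
# a serializable default; the plain string "user" can.
try:
    json.dumps(Literal["user"])
except TypeError:
    pass  # typing constructs are not JSON serializable
assert json.dumps({"role": "user"}) == '{"role": "user"}'
```

This is why schema generators choke on the original annotation: the flattened `Literal` form and a plain-string default map cleanly onto a JSON Schema `enum` with a `default`, while a `Literal` type object as the default value does not.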