llama-cpp-python/llama_cpp
Lucas Doyle a5aa6c1478 llama_cpp server: add missing top_k param to CreateChatCompletionRequest
`llama.create_chat_completion` definitely has a `top_k` argument, but it's missing from `CreateChatCompletionRequest`. Decision: add it.
2023-05-01 15:38:19 -07:00
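For context, the fix amounts to declaring the missing field on the server's request model so FastAPI validates it and the handler can forward it to `llama.create_chat_completion`. A minimal sketch, assuming the model is a pydantic `BaseModel` as in `llama_cpp/server/app.py`; every field other than `top_k` is abridged or illustrative, and the default of 40 is assumed to mirror `llama.create_chat_completion`'s own `top_k` default:

```python
from typing import Any, Dict, List, Optional

from pydantic import BaseModel, Field


class CreateChatCompletionRequest(BaseModel):
    """Abridged sketch of the /v1/chat/completions request body."""

    # Chat history; the real model uses a typed message schema.
    messages: List[Dict[str, Any]]
    temperature: float = 0.8
    top_p: float = 0.95
    # The parameter this commit adds: llama.create_chat_completion already
    # accepts top_k, so the request model now declares it as well and the
    # endpoint handler can pass it straight through.
    top_k: int = Field(
        default=40,
        ge=0,
        description="Sample from the k most likely next tokens.",
    )
    stream: bool = False
    stop: Optional[List[str]] = None
```

With the field declared, the handler can forward the validated value along with the rest of the request (e.g. via pydantic's `request.dict()`); without the declaration, a client-supplied `top_k` was silently dropped.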