🦙 Python Bindings for llama.cpp

Simple Python bindings for @ggerganov's llama.cpp library. This package provides:

  • Low-level access to the C API via a ctypes interface (see the sketch after this list).
  • High-level Python API for text completion
    • OpenAI-like API
    • LangChain compatibility
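
A minimal sketch of the low-level path, assuming the ctypes bindings mirror the llama.h functions of the same names (llama_context_default_params, llama_init_from_file, llama_n_ctx, llama_free); exact signatures may differ between versions, and the model path below is only a placeholder:

import llama_cpp

# Assumption: the low-level module mirrors llama.h; names/signatures may vary by version.
params = llama_cpp.llama_context_default_params()
ctx = llama_cpp.llama_init_from_file(b"models/7B/...", params)  # placeholder path, passed as bytes
print(llama_cpp.llama_n_ctx(ctx))  # context length of the loaded model
llama_cpp.llama_free(ctx)          # release the context when done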

Installation

Install from PyPI:

pip install llama-cpp-python

Usage

>>> from llama_cpp import Llama
>>> llm = Llama(model_path="models/7B/...")
>>> output = llm("Q: Name the planets in the solar system? A: ", max_tokens=32, stop=["Q:", "\n"], echo=True)
>>> print(output)
{
  "id": "cmpl-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "object": "text_completion",
  "created": 1679561337,
  "model": "models/7B/...",
  "choices": [
    {
      "text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
      "index": 0,
      "logprobs": None,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 14,
    "completion_tokens": 28,
    "total_tokens": 42
  }
}
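
Because the response follows OpenAI's completion format, the generated text can be read straight out of the "choices" list shown above:

>>> output["choices"][0]["text"]
'Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.'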

License

This project is licensed under the terms of the MIT license.
