update readme with PyTorch inference engine and llama.cpp link to issue

This commit is contained in:
Alex Cheema
2024-10-01 03:05:19 +04:00
parent 5b8b17dc7a
commit 0120891c35


@@ -224,11 +224,11 @@ exo supports the following inference engines:
 - ✅ [MLX](exo/inference/mlx/sharded_inference_engine.py)
 - ✅ [tinygrad](exo/inference/tinygrad/inference.py)
-- 🚧 [llama.cpp](TODO)
+- 🚧 [PyTorch](https://github.com/exo-explore/exo/pull/139)
+- 🚧 [llama.cpp](https://github.com/exo-explore/exo/issues/167)
 ## Networking Modules
 - ✅ [GRPC](exo/networking/grpc)
 - 🚧 [Radio](TODO)
 - 🚧 [Bluetooth](TODO)