Mirror of https://github.com/pmeier/light-the-torch.git, synced 2024-09-08 23:29:28 +03:00
update README (#144)
@@ -42,7 +42,7 @@ package indices, you can still use `pip install`, but some
 [additional options](https://pytorch.org/get-started/locally/) are needed:
 
 ```shell
-pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
+pip install torch --index-url https://download.pytorch.org/whl/cu118
 ```
 
 [^1]:
@@ -51,14 +51,11 @@ pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
 
 While this is certainly an improvement, it still has a few downsides:
 
-1. You need to know what computation backend, e.g. CUDA 11.3 (`cu113`), is supported on
+1. You need to know what computation backend, e.g. CUDA 11.8 (`cu118`), is supported on
    your local machine. This can be quite challenging for new users and at least tedious
    for more experienced ones.
 2. Besides the stable binaries, PyTorch also offers nightly and test ones. To install
-   them, you need a different `--extra-index-url` for each.
-3. For the nightly and test channel you also need to supply the `--pre` option. Failing
-   to do so, will pull the stable binary from PyPI even if the rest of the installation
-   command is correct.
+   them, you need a different `--index-url` for each.
 
 If any of these points don't sound appealing to you, and you just want to have the same
 user experience as `pip install` for PyTorch distributions, `light-the-torch` was made
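To make the remaining downsides concrete: each PyTorch channel lives on its own index, so plain `pip` needs a different URL per channel. A small sketch; the exact URLs are assumptions based on the official instructions at https://pytorch.org/get-started/locally/ and may change:

```shell
# Stable CUDA 11.8 binaries (index URL assumed from the official instructions)
pip install torch --index-url https://download.pytorch.org/whl/cu118
# Nightly binaries live on a separate index and are pre-releases, so plain pip
# typically also needs --pre (URL assumed)
pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cu118
```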
@@ -96,11 +93,11 @@ In fact, `ltt` is `pip` with a few added options:
   the computation backend you want to use:
 
   ```shell
-  ltt install --pytorch-computation-backend=cu102 torch torchvision torchaudio
+  ltt install --pytorch-computation-backend=cu121 torch torchvision torchaudio
   ```
 
   Borrowing from the mutex packages that PyTorch provides for `conda` installations,
-  `--cpuonly` is available as shorthand for `--pytorch-computation-backend=cu102`.
+  `--cpuonly` is available as shorthand for `--pytorch-computation-backend=cpu`.
 
   In addition, the computation backend to be installed can also be set through the
   `LTT_PYTORCH_COMPUTATION_BACKEND` environment variable. It will only be honored in
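As a reading aid for the hunk above, a minimal sketch of the two ways to pick the computation backend. The per-command environment-variable syntax is plain POSIX shell; the note about precedence follows the (truncated) sentence above and is otherwise an assumption:

```shell
# Explicit CLI option
ltt install --pytorch-computation-backend=cu121 torch torchvision torchaudio
# Environment variable; per the text above it is presumably only honored
# when no CLI option is given
LTT_PYTORCH_COMPUTATION_BACKEND=cpu ltt install torch torchvision torchaudio
```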
@@ -113,8 +110,8 @@ In fact, `ltt` is `pip` with a few added options:
   ltt install --pytorch-channel=nightly torch torchvision torchaudio
   ```
 
-  If `--pytorch-channel` is not passed, using `pip`'s builtin `--pre` option will
-  install PyTorch test binaries.
+  If `--pytorch-channel` is not passed, using `pip`'s builtin `--pre` option implies
+  `--pytorch-channel=test`.
 
 Of course, you are not limited to install only PyTorch distributions. Everything shown
 above also works if you install packages that depend on PyTorch:
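A short sketch of what the rewritten sentence means in practice; the equivalence is inferred from the wording above, not separately verified:

```shell
# With the updated behavior, these two commands should both select the test channel
ltt install --pytorch-channel=test torch
ltt install --pre torch
```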
@@ -133,8 +130,8 @@ specific tasks.
 
 - While searching for a download link for a PyTorch distribution, `light-the-torch`
   replaces the default search index with an official PyTorch download link. This is
-  equivalent to calling `pip install` with the `--extra-index-url` option only for
-  PyTorch distributions.
+  equivalent to calling `pip install` with the `--index-url` option only for PyTorch
+  distributions.
 - While evaluating possible PyTorch installation candidates, `light-the-torch` culls
   binaries incompatible with the hardware.
 
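Roughly, the first bullet says that `ltt install` swaps in the PyTorch index only while resolving PyTorch distributions, which for the PyTorch packages themselves is comparable to a plain `pip` call with `--index-url`. A sketch; the backend and URL below are illustrative assumptions:

```shell
# light-the-torch detects the backend and patches the index for PyTorch packages only ...
ltt install torch
# ... which, for torch itself, is roughly equivalent to
pip install torch --index-url https://download.pytorch.org/whl/cu118
```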
@@ -144,16 +141,18 @@ A project as large as PyTorch is attractive for malicious actors given the large
 base. For example in December 2022, PyTorch was hit by a
 [supply chain attack](https://pytorch.org/blog/compromised-nightly-dependency/) that
 potentially extracted user information. The PyTorch team mitigated the attack as soon as
-it was detected by temporarily hosting all third party dependencies for the nightly
-Linux releases on their own indices. With that,
+it was detected by temporarily hosting all third party dependencies on their own
+indices. With that,
 `pip install torch --extra-index-url https://download.pytorch.org/whl/cpu` wouldn't pull
-anything from PyPI and thus avoiding malicious packages placed there.
+anything from PyPI and thus avoiding malicious packages placed there. Ultimately, this
+became the permanent solution and the official installation instructions now use
+`--index-url` and thus preventing installing anything not hosted on their indices.
 
-However, due to `light-the-torch`'s index patching, this mitigation would have been
+However, due to `light-the-torch`'s index patching, this mitigation was initially
 completely circumvented since only PyTorch distributions would have been installed from
 the PyTorch indices. Since version `0.7.0`, `light-the-torch` will only pull third-party
-dependencies for nightly Linux PyTorch releases from PyPI in case they are specifically
-requested and pinned. For example `ltt install --pytorch-channel=nightly torch` and
+dependencies from PyPI in case they are specifically requested and pinned. For example
+`ltt install --pytorch-channel=nightly torch` and
 `ltt install --pytorch-channel=nightly torch sympy` will install everything from the
 PyTorch indices. However, if you pin a third party dependency, e.g.
 `ltt install --pytorch-channel=nightly torch sympy==1.11.1`, it will be pulled from PyPI
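The pinning behavior described above, condensed into one sketch (assuming `light-the-torch` >= `0.7.0`):

```shell
ltt install --pytorch-channel=nightly torch                # everything from the PyTorch indices
ltt install --pytorch-channel=nightly torch sympy          # unpinned third-party dep: still the PyTorch indices
ltt install --pytorch-channel=nightly torch sympy==1.11.1  # pinned third-party dep: sympy is pulled from PyPI
```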
@@ -162,7 +161,8 @@ regardless of whether the version matches the one on the PyTorch index.
 In summary, `light-the-torch` is usually as safe as the regular PyTorch installation
 instructions. However, attacks on the supply chain can lead to situations where
 `light-the-torch` circumvents mitigations done by the PyTorch team. Unfortunately,
-`light-the-torch` is not officially supported and thus also not tested by them.
+`light-the-torch` is not officially supported by PyTorch and thus also not tested by
+them.
 
 ## How do I contribute?
 
@@ -123,9 +123,9 @@ class LttOptions:
             "--pytorch-computation-backend",
             help=(
                 "Computation backend for compiled PyTorch distributions, "
-                "e.g. 'cu102', 'cu115', or 'cpu'. "
+                "e.g. 'cu118', 'cu121', or 'cpu'. "
                 "Multiple computation backends can be passed as a comma-separated "
-                "list, e.g 'cu102,cu113,cu116'. "
+                "list, e.g 'cu118,cu121'. "
                 "If not specified, the computation backend is detected from the "
                 "available hardware, preferring CUDA over CPU."
             ),
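Following the updated help text, several acceptable backends can be passed as a comma-separated list; a brief sketch:

```shell
# Per the help text above, multiple backends can be passed as a comma-separated list
ltt install --pytorch-computation-backend=cu118,cu121 torch
```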