Hi, I’m managing a Lambda Workstation with RTX 3090s and a fully up-to-date Lambda Stack. This is a shared system, so I’d like to let users install and manage their own conda environments. The basic expectation is that this would work:
$ conda create -n pytorch python=3.9
$ conda activate pytorch
(pytorch) $ conda install -c pytorch pytorch torchaudio torchvision
However, the version of PyTorch installed this way, which appears to be the same build distributed via pip, is seemingly incompatible with the RTX 3090. It even prints an explicit error about using PyTorch with the RTX 3090 and points to the build-from-source instructions.
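For what it’s worth, here is how I’ve been sanity-checking whether a given PyTorch build actually supports the 3090 (compute capability 8.6, i.e. sm_86). `torch.cuda.get_arch_list` and `torch.cuda.get_device_capability` are real torch APIs; the interpretation in the comments is my understanding, not something from Lambda’s docs:

```shell
# Run inside the activated conda env.
# List the GPU architectures the installed PyTorch binary was compiled for:
python -c "import torch; print(torch.cuda.get_arch_list())"
# The RTX 3090 is compute capability 8.6, so the list should include
# 'sm_86'. If it tops out at sm_75 or earlier, the build predates
# Ampere support, which would explain the error I'm seeing.

# Confirm what the driver reports for the card itself:
python -c "import torch; print(torch.cuda.get_device_capability(0))"
```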
So, my question to those of you who use conda: how are you installing PyTorch?
(I believe my case is different from the one mentioned here, since that post seems to refer to the system-wide Lambda Stack Python/PyTorch. I’ve just tested that, and it works as expected on the pytorch/mnist example script.)