So I just did an update, and all my virtualenvs are now broken. I have read this, which has gone some way toward helping me understand the issue. But I never installed anything that matters to me outside of isolated virtualenvs (e.g. with the system-wide pip/python). And the fix in my case is pretty easy: I can just re-create all my virtualenvs with the new interpreter.
What I am really looking for is: how do I change the way I create my virtualenvs so that system-wide updates to the Python interpreter (e.g. to Python 3.11 next year, say) stop messing with them?
base interpreter (from which I created the virtualenv originally): /usr/bin/python
Originally, I used the system interpreter to create the virtualenv. My understanding was that the base interpreter only matters for creating the original structure (e.g. /myproject/venv/*); once that is done, /myproject/venv/bin/python is the one actually doing the job, and all the packages installed with pip (/myproject/venv/bin/pip) are installed within the venv. Thus everything should be isolated from the system at this point.
But if I run ls -la /myproject/venv/bin/ in my (broken) project, I see that python -> /usr/bin/python, so it's a symlink… to the system-wide Python, which is now 3.10. So I'm guessing this is the root cause of my issues. But how do I avoid this next time?
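To make the symptom and the quick fix concrete, here is a sketch (the paths are the ones above; the requirements.txt is assumed to exist):

    # What the broken venv looks like after the upgrade:
    ls -la /myproject/venv/bin/python
    #   python -> /usr/bin/python      <- symlink to the system interpreter
    cat /myproject/venv/pyvenv.cfg
    #   home = /usr/bin                <- the stdlib is also resolved via the base install

    # One-off repair: recreate the env with the new interpreter and reinstall packages
    # (--clear wipes the old contents first).
    python3.10 -m venv --clear /myproject/venv
    /myproject/venv/bin/pip install -r requirements.txt

Note that even python -m venv --copies only copies the interpreter binary; the standard library still comes from the base installation recorded in pyvenv.cfg, so a system upgrade would still break the env. Avoiding that means basing the venv on a non-system interpreter, which is what the rest of this thread is about.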
You created a virtual environment; so far so good. But you correctly identified that the actual Python interpreter will be the system one. This is intended behavior.
To have different Python interpreters, I strongly recommend pyenv (or Docker).
Yeah, I thought about managing that with Docker. In fact, I already manage node/npm (for JS) using Docker containers as a simple "environment wrapper" that manages the dependencies (then bind mount the host project directory to get live reload etc.). I guess I could do the same with Python. That being said, it does get a bit heavy to add that layer to the whole process. I liked using virtualenv directly, since with IntelliJ I can basically just let my IDE handle it and not really care much about any of it.
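For what it's worth, the Python equivalent of that node/npm wrapper can be as small as a single docker run; the image tag, requirements.txt and main.py below are just placeholders, not from the original setup:

    # Bind-mount the project into an official Python image and run it there.
    docker run --rm -it \
      -v "$PWD":/app -w /app \
      python:3.10-slim \
      bash -c "pip install -r requirements.txt && python main.py"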
Maybe pyenv is closer to what I actually want/need. I could probably use it to create a few config templates to re-use across projects.
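A rough sketch of what that per-project setup could look like with plain pyenv (the version number is an example, and it assumes pyenv's shims are initialised in the shell):

    pyenv install 3.9.5      # compiles a private interpreter under ~/.pyenv/versions/3.9.5
    cd /myproject
    pyenv local 3.9.5        # writes a .python-version file for this project
    python -m venv venv      # now uses the pyenv-built 3.9.5, not /usr/bin/python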
It copies /usr/bin/pythonX.Y, but maybe it still uses the system's current /usr/lib/pythonX.Y in place, without copying. I have a vague memory that I tried it a year ago, and that's what I saw.
Last June I managed to build CPython 3.9.5 so that it was totally separate from system Python. I tried to do it again last week with 3.10.2; I followed the official directions carefully, but failed. I'm retired and have only a few hobby projects, so for me it doesn't really matter.
Oh, and then I could base the virtualenvs on those instead, and they're just my programs, so unaffected by upgrades. Yes, I think this is a good approach.
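A minimal sketch of that kind of build, assuming the source tarball is already unpacked and using an arbitrary private prefix:

    cd Python-3.9.5
    ./configure --prefix="$HOME/opt/python-3.9.5"
    make -j"$(nproc)"
    make install     # installs only under the chosen prefix, nothing under /usr

    # Base a virtualenv on the private build instead of the system interpreter:
    "$HOME/opt/python-3.9.5/bin/python3" -m venv /myproject/venv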
I have been investigating, as I didn't know which to try. Here's some info:
EDIT1:
Pyenv builds (compiles) a completely separate Python interpreter for each version of Python that you install with it, in a non-system location. It does not link to system Python libraries.
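One way to check this for yourself is to ask a pyenv-built interpreter where its prefix and standard library live (the version number is an example):

    ~/.pyenv/versions/3.10.2/bin/python -c "import sys, sysconfig; print(sys.prefix); print(sysconfig.get_paths()['stdlib'])"
    # both paths should point under ~/.pyenv/versions/3.10.2/, not /usr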
EDIT2:
With pyenv you can install as many versions of the Python interpreter as you want, and for each of those versions you can make as many environments as you want. Pyenv is well integrated with virtualenv through pyenv-virtualenv, so consider using pyenv-virtualenv to make these environments, rather than venv.
pyenv:
is made from pure shell scripts (no dependency on Python);
lets you change the "global" Python version, but on a per-user basis, so the operating system still uses system Python in the normal way;
provides support for per-project Python versions.
pyenv-virtualenv:
is a pyenv plugin that provides features to manage virtualenvs.
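Putting the two together, a typical per-project workflow might look like this (names and versions are examples; auto-activation assumes pyenv-virtualenv's shell hook is enabled):

    pyenv install 3.10.2
    pyenv virtualenv 3.10.2 myproject-3.10.2   # env based on the pyenv-built 3.10.2
    cd /myproject
    pyenv local myproject-3.10.2               # auto-activates in this directory
    pip install -r requirements.txt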
This article favors pyenv, and explains why very well. The author decides against using Anaconda and Miniconda because they are both very big, and also because he thinks it would become confusing to have 2 repositories and 2 methods of installing packages (pip and conda).