How best to manage Python interpreters for stability (dev)?


So I just did an update, and all my virtualenvs are now broken. I have read this, which has gotten me some way towards understanding the issues. But I never installed anything outside of isolated virtualenvs that matters to me (e.g. on the system-wide pip/python). And the fix in my case is pretty easy: I can just re-create all my virtualenvs with the new interpreter.

What I am really looking for is this: how do I change the way I create my virtualenvs so that system-wide updates to the Python interpreter (e.g. to Python 3.11 next year, say) don't break them?

So, my setup is this:

  • python (3.9 originally): /myproject/venv/bin/python
  • packages: /myproject/venv/<project packages, python interpreter, etc.>
  • pip: /myproject/venv/bin/pip
  • base interpreter (from which I created the virtual env originally): /usr/bin/python

Originally, I used the system interpreter to create the virtualenv. My understanding was that the base interpreter only matters to, well, create the original structure (e.g. /myproject/venv/*); once that is done, /myproject/venv/bin/python is the one actually doing the job, and all the packages installed with pip (/myproject/venv/bin/pip) are installed within the venv. Thus everything should be isolated from the system at this point.

But if I run ls -la /myproject/venv/bin/ in my (broken) project, I see that python -> /usr/bin/python, so it's a symlink… to the system-wide Python, which is now 3.10. I'm guessing this is the root cause of my issues. But how do I avoid this next time?
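On Linux you can reproduce this symlink behavior with a throwaway environment (a minimal check using a temporary path; the /myproject/venv layout above behaves the same way):

```shell
# By default, `python3 -m venv` symlinks bin/python to the base
# interpreter rather than copying it.
python3 -m venv /tmp/demo-venv
ls -la /tmp/demo-venv/bin/python        # shows a symlink
readlink -f /tmp/demo-venv/bin/python   # resolves to the real binary
```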


You created a virtual environment, so far so good, and you correctly identified that the actual Python interpreter will be the system one. This is intended behavior.

To have different Python interpreters, I strongly recommend pyenv (or Docker).


Yeah, I thought about managing that with Docker - in fact, I already manage node/npm (for js) using Docker containers as a simple "environment wrapper" that manages the dependencies (then bind mount to the host project directory to get live reload etc.). I guess I could do the same with Python. That being said, it does get a bit heavy to add that layer to the whole process. I liked using virtualenv directly since, with IntelliJ, I can basically just let my IDE handle it and not really care much about any of it.

Maybe pyenv is closer to what I actually want/need. I could probably use it to create a few config templates to re-use across projects.
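For reference, the pyenv workflow would look roughly like this (a sketch, assuming pyenv and the pyenv-virtualenv plugin are installed and initialized in your shell; the version number and env name are illustrative):

```shell
# Build a standalone interpreter, independent of the system Python
pyenv install 3.9.18

# Create a named environment based on that interpreter
pyenv virtualenv 3.9.18 myproject-env

# Pin it to the project: writes .python-version, auto-activates in this dir
cd /myproject
pyenv local myproject-env
```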


When you say “virtualenv”, do you mean actual virtualenv,
or is it shorthand for “virtual environment” generically?

I tried the built-in “venv”, as it seems easier to understand.
Does the option ‘--copies’ do what you want?

It copies '/usr/bin/pythonX.Y', but maybe it still uses
the system's current '/usr/lib/pythonX.Y/' in place, without copying.
I have a vague memory that I tried it a year ago, and that’s what I saw.
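A minimal check of that flag (throwaway path, on Linux):

```shell
# With --copies, venv copies the interpreter binary instead of symlinking
# it, so a later system Python upgrade can't swap it out from under the env.
# The standard library under /usr/lib/pythonX.Y is still the system's, though.
python3 -m venv --copies /tmp/copies-venv
ls -la /tmp/copies-venv/bin/python   # a regular file, not a symlink
```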

Last June I managed to build CPython 3.9.5
so that it was totally separate from system Python.
Tried to do it again last week with 3.10.2,
following the official directions carefully, but failed.

I’m retired and have only a few hobby projects, so for me it doesn’t really matter.


Can you use (the python of) anaconda as default?


This is essentially the same as using pyenv:

Installing a Python interpreter of the version of your choice, exclusively for your user, without modifying any system package.


Using pip, no intention of changing package manager for that, to be honest!

Oh, and then I could base the virtualenvs on those instead, and since they're just my programs they're unaffected by system upgrades. Yes, I think this is a good approach.
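Concretely, that would mean creating the env from a pyenv-built interpreter instead of /usr/bin/python (a sketch; pyenv keeps its builds under ~/.pyenv/versions/, and the version number here is illustrative):

```shell
# Base the env on a user-local interpreter that system upgrades won't touch
~/.pyenv/versions/3.9.18/bin/python -m venv /myproject/venv
```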

Well, I’m using the IDE’s wrapper for that, but yes it is actually virtualenv.

A question: can I install conda from the AUR and then use conda to manage my system-default Python?

I don’t recommend modifying the system-default Python.

With Pyenv, you can also install Anaconda/miniconda, and set your preferred version.


Have been investigating, as I didn’t know which to try.
Here’s some info:

Pyenv builds (compiles) a completely separate Python interpreter
for each version of Python that you install with it, in any non-system location.
It does not link to system python libraries.

With pyenv you can create as many versions of Python interpreter as you want,
and for each of those versions you can make as many environments as you want.
Pyenv is well integrated with 'virtualenv' through pyenv-virtualenv, so consider
using 'pyenv-virtualenv' to make these environments, rather than 'venv'.

virtualenv :
creates an environment that

  1. has its own installation directories,
  2. does not share libraries with other virtualenv environments
  3. optionally does not access system python libraries

The built-in 'venv' actually covers that 3rd point too: isolation from system site-packages is its default, and '--system-site-packages' opts back in.
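Whichever tool creates the environment, you can check its isolation setting in the env's pyvenv.cfg (a quick check using a throwaway path):

```shell
# The env's root contains pyvenv.cfg; the system-site-packages setting
# is recorded there alongside the base interpreter's location.
python3 -m venv /tmp/check-venv
grep include-system-site-packages /tmp/check-venv/pyvenv.cfg
```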

pyenv :

  • Is made from pure shell scripts (no dependency on python)
  • Lets you change the “global” Python version, but on a per-user basis,
    so the operating system still uses “system python” in the normal way.
  • Provides support for per-project Python versions.
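The per-user and per-project selection above maps onto a few pyenv commands (a sketch, assuming pyenv is installed; version numbers are illustrative):

```shell
pyenv global 3.10.2   # default for your user's shells; the OS is untouched
pyenv local 3.9.18    # writes ./.python-version for this project only
pyenv version         # shows which version is active here, and why
```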

pyenv-virtualenv :
is a pyenv plugin that provides features to manage virtualenvs

This article favors pyenv, and explains why very well.
The author decides against Anaconda and Miniconda because they are both very big,
and also because he thinks it would become confusing to have
two repositories and two methods of installing packages (pip and conda).


Wow, I also had the same question. Thanks for the thread, it helped a lot.

This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.