Can I ask a dumb question? Why does Ruby (for example) not have this problem, while Python still can't ship a standard solution that isn't constantly changing or rolled up in some corporate offering?
Python packaging for pure-Python modules has never been a problem. When people say they hate Python packaging, they are usually talking about the difficulty of installing dependencies without much thinking.
But the biggest reason that breaks down is the dependencies that have to be compiled, which brings its own problems.
Have you ever had a C dependency in Node or Ruby on a system that wasn't the one it was built on? Turns out it sucks in all the languages. It's just that the number of C-level packages in Python is considerably larger than, say, Ruby's, so the likelihood of hitting a problem is significantly higher.
Ruby is mostly used in web dev, where most if not all of your dependencies tend to be pure Ruby.
Python is used heavily for DS/ML/AI, which is exactly the area where native code packages are necessary and prevalent. Worse yet is that those packages often involve GPU code, and things like CUDA bring their own complications.
If you're writing web apps in Python, dependencies haven't really been a problem for a long time now.
The reason Python is a mess is that it became the default for academics and data scientists who have no real interest in code quality or maintainability, and the rot just spread from there.
I've done a few Ruby/C and Python/C bindings; the APIs are worlds apart. I'd say Ruby's is like afternoon cocktails with a cultivated Swedish philosopher, while Python's is the aftermath of a bar-room brawl where broken bottles were used.
> Ruby is mostly used in web dev, where most if not all of your dependencies tend to be pure Ruby.
There is literally no such thing as a Rails app that’s pure Ruby. Rails depends on Nokogiri, which is a libxml2 wrapper, and all ActiveRecord database adapters are C bindings. Ruby development involves frequent native extension compilation, just like Python.
Those are all fine and easy dependencies to install in Python too, though: lxml (a third-party libxml2 wrapper, not part of the standard library) ships prebuilt wheels, and database adapters like psycopg publish binary packages that bundle what they need.
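As an aside, it's easy to see at runtime whether a given module is pure Python or a compiled extension: CPython loads extensions from files with platform-specific suffixes (.so, .pyd), which `importlib.machinery` exposes. A minimal stdlib-only sketch (the modules imported here are just examples):

```python
# Sketch: distinguishing pure-Python modules from compiled extension
# modules at runtime, using only the standard library.
import importlib.machinery
import json  # a pure-Python stdlib package

# Suffixes like ".so" / ".pyd" that mark compiled extension modules
# on the current platform.
EXT_SUFFIXES = tuple(importlib.machinery.EXTENSION_SUFFIXES)

def is_native(module):
    """True if the module was loaded from a compiled extension file."""
    origin = getattr(module, "__file__", None)
    return bool(origin) and origin.endswith(EXT_SUFFIXES)

print(is_native(json))  # False: json/__init__.py is pure Python
```

Note that modules compiled statically into the interpreter have no `__file__` at all, so this check only catches dynamically loaded extensions.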
Where it breaks down in Python is when the dependencies are enormous source-code projects with big dependency trees of their own, which you can't reasonably bundle into your own project. Conda aimed to solve that by distributing the native libraries themselves, cross-platform.

In the old Python days you'd run 'pip install X' and it would fail because it expected the underlying C library to already be installed on your system and discoverable via PATH/LD_LIBRARY_PATH. Then wheels (Python packages that can bundle compiled libraries rather than just source code) came along, and people started publishing lowest-common-denominator builds that most people can use, at non-optimal performance. That's roughly where we still are today.
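Those "lowest-common-denominator" builds are visible right in the wheel filename: the last three dash-separated fields encode which interpreter, ABI, and platform the wheel supports, and a `manylinux_*` platform tag targets an old baseline glibc so the wheel installs almost anywhere. A rough sketch of reading those tags (the filenames below are made up for illustration; real resolution is done by pip via the `packaging` library):

```python
# Sketch: pulling apart the compatibility tags in a wheel filename.
# Format: {name}-{version}(-{build})?-{python}-{abi}-{platform}.whl
def wheel_tags(filename):
    stem = filename[: -len(".whl")]
    # The last three dash-separated fields are always the tags;
    # everything before them is name/version (and optional build number).
    *_, python_tag, abi_tag, platform_tag = stem.split("-")
    return python_tag, abi_tag, platform_tag

# Hypothetical filenames in the style real projects publish:
print(wheel_tags("numpy-1.26.0-cp311-cp311-manylinux_2_17_x86_64.whl"))
# ('cp311', 'cp311', 'manylinux_2_17_x86_64') -> CPython 3.11, baseline Linux
print(wheel_tags("somepkg-1.0-py3-none-any.whl"))
# ('py3', 'none', 'any') -> pure Python, runs anywhere
```

The `py3-none-any` case is why pure-Python packages were never the problem: one wheel works everywhere, no compiler involved.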
Before you answer that, you have to answer what problem this solves that PyPI doesn’t already address. uv works great against “legacy” package indexes, so I’m not really clear why it’s needed, other than to introduce lock-in to a for-profit facility.
Because CPython and PyPA are dysfunctional organizations in the hands of people who are in the right (often corporate) cliques. Don't expect anything from there.