#conda


#Conda-forge work be like:

You wait 15 hours for a build to finish. It times out after 15 hours, just barely before it finishes running the test suites.

You increase the timeout in conda-forge.yml. You wait 15 hours again. The build times out again.

This time you remember to rerender. Let's hope it finally finishes successfully…
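For reference, the fix itself is a one-line config change plus the step that's easy to forget. A minimal sketch, assuming Azure CI (the key name is from the conda-forge docs; check your feedstock's CI provider):

    # conda-forge.yml
    azure:
      timeout_minutes: 360

    # the change only takes effect after regenerating the CI configs:
    conda-smithy rerender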


#HDF5 is doing great. So basically:

1. Originally, upstream used autotools. That build system installed an h5cc wrapper which, besides being a compiler wrapper, had a few config-tool style options (see the sketch after this list).
2. Then, upstream added a #CMake build system as an alternative. It installed a different h5cc wrapper that no longer had the config-tool style options.
3. Downstreams that tried CMake quickly discovered that the new wrapper broke a lot of packages, so they reverted to autotools and reported a bug.
4. Upstream closed the bug, handwaving it as "CMake h5cc changes have been noted in the Release.txt at the time of change - archived copy should exist in the history files."
5. Upstream announced plans to remove autotools support.
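To make the gap concrete: the autotools h5cc doubles as a crude config tool, and those extra options are what downstream build scripts rely on. A rough sketch of the usage the CMake-built wrapper no longer supports (flags as shipped by the autotools build):

    # plain compiler-wrapper use, which both variants handle:
    h5cc -o app app.c

    # config-tool style queries, autotools builds only:
    h5cc -showconfig   # print the HDF5 configuration summary
    h5cc -show         # show the underlying compile command and flags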

So, to summarize the current situation:

1. Pretty much everyone (at least #Arch, #Conda-forge, #Debian, #Fedora, #Gentoo) is building with autotools, because CMake builds cause too much breakage.
2. Downstreams originally judged this to be an HDF5 issue, so they didn't report bugs to affected packages. Not sure if they're even aware that HDF5 upstream rejected the report.
3. All packages remain "broken", and I'm guessing their authors may not even be aware of the problem, because, well, as I pointed out, everyone is still using autotools, and nobody reported the issues during initial CMake testing.
4. I'm not even sure if there is a good "fix" here. I honestly don't know the package, but it really sounds like the config-tool was removed with no replacement, so the only way forward might be for people to switch over to CMake (sigh) — which would of course break the packages almost everywhere, unless people also add fallbacks for compatibility with autotools builds.
5. The upstream's attitude suggests that HDF5 is pretty much a project unto itself, and doesn't care about its actual users.

github.com/HDFGroup/hdf5/issue

When building hdf5 with autotools, the following file is used to produce h5cc and friends: https://github.com/HDFGroup/hdf5/blob/develop/bin/h5cc.in However, when building with cmake, the following...
GitHub · h5cc is severely lacking when building hdf5 with cmake, breaking downstream users · Issue #1814 · HDFGroup/hdf5 · by BtbN
I don't get #Python package dependencies. I have two #conda envs with exactly the same Python version, packages all from #pip with major overlap. I tried installing another package in env A; after downgrading several packages, including #pandas, everything went well. But when I tried loading the newly installed package in the REPL, an error occurred complaining about the pandas version. I removed the package, deactivated env A and activated env B, and installed the same package in env B; the same pandas downgrade happened, but this time every package loaded without any issue. And this is not the first time I've seen this. Before this I used conda/mamba to manage packages, and I switched to pip for package management hoping this wouldn't happen anymore.
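If anyone hits the same wall: before trusting that two envs are interchangeable, it can help to diff what pip actually installed and let pip validate the dependency graph. A minimal sketch (env names are hypothetical):

    # record what each env really contains:
    conda activate envA && pip freeze > /tmp/envA.txt
    conda activate envB && pip freeze > /tmp/envB.txt
    diff /tmp/envA.txt /tmp/envB.txt

    # flag broken or conflicting requirements in the active env:
    pip check

    # confirm which pandas the REPL is importing, and from where:
    python -c "import pandas; print(pandas.__version__, pandas.__file__)"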

good grief, we broke our #ai #alttext script. for a while we were running the latest #pixtral #vision #model, but we wanted to try the #Microsoft #PHi4 #vision model, and we use #conda for our python virtualization management.

this is a huge mistake #lol #tech #aidev #thestruggle

i need a #venv for both pixtral and phi4 in one script's run time.

any suggestions for untangling this mess?

#python #development #fail

obviously we are running with the phi4v model, but we were testing across all the accounts when we realized we broke the production scripts.

same login, supposedly different virtual environments.

blah. perhaps this is what #uv is meant to fix?

Not really sure, I guess I could talk to #ChatGPT about it lol
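One common workaround for the "two envs in one run time" problem is to not activate either env in the driver script, and instead invoke each env's own interpreter as a subprocess. A sketch, with hypothetical interpreter paths and worker script names:

    import os
    import subprocess

    # hypothetical paths to each conda env's interpreter
    PIXTRAL_PY = os.path.expanduser("~/miniconda3/envs/pixtral/bin/python")
    PHI4_PY = os.path.expanduser("~/miniconda3/envs/phi4/bin/python")

    def alt_text(interpreter: str, worker: str, image: str) -> str:
        # each worker script runs inside its own env and prints alt text
        result = subprocess.run(
            [interpreter, worker, image],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    print(alt_text(PIXTRAL_PY, "pixtral_worker.py", "photo.jpg"))
    print(alt_text(PHI4_PY, "phi4_worker.py", "photo.jpg"))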

I think it's quite problematic on #Linux to start off with regular ol' #Python that came with the system (i.e. /usr/bin/python), and then install some packages (e.g. from the #AUR if you're on #ArchLinux) which pull in Python libraries through it. Then you start using something like #Conda or #Miniconda, and subsequent package installations or updates may land in the Conda environment (i.e. ~/miniconda3/python) rather than the system one, so the two overlap. I'm wondering what's the best way of moving forward from this point, especially since, some time ago, it stopped being possible to just raw pip install <package> for whatever reason.
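The "no raw pip install" part is most likely PEP 668: distro-managed Pythons are now marked externally managed, so pip refuses to write into them. The usual way forward is to leave both the system and Conda Pythons alone and keep personal packages in a venv. A sketch (paths hypothetical):

    # see every python on your PATH, in resolution order:
    type -a python

    # keep your own packages out of both system and conda site-packages:
    /usr/bin/python -m venv ~/.venvs/mytools
    ~/.venvs/mytools/bin/pip install <package>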

I know we like to act like space is completely free these days, but maybe we take that too far.

"Hm, I think I should -v this one tar invocation, just to see what I'm actually backing up from my homedir."

"Sure! Here's your .python directory, containing thousands of files supporting python libraries. And your .local/python directory, containing thousands of files supporting python libraries. Oh, and your .conda directory, never guess what's in there..."

"I get it."

"Do you want to know what's in your .pyenv director~"

"I. Get. It."

(ETA: 1.2GB at the end. Though most of that is just .pyenv; the others are much smaller).
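For anyone wanting the damage report without replaying the tar run, du itemizes the usual suspects (directory names as in the post):

    du -sh ~/.python ~/.local/python ~/.conda ~/.pyenv 2>/dev/null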

As part of a submission to @joss I had to explore using #conda to build #software for the first time.

I have always been reluctant to use it (I like pip for python, and much of my work is on #HPC where environment modules are king).

I wrote a short post reflecting on what I learnt and my new opinions on conda: jackatkinson.net/post/ponderin

jackatkinson.net · Pondering Conda: Some ruminations on conda following use in a recent project

Troubleshooting Python Virtual Environment Errors on Windows 11. Common causes include PATH variable issues, environment creation inconsistencies, and permission problems. Learn how to resolve these errors and improve your workflow using advanced techniques & tools like virtualenvwrapper or conda. #PythonVirtualEnvironmentError #Windows11 #Virtualenv #Conda #PythonError #Programming
tech-champion.com/microsoft-wi
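For the most common case the linked article covers, the baseline venv workflow on Windows 11 uses the standard py launcher; a minimal sketch:

    py -m venv .venv
    .venv\Scripts\activate
    python -m pip install --upgrade pip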