#Conda-forge work be like:
You wait 15 hours for a build to finish. It times out after 15 hours, just barely before it finishes running the test suites.
You increase the timeout in conda-forge.yml. You wait 15 hours again. The build times out again.
You remember to rerender this time. Let's hope it will finish successfully this time…
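For anyone hitting the same wall: the relevant knob in conda-forge.yml is the `timeout_minutes` setting under the `azure` section, and it only takes effect after a rerender. A sketch (720 is an arbitrary value; the unit is minutes):

```yaml
# conda-forge.yml — raise the Azure CI job timeout (value in minutes).
# Note: this only applies after rerendering the feedstock.
azure:
  timeout_minutes: 720
```

Rerendering (e.g. via `conda-smithy rerender` or the `@conda-forge-admin, please rerender` bot command) is what copies this setting into the generated CI configs.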
#HDF5 is doing great. So basically:
1. Originally, upstream used autotools. The build system installed an h5cc wrapper which — besides being a compiler wrapper — had a few config-tool style options.
2. Then, upstream added a #CMake build system as an alternative. It installed a different h5cc wrapper that did not have the config-tool style options anymore.
3. Downstreams that tried CMake quickly discovered that the new wrapper broke a lot of packages, so they reverted to autotools and reported a bug.
4. Upstream closed the bug, handwaving it as "CMake h5cc changes have been noted in the Release.txt at the time of change - archived copy should exist in the history files."
5. Upstream announced the plans to remove autotools support.
So, to summarize the current situation:
1. Pretty much everyone (at least #Arch, #Conda-forge, #Debian, #Fedora, #Gentoo) is building using autotools, because CMake builds cause too much breakage.
2. Downstreams originally judged this to be an HDF5 issue, so they didn't report bugs to affected packages. Not sure if they're even aware that HDF5 upstream rejected the report.
3. All packages remain "broken", and I'm guessing their authors may not even be aware of the problem, because, well, as I pointed out, everyone is still using autotools, and nobody reported the issues during initial CMake testing.
4. I'm not even sure if there is a good "fix" here. I honestly don't know the package, but it really sounds like the config-tool was removed with no replacement, so the only way forward might be for people to switch over to CMake (sigh) — which would of course break the packages almost everywhere, unless people also add fallbacks for compatibility with autotools builds.
5. The upstream's attitude suggests that HDF5 is pretty much a project unto itself, and doesn't care about its actual users.
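If you want to check which variant a given system ships, a quick probe along these lines should work (assuming the autotools-era wrapper, which supports config-tool flags like `-showconfig`, while the CMake wrapper does not):

```shell
# Probe the installed h5cc: the autotools wrapper understands -showconfig,
# the stripped-down CMake wrapper (and a missing h5cc) does not.
if h5cc -showconfig >/dev/null 2>&1; then
    echo "config-tool h5cc (autotools build)"
else
    echo "no config-tool options (CMake build, or h5cc not installed)"
fi
```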
What's better: VS Code or JupyterLab?
@qgis 3.42 has landed in #condaforge
https://anaconda.org/conda-forge/qgis
My favorite way to install #QGIS
good grief, we broke our #ai #alttext script. for a while we were running the latest #pixtral #vision #model, but we wanted to try the #Microsoft #Phi4 #vision model, and we use #conda for our python virtualization management.
this is a huge mistake #lol #tech #aidev #thestruggle
i need a #venv for both pixtral and phi4 in one script's runtime.
any suggestions for untangling this mess?
obviously we are running with the phi4v model, but we were testing across all the accounts when we realized we'd broken the production scripts.
same login, supposedly different virtual environments.
blah. perhaps this is what #uv is meant to fix?
Not really sure, I guess I could talk to #ChatGPT about it lol
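One way out of the "two incompatible envs, one script" bind, without any uv or conda tricks, is to stop importing both model stacks into one process and instead shell out to each environment's own interpreter. A minimal sketch (the conda env paths mentioned in the comment are hypothetical; the demo call uses the current interpreter so it runs anywhere):

```python
import subprocess
import sys

def run_in_env(python_bin: str, code: str) -> str:
    """Run a Python snippet under a specific interpreter and return its stdout."""
    result = subprocess.run(
        [python_bin, "-c", code],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# In the real script these would point at the two conda envs' interpreters,
# e.g. "~/miniconda3/envs/pixtral/bin/python" and "~/miniconda3/envs/phi4/bin/python"
# (hypothetical paths). Here we demo with the current interpreter:
print(run_in_env(sys.executable, "print(2 + 2)"))  # → 4
```

Each model then loads its own dependency stack in its own process, and the top-level script just coordinates over stdout (or JSON, for anything structured).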
We're looking to extend our team with a Senior Rust Engineer
Do you want to leave a lasting mark on the open source ecosystem by building awesome tools for other developers?
Apply here: https://apply.workable.com/j/12939AB951
I think it's quite problematic on #Linux to start off with regular ol' #Python that came with the system (i.e. /usr/bin/python), then install some packages (e.g. from the #AUR if you're on #ArchLinux) that pull in Python libraries against it, and then start using something like #Conda or #Miniconda, whereby subsequent package installations or updates may put those libraries into the Conda environment (i.e. ~/miniconda3/python) rather than the system one, so there's some overlap there. I'm wondering what's the best way of moving forward from this point, especially since some time ago it became impossible to do a plain pip install <package> anymore for whatever reason.
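A quick way to see which Python is actually answering, and where its packages land, before untangling the system/conda overlap (runs under any interpreter):

```python
import sys
import sysconfig

# Which interpreter is this, and where do packages installed for it end up?
print("executable:", sys.executable)
print("prefix    :", sys.prefix)
print("site-pkgs :", sysconfig.get_paths()["purelib"])
```

Running this once with the system Python and once inside a conda env makes the overlap (or lack of it) immediately visible.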
I released version 0.18.6 of #boinor today. Though I fixed some open bugs from the #poliastro issue list, this release mainly exists to upload something to #conda and to prepare a #debian package.
#python #astronomy #software #release
I know we like to act like space is completely free these days, but maybe we take that too far.
"Hm, I think I should -v this one tar invocation, just to see what I'm actually backing up from my homedir."
"Sure! Here's your .python directory, containing thousands of files supporting python libraries. And your .local/python directory, containing thousands of files supporting python libraries. Oh, and your .conda directory, never guess what's in there..."
"I get it."
"Do you want to know what's in your .pyenv director~"
"I. Get. It."
(ETA: 1.2GB at the end. Though most of that is just .pyenv; the others are much smaller).
As part of a submission to @joss I had to explore using #conda to build #software for the first time.
I have always been reluctant to use it (I like pip for Python, and much of my work is on #HPC, where environment modules are king).
I wrote a short post reflecting on what I learnt and my new opinions on conda: https://jackatkinson.net/post/pondering_conda/
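For anyone in the same boat: the heart of a conda-build recipe is a single meta.yaml. A minimal sketch for a pure-Python package (the name, version, and source path are placeholders):

```yaml
# meta.yaml — minimal conda-build recipe for a pure-Python package
package:
  name: mypkg          # placeholder
  version: "0.1.0"     # placeholder

source:
  path: ..             # local checkout; a url/sha256 pair is typical for releases

build:
  noarch: python
  script: python -m pip install . --no-deps -vv

requirements:
  host:
    - python
    - pip
  run:
    - python
```

Pointing `conda build` at the directory containing this file then produces the package.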
Confused by the Anaconda license change?
We explain why it happened and how to get compliant:
https://bioinfo-fr.net/conda-et-le-piege-de-la-licence-anaconda
#bioinfofr #anaconda #conda
Troubleshooting Python Virtual Environment Errors on Windows 11. Common causes include PATH variable issues, environment creation inconsistencies, and permission problems. Learn how to resolve these errors and improve your workflow using advanced techniques & tools like virtualenvwrapper or conda. #PythonVirtualEnvironmentError #Windows11 #Virtualenv #Conda #PythonError #Programming
https://tech-champion.com/microsoft-windows/troubleshooting-python-virtual-environment-activation-errors-on-windows-11
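A trick that sidesteps the whole activation/PATH class of errors the article covers: skip `activate` entirely and call the environment's interpreter by path (POSIX paths shown; on Windows the interpreter lives at `.venv\Scripts\python.exe`):

```shell
# Create a venv and use it without activating it: calling the env's own
# interpreter directly avoids PATH and shell execution-policy issues.
python3 -m venv .venv
.venv/bin/python --version
```

Everything run through that interpreter (including `python -m pip`) operates on the venv, no activation required.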