mathstodon.xyz is one of the many independent Mastodon servers you can use to participate in the fediverse.
A Mastodon instance for maths people. We have LaTeX rendering in the web interface!

Server stats: 2.7K active users

#AutomaticDifferentiation

1 post · 1 participant · 0 posts today

Hacker News:
An illustrated guide to automatic sparse differentiation

https://iclr-blogposts.github.io/2025/blog/sparse-autodiff/

#HackerNews #automaticdifferentiation #sparseautodiff #illustration #machinelearning #ICLR2025

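A quick Python sketch of the compression idea behind automatic sparse differentiation (my own illustration, not the guide's code): structurally orthogonal Jacobian columns can share a single seed vector, so a banded Jacobian needs far fewer directional derivatives than it has columns. Forward differences stand in here for the JVPs an AD engine would supply, and the sparsity pattern and two-color grouping are hard-coded:

    import numpy as np

    def f(x):
        # Banded function: output i touches only x[i] and x[i+1], so
        # Jacobian columns {0, 2} and {1, 3} never overlap in any row.
        return np.array([x[0] * x[1], x[1] * x[2], x[2] * x[3]])

    x = np.array([1.0, 2.0, 3.0, 4.0])
    h = 1e-7

    # One directional derivative per color group instead of one per column:
    # seed with the sum of the columns in each group.
    seeds = [np.array([1.0, 0.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0, 1.0])]
    compressed = [(f(x + h * s) - f(x)) / h for s in seeds]

    # Decompress: each nonzero J[i, j] is read off the group containing column j.
    sparsity = [(0, 0), (0, 1), (1, 1), (1, 2), (2, 2), (2, 3)]
    J = np.zeros((3, 4))
    for i, j in sparsity:
        J[i, j] = compressed[j % 2][i]
    print(np.round(J, 5))  # two seeds recover all six nonzero entries
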
LavX News:
Unlocking the Power of Reverse Mode Automatic Differentiation in Machine Learning

Automatic Differentiation (AD) is revolutionizing how developers compute derivatives within complex machine learning models. This article dives deep into reverse mode AD, its implementation, and its c...

https://news.lavx.hu/article/unlocking-the-power-of-reverse-mode-automatic-differentiation-in-machine-learning

#news #tech #MachineLearning #AutomaticDifferentiation #ReverseModeAD

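For readers who want the gist before clicking through: reverse-mode AD records the forward computation, then sweeps it in reverse, accumulating adjoints with the chain rule. A minimal self-contained Python sketch (my own, not the article's code):

    import math

    class Var:
        """A value plus links to its parents and their local partials."""
        def __init__(self, value, parents=()):
            self.value = value
            self.parents = parents  # pairs of (parent Var, d(self)/d(parent))
            self.grad = 0.0

        def __add__(self, other):
            return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

        def __mul__(self, other):
            return Var(self.value * other.value,
                       [(self, other.value), (other, self.value)])

    def sin(v):
        return Var(math.sin(v.value), [(v, math.cos(v.value))])

    def backward(out):
        # Topologically order the graph, then push adjoints output -> inputs.
        order, seen = [], set()
        def visit(v):
            if id(v) in seen:
                return
            seen.add(id(v))
            for parent, _ in v.parents:
                visit(parent)
            order.append(v)
        out.grad = 1.0
        visit(out)
        for v in reversed(order):
            for parent, local in v.parents:
                parent.grad += local * v.grad

    # d/dx [x*y + sin(x)] = y + cos(x)
    x, y = Var(2.0), Var(3.0)
    backward(x * y + sin(x))
    print(x.grad, 3.0 + math.cos(2.0))  # both ~2.584
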
Andreas:
Have you ever thought 💡 of using JAX as an 🧮 #automaticdifferentiation engine in 💻 finite element simulations? Boost the performance 🏇 of computationally expensive hyperelastic material models with #jit in 🔍 FElupe! 🚀 🚀

https://github.com/adtzlr/felupe

#python #jax #finiteelementmethod #scientificcomputing #computationalmechanics #fea #fem #hyperelasticity

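To illustrate the idea (a generic JAX sketch, not FElupe's API): define the strain-energy density once, and AD produces the stress and the elasticity tangent, with jit compiling them for repeated evaluation. The neo-Hookean form below is a textbook choice and the material parameters are made up:

    import jax
    import jax.numpy as jnp

    # Compressible neo-Hookean strain-energy density (textbook form);
    # mu and lmbda are invented material parameters for illustration.
    mu, lmbda = 1.0, 5.0

    def psi(F):
        J = jnp.linalg.det(F)
        C = F.T @ F
        return (mu / 2 * (jnp.trace(C) - 3)
                - mu * jnp.log(J) + lmbda / 2 * jnp.log(J) ** 2)

    # Stress P = d(psi)/dF and tangent A = d^2(psi)/dF^2 come for free
    # from AD; jit compiles them once for reuse at every quadrature point.
    P = jax.jit(jax.grad(psi))
    A = jax.jit(jax.hessian(psi))

    F = jnp.eye(3) + 0.01 * jnp.arange(9.0).reshape(3, 3)
    print(P(F).shape, A(F).shape)  # (3, 3) and (3, 3, 3, 3)
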
Dr. P. M. Secular:
This paper from @jenseisert and colleagues sounds interesting!

"The incorporation of automatic differentiation in tensor networks algorithms has ultimately enabled a new, flexible way for variational simulation of ground states and excited states. In this work, we review the state of the art of the variational iPEPS framework. We present and explain the functioning of an efficient, comprehensive and general tensor network library for the simulation of infinite two-dimensional systems using iPEPS, with support for flexible unit cells and different lattice geometries."

https://scirate.com/arxiv/2308.12358

#quantum #TensorNetwork #computational #physics #AutomaticDifferentiation #iPEPS

Virgile Andreani:
I really enjoyed the talk by Manuel Drehwald at #RustSciComp23, who sketched an exciting future for #AutomaticDifferentiation in #Rust with #LLVM #Enzyme, which should be integrated directly into the compiler within the next couple of months.

If I understood correctly, the idea is to differentiate code at the LLVM IR level, *after optimization* (and to do another optimization pass after that). This can produce faster code than AD engines that operate at the source-code level.

marco:
#CFP for `Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators`, a workshop at #ICML.

https://differentiable.xyz/

https://twitter.com/FHKPetersen/status/1657283011459821569

Differentiable programming is a powerful tool, so I am quite interested in this workshop (especially as a user of #JuliaLang, which has fantastic #AD support).

#AutomaticDifferentiation #ML

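The workshop's theme in one small example: discrete operators such as argmax are piecewise constant, so their gradient is zero almost everywhere, but a temperature-controlled softmax relaxation restores a usable gradient. A minimal Python sketch (my own illustration of the theme, not workshop material):

    import numpy as np

    def softmax(z, tau):
        e = np.exp((z - z.max()) / tau)
        return e / e.sum()

    # Replace the hard argmax with a softmax-weighted average of indices:
    # differentiable everywhere, approaching the hard argmax as tau -> 0.
    def soft_argmax(z, tau):
        return softmax(z, tau) @ np.arange(len(z))

    z = np.array([0.2, 1.5, 0.9])
    print(soft_argmax(z, tau=1.0))   # ~1.15, a smooth estimate near index 1
    print(soft_argmax(z, tau=0.01))  # ~1.0, nearly the hard argmax
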
Andreas:
Lots of repeated gradients/hessians to evaluate? Tensortrax is waiting for you. https://github.com/adtzlr/tensortrax

#python #hyperdualnumbers #automaticdifferentiation #forwardmode

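The #hyperdualnumbers machinery in miniature (a scalar Python sketch of the arithmetic only; tensortrax itself operates on full tensors): carry two nilpotent parts e1, e2 with e1^2 = e2^2 = 0, and a single evaluation of f(x + e1 + e2) yields the value, the first derivative, and the second derivative together:

    import math

    # A hyper-dual number a + b*e1 + c*e2 + d*e1*e2 with e1^2 = e2^2 = 0.
    # After evaluating f(x + e1 + e2): a = f(x), b = c = f'(x), d = f''(x).
    class HyperDual:
        def __init__(self, a, b=0.0, c=0.0, d=0.0):
            self.a, self.b, self.c, self.d = a, b, c, d

        def __add__(self, o):
            return HyperDual(self.a + o.a, self.b + o.b,
                             self.c + o.c, self.d + o.d)

        def __mul__(self, o):
            return HyperDual(self.a * o.a,
                             self.a * o.b + self.b * o.a,
                             self.a * o.c + self.c * o.a,
                             self.a * o.d + self.b * o.c
                             + self.c * o.b + self.d * o.a)

    def sin(h):
        s, c = math.sin(h.a), math.cos(h.a)
        # Taylor expansion of sin around h.a, truncated by e1^2 = e2^2 = 0.
        return HyperDual(s, c * h.b, c * h.c, c * h.d - s * h.b * h.c)

    # f(x) = x^2 sin(x): value, f'(1.2), and f''(1.2) in one evaluation.
    x = HyperDual(1.2, 1.0, 1.0, 0.0)
    f = x * x * sin(x)
    print(f.a, f.b, f.d)
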
Marco 🌳 Zocca:
Not to brag or anything, but my ad-delcont library has been an inspiration to this :)

https://github.com/konn/ad-delcont-primop

This is a line of work that uses delimited continuations to implement reverse-mode #AutomaticDifferentiation, rather than reifying the program into a graph. As such, it enables a nice purely functional API, and this latest incarnation performs pretty well too.

#machinelearning #functionalprogramming #haskell

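The continuation trick, loosely transliterated into Python for readers who don't speak Haskell (ordinary callbacks stand in for real delimited continuations): each operation runs the remainder of the program first, then accumulates adjoints as the calls unwind, so no explicit graph is ever reified:

    class Num:
        def __init__(self, x):
            self.x = x    # primal value
            self.d = 0.0  # adjoint, filled in on the way back

    def add(a, b, k):
        y = Num(a.x + b.x)
        k(y)              # run the rest of the program first...
        a.d += y.d        # ...then apply the chain rule while unwinding
        b.d += y.d

    def mul(a, b, k):
        y = Num(a.x * b.x)
        k(y)
        a.d += b.x * y.d
        b.d += a.x * y.d

    def grad(f, x):
        root = Num(x)
        f(root, lambda y: setattr(y, "d", 1.0))  # seed the output adjoint
        return root.d

    # d/dx [x*x + x] at x = 3.0 is 2*3 + 1 = 7
    print(grad(lambda x, k: mul(x, x, lambda t: add(t, x, k)), 3.0))
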
Stephen De Gabrielle:
Understanding and Implementing Automatic Differentiation
2022-12-04 :: racket, math, machine-learning, projects, tutorials
By: Mike Delmonaco

https://quasarbright.github.io/blog/2022/12/understanding-and-implementing-automatic-differentiation.html

#Racket #RacketLang #RacketLanguage #AI #MachineLearning #AutomaticDifferentiation #tutorial

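For a taste of what such a tutorial builds, forward-mode AD with dual numbers fits in a dozen lines of Python (a generic sketch, not the post's Racket code):

    import math

    # Dual numbers a + b*eps with eps^2 = 0: arithmetic on the eps
    # coefficient is exactly the chain rule.
    class Dual:
        def __init__(self, a, b=0.0):
            self.a, self.b = a, b

        def __add__(self, o):
            return Dual(self.a + o.a, self.b + o.b)

        def __mul__(self, o):
            return Dual(self.a * o.a, self.a * o.b + self.b * o.a)

    def exp(d):
        e = math.exp(d.a)
        return Dual(e, e * d.b)

    def derivative(f, x):
        return f(Dual(x, 1.0)).b  # seed dx/dx = 1

    # d/dx [x * exp(x)] = (1 + x) * exp(x)
    print(derivative(lambda x: x * exp(x), 2.0), 3.0 * math.exp(2.0))
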
marco:
Call for Talk Proposals for the Enzyme (#AutomaticDifferentiation in #LLVM) Conference.

https://enzyme.mit.edu/conference

#JuliaLang #AD

aspuru:
Check out our #matterlab @uoft work on inverse design for the Hückel method using automatic differentiation.

https://arxiv.org/abs/2211.16763

Led by Rodrigo Vargas and Kjell Jorner.

#ChemiVerse #machinelearning #AutomaticDifferentiation #ai #chemistry #toronto

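The flavour of the approach in a toy JAX sketch (an illustrative butadiene Hückel model with invented parameter values, not the paper's code): symmetric eigendecompositions are differentiable in JAX, so spectral targets such as the HOMO-LUMO gap can be tuned by gradient descent over the Hückel parameters:

    import jax
    import jax.numpy as jnp

    # Butadiene: a chain of 4 carbons, nearest-neighbour connectivity.
    adjacency = jnp.diag(jnp.ones(3), 1) + jnp.diag(jnp.ones(3), -1)

    def homo_lumo_gap(params):
        alpha, beta = params                 # Coulomb and resonance integrals
        H = alpha * jnp.eye(4) + beta * adjacency
        eigs = jnp.linalg.eigvalsh(H)        # sorted ascending; differentiable
        return eigs[2] - eigs[1]             # 4 pi electrons: HOMO=1, LUMO=2

    params = jnp.array([-11.0, -0.7])        # invented, roughly eV-scale values
    gap, g = jax.value_and_grad(homo_lumo_gap)(params)
    print(gap, g)  # the gradient is what drives the inverse design
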
Felix Köhler:
📺 I started a new video series on primitive rules for #automaticdifferentiation: https://www.youtube.com/watch?v=PwSaD50jTv8&list=PLISXH-iEM4Jn3SEi07q8MJmDD6BaMWlJE
It starts with scalar rules, continues with vector/array rules, and finishes with some results from using the implicit function theorem.

Primitive rules form the basis for automatically differentiating through arbitrary computer programs.

A new video will be released every three days :)

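The implicit-function-theorem rule mentioned at the end deserves a tiny worked example (my own Python sketch, not from the videos): if f(x*, theta) = 0 defines x*(theta), then dx*/dtheta = -(df/dx)^(-1) df/dtheta at the solution, so the solver's iterations never need to be differentiated:

    # Root of f(x, theta) = x^2 - theta, i.e. x*(theta) = sqrt(theta).
    def f(x, theta):
        return x * x - theta

    def df_dx(x, theta):
        return 2.0 * x

    def df_dtheta(x, theta):
        return -1.0

    def solve(theta, x=1.0, iters=50):
        for _ in range(iters):               # Newton's method
            x -= f(x, theta) / df_dx(x, theta)
        return x

    theta = 4.0
    x_star = solve(theta)                    # 2.0
    dx_dtheta = -df_dtheta(x_star, theta) / df_dx(x_star, theta)
    print(x_star, dx_dtheta)                 # 2.0 and 0.25 = 1/(2*sqrt(4))
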
marco:
A consistently solid #JuliaLang YouTube channel: https://youtube.com/c/MachineLearningSimulation

It mainly covers advanced topics in one of the strongest areas of #Julia, #AutomaticDifferentiation, especially as applied to scientific computing. For example: https://youtu.be/e4O6Z9o_D0k

Most topics also have a video covering them using #Jax or one of the specialized #PyTorch or #TensorFlow extensions (e.g., TensorFlow Distributions).

claude:
Found a bug in #zoomasm's 360 projection (the distance estimate scaling was wrong).

Trying to do the maths by hand is too hard, so I copied my #GLSL #DualNumber implementation (for #AutomaticDifferentiation) from my fragm-examples repository, minus the #CPreProcessor macro hell, plus some quaternion-to-rotation-matrix code ported from Python that I found online.

Now it looks okish in the #equirectangular view, need to render some tests at various orientations and inject spatial metadata for viewing in VLC to be more sure I got it right...