'Sliced-Wasserstein Distances and Flows on Cartan-Hadamard Manifolds', by Clément Bonet, Lucas Drumetz, Nicolas Courty.
http://jmlr.org/papers/v26/24-0359.html
#manifolds #manifold #wasserstein
'Manifold Learning by Mixture Models of VAEs for Inverse Problems', by Giovanni S. Alberti, Johannes Hertrich, Matteo Santacesaria, Silvia Sciutto.
http://jmlr.org/papers/v25/23-0396.html
#autoencoders #manifold #manifolds
By the way, you can see the poster I presented in 2017 here: https://aten.cool/documents/NCUWM_poster.pdf
This kind of idea appears in my work with Semin Yoo on constructing manifolds from quasigroups, so I'm still up to some of the same things seven years later.
Quasigroup manifolds paper: https://arxiv.org/abs/2110.05660
Nonlinear #manifolds underlie #NeuralPopulation activity during #behaviour – new #preprint by Fortunato et al. (2023)
I have a really weird #ICanHazPDF request. I remember a #MathOverflow question about the classification of #manifolds in which a paper (apparently unpublished) was linked from the author's website. I think it was by either Manolescu or Nicolaescu and it was a very nice, short survey of the current state of the classification. I thought I had a copy of this, but I can't find it or the original MathOverflow question. I've tried DuckDuckGo, Yandex, Google, and Bing to no avail. It's not this (https://pi.math.cornell.edu/~hatcher/Papers/3Msurvey.pdf) by Allen Hatcher. Did I hallucinate this survey article?
Meet our wonderful DEIA Co-Chair @lukesjulson who is eyeing:
- meeting students & postdocs
- discussing science
- eating fresh-baked pão com chouriço after a late night of dancing at #Cosyne2024
- work bridging the gap between spiking data & dynamical systems on #manifolds
Looking forward to tomorrow’s 2nd workshop on #symmetry, invariance and #NeuralRepresentations at the #BernsteinConference: #GroupTheory, #manifolds, and #Euclidean vs #nonEuclidean #geometry #perception … I’m pretty excited
#CompNeuro #computationalneuroscience
#introduction Hi all, I recently switched instances and am getting my profile set up. If you followed me on my old instance, note that I won't be using it anymore.
I am a PhD candidate at Albert Einstein College of Medicine, working with #intrinsic #manifolds and #decisionmaking, and doing some visual cortex research by coincidence.
Discussing slide 49/237 with #chatgpt
#Embedded vs #Immersed #Manifolds
https://chat.openai.com/share/6cbf3e83-a52d-469e-8ea8-c003c06a1073
and with #bard
Nonconvex-nonconcave min-max optimization on Riemannian manifolds
Andi Han, Bamdev Mishra, Pratik Jawanpuria, Junbin Gao
Action editor: Zhihui Zhu.
Tomorrow I will give an online talk at ICIAM within the minisymposium “Approximation and modeling with manifold-valued data” about the Riemannian DC algorithm
at 18:05 JST (11:05 CEST). The slides are already online at https://ronnybergmann.net/talks/2023-ICIAM-Difference-of-Convex.pdf #Manifolds #ICIAM #Optimization
ManifoldsBase.jl 0.14.10 introduces an interface to implement `Weingarten(M, p, X, V)` maps (https://juliamanifolds.github.io/ManifoldsBase.jl/dev/functions/#ManifoldsBase.Weingarten-Tuple{AbstractManifold,%20Any,%20Any,%20Any}).
Together with the new ManifoldDiff.jl 0.3.6 this allows for a generic implementation of a conversion from Euclidean to Riemannian Hessians for embedded submanifolds, see https://juliamanifolds.github.io/ManifoldDiff.jl/dev/library/#ManifoldDiff.riemannian_Hessian-Tuple{AbstractManifold,%20Any,%20Any,%20Any,%20Any}. #Manifolds #Julia
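The conversion itself is a short formula. For the simplest embedded submanifold, the unit sphere, it reads Hess f(p)[X] = P_p(∇²f(p)X) − ⟨p, ∇f(p)⟩X, where the second term is the Weingarten map applied to the normal component of the Euclidean gradient. Here is a hedged Python sketch of that sphere case (not the Julia API; all function names are illustrative):

```python
import numpy as np

def proj(p, v):
    # Orthogonal projection onto the tangent space of the unit sphere at p.
    return v - np.dot(p, v) * p

def riemannian_hessian_sphere(p, X, egrad, ehess_X):
    # Hess f(p)[X] = P_p(ehess[X]) - <p, egrad> X on the unit sphere.
    # The correction term is the Weingarten map applied to the normal
    # component of the Euclidean gradient (the sphere's shape operator
    # is minus the identity, which makes this case especially simple).
    return proj(p, ehess_X) - np.dot(p, egrad) * X

# Example: f(x) = x^T A x restricted to the sphere.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = (A + A.T) / 2                      # symmetric quadratic form
p = rng.standard_normal(4)
p /= np.linalg.norm(p)                 # a point on the sphere
X = proj(p, rng.standard_normal(4))    # a tangent vector at p

egrad = 2 * A @ p                      # Euclidean gradient of f
ehess_X = 2 * A @ X                    # Euclidean Hessian applied to X
H = riemannian_hessian_sphere(p, X, egrad, ehess_X)
print(abs(np.dot(p, H)))               # tiny: the result is tangent at p
```

Without the Weingarten correction, the projected Euclidean Hessian alone would generally fail to be the second derivative of f along geodesics.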
Neural Implicit Manifold Learning for Topology-Aware Density Estimation
I will be at ICML for the workshops later this week. Thanks to @emtiyaz and Thomas (https://moellenh.github.io) for inviting me to the workshop on “Duality Principles for Modern Machine Learning” (https://dp4ml.github.io); hope to also attend the TAG-ML workshop https://www.tagds.com/events/conference-workshops/tag-ml23 Friday.
Wrapped $\beta$-Gaussians with compact support for exact probabilistic modeling on manifolds
...
Addenda (cont'd)
Manifold hypothesis
https://en.wikipedia.org/wiki/Manifold_hypothesis
Many real-world high-dimensional data sets actually lie along low-dimensional latent manifolds within the high-dimensional space: although many variables are measured, the data can be described by a much smaller number of latent variables.
This principle may underpin the effectiveness of ML algorithms in describing high-dimensional data sets by considering a few common features.
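The hypothesis is easy to see in a minimal linear toy example (a real data manifold would typically be curved, which is where manifold-learning methods come in): observations in 50 dimensions that secretly depend on only 2 latent variables have intrinsic dimension 2, which the singular values of the data matrix reveal.

```python
import numpy as np

rng = np.random.default_rng(0)

# 500 observations in 50 ambient dimensions that depend on only
# 2 latent variables -- a linear toy instance of the hypothesis.
z = rng.standard_normal((500, 2))   # low-dimensional latent coordinates
W = rng.standard_normal((2, 50))    # embedding into the ambient space
X = z @ W                           # high-dimensional observations

# The singular values of the centered data expose the intrinsic
# dimension: only 2 are (numerically) nonzero.
s = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)
print((s > 1e-8 * s[0]).sum())  # -> 2
```

Replacing `z @ W` with a nonlinearity such as `np.tanh(z @ W)` would curve the manifold; the global rank would then exceed 2, but small neighborhoods would still look approximately 2-dimensional.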
On the curvature of the loss landscape
https://arxiv.org/abs/2307.04719
A main challenge in modern deep learning is to understand why such over-parameterized models perform so well when trained on finite data ... we consider the loss landscape as an embedded Riemannian manifold ... we focus on the scalar curvature, which can be computed analytically for our manifold ...
Manifolds: https://en.wikipedia.org/wiki/Manifold
...
New #preprint by @AitorMoraGre on #neural #manifolds in #V1 changing with top-down signals from #V4 targeting the foveal region
https://www.biorxiv.org/content/10.1101/2023.06.14.544966v1.article-info
#Twitter thread: https://twitter.com/AitorMoraGre/status/1669375717149220864 (including a cool animation)
'Large sample spectral analysis of graph-based multi-manifold clustering', by Nicolas Garcia Trillos, Pengfei He, Chenghui Li.
http://jmlr.org/papers/v24/21-1254.html
#laplacians #manifolds #laplacian