mathstodon.xyz is one of the many independent Mastodon servers you can use to participate in the fediverse.
A Mastodon instance for maths people. We have LaTeX rendering in the web interface!

Server stats:

2.8K active users

#probability

3 posts · 3 participants · 0 posts today

WordofTheHour
#probability : appearance of reality or truth
- French: Probabilité
- German: die Wahrscheinlichkeit
- Italian: probabilità
- Portuguese: probabilidade
- Spanish: probabilidad
------------
Thank you so much for being a member of our community!

Dr Mircea Zloteanu 🌼🐝
#statstab #321 You Just Said Something Wrong About Logistic Regression, by @Phdemetri
Thoughts: Odds, probabilities, and risk ratios. Coefficients in logistic regression are only one of these.
#logisticregression #oddsratio #riskratio #probability #odds #regression
https://dpananos.github.io/posts/2024-02-06-logit/

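As a quick illustration of the distinction the post is drawing, here is a minimal sketch with made-up coefficient values: a logistic-regression coefficient is a log-odds ratio, and the implied odds ratio is not the same thing as a probability or a risk ratio.

```python
import math

# Hypothetical fitted logistic regression: logit(p) = b0 + b1 * x
b0, b1 = -1.2, 0.8  # made-up coefficients, for illustration only

# The coefficient b1 is a log-odds ratio; exponentiating gives an odds ratio.
odds_ratio = math.exp(b1)

def predicted_prob(x):
    """Invert the logit link to get a predicted probability for a given x."""
    return 1 / (1 + math.exp(-(b0 + b1 * x)))

p0, p1 = predicted_prob(0), predicted_prob(1)
risk_ratio = p1 / p0  # not equal to the odds ratio in general

print(f"odds ratio for a 1-unit increase in x: {odds_ratio:.3f}")
print(f"P(y=1 | x=0) = {p0:.3f}, P(y=1 | x=1) = {p1:.3f}")
print(f"risk ratio: {risk_ratio:.3f}  (differs from the odds ratio)")
```
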
david jon furbish
I cannot think of an area of applied mathematics that is more beautiful and far-reaching, or philosophically wilder, than probability. No, nonlinear dynamics and chaos people, it's not even close 🤣
#probability #mathematics #appliedmathematics #philosophy #philosophyofscience
@philosophy@newsmast.community @philosophy@a.gup.pe

Fabio Capela
Quantum physics and investing both deal with uncertainty and probability. Just as quantum particles exist in superposition until measured, your investment outcomes exist as probability distributions until 'measured' by time. We can only optimize the process, not predict exact outcomes!
#QuantumInvesting #Probability #UncertaintyPrinciple

Fabio Capela
Monte Carlo simulations reveal that a 3-year market crash early in retirement reduces portfolio success rates by ~20% compared to identical crashes occurring later. This mathematical asymmetry is the essence of sequence risk in retirement planning.
#FIRE #RetirementMath #Probability

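The ~20% figure is the poster's; the sketch below is only a minimal Monte Carlo illustration of sequence risk, with assumed return, withdrawal, and crash parameters rather than the poster's model. It shows the mechanism: the same crash hurts more when it lands before most withdrawals have been taken.

```python
import random

def success_rate(crash_years, n_sims=20_000, years=30, start=1_000_000,
                 withdrawal=40_000, mean=0.07, sd=0.15, crash_return=-0.20):
    """Fraction of simulations in which the portfolio survives `years` of withdrawals.

    `crash_years` is the set of years forced to a fixed crash return;
    every number here is an illustrative assumption, not calibrated to the post.
    """
    successes = 0
    for _ in range(n_sims):
        balance = start
        for year in range(years):
            r = crash_return if year in crash_years else random.gauss(mean, sd)
            balance = (balance - withdrawal) * (1 + r)
            if balance <= 0:
                break
        else:
            successes += 1
    return successes / n_sims

early = success_rate(crash_years={0, 1, 2})    # crash in the first 3 years
late = success_rate(crash_years={20, 21, 22})  # identical crash 20 years in
print(f"success rate, early crash: {early:.2%}")
print(f"success rate, late crash:  {late:.2%}")
```
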
Nick Byrd, Ph.D.
🤔 Not what I expected!
More people preferred high diagnostic uncertainty over low, even though the higher uncertainty caused more worry.
Within- and between-subject, quant and qual results: https://doi.org/10.1136/jme-2024-109932
#medicine #decisionScience #probability #bioethics #openAccess

Fabio Capela
The Monty Hall problem: You're on a game show with 3 doors. One has a car, two have goats. You pick a door. The host (who knows what's behind each door) opens a different door with a goat. Should you switch your choice? (Hint: Yes, it doubles your chances!)
#Mathematics #Probability #MontyHall

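A minimal simulation of the game as stated, confirming the hint that switching roughly doubles the win rate (about 2/3 versus 1/3):

```python
import random

def monty_hall(switch, trials=100_000):
    """Estimate the win probability for staying vs switching."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {monty_hall(switch=False):.3f}  (close to 1/3)")
print(f"switch: {monty_hall(switch=True):.3f}  (close to 2/3)")
```
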
Longreads
"Our world should be at its most analyzable, explicable — but still it can feel like sorcery."
Eric Boodman for New York magazine: https://longreads.com/2025/04/08/does-luck-exist/
#Longreads #Luck #Chance #Probability #Philosophy #Superstition

Ava
Suppose I have a random event with k possible outcomes of equal probability. What distribution (if any) describes the probability of obtaining a specific sequence of length m after n events?
#probability #probabilitydistribution #statistics

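Reading the question as "the target pattern appears at least once within n i.i.d. draws", the exact law depends on how the pattern can overlap itself, so a quick way to explore it is empirically. A minimal Monte Carlo sketch; the pattern, k, and n below are arbitrary example values, not part of the original question:

```python
import random

def appears_prob(pattern, k, n, trials=50_000):
    """Monte Carlo estimate of P(`pattern` occurs at least once in n i.i.d. draws
    from k equally likely outcomes)."""
    m = len(pattern)
    hits = 0
    for _ in range(trials):
        seq = [random.randrange(k) for _ in range(n)]
        if any(seq[i:i + m] == pattern for i in range(n - m + 1)):
            hits += 1
    return hits / trials

# Example: how often does the pattern 0, 1, 0 show up in 20 rolls of a 6-sided die?
print(appears_prob(pattern=[0, 1, 0], k=6, n=20))
```
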
pglpm
Happy Birthday, Laplace! 🎂 🪐 🎓 One of the first to use Bayesian probability theory in the modern way!

"One sees in this essay that the theory of probabilities is basically only common sense reduced to a calculus. It makes one estimate accurately what right-minded people feel by a sort of instinct, often without being able to give a reason for it. It leaves nothing arbitrary in the choice of opinions and of making up one's mind, every time one is able, by this means, to determine the most advantageous choice. Thereby, it becomes the most happy supplement to ignorance and to the weakness of the human mind. If one considers the analytical methods to which this theory has given rise, the truth of the principles that serve as the groundwork, the subtle and delicate logic needed to use them in the solution of the problems, the public-benefit businesses that depend on it, and the extension that it has received and may still receive from its application to the most important questions of natural philosophy and the moral sciences; if one observes also that even in matters which cannot be handled by the calculus, it gives the best rough estimates to guide us in our judgements, and that it teaches us to guard ourselves from the illusions which often mislead us, one will see that there is no science at all more worthy of our consideration, and that it would be a most useful part of the system of public education."

*Philosophical Essay on Probabilities*, 1814 <https://doi.org/10.1007/978-1-4612-4184-3>
#science #probability #bayesian #physics

CubeRootOfTrue
ChatGPT's thought processes:

I'm tying together the rich tapestry of philosophical insights and category theory, exploring how enriched categories, with their hom-sets as objects in a monoidal category, aid in managing complexity, partial orders, or metrics, moving beyond conventional morphism sets.

I'm starting to see how enriched categories with hom-sets as partially ordered sets can capture complexity, cost, or transformation intricacies through preorders, partial orders, metrics, or probabilities.

I'm piecing together how enriched categories can reveal more about morphisms by incorporating structures like complexity, cost, or probability, allowing short and long morphisms to be systematically compared.

... and then it crashed, without producing a response. This is very similar to my experience with this material, yes, quite human-like.
#categorytheory #enriched #probability

Christian Testa
Over the past couple of years, I've really fallen in love with #tikz and all of its quirks.

TikZ is a plotting/graphics package for LaTeX that is especially useful for creating mathematical diagrams.

The support for mathematical notation is unbeatable and the flexibility of the language is extremely high. Also, graphics rendered to PDF/SVG this way are extremely lightweight and reproducible.

I do find the syntax very challenging to remember, though, so I put together this GitHub repository to keep track of TikZ code I've written.

https://github.com/ctesta01/tikz-examples/

Each graphic shown in the README is linked to its underlying .tex code.

The README also has several links to documentation and tutorials that I've found helpful, along with some tips I've learned from experience.
#mathematics #statistics #probability #geometry #graphics #TeXLaTeX

Knowledge Zone
The #Birthday #Paradox states that in a random group of 23 people, there's a greater than 50% chance that at least two people share a birthday.

This counterintuitive result arises from the fact that we're comparing all possible pairs of people, not just matching one person's birthday to a specific date.
#Math #Probability
https://knowledgezone.co.in/kbits/6443fc44d43b3e28788e4418

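The quoted figure can be checked directly. A minimal exact computation under the usual simplifying assumptions (uniform birthdays, no leap day):

```python
import math

def p_shared_birthday(n, days=365):
    """Exact probability that at least two of n people share a birthday,
    assuming uniform, independent birthdays and no leap day."""
    p_all_distinct = math.prod((days - i) / days for i in range(n))
    return 1 - p_all_distinct

print(f"n=23: {p_shared_birthday(23):.4f}")  # just over 50%
```
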
Ross Kang
For various (mathematical, meteorological, alimentary) reasons, I usually prefer 2π day.
Nevertheless, today I make the following offering:

http://arxiv.org/abs/2503.10002

Pjotr Buys, @Janvadehe and I used Shearer's induction to address the question:

How few independent sets can a triangle-free graph of average degree d have?

The answer is close to how many a random graph has.
What is perhaps surprising is just *how* close it comes.
(I queried the combinatorial hive mind about this last week.)
#combinatorics #graphtheory #ExtremalCombinatorics #probability #math #mathematics #piDay

kazé
Dear LazyWeb: is there a C/C++, #RustLang or #Zig equivalent of #SciPy's `stats` module for statistical analysis? Namely:
 • a collection of common PDFs (probability density functions);
 • MLE (maximum likelihood estimation) for these common distributions;
 • KDE (kernel density estimation).

SciPy's API is a pleasure to work with. Anything that comes close but usable from C/C++/Rust/Zig would make my life so much easier. Boosts appreciated for visibility.
#statistics #probability #DataScience

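For reference, the three pieces of the SciPy API the post has in mind look roughly like this (the data below is synthetic); a C/C++/Rust/Zig replacement would need equivalents of these calls:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=1_000)  # synthetic sample

# 1. Common PDFs
x = np.linspace(0, 4, 5)
pdf_vals = stats.norm.pdf(x, loc=2.0, scale=0.5)

# 2. MLE for a common distribution (fit returns the loc/scale estimates)
loc_hat, scale_hat = stats.norm.fit(data)

# 3. Kernel density estimation
kde = stats.gaussian_kde(data)
density_at_2 = kde(2.0)

print(pdf_vals, loc_hat, scale_hat, density_at_2)
```
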
QIC Journal
🙌 #call4reading

✍️ Return #probability and self-similarity of the #Rieszwalk, by Ryota Hanaoka and Norio Konno

🔗 10.26421/QIC21.5-6-5 (#arXiv:2010.04518)
#Quantumwalks #Singularcontinuousmeasure

Dr Mircea Zloteanu 🌼🐝
#statstab #296 The most important theory in statistics
Thoughts: Fisher revolutionised statistics, and the Maximum Likelihood Estimate (MLE) is the crown jewel.
#MLE #Fisher #statistics #stats #probability #likelihood
https://youtu.be/YevSE6bRhTo

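As a minimal, self-contained illustration of maximum likelihood (not a summary of the linked video): for Bernoulli data the likelihood-maximising parameter is the sample proportion, which a brute-force grid search over the log-likelihood recovers. The simulated data and grid are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.binomial(1, 0.3, size=500)  # synthetic Bernoulli(0.3) data

# Log-likelihood of Bernoulli(p) for the observed data, evaluated on a grid of p.
p = np.linspace(0.001, 0.999, 999)
loglik = x.sum() * np.log(p) + (len(x) - x.sum()) * np.log(1 - p)

p_mle = p[np.argmax(loglik)]
print(p_mle, x.mean())  # grid maximiser matches the sample proportion (up to grid resolution)
```
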
Victoria Stuart 🇨🇦 🏳️‍⚧️
[2502.05244] Probabilistic Artificial Intelligence
https://arxiv.org/abs/2502.05244
https://news.ycombinator.com/item?id=43318624

Manuscript 418pp ...
#ML #probability #MLbooks #MLtheory

Ross Gayler
Some probability/maths/optimisation questions for the Fedi-hive mind:

Bayes' Theorem is
P(H | E) = P(E | H) P(H) / P(E)
where H and E are events (that I have labelled for my mnemonic convenience to suggest Hypothesis and Evidence, but they're just events).

Assume that:
* There is some fixed database of records with a fixed set of fields.
* The events H and E are predicates of individual database records.
* The event predicates are functions of the field values in the record being evaluated.
* We are interpreting the relative frequency of the event predicate being true over all the records in the database as the probability of the event defined by the predicate.

The typical statement of Bayes' Theorem appears to assume that the definitions of the events H and E are fixed and given, and the only thing of interest is how to calculate with them.

1. Does it make sense to have a fixed definition of H and search over the space of possible definitions of E to maximise P(H | E)?

2. Is there a name for this? (I presume it's been suggested many times already.) Is it abductive inference, because you're trying to find the "best explanation" of H?

3. Are there constraints that need to be placed on the optimisation? (a. You wouldn't want the E definition to be a copy of, or equivalent to, the H definition. b. You wouldn't want the E definition to be some degenerate case, e.g. with P(E) vanishingly small. c. You probably want some regularisation penalty that prefers simple definitions of E over more complex ones.)

Any comments on this and pointers into the literature would be greatly appreciated.
#math #probability #Bayes #optimisation #AbductiveInference

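A toy sketch of question 1 under the post's frequency interpretation: events are predicates over records, probabilities are relative frequencies, and a small hand-built family of candidate E definitions is scanned for the one maximising P(H | E). All field names, records, and predicates below are made up for illustration; a real search would also need the degeneracy and regularisation constraints listed in question 3.

```python
records = [
    {"age": 34, "smoker": True,  "diagnosis": True},
    {"age": 52, "smoker": True,  "diagnosis": True},
    {"age": 41, "smoker": False, "diagnosis": False},
    {"age": 29, "smoker": False, "diagnosis": False},
    {"age": 63, "smoker": True,  "diagnosis": False},
    {"age": 47, "smoker": False, "diagnosis": True},
]

def prob(pred):
    """Relative frequency of a predicate being true over all records."""
    return sum(pred(r) for r in records) / len(records)

def cond_prob(h, e):
    """P(H | E) as a relative frequency; None if P(E) = 0."""
    p_e = prob(e)
    return None if p_e == 0 else prob(lambda r: h(r) and e(r)) / p_e

H = lambda r: r["diagnosis"]

candidate_Es = {
    "smoker": lambda r: r["smoker"],
    "age >= 50": lambda r: r["age"] >= 50,
    "smoker and age >= 40": lambda r: r["smoker"] and r["age"] >= 40,
}

for name, E in candidate_Es.items():
    print(f"P(H | {name}) = {cond_prob(H, E)}")
```
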
Hacker News
Math That Matters: The Case for Probability over Polynomials — https://anandsanwal.me/math-eduction-more-probability-statistics-less-calculus/
#HackerNews #MathThatMatters #Probability #Polynomials #Education #Statistics #Calculus