eupolicy.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
This Mastodon server is a friendly and respectful discussion space for people working in areas related to EU policy. When you request to create an account, please tell us something about you.

Server stats: 214 active users

#probability

0 posts · 0 participants · 0 posts today
Dr Mircea Zloteanu ☀️ 🌊🌴: #statstab #388 The odds are it's wrong: Correcting a common mistake in statistics

Thoughts: Report probabilities instead, which in R (not SPSS) can be easily computed for your models.

#odds #oddsratios #riskratios #probability #r #logisticregression
https://onlinelibrary.wiley.com/doi/10.1111/test.12391
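The post recommends doing this in R; the sketch below shows the same odds-to-probability arithmetic in Python, with made-up coefficient values standing in for a fitted logistic regression.

```python
import numpy as np

# Hypothetical fitted logistic-regression coefficients (log-odds scale).
intercept, beta = -1.0, 0.8
x = np.array([0.0, 1.0, 2.0])          # hypothetical predictor values

log_odds = intercept + beta * x        # linear predictor on the logit scale
odds = np.exp(log_odds)
prob = odds / (1.0 + odds)             # inverse logit: probability = odds / (1 + odds)

print(np.round(prob, 3))               # probabilities are easier to read than odds ratios
```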
GMRaphi: omg, my good friend zombiecalypse has delivered once again: "Why do plans go wrong? Reason number one: optimism."

#ttrpg #pnpde #dnd #dice #statistics #success #failure #probability #maths #mathe
Archimage: "You could pay as little as $0." "Save up to $2000" may be true, but probably false without the odds.

#marketing #failure #probability
Steven Saus [he/him]: (29 May) Your next gaming dice could be shaped like a dragon or armadillo

Statistically, “the real behavior of a rolling object is largely a function of its geometry.” …

https://s.faithcollapsing.com/g6yqw
Archive: ais: https://archive.md/wip/HYNoa ia: https://s.faithcollapsing.com/oeaqj

#3d-printing #computational-science #computer-simulations #geometry #probability #science #shape-analysis
Skewray Research: Stable distributions have two shape parameters that both affect the “chubbiness” of a distribution: α (tail parameter) and γ (core width). The γ parameter corresponds to σ for a normal distribution, while α gives the slope of the tails at large distances from the core. If we are comparing two stable distributions, can we tell which parameter is responsible for the difference?

#probability #statistics
https://www.skewray.com/articles/distance-between-symmetric-stable-distributions
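A small sketch of the comparison the post poses, using SciPy's `levy_stable` with `beta=0` for the symmetric case; `alpha` is the tail exponent and `scale` plays the role of the post's γ. The parameter values are illustrative only.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
# Two symmetric stable samples with made-up parameters: A and B differ in
# both the tail exponent alpha and the scale (the post's gamma).
a = levy_stable.rvs(alpha=1.8, beta=0.0, scale=1.0, size=20_000, random_state=rng)
b = levy_stable.rvs(alpha=1.5, beta=0.0, scale=1.2, size=20_000, random_state=rng)

# Core width mostly drives the interquartile range, while the tail exponent
# mostly drives extreme quantiles, so looking at both gives a hint about
# which parameter is responsible for an observed difference.
for name, sample in [("A", a), ("B", b)]:
    q25, q75, q99 = np.percentile(sample, [25, 75, 99])
    print(name, "IQR:", round(q75 - q25, 2), "99th percentile:", round(q99, 2))
```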
david jon furbish: I cannot think of an applied mathematics that is more beautiful and far-reaching, or philosophically wilder, than probability. No, nonlinear dynamics and chaos people, it's not even close 🤣

#probability #mathematics #appliedmathematics #philosophy #philosophyofscience
@philosophy@newsmast.community @philosophy@a.gup.pe
Longreads: "Our world should be at its most analyzable, explicable — but still it can feel like sorcery."

Eric Boodman for New York magazine: https://longreads.com/2025/04/08/does-luck-exist/

#Longreads #Luck #Chance #Probability #Philosophy #Superstition
Ava: Suppose I have a random event with k possible outcomes of equal probability. What distribution (if any) describes the probability of obtaining a specific sequence of length m after n events?

#probability #probabilitydistribution #statistics
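The exact answer involves pattern-occurrence arguments and depends on the target sequence's self-overlap structure, but under one reading of the question (the specific sequence appears at least once within n draws) a quick Monte Carlo estimate is easy to sketch; all parameter values below are examples.

```python
import random

def hit_probability(k=6, m=3, n=20, trials=50_000, seed=1):
    """Estimate P(a fixed target sequence of length m shows up at least once
    in n independent draws from k equally likely outcomes)."""
    rng = random.Random(seed)
    target = [0] * m   # one concrete target; the answer depends on the
                       # pattern's self-overlap structure, so this is an example
    hits = 0
    for _ in range(trials):
        draws = [rng.randrange(k) for _ in range(n)]
        if any(draws[i:i + m] == target for i in range(n - m + 1)):
            hits += 1
    return hits / trials

print(hit_probability())
```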
kazé: Dear LazyWeb: is there a C/C++, #Rust or #Zig equivalent of #SciPy's `stats` module for statistical analysis? Namely:
• a collection of common PDFs (probability density functions);
• MLE (maximum likelihood estimation) for these common distributions;
• KDE (kernel density estimation).

SciPy's API is a pleasure to work with. Anything that comes close but usable from C/C++/Rust/Zig would make my life so much easier. Boosts appreciated for visibility.

#statistics #probability #DataScience
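For reference when judging candidate replacements, here is the SciPy baseline the post lists, in a few lines of Python with synthetic data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = rng.normal(loc=2.0, scale=0.5, size=1_000)     # synthetic sample

pdf_value = stats.norm.pdf(2.0, loc=2.0, scale=0.5)   # common PDFs
mu_hat, sigma_hat = stats.norm.fit(data)              # MLE for a named distribution
kde = stats.gaussian_kde(data)                        # kernel density estimation
print(pdf_value, mu_hat, sigma_hat, kde(2.0))
```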
Ross Gayler: maths/Bayes/probability/optimisation/inference questions
Europe Says: https://www.europesays.com/1881524/ Elon Musk Says There's 'Only a 20% Chance of Annihilation' With AI

#AI #annihilation #ArtificialIntelligence #chance #Concern #ElonMusk #GoodOutcome #human #HumanIntelligence #Interview #LastYear #NextYear #OpenAI #other #PodcastEpisode #probability
Eric Maugendre: @data @datadon 🧵

Accuracy! To counter regression dilution, a method is to add a constraint on the statistical modeling. Regression Redress restrains bias by segregating the residual values.
My article: http://data.yt/kit/regression-redress.html

#bias #modeling #dataDev #AIDev #modelEvaluation #regression #modelling #dataLearning #linearRegression #probability #probabilities #statistics #stats #correctionRatio #ML #distributions #accuracy #RegressionRedress #Python #RStats
Ross Kang: A post of @11011110 has reminded me that (after a year and a half lurking here) it's never too late for me to toot and pin an intro here.

I am a Canadian mathematician in the Netherlands, and I have been based at the University of Amsterdam since 2022. I also have some rich and longstanding ties to the UK, France, and Japan.

My interests are somewhere in the nexus of Combinatorics, Probability, and Algorithms. Specifically, I like graph colouring, random graphs, and probabilistic/extremal combinatorics. I have an appreciation for randomised algorithms, graph structure theory, and discrete geometry.

Around 2020, I began taking a more active role in the community, especially in efforts towards improved fairness and openness in science. I am proud to be part of a team that founded the journal Innovations in Graph Theory (https://igt.centre-mersenne.org/), which launched in 2023. (That is probably the main reason I joined mathstodon!) I have also been a coordinator since 2020 of the informal research network A Sparse (Graphs) Coalition (https://sparse-graphs.mimuw.edu.pl/), devoted to online collaborative workshops. In 2024, I helped spearhead the MathOA Diamond Open Access Stimulus Fund (https://www.mathoa.org/diamond-open-access-stimulus-fund/).

Until now, my posts have mostly been about scientific publishing and combinatorics.

#introduction #openscience #diamondopenaccess #scientificpublishing #openaccess #RemoteConferences #combinatorics #graphtheory #ExtremalCombinatorics #probability
Eric Maugendre: @data @datadon 🧵

How to assess a statistical model? How to choose between variables?

Pearson's #correlation is irrelevant if you suspect that the relationship is not a straight line.

If the relationship is monotonic:
"#Spearman's rho is particularly useful for small samples where weak correlations are expected, as it can detect subtle monotonic trends." It is "widespread across disciplines where the measurement precision is not guaranteed."
"#Kendall's Tau-b is less affected [than Spearman's rho] by outliers in the data, making it a robust option for datasets with extreme values."
Ref: https://statisticseasily.com/kendall-tau-b-vs-spearman/

#normality #normalDistribution #modeling #dataDev #AIDev #ML #modelEvaluation #regression #modelling #dataLearning #featureEngineering #linearRegression #probability #probabilities #statistics #stats #correctionRatio #Pearson #bias #regressionRedress #distributions
Sylvia Wenmackers 🦉🍀: I read about the 1-in-83 (>1%!) odds of a decent-sized #asteroid (2024 YR4) hitting Earth in 2032. ☄️ https://www.space.com/180-foot-asteroid-1-in-83-chance-hitting-Earth-2032
First thought: "Not now, large space rock." 😬
But soon after, I wondered: how do they determine this #probability? 🤔

Turns out it's a bit like weather forecasts: they run multiple simulations (variations on the measured data) and report the fraction of runs in which a certain event (rain, a collision course) happens. #2024yr4 1/2
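A toy version of the simulation idea described in the post: perturb the measured state within its uncertainty, propagate each variant, and report the fraction that end up on a collision course. The one-dimensional "miss distance" model and all numbers below are purely illustrative; the real calculation propagates full orbital states.

```python
import numpy as np

rng = np.random.default_rng(2032)
n_variants = 100_000

# Hypothetical nominal miss distance and its measurement uncertainty,
# expressed in units of Earth radii.
nominal_miss = 2.4
uncertainty = 1.1

miss = rng.normal(nominal_miss, uncertainty, size=n_variants)   # one simulated variant per draw
impact_probability = np.mean(np.abs(miss) < 1.0)                # fraction on a collision course
print(f"estimated impact probability: {impact_probability:.3f}")
```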
Markus Redeker: New blog post: a short solution to the Monty Hall problem that I have not seen elsewhere (https://functor.network/user/414/entry/867).

#WordsAndSomeFormulas #MontyHall #Probability #Mathematics
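The linked post gives a written argument; for readers who prefer a numerical check, a short simulation of the standard Monty Hall setup reproduces the familiar result that switching wins about two thirds of the time.

```python
import random

def monty_hall(trials=100_000, seed=0):
    rng = random.Random(seed)
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # The host opens a door that is neither the contestant's pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        switched = next(d for d in range(3) if d != pick and d != opened)
        stay_wins += (pick == car)
        switch_wins += (switched == car)
    return stay_wins / trials, switch_wins / trials

print(monty_hall())   # roughly (0.333, 0.667)
```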
Cheng Soon Ong: "... probability probably does not exist — but it is often useful to act as if it does."
David Spiegelhalter provides a short essay that touches on the main aspects of the elusive idea of probability.
https://www.nature.com/articles/d41586-024-04096-5

His book on Uncertainty just came out yesterday, which I expect will explain these ideas in more detail.
https://www.penguin.com.au/books/the-art-of-uncertainty-9780241658628

#MachineLearning #Statistics #Probability #scicomm
Eric Maugendre: @data @datadon 🧵

Redressing #Bias: "Correlation Constraints for Regression Models", Treder et al. (2021): https://doi.org/10.3389/fpsyt.2021.615754

#dataDev #linearRegression #modeling #probability #probabilities #statistics #stats #modelling #regression #correctionRatio #skLearn #scikitLearn #python #AIDev
Eric Maugendre: "In real life, we weigh the anticipated consequences of the decisions that we are about to make. That approach is much more rational than limiting the percentage of making the error of one kind in an artificial (null hypothesis) setting or using a measure of evidence for each model as the weight."
Longford (2005): http://www.stat.columbia.edu/~gelman/stuff_for_blog/longford.pdf

#modeling #nullHypothesis #probability #probabilities #pValues #statistics #stats #statisticalLiteracy #bias #inference #modelling #regression #linearRegression
Mark Rubin: "It does not make sense to think of our judgements as being estimates of 'true' probabilities." David Spiegelhalter's excellent introduction to the nuances and intricacies of probability. https://doi.org/10.1038/d41586-024-04096-5

#Methodology #Statistics #Probability 🧪