ohai.social is one of the many independent Mastodon servers you can use to participate in the fediverse.

#Probability


Happy Birthday, Laplace! 🎂 🪐 🎓 One of the first to use Bayesian probability theory in the modern way!

"One sees in this essay that the theory of probabilities is basically only common sense reduced to a calculus. It makes one estimate accurately what right-minded people feel by a sort of instinct, often without being able to give a reason for it. It leaves nothing arbitrary in the choice of opinions and of making up one's mind, every time one is able, by this means, to determine the most advantageous choice. Thereby, it becomes the most happy supplement to ignorance and to the weakness of the human mind. If one considers the analytical methods to which this theory has given rise, the truth of the principles that serve as the groundwork, the subtle and delicate logic needed to use them in the solution of the problems, the public-benefit businesses that depend on it, and the extension that it has received and may still receive from its application to the most important questions of natural philosophy and the moral sciences; if one observes also that even in matters which cannot be handled by the calculus, it gives the best rough estimates to guide us in our judgements, and that it teaches us to guard ourselves from the illusions which often mislead us, one will see that there is no science at all more worthy of our consideration, and that it would be a most useful part of the system of public education."

*Philosophical Essay on Probabilities*, 1814 <doi.org/10.1007/978-1-4612-418>

ChatGPT's thought processes:

I'm tying together the rich tapestry of philosophical insights and category theory, exploring how enriched categories, with their hom-sets as objects in a monoidal category, aid in managing complexity, partial orders, or metrics, moving beyond conventional morphism sets.

I'm starting to see how enriched categories with hom-sets as partially ordered sets can capture complexity, cost, or transformation intricacies through preorders, partial orders, metrics, or probabilities.

I’m piecing together how enriched categories can reveal more about morphisms by incorporating structures like complexity, cost, or probability, allowing short and long morphisms to be systematically compared.
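For what it's worth, the standard concrete example behind those musings (not something ChatGPT got to before crashing) is Lawvere's observation that a metric space is a category enriched over $([0,\infty], \geq, +)$: hom-objects are distances rather than sets of morphisms, and enriched composition is exactly the triangle inequality:

```latex
\mathrm{hom}(x,y) \in [0,\infty],
\qquad
\mathrm{hom}(x,y) + \mathrm{hom}(y,z) \;\geq\; \mathrm{hom}(x,z),
\quad\text{i.e.}\quad
d(x,y) + d(y,z) \geq d(x,z).
```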

... and then it crashed, without producing a response. This is very similar to my experience with this material; yes, quite human-like.

Over the past couple of years, I've really fallen in love with #tikz and all of its quirks.

TikZ is a plotting/graphics package for LaTeX that is especially useful for creating mathematical diagrams.

The support for mathematical notation is unbeatable and the flexibility of the language is extremely high. Also, graphics rendered to pdf/svg in this way are extremely lightweight and reproducible.

I do find the syntax very challenging to remember, though, so I put together this GitHub repository to keep track of TikZ code I've written.

github.com/ctesta01/tikz-examp

Each graphic shown in the README is linked to its underlying .tex code.

Also the README has several links to documentation / tutorials that I've found helpful along with some tips I've learned from experience.
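For anyone who hasn't tried it, a minimal standalone TikZ figure looks like this (a hypothetical sketch, not an example from the repository):

```latex
\documentclass[tikz]{standalone}
\begin{document}
\begin{tikzpicture}
  % Axes with arrowheads and math-mode labels
  \draw[->] (-0.2,0) -- (4,0) node[right] {$x$};
  \draw[->] (0,-0.2) -- (0,2.5) node[above] {$f(x)$};
  % A Gaussian-like curve, plotted directly with pgfmath
  \draw[thick, domain=0:3.8, samples=100]
    plot (\x, {2*exp(-(\x-2)^2)});
  \node at (2,-0.35) {$\mu$};
\end{tikzpicture}
\end{document}
```

Compiling with `pdflatex` yields a tightly cropped PDF ready to embed or convert to SVG.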

Dear LazyWeb: is there a C/C++, #RustLang or #Zig equivalent of #SciPy’s `stats` module for statistical analysis? Namely:
• a collection of common PDFs (probability density functions);
• MLE (maximum likelihood estimation) for these common distributions;
• KDE (kernel density estimation).

SciPy’s API is a pleasure to work with. Anything that comes close but usable from C/C++/Rust/Zig would make my life so much easier. Boosts appreciated for visibility.
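For reference, here are the three bullet points above expressed in SciPy itself, i.e. the API any C/C++/Rust/Zig replacement would be measured against:

```python
# The three requested features, as SciPy exposes them.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=10_000)

# 1. Common PDFs: evaluate the standard normal density at 0.
p = stats.norm.pdf(0.0, loc=0.0, scale=1.0)

# 2. MLE: fit a normal distribution to the sample.
loc_hat, scale_hat = stats.norm.fit(data)

# 3. KDE: Gaussian kernel density estimate, evaluated at a point.
kde = stats.gaussian_kde(data)
density_at_5 = kde(5.0)[0]

print(round(p, 4), round(loc_hat, 1), round(scale_hat, 1))
```

Each `stats` distribution object bundles `pdf`, `cdf`, `fit`, and sampling in one place, which is a large part of what makes the API pleasant.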

[2502.05244] Probabilistic Artificial Intelligence
arxiv.org/abs/2502.05244
news.ycombinator.com/item?id=4

Manuscript 418pp ...

arXiv.org · Probabilistic Artificial Intelligence

Artificial intelligence commonly refers to the science and engineering of artificial systems that can carry out tasks generally associated with requiring aspects of human intelligence, such as playing games, translating languages, and driving cars. In recent years, there have been exciting advances in learning-based, data-driven approaches towards AI, and machine learning and deep learning have enabled computer systems to perceive the world in unprecedented ways. Reinforcement learning has enabled breakthroughs in complex games such as Go and challenging robotics tasks such as quadrupedal locomotion. A key aspect of intelligence is to not only make predictions, but reason about the uncertainty in these predictions, and to consider this uncertainty when making decisions. This is what this manuscript on "Probabilistic Artificial Intelligence" is about. The first part covers probabilistic approaches to machine learning. We discuss the differentiation between "epistemic" uncertainty due to lack of data and "aleatoric" uncertainty, which is irreducible and stems, e.g., from noisy observations and outcomes. We discuss concrete approaches towards probabilistic inference and modern approaches to efficient approximate inference. The second part of the manuscript is about taking uncertainty into account in sequential decision tasks. We consider active learning and Bayesian optimization -- approaches that collect data by proposing experiments that are informative for reducing the epistemic uncertainty. We then consider reinforcement learning and modern deep RL approaches that use neural network function approximation. We close by discussing modern approaches in model-based RL, which harness epistemic and aleatoric uncertainty to guide exploration, while also reasoning about safety.

The impact #probability for #2024YR4 has been revised downward in the past days to well below 1%.

I wonder if showing how these impact probabilities change over time also helps to convince people that they should switch doors in the Monty Hall problem. 🚪🚪🐐

The analogy can be strengthened if you take into account that looking for the asteroid in a particular direction and *not* seeing it may also lower the impact probability. #PhilSci
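For the unconvinced, the door-switching claim is easy to check empirically; a quick simulation using only the Python standard library:

```python
# Monty Hall simulation: compare the win rate of always switching
# against always staying, over many random games.
import random

def play(switch: bool) -> bool:
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a goat door that is neither the pick nor the car.
    opened = next(d for d in doors if d != pick and d != car)
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

random.seed(42)
trials = 100_000
switch_wins = sum(play(switch=True) for _ in range(trials)) / trials
stay_wins = sum(play(switch=False) for _ in range(trials)) / trials
print(f"switch: {switch_wins:.3f}, stay: {stay_wins:.3f}")
```

The switching strategy wins about 2/3 of the time and staying about 1/3, matching the standard analysis: like the asteroid observations, the host's reveal is new information that reshapes the probabilities.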

Animation from blogs.esa.int/rocketscience/20


Music Tribe support asked me to send them a video to figure out what is going on. That's a first for me. But I did, and I hope they figure out the issue with my Behringer Chaos. It is a clone of a Mutable Instruments Marbles, but not space-saving like, say, a uMarbles or Pachinko.

Conference: Probability in Philosophy and Science at the University of Graz 🇦🇹, September 24-26.
It will be a long train journey, but I will be speaking there; I don't think any of the other speakers are active on Mastodon.
🎯 More info on the event: philevents.org/event/show/1315
🎲 Call for papers (deadline April 30): philevents.org/event/show/1315
#PhilSci #QuantumPhysics #Probability #Epistemology

philevents.org · Probability in Philosophy and Science

Probabilities permeate all aspects of our lives. The beliefs we form, the various risks we assess, the epistemic uncertainty we account for, and the decisions we make typically depend on how likely we take some relevant states of the world to be. As finite subjects living in a vast world, we are constantly facing various forms of uncertainty, and it is our experiences that provide us with our most direct, though still limited, access to the world. Given the limited and perspectival character of our fundamental justifiers, i.e. our experiences, our beliefs (and the justification we have for them) seem to come in degrees. Many identify such degrees of belief, or credences, with subjective probabilities. On the other hand, it seems that probability can be interpreted in a more objective way as well. One's subjective probability assignments may be internally consistent and yet strike us as objectively unjustified or inadequate. Accordingly, some aim to account for probability in epistemic terms that are less subjective than belief, linking it to evidence or justification. Finally, and perhaps most widely assumed, there is the view that probability is a feature of the world itself: probabilities are completely independent from any subject but are out there in the world. This view seems to align well with certain branches of science such as statistics or quantum mechanics. Of course, we can also be pluralists about probability, allowing for both subjective and objective forms of probabilities. However, despite this widespread and apparently intuitive distinction, the details of this overall picture still remain widely debated. What is more, the role probability plays in science remains strongly contested. For instance, quantum mechanics is one of the most fundamental scientific theories, but it is far from clear how we are supposed to interpret quantum probabilities.
For some, they are prime examples of objective probabilities; others advocate a thoroughly subjective interpretation. On top of that, various researchers working on reconstructing quantum theory from information-theoretic principles have come to the conclusion that quantum theory fundamentally is a theory of probability.

This conference has three interrelated aims: to 1) interrogate the nature and epistemological implications of probability, 2) address the role of probability in science, and 3) assess the epistemic, formal, and pragmatic norms governing our probability assignments. Issues we wish to discuss include, but are not limited to:
- the nature of probability;
- pluralism about probability;
- the relation between probability and concepts such as belief, experience, and justification;
- the constraints on rational probability assignments;
- the relation between probability and reality;
- phenomenological approaches to probability;
- the place of probability in action;
- the role that probabilities play in the sciences;
- the interpretation of quantum probabilities.

My latest Substack article: Zipf’s Law in Python.

Zipf's Law describes a probability distribution that can be applied to many types of data such as populations, incomes and company revenues. In this article I'll use it to analyse word frequencies in Bram Stoker's Dracula. (Spoiler alert: the most common word is "the".)
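The core of such an analysis fits in a few lines; a sketch using a short stand-in string rather than the full text of Dracula:

```python
# Count word frequencies and list them by rank. Under Zipf's Law,
# a word's frequency is roughly proportional to 1/rank.
import re
from collections import Counter

text = "the night was dark and the wind was cold and the road was long"
words = re.findall(r"[a-z']+", text.lower())
counts = Counter(words)

for rank, (word, freq) in enumerate(counts.most_common(3), start=1):
    print(rank, word, freq)
```

Run on the whole novel, the top of this ranking is dominated by function words like "the", "and" and "of", which is exactly the Zipfian pattern the article explores.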

#statistics #probability #python #pythonprogramming

codedrome.substack.com/p/zipfs

CodeDrome · Zipf's Law in Python · By Chris Webb