[1609.06840] Exact Sampling from Determinantal Point Processes
Determinantal point processes (DPPs) are an important concept in random matrix theory and combinatorics. They have also recently attracted interest in the study of numerical methods for machine learning, as they offer an elegant "missing link" between independent Monte Carlo sampling and deterministic evaluation on regular grids, applicable to a general set of spaces. This is helpful whenever an algorithm explores to reduce uncertainty, such as in active learning, Bayesian optimization, reinforcement learning, and marginalization in graphical models. To draw samples from a DPP in practice, existing literature focuses on approximate schemes of low cost, or comparably inefficient exact algorithms like rejection sampling. We point out that, for many settings of relevance to machine learning, it is also possible to draw exact samples from DPPs on continuous domains. We start from an intuitive example on the real line, which is then generalized to multivariate real vector spaces. We also compare to previously studied approximations, showing that exact sampling, despite higher cost, can be preferable where precision is needed.
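For a finite ground set, exact DPP sampling is already standard via the spectral (eigendecomposition) algorithm — the discrete analogue of the continuous-domain setting the paper treats. A minimal sketch, assuming a PSD kernel matrix `L` (the function name and test kernel are illustrative, not the paper's code):

```python
import numpy as np

def sample_dpp(L, rng):
    """Draw one exact sample from a finite DPP with PSD kernel matrix L
    (spectral algorithm of Hough et al. / Kulesza & Taskar)."""
    vals, vecs = np.linalg.eigh(L)
    # Phase 1: keep eigenvector i independently with prob lambda_i / (1 + lambda_i)
    keep = rng.random(len(vals)) < vals / (1.0 + vals)
    V = vecs[:, keep]
    sample = []
    # Phase 2: pick items with prob proportional to squared row norms of V,
    # then project V onto the subspace orthogonal to the chosen coordinate
    while V.shape[1] > 0:
        probs = np.sum(V**2, axis=1)
        probs /= probs.sum()
        i = rng.choice(len(probs), p=probs)
        sample.append(int(i))
        j = int(np.argmax(np.abs(V[i, :])))       # a column with V[i, j] != 0
        col = V[:, j] / V[i, j]
        V = V - np.outer(col, V[i, :])            # zero out row i in all columns
        V = np.delete(V, j, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)                # re-orthonormalize
    return sorted(sample)

# Hypothetical kernel: diagonal, so items are included independently
rng = np.random.default_rng(0)
L_kernel = np.eye(4) * 3.0
subset = sample_dpp(L_kernel, rng)
```

The sample is a subset (no repeats); for a diagonal kernel each item appears independently with probability lambda/(1+lambda).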
sampling  Statistics  MachineLearning  Probability 
22 hours ago by csantos
The maths of randomness
For an idea we are all familiar with, randomness is surprisingly hard to formally define. We think of a random process as something that evolves over time but in a way we can’t predict. One example would be the smoke that comes out of your chimney. Although there is no way of exactly predicting the shape of your smoke plume, we can use probability theory – the mathematical language we use to describe randomness – to predict what shapes the plume of smoke is more (or less) likely to take.
Submitted by Rachel on April 20, 2018
statistics  probability  Math 
yesterday by rcyphers
The Probability Distribution of the Future
The key point is that the future should be viewed as a range of possibilities and their respective likelihoods - essentially, a probability distribution.

Learn to adjust the probabilities on the fly as you get more information. Bayesian updating.
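Bayesian updating can be made concrete with a conjugate Beta-Binomial model; the prior and observation counts below are hypothetical, chosen only to show the mechanics:

```python
# Bayesian updating sketch: revise a probability as new evidence arrives.
# Beta(alpha, beta) prior on an event's success rate; conjugate update
# simply adds observed successes and failures to the parameters.

def update(alpha, beta, successes, failures):
    """Posterior Beta parameters after new observations."""
    return alpha + successes, beta + failures

alpha, beta = 2.0, 2.0                       # prior belief centered on 0.5
alpha, beta = update(alpha, beta, successes=7, failures=1)
posterior_mean = alpha / (alpha + beta)      # (2+7)/(2+7+2+1) = 0.75
```

Each batch of data shifts the estimate; the same `update` call can be applied repeatedly "on the fly" as information accumulates.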

But ... in reality, only one thing will happen. So, are you comfortable if that does happen? As Buffett says, "In order to win, you must first survive." Consequences matter.


In the real world, risk = probability of failure × consequences. Risk is not only financial.
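That formula is easy to illustrate with hypothetical figures: a rare but ruinous event can carry more risk than a frequent minor one.

```python
# Risk as expected loss: risk = P(failure) * consequence (hypothetical numbers).

def risk(p_failure, consequence):
    return p_failure * consequence

frequent_minor = risk(0.20, 1_000)    # 20% chance of a $1k loss  -> 200
rare_ruinous = risk(0.01, 100_000)    # 1% chance of a $100k loss -> 1000
```

Despite its far lower probability, the ruinous event dominates — which is why survival, not just expected value, has to enter the decision.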

Most importantly, knowing the outcome does not teach you about the risk of the decision. Knowing that something worked out, we argue that it wasn't that risky after all. But what if, in reality, we were simply fortunate?

The truth is that most times we don't know the probability distribution at all. Because the world is not a predictable casino game — an error Nassim Taleb calls the Ludic Fallacy — the best we can do is guess. With intelligent estimations, we can work to get the rough order of magnitude right, understand the consequences if we're wrong, and always be sure to never fool ourselves after the fact.
future  probability  risk  consequences  howardmarks  warrenbuffet  farnamstreet  nnt  taleb 
2 days ago by drmeme
Probability Theory (For Scientists and Engineers)
Formal probability theory is a rich and complex field of mathematics with a reputation for being confusing if not outright impenetrable. Much of that intimidation, however, is due not to the abstract mathematics but rather how they are employed in practice. In particular, many introductions to probability theory sloppily confound the abstract mathematics with their practical implementations, convoluting what we can calculate in the theory with how we perform those calculations. To ma...
tut  math  probability  overview 
6 days ago by cjitlal