
When tossing a coin, it is more likely to land on the same side as it was tossed.

2024-03-27

Highlights: When tossing a coin, it is more likely to land on the same side as it was tossed. The frequentist approach was used by great figures in the history of probability and statistics, such as the Comte de Buffon and Karl Pearson. The Bayesian approach treats probability as the degree of uncertainty we have about a process, with observations helping to reduce that uncertainty. This is the approach used in a recent study carried out by more than 50 researchers from the Netherlands, and it allows us to trust the coin toss to break a tie.


An analysis of the simple game of tossing a coin allows us to reflect on randomness and different approaches to probability: the frequentist and the Bayesian.


Imagine that you take a coin and are about to throw it into the air.

What would you say is the probability that it will land heads?

Does it matter which side you throw it from?

Most people would say that the probability of coming up heads is 50%, regardless of the initial position of the coin, but it is not that simple.

The two previous questions correspond to two different events.

The first deals with the probability that it will come up heads – or tails, which amounts to the same thing.

However, the second refers to the probability that it will come up heads if the coin was heads up before it was thrown.

This second probability, known as a conditional probability, can be different from the first.

On this question, in 2007, mathematicians Persi Diaconis, Susan Holmes and Richard Montgomery proposed a physical model that showed a slight bias in favor of the coin landing on the same side it was tossed from.

In conclusion, they indicated that, when tossing a coin in the air, it will land on the same side as it was thrown 51% of the time.

However, if you don't know how the coin is placed, the probability of it coming up heads – or tails – is 50%.
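As a quick arithmetic check (an illustration, not something in the article), the law of total probability shows why the slight same-side bias cancels out when the starting side is unknown and equally likely to be heads or tails:

```python
# Illustrative check: the same-side bias cancels when the starting side is unknown.
p_same_side = 0.51      # Diaconis-Holmes-Montgomery estimate for landing as thrown
p_start_heads = 0.5     # starting side unknown, assumed equally likely heads or tails

# Law of total probability:
# P(heads) = P(heads | starts heads-up) * P(starts heads-up)
#          + P(heads | starts tails-up) * P(starts tails-up)
p_heads = p_same_side * p_start_heads + (1 - p_same_side) * (1 - p_start_heads)
print(p_heads)  # 0.5
```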

But how can we claim, with complete certainty, that this is the probability?

This is an estimation problem, that is, from the start we do not know the probability of getting heads and we want to correctly estimate its value based on the evidence.


The best-known approach to do this starts from the interpretation of probability as frequency, giving rise to what is known as frequentist statistics.

More specifically, under this approach, the probability that we want to estimate is interpreted as the proportion of heads that will be observed when tossing the coin infinitely many times under the same conditions.

Thus, to approximate it, it is enough to toss the coin a large number of times, under the same conditions, and approximate the true probability by the proportion of heads observed.
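As a minimal sketch of that recipe (not from the article), the following simulation estimates the probability of heads by the observed proportion; the true value p_true is an assumed parameter of the demonstration:

```python
import random

def estimate_heads_probability(num_tosses, p_true=0.5, seed=0):
    """Frequentist estimate: the proportion of heads in num_tosses simulated tosses."""
    rng = random.Random(seed)
    heads = sum(rng.random() < p_true for _ in range(num_tosses))
    return heads / num_tosses

for n in (100, 10_000, 1_000_000):
    print(n, estimate_heads_probability(n))
```

With 10,000 simulated tosses the estimate typically falls within about one percentage point of the true value, which illustrates how the approximation improves as the number of tosses grows.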

The frequentist approach was used by great personalities in the history of probability and statistics such as the Comte de Buffon or Karl Pearson.

The former tossed a coin 4,040 times, obtaining 2,048 heads, which gives a probability estimate of 2048/4040 = 0.5069, that is, 50.69%.

The latter made 24,000 tosses, of which 12,012 came up heads, that is, 50.05% of the time.

However, the starting point of this approach creates a certain paradox: if a coin is tossed under exactly the same conditions, wouldn't we expect to obtain the same result?

Newtonian physics would say yes; in fact, it is small variations in the initial conditions that induce randomness in the results, which is why the premise of exact repetition is paradoxical.

This starting point is even more elusive when studying the probability of having a disease... in that case, what should be repeated?

The person's life?

Also, how many tosses will it take to get close enough to the true value?

Thus, although the frequentist approach is valid and very well studied, it sometimes leads to reasoning that is difficult to interpret and has even been questioned in some scientific journals.


To overcome these limitations, it is possible to use another statistical approach: the one known as Bayesian.

Under this paradigm, probability is the degree of uncertainty we have about a process and the observations made contribute to improving that uncertainty.

It is a mathematical representation of the learning process.

Returning to the example of the coin, we seek to estimate the value of the probability that it will come up heads.

To do this, the first step is to determine possible a priori values for this probability.

In the case of not having any prior knowledge, it can be established that the probability could be anything between 0 and 100%.

Afterwards, many coin tosses are made; each observation reduces the uncertainty, narrowing the range of credible values for the probability of heads.
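As a hedged sketch of this kind of updating (an illustration under simple assumptions, not the model used in the study), a uniform prior on the probability of heads corresponds to a Beta(1, 1) distribution; after observing a number of heads and tails, conjugacy gives a Beta posterior from which a credible interval can be read. The toss counts below are hypothetical:

```python
from scipy.stats import beta

# Hypothetical counts, for illustration only (not the data from the study).
heads, tails = 5_067, 4_933

# Uniform prior Beta(1, 1); with a binomial likelihood the posterior is
# Beta(1 + heads, 1 + tails) by conjugacy.
posterior = beta(1 + heads, 1 + tails)

# 95% credible interval for the probability of heads.
low, high = posterior.ppf([0.025, 0.975])
print(f"95% credible interval: {low:.4f} - {high:.4f}")
```

As more tosses are observed, the interval narrows around the observed proportion, which is precisely the learning process described above.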

This is the approach used in a recent study carried out by more than 50 researchers from the Netherlands.

The study moves away from the idea of repetition and its interpretive complications: the researchers performed 350,757 tosses with different types of coins to obtain a range of a posteriori values for the probability of heads between 49.9% and 50.3%.

Thus, this result reinforces the already known and tested idea of 50-50 and therefore allows us to trust the coin toss to break a tie.

In the same study they were also able to support the Diaconis, Holmes and Montgomery model: they established a range for the probability of the coin landing in its original position of between 50.3% and 50.9%; that is, although it is not exactly 51%, it does point to the existence of a certain bias.

Beyond this example, Bayesian statistics has played a critical role in historical events such as Alan Turing's deciphering of the Enigma machine.

It is currently used wherever complex processes are studied, such as the distribution of species, climate models, or the spatial relationships underlying health and other phenomena.

In addition, it is one of the techniques present within what is known as machine learning.

Anabel Forte is a professor at the University of Valencia.

Editing and coordination: Ágata A. Timón G Longoria (ICMAT).

Café y Teoremas is a section dedicated to mathematics and the environment in which it is created, coordinated by the Institute of Mathematical Sciences (ICMAT), in which researchers and members of the center describe the latest advances in this discipline, share points of encounter between mathematics and other social and cultural expressions, and remember those who marked its development and knew how to transform coffee into theorems.

The name evokes the definition of the Hungarian mathematician Alfréd Rényi: “A mathematician is a machine that transforms coffee into theorems.”


Source: El País
