LogiQminds


The most significant analytical tool: the Normal Distribution.

Mathematicians in the 19th century began to notice statistical patterns in chance events. They recognized that the probability of observing an individual data value is greatest near its mean (average) and dies off rapidly as the distance from the mean grows. The normal distribution is the formula for this phenomenon; because its curve looks like a bell, it is also known as the bell curve.
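For reference, the bell curve is described by the normal (Gaussian) probability density function, where the mean is the center of the bell and the standard deviation controls its width:

```latex
f(x) = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}}
```

The density peaks at the mean and falls off rapidly as the value moves away from it, which is exactly the pattern described above.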

In 1835, the Belgian sociologist and mathematician Adolphe Quetelet collected a large amount of data on births, deaths, crimes, heights, weights, and more. When he plotted the proportions of individuals against these social variables, he observed patterns shaped like beautiful bell curves. This was a remarkable finding: no one had expected social variables to conform to any kind of mathematical law, since the underlying causes involved complex human choices. It led to the realization that the behavior of people in the mass is far more predictable than that of any individual.

The normal distribution is a routine tool for hypothesis testing and is widely used in medical trials of new treatments and drugs. For instance, observational data may suggest that a new medicine is twice as effective as the old one, yet be distorted by human error or other sources of variation. We can check the reliability of such data using the normal distribution, which quantifies how likely the observed result is to have arisen in error. The method pits two hypotheses against each other: that the new medicine really is more effective, and that the result arose purely by chance. If the probability of the latter hypothesis, as given by the bell curve, is low, the observation is considered reliable.
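As a minimal sketch of this idea (not the specific trial described above), the snippet below runs a two-proportion z-test, which relies on the normal approximation: it measures how many standard deviations the observed difference in cure rates lies from what chance alone would predict, and converts that into a probability. The cure counts and sample sizes are made-up numbers for illustration.

```python
from math import sqrt, erf

def normal_cdf(z):
    """Standard normal cumulative distribution function (area under the bell curve up to z)."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def two_proportion_z_test(cured_new, n_new, cured_old, n_old):
    """Test whether the new drug's cure rate exceeds the old one's,
    using the normal approximation to the sampling distribution."""
    p_new = cured_new / n_new
    p_old = cured_old / n_old
    # Pooled cure rate under the null hypothesis "no real difference".
    p_pool = (cured_new + cured_old) / (n_new + n_old)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_new + 1 / n_old))
    z = (p_new - p_old) / se
    # One-sided p-value: probability of seeing a difference this large by chance alone.
    p_value = 1.0 - normal_cdf(z)
    return z, p_value

# Hypothetical trial: 80/100 cured with the new drug vs. 40/100 with the old one.
z, p = two_proportion_z_test(80, 100, 40, 100)
print(f"z = {z:.2f}, p-value = {p:.6f}")
# A small p-value means "the result arose by chance" is unlikely,
# so the observed improvement is considered reliable.
```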