Probability distribution

In [[probability theory]] and [[statistics]], a '''probability distribution''' is a mathematical [[Function (mathematics)|function]] that gives the probabilities of occurrence of the different possible outcomes of an [[Experiment (probability theory)|experiment]]. In more technical terms, a probability distribution is a description of a [[Randomness|random]] phenomenon in terms of the [[Вероватноћа|probabilities]] of events. For example, if the [[случајна променљива|random variable]] {{mvar|X}} is used to denote the outcome of a coin toss (the "experiment"), then the probability distribution of {{mvar|X}} would take the value 0.5 for {{math|''X'' {{=}} heads}} and 0.5 for {{math|''X'' {{=}} tails}} (assuming the coin is fair). Examples of random phenomena include the results of an experiment or [[Survey methodology|survey]]. For a continuous random variable, the distribution is often specified by a '''probability density''', so that the probability law is expressed in the form of an [[интеграл|integral]].
A probability distribution is specified in terms of an underlying [[sample space]], which is the [[Set (mathematics)|set]] of all possible [[Outcome (probability)|outcomes]] of the random phenomenon being observed. The sample space may be the set of [[real numbers]] or a set of [[vector (mathematics)|vectors]], or it may be a list of non-numerical values; for example, the sample space of a coin flip would be {{math|{heads, tails} }}.
 
Probability distributions are generally divided into two classes. A '''discrete probability distribution''' (applicable to the scenarios where the set of possible outcomes is [[discrete probability distribution|discrete]], such as a coin toss or a roll of dice) can be encoded by a discrete list of the probabilities of the outcomes, known as a [[probability mass function]].<ref>{{cite web|title=AP Statistics Review - Density Curves and the Normal Distributions|url=http://apstatsreview.tumblr.com/post/50058615236/density-curves-and-the-normal-distributions?action=purge|accessdate=16 March 2015}}</ref> On the other hand, a '''continuous probability distribution''' (applicable to the scenarios where the set of possible outcomes can take on values in a continuous range (e.g. real numbers), such as the temperature on a given day) is typically described by [[probability density function]]s (with the probability of any individual outcome actually being 0). The [[normal distribution]] is a commonly encountered continuous probability distribution. More complex experiments, such as those involving [[stochastic processes]] defined in [[continuous time]], may demand the use of more general [[probability measure]]s.
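A discrete probability mass function can be represented directly as a mapping from outcomes to probabilities; a minimal sketch in Python (using exact rational arithmetic, so the names `coin_pmf` and `die_pmf` are illustrative, not from the article):

```python
from fractions import Fraction

# A discrete distribution can be encoded as a probability mass function:
# a mapping from each outcome in the sample space to its probability.
coin_pmf = {"heads": Fraction(1, 2), "tails": Fraction(1, 2)}
die_pmf = {face: Fraction(1, 6) for face in range(1, 7)}

# In both cases the probabilities over the whole sample space sum to 1.
print(sum(coin_pmf.values()), sum(die_pmf.values()))  # 1 1
```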
 
A probability distribution whose sample space is one-dimensional (for example real numbers, list of labels, ordered labels or binary) is called [[Univariate distribution|univariate]], while a distribution whose sample space is a [[vector space]] of dimension 2 or more is called [[Multivariate distribution|multivariate]]. A univariate distribution gives the probabilities of a single [[random variable]] taking on various alternative values; a multivariate distribution (a [[joint probability distribution]]) gives the probabilities of a [[random vector]] – a list of two or more random variables – taking on various combinations of values. Important and commonly encountered univariate probability distributions include the [[binomial distribution]], the [[hypergeometric distribution]], and the [[normal distribution]]. The [[multivariate normal distribution]] is a commonly encountered multivariate distribution.
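The relation between a multivariate (joint) distribution and a univariate (marginal) one can be sketched with two independent fair coin flips; the variable names here are illustrative assumptions, not notation from the article:

```python
from fractions import Fraction
from itertools import product

# Joint (multivariate) distribution of two independent fair coin flips:
# each of the four ordered pairs has probability 1/4.
joint = {pair: Fraction(1, 4) for pair in product("HT", repeat=2)}

# Marginal (univariate) distribution of the first flip, obtained by
# summing the joint probabilities over the second coordinate.
marginal = {}
for (first, _second), p in joint.items():
    marginal[first] = marginal.get(first, Fraction(0)) + p

print(marginal["H"], marginal["T"])  # 1/2 1/2
```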
 
== Увод ==
[[File:Dice Distribution (bar).svg|thumb|250px|right|The [[probability mass function]] (pmf) ''p''(''S'') specifies the probability distribution for the sum ''S'' of counts from two [[dice]]. For example, the figure shows that ''p''(11) = 2/36 = 1/18. The pmf allows the computation of probabilities of events such as ''P''(''S'' > 9) = 1/12 + 1/18 + 1/36 = 1/6, and all other probabilities in the distribution.]]
 
To define probability distributions for the simplest cases, it is necessary to distinguish between '''discrete''' and '''continuous''' [[random variable]]s. In the discrete case, it is sufficient to specify a [[probability mass function]] <math>p</math> assigning a probability to each possible outcome: for example, when throwing a fair [[dice|die]], each of the six values 1 to 6 has the probability 1/6. The probability of an [[Event (probability theory)|event]] is then defined to be the sum of the probabilities of the outcomes that satisfy the event; for example, the probability of the event "the die rolls an even value" is
:<math>p(2) + p(4) + p(6) = 1/6+1/6+1/6=1/2.</math>
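The two-dice figure above can be verified with a short script; a sketch using exact rational arithmetic:

```python
from fractions import Fraction
from itertools import product

# Build the probability mass function of the sum S of two fair dice:
# each of the 36 ordered outcomes (a, b) has probability 1/36.
pmf = {}
for a, b in product(range(1, 7), repeat=2):
    pmf[a + b] = pmf.get(a + b, Fraction(0)) + Fraction(1, 36)

print(pmf[11])                             # 1/18, as in the figure
print(sum(pmf[s] for s in (10, 11, 12)))   # P(S > 9) = 1/6
```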
 
In contrast, when a random variable takes values from a continuum then typically, any individual outcome has probability zero and only events that include infinitely many outcomes, such as intervals, can have positive probability. For example, the probability that a given object weighs ''exactly'' 500&nbsp;g is zero, because the probability of measuring exactly 500&nbsp;g tends to zero as the accuracy of our measuring instruments increases. Nevertheless, in quality control one might demand that the probability of a "500&nbsp;g" package containing between 490&nbsp;g and 510&nbsp;g should be no less than 98%, and this demand is less sensitive to the accuracy of measurement instruments.
 
Continuous probability distributions can be described in several ways. The [[probability density function]] describes the [[infinitesimal]] probability of any given value, and the probability that the outcome lies in a given interval can be computed by [[Integration (mathematics)|integrating]] the probability density function over that interval.<ref>{{cite book|chapter-url=https://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter4.pdf|chapter=Conditional Probability - Discrete Conditional|last1=Grinstead|first1=Charles M.|last2=Snell|first2=J. Laurie|publisher=Orange Grove Texts|isbn=161610046X |title=Grinstead & Snell's Introduction to Probability|date=2009|accessdate=2019-07-25}}</ref> On the other hand, the [[cumulative distribution function]] describes the probability that the random variable is no larger than a given value; the probability that the outcome lies in a given interval can be computed by taking the difference between the values of the cumulative distribution function at the endpoints of the interval. The cumulative distribution function is the [[antiderivative]] of the probability density function provided that the latter function exists.
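The quality-control example above can be made concrete with the cumulative distribution function of a normal distribution; the model below (fill weight normal with mean 500 g and standard deviation 4 g) is an assumed, purely illustrative choice:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    # CDF of the normal distribution, written with the error function:
    # F(x) = (1 + erf((x - mu) / (sigma * sqrt(2)))) / 2.
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Probability of the interval [490 g, 510 g] is the difference of the
# CDF at the endpoints (assumed model: mean 500 g, sigma 4 g).
p = normal_cdf(510, 500, 4) - normal_cdf(490, 500, 4)
print(p > 0.98)  # True: the interval carries roughly 98.8% probability
```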
 
[[File:Standard deviation diagram.svg|right|thumb|250px|The [[probability density function]] (pdf) of the [[normal distribution]], also called Gaussian or "bell curve", the most important continuous random distribution. As notated on the figure, the probabilities of intervals of values correspond to the area under the curve.]]
 
== Definition ==
 
In the continuous case, a probability distribution is given by a non-negative density function ''f'' defined on the [[реалан број|set of real numbers]] <math>\mathbb{R}</math>, such that the probability that the random variable takes a value in the interval [''a'', ''b''], for any ''a'' < ''b'', is given by the integral <math>\int_a^b f(x)\,dx.</math> The integral of ''f'' over all of <math>\mathbb{R}</math> equals 1.
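Both properties of a density — interval probabilities as integrals, and total mass 1 — can be checked numerically; a sketch using the standard normal density and a simple midpoint rule (the helper names `f` and `integrate` are illustrative):

```python
from math import exp, pi, sqrt

def f(x, mu=0.0, sigma=1.0):
    # Normal probability density function, an example of a density f.
    return exp(-((x - mu) ** 2) / (2.0 * sigma**2)) / (sigma * sqrt(2.0 * pi))

def integrate(g, a, b, n=100_000):
    # Midpoint-rule approximation of the integral of g over [a, b].
    h = (b - a) / n
    return h * sum(g(a + (i + 0.5) * h) for i in range(n))

print(round(integrate(f, -1.0, 1.0), 4))    # P(-1 <= X <= 1), about 0.6827
print(round(integrate(f, -20.0, 20.0), 6))  # total probability, close to 1
```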
 
== See also ==

* [[Функција расподеле|Cumulative distribution function]]
 
== References ==
{{Reflist}}
 
== Further reading ==
{{refbegin}}
* B. S. Everitt: ''The Cambridge Dictionary of Statistics'', [[Cambridge University Press]], Cambridge (3rd edition, 2006). {{ISBN|0-521-69027-7}}
* Christopher M. Bishop: ''Pattern Recognition and Machine Learning'', [[Springer Nature|Springer]] (2006). {{ISBN|0-387-31073-8}}
* {{cite journal |doi = 10.1016/j.ejmp.2014.05.002 |pmid = 25059432 |title = Data distributions in magnetic resonance images: A review |journal = [[Physica Medica]] |volume = 30 |issue = 7 |pages = 725–741 |year = 2014 |last1 = den Dekker |first1 = A. J. |last2 = Sijbers |first2 = J.}}
* {{cite book | author = Pierre Simon de Laplace | year = 1812 | title = Analytical Theory of Probability}}
* {{cite book | author = Andrei Nikolajevich Kolmogorov | year = 1950 | title = Foundations of the Theory of Probability}}
* {{cite book | author = Patrick Billingsley | title = Probability and Measure | publisher = John Wiley and Sons | location = New York, Toronto, London | year = 1979 | isbn = 0-471-00710-2}}
* {{cite book | author = David Stirzaker | year = 2003 | title = Elementary Probability | isbn = 0-521-42028-8}}
{{refend}}
 
== External links ==
{{commons|Probability distribution|Probability distribution}}
* {{springer|title=Probability distribution|id=p/p074900}}
* [http://threeplusone.com/FieldGuide.pdf Field Guide to Continuous Probability Distributions], Gavin E. Crooks.
 
{{Authority control}}
 
{{DEFAULTSORT:Расподела вероватноће}}
 
[[Категорија:Теорија вероватноће]]