
Distribution law of a discrete random variable. Distribution polygon

A random variable is a quantity that, as a result of an experiment, takes on a value that is not known in advance. Examples:

    1. The number of students present at a lecture.

    2. The number of houses put into operation in the current month.

    3. The ambient temperature.

    4. The weight of a fragment of an exploding shell.

Random variables are divided into discrete and continuous.

A discrete (discontinuous) random variable is one that takes separate values, isolated from each other, each with a certain probability.

The number of possible values of a discrete random variable can be finite or countable.

A continuous random variable is one that can take any value from some finite or infinite interval.

Obviously, the number of possible values of a continuous random variable is infinite.

In the examples above, 1 and 2 are discrete random variables, while 3 and 4 are continuous random variables.

In what follows, instead of the words "random variable" we will often use the abbreviation r.v.

As a rule, random variables will be denoted by capital letters, and their possible values by lowercase ones.

In the set-theoretic interpretation of the basic concepts of probability theory, a random variable X is a function of an elementary event: X = φ(ω), where ω is an elementary event belonging to the space Ω (ω ∈ Ω). The set Ξ of possible values of the r.v. X then consists of all the values that the function φ(ω) takes.

The law of distribution of a random variable is any rule (table, function) that allows one to find the probabilities of all kinds of events associated with the random variable (for example, the probability that it takes some particular value or falls in some interval).

Forms of specifying the distribution laws of random variables. The distribution series.

This is a table whose top row lists all possible values of the random variable X in ascending order, x_1, x_2, …, x_n, and whose bottom row lists the probabilities of these values, p_1, p_2, …, p_n, where p_i = P(X = x_i).

Since the events (X = x_1), (X = x_2), … are mutually exclusive and form a complete group, the sum of all probabilities in the bottom row of the distribution series is equal to one: p_1 + p_2 + … + p_n = 1.

The distribution series is used to specify the distribution law of only discrete random variables.

Distribution polygon

A graphical representation of a distribution series is called a distribution polygon. It is constructed as follows: at each possible value of the r.v., a perpendicular to the x-axis is erected, on which the probability of that value is plotted. For clarity (and only for clarity!), the resulting points are connected by straight segments.

Cumulative distribution function (or simply distribution function).

This is a function that, for each value of the argument x, is numerically equal to the probability that the random variable X will be less than x.

The distribution function is denoted by F(x): F(x) = P(X < x).

Now a more precise definition of a continuous random variable can be given: a random variable is called continuous if its distribution function is a continuous, piecewise differentiable function with a continuous derivative.

The distribution function is the most universal form of specifying a random variable: it can be used to define the distribution laws of both discrete and continuous r.v.
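Following the convention above that F(x) = P(X < x) (strict inequality), here is a minimal Python sketch of the distribution function of a discrete r.v.; the values and probabilities are illustrative:

```python
# F(x) = P(X < x): sum the probabilities of all values strictly below x.
# The distribution below is illustrative, not taken from a problem above.
def cdf(values, probs, x):
    return sum(p for v, p in zip(values, probs) if v < x)

values = [1, 3, 6, 8]
probs = [0.2, 0.1, 0.4, 0.3]

print(cdf(values, probs, 1))    # 0 -- strict inequality: P(X < 1) = 0
print(cdf(values, probs, 4))    # ≈ 0.3 (= 0.2 + 0.1)
print(cdf(values, probs, 100))  # ≈ 1, to the right of all values
```

Note that F(x) is a step function: it jumps by p_i at each x_i and is constant in between.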

Problem 14. In a cash lottery with a total of 10,000 tickets, 1 win of 1,000,000 rubles, 10 wins of 100,000 rubles, and 100 wins of 1,000 rubles each are drawn. Find the distribution law of the random winnings X for the owner of one lottery ticket.

Solution. Possible values of X: x_1 = 0; x_2 = 1000; x_3 = 100,000; x_4 = 1,000,000. Their probabilities are, respectively: p_2 = 100/10,000 = 0.01; p_3 = 10/10,000 = 0.001; p_4 = 1/10,000 = 0.0001; p_1 = 1 − 0.01 − 0.001 − 0.0001 = 0.9889.

Therefore, the distribution law of the winnings X can be given by the following table:

X | 0      | 1000 | 100,000 | 1,000,000
p | 0.9889 | 0.01 | 0.001   | 0.0001
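As a quick cross-check of Problem 14, the same distribution can be rebuilt from the ticket counts; this is only a calculator sketch, and the variable names are ours:

```python
# Problem 14: 10,000 tickets; 1 win of 1,000,000 rub., 10 of 100,000 rub.,
# 100 of 1,000 rub.; all remaining tickets win nothing.
total_tickets = 10_000
wins = {1_000_000: 1, 100_000: 10, 1_000: 100}

probs = {x: n / total_tickets for x, n in wins.items()}
probs[0] = 1 - sum(probs.values())  # probability of winning nothing

for x in sorted(probs):
    print(x, probs[x])  # p = 0.9889, 0.01, 0.001, 0.0001 (up to float rounding)
```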

Construct the distribution polygon for a random variable with values x_i = 1, 3, 6, 8 and probabilities p_i = 0.2, 0.1, 0.4, 0.3.

Solution. Let us set up a rectangular coordinate system, plotting the possible values x_i along the abscissa axis and the corresponding probabilities p_i along the ordinate axis. Plot the points M_1(1; 0.2), M_2(3; 0.1), M_3(6; 0.4) and M_4(8; 0.3). Connecting these points with straight line segments, we obtain the desired distribution polygon.

§2. Numerical characteristics of random variables

A random variable is completely characterized by its distribution law. An averaged description of a random variable can be obtained using its numerical characteristics.

2.1. Mathematical expectation. Variance.

Let a random variable X take the values x_1, x_2, …, x_n with probabilities p_1, p_2, …, p_n respectively.

Definition. The mathematical expectation of a discrete random variable is the sum of the products of all its possible values and the corresponding probabilities:

M(X) = x_1 p_1 + x_2 p_2 + … + x_n p_n = Σ x_i p_i.

Properties of mathematical expectation:

1. M(C) = C, where C is a constant.
2. M(cX) = cM(X).
3. M(X + Y) = M(X) + M(Y).
4. M(XY) = M(X)·M(Y), where X and Y are independent random variables.

The scattering of a random variable about its mean value is characterized by the variance and the standard deviation.

The variance of a random variable is the mathematical expectation of the squared deviation of the random variable from its mathematical expectation:

D(X) = M[(X − M(X))²].

For calculations, the following formula is used:

D(X) = M(X²) − (M(X))².

Properties of variance:

1. D(X) ≥ 0.
2. D(X + Y) = D(X) + D(Y), where X and Y are mutually independent random variables.
3. D(cX) = c²D(X).
4. D(C) = 0, where C is a constant.

The standard deviation is σ(X) = √D(X).
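A small numerical check of the computational formula D(X) = M(X²) − (M(X))² in Python; the distribution is illustrative:

```python
# Verify that D(X) = M[(X - M(X))^2] equals M(X^2) - (M(X))^2.
def mean(values, probs):
    return sum(x * p for x, p in zip(values, probs))

def var_by_definition(values, probs):
    m = mean(values, probs)
    return sum((x - m) ** 2 * p for x, p in zip(values, probs))

def var_by_formula(values, probs):
    return mean([x * x for x in values], probs) - mean(values, probs) ** 2

values = [1, 3, 6, 8]
probs = [0.2, 0.1, 0.4, 0.3]

print(var_by_definition(values, probs))  # ≈ 6.61
print(var_by_formula(values, probs))     # same value, computed the short way
```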

Problem 16. Find the mathematical expectation of the random variable Z = X + 2Y, if the mathematical expectations of the random variables X and Y are known: M(X) = 5, M(Y) = 3.

Solution. Using the properties of mathematical expectation, we get:

M(X + 2Y) = M(X) + M(2Y) = M(X) + 2M(Y) = 5 + 2·3 = 11.

Problem 17. The variance of a random variable X equals 3. Find the variance of the random variables: a) −3X; b) 4X + 3.

Solution. Applying properties 3, 4 and 2 of the variance, we have:

a) D(−3X) = (−3)²D(X) = 9D(X) = 9·3 = 27;

b) D(4X + 3) = D(4X) + D(3) = 16D(X) + 0 = 16·3 = 48.
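The answers to Problem 17 can also be checked directly on any concrete distribution with D(X) = 3; the two-point distribution below is an assumption made only for the check:

```python
import math

def variance(values, probs):
    m = sum(x * p for x, p in zip(values, probs))
    return sum((x - m) ** 2 * p for x, p in zip(values, probs))

# X = ±sqrt(3) with probability 1/2 each, so M(X) = 0 and D(X) = 3.
s = math.sqrt(3)
values, probs = [-s, s], [0.5, 0.5]

print(variance(values, probs))                       # ≈ 3
print(variance([-3 * x for x in values], probs))     # ≈ 27 = (-3)^2 * 3
print(variance([4 * x + 3 for x in values], probs))  # ≈ 48 = 4^2 * 3
```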

Problem 18. Let Y be the number of points obtained when throwing a die. Find the distribution law, mathematical expectation, variance and standard deviation of the random variable Y.

Solution. The distribution table of the random variable Y has the form:

Y | 1   | 2   | 3   | 4   | 5   | 6
p | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6

Then M(Y) = 1·1/6 + 2·1/6 + 3·1/6 + 4·1/6 + 5·1/6 + 6·1/6 = 3.5;

D(Y) = (1 − 3.5)²·1/6 + (2 − 3.5)²·1/6 + (3 − 3.5)²·1/6 + (4 − 3.5)²·1/6 + (5 − 3.5)²·1/6 + (6 − 3.5)²·1/6 ≈ 2.917; σ(Y) = √2.917 ≈ 1.708.
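Problem 18 can be recomputed exactly with rational arithmetic; a sketch using the standard fractions module:

```python
from fractions import Fraction

# A fair die: Y takes the values 1..6, each with probability 1/6.
p = Fraction(1, 6)
values = range(1, 7)

m = sum(y * p for y in values)             # M(Y) = 7/2 = 3.5
d = sum((y - m) ** 2 * p for y in values)  # D(Y) = 35/12 ≈ 2.917
sigma = float(d) ** 0.5                    # σ(Y) ≈ 1.708

print(m, d, round(sigma, 3))
```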

Consider a discontinuous random variable X with possible values x_1, x_2, …, x_n. Each of these values is possible but not certain, and X can take each of them with some probability. As a result of the experiment, X will take one of these values; that is, one event of the complete group of incompatible events (X = x_1), (X = x_2), …, (X = x_n) will occur.

Let us denote the probabilities of these events by the letter p with the corresponding indices: p_1 = P(X = x_1), p_2 = P(X = x_2), …, p_n = P(X = x_n).

The probability distribution can thus be specified by a distribution table, in which the top row lists all the values taken by the given discrete random variable and the bottom row lists the probabilities of the corresponding values. Since the incompatible events (X = x_1), …, (X = x_n) form a complete group, p_1 + p_2 + … + p_n = 1; that is, the sum of the probabilities of all possible values of the random variable equals one.

The probability distribution of a continuous random variable cannot be presented in the form of such a table, since the number of values of such a random variable is infinite even on a bounded interval. Moreover, the probability of its taking any one particular value is zero.

A random variable is fully described from a probabilistic point of view if we define its distribution, that is, indicate exactly what probability each of the events (X = x_i) has. This establishes the so-called law of distribution of the random variable: any relationship that establishes a connection between the possible values of a random variable and the corresponding probabilities. We say that a random variable is subject to a given distribution law.

Let us establish the form in which the distribution law of a discontinuous random variable X can be specified. The simplest form of specifying this law is a table listing the possible values of the random variable and the corresponding probabilities:

x_i | x_1 | x_2 | … | x_n
p_i | p_1 | p_2 | … | p_n

We will call such a table the distribution series of the random variable X.

Fig. 3.1. Distribution polygon

To give the distribution series a more visual appearance, one often resorts to its graphical representation: the possible values of the random variable are plotted along the abscissa axis, and the probabilities of these values along the ordinate axis. For clarity, the resulting points are connected by straight segments. Such a figure is called a distribution polygon (Fig. 3.1). The distribution polygon, like the distribution series, completely characterizes the random variable; it is one of the forms of the distribution law.

Sometimes a so-called "mechanical" interpretation of the distribution series is convenient. Imagine that a total mass equal to one is distributed along the abscissa axis so that masses p_1, p_2, …, p_n are concentrated at the individual points x_1, x_2, …, x_n, respectively. Then the distribution series is interpreted as a system of material points with these masses located on the abscissa axis.

An experiment is any realization of certain conditions and actions under which the random phenomenon being studied is observed. Experiments can be characterized qualitatively and quantitatively. A random variable is a quantity that, as a result of an experiment, can take on one value or another, and it is not known in advance which one.

Random variables are usually denoted by capital letters (X, Y, Z), and their corresponding values by lowercase letters (x, y, z).

Discrete random variables take individual, mutually isolated values that can be enumerated. Continuous random variables have possible values that continuously fill some interval. The law of distribution of a random variable is any relation that establishes a connection between the possible values of the random variable and the corresponding probabilities. The simplest form of the distribution law of a discrete variable is the distribution series; its graphical interpretation is the distribution polygon.

5.2. Distribution law of a discrete random variable. Distribution polygon

    At first glance, it may seem that to define a discrete random variable it is enough to list all its possible values. In reality, this is not so: random variables can have the same lists of possible values, but their probabilities can be different. Therefore, to specify a discrete random variable, it is not enough to list all its possible values; you also need to indicate their probabilities.

    The distribution law of a discrete random variable is the correspondence between its possible values and their probabilities; it can be specified in tabular form, analytically (as a formula), or graphically.

    Definition. Any rule (table, function, graph) that allows one to find the probabilities of arbitrary events A ∈ S (where S is a σ-algebra of events in the space Ω), in particular the probabilities of individual values of a random variable or of a set of such values, is called the distribution law of the random variable (or simply: the distribution). A r.v. is said to "obey a given distribution law."

    Let X be a discrete random variable (d.r.v.) taking the values x_1, x_2, …, x_n, … (this set is finite or countable) with probabilities p_i, where i = 1, 2, …, n, … The distribution law of a d.r.v. is conveniently specified by the formula p_i = P{X = x_i}, i = 1, 2, …, n, …, which determines the probability that as a result of the experiment the r.v. X takes the value x_i. For a d.r.v. X, the distribution law can also be given in the form of a distribution table:

    x_i | x_1 | x_2 | … | x_n | …
    p_i | p_1 | p_2 | … | p_n | …

    When the distribution law of a discrete random variable is specified in a table, the first row contains the possible values and the second their probabilities. Such a table is called a distribution series.

    Taking into account that in one trial the random variable takes one and only one possible value, we conclude that the events X = x_1, X = x_2, …, X = x_n form a complete group; hence the sum of the probabilities of these events, i.e. the sum of the probabilities in the second row of the table, is equal to one: p_1 + p_2 + … + p_n = 1.

    If the set of possible values of X is infinite (countable), then the series p_1 + p_2 + … converges and its sum equals one.

    Example. 100 tickets are issued for a cash lottery. One win of 50 rubles and ten wins of 1 ruble each are drawn. Find the distribution law of the random variable X, the value of the winnings for the owner of one lottery ticket.

    Solution. Let us write out the possible values of X: x_1 = 50, x_2 = 1, x_3 = 0. The probabilities of these values are: p_1 = 0.01, p_2 = 0.1, p_3 = 1 − (p_1 + p_2) = 0.89.

    Let us write the required distribution law:

    X | 50   | 1   | 0
    p | 0.01 | 0.1 | 0.89

    Control: 0.01 + 0.1 + 0.89 = 1.

    Example. An urn contains 8 balls, 5 of which are white and the rest black. 3 balls are drawn from it at random. Find the distribution law of the number of white balls in the sample.

    Solution. The possible values of the r.v. X, the number of white balls in the sample, are x_1 = 0, x_2 = 1, x_3 = 2, x_4 = 3. Their probabilities are, respectively,

    P(X = 0) = C(3,3)/C(8,3) = 1/56;
    P(X = 1) = C(5,1)·C(3,2)/C(8,3) = 15/56;
    P(X = 2) = C(5,2)·C(3,1)/C(8,3) = 30/56;
    P(X = 3) = C(5,3)/C(8,3) = 10/56.

    Let us write the distribution law in the form of a table:

    X | 0    | 1     | 2     | 3
    p | 1/56 | 15/56 | 30/56 | 10/56

    Control: 1/56 + 15/56 + 30/56 + 10/56 = 1.
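The urn example is a hypergeometric distribution, and the four probabilities can be checked with math.comb; a minimal sketch:

```python
from math import comb
from fractions import Fraction

# 8 balls (5 white, 3 black), 3 drawn; X = number of white balls drawn.
def p_white(k, white=5, black=3, draw=3):
    return Fraction(comb(white, k) * comb(black, draw - k),
                    comb(white + black, draw))

probs = [p_white(k) for k in range(4)]
print(probs)       # 1/56, 15/56, 30/56 = 15/28, 10/56 = 5/28 (Fraction reduces)
print(sum(probs))  # 1
```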

    The distribution law of a d.r.v. can also be specified graphically, by plotting the possible values of the r.v. along the abscissa axis and the probabilities of these values along the ordinate axis. The broken line connecting the points (x_1, p_1), (x_2, p_2), … in succession is called the distribution polygon (see Fig. 5.1).

    Fig. 5.1. Distribution polygon

    Now we can give a more precise definition of a discrete random variable.

    Definition. A random variable X is discrete if there exists a finite or countable set of numbers x_1, x_2, … such that P{X = x_i} = p_i > 0 (i = 1, 2, …) and p_1 + p_2 + p_3 + … = 1.

    Let us define mathematical operations on discrete random variables.

    Definition. The sum (difference, product) of a d.r.v. X, taking the values x_i with probabilities p_i = P{X = x_i}, i = 1, 2, …, n, and a d.r.v. Y, taking the values y_j with probabilities p_j = P{Y = y_j}, j = 1, 2, …, m, is the d.r.v. Z = X + Y (Z = X − Y, Z = X·Y) taking the values z_ij = x_i + y_j (z_ij = x_i − y_j, z_ij = x_i·y_j) with probabilities p_ij = P{X = x_i, Y = y_j} for all the indicated values of i and j. If some sums x_i + y_j (differences x_i − y_j, products x_i·y_j) coincide, the corresponding probabilities are added.
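For independent X and Y (so that P{X = x_i, Y = y_j} = p_i·q_j), the definition of the sum Z = X + Y can be sketched as follows; merging coinciding sums is exactly the "probabilities are added" clause:

```python
from collections import defaultdict

def rv_sum(x_dist, y_dist):
    """Distribution of Z = X + Y for independent X, Y given as {value: prob}."""
    z = defaultdict(float)
    for xi, pi in x_dist.items():
        for yj, qj in y_dist.items():
            z[xi + yj] += pi * qj  # coinciding sums x_i + y_j accumulate
    return dict(z)

# Example: the total of two fair dice.
die = {k: 1 / 6 for k in range(1, 7)}
total = rv_sum(die, die)
print(total[7])    # ≈ 6/36, the most likely total
print(len(total))  # 11 possible totals: 2..12
```

The same loop with `xi - yj` or `xi * yj` gives the difference and the product from the definition above.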

    Definition. The product of a d.r.v. X and a number c is the d.r.v. cX taking the values c·x_i with probabilities p_i = P{X = x_i}.

    Definition. Two d.r.v. X and Y are called independent if the events {X = x_i} = A_i and {Y = y_j} = B_j are independent for any i = 1, 2, …, n, j = 1, 2, …, m, that is,

    P(A_i B_j) = P(A_i)·P(B_j).

    Otherwise the r.v. are called dependent. Several r.v. are called mutually independent if the distribution law of any of them does not depend on what possible values the other variables have taken.

    Let's consider several of the most commonly used distribution laws.


