An unknown data set has a standard deviation of 10. What is the standard deviation when each value in the data set is multiplied by 5?

Answer:

Let [tex]\mathbf x=\{x_i~:~1\le i\le n\}[/tex] be the sample, with [tex]n[/tex] the size of the sample.

The mean of the sample is

[tex]\displaystyle\bar x=\frac1n\sum_{i=1}^nx_i[/tex]

Let [tex]\mathbf y[/tex] be the same data set but with each value multiplied by 5, so [tex]y_i=5x_i[/tex]. Then the mean of [tex]\mathbf y[/tex] is

[tex]\bar y=\displaystyle\frac1n\sum_{i=1}^ny_i=\frac5n\sum_{i=1}^nx_i=5\bar x[/tex]

The (sample) standard deviation of [tex]\mathbf x[/tex] is given by

[tex]s_{\mathbf x}=\sqrt{\displaystyle\frac1{n-1}\sum_{i=1}^n(\bar x-x_i)^2}[/tex]

For [tex]\mathbf y[/tex], substituting [tex]y_i=5x_i[/tex] and [tex]\bar y=5\bar x[/tex] and factoring the constant out of the square root gives

[tex]s_{\mathbf y}=\sqrt{\displaystyle\frac1{n-1}\sum_{i=1}^n(\bar y-y_i)^2}[/tex]
[tex]s_{\mathbf y}=\sqrt{\displaystyle\frac1{n-1}\sum_{i=1}^n(5\bar x-5x_i)^2}[/tex]
[tex]s_{\mathbf y}=\sqrt{\displaystyle\frac{25}{n-1}\sum_{i=1}^n(\bar x-x_i)^2}[/tex]
[tex]s_{\mathbf y}=5\sqrt{\displaystyle\frac1{n-1}\sum_{i=1}^n(\bar x-x_i)^2}[/tex]
[tex]s_{\mathbf y}=5s_{\mathbf x}[/tex]

Since [tex]s_{\mathbf x}=10[/tex], the standard deviation of the scaled data set is [tex]s_{\mathbf y}=5\times10=50[/tex].
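The derivation above can be checked numerically. Below is a minimal Python sketch using the standard library's `statistics` module; the data values are made up purely for illustration:

```python
import random
import statistics

# Generate an arbitrary sample (the specific values don't matter).
random.seed(0)
x = [random.gauss(0, 10) for _ in range(1000)]

# Multiply every value by 5.
y = [5 * xi for xi in x]

# statistics.stdev uses the n - 1 denominator, matching the formula above.
s_x = statistics.stdev(x)
s_y = statistics.stdev(y)

print(s_y / s_x)  # the ratio is 5, up to floating-point error
```

The same scaling argument works for any nonzero constant: multiplying every value by [tex]c[/tex] multiplies the standard deviation by [tex]|c|[/tex].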