Propagation of Errors

If a variable $y=f(x_1,\cdots,x_n)$ is obtained from the measurements of several other variables $x_1,\cdots,x_n$, then its measurement error, expressed as the residual $r_y=y-y_0$ where $y_0$ is the true value of $y$, depends on the residuals $r_i=x_i-x_{i0}\;(i=1,\cdots,n)$ of those variables, where $x_{i0}$ is the true value of $x_i$.

Specifically, consider the Taylor expansion of $y$ near its true value $y_0$:

  $\displaystyle y=y_0+\sum_{i=1}^n \frac{\partial f}{\partial x_i}(x_i-x_{i0})
+\frac{1}{2}\sum_{i=1}^n\sum_{j=1}^n \frac{\partial^2 f}{\partial x_i\,\partial x_j}(x_i-x_{i0})(x_j-x_{j0})
+\cdots
\approx y_0+\sum_{i=1}^n \frac{\partial f}{\partial x_i}(x_i-x_{i0})
$ (94)
The approximation is based on the assumption that all these errors are small, so the higher-order terms can be neglected. Given the specific residuals $r_1,\cdots,r_n$, the residual of $y$ is then:
  $\displaystyle r_y=y-y_0\approx \sum_{i=1}^n \frac{\partial f}{\partial x_i}(x_i-x_{i0})
=\sum_{i=1}^n \frac{\partial f}{\partial x_i}r_i
$ (95)

But the specific residuals $r_i$ are typically unknown. We can instead treat all measurement errors as random variables characterized by their means and variances (uncertainties), and consider how they propagate to the measurement of $y$, approximated as

  $\displaystyle y=y_0+\sum_{i=1}^n \frac{\partial f}{\partial x_i}(x_i-x_{i0})
$ (96)
As shown here, if all variables are independent of each other, then the standard deviation of $r_y$ can be approximated as
  $\displaystyle \sigma_y=\sqrt{ \sum_{i=1}^n \left(\frac{\partial f}{\partial x_i}\right)^2 \sigma_{x_i}^2}
$ (97)
where $\sigma_y$ and $\sigma_{x_i}$ are the standard deviations of $y$ and $x_i$, respectively.
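Equation (97) can be evaluated numerically without deriving the partial derivatives by hand. The sketch below (the function name `propagate_sigma` and the central-difference step size are illustrative choices, not from the original text) estimates each $\partial f/\partial x_i$ by finite differences and sums the squared contributions:

```python
import math

def propagate_sigma(f, x, sigma, h=1e-6):
    """First-order error propagation (Eq. 97):
    sigma_y^2 = sum_i (df/dx_i)^2 * sigma_i^2.

    f     : function of a list of variable values x_1..x_n
    x     : measured values
    sigma : standard deviations of the x_i
    h     : relative step for the central-difference derivative
    """
    var_y = 0.0
    for i in range(len(x)):
        # central-difference estimate of the partial derivative df/dx_i
        step = h * max(abs(x[i]), 1.0)
        xp, xm = list(x), list(x)
        xp[i] += step
        xm[i] -= step
        dfdxi = (f(xp) - f(xm)) / (2 * step)
        var_y += dfdxi**2 * sigma[i]**2
    return math.sqrt(var_y)

# sanity check against a case solvable by hand: y = x1 * x2,
# where Eq. (97) gives sigma_y = sqrt(x2^2 s1^2 + x1^2 s2^2)
y_sigma = propagate_sigma(lambda v: v[0] * v[1], [3.0, 4.0], [0.1, 0.2])
```

For $y=x_1x_2$ with the values above, the closed-form result is $\sqrt{4^2\cdot 0.1^2 + 3^2\cdot 0.2^2}=\sqrt{0.52}$, which the numerical estimate reproduces to high accuracy.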

Example Consider the measurement of a resistor $R$ by a voltage divider circuit:

  $\displaystyle V=\frac{R}{R+R_0} V_0
$ (98)
where $V_0$ is a given voltage, $R_0$ is another resistor, and $V$ is the voltage across $R$. Solving this for $R$, we get:
  $\displaystyle R=f(R_0, V_0, V)=\frac{VR_0}{V_0-V}
$ (99)
The first-order derivatives of the function are
  $\displaystyle \frac{\partial f}{\partial R_0} = \frac{V}{V_0-V},\;\;\;\;\;\;\;\;\;\;
\frac{\partial f}{\partial V_0} = -\frac{VR_0}{(V_0-V)^2},\;\;\;\;\;\;\;\;\;\;
\frac{\partial f}{\partial V} = \frac{V_0R_0}{(V_0-V)^2}
$ (100)
and the standard deviation of $R$ can be written as a function of those of the three other variables:
$\displaystyle \sigma_R$ $\textstyle =$ $\displaystyle \sqrt{ \left(\frac{\partial f}{\partial R_0}\right)^2 \sigma_{R_0}^2
+\left(\frac{\partial f}{\partial V_0}\right)^2 \sigma_{V_0}^2
+\left(\frac{\partial f}{\partial V}\right)^2 \sigma_V^2 }$  
  $\textstyle =$ $\displaystyle \sqrt{
\left(\frac{V}{V_0-V}\right)^2\sigma_{R_0}^2
+\left(\frac{VR_0}{(V_0-V)^2}\right)^2\sigma_{V_0}^2
+\left(\frac{R_0V_0}{(V_0-V)^2}\right)^2\sigma_V^2
}$
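The resistor example can be worked through with concrete numbers. The values below ($R_0$, $V_0$, $V$, and the three standard deviations) are made up for illustration; only the formulas (99) and (100) come from the text:

```python
import math

# illustrative (assumed) measurements, not from the original text
R0, V0, V = 1000.0, 10.0, 4.0       # ohms, volts, volts
s_R0, s_V0, s_V = 5.0, 0.05, 0.05   # assumed standard deviations

# Eq. (99): the inferred resistance
R = V * R0 / (V0 - V)

# Eq. (100): first-order partial derivatives
dR_dR0 = V / (V0 - V)
dR_dV0 = -V * R0 / (V0 - V)**2
dR_dV  = V0 * R0 / (V0 - V)**2

# Eq. (97) applied to the three independent variables
sigma_R = math.sqrt((dR_dR0 * s_R0)**2
                    + (dR_dV0 * s_V0)**2
                    + (dR_dV * s_V)**2)
```

Note that the voltage term dominates here: because $\partial f/\partial V = V_0R_0/(V_0-V)^2$ grows as $V$ approaches $V_0$, the uncertainty in $V$ contributes far more to $\sigma_R$ than the same relative uncertainty in $R_0$.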