
What error to use on measurement?

Physics Asked by Harry Spratt on June 21, 2021

I have a question regarding errors which has always stumped me: what error should be quoted on a measurement? Let's say I have a very simple situation with a force on a spring. For

$$ F = kx$$

the force ($F$) is typically plotted against the extension ($x$). To calculate the error on this, one could use the error propagation formula

\begin{equation}
\sigma_f^2 = \sum \left(\frac{\partial f}{\partial\beta_i}\right)^2 \sigma^2_{\beta_i}
\label{eq:error}
\end{equation}

which provides a simple but effective estimate of the error. This equation assumes independent, uncorrelated variables. When the variables $\beta_i$ are correlated with each other, the covariances are not negligible. As such, where the function involves a number of variables together with their associated errors, the individual errors need to be propagated. The error propagation equation is expressed in matrix form as:
\begin{equation}
\sigma_f^2 = \mathbf{g}^T \mathbf{V} \mathbf{g}
\end{equation}

in which $\sigma_f^2$ represents the variance of a function $f$ whose parameters are $\beta_i$, whose variance-covariance matrix is $\mathbf{V}$, and with the $i$th element of the vector $\mathbf{g}$ being $\frac{\partial f}{\partial\beta_i}$. [I know this form is better but, despite researching, am struggling to implement it.]
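For illustration, here is a minimal sketch (in Python with numpy) of how I understand this matrix form; the example function $f = \beta_1 \beta_2$, the parameter values, and the covariance matrix are all made up for demonstration:

```python
import numpy as np

# Illustrative only: f(beta) = beta_1 * beta_2, with made-up parameter values
# and a made-up variance-covariance matrix V.
beta = np.array([2.0, 3.0])
V = np.array([[0.04, 0.01],
              [0.01, 0.09]])

# g_i = df/dbeta_i evaluated at the fitted values; for f = beta_1 * beta_2
# this is (beta_2, beta_1).
g = np.array([beta[1], beta[0]])

var_f = g @ V @ g          # sigma_f^2 = g^T V g
print(np.sqrt(var_f))      # sigma_f
```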

When I fit with a Python script (scipy), it outputs the covariance matrix of every parameter it has fit, presumably as above. Say, however, it calculates the covariance for the spring constant: should I then use that as the error on $k$? Should the error be calculated from the errors on the individual measurements, as in the first error propagation equation, or from the covariance produced by scipy? Surely the covariance only reflects the scatter of the points and not the error itself, as it has no bearing on the errors of the initial measurements. In addition, would taking the average of the largest and smallest gradients be a separate method from both of these for calculating the error on the gradient $k$?
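For concreteness, this is roughly the kind of fit I mean, as a minimal sketch with made-up data; the use of `scipy.optimize.curve_fit` with `sigma` and `absolute_sigma=True` is how I assume the measurement errors would be passed in:

```python
import numpy as np
from scipy.optimize import curve_fit

# Made-up measurements: extension x in metres, force F in newtons,
# with an assumed uncertainty on each force reading.
x = np.array([0.010, 0.020, 0.030, 0.040, 0.050])
F = np.array([0.52, 0.98, 1.55, 2.01, 2.49])
F_err = np.full_like(F, 0.05)

def hooke(x, k):
    return k * x

# absolute_sigma=True makes pcov reflect the quoted measurement errors
# rather than being rescaled to the scatter of the points about the fit.
popt, pcov = curve_fit(hooke, x, F, sigma=F_err, absolute_sigma=True)
k, k_err = popt[0], np.sqrt(pcov[0, 0])
print(f"k = {k:.1f} +/- {k_err:.1f} N/m")
```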

Sorry for the long post; it is very confusing to me which errors mean what and which should be used.

One Answer

With a single input variable $x$ the covariance "matrix" of the estimator is a single number: the variance of the estimator. If this is really the case you are interested in, then I did not understand the question.

I understand your question if we consider the model $f(x_1, x_2) = k_1 x_1 + k_2 x_2$ with two input variables. In this case, the (least-squares) linear regression fit provides a $(2 \times 2)$ covariance matrix,
$$
\mathrm{Cov}[\vec{\hat\beta}] =
\left(
\begin{matrix}
\mathrm{Var}[\hat\beta_1] & \mathrm{Cov}[\hat\beta_1, \hat\beta_2] \\
\mathrm{Cov}[\hat\beta_2, \hat\beta_1] & \mathrm{Var}[\hat\beta_2]
\end{matrix}
\right)
$$
where $\hat\beta_1$ and $\hat\beta_2$ are the point estimators.

  • The point estimators (= best-fit values) are the expectation values of the estimators, $E[\hat\beta_i] = \hat\beta_i = k_i$.
  • The uncertainties of the estimators are usually taken to be the standard deviations of the estimators, $Sd[\hat\beta_i] = \sigma_{\hat\beta_i}$. Thus, we use the (square roots of the) diagonal elements of the covariance matrix, as in the sketch after this list.
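A minimal sketch of this recipe for the two-parameter model above, using made-up data and a plain `numpy` least-squares fit (the covariance is built from the usual residual-variance estimate; this is an illustration, not your scipy output):

```python
import numpy as np

# Made-up data for the two-variable model f(x1, x2) = k1*x1 + k2*x2.
rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 1.0, 50)
x2 = rng.uniform(0.0, 1.0, 50)
y = 2.0 * x1 + 3.0 * x2 + rng.normal(0.0, 0.1, 50)

X = np.column_stack([x1, x2])                       # design matrix
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # point estimators

# Covariance of the estimators: s^2 (X^T X)^(-1), with s^2 the residual variance.
resid = y - X @ beta_hat
s2 = resid @ resid / (len(y) - X.shape[1])
cov = s2 * np.linalg.inv(X.T @ X)

std_err = np.sqrt(np.diag(cov))                     # uncertainties
print(beta_hat, std_err)
```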

While this is the standard method, it should not be used if the two input variables are collinear (which is what we call "correlated" input variables). Thus, the first step should be to quantify the severity of the collinearity. You could, e.g., use the variance inflation factor.
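A minimal sketch of such a check, assuming only two input variables, in which case the VIF reduces to $1/(1-r^2)$ with $r$ the correlation coefficient between the inputs (the data below are made up):

```python
import numpy as np

# With only two input variables the VIF reduces to 1 / (1 - r^2),
# where r is the correlation coefficient between x1 and x2.
def vif_two_inputs(x1, x2):
    r = np.corrcoef(x1, x2)[0, 1]
    return 1.0 / (1.0 - r**2)

# Made-up, deliberately collinear inputs; VIF values above roughly 5-10
# are commonly taken as a sign of problematic collinearity.
rng = np.random.default_rng(1)
x1 = rng.uniform(0.0, 1.0, 100)
x2 = 0.9 * x1 + 0.1 * rng.normal(0.0, 1.0, 100)
print(f"VIF = {vif_two_inputs(x1, x2):.1f}")
```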

If the collinearity is "severe", but you would like to keep a model with the original input parameters, I suggest you try ridge regression or the lasso. An alternative would be to combine the two parameters into a single "effective" parameter; this method is called principal component analysis.
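A minimal ridge-regression sketch, assuming `scikit-learn` is available; the data are made up and the penalty strength `alpha` would need to be tuned, e.g. by cross-validation:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Made-up, collinear data for y = 2*x1 + 3*x2 + noise.
rng = np.random.default_rng(2)
x1 = rng.uniform(0.0, 1.0, 100)
x2 = 0.9 * x1 + 0.1 * rng.normal(0.0, 1.0, 100)
y = 2.0 * x1 + 3.0 * x2 + rng.normal(0.0, 0.1, 100)

X = np.column_stack([x1, x2])

# The L2 penalty (alpha) shrinks the coefficients and stabilises the fit
# when the inputs are collinear.
model = Ridge(alpha=1.0, fit_intercept=False).fit(X, y)
print("ridge coefficients:", model.coef_)
```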

Thus, the simple answer to your question is: do not use the least-squares fit if your input variables are collinear.

Answered by Semoi on June 21, 2021
