Cross Validated. Asked by mksm1228 on February 5, 2021.
I have two instruments that measure the same parameter; I'll call them 1 and 2. Instrument 1 is calibrated and zeroed during the study, while instrument 2 is not, so its baseline values are all over the place.
I am trying to superimpose the data on a scatterplot by time-matching 1 and 2. In addition to time matching, I also have to align the two series on the y-axis. I want to shift 2 to match 1, since 1 is the calibrated, zeroed instrument. There are thousands of data points, so sample size should not be an issue.
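For the time matching, here is a minimal sketch of what I have in mind, using pandas with made-up frame and column names (`time`, `value_1`, `value_2` are placeholders):

```python
import pandas as pd

# Made-up example frames: df1 holds the calibrated instrument (1),
# df2 the uncalibrated one (2).
df1 = pd.DataFrame({
    "time": pd.to_datetime(["2021-02-05 00:00:00.0", "2021-02-05 00:00:01.0"]),
    "value_1": [10.2, 10.4],
})
df2 = pd.DataFrame({
    "time": pd.to_datetime(["2021-02-05 00:00:00.3", "2021-02-05 00:00:01.1"]),
    "value_2": [12.9, 13.2],
})

# Pair each reading from 1 with the nearest-in-time reading from 2,
# within a 1-second tolerance; unmatched rows become NaN and are dropped.
matched = pd.merge_asof(
    df1.sort_values("time"), df2.sort_values("time"),
    on="time", direction="nearest", tolerance=pd.Timedelta("1s"),
).dropna()
```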
I figured that the average difference (the pointwise difference between 2 and 1, then averaged) would tell me, in some sense, how close I am to matching the y-axis between the two data sets.
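Concretely, the average difference I mean is just the mean of the pointwise differences, continuing from the `matched` frame in the sketch above:

```python
# Mean difference (bias) of 2 relative to 1.
diff = matched["value_2"] - matched["value_1"]
bias = diff.mean()           # average vertical offset of 2 relative to 1
sd_diff = diff.std(ddof=1)   # spread of the differences

# Subtracting the bias from 2 shifts it onto 1's y-axis scale.
matched["value_2_adjusted"] = matched["value_2"] - bias
```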
Statistically speaking, is there an accepted threshold for the average difference? For example, is an average difference < 0.5 considered optimal? Eventually, I want to compare 1 and 2 on a Bland-Altman plot.
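For reference, here is a minimal Bland-Altman sketch with matplotlib, reusing `matched`, `diff`, `bias`, and `sd_diff` from the snippets above:

```python
import matplotlib.pyplot as plt

# Bland-Altman plot: mean of each matched pair on x, difference on y,
# with the bias line and 95% limits of agreement (bias +/- 1.96 * SD).
mean_pair = (matched["value_1"] + matched["value_2"]) / 2
plt.scatter(mean_pair, diff, s=5, alpha=0.3)
plt.axhline(bias, color="red", label="bias")
plt.axhline(bias + 1.96 * sd_diff, color="gray", linestyle="--",
            label="95% limits of agreement")
plt.axhline(bias - 1.96 * sd_diff, color="gray", linestyle="--")
plt.xlabel("Mean of 1 and 2")
plt.ylabel("Difference (2 - 1)")
plt.legend()
plt.show()
```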