Operations Research Asked by Jane on August 19, 2021
I have some data that looks almost linear, and I need to fit it in order to evaluate the slope. The problem is that the slope changes when I change the fit range (not all of the data is linear, so I have to choose the range myself). How would you suggest evaluating the added error from this arbitrary choice of range? For example, should I make several fits over several different ranges and take the spread (standard deviation) of the resulting slopes as the added uncertainty?
Thanks
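The idea proposed in the question can be sketched directly. The following is a minimal illustration (not from the original post) using NumPy: fit a straight line over several candidate index ranges and report the mean slope together with the standard deviation across ranges as the range-selection uncertainty. The function name, the toy data, and the choice of ranges are all illustrative assumptions.

```python
import numpy as np

def slope_with_range_uncertainty(x, y, ranges):
    """Fit a degree-1 polynomial over each candidate (start, stop) index
    range and report the mean slope plus the spread across ranges."""
    slopes = []
    for lo, hi in ranges:
        # np.polyfit with deg=1 returns [slope, intercept]
        slope, _ = np.polyfit(x[lo:hi], y[lo:hi], 1)
        slopes.append(slope)
    slopes = np.asarray(slopes)
    # Sample standard deviation across ranges as the "added" uncertainty
    return slopes.mean(), slopes.std(ddof=1)

# Toy data (hypothetical): nearly linear with slope 2 plus a small wiggle
x = np.linspace(0, 10, 101)
y = 2.0 * x + 0.1 * np.sin(x)

mean_slope, range_std = slope_with_range_uncertainty(
    x, y, ranges=[(10, 60), (20, 70), (30, 90)])
```

Note that this spread only captures sensitivity to the range choice; it does not replace the usual statistical uncertainty of each individual fit.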
There exist methods for fitting a piecewise-linear function to data in an optimal way, for various definitions of what it means for the fit to be optimal. These methods may still require you to specify the number of segments in advance, but varying the number of segments seems less arbitrary to me than selecting the range by hand.
This question on StackOverflow discusses Python libraries to do piece-wise linear fitting, including pwlf. You may also have a look at the Wikipedia entry for Segmented regression.
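To make the idea concrete, here is a minimal brute-force sketch (my own illustration, not the pwlf library's method) of a continuous two-segment fit: every interior data point is tried as the breakpoint, the hinge model y = a + b·x + c·max(0, x − xb) is solved by least squares, and the breakpoint with the smallest residual is kept. Libraries such as pwlf solve essentially this problem, but with proper optimization over breakpoint locations and an arbitrary number of segments.

```python
import numpy as np

def two_segment_fit(x, y):
    """Continuous two-segment linear fit by exhaustive breakpoint search.
    For each candidate breakpoint xb, solve y ~ a + b*x + c*max(0, x - xb)
    by least squares and keep the breakpoint with the smallest SSE."""
    best = None
    for xb in x[1:-1]:
        hinge = np.maximum(0.0, x - xb)
        A = np.column_stack([np.ones_like(x), x, hinge])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        sse = float(np.sum((A @ coef - y) ** 2))
        if best is None or sse < best[0]:
            best = (sse, xb, coef)
    _, xb, (a, b, c) = best
    # Slope is b left of the breakpoint and b + c right of it
    return xb, b, b + c

# Toy data (hypothetical): slope 1 up to x = 4, then slope 3
x = np.linspace(0, 10, 201)
y = np.where(x < 4, x, 4.0 + 3.0 * (x - 4))

xb, s1, s2 = two_segment_fit(x, y)
```

The exhaustive search is O(n) least-squares solves, which is fine for small data sets; for many segments or large n, a dedicated library is the better choice.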
Answered by Kevin Dalmeijer on August 19, 2021