Data Science Asked by Rich Barlow on July 20, 2021
I’m trying to analyze a repeated data stream from a sensor. The data likely contains events on the ~100 µs scale, occurring at fairly random intervals, and the signals are extremely similar from one event to the next. The problem is that I don’t have any sensors with sub-millisecond refresh rates; the best I can manage with what I have is about one read per millisecond.
So . . .
The idea is that if I sample a large number of events, the chances are that the sampling ticks fall on different sections of the signal each time. If so, a series of 10–20 events should give me enough data to reconstruct the signal. The signals repeat in a fairly random fashion and are very similar in each dataset, so each sampling run is most likely gathering different “pieces” of the total waveform.
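A minimal sketch of that idea in Python, for concreteness (the 500 µs waveform period, the signal shape, and the slightly-off-1 ms read interval are all hypothetical stand-ins for your setup): simulate a repeating waveform with ~100 µs features, read it at roughly 1 ms in several runs that start at random offsets, fold every timestamp modulo the waveform period, and sort by folded phase. The merged samples then trace out one period at far finer resolution than any single run.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the real setup:
period = 500e-6          # assumed waveform repeat period: 500 us
def signal(t):
    # Stand-in waveform with a ~100 us feature (5th harmonic);
    # anything periodic with `period` works here.
    return np.sin(2 * np.pi * t / period) + 0.5 * np.sin(2 * np.pi * 5 * t / period)

# Sensor read interval ~1 ms, deliberately NOT an exact multiple of the
# period, so successive ticks drift across the waveform in phase.
sample_dt = 1.037e-3
n_runs, n_per_run = 20, 50

phases, values = [], []
for _ in range(n_runs):
    t0 = rng.uniform(0, period)              # random start offset per run
    t = t0 + np.arange(n_per_run) * sample_dt
    phases.append(t % period)                # fold each timestamp onto one period
    values.append(signal(t))

phases = np.concatenate(phases)
values = np.concatenate(values)

# Sort by folded phase: this is the "overlaying on graph paper" step,
# done numerically instead of by hand.
order = np.argsort(phases)
phase_sorted, value_sorted = phases[order], values[order]
```

After sorting, `(phase_sorted, value_sorted)` is effectively one period of the waveform sampled at sub-microsecond spacing, even though each individual run only read once per millisecond. In practice you would bin or locally average the folded samples rather than trust each point individually, since real reads are noisy and the trigger timing is only approximately known.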
Is there a name for this sort of procedure so that I can start to research how to do it? Are there algorithms or toolsets in the open source world to accomplish this task?
Right now I’m literally graphing the data on graph paper and overlaying the image. This is less than ideal . . .