Is this a proper way to find the derivative of a 12-bit ADC signal?

Electrical Engineering Asked by MrYui on November 14, 2021

I'm seeking a way to find the derivative of a 12-bit ADC signal. It's for an STM32, but let's say it's for an arbitrary microcontroller.

while(1){
  uint16_t a1 = adc(0);          // Read the analog value the first time
  uint32_t t1 = time();          // Get the time
  uint16_t a2 = adc(0);          // Read the analog value a second time
  uint32_t delta = time() - t1;  // Compute the time difference
  uint16_t d = (a2-a1)/(delta);  // Find the derivative of adc(0)
}

Are there any better solutions than this one if I'm using FreeRTOS?

What if I have noise in my measurements? What can I do to prevent that?

3 Answers

Others have covered the timing issues, so I will focus on removing noise from the measurements, since it can be large. The basic principle is to approximate the derivative using more than two points. One of the simplest techniques is the five-point stencil, which is easy to implement on most MCUs. It uses 4 points to compute the first derivative: $$f'(x)\approx\frac{-f(x+2h)+8f(x+h)-8f(x-h)+f(x-2h)}{12h}$$ where $f(x)$ are the ADC readings and $h$ is the time difference between two readings. This technique generalizes via finite difference coefficients. Another option is to filter your ADC samples before differentiating, for example with a Gaussian filter.

Answered by Rokta on November 14, 2021

The RTOS and timing catches having been discussed by others: numerical differentiation greatly amplifies quantization noise, especially for slowly changing signals. Depending on the nature of the signals under consideration, there are well-established numerical techniques to deal with this amplification: oversampling, interpolation, Kalman filtering, etc.

An article on applying an oversampling technique shows how quantization-noise amplification depends on the signal's rate of change in its Figure 2 (reproduced below).

[Figure: (a) Piecewise quadratic acceleration profile. (b) Theoretical and finite-difference derivatives.]

You may see patterns relevant to your observation. If this is the case, you can try oversampling. But don't forget about other techniques!

Answered by V.V.T on November 14, 2021

Your understanding of the concept looks basically correct, but there are a couple of issues.

Unless your sampling time is variable, you should not need to read the time at all: you should already know the sampling interval beforehand.

If the sampling time is variable, you need to record the time at which the actual sample was taken. Your current code doesn't do that: it records the time completely independently of the sample, some time after the sample has already been taken. If you are running an RTOS, there is probably a much better way to do it, such as setting the ADC to sample automatically and write the samples into a buffer, so you can read the samples and differentiate whenever you want, knowing that the sampling interval is constant.

Yes, noise will be an issue. You want to filter the ADC readings before differentiating, since differentiation is very sensitive to noise. Even a small change in magnitude over a small time interval (such as when sampling at a very high frequency) produces enormous slopes that may completely swamp your graph. I've found the easiest way to deal with this is to reduce the sampling rate, which reduces the slope of the line between adjacent samples.

Answered by DKNguyen on November 14, 2021
