Electrical Engineering Asked on October 12, 2020
I’m running an ATMega1284 at 20 MHz. I’ve set the ADC up in free-running mode as follows:
void SetupADCInFreeRunningMode(uint16_t ref, uint16_t channel, uint16_t leftAdjust, uint16_t clockDiv)
{
    ADMUX |= ref | leftAdjust | channel;    //reference, result adjustment, input channel
    DIDR0 |= true << ADC3D;                 //disable the digital input buffer on ADC3
    ADCSRA |= true << ADEN | true << ADATE | true << ADIE | clockDiv; //enable ADC, auto trigger, interrupt, prescaler
}
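(For context, and as an assumption about the rest of the setup: with ADATE set, the auto-trigger source comes from the ADTS bits in ADCSRB. They reset to zero, which selects free-running mode, so the code above relies on that default. Making it explicit would look like:)
ADCSRB &= ~(true << ADTS2 | true << ADTS1 | true << ADTS0); //ADTS = 000 selects the free-running trigger source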
And I’m sampling the ADC into a buffer in the ADC interrupt:
ISR(ADC_vect)
{
    DISABLE_INT;
    ToggleLed(Red);
    if (BufferPosition < ADC_BUFFER_SIZE)
    {
        ADCBuffer[BufferPosition++] = ADCH - (int8_t)127; //left-adjusted result: ADCH holds the top 8 bits
    }
    else
    {
        BufferFull = true;
    }
    ENABLE_INT;
}
Since I’m having issues with my Goertzel implementation returning bad results, I wanted to check my sample rate. Based on the simulator, this ought to be around 12019 Hz.
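That figure follows from the clocking, assuming CDOneTwoEigth selects the /128 prescaler: 20 MHz / 128 gives a 156.25 kHz ADC clock, and a free-running conversion takes 13 ADC clock cycles, so 156.25 kHz / 13 ≈ 12019 Hz, i.e. one sample roughly every 83.2 µs.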
However, after plugging my logic analyser into the LED pin being toggled, I find that it’s sampling at an inconsistent speed: I get two samples at 12.4 kHz followed by one sample at 9.9 kHz. Given that I disable the other interrupts whilst the sample is being taken, what could be causing this?
In order to help debug this I removed everything except the LED toggle:
ISR(ADC_vect)
{
    DISABLE_INT;
    ToggleLed(Red);
    ENABLE_INT;
}
Now I get a consistent (all in µs) 80 on, 80 off, 80 on, 80 off, 80 on, 100 off. Does the ADC have to reset every 6 interrupts or something?
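For reference, each on or off segment is one conversion interval, so the 80 µs segments correspond to 12.5 kHz and the 100 µs segment to 10 kHz, which lines up with the two rates measured above.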
TIMER1_COMPA was set to run at 200 Hz, with this ISR:
ISR(TIMER1_COMPA_vect, ISR_NOBLOCK)
{
    PushFrame();
}
However, disabling this interrupt so the ADC interrupt is the only one on the system does not help.
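The Timer1 setup itself isn't shown; a sketch of a CTC configuration consistent with the COMP_200Hz constant from the listing below (12500 counts at the /8 prescaler) would be something like this, though the project's actual code may differ:
//Sketch only: Timer1 in CTC mode, 20 MHz / 8 / 12500 = 200 Hz
TCCR1A = 0;                            //no output compare pins driven
TCCR1B = true << WGM12 | true << CS11; //CTC (TOP = OCR1A), clk/8
OCR1A = COMP_200Hz - 1;                //minus one because CTC counts 0..OCR1A
TIMSK1 = true << OCIE1A;               //enable the compare-match A interrupt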
With an absolutely minimal working example:
//Standard lib includes
#include <stdbool.h>

//AVR includes
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/sleep.h>

//Project includes
#include "leds.h"
#include "ADC.h"

//Named Literals
#define COMP_200Hz 12500 //NOTE: Needs 8x prescaler
#define F_CPU 20e6

//Macros
#define ENABLE_INT sei()
#define DISABLE_INT cli()

//Functions
int main(void)
{
    SetupADCInFreeRunningMode(AVCC, ADC3, LAEnabled, CDOneTwoEigth);
    ENABLE_INT;
    StartADC();
    /* Replace with your application code */
    while (true)
    {
        ;
    }
}

ISR(ADC_vect)
{
    ToggleLed(Red);
}
For completeness:
void ToggleLed(Led colour)
{
    switch (colour)
    {
        case Red:
            PORTD ^= true << RED_LED;
            break;
        case Yellow:
            PORTD ^= true << YELLOW_LED;
            break;
        case Green:
            PORTD ^= true << GREEN_LED;
            break;
    }
}
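StartADC() isn't listed either; it's assumed here to be nothing more than a kick-start for the first conversion, after which the auto-trigger keeps the ADC running:
void StartADC(void)
{
    ADCSRA |= true << ADSC; //start the first conversion; ADATE re-triggers after that
}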
The problem is still occurring.
I’m really stumped on this one.
I’ve upped the sample rate on my logic analyser to 8 MHz and now get this output.
So something strange is happening periodically.
The inconsistency isn't periodic as you say; it looks more random.
ISR vectors or compare interrupts have some limitations when a call is broken by an incorrect memory assignment. I assume that is a function of the watchdog, but could that be triggered randomly with so few calls?
One sample periodically disappears, probably due to an incorrect flag setting or to the interrupt function exiting during the ISR ON cycle.
This may be because hardware interrupts are being triggered and stopped by the internal scheduler, or because of a periodic bug in the MCU produced by the load itself.
Ultimately, some inconsistent ISR timing setting may produce this kind of failure.
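If the watchdog is suspected, one way to rule it out is to check the reset-cause flags and explicitly disable it at the top of main(). This is only a sketch using the standard avr-libc watchdog API, not code from the question:
#include <avr/wdt.h>

//Early in main(): see whether the last reset came from the watchdog,
//then clear the flags and make sure the watchdog stays off.
if (MCUSR & (1 << WDRF))
{
    ToggleLed(Yellow); //hypothetical marker for "last reset was a watchdog reset"
}
MCUSR = 0;
wdt_disable();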
Answered by dbs on October 12, 2020