ADC in AVR ATmega16
An ADC (Analog to Digital Converter) is one of the most widely used peripherals in embedded systems, designed especially for data acquisition. Controllers in the AVR ATmega series normally have a 10-bit ADC built in.
ATmega16/32 supports eight ADC channels, which means we can connect up to eight analog inputs at a time. ADC channels 0 to 7 are present on PORTA, i.e. pins 33 to 40.
i.e. when the input is 0 V, the digital output will be 0, and when the input is 5 V (with Vref = 5 V), we get the highest digital output, 1023, since the 10-bit ADC divides the 0-5 V range into 1024 steps.
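As a rough sketch of that relationship (the adc_to_millivolts helper and the 5 V reference are illustrative assumptions, not part of the program at the end):

/* Convert a 10-bit ADC reading back to millivolts, assuming Vref = 5 V (5000 mV). */
unsigned int adc_to_millivolts(unsigned int adc_value)
{
	return (unsigned int)(((unsigned long)adc_value * 5000UL) / 1023UL);
}
/* Example: adc_value = 1023 -> 5000 mV, adc_value = 512 -> ~2502 mV */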
ATmega16 ADC
It is a 10-bit ADC.
The converted binary output is held in two special-function 8-bit registers: ADCL (result low byte) and ADCH (result high byte).
The ADC gives a 10-bit output, so only 10 of the 16 bits in ADCH:ADCL are useful.
We have the option (the ADLAR bit) of storing these 10 bits left-adjusted (in the upper bits) or right-adjusted (in the lower bits), as shown in the short sketch after this list.
We also have three options for Vref: 1. AVcc (analog Vcc), 2. Internal 2.56 V, 3. External AREF pin.
The total conversion time depends on the crystal frequency and the ADPS2:0 bits (frequency divisor).
If you decide to use AVcc or the AREF pin as the ADC voltage reference, you can make it more stable and increase the precision of the ADC by connecting a capacitor between that pin and GND.
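As a rough sketch of how the two result registers are read (register names from <avr/io.h>; the full program at the end does the same thing byte by byte):

unsigned char low;
unsigned int result;

low    = ADCL;                              /* read ADCL first; this locks the result pair */
result = ((unsigned int)ADCH << 8) | low;   /* right-adjusted (ADLAR = 0): full 10-bit value */

/* With ADLAR = 1 (left-adjusted), reading only ADCH gives the top 8 bits:
   unsigned char result8 = ADCH; */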
ADC Registers
ADMUX Register:
The REFS1:0 bits select the ADC voltage reference:
REFS1  REFS0  Voltage Reference
0      0      AREF pin (internal Vref turned off)
0      1      AVcc
1      0      Reserved
1      1      Internal 2.56 V
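For example, selecting AVcc as the reference could be written as below (a sketch; the REFS bit names come from <avr/io.h>, and the program at the end gets the same effect by writing ADMUX = 0x40):

ADMUX &= ~((1 << REFS1) | (1 << REFS0));   /* clear the reference selection bits */
ADMUX |=  (1 << REFS0);                    /* REFS1:0 = 01 -> AVcc as reference */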
We can select the input channel, ADC0 to ADC7, by using the MUX4:0 bits. These bits are also used to select differential inputs with various gain options; we will cover those operations in another part.
Selecting a channel is very easy: just put the channel number in MUX4:0.
Suppose you are connecting the input to ADC channel 2; then put 00010 in MUX4:0.
Suppose you are connecting the input to ADC channel 5; then put 00101 in MUX4:0.
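A small sketch of this (the select_adc_channel name is just illustrative; it keeps the REFS and ADLAR bits untouched):

void select_adc_channel(unsigned char channel)      /* channel: 0..7 for ADC0..ADC7 */
{
	ADMUX = (ADMUX & 0xE0) | (channel & 0x1F);   /* write the channel number into MUX4:0 */
}

/* e.g. select_adc_channel(2) puts 00010 in MUX4:0, select_adc_channel(5) puts 00101 */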
ADCSRA Register:
ADEN (ADC Enable): Writing one to this bit enables the ADC; writing zero turns the ADC off. Turning the ADC off while a conversion is in progress will terminate that conversion.
ADATE (ADC Auto Trigger Enable): Writing one to this bit enables auto triggering of the ADC.
ADIF (ADC Interrupt Flag): This bit is set when an ADC conversion completes and the data registers are updated.
ADIE (ADC Interrupt Enable): Writing one to this bit activates the ADC Conversion Complete Interrupt.
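If the interrupt is used instead of polling (the program at the end simply polls the ADIF flag), a minimal sketch would be:

#include <avr/interrupt.h>

volatile unsigned int adc_result;

ISR(ADC_vect)                  /* runs each time a conversion completes */
{
	adc_result = ADCW;     /* avr-libc name for the combined ADCL/ADCH pair */
}

/* Enable with: ADCSRA |= (1<<ADIE); sei(); then start a conversion by setting ADSC. */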
ADPS2:0 (ADC Prescaler Select): These bits determine the division factor between the XTAL frequency and the input clock to the ADC.
We can select any divisor to set the ADC clock to Fosc/2, Fosc/4, etc. But in AVR, the ADC requires an input clock frequency below 200 kHz for maximum accuracy, so we always have to take care that the ADC clock does not exceed 200 kHz.
Suppose the clock frequency of the AVR is 8 MHz; then we must use a divisor of 64 or 128, because 8 MHz / 64 = 125 kHz, which is less than 200 kHz.
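For example, the prescaler bits for an 8 MHz crystal could be set like this (a sketch; the program at the end gets the /128 setting by writing ADCSRA = 0x87):

ADCSRA |= (1 << ADPS2) | (1 << ADPS1);                  /* ADPS2:0 = 110 -> /64:  8 MHz / 64  = 125 kHz  */
/* or: ADCSRA |= (1<<ADPS2)|(1<<ADPS1)|(1<<ADPS0);         ADPS2:0 = 111 -> /128: 8 MHz / 128 = 62.5 kHz */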
Circuit Diagram
Program
#define F_CPU 8000000UL
#include <avr/io.h>
#include <util/delay.h>
#include <stdlib.h>
#include "LCD_16x2_H.h"
void ADC_Init()
{
	DDRA = 0x00;				/* Make ADC port (PORTA) as input */
	ADCSRA = 0x87;				/* Enable ADC, prescaler Fosc/128 */
	ADMUX = 0x40;				/* Vref: AVcc, ADC channel: 0 */
}

int ADC_Read(char channel)
{
	int Ain, AinLow;

	ADMUX = ADMUX | (channel & 0x0f);	/* Set input channel to read */
	ADCSRA |= (1 << ADSC);			/* Start conversion */
	while ((ADCSRA & (1 << ADIF)) == 0);	/* Wait for end of conversion */
	ADCSRA |= (1 << ADIF);			/* Clear ADIF by writing one to it */

	_delay_us(10);
	AinLow = (int)ADCL;			/* Read lower byte */
	Ain = (int)ADCH * 256;			/* Read higher 2 bits and multiply with weight */
	Ain = Ain + AinLow;

	return(Ain);				/* Return digital value */
}
int main()
{
char String[5];
int value;
ADC_Init();
LCD_Init(); /* Initialization of LCD */
LCD_String("ADC value"); /* Write string on 1st line of LCD */
while(1)
{