What will happen if we try to take a voltage reading by keeping it in current mode in a multimeter?

Physics Asked by user272860 on December 7, 2020

A multimeter has different modes, including a current mode and a voltage mode for their respective measurements. What will happen if one tries to take a voltage reading while keeping the meter in current mode?

2 Answers

An ammeter can be thought of as a galvanometer with a tiny shunt resistance attached in parallel (so that most of the current bypasses the delicate coil and the meter as a whole presents a very low resistance to the circuit), while a voltmeter can be thought of as a galvanometer with a very high resistance attached in series (so that it draws almost no current). An ammeter should thus be connected in series with the circuit in order to measure the current, as shown below:

[circuit diagram: an ammeter connected in series with the circuit]

When you set the multimeter to the "current" mode, it is essentially working as an ammeter, meaning that it presents a very low resistance (say, $r$) between its terminals. If you were to use it to measure voltage, you would connect it in parallel across some resistance $R$, as shown below.

[circuit diagram: the multimeter in current mode connected in parallel across the resistance $R$]

The load seen by the source will now become approximately $r$ (specifically, it will be the equivalent resistance of $r$ and $R$ in parallel, $\frac{rR}{R+r}$, which is very close to $r$), meaning that the circuit is equivalent to a power source connected to a very low resistance.
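For concreteness, plugging in hypothetical values (not given in this answer) of $r = 0.01\,\Omega$ and $R = 100\,\Omega$:

$$\frac{rR}{R+r} = \frac{0.01 \times 100}{100 + 0.01} \approx 0.009999\,\Omega \approx r.$$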

As a result, the circuit will draw a large amount of current from the source (in practical terms, the maximum current the source can provide), and this will be the current measured by the multimeter. Of course, the value of $r$ chosen by the multimeter depends on the actual "current" setting. In some multimeters, I've found that this usually blows the fuse if you're using the "sensitive" current setting.
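A minimal numerical sketch of why the fuse goes, assuming a stiff 10 V supply, a $100\,\Omega$ circuit resistance, a $0.01\,\Omega$ current-mode resistance, and a 10 A fuse (all of these values are my own illustrative assumptions, not taken from the answer above):

```python
# Rough numerical sketch (all values are illustrative assumptions):
# a 10 V source, a 100 ohm circuit resistance R, and a multimeter in
# current mode presenting r = 0.01 ohm, connected in parallel across R.

V = 10.0            # supply voltage in volts (assumed)
R = 100.0           # circuit resistance in ohms (assumed)
r = 0.01            # multimeter's current-mode resistance in ohms (assumed)
FUSE_RATING = 10.0  # a typical multimeter fuse rating in amps (assumed)

r_eq = (r * R) / (r + R)   # equivalent resistance seen by the source
I = V / r_eq               # current the source tries to deliver

print(f"Equivalent resistance: {r_eq:.6f} ohm (close to r = {r} ohm)")
print(f"Current drawn: {I:.0f} A")
print("Fuse blows" if I > FUSE_RATING else "Fuse survives")
```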

Of course, if you're using a source that can actually provide a current much larger than the multimeter can handle, Bad Things Will Happen. (But hopefully -- if you have to ask this question -- you won't be using such a source!)

Answered by Philip on December 7, 2020

Whenever a multimeter measures a current, it is actually measuring the voltage drop across a so-called shunt resistor. A shunt resistor is a resistor with a very low resistance, so that when it is placed in series with the circuit (which is what you do when you measure the circuit's current), it does not significantly affect the current drawn by the circuit.

The meter measures the voltage drop across the shunt, which has a more or less precise resistance, like $R = 0.01\,\Omega$, and using Ohm's law it finds the current $$I = \frac{\Delta V}{R} = 100\,\Delta V\;\mathrm{[A]},$$ where the final result uses the hypothetical resistance of $R = 0.01\,\Omega$.
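As a minimal sketch of that conversion, using the hypothetical $0.01\,\Omega$ shunt from this answer (the function name is my own, purely illustrative):

```python
# Minimal sketch of what a meter does in current mode: it measures the
# voltage drop across its shunt and applies Ohm's law, I = delta_V / R.

R_SHUNT = 0.01  # ohms (hypothetical shunt resistance from the text)

def displayed_current(delta_v: float, r_shunt: float = R_SHUNT) -> float:
    """Return the current the meter displays for a measured drop delta_v."""
    return delta_v / r_shunt

# Example: a 5 mV drop across the shunt reads as 0.5 A.
print(displayed_current(0.005))  # -> 0.5
```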

Now that the current measurement is clear, you should be able to see that if you measure a voltage, which requires the multimeter to be connected in parallel with the circuit, you are just putting the shunt resistor in parallel with your circuit at the point of measurement.

Suppose that you're trying to measure the voltage drop across a $100\,\Omega$ resistor with the multimeter in current mode. What you're actually doing is putting the shunt resistor, again say $R = 0.01\,\Omega$, in parallel with the $100\,\Omega$ resistor. Suppose that your power supply is $10\,\mathrm{V}$. Since the shunt is in parallel with the $100\,\Omega$ resistor, you'll have $10\,\mathrm{V}$ across your shunt, which means that the multimeter will display a current of $$I = \frac{\Delta V}{R} = (100 \times 10)\,\mathrm{A} = 1000\,\mathrm{A}.$$

So your multimeter will try to draw a huge amount of current from your circuit, and it will simply blow the fuse that your multimeter should, hopefully, have.

Answered by Davide Morgante on December 7, 2020
