# What is the point of a voltage divider if you can't drive anything with it?

Physics Asked on January 6, 2022

The voltage divider formula is only valid if no current is drawn from the output, so how can a divider be used practically? Using the output voltage for anything would require drawing current, which invalidates the formula. So what's the point; how are they actually used?

As an alternative to using a high-input-impedance device like a comparator or op-amp attached to the division point, one can instead use a low-impedance device that draws zero current at its operating point. For example, a Wheatstone Bridge connects a galvanometer (a sensitive low-impedance current detector) between two voltage dividers.

Describing how to balance a Wheatstone bridge to find the value of an unknown resistance Rₓ via the equation Rₓ = (R₂/R₁)·R₃, the Wikipedia article on the Wheatstone bridge says:

> R₁, R₂, and R₃ are resistors of known resistance and the resistance of R₂ is adjustable. The resistance R₂ is adjusted until the bridge is "balanced" and no current flows through the galvanometer ... At this point, the voltage between the two midpoints ... will be zero.

When that voltage is zero, no current flows through the galvanometer, and both dividers produce voltages at their true, unloaded ratios.
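The balance condition can be checked numerically. This is a minimal Python sketch with hypothetical component values (the 5 V excitation and the specific resistances are illustrative, not from the answer):

```python
def bridge_unknown(r1, r2, r3):
    """Wheatstone bridge balance condition: Rx = (R2 / R1) * R3."""
    return (r2 / r1) * r3

def bridge_midpoint_voltage(vs, r1, r2, r3, rx):
    """Voltage between the two divider midpoints, with the
    galvanometer treated as drawing no current (balanced case)."""
    v_left = vs * r2 / (r1 + r2)    # divider leg: R1 on top, R2 on bottom
    v_right = vs * rx / (r3 + rx)   # divider leg: R3 on top, Rx on bottom
    return v_left - v_right

# With R1 = 1 k, R2 = 2.5 k, R3 = 400 ohms, balance implies Rx = 1 k:
rx = bridge_unknown(1000.0, 2500.0, 400.0)
print(rx)                                                       # 1000.0
print(bridge_midpoint_voltage(5.0, 1000.0, 2500.0, 400.0, rx))  # ~0.0
```

At balance both dividers sit at the same fraction of the supply, so the formula holds exactly even though a (zero-current) detector is attached.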

Answered by James Waldby - jwpat7 on January 6, 2022

The first and most obvious use of the voltage divider is to use your intended load as one of the resistors in the divider. This works well if the load is well approximated as a resistor.

That's how the in-cable volume control on a pair of headphones works.

When the load is not linear, one can make it behave more linearly by paralleling it with a resistor.

That's how many negative-feedback networks work: the amplifier input is rarely linear and does not always have an exactly known V/I behaviour, but its impedance is guaranteed to exceed a certain value. You parallel that input impedance with a known, much lower resistance and use the combination as one leg of the divider.
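A quick sketch of why the paralleling trick works, using hypothetical numbers (a 1 kΩ known resistor against an input impedance only guaranteed to be at least 100 kΩ):

```python
def parallel(r_a, r_b):
    """Equivalent resistance of two resistors in parallel."""
    return r_a * r_b / (r_a + r_b)

r_known = 1_000.0
worst = parallel(r_known, 100_000.0)  # input impedance at its guaranteed minimum
best = r_known                        # input impedance approaching infinity
print(worst, best)  # ~990.1 vs 1000.0 ohms
```

Even though the amplifier's input impedance is uncertain over a huge range, the effective lower divider leg varies by under 1%, so the divider ratio is well defined.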

Answered by fraxinus on January 6, 2022

Just came up today: I needed a comparator that would trip at 2.8 V and had a 3.3 V supply. So: a 3.32 kΩ/17.8 kΩ voltage divider driving one input of the comparator (whose input current is negligible), test voltage to the other input. Just everyday EE.
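The arithmetic behind that threshold checks out; a one-liner sketch (taking the tap across the 17.8 kΩ resistor to ground, as the 2.8 V target implies):

```python
# Divider from a 3.3 V rail: 3.32 kOhm on top, 17.8 kOhm to ground,
# with the comparator reference taken at the junction.
v_supply = 3.3
r_top, r_bottom = 3_320.0, 17_800.0
v_ref = v_supply * r_bottom / (r_top + r_bottom)
print(v_ref)  # 2.78125 V, close to the 2.8 V target
```

With standard E96 values, ~2.78 V is about as close to 2.8 V as a two-resistor divider gets.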

Answered by John Doty on January 6, 2022

The word "anything" is too broad and depends on your context.

If your domain is electronics (as opposed to, for example, power engineering), voltage dividers with negligible loads are used everywhere.

"Negligible" means that the current is so low that we don't care about its effect.

Example: a voltage divider connected to a $2\ \text{V}$ power supply, made of two $1\ \text{k}\Omega$ resistors, and sourcing $1\ \mu\text{A}$ delivers a voltage of $0.9995\ \text{V}$ instead of $1\ \text{V}$.

You may ask: in that case, why not use a single series resistor instead of two?

Answer: very often, the specified $1\ \mu\text{A}$ load current is not the real value but a maximum. Because the divider's standing current is much larger than the load current, the output becomes insensitive to that load-current uncertainty.
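The 0.9995 V figure follows from the divider's Thevenin equivalent. A minimal sketch using the values from the example above:

```python
# Thevenin view of a 1 k / 1 k divider on a 2 V supply:
# open-circuit voltage 1 V, source resistance R1 || R2 = 500 ohms.
vs, r1, r2 = 2.0, 1_000.0, 1_000.0
i_load = 1e-6  # worst-case 1 uA load current

v_open = vs * r2 / (r1 + r2)              # 1.0 V unloaded
r_thevenin = r1 * r2 / (r1 + r2)          # 500 ohms
v_loaded = v_open - i_load * r_thevenin   # 0.9995 V
print(v_loaded)
```

Even if the real load current is anywhere between 0 and the 1 µA maximum, the output only moves within a 0.5 mV band.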

Answered by andre314 on January 6, 2022

This is perhaps a niche concern, but one advantage of voltage dividers is that they also cleanly divide down the noise.(*)

In very-low-temperature electronics experiments, you might want to drive a very sensitive device that can't handle more than 10 microvolts of applied voltage, but your voltage source might be far noisier than that. So you instead start with 1 volt at the source and divide it down by 100,000×.

As you say, the resistance of the divider does end up adding to the final measured resistance of your device. But, this can just be subtracted off after the measurement is done.
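A sketch of that attenuator with hypothetical resistor values (a ~100 kΩ upper leg and a 1 Ω lower leg, chosen to give the 100,000× division):

```python
# 1 V source divided down ~100000x: 99.999 kOhm on top, 1 ohm to ground.
v_source = 1.0
r_top, r_bottom = 99_999.0, 1.0

v_out = v_source * r_bottom / (r_top + r_bottom)
print(v_out)  # 1e-05 V = 10 microvolts; source noise is attenuated by the same ratio

# The divider also appears in series with the device under test:
r_series = r_top * r_bottom / (r_top + r_bottom)
print(r_series)  # ~1 ohm, which can be subtracted after the measurement
```

The ~1 Ω Thevenin resistance is the extra series resistance the answer mentions subtracting off.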

(*) - they can also add back in a lot of noise if one is not careful of ground loops: the voltage divider adds its local ground voltage noise fully into the output.

Answered by Nanite on January 6, 2022

Oh, but you can. You can drive a high-impedance input with it, including a buffer, which can in turn drive whatever you want. The more current you draw, the more the voltage will droop, so you make sure to draw as little current as possible, such that the output is, for example, 99.9% of what the divider formula says it should be.

The divider formula is simply an equation that holds under certain ideal conditions. Analyzing it mathematically under real conditions makes the equation complicated and case-specific, so it is often easier to arrange your real-world usage so that the equation's assumptions are approximated very closely.
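The "real conditions" version is just the ideal formula with the load impedance folded into the lower leg. A minimal sketch with hypothetical values (10 kΩ legs, 10 MΩ load):

```python
def divider_out(vs, r1, r2, r_load=float('inf')):
    """Divider output with the load impedance included explicitly."""
    if r_load == float('inf'):
        r2_eff = r2                            # ideal, unloaded case
    else:
        r2_eff = r2 * r_load / (r2 + r_load)   # load in parallel with lower leg
    return vs * r2_eff / (r1 + r2_eff)

ideal = divider_out(5.0, 10_000.0, 10_000.0)                 # 2.5 V unloaded
loaded = divider_out(5.0, 10_000.0, 10_000.0, 10_000_000.0)  # 10 MOhm load
print(ideal, loaded, loaded / ideal)  # ratio ~0.9995
```

With the load 1000× larger than the divider leg, the output sits within 0.05% of the ideal value, which is exactly the "force the assumptions" strategy described above.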

Answered by DKNguyen on January 6, 2022

In a high-impedance amplifier, the currents are small in proportion to the voltages present, so voltage dividers are in popular and common use to prescale the overall gain of the amplifier's first stage and to vary the output level of the amplifier.

The effect you mention (finite current flow causing the voltages in the divider circuit to shift) is called loading, and it can be minimized even in low-impedance circuits through appropriate choice of the resistances in the divider circuit.

Answered by niels nielsen on January 6, 2022

You don't have to draw significant current to "use" a voltage. For example, if you want to measure the output voltage, which is a perfectly useful thing to do, then you can just attach a voltmeter. And ideally, voltmeters don't draw current at all.

If you wanted to drive something at a lower voltage than the input, you wouldn't use a voltage divider because that would be extremely wasteful; most of the energy would be lost in the resistors.

Answered by knzhou on January 6, 2022