Impedance Matching Questions



Frank Mena

  • Guest
on: February 06, 2010, 07:09:19 AM
I understand that the source should have a low impedance and the input it drives should have a higher impedance. From reading, I also understand that the general rule of thumb is 1:10 as a minimum and that 1:100 is better. But what happens when an input has a very, very low impedance? And my other question: how low is too low? What ratio is too low? A 1:1000 ratio? 1:10,000? 1:100,000, or even greater?
Cheers
Frank M



Offline Paul Joppa

  • Global Moderator
  • Hero Member
  • Posts: 5833
Reply #1 on: February 06, 2010, 12:10:19 PM
"Should" is perhaps too strong a word for the rule of thumb. Any ratio of impedances can be used if the components are designed for those impedances.

To answer the last question first, the source impedance and load impedance form a voltage divider, so a low load impedance driven from a high source impedance reduces the voltage severely. A shunt-mode level control is an example that deliberately takes advantage of this property.
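A rough sketch of that divider, with made-up example values rather than numbers from any particular piece of gear, shows how severe the loss can be:

```python
# Voltage divider formed by a source impedance driving a load impedance.
# All values below are illustrative only.
def divider_voltage(z_source, z_load, v_source=1.0):
    """Voltage that actually appears across the load."""
    return v_source * z_load / (z_source + z_load)

# High source impedance into a low load impedance: almost nothing gets through.
print(divider_voltage(z_source=10_000, z_load=100))   # ~0.01 V from a 1 V source
# The usual case (low source, high load) loses very little.
print(divider_voltage(z_source=100, z_load=10_000))   # ~0.99 V from a 1 V source
```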

In the old days (i.e. the first half of the 20th century) most gear was designed for matched impedances. This came from telephone system designs, which originally had no amplification. Without amplifiers, it was important to transfer as much power as possible. As you can imagine, a high load impedance will absorb very little current, while a low load impedance will absorb all the available current but at little voltage. Power is the product of voltage and current, and the greatest power is transferred when the load impedance is equal to the source impedance. For this to work in practice, you need a standard impedance that most devices implement; for telephones this was 600 ohms. One virtue of this approach, now mostly lost to us, is that many functions can be implemented passively - mixers, attenuators, and equalizers for example.
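As a quick sanity check on the matched-impedance idea, here is a small sketch (assuming the old 600 ohm telephone standard and an arbitrary 1 V source, purely for illustration) that computes the power delivered to a range of load impedances:

```python
# Power delivered to a load from a fixed source voltage behind a source impedance.
# 600 ohms is used because it was the old telephone standard; 1 V is arbitrary.
def load_power(z_source, z_load, v_source=1.0):
    i = v_source / (z_source + z_load)      # current through the loop
    v_load = i * z_load                     # voltage across the load
    return v_load * i                       # power = voltage * current

for z_load in (60, 300, 600, 1200, 6000):
    print(z_load, load_power(600, z_load))
# Power peaks at z_load == 600, i.e. when load impedance equals source impedance.
```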

Modern gear is mostly designed to operate on voltages rather than power or current. We have amplifiers, so we do not need to absorb the maximum power from a source device, we just need to sense its voltage in order to amplify it. Amplifiers have an input impedance so high that the usual range of source impedances does not matter, and they produce a voltage output with a source impedance so low that the usual range of load impedances does not matter. With this arrangement, there does not need to be much of a standard for impedances; source impedances from 1 ohm to several thousand ohms are commonly used, with load impedances from a few thousand ohms to megohms. Only occasionally does this produce any problems.
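To put rough numbers on the 1:10 and 1:100 rules of thumb from the original question, here is the same divider applied to a few source-to-load ratios (illustrative values only):

```python
# Fraction of the source voltage that survives at various source:load ratios.
# A ratio of 1:10 means the load impedance is ten times the source impedance.
for ratio in (1, 10, 100, 1000, 10_000):
    fraction = ratio / (1 + ratio)          # z_load / (z_source + z_load)
    print(f"1:{ratio:<6} keeps {fraction:.2%} of the source voltage")
# 1:10 loses about 9%, 1:100 about 1%; beyond that the loss is negligible,
# which is why there is no meaningful "too high" ratio for voltage transfer.
```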

We might just as well have standardized on current rather than voltage, in which case the rule would be high source impedances and low load impedances. In fact, this is how most DACs work. They just use a current-input, voltage-output amplifier to convert from current to voltage.
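As a very rough sketch of that current-to-voltage idea (the part values are made up, not taken from any specific DAC), an ideal current output feeding a current-to-voltage stage looks like this:

```python
# Ideal current-output source into a current-to-voltage (I/V) conversion stage.
# With a current output, the load wants to be a *low* impedance (ideally a
# virtual short), the opposite of the voltage-transfer rule above.
def iv_output_voltage(i_out_amps, r_iv_ohms):
    # Ideal I/V stage: output voltage = input current * conversion resistance.
    return i_out_amps * r_iv_ohms

# e.g. a 2 mA full-scale output current into a 1 kilohm I/V resistance
print(iv_output_voltage(0.002, 1000))   # 2.0 V full scale (illustrative numbers)
```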

Paul Joppa