CD Player Output Voltage


Offline dbishopbliss

  • Sr. Member
    • Posts: 287
on: March 22, 2011, 07:54:26 AM
From what I understand, CD players have a nominal output voltage of 2.5V.

Does this mean that preamps and linestages designed before CD players existed WILL get overloaded by the signal coming from a CD player?  Or is the correct term MIGHT?  In other words, do CD players regularly output this much voltage, or only when playing very loud source material?

Does this mean that if I choose an operating point for a linestage (or a driver tube in an integrated amp) that is below the -2.5V grid line, it will (or might) get overdriven by the CD player?

The reason I'm asking is that the tube I'm looking at is more linear near the -2V grid line than near the -3V grid line.  However, linearity won't do me much good if the tube is overdriven.

Could I simply add a padding resistor at the input to attenuate the incoming signal to less than 2V?
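
To put numbers on that worry, here is a quick back-of-the-envelope sketch (assuming a 2V RMS full-scale sine from the player, which may not match any particular machine):

    import math

    # Headroom check: does a full-scale CD output swing past a -2 V
    # operating point?  The 2.0 V RMS figure is an assumption.
    v_rms = 2.0
    v_peak = v_rms * math.sqrt(2)    # ~2.83 V peak swing at the grid

    grid_bias = 2.0                  # magnitude of a -2 V bias point
    print(f"peak input {v_peak:.2f} V vs. bias {grid_bias:.1f} V")
    print("grid driven positive" if v_peak > grid_bias else "within bias")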

David B Bliss
Bottlehead: Foreplay I, Foreplay III, Paramour I w/Iron Upgrade, S.E.X. w/Iron Upgrade
Speakers: FE127E Metronomes, Jim Griffin Jordan/Aurum Cantus Monitors, ART Arrays
Other: Lightspeed Attenuator, "My Ref" Rev C Amps, Lampucera DAC


Offline JC

  • Sr. Member
    • Posts: 485
Reply #1 on: March 22, 2011, 08:04:44 AM
You know, I have never been sure exactly what the output rating of CD players in general refers to.  Peak-to-Peak Voltage is quite a bit different from RMS Voltage, for instance.  Maybe someone else can point to something definitive.

For your question, it seems to me that you would want to know the peak Voltage points of the signal.
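
For a sine wave, at least, the conversions are simple arithmetic.  A quick sketch, assuming the commonly quoted 2V figure is RMS (which is itself an assumption):

    import math

    # Sine-wave relationships: Vpeak = Vrms * sqrt(2), Vp-p = 2 * Vpeak.
    v_rms = 2.0
    v_peak = v_rms * math.sqrt(2)    # ~2.83 V
    v_pp = 2 * v_peak                # ~5.66 V peak-to-peak
    print(f"{v_rms:.1f} V RMS = {v_peak:.2f} V peak = {v_pp:.2f} V p-p")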

Having said all that, though, it would certainly be easy enough to pad down the input you use for the CDP.  A resistor or two would do it.
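
Something along these lines, for instance (the values here are placeholders to show the math, not a recommendation; you'd scale them to your source and whatever follows the pad):

    # Two-resistor L-pad: Vout = Vin * R2 / (R1 + R2).
    v_in = 2.0        # volts, assumed CD player output
    r1 = 27_000.0     # series resistor, ohms (illustrative value)
    r2 = 47_000.0     # shunt resistor, ohms (illustrative value)
    v_out = v_in * r2 / (r1 + r2)
    print(f"{v_in:.1f} V in -> {v_out:.2f} V out (x{r2 / (r1 + r2):.2f})")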

Jim C.


Offline Doc B.

  • Administrator
  • Hero Member
    • Posts: 9557
    • Bottlehead
Reply #2 on: March 22, 2011, 08:08:04 AM
That's why they put a volume control at the input of a preamp, so you can trim down the signal. The place where too much signal can be a problem is an input with no level control; a common example would be the input of a power amplifier. If you connected a CD player directly to an amp with no level control in between, you can imagine what might happen.

FWIW, there is a nominal standard of 2V for CD player output. But in reality the output levels of various players and DACs can be anywhere from way less than a volt to 8V.
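
To put that spread in perspective, a rough sketch (assuming all the figures are RMS):

    import math

    # Deviation from the 2 V nominal, in dB; 0.5 V and 8 V bracket the
    # real-world range mentioned above, the rest are fill-in points.
    for v in (0.5, 1.0, 2.0, 4.0, 8.0):
        print(f"{v:>3} V -> {20 * math.log10(v / 2.0):+6.1f} dB re 2 V")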

Dan "Doc B." Schmalle
President For Life
Bottlehead Corp.


Offline JC

  • Sr. Member
    • Posts: 485
Reply #3 on: March 22, 2011, 08:20:17 AM
Yup, the DAC I have is one of the 8 Volt variety.  And I can't say that I have ever come across a satisfactory explanation of why that Voltage was chosen.  For that matter, I'm not sure why any of the output Voltages you mention are considered "nominal" by various device manufacturers.

I will say this, though: if video used as many variations in levels and impedances as audio does, we would probably never have had VCRs, let alone DVD players.

Jim C.