You can't really just declare an arbitrary X mV of ripple.
Oh yes I can!! It is not doing me a damn bit of good, but I can!!
You have to look at the ripple as a percentage of the B+ and consider how that affects the circuit's S/N ratio.
Thanks Doc this is most helpful!
Start with what resolution your DAC chip is. 24-bit? The best S/N you can get from 24-bit DACs these days is about -124dB; for SACD it's about -120dB. If it's a 16-bit DAC, the maximum theoretical S/N is about -96dB. So take one of those as your starting point and see how close you can get. It will be nearly impossible to hit those super-low numbers with a tube stage unless you use a lot of feedback.
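For reference, the theoretical ceiling comes from the standard quantization-noise formula SNR ≈ 6.02·N + 1.76 dB for an N-bit converter with a full-scale sine input. A quick sanity check (the formula gives the ideal limit; real 24-bit parts fall well short of it, as the numbers above show):

```python
def ideal_snr_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit converter (full-scale sine)."""
    return 6.02 * bits + 1.76

for bits in (16, 18, 24):
    print(f"{bits}-bit: {ideal_snr_db(bits):.1f} dB")
# 16-bit: 98.1 dB (commonly rounded to ~96 dB)
# 18-bit: 110.1 dB
# 24-bit: 146.2 dB (real silicon tops out around 124 dB)
```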
The DAC is 18-bit (AD1865). Plate voltage on the output tube is 200V. So if I understand you correctly (no given there!), if I get down to about 1-3mV of ripple, the noise would be down about -96 to -106dB. And that would be in the range of the S/N ratio for the 18-bit DAC, 110dB.
So just plan to design the power supply for the lowest ripple you can muster. Typically that means some serious voltage regulation. If you get it low enough, then you can tear your hair out trying to get the 1/f noise out of the tubes, because it will dominate - assuming you have the grounding perfect, excellent shielding, etc., etc.
According to PSUD2 I can get down to about 2mV with a C-LC-RC-LC filter. The first C is just a trimmer cap to set the voltage range; the RC helps bring the voltage down and provides some filtering.
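You can also rough-check a PSUD2 result by hand: each LC section attenuates ripple by roughly 1/((2πf)²LC - 1) well above its resonance, and an RC section by 1/√(1 + (2πfRC)²). The component values below are hypothetical, NOT the actual PSUD2 schematic from the post, just to show the shape of the calculation:

```python
import math

F_RIPPLE = 120.0  # Hz, full-wave rectified 60 Hz mains (assumption)

def lc_attenuation(l_h: float, c_f: float, f: float = F_RIPPLE) -> float:
    """Approximate ripple attenuation of an LC section, f well above resonance."""
    w = 2 * math.pi * f
    return 1.0 / (w * w * l_h * c_f - 1.0)

def rc_attenuation(r_ohm: float, c_f: float, f: float = F_RIPPLE) -> float:
    """Ripple attenuation of an RC section (R against the cap's reactance)."""
    w = 2 * math.pi * f
    return 1.0 / math.sqrt(1.0 + (w * r_ohm * c_f) ** 2)

# Hypothetical values: 10 V ripple on the first cap, then LC-RC-LC
ripple_in = 10.0  # volts (assumption)
total = (ripple_in
         * lc_attenuation(2, 22e-6)    # 2 H / 22 uF choke section
         * rc_attenuation(500, 22e-6)  # 500 ohm / 22 uF RC section
         * lc_attenuation(2, 22e-6))   # second 2 H / 22 uF choke section
print(f"Estimated output ripple: {total * 1000:.2f} mV")
```

With these made-up values the chain lands in the low-millivolt range, the same ballpark PSUD2 is reporting; PSUD2 of course also models rectifier conduction angle, source impedance, and load current, which this back-of-envelope skips.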
If I get this to work, then Phase 2 will be to try some tube regulation.
Cheers,
Geary