My mains voltage dropped as low as 115 VAC over the summer, causing the filament voltage regulator in my Beepre to drop out and hum with EML 300Bs (1.3 A filament). The low mains voltage must certainly have been mucking with my calibrated filament voltages in the Kaiju as well - in which I also use EML 300Bs. My temporary solution has been to use a variac to bring the voltage back up, but I've wanted a better solution - one that would be fun to build.
So I've been planning a DIY mains voltage quasi-regulator built around a big 2000 VA Hammond auto-transformer (AT). The AT is designed for an input of 115 VAC and has output taps at 85, 95, 105, 110, 115 and 125 VAC. I planned to use the 115 tap as the fixed output and have switchable inputs at the 105, 110 and 115 taps for nominal +10, +5 and 0 V choices. I figured this would be fine since I'd have a 9-10 A fuse on the AT that would limit the throughput to roughly 1370 VA in the off-chance, worst-case scenario that my mains went to 125 V while the AT input was set for +10 V - so well below its 2000 VA rating.
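Here's that worst-case arithmetic in Python, just as a sanity check - the boost ratios are my nominal assumptions from the tap voltages, not anything off a Hammond datasheet:

```python
# Worst-case VA through the AT with a 10 A fuse, for each selectable
# input tap (output is always taken from the 115 V tap). Ratios are nominal.
FUSE_A = 10.0
MAINS_HIGH_V = 125.0  # assumed worst-case high line

boost_ratio = {
    "115 tap (+0 V)": 115 / 115,
    "110 tap (+5 V)": 115 / 110,
    "105 tap (+10 V)": 115 / 105,
}

for tap, ratio in boost_ratio.items():
    v_out = MAINS_HIGH_V * ratio
    print(f"{tap}: ~{v_out:.0f} VAC out, ~{v_out * FUSE_A:.0f} VA max")
# The +10 V setting at 125 V mains gives ~137 VAC out and ~1370 VA.
```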
PB warned me that the output voltages might be a bit unpredictable if only low current were drawn from the AT - I probably won't ever draw more than 5 amps from the thing, and quite likely less - but I like the idea of having the headroom to scale up if I ever want to. I did some reading and saw that the "regulation" of ATs is pretty good - meaning that an AT's output voltage is less dependent on current draw than is the case with a traditional primary/secondary transformer. So I figured I'd give it a shot, and if the voltage boost was a bit off I could trim it with my planned "diode voltage trimmers". And here is where my problem was born.

My thinking was that since Schottky diodes, unlike resistors, provide a relatively fixed voltage drop regardless of current, I could use them to trim the output voltage from the AT. The idea was to put a pair of reverse-paralleled diodes in both the hot/live wire and the common wire, with the eventual load (the amps) sitting between them: on each half-cycle, the two forward-biased diodes (one from each pair) would drop a fixed nominal 1.7 VAC total ahead of the load. I planned to have two such "diode trimmer switches" so that I could choose 0, -1.7 or -3.4 V on top of the 0, +5 and +10 V from the AT.
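Enumerating the nominal combinations (the 1.7 V per engaged trimmer is my ballpark for two ~0.85 V Schottky drops in the conducting path, not a measured number):

```python
# All nominal output offsets: one AT input tap crossed with 0, 1 or 2
# engaged diode trimmer switches.
from itertools import product

at_boost_v = (0.0, 5.0, 10.0)    # 115, 110 or 105 input tap selected
trim_drop_v = (0.0, -1.7, -3.4)  # 0, 1 or 2 trimmer switches engaged

for boost, trim in product(at_boost_v, trim_drop_v):
    print(f"AT {boost:+5.1f} V, trimmers {trim:+4.1f} V -> net {boost + trim:+5.1f} V")
```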
Problem: I forgot to take the heat dissipation of the diodes into consideration. If I wanted the DIY regulator to handle up to 10 A, that would mean each diode generating 17 W of heat (1.7 V x 10 A). 8 diodes @ 17 W each is 136 watts! That's a lot of heat to get rid of from inside an enclosure.
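For what it's worth, here's a sketch of that heat math. My 136 W figure is the most pessimistic reading - it treats all 8 diodes as dropping the full 1.7 V continuously. If each conducting path is really two ~0.85 V Schottky drops and each diode only conducts on alternate half-cycles (both assumptions on my part), the average works out a lot lower - though I'd still rather size the cooling for the scary number:

```python
# Average diode heat under my assumptions: ~0.85 V per conducting Schottky,
# two conducting diodes per engaged trimmer (one in the hot pair, one in
# the common pair), and both trimmer switches engaged.
def trimmer_heat_w(i_a: float, n_trimmers: int = 2, vf_v: float = 0.85) -> float:
    """Total average heat across the engaged trimmers, in watts."""
    return i_a * (2 * vf_v) * n_trimmers

def per_diode_w(i_a: float, vf_v: float = 0.85) -> float:
    """Average heat in one diode; each conducts only on alternate half-cycles."""
    return i_a * vf_v * 0.5

print(trimmer_heat_w(10.0))  # ~34 W total at 10 A, vs. my 136 W worst case
print(per_diode_w(10.0))     # ~4.25 W average per diode
```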
So my tentative plan B is to build the AT box without the diode trimmers and just install the output switches (0, +5 and +10 V) and a small 17 A-rated common-mode choke PCB for taming whatever nasties might be on my mains.
BUT, I'D STILL LIKE TO DO THE DIODE TRIMMER THING, IF POSSIBLE, ON A SMALLER SCALE: 3 A instead of 10 A. I have three 3 A-rated combo common/differential-mode choke PCBs that I want to pair with the diode trimmer switches. I don't want to run the chokes at their max rating - from what I've read, inductance will be a bit higher at lower currents, and that should be good for filtering out noise - so my plan is to limit each combo filter to 1 A of current. So, worst case: 3 filters x 1 A x 1.7 V drop x 8 diodes (4 per switch) = 40.8 W.
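Same arithmetic as above at the smaller scale - assuming the two trimmer switches sit ahead of the filters and carry the combined 3 A:

```python
# Scaled-down heat estimate: a combined 3 A through both trimmer switches,
# with two ~0.85 V conducting diodes per engaged trimmer at any instant.
I_TOTAL_A, VF_V, TRIMMERS = 3.0, 0.85, 2

avg_w = I_TOTAL_A * (2 * VF_V) * TRIMMERS
print(f"~{avg_w:.1f} W average")  # ~10.2 W, vs. my 40.8 W worst case
```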
So my question is: do people think it's feasible to dissipate 41 W of heat (max) from inside an enclosure? I'm willing to implement whatever combo of heatsinking, air holes/screens and even battery-powered fans might be necessary to make this work. [Edit: and just to be clear - I'm not asking people to tell me exactly how to solve my heat dissipation problem (although that would be awesomely generous); I just want to know whether this is even worth trying to figure out - or is it just a lost cause?]
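To give the question some shape, here's the crude steady-state estimate I've been staring at - a sealed metal box shedding heat from its outer surface, with a textbook combined convection-plus-radiation coefficient of ~6 W/m²K. Both the coefficient and the example chassis size are rough assumptions on my part, and vents or a fan would only improve on this:

```python
# Crude steady-state temperature rise of a sealed enclosure dumping its
# heat through its outer surface by natural convection + radiation.
def enclosure_rise_c(power_w: float, area_m2: float, h_w_m2k: float = 6.0) -> float:
    """Approximate surface temperature rise above ambient, in deg C."""
    return power_w / (h_w_m2k * area_m2)

# Example: a 300 x 200 x 100 mm chassis has ~0.22 m^2 of surface area.
area_m2 = 2 * (0.3 * 0.2 + 0.3 * 0.1 + 0.2 * 0.1)
print(f"~{enclosure_rise_c(41.0, area_m2):.0f} C rise at 41 W")  # ~31 C
```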
many thanks in advance, Derek