PB - I tried out my 3.3K transformers this morning.
I nixed my bias supply and just used 520K to ground. Not ideal, since the grid voltage isn't really adjustable; it is fixed by the gate-to-source voltage of my FET buffer, so -3.1V on the grid.
I was able to adjust my bias point to some degree using the trimpot on my Maida regulator, and dialed it in to 335Va / 57mA / -3.1Vg.
With this, I am getting a plate swing from 156V to 524V (368Vpp), with the 801A hard clipping beyond 524V on the positive peaks. That works out to 16Vpp into 8 ohms, about 4W out. This is pretty darn consistent with the datasheet if I draw out the load line.
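For anyone who wants to check the arithmetic, this is just the sine-power formula, P = Vpp^2 / (8R). A quick Python sketch, assuming a clean, undistorted sine (the function name is mine):

```python
# Average sine power from a peak-to-peak swing:
# Vrms = Vpp / (2*sqrt(2)), so P = Vrms^2 / R = Vpp^2 / (8 * R)
def power_from_vpp(vpp, r_load):
    return vpp ** 2 / (8 * r_load)

print(power_from_vpp(16, 8))            # secondary: 4.0 W, matching the measurement
print(power_from_vpp(524 - 156, 3300))  # 368 Vpp plate swing into 3.3K: ~5.1 W
```

The gap between ~5.1W at the plate and 4W at the speaker would be transformer losses, which seems about right.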
Ideally, I would get to a true 0V bias point at 320V on the plate with a 3K (rather than 3.3K) primary impedance. It is doable; I would have to put +3V on the gate of my FET. At that bias, if the datasheet is to be believed - and so far it has been consistent with my measurements - even assuming no copper losses, that makes 5.5W into a 3K load. Even if you did something crazy and biased at +10V on the grid with 280V on the plate, that is 400Vpp into 3K, which makes 6.4W.
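Same formula on the ideal 3K numbers - these assume a perfect transformer and a symmetric sine, so they land a touch above what the actual load line gives:

```python
# Idealized 3K scenarios: P = Vpp^2 / (8 * R), no losses
print(400 ** 2 / (8 * 3000))    # +10Vg case: ~6.7 W ideal vs ~6.4 W off the load line
print((8 * 5.5 * 3000) ** 0.5)  # the 5.5 W figure implies ~363 Vpp of plate swing
```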
So, I am just not seeing how it is physically possible to get 8W out of an 801A A2 amp at any 0V bias point. Am I missing something here?
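Running the same formula backwards makes the point - ignoring every loss:

```python
# Plate swing required for 8 W, inverting P = Vpp^2 / (8 * R)
for r in (3000, 3300):
    print(r, "ohms:", round((8 * 8 * r) ** 0.5), "Vpp")  # ~438 and ~460 Vpp
```

A symmetric 438Vpp around a ~320V plate means positive peaks up near 540V, beyond where I was already hard clipping.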
I added additional NFB, given the reduced gain needed from the driver stage, and got the output impedance down to 2 ohms. FR of the LL1620 is below; I think I am hitting transformer limitations on the high end again.
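For anyone following along, the mechanism is the standard feedback relation: closed-loop output impedance is the open-loop value divided by one plus the loop gain. The numbers below are illustrative only - I haven't measured open-loop Zout or loop gain directly:

```python
# NFB divides output impedance by (1 + loop gain): Zout_cl = Zout_ol / (1 + T)
# Illustrative values, not measured
def zout_closed(zout_open, loop_gain):
    return zout_open / (1 + loop_gain)

print(zout_closed(6.0, 2.0))  # e.g. 6 ohms open-loop with T = 2 -> 2 ohms
```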
Thanks for your thoughts.