I must be missing something, because I'm not sure exactly what is stumping you.
If you are using a light bulb in the AC mains line to limit the current, it will definitely drop some voltage with the amp turned on. With the amp turned off, no current flows through the light bulb, so it drops no voltage at all; hence your meter reads the full 120+ at the inlet to the amp. I can only guess that the light bulb is dropping slightly more voltage as its filament warms up, so the resultant voltage at your amp drops slightly.
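If it helps to put rough numbers on it, here's a quick sketch of the bulb and amp acting as a series voltage divider. The 60 W bulb rating and the amp's half-amp draw are assumed figures for illustration only, not measurements of your setup:

```python
# Rough voltage-divider estimate for a dim-bulb limiter in series with an amp.
# All numbers below are illustrative assumptions, not measured values.

V_LINE = 120.0                  # nominal line voltage (V RMS)

# A 60 W / 120 V bulb has a hot-filament resistance of roughly:
R_BULB_HOT = 120.0**2 / 60.0    # ~240 ohms

# Suppose the amp draws about 0.5 A when warmed up; model it as a resistance:
R_AMP = V_LINE / 0.5            # ~240 ohms (assumed, for illustration)

current = V_LINE / (R_BULB_HOT + R_AMP)
v_bulb = current * R_BULB_HOT
v_amp = current * R_AMP

print(f"Current: {current:.2f} A")
print(f"Bulb drops {v_bulb:.1f} V, amp sees {v_amp:.1f} V")
# With the amp off, current is ~0, so the bulb drops ~0 V and the meter
# at the amp's inlet reads the full line voltage.
```

The exact split depends on what the amp actually draws, but the principle holds: the more current the amp pulls, the more of the line voltage the bulb takes for itself.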
An interesting experiment might be to configure your test set-up so that you can measure the voltage across the light bulb. Let everything get nice and cool, then watch the voltage across the bulb as you turn the amp on and let it warm up.
What you should find is that the voltage drop across the light bulb, added to the voltage measured at the amp, always adds up to your line voltage of 120+. Obviously, if you can borrow a second meter and measure both at the same time, the experiment goes more quickly! But you can do it in sequence, too; just let everything cool completely between measurements.
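That "always adds up" is just Kirchhoff's voltage law around the series loop. A tiny sanity check you can run on your logged readings (the numbers below are made-up examples, not your actual measurements):

```python
# KVL check: bulb drop + amp voltage should equal line voltage.
# These readings are hypothetical placeholders, not real data.
v_line = 121.5   # measured at the outlet
v_bulb = 18.3    # measured across the light bulb, amp on and warm
v_amp = 103.2    # measured at the amp's inlet at (nearly) the same moment

total = v_bulb + v_amp
assert abs(total - v_line) < 2.0, "readings don't sum; re-check the setup"
print(f"Bulb + amp = {total:.1f} V vs. line = {v_line:.1f} V")
```

If the sum is well off from the line voltage, something else in the loop is dropping voltage, or the readings weren't taken under the same conditions.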
The reason for the cooling-off periods is this: the filament of the light bulb and the filaments of the tubes are both basically tungsten, and a tungsten filament's resistance rises dramatically as it warms up. Also, as mentioned above, the power supply caps, once charged, don't draw much more current until they discharge. With the SEX, there are bleeder resistors to drain them slowly when the amp is turned off, but once they've drained out, they will demand a lot of current when the amp gets turned back on.
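To show why starting cold matters, here's a sketch of the inrush. The roughly 10:1 cold-to-hot resistance ratio is a typical figure for tungsten, and the bulb and amp values are the same assumptions as in the earlier sketch:

```python
# Inrush illustration: a cold tungsten filament has roughly 1/10 of its hot
# resistance, so a cold bulb + cold amp draw far more current at switch-on.
# Ratios and wattages here are typical/assumed figures, not measurements.

V_LINE = 120.0
R_BULB_HOT = 120.0**2 / 60.0   # ~240 ohms for an assumed 60 W bulb
R_AMP_HOT = 240.0              # assumed warmed-up amp load (see earlier sketch)

COLD_FRACTION = 0.1            # tungsten: cold resistance ~1/10 of hot (typical)

i_hot = V_LINE / (R_BULB_HOT + R_AMP_HOT)
i_cold = V_LINE / (COLD_FRACTION * (R_BULB_HOT + R_AMP_HOT))

print(f"Warm steady-state current: {i_hot:.2f} A")
print(f"Cold switch-on current:    {i_cold:.2f} A (~{i_cold / i_hot:.0f}x inrush)")
# Discharged power-supply caps add to this: they look like a near-short until
# they charge, which is why the bulb flashes bright at turn-on and then dims.
```

So if you take measurements without letting things cool, you're comparing a warm, high-resistance circuit against a cold, low-resistance one, and the readings won't line up.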
It is very good that you got an outlet checker and did these additional measurements. Having a good safety ground on equipment with exposed metal parts can be vital.
Again, I fear I have missed the issue you are having, so please feel free to re-state it, and I will do my best to give you a better response.