Setting your gains with an o-scope is idiotic. Reasons being:

- You can't hear even significant distortion at sub frequencies, so why set for "no" distortion?
- Not all CDs are recorded at the same level, so what level do you set it at with the o-scope?
- You can't hear the difference between 900 W and 1000 W, so why do you care whether you set it to 1000 W? (You can take this a lot further than 10%, by the way; see the numbers below.)
- Everyone ignores the resolution of the o-scope and assumes that a mostly rounded wave means no distortion. It would be a good exercise to take a 5% distorted wave in the time domain and then reduce the spatial resolution: no clipping and clipping can easily look the same (see the sketch after this post).

There are more, but you get the picture. There is really no logic in using a DMM, o-scope, or distortion detector (ROFL) for setting gains. The reason the o-scope was made popular on forums is that people were using DMMs and setting their gains to a voltage their amps were rated for but couldn't produce cleanly at all. If you are dumb enough to need "every last drop," then the o-scope is better than the DMM, but that isn't saying much.
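To put numbers on the 900 W vs 1000 W point, here's a quick check in plain Python. The only outside fact is the commonly cited just-noticeable level difference of roughly 1 dB:

```python
import math

# Level difference between 900 W and 1000 W into the same load.
delta_db = 10 * math.log10(1000 / 900)
print(f"1000 W vs 900 W = {delta_db:.2f} dB")  # ~0.46 dB

# The just-noticeable level difference is commonly cited as about 1 dB,
# so a 10% power gap sits well below what you'd reliably hear.
```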
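And a minimal sketch of the resolution exercise from the last bullet. The numbers here are my assumptions: a sine driven about 12% past the clip rails (the script measures and prints the resulting THD, in the 4-5% range) and a 300 x 25 "pixel" grid standing in for what the eye resolves on a small scope screen. At that resolution, the clipped wave and its best-fit sine never land more than one pixel level apart:

```python
import numpy as np

# One cycle of a sine driven 12% past the clip rails (hard clipping).
N = 4800
theta = 2 * np.pi * np.arange(N) / N
clipped = np.clip(1.12 * np.sin(theta), -1.0, 1.0)

# Measure the distortion: harmonic energy relative to the fundamental.
spec = np.fft.rfft(clipped)
thd = np.sqrt(np.sum(np.abs(spec[2:]) ** 2)) / np.abs(spec[1])
print(f"THD of the clipped wave: {100 * thd:.1f}%")

# Best-fit sine = the fundamental component of the clipped wave.
fundamental = (2.0 / N) * (spec[1].real * np.cos(theta)
                           - spec[1].imag * np.sin(theta))

def to_pixels(x, cols=300, levels=25, full_scale=1.1):
    """Quantize a trace onto a coarse cols-by-levels display grid."""
    idx = np.linspace(0, N - 1, cols).astype(int)
    return np.round((x[idx] / full_scale + 1) / 2 * (levels - 1)).astype(int)

max_diff = np.max(np.abs(to_pixels(clipped) - to_pixels(fundamental)))
print(f"Largest gap between the two rendered traces: {max_diff} pixel level(s)")
```

Bump `levels` up to a few hundred and the gap between the two traces jumps to several pixel levels, so the flat top becomes obvious. On a coarse display, the clipping simply vanishes into the rounding.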