It was suggested that because the efficiency of a power supply increases as it is loaded to its design output power, it is possible to draw 5W of power to charge a USB device, while taking less than 5W of extra power from the wall outlet. While that sounded reasonable on the face of it, I wondered if the idea represented better use of existing power, or actually getting something for nothing. So as an electronics engineer with an interest in physics, I did some calculations to find out. Here is the response I posted:
Short Answer:
Yes, you pay in full: you'll always pay for the USB power with at least that much more power from the wall. Not only is this required by the laws of thermodynamics, it's also inherent in the way power supplies work.
Longer Answer:
We'll take the whole system of the computer, its internal power supply, its operating circuits and the USB port circuitry to be one big, black box called the Supply. For the purposes of this illustration, the whole computer is one oversized USB charger, with two outputs: the computer operating power, which we will call Pc, and the output USB power, which we will call Pu.
Converting power from one form, (voltage, current, frequency), to another, and conducting power from one part of a circuit to another, are all physical processes which are less than perfect. Even in an ideal world, with superconductors and yet-to-be-invented components, the circuit can be no better than perfect. (The importance of this subtle message will turn out to be the key to this answer). If you want 1W out of a circuit, you must put in at least 1W, and in all practical cases a bit more than 1W. That bit more is the power lost in the conversion and is called loss. We will call the loss power Pl, and it is directly related to the amount of power delivered by the supply. Loss is almost always evident as heat, which is why electronic circuits which carry large power levels must be ventilated.
There is some mathematical function, (an equation), which describes how the loss varies with output power. This function will involve the square of output voltage or current where power is lost in resistance, and a frequency multiplied by output voltage or current where power is lost in switching. But we don't need to dwell on that; we can wrap all that irrelevant detail into one symbol, which we will call f(Po), where Po is the total output power. In this example Po = Pc + Pu, so output power and loss are related by the equation Pl = f(Pc+Pu).
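To make that concrete, here is a minimal sketch of what such a loss function might look like, with a resistive (I²R) term and a switching term. Every coefficient, the 12V rail and the 100kHz switching frequency are invented purely for illustration; they are not taken from any real supply.

```python
# Hypothetical loss function f(Po): a resistive term plus a switching term.
# All of the numbers here are assumed for illustration, not measured from a real supply.
def loss(p_out, v_out=12.0, r_equiv=0.02, k_switch=5e-7, f_switch=100e3):
    i_out = p_out / v_out                       # output current at the nominal rail voltage
    p_resistive = i_out ** 2 * r_equiv          # power lost in resistance (I squared R)
    p_switching = k_switch * f_switch * i_out   # switching loss, proportional to frequency and current
    return p_resistive + p_switching
```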
A power supply is a circuit which requires power to operate, even if it is delivering no output power at all. Electronics engineers call this the quiescent power, and we'll refer to it as Pq. Quiescent power is constant, and is absolutely unaffected by how hard the power supply is working to deliver the output power. In this example, where the computer is performing other functions besides powering the USB charger, we include the operating power of the other computer functions in Pq.
All of this power comes from the wall outlet, and we will call the input power Pw, (Pi looks confusingly like Pl, so I switched to Pw for wall power).
So now we are ready to put the above together and get a description of how these power contributions are related. Firstly, we know that every microwatt of output power, and every microwatt of loss, comes from the wall. So:
Pw = Pq + Pl + Pc + Pu
And we know that Pl = f(Pc+Pu), so:
Pw = Pq + f(Pc+Pu) + Pc + Pu
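As a sketch only, here is that balance written out as code. The quadratic loss function and all of the wattages are assumed purely for illustration; the point is the bookkeeping, not the particular numbers.

```python
# Wall power: Pw = Pq + f(Pc + Pu) + Pc + Pu
def wall_power(p_computer, p_usb, p_quiescent, loss_fn):
    p_out = p_computer + p_usb              # total output power Po
    return p_quiescent + loss_fn(p_out) + p_out

# An assumed quadratic loss function and illustrative figures.
f = lambda po: 1e-4 * po**2 + 1e-2 * po

without_usb = wall_power(p_computer=100.0, p_usb=0.0, p_quiescent=30.0, loss_fn=f)
with_usb    = wall_power(p_computer=100.0, p_usb=5.0, p_quiescent=30.0, loss_fn=f)
print(without_usb, with_usb, with_usb - without_usb)   # the difference comes out above 5 W
```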
Now we can test the hypothesis that taking power from the USB output increases the wall power by less than the USB power. We can formalise this hypothesis, see where it leads, and see whether it predicts something absurd, (in which case the hypothesis is false), or something realistic, (in which case the hypothesis remains plausible).
We can write the hypothesis first as:
(Wall power with USB load) - (Wall power without USB load) < (USB power)
and mathematically as:
[ Pq + f(Pc+Pu) + Pc + Pu ] - [ Pq + f(Pc) + Pc ] < Pu
Now we can simplify this by eliminating the same terms on both sides of the minus sign and removing the brackets:
f(Pc+Pu) + Pu - f(Pc) < Pu
then by subtracting Pu from both sides of the inequality (< sign):
f(Pc+Pu) - f(Pc) < 0
Here is our absurdity. What this result means in plain English is:
The extra loss involved in taking more power from the supply is negative
This means negative resistors, negative voltages dropped across semiconductor junctions, or power magically appearing from the cores of inductors. All of this is nonsense, fairy tales, wishful thinking of perpetual-motion machines, and is absolutely impossible.
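A quick numerical check makes the same point: for any physically sensible loss function, one that never decreases as output power rises, the extra wall power is never less than Pu. Here is a sketch using the quadratic loss model that appears later in this post; the operating points are arbitrary.

```python
# f(Pc + Pu) - f(Pc) is never negative for a loss function with non-negative,
# non-decreasing terms, so the extra wall power is always at least Pu.
A, B = 1e-4, 1e-2
f = lambda po: A * po**2 + B * po

p_usb = 5.0
for p_computer in (50.0, 100.0, 400.0, 800.0):
    extra_loss = f(p_computer + p_usb) - f(p_computer)
    extra_wall = p_usb + extra_loss          # Pq and Pc cancel, exactly as in the algebra above
    print(f"Pc = {p_computer:5.0f} W: extra wall power = {extra_wall:.3f} W (never below {p_usb} W)")
```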
Conclusion:
It is not physically possible, theoretically or otherwise, to get power out of a computer USB port with less than the same amount of extra power coming from the wall outlet.
What did @zakinster miss?
With the greatest respect to @zakinster, he has misunderstood the nature of efficiency. Efficiency is a consequence of the relationship between input power, loss and output power, and not a physical quantity of which input power, loss and output power are consequences.
To illustrate, let's take the case of a power supply with a maximum output power of 900W, losses given by Pl = A·Po² + B·Po where A = 10^-4 and B = 10^-2, and Pq = 30W. Modelling the efficiency (Po/Pw) of such a power supply in Excel, and graphing it on a scale similar to the Anand Tech curve, gives the efficiency curve described below.
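The original graph was drawn from an Excel sheet; a short Python equivalent of the same model (same A, B and Pq, without the matching axis scale) might look like this:

```python
# Efficiency of the modelled supply: Pl = A*Po^2 + B*Po, Pq = 30 W, Po up to 900 W.
A, B, Pq = 1e-4, 1e-2, 30.0

def efficiency(p_out):
    p_loss = A * p_out**2 + B * p_out
    p_wall = Pq + p_loss + p_out           # Pw = Pq + Pl + Po
    return p_out / p_wall

for p_out in (10, 50, 100, 300, 600, 900):
    print(f"Po = {p_out:3d} W  ->  efficiency = {efficiency(p_out) * 100:5.1f} %")
```

Run as written, the efficiency climbs from roughly 25% at a 10W load into the high 80s above 300W.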
This model has a very steep initial curve, like the Anand Tech supply, but is modelled entirely according to the analysis above which makes free power absurd.
Let's take this model and look at the examples @zakinster gives in Case 2 and Case 3. If we change Pq to 50W and make the supply perfect, with zero loss, then we can get 80% efficiency at a 200W load. But even in this perfect situation, the best we can get at 205W is 80.39% efficiency. To reach the 80.5% that @zakinster suggests is a practical possibility requires a negative loss function, which is impossible, and achieving 82% efficiency is further still beyond reach.
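The arithmetic behind those two figures is short enough to check directly; with zero loss the efficiency is simply Po/(Po+Pq):

```python
# Perfect (zero-loss) supply with Pq = 50 W: efficiency = Po / (Po + Pq).
Pq = 50.0
for p_out in (200.0, 205.0):
    eff = p_out / (p_out + Pq)
    print(f"Po = {p_out:.0f} W  ->  efficiency = {eff * 100:.2f} %")
# 80.00 % at 200 W and 80.39 % at 205 W: even with zero loss, 80.5 % is out of reach.
```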
For a summary, please refer to the Short Answer above.