I have been enjoying the recent Circuit Surgery columns on Power Supplies and Potential Dividers in EPE Magazine. I love the way Ian Bell tackles each subject, explaining the theory clearly from first principles and using worked examples to aid understanding. I wondered whether Circuit Surgery could take a look at a subject that has always been a little fuzzy for me: switching power supply source impedance and how to mitigate it.

We all know the rule of thumb that the source impedance at the input of a voltage regulator must be lower than the magnitude of the converter's negative input impedance (negative because the line voltage drops as the regulator draws more current) by a factor of at least ten. We also know that a high source impedance (for example, long wires from the source to the regulator) can be mitigated with some input capacitance, usually a large-value electrolytic.

What is not clear to me is how big that capacitor should be for a given source impedance. Explanations range from the highly mathematical to the generic "use a 470uF for everything". Is there a way to work out a suitable minimum capacitance value, based on the estimated or calculated impedance of the source?
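
To put that rule of thumb in concrete terms, here is a rough sketch of the check as I understand it. The constant-power approximation for the converter's input impedance (roughly -V_in^2 / P_in at the operating point) and the example figures are my own assumptions for illustration, not something taken from the column.

# Minimal sketch of the "factor of ten" source-impedance rule of thumb.
# Assumption: a regulator delivering constant power P_in at input voltage
# V_in looks, for small signals, like a negative resistance of roughly
# -V_in**2 / P_in at its input terminals.

def converter_input_impedance(v_in, p_in):
    """Approximate small-signal input impedance (ohms) of a constant-power
    load drawing p_in watts at v_in volts. Negative because the input
    current rises as the input voltage falls."""
    return -(v_in ** 2) / p_in

def source_impedance_ok(z_source, v_in, p_in, margin=10.0):
    """Rule-of-thumb check: the source impedance should be at least
    'margin' times smaller than the magnitude of the converter's
    negative input impedance."""
    return z_source <= abs(converter_input_impedance(v_in, p_in)) / margin

# Example (assumed figures): a converter drawing 10 W from a 12 V supply.
v_in, p_in = 12.0, 10.0
z_in = converter_input_impedance(v_in, p_in)   # about -14.4 ohms
print(f"Converter input impedance ~ {z_in:.1f} ohms")
print("0.5 ohm source OK?", source_impedance_ok(0.5, v_in, p_in))   # True
print("5.0 ohm source OK?", source_impedance_ok(5.0, v_in, p_in))   # False

This only checks the margin against a given source impedance, of course; it says nothing about how much capacitance is needed when that check fails, which is exactly the part I would like to see worked through.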