Hello!
Ohm's law has been known for quite a long time, and it can be used here too: put a RESISTOR between the power supply and the battery. If the voltage difference is 2V (20V - 18V), and that difference is supposed to drive a current of 0.5A, then Ohm's law gives the resistance: 2V / 0.5A = 4 ohms. The power dissipated in this resistor will be 2V x 0.5A = 1W, so its power rating should be at least that.
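Here is the same arithmetic written out as a tiny Python sketch, just as a sanity check (the 20V / 18V / 0.5A figures are the ones from the example above, not measured values):

# series resistor for limiting the charging current (assumed example values)
supply_v = 20.0      # rectifier output voltage [V]
battery_v = 18.0     # battery voltage at that moment [V]
target_i = 0.5       # desired charging current [A]

delta_v = supply_v - battery_v   # voltage that must drop across the resistor
r = delta_v / target_i           # Ohm's law: R = U / I  -> 4 ohms
p = delta_v * target_i           # power dissipated in the resistor -> 1 W

print(f"R = {r:.1f} ohm, P = {p:.2f} W")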
However, if the battery voltage is 20V and the rectifier voltage is also 20V, no current flows at all, regardless of the resistance value.
However, charging a battery is not that simple: if it is a lead-acid battery with nine 2-volt cells (18V nominal), the final charging voltage is about 2.4V per cell, i.e. 21.6V in total, and a power supply that gives less than that simply WILL NOT CHARGE this battery.
If it is an alkaline-chemistry battery (e.g. NiMH), there are 15 cells of 1.2V each (again 18V nominal), but its final charging voltage is about 1.42V per cell, i.e. roughly 21.3V - and again, a rectifier that gives less WILL NOT CHARGE that battery.
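A minimal sketch of that cell-count arithmetic, assuming the usual end-of-charge figures of roughly 2.4V per lead-acid cell and roughly 1.42V per NiMH cell (check your battery's datasheet, these are typical numbers, not yours):

# end-of-charge voltage estimates (assumed per-cell figures, see above)
lead_acid_cells, lead_acid_eoc = 9, 2.4     # 9 cells x 2.4 V
nimh_cells, nimh_eoc = 15, 1.42             # 15 cells x 1.42 V

print(f"lead-acid final charging voltage: {lead_acid_cells * lead_acid_eoc:.1f} V")   # 21.6 V
print(f"NiMH final charging voltage:      {nimh_cells * nimh_eoc:.1f} V")             # ~21.3 V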
One more thing about the voltage: when your rectifier gives 20V, is that a smooth (stabilized) DC voltage or pulsating DC (a rectified sine wave), and under what load was it measured?
And what happens if you put a capacitor on the output of the rectifier, e.g. 100uF/40V?
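The point of that question: with a reservoir capacitor the output no longer sits at the average of the rectified sine but climbs toward its peak. A rough sketch, assuming the 20V is the RMS value of the winding, a bridge rectifier, light load, and ignoring ripple (all of these are my assumptions, not your measurement):

import math

v_rms = 20.0                     # assumed: 20 V is the RMS of the rectified sine
diode_drop = 2 * 0.7             # two conducting diodes in a bridge rectifier

v_peak = v_rms * math.sqrt(2)    # peak of the sine, ~28.3 V
v_cap = v_peak - diode_drop      # what the capacitor charges to at light load, ~26.9 V

print(f"sine peak: {v_peak:.1f} V, on the capacitor (no load): {v_cap:.1f} V")

So the capacitor can push the no-load output well above the 20V you measured, which changes the whole charging picture.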