It depends on your supply voltage and the actual current drawn by the charger, but this will vary during the charging process.
So let's assume the voltage is 230 volts, and current is maximum, i.e. 4 amps.
Then max power input is 230 x 4 = 920 W
Max energy consumption over 9 hours of charging is (230 x 4 / 1000) x 9 = 8.28 kWh, i.e. 8.28 units of electricity.
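The worst-case arithmetic above can be sketched in a few lines of Python (the 230 V, 4 A and 9 hour figures are the assumptions from the calculation, not measured values):

```python
# Worst-case estimate: assumes the charger draws its maximum
# current for the whole charging period.
voltage_v = 230.0   # assumed supply voltage
current_a = 4.0     # assumed maximum charger current
hours = 9.0         # assumed charging time

max_power_w = voltage_v * current_a           # 920 W
max_energy_kwh = max_power_w / 1000 * hours   # 8.28 kWh

print(max_power_w, "W,", max_energy_kwh, "kWh")  # → 920.0 W, 8.28 kWh
```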
However, if the batteries are lithium-ion, the charging current varies. A multi-stage charger charges at constant current during the first stage, until the battery voltage reaches a set level. During the second stage the charger holds a constant voltage, and the current gradually tapers off. So the actual energy use will be less than the maximum figure above, perhaps around half.
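To illustrate why the taper matters, here is a toy constant-current/constant-voltage (CC/CV) model. The stage lengths, decay time constant, and the simplification that input power is just V x I are all made-up illustrative assumptions, not real charger data:

```python
import math

# Toy CC/CV profile (illustrative assumptions only):
# full 4 A for the first 3 hours, then the current decays
# exponentially during the constant-voltage stage.
voltage_v = 230.0    # assumed supply voltage
i_max_a = 4.0        # assumed maximum current
cc_hours = 3.0       # assumed length of the constant-current stage
total_hours = 9.0    # assumed total charging time
tau_hours = 2.0      # assumed decay time constant in the CV stage

step_h = 0.01        # integration time step in hours
energy_wh = 0.0
t = 0.0
while t < total_hours:
    if t < cc_hours:
        current = i_max_a
    else:
        current = i_max_a * math.exp(-(t - cc_hours) / tau_hours)
    energy_wh += voltage_v * current * step_h  # power x time slice
    t += step_h

max_energy_wh = voltage_v * i_max_a * total_hours
print(round(energy_wh / 1000, 2), "kWh vs max", max_energy_wh / 1000, "kWh")
```

With these particular assumptions the model gives roughly 4.5 kWh, a little over half the 8.28 kWh worst case, which is in line with the rough estimate above.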
Energy = Power x time, so for a constant load, e.g. a heater, you simply multiply the power by the time to get the energy consumption. For a device like the charger above, where the power varies, the best way to measure actual energy consumption is a plug-in power/energy meter. It continually measures the instantaneous power, integrates it over time, and reports a total figure for energy used, which gives a much more accurate result than any hand calculation.
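The integration such a meter performs amounts to summing power samples over time. A minimal sketch using the trapezoidal rule, with hourly power samples that are made up purely for illustration:

```python
# What an energy meter does internally, in miniature: sample the
# instantaneous power, then integrate it over time.
# These sample values are invented for illustration.
samples_w = [920, 915, 900, 700, 450, 250, 120, 60, 20, 5]
interval_h = 1.0  # one sample per hour (a real meter samples far faster)

# Trapezoidal rule: average each pair of adjacent samples,
# multiply by the interval, and sum.
energy_wh = sum((a + b) / 2 * interval_h
                for a, b in zip(samples_w, samples_w[1:]))

print(round(energy_wh / 1000, 2), "kWh")  # → 3.88 kWh
```

A real meter samples many times per second rather than once per hour, so its running total tracks the true energy use very closely.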
Take a look at this article for more info: