Hi, I don't have much knowledge of electrical engineering. Can anyone enlighten me as to the interpretation of my test results? I am currently doing a battery life test as part of my thesis. The situation is like this: the disposable lithium battery has a starting voltage of 3.6 V. A minute after connecting it to an LED and a resistor, the voltage drops to 3.51 V. Afterwards I check the remaining voltage daily. After 3 days it read 3.25178 V, the 4th-day measurement was 3.178912 V, but the next day's measurement (the 5th day) showed 3.26891 V. Is it possible for a disposable battery to increase in voltage? What could be the reason for this discrepancy in the measurements?
But is it possible for this to happen? Or is there no way it could happen? As far as I know, the voltage should drop continuously as the days pass. But in my case, there was a day where the voltage increased compared to the previous day's reading.
Check out what your load is at each step, remembering that your load is not static if there is an LED present. What happens if you change the load on a battery? Cheers, Richard
Sorry, I really don't get it. The aim of the battery test was to check the current consumption and how many days the battery lasts. From day one of the experiment the circuitry has remained the same; I have not changed anything. I just check the battery voltage every day with a Fluke meter, so I expect the battery voltage to decrease due to current consumption.

Nominal: 3.6 V
Day 1) 3.5100 V (decrease)
Day 3) 3.25178 V (decrease)
Day 4) 3.178912 V (decrease)
Day 5) 3.26891 V (increase) -> isn't this supposed to decrease as well?
Day 6) 2.98661 V
Don't use an LED in the circuit, just a fixed load resistor, then do the measurements again and see what happens. What happens to the LED current as the voltage across it drops? Is the voltage-versus-current relationship linear, as it is in a resistor? In other words, my guess is that the LED is changing the way the circuit discharges. Testing battery life without a constant power output does not give the full story. In the real world one has to assume that the power drawn from the battery stays constant in use (it is likely to be powering a switch-mode supply), so during discharge the current drawn from the battery actually has to increase as its terminal voltage drops, which makes determining battery life even harder. Cheers, Richard
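To illustrate the constant-power point, here is a minimal Python sketch. The 50 mW load figure is purely an assumption for illustration, not a value from the experiment; it just shows that a constant-power load must draw more current as the terminal voltage falls (I = P / V):

```python
# Minimal sketch: current drawn by an assumed constant-power load
# as the battery terminal voltage falls (I = P / V).
LOAD_POWER_W = 0.05  # assumed 50 mW load, illustrative only

for v_batt in (3.6, 3.4, 3.2, 3.0, 2.8):
    current_ma = LOAD_POWER_W / v_batt * 1000  # convert A to mA
    print(f"{v_batt:.1f} V -> {current_ma:.1f} mA")
```

So a battery feeding a switch-mode supply is worked harder and harder towards the end of its discharge, which is the opposite of what a simple resistor or LED load does.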
This is normal. A cell has a certain terminal voltage dependent on its state of charge. This voltage drops as soon as a load is connected, due to the cell's internal resistance. After a period of discharge a cell's voltage drops, but when the load is disconnected it recovers to some extent over time. The internal structure and chemistry of the cell affect its discharge characteristics. Changes in ambient temperature will also account for differences in the cell's voltage. Are you measuring the current taken by the circuit?
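As a rough numerical illustration of that internal-resistance effect (all values below are assumptions for the sketch, not measurements from this experiment):

```python
# Minimal sketch of the internal-resistance effect described above.
# All values are assumed for illustration, not measured.
V_OC = 3.30        # assumed open-circuit (recovered) cell voltage, volts
R_INTERNAL = 2.0   # assumed internal resistance, ohms
I_LOAD = 0.020     # assumed load current, amps (20 mA)

# Terminal voltage while the cell is delivering current:
v_loaded = V_OC - I_LOAD * R_INTERNAL
print(f"Open-circuit: {V_OC:.2f} V, under load: {v_loaded:.2f} V")
# The cell reads lower while loaded and "recovers" toward V_OC
# when the load is removed or lightened.
```

Because the internal resistance and the open-circuit voltage both move around with state of charge and temperature, a day-to-day reading can easily creep back up a little.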
You are working to a lot of decimal places! At that resolution the reading may even be affected by how firmly you press the meter probe onto the battery terminal each day.
A very useful site for further info: batteryuniversity.com
Concerning Richard's comment about the LED: keep the circuit the same throughout the experiment. What he is saying is that LEDs are very sensitive to voltage: a small change in voltage across an LED results in a large change in current, so as the battery voltage falls, the LED draws less current. However, if your circuit is just a resistor and an LED in series, the resistor value will be the main factor determining the current. Even so, the reduction in cell voltage will noticeably reduce the current taken by the LED.
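A rough sketch of why the falling cell voltage still reduces the current in a resistor + LED series circuit, using the simple approximation I = (V_batt - V_f) / R. The forward voltage and resistor value here are assumptions, not your actual parts:

```python
# Minimal sketch: series resistor + LED current at a few battery voltages,
# using the simple approximation I = (V_batt - V_f) / R.
# Forward voltage and resistor value are assumed, not the OP's components.
V_FORWARD = 2.0    # assumed LED forward voltage, volts
R_SERIES = 330.0   # assumed series resistor, ohms

for v_batt in (3.51, 3.25, 3.18):
    current_ma = max(v_batt - V_FORWARD, 0) / R_SERIES * 1000
    print(f"{v_batt:.2f} V -> about {current_ma:.2f} mA")
```

With those assumed values the current drops by roughly 20% between a fresh and a partly discharged cell, so the load on the battery is not constant over the test.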
You say you have no knowledge of electrical engineering. Here are some of the many sites offering free basic courses: