I have a question: what are the methods to eliminate the DC voltage drop on a line fed from a 12V or 24V power supply? I have quite a long cable run. I know you can use wire with a larger conductor diameter, but are there other methods? They say there are three. Help!
- Use a higher supply voltage
- Use a variable power supply
- Use wires of the appropriate diameter; the formula for the percentage voltage drop is on the elec
The problem of minimizing voltage drop on a long line always comes down to two methods:
1. passive, in which the resistance of the power cables is minimized by increasing their cross-section;
2. active, in which the drops on the line are compensated by raising the source voltage with the help of an error amplifier.
Method 1 has a limitation: the voltage drop on the wires can NEVER be reduced to zero, and it always DEPENDS on the current flowing through the load.
Method 2 has no such limitation, but the voltage must be measured directly at the load terminals, at the far end of the power cables. In practice it looks like this: the connection between the power supply and the load is made with four wires, one pair for "power" and the other for "sensing". The power pair should have a cross-section giving an appropriate current density for the load. The sense leads, since they are terminated on the other side by the high-impedance input of the measuring circuit, can be thin. The measuring circuit, connected directly to the load, drives the series pass element so that the voltage at the power supply output is raised by exactly the loss on the line. This remote-sense scheme is only found in "better" laboratory power supplies, although personally I think the name "laboratory" should already imply such a function.
The danger one can imagine with this measurement method: if a sense wire breaks, the power supply will drive its output to the maximum voltage, because the measuring circuit will report false information about the loss on the line.
If the sense leads are instead connected to the output terminals of the power supply, before the supply line, then we are dealing with a classic power supply, for which the load is not the actual load at the end of the line but that load together with the power cables.
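The remote-sense behaviour described above, including the broken-sense-wire hazard, can be sketched in a few lines. This is a toy model, not any particular supply's control loop; the setpoint, line resistance and `V_MAX` ceiling are made-up numbers:

```python
V_MAX = 18.0  # hypothetical regulator ceiling (made-up number)

def supply_output(v_set, i_load, r_loop, sense_ok=True):
    """Voltage a four-wire supply drives at its own terminals.

    The sense pair carries almost no current, so it reports the true
    voltage at the load; the regulator raises its output by the line loss.
    """
    if not sense_ok:
        return V_MAX                # broken sense wire: regulator rails high
    return v_set + i_load * r_loop  # compensate the drop on the power pair

# 12 V setpoint, 2 A load, 0.5 ohm loop resistance of the power pair
v_out = supply_output(12.0, 2.0, 0.5)
print(v_out)                        # 13.0 V at the supply terminals
print(v_out - 2.0 * 0.5)            # 12.0 V actually seen by the load
print(supply_output(12.0, 2.0, 0.5, sense_ok=False))  # 18.0 V: the hazard
```

The last line is exactly the failure mode mentioned: with the sense connection open, the loop "sees" too little voltage and rails the output.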
I understand that in your case you need a practical solution:
If you know the load at the end of the line draws a constant current (which I doubt), simply raise the supply voltage, checking the voltage directly at the load.
If the current is not constant and you care about good voltage stabilization, it remains only to use the power supply I wrote about above.
If you don't have the above options, then the "thick" wires remain.
Alternatively, raise the voltage and install regulators at the devices. Can you specify what you want to power over this line? That will make it easier to choose a method.
crunky: "Why don't you specify what you want to power with this line? It will be easier to choose a method."
I mean a general, theoretical explanation of how to deal with voltage drops on supply lines in low-voltage systems (e.g. powering controllers in fire alarm systems or cameras in CCTV systems) in cases where the power supply cannot be moved closer to the device. Choosing the cable thickness is an obvious method to me. But are there other methods that are easy and cheap to apply, such as raising the DC voltage at the power supply output, given that in fire alarm systems we deal with certified power supplies?
Assuming we do not replace the wires, do not use the four-wire (remote-sense) power supply I wrote about, and 230 V AC on the line is not an option, the only thing I can see is raising the voltage on the sending side and adding a regulator at each device.
Now a bit of theory for this case: the receiver needs a certain power to work properly. We can deliver that power at low voltage and high current, or at high voltage and low current (in the right proportions so the power stays the same: P = U*I). The first option will always mean a large drop on the supply lines, because the drop is proportional to the current. So if we do not want a drop on the line, we always choose a higher operating voltage, which automatically lowers the current. Incidentally, the same solution is used in public-address speaker lines: the cables can be hundreds of meters long and end in a low-impedance loudspeaker. The catch is that you then have to use transformers, converters, or power supplies on the receiver side; there is no escaping it.
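A quick numeric check of the P = U*I trade-off described above. The 24 W load and 1-ohm loop resistance are illustrative numbers, not from anyone's installation:

```python
R_LOOP = 1.0   # ohms, out-and-back resistance of the run (illustrative)
P_LOAD = 24.0  # watts the receiver needs

drops = {}
for u in (12.0, 48.0):
    i = P_LOAD / u         # same power, so current falls as voltage rises
    drops[u] = i * R_LOOP  # line drop is proportional to the current
    print(f"{u:4.0f} V supply: I = {i:.2f} A, line drop = {drops[u]:.2f} V")
```

Quadrupling the voltage cuts both the current and the absolute drop by four, and the drop as a *fraction* of the supply voltage by sixteen.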
I am not a specialist in alarm systems; you mention certified power supplies and I do not know that area. I would like to learn more about it myself.
However, I think that raising the supply by a few volts on the sending side and installing a small low-voltage regulator at the receiver, which can be built from literally a few parts (the LM317 is one of many options here), would do the job. The dissipated power could be handled with a small heatsink.
You can always choose a power supply certified for a higher output voltage. The question remains whether such a regulator can be mounted at the device. It seems to me the certification concerns safety, i.e. galvanic isolation from the mains, but correct me if I'm wrong; the issue may be more complex.
And what is the power consumption of the camera you mention? And how much voltage drop do you want to fight?
If this is for a fire alarm system, skip the modifications. You mentioned the certified power supplies used in integrated fire alarm systems in public buildings; improvised combinations there can end up in court. The whole system should be designed and installed by authorized persons, the design is usually checked by an expert, and after commissioning it is inspected by the fire service. As for powering industrial cameras, I don't see a big problem: indoor color 12 V cameras with auto-iris lenses draw up to 500 mA, so a drop can be compensated by raising the line supply voltage or by AC powering (some CCTV cameras accept either DC or AC supply).
edison: "we can compensate for the drop by increasing the line supply voltage or AC power supply (some CCTV cameras allow DC or AC power supply)."
Am I to understand from the above that with alternating current the voltage drops on the line are lower than with direct current? Please explain, then, what the difference in voltage drop is between DC and AC when the length and thickness of the wire are the same.
The voltage drop on the line with AC supply is comparable to DC supply (it depends only on the rms value of the current). AC supply has the advantage that the regulator is built into the camera, so the system is insensitive to voltage fluctuations over much wider limits.
For the same voltage, current, length, and cross-section, the drops will be the same. You gave cameras as an example, so here is one possible solution: cameras that offer a choice of supply voltage, 12 V DC or 24 V AC. Assuming the camera consumes 4 W, that is about 330 mA at 12 V DC and about 160 mA at 24 V AC. Not to mention 230 V AC cameras, for which the drops are marginal. It follows that some manufacturers, anticipating the possibility of drops, have provided for supply at a higher voltage.
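The currents quoted above are just P = U*I again; a one-liner confirms them for the hypothetical 4 W camera (rms values assumed for the AC cases):

```python
P = 4.0  # W, the camera power assumed in the post

# current in mA at each selectable supply voltage
currents_ma = {u: P / u * 1000 for u in (12.0, 24.0, 230.0)}
for u, i in currents_ma.items():
    print(f"{u:5.0f} V -> {i:.0f} mA")   # ~333, ~167 and ~17 mA
```

At 230 V the current, and hence the line drop, is roughly twenty times smaller than at 12 V, which is why those drops are "marginal".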
I have a similar problem. The case concerns CCTV monitoring where I have cameras with IR illumination (two large LEDs).
Everything works fine during the day. The problem occurs at night, when the IR illumination switches on. Of the 7 cameras, 3 work with interference. I measured the voltage at the cameras farthest from the power supply (about 40 m): only 5 V. And the current the camera draws with the LEDs on is as much as 0.87 A (without the LEDs it is only 0.08 A). The power supply in the installation is 12 V / 7.5 A. The cabling is 4x2x0.5 gel-filled underground telecom cable. So I conclude my wire cross-section is too small. Will twisting e.g. two strands together increase the cross-section, i.e. give the same effect as using twice the cross-section?
The cameras have a 12 V supply and draw about 0.4 A with the LEDs on. I measured one camera's consumption by connecting it directly to the power supply: it came out to 0.47 A.
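The drop on that run can be estimated from the resistivity of copper. A rough check, assuming "0.5" in the cable designation means conductor diameter in mm (usual for telecom cable) and ignoring temperature effects:

```python
import math

RHO_CU = 0.0175                  # ohm*mm^2/m, copper at room temperature
LENGTH = 40.0                    # m, supply to the farthest camera
AREA = math.pi * (0.5 / 2) ** 2  # mm^2 of a 0.5 mm diameter conductor (~0.196)

def drop(i_load, pairs_in_parallel=1):
    """Voltage lost on the run (out and back) for a given load current."""
    r_loop = 2 * RHO_CU * LENGTH / (AREA * pairs_in_parallel)
    return i_load * r_loop

print(f"one pair:  {drop(0.87):.1f} V lost")    # ~6.2 V, so ~5.8 V left of 12 V
print(f"two pairs: {drop(0.87, 2):.1f} V lost") # paralleling halves the drop
```

So yes: paralleling two strands behaves like doubling the cross-section and roughly halves the drop, and the one-pair estimate lands in the same ballpark as the ~5 V measured at the far cameras.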
If you have spare conductors, use them. Connecting wires in parallel gives a larger cross-section. Each doubling of the number of identical wires roughly halves the voltage drop.
Use a step-up converter at the input and a step-down converter at the end of the cable: 12 V power supply -> 30 V converter -> cable (voltage drops to e.g. 23 V at the IR camera) -> 12 V converter -> camera. ATTE has such ready-made solutions in its offer.
Can I use a Pulsar switched-mode power supply to power the cameras? They offer 10 A with the voltage adjustable by potentiometer from 12 V to 15 V. Or does CCTV need a linear stabilized supply rather than a switched one? For now I'm going to try a higher voltage. But if that doesn't work, what's better? 1. The approach Totoya recommended, with step-up/step-down converters? 2. Or a 40 V power supply with a regulator in front of each camera, 40 V in and 12 V out; the DELTA company does something like this?
I suggested a converter because a linear regulator causes large power losses in the form of heat. Linear regulators have very low efficiency and get hot, which is not always acceptable.
Moreover, in buffered (battery-backed) operation it matters a great deal how much of the power delivered by the supply (battery) we actually use, and how much we waste as heat.
I replaced the Chinese MPOWER 12 V / 7.5 A desktop power supply with a Polish PULSAR 12-15 V / 10 A built-in switched-mode supply. I raised the voltage with the potentiometer to 15.3 V and everything works. On the farthest run, where the voltage previously dropped by half (to 5 V), it now drops to 11.5 V, which is enough for an IR camera.
Have you checked the camera voltage with the illuminator on?
The higher current consumed by the camera causes a greater voltage drop than during daytime operation. You may find that the cameras turn off/reset at night.
No, of course I checked with IR on. After all, my whole problem is the operation of the cameras at night, because during the day the cameras consume less than 100mA.
arek_f, do you have a buffer (battery-backed) power supply? If you use a regular supply, you can afford to raise the voltage. If it's a buffer supply, you're "killing" the battery very nicely. The potentiometer in the Pulsars is not there to be tweaked just like that; that is, you can, but then the battery also sees the higher voltage. Also, try your solution with a 400 mA camera at a greater distance. Of course, improvisation has always done well in our country. Naturally, the above may not apply to you if your supply is NOT buffered. But how do you adjust one voltage for several cameras at different distances?
In the example there is a calculation for 100 m, running everything over a single twisted-pair cable: 1 pair for signal, 1 pair free, 1 pair for + supply, 1 pair for - supply. I got a camera with an illuminator running (after switching the illuminator on it drew 0.9 A) at 350 m.
Looks like it got sorted without me. This "Pulsar" power supply isn't Chinese? It doesn't look like it was made by Pulsar; maybe it's just a Pulsar sticker? Anyway, such a voltage boost works only with non-buffered supplies, and rather at relatively small distances. Still, + for the idea.
I have another question, about a new CCTV installation. A friend asked me to help him set up 3-4 cameras at a garage. The cables will run outdoors (not in the ground). The distances are 20-30 meters.
I have a dilemma: which cables should I use for this installation?
Coax with BNC connectors, or twisted pair? Of course I mean gel-filled twisted pair and outdoor-grade coax.
At such a distance, take your pick. Calculate what coax + power cable + BNC connectors will cost versus twisted pair. Personally I would run twisted pair; in a few years you'll be able to switch to IP cameras.
But will gel-filled FTP CAT 5E twisted pair with 0.5 mm conductor diameter be able to carry 12 V power to a camera that draws 0.558 A (6.7 W) with the IR LEDs on??
I'm afraid I'll run into power-supply problems again.
Dedicated CCTV coax-plus-power cables, on the other hand, have power conductors of 0.5 mm2 or even 1.0 mm2 cross-section.
After 30 m of UTP (only one pair used for power) with IR on (0.6 A at 12 V) you will have about 10.5 V at the camera. The camera will work without problems (and to be safe you can put one pair on + and one on -). I'm attaching the voltage-drop calculator I use.
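Since the attachment does not survive here, a minimal stand-in for such a drop calculator might look like this. It is a plain Ohm's-law estimate, and note that the answer depends heavily on whether "0.5" means conductor diameter in mm or cross-section in mm2:

```python
import math

RHO_CU = 0.0175  # ohm*mm^2/m, copper

def v_at_load(v_supply, i_load, length_m, area_mm2, pairs=1):
    """Voltage remaining at the load end of a two-conductor copper run."""
    r_loop = 2 * RHO_CU * length_m / (area_mm2 * pairs)
    return v_supply - i_load * r_loop

# 30 m run, 0.6 A camera, 12 V supply, one pair:
print(round(v_at_load(12.0, 0.6, 30.0, math.pi * 0.25 ** 2), 1))  # 0.5 mm dia -> ~8.8 V
print(round(v_at_load(12.0, 0.6, 30.0, 0.5), 1))                  # 0.5 mm^2   -> ~10.7 V
```

With true 0.5 mm diameter CAT 5E conductors the estimate comes out closer to 8.8 V than 10.5 V, so as the post suggests, paralleling a second pair on each rail is a sensible precaution.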