DVR MAN (posted April 22, 2008):
Why don't camera manufacturers list the allowable input voltage tolerance on their spec sheets? I would like to think that a camera's input would allow roughly a ±2 volt spread. It would be nice to have this information when selecting a particular camera.
survtech (posted April 22, 2008):
It depends on the design of the camera. A well-designed camera should be able to tolerate at least ±10% from nominal, which covers normal power supply tolerances and cable voltage drop. For instance, Pelco specs its IP110 camera system as "Input Voltage 24 VAC (18-36)", which is 25% low to 50% high, and its C10DN-6 as "24 VAC ±15% / 12 VDC ±15%". Then again, we have Dallmeier, whose cameras are rated ±5% at 12 VDC. That means 11.4 to 12.6 volts. STUPID!
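For reference, here is the arithmetic behind those quoted ranges as a quick Python sketch. The function name is just for illustration and is not from any manufacturer documentation.

def tolerance_window(nominal, pct):
    """Return (min, max) voltage for a symmetric +/- percentage tolerance."""
    spread = nominal * pct / 100.0
    return round(nominal - spread, 2), round(nominal + spread, 2)

# Dallmeier-style spec: 12 VDC +/- 5%
print(tolerance_window(12.0, 5))       # (11.4, 12.6)

# Pelco C10DN-6 style spec: 24 VAC +/- 15%
print(tolerance_window(24.0, 15))      # (20.4, 27.6)

# Pelco IP110 style spec: 24 VAC nominal, 18-36 V absolute limits
nominal, lo, hi = 24.0, 18.0, 36.0
print((nominal - lo) / nominal * 100)  # 25.0 -> 25% below nominal
print((hi - nominal) / nominal * 100)  # 50.0 -> 50% above nominal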
DVR MAN (posted April 23, 2008):
Thanks. I think all surveillance cameras should list their power tolerances on their spec sheets. Then the installer could tailor a system with confidence, knowing that all components were matched correctly.
DVR MAN (posted April 23, 2008):
I just added an Altronics rack-mount 16-output 12-volt camera supply rated at 3.5 A per output. It came set at 11.883 volts. I bumped it up to 12.2 volts (measured with a NIST-calibrated Fluke 89). I figure that is a safe place to start.
survtech (posted April 23, 2008):
Your best bet is to measure the voltage at the camera and adjust it so the camera's voltage doesn't exceed 12.6 volts under normal operation. That assumes you don't have high-current intermittent devices connected, like heater/blowers or IR LEDs, both of which can greatly increase the current draw and thus the voltage drop in the cable.
DVR MAN (posted April 24, 2008):
Quoting survtech: "Your best bet is to measure the voltage at the camera and adjust it so the camera's voltage doesn't exceed 12.6 volts under normal operation. That assumes you don't have high-current intermittent devices connected, like heater/blowers or IR LEDs, both of which can greatly increase the current draw and thus the voltage drop in the cable."

There is ample current available (3.5 amps per output). My IR cams draw 1 amp each when IR is active. The camera voltage is 11.998 volts with IR on (with the supply set at 12.2 volts). I will kick it up to 12.5 volts. The camera's regulators should clip or boost as needed to keep the voltage stable; I just want a happy medium setting at the supply end so as not to overwork the regulators. Thanks!
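A quick Ohm's-law sanity check of the numbers in that post. The assumptions here are mine, not from the thread: that the 1 A figure is the camera's total draw with IR on, and that the cable behaves as a simple series resistance.

v_supply = 12.2          # volts set at the power supply
v_camera_ir_on = 11.998  # volts measured at the camera with IR active
i_ir_on = 1.0            # amps drawn with IR active

# Round-trip cable resistance implied by the measured drop
r_loop = (v_supply - v_camera_ir_on) / i_ir_on
print(round(r_loop, 3))                           # 0.202 ohm

# Predicted camera voltage after bumping the supply to 12.5 V
v_supply_new = 12.5
print(round(v_supply_new - i_ir_on * r_loop, 2))  # ~12.3 V with IR on

With IR off the draw is much lower, so the camera would see close to the full 12.5 V, which is still inside the 12.6 V ceiling survtech suggested.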
vin2install (posted April 25, 2008):
One sign that too many volts are going to your cameras is if their IRs come on even in the daytime.
scorpion (posted April 25, 2008):
Or smoke wafting out of the seams?
si_kungs (posted April 26, 2008):
We use a 12-volt regulated power supply, but the output measures 13.8 V. Will there be a problem with this? Will the cameras get damaged easily?
vin2install (posted April 30, 2008):
It depends on how long your wire runs are.
si_kungs (posted April 30, 2008):
For 18-gauge (#18) wire, how long would the minimum run have to be? I'm not familiar with the wire's resistance.
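As a rough way to estimate this: 18 AWG copper runs about 6.4 ohms per 1000 ft per conductor, so the drop depends on the camera's current draw. The sketch below assumes a 0.5 A camera purely for illustration; that figure is not from the thread, so plug in your own camera's draw.

# Rough estimate of how much 18 AWG cable it takes to drop 13.8 V
# down to the 12.6 V ceiling suggested earlier in the thread.
R_PER_1000FT = 6.4      # ohms per 1000 ft, one 18 AWG copper conductor (approx.)
camera_current = 0.5    # amps (assumed for illustration; check your camera's spec)

v_out = 13.8            # measured supply output
v_target = 12.6         # upper limit at the camera

drop_needed = v_out - v_target                    # ~1.2 V to shed in the cable
loop_resistance = drop_needed / camera_current    # ~2.4 ohms round trip
run_feet = loop_resistance / (2 * R_PER_1000FT) * 1000
print(round(run_feet, 1))                         # 187.5 -> roughly 190 ft of two-conductor cable

A camera that draws less current will need a proportionally longer run (or an adjustable supply) to shed the extra volts, which is why measuring at the camera end, as suggested earlier in the thread, is the safer approach.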