adster Posted April 22, 2012

Hi guys,

Just a basic theory question here. Say you have two camera power supplies. One is a 4-camera, 5-amp supply (with all 4 cameras hooked into it) and the other is a 16-camera, 25-amp supply (with only 4 cameras hooked into it). Would the 25-amp supply still cost more to run every month than the smaller unit?

Thanks,
Bob
Soundy Posted April 22, 2012

Not necessarily. The power supply's current rating only indicates the maximum it can provide; the cameras will draw whatever current they need. The only thing that might make a difference is whether one is a transformer-and-regulator type and the other is a switching type: the former wastes more energy in the conversion process, while the latter design is much more efficient. The actual cost to operate either type will be minimal anyway - you might be talking $30 vs. $35 per year.

Say all four cameras are rated at 500mA @ 12VDC - that's 6W each, or 24W total. Electricity is generally charged per kilowatt-hour... at 0.024kW, that's about 41.7 hours of operation (assuming 100% efficient energy conversion) to actually consume 1kWh. Around here, the highest we pay for residential power is 10.19 cents per kWh, so that's 10.19c for every ~41.7 hours of operation. Over a full year (8,760 hours), that works out to 210.24kWh, or about $21.42 per year.
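If you want to plug your own numbers into that arithmetic, here's a minimal sketch - every input (camera count, per-camera draw, voltage, electricity rate) is just an example value, not anything specific to your supplies:

```python
# Rough annual electricity-cost estimate for cameras on a power supply.
# All inputs below are example values; substitute your own camera specs and utility rate.

num_cameras = 4          # cameras actually connected to the supply
amps_per_camera = 0.5    # rated draw per camera (500 mA)
voltage = 12.0           # supply voltage, VDC
rate_per_kwh = 0.1019    # electricity price, $/kWh
hours_per_year = 24 * 365

load_kw = num_cameras * amps_per_camera * voltage / 1000  # total draw in kW (ignores conversion losses)
annual_kwh = load_kw * hours_per_year                     # energy consumed per year
annual_cost = annual_kwh * rate_per_kwh                   # yearly cost in dollars

print(f"Load: {load_kw * 1000:.0f} W")
print(f"Energy per year: {annual_kwh:.2f} kWh")
print(f"Cost per year: ${annual_cost:.2f}")
```

With the example values above it prints 24 W, 210.24 kWh, and about $21.42 per year - notice the supply's amp rating never enters into it, only what the cameras actually draw.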
adster Posted April 22, 2012

Thanks Matt, it's much clearer to me now. I want to build a lot of expandability into my system.