Angelfire 0 Posted July 18, 2013

Hiya folks. I'm getting back to my project before I have to leave the country in a few weeks. I thought I had it all sorted out, but now I'm second-guessing myself and have generally just confused myself.

Here's the situation: I'm looking at installing IP cams at my home and detached garage. The initial plan is roughly 5-7 outdoor and 1-2 indoor cams for the house, and 5-6 outdoor plus 1-2 indoor for the garage. In the future that will likely grow to more like 10-12 outdoor on the house and 8-10 outdoor on the garage. I've pretty much decided on the ACTi 3MP domes (D72 and D55).

Since I plan to add more later, I'm thinking of locating a 16-port network switch at both the house and the garage, then feeding them to either a couple of NVRs, a PC, or maybe even a server. I don't need continuous recording, just recording when motion is detected, but I'd like the ability to view the camera feeds live. I've been thinking of just going with the free ACTi software.

I have an older PC or two that I could gut and reuse, so that might be the easiest option, but I don't want to do something now only to find out I didn't plan well for future expansion. That's where I get to wondering whether I should go with a Dell PowerEdge server instead. They seem relatively inexpensive on eBay and may be a better option (or not!). The NVRs seem simple to implement, but I can't help thinking their cost would be prohibitive. Whatever I get, I need to be able to manage it remotely since I will be out of the country.

Please let me know what other information you need to help me out. Thanks very much.
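For sizing the disk on a motion-only setup like this, a back-of-envelope estimate helps more than any specific hardware choice. The sketch below is only illustrative: the 4 Mbit/s per-camera bitrate, 10% motion duty cycle, and two-week retention are assumptions, not ACTi D72/D55 specs, so plug in your own figures.

```python
# Rough storage estimate for motion-triggered recording.
# All inputs are illustrative assumptions -- adjust to your cameras/site.

def storage_gb(cameras, mbit_per_sec=4.0, motion_duty=0.10, days=14):
    """GB of disk needed for `cameras` streams at `mbit_per_sec` each,
    recording only `motion_duty` of the day, retained for `days` days."""
    seconds = days * 24 * 3600
    bits = cameras * mbit_per_sec * 1e6 * motion_duty * seconds
    return bits / 8 / 1e9  # bits -> gigabytes

# Example: the eventual 12 house cams + 8 garage cams, two weeks of clips
print(f"{storage_gb(20):.0f} GB")   # ~1200 GB with the assumptions above
```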
grege 0 Posted July 18, 2013

My network guy laughed at me when I said I was going to use a Dell PowerEdge server at my lake house to monitor my cameras. He recommended buying a new PC instead: not only would the PowerEdge use a lot of electricity, but parts in it would start failing. I've got an old PC there that I put a 1TB drive in and will just continue to use it. YMMV.
buellwinkle 0 Posted July 18, 2013

That's a lot of cameras, but even an older PC should handle running the server component, and you have nothing to lose by trying it. The client component is web based, so as long as you have IE you can access it the same way at home as while traveling. I was accessing it from Italy last month with no problems other than it being slower because of hotel internet limitations. It takes a long time to connect remotely, maybe 2-3 minutes, but once connected it worked well, even fluid video motion on the limited line. You can also access it via their free iOS app, but I haven't tried that since I'm more of an Android guy and use the IP Cam Viewer app.

Also, for what it's worth, the first 16 camera licenses are free; additional licenses are extra, and the place you buy from should throw them in on a deal that large. PM me your email address and I can get you the forms for ACTi project pricing, which will save you some money.
SectorSecurity 0 Posted July 18, 2013

For that many cameras I would suggest at least a Core i7 and a pretty good video card.
buellwinkle 0 Posted July 18, 2013
> For that many cameras I would suggest at least a Core i7 and a pretty good video card.

Any video card will be fine, as it's all browser based anyway. I used to run 6 cameras, all ACTi megapixel, on an Atom processor with plenty of room to spare. Their 32-channel NVR, I believe, uses an Atom processor. On my old i3-540, 6 cameras is about 2% CPU utilization.
MaxIcon 0 Posted July 19, 2013

Yeah, my limited benchmarking shows that NVR clients mostly care about 2D video performance, which makes sense. Most video card benchmarks weigh the 3D portion more heavily, since that's where most people are looking for performance. 3D is also what sucks down the most power, if you're trying to keep power consumption down. If you use benchmarks to select a video card, it's important to focus on the 2D scores more than the 3D scores.
mechBgon 0 Posted July 20, 2013 (edited)

For my surveillance computer at work, I picked a Core i3-3225 and used the CPU's onboard video to drive two 1920 x 1200 displays. CPU usage is around 2% when not displaying the video streams (just recording them to disk). Simultaneously displaying all of the following primary streams (scaled to fit on the display at once) can take it up to around 25-30% CPU usage:

- four 1600 x 1200 @ 2 Mbit/sec
- one 2560 x 1920 @ 8 Mbit/sec
- four D1 @ 2 Mbit/sec from an analog-to-IP converter

So these are my thoughts on your scenario:

1. You're investing a lot of money in cameras, so it wouldn't make much sense to skimp on the reliability of the computer that's supposed to store the video by pulling an old PC out of mothballs. I'd get or build a new computer.
2. You could start off with a Core i3 and see how it goes, then move to an i7 if necessary. Ditto for the on-CPU video.
3. It will probably depend on the software to some extent, but in my case recording the video is a very low-stress chore since it arrives from the cameras already encoded in H.264. The CPU doesn't have to encode it on the fly.

For my build, I went with a top-notch Seasonic power supply, an H77-based motherboard, the i3-3225, 1.35-volt RAM, a couple of 3TB hard drives, and a chassis that has direct fan cooling for the hard drives. Power consumption when recording is in the mid-30W range, and it has a couple of hours of runtime on my UPS in the event of an outage (the cameras are on a separate UPS). It would probably be noticeably lower with a Haswell-core i3, but they haven't been released yet and I was getting impatient.
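The "couple of hours on the UPS" figure is easy to sanity-check from the load wattage. The sketch below uses an assumed 12 V / 9 Ah battery, 85% inverter efficiency, and 80% usable depth of discharge as stand-ins for a typical small consumer UPS; they are not the specs of mechBgon's unit.

```python
# Very rough UPS runtime estimate from battery energy and measured load.
# Battery size and efficiency figures are assumptions, not measured values.

def runtime_hours(battery_volts=12, battery_ah=9, inverter_eff=0.85,
                  usable_fraction=0.80, load_watts=35):
    usable_wh = battery_volts * battery_ah * usable_fraction * inverter_eff
    return usable_wh / load_watts

print(f"{runtime_hours():.1f} h")   # ~2.1 h at a mid-30W recording load
```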
thewireguys 3 Posted July 20, 2013

The specs needed for your server are based on the VMS you plan to use. Pick the VMS first, then spec the server.
MR2 0 Posted July 21, 2013
> The specs needed for your server are based on the VMS you plan to use. Pick the VMS first, then spec the server.

Probably that. I would always go with the latest processor you can, if for nothing else than efficiency. When I was shopping for an updated PC to run my VMS at home, the first thing I looked at was idle power usage. To give you an idea, we have some old HP rack servers that chew somewhere in the 200-300W range at idle; the PC I just grabbed (an eBay HP 6300 Pro) idles at 41W.

Then have a look at the drives. The best capacity per watt I found was the Seagate 4TB: it only costs $220 AUD, and running at 5900rpm it draws only 5W. You can get lower-power drives, but they have less capacity, and to reach the same capacity the extra drives needed for the array mean you end up using more power in total anyway.

I also found it's cheaper to get a PC and NAS combo; together they use less power than one massive single PC, and it's easier to get parts 5-10 years down the track.
chucky 0 Posted July 21, 2013

For a PC-based system, what would the preferred video card be? One with HDMI output allowing you to drive HD displays? Thanks, Chucky
fenderman 0 Posted July 21, 2013
> For a PC-based system, what would the preferred video card be? One with HDMI output allowing you to drive HD displays?

Use the onboard video: low power consumption and more than enough to drive two displays. Most PCs have one HDMI and one DVI output, both digital connections, and most business PCs have DisplayPort, which is also digital. Don't waste money on a card that will consume tons of power over its life.
MaxIcon 0 Posted July 21, 2013

Yeah, all of my systems currently use the on-CPU video. The 3rd and 4th gen Intel chips with HD4000 and HD4600 graphics are very good 2D performers at very low power, and even my older i3-540 box doesn't suffer from the on-CPU graphics. Adding a mid-range video card would increase my power costs by $150/year per PC and would increase the heat in the box without any improvement in performance.
GMaster1 0 Posted July 21, 2013

When people use the words "Dell PowerEdge" to describe a Dell server, I feel like they're describing the 2900 series and below, which would explain the cheap prices you noted on eBay. Their newer models are downright amazing and at times consume less power than a modern desktop.

For the general audience reading this: if you do get an older PowerEdge for a file server or similar, remember to buy a spare PERC controller. Most of them had capacitor issues.

You're out of the country soon, so I'd at least leave a key hidden under a rock or something, so that when your server/NVR/etc. goes down you have someone local to power-cycle things, if you don't want to invest in PDU appliances that reboot gear when it becomes un-pingable and whatnot.
MR2 0 Posted July 21, 2013
> Their newer models are downright amazing and at times consume less power than a modern desktop.

Please provide figures backing this, specifically idle power usage and dB in operation. Keep in mind that people building home security systems may not want to spend $2-3k on a Dell server which (especially the 1/2 RU ones) creates a lot of noise in an otherwise quiet environment, compared with a normal desktop PC or HP MicroServer that's designed to work in environments that are already very quiet.
mechBgon 0 Posted July 22, 2013
> The 3rd and 4th gen Intel chips with HD4000 and HD4600 graphics are very good 2D performers at very low power.

Another vote for the on-CPU Intel video. Scaling/displaying and recording nine primary streams at once in Grandstream's software, and piping it over a Remote Desktop Connection, my i3-3225 comes up to full speed (3.3GHz) and 30% CPU load. Recording the streams without displaying them, it throttles down to 1.55GHz and 2% CPU load. Based on my UPS's wattage readout (typically mid-30W area), the system itself (not counting monitors) is sipping about US$20/year in electricity, which seems good. I spend more than that on all sorts of foolish things.
MR2 0 Posted July 22, 2013
> Based on my UPS's wattage readout (typically mid-30W area), the system itself (not counting monitors) is sipping about US$20/year in electricity.

That is awesome! I can't wait to get one of the new Haswell cores. How much do you pay per kWh? We're at 24 cents.
mechBgon 0 Posted July 22, 2013
> How much do you pay per kWh? We're at 24 cents.

Yes! Mine is a mere Ivy Bridge. I would've picked a Haswell Core i3 for sure if they had been available, but Intel started with the i7s and i5s. At any rate, this is still a big improvement over my quad-core Phenom II: about half the power consumption (and twice the battery-backup runtime) in any scenario.

We are at 6.925 cents per kWh here, which I believe is relatively low. For folks with expensive electricity going the self-built PC route with a box that runs 24/7, also consider a power supply with an 80 Plus Gold or Platinum rating. I got one of these: http://www.newegg.com/Product/Product.aspx?Item=N82E16817151117

For those interested in power consumption numbers, I was at work and re-checked the wattage readings on my UPS's display (computer only, no monitors):

- ~34 watts when recording 9 streams but not displaying them, in which case the CPU throttles down to 1.55GHz.
- 44 watts when recording and displaying 9 streams with a combined ~21 Mbit/sec at the network port, in which case the CPU comes up to its full rated speed but only about 30% CPU load.

For a commercial system similar to this, people in Newegg's service area could add their preferred HDDs and OS to one of these Lenovo barebones servers: http://www.newegg.com/Product/Product.aspx?Item=N82E16859106336
MaxIcon 0 Posted July 22, 2013

Heh! Yeah, that's low. Here in Silicon Valley we have a tiered system, and anything I add is in the top tier, which is $0.336/kWh. My i5-3570K box runs 70% CPU and draws 90W, so it's costing $260/year to run just for the box. My UPS shows 120W, but it also has the PoE switch and a server on it, so I'm not entirely sure I trust its readout. I'll fingerprint it more soon.
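The yearly figures quoted in this thread all come from the same simple calculation: watts, hours per year, and the local rate. A minimal sketch, using the wattages and rates the posters reported (the ~50 W video-card draw is an assumed figure for a mid-range card, not a measurement):

```python
# Annual electricity cost for a device that runs 24 hours a day, 365 days a year.

def yearly_cost(watts, dollars_per_kwh):
    """Dollars per year for a constant load of `watts` at the given rate."""
    return watts / 1000 * 24 * 365 * dollars_per_kwh

print(f"${yearly_cost(35, 0.06925):.0f}/yr")  # ~$21  (i3-3225 box at 6.925 c/kWh)
print(f"${yearly_cost(90, 0.336):.0f}/yr")    # ~$265 (i5-3570K box at the top-tier CA rate)
print(f"${yearly_cost(50, 0.336):.0f}/yr")    # ~$147 (assumed ~50 W mid-range video card)
```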
buellwinkle 0 Posted July 22, 2013

I consider electricity use all the time, as our top-tier rate (anything beyond roughly a 60W light bulb's worth of usage) is 29 cents per kWh and going up a bunch in September with SDG&E. With our usage they're saying $100 more per month, or about a 35% increase. They claim an 11% increase, but that's for a single person in a studio apartment without an air conditioner or an NVR PC. If we had rates down in the single digits, wow, I could upgrade my Smart car to a Porsche with the money I'd save, LOL.
chucky 0 Posted July 23, 2013

How does everyone control the electricity cost of a PoE switch, or multiple PoE switches, powering all your cameras? Mine consumes up to 170 watts. Thanks, Chucky
MaxIcon 0 Posted July 23, 2013

You can't really control PoE power costs, except by buying a modern switch that uses little power when nothing's connected. Once you connect 10 cameras at 8W each, you're looking at 80W plus whatever the conversion efficiency eats up, so 100W or more just to power the cams. Have you measured the actual consumption, or is that the max rating? 170W is quite a lot unless you're running a big pile of cameras. Mine consumes quite a bit less than that with 10 cameras connected, and it's an old-school corporate 24-port switch.
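The same budget is easy to sketch for any switch. The 8 W per camera and the 80% conversion efficiency below are the ballpark figures from the post above, and the 15 W switch idle draw is an assumption, not a spec for any particular model:

```python
# Rough wall-power estimate for a PoE switch loaded with cameras.
# Per-camera draw, efficiency, and idle draw are illustrative assumptions.

def poe_watts(cameras, watts_per_cam=8.0, efficiency=0.80, idle_watts=15.0):
    """Approximate watts drawn at the wall by the switch plus its PoE load."""
    return idle_watts + (cameras * watts_per_cam) / efficiency

print(f"{poe_watts(10):.0f} W")   # ~115 W at the wall for 10 cameras
```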
chucky 0 Posted July 24, 2013

It's a 24-port PoE switch, but no, I haven't measured the watts consumed with a fixed number of cameras powered. The cooling fans sound like a hair dryer running. Thank you, Chucky
mikmort 0 Posted July 24, 2013

It's unlikely the 24-port PoE switch uses anywhere close to 170 watts with nothing hooked up to it. If you want, you could disconnect everything and measure the switch with a Kill A Watt meter. Of course, as the previous poster said, if you have lots of cameras attached it will consume the power necessary to run them, which is unavoidable.