mcs 0 Posted March 22, 2007 Tests done by www.benchmarkmagazine.com. Samsung matches the same quality as Panasonic but beats the price.
mcs 0 Posted March 22, 2007 Wot, no comments??? Go figger.
survtech 0 Posted March 22, 2007 I have never used any of those cameras. They are all too big to fit into Pelco DF5 domes. I have used Pelco CCC1300/1370/1380, Ikegami ICD505, Ganz ZCD mini-domes, Sanyo, Dallmeier and a few others. I wonder why they didn't test any of those brands, since they are very popular.
rikky 0 Posted March 22, 2007 Interesting results. It would be helpful if we knew how these guys put the cameras to the test.
phred 0 Posted March 22, 2007 I am a big fan of Samsung cameras. They don't get it right with every model – few do – but the SDC415 is quite incredible for the very modest price.
videobruce 0 Posted March 22, 2007 What is the "Image Resolution Accuracy" vs the "Image Resolution Achieved"?
mcs 0 Posted March 22, 2007 I don't know; get on their site and email them for specifics. This is just something that was emailed to me. Yeah, I think they tested the most used models and brands, as the Samsung 730 and 740 are a cut above the 415, but heck, for 199 the 415 suits my budget and does more than most cameras suppliers sell. And a 2 yr warranty too.
jisaac 0 Posted March 22, 2007 I would be very interested to know how they defined each category and how they assign each point on the 1-10 scale.
mcs 0 Posted March 22, 2007 I have done it for you; await the reply and stay tuned. Blurb:

BENCHMARK - THE PERFORMANCE INDEX FOR SECURITY SYSTEMS

Benchmark is the only magazine in the security industry which is dedicated to establishing a performance index for electronic security products and solutions. It achieves this in two ways: firstly, by group-testing equipment and technologies to establish real-world results, and secondly, by visiting projects and discussing the merits and benefits of the solutions with the end user. Benchmark also collates data from previous tests and assessments and uses this information to create its performance index, a feature that allows readers to quickly and easily check performance data about products and components. This information will also be available in multimedia formats in the near future.

All Benchmark tests are fully independent and are supervised by the Benchmark editorial team. All products are tested simultaneously and with the same supporting equipment. Details of such equipment are given as part of the test process. Manufacturers are not involved in the tests, and products for group-tests are selected by the Benchmark editorial team following consultation with installers and specifiers. Because Benchmark is fiercely independent, you can be sure that you are receiving unbiased and honest information.

Ensure that you are kept up to date with the facts about security system performance; subscribe to Benchmark, the only independent performance-related index for security solutions.
rikky 0 Posted March 22, 2007 ... doesn't make me any wiser
mcs 0 Posted March 23, 2007 I presume from the Panasonic/Samsung reference that you are discussing the camera test that appeared in the July/August 2006 issue, featuring Bosch, CBC, GE Security, Honeywell, JVC, Vicon, Videcon and, of course, Panasonic and Samsung Techwin. This was the first part of the test; the second part featured cameras from Baxall, Eyes 2 Eyes, Deview, Pelco, Sanyo, Siemens, Sony and Vista.

All testing is carried out in real-world situations with the products (cameras in this case) performing side by side. We have another publication, PSI, which features independent tests on products based around bench reports, but Benchmark puts the kit into real applications to see what an installer is likely to achieve, performance-wise. We have been testing security equipment for around 15 years, and our team collectively has over 200 years' experience in the electronic security industry (sometimes it does feel as if those years are consecutive).

The actual testing (in this case) looked at four areas: image resolution, colour and greyscale reproduction, day/night performance and functionality. Our attitude is that it matters not if a camera (or any other device) is bristling with functions; you only get marks for those that are likely to be of use ... and they must work too!

The exact testing process varies from test to test, as we look at different aspects of performance, and as the seasons dictate! Everything we do is catalogued, and this means we even mention what the weather was like when testing! However, ultimately the performance of a product is judged against its peers. Because of our knowledge of the marketplace, we are also able to assess whether that performance is industry-leading, or just the best in a bunch of products. As an example, when testing for sensitivity, we adopt a universal and (some might argue, especially those with poor cameras) slightly old-fashioned approach.
We measure the minimum light level at the viewed scene, in lux, that is required to allow the camera to deliver a 1 volt peak-to-peak video signal with all processing and signal boosting turned off. This allows readers to instantly see which cameras perform best without allowing for IRE figures, differing lens apertures, processing elements or - quite frankly - utter bull****.

As an aside, all of our test products are sourced via distribution, or where this is not possible, via tame installers. This means that manufacturers do not know we are testing the product, nor can they prepare product especially for us!

Someone asked about Image Resolution Accuracy and Image Resolution Achieved. The latter is the best resolution we could achieve with the camera in the conditions of the test. However, when considering high-resolution cameras, some are 480TVL, some are 510TVL, some are 520TVL and some are 540TVL. By simply considering resolution achieved, it would initially appear as if a poorly performing 540TVL camera is better than a 480TVL camera performing well. In order to remove confusion, we also show how well the cameras perform by showing the percentage of accuracy against the published specification. For example, a 480TVL camera delivering 460TVL is delivering 96 per cent of its specified resolution. It's a process to ensure that people can't use the data we provide in an unethical way.

Any other questions, feel free to ask. I apologise if any of the above leaves you still confused, but this was written in a hurry, and when I have a few free seconds I'll gladly take more time to go into more detail.

--
PSI - the only independent technical magazine for security installers http://www.psimagazine.co.uk
Benchmark - the performance index for security solutions http://www.benchmarkmagazine.com
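The accuracy figure described above is just the measured resolution expressed as a percentage of the published TVL specification. A minimal sketch of that calculation (the camera names and measured values below are hypothetical, chosen only to illustrate why a well-performing 480TVL camera can out-score a poorly performing 540TVL one):

```python
# Sketch of the "Image Resolution Accuracy" metric: the percentage of the
# published TVL spec that a camera actually delivered under test.
# All camera names and measurements here are made-up examples.

def resolution_accuracy(achieved_tvl: int, specified_tvl: int) -> float:
    """Return achieved resolution as a percentage of the published spec."""
    return achieved_tvl / specified_tvl * 100

cameras = [
    # (name, specified TVL, achieved TVL) -- illustrative values only
    ("Camera A", 480, 460),  # good performer against a 480TVL spec
    ("Camera B", 540, 470),  # poor performer against a 540TVL spec
]

for name, spec, achieved in cameras:
    acc = resolution_accuracy(achieved, spec)
    print(f"{name}: achieved {achieved} TVL, {acc:.0f}% of its {spec} TVL spec")
```

Note that Camera B still posts the higher raw "Achieved" number (470 vs 460), but its accuracy percentage is lower, which is exactly the distortion the two separate columns are meant to expose.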
videobruce 0 Posted March 23, 2007 To sum it up: 'Achieved' is the actual number of lines, and 'Accuracy' is the percentage of the published spec? How about tests with first the base model and then the advanced model (the one that is 2x the price with all the features, on the same chip and circuit board)? Most manufacturers have a model in the $170 price range and one in the $350-$400 range. Same camera, but with added features. Question is: is 2x more REALLY worth the price?
videobruce 0 Posted March 29, 2007 Never heard of what??