AmericanPaul 0 Posted February 15, 2016

I have a question regarding how camera frame rate (PAL vs. NTSC) is affected by power line frequency (50 Hz vs. 60 Hz), and more specifically, whether IP cameras are affected by it at all. I was told recently that IP cameras are "digital", and therefore the power frequency does not affect the frame rate, but I guess I do not fully understand how this is possible. I think this may come down to where/how the IP camera's A/D converter gets its timing?

I think an IP camera basically works like this: LENS > CCD/CMOS > ADC > DSP. The lens focuses the image onto the imager, the signal is then converted from analog to digital by the ADC, and finally it is compressed, encoded, etc. by the DSP. To make the question concrete, I have sketched my mental model of the timing chain at the end of this post.

The A/D converter samples the analog signal voltage and compares it to its reference voltage, but I think that voltage is derived from the camera's power source frequency (50 or 60 Hz), which, even if it is converted to DC, would still retain that frequency, right?

I may have no idea what I am talking about, but any help is appreciated. Thanks
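For what it's worth, here is a rough sketch of what I mean, written as Python just to make the arithmetic explicit. The crystal frequency and the line/frame totals are my own assumptions for illustration (loosely based on common HD video timings), not pulled from any actual camera:

```python
# My mental model: the sensor's frame timing comes from a crystal
# oscillator divided down by the sensor's line/frame totals, not from
# the mains supply. The numbers below are illustrative assumptions only.

CRYSTAL_HZ = 74_250_000  # hypothetical 74.25 MHz pixel-clock crystal

def frame_rate(total_px_per_line: int, total_lines_per_frame: int) -> float:
    """Frame rate = pixel clock / (total pixels per line * total lines per frame)."""
    return CRYSTAL_HZ / (total_px_per_line * total_lines_per_frame)

# The same crystal yields either a "PAL-like" or an "NTSC-like" rate
# purely by changing the blanking (the total line/frame dimensions):
print(frame_rate(2640, 1125))  # 25.0 fps
print(frame_rate(2200, 1125))  # 30.0 fps
```

If that picture is right, the frame rate would be set entirely by the oscillator and the sensor's register settings, with no connection at all to the 50/60 Hz mains, and that is the part I am trying to confirm.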
AmericanPaul 0 Posted September 4, 2019

I'm still curious about this, and never found an answer. Anyone?