cglaeser

Avigilon "Change Display Quality" performance implications

Recommended Posts

When selecting among Low, Medium, High, and Maximum in the Change Display Quality dialog box, which resources are impacted? When increasing the display quality, does this (a) increase the processing at the server, (b) increase the bandwidth requirements between the client and server, or (c) increase the processing at the client? If the processing is done at the client, and assuming H.264 rather than MJPEG, can this processing be done by a high-performance video card on the client?

 

Best,

Christopher

Changing the setting to Maximum disables HDSM, which will increase bandwidth and CPU usage on your clients. It is advised to leave it at the default setting.
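
For a rough sense of what disabling HDSM costs, here is a back-of-envelope sketch. The resolutions, frame rate, and bits-per-pixel figure below are illustrative assumptions, not Avigilon specifications; the point is only the ratio between pulling the camera's full stream and pulling roughly what one display pane needs.

# Back-of-envelope comparison of client load with and without HDSM.
# All figures are illustrative assumptions, not Avigilon specifications.

def mbps_for(width, height, fps, bits_per_pixel=0.07):
    # Very rough H.264 bitrate estimate in Mbit/s for a given resolution.
    return width * height * fps * bits_per_pixel / 1e6

full_stream = mbps_for(2592, 1944, 15)   # assumed 5 MP camera at 15 fps
pane_stream = mbps_for(640, 480, 15)     # assumed size of one pane in a multi-camera layout

print(f"Full stream (Maximum, HDSM off): ~{full_stream:.1f} Mbit/s per camera")
print(f"Pane-matched stream (HDSM on):   ~{pane_stream:.1f} Mbit/s per camera")
print(f"Ratio: roughly {full_stream / pane_stream:.0f}x more to transfer and decode")

The exact ratio depends on the layout and how well the scene compresses, but every pane set to Maximum forces the client to fetch and decode the camera's full stream no matter how small that pane is drawn.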

 

Thanks for the quick response. I don't need and don't use Maximum, but sometimes I have to drop from High (the default) to Medium due to lagging responsiveness. What resource(s) would I need to increase so that I can leave the quality set to High and still get good responsiveness? A faster processor on the server? A faster processor on the client? More WAN bandwidth? Given that the client is more responsive on some days than others, I'm guessing it is the WAN.

 

Best,

Christopher

Is this over a LAN or WAN connection?

If you're streaming over a WAN connection, then you should definitely not have the display quality at Maximum.
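
As a quick sanity check of whether the WAN is the limit, compare the uplink at the recorder site against a per-pane rate for each quality level. The numbers below are placeholders to swap for your own figures, not measurements from any particular system:

# Quick check: how many live panes an assumed WAN uplink can sustain.
# Replace the placeholder numbers with your own measured figures.

wan_uplink_mbps = 10.0            # assumed upload capacity at the recorder site
per_pane_mbps = {                 # assumed per-pane rates at each quality setting
    "Low": 0.3,
    "Medium": 0.8,
    "High": 1.5,
    "Maximum (full stream)": 5.0,
}

for quality, rate in per_pane_mbps.items():
    panes = int(wan_uplink_mbps // rate)
    print(f"{quality:<24} ~{rate:.1f} Mbit/s per pane -> about {panes} panes")

If the days when the client feels sluggish line up with heavier traffic on that link, the uplink is the first thing to rule out before spending on faster client or server hardware.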

 

Understood. I don't use Maximum. I'm trying to understand which resources are affected. If the camera generates two streams, how are the four settings (Low, Medium, High, Maximum) produced? Does the server send a subset of a stream to reduce the bandwidth? Is it possible for the server to strip off some of the high-resolution information from a stream to reduce WAN bandwidth?

 

Best,

Christopher

