Objective vs Subjective Camera Testing: Which is Best?

There are objective tests available that measure parameters of camera performance. At the same time, some variables seem subjective – they might depend on the application and relate to focus at a necessary depth of field, motion blur in a specific location facing specific challenges, backlight performance, performance at the edge of the image and much more, all at once.

Is there a software analysis tool capable of managing all the vagaries that combine to decide key performance vectors? What does SEN think the key operational vector of a surveillance camera is, and can it really be tested for across an entire scene – edge to edge, front to back?

A: Let’s define some terms first. Subjective evaluations are based entirely on personal feelings, tastes or opinions, while objective evaluations are not. The ambiguity of human labels aside, I think it’s fair to argue that any camera test based on the observable performance of a camera system is not simply based on personal feelings, tastes or opinions. At the same time, it’s not entirely objective either.

There may be some elements of an image that might be called subjective, but it’s tricky. For instance, a quality like motion blur, which is difficult to quantify and hard to place a value on subjectively, might be judged by testing for modulation transfer function (MTF) or spatial frequency response (SFR). An image quality analysis solution like Image Engineering’s iQ-Analyser, which is distributed locally by VidiLabs, gives a lot of scope to get a more accurate picture of camera performance based on objective parameters.
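
To make the idea concrete, here’s a minimal sketch of how an edge-based SFR/MTF estimate works in principle (this is not the ISO 12233 slanted-edge procedure or anything from iQ-Analyser, just the core idea of differentiating an edge profile into a line spread function and taking its Fourier transform):

```python
# Minimal sketch of an edge-based SFR/MTF estimate (illustrative only;
# not the ISO 12233 slanted-edge algorithm or Image Engineering's code).
import numpy as np

def edge_sfr(edge_profile: np.ndarray) -> np.ndarray:
    """Estimate SFR/MTF from a 1-D intensity profile taken across an edge.

    edge_profile: luminance samples crossing a dark-to-light edge,
    one value per pixel. Returns MTF values from 0 to ~0.5 cycles/pixel.
    """
    # Edge spread function -> line spread function by differentiation.
    lsf = np.diff(edge_profile.astype(float))
    # Window to suppress truncation artefacts at the ends of the profile.
    lsf *= np.hanning(lsf.size)
    # MTF is the normalised magnitude of the Fourier transform of the LSF.
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# Example: a soft synthetic edge blurred over a few pixels.
x = np.linspace(-5, 5, 64)
profile = 1.0 / (1.0 + np.exp(-x * 2.0))          # sigmoid edge
mtf = edge_sfr(profile)
freqs = np.fft.rfftfreq(profile.size - 1, d=1.0)  # cycles per pixel
print(f"MTF at 0.25 cy/px ≈ {np.interp(0.25, freqs, mtf):.2f}")
```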

Vectors like OECF (opto-electronic conversion function), dynamic range, white balance, noise and ISO speed, visual noise, MTF, limiting resolution, distortions including lateral and longitudinal chromatic aberrations, vignetting, shading and flare, as well as colour rendition, can all be tested for. Obviously, they can be tested across multiple cameras in the same application to get a strong sense of which camera performs best. It’s the way a camera system handles all these shifting vectors across a scene in real time through changing conditions that’s hardest to assess. As control room operators know very well, their applications will often pose unique challenges through a daily light cycle.
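
As a toy illustration of the sort of numbers such testing produces, the sketch below derives per-patch SNR and a simple dynamic-range figure from grey-chart patch statistics. The chart layout, patch values and noise model are invented for the example, and the calculation is a simplification rather than any standard’s definition:

```python
# Illustrative sketch only: rough per-patch SNR and a simple dynamic-range
# figure from grey-chart measurements. Patch levels, noise model and the
# SNR floor are assumptions made up for this example.
import numpy as np

def patch_stats(patches):
    """patches: list of 2-D arrays, each cropped from one grey patch."""
    means = np.array([p.mean() for p in patches])
    stds = np.array([p.std() for p in patches])
    return means, stds

def dynamic_range_db(means, stds, snr_floor=1.0):
    """Ratio of the brightest to the darkest patch whose SNR still
    exceeds snr_floor, expressed in dB."""
    snr = means / np.maximum(stds, 1e-9)
    usable = means[snr >= snr_floor]
    return 20.0 * np.log10(usable.max() / usable.min())

# Synthetic example: ten patches from near-black to near-white,
# with noise that grows with signal (shot-noise-like behaviour).
rng = np.random.default_rng(0)
patches = [np.clip(level + rng.normal(0, 0.5 + 0.02 * level, (32, 32)), 0, 255)
           for level in np.linspace(2, 250, 10)]
means, stds = patch_stats(patches)
print("per-patch SNR:", np.round(means / stds, 1))
print("dynamic range ≈ %.1f dB" % dynamic_range_db(means, stds))
```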

In our opinion, the key vector of a surveillance camera is sharpness – think of sharpness as the clarity of boundaries between colours and tones in a scene. You’d measure the sharpness of digital cameras (with their variable sensor sizes) as line-widths-per-picture-height, though cycles-per-pixel would be another useful measurement, telling you how effective each pixel is across an entire scene. Subjectively and objectively, an ideal approach to sharpness measurement using MTF is to pick an area showing about half the peak sharpness value – an area where there’s still plenty of detail discernible to the human eye. If you try to assess sharpness subjectively in the murkiest parts of a scene it will be a struggle – under about 10 per cent MTF the human eye is a poor performer. Software will do better at the challenging ends of the performance spectrum.
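
For readers who want the arithmetic, the relationship between the two sharpness units is straightforward: one cycle spans two line widths, so line widths per picture height is twice the cycles-per-pixel figure multiplied by the image height in pixels. The MTF50 value and sensor height below are invented for illustration:

```python
# Hedged example: converting a sharpness figure from cycles/pixel to
# line widths per picture height (LW/PH). The MTF50 value and sensor
# height used here are made up for illustration.
def cycles_per_pixel_to_lw_ph(mtf50_cy_per_px: float, picture_height_px: int) -> float:
    # One cycle = one dark + one light line, i.e. two line widths,
    # so LW/PH = 2 * cycles/pixel * picture height in pixels.
    return 2.0 * mtf50_cy_per_px * picture_height_px

# e.g. a camera resolving MTF50 at 0.28 cycles/pixel on a 1080-line sensor:
print(cycles_per_pixel_to_lw_ph(0.28, 1080))  # -> 604.8 LW/PH
```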

Image Engineering iQ-Analyser

A nice thing about sharpness as a test vector for surveillance cameras is that factors like flare, blooming, amplification noise, lens distortions, poor sensor performance – an inability to render subtle differences in colour, for instance – will all feed into the overall sharpness score. This is a double-edged sword for the perfectionist tester. The 10-90 per cent rise distance at pixel level is a typical sharpness measurement of an entire camera solution – lens, sensor, camera engine (and monitor, if tested by eye), cabling and network – as well as the firmware inside and outside the camera that might be doing work on an image. Calculating the rise distance of the lens or sensor individually is something else entirely.
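
Here’s a minimal sketch of what a 10-90 per cent rise-distance measurement looks like at pixel level, assuming you already have an intensity profile sampled across a dark-to-light edge (illustrative only, not the procedure of any particular analyser):

```python
# Illustrative only: 10-90 per cent rise distance across an edge profile,
# measured in pixels. Assumes a monotonic dark-to-light edge.
import numpy as np

def rise_distance_10_90(profile: np.ndarray) -> float:
    p = profile.astype(float)
    p = (p - p.min()) / (p.max() - p.min())   # normalise to 0..1
    x = np.arange(p.size)
    x10 = np.interp(0.1, p, x)                # position where edge reaches 10%
    x90 = np.interp(0.9, p, x)                # position where edge reaches 90%
    return x90 - x10                          # in pixels; fewer = sharper

# Example: a synthetic edge blurred over roughly three pixels.
edge = np.array([0, 0, 0.05, 0.2, 0.5, 0.8, 0.95, 1, 1], dtype=float)
print(f"10-90% rise distance ≈ {rise_distance_10_90(edge):.2f} px")
```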

At SEN, we measure performance in our camera tests by viewing test target bar patterns, matching colours to a colour bar and assessing degrees of blur, among other things. This process is undertaken with the naked eye – it’s subjective but not entirely so, and it’s not without considerable value. But subjective tests, even those of experienced testers using the same test targets and comparing images of the same scenes in varying conditions, can’t be exactly repeated and can’t give absolute measurements that allow true comparisons between cameras the way software can. For instance, consider the point at which sharpness gives way – the vanishing resolution – which may vary across and through a scene, and will certainly be the subject of debate between human observers.

I wouldn’t say the human eye isn’t capable of highly accurate assessments, but those assessments are going to be generalisations and will carry biases governed by observer expectation, tiredness, the quality of an observer’s vision, the performance of an observer’s monitor, whatever image quality happens to be front of mind in a split second of decision, and more. Comparatively, whichever values software applies, it will apply them all equally, all the time. An observer might perceive excellent colour rendition and natural tones, but as for an ability to put consistent empirical values on a sensor’s colour performance by measuring the peak sensitivity of red, green and blue pixels, or to put a value on SNR against variable ISO? Not a chance.
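
As a toy example of the kind of consistent empirical figure software can produce, the sketch below computes per-channel SNR from flat-field frames at a few gain settings; the frame data and gain values are synthetic stand-ins, not measurements from any real sensor:

```python
# Illustrative sketch: per-channel SNR from flat-field frames captured at
# different gain/ISO settings. Frame data and gain values are synthetic.
import numpy as np

def channel_snr(frame_rgb: np.ndarray) -> dict:
    """frame_rgb: H x W x 3 array from a uniformly lit, defocused target."""
    out = {}
    for i, name in enumerate("RGB"):
        channel = frame_rgb[..., i].astype(float)
        out[name] = channel.mean() / channel.std()
    return out

rng = np.random.default_rng(1)
for gain in (1, 4, 16):                      # stand-ins for rising ISO
    noise = 2.0 * np.sqrt(gain)              # noise grows with gain
    frame = np.clip(rng.normal(128, noise, (64, 64, 3)), 0, 255)
    snr = channel_snr(frame)
    print(f"gain x{gain:>2}: " + ", ".join(f"{k} {v:.1f}" for k, v in snr.items()))
```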

At the same time, could a piece of test software better comprehend the key camera qualities required in a specific application during a 24-hour light cycle – judging not one parameter but balancing the whole and making real-time compromises in multiple areas of performance to achieve operational requirements elsewhere? Again, I think not a chance. Using both types of assessment together, however, is likely to provide the most accurate and most consistent results. But perhaps most importantly, any conversation about camera performance is worth having. ♦
