
Laws of Motion: MQU Motion Blur Test

Macquarie University undertook a dome camera motion blur test in January as part of an ongoing CCTV upgrade which will see the university’s 900-camera CCTV solution migrated to an external cloud provider.

MQU’s motion blur test was undertaken by the university’s CCTV consultant, Scott Myles, and included parameters such as motion blur, bit rate, sharpness, WDR, depth of field, noise and colour rendition in low light and very low light applications, as well as ease of integration with MQU’s Milestone XProtect Corporate VMS. SEN’s Rotakin unit was seconded to assist the process of establishing units of blur, and the results of the test showed 3 standouts, though each camera approached the challenges of the application in different ways, creating application-specific trade-offs.
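A structured way to handle this sort of multi-parameter trade-off is a weighted scoring matrix. The sketch below is purely illustrative – the parameter list mirrors the test above, but the weights and scores are hypothetical placeholders, not MQU’s actual criteria.

```python
# Illustrative weighted scoring for a multi-parameter camera test.
# Parameter names follow the MQU test; weights and scores are hypothetical.

PARAMETERS = {
    "motion_blur": 0.25,
    "sharpness": 0.15,
    "colour_rendition_low_light": 0.15,
    "wdr": 0.10,
    "depth_of_field": 0.10,
    "noise": 0.10,
    "bit_rate": 0.10,
    "vms_integration": 0.05,
}

def weighted_score(scores):
    """Combine per-parameter scores (0-10) into a single weighted total."""
    return sum(PARAMETERS[p] * scores[p] for p in PARAMETERS)

# Hypothetical per-parameter scores for one camera, out of 10.
camera_a = {
    "motion_blur": 8, "sharpness": 7, "colour_rendition_low_light": 7,
    "wdr": 6, "depth_of_field": 7, "noise": 6, "bit_rate": 9,
    "vms_integration": 10,
}
print(f"Camera A weighted score: {weighted_score(camera_a):.2f}")
```

In practice the weights would be set per application – a carpark camera might weight motion blur and low light colour heavily, while an entry point camera might favour WDR and sharpness.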

“The test was conducted in the sort of conditions a camera will typically face in a university environment and it was pleasing to find 3 cameras that met the criteria, though to varying levels against each parameter,” Myles said. “It was also good to get input from one of MQU’s experienced security operators, whose instincts corroborated the data we had gathered.

“While this test is concerned primarily with blur, we didn’t want to meet that parameter to the exclusion of other valuable parameters – sharpness, colour, depth of field and the rest. I told the engineers who set the cameras up prior to the test to apply camera settings that were good for both day and night, giving us the best balance through a 24-hour light cycle, while keeping control of bitrate.”

Setting up the test cameras

When SEN arrived at MQU with Ronnie Rotakin, the cameras were already set up on a board outside the MQU security office pointing up the carpark towards a chain link fence and gate, with a line of trees in the background. Nearer distances were marked off at 5m intervals using labelled traffic cones. SEN’s Rotakin unit was deployed to measure motion blur. I’d decided not to bring test target Norman, which is designed to measure colour rendition and sharpness.

In hindsight, the use of Norman would have allowed a more accurate sense of static colour rendition and face recognition deeper into the scene, but in the planning process I decided a static Norman would distract from the primary goal of the test, which was to establish the image quality of moving faces. We managed without Norman, but we missed him.

With cameras, cones and Rotakin set up, it was simply a matter of waiting for light levels to fall so we could go through the camera group one parameter at a time. It’s worth pointing out that a couple of cameras had not been integrated with the Milestone system and so could not be considered.

We started out viewing the scene from the security control room in reasonably good light – around 20-40 lux at the target. The light was primarily low kelvin – between 1800K and 3000K in my estimation. This quite high level of light – you’d rarely experience 40 lux on the street at night – still left some cameras with a motion blur deficit.

We did not go into each camera individually to check shutter speed, but in some cases it seemed to be as slow as 1/30th of a second. One camera, however, seemed to have sacrificed light gathering potential and detail deeper into the scene to handle motion blur better – the strategy was successful, but it came at the cost of darkness elsewhere in the scene.
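The geometry explains why a slow shutter struggles: blur in pixels is roughly subject speed multiplied by exposure time multiplied by the camera’s pixel density at the target. A minimal sketch with assumed figures – walking-pace subject and an indicative pixel density, not the test cameras’ measured settings:

```python
# Rough motion blur estimate: blur (px) ~ speed (m/s) x exposure (s) x pixels per metre.
# All figures below are assumptions for illustration, not measured MQU values.

def blur_pixels(speed_mps, exposure_s, px_per_metre):
    """Approximate linear smear, in pixels, of a subject crossing the scene."""
    return speed_mps * exposure_s * px_per_metre

walking_speed = 1.4   # m/s - typical walking pace (assumed)
px_per_metre = 250    # indicative pixel density for identification-level detail (assumed)

for shutter in (1/30, 1/60, 1/120, 1/250):
    smear = blur_pixels(walking_speed, shutter, px_per_metre)
    print(f"1/{round(1/shutter)}s shutter -> ~{smear:.1f} px of smear")
```

At these assumed figures, a 1/30th shutter smears a walking subject across roughly 12 pixels, while 1/250th holds it under 2 – the difference between a recognisable face and a blur.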

The labelled cones were excellent in that they took the guesswork out of depth of field assessments. As the night wore on and our eyes got tired, it became tempting to generalise performance, and having those labelled cones in place allowed working depth of field – the distance to which sharpness is retained – to be pinpointed to within 5 metres.

Scott Myles (left) with MQU campus security manager, John Durbridge

Another variable was the sweet spot in the focal point of each camera lens. All the cameras were operating at their widest lens settings – around 2.8mm for the most part – yet there seemed to be differences with focal points. This was exacerbated by digital zoom, which added pixel spread to the equation. Then there were processing differences, amplification variations, and colour casts introduced by the low kelvin luminaires in the carpark. Judging these differences was made more challenging by the fact that some camera characteristics were inter-related.
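For context, at such short focal lengths the hyperfocal distance is small, so nearly everything beyond a couple of metres should be in geometric focus – which suggests residual softness deeper in the scene owes more to processing and digital zoom than to optics. A rough sketch using standard thin-lens depth of field formulas, with assumed aperture and circle of confusion, since the test didn’t record either:

```python
# Hyperfocal distance and depth of field for a short CCTV lens.
# The f-number and circle of confusion are assumptions, not the tested cameras' specs.

def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Focus here and everything from half this distance to infinity is acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

def dof_limits_mm(focal_mm, f_number, coc_mm, subject_mm):
    """Near and far limits of acceptable sharpness for a given focus distance."""
    H = hyperfocal_mm(focal_mm, f_number, coc_mm)
    near = subject_mm * (H - focal_mm) / (H + subject_mm - 2 * focal_mm)
    far = float("inf") if subject_mm >= H else subject_mm * (H - focal_mm) / (H - subject_mm)
    return near, far

f_mm, N, coc = 2.8, 1.4, 0.003   # 2.8mm lens; f/1.4 and a 3 micron CoC are assumed
print(f"Hyperfocal distance: {hyperfocal_mm(f_mm, N, coc) / 1000:.2f} m")

near, far = dof_limits_mm(f_mm, N, coc, 5000)   # focused at the 5m cone
far_text = "infinity" if far == float("inf") else f"{far / 1000:.2f} m"
print(f"Focused at 5m: acceptably sharp from {near / 1000:.2f} m to {far_text}")
```

Under these assumptions the hyperfocal distance comes out under 2 metres, so a camera focused at the 5m cone should hold geometric sharpness from about 1.4 metres to infinity.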

Mounting the cameras flat on the board introduced another variable during the WDR test – internal dome bubble reflections and internal lens reflections. In one case, the WDR test provoked a camera to display a curious pattern of concentric rings and a large greenish disc on the monitor. It was only later in the test we realised the rings were a reflection off the front of the lens bouncing back at the sensor off the dome bubble, while the green disc was a reflection of the lens coating.

The test

Going through the cameras one at a time for each performance parameter was instructive. Importantly for the client, the test was conducted in real time using MQU’s Milestone XProtect Corporate VMS. As we were beginning to go through the cameras, something interesting happened. A person who could see the Rotakin spinning outside the security office rode down on a bicycle to take a look and, during this little investigation, moved between the cameras and Ronnie. These movements at close distance, where the rider filled a greater portion of the scene, introduced more widespread blur and artefacts as the cameras tried to process the scene.
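Our assessments of blur were made by eye against Rotakin, but the same effect can be quantified offline. A common automated proxy is the variance of the Laplacian over each frame – fewer sharp edges means lower variance, i.e. more blur. A minimal sketch using OpenCV on frames pulled from a camera stream; the RTSP URL is a placeholder and this metric is an illustration, not part of the MQU methodology:

```python
# Variance-of-Laplacian sharpness metric on live frames.
# Illustrative only: the RTSP URL is a placeholder; the MQU test scored blur visually.
import cv2

def sharpness(frame):
    """Variance of the Laplacian - lower values indicate more blur."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

cap = cv2.VideoCapture("rtsp://camera.example/stream")  # placeholder URL
try:
    for _ in range(100):          # sample 100 frames
        ok, frame = cap.read()
        if not ok:
            break
        print(f"sharpness: {sharpness(frame):.1f}")
finally:
    cap.release()
```

Logging this score while Rotakin spins, then while it is stopped, would give a crude per-camera blur number that can be compared across the group.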

While we can’t give the full test results, we can speak generally about performance. There was some colour casting with a number of the cameras as the low kelvin light spilled into the image. I thought the Vivotek was best against motion blur but managed this at the cost of some darkness in the rest of the image – perhaps due to a fast shutter speed. The Pelco gave a good static image but was less strong against blur. Panasonic and Axis delivered good all-round images – stronger in some areas and less strong in others – but importantly, the order of performance changed as we tested each parameter.

When it came to colour rendition I liked Panasonic, Pelco, Axis, then Vivotek. Depth of field varied and we saw sweet spots in the focal range, too. Pelco showed this characteristic the most, being strong in the middle distance. Night image quality went Panasonic, then Axis, then Pelco, then Vivotek. Sharpness was Axis, Vivotek, Panasonic, Pelco. When it came to bitrates, Vivotek by a mile, then Axis. With motion blur I liked Panasonic, Vivotek, Axis, then Pelco.

When we turned all the lights off, performance shifted again. With sharpness I thought Panasonic, Axis, Vivotek, then Pelco. With bitrate, Vivotek by a mile, with good sharpness and low noise. With depth of field, Panasonic, Axis, Vivotek, then Pelco. With motion blur, Vivotek, Panasonic, Axis, then Pelco. With WDR, Panasonic then Vivotek, but we decided Axis was giving the best faces in these conditions, though losing some background in the process. At another point in the test we agreed the Pelco was giving the best front-to-back static image for situational awareness, taking every factor into account.
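With the per-parameter orderings shifting like this, one way to get an overall picture is simple rank aggregation. The sketch below applies a Borda-style count to the lights-off orderings reported above – purely illustrative, since it weights every parameter equally, which no real application would:

```python
# Borda-style aggregation of the lights-off rankings reported above.
# Equal parameter weighting is an illustrative simplification; partial
# orderings (bitrate, WDR) are omitted.

rankings = {  # best-to-worst, as reported in the test notes
    "sharpness":      ["Panasonic", "Axis", "Vivotek", "Pelco"],
    "depth_of_field": ["Panasonic", "Axis", "Vivotek", "Pelco"],
    "motion_blur":    ["Vivotek", "Panasonic", "Axis", "Pelco"],
}

points = {}
for order in rankings.values():
    for position, camera in enumerate(order):
        # first place earns len(order)-1 points, last place earns 0
        points[camera] = points.get(camera, 0) + (len(order) - 1 - position)

for camera, score in sorted(points.items(), key=lambda kv: -kv[1]):
    print(f"{camera}: {score}")
```

An aggregate like this is only a starting point – as the test showed, the right weighting depends entirely on what the application demands of each camera.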

“All in all, this was a very tough test – there’s so much variation in different parts of the scene as conditions change and each of the cameras handles things differently and exhibits different characteristics throughout the process,” said Myles. “Importantly from the perspective of MQU, the idea here is to narrow down the field to the group of cameras that handles the conditions in the most balanced way and I think we have achieved that.”

Something interesting from my point of view is that testing cameras in isolation, as SEN often does, doesn’t allow you to rate cameras against each other in terms of comparative characteristics. And having a second pair of eyes on the monitors helps, too. Many of the rankings came down to an argument. As Myles pointed out during the process, it was relatively easy to separate the cameras based on a single parameter of performance, though he also pointed out that every layer of performance was different in every situation tested – that was where things got tough to call.

The MQU test showed you should not choose a camera based solely on one characteristic – for instance, low blur at a particular focal length – but you would reject a camera if it displayed high levels of motion blur in all conditions. As Myles pointed out, end users want more than a face at 5m – what that more is depends on the customer and the application. At the end of the night, before packing up, we asked one of the MQU operators which camera he thought was managing the challenging conditions best. When he told us his opinion, we both agreed with him but later, I had doubts.

By John Adams
