BRS AISight: The Age Of Reason

VIDEO surveillance has a fundamental problem. It’s a reactive solution that allows investigators to see who did what, when – all this in the past tense. Such capability is beaut from the point of view of convictions. If your system is all about resolving lines of corporate investigation, then event recording may be all you ever need.
But if your site has higher security requirements, or if you have a group of higher risk locations you need to monitor live, then you are going to want something more. And that something cannot be dozens of CCTV operators hunched over screens alert to the tiniest incident. Running a dedicated video surveillance division around the clock is painfully expensive – only casinos can afford it.
The solution is something we’ve all heard plenty about over the past decade, something we’ve all given up believing in – video analytics. The conundrum is that no video analytics solution we’ve seen in Australia has worked effectively. It’s been too expensive, too limited, too prone to false alarms and plain too hard to manage.
Now BRS Labs says its technology resolves the issues of the rules-based past. According to the company, its AISight video analytics solution never false alarms, never misses unusual events and costs far less than a manned control room. AISight is a server-based product that can handle 30 inputs, optical or thermal, in H.264 or MPEG-4 compression. The system can report to workstations or mobile devices and supports multi-platform deployment, including 32 and 64-bit Windows and Linux.
Essentially AISight incorporates learning and analysis engines that combine to allow the system to observe events, analyze them, and remember them in the same functional way the human brain makes and stores memories. This ability to recall means that when new events differ from AISight’s memories, it can judge that a suspect event is occurring and trip an alarm.
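To make that learn-and-compare idea concrete, here is a rough sketch in Python of how an event memory of this general kind might work. It is purely illustrative – the feature set, distance measure and threshold are my own assumptions, not BRS code.

```python
# A minimal sketch of the "observe, remember, compare" idea described above.
# Nothing here is BRS code; the features and threshold are illustrative only.
import math

class EventMemory:
    def __init__(self, anomaly_threshold=5.0):
        self.memories = []                  # previously observed event feature vectors
        self.anomaly_threshold = anomaly_threshold

    def observe(self, event):
        """Store an observed event (a tuple of numeric features) as a memory."""
        self.memories.append(event)

    def is_unusual(self, event):
        """Flag the event if it differs too much from every stored memory."""
        if not self.memories:
            return True                     # nothing learned yet, everything is novel
        nearest = min(math.dist(event, m) for m in self.memories)
        return nearest > self.anomaly_threshold

# Example: events described as (x position, y position, speed)
memory = EventMemory()
for e in [(10, 20, 1.2), (11, 21, 1.0), (12, 19, 1.1)]:
    memory.observe(e)

print(memory.is_unusual((11, 20, 1.1)))   # False - close to what it has seen
print(memory.is_unusual((90, 5, 8.0)))    # True  - unlike any stored memory
```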
AISight currently covers many US railways and is also used by intelligence and federal agencies that have tried analytics and given up on it, despite still needing to watch huge numbers of cameras. And that, says Ray Davis, founder, chairman and CEO of BRS Labs, is the main issue for video surveillance systems.
“There are just so many cameras out there wasting away,” Davis says. “To use all these camera systems proactively in real time you have to have an operator per camera 24/7 or use a system that emulates a person 24/7. AISight is that system. It doesn’t go to sleep, blink, doze off or daydream.”
There’s no doubt that video analytics was seen to have significant potential after 9-11.
“About 20 companies ran at the market after 9-11,” Davis explains. “They took the quickest way to the product with a rules-based technology. With rules-based systems you have to hard code or program everything. If you have a camera and you want to catch a guy jumping a fence you have to draw a line around the fence, program everything you want to catch in each scene of each camera.”
According to Davis, that caused serious problems. Manual setup was time-consuming and expensive. Another problem was maintenance: as cameras move, objects in the scene shift or the seasons change, the system must be reprogrammed.
Third and most drastic, according to Davis, are false alarms. Because a rule is a hard-coded yes/no test, he says, any variation in the scene will trip the alarm, whether it be headlights, shifting shadows or moving foliage.
“You can typically get from 100 to 3000 false alarms per day, per camera,” says Davis. “So while rules-based analytics was designed to watch thousands of cameras instead what it has done is create thousands and thousands of false alarms. It may catch the bad guy but you’d need a huge staff to filter the false alarms – that defeats the whole purpose of analytics.”
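For contrast, the hard-coded approach Davis is criticising boils down to something like the toy rule below: any changed pixel inside the drawn zone raises an alarm, with no way to tell an intruder from a headlight sweep or moving foliage. The zone coordinates are invented for illustration.

```python
# Hypothetical illustration of a rules-based tripwire zone. Any foreground pixel
# inside the zone trips the alarm, whatever actually caused the pixel to change.
FENCE_ZONE = (100, 150, 300, 400)   # x_min, y_min, x_max, y_max in pixels (made up)

def zone_alarm(foreground_pixels):
    """Return True if any changed pixel falls inside the rule's zone."""
    x_min, y_min, x_max, y_max = FENCE_ZONE
    return any(x_min <= x <= x_max and y_min <= y <= y_max
               for x, y in foreground_pixels)

print(zone_alarm([(120, 200)]))   # True - could be a person, or just a shadow
print(zone_alarm([(10, 10)]))     # False
```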
The failure has not been without its benefits for BRS, which arrived in the market in 2005 as a relative latecomer.
“We were fortunate,” says Davis. “We were not only able to see the past products, we were also able to perceive the future – to see what video analytics was going to need and what rules-based video analytics was not going to be able to deliver. We quickly realised artificial neural networks were the only way to make the technology work.”
According to Davis, the central object of video analytics is to have a computer watch a camera and tell security staff when something is important enough to look at.
“But rules-based analytics doesn’t watch the monitor,” he says. “Instead it sets up rules-based zones and if a pixel crosses a zone, then the system generates an alarm. Fact is, the only way to watch a camera is to have a simulated human watch a camera – that’s artificial intelligence, machine learning, artificial neural networks.”
Davis concedes AI is a broad term.
“When I use the term AI here what I’m talking about is extremely advanced machine learning capability – something that can see an event, memorise, reason about it then apply this learning to other events,” he explains.
“Early analytics companies could not achieve this. There’s no way for a computer to understand a video. If you think about it, an email can be read textually, but there is nothing that understands a video. You can’t just have the computer tell you what’s happening.”
To resolve the problem BRS created a piece of software called Rosetta.
“Rosetta translates each frame of video into a language – a language the computer understands,” Davis says. “We then pass that language to an artificial neural network that simulates a human brain and what this language does is identify every facet of each video frame and send it to a computer.
“The computer is able to understand the temporal nature of each frame coming through, compare what is different between frames so as to understand objects and their attributes like mass and colours, and to classify objects into different groups – humans, trains, cars, animals, birds, plants.
“The system recognises the behaviour of those objects – down to recognising the normal paths joggers would take through a park or swimmers would take in a swimming pool, how a car drives – how fast, where does it park, which direction does it usually go.”
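I have not seen the Rosetta format itself, but one way to picture the ‘language’ Davis describes is a compact symbolic record per tracked object, per frame – something a learning engine can compare across frames. The field names and classes below are my own guesses for illustration only.

```python
# A loose sketch of a per-frame symbolic description of tracked objects.
# These field names and classes are assumptions, not the Rosetta format.
from dataclasses import dataclass

@dataclass
class ObjectDescription:
    object_id: int
    object_class: str      # e.g. "human", "car", "train", "animal"
    x: float               # position in the frame
    y: float
    velocity: float        # pixels per frame
    heading: float         # direction of travel, degrees

def describe_frame(tracked_objects):
    """Turn one frame's tracked objects into a list of symbolic descriptions."""
    return [ObjectDescription(o["id"], o["class"], o["x"], o["y"],
                              o["velocity"], o["heading"])
            for o in tracked_objects]

frame = [{"id": 7, "class": "human", "x": 412.0, "y": 118.0,
          "velocity": 1.4, "heading": 90.0}]
print(describe_frame(frame))
```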
According to Davis, by inventing this technology and finalising 88 patents in 40 countries, BRS Labs has locked up reason-based processing for video analytics.
“The U.S. security industry now sees us as the only video analytics that works, and the only reason-based analytics provider,” he says.

How AISight works

Perhaps the neatest thing about AISight, and I should point out here that I did not see the system running live but viewed multiple situational demos, is its ease of installation and setup.
Once the software has been started, it connects to a video network and begins to monitor the environment and activities going on in each assigned camera view – up to 30 inputs per server. Each of the camera views is stored as a separate memory. Elements that are always present in the environment become part of the recognised background of a scene.
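In principle, that ‘elements always present become background’ behaviour is the familiar background-modelling idea. A minimal sketch, assuming a simple running-average model (the learning rate and threshold are illustrative, not AISight’s):

```python
# Minimal background-modelling sketch: blend each frame into a running average,
# then treat pixels that differ strongly from that average as foreground activity.
def update_background(background, frame, learning_rate=0.05):
    """Blend the new frame into the running background model."""
    return [[(1 - learning_rate) * b + learning_rate * f
             for b, f in zip(b_row, f_row)]
            for b_row, f_row in zip(background, frame)]

def foreground_mask(background, frame, threshold=30):
    """Pixels that differ strongly from the background model are foreground."""
    return [[abs(f - b) > threshold for b, f in zip(b_row, f_row)]
            for b_row, f_row in zip(background, frame)]

# Toy 2x3 greyscale frames
background = [[100, 100, 100], [100, 100, 100]]
frame      = [[101,  99, 180], [100, 102, 100]]   # one pixel changed sharply
background = update_background(background, frame)
print(foreground_mask(background, frame))          # only the changed pixel is flagged
```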
Meanwhile, objects that enter the field of view are analyzed based on appearance, classification and interaction with the environment and other objects. This classification is important – objects might be adjudged cars, trucks, humans, pets or birds.
As BRS explains it, the newly set up AISight software analyzes the structures, sizes, shapes, locations, velocities, accelerations, paths of objects and other characteristics of all objects within the scene and forms memories about them over a few days. It also records timestamps for these events and remembers during what times of day or days of the week events most frequently occur.
The more objects and behaviors are observed, the more weight these memories are given. The less frequently the system has observed an event in the past, the weaker its memory will be about the event and the more unusual it will deem a particularly uncommon activity. This might include a human figure on a runway or freeway, or a car on a railway track.
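The frequency weighting described here can be pictured with a very small sketch: the more often a behaviour has been observed, the weaker its anomaly score. The scoring formula below is my own simplification, not the AISight algorithm.

```python
# Hedged sketch of frequency-weighted memories: frequent behaviours score low,
# rare or unseen behaviours score high. The formula is illustrative only.
from collections import Counter

class BehaviourMemory:
    def __init__(self):
        self.counts = Counter()

    def observe(self, behaviour):
        self.counts[behaviour] += 1

    def unusualness(self, behaviour):
        """Score 0..1: rare or unseen behaviours score high, frequent ones low."""
        total = sum(self.counts.values())
        if total == 0:
            return 1.0
        return 1.0 - self.counts[behaviour] / total

memory = BehaviourMemory()
for _ in range(500):
    memory.observe(("car", "on_road"))
memory.observe(("human", "on_runway"))

print(memory.unusualness(("car", "on_road")))      # close to 0 - seen constantly
print(memory.unusualness(("human", "on_runway")))  # close to 1 - seen once
```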
Unusual activity is immediately reported to security personnel to enable a proactive response to potential threats, but normal activity is ignored. And even when AISight has learned to ignore certain activities, it can still be told to alert security personnel of those activities regardless of how often they occur, if needed.
Because of its ability to learn, remember, and slowly but never entirely forget, AISight’s ability to provide relevant, accurate alerts evolves alongside the environment. It adapts to moving vegetation, lighting changes, repositioning of furniture, weather patterns and the myriad other environmental aspects that challenge rules-based video analytics systems.

AISight functions

AISight has a range of monitoring capabilities that include things like classification anomalies. In such cases AISight recognizes when subject types match no known pattern of appearance/properties and issues an alert to that effect. The system can also report on position-based activities and motion. It does this by learning to recognize where it is common for things to appear, disappear, enter, exit, stop, start, turn, move quickly/slowly, accelerate quickly/slowly. AISight uses spatial memories (position-based maps) to recognize anomalies for behaviors that violate the learned patterns regardless of where they occur in the scene.
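A position-based map of that kind might look roughly like the following sketch, which divides the view into grid cells and learns where each class of object normally appears. Cell size and the observation threshold are assumptions for illustration.

```python
# Illustrative sketch of a spatial (position-based) memory: the view is divided
# into grid cells and the system learns where each object class normally appears.
from collections import defaultdict

CELL = 40   # pixels per grid cell (illustrative)

class SpatialMemory:
    def __init__(self, min_observations=5):
        self.seen = defaultdict(int)          # (class, cell) -> observation count
        self.min_observations = min_observations

    def _cell(self, x, y):
        return (int(x) // CELL, int(y) // CELL)

    def observe(self, object_class, x, y):
        self.seen[(object_class, self._cell(x, y))] += 1

    def is_anomalous(self, object_class, x, y):
        """An object in a cell where its class has rarely been seen is anomalous."""
        return self.seen[(object_class, self._cell(x, y))] < self.min_observations

memory = SpatialMemory()
for _ in range(20):
    memory.observe("human", 50, 300)          # joggers usually on the park path
print(memory.is_anomalous("human", 55, 310))  # False - a normal position
print(memory.is_anomalous("human", 400, 20))  # True  - a human where none appear
```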
The system also uses its temporal (time-based) memories to issue alerts on anomalies – behaviours that violate the learned patterns in its time-based maps. This allows activity that is normal on one day of the week but never occurs on another to be tracked and reported. Importantly, AISight achieves autonomous maturity recognition – it determines on its own when it has achieved sufficient scene knowledge and then begins alerting by itself.
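Time-based memories and the autonomous maturity idea can be sketched the same way – no alerts until enough has been observed, then alerts for behaviours at day and hour slots where they have never been seen. Again, the maturity threshold here is an assumption, not an AISight parameter.

```python
# Illustrative sketch of temporal memories plus autonomous maturity: learn first,
# then alert on behaviours at day/hour slots where they have never occurred.
from collections import defaultdict

class TemporalMemory:
    def __init__(self, maturity=100):
        self.counts = defaultdict(int)     # (behaviour, weekday, hour) -> count
        self.total = 0
        self.maturity = maturity           # observations needed before alerting

    def observe(self, behaviour, weekday, hour):
        self.counts[(behaviour, weekday, hour)] += 1
        self.total += 1

    def should_alert(self, behaviour, weekday, hour):
        if self.total < self.maturity:
            return False                   # still learning: no alerts yet
        return self.counts[(behaviour, weekday, hour)] == 0

memory = TemporalMemory(maturity=50)
for day in range(5):                       # deliveries every weekday at 9am
    for _ in range(20):
        memory.observe("truck_at_gate", day, 9)

print(memory.should_alert("truck_at_gate", 2, 9))   # False - normal for Wednesday 9am
print(memory.should_alert("truck_at_gate", 6, 3))   # True  - never seen Sunday 3am
```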
While there aren’t any systems installed in Australia yet, there are plenty in the US, including a system that tracks 150 objects and activities at 12 MTA train stations in San Francisco. Its capabilities include spotting unusual loitering, detecting bags left behind and catching people crossing the tracks.
According to public records, the system is also being installed at the World Trade Center and is the system of choice for the safe city surveillance installation in Houston, Texas. Other successful applications include port security in Louisiana and water treatment plants in El Paso. Ray Davis says AISight is the revolution the market has been waiting for.
“It has no false alarms and it catches the bad guy every time,” he says. “We can even catch behaviour leading up to a crime, whether it be in the minutes leading up to a car theft or the days leading up to a bank robbery.”
I think it’s fair to say the entire video surveillance market would love to see a video analytics solution that works – a solution that energises our huge investment in CCTV. Whether AISight is used to handle all the cameras on a high security site or deployed to sharpen the reflexes of a selection of key camera views on less secure sites, there’s no doubt every security department would love to have this technology at its side.

