Mid-year/EOFY is a good time to think about the state of play in the electronic security and wider technology markets as we move into the second half of another pandemic year.
More than ever, it’s clear that COVID-19 has been both disruptor and catalyst, sharpening the functionality of many of our solutions, clearing the undergrowth around hidden niches, and liberating existing solutions from constraints that were more cultural habit than technical limitation.
Much of what constitutes the ‘new normal’ from the point of view of technology is not new at all – instead, it’s mature technology off leash. Viscous oozings like cloud, face recognition and video analytics, some of which have been our favourite vapourware for a decade and more, are now the subject of operational musings among serious end users and their trusted consultants, suppliers and integrators.
But that’s not the path we want to go down here. We’ve prated on about accelerationism and the 4th industrial revolution long enough. Instead, there’s a wider social new normal that looks increasingly likely to impress itself onto our expositions of security technology. That social new normal will impact the sorts of solutions users want in the future – solutions that are flexible, cybersecure, operationally beneficial, protective of privacy and human rights, and kind to the biosphere.
Something else about this new normal – it’s going to need to take the concerns of users into account in the development pipeline. These concerns are multi-faceted, and COVID has changed their shape. They include fears about unregulated technological monoliths, fears of technology in the hands of government, privacy issues and more, leavened with the business ramifications of lost trust.
The wider new normal is going to include a greater social divide, hybrid education and training models, flexible workspaces and new ways of doing old things. Some pandemic business practices – think remote medical services – will not revert. Proper internet is no longer optional – preferably 250Mbps down and 50Mbps+ up. Think the future won’t be 5G? When 1-2Gbps cellular arrives, all users will want it.
The pandemic has accelerated trends in areas like touchless access control and has placed security solutions, including access control and video surveillance, at the heart of building management systems (BMS). But which aspects or collections of our technology will offer the greatest business opportunity through delivery of real operational value remains unclear. Whatever that value is, it’s unlikely security providers will be the ones who decide.
Something else that’s increasingly common among technology commentators is recognition of the negative potential of pervasive technology applied without constraint. This is less about the real threat of AI-powered automation taking the jobs on which working people depend in countries like Australia and NZ, and more about opaque notions of social control predicated on worst-case scenarios.
The fact that unregulated technology is already used by corporations to discover the preferences of users with every click seems less of a concern to commentators than the application of biometric authentication technologies in closed working environments.
At another level, artificial intelligence is going to empower some workers. When it comes to helping security officers, security operators and security managers make sense of a cavalcade of inputs in real time, AI has a lot to recommend it. But this AI needs a clear use case and must be carefully applied. And, as far as possible, our AI needs to be transparent in its underlying operations.
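As a toy sketch of what ‘transparent in its underlying operations’ might look like in operator assistance, the snippet below builds an alert priority score from named, inspectable factors rather than an opaque model output. The factor names and weights are invented purely for illustration, not drawn from any particular product.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Illustrative factors and weights only - the point is that the operator
# can see exactly why an alert was prioritised.
WEIGHTS = {"after_hours": 3, "restricted_zone": 4, "repeat_event": 2}

@dataclass
class Event:
    site: str
    after_hours: bool
    restricted_zone: bool
    repeat_event: bool

def score(event: Event) -> Tuple[int, List[str]]:
    """Return a priority score plus the human-readable reasons behind it."""
    total, reasons = 0, []
    for factor, weight in WEIGHTS.items():
        if getattr(event, factor):
            total += weight
            reasons.append(f"{factor} (+{weight})")
    return total, reasons

if __name__ == "__main__":
    s, why = score(Event("loading dock", after_hours=True,
                         restricted_zone=True, repeat_event=False))
    print(f"priority {s}: " + ", ".join(why))
```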
In the new normal, many elderly Australians aren’t going into aged care unless they must. That means the monitoring of millions of medical alarms – alongside home automation solutions handling lighting, layered notifications, automated watering and other functions – will have a major role to play. IoT is going to be less about monitoring whitegoods and more about monitoring life safety, health and quality of life – and about informing family of incidents as they unfold in real time.
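To make ‘layered notifications’ concrete, here’s a minimal sketch of how a monitored medical alarm might escalate from a monitoring centre to family in real time. The tier names, acknowledgement windows and notify callbacks are all assumptions for the sake of the example, not any vendor’s implementation.

```python
from dataclasses import dataclass
from datetime import timedelta
from typing import Callable, List

@dataclass
class EscalationTier:
    name: str
    notify: Callable[[str], None]   # push, SMS, voice call, etc.
    ack_window: timedelta           # how long to wait for acknowledgement

def handle_alarm(event: str, tiers: List[EscalationTier],
                 acknowledged: Callable[[str], bool]) -> None:
    """Walk the escalation chain until a tier acknowledges the event."""
    for tier in tiers:
        tier.notify(event)
        # A real system would wait up to tier.ack_window here (timer or
        # callback); this sketch simply asks whether the tier responded.
        if acknowledged(tier.name):
            print(f"Acknowledged by {tier.name}")
            return
    # No acknowledgement anywhere in the chain - fail safe, not silent.
    print("Escalation exhausted: dispatching emergency services")

if __name__ == "__main__":
    tiers = [
        EscalationTier("monitoring_centre",
                       lambda m: print("centre notified:", m),
                       timedelta(minutes=2)),
        EscalationTier("family",
                       lambda m: print("family notified:", m),
                       timedelta(minutes=5)),
    ]
    handle_alarm("fall detected in hallway", tiers,
                 acknowledged=lambda name: name == "family")
```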
You can’t talk about IoT without bumping into privacy, and that’s going to be important, too. When you combine AI and IoT with biological feedback and face recognition in areas like remote medical diagnosis, you really are creating a recipe for privacy heartburn. How these conflicting necessities are managed in secure and responsible ways will be key in the new normal. We read about fears of hacked pacemakers, but from a provider’s perspective the real issues are going to express themselves as a lack of consumer trust and slow uptake of smart services.
The new normal is also likely to see examination of the centralisation of cloud services, especially for mission-critical applications. Governance of AI in surveillance and the establishment of AI ethics – maybe even for home automation – are going to find themselves on the table. We’re also likely to see trends towards data minimisation, data ephemerality and data rights impacting security system parameters.
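As one illustration of what data minimisation and ephemerality look like when expressed as system parameters, the sketch below applies per-class retention windows to stored records and purges anything outside them. The data classes and retention periods are assumptions chosen for the example, not a standard.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Dict, List

# Illustrative retention policy: minimisation and ephemerality become
# explicit, auditable parameters rather than an afterthought.
RETENTION: Dict[str, timedelta] = {
    "video":      timedelta(days=30),   # routine surveillance footage
    "face_match": timedelta(days=7),    # biometric matches kept briefly
    "telemetry":  timedelta(days=90),   # door events, alarms, health pings
}

@dataclass
class Record:
    data_class: str
    created: datetime

def purge_expired(records: List[Record], now: datetime) -> List[Record]:
    """Keep only records still inside their class's retention window."""
    return [r for r in records
            if now - r.created <= RETENTION.get(r.data_class, timedelta(0))]

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    records = [
        Record("video", now - timedelta(days=45)),       # expired
        Record("face_match", now - timedelta(days=2)),   # retained
    ]
    print(len(purge_expired(records, now)), "record(s) retained")
```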
There’s also evidence of the increasing intrusion of (cough) sociologists into discussions around security technology. There can be a disconnect between the sprawling opinions of sociology and the falsifiable mechanisms of empiricism that makes sociology’s baldest assertions hard to bear, especially when they present as emotional reactions to outdated data. But this doesn’t mean we won’t have to address these assertions thoughtfully in the ‘new normal’.
#securityelectronicsandnetworks.com