
Sydney Harbour Tunnel’s Fibre Diet

INSTALLING electronic security systems is a challenging science. There are disparate comms technologies, challenges with power supply, issues with lighting and the vagaries of integrating control. But such considerations pale into insignificance when your workspace is filled with speeding vehicles, tens of thousands of them every day. Making matters more challenging, these 80,000-plus vehicles take precedence in a city whose road system is best described as stretched.
It was precisely this situation that faced Bob Allen, general manager of the Sydney Harbour Tunnel Company, and long-term surveillance contractor Lionel Ascone, managing director of integrator Trantek. Vital to north-south traffic flows in and around Sydney city, the SHT must remain open constantly, the only exceptions being late-night closures at specific times once every 2 months. What this meant for installers was that they had just a few hours at a time to undertake a demanding upgrade to an operational system – not an ideal set of circumstances.
Amplifying the challenges is the physical nature of the site. It’s vast, comprising 3 sections: a pair of 900 metre land tunnels on the north side of the harbour and a pair of 400 metre land tunnels on the south side. Linking these is a 960 metre immersed tube lying 25 metres below the harbour. A key element of the installation is the Sydney Harbour Tunnel Company head office and control room, to which all cameras in the tunnel return for 24-hour monitoring.
Right from the start we should acknowledge the imperative of the surveillance system to the operation of the SHT. Without video coverage of the tunnel and approaches, the tunnel must be closed. This means complete chaos for Sydney’s traffic system.
This imperative framed the character of this surveillance upgrade, which had to incorporate minimal points of failure and be based on proven, future-proof technology. Yet the same imperative inhibits the ease with which any such upgrade can be undertaken. For SHT, change must be incremental and it must have no impact on the live system monitoring traffic in the tunnel. It’s this paradox that lies at the heart of the SHT remote control room installation.
Before we get to the upgrade, however, let’s consider the legacy system installed in the tunnel. It’s analogue, mostly Panasonic fixed cameras (now supported by a number of new Sanyo and Bosch PTZ cameras). In the field these analogue camera signals are run to nearby cabinets where they’re multiplexed onto fibre optic cable for transmission to the main SHT control room in North Sydney. Once legacy camera signals are inside the control room they are recorded in real time on DVRs, as well as being passed to a Philips Allegiant Matrix Switcher allowing display on a video wall.
Operationally there are 40 cameras in each tube, with a camera every 60 metres. According to Allen, from a management perspective these cameras double as the instant location markers of a management system driven by camera numbers.
“For our operators, the most accurate way to assess the location of an event is to look at the camera number you are viewing – that’s because there’s a camera every 60m. As an operator, if you put a camera number into the computer the system will tell you exactly where the camera is in the tunnel – this means you know exactly where in the tunnel you are looking.

“Up till this remote control room went in, if we lost this building the tunnel would have to close. Now we can see what we are doing – that’s what this was all about. Surveillance is the only system we have duplicated – that’s how important it is to us”

“The system also tells you which traffic plan to use to resolve the issue – all from a single camera number – all the procedures relate directly to camera numbers. We call the system that governs the tunnel the Central Control Computer System – it’s a customised SCADA solution.
“The main control room has screens showing layouts of the tunnel, the current traffic plan, message boards, ventilation fans – the top part is southbound and the bottom is northbound. You can see all the crucial systems at a single glance.”
Using the CCCS, operators can select cameras live or recorded, drive PTZs, bring up traffic plans, manage ventilation fans and check current message boards. It’s this management capability that SHT needed to replicate at the remote site so that in the event the SHT building was damaged and the main control room could not be used, SHT staff could quickly walk to the remote location and activate the second control room.
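Allen’s point about camera numbers doubling as location markers is simple to picture. The sketch below is illustrative only – the numbering scheme, the chainage origin and the function name are assumptions built from the figures quoted in this article (40 cameras per tube, one every 60 metres), not the actual CCCS logic.

```python
# Illustrative only: map a camera number to an approximate position in one tube,
# assuming 40 cameras per tube spaced 60 m apart (the figures quoted above).
# The numbering scheme and chainage origin are assumptions, not the real CCCS logic.

CAMERA_SPACING_M = 60       # one camera every 60 metres
CAMERAS_PER_TUBE = 40       # 40 cameras in each tube


def locate_camera(camera_number: int) -> dict:
    """Hypothetical scheme: cameras 1-40 southbound, 41-80 northbound,
    numbered outward from each tube's portal."""
    if not 1 <= camera_number <= 2 * CAMERAS_PER_TUBE:
        raise ValueError(f"unknown camera number: {camera_number}")
    tube = "southbound" if camera_number <= CAMERAS_PER_TUBE else "northbound"
    index_in_tube = (camera_number - 1) % CAMERAS_PER_TUBE
    return {
        "camera": camera_number,
        "tube": tube,
        "metres_from_portal": index_in_tube * CAMERA_SPACING_M,
    }


# An operator keys in camera 23; the system answers with tube and chainage,
# and the relevant traffic plan could be looked up from the same number.
print(locate_camera(23))   # {'camera': 23, 'tube': 'southbound', 'metres_from_portal': 1320}
```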
When security managers and integrators think about adding an additional control room to a site like a major sports ground or an industrial facility, they tend to assume the job is straightforward in these days of networkability. But for SHT, things were not so simple.
“When we put the remote control room in we couldn’t just take a feed from here and run it over to the remote location because if this building is lost then so is our matrix switch,” Allen explains.
“Instead what we needed to do was run another batch of fibre optic cables that would allow us to take a separate feed from the field cameras to the remote control room. Another part of this upgrade was installing some new PTZ cameras inside the tunnel, which we didn’t have before, so this had to be factored in, too.”
According to Allen, the remote control room is a space within a structure SHT already uses to support and service the tunnel. The location is close enough so that staff can get there quickly in order to re-open the tunnel should the main control room be off-line for any reason.
“We say it might take a couple of hours to move to the new control room and have the tunnel up and running,” says Allen. “The fact is the tunnel cannot remain open without CCTV cameras operating so this is a key service. From a management perspective, if you can’t use the cameras you can’t see traffic, over-height trucks and you can’t control smoke. The tunnel would have to be closed if operators were blind – no ifs, ands or buts – it would simply have to be shut.
“Up till this remote control room went in, if we lost this building the tunnel would have to close. We did have separate points where you could plug in a control computer which would give you control of some of the facility but you still can’t see what you are doing inside the tunnel.
“Now, with the installation of the remote control room, we can see what we are doing – that’s what this was all about. Surveillance is the only system we have duplicated – that’s how important it is to us.”

System design

Central to the installation of the remote control room was an upgrade of the comms network that supports the overall system. While the existing control room was supported by analogue fibre, the new solution needed to be digital – existing cameras had to be linked to encoders and these encoders needed to be connected to a new fibre optic network.
According to Trantek’s Lionel Ascone, the upgraded system is modular and designed for maximum simplicity and robustness. A key element of the upgrade was that it had to be entirely separate from the existing solution until the point of crossover – there could be no disruption of live surveillance operations.
“On the legacy side of the SHT surveillance system, field cameras go to concentrators, then to multiplexers and then come back here over fibre to an analogue switcher – that’s the existing system,” he explains.

“In terms of network layout, the fibre backbone is a single logical ring with a capacity of 2Gbps. This backbone is made up of 2 fibre rings so if you lose one, the other remains available”

“With the alternative digital system every single legacy camera as well as the new PTZs goes to encoders located in cabinets inside the tunnel. From there they go onto a fibre backbone that is available at the remote site where the cameras can be monitored, controlled and recorded. Everything that goes to the remote site is digital.
“In terms of network layout, the fibre backbone is a single logical ring with a capacity of 2Gbps. This backbone is made up of 2 fibre rings so if you lose one, the other remains available,” says Ascone. “The fibre goes around the whole tunnel in a ring, starting here and going north to the main control room. It then comes out of the main control room and goes into the southbound tunnel through existing trenching and racks and it breaks out into every control point in the tunnel.
“Along the fibre backbone are 7-8 core switches, and equipment like encoders connects into these core switches carrying video signals. There’s another set of links in between each core switch, and every piece of equipment is connected to 2 core switches for redundancy.
“This means the system can afford to lose a core switch and nothing will happen and it can afford to lose half the fibre and nothing will happen. It’s also possible to lose any 2 fibre links and up to 2 sets of core switches so it’s very robust.
“The only single point of failure in the whole system is the encoder itself but it’s a single point of failure to cameras only. We can lose up to 2 cameras if an encoder fails but those 2 fixed cameras will be overlapped by a PTZ so we can turn the PTZs to cover any area we lose – the system is designed for this.”
Clearly then, the network layout is extremely solid and it’s designed not just to support the existing demands of tunnel operation but to be future-proof. Over time, the entire system will migrate to a digital solution in both control rooms, leveraging the upgraded fibre solution that is now in place.
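Ascone’s description of the redundancy – a backbone built from 2 fibre rings with 7-8 core switches and every device dual-homed – can be sanity-checked with a simple graph model. The sketch below is illustrative only: the switch names, the choice of 8 switches and the failure scenarios are assumptions, not SHT’s actual topology.

```python
# Illustrative only: model the dual fibre ring described above as a graph and check
# that the backbone stays connected after failures. The switch names, the choice of
# 8 switches and the failure scenarios are assumptions, not SHT's actual topology.
from collections import defaultdict

SWITCHES = [f"sw{i}" for i in range(8)]     # "7-8 core switches" along the backbone
CONTROL_ROOM = "sw0"                        # assume sw0 serves the control room end

def build_adjacency(failed_links=(), failed_switches=()):
    """Two parallel fibre rings joining adjacent switches, minus failed elements."""
    adj = defaultdict(set)
    for ring in ("ring-a", "ring-b"):       # the backbone is made up of 2 fibre rings
        for i, sw in enumerate(SWITCHES):
            nxt = SWITCHES[(i + 1) % len(SWITCHES)]
            if (ring, sw, nxt) in failed_links or {sw, nxt} & set(failed_switches):
                continue
            adj[sw].add(nxt)
            adj[nxt].add(sw)
    return adj

def reachable(start, adj):
    """Plain graph search: which switches can still see `start`?"""
    seen, stack = {start}, [start]
    while stack:
        for nbr in adj[stack.pop()]:
            if nbr not in seen:
                seen.add(nbr)
                stack.append(nbr)
    return seen

# Lose any 2 fibre links on one ring: every switch still reaches the control room.
adj = build_adjacency(failed_links={("ring-a", "sw2", "sw3"), ("ring-a", "sw5", "sw6")})
assert reachable(CONTROL_ROOM, adj) == set(SWITCHES)

# Lose a core switch outright: an encoder dual-homed to sw3 and sw4 still has a
# live path back to the control room through its second home switch.
adj = build_adjacency(failed_switches={"sw3"})
assert "sw4" in reachable(CONTROL_ROOM, adj)
print("redundancy checks passed")
```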

The installation

As we all know, talking about system design is one thing; installing a major infrastructure solution like this, which involved pulling new fibre and re-terminating every single camera in the system to new digital encoders, was something else again.
“The most important element of this application, and the most challenging aspect of the job, was that the tunnel had to remain open – that was the hardest thing for me,” explains Allen.
“We had to run the fibre optic cable, we had to break into the cameras and get them into the secondary control room racks in the remote location and this could only be done in a 5-hour window on certain nights of a single week, bi-monthly.
“And it could only be done without interfering with the operation of the existing CCTV system. That’s why it took so long to complete this installation – it took 2 years because of the challenges of accessing the tunnel.”
Ascone agrees tunnel access was the key.
“Obviously, the most time-consuming and difficult part of an installation like this was running fibre and getting all the connections done,” he explains. “We could only shut the tunnel for 5 consecutive nights in each direction once every 8 weeks so it had to be scheduled in advance and this made the upgrade much more difficult.
“As well as the scheduling difficulties there’s the actual time you have to work. On the appointed days, the tunnel is closed between ten and eleven o’clock at night and you have to get your team together and get all your material ready to go in,” he explains.
“By the time you go in it’s pretty much midnight and then by 4am you have to start packing up – if you are not coming out by 4.30 there’s a lot of pressure on to get out – it’s stressful.
“ODG Data installed the fibre optic cable and at the height of the installation there were about 20 techs working in this way. They’d come in and do what they could then get out and then we wouldn’t see them for 8 weeks as we had to wait for the next opportunity to work in the tunnel,” Ascone says.
There were also new cameras to install, not just inside the tunnel but viewing the southern approaches.
“Something else we had to consider was that we couldn’t reach all the PTZ cameras on the far side of the tunnel with the new cabling – it was impossible to install the necessary infrastructure over there – so we had to put in additional cameras, at points we could reach with fibre, to cover those south-side scenes,” says Allen.
There were decisions to be made about the new cameras, according to Ascone.
“The new PTZs are analogue – Sanyo and Bosch,” he says. “We wanted to put in some megapixel cameras but because we had to be compatible with the rest of the system, which is analogue, we decided to stick with lower cost analogue cameras until the rest of the system is fully digital and then we’ll swap out the cameras – uptime is the key thing.”

The remote control room

For many security operations, finding and fitting out a remote control room that is both geographically accessible and highly secure could pose difficulties, but Sydney Harbour Tunnel was able to access a site nearby. The remote control room isn’t pretty but it’s functional and there’s plenty of space.
As part of my tour of the remote control room we dive into a labyrinthine world of concrete tunnels and steel doors. Completely isolated from other operations, the remote control room is ideal. According to Allen there was no need to install any infrastructure – staff facilities like bathrooms were in place, which kept costs down.
“Gary Payseno, our operations manager, and I nutted out the basics of the control room,” Allen explains. “There was a fair thought process that went into this facility, trying to work out where everything had to go. It’s a shared space so we had to move things around and cut some holes in the concrete wall so we could connect all the systems.
“We had to think about which cameras we wanted to display – we did not want to put a whole new video wall in, so we had to think about the biggest monitor we could buy that would display the cameras we needed.”
We approach the control room through an adjacent room in which there are racks containing the hardware devices driving the new digital surveillance system and recording the video. The storage array is RAID 5 and it’s all off-the-shelf hardware and network switches.
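For context on that choice: RAID 5 stripes the recorded video across the array’s disks with the equivalent of one disk given over to parity, so an array of N drives yields roughly N-1 drives of usable capacity and keeps recording through a single drive failure – with, say, 8 x 2TB drives (an illustrative figure, not SHT’s actual configuration), that’s about 14TB usable.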
In the remote control room there are 2 large LCD screens mounted on a bare concrete wall in front of which is a long desk comprising 2 operator workstations – the layout is spare but functional. A pair of control units along with control keyboards is located on the operator’s desk.
The control room itself is an excellent space with a large free wall for display. The video wall’s 2 monitors are a good size for the designated space. Each is designed to display a maximum of 32 camera images. One side is dedicated to the southbound tunnel, the other to northbound. As I sit there, Ascone drives the system, dragging cameras from a camera tree on the left of the workstation screen and dropping them onto tiles for display.
He steers a new Bosch AutoDome PTZ – the image is good – there’s a surprising amount of light in the tunnel – fluoro – and the camera is handling it well. You aren’t getting license plates with ceiling-mounted analogue cameras, obviously, but things are clear and crisp.
Latency isn’t noticeable and there’s no blooming or smearing of images despite the headlights. When we switch to another camera I can see there’s a build-up of soot on the housing window that’s having an impact on the picture – that’s one of the challenges of an install like this one. It’s dirty in a road tunnel and you can’t readily clean housings.
As part of the tour, we view a recent exercise in which the tunnel was closed and a car burned inside to test emergency systems including smoke dispersal and emergency procedures. It’s a graphic display of the threats and capabilities of a major tunnel like this one. In the test the system works well – the smoke is rapidly hauled to the nearest ventilation duct.
Allen, Payseno and Ascone are excited by the procedure, by the way their systems confront the threat. The whole team live and breathe the tunnel. Allen and Payseno worked as operators from the tunnel’s very first days of operation back in 1992 and their operational knowledge is intuitive and deep.
I’m intrigued by the management solution driving this system because I’ve never seen anything like it before. These days it’s common to see one of perhaps a half-dozen major video management solutions on a workstation and to be instantly apprised of the functionality of an entire application through familiarity with other solutions. SHT is different.

“We had to run the fibre optic cable, we had to break into the cameras and get them into the secondary control room racks in the remote location and this could only be done in a 5-hour window on certain nights of a single week, bi-monthly”

“We use a Cware recording system to handle storage and we use VMTs, our own product, to select cameras to monitors – we designed this system ourselves,” explains Ascone. “Operators use a keyboard which does switching over the network. It offers the same function as an analogue switcher but it uses network video. It directs a camera connected to an encoder to a decoder and onto a digital monitor, instead of relying on software decoding.”
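The idea of switching by re-pointing hardware decoders rather than decoding video in software at a workstation can be sketched in a few lines. The example below is illustrative only – the class names, stream URLs and methods are assumptions, not Trantek’s actual VMT product.

```python
# Illustrative sketch of network-based switching as described above: instead of a
# workstation decoding video in software, a routing step tells the hardware decoder
# behind each monitor which encoder stream to pull. Class names, URLs and methods
# are assumptions for illustration, not Trantek's actual VMT product.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Decoder:
    monitor: str                      # the digital monitor this decoder drives
    current_stream: str | None = None

    def tune(self, stream_url: str) -> None:
        # A real decoder would open the network stream here; we just record it.
        self.current_stream = stream_url
        print(f"{self.monitor}: now decoding {stream_url}")

@dataclass
class VideoSwitch:
    """Keyboard-driven switching: camera number in, decoder re-tuned, no PC decoding."""
    encoders: dict[int, str]                        # camera number -> encoder stream URL
    decoders: dict[str, Decoder] = field(default_factory=dict)

    def select(self, camera: int, monitor: str) -> None:
        # Point the decoder that feeds `monitor` at the encoder carrying `camera`.
        self.decoders.setdefault(monitor, Decoder(monitor)).tune(self.encoders[camera])

switch = VideoSwitch(encoders={23: "rtsp://encoder-06/cam23", 57: "rtsp://encoder-15/cam57"})
switch.select(23, "wall-left")     # operator keys camera 23 to the left screen
switch.select(57, "wall-right")    # and camera 57 to the right screen
```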
While many sites use software management systems that require payment of license fees per channel on an annual basis, Allen makes it clear he doesn’t like this business model.
“License fees for cameras are very expensive when you’re supplying management solutions on an annual basis – it can be thousands of dollars per camera – it’s not worth it if you have a hundred or more cameras,” he says.
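To put rough, purely illustrative numbers on that: at even $1,000 per channel per year, a site with 100-plus cameras would face $100,000-plus in annual licensing, and at the thousands of dollars per camera Allen describes, the bill runs well into the hundreds of thousands.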
Instead, SHT uses customised software applications that are robust and easy to operate. Watching Ascone work his system I can see it doesn’t have all the bells and whistles other solutions boast, but the vital functionality is right there without the fussy layering you sometimes see.

Conclusion

Sydney Harbour Tunnel’s remote control room upgrade provides a fascinating look at the challenges facing system designers, integrators and managers of major infrastructure sites. It’s a hybrid solution, heavy on the analogue side – it’s not a greenfield site and it’s not a replacement.
Instead, the existing system had to continue to function flawlessly throughout the application of a secondary comms solution which is itself designed to form the plinth of SHT’s digital future. Comprehend this and you can see both the difficulty and the vital importance of this upgrade to SHT’s long-term operations.
Allen is justifiably pleased with the system and he has no doubt of its importance to his city.
“You have to look at this upgrade as a community benefit – it is designed purely to look after Sydney’s road users – the key idea is that if there is a functional tunnel we want to be able to use it if there is ever a problem with the control room,” he explains.
“The upgrade took a long time to complete but it’s done now and it was done with no outages causing closures of the tunnel. If there had been an outage it would have been a total disaster – I’m not trying to exaggerate this – but if there was an incident in the tunnel, traffic across the city would come to a standstill – it would be that bad.”
The only momentary downtime associated with the upgrade occurred as techs set each camera up. In those moments there was a tiny, controlled outage of about 30 seconds, one camera at a time. Ascone says this was necessary as the installers had to take out the existing feeds and plug them into the new hardware devices – it was a straight swap of the cable.
As I’m leaving, Ascone points out that while the remote control room is now completed, the long process of upgrade to digital at SHT is by no means over.
“The next stage is that the digital network will come to the main control room here in North Sydney,” Ascone explains. “It’s an ongoing project, remember.”
By John Adams

