It literally took an earth-shattering event for the University of Auckland to implement a meshed network fabric, but now that it has one it can help researchers predict when and where the next major rumbles are likely to strike the Shaky Isles.
James Harper, the university’s associate director of operations, says it was already considering replacing a traditional three-tiered “tree” network, but the earthquakes that devastated the country last year hastened the decision to spread its data centre eggs across more than one basket.
Although the university doesn’t officially acknowledge the quakes as the trigger for installing Juniper’s QFabric, he says they played on people’s minds.
“The catalyst for QFabric is we recently completed construction of a new data centre at Tamaki, a satellite campus 10 kilometres away,” Harper explains.
“We had two data centres about 300 metres from each other: that posed significant risks for us. It was something we’ve been planning for a long time but [the quakes] certainly reminded senior management of the importance of business continuity.”
The fabric has enabled Auckland University to play service provider to other institutions and e-science researchers, including helping New Zealand’s eScience Infrastructure researchers with their earthquake prediction simulations.
“It’s not just a straight replacement [of the network]; we’re seeing a lot of investment into high-performance computing,” Harper says.
Auckland University is playing host to sector-wide initiatives throughout New Zealand as well as supporting a number of Crown Research Institutes (CRIs), separately run government research groups with specific scientific goals.
“We were looking to put in a new structure in the data centre to support high-performance, highly scalable services we expect to see over the next 10 years,” Harper says.
That saw top-of-rack switching replaced with a simpler 10Gb Ethernet QFabric deployment at the Auckland and Tamaki campuses, with a fibre backbone for processing, failover and disaster recovery. Harper says the fabric has “deterministic latency”: researchers know how long a data packet will take and how many hops it will traverse, which is essential for intensive computation.
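A back-of-the-envelope sketch shows why that determinism matters. The figures below are hypothetical rather than the university’s own measurements, but with a fixed hop count and a known per-hop forwarding delay, the worst-case latency of any path can be bounded before a job is ever scheduled:

```python
# Illustrative only: with a deterministic hop count and a known worst-case
# forwarding delay per hop, a path's latency can be bounded up front.
# The numbers below are hypothetical, not University of Auckland figures.

PER_HOP_LATENCY_US = 5.0      # assumed worst-case switch forwarding delay (microseconds)
FIBRE_DELAY_US_PER_KM = 5.0   # roughly 5 us per km of fibre (speed of light in glass)

def worst_case_latency_us(hops: int, fibre_km: float) -> float:
    """Upper bound on one-way latency for a path with a known hop count."""
    return hops * PER_HOP_LATENCY_US + fibre_km * FIBRE_DELAY_US_PER_KM

# A single-tier fabric keeps the hop count constant no matter which two
# servers are talking, so the bound is the same for any pair of ports.
print(worst_case_latency_us(hops=1, fibre_km=0.1))   # within one data centre
print(worst_case_latency_us(hops=1, fibre_km=10.0))  # across the dark fibre to Tamaki
```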
Although there are risks in stretching a Layer-2 technology this way, Harper notes that technology Juniper developed for QFabric allows the university to run the two fabrics as one.
“We wanted to deploy something in one data centre and know it would work the same way. We have dark fibre between the two so we had no shortage of bandwidth.”
He and his team have kept resources connected because workloads retain their internet protocol addresses as they migrate between data centre virtual local area networks.
“Previously that meant you had traffic going from one data centre to the other [unnecessarily]. Now traffic stays local and traverses the dark fibre if it needs to get to a server at the other data centre.”
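As a rough illustration (the subnet and addresses below are invented, not the university’s), a stretched Layer-2 domain means both sites share the same subnet, so a workload can move between them without being renumbered and nothing that references its address has to change:

```python
# Illustrative only: when one VLAN/subnet spans both data centres, a migrating
# workload keeps its IP address, so existing sessions and configuration that
# point at it remain valid. The addresses here are hypothetical.
import ipaddress

STRETCHED_SUBNET = ipaddress.ip_network("10.20.0.0/16")  # hypothetical VLAN spanning both sites

def migration_preserves_address(vm_ip: str) -> bool:
    """A workload can move between the two sites without re-addressing as long
    as its IP sits inside the subnet stretched across them."""
    return ipaddress.ip_address(vm_ip) in STRETCHED_SUBNET

print(migration_preserves_address("10.20.14.7"))   # True: migrate freely, sessions survive
print(migration_preserves_address("192.168.1.9"))  # False: would need re-addressing
```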
And because the institution knows the upper latency bound, network-aware applications, such as those used by researchers, become simpler to build and tune.
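One practical consequence, sketched below with hypothetical figures, is that application timeouts can be derived from the network’s guarantee rather than guessed from observation:

```python
# Illustrative only: if the worst-case one-way network latency is known, an
# application deadline can be calculated instead of estimated. The bound and
# processing budget below are hypothetical.

NETWORK_LATENCY_BOUND_MS = 0.5    # assumed worst-case one-way latency across the fabric
SERVER_PROCESSING_BUDGET_MS = 50  # assumed time the remote service needs to respond

def rpc_timeout_ms(attempts: int = 2) -> float:
    """Deadline for a request/response exchange: two network traversals plus
    processing time, allowed for each attempt."""
    per_attempt = 2 * NETWORK_LATENCY_BOUND_MS + SERVER_PROCESSING_BUDGET_MS
    return attempts * per_attempt

print(rpc_timeout_ms())  # a defensible timeout rather than a guessed one
```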
Harper estimates the fabric cut power costs by up to 95 percent and, “from my point of view, the cost per port and scalability was compelling”. And because the university was an existing Juniper customer with a number of switching units already installed, its IT team was already familiar with the Junos operating system.
Sky high
One of Auckland University’s more notable service provider customers is the Auckland University of Technology, which is using Harper’s data centres to analyse results from the Square Kilometre Array (SKA), the radio telescope project peering into space from sites across South Africa, Australia and New Zealand. The work is creating an explosion of data up and down the country that will see the UoA move to 40Gb Ethernet, Harper says.
Auckland University also runs a parallel Fibre Channel storage network knitted together with Cisco switches, but there are no plans to collapse it into the internet protocol-based Juniper network, he says.
“Fibre channel gives you resilience, load balancing, the way you just plug it in and it sorts itself out and seeing those features pop in on a QFabric solution is [excellent]. It’s taken far too long for that to happen. QFabric would support Fibre Channel if we went in that direction.”
It took Harper’s team and Juniper engineers two weeks to install the fabric. Harper estimates the data centre now has about 5 percent of the cable runs it used previously, reduced to about 1.6 kilometres.
Back on the other side of the ditch, Fox Sports has deployed Juniper’s fabric with the help of Sydney reseller ICT Networks. The broadcaster generates 30 terabytes of data a weekend and a petabyte every six months, and with a move to a new data centre at Gore Hill on the cards next year, it needed a network that could handle that explosive data growth.
Fox Sports estimates the Junos-based QFabric solution is about two years ahead of anything from Cisco, its previous network switch provider, chief technology officer Michael Tomkins told The Australian.
“This is a distributed architecture, so I can have parts of it in different racks and I can allocate the switch and the high bandwidth where it’s required,” Tomkins said. “It gives me redundancy, flexibility and [Juniper’s Junos network operating system] . . . management of the platform is much simpler. The throughput is better than I can get anywhere else.”
Juniper Networks A/NZ vice-president Mark Iles says the single-layer approach, with one way to look at switching, is attractive to end customers; other architectures leave system administrators corralling hundreds or thousands of switches, he says.
“We do away with the three hops, where you inspect the data at the access, aggregation and core layers to work out where it needs to go; that makes for a high-density environment that’s very complex to manage,” Iles says. “QFabric connects any device to any other device and we can do that for upwards of 6000 10Gb Ethernet ports.”
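A schematic comparison, using made-up topology assumptions rather than Juniper’s figures, shows the difference in switch hops a frame crosses when two servers in different racks talk:

```python
# Illustrative only: rough hop counts for server-to-server traffic in a classic
# three-tier tree versus a single-tier fabric that behaves like one big switch.
# The topology assumptions are schematic, not vendor specifications.

def three_tier_hops(same_rack: bool, same_aggregation_block: bool) -> int:
    """Worst case in a tree: access -> aggregation -> core -> aggregation -> access."""
    if same_rack:
        return 1        # traffic stays on the top-of-rack switch
    if same_aggregation_block:
        return 3        # access -> aggregation -> access
    return 5            # full trip up to the core and back down

def fabric_hops() -> int:
    """Any port to any port across the fabric looks like a single switching hop."""
    return 1

print(three_tier_hops(same_rack=False, same_aggregation_block=False))  # 5
print(fabric_hops())                                                   # 1
```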
The fabric is implemented in stages alongside a customer’s existing infrastructure, Iles explains: “Use the equipment you have now and start to migrate as you grow, so it’s not a rip and replace.”
Iles emphasises that Juniper relies on the channel for its income and has invested accordingly in partner training, inviting resellers to bring customers into its data centre to see the fabric at work. It also helps resellers model customer needs to pick the best time to move to a fabric, which is helpful if they are nervous about write-downs.