As we go into a week in which investigations are being carried out by Ofgem (the national regulator), National Grid (the national system operator and transmission network) and the generating companies (those who run the power plants), this is a good time for us, as a data centre operator, to reflect on what happened.
 
The simple answer: nothing. Customers in our data centre didn't see any change in the power supplied to them or in the operation of their services. Because we have resilient UPS (Uninterruptible Power Supply) systems and on-site diesel generators, whenever we see something happen on the National Grid that could impact our power supply, our automated systems fire up the generators to ensure a consistent and continuous supply.
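
To make that automation a little more concrete, here is a minimal sketch of the kind of check and response involved: sample the incoming supply and, if it drifts outside acceptable limits, start the generators and transfer the load. The names, structure and thresholds below are assumptions chosen for illustration, not a description of our actual control system.

```python
# Illustrative sketch only: the class, function names and thresholds are
# assumptions for the example, not our actual control software.

NOMINAL_HZ = 50.0
TOLERANCE_HZ = 0.5  # normal operating band is roughly +/- 0.5Hz


class Generators:
    """Hypothetical stand-in for the generator and transfer-switch controls."""

    def start(self) -> None:
        print("Starting diesel generators...")

    def transfer_load(self) -> None:
        print("Transferring load onto generators.")


def grid_supply_healthy(frequency_hz: float, voltage_ok: bool) -> bool:
    """Is the incoming mains supply within acceptable limits?"""
    return abs(frequency_hz - NOMINAL_HZ) <= TOLERANCE_HZ and voltage_ok


def on_mains_sample(frequency_hz: float, voltage_ok: bool, generators: Generators) -> None:
    """Handle one reading from the incoming supply.

    If the grid looks unhealthy, the UPS carries the load on its batteries
    while the diesel generators start, and the transfer switch then moves
    the load onto them.
    """
    if not grid_supply_healthy(frequency_hz, voltage_ok):
        generators.start()          # UPS bridges the gap while the engines spin up
        generators.transfer_load()  # automatic transfer switch takes over


# Example: a reading of 49.3Hz is outside the +/- 0.5Hz band, so we fail over.
on_mains_sample(49.3, voltage_ok=True, generators=Generators())
```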

How are we confident our backup systems will work?

Once a month, we carry out a full building blackout test and run on these systems for at least an hour. This test gives us confidence that every part of the automatic chain, from detection systems and transfer switches to UPS systems and generators, performs exactly as expected, with no panic or cause for concern. We also have an extensive PPM (Planned Preventative Maintenance) programme to ensure all of the equipment is in the best possible condition when it is needed.

What did we observe?

At precisely 16:53:39 we saw the frequency (normally ~50Hz) on our main grid connection drop below the normal tolerance of +/- 0.5Hz. This has been backed up by the reports seen nationally and by our peer operators, with whom we share information relevant to operational effectiveness.

Why did we see a fluctuation in frequency?

The frequency of electricity is tightly controlled, to within 0.5Hz of the standard 50Hz. The frequency reflects the balance between generation and demand: too much generation and the frequency can increase, while more demand than generation will pull the frequency downward. On Friday, it was reported that two power plants (referred to as power generators) had problems near-simultaneously: first a gas power plant at Little Barford providing ~650MW of capacity, and then the Hornsea Wind Farm providing ~1,200MW. That loss of ~1,850MW of generating capacity represents ~6.4% of the demand of ~29,000MW at the time. That is very significant, and the speed at which those two power plants became unavailable meant that the frequency of the system began to drop.
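
As a quick back-of-the-envelope check of those figures, using the approximate numbers quoted above:

```python
# Rough check of the figures above, using the approximate values quoted.
little_barford_mw = 650    # ~650MW gas power plant
hornsea_mw = 1200          # ~1,200MW wind farm
demand_mw = 29000          # ~29,000MW national demand at the time

lost_mw = little_barford_mw + hornsea_mw
print(f"Lost generation: ~{lost_mw}MW, ~{lost_mw / demand_mw:.1%} of demand")
# -> Lost generation: ~1850MW, ~6.4% of demand
```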

Why was there so much disruption to the country?

While we didn't experience any issues at our data centres, thanks to our resilient backup systems, the disruption did affect a number of our employees who were travelling home on Friday evening! There is a designated protocol referred to as 'demand control': disconnecting higher-usage customers from the network, something you may also have heard called load shedding. For large power users in the UK, there is a set response that automatically disconnects their load from the electricity supply should the frequency fall below a set threshold, defined in a technical standard referred to as G59/2. I'm sure the relevance of this protocol, and its impact on the nation as a whole, will be reviewed. For protecting the overall electricity network, disconnecting part of the railway network makes a lot of sense; however, the impact of that disconnection is heavy compared with, say, disconnecting data centres that have appropriate backup power systems in place.
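
Conceptually, that automatic response comes down to a simple under-frequency check, sketched below. The 48.8Hz trip point here is an assumed value for illustration only; the real settings are specified in the applicable technical standards.

```python
# Illustrative sketch of frequency-based demand disconnection (load shedding).
# The 48.8Hz trip point is an assumption for illustration; real settings come
# from the relevant technical standards.

SHED_THRESHOLD_HZ = 48.8


def should_disconnect(frequency_hz: float) -> bool:
    """Large consumers disconnect automatically if the frequency falls too far."""
    return frequency_hz < SHED_THRESHOLD_HZ


# A protection relay polling the measured frequency might behave like this:
for reading_hz in (49.9, 49.2, 48.7):
    if should_disconnect(reading_hz):
        print(f"{reading_hz}Hz: trip - disconnect load from the grid")
    else:
        print(f"{reading_hz}Hz: stay connected")
```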
 
Ultimately, all of this will go through an extensive review by all of the interested parties, and I hope it will guide future policy decisions on protecting the network. These rare events do occur (the last major incident was 11 years ago, in 2008) and they act as a huge learning point for everyone involved.