
Coronavirus: 'Worrying' questions remain after technical error blamed by PHE for 15,000 missing cases, expert says

Following the revelation that some 15,000 overlooked coronavirus test results had been added to the weekend’s totals, experts have warned the seemingly “fundamental” IT error raises “worrying” questions over the government’s past and future handling of data.

The explanation prompted concern that additional cases may remain overlooked, with one expert comparing the system apparently used to collate testing data to “a similar architecture you used to see in banks 30 years ago”.

And as officials grappled with drastically altered local infection rates, a data scientist warned the blunder’s possible “knock-on effects” on the already strained contact-tracing system could have a “substantial influence on the generation of new cases”.

After the government’s Covid-19 Dashboard showed record daily caseloads of 12,872 and 22,961 on Saturday and Sunday respectively, officials said the stark rise was due to an IT glitch discovered on Friday evening.

As a result, 15,841 cases were included that had previously been left out of the totals announced between 25 September and 2 October.

While the revelation prompted furious and still unanswered questions over how many possibly infected people had been missed by contact tracers as a result of the glitch, Public Health England (PHE) sought to clarify the cause of the error.

The “technical issue” was that some digital “files containing positive test results exceeded the maximum file size” accepted by the government’s central computer systems, officials said.

A “rapid mitigation” had been put in place to split large files, while “a full end-to-end review of all systems” had been ordered “to mitigate the risk of this happening again”, PHE said, adding that there were already a number of automated and manual checks within the system.
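
PHE has not published details of either the failure or the fix, but a minimal sketch of the kind of mitigation it describes, splitting an oversized batch file into chunks below a size cap before submission, might look like the following (the one-record-per-line format, the MAX_BYTES limit and every name here are illustrative assumptions, not PHE’s actual code):

```python
# Illustrative sketch only: PHE has not published its actual code.
# Splits an oversized batch file of test results (assumed here to be
# one record per line) into chunk files that each stay under an
# assumed maximum accepted size.

from pathlib import Path

MAX_BYTES = 5_000_000  # assumed cap; the real limit has not been disclosed


def split_batch(path: Path) -> list[Path]:
    """Rewrite `path` as numbered chunk files, each under MAX_BYTES."""
    chunks: list[Path] = []
    buffer: list[str] = []
    size = 0
    with path.open(encoding="utf-8") as src:
        for line in src:
            line_bytes = len(line.encode("utf-8"))
            if buffer and size + line_bytes > MAX_BYTES:
                chunks.append(_flush(path, buffer, len(chunks)))
                buffer, size = [], 0
            buffer.append(line)
            size += line_bytes
    if buffer:
        chunks.append(_flush(path, buffer, len(chunks)))
    return chunks


def _flush(path: Path, lines: list[str], index: int) -> Path:
    out = path.with_suffix(f".part{index}{path.suffix}")
    out.write_text("".join(lines), encoding="utf-8")
    return out
```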

But a computer science expert said the explanation raised a “whole series of unanswered questions, some of which are quite worrying”.

The statement suggests that the mistake was only picked up as officials audited or reconciled previous counts of results, and that computer systems did not issue basic alerts that some results had been rejected by the central server.

“It’s quite an oddity because it only seems to have been found out by happenstance. It’s not that the system was warning them,” Alan Woodward, visiting professor at the University of Surrey’s Department of Computer Science, told The Independent.
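
Nothing public describes how the files were transferred, but the kind of basic warning Prof Woodward has in mind is simple to sketch: a sender that refuses to treat a batch as delivered unless the receiving system positively accepts it. Everything below, including the endpoint, is hypothetical:

```python
# Hypothetical sketch: how the PHE pipeline actually submits files is
# not public. A sender that treats anything other than an explicit
# acceptance as a failure and raises an alert, rather than letting
# rejected results vanish silently.

import urllib.error
import urllib.request

INGEST_URL = "https://example.invalid/ingest"  # placeholder endpoint


def submit_batch(payload: bytes) -> None:
    request = urllib.request.Request(
        INGEST_URL,
        data=payload,
        headers={"Content-Type": "text/csv"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(request, timeout=30) as response:
            if response.status != 200:
                raise RuntimeError(f"batch rejected: HTTP {response.status}")
    except urllib.error.URLError as exc:
        # Surface the failure loudly instead of dropping data on the floor.
        raise RuntimeError(f"batch not delivered: {exc}") from exc
```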

This appeared to raise the possibility that test results had been slipping through the gaps for weeks, and thus omitted from official counts in the past.

“It is a possibility, and that’s one of the really worrying things about this, are they going back to look?” Prof Woodward said. “I mean, how big are these files?”

He added: “If this [explanation] is plausible, what’s also plausible is the fact that it may have been happening before, so were the numbers higher all along?

“But then presumably somebody would have caught it earlier and, if they didn’t catch it, does that mean they weren’t reconciling things to make sure all the tests were actually counted?”

Asked whether smaller quantities of test results could have been slipping under the radar for weeks, a PHE spokesperson told The Independent: “Smaller numbers wouldn’t have triggered this problem. It’s not that the individual files are too big, it’s that when they’re all grouped together and large numbers are reported simultaneously, that can become too big. 

“I’m not sure how big the individual files are, but I’m reassured that the problem is with bundling them together rather than the actual file size. 

“It’s also important to note that this isn’t the only place where there’s a record of [the results] — the individual trusts who perform the tests keep a record which can be … cross-referenced.”
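
That point, that trust-level records exist alongside the central count, suggests a straightforward reconciliation check. As a hedged sketch (the figures and schema below are invented for illustration; PHE’s actual systems are not public), summing per-laboratory totals and comparing them with the central figure would surface a shortfall like this one:

```python
# Illustrative reconciliation check; PHE's real schemas are not public.
# Compares the centrally recorded daily total against the sum of
# per-laboratory totals and reports any shortfall.

def reconcile(central_total: int, lab_totals: dict[str, int]) -> int:
    """Return the shortfall (0 if the counts agree)."""
    shortfall = sum(lab_totals.values()) - central_total
    if shortfall > 0:
        print(f"WARNING: {shortfall} results missing from central count")
    return shortfall


# Invented example: three labs report 4,000 results between them, but
# only 3,200 reached the central system.
reconcile(3_200, {"lab_a": 1_500, "lab_b": 1_300, "lab_c": 1_200})
```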

Despite the multiple records available for cross-referencing, the spokesperson was unable to explain the week-long delay in officials realising thousands of results had not been counted.

Meanwhile, the admission that the government’s computer systems transfer test results in batches rather than in real time raised eyebrows.

“That’s a very old fashioned way of doing it,” Prof Woodward said. “In this day and age, especially with continuous connectivity, there should be no reason why … once [a test result] gets confirmed and validated inside the lab, it goes up, so as real-time as possible could be done.”

He added that the structure by which results appear to be collated from individual laboratories is “a similar architecture you used to see in banks 30 years ago”.


“Being naive, I thought it would have been just literally the test centres would have had a … piece of software they would be putting their results into and that would then get stored on the central database,” Prof Woodward said. “But it looks like what’s happening is that all the testing centres have got their own systems and then at some point a batch of those get sent to the centre, which is a bit odd for this day and age.”
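
For illustration only, the contrast Prof Woodward draws could be sketched as an event-driven flow in which each validated result is pushed to the central store immediately, leaving no end-of-day batch to overflow. All names and structures below are hypothetical:

```python
# Hypothetical event-driven sketch of the "as real-time as possible"
# flow Prof Woodward describes: each result is pushed to the central
# store as soon as it is validated, so no end-of-day batch can overflow.

import queue
import threading

results: "queue.Queue[dict]" = queue.Queue()


def on_result_validated(result: dict) -> None:
    """Called by a lab system the moment a result is confirmed."""
    results.put(result)


def uploader(central_store: list) -> None:
    """Drains the queue continuously, one result at a time."""
    while True:
        central_store.append(results.get())  # stand-in for a network call
        results.task_done()


store: list = []
threading.Thread(target=uploader, args=(store,), daemon=True).start()
on_result_validated({"sample_id": "S1", "outcome": "positive"})
results.join()  # block until the queued result has been stored
print(store)
```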

He continued: “It’s not a huge surprise on a big system that you get teething problems, but at the same time — and I should show some of my frustration here — building a completely new system as we have done rather than continue with the existing NHS track and trace and just utilising that all seems odd to me. The very time you do not need to be ironing out the bugs in a new system is in the middle of a pandemic.”

Meanwhile, Test and Trace chiefs insisted that all those who tested positive received their results “in the normal way” and were told to self-isolate.

And despite daily totals of positive tests in the week from 25 September being between 744 and 4,786 infections higher than the published figures showed, the government insisted that decisions on localised restrictions affecting millions of people had not been affected.

But in Liverpool, where new restrictions were imposed on Thursday, the revised total meant the city’s infection rate soared from 287 to 456 cases per 100,000 residents, while Manchester emerged as the worst-hit part of the UK, with some 496 infections per 100,000 people.
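
Those per-100,000 figures follow the standard rate formula; as a quick worked example (the population figure is an assumed round number for illustration, not a sourced statistic):

```python
# The per-100,000 rates quoted above follow this standard formula.
# The population below is an assumed round number for illustration.

def rate_per_100k(cases: int, population: int) -> float:
    """Recorded cases per 100,000 residents over the reporting window."""
    return cases / population * 100_000

# With an assumed population of 500,000, a rise from 287 to 456 per
# 100,000 corresponds to roughly 850 additional recorded cases.
print(rate_per_100k(2_280, 500_000))  # -> 456.0
```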

“The big problem with this latest data issue from PHE is that we were misled as to the underlying trend during that period,” said independent statistician Nigel Marriott. 

“Up to Friday it looked like the recent surge in cases had paused and there was hope of a turnaround in some places. But with the revisions, it is clear that there is still a strong upward trend and more measures may be needed to reverse the trend.

“At present, the ‘50,000 cases by mid-October scenario’ postulated by [England’s chief medical officer Chris Whitty] last month can’t be ruled out, although I suspect the number will be closer to 25,000. 

“What hasn’t changed is the sensitivity of the national trend to what is happening in the North. The sooner the North slows down and reverses, the less likely we are to fulfil the CMO’s scenario.”

And as cabinet minister Thérèse Coffey was unable to estimate how many people had been missed by contact tracers as a result of the blunder, one expert warned of a knock-on effect on future contact-tracing efforts.

“While it appears [contacts] are now being contacted as a matter of priority, this additional strain on a system already stretched to its limit implies that further delays are likely to occur for other cases where contact tracing is needed,” said Rowland Kao, a professor of veterinary epidemiology and data science at the University of Edinburgh.

“These knock-on effects may have a substantial influence on the generation of new cases, over a period even longer than that.”

