GCSEs and our fatigue with the tech-based world created by Covid-19

From education to politics, people miss the moderation of a face-to-face reality

A pupil begins the new school term at home in London on January 5, 2021, watching an online introduction from his teacher. All primary and secondary schools have closed until at least mid-February, with GCSE and A-level exams facing cancellation for a second year. Leon Neal / Getty Images

It must have seemed like a no-brainer to deploy mass data processing techniques to deliver results for Britain’s school leavers in the country’s national exams.

Educational reformers could only dream of a one-fell-swoop moment that would propel the process into the 21st century. The school shutdown last year seemed, for a time, to provide an unexpected opportunity. Instead, the outcome was messy and there was a revolt from those subjected to a change they did not understand.

The fallout shows that the interplay between humans and digital innovation is one of the great puzzles of the Covid-19 pandemic.

Britain's Education Secretary Gavin Williamson announced that GCSE and A-level exams will not go ahead this year due to the pandemic, in a hybrid, socially distanced session at the House of Commons in London on January 6, 2021. AFP

The British government announced last Wednesday that for a second year running it would cancel the A-level and GCSE exams process. The decision was inevitable but it also marked the demise of an experiment.

Officials had tried to replace testing with an algorithmic allocation of marks when the issue first arose in the spring. Now, in the third pandemic lockdown, the simple reality is that the approach proved unacceptable to all those with a stake in the exam outcomes.

The issue affects thousands of students in the UAE and a number of Middle Eastern countries, where A-levels and GCSEs are widely used, blue-riband exams.

Why the algorithm method became unacceptable goes to the heart of a wider discovery about digital tools during the lockdowns. Constrained by circumstances, people have been better able to recognise the loss of autonomy to machines. They can draw a line against it going too far.

The A-level results episode was a salutary lesson. Students did not understand why they had been assessed at a particular grade. Unable to see the rationale at work, they did not want to give over their destiny to an opaque system. The framework was not one that could easily be influenced, and there was no certainty of a just outcome.

There is no certainty the alternative system will work either. Having teachers assess their own pupils is bound to mean subjective application of the marking system.

This, however, has a flesh-and-blood manifestation. Students can relate to it. Some societies have trialled digital social credit mechanisms to provide incentives and disincentives while using digital services.

Commercial developers use algorithms to market and tailor product ranges. It is one thing to have minute-to-minute purchases determined by the system. An entirely different vista opens up if lifelong, gateway decisions are entrusted to automated systems.

The uprising against the algorithm’s application can hardly be surprising, given what is at stake. By confining people to their homes, the pandemic has allowed for a reassessment of the role of technology. To communicate, people use devices. To shop, they use algorithmic platforms. To be entertained, they are almost exclusively reliant on the virtual.

Most acceptable to consumers are platforms that involve an exchange of convenience for data. Augmented intelligence that helps direct healthcare or improves educational access is another welcome development. It’s the out-and-out replacement of human beings that was rejected in the A-level debacle and increasingly elsewhere.

Social media has become a centre of political polarisation. Bloomberg

Social media platforms face a similar backlash. Pressure for more controls on how the sites are used is bound to escalate after the events of the week in Washington.

One telling statistic is that almost two-thirds of people who joined extremist Facebook groups in the US were directed to do so by the site’s own suggestions. Steering people into extremist circles poses a risk to society beyond anything a commercial enterprise should entertain.

Again, the focus shifts to the risks posed by technological advancement to the wider population. Mass impact is the factor that technology cannot wish away. Politics done face to face has, in recent decades, tended towards moderation.

Investigations into how the woman shot and killed in the US Capitol was radicalised show a pathway of escalation on messaging platforms. Ashli Babbitt was not the victim of a single moment; almost a decade of exposure to intense political escalation had its effect.

After so many months of pandemic confinement, what is remarkable is that most people have not become more susceptible to conspiracy theories, nor passive consumers of whatever the algorithm doles out.

The strong uptake of vaccine programmes offered by governments in places such as the UAE and UK shows people can act in their own interests and override the online disinformation that is so often said to bombard ordinary individuals.

There seems to be a heightened awareness of personal vulnerabilities. Taking this forward becomes an act of mental fortitude.

Perhaps it is understandable that a pathogen that can so easily infect our systems dictates much of our lives. Thus we reassess all kinds of easy interactions at the virtual level and our dealings with automated systems. The skill of differentiation has come to the fore. As a one-off, a driverless car is rational, but as a collective activity it could easily fail to gain cross-community confidence.

The four walls that surround those who isolate are also mental markers that can be externalised. That tilts the balance towards how much faith the systems and algorithms can generate among the people.

Damien McElroy is London bureau chief at The National