Even after a final term with schools closed for the pandemic, Sam Sharpe-Roe was optimistic about the coming school year. Teachers from his West London school had given him grades — three A’s and one B — that were strong enough to secure him a spot at his first choice of university next month.
But after the British government used a computer-generated score to replace exams that were canceled because of the coronavirus, all his grades fell and the university revoked his admission.
Mr. Sharpe-Roe, along with thousands of other students and parents, had received a crude lesson in what can go wrong when a government relies on an algorithm to make important decisions affecting the public.

Experts said the grading scandal was a sign of debates to come as Britain and other countries increasingly use technology to automate public services, arguing that it can make government more efficient and remove human prejudices.

But critics say the opaque systems often amplify biases that already exist in society and are typically adopted without sufficient debate, faults that were put on clear display in the grading disaster.

Nearly 40 percent of students in England saw their grades reduced after the government re-evaluated the exams, known as A-levels, with the software model. It included in its calculations a school’s past performance on the tests and a student’s earlier results on “mock” exams.

Government officials said the model was meant to make the system fairer, balancing out potentially inflated scores given by some teachers. But students and their parents, particularly those from lower-income areas with struggling schools, were outraged that their futures had been turned over to lines of code that favored students from private schools and wealthy areas.

Even after the government apologized and threw out the computer scores, many students had already lost their slots at their preferred universities, sending the admission process into further chaos.

“These algorithms are obviously not correct,” said Mr. Sharpe-Roe, 18, whose home borough of Ealing is enormously diverse but also divided by race, ethnicity and income. “I know a load of other people who are in a similar situation.”

The outcome, experts say, was entirely predictable.
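Ofqual's full model was not public, but the kind of adjustment described — ranking a school's students and imposing the school's historical grade distribution on that ranking — can be illustrated with a hypothetical sketch. The function name `standardize` and all of its inputs are invented for illustration; this is not Ofqual's actual code or methodology.

```python
# Purely illustrative sketch, not Ofqual's published model: assign grades
# by walking a teacher-supplied ranking of students and handing out each
# grade in proportion to how often the school produced it in past years.

def standardize(students_ranked_best_first, historical_distribution):
    """students_ranked_best_first: list of names, best student first.
    historical_distribution: dict mapping grade -> fraction of past
    students, ordered best grade first, e.g. {"A": 0.25, "B": 0.5, "C": 0.25}.
    Returns a dict mapping each name to an assigned grade."""
    n = len(students_ranked_best_first)
    grades = {}
    i = 0
    for grade, fraction in historical_distribution.items():
        quota = round(fraction * n)  # how many students get this grade
        for name in students_ranked_best_first[i:i + quota]:
            grades[name] = grade
        i += quota
    # Any students left over from rounding receive the lowest grade.
    lowest = list(historical_distribution)[-1]
    for name in students_ranked_best_first[i:]:
        grades[name] = lowest
    return grades
```

Under such a scheme, the top-ranked student at a school with weak historical results can never receive a grade the school rarely produced before — the pattern that drew complaints from strong students at struggling schools.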
In fact, the Royal Statistical Society had for months warned the test administration agency, Ofqual, that the model was flawed.

“It’s government trying to emulate Silicon Valley,” said Christiaan van Veen, director of the digital welfare state and human rights project at New York University. “But the public sector is completely different from private companies.”

As an investigator for the United Nations, Mr. van Veen studies how Britain and other countries use computers to automate social services. He said the techniques were being applied to policing and court sentencing, health care, immigration, social welfare and more. “There are no areas of government that are exempt from this trend,” he said.

Britain has been particularly aggressive in adopting new technology in government, often with mixed results. Earlier this month, the government said it would stop using an algorithm for weighing visa applications after facing a legal complaint that this was discriminatory. A few days later, a British court ruled against the use of some facial-recognition software by the police.

The country’s automated welfare system, Universal Credit, has faced years of criticism, including from the United Nations, for making it harder for some citizens to obtain unemployment benefits.
Britain’s contact-tracing app, which the government had said would be key to containing the coronavirus, has been delayed by technical problems.

“There is an idea that if it has an algorithm attached to it, it’s novel and interesting and different and innovative, without understanding what those things could be doing,” said Rachel Coldicutt, a technology policy expert in London who is working on a book about responsible innovation.

Those who have called for more scrutiny of the British government’s use of technology said the testing scandal was a turning point in the debate, a vivid and easy-to-understand example of how software can affect lives.

Cori Crider, a lawyer at Foxglove, a London-based law firm that filed a complaint against the grading algorithm, said the problem was not the use of technology itself but the lack of transparency: little is known about how the models work before they are introduced.

“There has been a tendency to compute first and ask questions later,” said Ms. Crider, who also brought the legal challenge against the visa algorithm. “There’s been a refusal to have an actual debate about how these systems work and whether we want them at all.”

Gavin Williamson, the education secretary, apologized and announced a new scoring system. But that came too late for many students, who had already lost slots at their preferred universities.

For years, Britain has heralded technology as a way to modernize government and provide social services more efficiently. The trend has spanned several administrations but has been given fresh momentum under Prime Minister Boris Johnson.

His top political adviser, Dominic Cummings, has argued forcefully that Silicon Valley thinking is needed to create high-performance government, including hiring new workers in areas like data science and artificial intelligence.
He has expressed admiration for the “frontiers of the science of prediction.”
In response to the coronavirus, Britain has sought help from companies like Palantir, a Silicon Valley analytics firm that was hired to manage data for the country’s National Health Service. A London-based artificial intelligence firm, Faculty, is working on predictive systems to help track the virus.

In another embarrassing misstep, the government decided to build its own contact-tracing app rather than using technical standards set by Apple and Google, despite warnings that it would have limitations. The release has been delayed for months.

Britain is not alone in turning some decisions over to computer systems. In the United States, algorithms are used by police departments to determine where officers patrol and by courts to set prison sentences. In Spain, the monitoring group Algorithm Watch identified a system being used to predict households at risk of domestic violence. The Netherlands abandoned the use of a system to detect welfare fraud after a judge ruled it unlawful.

The techniques are often pitched as apolitical, but researchers say they disproportionately affect lower-income and minority groups. “One of the great benefits of these tools for governments is it allows them to portray the decisions they are making as neutral and objective, as opposed to moral decisions,” said Virginia Eubanks, an associate professor at the State University of New York at Albany, whose book, “Automating Inequality,” explores the topic.

In Britain, the political fallout of the grading mishap dominated the news and led to calls for the country’s education secretary to resign. Students protested outside Parliament, chanting expletives at “the algorithm.”

Critics say the experience shows the risks ahead as more sophisticated tools like artificial intelligence become available and companies pitch them to public agencies.

Mr. Sharpe-Roe said “there’s a lot of anger” at having his fate set by an algorithm. After struggling to regain his lost spot at college, he decided to defer for a year to work.