More stories

  • Palantir, the all-seeing US data company keen to get into NHS health systems | Arwa Mahdawi

    Palantir, the all-seeing US tech company, could soon have the data of millions of NHS patients. My response? Yikes!

    You might never have heard of tech billionaire Peter Thiel’s CIA-backed analytics company. But it could know all about you if it wins a contract to manage NHS data.

    Peter Thiel has a terrible case of RBF – reclusive billionaire face. I’m not being deliberately mean-spirited, just stating the indisputable fact that the tech entrepreneur, a co-founder of PayPal, doesn’t exactly give off feel-good vibes. There is a reason why pretty much every mention of Thiel tends to be peppered with adjectives such as “secretive”, “distant” and “haughty”. He has cultivated an air of malevolent mystique. It’s all too easy to imagine him sitting in a futuristic panopticon, torturing kittens and plotting how to overthrow democracy.

    It’s all too easy to imagine that scenario because (apart from the torturing kittens part, obviously) that is basically how the 54-year-old billionaire already spends his days. Thiel was famously one of Donald Trump’s biggest donors in 2016; this year, he is one of the biggest individual donors to Republican politics. While it is hardly unusual for a billionaire to throw money at conservative politicians, Thiel is notable for expressing disdain for democracy, and for funding far-right candidates who have peddled Trump’s dangerous lie that the election was stolen from him. As the New York Times warned in a recent profile: “Thiel’s wealth could accelerate the shift of views once considered fringe to the mainstream – while making him a new power broker on the right.”

    When he isn’t pumping money into far-right politicians, Thiel is busy accelerating the surveillance state. In 2003, the internet entrepreneur founded a data-analytics company called Palantir Technologies (after the “seeing stones” used in The Lord of the Rings), which has been backed by the venture capital arm of the CIA.
    What dark magic Palantir does with data is a bit of a mystery but it has its fingers in a lot of pies: it has worked with F1 racing, sold technology to the military, partnered with Space Force and developed predictive policing systems. And while no one is entirely sure about the extent of everything Palantir does, the general consensus seems to be that it has access to a huge amount of data. As one Bloomberg headline put it: “Palantir knows everything about you.”

    Soon it might know even more. The Financial Times recently reported that Palantir is “gearing up” to become the underlying data operating system for the NHS. In recent months it has poached two top executives from the NHS, including the former head of artificial intelligence, and it is angling to get a five-year, £360m contract to manage the personal health data of millions of patients. There are worries that the company will then entrench itself further into the health system. “Once Palantir is in, how are you going to remove them?” one source with knowledge of the matter told the FT.

    How worried should we be about all this? Well, according to one school of thought, consternation about the potential partnership is misplaced. There is a line of argument that it is just a dull IT deal that people are getting worked up over because they don’t like the fact that Thiel gave a bunch of money to Trump. And to be fair, even if you think Thiel is a creepy dude with creepy beliefs, it is important to note that he is not the only guy in charge of Palantir: the company was co-founded in 2003 by Alex Karp, who is still the CEO; he voted for Hillary Clinton and has described himself as a progressive (although, considering his affinity for the military, he certainly has a different view of progress than I do).

    My school of thought, meanwhile, is best summarised as: yikes. Anyone who has had any experience of the abysmal US healthcare system should be leery of private American companies worming their way into the NHS. Particularly when the current UK government would privatise its own grandmother if the price was right. I don’t know exactly what Palantir wants with the NHS but I do know it’s worth keeping an eye on it. It’s certainly keeping an eye on you.
    Arwa Mahdawi is a Guardian columnist
    Do you have an opinion on the issues raised in this article? If you would like to submit a letter of up to 300 words to be considered for publication, email it to us at guardian.letters@theguardian.com

  • What a picture of Alexandria Ocasio-Cortez in a bikini tells us about the disturbing future of AI | Arwa Mahdawi

    Want to see a half-naked woman? Well, you’re in luck! The internet is full of pictures of scantily clad women. There are so many of these pictures online, in fact, that artificial intelligence (AI) now seems to assume that women just don’t like wearing clothes.

    That is my stripped-down summary of the results of a new research study on image-generation algorithms, anyway. Researchers fed these algorithms (which function like autocomplete, but for images) pictures of a man cropped below his neck: 43% of the time, the image was autocompleted with the man wearing a suit. When they fed the same algorithm a similarly cropped photo of a woman, it autocompleted her wearing a low-cut top or bikini a massive 53% of the time. For some reason, the researchers gave the algorithm a picture of the Democratic congresswoman Alexandria Ocasio-Cortez and found that it also automatically generated an image of her in a bikini. (After ethical concerns were raised on Twitter, the researchers had the computer-generated image of AOC in a swimsuit removed from the research paper.)

    Why was the algorithm so fond of bikini pics? Well, because garbage in means garbage out: the AI “learned” what a typical woman looked like by consuming an online dataset which contained lots of pictures of half-naked women. The study is yet another reminder that AI often comes with baked-in biases. And this is not an academic issue: as algorithms control increasingly large parts of our lives, it is a problem with devastating real-world consequences. Back in 2015, for example, Amazon discovered that the secret AI recruiting tool it was using treated any mention of the word “women’s” as a red flag. Racist facial recognition algorithms have also led to black people being arrested for crimes they didn’t commit. And, last year, an algorithm used to determine students’ A-level and GCSE grades in England seemed to disproportionately downgrade disadvantaged students.

    As for those image-generation algorithms that reckon women belong in bikinis? They are used in everything from digital job interview platforms to photograph editing. And they are also used to create huge amounts of deepfake porn. A computer-generated AOC in a bikini is just the tip of the iceberg: unless we start talking about algorithmic bias, the internet is going to become an unbearable place to be a woman.

  • From the Iowa caucuses to the Barnes & Noble fiasco, it’s clear: tech cannot save us | Julia Carrie Wong

    We have fallen for the idea that apps and artificial intelligence can substitute for judgement and hard work. They can’t.

    Every four years, journalists from around the world are drawn to the Iowa caucuses like podcasters to a murder. The blatantly anti-democratic tradition appeals to certain journalistic biases: the steadfast belief of the political press that […]