
It was announced last year that the controversial tech firm Palantir had won a huge new contract with the NHS to provide a digital platform for the management of health data. The company will provide the NHS with a new “federated data platform” to help existing systems connect more smoothly and manage data more efficiently. Proponents would like us to think it is simply an efficient system for data management and decision making, but the hidden logics arguably encourage a more suspicious and exclusionary approach to health management.

Much of the controversy over this contract stems from privacy concerns: so much patient data seemingly being handed over to a private company, and especially to one with a chequered history and dubious political affiliations. While some of the worry has been framed as the encroachment of private enterprise into the sacred sphere of the NHS, such partnerships are far from unusual; at least 101 contracts were awarded to private companies by NHS Digital alone in the last year. A larger part of the concern comes from the history of Palantir’s founder and chair, Peter Thiel. A significant player in Silicon Valley, Thiel co-founded PayPal, was the first external investor in Facebook and set up, or was otherwise connected to, several influential venture capital firms. He was also one of Donald Trump’s biggest financial backers for his 2016 presidential run.

Thiel, like many of his Silicon Valley colleagues, is a fan of science fiction and fantasy and likes to name his companies after his favourite works. Valar Ventures and Mithril Capital reference a race of angelic beings and a super-strong metal from J.R.R. Tolkien’s Middle-earth books respectively. Palantir itself is taken from the name of a set of magical seeing-stones featured in The Lord of the Rings. These enable the holder to remotely view what is happening in other parts of the world, look into the past, and communicate with, and even see into the mind of, anyone touching another of the palantíri. Even someone unfamiliar with The Lord of the Rings probably doesn’t need to be told that this magical orb is mostly used by an evil wizard for nefarious purposes and is considered by the heroes of the story too dangerous to tangle with. Thiel is of course well aware of these connotations and no doubt revels in the opportunities for mischief-making, if only in the opportunity to get stuffy corporate suits to use silly made-up words from fantasy novels.

The personal history of Palantir’s founder worries many people simply because he seems like a dodgy character with dubious connections who already wields too much social, economic and political influence. As with any private contract with a public body, there are concerns over privacy, data management and the broader exploitation of data, with Amnesty International calling for the UK government to provide assurances that such data won’t be monetised. Given the history of the company, the direct commercial exploitation of patient data is unlikely, although without the benefit of a magical ball to see into the mind of Peter Thiel (shudder) I can’t be sure.

While many of the digital media companies we interface with directly (Meta, Google, TikTok) do focus on targeted advertising driven by user data, Palantir are more interested in selling infrastructure, or, perhaps more accurately, in integrating into and reshaping existing infrastructure for their own purposes. Their customers are principally governments (or their departments or agencies) rather than individual users. Such contracts are undeniably profitable but, perhaps more significantly, they place the company in a strategic position of power. If infrastructural systems of fundamental social and political importance (such as NHS data) are built on the company’s systems, a relationship of dependency is created, with the potential to influence operations. Such influence also largely goes under the radar because the general public rarely, if ever, engages with these types of companies.

But what kind of influence can they have? While the statements from the NHS and Palantir themselves present a seemingly benign picture of servicing and processing data, a quick look at the history of the company suggests its systems encourage a particular perception of the world. As the historian of technology Melvin Kranzberg put it in his first “law of technology”: “Technology is neither good nor bad; nor is it neutral”.

Some of Palantir’s earliest contracts were with the US intelligence services during the “war on terror”. Their systems were able to draw on multiple types of data (public, private, commercial and classified) and applied network analysis, amongst other methods, to identify potential terror threats. They analysed both data (e.g. the content of private messages) and metadata (e.g. who messages were sent to, the location of devices) to identify “risky” individuals. Importantly, such people were not necessarily considered risky because they had committed any previous acts of terror, but largely because of their positions within a network (e.g. who their contacts were) or patterns of behaviour consistent with those of known terrorists. Similar technology was later sold to police forces for “predictive policing”.
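
To make that logic concrete, here is a minimal, hypothetical sketch of what network-based risk scoring of this kind can look like. It is emphatically not Palantir’s actual method: the graph, the watchlist, the weights and the use of the open-source networkx library are all invented for illustration.

```python
# Illustrative only: flagging people by network position, not by their actions.
# All names, weights and the watchlist are invented for this example.
import networkx as nx

# Metadata alone: who contacted whom (no message content required).
contacts = [
    ("alice", "bob"), ("bob", "carol"), ("carol", "dave"),
    ("dave", "eve"), ("eve", "frank"), ("bob", "eve"),
]
graph = nx.Graph(contacts)

# Suppose "eve" is already on a watchlist.
watchlist = {"eve"}

def risk_score(person: str) -> float:
    """Score a person purely by where they sit in the network: proximity to
    watchlisted nodes plus how well connected they are overall."""
    proximity = max(
        (1 / nx.shortest_path_length(graph, person, w)
         for w in watchlist
         if w != person and nx.has_path(graph, person, w)),
        default=0.0,
    )
    centrality = nx.degree_centrality(graph)[person]
    return 0.7 * proximity + 0.3 * centrality  # arbitrary weighting

for person in sorted(graph.nodes):
    print(person, round(risk_score(person), 2))
```

The point of the sketch is the one made above: nobody in this toy graph has done anything, yet each person receives a score derived entirely from who they happen to be connected to.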

Such an approach is almost a perverse inversion of a sociological ontology: it sees risk as discernible only from a network view, yet it simultaneously individualises the solution. It represents an exclusionary politics which appeals to security services and politicians because it turns diffuse threats into actionable items (quelling paranoid fears) by embodying them within individuals. The transposition of these systems and principles into the management of health represents a “sphere transgression”, bringing the values and logics of their previous applications along with them.

The first work Palantir did in healthcare followed similarly predictive logics. Take, for instance, the analysis they provided for the NHS’s Covid-19 data store, which was used to plan the government’s response to the pandemic. Specifically, to:

  • Proactively increase health and care resources in emerging hot spots;
  • Ensure critical equipment is supplied to the facilities with greatest need; and
  • Divert patients/service users to the facilities that are best able to care for them based on demand, resources, and staffing capacity.

They did this by taking data from diverse sources (e.g. 111 calls, COVID-19 test results, hospital occupancy figures) to inform decision making, which, it was acknowledged, would involve some degree of automated, artificial intelligence driven analysis. The key aim of the system, according to the CEO of NHSX, was to form a ‘single source of truth’ to support decision-making (“One ring to rule them all”, perhaps).
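
As a purely illustrative sketch (not the actual NHS data store or Palantir’s pipeline), merging such feeds into a single ranked view might look something like the following; every figure, field name and weight is invented for the example.

```python
# Illustrative only: combining separate feeds (111 calls, test results,
# hospital occupancy) into one composite "pressure" ranking per region.
from dataclasses import dataclass

@dataclass
class RegionSnapshot:
    nhs_111_calls: int      # symptom-related calls in the last 24h
    positive_tests: int     # confirmed cases in the last 24h
    bed_occupancy: float    # fraction of acute beds occupied (0-1)

# A merged view built from otherwise separate data sources (values invented).
regions = {
    "North": RegionSnapshot(nhs_111_calls=420, positive_tests=95, bed_occupancy=0.91),
    "South": RegionSnapshot(nhs_111_calls=180, positive_tests=30, bed_occupancy=0.72),
    "East":  RegionSnapshot(nhs_111_calls=300, positive_tests=60, bed_occupancy=0.88),
}

def pressure_index(s: RegionSnapshot) -> float:
    """Crude composite of demand signals; the weights here are arbitrary."""
    return (0.4 * (s.nhs_111_calls / 500)
            + 0.3 * (s.positive_tests / 100)
            + 0.3 * s.bed_occupancy)

# Rank regions so resources can be directed to emerging hot spots first.
for name, snap in sorted(regions.items(),
                         key=lambda kv: pressure_index(kv[1]), reverse=True):
    print(f"{name}: pressure {pressure_index(snap):.2f}")
```

Even in this toy form, the design choice is visible: once the feeds are fused, the composite number, not any single underlying source, becomes the thing decisions are made on.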

Data privacy and the commercial exploitation of data are certainly risks, but ones which existing procedures can, in principle, be adapted to monitor and manage. Tackling the transposition of systems, and corporate cultures, which convert system-level risks into individual profiles is more difficult. This is especially so when data analysis systems like Palantir’s are perceived not as producing partial representations (based on partial data) but as transparent representations of truth.