How Palantir Falls Short of Responsible Corporate Conduct

In a recent opinion piece in The Washington Post, Alex Karp, the chief executive of a data-analysis company called Palantir, warns that when Silicon Valley executives “try to impose their moral framework on America, something has gone seriously and dangerously awry.” Public policy decisions, he adds, need to be made by “elected representatives and judges, not by unelected engineers running global businesses.” But Karp fails to acknowledge that companies like Palantir decide with whom they do business and have an obligation to address the harmful consequences of their business relationships. This responsibility rests not only on tech companies but on all corporations.

Karp tries to frame corporate obligations solely in legal terms, stating that “no leader should knowingly permit his or her products to be used illegally.” Abiding by the law is necessary but insufficient. Companies must also pursue their core business objectives ethically and take proactive steps to mitigate the risks associated with their business models and their products. Palantir’s high profile and often controversial business activities provide an instructive case study.

Founded in 2004 in the wake of the 9/11 attacks, Palantir received early investments from Peter Thiel, one of its founders, and In-Q-Tel, a venture capital arm of the Central Intelligence Agency. The CIA and other U.S. government intelligence agencies, as well as elements of the U.S. military, became the firm’s anchor clients. Palantir has helped strengthen U.S. national security interests, enabling the government to fill some of the gaps in intelligence-gathering that the 9/11 attacks laid bare.

The company’s more than 2,000 engineers span the globe, gathering and analyzing data for various federal, state, and local government agencies, as well as a wide range of commercial clients. Outside analysts estimated that if the company decided to go public, its valuation might be as high as $41 billion. As a privately held company, Palantir isn’t required to reveal much about its finances or operations.

Critics of Palantir have focused primarily on two human-rights-related issues: its technical support for controversial U.S. immigration enforcement practices and its insufficient attention to privacy rights in its work with local police departments.

In recent months, hundreds of Palantir’s employees have publicly protested the firm’s contract with Immigration and Customs Enforcement (ICE). Palantir’s data analysis software was tied to workplace raids and other controversial actions by ICE, which critics charge violated due process of law. Though Palantir’s contract with ICE was set to expire later this month, the company has extended it for another three years.

Palantir’s contract extension stands in stark contrast to Google’s response to similar employee demands in 2018. Google announced that it would not renew its contract with the Pentagon related to artificial intelligence and drone imagery after more than 6,000 employees signed a petition demanding that Google refrain from building warfare technology.

Karp has publicly condemned Google’s decision, and in his op-ed he dismisses such criticisms, arguing that “immigration policy is not a software challenge; it’s a political one. The solution lies with our political and judiciary system, not with Silicon Valley’s C-suite.” This response sidesteps his company’s responsibility to assess whether Palantir’s data analysis for ICE has facilitated the Trump Administration’s well-documented mistreatment of asylum seekers and other migrants.

Palantir also has faced criticism for its work with local police forces. The company has helped pioneer so-called predictive policing, which attempts to use statistical analysis to forecast and prevent potential criminal activity. Palantir’s Gotham software aggregates massive amounts of personal data, enabling law enforcement officers to gain a broad understanding of a suspect’s life without a warrant. According to a Vice News investigation, the software can map a person’s family members and business associates, as well as email addresses, phone numbers, current and previous addresses, bank accounts, social security numbers, and height, weight, and eye color.

For about eight years, the Los Angeles Police Department used Palantir’s software for Operation LASER, a program that scored LA residents based on their interactions with police. If an individual accrued a sufficiently high score, officers were encouraged to knock on their door, tell them they were being monitored, and look for opportunities to stop or arrest them, according to Bloomberg.

In April, the LAPD ended its use of Palantir’s software after criticism from activist groups and the department’s own civilian oversight panel. A similar program was underway in New Orleans, using criminal history and social media to forecast criminal behavior. Palantir’s technology was used by the New Orleans Police Department for six years without public knowledge or city council approval, until outgoing Mayor Mitch Landrieu announced that his office would not renew the city’s pro bono contract with Palantir. Despite these critiques, Palantir continues to advertise law enforcement services on its website.

Though Palantir has created a Privacy and Civil Liberties team, it has assigned only 10 engineers to staff this massive effort. Such under-staffing means the company effectively outsources these responsibilities to its clients, and it falls far short of what is needed. When firms like Palantir compete for lucrative government contracts, it is incumbent on them to take seriously the human rights risks associated with their services. To date, Palantir has failed to assume this essential responsibility.

I am the Jerome Kohlberg professor of ethics and finance at NYU Stern School of Business and director of the Center for Business and Human Rights. I served in the Obama ...