Prominent artificial intelligence research organization OpenAI recently appointed newly retired U.S. Army General and former National Security Agency (NSA) director Paul M. Nakasone to its board of directors.

Nakasone will join the board’s newly announced Safety and Security Committee, slated to advise OpenAI’s board on critical safety- and security-related matters and decisions.

Established following an exodus of OpenAI higher-ups concerned about the company’s perceived de-prioritization of safety-related matters, the new Safety and Security Committee is OpenAI’s apparent effort to re-establish a safety-forward reputation with an increasingly wary public.

AI safety concerns are of the utmost importance, but OpenAI should not use them to ram through an appointment that appears poised to normalize AI’s militarization.

The ‘revolving door’ strikes again

Nakasone's post-retirement move to OpenAI caps a 38-year military career, including more than five years heading U.S. Cyber Command, and mirrors the military-industrial complex's ever-"revolving door" between senior defense or intelligence officials and private industry.

The phenomenon manifests itself in rampant conflicts of interest and massive military contracts alike: according to an April 2024 Costs of War report, U.S. military and intelligence contracts awarded to major tech firms had ceilings “worth at least $53 billion combined” between 2019 and 2022.

Quietly lifting language barring military applications of its tech from its website this year, OpenAI apparently wants in on the cash. The company is already collaborating with the Pentagon on cybersecurity tools and on software aimed at preventing veteran suicide.

A slippery slope

OpenAI remains adamant that, despite recent policy changes, its tech cannot be used to develop or deploy weapons. But AI's rapid wartime proliferation in Gaza and Ukraine highlights other industry players' lack of restraint; failing to keep up could mean losing out on lucrative military contracts in a competitive and unpredictable industry.

Similarly, OpenAI’s current usage policies affirm that the company’s products cannot be used to “compromise the privacy of others,” including by “facilitating spyware, communications surveillance, or unauthorized monitoring of individuals.” But Nakasone’s previous role as director of the NSA, an agency infamous for illegally spying on Americans, suggests such policies may not hold water.

In the words of NSA whistleblower Edward Snowden: “There is only one reason for appointing an [NSA] Director to your board. This is a willful, calculated betrayal of the rights of every person on Earth.”

Considering the military’s growing use of AI-powered surveillance systems, from reconnaissance drones to facial recognition technology, the possible wartime surveillance implications of OpenAI’s NSA hire cannot be ruled out.

Meanwhile, OpenAI’s scandal-laden track record, which includes reportedly lifting actor Scarlett Johansson’s voice for ChatGPT, CEO Sam Altman’s failed ousting, and restrictive, often lifelong nondisclosure agreements imposed on former employees, remains less than reassuring.

All things considered, Nakasone’s appointment signals that a treacherous, more militarized road likely lies ahead for OpenAI, and for AI as a whole.
