By Jack Queen
NEW YORK, March 11 (Reuters) – Anthropic’s lawsuit challenging its Pentagon blacklisting is likely to test the reach of an obscure law aimed at guarding military systems against sabotage, and legal experts say the artificial intelligence lab appears to have a strong case that President Donald Trump’s administration overstepped.
Anthropic said in its lawsuit filed on Monday that the Defense Department’s decision to exclude the company from military contracts by designating it as a supply chain risk violated its free speech and due process rights and was aimed at punishing the company for its views on AI safety in warfare.
The designation could cut Anthropic’s 2026 revenue by billions of dollars and cause reputational harm, company executives said Tuesday.
In labeling Anthropic as a supply chain risk, the Pentagon invoked a rarely used law that allows it to bar companies from certain contracts if they risk exposing military information systems to enemy infiltration. The law has never been tested in court or used against a U.S. company, according to a Reuters review of legal databases.
Courts often defer to the executive branch’s judgment on national security, and that will likely be the centerpiece of the government’s defense. But five national security law experts told Reuters the Pentagon may have overstepped.
“It’s not at all clear that the statute can even apply to an American company where there’s no foreign entanglement,” said University of Minnesota Law School professor Alan Rozenshtein.
The Defense Department said it does not comment on pending litigation.
‘EXQUISITE’ TECHNOLOGY
Anthropic, which is incorporated and headquartered in the U.S., said in its lawsuit that it is not an “adversary,” a term the Trump administration has defined in executive orders to mean China, Russia, Iran, North Korea, Cuba and Venezuela.
The company also said Defense Secretary Pete Hegseth gave no explanation for how Anthropic’s Claude AI tool constituted a supply chain risk, despite the U.S. military’s continued use of Claude and Hegseth’s own praise of it as “exquisite” technology that the Defense Department would “love” to work with during a February 24 meeting with Anthropic cited in the lawsuit.
The military has used Claude as recently as last month during strikes on Iran, according to Reuters reports.
Hegseth designated Anthropic a national security supply chain risk on March 3 after the company refused to lift restrictions on Claude that prohibit the military from using it for autonomous weapons or domestic surveillance.
In a February 27 social media post announcing the designation, Hegseth accused Anthropic of cloaking itself in the “sanctimonious rhetoric of ‘effective altruism’” to “strong-arm the United States military into submission.”
Anthropic has said AI is not reliable enough for autonomous weapons and that it opposes domestic surveillance as a matter of principle. The Pentagon has said that Anthropic’s restrictions could endanger American lives.
‘SURVEIL, DENY, DISRUPT’
Under U.S. law, a supply chain risk is a threat that an adversary could sabotage, infiltrate or disrupt a federal information technology system or network.
The law the Pentagon invoked, known as Section 3252, allows the defense secretary to exclude companies from certain contracts to guard against the risk that an “adversary” may “sabotage, maliciously introduce unwanted function” or otherwise “subvert” a military information system to “surveil, deny, disrupt, or otherwise degrade” its function.
The Pentagon separately designated Anthropic a supply chain risk under a similar law that could eventually broaden contract exclusions to the civilian government. Anthropic filed a separate legal challenge to that designation on Monday.
The Pentagon can exclude companies under Section 3252 only as a last resort, and other defense contractors are not required to stop working with them entirely.
Reuters could not identify any other companies that have been publicly designated supply chain risks under Section 3252, although the obscure government procurement statute does not require public disclosure of designations.
‘BAD BLOOD’
Amos Toh, a national security law expert at the left-leaning Brennan Center for Justice, said nothing about Claude’s usage policies appeared to pose foreign sabotage or subversion risks.
“These are basically safety protocols. You can debate whether these protocols are acceptable or not, but they run directly counter to the risk that the law is designed to regulate,” Toh said.
Anthropic’s lawsuit said that the supply chain risk designation punishes the company for its views on AI safety in violation of the First Amendment of the U.S. Constitution, which protects free speech and expression.
Legal experts said Trump and Hegseth’s public attacks on Anthropic, including a social media post in which Trump called it a “RADICAL LEFT WOKE COMPANY,” could bolster this argument.
“A lot of things Hegseth has said and the Pentagon has done undermine their case and suggest there was personal animus and bad blood between the parties, and that the Pentagon had it out for Anthropic,” said Joel Dodge, a law expert at Vanderbilt University.
Anthropic also said Hegseth’s supply chain risk order violated its Fifth Amendment right to due process by imposing “draconian punishments” without any “meaningful process,” factual findings or opportunity for Anthropic to challenge them.
SECOND-GUESSING THE PRESIDENT
Courts are generally reluctant to question federal agency decisions but are especially deferential to the executive branch’s judgment on national security matters.
This would likely be the centerpiece of the government’s defense, according to legal experts, who said Justice Department lawyers could cite a long line of cases where courts have found that it is generally not the place of judges to second-guess how the president and the military he commands defend the country.
The government could similarly argue that the president and his cabinet secretaries have broad authority to choose suppliers and that the military cannot rely on a vendor whose usage policies constrain military action.
The Justice Department could also cite legal precedent stating that contract decisions are not First Amendment violations if they are supported by legitimate policy or operational reasons.
Eric Crusius, an attorney and government contract specialist who is not involved in the case, said the government is trying to impose the “death penalty” on Anthropic and will need to show “there was no alternative and that they meticulously considered other options prior to pulling the trigger.”
‘ARBITRARY AND CAPRICIOUS’
Anthropic said in its lawsuit that Hegseth’s decision violated the Administrative Procedure Act, a law that allows courts to overturn actions that are found to be “arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.”
Legal experts said the apparent contradictions in the government’s position are strong evidence that Hegseth’s decision was arbitrary.
“The government was simultaneously threatening to use the (Defense Production Act) to force Anthropic to sell its services, using its services in active military operations, and saying it’s too dangerous to use them in government contracts,” Rozenshtein said.
“Not all of these things can be true,” he said.
(Reporting by Jack Queen in New York; Editing by Noeleen Walder and Christopher Cushing)
