Military & Defense

US Federal Court Rules Against Anthropic, Barring it from Working with the Pentagon

North America / United States


A US federal court has ruled against Anthropic, barring the AI company from working with the Pentagon over its refusal to let the military use its models for any lawful purpose. The US government has drawn up new guidelines for civilian contracts with AI companies, requiring them to grant the government an irrevocable license to use their models for all legal purposes.

Under the new rules for AI companies working with the military, contractors must grant the government an irrevocable license to use their models for all legal purposes. Anthropic, which refused to comply, has been barred from working with the Pentagon, and the US Department of War designated the company a Supply Chain Risk, citing ethical codes that do not align with the government's needs. The guidelines may complicate compliance with international regulations such as the EU Digital Services Act, as the US government tightens its requirements on AI companies to ensure military-friendly practices.

This content was automatically generated and/or translated by AI. It may contain inaccuracies. Please refer to the original sources for verification.
