COINPURO - Crypto Currency Latest News
Cryptopolitan 2026-02-14 21:00:12

Reports show Anthropic’s Claude was used by the U.S. military to capture Venezuelan leader Nicolás Maduro

Claude, Anthropic’s flagship product, was involved in the U.S. military raid on Venezuela to capture its president and his wife, according to reports. Despite Anthropic’s anti-violence policies, the company’s partnership with Palantir allows Claude to be used in military operations. Some believe the AI was used only for non-violent tasks.

How was Claude AI involved in the capture of Nicolás Maduro?

A series of new reports has revealed that the U.S. military used Anthropic’s artificial intelligence model, Claude, during the high-stakes operation to capture former Venezuelan President Nicolás Maduro. The mission, known as “Operation Resolve,” took place in early January 2026 and resulted in the arrest of Maduro and his wife, Cilia Flores, in the heart of Caracas.

According to The Wall Street Journal and Fox News, Claude was integrated into the mission through Anthropic’s partnership with the data analytics firm Palantir Technologies. The U.S. Department of War, led by Secretary Pete Hegseth, has increasingly used commercial AI models to modernize its combat operations. Details of the specific tasks Claude performed are classified, but the AI is known to be used for summarizing massive amounts of intelligence data, analyzing satellite imagery, and possibly providing decision support for complex troop movements.

The raid occurred in the early hours of January 3, 2026, when U.S. Special Operations Forces, including Delta Force commandos, breached Maduro’s fortified palace. President Donald Trump later said Maduro was “bum-rushed” before he could reach a steel-reinforced safe room. Venezuelan air defenses were suppressed, and several military sites were bombed during the mission. Maduro was transported to a U.S. warship and then to New York City, where he currently faces federal charges of narco-terrorism and cocaine importation.

Did the operation violate Anthropic’s anti-violence rules?
Claude is designed with a constitutional focus on safety, so how was it used in a lethal military operation? Anthropic’s public usage guidelines prohibit Claude from being used for violence, weapons development, or surveillance, and the company has stated that it monitors all usage of its tools to ensure compliance with its policies. However, the partnership with Palantir allows the military to use Claude in classified environments. Sources familiar with the matter suggest the AI may have been used for non-lethal support tasks, such as translating communications or processing logistics.

Nevertheless, the Department of War is pushing AI companies to remove many of their standard restrictions for military use. Reports indicate that the Trump administration is considering canceling a $200 million contract with Anthropic because the company has raised concerns about its AI being used for autonomous drones or surveillance. Secretary Pete Hegseth has stated that “the future of American warfare is spelled AI” and has made it clear that the Pentagon will not work with companies that limit military capabilities.
