COINPURO - Crypto Currency Latest News
Cryptopolitan 2026-02-14 21:00:12

Reports show Anthropic’s Claude was used by the U.S. military to capture Venezuelan leader Nicolás Maduro

Claude, Anthropic’s flagship product, was reportedly involved in the U.S. military raid in Venezuela to capture the country’s president and his wife. Despite Anthropic’s anti-violence policies, the company’s partnership with Palantir allows Claude to be used in military operations. Some believe the AI was used only for non-violent tasks.

How was Claude AI involved in the capture of Nicolás Maduro?

A series of new reports has revealed that the U.S. military used Anthropic’s artificial intelligence model, Claude, during the high-stakes operation to capture former Venezuelan President Nicolás Maduro. The mission, known as “Operation Resolve,” took place in early January 2026 and ended with the arrest of Maduro and his wife, Cilia Flores, in the heart of Caracas.

According to The Wall Street Journal and Fox News, Claude was integrated into the mission through Anthropic’s partnership with the data analytics firm Palantir Technologies. The U.S. Department of War, led by Secretary Pete Hegseth, has increasingly turned to commercial AI models to modernize its combat operations. The specific tasks Claude performed are classified, but the AI is known to be used for summarizing massive amounts of intelligence data, analyzing satellite imagery, and possibly providing decision support for complex troop movements.

The raid occurred in the early hours of January 3, 2026, when U.S. Special Operations Forces, including Delta Force commandos, breached Maduro’s fortified palace. President Donald Trump later said Maduro was “bum-rushed” before he could reach a steel-reinforced safe room. Venezuelan air defenses were suppressed, and several military sites were bombed during the mission. Maduro was transported to a U.S. warship and then to New York City, where he currently faces federal charges of narco-terrorism and cocaine importation.

Did the operation violate Anthropic’s anti-violence rules?
Claude is designed with a constitutional focus on safety, so how did it end up in a lethal military operation? Anthropic’s public usage guidelines prohibit Claude from being used for violence, weapon development, or surveillance, and the company has stated that it monitors all usage of its tools to ensure they comply with its policies. However, the partnership with Palantir allows the military to use Claude in classified environments. Sources familiar with the matter suggest the AI may have been limited to non-lethal support tasks, such as translating communications or processing logistics.

Nevertheless, the Department of War is pushing AI companies to remove many of their standard restrictions for military use. Reports indicate that the Trump administration is considering canceling a $200 million contract with Anthropic because the company has raised concerns about its AI being used for autonomous drones or surveillance. Secretary Pete Hegseth has stated that “the future of American warfare is spelled AI” and has made it clear that the Pentagon will not work with companies that limit military capabilities.
