Anthropic builds new Claude AI models specifically for American national security, designed to process classified information
- Anthropic has developed a series of models oriented toward American national security
- The models can handle classified material and “refuse less”
- Many AI developers are pursuing US government contracts
Anthropic has unveiled a series of AI models that were specially designed for use by US government entities.
The models, called “Claude Gov”, are designed to help the US government with strategic planning, operational support and intelligence analysis.
The models are trained specifically to process classified information in intelligence and defense contexts, and have also been tuned to “refuse less” when handling classified data.
Automating and assisting the US government
“[These] models are already used by agencies at the highest level of American national security, and access to these models is limited to those who operate in such classified environments. [They] underwent the same rigorous safety testing as all our Claude models,” Anthropic said in its announcement.
Anthropic is not the only company pitching models to the US government. OpenAI and Meta have both recently signaled their willingness to offer AI models for US government use.
Anthropic and Cohere have also each collaborated separately with Palantir to develop AI models for government use. Palantir itself is also pursuing government contracts, including the development of “ImmigrationOS” for US Immigration and Customs Enforcement (ICE).
The new Claude Gov models have improved capabilities compared to Anthropic's other enterprise models, including “improved skill” in languages crucial to American national security and a better understanding of cybersecurity jargon.
The White House recently pushed two “America First” AI guidelines aimed at trading safeguards for faster departmental modernization and greater efficiency.
By TechCrunch