- Meta will now work with government agencies to develop military applications.
- Concerns have been raised about the security risks of using AI in defense.
- Researchers have found evidence that China has already used Llama for defense purposes.
Meta has announced that it is offering its Llama generative AI model to government organizations for “national security applications” and that it is working with US agencies and contractors to support their work.
Among those Meta has partnered with are Lockheed Martin, AWS and Oracle. One example the company has given is its work with Oracle to “synthesize aircraft maintenance documents” to allow technicians to diagnose problems “more quickly and accurately.”
Lockheed Martin is also said to have incorporated Llama into its AI factory, which Meta says has accelerated code generation and data analysis and improved business processes.
Radical change in policy
This is a significant change from Llama's acceptable use policy, which prohibits using the models for “military, warfare, nuclear industries or applications, espionage,” and specifically bars the development of weapons and the promotion of violence.
Some question the use of AI in defense, citing security risks such as the potential for compromised data. Experts have also warned that other vulnerabilities, such as bias and hallucination, are intrinsic to AI and cannot be avoided.
The catalyst for this drastic policy change may be recent reports that China has used the model for its own military applications. China reportedly used Llama to collect and process intelligence, creating a tool called 'ChatBIT' for military dialogue and question answering.
This, of course, was against Llama's terms of use, but since the model is public and open source, the policy is difficult to enforce.
“In the global competition in AI, the supposed role of a single, outdated version of an open-source American model is irrelevant when we know that China is already investing more than a trillion dollars to surpass the United States in AI,” Meta said in a statement.
Meta has confirmed that it will also make exceptions for government agencies in the other Five Eyes countries: Canada, Australia, the United Kingdom and New Zealand.
Via TechCrunch