Open-source AI is a spectrum, not a binary choice with clear-cut risks


In comments to the National Telecommunications and Information Administration (NTIA), Google weighs in on the risks and benefits of open-source AI models and calls for their responsible use.

The US government’s National Telecommunications and Information Administration (NTIA) has launched a public consultation on foundation AI models with freely available weights.

Google, like Meta before it, has now weighed in, arguing that “open” and “freely available” describe a spectrum rather than a binary choice between being for or against open source.

Access to AI systems is better described as a range of levels of openness, with the risk profile depending on the chosen form of release. Google, for example, offers its “Gemma” models with freely available parameter weights but restricts their use through licensing terms. Other companies, such as Meta, make their models available to selected researchers before releasing them in full.

OpenAI likewise tends to release its open-source software in stages. Malicious actors, however, are unlikely to be deterred by possible license violations.

Open source is irreversible

Google warns that freely available models are difficult to control and can increase the risk of abuse. Once the weights are public, access can hardly be restricted again, and attackers could exploit vulnerabilities more easily.

At the same time, Google emphasizes the many benefits of open AI models: they enable innovation, promote competition, and facilitate access to AI technology, especially in emerging markets. Open models are also useful for security research, as experts can test them extensively. Meta makes a similar argument.

To mitigate risks, Google recommends rigorous internal review processes, extensive testing for potential misuse, and the provision of tools for safe use.

The company also advocates close collaboration between government, industry, and civil society to jointly develop standards and guidelines.
