AI Essentials: What is open-source?

Engine
3 min read · Oct 3, 2024


By Min Jun Jung, Policy Fellow, Engine Advocacy & Foundation

This blog continues our series called “AI Essentials,” which aims to bridge the knowledge gap surrounding AI-related topics. It discusses what open-source is and why it’s significant for startups and the broader AI ecosystem.

Open-source software is in just about every tech product in existence, from your phone to your car to your refrigerator. It has led to orders-of-magnitude reductions in the cost of starting a company, helped improve security, and fostered innovation. The term “open-source” often comes up in policy conversations about AI, with similar implications for startups, but there aren’t yet the same conventions about what exactly open-source AI entails.

Open-source has historically described software that “anyone can view, modify, and distribute.” In AI, this generally means making the algorithms, code, weights, and data used for AI development available to the public. This approach facilitates collaboration and innovation among researchers, developers, and startups.

But openness in AI generally isn’t binary — fully closed or fully open. Instead, some open-source models embody a philosophy of full transparency by releasing all aspects of the model, while other developers retain some resources as proprietary information, releasing only a combination of the pretrained model weights, code, or datasets. Further, unlike open-source software, which has a well-developed set of licensing norms, open-source AI licensing is less standardized. Some open-source AI resources come with restrictive licenses that prohibit, for example, commercial use or distribution, making them comparatively less useful for startups.

Open-source models are crucial for startups because they reduce the need to develop AI models from scratch. Training AI models requires vast amounts of data, storage, and computational capability, which startups typically do not have. Access to pretrained models and their weights lets startups build on and adapt models to their needs through a process called fine-tuning. This accessibility accelerates innovation by drastically reducing costs and lowering barriers to entry. It also helps startups build better products, because they can focus their limited resources on their true innovation rather than on foundational technology.
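The idea behind fine-tuning can be illustrated with a deliberately tiny sketch: instead of training parameters from a random starting point, you begin from weights someone else already trained and nudge them toward your own task with a few gradient steps. This toy one-parameter linear model (plain Python, with made-up numbers standing in for real pretrained weights) is only meant to convey the concept, not any real model's API or scale.

```python
# Toy sketch of fine-tuning: start from "pretrained" weights rather than
# from scratch, then adapt them on a small task-specific dataset.
# All numbers here are illustrative.

def predict(w, b, x):
    return w * x + b

def fine_tune(w, b, data, lr=0.05, epochs=200):
    """Adjust pretrained parameters (w, b) on new (x, y) pairs
    via stochastic gradient descent on squared error."""
    for _ in range(epochs):
        for x, y in data:
            err = predict(w, b, x) - y
            w -= lr * err * x   # gradient of 0.5 * err**2 w.r.t. w
            b -= lr * err       # gradient of 0.5 * err**2 w.r.t. b
    return w, b

# "Pretrained" parameters, e.g. released openly by another developer.
pretrained_w, pretrained_b = 2.0, 0.5

# A startup's small, task-specific dataset (its task follows y = 3x + 1).
task_data = [(0.0, 1.0), (1.0, 4.0), (2.0, 7.0), (-1.0, -2.0)]

w, b = fine_tune(pretrained_w, pretrained_b, task_data)
print(round(w, 2), round(b, 2))  # parameters move from (2.0, 0.5) toward (3, 1)
```

Real fine-tuning works the same way in spirit: the pretrained weights encode knowledge that took enormous data and compute to acquire, and the startup only pays for the comparatively cheap adaptation steps.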

Policymakers are currently evaluating the benefits of promoting open weights to foster innovation, and it’s critical that they strike a careful balance in their regulatory approach to avoid imposing excessive burdens on startups or undercutting a key path for them to innovate in AI. The Federal Trade Commission even recently weighed in to underscore the benefits of open-weight models for startup competitiveness. But openness can get caught in the crosshairs of overzealous regulation. For example, proposals like California’s recently vetoed SB 1047 aimed to regulate AI model development but would have made model developers responsible for future (mis)uses of their models. Under that arrangement, no developer would open-source their AI models, because they could be held accountable for actions they had no control over.

Concerns about potential misuse, such as malicious actors exploiting open models to develop harmful AI technologies, merit attention, but these risks are inherent to AI broadly and not specific to open-source systems. In fact, open source can actually enhance AI safety by allowing a bigger and more diverse group of developers to use the technology and identify problems. In contrast, “closed” models place the burden of identifying vulnerabilities and biases solely on the original developers, potentially increasing the risks of adversarial attacks or unethical practices.

Open-source AI — or AI systems developed using open-source resources — should not be subjected to special or additional rules simply because they’re open-source. The inherent risks of AI systems are similar, whether they are “open” or “closed,” and regulations should be specifically tailored to their use. Supporting open-source AI not only means supporting startups but also contributes to the broader advancement of AI technologies.

Engine is a non-profit technology policy, research, and advocacy organization that bridges the gap between policymakers and startups. Engine works with government and a community of thousands of high-technology, growth-oriented startups across the nation to support the development of technology entrepreneurship through economic research, policy analysis, and advocacy on local and national issues.
