"Insights from GitHub’s Chief Counsel on the EU AI Act and Its Effects on Open Source Software"

As 2023 draws to a close, European lawmakers have finally reached a preliminary agreement on the regulation of artificial intelligence (AI) after years of deliberation. The EU AI Act is set to establish a risk-based framework that categorizes AI systems according to their potential impact on citizens' rights. While many free and open-source AI models are expected to be exempt from the Act, the final text of the regulation is still in progress and has not yet been made publicly available.

In a recent conversation, GitHub's Chief Legal Officer, Shelley McKinley, shared her perspective on how the new legislation will affect open-source development. Microsoft-owned GitHub hosts the world's largest open-source repository and community, making it a crucial player in the ongoing discourse about AI regulation.

### How Does the EU AI Act Impact Open Source Models?

**Shelley McKinley:** Based on the provisional agreement, if upheld in the final legislation, the AI Act will provide a level of certainty for developers working on most open-source AI projects. This is particularly beneficial for creators who focus on developing and sharing individual components, such as datasets, training code, models, and software applications. The Act includes an exemption for open-source projects but still imposes obligations in certain categories: prohibited systems, high-risk systems, systems subject to transparency requirements, and the largest foundation models.

Importantly, the Act emphasizes the regulation of general-purpose models, whether open-source or proprietary, that present systemic risks. Developers of these models must meet obligations around documentation, model evaluation, energy-use reporting, cybersecurity, and other compliance requirements. This approach aligns with our perspective at GitHub, which prioritizes risk awareness while fostering responsible innovation.

### Is the Regulation Adequate?

While the regulation's text is still pending, the early negotiations have shown a balanced approach between regulating high-risk scenarios and promoting open innovation—a stance we've advocated since the European Commission's initial proposal in 2021. As the finer details of compliance continue to evolve, the political agreement reached thus far offers a positive outlook for developers.

### Consequences of Not Having the Exemption

Without the exemption for open source, we would likely see a significant decline in developers' contributions to open-source AI within Europe. This reduction wouldn't just affect the EU; the repercussions would echo globally. Open-source components underpin much of today's digital landscape, with recent reports showing they are present in 96% of codebases and make up 76% of the code within them. A retreat from open-source development in the EU could stifle innovation in AI on a worldwide scale.

### Risks of Awaiting Final Confirmation

Even with a provisional agreement aimed at addressing high-risk AI without impeding broader innovation, the delay in releasing a consolidated text until January poses potential risks. There remains a concern that the final wording will lack the clarity needed for open-source projects. Nevertheless, I remain hopeful that policymakers are tuned in to community feedback, ensuring a text that allows for responsible and open developer collaboration within the EU.

### Challenges for Open Source Project Organizations

Though the open-source community awaits clarity on these regulations, organizations involved in AI open-source projects still face significant challenges, notably in attracting skilled technical talent. To address these concerns, GitHub is committed to bolstering the open-source ecosystem by raising awareness among organizations about their reliance on open-source technologies and offering pathways for investment in critical projects.

Our GitHub Sponsors program exemplifies this dedication, providing financial support to maintainers whose work keeps the software supply chain healthy. These funds can, for instance, help maintainers bring in the technical expertise their projects need, a benefit to both the maintainers and the organizations that depend on their work.

It’s encouraging that our approaches have informed governmental initiatives as well. The German Sovereign Tech Fund, for example, supports vital open-source projects financially. Our partnership with the Open Technology Fund’s Free and Open Source Software Sustainability Fund aims to enhance internet freedom infrastructure.

As organizations address talent acquisition challenges, it's crucial to cultivate an engaging workplace that attracts developers. This can involve investing in AI tools that streamline processes, maintaining open lines of communication with leadership, and striving to improve diversity and representation within teams.

Ultimately, cultivating an ecosystem that nurtures open-source innovation will be key to advancing AI responsibly and effectively.
