META TOPICPARENT | name="FirstEssay" |
Rocky Mountain High: Colorado’s AI Act and Hallucinating about Open-Source
-- By MichaelMacKay - 25 Oct 2024

Fool’s Gold
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver took too little from Brussels and put too much beyond the state’s reach. “Today, most commercial software includes open source components,”[2] and last year, researchers at Harvard Business School estimated that open-source software (OSS) produced at least $4.5T in economic value. As DeepSeek demonstrates, it is now more doubtful than ever that developing “high-risk AI” requires proprietary ownership. Before the CAIA goes into effect on February 6, 2026, its omission of OSS is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align more closely with the EU AI Act, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to attack the issue collaterally, as Sacramento has with the California Consumer Privacy Act (CCPA). Here, the direct approach is favored.
Open-source Opening
When DeepSeek, an OSS project from China, was released, Nvidia, a chip manufacturer, lost nearly $600B in value as its share price tumbled 17%.[3] Once the world’s most valuable company, Nvidia was chastened by news that massive amounts of compute power would no longer be necessary for cutting-edge development in AI: compared to OpenAI’s leading model, o1, DeepSeek performed at least as well across a battery of tests at comparatively negligible cost. Originally, OpenAI was founded as a nonprofit to promote open-source AI (ergo, OpenAI), but somewhere along the way, that founding mythos was lost; apparently, when humans could no longer handle the gift of knowledge, Prometheus returned to reclaim his fire (Washington State may have a different Mount Olympus, but the practice of lighting money on fire may yet be traced to Redmond).

As for the CAIA, the arrival of DeepSeek was less a “Sputnik moment” than a lesson from Silicon Valley’s recent history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers at Berkeley launched PostgreSQL, which became a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold 3 (from the DeepMind team behind the 2024 Nobel Prize in Chemistry) has publicly released code, so Denver’s myopic view of the “developer” behind major breakthroughs misses the scientific process that underlies progress in software. To that end, where many scientific journals face replicability problems, OSS virtually rests in a state of reproducibility: pull requests are hypotheses, merged if they hold and reverted to the previous build if they fail. Thus, superior code ships where collaboration and criticism are hallmarks of distributed production.
Minding the Gap
Under the CAIA, a “developer” or “deployer” of AI systems that interact with Coloradans may be subject to AG sanctions if, within 90 days of learning how a “high-risk” AI system caused algorithmic discrimination, it makes no follow-up disclosure; here, however, there is an open gap for OSS. Modeled after Article 12(1) of the EU AI Act (“High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.”), Denver monitors the same range of “high-risk” activities outsourced to dice-rolling AI (e.g. deciding whom to hire, whom to give a home loan, etc.), but a closer read of one key provision in Section 1 (“Definitions”) betrays a chasm. Colorado says:

(7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM.

Brussels, by contrast, says:

(3) ‘provider’ means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge;[4]
Put differently, the CAIA cannot bind OSS like DeepSeek. Typically, OSS is provided as-is, with broad disclaimers against warranty, indemnity, or other liability, and DeepSeek’s MIT license raises a related issue within the world of free software. Apache, BSD, and MIT licenses, for instance, do not require the disclosure of source code (unlike GNU or Mozilla licenses), so the fact that DeepSeek’s GitHub repo exposes some of its source code is a courtesy, not a requirement. Still, Colorado’s AG could probably promulgate rules on licensing without further changing the law,[5] and the CAIA might instead read:
(7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE, whether for payment or free of charge, THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM.
Until the full EU AI Act goes into effect on August 2, 2026, there will inevitably be unknowns in enforcement, but Denver can fork the legal code now to reach even mature OSS developers like Red Hat (which upsells services, not necessarily free AI itself) and industry leaders like DeepSeek.[6]
California Dreaming
Alternatively, if regulators fear entering a deep sea of open-source development, they may align the Colorado Privacy Act with the California Privacy Rights Act (CPRA), but such a roundabout approach to AI policy would likely fall short of the CAIA’s goals. The CPRA, which amended California’s CCPA in 2023, identified “automated decision-making” by AI as a liability borne by the business. However, the CPRA required compliance only from businesses earning $25 million in annual gross revenue, which betrays how dimly the state must have viewed OSS projects. Of course, Californians’ amended CCPA still provides more protection than Coloradans’ CPA. If a bank like Wells Fargo were to use AI in mortgage servicing, it could still be found liable under California’s data-only GLBA exemption, whereas Colorado (like most states with privacy laws) would exempt the entire financial institution at the GLBA entity level.[7] Still, as a matter of public policy, Sacramento’s for-profit focus does not warrant imitation when bots can be even more problematic pretenders offered for free.

Up the Mountain
Ultimately, proponents of AI reform should call out the CAIA’s continental drift. EU member states are not disinterested aggregators of data; just swipe off the tram in Amsterdam! But by ensuring that open-source AI is covered when seeking source code from any “developer” (or “provider,” per the EU AI Act), Colorado will better operationalize what it has already borrowed from Europe (e.g. per 6-1-1703, compliance for a “deployer” depends on cooperation from a “developer,” so a deployer of open-source AI whose source code is only partly available could not comply). Of course, there are other issues in the CAIA’s design, such as its mandate that a “deployer” follow frameworks like NIST’s AI Risk Management Framework or ISO/IEC 42001, governance structures better suited to traditional corporate environments than to decentralized development communities. Absent federal legislation, though, this first salvo from the states will shape the national conversation as a whole, especially as Texas considers its own bill based on Colorado’s. Unless more “comprehensive” language truly reflects the fact that “open source is eating software faster than software is eating the world,” the CAIA will tend to see friends around the campfire... and everybody’s high.