Law in the Internet Society

MichaelMacKayFirstEssay 11 - 16 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 16 to 16
 

Open-source Opening

Changed:
<
<
When DeepSeek? was released, U.S. stocks lost $1T in value, as Nvidia was chastened by news that its chips might not be as necessary for developing AI. Compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well for relatively negligible overhead costs. As for the CAIA then, the arrival of DeepSeek? was not so much a “sputnik” moment as much as another lesson from recent history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, a much-beloved OSS alternative. Today, some industry studies indicate the latter has more market share than Oracle’s current suite. Even AlphaFold 3 (behind the 2024 Nobel Prize in Chemistry) is OSS, and where replicability issues in the sciences abound, OSS virtually sits in a state of truth. Thus, Denver should acknowledge how the collaborative conditions of free software enable superior code.
>
>
When DeepSeek was released, U.S. stocks lost $1T in value, as Nvidia was chastened by news that its chips might not be as necessary for cutting-edge AI. Compared to OpenAI’s leading model, o1, DeepSeek performed at least as well at relatively negligible overhead cost. For the CAIA, then, the arrival of DeepSeek was not so much a “Sputnik” moment as another lesson from recent history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers at Berkeley launched PostgreSQL, a much-beloved OSS alternative. Today, some industry studies indicate the latter has more market share than Oracle’s current suite. Even AlphaFold 3 (behind the 2024 Nobel Prize in Chemistry) is OSS, and where replicability issues abound in the sciences, OSS virtually sits in a state of truth. Thus, Denver should acknowledge how the collaborative conditions of free software enable superior code.
 

Brussels' Boilerplate

Line: 29 to 29
(3) ‘provider’ means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge [emphasis added];[2]
Changed:
<
<
Put differently, the CAIA does not countenance OSS. Typically, OSS is provided as-is for free with broad disclaimers against warranty, indemnity, or other liability, but where DeepSeek? is offered on an MIT license, there is also another issue. Specifically, Apache, BSD, and MIT licenses do not require the disclosure of source code—unlike GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo reveals some source code is not required. Nevertheless, Colorado's AG can promulgate rules around proper licensing without further changing Europe's framework, which should be adopted:[3]
>
>
Put differently, the CAIA does not countenance OSS. Typically, OSS is provided as-is, for free, with broad disclaimers against warranty, indemnity, or other liability, but where DeepSeek is offered under an MIT license, there is another issue. Specifically, the Apache, BSD, and MIT licenses do not require the disclosure of source code—unlike the GNU or Mozilla licenses—so the source code that DeepSeek’s GitHub repo reveals is disclosed voluntarily, not by requirement. Nevertheless, Colorado's AG can promulgate rules around proper licensing without further altering Europe's framework,[3] whose language should be adopted:
 (7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE, whether for payment or free of charge..."

California Dreaming

Changed:
<
<
Alternatively, Denver can attack the issue collaterally by looking to the California Privacy Rights Act (CPRA), but such a roundabout solution is arguably unwise. CPRA, which changed California’s CCPA in 2023, identified “automated decision-making” by AI, as a liability borne by the business. But CPRA only required compliance from businesses earning $25 million in annual gross revenue, betraying how dimly OSS must have been viewed by the state.[4] Of course, the CCPA is more robust than the CPA. For instance, if Wells Fargo were to use AI in mortgage servicing, it could still be found liable under California’s GLBA data-only exemption, whereas Colorado would exempt it at the GLBA entity-level.[5] Still, as a matter of public policy, Sacramento's for-profit focus probably does not warrant imitation, especially as bots' imitations are probably most pernicious when available for free (n.b. the commercial success of RedHat? ).
>
>
Alternatively, Denver can regulate AI indirectly by looking to the California Privacy Rights Act (CPRA), but such a roundabout solution is arguably unwise. The CPRA, which amended California’s CCPA in 2023, identified “automated decision-making” by AI as a liability borne by the business. But the CPRA only required compliance from businesses earning over $25 million in annual gross revenue, betraying how dimly OSS must have been viewed by the state.[4] Of course, the CCPA is more robust than the CPA. For instance, if Wells Fargo were to use AI in mortgage servicing, it could still be found liable under California’s GLBA data-only exemption, whereas Colorado would exempt the bank at the GLBA entity level.[5] Still, as a matter of public policy, Sacramento's for-profit focus probably does not warrant imitation, especially as bots' imitations are likely more pervasive when available for free (n.b. the commercial success of Red Hat).
 

Up the Mountain

Changed:
<
<
Ultimately, the current White House may view any regulation skeptically, but Denver's proponents of AI reform should probably reconsider the CAIA's continental drift. EU member states are not disinterested actors—just swipe off the tram in Amsterdam! But by gathering and analyzing all relevant source code from any relevant "developer" (or "provider" per the EU), Colorado will at least ensure what it has borrowed from Europe works (e.g. per 6-1-1703 of the CAIA, compliance for a "deployer" depends on cooperation from a "developer"). Surely, other issues remain relevant, particularly how CAIA confers a rebuttable presumption of reasonable care where a "deployer" follows frameworks like ISO/IEC 42001 or NIST's AI Risk Management Framework that OSS projects may find hard to implement. Yet, absent federal legislation, this first salvo from the states will shape the national conversation as a whole, as evidenced by Texas imitating Colorado. However, so long as the law does not acknowledge that "open source is eating software faster than software is eating the world,” legislators will probably just see friends around the campfire and everybody’s high...
>
>
Ultimately, the current White House may view regulation skeptically, but Denver's proponents of AI reform should probably reconsider the CAIA's continental drift. EU member states are not disinterested actors—just swipe off the tram in Amsterdam! But by gathering and analyzing all relevant source code from any "developer" (or "provider," per the EU), Colorado will at least ensure that what it has borrowed from Europe works (e.g., per 6-1-1703 of the CAIA, compliance for a "deployer" depends on cooperation from a "developer"). Surely, other issues remain, particularly how the CAIA confers a rebuttable presumption of reasonable care where a "deployer" follows frameworks like ISO/IEC 42001 or NIST's AI Risk Management Framework, which OSS projects may struggle to implement. Yet, absent federal legislation, this first salvo from the states will shape the national conversation as a whole, as evidenced by Texas imitating Colorado. However, so long as the CAIA fails to acknowledge that “open source is eating software faster than software is eating the world,” legislators will probably just see friends around the campfire and everybody’s high...
 

Endnotes:

Changed:
<
<
  1. David Tollen, The Tech Contracts Handbook, Appendix 2 (ABA Publishing, 2021).
>
>
  1. David Tollen, The Tech Contracts Handbook, Appendix 2 (ABA Publishing, 2021).
 
  2. See also, “(10) ‘making available on the market’ means the supply of an AI system or a general-purpose AI model... whether in return for payment or free of charge;”
  3. As Colorado's AG is empowered under the CAIA to issue rules for enforcement, Denver could also say which licenses satisfy required disclosure (e.g. CDDL, EPL, GPL, MPL, etc.).
  4. DeepSeek was recently the most downloaded free app on Apple's App Store and Google Play.

MichaelMacKayFirstEssay 10 - 16 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 11 to 11
 

Fool’s Gold

Changed:
<
<
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver took too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,”[2] and last year, researchers at Harvard Business School estimated that open-source software (OSS) produced at least $4.5T in economic value. Now, with DeepSeek? , it is more doubtful than ever that developing “high-risk AI” requires proprietary ownership, so before the CAIA goes into effect February 6, 2026, the CAIA's omission of OSS is an open wound that Denver should patch.
>
>
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,” but apparently, Denver took too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,”[1] and last year, researchers at Harvard Business School estimated that open-source software (OSS) produced at least $4.5T in economic value. Now, with DeepSeek, it is more doubtful than ever that developing “high-risk AI” requires proprietary ownership, so the CAIA's omission of OSS is an open wound that should be patched before the law's effective date, February 6, 2026.
 

Open-source Opening

Changed:
<
<
When DeepSeek? was released, U.S. stocks lost $1T in value.[3] Once the world’s most valuable company, Nvidia was chastened by news that its most advanced chips might not be as necessary for cutting-edge AI—compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well at comparatively negligible costs (ironically, OpenAI? was founded as a nonprofit to promote open-source AI, ergo, "OpenAI").As for the CAIA, the arrival of DeepSeek? then was not so much a “sputnik” moment as much as a lesson from Silicon Valley's history. In 1979, for example, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold 3 (behind the 2024 Nobel Prize in Chemistry) is OSS. To that end, where there are replicability issues in the sciences, OSS virtually sits in a state of truth, so apparently, Denver missed how developers tend to ship superior code wherever there is real collaboration and criticism as with free software.
>
>
When DeepSeek? was released, U.S. stocks lost $1T in value, as Nvidia was chastened by news that its chips might not be as necessary for developing AI. Compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well for relatively negligible overhead costs. As for the CAIA then, the arrival of DeepSeek? was not so much a “sputnik” moment as much as another lesson from recent history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, a much-beloved OSS alternative. Today, some industry studies indicate the latter has more market share than Oracle’s current suite. Even AlphaFold 3 (behind the 2024 Nobel Prize in Chemistry) is OSS, and where replicability issues in the sciences abound, OSS virtually sits in a state of truth. Thus, Denver should acknowledge how the collaborative conditions of free software enable superior code.
 
Changed:
<
<

Minding the Gap

>
>

Brussels' Boilerplate

 
Changed:
<
<
Inspired by Article 12(1) of the EU AI Act (“High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.”), Denver monitors the same range of “high-risk” activities outsourced to AI (e.g. deciding whom to hire, whom to give a home loan, etc.), but there is a wide chasm between the EU AI Act and the CAIA's language per Section 1 (“Definitions”):
>
>
Embracing Article 12(1) of the EU AI Act (“High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.”), Denver monitors the same range of “high-risk” activities outsourced to AI (e.g., deciding whom to hire, whom to give a home loan, etc.), but a wide chasm exists between the EU AI Act and the CAIA's Section 1 (“Definitions”):
 (7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM.
Changed:
<
<
Whereas the EU AI Act's Article 3 says:
>
>
By contrast, the EU AI Act's Article 3 says:
 
Changed:
<
<
(3) ‘provider’ means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge;[4]
>
>
(3) ‘provider’ means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge [emphasis added];[2]
 
Changed:
<
<
Put differently, the CAIA does not countenance OSS like DeepSeek? . Typically, OSS is provided as-is for free with broad disclaimers against warranty, indemnity, or other liability, but where DeepSeek? is offered on an MIT license, there is also a related issue. Apache, BSD, and MIT licenses, for instance, do not require the disclosure of source code—unlike GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo exposes some of its source code is unnecessary. Colorado's AG could promulgate rules on proper licensing without further changing the European framework:[5]
>
>
Put differently, the CAIA does not countenance OSS. Typically, OSS is provided as-is for free with broad disclaimers against warranty, indemnity, or other liability, but where DeepSeek? is offered on an MIT license, there is also another issue. Specifically, Apache, BSD, and MIT licenses do not require the disclosure of source code—unlike GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo reveals some source code is not required. Nevertheless, Colorado's AG can promulgate rules around proper licensing without further changing Europe's framework, which should be adopted:[3]
 (7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE, whether for payment or free of charge..."
Deleted:
<
<
However, until the full EU AI Act goes into effect August 2, 2026, there will be certain unknowns in enforcement. Nevertheless, forking the full legal code makes more sense for Colorado where there are known quantities like RedHat? (an OSS developer, upselling some services but not necessarily core products that could become industry leaders in AI).[6]
 

California Dreaming

Changed:
<
<
Alternatively, Denver may attack the issue collaterally by looking to the California Privacy Rights Act (CPRA), but such a roundabout solution is arguably unwise. CPRA, which changed California’s CCPA in 2023, identified “automated decision-making” by AI, as a liability borne by the business. However, CPRA only required compliance from businesses earning $25 million in annual gross revenue, betraying how dimly OSS projects must have been viewed by the state. Of course, the CCPA is more robust than the CPA generally. If Wells Fargo were to use AI in mortgage servicing, for instance, it could be found liable under California’s GLBA data-only exemption, but Colorado (like most states with privacy policies) would exempt such a financial institution at the GLBA entity-level.[7] But as a matter of public policy, Sacramento's for-profit focus probably does not warrant imitation, as bots' imitations are probably more problematic when available for free.
>
>
Alternatively, Denver can attack the issue collaterally by looking to the California Privacy Rights Act (CPRA), but such a roundabout solution is arguably unwise. CPRA, which changed California’s CCPA in 2023, identified “automated decision-making” by AI, as a liability borne by the business. But CPRA only required compliance from businesses earning $25 million in annual gross revenue, betraying how dimly OSS must have been viewed by the state.[4] Of course, the CCPA is more robust than the CPA. For instance, if Wells Fargo were to use AI in mortgage servicing, it could still be found liable under California’s GLBA data-only exemption, whereas Colorado would exempt it at the GLBA entity-level.[5] Still, as a matter of public policy, Sacramento's for-profit focus probably does not warrant imitation, especially as bots' imitations are probably most pernicious when available for free (n.b. the commercial success of RedHat? ).
 

Up the Mountain

Changed:
<
<
Ultimately, there are drawbacks to any new regime, but Denver's proponents of AI reform should probably reconsider the CAIA's continental drift. EU member states are not disinterested aggregators of data—just swipe off the tram in Amsterdam! But by gathering and analyzing all relevant source code from any "developer" (or "provider" per the EU AI Act), Colorado would at least ensure what has been borrowed from Europe actually works (e.g. per 6-1-1703 of the CAIA, compliance for a "deployer" depends on cooperation from a "developer"). Surely, other issues remain like how the CAIA confers a rebuttable presumption of reasonable care on any "deployer" that follows frameworks like ISO/IEC 42001 or NIST's AI Risk Management Framework, which may be unrealistic for decentralized developer communities outside the US. Yet, absent federal legislation, this first salvo from the states will likely shape the national conversation as a whole, as shown recently by Texas looking to the CAIA for guidance. Still, unless new “comprehensive” language truly reflects the fact that "open source is eating software faster than software is eating the world,” legislators will tend to just see friends around the campfire and everybody’s high...
>
>
Ultimately, the current White House may view any regulation skeptically, but Denver's proponents of AI reform should probably reconsider the CAIA's continental drift. EU member states are not disinterested actors—just swipe off the tram in Amsterdam! But by gathering and analyzing all relevant source code from any relevant "developer" (or "provider" per the EU), Colorado will at least ensure what it has borrowed from Europe works (e.g. per 6-1-1703 of the CAIA, compliance for a "deployer" depends on cooperation from a "developer"). Surely, other issues remain relevant, particularly how CAIA confers a rebuttable presumption of reasonable care where a "deployer" follows frameworks like ISO/IEC 42001 or NIST's AI Risk Management Framework that OSS projects may find hard to implement. Yet, absent federal legislation, this first salvo from the states will shape the national conversation as a whole, as evidenced by Texas imitating Colorado. However, so long as the law does not acknowledge that "open source is eating software faster than software is eating the world,” legislators will probably just see friends around the campfire and everybody’s high...
 
Added:
>
>
Endnotes:
 
Deleted:
<
<
Endnotes:
  1. Robert W. Gordon, The Citizen-Lawyer - A Brief Informal History of a Myth with Some Basis in Reality, 50 Wm. & Mary L. Rev. 1169 (2009), https://scholarship.law.wm.edu/wmlr/vol50/iss4/4, p. 1182.
 
  1. David Tollen, The Tech Contracts Handbook, Appendix 2 (ABA Publishing, 2021).
Changed:
<
<
  1. Karen Friar and Ines Ferré, DeepSeek? sell-off reminds investors of the biggest earnings story holding up the stock market, Yahoo Finance (January 27, 2025), https://finance.yahoo.com/news/live/stock-market-today-nasdaq-clobbered-nvidia-sinks-17-while-dow-stages-comeback-as-ai-fears-shake-markets-210101592.html.
  2. See also, “(10) ‘making available on the market’ means the supply of an AI system or a general-purpose AI model for distribution or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge;”
  3. The state AG—already empowered under the CAIA to issue rules for enforcement—could further specify appropriate licenses (e.g. CDDL, EPL, GPL, MPL, etc.).
  4. In January, DeepSeek? was the most downloaded free app on the AppStore? and Google Play.
  5. Only 20 states have data privacy policies: thirteen have exemptions for data and entities under the Gramm-Leach-Bliley Act; four for just GLBA entities; and three (CO, OR, and MN) for GLBA data-only.
>
>
  2. See also, “(10) ‘making available on the market’ means the supply of an AI system or a general-purpose AI model... whether in return for payment or free of charge;”
  3. As Colorado's AG is empowered under the CAIA to issue rules for enforcement, Denver could also say which licenses satisfy required disclosure (e.g. CDDL, EPL, GPL, MPL, etc.).
  4. DeepSeek was recently the most downloaded free app on Apple's App Store and Google Play.
  5. Only 20 states have data privacy policies: thirteen exempt Gramm-Leach-Bliley Act entities and data; four, GLBA entities only; and three, just GLBA data.
 



MichaelMacKayFirstEssay 9 - 15 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 11 to 11
 

Fool’s Gold

Changed:
<
<
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver took too little from Brussels, which put too much out of reach of the state. “Today, most commercial software includes open source components,”[2] and last year, researchers at Harvard Business School estimated that open-source software (OSS) produced at least $4.5T in economic value. As demonstrated by DeepSeek? , it is now more doubtful than ever that developing “high-risk AI” requires proprietary ownership, so before the CAIA goes into effect February 6, 2026, the omission of OSS from the CAIA is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align better with the EU, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to collaterally attack the issue as Sacramento has with the California Consumer Privacy Act (CCPA). Here, the direct approach is favored.
>
>
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver took too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,”[2] and last year, researchers at Harvard Business School estimated that open-source software (OSS) produced at least $4.5T in economic value. Now, with DeepSeek? , it is more doubtful than ever that developing “high-risk AI” requires proprietary ownership, so before the CAIA goes into effect February 6, 2026, the CAIA's omission of OSS is an open wound that Denver should patch.
 

Open-source Opening

Changed:
<
<
When DeepSeek? was released, Nvidia, a chips manufacturer, lost $1T in value, as its share price tumbled 17%.[3] Once the world’s most valuable company, Nvidia was chastened by news that massive amounts of compute power would no longer be necessary for cutting-edge development in AI—compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well against a battery of tests at comparatively negligible costs. Originally, OpenAI? was founded as a nonprofit to promote open-source AI (ergo, OpenAI? ), but somewhere along the way, that founding mythos was lost—apparently, when humans could no longer handle the gift of knowledge, Prometheus returned to reclaim that fire (Washington State may have a different Mount Olympus, but some suggest lighting money on fire can be traced to Redmond).

As for the CAIA, the arrival of DeepSeek? was not so much a “sputnik” moment, as much as a lesson from Silicon Valley’s recent history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, which became a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold 3 (the model behind the 2024 Nobel Prize in Chemistry) is OSS, so Denver’s myopic view of the “developer” behind major breakthroughs misses the scientific process that underlies progress in software. To that end, where there are replicability issues in many scientific journals, OSS virtually sits in a state of truth, insofar as pull requests are hypotheses (merged, if true, or else restored to previous builds, if false). Thus, superior code is shipped where collaboration and criticism are hallmarks of distributed production.

>
>
When DeepSeek? was released, U.S. stocks lost $1T in value.[3] Once the world’s most valuable company, Nvidia was chastened by news that its most advanced chips might not be as necessary for cutting-edge AI—compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well at comparatively negligible costs (ironically, OpenAI? was founded as a nonprofit to promote open-source AI, ergo, "OpenAI").As for the CAIA, the arrival of DeepSeek? then was not so much a “sputnik” moment as much as a lesson from Silicon Valley's history. In 1979, for example, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold 3 (behind the 2024 Nobel Prize in Chemistry) is OSS. To that end, where there are replicability issues in the sciences, OSS virtually sits in a state of truth, so apparently, Denver missed how developers tend to ship superior code wherever there is real collaboration and criticism as with free software.
 

Minding the Gap

Changed:
<
<
Under the CAIA, a “developer” or “deployer” of AI systems which interact with Coloradans may be subject to AG sanctions if, within 90 days of learning how the “high-risk AI” system caused algorithmic discrimination, but here, there is an open gap for OSS. Modeled after Article 12(1) of the EU AI Act (“High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.”), Denver appears to monitor the same range of “high-risk” activities outsourced to dice-rolling AI (e.g. deciding whom to hire, whom to give a home loan, etc.), but a closer read of one key provision in Section 1 (“Definitions”) betrays a chasm in Colorado:
>
>
Inspired by Article 12(1) of the EU AI Act (“High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.”), Denver monitors the same range of “high-risk” activities outsourced to AI (e.g. deciding whom to hire, whom to give a home loan, etc.), but there is a wide chasm between the EU AI Act and the CAIA's language per Section 1 (“Definitions”):
 (7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM.
Changed:
<
<
Whereas Article 3 of the EU AI Act says:
>
>
Whereas the EU AI Act's Article 3 says:
 (3) ‘provider’ means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge;[4]
Changed:
<
<
Put differently, the CAIA cannot bind OSS like DeepSeek? . Typically, OSS is provided as-is with broad disclaimers against warranty, indemnity, or other liability, but where DeepSeek? is offered on an MIT license, there is a related issue. Apache, BSD, and MIT licenses, for instance, do not require the disclosure of source code—unlike GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo exposes some of its source code is not a requirement. Still, Colorado's AG could promulgate rules on licensing without further changing the law beyond embracing OSS:[5]
>
>
Put differently, the CAIA does not countenance OSS like DeepSeek? . Typically, OSS is provided as-is for free with broad disclaimers against warranty, indemnity, or other liability, but where DeepSeek? is offered on an MIT license, there is also a related issue. Apache, BSD, and MIT licenses, for instance, do not require the disclosure of source code—unlike GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo exposes some of its source code is unnecessary. Colorado's AG could promulgate rules on proper licensing without further changing the European framework:[5]
 (7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE, whether for payment or free of charge..."
Changed:
<
<
Still, until the full EU AI Act goes into effect August 2, 2026, there will inevitably be unknowns in enforcement. However, Denver can more fully fork the legal code to better address problems posed by even mature OSS developers like RedHat? (upselling services but not necessarily core products, which could be AI).[6]
>
>
However, until the full EU AI Act goes into effect August 2, 2026, there will be certain unknowns in enforcement. Nevertheless, forking the full legal code makes more sense for Colorado where there are known quantities like RedHat? (an OSS developer, upselling some services but not necessarily core products that could become industry leaders in AI).[6]
 

California Dreaming

Changed:
<
<
Alternatively, if regulators fear a deep sea of OSS, they may align the Colorado Privacy Act with the California Privacy Rights Act (CPRA), but such an roundabout approach to regulation is arguably unwise. CPRA, which amended California’s CCPA in 2023, identified “automated decision-making” by AI, as liability borne by the business. However, CPRA only required compliance from businesses earning $25 million in annual gross revenue, which belied how dimly OSS projects were viewed. Of course, Californians’ CCPA provides more protections than Coloradoans’ CPA. If Wells Fargo were to use AI in mortgage servicing, it could still found be found liable under California’s GLBA data-only exemption, but Colorado (like most states with privacy policies) would exempt that financial institution at the GLBA entity-level.[7] Still, as a matter of public policy, Sacramento's for-profit focus means it probably does not warrant imitation where bots can be even more problematic pretenders when offered for free.
>
>
Alternatively, Denver may attack the issue collaterally by looking to the California Privacy Rights Act (CPRA), but such a roundabout solution is arguably unwise. CPRA, which changed California’s CCPA in 2023, identified “automated decision-making” by AI, as a liability borne by the business. However, CPRA only required compliance from businesses earning $25 million in annual gross revenue, betraying how dimly OSS projects must have been viewed by the state. Of course, the CCPA is more robust than the CPA generally. If Wells Fargo were to use AI in mortgage servicing, for instance, it could be found liable under California’s GLBA data-only exemption, but Colorado (like most states with privacy policies) would exempt such a financial institution at the GLBA entity-level.[7] But as a matter of public policy, Sacramento's for-profit focus probably does not warrant imitation, as bots' imitations are probably more problematic when available for free.
 

Up the Mountain

Changed:
<
<
Ultimately, proponents of AI reform should call out the CAIA's continental drift. EU member states are not disinterested actors—just swipe off the tram in Amsterdam! But by ensuring that open-source AI is envisaged when seeking out the source code from any "developer" (or "provider" per the EU AI Act), Colorado will better act on what it has already borrowed from Europe (e.g. per 6-1-1703, compliance for a "deployer" depends on cooperation from a "developer"). Of course, there are still other issues in the CAIA's design like how it mandates a "deployer" follow frameworks like NIST's AI Risk Management Framework or ISO/IEC 42001, which are governance structures maybe better suited for traditional corporate environments than decentralized communities. Absent federal legislation, though, first salvo from the states will shape national conversation as a whole, especially as Texas considers its own bill based on Colorado's. However, unless new “comprehensive” language truly reflects the fact "open source is eating software faster than software is eating the world,” the CAIA will tend to see friends around the campfire... and everybody’s high.
>
>
Ultimately, there are drawbacks to any new regime, but Denver's proponents of AI reform should probably reconsider the CAIA's continental drift. EU member states are not disinterested aggregators of data—just swipe off the tram in Amsterdam! But by gathering and analyzing all relevant source code from any "developer" (or "provider" per the EU AI Act), Colorado would at least ensure what has been borrowed from Europe actually works (e.g. per 6-1-1703 of the CAIA, compliance for a "deployer" depends on cooperation from a "developer"). Surely, other issues remain like how the CAIA confers a rebuttable presumption of reasonable care on any "deployer" that follows frameworks like ISO/IEC 42001 or NIST's AI Risk Management Framework, which may be unrealistic for decentralized developer communities outside the US. Yet, absent federal legislation, this first salvo from the states will likely shape the national conversation as a whole, as shown recently by Texas looking to the CAIA for guidance. Still, unless new “comprehensive” language truly reflects the fact that "open source is eating software faster than software is eating the world,” legislators will tend to just see friends around the campfire and everybody’s high...
 
Changed:
<
<
Endnotes
  1. Robert Gordon, The Citizen Lawyer, A Brief Informal History of Myth with Some Basis in Reality, p. 1182.
>
>
Endnotes:
  1. Robert W. Gordon, The Citizen-Lawyer - A Brief Informal History of a Myth with Some Basis in Reality, 50 Wm. & Mary L. Rev. 1169 (2009), https://scholarship.law.wm.edu/wmlr/vol50/iss4/4, p. 1182.
 
  1. David Tollen, The Tech Contracts Handbook, Appendix 2 (ABA Publishing, 2021).
  2. Karen Friar and Ines Ferré, DeepSeek? sell-off reminds investors of the biggest earnings story holding up the stock market, Yahoo Finance (January 27, 2025), https://finance.yahoo.com/news/live/stock-market-today-nasdaq-clobbered-nvidia-sinks-17-while-dow-stages-comeback-as-ai-fears-shake-markets-210101592.html.
  3. See also, “(10) ‘making available on the market’ means the supply of an AI system or a general-purpose AI model for distribution or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge;”

MichaelMacKayFirstEssay 8 - 15 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Changed:
<
<

Rocky Mountain High: Colorado’s AI Act and Hallucinations of Open-Source

>
>

Rocky Mountain High: Colorado’s AI Act and Hallucinating about Open-Source

  -- By MichaelMacKay - 25 Oct 2024
Line: 11 to 11
 

Fool’s Gold

Changed:
<
<
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver has taken too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,”[2] and last year, researchers at Harvard Business School estimated that, conservatively, OSS produced at least $4.5T in economic value. Now with DeepSeek? , it is more doubtful than ever that developing “high-risk AI” will require proprietary ownership. Before the CAIA goes into effect February 6, 2026, its omission of open source software (OSS) is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align more closely with the EU, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to collaterally attack the issue as Sacramento did with the updated California Consumer Privacy Act (CCPA).
>
>
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver took too little from Brussels, which put too much out of reach of the state. “Today, most commercial software includes open source components,”[2] and last year, researchers at Harvard Business School estimated that open-source software (OSS) produced at least $4.5T in economic value. As demonstrated by DeepSeek? , it is now more doubtful than ever that developing “high-risk AI” requires proprietary ownership, so before the CAIA goes into effect February 6, 2026, the omission of OSS from the CAIA is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align better with the EU, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to collaterally attack the issue as Sacramento has with the California Consumer Privacy Act (CCPA). Here, the direct approach is favored.
 

Open-source Opening

Changed:
<
<
When DeepSeek? , an OSS project from China, was released, Nvidia, a chips manufacturer, lost $1T in value, as share price tumbled 17%.[3] One of the world’s most valuable companies, Nvidia was chastened by news that massive amounts of compute power would no longer be necessary for cutting-edge development in AI—compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well across a battery of tests at comparatively negligible costs. Originally, OpenAI? was founded as a nonprofit to promote the open source development of AI (ergo, OpenAI? ), but that founding mythos was lost somewhere along the way—apparently, when humans could no longer handle the gift of knowledge, Prometheus also returned to reclaim fire (Washington State may have a different Mount Olympus, but the ashes of lighting money on fire may yet be traced to Redmond). Hence, as for the CAIA and other aspiring regulators, the arrival of DeepSeek? was not so much a “sputnik” moment, as much as a wake-up call from Silicon Valley’s recent economic history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, which became a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold 3 (the model behind the 2024 Nobel Prize in Chemistry) is OSS, so Denver’s narrow view of the “developer” behind major breakthroughs appears to miss the scientific process that underlies progress in software. Notably, where there are replicability issues in scientific journals, OSS projects virtually rest in a state of truth, where pull requests are functionally hypotheses, extending the state of knowledge (merged, if true, or else restored to previous builds). Generally, superior code is shipped where collaboration and criticism are the defaults of production.
>
>
When DeepSeek? was released, Nvidia, a chips manufacturer, lost $1T in value, as its share price tumbled 17%.[3] Once the world’s most valuable company, Nvidia was chastened by news that massive amounts of compute power would no longer be necessary for cutting-edge development in AI—compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well against a battery of tests at comparatively negligible costs. Originally, OpenAI? was founded as a nonprofit to promote open-source AI (ergo, OpenAI? ), but somewhere along the way, that founding mythos was lost—apparently, when humans could no longer handle the gift of knowledge, Prometheus returned to reclaim that fire (Washington State may have a different Mount Olympus, but some suggest lighting money on fire can be traced to Redmond).

As for the CAIA, the arrival of DeepSeek? was not so much a “sputnik” moment, as much as a lesson from Silicon Valley’s recent history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, which became a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold 3 (the model behind the 2024 Nobel Prize in Chemistry) is OSS, so Denver’s myopic view of the “developer” behind major breakthroughs misses the scientific process that underlies progress in software. To that end, where there are replicability issues in many scientific journals, OSS virtually sits in a state of truth, insofar as pull requests are hypotheses (merged, if true, or else restored to previous builds, if false). Thus, superior code is shipped where collaboration and criticism are hallmarks of distributed production.

 

Minding the Gap

Changed:
<
<
Under the CAIA, a “developer” or “deployer” of AI systems which interact with Coloradans is subject to AG sanctions if, within 90 days of learning how the “high-risk AI” system caused algorithmic discrimination, there is no follow-up disclosure. Inspired by Article 12(1) of the EU AI Act (“High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.”), Denver monitors the same range of “high-risk” activities outsourced to AI (e.g. deciding whom to hire, whom to give a home loan, etc.), but a closer read of one key provision in Section 1 (“Definitions”) betrays a wide chasm, as Colorado says:
>
>
Under the CAIA, a “developer” or “deployer” of AI systems which interact with Coloradans may be subject to AG sanctions if, within 90 days of learning how the “high-risk AI” system caused algorithmic discrimination, but here, there is an open gap for OSS. Modeled after Article 12(1) of the EU AI Act (“High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.”), Denver appears to monitor the same range of “high-risk” activities outsourced to dice-rolling AI (e.g. deciding whom to hire, whom to give a home loan, etc.), but a closer read of one key provision in Section 1 (“Definitions”) betrays a chasm in Colorado:
 (7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM.
Line: 30 to 31
 (3) ‘provider’ means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge;[4]
Changed:
<
<
Put differently, the CAIA says nothing about OSS like DeepSeek? . OSS is often provided as is with broad disclaimers against warranty, indemnity, or other liability, but DeepSeek? , for example, is offered on an MIT license, which also raises another issue within that world of free software. Apache, BSD, and MIT licenses, for instance, do not require the disclosure of source code—as opposed to GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo exposes some of its source code is not required. Currently, there are no enforcement actions by Brussels under the full EU AI Act (which only comes into effect, August 2, 2026), but state regulators could probably promulgate rules on licensing without further changing the law.[5] Thus, the CAIA might read:
>
>
Put differently, the CAIA cannot bind OSS like DeepSeek? . Typically, OSS is provided as-is with broad disclaimers against warranty, indemnity, or other liability, but where DeepSeek? is offered on an MIT license, there is a related issue. Apache, BSD, and MIT licenses, for instance, do not require the disclosure of source code—unlike GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo exposes some of its source code is not a requirement. Still, Colorado's AG could promulgate rules on licensing without further changing the law beyond embracing OSS:[5]
 
Changed:
<
<
(7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE, whether for payment or free of charge, THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM.
>
>
(7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE, whether for payment or free of charge..."
 
Changed:
<
<
Such legislation and regulation would then cover companies like RedHat? (upselling services, but not necessarily selling free AI), and again, by excluding OSS developers from sources of “high-risk AI,” the CAIA would tend to overlook industry-leading developers like DeepSeek? .[6]
>
>
Still, until the full EU AI Act goes into effect August 2, 2026, there will inevitably be unknowns in enforcement. However, Denver can more fully fork the legal code to better address problems posed by even mature OSS developers like RedHat? (upselling services but not necessarily core products, which could be AI).[6]
 

California Dreaming

Changed:
<
<
Alternatively, if regulators are afraid of entering a deep sea of open-source development, they may try to align the Colorado Privacy Act with the California Privacy Rights Act (CPRA), but such an indirect approach to AI policy would likely fall short of the CAIA's goals. CPRA, which amended California’s CCPA in 2023, identified “automated decision-making” by AI, as a liability borne by the business. However, CPRA only required compliance from businesses earning $25 million in annual gross revenue, which belies how dimly OSS projects must have been viewed by the state. Of course, Californians’ amended CCPA is still stronger than Coloradoans’ CPA. For example, if a bank like Wells Fargo were to use AI in mortgage servicing, it could still found be found liable under California’s GLBA data-only exemption, but Colorado (like most states with privacy policies) would exempt the entire financial institution at the GLBA entity-level.[7] However, again, Sacramento's threshold of a for-profit business model likely warrants caution on taking an indirect approach to protecting data as opposed to "consequential decisions" from AI.
>
>
Alternatively, if regulators fear a deep sea of OSS, they may align the Colorado Privacy Act with the California Privacy Rights Act (CPRA), but such an roundabout approach to regulation is arguably unwise. CPRA, which amended California’s CCPA in 2023, identified “automated decision-making” by AI, as liability borne by the business. However, CPRA only required compliance from businesses earning $25 million in annual gross revenue, which belied how dimly OSS projects were viewed. Of course, Californians’ CCPA provides more protections than Coloradoans’ CPA. If Wells Fargo were to use AI in mortgage servicing, it could still found be found liable under California’s GLBA data-only exemption, but Colorado (like most states with privacy policies) would exempt that financial institution at the GLBA entity-level.[7] Still, as a matter of public policy, Sacramento's for-profit focus means it probably does not warrant imitation where bots can be even more problematic pretenders when offered for free.
 

Up the Mountain

Changed:
<
<
Ultimately, proponents of AI reform should continue to call out the CAIA's continental drift. EU member states are not disinterested aggregators of data—just swipe off the tram in Amsterdam! But by ensuring that OSS is included when seeking source code from any "developer" (or "provider" per the EU AI Act), Colorado will at least operationalize the other mechanisms that it has borrowed from Europe (e.g. currently, per 6-1-1703, compliance for a "deployer" is parasitic on cooperation from a "developer," so a "deployer" using open-source AI where not all of the source code is available would be unable to comply). That said, there are still other issues in the CAIA's design, including how the law mandates a "deployer" follow frameworks like NIST's AI Risk Management Framework or ISO/IEC 42001, which are governance structures probably better suited for traditional corporate environments than decentralized development communities. Still, absent federal legislation, this first salvo in the states regulating AI will greatly shape the national conversation as a whole. However, unless more “comprehensive” language truly reflects the fact that "open source is eating software faster than software is eating the world,” the CAIA will tend to see friends around the campfire... and everybody’s high.
>
>
Ultimately, proponents of AI reform should call out the CAIA's continental drift. EU member states are not disinterested actors—just swipe off the tram in Amsterdam! But by ensuring that open-source AI is envisaged when seeking out the source code from any "developer" (or "provider" per the EU AI Act), Colorado will better act on what it has already borrowed from Europe (e.g. per 6-1-1703, compliance for a "deployer" depends on cooperation from a "developer"). Of course, there are still other issues in the CAIA's design like how it mandates a "deployer" follow frameworks like NIST's AI Risk Management Framework or ISO/IEC 42001, which are governance structures maybe better suited for traditional corporate environments than decentralized communities. Absent federal legislation, though, first salvo from the states will shape national conversation as a whole, especially as Texas considers its own bill based on Colorado's. However, unless new “comprehensive” language truly reflects the fact "open source is eating software faster than software is eating the world,” the CAIA will tend to see friends around the campfire... and everybody’s high.
 

MichaelMacKayFirstEssay 7 - 11 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 17 to 17
 

Open-source Opening

When DeepSeek? , an OSS project from China, was released, Nvidia, a chips manufacturer, lost $1T in value, as share price tumbled 17%.[3] One of the world’s most valuable companies, Nvidia was chastened by news that massive amounts of compute power would no longer be necessary for cutting-edge development in AI—compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well across a battery of tests at comparatively negligible costs. Originally, OpenAI? was founded as a nonprofit to promote the open source development of AI (ergo, OpenAI? ), but that founding mythos was lost somewhere along the way—apparently, when humans could no longer handle the gift of knowledge, Prometheus also returned to reclaim fire (Washington State may have a different Mount Olympus, but the ashes of lighting money on fire may yet be traced to Redmond).

Changed:
<
<
Hence, as for the CAIA and other aspiring regulators, the arrival of DeepSeek? was not so much a “sputnik” moment, as much as a wake-up call from Silicon Valley’s recent economic history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, which became a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold? 2 (the model behind the 2024 Nobel Prize in Chemistry) is OSS, so Denver’s narrow view of the “developer” behind major breakthroughs appears to miss the scientific process that underlies progress in software. Notably, where there are replicability issues in journals, OSS projects virtually rest in a state of truth, where pull requests are functionally hypotheses, extending the state of knowledge (merged, if true, or else restored to previous builds). Generally, superior code is shipped where collaboration and criticism are the defaults of production.
>
>
Hence, as for the CAIA and other aspiring regulators, the arrival of DeepSeek? was not so much a “sputnik” moment, as much as a wake-up call from Silicon Valley’s recent economic history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, which became a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold 3 (the model behind the 2024 Nobel Prize in Chemistry) is OSS, so Denver’s narrow view of the “developer” behind major breakthroughs appears to miss the scientific process that underlies progress in software. Notably, where there are replicability issues in scientific journals, OSS projects virtually rest in a state of truth, where pull requests are functionally hypotheses, extending the state of knowledge (merged, if true, or else restored to previous builds). Generally, superior code is shipped where collaboration and criticism are the defaults of production.
 

Minding the Gap


MichaelMacKayFirstEssay 6 - 11 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 11 to 11
 

Fool’s Gold

Changed:
<
<
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver has taken too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,” and now with DeepSeek? , it is more doubtful than ever that developing “high-risk AI” will require proprietary ownership. Before the CAIA goes into effect February 6, 2026, its omission of open source software (OSS) is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align more closely with the EU, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to collaterally attack the issue as Sacramento did with the updated California Consumer Privacy Act (CCPA).
>
>
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver has taken too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,”[2] and last year, researchers at Harvard Business School estimated that, conservatively, OSS produced at least $4.5T in economic value. Now with DeepSeek? , it is more doubtful than ever that developing “high-risk AI” will require proprietary ownership. Before the CAIA goes into effect February 6, 2026, its omission of open source software (OSS) is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align more closely with the EU, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to collaterally attack the issue as Sacramento did with the updated California Consumer Privacy Act (CCPA).
 

Open-source Opening

Line: 33 to 33
 Put differently, the CAIA says nothing about OSS like DeepSeek? . OSS is often provided as-is, with broad disclaimers against warranty, indemnity, or other liability, but DeepSeek? , for example, is offered on an MIT license, which raises another issue within the world of free software. Apache, BSD, and MIT licenses, for instance, do not require the disclosure of source code—as opposed to GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo exposes some of its source code goes beyond what its license requires. Currently, there are no enforcement actions by Brussels under the full EU AI Act (which only comes into effect on August 2, 2026), but state regulators could probably promulgate rules on licensing without further changing the law.[5] Thus, the CAIA might read:

(7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE, whether for payment or free of charge, THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM.

Changed:
<
<
Such legislation and regulation would then cover companies like RedHat? (upselling their services, but not necessarily marketing their free AI), and again, by excluding OSS developers from sources of “high-risk AI,” the CAIA would tend to otherwise overlook industry-leading developers like DeepSeek? .[6]
>
>
Such legislation and regulation would then cover companies like RedHat? (upselling services, but not necessarily selling free AI), and again, by excluding OSS developers from sources of “high-risk AI,” the CAIA would tend to overlook industry-leading developers like DeepSeek? .[6]
 

California Dreaming

Line: 42 to 43
 

Up the Mountain

Changed:
<
<
Ultimately, proponents of AI reform should continue to call out the CAIA's continental drift. The EU’s state actors are not disinterested aggregators of source code—just swipe off the tram in Amsterdam! But by inserting OSS into the “developer” definition (or "provider" per the EU AI Act), Colorado will at least operationalize the other mechanisms it has borrowed from European regulators. That said, there is still complexity in the CAIA's design, as the law mandates a "deployer" implement frameworks like NIST's AI Risk Management Framework or ISO/IEC 42001, which are governance structures perhaps better suited for traditional corporate environments than decentralized development communities. Still, absent federal legislation, this first salvo in regulating AI by the states will greatly shape the national conversation as a whole. Hence, unless “comprehensive” language truly reflects the reality of a vast OSS underground, the CAIA will tend to see friends around the campfire... and everybody’s high.
>
>
Ultimately, proponents of AI reform should continue to call out the CAIA's continental drift. EU member states are not disinterested aggregators of data—just swipe off the tram in Amsterdam! But by ensuring that OSS is included when seeking source code from any "developer" (or "provider" per the EU AI Act), Colorado will at least operationalize the other mechanisms that it has borrowed from Europe (e.g. currently, per 6-1-1703, compliance for a "deployer" is parasitic on cooperation from a "developer," so a "deployer" using open-source AI for which not all of the source code is available would be unable to comply). That said, there are still other issues in the CAIA's design, including its mandate that a "deployer" follow frameworks like NIST's AI Risk Management Framework or ISO/IEC 42001, governance structures probably better suited to traditional corporate environments than to decentralized development communities. Still, absent federal legislation, this first salvo in state AI regulation will greatly shape the national conversation. However, unless more "comprehensive" language truly reflects the fact that "open source is eating software faster than software is eating the world," the CAIA will tend to see friends around the campfire... and everybody’s high.
 
Line: 52 to 53
 
  1. Karen Friar and Ines Ferré, DeepSeek? sell-off reminds investors of the biggest earnings story holding up the stock market, Yahoo Finance (January 27, 2025), https://finance.yahoo.com/news/live/stock-market-today-nasdaq-clobbered-nvidia-sinks-17-while-dow-stages-comeback-as-ai-fears-shake-markets-210101592.html.
  2. See also, “(10) ‘making available on the market’ means the supply of an AI system or a general-purpose AI model for distribution or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge;”
  3. The state AG—already empowered under the CAIA to issue rules for enforcement—could further specify appropriate licenses (e.g. CDDL, EPL, GPL, MPL, etc.).
Changed:
<
<
  1. In February, DeepSeek? was the most downloaded app on the AppStore? and Google Play.
  2. Only 20 states have data privacy policies: thirteen have exemptions for data and entities under the Gramm-Leach-Bliley Act; four for just GLBA entities; and three (CO, OR, and MN) for GLBA data-only.
>
>
  1. In January, DeepSeek? was the most downloaded free app on the AppStore? and Google Play.
  2. Only 20 states have data privacy policies: thirteen have exemptions for data and entities under the Gramm-Leach-Bliley Act; four for just GLBA entities; and three (CO, OR, and MN) for GLBA data-only.
 



MichaelMacKayFirstEssay 5 - 11 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Changed:
<
<

Rocky Mountain High: Colorado’s AI Act and Hallucinations about Open-Source

>
>

Rocky Mountain High: Colorado’s AI Act and Hallucinations of Open-Source

  -- By MichaelMacKay - 25 Oct 2024
Line: 42 to 42
 

Up the Mountain

Changed:
<
<
Ultimately, proponents of AI reform should continue to call out the CAIA's continental drift. The EU’s state actors are not disinterested in new technologies—just swipe off the tram in Amsterdam! But by inserting OSS into the “developer” definition (or "provider" definition like the EU), the state will at least help operationalize the other mechanisms borrowed from European regulators. That said, there is still complexity in the CAIA's administration, as the law mandates a "deployer" implement frameworks like NIST's AI Risk Management Framework or ISO/IEC 42001, which are governance structures better suited for traditional corporate environments than decentralized development communities. Still, absent federal legislation, this first salvo in the AI policy from the states will greatly shape the nation as a whole. Hence, unless “comprehensive” language truly reflects the reality of a vast OSS underground, the CAIA will tend to see friends around the campfire... and everybody’s high.
>
>
Ultimately, proponents of AI reform should continue to call out the CAIA's continental drift. The EU’s state actors are not disinterested aggregators of source code—just swipe off the tram in Amsterdam! But by inserting OSS into the “developer” definition (or "provider" per the EU AI Act), Colorado will at least operationalize the other mechanisms it has borrowed from European regulators. That said, there is still complexity in the CAIA's design, as the law mandates a "deployer" implement frameworks like NIST's AI Risk Management Framework or ISO/IEC 42001, which are governance structures perhaps better suited for traditional corporate environments than decentralized development communities. Still, absent federal legislation, this first salvo in regulating AI by the states will greatly shape the national conversation as a whole. Hence, unless “comprehensive” language truly reflects the reality of a vast OSS underground, the CAIA will tend to see friends around the campfire... and everybody’s high.
 

MichaelMacKayFirstEssay 4 - 11 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 11 to 11
 

Fool’s Gold

Changed:
<
<
The first state to regulate recreational dispensaries is the first to dispense with “AI.” Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver has taken too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,” and now with DeepSeek? , it is more doubtful than ever that developing “high-risk AI” will require proprietary ownership. Before the CAIA goes into effect February 6, 2026, its omission of open source software (OSS) is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align more closely with the EU, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to collaterally attack the issue as Sacramento did with the updated California Consumer Privacy Act (CCPA).
>
>
The first state to regulate recreational dispensaries is the first to dispense with unregulated AI. Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver has taken too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,” and now with DeepSeek? , it is more doubtful than ever that developing “high-risk AI” will require proprietary ownership. Before the CAIA goes into effect February 6, 2026, its omission of open source software (OSS) is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align more closely with the EU, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to collaterally attack the issue as Sacramento did with the updated California Consumer Privacy Act (CCPA).
 

Open-source Opening


MichaelMacKayFirstEssay 3 - 10 Feb 2025 - Main.MichaelMacKay
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Changed:
<
<

Paper Title

A title it a terrible thing to waste.
>
>

Rocky Mountain High: Colorado’s AI Act and Hallucinations about Open-Source

  -- By MichaelMacKay - 25 Oct 2024
Deleted:
<
<
In "The Use of Knowledge in Society," Friedrich Hayek argued that civilization advances by expanding the number of actions we can perform “without thinking about them.” When he won the 1978 Nobel Prize in Economics, it was largely for illustrating how price signals create efficient markets. Yet, Black Monday—a massive crash less than a decade later—posed a critical question: who is actually thinking?
 
Changed:
<
<
In open-source software (OSS), proof of thought is essential, as free software means the right to inspect code. Where marginal costs are negligible, OSS also often surpasses proprietary production. So why have high-frequency trading (HFT) firms like Virtu or Citadel not open-sourced more projects? The commercial success of Google’s TensorFlow? , released in 2015 under an Apache license, a free software license, shows how sharing some core infrastructure can attract powerful network effects while preserving competitive advantages. Despite this, HFTs—now responsible for most daily trading volume—are unswayed. Given the proven benefits of open-source efforts in other complex fields (n.b. this year's Nobel Prizes), their resistance to greater openness is enigmatic where profitability and market influence could be enhanced by selectively open-sourcing some more ancillary code.
>
>

Fool’s Gold

The first state to regulate recreational dispensaries is the first to dispense with “AI.” Commentators like EFF have called Colorado’s Artificial Intelligence Act (CAIA), enacted last summer, “comprehensive,” but Colorado’s law merely resembles the European Union’s AI Act. “The Revolution permanently put the major models of European governance off the table,”[1] but apparently, Denver has taken too little from Brussels and put too much out of reach of the state. “Today, most commercial software includes open source components,” and now with DeepSeek? , it is more doubtful than ever that developing “high-risk AI” will require proprietary ownership. Before the CAIA goes into effect February 6, 2026, its omission of open source software (OSS) is an open wound that Denver can still patch: (1) directly, by amending the CAIA to align more closely with the EU, or (2) indirectly, by amending the Colorado Privacy Act (CPA) to collaterally attack the issue as Sacramento did with the updated California Consumer Privacy Act (CCPA).

Open-source Opening

When DeepSeek? , an OSS project from China, was released, Nvidia, a chip manufacturer, shed nearly $600 billion in market value as its share price tumbled 17%.[3] One of the world’s most valuable companies, Nvidia was chastened by news that massive amounts of compute power might no longer be necessary for cutting-edge development in AI—compared to OpenAI? ’s leading model, O1, DeepSeek? performed at least as well across a battery of tests at comparatively negligible costs. Originally, OpenAI? was founded as a nonprofit to promote the open source development of AI (ergo, OpenAI? ), but that founding mythos was lost somewhere along the way—apparently, when humans could no longer handle the gift of knowledge, Prometheus returned to reclaim fire (Washington State may have a different Mount Olympus, but the ashes of lighting money on fire may yet be traced to Redmond). Hence, as for the CAIA and other aspiring regulators, the arrival of DeepSeek? was not so much a “sputnik” moment as a wake-up call from Silicon Valley’s recent economic history. In 1979, Oracle released its first commercial relational database management system, called “Oracle Version 2,” and in 1996, researchers from Berkeley launched PostgreSQL, which became a much-beloved OSS alternative. Today, some industry studies indicate that the latter has more market share than Oracle’s current suite of tools. Even AlphaFold? 2 (the model behind the 2024 Nobel Prize in Chemistry) is OSS, so Denver’s narrow view of the “developer” behind major breakthroughs appears to miss the scientific process that underlies progress in software. Notably, where there are replicability issues in journals, OSS projects virtually rest in a state of truth, where pull requests are functionally hypotheses, extending the state of knowledge (merged if they hold up, or else reverted to the previous build). Generally, superior code is shipped where collaboration and criticism are the defaults of production.

 
Changed:
<
<
Market Microstructure and the Case for Openness
>
>

Minding the Gap

 
Changed:
<
<
Secrecy has long defined finance, as evidenced by the NYSE’s former "upstairs room,” but beneath the fog lies a path between Scylla and Charybdis, liquidity and price accuracy, that developers at HFTs like Two Sigma must chart. According to the Glosten-Milgrom model, market makers like HFTs set bid-ask spreads based on their expected likelihood of encountering informed traders, so theoretically, open-sourcing key algorithms would likely result in greater losses for HFTs and wider bid-ask spreads for everyone else.
>
>
Under the CAIA, a “developer” or “deployer” of AI systems that interact with Coloradans is subject to AG sanctions if, within 90 days of learning that a “high-risk AI” system caused algorithmic discrimination, there is no follow-up disclosure. Inspired by Article 12(1) of the EU AI Act (“High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.”), Denver monitors the same range of “high-risk” activities outsourced to AI (e.g. deciding whom to hire, whom to give a home loan, etc.), but a closer read of one key provision in Section 1 (“Definitions”) betrays a wide chasm, as Colorado says:
 
Changed:
<
<
Yet, while conventional wisdom suggests open-sourcing strategic assets would eliminate profits, certain market-making might become more profitable when open-sourced. Consider index arbitrage between S&P 500 futures and underlying stocks. Although the general concept—maintaining price parity between the index and its components—is widely known, profit lies in sophisticated execution: fast order routing, effective risk management, and strategic unwinding of positions. If leading HFTs could open-source basic arbitrage infrastructure, they could cut technology maintenance costs and still compete on speed, efficiency, and capital management. The wider adoption of such infrastructure might also improve market stability, as HFTs could quote tighter spreads based on more predictable counterparty behaviors.
>
>
(7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM.
 
Changed:
<
<
Open for Business
>
>
Whereas Article 3 of the EU AI Act says:
 
Changed:
<
<
The affirmation of market efficiency in Halliburton II may also turn regulatory burdens into revenue streams. Similar to how IEX transformed its speed bump into a monetized feature, HFTs could open-source core liquidity provision algorithms while offering premium services like risk analytics, execution optimization, and custom implementations. The shift would make them more infrastructure providers than proprietary traders.
>
>
(3) ‘provider’ means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge;[4]
 
Changed:
<
<
For example, a firm specializing in ETF market-making could open-source its quote maintenance algorithms while monetizing the surrounding ecosystem. The core activity—maintaining ETF prices per underlying assets—is already well understood. Hence, the firm could: (i) offer premium features like optimized hedging tools, (ii) charge for privileged exchange connectivity, (iii) consult on trading system design, (iv) reduce compliance costs by increasing transparency, and (v) accelerate institutional orders through demonstrated openness. That model mirrors Bloomberg’s success, where the terminal’s dominance has been maintained not through secrecy per se but superior implementation and network effects.
>
>
Put differently, the CAIA says nothing about OSS like DeepSeek? . OSS is often provided as-is, with broad disclaimers against warranty, indemnity, or other liability, but DeepSeek? , for example, is offered on an MIT license, which raises another issue within the world of free software. Apache, BSD, and MIT licenses, for instance, do not require the disclosure of source code—as opposed to GNU or Mozilla—so the fact that DeepSeek? ’s GitHub? repo exposes some of its source code goes beyond what its license requires. Currently, there are no enforcement actions by Brussels under the full EU AI Act (which only comes into effect on August 2, 2026), but state regulators could probably promulgate rules on licensing without further changing the law.[5] Thus, the CAIA might read:
 
Changed:
<
<
Open to Rules
>
>
(7) "DEVELOPER" MEANS A PERSON DOING BUSINESS IN THIS STATE, whether for payment or free of charge, THAT DEVELOPS OR INTENTIONALLY AND SUBSTANTIALLY MODIFIES AN ARTIFICIAL INTELLIGENCE SYSTEM. Such legislation and regulation would then cover companies like RedHat? (upselling their services, but not necessarily marketing their free AI), and again, by excluding OSS developers from sources of “high-risk AI,” the CAIA would tend to otherwise overlook industry-leading developers like DeepSeek? .[6]
 
Deleted:
<
<
Under SEC Rule 10b-5, disclosures of source code could theoretically be compelled if HFT strategies materially impact markets. Forward-thinking firms might preemptively open-source parts of their infrastructure, such as anti-manipulation systems, while retaining their proprietary alpha-generating strategies. This approach echoes how Credit Suisse leveraged dark pool disclosures as a competitive advantage, so regulation can be a net positive. Exchanges may also incentivize this shift by modifying access rules to favor HFTs revealing more about their tooling and libraries, even if limited to backtesting frameworks or infrastructure automation. As Shoshana Zuboff wrote regarding eToro’s success in directing trades, even modest incentives significantly shape information flows among market participants (Age of Surveillance Capitalism, 273). By extension, exchange-level incentives could create a feedback loop of increased transparency and market efficiency, rewarding all stakeholders.
 
Changed:
<
<
Open to Criticism
>
>

California Dreaming

 
Changed:
<
<
However, the OSS analogy may be imperfect where neither coding culture nor zero marginal costs apply. Often, traders and developers at HFTs work in silos on highly specialized projects, but there are also real costs associated with co-locating servers near exchanges—expenses not borne by many OSS endeavors. Moreover, capital markets are inherently zero-sum; sharing could reveal strategic advantages even in innocuous tools like data visualization software. Additionally, recent events, such as the Reddit-fueled "meme stock" phenomenon, highlight that while current opacity in HFTs may not serve them well, increased transparency may yet expose them to volatile market dynamics too soon.
>
>
Alternatively, if regulators are afraid of entering a deep sea of open-source development, they may try to align the Colorado Privacy Act with the California Privacy Rights Act (CPRA), but such an indirect approach to AI policy would likely fall short of the CAIA's goals. CPRA, which amended California’s CCPA in 2023, identified “automated decision-making” by AI as a liability borne by the business. However, CPRA only required compliance from businesses earning $25 million in annual gross revenue, which betrays how dimly OSS projects must have been viewed by the state. Of course, Californians’ amended CCPA is still stronger than Coloradans’ CPA. For example, if a bank like Wells Fargo were to use AI in mortgage servicing, it could still be found liable under California’s GLBA data-only exemption, but Colorado (like most states with privacy policies) would exempt the entire financial institution at the GLBA entity level.[7] Again, though, Sacramento's threshold of a for-profit business model likely warrants caution about taking an indirect approach that protects data rather than "consequential decisions" from AI.
 
Changed:
<
<
Closed to Some Traditions
>
>

Up the Mountain

 
Changed:
<
<
Ultimately, the future of open-sourcing certain high-frequency trading software may benefit from revisiting the distant past. In The Doctor and Student, Christopher St. Germain once wrote that “the law whereby all things were in common, was never of the law of reason, but only in the time of extreme necessity.” He further noted that “the law of property is not the law of reason, but the law of custom.” In terms of OSS, if such financial tools were held in common, costly mistakes could be avoided, but even if such ownership had never been a custom in certain domains, certainly, the current hype around AI attests to an “extreme necessity,” lest someone at Citadel recklessly bootstrap an AI-enabled API to proprietary code.
>
>
Ultimately, proponents of AI reform should continue to call out the CAIA's continental drift. The EU’s state actors are not disinterested in new technologies—just swipe off the tram in Amsterdam! But by inserting OSS into the “developer” definition (or "provider" definition like the EU), the state will at least help operationalize the other mechanisms borrowed from European regulators. That said, there is still complexity in the CAIA's administration, as the law mandates a "deployer" implement frameworks like NIST's AI Risk Management Framework or ISO/IEC 42001, which are governance structures better suited for traditional corporate environments than decentralized development communities. Still, absent federal legislation, this first salvo in the AI policy from the states will greatly shape the nation as a whole. Hence, unless “comprehensive” language truly reflects the reality of a vast OSS underground, the CAIA will tend to see friends around the campfire... and everybody’s high.
 
Deleted:
<
<
To that end, in 2010, regulators discovered that HFTs were largely to blame for the flash crash when markets lost $1T in value in 36 minutes. Relatedly, almost exactly one decade ago today, a major flash rally prompted an interagency report by the Treasury Department, which (while not outright naming any particular HFTs) ominously noted: “At times, self-trading may reflect unlawful conduct.”
 
Deleted:
<
<
Even before new technological developments, one wonders whether reason should have already dispatched with custom. But today, where there is still a use for knowledge in society, there is probably a grave need for smart money.
 
Changed:
<
<
But you didn't look. You could have found HFT-Orderbook, hummingbot, StockSharp? and a clutch of other tools. Instead of hypothesizing why software either does or doesn't exist, it would have been a good move to find the software that does exist, look at the projects' composition, roadmap, development tempo, etc., in order to understand what is actually going on. Christopher St. Germain has less to tell us about this matter than GitHub? , I fancy.
>
>
Endnotes
  1. Robert Gordon, The Citizen Lawyer, A Brief Informal History of a Myth with Some Basis in Reality, p. 1182.
  2. David Tollen, The Tech Contracts Handbook, Appendix 2 (ABA Publishing, 2021).
  3. Karen Friar and Ines Ferré, DeepSeek? sell-off reminds investors of the biggest earnings story holding up the stock market, Yahoo Finance (January 27, 2025), https://finance.yahoo.com/news/live/stock-market-today-nasdaq-clobbered-nvidia-sinks-17-while-dow-stages-comeback-as-ai-fears-shake-markets-210101592.html.
  4. See also, “(10) ‘making available on the market’ means the supply of an AI system or a general-purpose AI model for distribution or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge;”
  5. The state AG—already empowered under the CAIA to issue rules for enforcement—could further specify appropriate licenses (e.g. CDDL, EPL, GPL, MPL, etc.).
  6. In February, DeepSeek? was the most downloaded app on the AppStore? and Google Play.
  7. Only 20 states have data privacy policies: thirteen have exemptions for data and entities under the Gramm-Leach-Bliley Act; four for just GLBA entities; and three (CO, OR, and MN) for GLBA data-only.
 



MichaelMacKayFirstEssay 2 - 18 Nov 2024 - Main.EbenMoglen
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Deleted:
<
<
It is strongly recommended that you include your outline in the body of your essay by using the outline as section titles. The headings below are there to remind you how section and subsection titles are formatted.
 

Paper Title

Added:
>
>
A title is a terrible thing to waste.

 -- By MichaelMacKay - 25 Oct 2024
Line: 40 to 43
 Even before new technological developments, one wonders whether reason should have already dispensed with custom. But today, where there is still a use for knowledge in society, there is probably a grave need for smart money.
Added:
>
>
But you didn't look. You could have found HFT-Orderbook, hummingbot, StockSharp? and a clutch of other tools. Instead of hypothesizing why software either does or doesn't exist, it would have been a good move to find the software that does exist, look at the projects' composition, roadmap, development tempo, etc., in order to understand what is actually going on. Christopher St. Germain has less to tell us about this matter than GitHub? , I fancy.

 
You are entitled to restrict access to your paper if you want to. But we all derive immense benefit from reading one another's work, and I hope you won't feel the need unless the subject matter is personal and its disclosure would be harmful or undesirable.

MichaelMacKayFirstEssay 1 - 26 Oct 2024 - Main.MichaelMacKay
Line: 1 to 1
Added:
>
>
META TOPICPARENT name="FirstEssay"
It is strongly recommended that you include your outline in the body of your essay by using the outline as section titles. The headings below are there to remind you how section and subsection titles are formatted.

Paper Title

-- By MichaelMacKay - 25 Oct 2024

In "The Use of Knowledge in Society," Friedrich Hayek argued that civilization advances by expanding the number of actions we can perform “without thinking about them.” When he won the 1978 Nobel Prize in Economics, it was largely for illustrating how price signals create efficient markets. Yet, Black Monday—a massive crash less than a decade later—posed a critical question: who is actually thinking?

In open-source software (OSS), proof of thought is essential, as free software means the right to inspect code. Where marginal costs are negligible, OSS also often surpasses proprietary production. So why have high-frequency trading (HFT) firms like Virtu or Citadel not open-sourced more projects? The commercial success of Google’s TensorFlow? , released in 2015 under the Apache license (a free software license), shows how sharing some core infrastructure can attract powerful network effects while preserving competitive advantages. Despite this, HFTs—now responsible for most daily trading volume—are unswayed. Given the proven benefits of open-source efforts in other complex fields (n.b. this year's Nobel Prizes), their resistance to greater openness is puzzling where profitability and market influence could be enhanced by selectively open-sourcing more ancillary code.

Market Microstructure and the Case for Openness

Secrecy has long defined finance, as evidenced by the NYSE’s former "upstairs room,” but beneath the fog lies a path between Scylla and Charybdis, liquidity and price accuracy, that developers at HFTs like Two Sigma must chart. According to the Glosten-Milgrom model, market makers like HFTs set bid-ask spreads based on their expected likelihood of encountering informed traders, so theoretically, open-sourcing key algorithms would likely result in greater losses for HFTs and wider bid-ask spreads for everyone else.
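For intuition only, here is a minimal numeric sketch of the Glosten-Milgrom logic under standard textbook assumptions: the asset is worth either v_high or v_low, a fraction mu of counterparties is informed, uninformed traders buy or sell with equal probability, and the market maker quotes the ask as the expected value conditional on a buy and the bid as the expected value conditional on a sell. All parameter values below are illustrative, not calibrated to any market.

# Minimal Glosten-Milgrom sketch: bid and ask as conditional expectations.
# All parameter values are illustrative, not calibrated to any market.

def glosten_milgrom_quotes(v_low: float, v_high: float, p_high: float, mu: float):
    """Return (bid, ask) for one trading round.

    v_low, v_high : possible asset values
    p_high        : prior probability the value is v_high
    mu            : probability the counterparty is informed
    Uninformed traders buy or sell with equal probability (1/2 each).
    """
    p_low = 1.0 - p_high
    # Probability of observing a buy / sell order.
    p_buy = mu * p_high + 0.5 * (1.0 - mu)
    p_sell = mu * p_low + 0.5 * (1.0 - mu)
    # Posterior P(value = v_high | buy) and P(value = v_high | sell) via Bayes.
    p_high_given_buy = (mu * p_high + 0.5 * (1.0 - mu) * p_high) / p_buy
    p_high_given_sell = (0.5 * (1.0 - mu) * p_high) / p_sell
    ask = p_high_given_buy * v_high + (1.0 - p_high_given_buy) * v_low
    bid = p_high_given_sell * v_high + (1.0 - p_high_given_sell) * v_low
    return bid, ask

if __name__ == "__main__":
    for mu in (0.1, 0.3, 0.5):
        bid, ask = glosten_milgrom_quotes(v_low=90.0, v_high=110.0, p_high=0.5, mu=mu)
        print(f"mu={mu:.1f}  bid={bid:.2f}  ask={ask:.2f}  spread={ask - bid:.2f}")

Running the sketch shows the spread widening as mu rises, which mirrors the paragraph's point: the more a published strategy tips the market maker toward expecting informed counterparties, the wider the quotes everyone else faces.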

Yet, while conventional wisdom suggests open-sourcing strategic assets would eliminate profits, certain market-making might become more profitable when open-sourced. Consider index arbitrage between S&P 500 futures and underlying stocks. Although the general concept—maintaining price parity between the index and its components—is widely known, profit lies in sophisticated execution: fast order routing, effective risk management, and strategic unwinding of positions. If leading HFTs could open-source basic arbitrage infrastructure, they could cut technology maintenance costs and still compete on speed, efficiency, and capital management. The wider adoption of such infrastructure might also improve market stability, as HFTs could quote tighter spreads based on more predictable counterparty behaviors.
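As an illustration of how widely known the parity concept itself is, here is a stripped-down cost-of-carry check of the kind the paragraph describes; the financing rate, dividend yield, and cost threshold are hypothetical placeholders, and the real competitive edge lies in execution rather than in this arithmetic.

# Toy index-arbitrage signal: compare a futures price with its cost-of-carry
# fair value. The edge in practice is execution, not this arithmetic.
import math

def fair_futures_price(spot: float, r: float, q: float, t_years: float) -> float:
    """Textbook cost-of-carry fair value: F = S * exp((r - q) * T)."""
    return spot * math.exp((r - q) * t_years)

def arbitrage_signal(spot: float, futures: float, r: float, q: float,
                     t_years: float, threshold: float = 0.50) -> str:
    """Return a coarse signal when the basis exceeds a (hypothetical) cost threshold."""
    fair = fair_futures_price(spot, r, q, t_years)
    basis = futures - fair
    if basis > threshold:
        return f"futures rich by {basis:.2f}: sell futures, buy the basket"
    if basis < -threshold:
        return f"futures cheap by {-basis:.2f}: buy futures, sell the basket"
    return f"basis {basis:.2f} inside costs: no trade"

if __name__ == "__main__":
    # Illustrative numbers only: index at 5000, three months to expiry,
    # 5% financing rate, 1.5% dividend yield.
    print(arbitrage_signal(spot=5000.0, futures=5055.0, r=0.05, q=0.015, t_years=0.25))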

Open for Business

The affirmation of market efficiency in Halliburton II may also turn regulatory burdens into revenue streams. Similar to how IEX transformed its speed bump into a monetized feature, HFTs could open-source core liquidity provision algorithms while offering premium services like risk analytics, execution optimization, and custom implementations. The shift would make them more infrastructure providers than proprietary traders.

For example, a firm specializing in ETF market-making could open-source its quote maintenance algorithms while monetizing the surrounding ecosystem. The core activity—keeping ETF quotes in line with the prices of the underlying assets—is already well understood. Hence, the firm could: (i) offer premium features like optimized hedging tools, (ii) charge for privileged exchange connectivity, (iii) consult on trading system design, (iv) reduce compliance costs by increasing transparency, and (v) attract institutional order flow through demonstrated openness. That model mirrors Bloomberg’s success, where the terminal’s dominance has been maintained not through secrecy per se but through superior implementation and network effects.
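A rough sketch of that "already well understood" core activity follows, with made-up component weights and prices and none of the inventory, hedging, or venue logic where a real desk's value would lie.

# Toy ETF quote maintenance: derive an intraday value from component prices
# and quote symmetrically around it. Tickers, weights, and prices are made up.

def intraday_value(weights: dict[str, float], prices: dict[str, float]) -> float:
    """Weighted sum of component prices per one ETF share."""
    return sum(weights[sym] * prices[sym] for sym in weights)

def make_quotes(value: float, half_spread_bps: float = 2.0) -> tuple[float, float]:
    """Quote a bid and ask a few basis points around the derived value."""
    half = value * half_spread_bps / 10_000.0
    return value - half, value + half

if __name__ == "__main__":
    # Hypothetical basket: shares of each stock backing one ETF share.
    weights = {"AAA": 0.4, "BBB": 0.35, "CCC": 0.25}
    prices = {"AAA": 100.0, "BBB": 60.0, "CCC": 40.0}
    v = intraday_value(weights, prices)
    bid, ask = make_quotes(v)
    print(f"value={v:.2f} bid={bid:.4f} ask={ask:.4f}")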

Open to Rules

Under SEC Rule 10b-5, disclosures of source code could theoretically be compelled if HFT strategies materially impact markets. Forward-thinking firms might preemptively open-source parts of their infrastructure, such as anti-manipulation systems, while retaining their proprietary alpha-generating strategies. This approach echoes how Credit Suisse leveraged dark pool disclosures as a competitive advantage, so regulation can be a net positive. Exchanges may also incentivize this shift by modifying access rules to favor HFTs revealing more about their tooling and libraries, even if limited to backtesting frameworks or infrastructure automation. As Shoshana Zuboff wrote regarding eToro’s success in directing trades, even modest incentives significantly shape information flows among market participants (Age of Surveillance Capitalism, 273). By extension, exchange-level incentives could create a feedback loop of increased transparency and market efficiency, rewarding all stakeholders.

Open to Criticism

However, the OSS analogy may be imperfect where neither coding culture nor zero marginal costs apply. Often, traders and developers at HFTs work in silos on highly specialized projects, but there are also real costs associated with co-locating servers near exchanges—expenses not borne by many OSS endeavors. Moreover, capital markets are inherently zero-sum; sharing could reveal strategic advantages even in innocuous tools like data visualization software. Additionally, recent events, such as the Reddit-fueled "meme stock" phenomenon, highlight that while current opacity in HFTs may not serve them well, increased transparency may yet expose them to volatile market dynamics too soon.

Closed to Some Traditions

Ultimately, the future of open-sourcing certain high-frequency trading software may benefit from revisiting the distant past. In The Doctor and Student, Christopher St. Germain once wrote that “the law whereby all things were in common, was never of the law of reason, but only in the time of extreme necessity.” He further noted that “the law of property is not the law of reason, but the law of custom.” In terms of OSS, if such financial tools were held in common, costly mistakes could be avoided, but even if such ownership had never been a custom in certain domains, certainly, the current hype around AI attests to an “extreme necessity,” lest someone at Citadel recklessly bootstrap an AI-enabled API to proprietary code.

To that end, in 2010, regulators discovered that HFTs were largely to blame for the flash crash when markets lost $1T in value in 36 minutes. Relatedly, almost exactly one decade ago today, a major flash rally prompted an interagency report by the Treasury Department, which (while not outright naming any particular HFTs) ominously noted: “At times, self-trading may reflect unlawful conduct.”

Even before new technological developments, one wonders whether reason should have already dispensed with custom. But today, where there is still a use for knowledge in society, there is probably a grave need for smart money.


You are entitled to restrict access to your paper if you want to. But we all derive immense benefit from reading one another's work, and I hope you won't feel the need unless the subject matter is personal and its disclosure would be harmful or undesirable. To restrict access to your paper simply delete the "#" character on the next two lines:

Note: TWiki has strict formatting rules for preference declarations. Make sure you preserve the three spaces, asterisk, and extra space at the beginning of these lines. If you wish to give access to any other users simply add them to the comma separated ALLOWTOPICVIEW list.


Revision 11r11 - 16 Feb 2025 - 04:36:18 - MichaelMacKay
Revision 10r10 - 16 Feb 2025 - 02:10:39 - MichaelMacKay
Revision 9r9 - 15 Feb 2025 - 17:49:53 - MichaelMacKay
Revision 8r8 - 15 Feb 2025 - 06:39:38 - MichaelMacKay
Revision 7r7 - 11 Feb 2025 - 22:24:36 - MichaelMacKay
Revision 6r6 - 11 Feb 2025 - 20:51:43 - MichaelMacKay
Revision 5r5 - 11 Feb 2025 - 04:54:58 - MichaelMacKay
Revision 4r4 - 11 Feb 2025 - 01:18:51 - MichaelMacKay
Revision 3r3 - 10 Feb 2025 - 23:23:39 - MichaelMacKay
Revision 2r2 - 18 Nov 2024 - 18:06:26 - EbenMoglen
Revision 1r1 - 26 Oct 2024 - 03:36:19 - MichaelMacKay