Regulation of artificial intelligence (AI), warns Dean Ball, a board member of The Alexander Hamilton Institute for the Study of Western Civilization (AHI), could easily be distorted by special interests such as public employees and members of professions affected by major AI developments. A frequent writer on AI from his position as a research fellow at George Mason University’s Mercatus Center, Mr. Ball distinguishes between the regulation of “people’s conduct with AI” and the regulation of AI models themselves. Over time, he fears, politics will tend to infect any model-based regulation that government enacts.

The latest piece in his online newsletter Hyperdimensional, “The Political Economy of AI Regulation,” proceeds on two assumptions: first, that over the long run of 10 to 20 years, and perhaps sooner, AI will probably “clash with the economic interests of entrenched groups with significant political sway (doctors, lawyers, teachers, etc.)”; and second, that “government regulators are subject to political pressures from … those same groups—or from political leaders” who are under such pressures.

Officials given regulatory power over AI models would therefore gradually tend to regulate the actual use of artificial intelligence, not just (as the regulatory law authorizes) the forms it takes. Regulation doesn’t “spin out of control on day one … it spins out of control as it interacts with the broader political and economic system.” A new agency will always want to prove its value by doing something rather than little or nothing, and that is “where the trouble starts.”

“Let’s say that many parents start choosing to homeschool their children using AI, or send their kids to private schools that use AI to reduce the cost of education … in some states, public school enrollment is declining [as a result] … Some employees of the public school system will inevitably be let go. In most states … we can reasonably assume that even the threat of this would be considered a five-alarm fire by many within the state’s political class.”

“If AI is as powerful as I think it will be,” Mr. Ball predicts, “these political fights are inevitable, and they will be brutal. Issues like this, as opposed to debating whether GPT-7 will kill all of humanity, are what I expect to be spending most of my time writing about in five years. Maybe you love the public school system, so my example did not resonate. In that case, pick … whatever area of society or the economy you hope AI will transform. Does that area have politically powerful forces who benefit from the status quo?”