Discussion about this post

John Roxton:

While I share your dislike of snake-oil salesmen, I am much less enthusiastic about this effort, and I think you have taken an overly optimistic view of the possible second-order effects of this kind of highly automated and effective regulatory enforcement.

You say "It’d be like refusing to use nuclear power, just because the same scientific principles can be harnessed to build nuclear bombs" - I think this is a telling comparison, but not for the reason you meant. The *very real and dangerous* pathway from nuclear energy to nuclear weapons could be one of a small number of legitimate reasons to oppose the development of nuclear power in countries that do not already have nuclear weapons; the ostensible purpose of the Atoms for Peace programme and the treaties it spawned was to prevent people from moving along that pathway, because, for the most part, we all agree that more people having nuclear weapons is bad!

Similarly, there is an obvious pathway from the 'narrow use-cases' you describe to the mass identification and prosecution of 'problematic' speech, or other uses of AI to enforce laws that were written on the assumption that they would be sporadically enforced by human policemen (not to mention any future laws that are currently unthinkable, but might become attractive to a politician once they become enforceable via AI). You write, correctly, that "it will be important to build new norms around when pro-active bullshit detector regulation is acceptable or not", but omit the key point: that we *must* establish those norms, and in a more serious and lasting manner than social convention, *before* we start down the path.

To return to your nuclear analogy: Eisenhower recognized that the genie was escaping from the bottle, and that nations without nuclear bombs would eventually get them whether the secrets of nuclear power were released or not. And so, many serious people worked very hard over many decades to create an international framework that would decouple these things, blocking the pathway, in which nations could get assistance with the development of nuclear power in exchange for externally-enforced commitments that they would not proceed down the pathway to nuclear weapons. We need something similar in spirit here - if government is to be given this power, as you advocate, it must be under tight controls. The benefits of 'narrow use-case' applications are, like those of nuclear power, significant - but they should not be an unqualified excuse to run straight for the enrichment programme.

Lee:

Sorry mate, but I’m 100% certain that AI will be used 1000 times more often to push and promote quack medicine than to prevent it.

3 more comments...