Roger Brownsword brought up (PDF) the interesting topic of the relationship between a society's moral codes and regulation and technology at the Technology & Regulation Symposium at the Berkeley Center for Law and Technology. Essentially, law, regulation, and technology can supplement the moral codes of a society, so less or more of them is required depending on the strength of common belief in, and adherence to, those moral codes. A shift can occur away from doing something because it is "right", toward self-interest (the prudential approach), and finally toward what is merely possible or practical. The second can rely on signals that you will be detected and convicted, e.g. many CCTV cameras. The third is evidenced by technologies used to enforce options, such as turnstiles.
Travis D. Breaux, Assistant Professor of Computer Science at Carnegie Mellon University, presented interesting thoughts on regulatory patterns during the Berkeley Center for Law and Technology "Technology: Transforming the Regulatory Endeavor" symposium.
He suggested the following regulatory "patterns" to follow in drafting regulations. Regulations should:
- Allow suspending the course of a prescribed action when appropriate. An example is suspending required notification during a police investigation.
- Allow design alternatives by giving guidance, not implementation details. This accommodates technological change, among other things. For example, a regulation that is not prescriptive about mechanism allows notification to move from paper mail to email.
- Support thresholds and exceptions. For example, above a threshold number of affected individuals, allow substituting a notice on a web site for individual notices, to enable scaling.
- Enable indemnification. An example is to generally require the use of encryption, but with an exception if credit card processing rules are met.
- Support prohibitions. For example, disallow new uses of SSNs; where an SSN is already in use, require notification of continued use and allow individuals to prohibit it.
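Several of these patterns can be made concrete in code. The sketch below is a hypothetical illustration of how a breach-notification rule might encode suspension, a scaling threshold, and a non-prescriptive notification mechanism; all names and the 500-person threshold are illustrative assumptions, not taken from any actual statute.

```python
# Hypothetical sketch: a notification rule encoding three of the patterns
# above. The function names and the 500-person threshold are assumptions
# for illustration only.

def notification_method(affected_count, investigation_hold, website_threshold=500):
    """Decide how (or whether, for now) to notify affected individuals."""
    if investigation_hold:
        # Pattern: suspend the prescribed action, e.g. during a police
        # investigation.
        return "suspended"
    if affected_count > website_threshold:
        # Pattern: threshold/exception -- substitute a web-site notice for
        # individual notices so the obligation scales.
        return "website_notice"
    # Pattern: design alternatives -- the rule says "notify individually"
    # without prescribing a mechanism, so paper mail or email both satisfy it.
    return "individual_notice"
```

Note how the regulation's flexibility (or rigidity) is reproduced directly in the branching structure: a more prescriptive rule would leave fewer design alternatives open to the implementer.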
Code can be viewed as an implicit form of regulation by embedding assumptions and decisions in the technology.
Interestingly enough, code often does not reflect regulator intent. Travis Breaux from CMU noted that engineers often miss important constraints in regulations, and sometimes amplify the requirements when dealing with ambiguity in them. System design can also have an impact; for example, direct screen writing in an OS might run counter to implicit accessibility assumptions in a regulation. Danielle Citron from the Maryland School of Law reinforced this theme, noting that programmers can distort policy through simplifications or make policy changes of their own. In one example, incorrect rules in a system design resulted in food stamps being denied to those with prior drug convictions – a decision contrary to the law.
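The kind of simplification Citron describes can be sketched in a few lines. The example below is entirely hypothetical: the field names and the legal rule itself are assumptions for illustration, not the actual statute or system from the food-stamps case.

```python
# Hypothetical sketch of how a programmer's simplification distorts policy.
# The field names and the legal rule are illustrative assumptions.

def eligible_as_written_in_law(applicant):
    # Suppose the statute disqualifies only felony drug-trafficking
    # convictions entered after a cutoff year.
    return not any(
        c["type"] == "drug_trafficking_felony" and c["year"] >= 1996
        for c in applicant["convictions"]
    )

def eligible_as_coded(applicant):
    # The implemented rule quietly broadens this to *any* drug conviction --
    # a policy change made in code, not by the legislature.
    return not any("drug" in c["type"] for c in applicant["convictions"])

applicant = {"convictions": [{"type": "drug_possession_misdemeanor", "year": 1990}]}
# The law would grant this applicant benefits; the code denies them.
```

The divergence is invisible to anyone who reads only the statute or only the system's output, which is part of why such errors persist.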
Automation is an issue because people tend to believe a computer's result ("automation bias"), notice is often lacking, and it is hard to contest a decision at a hearing without the information implicit in the software.
Has policy been unintentionally delegated to programmers?
Moral hazard is the idea that you might act differently if you think the risk is borne by others. That said, if you have a strong constitution, you might wish to read the following excellent material from John Mauldin: