The curse of the rule maker. But there’s a lesson in it.
Stephen Wilson made an excellent point in his comment “Privacy movement deja vu” on my blog post “Another swallow flew by, but who was looking?”
He pointed out the curse of the rule maker: Rules are made to be broken.
It doesn’t matter whether the rule maker is a parliament passing laws, a standards body setting standards, a professional body setting membership criteria or anything else.
We even see it within bureaucracies – how many of you have on the shelf (virtual or real) hundreds of pages of rules all sitting there to be ignored or worked around? When I was in the Department of Finance, with an overblown sense of our own self-importance, we would send Circulars to the divisions in the Department and Memoranda to other Departments telling them how to participate in the annual Budget process. They had some impact, but never the full intended impact.
Think of other examples all around us:
- GST legislation was supposed to shut down the cash economy in Australia. It doesn’t seem to have done anything of the sort; all it achieved was a change in how the cash economy operated.
- HIH was governed by prudential rules. It still went to the wall.
- AWB was covered by all sorts of rules governing how it exported & prohibiting corrupt practices. It still managed to fund a significant proportion of Saddam Hussein’s war.
Closer to the world in which I engage, the whole concept behind privacy law is that personal information should only be used or disclosed by an organisation for the (single) purpose for which it was collected. There are exceptions to this in the law, including uses & disclosures for other purposes to which the individual has consented. So what emerged? ‘Bundled consent’, where individuals are given the take-it-or-leave-it option of agreeing to any other uses of their personal information that the organisation might think up later, or not getting the service they were seeking.
Nothing is more certain than the huge global industry that will develop to manipulate whatever carbon credit, tax or trading regime is created in response to climate change.
Stephen’s further point is that compliance with standards (or laws or other rules) begins to degrade as soon as a formulaic approach to compliance assessment kicks in. This is worsened when the correlation between compliance and the true incentives faced by the organisation (brand, bottom line or whatever) is weak.
The response from the rule makers, if they indeed want their rules to be effective?
Change the rules!
Sometimes this leads to a peculiar form of arms race, as in the case of taxation law. Clever people circumventing or minimising tax leads the rule makers to tighten the law. In response, those affected by the rule changes simply do what any market does: re-optimise around the new set of rules, including putting resources into finding their weaknesses, finding ways around them, or simply ignoring them if they can.
At other times, the response is more circular: in its simplest manifestation, law is changed from stance A to stance B when A proves to be less & less effective. Later, when stance B proves to be less & less effective, there is a return to stance A.
We see this on a 10-15 year cycle in the rules applying to listed companies (ie the combination of corporations law, Stock Exchange rules etc). A bull run ends in over-inflated prices; companies collapse; a tighter set of compliance rules is implemented; markets recover (usually for totally unrelated reasons such as the normal economic cycle). Then the complaints start about regulatory burden, followed by relaxation of the rules, so that all is ready for the next generation of cowboys and girls to move in and start the rule cycle all over again.
The lesson: This is all NORMAL. Gaming any rule set is one of the rules of the game.
It has been going on since well before the Babylonians started writing down rules in clay 5,000 years ago.
The amazing thing: why are we always so surprised when it happens?
The implications for rule makers include:
- To the greatest extent possible, make the incentives to do the right thing endogenous to the organisation rather than external. Recent anti-money laundering law is an example of attempting this by aligning the external (stopping organised crime & terrorism) with the internal (stopping fraud on the financial institution & its customers; enriching customer relations databases etc). It has only partially worked.
- Minimise the incentive for ‘regulatory arbitrage’ or ‘edge riding’. In particular, avoid rules that provide large rewards for minimal change in business operations. Behavioural targeting, for example, is mostly aimed at understanding the individual and targeting them with advertising without knowing their identity, simply because of the additional rules that apply to ‘personal information’. This will be a huge issue in the next couple of years as the analytics improve and the richness of the data (from Web 2.0 services such as social networks) grows.
- Ensure that the rule making process itself ‘learns’ & can act on what it learns easily and rapidly.
It’s not easy.
The work of the Privacy & Trust Partnership has sought to address these implications. If you haven’t looked at them already, have a look at the Working Paper & the White Paper written in 2007, both online in the Privacy & Trust forum here on Open Forum. We would love to hear your views on whether the proposed Privacy Risk Rating model might work. Even more so, it would be great to hear whether there is a better way. Write a comment directly on the Privacy & Trust forum page or email Chris Cowper at IIS.
Malcolm
Malcolm Crompton is Managing Director of Information Integrity Solutions (IIS), a globally connected company that works with public sector and private sector organisations to help them build customer trust through respect for the customer and their personal information.