There are times when we become aware of information that is not consistent with our beliefs about something. For instance, we used to think that bleeding could cure diseases and, more recently, we thought that smoking could be good for us – but in both cases new medical information has shown those beliefs were not only false but actually harmful, although in each case it took a while for that conclusion to be accepted, writes Prof Simon Bridge.
“When my information changes, I change my mind – what do you do?”
Attributed to John Maynard Keynes
So what do we do when we receive information that does not agree with our thinking?
One tactic is to try somehow not to receive, or not to have to acknowledge, such messages – and we may not even be conscious that we are avoiding them. This approach can sometimes be found in government departments when, in case there could be suspicion that something is not working as planned, reassurance is sought. If there are both positive and negative reports, the positive ones will be favoured and, if more apparently independent reinforcement is needed, consultants can be hired to investigate with the unvoiced understanding that a report in line with current thinking would be very welcome. The Renewable Heat Incentive (RHI) offers examples of such practices – and of the problems that occur when uncomfortable information, although available, somehow fails to be noticed and remains an ‘unknown known’.
However, if new and conflicting information cannot be avoided, then we may have to change our minds. Here I am indebted to Marilyn Ferguson, in a book to which I was once referred, for her suggestion that there are four basic ways in which we can do that.
The first she calls change by exception, where the old approach is retained but some anomalies are recognised. We even have an expression for such differences – ‘exceptions that prove the rule’ – instead of what they might actually be, which is exceptions that disprove the rule.
The second, relatively easy, approach is incremental change, where change happens bit by bit, so there is no sudden shift or need for significant and uncomfortable readjustment and, if the change is slow enough, many will be unaware of any difference.
However, sometimes the contrast between reality and the conventional approach becomes so great that something has to give. Then there is a temptation to apply what Ferguson terms pendulum change: the abandonment of the previous view and its replacement with a new and often contrary one. It is as if the previous software could be uninstalled and a different program found and downloaded. Yet often the old system was not completely wrong, while the new one is over-hyped but as yet untested and prone to error. Pendulum change appears to rest on a belief that it must be one or the other and that, if one is wrong, the other must be right. A switch from one to the other does not seek to build on the best of both – it is more like a revolution changing a country from capitalism, with all its faults, to communism, with its own but different imperfections.
Thus pendulum change fails to retain what was right in the old and to distinguish what may actually be useful in the new from the over-statements made about it. But experience can show that change is still necessary, and often the only way to make something work well is the process of testing, feedback and improvement known as ‘trial and error’. Indeed, scientists know that the route to progress is to look out for, and then act on, evidence contrary to current understanding. However, there is a strong temptation to hold onto what we have, especially if we have an investment in it – for instance, if we have contributed to its development or progressed to our current positions through knowledge of it.
Nevertheless, often the best way to change is what Ferguson calls paradigm change: the deliberate construction of a new approach that retains the good points of the old, combined with new ideas to correct its apparent faults. That may be uncomfortable because it is contrary to human nature, but it has the advantage that it can work – if we can do it.
So what do you do when new but uncomfortable information becomes available? Try to avoid or minimise it in case it shows a need for change – or actually seek it out because it could show where improvement is possible? Or are you like most of us – often confused and somewhere in the middle?
Marilyn Ferguson, The Aquarian Conspiracy (Los Angeles: J. P. Tarcher Inc, 1987)