As promised, here is the second installment in my discussion of knob-turning in ethical systems.
One of the problems with consequentialist moral systems such as my own is that they require knowledge of the consequences of a given action. Given that these consequences usually lie somewhere in the future, this is not a trivial requirement. To be sure, something like murder is reasonably easy to assess: while there might be some slight increase in happiness (in the murderer, for instance), the net effect is bound to be strongly negative: death means the loss of a potentially long and happy life, and the traumatic experience of grief among loved ones.
But what about something less clear, like cheating or lying? Why, exactly, do we instinctively consider these things to be wrong, and can our moral systems justify these instincts? Does dishonesty always have a victim?
One potentially useful way of addressing these questions is to imagine the phenomenon in question writ large. To exaggerate it beyond realistic limits. To turn the knob all the way up to ten. This is, as far as I can tell, what Kant was trying to do with his categorical imperative, and what Fyfe was trying to do with his turning of desire knobs. The issue I have with these two examples, though, is the metric used to judge the result. I therefore offer here an example of how my own metric (the net change in contentment) can be used to evaluate an action or practice when it is hypothetically exaggerated.
The example is cheating on exams, recently brought to my attention by a Christian friend with whom I have an ongoing and fruitful debate about religious matters. How, he asked me, does my moral system determine whether cheating on an exam is right or wrong?
If we imagine a typical case of academic cheating, we can conclude that some suffering is likely to result: among the cheater’s classmates, for instance, or later in the cheater’s own life, when his lack of hard, honest work begins to take its toll. Conversely, there may be some short-term increase in the cheater’s contentment in securing a good grade on his exam. It seems to me, however, that the suffering outweighs the contentment, especially once the cheater’s own feelings of guilt and regret are taken into account.
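For readers who like to see the bookkeeping made explicit, the weighing above can be caricatured as simple arithmetic. The sketch below is purely illustrative: every name and every number in it is an invented assumption of mine, not a measurement, and the metric itself is just "sum the per-person changes in contentment, counting suffering as negative."

```python
def net_contentment(deltas):
    """Sum per-person changes in contentment; negative values represent suffering."""
    return sum(deltas.values())

# Hypothetical deltas for a typical cheating case (all values are made up):
cheating_case = {
    "cheater_short_term_relief": +2,   # a good grade secured
    "cheater_guilt_and_regret": -3,    # lingering guilt and regret
    "classmates_resentment": -2,       # unfairness felt by honest classmates
    "cheater_later_incompetence": -4,  # lack of honest work takes its toll
}

verdict = "wrong" if net_contentment(cheating_case) < 0 else "permissible"
print(net_contentment(cheating_case), verdict)  # -7 wrong
```

Of course, the point of the exercise is not the particular numbers, which could be disputed endlessly, but the structure of the judgment: the action is wrong whenever the total comes out negative.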
Can we confirm this conclusion by turning up the knobs? Can we identify further consequences of cheating by doing so? Well, let us consider a situation in which all students cheat on a regular basis. If this happened, schools would cease to produce knowledgeable and skilled young people. Instead, they would become rubber-stamp institutions in which diplomas are handed out once students have copied or plagiarized the requisite number of exam solutions from previous students’ work. Universities would be in the same boat. The problem with this, of course, is that all kinds of valuable services would cease. For instance, there would be no more qualified doctors. There would be no more research into disease. The net result would be an enormous increase in suffering, probably for generations to come. Thus, by looking at an exaggerated scenario, we see that a permissive attitude toward cheating is potentially catastrophic under the metric of contentment utilitarianism.
We have to conclude, then, that cheating should generally be considered wrong, unless there are special circumstances that change the balance of suffering and contentment (contentment utilitarianism does not, by its very nature, establish any absolute, dogmatic views).
The same may be said of lying: if lying were so common in society that no trust could be established between people, then enterprises such as medical care, research, and even policing and firefighting would be drastically hobbled, and significant suffering would ensue. This gives us good reason, under the metric of contentment utilitarianism, to eschew lying as a general principle, keeping in mind (as always) that exceptions may exist.