Utilitarian types tend to worship science. When probed as to why, they will give an answer resembling ‘because it is the most reliable way to approach truth’. But of course, they are not truth maximizers; they are utilitarians. So even if science is the most reliable way to approach truth, that truth is only valued for the utility it brings.
Utilitarians are aware that their position implies that truth is just a means to an end. They react to this implication in one of two ways. The first is the Sam Harris path: truth is in alignment with welfare. The second is to hold, as Noah Smith does, that it is sometimes okay to lie. The second position is simply biting the bullet, so let us focus on the Sam Harris path.
The strong version of the claim is that utility and truth are always in alignment, so you should always be truthful. This is a particularly heroic assumption. Probably the clearest counterexample is somebody on their deathbed asking for glimmers of hope. I doubt even utilitarians would deny such cases exist, but perhaps they can carve out a sub-clause for people on their deathbeds. To avoid that clause, take a more relevant case: there is an unprecedented pandemic, and if doctors don’t get masks while they deal with patients, two thousand more people will die. So you want to make sure ordinary people don’t rush to buy masks, ensuring the doctors can be supplied. It should be obvious that a utilitarian would lie! Heck, the nice thing about democratic systems is that ‘trust’ is usually person-specific: if Fauci lies about it, he can be replaced with the next useful idiot who doesn’t have a record of lying, and bam, credibility is reset. Utilitarians can try to argue that it isn’t about trust in a person but about trust in the ‘institution’, but I think that critique is overblown; most people forgive the lies and hope the next person of influence isn’t going to lie as well.
The weaker version is that utility is furthered more, on average, by truth-telling. This point of view relies on the agent being unable to distinguish situations where truth helps utility from situations where it doesn’t. Indeed, this seems to be one of the advantages of science: science helps us find state variables that make it easier to predict the consequences of our actions. So if it is true that you should tell the truth only on average, a corollary is that as our understanding of particular situations improves, our lying should increase.
It is also interesting to consider what this does to the probability of being truthful as a function of influence. Take Harris’s main reason for telling the truth: that it allows agents to form relationships (which increases welfare). It seems quite clear that when somebody of influence speaks, proportionately fewer relationships are being formed. It would be rather absurd to claim that Mehmet the Conqueror formed a relationship with everyone who listened to him. Instead, when Mehmet speaks, very few relationships are formed, but there is a unilateral effect on a great many people. So the tradeoff for a person of influence seems skewed towards lying.
So to sum it all up: utilitarians are telling us they will lie more as their understanding increases, and they will also lie more as they become more influential. As soon as they hold a position of influence, they will in fact lie. This is but an implication of what it means to value truth only for its effects on pain and pleasure.