I saw Richard Muller give a talk about climate change. Perhaps you've heard of him. He's the physicist who got funding from oil companies to do an independent reanalysis of Earth surface temperature data. And then when Republicans chose him as a witness for a congressional hearing, he surprised them by testifying that his results, even after correcting for biases, confirmed the previous analyses.
Muller's thesis was that you should have been skeptical of climate change before, but you should not be now (now that Muller published his research, har har). I'm not really sure what this has to do with synchrotron light sources, which is what the conference was about, but it was a good talk.
Anyway, if you remember Richard Muller from the news, you may also remember when the e-mails of a bunch of climate scientists got leaked. Much was made of a particular e-mail that spoke of a "trick" used to "hide the decline". The "trick", of course, simply refers to a trick of the trade. The "decline" refers to a decline in temperatures inferred from tree ring data. It's well-known in the scientific literature that tree ring data diverges from more reliable temperature measurements after 1960, for reasons unknown. Thus, the trick is to ignore tree ring data after 1960 and show more reliable estimates in its place.
Muller made a point that I feel I missed at the time. The correct thing to do is to show the data, even when it is discordant with your conclusions. And then you should be able to convince your audience anyway.
It's not uncommon for some particular bit of data to contradict your conclusions. That doesn't necessarily mean your conclusions are incorrect or unpersuasive; it just means that not every single piece of data supports them. There are lots of reasons why data can be wrong. So show the data and explain why you think it's wrong.
But I can also think of a few situations where it would be appropriate to hide the data. For example, if it's a well-known effect with a standard procedure for correction. Relating to my own research, no one would complain if I neglected to mention the data I took from obviously dirty superconductor samples.
I'm not sure whether, in this specific case, it was okay to hide the discordant data. The divergence of the tree ring data is known in the literature, and the researchers openly state that they correct for it.
Eh, I guess I don't care either way. I'm more interested in the general principle, that it is good to show discordant evidence. Even moving away from science, when I relate anecdotes, I like to note the limitations of my experiences as evidence. Of course, I hope to convince despite limitations, but if I fail, that's that.