LONDON — There are three kinds of lies, goes the old saying popularized by Mark Twain: “lies, damned lies, and statistics.” That’s been on my mind this month as I’ve watched a big scandal over aid workers’ sexual misconduct unfolding in the British press.

It began with an exposé by The Times of London alleging that staff members from the charity Oxfam had paid earthquake survivors in Haiti for sex. But the most attention-grabbing headline in the expanding scandal came from The Sun, which blared that a “bombshell U.N. dossier” found that United Nations aid workers had raped 60,000 people.

That is a horrifying number. It is an attention-grabbing number. It is also more or less a made-up number.

And it’s got me thinking about which stories get believed, how numbers can become a stand-in for rigor and objectivity, and how that can be a problem.

I called Andrew MacLeod, the author of the document in question, to ask where his data came from. To his credit, Mr. MacLeod, a former United Nations employee who once described himself in his Twitter bio as a “Humanitarian|Speaker|Futurist|Visiting Professor|Corporate Director|Traveler,” was quite upfront about the fact that his numbers were only very rough estimates. The Sun, he said, exaggerated his claims considerably.

The “dossier,” Mr. MacLeod explained, wasn’t a leaked internal United Nations report. It was a two-page memo he had written for the British government long after he left the United Nations’ employ, plus the text of an email he had sent to people he hoped would support a new nonprofit he and some lawyer colleagues were founding to work on this issue.

More important, the terrifying numbers the memo cited were just rough guesses intended to hint at the scale of the problem, not actual measurements.

Take that estimate of 60,000 rapes, for instance. For one thing, it was actually an estimate of the number of victims of all forms of sexual abuse and exploitation, not just rape.

To arrive at that figure, Mr. MacLeod said, he started with a 2017 report in which the United Nations said it had recorded 311 victims of sexual exploitation by peacekeepers in the previous year. He then made a series of assumptions that led him from 311 reported cases to his headline-grabbing claim of 60,000. But those assumptions were little more than guesses, and Mr. MacLeod freely admits he has no hard data to back them up.

The first assumption was based on a remark by the United Nations secretary general, António Guterres, that the problem of sexual abuse was probably worse on the civilian side of the organization than within the peacekeeping forces. Mr. MacLeod concluded that this meant there must be at least 312 victims of the non-peacekeeping staff — 311 plus 1 — but to keep his “estimate” conservative, he put the number at 289, to arrive at an even 600 for the year. Needless to say, that kind of rounded guess based on an offhand statement is unlikely to be reliable.

From there, Mr. MacLeod estimated that only 10 percent of the assaults were actually recorded, a rate he said was based on reporting levels in the United Kingdom, to arrive at a figure of 6,000 victims annually. But because the United Nations doesn’t have good (or possibly any) data about the rate at which victims report assaults by its staff, there’s no way to know if that was the right multiplier to use.

Then, to arrive at an estimate for the decade, Mr. MacLeod assumed that 2016 was a representative year, so he multiplied 6,000 by 10.

Presto: 311 becomes 60,000.
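For readers who want to see the whole chain at once, the arithmetic can be laid out as a short script. This is my reconstruction of the steps described above, not Mr. MacLeod’s actual worksheet; every named assumption is exactly that, an assumption.

```python
# Reconstruction of the back-of-the-envelope chain behind the 60,000 figure.
# Each step below is an unverified assumption, not a measurement.

recorded_peacekeeper_victims = 311  # 2017 U.N. report: recorded victims for 2016

# Assumption 1: civilian staff account for roughly as many victims again,
# rounded so that the annual total comes to an even 600.
assumed_civilian_victims = 289
annual_recorded = recorded_peacekeeper_victims + assumed_civilian_victims  # 600

# Assumption 2: only 10 percent of cases are ever recorded
# (a rate borrowed from U.K. reporting statistics).
reporting_rate = 0.10
annual_estimate = annual_recorded / reporting_rate  # 6,000

# Assumption 3: 2016 is representative of the whole decade.
decade_estimate = annual_estimate * 10  # 60,000

print(int(decade_estimate))
```

Notice that the final figure is exquisitely sensitive to the middle step: change the assumed reporting rate from 10 percent to 20 percent and the decade total halves.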

In Mr. MacLeod’s telling, he always intended his memo to be an advocacy statement rather than what he called “peer-reviewed” statistical analysis — essentially a way to say “here is a problem that is very large and very bad” (my phrase), with digits instead of words.

The problem is, that isn’t how journalists use statistics, or how the public consumes them.

Good journalists always try to check the provenance of the numbers we cite. But we aren’t trained statisticians, so we can’t necessarily perform independent checks of every figure experts provide to us. And sometimes numbers manage to enter the news ecosystem so quickly and pervasively that our fact checks can’t keep up.

After the Parkland school shooting, for instance, you may have seen the viral statistic that America had already experienced 18 school shootings this calendar year.

That number, it turns out, was misleading. It came from Everytown for Gun Safety, a gun-control organization, whose definition of “school shootings” is far broader than the common understanding of the term. It counts every time a person fires a gun in or near a school building as a “school shooting,” including a suicide in Michigan that took place outside a shuttered, empty school.

That’s not to say, of course, that such events are not upsetting and tragic in their own right. But they are not what people imagine when they hear the term “school shooting.” So the statistic, without context, was misleading.

The Washington Post eventually published a careful debunking of Everytown’s figures. But by that time, the number had already been tweeted by Senator Bernie Sanders and Mayor Bill de Blasio of New York, and cited by news organizations including MSNBC, the BBC, NBC News, ABC News and CBS News.

That matters. Numbers, if trustworthy, are a way to make reporting more rigorous and objective, and to show that a story is based on cold hard facts rather than a handful of anecdotes. And when a number is big and splashy, like Mr. MacLeod’s 60,000 victims or Everytown’s 18 school shootings, it can make a point very powerfully.

When people see a number in the paper or on a trusted news program, they tend to believe that it’s valid, especially if an expert is cited as the source. They don’t see an advocacy statement. They see rigor and objectivity, even if none actually exist. The result is that a number will often be taken much more seriously than an advocacy statement would be.

That’s all the more true if the story gets picked up by other news outlets. Mr. MacLeod, who has appeared on television and given other print interviews, seems to be everywhere. And the more he gets quoted, the more authoritative he seems to be.

The first problem with that, of course, is that the public is now under the mistaken impression that a reliable whistle-blower investigation has concluded that the United Nations is harboring thousands of predators, when in fact it was just a back-of-the-envelope sketch of the size of the problem. It’s entirely possible that the number of victims is much, much higher. It also might be much lower. We just don’t know.

But even more worrying is the way that bad information, like bad money, drives out good.

“Those of us who work on this issue know that if we take one misstep, the men who abuse will use it against us to make sure nothing changes,” Sarah Martin, a consultant who wrote a 2005 report on sexual assault and exploitation by members of United Nations peacekeeping missions, said by email.

Ms. Martin, who has advised the United Nations and other international aid organizations on gender-based violence, said she and her colleagues feared that Mr. MacLeod’s figures would provide exactly that kind of excuse, distracting from the less splashy but more reliable information that is already available.

“I have heard so many horrific stories from women that I don’t need false statistics waved around,” Ms. Martin said. “It discredits the very brave women and children who struggle to come forward and then do — usually to disbelief and disinterest.”