Back in 1970, when risk assessment became widely known, people needed a way to understand risk without panicking. Assessors wanted a non-scary term for communicating a very small chance of death. And thus the "micromort" was born.
Often, people are confused by the fact that everything, even the most innocuous stuff, carries some risk. They're also not entirely sure how those risks stack up. When Ronald A. Howard became a professor at Stanford University, he decided to focus on decision analysis. In particular, he wanted to break the idea of fatal risk down into small units. So he came up with the "micromort": a one-in-a-million chance of death. Different activities can then be compared by how many micromorts they add to your everyday baseline.
Unfortunately, the micromort doesn't really make risk sound any less scary.
Drinking the water in Miami for a year increases your likelihood of dying by one micromort. Does that reassure you? Not me, and I don't even know the micromorts for the water supply for the rest of the country. Nor am I pleased to know that walking 17 miles also increases your micromort level by one, or that you can drive 250 miles in a car before you get the same increase. (Although I suspect restricting the use of cars would decrease the micromort level for walking.)
Perhaps a better way of explaining micromorts would be the 20-coin rule. Plus Magazine points out that if you threw 20 coins in the air, the likelihood that they'd all come down heads would equal roughly one micromort. That seems like a simple way to help people (including anxious writers) see how unlikely a micromort actually is.
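The 20-coin rule is easy to verify yourself. A quick sketch (using only the definitions above; the variable names are mine) shows that the odds of 20 fair coins all landing heads come out to roughly one in a million:

```python
# A micromort is a one-in-a-million chance of death.
MICROMORT = 1e-6

# Probability that 20 fair coin tosses all come up heads: (1/2)^20
p_all_heads = 0.5 ** 20  # = 1/1,048,576, just under one in a million

# Express that probability in micromorts
micromorts = p_all_heads / MICROMORT
print(f"P(20 heads) = 1/{int(1 / p_all_heads):,} = {micromorts:.2f} micromorts")
```

Since 2^20 is 1,048,576, the match isn't exact, but at about 0.95 micromorts it's close enough for an intuition pump.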