
Imagine that one day you open your email inbox and find a promotional email claiming to make ‘100% accurate stock market predictions’. You don’t think much of it. You skim it before sending it to the junk folder. “Watch stock X on 29 January 2025, it will go up,” the mail says before you delete it permanently. The next day, out of pure curiosity, you take a look at stock X and see that it did indeed go up.
“Lucky guess,” you rationalize, and leave things be.
Next week, you get an email from the same group. This time it says, “Watch stock Y on 6 February 2025, it will go down.” You are a bit curious now but still delete the mail and don’t think much of it. To your surprise, stock Y closes lower on 6 February.
You are now more than just curious. Could this really be an algorithm making perfect predictions? You wait for similar mails for two more weeks, and each time the prediction is accurate. The algorithm has made the correct call 4 out of 4 times. You are all in now. In the fifth week, you get an email saying that if you would like more such predictions, you will need to pay for the services of their financial advisors. You might think any of the following at this point:
They have made the correct prediction about a particular stock, its movements, and the date associated with the movements for 4 consecutive weeks.
This is unlikely to happen purely by chance.
Given the skepticism with which you analyzed the evidence in front of you, their track record seems like reason enough to seek out the services of these financial advisors.
And this is where you would fall for a popular scam that weaponizes your own skepticism against you using the survivorship bias.
The modus operandi behind such scams is pretty simple. Each week, 10,000 mails are sent out about a particular stock going up or down on a given date. Half the mails say the stock will go up while the other half say it will go down.
Half of the recipients - 5,000 people - will receive the correct prediction simply by chance, since a stock can only close higher or lower, even if by minuscule amounts. The process is repeated a week later, but only with those 5,000 people, and again half of them will receive the correct prediction.
After 4 repetitions, the scammers will have a group of ~625 people who have received 4 correct predictions in a row. These 625 people will now be pitched the services of a “financial advisor” for hefty fees.
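The funnel above can be sketched in a few lines of Python. This is a toy simulation, not based on any real mailing: each week, half the remaining recipients are told "up" and half "down", and whichever direction the stock actually moves, exactly half of them were "right".

```python
import random

def run_scam(recipients=10_000, weeks=4):
    """Simulate the prediction-scam funnel: keep only the recipients
    whose emailed prediction happened to match the actual movement."""
    pool = list(range(recipients))
    for _ in range(weeks):
        random.shuffle(pool)
        # Half the emails say "up", the other half say "down".
        up_half = set(pool[: len(pool) // 2])
        actual_up = random.random() < 0.5  # the real movement is irrelevant
        # Survivors are those whose prediction matched the movement.
        pool = [p for p in pool if (p in up_half) == actual_up]
    return len(pool)

print(run_scam())  # 10,000 halved 4 times: 5,000 -> 2,500 -> 1,250 -> 625
```

Note that no matter which way the stock moves each week, the pool halves: the scammers never need to predict anything, only to mail both predictions and keep track of who received the lucky half.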
The scam simply uses probability and randomness to goad unsuspecting people into giving away their money. The people who fall for it are victims of the “survivorship bias”: a cognitive bias where we make decisions based on the data presented to us, without considering the data that is missing or has been kept from us.
The survivorship bias is probably best known for its military application during the Second World War, when the U.S. military sought the help of the Statistical Research Group (SRG) at Columbia University to solve a critical problem: identifying areas of aircraft that were most vulnerable to enemy fire and should be reinforced to improve survivability.
Abraham Wald, one of the statisticians associated with the SRG, wrote an eight-point memorandum about his work. One of the major challenges Wald faced was that his data came only from planes that had survived the journey back home. He had to devise methods to estimate how survivable damage to each section was, accounting for the planes that did not come back. You can get an idea of the work Wald did from this piece by Bill Casselman.
You might think that the goal would be to reinforce planes in the sections where the surviving planes took the most hits. The basic lesson from Wald’s work was the opposite: the planes needed to be reinforced where the surviving planes took the fewest hits. The logic is that if no planes are coming back with hits in those regions, it is likely because hits to those regions cause critical damage. By taking this approach, Wald accounted not only for the data that was presented to him, but also for the data that did not survive.
You might have seen this illustration of Wald’s work and the survivorship bias on the internet.

The red dots show where the surviving planes were hit most often. Wald’s work suggested that the areas with few or no red dots need to be reinforced, since hits to those areas were likely fatal to the planes that never returned.
While the illustration itself is a simplification of Wald’s work, it gets the point of the survivorship bias across – focusing only on what is visible can lead to skewed conclusions.
The survivorship bias has implications for everyday life outside the military too. Take the example of dropping out of college: Mark Zuckerberg’s story of dropping out of college to start a company and ending up a billionaire gets a lot of attention. Does this mean that if you want to start a successful company, you're better off dropping out of college, rather than waiting until you finish?
Not necessarily. Survivorship bias plays a role here as well. We are more likely to hear the stories of college drop-outs who achieved business success, because those stories get a lot of attention; we rarely hear about the many people who dropped out of college to start a company only to watch that company fail.
Even if we were aware of both groups – those who dropped out and failed, and those who succeeded – it still wouldn't be possible to tell whether dropping out is associated with startup success.
We'd need to know the rate of success among those who drop out AND those who don't. Only then would we be able to make any definitive claims about the link between dropping out and success in starting a company.
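To make that comparison concrete, here is a minimal sketch with entirely made-up counts (the numbers are illustrative assumptions, not real statistics); the point is the shape of the calculation, not the values:

```python
# Hypothetical counts, invented purely to illustrate the comparison
# survivorship bias hides: success rates are needed for BOTH groups.
dropouts  = {"succeeded": 30,  "failed": 970}    # founders who dropped out
graduates = {"succeeded": 200, "failed": 4800}   # founders who finished college

def success_rate(group):
    """Fraction of founders in a group whose company succeeded."""
    return group["succeeded"] / (group["succeeded"] + group["failed"])

print(f"drop-outs: {success_rate(dropouts):.1%}")   # 3.0%
print(f"graduates: {success_rate(graduates):.1%}")  # 4.0%
# A handful of famous drop-out successes tells us nothing about these
# rates; only the full two-group comparison does.
```

With these invented counts, drop-outs would actually succeed *less* often than graduates, even though the visible success stories all come from the drop-out group.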
At its core, the survivorship bias is a case of what Nassim Taleb called “silent evidence” being ignored in favor of louder, more visible evidence. Given that silent evidence is, by its nature, invisible, survivorship bias can be a tough bias to overcome. Here are some steps you can take to ensure you don’t fall for it.
Seek out missing cases: It’s not enough to analyze the information presented to you. Survivorship bias happens when we focus on successes or visible cases while ignoring those that didn’t make it or are easily missed. Ask questions like: “What’s missing from this picture?” or “What’s different about the cases that failed or weren’t included?” Actively researching those overlooked instances will help you get a more accurate and complete view. It can also help to think about the mechanisms used to collect the data - often, this will reveal that certain perspectives are being systematically left out (as when popular media only covers the success stories of startup founders who dropped out of college).
Good data trumps stories: It is true that individual stories of overcoming the odds can be inspiring, but individual stories are just that…stories. They are incomplete and do not provide a full picture of what is typical or likely. Relying on data will give you a better chance of making good decisions than relying on stories (provided the data you are using is reliable and valid).
Seek out failures/edge cases: While this might seem a pessimistic approach, it is important to seek out cases where an idea or belief failed and to understand the reasons behind the failure. Failures often carry valuable lessons about limitations, risks, or factors that lead to different outcomes. For instance, if you’re considering adopting a business strategy based on a company’s success, dig into the stories of companies that tried similar strategies and failed—this can help you identify potential pitfalls.
By making these steps a regular part of your thinking, you can reduce the likelihood of falling victim to survivorship bias and make decisions that are better informed and more realistic.