By: C. Paixão
The ‘butterfly effect’ is the idea that small changes can have large, non-linear impacts on future events. The theory was developed by the meteorologist and mathematician Edward Norton Lorenz, who discovered in the 1960s that “the tiny, butterfly-scale changes to the starting point of his computer weather models resulted in anything from sunny skies to violent storms, with no way to predict in advance what the outcome might be”.
The theory is usually illustrated with a tornado: its details (how it forms, the exact time it forms, the exact path it takes, and so on) are shaped by minor perturbations in the atmosphere, such as a distant butterfly flapping its wings many weeks before the event. Lorenz originally used a seagull causing a storm to explain the idea, but in 1972 he was persuaded to make the image more poetic, with a butterfly's wings as the small cause and a tornado as the consequence. Since then, the "butterfly effect" has been used well beyond meteorology to describe any circumstance in which a tiny alteration is thought to produce much larger effects in the future, a little like a ripple effect.
One weather expert commented that if the hypothesis were true, a single flap of a butterfly's wings would be enough to completely change the course of the weather. Although the controversy has not been resolved, the most current data appears to be in the butterfly's favor. Lorenz consistently emphasized that it is impossible to determine exactly "what tipped a system"; the butterfly serves as a metaphor for an elusive quantity. The butterfly effect is, somewhat dishearteningly, a model that highlights the shortcomings of other models. Because errors grow exponentially, leaving us no way to make accurate long-range predictions, it shows that science is less precise than we might think.
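The exponential growth of errors that Lorenz described can be sketched in a few lines of Python. This is an illustrative simulation, not anything from the article: it integrates Lorenz's famous 1963 equations with his standard parameters (sigma = 10, rho = 28, beta = 8/3, all assumptions of this sketch, not values stated above) from two starting points that differ by one part in a billion, and prints how far apart the trajectories drift.

```python
# Sketch: two trajectories of the Lorenz system that start a "butterfly-scale"
# distance apart diverge rapidly, illustrating sensitive dependence on
# initial conditions. Parameters are the standard ones from Lorenz (1963).

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one small Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def distance(a, b):
    """Euclidean distance between two points in phase space."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)  # perturbed by one part in a billion

for step in range(1, 30001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 10000 == 0:
        print(f"t = {step * 0.001:>4.1f}  separation = {distance(a, b):.2e}")
```

The separation grows by many orders of magnitude over the run, even though no single step moves the two trajectories apart by much; this is the exponential error growth that defeats long-range forecasting.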
Many individuals in the early days of computing thought computers would help us comprehend complex systems and make precise forecasts. After centuries at the weather's mercy, people aspired to finally gain control over it. Lorenz shocked the world of forecasting with one innocent discovery, unleashing ripples that, appropriately, went far beyond meteorology.
On the other hand, quantum scientists have reported disproving the ‘butterfly effect’ at the quantum level, contradicting the notion that changes made in the past would alter the present upon return. In a simulation conducted on a quantum computer, a piece of information is sent to the past and deliberately damaged there. Yet when the information returns to the present, it is barely altered, and sending it even further back in time leaves it even less damaged on return. Such an effect has only been demonstrated in quantum mechanics, in simulations run on quantum computers, since actual time travel is not possible and remains an unclarified phenomenon.
“We can actually see what happens with a complex quantum world if we travel back in time, add small damage, and return. We found that our world survives, which means there's no butterfly effect in quantum mechanics,” said Nikolai Sinitsyn, a theoretical physicist at Los Alamos National Laboratory.
Although the debate has not been settled, the theory is widely defended and applied by scientists, mathematicians, and economists, and it echoes in some traditions and religions, where a future event may be seen as the consequence of a small change or event in the past, like ripples. Quantum scientists, however, claim to have disproven this controversial theory: in their simulations, the results show no trace of a ‘butterfly effect’ at the quantum level.