Causality III - Mechanistic deterministic vs Stochastic causality perspectives and the Statistical perspective.

This edition is more philosophical in nature, but for a good reason: thinking about the phenomena and causality around us, and offering a statistical perspective on them.

One interesting essay by Laplace [1] implies that predictive power over future events could be absolute if the mechanics and state of all phenomena were known to a single intellect.

(Data Scientists should keep this in mind when building predictive models, in my opinion.)

Contrary to the general wisdom, I would not disagree with this. Starting with physics: if one knew all the properties of a particle in a vacuum, and all the energies transferred to it, the prediction of its motion should be absolute.

But there is a catch. Statistically speaking, stochastic processes are also part of nature's phenomena, and they exist almost everywhere; we can measure them in almost any phenomenon.

One can think of a dice roll: actually a very simple physics experiment, yet not practically predictable in general, even though Laplace would, in principle, be right here too [1]. The phenomenon still has randomness built in, as do many other features of the world around us.
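The dice-roll point can be illustrated with a minimal sketch. The logistic map below is only a stand-in assumption for the real physics of a tumbling die, but it shows the key property: a deterministic rule whose sensitivity to initial conditions makes outcomes practically unpredictable, so treating them as random is the useful model.

```python
def logistic_trajectory(x0, steps=50, r=4.0):
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two initial conditions differing by an immeasurably small amount,
# like two "identical" dice throws.
a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)

# The deterministic rule amplifies the tiny difference until the two
# trajectories are completely decorrelated, which is why modelling the
# outcome as random is the only practical option.
divergence = max(abs(u - v) for u, v in zip(a, b))
print(divergence)
```

The printed divergence is large relative to the initial 1e-10 perturbation, mirroring why Laplace's demon would be right in principle yet a die remains random in practice.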

What I like about mechanistic approaches is that they try to explain phenomena through a clear mechanism of what is happening, describing the process of causality between two phenomena.

But in reality, statisticians are typically confronted with the problems, and sometimes even the advantages, of uncertainty originating from the complexity of interactions between many phenomena, noise, and randomness. Dealing with these is, in my opinion, one of the most important aspects of today's Statistics and Data Science, even though the mechanistic approach can be a very important complement.

Mechanistic deterministic and stochastic approaches don't have to be mutually exclusive, in my opinion. In fact, I think they overlap, especially in causal inference.

So using a maximally explainable, mechanistic approach while also dealing with randomness and noise is the ideal combination, in my opinion, and this should be the strategy for future statistical methods.
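As a toy illustration of this combination (the mechanistic law and the noise levels here are hypothetical choices of mine, not from the article): a known deterministic law observed through measurement noise, with a statistical estimator recovering the mechanistic coefficient despite the randomness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mechanistic part: a hypothetical physical law y = 2.5 * x.
x = np.linspace(0.0, 10.0, 200)
true_slope = 2.5

# Stochastic part: measurement noise on top of the mechanism.
y = true_slope * x + rng.normal(scale=1.0, size=x.size)

# Statistical part: least squares recovers the mechanistic coefficient
# even though every individual observation is noisy.
slope = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
print(slope)
```

The estimated slope lands close to 2.5, showing the two perspectives working together: the mechanism supplies the model form, and statistics handles the noise.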

The types of causes are a central aspect here too; more on this in Causal Inference IV. Thank you for reading!


References:

  1. Laplace, Pierre-Simon. A Philosophical Essay on Probabilities. 1902.

Bob Durrant

Data Scientist | PhD in Computer Science

1y

I agree that Laplace's mechanistic view of the universe does appear at odds with the usual statistical viewpoint, but the two can be reconciled, I think, by recognising that probability is a mathematical idealization of (properties of) the world at large that helps us to model or interpret what's going on. From this viewpoint randomness arises as a consequence of either the observer's lack of perfect information or else their inability to make sense of it, even when perfect information is available. For example, in principle, given the instantaneous state of my computer's memory I should be able to say exactly what can be seen on the display at the same time, but it's simply too hard in practice - so for me the display contents are random (given the computer's state) and it's more useful for me to think of them that way. Ditto the roll of a die, or measurements of any suitably complex system. BTW I read in an interview that Persi Diaconis has a coin-tossing machine that can completely predictably toss a head or a tail... ??
