Data Dearth: The death of incompleteness

- Designing reliable yet relevant theories for sustaining prosperity.

 

It is fairly clear that the trends extracted by big data and data-analytics techniques do not offer an in-depth, fundamental approach to understanding complex systems, how innovations work, or how the future will unfold. Before strong models of how a system works can be constructed this way, huge amounts of data must be collected and put to use; essentially, the model or theory developed keeps demanding more data from different contexts. Data analysis focuses more on the trends in the data, and on using them to predict the future, than on the cause-effect relationships that govern the system.


There is an increasing need for a more fundamental approach to modeling, one with a lower data cost: building theories that instead describe the cause-effect relationships in a system, using variables that constitute the scope of the system we are trying to measure and functions that capture the dynamic relationships between them. Such theories offer a more reliable path to better predictive models, since their output stems from abstracted and tested accounts of how the system works. But are they absolute?

Because they are theories, and therefore abstractions, they are subject to a limitation: incompleteness, or ‘non-absoluteness’. A theory does not describe the whole system; it abstracts, and the minute details brushed off during abstraction conspire to limit the theory thus formed. For example, despite the plethora of sound theories built on Bernoulli’s principle and the development of fluid dynamics explaining how things move and fly, plane crashes still happen every now and then (Efosa, 2021). This limitation is peculiar to theories, it is only exposed after the theory has failed, and such failures can be costly. Theory crafting can be done today using tools such as System Dynamics modeling [1]. System Dynamics is used to make reliable predictions but continually falls prey to the limitation described above: incompleteness.
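To make the System Dynamics idea concrete, here is a minimal sketch of a stock-and-flow model. The system shown (a simple product-adoption loop), the variable names, and the parameter values are all illustrative assumptions, not a prescribed model; the point is only to show how stocks, flows, and the feedback between variables are stepped forward over time.

```python
# A minimal sketch of a System Dynamics stock-and-flow model (hypothetical
# system and parameters, for illustration): potential adopters flow into
# adopters through word-of-mouth contact, simulated with Euler integration.

DT = 0.25                 # simulation time step (years)
HORIZON = 10.0            # total simulated time (years)
CONTACT_RATE = 10.0       # contacts per person per year (assumed)
ADOPTION_FRACTION = 0.02  # fraction of contacts that convert (assumed)
TOTAL_POPULATION = 10_000.0

def simulate():
    potential = TOTAL_POPULATION - 10.0  # stock: people who may yet adopt
    adopters = 10.0                      # stock: people who have adopted
    history = []
    for step in range(int(HORIZON / DT)):
        # Flow: adoptions per year, driven by the feedback between the two stocks.
        adoption_rate = (CONTACT_RATE * ADOPTION_FRACTION
                         * adopters * potential / TOTAL_POPULATION)
        # Euler integration: each stock accumulates its net flow over one time step.
        potential -= adoption_rate * DT
        adopters += adoption_rate * DT
        history.append((round(step * DT, 2), round(adopters, 1)))
    return history

if __name__ == "__main__":
    for t, a in simulate():
        print(f"t={t:>5}  adopters={a}")
```

The structure is what matters here: the cause-effect relationships are written down explicitly as flows between stocks, so the model can be inspected and tested rather than merely fitted to historical data.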

 

I propose a more robust, relevant, and reliable model: the combination of solid theory-crafting with the plugging-in of living insights processed from big data. Pairing the reliability and rigor of System Dynamics with quick updates from regular streams of data insights presents a more sustainable and wholesome path to modeling, one that can withstand the severity of time and context.

This will help improve reliability and relevance in understanding and predicting the path of innovation. It also lets us keep established theories currently valid: when they hold in new situations, they are reinforced; when they do not, they morph accordingly. The result is a living theory.

With this approach, there are two sources of updates: updates from theory failures (which will be relatively few) and updates from big-data insights relating to the same scope.
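A minimal sketch of what such a living theory could look like in code, under heavily simplified assumptions: a hypothetical one-parameter growth theory, a short stream of observations standing in for big-data insights, and a tolerance that decides whether the theory is reinforced or morphed. All names, values, and thresholds here are illustrative assumptions, not a prescribed implementation.

```python
# A hedged sketch of the "living theory" update loop: the theory predicts,
# incoming data is compared against the prediction, and the parameter morphs
# whenever observation and theory diverge beyond a tolerance.

GROWTH_RATE = 0.05    # theory's current parameter (assumed initial value)
TOLERANCE = 0.03      # relative error that counts as a theory failure (assumed)
LEARNING_RATE = 0.5   # how strongly new data pulls the parameter (assumed)

def predict(previous_value: float, growth_rate: float) -> float:
    """The crafted theory: simple proportional growth from one period to the next."""
    return previous_value * (1.0 + growth_rate)

def morph_theory(previous_value: float, observed_value: float, growth_rate: float) -> float:
    """Update from data: pull the parameter toward the rate the observation implies."""
    implied_rate = observed_value / previous_value - 1.0
    return growth_rate + LEARNING_RATE * (implied_rate - growth_rate)

# Hypothetical stream of observations for the quantity the theory describes.
observations = [100.0, 105.0, 111.0, 120.0, 131.0, 144.0]

rate = GROWTH_RATE
for prev, actual in zip(observations, observations[1:]):
    predicted = predict(prev, rate)
    relative_error = abs(actual - predicted) / actual
    if relative_error > TOLERANCE:
        # The theory did not hold here: morph it using the observed data.
        rate = morph_theory(prev, actual, rate)
        print(f"theory morphed (error {relative_error:.2%}); growth rate -> {rate:.3f}")
    else:
        # The theory held in a new situation: it is reinforced as-is.
        print(f"theory reinforced (error {relative_error:.2%}); growth rate stays {rate:.3f}")
```

In this toy run the early observations reinforce the theory, one larger deviation triggers a morph, and the updated parameter then tracks the stream again, which is the reinforce-or-morph behavior described above.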



[1] System Dynamics offers an approach to understanding a system, covering both the material and the motivational perspectives that constitute it, by modeling it through variables, the relationships between them, and time delays, so that we can understand how the system changes over time when subjected to certain boundary conditions.


