Markov chains

A Markov chain is a stochastic model that describes a sequence of possible events in which the probability of the next event depends only on the current event.

The example of a two-node Markov chain below shows a model of sunny/rainy weather prediction and its transition probabilities.

[Image: two-state Markov chain diagram for the sunny/rainy model with transition probabilities]

The model above can be represented by the probability matrix: 

[Image: transition probability matrix Q for the sunny/rainy model]

The model transitions from one state to another with probabilities defined by the matrix Q. A great visualization of the Markov chain process can be found here: https://setosa.io/ev/markov-chains/
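As a sketch, a two-state chain like the sunny/rainy model can be simulated directly from its matrix Q. The probability values below are illustrative assumptions, not the actual numbers from the diagram above:

```python
import random

# Two-state weather model. Rows of Q are the current state,
# columns are the next state; each row sums to 1.
states = ["Sunny", "Rainy"]
Q = [
    [0.8, 0.2],  # assumed: P(next state | currently Sunny)
    [0.4, 0.6],  # assumed: P(next state | currently Rainy)
]

def step(current: int) -> int:
    """Pick the next state index according to row `current` of Q."""
    return random.choices(range(len(states)), weights=Q[current])[0]

def simulate(start: int, n_steps: int, seed: int = 0) -> list:
    """Run the chain for n_steps transitions and return the visited states."""
    random.seed(seed)
    s, path = start, [states[start]]
    for _ in range(n_steps):
        s = step(s)
        path.append(states[s])
    return path

print(simulate(start=0, n_steps=7))
```

Each call to `step` depends only on the current state index, which is exactly the Markov property discussed below.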

A Markov chain is a mathematical model that describes transitions from one state to another according to a given probability matrix.

An important assumption of this model is that no matter how the process arrived at its present state, the probabilities of the possible future states are determined by the current state alone.

A Markov chain can be used to model fault states.

The states of a production line can be described as:

  • Running
  • Auto-mode
  • Manual mode
  • Fault

Faults can be detailed as:

  • Drive fault
  • Hydraulic pressure high
  • Change filter
  • etc.
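One way to turn the lists above into a Markov state space is to expand the single "Fault" state into its detailed fault states, so each distinct condition becomes its own node in the chain. A minimal sketch (state names taken from the lists above; the "etc." entries are left out):

```python
# Top-level production line states from the article.
TOP_LEVEL = ["Running", "Auto-mode", "Manual mode", "Fault"]

# Detailed fault states from the article ("etc." omitted).
FAULT_DETAIL = ["Drive fault", "Hydraulic pressure high", "Change filter"]

# For the Markov model, replace the generic "Fault" state with
# its detailed fault states so each gets its own transition probabilities.
STATES = [s for s in TOP_LEVEL if s != "Fault"] + FAULT_DETAIL

print(STATES)
```

This choice (flattening fault details into the top-level state space) is one possible design; a hierarchical model that keeps "Fault" as a parent state is equally valid.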

Once we define the states, we need to collect data and record timestamps of each state's start time and end time. This information is needed to calculate the transition probability matrix.
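A minimal sketch of estimating the transition probability matrix from such a log. The record format (state, start_time, end_time) and the log values are hypothetical, and the records are assumed to be contiguous and sorted by start time:

```python
from collections import Counter, defaultdict

# Hypothetical state log: (state, start_time, end_time).
log = [
    ("Running", 0, 50),
    ("Fault", 50, 60),
    ("Running", 60, 120),
    ("Manual mode", 120, 140),
    ("Running", 140, 200),
]

def transition_matrix(records):
    """Count observed state-to-state transitions and normalize
    each row of counts into probabilities."""
    counts = defaultdict(Counter)
    for (s_from, _, _), (s_to, _, _) in zip(records, records[1:]):
        counts[s_from][s_to] += 1
    Q = {}
    for s, nxt in counts.items():
        total = sum(nxt.values())
        Q[s] = {t: c / total for t, c in nxt.items()}
    return Q

Q = transition_matrix(log)
print(Q["Running"])  # probabilities of what follows "Running"
```

With the sample log above, "Running" is followed once by "Fault" and once by "Manual mode", so its row comes out as 0.5 each; each row of the estimated matrix sums to 1.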

In order to create a Markov model, we need to define the transition probabilities for each state. In the next articles, we will look at how to define the transition probability matrix and at continuous-time fault probability.


 
