Using Markov Models of Fault Growth Physics and Environmental Stresses to Optimize Control Actions

Shared by Miryam Strautkalns, updated on Apr 12, 2013


Author(s):
B. Bole, K. Goebel, G. Vachtsevanos

A contrived example of a dice-throwing game was considered in order to provide insight into the general problem of developing prognostics-based control routines that use uncertain models of component fault dynamics and future environmental stresses to assess and mitigate risk. A generalized Markov modeling representation of fault dynamics was developed for the case in which available modeling of fault growth physics and available modeling of future environmental stresses are represented by two independent Markov process models. A finite-horizon dynamic programming algorithm was given for a Markov decision process representation of the prognostics-based control problem, and this algorithm was used to identify an optimal control policy for the dice-throwing game considered in this paper. The outcomes obtained from simulations of the optimizing control policy were observed to differ only slightly from the outcomes that would have been achievable if all modeling uncertainties were removed from the example game.
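The finite-horizon dynamic programming step the abstract refers to can be sketched as backward induction over a Markov decision process: starting from the end of the horizon, each stage's optimal value and action are computed from the next stage's values. The sketch below is illustrative only, not the paper's implementation; the function name `finite_horizon_dp` and the toy two-state, two-action MDP are assumptions made for demonstration.

```python
import numpy as np

def finite_horizon_dp(P, R, T):
    """Backward-induction dynamic programming for a finite-horizon MDP.

    P: transition model, shape (A, S, S); P[a, s, s'] = Pr(s' | s, a)
    R: expected rewards, shape (A, S);    R[a, s] = reward for action a in state s
    T: number of decision stages

    Returns (V, policy): V[t, s] is the optimal value-to-go at stage t,
    policy[t, s] the optimal action index at stage t in state s.
    """
    A, S, _ = P.shape
    V = np.zeros((T + 1, S))            # V[T] = 0: no terminal reward assumed
    policy = np.zeros((T, S), dtype=int)
    for t in range(T - 1, -1, -1):
        # Q[a, s] = immediate reward + expected value of the successor state
        Q = R + P @ V[t + 1]            # shape (A, S)
        policy[t] = Q.argmax(axis=0)
        V[t] = Q.max(axis=0)
    return V, policy

# Toy example: two states, two actions ("stay" = 0, "swap" = 1).
P = np.array([
    [[1.0, 0.0], [0.0, 1.0]],   # action 0: remain in the current state
    [[0.0, 1.0], [1.0, 0.0]],   # action 1: move to the other state
])
R = np.array([
    [0.0, 1.0],                 # action 0 earns reward 1 only in state 1
    [0.0, 0.0],                 # action 1 earns nothing
])
V, policy = finite_horizon_dp(P, R, T=3)
# From state 0 the optimal first move is to swap into the rewarding state.
```

In the setting the abstract describes, the state would be the product of the fault-growth Markov chain and the environmental-stress Markov chain (their independence lets the joint transition matrix be built from the two component models), with the same backward induction applied over the combined state space.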




