Markov processes | กระบวนการมาร์คอฟ (Thai: "Markov process") [TU Subject Heading] |
Markov process | (n) a stochastic process in which the probability distribution of future states depends only on the present state, not on the sequence of states by which the process arrived there, Syn. Markoff process |
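The defining property above (the next state depends only on the current state) can be sketched in a short simulation. The two-state "weather" chain and its transition probabilities below are illustrative assumptions, not part of the dictionary entry.

```python
import random

# Hypothetical two-state chain: the transition probabilities out of
# each state depend only on that state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the current state."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Generate a path of n_steps transitions from the start state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        # Note: only path[-1] (the present state) is consulted;
        # the earlier history of the path is irrelevant.
        path.append(step(path[-1], rng))
    return path

path = simulate("sunny", 5)
print(path)
```

Note that `step` never looks at the history, only at the current state; that is exactly the restriction the definition describes.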
Markov process | n. [after A. A. Markov (1856-1922), Russian mathematician] |
マルコフ過程 | [マルコフかてい, marukofu katei] (n) Markov process