8 results for markov chain
Or search for: -markov chain-, *markov chain*

Royal Institute Coined Terms
Markov chain — ลูกโซ่มาร์คอฟ [mathematics, 19 Jul 2004]

WordNet (3.0)
markov chain(n) a Markov process for which the parameter is discrete time values, Syn. Markoff chain

Collaborative International Dictionary (GCIDE)
Markov chain

n. [ after A. A. Markov, Russian mathematician, b. 1856, d. 1922. ] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as diffusion of a molecule in a fluid, are modelled as a Markov chain. See also random walk. [ Also spelled Markoff chain. ] [ PJC ]
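The GCIDE definition above — discrete states whose next-step probabilities depend only on the current state, independent of the path taken — can be sketched in a few lines of Python. The two-state "weather" chain below is purely illustrative (the states and probabilities are invented for this example, not taken from any dictionary entry):

```python
import random

# Illustrative two-state chain; states and probabilities are made up.
STATES = ("sunny", "rainy")
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Pick the next state using only the current state's transition row —
    this is the Markov property: no earlier history is consulted."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, n_steps, seed=0):
    """Generate a path of n_steps transitions from the start state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path
```

Note that `step` receives only the current state, never the full path — which is exactly the "independent of the path by which the preceding state was reached" clause in the definition.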


EDICT JP-EN Dictionary
マルコフ連鎖[マルコフれんさ, marukofu rensa] (n) { comp } Markov chain
マルコフ連鎖モンテカルロ法[マルコフれんさモンテカルロほう, marukofu rensa montekaruro hou] (n) { comp } Markov chain Monte Carlo methods
遷移確率[せんいかくりつ, sen'ikakuritsu] (n) { comp } transition probability (e.g. in a Markov chain)
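The last entry's transition probabilities (遷移確率) for a finite chain are conventionally collected into a row-stochastic matrix, where entry P[i][j] is the probability of moving from state i to state j and each row sums to 1; multi-step probabilities then come from matrix powers. A minimal sketch, with an invented 2x2 matrix used only for illustration:

```python
# Illustrative row-stochastic transition matrix (values are made up):
# P[i][j] = probability of moving from state i to state j.
P = [
    [0.8, 0.2],
    [0.4, 0.6],
]

def mat_mul(a, b):
    """Multiply two square matrices given as plain lists of lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Two-step transition probabilities are the entries of P squared:
# P2[i][j] = sum over intermediate states k of P[i][k] * P[k][j].
P2 = mat_mul(P, P)
```

Each row of `P2` still sums to 1, so the two-step matrix is itself a valid transition matrix.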

COMPDICT JP-EN Dictionary
マルコフ連鎖[マルコフれんさ, marukofu rensa] Markov chain
遷移確率[せんいかくりつ, sen'ikakuritsu] transition probability (e.g. in a Markov chain)

Longdo Dict -- https://dict.longdo.com/