4 results for markov process
/ม้า (ร) โข่ว ฝึ พร้า เซะ สึ/     /mˈɑːrkəʊv prˈɑːsˌes/

Thai Terminology Glossary (NSTDA)
Markov processes — กระบวนการมาร์คอฟ [TU Subject Heading]

WordNet (3.0)
markov process(n) a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state, Syn. Markoff process

Collaborative International Dictionary (GCIDE)
Markov process

n. [ after A. A. Markov, Russian mathematician, b. 1856, d. 1922. ] (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It is distinguished from a Markov chain in that the states of a Markov process may be continuous as well as discrete. [ Also spelled Markoff process. ] [ PJC ]
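The defining property above — that the distribution of the next state depends only on the current state, not on the path taken to reach it — can be illustrated with a small simulation. The following is a minimal sketch of a discrete-state (Markov chain) case; the two weather states and their transition probabilities are illustrative assumptions, not part of the dictionary entry.

```python
import random

# Transition probabilities: each row depends only on the current state.
# (Illustrative values, chosen for this sketch.)
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the current state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, n, seed=0):
    """Generate a trajectory of n transitions from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(simulate("sunny", 5))
```

A Markov *process* in the GCIDE sense generalizes this by allowing a continuous state space (e.g. Brownian motion), but the history-independence shown in `step` is the same.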


EDICT JP-EN Dictionary
マルコフ過程[マルコフかてい, marukofu katei] (n) Markov process

Longdo Dict -- https://dict.longdo.com/