Markov Chains


4.1 INTRODUCTION AND EXAMPLES

Consider a stochastic process {Xn, n = 0, 1, 2, ...} that takes on a finite or countable number of possible values. Unless otherwise mentioned, this set of possible values will be denoted by the set of nonnegative integers {0, 1, 2, ...}. If Xn = i, then the process is said to be in state i at time n. We suppose that whenever the process is in state i, there is a fixed probability Pij that it will next be in state j. That is, we suppose that

(4.1.1)    P{Xn+1 = j | Xn = i, Xn-1 = in-1, ..., X1 = i1, X0 = i0} = Pij

for all states i0, i1, ..., in-1, i, j and all n ≥ 0. Such a stochastic process is known as a Markov chain. Equation (4.1.1) may be interpreted as stating that, for a Markov chain, the conditional distribution of any future state Xn+1, given the past states X0, X1, ..., Xn-1 and the present state Xn, is independent of the past states and depends only on the present state. This is called the Markovian property. The value Pij represents the probability that the process will, when in state i, next make a transition into state j. Since probabilities are nonnegative and since the process must make a transition into some state, we have that

Pij ≥ 0 for all i, j ≥ 0,    and    Σj Pij = 1,   i = 0, 1, ....

Let P denote the matrix of one-step transition probabilities Pij, so that

    P = | P00  P01  P02  ... |
        | P10  P11  P12  ... |
        |  .    .    .       |
        | Pi0  Pi1  Pi2  ... |
        |  .    .    .       |
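The matrix P contains everything needed to generate sample paths of the chain. Below is a minimal Python sketch, not part of the original text, that checks the two conditions above for a finite transition matrix and then simulates a path; the 3-state matrix, the function name, and numpy as a dependency are illustrative assumptions.

import numpy as np

def simulate_markov_chain(P, x0, n_steps, seed=None):
    """Simulate n_steps transitions of a finite-state Markov chain with
    one-step transition matrix P, started in state x0."""
    P = np.asarray(P, dtype=float)
    # Each row of P must be a probability distribution over the next state:
    # Pij >= 0 and sum_j Pij = 1 for every i.
    assert (P >= 0).all() and np.allclose(P.sum(axis=1), 1.0)
    rng = np.random.default_rng(seed)
    path = [x0]
    for _ in range(n_steps):
        # The next state is drawn from the row indexed by the current state;
        # the earlier history of the path is never consulted (Markovian property).
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

# Arbitrary 3-state transition matrix, for illustration only.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.0, 0.4, 0.6]]
print(simulate_markov_chain(P, x0=0, n_steps=10))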

EXAMPLE 4.1(A)  The M/G/1 Queue. Suppose that customers arrive at a service center in accordance with a Poisson process with rate λ. There is a single server, and those arrivals finding the server free go immediately into service; all others wait in line until their service turn. The service times of successive customers are assumed to be independent random variables having a common distribution G, and they are also assumed to be independent of the arrival process. The above system is called the M/G/1 queueing system. The letter M stands for the fact that the interarrival distribution of customers is exponential, G for the service distribution; the number 1 indicates that there is a single server.

If we let X(t) denote the number of customers in the system at time t, then {X(t), t ≥ 0} would not possess the Markovian property that the conditional distribution of the future depends only on the present and not on the past. For if we knew the number in the system at time t, then, to predict future behavior, we would not care how much time had elapsed since the last arrival (since the arrival process is memoryless), but we would care how long the person in service had already been there (since the service distribution G is arbitrary and therefore not memoryless).

As a means of getting around this dilemma, let us look at the system only at the moments when customers depart. That is, let Xn denote the number of customers left behind by the nth departure, n ≥ 1. Also, let Yn denote the number of customers arriving during the service period of the (n+1)st customer. When Xn > 0, the nth departure leaves behind Xn customers, of which one enters service and the other Xn - 1 wait in line. Hence, at the next departure the system will contain those Xn - 1 customers that were in line in addition to any arrivals during the service time of the (n+1)st customer. Since a similar argument holds when Xn = 0, we see that

(4.1.2)    Xn+1 = Xn - 1 + Yn   if Xn > 0,
           Xn+1 = Yn            if Xn = 0.

Since the Yn, n ≥ 1, represent the numbers of arrivals in nonoverlapping service intervals, it follows, the arrival process being a Poisson process, that they are independent and

(4.1.3)    P{Yn = j} = ∫_0^∞ e^{-λx} (λx)^j / j!  dG(x),   j = 0, 1, ....

From (4.1.2) and (4.1.3) it follows that {Xn, n = 1, 2, ...} is a Markov chain with transition probabilities given by

    P0j = ∫_0^∞ e^{-λx} (λx)^j / j!  dG(x),                 j ≥ 0,
    Pij = ∫_0^∞ e^{-λx} (λx)^{j-i+1} / (j-i+1)!  dG(x),     j ≥ i - 1,  i ≥ 1,
    Pij = 0   otherwise.
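To make (4.1.2) and (4.1.3) concrete, the following sketch (not part of the original text) evaluates P{Yn = j} by numerical integration and uses it to simulate the embedded departure chain. The arrival rate, the choice of an exponential distribution for G, the truncation level j_max, the function names, and the numpy/scipy dependencies are all illustrative assumptions.

import numpy as np
from scipy.integrate import quad
from scipy.stats import poisson

LAM = 1.0   # assumed arrival rate λ of the Poisson process
MU = 1.5    # assumed rate of G, taken here to be an exponential distribution

def p_y(j):
    """Evaluate (4.1.3) numerically: P{Yn = j} = ∫_0^∞ e^{-λx}(λx)^j/j! dG(x),
    with dG(x) = μ e^{-μx} dx under the assumed exponential G."""
    integrand = lambda x: poisson.pmf(j, LAM * x) * MU * np.exp(-MU * x)
    return quad(integrand, 0, np.inf)[0]

def simulate_departure_chain(n_steps, x0=0, j_max=80, seed=0):
    """Simulate (4.1.2): X_{n+1} = Xn - 1 + Yn if Xn > 0, and X_{n+1} = Yn if Xn = 0."""
    rng = np.random.default_rng(seed)
    probs = np.array([p_y(j) for j in range(j_max)])
    probs /= probs.sum()              # renormalize the truncated distribution of Yn
    path = [x0]
    for _ in range(n_steps):
        y = rng.choice(j_max, p=probs)
        path.append(path[-1] - 1 + y if path[-1] > 0 else y)
    return path

print(simulate_departure_chain(20))

The same p_y values are exactly the entries P0j, and shifting them across each row gives Pij for i ≥ 1, so a truncated transition matrix for the embedded chain can be assembled directly from them.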

EXAMPLE 4.1(B)  The G/M/1 Queue. Suppose that customers arrive at a single-server service center in accordance with an arbitrary renewal process having interarrival distribution G. Suppose further that the service distribution is exponential with rate μ. If we let Xn denote the number of customers in the system as seen by the nth arrival, it is easy to see that the process {Xn, n ≥ 1} is a Markov chain. To compute the transition probabilities Pij for this Markov chain, let us first note that, as long as there are customers to be served, the number of services in any length of time t is a Poisson random variable with mean μt. This is true since the time between successive services is exponential and, as we know, this implies that the number of services constitutes a Poisson process.
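This Poisson claim can be checked by simulation. The sketch below (not part of the original text; the service rate μ, the window length t, and the number of replications are arbitrary choices) counts how many exponential(μ) service times complete by time t, assuming the server is never idle during the window, and compares the empirical frequencies with the Poisson(μt) probabilities.

import numpy as np
from scipy.stats import poisson

MU, T, N_RUNS = 2.0, 3.0, 100_000   # assumed service rate, window length, replications
rng = np.random.default_rng(0)

counts = np.empty(N_RUNS, dtype=int)
for r in range(N_RUNS):
    # Successive service times are i.i.d. exponential(MU); count completions by time T.
    total, k = 0.0, 0
    while True:
        total += rng.exponential(1.0 / MU)
        if total > T:
            break
        k += 1
    counts[r] = k

for j in range(6):
    print(j, round((counts == j).mean(), 4), round(float(poisson.pmf(j, MU * T)), 4))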

Therefore,

    Pi,i+1-j = ∫_0^∞ e^{-μt} (μt)^j / j!  dG(t),   j = 0, 1, ..., i,

which follows since, if an arrival finds i in the system, then the next arrival will find i + 1 minus the number served, and the probability that j will be served is easily seen (by conditioning on the time between the successive arrivals) to equal the right-hand side of the above. The formula for Pi0 is a little different (it is the probability that at least i + 1 Poisson events occur in a random length of time having distribution G) and thus is given by

    Pi0 = ∫_0^∞ Σ_{k=i+1}^∞ e^{-μt} (μt)^k / k!  dG(t),   i ≥ 0.
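As a consistency check on these formulas, the sketch below (not part of the original text; the service rate μ, the choice of G as an exponential distribution, the truncation level, and the numpy/scipy dependencies are illustrative assumptions) builds the transition matrix rows of the chain seen by arrivals, computing Pi,i+1-j by numerical integration and Pi0 from the tail formula, and verifies that each row sums to 1.

import numpy as np
from scipy.integrate import quad
from scipy.stats import poisson

MU = 2.0        # assumed exponential service rate μ
G_RATE = 1.0    # assumed: G taken to be an exponential(1) interarrival distribution

def g_density(t):
    return G_RATE * np.exp(-G_RATE * t)

def p_served(j):
    """Pi,i+1-j = ∫_0^∞ e^{-μt}(μt)^j/j! dG(t): exactly j services between two arrivals."""
    return quad(lambda t: poisson.pmf(j, MU * t) * g_density(t), 0, np.inf)[0]

def transition_row(i, n_cols):
    """Row i of the transition matrix: states i+1, i, ..., 1 via p_served, state 0 via Pi0."""
    row = np.zeros(n_cols)
    for j in range(i + 1):
        row[i + 1 - j] = p_served(j)
    # Pi0: at least i+1 services are completed during an interarrival time.
    row[0] = quad(lambda t: poisson.sf(i, MU * t) * g_density(t), 0, np.inf)[0]
    return row

n = 5                                               # truncate the state space for display
P = np.array([transition_row(i, n + 2) for i in range(n + 1)])
print(np.allclose(P.sum(axis=1), 1.0, atol=1e-6))   # each row should be a probability distribution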

Remark. The reader should note that in the previous two examples we were able to discover an embedded Markov chain ...
