Signal Detection and Estimation: 2017 Midterm Exam with Solutions

Midterm Solution
EE251: Signal Detection and Parameter Estimation
2016 Fall

Problem 1:

(a): Method 1: To use LMMSE here, we treat Y as the noise in the model. The Bayesian Gauss-Markov theorem requires zero-mean noise, so we recenter the model as

$$Z - \tfrac12 = X + \left(Y - \tfrac12\right) = X + w \qquad (1)$$

Since X and Y are uniform on [0, 1], each has mean $\tfrac12$ and variance $\tfrac{1}{12}$. With $H = 1$ and $C_{XX} = C_w = \tfrac{1}{12}$,

$$\hat X_{\mathrm{LMMSE}} = E[X] + \left(C_{XX}^{-1} + H^T C_w^{-1} H\right)^{-1} H^T C_w^{-1}\left(Z - \tfrac12 - H\,E[X]\right) = \tfrac12 + (12 + 12)^{-1}\cdot 12\cdot\left(Z - \tfrac12 - \tfrac12\right) = \frac{Z}{2} \qquad (2)$$

and its corresponding mean-square error is

$$E\big[(X - \hat X)^2\big] = \left(C_{XX}^{-1} + H^T C_w^{-1} H\right)^{-1} = (12 + 12)^{-1} = \frac{1}{24} \qquad (3)$$

Method 2:

$$\hat X_{\mathrm{LMMSE}} = E[X] + C_{XZ}\,C_{ZZ}^{-1}\left(Z - E[Z]\right) \qquad (4)$$

We need to calculate $C_{XZ}$ and $C_{ZZ}$:

$$C_{XZ} = E\big[(X - E[X])(Z - E[Z])\big] = E\big[(X - \tfrac12)(X + Y - 1)\big] = E\big[X^2 + XY - X - \tfrac12 X - \tfrac12 Y + \tfrac12\big] = \tfrac13 + \tfrac14 - \tfrac12 - \tfrac14 - \tfrac14 + \tfrac12 = \frac{1}{12} \qquad (5)$$

$$C_{ZZ} = E\big[(Z - E[Z])^2\big] = E\big[X^2 + 2XY + Y^2 - 2Z + 1\big] = \tfrac13 + \tfrac12 + \tfrac13 - 2 + 1 = \frac16 \qquad (6)$$

Then we can derive the LMMSE estimate as

$$\hat X_{\mathrm{LMMSE}} = E[X] + C_{XZ}\,C_{ZZ}^{-1}(Z - E[Z]) = \frac12 + \frac{1}{12}\cdot 6\cdot(Z - 1) = \frac{Z}{2} \qquad (7)$$

and its corresponding mean-square error is

$$E\big[(X - \hat X)^2\big] = C_{XX} - C_{XZ}\,C_{ZZ}^{-1}\,C_{ZX} = \frac{1}{12} - \frac{1}{12}\cdot 6\cdot\frac{1}{12} = \frac{1}{24} \qquad (8)$$

(b):

$$\hat X_{\mathrm{MMSE}} = E[X \mid Z] \qquad (9)$$

We first need the posterior $f(x \mid z)$. Writing $u(\cdot)$ for the unit step,

$$f(x) = u(x)\,u(1 - x), \qquad f(z \mid x) = u(z - x)\,u(x + 1 - z) \qquad (10)$$

Now we can calculate $f(x \mid z)$ as

$$f(x \mid z) = \frac{f(z \mid x)\,f(x)}{\int f(z \mid x)\,f(x)\,dx} = \frac{u(x)\,u(1 - x)\,u(z - x)\,u(x + 1 - z)}{\int_{\max(z-1,\,0)}^{\min(1,\,z)} 1\,dx} = \frac{u(x)\,u(1 - x)\,u(z - x)\,u(x + 1 - z)}{\min(1, z) - \max(z - 1, 0)} \qquad (11)$$

The denominator is the pdf of Z.

[Figure 1: the pdf of Z, triangular on [0, 2].]

If you want to treat the two ranges of z separately here, the two cases of $f(x \mid z)$ are shown below:

[Figure 2: the pdf of X given Z for 0 ≤ z ≤ 1. Figure 3: the pdf of X given Z for 1 ≤ z ≤ 2.]

Lastly, we derive the estimate $\hat X_{\mathrm{MMSE}}$:

$$E[X \mid Z = z] = \int x\,f(x \mid z)\,dx = \frac{1}{\min(1,z) - \max(z-1,0)}\int_{\max(z-1,0)}^{\min(1,z)} x\,dx = \frac{\tfrac12 x^2\big|_{\max(z-1,0)}^{\min(1,z)}}{\min(1,z) - \max(z-1,0)} = \frac{\min(1,z) + \max(z-1,0)}{2} = \begin{cases} \dfrac{z}{2}, & 0 \le z \le 1 \\[4pt] \dfrac{1 + z - 1}{2}, & 1 \le z \le 2 \end{cases} = \frac{z}{2} \qquad (12)$$

Therefore $\hat X_{\mathrm{MMSE}} = Z/2$ and its corresponding MSE is $\tfrac{1}{24}$.

Problem 4:

Problem 2:

(a): We can calculate the posterior density as

$$f(x \mid N = n) = \frac{P(N = n \mid x)\,f_X(x)}{\int_0^\infty P(N = n \mid x)\,f_X(x)\,dx} = \frac{x^n e^{-(1+a)x}}{\int_0^\infty x^n e^{-(1+a)x}\,dx} \qquad (13)$$

Then let us calculate the denominator, integrating by parts repeatedly:

$$\int_0^\infty x^n e^{-(1+a)x}\,dx = -\int_0^\infty \frac{x^n}{1+a}\,d\!\left(e^{-(1+a)x}\right) = -\frac{1}{1+a}\,x^n e^{-(1+a)x}\Big|_0^\infty + \frac{n}{1+a}\int_0^\infty x^{n-1} e^{-(1+a)x}\,dx = \cdots = \frac{n!}{(1+a)^{n+1}} \qquad (14)$$

Therefore the posterior density is

$$f(x \mid N = n) = \frac{(1+a)^{n+1}\,x^n\,e^{-(1+a)x}}{n!} \qquad (15)$$

(b):

$$\hat X = E[X \mid N] = \int_0^\infty x\,\frac{(1+a)^{N+1}\,x^N e^{-(1+a)x}}{N!}\,dx = \frac{(1+a)^{N+1}}{N!}\int_0^\infty x^{N+1} e^{-(1+a)x}\,dx = \frac{(1+a)^{N+1}}{N!}\cdot\frac{(N+1)!}{(1+a)^{N+2}} = \frac{N+1}{a+1} \qquad (16)$$

(c): The MAP estimate of X is given by

$$\hat X = \arg\max_x f(x \mid N) \qquad (17)$$

First, the log-posterior is

$$\ln f(x \mid N) = (N+1)\ln(1+a) + N\ln x - (1+a)x - \ln(N!) \qquad (18)$$

Setting $\partial \ln f(x \mid N)/\partial x = 0$, we can derive the estimate:

$$\frac{N}{x} - (1+a) = 0 \;\Longrightarrow\; \hat X_{\mathrm{MAP}} = \frac{N}{1+a} \qquad (19)$$

Problem 3:

(a): We know that the pmf of Z is

$$P(Z = k;\,p) = \binom{n}{k} p^k (1-p)^{n-k} = \frac{n!}{k!\,(n-k)!}\,p^k (1-p)^{n-k} \qquad (20)$$

The log-likelihood function $L(Z; p)$ is given by

$$L(Z; p) = \ln P(Z; p) = \ln\frac{n!}{Z!\,(n-Z)!} + Z\ln p + (n - Z)\ln(1 - p) \qquad (21)$$

Since we want the maximum-likelihood estimate of p, we solve $\partial L(Z;p)/\partial p = 0$:

$$\frac{Z}{p} - \frac{n - Z}{1 - p} = 0 \;\Longrightarrow\; \hat p_{\mathrm{ML}} = \frac{Z}{n} \qquad (22)$$

(b): Using the binomial moments $E[Z] = np$ and $\mathrm{var}(Z) = np(1-p)$, we can solve the problem easily:

$$E[\hat p_{\mathrm{ML}}] = \frac{E[Z]}{n} = \frac{np}{n} = p \qquad (23)$$

so the ML estimate is unbiased. And then we calculate the MSE:

$$E\big[(\hat p_{\mathrm{ML}} - p)^2\big] = \frac{E[Z^2]}{n^2} - p^2 = \frac{\mathrm{var}(Z) + (E[Z])^2}{n^2} - p^2 = \frac{np(1-p) + n^2 p^2}{n^2} - p^2 = \frac{p(1-p)}{n} \qquad (24)$$

However, if you do not already know these binomial moments, you can derive them yourself (writing $q = 1 - p$):

$$E[Z] = \sum_{k=0}^n k\,\frac{n!}{k!\,(n-k)!}\,p^k q^{n-k} = \sum_{k=1}^n \frac{n!}{(k-1)!\,(n-k)!}\,p^k q^{n-k} \overset{y\,=\,k-1}{=} \sum_{y=0}^{n-1} \frac{n\,(n-1)!}{y!\,(n-1-y)!}\,p^{y+1} q^{n-1-y} = np\,(p + q)^{n-1} = np \qquad (25)$$

$$E[Z^2 - Z] = \sum_{k=0}^n k(k-1)\,\frac{n!}{k!\,(n-k)!}\,p^k q^{n-k} = \sum_{k=2}^n \frac{n!}{(k-2)!\,(n-k)!}\,p^k q^{n-k} \overset{y\,=\,k-2}{=} \sum_{y=0}^{n-2} \frac{n(n-1)\,(n-2)!}{y!\,(n-2-y)!}\,p^{y+2} q^{n-2-y} = n(n-1)\,p^2\,(p+q)^{n-2} = n(n-1)\,p^2 \qquad (26)$$

Then $E[Z^2] = n(n-1)p^2 + np$.

Problem 4:

(a): Since the random variables $Y_k$ are independent, the joint pmf of the N observations is

$$P(Y_1, Y_2, \ldots, Y_N;\,p) = \prod_{k=1}^N P(Y_k;\,p) = p^N (1-p)^{\sum_{k=1}^N Y_k} \qquad (27)$$

so $S = \sum_{k=1}^N Y_k$ is a sufficient statistic for estimating p. The generating function of S (with $q = 1 - p$) can be calculated as follows:

$$G_S(z) = E\big[e^{zS}\big] = E\Big[\prod_{k=1}^N e^{zY_k}\Big] \overset{(*)}{=} \prod_{k=1}^N E\big[e^{zY_k}\big] = \big(G_Y(z)\big)^N = \frac{p^N}{(1 - q e^z)^N} \qquad (28)$$

The equality $(*)$ relies on the independence of the $Y_k$. If you are unsure of this conclusion, you can also use the iterated-expectation property $E[A] = E\big[E[A \mid B]\big]$, where B can be any conditioning variable:

$$E\Big[\prod_{k=1}^N e^{zY_k}\Big] = E\Big[E\Big[\prod_{k=1}^N e^{zY_k}\,\Big|\,Y_1, \ldots, Y_{N-1}\Big]\Big] = E\Big[\prod_{k=1}^{N-1} e^{zY_k}\,E\big[e^{zY_N}\big]\Big] = E\big[e^{zY_N}\big]\,E\Big[\prod_{k=1}^{N-1} e^{zY_k}\Big] = \cdots = \prod_{k=1}^N E\big[e^{zY_k}\big] \qquad (29)$$

Substituting $e^z \to z^{-1}$ turns $G_S$ into $p^N (1 - qz^{-1})^{-N}$. According to the definition of the generating function, we can then write

$$G_S = E\big[z^{-S}\big] = \sum_{n=0}^\infty P(S = n)\,z^{-n} \qquad (30)$$

So if we regard $P(S = n)$ as a time-domain sequence $x[n]$, then $p^N (1 - qz^{-1})^{-N}$ is its Z-transform:

$$x[n] = \mathcal{Z}^{-1}\left\{\frac{p^N}{(1 - qz^{-1})^N}\right\} = p^N\,\mathcal{Z}^{-1}\left\{\frac{1}{(1 - qz^{-1})^N}\right\} \qquad (31)$$

Denote $X_k(z) = (1 - qz^{-1})^{-k}$ with corresponding inverse Z-transform $x_k[n]$; for simplicity, we leave out the factor $u[n]$ in the following. Using the differentiation property $n\,x[n] \leftrightarrow -z\,dX(z)/dz$:

$$X_1(z) = \frac{1}{1 - qz^{-1}} = \frac{z}{z - q} \;\xrightarrow{\;\mathcal{Z}^{-1}\;}\; x_1[n] = q^n$$

$$-z\,\frac{dX_1(z)}{dz} = \frac{qz}{(z-q)^2} \;\xrightarrow{\;\mathcal{Z}^{-1}\;}\; n\,q^n$$

$$X_2(z) = \frac{z}{q}\cdot\frac{qz}{(z-q)^2} \;\xrightarrow{\;\mathcal{Z}^{-1}\;}\; x_2[n] = (n+1)\,q^n$$

$$-z\,\frac{dX_2(z)}{dz} = \frac{2qz^2}{(z-q)^3} \;\xrightarrow{\;\mathcal{Z}^{-1}\;}\; n(n+1)\,q^n$$
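As a cross-check on Problem 1 (not part of the original solution), a short Monte-Carlo simulation confirms that the estimate Z/2 attains mean-square error 1/24 ≈ 0.0417 when X and Y are independent U(0, 1); the sample size and seed below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, n)      # X ~ U(0, 1)
y = rng.uniform(0.0, 1.0, n)      # Y ~ U(0, 1), independent of X
z = x + y                         # observed Z = X + Y

x_hat = z / 2.0                   # the LMMSE/MMSE estimate derived above
mse = np.mean((x - x_hat) ** 2)
print(mse)                        # close to 1/24 ~ 0.04167
```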
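Problem 2's posterior mean (N + 1)/(1 + a) can be sanity-checked the same way by sampling the hierarchical model X ~ Exp(a), N | X ~ Poisson(X), then averaging X within each observed value of N; the choice a = 2 and the sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
a = 2.0
m = 2_000_000
x = rng.exponential(1.0 / a, m)   # X ~ Exp(a), i.e. f(x) = a * exp(-a x)
n_obs = rng.poisson(x)            # N | X = x ~ Poisson(x)

# empirical E[X | N = n] versus the closed form (n + 1) / (1 + a)
for n in range(4):
    emp = x[n_obs == n].mean()
    print(n, emp, (n + 1) / (1 + a))
```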
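The unbiasedness and MSE results of Problem 3(b) are also easy to verify numerically; n = 50 and p = 0.3 below are arbitrary test values.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, trials = 50, 0.3, 1_000_000
z = rng.binomial(n, p, trials)    # Z ~ Binomial(n, p)
p_ml = z / n                      # ML estimate from eq. (22)

bias = p_ml.mean() - p            # near 0: the estimate is unbiased
mse = np.mean((p_ml - p) ** 2)    # near p(1-p)/n = 0.0042
print(bias, mse)
```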
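Finally, the inverse-Z-transform result of Problem 4 can be checked for N = 2: it gives P(S = n) = p^N x_N[n] = p^2 (n + 1) q^n, the negative binomial pmf. A quick simulation with the (arbitrary) choice p = 0.4 agrees; note that NumPy's geometric sampler starts at 1, so we shift by one to get the pmf p q^y on {0, 1, 2, ...} implied by eq. (27).

```python
import numpy as np

rng = np.random.default_rng(3)
p, q, N, m = 0.4, 0.6, 2, 1_000_000
# Y_k on {0, 1, 2, ...} with P(Y = y) = p * q**y
y = rng.geometric(p, size=(m, N)) - 1
s = y.sum(axis=1)                 # sufficient statistic S = Y_1 + Y_2

# empirical P(S = n) versus p**2 * (n + 1) * q**n from x_2[n] = (n+1) q^n
for n in range(5):
    print(n, np.mean(s == n), p**N * (n + 1) * q**n)
```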
