CS 505 Spring 2021 Assignment 1 (100 points)
Probability Basics

Due 11:59 PM EST, February 7. Submit in Blackboard by 11:59 PM EST, February 7.
Please indicate the names of those you collaborate with.
Every late day will reduce your score by 20 points. After 2 days (i.e., if you submit on the 3rd day after the due date), it will be marked 0.
For all questions, you must show how you derive your answer.

Problem 1. (2 pts) Use conditional probability, P(S|T) = P(S, T) / P(T), to prove the chain rule:

    P(S, T, U) = P(S|T, U) P(T|U) P(U).

Problem 2. (8 pts) Suppose you are locked out of the main ship, as Dave was by the ship's AI system HAL; but this time, HAL is giving you a chance to return to the main ship safely. Without your knowledge, HAL has connected a tunnel to your space pod that can lead you back to the main ship, but he did not tell you which of the three hatchways in your pod leads to this tunnel. If you open a wrong hatchway, you will be sucked out of the pod and into space without your space helmet! HAL asked you to choose one of the three hatchways, but at the last second before you open that hatchway, he told you which hatchway (among the two you did not pick) leads to your death (perhaps he's being nice? Or maybe he wants to make you waver and not survive!). The question now is whether you should stick with your current hatchway or switch to the other, still-unidentified one. What should you do to have the highest probability of reaching the tunnel and getting back to the main ship safe and sound, to finally disconnect HAL?

Consider the event that the hatchway you originally picked leads to safety, then consider the probability that you get out safely given that you decide to switch or not. Work the probability out using conditional probability. You must show how you arrive at your choice.

Problem 3. (10 pts) Let A and B be random variables and f be a function. Prove that E[f(A)] = E[E(f(A)|B)].

Problem 4.
(10 pts) Suppose we have a sample of real values s1, s2, s3, ..., sn, each sampled from the p.d.f.

    p(s) = θe^(−θs)  if s ≥ 0
           0         elsewhere,

where θ is an unknown parameter. Derive the maximum likelihood estimate of θ. Assume that all si in our sample are larger than 1.

Problem 5. (3 pts) Suppose we have 3 variables S, T, U. If

    P(U|S) = 0.9
    P(U|T) = 0.6

can you compute P(U|S, T)? If not, just write "there's not enough information."

Problem 6. (4 pts) If instead we have the following information:

    P(U|S) = 0.9    P(U|T) = 0.6
    P(S) = 0.4      P(T) = 0.5

can you compute P(U|S, T)? If not, just write "there's not enough information."

Problem 7. (3 pts) If instead we have the following information:

    P(U, S) = 0.3    P(S) = 0.5    P(T) = 1

can we compute P(U|S, T)? If not, just write "there's not enough information."

Problem 8. (5 pts) Suppose there is a gathering of people wearing different color shirts. Two thirds of the people wearing green shirts are laughing, and one tenth of the gathering consists of green-shirt-wearing people. Only one in five of the other people, those not wearing green shirts, are laughing. What's the probability that a randomly chosen person in the gathering is a laughing, green-shirt-wearing person?

Problem 9. (5 pts) Two thirds of the baby animals in this world are fluffy. If a baby animal is fluffy, then it is more likely cute! That is, the probability that a baby animal is cute given that it is fluffy is 0.8, while the probability that it is cute given that it is not fluffy is 0.1. A randomly picked baby animal is cute! What is the probability that it is fluffy?

Problem 10. (5 pts) One third of all animals in a poorly maintained zoo in Indonesia are hungry. A hungry animal is more likely to be cranky. A cranky animal is more likely to be scary. What's the probability that a hungry, scary animal is cranky?
Given that the probability of an animal being cranky given that it is hungry is 0.7, while the probability of an animal being cranky given that it is not hungry is 0.1; and the probability of an animal being scary given that it is cranky is 0.9, while the probability of an animal being scary given that it is not cranky is 0.05.

Problem 11. (10 pts) Given:

    P(A) = 1/2
    P(B|A) = 1/10     P(B|not A) = 1/2
    P(C|A) = 0        P(C|not A) = 1/2
    P(D|B) = 1/2      P(D|not B) = 1

and D is conditionally independent of A given B, and C is conditionally independent of D given A. Compute P(C, D).

Problem 12. Consider two machines programmed to generate compliments, each of which outputs one of 6 words, and a game where a user can press a button on the machine and out comes a compliment to make his/her day. The higher (i.e., the more positive) the compliment is, the higher the user's satisfaction will be. The satisfaction scores for the words are (in order from least positive to most positive): adequate (score: 1), fine (score: 2), nice (score: 3), good (score: 4), great (score: 5), fantastic (score: 6). One of the machines has not developed its own intelligence, so it works as programmed (i.e., the vanilla machine) and always selects one of the 6 words uniformly at random, while the other (i.e., the AI machine) has developed an understanding that users will love it much more and keep it around longer if it always gives the highest compliment, so it much prefers to give the highest compliment to users each time:

    P(word) = 1/3   if word = fantastic
              2/15  if word = adequate, fine, nice, good, great

A professor has decided to purchase these two machines to compliment his students; but because he doesn't always want to give them the highest compliment, he uses another machine (i.e., the mood machine) that will, depending on the professor's mood, choose to press the button of the vanilla machine or the AI one. When the professor is not feeling great, the mood machine will, with probability m, push the button of the vanilla machine.

a.
(5 pts) What is the expectation of the satisfaction score (in terms of m)?
b. (10 pts) What is the variance of the score (in terms of m)?
c. (4 pts) To generalize the above, we can think of a sample space containing several distributions: Pk(W) = P(W|M = k), k = 1, ..., n (i.e., the two compliment machines above), where M also has a distribution P(M = k) (i.e., the mood machine). Formulate P(W) in terms of Pk(W) and P(M).
d. (8 pts) Formulate E(W) in terms of E(W|M). Simplify your answer as much as possible.
e. (8 pts) Formulate Var(W) in terms of Var(W|M) and E(W|M). Simplify your answer as much as possible.
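Parts d and e of Problem 12 concern the laws of total expectation and total variance. As an optional sanity check on your derivations (not a substitute for them), the Python sketch below enumerates the two machine distributions from Problem 12 exactly using Fraction, builds the mood-machine mixture for an example value m = 2/5, and verifies that E(W) = E[E(W|M)] and Var(W) = E[Var(W|M)] + Var(E(W|M)). The names `vanilla`, `ai`, and `mixture` are our own labels, not part of the assignment.

```python
from fractions import Fraction as F

SCORES = [1, 2, 3, 4, 5, 6]
vanilla = {w: F(1, 6) for w in SCORES}                     # uniform machine
ai = {w: F(1, 3) if w == 6 else F(2, 15) for w in SCORES}  # prefers "fantastic"

def mean_var(pmf):
    """Mean and variance of a finite pmf given as {value: probability}."""
    mean = sum(w * p for w, p in pmf.items())
    var = sum(p * (w - mean) ** 2 for w, p in pmf.items())
    return mean, var

def mixture(m):
    """Mood machine: press the vanilla button with probability m, AI with 1 - m."""
    return {w: m * vanilla[w] + (1 - m) * ai[w] for w in SCORES}

m = F(2, 5)  # example value of m; any value in [0, 1] works
mix_mean, mix_var = mean_var(mixture(m))

# Law of total expectation: E(W) = E[E(W|M)]
# Law of total variance:    Var(W) = E[Var(W|M)] + Var(E(W|M))
weights = [m, 1 - m]
conds = [mean_var(vanilla), mean_var(ai)]
e_cond = sum(wt * mu for wt, (mu, _) in zip(weights, conds))
ev = sum(wt * v for wt, (_, v) in zip(weights, conds))
ve = sum(wt * (mu - e_cond) ** 2 for wt, (mu, _) in zip(weights, conds))

assert mix_mean == e_cond
assert mix_var == ev + ve
```

Because everything is computed with exact rationals, the two decompositions match the direct enumeration exactly rather than only up to floating-point error.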