====== Lab 5 - Bayesian Networks ======

You can use one of the following tools to build your networks:
  * http://www.aispace.org/bayes/bayes.jnlp (using Java Web Start)
  * http://www.cs.cmu.edu/~javabayes/ (a slightly older Java applet)

The following laws of probability might be helpful:
  * Negation: <latex>$P(\neg A) = 1 - P(A)$</latex>
  * Product rule: <latex>$P(A \land B) = P(A|B) * P(B)$</latex>
  * Marginalization/summing out: <latex>$P(A) = \sum_b P(A \land b) = P(A \land B) + P(A \land \neg B)$</latex>
  * Independence 1: <latex>$P(A|B,C) = P(A|C)$</latex>, if <latex>$A$</latex> and <latex>$B$</latex> are independent given <latex>$C$</latex>.
  * Independence 2: <latex>$P(A \land B|C) = P(A|C) * P(B|C)$</latex>, if <latex>$A$</latex> and <latex>$B$</latex> are independent given <latex>$C$</latex>. (This follows from Independence 1 and the product rule.)

All of these laws still hold with arbitrary additional conditions, as long as every probability carries the same condition:
  * Product rule: <latex>$P(A \land B|C) = P(A|B,C) * P(B|C)$</latex>
  * Marginalization/summing out: <latex>$P(A|C) = \sum_b P(A \land b|C) = P(A \land B|C) + P(A \land \neg B|C)$</latex>
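A quick way to sanity-check these laws is to evaluate them on a small joint distribution. The sketch below (in Python; the joint probabilities are arbitrary illustrative assumptions) verifies the negation, conjunction, and marginalization identities numerically:

```python
# Tiny made-up joint distribution P(A, B) over two binary variables.
# All numbers are illustrative assumptions.
p = {
    (True, True): 0.2,
    (True, False): 0.3,
    (False, True): 0.1,
    (False, False): 0.4,
}

def p_a(a):
    # Marginal P(A = a), summing out B
    return sum(v for (ai, b), v in p.items() if ai == a)

def p_b(b):
    # Marginal P(B = b), summing out A
    return sum(v for (a, bi), v in p.items() if bi == b)

def p_a_given_b(a, b):
    # Conditional P(A = a | B = b) from the joint
    return p[(a, b)] / p_b(b)

# Negation: P(not A) = 1 - P(A)
assert abs(p_a(False) - (1 - p_a(True))) < 1e-9
# Conjunction: P(A and B) = P(A|B) * P(B)
assert abs(p[(True, True)] - p_a_given_b(True, True) * p_b(True)) < 1e-9
# Marginalization: P(A) = P(A and B) + P(A and not B)
assert abs(p_a(True) - (p[(True, True)] + p[(True, False)])) < 1e-9
```

The same checks go through unchanged for any joint distribution whose entries sum to 1.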
  
==== Problem 1: Smelly Doors ====
  * it can check whether the room smells bad or not and use that information as an indicator.
  
  - Design a very simple Bayesian network (2 nodes) that the NPC can use to decide, when in front of a door, whether he should open it or not. Start by inventing reasonable probabilities for the relation between the contents of the room and its smell.
  - How probable is it that the room contains a monster if the air smells bad?
  - Argue how, instead of using made-up probabilities, the NPC could learn as he opens doors and dynamically update the Bayesian network, becoming smarter over time.
Hand in:
  - the Bayesian network (e.g., a screenshot and the probability tables for all nodes of the network)
  - the answer to question 2 above
  - a few sentences on how the NPC could learn the necessary probabilities
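As a sketch of the kind of two-node network the problem asks for (Monster -> Smell), the Python below computes the posterior by summing out and applying Bayes' rule. All probabilities are invented purely for illustration; they are assumptions, not the intended answer:

```python
# Two-node network: Monster -> Smell.
# All numbers below are invented illustrative assumptions.
p_monster = 0.3               # prior P(monster in room)
p_smell_given_monster = 0.9   # P(bad smell | monster)
p_smell_given_no_monster = 0.2  # P(bad smell | no monster)

# P(smell), summing out the Monster variable
p_smell = (p_smell_given_monster * p_monster
           + p_smell_given_no_monster * (1 - p_monster))

# Bayes' rule: P(monster | smell) = P(smell | monster) * P(monster) / P(smell)
p_monster_given_smell = p_smell_given_monster * p_monster / p_smell

# For learning (question 3), the NPC could replace each of these numbers
# with an observed frequency as it opens doors, e.g. the fraction of
# opened rooms that contained a monster, and the fraction of monster
# rooms that smelled bad.
```

With these particular made-up numbers the posterior works out to 0.27 / 0.41, i.e. roughly 0.66; your own invented tables will of course give different values.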
  
{{  :public:t-622-arti-09-1:treasurechest.jpg|}}
  
You are a seasoned tomb raider and have spent the last week rummaging through an old pirate cove full of treasure. So far you have opened 100 chests, and of those, 50 have in fact contained treasure! Of these 50, 40 were trapped, and you sustained some painful damage from opening them. Of these 40 trapped chests, 28 were also locked. Of the 10 untrapped chests, three were locked. One would think that only chests with treasure would be trapped, but these pirates were truly nasty: they also put traps on chests with no treasure. Of the 50 chests containing no treasure, 20 were trapped! You forgot how many of the chests without treasure were locked, but you believe that the ratios were similar to those for the chests with treasure.
  
You have now discovered a new chest that you haven't seen before. When you take a careful look, you notice that it is locked. What is the chance that this chest contains treasure? What is the chance that it is trapped? You are not feeling so good after all the previous traps, so will it be worth opening this chest if your life is on the line?
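One way to organize this computation is to tabulate counts over the three variables (treasure, trapped, locked) and read conditional probabilities off the table. In the Python sketch below, the locked counts for the treasure-less chests are assumptions derived from the "similar ratios" hint (28/40 locked among trapped chests, 3/10 among untrapped ones); if you assume different ratios, the results change accordingly:

```python
# Counts over (treasure, trapped, locked), from the problem statement.
counts = {
    (True,  True,  True):  28,  # trapped treasure chests that were locked
    (True,  True,  False): 12,  # 40 trapped - 28 locked
    (True,  False, True):  3,
    (True,  False, False): 7,   # 10 untrapped - 3 locked
    (False, True,  True):  14,  # ASSUMED: 20 * (28/40), per the ratio hint
    (False, True,  False): 6,
    (False, False, True):  9,   # ASSUMED: 30 * (3/10), per the ratio hint
    (False, False, False): 21,
}

# Condition on the evidence: the new chest is locked.
locked = sum(n for (tre, tra, lo), n in counts.items() if lo)
treasure_and_locked = sum(n for (tre, tra, lo), n in counts.items() if tre and lo)
trapped_and_locked = sum(n for (tre, tra, lo), n in counts.items() if tra and lo)

p_treasure_given_locked = treasure_and_locked / locked
p_trapped_given_locked = trapped_and_locked / locked
```

Note that this direct tabulation should agree with whatever Bayesian network you build from the same counts, which is a useful consistency check on your network's probability tables.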