en:dydaktyka:problog:lab1 [2017/05/29 14:44]
msl
en:dydaktyka:problog:lab1 [2019/06/27 15:49]
====== Probabilistic Programming --- Medical Cases ======

This class covers use cases of Bayesian methods in the medical domain. The first part of the class is based on the article "Local computations with probabilities on graphical structures and their application to expert systems" by Steffen L. Lauritzen and David J. Spiegelhalter. The second part is inspired by "An intercausal cancellation model for Bayesian-network engineering" (International Journal of Approximate Reasoning) by S. P. Woudenberg, L. C. van der Gaag, and C. M. Rademaker.

===== Medical Diagnosis =====

In this section we will follow a simplified medical diagnosis use case, as defined in the following quote from the article.

{{ :en:dydaktyka:problog:diagnosis_story.png?nolink&400 |}}

==== Structure ====

The first task of the "knowledge engineer" is to find a structure of a Bayesian network which fits the story. There exist automatic tools to learn the structure from examples, but in this case the structure should be clear enough to create the network by hand.

=== Assignments ===

  - Draw (on paper?) a Bayesian network describing the story from the previous section.
  - Write the corresponding ProbLog program:
    - there is no need for first-order logic here
    - use arbitrary probabilities

==== Probabilities ====

The problem with the Bayesian model you've just created is that it doesn't provide any useful information, mostly because of the arbitrary prior probabilities you've used. Reality is rather harsh: often you don't have access to any realistic priors (one of the arguments raised by critics of Bayesian methods). In this section we will try to make up for that and make the network useful.

=== Learning ===

The simplest way to have realistic priors is to not have any priors at all :) In other words --- we assume we know nothing about the probabilities. In ProbLog you can state this fact by using the ''t(_)'' annotation, e.g.

<code prolog>
t(_)::smoker.
</code>

which says you know nothing about the probability of the patient being a smoker.

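The same annotation also works on rules, and ProbLog additionally accepts an initial value for a learnable probability. A sketch (variable names are the ones used in this class; the ''t(0.5)'' form only seeds the learner, the value is still learned from data):

<code prolog>
t(_)::tubercolosis :- visitedAsia.   % unknown conditional probability
t(0.5)::dyspnea :- lung_cancer.      % learnable, starting from 0.5
</code>
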
Now that we have admitted our lack of knowledge, we can start learning! In ProbLog learning can be done either with the command line tool:
<code bash>
problog lfi model.pl evidence.pl   # model file + file with learning examples
</code>
or in the on-line editor by simply choosing ''Learning'' from the list.

In both cases you have to provide learning examples, which consist simply of evidence atoms, with consecutive examples separated by a line of dashes; e.g. two different patients can be described as:

<code prolog>
evidence(smoker).
evidence(\+visitedAsia).
evidence(\+tubercolosis).
evidence(\+lung_cancer).
evidence(\+dyspnea).
evidence(\+xray_positive).
----------------
evidence(\+xray_positive).
evidence(tubercolosis).
evidence(visitedAsia).
evidence(\+lung_cancer).
evidence(dyspnea).
evidence(\+smoker).
</code>

The learning should result in a new model with new probabilities.

<WRAP center round important 60%>
If you receive an "Inconsistent Evidence" error, include leak probabilities in the model. Leak probabilities state that a random variable can take a value without any particular reason; e.g. here we state that the variable ''var'' can't be true if no reason is true.

<code prolog>
t(_)::var :- reason1.
t(_)::var :- reason2.
0.0::var. % leak probability
</code>
Make sure to include leak probabilities such that all possible evidence can be linked to a possible world (otherwise ProbLog will return an "Inconsistent Evidence" error).
</WRAP>


== Assignments: ==

  - replace all probabilities in your model with ''t(_)''
  - put some random learning data in the on-line IDE and check the results of learning
  - download {{ :en:dydaktyka:problog:patients_10000.txt |data of 100 000 patients}}
    - try to use it in the on-line IDE
    - try to use it offline (command line)
      - you may have to ask the teacher to install ProbLog for you
      - you may have to limit the number of iterations the learning takes
    - what are the learned probabilities?
    - what is the probability that a smoker with a positive x-ray has lung cancer?
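
For the last question, the learned model can be queried directly. A minimal sketch, assuming the learned model keeps the variable names used above:

<code prolog>
% P(lung_cancer | smoker, xray_positive)
evidence(smoker).
evidence(xray_positive).
query(lung_cancer).
</code>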
-  
en/dydaktyka/problog/lab1.txt · Last modified: 2019/06/27 15:49 (external edit)