Probabilità e Incertezza di Misura (Probability and Measurement Uncertainty)
Lectures for the PhD Programmes in Physics (32nd cycle)
(G. D'Agostini)


Syllabus for the written exam

20-hour course, starting Thursday 12 January 2017


Programme

Exam format

To be decided between the following two options (perhaps even both, with the first one held mid-course and the programme redefined accordingly):

Lectures

Nr.  Day         Time         Room
1    Thu 12/01   16:00-18:00  Rasetti
2    Fri 13/01   16:00-18:00  Rasetti
3    Thu 19/01   16:00-18:00  Rasetti
4    Fri 20/01   16:00-18:00  Rasetti
5    Tue 24/01   16:00-18:00  Rasetti
6    Wed 25/01   18:00-20:00  Rasetti
7    Thu 26/01   16:00-18:00  Rasetti
8    Fri 03/02   16:00-18:00  Rasetti
9    Fri 10/02   16:00-18:00  Rasetti
10   Thu 16/02   16:00-18:00  Rasetti

Details of the lecture topics

Lecture 1 (12/1/17)
Introduction to the course
Lecture 2 (13/1/17)
Introduction to the course (part 2)
Lecture 3 (19/1/17)
Claims of discoveries based on sigmas
Lecture 4 (20/1/17)
R session — Measurements  
Lecture 5 (24/1/17)
Probabilistic inference (and forecasting)
References
Problems (AIDS test; three boxes with gold/silver rings; particle identification)
Self-study (for the moment)
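The problems above are left for self-study. Just to fix the mechanics of Bayes' theorem, a diagnostic-test calculation of the AIDS-problem type takes a few lines of R; all the numbers below are illustrative assumptions of this sketch, not those of the lecture problem:

  # P(Infected | Positive) via Bayes' theorem -- illustrative numbers only
  p_inf       <- 1/600    # assumed prior probability of being infected
  p_pos_inf   <- 0.98     # assumed P(positive | infected)
  p_pos_noinf <- 0.002    # assumed P(positive | not infected), i.e. false positives
  p_pos       <- p_pos_inf * p_inf + p_pos_noinf * (1 - p_inf)   # P(positive)
  p_pos_inf * p_inf / p_pos                                      # P(infected | positive)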
Lecture 6 (25/1/17)
Bayesian reasoning
Main references
Try to play with the R commands shown in the papers (or reproduce them in other 'human oriented' scripting languages).
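The R commands of the papers are not reproduced on this page; as a warm-up (an assumption of mine about what 'playing' could mean) one can start from R's built-in distribution functions, the d/p/q/r families:

  dpois(3, lambda = 2)      # P(X = 3) for a Poisson with expected value 2
  pnorm(2)                  # P(Z <= 2) for a standard normal
  qnorm(0.975)              # 97.5% quantile of the standard normal
  hist(rnorm(10000), breaks = 50)   # histogram of 10000 standard normal draws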

Probability distributions — a useful vademecum
Please come to the next lecture with the 'Probability distributions' app installed (there is also an Apple version).

 
Lecture 7 (26/1/17)
Bayesian inference: applications (+ other topics)
Assignments (please try!)
  • Analysing the structure of the Poisson distribution (with parameter λ) and of the Gamma distribution (in the variable λ) (a numerical check is sketched after this list),
    • find the update rule of the Gamma (expressed as a function of λ), given x observed counts;
    • find the expressions of
      • f(λ|x)
      • E[λ|x]
      • Var(λ|x)
      • σ(λ|x)
      in the special case in which α=1 and β=0.
  • Inference of the μ of a Gaussian model, given the observed value x and assuming σ well known:
    • Assuming a uniform prior, i.e. f0(μ) = const,
      • find the expression of f(μ|x,σ),
      • find the expression of the expected value and the 'standard uncertainty' of μ.
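A brute-force numerical check of the first assignment can be done on a grid; the value of x and the grid below are arbitrary choices of this sketch, and the Gamma parameters follow the α=1, β=0 special case mentioned above:

  # posterior of the Poisson lambda given x counts, starting from a flat prior
  x      <- 5                         # observed counts (arbitrary)
  dl     <- 0.01
  lambda <- seq(0, 30, by = dl)       # grid of lambda values
  post   <- dpois(x, lambda)          # likelihood times flat prior
  post   <- post / sum(post * dl)     # normalise numerically
  plot(lambda, post, type = "l")
  lines(lambda, dgamma(lambda, shape = x + 1, rate = 1), lty = 2)  # compare by eye
  sum(lambda * post * dl)                                  # ~ E[lambda | x]
  sum(lambda^2 * post * dl) - (sum(lambda * post * dl))^2  # ~ Var(lambda | x)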
Other references
  • As far as the technical properties of the various probability distributions are concerned, see e.g. the lecture notes Probabilità e incertezze di misura, Parte 2 and Parte 3 (in Italian, but math is math, and the names are similar).
     
Lecture 8 (3/2/17)
Bayesian inference: applications (+ other topics)
• More on the inference of the Bernoulli p and of the Poisson λ.
• More on conjugate priors.
• Poisson process and related distributions: Poisson, exponential, Erlang → Gamma.
• Summary of the distributions derived from the Bernoulli process.
• Inequalities of Markov and Chebyshev (not relevant for the course).
• More on the misunderstandings of Bernoulli's theorem.
• 'No memory' property of the geometric and exponential distributions.
• Generalities about probability distributions of continuous variables (a reminder).
• Triangular distributions (symmetric and asymmetric) and their role in modelling uncertainties due to 'systematic errors'.
• Decays: from the probabilistic model to the (approximate) deterministic model described by the famous differential equation whose solution is the exponential (see the sketch after this list).
• Gaussian distribution
  • A short reminder.
  • The reason why many distributions 'tend to' a Gaussian, thanks to the Central Limit Theorem: reproductive property under addition.
  • Properties of the -log of the Gaussian pdf:
    • a parabola with minimum at μ and second derivative equal to 1/σ².
  • The "Gaussian trick" (not an official name, and anyway due to Laplace) to evaluate in a simple way (just derivatives, instead of integrals!) the μ and σ of a pdf 'assumed to be' (on general grounds or by visual inspection) approximately Gaussian (a compact formulation is sketched after this list).
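For the item on decays, the link between the probabilistic and the (approximate) deterministic description can be summarised as follows (a sketch in standard notation, with τ the lifetime; it is not copied from the lecture material):

  % exponential survival probability of a single nucleus
  P(T > t) = e^{-t/\tau}
  % for N_0 \gg 1 independent nuclei, the expected number still undecayed at time t
  N(t) \simeq N_0\, e^{-t/\tau},
  \qquad \frac{\mathrm{d}N}{\mathrm{d}t} = -\,\frac{N}{\tau}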
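The 'Gaussian trick' itself can be written compactly as follows (a sketch of the usual Laplace-style expansion, with a generic variable x and φ(x) = -ln f(x)):

  % expand phi around its minimum x_0 (phi'(x_0) = 0)
  \varphi(x) \simeq \varphi(x_0) + \tfrac{1}{2}\,\varphi''(x_0)\,(x - x_0)^2
  % hence f(x) \propto e^{-\varphi(x)} is approximately Gaussian, with
  \mathrm{E}[x] \approx x_0,
  \qquad \sigma(x) \approx \left[\varphi''(x_0)\right]^{-1/2}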

References

Assignments

• Problem 4 of the entry test.
• Problem 7 of the entry test.
• Apply the "Gaussian trick" to the following cases:
  • f(λ|x), assuming a flat pdf for λ;
  • f(p|n,x), assuming a flat pdf for p.
  (Note: multiplicative factors are irrelevant, because the 'trick' is based on the log.)
• Install JAGS and rjags and test the installation by executing the following simple scripts (a minimal test is also sketched below) → R scripts can be executed e.g. by source("simpleMC_1.R").
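The scripts referred to above (e.g. simpleMC_1.R) are not reproduced on this page; just to check that JAGS and rjags talk to each other, a minimal toy script of this kind should run (model and numbers are arbitrary assumptions of this sketch, not the course scripts):

  # toy check of the JAGS + rjags installation
  library(rjags)
  model <- "model {
      tau <- pow(sigma, -2)        # JAGS parametrises the normal via the precision
      x  ~ dnorm(mu, tau)
      mu ~ dnorm(0, 1.0E-6)        # very vague prior on mu
  }"
  jm <- jags.model(textConnection(model), data = list(x = 1.5, sigma = 1.0))
  update(jm, 1000)                                              # burn-in
  chain <- coda.samples(jm, variable.names = "mu", n.iter = 10000)
  summary(chain)                                                # mu ~ 1.5 +- 1.0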
     
Lecture 9 (10/2/17)
Multidimensional problems
• Multiple variables ('uncertain vectors'): generalities for discrete and continuous variables (including the extension of Bayes' theorem).
• Covariance and correlation coefficient.
• Independence vs null covariance.
• Bivariate normal distribution: joint, marginal and conditional distributions → problem 7 of the entry test.
• Matrix form of the bivariate normal distribution and its extension to the multivariate case: covariance matrix.
• 'Gaussian trick' in many dimensions.
• Linear combinations of uncertain variables: expected values, variances and covariances (also taking into account possible covariances between the 'input quantities'):
  → compact form ('transformation of covariance matrices'), see the formulas after this list.
• Linearisation: the matrix C becomes the matrix of the derivatives evaluated at the expected values of the input quantities.
• General solution for the uncertainty about functions of uncertain variables: discrete and continuous case (with the Dirac delta), also recalled after this list.
• Warning about propagating "best values" and "confidence intervals".
• Details on the inference of the parameters of a Gaussian model:
  • single measurement → μ;
  • role of the prior; Gaussian prior → "combination of results" (see the formula after this list);
  • predictive distribution: problem 3 of the entry test (taking the arithmetic average as an 'equivalent single measurement', by virtue of sufficiency);
  • inferring μ from a sample → statistical sufficiency of the arithmetic average;
  • joint inference of μ and σ from a sample:
    • general ideas;
    • approximate results for large samples (application of the 'Gaussian trick', details left as an exercise).
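In formulas, the linear propagation and the general rule recalled in the list read as follows (a compact sketch; C is the matrix of the linear combination or, after linearisation, of the derivatives):

  % linear combination Y = C X of the uncertain vector X
  \mathrm{E}[\mathbf{Y}] = \mathbf{C}\,\mathrm{E}[\mathbf{X}],
  \qquad \mathbf{V}_Y = \mathbf{C}\,\mathbf{V}_X\,\mathbf{C}^{\mathrm{T}}
  % after linearisation: C_{ij} = \partial Y_i / \partial X_j, evaluated at E[X]
  % exact rule for a generic Y = g(X), continuous case:
  f_Y(y) = \int \delta\!\left(y - g(\mathbf{x})\right) f_X(\mathbf{x})\,\mathrm{d}\mathbf{x}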
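For the 'combination of results', the standard outcome for independent Gaussian results on the same μ with a flat (or very vague) prior is (a reminder, not a derivation):

  % results x_i with standard uncertainties sigma_i
  \mathrm{E}[\mu\,|\,\mathbf{x}] = \frac{\sum_i x_i/\sigma_i^2}{\sum_i 1/\sigma_i^2},
  \qquad \sigma^2(\mu\,|\,\mathbf{x}) = \left(\sum_i \frac{1}{\sigma_i^2}\right)^{-1}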

References

     
Lecture 10 (16/2/17)
Systematics and fits
• Uncertainty due to 'systematics':
  • reminder of the ISO terminology: influence quantities;
  • general strategies to include the uncertainties due to uncertain influence quantities;
  • detailed case of an uncertain offset;
  • correlations among results affected by common systematics (see the formulas after this list).
• Handling the uncertainty due to systematics by transformation: detailed case of offset and scale uncertainty.
• Fits
  • Building up the model as a Bayesian network.
  • Simplest case: linear model between the true values, with a bunch of assumptions/approximations.
  • Exact solution of the 'simplest model' with known 'σ's.
  • Example with JAGS (a sketch is given after this list).
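For the correlations induced by a common offset, the key relations can be sketched as follows (Z is the uncertain common offset, assumed independent of the individual readings):

  % two results corrected by the same uncertain offset: Y_i = X_i + Z
  \sigma^2(Y_i) = \sigma_i^2 + \sigma_Z^2,
  \qquad \mathrm{Cov}(Y_1, Y_2) = \sigma_Z^2,
  \qquad \rho(Y_1, Y_2) = \frac{\sigma_Z^2}{\sqrt{(\sigma_1^2 + \sigma_Z^2)\,(\sigma_2^2 + \sigma_Z^2)}}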
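The JAGS example itself is not reproduced here; a minimal sketch of the 'simplest model' (linear dependence, known σ, vague priors) in rjags could look like the following, with simulated data and arbitrary numbers:

  # toy linear fit with known sigma -- a sketch, not the script shown in class
  library(rjags)
  model <- "model {
      for (i in 1:N) {
          y[i] ~ dnorm(m * x[i] + q, pow(sigma, -2))   # known sigma
      }
      m ~ dnorm(0, 1.0E-6)    # vague priors on slope and intercept
      q ~ dnorm(0, 1.0E-6)
  }"
  x     <- 1:10
  sigma <- 0.5
  y     <- 2 + 1.5 * x + rnorm(length(x), 0, sigma)   # simulated data (m = 1.5, q = 2)
  jm <- jags.model(textConnection(model),
                   data = list(x = x, y = y, N = length(x), sigma = sigma))
  update(jm, 1000)
  chain <- coda.samples(jm, variable.names = c("m", "q"), n.iter = 10000)
  summary(chain)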

References

