IFT6269-A2021 Prof: Simon Lacoste-Julien Hwk 5 Dec 22, 2021
Solutions by S. Chandar, G. Huang and J. Gallego

1. Cautionary tale about importance sampling (10 points)

Suppose that we wish to estimate the normalizing constant $Z_p$ for an un-normalized Gaussian $\tilde{p}(x) = \exp\!\left(-\frac{1}{2\sigma_p^2} x^2\right)$. In other words, we have $p(\cdot) \sim \mathcal{N}(\cdot \mid 0, \sigma_p^2)$ with $p(x) = \tilde{p}(x)/Z_p$. Given $N$ i.i.d. samples $x^{(1)}, \dots, x^{(N)}$ from a standard normal $q(\cdot) \sim \mathcal{N}(0, 1)$, consider the importance sampling estimate:
$$\hat{Z} = \frac{1}{N} \sum_{i=1}^{N} \frac{\tilde{p}(x^{(i)})}{q(x^{(i)})}.$$

(a) Show that $\hat{Z}$ is an unbiased estimator of $Z_p$.
(b) Letting $f(x) := \tilde{p}(x)/q(x)$, show that $\mathrm{Var}(\hat{Z}) = \frac{1}{N}\mathrm{Var}(f(X))$ whenever $\mathrm{Var}(f(X)) < \infty$.
(c) For what values of $\sigma_p^2$ is this variance finite? Hint: keep your variances non-negative!

Solution:

(a)
$$\mathbb{E}_q[\hat{Z}] = \mathbb{E}_q\!\left[\frac{1}{N} \sum_{i=1}^{N} \frac{\tilde{p}(x^{(i)})}{q(x^{(i)})}\right] = \frac{1}{N} \sum_{i=1}^{N} \mathbb{E}_q\!\left[\frac{p(x^{(i)})\, Z_p}{q(x^{(i)})}\right] = \frac{1}{N} \sum_{i=1}^{N} \int_{\mathbb{R}} q(x^{(i)}) \, \frac{p(x^{(i)})\, Z_p}{q(x^{(i)})} \, dx^{(i)} = \frac{1}{N} \sum_{i=1}^{N} Z_p \int_{\mathbb{R}} p(x^{(i)}) \, dx^{(i)} = \frac{1}{N} \sum_{i=1}^{N} Z_p = Z_p,$$
where the $q(x^{(i)})$ factors cancel inside the integral.

(b) Since the $x^{(i)}$ are i.i.d.:
$$\mathrm{Var}\!\left[\frac{1}{N} \sum_{i=1}^{N} f(x^{(i)})\right] = \frac{1}{N^2} \sum_{i=1}^{N} \mathrm{Var}[f(X)] = \frac{1}{N^2}\, N\, \mathrm{Var}[f(X)] = \frac{1}{N} \mathrm{Var}[f(X)].$$

(c) $\mathrm{Var}[\hat{Z}]$ is finite when $\mathrm{Var}[f(X)] = \mathrm{Var}[\tilde{p}(X)/q(X)]$ is finite:
$$\mathrm{Var}\!\left[\frac{\tilde{p}(X)}{q(X)}\right] = \mathbb{E}_q\!\left[\frac{\tilde{p}(X)^2}{q(X)^2}\right] - \mathbb{E}_q\!\left[\frac{\tilde{p}(X)}{q(X)}\right]^2 = \int_{\mathbb{R}} \frac{\tilde{p}(x)^2}{q(x)} \, dx - Z_p^2 = \int_{\mathbb{R}} \sqrt{2\pi} \exp\!\left(\frac{\sigma_p^2 - 2}{2\sigma_p^2}\, x^2\right) dx - Z_p^2.$$
This expression is finite if and only if the integral is finite, i.e. if and only if the coefficient of $x^2$ in the exponent is negative. Thus we need $\sigma_p^2 < 2$.

2. Gibbs sampling and mean field variational inference (10 points)

Consider the Ising model with binary variables $X_s \in \{0, 1\}$ and a factorization of the form:
$$p(x; \eta) = \frac{1}{Z_p} \exp\!\left(\sum_{s \in V} \eta_s x_s + \sum_{\{s,t\} \in E} \eta_{st} x_s x_t\right).$$

(a) Derive the Gibbs sampling updates for this model.
(b) Consider a fully factorized approximation $q$. Use the notation $q(X_s = 1) = \tau_s$. Derive an expression for $\mathrm{KL}(q\|p) - \log(Z_p)$. Note: we cannot directly compute the value of the KL divergence, as it depends on the intractable quantity $\log(Z_p)$. When we subtract it, the remainder $\mathrm{KL}(q\|p) - \log(Z_p)$ can be computed efficiently.
(c) Derive the naive mean field updates, based on a fully factorized approximation. More specifically, we do cyclic coordinate descent on $\mathrm{KL}(q\|p)$, sequentially updating the parameter $\tau_s \in [0, 1]$. Note that $Z_p$ is a constant with respect to the parameters $\tau_s$.
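Before turning to the solutions, the importance sampling estimator from Problem 1 can be checked numerically. The following is a minimal sketch, not part of the assignment; the choice $\sigma_p^2 = 1.5$ (safely below the $\sigma_p^2 < 2$ threshold established in part (c)) and the sample size are illustrative assumptions:

```python
# Minimal sketch: estimate Z_p by importance sampling with proposal q = N(0, 1).
# sigma2_p and n_samples are illustrative choices, not from the assignment.
import numpy as np

def importance_sampling_Z(sigma2_p, n_samples, rng):
    """Estimate Z_p = sqrt(2 * pi * sigma2_p) using samples from q = N(0, 1)."""
    x = rng.standard_normal(n_samples)
    p_tilde = np.exp(-x**2 / (2.0 * sigma2_p))       # un-normalized target density
    q = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)   # standard normal density
    return np.mean(p_tilde / q)                      # Z_hat = (1/N) sum p_tilde/q

rng = np.random.default_rng(0)
sigma2_p = 1.5                     # variance of the estimator is finite: sigma2_p < 2
Z_hat = importance_sampling_Z(sigma2_p, 100_000, rng)
Z_true = np.sqrt(2.0 * np.pi * sigma2_p)
print(Z_hat, Z_true)
```

With $\sigma_p^2$ close to or above 2, the same estimator becomes unreliable or has infinite variance, which is the cautionary point of the problem.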
Solution:

(a) For Gibbs sampling, we compute the conditional probability $p(x_s \mid x_{-s})$ in order to sample a new value of $x_s$:
$$p(x_s = 1 \mid x_{-s}) = \frac{p(x_s = 1, x_{-s})}{p(x_s = 1, x_{-s}) + p(x_s = 0, x_{-s})} = \frac{1}{1 + \frac{p(x_s = 0,\, x_{-s})}{p(x_s = 1,\, x_{-s})}} = \sigma\!\left(\log \frac{p(x_s = 1, x_{-s})}{p(x_s = 0, x_{-s})}\right) = \sigma\!\left(\eta_s + \sum_{\{s,t\} \in E} \eta_{st} x_t\right),$$
where $\sigma(u) = 1/(1 + e^{-u})$.

(b) The fully factorized approximation is
$$q(x) = \prod_{s \in V} q(x_s) = \prod_{s \in V} \tau_s^{x_s} (1 - \tau_s)^{1 - x_s}.$$
The KL divergence is given by:
$$\mathrm{KL}(q\|p) = \int q(x) \log \frac{q(x)}{p(x)} \, dx = -\int q(x) \log p(x) \, dx + \int q(x) \log q(x) \, dx = H(q, p) - H(q).$$
Let us begin with the second term:
$$-H(q) = \mathbb{E}_q[\log q(x)] = \mathbb{E}_q\!\left[\sum_{s \in V} x_s \log \tau_s + (1 - x_s) \log(1 - \tau_s)\right] = \sum_{s \in V} \mathbb{E}_q[x_s] \log \tau_s + (1 - \mathbb{E}_q[x_s]) \log(1 - \tau_s) = \sum_{s \in V} \tau_s \log \tau_s + (1 - \tau_s) \log(1 - \tau_s).$$
Now, recall that by independence of $x_s$ and $x_t$ under $q$, $\mathbb{E}_q[x_s x_t] = \mathbb{E}_q[x_s]\,\mathbb{E}_q[x_t]$. Thus, we can express the first term as:
$$H(q, p) = \mathbb{E}_q[-\log p(x)] = \mathbb{E}_q\!\left[-\sum_{s \in V} \eta_s x_s - \sum_{\{s,t\} \in E} \eta_{st} x_s x_t\right] + \log(Z_p) = -\sum_{s \in V} \eta_s \tau_s - \sum_{\{s,t\} \in E} \eta_{st} \tau_s \tau_t + \log(Z_p).$$
Combining the two terms:
$$\mathrm{KL}(q\|p) - \log(Z_p) = -\sum_{s \in V} \eta_s \tau_s - \sum_{\{s,t\} \in E} \eta_{st} \tau_s \tau_t + \sum_{s \in V} \tau_s \log \tau_s + (1 - \tau_s) \log(1 - \tau_s).$$
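The Gibbs update $p(x_s = 1 \mid x_{-s}) = \sigma(\eta_s + \sum_t \eta_{st} x_t)$ derived in (a) can be sketched in code. A minimal illustration, where the 3-node chain, the $\eta$ values, and the number of sweeps are all assumptions chosen for the demo (the very large positive $\eta_s$ simply drives every node to 1, so the behavior is easy to check):

```python
# Minimal sketch of the Gibbs update for a small Ising model.
# Graph, eta values, and sweep count are illustrative assumptions.
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def gibbs_sweep(x, eta_s, eta_st, neighbors, rng):
    """One full sweep: resample each x_s from p(x_s | x_{-s}) =
    sigmoid(eta_s + sum over neighbors t of eta_st * x_t)."""
    for s in range(len(x)):
        logit = eta_s[s] + sum(eta_st[frozenset((s, t))] * x[t]
                               for t in neighbors[s])
        x[s] = 1 if rng.random() < sigmoid(logit) else 0
    return x

rng = np.random.default_rng(0)
neighbors = {0: [1], 1: [0, 2], 2: [1]}               # 3-node chain
eta_s = [100.0, 100.0, 100.0]                         # strong positive node bias
eta_st = {frozenset((0, 1)): 1.0, frozenset((1, 2)): 1.0}
x = [0, 0, 0]
for _ in range(10):
    x = gibbs_sweep(x, eta_s, eta_st, neighbors, rng)
print(x)
```

With these extreme biases the chain mixes to the all-ones state immediately; for moderate $\eta$ the sweeps produce approximate samples from $p(x; \eta)$.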
(c) To obtain a coordinate descent method, we analytically solve the optimization problem for one variable at a time (keeping the others fixed):
$$\frac{\partial\, \mathrm{KL}(q\|p)}{\partial \tau_s} = -\eta_s - \sum_{\{s,t\} \in E} \eta_{st} \tau_t + \log \tau_s + 1 - \log(1 - \tau_s) - 1 = 0$$
$$\log \frac{\tau_s}{1 - \tau_s} = \eta_s + \sum_{\{s,t\} \in E} \eta_{st} \tau_t$$
Solving for $\tau_s$, the mean field update rule is:
$$\hat{\tau}_s = \sigma\!\left(\eta_s + \sum_{\{s,t\} \in E} \eta_{st} \tau_t\right)$$

3. Implementation of Gibbs sampling and mean field variational inference (20 points)

Instructions: https://drive.google.com/file/d/1BruL_rMBFMs1mTeBUxJEM4rWbH1hjTI4/view?usp=sharing
Solution: https://drive.google.com/file/d/1HoO-EJgcsqglJ8HHAp92o4ihlq_tVd2g/view?usp=sharing
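The official instructions and solution for Problem 3 are in the links above. As an unofficial sketch, the mean field update $\hat{\tau}_s = \sigma(\eta_s + \sum_t \eta_{st}\tau_t)$ from 2(c) might be implemented as below; the 3-node chain and the $\eta$ values are illustrative assumptions, not from the homework:

```python
# Minimal sketch of naive mean field via cyclic coordinate descent.
# Graph and eta values are illustrative assumptions.
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def mean_field(eta_s, eta_st, neighbors, n_iters=200):
    """Cyclic coordinate descent on KL(q||p):
    tau_s <- sigmoid(eta_s + sum over neighbors t of eta_st * tau_t)."""
    tau = np.full(len(eta_s), 0.5)          # uniform initialization
    for _ in range(n_iters):
        for s in range(len(tau)):
            tau[s] = sigmoid(eta_s[s] + sum(eta_st[frozenset((s, t))] * tau[t]
                                            for t in neighbors[s]))
    return tau

neighbors = {0: [1], 1: [0, 2], 2: [1]}                   # 3-node chain
eta_s = [1.0, -1.0, 1.0]
eta_st = {frozenset((0, 1)): 0.5, frozenset((1, 2)): 0.5}
tau = mean_field(eta_s, eta_st, neighbors)
print(tau)
```

Each update is the exact minimizer of the objective in that coordinate, so the (computable) quantity $\mathrm{KL}(q\|p) - \log(Z_p)$ decreases monotonically across sweeps.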