One-Shot Generalization in Deep Generative Models
Danilo J. Rezende, Shakir Mohamed, Ivo Danihelka, Karol Gregor, Daan Wierstra
Google DeepMind, London
2. Varieties of Attention
Spatially-transformed attention: a more powerful approach is to use a mechanism that provides invariance to the shape and size of objects in the images (general affine transformations).
Spatial transformers (ST) process an input image x, using parameters h, and generate an output y(x, h):

y(x, h) = [κ_h(h) ⊗ κ_w(h)] ∗ x

where κ_h and κ_w are 1-dimensional kernels, ⊗ indicates the tensor outer-product of the two kernels and ∗ indicates a convolution.
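Because the two kernels are 1-dimensional, the transform can be applied as one filterbank matrix per image axis. Below is a minimal NumPy sketch of such a separable read operation, assuming a Gaussian parameterization of the kernels; the function names and the (center, stride, sigma) parameters are illustrative assumptions, not the paper's:

```python
# A minimal sketch of a separable attention read in the spirit of
# y = [kappa_h ⊗ kappa_w] * x: two 1-D Gaussian filterbanks (one per image
# axis) extract an N x N glimpse from an H x W image.
import numpy as np

def gaussian_filterbank(img_size, n, center, stride, sigma):
    """Rows are 1-D Gaussian kernels centered on a grid of n points."""
    mu = center + (np.arange(n) - n / 2 + 0.5) * stride   # kernel centers
    pos = np.arange(img_size)
    F = np.exp(-(pos[None, :] - mu[:, None]) ** 2 / (2 * sigma ** 2))
    return F / (F.sum(axis=1, keepdims=True) + 1e-8)      # normalize each row

def read(x, params, n=12):
    """Apply one 1-D kernel along each image axis: y = F_h x F_w^T."""
    cy, cx, stride, sigma = params
    H, W = x.shape
    F_h = gaussian_filterbank(H, n, cy, stride, sigma)    # (n, H)
    F_w = gaussian_filterbank(W, n, cx, stride, sigma)    # (n, W)
    return F_h @ x @ F_w.T                                # (n, n) glimpse

x = np.random.rand(28, 28)                  # a toy "image"
y = read(x, params=(14.0, 14.0, 2.0, 1.0))  # attend near the center
print(y.shape)                              # (12, 12)
```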
Inference (reading attention): spatial transformers allow the model to observe the input image in a canonical form, providing the desired invariance.
Generation (writing attention): spatial transformers handle the position, scale and rotation of parts of the generated image.
3. Iterative and Attentive Generative Models
3.1. Latent Variable Models and Variational Inference
Generative models with latent variables describe a probabilistic process by which an observed sample is generated.
Simplest formulations: PCA and factor analysis, which use Gaussian latent variables z.
Generative models with latent variables:
Deep generative models: a hierarchy of L layers of latent variables, where each layer depends on the layer above in a non-linear way. We specify this non-linear dependency using deep neural networks.
To compute the marginal probability of the data, we must integrate over any unobserved variables:

p(x) = ∫ p(x|z_1) [∏_{l=1}^{L−1} p(z_l|z_{l+1})] p(z_L) dz_1 … dz_L
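As a concrete illustration of this integral, the sketch below forms a naive Monte Carlo estimate of p(x) for a toy single-layer model (Gaussian prior, Bernoulli likelihood); the random linear "decoder" and all sizes are assumptions standing in for a trained network:

```python
# Naive Monte Carlo estimate of the marginal likelihood
# p(x) = ∫ p(x|z) p(z) dz: sample z from the prior and average p(x|z).
import numpy as np

rng = np.random.default_rng(0)
D, K, S = 784, 20, 1000           # data dim, latent dim, number of samples
W, b = rng.normal(0, 0.1, (D, K)), np.zeros(D)   # toy decoder parameters
x = (rng.random(D) < 0.5).astype(float)          # a toy binary datapoint

z = rng.standard_normal((S, K))                  # samples from p(z) = N(0, I)
logits = z @ W.T + b                             # decoder: z -> Bernoulli logits
# log p(x|z) for each sample, summed over pixels (stable Bernoulli log-prob)
log_px_z = (x * -np.logaddexp(0, -logits)
            + (1 - x) * -np.logaddexp(0, logits)).sum(1)
# log p(x) ≈ logsumexp_s(log p(x|z_s)) - log S
log_px = np.logaddexp.reduce(log_px_z) - np.log(S)
print(log_px)
```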
Deep latent Gaussian models (DLGMs) are specified by two components:
Prior distribution p(z) over the latent variables z: a Gaussian (a known distribution, composed non-linearly through the hierarchy).
Likelihood function p(x|z): appropriate for the observed data, such as a Gaussian, Bernoulli, categorical or other distribution, and dependent in a non-linear way on the latent variables.
Latent variables z; data points x.
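To make the generative direction concrete, here is an illustrative ancestral-sampling pass through a tiny two-layer deep latent Gaussian model with a Bernoulli likelihood; the tanh layers and all weights are arbitrary stand-ins for trained networks:

```python
# Ancestral sampling from a small deep latent Gaussian model: sample the
# top layer from N(0, I), propagate down through non-linear layers, and
# draw the observation from a Bernoulli likelihood.
import numpy as np

rng = np.random.default_rng(1)
dims = [784, 64, 32]                         # x, z1, z2 dimensions
W1 = rng.normal(0, 0.3, (dims[1], dims[2]))  # layer 2 -> layer 1
W0 = rng.normal(0, 0.3, (dims[0], dims[1]))  # layer 1 -> pixels

z2 = rng.standard_normal(dims[2])            # top layer: z2 ~ N(0, I)
z1 = np.tanh(W1 @ z2) + rng.standard_normal(dims[1])  # non-linear dependency
probs = 1 / (1 + np.exp(-(W0 @ np.tanh(z1))))         # Bernoulli means
x = (rng.random(dims[0]) < probs).astype(float)       # sampled observation
print(x.reshape(28, 28).sum())
```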
The integral is intractable, which motivates a variational treatment.
Variational inference transforms the difficult integration into an optimization problem that is typically more scalable and easier to solve.
Approximate the marginal likelihood by a lower bound, the negative free energy:

log p(x) ≥ E_{q(z|x)}[log p(x|z)] − KL[q(z|x) ‖ p(z)] = −F

The bound encodes a trade-off:
First term: reconstruction ability.
Second term: complexity of the approximate posterior.
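A small sketch of the two terms, assuming a diagonal-Gaussian q(z|x) against a standard-normal prior (so the KL term is analytic) and a single-sample estimate of the reconstruction term through a toy Bernoulli decoder; all weights and sizes are illustrative:

```python
# The two-term bound (negative free energy): diagonal-Gaussian
# q(z|x) = N(mu, diag(exp(log_var))) against a standard-normal prior gives
# an analytic KL; the reconstruction term is a one-sample estimate.
import numpy as np

rng = np.random.default_rng(2)
K, D = 20, 784
mu, log_var = rng.normal(0, 0.1, K), rng.normal(0, 0.1, K)  # toy q(z|x) stats
W = rng.normal(0, 0.1, (D, K))                              # toy decoder weights
x = (rng.random(D) < 0.5).astype(float)                     # toy binary datapoint

z = mu + np.exp(0.5 * log_var) * rng.standard_normal(K)     # sample z ~ q(z|x)
logits = W @ z                                              # Bernoulli logits
recon = (x * -np.logaddexp(0, -logits)
         + (1 - x) * -np.logaddexp(0, logits)).sum()        # E_q[log p(x|z)], 1 sample
kl = 0.5 * (np.exp(log_var) + mu**2 - 1 - log_var).sum()    # KL[q(z|x) || N(0, I)]
print("reconstruction:", recon, "KL:", kl, "ELBO:", recon - kl)
```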
Approximate the true posterior with a known distribution q_φ(z|x) that has variational parameters φ.
Optimize the parameters θ and φ jointly.
Amortized inference approach: represent the distribution q(z|x) as a recognition or inference model, amortizing the cost of posterior inference across data points.
Generative model: a decoder of the latent variables.
Inference network: data → encoder → latent description.
Combination: a deep latent variable model (typically latent Gaussian) with variational inference implemented using an inference network is referred to as a variational auto-encoder (VAE).
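Putting the pieces together, the sketch below runs a minimal single-datapoint VAE forward pass with untrained weights: the inference network produces the parameters of q(z|x), a reparameterized sample feeds the decoder, and the two ELBO terms are evaluated. Architecture and sizes are assumptions for illustration:

```python
# Minimal VAE forward pass: encoder (inference network) -> q(z|x) params,
# reparameterized sample -> decoder (generative model) -> ELBO.
import numpy as np

rng = np.random.default_rng(3)
D, H, K = 784, 128, 20
We, Wmu, Wlv = (rng.normal(0, 0.05, s) for s in [(H, D), (K, H), (K, H)])
Wd, Wo = rng.normal(0, 0.05, (H, K)), rng.normal(0, 0.05, (D, H))
x = (rng.random(D) < 0.5).astype(float)     # toy binary datapoint

h = np.tanh(We @ x)                         # encoder (inference network)
mu, log_var = Wmu @ h, Wlv @ h              # parameters of q(z|x)
z = mu + np.exp(0.5 * log_var) * rng.standard_normal(K)  # reparameterization
logits = Wo @ np.tanh(Wd @ z)               # decoder (generative model)
recon = (x * -np.logaddexp(0, -logits)
         + (1 - x) * -np.logaddexp(0, logits)).sum()     # reconstruction term
kl = 0.5 * (np.exp(log_var) + mu**2 - 1 - log_var).sum()  # KL term
print("ELBO:", recon - kl)
```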
3.2. Sequential Generative Models
The generative models above are single-step models: they estimate the likelihood of i.i.d. data using a non-linear, feed-forward transformation of the latent variables.
A sequential generative model: an extension of the latent variable models above, built on the VAE framework.
It combines stochastic and deterministic computation, using recursive transformations of the latent variables to form a multi-step generative process.
3.2.1. GENERATIVE MODEL
Describe the observed data over T time steps using a set of latent variables z_t at each step.
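A hedged sketch of such a multi-step sampler: at each of T steps a fresh latent z_t updates a hidden state through a recursive transformation, the state writes additively into a canvas, and the final canvas parameterizes a Bernoulli over pixels. The simple RNN cell and the additive canvas are assumptions in the spirit of DRAW-style models, not necessarily this paper's exact architecture:

```python
# Multi-step generative process: latent z_t at each step drives a
# recurrent hidden state, which accumulates writes into a canvas; the
# final canvas gives the Bernoulli means of the observation.
import numpy as np

rng = np.random.default_rng(4)
T, K, H, D = 8, 20, 100, 784
Wz, Wh = rng.normal(0, 0.1, (H, K)), rng.normal(0, 0.1, (H, H))
Ww = rng.normal(0, 0.05, (D, H))            # "write" projection to the canvas

h, canvas = np.zeros(H), np.zeros(D)
for t in range(T):
    z_t = rng.standard_normal(K)            # z_t ~ N(0, I) at each step
    h = np.tanh(Wz @ z_t + Wh @ h)          # recursive transformation
    canvas += Ww @ h                        # deterministic write step
probs = 1 / (1 + np.exp(-canvas))           # Bernoulli means from the canvas
x = (rng.random(D) < probs).astype(float)   # sampled observation
print(x.reshape(28, 28).mean())
```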