Considering Gaussian decoder for Variational autoencoders
I am trying to implement a variational autoencoder for real-valued data, where both the encoder and the decoder are modeled as multivariate Gaussians. I have found several implementations online for the case where the encoder is Gaussian and the decoder is Bernoulli, but nothing for the Gaussian decoder case. For a Bernoulli decoder, the reconstruction loss can be defined as follows:
reconstr_loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=x, logits=x_out_logit)
where x_out_logit is produced by a DNN. I am not sure how to write the reconstruction loss for the Gaussian case. I assumed the decoder should likewise output a mean (gz_mean) and a log-variance (gz_log_sigma_sq), similar to the Gaussian encoder, and since the reconstruction loss is the Gaussian log-probability of the data, I defined it as
mvn = tf.contrib.distributions.MultivariateNormalDiag(
    loc=self.gz_mean,
    scale_diag=tf.sqrt(tf.exp(self.gz_log_sigma_sq)))
reconstr_loss = tf.log(1e-20 + mvn.prob(self.x))
However, this loss does not seem to work: mvn.prob(self.x) is always zero, no matter the training step. Please let me know of any ideas, or point me to a GitHub source that handles this case.
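For reference, the underflow can be reproduced without TensorFlow. The following is a minimal NumPy sketch (names and dimensions are illustrative, not from the original code) showing that the density of even a moderately high-dimensional diagonal Gaussian is far below the smallest representable float, so evaluating `prob` and then taking a log always yields zero, while summing per-dimension log-densities stays finite:

```python
import numpy as np

def diag_gaussian_log_prob(x, mean, log_sigma_sq):
    """Numerically stable log-density of a diagonal Gaussian.

    Sums the per-dimension log-densities instead of multiplying
    densities, which would underflow in high dimensions.
    """
    return np.sum(
        -0.5 * (np.log(2.0 * np.pi) + log_sigma_sq
                + (x - mean) ** 2 / np.exp(log_sigma_sq)),
        axis=-1,
    )

rng = np.random.default_rng(0)
d = 784  # e.g. a flattened 28x28 input (hypothetical size)
x = rng.normal(size=d)
mean = np.zeros(d)
log_sigma_sq = np.zeros(d)  # unit variance

log_p = diag_gaussian_log_prob(x, mean, log_sigma_sq)
naive_p = np.exp(log_p)  # underflows to exactly 0.0 in float64

print("log p(x) =", log_p)   # large negative but finite
print("p(x)     =", naive_p) # 0.0: this is why tf.log(prob) fails
```

The same reasoning suggests working directly with the distribution's log-density (e.g. `mvn.log_prob(self.x)` in the TensorFlow distributions API) rather than `tf.log(mvn.prob(...))`, since the log-density is computed analytically and never underflows.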
tensorflow autoencoder
asked Nov 11 at 19:06
parson
32