
Commit

force from * import * statements in all code snippets
dustinvtran committed Mar 8, 2017
1 parent 93adc8e commit 0394128
Showing 5 changed files with 15 additions and 1 deletion.
docs/tex/tutorials/decoder.tex (2 changes: 1 addition & 1 deletion)
@@ -49,7 +49,7 @@ \subsection{Probabilistic decoder}
 from keras.layers import Dense
 
 z = Normal(mu=tf.zeros([N, d]), sigma=tf.ones([N, d]))
-hidden = Dense(256, activation='relu')(z.value())
+hidden = Dense(256, activation='relu')(z)
 x = Bernoulli(logits=Dense(28 * 28)(hidden))
 \end{lstlisting}
 It starts with a $d$-dimensional standard normal prior, one for each
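For reference, a minimal self-contained version of this snippet after the change might read as follows. The values of N and d are illustrative assumptions, and passing the random variable z directly into a Keras layer relies on Edward registering tensor conversion for its random variables; this is a sketch, not part of the commit.
\begin{lstlisting}[language=Python]
import tensorflow as tf
from edward.models import Bernoulli, Normal
from keras.layers import Dense

N = 1000  # number of data points (assumed value)
d = 10    # latent dimensionality (assumed value)

z = Normal(mu=tf.zeros([N, d]), sigma=tf.ones([N, d]))
hidden = Dense(256, activation='relu')(z)  # z used directly, not z.value()
x = Bernoulli(logits=Dense(28 * 28)(hidden))
\end{lstlisting}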
docs/tex/tutorials/gan.tex (5 changes: 5 additions & 0 deletions)
@@ -37,6 +37,8 @@ \subsubsection{Data}
 TensorFlow placeholder with a fixed batch size of $M$ images.
 
 \begin{lstlisting}[language=Python]
+from tensorflow.examples.tutorials.mnist import input_data
+
 mnist = input_data.read_data_sets(DATA_DIR, one_hot=True)
 x_ph = tf.placeholder(tf.float32, [M, 784])
 \end{lstlisting}
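As a runnable sketch of the updated snippet: DATA_DIR and M are assumed placeholder values for illustration, not part of the diff.
\begin{lstlisting}[language=Python]
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

DATA_DIR = "data/mnist"  # assumed path to the MNIST data
M = 128                  # assumed batch size

mnist = input_data.read_data_sets(DATA_DIR, one_hot=True)
x_ph = tf.placeholder(tf.float32, [M, 784])
\end{lstlisting}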
@@ -65,6 +67,9 @@ \subsubsection{Model}
 $[0,1]$.
 
 \begin{lstlisting}[language=Python]
+from edward.models import Uniform
+from tensorflow.contrib import slim
+
 def generative_network(eps):
   h1 = slim.fully_connected(eps, 128, activation_fn=tf.nn.relu)
   x = slim.fully_connected(h1, 784, activation_fn=tf.sigmoid)
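A plausible completion of the truncated snippet, assuming the generator simply returns x and that the noise eps is drawn from the imported Uniform; the latent dimension d, the [-1, 1] bounds, and the a/b keyword arguments of this era's Uniform are assumptions.
\begin{lstlisting}[language=Python]
import tensorflow as tf
from edward.models import Uniform
from tensorflow.contrib import slim

M = 128  # assumed batch size
d = 100  # assumed noise dimensionality

def generative_network(eps):
  h1 = slim.fully_connected(eps, 128, activation_fn=tf.nn.relu)
  x = slim.fully_connected(h1, 784, activation_fn=tf.sigmoid)
  return x

# Noise drawn uniformly on [-1, 1], then pushed through the generator.
eps = Uniform(a=tf.zeros([M, d]) - 1.0, b=tf.ones([M, d]))
x = generative_network(eps)
\end{lstlisting}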
docs/tex/tutorials/latent-space-models.tex (2 changes: 2 additions & 0 deletions)
@@ -69,6 +69,8 @@ \subsubsection{Model}
 
 In Edward, the model is written as follows:
 \begin{lstlisting}[language=Python]
+from edward.models import Normal, Poisson
+
 N = x_train.shape[0]  # number of data points
 K = 3  # latent dimensionality
 
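One plausible way the snippet continues, given the Normal and Poisson imports: Gaussian latent positions z, with Poisson edge counts whose rates decay in pairwise latent distance. The distance computation below is a sketch and not necessarily the tutorial's exact formula; x_train is assumed to be an N x N matrix of edge counts loaded earlier.
\begin{lstlisting}[language=Python]
import tensorflow as tf
from edward.models import Normal, Poisson

N = x_train.shape[0]  # number of data points
K = 3  # latent dimensionality

z = Normal(mu=tf.zeros([N, K]), sigma=tf.ones([N, K]))
# Pairwise squared distances between latent positions.
sq = tf.reduce_sum(tf.square(z), 1, keep_dims=True)
d2 = sq - 2.0 * tf.matmul(z, z, transpose_b=True) + tf.transpose(sq)
# Edge counts are Poisson with rates decaying in latent distance.
x = Poisson(lam=tf.exp(-d2))
\end{lstlisting}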
docs/tex/tutorials/mixture-density-network.tex (5 changes: 5 additions & 0 deletions)
@@ -20,6 +20,8 @@ \subsubsection{Data}
 for every input $x_n$ there are multiple outputs $y_n$.
 
 \begin{lstlisting}[language=Python]
+from sklearn.model_selection import train_test_split
+
 def build_toy_dataset(N):
   y_data = np.random.uniform(-10.5, 10.5, N).astype(np.float32)
   r_data = np.random.normal(size=N).astype(np.float32)  # random noise
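A plausible completion of the truncated build_toy_dataset, assuming x is a noisy nonlinear function of y so that the inverse mapping y given x is multimodal; the exact transformation below is a guess for illustration.
\begin{lstlisting}[language=Python]
import numpy as np
from sklearn.model_selection import train_test_split

def build_toy_dataset(N):
  y_data = np.random.uniform(-10.5, 10.5, N).astype(np.float32)
  r_data = np.random.normal(size=N).astype(np.float32)  # random noise
  # Assumed form: x is a noisy sinusoidal function of y.
  x_data = np.sin(0.75 * y_data) * 7.0 + y_data * 0.5 + r_data
  x_data = x_data.reshape((N, 1))
  y_data = y_data.reshape((N, 1))
  return train_test_split(x_data, y_data, random_state=42)

X_train, X_test, y_train, y_test = build_toy_dataset(5000)
\end{lstlisting}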
@@ -64,6 +66,9 @@ \subsubsection{Model}
 units for each hidden layer.
 
 \begin{lstlisting}[language=Python]
+from edward.models import Categorical, Mixture, Normal
+from tensorflow.contrib import slim
+
 def neural_network(X):
   """mu, sigma, logits = NN(x; theta)"""
   # 2 hidden layers with 15 hidden units
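A hedged sketch of how the truncated network and the mixture construction could continue, using the imported Categorical, Mixture, and Normal. The layer sizes follow the docstring and comment above; the number of components K and the placeholder X_ph are assumptions.
\begin{lstlisting}[language=Python]
import tensorflow as tf
from edward.models import Categorical, Mixture, Normal
from tensorflow.contrib import slim

K = 20  # assumed number of mixture components
X_ph = tf.placeholder(tf.float32, [None, 1])  # assumed input placeholder

def neural_network(X):
  """mu, sigma, logits = NN(x; theta)"""
  # 2 hidden layers with 15 hidden units
  hidden1 = slim.fully_connected(X, 15)
  hidden2 = slim.fully_connected(hidden1, 15)
  mus = slim.fully_connected(hidden2, K, activation_fn=None)
  sigmas = slim.fully_connected(hidden2, K, activation_fn=tf.exp)
  logits = slim.fully_connected(hidden2, K, activation_fn=None)
  return mus, sigmas, logits

mus, sigmas, logits = neural_network(X_ph)
cat = Categorical(logits=logits)
components = [Normal(mu=mus[:, k], sigma=sigmas[:, k]) for k in range(K)]
y = Mixture(cat=cat, components=components)
\end{lstlisting}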
docs/tex/tutorials/probabilistic-pca.tex (2 changes: 2 additions & 0 deletions)
@@ -79,6 +79,8 @@ \subsubsection{Model}
 our variables of interest.
 
 \begin{lstlisting}[language=Python]
+from edward.models import Normal
+
 w = Normal(mu=tf.zeros([D, K]), sigma=2.0 * tf.ones([D, K]))
 z = Normal(mu=tf.zeros([N, K]), sigma=tf.ones([N, K]))
 x = Normal(mu=tf.matmul(w, z, transpose_b=True), sigma=tf.ones([D, N]))
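After this change the snippet is nearly self-contained; a minimal runnable version, with assumed values for N, D, and K:
\begin{lstlisting}[language=Python]
import tensorflow as tf
from edward.models import Normal

N = 5000  # assumed number of data points
D = 2     # assumed data dimensionality
K = 1     # assumed latent dimensionality

w = Normal(mu=tf.zeros([D, K]), sigma=2.0 * tf.ones([D, K]))
z = Normal(mu=tf.zeros([N, K]), sigma=tf.ones([N, K]))
x = Normal(mu=tf.matmul(w, z, transpose_b=True), sigma=tf.ones([D, N]))
\end{lstlisting}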
