PR: Fix typo in nmt_with_attention.ipynb

Please approve this CL. It will be submitted automatically, and its GitHub pull request will be marked as merged.

Imported from GitHub PR 

weigths -> weights

Copybara import of the project:

  - edcca75684186e4467b255e84ef36d4bbfda4e0e Fix typo in nmt_with_attention.ipynb by Alexa <alexa.nguyen@adelaide.edu.au>
  - 7599568da82684829d1546116947ce2c5c256e6e Merge edcca75684186e4467b255e84ef36d4bbfda4e0e into d4639... by Alexa <alexa.nguyen@adelaide.edu.au>

COPYBARA_INTEGRATE_REVIEW=https://github.com/tensorflow/tensorflow/pull/26170 from nguyen-alexa:master edcca75684186e4467b255e84ef36d4bbfda4e0e
PiperOrigin-RevId: 236166878
This commit is contained in:
A. Unique TensorFlower 2019-02-28 11:54:03 -08:00 committed by TensorFlower Gardener
parent d51261962b
commit 5e32d8fd7f


@@ -688,7 +688,7 @@
 " for t in range(max_length_targ):\n",
 " predictions, dec_hidden, attention_weights = decoder(dec_input, dec_hidden, enc_out)\n",
 " \n",
-" # storing the attention weigths to plot later on\n",
+" # storing the attention weights to plot later on\n",
 " attention_weights = tf.reshape(attention_weights, (-1, ))\n",
 " attention_plot[t] = attention_weights.numpy()\n",
 "\n",
@@ -842,4 +842,4 @@
 ]
 }
 ]
-}
+}
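For context, the corrected line sits in the notebook's evaluation loop, which flattens the decoder's attention weights at each target timestep and stores them as one row of a plot matrix. A minimal sketch of that step, using NumPy in place of TensorFlow for self-containment (the shapes and the `t = 0` timestep are assumptions mirroring the notebook's batch size of 1):

```python
import numpy as np

# Assumed dimensions: 5 target timesteps, 10 input tokens, batch size 1.
max_length_targ, max_length_inp = 5, 10
attention_weights = np.random.rand(1, max_length_inp, 1)

# Equivalent of tf.reshape(attention_weights, (-1,)): flatten to a 1-D
# vector so it can be stored as a single row of the attention plot.
flat = attention_weights.reshape(-1)

# attention_plot[t] holds the weights for decoder timestep t (here t = 0).
attention_plot = np.zeros((max_length_targ, max_length_inp))
attention_plot[0] = flat
```

Each row of `attention_plot` then records how much attention the decoder paid to every input token when producing one output token, which the notebook later renders as a heatmap.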