NMT is commonly framed as sequence-to-sequence modeling: the input is a variable-length tensor representation of a sentence in the source language, and the output is another variable-length tensor representing the sentence in the target language. Sockeye is a sequence-to-sequence framework for neural machine translation built on Apache MXNet (Incubating). It implements the well-known encoder-decoder architecture with attention. The talk covers LSTM networks, NMT fundamentals, an overview of how to use Sockeye to implement translation tasks, and areas of active research for those interested in studying the subject further.
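To make the attention mechanism concrete, here is a minimal NumPy sketch of dot-product attention, the core operation inside an attentional encoder-decoder. The dimensions (5 source tokens, hidden size 4) are illustrative assumptions, not values from Sockeye itself:

```python
import numpy as np

# Toy setup: 5 source tokens, hidden size 4 (hypothetical dimensions).
np.random.seed(0)
encoder_states = np.random.randn(5, 4)   # one encoder hidden state per source token
decoder_state = np.random.randn(4)       # current decoder hidden state

# Dot-product attention: score each encoder state against the decoder state,
# normalize the scores with a softmax, and take the weighted sum of encoder
# states as the context vector fed to the decoder.
scores = encoder_states @ decoder_state
weights = np.exp(scores - scores.max())
weights /= weights.sum()
context = weights @ encoder_states

print(weights.shape, context.shape)
```

The context vector lets the decoder focus on different source positions at each output step, which is what allows the model to handle variable-length inputs gracefully.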
In this lab, we will cover deep learning fundamentals and focus on the powerful and scalable Apache MXNet open source deep learning framework. By the end of this hands-on lab, you'll be able to train your own deep neural network and fine-tune existing state-of-the-art models for image and object recognition. We'll also dive deep into setting up your deep learning infrastructure on AWS.
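As a taste of the fundamentals covered in the lab, the sketch below trains a tiny single-hidden-layer network with hand-written backpropagation on the XOR problem. It uses plain NumPy rather than MXNet (which automates these forward and backward passes), and all sizes and hyperparameters are illustrative assumptions:

```python
import numpy as np

# Toy dataset: XOR, a classic non-linearly-separable problem.
np.random.seed(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 units (hypothetical size), sigmoid activations.
W1 = np.random.randn(2, 8); b1 = np.zeros(8)
W2 = np.random.randn(8, 1); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

initial_loss = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2)

lr = 1.0
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)      # forward pass: hidden layer
    out = sigmoid(h @ W2 + b2)    # forward pass: output layer
    # Backward pass: gradients of the squared-error loss.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent parameter updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

final_loss = np.mean((out - y) ** 2)
print(initial_loss, "->", final_loss)
```

A framework like MXNet performs the gradient computation automatically and scales the same idea to deep networks and GPUs, which is what the hands-on portion of the lab builds on.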