Description:

My first deliverable is an example program of word2vec, implemented in TensorFlow and also using gensim. The program uses a softmax classifier to learn vector representations of words. Prior to implementation, I studied machine learning, neural networks, and Python.

A word embedding (also called a word representation or word vector) is a parameterized function that maps words in some language to high-dimensional vectors of real numbers. Methods for generating this mapping include neural networks, dimensionality reduction on the word co-occurrence matrix, probabilistic models, and explicit representations in terms of the contexts in which words appear. The learned vectors capture aspects of word meaning and can be used for downstream tasks.
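
The sketch below is a minimal illustration of the gensim side of this deliverable (assuming gensim 4.x); the toy corpus, parameter values, and variable names are illustrative assumptions, not the actual deliverable program.

```python
# Minimal sketch: training word vectors with gensim (assumes gensim 4.x).
# The toy corpus and parameter values are illustrative, not the deliverable itself.
from gensim.models import Word2Vec

# Tokenized corpus: a list of sentences, each a list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["tensorflow", "and", "gensim", "can", "train", "word2vec"],
]

# sg=1 selects the skip-gram architecture; hs=1 trains with hierarchical softmax.
model = Word2Vec(
    sentences=corpus,
    vector_size=100,   # dimensionality of the learned word vectors
    window=5,          # context window size
    min_count=1,       # keep every word in this tiny corpus
    sg=1,
    hs=1,
)

# Look up the learned vector for a word and query its nearest neighbors.
vector = model.wv["word"]                      # a 100-dimensional numpy array
similar = model.wv.most_similar("word", topn=3)
print(vector[:5])
print(similar)
```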

Deliverables: