Chris Pollett > Students > Qiao

Deliverable #2: Presentation on word embedding.

This deliverable is an introduction to word embedding, along with some thoughts on the project.

A word embedding is a parameterized function mapping words in some language to high-dimensional vectors. It is sometimes called a word representation or a word vector. Methods to generate this mapping include neural networks, dimensionality reduction on the word co-occurrence matrix, probabilistic models, and explicit representation in terms of the contexts in which words appear. My literature review and presentation focus on learning word embeddings with a neural network.
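As a rough illustration of the neural-network approach, the sketch below trains skip-gram-style embeddings on a toy corpus: each word gets an input vector, and the network is trained to predict neighboring words with a softmax output layer. All names, the corpus, and the hyperparameters here are illustrative assumptions, not part of the deliverable itself.

```python
# Minimal skip-gram-style embedding sketch on a toy corpus (illustrative only).
import numpy as np

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8            # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (word) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) embeddings

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# (center, context) training pairs within a window of 1
pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in (i - 1, i + 1) if 0 <= j < len(corpus)]

lr = 0.05
for epoch in range(200):
    for c, o in pairs:
        h = W_in[c]                       # hidden layer = embedding lookup
        p = softmax(W_out @ h)            # predicted context distribution
        grad = p.copy(); grad[o] -= 1.0   # gradient of cross-entropy w.r.t. scores
        grad_in = W_out.T @ grad
        W_out -= lr * np.outer(grad, h)
        W_in[c] -= lr * grad_in

# The learned mapping: each word to its vector
embedding = {w: W_in[idx[w]] for w in vocab}
```

After training, nearby words in the corpus tend to have more similar vectors; real systems such as word2vec use the same idea at much larger scale, with tricks like negative sampling in place of the full softmax.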

References:

Christopher Olah, "Deep Learning, NLP, and Representations", 2014. http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/

Sanjeev Arora, Yingyu Liang, and Tengyu Ma, "A Simple but Tough-to-Beat Baseline for Sentence Embeddings", ICLR 2017.

Introduction to Word Embedding.pdf