[Exercise] Skip-gram [Eng]
● Calculations (fill in the blanks):
o Embedding for "cat" (row 1 of the updated input embedding matrix $W$): [___, ___, ___].
o Embedding for "fish" (row 6): [___, ___, ___].
o Dot product: $\_\_\_ \cdot \_\_\_ + \_\_\_ \cdot \_\_\_ + \_\_\_ \cdot \_\_\_ = \_\_\_$.
o Norm: $\lVert \mathbf{v}_{\text{cat}} \rVert = \sqrt{\_\_\_^2 + \_\_\_^2 + \_\_\_^2} = \_\_\_$.
o Norm: $\lVert \mathbf{v}_{\text{fish}} \rVert = \sqrt{\_\_\_^2 + \_\_\_^2 + \_\_\_^2} = \_\_\_$.
o Cosine similarity: $\cos(\mathbf{v}_{\text{cat}}, \mathbf{v}_{\text{fish}}) = \dfrac{\mathbf{v}_{\text{cat}} \cdot \mathbf{v}_{\text{fish}}}{\lVert \mathbf{v}_{\text{cat}} \rVert \, \lVert \mathbf{v}_{\text{fish}} \rVert} = \_\_\_$.
o Interpretation: ___ (e.g., high cosine indicates semantic similarity). A numeric check of this computation is sketched at the end of this section.

Exercise 1.5: Counting Parameters
● Formulas and Explanation:
o Total parameters: $\#\Theta = 2 \times V \times D$, the sum of the parameters in the input embedding matrix $W \in \mathbb{R}^{V \times D}$ and the output embedding matrix $W' \in \mathbb{R}^{V \times D}$ (see the parameter-count check at the end of this section).
● Calculations (fill in the blanks):
o For V = 7, D = 2: $\#\Theta = \_\_\_$.
o For D = 3: $\#\Theta = \_\_\_$.
o For D = 5: $\#\Theta = \_\_\_$.

Part 2: Skip-gram with Negative Sampling
This part uses negative sampling with $K$ sampled negative words per positive pair to approximate the softmax, making computations more efficient.

Exercise 2.1: Vectorization and Forward Pass for Positive Word
● Formulas and Explanation:
o Vectorization: the center word $w_c$ is encoded as a one-hot vector $\mathbf{x}_c \in \{0, 1\}^V$ with a 1 at the vocabulary index of $w_c$.
o Center embedding: $\mathbf{v}_c = W^\top \mathbf{x}_c$, i.e., the row of the input embedding matrix $W$ corresponding to $w_c$.
o Positive score: $s_o = \mathbf{u}_o \cdot \mathbf{v}_c$, where $\mathbf{u}_o$ is the output embedding of the positive context word $w_o$ (a dot product measuring similarity); see the forward-pass sketch at the end of this section.
● Calculations (fill in the blanks):
o One-hot vector for the center word ("likes", index = 2): $\mathbf{x}_c = [\_\_\_, \_\_\_, \_\_\_, \_\_\_, \_\_\_, \_\_\_, \_\_\_]$.
o Center embedding: $\mathbf{v}_c = [\_\_\_, \_\_\_, \_\_\_]$.
o Output embedding for the positive word "cat": $\mathbf{u}_o = [\_\_\_, \_\_\_, \_\_\_]$.
o $s_o = \_\_\_ \cdot \_\_\_ + \_\_\_ \cdot \_\_\_ + \_\_\_ \cdot \_\_\_ = \_\_\_$.

Exercise 2.2: Negative Sampling and Scores for Negative Words
● Formulas and Explanation:
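The cosine-similarity calculation in Part 1 can be checked numerically. The sketch below is a minimal NumPy example with made-up 3-dimensional embeddings for "cat" and "fish"; the values are placeholders to be replaced with the rows of the updated embedding matrix from the exercise, so only the procedure is meant to match.

```python
import numpy as np

# Placeholder 3-dimensional embeddings; replace with row 1 ("cat") and
# row 6 ("fish") of the updated embedding matrix from the exercise.
v_cat = np.array([0.9, 0.2, -0.1])
v_fish = np.array([0.8, 0.3, 0.0])

dot = np.dot(v_cat, v_fish)              # element-wise products, summed
norm_cat = np.linalg.norm(v_cat)         # Euclidean norm of v_cat
norm_fish = np.linalg.norm(v_fish)       # Euclidean norm of v_fish
cosine = dot / (norm_cat * norm_fish)    # cosine similarity, in [-1, 1]

print(f"dot = {dot:.4f}, cosine = {cosine:.4f}")  # high cosine -> similar words
```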
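For Exercise 1.5, the check below assumes the standard Skip-gram parameterization with one input and one output embedding matrix, each of shape V × D, so $\#\Theta = 2VD$; if the exercise counts additional parameters (for example, biases), the formula would need adjusting.

```python
def skipgram_param_count(V: int, D: int) -> int:
    # Input embedding matrix (V x D) plus output embedding matrix (V x D).
    return 2 * V * D

# Vocabulary size V = 7, as used in Exercise 1.5.
for D in (2, 3, 5):
    print(f"V=7, D={D}: #Theta = {skipgram_param_count(7, D)}")
```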
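The forward pass of Exercise 2.1, together with the negative-word scores that Exercise 2.2 builds on, can be sketched as follows. The random embedding matrices, the index assumed for "cat", the sampled negative indices, and the choice of K = 2 are illustrative assumptions rather than the exercise's actual numbers; the sketch only shows the one-hot lookup, the positive score $s_o = \mathbf{u}_o \cdot \mathbf{v}_c$, and the analogous dot-product scores for sampled negative words.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D = 7, 3                        # vocabulary size and embedding dimension
W_in = rng.normal(size=(V, D))     # input (center) embeddings, one row per word
W_out = rng.normal(size=(V, D))    # output (context) embeddings, one row per word

center_idx = 2                     # "likes" (index 2, as in Exercise 2.1)
positive_idx = 0                   # "cat" (index assumed for illustration)
negative_idx = [4, 5]              # K = 2 sampled negative words (placeholder indices)

# One-hot vectorization of the center word, then embedding lookup.
x_c = np.zeros(V)
x_c[center_idx] = 1.0
v_c = W_in.T @ x_c                 # equivalent to W_in[center_idx]

# Positive score: dot product with the output embedding of the context word.
s_o = W_out[positive_idx] @ v_c

# Negative scores: the same dot product against each sampled negative word.
s_neg = W_out[negative_idx] @ v_c

print("s_o   =", round(float(s_o), 4))
print("s_neg =", np.round(s_neg, 4))
```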