500+ quant interview questions for Jane Street, Citadel, Two Sigma, DE Shaw, and other top quantitative finance firms.
C++ and Python coding challenges for quant developer interviews
Statistical analysis and quantitative modeling problems
Trading MCQs, probability brainteasers, and market scenarios
Practice quant interview questions on MyntBit - the all-in-one quant learning platform. Free questions available for C++ coding, Python problems, probability brainteasers, and trading MCQs.
Difficulty: Medium
Category: Linear Algebra & Machine Learning
Topics: attention-mechanism, machine-learning, linear-algebra, transformer
In a self-attention mechanism, we have query (Q), key (K), and value (V) matrices. Given an input sequence represented as a matrix $X$, these matrices are derived through linear transformations: $Q = XW_Q$, $K = XW_K$, and $V = XW_V$, where $W_Q$, $W_K$, and $W_V$ are learnable weight matrices. How is the output of the self-attention mechanism computed, assuming the dimension of the keys is $d_k$?
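The computation being asked about is the standard scaled dot-product attention of the transformer architecture: $\text{Attention}(Q, K, V) = \text{softmax}\!\left(\frac{QK^\top}{\sqrt{d_k}}\right)V$, where the softmax is applied row-wise and the $\sqrt{d_k}$ scaling keeps the dot products from growing with key dimension. A minimal NumPy sketch (toy dimensions chosen purely for illustration):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def self_attention(X, W_Q, W_K, W_V):
    # Project the input sequence into queries, keys, and values.
    Q, K, V = X @ W_Q, X @ W_K, X @ W_V
    d_k = K.shape[-1]
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    scores = Q @ K.T / np.sqrt(d_k)
    return softmax(scores, axis=-1) @ V

# Toy example: 3 tokens, embedding dimension 4, d_k = d_v = 2.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
W_Q, W_K, W_V = (rng.standard_normal((4, 2)) for _ in range(3))
out = self_attention(X, W_Q, W_K, W_V)
print(out.shape)  # (3, 2): one d_v-dimensional output per token
```

Each row of the softmaxed score matrix is a probability distribution over the input positions, so the output for each token is a convex combination of the value vectors.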