Gradient Boosting: Exact Greedy Split - Quant Researcher Interview Question
Difficulty: Hard
Category: machine_learning
Asked at: D.E. Shaw, Citadel, Two Sigma, WorldQuant, G-Research
Topics: xgboost, gradient_boosting, decision_trees, optimization
Problem Description
Gradient Boosting Decision Trees (GBDT) are a cornerstone of quantitative finance, widely used for alpha signal combination and risk modeling due to their ability to capture non-linear dependencies and regime shifts. The construction of these ensembles relies on the Exact Greedy Split algorithm, which iteratively identifies the optimal feature and threshold to split on by maximizing the reduction in a differentiable loss function. This process involves computing per-sample gradient statistics and evaluating every candidate split point along each sorted feature to select the one with the highest gain.
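The procedure described above can be sketched in a few lines. This is a minimal illustrative implementation, not the question's official solution: it scans each feature in sorted order, maintains running left-side gradient/Hessian sums `GL`/`HL`, and scores each candidate threshold with the standard second-order gain formula `G_L²/(H_L+λ) + G_R²/(H_R+λ) − G²/(H+λ)` (the function name and signature are my own, and the regularization constant γ on leaf count is omitted for brevity).

```python
import numpy as np

def exact_greedy_split(X, g, h, lam=1.0):
    """Exact greedy split search over all features and thresholds.

    X   : (n, d) feature matrix
    g, h: per-sample gradient and Hessian statistics of the loss
    lam : L2 regularization on leaf weights
    Returns (best_gain, best_feature, best_threshold).
    """
    n, d = X.shape
    G, H = g.sum(), h.sum()                      # totals for the parent node
    best_gain, best_feat, best_thr = 0.0, None, None
    for j in range(d):
        order = np.argsort(X[:, j])              # sort samples by feature j
        xs, gs, hs = X[order, j], g[order], h[order]
        GL = HL = 0.0                            # running left-partition sums
        for i in range(n - 1):
            GL += gs[i]
            HL += hs[i]
            if xs[i] == xs[i + 1]:               # no valid threshold between ties
                continue
            GR, HR = G - GL, H - HL              # right partition by subtraction
            gain = GL**2 / (HL + lam) + GR**2 / (HR + lam) - G**2 / (H + lam)
            if gain > best_gain:
                best_gain = gain
                best_feat = j
                best_thr = 0.5 * (xs[i] + xs[i + 1])
    return best_gain, best_feat, best_thr
```

For squared-error loss with all predictions initialized to zero, g = −y and h = 1, so a feature that cleanly separates low and high targets yields the largest gain; the O(nd) scan per split (after an O(n log n) sort per feature) is exactly the cost profile the interview question typically probes.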