Author: Auxiliadora Sarmiento


This code implements a novel family of generalized exponentiated gradient (EG) updates derived from an Alpha-Beta divergence regularization function. Collectively referred to as EGAB, the proposed updates belong to the class of multiplicative gradient algorithms for positive data and offer considerable flexibility, controlling iteration behavior and performance through three hyperparameters: alpha, beta, and the learning rate eta. To enforce a unit l1-norm constraint on the nonnegative weight vectors within the generalized EGAB algorithms, we develop two closely related but distinct approaches.
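
The snippet below is a minimal, hedged sketch (in NumPy) of the classical EG update with an explicit renormalization step that keeps the weight vector on the unit l1-norm simplex. The EGAB family described above generalizes this update through an Alpha-Beta divergence regularizer; the exact alpha/beta-parameterized form is not reproduced here, and the function and variable names (`eg_update`, `eta`, etc.) are illustrative assumptions rather than the repository's actual API.

```python
import numpy as np

def eg_update(w, grad, eta=0.1):
    """One classical exponentiated gradient step for a nonnegative weight
    vector w (a limiting special case of the EGAB family). The unit l1-norm
    constraint is enforced here by simple renormalization."""
    w_new = w * np.exp(-eta * grad)   # multiplicative (exponentiated) step
    return w_new / w_new.sum()        # project back onto the unit l1-norm simplex

# Example: a few iterations on a simple quadratic loss f(w) = ||A w - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
w = np.full(5, 1.0 / 5)               # start at the uniform point of the simplex
for _ in range(100):
    grad = 2.0 * A.T @ (A @ w - b)    # gradient of the quadratic loss
    w = eg_update(w, grad, eta=0.05)
print(w, w.sum())                     # nonnegative weights summing to 1
```

Renormalization after the multiplicative step is only one way to impose the unit l1-norm constraint; the two approaches developed for the EGAB algorithms may differ in how the constraint enters the update itself.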
