Implementation of Exponentiated Gradient Algorithms based on Alpha-Beta divergences for OLPS

Citation Author(s):
Sergio Cruces, Universidad de Sevilla
Auxiliadora Sarmiento Vega, Universidad de Sevilla
Submitted by:
Auxiliadora Sar...
Last updated:
Wed, 09/18/2024 - 04:48
DOI:
10.21227/dxpv-1g10

Abstract 

This code implements a novel family of generalized exponentiated gradient (EG) updates derived from an Alpha-Beta divergence regularization function. Collectively referred to as EGAB, the proposed updates belong to the category of multiplicative gradient algorithms for positive data and demonstrate considerable flexibility by controlling iteration behavior and performance through three hyperparameters: alpha, beta, and the learning rate eta. To enforce a unit l1 norm constraint for nonnegative weight vectors within generalized EGAB algorithms, we develop two slightly distinct approaches. One method exploits scale-invariant loss functions, while the other relies on gradient projections onto the feasible domain. As an illustration of their applicability, we evaluate the proposed updates in addressing the online portfolio selection (OLPS) problem using gradient-based methods. Here, they not only offer a unified perspective on the search directions of various OLPS algorithms (including the standard exponentiated gradient and diverse mean-reversion strategies), but also facilitate smooth interpolation and extension of these updates due to the flexibility in hyperparameter selection. Simulation results confirm that the adaptability of these generalized gradient updates can effectively enhance performance for some portfolios, particularly in scenarios involving transaction costs.
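To make the setting concrete, the sketch below (in Python rather than the released MATLAB code) shows the standard exponentiated gradient portfolio update, which the abstract identifies as one special case within the EGAB family; the multiplicative step and the l1 renormalization illustrate the general structure, while the specific function name and learning rate are illustrative, not part of the released code.

```python
import numpy as np

def eg_update(w, x, eta=0.05):
    """One step of the standard exponentiated gradient (EG) portfolio update.

    The EGAB family generalizes this multiplicative update via the
    Alpha-Beta divergence; the plain EG shown here is one particular
    (alpha, beta) setting.  `eta` is the learning rate.

    w : current portfolio weights (nonnegative, summing to 1)
    x : price relatives for the current trading period
    """
    ret = float(w @ x)                 # portfolio return this period
    w_new = w * np.exp(eta * x / ret)  # multiplicative gradient step
    return w_new / w_new.sum()         # renormalize onto the unit l1 ball

# Toy example: two assets, one outperforming the other.
w = eg_update(np.array([0.5, 0.5]), np.array([1.1, 0.9]))
```

After the update, weight shifts toward the better-performing asset while the vector stays nonnegative and sums to one, which is exactly the constraint the two EGAB normalization approaches enforce.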

Instructions: 

DEMO of the Alpha-Beta Exponentiated Gradient algorithms: EGAB-N and EGAB-P.

By Sergio Cruces & Auxiliadora Sarmiento. Email: {sergio, sarmiento}@us.es

With acknowledgment of the help of Andrzej Cichocki and Toshihisa Tanaka.

Please cite the related article: https://doi.org/10.48550/arXiv.2406.00655

"Generalized Exponentiated Gradient Algorithms and Their Application to On-Line Portfolio Selection" by Andrzej Cichocki, Sergio Cruces, Auxiliadora Sarmiento, Toshihisa Tanaka.

We provide the necessary MATLAB code and auxiliary functions to implement the proposed algorithms (EGAB-P and EGAB-N) and the baseline UBAH strategy for Online Portfolio Selection (OLPS); they can be found in the run_alg.m and unif_buy_and_hold.m files. The demonstration script run_demo.m automates the primary experimental tasks, including dataset loading, algorithm hyperparameter optimization during training, performance testing of the proposed algorithms on the selected datasets, and the generation of summary results and performance figures.
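For readers unfamiliar with the UBAH baseline, the sketch below shows how its cumulative wealth is typically computed; this is a rough Python illustration of the standard definition (invest equally once, never rebalance), not a transcription of unif_buy_and_hold.m.

```python
import numpy as np

def ubah_wealth(X):
    """Cumulative wealth of the uniform buy-and-hold (UBAH) baseline.

    X is a (T, m) array of price relatives over T periods for m assets.
    UBAH puts 1/m of the initial capital into each asset and never
    rebalances, so the final wealth is the average of the assets'
    cumulative returns.  (Illustrative sketch only; the released MATLAB
    code implements this strategy in unif_buy_and_hold.m.)
    """
    return float(np.prod(X, axis=0).mean())

# Two assets over two periods: one doubles overall, one stays flat.
X = np.array([[2.0, 1.0],
              [1.0, 1.0]])
wealth = ubah_wealth(X)  # (2.0 + 1.0) / 2 = 1.5
```

The same price-relative matrix format is what the OLPS-toolbox .mat datasets in ./data provide, one row per trading period.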

Additionally, the folder ./data contains data sets from the following sources:

- .mat-file data sets included in the open-source toolbox for Online Portfolio Selection (OLPS toolbox), available at https://github.com/OLPS/OLPS

- .csv-file data sets provided by A. Cichocki

Funding Agency: 
MCIN/AEI/ 10.13039/501100011033
Grant Number: 
PID2021-123090NB-I00
