Author: Ramos-Pérez, Eduardo; Alonso-González, Pablo J.; Núñez-Velázquez, José Javier
Title: Multi-Transformer: A New Neural Network-Based Architecture for Forecasting S&P Volatility
Document date: 2021_9_26
ID: kn5qejvx
Snippet: Events such as the Financial Crisis of 2007-2008 or the COVID-19 pandemic caused significant losses to banks and insurance entities. They also demonstrated the importance of using accurate equity risk models and of having a risk management function able to implement effective hedging strategies. Stock volatility forecasts play a key role in the estimation of equity risk and, thus, in the management actions carried out by financial institutions. Therefore, this paper aims to propose more accurate stock volatility models based on novel machine and deep learning techniques.
Document: Events such as the Financial Crisis of 2007-2008 or the COVID-19 pandemic caused significant losses to banks and insurance entities. They also demonstrated the importance of using accurate equity risk models and of having a risk management function able to implement effective hedging strategies. Stock volatility forecasts play a key role in the estimation of equity risk and, thus, in the management actions carried out by financial institutions. Therefore, this paper aims to propose more accurate stock volatility models based on novel machine and deep learning techniques. It introduces a neural network-based architecture called Multi-Transformer, a variant of the Transformer models that have already been successfully applied in the field of natural language processing. The paper also adapts traditional Transformer layers for use in volatility forecasting models. The empirical results suggest that hybrid models based on Multi-Transformer and Transformer layers are more accurate and hence lead to more appropriate risk measures than other autoregressive algorithms or hybrid models based on feed-forward layers or long short-term memory (LSTM) cells.
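Note: the abstract above gives no implementation details. As a rough, hypothetical sketch of the kind of Transformer-based volatility forecaster it describes (not the authors' actual Multi-Transformer), the following PyTorch snippet applies a standard Transformer encoder to a window of past daily returns and outputs a positive next-period volatility estimate. The class name, layer sizes, window length, and the choice of an MSE loss against realized volatility are all illustrative assumptions.

    import torch
    import torch.nn as nn

    class VolatilityTransformer(nn.Module):
        # Illustrative only: a plain Transformer encoder over past returns,
        # not the Multi-Transformer variant proposed in the paper.
        def __init__(self, window=20, d_model=32, nhead=4, num_layers=2):
            super().__init__()
            self.input_proj = nn.Linear(1, d_model)           # embed scalar returns
            self.pos_embed = nn.Parameter(torch.zeros(1, window, d_model))
            layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
            self.head = nn.Sequential(nn.Linear(d_model, 1),
                                      nn.Softplus())          # keep sigma > 0

        def forward(self, returns):
            # returns: (batch, window) past daily log-returns
            x = self.input_proj(returns.unsqueeze(-1)) + self.pos_embed
            h = self.encoder(x)                               # (batch, window, d_model)
            return self.head(h[:, -1]).squeeze(-1)            # next-period volatility

    # Dummy usage: one training step against realized-volatility targets.
    model = VolatilityTransformer()
    past_returns = torch.randn(8, 20) * 0.01    # fabricated batch of return windows
    realized_vol = torch.rand(8) * 0.02         # fabricated next-day realized volatility
    loss = nn.functional.mse_loss(model(past_returns), realized_vol)
    loss.backward()

In the hybrid setting the abstract describes, such a network would be combined with autoregressive components; that combination, and the mechanisms that distinguish Multi-Transformer from a plain Transformer, are not reproduced in this sketch.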
Search related documents:
Co-phrase search for related documents (counts of matching documents; the original hyperlinks are not preserved):
- long short and loss reduce: 3 related documents
- long short and lstm contrast: 1 related document
- long short and lstm layer: 7 related documents
- long short and lstm layer model: 1 related document
- long short and lstm long short term memory: 73 related documents
- long short and lstm transformer: 1 related document
- long short and machine deep: 23 related documents
- long short and mae mean absolute value: 1 related document
- long short term and loss function: 9 related documents
- long short term and loss reduce: 2 related documents
- long short term and lstm contrast: 1 related document
- long short term and lstm layer: 7 related documents
- long short term and lstm layer model: 1 related document
- long short term and lstm long short term memory: 73 related documents
- long short term and lstm transformer: 1 related document
- long short term and machine deep: 23 related documents
- long short term and mae mean absolute value: 1 related document
- loss function and lstm long short term memory: 4 related documents
- loss function and lstm transformer: 1 related document