Selected article for: "standard error and time series"

Author: Zhang, Wenyong; Li, Lingfei; Zhang, Gongqiu
Title: A Two-Step Framework for Arbitrage-Free Prediction of the Implied Volatility Surface
  • Cord-id: wwvlhyoq
  • Document date: 2021-06-14
  • ID: wwvlhyoq
    Snippet: We propose a two-step framework for predicting the implied volatility surface over time without static arbitrage. In the first step, we select features to represent the surface and predict them over time. In the second step, we use the predicted features to construct the implied volatility surface using a deep neural network (DNN) model by incorporating constraints that prevent static arbitrage. We consider three methods to extract features from the implied volatility data: principal component analysis …
    Document: We propose a two-step framework for predicting the implied volatility surface over time without static arbitrage. In the first step, we select features to represent the surface and predict them over time. In the second step, we use the predicted features to construct the implied volatility surface using a deep neural network (DNN) model by incorporating constraints that prevent static arbitrage. We consider three methods to extract features from the implied volatility data: principal component analysis, variational autoencoder and sampling the surface, and we predict these features using LSTM. Using a long time series of implied volatility data for S&P 500 index options to train our models, we find that sampling the surface with DNN for surface construction achieves the smallest error in out-of-sample prediction. Furthermore, the DNN model for surface construction not only removes static arbitrage, but also significantly reduces the prediction error compared with a standard interpolation method. Our framework can also be used to simulate the dynamics of the implied volatility surface without static arbitrage.
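
    The first step of the framework (feature extraction, then forecasting the features over time) can be sketched minimally as follows. This is a hedged illustration, not the authors' implementation: NumPy-based PCA stands in for the feature extractor, a naive persistence forecast stands in for the LSTM, and the synthetic data, grid size, and variable names (`surfaces`, `components`, `predicted_surface`) are all assumptions for the sake of a self-contained example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic history of implied-vol surfaces: T days, each surface
    # flattened over a strike x maturity grid of 40 points (assumed sizes).
    T, n_points, n_features = 250, 40, 3
    base = 0.2 + 0.05 * np.linspace(-1, 1, n_points)       # static smile shape
    surfaces = base + 0.01 * rng.standard_normal((T, n_points))

    # Step 1a: extract low-dimensional features via PCA (SVD on centered data).
    mean = surfaces.mean(axis=0)
    U, S, Vt = np.linalg.svd(surfaces - mean, full_matrices=False)
    components = Vt[:n_features]                           # principal directions
    features = (surfaces - mean) @ components.T            # (T, n_features) scores

    # Step 1b: forecast tomorrow's features. The paper uses an LSTM; a
    # persistence forecast (carry the last observation forward) is used
    # here purely as a placeholder.
    predicted_features = features[-1]

    # Reconstruct the predicted surface from the forecast features. In the
    # paper this reconstruction is done by a DNN with no-arbitrage
    # constraints; linear PCA inversion is the unconstrained stand-in.
    predicted_surface = mean + predicted_features @ components

    print(predicted_surface.shape)  # (40,)
    ```

    The two stand-ins mark exactly where the paper's models would slot in: replace the persistence forecast with an LSTM trained on the feature time series, and replace the linear reconstruction with the constrained DNN surface-construction model.
    
    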

    Search related documents:
    Co phrase search for related documents
    • activation function and long lstm model short term memory: 1
    • activation function and long lstm short term memory: 1, 2
    • activation function and loss function: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
    • activation function and lstm model: 1
    • activation function and lstm model short term memory: 1
    • activation function and lstm short term memory: 1, 2
    • activation function and machine learning: 1, 2, 3, 4, 5, 6
    • activation function and machine learning model: 1
    • adam optimizer and long lstm model short term memory: 1
    • adam optimizer and long lstm short term memory: 1, 2
    • adam optimizer and loss function: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
    • adam optimizer and lstm model: 1, 2
    • adam optimizer and lstm model short term memory: 1
    • adam optimizer and lstm short term memory: 1, 2
    • adam optimizer and machine learning: 1, 2