Technical trading rules can be generated from historical data for decision making in stock markets. Genetic programming (GP), an artificial intelligence technique, is a valuable method for automatically generating such technical trading rules. In this paper, GP is applied to generate risk-adjusted trading rules on individual stocks. Among the many risk measures in the literature, the conditional Sharpe ratio was selected for this study because it uses conditional value at risk (CVaR), a coherent risk measure, to adjust for risk. In the proposed GP model, binary trading rules are also extended to more realistic trinary rules with three signals: buy, sell, and no trade. Additionally, transaction costs, dividends, and splits are included in the GP model so that the returns of the generated rules are computed more accurately. The proposed model was applied to 10 Iranian companies listed on the Tehran Stock Exchange (TSE). The numerical results show that the extended GP model generates trading rules that outperform the buy-and-hold strategy, especially on a risk-adjusted basis.
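As an illustration of the fitness measure described here, the following Python sketch computes a conditional Sharpe ratio as the mean excess return of a rule divided by the CVaR of its return series. The function name, the 95% confidence level, and the simulated returns are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def conditional_sharpe(returns, rf=0.0, alpha=0.95):
    """Conditional Sharpe ratio: mean excess return per unit of CVaR.

    A minimal sketch; `alpha` is the CVaR confidence level and the
    risk-free rate `rf` is assumed to be quoted per period.
    """
    returns = np.asarray(returns, dtype=float)
    losses = -returns                     # losses are negated returns
    var = np.quantile(losses, alpha)      # Value-at-Risk at level alpha
    cvar = losses[losses >= var].mean()   # expected loss beyond VaR
    return (returns.mean() - rf) / cvar

# Example: daily returns of a hypothetical trading rule
rng = np.random.default_rng(0)
rule_returns = rng.normal(0.0006, 0.01, 1000)
print(round(conditional_sharpe(rule_returns), 4))
```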
We examine the stability of a portfolio management model based on the conditional value-at-risk (CVaR) measure. The stochastic programming model controls the total risk exposure of an international investment portfolio, covering both market risk in multiple countries and currency exchange risk. Uncertainty in asset returns and exchange rates is modeled in terms of discrete scenarios generated by a moment matching method. The procedure generates a set of scenarios whose statistical properties match target values estimated from historical market data. First, we establish that the scenario generation procedure does not bias the results of the optimization program, and we determine the number of scenarios required to attain stable solutions. We then investigate the sensitivity of the CVaR model to errors (mis-specifications) in the statistics of the stochastic parameters (i.e., the mean, variance, skewness, and kurtosis of the random variables).
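As a rough illustration of the moment matching diagnostic, the sketch below computes the first four sample moments of a generated scenario set for comparison against target values. The scenario array shape and the heavy-tailed example data are assumptions made for illustration.

```python
import numpy as np
from scipy import stats

def scenario_moments(scenarios):
    """First four moments of a scenario set, per asset.

    `scenarios` is an (n_scenarios, n_assets) array of sampled returns;
    a sketch of the check one would run after moment matching.
    """
    return {
        "mean": scenarios.mean(axis=0),
        "variance": scenarios.var(axis=0, ddof=1),
        "skewness": stats.skew(scenarios, axis=0),
        "kurtosis": stats.kurtosis(scenarios, axis=0),  # excess kurtosis
    }

rng = np.random.default_rng(1)
scen = rng.standard_t(df=5, size=(5000, 3)) * 0.02      # heavy-tailed returns
for name, vals in scenario_moments(scen).items():
    print(name, np.round(vals, 4))
```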
This paper proposes a conditional technique for the estimation of VaR and expected shortfall measures based on the skewed generalized t (SGT) distribution. The estimation of the conditional mean and conditional variance of returns is based on ten popular variations of the GARCH model. The results indicate that the TS-GARCH and EGARCH models have the best overall performance. The remaining GARCH specifications, except in a few cases, produce acceptable results. An unconditional SGT-VaR performs well in an in-sample evaluation but fails the tests in an out-of-sample evaluation. The latter finding indicates the need to incorporate time-varying mean and volatility estimates in the computation of VaR and expected shortfall measures.
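A minimal sketch of the conditional VaR and expected shortfall computation follows, assuming a GARCH(1,1) volatility recursion with fixed (not estimated) parameters, a zero conditional mean, and a symmetric Student-t substituted for the SGT distribution for brevity; the analytic t tail-expectation formula supplies the expected shortfall.

```python
import numpy as np
from scipy import stats

def garch_var_es(returns, omega, a, b, nu, level=0.99):
    """One-step-ahead VaR and ES under GARCH(1,1) volatility with
    Student-t innovations (a stand-in for the paper's SGT)."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty(len(r) + 1)
    sigma2[0] = r.var()
    for t in range(len(r)):                 # GARCH(1,1) recursion
        sigma2[t + 1] = omega + a * r[t] ** 2 + b * sigma2[t]
    sigma = np.sqrt(sigma2[-1])             # next-period volatility forecast

    scale = np.sqrt((nu - 2) / nu)          # rescale t to unit variance
    x = stats.t.ppf(1 - level, df=nu)       # lower-tail quantile (negative)
    var = -sigma * x * scale
    # Analytic expected shortfall of a standard t distribution
    es_std = stats.t.pdf(x, df=nu) * (nu + x**2) / ((nu - 1) * (1 - level))
    es = sigma * es_std * scale
    return var, es

rng = np.random.default_rng(2)
rets = rng.standard_t(6, 1500) * 0.01
print(garch_var_es(rets, omega=1e-6, a=0.08, b=0.9, nu=6))
```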
The objective of this paper is to combine a real options framework with portfolio optimization techniques and to apply this new framework to investments in the electricity sector. In particular, a real options model is used to assess the decision to adopt particular technologies under uncertainty: coal-fired power plants, biomass-fired power plants, and onshore wind mills.
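A toy sketch of the deferral option at the heart of real options analysis is given below; the investment cost, the two cash-flow states, and the probability are invented numbers, and a full model would use risk-neutral probabilities over many periods.

```python
# Toy illustration: the value of waiting before committing to a plant.
# All numbers are invented for illustration only.
def defer_option_value(invest_cost, v_up, v_down, p_up, discount):
    """Compare investing now against waiting one period and investing
    only if the favourable state is realized."""
    invest_now = max(p_up * v_up + (1 - p_up) * v_down - invest_cost, 0.0)
    wait = discount * (p_up * max(v_up - invest_cost, 0.0)
                       + (1 - p_up) * max(v_down - invest_cost, 0.0))
    return invest_now, wait

now, wait = defer_option_value(100.0, 140.0, 70.0, 0.5, 0.95)
print(f"invest now: {now:.1f}, wait: {wait:.1f}")  # waiting dominates here
```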
This paper provides a new method, which we call the “MV+CVaR approach”, for managing unexpected mortality changes underlying annuities and life insurance. The MV+CVaR approach optimizes the mean-variance tradeoff of an insurer’s mortality portfolio, subject to constraints on downside risk. We show the efficiency of our MV+CVaR mortality portfolio by conducting a detailed analysis of its performance based on moment methods and maximum entropy. Our numerical examples illustrate the superiority of the MV+CVaR approach in mortality risk management and shed new light on the natural hedging effects of annuities and life insurance.
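As a generic sketch (not the insurer-specific model of the paper), the following code minimizes a mean-variance objective subject to a scenario-based CVaR cap. The solver choice, the long-only bounds, and the tolerance of SLSQP for the non-smooth CVaR constraint are all assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import minimize

def mv_cvar_portfolio(scenarios, lam=1.0, cvar_cap=0.05, beta=0.95):
    """Mean-variance tradeoff with a CVaR constraint on scenario losses.

    `scenarios` is an (N, n) matrix of simulated per-asset returns and
    `cvar_cap` is the downside-risk budget; weights are long-only and
    sum to one.
    """
    n = scenarios.shape[1]
    mu = scenarios.mean(axis=0)
    cov = np.cov(scenarios, rowvar=False)

    def cvar(w):
        losses = -scenarios @ w
        var = np.quantile(losses, beta)
        return losses[losses >= var].mean()

    objective = lambda w: lam * (w @ cov @ w) - mu @ w
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: cvar_cap - cvar(w)}]
    w0 = np.full(n, 1.0 / n)
    res = minimize(objective, w0, bounds=[(0, 1)] * n, constraints=cons)
    return res.x

rng = np.random.default_rng(3)
scen = rng.multivariate_normal([0.010, 0.006, 0.004],
                               np.diag([16e-4, 4e-4, 1e-4]), size=2000)
print(np.round(mv_cvar_portfolio(scen), 3))
```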
The Basel II Capital Accord of 2004 sets guidelines on operational risk capital requirements to be adopted by internationally active banks by around year-end 2007. Operational loss databases are subject to minimum recording thresholds of roughly $10,000 (internal) and $1 million (external) – an aspect often overlooked by practitioners. We provide theoretical and empirical evidence that ignoring these thresholds leads to underestimation of the VaR and CVaR figures within the Loss Distribution Approach. We emphasize four crucial components of a reliable operational loss actuarial model: (1) a non-homogeneous Poisson process for the loss arrival process, (2) flexible loss severity distributions, (3) accounting for the incomplete data, and (4) robustness analysis of the model.
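The threshold effect can be illustrated with a small simulation, assuming a lognormal severity for concreteness: a naive fit to the recorded data is biased, while a likelihood conditioned on exceeding the threshold recovers the true parameters. All numbers here are invented.

```python
import numpy as np
from scipy import stats, optimize

H = 10_000.0                                   # recording threshold
rng = np.random.default_rng(4)
full = rng.lognormal(mean=9.0, sigma=2.0, size=20_000)
recorded = full[full >= H]                     # what the database sees

# Naive fit: treats the recorded data as the complete sample
naive_mu, naive_sig = np.log(recorded).mean(), np.log(recorded).std()

def neg_loglik(theta):
    """Lognormal log-likelihood conditioned on losses exceeding H."""
    mu, sig = theta
    logpdf = stats.norm.logpdf(np.log(recorded), mu, sig) - np.log(recorded)
    log_tail = stats.norm.logsf(np.log(H), mu, sig)   # P(X >= H)
    return -(logpdf - log_tail).sum()

res = optimize.minimize(neg_loglik, x0=[naive_mu, naive_sig],
                        bounds=[(0.0, 20.0), (0.1, 10.0)])
print("naive fit:", round(naive_mu, 2), round(naive_sig, 2))
print("truncated MLE:", np.round(res.x, 2))    # close to the true (9.0, 2.0)
```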
Several portfolio selection models take into account practical limitations on the number of assets to include and on their weights in the portfolio. We present a study of the Limited Asset Markowitz (LAM), Limited Asset Mean Absolute Deviation (LAMAD) and Limited Asset Conditional Value-at-Risk (LACVaR) models, in which the assets are limited through the introduction of bounds on their weights and a cardinality constraint on their number.
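A brute-force sketch of the cardinality constraint is shown below, assuming a closed-form minimum-variance solution on each subset (so short positions are allowed); realistic instances require mixed-integer programming rather than enumeration.

```python
import numpy as np
from itertools import combinations

def limited_asset_min_variance(cov, k):
    """Minimum-variance portfolio using at most k assets, by exhaustive
    enumeration of k-subsets; a sketch of the combinatorial structure
    that makes Limited Asset models hard to solve at scale."""
    n = cov.shape[0]
    best = (np.inf, None, None)
    for subset in combinations(range(n), k):
        sub = cov[np.ix_(subset, subset)]
        w = np.linalg.solve(sub, np.ones(k))
        w /= w.sum()                     # closed-form min-variance weights
        var = w @ sub @ w
        if var < best[0]:
            best = (var, subset, w)
    return best

rng = np.random.default_rng(5)
A = rng.normal(size=(6, 6))
cov = A @ A.T / 6 + np.eye(6) * 0.01
var, subset, w = limited_asset_min_variance(cov, 3)
print(subset, np.round(w, 3), round(var, 4))
```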
Consider a dynamic decision-making model under risk with a fixed planning horizon, namely the dynamic capacity control model. The model describes a firm, operating in a monopolistic setting, that sells a range of products consuming a single resource. Demand for each product is time-dependent and modeled by a random variable. The firm controls the revenue stream by allowing or denying product requests.
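For the risk-neutral baseline of this model class, the value function can be computed by backward induction, accepting a request exactly when its price beats the marginal value of remaining capacity; the horizon, capacity, prices, and arrival probabilities below are illustrative assumptions, and the paper's risk-averse variant would change the recursion.

```python
import numpy as np

def capacity_control(T, C, prices, probs):
    """Backward induction for single-resource dynamic capacity control.

    In each of T periods at most one request arrives (product j with
    probability probs[j] and price prices[j]); C is initial capacity.
    """
    V = np.zeros((T + 1, C + 1))          # V[t, c]: value with c units left
    for t in range(T - 1, -1, -1):
        for c in range(C + 1):
            val = V[t + 1, c]
            if c > 0:
                # Marginal (opportunity) value of one unit of capacity
                delta = V[t + 1, c] - V[t + 1, c - 1]
                for p, q in zip(prices, probs):
                    val += q * max(p - delta, 0.0)   # accept iff p >= delta
            V[t, c] = val
    return V

V = capacity_control(T=50, C=10, prices=[100.0, 60.0], probs=[0.2, 0.5])
print(round(V[0, 10], 1))                 # expected revenue from the start
```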
In this paper we consider the sensitivity of portfolio optimization results when different measures of risk, such as the standard deviation of portfolio rates of return, portfolio VaR, and CVaR, are minimized. The conditioning of the data, represented by the spectral condition index of the correlation matrix of the rates of return, is shown to play a crucial role in describing the properties of the models. We report on research conducted for the 13 largest firms on the Warsaw Stock Exchange.
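The spectral condition index itself is straightforward to compute, as the following sketch (with simulated, deliberately collinear returns standing in for the stock data) shows.

```python
import numpy as np

def condition_index(returns):
    """Spectral condition index of the correlation matrix of returns:
    the ratio of its largest to smallest eigenvalue. Large values flag
    the ill-conditioning that destabilizes minimum-risk weights."""
    corr = np.corrcoef(returns, rowvar=False)
    eig = np.linalg.eigvalsh(corr)        # ascending eigenvalues
    return eig[-1] / eig[0]

rng = np.random.default_rng(6)
common = rng.normal(size=(500, 1))
# Highly collinear assets -> near-singular correlation matrix
rets = common + rng.normal(scale=0.05, size=(500, 13))
print(round(condition_index(rets), 1))
```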
The article is dedicated to the optimization of credit risk through the application of Conditional Value at Risk (CVaR). CVaR is a risk measure defined as the expected loss exceeding Value-at-Risk; it is also known as Mean Excess, Mean Shortfall, or Tail VaR. The link between credit risk and the current financial crisis accentuates the importance of measuring and predicting extreme credit risk. Conditional Value at Risk has become an increasingly popular method for the measurement and optimization of extreme market risk. The model can be used to rebalance all positions in a portfolio of financial instruments so as to minimize CVaR subject to trading and return constraints. The credit risk distribution is created by Monte Carlo simulations, and the optimization problem is solved efficiently by linear programming. We apply these CVaR techniques to the optimization of credit risk for a portfolio of selected bonds.
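A sketch of the underlying Rockafellar-Uryasev linear program follows, using scipy's linprog with long-only weights and a return floor. The Gaussian scenario matrix and the target return are illustrative stand-ins; the paper's Monte Carlo credit-loss scenarios would take their place.

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_lp(scenarios, r_target, beta=0.95):
    """Minimize portfolio CVaR over return scenarios via the
    Rockafellar-Uryasev LP. Decision variables: (w_1..w_n, zeta, u_1..u_N),
    where u_i are the scenario excess losses over the VaR proxy zeta."""
    N, n = scenarios.shape
    mu = scenarios.mean(axis=0)
    # Objective: zeta + (1 / ((1 - beta) N)) * sum(u)
    c = np.concatenate([np.zeros(n), [1.0],
                        np.full(N, 1.0 / ((1 - beta) * N))])
    # u_i >= -r_i.w - zeta   <=>   -r_i.w - zeta - u_i <= 0
    A_ub = np.hstack([-scenarios, -np.ones((N, 1)), -np.eye(N)])
    b_ub = np.zeros(N)
    # Return floor: mu.w >= r_target   <=>   -mu.w <= -r_target
    A_ub = np.vstack([A_ub, np.concatenate([-mu, [0.0], np.zeros(N)])])
    b_ub = np.append(b_ub, -r_target)
    A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(N)]).reshape(1, -1)
    bounds = [(0, 1)] * n + [(None, None)] + [(0, None)] * N
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=bounds)
    return res.x[:n], res.fun             # optimal weights and CVaR

rng = np.random.default_rng(7)
scen = rng.multivariate_normal([0.004, 0.002],
                               [[4e-4, 1e-4], [1e-4, 1e-4]], size=2000)
w, cvar = min_cvar_lp(scen, r_target=0.003)
print(np.round(w, 3), round(cvar, 4))
```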
Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) are two risk measures widely used in the practice of risk management. This paper deals with the problem of computing both VaR and CVaR using stochastic approximation (with decreasing steps): we propose a first Robbins-Monro procedure based on the Rockafellar-Uryasev identity for the CVaR. The convergence rate of this algorithm to its target satisfies a Gaussian Central Limit Theorem. As a second step, in order to speed up the initial procedure, we propose a recursive importance sampling (IS) procedure which induces a significant variance reduction of both the VaR and CVaR procedures. This idea, which goes back to the seminal paper of B. Arouna, follows a new approach introduced by V. Lemaire and G. Pagès. Finally, we consider a deterministic moving risk level to speed up the initialization phase of the algorithm. We prove that the convergence rate of the resulting procedure is ruled by a Central Limit Theorem with minimal variance, and its efficiency is illustrated on several typical energy portfolios.
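A plain sketch of the base Robbins-Monro recursion (without the importance sampling and moving risk level refinements) is given below. The step-size exponent, the iteration count, and the Gaussian test loss are assumptions of this sketch.

```python
import numpy as np

def robbins_monro_var_cvar(sampler, alpha=0.95, n_iter=200_000, gamma0=1.0):
    """Stochastic approximation of VaR and CVaR of a loss distribution,
    based on the Rockafellar-Uryasev representation: VaR minimizes
    xi + E[(X - xi)_+] / (1 - alpha), and CVaR is the minimum value.

    `sampler()` draws one i.i.d. loss per call.
    """
    xi, cvar = 0.0, 0.0
    for k in range(1, n_iter + 1):
        x = sampler()
        step = gamma0 / k ** 0.75         # decreasing step sequence
        # Stochastic gradient of the Rockafellar-Uryasev objective in xi
        grad = 1.0 - (x > xi) / (1.0 - alpha)
        xi -= step * grad                 # VaR recursion
        # Companion averaging recursion for the CVaR
        est = xi + max(x - xi, 0.0) / (1.0 - alpha)
        cvar += (est - cvar) / k
    return xi, cvar

rng = np.random.default_rng(8)
var, cvar = robbins_monro_var_cvar(lambda: rng.normal(), alpha=0.95)
print(round(var, 3), round(cvar, 3))      # ~1.645 and ~2.063 for N(0, 1)
```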