
Wednesday, September 7

Value-at-Risk: Introduction

Slides from an introductory lecture on Value-at-Risk, courtesy of a friend.

VALUE-AT-RISK
1. What is VaR – and what it’s not
  • Assume the trading position is composed of one long Q2 forward contract
  • The price of the contract is 27 euros and the underlying value is 58,968 euros – is this the total amount at risk?
  • Is the risk the collateral requirement of the position?
  • You have a sell stop-order in the market at 26 euros. Is the risk the one-euro distance to the stop, i.e. 2,184 euros?
  • What if you are additionally short a Q3 forward – is the total risk 123,883 euros? Or zero?
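  • (For scale: the figures above imply a contract volume of 58,968 € / 27 €/MWh = 2,184 MWh, presumably one quarter of hourly delivery at 1 MW, so the one-euro move from 27 to the 26-euro stop corresponds to 1 € x 2,184 = 2,184 euros.)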

  • VaR measures the largest expected loss over a certain period of time under normal market conditions at a certain confidence level.
  • A company can say that its daily trading VaR equals €1 million at a 99% confidence level
  • This means that under normal market conditions a loss larger than €1 million would on average happen once every hundred days.

  • We’ll keep the presentation short and quickly calculate the correct VaR for the single long forward contract.
  • We need the historical loss that is seen only once in a hundred days
  • Solution: we take 1000 daily forward returns, put them in ascending order and pick the tenth largest loss day (-4.65%)
  • We do the same for 5% VaR and pick the 50th largest loss, -1.96%
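As a quick sketch of that percentile pick (the return series below is simulated stand-in data with roughly the right scale, not the lecture's actual forward returns):

import numpy as np

# Stand-in daily forward returns; the lecture used ~1000 historical observations.
np.random.seed(0)
returns = np.random.standard_t(df=4, size=1000) * 0.013   # fat-tailed placeholder data

# Historical-simulation VaR: sort the returns and pick the k-th worst observation.
sorted_returns = np.sort(returns)      # ascending, so the worst losses come first
var_1pct = -sorted_returns[9]          # 10th largest loss out of 1000 -> 1% VaR
var_5pct = -sorted_returns[49]         # 50th largest loss out of 1000 -> 5% VaR

print(f"1% one-day VaR: {var_1pct:.2%}")
print(f"5% one-day VaR: {var_5pct:.2%}")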

The bad:
  • Assumption: returns normally distributed
  • Assumption: the correlations between different instruments are assumed to be stable (we only had one instrument in our example, and the estimate still broke down)
  • Generally, tomorrow is assumed to be like yesterday
  • Nonlinear instruments (e.g. options) do not fit into the model
  • There is no single, official, correct VaR-method. This makes comparing the models or using the model outputs difficult
  • VaR that deals with these weaknesses is bound to become complicated, heavy and still rely on other assumptions.

The good:
  • VaR is better than nothing
  • VaR is better than the alternatives, especially when it is combined with limits and scenario analysis
  • Portfolio theory is comfortable only with a normally distributed world. More advanced VaR models do not suffer from this limitation
  • Portfolio theory would consider playing the lottery very risky, since the potential payoff makes the results volatile. VaR treats risk as the possibility of losses, not profits
  • VaR is a simple dashboard figure that investors and regulators want and even management can understand. Like it or not, VaR has to be measured


2. VaR-methods
  • Sharpe, Markowitz et al. experimented with rudimentary VaR calculations as early as the 1950s, but the lack of computational power made practical applications unrealistic
  • The end of Bretton Woods in 1971, the oil crisis, inflation, interest-rate volatility and government debt created new markets – derivatives and their pricing were "discovered": leverage became possible
  • Bank risk profiling was old-fashioned – +500 crude oil futures and -200 pork belly futures cannot simply be added together with a delta method
  • By 1993 several banks had developed proprietary VaR-methods. After large derivative losses at many corporations and banks, J.P. Morgan published its own RiskMetrics VaR
  • The RiskMetrics VaR documentation and factor correlation data were free. JPM's idea was not to become a service provider; instead it wanted to trade derivatives with others

  • Variance-covariance method (VCV i.e. delta-normal)
  • Historical simulation
  • Monte Carlo simulation

  • VCV-method: calculate historical returns and variances and covariances for all the instruments in the portfolio.
  • Instead of calculating the full covariance matrix, the problem can be solved by using only a few variables (factors). In classical portfolio theory only the "beta", i.e. the co-movement of a stock's price with a benchmark index, is calculated, instead of all the covariances between all the individual stocks.
  • These factors are identified and selected with the help of cluster analysis and PCA (principal component analysis)
  • Assumption – risk factors (and instrument price returns) are multivariate normal and the price of the portfolio is linearly dependent on these factors
  • Assumption 2 – the historical numbers give a good approximation of the future
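A minimal sketch of the delta-normal calculation under those assumptions; the two return histories, the euro exposures and the 2.33 critical value below are illustrative placeholders, not figures from the slides:

import numpy as np

np.random.seed(1)
# Hypothetical daily return histories for two instruments (think Q2 and Q3 forwards).
r1 = np.random.normal(0.0, 0.013, 1000)
r2 = 0.9 * r1 + np.random.normal(0.0, 0.006, 1000)
returns = np.column_stack([r1, r2])

positions = np.array([58968.0, -60000.0])        # euro exposures: long one, short the other

cov = np.cov(returns, rowvar=False)              # variance-covariance matrix from history
pnl_std = np.sqrt(positions @ cov @ positions)   # one-day P&L standard deviation in euros

var_99 = 2.33 * pnl_std                          # normal critical value at the 99% level
print(f"99% one-day delta-normal VaR: {var_99:,.0f} EUR")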

  • Expected return can be assumed to be an average of historical returns (?)
  • Expected returns are usually not an issue, since they are a relatively small source of risk compared to variance of returns
  • Variance can be estimated in many ways:
  • long term average
  • moving average (accounts for heteroscedasticity)
  • GARCH (also accounts for mean reversion)
  • EGARCH (also accounts for asymmetrical variance response)
  • FIGARCH (also accounts for long-term ’memory’ effects)
  • etc., etc.
  • Variance clusters and is predictable to a certain extent
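As one concrete example of the moving-average idea, here is a sketch of an exponentially weighted variance estimate; the 0.94 decay factor is the classic RiskMetrics choice, not a figure taken from the slides:

import numpy as np

def ewma_variance(returns, lam=0.94):
    # Exponentially weighted moving-average variance (RiskMetrics-style decay).
    var = np.var(returns[:20])                   # seed with a plain sample variance
    for r in returns[20:]:
        var = lam * var + (1.0 - lam) * r ** 2   # recent squared returns get more weight
    return var

np.random.seed(2)
returns = np.random.normal(0.0, 0.013, 1000)     # stand-in return series
print(f"EWMA daily volatility estimate: {np.sqrt(ewma_variance(returns)):.2%}")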

  • Q3-07 and Q2-07 forwards' price correlation is 0.98 and return correlation 0.88

  • Great, the example position on slide 3 (long/short) has a very small risk!

  • Traditional VaR agrees with our intuition and we get a small number

  • Q3-07 and Q2-07 forwards' price correlation is 0.98 and return correlation 0.88

  • Even a high correlation does not explain everything: R² = 0.88² ≈ 0.77

  • Short-term correlation is very volatile: 10-day corr varies between 0.1 and 1.

  • Naive VCV-thinking gives us a VaR-number that is too low

  • Four scatter plots can look completely different – well-behaving correlation, a nonlinear relationship, perfect correlation with an outlier, zero correlation with an outlier – yet all four have the same mean, standard deviation and correlation.


Open questions
  • Length of the time period for the VCV calculation? If the time span is short, results are volatile; if long, changes are not picked up
  • Period of time chosen? Is the period representative? Will it be different now?
  • Factors or individual instruments?
  • By using factors, one could pay more attention to making sure they are "well-behaved" and "representative" of the issues – but this could also backfire
  • Classic case GBP/DEM in 1992, before devaluation – should you use forward or spot rates in risk calculations, does history have any value empirically?
  • VCV can be calculated in real time

  • We’ll take four years of daily market data, get approximately 4 x 52 x 5 = 1000 data points for the value of the position
  • Transparent like no other, least amount of assumptions – except that tomorrow will be like yesterday
  • No assumptions of a normal distribution or a stable VCV-matrix
  • Still one has to decide how long and which period to use
  • Does not solve the problem of instruments that have not been traded before (e.g. new futures and option series)
  • With modern computing, results are available in real time.

  • Decide how many times paths are iterated (N)
  • Create market models for all instruments (or factors) and simulate imaginary daily price changes
  • Calculate change in portfolio’s value, given the simulated prices
  • Repeat N times
  • End result is N number of portfolio values, where one can easily locate the 1% VaR figure.
  • MC is only as good as the market model. If VCV cannot be used because of a lack of historical data, heavy non-normality or the existence of options in the portfolio, MC is the only way to go.
  • MC is like the historical simulation, but with made-up data
  • Computationally the heaviest; in practice not available in real time
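A minimal Monte Carlo sketch under those caveats; the joint-normal market model, the 0.88 correlation and the euro exposures below are placeholders for illustration:

import numpy as np

np.random.seed(3)
N = 100_000                                   # number of simulated one-day scenarios

# Toy market model: jointly normal daily returns for two instruments.
vols = np.array([0.013, 0.012])
corr = np.array([[1.0, 0.88], [0.88, 1.0]])
cov = np.outer(vols, vols) * corr
simulated = np.random.multivariate_normal(np.zeros(2), cov, size=N)

positions = np.array([58968.0, -60000.0])     # illustrative euro exposures
pnl = simulated @ positions                   # simulated one-day P&L per scenario

var_99 = -np.percentile(pnl, 1)               # 1% quantile of the P&L distribution
print(f"99% one-day Monte Carlo VaR: {var_99:,.0f} EUR")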


3. VaR-model: creation
    Whatever method you use, the same grunt work has to be done:
  • Identification and classification of historical price changes or factors, estimating the parameters of their joint distribution
  • Defining the position to be measured and pricing it
  • Getting the VaR metrics out by combining the two

  • Position mapping is critical – but often thought to be a secondary issue
  • Should one include whole firm, risk management or only trading positions? Should they be calculated separately or combined?
  • What is ’position’? Long-term financing costs, credit risks, operative risks?
  • How to include nonlinear instruments and other exotic features?
  • When to calculate VaR? A daily figure calculated at the end of the day? How about intraday positions and intraday risks for longer positions?

  • Factors or instruments?
  • Length and choice of data period, possible weighting scheme to give greater weight to recent data over older data
  • VCV-matrix calculated from historical, implied or econometric models?
  • How to accommodate the weak predictability of covariance estimates – confidence intervals, statistical significance, seasonal models?
  • Can the system handle non-normal distributions or nonlinearity and should this be accommodated for when planning the data collection?

  • Naturally VaR-model should be tested before implementation
  • The only practical way of testing is historical simulation
  • The VaR-model could be "trained" with older data, then checked on an out-of-sample period (e.g. how often the 1% threshold is breached)
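A sketch of that out-of-sample check; the 750-day training window, the 1% level and the simulated data are assumptions made for illustration:

import numpy as np

def backtest_exceedances(returns, train=750, level=0.01):
    # "Train" a historical-simulation VaR on the older data...
    var = -np.quantile(returns[:train], level)
    # ...then count how often the out-of-sample losses breach it.
    out_of_sample = returns[train:]
    breaches = int(np.sum(out_of_sample < -var))
    expected = level * len(out_of_sample)
    return breaches, expected

np.random.seed(4)
returns = np.random.standard_t(df=4, size=1000) * 0.013   # stand-in data
breaches, expected = backtest_exceedances(returns)
print(f"Observed breaches: {breaches}, expected: {expected:.1f}")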


4. VaR-metrics and their use
  • VaR: the usual maximum expected loss figure
  • Conditional VaR (Expected Tail Loss, Expected Shortfall): when losses go beyond VaR-limit, how much they are on average
  • Minimizing CVaR also minimizes VaR
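A short sketch of the difference between the two figures, computed on simulated stand-in returns (not the lecture's data):

import numpy as np

np.random.seed(5)
returns = np.random.standard_t(df=4, size=1000) * 0.013   # stand-in daily returns

var_99 = -np.quantile(returns, 0.01)          # 99% historical-simulation VaR
tail = returns[returns <= -var_99]            # days where losses reach or exceed the VaR limit
cvar_99 = -tail.mean()                        # average loss beyond the VaR threshold

print(f"99% VaR:  {var_99:.2%}")
print(f"99% CVaR: {cvar_99:.2%}")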

  • Profit/VaR (vs. the Sharpe ratio's Profit/Standard Deviation)
  • Marginal VaR: if you add €1 to one portfolio component, how much your VaR changes
  • Incremental VaR: The change in VaR from adding a position to the portfolio
  • Component VaR: The change in VaR from removing a position from the portfolio
  • PaR: Profit-at-Risk
  • Relative VaR
  • Cash Flow at Risk
  • EBITDA at Risk
  • Long Term VaR
  • Short Term VaR
  • Trading VaR/trading
  • Stop loss x VaR
  • Counterparty VaR

  • In practice a well-modeled VaR figure at 99% level means that in a normal year losses larger than the VaR figure are met on average 2.5 times.
  • The higher the probability figure, the smaller the tail section that is under examination. The further one goes down the tail, the less experience (and data) there is, and one should be less confident in the resulting loss estimates.
  • Rule of thumb: the VaR-period (e.g. 1 or 10 days) should be selected so that usually there are no major changes in the portfolio during that time. There is no meaningful way of calculating a daily VaR for a high-frequency operation
  • With a selection of a longer VaR-period, one ends up with less data


5. Advanced models
  • On slide 4 we noticed that the assumption of normality leads to VaR estimates that are too low
  • The Cornish-Fisher expansion adjusts the critical value:
  • z(cf) = z(c) + 1/6 * (z(c)^2 - 1) * S + 1/24 * (z(c)^3 - 3*z(c)) * K - 1/36 * (2*z(c)^3 - 5*z(c)) * S^2
  • where z(cf) is the Cornish-Fisher critical value
  • z(c) is the critical value for the probability 1 − α assuming normality (-2.33 for the 99% level)
  • S is skewness and K is kurtosis (in the standard formulation, excess kurtosis)
  • Now for the 99% case we get a critical value further out in the tail (-5.6 instead of -2.33), and the VaR estimate becomes 5.6 x 1.28% = 7.18% instead of 2.33 x 1.28% = 2.98%.
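A sketch of the adjustment; the skewness and excess-kurtosis inputs below are made up so that the adjusted critical value lands near the slide's -5.6, they are not the lecture's actual estimates:

from scipy.stats import norm

def cornish_fisher_z(alpha, S, K):
    # Cornish-Fisher adjusted critical value; K is excess kurtosis.
    z = norm.ppf(alpha)                        # about -2.33 for alpha = 0.01
    return (z
            + (z**2 - 1) * S / 6
            + (z**3 - 3*z) * K / 24
            - (2*z**3 - 5*z) * S**2 / 36)

z_cf = cornish_fisher_z(0.01, S=-1.5, K=13.0)            # illustrative skew/kurtosis
print(f"Adjusted critical value: {z_cf:.2f}")            # about -5.6
print(f"Adjusted 99% VaR: {abs(z_cf) * 0.0128:.2%}")     # daily sigma of 1.28% as on the slide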

  • Cornish-Fisher allows working with non-normal distributions as long as there are no other hidden surprises besides non-normal kurtosis and skewness.
  • How to include options and other non-linear instruments?
  • Quadratic (or Delta-Gamma) VaR
  • Beyond the scope of this presentation

6. Stress tests and scenario analysis
  • The weaknesses of VaR are well known, also by the regulators. Even a weak VaR-model can measure risks reasonably well in a 'normal' environment, but what if something strange happens?
  • Scenario analysis is a close relative to historical analysis. One selects a bad historical event and sees how the portfolio and the VaR measure would have worked.
  • Stress test is a self-made scenario analysis
  • Anything, not only price changes, can be included in the stress test (e.g. liquidity desert, making opening or closing of positions impossible or very expensive)
  • Correlations moving to -1 or +1
  • Increases in volatility
  • Changes in forward curves
  • The most famous recent stress tests have been the infamous European bank tests (which did not even include the possibility of a sovereign failure)
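A toy example of the kind of stress test these bullets describe, revaluing a two-instrument parametric VaR under shocked correlation and volatility; every number here is a placeholder, not taken from the slides:

import numpy as np

positions = np.array([58968.0, -60000.0])      # illustrative euro exposures (long/short forwards)
base_vols = np.array([0.013, 0.012])           # base one-day volatilities

def parametric_var(vols, corr, positions, z=2.33):
    cov = np.outer(vols, vols) * corr          # rebuild the covariance matrix from vols and correlation
    return z * np.sqrt(positions @ cov @ positions)

base_corr   = np.array([[1.0, 0.88], [0.88, 1.0]])
stress_corr = np.array([[1.0, 0.10], [0.10, 1.0]])   # the hedge's correlation breaks down
stress_vols = base_vols * 2.0                        # volatilities double

print(f"Base VaR:     {parametric_var(base_vols, base_corr, positions):,.0f} EUR")
print(f"Stressed VaR: {parametric_var(stress_vols, stress_corr, positions):,.0f} EUR")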

7. Finally
  • In 2006 the energy industry spent $4.4 billion on portfolio and risk management systems
  • In 2007 an estimated $5.25 billion was spent (thank you, Amaranth!) (Carbon360 survey)
  • One third of hedge funds run their risk management in Excel – perhaps because they know that fancier tools aren't more effective, or because they want to control what they show to investors
  • Nordic SPAN uses VaR-based calculations for margin requirements
  • All the big participants use VaR, for internal and regulatory purposes
  • Large players (hedge funds) demand and get VaR-based margin practices from their prime brokers

1. Conglomerate or departmental risk management
  • bottom-up or top-down
2. Official vs. internal-only
  • Interpreting the greeks, using and calculating volatility- and distribution forecasts
3. Market risk
  • Length of sample, estimating the variance-covariance matrix, non-normality
4. Credit risk
  • using credit derivatives
5. Liquidity risk
  • Still poorly understood, usually integrated into market risk
6. Operative risk
  • Legal, pricing and model risks, rogue dealers and risk managers. No standard practice, view or certainty
7.  Nonlinear instruments
  • Delta-method, delta-gamma-method, full revaluation (Monte Carlo)
8.  Estimating volatilities
  • Volatility of volatility? Historical, implied, econometric? Volatility curve and smile?
9.   Estimating correlations
  • Same problems as with volatility estimations. Correlation derivatives could help. Time synchronization issues in products traded in different time zones

The managing director wants to hire someone who can answer the question "how much is 2 + 2?"
The engineer uses a slide rule and states it is between 3.98 and 4.02.
The mathematician guarantees that she can prove it is 4 after two hours of non-trivial calculations.
The physicist, by means of deduction, decides the magnitude of the answer is 1 x 10^1.
The logician, after thinking for hours, says that the problem is solvable.
The social welfare professional apologizes for his lack of knowledge, but wants to say it is good that such an important topic was brought forward.
The lawyer remembers a previous case where the answer was 4.
The trader wants to know, before answering, whether you are looking to buy or sell.
The risk professional gets up from the chair, checks the aisle to make sure nobody can hear, and whispers in the manager's ear: "what do you want it to be?"

www.gloriamundi.org
The largest source of VaR material
www.rhoworks.com
VaR-program for testing
www.riskmetrics.com
The oldest VaR vendor; the official RiskMetrics manuals, online courses etc. are valuable material for learners
www.riskglossary.com

Jorion, Philippe, Value at Risk: The New Benchmark for Managing Financial Risk (2001)
The benchmark book
Holton, Glyn A., Value-at-Risk: Theory and Practice (2003)
Some think this is the best out there (www.value-at-risk.net)
Dowd, Kevin, Beyond Value at Risk – The New Science of Risk Management (2003)
Lots of material, but not very technical. Good for beginners who want an overview of the topic
Javanainen, Timo, Analytical Delta-Gamma VaR Methods for Portfolios of Electricity Derivatives (2004)