Exponential Smoothing Epidemia / Jamila Awad

Exponential Smoothing Epidemia:

Financial Risk Management Swift in Modern Banking System

Author

Jamila Awad

Rights Reserved

JAW Group

Date

October 2012

Executive Summary

The paper depicts a guideline to frame swift financial risk management in the modern banking system in accordance with supervision authorities. The framework is segmented into distinctive components that address the fundamental procedures for establishing a coherent risk management setting. Thus, risk parameters are modeled into stress scenario attempts, pricing techniques dictate accurate estimates, and risk statistic schematizations elaborate a reference point for performance. Lastly, financial reporting instructions establish concise guidelines for data interpretation and visualization artifacts.
Introduction

The proliferation of financial risk management methodologies to smooth modern banking operations and safeguard investor confidence revives an epidemic fear of anarchy in society. Financial market meltdowns condemn money game institutions and force rejuvenated fund infrastructures to shield depositors from abrasive banking practices. The entanglements of commercial exchange necessitate efficient, transparent banking transactions. The blossoming of complex financial instruments classified as derivatives eclipses the existing clientele and charms newcomers to the modern banking system. In addition, engineered techniques empower financial risk management practitioners to minimize risk exposure and secure depository contributions.

International economic regulators continuously collaborate with various financial participants to promote optimal banking establishments. Furthermore, integrated supervision committees coalesce harmonious global political relationships to systemize a groundbreaking prudential banking epoch. In 1994, J.P. Morgan pioneered the widespread RiskMetrics framework, which remains subject to sequenced revitalization.

The rationale behind refinement consists of embedding behavioral finance knowledge into time series to properly address financial risk approximations over lengthy time horizons. In addition, the requirements direct financial institutions to implement a universal standardization of risk management procedures to induce competitive battlegrounds in a rigid, scrupulously monitored international banking setting. Thus, designated authorities will magnify convergence in supervisory practice.

The essence of the paper aims to render a softer financial risk measurement pamphlet without compromising adroitness in execution. The dissertation certainly remonstrates against indulging in a critical approach, and rather submits a contributive capitulation of financial risk measurement guidelines for the modern banking system.

The discourse is segmented into four sections condensing key techniques to vaccinate a financial risk management prototype. The first section elaborates on risk quantum techniques by defining risk parameters. The second section presents pricing standards to assess variable arrangements that quote valuation estimates and calibrate financial instruments. The third section describes risk statistic tests in compliance with the coherent properties that sustain sound risk measurement procedures. The final section presents exemplary financial reporting standards and data visualization techniques to promote wise financial risk decisions.

1. Risk Quantum Techniques

1.1 Risk Parameters

Risk parameters portray factors that underlie security prices and reverberate through portfolio values. Banking risk management focuses on constructing scenarios for individual risk parameters to conjecture plausible portfolio value changes that enumerate all probable outcomes. The assumptions retained for each risk factor induce the various risk measurement models elaborated in distinctive sections: a supposition about past behavior compared to an assumption about the credible future value of a risk parameter.

The grouped elements present compelling risk factors retained to engineer premises:

  • Commodity prices: Spot commodity prices settle immediate commodity transactions, whereas futures prices sketch maturity curves with precise delivery dates.
  • Equity prices: Equity vulnerability is depicted by a time series of prices or mapped to a suitable index. Financial instruments holding a price history are also modeled like equity prices.
  • Foreign exchange rates: Spot foreign exchange rates entail the risk of cash positions in foreign currency. Interest rates and foreign exchange rates are juxtaposed to introduce forward currency prices.
  • Interest rates: Zero-coupon bond prices derive from interest rates. The factors triggering fixed income exposure are hence characterized by zero-coupon bonds.

The risk parameters are then modeled through distributional suppositions and historical dispersions to generate profit & loss scenarios.

1.2 Models Founded on Distributional Suppositions

Models founded on distributional suppositions decipher three prototypes:

1.2.1 The Multivariate Normal Model for Returns

The template depicting a multivariate normal model for returns infiltrates newly arrived information to update return volatility estimates, thus alleviating the preponderance of old observations with time. The model vulgarizes that the future distribution of returns is governed by volatility appraisals because volatility persists larger than the expected return at short horizons. The EWMA summarizes the foundations of the model and its suppositions, incorporating volatility stochasticity, whereas daily returns fit a normal distribution and time independence (Reference 1.2.1).
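As a concrete illustration, the EWMA volatility recursion can be sketched in a few lines of Python; the decay factor lambda = 0.94 is the classic RiskMetrics setting for daily returns and is an assumption here, not a figure taken from the text:

```python
def ewma_variance(returns, lam=0.94):
    """EWMA update: new information enters with weight (1 - lam), so the
    influence of old observations decays geometrically with time."""
    var = returns[0] ** 2              # seed with the first squared return
    for r in returns[1:]:
        var = lam * var + (1.0 - lam) * r ** 2
    return var

daily_vol = ewma_variance([0.010, -0.020, 0.015]) ** 0.5
```

The recursion makes explicit why recent returns dominate the estimate: an observation k days old carries a weight proportional to lam**k.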

1.2.2 Monte Carlo Modeling

The Monte Carlo technique initiates random scenarios based on independent Brownian increments, where a series of correlated factors is induced from a linear sequence of independent variable series. In addition, the daily returns obey a multivariate normal distribution characterized by a mean equaling zero and a covariance matrix Σ. Afterwards, a Cholesky decomposition or a singular value decomposition produces a matrix C and its transpose, noted Cᵀ (Trefethen & Bau, 1997). Possible outcomes of grouped returns for multiple risk parameters are initiated from the factorization Σ = CᵀC. Independent standard normal variables are thereby converted into a series of correlated returns. Individual financial instruments are then priced from daily scenarios to release profit & loss portfolios.
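A minimal sketch of the scenario generation step, using a hand-rolled Cholesky factor for a 2x2 covariance matrix (the matrix size and the example figures are assumptions for illustration):

```python
import math
import random

def cholesky_2x2(cov):
    """Lower-triangular factor L of a 2x2 covariance matrix, so that
    L multiplied by its transpose reproduces the matrix."""
    l11 = math.sqrt(cov[0][0])
    l21 = cov[1][0] / l11
    l22 = math.sqrt(cov[1][1] - l21 ** 2)
    return [[l11, 0.0], [l21, l22]]

def correlated_returns(chol, rng):
    """Convert two independent standard normals into correlated daily returns."""
    z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    return [chol[0][0] * z1, chol[1][0] * z1 + chol[1][1] * z2]

chol = cholesky_2x2([[0.04, 0.02], [0.02, 0.09]])
scenario = correlated_returns(chol, random.Random(7))
```

Each generated scenario then feeds the pricing functions of the individual instruments to produce one profit & loss observation.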

1.2.3 Parametric Approach

The parametric methodology summarizes a compromise between accuracy and speed because the technique delivers quicker but less meticulous results. It incorporates linear estimates of individual financial instrument pricing, mapped into a first-order Taylor series expansion, to assess Value-at-Risk (VaR). The following step effectuates a delta approach to evaluate the sensitivity of present values to changes in risk parameters, whereas logarithmic returns are assumed normally distributed. Finally, individual assets in the portfolio are weighted to sample a linear series of risk factor returns. Independent delta equivalents are afterwards conglomerated to trace profit & loss portfolio values.
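Once the delta equivalents and the covariance matrix of risk factor returns are in hand, the delta approach collapses to a quadratic form; in this sketch the 95% one-sided normal quantile z = 1.645 is an assumed confidence level:

```python
def parametric_var(deltas, cov, z=1.645):
    """Delta-normal VaR: the portfolio P&L variance is the quadratic form
    delta' * cov * delta, and VaR is a normal quantile of its square root."""
    n = len(deltas)
    variance = sum(deltas[i] * cov[i][j] * deltas[j]
                   for i in range(n) for j in range(n))
    return z * variance ** 0.5
```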

1.3 Technique Founded on Historical Dispersions

Empirical figures for individual risk parameters are gathered to trace the shape of the distribution from the frequency of observations. Furthermore, historically noted risk factors are supposed to be analogously distributed and independent, harmonizing an identical distribution relevant to the forecast horizon. Likelihoods are then enumerated from the empirical sample frame selected to trace the historical distribution of risk parameters. Past observations are scaled by an approximation of their volatility to palliate long and short sample periods (Hull & White, 1998). In practice, empirical simulations are achieved from a matrix of past return samples and applied to risk parameters, hence producing profit & loss portfolio pricing scenarios. The drawback of the historical methodology relies on the employment of overlapping returns, which generates artificial autocorrelation and instigates a bias in the approximations. On the other hand, actual empirical returns encapsulate the fat-tailed distribution often detected in a risk parameter return dispersion.
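The volatility scaling step cited from Hull & White (1998) might be sketched as follows; the function and argument names are illustrative:

```python
def scale_past_returns(returns, past_vols, current_vol):
    """Rescale each historical return by the ratio of today's volatility
    estimate to the volatility prevailing when the return was observed,
    palliating regime differences across the sample period."""
    return [r * current_vol / v for r, v in zip(returns, past_vols)]

scaled = scale_past_returns([0.02, -0.01], [0.01, 0.02], 0.02)
```

The rescaled series is then used in place of the raw returns when tracing the empirical profit & loss distribution.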

The elaborated methodologies dictate computational procedures to trace profit & loss portfolio scenarios for anticipated risk parameters. The next component projects stress tests chaperoning financial risk measurement methods to buffer intrinsic blemishes in statistical techniques.

1.4 Stress Scenario Attempts

Stress tests investigate a panoply of meagerly probable outcomes reposing outside the statistical model forecast. These outcomes derive from various causes: alterations in investors' risk aversion, natural disasters, political insecurity, hazardous threats on currencies, or any major occurrence shifting monetary policy. Stress scenario attempts are performed in stages: plausible scenarios accentuating the potential downsides of portfolios under restricted market conditions are firstly selected and secondly revalued marking-to-market to englobe all risk parameters and calculate profit & loss values.

A standard approach summarizes three alternatives to choose scenarios:

1.4.1 Empirical Premise

The empirical premise selects data from a specified time window to mimic past events following a turbulent episode shifting market variables. The logarithms of historical returns are retained to diagnose profit & loss portfolios. The empirical stress outcomes characterize the resulting changes in portfolio values.
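For a simple linear portfolio, the empirical revaluation step can be sketched as below; the positions, prices and chosen historical log returns are illustrative inputs:

```python
import math

def stress_pnl(positions, prices, log_returns):
    """Shock each price by a log return drawn from the selected turbulent
    window, then report the resulting change in portfolio value."""
    base = sum(q * p for q, p in zip(positions, prices))
    stressed = sum(q * p * math.exp(r)
                   for q, p, r in zip(positions, prices, log_returns))
    return stressed - base
```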

1.4.2 User-Defined Plain Setting

Empirical stress scenarios become more useful when adjoined to a user-defined plain setting, hence bridging all potential stress suppositions and merging macroeconomic financial information. The method revalues profit & loss portfolio figures following a modification of a risk parameter; however, it omits to juxtapose the effect of the alteration on the behavior of the remaining risk factors.

1.4.3 User-Defined Forecast Scenario

The user-defined forecast scenario exhibits the correlation between all risk parameters to expose realistic stress settings. It unravels the interconnection between market variables by modifying a risk parameter figure and anticipating the changes in peripheral risk factors. The procedure implies a multivariate regression to exhibit relevant information about sensitivity prices following a shift in peripheral risk parameters and interpret profit & loss portfolio figures.

In summary, stress scenario attempts channel statistical techniques, study the movement of risk parameters under abnormal market conditions, and strengthen probable conclusions about profit & loss portfolio hypotheses. The next composite vulgarizes a baggage of instruments describing principles to frame pricing functions.

2. Pricing Framework

The pricing methodologies illustrate how financial instruments are monetarily valued as a function of risk parameters. The procedure enhances numerically accurate figures to accompany stress tests, Value-at-Risk calculations and a wide range of risk management tools.

2.1 Discounting and Cash Flow Charting

Cash flows illustrate a currency amount attached to a payment date, positioned marked-to-market to prevail its present value. A convention settles to multiply cash flows by a discount factor that relies on the current market state. In addition, interest rates are expressed as continuously compounded rates. The term structure of interest rates links cash flow payments to interest quotes. Cash flow mapping traces a set of vertices to simplify computing the various volatilities and correlations of portfolios containing financial instruments in the parametric Value-at-Risk estimation. Cash flow charting enables cleaving actual cash flows in various time periods into a synthetic cash flow map. The procedure is induced from linear interpolation of interest rates. Cash flow mapping also sustains the sensitivity of present values to alterations in the zero rates of the sided vertices. Thus, a financial instrument portfolio is converted into a holding of conventional cash flows.
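The discounting and interpolation conventions above can be sketched directly, assuming continuously compounded zero rates as stated:

```python
import math

def discount_factor(zero_rate, t):
    """Present value of one currency unit paid at time t under a
    continuously compounded zero rate."""
    return math.exp(-zero_rate * t)

def interpolated_rate(t, t1, r1, t2, r2):
    """Linear interpolation of zero rates between two mapping vertices,
    as used to cleave a cash flow onto the synthetic map."""
    w = (t - t1) / (t2 - t1)
    return (1.0 - w) * r1 + w * r2
```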

2.2 Widespread Arrangement for Floating Rate Financial Instruments

A generalized principle sets guidelines for the pricing of financial instruments holding floating coupon rates founded on an interest rate benchmark, relying on the future values of the stated interest rates. The concept induces incorporating a forward rate into the fundamental arbitrage conjunction between current and future rates to appoint cash flows and render the pricing of floating rate financial instruments. However, the widespread arrangement is amplified to circumvent the case where the reference and discount curves of financial instruments differ. The exception is observed in LIBOR-based floating rates that affect cash flows due to differences in credit standing. In the stated case, the correct discount curve has to be incorporated to display the credit rating of the issuer.
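Under continuous compounding, the arbitrage conjunction between current and future rates pins down the forward rate used to appoint the floating cash flows; a minimal sketch with assumed maturities:

```python
def forward_rate(r1, t1, r2, t2):
    """Continuously compounded forward rate between t1 and t2 implied by
    the zero rates r1 and r2: investing to t2 directly must match
    investing to t1 and rolling over at the forward rate."""
    return (r2 * t2 - r1 * t1) / (t2 - t1)

projected_coupon_rate = forward_rate(0.03, 1.0, 0.04, 2.0)
```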

2.3 Model to Resolve Commodities, Equity and Foreign Exchange

The present value of commodity futures is expressed by multiplying the number of commodity units by the difference between the quoted futures price and the futures price of the financial instrument at the inception of the contract. Equity futures contracts are revalued on a daily basis due to maintenance margins quoted marked-to-market. In consequence, the present value of equity futures is stated by multiplying the number of units by the difference between the quoted futures price and the futures price of the financial instrument at the inception of the contract. Finally, foreign exchange forwards are modeled as two discount bonds integrating the current exchange rate and the discount rates of the given currencies.
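The two-discount-bond representation of a foreign exchange forward might be sketched as follows, per unit of foreign currency (the rate symbols are illustrative):

```python
import math

def fx_forward_value(spot, contract_rate, r_domestic, r_foreign, t):
    """Long a foreign zero-coupon bond converted at the spot rate,
    short a domestic zero-coupon bond at the contracted exchange rate."""
    return (spot * math.exp(-r_foreign * t)
            - contract_rate * math.exp(-r_domestic * t))
```

At inception the contracted rate is usually set so this value is zero; it drifts as the spot rate and the two curves move.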

2.4 Frame of Reference for Nonlinear Instruments and Derivatives

Nonlinear financial instruments are characterized by the dependency of their values on alterations in underlying risk parameters. For example, an equity futures contract projects a nonlinear instrument, as its value stands inversely proportional to a zero-coupon bond value. The following financial tools require a specific methodology: options, interest rate derivatives, and finally, interest rate trees.

2.4.1 Black-Scholes Option Pricing

The Black-Scholes option pricing formula summarizes the Put-Call parity (Reference 2.4.1) to value options with accuracy. The strength of the scientific equation relies on the independence of option prices from expected rates of return. On the other hand, the implied volatility of a financial instrument stands as a drawback because it is derived from calibrating the option price in the market.
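A self-contained sketch of the Black-Scholes call price (the put then follows from the Put-Call parity mentioned above); dividends are ignored as a simplifying assumption:

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(s, k, r, sigma, t):
    """European call: note that the expected rate of return never
    appears, only the risk-free rate r and the volatility sigma."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)
```

The matching put is black_scholes_call(s, k, r, sigma, t) - s + k * math.exp(-r * t).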

2.4.2 Interest Rate Derivatives Resolution with Black Model

The Black Model does not require a geometric Brownian motion supposition to quote the underlying asset price. However, it presumes that the value of the underlying financial tool at the maturity of the option is lognormal, with an anticipated value equivalent to its forward value (Reference 2.4.2). Black's formula suits interest rate derivatives and various European options. Nevertheless, the technique omits to describe the stochastic development of interest rates and bond prices.
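Black's formula under the stated lognormal-at-maturity assumption might be sketched for a European call on a forward price; discounting at a single rate r is a simplifying assumption:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_call(f, k, r, sigma, t):
    """European call on a forward price f, assumed lognormal at expiry
    with an expected value equal to its forward value."""
    d1 = (math.log(f / k) + 0.5 * sigma ** 2 * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return math.exp(-r * t) * (f * norm_cdf(d1) - k * norm_cdf(d2))
```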

2.4.3 Term Structure Arrangements and Interest Rate Trees

Term structure arrangements illustrate the evolution of yield curves to value instruments based on a no-arbitrage rationale, whose values rely on stochastic procedures. The model incorporates the initial term structure as input and studies its evolution through interest rates observed in the market. Thus, the market prices of bonds are recouped by the technique (Hull & White, 1998). Interest rate trees portray a stochastic model for short rate discrete time series. The trees are calibrated, for each risk parameter scenario, to current market interest rates and volatilities to price complex financial tools.

2.4.4 Analytic Estimates

Analytic estimates enable pricing the value of financial tools not covered by standard valuation methodologies, such as American options. For example, an average price option is approximated from the assumption that the underlying asset is lognormally distributed, and a concise precision can be tailored by calculating the first two moments of the underlying asset from the standard risk-neutral valuation. A Taylor series expansion simplifies complex pricing functions over small changes in risk parameters to evaluate complex instruments with quadratic approximations and conclude a careful pricing figure.
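For a single risk factor, the quadratic approximation described above reduces to a delta-gamma rule; the delta and gamma sensitivities are assumed to be supplied by the pricing function:

```python
def delta_gamma_pnl(delta, gamma, dx):
    """Second-order Taylor expansion of a pricing function: the change in
    value for a small risk parameter move dx."""
    return delta * dx + 0.5 * gamma * dx ** 2
```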

2.4.5 Price Calibration

The presented models to value financial instruments might provide inaccurate estimates due to false current price levels. The notion of calibration is embodied to adjust pricing parameters by gauging the implied volatility or calibrating a spread over a base discount curve. This procedure safeguards that the current price of a financial tool derived through the pricing function stays consistent with the observed market price.
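Gauging the implied volatility can be sketched as a bisection on the volatility parameter of a Black-Scholes pricer, exploiting the fact that the call price increases monotonically in volatility; the bracketing interval is an assumption:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, r, sigma, t):
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d1 - sigma * math.sqrt(t))

def implied_vol(market_price, s, k, r, t, lo=1e-4, hi=5.0, tol=1e-8):
    """Bisect the volatility until the model price matches the market price."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, r, mid, t) < market_price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```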

3. Risk Statistic Schemes

Risk statistic methodologies are practiced through parametric simulations. Prudential risk management is favored to encompass statistical test flaws, whereas a framework of risk statistic tools enhances accurate settings to measure total risk.

The selected risk statistic models are:

3.1 Value-at-Risk

The popular financial risk measurement Value-at-Risk (VaR) estimates the likelihood of recognizing a monetary loss exceeding a specific amount. It depicts a percentile of a profit & loss portfolio distribution, enumerated either as a potential loss from the current portfolio value or as an expected loss over the forecast horizon. VaR approximations differ according to the suppositions founded to generate scenarios in the various models, such as the parametric approach, the Monte Carlo simulation and the empirical replication.

3.1.1 The Simulation Methodology

Firstly, profit & loss scenarios are derived for a portfolio following a Monte Carlo or an empirical simulation. VaR is defined from the selected percentile of the profit & loss distribution, whereas the simulation error relies on the appointed confidence interval.
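The percentile step might be sketched as follows; the 95% confidence level is an assumed choice:

```python
def simulation_var(pnl_scenarios, confidence=0.95):
    """VaR as the loss at the chosen percentile of the simulated
    profit & loss distribution, reported as a positive number."""
    ordered = sorted(pnl_scenarios)
    return -ordered[int((1.0 - confidence) * len(ordered))]
```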

3.1.2 The Parametric Approach

The parametric approach settles the average profit & loss to zero, deriving from the assumption that the linkage between instrument prices and risk factors is linear. The VaR is then estimated as the percentile of the profit & loss distribution corresponding to a percentile of the standard normal distribution.

3.2 Marginal Value-at-Risk

The marginal Value-at-Risk depicts the amount of risk a position merges into a portfolio, which demonstrates the nominal change from a sale of a specific position. Marginal VaR equals the difference between the total portfolio VaR and the VaR of the portfolio free of the specified position. Furthermore, it reckons the correlation of the position with respect to the portfolio.
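The difference definition can be sketched on scenario-aligned profit & loss series; the perfectly offsetting position in the test is purely illustrative and shows the correlation effect:

```python
def percentile_var(pnl, confidence=0.95):
    """Loss at the chosen percentile of a P&L distribution."""
    ordered = sorted(pnl)
    return -ordered[int((1.0 - confidence) * len(ordered))]

def marginal_var(pnl_rest, pnl_position, confidence=0.95):
    """Total portfolio VaR minus the VaR of the portfolio free of the
    position, computed on scenario-aligned profit & loss series."""
    total = [a + b for a, b in zip(pnl_rest, pnl_position)]
    return percentile_var(total, confidence) - percentile_var(pnl_rest, confidence)
```

A position that hedges the rest of the portfolio yields a negative marginal VaR, reflecting its correlation with the portfolio.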

3.3 Incremental Value-at-Risk

Incremental Value-at-Risk (IVaR) illustrates the sensitivity of VaR to portfolio holding alterations. This approach affiliates the effect of position purchases and sales that influence overall risk. IVaR equals the percentage change in the size of each position multiplied by the estimated VaR. This method is coherent with the additive property in the allocation of risk, conditional on the sum of the risks equaling the total risk. Two practices quantify IVaR: the parametric method and the simulation model.

3.3.1 Parametric Methodology to Calculate Incremental Value-at-Risk

The derivative of the portfolio VaR is evaluated in conjunction with the size of each individual position to present a delta equivalent. A vector is then produced to interpret VaR sensitivities to risk parameters.
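A parametric sketch of this allocation: each component is the position size times the corresponding VaR sensitivity, and the components sum to the total parametric VaR, consistent with the additive property noted in 3.3 (z = 1.645 is an assumed confidence level):

```python
def incremental_var(weights, cov, z=1.645):
    """Euler-style allocation for the delta-normal model: component i is
    w_i times the derivative of VaR with respect to w_i."""
    n = len(weights)
    cov_w = [sum(cov[i][j] * weights[j] for j in range(n)) for i in range(n)]
    portfolio_vol = sum(weights[i] * cov_w[i] for i in range(n)) ** 0.5
    return [z * weights[i] * cov_w[i] / portfolio_vol for i in range(n)]
```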

3.3.2 The Simulation Model to Calculate Incremental Value-at-Risk