CHAPTER 6

Treasury and Asset/Liability Management

Thomas Day

SunGard BancWare

Introduction

The demands of today's marketplace require a more proactive and integrated view of balance sheet risks and returns. Given the market's relentless pursuit of the most efficient and productive use of capital, measuring the efficiency of capital utilization subject to a firm's consolidated risk and return appetite remains of utmost importance for banks of all sizes. This drive toward improved enterprise risk measurement and governance is not happening in isolation, nor is it merely a response to government policy or compliance mandates. Markets are becoming more liquid, and portfolios, positions and the idiosyncratic attributes of portfolios are becoming more tradable. Components of risk that historically have not been “actionable” are being made increasingly liquid through the use of advanced financial engineering and structuring tools, broader acceptance of new financial models and methods, enhanced communication and technology and improved data analysis capabilities. As markets continue to provide richer sets of risk management tools, such as structured investment vehicles, credit derivatives1 and loan and credit risk trading mechanisms and markets,2 the requirement that internal risk measurement and pricing systems be consistent with market-based pricing mechanisms will only increase.

To keep pace with these changes, large financial institutions are embracing the principles of advanced risk management as advocated under emerging policy regimes, such as the Basel II Capital Accord. While opinions are numerous and varied as to the efficacy and appropriateness of Basel II’s somewhat prescriptive set of rules, it is clear that market and financial professionals understand the importance of enhanced governance around risk-based pricing and economic value creation, considering the risk-based capital attributed to deals as well as costs and returns. Slowly, the financial firm is moving from a “buy, fund and hold” mentality (a “warehouse” for risk) to “buy, decision and act” (a “weigh station” for risk). In support of this evolution, quantitative risk measurement, data analysis, and reporting tools—which have been traditionally used to drive transaction-level decisions—increasingly inform strategic enterprise-wide decisions.

At center stage of this change sits the treasury and asset/liability (A/L) management group. It is within the treasury division that decisions revolve around portfolios, risks, and markets. Treasury is ideally situated to help the organization navigate the transition from reaction-oriented risk measurement to action-oriented risk management. Positioned between the internal origination channels and the external distribution channels, treasury has good understanding of the internal volume and risk-generation capabilities of the bank, on both sides of the balance sheet, and the price of risk in the primary and secondary markets. It can execute trades, buy and sell portfolios and represent portfolio attributes, risks, and opportunities to the market. For these reasons, treasury can often play a critical role in “activating” modern portfolio strategies and tactics across an organization and the organization’s balance sheet.

This chapter will discuss the importance of balance sheet management and the role of the treasury division in creating more enterprise value for the financial services firm. It will discuss the common practices and responsibilities of this group. Finally, it will provide an introduction to the key risk measurement objectives of financial organizations, the most common risk modeling methods and some of the more fundamental assumptions used within these models.

The Nature of the Balance Sheet

One of the most important artifacts created by a financial services organization is its balance sheet. The balance sheet is the engine that drives earnings power, risk and capital requirements. It can be a measure of strength or—depending on its embedded risks—it can be a measure of actual or potential weakness. The balance sheet is often considered a mysterious creation of accounting rules and is far too often thought of in a vacuum, a relatively static entity with no heartbeat; merely the amalgamation of numerous journal entries by accounting and finance staff. This is a poor, albeit far too common, conceptualization.

A more accurate analogy positions the balance sheet as a living, breathing entity with its own personality, character flaws, and strengths; it moves, alters course and sometimes surprises, although “surprise minimization” should be an implicit, if not explicit, goal of treasury. Given that the balance sheet is complex and changing, managing its personality, complexion, and direction is absolutely critical to the safety and soundness of a financial organization.

Covering all aspects of balance sheet risk is a monumental task, especially given the volume of activity represented by a typical bank’s balance sheet and the complexity and diversity of risks implied by its composition. Often, the balance sheet is the result of millions of underlying transactions, with a taxonomy of risk attributes that is truly staggering, including: loans with irregular payment schedules; leases with peculiar features; commercial loans with embedded rate collars; mortgages with interest-only lock-in/float periods and other hybrid characteristics; deposits with embedded options; periodic caps; periodic floors; negative amortization features; servicing rights—and the list goes on and on. Capturing these transactions and their contribution to enterprise risk is a challenging, albeit necessary, task. Moreover, given that the balance sheet changes dramatically over time, it is crucial that this “position change” is somehow captured, measured, and assessed.

Fundamentally, it is the role of the treasury group to understand the balance sheet and the diverse organizational activities that cause it to change over time. Treasury must be in a position to explain these changes and, where necessary, act to mitigate, offset or control assumed and anticipated risks. Thinking of the balance sheet in this context is a powerful analogy as it helps explain some fundamental truths about treasury and A/L management.

Responsibilities of the Treasury Division

In performing its core function—to influence the strategic direction of the balance sheet—the treasury group weighs and measures activities, influences, and sometimes reorients the direction of business. It should be skeptical of inflated performance plans and should challenge business strategies where necessary. Of key importance, the group ensures that new business is priced relative to its risk and maintains an unflinching eye on the “market price” for equivalent portfolios of risk. Across the balance sheet, the treasury group must always evaluate whether the best use of capital is to originate and hold risk, or to “buy in,” transfer or offset risk where pricing or product structuring is irrational.

Performing so many different activities accurately requires a diverse staff with deep knowledge and skill across a wide variety of risk disciplines, including credit, market, liquidity, operational, legal and reputational risk. A treasurer’s activity centers on the balance sheet and on creating the appropriate governance structures, reporting mechanisms and the agility necessary to monitor and control balance sheet utilization. This often includes, but is not necessarily limited to, the activities listed in Exhibit 6.1.


Exhibit 6.1 Direct and indirect responsibilities of treasury and A/L management

Direct responsibilities


  • Economic and market analysis
  • Establishing and executing proper A/L, portfolio, liquidity and balance sheet management policies and procedures
  • A/L modelling, management and
    committee reporting
  • Income forecasting, risk analysis and reporting
  • Inculcating a risk-sensitive pricing mechanism and/or culture into the balance sheet management process (usually by ownership of funds transfer pricing (FTP) methodologies)
  • Investment portfolio modelling and management (i.e. the ‘discretionary’ books; often the available for sale (AFS) and held to maturity (HTM) investment portfolios)
  • Off-balance sheet modelling, conduit management and reporting
  • Structural derivative positions
  • Wholesale liquidity and liquidity risk management, including all shelf registrations and associated overhead
  • Capital management products (e.g. TPS, subordinated debt, mandatory convertible debt and other hybrid capital instruments/programmes)
  • Various wire-transfer activity related to portfolio operations
  • AFS and HTM portfolio operations
  • Portions or sometimes complete ownership of cash management operations. In many large banks, this will be housed in a separate division
  • Bank owned life insurance (BOLI) programme(s)
  • Holding company liquidity, funding and portfolio management operations
  • Discretionary strategy analysis
  • Investment banker relations and strategy assessment(s)

Indirect responsibilities

  • Understand and influence risk pricing across the organization
  • Administered rate deposit pricing and repricing (sometimes directly influenced)
  • Campaign management (new business; treasury must be indirectly involved)
  • New product creation
  • Mergers and acquisitions
  • Shareholder relations
  • Capital planning and risk-based capital scenario analysis (sometimes owned directly by treasury)
  • Budgeting and planning (sometimes integrated with treasury A/L systems, but often owned by the controller or finance division)
  • FTP processes and exception reporting
  • Credit spread pricing and other loan pricing
  • Client derivatives activities (sometimes owned directly by treasury)
  • Mortgage servicing rights valuation, accounting, analysis and reporting (sometimes owned directly)
  • Mortgage pipeline and portfolio hedging
  • IAS 39 and/or FAS 133 (United States) accounting, effectiveness testing and reporting (sometimes directly owned by treasury; depends on jurisdiction)
  • Credit portfolio stress testing and risk analysis
  • Accounting for (1) leases, (2) discretionary portfolio (i.e. AFS and HTM), (3) fees, (4) Special purpose entities/vehicles (SPE/SPVs), (5) currency translation, (6) employee stock options, (7) business combinations, (8) segment reporting, (9) fair value reporting
  • Model validation and risk control

Source: Author’s own.


Of all of these responsibilities, perhaps the most important is developing an intuition as to how the bank’s business, departmental activity and market movements will influence earnings, both today and over an appropriate forecast horizon. In this regard, the treasury division should focus not only on the routine operational duties and roles described previously, but should also seek to understand the firm’s balance sheet risk exposures in as detailed and analytical a manner as practical. Over time, the treasury group will be expected to evolve into a much more active balance sheet management function charged with instilling active risk management strategies across the traditional asset/liability, liquidity, and credit risk groups.

As products such as credit derivatives become more mature and risk factors are more easily exchanged across counterparties, organizations will need to monitor and react to market opportunities as well as understand and mitigate balance sheet exposures quickly and efficiently. This represents a tremendous opportunity for treasury to inform forecasting, risk and capital management and enterprise asset/liability allocation, as well as to act as the cultural champion for enhanced risk understanding and process improvement.

Given that the aggregate composition and changing nature of the balance sheet is so fundamental to a bank’s long-term success and financial health, it is critical that the treasury division is well supported and equipped. Without the proper tools, techniques, organizational stature and structure, the group can become overwhelmed with tactical activity, permanently impairing its ability to influence the strategic direction of the balance sheet. Without proper board and senior management oversight, technical understanding and support, treasury will not be effective in its goal of ensuring that the earnings power of the financial firm is sustainable and repeatable over time.

Risks Managed by Treasury

There are five principal risks managed by the treasury division:

  • Accrual book market risk (“interest rate risk”);
  • Funding liquidity risk;
  • Investment and derivatives portfolio3 risks;
  • Counterparty credit risks; and
  • Certain elements of capital risk.

We shall consider each in turn.

Accrual book market risk. One of the most important risks managed by the treasury division is the risk embedded in the structural balance sheet. This risk is managed by applying a variety of models, economic scenarios, and stress tests to the structural balance sheet position in an effort to understand exposure to embedded interest rate risk types and sensitivity to budget and forecast business volumes. Interest rate risk is composed of four main types of risk (see Exhibit 6.2). Proper risk management within treasury means that the firm has a process to model these risk types and understands the earnings and value impact of exposure to these risk factors.


Exhibit 6.2 Sources of interest rate risk

Risk Type

Definition

Repricing mismatch risk

The most commonly discussed and well-understood form of interest rate risk, repricing risk is the measure of risk related to timing mismatches associated with repricing events. Banks intentionally accept mismatch risk in order to improve earnings. Repricing mismatch risk is often, but not always, reflected in a bank’s current earnings performance; however, a bank may be creating repricing imbalances that will not be manifested in earnings until sometime in the future. For example: a bank uses a 10-year no-call 2-year funding vehicle to leverage a 10-year bullet-bond purchase. Repricing risk is minimal in years 0 through 2, but the bank is exposed in years 3 through 10. A bank that focuses only on short-term repricing gaps may be induced to take on increased interest rate risk by extending maturities to improve yield. When evaluating repricing mismatch risk, therefore, it is essential that the bank consider not only near-term gaps but also long-term repricing gaps. It should also be noted that repricing mismatch can be a cause of hedge ineffectiveness when attempting to apply FAS 133 and IAS 39.

Yield curve risk

Yield curve risk – often confused with basis risk – addresses changes in the relationship between interest rates of different maturities of the same index or market (e.g. a three-month treasury versus a five-year treasury). These relationships change when the slope of the yield curve for a given market flattens, steepens or becomes inverted (i.e. negatively sloped) during an interest rate cycle, which can create significant behavioural incentives across the bank’s product market. For example: when long-term and short-term rates are relatively equivalent, the volume of adjustable rate lending may be reduced and long-term fixed rate lending may increase, as the costs are relatively equal. However, when differences emerge, the incentives can shift, creating important behavioural patterns that should be modelled and captured in the bank’s A/L risk management process. Many banks assume scenarios that simply address parallel or proportional curve shift and thus do not effectively measure yield curve risk. The extent to which a bank is mismatched along the term structure will increase its exposure to yield curve risks. Certain complex investments can be particularly vulnerable to changes in the shape of the yield curve, including structured products such as dual index notes.

Basis risk

Basis risk arises from the non-parallel responses in the adjustment of interest rates among two or more rate indices, or ‘bases’. For example: a variable rate loan that reprices monthly may be based on a ‘prime’ index, which changes infrequently, and funded by 30-day certificates of deposit that are priced off a LIBOR index, which changes quite frequently. As the loans are based on the prime index and the CDs on the LIBOR index, even though maturity is matched, there is still a potential basis risk exposure. Another common form of basis risk is the relationship between non-maturity deposit rates and market rates. The manner in which these deposit basis risks are captured should clearly show the range of exposure to error related to model assumptions made around this risk factor. Many industry practitioners view basis risk as the most significant type of interest rate risk, likely due to the basis risks embedded in non-maturity accounts.

Option risk

Option risk arises when a bank or a bank’s customer has the right (not the obligation) to alter the level and timing of the cash flows of an asset, liability or other instrument. An option gives the option owner the right to buy (call option) or sell (put option) a financial instrument at a specified price (strike price) over a specified period of time. For the seller (or writer) of an option, there is an obligation to perform if the option holder exercises the option. The option owner’s ability to choose whether to exercise the option creates an asymmetric performance pattern. Generally, option owners will exercise their right only when it is to their benefit. As a result, an option owner faces limited downside risk (the premium or amount paid for the option) and theoretically unlimited upside reward. The option seller faces theoretically unlimited downside risk (an option is usually exercised at a disadvantageous time for the option seller) and limited upside reward (if the holder does not exercise the option). This is one of the most difficult risks to capture from both an earnings and valuation perspective within the context of a bank’s treasury risk management activities.


Source: Author’s own.

Funding liquidity risk. In addition to managing traditional call risk, maturity gap risk and transaction settlement risks (purchase/sale/fail), treasury’s liquidity desk must analyze the degree to which the firm is overly reliant on wholesale or otherwise volatile fund providers. Scenarios should be evaluated that stress test reliance on liquidity providers subject to triggers such as rating downgrades, legal actions, compliance impacts and operational shut-downs. In addition, the firm must put into place a contingency funding plan that can be invoked in the event of a run on deposits or other liquidity crisis. This requires identifying and measuring back-up funding needs, establishing appropriate back-up lines and periodically testing readiness.

Investment and derivatives portfolio risks. Treasury is normally responsible for managing the firm’s structural investment portfolio, including the available for sale and held to maturity portfolios. It is around this book of business, and the wholesale funding book, that treasury should build reasonable models to measure value-at-risk exposure. These need not be as rigorous as those required within the trading portfolio, given that these positions are not marked-to-market through earnings. Approximations of value-at-risk are normally sufficient to understand the risk exposures that accrue to these portfolios today and over time. In best practice, a principal components analysis is used to create a range of interest-rate scenarios that are then used to value portfolio positions. The output from these scenarios can then be used to create a distribution of portfolio returns. Sound practice includes simple measures such as account-level effective duration and convexity metrics, relating exposure to rate scenarios and measuring the potential change in value relative to capital (see Exhibit 6.3 for a hypothetical portfolio risk report; a computational sketch of the exposure arithmetic follows the exhibit).


Exhibit 6.3 Hypothetical portfolio risk report

Rate shock (bp)                       -300      -200      -100    0 (base)    +100      +200      +300

NON-AGENCY ABS (market value $200,000,000.00; base case effective duration 0.88; effective convexity -0.20)
  Price                             101.75    101.39    101.10    100.89    100.36     99.71     98.96
  Change in value                    0.85%     0.49%     0.20%     0.00%    -0.53%    -1.17%    -1.92%

AGENCY MBS POOL (market value $2,500,000,000.00; base case effective duration 1.65; effective convexity -2.19)
  Price                             104.74    103.99    103.51    103.28    100.91     97.49     93.53
  Change in value                    1.42%     0.69%     0.22%     0.00%    -2.30%    -5.60%    -9.43%

AGENCY CMO (market value $550,000,000.00; base case effective duration 2.02; effective convexity -1.82)
  Price                             104.12    103.34    102.77    102.09     99.53     96.16     92.37
  Change in value                    1.98%     1.23%     0.66%     0.00%    -2.51%    -5.81%    -9.53%

AGENCY NOTE (market value $150,000,000.00; base case effective duration 0.40; effective convexity -0.18)
  Price                             102.57    102.24    101.94    101.72    101.18    100.05     98.55
  Change in value                    0.83%     0.51%     0.22%     0.00%    -0.53%    -1.64%    -3.12%

TREASURY NOTE (market value $272,000,000.00; base case effective duration 1.02; effective convexity 0.07)
  Price                             104.53    103.45    102.56    101.91    101.00    100.13    100.13
  Change in value                    2.57%     1.51%     0.63%     0.00%    -0.90%    -1.75%    -2.60%

NON-AGENCY CMO (market value $3,500,000,000.00; base case effective duration 1.89; effective convexity -1.68)
  Price                             102.85    102.28    101.77    101.17     98.84     95.92     92.68
  Change in value                    1.65%     1.09%     0.59%     0.00%    -2.30%    -5.19%    -8.39%

Total portfolio                     $7,172,000,000.00
Total assets                        $20,491,428,571.43
Leverage capital                    $1,254,075,428.57
Exposure at +200 bp                 -5.07%
Current leverage ratio              6.12%
Net leverage at +200 bp             4.88%


Source: Author’s own.
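
The bottom-line figures in this report can be tied back to the position-level data with a few lines of code. The sketch below is illustrative only and is not the author’s methodology: it simply weights each position’s +200 bp change in value by its market value and then restates the leverage ratio. The 30 percent tax rate used to move from the pre-tax loss to the reported 4.88 percent net leverage is an assumption inferred from the exhibit’s numbers, not something stated in the text, and all identifiers in the code are illustrative.

    # Illustrative sketch only: tie the portfolio-level figures in Exhibit 6.3
    # back to the position-level data. The 30% tax rate is an inferred assumption.

    positions = {                    # market value, % change in value at +200 bp
        "Non-agency ABS":  (200_000_000.00, -0.0117),
        "Agency MBS pool": (2_500_000_000.00, -0.0560),
        "Agency CMO":      (550_000_000.00, -0.0581),
        "Agency note":     (150_000_000.00, -0.0164),
        "Treasury note":   (272_000_000.00, -0.0175),
        "Non-agency CMO":  (3_500_000_000.00, -0.0519),
    }

    portfolio_mv = sum(mv for mv, _ in positions.values())          # 7,172,000,000
    loss_200bp = sum(mv * chg for mv, chg in positions.values())    # ~-363m pre-tax

    # Close to the -5.07% shown in the exhibit (the input % changes are rounded).
    print(f"Exposure at +200 bp: {loss_200bp / portfolio_mv:.2%}")

    total_assets = 20_491_428_571.43
    leverage_capital = 1_254_075_428.57
    print(f"Current leverage ratio: {leverage_capital / total_assets:.2%}")   # 6.12%

    after_tax_loss = loss_200bp * (1 - 0.30)                         # assumed 30% tax rate
    net_leverage = (leverage_capital + after_tax_loss) / total_assets
    print(f"Net leverage at +200 bp: {net_leverage:.2%}")            # ~4.88%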


Counterparty credit risk. Very little reporting is typically generated around treasury relationships with dealer counterparties. However, important risks and concentrations should be evaluated, and limits established, to ensure potential exposure, expected exposure and credit risk mitigation agreements are well controlled. The treasury division should establish proper documentation, controls and reporting around master agreements, confirmations, broker/dealer correspondence and other activity.

Capital risk. Fundamentally, this is the management of the instruments contained within the capital base. It is becoming more common for the treasury division to integrate economic capital sensitivity measures into its balance sheet scenario testing, including measures of credit risk and credit transitions over the earnings forecast horizon, which requires firms to derive models to allocate balances to the allowance for loan and lease losses, non-accruals, and losses. Many treasury divisions do not manage these risks directly, but instead take input projections for these elements from the portfolio credit risk management and controller divisions. Treasury also may dedicate staff to better deployment of capital through hybrid and other capital instruments, as well as managing any share buy-back program.

Enterprise risk management (ERM): ERM has recently become a topic of considerable interest among financial organizations, bank rating agencies and bank supervisors. However, the phrase “enterprise risk management” is often overused and misunderstood. ERM is not a product that can be purchased, a policy that can be mandated or a regulation that can be prescribed. The Committee of Sponsoring Organizations of the Treadway Commission (COSO)4 offers the following definition:

ERM is a process, effected by an entity’s board of directors, management and other personnel, applied in strategy setting and across the enterprise, designed to identify potential events that may affect the entity, and manage risk to be within its risk appetite in order to provide reasonable assurance regarding the achievement of entity objectives. [Emphasis added]

An evolutionary process, ERM requires the adoption of risk management as a core value within an organization. While it is not the domain of treasury to implement ERM, a treasury division that establishes sound risk management practices should be encompassed within and complementary to an effective ERM framework.

The Dominance of Earnings

For most financial organizations, the dominant driver of earnings is net interest income (NII) and a critical management activity is creating stability of the earnings stream across a wide variety of economic scenarios. To achieve this, it is common for a bank’s board of directors to orient treasury around the governance of consolidated earnings performance, particularly net interest income and the net interest margin (NIM). The board’s policies typically require the treasury division to simulate earnings over specified time horizons and to stress test and sensitivity test earnings forecasts, subject to potential and expected market and economic environments. While opinions vary as to the applicability of stochastic versus deterministic simulations, the use of scenario analysis is critical and should be understood and monitored by senior management of the firm.

In most large financial organizations, the scenarios seek to limit risk to earnings in such a manner that forecast earnings, based on consensus estimates of market factors and balance sheet volumes, would produce sustainable earnings over time. In addition, banks typically seek to limit exposure to extreme movements in market factors that may place the firm’s earnings or value in undue peril. In practice, this normally incorporates a neutral position relative to movements in market risk factors, with a dominant focus on measuring and managing earnings at risk (EAR).

Earnings at Risk (EAR)

As mentioned previously, net interest income (NII) is often the dominant portion of overall bank earnings, usually representing more than 60 percent of total revenues. NII is sometimes referred to as the “spread” or rate-sensitive component of a bank’s revenue stream, as the quality and sustainability of this revenue depend on the spread between the yield on assets and the cost of liabilities.

The net interest spread is influenced by the volume, mix and quality of rate-sensitive assets and liabilities, today and over time. Within today’s balance sheet there are products that will reprice over the next 12 months, and this repricing will be based on the rates that prevail at an event date in the future.

For example, consider a firm whose current position balance sheet holds a commercial loan originated several years ago. The loan was fixed at 6 percent for three years; after this fixed period it reprices at 200 basis points over two-month US dollar Libor, which currently stands at 5.15 percent. Today’s balance is US$1m and the repricing date is 188 days in the future. Once the loan reprices, it will do so every 60 days for a period of 12 months, at which point it will mature. The firm funds this position with a term deposit fixed at 3.5 percent. Exhibit 6.4 shows the income earned on this position over the next three repricing dates, 7 August, 6 October and 5 December.


Exhibit 6.4 Base case analysis

                           31/01/07     07/08/07     06/10/07     05/12/07     03/02/08

Commercial loan,
hybrid 3:2M Libor        $1,000,000   $1,000,000   $1,000,000   $1,000,000   $1,000,000
Interest income              $5,000      $31,333      $11,917      $11,917      $11,917
Time deposit             $1,000,000   $1,000,000   $1,000,000   $1,000,000   $1,000,000
Interest expense             $2,917      $18,278       $5,833       $5,833       $5,833
Net interest income          $2,083      $13,056       $6,083       $6,083       $6,083
Average earning assets   $1,000,000   $1,000,000   $1,000,000   $1,000,000   $1,000,000
NIM (annualized)              2.50%        2.50%        3.65%        3.65%        3.65%
EPS                           $0.12        $0.12        $0.18        $0.18        $0.18


Source: Author’s own.


Exhibit 6.5 Down 200 bp over horizon

                           31/01/07     07/08/07     06/10/07     05/12/07     03/02/08

Commercial loan,
hybrid 3:2M Libor        $1,000,000   $1,000,000   $1,000,000   $1,000,000   $1,000,000
Interest income              $5,000      $31,333      $10,250       $9,417       $8,583
Time deposit             $1,000,000   $1,000,000   $1,000,000   $1,000,000   $1,000,000
Interest expense             $2,917      $18,278       $5,833       $5,833       $5,833
Net interest income          $2,083      $13,056       $4,417       $3,583       $2,750
Average earning assets   $1,000,000   $1,000,000   $1,000,000   $1,000,000   $1,000,000
NIM (annualized)              2.50%        2.50%        2.65%        2.15%        1.65%
EPS                           $0.12        $0.12        $0.13        $0.10        $0.08


Source: Author’s own.


From an earnings perspective, the firm has six more months of earnings at 6 percent, at which point the loan reprices to 7.15 percent. Over this horizon, the earnings per share sum to US$0.77. This is an incomplete picture of risk, however, as it assumes that the two-month US dollar Libor rate in August will be 5.15 percent, the rate that prevails today on January 31, 2007. If rates change between January 31, 2007 and August 7, 2007, the projected earnings over this horizon will prove incorrect. Let us consider the impact of rates declining by 200 basis points over this horizon.

This reduction in rates produces a new EPS of US$0.55, a decline of approximately 29 percent from the US$0.77 level. This dramatic drop in earnings is a red flag and would certainly motivate the organization to seek strategies to mitigate or offset this risk exposure, assuming that the probability attached to the scenario (i.e., a decline in rates of 200 bp) was considered sufficiently high to warrant taking remedial action.
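
The period-by-period arithmetic behind these figures can be reproduced with a short script. The following is a minimal sketch, not the author’s model: the actual/360 day count, the period lengths (30, 188, 60, 60 and 60 days) and the shape of the down-rate path (two-month Libor falling by 100, 150 and then 200 basis points at the three resets) are inferred from Exhibits 6.4 and 6.5, and, because the exhibits’ EPS figures imply a share count that is not given in the text, the sketch works in net interest income terms.

    # A minimal sketch (not the author's model) of the period-by-period net
    # interest income behind Exhibits 6.4 and 6.5. Day count (actual/360),
    # period lengths and the down-rate ramp are inferred from the exhibits.

    BALANCE = 1_000_000
    DEPOSIT_RATE = 0.035
    DAYS = [30, 188, 60, 60, 60]                       # accrual days ending on each exhibit date

    base_coupons = [0.06, 0.06, 0.0715, 0.0715, 0.0715]   # repriced to 2M Libor 5.15% + 200 bp
    down_coupons = [0.06, 0.06, 0.0615, 0.0565, 0.0515]   # Libor assumed to fall 100/150/200 bp

    def nii_schedule(coupons):
        rows = []
        for coupon, days in zip(coupons, DAYS):
            income = BALANCE * coupon * days / 360
            expense = BALANCE * DEPOSIT_RATE * days / 360
            nii = income - expense
            nim = nii / BALANCE * 360 / days           # annualized NIM, e.g. 3.65% or 1.65%
            rows.append((income, expense, nii, nim))
        return rows

    base_nii = sum(r[2] for r in nii_schedule(base_coupons))   # ~33,389 cumulative NII
    down_nii = sum(r[2] for r in nii_schedule(down_coupons))   # ~25,889 cumulative NII
    print(round(base_nii), round(down_nii), round(down_nii - base_nii))   # earnings at risk ~-7,500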

Regardless of likelihood, this type of measure is easy to understand. The firm can place a risk limit and corporate policy around this measure, and report to the board, senior management and the A/L committee on a routine basis (at minimum quarterly, but often monthly). Strategies to offset exposure, such as new pricing campaigns, purchasing or entering into a financial hedge or other such action would be evaluated and presented to the appropriate governance committee on a routine basis.

It is this spread relationship, and the timing and magnitude of repricing events between assets and liabilities, that drives a financial firm’s NII. Managing the sensitivity of the net interest spread is one of the core functions of the treasury group, and many of its other activities are geared toward measuring, monitoring, controlling and protecting the spread. These objectives are achieved through direct influence as well as through the establishment of a market-oriented funds transfer pricing (FTP) regime and profitability, relationship pricing and loan/facility pricing spreadsheets/models.

The Importance of Value

While measures of earnings at risk are the dominant and proper orientation of effective balance sheet management for the accrual book, EAR scenarios and stress testing are insufficient. Suppose bank management offered only one-year certificates of deposit (CDs) at 4 percent and in turn used the deposits to fund five-year fixed-rate, callable loans at 8 percent. The bank has created a nice 4 percent margin, right? However, what happens if interest rates increase three percentage points, so that in one year, when the CDs mature, it costs 7 percent to bring in new money?

The 4 percent margin will soon reduce to 1 percent when the deposits renew at the higher rate. This margin compression is earnings at risk. However, there is even worse news for the bank. The loans that were made for five years could now be made at a higher rate, perhaps as high as 11 percent: the original 8 percent plus the 3 percent increase in rates. The loan customers would be thankful that they had locked in their loan rate at 8 percent. But what has happened to the value of the loans? If the market rate for similar loans is 11 percent, the bank is holding relatively low-coupon instruments. In effect, it is giving up an additional 3 percent for four years (four years, as one year has passed). Should the bank decide to sell the lower-coupon loans in the current interest rate environment, it would have to sell at a deep discount. This is what is known as economic value-at-risk: the risk that the market value of assets and liabilities will change due to changes in interest rates.

If the transaction is viewed in a more “technical” sense, you can quantify the exposure and characterize the distinguishing features between EAR and economic value of equity (EVE). For example, assume that we have two banks, A and B. The transactions they each make on day 1 are shown in Exhibit 6.6.

Obviously, Bank A is in a better income position, earning a full 100 basis point spread over Bank B. A naïve analysis would look at the one-year earnings and conclude that Bank A is managing its business more effectively; focusing solely on short-term earnings, as many banks are inclined to do, leads to exactly this conclusion. But then consider what happens when interest rates increase dramatically, by say 300 basis points. What happens to the value of each bank’s position? Which bank is in a better position?

Bank B continues to earn a 3 percent spread while the spread earned by Bank A is now 1 percent. Said differently, Bank A has “lost” 300 basis points for four years. The present value of this lost 3 percent annuity (i.e., US$3 over four years) is 9.31 (exactly equal to the discount on the loan). This “opportunity cost” of capital is not illusory.
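
A short, illustrative calculation confirms this arithmetic; it simply restates the discounting described above (the function names are illustrative, not from the source):

    # Verify the Bank A figures: a 4-year, 8% loan discounted at the new 11%
    # market rate, and the present value of the forgone 3% annual spread.

    def pv_annuity(payment, rate, years):
        return sum(payment / (1 + rate) ** t for t in range(1, years + 1))

    def pv_bond(face, coupon_rate, market_rate, years):
        return pv_annuity(face * coupon_rate, market_rate, years) + face / (1 + market_rate) ** years

    price = pv_bond(100, 0.08, 0.11, 4)      # ~90.69, as in Exhibit 6.7
    lost_spread = pv_annuity(3, 0.11, 4)     # ~9.31, the "lost" 3% annuity
    print(round(price, 2), round(lost_spread, 2), round(100 - price, 2))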

Should Bank A attempt to readjust its position, it would only be able to sell its loan for US$90.69 and reinvest at the new market rate of 11 percent. Even if no repositioning occurs (i.e., take your loss today), the “unrealized loss” is nonetheless real. If Bank A maintains its holdings, it will have lower long-term earnings than Bank B (the long-term earnings potential of the company has been impaired). This said, EVE is not without its flaws.


Exhibit 6.6 Comparative interest rate risk exposure

Bank

Transactions

A

  • 5-year loan at 8%; short call
  • Funded by 1-year money at 4%

B

  • 1-year loan at 7%
  • Funded by 1-year money at 4%

Source: Author’s own.


Exhibit 6.7 Comparative value at risk

Bank

Transactions

A

  • PV of a 4-year loan at 8% when market rates are 11% is 90.69
  • The face value of the deposit at the end of year one remains 100 since it is repriced at the market rate

B

  • PV of a 4-year loan at 11% when the market rates are 11% is 100, or par
  • As above, the value of the deposit is 100

Source: Author’s own.

Economic Value of Equity

In practice, EVE has been interpreted to mean the net present value of discounted cash flows (DCF) across a bank’s balance sheet. So, for example, if you hold a mortgage loan, the method would require the bank to create a base-case scenario that reflects the principal and interest (P&I) cash flow schedules expected to occur over the remaining life of the instrument. For a measure of risk, this cash flow schedule will change as interest rates change. For a traditional fixed-rate mortgage, the schedule of P&I will decline as rates rise and increase as rates fall. Because the value of the mortgage is a function of these cash flows, as interest rates increase the present value declines: fewer cash flows arrive in the early years and more arrive in the later years, where they are discounted more heavily. This seems reasonable. To see what is wrong with this method, however, consider a bank that owns a one-year floating-rate loan and has sold a five-year cap on the loan, a product very similar to the adjustable rate products that have proliferated on today’s balance sheets. The bank’s position is:

Capped Floater Loan = Non-Capped Floating Rate Loan - 5-Yr Cap Option

Assume then that the cap option is 200 bp out of the money at origination and that the loan originated at 3M Libor flat when 3M Libor was 3.80 percent. The cap strike is, thus, 5.80 percent. Once 3M Libor exceeds 5.80 percent at a reset date, the loan becomes a fixed-rate product and no longer adjusts upward. We are going to assume that volatility is flat at 15 percent for the purpose of this illustration. If you compute EVE as a discounted cash flow, which is the current approach used by many banks, the option has no effect on cash flows until 3M Libor exceeds 5.80 percent. In the base case, you would see no effect. If a +200 bp shock were to occur, you would likewise see no effect. Exhibit 6.8 highlights the coupon and EVE/NPV (net present value) impact.


Exhibit 6.8 Impact of rate movement on loan coupon

                      Dn 100    Base case     Up 100     Up 200     Up 300

Base rate              3.80%        3.80%      3.80%      3.80%      3.80%
Rate shock            -1.00%        0.00%      1.00%      2.00%      3.00%
Net rate (uncapped)    2.80%        3.80%      4.80%      5.80%      6.80%
Cap level              5.80%        5.80%      5.80%      5.80%      5.80%
Net rate (capped)      2.80%        3.80%      4.80%      5.80%      5.80%
EVE/NPV              $100.00      $100.00    $100.00    $100.00     $95.79


Source: Author’s own.
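
The cash-flow logic behind the “Net rate” rows of Exhibit 6.8 is nothing more than a cap applied at each reset; a minimal, illustrative sketch follows (variable names are invented for the example). Note what it implies: under a pure discounted-cash-flow treatment, the cap has no observable effect until the shocked rate actually exceeds the 5.80 percent strike.

    # Illustrative only: the coupon reset logic behind Exhibit 6.8. The cap is
    # invisible to a pure cash-flow view until the shocked rate exceeds the strike.

    BASE_RATE, CAP_STRIKE = 0.038, 0.058
    for shock in (-0.01, 0.00, 0.01, 0.02, 0.03):
        uncapped = BASE_RATE + shock
        capped = min(uncapped, CAP_STRIKE)
        print(f"shock {shock:+.2%}: uncapped {uncapped:.2%}, capped {capped:.2%}")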


A market practitioner will immediately observe that the option value is misrepresented in this approach. As expected, the yield on the position floats until the interest rate passes the cap strike rate of 5.80 percent. However, if you are only discounting cash flows (i.e., principal and income streams), the cap “value” is unobserved in any rate environment below the cap strike. The more appropriate method would be to measure the value of the embedded option along with the instrument. In doing so, the values in Exhibit 6.9 are obtained.


Exhibit 6.9 Impact of rate movement on value

                      Dn 100    Base case     Up 100     Up 200     Up 300

Floating rate note   $100.00      $100.00    $100.00    $100.00    $100.00
Option value         $(0.41)      $(1.38)    $(3.33)    $(6.34)    $(9.83)
OAV                   $99.59       $98.62     $96.67     $93.66     $90.17
EVE/NPV              $100.00      $100.00    $100.00    $100.00     $95.79
Difference           $(0.41)      $(1.38)    $(3.33)    $(6.34)    $(9.83)


Source: Author’s own.


These differences are far from trivial. Risk managers and bank supervisors should be hesitant to specify a methodology as simple as the NPV/EVE method, which is, arguably, the method advocated by existing regulatory guidance (as commonly interpreted) and the method reflected in the majority of US industry practice. This situation has festered over the years due to two main factors:

  • The acceptance of the EVE/NPV metric as a less than ideal but sufficient measure of risk; and
  • The dominance of income simulation and income forecasting as the principal metric for measuring structural accrual-book risk.

These are powerful arguments against better methodologies; however, they hinge on the issue of “sufficiency.” That is, is it accurate to say that the EVE/NPV measure is a “sufficient” measure of risk? Is EVE sufficient when coupled with an income simulation measure? Good arguments can be made that for many accrual-book positions EVE is an insufficient measure, even when coupled with income simulation measures. Given the growing complexity of embedded options, particularly in the mortgage space in the United States, and recognizing technology advances that permit more rigorous measurement of position risk, there is a case that can be made to enhance structural position risk measurement.

Consider, for example, the effective duration of the position under the EVE measure, calculated relative to the +/-100 bp shocks from Exhibit 6.9 (shown earlier). Clearly, the effective duration (ED) is 0 under the EVE method. That is, there appears to be no price risk. In reality, the effective duration for this instrument is around 1.48 percent. While seemingly a small number, assume for a moment that this is the only position held by the financial organization. Further, assume the bank is funded with 90 percent quarterly reset funds (for simplicity, assume this is equivalent to an ED of 0.25) and 10 percent equity. In this case, duration of equity (DE) is solved in the following manner:

DE = (TA × DA - TL × DL) / (TA - TL)

where

TA = total assets

TL = total liabilities

DA = duration of assets

DL = duration of liabilities

DE = duration of equity.

Solving for DE when DA and DL are 0 and 0.25, respectively, leads to a DE of 2.25 percent; however, when the asset duration is reflected at the more accurate 1.48 percent, DE becomes 12.56 percent. In this case, the change in the value of equity for a given change in rates is approximately 12.56 percent × Δy × the value of equity (with Δy expressed in 100 bp increments).

If you assume this DE can be scaled by the change in yield, then a 200 bp change in rates, without considering effective convexity, implies a change in the value of equity of roughly 25.12 percent. Under the most recent Basel proposals for the supervision of interest rate risk, this exposure level would qualify the firm as an “outlier” bank. What at first may have appeared to be a rather benign position turns out to be high risk. Looked at another way, perhaps more cynically, there appears to be plenty of room to “game” the rules if firms are permitted to use simple measures rather than being asked to apply more rigorous and modern valuation algorithms to structural balance sheet positions. In practice, it could be argued that a portion of this gaming already takes place in more opaque portfolios, such as the deposit book of business, where rules are unclear, practices wildly divergent and measurement methods inconclusive.
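
The effective duration and duration-of-equity arithmetic above can be checked with a few lines. The sketch below is illustrative only: it uses the option-adjusted values from Exhibit 6.9 and the stylised balance sheet assumed in the text (90 percent quarterly-reset funding with a duration of 0.25, 10 percent equity). The sign convention makes the liability-heavy case negative; the text quotes magnitudes.

    # Check the duration-of-equity arithmetic (illustrative sketch).

    def effective_duration(p_down, p_up, p_base, dy=0.01):
        # Standard central-difference estimate from +/-100 bp prices.
        return (p_down - p_up) / (2.0 * p_base * dy)

    da = effective_duration(99.59, 96.67, 98.62)      # OAVs from Exhibit 6.9 -> ~1.48

    def duration_of_equity(ta, tl, da, dl):
        return (ta * da - tl * dl) / (ta - tl)

    ta, tl = 100.0, 90.0                              # 90% funding, 10% equity
    print(duration_of_equity(ta, tl, 0.0, 0.25))      # ~-2.25 (naive DCF view: asset duration 0)
    print(duration_of_equity(ta, tl, da, 0.25))       # ~12.6 (option-adjusted asset duration)
    print(2 * duration_of_equity(ta, tl, da, 0.25))   # ~25.1 for a 200 bp move, ignoring convexity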

Clearly, the quality and integrity of balance sheet measurement methodologies are pivotal for a reasonable portrayal of risk. To achieve this, and as seen in the previous example, reasonable pricing of embedded derivative(s) should be required as a part of the measure being considered. Under the DCF/EVE measure, options are not fairly represented.

Pulling the Balance Sheet Puzzle Together

Static and Dynamic Models

Static models are based on positions at a given point in time. Typically, this is a firm’s current position (CP) report. Gap, economic value of equity and duration-based models are typically considered static models. These models rely on current position balances and the cash flows expected to be produced from those balances over the life of the transaction.

Dynamic models, on the other hand, attempt to capture how a bank’s management and its customers are expected to react to a changing interest rate environment. These models are more useful for active management of position risk over time. That is, within the model, balance sheet footings (i.e., volumes within asset/liability classes) are allowed to change according to behavioral rules that are built by the analyst. It is important that the A/L group has control over this process, and that it is not unduly influenced by the sales and/or budgeting/financial control side of the firm. The A/L group needs to be able to independently evaluate planned behavior in the context of historical performance and with an eye toward future market conditions. For example, if the national prime lending rate is rising, management may raise loan rates by a certain percentage and, consequently, future loan volumes may be assumed to decline, although existing business would be allowed to reprice at the higher rate. In such a circumstance, it is often the case that neither the sales side nor the budgeting side will want or be motivated to recognize or account for this behavior until the impact is virtually certain.

Given the flexibility and more realistic conditions that dynamic models can explore, most firms find that dynamic income simulation models are more informative than static models. While short-term focused, the dynamic income simulation helps the proactive organization forecast the results of active decision making and develop strategies under various potential rate scenarios. While this sounds great, be aware that such forecasting is laden with assumptions that must be made about the direction of interest rates, how customers will react, how business strategy would change under each environment and how other key variables would be impacted. Thus, income simulation modeling is only as good as the quality of the assumptions and staff running the system. Some banks with very simple tools outperform banks using the most sophisticated models offered by vendors. Often, nothing replaces good professional judgment, market intuition and experience.

Major Assumptions Used Within a Balance Sheet Model

In general, most dynamic simulation models will estimate cash flows for most standard instruments including fixed/floating rate securities, loans, deposit products, and funding sources. Most models can incorporate various types of amortization schedules, rate quote conventions, spread assumptions, term assumptions, call and sinking fund provisions, and payment frequencies—assuming the data are accurate and available. In this regard, all models are simply “toolkits” to build accurate cash-flow forecasts.

There are other important considerations in developing a balance sheet model, however, which include:

  • Architecture. Does the model allow for easy distributed and/or grid computing to leverage available technology? Is the model easily scalable? Is the technology based on a component architecture, such as a service-oriented architecture?
  • Modularity. Does the model port well into other systems to take advantage of possible points of integration? Are reporting dimensions dynamic and does the model allow for on-the-fly changes?
  • Reporting. Is it relatively easy to get assumptions into the model and customizable information out? Are tools available for multidimensional analysis?
  • Usability. Does the model make performing calculations relatively easy?
  • Schedulability. Can processes be batch-controlled and automated?
  • Benchmarking. Can the model be easily benchmarked and tested against known quantities?

Regarding calculations, the usefulness of simulation techniques—earnings and value—depends on the validity of the underlying data and assumptions and the accuracy of the basic structure on which the model is run. The results obtained will not be meaningful if these assumptions do not fairly reflect the bank’s internal and external environments. Some of the major assumptions and/or inputs that are basic to all simulation models include the following.

Starting Balance Sheet

It is imperative that the bank ensures that the beginning balance sheet used for projecting earnings is accurate and captures all material positions and their characteristics. The old axiom “garbage in, garbage out” applies here. Internal audit can usually validate the integrity of the data. Critically important is how balance sheet accounts, and/or instruments, are aggregated into the model chart of accounts (COA). The classic example of poor aggregation is to separate all mortgages into two pools: for example, all fixed-rate products over 7 percent and all those under 7 percent. Such wide aggregation groups, if the positions are significant, will probably produce poor modeling results.

Selection of Rate Scenarios and Driver Rates

The driver rates should be comprehensive and properly assigned to the account pools or instrument profiles. Moreover, the interest rate scenarios that are run should contain both realistic rate paths (i.e., where and when the bank expects rates to move, such as the forward rate path) as well as stressed rate paths (i.e., +/-200 bp shocks). Regardless, the rate paths entered should be consistent with expectations and the scenario under consideration. In some cases, the bank may be looking at reports that claim to show rate increases when in fact the analyst forgot to change one or more paths, or has made nontransparent overrides to modeled rates or spreads. Therefore, having checks and balances around these inputs is important. With some models, the process for entering market data and rates can be convoluted and overly complex. A rule of thumb in any modeling environment is to try to keep things as seamless and simple as possible, but not too simple. Unnecessarily complex input processes can reduce integrity and may expose the modeling effort to higher levels of potential error.

New Business Assumptions

It is critical to know whether and how the model incorporates new business, or whether it is a “constant scale” balance sheet model (i.e., any run-off is replaced, but only to the point that it brings the position back to the starting account-level balance). Notably, new business assumptions can mask inherent interest rate risk. For example, if the model assumes that loans grow by 30 percent, funded by a similar increase in core monies, when in fact they each grow by only 5 percent, the assumed growth may mask earnings deterioration in a rising rate environment, as the higher rate production never actually occurs. It is for this reason that some practitioners recommend running both static and dynamic simulations to gauge the range of exposure. No matter your perspective, it is important to understand the level of new business assumed to roll onto the book (i.e., “new business assumptions”), how it is priced when it hits the book, how it is funded and its assumed life.

Repricing Attributes

If analysis is performed at the pool level, rather than at the instrument level, the analyst will need to make assumptions about how the pool reprices. Even where instrument-level attributes are captured, assumptions will need to be made for existing and new business where there are either gaps in the data or a need to create behavioral characteristics. If 100 percent of a pool reprices every three months, it might be reasonable to assume that 33 percent will reprice each month. Such assumptions may lead to inaccurate results and, thus, should be documented. Another example is a 30-year adjustable-rate mortgage (ARM): it has a 30-year maturity but reprices each year. Repricing assumptions are also important for products with caps and floors. Many ARMs have periodic and lifetime rate caps/floors. Thus, if rates increase 300 basis points within one year, the ARM could only adjust up, for example, by 200 basis points (i.e., the periodic cap), as illustrated in the sketch below. Another important point is how new business is assumed to come onto the book. If you assume that the spread to your driver rate for indirect loans is 150 basis points when, in fact, it is only 75 basis points, the new business will look more profitable than it is. Being able to model this “spread” risk may be an important consideration.
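
As a hypothetical illustration of the periodic-cap behaviour just described: the rates, margin and cap levels below are invented for the example, and the symmetric periodic floor is an assumption rather than something prescribed in the text.

    # Hypothetical illustration: an ARM whose index jumps 300 bp can only step
    # up by its 200 bp periodic cap at the next reset, and never beyond its life cap.

    def reset_rate(prev_rate, index_rate, margin, periodic_cap, life_cap):
        target = index_rate + margin
        stepped = min(target, prev_rate + periodic_cap)     # periodic cap
        stepped = max(stepped, prev_rate - periodic_cap)    # periodic floor (assumed symmetric)
        return min(stepped, life_cap)

    # Index jumps from 4.0% to 7.0% (+300 bp); margin 2%, periodic cap 2%, life cap 11%.
    print(reset_rate(prev_rate=0.06, index_rate=0.07, margin=0.02,
                     periodic_cap=0.02, life_cap=0.11))
    # -> 0.08: the coupon steps up only 200 bp despite the 300 bp index move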

Run-Off Schedules

When will a loan cease to be outstanding? Answering this question is especially important for products with embedded options. One cannot assume a static amortization schedule for loans with prepayment options. Similar run-off issues exist for CDs if modeled on a pool level. For example, if you have a pool of six-month CDs, it may be reasonable to assume that one sixth of the pool runs off each month.

Core Deposit Behavior

How the bank models its core deposits will influence the accuracy of the model. It is for this reason that the core deposit assumptions should be thoroughly understood, documented, and validated. The modeling of core deposits is an art, not a science, and opinions as to the best methods to use are wide and diverse. Given varying approaches, risk managers and A/L analysts should ensure that the assumptions are documented, rational, consistent with observed behavior (i.e., supported by analysis), and are not “manipulated” (e.g., altered to produce a desired outcome). In one bank, the treasurer was very quick to alter repricing beta assumptions in order to “manage” the model results to the output he desired. This manipulation of the model output was always “defensible” but not well communicated, documented or supported by empirical analysis. These apparently minor changes had major effects and in this case there was never any discussion of such change in the governance forums that existed. In practice, such “gaming” of model results should be avoided or, at minimum, any changes to critical assumptions should be required to be defended—with empirical research/analysis—at high-level risk governance forums.
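
To make the sensitivity to such assumptions concrete, the hypothetical sketch below applies a simple repricing “beta” to an administered deposit rate; the balances, rates and betas are invented for illustration and do not represent a recommended modeling approach.

    # Hypothetical repricing-beta illustration: the administered deposit rate is
    # assumed to move by beta times the change in the driver rate, so small
    # changes to beta move projected interest expense materially.

    def projected_deposit_rate(current_rate, driver_move, beta):
        return current_rate + beta * driver_move

    balance, current_rate, driver_move = 500_000_000, 0.015, 0.02   # +200 bp market move
    for beta in (0.25, 0.40, 0.60):
        rate = projected_deposit_rate(current_rate, driver_move, beta)
        print(f"beta {beta:.2f}: deposit rate {rate:.2%}, "
              f"annual interest expense {balance * rate:,.0f}")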

Prepayment Assumptions

Loans that are subject to prepayment must be simulated in a way that considers the prepayment option embedded in the contracts. The most common method is to use generic prepayment speeds obtained from an external source, such as Bloomberg. In certain instances, the bank will take these generic speeds and adjust them to account for idiosyncratic factors that characterize the bank’s local market. Such adjustments should be underpinned by documented analysis, even if it is fairly basic. The preferred method for modeling prepayment risk is to use econometric models to project prepayment speeds. These models are often fairly complex, and many ALM tools source the prepayment speeds from third-party specialist firms; a good example of such a firm is Andrew Davidson & Co., Inc. The advantage of using these models is that the speeds can be produced at the transaction level and reflect individual loan characteristics such as age, location, origination channel, spread at origination, curve at origination and other items.
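
For illustration, a simplified constant prepayment rate (CPR) can be layered onto a level-pay amortization schedule as sketched below; this reflects the generic-speed approach, not the econometric models just mentioned, and the pool parameters are hypothetical.

    # Simplified sketch: apply a constant prepayment rate (CPR) to a level-pay
    # mortgage pool. Pool size, rate and CPR are hypothetical.

    def amortize_with_cpr(balance, annual_rate, term_months, cpr):
        smm = 1 - (1 - cpr) ** (1 / 12)                 # single monthly mortality
        r = annual_rate / 12
        cash_flows = []
        for month in range(1, term_months + 1):
            # Level payment re-derived on the remaining balance and term.
            payment = balance * r / (1 - (1 + r) ** -(term_months - month + 1))
            interest = balance * r
            scheduled_principal = payment - interest
            prepayment = (balance - scheduled_principal) * smm
            cash_flows.append(interest + scheduled_principal + prepayment)
            balance -= scheduled_principal + prepayment
        return cash_flows

    flows = amortize_with_cpr(balance=1_000_000, annual_rate=0.06, term_months=360, cpr=0.08)
    print(round(flows[0], 2), round(sum(flows), 2))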

Conclusion

Financial theory and practice have converged at a rapid rate within the last decade. While the causes of this enhanced alignment are many and varied, it is clear that the increased acceptance of advanced financial concepts, dramatic improvements in computing power and improved data management have been primary motivators for this change.

This evolution in financial practice shows no signs of slowing. The world is getting smaller and advanced risk management capabilities more pervasive; markets are more liquid than ever, spreads narrower and ­pricing more competitive; transactions and information are arriving at ever-faster speed, with ever more complexity, and decisions need to be made on the fly. For firms to gain strategic advantage, it is no longer sufficient to merely measure, report, and avoid risk. Today, the field of participants in the financial services business is too diverse to allow for such a relaxed approach toward risk management. Successful firms are those that proactively manage risk, not just avoid it.

The importance of more comprehensive risk measurement and management is echoed by former US Federal Reserve Chairman Alan Greenspan. In his October 5, 2004 speech to the American Bankers Association, Chairman Greenspan noted:

It would be a mistake to conclude (from these comments) that the only way to succeed in banking is through ever-greater size and diversity. Indeed, better risk management may be the only truly necessary element of success in banking.

Although commonly assumed to be so, size is not the critical element of success. Rather, the ability to rapidly, accurately, and dynamically manage your portfolio of risks is the dominant success factor.

Being nimble and responsive toward risk and risk management has never been as attainable for all financial firms as it is today, thanks to better data models, pervasive access to data and information and much faster computers and computing technology. These increased capabilities are allowing firms to price risk across products, relationships and geographies with an eye toward the “market price” of risk rather than the traditional banking model of pricing risk relative to peers.

The next step in the evolution of financial intermediation is one that will require active “portfolio” management rather than traditional reactive “transaction” management. It requires the implementation of processes and procedures that ensure that the bank is being compensated for the risks it assumes. Active portfolio management should allow the bank to determine where value is being created and where it is being destroyed. It involves the ability to routinely (perhaps on a daily basis) attribute the correct amount of capital at any level desired, often at the level of the transaction. It provides an ability to more actively buy, hedge, sell or hold risks, choosing which risks to keep and which risks to mitigate or eliminate. Perhaps, most importantly, it allows for the opportunity to align incentives and business behaviors with risks being underwritten.


1 Just a few years ago (1995), the US credit derivatives market was only US$2bn in notional, as reported by James Allen in American Banker. As of 2005, this number ranges from US$5.8tn (source: OCC and based on insured US commercial banks) to more than US$17.3tn (source: ISDA). This remarkable rate of growth reflects not only risk management within the dealer community, but is also indicative of growing acceptance and use within the end-user community.

2 In the future, it is likely that firms will also be able to buy and sell derivatives tied to catastrophic operational risk losses. The analogy would be catastrophe bonds currently being used by the insurance and reinsurance business to lay off second-loss pieces in the event of some natural disaster, such as hurricane exposure.

3 The treasury division often manages the derivative hedge book, which is used to offset repricing mismatches or adjust duration exposures.

4 Enterprise Risk Management—Integrated Framework, September 2004.
