The loss distribution approach is commonly followed by risk management practitioners in order to identify and evaluate the possible risks they are likely to face in the course of business. The approach was originally designed by actuarial practitioners working in the insurance industry, which is why the technique is mathematically advanced and therefore fairly complicated in nature. In the modern world, the loss distribution approach has become an integral part of the advanced measurement approach prescribed by the Bank for International Settlements (BIS) in the Basel norms. In this article, we will have a closer look at the step-by-step process that needs to be followed while implementing the loss distribution approach.

Step #1: Severity Estimation

The first step in the loss distribution approach is to estimate the severity of the impact that risk events would produce if they were actually to occur. In theory, this data is easy to collect. In reality, however, it is scarce and quite often simply inaccurate, because of issues such as reporting bias and problems with scaling the data.

These analyses become valuable only when large data sets are used, and large sets of homogenous loss data are difficult to procure. The data is often drawn from different institutions in different countries that operate under very different rules and regulations, so attempts to scale the data to a single institution often prove unviable. Even data derived from different units of the same bank may not be comparable. Experienced risk management personnel are able to adjust the data and reduce the scaling problem, but they are not able to eliminate it.

Also, there is always a chance that the reported loss estimates vary from the losses that actually occurred, and the data may have to be adjusted in order to eliminate this effect. It is common for some organizations to set their reporting threshold higher, which means that they have to report fewer loss events. However, doing so also skews the distribution of the entire data set.
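
As a rough illustration of severity estimation, the sketch below fits a lognormal distribution to a small, made-up sample of loss amounts using scipy. The loss amounts and the reporting threshold are hypothetical values chosen only to show the mechanics; the choice of a lognormal severity distribution is a common convention, not something the Basel norms prescribe.

```python
import numpy as np
from scipy import stats

# Hypothetical operational loss amounts (in USD) above a reporting threshold.
# In practice these would come from internal and external loss databases.
reporting_threshold = 10_000
loss_amounts = np.array([12_500, 18_200, 25_000, 31_400, 47_800,
                         62_000, 95_500, 150_000, 310_000, 1_200_000])

# Fit a lognormal severity distribution to the observed losses.
# floc=0 pins the location parameter so only shape and scale are estimated.
shape, loc, scale = stats.lognorm.fit(loss_amounts, floc=0)

# The fitted distribution can now be used to estimate tail severities,
# e.g. the 99th percentile of a single loss event.
single_loss_p99 = stats.lognorm.ppf(0.99, shape, loc=loc, scale=scale)
print(f"Fitted 99th percentile single-loss severity: {single_loss_p99:,.0f}")
```

Because losses below the reporting threshold never appear in the data, a more careful implementation would fit a truncated likelihood rather than a plain fit, which is one way of dealing with the skew described above.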

Step #2: Frequency Estimation

The next step is to use a statistical distribution that can help predict the number of times an event will occur within a given period of time. Several more complicated statistical distributions could be used here. However, in most cases, organizations assume that the loss events follow a Poisson distribution.

This is because the Poisson distribution is well suited to modelling the probability of how many times an independent event will occur during a given time period. It is important for the loss events to be independent of each other if the Poisson distribution is being used.

The Poisson distribution is also statistically easier to work with, because the entire distribution can be described using a single parameter, usually called lambda, which represents the expected number of events per period.
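
To make this concrete, the short sketch below estimates lambda from a hypothetical series of annual loss-event counts and uses the fitted Poisson distribution to compute a few event-count probabilities. The counts are invented purely for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical number of loss events observed per year over eight years.
annual_event_counts = np.array([3, 5, 2, 4, 6, 3, 4, 5])

# For a Poisson distribution the maximum-likelihood estimate of lambda
# is simply the sample mean of the observed counts.
lam = annual_event_counts.mean()

# Probability of observing exactly 4 events next year, and of 8 or more.
print(f"Estimated lambda: {lam:.2f}")
print(f"P(exactly 4 events) = {stats.poisson.pmf(4, lam):.3f}")
print(f"P(8 or more events) = {stats.poisson.sf(7, lam):.3f}")
```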

Step #3: Calculation of Capital Charge

Once the loss frequency and loss severity are known, the next step is to determine the capital charge. The capital charge is basically the amount of money that an organization needs to set aside in order to meet its operational risk requirements.

Standard methods have been developed over the years to determine the capital charge. These methods are called Monte Carlo simulations; the name is derived from the city of Monte Carlo, which is famous for its casinos and hence synonymous with random events. One of the most common methods can be followed in three simple steps. In the first step, the 99.9th percentile of the simulated annual loss distribution is found. In the second step, the median, i.e. the 50th percentile, is found. In the third and final step, the 50th percentile is subtracted from the 99.9th percentile to give the capital charge.
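
A minimal sketch of that three-step simulation is shown below. The Poisson and lognormal parameters are assumed, illustrative values carried over from the earlier hypothetical snippets, not calibrated figures.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Assumed parameters from the earlier fits (illustrative values only).
lam = 4.0                    # Poisson frequency: expected loss events per year
sigma, scale = 1.2, 50_000   # lognormal severity: shape and scale parameters

n_trials = 100_000
annual_losses = np.empty(n_trials)

# For each simulated year, draw the number of events from the Poisson
# distribution, then draw that many severities from the lognormal
# distribution and sum them to obtain the annual aggregate loss.
for i in range(n_trials):
    n_events = rng.poisson(lam)
    severities = rng.lognormal(mean=np.log(scale), sigma=sigma, size=n_events)
    annual_losses[i] = severities.sum()

# Capital charge = 99.9th percentile of annual losses minus the median,
# i.e. the unexpected loss over and above a typical year.
p999 = np.percentile(annual_losses, 99.9)
median = np.percentile(annual_losses, 50)
capital_charge = p999 - median

print(f"99.9th percentile annual loss: {p999:,.0f}")
print(f"Median annual loss:            {median:,.0f}")
print(f"Capital charge:                {capital_charge:,.0f}")
```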

The Monte Carlo method is widely regarded as the best available method. However, it too has certain shortcomings. For instance, the results it produces can be fairly inaccurate when loss events are correlated with each other.

Step #4: Calculation of Confidence Interval

The end result of the above step is a single number, i.e. the capital charge. However, when performing statistical analysis, we prefer ranges to point estimates, and we like to attach a probability to the true value lying within the range. This is what is called constructing a confidence interval. It is a difficult step in which advanced statistical techniques are used to derive the interval, and it is crucial because all further decisions are based on it.
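
One simple way to attach such an interval to the simulated capital charge is bootstrapping: resampling the simulated annual losses with replacement, recomputing the capital charge each time, and reading off the spread of the results. The sketch below is only one of several possible techniques and reuses the assumed parameters from the earlier snippets.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Re-create a vector of simulated annual losses (same assumed parameters as
# before); in practice this would be the output of the Monte Carlo step.
lam, sigma, scale = 4.0, 1.2, 50_000
n_trials = 20_000
counts = rng.poisson(lam, size=n_trials)
annual_losses = np.array([
    rng.lognormal(np.log(scale), sigma, size=n).sum() for n in counts
])

def capital_charge(losses):
    """Capital charge as the 99.9th percentile minus the median."""
    return np.percentile(losses, 99.9) - np.percentile(losses, 50)

# Bootstrap: resample the simulated annual losses with replacement and
# recompute the capital charge for each resample.
n_boot = 200
estimates = [
    capital_charge(rng.choice(annual_losses, size=annual_losses.size, replace=True))
    for _ in range(n_boot)
]

# A 95% confidence interval from the empirical 2.5th and 97.5th percentiles.
lower, upper = np.percentile(estimates, [2.5, 97.5])
print(f"Approximate 95% confidence interval for the capital charge: "
      f"[{lower:,.0f}, {upper:,.0f}]")
```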

The loss distribution approach may seem very sophisticated and therefore very accurate, and most of the time it is. However, because it relies on empirical data throughout the process, it rests on the assumption that the future will resemble the past, which, as we already know, is often not the case. This is why organizations should not follow the loss distribution approach blindly. Instead, it should be used in conjunction with other data regarding the operating environment.
