Math, asked by lakes5065, 1 year ago

What is VaR and how is it calculated at the 95% confidence level?

Answers

Answered by DIVINEREALM
VaR (value at risk) summarizes the predicted maximum loss (or worst expected loss) over a target horizon at a given confidence level.

Calculation.
Assume you hold $100 million in medium-term notes. How much could you lose in a month? As much as $100,000? Or $1 million? Or $10 million? Without an answer to this question, investors have no way to decide whether the returns they receive are appropriate compensation for the risk.

Looking at the history of monthly returns on this position, returns ranged from a low of -6.5% to a high of +12.0%. Now construct regularly spaced "buckets" going from the lowest to the highest number and count how many observations fall into each bucket. For instance, there is one observation below -5%. There is another observation between -5% and -4.5%. And so on. By doing this, you will construct a "probability distribution" for the monthly returns, which counts how many occurrences have been observed in the past for each particular range.
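The bucketing step above can be sketched in Python. The 516-month return series from the answer is not available, so a simulated series stands in for it here; the bucket width of 0.5% matches the ranges mentioned above.

```python
import numpy as np

# Hypothetical stand-in for the historical monthly returns (in percent);
# the real 516-month series is not reproduced in the answer.
rng = np.random.default_rng(0)
returns = rng.normal(loc=0.5, scale=2.0, size=516)

# Regularly spaced 0.5%-wide buckets spanning the observed range.
edges = np.arange(np.floor(returns.min()), np.ceil(returns.max()) + 0.5, 0.5)
counts, edges = np.histogram(returns, bins=edges)

# counts[i] = number of months whose return fell in [edges[i], edges[i+1]).
for lo, n in zip(edges[:-1], counts):
    print(f"{lo:+5.1f}% to {lo + 0.5:+5.1f}%: {n}")
```

Every observation lands in exactly one bucket, so the counts sum back to the number of months in the sample.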

For each return, you can then compute the probability of observing a lower return. Pick a confidence level, say 95%. For this confidence level, you can find on the distribution the point such that there is a 5% probability of observing a lower return. This number is -1.7%, as all occurrences of returns less than -1.7% add up to 5% of the total number of months, or 26 out of 516 months. Note that this could also be obtained from the sample standard deviation, assuming the returns are close to normally distributed.
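Both routes to the 95% cutoff can be sketched as follows. The historical route takes the 5th percentile of the observed returns directly; the parametric route uses the sample mean and standard deviation with the one-sided 95% z-score of 1.645, which is valid only if returns are roughly normal. The return series is again simulated, since the actual data is not given.

```python
import numpy as np

# Hypothetical monthly returns (in percent), standing in for the 516-month history.
rng = np.random.default_rng(0)
returns = rng.normal(loc=0.5, scale=2.0, size=516)

# Historical 95% VaR cutoff: the return with a 5% chance of a worse outcome.
cutoff = np.percentile(returns, 5)

# Parametric (normal) cutoff: mean minus 1.645 sample standard deviations,
# where 1.645 is the one-sided 95% z-score of the standard normal.
parametric = returns.mean() - 1.645 * returns.std(ddof=1)

# Translate the cutoff return into a dollar VaR on the $100 million position.
portfolio = 100e6
print(f"historical cutoff:  {cutoff:.2f}%  -> VaR ${-cutoff / 100 * portfolio:,.0f}")
print(f"parametric cutoff:  {parametric:.2f}%  -> VaR ${-parametric / 100 * portfolio:,.0f}")
```

With data close to normal the two cutoffs agree closely; with fat-tailed or skewed returns the historical percentile is the safer estimate.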