Importance Sampling


Importance Sampling

andrea.odetti
Hi guys,

I'm trying to introduce importance sampling into the Monte Carlo framework of
QuantLib, but I have run into some problems.

I think a good place to work is the class
RandomNumbers::RandomArrayGenerator, where in the method next() I can change
the value of next_weight, but:

1) I would like to know the value of the Brownian motion, that is, the value
of sqrtCovariance_ * next_.value (but without the dt factor built into
sqrtCovariance_);

2) I don't know the value of the sum of all the weights until the end of the
simulation.

Actually, I'm using the Girsanov theorem to change the drift of my process,
but then I should multiply the payoff of each path by (in the case of a
single asset, with only one time step)

exp(-h * W(T) - 0.5 * h^2 * T)
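The weighting above can be sketched in a few lines of plain Python (not QuantLib code; function names and parameters here are illustrative only). It prices a European call under a single lognormal asset with one time step: the Brownian increment is sampled with an extra drift h, and each payoff is multiplied by the Girsanov weight exp(-h*W(T) - 0.5*h^2*T), where W(T) is the undrifted Brownian motion:

```python
import math
import random

def is_call_price(s0, k, r, sigma, t, h, n, seed=42):
    """Importance-sampled European call price: paths are drifted by h,
    and each payoff is multiplied by exp(-h*W - 0.5*h*h*t), where W is
    the undrifted Brownian motion at expiry (Andrea's weight above)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        w = math.sqrt(t) * rng.gauss(0.0, 1.0)   # undrifted W(T)
        w_shifted = w + h * t                     # sampled under the shifted drift
        st = s0 * math.exp((r - 0.5 * sigma**2) * t + sigma * w_shifted)
        weight = math.exp(-h * w - 0.5 * h * h * t)
        total += weight * max(st - k, 0.0)
    return math.exp(-r * t) * total / n

def bs_call(s0, k, r, sigma, t):
    """Black-Scholes closed form, used only as a reference check."""
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s0 * cdf(d1) - k * math.exp(-r * t) * cdf(d2)
```

For a deep out-of-the-money option, choosing h so that the drifted paths land near the strike concentrates samples where the payoff is nonzero, which is the point of the change of drift.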

Is there anybody who has an idea about where I should work?

bye bye

andrea




Re: Importance Sampling

Luigi Ballabio-4
Ciao Andrea,

At 07:26 PM 5/20/02 +0200, [hidden email] wrote:
>I'm trying to introduce importance sampling into the Monte Carlo framework
>of QuantLib, but I have some problems.
>
>I think a good place to work is the class
>RandomNumbers::RandomArrayGenerator where in the method next() I can
>change the value of next_weight, but...

I think RandomArrayGenerator is already too high-level, as it delegates the
actual random number generation to its generator_ data member; the type of
generator_ is determined by a RandomArrayGenerator template argument.

Unfortunately, the Gaussian generators currently implemented in QuantLib
are all unbiased, while for importance sampling, as I understand it, the
desired bias should be introduced from the beginning. Namely, once we have
drawn numbers from a Gaussian distribution G(x), we cannot introduce
importance sampling a posteriori. Instead, we should sample from a
distribution F(x) = G(x)H(x), where H(x) expresses the bias, and give
each sample x_i a weight 1/H(x_i).
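A minimal sketch of such a biased generator, assuming the simplest useful bias, a mean shift: H(x) = exp(t*x - t^2/2), so that F(x) = G(x)H(x) is the N(t,1) density. The class and method names below are made up for illustration, not QuantLib API:

```python
import math
import random

class BiasedGaussianGenerator:
    """Draws from F(x) = G(x)*H(x), where G is the standard normal
    density and H(x) = exp(shift*x - shift**2/2), i.e. F is the
    N(shift, 1) density.  Each sample carries the weight 1/H(x),
    as in Luigi's description above."""
    def __init__(self, shift, seed=0):
        self.shift = shift
        self.rng = random.Random(seed)

    def next(self):
        x = self.rng.gauss(self.shift, 1.0)
        # weight = 1/H(x) = exp(-shift*x + shift**2/2)
        weight = math.exp(-self.shift * x + 0.5 * self.shift**2)
        return x, weight
```

As a usage sanity check, shifting the mean to 4 lets one estimate the tiny tail probability P(X > 4) for a standard normal with far fewer samples than plain Monte Carlo would need, by averaging the weights of the samples that land in the tail.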

So we have one, or possibly two, problems here: one is that there's no such
biased generator in QuantLib (by the way, you are welcome to implement and
contribute one); the second is that you might want to introduce a different
bias for each component of your random array. I think this could be managed
by using different biased generators for different components, with the
caveat that one should be careful not to introduce artificial correlations
between them.

Bye for now,
                 Luigi




Re: Importance Sampling

Kris .
> introduce importance sampling a posteriori. Instead, we should sample from
> a distribution F(x) = G(x)H(x), where H(x) expresses the bias, and give
> each sample x_i a weight 1/H(x_i).

One way of looking at it is as a tilted distribution, with the mean
shifted:

  F(x) = G(x) * H(x)   where H(x) = N(t,1)(x) / N(0,1)(x)

where t is the constant by which you are shifting the mean and N(m,1)
denotes the normal density with mean m and unit variance; F(x) is then
just the N(t,1) density, and the per-sample weight 1/H(x) reduces to
exp(-t*x + t^2/2).

Kris
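Under this mean-shift view, a quick numerical sanity check (plain Python, illustrative only) is that the weights exp(-t*x + t^2/2) average to one when x is sampled from the shifted distribution N(t,1), since the weights exactly undo the tilt:

```python
import math
import random

def tilted_weight_mean(t, n, seed=7):
    """Sample x from N(t, 1) and average the importance weight
    G(x)/F(x) = exp(-t*x + t*t/2); the average should be close to 1,
    because the weight is exactly the density ratio N(0,1)/N(t,1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(t, 1.0)
        total += math.exp(-t * x + 0.5 * t * t)
    return total / n
```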