Hi, I’m new to QuantLib and am having a problem generating an implied volatility for an option when dividends are involved, using DividendVanillaOption and FDDividendAmericanEngine. If there are no dividends on the underlier, I have no problem generating a volatility using VanillaOption and FDAmericanEngine. The following code gives a “not enough points to interpolate” error:

    DividendVanillaOption option(stochasticProcess, payoff, amExercise,
                                 dividendDates, dividendPayments);
    Size timeSteps = 101;
    // Finite-differences engine
    option.setPricingEngine(boost::shared_ptr<PricingEngine>(
        new FDDividendAmericanEngine(timeSteps, timeSteps)));
    // Sample volatility given this FDDividendAmericanEngine
    Volatility v = option.impliedVolatility(optionPrice, 1.0e-4, 20,
                                            QL_MIN_VOLATILITY, QL_MAX_VOLATILITY);

Being a newbie, I lifted this pretty much from the AmericanOption sample, but changed it to use the dividend classes. I populate dividendDates and dividendPayments with Dates and Amounts for each dividend payment. I’ve also tried changing the timeSteps parameter between odd and even values with no change: I always get the “not enough points to interpolate” error. Like I say, code similar to the above using VanillaOption and FDAmericanEngine gives me a reasonable-looking volatility figure. Am I going about generating implied volatility in the right way? Any help or pointers greatly appreciated.

Thanks,
Ferghil O’Rourke
Looking through the mail archives I can’t find any mention of anyone having issues with DividendVanillaOption. Does anyone know if it works? I’m generating an implied volatility OK using VanillaOption, but when I have a dividend schedule I’m getting “not enough points to interpolate”. I pass the dates and dividend amounts in as two vectors of <Date> and <Real>, as per the documentation.

-Ferghil