http://quantlib.414.s1.nabble.com/SABR-global-versus-local-fit-tp17538p17552.html
So, when a strike/tenor/expiry falls between the points of the market quote grid, how is it interpolated?
The reason I ask: I calc'd -100bp 1x5 (93 vol) and 2x5 (78 vol) swaptions, and then calc'd a 1.5x5, but I forgot to move the strike to reflect the change in the forward. The vol I got was identical (to 5 decimal places) to the vol for the 1x5 at the same strike. When I moved it to reflect the new forward, I got a number in line with the midpoint between the -100bp 1x5 and 2x5. Is this the expected behavior?
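
For reference, the check I ran is roughly this (just a sketch; cube stands for my SwaptionVolCube1 handle and fwd1x5, fwd2x5, fwd15x5 for the respective forwards):

    // -100bp strikes relative to each forward; the Period/strike overload of
    // volatility() returns the interpolated vol for off-grid points as well
    Volatility v1x5  = cube->volatility(Period(1, Years),   Period(5, Years), fwd1x5  - 0.01, true);
    Volatility v2x5  = cube->volatility(Period(2, Years),   Period(5, Years), fwd2x5  - 0.01, true);
    // 1.5x5, first (by mistake) at the 1x5 strike, then at its own forward - 100bp
    Volatility wrong = cube->volatility(Period(18, Months), Period(5, Years), fwd1x5  - 0.01, true);
    Volatility right = cube->volatility(Period(18, Months), Period(5, Years), fwd15x5 - 0.01, true);
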
> On Jun 20, 2016, at 1:21 PM, Peter Caspers <[hidden email]> wrote:
>
> Hi Terry,
>
> yes, the strike is calculated and excluded if strike + displacement < cutoffstrike (a parameter in the cube's constructor, defaulted to 1bp to account for slight differences between the broker's and QL's ATM calculation). It would be useful to exclude NA strikes as you suggest. I guess in QL language an NA would best correspond to the concept of an "invalid quote" (e.g. a SimpleQuote with value = Null<Real>(), or a SimpleQuote with unspecified value, i.e. just default constructed). Yes, that's something for 1.9; what do you think, Luigi (also, how do we best represent NAs here)?
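>
> Schematically, per strike column, the rule is just this (a sketch with
> made-up names, not the actual cube code):
>
>     Real strike = atmForward + strikeSpread;              // absolute strike for this column
>     bool used = (strike + displacement >= cutoffStrike);  // cutoffStrike defaulted to 1bp
>     if (used)
>         calibrationStrikes.push_back(strike);             // only these enter the SABR fit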
>
> About your "stdDev (nan)" error, I am not sure; if NaNs are supplied only for strikes < -displacement that shouldn't matter indeed. We'd need a test case to look at; can you maybe provide a small example in this direction?
>
> Vega weighting: This is set in the cube's ctor; it is the parameter
>
> bool vegaWeightedSmileFit
>
> Duration weight: Yes, it would be implicit in the vega weight if we used the swaption's vega directly, but actually we have (see sabrinterpolation.hpp L136)
>
> blackFormulaStdDevDerivative(strike, forward, stdDev, 1.0, addParams[0]);
>
> where addParams[0] is the displacement and 1.0 stands in for the discounted annuity, so the weight is normalized in that respect. But again, this doesn't affect the single calibration at one option / swap tenor point, and the vega weights are normalized later in the optimization anyway, so it wouldn't even make a technical difference if we had the "correct" annuity in L136 of sabrinterpolation.hpp (we also have an extra sqrt(T) factor in it, since it is the derivative with respect to the standard deviation instead of the volatility, but this is also washed away in the normalization later). I still don't get the discussion around local vs global fit: the cube performs a series of local fits, there is no global objective function involved at this level (if I understand "global" and "local" correctly).
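>
> So, schematically, the weighting within one smile looks like this (only a
> sketch with made-up variable names, not the actual code in
> sabrinterpolation.hpp; it assumes using namespace QuantLib):
>
>     #include <ql/pricingengines/blackformula.hpp> // blackFormulaStdDevDerivative
>     #include <numeric>                            // std::accumulate
>
>     std::vector<Real> w(strikes.size());
>     for (Size i = 0; i < strikes.size(); ++i)
>         w[i] = blackFormulaStdDevDerivative(strikes[i], forward, stdDevs[i],
>                                             1.0, displacement); // annuity replaced by 1.0
>     Real sum = std::accumulate(w.begin(), w.end(), 0.0);
>     for (Size i = 0; i < w.size(); ++i)
>         w[i] /= sum; // the missing annuity and the common sqrt(T) cancel here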
>
> About the normal vols, yes, it would be good to support them in swaptionvolcube1x, and also to include the ZABR model in this generic cube (and the recent development by Antonov et al.). It makes a lot of sense and has been on my list for a while, but there is not so much time, unfortunately, as for all of us probably...
>
> About the shift, you just supply it together with the ATM matrix, then it is handled automatically.
>
> Kind Regards
> Peter
>
>> On 20 Jun 2016, at 18:38, terry leitch <[hidden email]> wrote:
>>
>> Also, I get the following error if I don't pre-process the NA's with a spline-fit replacement: "Error: stdDev (nan) must be non-negative". I replaced the NA's with zeros to eliminate this. This is confusing, because if the method is looking at the negative strike, why wouldn't it ignore the NA value?
>>
>>
>>> On Jun 20, 2016, at 10:48 AM, terry leitch <[hidden email]> wrote:
>>>
>>> Peter,
>>>
>>> Some follow ups:
>>> NA's: when you say there needs to be a value, do you mean a non-NA value? I feed the NA's in and the calibration fails, so I guess you calculate the strike and exclude it, rather than looking at the NA value and excluding it?
>>>
>>> Duration weight: weighting by vega implicitly weights by duration. How do you set the option to weight by vega?
>>>
>>> I misspoke: I read your ZABR presentation and was wondering about the normal implementation overall, not as it relates to swaptionvolcube1. I believe the structure is swaptionvolcube1x? Are there examples or test programs using it? I have both normal and lognormal vols, so it would be good to offer the option in the interface.
>>>
>>> New question: if I set shifted to true and supply a shift, do I need to feed in a discount curve representing the shift or do the methods handle this automatically?
>>>
>>> Terry Leitch
>>>
>>>> On Jun 18, 2016, at 3:26 PM, Peter Caspers <[hidden email]> wrote:
>>>>
>>>> Just in case you are interested in what is going on in QuantLib (and
>>>> to answer the original questions). The rest of the discussion
>>>> beginning with "ditch the QL implementation..." rather belongs in
>>>> some Wilmott thread, probably...
>>>>
>>>> NAs: Those occur in the market quotes for strikes below minus
>>>> displacement (for shifted lognormal quotes), and those strikes are
>>>> consistently ignored by the swaption vol cube 1; you still have to
>>>> feed some value, but it doesn't matter what it is, it does not enter
>>>> the calibration.
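>>>>
>>>> As a user-side sketch (made-up container names, not library code), the
>>>> pre-processing amounts to no more than:
>>>>
>>>>     for (Size i = 0; i < strikeSpreads.size(); ++i) {
>>>>         Real strike = atmForward + strikeSpreads[i];
>>>>         if (strike + shift < 0.0)
>>>>             volSpread[i] = 0.0; // any finite value; this column is ignored anyway
>>>>     }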
>>>>
>>>> Weights: You can minimize the RMSE in implied (shifted lognormal) vols
>>>> and optionally weight them by vega. The calibration is done per option
>>>> / swap point of the ATM swaption matrix, so it doesn't really make
>>>> sense (i.e. it wouldn't make any difference) to weight by underlying
>>>> annuity or anything. The reason why short underlyings are harder to fit
>>>> lies more in the nature of the market smiles for those, I believe.
>>>>
>>>> Normal vol option: There is none for the SABR cube, currently only
>>>> lognormal and shifted lognormal vols are supported. Where did you see
>>>> the option?
>>>>
>>>> Regards
>>>> Peter
>>>>
>>>>
>>>> On 18 June 2016 at 20:24, Mike DelMedico <[hidden email]> wrote:
>>>>> I guess the way to attack this is based on what you are trying to
>>>>> accomplish. Since I'm trading with the end product, I need extremely high
>>>>> levels of precision. If you are going to pass in common surface points
>>>>> (1m/3m/6m/1y etc) on common tenors (1y/2y/5y/10y/30y) using data from icap
>>>>> or tullet (not an exchange but rather data collection brokerages), then you
>>>>> would be fine to run risk analysis on portfolios, but that's about it. If
>>>>> you need to accurately mark portfolio NPV though, that system will be
>>>>> useless.
>>>>>
>>>>> The common SABR implementation, calibrating to Black vols, is very bad
>>>>> practice. Instead, you should calibrate to premiums only. I stopped
>>>>> using SABR to drive my surface 18 months ago and I've never looked back. You
>>>>> can't trade any of the parameters directly in the market, so why would you
>>>>> use them to set your skews or to measure risk?
>>>>>
>>>>> Try looking at the input data you are sending to the cube (hint: not the
>>>>> SABR parameters) more closely and you should see a pattern there. This
>>>>> pattern is the building block for a proper surface implementation.
>>>>>
>>>>> Also, the skew data from Tullett and ICAP is updated very infrequently, since
>>>>> skew trades only go through once in a while (like right now, when the market
>>>>> is repricing risk). Your ATM straddle prices are probably good to use,
>>>>> though. Use the forward premium prices to extract vols, not the spot
>>>>> premiums.
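>>>>>
>>>>> To be concrete, backing a vol out of a forward premium is just something like
>>>>> this (a sketch with made-up inputs; the forward premium is the price already
>>>>> divided by the annuity):
>>>>>
>>>>>     #include <ql/pricingengines/blackformula.hpp> // blackFormulaImpliedStdDev
>>>>>     #include <cmath>
>>>>>
>>>>>     // strike, forward, shift, forwardPremium, timeToExpiry assumed given
>>>>>     Real stdDev = blackFormulaImpliedStdDev(Option::Call, strike, forward,
>>>>>                                             forwardPremium, /*discount*/ 1.0,
>>>>>                                             /*displacement*/ shift);
>>>>>     Volatility vol = stdDev / std::sqrt(timeToExpiry);
>>>>>
>>>>> For normal quotes you would back out a Bachelier vol instead.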
>>>>>
>>>>> Hope this helps.
>>>>> Mike
>>>>>
>>>>> On Jun 18, 2016 11:47 AM, "terry leitch" <[hidden email]> wrote:
>>>>>>
>>>>>> OK, I take back what I said. I updated the curve and, more importantly,
>>>>>> priced with a cube with non-NA strikes. No interpolation. I get values at
>>>>>> 5-10 year expiries and a 5 year swap tenor that are within a couple of
>>>>>> tenths of a vol point of the quoted values, and also a price in line even at
>>>>>> -200bps.
>>>>>>
>>>>>> One odd thing is that the errors noted previously occurred when I
>>>>>> truncated the cube to just 2 expiries and 2 swap tenors to box the swaption,
>>>>>> but the errors seemed to increase with fewer points. I would expect it to
>>>>>> perform the same (if only a local fit) or better (if a global fit).
>>>>>>
>>>>>> The problem is determining when to drop tenors/expiries versus strikes due
>>>>>> to NA's. I used more tenors at the front and dropped strikes, but the
>>>>>> extrapolation came up short by about a vol point for the 5x5 -200bp strike,
>>>>>> whereas when I dropped expiries and kept strikes, I got a fairly precise
>>>>>> result.
>>>>>>
>>>>>> Work in progress.
>>>>>>
>>>>>> From: terry leitch <[hidden email]>
>>>>>> Date: Saturday, June 18, 2016 at 10:58 AM
>>>>>> To: Mike DelMedico <[hidden email]>
>>>>>> Cc: <[hidden email]>
>>>>>> Subject: Re: [Quantlib-users] SABR global versus local fit
>>>>>>
>>>>>> I was coming to that conclusion, but your two cents accelerated it. I
>>>>>> thought I could salvage the cube by paring the expiries and tenors down to
>>>>>> local choices, but it still seems to be off by about 5-10% of the vol level
>>>>>> for the trials I'm running.
>>>>>>
>>>>>> Did you use linear interpolation on the SABR coefficients? Do you
>>>>>> interpolate alpha by expiry and tenor, and the other coefficients by strike,
>>>>>> expiry, and tenor? Do you duration-weight the interpolated vols?
>>>>>>
>>>>>> I wonder what the added benefit is of using a 4-parameter stochastic model
>>>>>> that is arb-free at a finite number of times versus just using a
>>>>>> 3-dimensional interpolation on strike, expiry, and tenor.
>>>>>>
>>>>>> Too bad the cube is off. It’s close but not quite.
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Jun 17, 2016, at 3:48 PM, Mike DelMedico <[hidden email]> wrote:
>>>>>>
>>>>>> I would ditch the QL implementation of the swaption cube and go with local
>>>>>> SABR fits and some robust interpolation methods. Much more flexible, in my
>>>>>> opinion. I ran into problems when the data wasn't symmetrical, as is the case
>>>>>> in most of the major currencies. Just my two cents.
>>>>>>
>>>>>> On Jun 17, 2016 2:01 PM, "terry leitch" <[hidden email]> wrote:
>>>>>>>
>>>>>>> I'm currently writing an interface for RQuantLib into the SABR swaption
>>>>>>> module using SwaptionVolCube1. I have an end-of-day surface from a
>>>>>>> well-known exchange that covers everything from 1x1 to 10x30 swaptions at
>>>>>>> +/- 200bps around ATM. Due to its proprietary nature, I don't have
>>>>>>> permission to share it, but I will try to seek it so I can be more specific.
>>>>>>>
>>>>>>> The first issue is NA's. How does QuantLib handle them? The data at the
>>>>>>> front end has 1-5 strikes with NA's due to the low-rate environment. To
>>>>>>> keep the strikes consistent across all expiries I need to put in values,
>>>>>>> but the fit doesn't converge at the front end, possibly due to the
>>>>>>> interpolated values. I've tried several extrapolations, but none produce a
>>>>>>> cube that will converge in the fit, always exceeding the max error. My
>>>>>>> current view is that I need to develop vol cubes for a given problem based
>>>>>>> on the structure of expiries. So, for an xx5 I would build a shorter cube.
>>>>>>> Does anyone have a view?
>>>>>>>
>>>>>>> Second, how does the cube fit method weight the volatilities for swap
>>>>>>> tenors? Is it equal weighting? If it is, that would be an issue because the
>>>>>>> duration of a 1x1 is a fraction of a 1x30. I noticed most error messages in
>>>>>>> the fit involve 1Y swap tenors; is the fit weighting by duration but then
>>>>>>> failing on an absolute measure of volatility difference, a difference I
>>>>>>> might be willing to ignore due to overall magnitude?
>>>>>>>
>>>>>>> Third, are there any examples where the normal vol option has been used?
>>>>>>>
>>>>>>>
>>>
>>
>