Financial time series modelling frequently relies on ARCH and GARCH models, which are typically estimated by the computationally demanding (conditional) maximum likelihood procedure. In many practical applications, however, e.g. for high-frequency data, numerically more efficient techniques are needed to calibrate or control such models.
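As a point of reference, the following is a minimal sketch of the standard conditional maximum likelihood estimation of a GARCH(1,1) model; it is not the procedure analysed in this contribution, and the initialisation of the variance recursion, the starting values, and the choice of optimiser are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def garch11_neg_loglik(params, returns):
    """Negative conditional Gaussian log-likelihood of a GARCH(1,1) model."""
    omega, alpha, beta = params
    T = returns.size
    sigma2 = np.empty(T)
    sigma2[0] = returns.var()          # common (assumed) initialisation of the variance recursion
    for t in range(1, T):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(sigma2) + returns ** 2 / sigma2)

def fit_garch11_mle(returns):
    """Estimate (omega, alpha, beta) by numerically maximising the conditional likelihood."""
    x0 = np.array([0.1 * returns.var(), 0.05, 0.90])   # illustrative starting values
    bounds = [(1e-8, None), (0.0, 1.0), (0.0, 1.0)]
    res = minimize(garch11_neg_loglik, x0, args=(returns,),
                   bounds=bounds, method="L-BFGS-B")
    return res.x
```

The numerical optimisation over the full likelihood must be redone whenever new observations arrive, which is the computational burden that motivates recursive alternatives.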
The aim of this contribution is to analyse a two-stage recursive estimation procedure proposed for the standard GARCH model class. In particular, a Monte Carlo study is performed to examine the behaviour of this method.
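A generic Monte Carlo design for assessing a GARCH estimator is sketched below; it reuses `fit_garch11_mle` from the sketch above as a placeholder estimator and does not reproduce the specific two-stage recursive procedure or the experimental settings of this study, which are hypothetical here.

```python
import numpy as np

def simulate_garch11(T, omega, alpha, beta, rng):
    """Generate one GARCH(1,1) return path with Gaussian innovations."""
    returns = np.empty(T)
    sigma2 = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    for t in range(T):
        returns[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = omega + alpha * returns[t] ** 2 + beta * sigma2
    return returns

def monte_carlo_study(n_replications=500, T=2000,
                      true_params=(0.1, 0.05, 0.90), seed=1):
    """Repeatedly simulate and re-estimate to summarise bias and dispersion of an estimator."""
    rng = np.random.default_rng(seed)
    estimates = np.empty((n_replications, 3))
    for r in range(n_replications):
        y = simulate_garch11(T, *true_params, rng)
        estimates[r] = fit_garch11_mle(y)    # swap in the estimator under study here
    bias = estimates.mean(axis=0) - np.array(true_params)
    return bias, estimates.std(axis=0)
```

Replacing the estimator call inside the loop with the recursive procedure under investigation yields the kind of bias and dispersion summaries used to judge its behaviour.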
Although the authors of the recursive technique have argued for its adequacy, the results delivered by the simulation experiments are not convincing. These findings indicate a need for revisions of the procedure; accordingly, the main ideas for future research are introduced and discussed.