Predictability of Ensemble Forecasting Estimated Using the Kullback-Leibler Divergence in the Lorenz Model
Abstract
A new method is presented for quantifying the predictability limit of ensemble forecasting using the Kullback-Leibler (KL) divergence (also called the relative entropy), which measures the difference between the probability distributions of ensemble forecasts and local reference (true) states. Unlike the previous method based on the ensemble spread, the KL divergence is applicable to non-normal distributions of ensemble forecasts, a substantial improvement. An example from the three-variable Lorenz model demonstrates that the KL divergence can effectively quantify the predictability limit of ensemble forecasting. On this basis, the KL divergence is used to investigate how the predictability limit of ensemble forecasting depends on the initial states and on the magnitude of the initial errors. The local predictability limit of ensemble forecasting is found to vary considerably with both the initial states and the magnitude of the initial errors. Further research is needed to examine real-world applications of the KL divergence in measuring the predictability of ensemble weather forecasts.
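
To make the measure concrete, the discrete KL divergence between a forecast distribution P and a reference distribution Q is D_KL(P‖Q) = Σ_i p_i ln(p_i/q_i). The sketch below (not the authors' code) illustrates the idea in the three-variable Lorenz (1963) model: an ensemble of slightly perturbed initial states is integrated forward, and a histogram-based KL divergence is computed against a distribution of reference states drawn from a long control trajectory. All function names and parameter values (ensemble size, error magnitude, lead time, bin count) are illustrative assumptions.

```python
# Minimal sketch: histogram-based KL divergence between an ensemble
# forecast and reference states in the Lorenz (1963) model.
# Parameter values and function names are illustrative, not the paper's.
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the three-variable Lorenz (1963) system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z), x * y - beta * z])

def rk4_step(state, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(state, n_steps, dt=0.01):
    """Integrate the model for n_steps and return the trajectory."""
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        state = rk4_step(state, dt)
        traj[i] = state
    return traj

def kl_divergence(samples_p, samples_q, bins=30, eps=1e-12):
    """Estimate D_KL(P||Q) from scalar samples via shared-bin histograms.

    P: ensemble-forecast samples; Q: reference-state samples.
    A small eps avoids log(0) in empty bins.
    """
    lo = min(samples_p.min(), samples_q.min())
    hi = max(samples_p.max(), samples_q.max())
    p, _ = np.histogram(samples_p, bins=bins, range=(lo, hi))
    q, _ = np.histogram(samples_q, bins=bins, range=(lo, hi))
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
truth0 = integrate(np.array([1.0, 3.0, 15.0]), 5000)[-1]  # spin-up

# Ensemble: perturb the true initial state with small random errors.
n_members, err_mag, lead_steps = 200, 1e-3, 800
members = truth0 + err_mag * rng.standard_normal((n_members, 3))
forecasts = np.array([integrate(m, lead_steps)[-1] for m in members])

# Reference distribution: states from a long control run, standing in
# for the paper's "local reference (true) states".
reference = integrate(truth0, 20000)[::10]

d_kl = kl_divergence(forecasts[:, 0], reference[:, 0])
print(f"KL divergence (x-component) at lead {lead_steps} steps: {d_kl:.3f}")
```

In such a setup, the predictability limit would be diagnosed from how D_KL evolves with lead time, e.g., the lead at which it falls to (or rises toward) a saturation level; repeating the computation for different initial states and error magnitudes probes the dependencies discussed in the abstract.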