In this contribution we focus on the recently introduced dynamic distributed estimation in the exponential family of distributions. We consider a set of cooperative nodes, each modelling a random variable by a probability density function (pdf) that depends on an unknown parameter theta.
To improve their estimates of this parameter, the nodes share their observations and/or their estimates of theta. Since the method exploits Bayes' rule, the nodes are also allowed to share the hyperparameters.
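For concreteness (the notation below is illustrative rather than fixed by this text), the observation model can be written in the standard exponential-family form, with a conjugate prior on theta parametrized by hyperparameters (nu, tau):

\[
p(x \mid \theta) = h(x)\,\exp\!\big(\eta(\theta)^{\top} T(x) - A(\theta)\big),
\qquad
\pi(\theta \mid \nu, \tau) \propto \exp\!\big(\eta(\theta)^{\top} \tau - \nu\, A(\theta)\big),
\]

where T(x) denotes the sufficient statistic; the hyperparameters (nu, tau) are the quantities the nodes exchange.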
Based on the data provided, we distinguish two phases: the adaptation phase and the combination phase. In the adaptation phase we first minimize a cost function based on the Kullback-Leibler (KL) divergence, which results in the weighted geometric mean of the pdfs of the observations.
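In symbols (with nonnegative weights a_ell summing to one over the neighbourhood, our notation), the minimizer of the weighted KL criterion is the weighted geometric mean of the neighbours' observation pdfs:

\[
\tilde{p}(x) \;=\; \arg\min_{q} \sum_{\ell} a_{\ell}\, D_{\mathrm{KL}}\big(q \,\|\, p_{\ell}\big)
\;\propto\; \prod_{\ell} p_{\ell}(x)^{a_{\ell}}.
\]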
This geometric mean then enters Bayes' rule and leads to an update of the hyperparameters. In the combination phase the nodes share their hyperparameters and/or their estimates of theta.
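Under the conjugate forms sketched above, this Bayes step admits a closed-form hyperparameter update; a sketch at time k+1 (our notation, with x_ell the observation of neighbour ell) reads

\[
\nu_{k+1} = \nu_{k} + \sum_{\ell} a_{\ell},
\qquad
\tau_{k+1} = \tau_{k} + \sum_{\ell} a_{\ell}\, T(x_{\ell}),
\]

so the weighted geometric mean of the likelihoods simply weights the sufficient statistics of the shared observations.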
We then exploit the KL divergence again and obtain either a weighted linear combination of the provided hyperparameters or of the estimates of the parameter. By repeating the diffusion steps whenever new observations and estimates become available, we obtain a dynamic version of the method.
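With the reversed order of the KL arguments, the unconstrained minimizer is the linear pool of the shared pdfs; in the conjugate parametrization the combination step can accordingly be read as a convex combination of hyperparameters (again a sketch in our notation, with combination weights c_ell):

\[
\arg\min_{q} \sum_{\ell} c_{\ell}\, D_{\mathrm{KL}}\big(p_{\ell} \,\|\, q\big) \;=\; \sum_{\ell} c_{\ell}\, p_{\ell},
\qquad
\nu \leftarrow \sum_{\ell} c_{\ell}\, \nu_{\ell},
\quad
\tau \leftarrow \sum_{\ell} c_{\ell}\, \tau_{\ell}.
\]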
An inseparable part of the procedure is the assignment of the weights. Static weights, which do not change with new sets of data, lead to estimates with reasonable properties but do not reflect the nodes' reliability.
We therefore suggest modelling the weights as well, in order to reflect this reliability and to improve the resulting estimates of theta. The minimum cross-entropy principle yields a new probability distribution over the weights with every new set of data.
Additional constraints on the expected values of the KL divergences help us to distinguish changes in the nodes' estimates and to capture probable node failures.
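A sketch of the resulting weight update (the constraint functions g_j are our illustrative stand-ins for the expected KL divergences mentioned above, and lambda_j are the corresponding Lagrange multipliers): the minimum cross-entropy principle selects, among all distributions satisfying the constraints, the one closest to the current distribution q_k over the weight vector c,

\[
q_{k+1} = \arg\min_{q} D_{\mathrm{KL}}\big(q \,\|\, q_{k}\big)
\quad\text{s.t.}\quad \mathbb{E}_{q}\big[g_{j}(c)\big] = \kappa_{j},
\qquad\Longrightarrow\qquad
q_{k+1}(c) \;\propto\; q_{k}(c)\, e^{-\sum_{j} \lambda_{j}\, g_{j}(c)},
\]

so a node whose pdf drifts far from its neighbours' (a large expected KL divergence) is automatically downweighted.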