This function computes the effective sample size (ESS) given a vector of weights, using the classical \((\sum_i w_i)^2 / \sum_i w_i^2\) formula (sometimes referred to as "Kish's effective sample size").
Details
The effective sample size (ESS) is the number of equally weighted observations that would carry the same amount of information as the weighted sample. When the weights vary substantially, the ESS can be much smaller than the actual number of observations. Formally:
$$ \mathrm{ESS} = \frac{\left(\sum_i w_i\right)^2}{\sum_i w_i^2}. $$
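The following is a minimal sketch of this computation in Python (not the package's actual implementation); the function name `effective_sample_size` is chosen here for illustration only.

```python
import numpy as np

def effective_sample_size(weights):
    """Kish's ESS: (sum of weights)^2 / (sum of squared weights)."""
    w = np.asarray(weights, dtype=float)
    return w.sum() ** 2 / np.sum(w ** 2)

# Equal weights recover the nominal sample size ...
print(effective_sample_size([1.0, 1.0, 1.0, 1.0]))   # 4.0
# ... while highly unequal weights shrink it toward 1.
print(effective_sample_size([10.0, 0.1, 0.1, 0.1]))  # ~1.06
```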
Diagnostic Value:
Indicator of Weight Concentration: A large discrepancy between ESS and the actual sample size indicates that a few observations carry disproportionately large weights, effectively reducing the usable information in the dataset.
Variance Inflation: A small ESS signals that weighted estimates are driven by a handful of observations, which inflates their variance and standard errors.
Practical Guidance: If the ESS is much lower than the total sample size, it is advisable to investigate why some weights are extremely large or small. Techniques such as weight trimming or stabilized weights can be employed to mitigate the issue, as sketched below.
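As a hedged illustration of weight trimming, the sketch below caps heavy-tailed weights at an assumed 99th-percentile threshold and compares the ESS before and after; the threshold and the simulated weights are assumptions for the example, not a recommendation.

```python
import numpy as np

def ess(w):
    # Kish's effective sample size
    return w.sum() ** 2 / np.sum(w ** 2)

rng = np.random.default_rng(0)
w = np.exp(rng.normal(0.0, 2.0, size=1_000))  # heavy-tailed weights

cap = np.percentile(w, 99)          # trimming threshold (assumed for illustration)
w_trimmed = np.minimum(w, cap)      # truncate the most extreme weights

print(f"ESS raw:     {ess(w):.1f} of {w.size}")
print(f"ESS trimmed: {ess(w_trimmed):.1f} of {w.size}")
```

Trimming trades a small amount of bias for a larger ESS (and hence lower variance); whether that trade-off is acceptable depends on the application.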