Moran's I


Moran's I is a measure of spatial autocorrelation defined as

I = \dfrac{N \Sigma_i \Sigma_j w_{ij}(x_i - \overline{x})(x_j - \overline{x})}{(\Sigma_i \Sigma_j w_{ij}) \Sigma_i(x_i - \overline{x})^2},

or, in words,

I = \dfrac{(\text{number of observations})(\text{sum of weighted products of deviations})}{(\text{sum of weights})(\text{sum of squared deviations})},

where N is the number of observations, W is a matrix of weights w_{ij}, and x_i, x_j are observations with \overline{x} their average. The recurring term (x_i - \overline{x}) is each observation's deviation from the mean. If, for example, N = 5, we might have \mathbf{x} = \{1, 3, 4, 5, 2\}, making the average \overline{x} = 3 and the sum of squared deviations \Sigma_i (x_i - \overline{x})^2 = 4 + 0 + 1 + 4 + 1 = 10.
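As a quick check, here is a minimal Python sketch that reproduces these numbers from the example values above:

```python
x = [1, 3, 4, 5, 2]                        # the example observations
N = len(x)                                 # N = 5
xbar = sum(x) / N                          # average: 3.0
ss = sum((xi - xbar) ** 2 for xi in x)     # sum of squared deviations: 10.0
print(N, xbar, ss)
```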

Given the matrix of weights W, we can compute the sums \Sigma_i \Sigma_j w_{ij}(x_i - \overline{x})(x_j - \overline{x}) in the numerator and \Sigma_i \Sigma_j w_{ij} in the denominator. Combining all the parts,

I = \dfrac{N \Sigma_i \Sigma_j w_{ij}(x_i - \overline{x})(x_j - \overline{x})}{(\Sigma_i \Sigma_j w_{ij}) \Sigma_i(x_i - \overline{x})^2} = \dfrac{5 \, \Sigma_i \Sigma_j w_{ij}(x_i - 3)(x_j - 3)}{10 \, \Sigma_i \Sigma_j w_{ij}},

which for our example now depends only on the choice of weights W.
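The formula translates directly into code. The sketch below is one way to compute it in Python; since the text does not fix a particular W, the matrix W_example is a hypothetical choice made up for illustration (binary adjacency of five points in a row, w_{ij} = 1 when |i - j| = 1 and 0 otherwise):

```python
def morans_i(x, w):
    """Moran's I for a list of observations x and an N x N weight matrix w."""
    n = len(x)
    xbar = sum(x) / n
    num = sum(w[i][j] * (x[i] - xbar) * (x[j] - xbar)
              for i in range(n) for j in range(n))
    w_sum = sum(w[i][j] for i in range(n) for j in range(n))
    ss = sum((xi - xbar) ** 2 for xi in x)
    return n * num / (w_sum * ss)

# Hypothetical weights: neighbors are adjacent points in a row of five.
W_example = [[1 if abs(i - j) == 1 else 0 for j in range(5)] for i in range(5)]
print(morans_i([1, 3, 4, 5, 2], W_example))
```

For this particular adjacency the neighbor products of deviations (0, 0, 2, -2) sum to zero, so I comes out exactly 0; a different choice of weights gives a different value.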

When w_{ij} is an inverse distance between locations i and j, the numerator is large when pairs of points that are close together tend to deviate from the mean in the same direction. The denominator is large when the points are close together (making the weights large) and the observed values are spread far from the mean.
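To illustrate this interpretation, the sketch below builds an inverse-distance weight matrix from hypothetical one-dimensional coordinates (the five points placed at 0, 1, 2, 3, 4; these positions are not part of the original example) and evaluates I for the same observations:

```python
coords = [0.0, 1.0, 2.0, 3.0, 4.0]   # hypothetical locations of the five points
x = [1, 3, 4, 5, 2]                  # the same observations as above
n = len(x)

# Inverse-distance weights: w_ij = 1 / |coord_i - coord_j|, with zero on the diagonal.
w = [[0.0 if i == j else 1.0 / abs(coords[i] - coords[j]) for j in range(n)]
     for i in range(n)]

xbar = sum(x) / n
num = sum(w[i][j] * (x[i] - xbar) * (x[j] - xbar)
          for i in range(n) for j in range(n))
I = n * num / (sum(map(sum, w)) * sum((xi - xbar) ** 2 for xi in x))
print(I)   # about -0.18 for this made-up layout
```

The slightly negative value says that, for this made-up layout, nearby points tend to fall on opposite sides of the mean more often than not.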