Compute distance-based Local Moran for a 527k+ point dataset using the spdep library

Asked: 2019-04-16 23:16:47

Tags: r spdep

As the title says, I'm trying to compute Local Moran for a 527k-point dataset using the spdep package, building the neighborhoods based on distance. The general process I'm following is:

library(spdep)

# Convert the point coordinates to a matrix
matrix_pts <- as.matrix(coordinates)

# Build distance-based neighborhoods (all points between d1 and d2 apart)
neighbors <- dnearneigh(matrix_pts,
                        d1 = 0,
                        d2 = range)

# Build the spatial weights list from the neighborhoods
wm <- nb2listw(neighbors,
               zero.policy = TRUE,
               style = style)

# Compute the Local Moran statistic for each point
moran_stat <- localmoran(value,
                         wm,
                         zero.policy = TRUE)
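
For concreteness, a self-contained toy version of the same pipeline would look something like this (made-up coordinates and values; the 0.05 threshold and the "W" style are arbitrary stand-ins for my actual range and style):

library(spdep)

# Toy data: 1,000 random points and a random attribute value
set.seed(1)
coordinates <- cbind(x = runif(1000), y = runif(1000))
value       <- rnorm(1000)

matrix_pts <- as.matrix(coordinates)
neighbors  <- dnearneigh(matrix_pts, d1 = 0, d2 = 0.05)
wm         <- nb2listw(neighbors, zero.policy = TRUE, style = "W")
moran_stat <- localmoran(value, wm, zero.policy = TRUE)
head(moran_stat)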

But I've run into a problem: I can't create the neighborhoods with dnearneigh, since the dataset is far too large and each neighborhood contains 200-1000 points.
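
(For what it's worth, one alternative I've been considering for the neighbor search itself is a k-d-tree fixed-radius query, e.g. dbscan::frNN; the sketch below is untested at this scale, and I still wouldn't know how to get from its output to a listw.)

library(dbscan)

# Untested sketch: fixed-radius neighbor search with a k-d tree.
# eps plays the role of d2 above; fr$id is a list holding, for each point,
# the integer indices of its neighbors within eps (self-matches excluded,
# if I read the dbscan documentation correctly).
fr <- frNN(matrix_pts, eps = range)
str(fr$id[1:3])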

I tried the solution described here, and ended up with a data frame whose first column holds the point IDs and whose second column holds a list of the IDs of each point's neighbors, e.g.:

   id       int_ids
1: 239226 239226,242762,339386,444833,243000,240521,...
2: 242762 239226,242762,339386,444833,243000,240521,...
3: 339386 239226,242762,339386,444833,243000,240521,...
4: 444833 239226,242762,339386,444833,243000,240521,...
5: 243000 239226,242762,339386,444833,243000,240521,...
6: 240521 239226,242762,339386,444833,243000,240521,...

However, I don't know how to create the nb object that nb2listw requires, and digging around hasn't helped much.

Is there a way to transform this data frame into an nb object? If I could, would the weight matrix be as hard to create as the neighbors? Is there another way to compute Local Moran for a dataset of this size?
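
In case it clarifies what I'm after, the conversion I imagine looks roughly like this (an untested sketch; it assumes the table above is a data frame nb_df whose int_ids column stores the neighbor IDs as comma-separated strings, and that an nb object is essentially a list of sorted integer vectors of neighbor positions with class "nb", a region.id attribute, and 0L marking points without neighbors):

library(spdep)

ids <- nb_df$id

nb <- lapply(seq_along(ids), function(i) {
  # Parse the comma-separated neighbor IDs for point i
  nbr_ids <- as.integer(strsplit(nb_df$int_ids[i], ",", fixed = TRUE)[[1]])
  # Translate IDs into row positions, drop the point itself and duplicates
  idx <- match(nbr_ids, ids)
  idx <- sort(unique(idx[!is.na(idx) & idx != i]))
  if (length(idx) == 0L) 0L else as.integer(idx)
})
attr(nb, "region.id") <- as.character(ids)
class(nb) <- "nb"
summary(nb)   # sanity check before building the weights

# If this really is a valid nb object, the rest would follow as before:
wm <- nb2listw(nb, zero.policy = TRUE, style = "W")
moran_stat <- localmoran(value, wm, zero.policy = TRUE)

But I'm not sure whether a hand-built nb object like this is safe to feed into nb2listw and localmoran, or whether spdep expects additional attributes to be present.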

0 Answers:

No answers