Represents feature images as neighborhoods sampled across scales. Each subject has a label image and a list of feature images, and these labels and features should be of the same type for all subjects, e.g. each subject has a k-label image whose labels cover the same anatomy and whose feature images measure the same quantities. For each label in the mask, and for each feature, the function draws a multi-resolution neighborhood sample from the data within that label. Consequently, two labels double the width of the predictor matrix, each additional scale or feature widens it further, and more samples add rows. The labels allow predictors to be collected side by side from different parts of an image-based feature set. Future work will allow other covariates.
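
For intuition, here is a minimal sketch comparing the predictor-matrix width obtained from a single-label mask with that from a three-label segmentation; the roughly threefold increase in columns is the expectation implied by the description above, not a guaranteed output, and the image and parameter choices are illustrative only.

library(ANTsR)
img  <- antsImageRead(getANTsRData("r16"))
seg  <- kmeansSegmentation(img, 3)$segmentation   # three labels
one  <- thresholdImage(seg, 1, 3)                 # collapse to a single label
feats <- list(img, iMath(img, "Grad"))            # two features
mOne   <- getMultiResFeatureMatrix(feats, one, rad = c(1, 1),
  multiResSchedule = c(2, 1), nsamples = 10)
mThree <- getMultiResFeatureMatrix(feats, seg, rad = c(1, 1),
  multiResSchedule = c(2, 1), nsamples = 10)
c(ncol(mOne), ncol(mThree))   # expect roughly a threefold increase in width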

getMultiResFeatureMatrix(
  x,
  labelmask,
  rad = NA,
  multiResSchedule = c(0),
  nsamples = 10
)

Arguments

x

A list of feature images.

labelmask

The label mask defines the image space for the associated features; each label contributes its own block of predictors, so more labels means more predictors. A different mask may be used for each feature, if desired, in which case this argument should be a list of masks whose length equals the number of features (see the sketch following this Arguments section).

rad

A vector of length equal to the image dimensionality d, defining the neighborhood radius in each dimension.

multiResSchedule

A vector of smoothing values defining the multi-resolution schedule; each additional scale adds predictors.

nsamples

The number of samples to take for each label.
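
As a hedged illustration of the per-feature mask option described for labelmask, the sketch below passes a list of two masks, one per feature; the specific masks (a foreground mask from getMask and a k-means segmentation) and the parameter values are illustrative choices only.

library(ANTsR)
img  <- antsImageRead(getANTsRData("r16"))
grad <- iMath(img, "Grad")
maskForIntensity <- getMask(img)                             # binary foreground mask
labelsForGrad    <- kmeansSegmentation(img, 3)$segmentation  # 3-label mask
featMat <- getMultiResFeatureMatrix(
  list(img, grad),
  list(maskForIntensity, labelsForGrad),  # one mask per feature
  rad = c(1, 1),
  multiResSchedule = c(0),
  nsamples = 25
)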

Value

mat, a matrix of predictors with nsamples rows.

Author

Avants BB, Tustison NJ

Examples


library(ANTsR)

# read a 2-D demo image and create a 3-class segmentation to use as labels
img <- antsImageRead(getANTsRData("r16"))
seg <- kmeansSegmentation(img, 3)$segmentation

# two features: raw intensity and its gradient magnitude
flist <- list(img, iMath(img, "Grad"))

# neighborhood radius 1 in each dimension, two smoothing scales, 10 samples per label
featMat <- getMultiResFeatureMatrix(flist, seg,
  rad = c(1, 1),
  multiResSchedule = c(2, 1), nsamples = 10
)
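
A quick check of the result; the row count should reflect nsamples, while the column count grows with the number of labels, features, scales, and the neighborhood size (a hedged expectation based on the description above).

dim(featMat)   # expect nsamples rows; columns grow with labels, features, and scales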