[med-svn] [Git][med-team/nipy][master] 5 commits: d/control: build depends on pybuild-plugin-pyproject.

Étienne Mollier (@emollier) gitlab at salsa.debian.org
Thu Aug 17 20:10:49 BST 2023



Étienne Mollier pushed to branch master at Debian Med / nipy


Commits:
cdcb104a by Étienne Mollier at 2023-08-17T21:01:54+02:00
d/control: build depends on pybuild-plugin-pyproject.

- - - - -
df190faf by Étienne Mollier at 2023-08-17T21:03:05+02:00
nibabel5.1.0.patch: new: resolve test failures.

These test failures stem from the newer nibabel version introduced in
Debian, which deprecates a couple of functions, causing build-time test
errors in nipy.

Closes: #1042053

- - - - -
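[Editorial note: the deprecation behind the patch above is nibabel's move from
`setattr_on_read` to `auto_attr`, a compute-once cached attribute. The sketch
below uses a minimal stand-in descriptor so it runs without nibabel installed;
real code would simply do `from nibabel.onetime import auto_attr`, and the
`Image` class here is hypothetical, not nipy's.]

```python
class auto_attr:
    """Stand-in for nibabel.onetime.auto_attr: compute the attribute on
    first access, then store the result on the instance so the descriptor
    is never consulted again (it is a non-data descriptor)."""
    def __init__(self, func):
        self.func = func
    def __set_name__(self, owner, name):
        self.name = name
    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        value = self.func(obj)
        # Instance attribute now shadows the descriptor on later lookups.
        setattr(obj, self.name, value)
        return value


class Image:
    """Hypothetical image class illustrating the decorator's use."""
    def __init__(self, data):
        self._data = data
        self.computations = 0

    @auto_attr
    def shape(self):
        self.computations += 1  # count how often the body actually runs
        return len(self._data)


img = Image([1, 2, 3])
assert img.shape == 3
assert img.shape == 3
assert img.computations == 1  # computed once, then served from cache
```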
ac3a6f50 by Étienne Mollier at 2023-08-17T21:04:58+02:00
standard-gifty-support.patch: new: fix gifti error.

- - - - -
c620ecbb by Étienne Mollier at 2023-08-17T21:06:23+02:00
remove-imagefileerror.patch: new: remove obsolete import.

- - - - -
ce3ea4a2 by Étienne Mollier at 2023-08-17T21:09:21+02:00
ready to upload to unstable.

- - - - -
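[Editorial note: several hunks in the patches below replace `img.get_data()`
with `img.get_fdata()`. The practical difference in nibabel is that
`get_data()` returned whatever dtype resulted from the on-disk data and
scaling, while `get_fdata()` always returns floating-point values. The
`FakeImage` class below is a hypothetical illustration of that contract, not
nibabel's real implementation.]

```python
class FakeImage:
    """Hypothetical minimal image: integer voxels on disk plus NIfTI-style
    slope/intercept scaling, as applied by get_fdata()."""
    def __init__(self, raw, slope=1.0, inter=0.0):
        self._raw = raw        # stored voxel values (e.g. int16 on disk)
        self._slope = slope    # scl_slope
        self._inter = inter    # scl_inter

    def get_fdata(self):
        # Always floating point, with scaling applied -- this predictability
        # is why nibabel deprecated get_data() in favour of get_fdata().
        return [float(v) * self._slope + self._inter for v in self._raw]


img = FakeImage([1, 2, 3], slope=2.0, inter=0.5)
assert img.get_fdata() == [2.5, 4.5, 6.5]
assert all(isinstance(v, float) for v in img.get_fdata())
```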


6 changed files:

- debian/changelog
- debian/control
- + debian/patches/nibabel5.1.0.patch
- + debian/patches/remove-imagefileerror.patch
- debian/patches/series
- + debian/patches/standard-gifty-support.patch


Changes:

=====================================
debian/changelog
=====================================
@@ -1,10 +1,21 @@
-nipy (0.5.0-8) UNRELEASED; urgency=medium
+nipy (0.5.0-8) unstable; urgency=medium
 
   [ Yaroslav Halchenko ]
   * removed myself and Michael since we have not attended to this package for
     awhile
 
- -- Étienne Mollier <emollier at debian.org>  Sat, 04 Feb 2023 15:07:02 +0100
+  [ Étienne Mollier ]
+  * nibabel5.0.0.patch: update following review upstream.
+    Thanks to Chris Markiewicz and Yaroslav Halchenko
+  * d/control: build depends on pybuild-plugin-pyproject.
+  * nibabel5.1.0.patch: new: resolve test failures.
+    These test failures are related to the introduction of the newer
+    nibabel version in Debian, which deprecates a couple of functions,
+    resulting in some build time test errors in nipy. (Closes: #1042053)
+  * standard-gifty-support.patch: new: fix gifti error.
+  * remove-imagefileerror.patch: new: remove obsolete import.
+
+ -- Étienne Mollier <emollier at debian.org>  Thu, 17 Aug 2023 21:07:20 +0200
 
 nipy (0.5.0-7) unstable; urgency=medium
 


=====================================
debian/control
=====================================
@@ -6,6 +6,7 @@ Testsuite: autopkgtest-pkg-python
 Priority: optional
 Build-Depends: debhelper-compat (= 13),
                dh-python,
+               pybuild-plugin-pyproject,
                python3-all-dev,
                python3-setuptools,
                python3-scipy,


=====================================
debian/patches/nibabel5.1.0.patch
=====================================
@@ -0,0 +1,3372 @@
+Description: fix deprecation errors in reverse dependencies.
+ Calls to nibabel.onetime.setattr_on_read are deprecated since nibabel 3.2,
+ and raise an ExpiredDeprecationError since nibabel 5.0, causing test failures
+ in reverse dependencies such as nipype.  This patch migrates the code to the
+ documented replacement nibabel.onetime.auto_attr.
+Author: Étienne Mollier <emollier at debian.org>
+Bug-Debian: https://bugs.debian.org/1042133
+Forwarded: no
+Last-Update: 2023-08-17
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+--- nipy.orig/nipy/core/image/image.py
++++ nipy/nipy/core/image/image.py
+@@ -19,7 +19,7 @@
+ 
+ import numpy as np
+ 
+-from nibabel.onetime import setattr_on_read
++from nibabel.onetime import auto_attr
+ 
+ # These imports are used in the fromarray and subsample functions only, not in
+ # Image
+@@ -80,27 +80,27 @@
+                                np.diag([3,5,7,1]))
+     _doc['coordmap'] = "Affine transform mapping from axes coordinates to reference coordinates."
+ 
+-    @setattr_on_read
++    @auto_attr
+     def shape(self):
+         return self._data.shape
+     _doc['shape'] = "Shape of data array."
+ 
+-    @setattr_on_read
++    @auto_attr
+     def ndim(self):
+         return len(self._data.shape)
+     _doc['ndim'] = "Number of data dimensions."
+ 
+-    @setattr_on_read
++    @auto_attr
+     def reference(self):
+         return self.coordmap.function_range
+     _doc['reference'] = "Reference coordinate system."
+ 
+-    @setattr_on_read
++    @auto_attr
+     def axes(self):
+         return self.coordmap.function_domain
+     _doc['axes'] = "Axes of image."
+ 
+-    @setattr_on_read
++    @auto_attr
+     def affine(self):
+         if hasattr(self.coordmap, "affine"):
+             return self.coordmap.affine
+@@ -280,9 +280,9 @@
+             order = [self.axes.index(s) for s in order]
+         new_cmap = self.coordmap.reordered_domain(order)
+         # Only transpose if we have to so as to avoid calling
+-        # self.get_data
++        # self.get_fdata
+         if order != list(range(self.ndim)):
+-            new_data = np.transpose(self.get_data(), order)
++            new_data = np.transpose(self.get_fdata(), order)
+         else:
+             new_data = self._data
+         return self.__class__.from_image(self,
+@@ -344,19 +344,19 @@
+     def __setitem__(self, index, value):
+         """Setting values of an image, set values in the data array."""
+         warnings.warn("Please don't use ``img[x] = y``; use "
+-                      "``img.get_data()[x]  = y`` instead",
++                      "``img.get_fdata()[x]  = y`` instead",
+                       DeprecationWarning,
+                       stacklevel=2)
+         self._data[index] = value
+ 
+     def __array__(self):
+         """Return data as a numpy array."""
+-        warnings.warn('Please use get_data instead - will be deprecated',
++        warnings.warn('Please use get_fdata instead - will be deprecated',
+                       DeprecationWarning,
+                       stacklevel=2)
+-        return self.get_data()
++        return self.get_fdata()
+ 
+-    def get_data(self):
++    def get_fdata(self):
+         """Return data as a numpy array."""
+         return np.asanyarray(self._data)
+ 
+@@ -371,7 +371,7 @@
+         Returns
+         -------
+         img_subsampled: Image
+-            An Image with data self.get_data()[slice_object] and an
++            An Image with data self.get_fdata()[slice_object] and an
+             appropriately corrected CoordinateMap.
+ 
+         Examples
+@@ -380,10 +380,10 @@
+         >>> from nipy.testing import funcfile
+         >>> im = load_image(funcfile)
+         >>> frame3 = im[:,:,:,3]
+-        >>> np.allclose(frame3.get_data(), im.get_data()[:,:,:,3])
++        >>> np.allclose(frame3.get_fdata(), im.get_fdata()[:,:,:,3])
+         True
+         """
+-        data = self.get_data()[slice_object]
++        data = self.get_fdata()[slice_object]
+         g = ArrayCoordMap(self.coordmap, self.shape)[slice_object]
+         coordmap = g.coordmap
+         if coordmap.function_domain.ndim > 0:
+@@ -406,7 +406,7 @@
+     def __eq__(self, other):
+         return (isinstance(other, self.__class__)
+                 and self.shape == other.shape
+-                and np.all(self.get_data() == other.get_data())
++                and np.all(self.get_fdata() == other.get_fdata())
+                 and np.all(self.affine == other.affine)
+                 and (self.axes.coord_names == other.axes.coord_names))
+ 
+@@ -506,7 +506,7 @@
+     Returns
+     -------
+     img_subsampled: Image
+-         An Image with data img.get_data()[slice_object] and an appropriately
++         An Image with data img.get_fdata()[slice_object] and an appropriately
+          corrected CoordinateMap.
+ 
+     Examples
+@@ -516,7 +516,7 @@
+     >>> from nipy.core.api import subsample, slice_maker
+     >>> im = load_image(funcfile)
+     >>> frame3 = subsample(im, slice_maker[:,:,:,3])
+-    >>> np.allclose(frame3.get_data(), im.get_data()[:,:,:,3])
++    >>> np.allclose(frame3.get_fdata(), im.get_fdata()[:,:,:,3])
+     True
+     """
+     warnings.warn('subsample is deprecated, please use image '
+@@ -783,7 +783,7 @@
+     rimg = rollimg(img, axis)
+     for i in range(rimg.shape[0]):
+         if asarray:
+-            yield rimg[i].get_data()
++            yield rimg[i].get_fdata()
+         else:
+             yield rimg[i]
+ 
+@@ -864,7 +864,7 @@
+     This allows us to test for something that is duck-typing an image.
+ 
+     For now an array must have a 'coordmap' attribute, and a callable
+-    'get_data' attribute.
++    'get_fdata' attribute.
+ 
+     Parameters
+     ----------
+@@ -890,4 +890,4 @@
+     '''
+     if not hasattr(obj, 'coordmap') or not hasattr(obj, 'metadata'):
+         return False
+-    return callable(getattr(obj, 'get_data'))
++    return callable(getattr(obj, 'get_fdata'))
+--- nipy.orig/examples/compute_fmri_contrast.py
++++ nipy/examples/compute_fmri_contrast.py
+@@ -78,9 +78,9 @@
+ 
+ # Show Z-map image
+ mean_map = multi_session_model.means[0]
+-plot_map(z_map.get_data(),
++plot_map(z_map.get_fdata(),
+          z_map.get_affine(),
+-         anat=mean_map.get_data(),
++         anat=mean_map.get_fdata(),
+          anat_affine=mean_map.get_affine(),
+          cmap=cm.cold_hot,
+          threshold=2.5,
+--- nipy.orig/examples/core/parcel_generator.py
++++ nipy/examples/core/parcel_generator.py
+@@ -21,7 +21,7 @@
+ BG_IMAGE_FNAME = pjoin(DATA_PATH, 'mni_basal_ganglia.nii.gz')
+ 
+ bg_img = nipy.load_image(BG_IMAGE_FNAME)
+-bg_data = bg_img.get_data()
++bg_data = bg_img.get_fdata()
+ 
+ """
+ I happen to know that the image has these codes:
+--- nipy.orig/examples/data/README_mni_basal_ganglia.rst
++++ nipy/examples/data/README_mni_basal_ganglia.rst
+@@ -21,7 +21,7 @@
+                         'AtlasGrey.mnc')
+     atlas_img = nib.load(atlas_fname)
+     # Data is in fact uint8, but with trivial float scaling
+-    data = atlas_img.get_data().astype(np.uint8)
++    data = atlas_img.get_fdata().astype(np.uint8)
+     bg_data = np.zeros_like(data)
+     for code in (14, 16, 39, 53): # LR striatum, LR caudate
+         in_mask = data == code
+--- nipy.orig/examples/ds105/ds105_example.py
++++ nipy/examples/ds105/ds105_example.py
+@@ -195,7 +195,7 @@
+     # time as the first dimension, i.e. fmri[t] gives the t-th volume.
+     fmri_im = futil.get_fmri(path_info) # an Image
+     fmri_im = rollimg(fmri_im, 't')
+-    fmri = fmri_im.get_data() # now, it's an ndarray
++    fmri = fmri_im.get_fdata() # now, it's an ndarray
+ 
+     nvol, volshape = fmri.shape[0], fmri.shape[1:]
+     nx, sliceshape = volshape[0], volshape[1:]
+@@ -310,8 +310,8 @@
+         fixed_effect = 0
+         fixed_var = 0
+         for effect, sd in results[con]:
+-            effect = load_image(effect).get_data()
+-            sd = load_image(sd).get_data()
++            effect = load_image(effect).get_fdata()
++            sd = load_image(sd).get_fdata()
+             var = sd ** 2
+ 
+             # The optimal, in terms of minimum variance, combination of the
+@@ -362,8 +362,8 @@
+     for s in subj_con_dirs:
+         sd_img = load_image(pjoin(s, "sd.nii"))
+         effect_img = load_image(pjoin(s, "effect.nii"))
+-        sds.append(sd_img.get_data())
+-        Ys.append(effect_img.get_data())
++        sds.append(sd_img.get_fdata())
++        Ys.append(effect_img.get_fdata())
+     sd = array(sds)
+     Y = array(Ys)
+ 
+@@ -424,7 +424,7 @@
+          vector of signs
+     """
+     if api.is_image(mask):
+-        maska = mask.get_data()
++        maska = mask.get_fdata()
+     else:
+         maska = np.asarray(mask)
+     maska = maska.astype(np.bool)
+@@ -438,8 +438,8 @@
+     for s in subj_con_dirs:
+         sd_img = load_image(pjoin(s, "sd.nii"))
+         effect_img = load_image(pjoin(s, "effect.nii"))
+-        sds.append(sd_img.get_data()[maska])
+-        Ys.append(effect_img.get_data()[maska])
++        sds.append(sd_img.get_fdata()[maska])
++        Ys.append(effect_img.get_fdata()[maska])
+     sd = np.array(sds)
+     Y = np.array(Ys)
+ 
+--- nipy.orig/examples/ds105/ds105_util.py
++++ nipy/examples/ds105/ds105_util.py
+@@ -256,7 +256,7 @@
+     # Get information for this subject and run
+     path_dict = path_info_run(subj, run)
+     # Get mask
+-    msk = load_image(mask_fname).get_data().copy().astype(bool)
++    msk = load_image(mask_fname).get_fdata().copy().astype(bool)
+     # Get results directories for this run
+     rootdir = path_dict['rootdir']
+     res_dir = pjoin(rootdir, 'results_run%03d' % run)
+@@ -271,8 +271,8 @@
+                 if not exists(other_fname):
+                     print(this_fname, 'present but ', other_fname, 'missing')
+                     continue
+-                this_arr = load_image(this_fname).get_data()
+-                other_arr = load_image(other_fname).get_data()
++                this_arr = load_image(this_fname).get_fdata()
++                other_arr = load_image(other_fname).get_fdata()
+                 ok = np.allclose(this_arr[msk], other_arr[msk])
+                 if not ok and froot in ('effect', 'sd', 't'): # Maybe a sign flip
+                     ok = np.allclose(this_arr[msk], -other_arr[msk])
+--- nipy.orig/examples/fiac/fiac_example.py
++++ nipy/examples/fiac/fiac_example.py
+@@ -183,7 +183,7 @@
+     # time as the first dimension, i.e. fmri[t] gives the t-th volume.
+     fmri_im = futil.get_fmri(path_info) # an Image
+     fmri_im = rollimg(fmri_im, 't')
+-    fmri = fmri_im.get_data() # now, it's an ndarray
++    fmri = fmri_im.get_fdata() # now, it's an ndarray
+ 
+     nvol, volshape = fmri.shape[0], fmri.shape[1:]
+     nx, sliceshape = volshape[0], volshape[1:]
+@@ -298,8 +298,8 @@
+         fixed_effect = 0
+         fixed_var = 0
+         for effect, sd in results[con]:
+-            effect = load_image(effect).get_data()
+-            sd = load_image(sd).get_data()
++            effect = load_image(effect).get_fdata()
++            sd = load_image(sd).get_fdata()
+             var = sd ** 2
+ 
+             # The optimal, in terms of minimum variance, combination of the
+@@ -350,8 +350,8 @@
+     for s in subj_con_dirs:
+         sd_img = load_image(pjoin(s, "sd.nii"))
+         effect_img = load_image(pjoin(s, "effect.nii"))
+-        sds.append(sd_img.get_data())
+-        Ys.append(effect_img.get_data())
++        sds.append(sd_img.get_fdata())
++        Ys.append(effect_img.get_fdata())
+     sd = array(sds)
+     Y = array(Ys)
+ 
+@@ -412,7 +412,7 @@
+          vector of signs
+     """
+     if api.is_image(mask):
+-        maska = mask.get_data()
++        maska = mask.get_fdata()
+     else:
+         maska = np.asarray(mask)
+     maska = maska.astype(np.bool)
+@@ -426,8 +426,8 @@
+     for s in subj_con_dirs:
+         sd_img = load_image(pjoin(s, "sd.nii"))
+         effect_img = load_image(pjoin(s, "effect.nii"))
+-        sds.append(sd_img.get_data()[maska])
+-        Ys.append(effect_img.get_data()[maska])
++        sds.append(sd_img.get_fdata()[maska])
++        Ys.append(effect_img.get_fdata()[maska])
+     sd = np.array(sds)
+     Y = np.array(Ys)
+ 
+--- nipy.orig/examples/fiac/fiac_util.py
++++ nipy/examples/fiac/fiac_util.py
+@@ -378,7 +378,7 @@
+     # Get information for this subject and run
+     path_dict = path_info_run(subj, run)
+     # Get mask
+-    msk = load_image(mask_fname).get_data().copy().astype(bool)
++    msk = load_image(mask_fname).get_fdata().copy().astype(bool)
+     # Get results directories for this run
+     rootdir = path_dict['rootdir']
+     res_dir = pjoin(rootdir, 'results_%02d' % run)
+@@ -393,8 +393,8 @@
+                 if not exists(other_fname):
+                     print(this_fname, 'present but ', other_fname, 'missing')
+                     continue
+-                this_arr = load_image(this_fname).get_data()
+-                other_arr = load_image(other_fname).get_data()
++                this_arr = load_image(this_fname).get_fdata()
++                other_arr = load_image(other_fname).get_fdata()
+                 ok = np.allclose(this_arr[msk], other_arr[msk])
+                 if not ok and froot in ('effect', 'sd', 't'): # Maybe a sign flip
+                     ok = np.allclose(this_arr[msk], -other_arr[msk])
+--- nipy.orig/examples/image_from_array.py
++++ nipy/examples/image_from_array.py
+@@ -16,7 +16,7 @@
+ #
+ # Use one of our test files to get an array and affine (as numpy array) from.
+ img = load_image(anatfile)
+-arr = img.get_data()
++arr = img.get_fdata()
+ affine_array = img.coordmap.affine.copy()
+ 
+ # 1) Create a CoordinateMap from the affine transform which specifies
+@@ -33,5 +33,5 @@
+ 
+ # Reload and verify the data and affine were saved correctly.
+ img_back = load_image('an_image.nii.gz')
+-assert np.allclose(img_back.get_data(), img.get_data())
++assert np.allclose(img_back.get_fdata(), img.get_fdata())
+ assert np.allclose(img_back.coordmap.affine, img.coordmap.affine)
+--- nipy.orig/examples/interfaces/process_ds105.py
++++ nipy/examples/interfaces/process_ds105.py
+@@ -66,7 +66,7 @@
+     return sorted(preferred)
+ 
+ 
+-def get_data(data_path, subj_id):
++def get_fdata(data_path, subj_id):
+     data_path = abspath(data_path)
+     data_def = {}
+     subject_path = pjoin(data_path, 'sub%03d' % subj_id)
+@@ -305,7 +305,7 @@
+ def get_subjects(data_path, subj_ids, study_def, ana_def):
+     ddefs = []
+     for subj_id in subj_ids:
+-        ddefs.append(get_data(data_path, subj_id))
++        ddefs.append(get_fdata(data_path, subj_id))
+     return ddefs
+ 
+ 
+@@ -319,7 +319,7 @@
+     else:
+         subj_ids = range(1, 7)
+     for subj_id in subj_ids:
+-        ddef = get_data(data_path, subj_id)
++        ddef = get_fdata(data_path, subj_id)
+         assert len(ddef['functionals']) in (11, 12)
+         process_subject(ddef, STUDY_DEF, {})
+ 
+--- nipy.orig/examples/interfaces/process_fiac.py
++++ nipy/examples/interfaces/process_fiac.py
+@@ -12,7 +12,7 @@
+                                  fltcols)
+ 
+ 
+-def get_data(data_path, subj_id):
++def get_fdata(data_path, subj_id):
+     data_def = {}
+     subject_path = pjoin(data_path, 'fiac%s' % subj_id)
+     data_def['functionals'] = sorted(
+@@ -189,7 +189,7 @@
+ 
+ def process_subjects(data_path, subj_ids):
+     for subj_id in subj_ids:
+-        ddef = get_data(data_path, subj_id)
++        ddef = get_fdata(data_path, subj_id)
+         process_subject(ddef)
+ 
+ 
+--- nipy.orig/examples/labs/example_glm.py
++++ nipy/examples/labs/example_glm.py
+@@ -89,7 +89,7 @@
+ ########################################
+ 
+ # GLM fit
+-Y = fmri_data.get_data().reshape(np.prod(shape), n_scans)
++Y = fmri_data.get_fdata().reshape(np.prod(shape), n_scans)
+ glm = GeneralLinearModel(X)
+ glm.fit(Y.T)
+ 
+--- nipy.orig/examples/labs/need_data/demo_blob_from_image.py
++++ nipy/examples/labs/need_data/demo_blob_from_image.py
+@@ -40,10 +40,10 @@
+ 
+ # prepare the data
+ nim = load(input_image)
+-mask_image = Nifti1Image((nim.get_data() ** 2 > 0).astype('u8'),
++mask_image = Nifti1Image((nim.get_fdata() ** 2 > 0).astype('u8'),
+                          nim.get_affine())
+ domain = grid_domain_from_image(mask_image)
+-data = nim.get_data()
++data = nim.get_fdata()
+ values = data[data != 0]
+ 
+ # compute the  nested roi object
+--- nipy.orig/examples/labs/need_data/demo_roi.py
++++ nipy/examples/labs/need_data/demo_roi.py
+@@ -68,7 +68,7 @@
+ nim = load(input_image)
+ affine = nim.get_affine()
+ shape = nim.shape
+-data = nim.get_data()
++data = nim.get_fdata()
+ values = data[data != 0]
+ 
+ # compute the nested roi object
+@@ -97,7 +97,7 @@
+ 
+ # --- 2.d make a set of ROIs from all the blobs
+ roi = mroi.subdomain_from_image(blobPath)
+-data = load(input_image).get_data().ravel()
++data = load(input_image).get_fdata().ravel()
+ feature_activ = [data[roi.select_id(id, roi=False)] for id in roi.get_id()]
+ roi.set_feature('activ', feature_activ)
+ roi.plot_feature('activ')
+--- nipy.orig/examples/labs/need_data/demo_ward_clustering.py
++++ nipy/examples/labs/need_data/demo_ward_clustering.py
+@@ -32,10 +32,10 @@
+     mkdir(write_dir)
+ 
+ # read the data
+-mask = load(mask_image).get_data() > 0
++mask = load(mask_image).get_fdata() > 0
+ ijk = np.array(np.where(mask)).T
+ nvox = ijk.shape[0]
+-data = load(input_image).get_data()[mask]
++data = load(input_image).get_fdata()[mask]
+ image_field = Field(nvox)
+ image_field.from_3d_grid(ijk, k=6)
+ image_field.set_field(data)
+--- nipy.orig/examples/labs/need_data/example_roi_and_glm.py
++++ nipy/examples/labs/need_data/example_roi_and_glm.py
+@@ -51,7 +51,7 @@
+     get_second_level_dataset()
+ 
+ mask = load(mask_path)
+-mask_array, affine = mask.get_data() > 0, mask.get_affine()
++mask_array, affine = mask.get_fdata() > 0, mask.get_affine()
+ 
+ # timing
+ n_scans = 128
+@@ -98,7 +98,7 @@
+ # Get the FMRI data
+ #######################################
+ fmri_data = surrogate_4d_dataset(mask=mask, dmtx=X)[0]
+-Y = fmri_data.get_data()[mask_array]
++Y = fmri_data.get_fdata()[mask_array]
+ 
+ # artificially added signal in ROIs to make the example more meaningful
+ activation = 30 * (X.T[1] + .5 * X.T[0])
+--- nipy.orig/examples/labs/need_data/first_level_fiac.py
++++ nipy/examples/labs/need_data/first_level_fiac.py
+@@ -94,10 +94,10 @@
+ 
+     # make a snapshot of the contrast activation
+     if contrast_id == 'Effects_of_interest':
+-        vmax = max(- z_map.get_data().min(), z_map.get_data().max())
++        vmax = max(- z_map.get_fdata().min(), z_map.get_fdata().max())
+         vmin = - vmax
+-        plot_map(z_map.get_data(), z_map.get_affine(),
+-                 anat=mean_map.get_data(), anat_affine=mean_map.get_affine(),
++        plot_map(z_map.get_fdata(), z_map.get_affine(),
++                 anat=mean_map.get_fdata(), anat_affine=mean_map.get_affine(),
+                  cmap=cm.cold_hot,
+                  vmin=vmin,
+                  vmax=vmax,
+--- nipy.orig/examples/labs/need_data/glm_beta_and_variance.py
++++ nipy/examples/labs/need_data/glm_beta_and_variance.py
+@@ -102,7 +102,7 @@
+ ########################################
+ beta_hat = fmri_glm.glms[0].get_beta()  # Least-squares estimates of the beta
+ variance_hat = fmri_glm.glms[0].get_mse() # Estimates of the variance
+-mask = fmri_glm.mask.get_data() > 0
++mask = fmri_glm.mask.get_fdata() > 0
+ 
+ # output beta images
+ beta_map = np.tile(mask.astype(np.float)[..., np.newaxis], dim)
+--- nipy.orig/examples/labs/need_data/histogram_fits.py
++++ nipy/examples/labs/need_data/histogram_fits.py
+@@ -44,11 +44,11 @@
+ 
+ # Read the mask
+ nim = load(mask_image)
+-mask = nim.get_data()
++mask = nim.get_fdata()
+ 
+ # read the functional image
+ rbeta = load(input_image)
+-beta = rbeta.get_data()
++beta = rbeta.get_fdata()
+ beta = beta[mask > 0]
+ 
+ mf = plt.figure(figsize=(13, 5))
+--- nipy.orig/examples/labs/need_data/localizer_glm_ar.py
++++ nipy/examples/labs/need_data/localizer_glm_ar.py
+@@ -142,10 +142,10 @@
+     save(z_map, image_path)
+ 
+     # Create snapshots of the contrasts
+-    vmax = max(- z_map.get_data().min(), z_map.get_data().max())
++    vmax = max(- z_map.get_fdata().min(), z_map.get_fdata().max())
+     if index > 0:
+         plt.clf()
+-    plot_map(z_map.get_data(), z_map.get_affine(),
++    plot_map(z_map.get_fdata(), z_map.get_affine(),
+              cmap=cm.cold_hot,
+              vmin=- vmax,
+              vmax=vmax,
+--- nipy.orig/examples/labs/need_data/one_sample_t_test.py
++++ nipy/examples/labs/need_data/one_sample_t_test.py
+@@ -76,9 +76,9 @@
+ save(z_map, path.join(write_dir, 'one_sample_z_map.nii'))
+ 
+ # look at the result
+-vmax = max(- z_map.get_data().min(), z_map.get_data().max())
++vmax = max(- z_map.get_fdata().min(), z_map.get_fdata().max())
+ vmin = - vmax
+-plot_map(z_map.get_data(), z_map.get_affine(),
++plot_map(z_map.get_fdata(), z_map.get_affine(),
+          cmap=cm.cold_hot,
+          vmin=vmin,
+          vmax=vmax,
+--- nipy.orig/examples/labs/need_data/tmin_statistic.py
++++ nipy/examples/labs/need_data/tmin_statistic.py
+@@ -119,8 +119,8 @@
+ # these dimensions correspond to 'left' and 'right'
+ 
+ # Create snapshots of the contrasts
+-vmax = max(- z_map.get_data().min(), z_map.get_data().max())
+-plot_map(z_map.get_data(), fmri_glm.affine,
++vmax = max(- z_map.get_fdata().min(), z_map.get_fdata().max())
++plot_map(z_map.get_fdata(), fmri_glm.affine,
+          cmap=cm.cold_hot,
+          vmin=- vmax,
+          vmax=vmax,
+--- nipy.orig/examples/labs/need_data/viz.py
++++ nipy/examples/labs/need_data/viz.py
+@@ -31,7 +31,7 @@
+ 
+ # First example, with a anatomical template
+ img = load(os.path.join(data_dir, 'spmT_0029.nii.gz'))
+-data = img.get_data()
++data = img.get_fdata()
+ affine = img.get_affine()
+ 
+ viz.plot_map(data, affine, cut_coords=(-52, 10, 22),
+@@ -42,7 +42,7 @@
+ try:
+     anat_img = load(example_data.get_filename('neurospin', 'sulcal2000',
+                                               'nobias_anubis.nii.gz'))
+-    anat = anat_img.get_data()
++    anat = anat_img.get_fdata()
+     anat_affine = anat_img.get_affine()
+ except OSError as e:
+     # File does not exist: the data package is not installed
+--- nipy.orig/examples/labs/need_data/viz3d.py
++++ nipy/examples/labs/need_data/viz3d.py
+@@ -35,10 +35,10 @@
+     get_second_level_dataset()
+ 
+ brain_map = load(input_image)
+-vmin, vmax = brain_map.get_data().min(), brain_map.get_data().max()
++vmin, vmax = brain_map.get_fdata().min(), brain_map.get_fdata().max()
+ 
+ # make a simple 2D plot
+-plot_map(brain_map.get_data(), brain_map.get_affine(),
++plot_map(brain_map.get_fdata(), brain_map.get_affine(),
+          cmap=cm.cold_hot,
+          vmin=vmin,
+          vmax=vmax,
+@@ -48,7 +48,7 @@
+ 
+ # More plots using 3D
+ if True:  # replace with False to skip this
+-    plot_map(brain_map.get_data(), brain_map.get_affine(),
++    plot_map(brain_map.get_fdata(), brain_map.get_affine(),
+              cmap=cm.cold_hot,
+              vmin=vmin,
+              vmax=vmax,
+@@ -58,7 +58,7 @@
+ 
+     from nipy.labs import viz3d
+     try:
+-        viz3d.plot_map_3d(brain_map.get_data(), brain_map.get_affine(),
++        viz3d.plot_map_3d(brain_map.get_fdata(), brain_map.get_affine(),
+                         cmap=cm.cold_hot,
+                         vmin=vmin,
+                         vmax=vmax,
+--- nipy.orig/examples/tissue_classification.py
++++ nipy/examples/tissue_classification.py
+@@ -76,8 +76,8 @@
+ ngb_size = int(get_argument('ngb_size', 6))
+ 
+ # Perform tissue classification
+-mask = mask_img.get_data() > 0
+-S = BrainT1Segmentation(img.get_data(), mask=mask, model='5k',
++mask = mask_img.get_fdata() > 0
++S = BrainT1Segmentation(img.get_fdata(), mask=mask, model='5k',
+                         niters=niters, beta=beta, ngb_size=ngb_size)
+ 
+ # Save label image
+@@ -95,6 +95,6 @@
+     gold_ppm_img = (args.probc, args.probg, args.probw)
+     for k in range(3):
+         img = load_image(gold_ppm_img[k])
+-        gold_ppm[..., k] = img.get_data()
+-    d = fuzzy_dice(gold_ppm, S.ppm, np.where(mask_img.get_data() > 0))
++        gold_ppm[..., k] = img.get_fdata()
++    d = fuzzy_dice(gold_ppm, S.ppm, np.where(mask_img.get_fdata() > 0))
+     print('Fuzzy Dice indices: %s' % d)
+--- nipy.orig/nipy/algorithms/diagnostics/screens.py
++++ nipy/nipy/algorithms/diagnostics/screens.py
+@@ -62,7 +62,7 @@
+     '''
+     if img4d.ndim != 4:
+         raise ValueError('Expecting a 4d image')
+-    data = img4d.get_data()
++    data = img4d.get_fdata()
+     cmap = img4d.coordmap
+     # Get numerical index for time axis in data array
+     time_axis = input_axis_index(cmap, time_axis)
+--- nipy.orig/nipy/algorithms/diagnostics/tests/test_screen.py
++++ nipy/nipy/algorithms/diagnostics/tests/test_screen.py
+@@ -61,12 +61,12 @@
+                  ['max', 'mean', 'min',
+                   'pca', 'pca_res',
+                   'std', 'ts_res'])
+-    data = img.get_data()
++    data = img.get_fdata()
+     # Check summary images
+-    assert_array_equal(np.max(data, axis=-1), res['max'].get_data())
+-    assert_array_equal(np.mean(data, axis=-1), res['mean'].get_data())
+-    assert_array_equal(np.min(data, axis=-1), res['min'].get_data())
+-    assert_array_equal(np.std(data, axis=-1), res['std'].get_data())
++    assert_array_equal(np.max(data, axis=-1), res['max'].get_fdata())
++    assert_array_equal(np.mean(data, axis=-1), res['mean'].get_fdata())
++    assert_array_equal(np.min(data, axis=-1), res['min'].get_fdata())
++    assert_array_equal(np.std(data, axis=-1), res['std'].get_fdata())
+     pca_res = pca(data, axis=-1, standardize=False, ncomp=10)
+     # On windows, there seems to be some randomness in the PCA output vector
+     # signs; this routine sets the basis vectors to have first value positive,
+@@ -77,11 +77,11 @@
+     # Test that screens accepts and uses time axis
+     data_mean = data.mean(axis=-1)
+     res = screen(img, time_axis='t')
+-    assert_array_equal(data_mean, res['mean'].get_data())
++    assert_array_equal(data_mean, res['mean'].get_fdata())
+     _check_pca(res, pca_res)
+     _check_ts(res, data, 3, 2)
+     res = screen(img, time_axis=-1)
+-    assert_array_equal(data_mean, res['mean'].get_data())
++    assert_array_equal(data_mean, res['mean'].get_fdata())
+     _check_pca(res, pca_res)
+     _check_ts(res, data, 3, 2)
+     t0_img = rollimg(img, 't')
+@@ -89,11 +89,11 @@
+     res = screen(t0_img, time_axis='t')
+     t0_pca_res = pca(t0_data, axis=0, standardize=False, ncomp=10)
+     t0_pca_res = res2pos1(t0_pca_res)
+-    assert_array_equal(data_mean, res['mean'].get_data())
++    assert_array_equal(data_mean, res['mean'].get_fdata())
+     _check_pca(res, t0_pca_res)
+     _check_ts(res, t0_data, 0, 3)
+     res = screen(t0_img, time_axis=0)
+-    assert_array_equal(data_mean, res['mean'].get_data())
++    assert_array_equal(data_mean, res['mean'].get_fdata())
+     _check_pca(res, t0_pca_res)
+     _check_ts(res, t0_data, 0, 3)
+     # Check screens uses slice axis
+@@ -137,8 +137,8 @@
+         # And is the expected analysis
+         # Very oddly on scipy 0.9 32 bit - at least - results differ between
+         # runs, so we need assert_almost_equal
+-        assert_almost_equal(pca_pos(res['pca'].get_data()),
+-                            pca_pos(exp_res['pca'].get_data()))
++        assert_almost_equal(pca_pos(res['pca'].get_fdata()),
++                            pca_pos(exp_res['pca'].get_fdata()))
+         assert_array_equal(res['ts_res']['slice_mean_diff2'],
+                            exp_res['ts_res']['slice_mean_diff2'])
+         # Turn off warnings, also get expected analysis
+--- nipy.orig/nipy/algorithms/diagnostics/tests/test_time_difference.py
++++ nipy/nipy/algorithms/diagnostics/tests/test_time_difference.py
+@@ -74,7 +74,7 @@
+     # Test time and slice axes work as expected
+     fimg = load_image(funcfile)
+     # Put into array
+-    data = fimg.get_data()
++    data = fimg.get_fdata()
+     orig_results = tsd.time_slice_diffs(data)
+     t0_data = np.rollaxis(data, 3)
+     t0_results = tsd.time_slice_diffs(t0_data, 0)
+@@ -97,7 +97,7 @@
+ 
+ def test_against_matlab_results():
+     fimg = load_image(funcfile)
+-    results = tsd.time_slice_diffs(fimg.get_data())
++    results = tsd.time_slice_diffs(fimg.get_fdata())
+     # struct as record only to avoid deprecation warning
+     tsd_results = sio.loadmat(pjoin(TEST_DATA_PATH, 'tsdiff_results.mat'),
+                               struct_as_record=True, squeeze_me=True)
+@@ -123,13 +123,13 @@
+                 'volume_means'):
+         assert_array_equal(arr_res[key], img_res[key])
+     for key in ('slice_diff2_max_vol', 'diff2_mean_vol'):
+-        assert_array_almost_equal(arr_res[key], img_res[key].get_data())
++        assert_array_almost_equal(arr_res[key], img_res[key].get_fdata())
+ 
+ 
+ def test_tsd_image():
+     # Test image version of time slice diff
+     fimg = load_image(funcfile)
+-    data = fimg.get_data()
++    data = fimg.get_fdata()
+     tsda = tsd.time_slice_diffs
+     tsdi = tsd.time_slice_diffs_image
+     arr_results = tsda(data)
+--- nipy.orig/nipy/algorithms/diagnostics/timediff.py
++++ nipy/nipy/algorithms/diagnostics/timediff.py
+@@ -195,7 +195,7 @@
+         raise AxisError('Cannot identify matching input output axes with "%s"'
+                         % slice_axis)
+     vol_coordmap = drop_io_dim(img.coordmap, time_axis)
+-    results = time_slice_diffs(img.get_data(), time_in_ax, slice_in_ax)
++    results = time_slice_diffs(img.get_fdata(), time_in_ax, slice_in_ax)
+     for key in ('slice_diff2_max_vol', 'diff2_mean_vol'):
+         vol = img_class(results[key], vol_coordmap)
+         results[key] = vol
+--- nipy.orig/nipy/algorithms/group/parcel_analysis.py
++++ nipy/nipy/algorithms/group/parcel_analysis.py
+@@ -119,9 +119,9 @@
+         smooth_fn = _smooth_spm
+     else:
+         raise ValueError('Unknown smoothing method')
+-    con = con_img.get_data()
++    con = con_img.get_fdata()
+     if vcon_img is not None:
+-        vcon = con_img.get_data()
++        vcon = con_img.get_fdata()
+     else:
+         vcon = None
+     msk = np.isnan(con)
+@@ -177,7 +177,7 @@
+           and the second, parcel values, i.e., corresponding
+           intensities in the associated parcel image. By default,
+           parcel values are taken as
+-          `np.unique(parcel_img.get_data())` and parcel names are
++          `np.unique(parcel_img.get_fdata())` and parcel names are
+           these values converted to strings.
+         msk_img: nipy-like image, optional
+           Binary mask to restrict analysis. By default, analysis is
+@@ -226,7 +226,7 @@
+         if msk_img is None:
+             self.msk = None
+         else:
+-            self.msk = msk_img.get_data().astype(bool).squeeze()
++            self.msk = msk_img.get_fdata().astype(bool).squeeze()
+         self.res_path = res_path
+ 
+         # design matrix
+@@ -251,7 +251,7 @@
+         # load the parcellation and resample it at the appropriate
+         # resolution
+         self.reference = parcel_img.reference
+-        self.parcel_full_res = parcel_img.get_data().astype('uintp').squeeze()
++        self.parcel_full_res = parcel_img.get_fdata().astype('uintp').squeeze()
+         self.affine_full_res = xyz_affine(parcel_img)
+         parcel_img = make_xyz_image(self.parcel_full_res,
+                                     self.affine_full_res,
+@@ -261,7 +261,7 @@
+                                   reference=(self.con_imgs[0].shape,
+                                              self.affine),
+                                   interp_order=0)
+-        self.parcel = parcel_img_rsp.get_data().astype('uintp').squeeze()
++        self.parcel = parcel_img_rsp.get_fdata().astype('uintp').squeeze()
+         if self.msk is None:
+             self.msk = self.parcel > 0
+ 
+@@ -305,8 +305,8 @@
+                                        'scon' + str(i) + '.nii.gz'))
+                 _save_image(svcon, join(self.res_path,
+                                         'svcon' + str(i) + '.nii.gz'))
+-            cons += [scon.get_data()[self.msk]]
+-            vcons += [svcon.get_data()[self.msk]]
++            cons += [scon.get_fdata()[self.msk]]
++            vcons += [svcon.get_fdata()[self.msk]]
+ 
+         self.cons = np.array(cons)
+         self.vcons = np.array(vcons)
+--- nipy.orig/nipy/algorithms/group/tests/test_parcel_analysis.py
++++ nipy/nipy/algorithms/group/tests/test_parcel_analysis.py
+@@ -61,12 +61,12 @@
+     assert_array_equal(xyz_affine(parcel_mu_img), AFFINE)
+     assert_array_equal(parcel_prob_img.shape, SIZE)
+     assert_array_equal(xyz_affine(parcel_prob_img), AFFINE)
+-    assert parcel_prob_img.get_data().max() <= 1
+-    assert parcel_prob_img.get_data().min() >= 0
+-    outside = parcel_img.get_data() == 0
+-    assert_array_equal(t_map_img.get_data()[outside], 0)
+-    assert_array_equal(parcel_mu_img.get_data()[outside], 0)
+-    assert_array_equal(parcel_prob_img.get_data()[outside], 0)
++    assert parcel_prob_img.get_fdata().max() <= 1
++    assert parcel_prob_img.get_fdata().min() >= 0
++    outside = parcel_img.get_fdata() == 0
++    assert_array_equal(t_map_img.get_fdata()[outside], 0)
++    assert_array_equal(parcel_mu_img.get_fdata()[outside], 0)
++    assert_array_equal(parcel_prob_img.get_fdata()[outside], 0)
+ 
+ 
+ def test_parcel_analysis():
+@@ -93,7 +93,7 @@
+                        design_matrix=X,
+                        cvect=c,
+                        fwhm=0)
+-    t_map = g.t_map().get_data()
++    t_map = g.t_map().get_fdata()
+     m_error = np.abs(np.mean(t_map))
+     v_error = np.abs(np.var(t_map) - (NSUBJ - 5) / float(NSUBJ - 7))
+     print('Errors: %f (mean), %f (var)' % (m_error, v_error))
+@@ -144,8 +144,8 @@
+     assert_array_equal(xyz_affine(parcel_mu_img), AFFINE)
+     assert_array_equal(parcel_prob_img.shape, SIZE)
+     assert_array_equal(xyz_affine(parcel_prob_img), AFFINE)
+-    assert parcel_prob_img.get_data().max() <= 1
+-    assert parcel_prob_img.get_data().min() >= 0
+-    outside = parcel_img.get_data() == 0
+-    assert_array_equal(parcel_mu_img.get_data()[outside], 0)
+-    assert_array_equal(parcel_prob_img.get_data()[outside], 0)
++    assert parcel_prob_img.get_fdata().max() <= 1
++    assert parcel_prob_img.get_fdata().min() >= 0
++    outside = parcel_img.get_fdata() == 0
++    assert_array_equal(parcel_mu_img.get_fdata()[outside], 0)
++    assert_array_equal(parcel_prob_img.get_fdata()[outside], 0)
+--- nipy.orig/nipy/algorithms/interpolation.py
++++ nipy/nipy/algorithms/interpolation.py
+@@ -75,7 +75,7 @@
+         return self._order
+ 
+     def _buildknots(self):
+-        data = np.nan_to_num(self.image.get_data()).astype(np.float64)
++        data = np.nan_to_num(self.image.get_fdata()).astype(np.float64)
+         if self.order > 1:
+             if self.mode in ('nearest', 'grid-constant'):
+                 # See: https://github.com/scipy/scipy/issues/13600
+--- nipy.orig/nipy/algorithms/kernel_smooth.py
++++ nipy/nipy/algorithms/kernel_smooth.py
+@@ -155,7 +155,7 @@
+             nslice = 1
+         else:
+             raise NotImplementedError('expecting either 3 or 4-d image')
+-        in_data = inimage.get_data()
++        in_data = inimage.get_fdata()
+         for _slice in range(nslice):
+             if in_data.ndim == 4:
+                 data = in_data[_slice]
+--- nipy.orig/nipy/algorithms/registration/groupwise_registration.py
++++ nipy/nipy/algorithms/registration/groupwise_registration.py
+@@ -144,19 +144,19 @@
+         if isinstance(data, np.ndarray):
+             self._data = data
+             self._shape = data.shape
+-            self._get_data = None
++            self._get_fdata = None
+             self._init_timing_parameters()
+         else:
+             self._data = None
+             self._shape = None
+-            self._get_data = data
++            self._get_fdata = data
+ 
+     def _load_data(self):
+-        self._data = self._get_data()
++        self._data = self._get_fdata()
+         self._shape = self._data.shape
+         self._init_timing_parameters()
+ 
+-    def get_data(self):
++    def get_fdata(self):
+         if self._data is None:
+             self._load_data()
+         return self._data
+@@ -209,7 +209,7 @@
+         return (t - corr) / self.tr
+ 
+     def free_data(self):
+-        if self._get_data is not None:
++        if self._get_fdata is not None:
+             self._data = None
+ 
+ 
+@@ -267,12 +267,12 @@
+         if time_interp:
+             self.timestamps = im4d.tr * np.arange(self.nscans)
+             self.scanner_time = im4d.scanner_time
+-            self.cbspline = _cspline_transform(im4d.get_data())
++            self.cbspline = _cspline_transform(im4d.get_fdata())
+         else:
+             self.cbspline = np.zeros(self.dims, dtype='double')
+             for t in range(self.dims[3]):
+                 self.cbspline[:, :, :, t] =\
+-                    _cspline_transform(im4d.get_data()[:, :, :, t])
++                    _cspline_transform(im4d.get_fdata()[:, :, :, t])
+ 
+         # The reference scan conventionally defines the head
+         # coordinate system
+@@ -780,7 +780,7 @@
+         # inbetween sessions.
+         for im in images:
+             xyz_img = as_xyz_image(im)
+-            self._runs.append(Image4d(xyz_img.get_data,
++            self._runs.append(Image4d(xyz_img.get_fdata,
+                                       xyz_affine(xyz_img),
+                                       tr,
+                                       slice_times=slice_times,
+--- nipy.orig/nipy/algorithms/registration/histogram_registration.py
++++ nipy/nipy/algorithms/registration/histogram_registration.py
+@@ -93,7 +93,7 @@
+ 
+         # Clamping of the `from` image. The number of bins may be
+         # overriden if unnecessarily large.
+-        data, from_bins_adjusted = clamp(from_img.get_data(), from_bins,
++        data, from_bins_adjusted = clamp(from_img.get_fdata(), from_bins,
+                                          mask=from_mask)
+         if not similarity == 'slr':
+             from_bins = from_bins_adjusted
+@@ -112,10 +112,10 @@
+         if self._smooth < 0:
+             raise ValueError('smoothing kernel cannot have negative scale')
+         elif self._smooth > 0:
+-            data = smooth_image(to_img.get_data(), xyz_affine(to_img),
++            data = smooth_image(to_img.get_fdata(), xyz_affine(to_img),
+                                 self._smooth)
+         else:
+-            data = to_img.get_data()
++            data = to_img.get_fdata()
+         data, to_bins_adjusted = clamp(data, to_bins, mask=to_mask)
+         if not similarity == 'slr':
+             to_bins = to_bins_adjusted
+@@ -173,13 +173,13 @@
+             size = self._from_img.shape
+         # Adjust spacing to match desired field of view size
+         if spacing is not None:
+-            fov_data = self._from_img.get_data()[
++            fov_data = self._from_img.get_fdata()[
+                 self._slicer(corner, size, spacing)]
+         else:
+-            fov_data = self._from_img.get_data()[
++            fov_data = self._from_img.get_fdata()[
+                 self._slicer(corner, size, [1, 1, 1])]
+             spacing = ideal_spacing(fov_data, npoints=npoints)
+-            fov_data = self._from_img.get_data()[
++            fov_data = self._from_img.get_fdata()[
+                 self._slicer(corner, size, spacing)]
+         self._from_data = fov_data
+         self._from_npoints = (fov_data >= 0).sum()
+--- nipy.orig/nipy/algorithms/registration/resample.py
++++ nipy/nipy/algorithms/registration/resample.py
+@@ -101,7 +101,7 @@
+         ref_aff = xyz_affine(reference)
+     if not len(ref_shape) == 3 or not ref_aff.shape == (4, 4):
+         raise ValueError('Input image should be 3D')
+-    data = moving.get_data()
++    data = moving.get_fdata()
+     if dtype is None:
+         dtype = data.dtype
+ 
+--- nipy.orig/nipy/algorithms/registration/tests/test_fmri_realign4d.py
++++ nipy/nipy/algorithms/registration/tests/test_fmri_realign4d.py
+@@ -36,14 +36,14 @@
+ 
+ 
+ def test_scanner_time():
+-    im4d = Image4d(IM.get_data(), IM.affine, tr=3.,
++    im4d = Image4d(IM.get_fdata(), IM.affine, tr=3.,
+                    slice_times=(0, 1, 2))
+     assert_equal(im4d.scanner_time(0, 0), 0.)
+     assert_equal(im4d.scanner_time(0, im4d.tr), 1.)
+ 
+ 
+ def test_slice_info():
+-    im4d = Image4d(IM.get_data(), IM.affine, tr=3.,
++    im4d = Image4d(IM.get_fdata(), IM.affine, tr=3.,
+                    slice_times=(0, 1, 2), slice_info=(2, -1))
+     assert_equal(im4d.slice_axis, 2)
+     assert_equal(im4d.slice_direction, -1)
+@@ -52,9 +52,9 @@
+ def test_slice_timing():
+     affine = np.eye(4)
+     affine[0:3, 0:3] = IM.affine[0:3, 0:3]
+-    im4d = Image4d(IM.get_data(), affine, tr=2., slice_times=0.0)
++    im4d = Image4d(IM.get_fdata(), affine, tr=2., slice_times=0.0)
+     x = resample4d(im4d, [Rigid() for i in range(IM.shape[3])])
+-    assert_array_almost_equal(im4d.get_data(), x)
++    assert_array_almost_equal(im4d.get_fdata(), x)
+ 
+ 
+ def test_realign4d_no_time_interp():
+@@ -158,7 +158,7 @@
+     aff = xyz_affine(IM)
+     aff2 = aff.copy()
+     aff2[0:3, 3] += 5
+-    im2 = make_xyz_image(IM.get_data(), aff2, 'scanner')
++    im2 = make_xyz_image(IM.get_fdata(), aff2, 'scanner')
+     runs = [IM, im2]
+     R = SpaceTimeRealign(runs, tr=2., slice_times='ascending', slice_info=2)
+     R.estimate(refscan=None, loops=1, between_loops=1, optimizer='steepest')
+--- nipy.orig/nipy/algorithms/registration/tests/test_histogram_registration.py
++++ nipy/nipy/algorithms/registration/tests/test_histogram_registration.py
+@@ -83,7 +83,7 @@
+ 
+ def _test_similarity_measure(simi, val):
+     I = make_xyz_image(make_data_int16(), dummy_affine, 'scanner')
+-    J = make_xyz_image(I.get_data().copy(), dummy_affine, 'scanner')
++    J = make_xyz_image(I.get_fdata().copy(), dummy_affine, 'scanner')
+     R = HistogramRegistration(I, J)
+     R.subsample(spacing=[2, 1, 3])
+     R.similarity = simi
+@@ -150,9 +150,9 @@
+ 
+ def test_joint_hist_eval():
+     I = make_xyz_image(make_data_int16(), dummy_affine, 'scanner')
+-    J = make_xyz_image(I.get_data().copy(), dummy_affine, 'scanner')
++    J = make_xyz_image(I.get_fdata().copy(), dummy_affine, 'scanner')
+     # Obviously the data should be the same
+-    assert_array_equal(I.get_data(), J.get_data())
++    assert_array_equal(I.get_fdata(), J.get_fdata())
+     # Instantiate default thing
+     R = HistogramRegistration(I, J)
+     R.similarity = 'cc'
+@@ -193,14 +193,14 @@
+     """ Test the histogram registration class.
+     """
+     I = make_xyz_image(make_data_int16(), dummy_affine, 'scanner')
+-    J = make_xyz_image(I.get_data().copy(), dummy_affine, 'scanner')
++    J = make_xyz_image(I.get_fdata().copy(), dummy_affine, 'scanner')
+     R = HistogramRegistration(I, J)
+     assert_raises(ValueError, R.subsample, spacing=[0, 1, 3])
+ 
+ 
+ def test_set_fov():
+     I = make_xyz_image(make_data_int16(), dummy_affine, 'scanner')
+-    J = make_xyz_image(I.get_data().copy(), dummy_affine, 'scanner')
++    J = make_xyz_image(I.get_fdata().copy(), dummy_affine, 'scanner')
+     R = HistogramRegistration(I, J)
+     R.set_fov(npoints=np.prod(I.shape))
+     assert_equal(R._from_data.shape, I.shape)
+@@ -224,9 +224,9 @@
+     mask[10:20, 10:20, 10:20] = True
+     R = HistogramRegistration(I, J, to_mask=mask, from_mask=mask)
+     sim1 = R.eval(Affine())
+-    I = make_xyz_image(I.get_data()[mask].reshape(10, 10, 10),
++    I = make_xyz_image(I.get_fdata()[mask].reshape(10, 10, 10),
+                        dummy_affine, 'scanner')
+-    J = make_xyz_image(J.get_data()[mask].reshape(10, 10, 10),
++    J = make_xyz_image(J.get_fdata()[mask].reshape(10, 10, 10),
+                        dummy_affine, 'scanner')
+     R = HistogramRegistration(I, J)
+     sim2 = R.eval(Affine())
+--- nipy.orig/nipy/algorithms/registration/tests/test_resample.py
++++ nipy/nipy/algorithms/registration/tests/test_resample.py
+@@ -58,11 +58,11 @@
+     img = Image(arr, vox2mni(np.eye(4)))
+     for i in interp_orders:
+         img2 = resample(img, T, interp_order=i)
+-        assert_array_almost_equal(img2.get_data(), img.get_data())
++        assert_array_almost_equal(img2.get_fdata(), img.get_fdata())
+         img_aff = as_xyz_image(img)
+         img2 = resample(img, T, reference=(img_aff.shape, xyz_affine(img_aff)),
+                         interp_order=i)
+-        assert_array_almost_equal(img2.get_data(), img.get_data())
++        assert_array_almost_equal(img2.get_fdata(), img.get_fdata())
+ 
+ 
+ def test_resample_dtypes():
+@@ -85,8 +85,8 @@
+     aff_obj = Affine((.5, .5, .5, .1, .1, .1, 0, 0, 0, 0, 0, 0))
+     for transform in aff_obj, ApplyAffine(aff_obj.as_affine()):
+         img2 = resample(img, transform)
+-        assert(np.min(img2.get_data()) >= 0)
+-        assert(np.max(img2.get_data()) < 255)
++        assert(np.min(img2.get_fdata()) >= 0)
++        assert(np.max(img2.get_fdata()) < 255)
+ 
+ 
+ def test_resample_outvalue():
+@@ -98,7 +98,7 @@
+         for order in (1, 3):
+             # Default interpolation outside is constant == 0
+             img2 = resample(img, transform, interp_order=order)
+-            arr2 = img2.get_data()
++            arr2 = img2.get_fdata()
+             exp_arr = np.zeros_like(arr)
+             exp_arr[:-1,:,:] = arr[1:,:,:]
+             assert_array_equal(arr2, exp_arr)
+@@ -107,14 +107,14 @@
+                             mode='constant', cval=0.)
+             exp_arr = np.zeros(arr.shape)
+             exp_arr[:-1, :, :] = arr[1:, :, :]
+-            assert_array_almost_equal(img2.get_data(), exp_arr)
++            assert_array_almost_equal(img2.get_fdata(), exp_arr)
+             # Test constant value of 1
+             img2 = resample(img, transform, interp_order=order,
+                             mode='constant', cval=1.)
+             exp_arr[-1, :, :] = 1
+-            assert_array_almost_equal(img2.get_data(), exp_arr)
++            assert_array_almost_equal(img2.get_fdata(), exp_arr)
+             # Test nearest neighbor
+             img2 = resample(img, transform, interp_order=order,
+                             mode='nearest')
+             exp_arr[-1, :, :] = arr[-1, :, :]
+-            assert_array_almost_equal(img2.get_data(), exp_arr)
++            assert_array_almost_equal(img2.get_fdata(), exp_arr)
+--- nipy.orig/nipy/algorithms/registration/tests/test_slice_timing.py
++++ nipy/nipy/algorithms/registration/tests/test_slice_timing.py
+@@ -74,7 +74,7 @@
+         stc.estimate(refscan=None, loops=1, between_loops=1, optimizer='steepest')
+         # Check no motion estimated
+         assert_array_equal([t.param for t in stc._transforms[0]], 0)
+-        corrected = stc.resample()[0].get_data()
++        corrected = stc.resample()[0].get_fdata()
+         # check we approximate first time slice with correction
+         assert_false(np.allclose(acquired_signal, corrected, rtol=1e-3,
+                                  atol=0.1))
+--- nipy.orig/nipy/algorithms/resample.py
++++ nipy/nipy/algorithms/resample.py
+@@ -135,7 +135,7 @@
+         TV2IV = compose(image.coordmap.inverse(), TV2IW)
+         if isinstance(TV2IV, AffineTransform): # still affine
+             A, b = to_matvec(TV2IV.affine)
+-            idata = affine_transform(image.get_data(), A,
++            idata = affine_transform(image.get_fdata(), A,
+                                      offset=b,
+                                      output_shape=shape,
+                                      order=order,
+--- nipy.orig/nipy/algorithms/segmentation/tests/test_segmentation.py
++++ nipy/nipy/algorithms/segmentation/tests/test_segmentation.py
+@@ -16,7 +16,7 @@
+ from ....testing import anatfile
+ 
+ anat_img = load_image(anatfile)
+-anat_mask = anat_img.get_data() > 0
++anat_mask = anat_img.get_fdata() > 0
+ 
+ DIMS = (30, 30, 20)
+ 
+@@ -30,7 +30,7 @@
+ 
+ def _test_brain_seg(model, niters=3, beta=0, ngb_size=6, init_params=None,
+                     convert=True):
+-    S = BrainT1Segmentation(anat_img.get_data(), mask=anat_mask,
++    S = BrainT1Segmentation(anat_img.get_fdata(), mask=anat_mask,
+                             model=model, niters=niters, beta=beta,
+                             ngb_size=ngb_size, init_params=init_params,
+                             convert=convert)
+--- nipy.orig/nipy/algorithms/tests/test_kernel_smooth.py
++++ nipy/nipy/algorithms/tests/test_kernel_smooth.py
+@@ -23,7 +23,7 @@
+     sanat = smoother.smooth(anat)
+     assert_equal(sanat.shape, anat.shape)
+     assert_equal(sanat.coordmap, anat.coordmap)
+-    assert_false(np.allclose(sanat.get_data(), anat.get_data()))
++    assert_false(np.allclose(sanat.get_fdata(), anat.get_fdata()))
+ 
+ 
+ def test_funny_coordmap():
+@@ -36,23 +36,23 @@
+     cmap_rot = AffineTransform(cmap.function_range,
+                                cmap.function_range,
+                                aff)
+-    func_rot = Image(func.get_data(), compose(cmap_rot, cmap))
++    func_rot = Image(func.get_fdata(), compose(cmap_rot, cmap))
+     func1 = func_rot[...,1] # 5x4 affine
+     smoother = LinearFilter(func1.coordmap, func1.shape)
+     sfunc1 = smoother.smooth(func1) # OK
+     # And same as for 4x4 affine
+     cmap3d = drop_io_dim(cmap, 't')
+-    func3d = Image(func1.get_data(), cmap3d)
++    func3d = Image(func1.get_fdata(), cmap3d)
+     smoother = LinearFilter(func3d.coordmap, func3d.shape)
+     sfunc3d = smoother.smooth(func3d)
+     assert_equal(sfunc1.shape, sfunc3d.shape)
+-    assert_array_almost_equal(sfunc1.get_data(), sfunc3d.get_data())
++    assert_array_almost_equal(sfunc1.get_fdata(), sfunc3d.get_fdata())
+     # And same with no rotation
+     func_fresh = func[...,1] # 5x4 affine, no rotation
+     smoother = LinearFilter(func_fresh.coordmap, func_fresh.shape)
+     sfunc_fresh = smoother.smooth(func_fresh)
+     assert_equal(sfunc1.shape, sfunc_fresh.shape)
+-    assert_array_almost_equal(sfunc1.get_data(), sfunc_fresh.get_data())
++    assert_array_almost_equal(sfunc1.get_fdata(), sfunc_fresh.get_fdata())
+ 
+ 
+ def test_func_smooth():
+@@ -92,7 +92,7 @@
+         kernel = LinearFilter(coordmap, shape,
+                               fwhm=randint(50, 100 + 1) / 10.)
+         # smoothed normalized 3D array
+-        ssignal = kernel.smooth(signal).get_data()
++        ssignal = kernel.smooth(signal).get_fdata()
+         ssignal[:] *= kernel.norms[kernel.normalization]
+         # 3 points * signal.size array
+         I = np.indices(ssignal.shape)
+--- nipy.orig/nipy/algorithms/tests/test_resample.py
++++ nipy/nipy/algorithms/tests/test_resample.py
+@@ -21,7 +21,7 @@
+     fimg = load_image(funcfile)
+     aimg = load_image(anatfile)
+     resimg = resample_img2img(fimg, fimg)
+-    yield assert_true, np.allclose(resimg.get_data(), fimg.get_data())
++    yield assert_true, np.allclose(resimg.get_fdata(), fimg.get_fdata())
+     yield assert_raises, ValueError, resample_img2img, fimg, aimg
+ 
+ 
+@@ -34,12 +34,12 @@
+     g2 = AffineTransform.from_params('ij', 'xy', np.diag([0.5,0.7,1]))
+     i = Image(np.ones((100,100)), g)
+     # This sets the image data by writing into the array
+-    i.get_data()[50:55,40:55] = 3.
++    i.get_fdata()[50:55,40:55] = 3.
+     a = np.array([[0,1,0],
+                   [1,0,0],
+                   [0,0,1]], np.float)
+     ir = resample(i, g2, a, (100, 100))
+-    assert_array_almost_equal(ir.get_data().T, i.get_data())
++    assert_array_almost_equal(ir.get_fdata().T, i.get_fdata())
+ 
+ 
+ def test_rotate2d2():
+@@ -49,12 +49,12 @@
+     g2 = AffineTransform.from_params('ij', 'xy', np.diag([0.5,0.7,1]))
+     i = Image(np.ones((100,80)), g)
+     # This sets the image data by writing into the array
+-    i.get_data()[50:55,40:55] = 3.
++    i.get_fdata()[50:55,40:55] = 3.
+     a = np.array([[0,1,0],
+                   [1,0,0],
+                   [0,0,1]], np.float)
+     ir = resample(i, g2, a, (80,100))
+-    assert_array_almost_equal(ir.get_data().T, i.get_data())
++    assert_array_almost_equal(ir.get_fdata().T, i.get_fdata())
+ 
+ 
+ def test_rotate2d3():
+@@ -71,13 +71,13 @@
+     g = AffineTransform.from_params('xy', 'ij', np.diag([0.5,0.7,1]))
+     i = Image(np.ones((100,80)), g)
+     # This sets the image data by writing into the array
+-    i.get_data()[50:55,40:55] = 3.
++    i.get_fdata()[50:55,40:55] = 3.
+     a = np.identity(3)
+     g2 = AffineTransform.from_params('xy', 'ij', np.array([[0,0.5,0],
+                                                   [0.7,0,0],
+                                                   [0,0,1]]))
+     ir = resample(i, g2, a, (80,100))
+-    assert_array_almost_equal(ir.get_data().T, i.get_data())
++    assert_array_almost_equal(ir.get_fdata().T, i.get_fdata())
+ 
+ 
+ def test_rotate3d():
+@@ -86,20 +86,20 @@
+     g2 = AffineTransform.from_params('ijk', 'xyz', np.diag([0.5,0.7,0.6,1]))
+     shape = (100,90,80)
+     i = Image(np.ones(shape), g)
+-    i.get_data()[50:55,40:55,30:33] = 3.
++    i.get_fdata()[50:55,40:55,30:33] = 3.
+     a = np.array([[1,0,0,0],
+                   [0,0,1,0],
+                   [0,1,0,0],
+                   [0,0,0,1.]])
+     ir = resample(i, g2, a, (100,80,90))
+-    assert_array_almost_equal(np.transpose(ir.get_data(), (0,2,1)),
+-                              i.get_data())
++    assert_array_almost_equal(np.transpose(ir.get_fdata(), (0,2,1)),
++                              i.get_fdata())
+ 
+ 
+ def test_resample2d():
+     g = AffineTransform.from_params('ij', 'xy', np.diag([0.5,0.5,1]))
+     i = Image(np.ones((100,90)), g)
+-    i.get_data()[50:55,40:55] = 3.
++    i.get_fdata()[50:55,40:55] = 3.
+     # This mapping describes a mapping from the "target" physical
+     # coordinates to the "image" physical coordinates.  The 3x3 matrix
+     # below indicates that the "target" physical coordinates are related
+@@ -111,7 +111,7 @@
+     a = np.identity(3)
+     a[:2,-1] = 4.
+     ir = resample(i, i.coordmap, a, (100,90))
+-    assert_array_almost_equal(ir.get_data()[42:47,32:47], 3.)
++    assert_array_almost_equal(ir.get_fdata()[42:47,32:47], 3.)
+ 
+ 
+ def test_resample2d1():
+@@ -119,7 +119,7 @@
+     # an AffineTransform instance
+     g = AffineTransform.from_params('ij', 'xy', np.diag([0.5,0.5,1]))
+     i = Image(np.ones((100,90)), g)
+-    i.get_data()[50:55,40:55] = 3.
++    i.get_fdata()[50:55,40:55] = 3.
+     a = np.identity(3)
+     a[:2,-1] = 4.
+     A = np.identity(2)
+@@ -127,19 +127,19 @@
+     def mapper(x):
+         return np.dot(x, A.T) + b
+     ir = resample(i, i.coordmap, mapper, (100,90))
+-    assert_array_almost_equal(ir.get_data()[42:47,32:47], 3.)
++    assert_array_almost_equal(ir.get_fdata()[42:47,32:47], 3.)
+ 
+ 
+ def test_resample2d2():
+     g = AffineTransform.from_params('ij', 'xy', np.diag([0.5,0.5,1]))
+     i = Image(np.ones((100,90)), g)
+-    i.get_data()[50:55,40:55] = 3.
++    i.get_fdata()[50:55,40:55] = 3.
+     a = np.identity(3)
+     a[:2,-1] = 4.
+     A = np.identity(2)
+     b = np.ones(2)*4
+     ir = resample(i, i.coordmap, (A, b), (100,90))
+-    assert_array_almost_equal(ir.get_data()[42:47,32:47], 3.)
++    assert_array_almost_equal(ir.get_fdata()[42:47,32:47], 3.)
+ 
+ 
+ def test_resample2d3():
+@@ -147,18 +147,18 @@
+     # the transform: here it is an (A,b) pair
+     g = AffineTransform.from_params('ij', 'xy', np.diag([0.5,0.5,1]))
+     i = Image(np.ones((100,90)), g)
+-    i.get_data()[50:55,40:55] = 3.
++    i.get_fdata()[50:55,40:55] = 3.
+     a = np.identity(3)
+     a[:2,-1] = 4.
+     ir = resample(i, i.coordmap, a, (100,90))
+-    assert_array_almost_equal(ir.get_data()[42:47,32:47], 3.)
++    assert_array_almost_equal(ir.get_fdata()[42:47,32:47], 3.)
+ 
+ 
+ def test_resample3d():
+     g = AffineTransform.from_params('ijk', 'xyz', np.diag([0.5,0.5,0.5,1]))
+     shape = (100,90,80)
+     i = Image(np.ones(shape), g)
+-    i.get_data()[50:55,40:55,30:33] = 3.
++    i.get_fdata()[50:55,40:55,30:33] = 3.
+     # This mapping describes a mapping from the "target" physical
+     # coordinates to the "image" physical coordinates.  The 4x4 matrix
+     # below indicates that the "target" physical coordinates are related
+@@ -170,7 +170,7 @@
+     a = np.identity(4)
+     a[:3,-1] = [3,4,5]
+     ir = resample(i, i.coordmap, a, (100,90,80))
+-    assert_array_almost_equal(ir.get_data()[44:49,32:47,20:23], 3.)
++    assert_array_almost_equal(ir.get_fdata()[44:49,32:47,20:23], 3.)
+ 
+ 
+ def test_resample_outvalue():
+@@ -193,25 +193,25 @@
+                         order=order, mode='constant', cval=0.)
+         exp_arr = np.zeros(arr.shape)
+         exp_arr[:-1, :, :] = arr[1:, :, :]
+-        assert_array_almost_equal(img2.get_data(), exp_arr)
++        assert_array_almost_equal(img2.get_fdata(), exp_arr)
+         # Test constant value of 1
+         img2 = resample(img, coordmap, mapping, img.shape,
+                         order=order, mode='constant', cval=1.)
+         exp_arr[-1, :, :] = 1
+-        assert_array_almost_equal(img2.get_data(), exp_arr)
++        assert_array_almost_equal(img2.get_fdata(), exp_arr)
+         # Test nearest neighbor
+         img2 = resample(img, coordmap, mapping, img.shape,
+                         order=order, mode='nearest')
+         exp_arr[-1, :, :] = arr[-1, :, :]
+-        assert_array_almost_equal(img2.get_data(), exp_arr)
++        assert_array_almost_equal(img2.get_fdata(), exp_arr)
+     # Test img2img
+     target_coordmap = vox2mni(aff)
+     target = Image(arr, target_coordmap)
+     img2 = resample_img2img(img, target, 3, 'nearest')
+-    assert_array_almost_equal(img2.get_data(), exp_arr)
++    assert_array_almost_equal(img2.get_fdata(), exp_arr)
+     img2 = resample_img2img(img, target, 3, 'constant', cval=1.)
+     exp_arr[-1, :, :] = 1
+-    assert_array_almost_equal(img2.get_data(), exp_arr)
++    assert_array_almost_equal(img2.get_fdata(), exp_arr)
+ 
+ 
+ def test_nonaffine():
+@@ -225,7 +225,7 @@
+         in_names, out_names, tin_names, tout_names = names
+         g = AffineTransform.from_params(in_names, out_names, np.identity(3))
+         img = Image(np.ones((100,90)), g)
+-        img.get_data()[50:55,40:55] = 3.
++        img.get_fdata()[50:55,40:55] = 3.
+         tcoordmap = AffineTransform.from_start_step(
+             tin_names,
+             tout_names,
+@@ -241,7 +241,7 @@
+         pylab.gca().set_ylim([0,99])
+         pylab.gca().set_xlim([0,89])
+         pylab.figure(num=4)
+-        pylab.plot(ir.get_data())
++        pylab.plot(ir.get_fdata())
+ 
+ 
+ def test_2d_from_3d():
+@@ -252,11 +252,11 @@
+     shape = (100,90,80)
+     g = AffineTransform.from_params('ijk', 'xyz', np.diag([0.5,0.5,0.5,1]))
+     i = Image(np.ones(shape), g)
+-    i.get_data()[50:55,40:55,30:33] = 3.
++    i.get_fdata()[50:55,40:55,30:33] = 3.
+     a = np.identity(4)
+     g2 = ArrayCoordMap.from_shape(g, shape)[10]
+     ir = resample(i, g2.coordmap, a, g2.shape)
+-    assert_array_almost_equal(ir.get_data(), i[10].get_data())
++    assert_array_almost_equal(ir.get_fdata(), i[10].get_fdata())
+ 
+ 
+ def test_slice_from_3d():
+@@ -270,23 +270,23 @@
+                                     'xyz',
+                                     np.diag([0.5,0.5,0.5,1]))
+     img = Image(np.ones(shape), g)
+-    img.get_data()[50:55,40:55,30:33] = 3
++    img.get_fdata()[50:55,40:55,30:33] = 3
+     I = np.identity(4)
+     zsl = slices.zslice(26,
+                         ((0,49.5), 100),
+                         ((0,44.5), 90),
+                         img.reference)
+     ir = resample(img, zsl, I, (100, 90))
+-    assert_array_almost_equal(ir.get_data(), img[:,:,53].get_data())
++    assert_array_almost_equal(ir.get_fdata(), img[:,:,53].get_fdata())
+     ysl = slices.yslice(22,
+                         ((0,49.5), 100),
+                         ((0,39.5), 80),
+                         img.reference)
+     ir = resample(img, ysl, I, (100, 80))
+-    assert_array_almost_equal(ir.get_data(), img[:,45,:].get_data())
++    assert_array_almost_equal(ir.get_fdata(), img[:,45,:].get_fdata())
+     xsl = slices.xslice(15.5,
+                         ((0,44.5), 90),
+                         ((0,39.5), 80),
+                         img.reference)
+     ir = resample(img, xsl, I, (90, 80))
+-    assert_array_almost_equal(ir.get_data(), img[32,:,:].get_data())
++    assert_array_almost_equal(ir.get_fdata(), img[32,:,:].get_fdata())
+--- nipy.orig/nipy/algorithms/utils/pca.py
++++ nipy/nipy/algorithms/utils/pca.py
+@@ -340,9 +340,9 @@
+         if not mask.coordmap.similar_to(drop_io_dim(img.coordmap, axis)):
+             raise ValueError("Mask should have matching coordmap to `img` "
+                              "coordmap with dropped axis %s" % axis)
+-    data = work_img.get_data()
++    data = work_img.get_fdata()
+     if mask is not None:
+-        mask_data = mask.get_data()
++        mask_data = mask.get_fdata()
+     else:
+         mask_data = None
+     # do the PCA
+--- nipy.orig/nipy/algorithms/utils/tests/test_pca.py
++++ nipy/nipy/algorithms/utils/tests/test_pca.py
+@@ -15,7 +15,7 @@
+ 
+ def setup():
+     img = load_image(funcfile)
+-    arr = img.get_data()
++    arr = img.get_fdata()
+     #arr = np.rollaxis(arr, 3)
+     data['nimages'] = arr.shape[3]
+     data['fmridata'] = arr
+--- nipy.orig/nipy/algorithms/utils/tests/test_pca_image.py
++++ nipy/nipy/algorithms/utils/tests/test_pca_image.py
+@@ -29,7 +29,7 @@
+     # Below, I am just making a mask because I already have img, I know I can do
+     # this. In principle, though, the pca function will just take another Image
+     # as a mask
+-    img_data = t0_img.get_data()
++    img_data = t0_img.get_fdata()
+     mask_cmap = drop_io_dim(img.coordmap, 't')
+     first_frame = img_data[0]
+     mask = Image(np.greater(first_frame, 500).astype(np.float64),
+@@ -37,9 +37,9 @@
+     data_dict['fmridata'] = img
+     data_dict['mask'] = mask
+ 
+-    # print data_dict['mask'].shape, np.sum(data_dict['mask'].get_data())
++    # print data_dict['mask'].shape, np.sum(data_dict['mask'].get_fdata())
+     assert_equal(data_dict['mask'].shape, (17, 21, 3))
+-    assert_almost_equal(np.sum(data_dict['mask'].get_data()), 1071.0)
++    assert_almost_equal(np.sum(data_dict['mask'].get_fdata()), 1071.0)
+ 
+ def _rank(p):
+     return p['basis_vectors'].shape[1]
+@@ -208,7 +208,7 @@
+ def test_5d():
+     # What happened to a 5d image? We should get 4d images back
+     img = data_dict['fmridata']
+-    data = img.get_data()
++    data = img.get_fdata()
+     # Make a last input and output axis called 'v'
+     vcs = CS('v')
+     xtra_cmap = AffineTransform(vcs, vcs, np.eye(2))
+@@ -216,7 +216,7 @@
+     data_5d = data.reshape(data.shape + (1,))
+     fived = Image(data_5d, cmap_5d)
+     mask = data_dict['mask']
+-    mask_data = mask.get_data()
++    mask_data = mask.get_fdata()
+     mask_data = mask_data.reshape(mask_data.shape + (1,))
+     cmap_4d = cm_product(mask.coordmap, xtra_cmap)
+     mask4d = Image(mask_data, cmap_4d)
+@@ -244,7 +244,7 @@
+     fived = Image(data_5d, cmap_5d)
+     # Give the mask a 't' dimension, but no group dimension
+     mask = data_dict['mask']
+-    mask_data = mask.get_data()
++    mask_data = mask.get_fdata()
+     mask_data = mask_data.reshape(mask_data.shape + (1,))
+     # We need to replicate the time scaling of the image cmap, hence the 2. in
+     # the affine
+@@ -268,7 +268,7 @@
+     axis = res['axis']
+     bvs = res[bv_key]
+     bps_img = res['basis_projections']
+-    bps = bps_img.get_data()
++    bps = bps_img.get_fdata()
+     signs = np.sign(bvs[0])
+     res[bv_key] = bvs * signs
+     new_axes = [None] * bps.ndim
+@@ -284,7 +284,7 @@
+     ncomp = 5
+     img = data_dict['fmridata']
+     in_coords = list(img.axes.coord_names)
+-    img_data = img.get_data()
++    img_data = img.get_fdata()
+     for axis_no, axis_name in enumerate('ijkt'):
+         p = pca_image(img, axis_name, ncomp=ncomp)
+         n = img.shape[axis_no]
+@@ -299,7 +299,7 @@
+         pos_dp = res2pos1(dp)
+         img_bps = pos_p['basis_projections']
+         assert_almost_equal(pos_dp['basis_vectors'], pos_p[bv_key])
+-        assert_almost_equal(pos_dp['basis_projections'], img_bps.get_data())
++        assert_almost_equal(pos_dp['basis_projections'], img_bps.get_fdata())
+         # And we've replaced the expected axis
+         exp_coords = in_coords[:]
+         exp_coords[exp_coords.index(axis_name)] = 'PCA components'
+--- nipy.orig/nipy/core/image/image_list.py
++++ nipy/nipy/core/image/image_list.py
+@@ -23,7 +23,7 @@
+         images : iterable
+            an iterable object whose items are meant to be images; this is
+            checked by asserting that each has a `coordmap` attribute and a
+-           ``get_data`` method.  Note that Image objects are not iterable by
++           ``get_fdata`` method.  Note that Image objects are not iterable by
+            default; use the ``from_image`` classmethod or ``iter_axis`` function
+            to convert images to image lists - see examples below for the latter.
+ 
+@@ -51,7 +51,7 @@
+         False
+         >>> np.asarray(sublist).shape
+         (3, 17, 21, 3)
+-        >>> newimg.get_data().shape
++        >>> newimg.get_fdata().shape
+         (17, 21, 3)
+         """
+         if images is None:
+@@ -96,7 +96,7 @@
+         for img in iter_axis(image, in_ax):
+             if dropout:
+                 cmap = drop_io_dim(img.coordmap, out_ax_name)
+-                img = Image(img.get_data(), cmap, img.metadata)
++                img = Image(img.get_fdata(), cmap, img.metadata)
+             imlist.append(img)
+         return klass(imlist)
+ 
+@@ -162,7 +162,7 @@
+         v = np.empty(tmp_shape)
+         # first put the data in an array, with list dimension in the first axis
+         for i, im in enumerate(self.list):
+-            v[i] = im.get_data() # get_data method of an image has no axis
++            v[i] = im.get_fdata() # get_fdata method of an image has no axis
+         # then roll (and rock?) the axis to have axis in the right place
+         if axis < 0:
+             axis += out_dim
+--- nipy.orig/nipy/core/image/tests/test_image.py
++++ nipy/nipy/core/image/tests/test_image.py
+@@ -33,16 +33,16 @@
+ 
+ 
+ def test_init():
+-    data = gimg.get_data()
++    data = gimg.get_fdata()
+     new = Image(data, gimg.coordmap)
+-    assert_array_almost_equal(gimg.get_data(), new.get_data())
++    assert_array_almost_equal(gimg.get_fdata(), new.get_fdata())
+     assert_equal(new.coordmap, gimg.coordmap)
+     assert_raises(TypeError, Image)
+     assert_raises(TypeError, Image, data)
+ 
+ 
+ def test_maxmin_values():
+-    y = gimg.get_data()
++    y = gimg.get_fdata()
+     assert_equal(y.shape, tuple(gimg.shape))
+     assert_equal(y.max(), 23)
+     assert_equal(y.min(), 0.0)
+@@ -86,9 +86,9 @@
+     assert_equal(x.shape, newshape)
+ 
+ 
+-def test_get_data():
+-    # get_data always returns an array
+-    x = gimg.get_data()
++def test_get_fdata():
++    # get_fdata always returns an array
++    x = gimg.get_fdata()
+     assert_true(isinstance(x, np.ndarray))
+     assert_equal(x.shape, gimg.shape)
+     assert_equal(x.ndim, gimg.ndim)
+@@ -106,15 +106,15 @@
+         assert_equal(img_slice.shape, (3,2))
+     tmp = np.zeros(gimg.shape)
+     write_data(tmp, enumerate(iter_axis(gimg, 0, asarray=True)))
+-    assert_array_almost_equal(tmp, gimg.get_data())
++    assert_array_almost_equal(tmp, gimg.get_fdata())
+     tmp = np.zeros(gimg.shape)
+     g = iter_axis(gimg, 0, asarray=True)
+     write_data(tmp, enumerate(g))
+-    assert_array_almost_equal(tmp, gimg.get_data())
++    assert_array_almost_equal(tmp, gimg.get_fdata())
+ 
+ 
+ def test_parcels1():
+-    parcelmap = gimg.get_data().astype(np.int32)
++    parcelmap = gimg.get_fdata().astype(np.int32)
+     test = np.zeros(parcelmap.shape)
+     v = 0
+     for i, d in data_generator(test, parcels(parcelmap)):
+@@ -124,7 +124,7 @@
+ 
+ def test_parcels3():
+     rho = gimg[0]
+-    parcelmap = rho.get_data().astype(np.int32)
++    parcelmap = rho.get_fdata().astype(np.int32)
+     labels = np.unique(parcelmap)
+     test = np.zeros(rho.shape)
+     v = 0
+@@ -171,11 +171,11 @@
+     img = image.Image(obj, coordmap)
+     assert_equal(img.ndim, 3)
+     assert_equal(img.shape, (2,3,4))
+-    assert_array_almost_equal(img.get_data(), 1)
++    assert_array_almost_equal(img.get_fdata(), 1)
+     # Test that the array stays with the image, so we can assign the array
+     # in-place, at least in this case
+-    img.get_data()[:] = 4
+-    assert_array_equal(img.get_data(), 4)
++    img.get_fdata()[:] = 4
++    assert_array_equal(img.get_fdata(), 4)
+ 
+ 
+ def test_defaults_ND():
+@@ -217,31 +217,31 @@
+     coordmap = AffineTransform.from_params('xyz', 'ijk', np.eye(4))
+     img = Image(arr, coordmap, metadata={'field': 'value'})
+     img2 = Image.from_image(img)
+-    assert_array_equal(img.get_data(), img2.get_data())
++    assert_array_equal(img.get_fdata(), img2.get_fdata())
+     assert_equal(img.coordmap, img2.coordmap)
+     assert_equal(img.metadata, img2.metadata)
+     assert_false(img.metadata is img2.metadata)
+     # optional inputs - data
+     arr2 = arr + 10
+     new = Image.from_image(img, arr2)
+-    assert_array_almost_equal(arr2, new.get_data())
++    assert_array_almost_equal(arr2, new.get_fdata())
+     assert_equal(new.coordmap, coordmap)
+     new = Image.from_image(img, data=arr2)
+-    assert_array_almost_equal(arr2, new.get_data())
++    assert_array_almost_equal(arr2, new.get_fdata())
+     assert_equal(new.coordmap, coordmap)
+     # optional inputs - coordmap
+     coordmap2 = AffineTransform.from_params('pqr', 'ijk', np.eye(4))
+     new = Image.from_image(img, arr2, coordmap2)
+-    assert_array_almost_equal(arr2, new.get_data())
++    assert_array_almost_equal(arr2, new.get_fdata())
+     assert_equal(new.coordmap, coordmap2)
+     new = Image.from_image(img, coordmap=coordmap2)
+-    assert_array_almost_equal(arr, new.get_data())
++    assert_array_almost_equal(arr, new.get_fdata())
+     assert_equal(new.coordmap, coordmap2)
+     # Optional inputs - metadata
+     assert_equal(new.metadata, img.metadata)
+     another_meta = {'interesting': 'information'}
+     new = Image.from_image(img, arr2, coordmap2, another_meta)
+-    assert_array_almost_equal(arr2, new.get_data())
++    assert_array_almost_equal(arr2, new.get_fdata())
+     assert_equal(another_meta, new.metadata)
+ 
+ 
+@@ -251,7 +251,7 @@
+     im_scrambled = im.reordered_axes('iljk').reordered_reference('xtyz')
+     im_unscrambled = image.synchronized_order(im_scrambled, im)
+     assert_equal(im_unscrambled.coordmap, im.coordmap)
+-    assert_almost_equal(im_unscrambled.get_data(), im.get_data())
++    assert_almost_equal(im_unscrambled.get_fdata(), im.get_fdata())
+     assert_equal(im_unscrambled, im)
+     assert_true(im_unscrambled == im)
+     assert_false(im_unscrambled != im)
+@@ -311,19 +311,19 @@
+     # 'l' can appear both as an axis and a reference coord name
+     im_unamb = Image(data, AffineTransform.from_params('ijkl', 'xyzl', np.diag([1,2,3,4,1])))
+     im_rolled = image.rollaxis(im_unamb, 'l')
+-    assert_almost_equal(im_rolled.get_data(),
+-                        im_unamb.get_data().transpose([3,0,1,2]))
++    assert_almost_equal(im_rolled.get_fdata(),
++                        im_unamb.get_fdata().transpose([3,0,1,2]))
+     for i, o, n in zip('ijkl', 'xyzt', range(4)):
+         im_i = image.rollaxis(im, i)
+         im_o = image.rollaxis(im, o)
+         im_n = image.rollaxis(im, n)
+-        assert_almost_equal(im_i.get_data(), im_o.get_data())
++        assert_almost_equal(im_i.get_fdata(), im_o.get_fdata())
+         assert_almost_equal(im_i.affine, im_o.affine)
+-        assert_almost_equal(im_n.get_data(), im_o.get_data())
++        assert_almost_equal(im_n.get_fdata(), im_o.get_fdata())
+         for _im in [im_n, im_o, im_i]:
+             im_n_inv = image.rollaxis(_im, n, inverse=True)
+             assert_almost_equal(im_n_inv.affine, im.affine)
+-            assert_almost_equal(im_n_inv.get_data(), im.get_data())
++            assert_almost_equal(im_n_inv.get_fdata(), im.get_fdata())
+ 
+ 
+ def test_is_image():
+@@ -336,7 +336,7 @@
+     assert_true(is_image(img))
+     assert_false(is_image(object()))
+     class C(object):
+-        def get_data(self): pass
++        def get_fdata(self): pass
+     c = C()
+     assert_false(is_image(c))
+     c.coordmap = None
+@@ -357,21 +357,21 @@
+     assert_equal(im1.coordmap, rollimg(im, -3).coordmap)
+     assert_equal(im1.coordmap,
+                  AT('jikl', 'xyzt', aff[:, (1, 0, 2, 3, 4)]))
+-    assert_array_equal(im1.get_data(), np.rollaxis(data, 1))
++    assert_array_equal(im1.get_fdata(), np.rollaxis(data, 1))
+     im2 = rollimg(im, 2)
+     assert_equal(im2.coordmap, rollimg(im, 'k').coordmap)
+     assert_equal(im2.coordmap, rollimg(im, 'z').coordmap)
+     assert_equal(im2.coordmap, rollimg(im, -2).coordmap)
+     assert_equal(im2.coordmap,
+                  AT('kijl', 'xyzt', aff[:, (2, 0, 1, 3, 4)]))
+-    assert_array_equal(im2.get_data(), np.rollaxis(data, 2))
++    assert_array_equal(im2.get_fdata(), np.rollaxis(data, 2))
+     im3 = rollimg(im, 3)
+     assert_equal(im3.coordmap, rollimg(im, 'l').coordmap)
+     assert_equal(im3.coordmap, rollimg(im, 't').coordmap)
+     assert_equal(im3.coordmap, rollimg(im, -1).coordmap)
+     assert_equal(im3.coordmap,
+                         AT('lijk', 'xyzt', aff[:, (3, 0, 1, 2, 4)]))
+-    assert_array_equal(im3.get_data(), np.rollaxis(data, 3))
++    assert_array_equal(im3.get_fdata(), np.rollaxis(data, 3))
+     # We can roll to before a specified axis
+     im31 = rollimg(im, 3, 1)
+     assert_equal(im31.coordmap, rollimg(im, 'l', 'j').coordmap)
+@@ -382,7 +382,7 @@
+     assert_equal(im31.coordmap, rollimg(im, -1, -3).coordmap)
+     assert_equal(im31.coordmap,
+                  AT('iljk', 'xyzt', aff[:, (0, 3, 1, 2, 4)]))
+-    assert_array_equal(im31.get_data(), np.rollaxis(data, 3, 1))
++    assert_array_equal(im31.get_fdata(), np.rollaxis(data, 3, 1))
+     # Check that ambiguous axes raise an exception; 'l' appears both as an axis
+     # and a reference coord name and in different places
+     im_amb = Image(data, AT('ijkl', 'xylt', np.diag([1,2,3,4,1])))
+@@ -391,8 +391,8 @@
+     # reference coord name
+     im_unamb = Image(data, AT('ijkl', 'xyzl', np.diag([1,2,3,4,1])))
+     im_rolled = rollimg(im_unamb, 'l')
+-    assert_array_equal(im_rolled.get_data(),
+-                       im_unamb.get_data().transpose([3,0,1,2]))
++    assert_array_equal(im_rolled.get_fdata(),
++                       im_unamb.get_fdata().transpose([3,0,1,2]))
+     # Zero row / col means we can't find an axis mapping, when fix0 is false
+     aff_z = np.diag([1, 2, 3, 0, 1])
+     im_z = Image(data, AT('ijkl', 'xyzt', aff_z))
+@@ -419,15 +419,15 @@
+         im_i = rollimg(im, i)
+         im_o = rollimg(im, o)
+         im_n = rollimg(im, n)
+-        assert_array_equal(im_i.get_data(), im_o.get_data())
++        assert_array_equal(im_i.get_fdata(), im_o.get_fdata())
+         assert_array_equal(im_i.affine, im_o.affine)
+-        assert_array_equal(im_n.get_data(), im_o.get_data())
++        assert_array_equal(im_n.get_fdata(), im_o.get_fdata())
+         for _im in [im_n, im_o, im_i]:
+             # We're rollimg back.  We want to roll the new axis 0 back to where
+             # it started, which was position n
+             im_n_inv = rollimg(_im, 0, n + 1)
+             assert_array_equal(im_n_inv.affine, im.affine)
+-            assert_array_equal(im_n_inv.get_data(), im.get_data())
++            assert_array_equal(im_n_inv.get_fdata(), im.get_fdata())
+ 
+ 
+ def test_rollimg_rollaxis():
+@@ -439,11 +439,11 @@
+     for axis in chain(range(4), range(-3, -1)):
+         rdata = np.rollaxis(data, axis)
+         rimg = rollimg(img, axis)
+-        assert_array_equal(rdata, rimg.get_data())
++        assert_array_equal(rdata, rimg.get_fdata())
+         for start in chain(range(4), range(-3, -1)):
+             rdata = np.rollaxis(data, axis, start)
+             rimg = rollimg(img, axis, start)
+-            assert_array_equal(rdata, rimg.get_data())
++            assert_array_equal(rdata, rimg.get_fdata())
+ 
+ 
+ def test_rollaxis_inverse():
+@@ -455,6 +455,6 @@
+     for axis in chain(range(4), range(-3, -1)):
+         rimg = image.rollaxis(img, axis)
+         rdata = np.rollaxis(data, axis)
+-        assert_array_equal(rdata, rimg.get_data())
++        assert_array_equal(rdata, rimg.get_fdata())
+         rrimg = image.rollaxis(rimg, axis, inverse=True)
+-        assert_array_equal(data, rrimg.get_data())
++        assert_array_equal(data, rrimg.get_fdata())
+--- nipy.orig/nipy/core/image/tests/test_image_list.py
++++ nipy/nipy/core/image/tests/test_image_list.py
+@@ -64,7 +64,7 @@
+     new_cmap = AffineTransform(CoordinateSystem('ijkl'),
+                                FIMG.coordmap.function_range,
+                                FIMG.coordmap.affine)
+-    fimg2 = Image(FIMG.get_data(), new_cmap)
++    fimg2 = Image(FIMG.get_fdata(), new_cmap)
+     assert_equal(len(ImageList.from_image(fimg2, axis='t')), 20)
+     assert_equal(len(ImageList.from_image(fimg2, axis='l')), 20)
+     assert_raises(AxisError, ImageList.from_image, FIMG, 'q')
+@@ -86,8 +86,8 @@
+     assert_true(isinstance(sublist.get_list_data(axis=0), np.ndarray))
+     # Test __setitem__
+     sublist[2] = sublist[0]
+-    assert_equal(sublist[0].get_data().mean(),
+-                 sublist[2].get_data().mean())
++    assert_equal(sublist[0].get_fdata().mean(),
++                 sublist[2].get_fdata().mean())
+     # Test iterator
+     for x in sublist:
+         assert_true(isinstance(x, Image))
+--- nipy.orig/nipy/core/image/tests/test_image_spaces.py
++++ nipy/nipy/core/image/tests/test_image_spaces.py
+@@ -69,12 +69,12 @@
+             img_r = as_xyz_image(tmap)
+             assert_false(tmap is img_r)
+             assert_equal(img, img_r)
+-            assert_array_equal(img.get_data(), img_r.get_data())
++            assert_array_equal(img.get_fdata(), img_r.get_fdata())
+     img_t0 = rollimg(img, 't')
+     assert_false(is_xyz_affable(img_t0))
+     img_t0_r = as_xyz_image(img_t0)
+     assert_false(img_t0 is img_t0_r)
+-    assert_array_equal(img.get_data(), img_t0_r.get_data())
++    assert_array_equal(img.get_fdata(), img_t0_r.get_fdata())
+     assert_equal(img.coordmap, img_t0_r.coordmap)
+     # Test against nibabel image
+     nimg = nib.Nifti1Image(arr.astype('float'), np.diag([2,3,4,1]))
+@@ -126,10 +126,10 @@
+     aff = np.diag([2,3,4,1])
+     img = make_xyz_image(arr, aff, 'mni')
+     assert_equal(img.coordmap, vox2mni(aff, 1.0))
+-    assert_array_equal(img.get_data(), arr)
++    assert_array_equal(img.get_fdata(), arr)
+     img = make_xyz_image(arr, aff, 'talairach')
+     assert_equal(img.coordmap, vox2talairach(aff, 1.0))
+-    assert_array_equal(img.get_data(), arr)
++    assert_array_equal(img.get_fdata(), arr)
+     img = make_xyz_image(arr, aff, talairach_space)
+     assert_equal(img.coordmap, vox2talairach(aff, 1.0))
+     # Unknown space as string raises SpaceError
+--- nipy.orig/nipy/core/image/tests/test_rollimg.py
++++ nipy/nipy/core/image/tests/test_rollimg.py
+@@ -74,7 +74,7 @@
+     axis_name = img.axes.coord_names[0]
+     output_axes = list(img.axes.coord_names)
+     output_axes.remove(axis_name)
+-    newdata = reduce_op(img.get_data())
++    newdata = reduce_op(img.get_fdata())
+     return Image(newdata, drop_io_dim(img.coordmap, axis))
+ 
+ 
+@@ -126,7 +126,7 @@
+     """
+     rolled_img = rollimg(img, inaxis)
+     inaxis = rolled_img.axes.coord_names[0] # now it's a string
+-    newdata = function(rolled_img.get_data())
++    newdata = function(rolled_img.get_fdata())
+     new_coordmap = rolled_img.coordmap.renamed_domain({inaxis: outaxis})
+     new_image = Image(newdata, new_coordmap)
+     # we have to roll the axis back
+@@ -162,7 +162,7 @@
+         with a modified copy of img._data.
+     """
+     rolled_img = rollimg(img, axis)
+-    data = rolled_img.get_data().copy()
++    data = rolled_img.get_fdata().copy()
+     for d in data:
+         modify(d)
+     import copy
+@@ -179,7 +179,7 @@
+     assert_array_equal(xyz_affine(im), xyz_affine(newim))
+     assert_equal(newim.axes.coord_names, tuple('ijk'))
+     assert_equal(newim.shape, (3, 5, 7))
+-    assert_almost_equal(newim.get_data(), x.sum(3))
++    assert_almost_equal(newim.get_fdata(), x.sum(3))
+     im_nd = Image(x, AT(CS('ijkq'), MNI4, np.array(
+         [[0, 1, 2, 0, 10],
+          [3, 4, 5, 0, 11],
+@@ -207,7 +207,7 @@
+     assert_array_equal(xyz_affine(im), xyz_affine(newim))
+     assert_equal(newim.axes.coord_names, tuple('ijk'))
+     assert_equal(newim.shape, (3, 5, 7))
+-    assert_almost_equal(newim.get_data(), x.sum(3))
++    assert_almost_equal(newim.get_fdata(), x.sum(3))
+ 
+ 
+ def test_call():
+@@ -221,7 +221,7 @@
+     assert_array_equal(xyz_affine(im), xyz_affine(newim))
+     assert_equal(newim.axes.coord_names, tuple('ijk') + ('out',))
+     assert_equal(newim.shape, (3, 5, 7, 6))
+-    assert_almost_equal(newim.get_data(), x[:,:,:,::2])
++    assert_almost_equal(newim.get_fdata(), x[:,:,:,::2])
+ 
+ 
+ def test_modify():
+@@ -242,16 +242,16 @@
+         for a in i, o, n:
+             nullim = image_modify(im, nullmodify, a)
+             meanim = image_modify(im, meanmodify, a)
+-            assert_array_equal(nullim.get_data(), im.get_data())
++            assert_array_equal(nullim.get_fdata(), im.get_fdata())
+             assert_array_equal(xyz_affine(im), xyz_affine(nullim))
+             assert_equal(nullim.axes, im.axes)
+             # yield assert_equal, nullim, im
+             assert_array_equal(xyz_affine(im), xyz_affine(meanim))
+             assert_equal(meanim.axes, im.axes)
+         # Make sure that meanmodify works as expected
+-        d = im.get_data()
++        d = im.get_fdata()
+         d = np.rollaxis(d, n)
+-        meand = meanim.get_data()
++        meand = meanim.get_fdata()
+         meand = np.rollaxis(meand, n)
+         for i in range(d.shape[0]):
+             assert_almost_equal(meand[i], d[i].mean())
+--- nipy.orig/nipy/core/utils/generators.py
++++ nipy/nipy/core/utils/generators.py
+@@ -38,7 +38,7 @@
+     Parameters
+     ----------
+     data : image or array-like
+-        Either an image (with ``get_data`` method returning ndarray) or an
++        Either an image (with ``get_fdata`` method returning ndarray) or an
+         array-like
+     labels : iterable, optional
+         A sequence of labels for which to return indices within `data`. The
+@@ -80,7 +80,7 @@
+     """
+     # Get image data or make array from array-like
+     try:
+-        data = data.get_data()
++        data = data.get_fdata()
+     except AttributeError:
+         data = np.asarray(data)
+     if labels is None:
+--- nipy.orig/nipy/io/files.py
++++ nipy/nipy/io/files.py
+@@ -138,7 +138,7 @@
+         # All done
+         io_dtype = None
+     elif dt_from_is_str and dtype_from == 'data':
+-        io_dtype = img.get_data().dtype
++        io_dtype = img.get_fdata().dtype
+     else:
+         io_dtype = np.dtype(dtype_from)
+     # make new image
+--- nipy.orig/nipy/io/nifti_ref.py
++++ nipy/nipy/io/nifti_ref.py
+@@ -297,7 +297,7 @@
+     data = None
+     if data_dtype is None:
+         if in_hdr is None:
+-            data = img.get_data()
++            data = img.get_fdata()
+             data_dtype = data.dtype
+         else:
+             data_dtype = in_hdr.get_data_dtype()
+@@ -361,12 +361,12 @@
+     # Done if we only have 3 input dimensions
+     n_ns = coordmap.ndims[0] - 3
+     if n_ns == 0: # No non-spatial dimensions
+-        return nib.Nifti1Image(img.get_data(), xyz_affine, hdr)
++        return nib.Nifti1Image(img.get_fdata(), xyz_affine, hdr)
+     elif n_ns > 4:
+         raise NiftiError("Too many dimensions to convert")
+     # Go now to data, pixdims
+     if data is None:
+-        data = img.get_data()
++        data = img.get_fdata()
+     rzs, trans = to_matvec(img.coordmap.affine)
+     ns_pixdims = list(np.sqrt(np.sum(rzs[3:, 3:] ** 2, axis=0)))
+     in_ax, out_ax, tl_name = _find_time_like(coordmap, fix0)
+@@ -543,7 +543,7 @@
+         affine = hdr.get_best_affine()
+     else:
+         affine = affine.copy()
+-    data = ni_img.get_data()
++    data = ni_img.get_fdata()
+     shape = list(ni_img.shape)
+     ndim = len(shape)
+     if ndim < 3:
+--- nipy.orig/nipy/io/tests/test_image_io.py
++++ nipy/nipy/io/tests/test_image_io.py
+@@ -53,7 +53,7 @@
+ @if_templates
+ def test_maxminmean_values():
+     # loaded array values from SPM
+-    y = gimg.get_data()
++    y = gimg.get_fdata()
+     yield assert_equal, y.shape, tuple(gimg.shape)
+     yield assert_array_almost_equal, y.max(), 1.000000059
+     yield assert_array_almost_equal, y.mean(), 0.273968048
+@@ -96,7 +96,7 @@
+     img = Image(data, vox2mni(np.eye(4)))
+     # The dtype_from dtype won't be visible until the image is loaded
+     newimg = save_image(img, name, dtype_from=out_dtype)
+-    return newimg.get_data(), data
++    return newimg.get_fdata(), data
+ 
+ 
+ def test_scaling_io_dtype():
+@@ -116,7 +116,7 @@
+                 # Check the data is within reasonable bounds. The exact bounds
+                 # are a little annoying to calculate - see
+                 # nibabel/tests/test_round_trip for inspiration
+-                data_back = img.get_data().copy() # copy to detach from file
++                data_back = img.get_fdata().copy() # copy to detach from file
+                 del img
+                 top = np.abs(data - data_back)
+                 nzs = (top !=0) & (data !=0)
+@@ -227,11 +227,11 @@
+ 
+ def test_file_roundtrip():
+     img = load_image(anatfile)
+-    data = img.get_data()
++    data = img.get_fdata()
+     with InTemporaryDirectory():
+         save_image(img, 'img.nii.gz')
+         img2 = load_image('img.nii.gz')
+-        data2 = img2.get_data()
++        data2 = img2.get_fdata()
+     # verify data
+     assert_almost_equal(data2, data)
+     assert_almost_equal(data2.mean(), data.mean())
+@@ -250,7 +250,7 @@
+     with InTemporaryDirectory():
+         save_image(img, 'img.nii.gz')
+         img2 = load_image('img.nii.gz')
+-        data2 = img2.get_data()
++        data2 = img2.get_fdata()
+     # verify data
+     assert_almost_equal(data2, data)
+     assert_almost_equal(data2.mean(), data.mean())
+@@ -269,7 +269,7 @@
+     img1 = as_image(six.text_type(funcfile))  # unicode
+     img2 = as_image(img)
+     assert_equal(img.affine, img1.affine)
+-    assert_array_equal(img.get_data(), img1.get_data())
++    assert_array_equal(img.get_fdata(), img1.get_fdata())
+     assert_true(img is img2)
+ 
+ 
+--- nipy.orig/nipy/io/tests/test_nifti_ref.py
++++ nipy/nipy/io/tests/test_nifti_ref.py
+@@ -34,7 +34,7 @@
+     # Make a fresh copy of a image stored in a file
+     img = load(fname)
+     hdr = img.metadata['header'].copy()
+-    return Image(img.get_data().copy(),
++    return Image(img.get_fdata().copy(),
+                  copy(img.coordmap),
+                  {'header': hdr})
+ 
+@@ -53,7 +53,7 @@
+     # Go from nipy image to header and data for nifti
+     fimg = copy_of(funcfile)
+     hdr = fimg.metadata['header']
+-    data = fimg.get_data()
++    data = fimg.get_fdata()
+     # Header is preserved
+     # Put in some information to check header is preserved
+     hdr['slice_duration'] = 0.25
+@@ -63,35 +63,35 @@
+     assert_false(hdr is new_hdr)
+     # Check information preserved
+     assert_equal(hdr['slice_duration'], new_hdr['slice_duration'])
+-    assert_array_equal(data, ni_img.get_data())
++    assert_array_equal(data, ni_img.get_fdata())
+     # Shape obviously should be same
+     assert_equal(ni_img.shape, fimg.shape)
+ 
+ 
+ def test_xyz_affines():
+     fimg = copy_of(funcfile)
+-    data = fimg.get_data()
++    data = fimg.get_fdata()
+     # Check conversion to xyz affable
+     # Roll time to front in array
+     fimg_t0 = fimg.reordered_axes((3, 0, 1, 2))
+     # Nifti conversion rolls it back
+-    assert_array_equal(nipy2nifti(fimg_t0).get_data(), data)
++    assert_array_equal(nipy2nifti(fimg_t0).get_fdata(), data)
+     # Roll time to position 1
+     fimg_t0 = fimg.reordered_axes((0, 3, 1, 2))
+-    assert_array_equal(nipy2nifti(fimg_t0).get_data(), data)
++    assert_array_equal(nipy2nifti(fimg_t0).get_fdata(), data)
+     # Check bad names cause NiftiError
+     out_coords = fimg.reference.coord_names
+     bad_img = fimg.renamed_reference(**{out_coords[0]: 'not a known axis'})
+     assert_raises(NiftiError, nipy2nifti, bad_img)
+     # Check xyz works for not strict
+     bad_img = fimg.renamed_reference(**dict(zip(out_coords, 'xyz')))
+-    assert_array_equal(nipy2nifti(bad_img, strict=False).get_data(), data)
++    assert_array_equal(nipy2nifti(bad_img, strict=False).get_fdata(), data)
+     # But fails for strict
+     assert_raises(NiftiError, nipy2nifti, bad_img, strict=True)
+     # 3D is OK
+     aimg = copy_of(anatfile)
+-    adata = aimg.get_data()
+-    assert_array_equal(nipy2nifti(aimg).get_data(), adata)
++    adata = aimg.get_fdata()
++    assert_array_equal(nipy2nifti(aimg).get_fdata(), adata)
+     # For now, always error on 2D (this depends on as_xyz_image)
+     assert_raises(NiftiError, nipy2nifti, aimg[:, :, 1])
+     assert_raises(NiftiError, nipy2nifti, aimg[:, 1, :])
+@@ -167,7 +167,7 @@
+     assert_equal(hdr.get_dim_info(), (None, None, None))
+     ni_img = nipy2nifti(fimg)
+     assert_equal(get_header(ni_img).get_dim_info(), (None, None, None))
+-    data = fimg.get_data()
++    data = fimg.get_fdata()
+     cmap = fimg.coordmap
+     for i in range(3):
+         for order, name in enumerate(('freq', 'phase', 'slice')):
+@@ -280,7 +280,7 @@
+     # Whether xyzt_unit field gets set correctly
+     fimg_orig = copy_of(funcfile)
+     # Put time in output, input and both
+-    data = fimg_orig.get_data()
++    data = fimg_orig.get_fdata()
+     hdr = fimg_orig.metadata['header']
+     aff = fimg_orig.coordmap.affine
+     out_names = fimg_orig.reference.coord_names
+@@ -334,17 +334,17 @@
+         img = Image(data, cmap)
+         # Time-like in correct position
+         ni_img = nipy2nifti(img)
+-        assert_array_equal(ni_img.get_data(), data)
++        assert_array_equal(ni_img.get_fdata(), data)
+         assert_array_equal(get_header(ni_img).get_zooms(), (2, 3, 4, 5, 6, 7))
+         # Time-like needs reordering
+         cmap = AT(in_cs, CS(xyz_names + ('q', time_like, 'r')), aff)
+         ni_img = nipy2nifti(Image(data, cmap))
+-        assert_array_equal(ni_img.get_data(), np.rollaxis(data, 4, 3))
++        assert_array_equal(ni_img.get_fdata(), np.rollaxis(data, 4, 3))
+         assert_array_equal(get_header(ni_img).get_zooms(), (2, 3, 4, 6, 5, 7))
+         # And again
+         cmap = AT(in_cs, CS(xyz_names + ('q', 'r', time_like)), aff)
+         ni_img = nipy2nifti(Image(data, cmap))
+-        assert_array_equal(ni_img.get_data(), np.rollaxis(data, 5, 3))
++        assert_array_equal(ni_img.get_fdata(), np.rollaxis(data, 5, 3))
+         assert_array_equal(get_header(ni_img).get_zooms(), (2, 3, 4, 7, 5, 6))
+ 
+ 
+@@ -432,12 +432,12 @@
+     for time_like in ('t', 'hz', 'ppm', 'rads'):
+         cmap = AT(in_cs, CS(xyz_names + (time_like, 'q', 'r')), aff)
+         ni_img = nipy2nifti(Image(data, cmap))
+-        assert_array_equal(ni_img.get_data(), data)
++        assert_array_equal(ni_img.get_fdata(), data)
+     # But there is if no time-like
+     for no_time in ('random', 'words', 'I', 'thought', 'of'):
+         cmap = AT(in_cs, CS(xyz_names + (no_time, 'q', 'r')), aff)
+         ni_img = nipy2nifti(Image(data, cmap))
+-        assert_array_equal(ni_img.get_data(), data[:, :, :, None, :, :])
++        assert_array_equal(ni_img.get_fdata(), data[:, :, :, None, :, :])
+ 
+ 
+ def test_save_spaces():
+@@ -493,7 +493,7 @@
+     aff = np.diag([2., 3, 4, 1])
+     ni_img = nib.Nifti1Image(data, aff)
+     img = nifti2nipy(ni_img)
+-    assert_array_equal(img.get_data(), data)
++    assert_array_equal(img.get_fdata(), data)
+ 
+ 
+ def test_expand_to_3d():
+@@ -553,13 +553,13 @@
+             exp_aff = np.dot(np.diag(diag), full_aff)
+         exp_cmap = AT(in_cs, out_cs, exp_aff)
+         assert_equal(img.coordmap, exp_cmap)
+-        assert_array_equal(img.get_data(), data)
++        assert_array_equal(img.get_fdata(), data)
+         # Even if the image axis length is 1, we keep out time dimension, if
+         # there is specific scaling implying time-like
+         ni_img_t = nib.Nifti1Image(reduced_data, xyz_aff, hdr)
+         img = nifti2nipy(ni_img_t)
+         assert_equal(img.coordmap, exp_cmap)
+-        assert_array_equal(img.get_data(), reduced_data)
++        assert_array_equal(img.get_fdata(), reduced_data)
+ 
+ 
+ def test_load_no_time():
+--- nipy.orig/nipy/io/tests/test_save.py
++++ nipy/nipy/io/tests/test_save.py
+@@ -36,7 +36,7 @@
+         img2 = load_image(TMP_FNAME)
+         assert_array_almost_equal(img.affine, img2.affine)
+         assert_equal(img.shape, img2.shape)
+-        assert_array_almost_equal(img2.get_data(), img.get_data())
++        assert_array_almost_equal(img2.get_fdata(), img.get_fdata())
+         del img2
+ 
+ 
+@@ -53,7 +53,7 @@
+         img2 = load_image(TMP_FNAME)
+         assert_array_almost_equal(img.affine, img2.affine)
+         assert_equal(img.shape, img2.shape)
+-        assert_array_almost_equal(img2.get_data(), img.get_data())
++        assert_array_almost_equal(img2.get_fdata(), img.get_fdata())
+         del img2
+ 
+ 
+@@ -77,7 +77,7 @@
+         img2 = load_image(TMP_FNAME)
+         assert_array_almost_equal(img.affine, img2.affine)
+         assert_equal(img.shape, img2.shape)
+-        assert_array_almost_equal(img2.get_data(), img.get_data())
++        assert_array_almost_equal(img2.get_fdata(), img.get_fdata())
+         del img2
+ 
+ 
+@@ -99,13 +99,13 @@
+         save_image(img, TMP_FNAME)
+         tmp = load_image(TMP_FNAME)
+         # Detach image from file so we can delete it
+-        data = tmp.get_data().copy()
++        data = tmp.get_fdata().copy()
+         img2 = api.Image(data, tmp.coordmap, tmp.metadata)
+         del tmp
+     assert_equal(tuple([img.shape[l] for l in [3,2,1,0]]), img2.shape)
+-    a = np.transpose(img.get_data(), [3,2,1,0])
++    a = np.transpose(img.get_fdata(), [3,2,1,0])
+     assert_false(np.allclose(img.affine, img2.affine))
+-    assert_true(np.allclose(a, img2.get_data()))
++    assert_true(np.allclose(a, img2.get_fdata()))
+ 
+ 
+ def test_save4():
+@@ -126,7 +126,7 @@
+     with InTemporaryDirectory():
+         save_image(img, TMP_FNAME)
+         tmp = load_image(TMP_FNAME)
+-        data = tmp.get_data().copy()
++        data = tmp.get_fdata().copy()
+         # Detach image from file so we can delete it
+         img2 = api.Image(data, tmp.coordmap, tmp.metadata)
+         del tmp
+@@ -148,8 +148,8 @@
+     assert_equal(img.shape[::-1], img2.shape)
+     # data should be transposed because coordinates are reversed
+     assert_array_almost_equal(
+-        np.transpose(img2.get_data(),[3,2,1,0]),
+-        img.get_data())
++        np.transpose(img2.get_fdata(),[3,2,1,0]),
++        img.get_fdata())
+     # coordinate names should be reversed as well
+     assert_equal(img2.coordmap.function_domain.coord_names,
+                  img.coordmap.function_domain.coord_names[::-1])
+--- nipy.orig/nipy/labs/datasets/converters.py
++++ nipy/nipy/labs/datasets/converters.py
+@@ -64,7 +64,7 @@
+         copy = False
+ 
+     if isinstance(obj, SpatialImage):
+-        data   = obj.get_data()
++        data   = obj.get_fdata()
+         affine = get_affine(obj)
+         header = dict(get_header(obj))
+         fname = obj.file_map['image'].filename
+@@ -109,7 +109,7 @@
+     for key, value in obj.metadata.items():
+         if key in hdr:
+             hdr[key] = value
+-    img = nib.Nifti1Image(obj.get_data(),
++    img = nib.Nifti1Image(obj.get_fdata(),
+                           obj.affine,
+                           header=hdr)
+     nib.save(img, filename)
+--- nipy.orig/nipy/labs/datasets/volumes/tests/test_volume_grid.py
++++ nipy/nipy/labs/datasets/volumes/tests/test_volume_grid.py
+@@ -100,7 +100,7 @@
+     
+     # Resample an image on itself: it shouldn't change much:
+     img  = img1.resampled_to_img(img1)
+-    yield np.testing.assert_almost_equal, data, img.get_data()
++    yield np.testing.assert_almost_equal, data, img.get_fdata()
+ 
+     # Check that if I 'resampled_to_img' on an VolumeImg, I get an
+     # VolumeImg, and vice versa 
+@@ -113,8 +113,8 @@
+     yield nose.tools.assert_true, isinstance(image2, VolumeGrid)
+     # Check that the data are all the same: we have been playing only
+     # with identity mappings
+-    yield np.testing.assert_array_equal, volume_image2.get_data(), \
+-            image2.get_data()
++    yield np.testing.assert_array_equal, volume_image2.get_fdata(), \
++            image2.get_fdata()
+ 
+ 
+ def test_as_volume_image():
+--- nipy.orig/nipy/labs/datasets/volumes/tests/test_volume_img.py
++++ nipy/nipy/labs/datasets/volumes/tests/test_volume_img.py
+@@ -54,12 +54,12 @@
+     affine[:3, -1] = 0.5 * np.array(shape[:3])
+     ref_im = VolumeImg(data, affine, 'mine')
+     rot_im = ref_im.as_volume_img(affine, interpolation='nearest')
+-    yield np.testing.assert_almost_equal, data, rot_im.get_data()
++    yield np.testing.assert_almost_equal, data, rot_im.get_fdata()
+     # Now test when specifying only a 3x3 affine
+     #rot_im = ref_im.as_volume_img(affine[:3, :3], interpolation='nearest')
+-    yield np.testing.assert_almost_equal, data, rot_im.get_data()
++    yield np.testing.assert_almost_equal, data, rot_im.get_fdata()
+     reordered_im = rot_im.xyz_ordered()
+-    yield np.testing.assert_almost_equal, data, reordered_im.get_data()
++    yield np.testing.assert_almost_equal, data, reordered_im.get_fdata()
+ 
+ 
+ def test_downsample():
+@@ -73,7 +73,7 @@
+     downsampled = data[::2, ::2, ::2, ...]
+     x, y, z = downsampled.shape[:3]
+     np.testing.assert_almost_equal(downsampled, 
+-                                   rot_im.get_data()[:x, :y, :z, ...])
++                                   rot_im.get_fdata()[:x, :y, :z, ...])
+ 
+ 
+ def test_resampling_with_affine():
+@@ -85,7 +85,7 @@
+     for angle in (0, np.pi, np.pi/2, np.pi/4, np.pi/3):
+         rot = rotation(0, angle)
+         rot_im = img.as_volume_img(affine=rot)
+-        yield np.testing.assert_almost_equal, np.max(data), np.max(rot_im.get_data())
++        yield np.testing.assert_almost_equal, np.max(data), np.max(rot_im.get_fdata())
+ 
+ 
+ def test_reordering():
+@@ -109,12 +109,12 @@
+         rot_im = ref_im.as_volume_img(affine=new_affine)
+         yield np.testing.assert_array_equal, rot_im.affine, \
+                                     new_affine
+-        yield np.testing.assert_array_equal, rot_im.get_data().shape, \
++        yield np.testing.assert_array_equal, rot_im.get_fdata().shape, \
+                                     shape
+         reordered_im = rot_im.xyz_ordered()
+         yield np.testing.assert_array_equal, reordered_im.affine[:3, :3], \
+                                     np.eye(3)
+-        yield np.testing.assert_almost_equal, reordered_im.get_data(), \
++        yield np.testing.assert_almost_equal, reordered_im.get_fdata(), \
+                                     data
+ 
+     # Check that we cannot swap axes for non spatial axis:
+@@ -164,7 +164,7 @@
+     # Check that as_volume_img with no arguments returns the same image
+     yield nose.tools.assert_equal, ref_im, ref_im.as_volume_img()
+     copy_im = copy.copy(ref_im)
+-    copy_im.get_data()[0, 0, 0] *= -1
++    copy_im.get_fdata()[0, 0, 0] *= -1
+     yield nose.tools.assert_not_equal, ref_im, copy_im
+     copy_im = copy.copy(ref_im)
+     copy_im.affine[0, 0] *= -1
+@@ -186,7 +186,7 @@
+     data = np.random.random(shape)
+     affine = np.eye(4)
+     ref_im = VolumeImg(data, affine, 'mine')
+-    x, y, z = np.indices(ref_im.get_data().shape[:3])
++    x, y, z = np.indices(ref_im.get_fdata().shape[:3])
+     values = ref_im.values_in_world(x, y, z)
+     np.testing.assert_almost_equal(values, data)
+ 
+@@ -199,9 +199,9 @@
+     affine = np.random.random((4, 4))
+     ref_im = VolumeImg(data, affine, 'mine')
+     yield np.testing.assert_almost_equal, data, \
+-                ref_im.as_volume_img(affine=ref_im.affine).get_data()
++                ref_im.as_volume_img(affine=ref_im.affine).get_fdata()
+     yield np.testing.assert_almost_equal, data, \
+-                        ref_im.resampled_to_img(ref_im).get_data()
++                        ref_im.resampled_to_img(ref_im).get_fdata()
+     
+     # Check that we cannot resample to another image in a different
+     # world.
+@@ -243,7 +243,7 @@
+         
+         # Resample an image on itself: it shouldn't change much:
+         img  = img1.resampled_to_img(img1)
+-        yield np.testing.assert_almost_equal, data, img.get_data()
++        yield np.testing.assert_almost_equal, data, img.get_fdata()
+ 
+ 
+ def test_get_affine():
+--- nipy.orig/nipy/labs/datasets/volumes/volume_data.py
++++ nipy/nipy/labs/datasets/volumes/volume_data.py
+@@ -46,7 +46,7 @@
+         The data is stored in an undefined way: prescalings might need to
+         be applied to it before using it, or the data might be loaded on
+         demand. The best practice to access the data is not to access the
+-        _data attribute, but to use the `get_data` method.
++        _data attribute, but to use the `get_fdata` method.
+     """
+     #---------------------------------------------------------------------------
+     # Public attributes -- VolumeData interface
+@@ -67,7 +67,7 @@
+     #---------------------------------------------------------------------------
+ 
+ 
+-    def get_data(self):
++    def get_fdata(self):
+         """ Return data as a numpy array.
+         """
+         return np.asanyarray(self._data)
+@@ -174,7 +174,7 @@
+ 
+ 
+     def __copy__(self):
+-        return self.like_from_data(self.get_data().copy())
++        return self.like_from_data(self.get_fdata().copy())
+ 
+ 
+     def __deepcopy__(self, option):
+@@ -188,7 +188,7 @@
+     def __eq__(self, other):
+         return (    self.world_space       == other.world_space 
+                 and self.get_transform()   == other.get_transform()
+-                and np.all(self.get_data() == other.get_data())
++                and np.all(self.get_fdata() == other.get_fdata())
+                 and self.interpolation     == other.interpolation
+                )
+ 
+--- nipy.orig/nipy/labs/datasets/volumes/volume_grid.py
++++ nipy/nipy/labs/datasets/volumes/volume_grid.py
+@@ -58,7 +58,7 @@
+         The data is stored in an undefined way: prescalings might need to
+         be applied to it before using it, or the data might be loaded on
+         demand. The best practice to access the data is not to access the
+-        _data attribute, but to use the `get_data` method.
++        _data attribute, but to use the `get_fdata` method.
+ 
+         If the transform associated with the image has no inverse
+         mapping, data corresponding to a given world space position cannot
+@@ -239,7 +239,7 @@
+         # See: https://github.com/scipy/scipy/pull/64
+         if coords.dtype == np.dtype(np.intp):
+             coords = coords.astype(np.dtype(coords.dtype.str))
+-        data = self.get_data()
++        data = self.get_fdata()
+         data_shape = list(data.shape)
+         n_dims = len(data_shape)
+         if n_dims > 3:
+--- nipy.orig/nipy/labs/datasets/volumes/volume_img.py
++++ nipy/nipy/labs/datasets/volumes/volume_img.py
+@@ -55,7 +55,7 @@
+         The data is stored in an undefined way: prescalings might need to
+         be applied to it before using it, or the data might be loaded on
+         demand. The best practice to access the data is not to access the
+-        _data attribute, but to use the `get_data` method.
++        _data attribute, but to use the `get_fdata` method.
+     """
+ 
+     # most attributes are given by the VolumeField interface 
+@@ -135,7 +135,7 @@
+                 'The two images are not embedded in the same world space')
+         if isinstance(target_image, VolumeImg):
+             return self.as_volume_img(affine=target_image.affine,
+-                                    shape=target_image.get_data().shape[:3],
++                                    shape=target_image.get_fdata().shape[:3],
+                                     interpolation=interpolation)
+         else:
+             # IMPORTANT: Polymorphism can be implemented by walking the 
+@@ -159,7 +159,7 @@
+                 return self
+         if affine is None:
+             affine = self.affine
+-        data = self.get_data()
++        data = self.get_fdata()
+         if shape is None:
+             shape = data.shape[:3]
+         shape = list(shape)
+@@ -280,7 +280,7 @@
+ 
+         # Now make sure the affine is positive
+         pixdim = np.diag(A).copy()
+-        data = img.get_data()
++        data = img.get_fdata()
+         if pixdim[0] < 0:
+             b[0] = b[0] + pixdim[0]*(data.shape[0] - 1)
+             pixdim[0] = -pixdim[0]
+@@ -316,7 +316,7 @@
+         if (axis1 > 2) or (axis2 > 2):
+             raise ValueError('Can swap axis only on spatial axis. '
+                              'Use np.swapaxes of the data array.')
+-        reordered_data = np.swapaxes(self.get_data(), axis1, axis2)
++        reordered_data = np.swapaxes(self.get_fdata(), axis1, axis2)
+         new_affine = self.affine
+         order = np.array((0, 1, 2, 3))
+         order[axis1] = axis2
+@@ -335,13 +335,13 @@
+         new_v2w_transform = \
+                         self.get_transform().composed_with(w2w_transform)
+         if hasattr(new_v2w_transform, 'affine'):
+-            new_img = self.__class__(self.get_data(),
++            new_img = self.__class__(self.get_fdata(),
+                                      new_v2w_transform.affine,
+                                      new_v2w_transform.output_space,
+                                      metadata=self.metadata,
+                                      interpolation=self.interpolation)
+         else:
+-            new_img = VolumeGrid(self.get_data(),
++            new_img = VolumeGrid(self.get_fdata(),
+                                 transform=new_v2w_transform,
+                                 metadata=self.metadata,
+                                 interpolation=self.interpolation)
+@@ -364,7 +364,7 @@
+ 
+     def __eq__(self, other):
+         return (    isinstance(other, self.__class__)
+-                and np.all(self.get_data() == other.get_data())
++                and np.all(self.get_fdata() == other.get_fdata())
+                 and np.all(self.affine == other.affine)
+                 and (self.world_space == other.world_space)
+                 and (self.interpolation == other.interpolation)
+--- nipy.orig/nipy/labs/mask.py
++++ nipy/nipy/labs/mask.py
+@@ -159,12 +159,12 @@
+         for index, filename in enumerate(input_filename):
+             nim = load(filename)
+             if index == 0:
+-                first_volume = nim.get_data().squeeze()
++                first_volume = nim.get_fdata().squeeze()
+                 mean_volume = first_volume.copy().astype(np.float32)
+                 header = get_header(nim)
+                 affine = get_affine(nim)
+             else:
+-                mean_volume += nim.get_data().squeeze()
++                mean_volume += nim.get_fdata().squeeze()
+         mean_volume /= float(len(list(input_filename)))
+     del nim
+     if np.isnan(mean_volume).any():
+@@ -296,8 +296,8 @@
+     """
+     mask, mean = None, None
+     for index, session in enumerate(session_images):
+-        if hasattr(session, 'get_data'):
+-            mean = session.get_data()
++        if hasattr(session, 'get_fdata'):
++            mean = session.get_fdata()
+             if mean.ndim > 3:
+                 mean = mean.mean(-1)
+             this_mask = compute_mask(mean, None, m=m, M=M, cc=cc,
+@@ -373,7 +373,7 @@
+     for this_mask in input_masks:
+         if isinstance(this_mask, string_types):
+             # We have a filename
+-            this_mask = load(this_mask).get_data()
++            this_mask = load(this_mask).get_fdata()
+         if grp_mask is None:
+             grp_mask = this_mask.copy().astype(np.int)
+         else:
+@@ -450,7 +450,7 @@
+         # We have a 4D nifti file
+         data_file = load(filenames)
+         header = get_header(data_file)
+-        series = data_file.get_data()
++        series = data_file.get_fdata()
+         if ensure_finite:
+             # SPM tends to put NaNs in the data outside the brain
+             series[np.logical_not(np.isfinite(series))] = 0
+@@ -471,7 +471,7 @@
+         series = np.zeros((mask.sum(), nb_time_points), dtype=dtype)
+         for index, filename in enumerate(filenames):
+             data_file = load(filename)
+-            data = data_file.get_data()
++            data = data_file.get_fdata()
+             if ensure_finite:
+                 # SPM tends to put NaNs in the data outside the brain
+                 data[np.logical_not(np.isfinite(data))] = 0
+--- nipy.orig/nipy/labs/spatial_models/bsa_io.py
++++ nipy/nipy/labs/spatial_models/bsa_io.py
+@@ -68,9 +68,9 @@
+     # Read the masks and compute the "intersection"
+     # mask = np.reshape(intersect_masks(mask_images), ref_dim).astype('u8')
+     if isinstance(mask_images, string_types):
+-        mask = load(mask_images).get_data()
++        mask = load(mask_images).get_fdata()
+     elif isinstance(mask_images, Nifti1Image):
+-        mask = mask_images.get_data()
++        mask = mask_images.get_fdata()
+     else:
+         # mask_images should be a list of strings or images
+         mask = intersect_masks(mask_images).astype('u8')
+@@ -82,7 +82,7 @@
+     # read the functional images
+     stats = []
+     for stat_image in stat_images:
+-        beta = np.reshape(load(stat_image).get_data(), ref_dim)
++        beta = np.reshape(load(stat_image).get_fdata(), ref_dim)
+         stats.append(beta[mask > 0])
+     stats = np.array(stats).T
+ 
+--- nipy.orig/nipy/labs/spatial_models/discrete_domain.py
++++ nipy/nipy/labs/spatial_models/discrete_domain.py
+@@ -240,7 +240,7 @@
+         iim = load(mim)
+     else:
+         iim = mim
+-    return domain_from_binary_array(iim.get_data(), get_affine(iim), nn)
++    return domain_from_binary_array(iim.get_fdata(), get_affine(iim), nn)
+ 
+ 
+ def grid_domain_from_binary_array(mask, affine=None, nn=0):
+@@ -290,7 +290,7 @@
+         iim = load(mim)
+     else:
+         iim = mim
+-    return grid_domain_from_binary_array(iim.get_data(), get_affine(iim), nn)
++    return grid_domain_from_binary_array(iim.get_fdata(), get_affine(iim), nn)
+ 
+ 
+ def grid_domain_from_shape(shape, affine=None):
+@@ -786,7 +786,7 @@
+         if (get_affine(nim) != self.affine).any():
+             raise ValueError('nim and self do not have the same referential')
+ 
+-        data = nim.get_data()
++        data = nim.get_fdata()
+         feature = data[self.ijk[:, 0], self.ijk[:, 1], self.ijk[:, 2]]
+         if fid is not '':
+             self.features[fid] = feature
+--- nipy.orig/nipy/labs/spatial_models/mroi.py
++++ nipy/nipy/labs/spatial_models/mroi.py
+@@ -643,7 +643,7 @@
+         else:
+             data = -np.ones(self.label.size, dtype=np.int32)
+             tmp_image = self.domain.to_image()
+-            mask = tmp_image.get_data().copy().astype(bool)
++            mask = tmp_image.get_fdata().copy().astype(bool)
+             if not roi:
+                 # write a feature
+                 if fid not in self.features:
+@@ -765,7 +765,7 @@
+     else:
+         iim = mim
+ 
+-    return subdomain_from_array(iim.get_data(), get_affine(iim), nn)
++    return subdomain_from_array(iim.get_fdata(), get_affine(iim), nn)
+ 
+ 
+ def subdomain_from_position_and_image(nim, pos):
+@@ -784,7 +784,7 @@
+     coord = np.array([tmp.domain.coord[tmp.label == k].mean(0)
+                       for k in range(tmp.k)])
+     idx = ((coord - pos) ** 2).sum(1).argmin()
+-    return subdomain_from_array(nim.get_data() == idx, get_affine(nim))
++    return subdomain_from_array(nim.get_fdata() == idx, get_affine(nim))
+ 
+ 
+ def subdomain_from_balls(domain, positions, radii):
+--- nipy.orig/nipy/labs/spatial_models/tests/test_mroi.py
++++ nipy/nipy/labs/spatial_models/tests/test_mroi.py
+@@ -205,10 +205,10 @@
+     # Test example runs correctly
+     eg_img = pjoin(dirname(__file__), 'some_blobs.nii')
+     nim = load(eg_img)
+-    arr = nim.get_data() ** 2 > 0
++    arr = nim.get_fdata() ** 2 > 0
+     mask_image = Nifti1Image(arr.astype('u1'), get_affine(nim))
+     domain = grid_domain_from_image(mask_image)
+-    data = nim.get_data()
++    data = nim.get_fdata()
+     values = data[data != 0]
+ 
+     # parameters
+@@ -229,10 +229,10 @@
+     assert_equal(average_activation, nroi.representative_feature('activation'))
+     # Binary image is default
+     bin_wim = nroi.to_image()
+-    bin_vox = bin_wim.get_data()
++    bin_vox = bin_wim.get_fdata()
+     assert_equal(np.unique(bin_vox), [0, 1])
+     id_wim = nroi.to_image('id', roi=True, descrip='description')
+-    id_vox = id_wim.get_data()
++    id_vox = id_wim.get_fdata()
+     mask = bin_vox.astype(bool)
+     assert_equal(id_vox[~mask], -1)
+     ids = nroi.get_id()
+@@ -240,7 +240,7 @@
+     # Test activation
+     wim = nroi.to_image('activation', roi=True, descrip='description')
+     # Sadly, all cast to int
+-    assert_equal(np.unique(wim.get_data().astype(np.int32)), [-1, 3, 4, 5])
++    assert_equal(np.unique(wim.get_fdata().astype(np.int32)), [-1, 3, 4, 5])
+     # end blobs or leaves
+     lroi = nroi.copy()
+     lroi.reduce_to_leaves()
+--- nipy.orig/nipy/labs/spatial_models/tests/test_parcel_io.py
++++ nipy/nipy/labs/spatial_models/tests/test_parcel_io.py
+@@ -19,7 +19,7 @@
+     shape = (10, 10, 10)
+     mask_image = Nifti1Image(np.ones(shape).astype('u1'), np.eye(4))
+     wim = mask_parcellation(mask_image, n_parcels)
+-    assert_equal(np.unique(wim.get_data()), np.arange(n_parcels))
++    assert_equal(np.unique(wim.get_fdata()), np.arange(n_parcels))
+ 
+ 
+ def test_mask_parcel_multi_subj():
+@@ -38,7 +38,7 @@
+             mask_images.append(path)
+ 
+         wim = mask_parcellation(mask_images, n_parcels)
+-        assert_equal(np.unique(wim.get_data()), np.arange(n_parcels))
++        assert_equal(np.unique(wim.get_fdata()), np.arange(n_parcels))
+ 
+ 
+ def test_parcel_intra_from_3d_image():
+--- nipy.orig/nipy/labs/statistical_mapping.py
++++ nipy/nipy/labs/statistical_mapping.py
+@@ -57,11 +57,11 @@
+     """
+     # Masking
+     if len(mask.shape) > 3:
+-        xyz = np.where((mask.get_data() > 0).squeeze())
+-        zmap = zimg.get_data().squeeze()[xyz]
++        xyz = np.where((mask.get_fdata() > 0).squeeze())
++        zmap = zimg.get_fdata().squeeze()[xyz]
+     else:
+-        xyz = np.where(mask.get_data() > 0)
+-        zmap = zimg.get_data()[xyz]
++        xyz = np.where(mask.get_fdata() > 0)
++        zmap = zimg.get_fdata()[xyz]
+ 
+     xyz = np.array(xyz).T
+     nvoxels = np.size(xyz, 0)
+@@ -183,12 +183,12 @@
+     """
+     # Masking
+     if mask is not None:
+-        bmask = mask.get_data().ravel()
+-        data = image.get_data().ravel()[bmask > 0]
++        bmask = mask.get_fdata().ravel()
++        data = image.get_fdata().ravel()[bmask > 0]
+         xyz = np.array(np.where(bmask > 0)).T
+     else:
+         shape = image.shape
+-        data = image.get_data().ravel()
++        data = image.get_fdata().ravel()
+         xyz = np.reshape(np.indices(shape), (3, np.prod(shape))).T
+     affine = get_affine(image)
+ 
+@@ -235,12 +235,12 @@
+      # Compute xyz coordinates from mask
+     xyz = np.array(np.where(mask > 0))
+     # Prepare data & vardata arrays
+-    data = np.array([(d.get_data()[xyz[0], xyz[1], xyz[2]]).squeeze()
++    data = np.array([(d.get_fdata()[xyz[0], xyz[1], xyz[2]]).squeeze()
+                     for d in data_images]).squeeze()
+     if vardata_images is None:
+         vardata = None
+     else:
+-        vardata = np.array([(d.get_data()[xyz[0], xyz[1], xyz[2]]).squeeze()
++        vardata = np.array([(d.get_fdata()[xyz[0], xyz[1], xyz[2]]).squeeze()
+                             for d in vardata_images]).squeeze()
+     return data, vardata, xyz, mask
+ 
+@@ -386,7 +386,7 @@
+             self.xyz = None
+             self.axis = len(data[0].shape) - 1
+         else:
+-            self.xyz = np.where(mask.get_data() > 0)
++            self.xyz = np.where(mask.get_fdata() > 0)
+             self.axis = 1
+ 
+         self.spatial_shape = data[0].shape[0: -1]
+@@ -397,9 +397,9 @@
+             if not isinstance(design_matrix[i], np.ndarray):
+                 raise ValueError('Invalid design matrix')
+             if nomask:
+-                Y = data[i].get_data()
++                Y = data[i].get_fdata()
+             else:
+-                Y = data[i].get_data()[self.xyz]
++                Y = data[i].get_fdata()[self.xyz]
+             X = design_matrix[i]
+ 
+             self.glm.append(glm(Y, X, axis=self.axis,
+--- nipy.orig/nipy/labs/tests/test_mask.py
++++ nipy/nipy/labs/tests/test_mask.py
+@@ -70,7 +70,7 @@
+     with InTemporaryDirectory():
+         # Make a 4D file from the anatomical example
+         img = nib.load(anatfile)
+-        arr = img.get_data()
++        arr = img.get_fdata()
+         a2 = np.zeros(arr.shape + (2, ))
+         a2[:, :, :, 0] = arr
+         a2[:, :, :, 1] = arr
+@@ -122,7 +122,7 @@
+     with InTemporaryDirectory():
+         # Make a 4D file from the anatomical example
+         img = nib.load(anatfile)
+-        arr = img.get_data()
++        arr = img.get_fdata()
+         a2 = np.zeros(arr.shape + (2, ))
+         a2[:, :, :, 0] = arr
+         a2[:, :, :, 1] = arr
+--- nipy.orig/nipy/labs/utils/reproducibility_measures.py
++++ nipy/nipy/labs/utils/reproducibility_measures.py
+@@ -680,9 +680,9 @@
+     group_con = []
+     group_var = []
+     for s in range(nsubj):
+-        group_con.append(load(contrast_images[s]).get_data()[mask])
++        group_con.append(load(contrast_images[s]).get_fdata()[mask])
+         if len(variance_images) > 0:
+-            group_var.append(load(variance_images[s]).get_data()[mask])
++            group_var.append(load(variance_images[s]).get_fdata()[mask])
+ 
+     group_con = np.squeeze(np.array(group_con)).T
+     group_con[np.isnan(group_con)] = 0
+--- nipy.orig/nipy/labs/utils/simul_multisubject_fmri_dataset.py
++++ nipy/nipy/labs/utils/simul_multisubject_fmri_dataset.py
+@@ -192,7 +192,7 @@
+ 
+     if mask is not None:
+         shape = mask.shape
+-        mask_data = mask.get_data()
++        mask_data = mask.get_fdata()
+     else:
+         mask_data = np.ones(shape)
+ 
+@@ -283,7 +283,7 @@
+     if mask is not None:
+         shape = mask.shape
+         affine = get_affine(mask)
+-        mask_data = mask.get_data().astype('bool')
++        mask_data = mask.get_fdata().astype('bool')
+     else:
+         affine = np.eye(4)
+         mask_data = np.ones(shape).astype('bool')
+--- nipy.orig/nipy/labs/utils/tests/test_simul_multisubject_fmri_dataset.py
++++ nipy/nipy/labs/utils/tests/test_simul_multisubject_fmri_dataset.py
+@@ -113,8 +113,8 @@
+     mask = np.random.rand(*shape) > 0.5
+     mask_img = Nifti1Image(mask.astype(np.uint8), np.eye(4))
+     imgs = surrogate_4d_dataset(mask=mask_img)
+-    mean_image  = imgs[0].get_data()[mask].mean()
+-    assert_true((imgs[0].get_data()[mask == 0] < mean_image / 2).all())
++    mean_image  = imgs[0].get_fdata()[mask].mean()
++    assert_true((imgs[0].get_fdata()[mask == 0] < mean_image / 2).all())
+ 
+ 
+ def test_surrogate_array_4d_dmtx():
+--- nipy.orig/nipy/labs/viz_tools/anat_cache.py
++++ nipy/nipy/labs/viz_tools/anat_cache.py
+@@ -76,7 +76,7 @@
+                         'required to plot anatomy, see the nipy documentation '
+                         'installaton section for how to install template files.')
+             anat_im = load(filename)
+-            anat = anat_im.get_data()
++            anat = anat_im.get_fdata()
+             anat = anat.astype(np.float)
+             anat_mask = ndimage.morphology.binary_fill_holes(anat > 0)
+             anat = np.ma.masked_array(anat, np.logical_not(anat_mask))
+--- nipy.orig/nipy/labs/viz_tools/slicers.py
++++ nipy/nipy/labs/viz_tools/slicers.py
+@@ -57,7 +57,7 @@
+ def _xyz_order(map, affine):
+     img = VolumeImg(map, affine=affine, world_space='mine')
+     img = img.xyz_ordered(resample=True, copy=False)
+-    map = img.get_data()
++    map = img.get_fdata()
+     affine = img.affine
+     return map, affine
+ 
+--- nipy.orig/nipy/modalities/fmri/fmri.py
++++ nipy/nipy/modalities/fmri/fmri.py
+@@ -23,7 +23,7 @@
+         images : iterable
+            an iterable object whose items are meant to be images; this is
+            checked by asserting that each has a `coordmap` attribute and a
+-           ``get_data`` method.  Note that Image objects are not iterable by
++           ``get_fdata`` method.  Note that Image objects are not iterable by
+            default; use the ``from_image`` classmethod or ``iter_axis`` function
+            to convert images to image lists - see examples below for the latter.
+         volume_start_times: None or float or (N,) ndarray
+--- nipy.orig/nipy/modalities/fmri/fmristat/model.py
++++ nipy/nipy/modalities/fmri/fmristat/model.py
+@@ -132,7 +132,7 @@
+                  volume_start_times=None):
+         self.fmri_image = fmri_image
+         try:
+-            self.data = fmri_image.get_data()
++            self.data = fmri_image.get_fdata()
+         except AttributeError:
+             self.data = fmri_image.get_list_data(axis=0)
+         self.formula = formula
+@@ -197,7 +197,7 @@
+     formula :  :class:`nipy.algorithms.statistics.formula.Formula`
+     rho : ``Image``
+        image of AR(1) coefficients.  Returning data from
+-       ``rho.get_data()``, and having attribute ``coordmap``
++       ``rho.get_fdata()``, and having attribute ``coordmap``
+     outputs :
+     volume_start_times : 
+     """
+@@ -206,14 +206,14 @@
+                  volume_start_times=None):
+         self.fmri_image = fmri_image
+         try:
+-            self.data = fmri_image.get_data()
++            self.data = fmri_image.get_fdata()
+         except AttributeError:
+             self.data = fmri_image.get_list_data(axis=0)
+         self.formula = formula
+         self.outputs = outputs
+         # Cleanup rho values, truncate them to a scale of 0.01
+         g = copy.copy(rho.coordmap)
+-        rho = rho.get_data()
++        rho = rho.get_fdata()
+         m = np.isnan(rho)
+         r = (np.clip(rho,-1,1) * 100).astype(np.int) / 100.
+         r[m] = np.inf
+@@ -227,7 +227,7 @@
+ 
+         iterable = parcels(self.rho, exclude=[np.inf])
+         def model_params(i):
+-            return (self.rho.get_data()[i].mean(),)
++            return (self.rho.get_fdata()[i].mean(),)
+         # Generates indexer, data, model
+         m = model_generator(self.formula, self.data,
+                             self.volume_start_times,
+--- nipy.orig/nipy/modalities/fmri/fmristat/tests/test_iterables.py
++++ nipy/nipy/modalities/fmri/fmristat/tests/test_iterables.py
+@@ -31,7 +31,7 @@
+ FIMG = load_image(funcfile)
+ # Put time on first axis
+ FIMG = rollimg(FIMG, 't')
+-FDATA = FIMG.get_data()
++FDATA = FIMG.get_fdata()
+ FIL = FmriImageList.from_image(FIMG)
+ 
+ # I think it makes more sense to use FDATA instead of FIL for GLM
+--- nipy.orig/nipy/modalities/fmri/fmristat/tests/test_model.py
++++ nipy/nipy/modalities/fmri/fmristat/tests/test_model.py
+@@ -41,7 +41,7 @@
+         assert_raises(ValueError, moi.__getitem__, 0)
+         new_img = load_image(fname)
+         for i in range(shape[0]):
+-            assert_array_equal(new_img[i].get_data(), i)
++            assert_array_equal(new_img[i].get_fdata(), i)
+         del new_img
+ 
+ 
+@@ -80,17 +80,17 @@
+             ar.execute()
+             f_img = load_image('F_out.nii')
+             assert_equal(f_img.shape, one_vol.shape)
+-            f_data = f_img.get_data()
++            f_data = f_img.get_fdata()
+             assert_true(np.all((f_data>=0) & (f_data<30)))
+             resid_img = load_image('resid_AR_out.nii')
+             assert_equal(resid_img.shape, funcim.shape)
+-            assert_array_almost_equal(np.mean(resid_img.get_data()), 0, 3)
++            assert_array_almost_equal(np.mean(resid_img.get_fdata()), 0, 3)
+             e_img = load_image('T_out_effect.nii')
+             sd_img = load_image('T_out_sd.nii')
+             t_img = load_image('T_out_t.nii')
+-            t_data = t_img.get_data()
++            t_data = t_img.get_fdata()
+             assert_array_almost_equal(t_data,
+-                                      e_img.get_data() / sd_img.get_data())
++                                      e_img.get_fdata() / sd_img.get_fdata())
+             assert_true(np.all(np.abs(t_data) < 6))
+             # Need to delete to help windows delete temporary files
+             del rho, resid_img, f_img, e_img, sd_img, t_img, f_data, t_data
+--- nipy.orig/nipy/modalities/fmri/glm.py
++++ nipy/nipy/modalities/fmri/glm.py
+@@ -461,10 +461,10 @@
+             z_image, = multi_session_model.contrast([np.eye(13)[1]] * 2)
+ 
+             # The number of voxels with p < 0.001 given by ...
+-            print(np.sum(z_image.get_data() > 3.09))
++            print(np.sum(z_image.get_fdata() > 3.09))
+         """
+         # manipulate the arguments
+-        if isinstance(fmri_data, string_types) or hasattr(fmri_data, 'get_data'):
++        if isinstance(fmri_data, string_types) or hasattr(fmri_data, 'get_fdata'):
+             fmri_data = [fmri_data]
+         if isinstance(design_matrices, (string_types, np.ndarray)):
+             design_matrices = [design_matrices]
+@@ -519,16 +519,16 @@
+         """
+         from nibabel import Nifti1Image
+         # get the mask as an array
+-        mask = self.mask.get_data().astype(np.bool)
++        mask = self.mask.get_fdata().astype(np.bool)
+ 
+         self.glms, self.means = [], []
+         for fmri, design_matrix in zip(self.fmri_data, self.design_matrices):
+             if do_scaling:
+                 # scale the data
+-                data, mean = data_scaling(fmri.get_data()[mask].T)
++                data, mean = data_scaling(fmri.get_fdata()[mask].T)
+             else:
+-                data, mean = (fmri.get_data()[mask].T,
+-                              fmri.get_data()[mask].T.mean(0))
++                data, mean = (fmri.get_fdata()[mask].T,
++                              fmri.get_fdata()[mask].T.mean(0))
+             mean_data = mask.astype(np.int16)
+             mean_data[mask] = mean
+             self.means.append(Nifti1Image(mean_data, self.affine))
+@@ -588,7 +588,7 @@
+             contrast_.z_score()
+ 
+         # Prepare the returned images
+-        mask = self.mask.get_data().astype(np.bool)
++        mask = self.mask.get_fdata().astype(np.bool)
+         do_outputs = [output_z, output_stat, output_effects, output_variance]
+         estimates = ['z_score_', 'stat_', 'effect', 'variance']
+         descrips = ['z statistic', 'Statistical value', 'Estimated effect',
+--- nipy.orig/nipy/modalities/fmri/tests/test_fmri.py
++++ nipy/nipy/modalities/fmri/tests/test_fmri.py
+@@ -49,7 +49,7 @@
+     img_shape = img.shape
+     exp_shape = (img_shape[0],) + img_shape[2:]
+     j = 0
+-    for i, d in axis0_generator(img.get_data()):
++    for i, d in axis0_generator(img.get_fdata()):
+         j += 1
+         assert_equal(d.shape, exp_shape)
+         del(i); gc.collect()
+@@ -66,9 +66,9 @@
+ 
+ def test_labels1():
+     img = load_image(funcfile)
+-    data = img.get_data()
+-    parcelmap = Image(img[0].get_data(), AfT('kji', 'zyx', np.eye(4)))
+-    parcelmap = (parcelmap.get_data() * 100).astype(np.int32)
++    data = img.get_fdata()
++    parcelmap = Image(img[0].get_fdata(), AfT('kji', 'zyx', np.eye(4)))
++    parcelmap = (parcelmap.get_fdata() * 100).astype(np.int32)
+     v = 0
+     for i, d in axis0_generator(data, parcels(parcelmap)):
+         v += d.shape[1]
+--- nipy.orig/nipy/modalities/fmri/tests/test_glm.py
++++ nipy/nipy/modalities/fmri/tests/test_glm.py
+@@ -57,7 +57,7 @@
+         multi_session_model.fit()
+         z_image, = multi_session_model.contrast([np.eye(rk)[1]] * 2)
+         assert_array_equal(get_affine(z_image), get_affine(load(mask_file)))
+-        assert_true(z_image.get_data().std() < 3.)
++        assert_true(z_image.get_fdata().std() < 3.)
+         # Delete objects attached to files to avoid WindowsError when deleting
+         # temporary directory
+         del z_image, fmri_files, multi_session_model
+@@ -71,28 +71,28 @@
+     multi_session_model = FMRILinearModel(fmri_data, design_matrices, mask=None)
+     multi_session_model.fit()
+     z_image, = multi_session_model.contrast([np.eye(rk)[1]] * 2)
+-    assert_equal(np.sum(z_image.get_data() == 0), 0)
++    assert_equal(np.sum(z_image.get_fdata() == 0), 0)
+ 
+     # compute the mask
+     multi_session_model = FMRILinearModel(fmri_data, design_matrices,
+                                           m=0, M=.01, threshold=0.)
+     multi_session_model.fit()
+     z_image, = multi_session_model.contrast([np.eye(rk)[1]] * 2)
+-    assert_true(z_image.get_data().std() < 3. )
++    assert_true(z_image.get_fdata().std() < 3. )
+ 
+     # with mask
+     multi_session_model = FMRILinearModel(fmri_data, design_matrices, mask)
+     multi_session_model.fit()
+     z_image, effect_image, variance_image= multi_session_model.contrast(
+         [np.eye(rk)[:2]] * 2, output_effects=True, output_variance=True)
+-    assert_array_equal(z_image.get_data() == 0., load(mask).get_data() == 0.)
++    assert_array_equal(z_image.get_fdata() == 0., load(mask).get_fdata() == 0.)
+     assert_true(
+-        (variance_image.get_data()[load(mask).get_data() > 0, 0] > .001).all())
++        (variance_image.get_fdata()[load(mask).get_fdata() > 0, 0] > .001).all())
+ 
+     # without scaling
+     multi_session_model.fit(do_scaling=False)
+     z_image, = multi_session_model.contrast([np.eye(rk)[1]] * 2)
+-    assert_true(z_image.get_data().std() < 3. )
++    assert_true(z_image.get_fdata().std() < 3. )
+ 
+ 
+ def test_high_level_glm_contrasts():
+@@ -104,8 +104,8 @@
+                                             contrast_type='tmin-conjunction')
+     z1, = multi_session_model.contrast([np.eye(rk)[:1]] * 2)
+     z2, = multi_session_model.contrast([np.eye(rk)[1:2]] * 2)
+-    assert_true((z_image.get_data() < np.maximum(
+-        z1.get_data(), z2.get_data())).all())
++    assert_true((z_image.get_fdata() < np.maximum(
++        z1.get_fdata(), z2.get_fdata())).all())
+ 
+ 
+ def test_high_level_glm_null_contrasts():
+@@ -120,7 +120,7 @@
+     single_session_model.fit()
+     z1, = multi_session_model.contrast([np.eye(rk)[:1], np.zeros((1, rk))])
+     z2, = single_session_model.contrast([np.eye(rk)[:1]])
+-    np.testing.assert_almost_equal(z1.get_data(), z2.get_data())
++    np.testing.assert_almost_equal(z1.get_fdata(), z2.get_fdata())
+ 
+ 
+ def ols_glm(n=100, p=80, q=10):
+@@ -336,7 +336,7 @@
+     multi_session_model.fit()
+     z_image, = multi_session_model.contrast([np.eye(13)[1]] * 2)
+     # Check number of voxels with p < 0.001
+-    assert_equal(np.sum(z_image.get_data() > 3.09), 671)
++    assert_equal(np.sum(z_image.get_fdata() > 3.09), 671)
+ 
+ 
+ if __name__ == "__main__":
+--- nipy.orig/nipy/tests/test_scripts.py
++++ nipy/nipy/tests/test_scripts.py
+@@ -142,7 +142,7 @@
+         cmd = ['nipy_3dto4d'] + imgs_3d  + ['--out-4d=' + out_4d]
+         run_command(cmd)
+         fimg_back = load_image(out_4d)
+-        assert_almost_equal(fimg.get_data(), fimg_back.get_data())
++        assert_almost_equal(fimg.get_fdata(), fimg_back.get_fdata())
+         del fimg_back
+ 
+ 
+--- nipy.orig/nipy/algorithms/statistics/models/model.py
++++ nipy/nipy/algorithms/statistics/models/model.py
+@@ -6,7 +6,7 @@
+ 
+ from scipy.stats import t as t_distribution
+ 
+-from nibabel.onetime import setattr_on_read
++from nibabel.onetime import auto_attr
+ 
+ from ...utils.matrices import pos_recipr
+ 
+@@ -121,14 +121,14 @@
+         # put this as a parameter of LikelihoodModel
+         self.df_resid = self.df_total - self.df_model
+ 
+-    @setattr_on_read
++    @auto_attr
+     def logL(self):
+         """
+         The maximized log-likelihood
+         """
+         return self.model.logL(self.theta, self.Y, nuisance=self.nuisance)
+ 
+-    @setattr_on_read
++    @auto_attr
+     def AIC(self):
+         """
+         Akaike Information Criterion
+@@ -136,7 +136,7 @@
+         p = self.theta.shape[0]
+         return -2 * self.logL + 2 * p
+ 
+-    @setattr_on_read
++    @auto_attr
+     def BIC(self):
+         """
+         Schwarz's Bayesian Information Criterion
+--- nipy.orig/nipy/algorithms/statistics/models/regression.py
++++ nipy/nipy/algorithms/statistics/models/regression.py
+@@ -30,7 +30,7 @@
+ from scipy import stats
+ import scipy.linalg as spl
+ 
+-from nibabel.onetime import setattr_on_read
++from nibabel.onetime import auto_attr
+ 
+ from nipy.algorithms.utils.matrices import matrix_rank, pos_recipr
+ 
+@@ -262,7 +262,7 @@
+         """
+         return X
+ 
+-    @setattr_on_read
++    @auto_attr
+     def has_intercept(self):
+         """
+         Check if column of 1s is in column space of design
+@@ -274,7 +274,7 @@
+             return True
+         return False
+ 
+-    @setattr_on_read
++    @auto_attr
+     def rank(self):
+         """ Compute rank of design matrix
+         """
+@@ -715,14 +715,14 @@
+         self.wY = wY
+         self.wresid = wresid
+ 
+-    @setattr_on_read
++    @auto_attr
+     def resid(self):
+         """
+         Residuals from the fit.
+         """
+         return self.Y - self.predicted
+ 
+-    @setattr_on_read
++    @auto_attr
+     def norm_resid(self):
+         """
+         Residuals, normalized to have unit length.
+@@ -742,7 +742,7 @@
+         """
+         return self.resid * pos_recipr(np.sqrt(self.dispersion))
+ 
+-    @setattr_on_read
++    @auto_attr
+     def predicted(self):
+         """ Return linear predictor values from a design matrix.
+         """
+@@ -751,7 +751,7 @@
+         X = self.model.design
+         return np.dot(X, beta)
+ 
+-    @setattr_on_read
++    @auto_attr
+     def R2_adj(self):
+         """Return the R^2 value for each row of the response Y.
+ 
+@@ -768,7 +768,7 @@
+         d *= ((self.df_total - 1.) / self.df_resid)
+         return 1 - d
+ 
+-    @setattr_on_read
++    @auto_attr
+     def R2(self):
+         """
+         Return the adjusted R^2 value for each row of the response Y.
+@@ -782,7 +782,7 @@
+         d = self.SSE / self.SST
+         return 1 - d
+ 
+-    @setattr_on_read
++    @auto_attr
+     def SST(self):
+         """Total sum of squares. If not from an OLS model this is "pseudo"-SST.
+         """
+@@ -791,34 +791,34 @@
+                           "SST inappropriate")
+         return ((self.wY - self.wY.mean(0)) ** 2).sum(0)
+ 
+-    @setattr_on_read
++    @auto_attr
+     def SSE(self):
+         """Error sum of squares. If not from an OLS model this is "pseudo"-SSE.
+         """
+         return (self.wresid ** 2).sum(0)
+ 
+-    @setattr_on_read
++    @auto_attr
+     def SSR(self):
+         """ Regression sum of squares """
+         return self.SST - self.SSE
+ 
+-    @setattr_on_read
++    @auto_attr
+     def MSR(self):
+         """ Mean square (regression)"""
+         return self.SSR / (self.df_model - 1)
+ 
+-    @setattr_on_read
++    @auto_attr
+     def MSE(self):
+         """ Mean square (error) """
+         return self.SSE / self.df_resid
+ 
+-    @setattr_on_read
++    @auto_attr
+     def MST(self):
+         """ Mean square (total)
+         """
+         return self.SST / (self.df_total - 1)
+ 
+-    @setattr_on_read
++    @auto_attr
+     def F_overall(self):
+         """ Overall goodness of fit F test,
+         comparing model to a model with just an intercept.
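The `setattr_on_read` → `auto_attr` switch above is a rename of nibabel's lazy-attribute decorator; its behaviour matches the standard library's `functools.cached_property`. A minimal stdlib-only sketch of the pattern (the class and values are hypothetical, not from nipy):

```python
from functools import cached_property

class FitResults:
    """Hypothetical stand-in for a nipy results object."""
    def __init__(self, SSE, df_resid):
        self.SSE = SSE
        self.df_resid = df_resid

    @cached_property
    def MSE(self):
        # Computed once on first access, then cached on the instance --
        # the same behaviour as nibabel's auto_attr / setattr_on_read.
        return self.SSE / self.df_resid

r = FitResults(SSE=10.0, df_resid=5)
assert r.MSE == 2.0
assert "MSE" in r.__dict__  # cached after first access
```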


=====================================
debian/patches/remove-imagefileerror.patch
=====================================
@@ -0,0 +1,38 @@
+Description: remove check for ImageFileError.
+ This error class completely disappeared from the library it is imported from.
+ Judging from the comment around the code, it was only in use for legacy
+ support of said library, so its removal shouldn't be a problem in the Debian context.
+ .
+ This patch contributes to resolve partially nipy test failures at build time.
+Author: Étienne Mollier <emollier at debian.org>
+Bug-Debian: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1042053
+Forwarded: no
+Last-Update: 2023-08-17
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+--- nipy.orig/nipy/io/tests/test_image_io.py
++++ nipy/nipy/io/tests/test_image_io.py
+@@ -6,7 +6,7 @@
+ import numpy as np
+ 
+ import nibabel as nib
+-from nibabel.spatialimages import ImageFileError, HeaderDataError
++from nibabel.spatialimages import HeaderDataError
+ from nibabel import Nifti1Header
+ 
+ from ..api import load_image, save_image, as_image
+@@ -41,13 +41,11 @@
+ 
+ def test_badfile():
+     filename = "bad_file.foo"
+-    # nibabel prior 2.1.0 was throwing a ImageFileError for the not-recognized
+-    # file type.  >=2.1.0 give a FileNotFoundError.
+     try:
+         from numpy.compat.py3k import FileNotFoundError
+     except ImportError:
+         FileNotFoundError = IOError
+-    assert_raises((ImageFileError, FileNotFoundError), load_image, filename)
++    assert_raises(FileNotFoundError, load_image, filename)
+ 
+ 
+ @if_templates
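The fallback import from `numpy.compat.py3k` in the hunk above only matters on Python 2: on Python 3, `FileNotFoundError` is a builtin `OSError` subclass, which is what loading a missing, unrecognized path now surfaces. A stdlib-only illustration (the filename is made up):

```python
# FileNotFoundError is a builtin on Python 3 (subclass of OSError),
# so the numpy.compat.py3k fallback is only a Python 2 shim.
try:
    open("definitely_missing_file.foo")
    caught = None
except FileNotFoundError as exc:
    caught = exc

assert isinstance(caught, FileNotFoundError)
assert issubclass(FileNotFoundError, OSError)
```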


=====================================
debian/patches/series
=====================================
@@ -3,3 +3,6 @@ local-mathjax.patch
 numpy-fix.patch
 numpy1.24.patch
 nibabel5.0.0.patch
+nibabel5.1.0.patch
+standard-gifty-support.patch
+remove-imagefileerror.patch


=====================================
debian/patches/standard-gifty-support.patch
=====================================
@@ -0,0 +1,23 @@
+Description: explicitly cast data format to gifti.
+ The newer GiftiDataArray works only with float32 (and various integer
+ types) but receives float64 matrices from NumPy, so it is necessary to
+ indicate the datatype explicitly to the constructor.
+ .
+ This patch contributes to resolve nipy test failures at package build time.
+Author: Étienne Mollier <emollier at debian.org>
+Bug-Debian: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1042053
+Forwarded: no
+Last-Update: 2023-08-17
+---
+This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
+--- nipy.orig/nipy/labs/spatial_models/tests/test_discrete_domain.py
++++ nipy/nipy/labs/spatial_models/tests/test_discrete_domain.py
+@@ -227,7 +227,7 @@
+                             [0, 1, 3],
+                             [0, 2, 3],
+                             [1, 2, 3]])
+-    darrays = [nbg.GiftiDataArray(coords)] + [nbg.GiftiDataArray(triangles)]
++    darrays = [nbg.GiftiDataArray(coords, datatype='float32')] + [nbg.GiftiDataArray(triangles, datatype='float32')]
+     toy_image = nbg.GiftiImage(darrays=darrays)
+     domain = domain_from_mesh(toy_image)
+     # if we get there, we could build the domain, and that's what we wanted.
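The explicit `datatype=` argument in the hunk above matters because Python (and NumPy) default to double precision, while the GIFTI data array is stored as float32. A stdlib-only sketch of the precision difference such a cast implies (no nibabel required):

```python
import struct

# Round-trip a double through 32-bit float storage, as writing
# a GIFTI float32 data array effectively does.
x = 0.1                                          # Python float = float64
x32 = struct.unpack("f", struct.pack("f", x))[0]

assert x32 != x                  # float32 rounds 0.1 differently
assert abs(x32 - x) < 1e-7       # but the error is tiny
```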



View it on GitLab: https://salsa.debian.org/med-team/nipy/-/compare/12a4fbea8c99c1e5dc07ee81bc3da1a450617050...ce3ea4a238fa50b95648598a1074ab039cfa74f7


