# iris.cube¶

Classes for representing multi-dimensional data with metadata.

In this module:

A single Iris cube of data and metadata.

Typically obtained from iris.load(), iris.load_cube(), iris.load_cubes(), or from the manipulation of existing cubes.

For example:

>>> import iris
>>> cube = iris.load_cube(iris.sample_data_path('air_temp.pp'))
>>> print(cube)
air_temperature / (K)               (latitude: 73; longitude: 96)
Dimension coordinates:
latitude                           x              -
longitude                          -              x
Scalar coordinates:
forecast_period: 6477 hours, bound=(-28083.0, 6477.0) hours
forecast_reference_time: 1998-03-01 03:00:00
pressure: 1000.0 hPa
time: 1998-12-01 00:00:00, bound=(1994-12-01 00:00:00, 1998-12-01 00:00:00)
Attributes:
STASH: m01s16i203
source: Data from Met Office Unified Model
Cell methods:
mean within years: time
mean over years: time


class iris.cube.Cube(data, standard_name=None, long_name=None, var_name=None, units=None, attributes=None, cell_methods=None, dim_coords_and_dims=None, aux_coords_and_dims=None, aux_factories=None, cell_measures_and_dims=None)

Bases: iris._cube_coord_common.CFVariableMixin

Creates a cube with data and optional metadata.

Not typically used - normally cubes are obtained by loading data (e.g. iris.load()) or from manipulating existing cubes.

Args:

• data
This object defines the shape of the cube and the phenomenon value in each cell. It can be a NumPy array, a NumPy masked array, or a lazy (dask) array.

Kwargs:

• standard_name
The standard name for the Cube’s data.
• long_name
An unconstrained description of the cube.
• var_name
The netCDF variable name for the cube.
• units
The unit of the cube, e.g. "m s-1" or "kelvin".
• attributes
A dictionary of cube attributes.
• cell_methods
A tuple of CellMethod objects, generally set by Iris, e.g. (CellMethod("mean", coords='latitude'), ).
• dim_coords_and_dims
A list of coordinates with scalar dimension mappings, e.g. [(lat_coord, 0), (lon_coord, 1)].
• aux_coords_and_dims
A list of coordinates with dimension mappings, e.g. [(lat_coord, 0), (lon_coord, (0, 1))]. See also Cube.add_dim_coord() and Cube.add_aux_coord().
• aux_factories
A list of auxiliary coordinate factories. See iris.aux_factory.
• cell_measures_and_dims
A list of CellMeasures with dimension mappings.
For example::
>>> import numpy as np
>>> from iris.coords import DimCoord
>>> from iris.cube import Cube
>>> latitude = DimCoord(np.linspace(-90, 90, 4),
...                     standard_name='latitude',
...                     units='degrees')
>>> longitude = DimCoord(np.linspace(45, 360, 8),
...                      standard_name='longitude',
...                      units='degrees')
>>> cube = Cube(np.zeros((4, 8), np.float32),
...             dim_coords_and_dims=[(latitude, 0),
...                                  (longitude, 1)])

add_aux_coord(coord, data_dims=None)

Adds a CF auxiliary coordinate to the cube.

Args:

• coord
The iris.coords.DimCoord or iris.coords.AuxCoord instance to add to the cube.

Kwargs:

• data_dims
Integer or iterable of integers giving the data dimensions spanned by the coordinate.

Raises a ValueError if a coordinate with identical metadata already exists on the cube.

See also Cube.remove_coord().

add_aux_factory(aux_factory)

Adds an auxiliary coordinate factory to the cube.

Args:

• aux_factory
The iris.aux_factory.AuxCoordFactory instance to add to the cube.

add_cell_measure(cell_measure, data_dims=None)

Adds a CF cell measure to the cube.

Args:

• cell_measure
The CellMeasure instance to add to the cube.

Kwargs:

• data_dims
Integer or iterable of integers giving the data dimensions spanned by the cell measure.

Raises a ValueError if a cell_measure with identical metadata already exists on the cube.

See also Cube.remove_cell_measure().

add_cell_method(cell_method)

Add a CellMethod to the Cube.

add_dim_coord(dim_coord, data_dim)

Add a CF coordinate to the cube.

Args:

• dim_coord
The iris.coords.DimCoord instance to add to the cube.
• data_dim
Integer giving the data dimension spanned by the coordinate.

Raises a ValueError if a coordinate with identical metadata already exists on the cube, or if a coordinate already exists for the given dimension.

See also Cube.remove_coord().

aggregated_by(coords, aggregator, **kwargs)

Perform aggregation over the cube given one or more “group coordinates”.

A “group coordinate” is a coordinate where repeating values represent a single group, such as a month coordinate on a daily time slice. Repeated values will form a group even if they are not consecutive.

The group coordinates must all be over the same cube dimension. Each common value group identified over all the group-by coordinates is collapsed using the provided aggregator.
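The grouping behaviour described above can be sketched with plain NumPy (a conceptual illustration, not Iris itself): values sharing a group-coordinate value are collapsed together by the aggregator, even when the repeats are not consecutive.

```python
import numpy as np

# Daily-style data with a repeating "month" group coordinate; note the
# group values are interleaved, not consecutive.
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
month = np.array(['jan', 'feb', 'jan', 'feb', 'jan', 'feb'])

# One aggregated value per distinct group value, here using a mean.
grouped_means = {m: data[month == m].mean() for m in np.unique(month)}
```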

Args:

• coords (string, coord or a list of strings/coords):
The group coordinate(s) over which to aggregate.
• aggregator (iris.analysis.Aggregator):
Aggregator to be applied to each group.

Kwargs:

• kwargs:
Aggregator and aggregation function keyword arguments.
Returns:
iris.cube.Cube.

Note

This operation does not yet have support for lazy evaluation.

For example:

>>> import iris
>>> import iris.analysis
>>> import iris.coord_categorisation as cat
>>> fname = iris.sample_data_path('ostia_monthly.nc')
>>> cube = iris.load_cube(fname, 'surface_temperature')
>>> cat.add_year(cube, 'time', name='year')
>>> new_cube = cube.aggregated_by('year', iris.analysis.MEAN)
>>> print(new_cube)
surface_temperature / (K)           (time: 5; latitude: 18; longitude: 432)
Dimension coordinates:
time                           x            -              -
latitude                       -            x              -
longitude                      -            -              x
Auxiliary coordinates:
forecast_reference_time        x            -              -
year                           x            -              -
Scalar coordinates:
forecast_period: 0 hours
Attributes:
Conventions: CF-1.5
STASH: m01s00i024
Cell methods:
mean: month, year
mean: year

aux_factory(name=None, standard_name=None, long_name=None, var_name=None)

Returns the single coordinate factory that matches the criteria, or raises an error if not found.

Kwargs:

• name
If not None, matches against factory.name().
• standard_name
The CF standard name of the desired coordinate factory. If None, does not check for standard name.
• long_name
An unconstrained description of the coordinate factory. If None, does not check for long_name.
• var_name
The netCDF variable name of the desired coordinate factory. If None, does not check for var_name.

Note

If the arguments given do not result in precisely 1 coordinate factory being matched, an iris.exceptions.CoordinateNotFoundError is raised.

cell_measure(name_or_cell_measure=None)

Return a single cell_measure given the same arguments as Cube.cell_measures().

Note

If the arguments given do not result in precisely 1 cell_measure being matched, an iris.exceptions.CellMeasureNotFoundError is raised.

See Cube.cell_measures() for full keyword documentation.

cell_measure_dims(cell_measure)

Returns a tuple of the data dimensions relevant to the given CellMeasure.

• cell_measure
The CellMeasure to look for.
cell_measures(name_or_cell_measure=None)

Return a list of cell measures in this cube fitting the given criteria.

Kwargs:

• name_or_cell_measure
Either (a) a standard_name, long_name, or var_name which is compared against each cell measure’s name(), or (b) a cell measure instance with metadata equal to that of the desired cell measures. If None, all cell measures are returned.

See also Cube.cell_measure().

collapsed(coords, aggregator, **kwargs)

Collapse one or more dimensions over the cube given the coordinate/s and an aggregation.

Examples of aggregations that may be used include COUNT and MAX.

Weighted aggregations (iris.analysis.WeightedAggregator) may also be supplied. These include MEAN and SUM.

Weighted aggregations support an optional weights keyword argument. If set, this should be supplied as an array of weights whose shape matches the cube. Values for latitude-longitude area weights may be calculated using iris.analysis.cartography.area_weights().
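The weighted-collapse arithmetic can be sketched with plain NumPy (a conceptual illustration, not Iris itself): collapsing one axis with a weighted mean, where the weights array matches the data shape as the `weights` keyword expects.

```python
import numpy as np

# Data and same-shaped weights, as the 'weights' keyword expects.
data = np.array([[1.0, 2.0],
                 [3.0, 4.0]])
weights = np.array([[3.0, 3.0],
                    [1.0, 1.0]])

# Weighted mean over the collapsed dimension (axis 0).
collapsed = np.average(data, axis=0, weights=weights)
```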

Some Iris aggregators support “lazy” evaluation, meaning that cubes resulting from this method may represent data arrays which are not computed until the data is requested (e.g. via cube.data or iris.save). If lazy evaluation exists for the given aggregator it will be used wherever possible when this cube’s data is itself a deferred array.

Args:

• coords (string, coord or a list of strings/coords):
Coordinate names/coordinates over which the cube should be collapsed.
• aggregator (iris.analysis.Aggregator):
Aggregator to be applied for collapse operation.

Kwargs:

• kwargs:
Aggregation function keyword arguments.
Returns:
Collapsed cube.

For example:

>>> import iris
>>> import iris.analysis
>>> path = iris.sample_data_path('ostia_monthly.nc')
>>> cube = iris.load_cube(path, 'surface_temperature')
>>> new_cube = cube.collapsed('longitude', iris.analysis.MEAN)
>>> print(new_cube)
surface_temperature / (K)           (time: 54; latitude: 18)
Dimension coordinates:
time                           x             -
latitude                       -             x
Auxiliary coordinates:
forecast_reference_time        x             -
Scalar coordinates:
forecast_period: 0 hours
longitude: 180.0 degrees, bound=(0.0, 360.0) degrees
Attributes:
Conventions: CF-1.5
STASH: m01s00i024
Cell methods:
mean: month, year
mean: longitude


Note

Some aggregations are not commutative and hence the order of processing is important i.e.:

tmp = cube.collapsed('realization', iris.analysis.VARIANCE)
result = tmp.collapsed('height', iris.analysis.VARIANCE)


is not necessarily the same result as:

tmp = cube.collapsed('height', iris.analysis.VARIANCE)
result2 = tmp.collapsed('realization', iris.analysis.VARIANCE)


Conversely operations which operate on more than one coordinate at the same time are commutative as they are combined internally into a single operation. Hence the order of the coordinates supplied in the list does not matter:

cube.collapsed(['longitude', 'latitude'],
iris.analysis.VARIANCE)


is the same (apart from the logically equivalent cell methods that may be created etc.) as:

cube.collapsed(['latitude', 'longitude'],
iris.analysis.VARIANCE)

convert_units(unit)

Change the cube’s units, converting the values in the data array.

For example, if a cube’s units are kelvin then:

cube.convert_units('celsius')


will change the cube’s units attribute to celsius and subtract 273.15 from each value in data.
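The value transformation described above is simple arithmetic; a small sketch of the kelvin to celsius case (an illustration of the conversion applied per value, not Iris itself):

```python
# convert_units('celsius') on kelvin data subtracts 273.15 from each
# value in the data array, alongside updating the units metadata.
kelvin = [273.15, 290.0, 300.0]
celsius = [k - 273.15 for k in kelvin]
```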

Warning

Calling this method will trigger any deferred loading, causing the cube’s data array to be loaded into memory.

coord(name_or_coord=None, standard_name=None, long_name=None, var_name=None, attributes=None, axis=None, contains_dimension=None, dimensions=None, coord_system=None, dim_coords=None)

Return a single coord given the same arguments as Cube.coords().

Note

If the arguments given do not result in precisely 1 coordinate being matched, an iris.exceptions.CoordinateNotFoundError is raised.

See Cube.coords() for full keyword documentation.

coord_dims(coord)

Returns a tuple of the data dimensions relevant to the given coordinate.

When searching for the given coordinate in the cube the comparison is made using coordinate metadata equality. Hence the given coordinate instance need not exist on the cube, and may contain different coordinate values.

Args:

• coord (string or coord)
The (name of the) coord to look for.
coord_system(spec=None)

Find the coordinate system of the given type.

If no target coordinate system is provided then find any available coordinate system.

Kwargs:

• spec:

The name or type of a coordinate system subclass. E.g.

cube.coord_system("GeogCS")
cube.coord_system(iris.coord_systems.GeogCS)


If spec is provided as a type it can be a superclass of any coordinate system found.

If spec is None, then find any available coordinate systems within the iris.cube.Cube.

Returns:
The iris.coord_systems.CoordSystem or None.
coords(name_or_coord=None, standard_name=None, long_name=None, var_name=None, attributes=None, axis=None, contains_dimension=None, dimensions=None, coord_system=None, dim_coords=None)

Return a list of coordinates in this cube fitting the given criteria.

Kwargs:

• name_or_coord

Either

(a) a standard_name, long_name, or var_name. Defaults to value of default (which itself defaults to unknown) as defined in iris._cube_coord_common.CFVariableMixin.

(b) a coordinate instance with metadata equal to that of the desired coordinates. Accepts either a iris.coords.DimCoord, iris.coords.AuxCoord, iris.aux_factory.AuxCoordFactory or iris.coords.CoordDefn.

• standard_name

The CF standard name of the desired coordinate. If None, does not check for standard name.

• long_name

An unconstrained description of the coordinate. If None, does not check for long_name.

• var_name

The netCDF variable name of the desired coordinate. If None, does not check for var_name.

• attributes

A dictionary of attributes desired on the coordinates. If None, does not check for attributes.

• axis

The desired coordinate axis, see iris.util.guess_coord_axis(). If None, does not check for axis. Accepts the values ‘X’, ‘Y’, ‘Z’ and ‘T’ (case-insensitive).

• contains_dimension

The desired coordinate contains the data dimension. If None, does not check for the dimension.

• dimensions

The exact data dimensions of the desired coordinate. Coordinates with no data dimension can be found with an empty tuple or list (i.e. () or []). If None, does not check for dimensions.

• coord_system

Whether the desired coordinates have coordinate systems equal to the given coordinate system. If None, no check is done.

• dim_coords

Set to True to only return coordinates that are the cube’s dimension coordinates. Set to False to only return coordinates that are the cube’s auxiliary and derived coordinates. If None, returns all coordinates.

See also Cube.coord().

copy(data=None)

Returns a deep copy of this cube.

Kwargs:

• data:
Replace the data of the cube copy with provided data payload.
Returns:
A copy instance of the Cube.
core_data()

Retrieve the data array of this Cube in its current state, which will either be real or lazy.

If this Cube has lazy data, accessing its data array via this method will not realise the data array. This means you can perform operations using this method that work equivalently on real or lazy data, and will maintain lazy data if present.

extract(constraint)

Filter the cube by the given constraint using the iris.Constraint.extract() method.

has_lazy_data()

Details whether this Cube has lazy data.

Returns:
Boolean.
interpolate(sample_points, scheme, collapse_scalar=True)

Interpolate from this Cube to the given sample points using the given interpolation scheme.

Args:

• sample_points:

A sequence of (coordinate, points) pairs over which to interpolate. The values for coordinates that correspond to dates or times may optionally be supplied as datetime.datetime or cftime.datetime instances.

• scheme:

The type of interpolation to use to interpolate from this Cube to the given sample points. The interpolation schemes currently available in Iris are:

• iris.analysis.Linear, and
• iris.analysis.Nearest.

Kwargs:

• collapse_scalar:
Whether to collapse the dimension of scalar sample points in the resulting cube. Default is True.
Returns:
A cube interpolated at the given sample points. If collapse_scalar is True then the dimensionality of the cube will be the number of original cube dimensions minus the number of scalar coordinates.

For example:

>>> import datetime
>>> import iris
>>> path = iris.sample_data_path('uk_hires.pp')
>>> cube = iris.load_cube(path, 'air_potential_temperature')
>>> print(cube.summary(shorten=True))
air_potential_temperature / (K)     (time: 3; model_level_number: 7; grid_latitude: 204; grid_longitude: 187)
>>> print(cube.coord('time'))
DimCoord([2009-11-19 10:00:00, 2009-11-19 11:00:00, 2009-11-19 12:00:00], standard_name='time', calendar='gregorian')
>>> print(cube.coord('time').points)
[349618. 349619. 349620.]
>>> samples = [('time', 349618.5)]
>>> result = cube.interpolate(samples, iris.analysis.Linear())
>>> print(result.summary(shorten=True))
air_potential_temperature / (K)     (model_level_number: 7; grid_latitude: 204; grid_longitude: 187)
>>> print(result.coord('time'))
DimCoord([2009-11-19 10:30:00], standard_name='time', calendar='gregorian')
>>> print(result.coord('time').points)
[349618.5]
>>> # For datetime-like coordinates, we can also use
>>> # datetime-like objects.
>>> samples = [('time', datetime.datetime(2009, 11, 19, 10, 30))]
>>> result2 = cube.interpolate(samples, iris.analysis.Linear())
>>> print(result2.summary(shorten=True))
air_potential_temperature / (K)     (model_level_number: 7; grid_latitude: 204; grid_longitude: 187)
>>> print(result2.coord('time'))
DimCoord([2009-11-19 10:30:00], standard_name='time', calendar='gregorian')
>>> print(result2.coord('time').points)
[349618.5]
>>> print(result == result2)
True

intersection(*args, **kwargs)

Return the intersection of the cube with specified coordinate ranges.

Coordinate ranges can be specified as:

1. instances of iris.coords.CoordExtent.
2. keyword arguments, where the keyword name specifies the name of the coordinate (as defined in iris.cube.Cube.coords()) and the value defines the corresponding range of coordinate values as a tuple. The tuple must contain two, three, or four items corresponding to: (minimum, maximum, min_inclusive, max_inclusive). Where the items are defined as:
• minimum
The minimum value of the range to select.
• maximum
The maximum value of the range to select.
• min_inclusive
If True, coordinate values equal to minimum will be included in the selection. Default is True.
• max_inclusive
If True, coordinate values equal to maximum will be included in the selection. Default is True.

To perform an intersection that ignores any bounds on the coordinates, set the optional keyword argument ignore_bounds to True. Defaults to False.
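The range selection over a "circular" coordinate can be sketched with plain NumPy (a conceptual illustration, not Iris itself): points are wrapped, modulo the coordinate's period, into the requested window before the range test is applied.

```python
import numpy as np

# Longitude-like points on a circular coordinate with modulus 360.
points = np.arange(0.0, 360.0, 30.0)       # 0, 30, ..., 330
minimum, maximum = -90.0, 90.0

# Map every point into the window starting at 'minimum', then select
# the points falling inside [minimum, maximum] (both ends inclusive,
# matching the min_inclusive/max_inclusive defaults).
wrapped = (points - minimum) % 360.0 + minimum
selected = np.sort(wrapped[(wrapped >= minimum) & (wrapped <= maximum)])
```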

Note

For ranges defined over “circular” coordinates (i.e. those where the units attribute has a modulus defined) the cube will be “rolled” to fit where necessary.

Warning

Currently this routine only works with “circular” coordinates (as defined in the previous note).

For example:

>>> import iris
>>> print(cube.coord('longitude').points[::10])
[   0.           37.49999237   74.99998474  112.49996948  149.99996948
187.49995422  224.99993896  262.49993896  299.99993896  337.49990845]
>>> subset = cube.intersection(longitude=(30, 50))
>>> print(subset.coord('longitude').points)
[ 33.74999237  37.49999237  41.24998856  44.99998856  48.74998856]
>>> subset = cube.intersection(longitude=(-10, 10))
>>> print(subset.coord('longitude').points)
[-7.50012207 -3.75012207  0.          3.75        7.5       ]

Returns:
A new Cube giving the subset of the cube which intersects with the requested coordinate intervals.
is_compatible(other, ignore=None)

Return whether the cube is compatible with another.

Compatibility is determined by comparing iris.cube.Cube.name(), iris.cube.Cube.units, iris.cube.Cube.cell_methods and iris.cube.Cube.attributes that are present in both objects.

Args:

• other (an instance of iris.cube.Cube or iris.cube.CubeMetadata):
The object to compare with.

Kwargs:

• ignore
A single attribute key or iterable of attribute keys to ignore any differences in.
Returns:
Boolean.

Note

This function does not indicate whether the two cubes can be merged, instead it checks only the four items quoted above for equality. Determining whether two cubes will merge requires additional logic that is beyond the scope of this method.
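The comparison rule can be sketched in plain Python (a conceptual illustration, not Iris itself): only the name, units, cell methods, and the attribute keys present in *both* objects are compared, and merge-ability is not considered.

```python
# Dicts stand in for cubes; only shared attribute keys are compared.
def compatible(a, b):
    if (a['name'], a['units'], a['cell_methods']) != \
       (b['name'], b['units'], b['cell_methods']):
        return False
    shared = set(a['attributes']) & set(b['attributes'])
    return all(a['attributes'][k] == b['attributes'][k] for k in shared)

x = {'name': 'air_temperature', 'units': 'K', 'cell_methods': (),
     'attributes': {'source': 'model', 'history': 'run 1'}}
y = {'name': 'air_temperature', 'units': 'K', 'cell_methods': (),
     'attributes': {'source': 'model'}}   # 'history' only on x: ignored
```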

lazy_data()

Return a “lazy array” representing the Cube data. A lazy array describes an array whose data values have not been loaded into memory from disk.

Accessing this method will never cause the Cube data to be loaded. Similarly, calling methods on, or indexing, the returned Array will not cause the Cube data to be loaded.

If the Cube data have already been loaded (for example by calling data()), the returned Array will be a view of the loaded cube data represented as a lazy array object. Note that this does _not_ make the Cube data lazy again; the Cube data remains loaded in memory.

Returns:
A lazy array, representing the Cube data.
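The real-versus-lazy distinction can be sketched with a minimal stand-in class (Iris actually uses dask arrays; this illustration only shows the deferral behaviour): the computation runs on first access and the realised result is then kept.

```python
# A toy lazy container: computes its values only when first accessed.
class LazyArray:
    def __init__(self, compute):
        self._compute = compute
        self._real = None

    @property
    def data(self):
        if self._real is None:        # realise on first access only
            self._real = self._compute()
        return self._real

    def has_lazy_data(self):
        return self._real is None

arr = LazyArray(lambda: [1, 2, 3])
```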
name(default='unknown')

Returns a human-readable name. First it tries standard_name, then long_name, then var_name, then the STASH attribute, before falling back to the value of default (which itself defaults to ‘unknown’).

regrid(grid, scheme)

Regrid this Cube on to the given target grid using the given regridding scheme.

Args:

• grid:

A Cube that defines the target grid.

• scheme:

The type of regridding to use to regrid this cube onto the target grid. The regridding schemes currently available in Iris are:

• iris.analysis.Linear,
• iris.analysis.Nearest,
• iris.analysis.AreaWeighted, and
• iris.analysis.UnstructuredNearest.

Returns:
A cube defined with the horizontal dimensions of the target grid and the other dimensions from this cube. The data values of this cube will be converted to values on the new grid according to the given regridding scheme.

Note

Both the source and target cubes must have a CoordSystem, otherwise this function is not applicable.

remove_aux_factory(aux_factory)

Removes the given auxiliary coordinate factory from the cube.

remove_cell_measure(cell_measure)

Removes a cell measure from the cube.

Args:

• cell_measure (CellMeasure)
The CellMeasure to remove from the cube.
remove_coord(coord)

Removes a coordinate from the cube.

Args:

• coord (string or coord)
The (name of the) coordinate to remove from the cube.
rename(name)

If ‘name’ is a valid standard name it will assign it to standard_name, otherwise it will assign it to long_name.

replace_coord(new_coord)

Replace the coordinate whose metadata matches the given coordinate.

rolling_window(coord, aggregator, window, **kwargs)

Perform rolling window aggregation on a cube given a coordinate, an aggregation method and a window size.

Args:

• coord (string or coord):
The coordinate over which to perform the rolling window operation.
• aggregator (iris.analysis.Aggregator):
Aggregator to be applied to the data.
• window (int):
Size of window to use.

Kwargs:

• kwargs:
Aggregator and aggregation function keyword arguments. The weights argument to the aggregator, if any, should be a 1d array with the same length as the chosen window.
Returns:
iris.cube.Cube.

Note

This operation does not yet have support for lazy evaluation.

For example:

>>> import iris, iris.analysis
>>> fname = iris.sample_data_path('GloSea4', 'ensemble_010.pp')
>>> air_press = iris.load_cube(fname, 'surface_temperature')
>>> print(air_press)
surface_temperature / (K)           (time: 6; latitude: 145; longitude: 192)
Dimension coordinates:
time                           x            -               -
latitude                       -            x               -
longitude                      -            -               x
Auxiliary coordinates:
forecast_period                x            -               -
Scalar coordinates:
forecast_reference_time: 2011-07-23 00:00:00
realization: 10
Attributes:
STASH: m01s00i024
source: Data from Met Office Unified Model
um_version: 7.6
Cell methods:
mean: time (1 hour)

>>> print(air_press.rolling_window('time', iris.analysis.MEAN, 3))
surface_temperature / (K)           (time: 4; latitude: 145; longitude: 192)
Dimension coordinates:
time                           x            -               -
latitude                       -            x               -
longitude                      -            -               x
Auxiliary coordinates:
forecast_period                x            -               -
Scalar coordinates:
forecast_reference_time: 2011-07-23 00:00:00
realization: 10
Attributes:
STASH: m01s00i024
source: Data from Met Office Unified Model
um_version: 7.6
Cell methods:
mean: time (1 hour)
mean: time


Notice that the forecast_period dimension now represents the 4 possible windows of size 3 from the original cube.
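The window count above (6 time points yielding 4 windows of size 3) follows from simple arithmetic, which can be sketched with plain NumPy (a conceptual illustration, not Iris itself): a length-n series aggregated over windows of size w yields n - w + 1 values.

```python
import numpy as np

# A length-6 "time" series with a size-3 rolling MEAN: 6 - 3 + 1 = 4
# windows, each output value being the mean of one window.
series = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
window = 3
means = np.array([series[i:i + window].mean()
                  for i in range(len(series) - window + 1)])
```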

slices(ref_to_slice, ordered=True)

Return an iterator of all subcubes given the coordinates or dimension indices desired to be present in each subcube.

Args:

• ref_to_slice (string, coord, dimension index or a list of these):
Determines which dimensions will be returned in the subcubes (i.e. the dimensions that are not iterated over). A mix of input types can also be provided. They must all be orthogonal (i.e. point to different dimensions).

Kwargs:

• ordered: if True, the order in which the coords to slice or data_dims
are given will be the order in which they represent the data in the resulting cube slices. If False, the order will follow that of the source cube. Default is True.
Returns:
An iterator of subcubes.

For example, to get all 2d longitude/latitude subcubes from a multi-dimensional cube:

for sub_cube in cube.slices(['longitude', 'latitude']):
print(sub_cube)

slices_over(ref_to_slice)

Return an iterator of all subcubes along a given coordinate or dimension index, or multiple of these.

Args:

• ref_to_slice (string, coord, dimension index or a list of these):
Determines which dimensions will be iterated along (i.e. the dimensions that are not returned in the subcubes). A mix of input types can also be provided.
Returns:
An iterator of subcubes.

For example, to get all subcubes along the time dimension:

for sub_cube in cube.slices_over('time'):
print(sub_cube)


Note

The order of dimension references to slice along does not affect the order of returned items in the iterator; instead the ordering is based on the fastest-changing dimension.
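The iteration behaviour can be sketched with plain NumPy (a conceptual illustration, not Iris itself): slicing over one dimension yields one subcube per index along it, with that dimension removed from each subcube.

```python
import numpy as np

# A (time, latitude, longitude) array; slicing over "time" yields one
# (latitude, longitude) subcube per time index.
data = np.arange(24).reshape(2, 3, 4)
subcubes = [data[i] for i in range(data.shape[0])]
```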

subset(coord)

Get a subset of the cube by providing the desired resultant coordinate. If the coordinate provided applies to the whole cube, the whole cube is returned. As such, the operation is not strict.

summary(shorten=False, name_padding=35)

Unicode string summary of the Cube with name, a list of dim coord names versus length and optionally relevant coordinate information.

transpose(new_order=None)

Re-order the data dimensions of the cube in-place.

new_order - list of ints, optional
By default, reverse the dimensions, otherwise permute the axes according to the values given.

Note

If defined, new_order must span all of the data dimensions.

Example usage:

>>> # Put the second dimension first, followed by the third dimension,
>>> # and finally put the first dimension third:
>>> cube.transpose([1, 2, 0])

xml(checksum=False, order=True, byteorder=True)

Returns a fully valid CubeML string representation of the Cube.

attributes

A dictionary, with a few restricted keys, for arbitrary Cube metadata.

aux_coords

Return a tuple of all the auxiliary coordinates, ordered by dimension(s).

aux_factories

Return a tuple of all the coordinate factories.

cell_methods

Tuple of iris.coords.CellMethod representing the processing done on the phenomenon.

data

The numpy.ndarray representing the multi-dimensional data of the cube.

Note

Cubes obtained from netCDF, PP, and FieldsFile files will only populate this attribute on its first use.

To obtain the shape of the data without causing it to be loaded, use the Cube.shape attribute.

Example::
>>> fname = iris.sample_data_path('air_temp.pp')
>>> cube = iris.load_cube(fname, 'air_temperature')
>>> # cube.data does not yet have a value.
...
>>> print(cube.shape)
(73, 96)
>>> # cube.data still does not have a value.
...
>>> cube = cube[:10, :20]
>>> # cube.data still does not have a value.
...
>>> data = cube.data
>>> # Only now is the data loaded.
...
>>> print(data.shape)
(10, 20)

derived_coords

Return a tuple of all the coordinates generated by the coordinate factories.

dim_coords

Return a tuple of all the dimension coordinates, ordered by dimension.

Note

The length of the returned tuple is not necessarily the same as Cube.ndim as there may be dimensions on the cube without dimension coordinates. It is therefore unreliable to use the resulting tuple to identify the dimension coordinates for a given dimension - instead use the Cube.coord() method with the dimensions and dim_coords keyword arguments.

dtype

The data type of the values in the data array of this Cube.

long_name = None

The “long name” for the Cube’s phenomenon.

metadata

An instance of CubeMetadata describing the phenomenon.

This property can be updated with any of:

• another CubeMetadata instance,
• a tuple/dict which can be used to make a CubeMetadata,
• a tuple/dict which contains the attributes of CubeMetadata.
ndim

The number of dimensions in the data of this cube.

shape

The shape of the data of this cube.

standard_name

The “standard name” for the Cube’s phenomenon.

units

An instance of cf_units.Unit describing the Cube’s data.

var_name

The netCDF variable name for the Cube.


All the functionality of a standard list with added “Cube” context.

class iris.cube.CubeList(list_of_cubes=None)

Bases: list

Given a list of cubes, return a CubeList instance.

append(object) → None -- append object to end
clear() → None -- remove all items from L
concatenate(check_aux_coords=True)

Concatenate the cubes over their common dimensions.

Kwargs:

• check_aux_coords
Checks that the auxiliary coordinates of the cubes match. This check is not applied to auxiliary coordinates that span the dimension the concatenation is occurring along. Defaults to True.
Returns:
A new iris.cube.CubeList of concatenated iris.cube.Cube instances.

This combines cubes with a common dimension coordinate, but occupying different regions of the coordinate value. The cubes are joined across that dimension.

For example:

>>> print(c1)
some_parameter / (unknown)          (y_vals: 2; x_vals: 4)
Dimension coordinates:
y_vals                           x          -
x_vals                           -          x
>>> print(c1.coord('y_vals').points)
[4 5]
>>> print(c2)
some_parameter / (unknown)          (y_vals: 3; x_vals: 4)
Dimension coordinates:
y_vals                           x          -
x_vals                           -          x
>>> print(c2.coord('y_vals').points)
[ 7  9 10]
>>> cube_list = iris.cube.CubeList([c1, c2])
>>> new_cube = cube_list.concatenate()[0]
>>> print(new_cube)
some_parameter / (unknown)          (y_vals: 5; x_vals: 4)
Dimension coordinates:
y_vals                           x          -
x_vals                           -          x
>>> print(new_cube.coord('y_vals').points)
[ 4  5  7  9 10]
>>>


Contrast this with iris.cube.CubeList.merge(), which makes a new dimension from values of an auxiliary scalar coordinate.

Note

If time coordinates in the list of cubes have differing epochs then the cubes will not be able to be concatenated. If this occurs, use iris.util.unify_time_units() to normalise the epochs of the time coordinates so that the cubes can be concatenated.

Note

Concatenation cannot occur along an anonymous dimension.

concatenate_cube(check_aux_coords=True)

Return the concatenated contents of the CubeList as a single Cube.

If it is not possible to concatenate the CubeList into a single Cube, a ConcatenateError will be raised describing the reason for the failure.

Kwargs:

• check_aux_coords
Checks that the auxiliary coordinates of the cubes match. This check is not applied to auxiliary coordinates that span the dimension the concatenation is occurring along. Defaults to True.

Note

Concatenation cannot occur along an anonymous dimension.

copy() → list -- a shallow copy of L
count(value) → integer -- return number of occurrences of value
extend(iterable) → None -- extend list by appending elements from the iterable
extract(constraints, strict=False)

Filter each of the cubes which can be filtered by the given constraints.

This method iterates over each constraint given, and subsets each of the cubes in this CubeList where possible. Thus, a CubeList of length n when filtered with m constraints can generate a maximum of m * n cubes.
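The m * n bound can be sketched in plain Python (a conceptual illustration, not Iris itself; dicts stand in for cubes and predicates for constraints): each constraint is applied to every cube, so the result holds at most one item per (constraint, cube) pair.

```python
# 2 "cubes" filtered by 2 "constraints": at most 2 * 2 = 4 results.
cubes = [{'name': 'air_temperature'}, {'name': 'air_pressure'}]
constraints = [
    lambda c: c['name'] == 'air_temperature',    # matches 1 cube
    lambda c: c['name'].startswith('air_'),      # matches both cubes
]
extracted = [c for match in constraints for c in cubes if match(c)]
```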

Keywords:

• strict - boolean
If strict is True, then there must be exactly one cube which is filtered per constraint.
extract_overlapping(coord_names)

Returns a CubeList of cubes extracted over regions where the coordinates overlap, for the coordinates in coord_names.

Args:

• coord_names:
A string or list of strings of the names of the coordinates over which to perform the extraction.
extract_strict(constraints)

Calls CubeList.extract() with the strict keyword set to True.

index(value[, start[, stop]]) → integer -- return first index of value.

Raises ValueError if the value is not present.

insert(index, object) → None -- insert object before index

merge(unique=True)

Returns the CubeList resulting from merging this CubeList.

Kwargs:

• unique:
If True, raises iris.exceptions.DuplicateDataError if duplicate cubes are detected.

This combines cubes with different values of an auxiliary scalar coordinate, by constructing a new dimension.

For example:

>>> print(c1)
some_parameter / (unknown)          (x_vals: 3)
Dimension coordinates:
x_vals                           x
Scalar coordinates:
y_vals: 100
>>> print(c2)
some_parameter / (unknown)          (x_vals: 3)
Dimension coordinates:
x_vals                           x
Scalar coordinates:
y_vals: 200
>>> cube_list = iris.cube.CubeList([c1, c2])
>>> new_cube = cube_list.merge()[0]
>>> print(new_cube)
some_parameter / (unknown)          (y_vals: 2; x_vals: 3)
Dimension coordinates:
y_vals                           x          -
x_vals                           -          x
>>> print(new_cube.coord('y_vals').points)
[100 200]
>>>


Contrast this with iris.cube.CubeList.concatenate(), which joins cubes along an existing dimension.

Note

If time coordinates in the list of cubes have differing epochs then the cubes will not be able to be merged. If this occurs, use iris.util.unify_time_units() to normalise the epochs of the time coordinates so that the cubes can be merged.

merge_cube()

Return the merged contents of the CubeList as a single Cube.

If it is not possible to merge the CubeList into a single Cube, a MergeError will be raised describing the reason for the failure.

For example:

>>> cube_1 = iris.cube.Cube([1, 2])
>>> cube_2 = iris.cube.Cube([3, 4])
>>> cube_2.add_dim_coord(
...     iris.coords.DimCoord([0, 1], long_name='z'), 0)
>>> single_cube = iris.cube.CubeList([cube_1, cube_2]).merge_cube()
Traceback (most recent call last):
...
iris.exceptions.MergeError: failed to merge into a single cube.
Coordinates in cube.dim_coords differ: z.
Coordinate-to-dimension mapping differs for cube.dim_coords.

pop([index]) → item -- remove and return item at index (default last).

Raises IndexError if list is empty or index is out of range.

realise_data()

Fetch ‘real’ data for all cubes, in a shared calculation.

This computes any lazy data, equivalent to accessing each cube.data. However, lazy calculations and data fetches can be shared between the computations, improving performance.

For example:

# Form stats.
a_std = cube_a.collapsed(['x', 'y'], iris.analysis.STD_DEV)
b_std = cube_b.collapsed(['x', 'y'], iris.analysis.STD_DEV)
ab_mean_diff = (cube_b - cube_a).collapsed(['x', 'y'],
iris.analysis.MEAN)
std_err = (a_std * a_std + b_std * b_std) ** 0.5

# Compute these stats together (avoiding multiple data passes).
CubeList([a_std, b_std, ab_mean_diff, std_err]).realise_data()


Note

Cubes with non-lazy data are not affected.

remove(value) → None -- remove first occurrence of value.

Raises ValueError if the value is not present.

reverse() → None -- reverse *IN PLACE*

sort(key=None, reverse=False) → None -- stable sort *IN PLACE*
xml(checksum=False, order=True, byteorder=True)

Return a string of the XML that this list of cubes represents.


Represents the phenomenon metadata for a single Cube.

class iris.cube.CubeMetadata(standard_name, long_name, var_name, units, attributes, cell_methods)

Create new instance of CubeMetadata(standard_name, long_name, var_name, units, attributes, cell_methods)

count(value) → integer -- return number of occurrences of value
index(value[, start[, stop]]) → integer -- return first index of value.

Raises ValueError if the value is not present.

name(default='unknown')

First it tries self.standard_name, then it tries the ‘long_name’ attribute, then the ‘var_name’ attribute, before falling back to the value of default (which itself defaults to ‘unknown’).

attributes

Alias for field number 4

cell_methods

Alias for field number 5

long_name

Alias for field number 1

standard_name

Alias for field number 0

units

Alias for field number 3

var_name

Alias for field number 2