tensor.utils – Tensor Utils

pytensor.tensor.utils.as_list(x)

Convert x to a list if it is an iterable; otherwise, wrap it in a list.
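
A minimal sketch of the documented behaviour; the exact outputs are illustrative:

>>> from pytensor.tensor.utils import as_list
>>> as_list((1, 2, 3))  # an iterable is converted to a list
[1, 2, 3]
>>> as_list(5)  # a non-iterable is wrapped in a list
[5]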

pytensor.tensor.utils.broadcast_static_dim_lengths(dim_lengths: Sequence[int | None]) → int | None

Apply broadcasting rules to the static dimension lengths of the inputs (obtained from var.type.shape) and return the resulting static length, or None if it cannot be statically determined.

Raises:

ValueError – When static dim lengths are incompatible
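
A minimal sketch, assuming NumPy-style broadcasting of static lengths (a length of 1 broadcasts against a known length, while 1 combined with an unknown None length stays unknown):

>>> from pytensor.tensor.utils import broadcast_static_dim_lengths
>>> broadcast_static_dim_lengths([5, 1, 5])  # the length-1 dim broadcasts to 5
5
>>> broadcast_static_dim_lengths([1, None]) is None  # result not statically known
True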

pytensor.tensor.utils.hash_from_ndarray(data) → str

Return a hash from an ndarray.

The hash accounts for the data, shape, strides, and dtype.
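
A minimal sketch, assuming the hash is deterministic for arrays with identical data, shape, strides, and dtype:

>>> import numpy as np
>>> from pytensor.tensor.utils import hash_from_ndarray
>>> a = np.arange(6).reshape(2, 3)
>>> h = hash_from_ndarray(a)
>>> isinstance(h, str)
True
>>> h == hash_from_ndarray(a.copy())  # identical arrays hash identically
True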

pytensor.tensor.utils.normalize_reduce_axis(axis, ndim: int) → tuple[int, ...] | None

Normalize the axis parameter for reduce operations.
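
A minimal sketch, assuming NumPy-style axis normalization (negative axes are counted from the last dimension; None means reduce over all axes):

>>> from pytensor.tensor.utils import normalize_reduce_axis
>>> normalize_reduce_axis(-1, ndim=3)  # negative axis becomes positive
(2,)
>>> normalize_reduce_axis(None, ndim=3) is None  # None is passed through
True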

pytensor.tensor.utils.shape_of_variables(fgraph: FunctionGraph, input_shapes) → dict[pytensor.graph.basic.Variable, tuple[int, ...]]

Compute the numeric shape of all intermediate variables given input shapes.

Parameters:
  • fgraph – The FunctionGraph in question.

  • input_shapes (dict) – A dict mapping each input variable to its shape.

Returns:

  shapes (dict) – A dict mapping each variable to its shape.

Warning

This modifies the fgraph. It is not pure.

Examples

>>> import pytensor.tensor as pt
>>> from pytensor.graph.fg import FunctionGraph
>>> from pytensor.tensor.utils import shape_of_variables
>>> x = pt.matrix("x")
>>> y = x[512:]
>>> y.name = "y"
>>> fgraph = FunctionGraph([x], [y], clone=False)
>>> d = shape_of_variables(fgraph, {x: (1024, 1024)})
>>> d[y]
(array(512), array(1024))
>>> d[x]
(array(1024), array(1024))