function - defines pytensor.function#
Guide#
This module provides function(), commonly accessed as pytensor.function, the interface for compiling graphs into callable objects.
You’ve already seen example usage in the basic tutorial… something like this:
>>> import pytensor
>>> x = pytensor.tensor.dscalar()
>>> f = pytensor.function([x], 2*x)
>>> f(4)
array(8.0)
The idea here is that we've compiled the symbolic graph (2*x) into a function that can be called on a number and will do some computations.
The behaviour of function can be controlled in several ways, such as In, Out, mode, updates, and givens. These are covered in the tutorial examples and the tutorial on modes.
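For instance, here is a hedged sketch of how these controls might be combined; the shared variable state, the constant substituted for y, and the mode string are illustrative choices rather than part of this module's reference:
>>> import pytensor
>>> import pytensor.tensor as pt
>>> x = pt.dscalar("x")
>>> y = pt.dscalar("y")
>>> state = pytensor.shared(0.0, name="state")
>>> f = pytensor.function(
...     [x],
...     x + y + state,
...     givens={y: pt.constant(1.0, dtype="float64")},  # replace y before compiling
...     updates=[(state, state + x)],                   # advance state after each call
...     mode="FAST_RUN",                                # the usual default mode
... )
>>> f(4.0)
array(5.0)
>>> f(4.0)  # state now holds 4.0
array(9.0)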
Reference#
- class pytensor.compile.function.In[source]#
A class for attaching information to function inputs.
- value[source]#
The default value to use at call-time (can also be a Container where the function will find a value at call-time.)
- mutable[source]#
True means the compiled function is allowed to modify this argument. False means it is not allowed.
- borrow[source]#
True indicates that a reference to internal storage may be returned, and that the caller is aware that subsequent function evaluations might overwrite this memory.
- strict[source]#
If False, a function argument may be copied or cast to match the type required by the parameter variable. If True, a function argument must exactly match the type required by variable.
- allow_downcast[source]#
True indicates that the value you pass for this input can be silently downcasted to fit the right type, which may lose precision. (Only applies when strict is False.)
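As a hedged sketch (the variable names and the default value 1.0 are illustrative), an In instance can attach a default value to an input:
>>> import pytensor
>>> import pytensor.tensor as pt
>>> from pytensor.compile.function import In
>>> x = pt.dscalar("x")
>>> y = pt.dscalar("y")
>>> f = pytensor.function([x, In(y, value=1.0)], x + y)
>>> f(4.0)        # y falls back to its default value
array(5.0)
>>> f(4.0, 6.0)   # or can be supplied explicitly
array(10.0)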
- class pytensor.compile.function.Out[source]#
A class for attaching information to function outputs.
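A minimal sketch of attaching output information with Out, assuming Out accepts a borrow flag analogous to In.borrow (the variable name is illustrative):
>>> import pytensor
>>> import pytensor.tensor as pt
>>> from pytensor.compile.function import Out
>>> x = pt.dmatrix("x")
>>> # borrow=True allows the function to return a reference to internal storage,
>>> # which later calls may overwrite.
>>> f = pytensor.function([x], Out(2 * x, borrow=True))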
- pytensor.compile.function.function(inputs, outputs, mode=None, updates=None, givens=None, no_default_updates=False, accept_inplace=False, name=None, rebuild_strict=True, allow_input_downcast=None, profile=None, on_unused_input='raise')[source]#
Return a callable object that will calculate outputs from inputs.
- Parameters:
  - inputs (list of either Variable or In instances, but not shared variables) – the returned Function instance will have parameters for these variables.
  - outputs (list of Variables or Out instances) – expressions to compute.
  - mode (None, string or Mode instance) – compilation mode.
  - updates (iterable over pairs (shared_variable, new_expression); list, tuple or dict) – expressions for new SharedVariable values.
  - givens (iterable over pairs (Var1, Var2) of Variables; list, tuple or dict; the Var1 and Var2 in each pair must have the same Type) – specific substitutions to make in the computation graph (Var2 replaces Var1).
  - no_default_updates (either bool or list of Variables) – if True, do not perform any automatic update on Variables. If False (default), perform them all. Otherwise, perform automatic updates on all Variables that are neither in updates nor in no_default_updates.
  - name – an optional name for this function. The profile mode will print the time spent in this function.
  - rebuild_strict – True (default) is the safer and better tested setting, in which case givens must substitute new variables with the same Type as the variables they replace. False is a you-better-know-what-you-are-doing setting that permits givens to replace variables with new variables of any Type. The consequence of changing a Type is that all results depending on that variable may have a different Type too (the graph is rebuilt from inputs to outputs). If one of the new Types does not make sense for one of the Ops in the graph, an exception will be raised.
  - allow_input_downcast (bool or None) – True means that the values passed as inputs when calling the function can be silently downcasted to fit the dtype of the corresponding Variable, which may lose precision. False means that inputs will only be cast to a more general, or more precise, type. None (default) is almost like False, but allows downcasting of Python float scalars to floatX.
  - profile (None, True, or ProfileStats instance) – accumulate profiling information into a given ProfileStats instance. If the argument is True, a new ProfileStats instance will be used. This profiling object will be available via self.profile.
  - on_unused_input – what to do if a variable in the inputs list is not used in the graph. Possible values are 'raise', 'warn', and 'ignore'.
- Return type:
  Function instance
- Returns:
  a callable object that will compute the outputs (given the inputs) and update the implicit function arguments according to the updates.
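For example, a brief sketch of the on_unused_input parameter (the variable names are illustrative):
>>> import pytensor
>>> import pytensor.tensor as pt
>>> x = pt.dscalar("x")
>>> unused = pt.dscalar("unused")
>>> # 'ignore' silences the error for inputs that do not appear in the graph.
>>> f = pytensor.function([x, unused], 2 * x, on_unused_input="ignore")
>>> f(3.0, 0.0)
array(6.0)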
Inputs can be given as variables or In instances. In instances also have a variable, but they attach some extra information about how call-time arguments corresponding to that variable should be used. Similarly, Out instances can attach information about how output variables should be returned.
The default mode is typically 'FAST_RUN', but this can be changed in pytensor.config. The mode argument controls the sort of rewrites that will be applied to the graph and the way the rewritten graph will be evaluated.
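As a small, hedged illustration of passing a mode by name (assuming the standard 'FAST_COMPILE' mode, which applies only a minimal set of rewrites):
>>> import pytensor
>>> import pytensor.tensor as pt
>>> x = pt.dscalar("x")
>>> # Handy when debugging compilation itself: skip most graph rewrites.
>>> f = pytensor.function([x], x ** 2, mode="FAST_COMPILE")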
After each function evaluation, the updates mechanism can replace the value of any (implicit) SharedVariable inputs with new values computed from the expressions in the updates list. An exception will be raised if you give two update expressions for the same SharedVariable input (that doesn't make sense).
If a SharedVariable is not given an update expression, but has a Variable.default_update member containing an expression, this expression will be used as the update expression for this variable. Passing no_default_updates=True to function disables this behavior entirely, while passing no_default_updates=[sharedvar1, sharedvar2] disables it only for the mentioned variables.
Regarding givens: be careful to make sure that these substitutions are independent, because behaviour when Var1 of one pair appears in the graph leading to Var2 in another expression is undefined (e.g. with {a: x, b: a + 1}). Replacements specified with givens are different from replacements that occur during normal rewriting, in that Var2 is not expected to be equivalent to Var1.
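Here is a hedged sketch of the default-update mechanism (the shared counter and its increment are illustrative; this assumes default_update may be assigned directly on the shared variable, as described above):
>>> import pytensor
>>> import pytensor.tensor as pt
>>> counter = pytensor.shared(0, name="counter")
>>> counter.default_update = counter + 1       # applied automatically on each call
>>> x = pt.lscalar("x")
>>> f = pytensor.function([x], x + counter)                           # default update runs
>>> g = pytensor.function([x], x + counter, no_default_updates=True)  # it does not
>>> f(10), f(10)   # counter advances between calls
(array(10), array(11))
>>> g(10), g(10)   # counter stays where f left it
(array(12), array(12))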
- pytensor.compile.function.function_dump(filename, inputs, outputs=None, mode=None, updates=None, givens=None, no_default_updates=False, accept_inplace=False, name=None, rebuild_strict=True, allow_input_downcast=None, profile=None, on_unused_input=None, extra_tag_to_remove=None)[source]#
This is helpful for making a reproducible case for problems encountered during PyTensor compilation.
Ex: replace pytensor.function(...) by pytensor.function_dump('filename.pkl', ...).
If you see this, you were probably asked to use this function to help debug a particular case during the compilation of a PyTensor function.
function_dump allows you to easily reproduce your compilation without generating any code. It pickles all the objects and parameters needed to reproduce a call to pytensor.function(). This includes shared variables and their values. If you do not want that, you can replace the shared variables' values with zeros by calling set_value(...) on them before calling function_dump.
To load such a dump and do the compilation:
>>> import pickle
>>> import pytensor
>>> d = pickle.load(open("func_dump.bin", "rb"))
>>> f = pytensor.function(**d)
Note: The parameter extra_tag_to_remove is passed to the StripPickler used. To pickle a graph made by Blocks, it must be: ['annotations', 'replacement_of', 'aggregation_scheme', 'roles']
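For completeness, a hedged sketch of producing such a dump (the filename and the graph are illustrative):
>>> import pytensor
>>> import pytensor.tensor as pt
>>> x = pt.dscalar("x")
>>> # Instead of compiling, pickle everything needed to reproduce the compilation.
>>> pytensor.function_dump("func_dump.bin", [x], 2 * x)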
- class pytensor.compile.function.types.Function(vm: VM, input_storage, output_storage, indices, outputs, defaults, unpack_single: bool, return_none: bool, output_keys, maker: FunctionMaker, name: Optional[str] = None)[source]#
A class that wraps the execution of a VM, making it easier to use as a "function". Function is the callable object that does computation. It holds the storage of inputs and outputs, performs the packing and unpacking of inputs and return values, and implements square-bracket indexing so that you can look up the value of a symbolic node.
Functions are copyable via Function.copy and the copy.copy interface. When a function is copied, this instance is duplicated. Contrast with self.maker (an instance of FunctionMaker), which is shared between copies. The meaning of copying a function is that the containers and their current values will all be duplicated. This requires that mutable inputs be copied, whereas immutable inputs may be shared between copies.
A Function instance is hashable, on the basis of its memory address (its id). A Function instance is only equal to itself. A Function instance may be serialized using the pickle or cPickle modules. This will save all default inputs, the graph, and WRITEME to the pickle file.
A Function instance has a Function.trust_input field that defaults to False. When True, the Function will skip all checks on the inputs.
- finder[source]#
Dictionary mapping several kinds of things to containers.
We set an entry in finder for:
- the index of the input
- the variable instance the input is based on
- the name of the input
All entries map to the container or to DUPLICATE if an ambiguity is detected.
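As a hedged sketch of the square-bracket lookup (assuming an input built from an In instance with a default value; the names and values are illustrative):
>>> import pytensor
>>> import pytensor.tensor as pt
>>> from pytensor.compile.function import In
>>> x = pt.dscalar("x")
>>> y = pt.dscalar("y")
>>> f = pytensor.function([x, In(y, value=3.0, name="y")], x + y)
>>> v = f["y"]    # look up the current value of y's container by input name
>>> f[y] = 5.0    # ...or index by the variable itself to rebind the default
>>> f(1.0)
array(6.0)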
- __call__(*args, **kwargs)[source]#
Evaluates the function on the given arguments.
- Parameters:
args (list) – List of inputs to the function. All inputs are required, even when some of them are not necessary to calculate the requested subset of outputs.
kwargs (dict) – The function inputs can be passed as keyword arguments. For this, use the name of the input or the input instance as the key.
The keyword argument output_subset is a list of either indices of the function's outputs or keys belonging to the output_keys dict, and represents the outputs that are requested to be calculated. Regardless of the presence of output_subset, the updates are always calculated and processed. To disable the updates, use the copy method with delete_updates=True.
- Returns:
  List of outputs on indices/keys from output_subset, or all of them if output_subset is not passed.
- Return type:
  list
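A hedged usage sketch of keyword inputs and output_subset (the variable names are illustrative):
>>> import pytensor
>>> import pytensor.tensor as pt
>>> a = pt.dscalar("a")
>>> b = pt.dscalar("b")
>>> f = pytensor.function([a, b], [a + b, a * b])
>>> f(2.0, b=3.0)                    # inputs may be positional or keyword (by name)
[array(5.0), array(6.0)]
>>> f(2.0, 3.0, output_subset=[1])   # compute only the second output
[array(6.0)]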
- copy(share_memory=False, swap=None, delete_updates=False, name=None, profile=None)[source]#
Copy this function. The copied function will have a maker and fgraph separate from the original function's. The user can choose whether to separate storage by changing the share_memory argument.
- Parameters:
share_memory (bool) – When True, the two functions share intermediate storage (all storage except input and output storage). Otherwise the two functions share only partial storage and the same maker. If the two functions share memory and allow_gc=False, this will increase execution speed and save memory.
swap (dict) – Dictionary that maps old SharedVariables to new SharedVariables. Default is None. NOTE: the shared variable swap is only done in the newly returned function, not in the user graph.
delete_updates (bool) – If True, the copied function will not have updates.
name (string) – If provided, will be the name of the new Function. Otherwise, it will be the old name + " copy".
profile – same as the pytensor.function profile parameter.
- Returns:
Copied pytensor.Function
- Return type:
pytensor.Function
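A hedged sketch of copying a compiled function while dropping its updates and swapping a shared variable (the names and values are illustrative):
>>> import pytensor
>>> import pytensor.tensor as pt
>>> state = pytensor.shared(0.0, name="state")
>>> x = pt.dscalar("x")
>>> f = pytensor.function([x], x + state, updates=[(state, state + x)])
>>> new_state = pytensor.shared(100.0, name="new_state")
>>> g = f.copy(delete_updates=True, swap={state: new_state})
>>> g(1.0)   # reads new_state and performs no update
array(101.0)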