.. _glossary:
Glossary
========
.. glossary::
Apply
Instances of :class:`Apply` represent the application of an :term:`Op`
to some input :term:`Variable` (or variables) to produce some output
:term:`Variable` (or variables). They are like the application of a [symbolic]
mathematical function to some [symbolic] inputs.
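For example, the Apply node behind a sum can be reached through the
``owner`` attribute of its output; a minimal sketch:
>>> import theano.tensor as T
>>> x = T.ivector('x')
>>> y = x + 3
>>> app = y.owner          # the Apply node created by the addition
>>> ins = app.inputs       # the symbolic inputs of the application
>>> outs = app.outputs     # a list containing y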
Broadcasting
Broadcasting is a mechanism which allows tensors with
different numbers of dimensions to be used in element-by-element
(elementwise) computations. It works by
(virtually) replicating the smaller tensor along
the dimensions that it is lacking.
For more detail, see :ref:`libdoc_tensor_broadcastable`, and also
* `SciPy documentation about numpy's broadcasting <http://www.scipy.org/EricsBroadcastingDoc>`_
* `OnLamp article about numpy's broadcasting <http://www.onlamp.com/pub/a/python/2000/09/27/numerically.html>`_
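As a concrete illustration, adding a vector to a matrix (virtually)
replicates the vector along the matrix's leading dimension; a minimal
sketch:
>>> import theano
>>> import theano.tensor as T
>>> m = T.matrix('m')
>>> v = T.vector('v')
>>> f = theano.function([m, v], m + v)    # v is broadcast over the rows of m
>>> out = f([[1, 2], [3, 4]], [10, 20])   # [[11., 22.], [13., 24.]]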
Constant
A variable with an immutable value.
For example, when you type
>>> from theano import tensor
>>> x = tensor.ivector()
>>> y = x + 3
Then a `constant` is created to represent the ``3`` in the graph.
See also: :class:`gof.Constant`
Elementwise
An elementwise operation ``f`` on two tensor variables ``M`` and ``N``
is one such that:
``f(M, N)[i, j] == f(M[i, j], N[i, j])``
In other words, each element of an input matrix is combined
with the corresponding element of the other(s). There are no
dependencies between elements whose ``[i, j]`` coordinates do
not correspond, so an elementwise operation is like a scalar
operation generalized along several dimensions. Elementwise
operations are defined for tensors of different numbers of dimensions by
:term:`broadcasting` the smaller ones.
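For example, ``*`` on two matrix variables is elementwise
multiplication (not the matrix product), combining entries at matching
coordinates:
>>> import theano
>>> import theano.tensor as T
>>> M = T.matrix('M')
>>> N = T.matrix('N')
>>> f = theano.function([M, N], M * N)
>>> out = f([[1, 2], [3, 4]], [[5, 6], [7, 8]])   # [[5., 12.], [21., 32.]]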
Expression
See :term:`Apply`
Expression Graph
A directed, acyclic graph of connected :term:`Variable` and
:term:`Apply` nodes that expresses symbolic functional relationships
between variables. You use Theano by defining expression graphs, and
then compiling them with :term:`theano.function`.
See also :term:`Variable`, :term:`Op`, :term:`Apply`, and
:term:`Type`, or read more about :ref:`tutorial_graphstructures`.
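For example, the following builds a small expression graph and then
compiles it; nothing is computed until the compiled function is
called:
>>> import theano
>>> import theano.tensor as T
>>> a = T.dscalar('a')
>>> b = T.dscalar('b')
>>> c = (a + b) * a                 # symbolic; builds Apply and Variable nodes
>>> f = theano.function([a, b], c)  # compiles the graph into a callable
>>> out = f(2, 3)                   # array(10.0)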
Destructive
An :term:`Op` is destructive (of particular input[s]) if its
computation requires that one or more inputs be overwritten or
otherwise invalidated. For example, :term:`inplace` Ops are
destructive. Destructive Ops can sometimes be faster than
non-destructive alternatives. Theano encourages users not to put
destructive Ops into graphs that are given to :term:`theano.function`,
but instead to trust the optimizations to insert destructive ops
judiciously.
Destructive Ops are indicated via a ``destroy_map`` Op attribute. (See
:class:`gof.Op`.)
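The following hypothetical Op (the class name and behaviour are
invented purely for illustration) shows where ``destroy_map`` is
declared::
    import theano
    class DoubleInplace(theano.Op):              # hypothetical example Op
        destroy_map = {0: [0]}                   # output 0 overwrites input 0
        def make_node(self, x):
            x = theano.tensor.as_tensor_variable(x)
            return theano.Apply(self, [x], [x.type()])
        def perform(self, node, inputs, output_storage):
            x = inputs[0]
            x *= 2                               # destroys the input buffer
            output_storage[0][0] = x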
Graph
see :term:`expression graph`
Inplace
Inplace computations are computations that destroy their inputs as a
side-effect. For example, if you iterate over a matrix and double
every element, this is an inplace operation because when you are done,
the original input has been overwritten. Ops representing inplace
computations are :term:`destructive`, and by default these can only be
inserted by optimizations, not user code.
Linker
Part of a function :term:`Mode` -- an object responsible for 'running'
the compiled function. Among other things, the linker determines whether computations are carried out with C or Python code.
Mode
An object providing an :term:`optimizer` and a :term:`linker` that is
passed to :term:`theano.function`. It parametrizes how an expression
graph is converted to a callable object.
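For example, one might select the Python :term:`linker` and a light
:term:`optimizer`; a minimal sketch:
>>> import theano
>>> import theano.tensor as T
>>> x = T.dscalar('x')
>>> mode = theano.Mode(linker='py', optimizer='fast_compile')
>>> f = theano.function([x], x ** 2, mode=mode)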
Op
The ``.op`` of an :term:`Apply`, together with its symbolic inputs,
fully determines what kind of computation will be carried out for that
``Apply`` at run-time. Mathematical functions such as addition
(``T.add``) and indexing (``x[i]``) are Ops in Theano. Much of the
library documentation is devoted to describing the various Ops that
are provided with Theano, but you can add more.
See also :term:`Variable`, :term:`Type`, and :term:`Apply`,
or read more about :ref:`tutorial_graphstructures`.
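For example, ``T.add`` is an Op instance; calling it creates an Apply
node, from which the Op can be retrieved again:
>>> import theano.tensor as T
>>> x = T.ivector('x')
>>> y = T.add(x, 3)        # applying the Op builds an Apply node
>>> op = y.owner.op        # the elementwise-addition Op behind y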
Optimizer
An instance of :class:`Optimizer`, which has the capacity to provide
an :term:`optimization` (or optimizations).
Optimization
A :term:`graph` transformation applied by an :term:`optimizer` during
the compilation of a :term:`graph` by :term:`theano.function`.
Pure
An :term:`Op` is *pure* if it has no :term:`destructive` side-effects.
Storage
The memory that is used to store the value of a Variable. In most
cases storage is internal to a compiled function, but in some cases
(such as :term:`constant` and :term:`shared variable <shared variable>`) the storage is not internal.
Shared Variable
A :term:`Variable` whose value may be shared between multiple functions. See :func:`shared <shared.shared>` and :func:`theano.function <function.function>`.
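For example, a compiled function can update a shared value that
persists between calls and is visible anywhere the variable is used:
>>> import theano
>>> import theano.tensor as T
>>> state = theano.shared(0)       # the value lives outside any one function
>>> inc = T.iscalar('inc')
>>> acc = theano.function([inc], state, updates=[(state, state + inc)])
>>> old = acc(5)                   # returns the previous value of state
>>> now = state.get_value()        # 5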
theano.function
The interface for Theano's compilation from symbolic expression graphs
to callable objects. See :func:`function.function`.
Type
The ``.type`` of a :term:`Variable` indicates what kinds of values
might be computed for it in a compiled graph. It is an instance of a
class that inherits from :class:`Type`, and it is used as the
``.type`` attribute of that :term:`Variable`.
See also :term:`Variable`, :term:`Op`, and :term:`Apply`,
or read more about :ref:`tutorial_graphstructures`.
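For example:
>>> import theano.tensor as T
>>> x = T.ivector('x')
>>> t = x.type                     # TensorType(int32, vector)
>>> u = (x + x).type               # the addition's output has an inferred Type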
Variable
The main data structure you work with when using Theano.
For example,
>>> import theano.tensor
>>> x = theano.tensor.ivector()
>>> y = -x**2
``x`` and ``y`` are both `Variables`, i.e. instances of the :class:`Variable` class.
See also :term:`Type`, :term:`Op`, and :term:`Apply`,
or read more about :ref:`tutorial_graphstructures`.
View
Some Tensor Ops (such as Subtensor and Transpose) can be computed in
constant time by simply re-indexing their inputs. The outputs from
[the Apply instances from] such Ops are called `Views` because their
storage might be aliased to the storage of other variables (the inputs
of the Apply). It is important for Theano to know which Variables are
views of which other ones in order to introduce :term:`Destructive`
Ops correctly.
View Ops are indicated via a ``view_map`` Op attribute. (See
:class:`gof.Op`.)
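As with ``destroy_map`` above, the following hypothetical Op (invented
purely for illustration) shows where ``view_map`` is declared::
    import theano
    class FirstRow(theano.Op):                   # hypothetical example Op
        view_map = {0: [0]}                      # output 0 may alias input 0
        def make_node(self, x):
            x = theano.tensor.as_tensor_variable(x)
            out = theano.tensor.TensorType(dtype=x.dtype, broadcastable=(False,))()
            return theano.Apply(self, [x], [out])
        def perform(self, node, inputs, output_storage):
            output_storage[0][0] = inputs[0][0]  # re-indexing only; no copy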