
Supported Tensor Operations

Tensor Storage

For a good description of PyTorch's underlying storage implementation, which this library follows, see this blog post.

Tensors have an underlying storage pointer, together with a shape, stride, and offset. This makes operations like reshapes and indexing cheap: the underlying data pointer stays the same, and only the shape/stride/offset metadata changes. It also means that inplace modifications to a view of a tensor are reflected in the original tensor.

Supported Operations

Most common operations are supported on Tensors and have autograd support. They are declared in tensor.h.

  • Element-wise Binary: +, -, *, /, %, ==, !=, <, <=, >, >=, ||, &&, |, &, ^, <<, >>, maximum, minimum, pow
  • Element-wise Unary: abs, negate, logical_not, sign, log, log10, log2, log1p, exp, exp2, expm1, sqrt, sin, cos, tan, asin, acos, atan, sinh, cosh, tanh, asinh, acosh, atanh, erf, erfc, tgamma, lgamma, digamma, ceil, floor, round, isinf, isnan, isfinite
  • Matmul: vector-vector, vector-matrix, matrix-vector, matrix-matrix, batched matrix-matrix
  • Activations: sigmoid, log_sigmoid, hardsigmoid, softplus, relu, relu6, leaky_relu, elu, selu, silu, hardtanh, softsign, softmax, log_softmax
  • Shape Modifications: broadcast_to, expand, squeeze, unsqueeze, reshape, flatten, permute, repeat_interleave, repeat, gather
  • Indexing: index, index_select, index_put
  • Reduction: min, argmin, max, argmax, sum, mean, all, any, var
  • Misc: where, isclose, allclose, clamp

Indexing

Tensors support indexing.

  • Indexing with an integer selects along the given dimension, removing that dimension from the result
  • Indexing with an indexing::Slice selects a (start, stop, step) slice along the given dimension
  • These can be combined in a braced list to index several dimensions at once

See index.h for the indexing structs.

Tensor x = uniform_real(0, 1, {4, 3, 5});
Tensor t1 = x[1];                   // t1 has shape [3, 5]
Tensor t2 = x[Slice(1, 3)];         // t2 has shape [2, 3, 5]
Tensor t3 = x[{Slice(), 1}];        // t3 has shape [4, 5], similar to PyTorch x[:,1]

Inplace Operations

Most operations have inplace versions, denoted by a _ suffix. For example, tensor.exp_() applies exp inplace. In general, inplace operations are not supported on tensors that require gradients to be computed.

Util

  • current_memory_allocated: Current memory allocated, in bytes
  • make_dot: Create a Graphviz dot graph of the ops and tensors in the computation graph up to and including the tensor