On this page:
14.1 Single layer functions
line
quad
plane
linear-1-1
softmax
relu
corr
recu
signal-avg
avg-cols
14.2 Deep layer functions
k-relu
k-recu

14 Layer functions

These functions, provided by Malt, can be used either as individual target functions (see Loss Functions) or combined with each other to produce new target functions. They are collectively called layer functions, since some of them are used as the layers from which deep neural networks are constructed.
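
For instance, a classifier head can be assembled by chaining two of the layer functions below (a hedged sketch; the parameter-list bookkeeping follows the convention, made explicit under k-relu below, that a parameterized layer consumes the first two entries of θ and the rest is passed along; softmax's body, as shown below, does not consult θ):

(define classify
  (λ (t)
    (λ (θ)
      ;; the relu layer consumes (ref θ 0) and (ref θ 1);
      ;; softmax turns its output into a probability distribution
      ((softmax ((relu t) θ)) (refr θ 2)))))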

14.1 Single layer functions

These functions accept an input tensor and a θ, and return an output tensor. They are all of type target-fn? (see Loss Functions).

procedure

((line t) θ) → tensor?

  t : tensor?
  θ : theta?
Implements the linear combination given by
(+ (* (ref θ 0) t)
   (ref θ 1))
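
A small usage sketch (illustrative values; in Malt's learner representation scalars are rank-0 tensors and θ is a list of parameters):

;; slope 2.0, intercept 1.0
((line 3.0) (list 2.0 1.0))                   ;; ⇒ 7.0
;; extends pointwise over a rank-1 input
((line (tensor 1.0 2.0 3.0)) (list 2.0 1.0))  ;; ⇒ (tensor 3.0 5.0 7.0)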

procedure

((quad t) θ) → tensor?

  t : tensor?
  θ : theta?
Implements the polynomial combination given by
(+ (* (ref θ 0) (sqr t))
   (* (ref θ 1) t)
   (ref θ 2))
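
A usage sketch with illustrative coefficients:

((quad 3.0) (list 4.5 2.1 7.8))
;; ⇒ (+ (* 4.5 9.0) (* 2.1 3.0) 7.8) = 54.6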

procedure

((plane t) θ) → tensor?

  t : tensor?
  θ : theta?
((linear-1-1 t) θ) → tensor?
  t : tensor?
  θ : theta?
Implements the linear combination given by
(+ (dot-product (ref θ 0) t) (ref θ 1))
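
Here (ref θ 0) is a rank-1 weight tensor of the same shape as t, and (ref θ 1) is a scalar bias. A usage sketch with illustrative values:

((plane (tensor 2.0 43.0)) (list (tensor 1.0 2.0) 3.0))
;; ⇒ (+ (+ (* 1.0 2.0) (* 2.0 43.0)) 3.0) = 91.0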

procedure

((softmax t) θ) → tensor?

  t : tensor?
  θ : theta?
Implements the softmax function given by
(let ((z (- t (max t))))
  (let ((expz (exp z)))
    (/ expz (sum expz))))
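
A usage sketch (values approximate; the body above does not consult θ, so an empty parameter list suffices here):

((softmax (tensor 1.0 2.0 3.0)) (list))
;; ⇒ roughly (tensor 0.0900 0.2447 0.6652)
;; the outputs are positive and sum to 1.0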

procedure

((relu t) θ) → tensor?

  t : tensor?
  θ : theta?
Implements the ReLU function given by
(rectify ((linear t) θ))
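
A sketch of a single-neuron layer over a 3-wide input (illustrative values; assumes linear pairs a rank-2 weight tensor, one row per neuron, with a rank-1 bias):

((relu (tensor 2.0 1.0 3.0))
 (list (tensor (tensor 7.1 4.3 -6.4)) (tensor 0.6)))
;; linear part: 7.1·2.0 + 4.3·1.0 + -6.4·3.0 + 0.6 = -0.1
;; rectify clamps negatives to 0.0 ⇒ (tensor 0.0)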

procedure

((corr t) θ) → tensor?

  t : tensor?
  θ : theta?
Implements the biased correlation function given by
(+ (correlate (ref θ 0) t) (ref θ 1))
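
A shape sketch (assuming, as in The Little Learner, that correlate pairs a bank of b filters of shape (list b m d) with a signal of shape (list n d), zero-padding the ends to produce shape (list n b)):

;; a 3-segment signal of depth 2
(define signal
  (tensor (tensor 1.0 2.0) (tensor 3.0 4.0) (tensor 5.0 6.0)))
;; a bank of 2 filters, each of width 3 and depth 2
(define bank
  (tensor (tensor (tensor 0.1 0.2) (tensor 0.3 0.4) (tensor 0.5 0.6))
          (tensor (tensor 0.7 0.8) (tensor 0.9 1.0) (tensor 1.1 1.2))))
(define bias (tensor 0.0 0.0))
(shape ((corr signal) (list bank bias)))  ;; ⇒ '(3 2)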

procedure

((recu t) θ) → tensor?

  t : tensor?
  θ : theta?
Implements the rectified 1D-convolution function given by
(rectify ((corr t) θ))
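
With signal, bank, and bias as in the corr sketch above, recu yields the same shape with negative entries replaced by 0.0:

((recu signal) (list bank bias))  ;; shape (list 3 2), no negative entries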

procedure

((signal-avg t) θ) → tensor?

  t : tensor?
  θ : theta?
((avg-cols t) θ) → tensor?
  t : tensor?
  θ : theta?
Implements the average of the rank-1 elements of a tensor, given by
(let ((num-segments (ref (refr (shape t) (- (rank t) 2)) 0)))
  (/ (sum-cols t) num-segments))
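
A usage sketch (the body above does not consult θ):

((avg-cols (tensor (tensor 1.0 2.0)
                   (tensor 3.0 4.0)
                   (tensor 5.0 6.0)))
 (list))
;; sum-cols ⇒ (tensor 9.0 12.0); num-segments = 3 ⇒ (tensor 3.0 4.0)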

14.2 Deep layer functions

These functions create a stacked composition of layer functions of a given depth k. The resulting composition is itself a layer function.

procedure

(((k-relu k) t) θ) → tensor?

  k : natural?
  t : tensor?
  θ : theta?
Implements a composition of k ReLU functions. Can be used to implement a neural network exclusively made up of dense layers.
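
A sketch of how such a composition can be defined (in the style of The Little Learner; each relu layer consumes the first two entries of θ, a weight tensor and a bias, and passes the rest along, so a depth-k network expects 2k parameter entries):

(define k-relu
  (λ (k)
    (λ (t)
      (λ (θ)
        (cond
          ((zero? k) t)
          (else (((k-relu (sub1 k)) ((relu t) θ))
                 (refr θ 2))))))))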

procedure

(((k-recu k) t) θ) → tensor?

  k : natural?
  t : tensor?
  θ : theta?
Implements a composition of k rectified 1D-convolution (recu) functions. Can be used to implement a neural network exclusively made up of recu layers.
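
k-recu composes analogously, each recu layer consuming a filter bank and a bias (a sketch mirroring the k-relu definition above):

(define k-recu
  (λ (k)
    (λ (t)
      (λ (θ)
        (cond
          ((zero? k) t)
          (else (((k-recu (sub1 k)) ((recu t) θ))
                 (refr θ 2))))))))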