Welcome to MorphoLayers
Basics on Mathematical Morphology
Initializers
First Steps on training a Deep Learning model
Regularizing a Morphological Layer for Deep Learning
Comparing layers: a practical case on MNIST
Comparing layers: a practical case on Fashion-MNIST
References
Acknowledgements
About the author
Index
Index
A
absorption law
activation function
ADAM
architecture
B
back-propagation
Beucher gradient
C
Callbacks
classification report
closing
closing by reconstruction
compositional map
confusion matrix
contrast mapping
D
Deep Neural Network
dilation
E
EarlyStopping
Elastic net regularization
empirical risk
epoch
erosion
F
F1 score
G
geodesic dilation
I
Initializers
internal gradient
L
L1L2Lattice
L1Lattice
L2Lattice
Lasso regularization
learning rate
loss function
M
macro-average
micro-average
minibatch
MinusOnesZeroCenter
morphological probing
morphological reconstruction
O
opening
opening by reconstruction
P
positive predictive value
Precision
Projected Gradient Descent
Q
Quadratic
R
RandomLattice
RandomNegativeLattice
RandomwithMaxLattice
RandomwithMinLattice
RandomwithZeroLattice
Recall
ReduceLROnPlateau
Ridge regularization
S
sensitivity
SignedOnes
SparseNumZeros
SparseZeros
Standard gradient descent
Stochastic Gradient Descent
T
Tikhonov regularization
toggle mapping
Top-hat transform