
Searched refs:use_layer_norm (Results 1 – 4 of 4) sorted by relevance

/frameworks/ml/nn/common/operations/
LSTM.cpp:222    params->use_layer_norm = !IsNullInput(output_layer_norm_weights);  in CheckInputTensorDimensions()
LSTM.cpp:811    if (!params.use_layer_norm) {  in LSTMStep()
LSTM.cpp:893    if (params.use_layer_norm) {  in LSTMStep()
LSTM.cpp:912    if (params.use_layer_norm) {  in LSTMStep()
LSTM.cpp:925    if (params.use_layer_norm) {  in LSTMStep()
LSTM.cpp:955    if (params.use_layer_norm) {  in LSTMStep()
LSTM.h:38    bool use_layer_norm;  member
UnidirectionalSequenceLSTM.cpp:110    params.use_layer_norm = hasTensor(context, kOutputLayerNormWeightsTensor);  in getLSTMParams()
BidirectionalSequenceLSTM.cpp:190    params_.use_layer_norm = !IsNullInput(fw_input_layer_norm_weights_);  in BidirectionalSequenceLSTM()
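The hits above show one recurring pattern: `use_layer_norm` is derived from whether the optional layer-norm weight tensors are present (`!IsNullInput(...)` / `hasTensor(...)`), and `LSTMStep()` then branches on the flag when computing each gate. The following is a minimal self-contained sketch of that pattern, assuming a simplified per-gate normalization; `LayerNorm` and `GatePreactivation` here are hypothetical illustrations, not the actual NNAPI kernel functions.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Simplified analogue of the params struct in LSTM.h.
struct LSTMParams {
  bool use_layer_norm;  // set from presence of layer-norm weight tensors
};

// Normalize x to zero mean / unit variance, then scale by per-element weights
// (a simplified stand-in for the layer-norm weight tensor).
std::vector<float> LayerNorm(const std::vector<float>& x,
                             const std::vector<float>& weights) {
  float mean = 0.f;
  for (float v : x) mean += v;
  mean /= x.size();
  float var = 0.f;
  for (float v : x) var += (v - mean) * (v - mean);
  var /= x.size();
  const float inv_std = 1.f / std::sqrt(var + 1e-8f);
  std::vector<float> out(x.size());
  for (std::size_t i = 0; i < x.size(); ++i)
    out[i] = (x[i] - mean) * inv_std * weights[i];
  return out;
}

// One gate's pre-activation: normalized only when the flag is set,
// mirroring the `if (params.use_layer_norm)` branches in LSTMStep().
std::vector<float> GatePreactivation(const LSTMParams& params,
                                     const std::vector<float>& accum,
                                     const std::vector<float>& ln_weights) {
  if (params.use_layer_norm) return LayerNorm(accum, ln_weights);
  return accum;
}
```

Deriving the flag from tensor presence (rather than an explicit option) is what lets the same LSTM kernel serve both the plain and the layer-normalized operand layouts.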