Searched refs:weights (Results 1 – 9 of 9) sorted by relevance

/hardware/interfaces/neuralnetworks/1.3/
types.hal
   738  * outputs = activation(inputs * weights’ + bias)
   754  * weights, and "batch_size" is calculated by dividing the number of
   757  * * 1: A 2-D tensor, specifying the weights, of shape
  1153  * * The cell-to-input weights (\f$W_{ci}\f$), cell-to-forget weights
  1154  * (\f$W_{cf}\f$) and cell-to-output weights (\f$W_{co}\f$) either all
  1157  * * The input-to-input weights (\f$W_{xi}\f$), recurrent-to-input weights
  1166  * cell-to-input (\f$W_{ci}\f$) weights must be present. Otherwise, the
  1167  * cell-to-input weights must have no value.
  1168  * * The projection weights (\f$W_{proj}\f$) is required only for the
  1173  * * (HAL version 1.2 or later) The four layer normalization weights either all have
  [all …]
types.t
   357  * weights or scalars added at construction time. The only information that
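
The FULLY_CONNECTED hits above (types.hal line 738 here, and the matching lines in the 1.2 and 1.0 copies below) document the operation as outputs = activation(inputs * weights’ + bias), with the weights supplied as a 2-D tensor. As a rough, non-authoritative sketch of that arithmetic in plain C++ (the row-major float buffers, the [num_units, input_size] weight layout, and the optional RELU clamp are assumptions, since the snippet is truncated), a reference computation might look like:

#include <algorithm>
#include <cstddef>
#include <vector>

// Sketch of outputs = activation(inputs * weights' + bias) as described in the
// FULLY_CONNECTED comments above. Layout assumptions (not from the snippet):
//   inputs  : [batch_size, input_size], row-major
//   weights : [num_units, input_size], row-major (hence the transpose in the formula)
//   bias    : [num_units]
std::vector<float> FullyConnectedRef(const std::vector<float>& inputs,
                                     const std::vector<float>& weights,
                                     const std::vector<float>& bias,
                                     std::size_t batch_size,
                                     std::size_t input_size,
                                     std::size_t num_units,
                                     bool relu) {
  std::vector<float> outputs(batch_size * num_units);
  for (std::size_t b = 0; b < batch_size; ++b) {
    for (std::size_t u = 0; u < num_units; ++u) {
      float acc = bias[u];
      for (std::size_t i = 0; i < input_size; ++i) {
        // Each output unit dots the input row with its own weight row.
        acc += inputs[b * input_size + i] * weights[u * input_size + i];
      }
      outputs[b * num_units + u] = relu ? std::max(acc, 0.0f) : acc;
    }
  }
  return outputs;
}
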
/hardware/interfaces/neuralnetworks/1.2/
types.hal
   753  * outputs = activation(inputs * weights’ + bias)
   768  * weights, and "batch_size" is calculated by dividing the number of
   771  * * 1: A 2-D tensor, specifying the weights, of shape
  1156  * * The cell-to-input weights (\f$W_{ci}\f$), cell-to-forget weights
  1157  * (\f$W_{cf}\f$) and cell-to-output weights (\f$W_{co}\f$) either all
  1160  * * The input-to-input weights (\f$W_{xi}\f$), recurrent-to-input weights
  1169  * cell-to-input (\f$W_{ci}\f$) weights must be present. Otherwise, the
  1170  * cell-to-input weights must have no value.
  1171  * * The projection weights (\f$W_{proj}\f$) is required only for the
  1176  * * (HAL version 1.2 or later) The four layer normalization weights either all have
  [all …]
types.t
   320  * weights or scalars added at construction time. The only information that
/hardware/interfaces/neuralnetworks/1.0/
types.hal
   537  * outputs = activation(inputs * weights’ + bias)
   551  * weights, and "batch_size" is calculated by dividing the number of
   553  * * 1: A 2-D tensor, specifying the weights, of shape
   882  * * The cell-to-input weights (\f$W_{ci}\f$), cell-to-forget weights
   883  * (\f$W_{cf}\f$) and cell-to-output weights (\f$W_{co}\f$) either all
   886  * * The input-to-input weights (\f$W_{xi}\f$), recurrent-to-input weights
   895  * cell-to-input (\f$W_{ci}\f$) weights must be present. Otherwise, the
   896  * cell-to-input weights must have no value.
   897  * * The projection weights (\f$W_{proj}\f$) is required only for the
   932  * * 1: The input-to-input weights (\f$W_{xi}\f$). Optional.
  [all …]
types.t
   299  * weights or scalars added at construction time. The only information that
/hardware/interfaces/neuralnetworks/1.1/
types.t
    82  * weights or scalars added at construction time. The only information that
types.hal
   404  * weights or scalars added at construction time. The only information that
/hardware/qcom/neuralnetworks/hvxservice/1.0/
HexagonOperationsPrepare.cpp
   228  const hexagon_nn_input& weights = model->createFullyConnectedWeightTensor(ins[1]);  [in fully_connected(), local]
   234  return model->addFusedFloatOperation(OP_MatMul_f, NN_PAD_NA, bias, act, {input, weights}, outs);  [in fully_connected()]
   716  const hexagon_nn_input& weights = model->createFullyConnectedWeightTensor(ins[1]);  [in fully_connected(), local]
   731  {input, weights, input_min, input_max, weights_min, weights_max}, outs);  [in fully_connected()]
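
In the hvxservice hits above, the float fully_connected path (lines 228 and 234) wraps the prepared weight tensor in a fused OP_MatMul_f, while the second fully_connected hit (lines 716 and 731), evidently the quantized variant, additionally passes explicit input_min/input_max and weights_min/weights_max ranges. How such ranges can relate to NNAPI's asymmetric uint8 quantization (real_value = scale * (quantized_value - zeroPoint)) is sketched below; the helper is a hypothetical illustration, not code from the driver:

#include <cstdint>
#include <utility>

// Hypothetical helper (not from HexagonOperationsPrepare.cpp): derive the float
// range covered by a TENSOR_QUANT8_ASYMM operand, where real = scale * (q - zeroPoint)
// and q spans 0..255. Ranges like input_min/input_max in the hit at line 731
// could plausibly be produced this way.
std::pair<float, float> QuantizedRange(float scale, int32_t zero_point) {
  const float min = scale * (0 - zero_point);    // real value of quantized 0
  const float max = scale * (255 - zero_point);  // real value of quantized 255
  return {min, max};
}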