BrainScript Full Function Reference
This section provides reference information on BrainScript's built-in functions. The declarations of all built-in functions can be found in CNTK.core.bs, which is located next to the CNTK binary.
Primitive operations and layers are declared in the global namespace. Additional operations are declared in namespaces and are given here with their respective prefix (for example, BS.RNN.LSTMP).
Layers
DenseLayer {outDim, bias=true, activation=Identity, init='uniform', initValueScale=1}
ConvolutionalLayer {numOutputChannels, filterShape, activation=Identity, init="uniform", initValueScale=1, stride=1, pad=false, lowerPad=0, upperPad=0, bias=true}
MaxPoolingLayer {filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
AveragePoolingLayer {filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
EmbeddingLayer {outDim, embeddingPath='', transpose=false}
RecurrentLSTMLayer {outputDim, cellShape=None, goBackwards=false, enableSelfStabilization=false}
DelayLayer {T=1, defaultHiddenActivation=0}
Dropout
BatchNormalizationLayer {spatialRank=0, initialScale=1, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true}
LayerNormalizationLayer {initialScale=1, initialBias=0}
StabilizerLayer{}
FeatureMVNLayer{}
Building layers
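As a minimal, illustrative sketch (not part of the reference itself), the layer factories above are typically composed with Sequential() from the Layers library and then applied to an input; the dimensions and names below are assumptions for the example:

```
# hypothetical dimensions for this sketch
featDim  = 40
labelDim = 9000

features = Input (featDim)

# each *Layer{} factory returns a function object; Sequential() chains them
model = Sequential (
    DenseLayer {2048, activation=Sigmoid} :
    DenseLayer {2048, activation=Sigmoid} :
    DenseLayer {labelDim}                    # linear output layer, no activation
)

z = model (features)                         # apply the composed model to the input
```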
Activation functions
Elementwise operations, unary
Abs (x)
Ceil (x)
Cosine (x)
Clip (x, minValue, maxValue)
Exp (x)
Floor (x)
Log (x)
Negate (x), -x
BS.Boolean.Not (b), !b
Reciprocal (x)
Round (x)
Sin (x)
Sqrt (x)
Elementwise operations, binary
ElementTimes (x, y), x .* y
Minus (x, y), x - y
Plus (x, y), x + y
LogPlus (x, y)
Less (x, y)
Equal (x, y)
Greater (x, y)
GreaterEqual (x, y)
NotEqual (x, y)
LessEqual (x, y)
BS.Boolean.And (a, b)
BS.Boolean.Or (a, b)
BS.Boolean.Xor (a, b)
Elementwise operations, ternary
BS.Boolean.If (condition, thenVal, elseVal)
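A short sketch combining unary, binary, and ternary elementwise operations; x and y are hypothetical nodes of identical shape with positive values:

```
absDiff = Abs (x - y)                            # Minus followed by Abs
prod    = x .* y                                 # infix form of ElementTimes
bigger  = BS.Boolean.If (Greater (x, y), x, y)   # elementwise select of the larger value
logSum  = LogPlus (Log (x), Log (y))             # log(x + y), computed in a numerically stable way
```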
Matrix product and convolution operations
Times (A, B, outputRank=1), A * B
TransposeTimes (A, B, outputRank=1)
Convolution (weights, x, kernelShape, mapDims=(0), stride=(1), sharing=(true), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW', maxTempMemSizeInSamples=0)
Pooling (x, poolKind/*'max'|'average'*/, kernelShape, stride=(1), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW')
ROIPooling (x, rois, roiOutputShape, spatialScale=1.0/16.0)
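For illustration, the two matrix-product spellings are interchangeable; the weight matrix W and the input x below are hypothetical:

```
W  = ParameterTensor {(512:40), init='uniform'}   # weight matrix of shape (512 x 40)
x  = Input (40)
z1 = Times (W, x)          # explicit form
z2 = W * x                 # identical result, infix form
# TransposeTimes (W, z1) multiplies by the transpose of W without materializing it
```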
Learnable parameters and constants
ParameterTensor {shape, learningRateMultiplier=1.0, init='uniform'/*|gaussian*/, initValueScale=1.0, initValue=0.0, randomSeed=-1, initFromFilePath=''}
Constant {scalarValue, rows = 1, cols = 1}
BS.Constants.Zero, BS.Constants.One
BS.Constants.True, BS.Constants.False, BS.Constants.None
BS.Constants.OnesTensor (shape)
BS.Constants.ZeroSequenceLike (x)
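A minimal sketch of declaring learnable parameters and using the predefined constants; the dimensions and the node h are assumptions for the example:

```
# a weight matrix and a bias vector, both learned during training
W = ParameterTensor {(1024:256), init='gaussian', initValueScale=1.0}
b = ParameterTensor {(1024), initValue=0.0}
y = W * h + b                            # h is a hypothetical 256-dimensional node

# predefined constants are fixed and never updated
ones = BS.Constants.OnesTensor ((1024))
none = BS.Constants.None
```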
Inputs
Input (shape, dynamicAxis='', sparse=false, tag='feature')
DynamicAxis{}
EnvironmentInput (propertyName)
Mean (x), InvStdDev (x)
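A typical way these declarations appear at the top of a network description; the dimensions are illustrative:

```
featDim  = 363           # hypothetical dimensions
labelDim = 132

features = Input (featDim, tag='feature')              # dense features from the reader
labels   = Input (labelDim, sparse=true, tag='label')  # one-hot labels, stored sparse
featMean = Mean (features)                             # per-dimension statistics, computed
featIStd = InvStdDev (features)                        # in a pre-pass over the training data
```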
Loss functions and metrics
CrossEntropyWithSoftmax (targetDistribution, nonNormalizedLogClassPosteriors)
CrossEntropy (targetDistribution, classPosteriors)
Logistic (label, probability)
WeightedLogistic (label, probability, instanceWeight)
ClassificationError (labels, nonNormalizedLogClassPosteriors)
MatrixL1Reg(matrix)
MatrixL2Reg(matrix)
SquareError (x, y)
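A sketch of wiring a criterion and a metric to the network output; z is a hypothetical output node holding unnormalized log posteriors:

```
ce   = CrossEntropyWithSoftmax (labels, z)   # training criterion
errs = ClassificationError (labels, z)       # evaluation metric (error rate)

# the usual way to expose them to the trainer
criterionNodes  = (ce)
evaluationNodes = (errs)
```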
Reductions
ReduceSum (z, axis=None)
ReduceLogSum (z, axis=None)
ReduceMean (z, axis=None)
ReduceMin (z, axis=None)
ReduceMax (z, axis=None)
CosDistance (x, y)
SumElements (z)
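For illustration, with hypothetical vectors u and v:

```
total   = ReduceSum (v)       # sum of all elements of v
largest = ReduceMax (v)
avg     = ReduceMean (v)
cosSim  = CosDistance (u, v)  # cosine similarity between u and v
```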
Training operations
BatchNormalization (input, scale, bias, runMean, runInvStdDev, spatial, normalizationTimeConstant = 0, blendTimeConstant = 0, epsilon = 0.00001, useCntkEngine = true, imageLayout='CHW')
Dropout (x)
Stabilize (x, enabled=true)
StabilizeElements (x, inputDim=x.dim, enabled=true)
CosDistanceWithNegativeSamples (x, y, numShifts, numNegSamples)
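A brief sketch of the training-time operations; x, W1, and b1 are hypothetical nodes:

```
h1 = Dropout (Sigmoid (W1 * x + b1))   # randomly zeroes activations during training only
h2 = Stabilize (h1, enabled=true)      # multiplies by a learned scalar (self-stabilization)
```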
Reshaping operations
CNTK2.Reshape (x, shape, beginAxis=0, endAxis=0)
ReshapeDimension (x, axis, shape) = CNTK2.Reshape (x, shape, beginAxis=axis, endAxis=axis + 1)
FlattenDimensions (x, axis, num) = CNTK2.Reshape (x, 0, beginAxis=axis, endAxis=axis + num)
SplitDimension (x, axis, N) = ReshapeDimension (x, axis, 0:N)
Slice (beginIndex, endIndex, input, axis=1)
BS.Sequences.First (x) = Slice (0, 1, x, axis=-1)
BS.Sequences.Last (x) = Slice (-1, 0, x, axis=-1)
Splice (inputs, axis=1)
TransposeDimensions (x, axis1, axis2)
Transpose (x) = TransposeDimensions (x, 1, 2)
BS.Sequences.BroadcastSequenceAs (type, data1)
BS.Sequences.Gather (where, x)
BS.Sequences.Scatter (where, y)
BS.Sequences.IsFirst (x)
BS.Sequences.IsLast (x)
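A few illustrative uses, assuming x is a tensor of shape (100:10):

```
firstRows = Slice (0, 5, x, axis=1)                  # rows 0..4, shape (5:10)
flat      = FlattenDimensions (x, 1, 2)              # merge both axes, shape (1000)
doubled   = Splice (firstRows : firstRows, axis=1)   # concatenate along axis 1, shape (10:10)
firstStep = BS.Sequences.First (x)                   # first time step of the sequence
```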
Recurrence
OptimizedRNNStack (weights, input, hiddenDims, numLayers=1, bidirectional=false, recurrentOp='lstm')
BS.Loop.Previous (x, timeStep=1, defaultHiddenActivation=0)
PastValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Previous (0, shape, ...)
BS.Loop.Next (x, timeStep=1, defaultHiddenActivation=0)
FutureValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Next (0, shape, ...)
LSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, aux=BS.Constants.None, auxDim=aux.shape, prevState, enableSelfStabilization=false)
BS.Boolean.Toggle (clk, initialValue=BS.Constants.False)
BS.RNNs.RecurrentLSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, previousHook=BS.RNNs.PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputDim=0, layerIndex=0, enableSelfStabilization=false)
BS.RNNs.RecurrentLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.shape, previousHook=PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputShape=0, enableSelfStabilization=false)
BS.RNNs.RecurrentBirectionalLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.dim, previousHook=PreviousHC, nextHook=NextHC, enableSelfStabilization=false)
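A minimal sketch of a hand-written recurrence (the self-reference through PastValue() forms the loop) and of the cuDNN-backed stack; W, R, b, and Wrnn are hypothetical parameters:

```
hDim = 512
h = Sigmoid (W * x + R * PastValue (hDim, h) + b)   # simple recurrent layer

# multi-layer LSTM executed by the cuDNN RNN kernel (GPU only)
lstm = OptimizedRNNStack (Wrnn, x, 512, numLayers=2, recurrentOp='lstm')
```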
Sequence-to-sequence support
BS.Seq2Seq.CreateAugmentWithFixedWindowAttentionHook (attentionDim, attentionSpan, decoderDynamicAxis, encoderOutput, enableSelfStabilization=false)
BS.Seq2Seq.GreedySequenceDecoderFrom (modelAsTrained)
BS.Seq2Seq.BeamSearchSequenceDecoderFrom (modelAsTrained, beamDepth)
Special-purpose operations
ClassBasedCrossEntropyWithSoftmax (labelClassDescriptorVectorSequence, mainInputInfo, mainWeight, classLogProbsBeforeSoftmax)
Model editing
BS.Network.Load (pathName)
BS.Network.Edit (inputModel, editFunctions, additionalRoots)
BS.Network.CloneFunction (inputNodes, outputNodes, parameters="learnable" /*|"constant"|"shared"*/)
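An illustrative sketch of loading a trained model and reusing part of it as a frozen feature extractor; the file name and the node names (features, h2) are assumptions about the saved model:

```
pretrained = BS.Network.Load ("pretrained.dnn")

# clone everything between the named input and output nodes; freeze its parameters
featExtractor = BS.Network.CloneFunction (pretrained.features, pretrained.h2, parameters="constant")

features = Input (featDim)
h = featExtractor (features)        # the clone behaves like an ordinary BrainScript function
```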
Other
Fail (what)
IsSameObject (a, b)
Trace (node, say='', logFrequency=traceFrequency, logFirst=10, logGradientToo=false, onlyUpToRow=100000000, onlyUpToT=100000000, format=[])
Deprecated
ErrorPrediction (labels, nonNormalizedLogClassPosteriors)
ColumnElementTimes (...) = ElementTimes (...)
DiagTimes (...) = ElementTimes (...)
LearnableParameter(...) = Parameter(...)
LookupTable (embeddingMatrix, inputTensor)
RowRepeat (input, numRepeats)
RowSlice (beginIndex, numRows, input) = Slice(beginIndex, beginIndex + numRows, input, axis = 1)
RowStack (inputs)
RowElementTimes (...) = ElementTimes (...)
Scale (...) = ElementTimes (...)
ConstantTensor (scalarVal, shape)
Parameter (outputDim, inputDim, ...) = ParameterTensor ((outputDim:inputDim), ...)
WeightParam (outputDim, inputDim) = Parameter (outputDim, inputDim, init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
DiagWeightParam (outputDim) = ParameterTensor ((outputDim), init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
BiasParam (dim) = ParameterTensor ((dim), init='fixedValue', value=0.0)
ScalarParam() = BiasParam (1)
SparseInput (shape, dynamicAxis='', tag='feature')
ImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
SparseImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
MeanVarNorm(feat) = PerDimMeanVarNormalization(feat, Mean (feat), InvStdDev (feat))
PerDimMeanVarNormalization (x, mean, invStdDev), PerDimMeanVarDeNormalization (x, mean, invStdDev)
ReconcileDynamicAxis (dataInput, layoutInput)