Keras Embedding
The embedding layer is used as the initial layer in a model. It turns positive integer indexes into dense vectors of a fixed size, for example, [[4], [20]] -> [[0.25, 0.1], [0.6, -0.2]].
Example
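The sketch below shows a minimal usage of the layer; it assumes Keras 2.x (where Embedding still accepts the input_length argument), and the vocabulary size, dimensions, and random input data are illustrative values, not taken from the original example.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

model = Sequential()
# Map integer indexes in the range [0, 1000) to 64-dimensional dense vectors.
# input_length fixes each sequence to 10 timesteps (Keras 2.x argument).
model.add(Embedding(input_dim=1000, output_dim=64, input_length=10))

# A batch of 32 sequences, each containing 10 integer indexes.
input_array = np.random.randint(1000, size=(32, 10))

model.compile('rmsprop', 'mse')
output_array = model.predict(input_array)
print(output_array.shape)  # (32, 10, 64)
```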
Arguments
- input_dim: It is an integer greater than 0 that represents the size of the vocabulary, i.e., the maximum integer index + 1.
- output_dim: It is an integer greater than or equal to 0 that represents the dimensionality of the dense embedding.
- embeddings_initializer: It is the initializer for the embeddings matrix.
- embeddings_regularizer: It refers to a regularizer function that is applied to the embeddings matrix.
- activity_regularizer: It is a regularizer function that is applied to the output of the layer (its activation).
- embeddings_constraint: It is defined as a constraint function that is applied to the embeddings matrix.
- mask_zero: It states whether the input value 0 is a special "padding" value that should be masked out. This is useful with recurrent layers, which may take variable-length input. If it is set to True, all subsequent layers in the model have to support masking. As a consequence, when mask_zero is True, index 0 cannot be used in the vocabulary, so input_dim has to equal the size of the vocabulary + 1 (see the sketch after this list).
- input_length: It defines the length of the input sequences when that length is constant. This argument is required when the Embedding layer is followed by a Flatten and then a Dense layer, because without it the shape of the dense outputs cannot be computed.
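The sketch below illustrates mask_zero with padded sequences feeding a recurrent layer; the vocabulary size, layer sizes, and example sequences are assumed values.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

vocab_size = 5000  # assumed vocabulary size

model = Sequential()
# With mask_zero=True, index 0 is reserved for padding, so input_dim
# must be the vocabulary size + 1 and real tokens start at index 1.
model.add(Embedding(input_dim=vocab_size + 1, output_dim=32, mask_zero=True))
# LSTM supports masking, so padded timesteps (index 0) are skipped.
model.add(LSTM(16))
model.add(Dense(1, activation='sigmoid'))

# Two sequences padded with 0 to a common length of 6.
padded = np.array([[12, 7, 303, 0, 0, 0],
                   [45, 2, 9, 88, 13, 1]])
print(model.predict(padded).shape)  # (2, 1)
```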
Input shape
It is a 2D tensor of shape (batch_size, sequence_length).
Output shape
It is a 3D tensor of shape (batch_size, sequence_length, output_dim).
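As a quick check of these shapes, the layer can be called directly on a 2D batch of integer indexes; the batch size, sequence length, and output_dim below are assumed values.

```python
import numpy as np
from keras.layers import Embedding

layer = Embedding(input_dim=100, output_dim=16)
batch = np.random.randint(100, size=(8, 20))  # 2D input: (batch_size, sequence_length)
embedded = layer(batch)                       # 3D output: (batch_size, sequence_length, output_dim)
print(embedded.shape)  # (8, 20, 16)
```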