CNNGPT

CNN-Based Language Model

Detailed Explanation of Each Step

Initialization Parameters

  • vocab_size: The size of the vocabulary (number of unique tokens).
  • embedding_dim: The dimension of the embeddings.
  • num_layers: The number of convolutional layers.
  • kernel_size: The size of the convolutional kernels.
  • hidden_dim: The dimension of the hidden representations (should match embedding_dim for residual connections).
  • max_seq_len: The maximum sequence length the model can handle.
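
A minimal constructor sketch showing how these hyperparameters might be wired together. The class name, attribute names, and the power-of-two dilation schedule are illustrative assumptions, not the package's actual code:

```python
import torch
import torch.nn as nn

class CNNLanguageModel(nn.Module):
    # Hypothetical skeleton; names and the 2**i dilation schedule are assumptions.
    def __init__(self, vocab_size, embedding_dim, num_layers, kernel_size,
                 hidden_dim, max_seq_len):
        super().__init__()
        assert embedding_dim == hidden_dim, "required for the residual connections"
        self.token_emb = nn.Embedding(vocab_size, embedding_dim)
        # Learnable positional embeddings, one vector per position up to max_seq_len.
        self.pos_emb = nn.Parameter(torch.zeros(1, max_seq_len, embedding_dim))
        # Each causal conv doubles the channels so GLU can split them into value/gate halves.
        self.convs = nn.ModuleList([
            nn.Conv1d(hidden_dim, 2 * hidden_dim, kernel_size, dilation=2 ** i)
            for i in range(num_layers)
        ])
        self.norms = nn.ModuleList([nn.LayerNorm(hidden_dim) for _ in range(num_layers)])
        self.out_proj = nn.Linear(hidden_dim, vocab_size)

model = CNNLanguageModel(vocab_size=1000, embedding_dim=64, num_layers=4,
                         kernel_size=3, hidden_dim=64, max_seq_len=128)
```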

Embedding and Positional Encoding

  • Embeddings: Converts token IDs into dense vectors.
  • Positional Encoding: Adds a learnable positional embedding to each token embedding (see the sketch below).
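
A small, self-contained sketch of this step (sizes are illustrative):

```python
import torch
import torch.nn as nn

vocab_size, embedding_dim, max_seq_len = 1000, 64, 128              # illustrative sizes
token_emb = nn.Embedding(vocab_size, embedding_dim)
pos_emb = nn.Parameter(torch.zeros(1, max_seq_len, embedding_dim))  # learnable positions

token_ids = torch.randint(0, vocab_size, (2, 16))                   # [batch_size, seq_len]
x = token_emb(token_ids) + pos_emb[:, :token_ids.size(1), :]        # slice positions to seq_len
print(x.shape)                                                      # torch.Size([2, 16, 64])
```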

Convolutional Blocks

  • Causal Convolution: Uses padding on the left to ensure that the convolution at time t does not depend on future time steps.
  • Dilation: Expands the receptive field exponentially, allowing the model to capture long-term dependencies.
  • GLU Activation: Introduces a gating mechanism that can control the flow of information.
    • The output of the convolution is split into two halves along the channel dimension.
    • One half is passed through a sigmoid function to act as a gate for the other half.
  • Layer Normalization: Normalizes the outputs to improve training stability.
  • Residual Connections: Adds the input to the output to facilitate training deeper networks.
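
Putting these pieces together, one plausible block looks like the following sketch. The class name `ConvBlock` and its exact layout are assumptions; the package's actual implementation may differ:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvBlock(nn.Module):
    # Hypothetical causal conv block: dilated Conv1d -> GLU -> LayerNorm -> residual.
    def __init__(self, dim, kernel_size, dilation):
        super().__init__()
        self.left_pad = (kernel_size - 1) * dilation            # pad only on the left
        self.conv = nn.Conv1d(dim, 2 * dim, kernel_size, dilation=dilation)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):                                       # x: [batch, dim, seq_len]
        residual = x
        x = F.pad(x, (self.left_pad, 0))                        # causal left padding
        x = self.conv(x)                                        # [batch, 2*dim, seq_len]
        x = F.glu(x, dim=1)                                     # gate one half with the other
        x = self.norm(x.transpose(1, 2)).transpose(1, 2)        # LayerNorm over the feature dim
        return x + residual                                     # residual connection

block = ConvBlock(dim=64, kernel_size=3, dilation=2)
y = block(torch.randn(2, 64, 16))
print(y.shape)                                                  # torch.Size([2, 64, 16])
```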

Output Layer

  • Projection: Maps the final hidden states to the vocabulary space to produce logits for each token.
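
For example, the projection is just a linear layer applied over the feature dimension (illustrative sizes):

```python
import torch
import torch.nn as nn

hidden_dim, vocab_size = 64, 1000
out_proj = nn.Linear(hidden_dim, vocab_size)

hidden = torch.randn(2, 16, hidden_dim)          # [batch_size, seq_len, hidden_dim]
logits = out_proj(hidden)                        # [batch_size, seq_len, vocab_size]
print(logits.shape)                              # torch.Size([2, 16, 1000])
```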

Handling Tensor Sizes

Throughout the network, we carefully manage tensor shapes to maintain consistency:

  • After embedding and positional encoding: [batch_size, seq_len, embedding_dim]
  • Before convolution: Transposed to [batch_size, embedding_dim, seq_len]
  • After convolution and GLU: [batch_size, hidden_dim, seq_len]
  • After layer normalization and residual connection: same shape as the convolution input, [batch_size, hidden_dim, seq_len], so the residual can be added elementwise.
  • Before output layer: Transposed back to [batch_size, seq_len, hidden_dim]
  • Output logits: [batch_size, seq_len, vocab_size]
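
The single-layer forward pass below traces these shapes end to end (a sketch with illustrative sizes, not the package's forward method):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sizes; the shape at each stage matches the list above.
B, T, V, D, K = 2, 16, 1000, 64, 3                       # batch, seq_len, vocab, dims, kernel
token_emb = nn.Embedding(V, D)
pos_emb = nn.Parameter(torch.zeros(1, 128, D))
conv = nn.Conv1d(D, 2 * D, K, dilation=1)
norm = nn.LayerNorm(D)
out_proj = nn.Linear(D, V)

ids = torch.randint(0, V, (B, T))
x = token_emb(ids) + pos_emb[:, :T, :]                   # [B, T, D]  embeddings + positions
x = x.transpose(1, 2)                                    # [B, D, T]  channels-first for Conv1d
h = F.glu(conv(F.pad(x, (K - 1, 0))), dim=1)             # [B, D, T]  causal conv + GLU
h = norm(h.transpose(1, 2)).transpose(1, 2) + x          # [B, D, T]  LayerNorm + residual
logits = out_proj(h.transpose(1, 2))                     # [B, T, V]  back to channels-last
print(logits.shape)                                      # torch.Size([2, 16, 1000])
```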

Important Notes

  • Causality: By appropriately padding and slicing the convolution outputs, we ensure that the model does not use future information when predicting the current time step.
  • Residual Connections: The embedding_dim and hidden_dim must be equal to correctly add the residual connection.
  • Layer Normalization: Applied over the feature dimension; we transpose the tensor to [batch_size, seq_len, hidden_dim] before applying LayerNorm.
  • GLU Activation Function: The gating mechanism enhances the model's capacity to model complex patterns.
  • Flexibility: The model can handle sequences shorter than max_seq_len; positional encodings are sliced accordingly.
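
The causality point can be checked directly: with left-only padding, perturbing future positions leaves earlier outputs unchanged (an illustrative check, not part of the package):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
conv = nn.Conv1d(8, 8, kernel_size=3, dilation=2)
pad = (3 - 1) * 2                                        # (kernel_size - 1) * dilation

x = torch.randn(1, 8, 10)
y = conv(F.pad(x, (pad, 0)))

x_future = x.clone()
x_future[:, :, 7:] += 1.0                                # perturb only positions t >= 7
y_future = conv(F.pad(x_future, (pad, 0)))

print(torch.allclose(y[:, :, :7], y_future[:, :, :7]))   # True: earlier outputs unchanged
```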

Conclusion

We have successfully translated the detailed algorithm into a PyTorch implementation, carefully following each step and ensuring that the code aligns with the design principles outlined earlier. This CNN-based language model leverages causal and dilated convolutions, gated activations, residual connections, and layer normalization to effectively model textual data for generation tasks.
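
As a usage sketch, greedy autoregressive generation with such a model could look like the following. The `generate` helper and the stand-in model are hypothetical; any module mapping [batch_size, seq_len] token IDs to [batch_size, seq_len, vocab_size] logits would work:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def generate(model, prompt_ids, max_new_tokens, max_seq_len):
    # Greedy decoding: repeatedly append the most likely next token.
    ids = prompt_ids
    for _ in range(max_new_tokens):
        logits = model(ids[:, -max_seq_len:])             # respect the model's context window
        next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=1)
    return ids

# Stand-in model for demonstration only (an embedding followed by a linear head).
vocab_size = 100
dummy = nn.Sequential(nn.Embedding(vocab_size, 32), nn.Linear(32, vocab_size))
out = generate(dummy, torch.tensor([[1, 2, 3]]), max_new_tokens=5, max_seq_len=16)
print(out.shape)                                          # torch.Size([1, 8])
```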

By understanding each component and its role in the model, we can appreciate how this architecture captures both local and global dependencies in language, offering a powerful alternative to traditional models in natural language processing.
