This repository contains implementations of two language models:
- Bigram Language Model (bigram.py) – A simple neural network-based bigram model.
- Transformer-based GPT Model (gpt.py) – A more advanced language model using self-attention.
Key features:
- Implements a basic bigram language model with a learned embedding table.
- Implements a Transformer-based language model inspired by GPT.
- Uses PyTorch for model training and inference.
- Includes text generation capabilities.
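The neural model in bigram.py learns next-token probabilities from an embedding table; the underlying statistical idea can be sketched with a simple count-based character bigram model. This is a pure-Python illustration (no PyTorch), and all names here are illustrative, not taken from bigram.py:

```python
from collections import Counter, defaultdict
import random

def train_bigram(text):
    """Count character-bigram frequencies and normalize each row to probabilities."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

def generate(model, start, length, seed=0):
    """Sample each next character from P(next | current)."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        probs = model.get(out[-1])
        if not probs:  # no observed continuation for this character
            break
        chars, weights = zip(*probs.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)
```

The neural version replaces the count table with a `vocab_size x vocab_size` embedding lookup trained by gradient descent, but the generation loop (sample the next token conditioned only on the current one) is the same idea.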
Ensure you have Python installed along with the required dependencies:
pip install torch numpy

To train the simple bigram model and generate text, run:

python bigram.py

To train and generate text with the Transformer-based model, run:

python gpt.py
- bigram.py: Implements the Bigram Language Model.
- gpt.py: Implements a Transformer-based GPT-style model.
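The Transformer model relies on scaled dot-product self-attention: each position scores every position against its query, softmaxes the scores, and takes a weighted average of the values. A minimal pure-Python sketch of that core operation (omitting the causal masking, multiple heads, and learned projections a full GPT block adds; names are illustrative):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention over lists of vectors (T x d each)."""
    d = len(Q[0])
    out = []
    for q in Q:
        # Score the query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out
```

In the PyTorch version these loops become batched matrix multiplications, and a causal mask sets future positions' scores to negative infinity so each token attends only to the past.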