SMILES GPT (SGPT) for chemical design through reinforcement learning
SARAN: Shallow Auto-Regressive Attention Network
mychatgpt is a small Python package that provides utilities for building conversational agents on top of OpenAI's GPT models. It lets users chat interactively with a GPT model while keeping track of the chat history, which makes it handy as a Copilot-style agent in Python projects.
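The package's own API is not shown on this page; purely as a rough illustration, the chat-with-history pattern such a wrapper builds on looks something like the following with the official openai client (the model name and system prompt are placeholder assumptions):

```python
# Minimal sketch of a chat loop that keeps history; not mychatgpt's actual API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system", "content": "You are a helpful coding copilot."}]  # assumed prompt

def chat(user_message: str) -> str:
    """Send a message and append both sides of the exchange to the history."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # illustrative model choice, not prescribed by the package
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Summarize what a decoder-only transformer is."))
```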
A pure Rust GPT implementation from scratch.
An implementation of a Generative Pre-trained Transformer (GPT) model that compares the performance and effectiveness of Kolmogorov-Arnold Network (KAN) layers against traditional multilayer perceptron (MLP) layers within the architecture.
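The repository's code is not reproduced here; the sketch below only illustrates the core idea of such a comparison, swapping the MLP feed-forward sublayer of a transformer block for a KAN-style layer. The simplified radial-basis parameterisation stands in for the B-spline edges of the KAN paper and is an assumption.

```python
# Rough sketch: MLP feed-forward sublayer vs. a toy KAN-style replacement.
import torch
import torch.nn as nn

class SimpleKANLayer(nn.Module):
    """Toy KAN-style layer: every edge applies a learnable combination of fixed
    radial basis functions to its scalar input (simplified stand-in for B-splines)."""
    def __init__(self, in_dim: int, out_dim: int, n_basis: int = 8):
        super().__init__()
        self.register_buffer("centers", torch.linspace(-2.0, 2.0, n_basis))
        self.coeffs = nn.Parameter(0.1 * torch.randn(out_dim, in_dim, n_basis))

    def forward(self, x: torch.Tensor) -> torch.Tensor:            # x: (..., in_dim)
        phi = torch.exp(-(x.unsqueeze(-1) - self.centers) ** 2)    # (..., in_dim, n_basis)
        return torch.einsum("...ib,oib->...o", phi, self.coeffs)   # (..., out_dim)

def feed_forward(embed_dim: int, hidden_dim: int, use_kan: bool) -> nn.Module:
    """The sublayer being compared: classic MLP vs. two stacked KAN-style layers."""
    if use_kan:
        return nn.Sequential(SimpleKANLayer(embed_dim, hidden_dim),
                             SimpleKANLayer(hidden_dim, embed_dim))
    return nn.Sequential(nn.Linear(embed_dim, hidden_dim), nn.GELU(),
                         nn.Linear(hidden_dim, embed_dim))

x = torch.randn(2, 16, 64)                              # (batch, tokens, embed_dim)
print(feed_forward(64, 256, use_kan=True)(x).shape)     # torch.Size([2, 16, 64])
```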
This project implements GPT-1 using PyTorch, focusing on foundational transformer architectures for natural language processing tasks.
nanoGPT model from scratch
ToyGPT, inspired by Andrej Karpathy's GPT-from-scratch tutorial, builds a toy generative pre-trained transformer at its most basic level, using a simple bigram language model with attention, to teach the basics of creating an LLM from scratch.
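As a rough sketch only (not ToyGPT's actual code), the bigram starting point of that tutorial style can be written in a few lines of PyTorch:

```python
# Minimal bigram language model: each token directly looks up next-token logits.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    def __init__(self, vocab_size: int):
        super().__init__()
        self.token_embedding = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        logits = self.token_embedding(idx)                 # (B, T, vocab_size)
        loss = None
        if targets is not None:
            B, T, C = logits.shape
            loss = F.cross_entropy(logits.view(B * T, C), targets.view(B * T))
        return logits, loss

    @torch.no_grad()
    def generate(self, idx, max_new_tokens: int):
        for _ in range(max_new_tokens):
            logits, _ = self(idx)
            probs = F.softmax(logits[:, -1, :], dim=-1)    # last time step only
            idx = torch.cat([idx, torch.multinomial(probs, 1)], dim=1)
        return idx

model = BigramLanguageModel(vocab_size=65)
print(model.generate(torch.zeros((1, 1), dtype=torch.long), max_new_tokens=20).shape)
```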
This notebook builds a complete GPT (Generative Pre-trained Transformer) model from scratch using PyTorch. It covers tokenization, self-attention, multi-head attention, transformer blocks, and text generation, all explained step by step with a simple nursery rhyme corpus.
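To make the self-attention step concrete, here is a minimal sketch (not the notebook's exact code) of one masked self-attention head, the unit such notebooks stack into multi-head attention and transformer blocks:

```python
# One head of causal (masked) self-attention, sketched in PyTorch.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttentionHead(nn.Module):
    def __init__(self, embed_dim: int, head_dim: int, block_size: int):
        super().__init__()
        self.key = nn.Linear(embed_dim, head_dim, bias=False)
        self.query = nn.Linear(embed_dim, head_dim, bias=False)
        self.value = nn.Linear(embed_dim, head_dim, bias=False)
        # Lower-triangular mask so each position only attends to earlier tokens
        self.register_buffer("mask", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):                                   # x: (B, T, embed_dim)
        B, T, _ = x.shape
        k, q, v = self.key(x), self.query(x), self.value(x)
        att = q @ k.transpose(-2, -1) / math.sqrt(k.size(-1))      # (B, T, T)
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        return att @ v                                       # (B, T, head_dim)

head = CausalSelfAttentionHead(embed_dim=64, head_dim=16, block_size=32)
print(head(torch.randn(2, 32, 64)).shape)                    # torch.Size([2, 32, 16])
```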
This repository serves as a collection of various artificial intelligence projects or experiments I have done, whether academic or personal. See README for more information.
Implementation of the decoder part of the transformer architecture for a generative pre-trained transformer (GPT).
Christopher, a software simulator of a Chat GPT "Generative Pretrained Transformer".
Implementations of GPT, decoders, LSTMs, LoRA, layer and batch normalization, LLMs, FFNNs, attention mechanisms, and transformers.