LLM-Bootcamp

The Large Language Model (LLM) Bootcamp is designed from a real-world perspective, following the data processing, development, and deployment pipeline paradigm. Attendees walk through the workflow of preprocessing a multi-turn conversational dataset for the summarization task and fine-tuning a state-of-the-art (SOTA) LLM on it using NeMo-Run. Attendees also learn to optimize the fine-tuned model and apply prompt engineering techniques to solve complex real-world tasks. Finally, an AI Assistant customer-care challenge tests attendees' understanding of the material and solidifies their experience in the text generation domain.
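As a rough illustration of the preprocessing step described above, the sketch below flattens a multi-turn conversation into a single prompt/target pair suitable for summarization fine-tuning. The turn structure, field names, and prompt wording here are illustrative assumptions, not the bootcamp's actual dataset schema.

```python
# Sketch: flatten a multi-turn conversation into one summarization example.
# The (speaker, utterance) format and "input"/"output" keys are assumptions
# for illustration, not the schema used in the bootcamp labs.

def dialogue_to_example(turns, summary):
    """Flatten a list of (speaker, utterance) turns into one training example."""
    transcript = "\n".join(f"{speaker}: {utterance}" for speaker, utterance in turns)
    return {
        "input": f"Summarize the following conversation:\n{transcript}",
        "output": summary,
    }

turns = [
    ("Customer", "My order arrived damaged."),
    ("Agent", "Sorry to hear that. I can send a replacement today."),
    ("Customer", "That works, thank you."),
]
example = dialogue_to_example(
    turns, "Agent offers a replacement for a damaged order."
)
print(example["input"])
```

A real pipeline would also handle tokenization limits, speaker anonymization, and train/validation splits, which the lab covers on the actual dataset.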

Bootcamp Content

The bootcamp consists of three labs plus a challenge notebook:

  • Lab 1: Preprocessing Multi-turn Conversational Dataset
  • Lab 2: Building a Text Summarization Model With NeMo-Run
  • Lab 3: Prompt Engineering Techniques and Test-time Scaling (TTS)
  • Challenge Lab: AI-Powered Chat Summarization for Customer Care Efficiency
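Lab 3's test-time scaling theme can be illustrated with best-of-N sampling, one common TTS technique: sample several candidate responses and keep the one a verifier scores highest. Everything below, including the generator, the scorer, and the function names, is a stand-in sketch under assumed interfaces, not an API from the bootcamp material.

```python
import random

def generate_candidates(prompt, n, seed=0):
    # Stand-in for sampling N diverse responses from an LLM at temperature > 0.
    rng = random.Random(seed)
    return [f"{prompt} [sample {i}, noise {rng.random():.2f}]" for i in range(n)]

def score(candidate):
    # Stand-in verifier / reward model: a toy heuristic preferring shorter,
    # more direct candidates. A real setup would use a learned scorer.
    return -len(candidate)

def best_of_n(prompt, n=4):
    # Test-time scaling: spend more compute at inference (N samples) to
    # improve output quality, without changing the model weights.
    candidates = generate_candidates(prompt, n)
    return max(candidates, key=score)

print(best_of_n("Summarize the chat"))
```

The design point is that quality improves with inference-time compute (larger N) rather than additional training, which is what distinguishes TTS from the fine-tuning covered in Lab 2.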

Tools and Frameworks

The Bootcamp material is built primarily on NVIDIA NeMo-Run; the full software stack is listed in the deployment guide.

Tutorial duration

The Bootcamp material takes approximately 3 hours and 30 minutes, while the challenge part is expected to be completed within 6 hours.

Deploying the Bootcamp Material

To deploy the Labs, please refer to the deployment guide presented here

Attribution

This material originates from the OpenHackathons GitHub repository. Check out additional materials here

Don't forget to check out additional Open Hackathons Resources and join our OpenACC and Hackathons Slack Channel to share your experience and get more help from the community.

Licensing

Copyright © 2025 OpenACC-Standard.org. This material is released by OpenACC-Standard.org, in collaboration with NVIDIA Corporation, under the Creative Commons Attribution 4.0 International (CC BY 4.0). These materials may include references to hardware and software developed by other entities; all applicable licensing and copyrights apply.
