Fine-tuning DistilGPT2 and Generating Text

We have some really exciting stuff to cover this week! Even folks who aren't into the in-depth side of AI should be interested to see how well this model can create text from a simple prompt. We will cover the process used to fine-tune the model on a collection of SBIR topics with freely available resources, along with the challenges we ran into along the way.
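To give a feel for what the fine-tuning step looks like, here is a minimal sketch using the Hugging Face Trainer API. The file name sbir_topics.txt, the output directory, and the hyperparameters are placeholders for illustration, not the exact settings from our run:

```python
# Minimal sketch: fine-tune DistilGPT2 on a plain-text file of SBIR topics.
# "sbir_topics.txt" and the hyperparameters below are placeholder assumptions.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForCausalLM,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 models have no pad token
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Load the raw topic text and tokenize it.
dataset = load_dataset("text", data_files={"train": "sbir_topics.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Causal language modeling objective (the collator builds the shifted labels).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="distilgpt2-sbir",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    save_strategy="epoch",
)

trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized, data_collator=collator)
trainer.train()
trainer.save_model("distilgpt2-sbir")
```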

If you need a refresher on Hugging Face, here's a recording of the session we did back in March: Intro to Hugging Face.

We will finish off the night with a demonstration of generating the full text of SBIR topics.
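For reference, the generation step can be as simple as the sketch below. The model path "distilgpt2-sbir" and the prompt are hypothetical placeholders, and the sampling settings are just reasonable defaults rather than the exact ones used in the demo:

```python
# Sketch of generating SBIR-style topic text from a fine-tuned checkpoint.
# "distilgpt2-sbir" and the prompt string are placeholder assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2-sbir")

prompt = "TOPIC TITLE: "
outputs = generator(prompt, max_length=200, num_return_sequences=1,
                    do_sample=True, top_p=0.95, temperature=0.8)
print(outputs[0]["generated_text"])
```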

Links:

HuggingFace DistilGPT2
DistilBERT Paper