• Validated claims made in the paper The Curious Case of Neural Text Degeneration (https://arxiv.org/pdf/1904.09751.pdf).
• Explored the coherence and diversity of generated text under top-p (nucleus) sampling, beam search, and pure sampling.
• Applied the aforementioned decoding strategies to a pretrained transformer-based GPT-2 model to demonstrate the superior performance of nucleus sampling.
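The core idea of top-p (nucleus) sampling is to sample only from the smallest set of tokens whose cumulative probability exceeds a threshold p, renormalizing within that set. A minimal NumPy sketch of that idea is below; this is an illustrative reimplementation, not the paper's or the project's reference code, and the function name and signature are hypothetical.

```python
import numpy as np

def nucleus_sample(logits, p=0.9, rng=None):
    """Top-p (nucleus) sampling over a vector of logits.

    Samples from the smallest set of tokens whose cumulative
    probability mass reaches p, with probabilities renormalized
    within that set. Illustrative sketch only.
    """
    if rng is None:
        rng = np.random.default_rng()
    # Softmax over logits (shifted by the max for numerical stability).
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Sort token indices by descending probability.
    order = np.argsort(-probs)
    cum = np.cumsum(probs[order])
    # Smallest prefix whose cumulative mass reaches p (keep at least one token).
    cutoff = int(np.searchsorted(cum, p)) + 1
    nucleus = order[:cutoff]
    nucleus_probs = probs[nucleus] / probs[nucleus].sum()
    return int(rng.choice(nucleus, p=nucleus_probs))
```

Unlike pure sampling, this truncates the unreliable low-probability tail; unlike beam search, it keeps enough randomness to avoid the repetitive, low-diversity text the paper identifies as neural text degeneration.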