More talks in the program:
11:15 - 12:00
In this talk, we will cover how to model different natural language processing tasks. Current NLP tasks such as word- or sentence-level classification, sentence generation, and question answering share a common challenge: training models with only a small amount of domain-specific data. The key solution is to use pre-trained models and transfer learning. BERT from Google and MT-DNN from Microsoft have set new benchmarks across these tasks in recent years. Understanding how to apply transfer learning and multi-task learning is key to building a model for the task at hand. In this talk, we will discuss popular transfer-learning models such as ULMFiT, GPT, and BERT, and then analyse how multi-task learning can substantially improve results, along with the different ways of doing multi-task learning.
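The core idea the abstract describes can be sketched in a few lines: reuse a frozen pre-trained encoder and train only a small new head on the downstream task. The sketch below is illustrative, not from the talk; the tiny hand-made word vectors stand in for representations that would in practice come from a large pre-trained model such as BERT, and the toy sentiment dataset is invented for the example.

```python
import math
import random

random.seed(0)

# Hypothetical frozen "pre-trained" word vectors (stand-ins for BERT-style
# representations learned on a large corpus).
PRETRAINED = {
    "great": [1.0, 0.2], "good": [0.9, 0.1], "love": [1.1, 0.3],
    "bad": [-1.0, 0.2], "awful": [-1.1, 0.1], "hate": [-0.9, 0.3],
}

def encode(sentence):
    """Frozen encoder: average the pre-trained vectors of known words."""
    vecs = [PRETRAINED[w] for w in sentence.split() if w in PRETRAINED]
    if not vecs:
        return [0.0, 0.0]
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(2)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny invented dataset for the downstream task (sentiment classification).
train = [("i love this", 1), ("great good", 1),
         ("awful bad", 0), ("i hate it", 0)]

# Transfer learning step: train ONLY the new classification head (w, b);
# the encoder and its pre-trained vectors stay frozen.
w, b = [0.0, 0.0], 0.0
lr = 0.5
for _ in range(200):
    for sent, y in train:
        x = encode(sent)
        p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        g = p - y  # gradient of the log-loss w.r.t. the logit
        w = [w[i] - lr * g * x[i] for i in range(2)]
        b -= lr * g

def predict(sent):
    x = encode(sent)
    return sigmoid(w[0] * x[0] + w[1] * x[1] + b)

print(predict("good great"))  # high score: positive
print(predict("bad awful"))   # low score: negative
```

Multi-task learning, which the talk covers next, extends the same picture: several task-specific heads share one encoder, so each task benefits from signal the others provide.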