The Conference for Machine Learning Innovation

Multi Tasking Deep Learning for Natural Language Processing – Transfer Learning

Session
Join the ML Revolution!
Register until October 20:
✓ Save up to $233
✓ Team discount
✓ Extra Specials for Freelancers
Register Now
Infos

In this talk, we will cover how to model different natural language processing tasks. In current NLP tasks such as word- or sentence-level classification, sentence generation, and question answering, it is a challenge to train models with little domain-specific data. The key solution is to take a pre-trained model and apply transfer learning. BERT from Google and MT-DNN from Microsoft have been breaking established benchmarks in recent years. Understanding how to use transfer learning and multi-task learning is key to building a model for the task at hand. In this talk, we will discuss models like ULMFiT, GPT, and BERT, which are popular for transfer learning, and then analyze how multi-task learning can substantially improve results, along with different ways of doing it.
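The core idea behind multi-task setups like MT-DNN is a shared encoder whose parameters are trained on several tasks at once, with a small task-specific head on top for each task. The sketch below is a minimal toy illustration of that architecture, not the actual MT-DNN code; all names (`MultiTaskModel`, the GRU encoder, the two heads) are hypothetical, and a real system would use a Transformer encoder such as BERT.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """Toy multi-task model: a shared encoder plus one head per task."""

    def __init__(self, vocab_size=1000, hidden=64, n_classes=2, n_tags=5):
        super().__init__()
        # Shared layers: updated by gradients from every task,
        # which is where the cross-task transfer happens.
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        # Task-specific heads: sentence classification and token tagging.
        self.cls_head = nn.Linear(hidden, n_classes)
        self.tag_head = nn.Linear(hidden, n_tags)

    def forward(self, token_ids, task):
        states, last = self.encoder(self.embed(token_ids))
        if task == "classify":
            # last: (1, batch, hidden) -> one label per sentence
            return self.cls_head(last.squeeze(0))
        # states: (batch, seq, hidden) -> one label per token
        return self.tag_head(states)

model = MultiTaskModel()
batch = torch.randint(0, 1000, (3, 7))  # 3 sentences, 7 tokens each
print(model(batch, "classify").shape)   # (3, 2): sentence classification
print(model(batch, "tag").shape)        # (3, 7, 5): token tagging
```

During training, one typically alternates mini-batches between tasks, summing or interleaving the per-task losses so the shared encoder learns representations useful to all of them.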

This Session — take me to the current program of Singapore, Berlin, or Munich.
