Advanced AI: Modern Natural Language Processing with Deep Learning
The main goal of this course is to provide a comprehensive account of recent advances in deep learning applied to NLP. The session presents the state of the art in NLP-centric deep learning research and focuses on the role deep learning plays in major NLP applications, including spoken language understanding, dialogue systems, lexical analysis, parsing, knowledge graphs, machine translation, question answering, sentiment analysis, social computing, and natural language generation (from images).
This is an advanced course on natural language processing. The focus will be on a few of the most important techniques:
- Neural Machine Translation (NMT)
- Attention
- Transformers
- Bidirectional Encoder Representations from Transformers (BERT)
- GPT
- XLNet
These are the most important techniques used in modern NLP, and they underlie most NLP tasks. Various small case studies will be undertaken, the most important being machine translation. Unlike many earlier models, these techniques are end-to-end models that can be applied to many use cases: the models are trained differently for each task, but the models themselves need not be changed. They are genuinely complex models, and getting them to train correctly, or even to use correctly, is difficult. Understanding their inner workings is therefore crucial.
Refer to the case-study section to see what can be done.
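To make the reuse point above concrete, here is a minimal sketch, assuming the open-source Hugging Face transformers library (which the course text does not name): one pretrained BERT checkpoint serves two different tasks simply by attaching different task heads. The checkpoint name and example sentence are illustrative choices, not part of the course material.

```python
# Minimal sketch: reuse one pretrained BERT encoder for two tasks.
# Assumes the Hugging Face `transformers` library (with PyTorch) is installed.
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    AutoModelForQuestionAnswering,
)

checkpoint = "bert-base-uncased"  # illustrative checkpoint choice
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Sentiment analysis: a classification head on top of the pretrained encoder.
sentiment_model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=2
)

# Question answering: a span-prediction head on top of the *same* encoder.
qa_model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)

# Tokenise once and feed the classifier; only the fine-tuning data and the
# task head differ between the two models, not the underlying architecture.
inputs = tokenizer("The translation quality was excellent.", return_tensors="pt")
logits = sentiment_model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2]) -> one score per sentiment class
```

The same pattern extends to GPT- and XLNet-style checkpoints: the pretrained body stays fixed in form, and only the task-specific head and fine-tuning procedure change.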
The rest of the training is divided between Engineering (building a data processing pipeline from scratch) and Algorithms (linear and logistic regression). It also covers best practices in machine learning (bias/variance theory), the innovation process in machine learning and AI, and practical examples to take home and practice.
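As a small sketch of that engineering-plus-algorithms split (using scikit-learn and one of its bundled toy datasets, neither of which is prescribed by the course text), the example below wires a preprocessing step and logistic regression into a single pipeline:

```python
# Minimal sketch: a data processing pipeline feeding logistic regression.
# The dataset and hyperparameters are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Preprocessing and model live in one pipeline, so the scaler is fitted on
# the training split only -- a basic guard against leakage when estimating
# generalisation error (the bias/variance discussion builds on this).
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipe.fit(X_train, y_train)
print("held-out accuracy:", pipe.score(X_test, y_test))
```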
Who should attend:
- Data scientists with a technical background in computation
- Post-doctoral researchers and industrial researchers
- Educators
- Anyone interested in getting up to speed with the latest deep learning techniques for NLP
Key skills covered:
- Understand the encoder-decoder architecture
- Understand neural machine translation
- Have an awareness of the hardware issues inherent in implementing scalable neural network models for language data
- Understand attention (see the sketch after this list)
- Be able to derive and implement optimisation algorithms for these models
- Be able to implement and evaluate common neural network models for language
- Understand neural implementations of attention mechanisms and sequence embedding models, and how these modular components can be combined to build state-of-the-art NLP systems
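As referenced in the attention item above, the following is a minimal NumPy sketch of scaled dot-product attention, the core operation behind the encoder-decoder, Transformer, and BERT models listed earlier; the array shapes and variable names are illustrative assumptions rather than course material.

```python
# Minimal sketch of scaled dot-product attention with NumPy.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v) -> output of shape (n_q, d_v)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # query/key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 8))    # e.g. 2 decoder positions (queries)
K = rng.normal(size=(5, 8))    # e.g. 5 encoder positions (keys)
V = rng.normal(size=(5, 16))   # values carried by the encoder positions
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 16)
```

In an encoder-decoder translation model, each decoder position forms the queries and the encoder outputs provide the keys and values, which is how the decoder attends to the source sentence.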
Key skills covered:
- Measuring and tuning the performance of ML algorithms (see the sketch after this list)
- The most effective machine learning techniques
- Use tools like Scikit for ML tasks
- Best practices in innovation as it pertains to machine learning and AI
- You'll not only learn the theoretical underpinnings of learning, but also gain the practical know-how needed to quickly and powerfully apply these techniques to new problems
- You will learn how to prototype and then productionize
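As referenced in the performance-tuning item above, here is a minimal sketch of measuring and tuning a model with cross-validated grid search, taking "Scikit" to mean scikit-learn (an assumption); the dataset and parameter grid are illustrative.

```python
# Minimal sketch: cross-validated grid search over a regularization strength.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_digits(return_X_y=True)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=2000)),
])

# 5-fold cross-validation over a small grid of C values; best_score_ is the
# mean cross-validated accuracy of the best setting.
grid = GridSearchCV(pipe, {"clf__C": [0.01, 0.1, 1.0, 10.0]}, cv=5)
grid.fit(X, y)
print("best C:", grid.best_params_["clf__C"])
print("best cross-validated accuracy:", grid.best_score_)
```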
Pre-requisites:
- Solid knowledge of linear and logistic regression
- Good knowledge of machine learning concepts such as pipelines, grid search, randomized search, error curves, normalization techniques, etc.
- Working knowledge of Python
- Cursory knowledge of deep neural networks
SOFTWARE
We provide training and consulting on AI, and more specifically on deep learning, and can work with your in-house team. We also work in DevOps mode, if needed, to design, develop, and maintain the AI solution. We deliver value by understanding your use cases and providing end-to-end strategy and implementation.
It is very important to use open-source tools and stay away from proprietary AI platforms. AI is still a moving target: by locking in to an AI platform, you risk wasting a lot of time and resources when new AI technologies appear. We only use open-source tools and help you build AI and data science platforms that are platform-agnostic and easy to maintain and support.
Having done extensive optimization (fine-tuning) and research on NMT/BERT/XLNET-based architectures, we are uniquely positioned to:
- Install, configure, and optimize vanilla (Google's) NMT/BERT/XLNET on your on-premise/cloud hardware
- Install Stillwater's pretrained and more finely tuned NMT/BERT/XLNET on your on-premise/cloud hardware
We have also built an AI platform for a major investment bank, covering automated trading and operational optimization (automated L1/L2/L3 support), using the above technology stack.