Sequence tagging is an important NLP problem with several applications, including Named Entity Recognition, Part-of-Speech Tagging, and Argument Component Detection. In our talk, we will focus on the BiLSTM+CNN+CRF model, one of the most popular and effective neural network-based models for tagging. We will discuss task decomposition for this model, explore the internal design of its components, and present an ablation study for them on the well-known CoNLL-2003 NER shared task dataset.
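To give a flavor of one component the talk covers, the CRF layer's decoding step can be sketched as a Viterbi search over per-token emission scores (e.g. produced by the BiLSTM) and learned tag-transition scores. The sketch below is illustrative only and not taken from the talk; the tag names and scores are made up:

```python
def viterbi_decode(emissions, transitions, tags):
    """Return the highest-scoring tag sequence.

    emissions:   list of {tag: score} dicts, one per token
    transitions: {(prev_tag, tag): score} learned by the CRF layer
    tags:        list of tag names
    """
    # Best score and path ending in each tag at the first token.
    best = {t: (emissions[0][t], [t]) for t in tags}
    for em in emissions[1:]:
        new = {}
        for t in tags:
            # Extend the best previous path by transition + emission score.
            score, path = max(
                (best[p][0] + transitions[(p, t)] + em[t], best[p][1] + [t])
                for p in tags
            )
            new[t] = (score, path)
        best = new
    return max(best.values())[1]


# Toy example: a two-token sentence with three tags.
tags = ["O", "B-PER", "I-PER"]
emissions = [
    {"O": 0.1, "B-PER": 2.0, "I-PER": 0.0},
    {"O": 0.2, "B-PER": 0.0, "I-PER": 1.5},
]
# Reward only the B-PER -> I-PER transition in this toy setup.
transitions = {(p, t): (1.0 if (p, t) == ("B-PER", "I-PER") else 0.0)
               for p in tags for t in tags}
print(viterbi_decode(emissions, transitions, tags))  # ['B-PER', 'I-PER']
```

In the full model, the emission scores come from the BiLSTM over word and CNN-based character embeddings, and the transition scores are CRF parameters trained jointly with the rest of the network.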
Artem Chernodub holds Bachelor of Science and Master of Science degrees from the Moscow Institute of Physics and Technology. He defended his Ph.D. thesis at the Institute of Mathematical Machines and Systems of the NASU. Artem has taken part in scientific and applied machine learning projects for the US Air Force, Samsung, Ukraine’s Government, and other organizations. Currently, he works on NLP problems in industry and teaches a course titled “Deep Learning” at the Ukrainian Catholic University.