In the previous posts, we saw how to build strong and versatile named entity recognition systems and how to properly evaluate them. But often you want to understand your model beyond the metrics. So in this tutorial, I will show you how to build an explainable and interpretable NER system with Keras and the LIME algorithm.
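To give you an idea of the core trick, here is a minimal sketch of how LIME can be pointed at a Keras sequence tagger by explaining the prediction for a single token position. `keras_ner_model`, `encode_tokens`, the tag set, and the sentence are placeholders for illustration, not the code from the post:

```python
import numpy as np
from lime.lime_text import LimeTextExplainer

# Hypothetical placeholders: a trained Keras sequence-tagging model,
# a preprocessing helper, and its tag set.
tags = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

def predict_token_proba(texts, token_index=2):
    """Return the tag probabilities of one token position for a batch of
    (perturbed) sentences, so LIME can treat NER as a classification problem."""
    proba = []
    for text in texts:
        tokens = text.split()
        x = encode_tokens(tokens)                # hypothetical preprocessing helper
        pred = keras_ner_model.predict(x)[0]     # hypothetical model; shape (max_len, n_tags)
        proba.append(pred[token_index])
    return np.array(proba)

# bow=False makes LIME mask dropped words instead of removing them,
# which keeps token positions stable for the wrapper above.
explainer = LimeTextExplainer(class_names=tags, bow=False)
explanation = explainer.explain_instance(
    "John works at Google in London",
    predict_token_proba,
    num_features=5,
)
print(explanation.as_list())
```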
This is the second post of my series about understanding text datasets. If you read my blog regularly, you have probably noticed quite a few posts about named entity recognition. In those posts, we focused on finding the named entities and explored different techniques to do so.
In 2018, we saw the rise of pre-training and fine-tuning in natural language processing. Large neural networks are first trained on general tasks like language modeling and then fine-tuned for classification tasks. One of the latest milestones in this development is the release of BERT.
An important part of every machine learning project is the proper evaluation of the system's performance. In this post, we will talk about the evaluation of token-based sequence models. This is especially tricky for several reasons.
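The post goes into the details; as a quick taste, entity-level scores for BIO-tagged sequences can be computed with a library like seqeval (just one possible tool, and the tags below are made up):

```python
# Entity-level evaluation of BIO-tagged sequences with seqeval.
from seqeval.metrics import classification_report, f1_score

y_true = [["B-PER", "I-PER", "O", "B-LOC"],
          ["O", "B-ORG", "I-ORG", "O"]]
y_pred = [["B-PER", "O",     "O", "B-LOC"],   # the person entity is only partially tagged,
          ["O", "B-ORG", "I-ORG", "O"]]       # so it does not count at the entity level

print(f1_score(y_true, y_pred))
print(classification_report(y_true, y_pred))
```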
This is the sixth post in my series about named entity recognition. This time I’m going to show you some cutting-edge stuff. We will use a residual LSTM network together with ELMo embeddings, developed at Allen NLP. You will learn how to wrap a TensorFlow Hub pre-trained model to work with Keras. The resulting model will give you state-of-the-art performance on the named entity recognition task.
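The core of the wrapping trick looks roughly like this, assuming the TensorFlow 1.x-era APIs the post builds on; `max_len`, `batch_size`, and the layer sizes are illustrative, not necessarily the post's exact settings:

```python
import tensorflow as tf
import tensorflow_hub as hub
from keras import backend as K
from keras.layers import Input, Lambda, Bidirectional, LSTM, TimeDistributed, Dense, add
from keras.models import Model

max_len, n_tags, batch_size = 50, 17, 32  # assumed values for illustration

# Load the ELMo module from TensorFlow Hub inside the Keras session.
sess = tf.Session()
K.set_session(sess)
elmo_model = hub.Module("https://tfhub.dev/google/elmo/2", trainable=True)
sess.run(tf.global_variables_initializer())
sess.run(tf.tables_initializer())

def ElmoEmbedding(x):
    # x is a batch of padded token strings; the "tokens" signature returns
    # a 1024-dimensional contextual embedding per token.
    return elmo_model(inputs={"tokens": tf.squeeze(tf.cast(x, tf.string)),
                              "sequence_len": tf.constant(batch_size * [max_len])},
                      signature="tokens", as_dict=True)["elmo"]

input_text = Input(shape=(max_len,), dtype=tf.string)
embedding = Lambda(ElmoEmbedding, output_shape=(max_len, 1024))(input_text)
x = Bidirectional(LSTM(units=512, return_sequences=True,
                       recurrent_dropout=0.2, dropout=0.2))(embedding)
x_rnn = Bidirectional(LSTM(units=512, return_sequences=True,
                           recurrent_dropout=0.2, dropout=0.2))(x)
x = add([x, x_rnn])  # residual connection around the second LSTM
out = TimeDistributed(Dense(n_tags, activation="softmax"))(x)

model = Model(input_text, out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```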
This is the fifth post in my series about named entity recognition. If you haven’t seen the last four, have a look now. The last time we used a CRF-LSTM to model the sequence structure of our sentences.
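If you only want the rough shape of such a model, a minimal BiLSTM-CRF sketch with the keras-contrib CRF layer could look like this (vocabulary size, tag count, and hyperparameters are placeholders):

```python
from keras.models import Model
from keras.layers import Input, Embedding, Bidirectional, LSTM, TimeDistributed, Dense
from keras_contrib.layers import CRF  # CRF layer from the keras-contrib package

max_len, n_words, n_tags = 50, 20000, 17  # assumed values for illustration

inp = Input(shape=(max_len,))
x = Embedding(input_dim=n_words, output_dim=50, input_length=max_len)(inp)
x = Bidirectional(LSTM(units=100, return_sequences=True, recurrent_dropout=0.1))(x)
x = TimeDistributed(Dense(50, activation="relu"))(x)
crf = CRF(n_tags)  # the CRF layer models transitions between tags
out = crf(x)

model = Model(inp, out)
model.compile(optimizer="rmsprop", loss=crf.loss_function, metrics=[crf.accuracy])
model.summary()
```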
This is the fourth post in my series about named entity recognition. If you haven’t seen the last three, have a look now. The last time we used a recurrent neural network to model the sequence structure of our sentences.
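As a reminder, a minimal bidirectional LSTM sequence tagger in Keras can be sketched like this (the sizes are placeholders, not the post's exact settings):

```python
from keras.models import Sequential
from keras.layers import Embedding, Bidirectional, LSTM, TimeDistributed, Dense

max_len, n_words, n_tags = 50, 20000, 17  # assumed values for illustration

model = Sequential([
    Embedding(input_dim=n_words, output_dim=50, input_length=max_len),
    Bidirectional(LSTM(units=100, return_sequences=True, recurrent_dropout=0.1)),
    TimeDistributed(Dense(n_tags, activation="softmax")),  # one tag distribution per token
])
model.compile(optimizer="rmsprop", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```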
This is the third post in my series about named entity recognition. If you haven’t seen the last two, have a look now. The last time we used a conditional random field to model the sequence structure of our sentences.
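For reference, a conditional random field over simple hand-crafted token features can be sketched with sklearn-crfsuite (the feature function and the toy data are deliberately minimal placeholders):

```python
import sklearn_crfsuite

def word2features(sentence, i):
    """A deliberately tiny feature dictionary for token i of a sentence."""
    word = sentence[i]
    return {
        "word.lower()": word.lower(),
        "word.istitle()": word.istitle(),
        "word.isdigit()": word.isdigit(),
        "BOS": i == 0,
        "EOS": i == len(sentence) - 1,
    }

# Toy training data; real data would come from an annotated corpus.
sentences = [["John", "lives", "in", "Berlin"]]
labels = [["B-PER", "O", "O", "B-LOC"]]

X = [[word2features(s, i) for i in range(len(s))] for s in sentences]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=100)
crf.fit(X, labels)
print(crf.predict(X))
```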
This is the second post in my series about named entity recognition. If you haven’t seen the first one, have a look now. Last time we started by memorizing entities for words and then used a simple classification model to improve the results a bit.
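The memorization baseline is simple enough to sketch in a few lines (a toy version, not the exact code from the post):

```python
from collections import Counter, defaultdict

class MemoryTagger:
    """Remember the most frequent tag seen for each word; fall back to 'O'."""

    def fit(self, words, tags):
        counts = defaultdict(Counter)
        for word, tag in zip(words, tags):
            counts[word][tag] += 1
        self.memory = {w: c.most_common(1)[0][0] for w, c in counts.items()}
        return self

    def predict(self, words):
        return [self.memory.get(w, "O") for w in words]

tagger = MemoryTagger().fit(["John", "lives", "in", "Berlin"],
                            ["B-PER", "O", "O", "B-LOC"])
print(tagger.predict(["John", "visits", "Berlin"]))  # ['B-PER', 'O', 'B-LOC']
```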
In this post, I will introduce you to something called Named Entity Recognition (NER). NER is a part of natural language processing (NLP) and information retrieval (IR). The task in NER is to find the entity type of words.
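A tiny example makes the task concrete: every token gets a tag saying whether it belongs to an entity and of which type. The BIO tagging scheme shown here is one common convention, and the sentence is made up:

```python
# Tokens of a sentence and their entity tags in the BIO scheme:
# B- marks the beginning of an entity, I- its continuation, O means "no entity".
tokens = ["George", "Washington", "went", "to", "Washington", "."]
tags   = ["B-per",  "I-per",      "O",    "O",  "B-geo",      "O"]

for token, tag in zip(tokens, tags):
    print(f"{token:12} {tag}")
```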