Viterbi knows what you'll type next!

By Evan Kohilas

Elevator Pitch

What if I told you I could predict your next word? Sure, there are plenty of text prediction algorithms that try to do the same, but how many of them will also attempt to read your mind? Come along and I’ll show you how Viterbi can finish your sentences!

Description

When it comes to text prediction, Markov Chains ⛓️ and Neural Networks 🧠🌐️ are cool and fancy, but cliché.
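For contrast, the classic Markov-chain approach is just a table of "which word tends to follow which". A minimal sketch (the corpus, function names, and bigram-only order are all illustrative, not taken from the talk's actual implementation):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count word -> next-word transitions from a list of sentences."""
    follows = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            follows[prev][nxt] += 1
    return follows

def predict_next(follows, word):
    """Return the word most often seen after `word`, or None if unseen."""
    counts = follows.get(word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]

# Toy corpus, purely for illustration
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]
model = train_bigrams(corpus)
print(predict_next(model, "sat"))  # prints "on"
```

Note the limitation that motivates the talk: the prediction depends only on the single previous word, with no wider context.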

Come along and learn about Hidden Markov Models and the Viterbi Algorithm (commonly used in speech recognition), and how we can also use them for contextual word prediction.
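To give a flavour of what Viterbi does, here is a minimal NumPy decoder for a toy Hidden Markov Model: given transition, emission, and initial probabilities, it recovers the most likely sequence of hidden states behind a sequence of observations. The model sizes and all probability values below are made up for illustration; this is the textbook algorithm, not the talk's adapted version:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for an observation sequence.

    obs: list of observation indices
    pi:  (S,)   initial state probabilities
    A:   (S, S) transition probabilities, A[i, j] = P(state j | state i)
    B:   (S, O) emission probabilities,   B[i, k] = P(obs k | state i)
    """
    S, T = len(pi), len(obs)
    delta = np.zeros((T, S))             # best path probability ending in each state
    back = np.zeros((T, S), dtype=int)   # backpointers for path recovery
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1, :, None] * A   # (S, S): previous state x next state
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    # Trace the backpointers from the best final state
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy HMM: 2 hidden states, 3 observation symbols (numbers are illustrative)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2], pi, A, B))  # prints [0, 0, 1]
```

The word-prediction twist discussed in the talk is to reinterpret what counts as a hidden state versus an observation; the decoding machinery stays the same.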

I’ll cover my strategies and results, and discuss the hurdles I hit with the NumPy implementation.

Notes

I have always wanted to do something that would make use of my chat history, and so last year I was given the idea of writing a contextual autocomplete engine that would try to predict a user’s next word given a sentence.

But I also wanted to do something new and different, not just build another Markov Chain or Neural Network. I realised I could take the Viterbi Algorithm I had just learnt, and alter and apply it in a new way for contextual text prediction: guessing your next word from your history or someone else’s, recent or entire.

In this talk I aim to briefly cover some common text prediction algorithms, such as Markov chains. I’ll then explain Hidden Markov Models and the Viterbi algorithm: how they’re normally used, and how they predict a hidden state from a sequence of observable states. Finally, I’ll go over my different strategies and results, alongside the hurdles I hit with the NumPy implementation.