Deploying Your First Machine Learning Model

By Aminu Israel 🚀

Elevator Pitch

You’ve probably built that Machine Learning model or that facial recognition Deep Learning model, but what’s next? Do you leave it on your computer unutilized, or do you deploy it for people to interact with? In this tutorial, I will walk you through a step-by-step process for deploying your model.

Description

In every Data Science project, the basic life cycle consists of Data Collection, Data Preprocessing, Data Analysis, Training and Validation, and lastly Model Deployment.

A lot of people who start out in Data Science stop at the training and validation phase and never go further. This leaves the ML model they built unutilized, and hence redundant. So in this workshop, I will show you how to deploy a machine learning model, whether as a side project or by integrating it into production code.

But before we go ahead and start deploying, we need to understand the nitty-gritty of deployment.

What is Model Deployment?

“Model Deployment is the method by which you integrate a machine learning model into an existing production environment to make practical business decisions based on data. It is one of the last stages in the machine learning life cycle and can be one of the most cumbersome.” – DataRobot

Ways an ML model can be deployed

  • As a REST API
  • Using cloud platforms (GCP, AWS, Azure)
  • TensorFlow
  • Docker
  • DataRobot

There are many ways to deploy your model, but most of the time, when you build a Machine Learning model, you’ll deploy it as a REST API (Application Programming Interface), which makes it easier to integrate into your web application or mobile apps. Take, for instance, being asked to build a model that predicts the category of customers; your API pipeline would be structured to (a sketch follows the list):

  1. Collect the data
  2. Preprocess the data: cleaning, data type conversion, etc.
  3. Pass the data to the ML model to generate predictions
  4. Return the prediction and, optionally, the confidence score of the prediction(s) made
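
To make those steps concrete, here is a minimal sketch in Python. The column names (age, monthly_spend), the model.pkl file name, and the preprocessing are illustrative assumptions, not a fixed recipe:

    import joblib
    import pandas as pd

    # Load a previously trained classifier (assumed saved as model.pkl)
    model = joblib.load("model.pkl")

    def predict_customer_category(raw_data: dict):
        # 1. Collect the data (here it arrives as a plain dict)
        df = pd.DataFrame([raw_data])

        # 2. Preprocess: enforce the dtypes the model was trained on
        #    (hypothetical columns, for illustration only)
        df["age"] = df["age"].astype(int)
        df["monthly_spend"] = df["monthly_spend"].astype(float)

        # 3. Pass the data to the ML model to generate predictions
        prediction = model.predict(df)[0]

        # 4. Return the prediction and, if the model supports it,
        #    a confidence score
        confidence = None
        if hasattr(model, "predict_proba"):
            confidence = float(model.predict_proba(df).max())
        return {"category": str(prediction), "confidence": confidence}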

And if it’s a deep learning facial recognition model, you’d likely structure your API to (again, a sketch follows the list):

  1. Collect the image
  2. Resize the image and convert it to an array
  3. Pass it through the model to generate predictions
  4. Return the prediction and confidence score
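
A rough sketch of the same idea for images, assuming a Keras-style model trained on 224x224 RGB inputs; the input size, class labels, and model file name are assumptions for illustration:

    import numpy as np
    from PIL import Image
    from tensorflow import keras

    # Hypothetical saved model and labels
    model = keras.models.load_model("face_model.h5")
    CLASS_NAMES = ["person_a", "person_b", "unknown"]

    def predict_face(image_path: str):
        # 1. Collect the image
        image = Image.open(image_path).convert("RGB")

        # 2. Resize the image and convert it to an array
        image = image.resize((224, 224))
        array = np.asarray(image, dtype="float32") / 255.0  # scale to [0, 1]
        array = np.expand_dims(array, axis=0)  # add a batch dimension

        # 3. Pass it through the model to generate predictions
        probabilities = model.predict(array)[0]

        # 4. Return the prediction and its confidence score
        index = int(np.argmax(probabilities))
        return {"label": CLASS_NAMES[index],
                "confidence": float(probabilities[index])}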

Let’s deploy our model

In this case, we’ll be deploying our model as an API. To build it, we’ll use a popular Python web framework, Flask, and we’ll deploy it to a popular web hosting platform, Heroku. Don’t worry if you have no prior knowledge of how the web works; I plan to make the whole process easy to understand.

Workflow

The basic workflow for deploying our model is:

Save model > Create an API using Flask > Integrate the model into the API > Accept a user request > Load the model and predict > Return the response

First, when you finish building your model, you save it using either pickle or joblib (and if it’s a deep learning model, there are ways to save those too). Then you create an API endpoint using Flask and integrate your model into it. What the endpoint does is take the user’s request, which is the data they want a prediction for; in the background it preprocesses that data, loads the model, and feeds the data in. If the data has been preprocessed in the right manner, the model generates a prediction, which is returned to the user, and funny enough, this whole thing happens in a matter of seconds. It’s that easy, trust me.
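
Here is a minimal sketch of that save-then-serve workflow, assuming a scikit-learn style model; the file names (model.pkl, app.py) and the /predict route are my own choices for illustration:

    # Training side: save the finished model once
    # import joblib
    # joblib.dump(trained_model, "model.pkl")

    # app.py: the Flask API that loads the model and predicts
    import joblib
    import pandas as pd
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    model = joblib.load("model.pkl")  # load once at startup, not per request

    @app.route("/predict", methods=["POST"])
    def predict():
        # Accept the user's request as JSON
        data = request.get_json()

        # Preprocess it into the shape the model expects
        df = pd.DataFrame([data])

        # Feed the data to the model to generate a prediction
        prediction = model.predict(df)[0]

        # Return the response to the user
        return jsonify({"prediction": str(prediction)})

    if __name__ == "__main__":
        app.run(debug=True)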

To see a technical example, I have a project that is already deployed: a model that predicts diabetes type, used in a web application. Link
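
As for actually getting an app like this onto Heroku, deployment usually comes down to two small extra files and a few commands once the project is committed to git. A rough sketch, assuming the API above lives in app.py (the scikit-learn and pandas entries simply mirror that sketch’s imports):

    requirements.txt  (the dependencies Heroku installs):
        flask
        gunicorn
        scikit-learn
        pandas

    Procfile  (tells Heroku how to start the app):
        web: gunicorn app:app

    Then, from the project folder:
        heroku login
        heroku create
        git push heroku main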

Notes

As for participant requirements, they should:

  • Have built a Machine Learning model with Python
  • Have knowledge of git and GitHub
  • Have knowledge of Heroku (optional)
  • Know how to StackOverflow 😁

Why my talk should be selected

Over the years, a lot of Data Scientists have struggled with this part of Data Science. Demystifying it will make the participants of this conference appreciate the fact that they can not only build models but also get them deployed.