# Serving Keras Deep-Learning Models With SKIL

In a previous post, we showed you how to interact with SKIL on a basic level. In this article, we'll show you how to use the new skil Python package to run and deploy models fast. skil is designed to save you time and fit into typical workflows used by data scientists.

Below, you'll learn how to train a Keras model, register it in SKIL, deploy the model in a production-grade service and finally get predictions from that newly deployed service.

If you've never used SKIL, take a look at the Quickstart guide to get the free SKIL Community Edition running. For the rest of this post, we'll assume that SKIL runs on your machine.

### Introduction to the skil Python package

skil is a Python library, at version 0.2.8 at the time of this writing. This new package is a high-level wrapper around the existing SKIL clients that helps you get started with SKIL quickly. skil is centered around concepts common in data science and embedded in the SKIL UI: experiments, models (the products of experiments), and workspaces (which group experiments around a data science project). If you understand the workflow of SKIL, you'll find the skil library easy to use. If you've never used SKIL, understanding how skil works will tell you much of what you need to know about the platform.

You can install skil from PyPI using:

```bash
pip install tensorflow==1.10 keras==2.2.2 skil==0.2.8
```

We pin the versions of skil and the machine learning libraries to ensure this post keeps working long-term. To get started with skil, open an interactive Python session of your choice and run the following import statement:

```python
from skil import Skil, WorkSpace, Experiment, Model, Deployment, Service
```


Those are all the classes you need to run and deploy a deep-learning model with SKIL. Here's how these Python classes map to what you might already know from SKIL:

- Skil connects to your running SKIL instance and handles all API calls internally.
- WorkSpace corresponds to a SKIL workspace, the basic space for running all your machine learning experiments.
- Experiment represents a machine learning experiment conducted within a workspace. If you use the SKIL UI, an experiment corresponds to a notebook run in SKIL.
- Model is a central SKIL concept wrapping machine learning models. Models are associated with or defined in experiments. An experiment can have several models, and you can pick which one to deploy.
- Deployment specifies how SKIL will deploy your model.
- Service is what you get when you deploy a model: a hosted model that can be used for inference.

### A first example: predicting handwritten digits with Keras

You're going to train a simple deep learning model using Keras so that you can focus on the workflow. MNIST is the "hello world" of deep learning, so let's classify handwritten digits from the MNIST data set using three dense layers, with dropout for regularization.

After training the model, you'll store it locally, so that it can be picked up and served by SKIL. Make sure to have keras and a backend like tensorflow installed. You start with a few imports, load the data set into memory, split it into training and test sets, and perform some basic preprocessing steps.

```python
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout

batch_size = 128
num_classes = 10
epochs = 5

(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Flatten the 28x28 images into vectors of length 784.
x_train = x_train.reshape(60000, 784)
x_test = x_test.reshape(10000, 784)

# Scale pixel values from [0, 255] to [0, 1].
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255

# One-hot encode the integer labels.
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
```

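As a side note, to_categorical simply one-hot encodes the integer labels. If you want to see concretely what that means, here is an equivalent version in plain NumPy (illustrative only, not part of the workflow):

```python
import numpy as np

def one_hot(labels, num_classes):
    # Index the rows of an identity matrix by label:
    # row i is the one-hot vector for class i.
    return np.eye(num_classes)[labels]

print(one_hot(np.array([5, 0, 4]), 10)[0])
# → [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
```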

Next, define the deep learning model by adding layers one by one to a Sequential model. Afterwards, compile the model with a loss function, an optimizer, and optional evaluation metrics.

```python
model = Sequential()
model.add(Dense(512, activation='relu', input_shape=(784,)))
model.add(Dropout(0.2))
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(num_classes, activation='softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer='sgd', metrics=['accuracy'])
```


The model is trained for 5 epochs and stored in a file called "model.h5" in the HDF5 format.

```python
history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs)

model.save("model.h5")
```


You can try this with practically any other Keras model, so feel free to experiment!
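For instance, a small convolutional network would work just as well. The following is a sketch, assuming the keras version pinned above; note that, unlike the dense model, it expects the images in their original 28x28 shape (with a channel dimension) rather than flattened, so you'd reshape the input accordingly before training:

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout

conv_model = Sequential()
# Convolutions operate on 28x28 single-channel images.
conv_model.add(Conv2D(32, kernel_size=(3, 3), activation='relu',
                      input_shape=(28, 28, 1)))
conv_model.add(MaxPooling2D(pool_size=(2, 2)))
conv_model.add(Flatten())
conv_model.add(Dense(128, activation='relu'))
conv_model.add(Dropout(0.5))
conv_model.add(Dense(10, activation='softmax'))

conv_model.compile(loss='categorical_crossentropy',
                   optimizer='sgd', metrics=['accuracy'])
conv_model.save("conv_model.h5")
```

To train it you would feed x_train.reshape(60000, 28, 28, 1) instead of the flattened vectors; the rest of the workflow stays the same.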

### Serving your model with SKIL

First, connect to your running SKIL instance by creating a Skil object:

```python
skil_server = Skil()
```


If needed, you can pass a username and password as arguments to the Skil instance, but using a vanilla SKIL CE install, you can connect as shown above. Next, create a workspace in SKIL and a new experiment in that workspace.

```python
work_space = WorkSpace(skil_server)
experiment = Experiment(work_space)
```


You can now use your serialized model ("model.h5") to define a SKIL model in your experiment as follows:

```python
model = Model('model.h5', model_id="keras_model", experiment=experiment)
```


The fourth and last step is to create a Deployment and deploy your model with it. This will give you a SKIL Service that we'll use for inference next.

```python
deployment = Deployment(skil_server, "keras_deployment")
service = model.deploy(deployment)
```


That's it. Your model is now live. SKIL takes care of the nasty details for you and makes sure your model stays in production for as long as you need. To test your service, feed data into its predict method.

```python
prediction = service.predict(x_test[:10])
```

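The service returns one row of class probabilities per input sample, following the model's softmax output. To turn those probabilities into digit labels, take the argmax along the last axis; here is what that looks like on a mock response array (used so the snippet runs without a live service):

```python
import numpy as np

# Mock response: 2 samples, 10 class probabilities each.
prediction = np.array([
    [0.01, 0.02, 0.85, 0.01, 0.01, 0.02, 0.02, 0.02, 0.02, 0.02],
    [0.70, 0.05, 0.05, 0.02, 0.02, 0.04, 0.04, 0.03, 0.03, 0.02],
])

labels = prediction.argmax(axis=-1)
print(labels)  # → [2 0]
```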

### Conclusion

In this article, we walked through how to use the new Python library skil to access SKIL's functionality in a few simple steps. If you want to learn more about skil, check it out on GitHub.