Overview of Runtime features
Once a pre-trained model is submitted, an algorithm (inference program) is executed, and both the inference time and the score are returned.
The returned score is automatically submitted through the evaluation function (the existing submission function).
Please submit a pre-trained model according to the model submission conventions below:
Creating a submission file
1. Download the submission template file.
Download here (ZIP download).
2. Check the directory structure
The directory structure to upload is as follows:
.
├── model                Required    A directory where the pre-trained model files should be placed.
│   └── ...
├── src                  Required    A directory where the Python program files should be placed.
│   ├── predictor.py     Required    A file called by the initial program
│   └── ...                          Other files (sub-directories can be created)
└── requirements.txt     Optional
3. Creating a pre-trained model
Please build your model in the following environment.
- Python3 Anaconda3-2019.03 (Read the install guide here.)
4. Create a predictor.py file.
Please implement the following classes and methods in predictor.py.
ScoringService
A class for inference execution
Please implement the following methods
get_model
A method to load the model. Please note the following conditions:
- Should be a class method
- Must specify an argument model path (str type)
- Must return True (bool type) on success.
* Exception handling (try/except) within get_model is discouraged, as it prevents error details from being displayed when an error occurs.
predict
A method to execute an inference. Please note the following conditions:
- Should be a class method
- Must specify an argument input (str type)
* Please see the file with the same name in the template for details.
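Putting these requirements together, a minimal sketch of predictor.py could look like the following. The pickle file name my_model.pkl is taken from the example later in this article, and the body of predict is only a placeholder to be replaced with your own preprocessing and inference:

import os
import pickle


class ScoringService(object):
    model = None

    @classmethod
    def get_model(cls, model_path='../model'):
        # Load the trained model from model_path and keep it on the class.
        with open(os.path.join(model_path, 'my_model.pkl'), 'rb') as f:
            cls.model = pickle.load(f)
        return True

    @classmethod
    def predict(cls, input):
        # 'input' is a str (for example a file path); the returned value
        # must also be a str. Replace this placeholder with your own logic.
        return str(input)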
5. Zipping the files
Please compress the files into a zip file and submit it via the form.
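As one way to create the archive, the Python standard library can be used from the directory that contains model, src and requirements.txt. The archive name submission is only an example; a shell zip command works just as well:

import shutil

# Write ../submission.zip one level up so that the archive itself is not
# swept up while the current directory (model/, src/, requirements.txt) is zipped.
shutil.make_archive('../submission', 'zip', root_dir='.')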
Submission example
Let us introduce the submission guidelines using an image labeling task as an example.
Execution environment
The Runtime system for this image labeling task is executed in the following environment:
- Docker: continuumio/anaconda3:2019.03 (Customized)
- vCPU: 1
- Memory: 2GiB
- WORKDIR: src
- External network: Not accessible
* Please use relative paths to specify a directory or file location.
* The execution environment may vary depending on the competition. Please see the competition page for details.
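Since WORKDIR is src, relative paths used inside predictor.py are resolved against the src directory. A small sketch (my_model.pkl is the file name used in the example below):

import os

# '../model' points at the model directory that sits next to src
# in the uploaded archive, because the working directory is src.
model_dir = '../model'
weights_path = os.path.join(model_dir, 'my_model.pkl')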
An example of a submission file
General case (ZIP download)
Using libraries (ZIP download)
An example of using predictor.py
# -*- coding: utf-8 -*-
import os
import pickle
import numpy as np
from skimage import io
from skimage.feature import hog


class ScoringService(object):
    model = None

    @classmethod
    def get_model(cls, model_path='../model'):
        """Get model method

        Args:
            model_path (str): Path to the trained model directory.

        Returns:
            bool: The return value. True for success, False otherwise.
        """
        with open(os.path.join(model_path, 'my_model.pkl'), 'rb') as f:
            cls.model = pickle.load(f)
        return True

    @classmethod
    def predict(cls, input):
        """Predict method

        Args:
            input (str): path to the image you want to make inference from

        Returns:
            str: Inference for the given input.
        """
        # load an image and get the file name
        image = io.imread(input)
        fname = os.path.basename(input)

        # do some preprocessing
        image_array = image / 255
        feature = hog(image_array, orientations=8, pixels_per_cell=(4, 4),
                      cells_per_block=(1, 1), block_norm='L2-Hys')

        # make prediction
        y_pred = cls.model.predict_proba(np.array([feature]))[0]

        # make output
        output = ''
        for i in y_pred:
            output += ',' + str(i)

        return fname + output
Next, we will explain the contents.
Importing modules
import os
import pickle
import numpy as np
from skimage import io
from skimage.feature import hog
Please import modules as necessary.
ScoringService Class
class ScoringService(object)
Calling the class method of this class executes an inference.
The name of this class should not be changed.
Defining a class variable of the model
model = None
This is a class variable to which the model object is assigned.
Other variable names and multiple variables can also be used.
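For instance, a model and a separate preprocessing object could each get their own class variable. The names below are only illustrative; both would be filled in by get_model:

class ScoringService(object):
    model = None    # the trained model, assigned in get_model
    scaler = None   # e.g. a fitted preprocessing object, also assigned in get_model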
A class method to load the model
@classmethod
def get_model(cls, model_path='../model'):
    """Get model method

    Args:
        model_path (str): Path to the trained model directory.

    Returns:
        bool: The return value. True for success, False otherwise.
    """
    with open(os.path.join(model_path, 'my_model.pkl'), 'rb') as f:
        cls.model = pickle.load(f)
    return True
The program will raise an error if this class method does not exist.
Please use model_path as much as possible when reading files (other relative paths are also acceptable).
Returns True on success (returning None will lead to a model loading error).
The model loading time is not included in the inference time.
Inference class method
@classmethod
def predict(cls, input):
    """Predict method

    Args:
        input (str): path to the image you want to make inference from

    Returns:
        str: Inference for the given input.
    """
    # load an image and get the file name
    image = io.imread(input)
    fname = os.path.basename(input)

    # do some preprocessing
    image_array = image / 255
    feature = hog(image_array, orientations=8, pixels_per_cell=(4, 4),
                  cells_per_block=(1, 1), block_norm='L2-Hys')

    # make prediction
    y_pred = cls.model.predict_proba(np.array([feature]))[0]

    # make output
    output = ''
    for i in y_pred:
        output += ',' + str(i)

    return fname + output
The program will raise an error if this class method does not exist.
The Runtime program executes an inference using the model loaded by the get_model class method.
The execution time of this method is measured as the inference time.
If a model is loaded inside this method, that loading time is also included in the inference time.
Both the input and the output should be of type str.
When inputting image files, please specify a filepath as input.
Installing packages
https://docs.anaconda.com/anaconda/packages/py3.7_linux-64/
The packages in the linked table whose [In Installer] box is checked are pre-installed (please note that the versions may differ).
To install additional packages, specify a package name in requirements.txt and it will be installed by pip.
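For example, a requirements.txt might look like the following (the package names and versions are purely illustrative):

lightgbm==2.3.1
opencv-python==4.1.2.30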
Debugging
General case
$ pip install -r requirements.txt   # Necessary modules will be installed via pip.
$ cd src                            # Moving to the source directory
$ python                            # Starting the Python interpreter
Python 3.7.3 (default, Mar 27 2019, 16:54:48)
[Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from predictor import ScoringService                   # Importing modules
>>> ScoringService.get_model()                              # Loading a model
True
>>> ScoringService.predict('[Input data for inference]')    # Executing an inference
'[Inference result]'
Using pyenv
$ pyenv local anaconda3-2019.03
Then follow the same debugging procedure as in the general case.
Using Docker
$ docker run -it -v $(pwd):/opt/ml continuumio/anaconda3:2019.03 /bin/bash
$ cd /opt/ml
Then follow the same debugging procedure as in the general case.
Q&A
Building a local dev environment
Please create a pre-trained model in the same environment as the Runtime.
The following three ways can be used to build such an environment.
General case
Download the installer for version 2019.03 for your architecture from https://repo.continuum.io/archive/ and run it. If you are using a Mac, download Anaconda3-2019.03-MacOSX-x86_64.pkg, double-click the downloaded file, and click Continue to start the installation.
Using pyenv
$ pyenv install anaconda3-2019.03
Using Docker
$ docker pull continuumio/anaconda3:2019.03
Using multiple models
An array-like or dictionary-style variable can be used as the class variable, or models can be assigned to separate variables such as model1 and model2. Below is a concrete example:
import os
import pickle
import glob


class ScoringService(object):
    model = []

    @classmethod
    def get_model(cls, model_path='../model'):
        for file in glob.glob(os.path.join(model_path, '*.pkl')):
            with open(file, 'rb') as inp:
                cls.model.append(pickle.load(inp))
        return True
Where should files other than the pre-trained model, such as a JSON file, be placed?
Place the files in suitable locations under the src directory and load them within the get_model class method. Below is a concrete example:
import os
import pickle
import json


class ScoringService(object):
    model = None
    json_data = None

    @classmethod
    def get_model(cls, model_path='../model'):
        with open(os.path.join(model_path, 'decision-tree-model.pkl'), 'rb') as inp:
            cls.model = pickle.load(inp)
        with open('./data/hyperparameters.json', 'r') as inp:
            cls.json_data = json.load(inp)
        return True
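Note that the relative path './data/hyperparameters.json' in this example is resolved against the WORKDIR (src), so the JSON file would live at src/data/hyperparameters.json inside the submission.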
Do you support other execution environments?
Currently, we do not support other execution environments.
* The execution environment may vary depending on the competition. Please see the competition page for details.
Do you support GPU?
In some competitions, yes we do. Please see the competition page for details.
Is it permitted to install additional packages at the OS level?
Currently, installing additional OS-level packages is not permitted.