This short tutorial shows how to store data in the prediction property service with Python and the requests package (see also Access to REST-Services in Python).
For IDL you can take a similar route by adapting these instructions to Access to REST-Services in IDL.
Routes
...
The Prediction Service
The prediction service is a web interface which allows you to request, insert and modify prediction data in the database. Operations are performed by sending URL requests, where each operation is well defined as a so-called route. The property service comes with a graphical user interface at http://localhost:8004/ui/ which provides visual access to all available routes. All routes which can be used to insert or update data of the prediction service are listed under the Edit tab.
For this tutorial we are using only two of these routes: one to add a new machine learning configuration to a dataset and one to add new configuration sets to the prediction service.
- /datasetalgoconfig/bulk{name}
- /configsetprediction/{dataset}
Requests
These two routes are both POST routes, which means they have to be called with a POST request. To create simple POST requests we recommend the Python package requests.
A POST request needs two parameters: the address itself, which is given by the route, and the data which will be attached to the request.
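As a minimal sketch of these two ingredients, the following builds (but does not send) a POST request with requests; the URL and payload are placeholders for illustration, not actual service routes:

```python
import requests

# The two parameters of a POST request:
# 1. the address, given by the route (placeholder URL here)
# 2. the data, attached as the JSON body via the 'json' keyword
payload = [{"name": "example-dataset", "responsible": "Jane Doe"}]

req = requests.Request("POST", "http://localhost:8004/example/route", json=payload)
prepared = req.prepare()

print(prepared.method)                    # POST
print(prepared.url)                       # http://localhost:8004/example/route
print(prepared.headers["Content-Type"])   # application/json
```

In practice you would simply call `requests.post(url, json=payload)`, which prepares and sends the request in one step.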
...
...
- bulk
Each route can hold up to three different parameter types, which are described in the article "Ingest property data in database (REST API)".
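As a rough sketch of those parameter kinds (the route and names below are illustrative placeholders, not actual service routes):

```python
import requests

# The three parameter kinds a route may use:
# 1. path parameters are substituted into the route itself,
# 2. query parameters are appended after '?',
# 3. body parameters travel as the request payload.
dataset = "ml-algorithms"                        # path parameter
params = {"algorithm_config_version": "latest"}  # query parameter
body = {"model": {}}                             # body payload

req = requests.Request(
    "POST",
    "http://localhost:8004/example/%s" % dataset,  # placeholder route
    params=params,
    json=body,
).prepare()

print(req.url)  # http://localhost:8004/example/ml-algorithms?algorithm_config_version=latest
```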
Implementation
Prepare
In our example we run a machine learning algorithm which produces a flare prediction to be stored in our database. The algorithm consists of a training phase and a prediction phase. During the training phase the algorithm learns and tunes its parameters, which are stored in the database as a configuration for later use. During the prediction phase we use this configuration to compute flare predictions, which are also stored in the database.
First of all you have to define the data you would like to store in the prediction service. In our example we are going to store some predictions from a machine learning algorithm in the dataset ml-algorithms.
```py
# define data to store
ml_datasets = [
    {
        "name": "ml-algorithms",
        "responsible": "John Doe",
        "description": "no comment!"
    }
]

ml_configurations = {
    'model': { '...': '...' },
    'weight-matrix': { '...': '...' },
    'biases': { '...': '...' }
}

ml_result = {
    'name': 'ml-multilayer-perceptron',
    'timestamp': '2016-01-22T17:27:59.001Z',
    'configurations': ml_configurations,
    'predictions': {
        'time-left': '32h12m04s',
        'position': {
            'lat_hg': 14.0,
            'long_hg': -3.4
        },
        'probability': '88%',
        'class': 'M'
    }
}
```
In addition we fetch the latest version of our algorithm configuration from the service:

```py
import requests

algo_config_name = "my_ml_configuration"
algo_config = {}

response = requests.get(
    'http://localhost:8004/algoconfig/list'
    '?algorithm_config_name=%s&algorithm_config_version=latest' % algo_config_name
).json()

if not response['has_error'] and response['result-count'] > 0:
    # as we requested the latest configuration we expect only one result within 'data'
    algo_config = response['data'][0]
else:
    # nothing found or the service reported an error - stop here
    raise RuntimeError('no configuration named %s found' % algo_config_name)
```
The problem now is that ml_result is not structured in the way the prediction service expects, so we have to restructure it to match the following definition:
...