...
For this tutorial we use only two routes: one to add a new machine learning configuration and one to add the resulting predictions.
- /algoconfig/{name}
- /prediction/bulkpredictionset
Each route can accept up to three different parameter types, which are described in detail in the following article: "Ingest property data in database (REST API)".
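As a quick illustration of how two of these parameter types combine for the /algoconfig/{name} route, here is a minimal sketch. The base URL matches the calls used later in this tutorial; the configuration name and payload values are made up for the example, and the full parameter semantics are covered by the referenced article:

```python
BASE_URL = 'http://localhost:8004'  # local API endpoint used throughout this tutorial

# Path parameter: the configuration name is embedded in the URL itself.
cfg_name = 'my_algorithm'  # hypothetical configuration name
url = '%s/algoconfig/%s' % (BASE_URL, cfg_name)

# Body parameter: a JSON payload carries the data to store.
post_data = {
    "algorithm_run_id": 1,          # hypothetical run id
    "config_data": {"lr": 0.01},    # hypothetical parameter set
    "description": "example config"
}

# The actual request would then be sent with, e.g.:
#   response = requests.post(url, json=post_data).json()
```

The request itself is commented out here; the functions below show the complete calls.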
...
In our example we run a machine learning algorithm which produces a set of flare predictions to store within our database. The algorithm consists of a training phase and a testing or prediction phase. During the training phase the algorithm learns and tunes its parameters, which can then be stored within the database as a configuration for later use. Afterwards, during the testing phase, we use this configuration to compute flare predictions, which are also stored within the database. The following code shows the two corresponding functions.
```python
import requests

def train_model(model, train_data, validation_data, max_epoches, batch_size, environment):
    # train model (e.g. until max_epoches are reached or validation loss increases)
    model.train(train_data, validation_data, max_epoches, batch_size)
    # store model parameters within database
    post_data = {
        "algorithm_run_id": environment['runtime']['run_id'],
        "config_data": model.get_parameters(),
        "description": ""
    }
    response = requests.post('http://localhost:8004/algoconfig/%s' % environment['algorithm']['cfg_name'],
                             json=post_data).json()
    if response['has_error']:
        raise ValueError('An error occurred while storing the algorithm\'s configuration:\n%s' % response['error'])
    return response['data']

def test_model(model, test_data, environment):
    # test model (e.g. predict the test_data)
    model.run(test_data)
    # collect the predictions to be stored
    prediction_data = []
    for prediction in model.get_predictions():
        prediction_data.append({
            "time_start": prediction.time_start,
            "time_duration": prediction.time_duration,
            "probability": prediction.probability,
            "intensity_min": prediction.intensity_min,
            "intensity_max": prediction.intensity_max,
            "meta": {
                "harp": prediction.harp,
                "nar": prediction.nar
            },
            "data": prediction.data
        })
    # store predictions within database
    post_data = {
        "algorithm_config": environment['algorithm']['cfg_name'],
        "algorithm_run_id": environment['runtime']['run_id'],
        "prediction_data": prediction_data,
        "source_data": [get_fc_id(row) for row in test_data]
    }
    response = requests.post('http://localhost:8004/prediction/bulkpredictionset', json=post_data).json()
    if response['has_error']:
        raise ValueError('An error occurred while storing the algorithm\'s prediction set:\n%s' % response['error'])
    return response['data']
```
The post_data structure is equivalent to the algorithm_config_data and prediction_data definitions given by the routes /algoconfig/{name} and /prediction/bulkpredictionset:
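Both functions above repeat the same response check. Assuming every route answers with the same `{"has_error", "error", "data"}` envelope (as the two calls suggest), that check can be factored into a small helper; the responses below are mocked for illustration and would normally come from `requests.post(...).json()`:

```python
def unwrap(response, action):
    # Raise if the API reported an error; otherwise return the stored payload.
    if response['has_error']:
        raise ValueError('An error occurred while %s:\n%s' % (action, response['error']))
    return response['data']

# usage with a mocked successful response:
ok = {'has_error': False, 'error': None, 'data': {'id': 1}}
stored = unwrap(ok, 'storing the configuration')  # returns {'id': 1}
```

This keeps the route-specific functions focused on assembling their payloads.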
Source Code
...