eolearn.core.eoexecution

This module handles execution and monitoring of workflows. It enables executing a workflow multiple times and in parallel, monitors execution times, and handles any errors that occur in the process. At the end it generates a report containing a summary of the workflow and of the execution process.

All of this is implemented in the EOExecutor class.

eolearn.core.eoexecution.LOGGER = <Logger eolearn.core.eoexecution (WARNING)>
class eolearn.core.eoexecution.EOExecutor(workflow, execution_args, *, save_logs=False, logs_folder='.')[source]

Bases: object

Simultaneously executes a workflow with different input arguments. In the process it monitors execution and handles errors. It can also save logs and create an HTML report about each execution.

Parameters
  • workflow (EOWorkflow) – A prepared instance of EOWorkflow class

  • execution_args (list(dict(EOTask: dict(str: object) or tuple(object)))) – A list of dictionaries, where each dictionary represents the execution inputs for one run of the workflow. EOExecutor will execute the workflow once for each dictionary in the list. The content of such a dictionary is used as the input_args parameter of the EOWorkflow.execute method. Check EOWorkflow.execute for the definition of the dictionary structure. A usage sketch is shown below the class attributes.

  • save_logs (bool) – Flag used to specify if execution log files should be saved locally on disk

  • logs_folder (str) – A folder where logs and the execution report should be saved

REPORT_FILENAME = 'report.html'
STATS_START_TIME = 'start_time'
STATS_END_TIME = 'end_time'
STATS_ERROR = 'error'
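
A minimal usage sketch of setting up an executor, assuming a single-task workflow built with LinearWorkflow from eolearn.core. The DummyTask class, its value argument, and the './logs' folder are hypothetical placeholders; any prepared EOWorkflow with matching execution arguments can be used instead.

    from eolearn.core import EOExecutor, EOPatch, EOTask, LinearWorkflow

    class DummyTask(EOTask):
        """Hypothetical task that stores its input value in the EOPatch meta info."""
        def execute(self, *, value):
            eopatch = EOPatch()
            eopatch.meta_info['value'] = value
            return eopatch

    dummy_task = DummyTask()
    workflow = LinearWorkflow(dummy_task)

    # One dictionary per execution; each maps a task to its keyword arguments
    execution_args = [{dummy_task: {'value': value}} for value in range(5)]

    executor = EOExecutor(workflow, execution_args, save_logs=True, logs_folder='./logs')
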
run(workers=1, multiprocess=True)[source]

Runs the executor with the given number of workers.

Parameters
  • workers (int or None) – Maximum number of workflow executions that will run in parallel. The default value is 1, which executes workflows consecutively. If set to None, the number of workers will equal the number of processors on the system.

  • multiprocess (bool) – If True, it will use concurrent.futures.ProcessPoolExecutor, which distributes workflow executions among multiple processes. If False, it will use concurrent.futures.ThreadPoolExecutor, which distributes workflow executions among multiple threads. However, even when multiprocess=False, tasks from the workflow may still use multiple processors. This parameter exists mainly because certain tasks cannot run with concurrent.futures.ProcessPoolExecutor. In case workers=1, this parameter is ignored and workflows are executed consecutively.
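
Continuing the sketch above, a run distributed over multiple processes could look like this (the worker count is illustrative):

    # Run up to 4 workflow executions in parallel using separate processes
    executor.run(workers=4, multiprocess=True)

    # If some tasks cannot run with ProcessPoolExecutor (e.g. they are not
    # picklable), fall back to thread-based parallelism
    executor.run(workers=4, multiprocess=False)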

get_successful_executions()[source]

Returns a list of IDs of successful executions. The IDs are integers from the interval [0, len(execution_args) - 1], sorted in increasing order.

Returns

List of successful execution IDs

Return type

list(int)

get_failed_executions()[source]

Returns a list of IDs of failed executions. The IDs are integers from the interval [0, len(execution_args) - 1], sorted in increasing order.

Returns

List of failed execution IDs

Return type

list(int)
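
A short sketch of inspecting the outcome of a finished run, continuing from the examples above:

    successful = executor.get_successful_executions()
    failed = executor.get_failed_executions()

    print(f'{len(successful)} executions succeeded, {len(failed)} failed')

    # The IDs index into the execution_args list, so the failing inputs
    # can be looked up directly
    for idx in failed:
        print(f'Execution {idx} failed, input arguments: {execution_args[idx]}')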

get_report_filename()[source]

Returns the full file path of the report

Returns

Report filename

Return type

str

make_report()[source]

Makes an HTML report and saves it into the same folder where the logs are stored.
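
For example, after a run the report can be generated and located as follows (continuing the sketch above):

    executor.make_report()

    # The report is written to the same folder as the execution logs;
    # get_report_filename() returns its full path
    print(f'Report saved to {executor.get_report_filename()}')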