.. include::

.. _create_job:

************
Create a Job
************

Purpose of this Chapter
=======================

Overview
========

The PI can define jobs that are executed periodically at a fixed time. To do so, the PI sets the frequency and the interval between executions. The system supports hourly, daily, and weekly jobs. For example, if a job should run every second hour, the PI sets the frequency to hourly and the interval to two.

Each job has several TaskManagers. A manager is dedicated to handling one kind of task. For example, a job could have two task managers: the first imports the metadata *(e.g., instrument data)*, and the second imports the series. To resolve dependencies, the PI can order the jobs and the managers.

During the execution of a job, the task manager first collects all tasks and stores them in a queue. The managers then execute the queued tasks.

.. figure:: ../../graphics/jobs.svg
   :width: 100%

Execution of a task
-------------------

1. Request the data that should be imported (**TransferHandler**)
2. Read the data and parse it into an interpretable structure (**Reader**)
3. Import and process the data and perform some basic tests for flagging (**Importer**)
4. Perform advanced tests
5. Inform the PI, who checks the plausibility of the data and can change the evaluation method and rerun the data processing
6. Prepare the export data (**Exporter**) and write the data to a specific format (**Writer**)
7. Export the data to an external system (**TransferHandler**)

Create a Provider
=================

1. Make sure that you have the permissions to create new entries *(admin)*
2. Go to the menu *Workflow* |rarr| *Jobs*
3. Click the button **Create** in the top right corner
4. Create the provider **IAGOS DATA PORTAL**

Create a TransferHandler
========================

1. Go to the menu *Workflow* |rarr| *Transfer*
2. Click the button **Create** in the top right corner
3.
   Select the **Data Provider** and the **Transfer Type**

Create a Job
============

.. important::

   You can add users and manage the permissions in the admin section. To do so, click on the *admin* icon at the top right or log in. Then click on *Users*.

1. Go to the menu *Workflow* |rarr| *Jobs*
2. Create a job which runs every hour

   * Name: **ICH**, User: **m.kennert**, Notification: **Never**, Stage: **1**
   * Start: **2021-06-14**, Frequency: **Hourly**, Interval: **1**

Create Managers
===============

1. Go to the detail view of a job
2. Click the button **Add** in the top right corner
3. Add the two managers **Metadata** *(order: 1)* and **Series** *(order: 2)*

Create the Import Managers
==========================

Metadata
--------

1. Scroll to the Task Manager **Metadata**
2. Click on *Add Import Manager*
3. Input the following attributes:

   * Source: **/iagos/media/uploads/ich_metadata.xls**
   * TransferHandler: **IAGOS Data Portal [Directory]**
   * Importer: **ICH - Metadata (demo)** & Reader: **ICH - Metadata (Excel)**

Series
------

1. Scroll to the Task Manager **Series**
2. Click on *Add Import Manager*
3. Input the following attributes:

   * Source: **/iagos/media/uploads/H2O*.nc**
   * TransferHandler: **IAGOS Data Portal [Directory]**
   * Importer: **ICH - Series (demo)** & Reader: **ICH - Series (netCDF)**

Create an Export Manager
========================

1. Scroll to the Task Manager **Series**
2. Click on *Add Export Manager*
3. Input the following attributes:

   * Source: **/iagos/media/exports/**
   * TransferHandler: **IAGOS Data Portal [Directory]**
   * Exporter: **ICH - Exporter (demo)** & Writer: **ICH - Writer (demo)**

Upload files & Execute Job
==========================

Upload the following files:

* :download:`Excel File <../../media/ich_components.xls>`
* :download:`netCDF File <../../media/H2O2018061616204002_060590L2AV300.nc>`

Execute the job:

.. code:: console

   python manage.py runjobs hourly
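Invoking ``runjobs`` by hand executes the due jobs only once. To actually run jobs periodically, the command is typically scheduled externally, for example via cron. The following crontab sketch is a hypothetical example; the interpreter and project paths are assumptions and must be adjusted to your deployment.

```shell
# Hypothetical crontab entries -- adjust the virtualenv and project paths.
# Run the hourly jobs at the start of every hour:
0 * * * * /path/to/venv/bin/python /path/to/project/manage.py runjobs hourly
# Run the daily jobs shortly after midnight:
5 0 * * * /path/to/venv/bin/python /path/to/project/manage.py runjobs daily
```

The job's own frequency and interval settings (e.g., hourly with interval two) still decide whether a given job actually runs on each invocation.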
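The task pipeline described in *Execution of a task* above can be sketched in Python. This is a minimal illustration of how the components hand data to each other; all class and method names here are hypothetical and do not reflect the portal's actual API.

```python
# Minimal sketch of the task pipeline from "Execution of a task".
# All class and method names are hypothetical placeholders.

class TransferHandler:
    """Fetches raw data from, or delivers files to, an external system."""
    def fetch(self, source: str) -> bytes:
        return b"raw-data-from:" + source.encode()

    def deliver(self, path: str, payload: str) -> str:
        return f"delivered {path}"

class Reader:
    """Parses raw bytes into an interpretable structure."""
    def parse(self, raw: bytes) -> dict:
        return {"raw": raw.decode()}

class Importer:
    """Imports the parsed data and runs basic tests for flagging."""
    def run(self, record: dict) -> dict:
        record["flagged"] = "raw" not in record  # trivial placeholder check
        return record

class Exporter:
    """Prepares the processed data for export."""
    def prepare(self, record: dict) -> dict:
        return {"export": record}

class Writer:
    """Writes the prepared data to a specific output format."""
    def write(self, prepared: dict) -> str:
        return str(prepared)

def run_task(source: str, target: str) -> str:
    transfer, reader = TransferHandler(), Reader()
    importer, exporter, writer = Importer(), Exporter(), Writer()
    raw = transfer.fetch(source)              # 1. request the data
    record = reader.parse(raw)                # 2. read and parse it
    record = importer.run(record)             # 3. import, process, basic flagging
    prepared = exporter.prepare(record)       # 6. prepare the export data
    payload = writer.write(prepared)          #    ... and write it to a format
    return transfer.deliver(target, payload)  # 7. export to the external system

result = run_task("/iagos/media/uploads/H2O.nc", "/iagos/media/exports/out.txt")
```

Steps 4 and 5 (advanced tests and PI review) are omitted from the sketch, since they involve interactive review rather than a straight data hand-off.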