Robot Framework

Robot Framework is a generic open source automation framework. It can be used for test automation and robotic process automation, and it is free to use without licensing costs. Combined with the fact that it is actively maintained by the foundation's members, this makes it a great platform for a multitude of automations.

Robot Framework's functionality can be easily extended with Python or Java libraries. With Aito's Python SDK it is a breeze to get started with predictions!


The Problem

This tutorial walks you through how to use Aito with Robot Framework in an invoice automation context. More specifically, you'll be using Aito to predict which product category a new invoice belongs to. Categorizing invoices is one of the more common ML + RPA applications, so you might run into it at your own company.

In this example we walk you through using Aito to predict data points in purchase automation. For example, you can predict missing product codes, cost centers, and other fields that document understanding might not provide.

Before you begin


Getting an Aito instance

If you want to follow along with this get started guide, make sure to get your own Aito instance from the Aito Console.

  1. Sign in or create an account, if you haven't got one already.
  2. In the Aito Console go to the instances page and click the "Create an instance" button.
  3. Select the instance type you want to create and fill in the needed fields. Sandbox is the free instance type for testing and small projects. Visit our pricing page to learn more about the Aito instance types.
  4. Click "Create instance" and wait a moment while your instance is created. You will receive an email once your instance is ready.

Install prerequisites

This tutorial provides practical examples for using Aito through Robot Framework. To follow along, you'll need to have the following programs installed.

  1. Install Python 3.6+
  2. Install Aito Python SDK and Robot Framework
pip install aitoai==0.4.0
pip install robotframework==3.2.2

Documentation for the Aito SDK can be found here. This tutorial was last tested with Aito Python SDK version 0.4.0 and Robot Framework version 3.2.2.
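If you want to double-check which versions you ended up with, you can for example ask pip and Robot Framework directly (the exact output formatting varies by pip version):

pip show aitoai robotframework
robot --version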

Environment variables

Set environment variables for the API key and API URL so that your robot code can connect to your instance. You can locate them on your instance page in Aito console.

[Screenshot: API URL and API key on the instance page in the Aito Console]

Use the commands below to set both the API key and URL variables. This example works for Bash; please follow your platform-specific way of setting them.

$ export AITO_INSTANCE_URL=your-env-url
$ export AITO_API_KEY=your-api-key
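The variable names are the same on other platforms, only the syntax differs. For example, on Windows you would use set in the Command Prompt:

set AITO_INSTANCE_URL=your-env-url
set AITO_API_KEY=your-api-key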

TL;DR

Here's a quick overview of the most essential code and steps to upload a dataset to Aito, and then make predictions with new data points.

  1. Download CSV (Train.csv)
  2. Upload the data file to your Aito instance through the Aito Console's file upload feature in the instance overview page.
  3. Create a Robot Framework file where you can write the code, named e.g. aito.robot
  4. Initiate the Aito client and import the API keywords
Library     aito.client.AitoClient     %{AITO_INSTANCE_URL}     %{AITO_API_KEY}    False    WITH NAME    aito_client
Library     aito.api
  5. Define variables for the table name, target column, and response limit, and get the input data
${table}=       invoice_data
${target}=      Product_Category
${inputs}=      # Read from a file or define manually, must represent a dictionary
  6. Formulate the query for the prediction
${query}=     Create Dictionary   from=${table}   where=${inputs}   predict=${target}
  7. Predict
${client}=      Get Library Instance    aito_client
${response}=    Predict    ${client}    ${query}
  8. Get the prediction and its probability
${pred}=    Set Variable    ${response['hits'][0]['feature']}
${prob}=    Set Variable    ${response['hits'][0]['$p']}
  9. Run the code
robot aito.robot

Data

Data lives in Aito as tables. The Train.csv file of a Kaggle dataset will be put into Aito as a single table called invoice_data. It is possible to use linked tables in Aito, but in this example one table is enough.

Also, in this example we only upload a single historical dataset to act as training data, but in a production scenario you would keep adding new entries to the database continuously. No model re-training is needed, just new data.

Aito needs data from the past in order to make predictions for the future. Kaggle has an excellent sample dataset (Train.csv) which contains over 5,500 rows of invoice data together with the correct category for each invoice. The invoice details can be used to define the invoice we want to predict a category for. The value we want to predict also has to be encoded in the data as a column (or a feature in data science terms); in this case, it is Product_Category.

Here's a snapshot of the data:

Inv_Id  Vendor_Code  GL_Code     Inv_Amt  Item_Description                 Product_Category
15001   VENDOR-1676  GL-6100410  83.24    Artworking/Typesetting ...       CLASS-1963
15002   VENDOR-1883  GL-2182000  51.18    Auto Leasing Corporate ...       CLASS-1250
15004   VENDOR-1999  GL-6050100  79.02    Store Management Lease/Rent ...  CLASS-1274
...     ...          ...         ...      ...                              ...

Upload historic data

  1. Upload the sample dataset into your Aito instance with the drag & drop feature in the Aito Console instance overview.

[Screenshot: file upload in the Aito Console instance overview]

  2. Change the table name to invoice_data.

[Screenshot: changing the table name during upload]
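If you prefer the command line, the Aito CLI that ships with the Python SDK can do the same upload in one step. A minimal sketch, assuming Train.csv is in your working directory and the environment variables above are set (check aito -h in case the options differ in your SDK version):

aito quick-add-table --table-name invoice_data Train.csv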


Code

Let's go through the code piece by piece. The full code is available at the end. You can save the code into a file named aito.robot.

Imports

The most convenient way to initialize the Aito client is to do it while importing the library. This is where you'll need the environment variables, since hard-coding your API key and URL is not recommended. The client library is aliased to aito_client with WITH NAME so that it can be fetched later with Get Library Instance. You also need to import aito.api, which provides the Predict and Upload Entries keywords, the standard Collections library to help with Python dictionary operations, and the OperatingSystem library for reading files.

*** Settings ***
Library     Collections
Library     OperatingSystem
Library     aito.client.AitoClient     %{AITO_INSTANCE_URL}     %{AITO_API_KEY}    False    WITH NAME    aito_client
Library     aito.api

Keywords

Next you will create a keyword for making predictions and another for uploading data.

Aito Predict

Let's start with predicting.

We are creating a keyword, Aito Predict, that wraps everything together and makes the test cases easier to read.

As arguments, it expects the table name (${table}), the input data without the target column (${inputs}), the target column you want to predict (${target}), and an optional limit on the number of entries in the response (${limit}).

*** Keywords ***
Aito Predict
    [Arguments]    ${table}    ${inputs}    ${target}    ${limit}=1
    # table: String, name of the table in Aito schema
    # inputs: Dictionary, input data for prediction
    # target: String, name of the feature being predicted
    # limit: Integer, how many entries Aito returns

Note that the ${limit} variable must be an integer. The following line converts it from a string to an integer.

    # Make sure the limit is Integer
    ${limit}=       Convert To Integer  ${limit}

The keyword arguments are compiled into a dictionary which represents the correct query format for Aito, and the request is then sent to the _predict endpoint using POST.

    # Construct predict query body as a Dictionary using arguments
    ${query}=       Create Dictionary   from=${table}   where=${inputs}   predict=${target}   limit=${limit}

    # Make a query to Aito
    ${client}=      Get Library Instance    aito_client
    ${response}=    Predict    ${client}    ${query}

You may simply return the full response and parse it elsewhere. For convenience though, here's how you'll find the most likely value for the target column and its probability.

    # Return only first feature and probability
    [Return]  ${response['hits'][0]['feature']}  ${response['hits'][0]['$p']}
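For reference, the raw response from the _predict endpoint is a JSON object roughly of this shape (the values here are illustrative, not from a real run):

{
  "offset": 0,
  "total": 32,
  "hits": [
    { "$p": 0.95, "field": "Product_Category", "feature": "CLASS-1963" }
  ]
}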

Aito Upload

Next up is uploading a new data entry. We create that as a keyword as well.

The keyword expects the name of the table (${table}) and the full data entry (${inputs}). The entry must contain a value for every column, even if it's null.

Aito Upload
    [Arguments]     ${table}        ${inputs}
    # table: String, name of the table in Aito schema
    # inputs: Dictionary, input data for prediction

The Upload Entries keyword uploads your data entry. Note that it expects a list of entries, so the single entry dictionary is first wrapped into a list. The response lets you know if the upload was successful or if something went wrong.

    ${client}=      Get Library Instance    aito_client
    ${entries}=     Create List     ${inputs}
    ${response}=    Upload Entries    client=${client}    table_name=${table}    entries=${entries}
    [Return]        ${response}
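As a quick usage sketch, here's how you could call the keyword inside a test case with one hand-written entry (the values below are made up for illustration; numeric columns are passed as Robot Framework number variables so they stay numeric):

    ${entry}=       Create Dictionary    Inv_Id=${15100}    Vendor_Code=VENDOR-1424    GL_Code=GL-6100410    Inv_Amt=${78.78}    Item_Description=Digital Display    Product_Category=CLASS-1963
    ${response}=    Aito Upload    ${table}    ${entry}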

Variables

Next, set up the variables. The name of the table needs to be defined even if there's only one table in the database schema. You'll also need to define the name of the column which you want to predict, or in other words, the target column.

*** Variables ***
${table}=       invoice_data
${target}=      Product_Category

Predicting categories

Now you are ready to use the keywords. First you will have to create a file that contains the input data. Save the text below into a file named inputdata.txt and store it in the same directory where your aito.robot file is located.

{
 'Inv_Id': 15033,
 'Vendor_Code': 'VENDOR-1424',
 'GL_Code': 'GL-6100410',
 'Inv_Amt': 78.78,
 'Item_Description': 'Digital Display'
}

The object in the file describes an imaginary invoice for which we want to predict the Product_Category.

Reading the file returns a string which represents a dictionary, so you need to turn it into a dictionary with the BuiltIn library's Evaluate keyword. Note that if the target variable (in our case Product_Category) exists in the dictionary, it must be removed, as all of the variables in the dictionary are used for predicting the target variable. Having the target variable in the dictionary would skew the results; see the sketch after the code below.

*** Test Cases ***
Predict
    # Read file and turn into dictionary
    ${data}=        Get File          inputdata.txt
    ${data}=        BuiltIn.Evaluate          ${data}
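To guard against that, you can drop the target column defensively before predicting. A small sketch using the Collections library (Remove From Dictionary ignores keys that are not present):

    # Defensive: remove the target column in case the input file contains it
    Remove From Dictionary    ${data}    ${target}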

Run the predict query by running the Aito Predict keyword which was defined earlier. It returns the predicted value for our target column Product_Category and the probability of the prediction.

    ${pred}   ${prob} =     Aito Predict      ${table}    ${data}    ${target}

Print the results

In case you want to see what kind of result Aito predicts, you can add the following line to your robot file.

    Log To Console    \n "Predicted Product Category": ${pred} \n "Probability": ${prob}
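When the test runs, this prints something along these lines to the console (the exact category and probability depend on your data):

 "Predicted Product Category": CLASS-1522
 "Probability": 0.9265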

Uploading entries

After the predicted value has been checked to be correct, the entry can be uploaded into Aito to serve as additional training data. You can use the Aito Upload keyword you created for uploading the entry. Please note that uploading unverified predicted values is not recommended, since it may degrade accuracy over time, so only add entries to Aito which have been confirmed to be correct.

    Set To Dictionary       ${data}            ${target}        ${pred}
    ${response}=            Aito Upload        ${table}         ${data}

Running the robot

Now you are equipped to use Aito through Robot Framework! The code should be saved in a Robot Framework file, e.g. aito.robot. You can run it with the following command.

robot aito.robot
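A successful run ends with a summary roughly like the following (Robot Framework 3.2 console output; names, counts and timings will vary):

==============================================================================
Aito
==============================================================================
Predict And Upload                                                    | PASS |
------------------------------------------------------------------------------
Aito                                                                  | PASS |
1 critical test, 1 passed, 0 failed
1 test total, 1 passed, 0 failed
==============================================================================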

Wrapping up

Here's the full code we created.

*** Settings ***
Library     Collections
Library     OperatingSystem
Library     aito.client.AitoClient     %{AITO_INSTANCE_URL}     %{AITO_API_KEY}    False    WITH NAME    aito_client
Library     aito.api

*** Keywords ***
Aito Predict
    [Arguments]    ${table}    ${inputs}    ${target}    ${limit}=1
    # table: String, name of the table in Aito schema
    # inputs: Dictionary, input data for prediction
    # target: String, name of the feature being predicted
    # limit: Integer, how many entries Aito returns

    # Make sure the limit is an integer
    ${limit}=       Convert To Integer  ${limit}

    # Construct predict query body as a Dictionary using arguments
    ${query}=       Create Dictionary   from=${table}   where=${inputs}   predict=${target}   limit=${limit}

    # Make the query to Aito
    ${client}=      Get Library Instance    aito_client
    ${response}=    Predict    ${client}    ${query}
    
    # Return only first feature and probability
    [Return]    ${response['hits'][0]['feature']}    ${response['hits'][0]['$p']}
    
Aito Upload
    [Arguments]     ${table}        ${inputs}
    # table: String, name of the table in Aito schema
    # inputs: Dictionary, full data entry to upload

    ${client}=      Get Library Instance    aito_client
    ${entries}=     Create List     ${inputs}
    ${response}=    Upload Entries    client=${client}    table_name=${table}    entries=${entries}
    [Return]        ${response}
    
*** Variables ***
${table}=       invoice_data
${target}=      Product_Category

*** Test Cases ***
Predict And Upload
    # Read file and turn into dictionary
    ${data}=        Get File          inputdata.txt
    ${data}=        BuiltIn.Evaluate          ${data}
    
    # Send predict query to Aito
    ${pred}   ${prob} =     Aito Predict      ${table}    ${data}    ${target}
    
    Log To Console    \n"Predicted Product Category": ${pred} \n"Probability": ${prob}

    # Add result to dictionary and upload data entry to Aito
    Set To Dictionary       ${data}            ${target}        ${pred}
    ${response}=            Aito Upload        ${table}         ${data}

Deleting the data

In case you want to start making predictions from your data with a clean slate, you can delete the table created in this example, along with its data, from your instance with the following Aito CLI command.

aito delete-table invoice_data