Get started using Robot Framework
  • Updated on 17 Jul 2020
  • 8 minutes to read


Before you begin


Getting an Aito instance

If you want to follow along with this getting-started guide, you'll need your own Aito instance from the Aito Console.

  1. Sign in, or create an account if you don't have one already.
  2. In the Aito Console, go to the instances page and click the "Create an instance" button.
  3. Select the instance type you want to create and fill in the needed fields. Sandbox is the free instance type for testing and small projects. Visit our pricing page to learn more about the Aito instance types.
  4. Click "Create instance" and wait a moment while your instance is created. You will receive an email once your instance is ready.

Install prerequisites

This tutorial provides practical examples for using Aito through Robot Framework. To follow along, you’ll need to have the following programs installed.

  1. Install Python 3.6+
  2. Install Aito Python SDK and Robot Framework
pip install aitoai
pip install robotframework

Documentation for the Aito SDK can be found here.

Set environment variables

Set environment variables for the API key and API URL. You can find both on your instance page in the Aito Console.

Use the commands below to set them as environment variables. Note that the exact command may vary between operating systems.

$ export AITO_INSTANCE_URL=your-env-url
$ export AITO_API_KEY=your-api-key
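
Once set, you can sanity-check from Python that the variables are visible to your process before running the robot. This is an optional helper of our own (read_aito_config is not part of the Aito SDK):

```python
import os

def read_aito_config():
    """Read the Aito connection settings from the environment and
    fail early with a clear message if either one is missing."""
    url = os.environ.get("AITO_INSTANCE_URL")
    key = os.environ.get("AITO_API_KEY")
    if not url or not key:
        raise RuntimeError("Set AITO_INSTANCE_URL and AITO_API_KEY first")
    return url, key
```

Failing here is much easier to diagnose than an authorization error from the API later on.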

TL;DR

Here's a quick overview of the most essential code and steps, using predicting and uploading as examples.

  1. Download the CSV (train.csv)
  2. Upload the data file to your Aito instance through the Aito Console's file upload feature on the instance overview page.
  3. Create a Robot Framework file for the code, named e.g. aito.robot
  4. Import the libraries and initialize the Aito client
Library     aito.sdk.aito_client.AitoClient     %{AITO_INSTANCE_URL}     %{AITO_API_KEY}    False
  5. Define variables for the table name, target column, and response limit, and get the input data
${table}=       invoice_data
${target}=      Product_Category
${inputs}=      # Read from a file or define manually, must represent a dictionary
  6. Formulate the query for prediction
${query}=     Create Dictionary   from=${table}   where=${inputs}   predict=${target}
  7. Send the query to the _predict endpoint
${response}=    Request      POST     /api/v1/_predict    ${query}
  8. Get the prediction and its probability
${pred}=    Set Variable    ${response['hits'][0]['feature']}
${prob}=    Set Variable    ${response['hits'][0]['$p']}
  9. Run the code
robot aito.robot
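
For reference, the predict step boils down to a single HTTP request. The sketch below builds (but does not send) that request with the standard library; it assumes Aito accepts the API key in an x-api-key header, so double-check against your instance's API documentation:

```python
import json
import urllib.request

def build_predict_request(instance_url, api_key, query):
    """Build the POST request that the Robot keyword sends via the
    Aito client. The query is the same from/where/predict dictionary."""
    return urllib.request.Request(
        url=instance_url + "/api/v1/_predict",
        data=json.dumps(query).encode("utf-8"),
        headers={"x-api-key": api_key, "content-type": "application/json"},
        method="POST",
    )
```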

Intro

Using the Aito Python SDK with Robot Framework works just the same as using any other Python library. Understanding the Aito Python SDK is highly beneficial, but you can follow this tutorial and get an idea of how Aito and Robot Framework connect even without prior experience with the SDK.


The Problem

This tutorial walks you through using Aito with Robot Framework in an invoice automation context. More specifically, you'll use Aito to predict which product category a new invoice belongs to. Categorizing invoices is one of the more common ML + RPA applications, so you might run into it at your own company.


Data

Data lives in Aito as tables. The train.csv file of a Kaggle dataset will be put into Aito as a single table called invoice_data. It is possible to use linked tables in Aito, but in this example one table is enough.

Aito needs data from the past in order to make predictions for the future. Kaggle has an excellent sample dataset (train.csv) which contains over 5,500 invoice rows, each with the correct category. The invoice details define the invoice whose category we want to predict. The value to predict must also be encoded in the data as a column (a feature, in data science terms); in this case it is Product_Category.

Here's a snapshot of the data:

Inv_Id  Vendor_Code  GL_Code     Inv_Amt  Item_Description                 Product_Category
15001   VENDOR-1676  GL-6100410  83.24    Artworking/Typesetting ...       CLASS-1963
15002   VENDOR-1883  GL-2182000  51.18    Auto Leasing Corporate ...       CLASS-1250
15004   VENDOR-1999  6050100     79.02    Store Management Lease/Rent ...  CLASS-1274
...     ...          ...         ...      ...                              ...

Upload Data

  1. Upload the sample dataset to your Aito instance with the drag & drop feature on the Aito Console instance overview page.


  2. Change the table name to invoice_data.



Code

Let's go through the code line by line; the full code is available at the end. Save the code into a file named aito.robot.

Imports

The most convenient way to initialize the Aito client is to do it while importing the library. This is where you'll need the environment variables, since hard-coding your API key and URL is not recommended. You also need to import the standard Collections library to help with Python dictionary operations and the OperatingSystem library for reading files.

*** Settings ***
Library     Collections
Library     OperatingSystem
Library     aito.sdk.aito_client.AitoClient     %{AITO_INSTANCE_URL}     %{AITO_API_KEY}    False

Keywords

Next you will create a keyword for making predictions and another for uploading data.

Aito Predict
Let's start with predicting.

The steps taken in the keyword are quite self-evident. As arguments, it expects the table name (${table}), the input data without the target column (${inputs}), the target column you want to predict (${target}), and an optional limit on the number of entries in the response (${limit}).

*** Keywords ***
Aito Predict
    [Arguments]    ${table}    ${inputs}    ${target}    ${limit}=1
    # table: String, name of the table in Aito schema
    # inputs: Dictionary, input data for prediction
    # target: String, name of the feature being predicted
    # limit: Integer, how many entries Aito returns

Note that the ${limit} variable must be an integer. The following line converts it from a string to an integer.

    # Make sure the limit is Integer
    ${limit}=       Convert To Integer  ${limit}

The keyword arguments are compiled into a dictionary that matches Aito's query format, and the request is then sent to the _predict endpoint using POST.

    # Construct predict query body as a Dictionary using arguments
    ${query}=       Create Dictionary   from=${table}   where=${inputs}   predict=${target}   limit=${limit}

    # Query for Aito
    ${response}=    Request              POST    /api/v1/_predict     ${query}
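
For comparison, the Create Dictionary step above produces a body equivalent to the following JSON, as received by the _predict endpoint (the where values here are the illustrative invoice details used later in this tutorial):

```python
import json

# The same predict query expressed as plain JSON.
query = {
    "from": "invoice_data",
    "where": {
        "Vendor_Code": "VENDOR-1424",   # illustrative input values
        "GL_Code": "GL-6100410",
        "Inv_Amt": 78.78,
        "Item_Description": "Digital Display",
    },
    "predict": "Product_Category",
    "limit": 1,
}
body = json.dumps(query)
```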

You could return the full response and parse it elsewhere. For convenience, though, here's how to pick out the most likely value for the target column and its probability.

    # Return only first feature and probability
    [Return]  ${response['hits'][0]['feature']}  ${response['hits'][0]['$p']}

Aito Upload
Next up is uploading a new entry of data.

The keyword expects the name of the table (${table}) and the full data entry (${inputs}). The entry must contain a value for every column, even if it's null.

Aito Upload
    [Arguments]     ${table}        ${inputs}
    # table: String, name of the table in Aito schema
    # inputs: Dictionary, input data for prediction

Catenate the table name onto the base endpoint and send the request with the input data. This uploads your data entry. The response lets you know whether the upload succeeded or something went wrong.

    ${endpoint}=    Catenate    SEPARATOR=    /api/v1/data/       ${table}
    ${response}=    Request         POST        ${endpoint}   ${inputs}
    [Return]        ${response}
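
The Catenate step is plain string concatenation; a Python equivalent would be:

```python
def data_endpoint(table):
    """Build the upload endpoint for a table, mirroring the
    Catenate keyword above (base path + table name)."""
    return "/api/v1/data/" + table
```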

Variables

Next, set up the variables. The name of the table needs to be defined even if there's only one table in the database schema. You'll also need to define the name of the column which you want to predict, or in other words, the target column.

*** Variables ***
${table}=       invoice_data
${target}=      Product_Category

Predicting categories

Now you are ready to use the keywords. Start off by reading the input data from a .txt file located in the same directory. Below is the content of the text file.

{
 'Inv_Id': 15033,
 'Vendor_Code': 'VENDOR-1424',
 'GL_Code': 'GL-6100410',
 'Inv_Amt': 78.78,
 'Item_Description': 'Digital Display'
}

The object in the file describes an imaginary invoice for which we want to predict the Product_Category. Save the above text into a file named inputdata.txt.

Reading the file returns a string that represents a dictionary, so you need to turn it into a dictionary with the Evaluate keyword. Note that if the target variable (in our case Product_Category) exists in the dictionary, it must be removed, as all of the variables in the dictionary are used to predict the target. Having the target variable among the inputs will skew the results.

*** Test Cases ***
Predict
    # Read file and turn into dictionary
    ${data}=        Get File          inputdata.txt
    ${data}=        Evaluate          ${data}
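
The Evaluate step interprets the file's text as a Python literal. If you ever prepare the inputs outside Robot Framework, a safe stdlib equivalent is ast.literal_eval; the helper below, including the removal of the target column, is our own sketch:

```python
import ast

def read_inputs(path, target):
    """Read a file containing a Python-literal dictionary and drop
    the target column if present, since it must not be an input."""
    with open(path) as f:
        data = ast.literal_eval(f.read())
    data.pop(target, None)
    return data
```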

Run the prediction by calling the Aito Predict keyword defined earlier. It returns the predicted value for our target column Product_Category and the probability of the prediction.

    ${pred}   ${prob} =     Aito Predict      ${table}    ${data}    ${target}

Uploading entries

After the predicted value has been verified to be correct, the entry can be uploaded into Aito to serve as additional training data. You can use the Aito Upload keyword you created for uploading the entry. Please note that uploading unverified predicted values is not recommended, since it may degrade accuracy over time, so only add entries to Aito that have been confirmed correct.

    Set To Dictionary       ${data}            ${target}        ${pred}
    ${response}=            Aito Upload        ${table}         ${data}

Running the robot

Now you are equipped to use Aito through Robot Framework! The code should be saved in a Robot Framework file, e.g. aito.robot. Run it with the following command.

robot aito.robot

Wrapping up

Here's the full code we created.

*** Settings ***
Library     Collections
Library     OperatingSystem
Library     aito.sdk.aito_client.AitoClient     %{AITO_INSTANCE_URL}     %{AITO_API_KEY}    False

*** Keywords ***
Aito Predict
    [Arguments]    ${table}    ${inputs}    ${target}    ${limit}=1
    # table: String, name of the table in Aito schema
    # inputs: Dictionary, input data for prediction
    # target: String, name of the feature being predicted
    # limit: Integer, how many entries Aito returns

    # Make sure the limit is an integer
    ${limit}=       Convert To Integer  ${limit}

    # Construct predict query body as a Dictionary using arguments
    ${query}=       Create Dictionary   from=${table}   where=${inputs}   predict=${target}   limit=${limit}

    # Query for Aito
    ${response}=    Request    POST    /api/v1/_predict     ${query}
    
    # Return only first feature and probability
    [Return]    ${response['hits'][0]['feature']}    ${response['hits'][0]['$p']}
    
Aito Upload
    [Arguments]     ${table}        ${inputs}
    # table: String, name of the table in Aito schema
    # inputs: Dictionary, input data for prediction
    
    ${endpoint}=    Catenate    SEPARATOR=    /api/v1/data/       ${table}
    ${response}=    Request         POST        ${endpoint}   ${inputs}
    [Return]        ${response}
    
*** Variables ***
${table}=       invoice_data
${target}=      Product_Category

*** Test Cases ***
Predict And Upload
    # Read file and turn into dictionary
    ${data}=        Get File          inputdata.txt
    ${data}=        Evaluate          ${data}
    
    # Send predict query to Aito
    ${pred}   ${prob} =     Aito Predict      ${table}    ${data}    ${target}
    
    # Add result to dictionary and upload data entry to Aito
    Set To Dictionary       ${data}            ${target}        ${pred}
    ${response}=            Aito Upload        ${table}         ${data}

Deleting the data

If you want to start making predictions from your data with a clean slate, you can delete all tables and data from your instance with the following Aito CLI command.

aito delete-database