Aito workflow

Aito is a predictive database that is accessed through an API. When you create an Aito instance in the Aito console, you get a URL together with read-only and read/write API keys. Read this article if you're unfamiliar with how an API endpoint is used, i.e. how to send requests.
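
To make this concrete, here is a minimal sketch in Python of sending a request to an Aito instance. The instance URL, API key and table name are placeholders, and the endpoint path and header name are assumptions about the HTTP API; check your own instance details in the Aito console.

```python
import requests

# Placeholder values: use the URL and keys shown for your instance in the Aito console.
AITO_URL = "https://my-instance.aito.app"
API_KEY = "my-read-only-api-key"

# A simple query: fetch a few rows from a hypothetical 'invoices' table.
# The endpoint path and 'x-api-key' header are assumptions about the API.
response = requests.post(
    f"{AITO_URL}/api/v1/_query",
    headers={"x-api-key": API_KEY, "content-type": "application/json"},
    json={"from": "invoices", "limit": 5},
)
print(response.status_code)
print(response.json())
```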

[Image: Aito workflow graph]

  1. The workflow starts from a use case. A use case for Aito should be one for which you have existing, labeled historical data. For example, if your use case is to select a group for an incoming invoice, you need data on how old invoices were grouped. Aito needs this data in order to make predictions. You can find inspiration for your use case in our Tutorials section.

  2. Once you have decided on a use case and gathered the data that goes with it, you need to define a schema for the data. Defining a schema means telling Aito how it should process each of the input variables. For more info on the Aito schema, you can read the schema concept article and the how to create a schema article. A sketch of a schema upload is shown after this list.

  3. When you have uploaded the schema into Aito, you can start uploading data. ETL stands for Extract, Transform and Load: in the Extract phase you take the data out of the data source (a database, a CSV file, etc.), in the Transform phase you convert it into the JSON or NDJSON format that Aito accepts, and finally you Load the data into Aito. You can read more about data upload in this article; there is also a short upload sketch after this list.

  4. Once your schema and data are in, you can make predictions using queries. In data science terms, the query plays the role of the ML model: in the query you define, using a SQL-like query language, the entity for which you want to predict something. For more info about the query language, you can read this article; an example prediction query is sketched after this list.

  5. Then it's time to evaluate how good the results are in terms of accuracy. For this purpose, you can use the evaluate endpoint Aito offers (see the sketch after this list). If you're interested in interpreting Aito's results, check out this article. If the results are not as good as you expected, you can try modifying the query, or you may have to take a deeper dive into your data, in case it isn't a representative sample of the use case in question. Aito supports multiple use cases within the limits of the schema and the data itself, as you can use the same database with different kinds of queries to answer different questions. You can also add new data into Aito after the initial upload without any extra re-training steps, but always remember to check that the prediction accuracy is still where you need it to be.

  6. When you're happy with the accuracy of your predictions, you can go ahead and use Aito in any application you see fit! It is usually a good idea to automate data updates into Aito, but remember to check the accuracy of your predictions every time you add new data.
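
To illustrate step 2, here is a minimal sketch of defining and uploading a schema. The table and column names are invented for the invoice-grouping example above, and the endpoint path is an assumption; see the schema articles for the authoritative format.

```python
import requests

AITO_URL = "https://my-instance.aito.app"          # placeholder instance URL
RW_KEY = "my-read-write-api-key"                   # schema changes need the read/write key
HEADERS = {"x-api-key": RW_KEY, "content-type": "application/json"}

# A hypothetical schema for an invoice-grouping use case:
# each column declaration tells Aito how to treat that input variable.
schema = {
    "schema": {
        "invoices": {
            "type": "table",
            "columns": {
                "vendor": {"type": "String"},
                "description": {"type": "Text", "analyzer": "english"},
                "amount": {"type": "Decimal"},
                "group": {"type": "String"},       # the label we want to predict later
            },
        }
    }
}

# Upload the schema (assumed endpoint: PUT /api/v1/schema).
response = requests.put(f"{AITO_URL}/api/v1/schema", headers=HEADERS, json=schema)
print(response.status_code, response.json())
```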

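For step 3, the Load part of the ETL flow can be sketched as follows: rows from your source system are transformed into JSON objects that match the schema and posted as a batch. The batch endpoint path is an assumption; the data upload article referenced above has the full details.

```python
import requests

AITO_URL = "https://my-instance.aito.app"
RW_KEY = "my-read-write-api-key"
HEADERS = {"x-api-key": RW_KEY, "content-type": "application/json"}

# Transform: rows from your source system, already shaped to match the schema.
rows = [
    {"vendor": "Acme Oy", "description": "Office chairs", "amount": 1200.0, "group": "Furniture"},
    {"vendor": "CloudHost", "description": "Hosting, March", "amount": 89.9, "group": "IT services"},
]

# Load: batch upload into the 'invoices' table (assumed endpoint).
response = requests.post(
    f"{AITO_URL}/api/v1/data/invoices/batch",
    headers=HEADERS,
    json=rows,
)
print(response.status_code, response.json())
```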

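Finally, to illustrate steps 4 and 5, the sketch below first asks Aito to predict the group of a new invoice and then runs a rough accuracy check against held-out rows. The query shapes, the `$get` references and the test/train split are illustrative assumptions about the query language; see the query and evaluation articles for the exact syntax.

```python
import requests

AITO_URL = "https://my-instance.aito.app"
RO_KEY = "my-read-only-api-key"
HEADERS = {"x-api-key": RO_KEY, "content-type": "application/json"}

# Step 4: predict the group for an incoming invoice (assumed endpoint: /api/v1/_predict).
predict_query = {
    "from": "invoices",
    "where": {"vendor": "Acme Oy", "description": "Standing desks"},
    "predict": "group",
}
prediction = requests.post(f"{AITO_URL}/api/v1/_predict", headers=HEADERS, json=predict_query)
print(prediction.json())

# Step 5: check accuracy (assumed endpoint: /api/v1/_evaluate).
# Here every 10th row is held out as test data; the rest is used as known history.
evaluate_query = {
    "test": {"$index": {"$mod": [10, 0]}},
    "evaluate": {
        "from": "invoices",
        "where": {"vendor": {"$get": "vendor"}, "description": {"$get": "description"}},
        "predict": "group",
    },
}
evaluation = requests.post(f"{AITO_URL}/api/v1/_evaluate", headers=HEADERS, json=evaluate_query)
print(evaluation.json())  # typically includes accuracy and error metrics
```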