Parabola.io
  • 20 Oct 2020
  • 2 Minutes To Read
Aito works great with several workflow automation platforms, including the awesome Parabola. Once you have your Aito instance up and running, here's how to get started in no time!

Creating a schema in Aito

We recommend creating the data schema in the Aito Console or with the CLI first; this way you can ensure everything is set up properly to receive your data.
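As a rough illustration, a table schema in Aito is a JSON document that declares each table and its column types. The sketch below builds a minimal example; the table name (orders) and column names are hypothetical, so replace them with your own.

```python
import json

# A minimal example table schema (hypothetical table and column names).
# In the Aito Console you would paste the JSON itself; this sketch just
# builds and prints it for reference.
schema = {
    "schema": {
        "orders": {                      # hypothetical table name
            "type": "table",
            "columns": {
                "product": {"type": "String"},
                "category": {"type": "String"},
            },
        }
    }
}

print(json.dumps(schema, indent=2))
```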

Adding data: Send to an API

Adding data to Aito happens with the "Send to an API" activity.
Whatever the input to that activity is, each row is sent one by one to the API, in this case to Aito. Note! If you are working with bigger datasets, you should explore batch transfer options.

Check the screenshots below for guidance on how to set up the connection. Here are the steps.

  • Request type is POST.
  • API endpoint URL is the one given to you in the Console, along with /api/v1/data/[tablename], where you need to add the name of the table in Aito to which you are uploading data.
  • Body needs to be constructed to have a JSON element that contains the data that you are uploading. Check the example in the screenshot of using Parabola's merge tags to add data from your input.
  • Max requests throttles the number of calls sent to the API. Drop it to 30 per minute to avoid any issues.
  • In authentication, you should add two custom headers. One is Content-Type with value application/json, and the second is x-api-key with the value being the read/write key that you find in the Aito Console.
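The request Parabola builds from the steps above can be sketched in plain Python. This is an assumption-laden illustration: the instance URL, table name, and key below are placeholders, and the actual sending is left commented out.

```python
import json
import urllib.request

# Placeholders -- substitute your own values from the Aito Console.
INSTANCE_URL = "https://your-instance.aito.app"   # instance URL from the Console
TABLE = "orders"                                  # hypothetical table name
API_KEY = "YOUR_READ_WRITE_KEY"                   # read/write key from the Console

def build_request(row: dict) -> urllib.request.Request:
    """Build the POST request that uploads one input row to Aito."""
    return urllib.request.Request(
        url=f"{INSTANCE_URL}/api/v1/data/{TABLE}",
        data=json.dumps(row).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "x-api-key": API_KEY,
        },
        method="POST",
    )

req = build_request({"product": "tape", "category": "office"})
print(req.full_url)
# urllib.request.urlopen(req)  # this line would actually send the row
```

In Parabola you don't write this code; the "Send to an API" activity does the equivalent for every row of its input.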


Making predictions: Pull from an API or Enrich with an API

There are two ways to make queries to Aito API from your Parabola workflow.

Pull from an API works if you start by getting some data from Aito and push it to other activities later on.

Enrich with an API is the more typical way of using Aito in Parabola. It enriches your existing dataset by triggering a call for each row and adding something new to it, for example predicting a missing datapoint for each row.

The example below pulls data from a Google Sheet, enriches it with Aito, and then pushes it back to another Google Sheet.

Overview workflow

Open the activity and configure your action with Aito. Follow the numbering in the screenshots.

  1. Choose the right HTTP request method in Type. Most of the time it is POST.
  2. Endpoint URL is a combination of your instance URL (check it in the Aito Console) and the API endpoint in use. In this case we are using _similarity as the endpoint.
  3. Query body goes here. Essentially, you are writing an Aito predictive query. Check for tips here, and remember that Parabola merge tags make it easy to feed in the data from each row of your input.
  4. Add a custom header Content-Type with the value application/json.
  5. Add another custom header, where the key is x-api-key and the value is the API key that you can fetch from the Aito Console.
  6. This is really handy! Parabola will parse the response and pick out the right element if you choose "hits" as the Top Level Key in the Nested Keys section. Note that in this case it is recommended to limit the query results to one with "limit": 1 in your query body; this way you'll get only the top prediction for each row.
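The steps above can be sketched as a query body plus the response shape that the "hits" Top Level Key unwraps. The table and column names here are hypothetical, and the response is a mocked example of the general shape, not output from a real instance.

```python
import json

# Step 3: an example _similarity query body (hypothetical table/column).
# In Parabola, the value "tape" would instead be a merge tag that pulls
# the data from each input row.
query = {
    "from": "orders",                   # hypothetical table name
    "similarity": {"product": "tape"},  # merge tag per row in Parabola
    "limit": 1,                         # only the top hit, as in step 6
}
print(json.dumps(query))

# Step 6: a mocked response of the general shape Aito returns. Choosing
# "hits" as the Top Level Key makes Parabola keep just the hits array,
# and with "limit": 1 there is exactly one hit per row.
response = {"offset": 0, "total": 1, "hits": [{"product": "duct tape"}]}
top_hit = response["hits"][0]
```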

Configuring Aito query, part 1

Configuring Aito query, part 2

Got questions?

With this you are good to go! For any questions or help, join our Slack channel.
