How can you insert your API Query Manager data into AWS

Import your data into Amazon Web Services (AWS) with our API QM product

With Dataslayer you can generate a URL directly from our API QM product that returns your data in JSON format (learn how here). Each URL corresponds to a data table.

To load the data from each URL you generate with us into an S3 bucket in AWS, all you need is a Lambda function that calls as many URLs as you have generated, optionally converts the JSON to CSV, and finally uploads the result to your S3 bucket. All of this can be done inside the Lambda function.

Below is an example of a Lambda function that calls several URLs created with our API QM product and pushes their data into an S3 bucket on AWS. The example is written in Python, but you can use any programming language Lambda supports, and you can also take advantage of Lambda to transform the data before saving it to S3:

import json

import boto3
import requests

def lambda_handler(event, context):
    urls = [
        # Each API Query Manager URL
        # ...
    ]

    s3 = boto3.client('s3')
    for url in urls:
        response = requests.get(url)
        data = response.json()
        # put_object expects bytes or a string, so serialize the JSON first
        s3.put_object(
            Bucket='<Bucket Name>',
            Key='<File Name>.json',
            Body=json.dumps(data),
        )

    return 'Data stored in S3 successfully'
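If you prefer to store CSV instead of JSON, you can flatten the response before uploading it. This is a minimal sketch of that conversion, assuming the API returns a list of flat records; the actual shape of your API QM response may differ, so adjust the parsing accordingly:

```python
import csv
import io

def json_to_csv(records):
    """Convert a list of flat dictionaries to a CSV string."""
    if not records:
        return ''
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()

# Hypothetical rows for illustration; the resulting string can be passed
# as Body to s3.put_object with a '<File Name>.csv' key
rows = [
    {'date': '2023-08-01', 'clicks': 120},
    {'date': '2023-08-02', 'clicks': 98},
]
print(json_to_csv(rows))
```

Calling `json_to_csv(data)` inside the Lambda handler instead of `json.dumps(data)` would upload CSV files rather than JSON.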

Still have questions or doubts about this? Don't hesitate to contact us via our live chat on our website or via email.

Updated on: 23/08/2023
