Insert your API Query Manager data into Azure Blob Storage

Import your data into Microsoft Azure with our API QM product

With Dataslayer you can generate a URL directly from our API QM product that returns your data in JSON format (learn how here). Each URL corresponds to a data table.
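As a quick check, you can fetch one of these URLs from any Python environment and inspect the JSON it returns. The snippet below is only a sketch; the URL is a placeholder that you should replace with one generated in your own API Query Manager:

import requests

# Placeholder: replace with a URL generated in your API Query Manager
url = '<your_api_query_manager_url>'

response = requests.get(url)
response.raise_for_status()  # fail early if the URL is wrong or has expired

data = response.json()       # the data table returned by the URL, as JSON
print(data)                  # preview the structure before automating the upload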

To load the data from each URL you generate with us into Azure, all you need is a Lambda function that calls as many URLs as you generate and need, optionally converts the JSON response to CSV (a CSV variant is sketched after the example below), and finally uploads the result to your Azure account. All of this can be done by the Lambda function.

This is an example of a Lambda function that calls different URLs created with our API QM product and pushes their data into Azure. In this version of the Lambda function, we use the azure.storage.blob module to interact with Azure Storage. To authenticate the connection, you'll need to get a connection string and container name from your Azure Storage account and provide them in the function. Additionally, we use a BlobClient to interact with the blob objects. This example is written in Python, but it can be translated into any programming language you want, and you can also take advantage of Lambda to transform the data before saving it to your Azure account:

import json
import requests
from azure.storage.blob import BlobServiceClient

def lambda_handler(event, context):
    urls = [
        # Each API Query Manager URL
        # ...
    ]

    # Provide the connection string and container name of your Azure Storage account
    connect_str = '<connection_string>'
    container_name = '<container_name>'

    # Create a BlobServiceClient object to interact with the Blob storage account
    blob_service_client = BlobServiceClient.from_connection_string(connect_str)

    for url in urls:
        # Call the API Query Manager URL and read its JSON response
        response = requests.get(url)
        data = response.json()
        # Get a BlobClient object to represent the destination blob
        # (use a different blob name for each URL so the files are not overwritten)
        blob_client = blob_service_client.get_blob_client(container=container_name, blob='<file_name>.json')
        # Upload the JSON data to the blob, replacing it if it already exists
        blob_client.upload_blob(json.dumps(data), overwrite=True)

    return 'Data stored in Azure Blob storage successfully'
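
If you prefer to store each table as CSV, as mentioned above, you can convert the JSON response inside the same loop before uploading it. The sketch below assumes the JSON decodes to a list of rows (a list of dictionaries with the same keys); adjust the conversion to the actual shape of your API QM response:

import csv
import io

def json_rows_to_csv(rows):
    # Assumption: 'rows' is a list of dictionaries sharing the same keys;
    # adapt this if your API QM response is shaped differently
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

# Inside the loop of the Lambda function above:
#     csv_data = json_rows_to_csv(data)
#     blob_client = blob_service_client.get_blob_client(container=container_name, blob='<file_name>.csv')
#     blob_client.upload_blob(csv_data, overwrite=True)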

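In production you may prefer not to hard-code the connection string. One option (a suggestion, not a requirement of our product) is to store it as a Lambda environment variable and read it with os.environ; the variable names below are only examples:

import os

# Read the Azure Storage settings from the Lambda function's environment variables
# (AZURE_STORAGE_CONNECTION_STRING and AZURE_CONTAINER_NAME are example names)
connect_str = os.environ['AZURE_STORAGE_CONNECTION_STRING']
container_name = os.environ['AZURE_CONTAINER_NAME']

Also keep in mind that the requests and azure-storage-blob packages are not included in the default AWS Lambda Python runtime, so they need to be packaged with the function or added as a Lambda layer.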

Still have questions or doubts about this? Don't hesitate to contact us via our live chat on our website or via email.

Updated on: 24/01/2024
