Question

I am working with Azure at the moment, and I am unhappy with the predefined functions in Data Factory because they start a cluster in the background, which is absolutely unnecessary for my problem.

I receive a CSV file in a predefined folder and want to pick a set of columns and store them, in a certain order, in a new CSV file.

At the moment my file looks as follows:

The JSON file:

{
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "path": "input-raw",
      "connection": "AzureWebJobsStorage",
      "direction": "in"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "path": "{blobTrigger}-copy",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    }
  ],
  "disabled": false,
  "scriptFile": "__init__.py"
}

The __init__.py:

import logging
import azure.functions as func

def main(myblob: func.InputStream, outputblob: func.Out[func.InputStream]):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    outputblob.set(myblob)

My function picks up a file in the folder and copies it into the same folder with '-copy' appended to the name. Is there an easy way to access the data and edit it with Python?

Till now I have tried the packages 'csv', 'io' and 'fileinput' to read the information, but so far I could not manage to edit or even see the data within Visual Studio Code.

If you need more information please let me know.

Best P

Answers

In fact, there is no way to edit the .csv file in place. But you can download the .csv file, change it, and then upload it again to overwrite the .csv file on Azure.

By the way, if I read it right, your function has a big problem: when the Azure Function is triggered, it will create endless 'xx-copy' files in your container. The output file itself matches the trigger condition of your function, so the function will keep triggering itself endlessly.
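One way to break that loop is to write the output into a different container. A sketch of the bindings, assuming a second container (here called output-processed, an assumed name) exists and that the trigger path uses the {name} binding expression to capture the blob name:

```json
{
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "path": "input-raw/{name}",
      "connection": "AzureWebJobsStorage",
      "direction": "in"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "path": "output-processed/{name}",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    }
  ],
  "disabled": false,
  "scriptFile": "__init__.py"
}
```

With this, new blobs in input-raw trigger the function, and the result lands in output-processed under the same name, so the output can never re-trigger the function.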

This is my function. It uses the InputStream from func to read the blob data:

import logging
import azure.functions as func


def main(myblob: func.InputStream):
    # Read the blob content as bytes and decode it to a UTF-8 string
    logging.info(myblob.read().decode("utf-8"))
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-workitems",
      "connection": "AzureWebJobsStorage"
    }
  ]
}

In my situation, I first read the blob data as bytes and then convert it to a string. Let me know whether this solves your question. :)
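Once you have the blob content as a string, the standard-library csv and io modules are enough to pick out columns. A minimal sketch (select_columns is a helper name I made up, and the column names used in the usage example below are placeholders for whatever columns you need, in the order you need them):

```python
import csv
import io


def select_columns(csv_text, columns):
    """Keep only the given columns, in the given order, and
    return the result as CSV text."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=columns,
                            extrasaction="ignore", lineterminator="\n")
    writer.writeheader()
    for row in reader:
        # Missing columns are written as empty cells
        writer.writerow({col: row.get(col, "") for col in columns})
    return out.getvalue()
```

Inside main you could then write the result through an output blob binding typed as func.Out[str], e.g. outputblob.set(select_columns(myblob.read().decode("utf-8"), ["colB", "colA"])), where colB and colA stand in for your real column names.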
