Function App upload file directly to Storage Account

So this is sorted now, thanks to /u/AdamMarczakIO who pointed me in the right direction. <3

In case anyone comes across the same issue as I did, here's the solution:

Edit: Gave up trying to deal with Reddit's handling of inline code with RIF and new.reddit.

I added this line to run.ps1:

```powershell
# Publish to Blob
Push-OutputBinding -Name blobStorage_connection -Value ($payload -join "`n")
```

The "blobStorage_connection" is the name property that you specify in the function.json file, which looks like:

```json
{
  "bindings": [
    {...},
    {...},
    {
      "name": "blobStorage_connection",
      "type": "blob",
      "path": "file-upload/AWSIPRanges.csv",
      "connection": "blobStorage_connection",
      "direction": "out"
    }
  ],
  "disabled": false
}
```
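For context, a minimal sketch of what the full run.ps1 could look like, assuming a timer trigger and the public AWS ip-ranges.json feed as the source (the column selection and trigger name here are my assumptions, not the original code):

```powershell
# Hypothetical timer-triggered run.ps1: fetch AWS IP ranges,
# flatten them to CSV, and write the result to the output blob binding.
param($Timer)

# Public AWS IP range feed
$ranges = Invoke-RestMethod -Uri 'https://ip-ranges.amazonaws.com/ip-ranges.json'

# Convert the prefixes to CSV rows (header row first)
$payload = $ranges.prefixes |
    Select-Object ip_prefix, region, service |
    ConvertTo-Csv -NoTypeInformation

# Publish to Blob via the output binding named in function.json
Push-OutputBinding -Name blobStorage_connection -Value ($payload -join "`n")
```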

The use case for this is to create a Sentinel Watchlist for quicker lookups/exclusions for known ranges that match particular security use cases. But given how large these files were, we couldn't use the HTTP API Collector, and for-each loops would've taken hours.

This method lets me grab a file from anywhere, convert it to CSV, store it in blob, create a SAS URL, then use the preview API endpoint for large file upload to push the file in.
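The SAS URL step can be done with the Az.Storage module; a sketch, assuming the account name and container/blob names from the binding above (the account name and key variable are placeholders):

```powershell
# Hypothetical sketch: generate a short-lived, read-only SAS URL for the
# uploaded blob using Az.Storage (account name and key are placeholders).
Import-Module Az.Storage

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
                            -StorageAccountKey $env:STORAGE_KEY

# -FullUri returns the complete blob URL with the SAS token appended
$sasUri = New-AzStorageBlobSASToken -Context $ctx `
    -Container 'file-upload' -Blob 'AWSIPRanges.csv' `
    -Permission r -ExpiryTime (Get-Date).AddHours(1) -FullUri
```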

As the API endpoint is in preview, I couldn't find documentation on how to use it (but given it's in the UI, it obviously exists), so I had to use Burp Suite to expose the URI and payload format.

It's pretty much the same as the normal watchlist API, with a change to the API version and a sasuri property in the payload.
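A heavily hedged sketch of what that call might look like, modelled on the GA Watchlists API. The preview api-version, the exact sasuri property name, and the other property values are assumptions here, since the preview endpoint was undocumented:

```powershell
# Hedged sketch only: the shape mirrors the GA Sentinel Watchlists API.
# The api-version placeholder and property names are assumptions.
$uri = "https://management.azure.com/subscriptions/$subId" +
       "/resourceGroups/$rg/providers/Microsoft.OperationalInsights" +
       "/workspaces/$workspace/providers/Microsoft.SecurityInsights" +
       "/watchlists/AWSIPRanges?api-version=<preview-api-version>"

$body = @{
    properties = @{
        displayName    = 'AWS IP Ranges'
        provider       = 'Custom'
        itemsSearchKey = 'ip_prefix'
        sasUri         = $sasUri   # SAS URL from the previous step
    }
} | ConvertTo-Json -Depth 4

Invoke-RestMethod -Method Put -Uri $uri -Body $body `
    -ContentType 'application/json' `
    -Headers @{ Authorization = "Bearer $token" }
```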

/r/AZURE Thread