Connecting to Azure Data Lake Storage Gen2 from PowerShell using REST API – a step-by-step guide

I prepared an article showing how to connect to ADLS Gen2 using an OAuth bearer token and upload a file. It is definitely easier (no canonical headers and encoding), although it requires an application account. You can read more here: http://sql.pawlikowski.pro/2019/07/02/uploading-file-to-azure-data-lake-storage-gen2-from-powershell-using-oauth-2-0-bearer-token-and-acls/

 

 

Introduction


Azure Data Lake Storage Generation 2 was introduced in the middle of 2018. With new features like hierarchical namespaces and Azure Blob Storage integration, this was something better, faster, cheaper (blah, blah, blah!)  compared to its first version – Gen1.

Since then, there has been enough time to prepare proper libraries that would let us connect to our data lake.

Is that right? Well, not really…

ADLS Gen2 has been globally available since 7 February 2019. Thirty-two days later, there is still no support for the Blob API, which means no support for the az storage CLI or the Blob REST API. So you are going to hit this error:

“Blob API is not yet supported for hierarchical namespace accounts”

Are you joking?

And good luck searching for a proper module in the PS Gallery. Of course, this will change in the future.

As a matter of fact, I hope that this article will help someone write it 🙂 (yeah, I’m too lazy or too busy or too stupid to do it myself 😛 )

So for now, there is only one way to connect to Azure Data Lake Storage Gen2: native REST API calls. And it’s a hell of a job to understand the specification and make it work in code. But it’s still not rocket science 😛

And by the way: I’m just a pure MSSQL Server boy, I need no sympathy 😛 So let me put it this way: web development, APIs, REST and all of that crap are not in my field of interest. But sometimes you have to be your own hero to save the world… I needed this functionality in a project, so I read all the documentation and a few blog posts. However, no source presented how to do it for ADLS Gen2 😮

So now I’ll try to fill this gap and explain it as far as I understood it, though not from the perspective of a professional front-end/back-end developer, which I definitely am not!

 

ADLS Gen2 REST calls in action – sniffing with Azure Storage Explorer


I wrote an entire article about How to sniff ADLS Gen2 storage REST API calls to Azure using Azure Storage Explorer.

Go there, read it, and try it for yourself. If you need to implement it in your code, just look at how they are doing it.

 

Understanding Data Lake REST calls


If you want to talk to the ADLS Gen2 endpoint in its language, you have to learn “two dialects” 😛

  • REST specification dedicated to Azure Data Lake Storage Gen2 (URLs with the proper endpoints and parameters), documented HERE.
  • GENERAL specification for authenticating connections to any Azure service endpoint using a “Shared Key”, documented HERE.

Knowing the first one gives you the ability to invoke proper commands with proper parameters on ADLS, just like mkdir or ls in a console.

Knowing the second – how to sign those commands. Bear in mind that no bearer token (see the details in the green box at the beginning of this page…), no passwords and no keys are sent over the wire! Basically, you prepare your request as a bunch of strictly specified headers with their proper parameters (concatenated and separated with a newline sign) and then you “simply” SIGN this one huge string with your shared access key taken from the ADLS portal (or a script).

Then you just send your request over a plain HTTPS connection with one additional header called “Authorization”, which carries this signature, along with all your other headers!

You may ask: whyyyyy, why do we have to implement such logic?

The answer is easy. Just because they say so 😀

But to be honest, this makes total sense.

The ADLS endpoint receives your request. Then it also SIGNS it using your secret shared key (which, once again, wasn’t transferred anywhere in the communication; it’s a secret!). After signing, it compares the result to your “Authorization” header. If they are the same, it means two things:

  • you are the master of disaster, the owner of the secret key that nobody else could use to sign the message for ADLS
  • there was no man-in-the-middle injection of any malicious content, and that’s the main reason why this is so tortuous…

How to sign it – that is a different kettle of fish… You have to compute a keyed-Hash Message Authentication Code (HMAC) with the SHA256 algorithm and convert the result to a Base64 value, using a key built from the bytes of your ADLS shared access key (which, by the way, is itself a Base64 string :D). But no worries! It’s not as complicated as it sounds, and we have functions in PowerShell that can do it for us.
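To make the moving parts concrete, here is the same signing recipe sketched in Python for illustration (the key and account name below are made-up placeholders, not real credentials):

```python
import base64
import hashlib
import hmac

def sign(string_to_sign: str, access_key_b64: str, account: str) -> str:
    """HMAC-SHA256 the string-to-sign with the Base64-decoded shared key,
    Base64-encode the digest, and wrap it in the Authorization header value."""
    key = base64.b64decode(access_key_b64)  # the portal key is itself Base64
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode("ascii")
    return f"SharedKey {account}:{signature}"

# Made-up key, for demonstration only
fake_key = base64.b64encode(b"definitely-not-a-real-key").decode("ascii")
auth_header = sign("GET\n\n\nx-ms-date:Sun, 10 Mar 2019 11:50:10 GMT\n/acc/fs",
                   fake_key, "acc")
```

The signature changes if even one byte of the string-to-sign differs, which is exactly how the endpoint detects tampering.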

 

Authorization Header specification


Ok, let’s look closer at the Azure auth REST specification here: https://docs.microsoft.com/en-us/rest/api/storageservices/authorize-with-shared-key

We are implementing an ADLS Gen2 file system request, which belongs under the “Blob, Queue, and File Services” header. If you are going to do this for the Table service, or if you would like to implement the “light” version, please look for the proper paragraph in the docs.

Preparing the Signature String


Quick screen from docs:

We can divide it into 4 important sections.

  1. VERB – the kind of HTTP operation that we are going to invoke in REST (e.g. GET will be used to list files, PUT for creating directories or uploading content to files, PATCH for changing permissions)
  2. Fixed Position Header Values – these must occur in every call to REST and are obligatory to send, BUT you can leave them as empty strings (just “” with an obligatory end-of-line sign)
  3. CanonicalizedHeaders – what should go here is also specified in the documentation. Basically, you have to put all available “x-ms-” headers here. And this is important: in lexicographical order by header name! For example: "x-ms-date:$date`nx-ms-version:2018-11-09`n"
  4. CanonicalizedResource – also specified in the docs. Here come all the query parameters that you pass to the ADLS endpoint according to its own specification, with lowercase names sorted lexicographically. So if you want to list your files recursively in the root directory, invoking the endpoint at https://[StorageAccountName].dfs.core.windows.net/[FilesystemName]?recursive=true&resource=filesystem, you need to pass them in the signature string like this: "/[StorageAccountName]/[FilesystemName]`nrecursive:true`nresource:filesystem"

Bear in mind that every value HAS TO end with a newline sign, except for the last one, which appears in the canonicalized resource section. Parameter names also need to be provided in lowercase!

In the examples above, I’m using `n, which in PowerShell is just the equivalent of \n
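As a sanity check, the two canonicalized sections can be built mechanically. A small Python sketch of the rules above (the account and filesystem names are placeholders):

```python
def canonicalized_headers(headers: dict) -> str:
    """All x-ms-* headers, names lowercased, sorted lexicographically,
    each entry terminated with a newline."""
    ms = {k.lower(): v for k, v in headers.items() if k.lower().startswith("x-ms-")}
    return "".join(f"{name}:{ms[name]}\n" for name in sorted(ms))

def canonicalized_resource(account: str, filesystem: str, params: dict) -> str:
    """/account/filesystem plus name:value query pairs, names lowercased
    and sorted, newline-separated, with no newline after the last one."""
    lowered = {k.lower(): v for k, v in params.items()}
    lines = [f"/{account}/{filesystem}"]
    lines += [f"{name}:{lowered[name]}" for name in sorted(lowered)]
    return "\n".join(lines)
```

Calling `canonicalized_resource("myacc", "myfs", {"recursive": "true", "resource": "filesystem"})` yields exactly the shape of the listing example above.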

 

So, if I want to list files in the root directory, first I declare the parameters:

 

Then I need to prepare a date, according to the specification. And watch out! Read the “Specifying the Date Header” section in the docs really carefully! It’s really important to understand how Date headers work! The most important part is:
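The format in question is the RFC 1123 timestamp that PowerShell’s "R" format string produces. For illustration, the same date in Python:

```python
from email.utils import formatdate

# RFC 1123 date in GMT, e.g. "Sun, 10 Mar 2019 11:50:10 GMT";
# PowerShell's [System.DateTime]::UtcNow.ToString("R") emits the same shape.
def http_date(epoch_seconds: float) -> str:
    return formatdate(epoch_seconds, usegmt=True)
```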

Ok, let’s create it in PowerShell; I’m also preparing the newline and my method as variables, ready to be used later:

 

So what about the fixed headers? For listing files in ADLS, and with x-ms-date defined, I can leave them all empty. It would be different if you wanted to implement PUT and upload content to Azure; then you should use at least Content-Length and Content-Type. For now it looks like this:

 

  • Implementing file upload is much more complex than just listing files. According to the docs, you first have to CREATE a file, then UPLOAD content to it (and it looks like you can do this in parallel 😀 Happy threading/forking!)
  • Don’t forget about the required headers. An upload is going to need more of them than listing.
  • You can also change permissions here, see setAccessControl
  • There is also a huuuge topic about conditional headers. Don’t miss it!
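As a rough sketch of that create-then-upload flow, here are the request URIs involved for a small single-chunk upload, following the Path – Create and Path – Update operations in the ADLS Gen2 REST spec (the names below are placeholders, and real calls would also need the signed headers described above):

```python
def upload_requests(account: str, filesystem: str, path: str, length: int):
    """The three calls behind one small upload: create the file, append
    the bytes at position 0, then flush at the final length to commit."""
    base = f"https://{account}.dfs.core.windows.net/{filesystem}/{path}"
    return [
        ("PUT", f"{base}?resource=file"),
        ("PATCH", f"{base}?action=append&position=0"),
        ("PATCH", f"{base}?action=flush&position={length}"),
    ]
```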

Encrypting Signature String


Let’s have fun 🙂 Part of the credit goes to another blog with an example of a connection to storage tables.

First, we have to convert our shared access key from Base64 into an array of bytes:

 

Then we create an HMAC SHA256 object and fill it with the bytes of our access key.

 

Now we can actually sign our string using the ComputeHash function of our HMAC SHA256 object. But before hashing, we have to convert our string into a byte array.

As the specification requires, we then encode the result once again into Base64.

 

And use it in the auth header, with the storage account name before it.

Here we also have to add the ordinary headers with the date of sending the request and the API version.

Remember to use the proper version (a date), which you can find in the ADLS Gen2 REST API spec:

 

That’s it!

 

Now you can invoke the request to your endpoint:

 

Result:

It works!

 

Now a small debug in Visual Studio Code, adding a watch for the $result variable to see a little more than in the console output:

From here you can see that recursive=true did the trick, and now I have a list of 4 objects: three folders in the root directory and one file in FOLDER2, sized 49 MB. Just remember that the API has a limit of 5000 objects per listing.
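Because of that 5000-object cap, bigger listings come back in pages, each response carrying an x-ms-continuation header that you feed into the next request. A hypothetical draining loop, with the actual signed request abstracted behind a fetch callback:

```python
def list_all(fetch):
    """Drain a paged listing. `fetch(token)` performs one signed LIST request
    and returns (items, next_token); next_token is None once the response
    no longer carries an x-ms-continuation header."""
    items, token = [], None
    while True:
        page, token = fetch(token)
        items.extend(page)
        if not token:
            return items

# Stub standing in for the real signed request, for demonstration:
pages = {None: (["FOLDER1", "FOLDER2"], "tok1"), "tok1": (["FOLDER2/file.csv"], None)}
everything = list_all(lambda token: pages[token])
```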

Easy peasy, right? 😉 Head down to the example scripts and try it for yourself!

Example Scripts


 

 

List files


This example should list the content of your root folder in Azure Data Lake Storage Gen2, along with all subdirectories and all existing files recursively.

 

 

 

List files in directory, limit results, return recursive


This example should list the content of your requested folder in Azure Data Lake Storage Gen2. You can limit the number of returned results (up to 5000) and decide if files in folders should be returned recursively.

 

 

 

Create directory (or path of directories)


This example should create the folders declared in the PathToCreate variable.

Bear in mind that creating a path here means creating everything declared in the path. So there is no need to create FOLDER1 first and then FOLDER1/SUBFOLDER1. You can make them all at once, just providing the full path "/FOLDER1/SUBFOLDER1/"
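The script also normalizes whatever you pass in PathToCreate to a single leading slash before building the URI; roughly, in Python for illustration (account and filesystem names are placeholders):

```python
def create_directory_uri(account: str, filesystem: str, path_to_create: str) -> str:
    """Normalize '/FOLDER1/SUBFOLDER1/', 'FOLDER1/SUBFOLDER1', etc. to one
    canonical form and build the create-directory request URI."""
    path = "/" + path_to_create.strip("/")  # drop stray leading/trailing slashes
    return f"https://{account}.dfs.core.windows.net/{filesystem}{path}?resource=directory"
```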

 

 

 

List filesystems


This example lists all available filesystems in your storage account. Refer to $result.filesystems  to get the array of filesystem objects.

 

 

Create filesystem


This example should create the filesystem provided in the FilesystemName parameter. Remember that filesystem names cannot contain uppercase letters or special characters!

 

 

Get permissions from filesystem or path


It will return a raw value of the x-ms-acl property, like: user::rwx,group::r-x,other::---
In the Path parameter:
If you want to get permissions for the filesystem, just use “/”.
If you want to get permissions for a given path, use it without the leading “/”, e.g.: “Folder1/Folder2/File.csv”
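Since the raw x-ms-acl value is just a comma-separated list of entries whose last colon-separated field is the permission triad, pulling it apart is easy. A hypothetical helper, for illustration:

```python
def parse_acl(acl: str) -> dict:
    """Map 'user::rwx,group::r-x,other::---' to {'user': 'rwx', ...}.
    Keys keep any 'default:' prefix or named qualifier (e.g. 'default:other')."""
    result = {}
    for entry in acl.split(","):
        head, _, perms = entry.rpartition(":")  # perms is everything after the last colon
        result[head.rstrip(":")] = perms
    return result
```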
 

Set permissions on filesystem or path


In the Path parameter:
If you want to set permissions on the filesystem, just use “/”.
If you want to set permissions on a given path, use it without the leading “/”, e.g.: “Folder1/Folder2/File.csv”
Example of giving “other” the execute (--x) permission, including as a default ACL (set it as the $PermissionString variable):
user::rwx,group::r-x,other::--x,default:other::--x
 

Good luck!

46 thoughts on “Connecting to Azure Data Lake Storage Gen2 from PowerShell using REST API – a step-by-step guide”

  1. Hi,
    How to create multiple folders at a time in ADLS Gen2 using Power Shell Script can You Please Help me this one.

    1. Hi Venu!

      You can only create one path of folders at a time.
      It means, that if you are creating a path on an empty filesystem:
      folder1/folder2/folder3/folder4
      it will create for you 4 different folders, one inside another (in the background they are not exactly folders inside folders but let’s not discuss internals, it will make more confusion)

      But if you would like to create them like:
      folder1/folder2
      folder3
      folder4/folder5

      REST API will not allow you to do this in one call.

      I suggest making a function or script: paste in the content of the example above (look for “Create directory (or path of directories)”) and call it from another part of the script.

      Just a simple one.
      So let’s assume that you will paste the example into file: Create-ADLSFolder.ps1
      Then make another script: create_folders.ps1

      then inside create_folders.ps1 try this:

      $StorageAccountName = 'your_stor_acc_name'
      $FilesystemName = 'your_filesystem'
      $AccessKey = 'your_access_key'

      & Create-ADLSFolder.ps1 -PathToCreate "folder1/folder2" -StorageAccountName $StorageAccountName -FilesystemName $FilesystemName -AccessKey $AccessKey
      & Create-ADLSFolder.ps1 -PathToCreate "folder3" -StorageAccountName $StorageAccountName -FilesystemName $FilesystemName -AccessKey $AccessKey
      & Create-ADLSFolder.ps1 -PathToCreate "folder4/folder5" -StorageAccountName $StorageAccountName -FilesystemName $FilesystemName -AccessKey $AccessKey

      save it, open powershell and try to run your script:
      ./create_folders.ps1

      this should create you 3 different paths, having 5 different folders 🙂

      1. Hi Michal,
        I tried but got some error.

        Please see below error and give some pointer on that.

        C:\Users\Create-ADLSFolder.ps1 : A parameter cannot be found that matches parameter name ‘PathToCreate’.
        At C:\Users\create_folders.ps1:5 char:41
        + & C:\Users\Create-ADLSFolder.ps1 -PathToCreate “folder1/folder …
        + ~~~~~~~~~~~~~
        + CategoryInfo : InvalidArgument: (:) [Create-ADLSFolder.ps1], ParameterBindingException
        + FullyQualifiedErrorId : NamedParameterNotFound,Create-ADLSFolder.ps1

        C:\Users\Create-ADLSFolder.ps1 : A parameter cannot be found that matches parameter name ‘PathToCreate’.
        At C:\Users\create_folders.ps1:6 char:41
        + & C:\Users\Create-ADLSFolder.ps1 -PathToCreate “folder3” -Stor …
        + ~~~~~~~~~~~~~
        + CategoryInfo : InvalidArgument: (:) [Create-ADLSFolder.ps1], ParameterBindingException
        + FullyQualifiedErrorId : NamedParameterNotFound,Create-ADLSFolder.ps1

        C:\Users\Create-ADLSFolder.ps1 : A parameter cannot be found that matches parameter name ‘PathToCreate’.
        At C:\Users\create_folders.ps1:7 char:41
        + & C:\Users\Create-ADLSFolder.ps1 -PathToCreate “folder4/folder …
        + ~~~~~~~~~~~~~
        + CategoryInfo : InvalidArgument: (:) [Create-ADLSFolder.ps1], ParameterBindingException
        + FullyQualifiedErrorId : NamedParameterNotFound,Create-ADLSFolder.ps1

        Thanks in Advance

        1. Hi Shivam.
          Did you paste the entire code from the example: “Create directory (or path of directories)” ?
          Your error: “A parameter cannot be found that matches parameter name ‘PathToCreate’. ” means that there is a problem with it…
          Look for param section:

          [CmdletBinding()]
          Param(
          [Parameter(Mandatory=$true,Position=1)] [string] $StorageAccountName,
          [Parameter(Mandatory=$True,Position=2)] [string] $FilesystemName,
          [Parameter(Mandatory=$True,Position=3)] [string] $AccessKey,
          [Parameter(Mandatory=$True,Position=4)] [string] $PathToCreate
          )

        2. Previous issue which is Parameter “PathToCreate” not find is resolved but after Executing
          Poweshell script ./create_folders.ps1 , Getting Response Status code 400 i.e which is not indicate Success.

          Please find below error and give some suggestion on it.

          Response status code does not indicate success: 400 (The request URI is invalid.).
          At C:\Users\Create-ADLSFolder.ps1:69 char:5
          + Throw $ErrorMessage + ” ” + $StatusDescription
          + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
          + CategoryInfo : OperationStopped: (Response status cod\u2026 URI is invalid.). :String) [], RuntimeException
          + FullyQualifiedErrorId : Response status code does not indicate success: 400 (The request URI is invalid.).

          1. Shivam,
            I’m sorry, but I’m not sure what is wrong without pasting your work here.
            BUT let’s try with the other scenario, so when you have one script, function and one script call.

            This works perfectly for me:

            function Create-ADLSFolder
            {
            [CmdletBinding()]
            Param(
            [Parameter(Mandatory=$true,Position=1)] [string] $StorageAccountName,
            [Parameter(Mandatory=$True,Position=2)] [string] $FilesystemName,
            [Parameter(Mandatory=$True,Position=3)] [string] $AccessKey,
            [Parameter(Mandatory=$True,Position=4)] [string] $PathToCreate
            )

            # Rest documentation:
            # https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

            $PathToCreate = "/" + $PathToCreate.trim("/") # remove all "//path" or "path/"

            $date = [System.DateTime]::UtcNow.ToString("R") # ex: Sun, 10 Mar 2019 11:50:10 GMT

            $n = "`n"
            $method = "PUT"

            $stringToSign = "$method$n" #VERB
            $stringToSign += "$n" # Content-Encoding + "\n" +
            $stringToSign += "$n" # Content-Language + "\n" +
            $stringToSign += "$n" # Content-Length + "\n" +
            $stringToSign += "$n" # Content-MD5 + "\n" +
            $stringToSign += "$n" # Content-Type + "\n" +
            $stringToSign += "$n" # Date + "\n" +
            $stringToSign += "$n" # If-Modified-Since + "\n" +
            $stringToSign += "$n" # If-Match + "\n" +
            $stringToSign += "*" + "$n" # If-None-Match + "\n" +
            $stringToSign += "$n" # If-Unmodified-Since + "\n" +
            $stringToSign += "$n" # Range + "\n" +
            $stringToSign +=
            <# SECTION: CanonicalizedHeaders + "\n" #>
            "x-ms-date:$date" + $n +
            "x-ms-version:2018-11-09" + $n #
            <# SECTION: CanonicalizedHeaders + "\n" #>

            $stringToSign +=
            <# SECTION: CanonicalizedResource + "\n" #>
            "/$StorageAccountName/$FilesystemName" + $PathToCreate + $n +
            "resource:directory"#
            <# SECTION: CanonicalizedResource + "\n" #>

            $sharedKey = [System.Convert]::FromBase64String($AccessKey)
            $hasher = New-Object System.Security.Cryptography.HMACSHA256
            $hasher.Key = $sharedKey

            $signedSignature = [System.Convert]::ToBase64String($hasher.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($stringToSign)))

            $authHeader = "SharedKey ${StorageAccountName}:$signedSignature"

            $headers = @{"x-ms-date"=$date}
            $headers.Add("x-ms-version","2018-11-09")
            $headers.Add("Authorization",$authHeader)
            $headers.Add("If-None-Match","*") # To fail if the destination already exists, use a conditional request with If-None-Match: "*"

            $URI = "https://$StorageAccountName.dfs.core.windows.net/" + $FilesystemName + $PathToCreate + "?resource=directory"

            try {
            Invoke-RestMethod -method $method -Uri $URI -Headers $headers # returns empty response
            $true
            }
            catch {
            $ErrorMessage = $_.Exception.Message
            $StatusDescription = $_.Exception.Response.StatusDescription
            $false

            Throw $ErrorMessage + " " + $StatusDescription
            }
            }

            $StorageAccountName = 'acc_name'
            $FilesystemName = 'fs_name'
            $AccessKey = 'acc_key'

            Create-ADLSFolder -PathToCreate "folder1/folder2" -StorageAccountName $StorageAccountName -FilesystemName $FilesystemName -AccessKey $AccessKey
            Create-ADLSFolder -PathToCreate "folder3" -StorageAccountName $StorageAccountName -FilesystemName $FilesystemName -AccessKey $AccessKey
            Create-ADLSFolder -PathToCreate "folder4/folder5" -StorageAccountName $StorageAccountName -FilesystemName $FilesystemName -AccessKey $AccessKey


      2. Hi Michał Pawlikowski,
        Thanks, It is working Fine.
        I want to create multiple file systems at a time in ADLS Gen2 is it possible ?


  2. Michal Is it possible to create multiple filesystem rather than multiple folders in existing filesystem.

    1. Hi Shivam,
      yes. Just create another script, like the one for folders, but this time use a function with the code from a different example from my post above: “Create filesystem”.
      Run this function as many times as you want to create your filesystems, just passing different values in $FilesystemName.

      1. Hi Michal,
        Can we create multiple filesystem with single time running the code i.e “create filesystem”
        Like in case of multiple folders.

          1. is it possible (i.e creating multiple file systems ) in a single script, that is what Shivam expecting.

  3. Thanks Michal.

    You save my life!.
    one question.
    I tried to use “list file” cmdlets and it works well.
    But, When Azure Data Lake Store Gen 2 has over 5000 files, the cmdlets retrieve just 5000 files.
    I tried to use “x-ms-continuation” and got a “Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the
    signature.” error.
    have you tried to use “x-ms-continuation”?

    I really appreciate your post.

    1. Hi Jay!
      Oh, this one. Look into my post 🙂

      From here you can see, that recursive=true did the trick, and now I have a list of 4 objects. Three folders in root directory and one file in FOLDER2, sized 49MB. Just remember that the API has a limit of 5000 objects.

      Unfortunately I did not try to use continuation 🙁
      But it looks like it works kinda differently. I do not have time to make POC right now, but let me guide you to make it possible. Maybe you will nail it 🙂

      1. See the “continuation” parameter in API spec: https://docs.microsoft.com/pl-pl/rest/api/storageservices/datalakestoragegen2/path/list

      2. If you run LIST for the first time and the response returns x-ms-continuation in a header, it means you reached the limit. You must save this x-ms-continuation value, catch it: it will be a continuation token. TIP: look into the ResponseHeadersVariable parameter in the Invoke-RestMethod docs 🙂 it returns the response headers if you use it; this is how you will catch the token.

      3. Then, if you catch an x-ms-continuation value, run the LIST script once again, but this time pass this value into the ‘continuation’ param.

      It is a URI parameter, so you need to add it twice:
      – to requested URL:
      $URI = "https://$StorageAccountName.dfs.core.windows.net/" + $FilesystemName + "?recursive=true&resource=filesystem&continuation=$YOUR_x-ms-continuation-TOKEN"

      and unfortunately to CanonicalizedResource in string to sign (it must be placed in alphabetical order, so before recursive):


      $stringToSign +=
      <# SECTION: CanonicalizedResource + "\n" #>
      "/$StorageAccountName/$FilesystemName" + $n +
      "continuation:" + $YOUR_x-ms-continuation-TOKEN + $n +
      "recursive:true" + $n +
      "resource:filesystem"#
      <# SECTION: CanonicalizedResource + "\n" #>

      That’s all from a theoretical perspective. Give it a try.
      If you fail, just give me a sign here. We will move this discussion to email and try to work it out somehow with my limited time 😉

    2. @Jay, did you get continuation to work? I’m getting back first 5000 results, and when I add continuation, the request fails (error 403: Forbidden). I followed the documentation and Michal’s code, but cannot get it to work. I’d appreciate any help.

      1. Finally figured this out: in most cases (if not all) the continuation token comes back with “==” at the end, and it messes the URI up. For the URI, I escape it; for the canonicalized resource I leave the string as is.

        @Michal, you’re a truly great resource. Thank you!

        1. @gunta, that’s a good tip! Definitely, everything that appears in the URI must be in accordance with the specification.
          Applying [uri]::EscapeDataString to the token value should do the trick…
          Thanks for the help! 🙂
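The tip above can be sketched as follows: percent-encode the token only where it lands in the URI, never in the string that gets signed (Python for illustration):

```python
from urllib.parse import quote

def continuation_values(token: str):
    """Return (value for the request URI, value for CanonicalizedResource).
    The trailing '==' of a typical token must be escaped in the URI only."""
    return quote(token, safe=""), token

uri_value, signed_value = continuation_values("abc==")
```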

  4. Hi Michal,

    thanks for this, it’s working!

    I just wanted to ask how the specific string of the PermissionString should look like. I’ve tried to put the one which is right above the Set permissions on filesystem or path code but it fails and the error is following: “The remote server returned an error: (400) Bad Request. The request URI is invalid. (The remote server returned an error: (400) Bad Request. The request URI is invalid.)”
    I tried to look it up but unsuccessfully. Do you have any idea what might go wrong?

    Thanks in advance
    Viktor

    1. Hhm, maybe you have some error in the string? Unfortunately, it requires studying documentation and ACL permissions. But I have a better suggestion for you.
      Try to check and understand ACL settings just by sniffing calls from Azure Storage Explorer 🙂

      Please go to my other article: http://sql.pawlikowski.pro/2019/03/09/how-to-sniff-adls-gen2-storage-rest-api-calls-to-azure-using-azure-storage-explorer/

      Just run ASE, start sniffing and try to change permissions as you like them to be.
      Then copy permission string directly from it and use it in your code. This is the best approach in my opinion.

      Here is an example with setting all permissions to other:
      http://sql.pawlikowski.pro/wp-content/uploads/2019/05/CaptureGen2.png

  5. Hi Michal,

    I try to check if a filesystem already exists, because if the filesystem already exists, the create operation fails. Unfortunally, my code is not working (AuthenticationFailed, Server failed to authenticate the request.)

    Can you look at the code ? Basically it’s the “List files” example, but with some different URI parameters and no filesystem.

    [CmdletBinding()]
    Param(
    [Parameter(Mandatory=$true,Position=1)] [string] $StorageAccountName,
    [Parameter(Mandatory=$True,Position=3)] [string] $AccessKey
    )

    $date = [System.DateTime]::UtcNow.ToString("R") # ex: Sun, 10 Mar 2019 11:50:10 GMT

    $n = "`n"
    $method = "GET"

    $stringToSign = "$method$n" #VERB
    $stringToSign += "$n" # Content-Encoding + "\n" +
    $stringToSign += "$n" # Content-Language + "\n" +
    $stringToSign += "$n" # Content-Length + "\n" +
    $stringToSign += "$n" # Content-MD5 + "\n" +
    $stringToSign += "$n" # Content-Type + "\n" +
    $stringToSign += "$n" # Date + "\n" +
    $stringToSign += "$n" # If-Modified-Since + "\n" +
    $stringToSign += "$n" # If-Match + "\n" +
    $stringToSign += "$n" # If-None-Match + "\n" +
    $stringToSign += "$n" # If-Unmodified-Since + "\n" +
    $stringToSign += "$n" # Range + "\n" +
    $stringToSign +=

    "x-ms-date:$date" + $n +
    "x-ms-version:2018-11-09" + $n #

    $stringToSign +=

    "/$StorageAccountName" + $n +
    "resource:account"#

    $stringToSign

    $sharedKey = [System.Convert]::FromBase64String($AccessKey)
    $hasher = New-Object System.Security.Cryptography.HMACSHA256
    $hasher.Key = $sharedKey

    $signedSignature = [System.Convert]::ToBase64String($hasher.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($stringToSign)))

    $authHeader = "SharedKey ${StorageAccountName}:$signedSignature"

    $authHeader

    $headers = @{"x-ms-date"=$date}
    $headers.Add("x-ms-version","2018-11-09")
    $headers.Add("Authorization",$authHeader)

    $URI = "https://$StorageAccountName.dfs.core.windows.net/" + "?resource=account"

    $result = Invoke-RestMethod -method $method -Uri $URI -Headers $headers # returns empty response

    $result

    1. Hi Ton!
      Unfortunately the reason is really small 🙂
      In:

      $stringToSign +=

      "/$StorageAccountName" + $n +
      "resource:account"#

      storage account name needs to end with another / sign.
      So use "/$StorageAccountName/" + $n + instead of "/$StorageAccountName" + $n +

      But thank you for this issue! It’s also an important thing when it comes to dealing with requests scoped at the account level, not the filesystem level. I added an example of listing filesystems to the article.

  6. Hi thanks for the post.
    I am however, stuck on access key part; I am giving my SAS token but it gives error

    Exception setting “Key”: “Cannot convert value “104 198 28 238 43 122 244 84 104 225 219 195 31 134 189 59 18 175 224 133 195 110 111 226 8 71 142 3 189 175 103 198 240 197 53 110 50 182 49 99 131 106 164 12 81 195 36 79 118 28 42 92 202 93 109 0 26 210 237
    164 47 236 82 142” to type “System.Byte[]”.

    Can you please tell what exact $Access Key that i need to provide? as i am only giving it SAS token which is not base64 encoded.

    Thanks in advance

  7. Hi Michal, do you know how to use maxResults in the token?

    I tried the code below:

    "/$StorageAccountName/$FilesystemName" + $n +
    "directory:" + $file_dir + $n +
    "maxResults:5" + $n +
    "recursive:true" + $n +
    "resource:filesystem"
    #

    and here is the url:

    $URI = "https://$StorageAccountName.dfs.core.windows.net/" + $FilesystemName + "?directory=" + $file_dir + "&maxResults=5&recursive=true&resource=filesystem"

    the url output as below:
    https://yygen2.dfs.core.windows.net/dd1?directory=f1&maxResults=5&recursive=true&resource=filesystem

    But when execute it, the error occurs:
    “AuthenticationFailed”,”message”:”Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the
    signature.

    If I removed the maxResults from the signature and url, it can work fine.
    Could you please kindly let me know how to make it work? Thanks very much.

    1. Hi Ivan,
      well, that’s a good catch!
      For me it looks like a bug, either in the documentation or in the implementation (of the canonicalized resource parser in the REST algorithm). Not sure, but I’ve reported it anyway: https://github.com/MicrosoftDocs/feedback/issues/1532

      So please use maxresults instead of maxResults. Yes, that’s the reason for the error.
      Also, there is an update in my post: one more example with a parameterized directory, max results and recursive.

      Thank you!

  8. Hi Michal,

    Really thank you very much. Now it works with the lowercase maxresults .

    Thank you again.

    1. I have an update on this.

      In the section “Constructing the Canonicalized Resource String” of the Authorize with Shared Key documentation, there is a point: “4. Convert all parameter names to lowercase.”
      So if you get this error, make sure that you are lowercasing all parameter names in your canonicalized resource string. It’s a requirement pointed out in the specification, so it’s not a bug.

      Anyway thank you for this interesting issue 🙂 I will update the article to be sure that lowercasing is there.

  9. Hi MIchal,
    Is there a way I can get list of all the files/folders to which a user has access.
    I want to take username as a argument and list all the files for that user to which he has access.

    Thanks

    1. Hi Irfan.
      Unfortunately, the REST API does not have such a capability by default. You always get the list of files and directories in the context of the connection.
      In my example the context is handled by the access key, so you view files in the context of something like an admin account.
      The path object definition contains a “permissions” property. It stores the permissions for different “layers” of security.

      You can parse it to read what permission is set on the object. Of course these permissions can be granted from different layers. I mean from “OTHER”, owner, group or explicitly granted for the given user. So it’s not impossible to implement, but it will not be easy 🙁

      The second option, if the connection can be made by the user (so he or she will provide a password or token from the context), you can use bearer authentication (OAuth 2.0) to connect to ADLS Gen2 REST API. I did not try it, but I know, that it is possible and Azure Storage Explorer and azcopy is using it when user does not have the permission on the service level but have on ACL level.

      Some examples using curl: https://social.msdn.microsoft.com/Forums/sqlserver/ja-JP/dc102604-bdb7-47be-8de4-dc47a42e31a4/azure-data-lake-gen2-rest-api?forum=AzureDataLake
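      As a starting point for parsing, here is a rough sketch (the “permissions” property name comes from the Path – List response; the parsing helper and its name are my own assumption) that reads the POSIX-style permission string:

```powershell
# Sketch: interpret a POSIX-style permission string like "rwxr-x---"
# as returned in the "permissions" property of a path object.
function Test-OtherCanRead {
    param([string]$Permissions)   # e.g. "rwxr-x---"
    # positions 0-2: owner, 3-5: owning group, 6-8: other
    return $Permissions[6] -eq 'r'
}

Test-OtherCanRead -Permissions "rwxr-x---"   # → False
Test-OtherCanRead -Permissions "rwxr-xr--"   # → True
```

      Mapping this back to a concrete username would still require resolving the owner, owning group and any explicit ACL entries, which is the hard part.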

      1. Thanks a lot for the response.
        While setting permissions for a file, it seems to overwrite the previous permissions.
        If I have given access to UserA and I run the script again to give access to UserB, then UserB will have access but not UserA; it looks like it overwrites the access properties. How can I avoid this situation?

        $URI = "https://$StorageAccountName.dfs.core.windows.net/" + $FilesystemName + "/" + $Path + "?action=setAccessControl"

        Thanks,
        Irfan

  10. Hi Michal,

    I was working on moving (uploading) files from local storage to an Azure ADLS Gen2 storage account. Thanks to your detailed blog, I am able to create the directories. But can you please help me with uploading files to those newly created folders on the Azure ADLS Gen2 storage account?
    I know uploading files to BLOB storage is relatively easy, but I’m not sure how to do the same in ADLS Gen2.

    Thanks & Regards,
    JD

    1. Hi JD,
      I do not have much time lately, but I plan to prepare examples of how to connect to ADLS Gen2 using an OAuth2 token for a service principal. Then I’ll prepare another example of file uploading. Probably I’ll do it this weekend.

      If you want to start doing this by yourself, the basic approach is: create a file (like you create a directory, but with ?resource=file instead of ?resource=directory), then use the Path – Update API with action=append at position 0 and the body filled with the content of your file. Then you need to flush the content, so once again Path – Update, but with action=flush and position equal to the total number of bytes you appended.

      I’m still not sure about the maximum request size, i.e. the maximum possible file size that the REST API will accept in the input. There is an error, “413 Request Entity Too Large”, which indicates that this could be an issue. And with Path – Update and the proper position (offset in bytes) you can upload files over many connections, in chunks… Anyway, I will try to test it, but I do not have much experience in REST file uploading, so I’m not sure how to handle this… For example, azcopy (and Azure Storage Explorer) always sends files to ADLS Gen2 in many small chunks. If you don’t believe it, just look into the logs in your .azcopy catalog after the upload 😀 (it’s in your c:/Users/[your_account]/)
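      The create/append/flush sequence above can be sketched like this. This is a hedged sketch, not tested code: it assumes a valid OAuth bearer token in $Token (shared key signing as in the article would work too, but makes the example much longer), and the account, filesystem and file names are placeholders.

```powershell
# Sketch: upload a local file to ADLS Gen2 via the REST API.
# Assumes $Token holds a valid OAuth bearer token; names are placeholders.
$StorageAccountName = "mystorageaccount"
$FilesystemName = "myfilesystem"
$Path = "folder1/myfile.txt"
$Headers = @{ Authorization = "Bearer $Token"; "x-ms-version" = "2018-11-09" }

$BaseUri = "https://$StorageAccountName.dfs.core.windows.net/$FilesystemName/$Path"
$Body = [System.IO.File]::ReadAllBytes("C:\temp\myfile.txt")

# 1. Create an empty file (?resource=file instead of ?resource=directory)
Invoke-RestMethod -Method Put -Uri "${BaseUri}?resource=file" -Headers $Headers

# 2. Append the file content at position 0
Invoke-RestMethod -Method Patch -Uri "${BaseUri}?action=append&position=0" -Headers $Headers -Body $Body

# 3. Flush: position must equal the total number of bytes appended
Invoke-RestMethod -Method Patch -Uri "${BaseUri}?action=flush&position=$($Body.Length)" -Headers $Headers
```

      For large files you would repeat step 2 with increasing position offsets (chunked upload) and flush once at the end with the total length.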

  11. Thanks for the tutorial

    But how about reading a file inside the Data Lake Store Gen2, so that I could use the file on my local computer for some processing?

    Any help would be highly appreciated.

  12. Hi, thanks for the gr8 write-up. I do have one question: I am trying to set permissions for all the child folders and files, and I added recursive=true in the above set-permission code, but I still can’t make it work. What am I doing wrong?

    1. Merin Kumar SP,
      unfortunately it will not work.
      The “recursive” parameter is dedicated only to the delete folder operation.
      The Path – Update API does not support any recursive update, as you can see here:
      https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

      This can be handled only manually, so you need to traverse all subfolders and files with your own algorithm and set permissions at every level.

      On the other hand, according to this article:

      https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-use-hdfs-data-lake-storage

      you can use the hdfs cli on ADLS Gen2, and the hdfs cli supports the -R option in the setfacl command.
      Details: https://www.cloudera.com/documentation/enterprise/5-9-x/topics/cdh_sg_hdfs_ext_acls.html#concept_urf_fhk_q4__section_krq_pwv_ls
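      The manual traversal could be sketched like this. Note the helpers are hypothetical: Get-AdlsPathList would wrap the Path – List call (with recursive=true) and Set-AdlsAccessControl would wrap the ?action=setAccessControl call from the article.

```powershell
# Sketch: apply an ACL to a folder and all of its children by listing
# recursively and setting the ACL on every returned path.
# Get-AdlsPathList and Set-AdlsAccessControl are HYPOTHETICAL wrappers around
# "Path - List" (recursive=true) and "Path - Update" (?action=setAccessControl).
$Acl = "user::rwx,group::r-x,other::---"

$AllPaths = Get-AdlsPathList -Filesystem $FilesystemName -Directory "folder1" -Recursive
foreach ($p in $AllPaths) {
    Set-AdlsAccessControl -Filesystem $FilesystemName -Path $p.name -Acl $Acl
}
```

      The ACL string format (entries like user::rwx separated by commas) follows the x-ms-acl convention used by the setAccessControl operation.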
