Get Site Collection Administrators for SharePoint Online using PowerShell


SharePoint Online sites can have thousands of users.  A common way to find site collection administrators is to run the Get-SPOUser cmdlet from the SharePoint Online Management Shell and pipe it through Where-Object {$_.IsSiteAdmin}.  On sites with a large number of users this often times out.  Instead, I decided to use a different approach: first export all users from the site to a CSV, then check the IsSiteAdmin flag locally.


$Creds = Get-Credential

#Connect to the SharePoint Online admin center
$site = 'https://sharepoint-admin.sharepoint.com'
Connect-SPOService -Url $site -Credential $Creds

#Export every user on the site to a CSV, then check the IsSiteAdmin flag locally
$AllUsers = Get-SPOUser -Site https://site.sharepoint.com -Limit All | Select-Object DisplayName, LoginName, IsSiteAdmin
$AllUsers | Export-Csv -Path C:\temp\allusers.csv -NoTypeInformation -Force
$Data = Import-Csv C:\temp\allusers.csv
foreach($aUser in $Data)
{
  if($aUser.IsSiteAdmin -eq "True")
  {
    Write-Host $aUser.DisplayName $aUser.LoginName
  }
}
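If you need the same report for every site collection in the tenant, the same pattern can be wrapped in a loop. This is a minimal sketch, assuming you are already connected to the admin center with Connect-SPOService, your account has access to each site, and C:\temp exists; the output path is a placeholder.

#Sketch: collect users per site collection, filter the IsSiteAdmin flag locally, and export one report
$Admins = foreach($s in (Get-SPOSite -Limit All))
{
    $SiteUsers = Get-SPOUser -Site $s.Url -Limit All |
        Select-Object DisplayName, LoginName, IsSiteAdmin
    $SiteUsers | Where-Object { $_.IsSiteAdmin } |
        Select-Object @{n='SiteUrl';e={$s.Url}}, DisplayName, LoginName
}
$Admins | Export-Csv -Path C:\temp\siteadmins.csv -NoTypeInformation -Force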


Add Fields to List using CSOM and PowerShell


A very basic script to add site columns to a SharePoint Online list using the Client Side Object Model (CSOM) and PowerShell.  This is a starter script and can be extended to support content types as well; a minimal sketch of that follows the script below.

$Creds = Get-Credential -Message "Please enter SPO Admin Credentials."
$SPOSiteURL = "https://tenant.sharepoint.com/"
#Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll" -ErrorAction Stop
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll" -ErrorAction Stop

$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Creds.UserName, $Creds.Password)
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SPOSiteURL)
$ctx.Credentials = $credentials
$ctx.Load($ctx.Web)
Write-Host "Please enter Library Title" -ForegroundColor Yellow
$LibraryName = Read-Host
$List = $ctx.Web.Lists.GetByTitle($LibraryName)
$ctx.Load($List)
$ctx.ExecuteQuery()

if($List -eq $null)
{
  return;
}

$Fields = $ctx.web.AvailableFields
$ctx.Load($Fields)
$ctx.ExecuteQuery()

$CustomerRepType = $ctx.Web.AvailableFields | Where {$_.Title -eq "CustomerRepType"}
$DocumentType = $ctx.Web.AvailableFields | Where {$_.Title -eq "DocumentType"}
$ctx.Load($CustomerRepType)
$ctx.Load($DocumentType)

$LegalDocumentTypes = $ctx.Web.AvailableFields | Where {$_.Title -eq "LegalDocumentType"}
$Product = $ctx.Web.AvailableFields | Where {$_.Title -eq "Product"}
$ctx.Load($LegalDocumentTypes)
$ctx.Load($Product)

$Solution = $ctx.Web.AvailableFields | Where {$_.Title -eq "Solution"}
$WWRegion = $ctx.Web.AvailableFields | Where {$_.Title -eq "WWRegion"}
$ctx.Load($Solution)
$ctx.Load($WWRegion)
$ctx.ExecuteQuery()

#Add fields to the list
$List.Fields.Add($CustomerRepType)
$List.Fields.Add($DocumentType)
$List.Fields.Add($LegalDocumentTypes)
$List.Fields.Add($Product)
$List.Fields.Add($Solution)
$List.Fields.Add($WWRegion)
$List.Update()
$ctx.ExecuteQuery()

#Add fields to the default view
$DefaultView = $List.DefaultView
$DefaultView.ViewFields.Add("CustomerRepType")
$DefaultView.ViewFields.Add("DocumentType")
$DefaultView.ViewFields.Add("LegalDocumentType")
$DefaultView.ViewFields.Add("Product")
$DefaultView.ViewFields.Add("Solution")
$DefaultView.ViewFields.Add("WWRegion")
$DefaultView.Update()
$ctx.ExecuteQuery()
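As mentioned above, the script can be extended to attach an existing content type to the list as well. Here is a minimal sketch, assuming a site content type already exists; the name "Legal Document" is a placeholder.

#Sketch: enable content types on the list and attach an existing site content type
$List.ContentTypesEnabled = $true
$List.Update()
$ctx.ExecuteQuery()

$cts = $ctx.Web.AvailableContentTypes
$ctx.Load($cts)
$ctx.ExecuteQuery()

$ct = $cts | Where {$_.Name -eq "Legal Document"}   #placeholder content type name
if($ct)
{
  $List.ContentTypes.AddExistingContentType($ct)
  $ctx.ExecuteQuery()
}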

The script is also available on the TechNet Gallery:

https://gallery.technet.microsoft.com/Add-Colums-to-List-CSOM-ee62a96a

Bulk Upload Files to SharePoint Online with Metadata


There are many scripts on TechNet that help you upload content from file shares to SharePoint Online or OneDrive for Business, but none of them actually copy the user information from the file shares.  The script below migrates metadata such as Created By, Modified By, and the created and modified dates along with the files.  The script uses the SharePoint Online Client Side Object Model.  I hope this helps someone.

Note: The original script was written by someone on TechNet whose name I no longer remember; I made substantial modifications so that it can extract and upload metadata.
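The script expects a metadata CSV with FilePath, Description, Author, Editor, CreatedOn and ModifiedOn columns. One way to generate such a file from the source file share is sketched below; mapping Author and Editor from the NTFS owner is an assumption, so adjust it to however you track ownership, and note that the owner value must resolve to a valid SharePoint Online user for EnsureUser to succeed.

#Sketch: build the metadata CSV the upload script reads (column names match what the script expects)
$Source = "C:\FileShare"    #placeholder source folder
Get-ChildItem -Path $Source -Recurse -File | ForEach-Object {
    $owner = (Get-Acl $_.FullName).Owner    #assumption: NTFS owner maps to the SPO login
    [PSCustomObject]@{
        FilePath    = $_.FullName
        Description = ""
        Author      = $owner
        Editor      = $owner
        CreatedOn   = $_.CreationTime
        ModifiedOn  = $_.LastWriteTime
    }
} | Export-Csv -Path C:\temp\Metadata.csv -NoTypeInformation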

[CmdletBinding()]
param(
  [Parameter(Mandatory=$True,Position=1)]
  [String]$UserName,
  [Parameter(Mandatory=$True,Position=2)]
  [String]$Password,
  [Parameter(Mandatory=$True, Position=3)]
  [String]$SiteURL,
  [Parameter(Mandatory=$True, Position=4)]
  [String]$DocLibName,
  [Parameter(Mandatory=$True, Position=5)]
  [String]$Folder,
  [Parameter(Mandatory=$False, Position=6)]
  [Switch]$Checkin,
  [Parameter(Mandatory=$False, Position=7)]
  [string]$MetaDataCSV,
  [Parameter(Mandatory=$False, Position=8)]
  [Switch]$O365
)

<#
    Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM

    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
#>

Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

function GetUserLookupString{
    [CmdletBinding()]
    param($context, $userString)
   
    try{
        $user = $context.Web.EnsureUser($userString)
        $context.Load($user)
        ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
       
        # The "proper" way would seem to be to set the user field to the user value object
        # but that does not work, so we use the formatted user lookup string instead
        #$userValue = New-Object Microsoft.SharePoint.Client.FieldUserValue
        #$userValue.LookupId = $user.Id
        $userLookupString = "{0};#{1}" -f $user.Id, $user.LoginName
    }
    catch{
        Write-Host "Unable to ensure user '$($userString)'."
        $userLookupString = $null
    }
   
    return $userLookupString
}

function ExecuteQueryWithIncrementalRetry([Microsoft.SharePoint.Client.ClientContext] $context, $retryCount, $delay)
{
    if ($retryCount -eq $null) { $retryCount = 5 } # default to 5
    if ($delay -eq $null) { $delay = 500 } # default to 500
    $retryAttempts = 0
    $backoffInterval = $delay
    if ($retryCount -le 0)
    {
        throw New-Object ArgumentException("Provide a retry count greater than zero.")
    }

    if ($delay -le 0)
    {
        throw New-Object ArgumentException("Provide a delay greater than zero.")
    }

    # Do while retry attempt is less than retry count
    while ($retryAttempts -lt $retryCount)
    {
        try
        {
            $context.ExecuteQuery()
            return
        }
        catch [System.Net.WebException]
        {
            $response = [System.Net.HttpWebResponse]$_.Exception.Response
            # Check if the request was throttled - HTTP status code 429
            # Check if the request failed because the server was unavailable - HTTP status code 503
            if ($response -ne $null -and ($response.StatusCode -eq 429 -or $response.StatusCode -eq 503))
            {
                # Output status to console. Should be changed to Debug.WriteLine for production usage.
                Write-Host "CSOM request frequency exceeded usage limits. Sleeping for $backoffInterval milliseconds before retrying."

                # Add delay for retry
                Start-Sleep -Milliseconds $backoffInterval

                # Add to retry count and increase delay.
                $retryAttempts++
                $backoffInterval = $backoffInterval * 2
            }
            else
            {
                throw
            }
        }
    }
    throw New-Object Exception("Maximum retry count of $retryCount has been reached.")
}

<#
    Define Functions
#>

<#
    Upload File – This function performs the actual file upload
#>
function UploadFile($DestinationFolder, $File)
{
    #Get the datastream of the file, assign it to a variable
    $FileStream = New-Object IO.FileStream($File.FullName,[System.IO.FileMode]::Open)

    #Create an instance of a FileCreationInformation object
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation

    #Indicate whether or not you would like to overwrite files in the event of a conflict
    $FileCreationInfo.Overwrite = $True

    #Make the datastream of the file you wish to create equal to the datastream of the source file
    $FileCreationInfo.ContentStream = $FileStream

    #Set the URL of the new file to the source file name
    $FileCreationInfo.Url = $File.Name

    #Add the file to the destination folder which was passed to the function, using the FileCreationInformation supplied.  Assign this to a variable so that it can be loaded into context.
    $Upload = $DestinationFolder.Files.Add($FileCreationInfo)
    if($Checkin)
    {
        $Context.Load($Upload)
        #$Context.ExecuteQuery()
        ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
        if($Upload.CheckOutType -ne "none")
        {
            $Upload.CheckIn("Checked in by Administrator", [Microsoft.SharePoint.Client.CheckinType]::MajorCheckIn)
        }
    }
    $Context.Load($Upload)
    #$Context.ExecuteQuery()
    ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
   
    if($MetaDataCSVData)
    {
      $MetaDataItem = $MetaDataCSVData | ? FilePath -eq $File.FullName
      if(-not $MetaDataItem)
      {
        Write-Host "File does not exist in metadata file.  Standard metadata will be used..." -ForegroundColor Yellow
      }
      if($MetaDataItem)
      {
        Write-Host "Adding Metadata to File Item...$($File.FullName)"
        $listItem = $Upload.ListItemAllFields
        $Context.Load($listItem)
        #$Context.ExecuteQuery()
        ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
   
        $Item = $List.GetItemById($listItem.Id);
        $Context.Load($Item)
        #$Context.ExecuteQuery()
        ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
     
        $Item["Description0"] = $MetaDataItem.Description
        $Author = GetUserLookupString $Context $MetaDataItem.Author
        if($Author)
        {
          $Item["Author"] = $Author
        }
        $Editor = GetUserLookupString $Context $MetaDataItem.Editor
        if($Editor)
        {
          $Item["Editor"] = $Editor
        }

        [System.DateTime]$CreatedOnDate = $MetaDataItem.CreatedOn
        [System.DateTime]$ModifiedOnDate = $MetaDataItem.ModifiedOn
        $Item["Created"] = $CreatedOnDate
        $Item["Modified"] = $ModifiedOnDate
        try
        {
          $Item.Update()
        }
        catch
        { Write-Host "Exception occurred... but the script will continue" }
        #$Context.ExecuteQuery()
        ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
      }
    }
   
   
 
}

<#
    Create Folder Function.
#>
function PopulateFolder($ListRootFolder, $FolderRelativePath)
{
    #split the FolderRelativePath passed into chunks (between the backslashes) so that we can check if the folder structure exists
    $PathChunks = $FolderRelativePath.split("\")

    #Make sure we start with a fresh WorkingFolder for every folder passed to the function
    if($WorkingFolder)
    {
        Remove-Variable WorkingFolder
    }

    #Start with the root folder of the list, load this into context
    $WorkingFolder = $ListRootFolder
    $Context.load($WorkingFolder)
    $Context.ExecuteQuery()

    #Load the folders of the current working folder into context
    $Context.load(($WorkingFolder.folders))
    #$Context.executeQuery()
    ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500

    #Set the FileSource folder equal to the absolute path of the folder that passed to the function
    $FileSource = $Folder + $FolderRelativePath
   
    #Loop through the folder chunks, ensuring that the correct folder hierarchy exists in the destination
    foreach($Chunk in $PathChunks)
    {
        #Check to find out if a subfolder exists in the current folder that matches the path chunk being evaluated
        if($WorkingFolder.folders | ? {$_.name -eq $Chunk})
        {
            #Log the status to the PowerShell host window
            Write-Host "Folder $Chunk Exists in" $WorkingFolder.name -ForegroundColor Green

            #Since we will be evaluating other chunks in the path, set the working folder to the current folder and load this into context.
            $WorkingFolder = $WorkingFolder.folders | ? {$_.name -eq $Chunk}
            $Context.load($WorkingFolder)
            $Context.load($WorkingFolder.folders)
            #$Context.ExecuteQuery()
            ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500

        }
        else
        {
            #If the folder doesn't exist, log a message indicating that the folder doesn't exist, and another message indicating that it is being created
            Write-Host "Folder $Chunk Does Not Exist in" $WorkingFolder.name -ForegroundColor Yellow
            Write-Host "Creating Folder $Chunk in" $WorkingFolder.name -ForegroundColor Green
           
            #Load the working folder into context and create a subfolder with a name equal to the chunk being evaluated, and load this into context
            $Context.load($WorkingFolder)
            $Context.load($WorkingFolder.folders)
            #$Context.ExecuteQuery()
            ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
            $WorkingFolder= $WorkingFolder.folders.add($Chunk)
            $Context.load($WorkingFolder)
            $Context.load($WorkingFolder.folders)
            #$Context.ExecuteQuery()
            ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
           
        }

    }

    #Folder is confirmed existing or created – now it’s time to list all files in the source folder, and assign this to a variable
    $FilesInFolder = Get-ChildItem -Path $FileSource | ? {$_.psIsContainer -eq $False}
   
    #For each file in the source folder being evaluated, call the UploadFile function to upload the file to the appropriate location
    Foreach ($File in ($FilesInFolder))
    {

        #Notify the operator that the file is being uploaded to a specific location
        Write-Host "Uploading file" $file.Name "to" $WorkingFolder.name -ForegroundColor Cyan

        #Upload the file
        UploadFile $WorkingFolder $File

    }
   
   
   
}

<#
    Bind your context to the site collection
#>
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)

<#
    Create a credential object using the username and password supplied
#>
if($O365)
{
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName,(ConvertTo-SecureString $Password -AsPlainText -Force))
}
else
{
    $Creds = New-Object System.Net.NetworkCredential($UserName, (ConvertTo-SecureString $Password -AsPlainText -Force))

}

<#
    Set the credentials that are used in the context.
#>
$Context.Credentials = $Creds

<#
    Retrieve the library, and load it into the context
#>
$List = $Context.Web.Lists.GetByTitle($DocLibName)
$Web = $Context.Web;
$Context.Load($Web)
$Context.Load($List)
$List.EnableVersioning = $false
$List.Fields.GetByInternalNameOrTitle("Author").ReadOnlyField = $false
$List.Fields.GetByInternalNameOrTitle("Editor").ReadOnlyField = $false
$List.Fields.GetByInternalNameOrTitle("Created").ReadOnlyField = $false
$List.Fields.GetByInternalNameOrTitle("Modified").ReadOnlyField = $false
$List.Update()
#$Context.ExecuteQuery()
ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500

#Use the metadata CSV supplied by the operator, or fall back to the default location
if(-not $MetaDataCSV) { $MetaDataCSV = "C:\temp\Metadata.csv" }
$MetaDataCSVData = Import-Csv -Path $MetaDataCSV

#Get a recursive list of all folders beneath the folder supplied by the operator
$AllFolders = Get-ChildItem -Recurse -Path $Folder |? {$_.psIsContainer -eq $True}

#Get a list of all files that exist directly at the root of the folder supplied by the operator
$FilesInRoot = Get-ChildItem -Path $Folder | ? {$_.psIsContainer -eq $False}

#Upload all files in the root of the folder supplied by the operator
Foreach ($File in ($FilesInRoot))
{

    #Notify the operator that the file is being uploaded to a specific location
    Write-Host "Uploading file" $File.Name "to" $DocLibName -ForegroundColor Cyan

    #Upload the file
    UploadFile $list.RootFolder $File
   

}

#Loop through all folders (recursive) that exist within the folder supplied by the operator
foreach($CurrentFolder in $AllFolders)
{
    #Set the FolderRelativePath by removing the path of the folder supplied by the operator from the fullname of the folder
    $FolderRelativePath = ($CurrentFolder.FullName).Substring($Folder.Length)
   
    #Call the PopulateFolder function for the current folder, which will ensure that the folder exists and upload all files in the folder to the appropriate location
    PopulateFolder $list.RootFolder $FolderRelativePath
}

$List.Fields.GetByInternalNameOrTitle("Author").ReadOnlyField = $true
$List.Fields.GetByInternalNameOrTitle("Editor").ReadOnlyField = $true
$List.Fields.GetByInternalNameOrTitle("Created").ReadOnlyField = $true
$List.Fields.GetByInternalNameOrTitle("Modified").ReadOnlyField = $true
$List.EnableVersioning = $True
$List.Update()
#$Context.ExecuteQuery()
ExecuteQueryWithIncrementalRetry -context $Context -retryCount 5 -delay 500
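A sample invocation is shown below; the script file name and all parameter values are placeholders. Note that the trailing backslash on -Folder matters, because the script derives each folder's relative path by trimming the -Folder prefix from the full path.

.\Upload-FilesWithMetadata.ps1 -UserName "admin@tenant.onmicrosoft.com" -Password "P@ssw0rd" `
    -SiteURL "https://tenant.sharepoint.com/sites/target" -DocLibName "Documents" `
    -Folder "C:\FileShare\" -MetaDataCSV "C:\temp\Metadata.csv" -Checkin -O365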

The script is also available on the TechNet Gallery:

https://gallery.technet.microsoft.com/Bulk-Upload-Files-to-b9ed3126