Track SharePoint Content Database Growth via Central Admin

Following on from a recent post I made about a SharePoint health analyzer rule that can be used to automatically expand a SharePoint content database outside of normal working hours, I wanted to create a solution for monitoring content database growth over time via central admin. Here’s what I came up with:

[Screenshot: the Review Database Sizes page]

The solution consists of four parts. The first part is the Review Database Sizes page shown above. The page is accessed from a custom action under Application Management > Databases:

[Screenshot: the custom action under Application Management > Databases]

The Review Database Sizes page lists each content database present in the farm, plus sparklines that show the database data file size and log file size over time. Clicking on the database name or either of the sparklines opens the second part of the solution, the Database Size Details application page. This page is displayed inside an SP.UI.ModalDialog:

[Screenshot: the Database Size Details dialog]

The chart shown in the modal dialog (and the sparklines) are created via the jqPlot jQuery plugin and allow for some nifty features such as data point highlighting, animated rendering and zooming. Note: you may need to check the jqPlot browser requirements to ensure this will work in your environment.

To zoom into an area on the chart, simply click and then drag a rectangle around the data to be explored.

The chart will be re-rendered to display just the data points contained in the area you selected.

After you’ve zoomed in, you can examine individual values by hovering your mouse over a data point, or you can zoom back out to the full chart by double-clicking anywhere on the chart.


The third part of the solution is the deployment of the jqPlot JavaScript libraries themselves. The required libraries are deployed by a SharePoint feature and use ScriptLinks to add themselves to the central admin master page without updating the master page itself. I’ve used this simple and powerful method to deploy jQuery libraries before, and more details about it can be found here: Use jQuery in SharePoint without updating your MasterPage

The fourth and final part of the solution is a custom timer job that is set to run once a day, sometime between midnight and 1am. It’s called ‘SPHealth Database Size Collection’:

[Screenshot: the SPHealth Database Size Collection timer job]

The timer job finds each content database in the farm and determines the size of the database data file and log files for each. The sizes are then stored in the property bag of each content database.
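The core of the collection logic boils down to something like this sketch (the class name and property bag key are illustrative; the actual job records data and log file sizes separately, which means querying SQL rather than relying on DiskSizeRequired):

using System;
using Microsoft.SharePoint.Administration;

public class DatabaseSizeCollectionJob : SPJobDefinition
{
    public DatabaseSizeCollectionJob() : base() { }

    public DatabaseSizeCollectionJob(string name, SPService service)
        : base(name, service, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        foreach (SPWebApplication webApp in SPWebService.ContentService.WebApplications)
        {
            foreach (SPContentDatabase db in webApp.ContentDatabases)
            {
                // DiskSizeRequired reports the on-disk size of the database;
                // the real solution splits out data and log file sizes.
                ulong sizeInBytes = db.DiskSizeRequired;

                // Store a timestamped reading in the database's property bag.
                db.Properties["SPHealth_Size_" + DateTime.Today.ToString("yyyyMMdd")] = sizeInBytes.ToString();
                db.Update();
            }
        }
    }
}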

That’s it – two application pages, a timer job, and a couple of module files.

I’ve published the source code to the solution at http://sphealthdbsize.codeplex.com if you want to have a poke around and try it out for yourself. Caveat: as with any other third party solution, I would recommend you review and understand what the code is doing before you deploy it in a production farm. Also note that it may take a couple of days before you see any charts, as the timer job needs to have run at least twice to collect enough data points to plot!

Enjoy…

Autogrowth of SharePoint Content Databases – an Automated Solution

During a recent presentation at the London SharePoint User Group about SharePoint administration (and many other things), Steve Smith of Combined Knowledge discussed the issues surrounding the auto-growth of SharePoint content databases and the possible performance ramifications these auto-growths can have when they are triggered during business hours.

As Steve pointed out, the default auto-growth setting for a newly created content database is to grow in 1MB increments:

[Screenshot: the default 1MB auto-growth settings]

Clearly, for a content database that is used off the bat with this configuration, a lot (and I mean a lot) of auto-growths will be performed on the database as users load content and even access the site collections that the content database contains. The recommendations from Microsoft are to pre-grow data and log files and to set the auto-growth to 10% – see Storage and SQL Server capacity planning and configuration (SharePoint Server 2010) for further details: http://technet.microsoft.com/en-us/library/cc298801.aspx.

 

These recommendations rightly point out that database growth should be proactively managed. Steve’s presentation and this article got me thinking about a repairable SharePoint health analyzer rule that could warn when content databases are filling up and, if required, grow them automatically. What makes this a practical solution, I believe, is the ability to configure the rule so that database growths performed by the repair action of the health rule are only executed within a specified time window.

The health rule derives from SPRepairableHealthAnalysisRule so it can be configured to automatically repair (for repair read grow) a database once it has exceeded a configurable capacity threshold. The rule supports four custom configurable parameters:

<Properties>
  <!-- Enter the database capacity percentage that is used to trigger -->
  <!-- a warning and potentially a scheduled database expansion. Values -->
  <!-- should be between 0.0 and 1.0. -->
  <Property Key="CapacityThreshold" Value="0.8" />
  <!-- Enter the BeginHour for the time period in which database -->
  <!-- expansions should occur. -->
  <Property Key="BeginHour" Value="1" />
  <!-- Enter the EndHour for the time period in which database -->
  <!-- expansions should occur. -->
  <Property Key="EndHour" Value="3" />
  <!-- Enter the percentage of growth the database should undertake -->
  <!-- during an expansion. Values should be between 0.0 and 1.0. -->
  <Property Key="GrowBy" Value="0.3" />
</Properties>

The CapacityThreshold property is used to set the level at which warnings about database capacities are raised. Once a database exceeds 80% capacity (the default threshold for the rule), a health analyzer warning is raised and is visible in central admin.

The BeginHour and EndHour properties are used to define a time window in which growths should be executed by the rule for databases that have exceeded their capacity threshold. These growths will not occur if the ‘Repair Automatically’ button is pressed outside of this window. Ideally you should review the properties and behaviour of this rule and, if appropriate, set the rule to repair automatically. Please note, in order for the rule to repair automatically during the specified time window, the rule schedule should remain hourly:

[Screenshot: the rule schedule set to hourly]

Lastly, the GrowBy property is used by the repair method to determine the amount of expansion a database should undertake. The default is 30% – this means that if a database is 100MB in size and 90% full, the database will be grown to 130MB. The total database size is used to calculate the new database size, not the amount of space currently used.
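In code terms the time window check and repair calculation boil down to something like this (method names are illustrative, not the actual rule source):

// Growths only execute inside the BeginHour/EndHour window.
private static bool InGrowthWindow(int beginHour, int endHour)
{
    int hour = DateTime.Now.Hour;
    return hour >= beginHour && hour < endHour;
}

// GrowBy applies to the total file size, not the space used:
// 100MB * (1 + 0.3) = 130MB.
private static long CalculateNewSizeMb(long currentSizeMb, double growBy)
{
    return (long)(currentSizeMb * (1.0 + growBy));
}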

The rule is packaged as part of the SharePoint Health Analyzer Rules project on http://sphealth.codeplex.com/

The source code for the rule can be reviewed here: http://sphealth.codeplex.com/SourceControl/changeset/view/412d4aba56ba#SPHealth.SharePoint.HealthRules%2fSP%2fRules%2fPerformance%2fWarnDatabaseCapacity.cs

BTW: There is a quicker way to solve this entire auto-growth problem – make the content database read-only!

Create custom SharePoint Health Analyzer rules

Have you got a SharePoint farm that has a unique set-up, special monitoring requirements, or particular SLAs it must meet, or a farm that needs to provide your operations team with pro-active monitoring? If so, create your own SharePoint Health Analyzer rules – it’s super easy!

I’m sure we’ve all worked on deployments that fall into one or more of the categories above, or that have tons of other requirements that would benefit from monitoring. Perhaps the monitoring you need has nothing to do with the farm deployment and its operating environment, but is instead monitoring of a custom application you’ve built. Either way, creating your own SharePoint Health Analyzer rules could be a good idea.

Here’s how you create them…

Start a new Visual Studio 2010 Empty SharePoint Project and add to it a new class. The class must inherit from SPHealthAnalysisRule:

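A minimal skeleton might look like this (the class name is illustrative):

using System;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.Administration.Health;

public class ContentDatabaseSizeRule : SPHealthAnalysisRule
{
    // The overrides described in the following steps go here.
}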

Next, you need to override the Category and ErrorLevel your rule will be reported under:

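For example (the category and severity chosen here are illustrative):

public override SPHealthCategory Category
{
    get { return SPHealthCategory.Availability; }
}

public override SPHealthCheckErrorLevel ErrorLevel
{
    get { return SPHealthCheckErrorLevel.Error; }
}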

Next, override the Explanation, Remedy and Summary strings the rule returns. These are what the user sees when the rule is displayed in the Review problems and solutions list within Central Administration.

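For example (the wording is illustrative):

public override string Summary
{
    get { return "A content database is nearing its configured capacity."; }
}

public override string Explanation
{
    get { return "One or more content databases have exceeded the capacity threshold."; }
}

public override string Remedy
{
    get { return "Expand the database files or archive content."; }
}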

Next, override the AutomaticExecutionParameters property (of type SPHealthAnalysisRuleAutomaticExecutionParameters); these parameters control how, where and when the rule is checked.

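A typical implementation looks like this (the schedule and scope are illustrative):

public override SPHealthAnalysisRuleAutomaticExecutionParameters AutomaticExecutionParameters
{
    get
    {
        SPHealthAnalysisRuleAutomaticExecutionParameters retval =
            new SPHealthAnalysisRuleAutomaticExecutionParameters();
        retval.Schedule = SPHealthCheckSchedule.Hourly;
        retval.Scope = SPHealthCheckScope.Any;    // run on any one server
        retval.ServiceType = typeof(SPTimerService);
        return retval;
    }
}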

The interesting option here is the Scope, which allows the rule to be executed on ‘Any’ or ‘All’ servers in the farm. Depending on what your rule is designed to do, running it on one server might be enough, or you may need to run it on every server. For example, a health rule that checks the size of a content database could be run on any server (SPHealthCheckScope.Any), as it doesn’t matter from which server you interrogate your database for its size. However, a rule that checks for available disk space will need to be executed on every server (SPHealthCheckScope.All).

Now the last part, the rule logic itself. To implement this, simply override the Check() method:

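Here’s a minimal sketch (the threshold logic is illustrative):

public override SPHealthCheckStatus Check()
{
    // Illustrative check: flag any content database over 50GB on disk.
    foreach (SPWebApplication webApp in SPWebService.ContentService.WebApplications)
    {
        foreach (SPContentDatabase db in webApp.ContentDatabases)
        {
            if (db.DiskSizeRequired > 50UL * 1024 * 1024 * 1024)
                return SPHealthCheckStatus.Failed;
        }
    }
    return SPHealthCheckStatus.Passed;
}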

The Check method must return an SPHealthCheckStatus – SPHealthCheckStatus.Passed if all is well, or SPHealthCheckStatus.Failed to raise the problem in central admin.


As you can see, creating rules is simple. Installing the rules is a little more involved, but still only a few lines of code. To deploy the rules, you’ll need to add a farm scoped feature to your project, add an event receiver to the farm scoped feature, and override the FeatureActivated and FeatureDeactivating events. The FeatureActivated event installs the rules contained in the assembly produced by the project by calling the RegisterRules method of the SPHealthAnalyzer class:

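Something along these lines (assumes using System.Reflection and System.Collections.Generic):

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    // Register every health rule found in this assembly with the analyzer.
    IDictionary<Type, Exception> failures =
        SPHealthAnalyzer.RegisterRules(Assembly.GetExecutingAssembly());
}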

Lastly, the code to remove the rules on feature deactivation is just as simple:

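A matching sketch:

public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
{
    // Remove the rules that were registered from this assembly.
    SPHealthAnalyzer.UnregisterRules(Assembly.GetExecutingAssembly());
}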

Now deploy your feature and watch it fail…

There’s one last trick to getting this working. It appears that there is an issue with deploying the solution and activating the feature all in one step (just like Visual Studio tries to do); the RegisterRules method call fails if you attempt this. I suspect this is due to the timing of the DLL becoming available in the GAC, but I haven’t got to the bottom of this one yet. To work around this issue, update the farm feature manifest.xml to include the ActivateOnDefault="False" attribute:

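For illustration, only the ActivateOnDefault attribute is the point here – the rest of the element is whatever your feature already declares:

<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
         Title="My Health Rules"
         Scope="Farm"
         ActivateOnDefault="FALSE" />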

Now you can deploy your solution, manually activate your farm feature and begin testing your custom rules.

If you want a complete sample solution that includes the rule I’ve used as an example in this post and many more you can download the source code and WSP at http://sphealth.codeplex.com

Enjoy.

Update IIS bindings programmatically via SharePoint timer job

In some rare edge-cases, it may be necessary to programmatically update IIS settings from SharePoint code. In this example I’m updating the host header bindings in IIS as I’m using (and creating) host header site collections programmatically. I could use a wildcard DNS and the default port 80 but in my scenario we need explicit host header bindings.

To accomplish the update to IIS we need to use a timer job, for two reasons. The first is to ensure our updates run with sufficient privileges to perform the required changes – as timer jobs run under the farm account, this is typically elevated enough already. Secondly, using the timer job framework allows us to target which servers (one, some or all) the modifications are run on. This is important because if we are using a farm with multiple servers, keeping changes to IIS bindings synchronised across the farm is clearly a good place to aim for – in every other direction lies madness.

How to create a timer job is not in the scope of this post – here’s some great reference material to get you started: http://msdn.microsoft.com/en-us/library/cc427068(v=office.12).aspx – this example is for WSS 3 but the process is just the same in 2010.

The important thing to note is how you install your timer job. One of the parameters of your timer job’s constructor can be an SPJobLockType:

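A sketch of such a constructor (the class and job names are illustrative):

public class UpdateIisBindingsJob : SPJobDefinition
{
    public UpdateIisBindingsJob() : base() { }

    public UpdateIisBindingsJob(string name, SPWebApplication webApp)
        : base(name, webApp, null /* no specific server */, SPJobLockType.None) { }
}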

The SPJobLockType enumeration has three members that control where the timer job is executed:

  • None – Provides no locks. The timer job runs on every machine in the farm on which the parent service is provisioned, unless the job is associated with a specified server, in which case it runs on only that server (and only if the parent service is provisioned on the server).
  • ContentDatabase – Locks the content database. A timer job runs one time for each content database associated with the Web application.
  • Job – Locks the timer job so that it runs only on one machine in the farm.

The None option is potentially the most useful to us for updating IIS settings, as this will cause the timer job to run on every server in the farm that is running the ‘Microsoft SharePoint Foundation Web Application’ service – these are effectively our web front end servers. This approach also relies on the third parameter passed to the constructor being null, otherwise the SPServer object passed via this parameter is used as the server to host the timer job.

Once we’ve installed our timer job in the appropriate way, the payload is simple. We need to override the Execute method of our timer job with whatever logic we need. Here’s an extract of my code for updating IIS bindings, which should now be executed on each SharePoint WFE:

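Here’s a minimal sketch of the idea using the IIS managed API (Microsoft.Web.Administration); the site name and host header below are placeholders:

using System;
using System.Linq;
using Microsoft.Web.Administration;

public override void Execute(Guid targetInstanceId)
{
    using (ServerManager iis = new ServerManager())
    {
        Site site = iis.Sites["SharePoint - 80"];          // placeholder site name
        string bindingInfo = "*:80:teamsite.contoso.com";  // IP:port:hostheader

        // Add the host header binding if it isn't already present on this server.
        if (!site.Bindings.Any(b => b.BindingInformation == bindingInfo))
        {
            site.Bindings.Add(bindingInfo, "http");
            iis.CommitChanges();
        }
    }
}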

I hope this helps….

Update: 23 December 2012 – A good friend of mine, Gael Fabry, pointed me in the direction of this article that describes the schema of the IIS 7 applicationHost.config file: http://msdn.microsoft.com/en-us/library/aa347559(v=VS.90).aspx

Publishing and Consuming SharePoint Service Application with PowerShell

Below are my PowerShell scripts for publishing and consuming service applications between SharePoint farms (service application federation). Before I introduce the scripts, let me briefly explain the process involved.

There are several reasons why you might want to share service applications between farms, and I’m not planning on going into those details here, but the overall process of setting up service application federation is illustrated below:

[Diagram: service application federation between a consuming farm and a publishing farm]

The SharePoint farm on the left is the consuming farm. It will use the service applications from the SharePoint farm on the right – the publishing farm. Practically, this means that the publishing farm contains your service instances, service applications and service application proxies, and your consuming farm contains just service application proxies that ‘point’ to the publishing farm. Loads more detail on service application federation is available on TechNet, including which service applications can be federated: http://technet.microsoft.com/en-us/library/ff621100.aspx

The scripts I have follow the pattern shown in the diagram above. First on the consuming farm, I export the farm root certificate, the security token service certificate and I write the farm ID to a text file:

Contents of 1_Consumer_ExportCerts.ps1 – run this on the CONSUMING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# export consumer root certificate
Write-Host "Exporting Consumer Root Certificate…" -nonewline
$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content ConsumingFarmRoot.cer -Encoding byte
Write-Host "Done" -Foreground Green

# export consumer sts certificate
Write-Host "Exporting Consumer STS Certificate…" -nonewline
$stsCert = (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate
$stsCert.Export("Cert") | Set-Content ConsumingFarmSTS.cer -Encoding byte
Write-Host "Done" -Foreground Green

# export consumer farm id
Write-Host "Exporting Consumer Farm Id…" -nonewline
$farmID = Get-SPFarm | select Id
set-content -path ConsumerFarmID.txt $farmID
Write-Host "Done" -Foreground Green

Write-Host "Now copy ConsumingFarmRoot.cer, ConsumingFarmSTS.cer & ConsumerFarmID.txt to the publishing farm." -Foreground Yellow

You’ll notice this script ends with a prompt to guide you to the next step – copy the consuming farm certificate, consuming STS certificate and consuming farm ID text file we’ve just created to the publishing farm. Next, switch over to the publishing farm.

Contents of 2_Publisher_ExportCerts.ps1 – run this on the PUBLISHING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# export publisher root certificate
Write-Host "Exporting Publisher Root Certificate…" -nonewline
$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content PublishingFarmRoot.cer -Encoding byte
Write-Host "Done" -Foreground Green

Write-Host "Exporting Publisher Topology Url…" -nonewline
$topologyUrl = Get-SPTopologyServiceApplication | Select LoadBalancerUrl
$url = $topologyUrl.LoadBalancerUrl.OriginalString
Set-Content -path PublishingFarm.Url.txt -Value $url
Write-Host "Done" -Foreground Green

write-host
Write-Host "Now copy PublishingFarmRoot.cer & PublishingFarm.Url.txt to the consuming farm." -Foreground Yellow
write-host

Now copy the publishing farm certificate and publishing farm URL text file that have just been generated to the consuming farm, then switch back to the consuming farm.

Contents of 3_Consumer_ImportCerts.ps1 – run this on the CONSUMING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# import publisher root certificate
Write-Host "Importing Publisher Root Certificate…" -nonewline
$trustCert = Get-PfxCertificate PublishingFarmRoot.cer
New-SPTrustedRootAuthority PublishingFarm -Certificate $trustCert
Write-Host "Done" -Foreground Green

Write-Host "Now import certificates on the publishing farm." -Foreground Yellow

You’ve now imported the publishing farm root certificate into the consuming farm. Next, switch back to the publishing farm and import the consuming farm certificate and consuming STS certificate into the publishing farm with the following PowerShell:

Contents of 4_Publisher_ImportCerts.ps1 – run this on the PUBLISHING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# import consumer root certificate
Write-Host "Importing Consumer Root Certificate…" -nonewline
$trustCert = Get-PfxCertificate ConsumingFarmRoot.cer
New-SPTrustedRootAuthority ConsumingFarm -Certificate $trustCert
Write-Host "Done" -Foreground Green

# import consumer sts certificate
Write-Host "Importing Consumer STS Certificate…" -nonewline
$stsCert = Get-PfxCertificate ConsumingFarmSTS.cer
New-SPTrustedServiceTokenIssuer ConsumingFarm -Certificate $stsCert
Write-Host "Done" -Foreground Green

Write-Host "Now set permissions for application discovery on the publishing farm." -Foreground Yellow

Note: At this point I would recommend you access Central Admin on both the CONSUMING and PUBLISHING farms and verify that the trusts are in place as expected. To do this, from central admin select Security > Manage Trust.

Once you have verified your trusts are in place you’re ready to start sharing the service applications between farms. Now switch to the publishing farm.

Contents of 5_Publisher_SetPermissions.ps1 – run this on the PUBLISHING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# get consumer farm id
Write-Host "Reading Consumer Farm ID…" -nonewline
$consumerId = Get-Content -path ConsumerFarmID.txt
$consumerId = $consumerId.Replace("@{Id=","").Replace("}","")
Write-Host "Done" -Foreground Green

# set application discovery permissions
Write-Host "Set Application Discovery Permissions…" -nonewline
$security=Get-SPTopologyServiceApplication | Get-SPServiceApplicationSecurity
$claimprovider=(Get-SPClaimProvider System).ClaimProvider
$principal=New-SPClaimsPrincipal -ClaimType "http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid" -ClaimProvider $claimprovider -ClaimValue $consumerId
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Full Control"
Get-SPTopologyServiceApplication | Set-SPServiceApplicationSecurity -ObjectSecurity $security
Write-Host "Done" -Foreground Green

# list the available service applications and prompt for one to be selected
$serviceAppList = @{"0"="DummyServiceApp"}
$serviceApps = Get-SPServiceApplication
$count = 1
$serviceWarning = ""
Write-Host
Write-Host "The following service applications are available for publishing:"
foreach ($serviceApp in $serviceApps)
{
    # ensure only service applications that can be shared are listed
    $type = $serviceApp.TypeName
    $serviceSharable = 0

    Switch ($type)
    {
        ("Business Data Connectivity Service Application")                            {$serviceSharable = 1}
        ("Managed Metadata Service")                                                  {$serviceSharable = 1}
        ("User Profile Service Application")                                          {$serviceSharable = 1}
        ("Search Service Application")                                                {$serviceSharable = 1}
        ("Secure Store Service Application")                                          {$serviceSharable = 1}
        ("Web Analytics Service Application")                                         {$serviceSharable = 1}
        ("Microsoft SharePoint Foundation Subscription Settings Service Application") {$serviceSharable = 1}
    }
    if ($serviceSharable -gt 0)
    {
        $serviceAppList.Add("$count",$serviceApp.Id)
        Write-host "$count. " -nonewline -foregroundcolor White
        Write-host $serviceApp.DisplayName -foregroundcolor gray
        $count++
    }

}
Write-Host
$serviceAppNum = Read-Host -Prompt " – Please enter the id of the service application to be shared"
Write-Host
Write-Host "Getting Service Application…" -nonewline
$serviceAppId = $serviceAppList.Get_Item($serviceAppNum)
$serviceApp = Get-SPServiceApplication $serviceAppId
Write-Host "Done" -Foreground Green

# warn about domain trusts
$serviceWarning = ""
$type = $serviceApp.TypeName
Switch ($type)
{
    ("Business Data Connectivity Service Application") {Write-Host; Write-Host "Note: Publishing domain must trust Consuming domain." -Foreground Yellow; Write-Host;}
    ("User Profile Service Application")               {Write-Host; Write-Host "Note: A two-way trust must exist between the Publishing and Consuming domains." -Foreground Yellow; Write-Host;}
    ("Secure Store Service Application")               {Write-Host; Write-Host "Note: Publishing domain must trust Consuming domain." -Foreground Yellow; Write-Host;}
}

# list the service rights for the specified service application
write-host
$rightsList = @{"0"="DummyServiceApp"}
$count = 1
$serviceAppSecurity = Get-SPServiceApplicationSecurity $serviceApp
foreach ($right in $serviceAppSecurity.NamedAccessRights)
{
        $rightsList.Add("$count",$right.Name)
        Write-host "$count. " -nonewline -foregroundcolor White
        Write-host $right.Name -foregroundcolor gray
        $count++
}
write-host
$serviceAppRight = Read-Host -Prompt " – Please enter the right to be granted"
$serviceAppRight = $rightsList.Get_Item($serviceAppRight)

Write-Host "Granting ‘$serviceAppright’ to service application…" -nonewline
$security=Get-SPServiceApplication $serviceApp| Get-SPServiceApplicationSecurity
$claimprovider=(Get-SPClaimProvider System).ClaimProvider

if ($type -eq "User Profile Service Application")
{
    $consumFarmAcc= Read-Host -Prompt " – Please enter the consuming farm account e.g. DOMAIN\account"
    $principal=New-SPClaimsPrincipal -Identity $consumFarmAcc -IdentityType WindowsSamAccountName   
}
else
{
    $principal=New-SPClaimsPrincipal -ClaimType "http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid" -ClaimProvider $claimprovider -ClaimValue $consumerId
}
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights $serviceAppRight
Set-SPServiceApplicationSecurity $serviceApp -ObjectSecurity $security
Write-Host "Done" -Foreground Green

Write-Host "Publishing service application…" -nonewline
Publish-SPServiceApplication -Identity $serviceApp
Write-Host "Done" -Foreground Green

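# flatten the Uri property to a plain string by writing it to a temp file,
# reading it back and stripping the @{Uri=} wrapper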
$lbUrl = Get-SPserviceApplication $serviceApp | Select Uri
Set-Content -path $serviceAppId -Value $lbUrl
$cleanUrl = Get-Content -path $serviceAppId
del $serviceAppId
$cleanUrl = $cleanUrl.Replace("@{Uri=","").Replace("}","")

write-Host
Write-Host "Now connect to the service application from the consumer farm with the following url:" -Foreground Yellow
write-Host
Write-Host $cleanUrl -Foreground Yellow

It’s a whopper, but basically this script lists all the available service applications on the PUBLISHING farm and allows you to publish them to the CONSUMING farm whilst choosing the service application specific permissions to grant to the consuming farm:

[Screenshot: the script listing the service applications available for publishing]

Simply enter the number for the service application you wish to publish and then the number associated with the permissions you wish to grant to the consuming farm.

Now, switch to the consuming farm.

Contents of 6_Consumer_ConnectService.ps1 – run this on the CONSUMING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# get url from user
Write-Host
Write-Host "Reading topology service url…" -nonewline
$topologyUrlShort = get-content -path PublishingFarm.Url.txt
Write-Host "Done" -Foreground Green
Write-Host

#get available published services:
Write-Host "Connecting to topology service $topologyUrlShort…" -nonewline
$publishedServices = Receive-SPServiceApplicationConnectionInfo -FarmUrl $topologyUrlShort
Write-Host "Done" -Foreground Green
Write-Host

# list the published services
Write-Host "The following service applications are available for consumption:"
$serviceAppList = @{"0"="DummyServiceApp"}
$count = 1
foreach ($publishedService in $publishedServices)
{
    Write-host "$count. " -nonewline -foregroundcolor White
    Write-host $publishedService.DisplayName -foregroundcolor gray
    $serviceAppList.Add("$count",$publishedService.Uri)
    $count++
}

Write-Host
$serviceAppNum = Read-Host -Prompt " – Please enter the id of the service application to be consumed"

Write-Host
$serviceAppProxyName= Read-Host -Prompt " – Please enter the service application proxy name"
Write-Host

#get the selected published service app
$count = 1
foreach ($publishedService in $publishedServices)
{
    if ($count.ToString() -eq $serviceAppNum )
    {

        #we’ve found our service application – let’s go and create it based on the type
        $type = $publishedService.SupportingProxy
        $serviceUrl =  $serviceAppList.Get_Item($serviceAppNum)
       
        Switch ($type)
        {
            ("BdcServiceApplicationProxy"){
                    Write-Host "Creating new Business Data Connectivity Service Application Proxy…" -nonewline
                    New-SPBusinessDataCatalogServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("MetadataWebServiceApplicationProxy"){
                    Write-Host "Creating new Managed Metadata Service Proxy…" -nonewline
                    New-SPMetadataServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("UserProfileApplicationProxy"){
                    Write-Host "Creating new User Profile Service Application Proxy…" -nonewline
                    New-SPProfileServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("SearchServiceApplicationProxy"){
                    Write-Host "Creating new Search Service Application Proxy…" -nonewline
                    New-SPEnterpriseSearchServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("SecureStoreServiceApplicationProxy"){
                    Write-Host "Creating new Secure Store Service Application Proxy…" -nonewline
                    New-SPSecureStoreServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("WebAnalyticsServiceApplicationProxy"){
                    Write-Host "Creating new Web Analytics Service Application Proxy…" -nonewline
                    New-SPWebAnalyticsServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
        }
        Write-Host "Complete." -Foreground Yellow

    }
    $count++

}

This final script connects to the topology service of the publishing farm and lists all the published service applications. Simply select the number of the service application you wish to consume:

[Screenshot: the script listing the published service applications available for consumption]

This has saved me loads of time in the past and has proved to be very reliable. However, please note the following points:

  • The script assumes the files copied between the farms are copied to the same location as the PowerShell scripts.
  • If you want to consume partitioned service applications you’ll need to update the final script to include the –PartitionMode switch (or the –Partitioned switch in the case of the New-SPEnterpriseSearchServiceApplicationProxy cmdlet)

I hope this helps…

Update SharePoint Timer Job Progress Programmatically

Often when developing custom timer jobs it can be very useful to provide feedback in central administration about the progress your job is making. Most of the out of the box timer jobs provide this progress feedback:

[Screenshot: timer job progress shown in central administration]

Extending your custom timer jobs to support this progress bar too is super easy – it takes just one line of code:

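For example, inside your job’s Execute override (the work loop and helper are illustrative):

public override void Execute(Guid targetInstanceId)
{
    int totalItems = 200; // illustrative amount of work
    for (int i = 1; i <= totalItems; i++)
    {
        DoSomeWork(i); // hypothetical helper doing the actual work

        // The one line that matters: report percent complete (0-100).
        this.UpdateProgress((i * 100) / totalItems);
    }
}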

The SPJobDefinition.UpdateProgress method is used to provide SharePoint with the percentage completeness of your timer job’s progress. The UpdateProgress method takes a simple int parameter with a value between 0 and 100.

Now the hard part: how to calculate your actual (and accurate) progress value – you might find this article useful: http://en.wikipedia.org/wiki/Wikipedia:Reference_desk/Archives/Miscellaneous/2008_July_1#how_long_is_a_piece_of_string.3F

Enjoy!

Adding additional claims to a Trusted Identity Token Issuer

In my first blog post about setting up claims based authentication between the Thinktecture identity server and SharePoint I showed how to create a basic token that contains a single claim – emailaddress.

Here is how you can extend the claims that SharePoint will accept in a token. I’m assuming you’ve set up claims based authentication as per my previous article.

First, we get a reference to the trusted identity token issuer we created:

$ap = Get-SPTrustedIdentityTokenIssuer | where {$_.Name -eq "idp SAML Provider"  }

Next we extend this to include our new claim – role:

$ap.ClaimTypes.Add("http://schemas.microsoft.com/ws/2008/06/identity/claims/role")
$ap.Update()

Next we create our claim mapping:

$map1 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/ws/2008/06/identity/claims/role" -IncomingClaimTypeDisplayName "Role" -SameAsIncoming

Finally we add this mapping to our trusted identity provider:

Add-SPClaimTypeMapping -Identity $map1 -TrustedIdentityTokenIssuer $ap

If we query our trusted identity token issuer again we should see the additional claim:

[Screenshot: the token issuer showing the additional role claim]

Finally, logging onto our claims based authenticated site, we should see our new claim courtesy of the claims viewer web part I installed from the codeplex project http://claimsid.codeplex.com/:

[Screenshot: the claims viewer web part showing the role claim]

Enjoy!

SharePoint claims based authentication with Thinktecture identity server – Walkthrough

This article describes how to setup claims based authentication for SharePoint using the Thinktecture Identity Server. If you don’t know about the Thinktecture identity server, it’s a great open source identity provider (IP-STS) available via codeplex: http://identityserver.codeplex.com

For brevity I’ll be referring to the Thinktecture identity server as the IP-STS and SharePoint as the RP-STS in the remainder of this article. Additionally, this setup is for demonstration purposes only; in a production environment you would probably not want to use self-signed certificates as I do here, and your web applications should be secured with SSL.

To setup claims based authentication, the following steps need to be completed: create certificates, configure the identity store, create the trust between the IP-STS and RP-STS, and finally test authentication.

So let’s begin with creating certificates.

 

Create Certificates

Before the RP-STS can trust the IP-STS, the IP-STS must be able to prove its authority – this is done via certificates. We’ll need to create a certificate for the IP-STS to use to sign the tokens it sends to the RP-STS. These tokens contain the claims about the identities the IP-STS authenticates, and the RP-STS will only accept these claims when they are signed with a trusted certificate. Therefore, there are three steps involved in the certificate management:

  • Create the certificate
  • Register the certificate with the IP-STS
  • Register the certificate with the RP-STS

In this demo, I’m using self-signed certificates, which you probably wouldn’t want to do in a production environment. To create a self-signed certificate, open a Visual Studio command prompt and enter the following:

makecert -r -pe -n CN=idp.bc01.com -ss my -sr localmachine -sky exchange -sp "Microsoft RSA SChannel Cryptographic Provider" -sy 12 idp.bc01.com.cer

The above will create a certificate for the “idp.bc01.com” domain and save the certificate to an idp.bc01.com.cer file. This will also install the certificate in the local machine’s Personal certificate store, ready to be assigned to a site in IIS configuration.

TIP: if you want to make a client browser trust the certificate, run mmc, add the Certificates snap-in and choose Local Computer as the account. Then import the certificate into Trusted Root Certificates.

Next, we are going to add the certificate to the RP-STS. To do this, open Central Admin –> Security –> Manage Trust and select ‘New’ from the ribbon. Import the certificate you just created, assign it a name and press OK:

[Screenshot: importing the certificate via Manage Trust]

Once the certificate is imported, you can review its properties:

[Screenshot: the imported certificate’s properties]

At this point, you’ve configured the RP-STS to trust this certificate but we need to get the IP-STS to use this certificate when it signs tokens. I’m assuming you’ve already downloaded the Thinktecture identity server project and built the solution, if not you’ll need to complete the following steps:

    • Build the Thinktecture identity server project
    • Create a new IIS web site and point its root directory at the ‘website’ project directory of the Thinktecture solution.
    • Set the App Pool for the new web site to use .NET 4 Framework
    • Assign the certificate you created above to the IIS web site – typically in a production environment you would use a separate certificate for the IP-STS site and the actual signing of tokens

 

Configure Identity Store

Now we have the basic IP-STS site configured, we need to create an identity store for the IP-STS to use. The Thinktecture identity server uses a pluggable model to connect to a variety of identity stores, or you can create your own. In this example I’ll be using the ASP.NET SQL Server Registration Tool to create a database to store and manage my identities. To create your identity store you use aspnet_regsql.exe; full details of this command can be found here: http://msdn.microsoft.com/en-us/library/ms229862(VS.80).aspx

Note: although this is the same identity store that can be used with SharePoint forms based authentication, this store will be connected to the IP-STS and SharePoint will not directly connect to it.

Once you have your identity store created, I typically create another IIS web site to manage the identities via the IIS Manager UI. To do this, create another IIS web site (that is neither the IP-STS web site nor a SharePoint site) and create a connection string to point to your newly created identity store database:

[Screenshot: creating the connection string in IIS Manager]

Now create the following .NET Roles:

[Screenshot: the .NET roles, including IdentityServerAdministrators, IdentityServerUsers and the optional Milkman role]

The ‘Milkman’ role is optional but you could use this as a claim presented to the RP-STS later. Finally, create some users and assign them to these roles – ensuring you assign at least one user to the IdentityServerAdministrators role:

[Screenshot: users assigned to the roles]

Lastly, open the Thinktecture solution in Visual Studio and locate the connectionString.config file in the WebSite project:

[Screenshot: connectionString.config in the WebSite project]

Update the connection string in this file to point to your identity store database:

[Screenshot: the updated connection string]

In my example this database is called SharePoint_FBA but again, this is nothing to do with FBA in SharePoint and SharePoint never directly connects to this database – I’m just reusing an identity store I already had created for another project.

Now rebuild the Thinktecture solution and access the IP-STS web site – in my example http://idp.bc01.com – and you should see:

[Screenshot: the Thinktecture identity server home page]

Sign into the site with the user credentials you added to the IdentityServerAdministrators role. Once signed in, click on the [administration] link and you should see the following menu:

[Screenshot: the administration menu]

Now we need to configure the IP-STS. First, starting at My Claims, you can view the current claims available for the logged on user:

[Screenshot: the My Claims page]

I’ve highlighted in red the claim we’re going to be using as our identity when we sign into the RP-STS. In green I’ve highlighted the additional claim the IP-STS has added for the current user as this user was also a member of the Milkman role in the identity store.

Next, click My Token to see the token the IP-STS has issued to the current user in XML format:

[Screenshot: the issued token in XML format]

This can be a useful screen for reviewing and debugging claims based authentication.

Next, under Global Configuration, we need to adjust a few settings:

Set the Default Token Type to SAML 1.1 – the token type SharePoint expects:

[Screenshot: the Default Token Type setting]

and switch on the following options:

[Screenshot: the global configuration options to enable]

Next, under Certificate Configuration, ensure that the Current SSL and Current Signing certificates are set to the certificate you created earlier:

[Screenshot: the certificate configuration]

Lastly, under Relying Parties we need to create an entry for our RP-STS (SharePoint). Set the Relying Party Name, Realm URI and ReplyTo URL (assumes you’ll create a web application with the URL http://claims.bc01):

[Screenshot: the relying party configuration]

NOTE: Strictly speaking the Realm URI does not need to be the same as the ReplyTo URL, but to simplify configuration these are set to the same address.

Your IP-STS is all done.

IMPORTANT: Be sure to sign out of the IP-STS site before you continue.

 

Create Trust between IP-STS and RP-STS

Next we are going to configure SharePoint to act as a RP-STS and use the IP-STS we’ve just configured. We do this via PowerShell.

First we create a certificate object from the certificate file we created earlier:

$root = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\idp.bc01.com.cer")

Next we create a new Trusted Root Authority that uses the certificate:

New-SPTrustedRootAuthority -Name "idp Signing Cert" -Certificate $root

Next we create a claim mapping. In this example we are creating a single claim mapping based on emailaddress:

$map = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming

Next we create a realm; this needs to be exactly the same as the realm URI specified in the Thinktecture Relying Party configuration:

$realm = "http://claims.bc01/_trust"

Lastly, we create the Trusted Identity Token Issuer that uses the certificate, claim mappings and realm we just created. Additionally, it specifies which claim is the identifier claim – in this example it’s the emailaddress. The following PowerShell creates a new Trusted Identity Token Issuer called “idp SAML Provider”:

$ap = New-SPTrustedIdentityTokenIssuer -Name "idp SAML Provider" -Description "Thinktecture IDP (SAML)" -Realm $realm -ImportTrustCertificate $root -ClaimsMappings $map -SignInUrl "https://idp.bc01.com/account/signin" -IdentifierClaim "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress"

NOTE: the SignInUrl for the command is what redirects the user to the IP-STS to authenticate.

With all that PowerShell configuration complete, we can go ahead and create a new Web application. Be sure to use claims based authentication for the authentication mode and select the new Trusted Identity Provider as the claims authentication type:

[Screenshot: the web application authentication settings]

…also be sure to leave NTLM authentication for the web app enabled.

With this in place we are ready to test authentication.

 

Test Authentication

To test authentication we need to ensure the user we access the site with has at least view permissions to the web app (I’m assuming you also created a root site collection in the web app you created in the previous step) and is a member of the IdentityServerUsers role within the identity store.

Access the new web application and when prompted to pick the type of credentials used to logon to the SharePoint site, choose Windows Authentication:

[Screenshot: choosing the authentication provider at sign-in]

NOTE: You now get prompted to choose the authentication provider when you access the site because the web app supports both claims based authentication via our new Trusted Identity Provider and NTLM authentication.

Enter valid Windows credentials for a user who has Site Collection admin privileges. Once logged on, select Site Actions –> Site Permissions –> Members Group (the actual name of the group will depend on the name of your site collection). Now from the ribbon, select New and enter the email address of a user you created in the identity store earlier; again, this user must be a member of the IdentityServerUsers role in the identity store:

[Screenshot: adding the user to the members group]

NOTE: type the user’s email address very carefully, as by default anything you type will be validated as a valid claim (see Claims Provider – http://msdn.microsoft.com/en-us/library/ee535894.aspx – for more details on this default behaviour).

Add the user to the SharePoint group and log off the SharePoint site.

Now we are going to log on again but this time select the trusted identity provider as the authentication type. When we do so we will be redirected to the IP-STS to authenticate:

[Screenshot: the IP-STS sign-in page]

Enter the credentials of a user you entered into your identity store earlier who is a member of the IdentityServerUsers role and press Sign In. The IP-STS will authenticate your credentials and build you a SAML token. The SAML token will be returned to your browser as a cookie, and your browser will be instructed to redirect to the SharePoint site (to the /_trust virtual directory).

This should culminate in you being signed into SharePoint with your claims based identity:

[Screenshot: signed into SharePoint with a claims based identity]

You can verify you are signed into your RP-STS (SharePoint) via claims based authentication by selecting My Settings from your sign-in name:

[Screenshot: the My Settings menu]

When you do this, you’ll see the account name displayed includes the trusted identity provider:

[Screenshot: the account name including the trusted identity provider]

Finally, the codeplex project http://claimsid.codeplex.com/ contains a useful web part you can drop on any web part page that will display the details of the current claim you’re signed in with. In my example, it looks like this:

[Screenshot: the claims viewer web part output]

 

Summary

Hopefully this has given you a few pointers to setting up claims based authentication with SharePoint. I would recommend checking out the Thinktecture codeplex project – it’s a great basis for deploying or developing a custom IP-STS.

Enjoy!

UPDATE: My next article shows how to add additional claims to the SharePoint RP-STS: http://sharepintblog.com/2011/10/26/adding-additional-claims-to-a-trusted-identity-token-issuer/

SharePoint Topology Data Collection Walkthrough

Here’s a walkthrough of how to install, configure and use the great SharePoint Topology Data Collection and Visio SharePoint Network Topology Add-in provided by one very clever AravindKS.

The tool is designed to automatically collect topology data from your SharePoint farm and to automate the creation of Visio diagrams that illustrate this topology via data connected diagrams – very cool!

Timer Job Installation

Step 1.

Download the SharePoint Topology Data Collection and Visio SharePoint Network Topology Add-in

Step 2.

Install the SharePoint Topology Data Collection tool. Note: this tool requires the .NET Framework v4 to be present on the farm.


Once this tool has installed, check Central Admin for the following items. Under Monitoring > Timer Jobs you should see a new option appear on the monitoring home page called ‘SharePoint Topology Data Job Settings’:

[Screenshot: the SharePoint Topology Data Job Settings option]

Additionally, under Monitoring > Review job definitions, you should see a new timer job entitled ‘SharePoint Topology Data Collection Timer Job’:

[Screenshot: the SharePoint Topology Data Collection Timer Job definition]

Step 3.

Before we run the SharePoint Topology Data Collection Timer Job, we need to configure the timer job settings, so head back to Monitoring > Timer Jobs > SharePoint Topology Data Job Settings:

[Screenshot: the SharePoint Topology Data Job Settings page]

From here, pick a Web Application, Site Collection, Site, and List name to store the collected topology data. You don’t have to use central admin like I have done, but you might want to so that access to the topology data is restricted. Additionally, note that the list name entered does not have to already exist; the timer job will create the list the next time it runs.

Step 4.

From Monitoring > Review job definitions, run the SharePoint Topology Data Collection Timer Job. Once this job has run, you should notice a new list has appeared in the location you specified in Step 3, populated with a number of different rows that the timer job has created:

[Screenshot: the list populated with topology data]

In my single server development farm the timer job created 45 rows.

If you see this list and it contains data, then the installation and configuration of the timer job is complete and you can move on to the Visio add-in installation.

Visio Installation and Data Connection

Step 1.

Now that the timer job is installed and working, we can configure the Visio add-in. Install the Visio SharePoint Network Topology Add-in and note the installation directory. Once the installation is complete, navigate to the installation directory and locate the SPNetworkTopology.vsto file:

[Screenshot: the SPNetworkTopology.vsto file in the installation directory]

Open this file and the Microsoft Office Customization Installer will prompt you to accept the package; accept it to continue.

Step 2.

Once the package is installed, start up Visio and check the add-in is installed, File > Options > Add-ins:

[Screenshot: the add-in listed under Visio add-ins]

Step 3.

Create a new Visio diagram from the SharePoint Network Topology template the add-in has enabled. If this template is not immediately available to you, you should find it under the Add-Ins folder under the File > New options:

[Screenshot: the SharePoint Network Topology template]

Step 4.

When you open a diagram that has been created from the SharePoint Network Topology template, a new ribbon will appear called SharePoint Topology:

[Screenshot: the SharePoint Topology ribbon]

We will use this ribbon to create our diagram based on the data collected via the timer job. From the SharePoint Topology ribbon, select the Link Data to SharePoint action, to access the SharePoint Network Topology Data Selector:

[Screenshot: the SharePoint Network Topology Data Selector]

Click Next, and then enter the SharePoint site URL and list name we configured in the timer job settings:

[Screenshot: entering the site URL and list name]

Click Next.

On the Column Selection screen, choose the properties (columns) you would like available in your diagram (I’ve chosen ‘Select All’):

[Screenshot: the Column Selection screen]

Click Next.

Finally, confirm your selections and click Finish:

[Screenshot: confirming the selections]

Diagram Creation

Now that we have completed the Link Data to SharePoint action, our Visio diagram should have a data connection to the SharePoint list that contains our topology data. Now it is time to use this data.

Step 1.

Navigate to the first page that is created for us, ‘SharePoint Network Topology’:

[Screenshot: the SharePoint Network Topology page]

Step 2.

From the ribbon, select Generate Diagram:

[Screenshot: the Generate Diagram action]

Now the magic begins! The add-in will use the topology data collected to automatically generate a diagram of your SharePoint Topology that is data bound to the topology data collected from the timer job. This means that once the diagram has been created you can refresh it at any time to review changes in your topology and its performance – very nice!

Step 3.

Navigate to the Service Details page and repeat Step 2.

Step 4.

The diagrams that were created in steps 2 & 3 for my single server farm are shown below. Obviously in a larger farm these diagrams will contain more servers and more details:

[Screenshots: the generated SharePoint Network Topology and Service Details diagrams]

I’ll publish some more complex diagrams once I’ve run the tool in my test lab.

Step 5.

Now the diagrams are available you can review the diagrams and decorate them with further data visualisations. See Display a data-connected Visio drawing in a SharePoint web part for more details on how to do this and how you can publish this dynamic diagram to SharePoint.

Enjoy!

Quote of the day: Site Availability

This gem was found in the article ‘SharePoint Server 2010 capacity management: Software boundaries and limits’: http://technet.microsoft.com/en-us/library/cc262787.aspx


It says ‘Note: Deleting or creating a site or subsite can significantly affect a site’s availability.’

Just tickled me… Enjoy!