SharePoint Dragons

Nikander & Margriet on SharePoint

Tag Archives: Tools

PowerShell Maxer for SharePoint 2013

Checking capacity boundaries was already made simpler by our Maxer tool. We decided that it would be way easier to port the code to PowerShell (and then add some). This makes it easier for everybody to add or adjust sections according to personal or company liking.

PowerShell Maxer for SharePoint 2013 is a script that checks for capacity planning limits as described in the Planning section of the TechNet Wiki SharePoint 2010 Best Practices overview page.

PowerShell Maxer for SharePoint 2013 can do the following things:

· Checks user limit in groups.

· Checks list item limits.

· Checks site user limits.

· Displays group membership.

· Displays group owners.

· Analyzes all web applications, site collections, and sites in a farm.

· Displays which features have been activated at the farm, web application, site collection, and site level.

· Checks sub site limits.

· Checks site collection limits per content db.

· Displays site collection owner and creation date.

· Lists relevant application pool names.

· Checks user limit in site collections.

To get the idea, we’ve included some code snippets.

Here’s code for counting the number of list items:

function CountLists($currentWeb)
{
    foreach ($currentList in $currentWeb.Lists)
    {
        $sw.WriteLine("There are {0} items in list {1}. A max of 30,000,000 is allowed.", $currentList.ItemCount, $currentList.Title)
    }
}


Code for counting members in a group:

function CountGroups($currentSc)
{
    foreach ($currentGroup in $currentSc.OpenWeb().SiteGroups)
    {
        $sw.WriteLine("Group {0} has {1} users. A max of 5,000 is allowed.", $currentGroup.Name, $currentGroup.Users.Count)
    }
}


Code for counting all users in a site collection:

function CountUsers($currentSc)
{
    foreach ($currentUser in $currentSc.OpenWeb().SiteUsers)
    {
        $sw.WriteLine("User {0} is a member of {1} groups. A max of 5,000 is allowed.", $currentUser.Name, $currentUser.Groups.Count)
    }
}


Code for finding group owners:

function CountOwners($ownersWeb)
{
    foreach ($ownersGroup in $ownersWeb.Groups)
    {
        if ($ownersGroup.Name -like "*Owners*")
        {
            $sw.WriteLine("The following users are a member of the {0} group:", $ownersGroup.Name)
            foreach ($ownerUser in $ownersGroup.Users)
            {
                $sw.WriteLine("User: {0}", $ownerUser.Name)
            }
        }
    }
}




Code for counting sub sites and displaying activated site features:

function Countwebs($currentSc)
{
    foreach ($currentWeb in $currentSc.AllWebs)
    {
        $sw.WriteLine("Analyzing Web site {0}", $currentWeb.Title)
        CountOwners $currentWeb

        $sw.WriteLine("The following features are active at Web scope:")
        $webFeatures = Get-SPFeature | Where-Object { $_.Scope -eq "Web" }
        if ($webFeatures -ne $null)
        {
            foreach ($feature in $webFeatures)
            {
                if ((Get-SPFeature -Web $currentWeb.Url | Where-Object { $_.Id -eq $feature.Id }) -ne $null)
                {
                    $sw.WriteLine("Feature: {0}, Typename {1} with GUID {2} is hidden {3}", $feature.DisplayName, $feature.TypeName, $feature.Id, $feature.Hidden)
                }
            }
        }

        $sw.WriteLine("Web site {0} has {1} sub sites. A max of 2,000 is allowed.", $currentWeb.Title, $currentWeb.Webs.Count)
        CountLists $currentWeb
    }
}



Code for counting site collections in content db’s:

function CountContentDatabases($currentWebApp)
{
    foreach ($currentCd in $currentWebApp.ContentDatabases)
    {
        $sw.WriteLine("Content database {0} has {1} site collections. The maximum supported is 10,000. Recommended is a maximum of 5,000.", $currentCd.Name, $currentCd.Sites.Count)
    }
}

And so on… Get the full PS script via the download link.
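For context, here is how the snippets above could be tied together by a top-level driver loop over the farm. This is a hypothetical sketch, not the actual script: it assumes the $sw report writer is already initialized and that the Count* functions shown above are defined.

```powershell
# Hypothetical driver loop; assumes $sw (the report StreamWriter) is already set up
# and that the Count* functions from the snippets above are defined.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

foreach ($webApp in Get-SPWebApplication) {
    $sw.WriteLine("Analyzing web application {0}", $webApp.Url)
    CountContentDatabases $webApp
    foreach ($currentSc in $webApp.Sites) {
        CountGroups $currentSc
        CountUsers $currentSc
        Countwebs $currentSc
        $currentSc.Dispose()   # SPSite objects must be disposed explicitly
    }
}
```

This only runs on a SharePoint 2013 server with the SharePoint snap-in available.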

Finding Filtdump

The Filtdump tool, if you don’t know it, is a really useful tool from the days of yore for troubleshooting index problems. What’s more, you can do really sneaky things with it, but that’s a topic for another story. We’ve referred to it in one of our golden olden blog posts. Now, it’s not every day that we have to solve index problems, so it had been a long time since we’d used it. We found out it was surprisingly hard to find again, so for our future reference (and anybody else’s): it is available as a part of the Windows Search 3.x SDK.



SPCAF

SharePoint Code Analysis Framework (SPCAF) is currently a free beta tool and remains free until 2013/09/30, but it will eventually become a commercial product. SPCAF currently uses roughly 300 rules dedicated to SharePoint, analyzing compiled code, XML files, user controls (*.ascx files), pages (*.aspx files), and master pages (*.master files). It integrates into Visual Studio and is also available as a separate client application that can be run from the command line. The following Figure shows a screenshot of the client application.


The following Figure shows a dependency graph that understands SharePoint solution files.


SPCAF rules are divided into various categories:

· Correctness. Correctness rules check the SharePoint XML code for syntax errors. This includes checks for all required XML attributes and for correct attribute values and data types. For example, such rules check for a required Id attribute in the <Solution> element, valid GUIDs, and Feature folder and file names that don’t contain spaces.

· Security. Checks if solutions pose security issues. For example, such rules check for calls to SPWeb.AllowUnsafeUpdates and WindowsIdentity.Impersonate(), running with elevated privileges, specific CAS policy settings, and the presence of a form digest control in *.aspx pages.

· SharePoint Supportability. Checks if solutions endanger the supportability of SharePoint. For example, such rules check for attempts to change system files, accessing the SharePoint API via reflection, reading the content database connection string, and querying SharePoint databases directly.

· Design. Warnings that support proper library design. For example, such rules check for presence of assembly file version number, hard coded URLs, and programmatically created content types.

· Best Practices. Rules that warn if best practices are not followed. For example, such rules check for direct calls to the Items collection of SPList, check if locking is used when storing objects in the SharePoint cache, and flag instantiating new list, site, and/or web objects in event receivers.

· Deployment. The deployment process of SharePoint customizations is often critical. Deploying the wrong way or deploying the wrong files can harm the SharePoint farm or make it inaccessible. Deployment rules check the code for these risks and potential problems. For example, such rules check for global deployments, web server resets during deployment, assemblies deployed in Debug mode, and deployment of web services to the SharePoint LAYOUTS folder.

· Localization. Localization is the process of customizing an application, webpage, or website for a given culture or locale. The localization rules check if all attributes in XML which support localization are localized in a proper way. For example, such rules check that localizable attributes (such as display names) use resources and more.

· Naming. Checks files and artifacts for violations of naming conventions. For example, such rules check for valid namespaces and for web template names that start with ‘WEBTEMP’.

· Customization. Rules that check for violations of SharePoint customization guidelines. For example, such rules check for the presence of HTTP handlers and/or modules, timer jobs, event receivers, and inline code in .aspx pages.

· Sandboxed compatibility. Checks whether files and artifacts are compatible with sandboxed solution requirements. For example, such rules check for the presence of the APTCA attribute, references to .NET assemblies that are unavailable within the sandbox, and HideCustomAction elements.

· SharePoint 2007 compatibility. Checks whether files and artifacts are compatible with SharePoint 2007. For example, such rules check for references to the correct assemblies.

· SharePoint 2010 compatibility. Checks whether files and artifacts are compatible with SharePoint 2010. For example, such rules check for references to the correct assemblies and for deprecated API calls.

· SharePoint 2013 compatibility. Checks whether files and artifacts are compatible with SharePoint 2013. For example, such rules check for references to the correct assemblies, the .NET 4.5 target framework, and deprecated API calls.

SPCAF promises to become one of the most powerful tools for analyzing custom SharePoint solutions; we’ll keep our eyes on it!

World Clock

Sometimes, you really need to see what time it is at any place in the world. This is our favorite web site for doing just that.

The Migration Dragon for SharePoint 2013

The Migration Dragon for SharePoint 2013 is a tool that we’ve built to help migrate file and folder structures from the file system to SharePoint 2013 Document Libraries, leveraging the batching mechanism of the SharePoint managed client object model. You can get it via the download link.
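To illustrate the batching idea (this is not the tool’s actual code): with the managed client object model you queue operations on the client and send them to the server in a single round trip with one ExecuteQuery() call. A rough PowerShell sketch; the assembly paths, site URL, and folder paths are hypothetical placeholders.

```powershell
# Hypothetical sketch of CSOM batching; paths and URLs are placeholders.
Add-Type -Path "c:\lib\Microsoft.SharePoint.Client.dll"
Add-Type -Path "c:\lib\Microsoft.SharePoint.Client.Runtime.dll"

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("http://sharepoint/sites/demo")
$folder = $ctx.Web.GetFolderByServerRelativeUrl("/sites/demo/Shared Documents")

# Queue several uploads client-side...
foreach ($file in Get-ChildItem "c:\migrate" -File) {
    $info = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $info.Url = $file.Name
    $info.Content = [System.IO.File]::ReadAllBytes($file.FullName)
    $info.Overwrite = $true
    [void]$folder.Files.Add($info)
}

# ...and send them to the server in a single round trip.
$ctx.ExecuteQuery()
```

Batching uploads like this cuts down the number of server round trips considerably compared to one request per file.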

Maxer for SharePoint 2013

We’ve created Maxer for SharePoint 2013, a command-line tool that checks for capacity planning limits. You can get it via the download link.

Fishbone: Waiting for CKSDev

We’re writing this in advance, so it’s possible that a version of CKSDev for the combination of VS.NET 2012/SharePoint 2013 has been released by now, but in the meantime, check out the Fishbone Systems VS.NET 2012 extension: it allows you to copy to the SharePoint root, perform an iisreset, reset the SharePoint Timer Service, and copy DLLs to the GAC. Maybe not as cool as CKSDev, but still very useful!

NDepend Review

It has been quite some time since we looked at the NDepend tool. We remember reading a Robert C. Martin book and being enthusiastic about his ideas on assembly dependencies. Back then, we were pleasantly surprised to find that there was a tool based on those ideas, and we did some experimenting with it, but for some reason the tool didn’t quite fit into our development process.

Recently, we took another look at the tool and were pleasantly surprised by the way it has evolved. It has been designed to participate in Continuous Integration (CI) scenarios, has become a full-blown code analysis tool with nice reporting capabilities, and the documentation and ease of use are top notch. The feature we like best of all is how little effort it takes to create your own code rules, something we have really wanted for some time and found to be lacking in the standard Visual Studio static code analysis features.

Let’s take a moment to discuss some of the other features…

  • There’s a visual tool that allows you to inspect NDepend projects (which analyze your own VS projects) without having to start Visual Studio separately.
  • There’s a console tool that is ideal for CI purposes.
  • There’s a VS add-in (which also supports VS 2012) allowing easy access to NDepend features.
  • The Queries and Rules explorer performs static code analysis of your code.
  • The tool ships with nice reporting capabilities.
  • You can run Linq queries live at design time to inspect the code base you’re working with.
  • The Dependency Graph is impressive.
  • Great in-context info: for example, you can use NDepend to find out which methods are using the current method.
  • NDepend allows you to diff separate versions of source files and highlights the differences.
  • Of course, the unique Robert C. Martin package dependency report isn’t missing, and luckily our PressurePoint is doing wonderfully.

The tool is packed with handy features.

As said before, our favorite feature is the ability to quickly create new code rules. You can do this with just a bit of Linq code, learning from the shipped examples. For instance, the next code rule checks for methods that are considered too big:

// <Name>Methods too big</Name>
warnif count > 0 from m in JustMyCode.Methods where
  m.NbLinesOfCode > 30 ||
  m.NbILInstructions > 200
orderby m.NbLinesOfCode descending,
        m.NbILInstructions descending
select new { m, m.NbLinesOfCode, m.NbILInstructions }

// Methods where NbLinesOfCode > 30 or NbILInstructions > 200
// are extremely complex and should be split into smaller methods.
// See the definition of the NbLinesOfCode metric in the NDepend documentation.


This rule checks method access modifiers:

// <Name>Methods that could have a lower visibility</Name>
warnif count > 0 from m in JustMyCode.Methods where
  m.Visibility != m.OptimalVisibility &&
  !m.HasAttribute("NDepend.Attributes.CannotDecreaseVisibilityAttribute".AllowNoMatch()) &&
  !m.HasAttribute("NDepend.Attributes.IsNotDeadCodeAttribute".AllowNoMatch()) &&
  // If you don't want to link NDepend.API.dll, you can use your own attributes and adapt this rule.

  // Eliminate default constructors from the result.
  // Whatever the visibility of the declaring class,
  // default constructors are public and introduce noise
  // in the current rule.
  !(m.IsConstructor && m.IsPublic && m.NbParameters == 0) &&

  // Don't decrease the visibility of Main() methods.
  !m.IsEntryPoint

select new { m,
             m.Visibility,
             CouldBeDeclared = m.OptimalVisibility,
             m.MethodsCallingMe }

Finally, an easy way to implement code rules specific to our own projects. All in all, we’re happy with our brand new toy! If you want to take it for a test spin, you can get it from the NDepend web site.

Working with .BLG files

When doing performance analysis, we routinely record .blg files using perfmon to inspect the various servers involved in the SharePoint farm. We were looking for a library or piece of .NET code to read such .blg files, but eventually came to the conclusion that there are only two workable options: reading them via PowerShell’s Import-Counter cmdlet, or converting them to CSV first with Export-Counter:

$result = Import-Counter .\sample.blg
$result | Export-Counter -Path .\output.csv -FileFormat csv

For more info, check out the documentation on SharePoint-related perf counters and the related Get-Counter cmdlet.
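If you still need to record the counters in the first place, the same cmdlet family can capture samples and write a .blg file. A small sketch; the counter set and timings below are just example values, not a recommendation:

```powershell
# Capture two generic counters every 5 seconds, 12 samples, and save as .blg.
$counters = '\Processor(_Total)\% Processor Time', '\Memory\Available MBytes'
Get-Counter -Counter $counters -SampleInterval 5 -MaxSamples 12 |
    Export-Counter -Path .\capture.blg -FileFormat blg
```

The resulting capture.blg can be opened in perfmon or read back with Import-Counter as shown above.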

It’s the SharePoint Flavored Weblog Reader again!


We’ve released SFWR v1.3.

What is it?

“The IIS logs are an invaluable way to get to know your web application and your end users once it’s in production. Therefore, having a tool to analyze IIS logs is an invaluable asset in your bag of tricks. Especially if this tool has a certain SharePoint flavor added to it…”

What’s the update?

“Not everybody has the same IIS logging settings, which is why SFWR uses the DLR to support this. Nevertheless, certain reports threw exceptions and caused processing to halt because they expected certain data to be present (mainly bytes sent and bytes received). v1.3 uses a little Aspect Oriented Programming (AOP) to make sure all report calculation errors are handled.”
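SFWR’s internals aren’t shown here, but the idea behind that fix is easy to sketch: wrap every report calculation in one error-handling aspect, so a single failing report logs a warning instead of halting the whole run. A hypothetical PowerShell version of the idea (the function and report names are made up for illustration):

```powershell
# Hypothetical sketch of the 'handle every report error in one place' idea.
function Invoke-ReportSafely {
    param([hashtable]$Reports)   # report name -> scriptblock
    foreach ($name in $Reports.Keys) {
        try {
            & $Reports[$name]
        }
        catch {
            # One report failing no longer halts the others.
            Write-Warning ("Report '{0}' failed: {1}" -f $name, $_.Exception.Message)
        }
    }
}

Invoke-ReportSafely @{
    'BytesSent' = { throw "bytes sent column missing" }   # simulated failing report
    'Hits'      = { Write-Output "Hits report done" }
}
```

The real SFWR applies this cross-cutting concern via AOP on compiled .NET code rather than scriptblocks, but the effect is the same.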

Where to get it?