SharePoint Dragons

Nikander & Margriet on SharePoint


SharePoint Debugging: Not without a trace

We’re always quite interested to see how other people try to solve SharePoint issues, and thought it would be interesting to share a recent experience with MS support. In a case where list items got corrupted after a migration, MS support was interested in the following:

– An HTTP trace retrieved via Fiddler taken while the issue is reproduced via the browser.

– Relevant ULS log files.

– A memory dump of the SharePoint process, captured via tttracer while the issue is reproduced.

To us, the latter choice is the most interesting one. Tttracer.exe refers to the Microsoft Time Travel Tracing Tool (see http://www.thewindowsclub.com/microsoft-time-travel-tracing-diagnostic), a diagnostic tool that captures trace info and extends the WinDbg tool so it can load such trace files for further analysis. Tttracer allows you to select one or more specific processes on your computer and collects info about them. At a later time, MS support can use such trace files to go back and forth in time and diagnose SharePoint processes before, during, and after issues.

Unfortunately, tttracer is not available outside Microsoft, so it’s of no immediate use to us. However, there are some steps in the trace capturing process that are good practices to follow anyway, such as:

1. If you’re interested in doing a memory dump, isolate a WFE that will be used for testing the issue.

2. If you’re interested in doing a memory dump, edit the hosts file on that WFE to ensure all SharePoint URL calls are directed to the WFE, and not to a load balancer.

3. Set ULS logging to verbose and put that info in a separate log file (via Set-SPLogLevel -TraceSeverity VerboseEx -EventSeverity Verbose and New-SPLogFile; see the PowerShell sketch after this list).

4. Reset IIS.

5. Reproduce the issue.

6. If you’re interested in doing a memory dump, find the process id of the application pool that hosts the SharePoint site where the issue occurs (by executing “%windir%\system32\inetsrv\appcmd list wps” on a command prompt).

7. Reproduce the issue.

8. Analyze all the info you retrieved.
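
To make steps 3, 4, and 6 easier to repeat, here’s a minimal PowerShell sketch (our own addition, not part of the MS support instructions). It assumes you run it in an elevated SharePoint 2010 Management Shell on the isolated WFE:

# Minimal sketch covering steps 3, 4, and 6 on the isolated WFE.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Step 3: set ULS logging to verbose and start a fresh log file.
Set-SPLogLevel -TraceSeverity VerboseEx -EventSeverity Verbose
New-SPLogFile

# Step 4: reset IIS.
iisreset

# Step 6: list the worker processes to match the application pool name to a process id.
& "$env:windir\system32\inetsrv\appcmd.exe" list wps

# When you're done, return ULS logging to its default levels:
# Clear-SPLogLevel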

We suspect your own troubleshooting may not be that different, and most likely will be more extensive than this, but for sure it won’t hurt to compare notes!

Solving problems with ULS log file generation

If the ULS log files remain empty (0 KB), take the following steps to resolve the issue:

  • Make sure there is enough free space on the drive that holds the log files.
  • Check the timer and tracing service account information; the tracing service must run as Local Service.
  • Reset the timer service account password and restart the tracing service (see the PowerShell sketch below).
  • Check the Diagnostic logging settings in Central Administration; you can refer to http://technet.microsoft.com/en-us/library/ee748656.aspx
  • Check the permissions on the /12/LOGS folder.
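
As a quick starting point for the service-related checks, here’s a hedged PowerShell sketch (run in an elevated SharePoint 2010 Management Shell; SPTraceV4 is the service name of the SharePoint Tracing Service):

# Check which account runs the tracing service; it should be Local Service.
Get-WmiObject Win32_Service -Filter "Name='SPTraceV4'" | Select-Object Name, StartName, State

# Restart the tracing service.
Restart-Service SPTraceV4

# Show where SharePoint expects to write the ULS logs, so you can check space and permissions on that path.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
(Get-SPDiagnosticConfig).LogLocation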

And these blogs may help you:

http://sharepointlearningcurve.blogspot.com/2010/04/sharepoint-2010-uls-problems-logs-are.html

http://www.eiben.weite-welt.com/2011/08/no-uls-logging-in-sharepoint-2010/

Taken from forum thread: http://social.technet.microsoft.com/Forums/en-US/sharepoint2010general/thread/e327f833-5e43-42d1-a330-f2d16c41106a

Check the following Wiki page for updates: http://social.technet.microsoft.com/wiki/contents/articles/9340.sharepoint-2010-solving-problems-with-uls-log-file-generation.aspx

The SharePoint Flavored Weblog Reader (SFWR)

In a previous gallery post, driven by our interest in the performance of SharePoint environments, we explored which set of performance counters you should use for monitoring SharePoint WFEs and database servers: http://gallery.technet.microsoft.com/PowerShell-script-for-59cf3f70 . This time, we’ll look at another aspect closely tied to this topic: IIS logs.

The IIS logs are an invaluable way to get to know your web application and your end users once it’s in production. For us, they are also the first stop when a web application has performance problems. Therefore, having a tool to analyze IIS logs is an invaluable asset in our bag of tricks. There are numerous commercial tools out there that do a reasonable job of analyzing IIS logs, often providing (more or less) great visual displays while doing it. There are some things wrong with commercial tools, though:

  • They cost money (duh!).
  • We want to be able to add specific queries in a language familiar to us, should the need arise; in our case, that means adding additional queries in Linq.
  • Usually these tools don’t have any intrinsic knowledge about SharePoint.

Since we came to the conclusion that a tool that helps to analyze IIS log files is essential, and that we weren’t fully happy with the existing options, we decided to build such a tool ourselves, which primarily gives us full control when it comes to adding new overviews.

We call the tool the SharePoint Flavored WebLog Reader (sfwr.exe) and it can be used to analyze any IIS log file or batch of IIS log files. On top of that, it has specific knowledge about SharePoint, which adds a SharePoint flavor to the tool in the form of specific overviews that only make sense within a SharePoint context.

The SharePoint Flavored WebLog Reader has the following advantages:

  • It’s free.
  • It’s easy to use.
  • It contains a considerable amount of overviews.
  • It has a certain SharePoint flavor to it (because it also includes overviews specifically targeted towards SharePoint).
  • It leverages parallel programming techniques and is therefore pretty fast at calculating the various reports.

There are some drawbacks as well:

  • It’s not a rich tool visually.
  • Support is limited. That said, if you run into problems, we’d be interested in the IIS log files that cause them, and we’d probably look into them and improve the tool.
  • At this point, you won’t be able to extend the tool yourself. We’d sure be interested in hearing requests for new overviews, though, and we’d probably add them too.
  • It supports a little over (and is tested using) 2.3 million log entries. Beyond that, you’ll have to divide the batch of log files into smaller pieces.

So, what can it do?

The SharePoint Flavored WebLog Reader provides the following overviews:

  • The average request time per URI.
  • The max request time per URI.
  • The min request time per URI.
  • The average request time per InfoPath URI.
  • The max request time per InfoPath URI.
  • The min request time per InfoPath URI.
  • The average request time per Report Server URI.
  • The max request time per Report Server URI.
  • The min request time per Report Server URI.
  • Browser percentage.
  • Dead links.
  • Failed pages.
  • Failed InfoPath pages.
  • Most busy days of the week.
  • Most requested pages.
  • Requested pages per day.
  • Percentage error page requests.
  • Requests per hour per day.
  • Requests per hour.
  • Requests per user.
  • Requests per user per month.
  • Requests per user per week.
  • Slowest requests.
  • Slowest failed requests.
  • Slowest successful requests.
  • Slowest requests per URI.
  • Top requests per hour.
  • Top visitors.
  • Traffic per day in MB.
  • Traffic per week in MB.
  • Unique visitors.
  • Unique visitors per day.
  • Unique visitors per week.
  • Unique visitors per month.

Where can I get it?

You can download it at the TechNet Gallery: http://gallery.technet.microsoft.com/The-SharePoint-Flavored-5b03f323

How do I use it?

Open a command prompt, navigate to the folder where sfwr is stored, and type:

sfwr.exe [items] [log path]

For example:

sfwr.exe 100 "c:\temp\logs"

What do I need to remember about the tool?

  • It requires the presence of the .NET 4 framework.
  • Depending on the IIS log settings, you may not be able to see all reports. For example, if you don’t record the bytes sent and bytes received, you can’t see the Traffic per day and per week in MB overviews (a sketch for re-enabling these fields follows this list). If you use the default IIS settings, you’ll be able to see every overview.
  • It outputs the results in a file called Overviews.txt. The file is saved in the same directory as sfwr.exe.
  • The sfwr tool only processes files that have an extension of .log, and ignores all other extensions.
  • The max limit of log items that has been tested is 2.3 million. If you cross that boundary, you might run into memory exceptions (don’t worry, sfwr displays the current count for you, so you can see if and where you cross the line). This seems to be caused by our extensive use of the Dynamic Language Runtime (DLR), but for us there wasn’t really a need to research this issue and see if we could push the tool to even higher numbers.
  • The memory structure holding the IIS log entries is preallocated at 2.4 million items. Beyond that, memory reallocation may cause a time delay and possibly additional memory issues.
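
On the IIS log settings: if the bytes sent/received fields are switched off, you can switch them back on. The following is just a sketch using the WebAdministration module; it changes the server-wide site defaults (individual sites may override this), and you should verify the field list against your own requirements:

# Sketch: add BytesSent/BytesRecv to the W3C logging fields in the server-wide site defaults.
Import-Module WebAdministration

Set-WebConfigurationProperty -PSPath "MACHINE/WEBROOT/APPHOST" `
    -Filter "system.applicationHost/sites/siteDefaults/logFile" `
    -Name "logExtFileFlags" `
    -Value "Date,Time,ClientIP,UserName,ServerIP,Method,UriStem,UriQuery,HttpStatus,Win32Status,BytesSent,BytesRecv,TimeTaken,UserAgent,Referer"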

How does it do it?

Since we realize that every environment can have different IIS logging settings, we didn’t want to depend on a predefined log structure. Instead, we use the header that every log file has to determine the structure, and use the Dynamic Language Runtime (DLR) ExpandoObject to create that structure dynamically.

Please note: This flexibility came at a cost. It seems that the extensive use of millions of expando objects causes the limit of (a little over) 2.3 million log entry items. This doesn’t seem to happen in a version that uses predefined structures. However, we feel this implementation has superior flexibility and allows us to generate overviews on a generate-if-possible basis and is therefore more generically applicable. Besides, 2.3 million items is a huge amount, so we decided to stick to the DLR approach.

At a high level, we do this:

  • Determine the structure of the IIS log file and store that in memory (in lineEntries).
  • Create an expando object that represents a log entry and add all properties in the IIS log file to it.
  • Assign values found in the log entry to the Expando object.

The code goes like this:

dynamic exp = new ExpandoObject();
for (int i = 0; i < lineEntries.Length; i++)
{
    IDictionary<string, object> dict = exp;

    // If you were wondering at the strange guard clause below,
    // VS.NET Code Contracts made us do it!!!
    if (dict == null) throw new SfwrException("No dictionary");

    if (PropertyNames.Count() != lineEntries.Count()) throw new SfwrException("Property names are different from line entries");
    dict[PropertyNames[i]] = lineEntries[i];
}

W3CImporter.Log.Add(exp);

Later on, we use these Expando objects to generate our overviews. The cool thing about Expando objects is that you can use the dictionary keys as actual property names. So, the following is perfectly valid:

dynamic exp = new ExpandoObject();

IDictionary<string, object> dict = exp;

dict["Test"] = "Hello Expando!";

Console.WriteLine(exp.Test);

This is a pretty cool feature that really helps us out in this tool. Now that we have a way to create a full-blown DTO object with properties on it, we can use these objects to create overviews via Linq (or Plinq). Some of these queries are pretty straightforward; others are niftier, such as the next one, which creates an overview of the number of unique visitors per week:

DateTimeFormatInfo dfi = DateTimeFormatInfo.CurrentInfo;
Calendar cal = dfi.Calendar;

var result = from log in Logs
               group log by new { Week = cal.GetWeekOfYear(log.CurrentDateTime, dfi.CalendarWeekRule, dfi.FirstDayOfWeek) } into grouped
               select new { Week = grouped.Key.Week, Visitors = grouped.Select(x => x.UserName).Distinct().Count() };

Anyway, this is the high-level overview of how we do it. With this infrastructure in place, it’s quite easy to extend the number of available overviews. We feel we’ve covered the most important ones, but we need your help to come up with ideas for new overviews and features.

Support?

Most importantly? Use the tool to your benefit and provide feedback! Contact us at margriet at loisandclark dot eu if you have questions or requests.

Using the Entity Framework to see the contents of the SharePoint logging database

In this post, we will discuss how to use Entity Framework 4.0 to make it really easy to take a look at the contents of the SharePoint logging database. As an example, we’ll look at the performance counters stored in it. To get everything set up and running and to see what we’re going to accomplish, first read:

https://sharepointdragons.com/2011/11/16/leveraging-the-logging-database-to-see-performance-counters/

As you’ve learned from the aforementioned post, we’re going to use the PerformanceCounter view (which combines data from the tables PerformanceCounters_Partition0 to PerformanceCounters_Partition31, each table representing a different day with a total history of 32 days) in combination with the PerformanceCountersDefinitions table to inspect the performance counters. The PerformanceCountersDefinitions table doesn’t have a primary key which, in this case, makes it impossible for the Entity Framework to add it to an Entity Data Model.

Since the WSS logging database is intended for direct use (as opposed to the SharePoint config and content databases), we’ve taken the liberty of promoting the Id column in the PerformanceCountersDefinitions table to a primary key. By default, saving this change isn’t permitted by the SQL Server table designer, so you first have to change a designer option (or apply the change in T-SQL, as sketched after the link below):

http://www.bidn.com/blogs/BrianKnight/ssis/52/sql-server-2008-designer-behavior-change-saving-changes-not-permitted-1
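
Alternatively, you can skip the designer and promote the column with plain T-SQL. The following is just a sketch (run here via Invoke-Sqlcmd; adjust the server name to your environment, and note it assumes the Id column is non-nullable and the table doesn’t already have a clustered index):

# Sketch: add a primary key to the Id column of PerformanceCountersDefinitions with T-SQL
# instead of changing the designer option. Assumes Id is non-nullable and unique.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "WSS_Logging" -Query @"
ALTER TABLE dbo.PerformanceCountersDefinitions
    ADD CONSTRAINT PK_PerformanceCountersDefinitions PRIMARY KEY CLUSTERED (Id);
"@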

Now, in VS 2010, create a new class library (name it WSS.Logging.Model, or something) and do the following:

  1. Add > New Item.
  2. Choose ADO.NET Entity Data Model.
  3. Click Add.
  4. Choose Generate from database.
  5. Click Next.
  6. Make a connection to the WSS logging database (for us, its name is WSS_Logging).
  7. Click Next.
  8. Select the PerformanceCountersDefinitions table and the PerformanceCounter view.
  9. Click Finish.
  10. Create a new client (for example, a Windows application).
  11. Add a project reference to the WSS.Logging.Model class library.
  12. Copy the <connectionStrings> section from the WSS.Logging.Model class library and add it to the config file of your client app.
  13. Import the WSS.Logging.Model namespace in the client app and add the following code to the Main() method:

using (WSS_LoggingEntities1 ctx = new WSS_LoggingEntities1())
{
  var counters = from p in ctx.PerformanceCounters
                 join d in ctx.PerformanceCountersDefinitions on p.CounterId equals d.Id
                 select new
                 {
                   p.LogTime,
                   p.MachineName,
                   p.Value,
                   d.Counter,
                   d.Category,
                   d.Instance
                 };

  // Materialize the query so the anonymous types can be bound to the grid.
  grd.DataSource = counters.ToList();
}