SharePoint Dragons

Nikander & Margriet on SharePoint

The scientific methodology for troubleshooting SharePoint

A methodology is a set of methods, rules or ideas that are important in a science or art. The scientific methodology for troubleshooting SharePoint is a set of ideas important for troubleshooting SharePoint issues based on the traditional scientific method for problem solving. This methodology comprises 7 steps, each step containing several ideas.

1.      Define the problem


  • Talk to the person who reported the issue.
  • If possible, talk to other people experiencing the same problem. They may approach it from different angles, giving you new insights, or they may be able to provide more accurate details (such as: how long has the issue existed? When did it start?)
  • How long has the problem existed?
  • How bad is it (what’s the priority)?
  • Identify the parties (and contact persons) that may suffer from either the problem or from the troubleshooting of it (business people, companies, departments). Also identify parties that may play a role in fixing the issue (DBA, network admin, 3rd party vendor, etc.).
  • Forget what you think you know, don’t assume much and doubt everything.
  • Most important of all: forget the pressure and try to free yourself from emotions. This will greatly improve your chances of solving the problem.

Core tasks:

  • Determine what the expected behavior is.
  • Determine what is really happening.
  • Take the trouble to describe the difference between the two scenarios.

2.      Do research


  • Know your environment, know relevant code.
  • Read literature.
  • Post a forum question.
  • Discuss the problem with others.
  • Tell it to a rubber duck (which forces you to explain the problem thereby helping you to gain insights).
  • If you don’t make any progress within the hour, talk to others.
  • Contact other parties that can help you to get more insight into the problem.
  • Read the error message with attention, preferably aloud.
  • Collect data:
    • ULS logs
    • Event viewer general info
    • Event viewer SharePoint application logs
    • Health log database
    • Temporarily set monitoring logging to verbose
    • Turn on ASP.NET debugging settings
    • Decode add-in access token
    • Turn on developer dashboard
    • Debug issue (client-side JavaScript, C#, PS)
    • Fiddler trace logs
    • IIS logs (inc. request tracing)
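
As a tiny illustration of the data-collection step, pulling every log line that mentions a given correlation ID is often the quickest first move. Here is a minimal sketch (the log directory layout and the correlation ID are assumptions for illustration, not specifics from any particular farm):

```python
import glob

def find_correlation(log_dir, correlation_id):
    """Scan every .log file in log_dir (e.g. the ULS log folder)
    for lines that mention the given correlation ID."""
    hits = []
    for path in sorted(glob.glob(log_dir + "/*.log")):
        with open(path, errors="ignore") as f:
            for line in f:
                if correlation_id in line:
                    hits.append((path, line.rstrip()))
    return hits
```

The same idea works for IIS logs or event viewer exports; the point is to get all related lines in one place before forming a hypothesis.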

Core tasks:

  • Reproduce the problem. Preferably in an environment where you can experiment safely.
  • If there is a working environment and an environment with a problem: keep calm, methodically check what is different and the solution will come!



3.      Establish a hypothesis about the problem

Core tasks:

  • Ensure the hypothesis is testable.

4.      Design the experiment


  • Contact all parties that are impacted by the experiment (typically business).
  • Test hypothesis.
  • Run automated tests.
  • Comment out code that is not needed to test the hypothesis.
  • Add break points.
  • Limit the number of variables you’re testing in a single experiment.
  • Try something funny (this probably won’t solve the issue, but let’s try it anyway).
  • Eliminate most likely causes, such as:
    • Recent code.
    • Custom code instead of MS code.
  • Do “half-splitting”: eliminate the problem by splitting it in half (comment code, determine if the problem is client-side or server-side, determine if the problem is environment specific and so on).
  • Don’t rush. Now is an easy time to break more things. Take your time, because often you need to temporarily break something in order to fix it.
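
The half-splitting bullet above is really just binary search over a set of suspects. A minimal sketch of the idea (the `is_broken` check is a hypothetical stand-in for whatever experiment distinguishes the working case from the broken one):

```python
def half_split(suspects, is_broken):
    """Binary-search a list of suspects (recent changes, commented-out
    code blocks, config entries) for the one causing the problem.
    Assumes exactly one suspect is the culprit and that enabling a
    subset containing it makes is_broken(subset) return True."""
    lo, hi = 0, len(suspects)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        # Test only the first half of the remaining suspects.
        if is_broken(suspects[lo:mid]):
            hi = mid
        else:
            lo = mid
    return suspects[lo]
```

Each experiment halves the search space, so even dozens of suspects need only a handful of tests.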

Core tasks:

  • Execute the experiment.
  • Take rudimentary notes.
  • Try variations.

5.      Gather data

Core tasks:

  • Establish the result of the experiment.
  • Step back and disconnect from your emotions while you’re observing what is happening.
  • Gather log files.
  • Read error messages carefully. We repeat: Read error messages carefully.

6.      Analyze results

Core tasks:

  • Is the problem solved?
  • Did you learn anything? If not, the whole troubleshooting experience was pretty useless. Remember, even in the unfortunate event that you didn’t solve the problem, there is still value in what you learned from the experience.
  • Do you understand why the problem was solved?
  • Keep notes.
  • Add automated tests that check for the issue.
  • Document failed experiments.

7.      Draw conclusions


  • Enjoy successfully solving a problem.
  • Harder to do: enjoy progress.
  • Write about it (blog?).
  • Update documentation.

Core tasks:

  • Determine the next step:
    • The problem is fixed.
    • You need to revisit one of the previous steps.
    • You need to get help from others (MS support, external help, another department, colleagues).


Over the course of time, we’ve used lots of sources to improve our bag of tricks. Unfortunately, we didn’t keep track of them. So, if you feel you should have been credited but weren’t, we’re awfully sorry. Contact us and we’ll add you to the references section.

We recommend that you check out:




Use Cases for SharePoint

Microsoft has assembled a list of use cases for SharePoint 2013 that are both easy on the eye and easy on the brain at

The SharePoint Flavored Weblog Reader(SFWR)

In a previous gallery post, driven by our interest in the performance of SharePoint environments, we explored which set of performance counters you should use for monitoring SharePoint WFEs and database servers. This time, we’ll look at another aspect closely tied to this topic: IIS logs.

The IIS logs are an invaluable way to get to know your web application and your end users once it’s in production. For us, they are also the first stop when a web application has performance problems. Therefore, a tool to analyze IIS logs is an invaluable asset in our bag of tricks. There are numerous commercial tools out there that do a reasonable job of analyzing IIS logs, often providing (more or less) great visual displays while doing it. There are some things wrong with commercial tools, though:

  • They cost money (duh!).
  • We want to be able to add specific queries in a language familiar to us, should the need arise; for us, that means being able to add additional queries in Linq.
  • Usually these tools don’t have any intrinsic knowledge about SharePoint.

Since we came to the conclusion that a tool that helps to analyze IIS log files is essential, and since we weren’t fully happy with the existing options, we decided to build such a tool ourselves. That primarily gives us ultimate control when it comes to adding new overviews.

We call the tool the SharePoint Flavored WebLog Reader (sfwr.exe) and it can be used to analyze any IIS log file or batch of IIS log files. On top of that, it has specific knowledge about SharePoint, which adds a SharePoint flavor to the tool in the form of specific overviews that only make sense within a SharePoint context.

The SharePoint Flavored WebLog Reader has the following advantages:

  • It’s free.
  • It’s easy to use.
  • It contains a considerable number of overviews.
  • It has a certain SharePoint flavor to it (because it also includes overviews specifically targeted towards SharePoint).
  • It leverages parallel programming techniques and is therefore pretty fast at calculating the various reports.

There are some drawbacks as well:

  • It’s not a rich tool visually.
  • Support is limited. Although, if you run into problems, we’d be interested in the IIS log files that cause them, and we’d probably be willing to look into them and improve the tool.
  • At this point, you won’t be able to extend the tool. We’d sure be interested in hearing requests for new overviews and we’d probably add them too.
  • It supports a little over 2.3 million log entries (and is tested up to that number). Beyond that, you’ll have to divide the batch of log files into smaller pieces.

So, what can it do?

The SharePoint Flavored WebLog Reader provides the following overviews:

  • The average request time per URI.
  • The max request time per URI.
  • The min request time per URI.
  • The average request time per InfoPath URI.
  • The max request time per InfoPath URI.
  • The min request time per InfoPath URI.
  • The average request time per Report Server URI.
  • The max request time per Report Server URI.
  • The min request time per Report Server URI.
  • Browser percentage.
  • Dead links.
  • Failed pages.
  • Failed InfoPath pages.
  • Most busy days of the week.
  • Most requested pages.
  • Requested pages per day.
  • Percentage error page requests.
  • Requests per hour per day.
  • Requests per hour.
  • Requests per user.
  • Requests per user per month.
  • Requests per user per week.
  • Slowest requests.
  • Slowest failed requests.
  • Slowest successful requests.
  • Slowest requests per URI.
  • Top requests per hour.
  • Top visitors.
  • Traffic per day in MB.
  • Traffic per week in MB.
  • Unique visitors.
  • Unique visitors per day.
  • Unique visitors per week.
  • Unique visitors per month.

Where can I get it?

You can download it at the TechNet Gallery:

How do I use it?

Open a command prompt, navigate to the folder where sfwr is stored, and type:

sfwr.exe [items] [log path]

For example:

sfwr.exe 100 "c:\temp\logs"

What do I need to remember about the tool?

  • It requires the presence of the .NET 4 framework.
  • Depending on the IIS log settings, you may not be able to see all reports. For example, if you don’t record the bytes sent and bytes received, you can’t see the Traffic per day and per week in MB overviews. If you use the default IIS settings, you’ll be able to see every overview.
  • It outputs the results in a file called Overviews.txt. The file is saved in the same directory as sfwr.exe.
  • The sfwr tool only processes files that have an extension of .log, and ignores all other extensions.
  • The max limit of log items that is tested is 2.3 million. If you cross that boundary, you might experience memory exceptions (don’t worry, sfwr displays the current count for you, so you can see if and where you cross the line). This seems to be caused by our extensive use of the Dynamic Language Runtime (DLR), but for us, there wasn’t really a need to research this issue and see if we could boost the tool to even higher numbers.
  • The memory structure holding the IIS log entries is predefined at 2.4 million items. After that, memory reallocation may cause a time delay and maybe additional memory issues.

How does it do it?

Since we realize that everyone could define different IIS logging settings, we didn’t want to predefine a log structure that we depend on. Instead, we use the log header that every log file has to determine the structure, and use the Dynamic Language Runtime (DLR) ExpandoObject to create the structure dynamically.

Please note: This flexibility came at a cost. The extensive use of millions of expando objects seems to cause the limit of (a little over) 2.3 million log entry items; this doesn’t happen in a version that uses predefined structures. However, we feel this implementation has superior flexibility, allows us to generate overviews on a generate-if-possible basis, and is therefore more generically applicable. Besides, 2.3 million items is a huge amount, so we decided to stick with the DLR approach.

At a high level, we do this:

  • Determine the structure of the IIS log file and store that in memory (in lineEntries).
  • Create an expando object that represents a log entry and add all properties in the IIS log file to it.
  • Assign values found in the log entry to the Expando object.

The code goes like this:

dynamic exp = new ExpandoObject();
IDictionary<string, object> dict = exp;

// If you were wondering about the strange guard clauses below,
// VS.NET Code Contracts made us do it!!!
if (dict == null) throw new SfwrException("No dictionary");
if (PropertyNames.Count() != lineEntries.Count()) throw new SfwrException("Property names are different from line entries");

for (int i = 0; i < lineEntries.Length; i++)
{
    dict[PropertyNames[i]] = lineEntries[i];
}
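
For readers outside .NET, the same header-driven trick can be sketched with a plain dict playing the role of the ExpandoObject. This is a rough analogue of the approach, not the tool’s actual code:

```python
def parse_iis_log(lines):
    """Build one dict per log entry, using the #Fields header line
    to determine the structure dynamically (like the ExpandoObject)."""
    fields, entries = None, []
    for line in lines:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]        # property names from the header
        elif line.startswith("#") or not line.strip():
            continue                         # other directives / blank lines
        elif fields:
            values = line.split()
            entries.append(dict(zip(fields, values)))
    return entries
```

Because the structure comes from the header, the parser keeps working no matter which logging fields the administrator enabled in IIS.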


Later on, we use these Expando objects to generate our overviews. The cool thing about Expando objects is that you can use the dictionary keys as actual property names. So, the following is perfectly valid:

dynamic exp = new ExpandoObject();

IDictionary<string, object> dict = exp;

dict["Test"] = "Hello Expando!";


This is a pretty cool feature that really helps us out in this tool. Now that we have a way to create a full-blown DTO object with properties on it, we can use these objects to create overviews via Linq (or Plinq). Some of these queries are pretty straightforward; others are niftier, such as the next one, which creates an overview of the number of unique visitors per week:

DateTimeFormatInfo dfi = DateTimeFormatInfo.CurrentInfo;
Calendar cal = dfi.Calendar;

var result = from log in Logs
               group log by new { Week = cal.GetWeekOfYear(log.CurrentDateTime, dfi.CalendarWeekRule, dfi.FirstDayOfWeek) } into grouped
               select new { Week = grouped.Key.Week, Visitors = grouped.Select(x => x.UserName).Distinct().Count() };
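
Translated out of Linq, the same grouping can be sketched in Python. Note that `isocalendar()` week numbers are an assumption standing in for GetWeekOfYear, so exact week boundaries may differ from the tool’s output:

```python
from collections import defaultdict
from datetime import date

def unique_visitors_per_week(logs):
    """logs: iterable of (date, user_name) pairs.
    Returns {iso_week_number: count of distinct users that week}."""
    users_per_week = defaultdict(set)
    for day, user in logs:
        week = day.isocalendar()[1]   # ISO week of the year
        users_per_week[week].add(user)
    return {week: len(users) for week, users in users_per_week.items()}
```

The group-then-distinct-count shape is identical to the Linq query: group by week, then count distinct user names within each group.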

Anyway, this is the high-level overview of how we do it. With this infrastructure in place, it’s going to be quite easy to extend the number of available overviews. We feel we’ve covered the most important ones, but we need your help to come up with ideas for new overviews and features.


Most importantly? Use the tool to your benefit and provide feedback! Contact us at margriet at loisandclark dot eu if you have questions or requests.

Stuff related to BCS in SharePoint 2010

How’s that for an inspired title? In this article we’ll be exploring various topics related to BCS in SPS 2010. We’ll be discussing:

  • Types of external content types,
  • The Bigint problem,
  • Inspecting the XML of the BDC Metadata Model,
  • BCS terminology,
  • Creating a custom .NET assembly connector,
  • Debugging an ECT built on the custom .NET assembly connector,
  • Creating a data access layer using the Entity Framework 4,
  • Consuming (and building) a WCF 4 service,
  • Deployment,
  • Get IntelliSense support for BDC models,
  • Standard search support,
  • Creating a profile page,
  • Consuming a SQL Server table directly,
  • ECT Associations,
  • Secure Store support.

If you are suspecting, judging from the list of topics, that this is going to be a long article, you’d be absolutely right.

Different types of External content types

Creating a BCS external content type and connecting to a SQL Server database using SharePoint Designer isn’t particularly hard to do. If we were living in a video game, this would be level 1 of the game. Which game you ask? Why, Donkey Kong of course!

Besides creating a BCS ECT for SQL Server, you also have the option to create ECTs for .NET assemblies and for .asmx or WCF services. In this article, you will see all three types in action.

The Bigint scenario

There is one important thing to note, though. When it comes to SQL Server, we absolutely love to use primary keys of the Bigint datatype. Unfortunately, in BCS they get converted to System.Int64, and this data type is not supported in external lists. We’ve seen lots of places where people advise avoiding this datatype in BCS scenarios, but how can you, if you have to work with an existing database? If you happen to be working on one of our legacy applications, you’re bound to run into this problem!

Now, the max value of a Bigint is 9,223,372,036,854,775,807, while the max value of System.Int32 is 2,147,483,647. So, if the max id of your table falls well within the System.Int32 range, you can build a .NET assembly connector or a WCF service wrapper to circumvent this problem and handle the conversion of Bigints to System.Int32 and vice versa.
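
The core of such a wrapper is nothing more than a range check before narrowing the type. Sketched here in Python with the SQL Server limits (in the real connector this would be a System.Int64 to System.Int32 conversion; the function name is ours):

```python
INT32_MAX = 2_147_483_647                 # max System.Int32
INT32_MIN = -2_147_483_648
BIGINT_MAX = 9_223_372_036_854_775_807    # max SQL Server Bigint / System.Int64

def bigint_to_int32(value):
    """Narrow a Bigint id to the Int32 range, failing loudly if it can't fit."""
    if not (INT32_MIN <= value <= INT32_MAX):
        raise OverflowError(f"id {value} does not fit in System.Int32")
    return int(value)
```

Failing loudly matters here: a silent overflow would hand the external list a wrong id and corrupt the mapping back to the database row.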

About the custom .NET assembly connector

BCS also supports the creation of connectors via Visual Studio.NET 2010. Such connectors potentially harness more power, but in video game terminology, they are a couple levels up.

There are two kinds of connectors:

  • .NET Assembly Connectors: they connect to a specific system (say, your customer database).
  • Custom Connectors: they connect to a generic type of system (say, SQL Server).

If you’re not a 3rd party vendor planning to build custom connectors (a free tip: don’t build one for SQL Server, a large company beat you to it), you probably belong to a group that is far more likely to build .NET Assembly Connectors. We’re also members of that group, so please come and join us.

Creating a Hello World .NET assembly connector

If we were attending a party, this section would be the part where we say hi to the host, without mentioning that you might want to bring a gift too. In this section, we’ll be creating a .NET assembly connector that is as simple as possible and use it to build an external list that allows read operations only. We’ll be using an imaginary SQL Server table that contains an integer called ProductID and a string called Name. The ProductID is the key that uniquely identifies products.

Please note: Don’t create that table yet, we don’t need it in this section. We’ll create it later.

Create a start project

First, we’ll be creating a new .NET assembly connector in VS.NET:

  1. Start Visual Studio 2010.
  2. Create a New Project.
  3. In the Installed Templates section, under the SharePoint > 2010 node, choose Business Data Connectivity Model.
  4. We’ll name the project: LoisAndClarkConnectors.
  5. Click OK.
  6. Choose a SharePoint site and accept the only option available: Deploy as a farm solution.
  7. Click Finish. You should note the presence of a new folder called BdcModel1.
  8. Select BdcModel1, and, using the Properties window, change the folder name to ProductModel.
  9. The focus should still be on the ProductModel node. You should see an entity called Entity1 on the Entity Design Surface. You don’t need it, delete it.
  10. If you expand the ProductModel node, you’ll notice there are a couple of items: BdcModel1.bdcm, Entity1.cs, and Entity1Service.cs.
  11. Delete Entity1.cs and Entity1Service.cs.
  12. Rename BdcModel1.bdcm to ProductModel.bdcm using its File Name property.

Being superstitious, we’ll stop at step 12. Now, we’re at a place where we’ve got everything cleaned up nicely and we’ve created ourselves a fine place to start.

Please note: The part about being superstitious is actually not true; we’re planning to create step 13s like there’s no tomorrow in the rest of the article. It was just that we were out of steps to add to the list.

Inspecting the xml of the BDC Metadata model

At this point, it’s good to understand that the BDC Metadata Model (the .bdcm file), is actually an XML file.  If you want to inspect the XML, do the following:

  1. Right-click the ProductModel.bdcm file and choose Open With.
  2. Choose Yes if a dialog appears asking you if you want to save your changes.
  3. Choose XML (Text) Editor in the Open With – ProductModel.bdcm dialog.
  4. Click OK.
  5. Choose Yes in the dialog that asks if you want to close the bdcm file.

Now, you’re looking at the XML of the BDCM file. Not as easy as pressing F7, but ok. If you want to see the Design Surface again, double click the bdcm file and choose Yes to close the XML view of the BDCM file.

BCS terminology prerequisites

In the next part, we’ll be creating a Product entity that supports two types of operations: read the entire product list, and return a single product item. In BCS terminology, these types of operations are called stereotypes or stereotyped operations.

Stereotypes are predefined operation types that can be defined against the external system. In other words, they are the operation types that can be used and have meaning within BCS. The most basic stereotypes in BCS are: Read Item, Read List, Create, Update, and Delete. In addition, there are more stereotypes available (15 or so). Of course, the "reading the entire product list" and "returning a single product item" operations correspond to the BCS stereotypes Read List and Read Item.

It’s important to keep in mind that if you want to support an operation in BCS you need to create both a Method and a MethodInstance element in the BDC Metadata model.

The Method element defines what the operation looks like in terms of defining input parameters, output parameters, and filters. The following BDCM fragment describes a method called ReadList that returns a collection of products and accepts no input parameters:

<Method Name="ReadList">
    <Parameter Name="productList" Direction="Return">
        <TypeDescriptor Name="ProductList" TypeName="System.Collections.Generic.IEnumerable`1[System.String]" IsCollection="true">
            <TypeDescriptors>
                <TypeDescriptor Name="Product" TypeName="System.String" />
            </TypeDescriptors>
        </TypeDescriptor>
    </Parameter>

    … Method instance info omitted for clarity.
</Method>

A MethodInstance element defines which stereotype is implemented. The following BDCM fragment describes a method instance called ReadListInstance that implements the Finder BCS stereotype (by default, this method instance would be named ReadList, but we find that confusing).

<Method Name="ReadList">
    <Parameter Name="productList" Direction="Return">
        <TypeDescriptor Name="ProductList" TypeName="System.Collections.Generic.IEnumerable`1[System.String]" IsCollection="true">
            <TypeDescriptors>
                <TypeDescriptor Name="Product" TypeName="System.String" />
            </TypeDescriptors>
        </TypeDescriptor>
    </Parameter>
    <MethodInstance Name="ReadListInstance" Type="Finder" ReturnParameterName="productList" ReturnTypeDescriptorPath="ProductList" />
</Method>

A method is like a prototype of the operation and can contain multiple method instances, although typically developers implement a 1-1 relationship to keep things no more complicated than necessary. A method instance is like the implementation of the method.

Creating a Product entity

On with the creation of the Product entity… First, we’ll only add a Product entity to the Design surface, and define a unique identifier for it:

  1. From the Toolbox, drag an Entity to the Design Surface.
  2. Using its Name property, call it Product.
  3. Right-click the Product entity and choose Add > Identifier.
  4. Call the identifier ProductID.
  5. In the Properties window, set the Type Name to System.Int32.
  6. Set the Identifier Entity to Product and set the Identifier property to ProductID.
  7. Set the Read-only property to True. This is necessary because users should not be able to change the unique id.

Mapping the Product entity to a DTO class

The Product entity needs to be mapped to a .NET class (the DTO class or POCO) containing properties that correspond to the entity. First, we’ll have to create the .NET class:

  1. Right-click the ProductModel folder and choose Add > Class.
  2. Call the class Product.cs and click Add.
  3. Add the following code:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace LoisAndClarkConnectors.ProductModel
{
    public class Product
    {
        public int ProductId { get; set; }
        public string Name { get; set; }
    }
}

At this point you should take note of two things:

  • Since it’s a simple example, we’ll only be adding a Name property, and of course, a unique identifier. If the unique identifier is missing, you won’t be able to create an external content type for it. So, this is about the minimum amount of code we need to get by.
  • The other thing to make a strong note of is the fact that we’ve called the property ProductId. In the Product entity on the design surface we’ve called the unique identifier ProductID, spelled with different casing. This is absolutely on purpose, but will lead to errors later on. Because we feel this mistake is so easy to make and the Visual Studio tooling doesn’t check for this, we want to show what happens and how you can detect this mistake and correct it.

Next, we need to create a method that returns all products. Or, to put it in BCS terms, we need to implement the Finder stereotype. To do this, we need to describe it in a BCS method that also defines its parameters. We’ll also define a BCS method instance that refers to the Finder stereotype.

  1. Double-click ProductModel.bdcm.
  2. At the bottom window, click BDC Method Details.
  3. In the Methods section, click Add a Method and choose Create a Finder Method.
  4. Change the name (thru the Property window) to ReadProducts.
  5. Notice that VS.NET created a return parameter called productList. Rename it to ProductList.
  6. The TypeDescriptor of the ReadProducts list is not correct; click it and choose Edit.
  7. In the Properties window, choose Type Name.
  8. Click the Current Project tab.
  9. Click Product (referring to the Product.cs you created earlier).

Right now you’ve created a BCS method that returns a collection of Product instances, and that’s exactly what we want. Now it’s time to switch to the BDC Explorer, and finish the description of the various types we need in the ReadProducts method. The BDC Explorer should look like this:


Up until this point VS.NET has inferred that the productList return parameter (@parameter) returns a collection of Product objects. Strangely, it doesn’t know the type of a Product and thinks it’s a System.String. We’ll need to correct that:

  1. Click Type Name in the Properties window.
  2. Click Current Project.
  3. Click Product.

Although now the BDCM model knows that a ProductList contains a collection of Product objects AND knows that a Product refers to the Product class, it still needs more information. It also doesn’t know anything about the members of the Product class, so we’ll need to add them too.

  1. Right-click the Product node in BDC Explorer and choose Add Type Descriptor.
  2. Give it the following name: ProductID.
  3. Give it the following type name: System.Int32.
  4. Add another type descriptor to the Product node.
  5. Give it the name: Name.
  6. The default type, System.String, is good.

At this point, we’ve described in enough detail what the BCS method looks like. Now, there’s still the job of defining the BCS method instance.

  1. In the BDC Method Details pane, select the ReadProducts method instance.
  2. Since it’s too easy to confuse with the BCS ReadProducts method, change its name property to ReadProductsInstance.

Adding a specific finder

Before we’re able to create an external list, we need not only a Finder method, but also a SpecificFinder method that’s able to return the specifics of any given product. We’ll create one in this section.

  1. In the BDC Method Details pane, click Add a Method.
  2. Select Create Specific Finder Method.
  3. It’s called ReadItem, but let’s change its name to ReadProduct.
  4. Change the parameter names to Product and ProductID.
  5. For the Product parameter, click the Type Descriptor of ProductID and set the Identifier Entity to Product. Set the Identifier property to ProductID.
  6. Set the Read-only property to True.
  7. Change the name of the method instance (also called ReadProduct) to ReadProductInstance.

Note that it already has an In parameter allowing BCS to pass in a product id parameter. It also has a return parameter that gives back a Product object. Nothing else needs to be done here at this point.

Switch back to the Solution Explorer and double-click ProductService.cs. As you can see, there are two static methods waiting for you to implement them. Since we’re only creating a Hello World example we’ll do this in the simplest way possible:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace LoisAndClarkConnectors.ProductModel
{
    public partial class ProductService
    {
        public static IEnumerable<Product> ReadProducts()
        {
            return new List<Product>()
            {
                new Product() { ProductId = 1, Name = "Blue" },
                new Product() { ProductId = 2, Name = "Red" },
            };
        }

        public static Product ReadProduct(int productID)
        {
            return new Product() { ProductId = 1, Name = "Blue" };
        }
    }
}

Debugging the ECT

Remember that we misspelled ProductId on purpose in Product.cs, and that it should be ProductID. We’ve found that SPD is a great tool for finding such mistakes at an early stage (unfortunately, there’s no tool for checking at design time).

Let’s see what happens if we press F5, start the project, and deploy the external content type we’ve created.

  1. The SharePoint site opens.
  2. Choose Site Actions > Edit in SharePoint Designer.
  3. Click External Content Types.
  4. Click the Product ECT.
  5. Double-click ReadProductInstance.
  6. Choose Yes in the dialog asking you to create a connection and edit the operation.
    The Read Item window opens and notifies you there is an error.
  7. Click Next two times.
  8. Here you see that the identifier ProductID is not mapped to a data source element. You’ll also notice that the data source element ProductId is not checked.

We’ll fix the problem at the source later. For now, just click Cancel. Let’s go ahead and create the external list anyway.

  1. In SPD, click Lists and libraries.
  2. In the ribbon, click External List.
  3. In the External Content Types Picker, double-click Product.
  4. Call the list ProductList and click OK.

Regardless of the permissions you may have in SharePoint, you still need to set explicit permissions for the ECT.

  1. Open SharePoint Central Administration (SCA).
  2. Click Application Management.
  3. In the Service Applications section, click Manage service applications.
  4. Choose Business Data Connectivity Service.
  5. From the Product action (ECB) menu, choose Set Permissions.
  6. Add your account and give all permissions, then click OK.

Go back to the original SharePoint site, which should now contain the ProductList list.
Click it. You’ll see the following message, along with a correlation id:

Unable to display this Web Part. To troubleshoot the problem, open this Web page in a Microsoft SharePoint Foundation-compatible HTML editor such as Microsoft SharePoint Designer. If the problem persists, contact your Web server administrator.


Copy the correlation id and paste it into SPLIT (the SharePoint Log Investigation Tool, which can be downloaded from CodePlex). Near the bottom of the list of related messages, you’ll find a message stating that a property with the name of ProductID can’t be found on the parent object. That’s another hint that you’ve misspelled something.

Now, go back to VS.NET and stop debugging. Correct the situation by renaming the property in Product.cs to ProductID. Do the same in ProductService.cs and check that the project compiles. Then, press F5 again. If you want, check ReadProductInstance in SPD. We feel this is the best way to check whether everything works, and the best way to get a good description of any problems.

Please note: If you’re getting an access denied message, you need to set the permissions in SCA for this ECT again, as described earlier in this article.

If everything is fine and dandy, you’ll also be able to look at the ProductList external list.

Please note: Debugging is quite easy. Just add break points to the ProductService.cs class.

Show me the money!

Let’s inspect the results. The product overview looks like this (the call to the ReadProducts() method):


The product detail view (the call to the ReadProduct() method) looks like this:


At this point, the XML for the BDCM model looks like this (if you still have problems, you may want to edit the BDCM file in XML mode and compare what’s different):

<?xml version="1.0" encoding="utf-8"?>
<Model xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/windows/2007/BusinessDataCatalog" Name="BdcModel1">
    <LobSystems>
        <LobSystem Name="BdcModel1" Type="DotNetAssembly">
            <LobSystemInstances>
                <LobSystemInstance Name="BdcModel1" />
            </LobSystemInstances>
            <Entities>
                <Entity Name="Product" Namespace="LoisAndClarkConnectors.ProductModel" Version="1.0.0.0">
                    <Properties>
                        <Property Name="Class" Type="System.String">LoisAndClarkConnectors.ProductModel.ProductService, BdcModel1</Property>
                    </Properties>
                    <Identifiers>
                        <Identifier Name="ProductID" TypeName="System.Int32" />
                    </Identifiers>
                    <Methods>
                        <Method Name="ReadProducts">
                            <Parameters>
                                <Parameter Name="ProductList" Direction="Return">
                                    <TypeDescriptor Name="ProductList" TypeName="System.Collections.Generic.IEnumerable`1[[LoisAndClarkConnectors.ProductModel.Product, BdcModel1]]" IsCollection="true">
                                        <TypeDescriptors>
                                            <TypeDescriptor Name="Product" TypeName="LoisAndClarkConnectors.ProductModel.Product, BdcModel1" IsCollection="false">
                                                <TypeDescriptors>
                                                    <TypeDescriptor Name="ProductID" TypeName="System.Int32" IsCollection="false" ReadOnly="true" IdentifierEntityName="Product" IdentifierEntityNamespace="LoisAndClarkConnectors.ProductModel" IdentifierName="ProductID" />
                                                    <TypeDescriptor Name="Name" TypeName="System.String" />
                                                </TypeDescriptors>
                                            </TypeDescriptor>
                                        </TypeDescriptors>
                                    </TypeDescriptor>
                                </Parameter>
                            </Parameters>
                            <MethodInstances>
                                <MethodInstance Name="ReadProductsInstance" Type="Finder" ReturnParameterName="ProductList" ReturnTypeDescriptorPath="ProductList" />
                            </MethodInstances>
                        </Method>
                        <Method Name="ReadProduct">
                            <Parameters>
                                <Parameter Name="Product" Direction="Return">
                                    <TypeDescriptor Name="Product" IsCollection="false" TypeName="LoisAndClarkConnectors.ProductModel.Product, BdcModel1">
                                        <TypeDescriptors>
                                            <TypeDescriptor Name="ProductID" IsCollection="false" TypeName="System.Int32" IdentifierEntityName="Product" IdentifierEntityNamespace="LoisAndClarkConnectors.ProductModel" IdentifierName="ProductID" ReadOnly="true" />
                                            <TypeDescriptor Name="Name" TypeName="System.String" />
                                        </TypeDescriptors>
                                    </TypeDescriptor>
                                </Parameter>
                                <Parameter Name="ProductID" Direction="In">
                                    <TypeDescriptor Name="ProductID" TypeName="System.Int32" IdentifierEntityName="Product" IdentifierEntityNamespace="LoisAndClarkConnectors.ProductModel" IdentifierName="ProductID" />
                                </Parameter>
                            </Parameters>
                            <MethodInstances>
                                <MethodInstance Name="ReadProductInstance" Type="SpecificFinder" ReturnParameterName="Product" ReturnTypeDescriptorPath="Product" />
                            </MethodInstances>
                        </Method>
                    </Methods>
                </Entity>
            </Entities>
        </LobSystem>
    </LobSystems>
</Model>

So, reaching the end of this section, did we earn enough points to take a peek at another golden oldie game? We sure did! How about  “Raid over Moscow”?

Building an entity framework wrapper

Of course, this was a simple hello world example. We need to take it to the next level and write the data access logic to make a real impact on the database. That way, we arrive at a full-fledged example of a BCS ECT working against a single table. We’ve chosen to use Entity Framework to accomplish this.

Start with creating a new database called TestBCS and execute the following SQL script to create the Product table we’re using for testing purposes:


/****** Object:  Table [dbo].[Product]    Script Date: 2/7/2012 11:44:14 AM ******/

CREATE TABLE [dbo].[Product](
    [ProductId] [int] IDENTITY(1,1) NOT NULL,
    [Name] [nchar](50) NULL,
    CONSTRAINT [PK_Product] PRIMARY KEY CLUSTERED
    (
        [ProductId] ASC
    )
)
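To have something to work with later on, you may want to seed the table with a couple of test rows. This is just a convenience step; the product names below are made up:

```sql
-- Hypothetical test data; any names will do.
INSERT INTO [dbo].[Product] ([Name]) VALUES (N'Widget');
INSERT INTO [dbo].[Product] ([Name]) VALUES (N'Gadget');
```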


We’re not really looking forward to creating a solution with the first version of the Entity Framework and the problems associated with that older version. So, we’ll build a data access layer using v2, Entity Framework 4. This does have a significant consequence though: since SPS 2010 doesn’t support .NET 4, we’ll eventually have to build a WCF service around it as a wrapper and call that from the BCS ECT. Just as well, the solution seems cleaner that way and since we haven’t demonstrated how to build an ECT against a WCF service, it’ll add considerable flavor to this article as well.

Let’s not worry about that right now. First, we’ll just concentrate on creating the data access layer using the Entity framework.

  1. Start a new VS.NET project.
  2. Make a Class Library.
  3. Call it Product.Data.
  4. Click OK.
  5. Right-click the project, choose Properties, and set the following default namespace: LoisAndClark.Product.Data.
  6. Delete Class1.cs.
  7. Add a new ADO.NET Entity Data Model item and just call it Model.edmx.
  8. On the Entity Data Model Wizard dialog, choose Generate from database.
  9. Click Next.
  10. Click New Connection and point to the TestBCS database.
  11. Click Next.
  12. Select the Product table.
  13. Uncheck the Pluralize or singularize generated object names.
  14. Click Finish.
  15. Add a new project, a Console application, to the Product.Data solution. We’ll call it the ProductTester.
  16. Right-click it and choose Set as Startup project.
  17. Add a Project reference to Product.Data.
  18. Add a reference to System.Data.Entity.
  19. In Program.cs, add the following using statement: using LoisAndClark.Product.Data;.
  20. Add a new item of template type: Application Configuration File.
  21. Add the following <connectionStrings> section to it (copy the section from the app.config in Product.Data):

<connectionStrings>
    <add name="TestBCSEntities" connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=ASTRO\CTP0;initial catalog=TestBCS;integrated security=True;multipleactiveresultsets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />
    <add name="TestBCSEntities1" connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=ASTRO\CTP0;initial catalog=TestBCS;integrated security=True;multipleactiveresultsets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />
</connectionStrings>

We’ve included the source code of our main class. If you look carefully, it contains all the building blocks required for creating the basic set of BCS stereotypes (find, find all, create, update, delete).

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using LoisAndClark.Product.Data;

namespace ProductTester
{
    class Program
    {
        static void Main(string[] args)
        {
            // Exercise the CRUD helpers in order.
            Add();
            Update();
            Delete();
        }

        private static void Add()
        {
            using (var ctx = new TestBCSEntities1())
            {
                ctx.Product.AddObject(new Product()
                {
                    Name = "Test"
                });
                ctx.SaveChanges();
            }
        }

        private static int GetId(string name)
        {
            int productId;

            using (var ctx = new TestBCSEntities1())
            {
                Product product = (from p in ctx.Product
                                   where p.Name == name
                                   select p).First();

                productId = product.ProductId;
            }

            return productId;
        }

        private static void Update()
        {
            int id = GetId("Test");

            using (var ctx = new TestBCSEntities1())
            {
                // We'll do something similar later to implement the Finder and SpecificFinder BCS stereotypes.
                Product product = (from p in ctx.Product
                                   where p.ProductId == id
                                   select p).First();

                Console.WriteLine("id {0} name {1}", product.ProductId, product.Name);
                product.Name = "UpdatedName";
                ctx.SaveChanges();
            }
        }

        private static void Delete()
        {
            int id = GetId("UpdatedName");

            using (var ctx = new TestBCSEntities1())
            {
                Product product = (from p in ctx.Product
                                   where p.ProductId == id
                                   select p).First();

                ctx.Product.DeleteObject(product);
                ctx.SaveChanges();
            }
        }
    }
}



By now, you know the drill. Once we have reached the end of a significant section, it’s time for a small reward. Leave a comment if you have played “Aztec Challenge”:


Building the WCF 4 wrapper service

Using Entity Framework 4 is a great thing, but it does complicate matters a little. Now we need a wrapper service to consume our .NET 4 DLL. It would be folly to create an .asmx service for it, so of course we’ll build a WCF 4 service around it. If you’ve ever tried it, you know that creating a .NET 4 application under a SharePoint site ain’t no picnic, so we’ll be creating a separate web site for it later.

Please note: We’re skipping the business layer altogether, just because we can and this is only an example.

Let’s start creating a WCF service.

  1. In VS.NET, right-click the solution and choose Add > New Web Site.
  2. Accept all defaults, and call it ProductServices.
  3. Click OK.
  4. Add a project reference to Product.Data.
  5. Add a reference to System.Data.Entity.
  6. Expand the App_Code folder and delete IService1.cs and Service1.cs.
  7. Add a new class to the App_Code folder called IProductService.cs (the service interface).
  8. Add a new class to the App_Code folder called ProductService.cs.
  9. Add a new class to the App_Code folder called SingleProduct.cs.
  10. Add a new class to the App_Code folder called AllProductsResponse.cs.
  11. Add a new class to the App_Code folder called ProductRequest.cs.
  12. Open Service.svc and change the code behind to ProductService.cs, so that it looks like:

<%@ ServiceHost Language="C#" Debug="true" Service="ProductServiceLibrary.ProductService" CodeBehind="~/App_Code/ProductService.cs" %>

Just above the closing </configuration> element of the web.config file, add the following connection section:

<connectionStrings>
    <add name="TestBCSEntities" connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=ASTRO\CTP0;initial catalog=TestBCS;integrated security=True;multipleactiveresultsets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />
    <add name="TestBCSEntities1" connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=ASTRO\CTP0;initial catalog=TestBCS;integrated security=True;multipleactiveresultsets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />
</connectionStrings>

Let’s discuss the actual class implementations one by one. First of all, we’ll need a type that can represent a single product, so we can use it as the return type for our SpecificFinder method:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Runtime.Serialization;

namespace ProductServiceLibrary
{
    public class SingleProduct
    {
        public int ProductID { get; set; }

        public string Name { get; set; }
    }
}

Our Finder method needs to return a list of all products, so we’ll be needing a type for that too. Note that later on we’ll be creating an ECT in SPD that communicates with our WCF service, which compels us to keep things simple. This means you can’t use more than one level of nested types, because SPD won’t understand it.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Runtime.Serialization;

namespace ProductServiceLibrary
{
    public class AllProductsResponse
    {
        public List<SingleProduct> Products { get; set; }
    }
}

For update scenarios, we’ll be passing in a single parameter representing a single product:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Runtime.Serialization;

namespace ProductServiceLibrary
{
    [DataContract]
    public class ProductRequest
    {
        [DataMember(IsRequired = false)]
        public int ProductID { get; set; }

        [DataMember]
        public string Name { get; set; }
    }
}

The interface for our WCF service must contain operations for the 5 basic BCS stereotypes:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.ServiceModel;

namespace ProductServiceLibrary
{
    [ServiceContract]
    public interface IProductService
    {
        [OperationContract]
        List<SingleProduct> FindAllProducts();

        [OperationContract]
        SingleProduct FindSpecificProduct(int productID);

        [OperationContract]
        int CreateProduct(ProductRequest request);

        [OperationContract]
        void UpdateProduct(ProductRequest request);

        [OperationContract]
        void DeleteProduct(int productId);
    }
}

The actual implementation of our WCF service leverages the Entity Framework model we created in an earlier section and looks like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using LoisAndClark.Product.Data;

namespace ProductServiceLibrary
{
    public class ProductService : IProductService
    {
        public List<SingleProduct> FindAllProducts()
        {
            var response = new List<SingleProduct>();

            using (var ctx = new TestBCSEntities1())
            {
                var products = from p in ctx.Product
                               select p;

                foreach (var product in products)
                {
                    response.Add(new SingleProduct()
                    {
                        ProductID = product.ProductId,
                        Name = product.Name
                    });
                }
            }

            return response;
        }

        public SingleProduct FindSpecificProduct(int productID)
        {
            using (var ctx = new TestBCSEntities1())
            {
                var products = from p in ctx.Product
                               where p.ProductId == productID
                               select p;

                if (products.Count() == 1)
                {
                    return new SingleProduct()
                    {
                        ProductID = products.First().ProductId,
                        Name = products.First().Name
                    };
                }

                return null;
            }
        }

        public int CreateProduct(ProductRequest request)
        {
            int newId;

            using (var ctx = new TestBCSEntities1())
            {
                var newProduct = new Product() { Name = request.Name };
                ctx.Product.AddObject(newProduct);
                ctx.SaveChanges();

                newId = newProduct.ProductId;
            }

            return newId;
        }

        public void UpdateProduct(ProductRequest request)
        {
            using (var ctx = new TestBCSEntities1())
            {
                Product product = (from p in ctx.Product
                                   where p.ProductId == request.ProductID
                                   select p).First();

                product.Name = request.Name;
                ctx.SaveChanges();
            }
        }

        public void DeleteProduct(int productId)
        {
            using (var ctx = new TestBCSEntities1())
            {
                Product product = (from p in ctx.Product
                                   where p.ProductId == productId
                                   select p).First();

                ctx.Product.DeleteObject(product);
                ctx.SaveChanges();
            }
        }
    }
}


If everything compiles, you should be good to go.

  1. You can test it by setting ProductServices as the startup project.
  2. Press CTRL+F5 (Start without debugging).
  3. Now add a service reference in the ProductTester project to this service.
  4. Set ProductTester as the startup project and add the following test code:

Please note: Rename ServiceReference2 to whatever name you’re using.

ServiceReference2.IProductService service = new ServiceReference2.ProductServiceClient();
int newID = service.CreateProduct(new ServiceReference2.ProductRequest() { Name = "ABNA" });

ServiceReference2.SingleProduct product = service.FindSpecificProduct(newID);

ServiceReference2.ProductRequest request = new ServiceReference2.ProductRequest() { ProductID = product.ProductID, Name = "New name" };
service.UpdateProduct(request);

List<ServiceReference2.SingleProduct> products = service.FindAllProducts().ToList<ServiceReference2.SingleProduct>();
foreach (var prod in products)
{
    Console.WriteLine("id: {0} name: {1}", prod.ProductID, prod.Name);
}

Now add a breakpoint to the first line of the Main method of ProductTester, press F5, step thru it and see if everything works.

Since we’ve created a WCF 4 service, we won’t go thru the hassle and try to get everything to work in a separate application under a SharePoint site. Instead, we’ll just create a new web site for it.

  1. Open IIS Manager.
  2. Right-click the Application Pools node and choose Add Application Pool.
  3. Call it ServicesPool.
  4. Select the .NET Framework 4 version.
  5. Click OK.
  6. Select ServicesPool, then click Advanced Settings.
  7. Locate the Identity property and set it to a custom account which has access to the TestBCS database. This is important, otherwise the WCF service won’t work.
  8. Click OK.
  9. Stop and start the application pool.
  10. Right-click the Sites node and choose Add Web Site.
  11. Name it ServicesSite.
  12. Select the ServicesPool App pool.
  13. Specify the following Physical path: C:\inetpub\wwwroot\ServicesSite.
  14. Choose a free port number. We’ve chosen 8080.
  15. Click OK.
  16. Right-click ServicesSite and choose Add Application.
  17. Call the Alias ProductServices.
  18. Set the Physical path to C:\inetpub\wwwroot\ServicesSite.
  19. Click OK.
  20. Copy the entire contents of the ProductServices folder to C:\inetpub\wwwroot\ServicesSite.
  21. Adjust the web.config so that it contains the service info (the <system.serviceModel> part) from the app.config file of your service, like so:

<?xml version="1.0"?>
<configuration>
  <system.web>
    <compilation debug="true" targetFramework="4.0">
      <assemblies>
        <add assembly="System.Data.Entity, Version=4.0.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089"/>
      </assemblies>
    </compilation>
  </system.web>
  <system.serviceModel>
    <services>
      <service name="ProductServiceLibrary.ProductService">
        <host>
          <baseAddresses>
            <add baseAddress="http://localhost:8080/ProductService/" />
          </baseAddresses>
        </host>
        <!-- Service Endpoints -->
        <!-- Unless fully qualified, address is relative to base address supplied above -->
        <endpoint address="" binding="wsHttpBinding" contract="ProductServiceLibrary.IProductService">
          <!-- Upon deployment, the following identity element should be removed or replaced to reflect the
               identity under which the deployed service runs. If removed, WCF will infer an appropriate identity -->
          <identity>
            <dns value="localhost"/>
          </identity>
        </endpoint>
        <!-- Metadata Endpoints -->
        <!-- The Metadata Exchange endpoint is used by the service to describe itself to clients. -->
        <!-- This endpoint does not use a secure binding and should be secured or removed before deployment -->
        <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange"/>
      </service>
    </services>
    <behaviors>
      <serviceBehaviors>
        <behavior>
          <!-- To avoid disclosing metadata information,
               set the value below to false and remove the metadata endpoint above before deployment -->
          <serviceMetadata httpGetEnabled="True"/>
          <!-- To receive exception details in faults for debugging purposes,
               set the value below to true. Set to false before deployment
               to avoid disclosing exception information -->
          <serviceDebug includeExceptionDetailInFaults="True" />
        </behavior>
      </serviceBehaviors>
    </behaviors>
  </system.serviceModel>
  <system.webServer>
    <modules runAllManagedModulesForAllRequests="true"/>
  </system.webServer>
  <connectionStrings>
    <add name="TestBCSEntities" connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=ASTRO\CTP0;initial catalog=TestBCS;integrated security=True;multipleactiveresultsets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />
    <add name="TestBCSEntities1" connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=ASTRO\CTP0;initial catalog=TestBCS;integrated security=True;multipleactiveresultsets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />
  </connectionStrings>
</configuration>
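By the way, the IIS work above can also be scripted with appcmd instead of clicking through IIS Manager. This is just a sketch under the same assumptions (pool, site, and application names, paths, and port 8080 as chosen earlier); the custom pool identity from the Advanced Settings step still needs to be set separately:

```cmd
rem Create a .NET 4 application pool (ServicesPool).
%windir%\system32\inetsrv\appcmd add apppool /name:ServicesPool /managedRuntimeVersion:v4.0

rem Create the ServicesSite web site on port 8080.
%windir%\system32\inetsrv\appcmd add site /name:ServicesSite /physicalPath:C:\inetpub\wwwroot\ServicesSite /bindings:http/*:8080:

rem Run the site's root application in the new pool.
%windir%\system32\inetsrv\appcmd set app "ServicesSite/" /applicationPool:ServicesPool

rem Create the ProductServices application under the site.
%windir%\system32\inetsrv\appcmd add app /site.name:ServicesSite /path:/ProductServices /physicalPath:C:\inetpub\wwwroot\ServicesSite
```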


You have arrived at a point where you have a working WCF service supporting the basic BCS stereotypes. Let’s test it by opening the ProductTester app and adding a service reference to: http://localhost:8080/ProductServices/service.svc. If your WCF service works, good things are about to happen.

It’s that time again, the end of a significant section. Btw, we still can’t be beat at “Way of the Exploding Fist”:

Full-fledged single table

We can use SPD to create an ECT that consumes our WCF service by pointing it at the MEX endpoint, which can be found at:

http://[server]:[port]/ProductServices/Service.svc/mex

In our case, the location is:

http://astro.loisandclark.internal:8080/ProductServices/Service.svc/mex
Please note: It’s important that you don’t use http://localhost anymore, because SPD won’t accept it.

Let’s walk thru the steps of creating the new ECT:

  1. Open SPD again for the site where you want to add the new ECT.
  2. Click External Content Types.
  3. In the ribbon, click External Content Type.
  4. In the External Content Type Operations section, click Click here to discover external data sources and define operations.
  5. Click Add Connection.
  6. In the External Data Source Type Selection dialog, choose WCF Service and click OK. The WCF Connection dialog opens.
  7. Enter the following Service Metadata URL: http://astro.loisandclark.internal:8080/ProductServices/Service.svc/mex.
  8. Make sure the Metadata Connection Mode is set to Metadata Exchange.
  9. Set the Service EndPoint URL to http://astro.loisandclark.internal:8080/ProductServices/Service.svc.
  10. Choose the following name: ProductServiceConnection.
  11. Click OK.

Finish the creation of operations for the new connection. We’ll start with the FindAll stereotype:

  1. In the Data Source Explorer, expand ProductServiceConnection.
  2. Expand the Web Methods node.
  3. Right-click FindAllProducts and choose New Read List Operation.
  4. Click Next twice.
  5. When you arrive at the Return Parameter Configuration page, you’ll see an error stating that you should pick an identifier.
  6. Select ProductID and check the Map to Identifier checkbox.
  7. The error disappears, click Finish.

Now, let’s work on the ReadSpecific stereotype:

  1. Right-click FindSpecificProduct and choose New Read Item Operation.
  2. Click Next.
  3. On the Input Parameters Configuration page, select productID and check the Map to Identifier checkbox.
  4. Click Next.
  5. On the Return Parameter page, select ProductID and check the Map to Identifier checkbox.
  6. Click Finish.

We have support for the most basic operations. At this point, you will be able to create a read-only version of an external list based on the new ECT communicating with our WCF service. Since we’re going for the full-fledged implementation, we’ll be implementing the other stereotypes as well, starting with Create:

  1. Right-click CreateProduct and choose New Create Operation.
  2. Click Next.
  3. On the Input Parameters page, select ProductID and check the Map to Identifier checkbox.
  4. Click Next and Finish.

Let’s move on to Delete:

  1. Right-click DeleteProduct and choose New Delete Operation.
  2. Click Next.
  3. On the Input Parameters page, select productId and check the Map to Identifier checkbox.
  4. Click Finish.

Finally, finish up Update:

  1. Right-click UpdateProduct and choose New Update Operation.
  2. Click Next.
  3. On the Input Parameters page, select ProductID and check the Map to Identifier checkbox.
  4. Click Finish.

You have defined all basic operations for the ECT; let’s finish the last details:

  1. On the breadcrumb menu, click New external content type.
  2. Choose the following name: ProductServiceECT and choose the same Display Name.
  3. Save it by pressing CTRL+S.
  4. Go on creating an external list for this ECT.
  5. Click Lists and Libraries.
  6. In the ribbon, click External List.
  7. Choose ProductServiceECT and click OK.
  8. Call the external list MyProductServiceList and click OK.

You’re almost ready to use the list, except that you need to set some permissions.

  1. Open SCA.
  2. Click Application Management.
  3. Click Manage service applications.
  4. Click Business Data Connectivity Service.
  5. From the ECB menu of ProductServiceECT, select Set Permissions.
  6. Give the desired users all permissions available and click OK.

Open the browser and check out the external list (see Figure). Play around with it a bit to ensure that all operations work.


Time for a little treat. Truthfully, this is the funniest multiplayer game ever built, “Spy vs Spy”:



Deploying a .NET assembly connector is easy, because a solution (a .wsp file) is created automatically by VS. Deploying a solution created by SPD, such as the one described in the previous section, is not truly difficult, but a bit more esoteric. What you need to do is find a way to export the corresponding BDC model and associated external lists, modify connection info, and install it on another environment using a feature.

Honestly, this is one of those moments where we feel closer to being sorcerers than software developers. If you want to include external list instances too, you need to export the entire site, locate the artifacts you need, and make a SharePoint feature for them that you can deploy. Follow along with this walkthrough of the SPD solution incantation:

  1. Browse to the site containing the ECT you want to deploy.
  2. Go to Site Actions > Site Settings.
  3. In the Site Actions section, click the Save site as template link.
  4. Choose the following file name: MySiteTemplate.
  5. Choose the same Template name.
  6. Click OK. This creates a template in the Solutions Gallery.
  7. Click the go to the solution gallery link.
  8. Right-click MySiteTemplate and choose Save target as.
  9. Save MySiteTemplate.wsp in c:\temp.
  10. Start VS.NET 2010.
  11. Choose File > New > Project > SharePoint > Import SharePoint Solution Package.
  12. Click OK.
  13. Select a SharePoint site used for debugging. We’ll use http://astro/.
  14. Select Deploy as a farm solution.
  15. Click Next.
  16. Browse to the path for the existing solution package (MySiteTemplate.wsp).
  17. Click Next.
  18. Select the first item from the list, then press CTRL+SHIFT+END. This selects all items in the package.
  19. Click a checkbox. This causes all items to be unchecked.
  20. Locate the Lists you want to take with you (they have the type List Instance) and check them.
  21. Click Finish.
  22. If you get a warning that the list instances you’ve selected depend on Modules, click Yes to include all required dependencies.
  23. Click OK once the solution import has completed successfully.
  24. Switch to SPD.
  25. Click External Content Types.
  26. Right-click ProductServiceECT and choose Export BDC Model.
  27. Choose the BDC Model Name: ProductServiceModel.
  28. Click OK.
  29. Save ProductServiceModel.bdcm in c:\temp.
  30. Go back to VS.
  31. Right-click the project and choose Add > Existing Item, then browse to c:\temp and import ProductServiceModel.bdcm.
  32. If you’re asked to add a .NET assembly LobSystem, choose No.
  33. Click OK.

A feature is created automatically for the BDCM file and you’re ready to deploy your BCS solution. This also means that it’s reward time again, how about taking a look at a true classic: “Outrun”?


Standard Search support

The reason why you’d wanna be building BCS solutions lies in the fact that BCS integrates so well with SharePoint, something you can’t achieve by just building ASP.NET solutions. We’ve already seen external lists, now we’re gonna take a look at SharePoint search support.

You will find that having basic search support is relatively easy. That is, if you’re a SharePoint magician or have mental problems (as the Mongolian proverb says: “A detour of 10 miles is not much, if you’re a mad dog”).

To begin with, we will take some actions to make sure that it’s possible to click on search results and look at the details. Clicking a search result leads to the display of the ECT profile page, and we didn’t create one – yet. Profile pages are special web part pages that display info about external items. Here’s how you go about creating one:

  1. First set an ECT profile Page Host site by opening SCA.
  2. Click Application Management.
  3. Click Manage Service applications.
  4. Click Business Data Connectivity Service.
  5. In the ribbon, click Configure.
  6. Check Enable Profile Page Creation and define the site where profile pages will be created (in a special doc lib called _bdc). We’ve chosen http://astro as our host site.
  7. Click OK.

Now, create a profile page for our ECT.

  1. Open SPD.
  2. Click External Content Types.
  3. Click ProductServiceECT.
  4. In the ribbon, click Create Profile Page.

This causes the creation of a profile page, which shows search result details.

For every ECT that supports basic search, you always need to implement a Finder and a SpecificFinder method and make some modifications to the BDCM model to support indexing (full crawl). We needed to provide those methods anyway, so as far as that’s concerned we’re good to go. If you want to support incremental crawls, you have to do more work (such as providing a last modified datetime), but in this article we’re not gonna go there.
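For the curious, supporting incremental crawls boils down to exposing a last-modified timestamp through your Finder and pointing the BDC model at it. A rough, hedged sketch of the idea (the LastModified descriptor name is made up for this example, and the exact property placement follows the SP2010 BDC conventions as we understand them):

```xml
<!-- Assumes the Finder return type exposes a DateTime TypeDescriptor called LastModified. -->
<MethodInstance Name="ReadProductsInstance" Type="Finder"
                ReturnParameterName="ProductList" ReturnTypeDescriptorPath="ProductList">
  <Properties>
    <!-- Tells the crawler which field to compare between crawls. -->
    <Property Name="LastModifiedTimeStampField" Type="System.String">LastModified</Property>
  </Properties>
</MethodInstance>
```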

The next procedure shows how to add search support for our custom BCS ECT.


To support indexing of an external system, you need to understand some parts of the BDC model and make some minor changes too. We’ll get IntelliSense support by copying the schema of the BDC model into the same folder as the model itself, and then we’ll edit the model in VS.NET.

  1. Go to SPD.
  2. Click External Content Types.
  3. Right-click ProductServiceECT > Export BDC Model.
  4. Choose the following BDC Model Name: ProductServiceBDCM.
  5. Click OK.
  6. Save it in c:\temp.
  7. Rename ProductServiceBDCM.bdcm to ProductServiceBDCM.xml.
  8. Go to C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\XML and copy BDCMetadata.xsd to c:\temp.
  9. Right-click ProductServiceBDCM.xml and choose Open with > Microsoft Visual Studio 2010.
  10. Notice that the FindAllProducts method has a property called RootFinder. This method is called by the crawler during the indexing of product information. You can give the property any value you want (such as ‘x’) or no value at all; it just needs to be present.
  11. Locate the element <LobSystem Type="Wcf" Name="ProductServiceConnection">. In its <Properties> section, add the following property (which also doesn’t require a value):

    <Property Name="ShowInSearchUI" Type="System.String"></Property>

  12. Save the model.
  13. Rename it back to ProductServiceBDCM.bdcm.
  14. Open SCA.
  15. Click Application Management > Manage service applications > Business Data Connectivity Service.
  16. Delete the existing ProductServiceBDCM model.
  17. In the ribbon, click Import.
  18. Browse to ProductServiceBDCM.bdcm.
  19. Click Import. It will take some time before the BDCM file is validated.
  20. When finished, click OK.
  21. In the ECB menu of ProductServiceBDCM, choose Set Permissions and set user permissions for your test user.
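Instead of deleting and re-importing the model through SCA by hand, the import can also be scripted with the SharePoint 2010 PowerShell cmdlets. A sketch, assuming the model file sits in c:\temp and http://astro is your site (permissions still need to be set afterwards):

```powershell
# Get a handle to the BDC metadata catalog for this service context.
$catalog = Get-SPBusinessDataCatalogMetadataObject -BdcObjectType Catalog -ServiceContext http://astro

# Import the modified model into the Business Data Connectivity service.
Import-SPBusinessDataCatalogModel -Path "c:\temp\ProductServiceBDCM.bdcm" -Identity $catalog
```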

Now, let’s index our Products data.

  1. Open SCA.
  2. Click Application Management.
  3. Click Manage service applications.
  4. Click Search Service Application.
  5. Click Content Sources.
  6. Click New Content Source.
  7. Call it ProductSource.
  8. In the Content Source Type section, select Line of Business Data.
  9. In the External Data Source section, select Crawl selected external data source.
  10. Check ProductServiceConnection.
  11. In the Start Full Crawl section, check Start full crawl of this content source.
  12. Click OK.

On the Manage Content Sources page, you will see that the crawling process is starting. Refresh the page to see the status, but remember that this can take a while. In a test we had only 2 entries in our products database, but the indexing process still took 4 minutes and 20 seconds.

Eventually, the crawling process will end. Click Crawl Log to see how it went. You can click on the numbers to see more specifics about successes, failures and so on. After that, you can go to your Search Center (if you haven’t got one, create a new site collection using the Search Center template). Search for a product name (you have to type in the entire name) and you will get the appropriate search results:


Here is the details view:


Please note: If the URL starts with bdc3, you either haven’t created a profile page correctly, or you created it after creating and indexing the content source.

As a well-deserved reward, let’s take a look at the artwork of a game with great gameplay: Fast Eddie. Not only that, its name always reminds us of an old friend: Eddy Anthonio, the former European karate champ. He was aptly called Fast Eddy too.

ECT Associations

As of yet, we haven’t discussed what we think is a truly awesome feature of BCS: the ability to define relationships between ECTs, also called ECT Associations. This means you now have the possibility to define a relationship between tables located in different databases, or between a web service and a database table. How cool is that?

BCS supports the following associations:

  • One-to-many
  • Many-to-many (not available in SPD)
  • Self-referential (returns the same ECT)
  • Reverse (returns parent entity)
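Under the covers, SPD records an association in the BDC model as a special kind of method instance. A rough sketch of the shape, using the element names from the SP2010 BDC schema (the association name, return parameter, and namespaces below are illustrative, not what SPD will generate verbatim for you):

```xml
<!-- Hypothetical one-to-many association: one Product, many Product_Parts. -->
<Association Name="ProductToProductParts" Type="AssociationNavigator"
             ReturnParameterName="ProductParts">
  <SourceEntity Namespace="LoisAndClarkConnectors.ProductModel" Name="Product" />
  <DestinationEntity Namespace="LoisAndClarkConnectors.ProductModel" Name="Product_Part" />
</Association>
```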

In this section, we will explore a one-to-many relationship. To demonstrate it, we’ll create a new database table called Product_Part, which contains multiple product parts that each refer to a specific product. We won’t bother creating a FK relation for this table, since we’ll be creating another kind of relation, a BCS association, pretty soon.

Here’s the script for creating the Product_Part table:


/****** Object:  Table [dbo].[Product_Part]    Script Date: 2/14/2012 7:00:41 AM ******/

CREATE TABLE [dbo].[Product_Part](
    [ProductPartID] [int] IDENTITY(1,1) NOT NULL,
    [ProductID] [int] NOT NULL,
    [Name] [nchar](55) NOT NULL,
 CONSTRAINT [PK_Product_Part] PRIMARY KEY CLUSTERED
(
    [ProductPartID] ASC
))
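
If you want a few rows to play with later on, you can seed the table with some sample data. The part names and ProductID values below are made up; the ProductID values must match identifiers that your ProductServiceECT actually returns:

INSERT INTO [dbo].[Product_Part] ([ProductID], [Name]) VALUES (1, N'Casing');
INSERT INTO [dbo].[Product_Part] ([ProductID], [Name]) VALUES (1, N'Power supply');
INSERT INTO [dbo].[Product_Part] ([ProductID], [Name]) VALUES (2, N'Joystick port');

Note that ProductPartID is an identity column, so it is filled in automatically.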


We’ll move on and create a SQL Server ECT pointing to this table. This is convenient, because this is the last type of ECT we didn’t show yet in this article. In addition to the normal operations we’ve defined up until now, we’ll also be creating an association pointing to the ProductServiceECT. In other words, we’ll be defining a relationship between a database table and a web service.

  1. Open SPD.
  2. Click External Content Types.
  3. In the ribbon, click External Content Type.
  4. Call it ProductPartECT.
  5. Click the link Click here to discover external data sources and define operations.
  6. Click Add Connection.
  7. Select the SQL Server Data Source Type and click OK.
  8. Connect to the TestBCS database.
  9. Expand the TestBCS node in the Data Source Explorer.
  10. Expand the Tables node.
  11. Right-click the Product_Part table and choose Create All Operations.
  12. On the Operation Properties dialog, click Next twice.
  13. Click Finish.
  14. Right-click Product_Part and choose New Association.
  15. Click Browse.
  16. Select ProductServiceECT.
  17. Click OK.
  18. Click Next.
  19. In the Data Source Elements pane, select ProductID.
  20. Check the Map to Identifier checkbox.
  21. Click Next twice.
  22. Click Finish.
  23. Save the ECT.

We have created the new ECT; let’s move on and assign the appropriate permissions so you’ll be able to access it:

  1. Go to SCA > Application Management > Manage service applications.
  2. Click Business Data Connectivity Service.
  3. Select the ProductPartECT ECT and Set Permissions.

Now, you should be able to access the ECT. We’ll create an external list based on this ECT:

  1. Switch back to SPD.
  2. Click Lists and Libraries.
  3. Click External List.
  4. Select ProductPartECT and click OK.
  5. Enter the name: ProductPartList.
  6. Click OK.

Please note: Sometimes it takes a couple of minutes before changes are synchronized, so don’t worry too much if you get an access denied error at first.

You can check that the association is valid by adding a new item to the ProductPartList. You can open the external item picker and choose the product to which the product part belongs.


To better demonstrate the effects of an ECT association, we’ll use the Business Data List and Business Data Related List web parts, and connect them to each other to leverage the ECT association.

  1. Go to a web part page and choose Site Actions > Edit Page.
  2. Click Add a Web Part.
  3. Select the Business Data category.
  4. Click Business Data List and choose Add.
  5. Click Add a Web Part.
  6. Select Business Data Related List.
  7. Click Add.
  8. On the Business Data List web part, click Open the tool pane.
  9. Choose the following type: ProductServiceECT.
  10. Click OK twice.
  11. On the Business Data Related List, choose Open the tool pane.
  12. Choose the type: ProductPartECT.
  13. Click OK twice.
  14. On the ProductPartECT List web part, choose Connections > Get Related Item From > ProductServiceECT2 List.

If you click an item in the ProductServiceECT2 List web part, you’ll notice the ProductPartECT List web part updates too.


If you want, you can throw away the ECT association and observe what happens (it may take a few moments before the changes are synced):

So, that’s that. Let’s move on to things that are really important: the Commodore 64 classic Beach-Head:

Secure Store

Since there’s nothing to fix, we first need to break something (and then fix it again). Up until now, we’ve taken a shortcut when it comes to authentication. On our dev machine, it’s fine to authenticate as the current user, who happens to be the administrator, who happens to have access to the TestBCS database. In production it’s unthinkable (sorry, we couldn’t resist picturing what unthinkable looks like, if you were wondering what Samuel L. Jackson was doing in our article) that every user gets direct access to the database, not to mention the double hop issue. Btw, this is exactly what we are going to do: we won’t be mentioning the double hop issue again; if you want to learn more about that, you’re on your own.

What you should do is leverage the Secure Store Service (SSS) and map the credentials of end users to one or more accounts that do have access to the TestBCS database. In this section, we’ll map our Windows account to a SQL Server account (using another Windows account is fine too) that has access to the database. We’ve created a SQL Server user called testbcsuser. We won’t be describing how to add this user and give it appropriate permissions, because we’re getting near the end of the article and are getting tired.

Next, you need to create a new SSS target application that allows you to map client credentials to TestBCS database user credentials.

  1. Go to SCA > Application Management > Manage service applications > Secure Store Service.
  2. If you didn’t do it before, click the Generate New Key button in the ribbon.
  3. Click New to create a new target application.
  4. Choose as Target Application ID and Display Name: TestBCS.
  5. Enter an e-mail address.
  6. Click Next.
  7. Add two fields.
  8. Call the first field SQL User, Field Type User Name, unmasked.
  9. Call the second one SQL Password, Field Type Password, Masked.
  10. Delete both Windows User Name and Windows Password entries. The page should look like the figure shown below.
  11. Click Next.
  12. Add your account as the Target Application Administrator.
  13. Click OK.


You’ve created a new SSS target application. Now, you have to configure the ECTs that use the TestBCS database so that they’ll be using SSS to connect to it. In the next procedure, we’ll show how to do this for the ProductPartECT.

  1. Switch to SPD.
  2. Click External Content Types.
  3. Click ProductPartECT.
  4. Click TestBCS (located after External System).
  5. Choose the following Authentication Mode: Impersonate Custom Identity.
  6. Enter the following Secure Store Application ID: TestBCS.
  7. Click OK.
  8. If you get a dialog box informing you that this decision affects ECTs, choose Yes.

Let’s see what impact this has made. If you go to external lists based on the ProductPartECT, you will be asked to provide SQL credentials to access the TestBCS database. After you have provided these once, you won’t have to do it again. It’s also possible that the administrator provides the credentials for you, but we’ll talk about that later.

  1. Switch to the browser.
  2. Go to the ProductPartList.
  3. Click the new link Click here to authenticate.
  4. In our case, a warning appears stating that our web site doesn’t have a security certificate. This is OK, because it’s a dev machine.
  5. Click Continue to this Site.
  6. The default SSS set credentials page appears and asks you for a SQL User and SQL Password (as shown in the next Figure). Enter these and click OK.


From now on, the BCS infrastructure uses these credentials to log in to the TestBCS database, as long as the login keeps succeeding. If you, as the administrator of your farm, want to change these settings, you can go back to SCA and find the Target Application TestBCS on the SSS page.

  1. Check the TestBCS Target Application ID checkbox.
  2. In the ribbon, in the Credentials section, click the Set button.

You can enter a user name in the Credential Owner textbox to specify the credentials for a specific user. If you select a user and leave the credentials empty (and click OK), you’re effectively resetting that user’s settings for the ECTs using this SSS target application. As a result, the next time the end user goes to the ProductPartList External List, he will be asked to authenticate again.
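
By the way, credentials stored this way can also be retrieved programmatically, which is handy when you’re writing custom code against the external system. The sketch below is a hypothetical console sample (it has to run on the farm, within a SharePoint service context) that uses the ISecureStoreProvider interface to fetch the credentials mapped for the current user; TestBCS is the target application ID we created earlier, and error handling is omitted:

using System;
using Microsoft.BusinessData.Infrastructure.SecureStore;
using Microsoft.Office.SecureStoreService.Server;

namespace SecureStoreDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            // Get the default Secure Store provider for the current context.
            ISecureStoreProvider provider = SecureStoreProviderFactory.Create();

            // Retrieve the credentials mapped to the current user for the
            // TestBCS target application created earlier.
            using (SecureStoreCredentialCollection credentials =
                provider.GetCredentials("TestBCS"))
            {
                foreach (ISecureStoreCredential credential in credentials)
                {
                    // The actual values are SecureStrings; we only show the type.
                    Console.WriteLine(credential.CredentialType);
                }
            }
        }
    }
}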

Let’s show the final reward: Daley Thompson’s Decathlon. If you know the game, you’ll never forget the experience of playing your joystick until your hands are so sore. There has never been a party game like this, nor will there ever be one…


Of course, you have been on a little trip through time and looked at some really old (but not the oldest) video games. That in itself should have been worth the read, although we feel the other stuff may have been worthy of your time and attention too. BCS is developing into a mature and powerful product, and SPS 2010 is the first version where we feel you can’t afford not to know it.

More capacity planning links

Some more useful capacity planning links… They contain interesting information. For example, a stretched farm, where multiple data centers are located close to one another with high-bandwidth connectivity so that they can be configured as a single farm, must have less than 1 ms latency between SQL Server and the WFEs, and at least 1 Gbps connectivity:

Having said that though, apparently Microsoft has supported environments with latency levels up to 10 ms:

Also check out the following info:

Using external data in user profiles using BCS

When we set out to reuse data from a BCS external content type, we used a couple of resources that we want to share. First off, the next article was a great help during the setup of user profile synchronization:

Then, the following post was a nice walkthrough showing how to set up user profile imports using BCS:

Finally, when we saw that the import (using miisclient.exe) failed because of a missing anchor component, the following post was a life saver:

Claims in SharePoint 2010

When we set out to write this blog post, we had the intention of documenting the process of setting up a claims-aware SharePoint web application. After consulting numerous resources, we found out that it isn’t needed at all.

First off, if you want to read more about claims authentication, go to and read the claims-based authentication white paper.

If you want to set up claims authentication in SharePoint Server 2010 we found that the following post is the best resource: This post describes a visual way to set it up, and since it’s so easy to make tiny mistakes, it’s better to execute the process aided by visual tools.

We have three comments about the post that might help you though:

  1. You can also use the MembershipSeeder tool to set up test accounts; we feel it’s the best way to set up a larger number of test accounts. The MembershipSeeder tool can be found on CodePlex:
  2. Be sure to create a new SQL Server user account as specified in the blog post.
  3. Adjust the People Picker when you’re done following the steps in the blog post.

1. Setting up users via the MembershipSeeder tool

The MembershipSeeder tool is a client dedicated to communicating with the Membership database. It’s the best way to add larger numbers of test users to the Membership database.

Please note: Don’t get too excited about the comment that the MembershipSeeder tool has been superseded by CKS: FBA. If you follow that path, it will actually be more work for you. We like the fact that the MembershipSeeder tool directly accesses the membership database and works in a simple, no bells and whistles kinda way.

Before you’re able to use the MembershipSeeder tool, you need to download it (duh!) and change some configuration data:

  1. Download the MembershipSeeder tool, extract it, and go to the Debug folder.
  2. Open MembershipSeeder.exe.config (the tool’s configuration file) and locate the <connectionStrings> element.
    Notice that the connection string points at the wrong server: server=o12; database=aspnetdb; Trusted_Connection=True. Replace it; in our case it looks like this:

      <add name="AspNetSqlMembershipProvider" connectionString="Data Source=moon;Initial Catalog=MyAccounts;User ID=administrator;Password=*****" />

Please note that we’re using SQL Server authentication, as specified in the blog post, so while your data source and user credentials may differ, don’t change that aspect. It’s a great way to test if the SQL Server user account has sufficient permissions. Now, let’s add some roles and users:

  1. Double-click MembershipSeeder.exe. This opens the ASP.NET Membership Form window.
  2. We feel the UI is a bit messy (we’re not criticizing, we feel this is a very handy tool), so be careful. Check out the next Figure to get the idea.


  3. First, check the two checkboxes that restrict adding, creating, or deleting to only 1 user.
  4. The upper right part allows you to create new roles. Create two new roles: Developers and Managers.
  5. Then, click the Get Roles button to verify this has succeeded.
  6. The upper left part allows you to add new users. Create three new users: anton (with the password: anton), bill (pwd: bill), and daisy (pwd: daisy).
  7. After you’ve done that, add the users to roles in the upper right part by typing a role name in the Role textbox and typing the user name in the User Prefix textbox.
  8. Then, click the Add to Role button. Add anton and daisy to the Managers role. Click the Get Users button to verify this has succeeded.
  9. Now, add bill and daisy to the Developers role.

At this point we’ve created a Developers role with the members Bill and Daisy, and a Managers role with the members Anton and Daisy. This will give us some users to test with later.
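
As far as we can tell, MembershipSeeder is essentially a thin UI on top of the standard ASP.NET Membership and Roles APIs, so the clicks above roughly amount to the following sketch (assuming the AspNetSqlMembershipProvider is configured for the app and its password policy accepts these simple passwords):

using System.Web.Security;

class SeedAccounts
{
    static void Main()
    {
        // The roles and users from the walkthrough above.
        Roles.CreateRole("Developers");
        Roles.CreateRole("Managers");

        Membership.CreateUser("anton", "anton");
        Membership.CreateUser("bill", "bill");
        Membership.CreateUser("daisy", "daisy");

        Roles.AddUsersToRole(new[] { "anton", "daisy" }, "Managers");
        Roles.AddUsersToRole(new[] { "bill", "daisy" }, "Developers");
    }
}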

2. Create a new SQL Server user account

At first we used a SQL Server user account that we had created during installation. Apparently this is a special kind of user account that can’t be used to connect to the membership database. When you’re configuring the web.config files manually, this mistake is hard to detect. We liked the approach of creating a fresh SQL Server user account, even for development purposes.

3. Configure People Picker

If you want to be able to type parts of user names and have the People Picker autocomplete them, add the following line to the web.config file of every web application where you want People Picker autocompletion:

<add key="fbamembershipprovider" value="%" />

So that it looks like:

  <clear />
  <add key="AspNetSqlMembershipProvider" value="%" />
  <add key="fbamembershipprovider" value="%" />

What now?

If everything goes well, you should be able to log in using both Windows accounts and forms accounts. To check out which claims are associated with each user, first download and install the Windows Identity Foundation SDK. We’ll be using it within a web part, so you’ll need to install the SDK 3.5 version, because SPS 2010 doesn’t support .NET 4.

Now, create an empty SharePoint project in Visual Studio 2010, add a Visual Web Part, and place a Label on it. Add the following code, deploy the package to SPS 2010, and add the web part to a page:

using System;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using Microsoft.IdentityModel.Claims;

namespace LCWebParts.ClaimsVisualPart
{
    public partial class ClaimsVisualPartUserControl : UserControl
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            IClaimsIdentity cidentity = (IClaimsIdentity)Page.User.Identity;
            Label1.Text = String.Format("User name = {0}<br/>", cidentity.Name);
            foreach (Claim claim in cidentity.Claims)
            {
                Label1.Text += String.Format("type: {0} value = {1}<br/>", claim.ClaimType, claim.Value);
            }
        }
    }
}
It’s interesting to see the different sets of claims associated with different user accounts, and this allows you to make programmatic authorization decisions based on claims.
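
To make such an authorization decision, you could, for example, check whether a role claim with a specific value is present. The sketch below follows the same pattern as the web part above; the claim value Managers is just the role from our FBA setup and serves as a hypothetical example:

using System.Linq;
using System.Web.UI;
using Microsoft.IdentityModel.Claims;

public partial class ClaimsCheckUserControl : UserControl
{
    protected void Page_Load(object sender, System.EventArgs e)
    {
        IClaimsIdentity cidentity = (IClaimsIdentity)Page.User.Identity;

        // Hypothetical check: only enable manager functionality when the
        // user carries a role claim with the value "Managers".
        bool isManager = cidentity.Claims.Any(c =>
            c.ClaimType == ClaimTypes.Role && c.Value == "Managers");
    }
}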

Claims documentation

So far, this is the best resource we’ve found online about claims authentication and SharePoint 2010:

Number of logged on users in SharePoint

Nice: A very reasonable approach to finding the number of users currently logged on to SharePoint.

The SharePoint Language Store

We feel the SharePoint Language Store ( ) is an interesting idea for situations where you don’t want (or aren’t allowed) to roll out a new resource file every time there is a change in the UI’s multilingual text.