About AllowUnsafeUpdates…

Posted: January 4, 2016 in SharePoint

In short, here is how to deal with AllowUnsafeUpdates:

1) Don’t update SharePoint objects from your code-behind on GET requests; if you do, your code will be exploitable via cross-site scripting. If you understand the consequences of doing this and still want to do it, then read below about how to use the AllowUnsafeUpdates property.

2) If your code is processing a POST request, make sure you call SPUtility.ValidateFormDigest() before you do anything else. This ensures that the POST request is validated (that it is not a cross-site scripting attack), and after that you will not have to worry about AllowUnsafeUpdates, because its default value will be “true” once the form digest has been validated. To find out more about this, read the second part of this article.
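For example, a POST handler in a web part's code-behind might look like this (a minimal sketch; the handler name and the property key are illustrative):

```csharp
protected void BtnSave_Click(object sender, EventArgs e)
{
    // Validate the form digest first; if validation succeeds, AllowUnsafeUpdates
    // defaults to true for SPWeb objects created during the rest of this request.
    SPUtility.ValidateFormDigest();

    SPWeb web = SPContext.Current.Web;
    web.Properties["LastSaved"] = DateTime.UtcNow.ToString("o");
    web.Properties.Update();   // no AllowUnsafeUpdates juggling needed
}
```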

Microsoft’s idea behind introducing the AllowUnsafeUpdates property is to protect YOU from cross-site scripting attacks. It works like this: if your application is running in an HTTPContext (i.e. it’s a web part, for instance) and the request is a GET request, SharePoint will refuse to make any changes unless the value of AllowUnsafeUpdates is set to true, and by default it will be false for GET requests. If you try to make any updates to lists, webs or any SharePoint objects that require an SPSite to be created first, and you don’t set AllowUnsafeUpdates to true, you will get this exception:

System.Exception: Microsoft.SharePoint.SPException: The security validation for this page is invalid. Click Back in your Web browser, refresh the page, and try your operation again. —> System.Runtime.InteropServices.COMException (0x8102006D): The security validation for this page is invalid. Click Back in your Web browser, refresh the page, and try your operation again.

It is important to understand that if you are writing a class library, for example, your code will behave differently when called from a web application than when called from a rich client. In fact, if HTTPContext.Current is null, then AllowUnsafeUpdates will always be true. This is the case in rich clients, where no cross-site scripting is possible as there are simply no web requests.

Usually when you create your own SPSite or SPWeb objects, i.e. when you are not getting them from the SPContext (such as SPContext.Web), and you try to update anything such as web or list properties, list item metadata, etc., you may get the exception listed above. This is a clear indication that AllowUnsafeUpdates of the SPWeb is false, and this is preventing you from doing the update. This problem is easily resolved by setting AllowUnsafeUpdates of the parent web object to true. Still, sometimes even after you have done this you may get the same error. This is typically caused by one of the following reasons:

A) You have set AllowUnsafeUpdates to true for the wrong SPWeb

You have to be careful, because sometimes the ParentWeb of an object is not the same instance as the web you retrieved the object from. For example, when you call initialWeb.Lists[listId] you would expect the returned list’s ParentWeb to be the same instance as your initialWeb. However, this is not the case. So if somewhere later in your code you call list.ParentWeb.UpdateSomething(), it will not work, because you have never set the AllowUnsafeUpdates property of list.ParentWeb. You set it on your initialWeb, but even though this is the same web as the list’s parent web, they are different instances. Usually you see the error and then go and investigate in Reflector whether it is the same instance or not.
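A short sketch of the pitfall and its fix (the URL and list name are illustrative; the point is that list.ParentWeb and initialWeb are distinct SPWeb instances):

```csharp
using (SPSite site = new SPSite("http://spsite"))
using (SPWeb initialWeb = site.OpenWeb())
{
    initialWeb.AllowUnsafeUpdates = true;

    SPList list = initialWeb.Lists["Contacts"];

    // list.ParentWeb is a DIFFERENT SPWeb instance from initialWeb, so on a
    // GET request its AllowUnsafeUpdates is still false until set explicitly.
    list.ParentWeb.AllowUnsafeUpdates = true;

    list.Title = "Contacts (renamed)";
    list.Update();
}
```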

Alternatively, you could use another, more generic and clever way to deal with almost any similar situation, described in the following post:


The author suggests that you can set HttpContext.Current to null before you do your updates and then reassign the preserved initial value when done. This works great, but remember to set the HTTPContext to null as early as possible. In the post above, SharePoint probably uses site.RootWeb to make the updates to the site-scoped features, and the RootWeb’s AllowUnsafeUpdates hasn’t been set to true explicitly.
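A sketch of that technique (the feature ID is a hypothetical Guid; the essential part is the save/restore around the update):

```csharp
// Preserve the current context, clear it so AllowUnsafeUpdates defaults to
// true, do the updates, then ALWAYS restore the original context.
HttpContext preserved = HttpContext.Current;
try
{
    HttpContext.Current = null;   // SharePoint now behaves as in a rich client
    using (SPSite site = new SPSite("http://spsite"))
    {
        site.Features.Add(featureId);   // featureId is a hypothetical Guid
    }
}
finally
{
    HttpContext.Current = preserved;
}
```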

B) AllowUnsafeUpdates sometimes gets reset to false after you have set it to true

If we have a look at how the property is managed, it turns out that it is stored in the request object associated with every SPWeb (which is actually a COM object):

[SharePointPermission(SecurityAction.Demand, UnsafeSaveOnGet = true)]
private void SetAllowUnsafeUpdates(bool allowUnsafeUpdates)

This means that every time the request is reset, the property will also be reset to its default value. The m_Request member is modified when a new web is created, when the web is disposed, or when the SPWeb.Invalidate() method is called:

internal void Invalidate()
{
    if (this.m_Request != null)
    {
        if (this.m_RequestOwnedByThisWeb)
        {
            this.m_Request = null;
        }
    }
    this.m_bInited = false;
    this.m_bPublicPropertiesInited = false;
    this.m_Url = null;
}

So any operation that calls SPWeb.Invalidate() will reset AllowUnsafeUpdates to its default value. And for code running under an HTTPContext, i.e. web applications, this default value for a GET request will be false. I’ve looked up for you all the legitimate cases in which Invalidate() is called by the SharePoint object model. These cases are:

1) When the Name or the ServerRelativeUrl properties of the SPWeb are changed and then Update() is called. In this case AllowUnsafeUpdates is reset because with the change of these properties the URL of the web changes, and logically the request object changes too, as it will now point to a different URL.

2) When any object that implements ISecurable (those are SPWeb, SPList and SPListItem) breaks or reverts its role definition inheritance. This means that every time you call SPRoleDefinitionCollection.BreakInheritance(), BreakRoleInheritance(), ResetRoleInheritance() or set the value of HasUniquePerm, the AllowUnsafeUpdates property of the parent web will reset to its default value, and you may need to set it back to true in order to make further updates to the same objects.
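For instance (a sketch; the URL and list name are illustrative):

```csharp
using (SPSite site = new SPSite("http://spsite"))
using (SPWeb web = site.OpenWeb())
{
    web.AllowUnsafeUpdates = true;

    SPList list = web.Lists["Documents"];
    // Breaking role inheritance resets AllowUnsafeUpdates on the parent web.
    list.BreakRoleInheritance(true);

    web.AllowUnsafeUpdates = true;   // set it again before further updates
    list.Title = "Documents (restricted)";
    list.Update();
}
```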

3) In many cases, when the SharePoint object model catches an exception while you are retrieving any sort of data, the AllowUnsafeUpdates of the parent web is reset to false as a precaution against potential exploits. In those cases, however, the objects will be in an unknown state anyway after the request has been reset and the exception is re-thrown, so they are of no practical interest.

And finally, it is worth mentioning another related exception you may get when trying to update your SharePoint objects:

System.Exception: Microsoft.SharePoint.SPException: Cannot complete this action. Please try again. —> System.Runtime.InteropServices.COMException (0x80004005): Cannot complete this action.

This usually happens when some updates have been made to an object (usually SPSite, SPWeb or SPList) that clash with your changes, and SharePoint refuses to do the update. To recover from this situation, you simply need to create fresh copies of the SPSite and SPWeb objects and make the updates on objects retrieved from those fresh copies. And of course, don’t forget to set AllowUnsafeUpdates to true for the freshly created SPWeb if required.
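A sketch of that recovery (the URL and list name are illustrative):

```csharp
// Dispose the stale objects, then re-open the site and web before retrying.
using (SPSite freshSite = new SPSite("http://spsite"))
using (SPWeb freshWeb = freshSite.OpenWeb())
{
    freshWeb.AllowUnsafeUpdates = true;   // if still required (e.g. GET request)
    SPList freshList = freshWeb.Lists["Contacts"];
    freshList.Description = "Updated from a fresh SPWeb";
    freshList.Update();
}
```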

So you have probably noticed from part one of the article that the internal method that sets the AllowUnsafeUpdates property has quite an interesting name: SetIgnoreCanary(). A canary is a mechanism for protecting against stack buffer overflow attacks. The terminology is a reference to the historic practice of using canaries in coal mines: they would be affected by toxic gases earlier than the miners, thus providing a biological warning system. In SharePoint, the request canary is a unique pseudo-random value that protects you from cross-site scripting attacks. If you have ever examined the HTML source of your SharePoint pages, you have probably noticed the __REQUESTDIGEST hidden field. This is what is referred to as the canary, or the Form Digest, and it is used to verify that the request is genuine.

<input id="__REQUESTDIGEST" name="__REQUESTDIGEST" type="hidden" value="0x5DC31993EF285644A7C48FBFA2E6FB719CD7E9DB0922A329E97,19 May 2008 23:37:22 -0000" />

As you can see, this is nothing more than a hidden field set by the server and verified back by the server when the page is submitted. As documented by Microsoft: “The purpose of form digest validation is to help prevent security attacks where a user is tricked into posting data unknowingly to a server.”

The place where the Form Digest value is set is the WebPartPage.FormOnLoad() method:

private void FormOnLoad(object sender, EventArgs e)
{
    if (HttpContext.Current != null)
    {
        SPWeb contextWeb = SPControl.GetContextWeb(HttpContext.Current);
        if (contextWeb != null)
        {
            SPWebPartManager.RegisterOWSScript(this, contextWeb);
            if (this.Page.Items["FormDigestRegistered"] == null)
            {
                string bstrUrl = SPGlobal.GetVTIRequestUrl(this.Context.Request, null).ToString();
                SPStringCallback pFormCallback = new SPStringCallback();
                contextWeb.Request.RenderFormDigest(bstrUrl, pFormCallback);
                base.ClientScript.RegisterHiddenField("__REQUESTDIGEST", SPHttpUtility.NoEncode(pFormCallback.StringResult));
                this.Page.Items["FormDigestRegistered"] = true;
            }
        }
    }
}

The actual value of the __REQUESTDIGEST field is generated by the COM objects in OWSSVR.dll. After that, another method is called: FormDigest.RegisterDigestUpdateClientScriptBlockIfNeeded():

public static void RegisterDigestUpdateClientScriptBlockIfNeeded(Page page)
{
    if (SPContext.Current.Site.WebApplication.FormDigestSettings.Enabled)
    {
        double totalMilliseconds = SPContext.Current.Site.WebApplication.FormDigestSettings.Timeout.TotalMilliseconds;
        if (totalMilliseconds > 2147483647.0)
        {
            totalMilliseconds = 2147483647.0;
        }
        int num2 = Convert.ToInt32((double)(totalMilliseconds * 0.8));
        if (!page.ClientScript.IsOnSubmitStatementRegistered(typeof(FormDigest), "SPFormDigestUpdaterSubmitHandler"))
        {
            page.ClientScript.RegisterOnSubmitStatement(typeof(FormDigest), "SPFormDigestUpdaterSubmitHandler",
                "UpdateFormDigest('" + SPEncode.ScriptEncode(SPContext.Current.Web.ServerRelativeUrl) + "', " + num2.ToString(CultureInfo.InvariantCulture) + ");");
        }
    }
    ScriptLink.Register(page, "init.js", true, false);
}

There are a couple of interesting pieces of information in the code above. Firstly, the Form Digest value generated by the server can expire; by default this happens in 30 minutes. Secondly, the SPWebApplication.FormDigestSettings property can be used to change the form digest settings per web application. Those settings are persisted in the configuration database if you call SPWebApplication.Update(). The information provided in MSDN for the “Enabled” property is, however, not completely correct. MSDN says that Enabled “gets or sets a value that determines whether security validation is included with all form pages”. But my SharePoint code examination and tests showed that the Form Digest is always included regardless of the Enabled value. Further tests showed that setting Enabled to false will indeed disable the security validation, and you will not get the “The security validation for this page is invalid. Click Back in your Web browser, refresh the page, and try your operation again.” exception even though AllowUnsafeUpdates has a value of false. With Enabled left at true, when the digest expires (by default in 30 minutes) the user will not be able to submit the form and will get a security validation timeout exception when trying to do so.
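For example, the digest timeout can be changed per web application like this (a sketch; requires permissions to update the web application, and the URL is illustrative):

```csharp
SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://spsite"));
webApp.FormDigestSettings.Enabled = true;
webApp.FormDigestSettings.Timeout = TimeSpan.FromMinutes(60); // default is 30 minutes
webApp.Update(); // persists the settings in the configuration database
```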

Looking into the code of FormDigest.RegisterDigestUpdateClientScriptBlockIfNeeded(), we see that it registers a client script that calls the UpdateFormDigest() JavaScript function when the form is submitted. This JavaScript function calls the GetUpdatedFormDigest() method of the _vti_bin/sites.asmx web service and updates the form digest field value on the fly before the form is submitted back to the server. According to the preliminary documentation of the Sites Web Service Protocol released by Microsoft on 4 Apr 2008: “GetUpdatedFormDigest is used to request renewal of an expired security validation, also known as a message digest. The purpose of form digest validation is to help prevent security attacks where a user is tricked into posting data unknowingly to a server.” To generate the new digest, the web service simply creates a new instance of the FormDigest control and returns its DigestValue property value. So if the function is not called and the Form Digest is not updated, a security timeout exception will occur and users will have to refresh the page before they can submit it.

So the next question is where is the Form Digest validated? This is actually done in the SPWeb.ValidateFormDigest() method:

public bool ValidateFormDigest()
{
    HttpContext current = HttpContext.Current;
    if (current != null)
    {
        if (current.Items["FormDigestValidated"] == null)
        {
            if (!this.Request.ValidateFormDigest(this.Url, null))
            {
                return false;
            }
            current.Items["FormDigestValidated"] = true;
        }
        return true;
    }
    return true;
}

As the code above shows, the validation is done only once per web request and is then cached in the HTTPContext. Furthermore, when a new SPWeb or SPSite object is created, it also creates an internal SPRequest object from the HTTPRequest. And within the lifetime of a single HTTP request to your web part or web page, if the Form Digest has been successfully validated once, then the AllowUnsafeUpdates property will have a default value of true for all freshly created objects, and the updates will be considered safe by SharePoint, which also means you don’t have to set AllowUnsafeUpdates to do your job.

For some reason SharePoint doesn’t always call the ValidateFormDigest() method, and this is why the workaround of setting AllowUnsafeUpdates to true is used. But a much better and safer solution is to call this method yourself. The best way to do it is to call SPUtility.ValidateFormDigest() somewhere at the beginning of your POST request code-behind.

And finally, if you use custom-built ASPX pages in SharePoint that don’t inherit from WebPartPage, you can insert a FormDigest control in them, which will insert the Form Digest for you automatically and protect you from cross-site scripting. Just make sure to call ValidateFormDigest() and DON’T touch AllowUnsafeUpdates.
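A minimal sketch of such a page (the assembly version shown is the SharePoint 2007 one and may differ in your environment):

```aspx
<%-- Registers the SharePoint controls and emits the __REQUESTDIGEST field --%>
<%@ Register TagPrefix="SharePoint"
    Namespace="Microsoft.SharePoint.WebControls"
    Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<form id="form1" runat="server">
    <SharePoint:FormDigest ID="FormDigest1" runat="server" />
    <%-- your controls here --%>
</form>
```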

I hope that my investigation was more useful than confusing to you. Knowing more about how things work is always a key to building better and more secure applications.

Happy coding.


CAML queries are most commonly used for easily accessing data inside SharePoint.

The SharePoint object model provides several useful classes that use CAML queries to fetch the data you need (listed below). These classes are fine-tuned for specific scenarios.

  1. SPQuery
  2. ContentIterator
  3. SPSiteDataQuery
  4. PortalSiteMapProvider
  5. CrossListQueryCache and CrossListQueryInfo

Let’s explore each of the above and see how they can best be used, depending upon your requirements.


SPQuery is probably the most popular among SharePoint developers. Using the SPQuery class, we can execute a CAML query against an SPList instance to retrieve the items matching the query.

using (SPSite site = new SPSite("http://spsite")) {
    using (SPWeb web = site.OpenWeb()) {
        SPList list = web.Lists["Contacts"];
        SPQuery query = new SPQuery();

        // Define columns to fetch
        query.ViewFields = "<FieldRef Name=\"Title\" /><FieldRef Name=\"Email\" />";

        // Force fetching only the specified columns
        query.ViewFieldsOnly = true;

        query.Query = "<Where><Contains><FieldRef Name=\"Email\" />" +
            "<Value Type=\"Text\">@extremesharepoint.com</Value></Contains></Where>";

        // Define the maximum number of results for each page (like a SELECT TOP)
        query.RowLimit = 10;

        // Query for items
        SPListItemCollection items = list.GetItems(query);
        foreach (SPListItem item in items) {
            Console.WriteLine(item["Title"] + " : " + item["Email"]);
        }
    }
}

SPQuery can be used within any kind of application that queries SharePoint data: Windows applications or web applications. It is the best way to query a single list when the items returned by the query change frequently, and when you need real-time data in your results.

Below are a few points you should take into consideration to get the best performance out of SPQuery:

  • Always bound your SPQuery using RowLimit. An SPQuery without a RowLimit performs poorly and may fail on large lists. You should specify a RowLimit between 1 and 2000. You can also use paging in SPQuery if you want to retrieve more than 2000 items at a time.
  • Avoid query throttle exceptions. The maximum number of items you should retrieve in a single query or view in order to avoid performance degradation is 5000 (the query threshold). If your query returns more items than the configured query threshold, the query will be blocked and you will not get results.
  • Use indexed fields in the query where possible. If you query a field that is not indexed, and the resulting scan encounters more than 5000 items (the query threshold), your query will not return any results.
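When you do need more than 2000 items, the standard paging pattern with ListItemCollectionPosition looks roughly like this (a sketch; list is assumed to be an SPList obtained as in the example above):

```csharp
SPQuery query = new SPQuery();
query.Query = "<Where><IsNotNull><FieldRef Name='Title' /></IsNotNull></Where>";
query.RowLimit = 2000;   // page size

do
{
    SPListItemCollection page = list.GetItems(query);
    foreach (SPListItem item in page)
    {
        // process each item in the current page
    }
    // ListItemCollectionPosition is null when there are no more pages
    query.ListItemCollectionPosition = page.ListItemCollectionPosition;
} while (query.ListItemCollectionPosition != null);
```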

SharePoint Server 2010 provides a new class named ContentIterator that you can use to query lists without hitting throttle limits and hence can avoid receiving an SPQueryThrottleException. You should consider using ContentIterator if you need to run a query that will return more than 5,000 rows of data.

The ContentIterator object divides the list items into chunks and runs the query against one chunk of list data at a time. Each list item is passed to a callback method until the query is complete.

The following example demonstrates usage of the ContentIterator class:

    static int noOfItemsProcessed = 0;
    static int noOfErrors = 0;

    string camlQuery = @"<View><Query><Where><IsNotNull>
                        <FieldRef Name='Title' />
                        </IsNotNull></Where></Query></View>";

    ContentIterator iterator = new ContentIterator();
    SPQuery listQuery = new SPQuery();
    listQuery.Query = camlQuery;
    SPList list = SPContext.Current.Web.Lists["Tasks"];
    iterator.ProcessListItems(list, listQuery, ProcessItem, ProcessError);

public bool ProcessError(SPListItem item, Exception e)
{
    // process the error
    noOfErrors++;
    return true;
}

public void ProcessItem(SPListItem item)
{
    // process the item
    noOfItemsProcessed++;
}

ContentIterator will iterate through each item in the list and invoke the specified callback method ProcessItem to process the list items. If an error occurs while iterating the list, the ProcessError function is invoked.

To use ContentIterator efficiently, you should include one of the three OrderBy clauses:

  • ContentIterator.ItemEnumerationOrderById: Gets an OrderBy clause for a query that orders items by ID.
  • ContentIterator.ItemEnumerationOrderByPath: Gets an OrderBy clause that orders query items by URL.
  • ContentIterator.ItemEnumerationOrderByNVPField: Gets an OrderBy clause for a query that orders items by the NVP index used in the <Where> clause. It actually enables the index to be used.

By default, SharePoint adds an OrderBy clause that orders by content type, which ensures that folders are processed before list items. You should override this behavior with one of the three OrderBy clauses to take full advantage of indexed fields.

The following code example shows how to use the ContentIterator.ItemEnumerationOrderByNVPField clause. The example assumes that you are querying an indexed field.

SPList list = SPContext.Current.Web.Lists["Tasks"];
SPQuery query = new SPQuery();
query.Query = "<Where><Eq><FieldRef Name=\"IndexedFieldName\"/><Value Type=\"Text\">Sharepoint</Value></Eq></Where>"
    + ContentIterator.ItemEnumerationOrderByNVPField;
ContentIterator contentIterator = new ContentIterator();
contentIterator.ProcessListItems(list,
    query,
    delegate(SPListItem item)
    {
        // Work on each item.
    },
    delegate(SPListItem item, Exception e)
    {
        // Handle an exception that was thrown while iterating.
        // Return true so that ContentIterator rethrows the exception.
        return true;
    });

You can use SPSiteDataQuery when you want to query multiple lists within a site collection simultaneously. Like the SPQuery object, the SPSiteDataQuery object has Query and ViewFields properties. In addition, the SPSiteDataQuery object also has Lists and Webs properties. The example below uses SPSiteDataQuery to return all events from all calendar lists in the current site collection where the end date is later than today.

SPSiteDataQuery query = new SPSiteDataQuery();
query.Query = "<Where><Gt><FieldRef Name='EndDate'/><Value Type='DateTime'><Today OffsetDays=\"-1\"/></Value></Gt></Where>";
// Sets the list types to search. 106 is the calendar list template.
query.Lists = "<Lists ServerTemplate='106' />";
//Sets the Fields to include in results
query.ViewFields = "<FieldRef Name='Title' /><FieldRef Name='Location' />";
//Sets the scope of the query
query.Webs = @"<Webs Scope='SiteCollection' />";
//Define the maximum number of results for each page (like a SELECT TOP)
query.RowLimit = 10;
//Execute the query
DataTable table = SPContext.Current.Site.RootWeb.GetSiteData(query);

The Lists property is used to specify the lists within the site collection that should be included in the query. It can take several forms:

  • Setting the Lists property to <Lists ServerTemplate=[value]/> limits the query to lists of a certain server template. For example, type 106 is a calendar. By default, this attribute is null and the query is not limited to lists based on a particular template.
  • Setting the Lists property to <Lists BaseType=[value]/> limits the query to lists of a certain BaseType. By default, the query considers lists of BaseType 0 (generic lists).
  • Setting the Lists property to <Lists Hidden='true'/> includes hidden lists in the query. By default, the query considers all non-hidden lists.
  • Setting the Lists property to <Lists MaxListLimit=[value]/> limits the query to considering no more than the specified number of lists. If the query exceeds the limit, it fails with an SPException. By default, the limit is 1000. When set to 0, there is no limit to the number of lists that are considered (you should avoid setting the limit to 0).
  • You can also instruct the query to include only specific lists. For example, to include only two specific lists in the search, use <Lists><List ID="[list1GUID]" /><List ID="[list2GUID]" /></Lists>. The ID attribute identifies each list.

The Webs property is used to set the scope of the query:

  • Setting Webs property to  <Webs Scope=’SiteCollection’/>   includes all lists in the site collection.
  • Setting Webs property to  <Webs Scope=’Recursive’/>  includes only the lists in the current site or subsites beneath the current site.
A few important points to note about SPSiteDataQuery:
  • Like SPQuery, SPSiteDataQuery also throws an exception when the results exceed the number of items allowed by the MaxItemsPerThrottledOperation or MaxItemsPerThrottledOperationOverride property of SPWebApplication. So you should set the RowLimit property for optimum performance and to avoid throttle exceptions.
  • SPSiteDataQuery does not consider indexed columns, so using indexed columns in the query has no positive effect on performance. This behavior differs from SPQuery, which considers indexed column values and can achieve better performance.

PortalSiteMapProvider is the navigation site map provider for SharePoint. The main purpose of the PortalSiteMapProvider class is to help cache content for navigation.

Additionally, it is useful for aggregating data, as it provides cached queries and access to cached object stores. PortalSiteMapProvider also offers efficient management of the caching infrastructure for retrieving list data.

We can use the PortalSiteMapProvider.GetCachedListItemsByQuery method to query a list and also cache the query results. The PortalSiteMapProvider.GetCachedListItemsByQuery method first checks the cache to see whether the items already exist. If they do, the method returns the cached results. If not, it queries the list, stores the results in the cache and then returns them.

You can use the example method below in a web part or user control:

protected void TestPortalSiteMapProvider(HtmlTextWriter writer)
{
    SPWeb web = SPContext.Current.Web;
    SPQuery spquery = new SPQuery();
    spquery.Query = "<Where><IsNotNull><FieldRef Name='Title'/></IsNotNull></Where>";
    PortalSiteMapProvider provider = PortalSiteMapProvider.WebSiteMapProvider;
    PortalWebSiteMapNode node = (PortalWebSiteMapNode)provider.FindSiteMapNode(web.ServerRelativeUrl);
    SiteMapNodeCollection nodeCollec = provider.GetCachedListItemsByQuery(node, "Tasks", spquery, web);
    foreach (SiteMapNode smnode in nodeCollec)
    {
        writer.Write(smnode.Title + "<br/>");
    }
}

It should be noted that PortalSiteMapProvider requires an HTTPContext (SPContext) to work. So you cannot use it in scenarios where HTTPContext is null, for example console/Windows applications, timer jobs, etc.

The main advantage of using PortalSiteMapProvider is that it exploits the SharePoint object cache and hence provides efficient data access.

Apart from querying list items, PortalSiteMapProvider can also be used to aggregate information for sites, property bags, etc. The example below demonstrates how to use PortalSiteMapProvider to retrieve site property bag values for a specific key across the site collection. Since this information does not reside in a list, neither SPQuery nor SPSiteDataQuery can easily retrieve it.

protected void GetPropertyBagValues()
{
    PortalSiteMapProvider provider = PortalSiteMapProvider.CombinedNavSiteMapProvider;
    NameValueCollection sitenameAndImageUrl = new NameValueCollection();
    if (provider.CurrentNode != null && provider.CurrentNode.HasChildNodes)
    {
        foreach (PortalWebSiteMapNode node in provider.GetChildNodes(provider.CurrentNode))
        {
            // Retrieves a value from the site property bag by key, e.g. SiteImageUrl
            if (node.GetProperty("SiteImageUrl") != null)
            {
                sitenameAndImageUrl.Add(node.Title, (string)node.GetProperty("SiteImageUrl"));
            }
        }
    }
}

Using PortalSiteMapProvider is one of the best-performing data access techniques. However, you should be aware of certain limitations:

  • It cannot be used in Windows or console applications, since HTTPContext is null in these applications.
  • It can only be used with publishing sites in SharePoint Server, not with SharePoint Foundation.
  • PortalSiteMapProvider is most useful when the data you are retrieving does not change significantly over time. If you frequently retrieve different list items or data, PortalSiteMapProvider will constantly read from the database, insert data into the cache and then return the data. In that case we don’t benefit from PortalSiteMapProvider, as we are not reading from the cache; due to the additional work it performs (checking and inserting into the cache), we actually lose performance.
  • PortalSiteMapProvider uses the site collection object cache to store data. By default, the object cache is limited to 100 megabytes (MB). Hence, the amount of memory PortalSiteMapProvider can use may be limited.

[Note: You can increase the size of the site collection object cache from the Object cache settings page in the site collection. However, note that the amount of memory assigned to the object cache comes out of the shared memory available to the application pool. Therefore, you should only increase the limit carefully, after ensuring you have that much memory to spare.]

CrossListQueryCache and CrossListQueryInfo

CrossListQueryCache and CrossListQueryInfo provide a very scalable way to run cross-site queries, like SPSiteDataQuery. Unlike SPSiteDataQuery, CrossListQueryCache.GetSiteData() uses the cache and hence has better performance, provided you use the correct overloaded version of the method.

The CrossListQueryCache object uses the CrossListQueryInfo object to get the cached results or, if there are no cached results available, it performs a cross-list query against the database and then caches the results for future use. Audience targeting is then applied to the result set, depending on the settings specified in the CrossListQueryInfo object. You can use the CbqQueryCache object to obtain a CrossListQueryInfo object for a specific Content by Query Web Part.

| Overloaded method | Description | Uses cache |
|---|---|---|
| GetSiteData(SPSite) | Retrieves the cached data based on the CrossListQueryInfo specification. | YES |
| GetSiteData(SPWeb) | Retrieves the data from the SPWeb object. | NO |
| GetSiteData(SPSite, String) | Retrieves the cached data from the SPSite and the specified web URL. | YES |
| GetSiteData(SPWeb, SPSiteDataQuery) | Retrieves the data by using the specified SPSiteDataQuery object. | NO |

If you don’t use a CrossListQueryCache.GetSiteData() overload that supports caching, then it is better to go for SPSiteDataQuery instead. Below is an example that uses a cached version:

protected DataTable TestCrossListQueryCache()
{
    CrossListQueryInfo crossListQueryInfo = new CrossListQueryInfo();
    crossListQueryInfo.Query = "<Where><IsNotNull><FieldRef Name='Title'/></IsNotNull></Where>";
    crossListQueryInfo.ViewFields = "<FieldRef Name=\"Title\" />";
    crossListQueryInfo.Lists = "<Lists BaseType=\"0\" />";
    crossListQueryInfo.Webs = "<Webs Scope=\"SiteCollection\" />";
    crossListQueryInfo.UseCache = true;
    CrossListQueryCache crossListQueryCache = new CrossListQueryCache(crossListQueryInfo);
    DataTable dt = crossListQueryCache.GetSiteData(SPContext.Current.Site);
    return dt;
}

Like PortalSiteMapProvider, CrossListQueryCache and CrossListQueryInfo also need an HTTPContext to work. Hence they cannot be used in Windows or console applications. For their usage, you should consider the same points mentioned above for PortalSiteMapProvider.

KeywordQuery Class
To develop custom search web parts or applications that support a search-by-keyword scenario, the SharePoint Query object model exposes the KeywordQuery class.
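A minimal sketch of a KeywordQuery call (assumes a SharePoint Server 2010-style search API with search provisioned; the URL and query text are illustrative):

```csharp
using (SPSite site = new SPSite("http://spsite"))
{
    KeywordQuery keywordQuery = new KeywordQuery(site);
    keywordQuery.QueryText = "SharePoint";
    keywordQuery.ResultTypes = ResultType.RelevantResults;
    keywordQuery.RowLimit = 10;

    // Execute the query and load the relevant results into a DataTable
    ResultTableCollection results = keywordQuery.Execute();
    ResultTable relevant = results[ResultType.RelevantResults];

    DataTable table = new DataTable();
    table.Load(relevant, LoadOption.OverwriteChanges);
}
```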

PortalSiteMapProvider gives you the power of SPQuery along with a cache for better performance. CrossListQueryCache gives you the power of SPSiteDataQuery along with a cache for better performance. Whether to use the cache or not depends on the kind of data you want to query: if your query returns frequently changing data sets, then caching is actually an overhead and can in turn cause a performance hit.

ContentIterator is good to use only when you want to process more than 5000 items, which is the default query threshold limit.

SharePoint 2013 introduces several new columns, two of the more interesting ones being:

  • GeoLocation
  • Related Items (you will find this column in the Task list)


SharePoint 2013 introduces a new column type, “Geolocation”, for integrating location and map features. The new Geolocation column helps you specify a location using a Bing map. By default this column is not visible in the UI. You can also add this column through code; a sample is available on MSDN: http://code.msdn.microsoft.com/office/SharePoint-2013-Add-a-d3fa8288/sourcecode?fileId=67748&pathId=1981093371
The link above shows how to create the Geolocation column using the client object model.

private static void AddGeolocationField()
{
    // Replace "http://Mysite" with your SharePoint server name
    ClientContext context = new ClientContext(@"http://Mysite");
    // Replace the list title with a valid list name
    List oList = context.Web.Lists.GetByTitle("GeoLocationlist");

    // DisplayName will be displayed as the name of the newly added Location field on the list
    oList.Fields.AddFieldAsXml("<Field Type='Geolocation' DisplayName='Location'/>",
        true, AddFieldOptions.AddToAllContentTypes);
    oList.Update();
    context.ExecuteQuery();
}

Creating a view from that list (SharePoint 2013 adds a new view type called “Map view”):

After creating the view you will see the screen below.

[Screenshot: Map view]
Related Items

The Related Items column can be found inside the Task list.

Related Items is a hidden column and is not available among the site columns. To add it to your custom list you need to change the column’s Hidden property; please read this blog post on “Add the related items to the site column”.
The “Related Items” column is not available on the new item form.
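A sketch of flipping that Hidden property from the server object model (site URL and list title are assumptions; the field’s CanToggleHidden flag is false, so the commonly blogged workaround calls the internal SetFieldBoolValue method via reflection, which is itself an assumption about internals):

```csharp
// Hypothetical sketch: unhiding the built-in Related Items field on a task list.
using System;
using System.Reflection;
using Microsoft.SharePoint;

using (SPSite site = new SPSite("http://Mysite"))
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Tasks"];
    SPField field = list.Fields.GetFieldByInternalName("RelatedItems");

    // Allow the Hidden property to be toggled, then unhide the field.
    MethodInfo setBool = typeof(SPField).GetMethod(
        "SetFieldBoolValue", BindingFlags.NonPublic | BindingFlags.Instance);
    setBool.Invoke(field, new object[] { "CanToggleHidden", true });

    field.Hidden = false;
    field.Update();
}
```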


But when you view an item from All Items, you will be able to see the Add Related Items button.


View item form



A modal pop-up appears, letting you choose something in the site collection (any asset) to link to. You can add multiple items to the field, but the Related Items window does not allow you to select more than one item at a time.
I selected the home page as a related item.

AddRelatedItems button

Some of the other columns

  • ParentID
  • AppAuthor
  • AppEditor
  • NoCrawl
  • PrincipalCount
  • Checkmark
  • RelatedLinks
  • MUILanguages
  • ContentLanguages
  • UserInfoHidden
  • IsFeatured
  • DisplayTemplateJSTemplateHidden
  • DisplayTemplateJSTargetControlType
  • DisplayTemplateJSIconUrl
  • DisplayTemplateJSTemplateType
  • DisplayTemplateJSTargetScope
  • DisplayTemplateJSTargetListTemplate
  • DisplayTemplateJSTargetContentType
  • DisplayTemplateJSConfigurationUrl
  • DefaultCssFile
  • RelatedItems
  • PreviouslyAssignedTo

SharePoint 2013 Minimal Download Strategy (MDS) is a new feature introduced by Microsoft that improves the end-user experience. I recently checked the master page and found an interesting tag used at the top of most of the placeholders: the SharePoint:AjaxDelta tag.

SharePoint:AjaxDelta wraps all our favorite delegate controls and other elements in the master page. What this AjaxDelta control does is download to the client (browser) only what has changed since the previous download; if nothing has changed, nothing is downloaded, Ajax style.
The following AjaxDelta controls are available in the master page.

  1. DeltaPlaceHolderAdditionalPageHead
  2. DeltaSPWebPartManager
  3. DeltaSuiteLinks
  4. DeltaSuiteBarRight
  5. DeltaSPNavigation
  6. DeltaWebPartAdderUpdatePanelContainer
  7. DeltaTopNavigation
  8. DeltaSearch
  9. DeltaPlaceHolderPageTitleInTitleArea
  10. DeltaPlaceHolderPageDescription
  11. DeltaPlaceHolderLeftNavBar
  12. DeltaPlaceHolderMain
  13. DeltaFormDigest
  14. DeltaPlaceHolderUtilityContent
When you click any page you will see the start.aspx page, followed by # and then your site page URL, for example http://xxx/_layouts/15/start.aspx#/SitePages/Home.aspx. The start.aspx page is responsible for rendering the delta changes.

The Minimal Download Strategy is enabled by default on Team sites and Community sites, but MDS is not enabled on publishing sites. MDS is a web-scoped feature.
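Since it is a web feature, it can be toggled like any other; a sketch using the SharePoint Management Shell (the feature name "MDSFeature" and the site URL are assumptions):

```powershell
# Hypothetical sketch: toggling MDS on a web from the SharePoint Management Shell.
Enable-SPFeature -Identity "MDSFeature" -Url "http://Mysite"
Disable-SPFeature -Identity "MDSFeature" -Url "http://Mysite"
```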

What is a Delegate Control?

  1. Delegate controls in SharePoint allow branding or substitution of common elements without altering the master page.
  2. For a given control ID, the registered substitute control with the lowest sequence number wins. (Microsoft’s default sequence is 100, so your sequence must be lower than that for your control to be loaded instead.)
  3. Parameters can be passed to the control via the declaration: http://msdn.microsoft.com/en-us/library/ms463169.aspx

Types of Delegate Controls

  1. Multi delegate (the delegate control loads more than one user/server control.)
  2. Single delegate

Multi delegate: if the delegate declaration contains the AllowMultipleControls="true" attribute in the markup, it is a multi delegate control. It loads all registered user/server controls, ordered by sequence number.

Single delegate: if the delegate declaration does not have the AllowMultipleControls="true" attribute, the delegate is replaced by only the single registered user control with the lowest sequence number.

HTML Example of Delegate control

<SharePoint:DelegateControl id="ID_SuiteLinksDelegate" ControlId="SuiteLinksDelegate" runat="server" />

The following is the list of delegate controls available in SharePoint 2010; the last three, marked (2013), are newly added in SharePoint 2013.

  1. AdditionalPageHead
  2. GlobalSiteLink0
  3. GlobalSiteLink2
  4. GlobalSiteLink3
  5. PublishingConsole
  6. PageHeader
  7. TopNavigationDataSource
  8. TreeViewAndDataSource
  9. PageFooter
  10. QuickLaunchDataSource
  11. SmallSearchInputBox
  12. GlobalNavigation
  13. SuiteBarBrandingDelegate (2013)
  14. SuiteLinksDelegate(2013)
  15. PromotedActions(2013)

I have created a solution that implements SuiteBarBrandingDelegate, SuiteLinksDelegate and PromotedActions.

SuiteBarBrandingDelegate: this changes the top-left branding bar. In my example I replace “SharePoint” with “My SharePoint Site”.

SuiteLinksDelegate: Replace the left top links bar with custom links. As showing in the below images. Replace “Newsfeed, SkyDrive, Sites” with “About Us, Contact Us, Feedback” links.

PromotedActions: this is a multi delegate control. I added a “Facebook” link between the “Share” and “Follow” links, as shown below.

Feature Element file
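The element file registering the three delegate controls might look like the sketch below (paths and file names are assumptions; the Sequence values sit below the out-of-box default of 100 so these controls win):

```xml
<!-- Hypothetical sketch of the feature element file for the three delegates. -->
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Control Id="SuiteBarBrandingDelegate" Sequence="90"
           ControlSrc="~/_ControlTemplates/15/DelegatesDemo/BrandingControl.ascx" />
  <Control Id="SuiteLinksDelegate" Sequence="90"
           ControlSrc="~/_ControlTemplates/15/DelegatesDemo/SuiteLinksControl.ascx" />
  <Control Id="PromotedActions" Sequence="90"
           ControlSrc="~/_ControlTemplates/15/DelegatesDemo/PromotedActionsControl.ascx" />
</Elements>
```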

Header before

After applying the delegate controls

Visual Studio Project

Many developers don’t even know the power of delegate controls.

There are different master pages available in SharePoint 2013, so you may not find all of the above delegate controls in every master page; for example, the “v4.master” page does not contain the “PromotedActions” delegate control, which is only available in the “oslo.master” page.

In SharePoint 2013, SharePoint:AjaxDelta wraps the delegate controls and other elements in the master page, so delegate controls now fall under the SharePoint 2013 Minimal Download Strategy (MDS).

The code is attached to this post.

Result Source: SharePoint 2013 comes with result sources, “which are a replacement for SharePoint scopes and federated locations”.

A result source applies a query transform to restrict results to a subset of the corpus. In SharePoint 2010, scopes could only be created at the application level; to overcome this limitation, result sources can be created at the application level, site collection level and site level, so every site owner can create them at the appropriate scope.

Site Level

Site Collection Level

Application Level
So a site collection result source is available to all of its subsites, and a site can have its own result sources. Once a result source is created, the user creates a new search vertical and updates the core results web part.
The following are the steps to create a result source.

  • Go to Settings -> Site Settings -> under Search, click Result Sources -> click New Result Source.

  • Give the result source a name.
  • In the Protocol section, select Local SharePoint Search.
  • Click the “Launch Query Builder” button; the dialog shown in the image below will be displayed.
  • The query builder helps you generate the query.

  • Use the keyword filter to add keywords, and the property filter to pick from the list of managed properties; click “Add Keyword Filter” / “Add Property Filter” to add them to your search query. To test the query, click the Test Query button and the preview window displays the search results. There are two further tabs: one for sorting the search results, and a Test tab for executing the query. After creating the query, click [OK].
  • Select the authentication option and click [OK]; the result source is created. “Document DOX FileSize” below is the name of the result source I just created.

  • The image above displays two result sources: the first is the default provided by SharePoint, and the second is the result source at site level.
  • Now that the result source is created, the next step is to create a new page.
  • I created the new page from Settings -> “Add a page”. The page is named “DocOnly”.
  • Next, the search results web part needs to be updated.
  • Edit the Search Results web part, click the “Search Criteria” section, and click the “Change query” button. The query builder opens in a popup window.

  • At the top, select the result source that we created, named “Document DOX FileSize”.

  • Click [OK] to apply the settings to the web part and publish the page; the expected search results will now be displayed.
  • Next we need to create the “Search Vertical”.
  • Click Settings -> Site Settings -> under Search, click Search Settings.
  • Under the Configure Search Navigation section, add a new search vertical called “Onlydocs”.
  • Click [OK] to save the search configuration.
  • The result source is configured successfully.
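For reference, a result source that narrows results to Word documents might use a query transform along these lines (a hypothetical example; {searchTerms} is the token the query builder inserts for the user’s query, and fileextension is a managed property):

```text
{searchTerms} fileextension:docx
```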

The following are the software boundaries and limits of SharePoint.

| Limit | Maximum value | Limit type |
|-------|---------------|------------|
| Web application | 20 per farm | Supported |
| Zone | 5 per web application | Boundary |
| Managed path | 20 per web application | Supported |
| Solution cache size | 300 MB per web application | Threshold |
| Application pools | 20 per farm | Supported |
| Number of content databases | 500 per farm | Supported |
| Content database size (general usage scenarios) | 200 GB per content database | Supported |
| Content database size (all usage scenarios) | 4 TB per content database | Supported |
| Content database size (document archive scenario) | No explicit content database limit | Supported |
| Content database items | 60 million (includes documents and items) | Supported |
| Site collections per content database | 10,000 | Supported |
| Site collections per farm | 750,000 | Supported |
| Web site | 250,000 per site collection | Supported |
| Site collection size | Max size of the content database | Supported |
| List row size | 8,000 bytes per row | Boundary |
| File size | 2 GB | Boundary |
| Documents | 30,000,000 per library | Supported |
| Major versions | 400,000 | Supported |
| Minor versions | 511 | Boundary |
| Items | 30,000,000 per list | Supported |
| Row size limit | 6 table rows internal to the database used for a list or library item | Supported |
| Bulk operations | 100 items per bulk operation | Boundary |
| List view lookup threshold | 8 join operations per query | Threshold |
| List view threshold | 5,000 | Threshold |
| List view threshold for auditors and administrators | 20,000 | Threshold |
| Subsite | 2,000 per site view | Threshold |
| Coauthoring in Word and PowerPoint for .docx, .pptx and .ppsx files | 10 concurrent editors per document | Threshold |
| Security scope | 1,000 per list | Threshold |