Sunday, December 20, 2015

Validating the People Picker in SharePoint with jQuery


--------------------------------------------------------------------------------------

// Peoplepicker name: pplName




function CheckPeoplePickerIsEmpty()
{
    // Resolved entries inside the people picker get the "ms-entity-resolved" class.
    var value = $(".pplName span.ms-entity-resolved").attr("title");

    if (value == undefined) {
        // Print the validation message into the #pplmsg span.
        $('#pplmsg').text("Please select people name.");
    }
    else {
        $('#pplmsg').text("");
    }
}


=======================================================================

If the button click is not being picked up by the jQuery click handler, try wrapping the handler registration in $(document).ready. See below:

------------------------------------------------------------------------------------------------------------
$(document).ready(function () {
    $('#button').click(function (event) {
        // Ensure that the SP.UserProfiles.js file is loaded before the custom code runs.
        // SP.SOD.executeOrDelayUntilScriptLoaded(getCurrentUser, 'SP.UserProfiles.js');
        getCurrentUser();
    });
});
----------------------------------------------------------------------------------------------------------

Querying List Items from Large Number of Sites in SharePoint



Source Link: http://www.vrdmn.com/2012/11/querying-list-items-from-large-number.html

When scouting the web for information on working with SharePoint large lists, you can find many articles which deal with fetching a huge number of items from one particular list, but very little about fetching items from a large number of sub sites. So after a little bit of poking around, I decided to blog about some of my findings here:

The Scenario:

Here are the conditions on which I was testing:
  • 1 Site Collection
  • 500 Sub sites
  • 1 Task List in each sub site -> 500 Lists
  • 10 items in each List -> 5000 List Items

 So the total count of items I had to query was about 5000 and according to the test conditions, the items which would match the query were not more than 1200 at a time.

The Tools:

The tools I was using for measuring the performance were nothing extraordinary:

1) I was using the Stopwatch class from the System.Diagnostics namespace. This class provides a fairly simple and easy mechanism for recording the time a particular operation takes to execute.
This MSDN link has excellent examples on how to use the Stopwatch class for performance measuring.

2) The Developer Dashboard has always been my go-to tool for performance measuring. I don’t know how I used to get by before I started using it. It provides a wealth of information about the page load: the time taken, the database calls made, the stack trace and a whole lot of other very useful information. A good tutorial on the Developer Dashboard can be found here.

SPSiteDataQuery:

The SPSiteDataQuery class is the heart of the architecture when you want to get data from multiple sites. This class by itself does not use any form of caching and always returns data based on real-time queries. So even if it takes a bit longer to fetch the data, it is guaranteed that you will get all the current results and your users will never have to wait to see their new items returned by the query.

Here is the code for doing a simple query with the SPSiteDataQuery class:

SPSiteDataQuery query = new SPSiteDataQuery();

//Return only the Title and DueDate fields.
query.ViewFields = "<FieldRef Name=\"Title\" /><FieldRef Name=\"DueDate\" />";

//Tasks assigned to the current user which are not yet completed.
//(The CAML markup was stripped from the original page; this is a reconstruction of its intent.)
query.Query = @"<Where>
                  <And>
                    <Eq>
                      <FieldRef Name='AssignedTo' />
                      <Value Type='User'>" + SPContext.Current.Web.CurrentUser.Name + @"</Value>
                    </Eq>
                    <Neq>
                      <FieldRef Name='Status' />
                      <Value Type='Text'>Completed</Value>
                    </Neq>
                  </And>
                </Where>";

//Query only the Tasks list (ServerTemplate 107) in each web.
query.Lists = "<Lists ServerTemplate=\"107\" MaxListLimit=\"0\" />";
/*Specifying the MaxListLimit as 0 means that there is no limit on how many lists
in the site collection will be queried. If you want, you can limit this number to
increase your performance.*/

query.Webs = "<Webs Scope=\"Recursive\" />";

//Specifying the row limit will limit the number of items which will be returned.
//query.RowLimit = 100;

DataTable results = SPContext.Current.Web.GetSiteData(query);


Here is a stack trace of the internal methods which are called by the SharePoint framework when a SPSiteDataQuery is used:


So as you can see, it calls the SPRequest.CrossListQuery method which internally makes queries to the Database to fetch the relevant results.

When querying the database, the procedure proc_EnumListsWithMetadata is used. You can have a look at this procedure in your Content DB. It queries several tables such as dbo.AllLists, dbo.AllWebs, etc. to fetch the relevant results.

Time taken to query 5000 items in 500 sub sites and return 1200 matching items:

 650ms average on each load.

CrossListQueryInfo:

The CrossListQueryInfo class is another mechanism you can use to fetch List Items from multiple sites. This class internally uses the SPSiteDataQuery class to actually fetch the items from the database, and when the items are returned, it stores them in the object cache of the Publishing Infrastructure. When any subsequent calls for the same data are made, the data is returned from the cache itself without making any more trips to the database.

The working of the CrossListQueryInfo class largely depends on the object cache of the Publishing Features of SharePoint server. So you cannot use this class in SharePoint 2010 Foundation or in sandbox solutions. Also, the default expiry time of the object cache is set to 60 seconds. So you might want to change that time depending upon your environment requirements.

Here is the same code for using the CrossListQueryInfo class:

CrossListQueryInfo query = new CrossListQueryInfo();

query.ViewFields = "<FieldRef Name=\"Title\" /><FieldRef Name=\"DueDate\" />";

//Tasks assigned to the current user which are not yet completed.
//(The CAML markup was stripped from the original page; this is a reconstruction of its intent.)
query.Query = @"<Where>
                  <And>
                    <Eq>
                      <FieldRef Name='AssignedTo' />
                      <Value Type='User'>" + SPContext.Current.Web.CurrentUser.Name + @"</Value>
                    </Eq>
                    <Neq>
                      <FieldRef Name='Status' />
                      <Value Type='Text'>Completed</Value>
                    </Neq>
                  </And>
                </Where>";

//Tasks Lists
query.Lists = "<Lists ServerTemplate=\"107\" MaxListLimit=\"0\" />";
query.Webs = "<Webs Scope=\"Recursive\" />";
//query.RowLimit = 100;

//Make sure to set this property to true.
query.UseCache = true;

CrossListQueryCache cache = new CrossListQueryCache(query);

//Make sure to use one of the overloads of the GetSiteData method which takes in the SPSite parameter
//and not the SPWeb parameter.
DataTable results = cache.GetSiteData(SPContext.Current.Site);


Make sure to set CrossListQueryInfo.UseCache to true if you want to use the caching features. Another very important thing to mention is that there are 4 overloads of the CrossListQueryCache.GetSiteData method and only 2 of them support caching.
So only use the overloads which accept an SPSite object as one of the parameters if you want to use caching in your code.
The Stack Trace of the CrossListQueryInfo class looks like this:


So as you can see, the Publishing.CachedArea is queried first to check whether the items exist in the cache. If they don’t exist, then a call to the SPSiteDataQuery is made, which fetches the values from the database and stores them in the cache. All subsequent calls will find that the items are present in the cache, so no more calls with the SPSiteDataQuery class will be made.

As a result, the very first call will take longer than a vanilla SPSiteDataQuery call as under the hood, the CrossListQueryInfo is not only fetching the items but also building a cache with them.

Time taken to query 5000 items in 500 sub sites and return 1200 matching items:
 2000ms on first load and 30ms average on each subsequent load until the object cache expires.

PortalSiteMapProvider:

The PortalSiteMapProvider is a class which can be used to generate the navigation on SharePoint Publishing sites. The Global navigation, the Quick Launch and the Breadcrumb navigation can all be generated with the help of the PortalSiteMapProvider. It also provides methods to query sub sites, lists and list items with the help of caching.

The main advantage of the PSMP is that it queries the SharePoint change log to check whether any changes have happened to the data being queried. If yes, then only the incremental changes are fetched and thus the cache is updated accordingly.

However, my tests showed that the PortalSiteMapProvider.GetCachedSiteDataQuery method which is used to get items from multiple sub sites does not maintain an incremental cache and it only fetches the new or updated items when the object cache has expired.

So essentially when querying for items from multiple sites, the CrossListQueryInfo and the PortalSiteMapProvider behave almost similarly.

Here is the sample code for the PortalSiteMapProvider:

SPSiteDataQuery query = new SPSiteDataQuery();

query.ViewFields = "<FieldRef Name=\"Title\" /><FieldRef Name=\"DueDate\" />";

//Tasks assigned to the current user which are not yet completed.
//(The CAML markup was stripped from the original page; this is a reconstruction of its intent.)
query.Query = @"<Where>
                  <And>
                    <Eq>
                      <FieldRef Name='AssignedTo' />
                      <Value Type='User'>" + SPContext.Current.Web.CurrentUser.Name + @"</Value>
                    </Eq>
                    <Neq>
                      <FieldRef Name='Status' />
                      <Value Type='Text'>Completed</Value>
                    </Neq>
                  </And>
                </Where>";

//Tasks Lists
query.Lists = "<Lists ServerTemplate=\"107\" MaxListLimit=\"0\" />";
query.Webs = "<Webs Scope=\"Recursive\" />";
//query.RowLimit = 100;

SPWeb curWeb = SPContext.Current.Web;
PortalSiteMapProvider ps = PortalSiteMapProvider.CurrentNavSiteMapProviderNoEncode;
PortalWebSiteMapNode pNode = ps.FindSiteMapNode(curWeb.ServerRelativeUrl) as PortalWebSiteMapNode;
DataTable results = ps.GetCachedSiteDataQuery(pNode, query, curWeb);


The stack trace for the PortalSiteMapProvider:


You can see that it’s very similar to the CrossListQueryInfo.

Time taken to query 5000 items and return 1200 matching items:
 2000ms on first load and 30ms average on each subsequent load until the object cache expires



So these are some of the methods you can use to query multiple List Items in multiple sites. Hope you had a good time reading through the post. 

Happy SharePointing!


Thursday, August 27, 2015

Understanding "AllowUnsafeUpdates" and using it the best way


The best way to deal with AllowUnsafeUpdates:

Source Url:  https://hristopavlov.wordpress.com/2008/05/16/what-you-need-to-know-about-allowunsafeupdates/
-----------------------------------------------------------------------------------------------------------

1) Don’t update SharePoint objects from your code behind on GET requests; if you do so, your code will be exploitable via cross-site scripting. If you understand the consequences of doing this and still want to do it, then read below about how to use the AllowUnsafeUpdates property.
2) If your code is processing a POST request then make sure you call SPUtility.ValidateFormDigest() before you do anything else. This will ensure that the post request is validated (that it is not a cross-site scripting attack) and after that you will not have to worry about AllowUnsafeUpdates, because its default value will be “true” after the form digest is validated. 
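
A minimal sketch of point 2 in a hypothetical button click handler (the handler name, list title and field values are illustrative, not from the original post):

// Requires the Microsoft.SharePoint and Microsoft.SharePoint.Utilities namespaces.
protected void btnSave_Click(object sender, EventArgs e)
{
    // Validate the form digest first; once the digest is validated,
    // AllowUnsafeUpdates defaults to true for the rest of this POST request.
    SPUtility.ValidateFormDigest();

    SPWeb web = SPContext.Current.Web;
    SPListItem item = web.Lists["Tasks"].AddItem();   // "Tasks" is a placeholder list title
    item["Title"] = "Created from a validated POST request";
    item.Update();
}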

The Microsoft idea behind introducing the AllowUnsafeUpdates property is to protect YOU from cross-site scripting attacks. The way this works is that if your application is running in an HttpContext (i.e. it’s a web part, for instance) and the request is a GET request, then SharePoint will refuse to do any changes unless the value of AllowUnsafeUpdates is set to true, and by default it will be false for GET requests. If you try to do any updates to lists, webs or any SharePoint objects that require an SPSite to be created first, and if you don’t set AllowUnsafeUpdates to true, you will get this exception:
System.Exception: Microsoft.SharePoint.SPException: The security validation for this page is invalid. Click Back in your Web browser, refresh the page, and try your operation again. —> System.Runtime.InteropServices.COMException (0x8102006D): The security validation for this page is invalid. Click Back in your Web browser, refresh the page, and try your operation again.
It is important to understand that if you are writing a class library, for example, your code will behave differently when called from a web application and when called from a rich client. Actually, if HttpContext.Current is null then AllowUnsafeUpdates will always be true. This is the case in rich clients, where no cross-site scripting is possible as there are simply no web requests.
Usually when you create your own SPSite or SPWeb objects, i.e. when you are not getting them from the SPContext (such as SPContext.Web), and when you try to update anything such as web or list properties, list item metadata etc., you may get the exception listed above. This is a clear indication that AllowUnsafeUpdates of the SPWeb is false and this is preventing you from doing the update. This problem is resolved easily by setting the AllowUnsafeUpdates of the parent web object to true. Still, sometimes even after you have done this you may still be getting the same error. This is typically caused by one of the following reasons:
A) You have set the AllowUnsafeUpdate to true for the wrong SPWeb
You have to be careful because sometimes the ParentWeb of an object is not the same instance of the web you have retrieved the object from. For example, when you call initialWeb.Lists[listId] you would expect that the returned list’s ParentWeb instance is the same as your initialWeb. However, this is not the case. So if somewhere later in your code you call list.ParentWeb.UpdateSomething(), this will not work because you have never set the AllowUnsafeUpdates property of list.ParentWeb. You have set it for your initialWeb, but even though this is the same web as the list’s parent web, both are different instances. Usually you see the error and then you go and investigate in Reflector whether this is the same instance or not. Alternatively, you could use another more generic and clever way to deal with almost any similar situation, described in the following post:
The author suggests that you can set HttpContext.Current to null before you do your updates and then reassign its initial preserved value when done. This will work great, but remember to set the HttpContext to null as early as possible. In the post above, SharePoint probably uses site.RootWeb to do the updates to the site scoped features, and the RootWeb’s AllowUnsafeUpdates hasn’t been set to true explicitly.
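
A minimal sketch of that trick, assuming a web context; the commented feature-activation line is just a placeholder for whatever update was failing:

HttpContext savedContext = HttpContext.Current;
try
{
    // Clearing HttpContext.Current makes the object model behave as in a rich client,
    // where AllowUnsafeUpdates is effectively always true.
    HttpContext.Current = null;

    // Perform the updates that were failing, e.g.:
    // site.Features.Add(featureId);   // 'site' and 'featureId' are placeholders
}
finally
{
    // Always restore the original context.
    HttpContext.Current = savedContext;
}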
B) The AllowUnsafeUpdates gets reset to false sometimes after you have set it to true
If we have a look at how the property is managed it turns out that it is stored in the request object associated with every SPWeb (which is actually a COM object)
[SharePointPermission(SecurityAction.Demand, UnsafeSaveOnGet = true)]
private void SetAllowUnsafeUpdates(bool allowUnsafeUpdates)
{
     this.Request.SetIgnoreCanary(allowUnsafeUpdates);
}
This actually means that every time the request is reset, the property will be also reset to its default value. The m_Request member is modified when a new web is created, when the web is disposed or when the SPWeb.Invalidate() method is called.
internal void Invalidate()
{
   if (this.m_Request != null)
   {
      if (this.m_RequestOwnedByThisWeb)
      {
         SPRequestManager.Release(this.m_Request);
      }

      this.m_Request = null;
   }

   this.m_bInited = false;
   this.m_bPublicPropertiesInited = false;
   this.m_Url = null;
}
So any operation that calls SPWeb.Invalidate() will reset AllowUnsafeUpdates to its default value. And for code running under an HttpContext, i.e. web applications, this default value for a GET request will be false. I’ve looked up all the legitimate cases in which Invalidate() is called by the SharePoint object model. These cases are:
1) When the Name or the ServerRelativeUrl properties of the SPWeb are changed and then Update() is called. In this case AllowUnsafeUpdates is reset because with the change of these properties the URL of the web will change, and logically the request object will change as it will now point to a different URL.
2) When any object that implements ISecurable (those are SPWeb, SPList and SPListItem) breaks or reverts its role definition inheritance. This means every time you call SPRoleDefinitionCollection.BreakInheritance(), BreakRoleInheritance() or ResetRoleInheritance(), or set the value of HasUniquePerm, the AllowUnsafeUpdates property of the parent web will reset to its default value and you may need to set it back to true in order to do further updates to the same objects.
3) In many cases when an exception is caught by the SharePoint object model when you try to retrieve any sort of data the AllowUnsafeUpdates of the parent web will be reset to false as a precaution to protect against potential exploits. In those cases however the objects will be in unknown state anyway after the request has been reset and the exception is re-thrown so they are of no practical interest.
And finally it is also good to mention that you may get another related exception when trying to update your SharePoint objects and that is:
System.Exception: Microsoft.SharePoint.SPException: Cannot complete this action. Please try again. —> System.Runtime.InteropServices.COMException (0x80004005): Cannot complete this action.
This usually happens when some updates have been made to an object (usually SPSite, SPWeb or SPList) that may be clashing with your changes and SharePoint refuses to do the update. To recover from this situation you simply need to create fresh copies of the SPSite and the SPWeb objects and do the updates on the objects retrieved from the fresh copies. And of course don’t forget to set the AllowUnsafeUpdates to true for the freshly created SPWeb if required.
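
A minimal sketch of that recovery pattern (the site URL and list title are placeholders):

using (SPSite freshSite = new SPSite("http://server/sites/demo"))
using (SPWeb freshWeb = freshSite.OpenWeb())
{
    freshWeb.AllowUnsafeUpdates = true;   // only if required (e.g. a GET request)
    try
    {
        SPList list = freshWeb.Lists["Tasks"];
        list.Description = "Updated on a fresh SPWeb instance";
        list.Update();
    }
    finally
    {
        freshWeb.AllowUnsafeUpdates = false;
    }
}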

Sunday, June 14, 2015

Converting image byte array to image control directly


**********************************************************
private void GetUserPhoto(int id)
{
    try
    {
        LookUps obj = new LookUps();
        List<UserPhotoEntity> newObj = obj.GetPhotoById(id);
        byte[] photoByte = newObj[0].Photo;

        // Optional: materialise a System.Drawing.Image if server-side processing is needed.
        ImageConverter ic = new ImageConverter();
        System.Drawing.Image img = (System.Drawing.Image)ic.ConvertFrom(photoByte);

        // Bind the byte array directly to the image control as a base64 data URI.
        Image1.ImageUrl = "data:image/jpg;base64," + Convert.ToBase64String(photoByte);

        // Alternative: stream the bytes in the response instead of using a data URI.
        //Response.ClearHeaders();
        //Response.ContentType = "image/JPEG";
        //Response.AddHeader("content-disposition", "inline;filename=Photo.jpeg");
        //Response.BinaryWrite(photoByte);
    }
    catch (Exception ex)
    {
        // TODO: log the exception instead of swallowing it.
        ex.ToString();
    }
}
**************************************************************

Checklist for coding in C# for SharePoint Development



·         Does my code properly dispose of SharePoint objects?
·         Does my code cache objects properly?
·         Does my code cache the correct types of objects?
·         Does my code use thread synchronization when necessary?
·         Does my code work as efficiently for 1,000 users as it does for 10 users?
·         Code should not include hard-coded values of any kind

Naming Conventions

·         Coding should follow proper naming conventions: use PascalCasing for class names and method names, and camelCasing for method arguments and local variables.
·         Make sure not to use underscores in identifiers. Exception: you can prefix private static variables with an underscore.
·         Make sure to use a noun or noun phrase to name a class.
·         Prefix interfaces with the letter I. Interface names are nouns (phrases) or adjectives.
·         Organize namespaces with a clearly defined structure:
o   // Examples
o   namespace Company.Product.Module.SubModule
o   namespace Product.Module.Component
o   namespace Product.Layer.Module.Group
·         Do not use Hungarian notation or any other type identification in identifiers:
o   // Correct
o   int counter;
o   string name;
o   // Avoid
o   int iCounter;
o   string strName;

·         Controls Naming Convention:-
o   Textbox: txtName
o   Button: btnName
o   Label: lblName
o   Panel: pnlViewer
o   Image: imgName
Properties:
o   Use noun or noun phrase with Pascal case like UserID, Password etc.
Methods:
o   Use verb or verb phrase with Pascal case like InsertQuery(), GetUserInfo() etc.
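
A small illustrative sketch pulling these conventions together (all names are hypothetical):

// PascalCase for class, property and method names; camelCase for parameters and locals.
public class UserAccount
{
    public string UserId { get; set; }
    public string Password { get; set; }

    public bool ValidateCredentials(string userName, string password)
    {
        int failedAttempts = 0;   // camelCase local, no Hungarian prefix like "iFailedAttempts"
        // ... validation logic goes here ...
        return failedAttempts == 0;
    }
}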

Coding Flow and Comment

·         Write only one statement per line.
·         Write only one declaration per line.
·         Add at least one blank line between method definitions and property definitions.
·         Place the comment on a separate line, not at the end of a line of code.
·         Begin comment text with an uppercase letter.
·         End comment text with a period.
·         Insert one space between the comment delimiter (//) and the comment text
·         Do not create formatted blocks of asterisks around comments.
·         Either use Camel case like userID or Pascal case like UserID

Object Model

·         Make sure SharePoint objects like SPSite and SPWeb are being disposed of properly. Do not dispose of any SPSite or SPWeb object returned directly from the Microsoft.SharePoint.SPContext.Site or Microsoft.SharePoint.SPContext.Web properties, except when you obtained the reference using a constructor or through AllWebs (see the sketch after this list).
  • The SPSiteCollection.Add method creates and returns a new SPSite object. You should dispose of any SPSite object returned from the SPSiteCollection.Add method.
  • The SPSiteCollection [] index operator returns a new SPSite object for each access. An SPSite instance is created even if that object was already accessed, so each returned object must be disposed of.
  • The SPSite.AllWebs.Add method creates and returns an SPWeb object. You should dispose of any SPWeb object returned from SPSite.AllWebs.Add.
  • The SPWebCollection.Add method creates and returns an SPWeb object that needs to be disposed.
  • The SPSite.AllWebs [] index operator returns a new SPWeb instance each time it is accessed.
  • The OpenWeb method and SelfServiceCreateSite method (all signatures) create an SPWeb object and return it to the caller.
  • The Microsoft.Office.Server.UserProfiles.PersonalSite property returns an SPSite object that must be disposed of.

  • The SPWeb.Webs property returns an SPWebCollection object. The SPWeb objects in this collection must be disposed.
  • The SPWeb.Webs.Add method (or Add) creates and returns a new SPWeb object. You should dispose of any SPWeb object returned from this method call.
  • The SPWeb.Webs[] index operator returns a new SPWeb object for each access
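
A minimal sketch of these disposal rules (the site URL is a placeholder):

// Objects you create yourself must be disposed; the using statement takes care of it.
using (SPSite site = new SPSite("http://server/sites/demo"))
{
    using (SPWeb web = site.OpenWeb())            // OpenWeb() returns a new SPWeb: dispose it
    {
        foreach (SPWeb subWeb in web.Webs)        // SPWeb.Webs returns new SPWeb objects
        {
            try
            {
                Console.WriteLine(subWeb.Title);
            }
            finally
            {
                subWeb.Dispose();                 // dispose each item returned by the collection
            }
        }
    }
}

// Do NOT dispose of objects coming from the context:
// SPWeb contextWeb = SPContext.Current.Web;      // never call Dispose() on this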

Using Objects in Event Receivers

Do not instantiate SPWeb, SPSite, SPList, or SPListItem objects within an event receiver. Event receivers that instantiate SPSite, SPWeb, SPList, or SPListItem objects instead of using the instances passed via the event properties can cause the following problems (see the sketch after this list):
  • They incur significant additional roundtrips to the database. (One write operation can result in up to five additional roundtrips in each event receiver.)
  • Calling the Update method on these instances can cause subsequent Update calls in other registered event receivers to fail.
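
A minimal sketch of an event receiver that uses the supplied instances (the class name is hypothetical):

public class TaskItemEventReceiver : SPItemEventReceiver
{
    public override void ItemUpdated(SPItemEventProperties properties)
    {
        // properties.Web, properties.List and properties.ListItem are supplied by SharePoint;
        // using them avoids the extra round trips caused by creating new objects.
        SPListItem item = properties.ListItem;

        // Avoid: new SPSite(properties.WebUrl), OpenWeb(), GetItemById(...) etc.
    }
}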


  • Do not use an unbounded SPQuery object.
  • An SPQuery object without a value for RowLimit will perform poorly and fail on large lists. Specify a RowLimit between 1 and 2000 and, if necessary, page through the list (see the sketch after this list).
  • Use indexed fields.
  • If you query on a field that is not indexed, the query will be blocked whenever it would result in a scan of more items than the query threshold (as soon as there are more items in the list than are specified in the query threshold). Set SPQuery.RowLimit to a value that is less than the query threshold.
  • If you know the URL of your list item and want to query by FileRef, use SPWeb.GetListItem(string strUrl, string field1, params string[] fields) instead.
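
A minimal sketch of a bounded, paged query (the list title and field names are placeholders):

SPList list = SPContext.Current.Web.Lists["Tasks"];
SPQuery query = new SPQuery();
query.RowLimit = 2000;   // never leave an SPQuery unbounded
query.Query = "<Where><Eq><FieldRef Name='Status' /><Value Type='Text'>Completed</Value></Eq></Where>";

do
{
    // Fetch one page of items and process it.
    SPListItemCollection items = list.GetItems(query);
    foreach (SPListItem item in items)
    {
        // process the item
    }

    // Move to the next page, if any.
    query.ListItemCollectionPosition = items.ListItemCollectionPosition;
}
while (query.ListItemCollectionPosition != null);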

Exception Handling

·         Make sure to use try-catch and using statements in exception handling (see the sketch after this list).
·         Don't catch (Exception) more than once per thread.
·         Cleanup code should be put in finally blocks.
·         Don't use exception handling as a means of returning information from a method.
·         Errors/exceptions must be logged.
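
A minimal sketch of this pattern (the site URL and the ULS category name are illustrative):

try
{
    using (SPSite site = new SPSite("http://server/sites/demo"))
    using (SPWeb web = site.OpenWeb())
    {
        // ... work with the web ...
    }
}
catch (Exception ex)
{
    // Log to the ULS log instead of swallowing the exception.
    SPDiagnosticsService.Local.WriteTrace(
        0,
        new SPDiagnosticsCategory("Custom Code", TraceSeverity.Unexpected, EventSeverity.Error),
        TraceSeverity.Unexpected,
        ex.ToString());
    throw;
}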



·         Switching the AllowUnsafeUpdates option ON and OFF as and when appropriate.
·         Disposing the SPSite objects after usage.
·         Disposing the SPWeb objects after usage.
·         Avoid disposing objects derived from SPContext.
·         Use the "using" clause to avoid having to dispose of objects explicitly.

Customization

·         Don't change any default SharePoint installation files
·         Custom master pages will be based on a copy of the default master page
·         Avoid customising content pages with SharePoint Designer
·         Avoid inline server side script in pages
·         All applications should be globalised and then localised to English
·         Package all web parts and templates as features




Deploy code 
Deploy code using WSP solutions. Ensure code is not compiled in debug mode; debug builds run slower and can potentially crash or hold up your production environment. Implement CAS, deploy to the bin directory if possible, and apply CAS security policies to custom code.

Configuration

·         In a shared environment, no AppSetting entries are generally allowed in the web.config configuration file of WSS. Instead, consider using your own application-specific configuration file and store it within your application or feature folder. The feature.xml file should be used for storing Feature properties. A SharePoint list can be considered for storing common configuration data (see the sketch below).
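
A minimal sketch of reading a setting from a configuration list (the list title and the "Key"/"Value" column names are assumptions):

public static string GetConfigValue(SPWeb web, string key)
{
    // "Configuration" is an assumed list title with assumed "Key" and "Value" text columns.
    SPList configList = web.Lists.TryGetList("Configuration");
    if (configList == null)
    {
        return null;
    }

    SPQuery query = new SPQuery();
    query.RowLimit = 1;
    query.Query = "<Where><Eq><FieldRef Name='Key' /><Value Type='Text'>" + key + "</Value></Eq></Where>";

    SPListItemCollection items = configList.GetItems(query);
    return items.Count > 0 ? Convert.ToString(items[0]["Value"]) : null;
}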

Features:-

·         Feature scope should be Site collection scope or lower instead of the Web-application level. If features can be re-used across site collections, then they could be deployed at the Web-application level.
·         Feature Title should be appropriately named.
·         Provide a full description which should include information on the usage, any restrictions, dependencies on other features and assemblies, ownership and contact information. This description should be included in the Feature.xml in addition to any documentation.