
What debuted in SharePoint 2010 and what happened in 2013?

This post will help you understand the new features added in SharePoint 2010 and its new first-class tools that help developers speed up development, debugging and deployment. Since SharePoint 2013 is already out and companies are rapidly adopting it, it also makes sense at this point to mention the 2013 enhancements to each of them.

1. Sandbox Solutions and Resource Governors:

Quick Overview:
Sandbox solutions are a new concept in 2010. A sandbox solution runs in a restricted execution environment that allows the program to access only certain resources; consequently, bugs are contained within that sandboxed environment without affecting the rest of the SharePoint farm.

One good example: sandbox solutions cannot use certain local or network resources, and they may not have access to content outside of the site collection they are located in [1]. For an overview of when to use sandbox solutions with 2010, you can read my post: Sandbox Solutions – SharePoint 2010.

What changed in 2013?
Things changed in SharePoint 2013: Microsoft no longer actively encourages sandbox solutions as a first design choice. They are still available, but only for backward compatibility. Microsoft encourages using 'Apps' instead and leveraging the new 'App Model'.

2. Client Object Model

Quick Overview:
With SharePoint 2007, there were limited options to interact with SharePoint. The SharePoint object model was available only to server-side applications, which means your code had to be hosted and running on one of the SharePoint servers. Client applications had very limited options for interacting with SharePoint data.

The only option available in SharePoint 2007 for client applications was the web services API. This worked well for server-side code, where the service metadata is downloaded and the services can be used seamlessly, but client-side technologies like JavaScript did not fit well with this setup.

So the 'Client Object Model' was introduced to help develop client-side applications that can leverage REST-style calls. WPF (Windows Presentation Foundation) and Silverlight applications also needed something like this to build faster, bandwidth-friendly apps.

The Client Object Model comes in two flavors:

a. Support for .NET-based clients like WPF or Silverlight apps.
These applications would typically add references to the following DLLs and use the appropriate classes, methods and objects to develop applications:
Microsoft.SharePoint.Client.dll
Microsoft.SharePoint.Client.Runtime.dll
Microsoft.SharePoint.Client.Silverlight.dll
b. Support for JavaScript-based clients
These applications would typically reference:
SP.js
SP.Core.js
SP.Ribbon.js
SP.Runtime.js.

All of these files also have debug versions (for example, SP.debug.js) available for debugging purposes, but these files are larger and should not be used in production.

So use them effectively in a development environment by setting the SharePoint deployment to use the debug versions: '<deployment retail="false" />' under the 'system.web' element.
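As a quick illustration of flavor (b), here is a minimal sketch of a JavaScript (JSOM) call; it assumes the script runs on a page hosted in a SharePoint site (so SP.js is loaded for you) and that a list named 'Tasks' exists, both of which are illustrative assumptions.

// A minimal JSOM sketch: read the titles of items in a list named 'Tasks'
// (the list name is an assumption used only for illustration).
ExecuteOrDelayUntilScriptLoaded(function () {
    var context = SP.ClientContext.get_current();
    var list = context.get_web().get_lists().getByTitle('Tasks');
    var items = list.getItems(SP.CamlQuery.createAllItemsQuery());

    context.load(items);
    context.executeQueryAsync(
        function () {                              // success: enumerate the items
            var e = items.getEnumerator();
            while (e.moveNext()) {
                console.log(e.get_current().get_item('Title'));
            }
        },
        function (sender, args) {                  // failure: log the error message
            console.log('Request failed: ' + args.get_message());
        });
}, 'sp.js');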

What changed in 2013?
In 2010, all client object model API calls are made via a WCF entry point that is not directly accessible; proxies built via .NET code or JavaScript libraries have to be used. These were harder to write, there was no compile-time checking, and IntelliSense support was limited.

These areas are much improved in 2013:

– Fully leverage REST based API calls that use basic HTTP for CRUD operations

– The client object model now supports the OData protocol. OData is a mainstream data-access protocol for HTTP-based clients for creating and consuming data APIs. OData builds on protocols like HTTP and commonly accepted methodologies like REST.

– Extended API to support more server-side functionality such as User Profiles, Search, Taxonomy, Feeds, Publishing, Sharing, Workflow, eDiscovery, IRM, Analytics, Business Data, etc.

– CSOM also supports Windows Phone Applications

– 'ListData.svc' is deprecated but still available for backward compatibility with older applications. 'Client.svc' is introduced with more endpoints catering to more functionality.
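To make the REST point concrete, here is a minimal sketch of a SharePoint 2013 REST (_api) call made with jQuery from a page within the target site; the list name 'Tasks' and the presence of jQuery on the page are illustrative assumptions.

// Minimal REST sketch: GET the Title of every item in the 'Tasks' list (assumed list name).
$.ajax({
    url: _spPageContextInfo.webAbsoluteUrl +
        "/_api/web/lists/getbytitle('Tasks')/items?$select=Title",
    type: 'GET',
    headers: { 'Accept': 'application/json;odata=verbose' },
    success: function (data) {
        data.d.results.forEach(function (item) {
            console.log(item.Title);          // log each returned item title
        });
    },
    error: function (xhr) {
        console.log('REST call failed: ' + xhr.status);
    }
});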

More on the following. Keep Reading 🙂

3. Business Connectivity Services

4. List Enhancements

5. Enhancements to Visual Studio

6. Web Solution Packages

7.  Developer Dashboard

8. SilverLight integration

9. Web 2.0 Protocols and New Standards

10. LINQ Enhancements

11. SharePoint Designer Enhancements

12. Visio, Access and InfoPath Enhancements

References:

1. MSDN: Sandbox Solutions Overview (SharePoint Server 2010)


Architectural Enhancements in SharePoint 2013

The evolution from SharePoint 2007 to SharePoint 2010 was revolutionary: there were many new features, capabilities, development changes and architectural enhancements in 2010 compared with 2007. With 2013 released, we should not expect a completely new architecture, as most of it is inherited from 2010. Microsoft has always channeled its efforts into making SharePoint a self-service product, heavily encouraging no-code solutions and the use of the browser and Office applications for business needs. SharePoint 2013 therefore brings architectural enhancements that support this direction and make SharePoint a better product. This blog post summarizes the major enhancements, and the related posts explain each in detail.

  • Database improvements – contribute to faster and more efficient data retrieval
    As Microsoft keeps improving SQL Server version by version, adding more features and functionality, SharePoint can leverage those enhancements and become even more powerful in terms of data storage, search and retrieval.
    The following can be treated as the highlights of the database enhancements:
    1. Shredded Storage
    As the name indicates, files are shredded and stored in the content database in pieces. When a user requests a file, all the bits and pieces are combined and returned as a single file. When a file has multiple versions, only the changes to that file are saved to the content database instead of the entire file being saved repeatedly. There is no more storing of the entire file, version after version, as was the case in SharePoint 2007/2010. This is faster, reduces the amount of space occupied in the content database and also saves network bandwidth.
    2. The database is Microsoft SQL Azure compliant
    If you wish to host SharePoint 2013 in the cloud, the SQL Server databases are ready for it. This is quite important for organizations that plan to host their sites in the cloud, as the new features include updates to throttling behavior, a new event table for troubleshooting, support for recursive triggers, etc.
    3. Cleaned up databases – Both config and content databases
    Redundant and unused tables have been removed, which clearly enhances database performance when queried.
    4. Better design
    The SQL Server has an improved schema to support ‘Shredded Storage’ which reduces the input and output operations while using document libraries.
  • Request Management
    SharePoint 2013 ships with Request Management, which is designed for throttling and routing, prioritization, and load balancing. Please DO NOT consider this a replacement for a load balancer or traffic manager; as a matter of fact, your organization MAY NOT require it at all. If you choose to use it, make sure you understand its capabilities and whether it is the right choice for the problem. Request Management is simply a service instance that runs on WFEs (web front-end servers running the SharePoint Foundation Web Application service). This service understands SharePoint and can route requests at the web application level, which is its key advantage.
  • Workflow enhancements
    SharePoint Server 2013 brings a major advancement to workflow: enterprise features such as fully declarative authoring, REST and Service Bus messaging, elastic scalability, and managed service reliability. SharePoint Server 2013 can use a new workflow service built on the Windows Workflow Foundation components of the .NET Framework 4.5. This new service is called Workflow Manager, and it is designed to play a central role in the enterprise. More on this can be found at What's new in workflow in SharePoint Server 2013 >>
  • Cache Service improvements
    The cache service improvements are an important factor in building high-performance and scalable applications. In SharePoint 2010, each server has its own cache, which is used only if the request comes to that server; when a request goes to a different server, the cached information has to be recreated. SharePoint 2013 introduces the Distributed Cache Service (DCS), which is based on Windows Server AppFabric 1.1. AppFabric caching stores serialized managed objects in a cache cluster that all servers can use to serve requests. This pooled memory is presented to cache clients as a single source of caching memory.
  • Minimal Download Strategy
    In SharePoint 2010, when a user requests a page or makes changes to it, the whole page is downloaded. In SharePoint 2013, only the changed portions of the page are downloaded, using the AjaxDelta control that is added to the head section of the master page.
  • The Theme Engine
    Themes in SharePoint 2013 are HTML5 compatible, and the format is no longer based on the Office Open XML format as it was in SharePoint 2010. So while in SharePoint 2010 you could create new themes using Office applications, you can no longer do that in 2013; instead, you use the browser to preview and publish new themes.
  • Mobile Device support in SharePoint 2013
    SharePoint 2013 content is created and delivered using a location-aware technique that detects the user agent and serves content accordingly. Lists are location aware and optimized for mobile delivery, and there are multiple views for an enhanced mobile experience: Contemporary view (for HTML5-capable browsers), Classic view (for backward compatibility with SharePoint 2010) and Full Screen view (for a full desktop view on a mobile device).
  • Deprecated browser support and more supported browsers
    Internet Explorer 6 and 7 are still supported for content rendering but cannot be used for content authoring, so WCM authors should use IE 8 and above for full support. Other browsers such as Chrome, Mozilla Firefox and Apple Safari offer limited support.
  • New, deprecated and modified/improved service applications
    This section provides a very brief explanation of the service applications; a detailed description is out of scope for this post but will be covered in later SharePoint 2013 blogs.
    1.) New service applications
    The following service applications are added to SharePoint 2013
    – Machine Translation
    – Work Management
    – App Management.
    2.) Modified/improved Service Applications
    The following applications are improved in 2013
    – SSA (Search Service Application)
    – MMS (Managed Meta Data Service Application),
    – BCS (Business Connectivity Services),
    – User Profile Service application
    – Microsoft SharePoint Foundation Subscription Settings Service
    – Access Services, which is split into two services:
    a.) 'Access Services 2010'
    b.) 'Access Services' (for 2013 only)
    3.) Deprecated service applications
    The following service applications are not totally deprecated, but they are now delivered in a different way, such as being isolated into a separate product or run as a service on a server.
    – 'Office Web Apps' is no longer a service application and is packaged as a separate server product.
    – 'Web Analytics' is no longer a service application; it is now part and parcel of the SharePoint search engine.
    – 'PowerPoint Automation Service' is no longer a service application but can be started using the 'Services on Server' page in SharePoint.
  • Web Application and Site Collection improvements
    Microsoft now recommends using claims-based authentication by default and also recommends host-named site collections (HNSC). With HNSC, each site collection can be accessed using its own top-level URL even though the site collections live in the same web application.
  • SharePoint 2013 development changes
    1.) SharePoint Apps – use any hosting service to run and deploy your app
    This is something new to SharePoint and to its developer community, but the concept itself is old and is similar to how apps are hosted in the Android market or in Microsoft's and Apple's app stores. They are all apps, and everything in there is an app; that is the direction 'Apps for SharePoint' is heading, only faster. Apps are easier to integrate with cloud services and the Office suite, and they have a better platform for distribution. In spite of this technical evolution, Microsoft has, as always, provided easy options to create apps as web applications using HTML and JavaScript, or using server-side programming languages like C#, VB.NET or PHP.
    2.) Extended client object model
    The client object model has been extended so that custom code can be created using a similar app model.
    3.) An alternative to Sandbox solutions
    Microsoft continues to support farm solutions, but there are no major enhancements to Sandbox solutions, and Microsoft recommends using an app instead of a Sandbox solution. The choice is now either a farm solution or an app, rather than a farm solution or a Sandbox solution. Detailed explanations of each of these enhancements will be covered in other articles in the SharePoint 2013 category.

References:

  • Microsoft TechNet
  • Book – Exploring Microsoft SharePoint 2013 by Penelope Coventry
    – This is one of the best technical books I have ever read; the author has outstanding knowledge of the subject and I thoroughly recommend it. Much of the credit, even for this blog post, goes to the author.

Windows Command Prompt vs Windows PowerShell

With PowerShell gaining more popularity and more adoption among developers/administrators, there is also increasing confusion between Command Prompt and PowerShell.

I will try my best to showcase the differences between the two, based on my experience.

Please read my other blogs on PowerShell to get some basic understanding of what it is and how it can change your life as  an IT pro.

Let's get started…

What is a shell?

A shell is a program that takes commands from the keyboard or mouse and passes them on to the operating system to perform the requested actions. This could be as simple as opening a text file, navigating through your C or D drive, or opening Internet Explorer.

For instance look at the following examples and see why they qualify as Shell.

1. Windows Explorer (is a shell): It is one of the most commonly used shells, with a GUI (Graphical User Interface).
It qualifies as a shell because all it does is take input from your keyboard or mouse and help you navigate through your files and folders.

2. Command Prompt (is a shell):
It DOES NOT have a GUI and works only with predefined commands like dir, mkdir, taskkill, etc., which take only text as input.

3. Windows PowerShell (is a shell):
It DOES NOT have a GUI and works with predefined cmdlets like Get-Process, Get-Help, etc., which work with objects rather than just text.

Difference 1: Commands (PowerShell can execute all the commands that Command Prompt can, and more.)

The following images show how the 'dir' command works in both Command Prompt and PowerShell. Do you see any difference? Not much, correct?

PowerShell and Command Prompt executing the 'dir' command

The image below shows the cmdlet 'Get-ChildItem' listing the same files in the same Images folder. In the previous example, PowerShell was running the 'dir' command, but behind the scenes it was executing the 'Get-ChildItem' cmdlet. PowerShell ships with cmdlets only; some of these cmdlets run on behalf of the familiar DOS commands.

PowerShell executing Get-ChildItem

 

The following table contains a selection of the cmdlets that ship with PowerShell, noting similar commands in other well-known command-line interpreters. Many of these similar commands come out-of-the-box defined as aliases within PowerShell, making it easy for people familiar with other common shells to start working.[1]

Comparison of PowerShell cmdlets with similar commands


If you would like to know the total count of all the available cmdlets, try running the following command: @(Get-Command).Count.
The count varies from version to version of PowerShell; the following is for version 3.

PowerShell Showing Command Count

Check your version by running the $PSVersionTable.PSVersion command.

Difference 2: Generation Gap (Automation and Security)

Command Prompt was born in the early 1980s. It did well during its infancy and childhood, but by the time it reached adulthood around 2005 it was facing some harsh challenges: automation and security.

Automation: 

It is very hard to find instances where you do not need to automate. All the things that you do on a daily basis can be automated, such as creating an IIS site, cleaning up logs, backing up databases, and the list goes on…

Let us look at the following scenario, where automation helps tremendously.

Imagine that your website is hosting 'The Super Bowl' or 'The American Idol' finale; can you fathom the amount of traffic you can expect for these events? Now think about the very next day, or even the very next hour, once these events are done. It is highly probable, almost guaranteed, that you will drop to 5% of the traffic you had while these events were streaming live.

Does it make sense to purchase hundreds of servers just for a day or a few hours and not use them for the rest of the year? How about other events that draw huge audiences, but not as large as the Super Bowl? You would be under-utilizing your hardware then.

So ideally you need to scale out or scale down whenever appropriate. This scenario applies even within a single day: for instance, if yours is a stock brokerage firm, it is very typical to have heavy traffic only from 9:00 am to 4:00 pm, so can you shut down a few of your servers after 4:00 pm? Absolutely!

It would be close to impossible to achieve something at this scale and speed without automation. Command Prompt cannot achieve this with the limited commands it has; as a matter of fact, it wasn't designed or built for these use cases. On the flip side, PowerShell is designed and well baked for automation.

Security:

Security is a broad topic and I am not going to deep dive, but this section will help you get some sense of the kind of security challenges that PowerShell solves.

Bad guys running .bat or .ps1 files:

For example, running a .bat file (batch file: a series of commands) is an all-or-nothing proposition; if a bad guy somehow gets access to your server, he can execute all the batch files he wishes.

With PowerShell you can set an execution policy to run only digitally signed scripts; to go even further, you can restrict the server from running PowerShell scripts at all.

Impersonation or running in a different security context:

It is very easy to impersonate a user with PowerShell, which has built-in cmdlets to support it. It is possible with the command prompt as well, but not very straightforward.

Remote Command Execution:

Typically, if you have to run batch files, they need to be present on the server. PowerShell, on the other hand, is designed to execute commands remotely without even copying the script to the server it is supposed to run on. How does this work? PowerShell is built on .NET and uses cmdlets to accomplish tasks; as long as the required cmdlets are available on the server, you can run the command from anywhere. You can accomplish something similar with Command Prompt, though not as elegantly as with PowerShell.

Restricted permissions on the server itself:

When using Command Prompt your options are limited; even common administrative tasks, such as modifying the registry, might be tough or not even feasible. PowerShell ships with cmdlets that make these administrative tasks absolutely pain free.

Difference 3: Adoption and Support

PowerShell has been part of the CEC (Common Engineering Criteria) since 2009, which means that all Microsoft products shipped since then have to support PowerShell and must ship with cmdlets that help with administrative and maintenance tasks. The Microsoft Exchange team was one of the early adopters, and the product can be completely administered via PowerShell.

All such products in the Microsoft stack ship with these cmdlets going forward. It is very unlikely that there will be any changes or improvements to Command Prompt to support these products. Even third-party tools that are built on Microsoft tools and software will provide support for PowerShell rather than Command Prompt.

References:

[1] https://en.wikipedia.org/wiki/Windows_PowerShell

Disclaimer:

All data and information provided on this blog is for informational purposes only. The author makes no representations as to the accuracy, completeness, recency, suitability, or validity of any information on this site and will not be liable for any errors, omissions, or delays in this information or any losses, injuries, or damages arising from its display or use. All information is provided on an as-is basis.

 


Shared Services in SharePoint 2007 vs Service Applications in SharePoint 2010

Why not Shared Services?

As the name indicates, these services were created to be shared among multiple web applications. Primarily these services provide Search, Excel Services, Profile Synchronization, Audiences, My Sites, etc. However, if you want 'Search' enabled on your web application, you have to subscribe to all the other services too; it is in fact an all-or-nothing scenario. A good analogy is a combo at a fast-food restaurant: you have to take the fries and drink along with your burger. As shown in the diagram below, 'Shared Service 1' is shared among multiple web applications; 'Web Application 1' only needs 'Search', but it is served all the other, redundant services too. On the other hand, if the sensitive nature of its data requires a web application to have its own 'Search', you still have to create a whole new 'Shared Service', which again feeds the web application with all the other redundant services. The downside of this approach is that some of these services require stand-alone databases that have to be created and populated for no reason.

SharedServices1

Why Service Applications?

Unlike Shared Services, with Service Applications a web application can subscribe to only the services it requires, as shown below. That is one reason why this is referred to as 'service à la carte', and it was a good marketing move by Microsoft.

Service Applications

Key Advantages of Service Applications over Shared Services:

  1. Since each service can be subscribed to individually, redundant databases need not be created to support unwanted services.
  2. The architecture is loosely coupled, so third-party service applications can be integrated easily.
  3. Cross-farm communication is possible via a service application proxy, and services can be located easily using a service URI.

What is jQuery?

jQuery is a concise JavaScript library. In simple terms, it's a very lengthy script file with many predefined functions that developers can leverage to develop rich client applications. You can find many more resources to kick-start at http://www.jquery.com.

So why learn jQuery? Why is it gaining popularity?
Users expect your site to be blazing fast and easy to navigate, with minimal page refreshes (AJAX), minimal round trips to the server, etc. To accomplish most of this, everything has to happen at the browser level or the requests have to be asynchronous. That is exactly where jQuery comes into the picture. You name it, think of it, and it does it for you. The good part is that you can extend jQuery, write your own functions and reuse the code, as in the sketch below.
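As a small, hedged sketch of that extensibility, a custom function can be attached to jQuery as a plugin (the plugin name 'highlight' below is only an illustrative assumption):

// Attach a reusable 'highlight' function to jQuery (illustrative plugin name).
$.fn.highlight = function (color) {
    // 'this' is the jQuery set the plugin was called on; return it to allow chaining.
    return this.css('background-color', color || 'yellow');
};

// Usage: highlight every table row on the page.
$('table tr').highlight('#ffffcc');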

Difference between jQuery and JavaScript
Let us understand this vivid difference with an example. We have a scenario where alternate rows of a table should be colored.

Using plain JavaScript:

// 'id' holds the id of the target table
var table = document.getElementById(id);
var rows = table.getElementsByTagName("tr");
for (var i = 0; i < rows.length; i++) {
    // manipulate alternate rows
    if (i % 2 == 0) {
        rows[i].className = "even";
    } else {
        rows[i].className = "odd";
    }
}

Using jQuery:

$("table tr:nth-child(even)").addClass("alternateColor");

So you see the difference? All of the above functionality is achieved in one line. How?
jQuery has built-in functions that take your parameters and do the work for you. You might wonder how it predicts what a developer would need. The authors of jQuery invested time and effort in creating 'utility functions' that are common and widely used, for instance charts, math functions, showing and hiding elements, event handling, styling, dynamic insertion of DOM elements, etc., which developers can use to build applications.
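As a small illustration of such utility functions, the sketch below waits for the DOM to be ready, then wires up an event handler that shows and hides an element (the element IDs are illustrative assumptions):

// Run once the DOM is ready rather than at window.onload.
$(document).ready(function () {
    $('#details').hide();                 // hide the details panel initially

    $('#toggleButton').click(function () {
        $('#details').slideToggle();      // show/hide the panel with a sliding animation
    });
});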

Where can I get the jQuery Library?
You can download it free from http://www.jQuery.com

Will jQuery work in a JavaScript disabled browser?
NO! jQuery makes things easier for you, but at the end of the day it's still a JavaScript library and it WILL NOT work in a browser with JavaScript disabled.

Does jQuery slow down the rendering of webpages?
Not really; it costs a few milliseconds, which is affordable for the plethora of things it does for you. Also, as a best practice, most functions are not kicked off at the page's 'onLoad()'; they usually go into action only after the DOM is completely rendered.

More on jQuery… please keep reading the jQuery posts on my blog.


jQuery Ajax 'POST' not working in Chrome and Firefox

When I was using jQuery's ajax() method to call an .asmx web service with some input parameters, it worked fine in IE (Internet Explorer) but not in Chrome or Firefox. I did quite a bit of research and found a few workarounds. This post assumes that you are working in a SharePoint (or .NET) environment and making these web service calls across domains.

Reason for failure: the request is cross-origin, and Cross-Origin Resource Sharing (CORS) is not configured.

In simple terms, if a request is posted to a domain different from that of the client, it fails in Chrome and Firefox, though not in IE, because IE allows such communication by default while most other browsers do not. Such requests are treated as a security threat and are denied by most browsers. CORS is the mechanism that enables AJAX requests to be posted to a domain different from that of the client, and that handshake has to be configured at the server level.

The rest of this post walks you through the possible solutions and the corresponding changes to the script.

Original JQuery script

Original JavaScript Code
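The original script is shown above only as a screenshot; in essence it was a plain jQuery $.ajax POST to the .asmx web method, roughly along these lines (the service URL and parameter name are illustrative assumptions):

// A typical jQuery POST to an .asmx web method. When the page and the service live on
// different domains, Chrome and Firefox block this call unless CORS is configured.
$.ajax({
    type: 'POST',
    url: 'http://ServerName/WebService.asmx/WebMethod',  // assumed service URL
    data: 'param1=value1',                               // assumed input parameter
    dataType: 'xml',
    success: function (data) { console.log('Success'); },
    error: function (xhr, status, error) { console.log('Failed: ' + status); }
});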

All of the solutions below are valid; they seem to have worked for many readers and are marked as answers on various blogs. I am summarizing them here even though they did not all work for me, as they might work for you!

1. Add jQuery.support.cors = true;

Add jQuery.support.cors = true

Initially, even IE did not respond to the Ajax requests; once the above line was added, IE started responding.

'cors' is true if the browser can create an XMLHttpRequest object that has a withCredentials property. Setting it manually enables cross-domain requests in environments that do not report CORS support but do allow cross-domain XHR requests.

2. Add empty ‘data’ parameter to the ajax call

I could not use this solution, as I have input parameters and the data cannot be empty, as shown above.

Empty Data Element

3. Add ‘beforeSend’ parameter to the Ajax call

Add BeforeSend Parameter

4. Add crossDomain:true to the Ajax method

Add Cross Domain True

5. Add xhrFields to the Ajax method

Add XHR Fields

The 'xhrFields' property sets additional fields on the XMLHttpRequest and can be used to set the 'withCredentials' property. Set the value to 'true' if you'd like to pass cookies to the server; if this is enabled, your server must respond with the header 'Access-Control-Allow-Credentials: true'.

6. Change the button type to submit

Not working in Chrome and Firefox

Input Type='button'

Working in Chrome and Firefox

Input Type='submit'

Sorry if the above solutions disappointed you. Let me provide the solution that worked for me. Please understand that this solution is valid only for ASP.NET web service applications. Modify the following sections of the web service's web.config file:

Web.Config Changes

NOTE: Use the Access-Control-Allow-Origin header with a value of * only on the chosen URLs that need to be accessed cross-domain. Don't use the header for the whole domain.
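Putting the client-side options together (1, 4 and 5 above), the call ends up looking roughly like the sketch below; the server must still reply with the matching CORS headers (Access-Control-Allow-Origin, plus Access-Control-Allow-Credentials: true when withCredentials is used). The URL and parameter are illustrative assumptions.

// Option 1: tell jQuery that cross-domain requests are supported.
jQuery.support.cors = true;

$.ajax({
    type: 'POST',
    url: 'http://OtherDomain/WebService.asmx/WebMethod', // assumed cross-domain service URL
    data: 'param1=value1',                               // assumed input parameter
    dataType: 'xml',
    crossDomain: true,                                   // option 4
    xhrFields: { withCredentials: true },                // option 5: send credentials/cookies
    success: function (data) { console.log('Success'); },
    error: function (xhr, status) { console.log('Failed: ' + status); }
});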

References or Useful resources:

1. http://encosia.com/using-cors-to-access-asp-net-services-across-domains/

2. http://www.html5rocks.com/en/tutorials/cors/

3. http://www.bennadel.com/blog/2327-Cross-Origin-Resource-Sharing-CORS-AJAX-Requests-Between-jQuery-And-Node-js.htm

4. https://www.owasp.org/index.php/HTML5_Security_Cheat_Sheet


Calling a Web Service (asmx) via jQuery

This post shows how you can call an .asmx web service via jQuery. For those who are not familiar with jQuery, it's a concise JavaScript library; to put it simply, it's a lengthy JavaScript file containing predefined functions that developers can leverage to build rich client-side applications. You may download the latest version from www.jquery.com.

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<html>
<head>

<!-- Step 1: Add a reference to the jQuery script file. -->
<script src="JS/jquery-1.8.3.min.js" type="text/javascript"></script>

<!-- Step 2: Show an activity indicator. This lets users know that some activity is going on in the background. -->
<script type="text/javascript" language="javascript">
    $(document).ready(function () {
        $("#activityIndicator").bind("ajaxSend", function () {
            $(this).show();
        }).bind("ajaxStop", function () {
            $(this).hide();
        }).bind("ajaxError", function () {
            $(this).hide();
        });
    });
</script>

<!-- Step 3: Make the web service call. Most of the script below is self-explanatory. -->
<script type="text/javascript" language="javascript">

    //URL path to the web service
    var webServiceURL = 'http://ServerName/WebService.asmx/WebMethod';

    function CallWebService() {
        //Get the parameter for the service from the user input
        var param1value = document.getElementById('<%= Ctrl1.ClientID %>').value;

        //Format parameters as follows: 'param1=paramvalue&param2=paramvalue'
        var parameters = 'param1=' + param1value;

        //Do not remove: 'cors' is true if the browser can create an XMLHttpRequest object
        //that has a withCredentials property. Setting it enables cross-domain requests in
        //environments that do not report CORS support but do allow cross-domain XHR requests.
        jQuery.support.cors = true;

        //Make the ajax request
        $.ajax({
            type: "POST",
            url: webServiceURL,
            data: parameters,
            dataType: "xml",
            crossDomain: true,
            success: OnSuccess, //method called upon success
            error: OnError      //method called upon error/failure
        });

        //1. Get a node value from the web service XML response with $(data).find("nodeName1").text()
        //2. find() returns all the nodes with the given name; if there are child nodes with the
        //   same name, all of the values are returned as a concatenated string.

        //Perform success operations
        function OnSuccess(data, status) {
            document.getElementById('<%= Ctrl2.ClientID %>').value = $(data).find("nodeName1").text();
            document.getElementById('<%= Ctrl3.ClientID %>').value = $(data).find("nodeName2").text();
            document.getElementById('<%= Ctrl4.ClientID %>').value = $(data).find("nodeName3").text();
        }

        //Perform error operations
        function OnError(request, status, error) {
            alert('Error calling web service. Please enter the metadata manually!');
        }
    }
</script>
</head>
<body>
<table>
    <tr>
        <td>
            <!-- TextBox that takes user input as a parameter to the web service -->
            <asp:TextBox ID="Ctrl1" runat="server" />
        </td>
        <td>
            <!-- TextBox that displays a value from the web service -->
            <asp:TextBox ID="Ctrl2" runat="server" />
        </td>
    </tr>
    <tr>
        <td>
            <!-- TextBox that displays a value from the web service -->
            <asp:TextBox ID="Ctrl3" runat="server" />
        </td>
        <td>
            <!-- TextBox that displays a value from the web service -->
            <asp:TextBox ID="Ctrl4" runat="server" />
        </td>
    </tr>
    <tr>
        <td>
            <!-- Activity indicator container: hidden by default -->
            <div id="activityIndicator" style="display: none;">
                <img id="imgActivityIndicator" src="Images/ajax-loader.gif" alt="Loading" />
            </div>
        </td>
        <td>
            <!-- CallWebService() function called 'onclick' of a button -->
            <input type="button" name="BtnFetchData" title="Fetch Data" value="Fetch Data" onclick="CallWebService();" />
        </td>
    </tr>
</table>
</body>
</html>

Useful links:

1. You may generate ajax loader icons from Ajax Load Info

2. JQuery script files: www.jquery.com


Deploy InfoPath 2010 with managed code as a 'Content Type' via a 'Feature' in SharePoint 2010

1. Creating InfoPath:
I have created the following InfoPath form for demonstration purposes; please ignore its content. Before we go further, I assume you have your own InfoPath form ready for deployment.

InfoPath form

InfoPath form

2. Publish InfoPath: Please follow the below steps to publish the InfoPath form

Step 1: Change the Security Level to “Full Trust”.

Note: This is required because the private assembly for a managed-code form template runs under a hosted CLR application domain, and the security settings for such forms require full trust.

Step 2: Click on 'Info' and you should find 'Publish your form'.

 

Step 3: Locate the path where you want to publish and name the template appropriately

Step 4: If you are not sure you may leave this field empty

Step 5:  Publish the form.

3. Create Visual  Studio Project:

Step 1 : Create an empty SharePoint project

Step 2: Add a feature and name it appropriately. Please make sure the feature's scope is 'Site', as this InfoPath form is deployed as a content type.

Step 3:  Add a new Element File
Note: Empty elements are most often used to define SharePoint project items that lack a project or project item template in Visual Studio, such as fields. When you add an empty element to your project, it contains a single file that is named Elements.xml. Use XML statements to define the desired elements in Elements.xml

Creating Empty Element

Step 4: Add the dll and the InfoPath to the element folder as shown below

Step 5: Configure element properties as shown below.
Note: Use the XsnFeatureReceiver class to trap events that are raised after an InfoPath form template installation, uninstallation, activation, or deactivation on the server. The assembly and class name used to trap these events must be referenced in the feature.xml file used to deploy the SharePoint feature containing one or more InfoPath form templates.

The feature.xml file would look as below:
<?xml version="1.0" encoding="us-ascii" standalone="yes"?>
<Feature Id="39EBA28E-2238-4C82-A37E-81BAEBFB7A61" 
    Title="Simple Form Template" 
    Description="This feature deploys infopath as content type" 
    Version="1.0.0.0" 
    Scope="Site" 
    ReceiverClass="Microsoft.Office.InfoPath.Server.Administration.XsnFeatureReceiver" 
    ReceiverAssembly="Microsoft.Office.InfoPath.Server, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" 
    xmlns="http://schemas.microsoft.com/sharepoint/">
    <ElementManifests>
        <ElementManifest Location="Elements.xml" />
        <ElementFile Location="InfoPathasFeature.xsn" />
    </ElementManifests>
    <ActivationDependencies>
        <ActivationDependency FeatureId="C88C4FF1-DBF5-4649-AD9F-C6C426EBCBF5" />
    </ActivationDependencies>
</Feature>

Step 6: Include the .xsn and the DLL as part of the deployment files.
If you preview your feature, you will see that there are no items in it except for 'Elements.xml'. See how it changes after completing this step.

Modify the xsn properties, change the ‘Deployment Type’ to ‘Element File’

Modify the dll properties, change the ‘Deployment  Type’ to ‘ElementFile’

Note: Setting the 'Deployment Type' to 'ElementFile' specifies the deployment of element files to SharePoint. Element files are referenced by the feature manifest (feature.xml). The default path for ElementFile is {SharePointRoot}\TEMPLATE\FEATURES.

After completing the above modifications, you will find the items in the feature as shown below.

Step 7: Package and Deploy the project

Step 8: After you are done with deployment, verify if the folder structure is shown as below

Modify List Settings:

Follow the below steps to set the default content types to the form library.

Navigate to the advanced setting of the ‘Form Templates’ library and select ‘Yes’ as shown below.

Navigate to the 'Add Content Types' screen; if you select "All Groups" or, preferably, "Microsoft InfoPath", you should find the "InfoPathasFeature" content type in the list.

Check if the content type is added:

You are done! Check whether the InfoPath form shows up when you select 'New Document', as shown below.

References:

1. How to: Deploy InfoPath Form Templates with Code
2. Security levels of InfoPath forms
3. XsnFeatureReceiver Class


Gotchas- InfoPath 2010

Error1:

I encountered the error "Microsoft InfoPath has stopped working" while trying to open VSTA (Microsoft Visual Studio Tools for Applications) from the InfoPath form; I was trying to use the 'Edit Form Code' option available in InfoPath. I tried troubleshooting for a while, searched for errors in my event log and researched a bit to find a solution, but failed. The following solution did work for me, but there could be a hundred different reasons why it fails for you; hopefully it saves you some time.

Solution: For unknown reasons, VSTA failed to locate the code it was referencing previously. I removed the references to the code using the 'Remove Code' option in InfoPath, reattached the code, and everything worked seamlessly. This basically involves navigating to Developer > Language > Form Options > Programming > Remove Code. After completing this, point 'Project Location' to where the form code existed before and hit 'OK'.

Error2:

Task failed because "sgen.exe" was not found, or the .NET Framework SDK v2.0 is not installed. The task is looking for "sgen.exe" in the "bin" subdirectory beneath the location specified in the SDKInstallRootv2.0 value of the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework. You may be able to solve the problem by doing one of the following: 1.) Install the .NET Framework SDK v2.0. 2.) Manually set the above registry key to the correct location. 3.) Pass the correct location into the "ToolPath" parameter of the task.

Solution:

This error started coming up when I added a web reference in VSTA and tried to build the application. The error is all about the "Generate Serialization Assembly" project setting: it was initially set to "Auto", so I set it to "Off" and the build was successful. When the setting was "Auto", VSTA was trying to find sgen.exe (the XML Serializer Generator, a tool that creates an XML serialization assembly for the types in a specified assembly in order to improve the startup performance of XmlSerializer when it serializes or deserializes objects of those types).

Error3:

I encountered the following error while trying to run my VSTA project, which loads my InfoPath form for debugging: "InfoPath cannot open the selected form because of an error in the form's code. Policy settings prevent opening Internet forms with managed code. To fix this problem, contact your system administrator."

Solution: 

It appears that forms need to be fully trusted if they contain managed code, so I gave the form 'Full Trust'. This worked for me but might not fit your environment; please be cautious when using this option, as it allows the InfoPath form to access files and settings on the computer.


Sandbox Solutions – SharePoint 2010

You must have come to this post searching for 'Sandbox Solutions' in one of your favorite search engines. Good, you are in the right place to learn the ins and outs of sandbox solutions.

Why do I need a Sandbox Solution?

Scenario 1# Sometimes I am a dumb programmer.
Programming is like handwriting: the start is the most important part, and the better your writing habits are, the better your writing style becomes. For a programmer, if good coding habits are not adopted and exercised right from the beginning, he may somehow manage to get the stuff working, but he will never realize it is not the best way to do it. Especially with SharePoint, even after taking great care in disposing of SharePoint objects, there is a chance that bad coding practices creep in, leaving some objects undisposed, which can potentially throttle the server with high memory usage.

You might be thinking of asking me why the garbage collector is not taking care of this. When using objects of type 'SPSite' or 'SPWebApplication', part of the work does not run under managed code at all. You shouldn't be surprised to know that CRUD (create, read, update, delete) operations against the content database are performed by a COM component that does not run managed code, so the garbage collector is not even aware of these objects. This is one reason why you have to dispose of your SharePoint objects promptly. To handle situations like this, a 'Sandbox Solution' is the answer.

Scenario 2# I hate my administrator
As a developer, once you finish your development and have your solution package ready, you are eager to deploy it to staging or production. It is not uncommon for your administrator to ask you to come back later. It is even more frustrating if you are a site collection administrator and still cannot deploy your own solutions to your site without the administrator's intervention. A 'Sandbox Solution' is the answer.

Scenario 3# I hate my developer
As an admin, I really get scared when my developers approach me and say, "The solution package is ready. Can you please deploy it to production?" At that moment I have so many things going through my mind: Will this solution bring my portal down? I am not sure what the code does; will it delete all my existing web applications? Is the code accessing the local file system and exposing sensitive data? Even with tools like SPDisposeCheck that can check for memory leaks, there is very little chance of validating the code against all bad practices. A 'Sandbox Solution' is the answer.

Scenario 4# My client always has a big NO!
For instance, consider yourself a service provider or a product development company. When clients buy your product or SharePoint solution, what guarantee do they have that your solution does not contain malicious code that could access their local files or sensitive data? How can they trust that your code is not consuming too many of their server resources? There may also be situations where the client will not let you change their folder/file structure at all, leaving the developer handicapped, unable to change the web.config or add files to the 14/12 hive. A 'Sandbox Solution' is the answer.

Fortunately, to handle situations like the above, Microsoft introduced 'Sandbox Solutions'. So what is a solution? Development is like solving a puzzle: when you have finished solving it and put the pieces together, it is time to pack it up. So you develop and pack all your folders and files together into a file with a .wsp extension, as the solution to your client's requirements. Your client has SharePoint installed and needs to deploy this solution in their farm. Once deployed, all these files go into the folders that were created when SharePoint was installed, and SharePoint picks them up when the end user requests them via the application you built. Continue reading to understand the concept of solution packaging and how 'Sandbox Solutions' address the scenarios mentioned above…