
SP Error: ‘New Document’ requires a Microsoft SharePoint Foundation-compatible application and web browser

We encountered the following error whenever a user tried to create a new document from any SharePoint document library using Internet Explorer (IE). It worked fine in all other browsers.

Note: We are using on-premises SharePoint 2013 and Office 2010 Professional Plus.

“New Document requires a Microsoft SharePoint Foundation-Compatible application and web browser. To add a document to this document library, click the ‘Upload Document’ button”

IE pop-up for new Word documents


Based on our observation, the error is caused by missing IE add-ons that are normally installed along with the Office package.
I believe an Office update removed these add-ons, or the Office installation was corrupted by patching.

SharePoint Add-ons


Solution: Repair your Office installation.

  1. From the Start menu, click Control Panel, and then click Programs.
  2. On the Control Panel, click Programs and Features.
  3. Scroll down to the list of installed programs, select Microsoft Office, and then click ‘Change’.
  4. In the Change dialog, click Repair and then click Continue.
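
If you need to repeat the repair on many machines, the GUI steps above can also be scripted. The following is only a rough sketch, assuming an MSI-based Office 2010 install; the product code in braces is a placeholder you would look up for your own Office SKU (for example under the Uninstall registry keys or via your software inventory tool):

# Hedged sketch: script a full repair of an MSI-based Office 2010 install.
# {PRODUCT-CODE} is a placeholder GUID for your Office product.
Start-Process msiexec.exe -ArgumentList '/fvomus {PRODUCT-CODE} /qb' -Wait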

Conclusion:

I hope this post helped resolve your issue. If it did not, all the best troubleshooting the root cause, and please leave a comment describing how you resolved it. Happy coding!


Some files can harm your computer in SharePoint 2013 (Internet Explorer Only!)

We had the annoying pop-up below showing up for some users while opening documents within SharePoint 2013. Surprisingly, this issue DOES NOT occur when the same document is accessed from a document library, but when the document is opened from a SharePoint page or from a link within a Content Search or Content Query web part, the message below pops up.

Some Files Can Harm Your Computer

The following are some alternative solutions listed in various forums and blogs that might resolve the issue for you; they did not work for me. The solution that actually worked is listed at the very end.

Solution 1: Add URL to trusted sites.

Add the site URL to the Trusted Sites zone in Internet Explorer. If this works for you, work with your company to push the setting to all users, for example via Group Policy. Unfortunately, this did not work for me; I already had the site URL in my trusted sites. A sketch of the per-user registry change behind this setting follows below.
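
For reference, this is roughly what such a push would do on each client. The domain name is a placeholder, and in most organizations this is driven by the "Site to Zone Assignment List" Group Policy setting rather than a script:

# Hypothetical sketch: add yourdomain.com to the IE Trusted Sites zone (zone 2) for the current user.
$zonePath = "HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains\yourdomain.com"
New-Item -Path $zonePath -Force | Out-Null
# The value name is the URL scheme; a value of 2 maps the site to the Trusted Sites zone.
Set-ItemProperty -Path $zonePath -Name "https" -Value 2 -Type DWord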

Solution 2: Disable the SharePoint OpenDocuments Class add-on for Internet Explorer

Disable the SharePoint OpenDocuments Class add-on in Internet Explorer. This works 100% of the time, but we cannot expect every user to do it manually, so it would have to be pushed via Group Policy or a similar mechanism. We also did not want to deal with the consequences of disabling this add-on. I will not go into the details of the add-on here; I will save that as a topic for another blog post.

SharePoint OpenDocuments Class

Solution 3: Follow the blog

Solution 4: Add a script to suppress calls to the add-on (worked for me!)

The following screen capture explains it all. The issue does not occur within a document library because the ‘onmousedown’ and ‘click‘ events there do NOT call the add-on (see below).

But for a link within Content Query results on a page, the ‘onmousedown’ and ‘click‘ events DO call the add-on in IE (Internet Explorer).

SharePoint OpenDocuments Class Script

 

Script:

So, as a solution, we added a script to the master page that identifies anchor tags pointing to PDF files and strips the add-on calls from the two event attributes, as shown below.


<script type="text/javascript">
    // Requires jQuery to be loaded on the page (e.g. via the master page).
    var replaceStrings = ["SharePoint.OpenDocuments.3", "SharePoint.OpenDocuments"];
    var anchorAttributes = ["onclick", "onmousedown"];

    $(document).ready(function () {
        // Find anchors pointing to PDF files and strip the OpenDocuments add-on
        // calls from their 'onclick' and 'onmousedown' attributes.
        $('a[href*="pdf"]').each(function () {
            for (var i = 0; i < anchorAttributes.length; i++) {
                for (var j = 0; j < replaceStrings.length; j++) {
                    var value = $(this).attr(anchorAttributes[i]);
                    if (value) {
                        $(this).attr(anchorAttributes[i], value.replace(replaceStrings[j], ''));
                    }
                }
                // Log the cleaned attribute for debugging.
                console.log($(this).attr(anchorAttributes[i]));
            }
        });
    });
</script>


Some SharePoint users always show up as DOMAIN\username

This is a common issue that most SharePoint admins or developers come across: some users do not resolve to First Name Last Name, but instead resolve as ‘domain\username’.

For instance, a user resolves as microsoft\sdakoju instead of Susheel Dakoju in the SharePoint address book.

The following two steps seem to have worked for me. I am jotting them down here in case they help someone out there facing the same issue.

  1. Run Set-SPUser PowerShell Command

    1. Specifies that user information will be synchronized from the user directory store.
      Set-SPUser -Identity "i:0#.w|microsoft\sdakoju" -Web https://yourdomain/ -SyncFromAD
      
  2. Delete the user from the UIL (User Information List) – (Optional)

    1. The User Information List (“/_catalogs/users/simple.aspx” or “_catalogs/users/detail.aspx”) is a hidden list in each site collection that is visible and accessible only to site collection administrators. It stores metadata about each user.
    2. It is completely safe to delete the user from this list; the entry will be recreated the next time the user visits the site. A scripted alternative is sketched just below this list.
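
If you prefer to script the optional cleanup in step 2, a minimal server-side SharePoint PowerShell sketch might look like the following (the claims-encoded login and site URL are placeholders). Removing the user from the site collection clears the stale User Information List entry, and it is recreated on the user's next visit:

# Remove the stale user entry from the site collection (and therefore from the User Information List)
Remove-SPUser -Identity "i:0#.w|microsoft\sdakoju" -Web https://yourdomain/ -Confirm:$false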

As for how this happened in the first place, I did not perform a root cause analysis due to the pressing timeline to resolve the issue.

There could be multiple reasons; the following are a few possibilities:

  • The username or some other user information may have been edited in Active Directory.
    For instance, when a user moves from temp to full time, the username might change to one without the temp suffix:

    • sdakoju-temp to sdakoju or
    • sdakoju-domain1 to sdakoju-domain etc.
  • There could be an error or issue while the SharePoint profile synchronization is running.
    • Check the error log for more details.
  • Core user profile services and other supporting services might not have recovered after an unplanned reboot or patching.

Happy troubleshooting!


Clone user permissions on SharePoint Online – Office 365

I had a request from one of my clients to clone the permissions of one user to another for a SharePoint Online site.

I created a simple PowerShell script that automates this process. It is a very common scenario: someone leaves the organization and somebody else takes over that role, or you want a backup for an employee and need the backup person to have the same permissions.

It is a simple task, and I hope the following script comes in handy for you. Happy scripting! You can also download the script from SPO_CloneUserPermissions.ps1.


<#
.SYNOPSIS
Clone the permissions of one user to another in SharePoint Online

.AUTHOR
Susheel Dakoju

.DATE
04/21/2017

.DESCRIPTION
1. Fetch all the groups that the user is part of
2. Add ActualUser to the same groups

.Note
- This script requires admin privileges on the machine it is executed on.
#>

# SharePoint Online admin site URL
$SPOAdminURL = "https://yoursite-admin.sharepoint.com/"
$username = "username@onmicrosoft.com"
$password = "@@@@@@@@"

#Url of the SharePoint Online Site
$SPOSiteURL = 'https://yoursite.sharepoint.com/sites/test'

#User used as reference
$ReferenceUser = 'sdakoju@onmicrosoft.com'
#The actual user that needs to be added to Groups
$ActualUser = 'rkevin@onmicrosoft.com'

#Create a credential object
$cred = New-Object -TypeName System.Management.Automation.PSCredential -argumentlist $userName, $(convertto-securestring $Password -asplaintext -force)

#Connect to SharePoint Online using the credentials
Connect-SPOService -Url $SPOAdminURL -Credential $cred

#Get the SharePoint Online Site Object
$site = Get-SPOSite $SPOSiteURL

#Get the user object of the reference user
$user = Get-SPOUser -Site $site -LoginName $ReferenceUser

#Loop through Groups and add the actual user
$user.Groups | ForEach-Object {

    #Fetch Group Object that the reference user is part of
    $group = Get-SPOSiteGroup -Site $site -Group $_

    #Add 'ActualUser' to the same group that the reference user is part of
    Add-SPOUser -Site $SPOSiteURL -LoginName $ActualUser -Group $group.LoginName
}

Note:

– It is assumed that you have the SharePoint Online Management Shell module installed on the machine where you run this script. If you do not, please follow this link to download the SharePoint Online Management Shell.
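
Alternatively, on machines where PowerShellGet is available, the module can be installed straight from the PowerShell Gallery; a quick sketch (run from an elevated PowerShell prompt):

# Install the SharePoint Online Management Shell module from the PowerShell Gallery
Install-Module -Name Microsoft.Online.SharePoint.PowerShell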

– I hit the following errors while running the script; I am jotting them down here as they may be helpful to you.

Error 1:

  • Identity Client Runtime Library (IDCRL) could not look up the realm information for a federated sign-in. Or
  • The partner returned a bad sign-in name or password error. For more information, see Federation Error-handling Scenarios. Or
  • The ‘username’ argument is invalid. Or

Solution:

  • I was using the wrong username. Make sure you have the right admin username and password for the SharePoint Online site you are running against.

Error 2:

  • The site https://yoursite.sharepoint.com/sites/test/ is not properly formed

Solution:

  • There is a trailing slash at the end of the URL. It should be ‘https://yoursite.sharepoint.com/sites/test’ and NOT ‘https://yoursite.sharepoint.com/sites/test/’.
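
To guard against this, you could normalize the URL near the top of the script; a simple sketch:

# Strip any trailing slash so Get-SPOSite receives a properly formed site URL
$SPOSiteURL = $SPOSiteURL.TrimEnd('/')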

An error occurred accessing your Microsoft SharePoint Foundation site files

We encountered the following exception while trying to open a SharePoint 2013 site using SharePoint Designer:

“An error occurred accessing your Microsoft SharePoint Foundation site files. Authors- if authoring against a web server, please contact the Webmaster for this server’s Web site. WebMasters – please see the server’s application event log for more details.”

SharePoint Designer Error


After exhaustive troubleshooting, we discovered that the New Relic configuration was causing the issue. This may or may not be the case for you!

Step 1: First, try disabling the New Relic agent altogether and see whether it is causing the issue.

The following screen capture shows the path to navigate to and the XML element to change in order to disable the New Relic .NET agent: set agentEnabled="false".

New Relic Agent Enabled


Once you have confirmed that New Relic is causing the issue, try the following option.

Step 2: Disable Browser Auto-Injection with the agent still enabled.

The following screen capture indicates the changes needed to disable Browser Auto-Injection with the agent STILL ENABLED.

Browser Auto Injection disabled


Browser Auto-Injection automatically injects a piece of JavaScript into the <head> of the page that tracks front-end page views and other end-user metrics. Here is New Relic's documentation on this: https://docs.newrelic.com/docs/browser/new-relic-browser/additional-standard-features/page-views-understanding-your-sites-popularity

The impact of disabling Auto-Injection is that you will no longer receive these end-user metrics unless you use the copy/paste method to manually add the JavaScript code, described here: https://docs.newrelic.com/docs/browser/new-relic-browser/installation-configuration/adding-apps-new-relic-browser#copy-paste-app

There are a few options for adding the JavaScript in SharePoint; you could:

  • Create a new master page and add the JavaScript to the <head> there. https://msdn.microsoft.com/en-us/library/dn205273.aspx
  • Use the AdditionalPageHead delegate control in your SharePoint solution. http://www.fivenumber.com/understanding-sharepoint-delegate-control/
  • Using Designer, add the script as you would in any other HTML editor. However, we have witnessed odd behavior where Designer places the <head> tags inside the <body> tags, which caused the copy/paste instrumentation not to work. The workaround is to add the script right below the DOCTYPE declaration, outside of any actual tags; the script will still be rendered inside the <head> tags on actual deployment.

Conclusion:

The idea behind sharing this information is to encourage you to look at other dependencies, such as network configuration or third-party modules and agents, that might be causing the issue. In our case it was New Relic; it could be something else for you.

Disclaimer: New Relic is undoubtedly one of THE BEST monitoring tools available on the market. You just need to know how to tweak its configuration to suit your needs.


Design, Manage and Monitor – Microsoft Azure Storage Services

I have been meaning to get around to this article for a long time. This blog post is for developers and anyone who wants to cut through the clutter and get a ten-thousand-foot view of what Azure Storage is, when to use which service, and the various API options available. I assume that you have some understanding of what Azure is and are familiar with some Azure Storage jargon as well.

What is Azure storage? (Understanding with analogy)

A perfect analogy would be this: let's assume that you are relocating from Texas to Virginia. You said your good-byes, wrapped up your boring office send-offs, took pictures, packed all your stuff, and bagged a few memories. You drove to Virginia and are now negotiating with a storage service provider like U-Haul or EzStorage to store all your stuff. You might have boxes of all sizes – small, big, and some extra large; fragile items, wall hangings, suitcases, maybe an old piano or an antique car. Everybody has unique needs based on what they have to store, and these storage providers will allocate space based on what you want to store and how you want to store it. You might need a garage for your car, or you might just need a small shelf shared with others for storing small boxes. Of course, you pay for the service, and you pay based on the size, the kind of storage, the location of the storage, and so on. Azure Storage Services is no different.

Now relate Azure Storage to the storage facility. Azure Storage Services also stores stuff for you; instead of boxes or wall hangings, you might have VHDs (virtual hard disks), big and small images, high-definition videos, Office documents, and so on. Whatever your needs are, Azure will accommodate them. Azure provides various storage services to host different file types and cater to various use cases.

Let us get started! The following sections provide use cases to help you understand when to use each storage option.

  1. Table Services

  2. Blob Services

  3. Queue Services

  4. File Share Services

Table Service
Typical use cases:

  • For basic NoSQL/Key-Value functionality and the ability to scale quickly.
    • NoSQL (Not only SQL) is revolutionary and the underlying concept of Big Data. 
    • What’s Big Data? In layman’s terms, when the concept of storage started, humans were the ones contributing data; but with the evolution of millions of smart gadgets like smart TVs, smart refrigerators, and smart cameras, everything is interconnected and all of these products contribute terabytes if not petabytes of data.
      If your applications spend their resources processing real-time data, you might lose some of the data being fed to these systems continuously, so you need a way to quickly “somehow” save this data in some format and process it later. This ‘somehow’ is NoSQL, which means you need a type of storage that can quickly take whatever is inputted and store it for later processing.
      When the demand for resources reaches its limit (while adding real-time data), the underlying systems automatically scale out, and scale up if needed. This is where Azure will help you.
  • If you have a need to input and quickly retrieve millions of entities (similar to rows in SQL) this is a good choice.
    • If you are not following relational schema it becomes a whole lot easier to input and extract data.
      (Note: You should query by partition key and row key to fetch data fast; otherwise, as the number of entities grows, querying becomes slow.)
  • If your data storage requirements exceed 150 GB and you are reluctant to manually shard or partition your data.
  • If you want to keep your costs low. Table storage is comparatively cheaper than the other storage services.
    • Note: Microsoft has been optimizing Azure SQL Database service to reduce overall costs and also ongoing administration costs, with a change like this, more and more customers are preferring SQL Database to Table Service.
  • If you require robust disaster recovery capabilities that span geographical locations.

Avoid:

  • If your data has complex relationships – in other words, data that requires an RDBMS (go with SQL Azure).
  • If you need support for secondary indexes, this is not an option.
  • If your code depends on ADO.NET or ODBC.
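
To make the Table service a little more concrete, the following is a minimal sketch using the classic Azure PowerShell storage cmdlets; the account name, key, and table name are placeholders, and entity-level reads and writes go through the .NET storage SDK types (or the REST API) rather than dedicated cmdlets:

# Build a storage context from the account name and key (both placeholders)
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"

# Create a table; entities in it are addressed by PartitionKey + RowKey,
# which is why designing those keys up front matters for query speed
New-AzureStorageTable -Name "Orders" -Context $ctx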

Blob Service (Block blob & Page Blob)
Typical use cases

  • If you have tons of static files like pictures, videos, PDFs, Office documents, etc., this is the default choice.
    • As part of modernization, more companies are migrating to the cloud – websites, applications, or even whole infrastructures. There might be hundreds, thousands, or even millions of supporting files, and the Azure Blob service can host these files for you.
    • Depending on the size of the file you are trying to store, you have two options
      • Block Blob
        • Each block blob can be up to 200 GB, and you can upload block blobs up to 64 MB in size in a single operation.
      • Page Blob.
        • Each page blob can be up to 1 TB in size and consists of a collection of 512-byte pages. You set the maximum size when creating a page blob, and then you can write or update specific pages.
  • This is the most popular of all the storage services; as a matter of fact, Azure uses it for its own storage needs, such as when you create a VM or when other Azure services store their supporting files and data.
  • When people refer to migrating virtual machines to the cloud, they generally prep the machine by running tools like Sysprep and then upload the VHDs to the Blob service.
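
As a quick illustration, the following sketch uploads a local file as a block blob with the classic Azure PowerShell cmdlets; the account name, key, container name, and file path are placeholders:

# Storage context (account name and key are placeholders)
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"

# Create a private container and upload a local file into it as a block blob
New-AzureStorageContainer -Name "images" -Permission Off -Context $ctx
Set-AzureStorageBlobContent -File "C:\temp\logo.png" -Container "images" -Blob "logo.png" -Context $ctx

# List the blobs that landed in the container
Get-AzureStorageBlob -Container "images" -Context $ctx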

Queue Service (Azure Queues & Service Bus Queues)
Typical use cases

  • If you want loosely coupled applications that can interact asynchronously.
    • For instance, you have a system that checks inventory and flags what needs to be ordered, and another system that does the ordering for you. What if the order-processing VMs or worker roles are offline? What happens if one of them is busy or overloaded with traffic? How do you recover from failure? Queue services help with these scenarios.
  • If you need to handle large numbers of messages that must be accessible from anywhere in the world via authenticated calls over HTTP or HTTPS.
  • Load Leveling & Balancing
    • If you have apps that experience huge bursts of traffic, you can level the load by categorizing processes that can wait (in the queue) until a free worker becomes available.
    • For load balancing, you can spin up additional worker roles to process messages and balance the load among the worker processes. This can also be automated.
  • Subscription
    • If you have one producer and multiple consumers, queues are an excellent choice. For instance, one system generates messages (the producer) and multiple systems subscribe to and consume the messages (the consumers), processing them independently.
  • Scheduled Processing
    • If you do not want to stress your applications during peak traffic hours, you can isolate tasks that can be processed later on a scheduled basis. Whether nightly or weekly, you can add messages to the queue and process them based on the schedule.

Note: Azure Queues and Service Bus Queues work differently; it is good to understand the differences between the two so you can make informed architectural decisions. Refer to the ‘References’ section for good articles.

Advice on multiple queues:
I am not against using multiple queues, but use them sparingly; the idea is that you can achieve a lot with a single queue (if architected well).
As a matter of fact, I have used multiple queues myself in the following cases (a short Queue service sketch follows below):
1. Different priorities: You want different priorities for different messages. The last thing you want is a high-priority message getting stuck behind hundreds of low-priority messages. Create a high-priority queue for these.
2. Large transaction volumes: If the number of message transactions (read, put, delete) exceeds 500 transactions per second, creating multiple queues will spread the transaction volume across storage partitions.
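
For orientation, here is a minimal sketch of adding and reading a message with the classic Azure PowerShell cmdlets and the underlying .NET CloudQueue object; the account name, key, and queue name are placeholders:

# Storage context (account name and key are placeholders)
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"

# Create a queue and push a message onto it (the producer side)
$queue = New-AzureStorageQueue -Name "orders" -Context $ctx
$message = New-Object -TypeName Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage -ArgumentList "Process order 1001"
$queue.CloudQueue.AddMessage($message)

# A worker (the consumer side) would later read and then delete the message
$received = $queue.CloudQueue.GetMessage()
$queue.CloudQueue.DeleteMessage($received)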

File Storage
Typical use cases

  • Share data across on-premises and cloud servers
    • If you have a hybrid setup with applications that share data across on-premises and cloud servers.
  • Migrate file share-based applications
    • You can migrate applications to the cloud with no code changes.
  • Integrate modern applications
    • With more globally distributed devices accessing the same information, there is a growing need to share resources and build file-sharing applications. The File service also supports a REST API for building modern applications.
  • High Availability
    • If all you need is a file share accessed via the Server Message Block 3.0 (SMB) protocol, it is highly efficient to use the Azure File Share service, backed by Azure infrastructure, redundancy, and SLA.
  • Sharing files between Services
    • If you have VMs or different services that need to share files this is an excellent choice.

Note: If your application relies heavily on file attributes or file-system features beyond regular reads and writes, be aware that certain features are not supported.
Click Here for the complete list.
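
A minimal sketch of creating a share and copying a file into it with the classic Azure PowerShell cmdlets (account name, key, share name, and file path are placeholders); from an Azure VM the same share can also be mounted over SMB:

# Storage context (account name and key are placeholders)
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"

# Create a file share and copy a local file into it
New-AzureStorageShare -Name "teamshare" -Context $ctx
Set-AzureStorageFileContent -ShareName "teamshare" -Source "C:\temp\report.xlsx" -Path "report.xlsx" -Context $ctx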

Once all these services are provisioned, you need a way to consume them. The following screen capture provides a quick view, or cheat sheet, of what the Storage API provides. Most of the operations are self-explanatory based on their naming. I will strive to write more articles on each of these and provide examples in C# leveraging the REST API.

Azure Storage Services Quick Reference:
Azure Storage Services Quick Reference

Conclusion:

I was once confused about when to use which service, but after hands-on work and a lot of reading, I was able to assimilate some of it, though not everything. I made an honest attempt to provide a quick read giving an overall view of what the Azure Storage Services are, what to choose when, and when to avoid each, and I concluded by summarizing most of the operations you can perform via the Azure Storage API. I hope this article made sense and helped you with something.

References:

[1] Azure Queues and Service Bus queues – compared and contrasted

[2] Azure Storage Queues VS Azure Service Bus Queues

 


Code, Manage and Collaborate with Bitbucket and Visual Studio

This blog post is a quick demo of how to integrate Visual Studio with an existing code repository on Bitbucket.

There are many blogs that explain Git and Bitbucket in great detail. You can follow the resources below to get a basic understanding of Git and how Bitbucket uses Git.

This post assumes that you have the following prerequisites:

  1. A Bitbucket account with appropriate credentials. If you do not have one, Sign Up Here
  2. Visual Studio 2013 or higher installed.

Let’s get started!

What is Git?

“Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency.”

Once you install Git, you will be able to create a local repository, commit your changes locally and then push to the cloud or origin.

Installing Git is easy, and this post walks you through it.

What is Bitbucket?

Bitbucket is a cloud-based Git hosting service for professional teams. This is how it works: when you add or modify files on your local computer, the changes are committed to the local repository (i.e., the Git repository) and then synced to Bitbucket. You will learn more as you read through this blog post.

Git is the underlying engine that does all the heavy lifting and Bitbucket is just a wrapper that leverages Git.

Following are the key steps to integrate Visual Studio with Bitbucket.

 Step 1: Install Git. (Link to download)

Please follow the steps and install Git.

Git Install Step 1

Git Install Step 2

Git Install Step 3

Git Install Step 4

Git Install Step 5

Git Install Final Step

Step 2: Create Code Repository in Bitbucket.

Please follow the steps and create Bitbucket Repository.

Create Bitbucket Repository Step 1

Create Bitbucket Repository Step 3

Once the repository is created, navigate to the ‘Overview’ section within the repository.
Copy and save the highlighted URL, you will need this later.

Note:
You should see the following only if you are working with an existing repository.
If you followed this blog post and created a new repository you will not see the following.

Step 3: Use Visual Studio to create a local Git Repository.

a. Navigate to ‘Team Explorer’ within Visual Studio.

  • Click ‘Clone’ option under Local Git Repositories.
  • Provide the URL that you saved in the previous steps in the first text field.
  • Give the path to the local folder where you want the Git repository to be created in the second text field.

Create Local Git Repository using Visual Studio Step 1

b. Enter the user name and password for the Bitbucket account.

Create Local Git Repository using Visual Studio Step 3

c. Check that the repository was created in Visual Studio and on the local file system.

Visual Studio:

Create Local Git Repository using Visual Studio Step 2

Local File System:

A hidden folder ‘.git’ will be created (check your ‘Folder Options’ to show hidden items).
This indicates that the folder is being watched and tracked by Git.

Local Git Repository

Step 4: Add a new project to the Git Repository.

Important: When you create a new project, make sure you create it using ‘New’ under the Git repository.
Double-click the Git repository name and you should see the following.
Create new project in the Git Repository

DO NOT create the project from the ‘New’ option in the ‘File’ menu in Visual Studio.
You can re-associate it with the Git repository later, but this is the cleanest and easiest way of doing it.

Give appropriate name to the project and make sure your project location matches the Git Repository location.
Create a new project in the local Git Repository

If you look closely, you will find a ‘plus’ icon next to the files, indicating that they are under version control.
Visual Studio Project created in the local Git Repository

Step 5: Commit and sync the project to Bitbucket.

Right-click the solution name and hit Commit. Remember, this only commits to the local Git repository!

Commit the newly created project

Give appropriate comments. You are still committing to local Git repository.

Add comments while committing the project

Hit ‘Sync’. This moves the changes to the outgoing commits; you are still working against the local Git repository.

Sync to Bitbucket or origin

Push the changes to Bitbucket using ‘Push’ option under Outgoing Commits. THIS PUSHES TO Bitbucket!

Outgoing commits to origin

Check that you can find the newly created project in Bitbucket.

Code Uploaded to Bitbucket Repo
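
For readers who prefer the command line over Team Explorer, the same local-commit-then-push flow maps roughly to the following git commands, run from the repository folder (the branch name assumes the default ‘master’ branch of that era):

git add .                          # stage the new project files
git commit -m "Initial commit"     # commit to the local Git repository
git push origin master             # push the commits up to Bitbucket (the origin remote)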

Congratulations on your first integration with a cloud-hosted version control system.

Happy Coding!


Using JavaScript Promises with SharePoint 2013 – Deferring executeQueryAsync to be a little Synchronous

If you have worked with asynchronous programming, it is a no-brainer that you have enjoyed writing high-performing apps.

But I am sure that on some rare occasions you have felt, “Alas! I wish this small portion of my code would work synchronously and just wait for the action to complete!”

Most of the time asynchronous calls are fire-and-forget, but there are certain scenarios where you want your code to return something. This gets especially tricky if you are working with asynchronous logic inside loops. JavaScript Promises come to your rescue!

This blog post is not meant to teach you jQuery or JavaScript Promises, but to let you know that something like this exists and show how you can use it with SharePoint. I will not leave you disappointed; the following are some good articles that will get you started.

Article 1: http://blog.qumsieh.ca/2013/10/31/using-jquery-promises-deferreds-with-sharepoint-2013-jsom/

Article 2: http://www.vasanthk.com/jquery-promises-and-deferred-objects/

Spread the word to your fellow developers!

JavaScript Promise

I was working on a typical use case of updating SharePoint list items on a SharePoint Online site. Let me explain the use case in detail: I have two SharePoint lists, one a SharePoint document library and the other a custom list. The document library has to be updated based on certain values from the custom list. Of course, the data has to be manipulated according to certain conditions and multiple business rules before updating the document library.

The initial thought process was to use any of the tools at our disposal or out-of-the-box features to get this working, but after exhaustive research on the internet, I could not find a way to accomplish this without customization. So, as a last resort, I decided to write some code.

NOTE: Since this is a SharePoint Online site, the REST API or JSOM (JavaScript Object Model) are the default choices. I am using JSOM with JavaScript for this example.

The following is the general thought process:

Step 1: Get the list of items asynchronously from the source list (i.e., the SharePoint custom list).

Step 2: Loop through the items and update the destination list (i.e., the SharePoint document library).

Ideally everything should work seamlessly, but Step 2 will fail with a purely asynchronous approach. Looping through the items fetched from the source list works great, but any updates or actions that you perform within the loop may not work.

The loop proceeds regardless of whether the action has completed. If you are dealing with one item and no loop is involved, it works fine; but I am dealing with hundreds of items and many of these loops.

Using JQuery Promise for SharePoint 2013

 

NOTE:

The following code has clear and detailed inline comments; everything should be self-explanatory.

The following code was developed for one-time use and is meant for educational purposes only; it is not recommended for production use. I can say upfront that the naming conventions and error handling/logging are not production ready. You may need to make updates to the code, and some rework may be required. Of course, there is always a better way of doing things. Use at your own risk!


<!--Add JQuery reference-->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.12.0/jquery.min.js"></script>

<script type="text/javascript">

//Get the source list items to loop through and update the destination list.
function GetSourceListItems() {

//Assign the name of the list to use as Source List
var strSourceListName = "SourceListName";

//Get the current SharePoint Context: Represents the context for objects and operations
var oContext = SP.ClientContext.get_current();

//Get the current web
var oWeb = oContext.get_web();

//Get the source list using the Name or GUID
var oList = oWeb.get_lists().getByTitle(strSourceListName);

//CAML Query to query the Source List
//Following is the Query to get items less than ID 800.
var strQuery = '<View><Query> <Where> <Leq> <FieldRef Name=\'ID\' \/> <Value Type=\'Counter\'>800<\/Value> <\/Leq> <\/Where> <\/Query><\/View>';
var oQuery = new SP.CamlQuery();
oQuery.set_viewXml(strQuery);

//Get the items based on Query
var allItems = oList.getItems(oQuery);
oContext.load(allItems);

//Make an asynchronous call.
oContext.executeQueryAsync(Function.createDelegate(this, function () { onQuerySuccess(allItems); }),
Function.createDelegate(this, this.onQueryFailed));
}

//On Query success loop through the items and update the destination list:
function onQuerySuccess(allItems) {

//Returns an enumerator that iterates through the SP.ClientObjectCollection Class.
var ListEnumerator = allItems.getEnumerator();

//Names of the columns in the source list to use
var sourceColumnArray = ["SourceListColumn1", "SourceListColumn2", "SourceListColumn3"];

//Iterate though list items and update the destination list.
while (ListEnumerator.moveNext()) {
//Get the current item
var currentItem = ListEnumerator.get_current();

//Create an array of values that should be used to update the destination list.
var valuesArray = [];

//Loop through the source column array and populate the array.
for (var i = 0; i < sourceColumnArray.length; i++) {
var value = currentItem.get_item(sourceColumnArray[i]);
valuesArray.push(value);
}

//Get the value of the common key that relates both the Source and Destination Lists.
//Use the same key to query against the destination list.
var commonKey = currentItem.get_item('commonColumnName');

//Check the length of array and send it update the destination list.
if (valuesArray.length > 0) {
UpdateDestinationListItem(commonKey, valuesArray);
}
}
}

//On Query failure log message
function onQueryFailed(sender, args) {
//Log Error
}

//Update the destination list
//Use the common key to query the destination list
//Use the values array to update the values of the destination list
function UpdateDestinationListItem(commonKey, valuesArray) {
//****Important 1.0****//
//Use of JQuery deferred
var d = $.Deferred();

//Give the name of the destination list to update with values.
var destinationListName = "DestinationListName";

//Use the key from the source list to query the destination list.
var strDestinationCAMLQuery = '<View><Query><Where><Eq><FieldRef Name=\'FileLeafRef\' \/> <Value Type=\'File\'>' + commonKey + '<\/Value> <\/Eq> <\/Where> <\/Query><\/View>';

//Get the current SharePoint Context: Represents the context for objects and operations
var oClientContext = SP.ClientContext.get_current();

//Get the current web
var oWeb = oClientContext.get_web();

//Get the destination list using the Name or GUID
var oDestinationlist = oWeb.get_lists().getByTitle(destinationListName);

//Specifies a Collaborative Application Markup Language (CAML) query on a list or joined lists.
var oCamlQuery = new SP.CamlQuery();
oCamlQuery.set_viewXml(strDestinationCAMLQuery);

//Get the list of items using the CAML Query
var oCollListItem = oDestinationlist.getItems(oCamlQuery);
oClientContext.load(oCollListItem);

//****Important 2.0****//
//Use of JQuery deferred
//Send list of items, client context, array of values from Source List
var o = { d: d, listOfItems: oCollListItem, clientContext: oClientContext, valuesArray: valuesArray };

oClientContext.executeQueryAsync(Function.createDelegate(o, successCallback), Function.createDelegate(o, failCallback));

return d.promise();

function successCallback() {

var itemCount = this.listOfItems.get_count();

var destinationColumnArray = ["DestinationListColumn1", "DestinationListColumn2", "DestinationListColumn3"];

var listItemInfo;

if (itemCount > 0) {
//****Important 3.0****//
//Use of JQuery deferred: Using 'listOfItems'
var listItemEnumerator = this.listOfItems.getEnumerator();

//Iterate though the list of items fetched from destination list
while (listItemEnumerator.moveNext()) {

var oCurrentListItem = listItemEnumerator.get_current();

for (var i = 0; i < destinationColumnArray.length; i++) {
//Write to console for debugging
console.log('Updating Value');

//****Important****//
//Destination list item getting updated.
oCurrentListItem.set_item(destinationColumnArray[i], this.valuesArray[i]);

//Write to console for debugging
console.log('Update Successful');
}
oCurrentListItem.update();

this.clientContext.executeQueryAsync(onUpdateItemSuccess, onUpdateItemFailed);
}
}
}

function failCallback() {
this.d.reject("Failed at failCallback() method");
}

function onUpdateItemSuccess() {
console.log("Item Updated Sucessful");
}

function onUpdateItemFailed() {
console.log("Item Updated Failed");
}
}
</script>

I am not sure whether this blog post gave you exactly what you were looking for, but I would be glad if you discovered JavaScript Promises for the first time via this post. Happy coding!


What is Azure?

Explaining Azure in simple terms can be a one-minute conversation, but if you attempt to dive deep into its various services, it can take months. No kidding, I am serious!

Microsoft provides over 200 services, and you would be spellbound by the rate at which new features are added on a daily basis.

This blog post does not get into the weeds of each of the services, but it will cover a few basic fundamentals that will help you understand the terminology and assimilate some of the technical jargon.

Before even starting on what Azure is, it is a good idea to talk a little about the cloud, as Azure itself is Microsoft’s flavor of cloud. There are many other cloud service providers such as Amazon Web Services, Akamai, CenturyLink, CSC, Dimension Data, Fujitsu, Google, IBM (SoftLayer), Interoute, Joyent, Rackspace, Verizon, Virtustream, VMware, and the list goes on…

So what is the cloud? Why do companies want to be on the cloud? Why are there companies providing these services?

Technology plays a very key role in managing business operations and customer relations.

The following image demonstrates Azure replacing some or all of the on-premises services.

Azure replacing On-Premise Services


The shift toward the cloud is not always about money; it is also about ease of use and peace of mind. This is one of the many reasons why companies are leaning toward cloud computing and designing their own cloud strategies.

‘Cloud’ has become a ubiquitous yet ambiguous term. There is no true definition of what qualifies a company as a cloud service provider; for instance, one provider may offer only half the services offered by its competitors.

For instance, Rackspace does a very good job of providing IaaS services and can provide solid references from its clientele for those services. Overall, as a customer you would assume that these service providers would at least provide IaaS and PaaS services.

On a lighter note, you cannot run a couple of servers out of your basement and call yourself a cloud provider. In reality, you should be able to scale up or scale out dynamically to meet demand, provide security, adhere to an SLA of 99.9% uptime, and much more. It is not an easy task. Cloud is a serious business!

Enough said; let’s get started on some of the terminology used within Azure, much of which is common to most cloud service providers.

1. Public, Private and Hybrid Cloud

Public Private and Hybrid Cloud Offerings


2. IaaS, PaaS, SaaS

The following is just a thirty-thousand-foot view; explaining it in detail would require a complete blog post by itself.

If you are interested in knowing more in depth please follow this article.

As a matter of fact, with the Cloud maturing day by day the differences are getting blurrier.

IaaS vs. PaaS vs. SaaS


3. Azure Services

Everything in Azure is termed a ‘service’. It could be a VM, a website, or a CDN (Content Delivery Network).

Azure services are categorized as shown below; to explore further, please follow this article.

Azure Services


4. Azure Portals

Microsoft provides easy-to-use web portals to manage its services. There are currently two portals: the production portal and the new portal. The intent of the new portal is to keep up with ever-changing user experience demands and to be mobile friendly. Both are meant for the same purpose. The following are some of the key differences:

  • Not all features are migrated into the new portal, some of them may be deprecated or merged with other features.
  • The new portal has RBAC (Role-Based Access Control).
    • The current production portal does not have this feature, which means that if you are logged-in, you are an Admin. Scary!
    • With RBAC you can assign users roles such as read-only, write, contribute, or even a delegated admin role.
  • More modern user interface, even more mobile friendly.
  1. Production URL: https://manage.windowsazure.com/

    Windows Azure Production Portal

    Quick peek: Windows Azure production portal

  2. Preview Portal: https://portal.azure.com/

    Azure Preview Portal

    Quick Peek: Azure Preview Portal

5. Subscriptions

Azure Subscriptions


 

6. Pricing Calculator

This is a very useful tool to price and configure Azure services for your scenarios. If you have a prepaid subscription, your credit status will appear as an overlay on top of the window.

Credit Status


The following is a screen capture of the pricing calculator, indicating how much it would cost to add ‘DocumentDB’ as a service.

Pricing Calculator


I hope you enjoyed reading this article and found it useful!


Provider Hosted App (Add-in) on SharePoint Online in 5 steps – Step 5/5

Step 5: Create Publishing Profiles for SharePoint Online and publish

  • Create a new Publishing Profile for SharePoint app and provide the Client ID and Client Secret

New Publishing Profile to deploy SharePoint APP

New Publishing Profile to deploy SharePoint APP Step 2

New Publishing Profile to deploy SharePoint APP Step 3

Publish the app or hit F5, and you should see screens similar to those below.

App Install Page

  • The following is a screen capture of the app; it is just simple HTML and JavaScript added to the default.aspx page.

SharePoint Online Showing the Final App

Note: This blog post has been split into multiple posts because they render faster with fewer images, which greatly improves the experience for readers accessing this via mobile devices.