
Design, Manage and Monitor – Microsoft Azure Storage Services

I have been meaning to get around to this article for a long time. This blog post is for developers or folks who want to cut through the clutter and get a ten-thousand-foot view of what Azure Storage is, when to use which service, and the various API options available. I assume that you have some understanding of what Azure is and are familiar with some of the Azure Storage jargon as well.

What is Azure storage? (Understanding with analogy)

A perfect analogy would be: let’s assume that you are relocating from Texas to Virginia. You said your good-byes, wrapped up your boring office send-offs, took pictures, packed all your stuff, and bagged a few memories. You drove to Virginia and are negotiating with a storage service provider like U-Haul or EzStorage to store all your stuff. You might have boxes of all sizes: small, big, and some extra large; fragile items, wall hangings, suitcases, maybe an old piano or an antique car, etc. Everybody has unique needs based on what they have to store, and these storage providers will allocate space based on what you want to store and how you want to store it. You might need a garage for your car, or you might just need a small shelf shared with others for storing small boxes. Of course, you pay for their service, based on the size, kind of storage, location of the storage, etc. Azure Storage Services is no different.

Now relate Azure Storage to the storage facility: Azure Storage Services also stores stuff for you, but instead of boxes or wall hangings, you might have VHDs (Virtual Hard Disks), big and small images, high-definition videos, Office documents, etc. Whatever your needs are, Azure will accommodate them. Azure provides various storage services to host different file types, catering to various use cases.

Let us get started! The following sections provide use cases to help you understand when to use the different storage options.

  1. Table Services

  2. Blob Services

  3. Queue Services

  4. File Share Services

Table Service
Typical use cases:

  • For basic NoSQL/Key-Value functionality and the ability to scale quickly.
    • NoSQL (“Not only SQL”) is revolutionary and is the underlying concept of Big Data.
    • What’s Big Data? In layman’s terms, when storage systems first appeared, humans contributed most of the data, but with the evolution of millions of smart gadgets like smart TVs, smart refrigerators, and smart cameras, everything is interconnected and all of these products contribute terabytes if not petabytes of data.
      If your applications spend resources processing real-time data, you might lose some of the data being fed into these systems continuously; you need a way to quickly “somehow” save this data in some format and process it later. This “somehow” is NoSQL, which means you need a type of storage that can quickly take whatever is input and store it for later processing.
      When demand for resources reaches its limit (while adding real-time data), the underlying systems automatically scale out, and scale up if needed. This is where Azure will help you.
  • If you have a need to input and quickly retrieve millions of entities (similar to rows in SQL) this is a good choice.
    • If you are not following a relational schema, it becomes a whole lot easier to input and extract data.
      (Note: You should have a Row and Partition key to fetch data fast, or else querying becomes slow as the entities grow.)
  • If your data storage requirements exceed 150 GB and you are reluctant to manually shard or partition your data.
  • If you want to keep your costs low. The Table storage service is comparatively cheaper than the other storage services.
    • Note: Microsoft has been optimizing the Azure SQL Database service to reduce overall costs as well as ongoing administration costs; with changes like this, more and more customers are preferring SQL Database to the Table service.
  • If you require robust disaster recovery capabilities that span geographical locations.
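To make the Partition/Row key note above concrete, here is a tiny in-memory sketch in plain JavaScript (no Azure SDK involved; the table and entity names are made up for illustration) of why a lookup by PartitionKey + RowKey stays fast while a filter on any other property forces a full scan:

```javascript
// A minimal in-memory model of how Table storage addresses an entity.
// Entities live in partitions; PartitionKey picks the partition,
// RowKey picks the entity inside it. Together they form the primary key.
function createTable() {
  var partitions = {};
  return {
    insert: function (entity) {
      var p = partitions[entity.PartitionKey] || (partitions[entity.PartitionKey] = {});
      p[entity.RowKey] = entity;
    },
    // Point query: two hash lookups, independent of table size.
    pointQuery: function (partitionKey, rowKey) {
      var p = partitions[partitionKey];
      return p ? p[rowKey] : undefined;
    },
    // Property filter without keys: every entity must be scanned.
    scan: function (predicate) {
      var results = [];
      for (var pk in partitions)
        for (var rk in partitions[pk])
          if (predicate(partitions[pk][rk])) results.push(partitions[pk][rk]);
      return results;
    }
  };
}

var orders = createTable();
orders.insert({ PartitionKey: "customer-42", RowKey: "order-001", total: 25 });
orders.insert({ PartitionKey: "customer-42", RowKey: "order-002", total: 40 });

var hit = orders.pointQuery("customer-42", "order-002"); // fast path: both keys known
var scanned = orders.scan(function (e) { return e.total > 30; }); // slow path: full scan
```

The real service does the same thing at scale, which is why queries that supply both keys stay fast as the table grows into millions of entities.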

Avoid:

  • If your data has complex relationships, in other words, data that requires an RDBMS (go with SQL Azure).
  • If you need support for secondary indexes, this is not an option.
  • If your code depends on ADO.NET or ODBC.

Blob Service (Block blob & Page Blob)
Typical use cases

  • If you have tons of static files like pictures, videos, PDFs, Office documents, etc., this is the default choice.
    • As part of modernization, more companies are migrating to the cloud; it could be websites, applications, or even whole infrastructures. There might be hundreds, thousands, or even millions of supporting files. The Azure Blob service can host these files for you.
    • Depending on the size of the file you are trying to store, you have two options:
      • Block Blob
        • Each block blob can be up to 200 GB, and you can upload block blobs with a size up to 64 MB in one operation.
      • Page Blob
        • Each page blob can be up to 1 TB in size and consists of a collection of 512-byte pages. You set the maximum size when creating a page blob and can then write or update specific pages.
  • This is the most popular of all the Storage Services; as a matter of fact, Azure uses it for its own storage needs, whether while creating a VM or for storing supporting files/data used by other services on Azure.
  • When people refer to migrating virtual machines to the cloud, they generally prep the machines by running tools like Sysprep and then upload the VHDs to the Blob service.
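To put the block-blob numbers above in perspective, here is a small sketch in plain JavaScript (the helper name is made up for illustration) that plans an upload against the limits quoted in this post: at most 64 MB per block operation and 200 GB per block blob. A large block blob is uploaded as a series of blocks and then committed from the list of block ids.

```javascript
// Sketch: plan a block-blob upload given the limits quoted above
// (64 MB per block operation, 200 GB per block blob at the time of writing).
var MB = 1024 * 1024;
var GB = 1024 * MB;
var MAX_BLOCK_BYTES = 64 * MB;
var MAX_BLOCK_BLOB_BYTES = 200 * GB;

function planBlockUpload(fileSizeBytes) {
  if (fileSizeBytes > MAX_BLOCK_BLOB_BYTES) {
    // Too big for a block blob; a page blob (up to 1 TB) would be needed.
    return { fits: false, blocks: 0 };
  }
  // Each block operation may carry at most 64 MB; the blob is then
  // committed from the list of block ids.
  return { fits: true, blocks: Math.ceil(fileSizeBytes / MAX_BLOCK_BYTES) };
}

var video = planBlockUpload(300 * MB);   // needs 5 blocks (300 / 64, rounded up)
var archive = planBlockUpload(500 * GB); // too large for a block blob
```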

Queue Service(Azure Queues & Service Bus queues)
Typical use cases

  • If you want loosely coupled applications that can interact asynchronously.
    • For instance, you have a system that checks inventory and notifies what needs to be ordered, and another system that does the ordering for you. What if the order-processing VMs or Worker Roles are offline? What happens if one of them is busy or overloaded with traffic? How do you recover from failure? Queue Services will help you with these scenarios.
  • If you have a need for large numbers of messages that need to be accessed from anywhere in the world via authenticated calls using HTTP or HTTPS.
  • Load Leveling & Balancing
    • If you have apps that experience a huge burst of traffic, you could level the load if you could categorize processes that can wait (in the queue) until a free worker becomes available.
    • For load balancing,  you could spin up additional worker roles to process messages and balance the load among the worker processes. Also, this can be automated.
  • Subscription
    • If you have one producer and multiple consumers, queues are an excellent choice. For instance, one system can generate messages (the Producer) while multiple systems subscribe to, consume, and process the messages independently (the Consumers).
  • Scheduled Processing
    • If you do not want to stress your applications during peak traffic hours, you can isolate tasks that can be processed later on a scheduled basis. This could be nightly or weekly; you can add messages to the queue and process them based on the schedule.

Note: Azure Queues and Service Bus Queues work differently; it is good to understand the differences between the two and make informed architectural decisions. Refer to the ‘References’ section for good articles.

Advice on Multiple Queues:
I am not against using multiple queues, but use them sparingly; the idea is, you can achieve a lot with a single queue itself (if architected well).
As a matter of fact, I have used multiple queues myself in the following use cases:
1. Different Priorities: You want different priorities for different messages. The last thing you want is a high-priority message getting stuck behind hundreds of low-priority messages. Create a high-priority queue for these.
2. Large Transactions: If the number of message transactions (Read, Put, Delete) exceeds 500 transactions per second, creating multiple queues will spread the transaction volume across storage partitions.
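The “different priorities” pattern can be sketched with two plain JavaScript arrays standing in for two Azure queues (the queue names and messages below are invented for illustration): a worker always drains the high-priority queue before touching the low-priority one.

```javascript
// Sketch: two in-memory queues standing in for Azure queues, illustrating
// the multiple-queue priority pattern described above.
var highPriorityQueue = []; // e.g. an Azure queue named "orders-high"
var lowPriorityQueue = [];  // e.g. an Azure queue named "orders-low"

function enqueue(message, isUrgent) {
  (isUrgent ? highPriorityQueue : lowPriorityQueue).push(message);
}

// One unit of worker processing: take the next message, preferring high priority.
function dequeueNext() {
  if (highPriorityQueue.length > 0) return highPriorityQueue.shift();
  if (lowPriorityQueue.length > 0) return lowPriorityQueue.shift();
  return null; // both queues empty, the worker can back off
}

enqueue("nightly report", false);
enqueue("nightly cleanup", false);
enqueue("payment failed alert", true);

var first = dequeueNext(); // the urgent message jumps the line
```

With a single queue, the alert would have waited behind both nightly messages; with two queues, the worker sees it first.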

File Storage
Typical use cases

  • Share data across on-premises and cloud servers
    • If you have a hybrid setup and applications that need to share data across on-premises and cloud environments.
  • Migrate file share-based applications
    • You can migrate applications to the cloud with no code changes.
  • Integrate modern applications
    • With more devices, distributed globally, accessing the same information, there is a dire need to share resources and build more file-sharing applications. The service also supports a REST API for building modern applications.
  • High Availability
    • If all you need is a file share accessed via the Server Message Block 3.0 (SMB) protocol, it is highly efficient to use the Azure File Share service, backed by Azure infrastructure, redundancy, and SLA.
  • Sharing files between Services
    • If you have VMs or different services that need to share files this is an excellent choice.

Note: If your application relies heavily on file attributes and file processing, and not just regular reads/writes, be aware that certain features are unsupported.
Click Here for the complete list.

After all these services are provisioned, there should be a way to consume them. The following screen capture provides a quick view, or cheat sheet, of what the Storage API provides. Most of the operations are self-explanatory based on their naming. I will strive to write more articles on each of these and provide examples in C# leveraging the REST API.

Azure Storage Services Quick Reference:
Azure Storage Services Quick Reference

Conclusion:

I was confused about when to use which service, but after working hands-on and a lot of reading, I was able to assimilate some of it, though not everything. I made an honest attempt to provide a quick read giving an overall view of what the Azure Storage Services are, what to choose when, and when to avoid each. I also concluded by summarizing most of the operations you can perform via the Azure Storage API. Hope this article made some sense and helped you with something.

References:

[1] Azure Queues and Service Bus queues – compared and contrasted

[2] Azure Storage Queues VS Azure Service Bus Queues

 


Code, Manage and Collaborate with Bitbucket and Visual Studio

This blog post is a quick demo on how to integrate Visual studio with an existing code repository on Bitbucket.

There are many blogs that explain Git and Bitbucket in great detail. You may follow the resources below to get a basic understanding of Git and how Bitbucket uses Git.

This post assumes that you have the following prerequisites:

  1. Bitbucket account with appropriate credentials. If you do not have one, Sign Up Here
  2. Visual Studio 2013 or higher installed.

Let’s get started!

What is Git?

“Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency.”

Once you install Git, you will be able to create a local repository, commit your changes locally and then push to the cloud or origin.

Installing Git is easy, and this post will cover it for you.

What is Bitbucket?

Bitbucket is a cloud-based flavor of Git for professional teams. This is how it works: when you add or modify files on your local computer, the changes are committed to the local repository (i.e., the Git repository) and then synced to Bitbucket. You will learn more as you read through this blog post.

Git is the underlying engine that does all the heavy lifting; Bitbucket is a wrapper that leverages Git.

Following are the key steps to integrate Visual Studio with Bitbucket.

 Step 1: Install Git. (Link to download)

Please follow the steps and install Git.

Git Install Step 1

Git Install Step 2

Git Install Step 3

Git Install Step 4

Git Install Step 5

Git Install Final Step

Step 2: Create Code Repository in Bitbucket.

Please follow the steps and create Bitbucket Repository.

Create Bitbucket Repository Step 1

Create Bitbucket Repository Step 3

Once the repository is created, navigate to the ‘Overview’ section within the repository.
Copy and save the highlighted URL, you will need this later.

Note:
You should see the following only if you are working with an existing repository.
If you followed this blog post and created a new repository you will not see the following.

Step 3: Use Visual Studio to create a local Git Repository.

a. Navigate to the ‘Team Explorer’ within visual studio.

  • Click ‘Clone’ option under Local Git Repositories.
  • Provide the URL that you saved in the previous steps in the first text field.
  • Give the path to the local folder where you want the Git repository to be created in the second text field.

Create Local Git Repository using Visual Studio Step 1


b. Enter the user name and password for the Bitbucket account.

Create Local Git Repository using Visual Studio Step 3

c. Check if the repository was created in Visual Studio and on the local file system.

Visual Studio:

Create Local Git Repository using Visual Studio Step 2

Local File System:

A hidden folder ‘.git’ will be created (check your ‘Folder Options’ to show hidden items).
This indicates that the folder is being watched and tracked by Git.

Local Git Repository

Step 4: Add a new project to the Git Repository.

Important!: When you create a new project, make sure you are creating it using ‘New’ under the Git repository.
Double-click on the Git repository name and you should see the following.
Create new project in the Git Repository

DO NOT create the project from the ‘New’ option within the ‘File’ menu in Visual Studio.
You can re-associate it with the Git repository later, but this is the cleanest and easiest way of doing it.

Give an appropriate name to the project and make sure the project location matches the Git repository location.
Create a new project in the local Git Repository

If you look closely, you will find a ‘plus’ icon next to the files, indicating that they are under version control.
Visual Studio Project created in the local Git Repository

Step 5: Commit and sync the project to Bitbucket.

Right-click on the solution name and hit ‘Commit’. Remember, this commits only to the local Git repository!

Commit the newly created project

Give appropriate comments. You are still committing to the local Git repository.

Add comments while committing the project

Hit ‘Sync’. This moves the changes to Outgoing Commits. You are still working against the local Git repository.

Sync to Bitbucket or origin

Push the changes to Bitbucket using the ‘Push’ option under Outgoing Commits. THIS PUSHES TO Bitbucket!

Outgoing commits to origin

Check if you can find the newly created project in Bitbucket.

Code Uploaded to Bitbucket Repo

Congratulations on your first integration with a cloud-hosted version control management system.

Happy Coding!


Using JavaScript Promises with SharePoint 2013 – Deferring executeQueryAsync to be a little Synchronous

If you have worked with asynchronous programming, it’s a no-brainer that you have enjoyed writing high-performing apps.

But I am sure that on some rare occasions you have felt, “Alas! I wish this small portion of my code worked synchronously and would just wait for an action to complete!”

Most of the time it’s a fire-and-forget scenario with asynchronous calls, but there are certain scenarios where you want your code to return something. This gets especially tricky if you are working with asynchronous logic inside loops. JavaScript Promises come to your rescue!

This blog post is not meant to educate you on jQuery or JavaScript Promises, but just to let you know that something like this exists and to show how you can use it with SharePoint. I will not leave you disappointed; the following are some good articles that will get you started.

Article 1: http://blog.qumsieh.ca/2013/10/31/using-jquery-promises-deferreds-with-sharepoint-2013-jsom/

Article 2: http://www.vasanthk.com/jquery-promises-and-deferred-objects/

Spread the word for your fellow developers


I was working on a typical use case of updating SharePoint list items on a SharePoint Online site. Let me explain the use case in detail: I have two SharePoint lists; one is a document library and the other is a custom list. The document library has to be updated based on certain values from the custom list. Of course, the data has to be manipulated based on certain conditions and multiple business rules before updating the document library.

The initial thought would be to use any of the tools at my disposal or out-of-the-box features to get this working, but after exhaustive research on the internet, I could not find any way to accomplish this without customization. So, as a last resort, I decided to write some code.

NOTE: Since this is a SharePoint Online site, the REST API or JSOM (JavaScript Object Model) would be the default choices. I am using JSOM in JavaScript for this example.

Following is the general thought process

Step 1: Get the list of items asynchronously  from Source List (i.e. SharePoint Custom List)

Step 2: Loop through the items and update the Destination List (i.e. SharePoint Document Library)

Ideally everything should work seamlessly, but Step 2 will fail with the asynchronous way of programming. When you loop through the items fetched from the source list, it works great, but any updates or actions that you perform within the loop may not work.

The loop proceeds irrespective of whether the action is complete. If you are dealing with one item and no loop is involved, it works fine. But I am dealing with hundreds of items and too many of these loops.
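The core pattern can be sketched in a few lines of plain JavaScript before getting into the full jQuery-deferred version. `fakeExecuteQueryAsync` below is a made-up stand-in for `executeQueryAsync` (which obviously cannot run outside SharePoint); the wrapper turns the success/failure callback pair into a promise the caller can chain on instead of firing and forgetting:

```javascript
// Minimal sketch: wrap an API that takes (success, failure) callbacks,
// like executeQueryAsync, so it returns a promise instead.
// fakeExecuteQueryAsync is a stand-in; it "succeeds" on the next tick.
function fakeExecuteQueryAsync(onSuccess, onFailure) {
  setTimeout(function () { onSuccess("42 items loaded"); }, 0);
}

function executeQueryPromised() {
  return new Promise(function (resolve, reject) {
    fakeExecuteQueryAsync(
      function (result) { resolve(result); },  // success callback resolves
      function (sender, args) { reject(args); } // failure callback rejects
    );
  });
}

// The caller can now sequence work instead of firing and forgetting:
var pending = executeQueryPromised().then(function (result) {
  return "processed: " + result;
});
```

jQuery’s `$.Deferred()` (used in the code below) plays the same role: the success callback resolves the deferred, the failure callback rejects it, and the caller waits on `d.promise()`.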

Using JQuery Promise for SharePoint 2013

 

NOTE:

The following code has clear and detailed inline comments. Everything should be self-explanatory!

The following code was developed for one-time use, is meant for educational purposes only, and is not recommended for production. I can say upfront that the naming conventions and error handling/logging are not production-ready. You may need to update the code and it may require some rework. Of course, there is always a better way of doing things. Use at your own risk!


<!--Add JQuery reference-->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.12.0/jquery.min.js"></script>

<script type="text/javascript">

//Get the source list items to loop through and update the destination list.
function GetSourceListItems() {

//Assign the name of the list to use as Source List
var strSourceListName = "SourceListName";

//Get the current SharePoint Context: Represents the context for objects and operations
var oContext = SP.ClientContext.get_current();

//Get the current web
var oWeb = oContext.get_web();

//Get the source list using the Name or GUID
var oList = oWeb.get_lists().getByTitle(strSourceListName);

//CAML Query to query the Source List
//Following is the Query to get items less than ID 800.
var strQuery = '<View><Query> <Where> <Leq> <FieldRef Name=\'ID\' \/> <Value Type=\'Counter\'>800<\/Value> <\/Leq> <\/Where> <\/Query><\/View>';
var oQuery = new SP.CamlQuery();
oQuery.set_viewXml(strQuery);

//Get the items based on Query
var allItems = oList.getItems(oQuery);
oContext.load(allItems);

//Make an asynchronous call.
oContext.executeQueryAsync(Function.createDelegate(this, function () { onQuerySuccess(allItems); }),
Function.createDelegate(this, this.onQueryFailed));
}

//On Query success loop through the items and update the destination list:
function onQuerySuccess(allItems) {

//Returns an enumerator that iterates through the SP.ClientObjectCollection Class.
var ListEnumerator = allItems.getEnumerator();

//Names of the columns in the source list to use
var sourceColumnArray = ["SourceListColumn1", "SourceListColumn2", "SourceListColumn3"];

//Iterate though list items and update the destination list.
while (ListEnumerator.moveNext()) {
//Get the current item
var currentItem = ListEnumerator.get_current();

//Create an array of values that should be used to update the destination list.
var valuesArray = [];

//Loop through the source column array and populate the array.
for (var i = 0; i < sourceColumnArray.length; i++) {
var value = currentItem.get_item(sourceColumnArray[i]);
valuesArray.push(value);
}

//Get the value of the common key that relates both the Source and Destination Lists.
//Use the same key to query against the destination list.
var commonKey = currentItem.get_item('commonColumnName');

//Check the length of array and send it update the destination list.
if (valuesArray.length > 0) {
UpdateDestinationListItem(commonKey, valuesArray);
}
}
}

//On Query failure log message
function onQueryFailed(sender, args) {
//Log Error
}

//Update the destination list
//Use the common key to query the destination list
//Use the values array to update the values of the destination list
function UpdateDestinationListItem(commonKey, valuesArray) {
//****Important 1.0****//
//Use of JQuery deferred
var d = $.Deferred();

//Give the name of the destination list to update with values.
var destinationListName = "DestinationListName";

//Use the key from the source list to query the destination list.
var strDistinationCAMLQuery = '<View><Query><Where><Eq><FieldRef Name=\'FileLeafRef\' \/> <Value Type=\'File\'>' + commonKey + '<\/Value> <\/Eq> <\/Where> <\/Query><\/View>';

//Get the current SharePoint Context: Represents the context for objects and operations
var oClientContext = SP.ClientContext.get_current();

//Get the current web
var oWeb = oClientContext.get_web();

//Get the destination list using the Name or GUID
var oDestinationlist = oWeb.get_lists().getByTitle(destinationListName);

//Specifies a Collaborative Application Markup Language (CAML) query on a list or joined lists.
var oCamlQuery = new SP.CamlQuery();
oCamlQuery.set_viewXml(strDistinationCAMLQuery);

//Get the list of items using the CAML Query
var oCollListItem = oDestinationlist.getItems(oCamlQuery);
oClientContext.load(oCollListItem);

//****Important 2.0****//
//Use of JQuery deferred
//Send list of items, client context, array of values from Source List
var o = { d: d, listOfItems: oCollListItem, clientContext: oClientContext, valuesArray: valuesArray };

oClientContext.executeQueryAsync(Function.createDelegate(o, successCallback), Function.createDelegate(o, failCallback));

return d.promise();

function successCallback() {

var itemCount = this.listOfItems.get_count();

var destinationColumnArray = ["DestinationListColumn1", "DestinationListColumn2", "DestinationListColumn3"];

var listItemInfo;

if (itemCount > 0) {
//****Important 3.0****//
//Use of JQuery deferred: Using 'listOfItems'
var listItemEnumerator = this.listOfItems.getEnumerator();

//Iterate though the list of items fetched from destination list
while (listItemEnumerator.moveNext()) {

var oCurrentListItem = listItemEnumerator.get_current();

for (var i = 0; i < destinationColumnArray.length; i++) {
//Write to console for debugging
console.log('Updating Value');

//****Important****//
//Destination list item getting updated.
oCurrentListItem.set_item(destinationColumnArray[i], this.valuesArray[i]);

//Write to console for debugging
console.log('Update Successful');
}
oCurrentListItem.update();

this.clientContext.executeQueryAsync(onUpdateItemSuccess, onUpdateItemFailed);
}
}
//Resolve the deferred so that callers waiting on the promise can proceed.
this.d.resolve();
}

function failCallback() {
this.d.reject("Failed at failCallback() method");
}

function onUpdateItemSuccess() {
console.log("Item Update Successful");
}

function onUpdateItemFailed() {
console.log("Item Update Failed");
}
}
</script>

I am not sure if this blog post gave you exactly what you wanted, but I would be glad if you discovered JavaScript Promises for the first time via my blog post. Happy Coding!


What is Azure?

Explaining Azure in simple terms can be a one-minute conversation, but if you attempt to deep-dive into its various services, it can take months. No kidding, I am serious!

Microsoft provides over 200 services, and you would be spellbound by the rate at which new features get added on a daily basis.

This blog post does not get into the weeds of each of the services, but will cover a few basic fundamentals that will help you understand the terminology and assimilate some of the technical jargon.

Before even starting on what Azure is, it is a good idea to talk a little about the Cloud, as Azure itself is Microsoft’s flavor of Cloud. There are many other Cloud service providers, such as Amazon Web Services, Akamai, CenturyLink, CSC, Dimension Data, Fujitsu, Google, IBM (SoftLayer), Interoute, Joyent, Rackspace, Verizon, Virtustream, VMware, etc., and the list goes on…

So what is the Cloud? Why do companies want to be on the Cloud? Why are there companies providing these services?

Technology plays a key role in managing business operations and customer relations.

The following image demonstrates Azure replacing some or all of the on-premises services.

Azure replacing On-Premise Services

Azure replacing On-Premise Services

The shift towards the Cloud is not always about money; it is also about ease of use and peace of mind. This is one of the many reasons why companies are leaning towards Cloud computing and designing their own Cloud strategies.

The term Cloud has been ubiquitously ambiguous. There is no strict definition of what qualifies a company as a Cloud service provider. For instance, one provider may offer only half of the services provided by its competitors.

For instance, Rackspace does a very good job of providing IaaS services, and they can provide solid references from their clientele for those services. Overall, as a customer, you would assume that these service providers would at least provide IaaS/PaaS services.

On a lighter note, you cannot run a couple of servers out of your basement and call yourself a Cloud provider. In reality, you should be able to scale up or scale out dynamically to meet demand, provide security, adhere to an SLA of 99.9% uptime, and much more. It is not an easy task. The Cloud is serious business!

Enough said; let’s get started on some of the terminology used within Azure, much of which is common to most Cloud service providers.

1. Public, Private and Hybrid Cloud

Public Private and Hybrid Cloud Offerings

Public Private and Hybrid Cloud Offerings

2. IaaS, PaaS, SaaS

The following is just a thirty-thousand-foot view; explaining these in detail would require a complete blog post by itself.

If you are interested in knowing more in depth please follow this article.

As a matter of fact, with the Cloud maturing day by day the differences are getting blurrier.

IaaS vs. PaaS vs. SaaS

IaaS vs. PaaS vs. SaaS

3. Azure Services

Everything in Azure is termed a ‘Service’. It could be a VM, a Website, or a CDN (Content Delivery Network).

Azure services are categorized as shown below; to explore further, please follow this article.

Azure Services

Azure Services

4. Azure Portals

Microsoft provides easy-to-use web portals to leverage its services. There are currently two portals: the production portal and the new portal. The intention of the new portal is to keep up with ever-changing user experience demands and to be mobile friendly. Both are meant for the same purpose. Following are some of the key differences:

  • Not all features are migrated into the new portal, some of them may be deprecated or merged with other features.
  • New Portal has RBAC (Role-Based Access Control)
    • The current production portal does not have this feature, which means that if you are logged in, you are an Admin. Scary!
    • With RBAC you can assign roles to users for read-only, write, contribute, or even a delegated admin role.
  • More modern user interface, even more mobile friendly.
  1. Production URL: https://manage.windowsazure.com/

Windows Azure Production Portal

Quick Peek: Windows Azure Production Portal

  2. Preview Portal: https://portal.azure.com/

    Azure Preview Portal

    Quick Peek: Azure Preview Portal

5. Subscriptions

Azure Subscriptions

Azure Subscriptions

 

6. Pricing Calculator

This is a very useful tool to price and configure Azure services for your scenarios. If you have a pre-paid subscription, your credit status will appear as an overlay on top of the window.

Credit Status

Credit Status

The following is a screen capture of the pricing calculator, indicating how much it would cost to add ‘DocumentDB’ as a service.

Pricing Calculator

Pricing Calculator

I hope you enjoyed reading this article and found it useful!


Provider Hosted App (Add-in) on SharePoint Online in 5 steps – Step 5/5

Step 5: Create Publishing Profiles for SharePoint Online and publish

  • Create a new Publishing Profile for SharePoint app and provide the Client ID and Client Secret

New Publishing Profile to deploy SharePoint APP

New Publishing Profile to deploy SharePoint APP Step 2

New Publishing Profile to deploy SharePoint APP Step 3

Publish the app or hit F5, and you should see screens similar to the following.

App Install Page

  • The following is a screen capture of the app; this is just simple HTML and JavaScript added to the default.aspx page.

SharePoint Online Showing the Final App

Note: This blog post has been split into multiple posts, as they render faster with fewer images, which greatly enhances the experience for readers accessing this via mobile devices.


Provider Hosted App (Add-in) on SharePoint Online in 5 steps – Step 4/5

Step 4: Create Publishing Profiles for Azure Web App and publish

Create a new Publishing Profile for Azure web app and publish

  • Import the Azure publishing profile downloaded in Step 1
    Import Publishing Profile
  • Once you import the publish settings file, you will find something similar to the following; validate your connection.
    Import Publishing Profile Summary

Select Profile Debug Publish

  • Preview will show the files that will be published to Azure Web App.

Optional Preview Step

  •  Publish the Azure Web App.

Continue to Step 5: Create Publishing Profiles for SharePoint Online and publish>>

 



Provider Hosted App (Add-in) on SharePoint Online in 5 steps – Step 3/5

Step 3: Register the app and update the Visual Studio

  • Step 3 a: Register the app in SharePoint
    • Go to the URL on your SharePoint Online site: https://yourname.sharepoint.com/_layouts/15/appregnew.aspx
    • Click ‘Generate’ for the Client Id and Client Secret; new ids will be generated
    • Give a title to the app of your choice
    • Add app domain: this is your Azure web app without HTTP or HTTPS
    • Redirect URI: You may enter the URI: http://yourazureurl/default.aspx
      • This part is tricky; some developers use just HTTP and no /default.aspx at the end
    • Please follow Register SharePoint Add-in from MSDN for more information
      Register App on SharePoint online

      This is the most important step of all. Please save the following info; you will need it later!

      App Identifier Created Successfully Message

  • Step 3 b: Update the web.config file of the Azure Web App project: update ‘Client Id’ and ‘Client Secret’
    Web Config Changes for Azure Web App
  • Step 3 c: Update the Permissions section of Manifest.xml by double-clicking the file
    Change the App Manifest Permissions
  • Step 3 d: Update the code of Manifest.xml by right-clicking the file and clicking ‘<> View Code’
    App Manifest Code File Changes

Continue Step 4: Create Publishing Profiles for Azure Web App and publish>>



Provider Hosted App (Add-in) on SharePoint Online in 5 steps – Step 2/5

Step 2: Configure Visual Studio for development

Step 2 a: Create a new project of Type ‘Apps for SharePoint’

Create Visual Studio Project

Enter the SharePoint Site URL where this app needs to be deployed and select Provider-hosted

Site Name and Select Provider Hosted APP

You will be prompted to enter your Office 365 credentials so Visual Studio can connect and detect the version of SharePoint

Connect Visual Studio with Office 365

Please select SharePoint Online

Select the SharePoint version

Leave the default Option, unless you want to use certificate authentication.

App Authentication

Choose the web application project of your choice

Select Web Application Type

  • Two projects will be created
    • One for SharePoint online – ThermoConverterApp
    • One for Azure Web App     – ThermoConverterAppWeb

If you publish/deploy ‘ThermoConverterAppWeb’ it will be deployed to Azure, and if you publish/deploy ‘ThermoConverterApp’ it will be deployed to SharePoint. Notice the different icons, which make the two projects easier to tell apart.

Visual Studio with two project

Once you have completed all the above steps, check the properties of the SharePoint project; the ‘SiteURL’ should match your SharePoint Online site.

SharePoint Project Properties

Continue to Step 3: Register the app and update Visual Studio >>



Provider Hosted App (Add-in) on SharePoint Online in 5 steps – Step 1/5

This blog post is a refresher for folks who already know what SharePoint apps are and the basic difference between a SharePoint-hosted and a provider-hosted app.

This post does not get into the weeds; it focuses on connecting the dots to quickly set up a provider-hosted app for development. If you would like to choose a hosting pattern for your development, read this MSDN article.

A picture is worth a thousand words; most of this blog post consists of images, and they speak for themselves.

Let’s get started! Following are the steps to get a basic Provider Hosted App up and running!

Step 1: Create an Azure Web App to host your code and download the Publishing Profile

Step 2:  Configure Visual Studio for development

  • Step 2 a: Create a new project of Type ‘Apps for SharePoint’
  • Step 2 b: Choose .NET 4.5 as target framework
  • Step 2 c: Enter the SharePoint Site URL where this app needs to be deployed
  • Step 2 d: Choose the type of app as ‘Provider-hosted app’
  • Step 2 e: Enter Office 365 credentials to connect Visual Studio with Office 365
  • Step 2 f:  Create a new publishing profile, import the Azure web app profile and Publish

Step 3: Register the app and update Visual Studio

Step 4: Create Publishing Profiles for Azure Web App and publish

Step 5: Create Publishing Profiles for SharePoint Online and publish

Step 1: Create an Azure Web App to host your code and download the Publishing Profile

Create New Web App

Create Azure Web App

Create Azure Web App Running

Download Publishing Profile

  • Download the publishing settings and save the file for later use in Step 4

Download Publishing Profile
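If you prefer scripting over portal clicks, the same web app can also be provisioned from the command line. This is a sketch using the cross-platform Azure CLI, which postdates the portal experience shown in the screenshots; it assumes the `az` CLI is installed and you are logged in, and the resource group, plan, and app names are placeholders of my choosing.

```shell
# Create a resource group, an App Service plan, and the web app
az group create --name MyResourceGroup --location eastus
az appservice plan create --name MyPlan --resource-group MyResourceGroup --sku FREE
az webapp create --name thermoconverterapp --resource-group MyResourceGroup --plan MyPlan

# Download the publishing profile -- the same file Step 4 imports into Visual Studio
az webapp deployment list-publishing-profiles --name thermoconverterapp \
    --resource-group MyResourceGroup --xml > thermoconverterapp.PublishSettings
```

Either way, keep the downloaded .PublishSettings file handy; Step 4 imports it into the Visual Studio publish wizard.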

Continue to Step 2: Configure Visual Studio for development >>



Powershell Create IIS Site

<#
.SYNOPSIS
 Create an IIS site

.DESCRIPTION
 - This script creates an IIS site based on user input
 - The script backs up the existing IIS configuration and
   restores that backup if something goes wrong
 - This script is meant for education/demonstration purposes only and is not meant for production use
 - The script reads its inputs interactively; you could take the inputs as arguments instead. For instance,
   #$siteName    = Read-Host 'What is the name of your site?'
       could be
   #$siteName    = $args[0]
   You could then invoke the script as InstallIISSite.ps1 [WebsiteName] [AppPoolName] [Port] [Path] ([domain\user] [password])

.NOTES
 - This script requires admin privileges!
#>

# Add 'WebAdministration' modules to the current session. 
# The modules must be installed on the local computer or a remote computer

Import-Module WebAdministration

#Suspends activity in the script or session for the specified period of time in seconds
Start-Sleep 2 
 
# Get the user inputs for SiteName, AppPool, Port, Path for the site, user and password
$siteName    = Read-Host 'What is the name of your site?'           
$appPoolName = Read-Host 'What is the name of the application pool?'
$port        = Read-Host 'Specify the port number'
$path        = Read-Host 'Choose the path for your website'
$user        = Read-Host 'Specify the user in the following fashion: domain\username'
$password    = Read-Host 'Enter password' -AsSecureString 

<# Or you can have the following

$siteName    = $args[0]
$appPoolName = $args[1]
$port        = $args[2]
$path        = $args[3]
$user        = $args[4]
$password    = $args[5]

You may invoke the script as
InstallIISSite.ps1 [WebsiteName] [AppPoolName] [Port] [Path] ([domain\user] [password])
#>
 
#Backup the IIS Configuration
#Create the name for back up folder
$backupName = "$(Get-date -format "yyyyMMdd-HHmmss")-$siteName"

#"Backing up IIS config to backup named $backupName"
# Backup will be created at C:\Windows\System32\inetsrv\backup
$backup = Backup-WebConfiguration $backupName

 
try { 
    # Delete the website & app pool if already existed
    if (Test-Path "IIS:\Sites\$siteName") {
        "Removing existing website $siteName"
        Remove-Website -Name $siteName -Confirm
    }
 
    if (Test-Path "IIS:\AppPools\$appPoolName") {
        "Removing existing AppPool $appPoolName"
        Remove-WebAppPool -Name $appPoolName -Confirm
    }
 
    #Remove sites using the same port
    foreach($site in Get-ChildItem IIS:\Sites) {
        if( $site.Bindings.Collection.bindingInformation -eq ("*:" + $port + ":")){
            "Warning: Found an existing site '$($site.Name)' already using port $port. Removing it..."
             Remove-Website -Name  $site.Name -Confirm
             "Website $($site.Name) removed"
        }
    }
 
    "Create an appPool named $appPoolName with v4.0 framework"
    $pool = New-WebAppPool $appPoolName
    $pool.managedRuntimeVersion = "v4.0"
    # Read-Host returns an empty string (not $null) when the user just presses Enter,
    # so test for an empty string rather than comparing against $null
    if (-not [string]::IsNullOrEmpty($user)) {
        "Setting Application Pool to run under $user"
        # Convert the SecureString password back to plain text for the app pool identity
        $bstr = [Runtime.InteropServices.Marshal]::SecureStringToBSTR($password)
        $plainPassword = [Runtime.InteropServices.Marshal]::PtrToStringAuto($bstr)
        $pool.processModel.identityType = 3   # SpecificUser
        $pool.processModel.userName = [string] $user
        $pool.processModel.password = $plainPassword
    }
    else
    {
        # If no user name is specified, run the pool as NetworkService
        $pool.processModel.identityType = 2
    }
	
    $pool | Set-Item

    if ((Get-WebAppPoolState -Name $appPoolName).Value -ne "Started") {
        throw "Application pool $appPoolName was created but did not start!"
    }
 
    "Create a website $siteName pointing at $path and create with port $port"
    $website = New-Website -Name $siteName -PhysicalPath $path -ApplicationPool $appPoolName -Port $port
 
    if ((Get-WebsiteState -Name $siteName).Value -ne "Started") {
        throw "Website $siteName was created but did not start!"
    }
 
    "Website and Application Pool were created and started successfully"
} 
catch {
    "Oops, something went wrong! Rolling back the web server to its initial state. Please wait..."
    Start-Sleep 3
    Restore-WebConfiguration $backupName
    "IIS restore complete."
    throw
}