
Migration Strategy for .NET Framework Applications to the Cloud

This blog post summarizes the various options available for migrating .NET Framework applications to Azure or AWS.

  • Approach 1: Deploy existing .NET apps as Linux containers only
  • Approach 2: Deploy existing .NET apps as Linux containers or EC2 or Azure VMs
  • Approach 3: Deploy existing .NET apps as Windows containers only
  • Approach 4: Deploy existing .NET apps as Native Cloud Services

Comparison of all approaches

| Criteria | Linux containers only | Linux containers and EC2 or Azure VMs | Windows containers only | Native Cloud Services |
|---|---|---|---|---|
| Porting to .NET Core | Required | Required for Linux containers; optional for EC2 or Azure VM-hosted apps | Not required | Not required |
| End-to-end migration time | Very high | High | Low | Medium |
| Ease of migration | Very complex (application needs to be retargeted to .NET Core with a lot of code refactoring) | Complex (apps that cannot be containerized will be hosted on EC2) | Easy (supports lift and shift with some code refactoring) | Moderate (might require some code refactoring based on the adopted service) |


High-level steps of migration (applies to all of the above approaches):

  • Step 1: Run portability analysis using the following tools; a better approach is to use both of them so you get the best of each:
    1. .NET Upgrade Assistant, along with the .NET Portability Analyzer (The .NET Portability Analyzer – .NET)
    2. AWS Porting Assistant for .NET (Porting Assistant for .NET – Amazon Web Services)
  • Step 3: Identify functionality that needs to be refactored to work seamlessly in the cloud. The following are some of the items to consider:
    • Session State: Applications should be stateless so they can scale or switch at will. Session state should be persisted outside the app using services like Redis Cache. Especially with containers, which get dropped and recreated frequently, externalizing session state is critical.
    • Logging: If the apps log to local file systems, they need to be updated to log to the native logging service within AWS or Azure.
    • LDAP Queries: If the apps query LDAP stores such as AD, they should use AWS Cognito or Azure AD instead.
    • Authentication: If the app relies on Windows Authentication or on-premises AD, this might require additional changes to the application.
    • File Persistence: If the apps accept uploaded files that need to be persisted, these files have to be stored outside the app.
    • Miscellaneous:
      • On-Premise dependencies.
      • Email functionality.
      • Custom Domains
      • Apps using GAC (Global Assembly Cache)
  • Step 4: Fix the issues found in Steps 1, 2 and 3.
  • Step 5: Containerize the app: add supporting YAML configuration, make project changes, etc. (a brief sketch follows this list).
  • Step 6: Build, test locally and deploy.
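
For illustration, the following is a minimal sketch of Steps 5 and 6 for a classic ASP.NET (.NET Framework) app targeting Windows containers. The image tag, repository name, and publish folder are hypothetical placeholders, not from any particular app.

# Write a minimal Dockerfile for a classic ASP.NET app, then build and run it locally.
@'
FROM mcr.microsoft.com/dotnet/framework/aspnet:4.8
COPY ./publish/ /inetpub/wwwroot
'@ | Set-Content -Path .\Dockerfile

docker build -t contoso/legacy-webapp:1.0 .           # requires Docker running in Windows-container mode
docker run -d -p 8080:80 contoso/legacy-webapp:1.0    # browse http://localhost:8080 to verify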

Approach 1: Deploy existing .NET apps as Linux containers only

.NET Framework applications must run on Windows, so porting these apps to Linux containers requires retargeting them to .NET Core. This makes this approach the most time-consuming and complex of all the approaches.

Note: As of this writing, .NET 5 is the latest stable version that supports Linux containers. .NET 5.0 is the next major release of .NET Core following .NET Core 3.1.

Microsoft released .NET 5.0 instead of .NET Core 4.0 for the following reasons:

  1. It skipped version numbers 4.x to avoid confusion with .NET Framework 4.x.
  2. It dropped “Core” from the name to emphasize that this is the main implementation of .NET going forward. .NET 5.0 supports more types of apps and more platforms than .NET Core or .NET Framework. [1]
  3. .NET Framework 4.8, released in April 2019, is the last major version of the .NET Framework; “all future investment will be in .NET Core.” [2]

.NET 6 is still in release candidate (RC) releases, which provide early access to features and are feature-complete. These releases are supported for production use since they come with a go-live license.

Please see this white paper on what it takes to convert a simple .NET Framework app to .NET Core to run on Linux containers.

Modernize .NET Applications with Linux Containers

Pros:

  • Upgrading to .NET 5 makes future upgrades easier.

Cons:

  • Requires a lot of changes to the project structure to retarget to .NET 5.
  • Porting Windows apps and .NET libraries to .NET 5 is relatively easy compared to ASP.NET web applications.
  • ASP.NET web applications require quite a lot of code refactoring and changes to the project structure and to how configuration is read and written; source code changes may also be required for unresolved incompatibilities.
  • If there is no .NET 5-compatible NuGet package for a dependency, the application has to wait until a compatible package is released or find alternatives, which can be tedious and time-consuming.
  • Any change to the application requires a deep understanding of the app and how it is intended to function; this can be challenging for a team that is only responsible for migrations.
  • Applications well covered by unit tests are safer to port, as there is a better chance of finding differences in behavior that may cause unexpected results.
  • Some features are not supported in .NET Core; if the application uses any of them, it may have to be rewritten. Please see the complete list of .NET Framework technologies unavailable on .NET Core and .NET 5+.

The following is just a sneak peek of the .NET Portability Analyzer in action; Microsoft has extensive documentation on the tool. Please see more at The .NET Portability Analyzer – .NET.

Run the .NET Portability Analyzer.

The sample output of the Analyzer can be found at

The tool is available as a Visual Studio extension; the following screen capture shows how to target .NET 5.0.

After the extension has been installed, the Portability analysis can be performed as shown below.
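
The .NET Upgrade Assistant mentioned in Step 1 can also be run from the command line. A hedged sketch follows (tool and command names as of this writing; the project file name is a placeholder, so verify against the current docs):

dotnet tool install -g upgrade-assistant
upgrade-assistant analyze .\MyLegacyApp.csproj    # report-only analysis of upgrade blockers
upgrade-assistant upgrade .\MyLegacyApp.csproj    # interactive, step-by-step retargeting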

Approach 2: Deploy existing .NET apps as Linux containers or EC2 or Azure VMs:

This is similar to Approach 1 but accommodates all use cases. If there are any issues with porting to .NET Core or if an app cannot be containerized, in such cases these apps can be hosted within AWS EC2 or Azure VM instances. 

Approach 3: Deploy existing .NET apps as Windows containers only

A .NET Framework application must run on Windows, period. If existing .NET Framework applications have to be containerized, and you can’t or don’t want to invest in migration to .NET Core or later (“if it works properly, don’t migrate it”), the only choice you have for containers is Windows Containers. [3]

Pros:

  • Aligns well with the Rehost migration strategy; applications can be ported as Windows containers easily with very little turnaround time.
  • If there is no need for full-fledged container orchestration, Azure has low-cost managed services that support both Windows- and Linux-based containers.
  • Applications benefit from Windows containers, as the main dependency of the .NET Framework is Windows.
  • Applications with secondary dependencies, like IIS and System.Web in traditional ASP.NET, also benefit from using Windows containers.

Cons:

  • Each application warrants changes to the project structure to support containerization.
  • Architecture and maintenance can get complex depending on the choice between managed and self-managed container orchestration.
  • Code may need to be refactored if functionality is compromised by hosting it within a container.

Approach 4: Deploy existing .NET apps as Native Cloud Services

Microsoft introduced the Cloud Adoption Framework, which recommends how to migrate various applications to Azure. A good strategy should include multiple migration/design choices catering to application needs with the least amount of migration time and modification to the applications.

Please see the flowcharts below to get a high-level understanding of Microsoft’s recommendation of migrating applications and their supporting data sources within Azure.

Flow chart to select a candidate compute service.

Read more at Choosing an Azure compute service – Azure Architecture Center

Flow chart to select a candidate data service.

Read more at Review your data options – Cloud Adoption Framework

Note: This section covers Azure services only; a similar flowchart can be devised for AWS as well. It’s important to establish a clear plan and scope upfront for all the applications, so the team does not have to come back to the drawing board for each application.

Pros:

  • More flexible due to more services/design choices, which eventually reduces the overall migration cost and time.
  • Accommodates all the acceptable migration strategies:
    • Rehost – lift and shift as-is, with no changes to applications.
    • Refactor/Repackage – with a few code changes and some refactoring.
    • Rearchitect
    • Rebuild

References:

  1. What’s new in .NET 5
  2. Microsoft .NET Framework – Microsoft Lifecycle
  3. Deploy existing .NET apps as Windows containers
  4. NuGetPackageExplorer/NuGetPackageExplorer: Create, update and deploy Nuget Packages with a GUI


Error: AzResourceGroupDeployment failed!

I was trying to deploy multiple resources using an ARM template deployment and got interrupted by the following error.

New-AzResourceGroupDeployment : 11:59:25 PM – Error: Code=InvalidTemplateDeployment; Message=The template deployment ‘AzureFunctionsVPN’ is not valid according to the validation procedure. The tracking id is ’16b65e77-8c46-462f-8f4c-a909492f7229′. See inner errors for details.

It is very common to see such errors while working with ARM template deployments. I found the following cmdlet extremely helpful; without it, troubleshooting would have been a herculean task.

Get-AzLog -CorrelationId 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' -DetailedOutput

If you are still using the old RM cmdlets, use the following:

Get-AzureRMLog -CorrelationId xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -DetailedOutput

The above command generates a lengthy JSON output; pay attention to Properties > Status Message > Message.

Note: Please allow at least a minute before you run the above command; it usually takes that long for the log entry to show up, otherwise you might get an empty response.
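
To skip scrolling through the full JSON, the failed entry’s status message can be pulled out directly. This is just a sketch; the property names below match the Az.Monitor output at the time of writing, so verify them against your module version:

$cid = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
Get-AzLog -CorrelationId $cid -DetailedOutput |
    Where-Object { $_.Status.Value -eq 'Failed' } |
    ForEach-Object { $_.Properties.Content['statusMessage'] } |   # JSON string holding the inner error
    ConvertFrom-Json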

You can also get the logs based on a date and time range. For more information please refer to the following link.

https://docs.microsoft.com/en-us/powershell/module/az.monitor/get-azlog?view=azps-3.8.0


Error: No match was found for the specified search criteria for the provider ‘NuGet’

I received the following error while installing Azure PowerShell Modules.

Install-Module -Name Az -AllowClobber

PackageManagement\Install-PackageProvider : No match was found
for the specified search criteria for the provider 'NuGet'.
The package provider requires 'PackageManagement' and 'Provider'
tags. Please check if the specified package has the tags.

Issue:

The issue, as I understand it, is that PowerShell by default uses TLS 1.0 for web requests, which the PowerShell Gallery no longer accepts. So when the Install-Module command tries to install the missing package provider as a prerequisite, and then the Az modules themselves, it fails! See Step 3 to change the default to TLS 1.2.

Solution:

Step 1: Check the PowerShell version installed and update it if required.

$PSVersionTable.PSVersion

Note: Azure PowerShell works with PowerShell 5.1 or higher on Windows, or PowerShell Core 6.x and later on all platforms. You should install the latest version of PowerShell Core available for your operating system. Azure PowerShell has no additional requirements when running on PowerShell Core.

To use Azure PowerShell in PowerShell 5.1 on Windows:

  1. Update to Windows PowerShell 5.1 if needed. If you’re on Windows 10, you already have PowerShell 5.1 installed.
  2. Install .NET Framework 4.7.2 or later.
  3. Make sure you have the latest version of PowerShellGet. Run Update-Module PowerShellGet -Force.

Step 2: Check if the NuGet package provider is installed

Get-PackageProvider -ListAvailable

The following is what I had initially; notice that the NuGet package provider is missing.

Step 3: Run the following command

The following forces PowerShell to use TLS 1.2:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
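
Note that this setting only lasts for the current PowerShell session. One hedged way to make it stick is to append the same line to your PowerShell profile:

Add-Content -Path $PROFILE -Value '[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12'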

Step 4: Re-Run Install-Module command

Install-Module -Name Az -AllowClobber

Note: The above command should have installed the missing package provider as well, and the output should look like the following. See highlighted.
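
If Install-Module still cannot find the provider, you can also install the NuGet provider explicitly (after the TLS change in Step 3); the minimum version below is the one commonly shown in the PowerShellGet docs:

Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force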

References:

https://docs.microsoft.com/en-us/powershell/azure/install-az-ps?view=azps-3.8.0


Azure Service Bus Feature Summary

This post acts as a quick reference for all the popular features that differentiate Service Bus from Storage Queues.

This blog post assumes that you have some fundamental understanding of messaging options within Azure. But if you want to get started and need a more detailed understanding please refer to the following articles.

Microsoft Service Bus

Messaging options in the Microsoft Azure Platform

Service Bus feature summary:

Note: All of the following features apply to both Queues and Topics.

Time to live:
Message time to live determines how long a message stays in the queue before it expires and is removed or dead-lettered. The queue-level default applies to all messages that do not specify a time to live for themselves; when sending a message, it is possible to specify a different time to live for just that message.

Lock Duration:
Sets the amount of time that a message is locked and hidden from other receivers. If the lock expires before the receiver deletes the message, the message becomes available to be pulled by other receivers again. Defaults to 30 seconds, with a maximum of 5 minutes.

Duplicate Detection:
Enabling duplicate detection configures your queue to keep a history of all messages sent to the queue for a configurable amount of time. During that interval, your queue will not accept any duplicate messages. Enabling this property guarantees exactly-once delivery over a user-defined span of time.

Dead Lettering:
Dead lettering moves messages that cannot be successfully delivered to any receiver, or that have expired, to a separate queue. Messages do not expire in the dead-letter queue, and it supports peek-lock delivery and all transactional operations.

Sessions:
Service Bus sessions allow ordered handling of unbounded sequences of related messages. With sessions enabled, a queue can guarantee first-in-first-out delivery of messages.

Partitioning:
Partitions a queue across multiple message brokers and message stores. Disconnects the overall throughput of a partitioned entity from any single message broker or messaging store. This property is not modifiable after a queue has been created.
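
Most of these features are set when the queue is created. As a quick sketch, here is how a queue with these knobs might be provisioned using the Az.ServiceBus PowerShell module; the resource names are placeholders and the parameter names should be verified against your module version:

$queueParams = @{
    ResourceGroupName                   = 'rg-demo'       # placeholder names
    Namespace                           = 'sb-demo-ns'
    Name                                = 'orders'
    DefaultMessageTimeToLive            = 'P14D'          # time to live: 14 days (ISO 8601 duration)
    LockDuration                        = 'PT1M'          # 1 minute; maximum is 5 minutes
    RequiresDuplicateDetection          = $true
    DuplicateDetectionHistoryTimeWindow = 'PT10M'         # duplicate-detection history window
    DeadLetteringOnMessageExpiration    = $true           # expired messages go to the dead-letter queue
    RequiresSession                     = $false          # set to $true for FIFO handling via sessions
    EnablePartitioning                  = $true           # cannot be changed after the queue is created
}
New-AzServiceBusQueue @queueParams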

References:

Time to live:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/message-expiration
Duplicate Detection:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/duplicate-detection
Dead-Letter-Queues:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dead-letter-queues
Sessions:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/message-sessions
Partitioning:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-partitioning


Design, Manage and Monitor – Microsoft Azure Storage Services

I have been meaning to get around to this article for a long time. This blog post is for developers or folks who want to read through the clutter and get a ten-thousand-foot view of what Azure Storage is, when to use which service, and the various API options available. I assume that you have some understanding of what Azure is and are familiar with some Azure Storage jargon as well.

What is Azure storage? (Understanding with analogy)

A perfect analogy would be: let’s assume that you are relocating from Texas to Virginia. You said your good-byes, wrapped up your boring office send-offs, took pictures, packed all your stuff, and bagged a few memories. You drove to Virginia and are negotiating with a storage service provider like U-Haul or EzStorage to store all your stuff. You might have boxes of all sizes – small, big, and some extra large; fragile items, wall hangings, suitcases, maybe an old piano or an antique car. Everybody has unique needs based on what they have to store, and these storage providers will allocate space based on what you want to store and how you want to store it. You might need a garage for your car, or you might just need a small shelf shared with others for storing small boxes. Of course, you pay for their service – based on the size, the kind of storage, the location of the storage, etc. Azure Storage Services is no different.

Now relate Azure Storage to that storage facility. Azure Storage Services also stores stuff for you; instead of boxes or wall hangings, you might have VHDs (Virtual Hard Disks), big and small images, high-definition videos, Office documents, etc. Whatever your needs are, Azure will accommodate them. Azure provides various storage services to host different file types, catering to various use cases.

Let us get started! The following sections provide use cases for understanding when to use the different storage options.

  1. Table Services

  2. Blob Services

  3. Queue Services

  4. File Share Services

Table Service
Typical use cases:

  • For basic NoSQL/key-value functionality and the ability to scale quickly.
    • NoSQL (Not only SQL) is revolutionary and is the underlying concept of Big Data.
    • What’s Big Data? In layman’s terms: when the concept of storage started, humans used to contribute the data; but with the evolution of millions of smart gadgets like smart TVs, smart refrigerators, and smart cameras, everything is interconnected and all of these products contribute terabytes if not petabytes of data.
      If your applications allocate resources to processing real-time data, you might lose some of the data being fed to these systems continuously; you need a way to quickly “somehow” save this data in some format and process it later. This “somehow” is NoSQL, which means you need a type of storage that can quickly take whatever is inputted and store it for later processing.
      When the demand for resources reaches its limit (while adding real-time data), the underlying systems automatically scale out, and scale up if needed. This is where Azure will help you.
  • If you need to input and quickly retrieve millions of entities (similar to rows in SQL), this is a good choice.
    • If you are not following a relational schema, it becomes a whole lot easier to input and extract data.
      (Note: You should have a RowKey and PartitionKey to fetch data fast, or else querying becomes slow as the entities grow; see the sketch below.)
  • If your data storage requirements exceed 150 GB and you are reluctant to manually shard or partition your data.
  • If you want to keep your costs low. Table storage is comparatively cheaper than the other storage services.
    • Note: Microsoft has been optimizing the Azure SQL Database service to reduce overall costs as well as ongoing administration costs; with a change like this, more and more customers are preferring SQL Database to the Table service.
  • If you require robust disaster recovery capabilities that span geographical locations.

Avoid:

  • If your data has complex relationships, in other words, data that requires an RDBMS (go with SQL Azure).
  • If you need support for secondary indexes, this is not an option.
  • If your code depends on ADO.NET or ODBC.
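
To make this concrete, here is a quick usage sketch, assuming the community AzTable module alongside Az.Storage; the account name and key are placeholders:

# Create a table and add one entity; PartitionKey + RowKey make point lookups fast.
Import-Module AzTable
$ctx   = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey '<key>'
New-AzStorageTable -Name 'devicereadings' -Context $ctx | Out-Null
$table = (Get-AzStorageTable -Name 'devicereadings' -Context $ctx).CloudTable
Add-AzTableRow -Table $table -PartitionKey 'sensor-42' -RowKey ([guid]::NewGuid().ToString()) `
    -Property @{ TempF = 72; RecordedAt = (Get-Date).ToString('o') }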

Blob Service (Block blob & Page Blob)
Typical use cases

  • If you have tons of static files like pictures, videos, PDFs, Office documents, etc., this is the default choice.
    • As part of modernization, more companies are migrating to the cloud; it could be websites, applications, or even whole infrastructures, with hundreds, thousands, or even millions of supporting files. The Azure Blob service can host these files for you.
    • Depending on the size of the file you are trying to store, you have two options (see the upload sketch below):
      • Block Blob
        • Each block blob can be up to 200 GB, and you can upload block blobs with a size of up to 64 MB in one operation.
      • Page Blob
        • Each page blob can be up to 1 TB in size and consists of a collection of 512-byte pages. You set the maximum size when creating a page blob, and then you can write or update specific pages.
  • This is the most popular of all the storage services; as a matter of fact, Azure uses it for its own storage needs, such as when you create a VM, and for storing supporting files/data used by other services on Azure.
  • When people refer to migrating virtual machines to the cloud, they generally prep the machine by running tools like Sysprep and then upload the VHDs to the Blob service.
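
A minimal sketch of pushing a static file into blob storage with the Az.Storage cmdlets (account, container, and file names are placeholders):

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey '<key>'
New-AzStorageContainer -Name 'docs' -Permission Off -Context $ctx | Out-Null
# Block blob is the default blob type and the right choice for ordinary files.
Set-AzStorageBlobContent -File '.\whitepaper.pdf' -Container 'docs' -Blob 'whitepaper.pdf' -Context $ctx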

Queue Service(Azure Queues & Service Bus queues)
Typical use cases

  • If you want loosely coupled applications that can interact asynchronously.
    • For instance, you have a system that checks inventory and notifies what needs to be ordered, and another system that does the ordering for you. What if the order-processing VMs or worker roles are offline? What happens if one of them is busy or overloaded with traffic? How do you recover from failure? Queue services help with these scenarios.
  • If you have large numbers of messages that need to be accessed from anywhere in the world via authenticated calls using HTTP or HTTPS.
  • Load leveling and balancing
    • If you have apps that experience huge bursts of traffic, you can level the load by categorizing processes that can wait (in the queue) until a free worker becomes available.
    • For load balancing, you can spin up additional worker roles to process messages and balance the load among the worker processes. This can also be automated.
  • Subscription
    • If you have one producer and multiple consumers, queues are an excellent choice. For instance, one system can generate messages (the producer) while multiple systems subscribe to and consume the messages (the consumers) and process them independently.
  • Scheduled processing
    • If you do not want to stress your applications during peak traffic hours, you can isolate tasks that can be processed later on a scheduled basis. This could be nightly or weekly: you add messages to the queue and process them based on the schedule.

Note: Azure Queues and Service Bus queues work differently; it is good to understand the differences between the two and make informed architectural decisions. Refer to the ‘References’ section for good articles.

Advice on multiple queues:
I am not against using multiple queues, but use them sparingly; the idea is that you can achieve a lot with a single queue (if architected well).
As a matter of fact, I have used multiple queues myself in the following cases:
1. Different priorities: You want different priorities for different messages. The last thing you want is a high-priority message stuck behind hundreds of low-priority messages. Create a high-priority queue for these (see the sketch below).
2. Large transaction volumes: If the number of message transactions (read, put, delete) exceeds 500 transactions per second, creating multiple queues will spread the transaction volume across storage partitions.
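
A hedged sketch of the producer side with Azure storage queues; the CloudQueueMessage type name below matches the SDK bundled with Az.Storage at the time of writing and may differ across versions:

$ctx   = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey '<key>'
$queue = New-AzStorageQueue -Name 'orders-highpriority' -Context $ctx    # a separate queue for high-priority work
$msg   = [Microsoft.Azure.Storage.Queue.CloudQueueMessage]::new('order-12345')
$queue.CloudQueue.AddMessageAsync($msg) | Out-Null                       # consumers de-queue and process independently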

File Storage
Typical use cases

  • Share data across on-premise and cloud servers
    • If you have a hybrid setup with applications that share data across on-premise and cloud environments.
  • Migrate file-share-based applications
    • You can migrate applications that rely on file shares to the cloud with no code changes.
  • Integrate modern applications
    • With more devices accessing the same information from around the globe, there is a dire need to share resources and build more file-sharing applications. The service also supports a REST API for building modern applications.
  • High availability
    • If all you need is a file share speaking the Server Message Block 3.0 (SMB) protocol, it is highly efficient to use the Azure File Share service, backed by Azure’s infrastructure, redundancy, and SLA.
  • Sharing files between services
    • If you have VMs or different services that need to share files, this is an excellent choice (see the sketch after this note).

Note: If your application relies heavily on file attributes and file processing, not just regular reads/writes, be aware that certain features are unsupported.
Click Here for the complete list.
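
Provisioning a share is a one-liner with Az.Storage; a sketch with placeholder names:

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey '<key>'
New-AzStorageShare -Name 'appshare' -Context $ctx
# Mount from a Windows box (requires outbound port 445):
# net use Z: \\mystorageacct.file.core.windows.net\appshare /u:AZURE\mystorageacct <storage-key>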

After these services are provisioned, you need a way to consume them. The following screen capture provides a quick view, or cheat sheet, of what the Storage API provides. Most of the operations are self-explanatory based on their naming. I will strive to write more articles on each of these and provide examples in C# leveraging the REST API.

Azure Storage Services Quick Reference

Conclusion:

I was confused about when to use which service, but after working hands-on and a lot of reading I was able to assimilate some of the information, though not everything. I made an honest attempt to provide a quick read and an overall view of what the Azure Storage services are, what to choose when, and when to avoid them, and I concluded by summarizing most of the operations you can perform via the Azure Storage API. I hope this article made some sense and helped you with something.

References:

[1] Azure Queues and Service Bus queues – compared and contrasted

[2] Azure Storage Queues VS Azure Service Bus Queues

 


What is Azure?

Explaining Azure in simple terms can be a one-minute conversation, but if you attempt to deep-dive into its various services, it can take months. No kidding, I am serious!

Microsoft provides over 200 services, and you would be spellbound at the rate at which new features get added on a daily basis.

This blog post does not get into the weeds of each of the services, but covers a few basic fundamentals that will help you understand the terminology and assimilate some of the technical jargon.

Before even starting on what Azure is, it is a good idea to talk a little about the Cloud, as Azure itself is Microsoft’s flavor of the Cloud. There are many other Cloud service providers, such as Amazon Web Services, Akamai, CenturyLink, CSC, Dimension Data, Fujitsu, Google, IBM (SoftLayer), Interoute, Joyent, Rackspace, Verizon, Virtustream, VMware, and the list goes on…

So what is the Cloud? Why do companies want to be on the Cloud? Why are there companies providing these services?

Technology plays a very key role in managing business operations and customer relations.

The following image demonstrates Azure replacing some or all of the on-premise services.

Azure replacing On-Premise Services

The shift towards the Cloud is not always about monetary reasons; it is also about ease of use and peace of mind. This is one of the million reasons why companies are leaning more towards Cloud computing and designing their own Cloud strategies.

The term Cloud has been ubiquitously ambiguous. There is no true definition of what qualifies a company as a Cloud service provider. For instance, one service provider may provide only half of the services provided by its competitors.

For instance, Rackspace does a very good job of providing IaaS services and can provide solid references from its clientele for those services. Overall, as a customer, you would assume that these service providers would at least provide IaaS/PaaS services.

On a lighter note, you cannot run a couple of servers out of your basement and call yourself a Cloud provider. In reality, you should be able to scale up or scale out dynamically to cater to demand, provide security, adhere to an SLA of 99.9% up-time, and much more. It is not an easy task. Cloud is a serious business!

Enough said, let’s get started on some of the terminology used within Azure, much of which is common to most Cloud service providers.

1. Public, Private and Hybrid Cloud

Public, Private and Hybrid Cloud Offerings

2. IaaS, PaaS, SaaS

The following is just a thirty-thousand-foot view; explaining it in detail would require a complete blog post by itself.

If you are interested in knowing more in depth, please follow this article.

As a matter of fact, with the Cloud maturing day by day, the differences are getting blurrier.

IaaS vs. PaaS vs. SaaS

3. Azure Services

Everything in Azure is termed a ‘Service’. It could be a VM, a Website, or a CDN (Content Delivery Network).

Azure services are categorized as shown below; to explore further, please follow this article.

Azure Services

4. Azure Portals

Microsoft provides easy-to-use web portals to leverage its services. There are usually two portals: the current production portal and the new preview portal. The intention of the new portal is to keep up with ever-changing user-experience demands and to be mobile friendly. They are both meant for the same purpose. The following are some of the key differences:

  • Not all features have been migrated into the new portal; some of them may be deprecated or merged with other features.
  • The new portal has RBAC (Role-Based Access Control).
    • The current production portal does not have this feature, which means that if you are logged in, you are an Admin. Scary!
    • With RBAC you can assign users roles for read-only, write, or contribute access, or even delegate the admin role.
  • A more modern, more mobile-friendly user interface.
  1. Production URL: https://manage.windowsazure.com/

Quick Peek: Windows Azure Production Portal

  2. Preview Portal: https://portal.azure.com/

Quick Peek: Azure Preview Portal

5. Subscriptions

Azure Subscriptions

6. Pricing Calculator

This is a very useful tool to price and configure Azure services for your scenarios. If you have a pre-paid subscription, your credit status will appear as an overlay on top of the window.

Credit Status

Following is the screen capture of pricing calculator, indicating how much it would cost to add ‘DocumentDB’ as a service.

Pricing Calculator

I hope you enjoyed reading this article and hope this was useful!!


Introduction to PowerShell for Azure – Creating a VM

<#
.SYNOPSIS
Execute basic Azure commands and create a VM

.DESCRIPTION
This script is meant for education/demonstration purposes only
and is not meant for production use
- The initial section of the script covers basic commands to get connected to Azure
- The rest of the script covers creating a VM with SQL Server running.

.NOTES
Author: Susheel Dakoju
Date Authored: 04/13/2015
#>
#-----------------------------------------------------------------------------
# 1. Connect to Azure: Option 1: Add-AzureAccount
#-----------------------------------------------------------------------------
#Connect
Add-AzureAccount
#Check subscription details
Get-AzureSubscription
#-----------------------------------------------------------------------------
# 1. Connect to Azure: Option 2: Get-AzurePublishSettingsFile
#-----------------------------------------------------------------------------
#Download the publish settings file
Get-AzurePublishSettingsFile
#Import the settings file
Import-AzurePublishSettingsFile "C:\Powershell\Azure\Visual Studio Ultimate with MSDN-4-14-2015-credentials.publishsettings"

#-----------------------------------------------------------------------------
# 2. Export all the existing images in the VM Gallery and
# copy the name of the image you want to create from
# Note: There are more elegant ways of doing this. The most basic form is demonstrated here. See below
<#
$vmimages = Get-AzureVMImage
$vmimages | where {$_.Label -like 'sql server 2014*'} | select Label, RecommendedVMSize, PublishedDate | Format-Table -AutoSize
$imgtouse = $vmimages | where {$_.Label -eq 'SQL Server 2014 RTM Enterprise on Windows Server 2012 R2' `
-and $_.PublishedDate -eq '6/9/2014 3:00:00 AM'} | select ImageName
#>
#-----------------------------------------------------------------------------
Get-AzureVMImage > C:\Powershell\Azure\images.txt

#-----------------------------------------------------------------------------
# 3. Create the new Azure VM
#-----------------------------------------------------------------------------
New-AzureVMConfig -Name 'sdcapsandbox1' -InstanceSize Basic_A3 -ImageName 'fb83b3509582419d99629ce476bcb5c8__SQL-Server-2014-RTM-12.0.2361.0-Enterprise-ENU-Win2012R2-cy14su05' |`
Add-AzureProvisioningConfig -Windows -AdminUsername 'sdcapsandboxuser1' -Password 'Administrator@1234' |`
New-AzureVM -Location 'East US' -ServiceName 'sdcapsandbox1'
#-----------------------------------------------------------------------------
# 4. Check whether the new VM was created
#-----------------------------------------------------------------------------
#Get-AzureVM

#------------------------------------------------------------------------------------
# Optional: If New-AzureVM errors out on the storage account, keep the following handy
# - Use one of the existing storage accounts or create a new one, then run Set-AzureSubscription
#------------------------------------------------------------------------------------
#Get-AzureStorageAccount
#Set-AzureSubscription -SubscriptionId 16c41f83-c83e-476a-8fdc-51b8ae69849f -CurrentStorageAccount mystorageaccount

Building your first cloud-hosted app on Office 365 – Using Napa

I was eavesdropping on one of those so-called ‘technical elevator conversations’ and I heard “NAPA”.

I did some Googling, sorry, some Binging as well, and started to assimilate some of the information available on MSDN and other blogs. I fell in love instantly! What’s fascinating about this is that you can develop and deploy a complete ‘cloud-hosted app’ from scratch via the browser and mere JavaScript. You would be surprised that it took less than ten minutes for the whole thing.

I thought I would keep it simple and started off creating a simple temperature conversion tool, which runs on plain JavaScript and a few lines of HTML. After all, the whole idea is to use Napa to create an app, so I quickly borrowed a few lines of code from W3Schools and used them in my app.

Note: There are many blogs and MSDN articles out there explaining Napa in great detail. I just made an attempt to keep it as simple as possible, to give you a glimpse of what Napa is all about and a first introduction to it.

Following is what I did:

  1. Create or use an existing SharePoint 2013 site on the Office 365 portal.
    Note: If you do not have an Office 365 account, you can easily activate one with your MSDN subscription. If you do not have an MSDN subscription, you may sign up for Office 365 Home and start exploring some of the features.
  2. Once you have your SharePoint 2013 site up and running, navigate to Site Contents and click ‘add an app’ as highlighted below.
  3. Go to the SharePoint Store by clicking the link as highlighted below.
  4. Search for ‘Napa’ and you should find the app ‘Napa Office 365 Development Tools’. Go ahead and install it.
  5. Once the install is finished you should find it in your ‘Site Contents’. You may click on it to start using ‘Napa’, or use the ‘Build an app’ option available on the home screen. Please see below for both of these options.
  6. Kick off the app creation by clicking ‘Add New Project’.
  7. You will be prompted with options to create different kinds of apps; choose ‘App for SharePoint’ and give your app a name.
  8. Once you complete the above step, you are now officially on ‘Napa’ and can start coding. As I mentioned in the beginning, I borrowed the following code from W3Schools, which helps with temperature conversions from Celsius to Fahrenheit and vice versa.
  9. That’s it, you are almost done. Click the Publish icon as highlighted in the screen capture. This should prepare the package, deploy it, and launch the app.
  10. This is how the ‘Temperature Converter’ app looks. Nothing fancy: two text boxes and a few lines of JavaScript.
  11. This app should now show up in your Site Contents, as highlighted.
  12. Congratulate yourself for building your first cloud-hosted app and get a brew.

Cloud Applications development – It’s NOT all about VMs

Cloud platforms have a plethora of options, and they come in many flavors. Be it Microsoft Azure, Rackspace, AWS, Oracle, or any cloud platform of your choice, each of these is unique and irreplaceable. But they all share common offerings, i.e., Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). IaaS is the most basic and widely adopted cloud offering. PaaS is gaining popularity and will likely be the default choice in the near future, but not at the time of writing this blog.

So if you are a cloud applications developer, it is key that you understand what IaaS and PaaS are and the things to bear in mind while developing applications for cloud platforms.

If you assumed that the Cloud, or Cloud Computing, is nothing but a mere collection of millions and millions of VMs/servers remotely hosted and maintained by someone else, and that deployment is all about copying and pasting your code onto those servers, I am afraid you are wrong!

Developing cloud applications is not mere copy and paste; there is much more to it.

This post will cover a few examples that should influence your thought process while developing applications for the cloud. Again, there are many, many more things to consider while developing apps for cloud platforms.

The following is an honest attempt to get you to adopt a different perspective while you design cloud apps.

  1. Automation of Deployments and Hardware Provisioning
    1. There is a high chance that you have not been concerned with hardware provisioning in traditional development. But with cloud application development, it is key that you automate all your hardware provisioning and deployments. You can leverage the Windows Azure object model, which helps automate a lot of things that are typical of cloud application development, like the creation of VMs, websites, storage accounts, etc.
    2. You may leverage C#, PowerShell scripts, or popular tools like Chef and Puppet for IT/deployment automation.
    3. Automation is one of the key design principles of cloud application development and is highly recommended by Microsoft. It will not only minimize errors but also make it easier to create and deploy items repeatedly.
    4. Example: Consider a scenario where you were asked to create developer VMs for 20 developers with Windows Server 2012 R2, SQL Server R2, IIS 8.0, SharePoint 2010, MS Office, and other supporting tools.
      1. What is the first thought you get? I am sure you are tempted to create 20 VMs on the Azure portal, RDP onto these machines, and install the software. Even if you spend only 2 to 3 hours per VM, it would take 60 hours, i.e., close to two business weeks.
      2. You might also think: I would create one instance, generalize it, and create the rest out of it, correct? Note that some software cannot be part of a generalized VM image and therefore cannot be part of the template.
      3. Solution: Use scripts or any kind of automation of your choice.
        Write a script to create and spin up these 20 VMs. This can easily be done with PowerShell scripts, often with a few lines of code, and could hardly take an hour or two (see the sketch after this list).
        For software that doesn’t support generalization or bundling with VM images/templates, you can read Bootstrapping a Virtual Machine with Windows Azure, by Michael Washam.
  2. Budgeting and Cost Consciousness
    1. As a traditional developer, cost is not something you would consider while developing. But when you are designing cloud apps, everything comes with a price. For instance, the design choice of storing your data in-memory vs. in a database can have a huge impact on pricing.
    2. Based on your application design choices, you may end up saving a ton, or you may pay the price for not treating cost as an important parameter.
    3. Cost calculation is as important as application design, development, and testing. It should be an integral part of design.
  3. Fluid/Adaptive design choices
    1. The beauty of cloud platforms is that you can start with minimal configuration and grow as needed. So your design should be flexible enough to adapt to changing hardware: it could be reduced CPU or memory allocations, or additional servers added to the NLB; it could be anything.
    2. Your design choices are critical for cloud development. For instance, for session management, you should prefer out-of-proc session management to in-proc while using the ‘Swap Deployment’ feature with Azure. What this means is that the virtual IPs are swapped between the staging and production environments of a service: if the service is currently running in the staging environment, it will be swapped to production, and if it is running in the production environment, it will be swapped to staging. So if you are leveraging in-proc session management, you would lose your state; it is ideal to choose out-of-proc session management. Please see the screen capture below highlighting the ‘Swap’ feature under instances.

      Swap Deployment with Microsoft Azure

  4. High availability, Performance and Fault Tolerance
    1. Most cloud providers have an SLA of 99.5% and above for availability. This is valid only if you provision at least two instances of every service. Developers should be aware of this.
      1. If you choose only one instance, your application may be temporarily unavailable during patching or reboots.
    2. Another important feature that every developer should understand is Affinity Groups. These allow you to group your Azure services to optimize performance: all services and VMs within an affinity group are located in the same region. For instance, you may associate your application and its supporting database servers with the same affinity group to avoid network latency and increase performance.
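
To make the automation point in item 1 concrete, here is a rough sketch of the 20-developer-VM scenario using the same classic Azure cmdlets shown in my earlier PowerShell post; the image name, instance size, and credentials are hypothetical placeholders:

# Spin up 20 identically configured developer VMs in a loop.
$img = '<image name from Get-AzureVMImage>'   # placeholder: pick a Windows Server 2012 R2 image
1..20 | ForEach-Object {
    $name = 'devvm{0:D2}' -f $_
    New-AzureVMConfig -Name $name -InstanceSize Basic_A3 -ImageName $img |
        Add-AzureProvisioningConfig -Windows -AdminUsername 'devadmin' -Password '<password>' |
        New-AzureVM -Location 'East US' -ServiceName $name
}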

Conclusion: It is key to understand that cloud-based application development is not the same as traditional development. Start by leaving your ego aside, and stop pretending that developing for cloud platforms is no different from traditional development. Stop thinking that developing for cloud platforms is all about logging on to a VM, copy-pasting, deploying, and configuring. As a matter of fact, I was under the same impression when I started picking up Azure. It is time to digest the notion that cloud application development is a lot different in terms of design, development, and cost AND IS NOT ALL ABOUT VMs.