
Migration Strategy for .NET Framework Applications to the Cloud

This blog post summarizes the various options available for migrating .NET Framework applications to Azure or AWS.

  • Approach 1: Deploy existing .NET apps as Linux containers only
  • Approach 2: Deploy existing .NET apps as Linux containers, EC2, or Azure VMs
  • Approach 3: Deploy existing .NET apps as Windows containers only
  • Approach 4: Deploy existing .NET apps as Native Cloud Services

Comparison of all approaches

Porting to .NET Core:
  • Linux containers only: Required
  • Linux containers and EC2 or Azure VMs: Required for Linux containers; optional for EC2 or Azure VM-hosted apps
  • Windows containers only: Not required
  • Native cloud services: Not required

End-to-end migration time:
  • Linux containers only: Very high
  • Linux containers and EC2 or Azure VMs: High
  • Windows containers only: Low
  • Native cloud services: Medium

Ease of migration:
  • Linux containers only: Very complex (applications need to be retargeted to .NET Core with a lot of code refactoring)
  • Linux containers and EC2 or Azure VMs: Complex (apps that cannot be containerized will be hosted on EC2 or Azure VMs)
  • Windows containers only: Easy (supports lift and shift with some code refactoring)
  • Native cloud services: Moderate (might require some code refactoring depending on the adopted service)


High-level migration steps (these apply to all the above approaches):

  • Step 1: Run a portability analysis using the following tools. A better approach is to use both options a and b so you get the best of both tools.
    a. Microsoft tools: .NET Upgrade Assistant and The .NET Portability Analyzer – .NET
    b. AWS tool: AWS Porting Assistant for .NET (Porting Assistant for .NET – Amazon Web Services)
  • Step 2: Identify functionality that needs to be refactored to work seamlessly in the cloud. The following are some of the items to consider.
    • Session state: Applications should be stateless so they can scale or be replaced at will. Session state should be persisted outside the app using services such as Redis Cache. This is especially critical with containers, which are dropped and recreated more often than not.
    • Logging: If the apps log to the local file system, they need to be updated to log to the native logging service in AWS or Azure.
    • LDAP queries: If the apps query LDAP stores such as Active Directory, they should be updated to use AWS Cognito or Azure AD.
    • Authentication: If the app relies on Windows Authentication or on-premises AD, additional changes to the application might be required.
    • File persistence: If the apps accept file uploads that need to be persisted, those files have to be stored outside the app.
    • Miscellaneous:
      • On-premises dependencies
      • Email functionality
      • Custom domains
      • Apps using the GAC (Global Assembly Cache)
  • Step 3: Fix the issues identified in Steps 1 and 2.
  • Step 4: Containerize the app: add supporting YAML configuration, make project changes, etc.
  • Step 5: Build, test locally, and deploy (see the sketch after this list).
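The following is a minimal sketch of the local build-and-test loop for the containerization and deployment steps, using the Docker CLI from PowerShell. The image name, registry, and port mapping are placeholders, and the push target could just as well be Amazon ECR instead of Azure Container Registry.

# Build the image from the folder that contains the Dockerfile ('mywebapp' is a placeholder name).
docker build -t mywebapp:latest .

# Run the container locally and browse to http://localhost:8080 to smoke-test it.
docker run -d -p 8080:80 --name mywebapp-test mywebapp:latest

# Tag and push to a container registry (an Azure Container Registry name is assumed here).
docker tag mywebapp:latest myregistry.azurecr.io/mywebapp:latest
docker push myregistry.azurecr.io/mywebapp:latest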

Approach 1: Deploy existing .NET apps as Linux containers only

.NET Framework applications must run on Windows, so porting them to Linux containers requires retargeting them to .NET Core. This makes this approach the most time-consuming and complex of all the approaches.

Note: As of this writing, .NET 5 is the latest stable version that supports Linux containers. .NET 5.0 is the next major release of .NET Core following .NET Core 3.1.

Microsoft released .NET 5.0 instead of .NET Core 4.0 for two reasons:

  1. It skipped version numbers 4.x to avoid confusion with .NET Framework 4.x.
  2. It dropped "Core" from the name to emphasize that this is the main implementation of .NET going forward. .NET 5.0 supports more types of apps and more platforms than .NET Core or .NET Framework. [1]

Note that .NET Framework 4.8, released in April 2019, is the last major version of the .NET Framework; "all future investment will be in .NET Core." [2]

.NET 6 is still in release candidate (RC) status. RC releases provide early access to features, are feature complete, and are supported for production use since they carry a go-live license.

Please see the following white paper on what it takes to convert a simple .NET Framework application to .NET Core so it can run on Linux containers.

Modernize .NET Applications with Linux Containers

Pros:

  • Upgrading to .NET 5 makes future upgrades easier.

Cons:

  • Requires a lot of changes to the project structure to retarget to .NET 5.
  • Porting Windows apps and .NET libraries to .NET 5 is relatively easy compared to ASP.NET web applications.
  • ASP.NET web applications require quite a lot of code refactoring and changes to the project structure and to how configuration is read and written; source code changes may also be required for unresolved incompatibilities.
  • If there is no compatible NuGet package for .NET 5, the application has to wait until a compatible package is released or find alternatives, which can be tedious and time-consuming.
  • Any changes to the application require a deep understanding of the app and how it is intended to function; this can be challenging for a team that is only responsible for migrations.
  • Applications well covered by unit tests are safer to port, as there is a better chance of finding differences in behavior that may cause unexpected results.
  • Some features are not supported in .NET Core; if the application uses any of them, it may have to be rewritten. Please see the complete list of .NET Framework technologies unavailable on .NET Core and .NET 5+

The following is just a sneak peek of the .NET Portability Analyzer in action; Microsoft has extensive documentation for the tool. Please see more at The .NET Portability Analyzer – .NET

Run the .NET Portability Analyzer.

A sample output of the Analyzer can be found below.

The tool is available as a Visual Studio extension; the following screen capture shows how to target .NET 5.0.

After the extension has been installed, the Portability analysis can be performed as shown below.
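If you prefer the command line, both analysis tools can also be run outside Visual Studio. The following is only a rough sketch: ApiPort is the console version of the Portability Analyzer, the project and assembly paths are placeholders, and the exact commands and flags may differ between tool versions.

# Portability Analyzer console tool (ApiPort): analyze an assembly and emit an HTML report.
.\ApiPort.exe analyze -f .\bin\Release\MyLegacyApp.dll -r HTML

# .NET Upgrade Assistant installed as a global dotnet tool, then run against a project.
dotnet tool install -g upgrade-assistant
upgrade-assistant analyze .\MyLegacyApp.csproj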

Approach 2: Deploy existing .NET apps as Linux containers, EC2, or Azure VMs

This is similar to Approach 1 but accommodates all use cases. If there are any issues with porting to .NET Core, or if an app cannot be containerized, those apps can be hosted on AWS EC2 or Azure VM instances.

Approach 3: Deploy existing .NET apps as Windows containers only

A .NET Framework application must run on Windows, period. If existing .NET Framework applications have to be containerized and you can't or don't want to invest in migrating to .NET Core or later ("if it works properly, don't migrate it"), the only choice you have for containers is Windows containers. [3]

Pros:

  • Aligns well with the rehost migration strategy; applications can be ported to Windows containers easily with very little turnaround time.
  • If there is no need for full-fledged container orchestration, Azure has low-cost managed services that support both Windows and Linux-based containers.
  • Applications benefit from Windows containers because the main dependency of the .NET Framework is Windows.
  • Applications with secondary dependencies, such as IIS and System.Web in traditional ASP.NET, also benefit from Windows containers.

Cons:

  • Each application warrants changes to the project structure to support containerization.
  • Architecture and maintenance can get complex depending on the choice of container orchestration between managed vs. self-managed.
  • Code may need to be refactored if the functionality is compromised due to hosting it within a container.
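As a quick illustration of the lift-and-shift flavor of this approach, the following is a sketch of pulling the public .NET Framework 4.8 ASP.NET base image and smoke-testing an app image built on top of it. It assumes Docker Desktop is switched to Windows containers; the app image name and port mapping are placeholders.

# Base image for traditional ASP.NET (System.Web) apps running on .NET Framework 4.8 and IIS.
docker pull mcr.microsoft.com/dotnet/framework/aspnet:4.8

# After building your application image FROM that base, run it and browse to http://localhost:8080.
docker run -d -p 8080:80 --name legacyapp-test mylegacyapp:latest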

Approach 4: Deploy existing .NET apps as Native Cloud Services

Microsoft introduced the Cloud Adoption Framework, which recommends how to migrate various applications to Azure. A good strategy should include multiple migration/design choices catering to each application's needs with the least amount of migration time and modification to the applications.

Please see the flowcharts below for a high-level understanding of Microsoft's recommendations for migrating applications and their supporting data sources to Azure.

Flowchart to select a candidate compute service.

Read more at Choosing an Azure compute service – Azure Architecture Center

Flowchart to select a candidate data service.

Read more at Review your data options – Cloud Adoption Framework

Note: This section covers Azure services only; a similar flowchart can be devised for AWS as well. It's important to establish a clear plan and scope upfront for all the applications so the team does not have to come back to the drawing board for each application.

Pros:

  • More flexible because there are more services/design choices, which eventually reduces overall migration cost and time.
  • Accommodates all the acceptable migration strategies
    • Rehost – Lift and shift as-is with no changes to applications.
    • Refactor/Repackage – With a few code changes and refactoring
    • Rearchitect
    • Rebuild

References:

  1. What’s new in .NET 5
  2. Microsoft .NET Framework – Microsoft Lifecycle
  3. Deploy existing .NET apps as Windows containers
  4. NuGetPackageExplorer/NuGetPackageExplorer: Create, update and deploy Nuget Packages with a GUI


Error: An error occurred while receiving the HTTP response

We have an Azure App Service that communicates with web services hosted on-premises via a site-to-site VPN. We are using a hub-and-spoke model for the site-to-site VPN setup, and this web app integrates with one of the subnets that is part of the spoke VNet. We got the following exception while performing end-to-end testing. Hope this blog helps you save some research time. Good luck!

Exception:

An error occurred while receiving the HTTP response to https://yourwebserviceURL. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). System.Net.WebException: The underlying connection was closed: An unexpected error occurred on a receive.

Inner exception:

System.IO.IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.FixedSizeReader.ReadPacket(Byte[] buffer, Int32 offset, Int32 count)
at System.Net.Security._SslStream.StartFrameHeader(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
at System.Net.Security._SslStream.StartReading(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
at System.Net.Security._SslStream.ProcessRead(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
at System.Net.TlsStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.PooledStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.Connection.SyncRead(HttpWebRequest request, Boolean userRetrievedStream, Boolean probeRead)

Solution:

You might find many solutions online suggesting that you update the web.config, edit the binding, etc.; none of them worked for me. By default, an Azure app routes only RFC 1918 traffic into your VNet. If you want to route all of your outbound traffic into your VNet, add the WEBSITE_VNET_ROUTE_ALL setting to your app:

WEBSITE_VNET_ROUTE_ALL = 1
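One way to add the setting (a minimal sketch using the Az PowerShell module; the resource group and app names are placeholders) is to copy the existing app settings and write them back with the new value, since Set-AzWebApp replaces the whole collection:

# Read the current app settings ('my-rg' and 'my-appservice' are placeholders).
$app = Get-AzWebApp -ResourceGroupName 'my-rg' -Name 'my-appservice'
$settings = @{}
foreach ($s in $app.SiteConfig.AppSettings) { $settings[$s.Name] = $s.Value }

# Add the routing flag and write the settings back (this replaces the existing collection).
$settings['WEBSITE_VNET_ROUTE_ALL'] = '1'
Set-AzWebApp -ResourceGroupName 'my-rg' -Name 'my-appservice' -AppSettings $settings

The same setting can also be added in the portal under Configuration > Application settings.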

References:

https://docs.microsoft.com/en-us/azure/app-service/web-sites-integrate-with-vnet


Error: AzResourceGroupDeployment failed!

I was trying to deploy multiple resources using an ARM template deployment and was interrupted by the following error.

New-AzResourceGroupDeployment : 11:59:25 PM – Error: Code=InvalidTemplateDeployment; Message=The template deployment ‘AzureFunctionsVPN’ is not valid according to the validation procedure. The tracking id is ’16b65e77-8c46-462f-8f4c-a909492f7229′. See inner errors for details.

It is very common to see such errors while working with ARM template deployments. I found the following cmdlet extremely helpful; without it, troubleshooting would have been a herculean task.

Get-AzLog -CorrelationId 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' -DetailedOutput

If you are still using the old AzureRM cmdlets, use the following:

Get-AzureRMLog -CorrelationId xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -DetailedOutput

The above command generates a lengthy JSON output; pay attention to Properties > Status Message > Message.

Note: Please wait at least a minute before you run the above command; it usually takes that long for the log to show up, otherwise you might sometimes get an empty response.

You can also get the logs based on a date and time range. For more information please refer to the following link.

https://docs.microsoft.com/en-us/powershell/module/az.monitor/get-azlog?view=azps-3.8.0
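For example, the following sketch pulls only the last couple of hours of activity for a resource group (the resource group name is a placeholder):

# Activity log entries for 'my-rg' over the last two hours, with full details.
Get-AzLog -ResourceGroupName 'my-rg' -StartTime (Get-Date).AddHours(-2) -EndTime (Get-Date) -DetailedOutput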


Error: No match was found for the specified search criteria for the provider ‘NuGet’

I received the following error while installing Azure PowerShell Modules.

Install-Module -Name Az -AllowClobber

PackageManagement\Install-PackageProvider : No match was found
for the specified search criteria for the provider 'NuGet'.
The package provider requires 'PackageManagement' and 'Provider'
tags. Please check if the specified package has the tags.

Issue:

The issue, as I understand it, is that PowerShell uses TLS 1.0 by default for web requests, which causes a problem in this case. So when the Install-Module command tries to install the missing package provider as a prerequisite, and then the Az modules themselves, it fails. See Step 3 to change the default to TLS 1.2.

Solution:

Step 1: Check the PowerShell version installed and update it if required.

$PSVersionTable.PSVersion

Note: Azure PowerShell works with PowerShell 5.1 or higher on Windows, or PowerShell Core 6.x and later on all platforms. You should install the latest version of PowerShell Core available for your operating system. Azure PowerShell has no additional requirements when running on PowerShell Core.

To use Azure PowerShell in PowerShell 5.1 on Windows:

  1. Update to Windows PowerShell 5.1 if needed. If you’re on Windows 10, you already have PowerShell 5.1 installed.
  2. Install .NET Framework 4.7.2 or later.
  3. Make sure you have the latest version of PowerShellGet. Run Update-Module PowerShellGet -Force.

Step 2: Check if the NuGet package provider is installed

Get-PackageProvider -ListAvailable

The following is what I had initially; notice the NuGet package provider is missing.

Step 3: Run the following command

The following will force PowerShell to use TLS 1.2:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
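If you want this to persist across sessions instead of setting it each time, one option (just a sketch) is to append the same line to your PowerShell profile:

# Create the profile file if it does not exist yet, then append the TLS 1.2 line to it.
if (-not (Test-Path $PROFILE)) { New-Item -ItemType File -Path $PROFILE -Force | Out-Null }
Add-Content -Path $PROFILE -Value '[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12'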

Step 4: Re-Run Install-Module command

Install-Module -Name Az -AllowClobber

Note: The above command should have installed the missing package provider as well, and the output should look like the following (see highlighted).

References:

https://docs.microsoft.com/en-us/powershell/azure/install-az-ps?view=azps-3.8.0


Azure Service Bus Feature Summary

This post acts as a quick reference for all the popular features that differentiate Service Bus from Storage Queues.

This blog post assumes that you have some fundamental understanding of the messaging options within Azure. If you are just getting started and need a more detailed understanding, please refer to the following articles.

Microsoft Service Bus

Messaging options in the Microsoft Azure Platform

Service Bus feature summary:

Note: All the features below apply to both queues and topics.

Time to live:
Message time to live determines how long a message stays in the queue before it expires and is removed or dead-lettered. The queue-level default is used for all messages that do not specify a time to live themselves; when sending a message, it is possible to specify a different time to live for just that message.

Lock Duration:
Sets the amount of time that a message is locked and hidden from other receivers. After the lock expires, a message pulled by one receiver becomes available to other receivers again. The default is 30 seconds, with a maximum of 5 minutes.

Duplicate Detection:
Enabling duplicate detection configures your queue to keep a history of all messages sent to the queue for a configurable amount of time. During that interval, your queue will not accept any duplicate messages. Enabling this property guarantees exactly-once delivery over a user-defined span of time

Dead Lettering:
Dead-lettering moves messages that cannot be successfully delivered to any receiver, or that have expired, to a separate queue. Messages do not expire in the dead-letter queue, and it supports peek-lock delivery and all transactional operations.

Sessions:
Service Bus sessions allow ordered handling of unbounded sequences of related messages. With sessions enabled, a queue can guarantee first-in, first-out (FIFO) delivery of messages.

Partitioning:
Partitions a queue across multiple message brokers and message stores. Disconnects the overall throughput of a partitioned entity from any single message broker or messaging store. This property is not modifiable after a queue has been created.
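To tie these features together, the following is a sketch of creating a queue with all of them set using the Az.ServiceBus PowerShell module. The resource names are placeholders, and parameter names and duration formats can vary between module versions, so check Get-Help New-AzServiceBusQueue for the version you have installed.

# Splat the queue settings so each feature can be commented individually.
$queueParams = @{
    ResourceGroupName                   = 'my-rg'              # placeholder
    NamespaceName                       = 'my-sb-namespace'    # placeholder
    Name                                = 'orders'             # placeholder
    DefaultMessageTimeToLive            = '14.00:00:00'        # Time to live: expire after 14 days
    LockDuration                        = '00:05:00'           # Lock duration: the 5-minute maximum
    RequiresDuplicateDetection          = $true                # Duplicate detection
    DuplicateDetectionHistoryTimeWindow = '00:10:00'           # ...over a 10-minute history window
    DeadLetteringOnMessageExpiration    = $true                # Dead-letter expired messages
    RequiresSession                     = $true                # Sessions for FIFO handling
    EnablePartitioning                  = $false               # Cannot be changed after creation
}
New-AzServiceBusQueue @queueParams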

References:

Time to live:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/message-expiration
Duplicate Detection:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/duplicate-detection
Dead-Letter-Queues:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dead-letter-queues
Sessions:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/message-sessions
Partitioning:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-partitioning


SP Error: ‘New Document’ requires a Microsoft SharePoint Foundation-compatible application and web browser

We encountered the following error whenever a user tried to create a new document from any SharePoint document library using Internet Explorer (IE). It works fine in all other browsers.

Note: We are using on-premises SharePoint 2013 and Office 2010 Professional Plus.

“New Document requires a Microsoft SharePoint Foundation-Compatible application and web browser. To add a document to this document library, click the ‘Upload Document’ button”

IE pop up for new word documents


Based on our observation, it is due to missing add-ons that get added to IE along with the Office package.
I believe some Office updates removed these, or the Office installation got corrupted due to patch updates.

SharePoint Add-ons


Solution: Repair your office installation.

  1. From the Start menu, click Control Panel, and then click Programs.
  2. On the Control Panel, click Programs and Features.
  3. Scroll down to the list of installed programs, select Microsoft Office, and then click ‘Change’.
  4. In the Change dialog, click Repair and then click Continue.

Conclusion:

Hope this post helped to resolve your issue. If it did not, all the best in troubleshooting the root cause and please leave a comment on how you resolved it. Happy coding!


Some files can harm your computer in SharePoint 2013 (Internet Explorer Only!)

We had the annoying pop-up below showing up for some users while opening documents within SharePoint 2013. Surprisingly, this issue DOES NOT occur while accessing the same document from a document library. But when opening the document from a SharePoint page, or from a link within a Content Search or Content Query web part, the message below pops up.

Some Files Can Harm Your Computer

The following are some alternative solutions listed in various forums/blogs that might resolve your issue; they did not work for me. I have listed the actual solution at the very end.

Solution 1: Add URL to trusted sites.

Add the URL to the trusted sites in Internet Explorer. If this works for you, work with your company to push it to all users as a group policy. Unfortunately, this did not work for me; I already had the site URL in the trusted sites.

Solution 2: Disable SharePoint OpenDocument Class Add-on for Internet Explorer

Disable the SharePoint OpenDocument Class add-on for Internet Explorer. This works 100% of the time, but we can't expect every user to do it, so it would have to be pushed via a group policy. Also, we did not want to deal with the consequences of disabling this add-on. I do not want to go into the details of the add-on for now; I will save it as a topic for another blog.

SharePoint OpenDocuments Class

Solution 3: Follow the blog

Solution 4: Add script to suppress calls to Add-on (Worked For me!!)

The following screen capture explains it all. The issue does not occur within a document library because the 'onmousedown' and 'click' events are NOT calling the add-on; see below.

But for the link within the Content Query results on the page, the 'onmousedown' and 'click' events are calling the add-on in IE (Internet Explorer).

SharePoint OpenDocuments Class Script

 

Script:

As a solution, we added a script to the master page that identifies anchor tags pointing to PDF files and strips the add-on calls from the two attributes, as shown below.


<script type="text/javascript">
    // Strings to strip out of the anchor attributes (the SharePoint OpenDocuments calls)
    var replaceStrings = ["SharePoint.OpenDocuments.3", "SharePoint.OpenDocuments"];
    // Attributes that invoke the add-on
    var anchorAttributes = ["onclick", "onmousedown"];
    $(document).ready(function () {
        // Find every anchor that points to a PDF file
        $('a[href*="pdf"]').each(function () {
            for (var i = 0; i < anchorAttributes.length; i++) {
                for (var j = 0; j < replaceStrings.length; j++) {
                    var value = $(this).attr(anchorAttributes[i]);
                    if (value) {
                        // Remove the add-on call from the attribute value
                        $(this).attr(anchorAttributes[i], value.replace(replaceStrings[j], ''));
                    }
                }
                console.log($(this).attr(anchorAttributes[i]));
            }
        });
    });
</script>


Some SharePoint users always show up as DOMAIN\username

This is a common issue that most SharePoint admins or developers come across: some users do not resolve to First Name Last Name but instead show up as 'domain\username'.

For instance, a user resolves as microsoft\sdakoju instead of Susheel Dakoju within the SharePoint address book.

The following two steps seem to have worked for me. I am just jotting them down; they might help someone out there facing the same issue.

  1. Run Set-SPUser PowerShell Command

    1. The following specifies that user information will be synchronized from the user directory store (see the sketch after this list for running it across all site collections in a web application).
      Set-SPUser -Identity 'i:0#.w|microsoft\sdakoju' -Web https://yourdomain/ -SyncFromAD
      
  2. Delete the user from the UIL (User Information List) – (Optional)

    1. The User Information List (“/_catalogs/users/simple.aspx” or “_catalogs/users/detail.aspx”) is a hidden list in each site collection that is only visible and accessible to Site Collection Administrators. The User Information List stores metadata information about a user.
    2. It is totally safe to delete the user from this list, the user will get added or recreated when the user visits the sites.
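If the same stale account shows up in many site collections, the Set-SPUser call from step 1 can be wrapped in a loop. The following is a minimal sketch for on-premises SharePoint 2013, run from the SharePoint Management Shell; the login name and web application URL are placeholders.

# Sync the cached user information from AD in every site collection of a web application.
$login = 'i:0#.w|microsoft\sdakoju'
Get-SPSite -WebApplication 'https://yourdomain/' -Limit All | ForEach-Object {
    $user = Get-SPUser -Web $_.RootWeb -Identity $login -ErrorAction SilentlyContinue
    if ($user) {
        Set-SPUser -Identity $user -Web $_.RootWeb.Url -SyncFromAD
    }
    $_.Dispose()
}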

Regarding how this might have happened in the first place, I did not perform any root cause analysis due to the pressing timeline to resolve the issue.

There could be multiple reasons; the following are some possibilities:

  • Username or some user information may have been edited in Active Directory.
    For instance, some users move from temp to full time so the username might have changed to the one without temp.

    • sdakoju-temp to sdakoju or
    • sdakoju-domain1 to sdakoju-domain etc.
  • There could be an error or issue while the SharePoint profile Sync is running.
    • Check the error log for more details.
  • Core user profile services and other supporting services might not have recovered due to unplanned reboot or patching.

Happy TroubleShooting!!


Clone user permissions on SharePoint Online – Office 365

I had a request from one of my clients to clone the permissions of one user to another for a SharePoint Online site.

I created a simple PowerShell script that automates this process. It is a very common scenario: someone leaves the organization and somebody else takes over that role, or you want a backup for an employee and want the backup person to have the same permissions.

It's a simple task, and I hope the following script comes in handy for you. Happy scripting! You can also download the script from SPO_CloneUserPermissions.ps1


<#
.SYNOPSIS
Clone user permissions of one user to another in SharePoint Online

.AUTHOR
Susheel Dakoju

.DATE
04/21/2017

.DESCRIPTION
1. Fetch all the groups that the user is part of
2. Add ActualUser to the same groups

.Note
- This script requires admin privileges on the machine that is being executed on!
-
#>

#SharePoint online Admin site URL
$SPOAdminURL = "https://yoursite-admin.sharepoint.com/"
$username = "username@onmicrosoft.com"
$password = "@@@@@@@@"

#Url of the SharePoint Online Site
$SPOSiteURL = 'https://yoursite.sharepoint.com/sites/test'

#User used as reference
$ReferenceUser = 'sdakoju@onmicrosoft.com'
#The actual user that needs to be added to Groups
$ActualUser = 'rkevin@onmicrosoft.com'

#Create a credential object
$cred = New-Object -TypeName System.Management.Automation.PSCredential -argumentlist $userName, $(convertto-securestring $Password -asplaintext -force)

#Connect to SharePoint Online using the credentials
Connect-SPOService -Url $SPOAdminURL -Credential $cred

#Get the SharePoint Online Site Object
$site = Get-SPOSite $SPOSiteURL

#Get the user object of the reference user
$user = Get-SPOUser -Site $site -LoginName $ReferenceUser

#Loop through Groups and add the actual user
$user.Groups | ForEach-Object {

    #Fetch the group object that the reference user is part of
    $group = Get-SPOSiteGroup -Site $site -Group $_

    #Add 'ActualUser' to the same group that the reference user is part of
    Add-SPOUser -Site $SPOSiteURL -LoginName $ActualUser -Group $group.LoginName

}

Note:

– It is assumed that you have the SharePoint Online module installed on the machine you are running this script on. If you do not, please follow this link to download the SharePoint Online Management Shell.
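If your machine has PowerShellGet, the module can usually also be installed straight from the PowerShell Gallery instead of the MSI download (the module name below is the one published by Microsoft at the time of writing):

# Install the SharePoint Online Management Shell module for the current user.
Install-Module -Name Microsoft.Online.SharePoint.PowerShell -Scope CurrentUser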

– I ran into the following errors while running the script; I am jotting them down here as they may be helpful for you.

Error 1:

  • Identity Client Runtime Library (IDCRL) could not look up the realm information for a federated sign-in. Or
  • The partner returned a bad sign-in name or password error. For more information, see Federation Error-handling Scenarios. Or
  • The 'username' argument is invalid.

Solution:

  • I was using the wrong username. Please make sure you have the right admin username and password for the SharePoint Online site you are running against.

Error 2:

  • The site https://yoursite.sharepoint.com/sites/test/ is not properly formed

Solution:

  • There is a trailing slash at the end of the URL; it should be 'https://yoursite.sharepoint.com/sites/test' and NOT 'https://yoursite.sharepoint.com/sites/test/'

An error occurred accessing your Microsoft SharePoint Foundation site files

We encountered the following exception while trying to open a SharePoint 2013 site using SharePoint Designer:

“An error occurred accessing your Microsoft SharePoint Foundation site files. Authors- if authoring against a web server, please contact the Webmaster for this server’s Web site. WebMasters – please see the server’s application event log for more details.”

SharePoint Designer Error


After exhaustive troubleshooting, we discovered that the New Relic configuration was causing the issue. This might or might not be the case for you!

Step 1: You could first try to disable the New Relic agent altogether and see whether that is causing the issue.

The following screen capture shows the path to navigate to and the XML element to change to disable the .NET agent for New Relic: set agentEnabled="false".

New Relic Agent Enabled


Once you have confirmed that New Relic is causing the issue, try the following option.

Step 2: Disable Browser Auto Injection with the agent still enabled.

The following screen capture indicates the changes needed to disable Browser Auto Injection with the agent STILL ENABLED.

Browser Auto Injection disabled

Browser Auto Injection disabled

Browser Auto Injection automatically injects a piece of JavaScript code into the <head> of the page that tracks front-end page views and other end-user metrics. Here is the New Relic documentation on this: https://docs.newrelic.com/docs/browser/new-relic-browser/additional-standard-features/page-views-understanding-your-sites-popularity

The impact of disabling Auto Injection is that you will no longer receive these end-user metrics unless you use the copy/paste method to manually add the JavaScript code, described here: https://docs.newrelic.com/docs/browser/new-relic-browser/installation-configuration/adding-apps-new-relic-browser#copy-paste-app

There are a few options for adding the JavaScript in SharePoint; you could:

  • Create a new master page and add the JavaScript to the <head> there. https://msdn.microsoft.com/en-us/library/dn205273.aspx
  • Use the AdditionalPageHead delegate control in your SharePoint solution. http://www.fivenumber.com/understanding-sharepoint-delegate-control/
  • Using Designer, add the script as you would in any other HTML editor. However, we have witnessed odd behavior where Designer places the <head> tags inside the <body> tags, which has caused copy/paste instrumentation not to work. The solution here is to add the script right below the DOCTYPE declaration, outside of any actual tags. The script will still be rendered inside the <head> tags on actual deployment, though.

Conclusion:

The idea behind sharing this information is to encourage you to look at other dependencies, such as network configuration or third-party modules or agents, that might be causing the issue. In our case it was New Relic; it could be something else for you.

Disclaimer: New Relic is one of THE BEST monitoring tools available on the market, undoubtedly! It is just that you need to know how to tweak its configuration to your needs.