SharePoint Server – Renewing SSL certificates quickly

A week ago, my wildcard SSL certificate on GoDaddy expired. The renewal was purchased automatically, but I still had to validate my domain and download a new CER file to complete the pending certificate request in IIS.

My old post from a few years ago has some good info on certs, the file types, etc. https://eschrader.com/2014/09/23/sharepoint-2013-iis7-nlb-ssl-certificates-and-godaddy/

This is a quick guide.

The only issue I have with this quick renewal is that I could not export the certificate as a PFX, but I was able to get it installed on the server in IIS by completing a CSR.

Here are the steps:

GoDaddy automatically renews SSL certificate

GoDaddy has renewed your SSL certificate, but you have to verify your domain using a TXT record they give you (@ is the host field).

Once you verify, you can download the certificate. Note that this is a CER file, which is used to complete the pending certificate request in IIS.

Download the certificate for IIS

Copy and extract the zip to the server

I chose to delete my old certificates from my computer's Personal certificate store.

Once removed, I go into IIS and open Server Certificates under the machine:

Once in Server Certificates in IIS, click on Complete Certificate Request:

Change the file type to *.* (All files) and find your CER file you copied over:

Enter your certificate's friendly name (mine is a wildcard, so I use *.mydomain.com):
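As a side note, the same completion step can be done from an elevated prompt with certreq instead of the IIS GUI; a minimal sketch, assuming the CER was copied to C:\certs (a hypothetical path):

[code language=”powershell”]# Accept the issued certificate into the machine store, completing the pending request
# created by the original CSR (run elevated on the same server that generated the CSR)
certreq -accept C:\certs\mydomain.cer[/code]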

Next, go to your SharePoint IIS web apps that use this host header (there could be more than one), edit the bindings, and select the new certificate. If you see multiple certificates listed, that is why I deleted the old ones in the step above. If you get an error saying the change will leave behind an old certificate of the same name, double-check the other web applications in IIS to make sure they are set correctly. Updating one should update them all, but I always check each site in IIS.
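If you have many web applications, the binding swap can also be scripted; here is a minimal sketch using the WebAdministration module, where the site name "SharePoint - 443" and the friendly name are placeholders to adjust for your environment:

[code language=”powershell”]Import-Module WebAdministration

# Find the new certificate by friendly name (assumes one match; we deleted the old certs above)
$cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.FriendlyName -eq "*.mydomain.com" }

# Re-bind HTTPS on the IIS site to the new certificate
$binding = Get-WebBinding -Name "SharePoint - 443" -Protocol "https"
$binding.AddSslCertificate($cert.Thumbprint, "My")[/code]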

That's it! The certificate is updated.

The bad thing is I have to repeat the IIS complete-CSR steps on each machine. I would rather export the first one and import PFX certificate files to my other machines, but hey, this is how I got it to work.
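For reference, if the private key had been marked exportable, the export/import route would look roughly like this (a sketch using the PKI module cmdlets on Windows Server 2012+, with a hypothetical share path):

[code language=”powershell”]$pwd = Read-Host "PFX password" -AsSecureString

# On the first server: export the completed certificate plus private key
# (only works if the key was marked exportable when the CSR was created)
$cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.FriendlyName -eq "*.mydomain.com" }
$cert | Export-PfxCertificate -FilePath \\fileshare\certs\wildcard.pfx -Password $pwd

# On each additional server: import the PFX into the Personal store
Import-PfxCertificate -FilePath \\fileshare\certs\wildcard.pfx -CertStoreLocation Cert:\LocalMachine\My -Password $pwd[/code]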

Leave any comments below, thanks!

SharePoint Online global navigation across site collections, with highlighting and security trimming

One common request when working with SharePoint sites is having a consistent navigation across multiple site collections. If you are using a Publishing Portal site template, you can use the Managed Navigation for your Global Navigation (or top navigation). This also supports drop downs. I did a quick test and it appears to support highlighting of the current element, which is nice considering the URLs are hard coded rather than dynamically added.

As for security, MS indicates the term store navigation supports security trimming as follows: “Note that if users don’t have access to the physical .aspx page (read permissions at least), the link won’t appear in menus even if these options are checked. By this way you can also control links displayed to users according to permissions. It follows the same behavior as the default SharePoint navigation menus.”

The drawback: you have to create a term set in each site collection's Managed Navigation and pin EACH of your parent term navigation items (but it includes child terms, at least). It's a lot of work, but the only way without custom code. Other options I have seen discussed are using Search web parts, CSOM, etc., and possibly 3rd-party solutions. This does not work on new modern team sites (at least at the time of writing): I get "access denied" when trying to enable Managed Navigation, even after turning on Publishing for the site/web.

Managed Navigation:

Under the target site collection(s), configure your navigation from Site Settings

Ensure Managed Navigation is checked under Global Navigation:

Uncheck "Add new pages to navigation automatically" and "Create friendly URLs for new pages automatically":

Next, rather than creating a new term set from the site collection, do it in SharePoint Admin Center.

Go to the Admin tile:

Go to SharePoint under Admin Centers:

Select Term Store on left navigation:

Add your organizational (tenant) account to Term Store Administrators, save, and reload the page.

Then, select the root term store for your O365 tenant, and select New Group:

Type in the orange input box, call it Navigation or something unique:

Select your new term group, and add yourself as a Group Manager and Contributor:

Create new Term Set under the group:

I just called mine Sites, but this is the actual element you will be selecting for your navigation. All child terms will appear in the actual navigation menu.

Then, select the Sites term set and add yourself as Owner, Contact, and Stakeholder, and SAVE:

Go to the Intended Use tab at the top, and enable “Use this Term Set for Site Navigation”:

Note: I also see Faceted Navigation here; if the product catalog is now possible in SharePoint Online, I will do another post soon, as I have been waiting years for this. I remember the roadblock was something with search managed properties…

Then under your term set, add your terms by selecting Create Term:

Go to Navigation tab and add your custom link:

You can create sub terms under terms as well to enable a drop-down navigation.

You can re-order terms in a group by selecting the group and going to Custom Sort:

Now just repeat the first step of selecting Managed Navigation and the Term Group on each of the site collections that should inherit this navigation.
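If you have many site collections, this last step can also be scripted with CSOM; a minimal sketch, assuming the SharePoint Online Client Components SDK is installed locally (the DLL paths below) and that you substitute your own site URL, account, and term store/term set GUIDs:

[code language=”powershell”]Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Taxonomy.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Publishing.dll"

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://tenant.sharepoint.com/sites/target")
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("admin@tenant.onmicrosoft.com", (Read-Host "Password" -AsSecureString))

# Point the web's global navigation at the shared term set (GUIDs are placeholders)
$taxSession = [Microsoft.SharePoint.Client.Taxonomy.TaxonomySession]::GetTaxonomySession($ctx)
$nav = New-Object Microsoft.SharePoint.Client.Publishing.Navigation.WebNavigationSettings($ctx, $ctx.Web)
$nav.GlobalNavigation.Source = [Microsoft.SharePoint.Client.Publishing.Navigation.StandardNavigationSource]::TaxonomyProvider
$nav.GlobalNavigation.TermStoreId = [guid]"00000000-0000-0000-0000-000000000000"
$nav.GlobalNavigation.TermSetId = [guid]"11111111-1111-1111-1111-111111111111"
$nav.Update($taxSession)
$ctx.ExecuteQuery()[/code]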

Update: selecting a term set this way is limited to one per site collection. So the workaround, according to MS, is to "Pin" each of your primary terms (with children) to the new site collection's term set: https://support.microsoft.com/en-us/help/3144166/implement-global-navigation-across-multiple-site-collections-through-managed-navigation-in-sharepoint-server-2013 (see steps 5-7). One note: it doesn't seem to preserve custom sorting from the parent term set.

Uncheck "Add new pages to navigation automatically" and "Create friendly URLs for new pages automatically":

Done!

Note: if you see any errors (such as "Error loading navigation: The Managed Navigation term set is improperly attached to the site"), switch the navigation to Structural on BOTH Global and Current, SAVE the changes, then change it back to Managed (and uncheck "Add new pages to navigation automatically" and "Create friendly URLs for new pages automatically") and the error should go away.


SharePoint 2010 Content Deployment Job failed. The remote upload web request failed. The remote server returned an error: (404) Not Found.

Summary

My farm content deployment jobs had been working, but all of a sudden stopped one day. The fix in the end was to increase the upload size in a Central Admin web.config file on the target WFE servers.

Issue

I was seeing the following errors in my ULS logs after 23 minutes of packaging up just under 1 GB of content from QA to PRD:

  • ContentDeploymentJob.ExecuteJob(): Failed ExecuteJob() with JobInfo: Name: ‘QA to Prod Job’, Id:’5db43c5d-1c3b-41cd-ac0a-495a48acb175′, JobType:’ServerToServer’, Description:”. Exception: ‘System.Net.WebException: The remote server returned an error: (404) Not Found.
  • Failed to transfer files to destination server for Content Deployment job ‘QA to Prod Job’. Exception was: ‘System.Net.WebException: The remote server returned an error: (404) Not Found.

In central admin, I was receiving this error after 23 minutes of running the job:

  • Content deployment job ‘QA to Prod Job’ failed.The remote upload Web request failed.

Resolution

Thanks to this article, I figured out the issue was the max upload size in the Central Admin web.config on the target WFEs: https://social.msdn.microsoft.com/Forums/sharepoint/en-US/1d4aca49-40c1-414e-980e-150b148caf10/content-deployment-problems?forum=sharepointgeneralprevious

This is what finally worked for me. On the target WFE(s), modify the web.config as follows:

  1. On target WFE(s), edit the web.config file located in C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\ADMIN\Content Deployment

    <httpRuntime maxRequestLength="307200" />

    <requestLimits maxAllowedContentLength="314572800" />

    (Note: maxRequestLength is in KB and maxAllowedContentLength is in bytes, so both new values work out to 300 MB.)

    1. Old values (for backup purposes):

      <httpRuntime maxRequestLength="102400" />

      <requestLimits maxAllowedContentLength="104857600" />

  2. On target WFE(s), perform an IISreset
  3. Launch the CDP job again and it should run.
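If you have several target WFEs, the same edit can be scripted; a minimal sketch, assuming the default path from step 1 and that the two elements sit in their usual web.config sections (verify against your file first):

[code language=”powershell”]$path = "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\ADMIN\Content Deployment\web.config"
Copy-Item $path "$path.bak" # keep the old values for backup purposes

[xml]$config = Get-Content $path
# httpRuntime maxRequestLength is in KB, requestLimits maxAllowedContentLength is in bytes (300 MB each)
$config.configuration.'system.web'.httpRuntime.maxRequestLength = "307200"
$config.configuration.'system.webServer'.security.requestFiltering.requestLimits.maxAllowedContentLength = "314572800"
$config.Save($path)

iisreset[/code]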

Search Result preview images in SharePoint Online

SharePoint search results OOTB do not display image previews until you hover. We wanted a baseball-card style approach to display certain items.

Here is an unfinished example of the results displayed as cards with image previews and the OOTB hover panel:

  1. Make sure the site is a Publishing Portal and publishing features are enabled at the site collection and site level. In order to get the Search display templates to show as *.html files in the master page gallery, some of these features have to be on; otherwise you just get *.js files, which can limit you. Just ask me in the comments if you have any questions.
  2. Modify your Search display template for result items:
    1. /_catalogs/masterpage/Forms/Display%20Templates.aspx
    2. I modified Item_CommonItem_Body.html.
  3. Edit Item_CommonItem_Body.html properties and add the following property:
    1. ServerRedirectedPreviewURL
    2. (Properties are separated by commas and surrounded by single quotes, so the exact text I added to the end was ,'ServerRedirectedPreviewURL'. If you copy from this blog post, replace the fancy quotes with straight single quotes.)
  4. Edit the Item_CommonItem_Body.html file (I open with Explorer and edit the file in notepad++)
    1. Verify the property was added:

    2. Next, add a JavaScript tag to store the Preview Image URL as a web safe string:

      var imageurlpreview = $htmlEncode(ctx.CurrentItem.ServerRedirectedPreviewURL);

    3. Now, let’s add in our custom HTML. This is kinda “hacky” since I am using this site for a proof of concept and I don’t care if these customizations exist everywhere in my test site collection. Ask me below in comments if you want to know how to copy this file and isolate the results to use this custom template using result types or a custom search results page.
        1. I scrolled down to the first HTML div I found, “ms-srch-item-body”. Right above this, I added my custom Baseball Card HTML. Then I moved the rest of this stuff in except for the closing div tag.
        2. The main thing was this line to add the new JavaScript for the image:

          <img src="_#= imageurlpreview =#_" alt="Preview" />

          Here is my baseball card HTML (including the preview image)

          [code language=”html” highlight=”7,30″]
          <div class="cbs-PictureCardsContainer">
          <div class="cbs-PictureCardsImageContainer">
          <a title="Title here" class="cbs-pictureImgLink" href="#">
          <div class="cbs-PictureCardsImage">
          <img src="_#= imageurlpreview =#_" alt="Preview" /></div>
          <div class="sts-cardtype sts-cardtypeidea">
          Type</div>
          </a></div>
          <div title="" class="cbs-pictureCardsCategory ms-noWrap">
          Category</div>
          <div class="cbs-PictureCardsDataContainer">
          <a title="Title here" class="cbs-PictureCardsLine1Link" href="#">
          <div class="cbs-PictureCardsTitle ms-noWrap">
          Pic title</div>
          </a>

          <!-- Move all of the OOTB stuff here, starting with the div ms-srch-item-body -->
          <div title="" class="cbs-PictureCardsDesc">
          Description of image</div>
          </div>
          </div>
          [/code]

        3. Now Save the display template HTML file and publish ONLY the HTML file from the browser (the JS file gets automatically updated instantly):

      Add the baseball card CSS to your result page and you should be good. Again, the element selector (ms-srch-item) used for floating these elements is a bit hokey; I could have modified the Control_SearchResults and individual Item templates instead, but this is just a POC.

      [code language=”css” highlight=”2″]
      /* Cards */
      .ms-srch-item {
      width:240px;
      display:inline;
      float:left;
      margin:11px;
      border: 1px solid #DDD;
      clear:none;
      }
      .cbs-PictureCardsImageContainer{
      height:240px;
      border-top-left-radius: 9px;
      border-top-right-radius: 9px;
      width:240px;

      }
      .cbs-PictureCardsImage {
      height:240px;
      overflow:hidden;
      width:240px;

      }
      .cbs-noImageContainer-ContentWrapperLarge {
      display:none;
      }
      .cbs-PictureCardsDataContainer {
      padding: 8px 22px 0px 22px;
      background-color: #f8f8f8;
      color: #212121;
      }
      .cbs-PictureCardsDataContainer a, .cbs-PictureCardsDataContainer a:active, .cbs-PictureCardsDataContainer a:hover, .cbs-PictureCardsDataContainer a:visited {
      color: #212121;
      }
      .cbs-pictureCardsCategory {
      background-color:#666;
      color: #FFF;
      font-weight: bold;
      font-size: 12px;
      padding: 8px 22px 8px 22px;
      border-top:1px solid #FFF;
      }
      .cbs-PictureCardsTitle {
      font-weight: bold;
      }
      .cbs-PictureCardsDesc {
      height: 75px;
      overflow:hidden;

      }

      .sts-cardtype {
      position:relative;
      top:-18px;
      left:120px;
      text-align:center;
      height: 18px;
      width: 120px;
      color:#000;
      font-weight:bold;
      }

      .sts-cardtypeidea {
      background-color:#a8da69;

      }

      .cbs-PictureCardsIconSection {
      float:left;
      margin-top:8px;
      margin-bottom:8px;
      }
      [/code]

    4. Check in and publish your custom result page (the custom ASPX page with all of your search refiners, result web part, search box, above CSS, etc.) and you should be good.
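As a closing note, the check-in/publish steps above can also be scripted; a rough sketch using the (assumed installed) SharePointPnPPowerShellOnline module, with placeholder URLs:

[code language=”powershell”]Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/search" -UseWebLogin

# Publish a major version of the display template HTML (the paired .js file regenerates automatically)
Set-PnPFileCheckedIn -Url "/_catalogs/masterpage/Display Templates/Search/Item_CommonItem_Body.html" -CheckinType MajorCheckIn -Comment "Added ServerRedirectedPreviewURL"[/code]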

Automating Azure IaaS SharePoint VM Provisioning via PowerShell Remoting

I have been searching for a way to rapidly create standalone developer SharePoint 2013 VMs joined to a central domain for our in-house developers. Our team has created 60+ SharePoint VMs on Azure and continues to create about 10 per month. We are beginning to treat VMs that need hours of repair like cattle rather than the family dog: if a VM has issues or "gets terminally ill", it is replaced with a brand new shiny cow within 4 hours. :)

The process of manually creating VMs was not fun, taking over a day per VM in the beginning on average. I did not want to use Sysprep, since I would have to maintain multiple VM Images at a single point in time. For some, this might be the best way to go.

My solution was to create a lightweight PowerShell set of scripts that create the VM in Azure, install the applications, and keep everything consistent. I can create 4+ VMs at one time, all under ~4 hours total. This is a process that scales somewhat, to meet our needs. Perfect.

Alternatives to this manual PowerShell process I went through:

  • Sysprep would save a ton of steps, but is not as easy to update OS patches, etc. as newer software comes out
  • AzureRM – Azure Resource Manager is a lot easier. However, these topologies seemed somewhat isolated from all of our existing VMs and didn't work well with what we had in place for our network/VPN, etc.
  • Azure SharePoint QuickStart templates/images – These were preconfigured and had various OS settings changed. We ran into issues similar to those with the Azure RM approach above.

Assumptions:

  • You have an Azure subscription all set up, with a virtual network/DNS/Subnet (we have a site-to-site VPN)
  • You have a domain controller with all of the SharePoint service accounts created for a least-privileges security installation
  • You have installed the latest version of Azure PowerShell, rebooted after installing it, and performed the Add-AzureAccount command
  • You have used AutoSPInstaller before

What you need:

  • The following information from Azure
    • Subscription ID
    • Virtual Network info
      • Network Name
      • Subnet Name
    • Resource Group
  • The following pre-existing VMs:
    • DC
      • Service Accounts
    • Fileshare VM
      • All of the necessary ISO’s and EXE’s
        • Installer files:
          • 7zipInstall.msi
          • ccleaner.exe
          • fiddler4setup.exe
          • Firefox Setup Stub 36.0.4.exe
          • iview438_setup.exe
          • LINQPad4Setup.exe
          • npp.6.7.5.Installer.exe
          • paint.net.4.0.5.install.exe
          • PowerGUI.3.8.0.129.msi
          • cutepdf-writer.exe
          • CKS.Dev11.vsix
          • codecompare.exe
      • Stand Alone EXE’s
        • ULSViewer.zip
        • U2U.SharePoint.CQB2010.zip
      • Applications (Extracted into their own folder with configuration.ini files)
        • CamlDesigner2013
        • Visual Studio 2012
        • Visual Studio 2015
        • SharePoint Designer 2013
        • en_sql_server_2014_enterprise_edition_with_service_pack_1_x64_dvd_6669618
        • AutoSPInstaller for dev
        • sql2014config file for dev
        • en_sharepoint_server_2013_with_sp1_x64_dvd_3823428
        • SharePoint 2013 June 2015 CU (note: if you download this from the internet, unblock the files so you don't get prompted during the AutoSPInstaller process; right-click all 3 CU files, go to Security, and unblock. You only have to do this once on the fileshare; see the sketch after this list.)
  • Silent install for software (one-time prep, then save the folder on the Fileshare VM)
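The unblock step can also be done in bulk with PowerShell; a one-time sketch to run on the fileshare VM, assuming the CU lives under the path shown (adjust to yours):

[code language=”powershell”]# Remove the "downloaded from the internet" zone flag so installers don't prompt
Get-ChildItem "C:\Fileshare\SharePoint 2013 June 2015 CU" -Recurse | Unblock-File[/code]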

Configure PowerShell variables for the new standalone developer SharePoint VM

#VM Name will be ASP13D08

#IP will be 192.168.1.87

#Cloud service Company-Redondo-D08 (each developer has their own cloud service so they can power on VMs without having to wait for the other developers to start at the same time)

#service accounts

#Single SharePoint 2013 developer VM

[code language=”powershell”]$varVMLocation = "A"
$varVMServerType = "SP"
$varVMSPVersion = "13"
$varVMType = "D"
$varVMIntanceNum = "08"
$varVMReduxSuffix = "" # I sometimes append a version letter to the end of the developers VM, if they are getting an additional VM of the same role.
$name = $varVMLocation + $varVMServerType + $varVMSPVersion + $varVMType + $varVMIntanceNum + $varVMReduxSuffix # VM name used throughout, e.g. ASP13D08
$spsetupname = "svc_spsetup"
$spsetuppasstext = "passw0rdspsetup"
$users = @("svc_spsetup", "svc_spfarm", "eric.schrader", "dev1", "etc");
$varVMStaticIP = "192.168.1.87"
$varStorageAccount = "Company" + $varVMLocation + $varVMType + $varVMIntanceNum #unique
$varStorageAccount = $varStorageAccount.ToLower()
$service = "Company-Redondo-" + $varVMType + $varVMIntanceNum
$instancesize = "Basic_A4"
$subscriptionid = "12345678-12345-123456"
$subscriptionName = "Microsoft Azure Enterprise"
$imageFamily = "Windows Server 2012 R2 Datacenter" #Azure VM Image name, the latest will be used below.
$image = (Get-AzureVMImage | Where-Object { $_.ImageFamily -eq $imageFamily } | Sort-Object PublishedDate -Descending | Select-Object -First 1).ImageName # latest image in the family (assumes the Azure module is loaded locally)
$localadminname = "company.admin" #cant be "administrator", etc.
$localadminPassword = "passw0rdlocaladmin"
$joindomain = "Domain.local"
$domainname = "Domain"
$machineOU = "OU=Azure,OU=Development,OU=Servers,OU=Seattle,DC=Company,DC=local"
$timezone = "Pacific Standard Time"
$domainusername = "svc_spsetup"
$domainpassword = "passw0rdspsetup"
$datadiskGB = 127
$datadiskname = $name + "_data" # assumption: a label for the new data disk (referenced during VM creation)
$datadiskLUN = 0
$datadiskCACHE = "None"
$vmsubnet = "Subnet-1"
$vmaffinitygroup = "VPN-Linked"[/code]

Create the Azure storage account if it doesn't exist, then set it as the default for PowerShell

#Get-AzureStorageAccount | ft

#Change varStorageAccount to lowercase

[code language=”powershell”]$lowerStorageAccount = $varStorageAccount.ToLower()

Try{

get-azurestorageaccount -storageaccountname $varStorageAccount -ErrorAction Stop

#if this fails to get it, it will create it below. Need above error action

}

Catch {

#you got an error trying to get it, so create it.

Write-output "creating storage account $varStorageAccount"

New-AzureStorageAccount -StorageAccountName $lowerStorageAccount -Label $lowerStorageAccount -AffinityGroup $vmaffinitygroup

}[/code]

#now that it exists, set it as default.

[code language=”powershell”]Set-AzureSubscription -CurrentStorageAccountName $varStorageAccount -SubscriptionId $subscriptionid

Select-AzureSubscription -SubscriptionId $subscriptionid -Current[/code]

Create the VM using above variables

#try to fix the DNS error in the Company DC, WARNING: The specified DNS name is already taken.

#New-AzureService -Label $service -Description $service -AffinityGroup $vmaffinitygroup -ServiceName $service

[code language=”powershell”]New-AzureVMConfig -Name $name -InstanceSize $instancesize -ImageName $image | Add-AzureProvisioningConfig -AdminUserName $localadminname -EnableWinRMHttp -TimeZone $timezone -DisableAutomaticUpdates -Password $localadminPassword -WindowsDomain -JoinDomain $joindomain -Domain $domainname -DomainUserName $domainusername -DomainPassword $domainpassword -MachineObjectOU $machineOU | Add-AzureDataDisk -CreateNew -DiskSizeInGB $datadiskGB -DiskLabel $datadiskname -LUN $datadiskLUN -HostCaching $datadiskCACHE | Set-AzureSubnet -SubnetNames $vmsubnet | Set-AzureStaticVNetIP -IPAddress $varVMStaticIP | New-AzureVM -ServiceName $service -AffinityGroup $vmaffinitygroup[/code]

Configures Secure Remote PowerShell Access to Windows Azure Virtual Machines

Download PS1 file from this blog post to your local computer with Azure PowerShell. https://gallery.technet.microsoft.com/scriptcenter/Configures-Secure-Remote-b137f2fe

#CD to location in PowerShell, for example, your desktop:

[code language=”powershell”]Cd C:\users\eric.schrader\desktop[/code]

#Create WinRM cert for the new VM

[code language=”powershell”].\InstallWinRMCertAzureVM.ps1 -SubscriptionName $subscriptionname -ServiceName $service -Name $name[/code]

Connect to remote session

#Connect via remote powershell as local azure.admin
#uses variables from when the VM was created above

[code language=”powershell”]$passwordsec = convertto-securestring $localadminPassword -asplaintext -force

$user = $name +"\"+ $localadminname

$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $user,$passwordsec

$uri = Get-AzureWinRMUri -ServiceName $service -Name $name

Enter-PSSession -ConnectionUri $uri -Credential $cred

$env:computername[/code]

#Run variables Again!!!

# IMPORTANT, COPY AND PASTE THE ABOVE VARIABLES SECTION IN AGAIN. This is a new remote session to the new Azure VM.

[code language=”powershell”]$varVMLocation = "A"
$varVMServerType = "SP"
$varVMSPVersion = "13"
$varVMType = "D"
$varVMIntanceNum = "08"
$varVMReduxSuffix = "" # I sometimes append a version letter to the end of the developers VM, if they are getting an additional VM of the same role.
$name = $varVMLocation + $varVMServerType + $varVMSPVersion + $varVMType + $varVMIntanceNum + $varVMReduxSuffix # VM name used throughout, e.g. ASP13D08
$spsetupname = "svc_spsetup"
$spsetuppasstext = "passw0rdspsetup"
$users = @("svc_spsetup", "svc_spfarm", "eric.schrader", "dev1", "etc");
$varVMStaticIP = "192.168.1.87"
$varStorageAccount = "Company" + $varVMLocation + $varVMType + $varVMIntanceNum #unique
$varStorageAccount = $varStorageAccount.ToLower()
$service = "Company-Redondo-" + $varVMType + $varVMIntanceNum
$instancesize = "Basic_A4"
$subscriptionid = "12345678-12345-123456"
$subscriptionName = "Microsoft Azure Enterprise"
$imageFamily = "Windows Server 2012 R2 Datacenter" #Azure VM Image name, the latest will be used below.
$localadminname = "company.admin" #cant be "administrator", etc.
$localadminPassword = "passw0rdlocaladmin"
$joindomain = "Domain.local"
$domainname = "Domain"
$machineOU = "OU=Azure,OU=Development,OU=Servers,OU=Seattle,DC=Company,DC=local"
$timezone = "Pacific Standard Time"
$domainusername = "svc_spsetup"
$domainpassword = "passw0rdspsetup"
$datadiskGB = 127
$datadiskLUN = 0
$datadiskCACHE = "None"
$vmsubnet = "Subnet-1"
$vmaffinitygroup = "VPN-Linked"[/code]

Set proper storage account in remote session

# Now that you set the variables, set the storage account for the remote session

[code language=”powershell”]Set-AzureSubscription -CurrentStorageAccountName $varStorageAccount -SubscriptionId $subscriptionid

Select-AzureSubscription -SubscriptionId $subscriptionid -Current[/code]

Format F drive for SharePoint/SQL, permission service accounts

#Format F drive

[code language=”powershell”]$labels = @("DATA1","DATA2")
Write-Host "Initializing and formatting raw disks"
$disks = Get-Disk | Where partitionstyle -eq 'raw' | sort number
## start at F: because sometimes E: shows up as a CD drive in Azure
$letters = 70..89 | ForEach-Object { ([char]$_) }
$count = 0
foreach($d in $disks) {
$driveLetter = $letters[$count].ToString()
$d |
Initialize-Disk -PartitionStyle MBR -PassThru |
New-Partition -UseMaximumSize -DriveLetter $driveLetter |
Format-Volume -FileSystem NTFS -NewFileSystemLabel $labels[$count] `
-Confirm:$false -Force
$count++
}
GET-WMIOBJECT -query "SELECT * from win32_logicaldisk where DriveType = '3'"[/code]

#add developer, and admins/spsetup/spfarm, set in $users variable above

[code language=”powershell”]foreach($user in $users) {

$domainuser= $domainname + "\"+$user

$Group = "Administrators"

$de = [ADSI]"WinNT://$name/$Group,group"

$de.Add("WinNT://$domainname/$user")

Write-Host "Done, $domainuser has been permissioned to this computer."

}

net localgroup administrators[/code]

#create folder and share for apps

[code language=”powershell”]New-Item -Path F:\tools -ItemType Directory
New-SMBShare -Name "Tools" -Path "F:\Tools" -ChangeAccess "Everyone"[/code]

 Disable UAC (for developers), restart computer, set execution policy, etc.

#Disable UAC

[code language=”powershell”]Set-ItemProperty -Path HKLM:\Software\Microsoft\Windows\CurrentVersion\policies\system -Name EnableLUA -Value 0[/code]

#allow scripts

[code language=”powershell”]Set-executionpolicy unrestricted -force[/code]

#reboot

[code language=”powershell”]Restart-computer[/code]

#wait 5 minutes for reboot

#Reconnect to powershell

(exit, reconnect to remote powershell, re-run vars)

 Connect to remote session

#Connect via remote powershell as local azure.admin
#uses variables from when the VM was created above

[code language=”powershell”]$passwordsec = convertto-securestring $localadminPassword -asplaintext -force

$user = $name +"\"+ $localadminname

$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $user,$passwordsec

$uri = Get-AzureWinRMUri -ServiceName $service -Name $name

Enter-PSSession -ConnectionUri $uri -Credential $cred

$env:computername[/code]

#Run variables Again!!!

# IMPORTANT, COPY AND PASTE THE ABOVE VARIABLES SECTION IN AGAIN. This is a new remote session to the new Azure VM.

 

[code language=”powershell”]$varVMLocation = "A"
$varVMServerType = "SP"
$varVMSPVersion = "13"
$varVMType = "D"
$varVMIntanceNum = "08"
$varVMReduxSuffix = "" # I sometimes append a version letter to the end of the developers VM, if they are getting an additional VM of the same role.
$name = $varVMLocation + $varVMServerType + $varVMSPVersion + $varVMType + $varVMIntanceNum + $varVMReduxSuffix # VM name used throughout, e.g. ASP13D08
$spsetupname = "svc_spsetup"
$spsetuppasstext = "passw0rdspsetup"
$users = @("svc_spsetup", "svc_spfarm", "eric.schrader", "dev1", "etc");
$varVMStaticIP = "192.168.1.87"
$varStorageAccount = "Company" + $varVMLocation + $varVMType + $varVMIntanceNum #unique
$varStorageAccount = $varStorageAccount.ToLower()
$service = "Company-Redondo-" + $varVMType + $varVMIntanceNum
$instancesize = "Basic_A4"
$subscriptionid = "12345678-12345-123456"
$subscriptionName = "Microsoft Azure Enterprise"
$imageFamily = "Windows Server 2012 R2 Datacenter" #Azure VM Image name, the latest will be used below.
$localadminname = "company.admin" #cant be "administrator", etc.
$localadminPassword = "passw0rdlocaladmin"
$joindomain = "Domain.local"
$domainname = "Domain"
$machineOU = "OU=Azure,OU=Development,OU=Servers,OU=Seattle,DC=Company,DC=local"
$timezone = "Pacific Standard Time"
$domainusername = "svc_spsetup"
$domainpassword = "passw0rdspsetup"
$datadiskGB = 127
$datadiskLUN = 0
$datadiskCACHE = "None"
$vmsubnet = "Subnet-1"
$vmaffinitygroup = "VPN-Linked"[/code]

Install SQL pre-reqs

 #Install .net 3.5 for SQL prereq on SQL server

[code language=”powershell”]Install-WindowsFeature –name NET-Framework-Core[/code]

Now that the VM is ready for software installs, let's copy the software over. Due to the Windows "triple hop" credential issue, I cannot remote into the VM and then copy from a third remote location to the VM, so I will have to RDP manually (see the CredSSP sketch later in this post for one way around this).

RDP to fileshare computer as svc_SPSetup

Run variables on the fileshare computer's PowerShell

#Run variables Again!!!

# IMPORTANT, COPY AND PASTE THE ABOVE VARIABLES SECTION IN AGAIN. This is a new VM session.

[code language=”powershell”]$varVMLocation = "A"
$varVMServerType = "SP"
$varVMSPVersion = "13"
$varVMType = "D"
$varVMIntanceNum = "08"
$varVMReduxSuffix = "" # I sometimes append a version letter to the end of the developers VM, if they are getting an additional VM of the same role.
$name = $varVMLocation + $varVMServerType + $varVMSPVersion + $varVMType + $varVMIntanceNum + $varVMReduxSuffix # VM name used throughout, e.g. ASP13D08
$spsetupname = "svc_spsetup"
$spsetuppasstext = "passw0rdspsetup"
$users = @("svc_spsetup", "svc_spfarm", "eric.schrader", "dev1", "etc");
$varVMStaticIP = "192.168.1.87"
$varStorageAccount = "Company" + $varVMLocation + $varVMType + $varVMIntanceNum #unique
$varStorageAccount = $varStorageAccount.ToLower()
$service = "Company-Redondo-" + $varVMType + $varVMIntanceNum
$instancesize = "Basic_A4"
$subscriptionid = "12345678-12345-123456"
$subscriptionName = "Microsoft Azure Enterprise"
$imageFamily = "Windows Server 2012 R2 Datacenter" #Azure VM Image name, the latest will be used below.
$localadminname = "company.admin" #cant be "administrator", etc.
$localadminPassword = "passw0rdlocaladmin"
$joindomain = "Domain.local"
$domainname = "Domain"
$machineOU = "OU=Azure,OU=Development,OU=Servers,OU=Seattle,DC=Company,DC=local"
$timezone = "Pacific Standard Time"
$domainusername = "svc_spsetup"
$domainpassword = "passw0rdspsetup"
$datadiskGB = 127
$datadiskLUN = 0
$datadiskCACHE = "None"
$vmsubnet = "Subnet-1"
$vmaffinitygroup = "VPN-Linked"[/code]

Copy the software

#Run from Fileshare as SPSetup in PowerShell

#re-run variables

#run installers
#Copy applications from local computer S drive on \\fileshare to server F drive

[code language=”powershell”]Copy-Item C:\Fileshare\applications\7zipInstall.msi -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\ccleaner.exe -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\fiddler4setup.exe -Destination \\$name\tools
Copy-Item 'C:\Fileshare\applications\Firefox Setup Stub 36.0.4.exe' -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\iview438_setup.exe -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\LINQPad4Setup.exe -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\npp.6.7.5.Installer.exe -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\paint.net.4.0.5.install.exe -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\PowerGUI.3.8.0.129.msi -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\cutepdf-writer.exe -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\CKS.Dev11.vsix -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\codecompare.exe -Destination \\$name\tools[/code]

#copy exes to F:\Tools 

[code language=”powershell”]Copy-Item C:\Fileshare\applications\ULSViewer.zip -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\U2U.SharePoint.CQB2010.zip -Destination \\$name\tools
Copy-Item C:\Fileshare\applications\CamlDesigner2013\* -Destination \\$name\tools -Recurse
Copy-Item "C:\Fileshare\Visual Studio 2012\*" -Destination \\$name\tools -Recurse
Copy-Item "C:\Fileshare\Visual Studio 2015\*" -Destination \\$name\tools -Recurse
Copy-Item "C:\Fileshare\SharePoint Designer 2013\*" -Destination \\$name\tools -Recurse[/code]

#copy SQL to SQL server (Copy ISO CONTENTS)

[code language=”powershell”]Copy-Item C:\Fileshare\en_sql_server_2014_enterprise_edition_with_service_pack_1_x64_dvd_6669618\* -Destination \\$name\tools -Recurse
Copy-Item C:\Fileshare\AutoSPInstallerDev2013\* -Destination \\$name\tools -Recurse
Copy-Item C:\Fileshare\sql2014configdevint\* -Destination \\$name\tools -Recurse -force
Copy-Item C:\Fileshare\en_sharepoint_server_2013_with_sp1_x64_dvd_3823428\* -Destination \\$name\tools\AutoSPInstaller\SP\2013\SharePoint -Recurse -force
Copy-Item "C:\Fileshare\SharePoint 2013 June 2015 CU" -Destination \\$name\tools\AutoSPInstaller\SP\2013\Updates -Recurse -force[/code]

Close RDP to the fileshare and go back to your local computer's PowerShell. We are ready to install the software.

#re-run variables

#Run variables Again!!!

# IMPORTANT, COPY AND PASTE THE ABOVE VARIABLES SECTION IN AGAIN. This is a new remote session to the new Azure VM.

[code language=”powershell”]$varVMLocation = "A"
$varVMServerType = "SP"
$varVMSPVersion = "13"
$varVMType = "D"
$varVMIntanceNum = "08"
$varVMReduxSuffix = "" # I sometimes append a version letter to the end of the developers VM, if they are getting an additional VM of the same role.
$name = $varVMLocation + $varVMServerType + $varVMSPVersion + $varVMType + $varVMIntanceNum + $varVMReduxSuffix # VM name used throughout, e.g. ASP13D08
$spsetupname = "svc_spsetup"
$spsetuppasstext = "passw0rdspsetup"
$users = @("svc_spsetup", "svc_spfarm", "eric.schrader", "dev1", "etc");
$varVMStaticIP = "192.168.1.87"
$varStorageAccount = "Company" + $varVMLocation + $varVMType + $varVMIntanceNum #unique
$varStorageAccount = $varStorageAccount.ToLower()
$service = "Company-Redondo-" + $varVMType + $varVMIntanceNum
$instancesize = "Basic_A4"
$subscriptionid = "12345678-12345-123456"
$subscriptionName = "Microsoft Azure Enterprise"
$imageFamily = "Windows Server 2012 R2 Datacenter" #Azure VM Image name, the latest will be used below.
$localadminname = "company.admin" #cant be "administrator", etc.
$localadminPassword = "passw0rdlocaladmin"
$joindomain = "Domain.local"
$domainname = "Domain"
$machineOU = "OU=Azure,OU=Development,OU=Servers,OU=Seattle,DC=Company,DC=local"
$timezone = "Pacific Standard Time"
$domainusername = "svc_spsetup"
$domainpassword = "passw0rdspsetup"
$datadiskGB = 127
$datadiskLUN = 0
$datadiskCACHE = "None"
$vmsubnet = "Subnet-1"
$vmaffinitygroup = "VPN-Linked"[/code]

RDP to the developer VM using svc_SPSetup and install SQL via PowerShell.
#via SPSetup; you possibly have to sign into RDP instead of remote PS. Takes 20 minutes

[code language=”powershell”]start-process F:\tools\sql\Setup.exe -ArgumentList "/q /SkipRules=VSShellInstalledRule RebootRequiredCheck /ConfigurationFile=F:\Tools\ConfigurationFile.ini /ERRORREPORTING=1 /IACCEPTSQLSERVERLICENSETERMS" -Wait[/code]

Close RDP to developer VM once SQL is done.

From your local computer, connect via remote PowerShell as spsetup. There is code in here to install SQL remotely, but it too has a triple-hop credential issue since I use service accounts. I just RDP to the VM and install SQL via PowerShell there.
#uses variables from when the VM was created above

[code language=”powershell”]$passwordsec = convertto-securestring $spsetuppasstext -asplaintext -force

$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $spsetupname ,$passwordsec

$uri = Get-AzureWinRMUri -ServiceName $service -Name $name

Enter-PSSession -ConnectionUri $uri -Credential $cred

$env:computername[/code]

#FIX – maybe to install SQL remotely
#Invoke-Command -ComputerName $name -Authentication CredSSP -credential $cred -scriptblock {
#F:\tools\sql\Setup.exe -ArgumentList "/q /SkipRules=VSShellInstalledRule RebootRequiredCheck /ConfigurationFile=F:\Tools\ConfigurationFile.ini /ERRORREPORTING=1 /IACCEPTSQLSERVERLICENSETERMS" -Wait
#}
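For the commented-out Invoke-Command above to work, CredSSP would first have to be enabled on both ends so the credential can make the extra hop. A minimal sketch, assuming you accept the security trade-offs of CredSSP delegation:

[code language=”powershell”]# On your admin workstation: allow delegating fresh credentials to the target VM
Enable-WSManCredSSP -Role Client -DelegateComputer $name -Force

# On the target VM (e.g. from an RDP session there): accept delegated credentials
Enable-WSManCredSSP -Role Server -Force[/code]

After that, the Invoke-Command call with -Authentication CredSSP should be able to reach the fileshare from inside the remote session.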

Set SQL max memory to 10GB, set Max Degree of parallelism to 1 (this is a huge script, maybe you can shorten it)

#Set Max degree of parallelism to 1

[code language=”powershell”]## Sets the ‘max degree of parallelism’ value to 1 for the specified SQL server instance
## Port 1433 is used if not specified
## 2012-10-08
## www.pointbeyond.com
## NOTE: This function requires at least serveradmin level permissions within SQL server
function SetMaxDegreeOfParallelism()

{

Param(

$server,

$port="1433")

$conn = new-object System.Data.SqlClient.SqlConnection

try

{

$connectionString = "Server="+$server+","+$port+";Database=master;Integrated Security=True;"

$conn.ConnectionString = $connectionString

$conn.Open()

$cmd = new-object System.Data.SqlClient.SqlCommand

$cmd.Connection = $conn

# Ensure advanced options are available

$commandText = "sp_configure ‘show advanced options’, 1;RECONFIGURE WITH OVERRIDE;"

$cmd.CommandText = $commandText

$r = $cmd.ExecuteNonQuery()

# Set the Max Degree of Parallelism value to 1

write-host "Setting ‘max degree of parallelism’ value to 1 for server $server…"

$commandText = "sp_configure ‘max degree of parallelism’, 1;RECONFIGURE WITH OVERRIDE"

$cmd.CommandText = $commandText

$r = $cmd.ExecuteNonQuery()

write-host "Success"

}

catch

{

write-host "An error occurred trying to set the MaxDegreeOfParallelism value to 1 for server $server" -Fore Red

write-host "Ensure that server and port parameters are correct and that the current user has at least serveradmin permissions within SQL" -Fore Red

}

finally

{

$conn.Close()

}

}

# Call the function passing in SQL server name/instance/alias and port number

SetMaxDegreeOfParallelism -server $name -port "1433"[/code]

#Set SQL Max memory – 10 GB (the call at the end passes 10000 MB)

[code language=”powershell”]Function Test-SqlSa {

<#

.SYNOPSIS

Ensures sysadmin account access on SQL Server. $server is an SMO server object.

.EXAMPLE

if (!(Test-SQLSA $server)) { throw "Not a sysadmin on $source. Quitting." }

.OUTPUTS

$true if syadmin

$false if not

#>

[CmdletBinding()]

param(

[Parameter(Mandatory = $true)]

[ValidateNotNullOrEmpty()]

[object]$server

)

try {

return ($server.ConnectionContext.FixedServerRoles -match "SysAdmin")

}

catch { return $false }

}

Function Get-ParamSqlCmsGroups {

<#

.SYNOPSIS

Returns System.Management.Automation.RuntimeDefinedParameterDictionary

filled with server groups from specified SQL Server Central Management server name.

.EXAMPLE

Get-ParamSqlCmsGroups sqlserver

#>

[CmdletBinding()]

param(

[Parameter(Mandatory = $true)]

[string]$Server

)

if ([Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") -eq $null) {return}

if ([Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.RegisteredServers") -eq $null) {return}

$cmserver = New-Object Microsoft.SqlServer.Management.Smo.Server $server

$sqlconnection = $cmserver.ConnectionContext.SqlConnectionObject

try { $cmstore = new-object Microsoft.SqlServer.Management.RegisteredServers.RegisteredServersStore($sqlconnection)}

catch { return }

if ($cmstore -eq $null) { return }

$newparams = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary

$paramattributes = New-Object System.Management.Automation.ParameterAttribute

$paramattributes.ParameterSetName = "__AllParameterSets"

$paramattributes.Mandatory = $false

$argumentlist = $cmstore.DatabaseEngineServerGroup.ServerGroups.name

if ($argumentlist -ne $null) {

$validationset = New-Object System.Management.Automation.ValidateSetAttribute -ArgumentList $argumentlist

$combinedattributes = New-Object -Type System.Collections.ObjectModel.Collection[System.Attribute]

$combinedattributes.Add($paramattributes)

$combinedattributes.Add($validationset)

$SqlCmsGroups = New-Object -Type System.Management.Automation.RuntimeDefinedParameter("SqlCmsGroups", [String[]], $combinedattributes)

$newparams.Add("SqlCmsGroups", $SqlCmsGroups)

return $newparams

} else { return }

}

Function Get-SqlCmsRegServers {

<#

.SYNOPSIS

Returns array of server names from CMS Server. If -Groups is specified,

only servers within the given groups are returned.

.EXAMPLE

Get-SqlCmsRegServers -Server sqlserver -Groups "Accounting", "HR"

#>

[CmdletBinding()]

param(

[Parameter(Mandatory = $true)]

[string]$server,

[string[]]$groups

)

if ([Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") -eq $null) {return}

if ([Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.RegisteredServers") -eq $null) {return}

$cmserver = New-Object Microsoft.SqlServer.Management.Smo.Server $server

$sqlconnection = $cmserver.ConnectionContext.SqlConnectionObject

try { $cmstore = new-object Microsoft.SqlServer.Management.RegisteredServers.RegisteredServersStore($sqlconnection)}

catch { throw "Cannot access Central Management Server" }

$servers = @()

if ($groups -ne $null) {

foreach ($group in $groups) {

$cms = $cmstore.ServerGroups["DatabaseEngineServerGroup"].ServerGroups[$group]

$servers += ($cms.GetDescendantRegisteredServers()).servername

}

} else {

$cms = $cmstore.ServerGroups["DatabaseEngineServerGroup"]

$servers = ($cms.GetDescendantRegisteredServers()).servername

}

return $servers

}

Function Get-SqlMaxMemory {

<#

.SYNOPSIS

Displays information relating to SQL Server Max Memory configuration settings. Works on SQL Server 2000-2014.

.DESCRIPTION

Inspired by Jonathan Kehayias's post about SQL Server Max memory (http://bit.ly/sqlmemcalc), this script displays a SQL Server's:

total memory, currently configured SQL max memory, and the calculated recommendation.

Jonathan notes that the formula used provides a *general recommendation* that doesn’t account for everything that may be going on in your specific environment.

.PARAMETER Servers

Allows you to specify a comma separated list of servers to query.

.PARAMETER ServersFromFile

Allows you to specify a list that’s been populated by a list of servers to query. The format is as follows

server1

server2

server3

.PARAMETER SqlCms

Reports on a list of servers populated by the specified SQL Server Central Management Server.

.PARAMETER SqlCmsGroups

This is a parameter that appears when SqlCms has been specified. It is populated by Server Groups within the given Central Management Server.

.NOTES

Author : Chrissy LeMaire

Requires:         PowerShell Version 3.0, SQL Server SMO, sysadmin access on SQL Servers

DateUpdated: 2015-May-21

.LINK

<a href="https://gallery.technet.microsoft.com/scriptcenter/Get-Set-SQL-Max-Memory-19147057">https://gallery.technet.microsoft.com/scriptcenter/Get-Set-SQL-Max-Memory-19147057</a>

.EXAMPLE

Get-SqlMaxMemory -SqlCms sqlcluster

Get Memory Settings for all servers within the SQL Server Central Management Server "sqlcluster"

.EXAMPLE

Get-SqlMaxMemory -SqlCms sqlcluster | Where-Object { $_.SqlMaxMB -gt $_.TotalMB } | Set-SqlMaxMemory -UseRecommended

Find all servers in CMS that have Max SQL memory set to higher than the total memory of the server (think 2147483647)

#>

[CmdletBinding()]

Param(

[parameter(Position=0)]

[string[]]$Servers,

# File with one server per line

[string]$ServersFromFile,

# Central Management Server

[string]$SqlCms

)

DynamicParam { if ($SqlCms) { return (Get-ParamSqlCmsGroups $SqlCms) } }

PROCESS {

if ([string]::IsNullOrEmpty($SqlCms) -and [string]::IsNullOrEmpty($ServersFromFile) -and [string]::IsNullOrEmpty($servers))

{ throw "You must specify a server list source using -Servers or -SqlCms or -ServersFromFile" }

if ([Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") -eq $null )

{ throw "Quitting: SMO Required. You can download it from <a href="http://goo.gl/R4yA6u">http://goo.gl/R4yA6u</a>" }

if ([Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.RegisteredServers") -eq $null )

{ throw "Quitting: SMO Required. You can download it from <a href="http://goo.gl/R4yA6u">http://goo.gl/R4yA6u</a>" }

$SqlCmsGroups = $psboundparameters.SqlCmsGroups

if ($SqlCms) { $servers = Get-SqlCmsRegServers -server $SqlCms -groups $SqlCmsGroups }

If ($ServersFromFile) { $servers = Get-Content $ServersFromFile }

$collection = @()

foreach ($servername in $servers) {

Write-Verbose "Attempting to connect to $servername"

$server = New-Object Microsoft.SqlServer.Management.Smo.Server $servername

try { $server.ConnectionContext.Connect() } catch { Write-Warning "Can’t connect to $servername. Moving on."; continue }

$maxmem = $server.Configuration.MaxServerMemory.ConfigValue

$reserve = 1

$totalMemory = $server.PhysicalMemory

# Some servers underreport by 1MB.

if (($totalmemory % 1024) -ne 0) { $totalMemory = $totalMemory + 1 }

if ($totalMemory -ge 4096) {

$currentCount = $totalMemory

while ($currentCount/4096 -gt 0) {

if ($currentCount -gt 16384) {

$reserve += 1

$currentCount += -8192

} else {

$reserve += 1

$currentCount += -4096

}

}

}

$recommendedMax = [int]($totalMemory-($reserve*1024))

$object = New-Object PSObject -Property @{

Server = $server.name

TotalMB = $totalMemory

SqlMaxMB = $maxmem

RecommendedMB = $recommendedMax

}

$server.ConnectionContext.Disconnect()

$collection += $object

}

return ($collection | Sort-Object Server | Select Server, TotalMB, SqlMaxMB, RecommendedMB)

}

}

Function Set-SqlMaxMemory {

<#

.SYNOPSIS

Sets SQL Server max memory then displays information relating to SQL Server Max Memory configuration settings. Works on SQL Server 2000-2014.

.PARAMETER Servers

Allows you to specify a comma separated list of servers to query.

.PARAMETER ServersFromFile

Allows you to specify a list that’s been populated by a list of servers to query. The format is as follows

server1

server2

server3

.PARAMETER SqlCms

Reports on a list of servers populated by the specified SQL Server Central Management Server.

.PARAMETER SqlCmsGroups

This is a parameter that appears when SqlCms has been specified. It is populated by Server Groups within the given Central Management Server.

.PARAMETER MaxMB

Specifies the max megabytes

.PARAMETER UseRecommended

Inspired by Jonathan Kehayias's post about SQL Server Max memory (http://bit.ly/sqlmemcalc), this uses a formula to determine the default optimum RAM to use, then sets the SQL max value to that number.

Jonathan notes that the formula used provides a *general recommendation* that doesn’t account for everything that may be going on in your specific environment.

.NOTES

Author : Chrissy LeMaire

Requires:         PowerShell Version 3.0, SQL Server SMO, sysadmin access on SQL Servers

DateUpdated: 2015-May-21

.LINK

<a href="https://gallery.technet.microsoft.com/scriptcenter/Get-Set-SQL-Max-Memory-19147057">https://gallery.technet.microsoft.com/scriptcenter/Get-Set-SQL-Max-Memory-19147057</a>

.EXAMPLE

Set-SqlMaxMemory sqlserver 2048

Set max memory to 2048 MB on just one server, "sqlserver"

.EXAMPLE

Get-SqlMaxMemory -SqlCms sqlcluster | Where-Object { $_.SqlMaxMB -gt $_.TotalMB } | Set-SqlMaxMemory -UseRecommended

Find all servers in CMS that have Max SQL memory set to higher than the total memory of the server (think 2147483647),

then pipe those to Set-SqlMaxMemory and use the default recommendation

.EXAMPLE

Set-SqlMaxMemory -SqlCms sqlcluster -SqlCmsGroups Express -MaxMB 512 -Verbose

Specifically set memory to 512 MB for all servers within the "Express" server group on CMS "sqlcluster"

#>

[CmdletBinding()]

Param(

[parameter(Position=0)]

[string[]]$Servers,

[parameter(Position=1)]

[int]$MaxMB,

[string]$ServersFromFile,

[string]$SqlCms,

[switch]$UseRecommended,

[Parameter(ValueFromPipeline=$True)]

[object]$collection

)

DynamicParam { if ($SqlCms) { return (Get-ParamSqlCmsGroups $SqlCms)} }

PROCESS {

if ([string]::IsNullOrEmpty($SqlCms) -and [string]::IsNullOrEmpty($ServersFromFile) -and [string]::IsNullOrEmpty($servers) -and $collection -eq $null)

{ throw "You must specify a server list source using -Servers or -SqlCms or -ServersFromFile or you can pipe results from Get-SqlMaxMemory" }

if ($MaxMB -eq 0 -and $UseRecommended -eq $false -and $collection -eq $null) { throw "You must specify -MaxMB or -UseRecommended" }

if ($collection -eq $null) {

$SqlCmsGroups = $psboundparameters.SqlCmsGroups

if ($SqlCmsGroups -ne $null) {

$collection = Get-SqlMaxMemory -Servers $servers -SqlCms $SqlCms -ServersFromFile $ServersFromFile -SqlCmsGroups $SqlCmsGroups

} else { $collection = Get-SqlMaxMemory -Servers $servers -SqlCms $SqlCms -ServersFromFile $ServersFromFile }

}

$collection | Add-Member -NotePropertyName OldMaxValue -NotePropertyValue 0

foreach ($row in $collection) {

$server = New-Object Microsoft.SqlServer.Management.Smo.Server $row.server

try { $server.ConnectionContext.Connect() } catch { Write-Warning "Can’t connect to $servername. Moving on."; continue }

if (!(Test-SqlSa $server)) {

Write-Warning "Not a sysadmin on $servername. Moving on."

$server.ConnectionContext.Disconnect()

continue

}

$row.OldMaxValue = $row.SqlMaxMB

try {

if ($UseRecommended) {

Write-Verbose "Changing $($row.server) SQL Server max from $($row.SqlMaxMB) to $($row.RecommendedMB) MB"

$server.Configuration.MaxServerMemory.ConfigValue = $row.RecommendedMB

$row.SqlMaxMB = $row.RecommendedMB

} else {

Write-Verbose "Changing $($row.server) SQL Server max from $($row.SqlMaxMB) to $MaxMB MB"

$server.Configuration.MaxServerMemory.ConfigValue = $MaxMB

$row.SqlMaxMB = $MaxMB

}

$server.Configuration.Alter()

} catch { Write-Warning "Could not modify Max Server Memory for $($row.server)" }

$server.ConnectionContext.Disconnect()

}

return $collection | Select Server, TotalMB, OldMaxValue, @{name="CurrentMaxValue";expression={$_.SqlMaxMB}}

}

}

Set-SqlMaxMemory $name 10000[/code]

Install Developer APPS

[code language=”powershell”]start-process F:\tools\7zipInstall.msi -ArgumentList "/q" -Wait
start-process F:\tools\ccleaner.exe -argumentlist "/S" -Wait
start-process F:\tools\fiddler4setup.exe -ArgumentList "/S" -Wait
start-process 'F:\tools\Firefox Setup Stub 36.0.4.exe' -ArgumentList "/S" -Wait
start-process F:\tools\iview438_setup.exe -ArgumentList "/silent" -Wait
start-process F:\tools\LINQPad4Setup.exe -ArgumentList "/silent" -Wait
start-process F:\tools\npp.6.7.5.Installer.exe -ArgumentList "/S" -Wait
start-process F:\tools\paint.net.4.0.5.install.exe -ArgumentList "/S" -Wait
start-process F:\tools\paint.net.4.0.5.install.exe -ArgumentList "/auto" -Wait
start-process F:\tools\PowerGUI.3.8.0.129.msi -ArgumentList "/q"[/code]

#start print spooler for cutepdf

[code language=”powershell”]net start spooler

sc query spooler[/code]

#fix hanging http://d4rkcell.com/archives/1217

[code language=”powershell”]start-process F:\tools\cutepdf-writer.exe -ArgumentList "/VERYSILENT /SUPPRESSMSGBOXES /NORESTART /SP-"

start-process F:\tools\VisualStudio2012\vs_premium.exe -ArgumentList "/adminfile AdminDeployment.xml /passive /norestart" -Wait -NoNewWindow -PassThru

$vsixInstallerPath = "C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\VSIXInstaller.exe"

$extensionPath = "F:\tools\CKS.Dev11.vsix"

Start-Process -FilePath $vsixInstallerPath -ArgumentList "/q $extensionPath" -NoNewWindow -Wait

start-process F:\tools\SPDesigner\setup.exe -ArgumentList "/adminfile updates\adminfile.msp" -wait

start-process F:\tools\en_visual_studio_enterprise_2015_with_update_1_x86_x64_web_installer_8234346.exe -ArgumentList "/S /AdminFile F:\Tools\AdminDeployment.xml" -Wait[/code]

RDP to developer VM as svc_SPSetup and Launch AutoSPInstaller bat file

[code language=”powershell”]start-process F:\tools\AutoSPInstaller\SP\AutoSPInstaller\AutoSPInstallerLaunch.bat -wait[/code]

#Run autospinstaller

#Pre-reqs- 10 minutes w Restart

#Install binaries- 15 minutes

#Automation Fix: the CU prompts about an internet-trusted file, which adds about 30 minutes (or right-click all 3 CU files, go to Security, and unblock; already done for the June 2015 CU on AzureShare).


#UPS Sync- we have to do this manually per install guide

#Add developer as full control of web applications

SharePoint – Configure User Profile AD Sync by hand

AD Connection:

Active Directory Company

Company.local

Company\svc_spups

passw0rd

(Sync All OUs)


Enable timerjob 1am daily:


Start full sync:


Manually add developer as full control of web applications.

Done!

SharePoint DevOps Part 1 – Setup CentOS with Ansible

This is the first post in a series for creating a Linux host with Ansible to control Windows machines and install SharePoint.

Let me preface this with I have no clue how far I will get. I expect this to take a few weeks if it’s possible. Ansible seems to be growing, and this all might not be fully baked out. We will see! Feel free to comment.

  1. Part 1- Setup CentOS with Ansible (This post)
  2. Part 2- Setup Ansible with Windows Machines (Coming soon)
  3. Part 3- Using Ansible to prepare Windows Machines for SQL Server (Coming soon)
  4. Part 4- Using Ansible to install SharePoint using AutoSPInstaller (Coming soon)
  5. Part 5- Using Ansible to Maintain SharePoint Machines (Coming soon)

Introduction

There are various technologies I have tried in the past for automating Windows deployment and SharePoint installation with AutoSPInstaller, such as:

  1. Chocolatey – I could not find enterprise-worthy packages for SQL Server and SharePoint. I also couldn't figure out a way to provision Azure VMs from it.
  2. PowerShell DSC – Way too much overkill with push/pull when I just wanted to configure a new SharePoint farm
  3. Chef – I have not tried it. I personally can't get over the names of things relating to items in the kitchen. But I believe this is a standard; check it out
  4. Azure WebHooks – Runs PowerShell, possibly limited, on Azure VMs. Note each VM needs an endpoint for each runbook. I have 60 dev VMs/farms that I manage and wanted a different runbook for different parts of the SharePoint configuration process, so this wasn't going to happen.
  5. Azure RunBooks – Lots of great tutorials for building a SP farm, but the farm was not fully configured. I prefer AutoSPInstaller and couldn't figure out a way to merge them.

Now, my experience above is very limited: when I would hit a wall, I would pretty much check how far along the technology was, and if my scenario was not supported I would give up. Also, the above experiences are a hybrid of on-premises SharePoint installation needs and Azure IaaS SharePoint hosting needs. So if this does not work, I will be going back to the drawing board or the above list. I have not used Microsoft System Center Orchestrator, as I rarely provision new Hyper-V VMs, and I heard it is a ton of work to configure.

My Goal with Automation

I have well-defined, documented steps for installing SharePoint 2013 farms with one or more servers, for dev/QA or production. These can be in Azure or on-premises in Hyper-V. I want something lightweight that can provision a new VM, join it to the domain (creating a domain if needed), run Windows Update, configure OS rules, install SQL Server (different versions), install SharePoint via AutoSPInstaller, patch SharePoint, and configure services. I would also like to set services and verify those rogue developers did not change any system settings (just kidding, team). I know automation does not replace planning, but I hope to turn documentation and PowerShell scripts into a custom deployment tool I can use for provisioning future environments as well as maintaining existing ones.

When I read The Phoenix Project, it made a good point: "If the family dog gets injured, you nurse it back to health, but if one of your cattle gets injured, you will be having beef for dinner." In other words, why spend 100 hours fixing an environment when you can just recreate or replace it? Of course developers and users change system settings, but the core of the machine can be recreated via a script in less than a day, rather than troubleshooting something and possibly never solving the issue.

Getting Started with Ansible

I picked up a book on Amazon called Ansible for DevOps by Jeff Geerling, but shortly into the book realized it was only for Linux-based machines. Ansible does support Windows targets, but the commands must be run from a Linux OS. Dang. I don't know Linux. So now I am writing this blog post. And it begins. Let's see what Ansible can do for Windows targets.

Installing Linux – CentOS on a Windows Hyper-V host

  1. Download Cent OS7
    1. https://www.centos.org/download/
    2. I chose the Torrent option, as the mirrors were pretty slow. The torrent ran at about 6 MB/s and finished the 4 GB download in about 10 minutes.
  2. Create new VM

    1. Choose a location for the VM file

    2. Choose Generation 1

    3. Choose a fixed amount of ram

    4. Choose your Hyper-V NIC

    5. Choose your VHD path/info for a new blank VHD to be created

    6. Choose the CentOS ISO file downloaded from the first step

    7. Turn on the VM!
    8. Boot to the CentOS installer

Install CentOS on VM (then Python/Ansible)

  1. Select Language


  1. Choose software selection (choose "Server with GUI" unless you know the Linux terminal well). I also selected the Development/Security tools and MariaDB (I saw MariaDB in the Ansible book example and figured it would save me some steps later for Ansible testing)


  2. Choose disk


  3. Enable Ethernet and choose a hostname for the computer


  4. Verify everything looks good:


  5. While the OS is installing, configure the root password and a local user account. Root is like a local server admin account, which we will be using. The user account is the username and password you will log in with each time you start the VM


  6. Select the blue Reboot option when the install is complete.

Log In to CentOS

  1. Log in to CentOS
    1. License Agreement
      1. Hit 1 to read it, 2 to accept, c to continue, and c again to continue (I kinda struggled with this part)


    2. Sign in using the username and password you set up:


      1. Accept language, keyboard layout, and skip cloud accounts if desired. Then click Start using Linux!


  2. Run Terminal


Install Python/Ansible on CentOS

  1. Install Python
    1. Type su and hit Enter in the terminal to switch to the root account


    2. Enter root password (different from user password, you entered it in setup)
    3. Once in root, install the EPEL repository (it provides the Ansible package and its Python dependencies).
      1. Type: sudo yum install epel-release


      2. Hit Y to continue (twice)
      3. Verify complete

  2. Install Ansible
    1. Type sudo yum install ansible


    2. Select Y to continue (twice)

    3. Test ansible command to verify install is complete:
      1. Type: ansible --version


      2. You should get back a version number.

RESOURCES

Thanks to this article for the install help; with it I was able to figure out how to install Ansible without errors on CentOS 7. Here are the same commands again, together without screenshots:

http://stackoverflow.com/questions/32048021/yum-what-is-the-message-no-package-ansible-available

$ su

$ sudo yum install epel-release

$ sudo yum install ansible
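
Now that Ansible is installed, the next step will be pointing it at Windows machines. As a preview, here is roughly what that looks like. This is a hedged sketch: the hostnames and credentials are placeholders, and it assumes WinRM is already enabled on the Windows servers and the pywinrm Python module is installed on the CentOS control machine (sudo pip install pywinrm):

# /etc/ansible/hosts – hypothetical inventory for the SharePoint WFEs
[sharepoint]
spwfe01.corp.local
spwfe02.corp.local

[sharepoint:vars]
ansible_user=CORP\ansible_svc
ansible_password=PlaceholderPassword1
ansible_connection=winrm
ansible_port=5986
ansible_winrm_server_cert_validation=ignore

Then test connectivity with the built-in win_ping module:

$ ansible sharepoint -m win_ping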

SharePoint 2013 Newsfeed – We’re still collecting the latest news error

If you are a SharePoint admin or user, you have probably seen this error message on your SharePoint 2013 MySite Newsfeed:

“We’re still collecting the latest news. You may see more if you try again a little later.”

Hopefully this article will help explain a few common scenarios I have run into, and how to resolve any errors. Please post if you have any tips or suggestions, as I am always looking for thoughts from the community on this.

This article will address:

  • Common reasons for newsfeed data not displaying
  • What is distributed cache?
  • Distributed cache configuration
  • Shutdown/Reboot WFE procedure for maintenance so you don’t lose your cache
  • Repopulating cache (if server stopped unexpectedly)
  • References

Common reasons for newsfeed data not displaying

  1. Someone rebooted all your Distributed Cache servers at the same time
    1. Check the task manager uptime to see how long the servers with distributed cache have been running, or if they were rebooted unexpectedly
    2. Fix: Repopulate cache using PowerShell, or maybe wait a long time for new news
    3. Prevent it from happening again: Shut down one server at a time, stopping the cache first, rebooting, and then starting the cache again.
  2. “Everyone” is empty because it only keeps company conversations for 14 days by default.
    1. Fix: Increase the retention time, or encourage people to post to the company newsfeed (not site newsfeeds). See this article for more information on what appears in site newsfeeds vs company newsfeeds. https://support.office.com/en-za/article/What-items-appear-in-your-newsfeed-bd3d9268-0408-4ad4-bc51-2e4ec5406e16#__toc327280723
  3. Distributed cache is not configured right
    1. Fix: Configure it right :) this one is so simple, yet so difficult, I find. See the configuration below.

What is distributed cache?

Distributed cache is a framework Microsoft uses to quickly serve social information in SharePoint from the SP servers' RAM. It can be enabled on one or many SP servers in your farm.

Official definition can be found on this poster, https://www.microsoft.com/en-us/download/details.aspx?id=35557

What uses distributed cache?

Pretty much anything social, but some of the social data comes from content databases and user profile databases. Company newsfeed posts are stored in distributed cache.

  • Newsfeeds
  • Authentication
  • OneNote client access
  • Security Trimming
  • Page load performance

Distributed cache configuration

Note: Run any scripts/commands logged in as the SPFarm account, and be sure to run SharePoint Management Shell as administrator if you have UAC enabled (like a good administrator)

Caution: configuration deletes the cache, so you will need to repopulate the cache after configuring it.

Determine servers to host the distributed cache service

Usually the WFE servers, not servers running Search or Excel Services. AutoSPInstaller has a limit of four servers, but typically it does not configure distributed cache correctly.

Configuring Distributed Cache

There is a good article on these commands at https://technet.microsoft.com/en-us/library/JJ219613.aspx, probably better than the one I am writing, but it's very long, so I wrote this article to get admins 90%-100% of the way there.

Here is how I have been configuring distributed cache (a consolidated script follows after these steps). Thanks Jon for the help!

  1. Use Central Admin or PowerShell to start/stop the SharePoint Distributed Cache service on the desired servers in your farm (usually WFE’s).
    1. Or you can use PowerShell to get/start-spserviceinstance of Distributed Cache on the desired servers. I like to use PowerShell to see what servers are running Distributed Cache within my farm:
      1. Get-SPServiceInstance | Where-Object {$_.TypeName -ilike "*distributed*"}
  2. Verify Cache service is running on desired servers:
    1. Use-CacheCluster
      1. Note, this command doesn't configure the server; it just connects the current PowerShell session to the cache cluster the server is joined to, so you can manage it. It's actually an alias for Connect-AFCacheClusterConfiguration.
    2. Get-CacheHost
    3. You should see each server running distributed cache listed above. If not, there might be more work to configure the cache cluster I may have missed in this post. Let me know!
  3. Get current configuration
    1. Get-AFCacheHostConfiguration -ComputerName wfe01 -CachePort "22233"
    2. The cache size can be updated; see the guide here: https://technet.microsoft.com/en-us/library/JJ219613.aspx#memory . For 16GB of RAM on our WFE servers, we go with 819MB (~5% of 16GB). Note, changing this requires the Distributed Cache service to be stopped on the computer you are changing it on: Update-SPDistributedCacheSize -CacheSizeInMB 819
  4. Export config, verify service account for distributed cache, as well as servers.
    1. Export-CacheClusterConfig -Path C:\test.xml
      1. Check max cache size (default is 5% of the server's RAM, no more than 4GB; the right size depends on the services on the server)
      2. Check servers – ensure only WFE (or desired servers are in the cluster)
      3. Check service account – ensure all servers use the same service account (spservice)
      4. Check ports
  5. Warning: after configuration is complete, do not ever run Add-SPDistributedCacheServiceInstance or Remove-SPDistributedCacheServiceInstance. They reconfigure the cluster (and usually incorrectly).
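
Putting the commands above together, here is a consolidated sketch of the sequence. Treat it as an outline rather than a turnkey script; the server name wfe01 is a placeholder, and the cache size assumes the 16GB WFE example above.

# 1. See which servers run the Distributed Cache service instance
Get-SPServiceInstance | Where-Object {$_.TypeName -ilike "*distributed*"} | Select-Object TypeName, Server, Status

# 2. Connect this session to the cache cluster and list the cache hosts
Use-CacheCluster
Get-CacheHost

# 3. Check the current cache host configuration (wfe01 is a placeholder)
Get-AFCacheHostConfiguration -ComputerName wfe01 -CachePort "22233"

# Resize the cache if needed (~5% of RAM; stop the Distributed Cache
# service instance on that server before changing this)
Update-SPDistributedCacheSize -CacheSizeInMB 819

# 4. Export the cluster config to verify servers, service account, and ports
Export-CacheClusterConfig -Path C:\test.xml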

Shutdown/Reboot WFE procedure

If you have to do reboots on the WFE for Windows Updates, etc., you might be expecting to lose your Newsfeed cache. Here is the proper procedure to retain the cache.

Summary: stop the cache on one server at a time, reboot that server, and add the server back into the cache cluster; repeat on the next server. A scripted sketch of this sequence follows after the steps.

  1. Verify the Cache service is running on the desired servers (having more than one cache server is key):
    1. Use-CacheCluster
    2. Get-CacheHost
  2. Reboot SQL Server first if needed. Get this out of the way.
    1. Wait for SQL Server to come back online
  3. Reboot WFE1
    1. Perform these commands on WFE1
    2. Verify the Cache service is running on the desired servers (having more than one cache server is key):
      1. Use-CacheCluster
      2. Get-CacheHost
    3. Run Stop-SPDistributedCacheServiceInstance -Graceful
    4. Verify Cache service is stopped on WFE1. Ensure it is stopped before proceeding:
      1. Get-CacheHost
    5. Reboot WFE1
    6. Verify Cache service is running on WFE1:
      1. Use-CacheCluster
      2. Get-CacheHost
      3. If not, go to Central Admin Services on Server and start Distributed Cache service on WFE1, or use PowerShell.
  4. Reboot WFE2
    1. (Repeat above Step #3, but replace WFE1 with WFE2)
  5. Verify it is running
    1. Verify the Cache service is running on the desired servers (having more than one cache server is key):
      1. Use-CacheCluster
      2. Get-CacheHost
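
Here is a rough scripted version of the per-server portion above, run from an elevated SharePoint Management Shell on the WFE being rebooted. It is a sketch of the manual steps, not a tested end-to-end script:

# Confirm the cluster is healthy and more than one cache host is UP
Use-CacheCluster
Get-CacheHost

# Gracefully hand this server's cache data off to the other cache hosts
Stop-SPDistributedCacheServiceInstance -Graceful

# Verify this host is no longer serving the cache before rebooting
Get-CacheHost

Restart-Computer

# After the reboot, start the Distributed Cache service instance again
Get-SPServiceInstance -Server $env:COMPUTERNAME | Where-Object {$_.TypeName -ilike "*distributed*"} | Start-SPServiceInstance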

Repopulating cache (if server stopped unexpectedly)

Replace the URL with your My Site URL. This script populates each user's feed cache using Update-SPRepopulateMicroblogFeedCache, and the entire user profile newsfeed cache using Update-SPRepopulateMicroblogLMTCache. I am not sure if I stole this script from somewhere, but part of it is from various user profile scripts adapted to fix the users' feed cache.

Note: Run any scripts/commands logged in as the SPFarm account, and be sure to run SharePoint Management Shell as administrator if you have UAC enabled (like a good administrator). Otherwise you will get a .ctor error that will drive you crazy.

Download the script from here: http://pastebin.com/K9yR2pEk

# Get the User Profile Service Application proxy
$proxy = Get-SPServiceApplicationProxy | ? {$_.Name -ilike "User Profile Service Application*"}

# Repopulate the newsfeed last-modified-time (LMT) cache
Update-SPRepopulateMicroblogLMTCache -ProfileServiceApplicationProxy $proxy

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server.UserProfiles")

# Your My Site host URL
$url = "http://mysiteurl.domain.com"

$contextWeb = New-Object Microsoft.SharePoint.SPSite($url)
$ServerContext = [Microsoft.Office.Server.ServerContext]::GetContext($contextWeb)
$UserProfileManager = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($ServerContext)
$Profiles = $UserProfileManager.GetEnumerator()

# Repopulate the feed cache for every user that has a personal site
foreach ($oUser in $Profiles) {
    if ($oUser.item("SPS-PersonalSiteCapabilities").Value -eq 14) {
        $personalurl = $url + $oUser.item("personalspace").Value
        Write-Host $oUser.item("AccountName").Value
        Update-SPRepopulateMicroblogFeedCache -ProfileServiceApplicationProxy $proxy -AccountName $oUser.item("AccountName").Value
        #-siteurl $personalurl
    }
}

$contextWeb.Dispose()

After running the script on each WFE where distributed cache runs, wait 15 minutes for the Newsfeed data to populate.

Then test newsfeed.

References

http://consulting.risualblogs.com/blog/2014/04/01/export-impor-distributed-cache-configuration-in-sharepoint-2013/

http://sharepoint.stackexchange.com/questions/125798/userprofileapplicationnotavailableexception-logging-userprofileapplicationpro

http://netwovenblogs.com/2014/03/11/the-newsfeed-is-not-working-on-mysite-in-sharepoint-2013/

SharePoint PowerShell to audit list columns

Recently, I had to find where certain SharePoint site columns are used in lists across many site collections. I wrote some quick PowerShell to check every list's columns and write out the site collection, list, and column.

Columns: the site/list column display names I wanted to check for usage

Sites: all site collections, except my sites

I am sure it can be cleaned up, but it's a quick way to audit the list columns and get a report. I tried exporting to CSV but decided it was more effort than formatting the few results that came back in the window (see the CSV sketch after the script).

Results: (can be cleaned up easily, but I don’t have time for this quick script)

AuditColumns.ps1

SPSite Url=https://client/sites/docs, Accounting Documents, Document Type

SPSite Url=https://client/sites/docs, Meeting Minutes, Document Type

SPSite Url=https://client/sites/docs, Documents, Document Type

SPSite Url=https://client/sites/docs, Announcements, Document Type

Script:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

# Site collections to audit (all except My Sites)
$sites = @("https://client",
"https://client/sites/docs")

# Column display names to look for
$cols = @("Document Type",
"Another Column Name",
"My Column",
"Find Me")

foreach ($siteurl in $sites) {
    $site = Get-SPSite $siteurl
    $web = $site.OpenWeb()
    foreach ($list in $web.Lists) {
        $fields = $list.Fields
        foreach ($col in $cols) {
            # Report the site, list, and column when the list uses one of the columns
            if ($fields.Title -contains $col) {
                Write-Output "$site, $list, $col"
            }
        }
    }
    $web.Dispose()
    $site.Dispose()
}
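
If you do want CSV output, a small variation that emits objects instead of strings should work. This is an untested sketch (assumes PowerShell 3.0 or later), and the output path is a placeholder:

# Same loop, but collect objects and export them to CSV
$results = foreach ($siteurl in $sites) {
    $site = Get-SPSite $siteurl
    $web = $site.OpenWeb()
    foreach ($list in $web.Lists) {
        foreach ($col in $cols) {
            if ($list.Fields.Title -contains $col) {
                [PSCustomObject]@{ Site = $site.Url; List = $list.Title; Column = $col }
            }
        }
    }
    $web.Dispose()
    $site.Dispose()
}
$results | Export-Csv -Path C:\Temp\AuditColumns.csv -NoTypeInformation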

Visio file error in SharePoint – Sorry, we can’t perform this action

I recently had an issue with a new Windows 8.1 laptop not being able to open Visio files from our SharePoint 2013 intranet. The error is similar to the 32-bit/64-bit mixed environment errors. My co-worker Rod and I found a workaround that might not be best practice, but it resolved the issue. Please post a comment if there is an industry-standard way to resolve the error aside from reformatting.

SharePoint 2013 Intranet error with Office Pro Plus on Visio 2013 64bit: Microsoft Visio- Sorry, we can’t perform this action. Incompatible Office products are installed on your machine. If you have an administrator, please contact them for help. OK

Terminology:

  • OneDrive for Business- SharePoint file sync tool
    • Blue cloud icon
  • OneDrive (Personal)- Not discussed anywhere in this post. Ignore personal OneDrive here, it's 100% separate from SharePoint.
    • White cloud icon

My setup:

  • OS: Windows 8.1 Enterprise 64 bit
  • Computer: Dell Latitude E5550 touch screen laptop, Intel i7 2.6GHz, 8GB Ram
  • Office: Office 365 ProPlus 15.0.4727.1003
  • Visio Version: Visio Professional 2013 15.0.4569.1506
  • OneDrive for Business Updates:
  • Internet Explorer Version: Internet Explorer 11 Version: 11.0.9600.17842
    • SharePoint intranet site is added as Local Intranet, default security settings.

Short term solution: I noticed that if I ended the OneDrive for Business process in Task Manager, then ended the Microsoft Office Upload Center process (in that order only), I could open Visio files from the intranet.

Before I did this short term fix, other users at my company with the same application versions would try to launch OneDrive for Business and get the same “Sorry, we can’t perform this action” message, but they could open Visio files from the intranet. I had the opposite: I could launch OneDrive for Business, but not Visio from an intranet file link. After the fix, I could open Visio files from the SharePoint 2013 intranet, and opening OneDrive for Business gave me the same “Sorry, we can’t perform this action” error as the other users. Good, now I am following the company standard. BUT, as soon as I reboot, the same issue comes back and I can’t open Visio files from the SharePoint intranet.

Long term solution: (NOTE, this might not be best practice, but it solves my issue. If you have a Microsoft-supported alternative, please post it here in the comments.) Instead of ending the processes above, go to the Startup tab of Task Manager and disable OneDrive for Business. This might also disable the child Office Document Cache under it, but it resolves my issue. Summary: OneDrive for Business conflicts with the Microsoft Office Document Cache, and it might be a 64-bit/32-bit issue, I am not sure, but disabling OneDrive for Business at startup resolves the issue for me. I will post if there are any noticeable side effects, such as offline file caching, stale cache, or other errors. I think this happened to me only because I configured OneDrive for Business when I first got my laptop, though I am not sure why the laptop had the issue originally. Again, if there's a better way, please comment below. Thanks!

SharePoint 2013, IIS7, NLB, SSL certificates and GoDaddy Renewal Steps

Overview:

SSL certificates on SharePoint 2013 web applications expire, and when that happens, you have to generate a new SSL certificate. In this post, I will go over how to renew the SSL/HTTPS certificate on your SharePoint 2013 website with GoDaddy, including multi-server Web Front End (WFE) topologies. If you use wildcard certificates on your SharePoint websites, there are a few gotchas when renewing. The process is similar for most certificate types, but wildcards and SharePoint are this blog post's focus. These steps are also similar if you are adding an SSL certificate to your website for the first time (once your SharePoint farm, web applications, and site collections have been configured to use HTTPS, etc.).

Here is an overview of the steps involved in the certificate renewal process:

  1. Create a new certificate request on the machine running IIS/SharePoint (pick a WFE)
  2. Go to GoDaddy and rekey your certificate, entering your certificate request text from step 1
  3. Complete the certificate request in IIS on WFE
  4. Update WFE bindings to use SSL cert
  5. Export certificate from WFE to WFE2 (PFX with personal information, create a password)
  6. Import the PFX on WFE2 IIS
  7. Update WFE2 bindings to use SSL cert

Common issues:

First, this is my experience. Comment below any corrections or other helpful information.

  • When adding the cert to IIS and refreshing, it disappears!
    • Your certificate request is expired. Generate a new one and try again.
    • You are following GoDaddy's guide, which does not work. Follow my post below.
    • The cert might already exist and need to be deleted in the Certificate Manager on the server.
  • CER, CRT, PFX – what is the difference? Why do I have to select *.* if I need a specific type? Who designed this stuff…
    • CER is a certificate file; here it is what you feed to “Complete Certificate Request” in IIS (the request itself is the CSR text you generated)
    • CRT is the certificate GoDaddy issues, without private key information
    • PFX is a certificate package with the private key (exported from the completed certificate on the first server, then imported on the second server)
  • How do I complete a request on WFE2 if it was already completed from WFE1?
    • Export the working cert from Server 1 as a PFX file with a password, then import it on Server 2 in IIS. Do not use Certificate Manager on Server 2. (A PowerShell sketch of this export/import follows below.)
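
For the export/import step, the certificate cmdlets available on Windows Server 2012 and later can do the same thing as the IIS dialogs. This is a hedged sketch I have not verified against the exact workflow below; the subject filter, file paths, and password are placeholders, and the filter should match exactly one certificate:

# On WFE1: export the completed wildcard certificate with its private key
$pfxPassword = ConvertTo-SecureString -String "PlaceholderPassword1" -Force -AsPlainText
Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.Subject -like "*mydomain.com*"} | Export-PfxCertificate -FilePath C:\Temp\wildcard.pfx -Password $pfxPassword

# On WFE2: import the PFX into the machine's Personal store, then update the IIS bindings as described below
Import-PfxCertificate -FilePath C:\Temp\wildcard.pfx -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword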

Steps to renew your Existing wildcard SSL Certificate:

  1. Verify your certificate is expired by navigating to your SharePoint site. If you get an HTTPS trust warning, it’s expired or has issues that this blog post will address.
  2. Go to WFE1 IIS 7 on your SharePoint box
    1. Go to Server Certificates in IIS

    2. Remove any old certificates that contain the URL for your SharePoint site that we are renewing

    3. On the top right in IIS, go to “Create Certificate Request”

    4. Enter your information. The common name is the wildcard URL; for the rest, do not use abbreviations. See this post for more info: https://support.godaddy.com/help/article/4800/generating-iis-7-csrs-certificate-signing-requests

    5. Select “4096” for the bit length

    6. Select a location/filename for the text file that is about to be generated

    7. We will be copying the contents of this file to GoDaddy to rekey our wildcard SSL certificate in the next step.
  3. Now that we have our server “key” information waiting in the text file, we can go to GoDaddy and pair it with our SSL certificate.
    1. Go to Go Daddy Certificate Manager (Manage SSL Certificates > Manage Certificates)

    2. Select “Re-Key” on the top navigation
    3. Paste your text file contents from the IIS text file to this GoDaddy window:

    4. Select “Re-Key”
    5. Click “Manage Certificates” From the top navigation, then select “Certificates” folder on the left navigation.
    6. Select the bottom SSL certificate (the most recent version)
    7. Select “Download” icon from the navigation.

    8. Select IIS7, then “Download”

    9. Save this zip to your WFE server where you created the IIS certificate request.
    10. Extract to C:\Temp and proceed carefully to the next steps in this post.
  4. On WFE1 in IIS where you created the certificate request, open IIS 7 and follow these steps to use the certificate you downloaded from GoDaddy.
    1. Remove any old expired wildcard certificates from the WFE1 server's Certificate Manager; check the Personal > Certificates and Intermediate > Certificates locations

    2. COMMON GOTCHA: Do not install the cert by double-clicking it; complete the request using IIS.
    3. Go back to “Server Certificates” in IIS 7 and select “Complete Certificate Request” on the right navigation

    4. Enter the information for the Certificate request as follows:

    5. COMMON GOTCHA: Select *.* when browsing for the CRT file from the GoDaddy zip

    6. Friendly name must be the wildcard URL of the domain.
    7. Click OK.
    8. Refresh Server Certificates to verify the certificate “stays”. If it disappears, you either have:
      1. A certificate in your Personal Certificate store with the same friendly name
      2. An expired or old Certificate Request you generated and downloaded, or you downloaded an older certificate from GoDaddy. Repeat these steps and it will work (it should).
  5. Set the IIS binding of the new certificate to your SharePoint 443 SSL HTTPS website in IIS:
    1. Go to IIS 7 > Sites > select the SharePoint site that uses the wildcard cert.
    2. Select “Bindings” on the right with the website selected.

    3. Select “Edit” and select the new SSL certificate

    4. Select OK. On WFE2, you will get an error here trying to use an exported PFX file; follow the next steps to fix WFE2.
    5. Verify the site loads on WFE1 if you can control your DNS/NLB routing.
  6. If you have additional WFE servers, you need to export this newly verified SSL certificate to IIS on those servers. Here is how.
    1. From WFE1, Go to “Server Certificates”, right click the wildcard cert and select “Export”

    2. Pick a location for the new PFX file, then enter a secure password.

    3. Click OK
    4. Copy the PFX file to WFE2 through Explorer or any other method.
    5. On WFE2, go to IIS 7 > “Server Certificates” and select “Import”

    6. Browse to the PFX file copied over from WFE1, enter your password and select OK.
    7. Refresh “Server Certificates” to verify it is still available.
    8. Repeat the import process in IIS on other WFE servers.
  7. Now that the certificate is available on the other WFE’s in IIS, we need to update the bindings. Same process as the first WFE.
    1. (Copied and pasted from WFE1 steps, but perform these on the WFE2 and additional servers once the certificate is imported)
    2. Go to IIS 7 > Sites > select the SharePoint site that uses the wildcard cert.
    3. Select “Bindings” on the right with the website selected.

    4. Select “Edit” and select the new SSL certificate

    5. Select OK.
    6. Verify the site loads on WFE2 if you can control your DNS/NLB routing.

That's it! I believe most of what's above follows best practices. For added security, I would also remove the temporary certificate files (PFX, CSR, etc.) left around during the process.

Running MS Office Demo VMs in Azure

UPDATE: If you are a Microsoft employee, visit https://demomonkey.cloudapp.net/; there is a complete Azure VM deployment script for this in your Azure subscription. I had limited success getting this to work on my own.

If you have used the Microsoft Office Demos website Office 365 environment, you know it's quite handy for client demos. It used to be similar to the SharePoint Information Worker demo or SDPS demo. This new environment runs on Windows Server 2012 and features SharePoint 2013, Exchange, Lync, and Office Web Apps. These demos can be spun up on Office 365, or downloaded as Hyper-V virtual machines for on-premises demos. They are HUGE VMs and resource hogs. I think you need 50+ GB of RAM to host all 9 VMs, and probably 1TB of hard drive space to even consider starting them. Remember, having everything run on the same disk will create major throughput issues with your storage and run unbearably slow (I tried on four 1TB RAID 10 7200 RPM SATA drives and could only get a few going before hitting huge performance walls).

So, let’s host it in Azure!

There are plenty of performance considerations in Azure, such as using an E drive for your data, turning disk caching on or off, using separate storage accounts, etc. I will NOT be covering those. This is a POC for a client demo, so my goal is just to get SharePoint 2013 with the Contoso users and content working so we can demo the Web Content Management (WCM) features of SharePoint, along with the Content by Search web part and taxonomy-driven navigation for product sites.

I will only require SharePoint and the Domain Controller for this effort. My farm does not require Search live preview, outgoing/incoming email, Lync presence, etc. So hosting this in Azure with just those two VMs should be manageable.

Here are the major steps:

  • Downloading the template
  • Convert VHDX to VHD before uploading to Azure
  • MakeCert
  • Connect Azure to PowerShell
  • Start conversion while waiting (VHDX to VHD)
  • Upload VHDs:
  • Add VHD to VM Image
  • Create VM
  • Repeat and create the SharePoint VM
  • Add VHD to VM Image
  • Create VM
  • Resources and Links

Downloading the template VMs

First, download the VHDX zip files from www.microsoftofficedemos.com

I chose to download mine with content.

  1. 2013-DC v4 (Complete)
  2. 2013-EXCH v4 (Complete)
  3. 2013-LYNC-SE1 v4 (Complete)
  4. 2013-PCHAT v4 (Complete)
  5. 2013-SP v4 (Complete)
  6. 2013-SP-AFCache v4 (Complete)
  7. 2013-VPN v4 (Complete)
  8. 2013-WAC v4 (Complete)
  9. Office Demos 2013 VHD EULA

I downloaded #1, #5 and #7. These 3 files required ~31GB of disk space and extract to ~175GB

Extract each VM to your computer.

Convert VHDX to VHD before uploading to Azure

If you have Windows 8.1 or Windows Server 2012 R2 (i.e., Windows PowerShell 4.0), run the PowerShell Convert-VHD command: http://technet.microsoft.com/en-us/library/hh848454.aspx

Convert-VHD -path "D:\temp\MS-Office-Demo-VMs\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013DC.VHDX" -destinationpath "D:\temp\MS-Office-Demo-VMs\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013DC.VHD"

MakeCert

Open a Visual Studio command prompt and change to the Windows SDK folder:

cd "C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Bin\x64"

makecert -sky exchange -r -n "CN=MSDNAzure6scport06" -pe -a sha1 -len 2048 -ss My "MSDNAzure6scport06.cer"

Connect Azure to PowerShell

In MMC, add the Certificates snap-in and navigate to Personal certificates. Note: if you are not a local administrator and you run the console as administrator, the certificates will not appear for you. I just exported another Azure cert from the management tools for testing and it worked for me.

Export it again as a CER (DER encoded X.509) to the Desktop

Upload CER file to Azure:

Get-AzurePublishSettingsFile

Save Certificate from Azure when prompted to download: C:\temp\

Import-AzurePublishSettingsFile "C:\temp\Windows Azure MSDN – Visual Studio Premium-12-3-2013-credentials.publishsettings"

Test a random command to verify PowerShell is connected (use your own storage account name):

Get-AzureStorageAccount portalvhdsfz7h5hgfmhh4k

Start conversion while waiting (VHDX to VHD)

convert-vhd -path "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013-DC.VHDX" -DestinationPath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013-DC.VHD"

convert-vhd -path "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-SP v4 (Complete)\2013-SP Complete v4\Virtual Hard Disks\2013-SP.VHDX" -DestinationPath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-SP v4 (Complete)\2013-SP Complete v4\Virtual Hard Disks\2013-SP.VHD"

convert-vhd -path "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-VPN v4 (Complete)\2013-VPN Complete v4\Virtual Hard Disks\2013-VPN.VHDX" -DestinationPath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-VPN v4 (Complete)\2013-VPN Complete v4\Virtual Hard Disks\2013-VPN.VHD"

Upload VHDs:

Add-AzureVhd -Destination http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-dc.vhd -LocalFilePath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013-DC.VHD" -NumberOfUploaderThreads 32 -OverWrite

Add-AzureVhd -Destination http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd -LocalFilePath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-SP v4 (Complete)\2013-SP Complete v4\Virtual Hard Disks\2013-SP.VHD" -NumberOfUploaderThreads 32 -OverWrite

Add-AzureVhd -Destination http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-VPN.vhd -LocalFilePath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-VPN v4 (Complete)\2013-VPN Complete v4\Virtual Hard Disks\2013-VPN.VHD" -NumberOfUploaderThreads 32 -OverWrite

Add VHD to VM Image

Add-AzureVMImage -ImageName SP2013-DC -MediaLocation http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-dc.vhd -OS Windows -Label SP2013-DC -ImageFamily "Microsoft Office Demo SharePoint 2013 w content" -Eula http://go.microsoft.com/fwlink/?LinkID=324375 -PrivacyUri http://go.microsoft.com/fwlink/?LinkID=282418 -RecommendedVMSize Medium -Verbose

VERBOSE: 2:20:55 PM – Begin Operation: Add-AzureVMImage
VERBOSE: 2:21:03 PM – Completed Operation: Add-AzureVMImage

AffinityGroup :
Category : User
Location : West US
LogicalSizeInGB : 41
Label : SP2013-DC
MediaLink : http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-dc.vhd
ImageName : SP2013-DC
OS : Windows
Eula : http://go.microsoft.com/fwlink/?LinkID=324375
Description :
ImageFamily : Microsoft Office Demo SharePoint 2013 w content
PublishedDate :
IsPremium : False
IconUri :
PrivacyUri : http://go.microsoft.com/fwlink/?LinkID=282418
RecommendedVMSize : Medium
PublisherName : User
OperationDescription : Add-AzureVMImage
OperationId : 4e871f30-af08-3f06-95e3-ef9b72913288
OperationStatus : Succeeded

Create VM

Go to Azure, Create new Virtual Machine. Choose My Images. Choose SP2013-DC

Choose a unique local user account (it probably won't be used) and an uncommon password.

Select a region or affinity group (I have an affinity Group)

Open ports for RDP and PowerShell for now. Later we might have to add more for the DC, etc.

Wait for the VM to provision.

Connect using contoso\administrator. Password is pass@word1

Repeat and create the SharePoint VM

Add VHD to VM Image

Add-AzureVMImage -ImageName SP2013-SP -MediaLocation http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd -OS Windows -Label SP2013-SP -ImageFamily "Microsoft Office Demo SharePoint 2013 w content" -Eula http://go.microsoft.com/fwlink/?LinkID=324375 -PrivacyUri http://go.microsoft.com/fwlink/?LinkID=282418 -RecommendedVMSize Medium -Verbose

If you get an error, it may be because the dynamic disk on the SP box is an issue: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/e5feddff-7fee-49b4-86e2-751a1903e852/the-blob-is-not-a-valid-vhd

My error turned out to be that the VHD was not uploaded correctly. I retried the upload and it worked.

Add-AzureVMImage : "An exception occurred when calling the ServiceManagement API. HTTP Status Code: 400. Service
Management Error Code: BadRequest. Message: The blob is not a valid VHD.. Operation Tracking ID:
9db3a37a1da33d75ba569ee062f63936."
At line:1 char:1
+ Add-AzureVMImage -ImageName SP2013-SP -MediaLocation http://portalvhdsfz7h5hgfmh …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Add-AzureVMImage], ServiceManagementClientException
+ FullyQualifiedErrorId : Microsoft.WindowsAzure.Management.ServiceManagement.IaaS.DiskRepository.AddAzureVMImage

Back to square one. It takes 6 hours to upload this VHD at 30 Mbps.

Reupload:

Add-AzureVhd -Destination http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd -LocalFilePath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-SP v4 (Complete)\2013-SP Complete v4\Virtual Hard Disks\2013-SP.VHD" -NumberOfUploaderThreads 32 -OverWrite

Convert VHD to VM template:

Add-AzureVMImage -ImageName SP2013-SP -MediaLocation http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd -OS Windows -Label SP2013-SP -ImageFamily "Microsoft Office Demo SharePoint 2013 w content" -Eula http://go.microsoft.com/fwlink/?LinkID=324375 -PrivacyUri http://go.microsoft.com/fwlink/?LinkID=282418 -RecommendedVMSize Medium -Verbose

VERBOSE: 7:26:47 AM – Begin Operation: Add-AzureVMImage
VERBOSE: 7:26:53 AM – Completed Operation: Add-AzureVMImage

AffinityGroup :
Category : User
Location : West US
LogicalSizeInGB : 127
Label : SP2013-SP
MediaLink : http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd
ImageName : SP2013-SP
OS : Windows
Eula : http://go.microsoft.com/fwlink/?LinkID=324375
Description :
ImageFamily : Microsoft Office Demo SharePoint 2013 w content
PublishedDate :
IsPremium : False
IconUri :
PrivacyUri : http://go.microsoft.com/fwlink/?LinkID=282418
RecommendedVMSize : Medium
PublisherName : User
OperationDescription : Add-AzureVMImage
OperationId : e493b45b-d214-3fe1-a5b8-6c21a349f670
OperationStatus : Succeeded

Create VM

Go to Azure, Create new Virtual Machine. Choose My Images. Choose SP2013-SP

Choose a unique local user account (it probably won't be used) and an uncommon password.

Select the Cloud Service created with the previous DC VM:

Open ports for RDP and PowerShell for now. Later we might have to add more for SP, etc.

Wait for the VM to provision.

Connect using contoso\administrator. Password is pass@word1

Fixing the SharePoint VM

Looks like Central Admin (CA) loads, but not the DNS host names for http://intranet.contoso.com. This is not a big deal but let’s trace the issue.

Let’s trace the DNS. SP2013-DC is the DNS/DHCP/AD server.

Log in to the SP2013-DC VM.

Try to ping SP2013-SP; we can't. That means Azure is taking over our DNS.

You can configure Azure to use your VM as a DC, but it looks complicated: http://msdn.microsoft.com/en-us/library/windowsazure/jj156088.aspx#bkmk_BYODNS

I will just update my hosts file for now under C:\Windows\System32\Drivers\etc:

127.0.0.1        intranet.contoso.com

127.0.0.1        2013-sp

127.0.0.1        www.contoso.com

127.0.0.1        www.contoso.de

10.76.86.2        2013-dc

Be sure to update your DC IP under 2013-DC. Just do an IPConfig from the DC box.
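
If you would rather script the hosts file edit, something like this from an elevated PowerShell prompt on the SP VM should work (the IPs and names are the ones from my environment above; adjust them to yours):

# Append the demo host names to the local hosts file
$hostsFile = "$env:windir\System32\drivers\etc\hosts"
Add-Content -Path $hostsFile -Value @"
127.0.0.1        intranet.contoso.com
127.0.0.1        2013-sp
127.0.0.1        www.contoso.com
127.0.0.1        www.contoso.de
10.76.86.2        2013-dc
"@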

Update: I just realized the VM images are called 2013-x, not SP2013-x. From what I can gather at this moment, Azure does not need these names to match.

Open Internet Explorer from the 2013-SP box and go to www.contoso.com and intranet.contoso.com

It's pretty slow, but I get a SharePoint start page. www.contoso.de challenges me for credentials; I think it's because I am rebooting the DC.

I made the DC an Extra Small instance, since it was only at 3% CPU on a Medium footprint.

The SP box has been at 97% CPU for the past hour, so I want to make it a Large footprint, but it's still provisioning.

I also want to open some incoming ports for mydomain.cloudapp.net so users can come in using their web browser.

Resources:

Storage issues

Using Azure disks not sysprepped (DC, SP farm, etc) http://blog.aditi.com/cloud/guide-to-azure-iaas-vhds-disks-images/

Uploading a VHD to Azure (Latest instructions from Official Azure site) http://www.windowsazure.com/en-us/manage/windows/common-tasks/upload-a-vhd/

Azure VHD dynamic disk error: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/e5feddff-7fee-49b4-86e2-751a1903e852/the-blob-is-not-a-valid-vhd

VHDX

Lessons learned uploading a VHDX to Azure: (see step #27) http://blogs.catapultsystems.com/cmoore/archive/2013/04/30/one-does-not-simply-upload-a-vm-to-azure.aspx

Converting VHDX to VHD http://blogs.technet.com/b/cbernier/archive/2013/08/29/converting-hyper-v-vhdx-to-vhd-file-formats-for-use-in-windows-azure.aspx

Convert-VHD command http://technet.microsoft.com/en-us/library/hh848454.aspx

Azure PowerShell trust certs

Add Certificates to MMC: http://social.technet.microsoft.com/wiki/contents/articles/2167.how-to-use-the-certificates-console.aspx

Convert PFX to CER: http://stackoverflow.com/questions/403174/convert-pfx-to-cer

Visual Studio: makecert- http://msdn.microsoft.com/en-us/library/bfsktky3(v=vs.110).aspx

Add-AzureVHD http://msdn.microsoft.com/en-us/library/dn205185.aspx

Networking

Azure DNS- http://msdn.microsoft.com/en-us/library/windowsazure/jj156088.aspx#bkmk_BYODNS

Feature with ID ‘87294c72-f260-42f3-a41b-981a2ffce37a’ is not installed in this farm, and cannot be added to this scope. Error creating SharePoint site collection, Powershell

I hit an error today after creating my farm via PowerShell: I forgot to run the SharePoint Products Configuration Wizard (PSConfig) after creating the farm. The result was an error when creating site collections and subsites via the UI. I could only create the sites via PowerShell, and I got quite a few errors in the UI; I had to navigate to http://intranet/_layouts/settings.aspx to get the site to load without an error.

 

Sorry, something went wrong.

Feature with ID ‘87294c72-f260-42f3-a41b-981a2ffce37a’ is not installed in this farm, and cannot be added to this scope.

Technical Details

 

ULS shows this error creating a subsite under the root site collection via the UI: “Failed to apply template “STS#0” to web at URL “http://intranet.com/Test”, error Feature with ID ‘guid’ is not installed in this farm, and cannot be added to this scope.”

Solution: Run PSConfig, then try creating your site in the UI.

If you created the sites via PowerShell and want to delete them, I had to do Get-SPSite | Remove-SPSite (THIS REMOVES ALL SITE COLLECTIONS). Then I ran PSConfig and recreated the site collections via PowerShell successfully.
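
If you only need to clear out the site collections of a single web application rather than the whole farm, scoping the same pipeline is a bit safer. A hedged sketch, still destructive, with a placeholder URL:

# Remove only the site collections under one web application
Get-SPSite -WebApplication http://intranet -Limit All | Remove-SPSite -Confirm:$false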

 

You can run the following command:

Install-SPFeature -AllExistingFeatures

via PowerShell, but there are other commands that PSConfig performs that must be run as well:

Install-SPHelpCollection -All

Initialize-SPResourceSecurity

Install-SPService

Install-SPFeature -AllExistingFeatures

New-SPCentralAdministration -Port 1234 -WindowsAuthProvider "NTLM"

Install-SPApplicationContent
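
Alternatively, the configuration wizard itself can be run from the command line. The commonly used build-to-build form is shown below; I normally just run the GUI wizard, so treat this as a pointer rather than a verified one-liner:

PSConfig.exe -cmd upgrade -inplace b2b -wait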

SharePoint 2013 On-Premise App Store Configuration

The SharePoint 2013 March Public Update requires additional configuration steps to complete a SharePoint App Store deployment.
The additional steps are indicated in the article “Enable apps in AAM or host-header environments for SharePoint 2013”: http://technet.microsoft.com/en-us/library/dn144963.aspx

New-SPWebApplicationAppDomain
$contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$contentService.SupportMultipleAppDomains = $true
$contentService.Update()
Iisreset
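
For context, New-SPWebApplicationAppDomain is the cmdlet that maps an app domain to a specific web application and zone. A hedged example with placeholder values (see the TechNet article above for the authoritative syntax):

# Map the app domain to the Default zone of the intranet web application (placeholder values)
New-SPWebApplicationAppDomain -AppDomain contosoapps.com -WebApplication http://intranet.contoso.com -Zone Default -Port 80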

Managed Metadata field error on SharePoint 2013 “The given guid does not exist in the term store”

We recently migrated our SharePoint 2010 content database to SharePoint 2013.

In SharePoint 2013, users tried to use a MMD field/column on a new item form and received this error: “The given guid does not exist in the term store”

After researching the issue, it appears the error is due to a disconnect between the list column and the MMD term values. I remembered that the terms for the Managed Metadata Service Application live in their own database, separate from the content database.

I opened the Managed Metadata Service Application on SharePoint 2013 and confirmed there were no terms present:

Optional: To confirm the term store structure back on SharePoint 2010, I opened the Managed Metadata Service Application on the SharePoint 2010 farm; the terms were indeed present:

Since we only migrated the SharePoint 2010 content database (not the Managed Metadata Service Application database), the MMD field fails to retrieve the values on the new SharePoint 2013 farm.

Solution: You will have to migrate the Managed Metadata Service Application from SharePoint 2010 to SharePoint 2013.

For more information on how to upgrade the Service Applications, see http://technet.microsoft.com/en-us/library/jj839719.aspx

“When you upgrade from SharePoint 2010 Products to SharePoint 2013, you must use a database attach upgrade, which means that you upgrade only the content for your environment and not the configuration settings. After you have configured the SharePoint 2013 environment, and copied the content and service application databases, you can upgrade the service applications to SharePoint 2013. This article contains the steps that you take to upgrade the service applications.”