SharePoint DevOps Part 1 – Setup CentOS with Ansible

This is the first post in a series for creating a Linux host with Ansible to control Windows machines and install SharePoint.

Let me preface this by saying I have no clue how far I will get. I expect this to take a few weeks, if it's possible at all. Ansible seems to be growing, and Windows support might not be fully baked yet. We will see! Feel free to comment.

  1. Part 1- Setup CentOS with Ansible (This post)
  2. Part 2- Setup Ansible with Windows Machines (Coming soon)
  3. Part 3- Using Ansible to prepare Windows Machines for SQL Server (Coming soon)
  4. Part 4- Using Ansible to install SharePoint using AutoSPInstaller (Coming soon)
  5. Part 5- Using Ansible to Maintain SharePoint Machines (Coming soon)

Introduction

There are various technologies I have attempted to use in the past to automate Windows deployment and SharePoint installation with AutoSPInstaller, such as:

  1. Chocolatey – I could not find enterprise-worthy packages for SQL Server and SharePoint. I also couldn't figure out a way to provision Azure VMs with it.
  2. PowerShell DSC – Way too much overkill with push/pull when I just wanted to configure a new SharePoint farm.
  3. Chef – I have not tried it. I personally can't get over the kitchen-themed naming, but I believe it is an industry standard. Check it out.
  4. Azure WebHooks – Runs PowerShell, possibly limited, on Azure VMs. Note that each VM needs an endpoint for each runbook. I manage 60 dev VMs/farms and wanted a different runbook for different parts of the SharePoint configuration process, so this wasn't going to happen.
  5. Azure Runbooks – Lots of great tutorials for building a SharePoint farm, but the farm was not fully configured. I prefer AutoSPInstaller and couldn't figure out a way to merge them.

Now, my experience above is very limited; when I would hit a wall, I would pretty much check how far along the technology was, and if something wasn't supported I would give up. Also, the above experiences are a hybrid of on-premises SharePoint installation needs and Azure IaaS-hosted SharePoint needs. So if this does not work, I will be going back to the drawing board or to the list above. I have not used Microsoft System Center Orchestrator, as I rarely provision new Hyper-V VMs, and I have heard it is a ton of work to configure.

My Goal with Automation

I have very well-defined, documented steps to install SharePoint 2013 farms with one or more servers, for dev/QA or production. These can be in Azure or on-premises in Hyper-V. I want something lightweight with which I can provision a new VM, join it to the domain, create a domain if needed, run Windows Update, configure OS rules, install SQL Server (different versions), install SharePoint via AutoSPInstaller, patch SharePoint, and configure services. I would also like to set services and verify those rogue developers did not change any system settings (just kidding, team). I know automation does not replace planning, but I hope to turn documentation and PowerShell scripts into a custom deployment tool I can use for provisioning future environments as well as maintaining existing ones. The Phoenix Project makes a good point: "If the family dog gets injured, you nurse it back to health, but if one of your cattle gets injured, you will be having beef for dinner." In other words, why spend 100 hours fixing an environment when you can just recreate or replace it? Of course developers and users change system settings, but the core of the machine can be recreated via a script in less than a day, rather than troubleshooting something and possibly never solving the issue.

Getting Started with Ansible

I picked up a book on Amazon called Ansible for DevOps by Jeff Geerling, but shortly into it I realized it focuses on Linux-based machines. Ansible does support Windows targets, but the commands must be run from a Linux control machine. Dang. I don't know Linux. So now I am writing this blog post. And it begins. Let's see what Ansible can do for Windows targets.

Installing Linux – CentOS on a Windows Hyper-V host

  1. Download CentOS 7
    1. https://www.centos.org/download/
    2. I chose the Torrent option, as the mirrors were pretty slow. The torrent ran at about 6 MB/s and finished the 4 GB download in about 10 minutes.
  2. Create new VM

    1. Choose a location for the VM file

    2. Choose Generation 1

    3. Choose a fixed amount of RAM

    4. Choose your Hyper-V NIC

    5. Choose your VHD path/info for a new blank VHD to be created

    6. Choose the CentOS ISO file downloaded in the first step

    7. Turn on the VM!
    8. Boot to the CentOS installer

Install CentOS on VM (then Python/Ansible)

  1. Select Language

  2. Choose your software selection (choose "Server with GUI" unless you know the Linux terminal well). I also selected the Development/Security tools and MariaDB (I saw MariaDB in the Ansible book examples and figured this would save me some steps later for Ansible testing)

  3. Choose a disk

  4. Enable Ethernet and choose a hostname for the computer

  5. Verify everything looks good

  6. While the OS is installing, configure the root password and a local user account. Root is like a local server admin account, which we will be using. The user account is the username and password you will log in with each time you start the VM

  7. Select the blue Reboot option when the install is complete.

Log In to CentOS

  1. Log in to CentOS
    1. License Agreement
      1. Hit 1 to read it, 2 to accept, c to continue, and c again to continue (I kinda struggled with this part)


    2. Sign in using the username and password you set up:


      1. Accept language, keyboard layout, and skip cloud accounts if desired. Then click Start using Linux!


  2. Run Terminal


Install Python/Ansible on CentOS

  1. Install the EPEL repository (CentOS 7 already includes Python; the EPEL repository is what provides the Ansible package)
    1. Type su and hit Enter in the terminal to switch to the root account

    2. Enter the root password (you set this during setup; it is different from your user password)
    3. Once you are root, install the EPEL repository.
      1. Type: sudo yum install epel-release

      2. Hit Y to continue (twice)
      3. Verify it completes

  2. Install Ansible
    1. Type sudo yum install ansible


    2. Select Y to continue (twice)

    3. Test ansible command to verify install is complete:
      1. Type: ansible --version


      2. You should get back a version number.

RESOURCES

Thanks to this article for the installation help; with it I was able to figure out how to install Ansible without errors on CentOS 7. Here are the same commands again, together without screenshots:

http://stackoverflow.com/questions/32048021/yum-what-is-the-message-no-package-ansible-available

$ su

$ sudo yum install epel-release

$ sudo yum install ansible

SharePoint 2013 Newsfeed – We’re still collecting the latest news error

If you are a SharePoint admin or user, you have probably seen this error message on your SharePoint 2013 MySite Newsfeed:

“We’re still collecting the latest news. You may see more if you try again a little later.”

sharepoint 2013 mysite company newsfeed error -Were still collecting the latest news

Hopefully this article will help explain a few common scenarios I have run into and how to resolve the errors. Please post if you have any tips or suggestions, as I am always looking for thoughts from the community on this.

This article will address:

  • Common reasons for newsfeed data not displaying
  • What is distributed cache?
  • Distributed cache configuration
  • Shutdown/Reboot WFE procedure for maintenance so you don’t lose your cache
  • Repopulating cache (if server stopped unexpectedly)
  • References

Common reasons for newsfeed data not displaying

  1. Someone rebooted all your Distributed Cache servers at the same time
    1. Check the task manager uptime to see how long the servers with distributed cache have been running, or if they were rebooted unexpectedly
    2. Fix: Repopulate cache using PowerShell, or maybe wait a long time for new news
    3. Prevent it from happening again: Shut down one server at a time, stopping the cache first, rebooting, and then starting the cache again.
  2. “Everyone” is empty because it only keeps company conversations for 14 days by default.
    1. Fix: Increase the retention time, or encourage people to post to the company newsfeed (not site newsfeeds). See this article for more information on what appears in site newsfeeds vs company newsfeeds. https://support.office.com/en-za/article/What-items-appear-in-your-newsfeed-bd3d9268-0408-4ad4-bc51-2e4ec5406e16#__toc327280723
  3. Distributed cache is not configured right
    1. Fix: Configure it right :) This one is so simple, yet so difficult, I find. See the configuration below.

What is distributed cache?

Distributed cache is a framework Microsoft uses to quickly serve social information in SharePoint from the SharePoint servers' RAM. It can be enabled on one or many SP servers in your farm.

Official definition can be found on this poster, https://www.microsoft.com/en-us/download/details.aspx?id=35557

What uses distributed cache?

Pretty much anything social, but some of the social data comes from content databases and user profile databases. Company newsfeed posts are stored in distributed cache.

  • Newsfeeds
  • Authentication
  • OneNote client access
  • Security Trimming
  • Page load performance

Distributed cache configuration

Note: Run any scripts/commands logged in as the SPFarm account, and be sure to run SharePoint Management Shell as administrator if you have UAC enabled (like a good administrator)

Caution: configuration deletes the cache, so you will need to repopulate the cache after configuring it.

Determine servers to host the distributed cache service

Usually the WFE servers, not servers running Search or Excel Services. AutoSPInstaller has a limit of four servers, but typically it does not configure distributed cache correctly.

Configuring Distributed Cache

There is a good article on these commands here, https://technet.microsoft.com/en-us/library/JJ219613.aspx, and it is probably better than the article I am writing. But it's very long, so I wrote this post to get admins 90%-100% of the way there.

Here is how I have been configuring distributed cache (a consolidated PowerShell sketch follows the numbered steps). Thanks Jon for the help!

  1. Use Central Admin or PowerShell to start/stop the SharePoint Distributed Cache service on the desired servers in your farm (usually WFE’s).
    1. Or you can use PowerShell to get/start-spserviceinstance of Distributed Cache on the desired servers. I like to use PowerShell to see what servers are running Distributed Cache within my farm:
      1. Get-SPServiceInstance | where-object {$_.typename -ilike "*distributed*"}
  2. Verify Cache service is running on desired servers:
    1. Use-CacheCluster
      1. Note, this command doesn't configure the server; it just connects the current PowerShell session to the cache cluster the server is joined to so you can manage it. It's actually an alias for Connect-AFCacheClusterConfiguration.
    2. Get-CacheHost
    3. You should see each server running distributed cache listed above. If not, there might be more work to configure the cache cluster I may have missed in this post. Let me know!
  3. Get current configuration
    1. Get-AFCacheHostConfiguration -ComputerName wfe01 -CachePort "22233"
    2. The cache size can be updated, see the guide here https://technet.microsoft.com/en-us/library/JJ219613.aspx#memory . For 16 GB of RAM on our WFE servers, we go with 819 MB (~5% of 16 GB). Note, changing this requires the distributed cache service to be stopped on the computer you are changing it on. Update-SPDistributedCacheSize -CacheSizeInMB 819
  4. Export config, verify service account for distributed cache, as well as servers.
    1. Export-CacheClusterConfig -Path C:\test.xml
      1. Check max cache size (default is 5%, no more than 4 GB; size depends on the services on the server)
      2. Check servers – ensure only WFE (or desired servers are in the cluster)
      3. Check service account – ensure all servers use the same service account (spservice)
      4. Check ports
  5. Warning: After configuration is complete do not ever run Add-SPDistributedCacheServiceInstance or Remove-SPDistributedCacheServiceInstance. It reconfigures the cluster (and usually incorrectly)
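For reference, here is a consolidated sketch of the steps above, run from an elevated SharePoint Management Shell as the farm account. The server name WFE01 and the 819 MB cache size are placeholders; adjust them for your farm.

# See which servers currently run the Distributed Cache service instance
Get-SPServiceInstance | Where-Object {$_.TypeName -like "*Distributed Cache*"} | Select-Object Server, Status

# Connect this PowerShell session to the cache cluster and list its hosts
Use-CacheCluster
Get-CacheHost

# Inspect the current host configuration (default cache port is 22233)
Get-AFCacheHostConfiguration -ComputerName "WFE01" -CachePort "22233"

# Resize the cache (stop the cache service on the host first, then start it again afterwards)
Stop-SPDistributedCacheServiceInstance -Graceful
Update-SPDistributedCacheSize -CacheSizeInMB 819

# Export the cluster configuration to verify servers, service account, and ports
Export-CacheClusterConfig -Path "C:\Temp\CacheClusterConfig.xml"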

Shutdown/Reboot WFE procedure

If you have to do reboots on the WFE for Windows Updates, etc., you might be expecting to lose your Newsfeed cache. Here is the proper procedure to retain the cache.

Summary: shut down the cache one server at a time, reboot that server, add the server back into the cache cluster, then repeat on the next server. (A PowerShell sketch of the per-server sequence follows these steps.)

  1. Verify the Cache service is running on the desired servers (having more than one server running it is key):
    1. Use-CacheCluster
    2. Get-CacheHost
  2. Reboot SQL Server first if needed. Get this out of the way.
    1. Wait for SQL Server to come back online
  3. Reboot WFE1
    1. Perform these commands on WFE1
    2. Verify the Cache service is running on the desired servers (having more than one server running it is key):
      1. Use-CacheCluster
      2. Get-CacheHost
    3. Run Stop-spdistributedcacheserviceinstance -graceful
    4. Verify Cache service is stopped on WFE1. Ensure it is stopped before proceeding:
      1. Get-CacheHost
    5. Reboot WFE1
    6. Verify Cache service is running on WFE1:
      1. Use-CacheCluster
      2. Get-CacheHost
      3. If not, go to Central Admin Services on Server and start Distributed Cache service on WFE1, or use PowerShell.
  4. Reboot WFE2
    1. (Repeat above Step #3, but replace WFE1 with WFE2)
  5. Verify it is running
    1. Verify the Cache service is running on the desired servers (having more than one server running it is key):
      1. Use-CacheCluster
      2. Get-CacheHost
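Here is a sketch of that per-server sequence as PowerShell, using WFE1 as a placeholder name. Run it in an elevated SharePoint Management Shell on the server being rebooted, one cache host at a time.

# Confirm the cluster is healthy and more than one cache host is UP before touching anything
Use-CacheCluster
Get-CacheHost

# Gracefully move this server's cache to the other hosts, then confirm it shows as stopped
Stop-SPDistributedCacheServiceInstance -Graceful
Get-CacheHost

# Reboot this server (WFE1)
Restart-Computer

# After the reboot, confirm the server rejoined the cluster; if not, start the
# Distributed Cache service from Central Admin > Services on Server
Use-CacheCluster
Get-CacheHost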

Repopulating cache (if server stopped unexpectedly)

Replace the URL with your MySite URL. This script will populate each user's cache using Update-SPRepopulateMicroblogFeedCache and the entire user profile newsfeed cache using Update-SPRepopulateMicroblogLMTCache. I am not sure if I stole this script from somewhere, but part of it comes from various user profile scripts adapted to fix the users' feed cache.

Note: Run any scripts/commands logged in as the SPFarm account, and be sure to run SharePoint Management Shell as administrator if you have UAC enabled (like a good administrator). Otherwise you will get a .ctor error that will drive you crazy.

Download the script from here: http://pastebin.com/K9yR2pEk

$proxy = Get-SPServiceApplicationProxy | ? {$_.Name -ilike "User Profile Service Application*"}

Update-SPRepopulateMicroblogLMTCache -ProfileServiceApplicationProxy $proxy

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server.UserProfiles")

# Update the URL to your MySite host
$url = "http://mysiteurl.domain.com"
$contextWeb = New-Object Microsoft.SharePoint.SPSite($url)
$ServerContext = [Microsoft.Office.Server.ServerContext]::GetContext($contextWeb)
$UserProfileManager = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($ServerContext)
$Profiles = $UserProfileManager.GetEnumerator()

foreach ($oUser in $Profiles) {
    if ($oUser.item("SPS-PersonalSiteCapabilities").Value -eq 14) {
        $personalurl = $url + $oUser.item("personalspace").Value
        Write-Host $oUser.item("AccountName").Value
        Update-SPRepopulateMicroblogFeedCache -ProfileServiceApplicationProxy $proxy -AccountName $oUser.item("AccountName").Value
        # -siteurl $personalurl
    }
}

$contextWeb.Dispose()

After running the script on each WFE where distributed cache runs, wait 15 minutes for the Newsfeed data to populate.

Then test newsfeed.

References

http://consulting.risualblogs.com/blog/2014/04/01/export-impor-distributed-cache-configuration-in-sharepoint-2013/

http://sharepoint.stackexchange.com/questions/125798/userprofileapplicationnotavailableexception-logging-userprofileapplicationpro

http://netwovenblogs.com/2014/03/11/the-newsfeed-is-not-working-on-mysite-in-sharepoint-2013/

SharePoint PowerShell to audit list columns

Recently, I had to find where certain SharePoint site columns are used in lists across many site collections. I wrote some quick PowerShell to get all SharePoint list columns and write out the site collection, list, and column.

Columns: the site column/list column display names I wanted to check for usage

Sites: all site collections, except my sites

I am sure it can be cleaned up, but it's a quick way to audit the list columns and get a report. I tried to export to CSV but decided that was more effort than formatting the few results that came back in the window.

Results: (can be cleaned up easily, but I don’t have time for this quick script)

AuditColumns.ps1

SPSite Url=https://client/sites/docs, Accounting Documents, Document Type

SPSite Url=https://client/sites/docs, Meeting Minutes, Document Type

SPSite Url=https://client/sites/docs, Documents, Document Type

SPSite Url=https://client/sites/docs, Announcements, Document Type

Script:

[system.reflection.assembly]::loadwithpartialname("microsoft.sharepoint")

# Site collections to audit
$sites = @("https://client",
"https://client/sites/docs");

# Column display names to look for
$cols = @("Document Type",
"Another Column Name",
"My Column",
"Find Me");

foreach($siteurl in $sites){
    $site = get-spsite $siteurl
    $web = $site.OpenWeb()
    $lists = $web.Lists
    foreach($list in $lists) {
        $fields = $list.Fields
        foreach($col in $cols){
            if($fields.title -contains $col){
                write-output "$site, $list, $col"
            }
        }
    }
    $web.Dispose()
    $site.Dispose()
}
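If you do want the CSV export I skipped, a small variation of the script above emits objects and pipes them to Export-Csv. This is just a sketch; the output path is a placeholder and the $sites/$cols arrays are the same as above.

$results = foreach($siteurl in $sites){
    $site = get-spsite $siteurl
    $web = $site.OpenWeb()
    foreach($list in $web.Lists) {
        foreach($col in $cols){
            if($list.Fields.Title -contains $col){
                # Emit an object per match so it can be exported cleanly
                New-Object PSObject -Property @{ Site = $site.Url; List = $list.Title; Column = $col }
            }
        }
    }
    $web.Dispose()
    $site.Dispose()
}
$results | Export-Csv -Path "C:\Temp\ColumnAudit.csv" -NoTypeInformation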

Visio file error in SharePoint – Sorry, we can’t perform this action

I recently had an issue with a new Windows 8.1 laptop not able to open Visio files from our SharePoint 2013 intranet. The error is similar to the 32bit/64bit mixed environment errors. My co-worker Rod and I found a workaround that might not be best practices, but resolved the issue. Please post any comments if there is an industry best standard to resolve the error aside from reformatting.

SharePoint 2013 Intranet error with Office Pro Plus on Visio 2013 64bit: Microsoft Visio- Sorry, we can’t perform this action. Incompatible Office products are installed on your machine. If you have an administrator, please contact them for help. OK

Microsoft Visio- Sorry, we can't perform this action. Incompatible Office products are installed on your machine. If you have an administrator, please contact them for help. OK

Terminology:

  • OneDrive for Business- SharePoint file sync tool
    • Blue cloud icon
  • OneDrive (Personal)- Not discussed anywhere in this post. Ignore personal OneDrive here, it's 100% separate from SharePoint.
    • White cloud icon

My setup:

  • OS: Windows 8.1 Enterprise 64 bit
  • Computer: Dell Latitude E5550 touch screen laptop, Intel i7 2.6GHz, 8GB Ram
  • Office: Office 365 ProPlus 15.0.4727.1003
  • Visio Version: Visio Professional 2013 15.0.4569.1506
  • OneDrive for Business Updates:
  • Internet Explorer Version: Internet Explorer 11 Version: 11.0.9600.17842
    • SharePoint intranet site is added as Local Intranet, default security settings.

Short term solution: I noticed that if I ended the OneDrive for Business process in Task Manager, I would then end the Microsoft Office Upload Center process. After these were ended (in that order only), I could open Visio files from the intranet.

Before I did this short term fix, other users at my company with the same application versions would try to launch OneDrive for Business and get the same "Sorry, we can't perform this action" message, but they could open Visio files from the intranet. I, on the other hand, could launch OneDrive for Business but not Visio from an intranet file link. After the fix, I could open Visio files from the SharePoint 2013 intranet, and when I opened OneDrive for Business I got the same error as the other users, "Sorry, we can't perform this action". Good, now I am following the company standard. BUT, as soon as I reboot, the same issue comes back and I can't open Visio files from the SharePoint intranet.

Long term solution: (NOTE, this might not be best practices, but it solves my issue. If you have a Microsoft supported alternative, please post it here in the comments). Instead of ending the processes above, go to Startup tab of the Task Manager and disable OneDrive for Business. This might also disable the child Office Document Cache under it, but it resolves my issue. Summary: OneDrive for Business conflicts with Microsoft Office Document Cache and it might be a 64bit/32bit issue, not sure, but disabling OneDrive for Business on startup resolves the issue for me. I will post if there are any noticeable side effects of file offline caching, stale cache, or other errors. I think this happened to me only because I configured OneDrive for Business to resolve this issue when I first got my laptop, not sure why the laptop originally had the issue. Again, if there’s a better way, please comment below. Thanks!

SharePoint 2013, IIS7, NLB, SSL certificates and GoDaddy Renewal Steps

Overview:

SSL certificates on SharePoint 2013 web applications expire, and when that happens, you have to generate a new SSL certificate. In this post, I will go over how to renew the SSL certificate on your SharePoint 2013 HTTPS website with GoDaddy, including multi-server Web Front End (WFE) topologies. If you use wildcard certificates on your SharePoint websites, there are a few gotchas when renewing. The process is similar for most certificate types, but wildcards and SharePoint are this blog post's focus. These steps are also similar if you are adding an SSL certificate to your website for the first time (once your SharePoint farm, web applications, and site collections have been configured to use HTTPS, etc.).

Here is an overview of the steps involved with the certificate renew process:

  1. Request a new certificate request from the machine running IIS/SharePoint (Pick a WFE)
  2. Go to GoDaddy and rekey your certificate, entering your certificate request text from step 1
  3. Complete the certificate request in IIS on WFE
  4. Update WFE bindings to use SSL cert
  5. Export certificate from WFE to WFE2 (PFX with personal information, create a password)
  6. Import the PFX on WFE2 IIS
  7. Update WFE2 bindings to use SSL cert

Common issues:

First, this is my experience. Comment below any corrections or other helpful information.

  • When adding the cert to IIS and refreshing, it disappears!
    • Your certificate request is expired. Generate a new one and try again.
    • You are following GoDaddy's guide, which does not work. Follow my post below.
    • The cert might already exist and need to be deleted in the Certificate Manager on the server.
  • CER, CRT, PFX- what is the difference? Why do I have to select *.* if I need a specific type? Who designed this stuff…
    • CER is a request
    • CRT is a certificate without private information
    • PFX is a certificate package with private information (exported from CRT paired on the first server, the PFX is imported to the second server).
  • How do I complete a request on WFE2 if it was already completed from WFE1?
    • Export the working cert from Server 1 as a PFX file with a password, then import it on server 2 in IIS. Do not use cert manager on server 2.
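If you prefer scripting the export/import over clicking through IIS, the PKI module cmdlets can handle the PFX copy, assuming Windows Server 2012 or later. This is only a sketch; the subject filter, password prompt, and file paths are placeholders, and the IIS steps below remain the method this post actually walks through.

# On WFE1: export the wildcard certificate (with its private key) to a password-protected PFX
$pfxPassword = Read-Host -Prompt "PFX password" -AsSecureString
$cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.Subject -like "CN=*.yourdomain.com*"} | Select-Object -First 1
Export-PfxCertificate -Cert $cert -FilePath "C:\Temp\wildcard.pfx" -Password $pfxPassword

# On WFE2, after copying the PFX over: import it into the machine Personal store,
# then bind it to the site in IIS as described below
Import-PfxCertificate -FilePath "C:\Temp\wildcard.pfx" -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword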

Steps to renew your Existing wildcard SSL Certificate:

  1. Verify your certificate is expired by navigating to your SharePoint site. If you get an HTTPS trust warning, it’s expired or has issues that this blog post will address.
  2. Go to WFE1 IIS 7 on your SharePoint box
    1. Go to Server Certificates in IIS

    2. Remove any old certificates that contain the URL for your SharePoint site that we are renewing

    3. On the top right in IIS, go to “Create Certificate Request”

    4. Enter your information. Common name is the wildcard URL. The rest, do not use abbreviations. See this post for more info: https://support.godaddy.com/help/article/4800/generating-iis-7-csrs-certificate-signing-requests

    5. Select “4096” for the bit length

    6. Select a location/filename for the text file that is about to be generated

    7. We will be copying the contents of this file to GoDaddy to rekey our wildcard SSL certificate in the next step.
  3. Now that we have our server “key” information waiting in the text file, we can now go to GoDaddy and pair this server information to that of our SSL certificate.
    1. Go to Go Daddy Certificate Manager (Manage SSL Certificates > Manage Certificates)

    2. Select “Re-Key” on the top navigation
    3. Paste your text file contents from the IIS text file to this GoDaddy window:

    4. Select “Re-Key”
    5. Click “Manage Certificates” From the top navigation, then select “Certificates” folder on the left navigation.
    6. Select the bottom SSL certificate (the most recent version)
    7. Select “Download” icon from the navigation.

    8. Select IIS7, the “Download”

    9. Save this zip to your WFE server where you created the IIS certificate request.
    10. Extract to C:\Temp and proceed carefully to the next steps in this post.
  4. On WFE1 in IIS where you created the certificate request, open IIS 7 and follow these steps to use the certificate you downloaded from GoDaddy.
    1. Remove any old expired wildcard certificates from the WFE1 servers “Certificate Manager”, check Personal > Certificates and the Intermediate > Certificates locations

    2. COMMON GOTCHA: Do not install the cert, do it using IIS.
    3. Go back to “Server Certificates” in IIS 7 and select “Complete Certificate Request” on the right navigation

    4. Enter the information for the Certificate request as follows:

    5. COMMON GOTCHA: Select *.* when browsing for the CRT file from the GoDaddy zip

    6. Friendly name must be the wildcard URL of the domain.
    7. Click OK.
    8. Refresh “Server Certificates” to verify the certificate “stays”. If it disappears, you either have:
      1. A certificate in your Personal Certificate store with the same friendly name
      2. An expired or old Certificate Request you generated and downloaded, or you downloaded an older certificate from GoDaddy. Repeat these steps and it will work (it should).
  5. Set the IIS binding of the new certificate to your SharePoint 443 SSL HTTPS website in IIS:
    1. Go to IIS 7 > Sites > select the SharePoint site that uses the wildcard cert.
    2. Select “Bindings” on the right with the website selected.

       

    3. Select “Edit” and select the new SSL certificate

    4. Select OK. On WFE2, you will get an error here trying to use an exported PFX file, follow the next steps to fix WFE2.
    5. Verify the site loads on WFE1 if you can control your DNS/NLB routing.
  6. If you have additional WFE servers, you need to export this new verified SSL certificate to IIS. Here is how.
    1. From WFE1, Go to “Server Certificates”, right click the wildcard cert and select “Export”

    2. Pick a location for the new PFX file, then enter a secure password.

    3. Click OK
    4. Copy the PFX file to WFE2 through Explorer or any other method.
    5. On WFE2, go to IIS 7 > “Server Certificates” and select “Import”

       

       

       

    6. Browse to the PFX file copied over from WFE1, enter your password and select OK.
    7. Refresh “Server Certificates” to verify it is still available.
    8. Repeat the import process in IIS on other WFE servers.
  7. Now that the certificate is available on the other WFE’s in IIS, we need to update the bindings. Same process as the first WFE.
    1. (Copied and pasted from WFE1 steps, but perform these on the WFE2 and additional servers once the certificate is imported)
    2. Go to IIS 7 > Sites > select the SharePoint site that uses the wildcard cert.
    3. Select “Bindings” on the right with the website selected.

       

    4. Select “Edit” and select the new SSL certificate

    5. Select OK.
    6. Verify the site loads on WFE2 if you can control your DNS/NLB routing.

That’s it! I believe most of what’s above is best practices. I would also remove temporary certificate files, such as PFX, CSR files, etc. left around during the process for added security.

Running MS Office Demo VMs in Azure

UPDATE: If you are a MS employee, Visit https://demomonkey.cloudapp.net/. There is a complete Azure VM deployment script for this in your Azure subscription. I had limited success getting this to work on my own.

If you have used the Microsoft Office Demos website Office 365 environment, you know it's quite handy for client demos. It used to be similar to the SharePoint Information Worker Demo or SDPS demo. This new environment runs on Windows Server 2012 and features SharePoint 2013, Exchange, Lync, and Office Web Apps. These demos can be spun up on Office 365 or downloaded as Hyper-V virtual machines for on-premises demos. They are HUGE VMs and resource hogs. I think you need 50+ GB of RAM to host all 9 VMs, as well as probably 1 TB of hard drive space, to even consider starting them. Remember, having everything run on the same disk will create major throughput issues with your storage and run unbearably slow (I tried four 1 TB RAID 10 7200 RPM SATA drives and could only get a few going before hitting huge performance walls).

So, let’s host it in Azure!

There are plenty of performance considerations in Azure, such as use an E drive for your data, turn on or off disk caching, using separate storage accounts, etc. I will NOT be covering that. This is a POC for a client demo, so my goal is just to get SharePoint 2013 with the Contoso users and content working on premise so we can demo Web Content Management (WCM) features of SharePoint, along with the Content by Search web part and Taxonomy driven navigation for product sites.

I will only require SharePoint and the Domain Controller for this effort. My farm does not require Search live preview, outgoing/incoming email, Lync presence, etc. So hosting just those two VMs in Azure should cover the demo.

Here are the major steps:

  • Downloading the template
  • Convert VHDX to VHD before uploading to Azure
  • MakeCert
  • Connect Azure to PowerShell
  • Start conversion while waiting (VHDX to VHD)
  • Upload VHDs:
  • Add VHD to VM Image
  • Create VM
  • Repeat and create the SharePoint VM
  • Add VHD to VM Image
  • Create VM
  • Resources and Links

Downloading the template VMs

First, download the VHDX zip files from www.microsoftofficedemos.com

I chose to download mine with content.

  1. 2013-DC v4 (Complete)
  2. 2013-EXCH v4 (Complete)
  3. 2013-LYNC-SE1 v4 (Complete)
  4. 2013-PCHAT v4 (Complete)
  5. 2013-SP v4 (Complete)
  6. 2013-SP-AFCache v4 (Complete)
  7. 2013-VPN v4 (Complete)
  8. 2013-WAC v4 (Complete)
  9. Office Demos 2013 VHD EULA

I downloaded #1, #5 and #7. These 3 files required ~31GB of disk space and extract to ~175GB

Extract each VM to your computer.

Convert VHDX to VHD before uploading to Azure

If you have Windows 8.1, Windows Server 2012 R2, or Windows PowerShell 4.0, run the PowerShell Convert-VHD command: http://technet.microsoft.com/en-us/library/hh848454.aspx

Convert-VHD -path "D:\temp\MS-Office-Demo-VMs\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013DC.VHDX" -destinationpath "D:\temp\MS-Office-Demo-VMs\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013DC.VHD"

MakeCert

Open a Visual Studio command prompt (makecert.exe lives under C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Bin\x64) and create a self-signed management certificate:

makecert -sky exchange -r -n "CN=MSDNAzure6scport06" -pe -a sha1 -len 2048 -ss My "MSDNAzure6scport06.cer"

Connect Azure to PowerShell

In MMC, add the Certificates snap-in and navigate to Personal certificates. If you are not a local administrator and you run the console as administrator, the certificates will not appear for you. I just exported another Azure cert from the management tools for testing and it worked for me.

Export it again as a CER (DER encoded X.509) to the Desktop.

Upload CER file to Azure:

Get-AzurePublishSettingsFile

Save Certificate from Azure when prompted to download: C:\temp\

Import-AzurePublishSettingsFile "C:\temp\Windows Azure MSDN – Visual Studio Premium-12-3-2013-credentials.publishsettings"

Test a random command to verify PowerShell is connected: (Your storage account name below):

Get-AzureStorageAccount portalvhdsfz7h5hgfmhh4k

Start conversion while waiting (VHDX to VHD)

convert-vhd -path "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013-DC.VHDX" -DestinationPath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013-DC.VHD"

convert-vhd -path "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-SP v4 (Complete)\2013-SP Complete v4\Virtual Hard Disks\2013-SP.VHDX" -DestinationPath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-SP v4 (Complete)\2013-SP Complete v4\Virtual Hard Disks\2013-SP.VHD"

convert-vhd -path "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-VPN v4 (Complete)\2013-VPN Complete v4\Virtual Hard Disks\2013-VPN.VHDX" -DestinationPath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-VPN v4 (Complete)\2013-VPN Complete v4\Virtual Hard Disks\2013-VPN.VHD"

Upload VHDs:

Add-AzureVhd -Destination http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-dc.vhd -LocalFilePath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-DC v4 (Complete)\2013-DC Complete v4\Virtual Hard Disks\2013-DC.VHD" -NumberOfUploaderThreads 32 -OverWrite

Add-AzureVhd -Destination http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd -LocalFilePath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-SP v4 (Complete)\2013-SP Complete v4\Virtual Hard Disks\2013-SP.VHD" -NumberOfUploaderThreads 32 -OverWrite

Add-AzureVhd -Destination http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-VPN.vhd -LocalFilePath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-VPN v4 (Complete)\2013-VPN Complete v4\Virtual Hard Disks\2013-VPN.VHD" -NumberOfUploaderThreads 32 -OverWrite

Add VHD to VM Image

Add-AzureVMImage -ImageName SP2013-DC -MediaLocation http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-dc.vhd -OS Windows -Label SP2013-DC -ImageFamily "Microsoft Office Demo SharePoint 2013 w content" -Eula http://go.microsoft.com/fwlink/?LinkID=324375 -PrivacyUri http://go.microsoft.com/fwlink/?LinkID=282418 -RecommendedVMSize Medium -Verbose

VERBOSE: 2:20:55 PM – Begin Operation: Add-AzureVMImage
VERBOSE: 2:21:03 PM – Completed Operation: Add-AzureVMImage

AffinityGroup :
Category : User
Location : West US
LogicalSizeInGB : 41
Label : SP2013-DC
MediaLink : http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-dc.vhd
ImageName : SP2013-DC
OS : Windows
Eula : http://go.microsoft.com/fwlink/?LinkID=324375
Description :
ImageFamily : Microsoft Office Demo SharePoint 2013 w content
PublishedDate :
IsPremium : False
IconUri :
PrivacyUri : http://go.microsoft.com/fwlink/?LinkID=282418
RecommendedVMSize : Medium
PublisherName : User
OperationDescription : Add-AzureVMImage
OperationId : 4e871f30-af08-3f06-95e3-ef9b72913288
OperationStatus : Succeeded

Create VM

Go to Azure, Create new Virtual Machine. Choose My Images. Choose SP2013-DC

Choose a unique local user account (it probably won't be used) and a non-common password.

Select a region or affinity group (I have an affinity Group)

Open ports for RDP and PowerShell for now. Later we might have to add more for the DC, etc.

Wait for the VM to provision.

Connect using contoso\administrator. Password is pass@word1
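If you would rather script the VM creation than click through the portal, the classic (Service Management) Azure PowerShell module can create it from the image registered above. This is a sketch only; the cloud service name, VM name, credentials, size, and location are placeholders.

# Create a VM from the custom image registered with Add-AzureVMImage
# (use -AffinityGroup instead of -Location if you have an affinity group)
New-AzureQuickVM -Windows -ServiceName "mydemo-svc" -Name "2013-DC" -ImageName "SP2013-DC" -AdminUsername "localadmin" -Password "SomeStrongP@ssw0rd" -InstanceSize "Medium" -Location "West US"

# Verify the endpoints (RDP/PowerShell) created for the new VM
Get-AzureVM -ServiceName "mydemo-svc" -Name "2013-DC" | Get-AzureEndpoint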

Repeat and create the SharePoint VM

Add VHD to VM Image

Add-AzureVMImage -ImageName SP2013-SP -MediaLocation http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd -OS Windows -Label SP2013-SP -ImageFamily "Microsoft Office Demo SharePoint 2013 w content" -Eula http://go.microsoft.com/fwlink/?LinkID=324375 -PrivacyUri http://go.microsoft.com/fwlink/?LinkID=282418 -RecommendedVMSize Medium -Verbose

If you get an error, it might be because the dynamic disk on the SP box is an issue: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/e5feddff-7fee-49b4-86e2-751a1903e852/the-blob-is-not-a-valid-vhd

My error turned out to be that the VHD was not uploaded correctly. I retried the upload and it worked.

Add-AzureVMImage : “An exception occurred when calling the ServiceManagement API. HTTP Status Code: 400. Service

Management Error Code: BadRequest. Message: The blob is not a valid VHD.. Operation Tracking ID:

9db3a37a1da33d75ba569ee062f63936.”

At line:1 char:1

+ Add-AzureVMImage -ImageName SP2013-SP -MediaLocation http://portalvhdsfz7h5hgfmh …

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

+ CategoryInfo : CloseError: (:) [Add-AzureVMImage], ServiceManagementClientException

+ FullyQualifiedErrorId : Microsoft.WindowsAzure.Management.ServiceManagement.IaaS.DiskRepository.AddAzureVMImage

Back to square one. It takes 6 hours to upload this VHD at 30 Mbps.

Reupload:

Add-AzureVhd -Destination http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd -LocalFilePath "E:\Hyper-V\Virtual Hard Disks\MS Office Demo\2013-SP v4 (Complete)\2013-SP Complete v4\Virtual Hard Disks\2013-SP.VHD" -NumberOfUploaderThreads 32 -OverWrite

Convert VHD to VM template:

Add-AzureVMImage -ImageName SP2013-SP -MediaLocation http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd -OS Windows -Label SP2013-SP -ImageFamily "Microsoft Office Demo SharePoint 2013 w content" -Eula http://go.microsoft.com/fwlink/?LinkID=324375 -PrivacyUri http://go.microsoft.com/fwlink/?LinkID=282418 -RecommendedVMSize Medium -Verbose

VERBOSE: 7:26:47 AM – Begin Operation: Add-AzureVMImage
VERBOSE: 7:26:53 AM – Completed Operation: Add-AzureVMImage

AffinityGroup :
Category : User
Location : West US
LogicalSizeInGB : 127
Label : SP2013-SP
MediaLink : http://portalvhdsfz7h5hgfmhh4k.blob.core.windows.net/vhds/2013-SP.vhd
ImageName : SP2013-SP
OS : Windows
Eula : http://go.microsoft.com/fwlink/?LinkID=324375
Description :
ImageFamily : Microsoft Office Demo SharePoint 2013 w content
PublishedDate :
IsPremium : False
IconUri :
PrivacyUri : http://go.microsoft.com/fwlink/?LinkID=282418
RecommendedVMSize : Medium
PublisherName : User
OperationDescription : Add-AzureVMImage
OperationId : e493b45b-d214-3fe1-a5b8-6c21a349f670
OperationStatus : Succeeded

Create VM

Go to Azure, Create new Virtual Machine. Choose My Images. Choose SP2013-SP

Choose a unique local user account (it probably won't be used) and a non-common password.

Select a Cloud Service from previous DC vm:

Open ports for RDP and PowerShell for now. Later we might have to add more for SP, etc.

Wait for the VM to provision.

Connect using contoso\administrator. Password is pass@word1

Fixing the SharePoint VM

Looks like Central Admin (CA) loads, but not the DNS host names for http://intranet.contoso.com. This is not a big deal but let’s trace the issue.

Let’s trace the DNS. SP2013-DC is the DNS/DHCP/AD server.

Log in to the SP2013-DC VM.

Try to ping SP2013-SP; we can't. That means Azure is taking over our DNS.

You can configure Azure to use your VM as a DC, but it looks complicated: http://msdn.microsoft.com/en-us/library/windowsazure/jj156088.aspx#bkmk_BYODNS

I will just update my hosts file for now under C:\Windows\System32\Drivers\etc:

127.0.0.1        intranet.contoso.com

127.0.0.1        2013-sp

127.0.0.1        www.contoso.com

127.0.0.1        www.contoso.de

10.76.86.2        2013-dc

Be sure to update your DC IP under 2013-DC. Just do an IPConfig from the DC box.
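For reference, the same hosts-file entries can be appended from an elevated PowerShell prompt on the SP box; the IPs mirror the placeholders above, so swap in your own.

# Append the demo host entries and flush the DNS cache (run elevated; adjust IPs first)
$hostsFile = "$env:windir\System32\drivers\etc\hosts"
Add-Content -Path $hostsFile -Value "127.0.0.1    intranet.contoso.com"
Add-Content -Path $hostsFile -Value "127.0.0.1    2013-sp"
Add-Content -Path $hostsFile -Value "127.0.0.1    www.contoso.com"
Add-Content -Path $hostsFile -Value "127.0.0.1    www.contoso.de"
Add-Content -Path $hostsFile -Value "10.76.86.2   2013-dc"
ipconfig /flushdns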

Update: I just realized the VM image is called 2013-x, not SP2013-x. I don’t think Azure needs this info to match from what I can gather at this moment.

Open Internet Explorer from the 2013-SP box and go to www.contoso.com and intranet.contoso.com

It's pretty slow, but I get a SharePoint start page. www.contoso.de challenges me for credentials; I think it's because I am rebooting the DC.

I made the DC an Extra Small instance, since it's only at 3% CPU on a Medium footprint.

The SP box has been at 97% CPU for the past hour, so I want to make it a Large footprint, but it's still provisioning.

I also want to open some incoming ports for mydomain.cloudapp.net so users can come in using their web browser.
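A sketch of opening those ports with the classic Azure module, assuming the same placeholder cloud service and VM names as earlier; adjust names and ports to your deployment.

# Open HTTP and HTTPS endpoints on the SP VM so browsers can reach the demo sites
Get-AzureVM -ServiceName "mydemo-svc" -Name "2013-SP" |
    Add-AzureEndpoint -Name "HTTP" -Protocol tcp -LocalPort 80 -PublicPort 80 |
    Add-AzureEndpoint -Name "HTTPS" -Protocol tcp -LocalPort 443 -PublicPort 443 |
    Update-AzureVM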

Resources:

Storage issues

Using Azure disks not sysprepped (DC, SP farm, etc) http://blog.aditi.com/cloud/guide-to-azure-iaas-vhds-disks-images/

Uploading a VHD to Azure (Latest instructions from Official Azure site) http://www.windowsazure.com/en-us/manage/windows/common-tasks/upload-a-vhd/

Azure VHD dynamic disk error: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/e5feddff-7fee-49b4-86e2-751a1903e852/the-blob-is-not-a-valid-vhd

VHDX

Lessons learned uploading a VHDX to Azure: (see step #27) http://blogs.catapultsystems.com/cmoore/archive/2013/04/30/one-does-not-simply-upload-a-vm-to-azure.aspx

Converting VHDX to VHD http://blogs.technet.com/b/cbernier/archive/2013/08/29/converting-hyper-v-vhdx-to-vhd-file-formats-for-use-in-windows-azure.aspx

Convert-VHD command http://technet.microsoft.com/en-us/library/hh848454.aspx

Azure PowerShell trust certs

Add Certificates to MMC: http://social.technet.microsoft.com/wiki/contents/articles/2167.how-to-use-the-certificates-console.aspx

Convert PFX to CER: http://stackoverflow.com/questions/403174/convert-pfx-to-cer

Visual Studio: makecert- http://msdn.microsoft.com/en-us/library/bfsktky3(v=vs.110).aspx

Add-AzureVHD http://msdn.microsoft.com/en-us/library/dn205185.aspx

Networking

Azure DNS- http://msdn.microsoft.com/en-us/library/windowsazure/jj156088.aspx#bkmk_BYODNS

Feature with ID ‘87294c72-f260-42f3-a41b-981a2ffce37a’ is not installed in this farm, and cannot be added to this scope. Error creating SharePoint site collection, Powershell

I had an error today after I created my farm via PowerShell: I forgot to run the SharePoint Products and Configuration Wizard (PSConfig) after creating the farm. The result was an error when creating site collections and subsites via the UI. I could only create the sites via PowerShell, and I got quite a few errors in the UI and had to navigate to http://intranet/_layouts/settings.aspx to get the site to load without an error.

 

Sorry, something went wrong.

Feature with ID ‘87294c72-f260-42f3-a41b-981a2ffce37a’ is not installed in this farm, and cannot be added to this scope.

Technical Details

 

ULS shows this error creating a subsite under the root site collection via the UI: “Failed to apply template “STS#0” to web at URL “http://intranet.com/Test”, error Feature with ID ‘guid’ is not installed in this farm, and cannot be added to this scope.”

Solution: Run PSConfig, then try creating your site in the UI.

If you created the sites via PowerShell and want to delete them, I had to run get-spsite | remove-spsite (WARNING: THIS REMOVES ALL SITE COLLECTIONS). Then I ran PSConfig and recreated the site collections via PowerShell successfully.

 

You can run the following command:

Install-SPFeature -AllExistingFeatures

via PowerShell, but there are other commands that PSConfig performs and that must be run as well (a psconfig.exe equivalent is sketched after this list):

Install-SPHelpCollection -All

Initialize-SPResourceSecurity

Install-SPService

Install-SPFeature -AllExistingFeatures

New-SPCentralAdministration -Port 1234 -WindowsAuthProvider "NTLM"

Install-SPApplicationContent
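Alternatively, the wizard's work can be done with psconfig.exe from an elevated prompt on the server. These are the commonly cited switches for the command-line equivalent; verify them against your build before relying on this sketch.

# Command-line equivalent of the Products and Configuration Wizard (SharePoint 2013 "15" hive)
& "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\15\BIN\psconfig.exe" -cmd upgrade -inplace b2b -wait -cmd applicationcontent -install -cmd installfeatures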

SharePoint 2013 On-Premise App Store Configuration

The SharePoint 2013 March Public Update requires additional configuration steps to complete a SharePoint App Store deployment.
The additional steps are described in the article “Enable apps in AAM or host-header environments for SharePoint 2013”: http://technet.microsoft.com/en-us/library/dn144963.aspx

# New-SPWebApplicationAppDomain needs parameters; the web application URL and app domain here are example values
New-SPWebApplicationAppDomain -WebApplication http://webapp -AppDomain contosoapps.com
$contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$contentService.SupportMultipleAppDomains = $true
$contentService.Update()
iisreset

Managed Metadata field error on SharePoint 2013 “The given guid does not exist in the term store”

We recently migrated our SharePoint 2010 content database to SharePoint 2013.

In SharePoint 2013, users tried to use a MMD field/column on a new item form and received this error: “The given guid does not exist in the term store”

After researching the issue, it appears that the error is due to a disconnect between the list column and the MMD values. I remembered that the MMD database/terms for the Managed Metadata Service Application are stored separately from the content database.

I opened the Managed Metadata Service Application on SharePoint 2013 and confirmed there were no terms present:

Optional: To confirm the term store structure back on SharePoint 2010, I opened the Managed Metadata Service Application on the SharePoint 2010 farm, the terms were indeed present:

Since we only migrated the SharePoint 2010 content database (not the Managed Metadata Service Application data), the MMD field fails to retrieve the values from the new SharePoint 2013 farm.

Solution: You will have to migrate the Managed Metadata Service Application from SharePoint 2010 to SharePoint 2013.

For more information on how to upgrade the Service Applications, see http://technet.microsoft.com/en-us/library/jj839719.aspx

“When you upgrade from SharePoint 2010 Products to SharePoint 2013, you must use a database attach upgrade, which means that you upgrade only the content for your environment and not the configuration settings. After you have configured the SharePoint 2013 environment, and copied the content and service application databases, you can upgrade the service applications to SharePoint 2013. This article contains the steps that you take to upgrade the service applications.”

SharePoint 2013 error A potentially dangerous Request.Path value was detected from the client (%) on intranet/mysites access requests from email link

I had a link in my email inbox for “User wants access to ‘Intranet’”, sent from the user to me. I am the Intranet site owner and receive access requests.

There is a link to “Accept or Decline this request”

If I click accept from the email Link, I get a SharePoint error:

Sorry, something went wrong
An unexpected error has occurred.

Correlation ID: 00000000-0000-0000-0000-000000000000

Date and Time: 8/5/2013 3:19:39 PM


Go back to site

I took a look in the ULS logs and found this error when I requested the page:

08/05/2013 14:56:19.09 w3wp.exe (0x2BCC) 0x26C4 SharePoint Foundation Runtime tkau Unexpected System.Web.HttpException: A potentially dangerous Request.Path value was detected from the client (%).    at System.Web.HttpRequest.ValidateInputIfRequiredByConfig()     at System.Web.HttpApplication.PipelineStepManager.ValidateHelper(HttpContext context)

08/05/2013 14:56:19.09 w3wp.exe (0x2BCC) 0x26C4 SharePoint Foundation General ajlz0 High Getting Error Message for Exception System.Web.HttpException (0x80004005): A potentially dangerous Request.Path value was detected from the client (%).     at System.Web.HttpRequest.ValidateInputIfRequiredByConfig()     at System.Web.HttpApplication.PipelineStepManager.ValidateHelper(HttpContext context)

08/05/2013 14:56:19.09 w3wp.exe (0x2BCC) 0x26C4 SharePoint Foundation General aat87 Monitorable

08/05/2013 14:56:19.10 w3wp.exe (0x2BCC) 0x26C4 SharePoint Foundation General ajb4s Monitorable ViewStateLog: Failed to write to the velocity cache: https://intranet/Access Requests/pendingreq.aspx

The URL from the email is: https://intranet/sites/BI/Access%2520Requests/pendingreq.aspx

The emailed link contained %2520 (a double-encoded space). As soon as I went to Site Settings > Access requests and invitations, the page loaded and the URL was https://intranet/sites/BI/Access%20Requests/pendingreq.aspx#InplviewHash45f07088-94ce-4184-8700-e90e139485fb=#InplviewHashb2af6c82-aa6b-402e-97f7-b7503c93b415=

Note how the URL is %20, not %2520. This was the cause of the error.

Environment: Windows Server 2012 with SharePoint 2013 March 2013 Public Update.

For now, use the UI or manually change the link. I am reporting this issue to Microsoft and will update this if I find anything out.

SharePoint 2013 – Common Installation Issues

With SharePoint 2013, I have had a lot of installation issues. I will cover the latest issues I have run into. This post does not enforce the least-privileges security practice, but it can easily be adapted for other environments. This post was for standing up a development environment. After a long battle of getting SharePoint 2013 to work without any issues, I wanted to share my experience in one place.

I have documented our company’s current development environment setup in this post. Your environment does not have to match mine 100%. Feel free to use Windows Server 2012, etc.

Installation Media Requirements (MSDN):

  • en_windows_server_2008_r2_standard_enterprise_datacenter_and_web_with_sp1_x64_dvd_617601.iso
  • en_office_professional_plus_2013_x64_dvd_1123674.iso
  • en_sharepoint_designer_2013_x64_1134649.exe
  • en_sharepoint_server_2013_x64_dvd_1121447.iso
  • en_sql_server_2012_enterprise_edition_with_sp1_x64_dvd_1227976.iso
  • en_visio_professional_2013_x64_1123802.exe
  • en_visual_studio_premium_2012_x86_dvd_920758.iso and update en_visual_studio_2012_x86_update_1_1203928.exe

One common issue I found is that I had to run Windows Update before installing SQL Server 2012. There were 96 Windows updates on a clean server install and it took 2 hours. At about the hour-and-a-half mark, Internet Explorer has a hidden install prompt in the background that you have to click "Next" on. The patch process is as follows: install, reboot, install the security update, reboot one last time.

Before installing SQL Server 2012, I had to go to my Windows Server 2008 R2 Roles/Features and add the .NET Framework 3.5.1 Features.

.net 3.5
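If you prefer to add the feature from PowerShell instead of the Roles/Features wizard, a sketch on Windows Server 2008 R2 would look like the following; the feature name can vary by OS version, so confirm it with Get-WindowsFeature first.

# Run in an elevated PowerShell session on the server
Import-Module ServerManager
# Confirm the exact feature name for .NET Framework 3.5.1 on this OS
Get-WindowsFeature *NET-Framework*
# Install it (NET-Framework-Core is the feature name on Windows Server 2008 R2)
Add-WindowsFeature NET-Framework-Core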

Next I installed SQL Server 2012. I let the installer check for updates. If this installer fails, you probably did not finish the Windows Updates.

I installed the following features for SQL:

  • Database Engine
  • Full-Text Search
  • SSAS
  • SSRS (SharePoint integrated mode)
  • SSRS add-in tool
  • Management Tools Basic and Complete (for SSRS)

SQL 2012 features

You don’t need Full Text, but we are using it in our development environment for custom SQL applications outside of SharePoint.

Install Visual Studio 2012 and the update listed in the beginning of this post.

Then Install SharePoint 2013 Prerequisites. You will need to reboot and continue, then reboot again (I see a pattern here…)

For configuring SharePoint, MAKE SURE YOU ARE LOGGED IN AS A DOMAIN USER ACCOUNT – SPSetup, for example! I tried using the local admin account, and my service account created the SQL database, but I got an error in the Products and Configuration Wizard:

03/26/2013 23:07:06  6  ERR                        Failed to create the configuration database.

An exception of type System.InvalidOperationException was thrown.  Additional exception information: An error occurred while getting information about the user SPFarm at server Domain.local: Access is denied

System.InvalidOperationException: An error occurred while getting information about the user SPFarm at server Domain.local: Access is denied

Configuration Failed

Configuration failed. One or more configuration settings failed. Failed to create the configuration database.

Basically, SharePoint is trying to get information about this service account and access is denied. This is because you are running the Windows session and the Products and Configuration Wizard as a local user that does not have access to the AD OU to verify the SPFarm domain account. Log out, log in as SPSetup (or any domain user if you don't have a dedicated setup account), and re-launch the SharePoint Products and Configuration Wizard. Specify the SPFarm account to connect to the SQL database. Don't forget, you might have to log in to SQL Management Studio and remove the partially created farm database SharePoint_Config, or use a different name the second time.

Then Install SharePoint 2013. I use Complete instead of Stand-alone.

Specify your domain database access account (Domain\SPFarm). This account has to have the dbcreator and securityadmin server roles in the SQL instance (or sysadmin if you are lazy and frustrated).

Launch the Farm Configuration Wizard.

Specify a new managed account for the service applications (For Dev I ended up using the same SPFarm account).

I noticed the SharePoint 2013 Farm Configuration Wizard said "Working on it… Sorry to keep you waiting." for over 6 hours. I read a blog post from Todd saying that once the W3WP.exe and OWSTimer.exe processes die down, you can kill the IE window and re-open Central Admin. Give it a good 15 minutes or so; the more time the better.

Once you bring up Central Admin, then create the root site collection. My Sites is configured on the root web application under /my site collection. Go to About Me on your username above the ribbon to verify everything works.

Boom, SharePoint 2013 is up and running.

SharePoint 2010 Caching options

These are a few notes from the Designing a Microsoft SharePoint 2010 Infrastructure PDF (pages 3-30 and 3-31).

There are three different caching options in SharePoint 2010. Two of them require the Publishing feature.

  1. (page) Output Caching
    1. Setting this to as little as 60 seconds can make a big difference in WFE server load
    2. Page requests are stored in memory on the WFE server.
    3. Available with the Publishing feature enabled
    4. Memory based on WFE servers
    5. Cache Profiles
      1. Determine which users receive cached pages, etc. Set at the Site, Site collection, or web application level
  2. Object cache
    1. Stores list, library, or page layout data on the WFE server. Size is adjusted at the site collection or web application level
    2. Available with the Publishing feature enabled
    3. Memory based on WFE servers
  3. BLOB cache
    1. Better for media streaming (allows files to play before they are finished downloading). Possibly better for large files (depending on your SQL specs vs WFE).
    2. Default is 10GB and disabled
    3. Web application level
    4. Disk based on WFE servers

More from MS: http://technet.microsoft.com/en-us/library/cc261797.aspx
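Since two of the three caches depend on the Publishing feature, here is a quick PowerShell sketch to check whether it is active on a site collection and enable it if needed; the URL is a placeholder.

# Check whether the SharePoint Server Publishing Infrastructure feature is active
Get-SPFeature -Site "http://intranet" | Where-Object {$_.DisplayName -like "Publishing*"}

# Enable it at the site collection level if it is missing
Enable-SPFeature -Identity "PublishingSite" -Url "http://intranet"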

Microsoft SharePoint Hyper-V virtual machine DNS fix

After downloading the MS SharePoint preconfigured VM (demo2010a and demo2010b), I started them up on my network and noticed they would not load the SharePoint sites.

Here is the VM I downloaded from Microsoft: 2010 Information Worker Demonstration and Evaluation Virtual Machine (RTM):


http://www.microsoft.com/downloads/en/details.aspx?FamilyID=751fa0d1-356c-4002-9c60-d539896c66ce&displaylang=en

After trying the site http://intranet.contoso.com in the Demo2010A machine's Internet browser and getting a "Page cannot be displayed" error, I checked my app pools, IIS web site, Central Admin (which worked on http://demo2010a:2010, by the way), alternate access mappings, and DNS.

I noticed some funny settings in the DNS Manager on the Demo2010A machine.

The IPs I was seeing were 192.168.150.0 and 192.168.150.1, but I am on the 10.6.0.x network, so this was alarming to me.

I called over our new IT guy and put him to work, explaining that I could access one of the applications by hostname (http://demo2010a:2010), but none of the alternate access mappings or DNS entries were working.

After some troubleshooting, we changed the DNS entries for demo2010a and demo2010b (see picture above) to 10.6.0.164 and 10.6.0.169 (the new 10.6.0.x IPs were the addresses both machines were assigned with TCP/IP set to automatic).

After I modified the DNS, I ran ipconfig /flushdns from a command prompt.

Following this, my coworker set the DNS under TCP/IPv4 to the IP of the demo2010a machine (the DNS server, 10.6.0.169).

The issue was that the DNS server was inheriting from my company's DNS on the same 10.6.0.x network, instead of the contoso DNS entries on the demo2010a machine.

After setting Demo2010A and Demo2010B's DNS to the local IP of demo2010a (10.6.0.169), the site was fixed. I can now access http://intranet.contoso.com from the demo2010a machine. I CANNOT access this on any computer whose DNS is not set to 10.6.0.169, as those computers automatically obtain my company's DNS server and settings.