Sunday 13 September 2015

SharePoint 2013 Workflow options - notes

Overview:  There are a lot of workflow options, and each solution suits different circumstances.  In this post I look at the more common workflow options for SharePoint.  The 3 options I'm exploring are: K2 blackpearl, Nintex and SP2013 Workflow Manager.  Also note that SP2010 workflow still exists and is an option if your business already has workflows on that platform or you have this skill set available.  There are other products, but these are the mainstream options.

Each of these products has its place and suits different organisations.  This post is my opinion (I am not a workflow expert) on when I would favour one of the approaches.

Licensing:  Workflow Manager has no additional licensing costs.  Nintex has a server and CAL licensing model, and K2 has a server licensing model.

Skills:  What are your existing in-house skills?  If you already have K2 or Nintex expertise, these products become far more attractive.

Size:  How big your organisation is, how complex the workflows are, how many workflows there are and how often they change will all influence which workflow option to select.

SharePoint 2013 Workflow Manager
SharePoint 2013 introduces a new standalone workflow engine based on Windows Workflow Manager 1.0 using .NET 4.5.  In the SP2013 world, Office Web Apps (OWA) and Workflow Manager run as services separate from SharePoint 2013.
SharePoint Designer 2013:
  • Ideal for simple or medium-complexity workflow processes
  • Limited to a pre-defined set of activities and actions
  • Relatively quick and easy to configure
Custom workflow development through Visual Studio:
  • Can implement state-machine workflows
  • Supports custom actions/activities
  • Supports debugging
  • Ideal for modelling complex processes
  • Requires a developer
Workflow Manager platform:
  • High density and multi-tenancy
  • Elastic scale
  • Fully declarative authoring
  • REST and Service Bus messaging

Nintex
  • On-premises and cloud workflows, but cloud workflows do not allow custom actions
  • Nintex uses the SharePoint workflow engine
  • Easy to create Nintex workflows (good tooling) but not so easy to upgrade and maintain if complex; a proper dev environment is required if workflows need changing
  • Tight coupling with SharePoint, so upgrades need to be planned; some workflows have broken after upgrades
  • Can create custom activities, but these are limited by constraints imposed by the Nintex design surface
  • More suited to state-machine workflows using reusable custom modules and user-defined actions
  • Nintex uses its own database, which you will need intimate knowledge of when it comes to performance issues

K2
K2 is technology agnostic and best suited if SharePoint is only a part of your technology landscape; some folks consider K2 a BPM product.
Pros:
  • Off-box WF hosting: allows increasing the number of blackpearl servers with no resource overlap; flexible licensing model as it is server based
  • Well tried and tested workflow engine
  • Good reporting and troubleshooting
  • Excellent SOA layer (SmartObjects) with multiple products.  This is more an EA feature as it can be a great way to create an SOA.  Allows API to connect to custom SQL, CRM, SAP, Web Services.
  • Proven advanced tooling, good visual tooling (not as good as Nintex IMHO)
Cons:
  • Cost is relatively high, support costs are extensive, and you need to pay for dev and pre-prod licences
  • Not based on the latest MS workflow engine
  • Not easy for end users to build WF (despite marketing noise)
  • Setup and monitoring is specialised and will require advanced K2 expertise
  • Difficult to back out of the product
  • Tooling requires training and breaking out of OOTB features requires a high level of expertise and dependency on K2
  • Support's technical knowledge tended to be patchy
Updated: 2017-11-03.  Possible Extranet facing blackpearl Infrastructure design
Summary:
K2 is a good product if you need to build an SOA layer for integration and are prepared to install it correctly (cost) and maintain it.  You will need dedicated workflow people to create the workflows.  So in the right circumstances it has its place.

Updated 11 December 2019:
Microsoft Power Automate (formerly Microsoft Flow) is the default workflow option when working with the Microsoft Power Platform (Power Apps, Power Automate and Power BI).  O365, SPO and D365 can also use Microsoft Power Automate.  Azure Logic Apps is also a good option, especially if your application is C#/Azure based and not within one of the SaaS Azure offerings.

Sunday 6 September 2015

Content Type Hub and MMS Notes

This topic has been well covered and this post merely calls out items I feel are worth knowing.

Notes:


You can configure more than one MMS assigned to a Web Application.
You can have multiple CT Hubs that your sites can subscribe to.  I think the limit is 4.

After publishing, new SCs get the CTs pretty quickly, whereas updates to existing CTs in the downstream SCs take some time to permeate out.  A timer job on the subscriber Web Application goes through the SCs and gets the latest CTs (it will overwrite any local changes).  There is no granular push out (check with JJ).

Terminology for MMS (Term Group, Term Sets and Terms):

Thursday 20 August 2015

Non Functional Testing for SharePoint

Overview:  Functional requirements are the business requirements that the business defines for the application being built.  Non-functional testing is concerned with performance, reliability, scalability, recovery, load, security and usability testing.  For SharePoint it is a good idea to test this at a platform level and then verify that each individual application's non-functional testing is appropriate.

SharePoint Non Functional Testing:
All of these tests should be performed against your various SharePoint platforms, and the results will dictate the SLAs offered to the business using SharePoint as a service.  Baseline testing is a good idea, as the differences can be used to determine the efficiency of each individual application being created.

Proxies:
Fiddler is my favourite (other tools for capturing web traffic: Charles, Burp Suite, Wireshark, and the developer tools shipped with the browsers).  tcpdump and gopacket are awesome for network monitoring.

Use Fiddler to:
  • Observe traffic (HTTP/HTTPS requests, headers, ...)
  • Replay sessions
  • Evaluate performance
  • Set breakpoints
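Fiddler works by acting as a local HTTP proxy (its default is port 8888 on localhost).  As a minimal sketch of how a script could route its own traffic through such a proxy so the requests appear in Fiddler's session list (the intranet URL in the usage comment is hypothetical):

```python
import urllib.request

# Fiddler's default listening address when running locally.
FIDDLER_PROXY = "http://127.0.0.1:8888"

def make_proxied_opener(proxy_url=FIDDLER_PROXY):
    """Build a urllib opener that sends all HTTP(S) requests via the given proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(handler)

# Usage (with Fiddler running, so the session is captured for inspection/replay):
# opener = make_proxied_opener()
# response = opener.open("http://intranet.example.com/sites/test")
```

The same idea applies to Charles or Burp Suite; only the proxy port differs.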
A common misconception is the difference between performance and load testing.

Performance testing is primarily concerned with looking at typical user usage scenarios and seeing how long each page takes to load.  So a recorded script with wait times between recorded calls is useful.  It's also worth looking at the same page with minimal data and with large amounts of data.

Load testing involves recording the standard user interactions (ensure some users query heavy and some light amounts of data) with no wait times.  The norm is to multiply this concurrent number of users by 100 to estimate the number of users the farm business application can support, e.g. a concurrent load of 100 users with acceptable performance means the farm should handle 10,000 users (100 users times 100).  You run the 100 users in a steady state for a few hours.

Stress testing is similar to load testing, but you keep stepping up the number of concurrent users until the SharePoint farm starts running out of resources and throttling requests.  The point at which the system degrades and performance becomes unacceptable indicates roughly how many users the application on the SharePoint farm can support.  So if performance throttles at 170 concurrent users, the farm can handle roughly 17,000 normal-usage users.
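The rule-of-thumb arithmetic in the load and stress testing paragraphs above can be sketched as a tiny helper (the 100x multiplier is the heuristic quoted above, not an exact science):

```python
def estimated_farm_capacity(concurrent_users, multiplier=100):
    """Rule of thumb: total supported users ~ acceptable concurrent load x 100."""
    return concurrent_users * multiplier

# Load test: 100 concurrent users with acceptable performance.
print(estimated_farm_capacity(100))  # 10000
# Stress test: performance throttles at 170 concurrent users.
print(estimated_farm_capacity(170))  # 17000
```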

References:
http://www.guru99.com/non-functional-testing.html

Sunday 16 August 2015

FedAuth Notes for Problem Solving

Overview:  These are my notes on FedAuth relating to SharePoint 2013.
SharePoint (SP) 2013 uses Claims Based Authentication (CBA).  In this example, I am looking at SiteMinder (a CA product) as the federation service; this is the equivalent of ADFS (Active Directory Federation Services) acting as the Identity Provider (IdP).
Basic Flow of SP CBA Authentication:
  1. SP looks for a FedAuth cookie; if there is no FedAuth cookie for the user, it redirects the user to log in via the IdP (AAD/SiteMinder/ADFS et al.).
  2. The IdP returns a valid SAML token to SharePoint's STS.
  3. The STS generates a FedAuth cookie for the user to hold the current user's session lifespan state (to keep the user logged in).  The user holds the FedAuth cookie, not the SAML token; the FedAuth cookie is a pointer to the SAML token held in the SharePoint token cache.
The default behaviour of SharePoint is to store the FedAuth cookie on the user's disk, with a fixed expiration date.  The expiration of the FedAuth cookie can be a fixed time or a sliding session (if the user interacts with SP, the SP session is extended).  The FedAuth cookie can be stored on disk (the default) or in memory (not persisted between browser sessions).

Note:  Changing where the cookie is stored affects the way the user logs in and affects Office application login, such as Word.  Test thoroughly before changing.

Note:  Watch the IdP's expiration policy vs what you set up in SP.  As an example, you could remove a user from the IdP; however, the session is still persisted and the user can still interact with SharePoint.  From MSDN: "Make sure that the value of the LogonTokenCacheExpirationWindow property is always less than the SAML token lifetime; otherwise, you'll see a loop whenever a user tries to access your SharePoint web application and keeps being redirected back to the token issuer."

Note: Closing a browser window with the FedAuth cookie stored to disk does not invalidate the SharePoint session, so the FedAuth cookie persists after IE is closed.  However, keeping the session lifespan/FedAuth lifetime relatively short (think less than an hour) improves your security.
Note: Changing from FedAuth to session-based cookies should not be taken lightly; Office products need to be thoroughly tested and generally won't work seamlessly.

Updated 11 Feb 2019: rtFA cookie
"The root Federation Authentication (rtFA) cookie is used across all of SharePoint Online and the rtFA cookie is used to authenticate the user silently without a prompt.  So when moving from OneDrive to SPO or Admin sites, the user does not get additional prompts for login.  When a user signs out of SharePoint Online, the rtFA cookie is deleted."

References:
SharePoint Authentication and Session Management
https://msdn.microsoft.com/en-us/library/hh446526.aspx
Remote Authentication for SharePoint Online (RTFA)
Why IE and Office work together in SP
Adding, removing SP claims and managing security using claims
Logout of SharePoint
 NB!  http://blog.robgarrett.com/2013/05/06/sharepoint-authentication-and-session-management/
Check Cookie TimeOut; set by formula:
FedAuth LifeTime = IdP endpoint SAML token lifetime Duration - STS LogonTokenCacheExpirationWindow
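The cookie-timeout formula above, together with the MSDN warning about LogonTokenCacheExpirationWindow, can be expressed as a small sketch:

```python
from datetime import timedelta

def fedauth_lifetime(saml_token_lifetime, logon_token_cache_expiration_window):
    """FedAuth lifetime = IdP SAML token lifetime - STS LogonTokenCacheExpirationWindow."""
    if logon_token_cache_expiration_window >= saml_token_lifetime:
        # Per MSDN: a window >= the token lifetime causes a redirect loop to the token issuer.
        raise ValueError("LogonTokenCacheExpirationWindow must be less than the SAML token lifetime")
    return saml_token_lifetime - logon_token_cache_expiration_window

# Example: a 10-hour SAML token with a 10-minute expiration window.
print(fedauth_lifetime(timedelta(hours=10), timedelta(minutes=10)))  # 9:50:00
```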

Update 04 Feb 2016: screenshot for clarity:


My blog post on changing from FedAuth to session based cookies.  The post also shows how to examine the cookies for Internet Explorer (IE).

Sunday 9 August 2015

Request Management

I hate Request Management (RM) and I believe it is going away in SP 2016.

Issues I have seen with RM enabled:
  • The Tenant Admin API can't provision site collections (point at the WFEs and it works)
  • Workflows sometimes don't start
  • Removing RM improved performance on an extranet
  • Distributed Cache: one client had intermittent Distributed Cache issues where users had to re-authenticate via the STS.  As RM was hit first (using the Distributed Cache) and then picked one of several WFEs, the issue was twice as likely to occur.  Using the NLB to direct traffic straight to the WFEs halved the number of re-auths.

Saturday 11 July 2015

Machine Translation Service for SP2013

Overview:  I have never used Machine Translation Services (MTS) and this post is my discovery of the service.  These are my summarised notes.
  • Setup a MTS on the farm
  • Configure MTS on the farm
Notes
  • The server/servers running MTS need internet access, as they need to connect to the Microsoft Translator service.
  • Used to translate Word documents, HTML documents and plain text.
  • MTS has a single database.
  • There is a length restriction on translations, so long Word documents won't translate.  This can be amended in your MTS configuration; 500,000 characters is the default maximum translation length.
  • Full APIs: server-side object model, CSOM and REST APIs.
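As a trivial illustration of the default length restriction, a pre-check before submitting text to MTS might look like this (the function name is my own for illustration, not part of the MTS API):

```python
# Default maximum translation length; amendable in the MTS configuration.
DEFAULT_MAX_TRANSLATION_CHARS = 500_000

def within_translation_limit(text, limit=DEFAULT_MAX_TRANSLATION_CHARS):
    """Return True if the text is short enough for MTS to translate."""
    return len(text) <= limit

print(within_translation_limit("hello world"))  # True
print(within_translation_limit("x" * 600_000))  # False: a long document would fail
```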
More Info:
https://technet.microsoft.com/en-gb/library/jj553772.aspx
http://blogs.technet.com/b/wbaer/archive/2012/11/12/introduction-to-machine-translation-services-in-sharepoint-2013.aspx
http://blogs.msdn.com/b/mvpawardprogram/archive/2013/08/05/overview-of-sharepoint-2013-multilingual-features.aspx
 

Saturday 4 July 2015

Provisioning Site Collections on-prem using the Tenant Admin API

Problem: The ability to provision site collections without using server-side code, using CSOM and the Tenant Admin API.  This is a follow-on to the post: Provisioning Site Collections using CSOM (read it 1st).  Thanks to Sachin Khade, Frank M (check) and Alex N R (check), who gave me their time to help me understand this: https://sachinkhade.wordpress.com/
I have reduced the Tenant Admin process into the least amount of steps that works.


The steps are:
Perform the following against an existing Web Application by running the PS script below:
  1. Create a SC using the team site template STS#0
  2. Set the AdministrationSiteType to TenantAdministration
  3. Add a ProxyLibrary that registers the TenantAdmin server stub dll
  4. Attach the proxy to the existing Web Application
  5. Enable self-service site creation on the Web Application
  6. IISReset
Then, using a C# console app, create new site collections via the Tenant Admin API.
PS Script

========
# The first section contains the variables you need to specify based on your needs
$webapp =  get-spwebapplication http://radimaging.co.uk:555 # My Web application (already exists)
$url = "http://radimaging.co.uk:555/sites/msotenantcontext" # Tenant Admin Site Collection used for provisioning (does not exist)
$WebsiteName = "Tenant Admin"
$WebsiteDesc = "Tenant Admin Site"
# Better to use the tenant admin site template "tenantadmin#0"; using the team site template "sts#0"
# causes an error msg (SubscriptionId can't be null). Both work, but you get fewer admin options
# for provisioning.
$Template = "STS#0"
$PrimaryLogin = "radimaging\psmith"
$PrimaryDisplay = "Paul smith"
$PrimaryEmail = "paul.smith@radimaging.com"
# Create a site collection and top level website
New-SPSite -Url $url -Name $WebsiteName -Description $WebsiteDesc -Template $Template -OwnerAlias $PrimaryLogin -OwnerEmail $PrimaryEmail
$web = Get-SPWeb $url
$web.CreateDefaultAssociatedGroups($web.site.owner,$web.site.secondaryowner,"")
$web.Dispose()


#Set the TenantAdmin SC
$site = get-spsite -Identity $url
$site.AdministrationSiteType = [Microsoft.SharePoint.SPAdministrationSiteType]::TenantAdministration
$newProxyLibrary = New-Object "Microsoft.SharePoint.Administration.SPClientCallableProxyLibrary"
$newProxyLibrary.AssemblyName = "Microsoft.Online.SharePoint.Dedicated.TenantAdmin.ServerStub, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
$newProxyLibrary.SupportAppAuthentication = $true
$webapp.ClientCallableSettings.ProxyLibraries.Add($newProxyLibrary)
$webapp.SelfServiceSiteCreationEnabled=$True
$webapp.Update()
Write-Host "Successfully added TenantAdmin ServerStub to ClientCallableProxyLibrary."
# Reset the memory of the web application
Write-Host "IISReset..."   
Restart-Service W3SVC,WAS -force
Write-Host "IISReset complete."


As always, check out this post from Vesa Juvonen:
https://blogs.msdn.microsoft.com/vesku/2014/06/09/provisioning-site-collections-using-sp-app-model-in-on-premises-with-just-csom/