Sunday, 16 December 2018

SharePoint Online Property Bag SPWeb Properties are not indexed by default

Problem:  Moving an on-prem SharePoint solution to SPO, I realised that SPO does not automatically index property bag values.

Initial Hypothesis:  The Search schema looked correct and had automatically created the expected Managed Properties.  I asked our Microsoft representative, and they sent us a link explaining how to get property bag values into the search index.

Resolution: Be aware that you need to run a few PowerShell commands against your tenant and site collections when using SharePoint Online to make property bag values appear in search results; a minimal sketch follows the link below.

More Info:
https://blog.kloud.com.au/2018/04/26/how-to-make-property-bag-values-indexed-and-searchable-in-sharepoint-online/
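A minimal PowerShell sketch of the idea, assuming the (legacy) SharePoint PnP PowerShell module; the site URL, key and value are placeholders:

# Connect to the target site collection (PnP PowerShell module assumed)
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/MySite" -UseWebLogin

# Note: on modern sites you may first need to allow custom script,
# e.g. Set-SPOSite -Identity <url> -DenyAddAndCustomizePages $false (SPO Management Shell)

# Write the property bag value and mark it so the crawler indexes it
Set-PnPPropertyBagValue -Key "MyCustomProperty" -Value "MyValue" -Indexed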

Saturday, 15 December 2018

ShareGate User Migration Gotcha

Problem:  Migrated an Extranet site with a large user base where multiple users have the same name.  When a user has been removed from AD and the migration is run to the new farm, the migration automatically resolves a different user with the same name and gives them the leaver's permissions.

Example:
John Smith (john.smith@contoso.com) has been added to a site collection.
John Smith (@contoso) is removed from AD but still exists in the site collection permissions.
Ran Sharegate to move the content including user permissions to a new farm.
John Smith is added to the same SharePoint groups; however, the permissions have been granted to a different John Smith (john_smith@clientA.com).

Initial Hypothesis: ShareGate tries to resolve the user and ends up matching on the user's display name rather than the account in AD.  As the original user has left the firm, another user with the same display name is resolved and we end up with inconsistent permissions.

I got this reply from ShareGate and can see that my issue happens at step 8.

"How Sharegate resolves users from the source to the destination"

"We look at the whole account name available, for matches to users at the destination through the SharePoint people picker.
Once we have a list of potential matches for your user, we go through the list of values below (in the specified order). We consider the account a match when we find the same values for one of these properties:
1.    Exact same account name
2.    Same normalized account name (without claims header)
3.    Same login and domain
4.    Same login
5.    Same login and domain (source login read from display name - this can happen when importing from file system because the account name is set as the display name)
6.    Same login (source login read from display name - this can happen when importing from file system because the account name is set as the display name)
7.    Same email address
8.    Same display name
9.    PrincipalType is not set or is a Security Group and same display name without domain"

Somewhat related:
https://sharegate.com/blog/unresolved-user-when-preserving-created-modified-sharepoint-migration

Monday, 3 December 2018

SharePoint Online Geo-Replication SPO/O365

Geo-replication/Multi-tenancy

In mid-2018 I outlined the state of multi-geo on O365; the easier parts of geo-replication are already well handled and the changes are discussed in the link.  This post focuses on SSO options today and the likely road-map.

O365 is moving towards multi-geo tenancy that will allow multinational companies to store data in compliance with country rules.  For instance, EU data may not be allowed to be stored outside the EU, yet your O365 tenancy may already be based in the US.

Historically, most larger companies have chosen either the US or the EU as the base for their data storage.  If you wanted data stored in another region, you had to buy another tenant, which Microsoft strongly discouraged.

Microsoft are working towards supporting O365 in multiple geo-locations.  Basically, there are 2 parts: 1) user-specific data (email, OneDrive), where we know where a user is based and their data is encrypted and stored in that country; and 2) group/team/country-specific data (SharePoint), where the data itself may have residency rules.

This post looks at SharePoint data that is required to be stored in a specific country.

Options today:
1. On-prem.: Have a SharePoint farm in each geo location; this requires a fair amount of thought to deal with SSO, Search, MMS, Content Types and the UPA.
2. O365: Have multiple tenants (none are connected) in each location and connect your authentication up to each tenant.  The problem with option 2 is that each O365 tenant requires a separate Azure Active Directory.  This means you will need to hook each O365 tenant up to a single MMS and Search service and poly-fill in the SSO process.  Imagine if you have 8 regional tenants for regulatory purposes.  To achieve SSO, you need to create a central AAD and then connect each regional AAD to the central AAD.  Azure directory sync is needed, and inviting members and guests from other companies' AADs becomes an issue.  The image below outlines a possible pattern to solve this complex problem.


Coming Q1 2019: a multi-geo tenant, which shall be the answer.  A lot of the multi-geo capability is still in preview, so it shall be interesting to see multi-geo tenancy when it goes into General Availability (GA) next year (circa Feb/March 2019).

MSIgnite tour London updates 27-Feb-19:
Brent Alinger

Sovereign geos:
US Gov
China (21Vianet)
Germany

New geos coming soon: South Africa, UAE and Norway O365 data regions.  See office.com/datamaps

UK: Cardiff, London and Durham are the 3 data centres in the UK.
Note: some services such as AAD, Planner, Yammer and Sway are not UK-based; they are either Europe- or US-based.

US has 8 data centres

You can get your default region moved, but it's difficult.

Phase 1: OneDrive and Exchange, delivered April 2018.
Phase 2: O365 Groups and SharePoint, private preview Oct 2018.  Good feedback so far from early preview customers (notes: Keen ferry, Cott, Dimension Data).

Multi-geo is not for solving:
GDPR
Performance - for that, rather align with the MS Global Network
Pinning data to a specific country

Cost: $2 extra per user per month in satellite locations; go through your account manager to set it up.  Once approved, it shows in the admin centre and is provisioned; it takes less than 30 days but can be as quick as 2 days.

Need a domain name per geo location for OneDrive and SPO e.g. https://emeia-radimaging.sharepoint.com

Preferred Data Location (PDL) - set in AAD to specify where a user's data is stored.  Not for travelling users, but for long-term office assignment.  Exchange Online users are moved seamlessly; ODfB requires PowerShell to move the user data (see the sketch below).
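A hedged sketch of what that PowerShell looks like (cmdlets per the multi-geo docs; only available on multi-geo-enabled tenants, and the UPN and geo code are placeholders):

# Set the user's Preferred Data Location in AAD (MSOnline module)
Set-MsolUser -UserPrincipalName "user@contoso.com" -PreferredDataLocation "GBR"

# Move the user's OneDrive content to the new geo (SharePoint Online Management Shell)
Start-SPOUserAndContentMove -UserPrincipalName "user@contoso.com" -DestinationDataLocation "GBR"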

Phase 2: SPO multi-geo confirmed to reach GA by 30 March 2019.  DLP per satellite geo.  Hub sites can span multiple geos.  

Aka.ms/multi-geo

Update: 2020-06-30.  Multi-geo is available in
Australia, 
Asia Pacific, 
Canada, 
European Union, 
France, 
India, 
Japan, 
Korea, 
United Kingdom, 
United States, 
United Arab Emirates, 
South Africa, and 
Switzerland.





Sunday, 2 December 2018

O365 AAD - Federation B2B options

Problem: Using O365 as an Extranet.  A basic analysis before starting is a minimal requirement, and the existing Extranet will make a lot of the questions fairly easy to answer.  You can cover this in tremendous detail, but to avoid analysis paralysis I recommend a single decision maker, preferably someone who already works on the Extranet.  A committee is cool if you have the cash, but it's so hard to guess at the future that my preference is to get the broad strokes right and amend once we are in the weeds.  These four points can be answered with the right people in a single meeting, or may take months for complex organisations, especially if there is no clear leader to make decisions.

Consideration Point:
1. Who is using the Extranet?  Clients, partners, vendors, ..., or client users?
2. How will client and company users authenticate?  O365 options including ADFS, or another federation service e.g. Ping, Passport/Live, Google, Facebook, ...
3. Self-registration or known, approved client users?  Try to figure out what the process for on-boarding your client users will be.
4. Client user profile usage?  Will client users amend content and have the ability to share permissions, or, old school, will they only read published web pages (read-only)?  Will client users have OneDrive, use Teams, only SharePoint, or other O365 applications?

2.> O365 authentication
The most basic option is to allow O365 to have client users (guests); as long as a user has an O365 account they can be a client user.  You can also use any Microsoft account for a client user.
Azure has a service (AAD B2B) that allows you to connect users as guests; the user authenticates with their own AAD, ADFS or any federation service, including Google and Facebook.  Microsoft allows 5 guest accounts on AAD for every 1 member (licensed user).  A minimal sketch of inviting a guest is below.
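A sketch assuming the AzureAD PowerShell module; the guest's address, display name and redirect URL are placeholders:

# Requires the AzureAD module
Connect-AzureAD

# Invite an external user as a guest; they authenticate with their own identity provider
New-AzureADMSInvitation -InvitedUserEmailAddress "jane@partner.com" `
    -InvitedUserDisplayName "Jane Partner" `
    -InviteRedirectUrl "https://tenant.sharepoint.com/sites/Extranet" `
    -SendInvitationMessage $true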

4.> Client Usage Profiles
O365 can share a document anonymously via a link in an email.  Obviously, this means anyone can potentially access the file.  However, as a replacement for attachments in email and wide distribution, this is a great step forward, as you can control versions and retract access at any point.  Additionally, the link settings can be customised to control who can use the link.  For example, you can set the specific people who get the link, or you could specify that only internal people can use it.  Once it is set to "Anyone", the email or link can be forwarded and literally anyone can get access.

Governance: Manage O365 to apply the business's rules so users comply with governance.  O365 has an easy, straightforward configuration to make this happen.  When configuring sharing governance, you need to ensure it is done at the O365, SharePoint Admin and Site Admin levels.  If one of these says no external sharing, you can't share, so it is a fairly granular approach.  This allows the Extranet and Intranet to live on the same O365 tenant; a sketch of the relevant settings is below.
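A sketch of the tenant- and site-level switches using the SharePoint Online Management Shell (URLs are placeholders; pick the SharingCapability values that match your governance):

Connect-SPOService -Url "https://tenant-admin.sharepoint.com"

# Tenant (SharePoint admin) level: allow sharing with authenticated external users only
Set-SPOTenant -SharingCapability ExternalUserSharingOnly

# Site level: lock down a specific intranet site collection while the extranet stays shareable
Set-SPOSite -Identity "https://tenant.sharepoint.com/sites/Intranet" -SharingCapability Disabled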

Licensing: As a general rule, there tends to be no cost for external users, as 5 client users for every internal O365 user are allowed in the O365 extranet scenario.  Check with Microsoft, as business scenarios play out differently.

Thoughts:

  • O365 uses Azure Active Directory; there is a 1-to-1 relationship between your tenant AAD and your O365 instance.
  • External accounts can be connected as guests, e.g. another AAD tenant, Microsoft accounts (Passport), ADFS or any auth provider (SAML), Facebook, Google+, or AAD B2C (a separate service from AAD).  There is also a One Time Passcode option.

Sunday, 18 November 2018

Securing SharePoint O365

Microsoft outline how they treat access to your company data, how your data is kept secure, and auditing and availability; read this post.  The information below notes possible settings and configuration to secure O365.

Azure AD is the key; ensure authentication is 100% right, e.g. MFA for some or all accounts.  Use the "Identity Secure Score" to check for possible problems.  Consider Microsoft Authenticator for MFA.

O365 Settings use:
  1. Secure Score - overview of my tenant settings and how they should be set.  Check my tenant against MS best practices for O365. 
  2. Validate settings meet governance and are not merely defaults.
  3. Review SPO audit logs - ensure auditing is turned on (the default is off); see the sketch below this list.
  4. Security and Compliance Dashboard - good email checker/analysis.  Low value for SPO.
Cloud App Security (CAS) - a service that looks for security issues on O365 tenants, improving constantly.  CAS Overview.  Add-on, or included in E5 plans.
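A sketch of turning the audit log on for point 3 above, assuming you are connected to Exchange Online PowerShell (the unified audit log is enabled from there, not from SPO):

# Check whether the unified audit log is ingesting events
Get-AdminAuditLogConfig | Select-Object UnifiedAuditLogIngestionEnabled

# Turn it on if it is still off
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true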

Office 365 Advanced Threat Protection (ATP) - service to identify threats.  "ATP analyzes content that's shared and applies threat intelligence and analysis to identify sophisticated threats.", Microsoft.

To manage documents, use IRM on SPO and AIP on the documents themselves.

"Azure Active Directory Identity Protection is a feature of the Azure AD Premium P2 edition that enables you to 1) Detect potential vulnerabilities affecting your organization’s identities 2)
Configure automated responses to detected suspicious actions that are related to your organization’s identities 3) Investigate suspicious incidents and take appropriate action to resolve them".  More info.


SAML, OAuth and OpenID Connect

Rough Notes - Fix

Overview: SAML has been around for a long time (it dates back to 2002) and is still widely used for authentication and authorisation of end users on the Internet.  OAuth2 is used to allow internet users to give internet/web-based applications access to the user's information without the user's password.  OpenID Connect is an extension to OAuth2 and generally the way to go instead of SAML for user authentication.

Azure Active Directory Supports:
  1. SAML 2.0
  2. OAuth 2.0,
  3. OIDC, and
  4. WS-FED.
SAML:
  1. SAML is an umbrella standard that covers federation, identity management and single sign-on (SSO); and
  2. SAML is an open standard for exchanging authentication and authorization data between parties, in particular between an identity provider and a service provider.  SAML is an XML-based markup language for security assertions.
Limitations of SAML:
1. It was launched in November 2002; it supports SSO but has now been deprecated by a lot of auth suppliers.
2. SAML is not supported by native mobile applications; hybrid mobile apps can work with it.
3. As it is now deprecated by many suppliers, you may find it is not supported by the latest applications and technologies. 

OAuth 2.0:
1. Derived from OAuth, OAuth2 is the successor protocol and is considered more secure.
2. OAuth (Open Authorization) is a standard for authorization of resources.  It does not deal with authentication.  It was released in 2006.
Limitations of OAuth:
1. It only deals with authorization, so we have the limitation that we cannot verify the user's identity, i.e. authentication.


OpenID Connect:
  1. OpenID Connect (OIDC) is a protocol to verify user identities and get user profile information.  OIDC enables devices/apps to verify identities based on the authentication done by an authentication server.
  2. It was launched in February 2014.
  3. OpenID Connect (OIDC) is an authentication layer on top of OAuth 2.0, an authorization framework.
  4. OpenID Connect is built on top of OAuth 2.0, specifies a RESTful HTTP API, and uses JSON as a data format.  It has a specialised set of predefined data types and endpoints for exchanging user information between the identity provider and the application.
  5. There are 4 different OIDC flows; pick the appropriate one, normally the authorization Code flow (a rough sketch of its first leg is below).
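A rough PowerShell sketch of the Code flow's first leg against AAD's v2.0 endpoints; the tenant, client id and redirect URI are placeholders/assumptions:

$tenant   = "contoso.onmicrosoft.com"
$clientId = "00000000-0000-0000-0000-000000000000"
$redirect = [uri]::EscapeDataString("https://localhost/signin-oidc")

# The browser is sent to the authorize endpoint; AAD authenticates the user
# and returns ?code=... to the redirect URI, which the app swaps for tokens
$authorizeUrl = "https://login.microsoftonline.com/$tenant/oauth2/v2.0/authorize" +
                "?client_id=$clientId&response_type=code&response_mode=query" +
                "&redirect_uri=$redirect&scope=openid%20profile&state=12345"
Start-Process $authorizeUrl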
Advantages of OpenID Connect:
1. It supports SSO and federation.
2. It has good support in .NET Core.
3. It supports a wide range of clients, such as web applications, mobile apps and JavaScript applications.
4. It is supported by Azure AD B2C, per Microsoft guidance, and by other providers such as Google.

Comparing Differences:
  1. https://www.gluu.org/resources/documents/articles/oauth-vs-saml-vs-openid-connect/
  2. https://stackoverflow.com/questions/7699200/what-is-the-difference-between-openid-and-saml.
  3. https://www.quora.com/What-is-the-difference-between-OAuth-OpenID-and-OpenID-Connect

Support with Azure b2c
  1. https://docs.microsoft.com/en-us/azure/active-directory-b2c/active-directory-b2c-setup-oidc-idp
Web sign-in with OpenID Connect in Azure Active Directory B2C
  1. https://docs.microsoft.com/en-us/windows-server/identity/ad-fs/development/enabling-openid-connect-with-ad-fs
  2. https://docs.microsoft.com/en-us/aspnet/core/security/authentication/ws-federation?view=aspnetcore-2.2

Using Postman to Generate an OAuth Token Example


Using the Generated Bearer Token in a GET Request
Tip:  To examine a Bearer token use the website https://jwt.io 
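If you would rather script it than use Postman, a hedged sketch of the same two steps using the client credentials grant (tenant, client id/secret and the target API are placeholders):

$tenant = "contoso.onmicrosoft.com"
$body = @{
    grant_type    = "client_credentials"
    client_id     = "00000000-0000-0000-0000-000000000000"
    client_secret = "<app secret>"
    scope         = "https://graph.microsoft.com/.default"
}

# Step 1: request the OAuth token
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenant/oauth2/v2.0/token" -Body $body

# Step 2: use the Bearer token in a GET request
Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/v1.0/users" `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }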

Saturday, 8 September 2018

SharePoint Saturday 2018 - Cambridge

Here is my slide deck from SharePoint Saturday Cambridge 2018  Introduction to Azure Information Protection (10 MB includes recordings)

Sessions I attended:
1. PowerApps Jump Start by Sandy Ussia
I got some useful pointers in this session, Sandy presents well and focused on business/citizen developers. 
2. Office 365 Security and Compliance with Albert Hoitingh and Daniel Laskewitz
This was two sessions and amazing.  Hands-on how it works and what I need to know.  Absolutely brilliant double session.  These guys really know AIP, DLP and O365 security.  Great info in a small focused setting.
3. Managing Content in O365 with Erica Toelle
I did not know Erica, I do now!  And wow she is good, she covered O365 security center, Cloud App Security (new service looks for security on O365 tenants) and AIP.  Great knowledge, humble and so easy to talk to.
4. My presentation on AIP, I cover a few points from Erica's session, as most of the audience were in both our sessions, I skipped over the info Erica already provided.
5. Containers with Anthony Nocentino
Amazing presenter - very engaging and I learnt a lot about containers.

A great conference, well organised - the session content was outrageously good.  The speaker's dinner in Sidney Sussex College was quite an experience.  Thanks to the organizers: Paul Hunt, Mark Broadbent & Andy Dawson

Wednesday, 1 August 2018

JSLink on Modern Sites

Problem:  I want to use JSLink/CSR to amend a view on a list for a client on SharePoint Online/O365, and I can't find the JSLink property on the List View Web Part to perform simple conditional formatting on a column.  JSLink was brilliant for achieving this task in SP 2013 and 2016.
Hypothesis: Modern sites use the modern List View Web Part, which is a new implementation of the LVWP.  https://techcommunity.microsoft.com/t5/Microsoft-SharePoint-Blog/Introducing-SharePoint-modern-list-library-web-parts/ba-p/64581  I assume one can pragmatically still use JSLink, but on "modern" pages this needs to be verified.

Resolution: Use "Column Formatting" as outlined here: https://docs.microsoft.com/en-us/sharepoint/dev/declarative-customization/column-formatting#get-started-with-column-formatting





Sunday, 15 July 2018

Power Platform Notes - Power Apps, CDS, Power BI & Power Automate

Power Apps

Variables:
Microsoft Docs "PowerApps and Excel both automatically recalculate formulas as the input data changes".
  • Contextual variables - scoped at a screen level
  • Global variables - scoped app level
  • Fx> Set(MyUniqueClientNo, 12)
Functions:
Fx> UpdateContext({MyTimesheetId: 34})
Tip from Shane Young:  Note the variable may end up holding a reference rather than a value, so for a control use:
Fx> UpdateContext({MyTimesheetId: txtTimesheetId.Text}) and not
Fx> UpdateContext({MyTimesheetId: txtTimesheetId}) unless you want the variable to follow the control's current value

To pass a variable to another screen, use the Navigate overload on the OnSelect property of a button:
Fx> Navigate(Screen2, ScreenTransition.None, {TSvar: MyTimesheetId})
MyTimesheetId is a contextual variable.

If Statement (e.g. set on the Text property of lblAns):
Fx> If(MyUniqueClientNo = 12, "Yes", "No")

ALM:
Power Apps has its own source control, and you can manually export and import entire projects.  Power Apps Build Tools allows for DevOps integration.

Licencing:
Updated: Power Apps Licencing Summary as of 30 December 2019
"To run standalone apps, guest users need the same license as users in your tenant."  Microsoft Docs.  
  • PowerApps using AAD B2B user (both members and guest) using standalone Power Apps shall require the Power Apps licence (not necessarily Portal Apps).  Steps to Share Power Apps with Guests
  • SharePoint users interacting with a Power Apps form on a SharePoint list do not require a Power Apps licence.
Coding Standards for Power Apps:
Microsoft provide a whitepaper as a suggestion for using naming/coding standards.

Power Apps Portals:
Build responsive websites using CDS.  Allow anonymous access or implement multiple Identity Providers (IdPs) such as AAD B2C, AAD, Google+ and LinkedIn.

Updated: 28 July 2018:
Common Data Service (CDS): Comes from Dynamics CRM Sales; entities are used much like content types (CTs) in SharePoint.
  • Based on Azure SQL and CosmosDB, with a nice API that supports REST/OData v4;  
  • Has row- and field-level RBAC and uses AAD for authentication;
  • Allows Power BI, Power Apps (previously labelled PowerApps) and Flow (Power Automate - the new name since Ignite 2019) to work with CDS;
  • Use Power Query or data flows to import data easily;
  • Multiple data sources such as SQL Server, Cosmos DB, files, Excel, Search, and 3rd parties such as SAP;
  • CDS is not part of O365 licencing (including E5); you need a PowerApps P1 or P2 plan (now Power Apps per-user, as licencing changed circa Oct 2019).  Note the Power Apps licencing included with O365 is extremely limited and, for instance, does not cover CDS;
  • One environment ties to one CDS database;
  • Any type of data; any type of app;  
  • Easy to use; secure.
https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/data-platform-intro

Examples: 
Insert or Update using Patch to CDS (note Patch and Defaults must reference the same data source):
Patch(
        'Vote',
        Defaults('Vote'),
        {
            userid: UserID,
            vote: ThisItem.Vote,
            votedate: Now()
        }
    );
Select from CDS:
UpdateContext( { locVotes: LookUp('Vote', votedate = <date>) });

Power Automate

Updated 11 December 2019:
Microsoft Power Automate (previously called Microsoft Flow) is a user-friendly workflow tool/service: a simple workflow engine sitting on Azure Logic Apps.
The basic flow types are:
  1. Automated Flow - triggered by an event, e.g. an item is created; the basic starting point;
  2. Button (Instant) Flow - manually triggered by a user, e.g. a button click;
  3. Scheduled Flow - in effect, timer jobs;
  4. Business Process Flow - ensures users enter data consistently (think Windows Presentation Framework) and follow the same steps every time; and
  5. UI Flow (preview) - provides RPA capabilities.
https://docs.microsoft.com/en-us/power-automate/get-started-logic-flow

Building Blocks for Power Automation are:
  • Trigger - what starts the workflow; can be a manual trigger or an automated trigger;
  • Actions - the steps the flow performs, e.g. create an item or send an email;
  • Conditions;
  • Data operations; 
  • Expressions - basic operations like adding two numbers together or making a string upper case; and
  • Flow Type - see the 5 flow types above.
Display directions using a Map on an Image control in PowerApps:
https://dev.virtualearth.net/REST/v1/Imagery/Map/road/Routes/driving?waypoint.1=51.525503,-0.0822229&waypoint.2=SE9%204PN&mapSize=600,300&key=<key>
"https://dev.virtualearth.net/REST/v1/Imagery/Map/road/Routes/driving?waypoint.1=" & EncodeUrl(txtDriverLocation.Text) & "&waypoint.2=" & EncodeUrl(txtDriverLocation.Text) & "&mapSize=600,300&key=AsR555key

Tuesday, 3 July 2018

Visual Studio Code - Azure functions using local Node and JavaScript Problem Solving

Problem:  When I run my function locally, I get warnings.  The local server still works, but I want to fix the Functions Worker Runtime issue.


When debugging locally, the Visual Studio Code terminal outputs the following warnings:

"your worker runtime is not set. As of 2.0.1-beta.26 a worker runtime setting is required.
Please run `func settings add FUNCTIONS_WORKER_RUNTIME <option>` or add FUNCTIONS_WORKER_RUNTIME to your local.settings.json
Available options: dotnet, node, java"

Later it shows that Java is the default language:
"Could not configure language worker java."

Resolution:
Open the local.settings.json file and add the "FUNCTIONS_WORKER_RUNTIME" value as shown below:  "FUNCTIONS_WORKER_RUNTIME": "node",
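Alternatively, the warning's own suggestion does the same edit for you via the Azure Functions Core Tools, run from the project folder:

# Writes FUNCTIONS_WORKER_RUNTIME into local.settings.json
func settings add FUNCTIONS_WORKER_RUNTIME node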



Problem:  When debugging my Azure Function using Node, I get the following error: "Cannot connect to the runtime process, timeout after 10000 ms - (reason: Cannot connect to the target:)." and the debugger stops.  The local server still serves the application, but I cannot debug.


Resolution: After reading many blogs, I was not having any joy.  Finally, I remembered that I had changed the port number the previous day.  In desperation, I changed the port number in the launch.json file back to its original value and the debugger started working.


Sunday, 17 June 2018

Azure Powershell from VS Code

Overview:  I am moving over to using Visual Studio Code for everything, including PowerShell.  Historically, I would use PowerGUI, as it was the best IDE for PS for many years, but PS ISE is excellent and I don't see a material difference these days.  Basically, I use VS Code as my IDE for JS, SPFx, C# and PS unless the full version of Visual Studio speeds up my delivery rate; this allows me to remain in VS Code without going to PowerGUI or one of the Windows PS consoles/IDEs.

Get the VS code debugger working: 

Get the IDE (VS Code) ready
On a new VS Code install, add the VS Extension "PowerShell", the VSIX has the description "Develop PowerShell scripts in Visual Studio Code!"



Run a PS command and verify that it works and output is returned

Add the Azure Account Extension
Sign into Azure
A notification pops up to authenticate the machine/laptop with your Microsoft credentials.  Follow the popup and sign in to authenticate the local dev IDE.

 Open the Cloud Shell
Verify you are signed in
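To double-check the sign-in from the integrated terminal, a quick sketch assuming the Az PowerShell module is installed (older setups use the equivalent AzureRM cmdlets):

# Sign in (skip if the Azure Account extension already authenticated the session)
Connect-AzAccount

# Confirm which account and subscription the session is using
Get-AzContext
Get-AzSubscription | Select-Object Name, Id, State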



Wednesday, 6 June 2018

Geo-replication in SharePoint and SPO to the rescue

Geo-Replication on SharePoint (Not covering email or OneDrive)

Problem: Over the past 7 years, I have worked with a few clients that require some form of geo-replication of their SharePoint farms.  Geo-replication is normally needed for compliance.  This post assumes you need to geo-replicate; it does not cover why you need to geo-replicate.

Tip: Geo-replication can be used for performance, but given the complexity it brings, I feel any performance gain is merely a bonus and geo-replication should not be undertaken for performance alone; there are easier, more pragmatic answers to performance such as Riverbed devices, caching and CDNs, to name a few.

Initial Hypothesis:  Large organisations exist in multiple geographic regions and need to abide by country regulations and, often, other industry standards, which brings the need for geo-replication capability.  I recently completed several high-profile projects for a big four consultancy that needed to ensure SharePoint data does not leave its jurisdiction, depending on its metadata.  Building on-prem SharePoint farms for this was extremely complex, and the 3 big services that needed to be centralised or copied are Search, MMS and the Content Type Hub.  There are more, like AAD, but for my situation I needed to be able to have multiple SharePoint farms in specific regions that connected to centralised services.

Thoughts: MS has OneDrive and the email piece working in local geographies.
SharePoint is coming with multi-geo tenancy, and users will get unified search results across geographic regions.
  1. Search: each tenant holds its own index, not a central index - "good news for data location compliance".  MS intermingles all the search results using federation, so they appear as one ordered result set drawn from multiple geo indexes.  
  2. Profile Services (used to be the UPS) gets core fields from the central AAD, and local fields are stored at a tenancy level (good news).  
  3. Taxonomy (MMS) is replicated downwards from the central MMS.
  4. Each tenant has its own Content Type Hub (I never liked this); the CTHub uses a star topology to push content types from the central tenant to the regional tenants so the copies, including GUIDs, are identical.
SPO to the Geo-Rescue (coming soon, in pre-beta/private preview as of 6 June 2018):
  • SPO is implementing multi-geo across O365 like O365 previously did for OneDrive; you can specify where sites get created, i.e. region/country.  Each region has its data centres specified, and the URL of the site clearly indicates where it is hosted.
  • The search index is kept in-country and federated up to the central tenant for a seamless search experience across multiple regional tenants.
  • The central taxonomy is automatically replicated to the regional tenants.  MMS uses a star topology to distribute and keeps GUIDs in sync.
  • The UPA holds only key data centrally, and each region holds additional properties (good for GDPR and other DPA regulations).
  • AAD shall be controlled centrally, and I believe AADs have regional copies.  * Each O365 tenant has its own AAD today; this will be the big change to facilitate SSO.
RoadMap:
OneDrive is multi-geo now.  It is offered to large enterprises only, and you must have a certain number of users.
Circa Q1 2019 SharePoint will offer multi-geo.

http://blog.sharepointsite.co.uk/2013/08/stretched-farms-geo-replication-and.html

SharePoint Online Replacement Patterns in Diagrams

Overview: This post highlights my default position for achieving common SharePoint solutions using SharePoint Online, Flow and Azure Functions.


Matt Wade has a great resource on the components making up O365.
https://app.jumpto365.com/

Wednesday, 30 May 2018

Azure Information Protection - Protect your company's documents

Azure Information Protection (AIP) can be used to protect documents owned by your organisation, ensuring they are retractable, encrypted and visible only to the correct people.


Technical High-Level Overview:  
1. When AIP is used to label a document, the document is encrypted and the permissions are saved within the document; the document needs to interact with the Azure RMS service.  
2. When the document is opened, the end user needs to authenticate and get their permissions; if they have permission, the document is decrypted and opened.

Pre-Steps to get AIP working on a Word Document:
1.> On your Azure Portal go to Azure Information Protection to Activate AIP and add labels to the global policy.
2.> On a client machine with Word/Office, install the Azure Information Protection client add-in (AzInfoProtection.exe).  5 min video on setting up AIP on a client and introductory information.

3.> Open a Word document and set a label on it; this shall encrypt the docx file.
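Step 1 (tenant activation) can also be done from PowerShell; a minimal sketch assuming the AIPService module (older tenants ship the same cmdlets in the AADRM module under slightly different names):

# Requires the AIPService module
Connect-AipService

# Activate Azure Information Protection / Azure RMS for the tenant
Enable-AipService

# Confirm the activation status
Get-AipService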





Admin Demos:
1.> Creating Labels in Azure Information Protection - 2 min (3MB)
2.> Adding Labels to AIP Policies - 2 min (2MB)



Notes:

  • Event Driven Protection
  • Auto classify 
  • Office document labels (Azure retention labels)
  • E-Discovery relook
  • Joanne-cklein.com data 
  • AIP works doc-centric: pdf and office docs anywhere
  • O365 DLP is SPO, OD4B, application level controlled

Azure Information Protection scanner for automated classification requires the AIP Premium P2 licence.
Document tracking and revocation requires either the P1 or P2 AIP licence.  The O365 E3 does not have the revocation and tracking included.



Common Issues:

Problem:  Added a new label and it is showing in Office, but when I try to set the new label I receive the error "Azure Information Protection cannot apply the label because the client isn't fully configured..."



Resolution: Give it time to propagate the update made to the labels in Azure, or use the Azure RmsAnalyzer tool to fix the client machine.

Problem:  Can't view on OWA.  
Resolution:  Protected encrypted documents are not available in Office Web Apps, use the Office products such as Word.

Problem:  I can't track or revoke a file with my O365 E3 account.

Resolution:  Only people that need to track need this capability so you can get away with far fewer licences than the number of users. 


AIP Folks to follow 
Bram de Jager

Jethro Seghers
https://jethroseghers.com/category/azure/azure-information-protection/
Albert Hoitingh

Sunday, 27 May 2018

SharePoint Framework Notes

As the SPFx is progressing and changing rapidly, I shall try to update this page as time goes by.  I have been dabbling with the SharePoint Framework (SPFx) for a few months and went to a day workshop with Andrew Connell (AC) on SPFx at the SharePoint Conference 2018 North America on 20 May 2018.  I would definitely recommend attending Andrew Connell's training (I have gone to a lot of workshops and presentations over the years and he is excellent).  I am not an expert, but these notes are my summary of items to be aware of.

Last Updated:15 June 2018
  • To use SPFx on-prem. with SP2016, you need to have Feature Pack 2.  SP2016 only supports SPFx web parts, not extensions.  SP2019 will be behind SPO, but it shall have all the updates circa May 2018 when it is released circa Sept-Oct 2018. 
  • It is safer to use SPFx on modern pages rather than classic SP pages.
  • Development can be done on any laptop with any editor.
  • Either build Web Parts in the local or O365 (/_layouts/15/workbench.aspx) Workbench.

  • What you need: 1. Node.js, 2. npm, 3. Yeoman, 4. gulp, 5. webpack (used to check and load dependent JS modules).  AC suggests, for simplicity, install and forget about Node.js, Yeoman, gulp and webpack: you'll use them but you don't really need to understand them.
  • Language-wise, use JavaScript, or you can use TypeScript, which obviously compiles down into normal JS but makes it easier to program (e.g. type-ahead/IntelliSense).
  • Use NVM (allows for multiple versions of Node.js; you may have clients on different versions and NVM lets you keep multiple Node.js versions on a machine) and use the LTS (Long-Term Support) versions: v8.11.2 or v8.9.4.
  • Install the following pre-reqs using npm: Yeoman, gulp and the SPFx template for Yeoman scaffolding, namely @microsoft/sharepoint… (see the commands below).
  • VS Code makes a good editor; I think Mark Rackley has built a VS template that will do all the scaffolding instead of using Yeoman.
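The usual install-and-scaffold sequence, as a sketch run from a normal terminal (package and command names per the SPFx docs):

# One-off global installs: Yeoman, gulp and the SharePoint Yeoman generator
npm install -g yo gulp @microsoft/generator-sharepoint

# Scaffold a new web part project, then run it in the local workbench
yo @microsoft/sharepoint
gulp trust-dev-cert   # first run only: trust the local dev certificate
gulp serve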
SPFx Equivalency Comparison:

SPFx Tool | C# WSP Tool | Description
Node.js | .NET | Used to run npm and compile the SP package (*.sppkg) using gulp and webpack.  Runs a local server to use the tooling.
npm | NuGet | Downloads 3rd-party packages/frameworks, e.g. jQuery or Angular.
Yeoman | Visual Studio | Generates the basic SPFx web part files, same as a template built using VSIX in VS.  Ensures you have all the basic parts to build an SPFx web part.
gulp | MSBuild or F5 | Builds the package.
webpack | n/a | Checks dependent files are included in the package.  AC explained it as shaking the tree (removes unnecessary JS libraries and ensures required libraries are included).

Full-trust (FTC) development options available on SP2016 on-prem. that have no equivalent in SPO/SPFx:
  • WSP
  • Timer jobs
  • Custom Service Applications
  • Event Handlers
References:
https://www.voitanos.io/

Background:
Node.js - Allows you to create a web server and compile JS on the server side.  Its 2 main functions for SPFx are: 1) it is needed for the local development workbench, and 2) Node.js has npm (a package manager) built in; npm, gulp and webpack are then used to build the packages (like we did with MSBuild for WSPs).
webpack - build tool that manages code.  Manages styles and JS files.

Sunday, 20 May 2018

Visual Studio Code - IDE Tips

VS Code - Short cuts
Ctrl + S = Save the current page
Alt+ Shift + A = Comment out multiple lines
Ctrl + / = Comment out a single line of code, same cmd to toggle the comment off

VS Code is perfect for comparing two files.  Right-click the first file > Select for Compare, then right-click the second file > Compare with Selected.

Azure Helper

Azure Services - Replacing Data Centres with "Azure Virtual Networks"
There are so many different services that are constantly being changed and new services added.  This info looks at using an "Azure Virtual Network" to replace traditional data centres.  This "Azure Virtual Network" scenario covers VM's, Virtual Networking (subnets and VPN's), Resource Groups and backups (Recovery Service vaults).

Replacement of a traditional data centre
Tip:  Virtual Networks is a service offered by Azure.  "Azure Virtual Networks" is my term for using Azure to host VMs that happen to use the Virtual Network service.
  1. The hierarchy is: a "VM" is assigned to a "VNet", which is in a "Resource Group" on the Azure tenant.
  2. A VPN creates an encrypted, secure tunnel between an office location (from the router, or a specific machine) directly to your VNet, allowing the office to use the VMs' internal IP addresses.
  3. Use the "Azure AD Domain Service" rather than a DC on a VM or on-prem/data centre to connect machines together.
  4. A "Recovery Services Vault" allows you to set up customised policies to back up entire VMs.
Azure SQL

T-SQL to create a new login and assign permissions to a specific database using SQL Server Management Studio:
-- Run against the master database: create the server-level login
USE master
CREATE LOGIN TestReader WITH PASSWORD = 'Password';

-- Run against the user database: create the user and add it to the read-only role
USE AzureTimesheetDB
CREATE USER TestReader FROM LOGIN TestReader;
EXEC sp_addrolemember 'db_datareader', 'TestReader';

Add rights to the TestReader user to run a specific stored procedure:
USE AzureTimesheetDB;   
GRANT EXECUTE ON OBJECT::uspGetTimesheetById  
    TO TestReader;  
GO 

Tags

I'm not a huge fan of tags; even in complex environments, I find that naming the resources and arranging the resource groups logically pays a high return.  The one exception is a common "Environment" tag on all my enterprise resources, which allows me to quickly filter for production-only or test-only resources within the Azure Portal.

Thursday, 3 May 2018

TLS 1.2 and SharePoint on-prem.

Problem:  By default, SharePoint 2016 and SP 2013 use TLS 1.0 for their communication protocol (think SSL version number).  A lot of companies and partners are now insisting that your internal SharePoint farms support TLS 1.2 and not the earlier versions.  For various compliance frameworks (PCI DSS, HIPAA) you need to ensure you no longer use TLS 1.0 or 1.1 and only support TLS 1.2 (and, soon, TLS 1.3).  The problem is that earlier versions have potential vulnerabilities and can lead to various attacks, including being susceptible to man-in-the-middle attacks.

TLS/SSL Basics:

Three Areas of Concern:
  1. SharePoint farm (upgrade TLS/SSL across the farm; TLS 1.2 supported on SP2010 & 2013 since circa Oct 2016) - the biggest lift.
  2. CSOM client VM (ensure the client VM can send TLS 1.2) - jump boxes, browsers.
  3. Ensure the communicating app supports TLS 1.2 (either .NET 4.6.2-compiled apps, which default to TLS 1.2, or programmatically enforce the TLS order/version).
Our business need has three parts:
1. SharePoint farm upgrade through the DTAP environments, plus testing.
2. The calling application needs to use TLS 1.2 as the default and can potentially fall back.  This includes IE/Chrome, calling CSOM code (either fix works: the .NET 4.6.2 upgrade or programmatically enforcing TLS) and PowerShell (see the sketch below this list).
// C# CSOM: programmatically set the accepted security protocols (TLS 1.2 preferred)
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls | SecurityProtocolType.Ssl3;
3. If your tools are on a server, ensure outbound TLS traffic is allowed.  TLS settings affect both inbound and outbound traffic; override the outbound settings in the registry if you need to.
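For the PowerShell side of point 2, a one-liner sketch run at the top of a script, equivalent in intent to the C# line above:

# Force TLS 1.2 for the current PowerShell session before making CSOM/REST calls
[System.Net.ServicePointManager]::SecurityProtocol = [System.Net.SecurityProtocolType]::Tls12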

Validate SSL SharePoint Web Front End:
Use PowerShell to check the WFE or an external tool like HTBridge.
Here is a great tool to look at your HTTPS web server/endpoint setup externally: https://www.htbridge.com/ssl

Tip: Removing TLS1.0 and 1.1 needs to be thoroughly tested as there are numerous dependencies such as OWA, Workflow, CSOM, Internal Comms, SQL Server.

Wednesday, 11 April 2018

HTTP 400 bad request response

Problem:  I have an old SharePoint 2013 custom application that is partially loading.  The application has not encountered this error in the several years it has been running.  It is only happening for 1 user out of thousands and occurs in both Chrome and IE.  In the IE developer toolbar I can see that some requests return 200 responses and some return 400 responses from the web servers.  The SP WFEs are load balanced, and all WFEs are returning the 400.

Initial Hypothesis:  Only 1 user has the issue.  Some URL requests work and others fail (return 400 errors) on the same WFE.  The user still fails on a different machine and with a different browser, so the user is forming a malformed request.  It appears to be a problem specific to this user against a particular site collection and is likely to be in the HTTP request headers.
Use the browser settings, Fiddler or the dev toolbars to get the error details, i.e.
HTTP 400 – Bad Request - The size of the request headers is too long.
Alternatively, use the IE browser and turn off friendly HTTP error messages to confirm whether the issue is that the HTTP request headers are too long.

Possible Resolution: Look at the request headers; they may be too long for the WFE to handle.  As making the headers smaller is generally not an option, look to increase the request header size that IIS/http.sys allows (HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\HTTP\Parameters); a hedged sketch is below.  As this is a production issue and I can't replicate it in a lower environment, I used a hosts entry so my offending user only accesses the single WFE where the fix is applied.  By using the NLB and updating the registry on one WFE first, I can ensure the fix works without disrupting my user base.
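A hedged sketch of that registry change on the WFE (the values shown are the documented http.sys maximums; validate and test the limits for your farm):

$params = "HKLM:\System\CurrentControlSet\Services\HTTP\Parameters"

# Raise the http.sys limits for individual header size and total request size
New-ItemProperty -Path $params -Name "MaxFieldLength"  -Value 65534    -PropertyType DWord -Force
New-ItemProperty -Path $params -Name "MaxRequestBytes" -Value 16777216 -PropertyType DWord -Force

# Reboot the WFE (or restart the HTTP service) for http.sys to pick up the new values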
See https://www.grouppolicy.biz/2013/06/how-to-configure-iis-to-support-large-ad-token-with-group-policy/