
Wednesday 22 May 2019

Azure B2C Authentication for SaaS applications

Overview:  This blog post looks at setting up multiple public federation services on an Azure-based SaaS web application.  It is worth understanding that a Microsoft account (the old Passport accounts) is like a Google account and is not the same as an organisational account (an Azure AD account set up by the company's IT, e.g. paul@mycompany.com).

AAD B2C Overview
1. The client, using a browser, goes to a website URL
2. The user receives a 302 HTTP response and is redirected to Azure's B2C (AAD and Azure B2C are separate services)
3. The user is prompted to log in (assuming they don't already have a valid token)
4. After the user is authenticated, they get a valid token.
5. Using the valid token, the user's session is established on the web site.

The diagram does not show the flow past B2C; this uses passive claims-based authentication to select the user's identity provider, e.g. a Google account.  Once the user has authenticated with their Google account, they are redirected back to the B2C service, where the Google token is used to issue a B2C user token for the user, and step 4 continues.

Azure Active Directory (AAD), sometimes referred to in this context as AAD B2B
Has two types of users, namely:
1. Members - these are internal company users from an organization e.g., paul.beck@mycompany.com
2. Guests - are external users from outside our company e.g., harry@jpmorgan.com
Tip: Native member users' passwords are stored in your own AAD tenant.  Whereas a guest user, e.g. harry@jpmorgan.com, actually logs into JP Morgan's AAD; our AAD tenant sees him as a guest and issues a SAML token from us based on JP Morgan's assertion that the user is valid on their AAD tenant.
Note:  A guest user can be made a member and a member can be changed to a guest.  There is no good reason I have come across for switching guests to members (maybe two companies merging), but it is possible if you need to do it.
AAD supports the following protocols: WS-Federation, SAML-P, OIDC and OAuth2.  WS-Fed and SAML-P are still used, but go for OIDC as the default.

AAD B2C Instance:
The diagram above shows AAD (B2B) rather than B2C.  B2B is provisioned on your Azure tenant and is tied to your O365 instance.  B2C is a separate Azure service used for managing customer identities.  So if I have a website and some mobile applications, and offer an API to clients, I would use B2C rather than B2B for managing security.  You can connect multiple AAD tenants to your single B2C instance.  B2C basically allows you to connect to other identity providers using SAML, OIDC, OAuth2 and WS-Fed.  B2C also has the option to use its own local account store if the user doesn't want to connect an existing account.

If a user has a Gmail account, B2C can create an object for them in the service, but the user's password is still maintained by Google.   When accessing our applications, the user goes to the B2C service instance and is then redirected to their own IdP (Google in this example); once they authenticate, they are redirected back to the B2C service, receive a new B2C token and are redirected back to the app, which grants them access.
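To make the flow above concrete, here is a minimal sketch (not taken from this post) of wiring an ASP.NET Core web app to an AAD B2C sign-up/sign-in user flow using the Microsoft.Identity.Web package; the "AzureAdB2C" configuration section name and the policy details are assumptions and would come from your own B2C tenant.

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Identity.Web;
using Microsoft.Identity.Web.UI;

var builder = WebApplication.CreateBuilder(args);

// Reads Instance, Domain, ClientId and SignUpSignInPolicyId from the
// (assumed) "AzureAdB2C" section of appsettings.json.
builder.Services.AddMicrosoftIdentityWebAppAuthentication(builder.Configuration, "AzureAdB2C");
builder.Services.AddRazorPages().AddMicrosoftIdentityUI();

var app = builder.Build();
app.UseAuthentication();   // validates the B2C-issued token (step 4 above)...
app.UseAuthorization();    // ...and the session is established on the site (step 5)
app.MapRazorPages();
app.Run();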

Billing/Cost of AAD B2C Service:
B2C is billed on Monthly Active Users (MAU); you may have 60k users in the B2C directory but only 20k of them actively logging in using the B2C service.  Dormant/unused accounts in a calendar month are not counted.
Updated 30 Nov 2019: the first 50k MAUs are free for single-factor authentication.  It's very cheap per user after the initial 50k and gets cheaper the more users you have, i.e. 50k-100k are £0.0041 per user.  So if I have 61,000 users, the first 50,000 are free and I pay £0.0041 per MAU for the next 11,000 users, amounting to £45.10 for my additional 11k users.
Multi-Factor Authentication (MFA) is billed at £0.023 per event (think of an event as each SMS attempt, successful or failed).  So if the users use MFA and each of the 20k MAU users does 3 MFAs per month on average, the first part is free and the MFA part will cost (20,000 users * 3 attempts * £0.023 per SMS) £1,380 per month.  It's a bargain.


More Info:
Great Post from my ex colleague Deepak Srinivasan on Guest and Member AAD access
Understanding ADFS Authentication with SharePoint

Sunday 12 May 2019

Designing an API with Web API on Azure

.NET Core is great for creating a Web API project to provide an API.
Tip: Swagger is a great tool that has an editor to specify your API.  With an OpenAPI specification for your API, you can generate documentation, or use the code-generation feature to generate tests or C# code.
Tip:  Use the Swashbuckle.AspNetCore NuGet package to generate an OpenAPI specification for your API if you use a code-first approach (see the sketch below).
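A minimal code-first sketch of the Swashbuckle registration; the API title is a placeholder and the exact calls may vary slightly between Swashbuckle versions.

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.OpenApi.Models;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();
// Swashbuckle inspects the controllers at runtime and emits the OpenAPI document.
builder.Services.AddSwaggerGen(c =>
    c.SwaggerDoc("v1", new OpenApiInfo { Title = "Customers API", Version = "v1" }));

var app = builder.Build();
app.UseSwagger();      // serves /swagger/v1/swagger.json (the OpenAPI specification)
app.UseSwaggerUI();    // serves the interactive documentation page
app.MapControllers();
app.Run();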

Do not trust incoming parameters; you must validate.  Always respond with generic error messages so you don't give away back-end information.  Ensure you log all errors and review them.

Preferred High Level Design I like for Web API projects

The API needs to return status codes to let consumers/clients know how their operations went, so I design with these standard HTTP status codes:

Code - Status / Meaning
200 - OK (GET)
201 - Created (POST or PUT)
202 - Accepted.  Used for long-running operations.
204 - No Content; the resource has been deleted (DELETE)
304 - Not Modified
400 - Bad Request
401 - Unauthorized
403 - Forbidden
404 - Not Found
422 - Entity validation issue
429 - Too Many Requests (implement at the APIM layer only)
500 - Internal Server Error



  • The bold codes are the minimum set of three response codes I will use for a simple API.  The next four most important are italicised.
  • Name resources/endpoints with nouns, and I always use lowercase, e.g. /api/customers is better than the verb-noun style that was common a few years ago, e.g. /api/GetAllCustomers.  NB!  No verbs in URL naming.
  • Keep the URLs simple.
  • Always use HTTPS.
  • DateTime data is always in UTC.
  • Sensitive information must be in the header or body, not in the URL, as URLs end up in server logs, proxies and browser history.

URL/Resource and the standard HTTP methods:

/customers
  • GET (read): List all customers
  • POST (create): Add/create a new customer
  • PUT (update): Bulk update customers
  • DELETE (delete): Bulk delete, or generally error and don't implement

/customers/35
  • GET (read): Get a specific customer.  Customer 35 is John Doe
  • POST (create): N/A
  • PUT (update): Update the John Doe record
  • DELETE (delete): Delete the John Doe object/record

HTTP Methods to Support
GET - Return the current value of an object
PUT - Replace or update an object
DELETE - Delete an object
POST - Create a new object.  Return 201 for created and 202 for long-running operations.
* PATCH - is also used for updating objects, but normally a subset of the existing data
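As an illustration of the nouns-only URLs and the status codes above, here is a minimal controller sketch; the CustomerDto type and the in-memory store are purely hypothetical.

using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

public record CustomerDto(int Id, string Name);   // hypothetical resource shape

[ApiController]
[Route("api/customers")]            // noun, lowercase, no verbs in the URL
public class CustomersController : ControllerBase
{
    private static readonly Dictionary<int, CustomerDto> Store = new();   // illustration only

    [HttpGet("{id}")]
    public IActionResult Get(int id) =>
        Store.TryGetValue(id, out var c) ? Ok(c) : NotFound();             // 200 / 404

    [HttpPost]
    public IActionResult Create(CustomerDto dto)
    {
        if (Store.ContainsKey(dto.Id)) return BadRequest("Duplicate id");  // 400
        Store[dto.Id] = dto;
        return CreatedAtAction(nameof(Get), new { id = dto.Id }, dto);     // 201 + Location header
    }

    [HttpDelete("{id}")]
    public IActionResult Delete(int id)
    {
        Store.Remove(id);
        return NoContent();                                                // 204
    }
}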

Versioning
APIs need to change, but there can be multiple client applications consuming my OpenAPI.  We use versioning to ensure I don't break existing implementations.  There are different strategies for versioning APIs:
  • No versioning:   https://pbeck.co.uk/customers/35
  • URI versioning: https://pbeck.co.uk/v2/customers/35
  • Query string versioning: https://pbeck.co.uk/customers/35?version=2
  • Header versioning:  not in the URL but passed in the header (my preferred option)
APIM has a great implementation for versioning.  Generally, adding optional data is pretty safe for API versioning.  You need to give 3rd parties using your APIs time to implement the new version before deprecating old versions.

Tip: Enforce that a version is present in the header; if no version is supplied, return a 400 Bad Request (a sketch follows).
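A minimal sketch of enforcing the header version; the "api-version" header name is an assumption, and the same check could instead be done with an APIM policy at the gateway.

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();
var app = builder.Build();

// Reject any request that does not carry the (assumed) "api-version" header.
app.Use(async (context, next) =>
{
    if (!context.Request.Headers.ContainsKey("api-version"))
    {
        context.Response.StatusCode = StatusCodes.Status400BadRequest;
        await context.Response.WriteAsync("Missing api-version header.");
        return;
    }
    await next();
});

app.MapControllers();
app.Run();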

JSON
JSON property names SHOULD be camelCased.
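In ASP.NET Core with System.Text.Json, camelCase is already the default naming policy for web output; a minimal sketch making the contract explicit:

using System.Text.Json;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers()
    .AddJsonOptions(o =>
        // camelCase is the framework default; setting it explicitly documents the contract.
        o.JsonSerializerOptions.PropertyNamingPolicy = JsonNamingPolicy.CamelCase);

var app = builder.Build();
app.MapControllers();
app.Run();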

Query Params
Allow filtering using query params, e.g. https://pbeck.co.uk/customers?firstname='paul'
Allow sorting on the most common properties.
DateTime should be in UTC - the pattern for this date and time format is YYYY-MM-DDThh:mm:ss.sTZD; it is recommended to use the ISO-8601 format.  The UI does the adjustment for local time display.

Paging
Requests must have a maximum return limit, e.g. 100 items.  Clients specify the limit param if they want to change the size of the dataset returned (see the sketch below, which covers filtering, sorting and paging).
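A sketch combining the filter, sort and limit query parameters with the enforced maximum page size; the in-memory data and parameter names are assumptions.

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/customers")]
public class CustomerQueryController : ControllerBase
{
    private static readonly List<string> Names = new() { "paul", "erica", "sandy" };  // illustration only

    // GET api/customers?firstname=paul&sort=firstname&limit=50
    [HttpGet]
    public IActionResult List(string firstname = null, string sort = "firstname", int limit = 100)
    {
        limit = Math.Min(limit, 100);                     // never return more than the max page size
        var query = Names.AsEnumerable();
        if (!string.IsNullOrEmpty(firstname))
            query = query.Where(n => n == firstname);     // filtering via a query param
        if (sort == "firstname")
            query = query.OrderBy(n => n);                // sorting on a common property
        return Ok(query.Take(limit).ToList());
    }
}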

CORS
Cross-Origin Requests will need to be supported; rely on OAuth to control who can use the API (sketch below).
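A minimal CORS sketch; the policy name and allowed origin are placeholders, and OAuth still decides who may actually call the API.

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddCors(options =>
    options.AddPolicy("ApiConsumers", policy =>
        policy.WithOrigins("https://pbeck.co.uk")   // placeholder origin
              .AllowAnyHeader()
              .AllowAnyMethod()));
builder.Services.AddControllers();

var app = builder.Build();
app.UseCors("ApiConsumers");   // CORS only governs browsers; authorisation is still OAuth's job
app.MapControllers();
app.Run();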

HATEOAS
A good way to navigate around your API, and a principle worth adhering to where possible.

Authentication and Authorization
API access should be via OAuth 2.0.  It is very easy to hook APIM and the underlying App Service (Web API) up to AAD B2C.  Scopes are used for authorization (sketch below).
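A hedged sketch of validating B2C-issued bearer tokens and authorising by scope; the authority URL, audience and scope name are all assumptions.

using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        // Placeholder B2C tenant/user-flow authority and App ID URI.
        options.Authority = "https://contoso.b2clogin.com/contoso.onmicrosoft.com/B2C_1_signin/v2.0";
        options.Audience = "api://customers-api";
    });

// Authorisation via scopes: callers must present the (assumed) "customers.read" scope claim.
builder.Services.AddAuthorization(options =>
    options.AddPolicy("ReadCustomers", policy => policy.RequireClaim("scp", "customers.read")));

builder.Services.AddControllers();
var app = builder.Build();
app.UseAuthentication();
app.UseAuthorization();
app.MapControllers();
app.Run();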

Code Review/Quality
"Linting is the automated checking of your source code for programmatic and stylistic errors. This is done by using a lint tool (otherwise known as a linter)."   Examples of linting tools are ReSharper, StyleCop, SonarQube, ...

Security

It's a good idea to also have a WAF in front of your API.

Sunday 7 April 2019

Azure Active Directory, B2C and Rights

Azure Identity Management is a fairly large body of knowledge.  Basically, dividing it into different areas makes it easier to understand.

RBAC in Azure:
Azure AD and B2C both offer a way to authenticate a user through the user providing an identity.
The user is assigned to one or more groups, and then the groups (or individual users) are assigned to roles.  The diagram below shows internal and external users and how permissions can be given out, resulting in Role Based Access Control (RBAC).  The application itself deals with the operations a user can perform, but having the user's roles/claims allows the individual application to figure out what actions the user can perform.

RBAC can be assigned at 1 of these 4 levels to manage Azure Resources:

Tip: For small Azure tenants, managing resources at the resource level works well, but in most enterprises you should manage at the resource group or even subscription level to keep management controllable.
Note: There is the concept of a "Directory"; multiple "Resource Groups" are set up under a directory.  I believe all companies should have a single directory, but it is common to find even relatively small businesses with multiple directories. 
"Multiple subscriptions can trust the same Azure AD directory. Each subscription can only trust a single directory." Microsoft Docs

Thursday 4 April 2019

Adding users to all new SQL database using Azure AD groups

Problem:  I have a dedicated SQL 2017 VM on Azure that is joined to my Azure AD tenant e.g. int.contoso.com (Azure AD Domain Service).  I need a set of users to have read and write access to all databases that get provisioned on the SQL 2017 instance.

Initial Hypothesis: 
Create an Azure AD security group and add all the AAD users, then
add the AAD group to the Model database with the permissions that all new databases should have.

Resolution:
1. Using Azure AD, create a new security group (I called my group Developers) and add the users as members, Fig 1. & Fig 2.
Fig 1. Azure Portal, go to Azure AD and Groups

Fig2. Add the security group

2. Add the AAD group, e.g. int.contoso.com\Developers, to the system "Model" database; I have given the group read and write access below in Fig 3.
Fig3. Add permissions to the Model DB
3. Create a new database and validate that the new permissions are added to the new database as shown in fig4.
Fig4.
Note: Changing existing DB permissions
To add permissions to existing databases, one option is to run:
EXEC sp_MSForEachDB 'exec sp_addrolemember ''db_datareader'',''INT\paul.beck'''
T-SQL to list the databases: EXEC sp_MSforeachdb 'USE [?]; SELECT SF.name FROM sys.databases SF'


Saturday 30 March 2019

Azure Security Checklist

Overview:  Constantly improving security on Azure and Office 365 is essential to a lot of companies.  Microsoft provides outstanding infrastructure and monitoring, and it is also the company's responsibility to configure and secure O365 and Azure, balancing security with keeping services open enough for the business to operate effectively.  This post outlines some basic items to look at to optimise the balance between security and your business needs.

I generally go through the infrastructure and write up a report for management in the form of:  Finding, Recommendation and Management Comment.

Finding:  Company users log in using their Azure AD accounts, and the credentials page has no customised branding to help users know they are logging into the company's secure resources (SharePoint, email etc.)
Recommendation:  In the "Azure Portal", use the "Azure Active Directory" service > "Company Branding" to upload the company logo in banner and square format and update the colour/theme to match the firm's branding.
Management Comment: We accept the finding and wish to make the changes immediately.

Microsoft provides tooling to help identify improvements; below are two tools you can use to help clarify the current environment so improvements can be recommended.

Network Security Groups (NSGs)
NSGs are basically firewalls on Azure.  Fantastic and simple, though they can get really complex with multiple policies.  Azure gives you great tooling for Azure networks called "Network Watcher".

Wednesday 27 February 2019

MCAS overview MSIgnite London

Work in progress from MSIgniteTour London
Microsoft Cloud App Security, a cloud access security broker (CASB), helps manage Shadow IT, detect high-risk OAuth apps, and control high-risk user sessions in real time for your Office 365 environment.

Covers:
  1. Azure AD (AAD)
  2. Threat protection
  3. Information protection 
  4. SaaS e.g. box, SPO, ODfB
Shadow IT discovery:
The log collector uses a proxy or proxy logs to find the apps people are using.
It can write back to block app usage at the proxy, so you can see people using dodgy SaaS apps. Supports script generation for most devices.

OAuth apps, e.g. G Suite: attackers fake apps to get access to user info.  MCAS has a risk score for the apps used; it shows all usage so you can correct users' access.

O365 apps:
Check all apps against score:

MCAS protects for:
  • Malicious employees
  • Malware & ransomware
  • Rogue applications
  • Compromised accounts


Investigate:
Helps investigate abnormal behaviour, alerts and highlights concerns, and gives insight into user activity.
Can take action such as locking the account or requiring re-login.

File security:
Prevent sensitive info in the cloud; uses the MIP framework, which uses AIP. Shows information available on the public internet versus SaaS services the business controls.  Can also force governance on 3rd-party SaaS such as Box.

Block download of data:
Conditional access: when a user is on an unmanaged device, route the user through MCAS.  It can calculate risk and decide how they get access, e.g. an unmanaged device could force MFA.  Lots of controls: boilerplate web access, block, MFA, ...

Tuesday 26 February 2019

Microsoft Information Protection Update

As of Feb 2019, Microsoft is still using Microsoft Information Protection (MIP) and Azure Information Protection (AIP) interchangeably, as this video from Ignite Oct 2018 highlights.  Today I went to the Ignite tour and AIP and MIP were being used to mean the same topic, which I refer to as AIP in this post.

MIP is a framework that includes AIP, the AIP scanner (file shares and SharePoint on-prem.), DLP (cloud), RMS, Azure Advanced Threat Protection, MCAS (cloud), and Windows Information Protection (which integrates with and understands AIP labels); the central portal to monitor it all is the "Security and Compliance Centre" (SCC).

The screenshot from the Ignite London presentation shows where AIP is today, as presented by Maayan Nasman Rand.  The presentation was a good overview of AIP.  The big improvement to AIP over the past 3 months is the analytics/monitoring; this was not working before and is now very good, but still in preview.


  • AIP is getting closer, but I feel the big missing piece is that the encryption used by AIP does not allow SPO to provide previews and, more importantly, search cannot index the data in SPO.  Despite this key missing piece, I'd use it on O365 without encryption if the content is in a SharePoint store.   
  • The native applications' auto-labelling is improving quickly.
  • The auto-labelling feature is new and useful.
  • A few months ago, AIP labels were merged into the Security & Compliance Centre; worth noting is that if you had labels in AIP admin, you need to migrate the labels using the "Unified labelling" option, and the policies need to be manually brought into the Security & Compliance Centre.
  • Auto-labelling is now in the Mac Office suite and is also coming to the Office apps on Android and iOS (preview).
  • AIP is an add-on; new Office, Office for Mac and Android have the AIP plug-in already installed.  Applies to all Office products including Outlook, Word, Excel, PowerPoint.
  • The UI ribbon for AIP in Office on Windows has also been updated to a new look.
  • Microsoft Cloud App Security (MCAS) has scanners to perform labelling (like the AIP scanner) but also works on G Suite and Box; others are coming.
  • The AIP scanner works on file shares (CIFS) and SP2013 and SP2016 on-prem.
  • The 3rd-party product Adobe Pro does not yet have the ability to update labels, but it's coming soon (Jun 2020?).  They use the same SDK that all developers can use.  
  • The monitoring/reporting is actually working; a year back it was flaky, and the UI and find-ability are much improved.
  • A couple of preview screens shown today:



Previous AIP Posts:
AIP - Protect your companies documents (Catching up to Symantec's product quickly)
SharePoint Saturday AIP Notes

Friday 25 January 2019

Thursday 10 January 2019

NoSQL Document Database options on Azure - CosmosDB

Overview: Azure has a plethora of options for NoSQL; I used RavenDB and DocumentDB a couple of years back.  Both are easy and great tools for the right situation; DocumentDB now falls under CosmosDB as a product at Microsoft.  However, I feel that CosmosDB would be anyone's default choice today on Azure, as DocumentDB is really a feature subset of CosmosDB.

CosmosDB"Azure Cosmos DB is a global distributed, multi-model database (db) that is used in a wide range of applications and use cases. It is a good choice for any serverless application that needs low order-of-millisecond response times, and needs to scale rapidly and globally."  CosmosDB is used by Microsoft's Skype, MSN, Xbox, Office 365, Azure products.

Def: CosmosDB is a planet-scale NoSQL JSON database that supports multiple APIs (including SQL (Core)).  Multiple copies/instances around the world (think SQL AOAG).
  • Encrypted on Azure at rest and in motion.
  • Multiple APIs supported, including the SQL API (DocumentDB), Table, and others (e.g. MongoDB, Gremlin).
  • A logical breakdown of the CosmosDB APIs.
  • Partitions are managed transparently and users are routed based on geographic location and usage.
  • One write db and multiple reads, or you can set up multiple write databases.  You can set automatic failover so that if the write db is unavailable, one of the read dbs becomes the write db.
  • The CosmosDB Migration Tool is great at bringing in data from JSON, SQL, CSV, MongoDB and Amazon DynamoDB.
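A minimal sketch using the .NET SDK for the SQL (Core) API; the account endpoint, key, database, container and item shape are all placeholders.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;   // the v3 .NET SDK for the SQL (Core) API

public class CosmosQuickStart
{
    public static async Task Main()
    {
        // Placeholder endpoint and key; in practice read these from configuration.
        var client = new CosmosClient("https://<account>.documents.azure.com:443/", "<key>");
        var db = await client.CreateDatabaseIfNotExistsAsync("timesheets");
        var container = await db.Database.CreateContainerIfNotExistsAsync("entries", "/userId");

        // Write one JSON document; the partition key value must match the item's userId.
        var item = new { id = Guid.NewGuid().ToString(), userId = "paul", hours = 7.5 };
        await container.Container.CreateItemAsync(item, new PartitionKey(item.userId));

        // Point read: the cheapest operation, needs both the id and the partition key.
        var read = await container.Container.ReadItemAsync<dynamic>(item.id, new PartitionKey(item.userId));
        Console.WriteLine(read.Resource);
    }
}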
Concerns:
  • Deterministic geo-replication - it used to be one master and multiple read copies of the data.  Not all copies can be written to, but if you have country data-residency rules you can't configure data to stay within specific regions, i.e. I can't specify that certain bits of data are only stored in a specific region.  You can specify a region/location for a container, but not split a container.  (To check - not verified.)
  • Backup and recovery - point-in-time recovery requires an MS ticket to be raised.  You can't structure complex backup plans; it's a take-it-or-leave-it approach.
  • Limited LINQ support.
  • The SQL API is very limited compared to SQL relational databases, offering no joins or aggregation capability such as GROUP BY.
  • Temporal tables don't exist; there are good auditing options such as the "Change Feed", where all changes can be streamed into an external database/system.
  • Entity Framework support is limited. Consider a PoC before using it.
  • Consistency (copying data to the other read-only dbs) has 5 options: "Strong" commits to all dbs and acknowledges, so it is slower but everyone reads the same committed data; "Eventual" reads whatever is in the local db you are routed to, so it may be stale; the default is "Session".  As always, it depends on the requirement.
Tip: Amazon DynamoDB is AWS's equivalent of Azure's CosmosDB.

More Info:
NoSQL options - https://www.nebbiatech.com/2017/02/09/exploring-the-nosql-options-on-azure/ 

Monday 3 December 2018

SharePoint Online Geo-Replication SPO/O365

Geo-replication/Multi-tenancy

In mid-2018 I outlined the state of multi-geo on O365; the easier parts of geo-replication are already well handled and the changes are discussed in the link.  This post focuses on SSO options today and the likely road-map.

O365 is moving towards multi-geo tenancy that will allow multinational companies to store data in compliance with country rules.  For instance, EU data may not be allowed to be stored outside the EU, but you already have your O365 tenancy based in the US.

Historically, most larger companies have chosen either the US or the EU to base their data storage in.  If you wanted data to be stored in another region you had to buy another tenant, which Microsoft strongly discouraged.

Microsoft are working towards supporting O365 in multiple geo-locations.  Basically, there are 2 parts: 1) user-specific data (email, OneDrive), where we know where a user is based and their data is encrypted and stored in that country; and 2) group/team/country-specific data (SharePoint), where the data itself may have residency rules.

This post looks at SharePoint data that is required to be stored in a specific country.

Options today:
1. On-prem: have a SharePoint farm in each geo location; this requires a fair amount of thought to deal with SSO, Search, MMS, Content Types and the UPA.
2. O365: have multiple tenants (none are connected) in each location and connect your authentication up to each tenant.  The problem with option 2 is that each O365 tenant requires a separate Azure Active Directory.  This means that you will need to hook each O365 tenant up to a single MMS and Search service and poly-fill in the SSO process.  Imagine if you have 8 regional tenants for regulatory purposes.  To achieve SSO, you will need to create a central AAD and then connect each regional AAD to the central AAD.  Azure directory sync is needed; inviting members and guests from other companies' AADs becomes an issue.  The image below outlines a possible pattern to solve this complex problem.


Coming Q1 2019: a multi-geo tenant, which shall be the answer.  A lot of the multi-geo capability is still in preview, so it shall be interesting to see multi-geo tenancy when it goes into General Availability (GA) next year (+- Feb/March 2019).

MSIgnite tour London updates 27-Feb-19:
Brent Alinger

Sovereign geos:
US Gov
China (21Vianet)
Germany

Coming new geos: South Africa, UAE, Norway o365 data regions coming soon.  See office.com/datamaps

UK: Cardiff, London and Durham are the 3 data centres in the UK.
Note: some services such as AAD, Planner, Yammer and Sway are not UK based; they are either Europe or US based.

US has 8 data centres

You can get your default region moved, but it's difficult.

Phase 1: OneDrive and Exchange, delivered April 2018.
Phase 2: O365 Groups and SharePoint, private preview Oct 2018.  Good feedback so far.  Keen ferry, Cott dimension data.

Multi-geo is not for solving:
GDPR
Performance enhancement - rather align with the MS Global Network
Pinning data to a specific country

Cost:  $2 per month extra per user in satellite locations; go through your account manager to set it up.  Once approved it shows in the admin centre and is provisioned; it takes less than 30 days but can be as quick as 2 days.

Need a domain name per geo location for OneDrive and SPO e.g. https://emeia-radimaging.sharepoint.com

Preferred Data Location (PDL) - specified in AAD to show where a user's data is stored.  Not for travelling users, but for long-term office assignment.  Exchange Online users are seamlessly moved; ODfB requires a PowerShell command to move the user data.

Phase 2: SPO moves into GA by 30 March 2019 (confirmed).  DLP per satellite geo.  Hub sites can span multiple geos.

Aka.ms/multi-geo

Update: 2020-06-30.  Multi-geo is available in
Australia, 
Asia Pacific, 
Canada, 
European Union, 
France, 
India, 
Japan, 
Korea, 
United Kingdom, 
United States, 
United Arab Emirates, 
South Africa, and 
Switzerland.





Sunday 2 December 2018

O365 AAD - Federation B2B options

Problem: Using O365 as an Extranet.  A basic analysis before starting is a minimal requirement; the existing Extranet will make a lot of the questions fairly easy to clarify.  You can cover this in tremendous detail, but to avoid information paralysis I recommend a decision maker, preferably someone who already works on the Extranet.  A committee is cool if you have the cash, but it's so hard to guess at the future that my preference is to get the broad strokes right and amend once we are in the weeds.  These four points can be answered with the right people in a meeting, or may take months for complex organisations, especially if there is no clear leader to make decisions.

Consideration Point:
1. Who is using the Extranet?  Clients, partners, vendors, ..., or Client Users
2. How will Client and Company users authenticate? O365 options including ADFS, another federation service e.g. Ping, Passport/Live, Google, Facebook,...
3. Self-registration or known approved Client Users?  Try to figure out what the process for on-boarding your Client User will be.
4. Client User Profile Usage?  Will the client users amend content, have the ability to share permissions, or, old school, will they only read web-published pages (read-only)?  Will client users have OneDrive, use Teams, only SharePoint, or other O365 applications?

2.> O365 authentication
The most basic option is to allow O365 to have client users (guests), as long as a user has an O365 account they can be a Client User.  You can also use any Microsoft account for a client user.
Azure has a service that allows for you to connect users as guests, the user shall use their own AAD or ADFS or any federation service including Google and Facebook to authenticate.  Microsoft allows 5 guest accounts on AAD for every 1 member (licence user).

4.> Client Usage Profiles
O365 can share a document anonymously in a link within an email.  Obviously, this means anyone can potentially access the file.  However, to replace attachment in an email and wide distribution this is a great step forward, as you can control versions and retract the access at any point.  Additionally, the link settings can be customised to control who can use the link.  For example, you can set the specific people who get the link or you could specify only internal people get the link.  Once it is set to "Anyone" the email or link can be forwarded and literally anyone can get access.

Governance:  Manage O365 to apply the business's rules so users comply with governance.  O365 has an easy, straightforward configuration to make this happen.  When configuring sharing governance you need to ensure it is done at the O365, SharePoint Admin and Site Admin levels.  If one of these says no external sharing, you can't share, so it is a fairly granular approach.  This allows an Extranet and Intranet to live on the same O365 tenant.

Licensing: As a general rule, there tends to be no cost for external users, as 5 client users for every internal O365 user are allowed for the O365 extranet scenario.  Check with Microsoft, as business scenarios play out differently.

Thoughts:

  • O365 uses Azure Active Directory (AAD); there is a 1-to-1 relationship between your AAD tenant and your O365 instance.
  • External accounts can be connected as guests, e.g. another AAD tenant, Microsoft accounts (Passport), ADFS or any auth provider (SAML), Facebook, Google+, or AAD B2C (a separate service from AAD).  There is also a One Time Passcode option.

Sunday 18 November 2018

Securing SharePoint O365

Microsoft outline how they treat access to your company data, how your data is kept secure, and auditing and availability; read this post.  The information below notes possible settings and configuration to secure O365.

Azure AD is the key; ensure authentication is 100% right, e.g. MFA for some or all accounts.  Use the "Identity Secure Score" to check for possible problems.  Consider Microsoft Authenticator for MFA.

O365 Settings use:
  1. Secure Score - overview of my tenant settings and how they should be set.  Checks my tenant against MS best practices for O365. 
  2. Validate settings meet governance and are not merely defaults.
  3. Review SPO audit logs - ensure auditing is turned on (the default is to have it turned off).
  4. Security and Compliance Dashboard - good email checker/analysis.  Low value for SPO.
Cloud App Security (CAS) - a service that looks for security issues on O365 tenants, improving constantly.  CAS Overview.  Add-on, or included in E5 plans.

Office 365 Advanced Threat Protection (ATP) - service to identify threats.  "ATP analyzes content that's shared and applies threat intelligence and analysis to identify sophisticated threats.", Microsoft.

To manage document use IRM on SPO and AIP on documents.

"Azure Active Directory Identity Protection is a feature of the Azure AD Premium P2 edition that enables you to 1) Detect potential vulnerabilities affecting your organization’s identities 2)
Configure automated responses to detected suspicious actions that are related to your organization’s identities 3) Investigate suspicious incidents and take appropriate action to resolve them".  More info.


Saturday 8 September 2018

SharePoint Saturday 2018 - Cambridge

Here is my slide deck from SharePoint Saturday Cambridge 2018  Introduction to Azure Information Protection (10 MB includes recordings)

Sessions I attended:
1. PowerApps Jump Start by Sandy Ussia
I got some useful pointers in this session, Sandy presents well and focused on business/citizen developers. 
2. Office 365 Security and Compliance with Albert Hoitingh and Daniel Laskewitz
This was two sessions and amazing.  Hands-on how it works and what I need to know.  Absolutely brilliant double session.  These guys really know AIP, DLP and O365 security.  Great info in a small focused setting.
3. Managing Content in O365 with Erica Toelle
I did not know Erica, I do now!  And wow she is good, she covered O365 security center, Cloud App Security (new service looks for security on O365 tenants) and AIP.  Great knowledge, humble and so easy to talk to.
4. My presentation on AIP, I cover a few points from Erica's session, as most of the audience were in both our sessions, I skipped over the info Erica already provided.
5. Containers with Anthony Nocentino
Amazing presenter - very engaging and I learnt a lot about containers.

A great conference, well organised - the sessions info were outrageous.  The speaker's dinner in Sidney Sussex College was quite an experience.  Thanks to the organizers:  Paul Hunt, Mark Broadbent, & Andy Dawson

Tuesday 3 July 2018

Visual Studio Code - Azure functions using local Node and JavaScript Problem Solving

Problem:  When I run my function locally, I get warnings.  The local server still works, but the Functions Worker Runtime warning needs fixing.


The Visual Studio Code terminal, when debugging locally, outputs the following warnings:

"your worker runtime is not set. As of 2.0.1-beta.26 a worker runtime setting is required.
Please run `func settings add FUNCTIONS_WORKER_RUNTIME <option>` or add FUNCTIONS_WORKER_RUNTIME to your local.settings.json
Available options: dotnet, node, java"

Later it shows that Java is the default language:
"Could not configure language worker java."

Resolution:
Open the local.settings.json file and add the "FUNCTIONS_WORKER_RUNTIME" value to the Values section as shown below:  "FUNCTIONS_WORKER_RUNTIME": "node",



Problem:  When debugging my Azure Function using Node, I get the following error: "Cannot connect to the runtime process, timeout after 10000 ms - (reason: Cannot connect to the target:)." and the debugger stops.  The local server still serves the application, but I cannot debug.


Resolution: After reading many blogs, I was not having any joy.  Finally, I remembered that I had changed the port number the previous day.  In desperation, I changed the port number in the launch.json file back to its original value and the debugger started working.


Sunday 17 June 2018

Azure Powershell from VS Code

Overview:  I am moving over to using Visual Studio Code for everything, including PowerShell.  Historically, I would use PowerGUI as it was the best IDE for PS for many years, but PS ISE is excellent and I don't see a material difference these days.  Basically, I use VS Code as my IDE for JS, SPFx and C# unless the full version of Visual Studio speeds up my delivery rate; this allows me to remain in VS Code without going to PowerGUI or one of the Windows PS consoles/IDEs.

Get the VS code debugger working: 

Get the IDE (VS Code) ready
On a new VS Code install, add the VS Extension "PowerShell", the VSIX has the description "Develop PowerShell scripts in Visual Studio Code!"



Run and Verify PS is working and output returned is working

Add the Azure Account Extension
Sign into Azure
A notification pops up to authenticate the machine/laptop with your Microsoft credentials.  Run the popup and sign in to authenticate the local dev IDE.

 Open the Cloud Shell
Verify you are signed in



Wednesday 6 June 2018

SharePoint Online Replacement Patterns in Diagrams

Overview: This post highlights my default position for achieving common SharePoint solutions using SharePoint Online, Flow and Azure Functions.


Matt Wade has a great resource on the components making up O365.
https://app.jumpto365.com/

Wednesday 30 May 2018

Azure Information Protection - Protect your companies documents

Azure Information Protection (AIP) can be used to protect documents owned by your organisation to ensure they are retractable, encrypted and visible only to the correct people.


Technical High-Level Overview:  
1. When AIP is used to label a document, the document is encrypted and the permissions are saved within the document; the document needs to interact with the Azure RMS service.  
2. When the document is opened, the end user needs to authenticate and get their permissions; if they have permission, the document is decrypted and opened.

Pre-Steps to get AIP working on a Word Document:
1.> On your Azure Portal go to Azure Information Protection to Activate AIP and add labels to the global policy.
2.> On a client machine with Word/Office, install the  Azure Information Protection Client add-in (AzInfoProtection.exe).  5 min video on setting up AIP on a client and introductory information.

3.> Open a word document, and set the label on the document, this shall encrypt the docx file.





Admin Demos:
1.> Creating Labels in Azure Information Protection - 2 min (3MB)
2.> Adding Labels to AIP Policies - 2 min (2MB)



Notes:

  • Event Driven Protection
  • Auto classify 
  • Office document labels (Azure retention labels)
  • E-Discovery relook
  • Joanne-cklein.com data 
  • AIP works doc-centric: pdf and office docs anywhere
  • O365 DLP is SPO, OD4B, application level controlled

Azure Information Protection scanner for automated classification requires the AIP Premium P2 licence.
Document tracking and revocation requires either the P1 or P2 AIP licence.  The O365 E3 does not have the revocation and tracking included.



Common Issues:

Problem:  I added a new label and it is showing in Office, but when I try to set the new label I receive the error "Azure Information Protection cannot apply the label because the client isn't fully configured..."



Resolution: Give it time to propagate the update made to the labels in Azure, or use the Azure RmsAnalyzer tool to fix the client machine.

Problem:  Can't view the protected document in OWA.  
Resolution:  Protected (encrypted) documents are not available in Office Web Apps; use the Office client products such as Word.

Problem:  I can't track or revoke a file with my O365 E3 account.

Resolution:  Only the people that need to track and revoke need this capability, so you can get away with far fewer AIP P1/P2 licences than the total number of users. 


AIP Folks to follow 
Bram de Jager

Jethro Seghers
https://jethroseghers.com/category/azure/azure-information-protection/
Albert Hoitingh

Sunday 20 May 2018

Azure Helper

Azure Services - Replacing Data Centres with "Azure Virtual Networks"
There are so many different services that are constantly being changed, and new services added.  This info looks at using an "Azure Virtual Network" to replace traditional data centres.  This "Azure Virtual Network" scenario covers VMs, virtual networking (subnets and VPNs), resource groups and backups (Recovery Services vaults).

Replacement of a traditional data centre
Tip:  Virtual Networks is a service offered by Azure.  "Azure Virtual Networks" is my term referring to using Azure to host VMs that happen to use the Virtual Networks service.
  1. The hierarchy is "VM" assigned to a "VNet" that is in a "Resource Group" on the Azure tenant.
  2. A VPN creates an encrypted secure tunnel between an office location (from the router or a specific machine) directly to your VNet, allowing the office to use the VMs' internal IP addresses.
  3. Use the "Azure AD Domain Service" rather than a DC on a VM or on-prem/data centre to join machines together.
  4. A "Recovery Services Vault" allows you to set up customised policies to back up entire VMs.
Azure SQL

T-SQL to create a new login and assign permissions to a specific database using SQL Server Management Studio:
Use master
CREATE LOGIN TestReader WITH PASSWORD = 'Password';

USE AzureTimesheetDB
CREATE USER TestReader FROM LOGIN TestReader;
EXEC sp_addrolemember 'db_datareader', 'TestReader';

Add rights for the TestReader user to run a specific stored proc:
USE AzureTimesheetDB;
GRANT EXECUTE ON OBJECT::uspGetTimesheetById
    TO TestReader;
GO

Azure Virtual Desktop/ Azure VDI

Microsoft Azure Virtual Desktop (AVD), previously called Windows Virtual Desktop (WVD), is Microsoft Azure's implementation of VDI (Virtual Desktop Infrastructure).  The most common VDI I come across is Citrix Virtual Apps and Desktops (CVAD).  VDI provides a user with a remote desktop instance, so a user has their desktop apps and setup from anywhere without needing a local laptop build, i.e. they don't need a full laptop/client machine locally.  The machine is instead hosted, in AVD's case, in an Azure data centre; the user logs in with their network credentials and gets their instance to work on.  There is no need to build laptops, and it is easy to move the user between machines.  The laptop is no longer a risk, as the data is held in the data centre.  

Tags

I'm not a huge fan of tags; even in complex environments I find that naming the resources and arranging the resource groups logically pays a high return.  One exception: I apply a common "Environment" tag to all my enterprise resources.  This allows me to quickly filter for production-only or test-only resources within the Azure Portal.

updated: 2021/07/07 Azure Data Studio

Azure Data Studio can be used instead of SSMS to look at and query SQL databases. 

Wednesday 15 February 2017

MMS hybrid between SP2013 and O365 and SP2016 farms

Problem:  A lot of large enterprise customers have the Managed Metadata Service (MMS), including the Content Type Hub (CTH), that SharePoint farms subscribe to.  You are on-prem with this centralised MMS and CTH.  Now you want search to work on your O365 public tenant and to use SP2016 on-prem.  It may even be more complicated, with SP2016 installed on Azure and no direct access onto the on-prem SP2013 CTH.


Initial Hypothesis:  You want a central production MMS service that all SP farms subscribe to, but you can't subscribe from a SP2016 farm to the SP2013 central MMS service.  O365 can't subscribe to a different MMS; you need to use the Microsoft-hosted MMS and sync the term store using CSOM or a tool that uses CSOM.  When crossing domains, such as a DMZ that does not allow inbound connections, look at chaining for Content Type Hub solutions.

Restoring the MMS to another farm (also see moving the MMS database, think Prod to development workstations) is straightforward if you merely want another copy of the MMS: use Export-SPMetadataWebServicePartitionData to get the MMS info and then import it into the MMS proxy using the PowerShell Import-SPMetadataWebServicePartitionData.  The best post is here on exporting and importing while ensuring GUIDs are maintained.  Andrew Connell has a great series on MMS, and one of his posts looks at copying the MMS instance from Prod to Development.  A normal backup and restore of the Content Type Hub site collection shows how to bring a copy of the CTHub back.


In Progress....

Sunday 11 December 2016

Extranet Authentication Options for SharePoint 2013

Overview: Most large enterprises using SharePoint have implemented Extranet solutions, and these vary greatly in complexity.  Many implementations I have seen have morphed into bizarre solutions, generally due to tactical solutions implemented over time rather than poor architecture.  It is the nature of these projects to get something out, and with the rapid change in authentication over the past 5 years, tons of businesses have landed in complex scenarios.

Office 365 has grown quickly and using Office 365 is generally a good idea; however, a lot of organisations are still resistant due to a variety of concerns such as regulatory compliance and trust.  Microsoft is definitely removing these barriers, and I'd lean towards hosting the SharePoint Extranet in the cloud in the majority of situations.  The biggest barriers to moving to the cloud are Executive-level buy-in, followed by senior IT folks that are biased towards sticking to what they knew 10 years ago.   So a lot of the change is around education and providing a clear road-map.  The biggest technical hurdle will be around identity management.

Pretty much every organisation I deal with uses Active Directory, and then you may have a federation service, normally ADFS.  You may have your external users in the same AD, a dedicated DMZ AD, or any other user directory, including SQL or another LDAP provider.

Using Office 365/SharePoint Online, I need to get both my internal and external users to work with Office 365, and depending on the client setup I need to work through both scenarios and think about the ramifications.

Note:  Ramifications are: resetting user passwords, does search work for all users and where does the data reside.

Possible Options:
  • AzureAD - Azure's ACS for user accounts
  • Federated identities - use ADFS and build trust with ACS; the identity and password remain under our company's control
  • AD sync to AzureAD - think DirSync; the tooling is AD Connect
On an on-prem. SharePoint farm, the following Authentication methods are supported at the Web Application Level:

  1. Classic (Windows (Basic/NTLM/Kerberos)), 
  2. CBA - Claims Based Authentication backed by either Windows claims (Windows (Basic/NTLM/Kerberos)) or SAML claims (ADFS, SiteMinder, Ping, Thinktecture, ...),
  3. FBA - Forms Based Authentication, and 
  4. Anonymous (none)

Notes:
http://www.sharepointeurope.com/blog/2015/10/identity-management-in-a-saas-based-world