Monday 30 December 2019

Power Apps Licensing

Overview:  Licensing has changed a fair amount over the past 3 years for Power Apps and Flow.  This post summarises my understanding of licensing on the Power Platform as of Dec 2019.

Update 4 Nov 2021: Power Apps licensing was halved in price in 2020.  A few days ago at the Microsoft Ignite conference, a 3rd pricing option for Power Apps, "Pay-as-you-go", was announced.  The other two options are the "Per User" and "Per App" plans mentioned below.

Types of Licences:
1. O365 and D365 licensing: Basic Power Apps licensing comes with these plans, but you only get a small subset of connectors (missing the Azure SQL and CDS connectors).
2. Per-User or Per-App Plans: There are no more Power Apps P1 and P2 plans; you need to use either the "per-user" or the "per-app" Power Apps licensing plan.
3. Community Plan:  Allows a developer/creator the full set of functionality, free of charge, but the apps can't be shared.

"To run standalone apps, guest users need the same license as users in your tenant."  Microsoft Docs.  
  • Power Apps uses AAD B2B (both members and guests) 
  • Using standalone Power Apps requires a Power Apps licence (not necessarily Portal Apps).  Steps to Share Power Apps with Guests
  • SharePoint users interacting with a Power Apps form on a SharePoint list do not require a Power Apps licence, as they already have O365.
Updated: 29 May 2020 
Problem:  I cannot import Power App solution (zip) files into other DTAP Power App Environments.  The "Import" button is disabled.
Initial Hypothesis:  My trial per-user Power Apps licence expired, and most functionality was still working with my built-in O365 Power Apps licence.  The administrator could not get my old licence renewed or add a Power Apps per-user/per-app licence.
Resolution:  The Ops team assigned me a Dynamics 365 licence, and Import functionality is working again.



Thursday 12 December 2019

PL-900 Microsoft Power Platform Fundamentals Beta Exam

Overview:  This morning, I took the PL-900 beta exam; it is seriously tough.  I don't think I passed, but let's see when the results come out 2 weeks after the beta closes.

My Thoughts: The exam covers three products: Power BI, Power Apps & Power Automate (formerly called Microsoft Flow), but is extremely wide-ranging on the underlying Microsoft technology.  You really need to have worked in detail with the products, as the questions were not straightforward and often combine multiple related services/products.

What I learnt:  You need to know the 3 products well, and how they interact.  My connectors knowledge could be better, CDS comes up a lot, and my Dynamics 365 knowledge is lacking.  There were some Cognitive Services questions that I was not ready to deal with.  It is a good test of knowledge that has helped me realise I have holes in my knowledge of the Power Platform.

The exam itself:  Microsoft exams become easier as you learn how they ask questions; I found some of the language and questions difficult to follow.  Working out what is actually being asked can be tricky, but this is pretty standard in a lot of Microsoft certification exams.

I did the exam remotely using Pearson VUE software, which is great.  No need to book and go to a test centre.  They check your environment and identification and watch you throughout.  I got a warning for reading aloud, which until that point had been helpful as the wording of the questions is not the easiest to understand, but it makes sense, as people could record the questions as audio for cheating with brain dumps.

Summary:
  • I am convinced that I failed PL-900.  
  • I learnt a lot preparing for the exam and it was a good experience.
  • I need to look at Dynamics & Cognitive Services in more detail.
  • Remote exams are awesome and secure the way they are done; just don't talk to yourself during the exam.
  • I've missed taking Microsoft and other certifications.

Tuesday 3 December 2019

Microsoft Teams Governance

Update 2020/07/07:  I recently watched a presentation by Rupert Squires on MS Teams Governance that provides a good introduction to Teams governance and thoughts for your MS Teams projects: https://www.slideshare.net/RupertSquires/positive-governance-in-5-steps-for-microsoft-teams

Storage: MS Teams stores data in SharePoint, Exchange and OneDrive for Business (OD4B).  Each team has its own dedicated SharePoint site.  Chats are stored in the sender's OD4B, and images are stored in Exchange.  Teams will provision in the location where the tenant was set up.  In the O365 Admin centre, you can see where your instance of Microsoft Teams is located; mine is in the EU zone.


If data location is important to you for compliance reasons, it's important you select the correct region.  I favour the EU zone as Europe is pretty strict on data and pretty central globally to my user bases, but it does come down to your clients' needs.

All data is encrypted at rest and in transit, and Teams data is stored in the Microsoft data centre for your region.

MS Teams Configuration: The Teams admin centre allows a great degree of control in granting different users different rights.  You can easily turn off features for groups of people to align with your company's governance.  There are a few dials to help you get granular control.

O365 Group Naming Policies are good for controlling access.  They allow for a common, easy understanding of groups and what each group has access to.

Sensitivity Labels/Azure Information Protection - very useful

Team Creation Thoughts - Allowing anyone to create teams results in a complex scenario that has to be governed and brought back under control.  Too much restriction results in people not using Teams and potentially using shadow IT to achieve their goals.  Privacy must be appropriate (public vs private); don't allow people to just go for public because it sounds more open.  Each team should have a purpose and is likely to have an end date (not always).  Don't just follow the org structure.

Teams is not a replacement for all tools, but it works best if you work out the right tool for each job; often Teams can replace a variety of tools in large enterprises.

Teams should be using Teams instead of email; this is generally how one can tell if Teams is being well implemented.  For me, I want to use Teams for all communications and project-related file storage, and email should be a last resort.  Remove Skype if you have Teams.  Consider removing Zoom, WebEx, GoTo... you have Teams, so use it as the default chat, calls and meeting tool.  Schedule meetings from Teams if possible so they sit in the correct team; the meeting is therefore context based.
Don't allow people to add any app without thinking about it; the exception is possibly small companies.

MS Teams ALM: Ensure that teams are deleted when no longer needed.  ShareGate's Apricot tool is great for getting Teams under control.  Archive before deleting an MS team if you require the data or it has to remain available.  Owners can delete a team; the team goes into the recycle bin for the default set period (30 days maybe).  After this it will be gone, including the underlying storage data.

Note: When you delete a team all the underlying corresponding data is also deleted from SharePoint, Exchange and OneDrive.

MS Teams Series:
https://www.pbeck.co.uk/2019/12/microsoft-teams-governance.html
https://www.pbeck.co.uk/2020/05/microsoft-teams-overview.html
https://www.pbeck.co.uk/2020/06/multi-geo-for-ms-teams.html

Web API hosted on Azure App Service with OIDC security using Azure AD B2C

Problem:  I want to add security to my .NET Core ASP.NET Web API C# application using Azure AD B2C.

Terminology:
  • .NET Core - a revision of the .NET Framework.  It allows your application to run on Linux, macOS and Windows.  You do not need to have the full .NET Framework installed.   
  • ASP.NET Web API - Follows the MVC pattern, using controllers and models to provide HTTP services, e.g. "give me the temperature in Paris today".
  • Azure App Service - Hosts an MVC application or Web API on Azure.  It acts as a web server and is scalable and fully managed.
  • Azure Active Directory (AAD) B2C - AAD B2B is different to AAD B2C; they are totally separate services on Azure.  Business to Consumer (B2C) provides applications with an identity repository.  B2C provides authentication and identity management as a service to web and mobile applications.  Think of it like Google authentication, except you own the identity provider instead of relying on third-party authentication providers like Google.
  • IdP - Identity Provider; B2C is one of the 2 AAD services for managing users/identities on Azure.
  • MVC - Model, View, Controller is a pattern used to arrange software.  In this post I'm referring to projects that use the MVC templates to create a website or Web API project.



Problem: An MVC web application hosted on an Azure Web App uses Azure AD B2C; B2C holds the users and also uses a social Identity Provider (IdP), namely Google.

Figure 1, Create a new project on the Google Developer Console

Figure 2, OAuth Consent Screen setup
Figure 3, Add the Credentials to Google
AAD B2C linkup to Google IdP.

High-Level Approach:
  1. Create your own Azure tenant & B2C service instance on Azure (using the Azure Portal)
  2. Register your ASP.NET Web application on the Azure tenant (using the Azure Portal)
  3. Create User Flows (Policies) on the B2C tenant (these allow you to create the flows for a user to sign in, create a new account, or reset their password, ...)
  4. Setup Google to connect to the B2C IdP (see figure 1-3)
  5. Update AAD B2C with the Google application created in Step 4 so that B2C is aware of the Google IdP
  6. Perform the authentication setup - create the MVC web application using Visual Studio (a rough sketch of the Web API side is shown after the tip below)
Tip: This approach and technology is extremely well documented by Microsoft.
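To give a feel for the Web API side of the authentication setup, here is a minimal sketch of validating B2C-issued tokens in an ASP.NET Core Web API.  This is not the exact code from the post; it assumes the Microsoft.AspNetCore.Authentication.JwtBearer package, and the tenant name, user flow and client ID below are made-up placeholders.

// Startup.cs of the ASP.NET Core Web API - illustrative sketch only.
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
            .AddJwtBearer(options =>
            {
                // B2C authority: https://{tenant}.b2clogin.com/{tenant}.onmicrosoft.com/{user flow}/v2.0
                options.Authority = "https://contosob2c.b2clogin.com/contosob2c.onmicrosoft.com/B2C_1_signupsignin/v2.0";
                // Application (client) ID of the Web API app registration in the B2C tenant.
                options.Audience = "00000000-0000-0000-0000-000000000000";
            });

        services.AddControllers();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseRouting();
        app.UseAuthentication();   // validate the incoming bearer token
        app.UseAuthorization();    // enforce [Authorize] attributes on controllers
        app.UseEndpoints(endpoints => endpoints.MapControllers());
    }
}

Controllers (or individual actions) marked with [Authorize] will then reject requests that do not carry a valid B2C token.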


Monday 2 December 2019

Box.com to SharePoint migrations & O365 improvements

Box.com Tenant to SPO Tenant migrations:  Mover has been bought by Microsoft and allows moving files from Box.com, Dropbox and others into SharePoint (it's going to be free).

  • Fast Track has a tool that can do Box.com migrations, but the Mover tool is a great option.  
  • Mover allows for incremental updates; permissions are roughly copied.  
  • File versions are not copied, only the latest file.  
  • Look out for file size and file types in the source.  
  • Look out for file names; O365 has restrictions on special characters, and Mover will remove them from the destination file name, e.g. *,/<>.:
  • Original Box.com timestamps are kept
  • I didn't need to move custom properties from Box into SharePoint metadata, and I can't see this functionality (I did read that ProvenTeq do metadata migrations).
  • A scan of the source data and a spike/PoC is a good idea so you know what limitations you will face.
  • Mover is a good starting place for the technology to migrate from Box onto SharePoint Online.
  • ProvenTeq have a Box.com to SharePoint migration tool - I haven't used it.
  • MigrationWiz can migrate from Box.com to SharePoint (I've never used it).  
  • Use "Box embedded widget" to access Box content from inside SharePoint.
  • Custom code for migrations: Box.com and SharePoint have great REST APIs, but you'll need to manage the throttling, changes/updates and permissions yourself (a rough sketch of handling SharePoint throttling is shown after this list).  Use a tool unless there is no suitable one.  
  • The same common-sense approach as outlined a few years ago for Lotus Notes to SharePoint on-prem. migrations is essential.
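To illustrate the throttling point above (this is not a recommendation to hand-roll a migration), custom code calling the SharePoint REST API would need retry handling along these lines; the URL, access token and retry policy are simplified placeholders.

// Sketch of calling the SharePoint REST API with basic 429/503 (throttling) handling.
using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class SpoRestClient
{
    private static readonly HttpClient Http = new HttpClient();

    public static async Task<string> GetWithRetryAsync(string url, string accessToken, int maxRetries = 5)
    {
        for (var attempt = 0; attempt <= maxRetries; attempt++)
        {
            using var request = new HttpRequestMessage(HttpMethod.Get, url);
            request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
            request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

            var response = await Http.SendAsync(request);

            // SharePoint Online throttles with 429 (or 503) and usually supplies a Retry-After header.
            if ((int)response.StatusCode == 429 || response.StatusCode == HttpStatusCode.ServiceUnavailable)
            {
                var wait = response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(Math.Pow(2, attempt));
                await Task.Delay(wait);
                continue;
            }

            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
        throw new Exception("Request was still being throttled after all retries.");
    }
}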
Key SharePoint Online limits have been increased, namely:
  • The 500k site collections per tenant limit has been raised to 2 million site collections per tenant
  • The SharePoint Online & OD4B maximum file upload size has increased to 100 GB
Sensitivity Labels: AIP, DLP, Teams - a unified labelling experience across O365.  Sensitivity labels will be added to documents imported from Box.com when using an O365 E5 licence. 

  • Sensitivity labels work with teams now!  
  • Private channels are now available.
  • Sensitivity labels therefore allow for governance and protecting your teams and team channels.

OD4B: Allows external upload-only permissions.  Really useful given the number of large clients that still only support SFTP/SSH or, worse, FTP to share files.  A very quick, low-risk, locked-down way of getting files from clients and partners.

O365 tenant-to-tenant migrations: There seems to be a lot happening in this area, but right now it's a case of using the tools and FastTrack.

Plus Addressing in Exchange Online: Exchange Online offers plus addressing.  It allows you to sign up to newsletters and identify if your email address is sold on.  E.g. my address is paul@radimaging.co.uk; I sign up to a newsletter as paul+news365@radimaging.co.uk, and if that address is sold on, I know news365 passed on my details.
https://office365.uservoice.com/forums/273493-office-365-admin/suggestions/18612754-support-for-dynamic-email-aliases-in-office-36


Friday 29 November 2019

Redis Cache

Caching in Microsoft technologies has many forms and options.  You can have server side cache, client side cache or downstream caching such as proxy servers.

Server Cache in .NET:
  • In-memory store: This type of server cache historically was very useful for maintaining state, tying users to a collection of objects/knowledge, and it is extremely fast.  You need to think about multiple instances and how to route traffic, e.g. you can use sticky routing to ensure user Joe's session always goes to server 3.  But if server 3 dies, you lose all the cached data.  
  • Local storage cache works well for larger amounts of data that are too big for an in-memory cache, but as it is storage based it is much slower.
  • Shared/centralised caching allows a shared cache; a good example is using SQL Server for caching, where the user can go to any front-end web server and the cache is pulled from SQL.  It allows for failure, with no need to perform fixed routing of user requests (sticky sessions).  It is slower, as the cache is pulled from a central service that in turn goes into SQL to retrieve the data.
  • Azure Redis Cache is a form of shared caching.  A shared cache is better if held in memory, which is what Redis does.  Redis Cache is a service optimised for performance that allows stateless caching to work across server instances.  It needs to travel over your network in Azure, so it is not as fast as a local cache, but it is extremely fast for centralised caching.  Redis is pretty advanced and has clustering and replication to ensure performance is good.  A sketch of using it from C# is shown after the link below.
https://docs.microsoft.com/en-us/azure/architecture/best-practices/caching
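As a minimal sketch of the cache-aside pattern against Azure Cache for Redis (assuming the StackExchange.Redis client; the connection string, key name and database call are placeholders):

// Cache-aside sketch using StackExchange.Redis against Azure Cache for Redis.
using System;
using System.Threading.Tasks;
using StackExchange.Redis;

public class ProductCache
{
    // Reuse one multiplexer for the whole application; the connection string is a placeholder.
    private static readonly Lazy<ConnectionMultiplexer> Connection =
        new Lazy<ConnectionMultiplexer>(() =>
            ConnectionMultiplexer.Connect("mycache.redis.cache.windows.net:6380,password=...,ssl=True"));

    public async Task<string> GetFeaturedProductAsync()
    {
        IDatabase cache = Connection.Value.GetDatabase();

        // 1. Try the shared cache first - any web server instance can serve this.
        string product = await cache.StringGetAsync("featured-product");
        if (product != null) return product;

        // 2. Cache miss: load from the slow data source (e.g. SQL) ...
        product = await LoadProductFromDatabaseAsync();

        // 3. ... and store it with a short expiry so all instances reuse it.
        await cache.StringSetAsync("featured-product", product, TimeSpan.FromMinutes(10));
        return product;
    }

    private Task<string> LoadProductFromDatabaseAsync() =>
        Task.FromResult("Product of the hour"); // placeholder for the real database call
}

Because the cache lives in Redis rather than in a single web server's memory, any front-end instance can serve the cached value and no sticky sessions are required.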

Client Side Cache:
  • The browser can hold information in session, but it is not secure, or at least less secure than a server-side cache.
  • CDNs are a way of retrieving data from an optimised central store, but are only useful for static files and data.
  • Adding caching headers to HTTP responses allows for downstream caching.  For example, I offer a REST API (e.g. a C# Web API) that returns a featured product that changes hourly.  I could easily set expiry caching to 10 minutes; the product is then fetched at most every 10 minutes per user and added to the user's local cache.  So if an average user is on for 20 minutes, only 2 of their 10 requests hit the REST API; the other 8 calls are served locally.  Coupled with server caching, the requests that do reach the server are fast, with very few calls to the back-end database, and the relatively static data causes far less demand on the web service.
  • Validation caching - client caching that stores data locally, but a request is sent to the server to ask if there has been a change; if there has, the new data is sent back.  If the data has not changed, a 304 response is returned and the browser uses the previously stored local cached data.  A sketch of both expiry and validation caching is shown after this list.
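To illustrate the expiry and validation caching described above, here is a rough sketch of a C# Web API controller action that sets a Cache-Control expiry and answers If-None-Match with a 304.  The featured-product lookup and the ETag scheme are simplified placeholders.

// Sketch of expiry + validation caching headers on an ASP.NET Core Web API controller.
using Microsoft.AspNetCore.Mvc;
using Microsoft.Net.Http.Headers;

[ApiController]
[Route("api/[controller]")]
public class FeaturedProductController : ControllerBase
{
    [HttpGet]
    public IActionResult Get()
    {
        var product = "Product of the hour";        // placeholder for the real lookup
        var etag = $"\"{product.GetHashCode()}\"";  // naive ETag, for illustration only

        // Validation caching: if the client already has this version, reply 304 with no body.
        if (Request.Headers[HeaderNames.IfNoneMatch] == etag)
            return StatusCode(304);

        // Expiry caching: let the browser (and downstream proxies) reuse this for 10 minutes.
        Response.Headers[HeaderNames.CacheControl] = "public, max-age=600";
        Response.Headers[HeaderNames.ETag] = etag;
        return Ok(product);
    }
}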
Client and Server (Redis) side Caching on Azure
Note:  The diagram only outlines client caching in terms of the browser or end device.  Using HTTP headers, you can also set downstream proxy caching; this is also generally referred to as client caching.

Firefox has a nice shortcut to check your client cache is working.  Open Firefox and request the URL that will cache a resource on the local client machine, then in the URL bar type in "" and check what is being served up locally.


Basic Caching Options:
1. Client cache (local) - saves on network calls and server-side processing.  Can hold client-specific static data.
2. Client cache (proxy) - saves on server-side processing.  Can only hold generic static data.
3. Server-side cache (Redis) - saves on computing resources by not having to connect to data sources on every request.  Useful for static shared data.



Friday 22 November 2019

Azure IaaS backup Service Notes

Azure has great cost calculation tooling.  DR can be pretty expensive if it is running but not being used.  Having the ability to either turn on or deploy a DR environment on demand can make massive cost savings.

I often see organisations overspending Azure dollars; basically, most cost reduction falls into 1 of these 3 groups:
  1. Eliminate waste - storage & service no longer used
  2. Improve utilisation - Oversized resources
  3. Improve billing options - long-term agreements, Bring Your Own Licence (BYOL)

Apptio Cloudability is a useful tool for AWS and Azure cost savings.  Azure has good help and tooling for cost savings.

Azure IaaS Backup:
  • Recovery Services Vaults
  • Off site protection (Azure data center)
  • Secure
  • Encrypted (256-bit encryption at rest and in transit)
  • Azure VMs, or VMs with SQL, and on-prem. VMs or servers
  • Server OS supported: Windows 2019, 2016, 2012, 2008 (only x64)
  • SQL all the way back to SQL 2008 can be backed up
  • Azure Pricing Calculator can help estimate backup costs
  1. Azure Backup Agent (MARS Agent), used to backup Files and folders.
  2. Azure Backup Server (a trimmed-down, lightweight version of System Center Data Protection Manager (DPM)), used for VMs, SQL, SharePoint, Exchange.
  3. Azure VM Backup, management done in the Azure Portal to back up Azure VMs.
  4. SQL Server in Azure VM backup, used to back up SQL databases on Azure IaaS VMs.

Backing up Azure VMs must be done to the same geo-location as the vault; it can't cross geo-locations.  Recovery has to be to a different location (verify this is correct?).
Note: "Backup Configuration" setting of the Vault properties can be set to "Geo-redundant"

Azure Recovery Vault Storage choice:
LRS - Locally Redundant Storage - 3 local copies
GRS - Geo-Redundant Storage - 3 local copies plus 3 async copies in the paired region, which stays within the same geography - so you can keep data in Europe for compliance; all 6 copies are in Europe.
Update Feb 2020: I think there is also a GZRS option, check if this has changed?

Naming is absolutely key, along with having a logical hierarchy within resource groups, so it is easy to find resources.  I focus on naming my resources consistently; however, I've always felt "Tags" have little or no purpose in smaller environments.  In larger environments tagging can be useful for cost management, recording maintenance, resource owners and creation dates.  Lately, I've been finding it useful to mark my resources with an Environment tag to cover my Azure DTAP scenarios, e.g. Production, Testing, Development.