Tuesday 14 January 2020

Blazor Intro

Blazor intro - Pretty much a SPA written in C#.  It can run server-side (the UI is updated over a SignalR connection) or client-side in the browser via WebAssembly.  The server-side option is available today; client-side Blazor (WebAssembly) is still in preview.
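
As a flavour of the programming model, here is a minimal Razor component sketch (the classic counter example from the project template) - C# and markup live in one .razor file, no JavaScript required:

@page "/counter"

<h1>Counter</h1>
<p>Current count: @currentCount</p>
<button class="btn btn-primary" @onclick="IncrementCount">Click me</button>

@code {
    private int currentCount = 0;

    // C# event handler - runs on the server over SignalR (Blazor Server) or in the browser (Blazor WebAssembly)
    private void IncrementCount() => currentCount++;
}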

Learn about Blazor at the 14 Jan 2020 online conference

More to come...

Sunday 12 January 2020

API Management Mocking

Problem:  Create an Open API Specification (OAS) endpoint for testing using APIM

Background:
  • Azure has a great service to bring multiple APIs under a single publishable layer (think MuleSoft).  I like to use APIM to set up an initial contract that developers can use before the actual API is built.  This allows both the consumer teams and the back-end development team to work independently against the agreed OpenAPI contract.
  • Swagger created what became the OpenAPI Specification (OAS), which many companies now use.
  • Swagger has great tooling for creating OAS APIs, documentation, stub hosting, and generating code (such as .NET Core) that you can import into your dev environment.
  • Azure has a Developer tier APIM licence that is relatively inexpensive (note that creating an APIM instance takes up to 20 minutes), but leaving it running for my personal dev work still adds up.
Overview:  This post outlines the steps to set up an APIM instance using an OpenAPI file created in Swagger.  The APIM service will be set up to return mock data.

Steps:
1. Create a new instance of APIM - do this first as it takes up to 20 minutes before the service is ready.
2. Open swagger.io in a browser, sign up and click "Create New" - this gives you a starting OAS file for your custom API.
3. Using the Swagger Editor, create the desired endpoints.  It took me a few hours to get used to YAML as it is whitespace sensitive, but it is very readable (a minimal YAML sketch is shown after step 7).
4. On the top right of the Swagger Editor is the "Export" option.  Click Export > Download API > JSON Unresolved, and keep the .json file ready to import into your APIM service.
5. Open APIM and add a new API: APIs > Add API > OpenAPI.
6. Import the file and the fields get populated from the OAS definition.
7. The list of operations shows up - in my case I only have a single GET operation, "List all Customers".  The APIM service is now created but not connected to a back-end.
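
For reference, here is a minimal sketch of the kind of OAS (YAML) definition used in step 3 - the title, path and schema below are illustrative assumptions, not the exact contract from my instance:

openapi: 3.0.0
info:
  title: Customer API            # illustrative name
  version: "1.0"
paths:
  /customers:
    get:
      summary: List all Customers
      operationId: listCustomers
      responses:
        '200':
          description: A list of customers
          content:
            application/json:
              schema:
                type: array
                items:
                  type: object
                  properties:
                    id:
                      type: integer
                    name:
                      type: string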

8. Add mocking to the endpoint.  Highlight the operation, i.e. GET "List all Customers" > Inbound Processing > "Add Policy" button.  Select "Mock responses" > Save.  A sketch of the resulting policy XML is shown after step 9.


9. Generate the JSON responses:
a) Select the operation, i.e. GET "List all Customers"
b) Frontend drop down > Form-based editor
c) Click the "Responses" tab
d) Click the "200 OK" link
e) Click in the Sample box, "Auto Generate", then Save
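
For reference, choosing "Mock responses" adds a mock-response policy to the operation's inbound processing.  The resulting policy XML looks roughly like this sketch (the status code and content type below assume the 200 application/json sample from step 9):

<policies>
    <inbound>
        <base />
        <!-- Return the 200 application/json sample defined on the operation instead of calling a back-end -->
        <mock-response status-code="200" content-type="application/json" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>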


The APIM service is now set up and we are ready to test.

1. Test using APIM's built-in test tool.

Check the 200 response - the mocked payload is returned by the APIM test console.

2. Test using Postman.
Open Postman and craft the request against your APIM gateway URL, passing your subscription key - a sketch of the raw request is shown below.
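
As a rough sketch, the request looks like this - the gateway host, API path and key are placeholders for your own values (the mocked 200 JSON sample should come back in the response):

GET https://{your-apim-instance}.azure-api.net/customers HTTP/1.1
Ocp-Apim-Subscription-Key: {your-subscription-key}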

Note: There are several competitor products - MuleSoft, Amazon API Gateway, Postman and Swagger also offer a lot of these features.  There are other products that I have not used, such as Kong API Gateway and Google's Apigee; Gartner publishes a list of competitors and a Magic Quadrant each year.

Summary:  It is pretty straightforward to set up APIM mocking as shown above, and then easy to test it using the APIM test console and Postman.  This post showed how to add mocked APIM responses.

APIM Notes

5 Pricing Tiers of APIM:

  1. Developer (no SLA) - not for production use, but has all the Premium features.
  2. Consumption (99.95% SLA) - priced per request served.  No developer portal, no static IP address; scales automatically.
  3. Basic (99.95% SLA) - no AAD integration; scales to 2 units.
  4. Standard (99.95% SLA) - scales to 4 units.
  5. Premium (99.99% SLA) - allows custom domains and unlimited scale.  A single Premium APIM instance can be scaled across more than one Azure region.  Direct VNET access is only available in Premium (and in the Developer tier for non-production use).
One can move (scale up or down) between Azure APIM pricing tiers, but it takes up to 45 minutes.
All pricing tiers other than Consumption require the owner to set up or perform scaling.
APIM has metrics to determine whether the number of APIM units should be increased or decreased.  The best metric is the Capacity metric, which is an aggregate of gateway resource usage (CPU and memory).  Note: ignore short spikes and look at the average over 15 minutes.  Microsoft suggests that the Capacity metric running over 60% to 70% for a period of 30 minutes indicates that scaling is appropriate.
When building your scaling strategy, understand that adding APIM units takes time (roughly 30 minutes), so scaling at 60% may not work for flash traffic; in that case you'll need to account for this outside of only using metrics for scaling.
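
Scaling can also be scripted.  As a sketch, assuming the az apim commands and flags below exist in your Azure CLI version, something like this changes the tier and unit count (the instance and resource group names are placeholders):

# Move an existing APIM instance to the Standard tier with 2 units - this change can take a long time to apply
az apim update --name my-apim-instance --resource-group my-rg --sku-name Standard --sku-capacity 2
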
Tip: Regionally deployed APIM instances point to a single back-end URL, so for simple scenarios it is best to keep the traffic routed to APIM in the same region as the back-end and use the other region for failover; for larger deployments you can route back-end traffic across regions using Front Door.

Tip: Use Azure Key Vault for secrets.

Developer Portal - part of APIM, this portal/website is highly customisable and allows users to discover and consume our APIs.  Users can check their access (they only see the APIs they have access to) and try/test the APIs.

Tips:

  • Users > Groups > Products > API EndPoints
  • Subscriptions can be assigned to 1) products and/or 2) API endpoints.
  • Transformation policies are great for turning SOAP into JSON.  There are a lot of OOTB policies, and it is easy to extend them with C# policy expressions - see the sketch after this list.
  • APIM Extension for VS Code is nice for working with APIM.
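
As an example of mixing OOTB policies with C# policy expressions, here is a hedged sketch of an inbound policy fragment (the header name and limits are illustrative, not from a real deployment):

<inbound>
    <base />
    <!-- OOTB policy: throttle each subscription to 10 calls per 60 seconds -->
    <rate-limit calls="10" renewal-period="60" />
    <!-- C# policy expression: stamp a correlation id header onto the request -->
    <set-header name="x-correlation-id" exists-action="override">
        <value>@(Guid.NewGuid().ToString())</value>
    </set-header>
</inbound>
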
More Info:
https://www.youtube.com/watch?v=0yf_xm5cPIo
https://www.youtube.com/watch?v=gA2yxwKo0M0

PB APIM Series:

Thursday 9 January 2020

Microsoft Azure MFA Notes (Az-300)

Study Notes on Multi Factor Authentication:
  • AAD MFA: the 2nd factor is done via text message, MS Authenticator or phone call.
  • Azure MFA Server - for on-prem. AD MFA.  The most advanced set of options for integrating on-prem. infrastructure with the MFA cloud service.  Download and install it on a Windows server.  You don't need to sync the accounts to Azure AD (AAD) via AD Connect.


  • Azure MFA Server still needs the Azure MFA cloud service to send SMS/text authentication and MS Authenticator notifications.
  • The Azure MFA Server download includes a GDPR.exe utility for generating GDPR reports for a user.
  • MFA billing is per user and is included in Azure AD Premium licences.
  • Conditional Access - MFA doesn't need to be enforced for every user; it can be enforced for advanced/privileged roles or specific conditions.
  • The Azure MFA SDK is only available as a web service since 2018.
  • ADFS has 2 MFA approaches/options: 1) Azure MFA Server (no need to replicate users to AAD), or 2) ADFS on Windows Server 2016 and upwards can use the cloud-based MFA service (no Azure MFA Server required).
  • Credential stuffing (password reuse attack) - a hacker uses a compromised password on different sites, as people tend to reuse passwords.
  • Something you know e.g. a password, something you have e.g. an RSA token, something you are e.g. a fingerprint.  MFA must use 2 or more of these types.  An out-of-band device, e.g. your phone running MS Authenticator, counts as something you have.
  • As a general rule with 2nd factor auth on Azure, if you want to add a PIN to the authentication you can't use the cloud service; you need Azure MFA Server.
  • OATH tokens are used for RSA or other external token MFA (also for offline use on a phone via MS Authenticator), but this requires Azure MFA Server to implement.  The Azure portal also has basic OATH integration for 3rd party vendors.

Monday 30 December 2019

Power Apps Licencing

Overview:  Licensing has changed a fair amount over the past 3 years for Power Apps and Flow.  This post summarises my understanding of licencing on the Power Platform as of Dec 2019.

Update 4 Nov 2021: Power Apps licencing was halved in price in 2020, and a few days ago at the Microsoft Ignite conference a 3rd pricing option for Power Apps, "Pay-as-you-go", was announced.  The other two options are the "Per User" and "Per App" plans mentioned below.

Types of Licences:
1. O365 and D365 licencing: Basic PowerApps licencing comes with these plans but you only get a small subset of connectors (missing Azure SQL and CDS Connectors).
2. Per-User or Per-App Plans: There are no more PowerApps P1 and P2 plans, you either need to use "per-user" or "per-app" Power Apps licencing plans.
3. Community Plan:  Allows a developer/creator the full set of functionality, free of charge, but the apps can't be shared.

"To run standalone apps, guest users need the same license as users in your tenant."  Microsoft Docs.  
  • Power Apps uses AAD B2B (both members and guest) 
  • Using standalone Power Apps requires a Power Apps licence (not necessarily for Portal Apps).  Steps to Share Power Apps with Guests
  • SharePoint users interacting with a Power Apps form on a SharePoint list do not require a Power Apps licence, as this is covered by O365.
Updated: 29 May 2020 
Problem:  I cannot import Power App solution (zip) files into other DTAP Power App Environments.  The "Import" button is disabled.
Initial Hypothesis:  My trial per-user Power Apps licence expired and most functionality was still working via my built-in O365 Power Apps licence.  The administrator could not renew my old licence or add a Power Apps per-user/per-app licence.
Resolution:  The Ops team assigned me a Dynamics 365 licence, and Import functionality is working again.



Thursday 12 December 2019

PL-900 Microsoft Power Platform Fundamentals Beta Exam

Overview:  This morning, I took the PL-900 Beta exam; it is seriously tough.  I don't think I passed but let us see as the results come out 2 weeks after the beta is closed.

My Thoughts: The exam covers three products: Power BI, Power Apps & Power Automate (formerly called Microsoft Flow), but is extremely wide-ranging across related Microsoft technology.  You really need to have worked in detail with the products, as the questions were not straightforward and often combine multiple related services/products.

What I learnt:  You need to know the 3 products well, and how they interact.  My connectors knowledge could be better, CDS comes up a lot, and my Dynamics 365 knowledge is lacking.  There were some Cognitive Services questions that I was not ready for.  It is a good test of knowledge that has helped me realize I have holes in my knowledge of the Power Platform.

The exam itself:  Microsoft exams become easier as you learn how they ask questions; I found some of the language and questions difficult to follow - working out what is actually being asked - but this is pretty standard in a lot of Microsoft certification exams.

I did the exam remotely using Pearson/VUE software, which is great - no need to book and go to a test centre.  They check your environment and identification and watch you throughout.  I got a warning for reading aloud, which up until that point I had been doing because the wording of the questions is not the easiest to understand; it makes sense though, as people could record the questions as audio for cheating with brain dumps.

Summary:
  • I am convinced that I failed PL-900.  
  • I learnt a lot preparing for the exam and it was a good experience.
  • I need to look at Dynamics & Cognitive Services in more detail.
  • Remote exams are awesome and the way they are run is secure - just don't talk to yourself during the exam.
  • I've missed taking Microsoft and other certifications.

Tuesday 3 December 2019

Microsoft Teams Governance

Update 2020/07/07:  I recently watched a presentation by Rupert Squires on MS Teams governance that provides a good introduction to Teams governance and thoughts for your MS Teams projects: https://www.slideshare.net/RupertSquires/positive-governance-in-5-steps-for-microsoft-teams

Storage: MS Teams stores data in SharePoint, Exchange and OneDrive for Business (OD4B).  Each team has its own dedicated SharePoint site.  Chat files are stored in the sender's OD4B and chat images are stored in Exchange.  Teams provisions in the location where the tenant was set up.  In the O365 admin centre you can see where your instance of Microsoft Teams is located; mine is in the EU zone.


If location is important to you for compliance reasons, it's important you select the correct location.  I favour the EU zone as Europe is pretty strict on data and is fairly central globally for my user base, but it does come down to your clients' needs.

All data is encrypted at rest and in transit, and Teams data is stored in the Microsoft data centre for your region.

MS Teams Configuration: The Teams admin centre allows a great degree of control, letting you give different users different rights.  You can easily turn off features for groups of people to align with your company's governance.  There are a few dials to help you get granular control.

O365 Group naming policies are useful - they allow for a common, easy understanding of groups and what each group has access to.

Sensitivity Labels/Azure Information Protection - very useful

Team Creation Thoughts - Allowing anyone to create teams results in a complex scenario that has to be governed and brought back under control.  Too much restriction results in people not using Teams and potentially using shadow IT to achieve their goals.  Privacy must be appropriate (public vs private); don't allow people to just choose public because it sounds more open.  Each team should have a purpose and is likely to have an end date (not always).  Don't just follow the org structure.

Teams is not a replacement for all tools, but it works best if you work out the right tool for each job; often Teams can replace a variety of tools in large enterprises.

Teams (the people) should be using Teams instead of email - this is generally how one can tell if Teams is being well implemented.  For me, I want to use Teams for all communication and as the project-related file store; email should be the last resort.  Remove Skype if you have Teams.  Consider removing Zoom, WebEx, GoTo... you have Teams, so use it as the default chat, calls and team meeting tool.  Schedule meetings from Teams if possible so they sit in the correct team; the meeting is therefore context based.
Don't allow people to add any app without thinking about it; the exception is possibly small companies.

MS Teams ALM: Ensure that teams are deleted when no longer needed.  ShareGate's Apricot tool is great for getting Teams under control.  Archive before deleting an MS team if you require the data or it has to remain available.  Owners can delete a team; the team goes into the recycle bin for the default retention period (30 days), after which it is gone, including the underlying storage data.

Note: When you delete a team all the underlying corresponding data is also deleted from SharePoint, Exchange and OneDrive.

MS Teams Series:
https://www.pbeck.co.uk/2019/12/microsoft-teams-governance.html
https://www.pbeck.co.uk/2020/05/microsoft-teams-overview.html
https://www.pbeck.co.uk/2020/06/multi-geo-for-ms-teams.html