Sunday 27 December 2020

AIP and Sensitivity Labels

Overview:  AIP has had many names and twists over the past few years.  The functionality has been improving, but the naming and changes made it difficult to implement well.  Finally, I feel Microsoft Azure Information Protection is implementable at scale.

Summary: Sensitivity labels allow documents and email to be classified in order to protect them.  One can track and encrypt documents/email.  You can also use sensitivity labels to protect SharePoint sites, Teams sites and Microsoft 365 Groups.  Within AAD I can assign sensitivity labels to Microsoft 365 Groups (a minimal sketch follows below).
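
Sensitivity labels can be assigned to a Microsoft 365 Group programmatically through the Microsoft Graph assignedLabels property.  Below is a minimal C# sketch using raw HTTP; the group ID, label GUID and token acquisition are placeholders, and it assumes the caller has the relevant Graph permissions (e.g. Group.ReadWrite.All).

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class LabelAssigner
{
    // Sketch: PATCH the group's assignedLabels collection via Microsoft Graph.
    static async Task AssignLabelAsync(string accessToken, string groupId, string labelId)
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // assignedLabels takes a collection of { labelId } objects.
        var body = $"{{\"assignedLabels\":[{{\"labelId\":\"{labelId}\"}}]}}";
        var request = new HttpRequestMessage(new HttpMethod("PATCH"),
            $"https://graph.microsoft.com/v1.0/groups/{groupId}")
        {
            Content = new StringContent(body, Encoding.UTF8, "application/json")
        };

        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();  // 204 No Content on success
    }
}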

Sunday 13 December 2020

ISO 27001 Certification & OWASP

Overview:  I have been through several ISO and security audits over the years for various companies offering SaaS products.  This post outlines some of my notes from the latest ISO 27001 audit I was involved in.

ISO 27001 covers Information Security Management Systems (ISMS), which is about protecting and managing your business's information assets to reduce your business risks.  It demonstrates that your organisation has good security practices in place.

Note: ISO 27001 is a management system standard for an organisation; it is not done for a particular product.

"An ISO 27001 ISMS consists of policies, procedures and other controls involving people, processes and technology."  https://www.itgovernance.co.uk/iso27001

Parts to an ISO 27001 audit:

  • Part 1 - Checks you have the correct documentation.
            Output: a go-ahead and a visit plan from the auditor.
  • Part 2 - Checks that you as a business are complying with/working to the documentation.  Basically evidence-based reporting built on visual confirmations and staff interviews to verify compliance (sample-based auditing).  Findings are normally grouped into 3 types:
            1) Opportunity for improvement = suggestions; review before the next audit to see if they are worth implementing.
            2) Non-conformance (Minor) = you can have a few of these; look to fix them.
            3) Non-conformance (Major) = you won't get certification with a major.  There is a period to address/fix the major issue(s).  Always complete the phase 2 audit, as the auditors may discover other majors.
            Output: a findings report and, several weeks later, the certification.
  • Certification
  • Yearly: Need to repeat the audit and show you are improving based on the findings; the audit will generally go into specific areas in more detail.
More Info:
Data Protection and Regulation note - see bottom of post for ISO27001

Notes
Business Continuity quarterly check
Annual Security Policy & Standard Review 
Security training - different roles need different training
Annual penetration testing
Audit annual re-certification days
Risk Information: Non conformity & root cause analysis

Technical:  Encryption at Rest, Encryption in Transit, DAST/SAST on code, logically secure customer data/security, Azure Defender to harden infra and continuously monitor, vulnerability or external penetration testing, ASVS/OWASP.

ISO 27701 - "ISO 27701 extends the meaning of “information security” detailed in ISO 27001. While the privacy and protection of personal data is part of ISO 27001, the newer standard extends the scope to include the “protection of privacy as potentially affected by the processing of PI" source: https://www.learningcert.com/information-security/iso-27001-vs-iso-27701/

ISO 27017 - is a security standard developed for cloud service providers and users to make a safer cloud-based environment and reduce the risk of security problems (source: Wikipedia).  I think ISO 27017 is now part of the ISO 27001 extended scope.

ISO 28000 - is the spec for security management systems for the supply chain (partner dependencies e.g. software vendor, hosting company service).

ISAE 3402/SOC 2/ISO 27001 - about verifying that the business processes/internal controls of the business are of a high standard.

Tuesday 1 December 2020

Testing your home Internet Speed using your IPhone

Problem:  Broadband providers offer various speed options when purchasing; the actual speeds you get are usually well below the advertised rate and depend on your specific installation.

Initial Hypothesis:  iOS has multiple apps to monitor speed to your iPhone.  

Resolution:  Download "Speedtest" using the app store an any Apple device.   5G performance is fantastic.

Below are my Results, I live in South West London (Zone 4)

Sky broadband - SW London

Broadband     Download (Mbps)  Up Speed (Mbps)  Location
Sky phone     34.80            5.72             SW London
EE 4G - LTE   13.00            0.13             SW London
O2 4G - LTE   16.90            10.20            SW London
EE - 5G       372.00           19.80            Newcastle


Thoughts:
Speed tests vary greatly, so it is worth doing at least 3 to get an average.  70 Mbps download on EE 4G is very possible, so mobile download speeds can be as good as my Sky broadband.  5G performance is fantastic - the MiFi/5G routers are going to be awesome when 5G rolls out to my area.  Comparing O2 and EE at my home, O2 is way faster on download, and interestingly the upload speed on O2 is amazing (the O2 tower is far better positioned).

Update 15/03/2023
Nice package to check Internet upload and download speeds on Mac, Windows and Linux, which I got from Tobias Zimmergren:

Install cmd PS>  npm install --location=global fast-cli
Run cmd> fast -u

Sunday 29 November 2020

Azure SQL Basic Options Summary

Overview: Azure SQL is incredible.  There are a lot of options when choosing how to host databases, and performance is good.  Azure SQL "handles patching, backups, replication, failure detection, underlying potential hardware, software or network failures, deploying bug fixes, failovers, database upgrades, and other maintenance tasks", from Microsoft Docs and Azure SQL.

Azure SQL is the PaaS database service that performs the same function SQL Server did for us for many years as the workhorse of many organisations.  Microsoft initially only offered creating VMs and installing SQL Server on them.  Azure SQL is Microsoft's PaaS SQL database-as-a-service offering on the cloud: a fully managed platform for SQL databases where Microsoft handles patching, managed backups and high availability.  All the features available in the on-prem edition are also built into Azure SQL, with minor exceptions.  I also like that the minimum SLA provided by Azure SQL is 99.99%.

Three SQL Azure PaaS Basic Options:

  1. Single Database - This is a single isolated database with its own guaranteed CPU, memory and storage.
  2. Elastic Pool - A collection of single isolated databases that share DTUs (CPU, memory & I/O) or virtual cores.
  3. Managed Instance - You manage a set of databases, with guaranteed resources.  Similar to IaaS with SQL installed, but Microsoft manages more of the parts for me.  Can only be purchased using the virtual core model (no DTU option).
Thoughts: Managed Instances are recommended up to 100 TB but can go higher.  Individual databases under elastic pools or single databases are limited to a respectable 4 TB.

Two Purchasing Options:

  1. DTU - A single blended metric that Microsoft uses to represent CPU, memory and I/O.
  2. Virtual Cores - Allows you to choose your hardware/infrastructure.  One can optimise for a higher memory-to-CPU ratio than the generalist DTU option.
Thoughts:  I prefer the DTU approach for SaaS and greenfield projects.  I generally only consider virtual cores if I have migrated on-prem SQL onto a Managed Instance, or for big consistent workloads where virtual cores can work out cheaper.  There are exceptions, but that is my general rule for choosing the best purchasing option.

Three Tiers:

  1. General Purpose/Standard (There is also a lower Basic level)
  2. Business Critical/Premium
  3. Hyperscale

Backups

Point-in-time backups are automatically stored for 7 to 35 days (default is 7 days) and protected using TDE; full, differential and transaction log backups are used for point-in-time recovery.  The backups are stored in RA-GRS blob storage, meaning they are kept in the primary region with read-only copies in a secondary Azure region: 3 copies of the data in the active Azure region and 3 read-only copies in the paired region.

Long Term Retention (LTR) backups can be kept for 10 years; these are only full backups.  The smallest retention unit is the weekly full backup.  LTR is in preview for Managed Instances.

Azure Defender for SQL 

Monitors SQL database servers, checking vulnerability assessments (best practice recommendations) and providing Advanced Threat Protection, which monitors traffic for abnormal behaviour.

Checklist:

  1. Only valid IPs can directly access the database; deny public access,
  2. AAD security credentials, use service principals (see the connection sketch below),
  3. Advanced Threat Protection has real-time monitoring of logs and configuration (it also scans for vulnerabilities),
  4. Default is to have encryption in transit (TLS 1.2) and encryption at rest (TDE) - don't change,
  5. Use dynamic data masking inside the db instance for sensitive data e.g. credit cards,
  6. Turn on SQL auditing,
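
Checklist item 2 as a minimal sketch: the Microsoft.Data.SqlClient driver (3.0+) supports AAD authentication straight from the connection string, so no SQL credentials are stored in config.  The server and database names below are placeholders.

using System;
using Microsoft.Data.SqlClient;

class AzureSqlAadDemo
{
    static void Main()
    {
        // "Active Directory Default" walks the Azure Identity credential chain
        // (managed identity, az CLI / Visual Studio sign-in) - no password in config.
        var connectionString =
            "Server=tcp:myserver.database.windows.net,1433;" +  // placeholder server
            "Database=mydb;" +                                   // placeholder database
            "Authentication=Active Directory Default;" +
            "Encrypt=True;";                                     // TLS in transit

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        using var command = new SqlCommand("SELECT @@VERSION;", connection);
        Console.WriteLine(command.ExecuteScalar());
    }
}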

Note: Elastic Database Jobs (the Azure SQL equivalent of SQL Agent Jobs).

Azure also offers MySQL, PostgreSQL and MariaDB as hosted PaaS offerings.

Note: The Azure SQL PaaS service does not support the filestream datatype: use varbinary or references to blobs.

Monday 26 October 2020

Identity Server - OAuth and OIDC

Overview:  The current version of Identity Server is 4.  Identity Server is basically a .NET Core 3.1 application that is an Identity Provider (IdP), similar in role to PingID, SiteMinder, or AAD B2C.  Identity Server allows applications (native mobile, web sites and servers) to securely authenticate users.  In this post OAuth means OAuth 2.0.

OAuth2 Grant Types:

  • Authorization Code with PKCE - Authorization Code grant type; the default choice for authorization.  Clients: native mobile apps, Windows apps, browser apps.  Grant type: Code.
  • Client Credentials - server-to-server (S2S) communication, also referred to as machine-to-machine (M2M).  Clients: servers, consoles, services.  Grant type: ClientCredentials.
  • Implicit - rather use the Authorization Code flow with PKCE; native apps & SPA's often used the implicit flow.  Grant type: Implicit.
  • Hybrid
  • Device
  • Resource Owner Pswd - don't use.

Scopes: The authorisation server specifies the "scope" that the user has consented to, so for an API you can limit the actions the user can perform.  Scopes must be unique strings.  The recommendation is to name your scopes by the API and the verb, e.g. "pauls_api.read" is better than "read".  Scopes are used to give the user access to resources, so a generic "read" is not a good idea.  Also, scopes have length limits, so don't be crazy verbose in naming.
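
As a minimal sketch of this naming advice in Identity Server 4 configuration (the scope and API names are illustrative):

using System.Collections.Generic;
using IdentityServer4.Models;

public static class Config
{
    // Scopes named API-then-verb, e.g. "pauls_api.read", rather than a generic "read".
    public static IEnumerable<ApiScope> ApiScopes => new List<ApiScope>
    {
        new ApiScope("pauls_api.read",  "Read access to Paul's API"),
        new ApiScope("pauls_api.write", "Write access to Paul's API")
    };

    public static IEnumerable<ApiResource> ApiResources => new List<ApiResource>
    {
        // The API resource lists the scopes that can be granted for it.
        new ApiResource("pauls_api", "Paul's API")
        {
            Scopes = { "pauls_api.read", "pauls_api.write" }
        }
    };
}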

Mandatory Endpoints:  OAuth specifies 2 endpoints, namely:
  1. /authorization endpoint - gets the access grant and user consent (only the code and implicit flows use this endpoint)
  2. /token endpoint - issues tokens (client credentials only uses the token endpoint; obviously the code & implicit flows use both endpoints)
Optional Endpoint Extensions:
  • /revoke - used to revoke a token
  • /userinfo - used to return profile info for the user e.g. name, email.  The /userinfo endpoint is used in OIDC implementations of OAuth, which specifies the standard scopes address, phone, email and profile.
  • /.well-known/oauth-authorization-server - useful to discover the actual OAuth implementation.
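
As a rough sketch, the discovery document can be fetched with plain HTTP to find the actual endpoint URLs rather than hard-coding them (the authority below is a placeholder; Identity Server also publishes OIDC metadata at /.well-known/openid-configuration):

using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class DiscoveryDemo
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Placeholder authority URL.
        var json = await client.GetStringAsync(
            "https://idp.example.com/.well-known/openid-configuration");

        using var doc = JsonDocument.Parse(json);
        var root = doc.RootElement;

        // The two mandatory OAuth endpoints, discovered rather than hard-coded.
        Console.WriteLine(root.GetProperty("authorization_endpoint").GetString());
        Console.WriteLine(root.GetProperty("token_endpoint").GetString());
    }
}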
Tokens
Access Token:
  • JSON Web Token (JWT), pronounced "JOT", is an access token format that contains authorisation and profile data.  The alternative is an opaque token, but most implementations use JWT.
  • JWT's need to be validated using the signature.  The JWT access token is base64 encoded and made up of three parts separated by periods i.e. HEADER.PAYLOAD.SIGNATURE (a validation sketch follows below).
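
A minimal validation sketch using the System.IdentityModel.Tokens.Jwt package; the issuer, audience and signing key are placeholders (in practice the key is usually resolved from the IdP's published JWKS):

using System.IdentityModel.Tokens.Jwt;
using System.Security.Claims;
using Microsoft.IdentityModel.Tokens;

class JwtValidationDemo
{
    // Returns the validated principal, or throws if the signature or claims are bad.
    static ClaimsPrincipal Validate(string token, SecurityKey signingKey)
    {
        var parameters = new TokenValidationParameters
        {
            ValidIssuer = "https://idp.example.com",  // placeholder issuer
            ValidAudience = "pauls_api",              // placeholder audience
            IssuerSigningKey = signingKey,            // verifies the SIGNATURE part
            ValidateLifetime = true                   // rejects expired tokens
        };

        var handler = new JwtSecurityTokenHandler();
        return handler.ValidateToken(token, parameters, out _);
    }
}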


Refresh Token:
  • Refresh tokens are opaque
  • Single endpoint with a single function to get a new Access Token.


Interactive description of the OAuth Code Flow process:


Monday 19 October 2020

APIM High Availability and Performance across Regions

Overview:  APIM can be set up in multiple regions, and incoming requests will be routed to the closest APIM endpoint.  If there is only 1 APIM region, it is best to ensure the API/App Service/Function is hosted in the same region.  With multiple APIMs you can also host an API in each region.  The routing is either done automatically using Azure Front Door or via policy on the APIM.

Front Door can be substituted with Azure WAF, Cloudflare, or Barracuda's SaaS solution.

More Info

WAF Options

Overview:  HTTP traffic from users to web sites and APIs needs WAF protection.  Both Azure and AWS have good services to protect your APIs and applications.  There is also the option to use a dedicated WAF service.  When protecting large organizations with hybrid cloud providers, options like Barracuda, Imperva/Incapsula, F5 and Cloudflare are good enterprise-level options.  Fundamentally, a WAF sits as an intermediary between the user and the resource they are requesting over HTTP.  I like to set my highest-priority rule to DENY all HTTP & HTTPS traffic, then specifically open the rules that I want to let flow through; a lot of people do it the other way around in smaller implementations.

WAF Options:
  • Azure WAF is simple for a WAF in 1 region, especially with APIM; if you are an Azure customer, simply go for an Azure Application Gateway with WAF enabled.  DDoS protection is a separate service that can be integrated in front of Azure WAF or Azure Firewall.  There is a cheaper per-IP SKU option for protecting specific IP addresses.
  • Azure Front Door WAF is pretty amazing; Cloudflare is historically the leader with similar functionality.  On Microsoft Azure, the main two options for WAF are Front Door WAF (best, most expensive) and Azure Application Gateway WAF.
  • Competitor options: Barracuda WAF SaaS service, or any software firewall: KEMP, F5, Check Point, Fortinet/FortiGate, Cloudflare WAF, Akamai, AWS WAF, AWS Network Firewall, Cloud Armor (GCP's WAF, I believe), ...
  • Check the WAF service has protection at least for DDoS, XSS and SQL injection attacks, SSL termination if you need it, and managed rule sets.
  • AWS WAF is for web traffic (layer 7); there is a separate AWS Shield service for DDoS attacks.  AWS WAF can be applied at an Application Load Balancer, Amazon's API Gateway, and Amazon CloudFront.  With AWS WAF you also get Shield Standard, which is always included free by default and provides monitoring and DDoS protection; Shield Advanced adds further features.
  • Barracuda WAF is a SaaS service that has worked fairly well for me.  It has a fair number of options and rules, and add-ons like anti-virus scanning.
  • Imperva WAF was previously called Incapsula WAF; it provides a SaaS WAF service including smart DDoS (blocks dodgy traffic and passes through good requests), API security, SQL injection and XSS protection.  Multiple data centres around the world.
  • Cloudflare is a Secure Access Service Edge (SASE) provider.  Cloudflare provides a WAF service at hundreds of endpoints around the globe (for instance, there are 5 Cloudflare endpoints in Australia).  WAF functionality like SSL, DDoS (L7), custom rules e.g. rate limiting, OWASP rules, "API protection", et al. is applied close to the user request (nice low latency), and successful requests are then passed to the backend.

 

Last Updated: 2022-03-15

Friday 9 October 2020

App Insights - Website and API Monitoring

Overview:  App Insights has functionality to run scheduled web requests and log the output in App Insights.  There are multiple advantages to this, including end-to-end active monitoring of web sites and APIs, and keeping the application warm.

Below I show a simple request to my blog (a public website) and the results.  Azure refers to this test type as a URL ping test, which is basically an HTTP GET request to a URL.


Wait a few minutes and Refresh to see the results:

Very easy way to include a constant check that your API or website is running.  There is also the option to create a "Multi-step web test" using Visual Studio.  You can record the authentication and assert on known response content to build advanced continuous monitoring.

Tip: The URL does need to be publicly available.
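
The More Info links below mention an Azure Functions based replacement for multi-step tests; here is a minimal sketch of that pattern using the App Insights TrackAvailability API.  The URL, schedule and names are illustrative, and it assumes TelemetryClient is registered for dependency injection in the Function App.

using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.ApplicationInsights;
using Microsoft.Azure.WebJobs;

public class AvailabilityPing
{
    private static readonly HttpClient Http = new HttpClient();
    private readonly TelemetryClient _telemetry;

    public AvailabilityPing(TelemetryClient telemetry) => _telemetry = telemetry;

    // Runs every 5 minutes and records the result as an availability test.
    [FunctionName("AvailabilityPing")]
    public async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer)
    {
        var timestamp = DateTimeOffset.UtcNow;
        var stopwatch = Stopwatch.StartNew();
        var success = false;

        try
        {
            var response = await Http.GetAsync("https://www.pbeck.co.uk/");  // placeholder URL
            success = response.IsSuccessStatusCode;
        }
        finally
        {
            stopwatch.Stop();
            // Shows up under Availability in App Insights, like a URL ping test.
            _telemetry.TrackAvailability("Blog homepage", timestamp,
                stopwatch.Elapsed, "Azure Function", success);
        }
    }
}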

The content I used to test out the functionality comes from the Microsoft Docs site.
Also see Live Metric Stream that is part of App Insights.

Monitoring using Azure Monitor Dashboards:
  • The image above shows a dashboard that can be used to monitor a SaaS application's PaaS infrastructure.
  • It's a good idea to create multiple dashboards; they can show the overview and allow the user to drill into specific areas.
  • Internal dashboards watching key APIs with HTTP uptime ping-type requests are also a good idea.

Updated: 11 Jan 2023
- Checkly looks like a great monitoring service that works with Playwright for continuous monitoring.  I like to keep all my SaaS on Azure, but Checkly is independent, which suits a monitoring service.

More Info: 
App Insights MultiStep Tests
Replacement Option for MultiStep Tests based on Azure Functions

Thursday 1 October 2020

App Insights - Basic Introduction

Overview: Azure App Insights is a great platform for collecting logs and monitoring cloud-based applications on Azure.  All Azure services can push logging information into App Insights instances.  This can be error, usage and performance logging that in turn is easy to query.  There are SDKs for developers that can be used to add custom logging to applications.  I am a big fan of AppDynamics for logging and monitoring, but for SaaS and on a new project I'd go with App Insights.

Retention:  App Insights can keep 730 days' worth of logs.  For long-term storage, "Continuous Export" can be used to push data into storage accounts as soon as it arrives in App Insights.  Retaining App Insights logs for 90 days has no additional cost, so the default retention should be set to at least 90 days in most situations.

What is logged and what can be logged:  
  • All Azure Services can be configured to send service logs to a specific App Insights instance.
  • Instrumentation packages can be added to services to capture logs from, for example, IIS or background services.  You can also pull telemetry from infrastructure into App Insights e.g. Docker logs, system events.
  • Custom code can also call the App Insights instance to add logging and hook into exception handling.  There are .NET, Node.js, Python and other SDKs that should be used to add logging, exception capturing, performance and usage statistics (see the sketch below).
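
A minimal sketch of custom logging with the .NET SDK; the connection string and event names are placeholders:

using System;
using System.Collections.Generic;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

class TelemetryDemo
{
    static void Main()
    {
        var config = TelemetryConfiguration.CreateDefault();
        config.ConnectionString =
            "InstrumentationKey=00000000-0000-0000-0000-000000000000";  // placeholder
        var telemetry = new TelemetryClient(config);

        // Custom usage event with properties you can query on later.
        telemetry.TrackEvent("OrderSubmitted",
            new Dictionary<string, string> { ["orderType"] = "standard" });

        // Trace logging.
        telemetry.TrackTrace("Order pipeline started");

        try
        {
            throw new InvalidOperationException("demo failure");
        }
        catch (Exception ex)
        {
            // Hook into exception handling.
            telemetry.TrackException(ex);
        }

        telemetry.Flush();  // push buffered telemetry before exit
    }
}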

App Insights has a REST API to query the logs.  The "API Explorer" tool is awesome for querying App Insights online.  


The data below comes from Microsoft Docs.

"What kinds of data are collected?

The main categories are:

  • Web server telemetry - HTTP requests. Uri, time taken to process the request, response code, client IP address. Session id.
  • Web pages - Page, user and session counts. Page load times. Exceptions. Ajax calls.
  • Performance counters - Memory, CPU, IO, Network occupancy.
  • Client and server context - OS, locale, device type, browser, screen resolution.
  • Exceptions and crashes - stack dumps, build id, CPU type.
  • Dependencies - calls to external services such as REST, SQL, AJAX. URI or connection string, duration, success, command.
  • Availability tests - duration of test and steps, responses.
  • Trace logs and custom telemetry - anything you code into your logs or telemetry."
App Insights creates a hierarchy of requests built up from the operation_Id and operation_ParentId fields.

Application Insights is part of Azure Monitor and makes it easy to trace user interaction.  It is independent infrastructure for recording issues and tracing.  App Insights comes in 3 parts:
  • Collect: Track infra/PaaS via instrumentation (throughput, speed, response times, failure rates, exceptions etc.), and via SDKs (e.g. JavaScript SDK, C#) to add custom logging and tracing.  (Blue boxes in the diagram.)
  • Store: Stores the data.  (Purple box.)
  • Insights: Alerts, Power BI, live metrics, REST API.  (Green box.)
Extending App Insights:
For long-running operations like using queues or an ESB, you will need to tie the operations together, and it's really easy to connect these into a hierarchy using distributed tracing (a sketch follows below).
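
A rough sketch of tying a queue consumer's telemetry back to the original operation with the .NET SDK's StartOperation; the operation and parent IDs are assumed to travel with the message, and the names are illustrative:

using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

class QueueTraceDemo
{
    // operationId/parentId were stamped onto the message by the producer.
    static void ProcessMessage(TelemetryClient telemetry,
        string messageBody, string operationId, string parentId)
    {
        // Linking the IDs places this work under the original request
        // in the App Insights operation hierarchy.
        using (var operation = telemetry.StartOperation<RequestTelemetry>(
            "ProcessQueueMessage", operationId, parentId))
        {
            telemetry.TrackTrace($"Handling message: {messageBody}");
            // ... actual processing here ...
            operation.Telemetry.Success = true;
        }
    }
}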

SPAs:  There is a JavaScript SDK, but logging in SPAs needs configuration and understanding, as not every operation is logged uniquely for tracing.

Smart Detection: automatically tries to quickly warn you of problems/abnormalities and their root cause.

Snapshot Debugger/profiler: VS remote debugging can be hooked to an issue.  Shows execution traces from your live app.

Transaction Search:  Easy way to query and find data or unique info.


Sunday 13 September 2020

Building better Software Thoughts

Overview:  I see a lot of development teams, and they always seem to have areas they are good at and capabilities that need improvement.  Key is culture and building a happy team where team members trust and help one another.

Building a culture where teams enjoy code reviews is also key to successful software projects.  To improve software, reviews of various areas, not only code reviews, are essential.  For me, clear requirements are the number 1 factor in improving a team's performance.

Companies are getting better at building software; I aim to work on these topics to improve the delivery of software within scrum teams:

  1. Code Reviews & Peer Reviews (Daily reviews are awesome, should be pretty short and enjoyable not someone trying to show off or hours long)
  2. Collaboration (Standups, Slack/Teams, Code tools have collaboration built in)
  3. Documentation & Requirements Reviews
  4. Better tooling including better CI/CD tooling including static analysis tools
  5. Unit Testing, automate coding standards, Integration testing, UI Testing, and API testing 
  6. Requirements (Use Stories are clear and Acceptance Criteria)
  7. Cadence is improving thanks predominately to Agile practices; I like short release cycles (2-3 weeks depending on the team and industry).  Changing requirements, indecision kills software projects.  Agile helps, but decisive knowledgeable product owners increase the likelihood of the project succeeding.

Benefits of Code, Documentation and Requirement Reviews:

  1. Improved software quality & product delivery
  2. Share domain knowledge
  3. Training team members (useful for onboarding new team members)
  4. Reduce support and fix costs
  5. Lower cost & faster development

Options for Layering APIs on Data Sources - Microservices, kind of

Hasura takes data sources such as SQL Server (in preview), PostgreSQL & MySQL and converts them into GraphQL APIs.  The service is available on Azure and hooks into AAD and AAD B2C.  Hasura looks extremely interesting and useful - potentially a great time saver.

CDS/Dataflex/Oakdale - allows for entity creation and provides REST APIs.

SharePoint lists provide HTTP APIs for CRUD operations.

REST API's vs GraphQL

The OpenAPI specification (previously known as the Swagger specification) is my default for an API; this allows for a known RESTful API that anyone with access can use.  OpenAPI has set contracts that return defined objects, which is great: you can work with the API like a database with simple CRUD operations as defined by the specification.  The issue is that the returned objects are fixed in structure, so you may need 2 or more queries to get the data you are looking for.  Alternatively, GraphQL allows the developer to ask for the data exactly as they want it.
Open API example:
/api/users/2            // Get the user object for user 2
/api/users/2/orders/10  // Returns the last 10 orders for the user
GraphQL example:
Post a single HTTP request.
query {
  user(id: "2") {
    name
    email
    orders(last: 10) {
      orderid
      totalamount
      datemodified
    }
  }
}

Database Option Notes:
PostgreSQL - hybrid relational database (Community edition is free; Standard and Enterprise for higher loads); sits on Linux but can also use Windows OS.  Replicas are used for additional read-only nodes (I think up to 5 geo-replicas are allowed).
It can be hosted on Azure or AWS IaaS.  PostgreSQL is also offered as a fully managed service (PaaS) on Azure (either single server or hyperscale).
MariaDB - a community fork of MySQL; an open-source relational database used in the LAMP app stack.  The Azure MySQL managed service can have up to 5 replicas for read-heavy workloads.

Sunday 6 September 2020

Working with CDS data structures for non CRM types

Overview:  I am working on a Power Platform solution and I need to use CDS.  Basically, I need to be able to see and edit values within CDS.

Option 1: Microsoft SQL Server Management Studio (SSMS) version 18.6 allows connectivity and read-only access.  Here are the instructions.

Option 2: XrmToolBox has fantastic tools for Dynamics and Power Apps.  There are a lot of individual tools from various contributors.

Here I am using "Entity Relationship Diagram Creator" to look at the relationships between the CDS entities.




Saturday 5 September 2020

Reducing Power Apps dynamic calls and where to store Power Apps data

Overview:  Power Apps is driven by data, and generally that data comes from connectors.  The great news is there are a lot of different connectors, and if in trouble I always find the custom connector can be relied upon.  When working with Power Apps, it is not as simple as just having a data source and consuming it; one needs to consider all the data sources, whether we need live data, and performance.  Basically, repeatedly pulling lots of dynamic sets of data leads to poor performance.

CDS performance: there are relatively few layers when retrieving or updating data, so it is generally fast to work with.  Custom connectors have more layers, so they can be slower, and you are dependent on the underlying REST API, so watch for performance.  The on-prem Azure data gateway is pretty slow, but amazing if you need it.

Identify data sources:  CDS, Azure SQL, SharePoint, SQL on-prem, Cosmos DB, RSS, Open APIs...

Understand the security and the amount of data being pulled.  For example, say we need all the airport codes in the world for a drop-down so the user can choose their closest airport, e.g. JFK for New York's John F. Kennedy airport.  There are roughly 4,000 commercial airports in the world.

Options: Call an Open API service.  Power Apps by default returns sets of 500; Power Apps' max return count is 2,000.  You still need to perform 2 calls with paging to get the full data set.  You could use a type-ahead if the API supports it, but there will be a lag after each keystroke when Power Apps calls out to the service, and there will be a lot of calls.  More suitable would be to do 2 calls of 2,000 records each and bind the control to the returned data.

A further improvement would be to fetch the airport lookup on data load or on first request; subsequent requests would then use the local table/collection.  In effect, you are locally caching all the airport codes using 2 calls per Power Apps user session.

For the airport example, one could also store the data using an Excel import, but beware: the data is imported into Power Apps and stored locally.  The big issue is that the data is static in Power Apps; to update it, you need to re-import the Excel table.  So it is brilliant for, say, days of the week or months of the year, as these never change.  Fairly static data like airport codes also works well but requires a publisher-level overwrite to update the list.  Also, storing extreme amounts of static data leads to bloat of the app, and that data still needs to be loaded, so I would not consider it for 100k+ items as a general rule.

FYI, Microsoft recommends using fewer than 30 connections per application.

More Info:

Todd Baginski has a great video on using Excel import and creating language variations/multi-lingual Power Apps using Excel imports.

Alagunila Meganathan on C# Corner has a good post on Excel Imports for Power Apps.


Monday 24 August 2020

AWS vs Azure services offering comparison for Solution Architects

Overview: Microsoft provides a useful list that allows me to know AWS services aligned to Azure Services.  This is pretty useful if you know 1 platform considerably better than another to quickly figure out your options on either AWS or Azure.

My Service comparison notes:

Amazon CloudWatch - same as Azure Monitor.

Amazon Relational Database Service (RDS) – SQL Server, Oracle, MySQL, PostgreSQL and Aurora (Amazon's proprietary database).

Azure SQL lines up with Amazon's RDS SQL Server Service.  Although Aurora is probably also worth the comparison as it's AWS's native DB option.

Amazon DynamoDB – same as Cosmos DB – NoSQL database.

AWS API Gateway - Azure API Management

Amazon Redshift is the data warehouse.  Can be encrypted and isolated.  Supports petabytes of data.

Amazon ElastiCache runs Redis cache and Memcached (simple cache).

AWS Lambda – Azure Functions, i.e. serverless.

AWS Elastic Beanstalk – platform for deploying and scaling web apps & services.  Same as Azure App Services.

Amazon SNS – Pub/Sub model – Azure Event Grid.

Amazon SQS – Message queue.  Same as Azure Storage Queues and Azure Service Bus.

Amazon Step Functions – workflow.  Same as Logic Apps.

AWS Snowball – same as Azure Data Box.  Physically copy data and transport it to the data centre for upload.

Virtual Private Cloud (VPC) – Azure virtual network 

Amazon AppStream - Azure Virtual Desktop (VDI), I think.

Amazon QuickSight - Power BI (or Tableau) for business intelligence.

AWS CloudFormation - ARM and Bicep.

Tip: I am glad that I did the AWS Certified Cloud Practitioner exam, as it helped my understanding of the AWS offering, which has been very useful in large integration projects.  I have worked with AWS IaaS historically (EC2, API Gateway and S3).  Like Azure, there are a lot of services and features.  Basically, there are equivalent services across Azure & AWS; occasionally it is a 2-to-1 service mapping, or something not offered by the other cloud provider at all.

Sunday 23 August 2020

AWS vs Azure vs GCP Comparison

Overview:  I predominantly use Azure & Microsoft for all my cloud services.


I have installed multiple SharePoint farms and setups on AWS EC2 instances, and I'm currently preparing for the AWS Cloud Practitioner exam.  I have used Google for authentication and SaaS, but not as an IaaS offering.  I'm also a huge fan of Heroku, which is great for PaaS; I used it to host my game built for Facebook games.  I've also seen IBM's cloud offering a few years ago; for me it is too niche and not as feature rich.  So basically, I understand Azure's offering well, and I found this comparison pretty useful.

My Thoughts:  The contenders:  I really like Heroku for its simplicity.  I feel that for a small indie developer or company, Heroku has good free tiers and cheap, simple billing options.  GCP I really can't comment on from a good position of knowledge, but from what I've used, I like GCP.  GCP is the third biggest cloud provider.  As a large organisation, I'd only consider the big three (Microsoft Azure, AWS, and GCP) to be our cloud partner.  Multi-cloud is a demand from some organisations; it's truly extra expensive.  Azure uses ARM templates and has many options for provisioning the IaaS and PaaS offerings.  If you are thinking multi-cloud, consider Terraform by HashiCorp for IaC.  There is also the concept of Click-Ops (sic), which is clicking through the UI of the cloud management portal to get to the desired architecture; this is fine for simple, small architectures, but you can't do it at any scale or with any agility, and it's super error prone.  Click-Ops is more a joke term for the laziest way to build infrastructure, dressed up to sound modern.  IBM's offering: if you are a partner, you could go with this option, but it is aimed more at large business partners.  IBM's cloud is IaaS-focused with some PaaS offerings, but once again I'm not an expert.

AWS has always been really easy to use.  It is big and complex like Azure, with many offerings.  Basically, I'd choose AWS if the organisation was already using it and the people in the org have experience with AWS.  AWS was originally aimed at the B2C/startup market but was first to market at scale.

Azure: in my world, Azure and O365 feel like the dominant player, but the diagram below provides a great insight into the relative size of the cloud infrastructure market.  Azure's SaaS offering O365/M365 is also huge and is hosted on Azure.  Azure security is well thought out, and their thinking on BYOK and geo-location appears to be important.  Microsoft offers ARM templates and DSC for configuring environments; they are also adding Bicep, an abstraction layer that compiles down to ARM templates for deployment into Azure.

There is a good resource, CloudWars.co, that looks at the various cloud providers.  My current takeaway is that Amazon is the biggest player in the IaaS field.  Azure has IaaS, a large PaaS offering and a massive SaaS offering (including Dynamics and O365) that Amazon has no equivalent of.  I am focused on PaaS solutions for my customers, so as to remove the infrastructure and process overheads of IaaS.

Off the top of my head, here are the reasons for moving to the cloud and the objections I hear, regardless of platform:

Why Cloud:

  1. Save Money
  2. More Secure
  3. Fast Delivery/More Agile/Easy to scale/Increase business resilience
  4. Eco-friendly

Challenges:

  1. Lack of budget
  2. Spiraling costs
  3. The switch from the CAPEX model, the common business norm, to OPEX is difficult for some businesses
  4. Resources/skills
  5. Believing security is an issue/not trusting the Cloud
  6. Migrating legacy apps (for me: don't move to the cloud unless you get a significant advantage)


Sunday 16 August 2020

Onboarding staff and New Starters for SaaS

Overview:  With large SaaS projects, various stakeholders tend to start at different times.  For me, it takes a while to get all team members up to speed with: domain knowledge, technical knowledge, our design and big picture, and company specifics.

Thoughts:

In person and Teams/Skype meetings are great for building relationships and getting the most information to the new starter.

Record presentations, then shorten and annotate them to make focused, high-impact onboarding material that can be reused across various roles.

For each starter, ensure they have an introduction/onboarding plan: book meetings, give people the background, and have resources ready, e.g. websites, MP4s, and a glossary of terms:


Resources:

List out useful resources specific to the role.  I tend to keep these in SharePoint, but as long as content is searchable, consistent and of value, I am in the right place.

Saturday 1 August 2020

Possible Technical Roadmap and thoughts on a startup

Overview:  A friend of mine's son has recently built a web site, and it looks impressive.  We started discussing his project, and this turned into a detailed technical conversation, to everyone's horror, at a BBQ.  This chap is 14, and all I can say is wow.  All hosted and built without spending any money.  I interview many developers, and his knowledge is outstanding.

I've drawn his architectural explanation below, added a few items to check, and noted what my next piece of work would be.

Clarification of High Level Design
Thoughts:
  1. Build the native mobile apps; user buy-in with a native app will be considerably higher.  As you have used React, use React Native (React is separate to React Native - a completely new codebase).
  2. Cordova/PhoneGap is a wrapper that would allow you to keep a single code base and merely inject the existing code into native wrappers for iPhone and Android.  As your app is HTML5 and already looks like a modern mobile app, I'd use PhoneGap.  At least try it out and see if it fits.  You'd then only have to deal with a single code base to update all the platforms.  You could always use Google's Flutter if you wanted a single code base for the web site and native mobile apps.
  3. Ensure your APIs are OpenAPI/Swagger.
  4. Security - the API already has some protection.  Ensure the database is protected/secured.
Revenue model:  Ads (gambling ads pay well), or make a premium paid-for service (keep end-user usage/sign-up free).  Sign-up and collection of money, while not directly applicable to this venture, can be done and kept dead simple using Shopify.

Thursday 23 July 2020

Shopify - Add optional installation on the cart for shoppers

Problem:  In my Shopify shopping cart, if a buyer is checking out and they have plants/flowers in their cart, I need to offer them the ability to have someone plant them.

This could have various other applications; for furniture, for instance, you could offer an assembly service.

Initial Hypothesis: Shopify uses Liquid as its scripting language.  Liquid allows us to combine Liquid with HTML, CSS and JavaScript to get the desired page behavior.  Below is my user story:

As a shopper, I want to be able to add labor so that I can have my flowers/plants installed/planted, and I know an expert has got my flowers in correctly.

I also need to add the appropriate amount of labor: if there are fewer than 3 plants, charge for 1 hr; otherwise charge for 2 hrs.  Shopify has variants, allowing me to have a labor product for 1 or 2 hrs.

Resolution:  Amend the cart.liquid cart summary page so that labor for planting can easily be added by the shopper.

Desired Behavior: 
Add a button to add installation

The optional installation cost for 1 hr is added to the store
Steps to Implement: 
1. Create a new product in Shopify, "Plant for me please".  Add it to a unique collection; mine was "Last Minute Checkout Items".  Add a couple of variants for time/cost as shown below:
2. Go to the product page and append .xml to the end of the URL e.g., https://nurserynearby.co.za/collections/last-minute-checkout-items/products/would-you-like-someone-to-plant-your-plants.xml and get the Variant IDs; we need these to add the correct amount of labor cost later.
3. Open the cart.liquid file and add the following logic:




Shopify thoughts

Overview:  Recently, I had two requests for some work around shopping carts/auctions.  I dismissed the first request as it is not what I specialize in.  On the second request, I decided to take the project on at a hugely discounted rate, as I felt I did not have expertise in the field.

Shopify: Shopify is an amazing product and ecosystem, and I have built a great shop on the SaaS Shopify platform in record time.  There are great plug-ins to add functionality such as gift registries and mobile app integration.  I have now been playing with the product, delivered solutions, worked with app vendors, reviewed competitors, and had help from Shopify support, and I can honestly say that it is awesome.


Program-ability:

  • There are simple APIs that are well documented.
  • The internal language is Liquid, which is basically a UI scripting language.
  • Shopify SaaS has add-ins and templates, so look before developing.
Liquid:


In the code snippet above, I am building custom functionality in the cart checkout to add specific products, allowing more sales on the store depending on custom rules provided by the customer.

Mobile:
The OOTB templates, and those shop admins can purchase, follow responsive design as one would expect.  There are add-ons for iOS and Android to make native apps, very well priced with good functionality, charged on a monthly recurring basis (circa $40-100/month).  I need a custom mobile app for both platforms, so I'm choosing between building with Flutter or Blazor.


Friday 10 July 2020

Power Apps Tracing to App Insights Not Working in edit mode

Overview: Power Apps integrates directly with App Insights for logging.  This post looks at the issues around tracing to App Insights.

Setup & Verify:
An App Insights instance Instrumentation Key is required; configure your Power App to trace to App Insights, and create a button to test.
The Monitor tool is fantastic for tracing all outbound traffic; you no longer need to go to the service to check whether Power Apps is reaching it.  For example, I used to have to look at the APIM Azure App logs for custom connectors.
I can see all my interactions in Power Apps and travelling out.  This is a massive win for Power Apps, giving tracing from the front end.

Greg Lindhorst wrote a post on using the Power Apps Monitor tool for debugging and performance improvement.
Note: Traces written to App Insights don't work in preview/edit mode.  To verify, see below.

Problem: It normally takes a few minutes (like 2 minutes) for traces to show up in App Insights; they are not showing up after 15 minutes.  Let's see - I've dropped the Power Apps team a message, and I think it's a bug that has crept in recently.  Today is 10 July 2020.  I've also noticed that none of my session page tracing is showing up in App Insights.

Update: 11 July 2020.  The Power Apps behaviour has been on my mind.  When I publish the app and use it, App Insights logs perfectly.  When running the app in edit mode, even direct traces do not get logged.  I did not realise this, but it kind of makes sense.

Resolution: Publish & Run the app, and the Tracing and Power App session tracing shows up in App Insights.

Only when I run the published app do my Page Views and Custom Traces get logged in App Insights.

Tuesday 30 June 2020

Multi-Geo for MS Teams

O365 offers multi-geo tenants to meet data residency rules for 13 countries and regions (as of 30 June 2020):
  1. Australia
  2. Asia Pacific 
  3. Canada
  4. European Union
  5. France
  6. India
  7. Japan
  8. Korea
  9. United Kingdom
  10. United States
  11. United Arab Emirates
  12. South Africa
  13. Switzerland.
Teams data resides in SharePoint Online, OneDrive for Business and Exchange Online.  With Multi-Geo enabled, a company can specify where data will reside.  There are 2 parts to multi-geo:
  • User-specific data.  This data is stored in the satellite geo assigned to each user e.g. email, OneDrive.
  • Company/project/division-specific data e.g. file shares, document libraries.
For more info on Multi-Geo on O365

Microsoft "Multi-Geo is currently available to Enterprise Agreement customers with a minimum of 250 Microsoft 365 Services subscriptions."

The UK South Azure region has 3 data centres/zones, and it is geo-paired with UK West, with over 150 miles between the regions.

MS Teams Background Info:
https://www.pbeck.co.uk/2019/12/microsoft-teams-governance.html
https://www.pbeck.co.uk/2020/05/microsoft-teams-overview.html

Note: Since 2022, I think, Microsoft has had Microsoft Priva to help manage country privacy & compliance laws.