Showing posts with label Azure. Show all posts

Tuesday 20 June 2023

App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards

Series

App Insights for Power Platform - Part 1 - Series Overview 

App Insights for Power Platform - Part 2 - App Insights and Azure Log Analytics 

App Insights for Power Platform - Part 3 - Canvas App Logging (Instrumentation key)

App Insights for Power Platform - Part 4 - Model App Logging

App Insights for Power Platform - Part 5 - Logging for APIM 

App Insights for Power Platform - Part 6 - Power Automate Logging

App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards (this post)

App Insights for Power Platform - Part 8 - Verify logging is going to the correct Log analytics

App Insights for Power Platform - Part 9 - Power Automate Licencing

App Insights for Power Platform - Part 10 - Custom Connector enable logging

App Insights for Power Platform - Part 11 - Custom Connector Behaviour from Canvas Apps Concern

Overview: Azure Dashboards are excellent, but if you want beautiful dashboards, use the Azure Managed Grafana service or Power BI dashboards.

It's always difficult to identify KPIs and graphs that let support and stakeholders quickly digest monitoring information.  One option is an overview dashboard for the PM, PO, business owners, etc., and a separate set of dashboards for support.

Identify what is important.  End-to-end testing is always valuable, especially when run continuously (CI) to detect abnormalities.

For instance, this Azure Dashboard fires off a Canvas App recorded test (built using Test Studio) and shows the speed (performance) of the run; I then warn the user if the performance is too slow.  I also grab the oldest successful run from 7 days ago to check whether performance is considerably different.

Power Automate has its own logging, but pushing log entries into Log Analytics when an error occurs allows me to see if any of my workflows have a problem.  This is discussed in part 6 of the logging series on Flows.

Azure services are often used when building solutions on the Power Platform.  The most common are Functions, App Services, APIM, Service Bus, and sometimes SendGrid.  We need to know that they are working, and help the user see performance and issues.  Here are a couple of examples.







Thursday 30 December 2021

Azure DevOps Series - Overview

Azure DevOps is a SaaS platform that provides tools for delivering software using DevOps techniques, generally within Agile software delivery projects.  DevOps is useful for gathering requirements, building the solution, performing daily “integration builds”, and producing deliverable end-of-sprint demos.  The key difference in building software today versus several years ago is that we should automate as much as possible.  Azure DevOps provides excellent tooling for this, and automation results in better quality and reduced timelines.  Building software is easy as long as you have great people and precise requirements; Agile practices and DevOps processes and tooling can help get you to that desired state.

  1. Azure Boards are great for planning and getting the requirements cleanly broken down.
  2. Visual Studio, coupled with Azure Boards items and Azure Repos (source control options are Git and TFVC), is ideal for development in most languages and frameworks such as C#, Node.js, Angular, React, and TypeScript.
  3. Azure Pipelines are good at deploying solutions by setting up the infrastructure (I prefer to use PaaS and get out of the Infrastructure world, using ARM templates) and deploying code with the appropriate DTAP environment configuration.  Azure Test Plans are used to verify builds. 
  4. Monitor and Alert – Azure Monitor/App Insights is ideal for monitoring the infrastructure and operating code to detect issues early.
  5. Azure Artifacts help create and share Nuget code packages.


Friday 30 April 2021

Azure Naming Conventions

 My Format (I simplify for smaller companies)

<Company>-<BusinessUnit>-<Region>-<Environment>-<ResourceType>-<Project>-<Instance>

GS-IT-UK-PR-RGP-Treetops-001

GS-HR-US-DV-NSG-Cloud-001

I like to enforce the same length for each part because it makes names easier to read in a list, e.g. Region could be the two-letter country code.  Case consistency is also important.

Environment is my DTAP environment, i.e. DV = Development, TS = Test, AS = Acceptance, PR = Production.

Resource Type is the Azure resource type, e.g. Network Security Group = NSG.  It is worth publishing a list, as App Services could be app or aps.
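As a sketch, the convention above can be captured in a small helper; this is my own illustration, not an official tool, and the environment codes are just the DTAP codes from this post:

```python
# Sketch of the naming convention above as a small helper; my own
# illustration, not an official tool. The environment codes are the
# DTAP codes from this post; everything else is an assumption.

ENVIRONMENTS = {"DV", "TS", "AS", "PR"}  # Development, Test, Acceptance, Production

def resource_name(company, unit, region, env, rtype, project, instance):
    """Build <Company>-<BusinessUnit>-<Region>-<Environment>-<ResourceType>-<Project>-<Instance>."""
    env = env.upper()
    if env not in ENVIRONMENTS:
        raise ValueError("Unknown environment code: " + env)
    parts = [company.upper(), unit.upper(), region.upper(),
             env, rtype.upper(), project, "%03d" % instance]
    return "-".join(parts)

print(resource_name("GS", "IT", "UK", "PR", "RGP", "Treetops", 1))
# GS-IT-UK-PR-RGP-Treetops-001
```

A helper like this also makes the convention enforceable in IaC scripts rather than relying on people remembering it.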

Tip: In Azure, sometimes you can't use hyphens or you must use lowercase.  If I am forced, I keep the same convention but abide by the rules of the service.

The key is to keep it consistent.  I find organisations use tags poorly, so a good naming convention helps replace the need for tags; tags can also be added easily, because the name already gives the information away.

Microsoft also publishes a recommended Azure naming convention.


Another example format:
<4digitApplicationName>-<2digitEnvironment>-apim-<2digitCounter>, e.g. taxp-pr-apim-01 or dvla-dv-appins-01.
I think it is key to agree on a standard format and stick to it.

I 100% ensure naming is consistently followed for resources; it's also useful to have a few tags, but I'm not dogmatic about them.

Tags Examples
Env: Dev
Data Classification: Confidential
Project: Tax Treaties

Sunday 29 November 2020

Azure SQL Basic Options Summary

Overview: Azure SQL is incredible.  There are a lot of options when choosing how to host databases, and performance is good.  It "handles patching, backups, replication, failure detection, underlying potential hardware, software or network failures, deploying bug fixes, failovers, database upgrades, and other maintenance tasks", from Microsoft Docs and Azure SQL.

Azure SQL is the PaaS database service that performs the same function SQL Server did for many years as the workhorse of many organisations.  Microsoft initially only offered creating VMs and installing SQL Server on them.  Azure SQL is Microsoft's PaaS SQL-database-as-a-service offering in the cloud: a fully managed platform for SQL databases where Microsoft handles patching, managed backups, and high availability.  All the features available in the on-premises edition are also built into Azure SQL, with minor exceptions.  I also like that the minimum SLA provided by Azure SQL is 99.99%.

Three SQL Azure PaaS Basic Options:

  1. Single Database - a single isolated database with its own guaranteed CPU, memory and storage.
  2. Elastic Pool - a collection of single isolated databases that share DTUs (CPU, memory & I/O) or virtual cores.
  3. Managed Instance - you manage a set of databases with guaranteed resources.  Similar to IaaS with SQL installed, but Microsoft manages more of it for me.  Can only be purchased using the virtual core model (no DTU option).
Thoughts: Managed Instances are recommended up to 100 TB but can go higher.  Individual databases, under elastic pools or as single databases, are limited to a respectable 4 TB.

Two Purchasing Options:

  1. DTU - a single blended metric that Microsoft uses to represent CPU, memory and I/O.
  2. Virtual Cores - allows you to choose your hardware/infrastructure.  You can optimise for a higher memory-to-CPU ratio than the generalist DTU option allows.
Thoughts: I prefer the DTU approach for SaaS and greenfield projects.  I generally only consider virtual cores if I have migrated on-premises SQL onto a Managed Instance, or for big workloads where virtual cores can work out cheaper if the load is consistent.  There are exceptions, but that is my general rule for choosing the best purchasing option.

Three Tiers:

  1. General Purpose/Standard (there is also a lower Basic level)
  2. Business Critical/Premium
  3. Hyperscale

Backups

Point-in-time backups are automatically stored for 7 to 35 days (default is 7 days) and protected using TDE; full, differential and transaction log backups are used for point-in-time recovery.  The backups are stored in RA-GRS blob storage (meaning in the primary region, with read-only copies in a secondary Azure region): 3 copies of the data in the active Azure region and 3 read-only copies of the data in the secondary.

Long Term Retention (LTR) backups can be kept for up to 10 years; these are full backups only, with weekly granularity (the smallest retention is one full backup per week).  LTR is available in preview for Managed Instances.

Azure Defender for SQL 

Monitors SQL database servers, checking vulnerability assessments (best practice recommendations) and providing Advanced Threat Protection, which monitors traffic for abnormal behaviour.

Checklist:

  1. Only valid IPs can directly access the database; deny public access,
  2. Use AAD security credentials and service principals,
  3. Advanced Threat Protection gives real-time monitoring of logs and configuration (it also scans for vulnerabilities),
  4. The default is encryption in transit (TLS 1.2) and encryption at rest (TDE) - don't change it,
  5. Use dynamic data masking inside the db instance for sensitive data, e.g. credit cards,
  6. Turn on SQL auditing,

Note: Elastic Database Jobs (same as SQL Agent Jobs).

Azure also offers MySQL, PostgreSQL and MariaDB as hosted PaaS offerings.

Note: The Azure SQL PaaS service does not support the filestream datatype; use varbinary or references to blobs.

Monday 24 August 2020

AWS vs Azure services offering comparison for Solution Architects

Overview: Microsoft provides a useful list that aligns AWS services to Azure services.  This is pretty useful if you know one platform considerably better than the other and want to quickly figure out your options on either AWS or Azure.

My Service comparison notes:

Amazon CloudWatch - same as Azure Monitor.

Amazon Relational Database Service (RDS) – SQL Server, Oracle, MySQL, PostgreSQL and Aurora (Amazon’s proprietary database).

Azure SQL lines up with Amazon's RDS SQL Server Service.  Although Aurora is probably also worth the comparison as it's AWS's native DB option.

Amazon DynamoDB – same as Cosmos DB – NoSQL database.

AWS API Gateway - Azure API Management

Amazon Redshift is the data warehouse.  Can be encrypted and isolated.  Supports petabytes of data.

Amazon ElastiCache runs Redis and Memcached (a simple cache).

AWS Lambda – Azure Functions, i.e. serverless.

AWS Elastic Beanstalk – platform for deploying and scaling web apps & services.  Same as Azure App Service.

Amazon SNS – Pub/Sub model – Azure Event Grid.

Amazon SQS – Message queue.  Same as Azure Storage Queues and Azure Service Bus.

Amazon Step Functions – workflow.  Same as Logic Apps.

AWS Snowball – same as Azure Data Box.  Physically copy data and transport the device to the data centre for upload.

Virtual Private Cloud (VPC) – Azure virtual network 

Amazon AppStream - Azure Virtual Desktop (VDI), I think.

Amazon QuickSight - Power BI (Tableau Business Intelligence).

AWS CloudFormation - ARM and Bicep

Tip: I am glad that I did the AWS Certified Cloud Practitioner exam, as it helped my understanding of the AWS offering, which has been very useful on large integration projects.  I have worked with AWS IaaS (EC2, API Gateway and S3) historically.  Like Azure, there are a lot of services and features.  Basically, there are equivalent services across Azure & AWS; occasionally it is a two-to-one mapping, or something one cloud provider simply doesn't offer.

Sunday 23 August 2020

AWS vs Azure vs GCP Comparison

Overview:  I predominantly use Azure & Microsoft for all my cloud services.


I have installed multiple SharePoint farms and setups on AWS EC2 instances, and I'm currently preparing for the AWS Cloud Practitioner exam.  I have used Google for authentication and SaaS, but not as an IaaS offering.  I'm also a huge fan of Heroku, which is great for PaaS; I used it to host my game built for Facebook Games.  I've also seen IBM's cloud offering a few years ago - for me it is too niche and not as feature rich.  So, since I understand Azure's offering well, I found this comparison pretty useful.

My thoughts on the contenders: I really like Heroku for its simplicity.  For a small indie developer or company, Heroku has good free tiers and cheap, simple billing options.  GCP I can't really comment on from a position of knowledge, but from what I've used, I like it; GCP is the third biggest cloud provider.  As a large organisation, I'd only consider the big three - Microsoft Azure, AWS, and GCP - as a cloud partner.  Multi-cloud is a demand from some organisations, but it is truly extra expensive.

Azure uses ARM templates and has many options for provisioning IaaS and PaaS offerings; if you are thinking multi-cloud, consider Terraform by HashiCorp for IaC.  There is also the concept of "ClickOps", which is clicking through the UI of the cloud management portal to get to the desired architecture.  This is fine for small, simple architectures, but you can't do it at any scale or with agility, and it's super error prone - ClickOps is more a joke term for the laziest way to build infrastructure, dressed up to sound modern.  As for IBM's offering: if you are a partner you could go with this option, but it is aimed more at large business partners.  IBM's cloud is IaaS focused with some PaaS offerings, but once again I'm not an expert.

AWS has always been really easy to use.  It is big and complex like Azure, with many offerings.  Basically, I'd choose AWS if the organisation was already using it and the people in the org have experience with AWS.  AWS was originally aimed at the B2C/startup market but was first to market at scale.

Azure: in my world, Azure and O365 feel like the dominant player, but the diagram below provides a great insight into the relative size of the cloud infrastructure market.  Azure's SaaS offering O365/M365 is also huge and hosted on Azure.  Azure security is well thought out, and their thinking on BYOK and geo-location appears to be important.  Microsoft offers ARM templates and DSC for configuring environments, and is also adding Bicep, an abstraction layer that compiles down to ARM templates for deployment into Azure.

There is a good resource, CloudWars.co, that looks at the various cloud providers.  My current takeaway is that Amazon is the biggest player in the IaaS field.  Azure has IaaS, a large PaaS offering, and a massive SaaS offering (including Dynamics and O365) that Amazon has no equivalent of.  I focus on PaaS solutions for my customers so as to remove the infrastructure and process overheads of IaaS.

Off the top of my head, reasons for moving to the cloud and objections I hear, regardless of platform:

Why Cloud:

  1. Save Money
  2. More Secure
  3. Fast Delivery/More Agile/Easy to scale/Increase business resilience
  4. Eco-friendly

Challenges:

  1. Lack of budget
  2. Spiralling costs
  3. The CAPEX model is the business norm, and some businesses find the switch to OPEX difficult
  4. Resources/skills
  5. Belief that security is an issue/don't trust the cloud
  6. Migrating legacy apps (for me: don't move them to the cloud unless you get a significant advantage)


Tuesday 3 December 2019

Web Api hosted on Azure App Service with OIDC security using Azure AD B2C

Problem:  I want to add security to my .NET core ASP.NET Web API C# application using Azure AD B2C.

Terminology:
  • .NET Core - a cross-platform revision of the .NET Framework.  Allows your application to run on Linux, macOS and Windows, without needing the full .NET Framework installed.
  • ASP.NET Web API - follows the MVC pattern, using controllers and models to provide HTTP services, e.g. give me the temperature in Paris today.
  • Azure App Service - hosts an MVC app or Web API on Azure.  Acts as a web server; it is scalable and fully managed.
  • Azure Active Directory (AAD) B2C - AAD B2B is different from AAD B2C; they are totally separate services on Azure.  Business-to-Consumer (B2C) provides applications with an identity repository.  B2C provides authentication and identity management as a service to web and mobile applications.  Think of it like Google authentication, but you own the identity provider instead of relying on third-party authentication providers like Google.
  • IdP - Identity Provider; B2C is one of the two AAD services for managing users/identities on Azure.
  • MVC - Model, View, Controller is a pattern used to arrange software.  In this post I'm referring to projects that use the MVC templates to create web sites or Web APIs.



Problem: MVC web application hosted on a Web App, using Azure B2C, B2C holds users and also uses a social Identity Provider (IdP) namely Google.

Figure 1, Create a new project on the Google Developer Console

Figure 2, OAuth Consent Screen setup
Figure 3, Add the Credentials to Google
AAD B2C linkup to Google IdP.

High-Level Approach:
  1. Create your own Azure tenant & B2C service instance on Azure (using the Azure Portal)
  2. Register your ASP.NET Web application on the Azure tenant (using the Azure Portal)
  3. Create User Flows (Policies) on the B2C tenant (This allows you to create the flow to sign-in a user, create a new account, or a user to reset their password,...)
  4. Setup Google to connect to the B2C IdP (see figure 1-3)
  5. Update the application created in Step 4 so that it is aware of the Google IdP
  6. Perform authentication setup - create the MVC web application using Visual Studio
Tip: This approach and technology is extremely well documented by Microsoft.


Friday 29 November 2019

Redis Cache

Caching in Microsoft technologies has many forms and options.  You can have a server-side cache, a client-side cache, or downstream caching such as proxy servers.

Server Cache in .NET:
  • In-memory store: this type of server cache was historically very useful for maintaining state; having users tied to a collection of objects/knowledge in memory is extremely fast.  You need to think about multiple instances and how to route traffic: you can use static (sticky) routing to ensure user Joe's session always goes to server 3, but if server 3 dies, you lose all the cached data.
  • Local storage cache works well for larger amounts of data that are too big for a memory cache, but as it is storage based it is much slower.
  • Shared/centralised caching allows a shared cache; a good example is using SQL Server for caching, where the user can go to any front-end web server and the cache is pulled from SQL.  This allows for failure, with no need for fixed user routing of requests (sticky sessions).  It is slower, as the cache is pulled from a central service that in turn goes into SQL to retrieve the data.
  • Azure Cache for Redis is a form of shared caching.  A cache is better held in memory, as Redis does.  Redis Cache is a service optimised for performance that allows stateless caching to work across server instances.  So while requests travel over your network in Azure, it is not as fast as a local cache, but it is extremely fast for centralised caching.  Redis is pretty advanced and has clustering and replication to keep performance good.
https://docs.microsoft.com/en-us/azure/architecture/best-practices/caching
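The shared-cache pattern above can be sketched in a few lines.  This is a minimal cache-aside illustration in Python; in production the dictionary would be a shared store such as Azure Cache for Redis, and the function and variable names are my own assumptions:

```python
import time

# Minimal cache-aside sketch of the shared caching idea above. In
# production the dict would be a shared store such as Azure Cache for
# Redis; names here are my own illustration, not a real API.
_cache = {}  # key -> (value, expires_at)

def get_with_cache(key, load_from_source, ttl_seconds=600):
    """Return the cached value if still fresh, otherwise reload and cache it."""
    entry = _cache.get(key)
    if entry is not None and entry[1] > time.monotonic():
        return entry[0]                                   # cache hit
    value = load_from_source(key)                         # cache miss: go to the data source
    _cache[key] = (value, time.monotonic() + ttl_seconds)
    return value

# Demo: the second lookup is served from the cache, so the "database"
# loader only runs once.
calls = []
def fake_db(key):
    calls.append(key)
    return key.upper()

print(get_with_cache("product", fake_db))  # PRODUCT
print(get_with_cache("product", fake_db))  # PRODUCT (cached)
print(len(calls))  # 1
```

The same shape applies whether the backing store is a dict, SQL Server, or Redis; only the get/set calls change.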

Client Side Cache:
  • The browser can hold information in session storage, but it is not secure, or at least less secure than a server-side cache.
  • CDN's are a way of retrieving data from an optimized central store but useful for static files and data.
  • Adding headers to HTTP responses allows for downstream caching.  For example, I offer a REST API (e.g. a C# Web API) that returns a featured product.  I could easily set expiry caching to 10 minutes: the product then refreshes every 10 minutes for each user and is held in the user's local cache.  So if an average user is on for 20 minutes, only 2 of their 10 requests go to the REST API; the other 8 calls are served locally.  Coupled with server caching, the requests that do reach the server are fast, with very few calls to the back-end database, and the relatively static data places far less demand on the web service.
  • Validation caching is client caching that stores data locally, but a request is sent to the server to ask if anything has changed; if so, the new data is sent back.  If the data has not changed, a 304 response is sent and the browser uses the previously stored local cached data.
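Validation caching can be sketched with an ETag exchange.  This is an illustrative Python sketch, not a real framework API: the client keeps the data plus the ETag and revalidates with If-None-Match, and the server answers 304 when nothing changed:

```python
import hashlib

# Sketch of validation caching with an ETag. Function names are
# hypothetical illustrations, not a real framework API.

def make_etag(body):
    """Derive a short ETag from the response body."""
    return hashlib.sha256(body).hexdigest()[:16]

def handle_request(body, if_none_match):
    """Return (status, payload): 304 with no body, or 200 with (etag, data)."""
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, None              # client's cached copy is still valid
    return 200, (etag, body)          # send fresh data and its ETag

status, payload = handle_request(b"featured product", None)
status2, payload2 = handle_request(b"featured product", payload[0])
print(status, status2)  # 200 304
```

The 304 response saves the body transfer but still costs a round trip, which is why it complements rather than replaces expiry caching.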
Client and Server (Redis) side Caching on Azure
Note: the diagram only outlines client caching in the sense of the browser or end device.  Using HTTP headers, you can also set downstream proxy caching, which is also generally referred to as client caching.

Firefox has a nice shortcut to check your client cache is working.  Open Firefox and request the URL that should cache a resource on the local client machine, then in the URL bar type in "" and check what is being served up locally.


Basic Caching Options:
1. Client cache (local) - saves a network call and server-side processing.  Can hold client-specific static data.
2. Client cache (proxy) - saves server-side processing.  Can only hold generic static data.
3. Server-side cache (Redis) - saves computing resources by not having to connect to data sources on every request.  Useful for static shared data.



Friday 22 November 2019

Azure IaaS backup Service Notes

Azure has great cost calculation tooling.  DR can be pretty expensive if running but not being used; having the ability to turn on or deploy a DR environment on demand can yield massive cost savings.

I often see organisations overspending Azure dollars.  Most cost reduction falls into 1 of these 3 groups:
  1. Eliminate waste - storage & services no longer used
  2. Improve utilisation - oversized resources
  3. Improve billing options - long-term agreements, Bring Your Own Licence (BYOL)

Apptio Cloudability is a useful tool for AWS and Azure cost savings.  Azure has good help and tooling for cost savings.

Azure IaaS Backup:
  • Recovery Services vaults
  • Off-site protection (Azure data centre)
  • Secure
  • Encrypted (256-bit encryption at rest and in transit)
  • Covers Azure VMs, VMs with SQL, and on-premises VMs or servers
  • Server OS supported: Windows 2019, 2016, 2012, 2008 (x64 only)
  • SQL all the way back to SQL 2008 can be backed up
  • The Azure Pricing Calculator can help estimate backup costs
  1. Azure Backup Agent (MARS agent), used to back up files and folders.
  2. Azure Backup Server (a trimmed-down, lightweight version of System Center Data Protection Manager (DPM)), used for VMs, SQL, SharePoint, Exchange.
  3. Azure VM Backup, managed through the Azure Portal to back up Azure VMs.
  4. SQL Server in Azure VM backup, used to back up SQL databases on Azure IaaS VMs.

Backing up Azure VMs must be done to a vault in the same geo-location; it can't cross geo-locations.  Recovery has to be to a different location (verify this is correct?)
Note: the "Backup Configuration" setting in the vault properties can be set to "Geo-redundant"

Azure Recovery Services vault storage choice:
LRS - Locally Redundant Storage - 3 local copies
GRS - Geo-Redundant Storage - 3 async copies in a second region plus the 3 local copies; the paired region can be kept in Europe for compliance, so all 6 copies stay in Europe.
Update Feb 2020: I think there is also a GZRS option; check if this has changed?

Naming is absolutely key, as is having a logical hierarchy within resource groups so it is easy to find resources.  I focus on naming my resources consistently; however, I've always felt tags have little or no purpose in smaller environments.  In larger environments, tagging can be useful for cost management, recording maintenance, resource owners, and creation dates.  Lately, I've found it useful to mark my resources with an Environment tag to cover my Azure DTAP scenarios, e.g. Production, Testing, Development.

Tuesday 12 November 2019

Microsoft Information Protection


Check out my earlier Post on AIP (feb 2019)

End-to-end life-cycle for encrypting files using Azure Information Protection (AIP)


Use "Unified Labeling" to create labels



Note: encrypting stops SharePoint being able to look into the content of the file.  The labels and the name are still searchable, but not the content of the file.  eDiscovery, search, and co-authoring don't work on AIP-encrypted documents.

Cloud App Security (MCAS) Screen Shot


Sunday 6 October 2019

Common Azure Services

Azure Key Vault - Secure config storage and retrieval
There are SDK's for working with Azure Key Vault such as the "Azure Key Vault secret client library for .NET (SDK v4)".  Extremely easy to get secrets from the secure vault using C#.

Azure Storage
Microsoft Azure Storage Explorer is a great tool for reviewing your Azure Storage and in the case below I used it to add some Azure table storage for a demo customer list.
There is also a web edition of Storage explorer that is in preview as of 18 Nov 2020.

App Service - Host Web sites or WebAPI

Azure Artifacts - Code and share your packages via NuGet, and npm packages with Azure Artifacts for more reliable and scalable builds

Azure Data Factory (ADF) - basically PaaS, fully managed Azure ETL/SSIS.  Many connectors to ingest data.  Sends data to Azure Synapse Analytics.

Azure Big Data


Azure Synapse Analytics - a managed PaaS solution that brings together ADF, Data Lakes (both storage and analysis) and Azure Data Warehouse under a single managed solution.  Easier than the individual pieces, and scales as you need with almost unlimited capacity.  Azure Purview discovers and analyses all your data, and integrates with AIP.  Azure Synapse simplifies analytics and is sold as serverless PaaS or dedicated capacity.  The easiest way to draw data out of Azure Synapse is Power BI.  It is also easy to bring data into Azure Synapse from Cosmos DB and SQL databases: they can automatically push the data into Synapse with no effect on performance and no need for ADF, and the data arrives in near real time.



Azure App Configuration - feature toggles/feature flags are extremely useful in code.  This service is great for turning on experimental features, operational features, environment/release features, and security features.  See Feature Toggles (aka Feature Flags) (martinfowler.com).  Use it for feature flags, whereas Key Vault is for secrets.
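The toggle idea can be sketched in a few lines.  In Azure the flags would live in Azure App Configuration and be read through its SDK; the flag name and the per-environment rule below are hypothetical illustrations:

```python
# Tiny feature-flag sketch. In Azure the flags would live in Azure App
# Configuration and be read through its SDK; the flag name and the
# per-environment rule below are hypothetical illustrations.

FLAGS = {
    "experimental-search": {"enabled": True, "environments": {"DV", "TS"}},
}

def is_enabled(flag, environment):
    """A flag is on only if it exists, is enabled, and covers this environment."""
    entry = FLAGS.get(flag)
    return bool(entry and entry["enabled"] and environment in entry["environments"])

print(is_enabled("experimental-search", "DV"))  # True
print(is_enabled("experimental-search", "PR"))  # False
```

Keeping the flag rules in a central store rather than in code is what lets you flip features without a redeploy.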



Azure Resource Explorer - documentation on the Azure APIs and the ability to call them.

Azure Policy - custom Azure Policy templates can be created that apply rules to your subscription, and there are a lot of pre-canned policies.  You can enforce naming conventions, tagging standards, deployment of resources into specific regions, ....

Sunday 11 August 2019

Send email using O365 in an Azure Function C#

Create an Azure Function; the basics for sending an email are below:

// Requires: using System.Net; using System.Net.Mail;
MailMessage msg = new MailMessage();
msg.To.Add(new MailAddress("someone@somedomain.com", "SomeOne"));
msg.From = new MailAddress("you@radimaging.co.uk", "Paul Beck");
msg.Body = "This is a test message using <b>O365</b> Exchange online";
msg.Subject = "Test Mail";
msg.IsBodyHtml = true;
SmtpClient client = new SmtpClient();
client.UseDefaultCredentials = false;
client.Credentials = new System.Net.NetworkCredential("your user name", "your password");  // Generate a key so the pswd does not change or if the account uses MFA
client.Port = 587;
client.Host = "smtp.office365.com";
client.DeliveryMethod = SmtpDeliveryMethod.Network;
client.EnableSsl = true;
client.Send(msg);  // send the message

Tip: Azure SendGrid is a great service for sending out emails.  Amazon has SES (Simple Email Service).

Friday 21 June 2019

O365 and AAD using InTune

Overview: our company has moved away from traditional on-premises networking and we use Azure.  We use AAD, Azure AD Domain Services, Intune and O365, with all laptops and PCs using Windows 10 Pro.  It is so easy and removes so much administration.

Intune: if your users have O365 or E365 licences, Intune is included; with E3 accounts you can add it on for £7.50 per month.  Intune allows me to deploy a setup that historically would have used GPO to manage individual machines, referred to as "Configuration".  I can verify all my users are compliant with my policies, such as running Windows 10 and being patched to a certain level.  Defender works brilliantly through Intune; I've removed our old anti-virus/malware software from end-user devices because Defender managed through Intune is better.  I ensure all our PCs and laptops have BitLocker.  Checking all the devices my users are using is done through Intune using "Compliance".

  • I can wipe any PC or device remotely.
  • With the user logins, I can see activity, and it provides a great end-to-end management solution.
  • I haven't used TeamViewer, as we still use LogMeIn for remote support, but I'd personally lean towards TeamViewer as it's fully integrated with Intune.
  • BYOD devices are also controllable using Intune.
Summary: Intune is easy to use and roll out, and provides good control of end-user devices.

Example Policy for Windows 10 devices:

Health Service settings:
  • Require BitLocker: Required
  • Require Secure Boot to be enabled on the device: Disabled
Device Properties settings:
  • Min OS version: 1809
System Security settings:
  • Require a password to unlock mobile devices: Required
  • Simple passwords: Block
  • Password type: AlphaNumeric
  • Min password length: 8
  • Max time of inactivity before password is required: 10
  • Password expiration (days): 45
  • Number of previous passwords to prevent reuse: 12
  • Require password when device returns from idle state: Required
  • Encryption of data storage on device: Required
  • Device Security - Firewall: Required
  • Device Security - Antivirus: Required
  • Device Security - Antispyware: Required
  • Defender - Windows Defender Antimalware: Required
  • Defender - Min version: 1.295.933.0
  • Defender - Antimalware intelligence up-to-date: Required
  • Defender - Real-time protection: Required
Windows Defender ATP settings:
  • Require the device to be at or under the machine risk score: Medium

Update: 2022-June-20

"BigFix automates discovery, management, and remediation of all endpoints whether on-premises, mobile, virtual, or in the cloud" - product by HCL.  

A competitor is Microsoft Endpoint Manager (MEM).  MEM is useful for patching and monitoring Windows 10/11 devices.  You can set up policies to give different notifications for the end user to install, and can force installation if the end user does not install the patch.