Saturday, 25 April 2026

VS Code with Claude Code or GitHub Copilot

Overview: I have seen people really struggle to understand that Claude is not GitHub Copilot (GHCP), and that GitHub Copilot is not an LLM. 

Terminology

Claude by Anthropic is made up of various parts, and it helps to be specific.  Anthropic is amazing at providing great Large Language Models (LLMs): there are Claude Opus (for programming; the latest is 4.7), Claude Sonnet, and Claude Haiku.  GPT-5.4 is OpenAI's current flagship LLM.  Gemini 3.1 Pro is known for UI-focused coding; I found Gemini 3.0 good, but I don't use it that often.

Anthropic also offers services beyond supplying LLMs, such as Claude Code (its CLI coding agent).  These are the key ones for me:

  • Claude in Chrome — a browsing agent
  • Cowork — a desktop tool for non-developers to automate file and task management; a rival to Microsoft 365 Copilot
  • Claude Code in VS Code — the equivalent of GitHub Copilot for VS Code

    Developing in VS Code:

    In the screenshot below, I am using both Claude Code and GitHub Copilot in the VS Code IDE.

    Here I have some C# code that creates and deploys an Azure Function. I can use either option until I run out of credits with my monthly (GHCP) or hourly (Claude) subscription allowance.  When I go over my GHCP allowance, I have it set up to use my Azure Credits.

    Wednesday, 22 April 2026

    Copilot Studio Custom Agent SharePoint Channel unexpected behaviour

    Overview: I have been using Copilot Studio for production solutions for a few weeks now, and I am impressed.  I pushed my first Copilot Studio custom agent through ALM into QA and Production, where I finished the iteration a week ago.  My boss was looking at the agent in SharePoint and identified a bug with uploading files and using them.

    Anyway, I tested the scenario in other channels and in the Test area without issue.  In the SharePoint channel, a custom Copilot agent can't upload a file to work with out of the box (OOTB).






    Sunday, 15 March 2026

    Chess, I mean IT for beginners

    Medium- and large-sized IT projects often run into trouble at various points.  This analogy has helped me keep projects on track and delivered, so I thought I'd share.

    Firstly, for those not familiar with chess strategy in depth, it generally goes something like this: 

    A chess game breaks down into the Opening, the Middle Game, and the Endgame.

    Chess Cheat Sheet for Beginners

    Opening Principles

    • Control the center: Aim for squares e4, d4, e5, d5.
    • Develop pieces early: Knights and bishops out before moving the same piece twice.
    • Don’t bring your queen out too soon: Avoid early queen moves.
    • Castle early: Protect your king and connect rooks.

    Middle Game Tips

    • Coordinate pieces: Make them work together.
    • Avoid unnecessary pawn moves: Pawns can’t move back.
    • Look for tactics: Pins, forks, skewers, discovered attacks.

    Endgame Basics

    • Activate your king: It becomes powerful in endgames.
    • Push passed pawns: They’re your winning ticket.
    • Rooks behind passed pawns: Classic endgame rule.

    General Rules

    • Every move should have a purpose.
    • Don’t sacrifice without clear compensation.
    • The threat can be more serious than the execution.

    As an analogy, I find this extremely useful for getting all stakeholders working together and understanding how to do so.

    MVP cost balloons when “nice‑to‑haves” silently become “required”.  This doesn't map directly to chess, but it is key: what are we doing in the Minimum Viable Product (MVP), and why?  Think of the MVP more like reconnaissance for the battle: learn, don't try to win it.

    Sunday, 15 February 2026

    Entra ID App Registrations

    Overview: MS Graph and OAuth permissions must be assigned via Entra ID (IdP) App Registrations.

    App registration - the definition of an app (API permissions it exposes, scopes, app roles, redirect URIs, etc.).

    Enterprise Application (Service principal) - the instance of that app in the Entra tenant, created after consent is granted.

    Consent is granted to an Enterprise Application instance/service principal, not to the app registration.

    Steps to Set up MS Graph Access for SharePoint Online (Site Collection Level Access)

    1. Register a new App Reg in Entra ID


    2. Add the MS Graph API Permission: Sites.Selected Delegated

    3. Using PowerShell 7, install the PnP.PowerShell module and connect with Connect-PnPOnline

    PS> Install-Module PnP.PowerShell -Scope CurrentUser

    PS> Connect-PnPOnline -Url https://radimaging.sharepoint.com/sites/Contracts  -ClientID <xxx-xxx> -Interactive

     
    4. Assign the new App Registration to your SharePoint site (you will need to be a Site Collection Admin).

    PS> Grant-PnPAzureADAppSitePermission  -AppId "8f468b7c-9APP-YOU-WANT-TO-GRANT"   -DisplayName "My App"  -Site "https://<tenant>.sharepoint.com/sites/<site>" -Permissions Write

    Tip: You may want "Read" instead of "Write" permissions, or a higher level such as "FullControl".

    5. When you access the Site for the first time, you will be asked to provide consent (the administrator can also consent on behalf of business users).

    6. Verify within portal.azure.com > Entra ID > Manage > Enterprise Applications that the app registration has been consented to (a service principal now exists)

    7. Connect to the site using Postman or any client to verify you have the access you need.
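    For the check in step 7, a SharePoint site can be addressed in Microsoft Graph by hostname and server-relative path (the colon syntax). A small Python helper that builds the request URL — the tenant and site names below are just examples:

```python
# Build the Microsoft Graph URL that addresses a SharePoint site
# by hostname and server-relative path (colon syntax).
def graph_site_url(hostname: str, site_path: str) -> str:
    # e.g. hostname="contoso.sharepoint.com", site_path="sites/Contracts"
    return f"https://graph.microsoft.com/v1.0/sites/{hostname}:/{site_path}"

url = graph_site_url("radimaging.sharepoint.com", "sites/Contracts")
print(url)
```

    Call this URL (with a bearer token from your app registration) in Postman; a 200 with the site object confirms the Sites.Selected grant worked.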







    Thursday, 22 January 2026

    GitHub Copilot (GHCP) for VS Code - Notes

    I’m a big fan of using GitHub Copilot with VS Code. Right now, my preferred LLM is Claude Opus 4.5 — it’s so good.

    Anyway, these are my notes and findings for using GHCP:

    Custom Agents are built for a specific role or working style. You select an agent when you want Copilot to follow a particular set of instructions and use dedicated tools tailored to that job.

    Agent Skills, on the other hand, are reusable capabilities. They bundle instructions, scripts, and resources that Copilot can automatically draw on whenever they’re relevant—no need for you to choose or switch anything manually.

    Tuesday, 20 January 2026

    App Insights for Power Platform - Part 12 - A fix story

    Overview: My system notified me that my production errors were going crazy.  I quickly realised that all the tenants at my largest customer were down.  

    Problem: Apps would start and then stay in a loading state.  I could see the Canvas apps, but they would just sit there, continuing to try to load.  

    Initial Hypothesis: Originally, I thought it was one environment, and since I knew it was loading a SharePoint list, I thought it might be permissions.  But it was happening on all my environments, and my continuous tests were picking them all up.

    I checked the Microsoft services, and all of them were reported as working: https://azure.status.microsoft/en-gb/status

    I went to QA and Dev, and they were also failing with the same issue.  I ran the Canvas app in debug mode and could see the error related to connectors to the European APIM for Dataverse.

    Next, I checked my client's environments in other regions; they were also failing.  I was a bit surprised, as I wasn't getting any feedback from other clients or status feeds, so I logged on to my own company's Power Platform tenant, and its environments were working.  So the issue was specific to this client.  Now I knew the extent: flows would not run, but only on the client's environments.

    Here are the CI test report and results for a subset of the apps on the business unit's production environment.

    Ran a test to check a single production department environment with 24 Canvas apps:

    Simple test: runs on 3 worker processes on a single browser engine (Chromium).

    The report logs show all 8 of these sites are not working.

    All 8 tests failed to find the title of the page.  I log from Playwright and from the Canvas apps using App Insights traces; the error was super easy to pick up, even without the Power Platform trace in the Dev environment.

    Resolution: Raise a ticket telling Microsoft we had the issue, and provide the info (raised by a support engineer, not me).  30 minutes later, the company was advised to close all browsers and try again.  We did this manually, and the issue was resolved.

    I still had the old SPO token in Playwright for Chromium, so I ran the test on all three browser engines.  Chromium failed with the old token; Firefox and WebKit passed, as they grabbed new login tokens.
    Success: The same test using the WebKit browser engine worked, as it had a new SPO bearer token.


    If I find out what caused the issue, I'll post what Microsoft did and found.  


    Useful Dashboards I used:


















    portal.azure.com OOTB App Insights URL to retrieve an Operation ID's history:
    https://portal.azure.com/#blade/HubsExtension/BladeRedirect/bladeName/Microsoft_Azure_LogicAppsRunBlade/
    runId/<OPERATION_ID>/ logicAppName/<LOGIC_APP_NAME>/ resourceGroupId/%2Fsubscriptions%2F<SUBSCRIPTION_ID>
    %2FresourceGroups%2F<RESOURCE_GROUP_NAME>%2Fproviders%2FMicrosoft.Logic%2Fworkflows%2F<LOGIC_APP_NAME>
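    If you build this URL often, it can be assembled in code. A Python sketch that follows the blade template above (the IDs are placeholders; the resourceGroupId segment is the only part that needs percent-encoding):

```python
from urllib.parse import quote

def logic_app_run_url(run_id, logic_app, subscription, resource_group):
    """Assemble the Logic Apps run-history blade URL from the template
    above. The resourceGroupId value must be percent-encoded
    (%2F for each '/')."""
    resource_id = (f"/subscriptions/{subscription}"
                   f"/resourceGroups/{resource_group}"
                   f"/providers/Microsoft.Logic/workflows/{logic_app}")
    return ("https://portal.azure.com/#blade/HubsExtension/BladeRedirect"
            "/bladeName/Microsoft_Azure_LogicAppsRunBlade"
            f"/runId/{run_id}"
            f"/logicAppName/{logic_app}"
            f"/resourceGroupId/{quote(resource_id, safe='')}")

print(logic_app_run_url("<OPERATION_ID>", "my-logic-app",
                        "<SUBSCRIPTION_ID>", "my-rg"))
```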



    Series

    App Insights for Power Platform - Part 1 - Series Overview 

    App Insights for Power Platform - Part 2 - App Insights and Azure Log Analytics 

    App Insights for Power Platform - Part 3 - Canvas App Logging (Instrumentation key)

    App Insights for Power Platform - Part 4 - Model App Logging

    App Insights for Power Platform - Part 5 - Logging for APIM 

    App Insights for Power Platform - Part 6 - Power Automate Logging

    App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards 

    App Insights for Power Platform - Part 8 - Verify logging is going to the correct Log analytics

    App Insights for Power Platform - Part 9 - Power Automate Licencing

    App Insights for Power Platform - Part 10 - Custom Connector enable logging

    App Insights for Power Platform - Part 11 - Custom Connector Behaviour from Canvas Apps Concern 

    App Insights for Power Platform - Part 12 - A fix story (this post)


    Thursday, 15 January 2026

    MS Fabric - Storage underpinning

     Microsoft Fabric gets all its data from OneLake.  I believe all storage, except Real Time Intelligence (RTI), uses OneLake to ensure there is only one copy of the data.


    Left side of the diagram/Data sources:

    Any data source - copy data from any source into a Lakehouse; the Parquet/Delta data is stored in OneLake and exposed via the catalog layer

    Mirroring: Some data sources are mirrored into OneLake Parquet, including Snowflake, PostgreSQL, and Azure SQL 2025. 

    MS Fabric SQL is part of Fabric, and the SQL database is mirrored into OneLake.

    Shortcuts - 3rd party software holds the parquet* data and allows MS Fabric to query the data.   

    Sunday, 11 January 2026

    Working with Snowflake and MS Fabric

    Overview: Snowflake covers a small area of what Fabric does, but Snowflake covers its area unbelievably well.  Large enterprises use these together even though there is some overlap; Snowflake is great at what it does! 

    Five ways to use Snowflake data in Fabric: 

    1. ETL - use Data Factory or an ETL tool to copy data from Snowflake to Fabric's OneLake (a point-in-time copy).  This should be your last option. 

    2. Direct query (no copy) - Fabric compute (Power BI, Notebooks, Dataflows, Pipelines) runs queries directly against Snowflake’s SQL endpoint. Best when you want zero‑copy and Snowflake stays the system of record.

    3. Mirroring (copy + sync) - Fabric mirrors a Snowflake database using CDC into OneLake so Fabric can work locally with governed, accelerated data while staying synced with Snowflake.  Good for small and commonly accessed data. 

    4. Shortcut to Snowflake‑hosted Iceberg (no data copy) - Fabric creates a Shortcut (virtual pointer) to Iceberg tables stored with Snowflake, so Fabric tools read them without moving data.

    5. Snowflake writes Iceberg to OneLake - Like option 3, but Snowflake handles the outbound: Snowflake materializes Iceberg tables into a OneLake location; Fabric then reads them natively (open‑format interop).

    Reference:
    Greg Beaumont's Architecture blog - Fantastic stuff! 

    Saturday, 10 January 2026

    SharePoint AI new features

     SharePoint has added some great AI components when coupled with M365 Copilot.  These have been in GA for a few weeks:

    The Summarize feature is fantastic:

    • It works well in Word and Excel and is okay in PPTX files.
    • It processes images using OCR and AI to interpret them and generate a summary.
    • It works with PDFs/Adobe documents, including text with embedded images and image-based PDFs.  PDF/A too.

    There is no MS Graph API yet for this functionality, so I used Playwright to scrape the summaries and add them to a summary metadata field.  I came across Knowledge Agent, which is in public preview, and it is fantastic.  

    Knowledge Agent (for SharePoint online if you have a M365 Copilot licence): 

    Knowledge Agent in the SPO UI

    If you need it enabled, a SharePoint Administrator must enable it using PowerShell.

    Generate metadata using your existing files and metadata as the example shows below:

    Sunday, 30 November 2025

    SharePoint for BIM

    Overview: Building Information Modelling (BIM) enterprise document management systems are expensive.  I decided to use SharePoint instead, as we already have SharePoint licencing, security is approved, and it works with Entra MFA.  SharePoint offers great APIs and is well-known and easy to manage.  Lastly, I got much better performance from SharePoint than from the two dedicated document management systems I evaluated.

    Concerns:

    • BIM version, numbering, and workflows: No issues seen; merely need a simple workflow
    • Web Viewer for CAD files

    Advantages:

    • Part of the Microsoft ecosystem
    • Free, as we all have E3 licences
    • Users know how to use and manage SharePoint, so no training is required
    • SaaS is running and is part of our business SLA, with 99.9% uptime.  SharePoint has an active-active architecture, built-in backups and data is stored in multiple locations
    • Reduced setup and no external 3rd party requiring contracts and approvals.  Few suppliers have nearly as many compliance assertions, including ISO 27001, SOC 1, SOC 2, SOC 3, and GDPR.
    • Security is already in place via the client's Entra user base with MFA, plus DLP and sensitivity labels.  Great built-in data residency, audit logs and retention policies.  File sync is super helpful when working with large CAD files, especially in remote locations.  All data is encrypted at rest and in transit.
    • SharePoint is widely used in construction projects.  Customers and third parties can securely access SharePoint Online for central collaboration.
    • Mobile-friendly, tool-friendly, management-friendly approach to BIM.
    Document numbering convention:
    <ProjectCode>-<CreatedDivision/PartnerCode>-<DocType>-<Discipline/Category>-<IncrementNo>
    e.g. BLD123-MACE-CAD-ELE-00001
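    The numbering convention is simple to generate and validate in code. A minimal Python sketch — the regex is my reading of the pattern above, so adjust it to your real rules:

```python
import re

# <ProjectCode>-<Division/PartnerCode>-<DocType>-<Discipline>-<IncrementNo>
DOC_CODE = re.compile(r"^[A-Z0-9]+-[A-Z0-9]+-[A-Z]+-[A-Z]+-\d{5}$")

def build_doc_code(project, partner, doc_type, discipline, increment):
    # Zero-pad the increment to five digits, per the convention
    return f"{project}-{partner}-{doc_type}-{discipline}-{increment:05d}"

code = build_doc_code("BLD123", "MACE", "CAD", "ELE", 1)
print(code)                        # BLD123-MACE-CAD-ELE-00001
print(bool(DOC_CODE.match(code)))  # True
```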

    HLD Architecture Designs for Finding SharePoint File data

     SharePoint data source using Microsoft Foundry to build an Agent



    Sunday, 2 November 2025

    Edge Computing Notes (IoT)

    WIP

    Edge Computing is where computing is done close to IoT devices and machines/PLCs; basically, it happens on the "edge" of your network.  The processing occurs on local devices, gateways, or edge servers near IoT sensors, cameras, or machines.

  • Low Latency: Ideal for applications like autonomous vehicles, industrial automation, and AR/VR.
  • Bandwidth Efficiency: Only relevant data is sent to the cloud, reducing costs.
  • Reliability: Systems can continue functioning even with intermittent connectivity.

    Over the past few weeks, I ordered a Raspberry Pi, which I intend to use for processing data from remote IoT sensors, namely cameras, LiDAR, and temperature.

    MQ Telemetry Transport (MQTT) is a lightweight messaging protocol that allows remote devices to communicate reliably in real time using a publish-subscribe model, sending small packets.  MQTT 3.1.1 is standardised as ISO/IEC 20922; MQTT 5.0, approved in 2019, is the latest edition.
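    In MQTT's pub-sub model, messages are routed by topic, and subscription filters can use the + (one level) and # (remainder) wildcards. A simplified Python sketch of the matching rules — not a full implementation of the spec:

```python
def mqtt_topic_matches(filter_, topic):
    """Check an MQTT topic against a subscription filter.
    '+' matches exactly one level, '#' matches the remainder
    (a simplified sketch of the MQTT matching rules)."""
    f_parts, t_parts = filter_.split("/"), topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":          # multi-level wildcard: matches the rest
            return True
        if i >= len(t_parts): # filter is longer than the topic
            return False
        if f != "+" and f != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)

print(mqtt_topic_matches("site/+/temperature", "site/pi1/temperature"))  # True
print(mqtt_topic_matches("site/#", "site/pi1/camera/frame"))             # True
```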

    Node-RED provides a web browser-based flow editor, which can be used to create JavaScript functions (Wikipedia)

    Azure IoT Edge is a cloud-to-edge computing platform that extends Azure services and custom logic to IoT devices. It allows you to deploy containerised workloads (such as AI models, analytics, and business logic) directly on edge devices, enabling local data processing and reducing reliance on continuous cloud connectivity. This improves latency, bandwidth efficiency, and resilience in offline scenarios.

    IoT Edge Modules are Docker-compatible containers running Azure services, third-party apps, or custom code. Examples: Azure Stream Analytics, Azure Machine Learning models. [learn.microsoft.com]

    IoT Edge Runtime must be installed on each edge device. Handles module deployment, security, and communication between modules, devices, and the cloud.

    Includes:

    IoT Edge Agent: Deploys and monitors modules.

    IoT Edge Hub: Manages messaging between modules and the cloud. [learn.microsoft.com]


    Azure IoT Hub is a cloud service for managing IoT Edge devices, configurations, and monitoring.


    A Raspberry Pi has an OS, whereas an Arduino device has no OS:

    • Arduino is an Application + Microcontroller (MCU) - Compile code in C++ or C
    • Pi is Application + Library & System Functions (OS) + MCU



    Why Use Ada on MCUs?

    • High reliability (ideal for avionics, medical devices).
    • Built-in concurrency and timing features.
    • Safer than C/C++ in many cases due to strong typing and runtime checks.

    • If you need maximum safety and reliability, Ada is superior.
    • If you need performance, simplicity, and broad support, C dominates.

    Ada is a strongly typed, multi-paradigm programming language originally developed for the U.S. Department of Defense to ensure reliability in mission-critical systems.

    Setup:

    • Install GNAT (the GNU Ada Toolchain)
    • Use GNAT Studio or VS Code with Ada extensions
    • Run gnat --version in your terminal to verify the install
    • Write the code:
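    A minimal hello_mobility.adb to go with the compile command (a sketch, just enough to prove the toolchain works):

```ada
with Ada.Text_IO;

procedure Hello_Mobility is
begin
   --  Prove the GNAT toolchain is installed and working
   Ada.Text_IO.Put_Line ("Hello from Ada!");
end Hello_Mobility;
```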

    Compile & Run> gnatmake hello_mobility.adb

    Friday, 31 October 2025

    Playwright Agents in VS Code

    I started looking at the latest version of Playwright late last night. The Agents add-in for VS Code is amazing.  I can't stop improving my code, my tests, and my automation.  It is highly addictive.

    Playwright 1.56.1 includes the new Playwright CLI, which has the test agents as shown in VS Code above:

    Node: v22.21.0
    npx: 11.6.2
    Playwright test package: ^1.56.1

    Sunday, 12 October 2025

    Federated and Event Driven Architecture

    Event Driven Architecture (EDA): System components interact with each other by producing, detecting and reacting to events.

    Event-driven APIs differ from conventional REST APIs, offering improved scalability, strong service decoupling, reduced network traffic, and greater flexibility.  Event-driven APIs need to address the challenges of monitoring, distributed tracing, security, and versioning.

    Distributed, event-driven architectures are a powerful approach to building decoupled, long-running, high-performance, scalable systems, and they can use Conflict-Free Replicated Data Types (CRDTs) to provide eventual consistency.


    An Event Mesh is a dynamic, interconnected network of event brokers that allows events to flow seamlessly across distributed applications, regardless of where they are deployed—on-premises, in the cloud, or at the edge.

    Federated architecture allows each system to flexibly interact with other systems while remaining independent, so it can be easily extended, and individual pieces (a system) can be replaced relatively quickly.

    Thoughts: Like cross-cutting concerns, where you build IT runway to perform specific tasks and then call them, in a Federated Architecture each system does one job.  For instance, a standalone system, which can be replaced or extended, handles requesting a room (1st system); this lets the user reserve the room via the booking system (2nd system); this in turn calls the communication system that handles email and Teams meetings with reminders (3rd system), and so on (n services/systems).

    Events are facts; they are loosely coupled to the booking system.  This approach allows for the reuse and easy creation of highly customised business processes.
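    The room-booking flow above can be sketched with a tiny in-memory event bus. Event names like RoomRequested are illustrative, not from any product:

```python
from collections import defaultdict

class EventBus:
    """Tiny in-memory pub-sub sketch: producers publish facts,
    loosely coupled subscribers react."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.subscribers[event_type]:
            handler(payload)

bus = EventBus()
log = []
# Booking system reacts to room requests; comms system reacts to bookings.
bus.subscribe("RoomRequested", lambda e: (log.append(f"booked {e['room']}"),
                                          bus.publish("RoomBooked", e)))
bus.subscribe("RoomBooked", lambda e: log.append(f"invite sent for {e['room']}"))
bus.publish("RoomRequested", {"room": "Board Room"})
print(log)  # ['booked Board Room', 'invite sent for Board Room']
```

    Neither subscriber knows about the other; swapping the comms system for another just means subscribing a different handler to the same event, which is the federated point.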

    Thought: Choosing between an Event Mesh and a Federated Architecture...

    Thursday, 9 October 2025

    Medallion Architecture in Fabric - a High-Level Organisation Design Pattern

    Microsoft Fabric is excellent!  We still need to follow the good practices we have been using for years, such as making data accessible and secure.  Possibly the most used architecture for big data is the Medallion Architecture pattern, where data is ingested, normally in a fairly raw format, into the bronze layer, then transformed into more meaningful and usable information in the silver layer.  Lastly, the gold layer exposes data relationally, via semantic models, to reporting tools.

    Overview: This document outlines my attempt to organise enterprise data into MS Fabric using a Medallion Architecture based on Fabric Workspaces.  Shortcuts are better than imported data, but it does depend on factors such as what the data source is, what data we need, how up-to-date the data is and performance requirements from the systems involved.

    The reports and semantic models can get data from other workspaces at any of the medallion layers.  This architecture lends itself well to using the new Direct Lake Query mode.
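    The bronze → silver → gold flow can be pictured as plain transformations. A toy Python sketch (the data and field names are made up for illustration; in Fabric this would be notebooks or dataflows over Lakehouse tables):

```python
# Medallion layers as plain transformations (illustrative sketch):
# bronze = raw ingest, silver = cleaned/conformed, gold = aggregated.
bronze = [
    {"sensor": "t1", "reading": " 21.5 "},
    {"sensor": "t1", "reading": "22.1"},
    {"sensor": None, "reading": "99"},  # bad row, dropped in silver
]

# Silver: drop invalid rows, cast types
silver = [
    {"sensor": r["sensor"], "reading": float(r["reading"])}
    for r in bronze if r["sensor"] is not None
]

# Gold: aggregate per sensor for reporting
gold = {}
for r in silver:
    gold.setdefault(r["sensor"], []).append(r["reading"])
gold = {k: round(sum(v) / len(v), 2) for k, v in gold.items()}
print(gold)  # {'t1': 21.8}
```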

    Summary of a Design used by a Large Enterprise:

    Medallion architecture using Fabric Workspaces.

    Friday, 26 September 2025

    Microsoft Fabric High-level architecture

    Overview: Microsoft Fabric is an end-to-end analytics platform that unifies data movement, storage, processing, and visualisation. It integrates multiple services into a single SaaS experience, enabling organisations to manage their entire data lifecycle in one place.  OneLake is at the core of MS Fabric.

    Image 1. One page High-Level Architecture of MS Fabric. 

    European Fabric Conference in Vienna, Sept 2025 - takeaways

    FabConEurope25 in Vienna last week was terrific.  It was a great opportunity to meet Fabric and data experts and speak to the product teams, and the presentations were fantastic.  The hardest part was deciding which session to attend, as there were so many competing at the same time.  

    My big takeaways:
    • Fabric SQL is excellent.  The HA, managed service, redundancy, and shipping logs ensure that OneLake is in near real-time.  Fabric SQL supports new native geospatial types.  SQL has temporal tables (old news), but row, column and object-level (incl. table) security is part of OneLake.   There are a couple of things security reviewers will query, but they are addressed.
    • Fabric Data Agent is interesting.  Connect to your SQL relational data and work with it.
    • User-defined functions (UDF), including Translytical (write-back), HTTP in or out, wrap stored procedures, notebooks,.... - amazing.
    • OneLake security is complex but can be understood, especially with containers/layers, such as Tenant, Workspace, Item, and Data.  There is more needed, but it's miles ahead of anything else, and Graph is the magic, so it will only continue to improve. - amazing, but understand security.  Embrace Entra and OAuth; use keys only as a last resort.
    • Snowflake is our friend.  Parquet is fantastic, and Snowflake, including Iceberg, play well together with MS Fabric.  There are new versions of Delta Parquet on the way (and this will even make Fabric stronger, supporting both existing and the latest formats).
    • Mirroring and shortcuts - don't ETL unless you need to shortcut, then mirror, then ETL.
    • Use workspaces to build out simple medallion architectures.
    • AI Search/Vector Search and SQL are crazy powerful.
    • New Map functionality has arrived, and more is arriving, on Fabric.  Org Apps for Maps is going to be helpful in the map space.  PMTiles are native... (if you know, you know)
    • Dataverse is great with Fabric and shortcuts, as I learned from Scott Sewell at an earlier conference.  OneLake coupled with Dataverse is massively underutilised by most orgs.
    • Power BI also features new Mapping and reporting capabilities related to geospatial data.
    • Other storage: Cosmos DB (it has its place, and suddenly, with shortcuts, its biggest issue of cost can be massively reduced with the right design decisions).  Postgres is becoming a 1st-class citizen, which is excellent on multiple levels.  The CDC stuff is fantastic already.
    • RTI on Fabric is going to revolutionise Open Telemetry and AI, networking through the OSI model, application testing, digital twins, and live monitoring,....  I already knew this, but it keeps getting better.  EventHub and notebooks are my new best friends.  IoT is the future; we all knew this, but now with Fabric, it will be much easier to implement safely and get early value.
    • Direct Lake is a game changer for Power BI - not new, but it just keeps getting better and better thanks to MS Graph.
    • Managed Private Endpoints have improved and should be part of every company's governance.
    • Purview... It's excellent and solves/simplifies DLP, governance and permissions.  I'm out of my depth on Fabric Purview and governance, and I know way more than most people on DLP and governance. Hire one of those key folks from Microsoft here.  
    • Warehouse lineage of data is so helpful.  
    • We need to understand Fabric Digital Twins, as it is likely to be a competitor or a solution we offer and integrate. 
    • Parquet is brilliant and fundamentally is why AI is so successful.
    • Powerful stuff in RDF for modelling domains - this is going to be a business in itself.  I'm clueless here, but I won't be in a few weeks.
    Now the arghs...
    • Pricing and capacity are not transparent.  Watch out for the unexpected monster bill!  That said, the monitoring and controls are in place, but switching off my tenant doesn't sit well with me if workloads aren't correctly set out.  Resource governance at the workspace level will help fix the situation, or you can design around it, but it will be more expensive.
    • Workspace resource reservation does not exist yet; however, it can be managed using multiple Fabric tenants.  Cost control will be significantly improved once workspace resource management arrives.
    • Licensing needs proper thought for an enterprise, including ours.  Reserved Fabric capacity is about 40% cheaper and cannot be suspended, so use reserved capacity just as you would for most Azure services.  Good design results in much lower workload costs.  Once again, those who genuinely understand know my pain with the workload costs.
    • Vendors and partners are too far behind (probably due to the pace of innovation)
    Microsoft Fabric is brilliant; it is all under one simple managed autoscaling umbrella.  It integrates and plays nicely with other solutions, has excellent access to Microsoft storage, and is compatible with most of the others.  Many companies will move onto Fabric or increase their usage in the short term, as it is clearly the leader in multiple Gartner segments, all under one hood.  AI will continue to help drive its adoption by enterprises.

    Saturday, 13 September 2025

    Railway Infrastructure - Signalling

    Understanding Railways Series:

    UK Railway Industry for Dummies, focusing on Rail Infrastructure

    Railway Infrastructure - Signalling (this post)

      Signalling for Railways 

    Historically, trains were operated by a driver who knew a lot about the route, the conditions, and even the train itself.  Speed, signalling, and braking distance were managed by the driver, who had to juggle the train, the driving, and the route information affecting them.  Signalling was used to help the driver determine whether the line ahead was clear or occupied and when to slow down, and to provide reminders, thereby allowing the driver to control the train safely by managing multiple factors.

    Advancements in technology have built on existing infrastructure and approaches, enabling more trains, safer travel, and better information.  For example, wheel counters allowed signalling to ensure that a train and all its wheels have cleared a section.

    Telecommunications allow a central control centre (ROC) to communicate with the drivers.  GSM-R...

    The train should run regardless of temporary telecommunications issues or unavailability (assuming it is safe).  The onboard computer on every train can provide the driver with information or act on it, such as automatically stopping the train if a red light is detected while the train is on the section of track the driver is entering.  

    ETCS (the European Train Control System) is modern signalling; the fundamental principle is that all trains, routes and tracks can be managed centrally and safely (see SIL 4).  Everything on the railway has to be safe.  Super safe: if unsure, stop is the principle.

    Common Equipment

    Balise

    A balise is a device mounted on the track between the two rails that sends messages to a train as it passes overhead, primarily conveying the train's position.








    Sunday, 17 August 2025

    GIS Notes

    What is GIS?

    GIS stands for Geographic Information Systems, which are tools and techniques for capturing, managing, storing, processing, and analysing spatial data. It is part of the broader geospatial technology ecosystem, which also includes drones, remote sensing, and GPS.

    Geospatial data (Raw)

    Definition: Any data that includes a geographic component, describing the location and attributes of features on Earth, contains raw information, like points, lines, and polygons, that has a real-world location associated with it.
    Examples: a car's GPS location or a customer's address.

    GIS data (Organised)

    Definition: Geospatial data that is structured, stored, and analysed using Geographic Information System software.
    Examples include a digital map of roads created from GPS data or layers of data showing flood-risk areas.

    Summary: Geospatial data is the foundation: the raw material for all things spatial. GIS is a toolset that may include tools like ArcGIS from Esri.

    Other:
    In the AEC space, building and asset management rely heavily on GIS within BIM.
    ArcGIS is the industry leader in GIS tooling, and comes in three versions: 
    • Desktop (ArcGIS Pro, ArcToolbox, ArcCatalog),
    • Server, 
    • SaaS ArcGIS Online (AGOL).
     GIS data comes in various formats such as Shapefiles, GeoJSON, KML, or geodatabase feature layers.

    What do WGS84 and GeoJSON mean?  

    These are the most common formats for storing position (WGS84) and shape data with coordinates (GeoJSON). 

    WGS84 (World Geodetic System 1984) is the global standard geographic coordinate reference system. It represents positions on Earth using latitude and longitude in decimal degrees.

    GeoJSON is a widely used format for encoding geographic data structures in JSON. According to RFC 7946, all GeoJSON coordinates must use WGS84 (EPSG:4326).
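    A point worth internalising: GeoJSON orders coordinates as [longitude, latitude], the reverse of how people usually say "lat/long". A minimal Python sketch building a WGS84 point Feature:

```python
import json

def point_feature(lon, lat, **properties):
    """Build a GeoJSON Point Feature.
    Note the [longitude, latitude] order required by RFC 7946."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": properties,
    }

f = point_feature(-0.1276, 51.5072, name="London")
print(json.dumps(f))
```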

    The OSGB/OSGB36 coordinate system, also known as the Ordnance Survey National Grid or British National Grid (BNG), is a geographic grid reference used in Great Britain.  

    The European Terrestrial Reference System 1989 (ETRS89).

    Standards and formats Notes

  • ISO 19115: International standard for geographic information metadata.
  • Shapefiles: Upload as a ZIP archive containing .shp, .shx, and .prj files.
  • File Geodatabases: Must be compressed into a ZIP file before uploading.
  • WMS/WMTS Services: Add Web Map Service (WMS) or Web Map Tile Service (WMTS) layers by providing the service URL.
  • Prefix Conventions

    • Points: Use prefix pt_
    • Polygons: Use prefix py_
    • Lines: Use prefix ln_
    • Raster: Use prefix rs_
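    The prefix conventions above are easy to enforce in code. A tiny Python sketch (my own helper, not part of any GIS tool):

```python
# Team layer-naming convention: geometry type -> name prefix
PREFIXES = {"point": "pt_", "polygon": "py_", "line": "ln_", "raster": "rs_"}

def layer_name(geometry_type: str, name: str) -> str:
    """Apply the prefix convention to a layer name."""
    try:
        return PREFIXES[geometry_type.lower()] + name
    except KeyError:
        raise ValueError(f"unknown geometry type: {geometry_type}")

print(layer_name("Polygon", "flood_risk_zones"))  # py_flood_risk_zones
```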

    ESRI Competitors

    Esri has ArcGIS, which is the gold standard in GIS.  It's pretty expensive if you don't need the feature set offered by AGOL/Esri.  
    Google offers easy-to-set-up-and-use mapping, geo services and APIs (see Google Maps Platform pricing).  I'd usually choose this when hardcore analytics are not needed.

    Common sources of data to mash onto maps:

    DEFRA,
    Environment Agency, 
    Natural England,
    Esri Living Atlas,