From checklist to real capability

Operator supervising HMI systems on the plant floor.

How MACIN turns industrial cybersecurity into a strategic asset (with two practical cases)

In industrial cybersecurity (OT/ICS), there is a very common trap: confusing “having controls” with “having capability”. Many organizations might “pass” a checklist because a procedure, a SOC, a ticketing tool, or a patching policy exists. But when a real incident arrives—with pressure for continuity, legacy systems, connected third parties, and limited maintenance windows—the question that truly matters arises:

Does this work consistently on the plant floor… or does it only exist on paper?

That is where MACIN adds value: it helps you move from a binary verification (“yes/no”) to a maturity measurement that reflects whether the organization has a real, repeatable, and improvable capability. And that, for Management, has an immediate effect: prioritizing investments, justifying decisions, and demonstrating progress with evidence, not perceptions.
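To make that difference tangible, here is a minimal sketch in Python. It is purely illustrative: the 0–5 scale, the aggregation rule (the weakest practice caps the domain), and the example values are assumptions, not MACIN’s actual scoring model.

```python
from dataclasses import dataclass

# Illustrative scale only; MACIN's real levels and aggregation may differ.
MATURITY_LEVELS = {0: "non-existent", 1: "initial", 2: "repeatable",
                   3: "defined", 4: "measured", 5: "optimized"}

@dataclass
class Practice:
    practice_id: str   # e.g. "8.2.1"; identifiers used here only for illustration
    exists: bool       # what a checklist records: yes/no
    maturity: int      # what a maturity review records: 0-5

def checklist_view(practices):
    """Binary verdict: 'implemented' as soon as something exists."""
    return all(p.exists for p in practices)

def maturity_view(practices):
    """Capability verdict: the weakest practice caps the domain."""
    return min(p.maturity for p in practices)

practices = [
    Practice("8.1.1", exists=True, maturity=3),  # triage procedure defined
    Practice("8.2.1", exists=True, maturity=1),  # escalation exists only on paper
    Practice("8.3.1", exists=True, maturity=2),  # response roles partially trained
]

print("Checklist says implemented:", checklist_view(practices))       # True
print("Domain maturity:", MATURITY_LEVELS[maturity_view(practices)])  # "initial"
```

With these invented values the checklist verdict is “implemented”, while the weakest practice leaves the domain at an “initial” level: exactly the paper-versus-operation gap described above.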

Why does the checklist fall short in OT?

A checklist is useful to start, but it usually fails on three key points:

  1. It measures existence, not efficacy. “There is a procedure” does not mean it is executed well under pressure, nor that it is adapted to OT.
  2. It does not capture interdependencies. In industrial environments, capability depends on processes + people + technology + results. If one leg fails, the control “doesn’t hold”.
  3. It does not prioritize by operational impact. Everything seems urgent; without maturity, it is easy to invest where it is most “visible”, not where risk is reduced the most.

MACIN has been built by professionals with extensive experience and fieldwork so that it adds value and becomes a reporting lever for Management.

MACIN is not just another report; it is an instrument to govern an OT program. It allows you to:

  • Obtain a clear baseline (by domains and practices mapped against reference frameworks).
  • Identify “paper vs. operation” gaps.
  • Build a prioritized roadmap (what to do first and why); a minimal sketch of this prioritization follows the list.
  • Align OT, IT, and management with a common language.
  • Repeat the measurement to demonstrate continuous improvement.
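The sketch below assumes a 0–5 maturity scale and hypothetical domains, targets, and impact weights; none of these values come from MACIN, they only show how a gap-times-impact ordering turns a baseline into a roadmap.

```python
# Hypothetical baseline: current vs. target maturity per domain (0-5 scale),
# weighted by operational impact. Values are invented for illustration.
baseline = {
    # domain: (current, target, operational_impact 1-5)
    "Vulnerability management (D3)": (1, 4, 5),
    "Incident management (D8)":      (2, 4, 5),
    "Awareness and training":        (3, 3, 2),
}

def prioritized_roadmap(baseline):
    """Order domains by (gap x operational impact), largest first."""
    scored = [(name, (target - current) * impact)
              for name, (current, target, impact) in baseline.items()]
    return sorted((item for item in scored if item[1] > 0),
                  key=lambda item: item[1], reverse=True)

for domain, score in prioritized_roadmap(baseline):
    print(f"{domain}: priority score {score}")
```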

Put in Management terms: it reduces uncertainty, improves decision-making, and helps protect continuity, safety, and production with traceable actions.

And since the best way to prove this is with examples or use cases, let us illustrate its use in two contexts:

Case 1: OT Incident Management

What a checklist “sees” vs. what maturity “sees” (MACIN – Domain 8)

Realistic scenario: A vendor attempts to connect outside the maintenance window to an engineering station. The SOC detects anomalous activity. Operations asks for caution to avoid affecting production.

Checklist Vision (Binary):

  1. ✅ “Incident response procedure exists”
  2. ✅ “There is a SOC/CSIRT and escalation”
  3. ✅ “Incidents are registered in tickets”
  4. ✅ “A simulation was conducted”

Checklist conclusion: “Incident management implemented”.

Analysis under MACIN maturity review logic (Real Capability):

MACIN drills down into the capabilities that must work end-to-end (a minimal triage sketch follows this list):

  • Detection and triage with OT logic: classification, severity, and interpretation with context (8.1.1 – 8.1.3).
  • Decision and escalation without improvisation: when an event becomes an incident, to whom it is escalated, and how the cycle is operated (8.2.1 – 8.2.5).
  • Executable response in the plant: trained roles, OT playbooks, activities by phase, and continuity criteria (8.3.1 – 8.3.4).
  • Communication and coordination: stakeholders, notification, and coordination with third parties while preserving evidence (8.4.1 – 8.4.3).
  • Lessons learned: proportional analysis, improvements that are closed, and exercises that validate efficacy (8.5.1 – 8.5.3).
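As a purely illustrative companion to this list, the sketch below shows the kind of triage and escalation logic that has to be agreed in advance rather than improvised. The severity thresholds, role names, and event fields are assumptions, not the 8.1.x/8.2.x criteria themselves.

```python
from dataclasses import dataclass

@dataclass
class OtEvent:
    description: str
    affects_safety: bool      # potential impact on people or safety systems
    affects_production: bool  # potential impact on the process or continuity
    asset_criticality: int    # 1 (low) .. 5 (high), hypothetical scale

def triage(event: OtEvent) -> str:
    """Toy severity classification with OT context."""
    if event.affects_safety:
        return "critical"
    if event.affects_production or event.asset_criticality >= 4:
        return "high"
    return "moderate"

def escalate(severity: str) -> str:
    """Toy escalation rule: who decides, agreed in advance."""
    return {"critical": "crisis committee (OT + safety + management)",
            "high": "OT incident manager + plant operations",
            "moderate": "SOC analyst, monitored"}[severity]

event = OtEvent("Vendor connection attempt outside the maintenance window "
                "on an engineering station",
                affects_safety=False, affects_production=True, asset_criticality=5)
severity = triage(event)
print(severity, "->", escalate(severity))  # high -> OT incident manager + plant operations
```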

Key difference: the checklist demonstrates that “a plan exists”; maturity evidences whether that plan can be executed and whether it improves with every cycle.

Case 2: Vulnerability Management in OT

From “we have a patching policy” to “we know what to treat, when, and how” (MACIN – Domain 3)

Realistic scenario: A critical vulnerability affecting HMI/engineering software is published. In OT, aggressive scanning is not possible, the patch requires a shutdown, and the vendor takes time to validate. Meanwhile, there is risk.

Checklist Vision (Binary):

  • “We have a patching policy”
  • “We perform periodic scans”
  • “We have a vulnerability register”
  • “We apply patches when possible”

Checklist conclusion: “Vulnerability management implemented”.

Analysis under MACIN maturity review logic (Real Capability):

MACIN turns that situation into a set of practices that, if mature, prevent each new CVE from turning into “chaos” (a minimal prioritization sketch follows this list):
1) Govern the process (not just react)
  • Define objectives, criteria, resources; document/approve the process; communicate and review needs (3.1.1 – 3.1.4). Key question: Do we have clear rules for OT (what can be scanned, what cannot, and how is it decided)?
2) Identify, evaluate, and prioritize with OT context
  • Information sources, continuous collection, safe manual/automated tests, interpretation, evaluations, and prioritization/communication (3.2.1 – 3.2.6). Key question: Do we prioritize by “high CVSS” or by real impact on the process (exposure, asset criticality, available compensations, maintenance window)?
3) Treat the vulnerability without disrupting operations
  • Analyze options and operational impact, treatment plans, execution of mitigations, exception logging, and residual risk communication (3.3.1 – 3.3.5). Maturity example: if patching is not immediately possible, a treatment is defined: temporary segmentation, strict remote access control, hardening, specific rules, monitoring, and a formal exception with an expiration date and a plan.
4) Measure effectiveness and improve
  • Monitor compliance/effectiveness, communicate status, and establish improvement plans (3.4.1 – 3.4.3). Business result: fewer repeated “scares,” less invisible technical debt, and traceable decisions (why it is accepted, mitigated, or patched).
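Here is the prioritization sketch announced above. The scoring formula, field names, and the exception record are invented for illustration and are not a literal rendering of the 3.2.x/3.3.x practices; they only show how OT context can outweigh a raw CVSS number, and how a formal exception carries an expiration date and an owner of the residual risk.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class OtVulnerability:
    cve_id: str
    cvss: float
    exposed: bool            # reachable from less trusted zones?
    asset_criticality: int   # 1-5: impact on the process if the asset fails
    compensations: int       # effective compensating controls already in place
    patch_window_days: int   # days until the next feasible maintenance window

def ot_priority(v: OtVulnerability) -> float:
    """Toy prioritization: CVSS is only one input; exposure, criticality,
    compensations and the real patch window weigh the decision."""
    score = v.cvss * (2.0 if v.exposed else 1.0) * (v.asset_criticality / 5)
    score -= 1.5 * v.compensations               # mitigations lower the urgency
    score += min(v.patch_window_days / 30, 2.0)  # the longer unpatched, the higher
    return round(max(score, 0.0), 1)

@dataclass
class TreatmentException:
    cve_id: str
    mitigation: str
    expires: date            # a formal exception always has an expiration date
    residual_risk_owner: str # who accepts and communicates the residual risk

vuln = OtVulnerability("CVE-0000-00000", cvss=9.8, exposed=True,
                       asset_criticality=5, compensations=2, patch_window_days=60)
print("OT priority:", ot_priority(vuln))  # 18.6 with these illustrative values

exception = TreatmentException(vuln.cve_id,
                               mitigation="temporary segmentation + strict remote "
                                          "access control + monitoring",
                               expires=date(2026, 6, 30),
                               residual_risk_owner="plant manager")
print("Exception expires:", exception.expires)
```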

Key difference: the checklist tells you “there is a policy”; maturity tells you whether a sustained capability exists to manage OT vulnerabilities with operational realism, documented decisions, and measurable results.

These two cases serve to illustrate the different views provided by the checklist-based review and the maturity evaluation. In this new MACIN update, maturity criteria have also been established for four essential pillars that must be coherent and consistent:

1) Processes: converting intention into repeatable operation.

What it brings: the Processes axis tells you whether the organization has a standardized and governed way of doing things (define, execute, review, improve). In OT this is critical because, if the process is weak, everything becomes “ad hoc” when urgencies arise (shutdowns, incidents, changes, vendor support).

What it “unmasks” versus the checklist: the fact that “a procedure exists” does not imply there are clear criteria, phases, roles, escalation, review, and improvement. A mature process is noticeable because it reduces improvisation and accelerates decisions without breaking continuity.

2) People: moving from heroes to institutional capabilities.

What it brings: the People axis measures whether there are defined roles, competencies, training, and enough drills to execute the process (not just understand it). In industrial cybersecurity, the usual gap is not a “lack of people,” but a lack of cross-training (SOC–OT–operations–maintenance) and dependence on key individuals.

What it “unmasks” versus the checklist: an org chart is not equivalent to a capability. Maturity in people is seen when the organization responds well even if “person X is not there” and when training is conducted for real OT scenarios.

3) Technologies/Tools: having useful, integrated, and maintained means.

What it brings: the Technologies axis focuses on whether tools exist, are well configured, integrated, and kept alive (use cases, rules, coverage, automation). In OT, it is common to find “legacy” or “underutilized” tools, or IT solutions used without adapting them to OT constraints.

What it “unmasks” versus the checklist: buying a tool does not create capability without tuning, integration, operating procedures, and a deployment strategy by criticality. A minimal coverage check is sketched below.
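This is a toy illustration of what “kept alive” and “deployment by criticality” can mean in practice; the inventory fields and the criticality threshold are assumptions, not a MACIN requirement.

```python
# Toy coverage check: a tool only creates capability if it actually covers the
# assets that matter. Asset inventory fields are hypothetical.
assets = [
    {"name": "HMI-01",  "criticality": 5, "monitored": True},
    {"name": "ENG-01",  "criticality": 5, "monitored": False},
    {"name": "PLC-12",  "criticality": 4, "monitored": True},
    {"name": "HIST-01", "criticality": 2, "monitored": False},
]

def coverage(assets, min_criticality=4):
    """Share of critical assets actually covered by monitoring."""
    critical = [a for a in assets if a["criticality"] >= min_criticality]
    covered = [a for a in critical if a["monitored"]]
    return len(covered) / len(critical) if critical else 1.0

print(f"Monitoring coverage of critical assets: {coverage(assets):.0%}")  # 67%
```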

4) Results: demonstrating effectiveness (and being able to improve).

What it brings: the Results axis is the one that breaks most sharply with the checklist: it forces you to look at what is achieved and how it is demonstrated (metrics, traceability, coverage, lessons learned, reduction of repetition). It closes the loop: without results, there is no real feedback or evidence-based prioritization.

What it “unmasks” versus the checklist: you can “comply” and still not know whether you detect on time, whether your response reduces impact, or whether you repeat incidents due to the same causes.
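As a hedged illustration of what result metrics can look like, the sketch below computes two simple indicators (mean time to detect and repetition of root causes) from hypothetical incident records; the field names and values are invented.

```python
from datetime import datetime
from statistics import mean

# Hypothetical incident records; fields are invented for illustration.
incidents = [
    {"detected": datetime(2025, 3, 1, 8, 0),   "occurred": datetime(2025, 3, 1, 6, 0),   "root_cause": "remote access"},
    {"detected": datetime(2025, 5, 10, 14, 0), "occurred": datetime(2025, 5, 10, 13, 0), "root_cause": "remote access"},
    {"detected": datetime(2025, 7, 2, 9, 30),  "occurred": datetime(2025, 7, 2, 9, 0),   "root_cause": "USB media"},
]

# Mean time to detect, in hours: are we detecting on time?
mttd_hours = mean((i["detected"] - i["occurred"]).total_seconds() / 3600
                  for i in incidents)

# Repetition rate: are lessons learned being closed, or do the same causes return?
causes = [i["root_cause"] for i in incidents]
repeated = sum(1 for c in set(causes) if causes.count(c) > 1) / len(set(causes))

print(f"MTTD: {mttd_hours:.1f} h | causes that repeat: {repeated:.0%}")
```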

Therefore, as the examples and the analysis of the pillars on which MACIN is based have shown, we can draw the following conclusions:

  • MACIN helps you know where you truly are (not where you think you are).
  • It helps you decide what to do first (prioritization with operational sense).
  • It helps you demonstrate progress (evidence and continuous improvement).
  • It helps you align teams (OT/IT/management, with a common framework).

 

Javier Cao Avellaneda

Coordinator of the Industrial Cybersecurity Centre

 – –

Further information about the MACIN Platform can be found here.