Posted by Fragile to Agile

ADAPT's recent Community Intelligence Report, The Security Divide: Why Australia’s Security Leaders Are Fighting Yesterday’s War, written by Pooja Singh, is a timely and important read for Australian technology, security and business leaders.

Fragile to Agile was pleased to support the development of the paper through the contributions of our CEO, Archie Reed, alongside other experienced community experts including William MacMillan and David Gee.

The report’s central message is direct:

Australian organisations are spending more on cyber security, but many are still not building the resilience they need.

That should concern boards, executives and technology leaders because the problem is no longer simply a cyber security problem.

It is an operating model, governance, architecture and leadership problem.

Cyber resilience is no longer proven by what an organisation has bought, documented or reported. It is proven by what still works when systems, people, suppliers and AI-enabled processes are under pressure.

The report highlights a familiar but uncomfortable pattern. Organisations are investing in security platforms, compliance activity and AI initiatives, while still carrying unresolved weaknesses in basic controls such as multi-factor authentication, patching, backups, privileged access management and Essential Eight maturity.

That is the security divide:

The gap between the security posture organisations believe they have, and the resilience they can actually demonstrate when it matters.

More spend is not the same as resilience!

One of the most important observations in the report is that increasing security spend does not automatically reduce enterprise risk.

Security incidents continue to rise, while many foundational controls remain incomplete. The report also highlights a two-speed market: some organisations have very large cyber budgets, while many others are working with far more constrained resources.

But the real issue is not simply the size of the budget. It is whether investment is being directed toward the right outcomes.

Too often, organisations mistake activity for progress:

  • a new tool, but no measurable reduction in response time
  • a new dashboard, but no clearer view of material business risk
  • a compliance uplift, but no improvement in operational resilience
  • an AI pilot, but no agreed controls around data, verification or accountability
Each may be useful. None is sufficient on its own.

Cyber resilience needs to be viewed through a broader enterprise lens: business capability, architecture, delivery governance, operating model, data, risk and executive decision-making.

AI has raised the stakes!

AI makes this challenge more urgent.

Most organisations are already experimenting with AI, whether formally or informally. Staff are using generative AI to draft documents, summarise information, analyse data, write code, prepare proposals and accelerate everyday work.

In many cases, that experimentation is useful and well-intentioned. But without clear guardrails, it creates real risk:

  • sensitive or confidential data being entered into unapproved tools
  • AI-generated outputs being accepted without verification
  • intellectual property or client information being exposed
  • decisions being influenced by outputs that are not explainable
  • audit trails disappearing from important business processes

The answer is not to block AI. That rarely works.

The answer is to govern it in a way that supports safe adoption.

As Archie notes, if organisations do not set AI policy, their staff effectively will. Clear guidance is needed on what can be shared, what cannot, which tools are approved, how outputs must be verified, and where human oversight is required.

This is not theoretical governance. It is practical risk management.

AI security needs to be designed into workflows, data handling, architecture decisions, operating models and delivery governance from the start. Otherwise, organisations risk creating the next generation of shadow IT, only faster, more distributed and harder to unwind.

Security must be embedded into architecture and delivery!

A key implication from the report is that security cannot remain a parallel workstream.

When security is engaged late, it is seen as a blocker. When it is treated mainly as compliance, it becomes paperwork. When it is approached only through tooling, it becomes fragmented and expensive.

The better approach is to embed security and trust into the way organisations design, prioritise and deliver change.

For Fragile to Agile, this is where enterprise architecture plays a critical role. A strong architecture view helps leaders understand:

  • which business capabilities are most exposed
  • which systems, data, integrations and third parties support them
  • which controls are required to protect them
  • where current-state maturity is weak
  • which investments will reduce the most material risk
  • how AI changes the organisation’s exposure and control requirements

Security should not be something added at the end of a roadmap. It should help shape the roadmap.

That is how organisations move from security as a separate technical function to security as a core part of trusted transformation.

Boards need a clearer story!

The ADAPT report also points to a long-standing challenge: many CISOs still struggle to demonstrate return on security investment to the board. Less than one-third of CISOs surveyed can confidently demonstrate ROI, despite rising budgets and increasing incident rates.

This is partly a measurement issue, but it is also a communication issue.

Boards do not need more technical detail. They need a clearer story:

  • what are we most exposed to?
  • what would it cost us if it happened?
  • what are we doing about it?
  • are we improving?
  • what risk are we still carrying?
  • where does AI change the equation?

That is the language of enterprise risk, not cyber operations.

CISOs, CIOs, CTOs and enterprise architects need to work together to translate security from controls and vulnerabilities into money, time, reputation, continuity and trust. That is how cyber becomes a board-level decision, not just a technology function update.

Closing the divide...

The real value of ADAPT’s paper is that it cuts through the illusion that more activity equals more resilience. Closing the security divide requires organisations to:

  • get the fundamentals right
  • govern AI before it runs ahead of policy
  • connect compliance to actual risk reduction
  • rationalise tools around measurable outcomes
  • build cyber capability across the broader organisation
  • communicate risk in language boards can act on
  • integrate security into architecture and transformation roadmaps

It also requires a shift in mindset.

  • Security is not a separate layer placed over transformation. It is part of good transformation.
  • Security is not the opposite of innovation. It is what allows innovation to be trusted, scaled and sustained.
  • AI governance is not an administrative burden. It is now a core part of protecting data, decisions, reputation and organisational confidence.

For boards and executive teams, the question is no longer whether cyber security is receiving attention. It is whether the organisation can prove that its security investment, AI governance, architecture decisions and operating model are reducing the risks that matter most.

The organisations that will lead in the AI era will not be those that move fastest at any cost, but those that can move quickly, safely and with trust designed in from the start.

That is the real call to action from The Security Divide: stop treating security as a control function that slows transformation down, and start treating it as the discipline that makes transformation credible.
