Tony Hasek, CEO and Co-Founder of Goldilock, explores the future of cybersecurity across the supply chain.

As global supply chains are restructured in response to economic uncertainty, rising tariffs, and geopolitical pressure, a new cybersecurity dilemma is coming to the foreground. The number of cyberattacks exploiting supply chain vulnerabilities is surging: 45% of businesses are expected to face software supply chain attacks this year, and three major UK retailers fell victim to cyberattacks within just 10 days of each other. The need for rapid action is clear.

To manage cost pressures, procurement complexity, and disruption risk, many businesses have spent the last few years consolidating suppliers. This means relying more heavily on a select few. But while this strategy may offer operational simplicity, it also introduces unforeseen cybersecurity risks.

When companies buy in bulk through a few key suppliers, it becomes harder to trace where individual components or services actually come from. The benefits of scale can quickly be outweighed by a lack of transparency. This creates openings for cyber threats – compromised hardware might be introduced without detection, unverified software and firmware can slip through, and oversight often breaks down across multiple layers of third-party subcontractor and vendor networks.

Recent geopolitical shifts in global trade have added a new layer of complexity, forcing companies to quickly move to new suppliers in different regions – often building entire supply chains from scratch. In this fast-changing environment, organisations must ask: are software-only cyber defences still enough?

Supply chain fragmentation is redefining risk

Over the past decade, cybersecurity strategy has largely focused on digital defences: intrusion detection systems, firewalls, endpoint protection, and role-based identity management. These are all essential, but they rest on the assumption that every component of an end-to-end system can be trusted, or at least that any threat it poses can be detected.

As companies pivot to new vendors, particularly in critical infrastructure, telecommunications, and manufacturing, they inherit new digital dependencies, often with little time or visibility to assess risk. A growing number of cyberattacks now originate not from obvious threat actors but from compromised supply chain components.

A recent survey found that 55% of global supply chain professionals use a mix of local and global IT solutions, resulting in fragmented systems that create multiple weak points for cybercriminals. These threats include routers shipped with hidden backdoors, firmware with embedded vulnerabilities, and software libraries poisoned long before deployment.

The infamous SolarWinds breach is a prime example: attackers injected malware into the company’s software build system and went undetected for months. Because the malware was delivered through trusted channels, it didn’t appear as a breach to downstream customers – reinforcing the dangerous assumption that a well-known software supply chain couldn’t be compromised.

This is the challenge now facing every CIO and security lead. With the global supply web constantly shifting, the threat vector has moved upstream, and it’s becoming increasingly difficult to tell which components are compromised until it’s too late.

The blind spots in modern cybersecurity

Geopolitical pressures and economic instability have accelerated supplier diversification. As a result, organisations are often forced to onboard new hardware and software partners on compressed timelines. This leaves less room for thorough due diligence. The bigger challenge, however, is ensuring that pre-compromised components don’t make it through the door in the first place.

Modern cybersecurity tools excel at monitoring and responding to suspicious behaviour, but most still work reactively. If malicious code runs inside a network or access credentials are stolen, it’s up to the software to identify, isolate, and shut down the threat. This approach assumes detection happens quickly, before the attacker has had time to move deeper into the system.

Unfortunately, lateral movement – when attackers quietly expand their access across a network – is one of the most damaging and least understood stages of a cyberattack. Even a foothold in a non-critical system can lead to privilege escalation, data theft, and the compromise of sensitive environments. While software defences can slow this process, they often struggle to stop it entirely.
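To make lateral movement concrete, the minimal sketch below models a network as a simple graph and computes every host an attacker could pivot to from an initial foothold. The hostnames and topology are invented purely for illustration:

```python
from collections import deque

def reachable(edges, foothold):
    """Breadth-first search: every host an attacker could pivot to
    from an initial foothold, given the network's connectivity."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen, queue = {foothold}, deque([foothold])
    while queue:
        for peer in graph.get(queue.popleft(), ()):
            if peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return seen

# A flat estate: a foothold in a non-critical system (the HVAC unit)
# eventually reaches everything, including the control environment.
flat = [("hvac", "file-server"), ("file-server", "workstation"),
        ("workstation", "domain-controller"), ("domain-controller", "scada")]
print(reachable(flat, "hvac"))      # includes 'scada'

# Physically isolate the control environment: the link simply doesn't exist.
isolated = [e for e in flat if "scada" not in e]
print(reachable(isolated, "hvac"))  # 'scada' is unreachable - nowhere to pivot
```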

This is especially true in the case of state-sponsored attackers and advanced persistent threats (APTs), which use highly sophisticated methods and zero-day exploits designed to bypass detection or lie dormant until the right opportunity arises. If the initial breach comes from a trusted supply chain partner, it can slip under the radar for months, hidden behind software that appears safe and behaves normally, until it’s too late.

Why physical isolation matters now

This is where physical network isolation enters the conversation. Not as a throwback to air-gapped systems of the past, but as a modern, strategic layer of defence. For years, organisations have used software-based methods like network segmentation and logical separation to compartmentalise systems. While valuable, these approaches are still vulnerable and can’t guarantee complete control. Physical connection control takes isolation further, enforcing a dynamic, hardware-based barrier – essentially a modern air-gap – that offers true separation and resilience against advanced threats and supply chain compromises.

At its core, physical network isolation does what software alone cannot. It completely severs the potential for any unauthorised communication. Systems can be placed entirely offline or connected only via out-of-band controls that are not susceptible to remote compromise. In other words, even if an attacker manages to breach a system or sneak in through a compromised component, they cannot pivot elsewhere because there’s simply nowhere to go.

In high-value environments, such as critical infrastructure, government networks, and financial systems, this approach is increasingly being revisited. The logic is simple: certain systems are too important to risk. They must be ringfenced, not just monitored.

Advances in control technologies now allow for dynamic physical disconnection. This enables systems to be securely reconnected for updates or access without maintaining constant exposure. It’s a modern interpretation of air-gapping, dynamic and perfectly adapted to today’s operational demands.
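What this looks like in practice depends on the vendor’s hardware, but the pattern itself is simple. Below is a minimal sketch, with a hypothetical Relay class standing in for whatever out-of-band switch actually makes and breaks the physical link:

```python
from contextlib import contextmanager

class Relay:
    """Hypothetical stand-in for an out-of-band hardware switch that
    physically makes or breaks a network link."""
    def connect(self):    print("link physically closed - system online")
    def disconnect(self): print("link physically opened - system offline")

@contextmanager
def maintenance_window(relay):
    """Connect only for the duration of the work; the finally block
    guarantees a return to the default-offline state, even on failure."""
    relay.connect()
    try:
        yield
    finally:
        relay.disconnect()

relay = Relay()
with maintenance_window(relay):
    print("fetching and applying vendor updates...")
# On exit the system is unreachable again - exposure lasted seconds, not weeks.
```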

Resilient by design

A system that is physically unreachable provides a level of assurance that software-based defences alone cannot match. This makes physical isolation particularly valuable when built into supply chain security protocols. Systems receiving data or code from third-party vendors can remain physically segregated until fully verified, while backup infrastructure can stay completely offline until needed. Even control systems can be made unreachable from external networks, removing the risk of remote hijacking.
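As a sketch of that verification step: a vendor payload lands on the physically segregated staging host and is only allowed across once its hash matches a digest obtained out-of-band from the vendor. The example below uses Python’s standard hashlib, with a temporary file standing in for the vendor file; the names are illustrative:

```python
import hashlib, os, tempfile

def verify_payload(path, expected_sha256):
    """Hash the vendor-supplied file on the isolated staging host and
    compare it against a digest obtained out-of-band from the vendor."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256.lower()

# Demo: write a stand-in 'vendor payload' and compute its known-good digest.
payload = b"vendor firmware image v1.2"
expected = hashlib.sha256(payload).hexdigest()
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name

if verify_payload(path, expected):
    print("digest matches: safe to bridge the link and transfer")
else:
    print("digest mismatch: payload stays quarantined offline")
os.unlink(path)
```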

To be clear, physical isolation isn’t a silver bullet. But when it can be configured on demand, it becomes a critical layer in both threat mitigation and business continuity. It serves as a proactive first line of defence, a reactive last line of defence, and a practical way to limit the scope and timing of any potential attack.

In cybersecurity, layered defence is essential. Firewalls protect the perimeter, detection tools monitor activity, and identity systems control access. But if those are compromised, what’s left to protect the core?

Time to rethink what “secure” really means

As the digital and physical worlds become more intertwined, organisations must evolve their definition of cybersecurity. Only 30% of businesses report prioritising a secure, connected system for their supply chain. This indicates that more needs to be done. Software tools will always play a critical role, but they should not be the only line of defence. This is particularly true in an era where a single compromised component can trigger a cascade of consequences, all the way up to a network-wide breach.

Physical network isolation doesn’t replace modern cybersecurity; it reinforces it. In a future defined by volatility and hyperconnectivity, businesses must ask not just “can we detect threats?” but also “can we control and contain them when detection fails?”

For those willing to embrace a multi-layered strategy that includes both virtual and physical controls, the answer will be yes.

The digital landscape is changing day by day. Ideas like the metaverse that once seemed a futuristic fantasy are now coming to fruition and embedding themselves into our daily lives. The thinking might be there, but is our technology really ready to go meta? Domain and hosting provider Fasthosts spoke to the experts to find out…

How the metaverse works

The metaverse is best defined as a virtual 3D universe which combines many virtual places. It allows users to meet, collaborate, play games and interact in virtual environments. From the outside, it’s usually viewed and accessed as a mixture of virtual reality (VR) – think of someone in their front room wearing a headset, frantically waving nunchucks around – and augmented reality (AR), but it’s so much more than this…

These technologies are just the external entry points to the metaverse and provide the visuals which allow users to explore and interact with the environment within the metaverse. 

This is the ‘front-end’, if you like, reinforced by artificial intelligence and 3D reconstruction. These additional technologies help to provide realistic objects and environments, computer-controlled actions, and avatars for games and other metaverse projects.

So, what stands in the way of this fantastical 3D universe? Here are the six key challenges:

Technology

The most important piece of technology on which the metaverse is based is the blockchain. A blockchain is essentially a chain of blocks, each containing specific information, maintained by a network of computers linked to each other rather than to a central server – which means the whole network is decentralised. This provides the infrastructure for developing metaverse projects and storing their data, and makes them compatible with Web3, an upgraded version of the internet that will allow virtual and augmented reality to be integrated into people’s everyday lives.
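The ‘chain of blocks’ idea is easier to see in code than in prose. Here is a toy sketch (not a production chain): each block stores the hash of its predecessor, so tampering with any block breaks every later link.

```python
import hashlib, json, time

def make_block(data, prev_hash):
    """A block records its payload plus the hash of the previous block;
    its own hash covers both, chaining the blocks together."""
    block = {"time": time.time(), "data": data, "prev": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block("genesis", "0" * 64)
payment = make_block({"from": "alice", "to": "bob", "amount": 5},
                     genesis["hash"])

# The link is verifiable by anyone: alter 'genesis' and its recomputed
# hash no longer matches what 'payment' claims to extend.
assert payment["prev"] == genesis["hash"]
print(payment["hash"])
```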

Sounds like a lot, right? And it involves a great deal of tech that is alien to the vast majority of us. So, is technology a barrier to widespread metaverse adoption?

Jonothan Hunt, Senior Creative Technologist at Wunderman Thompson, says the tech just isn’t there. Yet.

“Technology’s readiness for the mass adoption of the metaverse depends on how you define the metaverse, but if we’re talking about the future vision that the big tech players are sharing, then not yet. The infrastructure that powers the internet and our devices isn’t ready for such experiences. The best we have right now in terms of shared/simulated spaces are generally very expensive and powered entirely in the cloud, such as big computers like the Nvidia Omniverse, cloud streaming, or games. These rely heavily on instancing and localised grouping. Consumer hardware, especially XR, is still not ready for casual daily use and still not really democratised.

“The technology for this will look like an evolution of the systems above, meaning more distributed infrastructure, better access and updated hardware. Web3 also presents a challenge in and of itself, and questions remain over to what extent big tech will adopt it going forward.”

Storage

Blockchain is the ‘back-end’, where the magic happens, if you will. It’s this that will be the key to the development and growth of the metaverse. There are a lot of elements that make up the blockchain and reinforce its benefits and uses such as storage capabilities, data security and smart contracts. 

Due to its decentralised nature, the blockchain has far more storage capacity than the centralised storage systems we have in place today. With metaverse data expected to run to exabytes, the blockchain makes use of unutilised hard disk space across the network, helping to prevent metaverse users worldwide from running out of storage space.

In terms that might be a bit more relatable, an exabyte is a billion gigabytes. That’s a huge amount of storage, and that doesn’t just exist in the cloud – it’s got to go somewhere – and physical storage servers mean land is taken up, and energy is used. Hunt says: “How long’s a piece of string? The whole of the metaverse will one day be housed in servers and data centres, but the amount or size needed to house all of this storage will be entirely dependent on just how mass adopted the metaverse becomes. Big corporations in the space are starting to build huge data centres – such as Meta purchasing a $1.1 billion campus in Toledo, Spain to house their new Meta lab and data centre – but the storage space is not the only concern. These energy-guzzlers need to stay cool! And what about people and brands who need reliable web hosting for events, gaming or even just meeting up with pals across the world, all that information – albeit virtual – still needs a place to go.

“The current rising cost of electricity worldwide could cause problems for the growth of data centres, and the housing of the metaverse as a whole. However, without knowing the true size of its adoption, it is extremely difficult to truly determine the needed usage. Could we one day see an entire island devoted to data centre storage? Purely for the purposes of holding the metaverse? It seems a little ‘1984’, but who knows?”
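For a rough sense of the physical footprint behind those exabytes, a back-of-the-envelope calculation (the 20 TB commodity drive size is an assumption chosen for illustration):

```python
# 1 exabyte = a billion gigabytes, as noted above.
EXABYTE_GB = 1_000_000_000
DRIVE_TB = 20  # assumed commodity drive capacity
drives_per_eb = EXABYTE_GB / (DRIVE_TB * 1_000)
print(f"{drives_per_eb:,.0f} x {DRIVE_TB} TB drives per exabyte")  # 50,000
```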

Identity

While the blockchain provides instantaneous verification of transactions, with identity established through digital wallets, our physical form will be represented by avatars that visually reflect who we are and how we want to be seen.

The founder of Saxo Bank and the chairman of the Concordium Foundation, Lars Seier Christensen, argues, “I think that if you use an underlying blockchain-based solution where ID is required at the entry point, it is actually very simple and automatically available for relevant purposes. It is also very secure and transparent, in that it would link any transactions or interactions where ID is required to a trackable record on the blockchain.”

Once identity is established, it could become easier to assess the creditworthiness of parties purchasing and borrowing in the metaverse, since each individual’s data and transactions would be stored against their digital identity on the blockchain. However exciting that sounds, there must be consideration of how it could impact privacy, and of how this amount of data would be recorded on the blockchain.

Security

There are also huge security benefits to this setup. The decentralised blockchain helps to eradicate third-party involvement and data breaches, such as theft and file manipulation, thanks to its powerful data processing and its validation nodes, which are responsible for verifying and recording transactions on the blockchain. This will be reassuring to many, given the widespread concerns around data privacy and user protection in the metaverse.

To access the blockchain, all we will need is an internet connection and a device such as a laptop or smartphone – this ready availability is a large part of its appeal. However, to support the blockchain, we’re relying on a whole different set of technologies. Akash Kayar, CEO of web3-focused software development company Leeway Hertz, had this to say on the readiness of the current technology: “The metaverse is not yet completely mature in terms of development. Tech experts are researching strategies and testing the various technologies to develop ideas that provide the world with more feasible and intriguing metaverse projects.

“Projects like Decentraland, Axie Infinity, and Sandbox are popular contemporary live metaverse projects. People behind these projects made perfect use of notable metaverse technologies, from blockchain and cryptos to NFTs.

“As envisioned by top tech futurists, many new technologies will empower the metaverse in the future, which will support the development of a range of prolific use cases that will improve the ability of the metaverse towards offering real-life functionalities. In a nutshell, the metaverse is expected to bring extreme opportunities for enterprises and common users. Hence, it will shape the digital future.”

Currency & Payments

Whilst it’s only considered legal tender in two countries, cryptocurrency is already a reality, and there is a strong likelihood that it will eventually be mass adopted. The metaverse, however, is arguably not yet at the same maturity level, meaning cryptocurrency may have to wait before it can fully take off.

There is no doubt that cryptocurrency and the metaverse will go hand in hand, as the former will become the tender of the latter, with many of the current metaverse platforms each wielding a native currency – Decentraland, for example, uses $MANA for payments and purchases. However, with the volatility of cryptocurrencies and the recent collapse of trading platform FTX pointing to security lapses, we may not yet be ready for the switch to decentralised payments.

Energy

Some of the world’s largest data centres each contain many tens of thousands of IT devices and require more than 100 megawatts of power capacity – enough to power around 80,000 U.S. households (U.S. DOE 2020). With the cost of a megawatt hour averaging $150, that is equivalent to a running cost of around $1.35bn per data centre.

According to Nitin Parekh of Hitachi Energy, the amount of power it takes to process Bitcoin is higher than you might expect: “Bitcoin consumes around 110 Terawatt Hours per year. This is around 0.5% of global electricity generation. This estimate considers combined computational power used to mine bitcoin and process transactions.” From this estimate, we can calculate that the annual energy cost of Bitcoin is around $16.5bn.
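Both figures can be sanity-checked with quick arithmetic. At $150 per megawatt hour, a 100 MW data centre costs around $131m a year to power, so the $1.35bn figure quoted above reads as roughly a decade of continuous operation (an inference on our part, not stated in the source); the Bitcoin number checks out directly:

```python
# Data centre: 100 MW of capacity at $150 per MWh, running around the clock.
mw, price_per_mwh = 100, 150
annual_cost = mw * 24 * 365 * price_per_mwh
print(f"${annual_cost / 1e6:.1f}m per year")                       # $131.4m
print(f"$1.35bn = {1.35e9 / annual_cost:.1f} years of operation")  # ~10 years

# Bitcoin: ~110 TWh per year at the same $150/MWh.
btc_mwh = 110 * 1_000_000  # TWh -> MWh
print(f"${btc_mwh * price_per_mwh / 1e9:.2f}bn per year")          # $16.50bn
```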

However, some bigger corporations are slowly moving towards renewable energy to power their projects in this space, with Google signing close to $2bn worth of wind and solar investments to power its data centres and become greener. Amazon has followed in Google’s footsteps and has become the world’s largest corporate purchaser of renewable energy.

They may have plenty of time yet to get their green processes in place, with Mark Zuckerberg recently predicting it will take nearly a decade for the metaverse to be created: “I don’t think it’s really going to be huge until the second half of this decade at the earliest.”

About Fasthosts

Fasthosts has been a leading technology provider since 1999, offering secure UK data centres, 24/7 support and a highly successful reseller channel. Fasthosts provides everything web professionals need to power and manage their online space, including domains, web hosting, business-class email, dedicated servers, and a next-generation cloud platform. For more information, head to www.fasthosts.co.uk