Fawad Qureshi, Global Field CTO, Snowflake, on realising possibilities for innovation in this new AI era

Without cloud migration, businesses face the end of innovation. In this new AI era, businesses operating within the closed architectures of legacy systems do not have the flexible, data-driven foundation to engage with these new technologies and ensure a strong pipeline of necessary innovation. And as AI continues to evolve, those not able to keep pace with innovation risk being left behind. 

Cloud migration is the foundation for modernising and driving business growth over the long term. When organisations migrate to a cloud-based environment, it’s crucial to focus on the tangible business value a migration will deliver, rather than simply shifting from one system to another. Moving a company’s customer-facing applications and all of their data to a cloud-based environment brings benefits that are increasingly real and measurable.

Migration Isn’t Just Plug and Play – Which Approach Fits Your Needs?

There are, broadly speaking, two approaches to cloud migration: horizontal and vertical, each with its own benefits and potential challenges. A vertical approach sees organisations migrating applications one by one: this is a good choice if certain systems must be prioritised, or if the applications being migrated have few interdependencies. Vertical migration allows for focused effort and risk management on individual systems, and requires fewer resources. Horizontal migration moves entire system layers at the same time. This is the best solution when businesses have tight deadlines to retire legacy systems, or if their systems are tightly integrated. Horizontal migrations tend to be faster by allowing for parallel work streams, but they require more technical expertise.

Organisations often adopt a mixture of the two approaches, for example, horizontally migrating important systems such as data platforms, while taking a vertical approach to customer-facing applications. Whatever approach an organisation takes, it’s vital that the migration also includes a culture shift, preparing employees to adapt to new, consumption-based models and the possibilities of the new technology. Migration is also just the start of the journey, unlocking the potential of AI-driven use cases and seamless data collaboration, including new ways to achieve business value. 

Before Diving In, Adopt a Data-First Mindset

When migrating to the cloud, a data-first approach is essential. For those acting as the catalyst for change, whether that be IT managers or even CIOs, data must be front of mind before planning any successful migration. Understanding how data is used within the organisation, including its structure, governance needs, and how it delivers value and business outcomes, is imperative. This applies doubly when it comes to large, complex systems with many interconnected applications.

Before migrating, businesses must comprehensively assess their current ecosystem. It’s imperative that the end-to-end business product survives the migration intact. Organisations should maintain internal control over core competencies around data, such as business process knowledge, data governance and change management. These areas include institutional knowledge that external parties may not grasp. Businesses should also maintain direct oversight of compliance requirements and risk management.

By contrast, technical activities such as cloud infrastructure optimisation, performance testing, and specialised migration tooling can be handled by external expertise. Code conversion can also benefit from purpose-built tools that use technologies including AI. Technical parts of the migration tend to evolve rapidly and require specialist knowledge, so they are ripe for outsourcing. While doing so, those steering the migration need to ensure clear governance around outsourced activities, including regular knowledge transfer sessions.

Different parts of the business all have a role to play: IT and engineering lead on technical implementation, while finance identifies ROI opportunities and manages cloud costs. It helps to create a cross-functional steering committee with representation from every department to ensure that different areas of the business are aligned and ready to address challenges.

Adaptability and Flexibility are the Key to Business Longevity

Migration is never one-size-fits-all, and business leaders should be prepared to be flexible and adapt. There are multiple kinds of horizontal migration, from a simple ‘lift and shift’ focused on moving systems as they are, to a ‘move and improve’ where migration is followed by optimisation to reduce technical debt. Organisations should be ready to adapt at their own pace, choosing data platforms which offer agnostic architecture and the freedom to choose between data models and tools to ensure minimal disruption.

Flexibility is also important in choosing the tools used for migrations. Flexible data platforms will offer the support businesses need to deal with collaboration and governance frameworks. Businesses operating in EMEA, where different countries can have varying policies, should pay close attention to issues around data quality, security and compliance, particularly when it comes to data sovereignty and European data residency.

A Shared Destiny

The shift to the cloud fundamentally changes security. The traditional cloud ‘shared responsibility’ model clearly demarcated duties between the provider and the customer. However, a more advanced approach is emerging: the ‘shared destiny’ model. This model recognises that in the event of a breach, reputational damage affects both parties. This shared risk incentivises the cloud provider to be a more proactive partner, actively helping customers strengthen their security posture rather than simply managing their own side of the demarcation line.

As ‘destinies’ intertwine, both parties work to eliminate vulnerabilities such as those created by weak passwords. Put simply, in a ‘shared responsibility’ model, the cloud provider is only responsible for securing infrastructure, while the customer remains responsible for securing data and apps in the cloud, as well as for configuration. In a ‘shared destiny’ model, the cloud provider plays a more proactive role to ensure that their customers have the best possible security posture.

Taking a ‘shared destiny’ approach allows businesses to be more proactive in securing data, using approaches such as multi-factor authentication, secure programmatic access and more comprehensive cloud monitoring services. Choosing a modern, AI-driven data platform offers the best security foundations here, offering security controls across cloud service providers and the entire data ecosystem. 

A Pathway to Growth

In today’s world, the bigger risk is standing still. Nothing changes if nothing changes.

If organisations are holding back on innovation due to technological limitations, then the time to migrate is clear. There is no need to face an end to possibilities when the path to success lies within reach, offering an opportunity to bring businesses up to date with modern requirements and pave the way for the adoption of technologies such as AI.

However, as we’ve seen, it’s not just a case of plug and play. Organisations must ensure a flexible, data-driven approach to migration, while keeping security front of mind via a ‘shared destiny’ approach. To deliver this, the right choice of a modern, flexible data platform will ensure the whole organisation can work together effectively and deliver a path to future innovation and growth. 

Learn more at snowflake.com

Vertiv expects powering up for AI, Digital Twins and Adaptive Liquid Cooling to shape future Data Centre Design and Operations

Data Centre innovation is continuing to be shaped by macro forces and technology trends related to AI, according to a report from Vertiv, a global leader in critical digital infrastructure. The Vertiv™ Frontiers report, which draws on expertise from across the organisation, details the technology trends driving current and future innovation, from powering up for AI, to digital twins, to adaptive liquid cooling.

“The data centre industry is continuing to rapidly evolve how it designs, builds, operates and services data centres, in response to the density and speed of deployment demands of AI factories,” said Vertiv chief product and technology officer, Scott Armul. “We see cross-technology forces, including extreme densification, driving transformative trends such as higher voltage DC power architectures and advanced liquid cooling that are important to deliver the gigawatt scaling that is critical for AI innovation. On-site energy generation and digital twin technology are also expected to help to advance the scale and speed of AI adoption.”

The Vertiv Frontiers report builds on and expands Vertiv’s previous annual Data Centre Trends predictions. The report identifies macro forces driving data centre innovation:

  • Extreme densification – accelerated by AI and HPC workloads
  • Gigawatt scaling at speed – data centres are now being deployed rapidly and at unprecedented scale
  • Data centre as a unit of compute – the AI era requires facilities to be built and operated as a single system
  • Silicon diversification – data centre infrastructure must adapt to an increasing range of chips and compute

The report details how these macro forces have in turn shaped five key trends impacting specific areas of the data centre landscape.

1. Powering up for AI

Most current data centres still rely on hybrid AC/DC power distribution from the grid to the IT racks, which includes three to four conversion stages and some inefficiencies. This existing approach is under strain as power densities increase, largely driven by AI workloads. The shift to higher voltage DC architectures enables significant reductions in current, size of conductors, and number of conversion stages while centralising power conversion at the room level. Hybrid AC and DC systems are pervasive, but as full DC standards and equipment mature, higher voltage DC is likely to become more prevalent as rack densities increase. On-site generation, and microgrids, will also drive adoption of higher voltage DC.
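The arithmetic behind that current reduction is simple: for a fixed power draw, P = V × I, so doubling the distribution voltage halves the current the conductors must carry, and conductor cross-section (and copper cost) scales roughly with current. A minimal sketch of the idea follows; the wattage and voltage figures are illustrative assumptions, not figures from the report:

```python
def required_current_amps(power_watts: float, voltage_volts: float) -> float:
    """Current needed to deliver power_watts at voltage_volts (ideal, lossless)."""
    return power_watts / voltage_volts

# Assumed example: a 120 kW rack row fed at two different DC voltages.
rack_power = 120_000  # watts

for voltage in (400, 800):
    amps = required_current_amps(rack_power, voltage)
    print(f"{rack_power / 1000:.0f} kW at {voltage} V DC -> {amps:.0f} A")
```

At 400 V the feed must carry 300 A; at 800 V, only 150 A for the same power, which is why higher voltage DC allows smaller conductors and fewer conversion stages as rack densities climb.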

2. Distributed AI

The billions of dollars invested into AI data centres to support large language models (LLMs) to date have been aimed at supporting widespread adoption of AI tools by consumers and businesses. Vertiv believes AI is becoming increasingly critical to businesses but how, and from where, those inference services are delivered will depend on the specific requirements and conditions of the organisation. While this will impact businesses of all types, highly regulated industries, such as finance, defence, and healthcare, may need to maintain private or hybrid AI environments via on-premise data centres, due to data residency, security, or latency requirements. Flexible, scalable high-density power and liquid cooling systems could enable capacity through new builds or retrofitting of existing facilities.

3. Energy autonomy accelerates

Short-term on-site energy generation capacity has been essential for most standalone data centres for decades, to support resiliency. However, widespread power availability challenges are creating conditions to adopt extended energy autonomy, especially for AI data centres. Investment in on-site power generation, via natural gas turbines and other technologies, does have several intrinsic benefits but is primarily driven by power availability challenges. Technology strategies such as Bring Your Own Power (and Cooling) are likely to be part of ongoing energy autonomy plans.

4. Digital twin-driven design and operations

With increasingly dense AI workloads and more powerful GPUs comes a demand to deploy these complex AI factories at speed. Using AI-based tools, data centres can be mapped and specified virtually, via digital twins, and the IT and critical digital infrastructure can be integrated, often as prefabricated modular designs, and deployed as units of compute, reducing time-to-token by up to 50%. This approach will be important to efficiently achieving the gigawatt-scale buildouts required for future AI advancements.

5. Adaptive, resilient liquid cooling

AI workloads and infrastructure have accelerated the adoption of liquid cooling. But conversely, AI can also be used to further refine and optimise liquid cooling solutions. Liquid cooling has become mission-critical for a growing number of operators but AI could provide ways to further enhance its capabilities. AI, in conjunction with additional monitoring and control systems, has the potential to make liquid cooling systems smarter and even more robust by predicting potential failures and effectively managing fluid and components. This trend should lead to increasing reliability and uptime for high value hardware and associated data/workloads.

Vertiv does business in more than 130 countries, delivering critical digital infrastructure solutions to data centres, communication networks, and commercial and industrial facilities worldwide. The company’s comprehensive portfolio spans power management, thermal management, and IT infrastructure solutions and services – from the cloud to the network edge. This integrated approach enables continuous operations, optimal performance, and scalable growth for customers navigating an increasingly complex digital landscape.

Find out more at Vertiv.com.

Jon Abbott, Technologies Director of Global Strategic Clients at Vertiv, asks how we can build a generation of data centres for the AI age

The promise of artificial intelligence (AI) is enlightenment. The pressure it places on infrastructure is far less elegant.

Across every layer of the data centre stack, AI is exposing structural limits – from cooling thresholds and power capacity to build timelines and failure modes. What many operators are now discovering is that legacy models, even those only a few years old, are struggling to accommodate what AI-scale workloads demand.

This isn’t simply a matter of scale – it is a shift in shape. AI doesn’t distribute evenly; it lands hard, in dense blocks of compute that concentrate energy, heat and physical weight into single systems or racks. Those conditions aren’t accommodated by traditional data hall layouts, airflow assumptions or power provisioning logic. The once-exceptional densities of 30kW or 40kW per rack are quickly becoming the baseline for graphics processing unit (GPU)-heavy deployments.

The consequences are significant. Facilities must now support greater thermal precision, faster provisioning and closer coordination across design and operations. And they must do so while maintaining resilience, efficiency and security.

Design Under Pressure

The architecture of the modern data centre is being rewritten in response to three intersecting forces. First, there is density – AI accelerators demand compact, high-power configurations that increase structural and thermal load on individual cabinets. Second, there is volatility – AI workloads spike unpredictably, requiring cooling and power systems that can track and respond in real time. Third, there is urgency – AI development cycles move fast, often leaving little room for phased infrastructure expansion.

In this environment, assumptions that once underpinned data centre design begin to erode. Air-only cooling no longer reaches critical components effectively, uninterruptible power supply (UPS) capacity must scale beyond linear load, and procurement lead times no longer match project delivery windows.

To adapt, operators are adopting strategies that prioritise speed, integration and visibility. Modular builds and factory-integrated systems are gaining traction – not for convenience, but for the reliability that controlled environments can offer. In parallel, greater emphasis is being placed on how cooling and power are architected together, rather than as separate functions.

Exploring the Physical Gap

There is a growing disconnect between the digital ambition of AI-led organisations and the physical readiness of their facilities. A rack might be specified to run the latest AI training cluster. The space around it, however, may not support the necessary airflow, load distribution or cable density. Minor mismatches in layout or containment can result in hot spots, inefficiencies or equipment degradation.

Operators are now approaching physical design through a different lens. They are evaluating structural tolerances, rebalancing containment zones, and planning for both current and future cooling scenarios. Liquid cooling, once a niche consideration, is becoming a near-term requirement. In many cases, it is being deployed alongside existing air systems to create hybrid environments that can handle peak loads without overhauling entire facilities.

What this requires is careful sequencing. Introducing liquid means introducing new infrastructure: secondary loops, pump systems, monitoring, maintenance. These elements must be designed with the same rigour as the electrical backbone. They must also be integrated into commissioning and telemetry from day one.

Risk in the Seams

The more complex the system, the more attention must be paid to the seams. AI infrastructure often relies on a patchwork of new and existing technologies – from cooling and power to management software and physical access control. When these systems are not properly aligned, risk accumulates quietly.

Hybrid cooling loops that lack thermal synchronisation can create blind spots. Overlapping monitoring systems may provide fragmented data, hiding early signs of imbalance. Delays in commissioning or last-minute changes in hardware specification can introduce vulnerabilities that remain undetected until something fails.

Avoiding these scenarios requires joined-up design. From early-stage planning through to testing and operation, infrastructure must be treated as a whole. That includes the physical plant, the digital control layer and the operational processes that bind them.

Physical Security Under AI Conditions

As infrastructure becomes more specialised and high-value, the importance of physical security rises. AI racks often contain not only critical data but hardware that is financially and strategically valuable. Facilities are responding with enhanced perimeter control, real-time surveillance, and tighter access segmentation at the rack and room level.

More organisations are adopting role-based access tied to operational state. Maintenance windows, for example, may trigger temporary access privileges that expire after use. Integrated access and monitoring logs allow operators to correlate physical movement with system behaviour, helping to identify unauthorised activity or unexpected patterns.

In environments where automation and remote management are becoming standard, physical security must be designed to support low-touch operations with intelligent systems able to flag anomalies and initiate response workflows without constant human oversight.

Infrastructure as an Adaptive System

The direction of travel is clear. Infrastructure must be able to evolve as quickly as the workloads it supports. This means designing for flexibility and for lifecycle. It means understanding where capacity is needed today, and how that might shift in six months. It means choosing platforms that support interoperability, rather than locking into closed systems.

The goal is not simply to survive the shift to AI-scale compute. It is to build a foundation that can keep up with whatever comes next – whether that is a new training model, a change in energy market conditions, or a new set of regulatory constraints.

Discover more at vertiv.com

Welcome to the latest issue of Interface magazine!

Click here to read the latest edition!

USDA: A Fresh Perspective on Digital Service

This month’s cover story focuses on the digital transformation journey continuing at the United States Department of Agriculture (USDA). In conversation with Fátima Terry, USDA’s former Digital Service Deputy Director, we revisit the sterling work being carried out and find out how technology is being humanised to deliver value to the American people this organisation serves.

“One of the things we did was partner with multiple USDA teams that focused on customer experience and digital service delivery for their programs,” she explains. “We also partnered with other federal-wide agencies and departments to move forward and evaluate the progress of digital transformation by cross-pollinating success models to everyone connected.”

Ayoba: A Super-App for Africa

Ayoba, part of the MTN telco group, is a super-app platform built in Africa, for Africa. Esat Belhan, Chief Technology & Product Officer, reveals how it is bringing more people to digital so they can be tech-savvy and educated on digital capabilities…

“In order to do that, one thing you could do is give away free data, but that data could be easily wasted on another data-heavy app, like TikTok, in just a couple of hours. So, the real solution is that the valuable and insightful content Ayoba provides should be provided for free, and that we provide instant messaging and short video content, to keep people using our platform for their communication and entertainment needs.”

Kraft Kennedy: Supporting MSPs with People and Processes

Nett Lynch, CISO at Kraft Kennedy, explains how the company’s new division, Legion, solves cyber pain-points for MSPs with a collaborative, business-centred approach.

“A lot of MSPs struggle with client strategy, they’re talking tech instead of business. We’re nerds – we love the tech, we love the features. But we need to admit clients aren’t focused on those things. They don’t necessarily care how or why it works. They just want it to work and align to their business goals.”

And read on to hear from FICO’s CIO on using AI to transform technical operations; learn from KnowBe4 how AI Agents will be a game changer for tackling cybercrime; and discover how data centres are meeting the demands of the AI boom with Vertiv.

Click here to read the latest edition!

Andy Swift, Cyber Security Assurance Technical Director at Six Degrees, on defending against malware-free attacks

According to AV-TEST, the independent IT security institute, every day sees at least 450,000 new malware variants added to its database. In June this year, for example, cybercriminals are thought to have used malware to steal over 16 billion login credentials across various major platforms in what is thought to have been the largest breach of its kind in history. For security teams, this represents a relentless challenge that demands constant attention and consumes significant resources.

Malware-Free Attacks

As if that wasn’t enough, malware-free attacks are increasingly favoured by cybercriminals as a way to circumvent organisational security. Typically using legitimate programs and tools, these stealth attacks are particularly complex to detect. And they are invisible to most off-the-shelf automated security tools.

With no obvious malware signatures to detect, automated defences are often powerless to respond. And without robust security foundations, even advanced detection tools offer limited protection once an attacker gains a foothold. When that happens, the consequences can be significant.

At the heart of the matter are the limitations of many traditional security tools, which are simply not designed to stop what they cannot see. Malware-free attacks do not rely on external payloads or binaries with known malicious signatures. This renders many automated detection systems, including standard antivirus solutions, effectively useless. As a result, the burden falls elsewhere.

For most organisations, that means having the right expertise in place to recognise unusual behaviour, supported by technologies that can identify behavioural anomalies quickly. Endpoint detection and response (EDR) platforms offer some of these capabilities. But even the most advanced solutions rely on proper configuration and human oversight to be effective. In an ideal world, every business would have round-the-clock monitoring in place, but in reality, very few do.

Challenging Assumptions Around Risk

So, how can organisations fill the gap? When assessing how to protect against malware-free attacks, many organisations begin with the assumption that they will need to buy new tools or licenses. This can form part of a rounded solution. However, leading with this mindset often overlooks a more fundamental and cost-effective question: What can be improved with the tools already in place?

Reviewing existing capabilities should be the first step. For example, most environments already have some level of EDR, behavioural monitoring or identity protection deployed. Yet these are often underutilised or misconfigured. This can result from a lack of understanding around tool capabilities (and limitations), paying for the wrong level of license coverage, and failing to ensure configurations support behavioural analysis rather than just malware scanning. In many cases, even minor adjustments can significantly increase effectiveness without any additional spend.

Cost vs Risk

Organisations should also reconsider how they approach the question of investment. The cost vs risk conversation needs to shift from what they should buy to what they should fix. Even the most expensive detection tools can be rendered ineffective if attackers can exploit basic oversights such as poor configuration, excessive access rights or the absence of multi-factor authentication. In contrast, identifying and addressing these gaps in existing systems is not only more cost-effective but also more impactful in stopping attacks before they gain momentum.

This kind of review process is also an opportunity to identify gaps and prioritise actions that reduce risk without escalating costs. For example, many organisations find that network segmentation, strict privilege controls and enforcing least-access policies can help prevent lateral movement and minimise credential misuse – two of the most common techniques used in malware-free attacks. These capabilities are security fundamentals that often determine whether an attack is stopped early or is able to spread.

In this context, a best practice approach matters more than ever. Not as a one-off initiative, but as a continuous effort to close the windows of opportunity that attackers rely on. This includes reducing privilege levels, adopting MFA by default, limiting binary access and educating users on social engineering techniques – all cost-effective steps that can limit the opportunity for malware-free attacks to take hold. These are not headline-grabbing technologies, but they remain the strongest defence against attacks that thrive on poor hygiene and overlooked gaps.

So, rather than investing in yet another layer of detection, organisations should focus on strengthening what they already have. This approach not only helps avoid unnecessary expense but also delivers a stronger, more sustainable defence posture in an environment where threat actors continue to be extremely effective.

TechEx Europe – Powering the Future of Enterprise Technology at Amsterdam’s RAI Arena, September 24-25

TechEx Europe unites five leading enterprise technology events — AI & Big Data, Cyber Security, Data Centres, Digital Transformation and IoT — into one powerful experience designed for organisations driving change. Five events, two days, one ticket – register for your pass here.

From scaling infrastructure to unlocking new efficiencies, this is where decision-makers and their teams come to connect, explore real-world use cases, and discover the technologies that will shape their next phase of growth.

AI & Big Data Expo

The AI & Big Data Expo is the premier event showcasing Generative AI, Enterprise AI, Machine Learning, Security, Ethical AI, Deep Learning, Data Ecosystems, and NLP.

Cybersecurity & Cloud Expo

The Cyber Security & Cloud Expo is the premier event showcasing the latest in Application and Cloud Security, Hybrid Cloud, Data Protection, Identity and Access Management, Network and Infrastructure Defence, Risk and Compliance, Threat Intelligence, DevSecOps Integration, and more. Join industry leaders to explore strategies, tools, and innovations shaping the future of secure, connected enterprises.

IoT Tech Expo

IoT Tech Expo is the leading event for IoT, Digital Twins & Enterprise Transformation, IoT Security, IoT Connectivity & Connected Devices, Smart Infrastructures & Automation, Data & Analytics and Edge Platforms.

Digital Transformation

The Digital Transformation Expo is the leading event for Transformation Infrastructure, Hybrid Cloud, The Future of Work, Employee Experience, Automation, and Sustainability.

Data Centre Expo

The Data Centre Expo and conference is the premier event tackling key challenges in data centre innovation. It highlights AI’s Impact, Energy Efficiency, Future-Proofing, Infrastructure & Operations, and Security & Resilience, showcasing advancements shaping the future of data centres.

Book your place at TechEx Europe 2025 now!

Accenture is helping SSEN Transmission manage hundreds of infrastructure projects vital to achieving the UK’s Net Zero ambition. Effective delivery required addressing fragmented data and disconnected tools that can slow the flow of information between systems. SSEN Transmission sought a partner to help reshape its approach for data-driven execution on capital projects.

Meeting the Digital Challenge with Accenture

SSEN Transmission partnered with Accenture to embrace automation and digitisation in response to increasing project demands, a challenge reflected across the wider Capital Projects sector. Through the adoption of BIM (Building Information Modelling) and the implementation of Integrated Project Management (IPM), which was developed with Oracle and Microsoft, this collaboration laid the groundwork for more connected ways of working and continues to promote transformation across the organisation.

Key Benefits Delivered

Accenture supported SSEN Transmission with IPM and BIM solutions customised to meet specific needs and achieve key goals:

  • Digitise processes for a single unified environment
  • Unify data for a standardised and trusted source of truth
  • Create a scalable platform for delivering capital projects

“With a unified real-time view of project data, SSEN Transmission has improved efficiency and strengthened collaboration across internal teams and with external partners. This allows for more time focused on higher value insight-led work, supporting better outcomes, faster decisions and much more agile delivery.”

Huda As’ad, Managing Director, Capital Projects & Infrastructure, UKI

Building for the Future

More than a solutions provider, Accenture helps shape strategy and is supporting SSEN Transmission’s continued focus on refining best practice for smooth project delivery. The partnership is helping to evolve ways of working and strengthen the digital foundation for future readiness.

“Our collaboration is built on a strong digital foundation that can scale with SSEN Transmission’s growing needs. By unifying systems, data, and process, we are enabling the faster adoption of new capabilities and supporting the shift towards fully data-driven capital project delivery.”

Nithin Vijay, Managing Director, Industry X – Capital Projects & Infrastructure

Accenture: A Partner for the Journey

Transformation is a journey that begins with the right foundation across people, data and process. It also requires a digital partner that brings together the best of industry experience, process excellence and technology to:

  • Develop a clear, actionable strategy for digital and data transformation
  • Embed industry best practices to optimise processes and drive continuous improvement
  • Enable smarter, more consistent delivery aligned to a long-term vision, from strategy through to execution

And that’s where Accenture makes its mark, helping clients navigate the journey with confidence.

Learn more about how Accenture is supporting SSEN Transmission on its digitisation journey with Huda As’ad, Managing Director, Capital Projects & Infrastructure, UKI, and Nithin Vijay, Managing Director, Industry X – Capital Projects & Infrastructure.


Tech Show London is coming to ExCeL London on 12-13 March. Register for your free ticket now!

Unlock unparalleled value with a single ticket that gets you free access to five industry-leading technology shows. Welcome to Cloud & AI Infrastructure, DevOps Live, Cloud & Cyber Security Expo, Big Data & AI World, and Data Centre World.

Tech Show London has it all. Don’t miss this immersive journey into the latest trends and innovations.

Discover tomorrow’s tech today

Unleash Potential, Embrace the Future. Hear from the greatest tech minds, all in one place.

Dive into a world where cutting-edge ideas shape your tomorrow. Tech Show London is the epicentre of technology innovation in London and beyond, hosting the brightest minds in technology, AI, cyber security, DevOps, and cloud all under one roof.

The Mainstage Theatre is not just a stage; it’s a launchpad for innovative ideas. Witness a stellar lineup featuring world-renowned experts from across the tech stack, influential C-level executives, key government figures, and the vanguards of AI and cybersecurity. All ready to share ideas set to rock the industry.

Global inspiration, local impact

Seize the opportunity to be inspired by global visionaries. With speakers from the UK, USA, and beyond, prepare for transformative concepts and actionable strategies from technology insiders, ensuring your business stays ahead in an ever-evolving technology landscape.

Where the future of technology takes the stage

Secure your competitive edge at Tech Show London, the UK’s award-winning convergence of the industry’s brightest tech minds.

On 12-13 March 2025, gain vital foresight into the disruptive technologies reshaping your market, and position your organisation at the forefront of technology’s next frontier.

If you’re defining your business’s tech roadmap, register for your free ticket to join us at Excel London.

Register for FREE



Cybersecurity leader Shinesa Cambric on Microsoft’s innovation journey to identify, detect, protect, and respond to emerging threats against identity and access

This month’s cover story highlights a cybersecurity program protecting billions of users.

Welcome to the latest issue of Interface magazine!

Interface showcases leaders at the forefront of innovation with digital technologies transforming myriad industries.

Read the latest issue here!

Microsoft: Innovation in Cybersecurity

Shinesa Cambric is on a mission to drive innovation for cybersecurity at Microsoft. By embracing diversity and opening all channels for collaboration, her team tackles abuse and delivers fraud defence. Continuous improvement doesn’t just play into her role, it defines it…

“In the fraud and abuse space, attackers are constantly trying to identify ways to look like a legitimate user,” warns Shinesa. “And this means my team, and our partners, have to continuously adapt. We identify new patterns and behaviours to detect fraudsters. At the same time, we must do it in such a way we don’t impact our truly ‘good’ and legitimate users. Microsoft is a global consumer business and any time you add friction or an unpleasant experience for a consumer, you risk losing them, their business and potentially their trust. My team’s work sits on the very edge of the account sign up and sign in process. We are essentially the first touch within the customer funnel for Microsoft – a multi-billion dollar company.”

ABB: Digital Technologies contributing towards Net Zero

Nigel Greatorex, Global Industry Manager for Carbon Capture and Storage (CCS) at ABB Energy Industries, explains how digital technologies can play a critical role in the transition to a low carbon world. He highlights the role of CCS in enabling global emissions reductions and how challenges can be overcome through digitalisation…

“It is widely recognised decarbonisation is essential to achieving net zero emissions by 2050. Therefore, it’s not surprising that emerging decarbonisation technology is becoming an increasingly important, and rapidly growing market.”

CSI: How can your IT estate improve its sustainability?

Andy Dunn, Chief Revenue Officer at IT solutions specialist CSI, reveals how digital technologies can contribute to ESG obligations: “Sustainability is now seen as a strategic business imperative, so much so that 74% of companies consider Environmental, Social and Governance (ESG) factors to be very important to the value of their company. Additionally, we know almost three in four organisations have set a net zero goal. With an average target date of 2044, 50% of organisations are seeking more energy-efficient products and services.”

https://www.youtube.com/watch?v=tsDaZiSO1ho

“Optimising energy use and consolidating servers and storage infrastructure form a strong basis for shaping a more environmentally friendly and efficient IT estate. It no longer needs to be the Achilles’ heel of an ESG policy.”

Mia Platform: Sustainable Cloud Computing

Davide Bianchi, Senior Technical Lead at Mia Platform, explores the silver lining of sustainable cloud computing. He reveals how it can help us reduce our digital carbon footprint through collaboration, efficient use of applications, containerisation of apps, microservices and green partnerships.

“We’re already on an important technological path toward ubiquitous cloud computing. Correspondingly, this brings incredible long-term benefits too. These include greater scalability, improved data storage, and quicker application deployment, to name a few.”

Also in this issue, we hear from Doug Laney, Innovation Fellow at West Monroe and author of Infonomics and Data Juice, who explains how companies can measure, manage and monetise their data to realise its potential. And Deputy CIO Melvin Brown discusses the people-centric approach to IT supporting America’s civil service at The Office of Personnel Management (OPM).

Enjoy the issue!

Dan Brightmore, Editor
