Zero trust architecture is not just ‘nice to have’

Paul German, CEO, Certes Networks, outlines why adopting a Zero Trust mindset to data and cyber security is now an operational necessity, and explores the essential components organisations must embrace to achieve this.
The US National Institute of Standards and Technology’s (NIST) recent Special Publication (SP 800-207) has raised the table stakes for cybersecurity best practice. While not mandatory, the federal agency’s role in enhancing economic security cannot be underestimated. As such, its guidance refining the concept of Zero Trust, and its high-level roadmap for how organizations can implement a standardized approach to a Zero Trust Architecture, cannot be ignored either.
Zero Trust

The concept of ‘zero trust’ is not new; originally defined in Stephen Paul Marsh’s 1994 doctoral thesis on computational security, it became a key cybersecurity concept when Forrester’s John Kindervag reignited it in the late 2000s. The premise is that attacks can come from within an organization’s network as well as from outside it.

However, until recently, the debate around zero trust has remained – in my view – focused solely on authenticating the user within the system rather than taking a more holistic approach and looking at user authentication and access to sensitive data using protected micro-segments. This concept has changed with NIST’s Special Publication; no longer is the network the focus of zero trust; finally, it is the data that traverses the network.

At its core, NIST’s Special Publication decouples data security from the network. Its key tenets of policy definition and dynamic policy enforcement, micro-segmentation and observability offer a new standard of Zero Trust Architecture (ZTA) for which today’s enterprise is responsible.

Dynamic policy aligned to business intent

As data owners, organizations are responsible for protecting their sensitive information. Moreover, with increasing regulation that specifically targets the protection of this sensitive data, it is more important than ever that organizations adopt a cybersecurity stance that can ensure – and maintain – compliance, or information assurance. However, not all data has the same level of sensitivity.

Under the latest zero trust standards, data needs to be classified according to differing levels of sensitivity and the business intent of that data. This business intent needs to define an organization’s operational policy around how data is handled and accessed, when, where and by whom, with micro-segmentation protecting each data class from external compromise and providing isolation from other data classifications.
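The mapping from data classification to handling policy can be pictured as a simple lookup. The sketch below is purely illustrative, not a product configuration: the classification names, segments, roles and hours are hypothetical examples of business intent expressed as policy.

```python
from dataclasses import dataclass

# Hypothetical policy record: each data classification maps to a
# micro-segment and the conditions under which access is allowed.
@dataclass(frozen=True)
class DataPolicy:
    classification: str        # e.g. "restricted", "internal"
    segment: str               # micro-segment isolating this data class
    allowed_roles: frozenset   # who may access
    allowed_hours: range       # when access is permitted (UTC hours)

POLICIES = {
    "restricted": DataPolicy("restricted", "seg-restricted",
                             frozenset({"finance-admin"}), range(8, 18)),
    "internal":   DataPolicy("internal", "seg-internal",
                             frozenset({"employee", "finance-admin"}),
                             range(0, 24)),
}

def is_access_allowed(classification: str, role: str, hour_utc: int) -> bool:
    """Evaluate business-intent policy for a single access request."""
    policy = POLICIES.get(classification)
    if policy is None:
        return False  # unclassified data is denied by default (zero trust)
    return role in policy.allowed_roles and hour_utc in policy.allowed_hours
```

Note the default-deny on unclassified data: under a zero trust posture, anything not explicitly covered by policy is out of bounds.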

In addition, enterprises are encouraged to observe and collect as much information as possible about their asset security posture, network traffic and access requests; process that data; and use any insight gained to dynamically improve policy creation and enforcement.
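That feedback loop can be sketched minimally: collect access-request telemetry, process it, and surface candidates for policy tightening. The log schema and threshold below are assumptions for illustration only.

```python
from collections import Counter

def denied_request_hotspots(access_log, threshold=5):
    """Count denied access requests per (source, resource) pair and flag
    pairs at or above a threshold, as candidate inputs for dynamically
    tightening policy creation and enforcement."""
    denials = Counter(
        (entry["source"], entry["resource"])
        for entry in access_log
        if not entry["allowed"]
    )
    return {pair: n for pair, n in denials.items() if n >= threshold}

# Hypothetical telemetry: one source repeatedly denied access to payroll-db.
log = (
    [{"source": "10.0.0.7", "resource": "payroll-db", "allowed": False}] * 6
    + [{"source": "10.0.0.8", "resource": "wiki", "allowed": True}]
)
hotspots = denied_request_hotspots(log)  # {("10.0.0.7", "payroll-db"): 6}
```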

Authentication and authorization

Under NIST’s zero trust standards, access to individual enterprise resources is granted on a per-session basis, determined by a combination of component relationships – such as the observable state of client identity, application/service, and the requesting asset, potentially alongside other behavioral and environmental attributes – together with operational policy enforcement.

Authentication and authorization to one resource does not grant access to another resource. It is also dynamic, requiring a constant cycle of obtaining access, scanning and assessing threats, adapting, and continually re-evaluating trust in ongoing communication.
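A minimal sketch of that cycle, under assumed names: each session is scoped to a single resource, and trust is re-evaluated on every check rather than carried over from a past decision. The trust signal here (a static posture table) is a stand-in for live endpoint and threat telemetry.

```python
class Session:
    """Per-resource session whose trust is continually re-evaluated.
    Authorization to one resource never implies access to another."""

    def __init__(self, user, resource, trust_fn, min_trust=0.7):
        self.user, self.resource = user, resource
        self.trust_fn = trust_fn      # returns current trust in [0, 1]
        self.min_trust = min_trust

    def authorized(self):
        # Re-evaluate trust on every access check, never cache the verdict.
        return self.trust_fn(self.user, self.resource) >= self.min_trust

# Hypothetical trust signal, e.g. derived from device posture telemetry.
def example_trust(user, resource):
    posture = {"alice": 0.9, "bob": 0.4}
    return posture.get(user, 0.0)

assert Session("alice", "crm", example_trust).authorized()
assert not Session("bob", "crm", example_trust).authorized()
```

If the telemetry behind `example_trust` degrades mid-session, the very next `authorized()` call fails: trust is earned continuously, not granted once.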

Cybersecurity best practice demands fine-grained policies that perform authentication and authorization on a ‘per-packet’ basis, allowing access only to the resources required. Layer-4 encryption protects data as it transits between policy enforcement points. It provides full observability by encrypting the payload only, leaving the packet header in the clear and allowing granular enforcement of security policies.
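The payload-only approach can be sketched as follows. This is a toy illustration, not Certes’ implementation: the HMAC-based keystream is a placeholder for a real authenticated cipher such as AES-GCM, and the fixed header length stands in for proper TCP header parsing.

```python
import hashlib
import hmac

HEADER_LEN = 20  # minimal TCP header length, kept in the clear

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream from HMAC-SHA256 in counter mode.
    Placeholder for a real authenticated cipher such as AES-GCM."""
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_payload(segment: bytes, key: bytes, nonce: bytes) -> bytes:
    """Encrypt only the payload; the header stays readable, so network
    tools retain observability and can enforce granular policy."""
    header, payload = segment[:HEADER_LEN], segment[HEADER_LEN:]
    ct = bytes(p ^ k
               for p, k in zip(payload, keystream(key, nonce, len(payload))))
    return header + ct

segment = b"H" * HEADER_LEN + b"sensitive application data"
protected = encrypt_payload(segment, b"demo-key", b"nonce123")
assert protected[:HEADER_LEN] == segment[:HEADER_LEN]  # header in the clear
assert protected[HEADER_LEN:] != segment[HEADER_LEN:]  # payload encrypted
```

Because the XOR keystream is symmetric, applying `encrypt_payload` again with the same key and nonce restores the original segment; the point of the sketch is simply that observability tools still see an intact header while the payload is opaque.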

Network visibility and observability tools are the linchpin: they provide real-time contextual metadata, enabling rapid detection of out-of-policy data and fast response and remediation of any non-compliant traffic flow or policy change, maintaining the required security posture on a continuous basis.


No compromise

Fundamentally, a Zero Trust posture must be achievable without compromising the network’s performance, allowing users with authenticated and authorized access to the data they need to do their jobs seamlessly.

Organizations need to be able to secure data in transit, across any network, with zero impact on performance, scalability or operational visibility. As the latest NIST zero trust standards advocate, decoupling security from network hardware in this way is a unique approach that enables security teams to be confident their organization’s data is assured, regardless of what is happening to the network – finally putting the focus of cybersecurity best practice where it belongs: on the data.

