Gigamon: Encrypted traffic a perfect storm

September 6, 2023


Mark Jow, Gigamon Field CTO for EMEA, says encrypted traffic is a perfect storm of risk that no one’s talking about.

Security professionals often feel like they are playing a high-stakes game of cat and mouse. Each step forward is quickly undone by bad actors who are becoming ever more ambitious and resourceful, forcing defenders to adapt to a new set of threats.

It’s no different when it comes to encryption. Despite its undeniable security benefits, encryption, or rather encrypted data, is increasingly being used by bad actors to conceal malicious activity. So, in a hybrid cloud-centric and ever more encrypted world, it’s perhaps unsurprising that deep observability is rapidly becoming a mandatory capability: the only means to provide the required levels of transparency and insight into data in motion.

Hiding in plain sight

As far back as 2018, the security benefits of encryption came with an important caveat: Cisco predicted in its annual report that year that a majority of cyber-attacks would use encryption in the near future.

In recent years, the growing use of encryption by businesses has created the perfect camouflage for stealth attacks. Today, cybersecurity teams face a reality in which 93 percent of malware hides behind encryption. And yet, when we polled over 1,000 global IT and security leaders for our Hybrid Cloud Survey, only 30 percent of UK respondents said their organisation has sufficient visibility into encrypted traffic.

Given today’s threat landscape, failure to embrace deep observability is a high-risk strategy, and yet even now many IT and security professionals continue down this path. Of course, CISOs must make difficult daily choices about where to allocate resources and budget to ensure an adequate level of security within their hybrid cloud environments. And in a world where productivity means profitability, it can be hard to justify the time, money and processing power needed to decrypt and analyse encrypted network traffic. That is bad news for security teams and their boards, though: businesses that cannot identify encrypted threats are operating in a perfect storm of increased cyber risk combined with an alarmingly low level of visibility where they need it most.

Exacerbating the visibility gap

There’s plenty to keep IT and security professionals awake at night, but one of the biggest causes of stress is an inability to analyse data flowing across cloud and on-premises networks. Half of those surveyed identify these blind spots as a key cause of concern. Achieving true visibility across networks is an invaluable means of anticipating, monitoring and quickly remediating breaches. Yet complex networks that span multiple data centres and cloud platforms make this an increasingly difficult task.

Hybrid and multi-cloud networks dramatically increase IT complexity, which in turn obscures network visibility. On-premises network tools have limited to no observability into cloud traffic, and cloud-native tools may not be able to see network-level traffic. This cloud visibility gap means each cloud becomes siloed, leaving gaps through which unseen threats can slip.

All too often, perception is not matched by reality. While 94 percent of respondents believe their security tools and processes provide complete visibility into hybrid cloud environments, the reality is that end-to-end visibility is hard to come by. Just under half of the same respondents have sight of laterally moving data, otherwise known as East-West traffic, meaning that malicious traffic can flow inside most networks without detection. And because bad actors are increasingly using lateral movement as a means to penetrate defences, many organisations’ defences are weakest exactly where the risks are highest.

These gaps in visibility are clearly at odds with IT and security leaders’ perceptions of their security posture and represent a formidable obstacle to implementing robust long-term security plans, including Zero Trust. Introduced in 2010 and thrust firmly into the spotlight during the COVID-19 pandemic, the “never trust, always verify” ethos of Zero Trust is rapidly gaining traction in the boardroom: survey respondents cite a 29 percent increase in board-level awareness compared to last year. Yet as security teams look to implement a Zero Trust approach, they soon realise it is far from straightforward. Many organisations are still on the journey to a Zero Trust security posture, and poor visibility into hybrid cloud network traffic is holding them back.

Free-flowing threats

The disparity between perceived visibility and actual network oversight is at its widest when it comes to encrypted traffic. SSL/TLS secures data in transit and, alongside encryption of data at rest, protects the information lifecycle, ensuring that only those authorised to access sensitive information can do so. It is an invaluable tool to shield data from bad or even negligent actors. But businesses must also recognise that the same technology can be exploited for malicious intent, and monitor for this accordingly. Our research found that less than a third of businesses have visibility into encrypted traffic entering or exiting their networks.

The most common use of encryption by cybercriminals is, of course, as a stage in many ransomware attacks, whereby threat actors compromise a network and encrypt crucial files, forcing the business to pay for a decryption key. However, malicious actors also use encryption as a vehicle for their attacks: hiding malware and concealing suspicious traffic and communications, including the delivery of malicious code and the exfiltration of sensitive data.

As businesses reach higher levels of digital maturity and transition more workloads to the cloud, their reliance on encryption grows. The increasing volume of encrypted data travelling across networks only heightens the risks posed by unmonitored SSL/TLS traffic, allowing malicious actors to work in the dark. Add to this the high costs and processing power needed to decrypt, analyse and re-encrypt traffic as it flows through a network, and it’s no surprise that encryption has become such a popular hiding place for malicious actors.

With over two-thirds of businesses allowing encrypted data to flow uninspected, security professionals and boards are leaving their networks vulnerable to attacks that could cause significant financial and reputational damage.

Streamlining decryption

It’s clear that visibility is vital to maintaining a robust security posture. But the central role encryption now plays in the cybercriminals’ armoury has flown under the radar for too long. It’s time for cybersecurity leaders to recognise and embrace the need for deep observability, and to petition their boards for the resources to make the investments required.

Closing network visibility gaps through rigorous decryption and inspection of traffic can itself be a challenge, driving up costs and compute requirements; in public cloud environments those compute costs can be even higher. Fortunately, there are mechanisms available to network and security professionals that can optimise the compute resources required to gain the necessary level of visibility into encrypted traffic.

Application filtering allows teams to distinguish ‘trusted’ traffic from the crowd, and thus separate flows into high and low risk. This means traffic from high-volume, trusted applications such as YouTube or Netflix can be filtered out before decryption, reducing the overall compute workload. Similarly, deduplication ensures that only unique network packets are decrypted, again cutting compute costs. These capabilities allow network security leaders to prioritise key resources without compromising their security posture, as the short sketch below illustrates.
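What follows is a minimal, illustrative Python sketch of these two techniques: filtering flows from trusted applications before decryption, and deduplicating identical packets by fingerprint. It assumes flow records that already expose the TLS SNI (server name); every name in it, such as Flow, TRUSTED_SNI_SUFFIXES and reduce_workload, is hypothetical rather than any specific vendor’s API.

    # Illustrative sketch only; not a vendor API. Assumes flow records that
    # already carry the TLS SNI (server name) and the raw packet payload.
    import hashlib
    from dataclasses import dataclass

    # Hypothetical allowlist: high-volume, trusted applications whose
    # traffic can bypass decryption entirely.
    TRUSTED_SNI_SUFFIXES = (".youtube.com", ".netflix.com")

    @dataclass
    class Flow:
        sni: str        # server name from the TLS ClientHello
        payload: bytes  # raw packet payload

    def is_trusted(flow: Flow) -> bool:
        # Application filtering: match the flow's SNI against the allowlist.
        return flow.sni.endswith(TRUSTED_SNI_SUFFIXES)

    def reduce_workload(flows):
        # Yield only the flows that actually need decryption and inspection.
        seen = set()  # fingerprints of payloads already forwarded
        for flow in flows:
            if is_trusted(flow):
                continue  # filtered out before decryption
            fingerprint = hashlib.sha256(flow.payload).digest()
            if fingerprint in seen:
                continue  # deduplicated: identical packet already forwarded
            seen.add(fingerprint)
            yield flow    # candidate for decryption and inspection

    # Example: only the unknown, unique flow reaches the decryption stage.
    flows = [
        Flow("www.youtube.com", b"\x16\x03\x01..."),
        Flow("evil.example.net", b"\x16\x03\x01abc"),
        Flow("evil.example.net", b"\x16\x03\x01abc"),  # duplicate copy
    ]
    print([f.sni for f in reduce_workload(flows)])  # ['evil.example.net']

In a real deployment the allowlisting and fingerprinting would be handled in the visibility fabric itself, at line rate; the point is simply that both checks happen before any costly decryption work.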

The growing complexity and opacity of hybrid cloud networks can present network security teams with what feels like an endless challenge. But that’s exactly why security leaders need deep observability. When done right, it can drive successful Zero Trust approaches, and ensure that the right choices are made to protect business operations, data, and reputation. 

In the final analysis, you can’t defend against what you cannot see, and unprotected blind spots present organisation-wide risks with expensive and often enduring consequences. As business success becomes more closely tied to the maturity and robustness of an organisation’s security posture, it’s time to put encrypted threats firmly in the spotlight and stop giving those with malicious intent a safe place to hide.

About Mark Jow

Mark Jow helps organisations leverage deep observability to optimise hybrid cloud security and performance. As Gigamon’s first field evangelist for EMEA, he brings over 30 years of industry experience, having held senior technical leadership positions at Oracle, EMC, Veritas, Symantec and, more recently, Commvault.
