
Will TLS 1.3 Ruin Security Production Tools?


With ever-escalating cyber threats, newer versions of encryption protocols have been developed to address known vulnerabilities and to support stronger, more secure ciphers and algorithms.

The Background


The Transport Layer Security (TLS) protocol was developed as the successor to the Secure Sockets Layer (SSL) protocol. SSL and TLS are closely related cryptographic protocols that run above the transport layer of the Open Systems Interconnection (OSI) model. Both provide authentication and data encryption between servers, devices, and applications operating over a network. SSL 1.0 was developed by Netscape but never released publicly; SSL 2.0, released in 1995, was the first version in general use. In 1996 it was replaced by SSL 3.0 after several vulnerabilities were discovered. As more vulnerabilities were found, the two SSL versions were formally deprecated in 2011 and 2015, respectively.

TLS 1.0 was released in 1999 through the Internet Engineering Task Force (IETF) as a standardized version of SSL. Although the two protocols are largely the same, Microsoft succeeded in having the name changed from SSL to TLS during standardization. Two subsequent releases followed: TLS 1.1, released in 2006, fixed the CBC initialization-vector weakness that the BEAST attack would later exploit, and TLS 1.2, introduced in 2008, added support for authenticated encryption, a significant improvement that thwarted many weaknesses.

Ivan Ristic has produced and maintains a comprehensive timeline of the events that have defined the SSL/TLS protocol from its origins to where it is today.

A Step Forward in Data Security


With any new protocol release, the question remains whether the updated encryption standard will break an organization's security infrastructure, and certain applications or devices may well turn out to be incompatible. TLS 1.3, approved by the IETF in March 2018 and published as RFC 8446 that August, was nevertheless designed to improve both security and performance by simplifying the handshake between client and server. Rather than trading multiple messages to negotiate which algorithms to use, the client sends its supported options and key share up front and the server answers with its own choice and key share, so a session is established in a single round trip. Fresh session keys are derived for every connection, so each session is protected by its own unique keys.
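
As an illustration, the sketch below uses Python's standard ssl module, with www.example.com standing in as a placeholder host, to show a client that refuses anything older than TLS 1.3 and reports which version was actually negotiated. It assumes the underlying OpenSSL build supports TLS 1.3.

    import socket
    import ssl

    host = "www.example.com"  # placeholder host, used only for illustration

    context = ssl.create_default_context()
    # Refuse to negotiate anything older than TLS 1.3.
    context.minimum_version = ssl.TLSVersion.TLSv1_3

    with socket.create_connection((host, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            print("Negotiated protocol:", tls.version())  # e.g. 'TLSv1.3'
            print("Negotiated cipher:", tls.cipher())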

TLS 1.3 also dropped support for older, vulnerable encryption algorithms, eliminating several attack vectors outright. In addition, Perfect Forward Secrecy (PFS) is now always in effect: because every session is keyed with ephemeral values, an attacker who captures encrypted traffic cannot decrypt it later, even if the server's private key is eventually compromised.
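
The difference shows up clearly in configuration. Under TLS 1.2 an administrator who wanted forward secrecy had to pin the cipher list to ephemeral (ECDHE) suites; under TLS 1.3 every handshake already uses an ephemeral key exchange, so there is nothing extra to switch on. A brief sketch, again with Python's ssl module (the cipher string is illustrative, not a recommendation):

    import ssl

    # TLS 1.2 and earlier: forward secrecy only if the cipher list is
    # restricted to ephemeral (ECDHE) key-exchange suites.
    tls12 = ssl.create_default_context()
    tls12.maximum_version = ssl.TLSVersion.TLSv1_2
    tls12.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")

    # TLS 1.3: static RSA key exchange no longer exists, so every session
    # is forward secret by design and no cipher pinning is needed.
    tls13 = ssl.create_default_context()
    tls13.minimum_version = ssl.TLSVersion.TLSv1_3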

Man-in-the-Middle Attacks Halted


TLS 1.3 is bittersweet for many organizations: sweet for some, such as highly regulated insurance and financial institutions, and bitter for others. The controversy is that TLS 1.3 allows no backdoor into the unencrypted traffic. Most deep-packet-inspection security tools, such as Data Loss Prevention (DLP) systems, operate as an authorized man-in-the-middle, giving security engineering teams visibility by decrypting the data, inspecting it, re-encrypting it, and sending it on to its destination.
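
In outline, such a tool terminates the client's TLS session using a certificate issued by the organization's own inspection CA, opens a second TLS session to the real destination, and relays plaintext between the two while inspecting it. The following Python sketch is a hypothetical, minimal illustration of that pattern; the certificate paths, listening port, and upstream host are assumptions, and inspect() is a stand-in for real DLP logic.

    import socket
    import ssl
    import threading

    LISTEN = ("0.0.0.0", 8443)                # where clients are redirected
    UPSTREAM = ("app.internal.example", 443)  # the real destination (assumed)

    # Certificate issued by the organization's inspection CA (assumed paths).
    inbound = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    inbound.load_cert_chain("proxy-cert.pem", "proxy-key.pem")

    outbound = ssl.create_default_context()   # verifies the real server

    def inspect(direction, data):
        # Stand-in for DLP/IDS logic: here we only log the plaintext size.
        print(f"{direction}: {len(data)} decrypted bytes")
        return data

    def pump(src, dst, direction):
        # Copy decrypted bytes one way, passing each chunk through inspect().
        try:
            while True:
                data = src.recv(4096)
                if not data:
                    break
                dst.sendall(inspect(direction, data))
        except OSError:
            pass
        finally:
            for s in (src, dst):
                try:
                    s.close()
                except OSError:
                    pass

    def handle(raw_client):
        # Decrypt the client's session, re-encrypt toward the real server.
        with inbound.wrap_socket(raw_client, server_side=True) as client, \
             socket.create_connection(UPSTREAM) as raw_server, \
             outbound.wrap_socket(raw_server, server_hostname=UPSTREAM[0]) as server:
            t = threading.Thread(target=pump, args=(server, client, "server->client"))
            t.start()
            pump(client, server, "client->server")
            t.join()

    listener = socket.create_server(LISTEN)
    while True:
        conn, _ = listener.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()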

Most packet-inspection tools deployed today rely on the server's long-term private key to decrypt traffic, rather than on the keys of a single one-time session. Because TLS 1.3 mandates ephemeral, per-session key exchange, tools built around the static keys of TLS 1.2 and earlier lose the ability to decrypt the traffic they are meant to inspect. The concern most organizations have is with regulatory compliance and with blocking malware, since they no longer have a way of inspecting traffic traversing their network.

The Road Ahead


New server applications and browsers will increasingly require support for TLS 1.3, pressuring the manufacturers of security tools to roll out support for the latest version. The dilemma is that most commercial websites still use TLS 1.0, 1.1, or 1.2.
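
One way to gauge where a given site stands is simply to attempt a handshake at each protocol version and see which ones succeed. The short Python script below does that for a placeholder host; note that a failed TLS 1.0 or 1.1 handshake can also reflect the local OpenSSL security policy rather than the server itself.

    import socket
    import ssl

    host = "www.example.com"  # placeholder target

    versions = {
        "TLS 1.0": ssl.TLSVersion.TLSv1,
        "TLS 1.1": ssl.TLSVersion.TLSv1_1,
        "TLS 1.2": ssl.TLSVersion.TLSv1_2,
        "TLS 1.3": ssl.TLSVersion.TLSv1_3,
    }

    for name, version in versions.items():
        context = ssl.create_default_context()
        context.minimum_version = version
        context.maximum_version = version  # pin the handshake to one version
        try:
            with socket.create_connection((host, 443), timeout=5) as sock:
                with context.wrap_socket(sock, server_hostname=host):
                    print(f"{name}: negotiated")
        except (ssl.SSLError, OSError):
            print(f"{name}: not negotiated")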

Another concern an organization must consider is a drawback of one of the performance enhancements TLS 1.3 provides, the zero round-trip-time resumption (0-RTT) option. A full handshake takes two round trips under previous TLS versions and only one under TLS 1.3; with 0-RTT, a returning client can send application data immediately, before the handshake completes, and that shortcut creates a security risk.

That risk is replay attacks: a hacker can capture a client's encrypted 0-RTT traffic and resend it to the server, tricking the server into extending trust to the repeated request and potentially exposing sensitive data. Organizations should therefore be careful about enabling 0-RTT in application services, and developers must be cognizant of the configuration needed to balance the security risk against the latency benefit.
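
RFC 8470 gives application developers a way to cope: a TLS-terminating front end can mark requests that arrived as 0-RTT early data (nginx, for example, exposes an $ssl_early_data variable that can be forwarded in an Early-Data header), and the application can answer anything non-idempotent with 425 Too Early so the client retries after the handshake completes. The sketch below is a hypothetical WSGI application illustrating that check; the header name comes from RFC 8470, but the proxy configuration that sets it is assumed rather than shown.

    from wsgiref.simple_server import make_server

    SAFE_METHODS = {"GET", "HEAD", "OPTIONS"}

    def app(environ, start_response):
        # "Early-Data: 1" means the front-end proxy received this request as
        # 0-RTT data, so it could be an attacker's replay of captured traffic.
        early = environ.get("HTTP_EARLY_DATA") == "1"
        method = environ["REQUEST_METHOD"]

        if early and method not in SAFE_METHODS:
            # RFC 8470: ask the client to retry once the handshake completes.
            start_response("425 Too Early", [("Content-Type", "text/plain")])
            return [b"Request arrived as 0-RTT early data; please retry.\n"]

        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Hello over TLS 1.3\n"]

    if __name__ == "__main__":
        make_server("127.0.0.1", 8000, app).serve_forever()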