
Understanding Cryptography: From Math to Physics


Does one have to be a mathematical genius to understand this common method used in information security? Perhaps the cryptographer does, as their job is to design encryption algorithms that are extremely difficult to break. From protecting customer credit card information to securing remote user connections to a network to protecting intellectual property from digital piracy, encryption is used every day.

My focus in this article is to take the daunting science of cryptography and convey it in basic terms, so that anyone may understand how it is used to encrypt data.
In this article I will discuss:

  1. History of Cryptology
  2. Fundamentals
  3. Common Algorithms Used

1. History of Cryptology

The words “cryptology” and “cryptography” are often used interchangeably in modern literature, and confusion about their actual meanings comes down to semantics. The terms have distinct meanings, best explained as:
  • Cryptology – the study of the art of secrecy and/or the science of cryptosystems.
  • Cryptography – the practice of applying methods and designing cryptosystems to make something secret or hidden.
  • Cryptanalysis – finding weaknesses that permit retrieval of the plaintext from the ciphertext without necessarily knowing the key or the algorithm.
Much of this article is devoted to cryptography as it is practiced today, but I want you to be aware of and appreciate the distinctions among these terms.

In and of itself, the study of cryptology as a science has been around for thousands of years. The first known evidence of the use of cryptography was found in an inscription carved around 1900 BC in the main chamber of the tomb of the nobleman Khnumhotep II, in Egypt. The scribe used some strange hieroglyphic symbols here and there in place of more ordinary ones. The purpose was not to hide the message but perhaps to change its form in a way that would make it appear dignified.

At the height of the Roman Empire (100 BC), Julius Caesar was known to use a form of encryption to convey secret messages to his army generals posted on the war front. This substitution cipher, known as the Caesar cipher, is perhaps the most mentioned historical cipher in literature. (A cipher is an algorithm used for encryption or decryption.) In a substitution cipher, each character of the plaintext (the plaintext is the message to be encrypted) is substituted by another character to form the ciphertext (the ciphertext is the encoded message). The variant used by Caesar was a shift of 3. Each character was shifted by 3 places, so the character ‘A’ was replaced by ‘D,’ ‘B’ was replaced by ‘E,’ and so on. The characters would wrap around at the end, so ‘X’ would be replaced by ‘A.’
During World War II, the U.S. Marines recruited and trained individuals from the Navajo tribe who were fluent in the Navajo language. It was an attractive choice for code use because few people outside the Navajo had learned to speak the language and no books in Navajo had been published. In itself, the Navajo spoken code was not very sophisticated by cryptographic standards, and it would likely have been broken if a native speaker and trained cryptographers had been able to work together productively. The Japanese had an opportunity to attempt this when they captured Joe Kieyoomia in the Philippines in 1942 during the Bataan Death March. Kieyoomia, a Navajo sergeant in the U.S. Army but not a code talker, was ordered to interpret radio messages later in the war. However, since Kieyoomia had not participated in the code training, the messages made no sense to him. When he reported that he could not understand them, his captors tortured him. The Japanese Imperial Army and Navy never cracked the spoken code.

Back in the early 1970s, IBM realized that their customers were demanding some form of encryption, so they formed a “crypto group” headed by Horst Feistel. They designed a cipher called Lucifer. In 1973, the National Bureau of Standards (now called the National Institute of Standards and Technology, or NIST) in the US put out a request for proposals for a block cipher that would become a national standard. They had apparently realized that they were buying a lot of commercial products without any good crypto support. Lucifer was eventually accepted and was called DES, or the Data Encryption Standard. In 1997, and in the following years, DES was broken by exhaustive search attacks. The main problem with DES was the small size of the encryption key. As computing power increased, it became feasible to brute-force all the different combinations of the key to obtain a possible plaintext message. Through the 1980s, there was only one real choice, and that was DES. That’s changed. Today, we have a broad selection of stronger, faster and better-designed algorithms. Now, the problem is to sort out the choices.

In 1997, NIST again put out a request for proposals for a new block cipher. It received 15 submissions. In 2000, it accepted Rijndael and christened it AES, or the Advanced Encryption Standard.

2. Fundamentals

Encryption is the process of changing data so that it is unrecognizable and useless to an unauthorized person, while decryption is turning it back into its original form. The most secure techniques use a mathematical algorithm and a variable value known as a ‘key.’ The selected key (often a random character string) is input on encryption and is integral to the changing of the data. The exact same key must be input to enable decryption of the data.

This is the basis of the protection: if the key (sometimes called a password) is known only by the authorized individual(s), the data cannot be exposed to other parties. Only those who know the key can decrypt it. This is known as ‘secret-key’ (symmetric) cryptography, which is the most well-known form.
Now the fundamental reasons why encryption is necessary are:

    • Confidentiality – When transmitting data, one does not want an eavesdropper to understand the contents of the broadcast messages. The same is true for stored data that should be protected against unauthorized access, for instance by hackers.
    • Authentication – This property is the equivalent of a signature. The receiver of a message wants proof that it comes from a particular party and not from somebody else (even if the original party later wants to deny it).
    • Integrity – This means that the receiver of the data has evidence that no changes have been made by a third party.
    • Non-repudiation – Prevent an originator from denying credit (or blame) for creating or sending a message.
Ciphers
Cryptography is the art and science of hiding (through encryption) sensitive data. It includes encryption (when the cipher is initially applied to raw “plain text”) and decryption (when the cipher is used to bring data back into readable form).

The best approach to illustrating ciphers is to show you simplified examples:

Polybius Cipher
The Polybius cipher is a type of substitution cipher. In my case, it involves using a 6×6 two-dimensional matrix holding all the uppercase letters of the alphabet and the numbers 0-9.

This will give us the following matrix:

        x=1  x=2  x=3  x=4  x=5  x=6
  y=1    A    B    C    D    E    F
  y=2    G    H    I    J    K    L
  y=3    M    N    O    P    Q    R
  y=4    S    T    U    V    W    X
  y=5    Y    Z    0    1    2    3
  y=6    4    5    6    7    8    9

With the 6×6 matrix (36 alphanumeric characters), we can start the substitution process. For example, the letter ‘A’ is at x=1, y=1, which can be written simply as 11. As another example, take the letter ‘N.’ It is at position x=2, y=3, which can be written as 23.

Let’s encrypt a simple message:
Message: ENCRYPT ME 2 DAY
Encrypted: 51-23-31-63-15-43-24 13-51 55 41-11-15
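The substitution above can be sketched in a few lines of Python; the 6×6 layout and the column-then-row digit order follow the examples in this section:

```python
# Toy Polybius square over A-Z and 0-9 (6x6), matching the example above.
# Each character becomes "<x><y>": its 1-based column, then its 1-based row.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

def polybius_encrypt(message: str) -> str:
    words = []
    for word in message.split():
        codes = []
        for ch in word:
            i = ALPHABET.index(ch)   # 0-based position in the square
            y, x = divmod(i, 6)      # row-major 6x6 layout
            codes.append(f"{x + 1}{y + 1}")
        words.append("-".join(codes))
    return " ".join(words)

print(polybius_encrypt("ENCRYPT ME 2 DAY"))
# → 51-23-31-63-15-43-24 13-51 55 41-11-15
```

Decryption simply reverses the lookup: split each pair back into x and y and read the character out of the square.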
The cipher can be made quite large and complex to include lowercase and special characters. Also, periodic and random scrambling of the character positions makes it far more resistant to brute-force attacks. This is analogous to the polymorphism used in advanced computerized encryption methods today.

Caesar Cipher



One of the first ciphers ever created was the Caesar cipher, named after Julius Caesar, its creator. He used it to encrypt messages to his generals so that the Roman Empire’s adversaries would not be able to read them. The Caesar cipher is an elementary form of encryption and extremely easy to break; therefore, it is not used today for any serious purpose.

Basically, a Caesar cipher is a re-arrangement of the alphabet. Different shift values can be used to create different message encodings. The number of shifts is just what it sounds like: the number of letters each character is shifted across (right or left) to create the encrypted message. Here is a practical example using a shift of 3.

English: ENCRYPT ME
Encrypted: HQFUBSW PH
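A minimal Python sketch of this shift, wrapping around at ‘Z’ as in the A-to-D example earlier (a negative shift decrypts):

```python
import string

# Minimal Caesar cipher: shift each uppercase letter by `shift` places,
# wrapping around at 'Z'. A negative shift undoes the encryption.
def caesar_encrypt(message: str, shift: int = 3) -> str:
    out = []
    for ch in message:
        if ch in string.ascii_uppercase:
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return "".join(out)

print(caesar_encrypt("ENCRYPT ME"))      # → HQFUBSW PH
print(caesar_encrypt("HQFUBSW PH", -3))  # → ENCRYPT ME
```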

The message above can be easily decrypted by brute force, trying every shift until one finally makes sense. More complicated ciphers have been built on this principle, such as the Vigenère and Gronsfeld ciphers, but they employ multiple substitution alphabets: each letter of a key acts as a value that changes the shift pattern, which makes deciphering far more confusing.
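The keyed-shift idea can be sketched as a minimal Vigenère implementation; uppercase letters only, and the key word “KEY” below is just an arbitrary illustration:

```python
# Minimal Vigenere sketch: each key letter sets the shift for the matching
# message letter (A=0, B=1, ...), and the key repeats as needed, so identical
# plaintext letters can encrypt to different ciphertext letters.
def vigenere_encrypt(message: str, key: str) -> str:
    out = []
    for i, ch in enumerate(message):
        shift = ord(key[i % len(key)]) - ord("A")
        out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
    return "".join(out)

print(vigenere_encrypt("ENCRYPTME", "KEY"))  # → ORABCNDQC
```

Note how the two ‘E’s in the plaintext encrypt to different letters, which is exactly what defeats the single-shift brute force that breaks the Caesar cipher.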

Vigenère Cipher Table



It’s important to understand how ciphers work before continuing in cryptography, as they are the basis of all encryption. Steganography, the practice of writing hidden messages, is actually closer to classic secret writing, as cryptography has now become synonymous with “computer security.”

Polymorphism

Polymorphism is a relatively advanced part of cryptography and is most common in computerized encryption techniques. A polymorphic cipher changes itself after each use, so each time it is used it produces a different result. This means that, if we were to encrypt the same data twice, each encryption would produce a different encrypted result.

Think of a key, perhaps to a car. Nowadays, we all have small electronic remote devices that unlock our vehicles with just the push of a button. Here’s something you might not think about every time you open your car: the amount of data sent to your vehicle. This data is specific to your car, and if it matches, your vehicle unlocks. The easy way to accomplish that would be to put each remote device on a different frequency. However, that is difficult to regulate. So instead, all are on the same frequency, and they use changing algorithms (a rolling code) for the data sent to the car. These algorithms are polymorphic.

This means the algorithms are harder to reverse engineer because they change each time. Even if a hacker discovered the algorithm (which is harder in the first place with a polymorphic algorithm), the hacker would still have to match it to the same state the car and key are in, which is a complicated task.
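A hedged sketch of the rolling-code idea: the hash-plus-counter scheme below is an illustrative stand-in, not any manufacturer’s actual algorithm. Fob and car share a secret and a press counter, so the transmitted code changes on every press even though the secret never does:

```python
import hashlib

# Illustrative rolling code: each button press hashes the shared secret
# together with a press counter, producing a fresh short code every time.
# The car keeps its own counter and accepts only codes it can reproduce.
def next_code(secret: bytes, counter: int) -> str:
    digest = hashlib.sha256(secret + counter.to_bytes(8, "big"))
    return digest.hexdigest()[:8]

secret = b"shared-fob-secret"  # hypothetical shared secret
codes = [next_code(secret, press) for press in range(3)]
print(codes)  # three different codes from the same secret
```

An eavesdropper who records one code cannot replay it, because the counter has already moved on; this is the polymorphic property the paragraph above describes.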

3. Common Algorithms Used

Today’s ciphers use either secret-key or public-key techniques. Secret-key ciphers can be used to protect critical/sensitive data. Because secret-key ciphers use a single key that two people must share, this is also known as symmetric cryptography.
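As a toy illustration of the shared-key idea (a one-time-pad-style XOR, not a production cipher), the same key both encrypts and decrypts:

```python
import secrets

# Toy symmetric encryption: XOR the message with a random key of equal
# length (a one-time pad). Whoever holds `key` can decrypt; nobody else can.
def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ENCRYPT ME"
key = secrets.token_bytes(len(message))  # the shared secret key
ciphertext = xor_bytes(message, key)     # sender encrypts
recovered = xor_bytes(ciphertext, key)   # receiver decrypts with same key
assert recovered == message
```

The weakness is also visible here: both parties must somehow agree on `key` in advance without anyone eavesdropping, which is exactly the problem public-key cryptography addresses.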




In 1949, Claude Shannon of Bell Laboratories published the fundamental theory behind secret-key ciphers, and the decades of evolution since then have yielded high-caliber examples. However, it was not until 1975 that a powerful secret-key algorithm, DES, was made available for general use.
Public-key, or asymmetric, cryptography also emerged in the mid-1970s. Public-key ciphers use a pair of keys: a public key that is shared with other people and a corresponding private key that is kept secret by its single owner. For example, a recipient can create a key pair and share the public key with anyone who might want to send a secret message. The sender encrypts a message using the public key, and the recipient decrypts it using the private key.
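The public/private split can be illustrated with a textbook RSA example using deliberately tiny numbers (completely insecure at this size; real keys are thousands of bits long):

```python
# Textbook RSA with tiny primes: (e, n) is the public key anyone may use to
# encrypt; only the holder of the private exponent d can decrypt.
p, q = 61, 53
n = p * q                  # modulus, part of both keys (3233)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi

m = 65                     # the "message", an integer smaller than n
c = pow(m, e, n)           # sender encrypts with the public key
assert pow(c, d, n) == m   # recipient decrypts with the private key
print(d, c)
```

The security rests on the fact that recovering d from (e, n) requires factoring n, which is easy for 3233 but infeasible for the modulus sizes used in practice.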




The strength of the encryption cipher depends on three primary factors:

The infrastructure – If the cryptography is implemented primarily in software, then the foundation will be the weakest link. If you are trying to keep messages secret, the hacker’s best bet is to hack into your computer and steal the messages before they’re encrypted. It’s always going to be easier to break into a system or infect it with a virus than to crack a sizeable secret key. In many cases, the easiest way to uncover a secret key might be to eavesdrop on the user and intercept the key when it’s passed to the encryption program.

Key size – In cryptography, key size matters. If an attacker can’t install a keystroke monitor, then the best way to crack the ciphertext is to try to guess the key through a “brute-force” trial-and-error search. A practical cipher must use a key size that makes brute-force searching impractical. However, since computers get faster every year, the size of a “borderline safe” key keeps growing.
Experts acknowledge that keys of 64 bits or less, including DES keys, are vulnerable to a determined attacker. In 1999, the Electronic Frontier Foundation (EFF) funded the development of a device called Deep Crack, which could crack a DES encryption key in three days or less. Contemporary cipher keys generally contain more than 100 bits, and some ciphers support 256-bit keys.
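To see why key size is the whole game, here is a brute-force search against the toy Caesar cipher from earlier, whose “key space” is only 26 shifts; the substring check stands in for recognizing sensible plaintext:

```python
# Brute-force key search over all 26 Caesar shifts -- instant for a toy
# cipher, and exactly what a large key size is meant to make impractical.
def caesar_decrypt(ciphertext: str, shift: int) -> str:
    return "".join(
        chr((ord(c) - ord("A") - shift) % 26 + ord("A")) if c != " " else c
        for c in ciphertext
    )

found = None
for shift in range(26):
    candidate = caesar_decrypt("HQFUBSW PH", shift)
    if "ENCRYPT" in candidate:  # stand-in for "the plaintext makes sense"
        found = (shift, candidate)

print(found)  # → (3, 'ENCRYPT ME')
```

Each extra key bit doubles the work for such a search, which is why the gap between a 26-key cipher and a 2^128-key cipher is not a matter of degree but of feasibility.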

Algorithm quality – Algorithm quality is difficult to judge. It’s relatively easy to construct a plausible-looking cipher based on an existing algorithm, and it can be hard to detect subtle flaws unless experienced people take a close look. Cipher flaws can yield “shortcuts” that allow attackers to skip large blocks of keys during their trial-and-error search. For example, the popular compression utility PKZIP traditionally incorporated a custom-built encryption feature that used a 64-bit key. In theory, it should take 2^64 trials to check all possible keys. In fact, there is a shortcut attack against PKZIP encryption that requires only 2^27 tests to crack the ciphertext. The only way to find such flaws is to actually try to break the algorithm, usually by using tricks that have worked against other ciphers. An algorithm usually only shows its quality after being subjected to such analyses and attacks. Even so, the failure to find a flaw today doesn’t guarantee that someone won’t find one eventually.

Types of Algorithms

DES – DES has stood the test of time because the cipher’s quality was proven over many years of published research. After a quarter century of study, researchers managed to find only a few speculative attacks that ultimately weren’t as practical as brute force. The DES cipher’s only real weakness has been its small 56-bit key size.





Triple DES – gets around this by applying the cipher three times in a row, using either a 112-bit key or a 168-bit key. The resulting cipher is much slower than other ciphers of similar strength and has since been deprecated in favor of stronger, faster algorithms.

AES – The Advanced Encryption Standard (AES) supports three key sizes of 128, 192 and 256 bits and uses a 128-bit block size. It is currently considered the standard and is used worldwide.

Rijndael Cipher Table



While DES was explicitly designed to be built in hardware, no thought was given to making it work efficiently in software. The National Institute of Standards and Technology (NIST) evaluated software execution efficiency and storage requirements to ensure that AES worked well in C and Java running on workstations, as well as the more restricted environments of embedded ARM processors and smart cards.

Although Rijndael, developed by Dutch researchers Vincent Rijmen and Joan Daemen, won the NIST competition, all of the AES finalist ciphers provide vast improvements over DES and the DES substitutes. All of them are block ciphers that support 128-bit or larger key sizes. None of the finalists had any severe weaknesses; the final choice involved a balance of cryptographic strength with performance.

AES is based on a design principle known as a substitution-permutation network, a combination of both substitution and permutation, and is fast in both software and hardware. Unlike its predecessor DES, AES does not use a Feistel network. AES is a variant of Rijndael with a fixed block size of 128 bits and a key size of 128, 192, or 256 bits. By contrast, the Rijndael specification allows block and key sizes that may be any multiple of 32 bits, both with a minimum of 128 and a maximum of 256 bits.

AES operates on a 4×4 column-major order matrix of bytes, termed the state, although some versions of Rijndael have a larger block size and have additional columns. Most AES calculations are done in a particular finite field.

The key size used for an AES cipher specifies the number of repetitions of transformation rounds that convert the input, called the plaintext, into the final output, called the ciphertext. The number of cycles of repetition is as follows:
  • 10 cycles of repetition for 128-bit keys
  • 12 cycles of repetition for 192-bit keys
  • 14 cycles of repetition for 256-bit keys
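The round counts above follow a simple pattern in AES: six rounds plus one per 32-bit word of key material. A quick check:

```python
# AES round count as a function of key size: 6 + (key bits / 32).
# 128 -> 10 rounds, 192 -> 12 rounds, 256 -> 14 rounds.
def aes_rounds(key_bits: int) -> int:
    return 6 + key_bits // 32

for bits, expected in [(128, 10), (192, 12), (256, 14)]:
    assert aes_rounds(bits) == expected

print(aes_rounds(256))  # → 14
```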



Each round consists of four similar but different processing stages, including one that depends on the encryption key itself. A set of reverse rounds is applied to transform the ciphertext back into the original plaintext using the same encryption key.

Quantum Cryptography



Quantum Key Distribution (the BB84 protocol) is a secure communication method that implements a cryptographic protocol involving components of quantum mechanics to guarantee secure communication. It enables two parties to produce a shared random secret key (a symmetric key) known only to them, which can then be used to encrypt and decrypt messages. Quantum mechanics is the body of scientific laws that describe the behavior of photons, electrons and the other particles that make up the universe.

As industries search for far greater security from hackers, a new generation of cryptography has been turning from math to physics. Scientists in atomic and particle physics have entered the world of cryptography. These scientists want to exploit the laws of quantum mechanics to send messages that are impossible to hack. They are the architects of a new field called quantum cryptography, which has come of age only in the past few decades.

Quantum cryptography draws its strength from particle physics. The particles making up our universe are inherently uncertain phenomena, able to exist simultaneously in more than one place or more than one state of being. They settle on how to behave only when they bump into an object or when their properties are measured.

Cryptography is a fascinating area of information security and one of the most complex disciplines to grasp. Once we progress from a simple understanding of the Caesar and Polybius ciphers to the DES and AES ciphers with their repeated rounds of encryption, the algorithm concepts become much less daunting.

Cryptology is a science in its own right, and we have explored its history and the fundamental concepts of ciphers, from the simplest to the most complex types in use today.