A glance at modern Information Security

Who hasn’t wondered if they have chosen the right career path? During some recent self-analysis, one of the areas I keep considering for myself is the Information Security path – especially the position of CISO; after all, I have a fair share of (non-commercial) security engineering experience and some sense of organization. A recent article [1] describing the discovery of a new property of prime numbers got me thinking about what implications this could have and how I would respond. This is, after all, a rather forgotten plane among all the planes and vectors CISOs currently have to deal with.

So, what IS challenges are currently popular, and how is this one different?

Let’s put the organizational part aside, including policies and procedures or dealing with social engineering. In technology, in no particular order, there is software security, where the most prominent type of vulnerability is related to memory management. I’m sure everyone has heard or read about exploiting buffer overflow scenarios. Most of the time these result from a lack of control over data sizes and the memory allocated for them. So, why not simply control it? Because the circumstances that create buffer overflow potential are not easy to notice when analyzing the source code. Something that is hard to spot and yet can lead to someone taking over control of a system puts this type of threat very high on the risk map. The good news? There are also many approaches to addressing vulnerabilities of this type, such as ASLR [2] or code audits.
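The missing length check behind most buffer overflows can be sketched as follows. This is an illustrative Python sketch (the `copy_into_buffer` helper and `BUFFER_SIZE` are my own names, not from any real API); in C the unchecked equivalent would be copying attacker-controlled data with `strcpy` into a fixed array:

```python
# Sketch of the length check whose absence causes most buffer overflows.
# In C, copying attacker-controlled data into a fixed buffer without this
# comparison writes past the buffer's end into adjacent memory.

BUFFER_SIZE = 16  # bytes allocated for the destination buffer


def copy_into_buffer(data: bytes) -> bytearray:
    """Copy data into a fixed-size buffer, rejecting oversized input."""
    if len(data) > BUFFER_SIZE:
        # In C, proceeding here would overwrite adjacent memory
        # (saved return address, local variables, heap metadata).
        raise ValueError("input larger than allocated buffer")
    buf = bytearray(BUFFER_SIZE)
    buf[:len(data)] = data
    return buf


print(bytes(copy_into_buffer(b"hello")[:5]))  # fits: copied safely
try:
    copy_into_buffer(b"A" * 64)               # would overflow in C
except ValueError as err:
    print("rejected:", err)
```

The check itself is trivial; the difficulty the paragraph describes is spotting every code path where it is missing.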

Another endangered area is again related to memory, but from the hardware perspective. Rowhammer is much less known, probably because a successful exploitation of this vulnerability “on a large scale” has never made it to the news – or even beyond the white-paper level [3]. Nevertheless, this threat is probably just as “red” on the risk map as the one above. Successful exploitation can lead to privilege escalation, and as for likelihood – the vulnerable surface is virtually every PC and server memory module currently in use, including the new DDR4 [4]. There are hardware responses to this coming in new CPU instruction sets [5]; however, they are flawed in the way security solutions typically are – there is a negative impact on speed/latency. The speed of computer memory has always been a bottleneck, so in many implementations such solutions will not be acceptable.

Speaking of hammers, there’s always the “hammer way” (as opposed to the clean-cut “scalpel way”) of exploiting a vulnerability. I am referring to the security area commonly known as Denial of Service. The point of DoS is: “if you cannot take over control, at least stop the service” – by flooding it with spurious service requests. The financial implications can be quite severe. There are typical DoS flaws that can be addressed with smart programming, but there are also attacks that distribute the source of the requests (DDoS), against which there is no efficient solution, aside perhaps from unreasonably increasing the infrastructure footprint and some smart collaboration with ISPs.
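One example of the “smart programming” that blunts simple flooding is per-client rate limiting. A minimal token-bucket sketch follows; the class name and parameters are my own illustration, not any particular product’s API (the timestamp is passed in explicitly so the behavior is deterministic):

```python
# Minimal token-bucket rate limiter: each client may send `rate`
# requests per second, with bursts of up to `capacity`; excess
# requests are dropped before reaching the expensive service logic.

class TokenBucket:
    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity    # start with a full bucket
        self.last = 0.0           # timestamp of the previous check

    def allow(self, now: float) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0    # spend one token on this request
            return True
        return False              # bucket empty: drop the request


bucket = TokenBucket(rate=1.0, capacity=3.0)
# A burst of five requests at t=0: only the first three pass.
print([bucket.allow(0.0) for _ in range(5)])  # [True, True, True, False, False]
# Two seconds later the bucket has refilled two tokens.
print(bucket.allow(2.0))                      # True
```

Against a distributed attack this helps far less, which is exactly the DDoS problem described above: each individual source can stay under the limit.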

The last plane I wanted to cover here is cryptography. This is probably the easiest area to describe – mathematicians define ciphers which can be used to encrypt communication. These are coded in functions which are then packed into security libraries, to be commonly used by programmers as required. Since proving that a cipher has weaknesses can take a long time after its initial announcement, every now and then the IT world is rocked by a cryptography-related vulnerability – an implementation flaw in a security library, as with Heartbleed, or weaknesses in the algorithms themselves, as with MD5 and earlier RC4. Nevertheless, the asymmetric ciphers in use are in their vast majority based on prime numbers – or more precisely, on the fact that factoring a large number into its prime factors cannot be done in a timely manner with currently available computational efficiency.
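That asymmetry can be shown with a toy example: multiplying two primes together is instant, while recovering them from the product by trial division already takes visibly more work – and real key sizes push it from slow to infeasible. A sketch (the primes below are small, chosen only for illustration):

```python
# Toy illustration of the asymmetry behind RSA-style cryptography:
# computing n = p * q is trivial; recovering p and q from n is not.

def trial_division(n: int) -> tuple[int, int]:
    """Recover the two factors of n = p * q by brute force."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d   # found the smaller factor
        d += 1
    raise ValueError("n has no nontrivial factors")


p, q = 104729, 1299709   # two known primes
n = p * q                # the "public" part: instant to compute
print(trial_division(n)) # (104729, 1299709) -- already far more work
```

With 2048-bit moduli the loop above would need on the order of 2^1024 iterations, which is why the scheme is considered safe – as long as nothing shrinks the search space.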

In reference to [1], the typical weaknesses of ciphers have, aside from quantum computing, gained a new enemy. If the last digit of the next prime is not uniformly distributed – if some digits follow a given prime with a likelihood markedly different from chance – then that should make “guessing” primes easier. For example, one could assign priorities to candidate numbers when running a brute-force check, thus speeding up the cracking process. This likelihood is then a clue, usable in a similar way to cracking a password hash with tools like John the Ripper – by building a set of rules such as “passwords with numbers are usually constructed with the letters preceding the numbers, e.g. the password ‘Forthequeen96’”.
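Turning the bias into a search heuristic could be sketched as follows. The transition table below is purely illustrative – it is not the measured data from [1] – and the point is only the mechanism: candidates are tried in order of likelihood rather than in plain numeric order, exactly like a John the Ripper rule set:

```python
# Sketch: prioritize brute-force candidates using a bias table for
# the last digit of the next prime. The weights here are MADE UP for
# illustration; they are not the measured values from [1].

BIAS = {
    1: {3: 0.35, 7: 0.30, 9: 0.20, 1: 0.15},
    3: {7: 0.35, 9: 0.30, 1: 0.20, 3: 0.15},
    7: {9: 0.35, 1: 0.30, 3: 0.20, 7: 0.15},
    9: {1: 0.35, 3: 0.30, 7: 0.20, 9: 0.15},
}


def prioritized_candidates(prev_prime: int, lo: int, hi: int) -> list[int]:
    """Order candidates in [lo, hi) by last-digit likelihood."""
    weights = BIAS[prev_prime % 10]
    # Only numbers ending in 1, 3, 7 or 9 can be primes > 5.
    candidates = [n for n in range(lo, hi) if n % 10 in weights]
    # Most likely last digits first; numeric order breaks ties.
    return sorted(candidates, key=lambda n: (-weights[n % 10], n))


# After a prime ending in 9, candidates ending in 1 are tried first.
print(prioritized_candidates(19, 20, 40))
# [21, 31, 23, 33, 27, 37, 29, 39]
```

Nothing here breaks a cipher by itself; it only reorders the search – but reordering a search toward likelier candidates is precisely how brute-force time gets shaved.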

I hope I am wrong about this; however, in theory it appears that with this discovery [1], all modern cryptography has received another impulse to find new, quantum-computing-proof ciphers. If I am right, this discovery simply makes all current cryptography weaker – meaning that if you needed a supercomputer to crack encrypted communication before, now you would need less.

1. https://www.quantamagazine.org/20160313-mathematicians-discover-prime-conspiracy/
2. https://en.wikipedia.org/wiki/Address_space_layout_randomization
3. http://users.ece.cmu.edu/~yoonguk/papers/kim-isca14.pdf
4. http://arstechnica.com/security/2016/03/once-thought-safe-ddr4-memory-shown-to-be-vulnerable-to-rowhammer/
5. http://blogs.cisco.com/security/mitigations-available-for-the-dram-row-hammer-vulnerability