The Doghouse: Crypteto

Crypteto has a 49,152-bit symmetric key:

The most important issue of any encryption product is the ‘bit key strength’. To date the strongest known algorithm has a 448-bit key. Crypteto now offers a
49,152-bit key. This means that for every extra 1 bit increase that Crypteto has over its competition makes it 100% stronger. The security and privacy this offers
is staggering.

Yes, every key bit doubles an algorithm’s strength against brute-force attacks. But it’s hard to find any real meaning in a work factor of 2^49152.
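To get a feel for how little a work factor of 2^49152 means, consider its sheer size. A quick sketch in Python (the numbers below are simple arithmetic, not anything from the vendor’s claims):

```python
import math

# Python's arbitrary-precision integers can represent 2**49152 exactly,
# even though no physical process could ever enumerate that many keys.
work_factor = 2 ** 49152

# Number of decimal digits in 2**49152: floor(n * log10(2)) + 1
digits = math.floor(49152 * math.log10(2)) + 1

print(digits)                    # 14797 -- a number nearly 15,000 digits long
print(len(str(work_factor)))     # 14797 -- same count, by direct expansion
```

For comparison, 2^256, already far beyond any conceivable brute-force search, has only 78 decimal digits.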

Coupled with this truly remarkable breakthrough Crypteto does not compromise on encryption speed. In the past, incremental key strength improvements have effected the speed that data is encrypted. The usual situation was that for every 1 bit increase in key strength there was a consequent reduction in encryption
speed by 50%.

That’s not even remotely true. It’s not at all obvious how key length is related to encryption speed. Blowfish has the same speed, regardless of key length. AES-192 is about 20% slower than AES-128, and AES-256 is about 40% slower. Threefish, the block cipher inside Skein, encrypts data at 7.6 clock cycles/byte with a 256-bit key, 6.1 clock cycles/byte with a 512-bit key, and 6.5 clock cycles/byte with a 1024-bit key. I’m not claiming that Threefish is secure and ready for commercial use—at any keylength—but there simply isn’t a chance that encryption speed will drop by half for every key bit added.
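Cycles-per-byte figures like these convert directly into throughput once you fix a clock speed. A small sketch (the 3 GHz clock is my assumption for illustration, not a figure from the post):

```python
def mb_per_second(clock_hz: float, cycles_per_byte: float) -> float:
    """Convert a cycles-per-byte benchmark into MB/s at a given clock speed."""
    return clock_hz / cycles_per_byte / 1e6

CLOCK_HZ = 3.0e9  # assumed 3 GHz core; substitute your own CPU's clock

# Threefish figures quoted above, keyed by key size in bits
for key_bits, cpb in [(256, 7.6), (512, 6.1), (1024, 6.5)]:
    print(key_bits, round(mb_per_second(CLOCK_HZ, cpb)), "MB/s")
```

Note that the 512-bit key is the *fastest* of the three, which by itself demolishes the claim that each key bit halves encryption speed.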

This is a fundamental asymmetry of cryptography, and it’s important to get right. The cost to encrypt is linear as a function of key length, while cost to break is geometric. It’s one of the reasons why, of all the links in a security chain, cryptography is the strongest.
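The asymmetry is easy to see numerically. Here the defender’s cost is modeled as proportional to key length and the attacker’s as one trial per possible key (an illustrative model, not a measurement of any real cipher):

```python
def encrypt_cost(key_bits: int) -> int:
    """Work to encrypt: grows linearly with key length (illustrative model)."""
    return key_bits

def break_cost(key_bits: int) -> int:
    """Work to brute-force: one trial per possible key."""
    return 2 ** key_bits

# Doubling the key length doubles the defender's work...
assert encrypt_cost(256) == 2 * encrypt_cost(128)

# ...but squares the attacker's work.
assert break_cost(256) == break_cost(128) ** 2
```

Linear cost for the defender, geometric cost for the attacker: that is the whole asymmetry in two assertions.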

Normally I wouldn’t bother with this kind of thing, but they explicitly asked me to comment:

But Hawthorne Davies has overcome this issue. By offering an algorithm with an unequalled key strength of 49,152 bits, we are able to encrypt and decrypt data at speeds in excess of 8 megabytes per second. This means that the aforementioned Gigabyte of data would take 2 minutes 13 seconds. If Bruce Schneier, the United State’s foremost cryptologist, were to increase his Blowfish 448 bit encryption algorithm to Blowfish 49152, he would be hard pressed to encrypt one Gigabyte in 4 hours.

[…]

We look forward to receiving advice and encouragement from the good Dr. Schneier.

I’m not a doctor of anything, but sure. Read my 1999 essay on snake-oil cryptography:

Warning Sign #5: Ridiculous key lengths.

Jaws Technology boasts: “Thanks to the JAWS L5 algorithm’s statistically unbreakable 4096 bit key, the safety of your most valued data files is ensured.” Meganet takes the ridiculous a step further: “1 million bit symmetric keys—The market offer’s [sic] 40-160 bit only!!”

Longer key lengths are better, but only up to a point. AES will have 128-bit, 192-bit, and 256-bit key lengths. This is far longer than needed for the foreseeable future. In fact, we cannot even imagine a world where 256-bit brute force searches are possible. It requires some fundamental breakthroughs in physics and our understanding of the universe. For public-key cryptography, 2048-bit keys have the same sort of property; longer is meaningless.

Think of this as a sub-example of Warning Sign #4: if the company doesn’t understand keys, do you really want them to design your security product?

Or read what I wrote about symmetric key lengths in 1996, in Applied Cryptography (pp. 157–8):

One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)

Given that k = 1.38×10^-16 erg/°Kelvin, and that the ambient temperature of the universe is 3.2°Kelvin, an ideal computer running at 3.2°K would consume 4.4×10^-16 ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.

Now, the annual energy output of our sun is about 1.21×10^41 ergs. This is enough to power about 2.7×10^56 single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2^192. Of course, it wouldn’t have the energy left over to perform any useful calculations with this counter.

But that’s just one star, and a measly one at that. A typical supernova releases something like 10^51 ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.

These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.
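The arithmetic in the excerpt holds up; here is a quick reproduction using the excerpt’s own constants:

```python
import math

K_T = 4.4e-16        # ergs to flip one bit at 3.2 K (from the excerpt)
SUN_YEAR = 1.21e41   # annual energy output of the sun, in ergs
SUPERNOVA = 1e51     # energy of a typical supernova, in ergs

sun_flips = SUN_YEAR / K_T
print(math.log2(sun_flips))        # just over 187: the 187-bit counter

dyson = sun_flips * 32             # Dyson sphere, 32 years, no losses
print(math.log2(dyson))            # just over 192: counting to 2^192

print(math.log2(SUPERNOVA / K_T))  # ~220: enough to cycle a 219-bit counter
```

A 256-bit keyspace is roughly 2^64 times larger still than even the supernova-powered counter, which is the point of the excerpt.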

Ten years later, there is still no reason to use anything more than a 256-bit symmetric key. I gave the same advice in 2003 in Practical Cryptography (pp. 65–6). Even a mythical quantum computer won’t be able to brute-force that large a keyspace. (Public keys are different, of course—see Table 2.2 of this NIST document for recommendations.)

Of course, in the real world there are smarter attacks than brute-force keysearch. The whole point of cipher cryptanalysis is to find shortcuts to brute-force search (like this attack on AES). But a 49,152-bit key is just plain stupid.

EDITED TO ADD (9/30): Now this is funny:

Some months ago I sent individual emails to each of seventeen experts in cryptology, all with the title of Doctor or Professor. My email was a first announcement to the academic world of the TOUAREG Encryption Algorithm, which, somewhat unusually, has a session key strength of over 49,000 bits and yet runs at 3 Megabytes per second. Bearing in mind that the strongest version of BLOWFISH has a session key of 448 bits and that every additional bit doubles the task of key-crashing, I imagined that my announcement would create more than a mild flutter of interest.

Much to his surprise, no one responded.

Here’s some more advice: my 1998 essay, “Memo to the Amateur Cipher Designer.” Anyone can design a cipher that he himself cannot break. It’s not even hard. So when you tell a cryptographer that you’ve designed a cipher that you can’t break, his first question will be “who the hell are you?” In other words, why should the fact that you can’t break a cipher be considered evidence of the cipher’s security?

If you want to design algorithms, start by breaking the ones out there. Practice by breaking algorithms that have already been broken (without peeking at the answers). Break something no one else has broken. Break another. Get your breaks published. When you have established yourself as someone who can break algorithms, then you can start designing new algorithms. Before then, no one will take you seriously.

EDITED TO ADD (9/30): I just did the math. An encryption speed of 8 megabytes per second on a 3.33 GHz CPU translates to about 400 clock cycles per byte. This is much, much slower than any of the AES finalists ten years ago, or any of the SHA-3 second round candidates today. It’s kind of embarrassingly slow, really.
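The back-of-the-envelope conversion is straightforward, using the clock and throughput figures quoted above:

```python
clock_hz = 3.33e9      # 3.33 GHz CPU
throughput = 8e6       # claimed 8 megabytes per second

cycles_per_byte = clock_hz / throughput
print(cycles_per_byte)   # 416.25, i.e. "about 400 clock cycles per byte"
```

For context, the AES finalists ran in the tens of cycles per byte on hardware of that era, so roughly 400 cycles per byte is slower by an order of magnitude or more.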

Posted on September 30, 2009 at 5:52 AM · 115 Comments

Comments

aikimark September 30, 2009 6:35 AM

Bruce,

I think you meant to write “6.5 clock cycles per byte” instead of “1024 clock cycles per byte”

Tomasz Wegrzanowski September 30, 2009 7:16 AM

“we cannot even imagine a world where 256-bit brute force searches are possible.”

I can, with highly parallel quantum computers. It’s not any harder than imagining 128-bit classical computer brute force searches.

Peter September 30, 2009 7:17 AM

I think it’s pretty telling that the only search results for this Dr. Hawthorne, with respect to encryption, are from that site and its blog. They state he’s “[…]one of the UK’s foremost and experienced encryption experts[…]”, yet I can find no reference to his work.

Aaron Toponce September 30, 2009 7:27 AM

I mentioned this as a comment on his blog post, but it’s awaiting moderation, and it’s not too terribly friendly, so I’ll repost here:

Creating a key that is 49,152 bits is just stupid. The problem these “researchers” seem not to be addressing is that the strength of symmetric keys in relation to time is not linear. Just because a key has doubled its key size does not mean that it’s doubled its strength, or doubled the time it takes to encrypt/decrypt data.

All doubling a key size does is double the search space for the correct key. This doesn’t necessarily mean it’s doubling its strength. This is something that amateur cryptographers get wrong all the time.

Suppose you’re looking for a needle in a haystack with n-amount of hay. Doubling the amount of hay in the stack is directly analogous to doubling the key size- your search space for the needle has doubled. But has the strength increased? Regardless of the amount of hay in the stack, couldn’t I just burn down the stack, to find my needle?

So, even though the key size might contain 2^49152 keys in the space, that doesn’t mean the strength for finding the right key is on the same magnitude. I’ll bet, with proper cryptanalysis, that this key is no stronger than standard 2^128 symmetric keys.

So, the claim to “the world’s strongest encryption algorithm” is probably wrong. It should be reworded to “the world’s largest symmetric key space”.

Lastly, when common 256-bit keys like AES and Blowfish begin to show serious cryptanalysis attacks, we should be concerned about strengthening the algorithms, or coming up with stronger new algorithms, not necessarily increasing the key space. Remember burning down the stack of hay.

Stephane September 30, 2009 7:33 AM

“I can, with highly parallel quantum computers. It’s not any harder than imagining 128-bit classical computer brute force searches.”

You can’t put quantum computers in parallel like this. Or rather, it would mean you don’t have a unique 256-bit key but a number of small keys that are applied to the data without relation to each other (i.e. the first block is encrypted with key 1, the second with key 2, etc.)

egan September 30, 2009 7:40 AM

“Ten years later, there is still no reason to use anything more than a 256-bit symmetric key.”

So why 448 bits for Blowfish?

Bruce Schneier September 30, 2009 7:46 AM

“‘we cannot even imagine a world where 256-bit brute force searches are possible.’ I can, with highly parallel quantum computers. It’s not any harder than imagining 128-bit classical computer brute force searches.”

Right. And 128-bit classical computer brute force searches are impossible.

Clive Robinson September 30, 2009 7:49 AM

@ Bruce,

“This is enough to power about 2.7×10^56 single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values.”

For most binary counters each change of value at the output involves more than one bit changing state, not just at the output but internally.

Also, did you think about what the clock speed would need to be, on the approximate assumption of 2^25 seconds a year….

Bruce Schneier September 30, 2009 7:51 AM

“So why 448 bits for Blowfish?”

General sloppiness, more than anything else. Blowfish is really defined for a variable-length key: 32 to 448 bits. Blowfish uses a key expansion step, to expand the key to 448 bits. The key expansion step can take a shorter key, or the entire 448 bits.

It’s a sloppy key schedule. Key schedules are very hard, and I didn’t understand them very well in 1993 when I designed Blowfish. I’m much prouder of the key schedule in Twofish and Threefish.

Bruce Schneier September 30, 2009 7:54 AM

“Also, did you think about what the clock speed would need to be, on the approximate assumption of 2^25 seconds a year….”

I did not.

I’d be happy for someone to update the physics of this analysis.

DaveShaw September 30, 2009 8:03 AM

I know little about designing a modern crypto algorithm, but enough to know that claiming “the world’s strongest” and “unbreakable” is not something anyone with half a mind would do. It just means it won’t last long when the source comes out.

Drew September 30, 2009 8:22 AM

The key lengths necessary to ensure they are secure are only as valid as the algorithm though, right? Isn’t it of benefit to maybe double the key length (symmetric of 512, public of 4096 or so) to protect against future discoveries of weaknesses in the underlying crypto scheme?

Bruce Schneier September 30, 2009 8:25 AM

“The key lengths necessary to ensure they are secure are only as valid as the algorithm though, right? Isn’t it of benefit to maybe double the key length (symmetric of 512, public of 4096 or so) to protect against future discoveries of weaknesses in the underlying crypto scheme?”

A reasonable approach — it’s one of the reasons I generally recommend 256-bit keys.

Steve September 30, 2009 8:41 AM

Strange that pretty much every guess on keylength.com disagrees with your take on asymmetric key length. 256 bits symmetric and 15424 bits asymmetric is what ECRYPT II recommends for the foreseeable future. NIST recommends 256 bits symmetric, 15360 bits asymmetric for periods well beyond 2030. They all rate 2048-bit asymmetric as less strong than 128-bit symmetric. Is this because of new research since 1999, or differences in estimations of what is secure?

I know the explosive growth in asymmetric key length needed for security is one of the reasons DNSSEC is working on standardising elliptic curve crypto in the near future.

kme September 30, 2009 8:44 AM

Clive: Your counter could use a gray code. This seems like a reasonable assumption given that we are talking a best-case scenario here.

Simon September 30, 2009 8:47 AM

‘jawstech’ isn’t a real company. It’s a BS website that’s used to promote domains. You can dredge up crap like this all day. Are you trying to create a strawman somewhere here?

scott September 30, 2009 9:02 AM

@Simon

Jawstech is a 1999 link. Just goes to show what happens when a snake-oil cryptographer gets thrown in the doghouse 😉

HJohn September 30, 2009 9:05 AM

To state the obvious, it doesn’t matter how strong cryptography is if you can just get someone to tell you their password/phrase (intentionally or inadvertently, such as through keystroke loggers).

The “Law of Unintended Consequences” definitely comes into play here. The resources dumped into unnecessarily enhancing cryptography are resources that are not used to strengthen the more exploitable weaknesses, or worse, resources diverted from those weaknesses, making things worse overall.

Akin to a college student who is determined to get his Cryptography 101 grade up to an A+ from an A, with the end result that his other classes, like Telecom 101 or Database Security 101, drop from B’s to C’s instead of moving up to A’s as well.

Apologies to those who know this. I suspect some comments here are read by managers less adept at technology who nonetheless may think it worthwhile to boast strong cryptography rather than a firewall or IDS upgrade. Indeed, reports to senior management that boast XXX-bit cryptography look more effective than reports that discuss an upgrade to version 8 of something.

chas September 30, 2009 9:23 AM

They also lose points for naming “the World’s strongest encryption algorithm” after that unpronounceable VW SUV.

bob September 30, 2009 9:27 AM

Is there a correlation between the number of yellow posts (Bruce) and the number of time Bruce’s name is mentioned in the quoted article…?

I would say that a 49,152 bit key will be very vulnerable to the “its too complicated to set so just use the default key of ‘12345….'” attack.

Bruce Schneier September 30, 2009 9:28 AM

“‘jawstech’ isn’t a real company. It’s a BS website that’s used to promote domains. You can dredge up crap like this all day. Are you trying to create a strawman somewhere here?”

It was a real link in 1999, when I wrote the essay. I considered putting a note in to that effect, but I figured it was obvious. I guess it wasn’t.

boris September 30, 2009 9:28 AM

If you’re going to give a physics lesson, you should try to get the units right. It is improper to refer to “°Kelvin”. The unit is Kelvin (plural Kelvins), abbreviated “K”. So, you could say “The ambient temperature of the universe is 3.2 K.”, or “The ambient temperature of the universe is 3.2 Kelvins.”

You can make your point without being so pedantic. The pedantry just makes you seem silly (which is not truly the case); people will take you more seriously without it.

Paeniteo September 30, 2009 9:29 AM

@DaveShaw: “Isn’t it of benefit to maybe double the key length (symmetric of 512, public of 4096 or so) to protect against future discoveries of weaknesses in the underlying crypto scheme?”

AFAIK it generally isn’t trivial to simply double the key length for a given algorithm – nobody knows what an AES-512 should look like.
A general strengthening approach “just to be sure” would be to use good old triple-encryption (famous from Triple-DES).

Bruce Schneier September 30, 2009 9:30 AM

“Is there a correlation between the number of yellow posts (Bruce) and the number of time Bruce’s name is mentioned in the quoted article…?”

Funny.

I think it’s primarily based on how busy I am, whether or not I’m on an airplane, and so on. Although this is a doghouse entry, so I’m monitoring the comments a bit more closely than I usually do.

john greco September 30, 2009 9:49 AM

@boris

While you are correct about the units, complaining about someone being pedantic while correcting what is a rather forgivable and common mistake comes off as a bit odd.

Aaron Toponce September 30, 2009 9:53 AM

“Although this is a doghouse entry, so I’m monitoring the comments a bit more closely than I ususally do.”

When authors take the time to interact with their community, it makes the blog, essay, journal, or site much more personal and enjoyable to read. You’re actually human!

Bruce Schneier September 30, 2009 10:03 AM

“You can make your point without being so pedantic. The pedantry just makes you seem silly (which is not truly the case); people will take you more seriously without it.”

Oh, I know. But in the context of the book, it worked. The section was on key length, and I talked about basic brute-force cracking, distributed software brute-force cracking, and hardware brute-force cracking. Then I talked about using viruses for brute-force cracking, something from 1991 called the “Chinese Lottery,” and something else from 1991 about using biochips for brute-force cracking. At that point, it was all pretty silly — so I went for the extreme.

jeff September 30, 2009 10:41 AM

IIRC, which I might not, the Planck time is approximately 10^-44 seconds, which is about 2^-146 seconds. With there being about 2^25 seconds per year, one could conclude that the number of sequential (i.e., non-parallel) computations in a year for any system could not exceed 2^171, regardless of how much energy were available for the computations.

Disclaimer: IAALNAP (I am a lawyer, not a physicist)
jeff
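jeff’s figures hold up, under his stated approximations (a quick check, not part of the original comment):

```python
import math

PLANCK_TIME = 1e-44          # seconds, the comment's approximation
SECONDS_PER_YEAR = 2 ** 25   # ~3.4e7; the true figure is ~3.15e7

sequential_ops = SECONDS_PER_YEAR / PLANCK_TIME
print(math.log2(sequential_ops))   # just over 171, as claimed
```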

Dave Aronson September 30, 2009 11:03 AM

@Aaron: You said “All doubling a key size does is double the search space for the correct key.” I’m no crypto-genius (else I’d be the famous one with books and a blog!), but….

Doesn’t doubling the key size square the key space? (Assuming that all possible values are valid keys, that is.) IOW, 2^(2n) is (2^n)^2, not (2^n)×2. All the latter takes is 2^(n+1).

For instance, take a five-bit key. It can have 2^5=32 values, from 0 to 31 (decimal). Double the length, to ten bits. Now it can have 2^10=1024 values, from 0 to 1023. That’s 32 squared, not 32 doubled. All it takes to double the keyspace, is to add one bit. In this example, go from five bits to six; now it can have 2^6=64 values, from 0 to 63.

Am I missing something, either in your explanation or in crypto in general?

Thanks!

Clive Robinson September 30, 2009 11:41 AM

@ kme,

“Your counter could use a gray code. This seems like a reasonable assumption given that we are talking a best-case scenario here.”

If you were only looking at output state changes.

Have a think about how you make a Gray code generator in terms of basic gates…

Oh, and then think about it using reversible gates such as Toffoli and Fredkin gates.

JeffH September 30, 2009 11:44 AM

49152? Where the hell did they come up with that? Hex C000? Cooo? Why not just go all the way to Fooo? Or ffff?

I’ve got a better encryption scheme they might want to consider. It has a keysize of 2^7228363491 bits.

wait for it…

f000bar in base 48.

I bet they’re really sorry they invited Bruce’s comments.

David Schwartz September 30, 2009 11:52 AM

Adding a single bit doubles the search space for the key. If you imagine some process to find an (n)-bit key, you’d need two of them to find an (n+1)-bit key in the same time. One would search the keyspace where the additional bit is zero and one would search the keyspace where the additional bit is one.

The problem with overly-large keys is basically that managing such a large key becomes the weakest link. A person can easily memorize a passphrase that encodes to a secure 128-bit key. At 256-bits, there’s significant loss of theoretical strength because some results will be more likely than others.

Think about the longest phrase you can reasonably remember and be expected to type in exactly. It would have to be a lot longer than 49,152 bits for there to be any way to turn it into a 49,152 bit key such that all possible keys were anywhere near equally likely.

It doesn’t matter how many bits your key is if the user types in “iamgod” or “this is a secret” as the key.

In some usability tests I did a few years ago, I found that you could get 128 bits without too much prodding and torture. Getting 256-bits from ordinary users is near impossible.

Chris September 30, 2009 12:01 PM

“One of several building blocks in the TOUAREG Encryption Algorithm is a 2048 wheel upgrade of HMX, further strengthened by the use of a multiplexer, which has the effect of interchanging the content of the wheels. I have every reason to believe that Professor Piper’s assessment still holds, and that, to use his words, an exhaustive search of the 49152-bit session key is the fastest form of attack.”

(From http://hdencrypt.wordpress.com/2009/09/07/proving-the-doubters-wrong/ )

I’m no encryption expert, but it smells to me like there’s fuzzy math going on to make a smaller key seem like a 49k bit key.

Chris September 30, 2009 12:13 PM

I can’t see the results of the study (the paper is behind a paywall), but apparently the HFX/HKM ciphers were considered broken, and this latest system is derived from them.

http://www.springerlink.com/content/y45200317u900875/

While he’s right that HFX was considered and studied over a 4-year period, I find no reference that it was ever accepted; only a draft paper proposes its use.

Bruce Schneier September 30, 2009 1:44 PM

“The unit is Kelvin (plural Kelvins), abbreviated ‘K’. So, you could say ‘The ambient temperature of the universe is 3.2 K.’, or ‘The ambient temperature of the universe is 3.2 Kelvins.'”

Good catch. Me, my editor, my copyeditor, the publisher’s copyeditor, the publisher’s proofreader all missed that back in 1996.

Nomen Publicus September 30, 2009 3:02 PM

Meanwhile in the real world. When the cost of cracking an encrypted message is expected to exceed the cost of shipping the author to a location where the keys can be beaten out of him by some anonymous thugs, plane tickets are bought.

This is something all the paranoid ‘leet hackers’ forget when they insist on huge key lengths 🙂

Tom September 30, 2009 3:42 PM

David,

I use passphrases that contain over 30 alphanumeric characters. At approximately 6-bits of entropy per character, my keys exceed 180 bits. It seems better to use something like SHA-256 to spread 180 bits into a 256 bit key than to condense 180 bits down to a 128-bit key.

Tom

Thomas September 30, 2009 3:56 PM

“Ten years later, there is still no reason to use anything more than a 256-bit symmetric key.”

Assuming your key bits are perfectly(tm) random then a 256-bit key should do.

Using a longer key provides safety in case your random number generator isn’t.

Clive Robinson September 30, 2009 4:06 PM

Having had a quick look at some of the things the designer has (supposedly) done in the past, I’m not struck by anything.

I get the feeling that his earlier systems from a quarter of a century ago are based on either Enigma-style rotors or Fish-style pin wheels, used to generate a long-cycle key stream (apparently they fit within a 3KB BASIC program, so may well have been developed on an 8-bit home computer).

On the principle of “old dogs not learning new tricks”, some of what he says makes me think that is what he is still doing.

One part that raised an eyebrow is about the mutual key being something like 6MB in size!!!

This would make the system impractical for a lot of uses.

However, to be fair, the above is speculation; until he releases more information we can only make guesses.

But this gives rise to the question “why is he keeping it secret?” There is no real commercial advantage in closed or proprietary systems, due to interoperability issues.

For instance, the Serpent contender for AES is arguably a more conservative design, but are people going to use it? Probably not, except in bespoke applications.

Further, only the European military appear interested in stream-based systems these days, and they have a history of stealing, not paying for, ideas.

The EU NESSIE project had a stream cipher section, but none of the submitted designs appeared to pass muster at the end of the day.

Which leaves the question: what does the designer hope to gain by obscurity…

Tony H. September 30, 2009 4:18 PM

‘If you’re going to give a physics lesson, you should try to get the units right. It is improper to refer to “°Kelvin”. The unit is Kelvin (plural Kelvins), abbreviated “K”. So, you could say “The ambient temperature of the universe is 3.2 K.”, or “The ambient temperature of the universe is 3.2 Kelvins.”‘

The “ergs” are kinda quaint, too, even for 1996.

Michael Lynn September 30, 2009 4:39 PM

Did anyone else check out the post on that same blog entitled “CRYPTETO passes the test”:

–snip–
“The Dispersion Test now selects a random character from the 1,572,864 hex characters of the Mutual Key and makes the smallest possible alteration. The Mutual Key is no longer totally correct.”

“The above results make it clear that, when no alteration is made, the distribution of the decrypted message is characteristic of 25600 “A”s. The fascinating point is, however, that when a single bit alteration is made, namely, D =1101 to C = 1100, the resulting distribution of the decrypt has the characteristic of a totally unintelligible message. So it looks as if it’s going to be pretty tough for anyone trying to crack Touareg. For a start, he has to cope with a key that is so large that it takes over 390 A4 pages to print it out. And if the Dispersion Test is anything to go by, he hasn’t the foggiest idea how close he is to getting all 1,572,864 hex characters correct. That sort of thing can be very discouraging!”
–snip–

So it seems that he is under the belief (or wants us to be) that the way you go about breaking modern encryption is that you guess a key then change bits in it one at a time going with whichever one has less entropy until eventually the plain text emerges.

Granted that is a test that any real algorithm should pass (or actually any decent classical method too) but I’m not sure that equates to a strong algorithm.

Is this actually how people think that ciphers are broken?

Clive Robinson September 30, 2009 4:49 PM

@ Tony H.,

“The “ergs” are kinda quaint, too, even for 1996.”

That’s the trouble with SI units and later; they just don’t cut it name-wise.

Ergs sounds like a man’s unit, whereas Kelvin sounds like a spotty-teenager unit.

Can you imagine the scriptwriter for Star Trek writing for Scotty using “Kelvin” or “ergs”…

valdis September 30, 2009 4:50 PM

@Paeniteo: “A general strengthening approach “just to be sure” would be to use good old triple-encryption (famous from Triple-DES).”

Actually, this is a really good time to go back and re-read the section in Applied Cryptography on why Triple-DES was used rather than Double-DES. Just saying “use good old triple-encryption” without understanding why is cargo-cult crypto, and will usually result in your ass being handed to you…

(Note – Paeniteo may indeed understand why Triple-DES was triple and just handwaved it, but I suspect a lot of readers will take it as a cargo-cult rule of thumb….)

Clive Robinson September 30, 2009 5:10 PM

@ Tom,

“At approximately 6-bits of entropy per character, my keys exceed 180 bits.”

Err, sorry to worry you, but with human-memorable pass phrases the entropy tends to be nearer 1.8 bits per alpha character.

So you will be lucky to get half of 128 bits of entropy out of a 30-char pass phrase…

“It seems better to use something like SHA-256 to spread 180 bits into a 256 bit key than to condense 180 bits down to a 128-bit key.”

It’s a common but mistaken belief: X bits of entropy are still X bits of entropy after you spread them around with a hash or other crypto primitive.

When it comes to practical attacks against any given system using crypto, attacking a lack of entropy is the next best attack after side-channel attacks…

Jeremy September 30, 2009 5:24 PM

@Tom
“I use passphrases that contain over 30 alphanumeric characters. At approximately 6-bits of entropy per character, my keys exceed 180 bits.”

But do you choose those alphanumeric characters entirely at random? Most people use patterns or mnemonics, or even actual words, which reduces the entropy per character substantially. Random capitalization is particularly uncommon.

pdf23ds September 30, 2009 6:52 PM

“It seems better to use something like SHA-256 to spread 180 bits into a 256 bit key than to condense 180 bits down to a 128-bit key.”

“It’s a common but mistaken belief.”

Oh? It’s not better to have a 256-bit key with 180 bits of entropy rather than a 128-bit key with 128 bits of entropy?

David Schwartz September 30, 2009 9:02 PM

It’s not particularly difficult to massage the entropy in an input passphrase into however many bits of key you like. If the input contains n-bits of entropy, the output will contain very slightly less than the lesser of n and the key size. (SHA-512 followed by truncation, for example, does this just fine.)

People tend to grossly overestimate the number of bits of entropy in their keys. The simple math only applies if the characters are truly random. And some automated “key strength” indicators I’ve seen use seriously mistaken algorithms.

For example, in “I am happy.”, neither the capital letter nor the period at the end add any real additional entropy, since they’re required by English convention. Yet some algorithms will think the capital ‘I’ means an attacker must try a capital for every letter or the period means an attacker must try every special symbol at every position.

“i aM h!appy” realistically contains much more entropy. An attacker would likely have to try capitalizing every combination of digits and would have to try every special symbol in every place.

Words, of course, do not contain as much entropy as random letter strings.

If 128-bits is sufficient for a truly random key, there is no difference between compressing 180-bits down to a 128-bit key or expanding it to a 256-bit key. In both cases, you will have at least 128-bits, which is what you need.

If you’d prefer to have 256 bits, then using a 256-bit key with 180 bits of entropy is better than a 128-bit key with 128 bits.

The thing is, your key probably doesn’t have 180-bits of entropy. It’s easy to think it’s much higher than it really is. It’s very unusual for humans to memorize keys with more than 128-bits of entropy.

Clive Robinson September 30, 2009 10:59 PM

@ pdf23ds,

“Oh? It’s not better to have a 256 bit key with 180 bits of entropy rather than a 128 bit key with 128 bits of entropy?”

As I said in my posting, I doubt he was actually getting even half of 128 bits of entropy. And X bits of entropy prior to hashing is still X bits after hashing.

In reality his 30 char string is probably giving closer to 54 bits of entropy. Which would still give 54 bits of entropy after hashing.

Hashing is not “magic pixie dust” it does not increase entropy.

Then there is the question of whether a 256-bit key with 54 bits of entropy is better than a 128-bit key with 54 bits of entropy.

And the answer to that is it depends…

If you are talking of a brute-force key search then yes, each extra bit gets you double the search space.

However, in reality nobody does a brute-force search — there is neither the time nor the resources available — so with modern crypto like AES it does not gain you anything.

A brute-force search is used as a benchmark bound against which other, more fruitful attacks can be measured.

For instance if there is a problem with the key expansion in a block cipher design then a 256 bit key might actually be worse than a 128 bit key under a specific attack.

For example, if you take 3-DES as a block you would think that you were getting three times the key size of 1-DES, and thus a vastly increased search space.

However, in reality you are not getting anything like that at best, and at worst the equivalent of 1-DES with a weak key.

There have even been crypto systems put in field use where only a small percentage of the keys were strong (for the time).

There have been various reasons proposed over the years as to why, but one is that if the opposition copied the design without understanding it, then in use the probability is they would be using weak keys most of the time. There appears to be some truth in this with the NSA and its ill-fated key escrow cipher. The design was very brittle, in that even very small changes considerably weakened it in subtle ways.

As Bruce notes above, expanding the key in a block cipher is not an easy job. If care is not taken you could effectively cancel rounds out or introduce other weaknesses, all of which make the argument about key size fairly unimportant.

dot tilde dot October 1, 2009 3:45 AM

although they “look forward to receiving advice and encouragement from the good dr. schneier” my concise blog comment linking to this post got removed.

.~.

Not By Choice October 1, 2009 3:54 AM

There was so much more wrong with Jaws than just oversold weak crypto. Every time I read one of these claims, I imagine a whole world of similar goings-on.

So I keep my eye out for the principals, as they pop up again.

21st century Dave October 1, 2009 6:43 AM

49,152-bit key? I’ll have some of that. Unfortunately there’s no pricing information on their website, or ordering facility. Perhaps they forgot.

Nevermind. I’ll email them. Hope it works on my Mattel Aquarius!

David October 1, 2009 9:42 AM

There are also problems with increasing key sizes, which is why we tend to keep them at convenient lengths, as small as is secure.

Each individual bit has to be random, so you need much more randomness for an individual 6+ kilobyte key than an 8-byte key. Truly random data is generally in short supply, and so it’s going to be difficult to generate session keys. At best, using the long key means that you won’t change keys readily, so if there is a compromise for some other reason the compromise will be far larger.

You can transmit an 8-byte key in one of any number of ways, provided you secure the channel somehow. (In sending a symmetric key by an asymmetric cipher, remember that the key strength is the minimum of the two.) We’ve all entered more than 256 bits of entropy in doing things like entering activation codes from over the phone, or copying them from printed keys. With 6KB keys, the only practical way to transmit them is electronically, and not by asymmetric cipher (unless there’s incredible amounts of bits in that key length), so it is necessary to send them by secure courier. Personally, I’d rather rely on my enemy being unable to break AES-256 than my enemy being unable to intercept a physical key exchange. (If I have to do a physical transmission of digital data by secure courier, why don’t I just send enough DVDs of random data to use as a one-time pad? That entirely avoids the problem of cryptanalysis.)

(Yes, it’s possible to say something like “start at character X of Project Gutenberg publication Y, and use this algorithm to generate a key”. That is a key with entropy based on the size of X and Y, not 49,152 bits. Unless you’re relying on security by obscurity, the enemy has the algorithm, and need only search Project Gutenberg to see which key works.)

I’m perfectly happy with using key lengths such that brute-forcing them will take longer than anything recognizably human will be around, and it doesn’t take all that many bits to ensure that.

jacob October 1, 2009 10:07 AM

I have a newbie question. Why could, say IDEA not be used as a OTP of a 120Gb hard drive, hash it etc.? Please don’t flame me, just a question.

joel8360 October 1, 2009 10:19 AM

Math terminology nitpick:

The cost to encrypt is linear as a function of key length, while cost to break is geometric.

Consider replacing with either of the following:

The cost to encrypt increases arithmetically with key length, while cost to break increases geometrically.

The cost to encrypt is linear as a function of key length, while cost to break is exponential.
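The asymmetry joel8360 is sharpening here is easy to see with toy numbers. The per-byte encryption costs below are invented purely for illustration; only the shape of the two curves — one linear, one exponential — matters:

```python
# Toy numbers illustrating the asymmetry: suppose encryption cost grows
# linearly with key length (hypothetical 0.05 cycles/byte per key bit),
# while a brute-force search costs 2**bits trial decryptions.
for bits in (128, 192, 256):
    encrypt_cost = 0.05 * bits          # cycles/byte, made-up linear model
    break_cost = 2 ** bits              # trial decryptions, exponential
    print(f"{bits}-bit key: ~{encrypt_cost:.1f} cycles/byte to use, "
          f"2^{bits} = {break_cost:.2e} trials to brute-force")
```

Doubling the key length here doubles the defender's cost but squares the attacker's — which is why cryptography is the strongest link in the chain.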

karrde October 1, 2009 10:30 AM

Question about bits of entropy per character in human-memorable keys:

Is expected entropy changed if I do something like move my fingers up one row on the keyboard?

That is, if I replace the (presumably low-entropy) phrase
“the quick brown fox jumped over the lazy dog”

with the phrase
“5y3 178di g492h r9s u7j03e 9f34 5y3 oqa6 e9t”.

It is moderately challenging to a trained touch-typist, especially if capitalized characters are kept capitalized. But it turns recognizable human-text into gibberish, while retaining the property of being easily remembered and generated by the user.

Would this increase the entropy-per-character in the keyphrase?

An alternate method would be to type QWERTY patterns on a Dvorak keyboard, or vice-versa.

ajw October 1, 2009 10:42 AM

@karrde: Not significantly. All you’re really doing there is running your phrase through a schoolboy’s substitution cipher.

Leprechaun October 1, 2009 11:55 AM

@john greco

“And I can imagine invisible pink unicorns.”

If they’re invisible, are you sure they’re pink? The ones I see are usually lavender.

Paeniteo October 1, 2009 11:59 AM

@valdis:

@Paeniteo: “A general strengthening approach ‘just to be sure’ would be to use good old triple-encryption (famous from Triple-DES).”

Actually, this is a really good time to go back and re-read the section in Applied Cryptography on why Triple-DES was used rather than Double-DES.

Actually, we might think about whether triple-encryption is still necessary with 128 bit (or even 256 bit) ciphers.
Bruce states in AC that meet-in-the-middle attacks that can break double encryption appear infeasible against 128 bit ciphers (because of their memory requirements of 2^128 blocks).
It is not obvious whether a breakthrough against the time-complexity of breaking a particular cipher would also reduce the memory requirements. Therefore, while it may not improve cipher strength as much as one might expect, double encryption would make a cipher more robust against cryptanalytic advancements.

Peter A. October 1, 2009 1:00 PM

@karrde, @ajw:

It is partially security by obscurity. If your opponent knows or guesses that you take a dictionary word or phrase and perform a particular substitution, like shifting your fingers one key up or down, you’re vulnerable to a dictionary attack. But if he does not know what your substitution may be, he would have to try all substitutions from the set of letters to the set of ‘typable’ characters — and that would be equivalent to brute-forcing.

In fact, password cracking software does just that – it takes a dictionary of words and performs some trivial substitutions or other simple operations like suffixing, prefixing or infixing that are known to be used by some people in order to extend the key space covered by the attack.

So by doing this, you can increase your passphrase security as long as your mangling method is novel and nobody put it in the cracking code yet.

One example:

Consider an 8-character password consisting of printable ASCII characters — there are 96 of them, including space. The key space is thus 96^8 = 3^8 * 2^40 ≈ 7.2 * 10^15, or about 2^53. As for 8-character words, my /usr/share/dict/american-english has 15,693 of them. It may be incomplete, so let’s assume there are 16384 = 2^14 8-character dictionary words. Much less, isn’t it? That’s why dictionary attacks on passwords are often effective.

The words use letters (small and capital) and the apostrophe, 53 different symbols. Now let’s apply a substitution cipher that is some injection from the set of letters and apostrophe (53 elements) into the set of printable characters (96 elements). (I consider only injective functions from the 53-set to the 96-set, to rule out trivial substitutions such as replacing every character with ‘a’.) There are 96! / (96 – 53)! ≈ 1.6 * 10^97 of them, far more than the target key space! The attacker would have to brute-force your password — unless he guesses or otherwise finds your particular substitution.
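Peter A.'s key-space figures are easy to reproduce (Python; the dictionary count is his assumed 16384 = 2^14, and 96^8 works out to about 2^53):

```python
import math

printable = 96          # printable ASCII incl. space
word_alphabet = 53      # a-z, A-Z, apostrophe
password_len = 8

keyspace = printable ** password_len                 # all 8-char passwords
dict_words = 2 ** 14                                 # assumed 8-letter words
injections = math.factorial(96) // math.factorial(96 - word_alphabet)

print(f"8-char keyspace: {keyspace:.3e} (~2^{math.log2(keyspace):.1f})")
print(f"dictionary words: {dict_words}")
print(f"injective substitutions: {injections:.3e}")
```

The count of injective substitutions (~10^97) dwarfs the 8-character key space itself, which is the heart of his argument: the attacker cannot enumerate substitutions and must fall back to brute force.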

I use this method to choose my passwords. For example, one of my past passwords was Owj9o8p7. Try to guess the word it was constructed from! (It wasn’t an English word, besides.) Of course, I will keep my substitution secret 🙂

neill October 1, 2009 1:52 PM

Sadly, though, the product with the nicer, shinier, more colorful box and the higher numbers on the box will win in the marketplace.

Glenn Willen October 1, 2009 2:22 PM

Since we’re talking about passphrase strategies, I’m going to toss mine out seeking opinions. 😉

Take the user’s favorite dictionary/wordlist. Estimate the entropy in bits of any word in the list as the base-2 log of the length of the list. Select enough random words from the list to accumulate the desired number of bits.

It seems like it takes about 8 words, with any reasonably-sized wordlist, to get 128 bits of entropy; and you have enough bits left over that you can allow the user to throw out a few words they don’t like, and offer them replacements.

Personally I find 8 random dictionary words a lot easier to memorize than a string of alphanumeric crap. 🙂
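Glenn's scheme is essentially Diceware. A minimal sketch (Python; `demo_list` is a hypothetical stand-in for a real wordlist of thousands of entries — the standard Diceware list has 7776 words, about 12.9 bits each):

```python
import math
import secrets

def words_passphrase(wordlist, target_bits=128):
    """Pick uniformly random words until the accumulated entropy
    (log2(len(wordlist)) bits per word) reaches target_bits."""
    bits_per_word = math.log2(len(wordlist))
    n_words = math.ceil(target_bits / bits_per_word)
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))

# Hypothetical tiny wordlist, for illustration only.
demo_list = ["apple", "brick", "cloud", "drift",
             "ember", "flint", "grove", "husk"]
print(words_passphrase(demo_list, target_bits=24))  # 3 bits/word -> 8 words
```

The entropy estimate is honest only because the words are drawn by a CSPRNG (`secrets`), not by the user — letting the user reject words they dislike spends a few of the spare bits, as Glenn notes.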

Clive Robinson October 1, 2009 3:14 PM

@ Glenn Willen,

“Estimate the entropy in bits of any word in the list”

I’m not clear how you are actually going to measure the entropy in your list.

But for words in a known language with normal letter, bigram, trigram etc. frequencies, the entropy goes down substantially (the trigram “the” has less entropy potential than “x”).

If you are going to use dictionary words, pick those that use low-probability letters, bigrams etc. and that do not have the usual suspects such as “th” or “qu” in them.

When you have a list of a thousand or so words then randomly select your dictionary list from them.

Also consider leaving out letters from the “EAT ON IRISH” list, or substituting rare letters for them. One way of doing this is to pick a word with both high-frequency and low-frequency letters in it and use it as a simple substitution cipher.

The problem is that although all of these tricks work to make passphrase guessing more difficult, and in some cases increase the entropy per character, you quickly reach a point where you would be better off writing it down and putting it in your wallet…

The simple fact is that when it comes to finding passwords/phrases, the computer has better performance than human memory virtually every time.

tom October 1, 2009 4:09 PM

The trick I use to try to get the most entropy is to use passphrases that I need to figure out each time I use them. I can not remember the passphrases; I remember how to recreate them. An example would be

74931953381110189=16541times16543TIMES16547times16549

This becomes my passphrase into PasswordSafe. I have password safe generate random passwords of 30-50 characters. It is easy for me to remember how to recreate the passwordsafe passphrase but I’d never be able to remember it.

I will concede though that this is less than 6 bits of entropy per character, but I suspect that it is a bit more than 1.5 bits per character.
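tom's example can be checked: the left-hand side really is the product of the four factors on the right, so only the rule needs remembering, not the string (Python). Worth noting, in line with his own concession, that the entropy is roughly that of choosing the four numbers, not of the 53 characters:

```python
# Recreating tom's example passphrase: the left-hand side is the
# product of the four factors on the right, so only the rule (not the
# long string) needs remembering.
factors = [16541, 16543, 16547, 16549]
product = 1
for f in factors:
    product *= f

passphrase = (f"{product}={factors[0]}times{factors[1]}"
              f"TIMES{factors[2]}times{factors[3]}")
print(passphrase)
# The stated product does check out:
assert product == 74931953381110189
```

Because the factors are four nearby odd numbers, an attacker who guesses the construction only has to search over starting points — which is DaveC's point further down the thread about misleading oneself with key-strength arithmetic.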

Nick P October 1, 2009 4:22 PM

I guess I throw mine in as well. My approach is somewhat complex, but easy enough for most geeks. I start with a passphrase. Then, I use one of several rules: insert (symbol here) between each word; last letter capital; substitute W for each i. There are more to choose from. Then, I use mnemonics to memorize it.

What this system does is allow a human being to generate a gibberish-looking password on the spot. The rules are simple enough to remember mnemonically, and the passphrase itself isn’t long. It’s like memorizing a passphrase and 2-3 passwords. Since I reuse passwords on similarly risky or critical sites, I only have to remember a half-dozen or so instead of dozens. It can also be used for password managers and password-based encryption w/ decent entropy.

It’s worked for me in risky environments, and if you type it in bursts of 6-8 characters, shoulder surfing isn’t easier. 😉

b October 1, 2009 6:48 PM

You know, I sometimes wonder if these things are a trap. They know they’ll piss you off and get you to comment, then take your comments completely out of context.

They’ll say this is what Bruce Schneier has said about our super-key algorithm:
“…AES-256 is about 40% slower…”
“Even a mythical quantum computer won’t be able to brute-force that large a keyspace.”
“The cost to encrypt is linear as a function of key length, while cost to break is geometric.”

Clive Robinson October 1, 2009 7:14 PM

@ jacob,

“I have a newbie question. Why could, say IDEA not be used as a OTP of a 120Gb hard drive, hash it etc.? Please don’t flame me, just a question.”

It’s actually quite a good question, and ranks with those simple-to-ask questions like “why is the sky blue” and “why is grass green”. Easy to ask but darn difficult to answer simply, but anyway here goes 8)

Oh and if anybody else wants to jump in and help out please feel free.

The quick answer first: the entropy in the system you describe is what is on the hard disk; encrypting it will not change the amount of entropy it has. If the disk is all zeros, encrypting it on a bit-by-bit basis against the same key will not change that: the result will be either all zeros or all ones. Likewise with a block cipher, a block of zeros will encrypt to the same value under the same key. The pattern is no longer all zeros, but the pattern is always the same under the same key.

As I said in a posting above, crypto primitives are not “magic pixie dust”: they do not increase entropy, they merely change what it looks like (a chameleon is still a chameleon irrespective of what colour it is ;).

That being said, if your disk contents have high entropy then it is not really any different to using IDEA in CTR mode and then encrypting the output under IDEA again, which would be simpler and give you a lot more keying material.
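Clive's point about patterns surviving encryption can be demonstrated without a real cipher. The sketch below uses truncated HMAC-SHA256 as a stand-in for one block-cipher invocation (an assumption for illustration only — it is not invertible and is not a real cipher), showing that, as in ECB mode, equal plaintext blocks produce equal ciphertext blocks:

```python
import hmac
import hashlib

KEY = b"sixteen byte key"

def toy_block_encrypt(block: bytes) -> bytes:
    """Stand-in for one block-cipher call: a fixed keyed function
    (HMAC-SHA256 truncated to 16 bytes). NOT a real cipher, but like
    ECB mode it maps equal input blocks to equal output blocks."""
    return hmac.new(KEY, block, hashlib.sha256).digest()[:16]

# A "disk" of all-zero 16-byte blocks:
disk = [bytes(16)] * 4
ciphertext = [toy_block_encrypt(b) for b in disk]

# Every zero block encrypts to the same value: the pattern survives.
assert len(set(ciphertext)) == 1
```

The ciphertext no longer looks like zeros, but it contains no more entropy than the disk did — exactly the "not magic pixie dust" point.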

Anton October 2, 2009 3:00 AM

“To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzman constant.”

Bruce,

You seem to be unaware that since you wrote the book, it was discovered that completely reversible computations do not do any erasing, which is the process that has a thermodynamic cost attached to it. So your argument needs to be rethought in the light of that information.

A good overview of this can be found in Leff H.S. & Rex A.F. “Maxwell’s Demon 2”, 2003

Dr William Hawthorne October 2, 2009 9:55 AM

As the designer of TOUAREG (the algorithm that drives Crypteto) I think I must respond to your Doghouse column. I totally agree that the assertion that “an addition of one bit to the session key halves encryption speed” was incorrect. These errors can creep into sales literature and I am grateful to you for spotting it.
The correct views are: (1) An addition of one bit doubles the time taken for an exhaustive search. I think nobody would disagree with that. (2) Scaling up a given cipher design by doubling the key length halves the running speed of the cipher. But this does not mean, as you rightly point out, that the key lengths of algorithms in general are inversely proportional to their running speeds. That is where good design comes in, and it accounts for the fact that your BLOWFISH is faster than Triple-DES in spite of the fact that the latter has a much shorter key — something on which you should be congratulated.
Your analysis of the non-response by academics to me has a strange irony. The one professor who reluctantly replied to me seemed to be unaware that his former head of department knows me very well. He charged a company in the eighties a very considerable fee for analysing my HMX cipher, which became the cornerstone of my internationally recognised HFX40 cipher and is now one of the building blocks of TOUAREG. We met personally on a number of occasions. And, a few years ago, he deputed a colleague in the same university to charge another considerable sum for examining one of the underlying concepts of TOUAREG.
And there is another strange irony. In the late nineties, your very own “Doghouse” column had a real go at Chantilley Corporation Ltd, a company of which I was a director, using your favourite snake-oil metaphor. You ridiculed the company for using jargon such as “Multiple Fermat Sequencing”, unaware that these new concepts were already recognised by GCHQ, NSA and the ITU. All of which confirms that I am not a total amateur. I hope you will read my CV on our blog, and I look forward to continuing our dialogue.

partdavid October 2, 2009 1:56 PM

On the subject of “Kelvins”, in my lifetime (as in, when I first took physics) common usage still was to talk about “degrees” (generally naked ones, or degrees Celsius) as the incremental unit, while “Kelvin” was the name of a scale. “Kelvins” sounds odd to me, and although this usage was apparently adopted around 1970 it wasn’t common in my physics and chemistry classes.

Clive Robinson October 2, 2009 3:03 PM

@ Anton,

“it was discovered that completely reversible computations do not do any erasing, which is the process that has a thermodynamic cost attached to it.”

Not quite true: there is still an energy cost in moving the information, as a force is involved.

For those interested in universal reversible logic gates, have a google for Fredkin and Toffoli gates or reversible logic (Wikipedia has so-so pages on them).

However, irrespective of the energy involved, Bruce’s counter would not work: the frequency required to get through the number of states in the time period is not attainable in a physically realisable form.

Clive Robinson October 2, 2009 3:30 PM

@ William Hawthorne,

‘You ridiculed the company for using jargon such as “Multiple Fermat Sequencing”, unaware that these new concepts were already recognised by GCHQ, NSA and ITU.’

The thing about “jargon” is that in most cases it should be ridiculed, and for good reason.

George Orwell made the point that jargon is a way of exerting unwarranted political control by self-appointed and largely unaccountable cliques.

When we see such behaviour today we generally deride it by giving it names such as “management speak” or “political correctness”.

As a statement, “Multiple Fermat Sequencing” lacks clarity and therefore fails to convey useful meaning.

That being said, your comment above has not actually addressed the pertinent issues raised on this page.

So you will forgive us for not taking you seriously, as all we currently see is “smoke and mirrors” from you.

Some of us have been around long enough to realise that claims of “if you know what I know” but “I can’t tell you or else I’d have to kill you” usually lack any substance or merit. And as such they are usually reserved for gulling the likes of politicians and the feeble-minded out of money.

Nick P October 2, 2009 6:29 PM

Maybe the question is a bit naive, but why don’t we have verifiable primitives to use when building primitives? Essentially, a functional programming version of crypto algorithm design?

Here’s what I’m talking about. In other disciplines, we have vast libraries that contain many primitive operations and even complex algorithms. Many are black box: you put certain data in, and you can be pretty sure you get the right results out. Has anyone come up with scrambling functions, etc. that work the same way?

The payoff would be diversity of crypto. If certain functions definitely did their job, then amateurs could combine them in new ways producing new algorithms. Cracking crypto would be even harder. I see this working for symmetric more than asymmetric, but why the heck not? Maybe some for hash functions too.

Of course, we make do fine with what few algorithms we currently have. Take Blowfish and TripleDES: still secure after all this time and probably have plenty of life left. If SHA-2 starts to suck, we have RIPEMD and Whirlpool. They give time to create new algorithms. I guess the only benefit to my scheme would be long-term secrecy: if the amateur algorithm, built using proven transformation functions, was lost then the documents would be unrecoverable. Both the algorithm and key would have to be cracked. There are some people who are interested in keeping secrets for decades, and a slower but provable algorithm might suit them.

Any comments? Criticisms?

Paeniteo October 5, 2009 10:20 AM

@Nick P:
I think the crypto algorithms themselves actually are what you are thinking of.
They form the building blocks of complex cryptosystems/protocols.

There are also more basic (let’s say “atomic”) building blocks of ciphers, such as Feistel networks, S-boxes etc., but these cannot easily be seen in isolation.

Paeniteo October 5, 2009 10:49 AM

@Dr William Hawthorne: “(2) Scaling up a given cipher design by doubling the key length halves the running speed of the cipher.”

In general, this is wrong.
Take RC4 as one very simple counterexample where it is immediately obvious that keylength does not have any influence on the cipher speed.

RH October 5, 2009 6:01 PM

To those who have password formulas, your passwords are probably stronger against mindless brute force than others. Just don’t trust them against a determined adversary.

An adversary who has taken the time to compromise your email, and seen several of your passwords in one way or another will have a HUGE leg up reverse engineering your password choices, especially if they have any hints of being non-random (i.e. not stored in PWSafe, but rather stored in your head). Reading books (such as Bruce’s), it would appear that the most dangerous foes you face are those who can get inside someone’s head.

Dr William Hawthorne October 6, 2009 7:44 AM

Clearly the posting on our blog created some active debate and views on your blog. There were also some fairly derogatory comments. We would like it made known that we are happy to address everyone’s view, however uncomplimentary, if they wish to contact us.

We consider it very important to stimulate these types of discussion, both with someone as experienced and highly regarded as you and with any others who would like to join in; after all, we are keen to make our voice heard from “this side of the pond”. For example, we have noticed a number of comments by Clive Robinson, and our responses to some of his points follow.

We understand Clive Robinson’s suspicions given the limited amount of information we have released about the TOUAREG Encryption Algorithm. TOUAREG is not a stream cipher. It is closest to a rotary cipher. At the risk of introducing more jargon, we describe it as a “Solar Cipher”. Some of the design constants of the algorithm are:

Mutual Key strength: 6,291,456 bits

Session Key strength: 49,152 bits

Number of Sun and Planet rotors: 2 x 8

Minimum Cyclic Length of the Sun Wheel movements: >120,000,000,000,000,000 Gigabytes (precise figure withheld)

Variability of the rotor offsets: >2^169770 (precise figure withheld)

Cyclic length of the primary stream: 7.83 x 10^9890

On the question of compact code: a development of HMX, known to us in Hawthorne Davies Ltd as the “Postcard” cipher because its essential code can be written in nine lines, also achieves 49,152-bit strength. And I’m disappointed to be likened to an old dog. I was rather hoping to be called the new kid on the block!

jacob October 6, 2009 8:31 AM

Thanks Clive,
So while it would be increasing the difficulty, it would not be an OTP? Thanks for answering the child’s question “why is the sky blue” 🙂
Would digitizing, say, the face of the moon or another image, and XORing, be feasible for an OTP? Also, has IDEA or AES in triple encryption with different keys been analyzed? Thanks

Clive Robinson October 6, 2009 10:52 AM

@ jacob,

“Would digitizing say the face of the moon or other image, xoring, be feasible for a OTP?”

The two golden rules of OTPs are,

1, Use a True Random source.

2, Never ever use the same key material twice.

And a mathematician (I can not remember which one off the top of my head) has already suggested using images of the moon. Another released a CD of random numbers which he claimed he got by mixing white noise with American rap music…

However there are a few dirty little secrets you need to be aware of,

1, Most images are not at all random.

2, Truly random sources have run-length issues.

3, Most physical sources of random data are either detectably biased or have characteristics that change in a predictable way with time.

With regards,

“has IDEA or AES in a triple encryption with different keys been analyzed?”

The answer is yes a number of people have looked at using both AES and IDEA in various modes. However I’m not sure to what level.

If you do an Internet search on them you might find some stuff; also look up the work of the likes of Bart Preneel, as well as AES’s and IDEA’s designers.

Paul October 6, 2009 11:04 AM

Jacob,

While I hope Clive answers again (as he always has something interesting to say), I think it would be helpful to make something explicit.

The security proof of a One-Time Pad has some very specific requirements:

  • The key must be (truly) randomly generated.
  • The key must only be used once (as opposed to the original form of the Vernam cipher, which was theoretically breakable due to key reuse).
  • There must be a secondary, secure path to transmit the key.
  • The key must be greater than or equal to the length of the plaintext to be enciphered.
  • The key is securely disposed of after use.

Now, this causes serious problems due to the strict key requirements (which is obviously what you are trying to get around by creatively choosing key elements). Which is why the OTP is more of an academic exercise than a practical cryptosystem, except possibly in very high-risk low-bandwidth channels. But if you change or relax any of the key requirements, the security proof for the OTP is no longer valid and it would be good to not muddle definitions by referring to it as a OTP.

I am not an expert on the face of the moon, but it would not be suitable for a OTP on the grounds that it is not randomly generated. If anyone else knows your algorithm (e.g. digitize the face of the moon and use as the encryption key) it would not be too difficult for them to do the same thing at different points to save possible keys. Not to mention I do not think the keyspace would be very large, relatively speaking.

As for using other images, it would be feasible as a OTP only if you randomly generate the bits to use and use the image as a sort of wrapper to contain the data. If you use a stock picture in circulation (whether on the internet or somewhere else) not only are you dealing with likely insufficient entropy in the key material, you also do not have control over distribution of the image (and thus, cannot delete your key). Not to say you couldn’t build a “good enough” system for your own uses (depending on what they are), regardless it would not be a OTP.
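Paul's requirements translate into a very small amount of code, which is part of the OTP's paper appeal; the hard parts are everything outside the code (truly random generation, secure key transport, and destruction after use). A minimal sketch:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: a truly random key exactly as long as the
    message, combined by XOR. The key must be transported securely,
    used once, and then destroyed."""
    key = secrets.token_bytes(len(plaintext))   # CSPRNG, key length == message length
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

msg = b"attack at dawn"
ct, key = otp_encrypt(msg)
assert otp_decrypt(ct, key) == msg
```

Substituting a digitized moon photo for `secrets.token_bytes` breaks the first requirement, which is exactly why the result would no longer be an OTP.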

Nonce October 6, 2009 11:34 AM

@William Hawthorne: “HMX…also achieves 49152 bit strength.”

Nope. See Lai and Rueppel, “Attacks on the HKM/HFX Cryptosystem”, FSE 1996.

Unsurprisingly, Dr. Hawthorne’s work does not appear to have been published in any established crypto venue.

Clive Robinson October 6, 2009 12:44 PM

@ William Hawthorne,

“And I’m disappointed to be likened to an old dog. I was rather hoping to be called the new kid on the block!”

Hmm, the “new kids” I’ve seen hanging around the block recently are even more scary than the Mods-n-Rockers and Punks of my younger days. Being likened to one of those would not be my choice 😉

Your second post tells me a little more about your system, though 9 lines of code sounds a little light for a rotor-based system (unless you are entering the convoluted-C competition 8)

I’m assuming by “rotors” you are referring to systems where the characters are sent through a mapping defined by the rotor wiring, as opposed to pin/lug wheels, where a pin or lug is used either as a gear tooth or as a simple two-state electrical contact (on/off).

There are two basic types of wired “physical rotor”: the half rotor and the full rotor. Of these, the full rotor was generally considered better, for various reasons that are not entirely mechanical.

The third type of wired rotor, the “reflector” or “reflecting rotor”, is usually not a rotor but a stator, and due to certain issues is deprecated in use; I’m guessing that you probably don’t use one.

However, apart from the number available / in use and the terminal-to-terminal wiring, the actual rotors are of little interest except as a simple map.

In fact most rotors can simply be replaced with strips of paper on a ruler and a few rubber bands to stop them moving too easily 😉

What is usually of slightly more interest is how the rotors are fed.

Generally there are two ways to feed a rotor. The first is, as in the Enigma design, where there is a terminal for each character in the alphabet in use. The second is where a rotor has the same number of input terminals as there are bits in the character (5 for Baudot, 7 for ASCII, etc.).

It is how the rotors are used where the fun starts, and this is generally of most interest.

The second big weakness of the Enigma design was the predictable stepping method. Later versions of Sigaba, and a certain NSA-influenced Swiss company, used complex stepping, with reversible steps and stepping based on other rotors / lug wheels.

However, they all tended to use a side-by-side rotor line-up, and to have the same number of contacts on the rotors and thus the same number of steps per rotor.

However in software the last set of physical constraints do not apply (provided you are carefull with your mapping choices).

A couple of open refrences on the Internet suggest that the system considered by the ITU used 19 prime numbers (possibly for Chinese clock mathmatics).

And you refer to it as a “Solar Cipher” and say you have 2×8 “Sun and Planet rotors”. This gives rise to a couple of notions.

The first is that you have two nearly identical rotor drive systems. The second is that the stepping of the rotors could be either as in a “sun and planet” gear system or, more interestingly, similar to that of an orrery.

Now the question is why two sets…

History suggests (the work of Frank Rowlett on SIGABA, etc.) that one is used to drive the stepping of the other. However, the Japanese had a system where one rotor acted as the input half of the map and the second as the output half.

Your two statements, “TOUAREG is not a stream cipher” and “Cyclic length of the primary stream: 7.83 x 10^9890”, possibly suggest the former rather than the latter.

One very odd thing is your statement:

“Minimum Cyclic Length of the Sun Wheel movements >120,000,000,000,000,000 Gigabytes”

I would not normally associate “Gigabytes” with the equivalent of a mechanical stepping.

You have indicated that your “Mutual Key strength: 6,291,456 bits” is a byte array, which, as you talk about rotors, could quite possibly be the equivalent of the rotor wiring (i.e. the base rotor maps).

I could start making further guesses but I might give you some new ideas to think over 8)

DaveC October 7, 2009 12:52 AM

@tom:

The trick I use to try to get the most entropy is to use passphrases that I need to figure out each time I use them. I cannot remember the passphrases; I remember how to recreate them. An example would be

74931953381110189=16541times16543TIMES16547times16549

This becomes my passphrase into Password Safe. I have Password Safe generate random passwords of 30-50 characters. It is easy for me to remember how to recreate the Password Safe passphrase, but I’d never be able to remember it outright.

I will concede though that this is less than 6 bits of entropy per character, but I suspect that it is a bit more than 1.5 bits per character.

Not trying to be (too much of) a smartass, but this is ironically a great illustration of how easy it is to mislead oneself with key strength calculations …

If your 4 numbers are always a linear progression but you vary the interval, your key components are:

  • choose a 5-digit integer from 10000 to 99999 (log2 90,000 ≈ 16.5 bits)
  • a single-digit interval from -5 to 5 for the progression (log2 11 ≈ 3.5 bits)

So you’ve got about 20 bits spread over an average of about 19 digits -> I make that right about 1 bit per digit.

I was wondering if switching up the arithmetic functions would give you a few more bits, but AFAICT it’s largely negated by duplicate keys.
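Redoing that arithmetic explicitly (the two-component model above is the one being analyzed; the code just takes log2 of each choice space):

```python
import math

# Recomputing the two-component key-space model: log2 of each choice space.
start_choices = 99999 - 10000 + 1   # 90,000 possible 5-digit starting values
interval_choices = 11               # progression intervals from -5 to 5

bits = math.log2(start_choices) + math.log2(interval_choices)
digits = 19                         # average number of digits in the result

print(f"{bits:.1f} bits total, {bits / digits:.2f} bits per digit")
```

Which is the general lesson: the entropy is the log of the number of choices the *method* allows, not anything about how long or random-looking the resulting string is.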

Lee October 7, 2009 6:57 AM

In response to the epicleveling spam: one wonders if they’ve contemplated the blog content and present discussion (which is a very interesting read)?

my money’s on it being posted by the same lot who tried to scam Bruce’s laptop away 😉

Steve Roberts October 15, 2009 7:54 AM

@tom:
When you calculate your true passphrase as 16541 × 16543 × 16547 × 16549 etc., I bet you can’t do all that in your head, and you jot the numbers down on a piece of paper.

bugstomper October 15, 2009 8:07 AM

A more detailed and thorough analysis of the limits that physics imposes on computation is this paper from 1999-2000, cited recently by a pseudonymous poster to the cryptography mailing list. The conclusions do not contradict Bruce’s assertion that enumerating all states of a 256-bit counter would be impractical. Note that the “ideal computer” in the paper requires an operating temperature equivalent to the Big Bang. The rest of this is quoted from that post:

Seth Lloyd, “Ultimate physical limits to computation”:
http://arxiv.org/abs/quant-ph/9908043

“As an example, quantitative bounds are put to the computational
power of an ‘ultimate laptop’ with a mass of one kilogram confined to a
volume of one liter.

The ultimate laptop performs 2mc^2/πℏ = 5.4258 × 10^50 logical operations per second on ≈ 10^31 bits. Although its computational machinery is in fact in a highly specified physical state with zero entropy, while it performs a computation that uses all its resources of energy and memory space it appears to an outside observer to be in a thermal state at ≈ 10^9 degrees Kelvin. The ultimate laptop looks like a small piece of the Big Bang.”

Sandro October 15, 2009 9:11 AM

I guess this guy got more attention than he deserved, and probably more than he was expecting, given the size of the doghouse article and the huge number of comments…

Squid October 15, 2009 10:27 AM

On the question of memorizing real, long passphrases which contain good entropy:

The best thing I have found is to use some quality passphrase generator to create a 16- or 32-character passphrase for you and MEMORIZE THAT!

For people who think this might be too hard, ask a piano player to write out the notes to a favorite piano tune on paper (without the piano).

Typing a 32 character passphrase becomes a piano tune on the keyboard after a few repetitions.

I can’t write mine, or say it letter-for-letter, but I can type it, no problem.

Any other ideas on overcoming the limited human memory capacity for complicated, long passphrases with adequate entropy?

Kevin Wall October 15, 2009 10:52 AM

Douglas Adams had it almost right. The universe is actually a gigantic quantum computer designed by Satan to brute force God’s 49,152-bit symmetric key. Answer to be delivered RSN, perhaps on 12/21/2012. 🙂

Fredric L. Rice October 15, 2009 6:55 PM

This “Doctor” Hawthorne reminds me of “Doctor” L. Ron Hubbard, who also made grand claims and statements utilizing emotive rhetoric to hype up his non-existent achievements, after which “Doctor” Hubbard sought Nobel Peace Prizes for being the world’s greatest humanitarian, scientist, and philosopher.

When the marketing rhetoric yarks off into sky-high nonsense, you know there’s fried shit being sold, this time sautéed in snake oil.

“Doctor” Hawthorne should make the software implementation available for public review and testing so that the process can be evaluated by non-amateurs; otherwise: “Dismissed! Go away, kid, ya bother me.”

My opinions only and only my opinions, as always.

Tom T. October 16, 2009 12:37 AM

Squid wrote:
“Any other ideas on overcoming the limited human memory capacity for complicated, long passphrases with adequate entropy?”

My personal choice for the Password Safe key is acronyming, with the provision that the acronymed phrase is easy for me to remember (perhaps from a favorite movie, book, play, speech, etc.) but not easily guessed by those with full access to my public records, the books in my home, and so on. (Think how many phrases each book, movie, etc. contains, and that it might be one from a television program from 30 years ago, of which there’s no evidence that I saw it.)

Having done that, acronym it in a way that you can remember, but which still includes the various cases, numerals, keyboard characters, etc.

Trivial example, although today’s students aren’t taught history, apparently:

The beginning of the speech made by then-President Abraham Lincoln after the 1863 Battle of Gettysburg in the American Civil War:

“Four score and seven years ago, our forefathers brought forth on this continent a new nation…”

Some acronyms:

4S&7ya,oFfbfotCanN
fS+sya,o4fb(4th)otcaNN

Note that I can remember to put parens around the “4th” when turning “forth” into “(4th)”. Could also add more caps, etc.

passwordmeter.com rates the second at 100%, or “very strong”, despite the fact that they have a 16-character limit and therefore truncated the rest (why the limit?). I can memorize the phrase and its conversion to this 22-character password very easily.

Microsoft rated both at 100%, but who wants to trust them?

Geekwisdom.com actually thought the first one stronger than the second. Odd.

Of course, this is a trivial example, using one of the most famous speeches in US history. But if I use phrases chosen from less notable sources, and chosen at random, and apply the characters and casing properly, then it seems “they” would still have to waterboard it out of me, which “they” can do anyway.

As I’m not a crypto expert in any way, I’d welcome comments on the entropy of this method, assuming a generating phrase that even one’s closest associates are unlikely to guess. (Pick a phrase from something that you saw/heard/read before these closest associates knew you, assuming your parents and siblings aren’t your adversary.)
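Not an expert answer, but one way to frame the entropy question: the attacker’s uncertainty is the log2 of the number of (phrase, encoding) pairs the method could plausibly have produced, and has nothing to do with how random the characters look. A toy estimate in which both counts are loudly invented placeholders, not measured values:

```python
import math

# Both counts are invented placeholders: the real values depend entirely
# on how the source phrase and the acronym encoding are actually chosen.
candidate_phrases = 10_000_000    # phrases an adversary might enumerate
encodings_per_phrase = 1_000      # case / numeral / paren variants each

entropy_bits = math.log2(candidate_phrases * encodings_per_phrase)
print(f"roughly {entropy_bits:.0f} bits")
```

The weak point is the first factor: a famous phrase sits near the top of any sensible attacker’s enumeration, which is why the Gettysburg example above is only a trivial illustration.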

enigma October 16, 2009 10:42 AM

@Clive Robinson

Minor correction.

Regarding your comment about Serpent, TrueCrypt is a well-known, non-bespoke disk/volume encryption application which uses Serpent in some of its multiple-encryption variants.

Yes, people will use it 🙂

Clive Robinson October 16, 2009 11:38 AM

@ enigma,

“Regarding your comment about Serpent, TrueCrypt is a well-known, non-bespoke disk/volume encryption application which uses Serpent in some of its multiple-encryption variants.”

Aghh yes, I had forgotten TrueCrypt…

Perhaps you will allow me to wriggle a bit on the hook?

By saying that in my previous paragraph I was talking about “commercial” and “interoperability”.

TrueCrypt does not have to interoperate with other programs, just itself…

Have I wriggled off 8)

Mike November 5, 2009 3:02 PM

A million-digit combination lock safe made of cardboard is less secure than a three-digit combination safe made of steel. TOUAREG should go back to the drawing board.

Daniel Carrera December 18, 2009 12:28 PM

@ Clive Robinson,

“ergs” are not actually SI units. The SI unit of energy is the joule.

“ergs” belongs to the cgs system (centimetre, gram, second), so it could be confused with SI, but the SI system is actually based on mks (metre, kilogram, second) plus the other base units (ampere, mole, candela, kelvin).

Therefore, the unit of energy in SI is the joule, not the erg.

CW January 5, 2010 6:22 PM

@Paeniteo
AFAIK it generally isn’t trivial to simply double the keylength for a given algorithm – nobody knows what an AES-512 would look like.

AES-512 looks just like AES-128, AES-192 and AES-256. The entire algorithm works with a 512-bit key just as it works with any of the smaller ones.

The only variable is how many rounds to execute for diffusion.

CN June 25, 2013 10:45 AM

I know I’m a few years late with this comment, but “ergs”? Really? OTOH, it’s great to see seniors still doing useful work, and you look surprisingly spry for your age. Most of the people I’ve known who ever used CGS units are gone now.

Clive Robinson June 26, 2013 5:14 AM

@ CN,

    it’s great to see seniors still doing useful work, and you look surprisingly spry for your age

The trouble with you young upstarts is you get too uppity, thus the old gummers get gainful employment, if not some enjoyment, cuffing you around the ear or giving you a good poke with a walking stick. So watch your six, because even a tortoise can sneak up on a sleeping hare…

As for the “middle aged”, who have served enough time to know when to duck or jump, we sit back and watch the race with interest, and also learn golf, so that when it’s our turn to use the walking stick we know how to get a good swing on it 😉

Kev March 8, 2016 1:26 PM

CubeItz is now apparently doing “1 million bit encryption”. I find that (a) seriously hard to believe, due to the excessiveness, and (b) seriously hard to get any information on, other than “1 million bit encryption”. Sounds a bit fishy.

Seirdy January 15, 2021 12:44 AM

I recently wrote a blog post analyzing the physical limitations of a brute-force attack, in a similar vein to the provided excerpt from Applied Cryptography.

TLDR: I concluded that a classical computer operating at the theoretical maximum possible efficiency can brute-force a 306-bit key with 100% certainty after using up the mass-energy of the observable universe. I picked a more conservative estimate for the temperature of cosmic background radiation.

Scott Aaronson published a similar post with a similar approach, reaching the same conclusion for 405-bit keys.

To find a safe key length for quantum computers, take a safe key length for classical computers and double it.
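The arithmetic behind these physical-limit arguments fits in a few lines. A back-of-the-envelope sketch in the spirit of the Applied Cryptography excerpt (the Landauer bound and background temperature are standard figures; the solar-output number is a rough textbook estimate, not Seirdy’s or Aaronson’s exact model):

```python
import math

BOLTZMANN = 1.380649e-23   # J/K (exact SI value)
T_CMB = 2.725              # K, cosmic microwave background temperature

# Landauer's principle: flipping one bit costs at least kT * ln(2).
# Merely running a 256-bit counter through all its states therefore
# needs at least this much energy:
min_energy = (2 ** 256) * BOLTZMANN * T_CMB * math.log(2)

SUN_LIFETIME_OUTPUT = 1.2e44   # J, rough total solar output over ~10^10 yr
print(f"{min_energy:.2e} J, about {min_energy / SUN_LIFETIME_OUTPUT:.0e} "
      "solar lifetimes, and that is only counting, not testing keys")
```

This is the asymmetry from the original post in numbers: each extra key bit doubles this energy figure while adding almost nothing to the cost of encrypting.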

