Monday, November 2, 2009

Upcoming Black Swan talk at ZISC

I will be giving a talk in mid December at the Zurich Information Security Colloquium (ZISC) on one of my favourite topics, Some Black Swans in IT Security. Details can be found on the ZISC site.

Security FreeMind map sources

I have started to upload the FreeMind source of my MindMaps to my Google site here.  I will eventually upload all my maps, including those that were used just for reference and storing links (17 have been uploaded at the moment).

Sunday, October 11, 2009

Focus on securing business processes not the process of securing

The title comes from a list of conclusions I gave at a presentation called Does IT Security Matter? just before Christmas in 2007. The wonderful thing about the writing process is that every now and again you hit upon a pithy phrase like that which communicates so much. But it's like mining for gold - you have to move a lot of earth to find the nuggets.

The full presentation is available on Scribd as shown below. There have been about 1200 reads and 240 downloads. Re-reading it now, the presentation could do with an update; however, the core messages are still valid. My main conclusions were
  • There is a dependency between IT and IT Security but not a strategic relation
  • IT and IT Security are good neighbours but not good friends
  • IT Security is one area competing for attention and funding, amongst many
  • If you don’t make IT security matter, it won’t
  • Focus on securing business processes not the process of securing
  • Excel is your new best friend - make your spreadsheets work with their (business) spreadsheets

Does IT Security Matter?

The Size of our Security World

I was sent a link from StumbleUpon that referred to a post which showed the relative sizes of planets in our solar system, then compared them to our Sun, and moved on to comparisons with much larger stars. Surprisingly much larger stars - in fact, in the final screen-sized graphic our Sun is represented as a single pixel compared to Antares, a red supergiant star in the Milky Way. My 15-year-old daughter was impressed by this, and if you have a 15-year-old, then you can appreciate what a momentous achievement this is.

I started out my post-university life working in cryptography, then I spent a long time in IT Security, then IT Risk and most recently in Enterprise Risk Management (ERM). When I look back at crypto now it seems of similar consequence to the proportions of the Sun and Antares - not merely because my professional interests have changed, but in the vast equation that constitutes ERM, crypto is a variable with minor weighting. Its gravitational force is largely exerted on specialists, and rapidly declines (much faster than the inverse square law) beyond that sphere. It's just a pixel on the football-field sized collage of ERM.


Tuesday, October 6, 2009

Risk Analysis Rising

In June I posted on a paper called A Risk Analysis of Risk Analysis, and from that post:

The title of this post is taken from a both sobering and sensible paper published last year by Jay Lund, a distinguished professor of civil engineering at the University of California (Davis), who specialises in water management. The paper presents a discussion of the merits of Probabilistic Risk Assessment (PRA), which is a “systematic and comprehensive methodology to evaluate risks associated with a complex engineered technological entity”. PRA is notably used by NASA (see their 320 page guide) as well as essentially being mandated for assessing the operational risks of nuclear power plants during the 80’s and 90’s.

It is a wonderfully insightful paper that I uploaded to Scribd, which recently informed me that the paper is now on its hotlist. You can get to the paper from the link below. Highly recommended!

Monday, September 21, 2009

Post Number 100

This is post number 100 since the beginning of the No Tricks blog in September 2007. I have not been collecting statistics from Google Analytics over this whole period, but since April 2008 my site has had just over 12,700 visits by people who have viewed close to 21,000 pages. My bounce rate and new visitor rate stubbornly hover around 75%. The majority of visits are to the main page, but after a post gets off the main page we can get some measure of its popularity. The top 10 posts by the number of visits are

  1. Are AES 256-bit keys too large? (2,558)
  2. Examples of Risk Profile Graphs (1,164)
  3. Entrust PKI v5 Overview (743)
  4. The cost of SHA-1 collisions reduced to 2^{52} (590)
  5. AES-256 and Reputational Risk (514)
  6. Weapons of Math Instruction: The Birthday Paradox (458)
  7. The spin on passwords for AES (421)
  8. The Wisdom of a Random Crowd of One (416)
  9. Goodbye Yellow Brick Road (412)
  10. Long Tail of Vulnerability for A5/1 (362)

The top two posts account for a large number of visits, which I called my Pareto Posts back in my assessment last Christmas. The post on AES-256 and Reputational Risk may yet reach towards the top as it is a relatively recent post that is getting more hits. On the other hand, if we rank posts by the average time spent reading the post, then a different picture emerges. Below I list the top 10 posts that have had at least 20 views, by largest average visit time

  1. Excellent Awareness talk from British Airways (8:35)
  2. Self-Destructing Digital Data with Vanish (7:38)
  3. Marcus Ranum and the Points of No Return (7:23)
  4. Twitter in the Land of Power Laws (6:42)
  5. ENISA and Security Awareness (5:41)
  6. Quantum Computing: are you Shor? (5:30)
  7. The Sub-Time Crisis in Web 2.0 (5:11)
  8. AES-256 and Reputational Risk (4:45)
  9. Weapons of Math Instruction: The Birthday Paradox (4:43)
  10. A tie between Long Tail of Vulnerability for A5/1 and Zero Knowledge Proofs (4:20)

The posts on AES reputational risk and attacks on A5/1 are perhaps the best examples of posts that are both well-visited and well-read. Thanks to everyone for visiting the No Tricks blog!

Sunday, September 20, 2009

My Top 10 Security and Risk Uploads to Scribd

I have been reading and uploading to Scribd for several years now. It is really a vast source of documents, and it seems that it has been a victim of its own popularity, since so many varied and inconsequential documents are now finding their way to the site. The search function is not quite as effective as it was, and, as has always been true, the site itself is quite slow.

Over the last couple of years I have slowly uploaded just over 40 documents and presentations, mostly in the area of security and risks. For the last few months I have been getting just over 100 hits per day, and about 12 downloads per day. The total number of hits is now getting close to 20,000, and will reach that mark in the next week. Here is a list of the top 10 visited documents that I have uploaded – the number of reads is in parentheses, and documents in bold type are written by me

  1. A Data Centric Security Model (1529)

  2. ISACA Risk Framework (1498)

  3. How much is enough? A Risk Management Approach to Computer Security (1290)

  4. Does IT Security Matter? (1127)

  5. Entropy Bounds for Traffic Confirmation (886)

  6. Risk Analysis of Power Station survival of Cyber (712)

  7. Password Authentication on Mac OS X from Dave Dribin (704)

  8. An analysis of the Linux Random Number Generator (702)

  9. The Core Components of the Entrust PKI v5 (677)

  10. Canadian Government 1999 Threat and Risk Assessment Guide (628)

Thursday, September 17, 2009

The Luxembourg Attacks and AES-256

There have been several attacks on AES-256 in the last few months, referred to as the "Luxembourg Attacks". I had hoped to write about the attack announced in June before I went on summer holiday, but ran short on time - and then another attack was announced in July. It is the first attack that I wish to address here to emphasize some points I made in an earlier post.

In May I wrote about the topic of AES-256 and Reputational Risk where I argued that the huge 256-bit key makes it possible for someone to construct an attack which has "big savings" over exhaustive search yet still does not effectively change the margin of security. What suffers in this case is the reputation of AES-256, and the post concluded with

Imagine you posed the following question to a group of top physicists. You asked them to present you with ideas for new research projects where they could assume that the budget included all the money that we have, all the money that has ever been, and the total financial assets of the world for the next 10 million years. Would the resulting proposals be credible?

AES-256 puts cryptanalysts on the same research agenda.

So the first Luxembourg attack gives you an example of the type of arguments that can be proposed when working with 256-bit keys. The attack needs to work with 2^{119} plaintext/ciphertext pairs, requiring about 2^{77} storage, all to yield 156 bits of the key. The remaining 100 unknown bits of the key are guessed, at an average cost of 2^{99} encryptions. This guessing step is hardly worth mentioning as compared to the main 2^{119} computation, since it represents something like one part in a million of the total work.
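The claim that the guessing step is negligible is easy to sanity-check with the figures just quoted:

```python
# Cost comparison for the first Luxembourg attack, using the figures above:
# a 2^119 differential step, plus an average of 2^99 encryptions to guess
# the remaining 100 key bits (half of the 2^100 possibilities on average).
main_work = 2**119      # dominant differential computation
guess_work = 2**99      # average cost of guessing 100 bits
total = main_work + guess_work

print(guess_work / main_work)   # 2^-20, roughly one part in a million
print(total / main_work)        # barely above 1: guessing is negligible
```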

Some Details of the Attack

The basis of the attack is called differential cryptanalysis (DC), a powerful attack method that gained widespread popularity in the late 80's, and was standard fare by the time AES was announced. In this attack the cryptanalyst finds a pair of values (~P, ~C) such that two plaintexts P1 and P2 of difference ~P = P1 + P2 are biased towards producing ~C as the difference C1 + C2 of their corresponding ciphertexts (here the difference operator + usually means XOR, but it varies according to how the key is combined internally in a cipher). In short, plaintexts of difference ~P produce ciphertexts of difference ~C with some probability that is much higher than 2^{-N}, where N is the block size of the cipher. Given such a pair of differences (~P, ~C), an attacker can use statistical methods to gain information about parts of the key that were used in the last or next-to-last round, for example. The remaining bits are then guessed.

DC is typically quite data intensive. Even though plaintexts of difference ~P may be biased towards producing ciphertexts of difference ~C, this may only happen with probability 2^{-50}, for example, which sounds quite low but is much higher than you would expect for a 128-bit block cipher. The attacker will therefore need to obtain 2^{50} plaintext pair encryptions of difference ~P to get, on average, just one ciphertext pair of difference ~C. Part of a successful DC attack is designing a filtering strategy to discard most ciphertext pairs that don't have ~C as a difference before they are spuriously used as the basis for information about some part of the key. So a large number of encryptions are needed, where the majority will be winnowed away, leaving a precious few that lead to information about the key.
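The "bias" at the heart of DC can be made concrete on a single 4-bit S-box. The sketch below builds a difference distribution table for the PRESENT cipher's S-box (my choice of example here; LA1 itself of course targets the full AES rounds):

```python
# Difference distribution table (DDT) for a 4-bit S-box. Entry [dx][dy]
# counts inputs x with S[x] ^ S[x ^ dx] == dy. A perfectly "flat" S-box
# would spread the 16 pairs evenly; real S-boxes always show some bias.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]   # the PRESENT S-box

ddt = [[0] * 16 for _ in range(16)]
for x in range(16):
    for dx in range(16):
        ddt[dx][SBOX[x] ^ SBOX[x ^ dx]] += 1

# Find the most biased nonzero differential:
count, dx, dy = max((ddt[dx][dy], dx, dy)
                    for dx in range(1, 16) for dy in range(16))
print(f"input difference {dx:#x} maps to output difference {dy:#x} "
      f"in {count}/16 cases, versus 1/16 for an ideal random mapping")
```

An attacker chains such per-round biases together (through the cipher's linear layers) to get the end-to-end (~P, ~C) pair described above.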

In the  first Luxembourg attack (LA1), the basic DC attack is upgraded to a boomerang attack which eliminates the need to thread the differences (~P, ~C) to each other through all the intermediate rounds of the cipher. LA1 also uses related keys to coerce differences into a desired form - in the vanilla DC attack we are trying to keep plaintext differences in a known state, but by using related keys, the key can also be used to bring the internal state of the cipher into a known difference.

The basic step of LA1 is to obtain 2^{61} encryptions of specially crafted quartets (4 related values), and then combine these values amongst themselves to generate the required internal differences. It is this combination process and its associated filtering which drives up the cost of the attack to 2^{119}. The result is that the remaining unfiltered quartets identify all but 100 bits of the key which, as we mentioned above, can be guessed at a small marginal cost.

Luxembourg 2.0

Even though the attack has unrealistic resource requirements, it works on the full version of AES-256, which is a significant result – something that no other researchers have been able to do – and that's not for lack of trying. And the reputation of AES-256? The Register, for example, reported that LA1 represents an improvement over brute force search for AES-256. Even that is debatable, since you need a huge amount of plaintext/ciphertext encrypted under several keys to mount the attack, while brute force needs just a few blocks to execute. What's in the future then? I guess it's here already in the second Luxembourg attack, which is on my reading list.

Related Posts

Tuesday, September 15, 2009

Thoughts on the Cult of Schneier

Earlier in the year John Viega wrote a short opinion article called The Cult of Schneier, referring to the near-religious following that Bruce Schneier has acquired over his long and successful career in IT Security, and the biblical authority that the Applied Cryptography book has attained. Viega's main issue with the book as it currently stands is that "It's fine and fun to read it, just don't build from it".

I think that Applied Cryptography was a very well-crafted book. It contains an excellent mix of mathematics, exposition, security intrigue and executable code. However for me, and a few other cryptographers I know, the Handbook of Applied Cryptography is the best source of general cryptographic information. The book does not enjoy anywhere near the same general recognition as Applied Cryptography, seemingly because it is viewed as a "math book" - correct, factual, thorough and therefore unappealing to a wide audience, as most technical books are. In short it lacks the narrative woven into Applied Cryptography. On the other hand, no one would really confuse the Handbook with a solution manual for designing and implementing secure systems.

Earlier in the year I made a post on Some Black Swans in IT Security, and I listed Bruce as an unexpected phenomenon in the following way

Bruce Schneier is the best known security authority in the world. His blog has hundreds of thousands of readers, his posts can yield hundreds of comments, and his books are bestsellers. His opinions hold sway over both technical people and executives, as well as all the layers in between. He is the Oprah of security - a public figure and a leading opinion maker. The Black Swan aspect of Mr. Schneier is that he has achieved this status through excellent communication (and yes cunning publicity as well) rather than technical prowess. Of course he has technical prowess but that is rather common in security and cryptography. What is uncommon, or even uncanny, is the ability to explain security in terms that can be understood by non-specialists whether it be programmers, professionals, managers or executives. Bruce has literally written himself into the modern history books of security. He has shown, once again, that communication is king - the security explanation is mightier than the security deed.

I don’t really think that there is a cult in operation over Bruce Schneier, but rather a hero was found when security as an industry needed to believe in heroes.

Sunday, September 13, 2009

Another crack at open Rainbow Tables for A5/1

About a year ago I posted on The Long Tail of Vulnerability for A5/1, the stream cipher for GSM data encryption, after earlier in the year David Hulton, director of applications for the high-performance computing company Pico, and Steve Muller, a researcher for mobile security firm CellCrypt, announced new results which claimed that A5/1 keys can be recovered in 30 minutes for $1000. I concluded that

A5/1 has operated unchanged for the last 21 years but it has now reached its cryptographic end-of-life, engulfed by the march of Moore's Law. However, the operational end-of-life of A5/1 may still be decades away as there are approximately 2 billion GSM subscribers, commanding about 80% of the global mobile market. This would be a tough product recall indeed. A5/1 is well-positioned to become the NT of the mobile crypto world, and I see the makings of a long tail of GSM vulnerability.

The Hulton & Muller attack was based on pre-computing rainbow tables that enable the direct "lookup" of A5/1 keys when indexed by a small amount of observed traffic (3 to 4 GSM frames). However these tables have not been materialized, apparently due to potential legal issues.

But a new project announced in August could deliver these tables in 6 months - or by Christmas if we are lucky. Karsten Nohl, best known for his work on reverse engineering the Mifare chip, announced, at the Hacking At Random conference in August, a new project to produce A5/1 rainbow tables that would be open and available to the public. Nohl described the project details in his conference paper with the somewhat sinister title of Subverting the security base of GSM. In subsequent interviews Nohl has gone to some lengths to emphasize that his intentions are not to destabilize GSM, but rather to highlight through public demonstration the weaknesses of GSM encryption, and hopefully to initiate a migration towards a more secure solution for mobile telephony.

What are these Rainbow Tables?

Several recent articles paint quite a bleak picture for GSM, such as this one from the Tech Herald, which says of Nohl’s project, “if successful, will allow anyone with some RF equipment, patience, and a $500 USD laptop, the ability to decode GSM-based conversations and data transmissions”. We need a little technical background to decode this statement.

Imagine that an attacker is given (or intercepts) a plaintext-ciphertext pair P, C encrypted under an unknown key. In a brute force attack the attacker arranges all possible keys in some order, and then starts searching the keys until the right one is found – that is, the key K that encrypts P to C. In this case the attacker is facing a needle-in-a-haystack search problem since there is only one point in the key space (the target key K) that provides information on when the search can stop.

The idea of rainbow tables is to pre-compute a compressed form of the key space that contains a collection of “checkpoint” keys. When searching the compressed key space, hitting a checkpoint identifies a relatively small collection of keys (say several billion) that is known to contain the target key. So key search with rainbow tables has two steps: first, search to find a checkpoint, and second, search the keys defined by the checkpoint to find the target key.

The advantage over brute force search is that many checkpoints can be created, and the search process will therefore hit one of these checkpoints much faster than waiting to stumble across the lone target key. Of course the disadvantage is that the checkpoints must be pre-computed and stored. So there is a time-memory trade-off (TMTO) to be considered – the search time is reduced with more checkpoints but at a cost of more pre-computation and storage.
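The mechanics of the trade-off can be sketched with a toy single-table construction in the spirit of Hellman chains (real rainbow tables add a different reduction function per step, which I omit here). The 16-bit key space and the hash-based toy cipher below are invented purely for illustration; the A5/1 tables target a vastly larger space:

```python
# Toy time-memory trade-off: precompute chains over the key space, store
# only (endpoint -> startpoint) pairs as "checkpoints", and recompute the
# rest of each chain on demand during the online search.
import hashlib

KEY_BITS = 16
KEY_SPACE = 1 << KEY_BITS
CHAIN_LEN = 64
P0 = b"known plaintext"

def encrypt(key):
    """Toy cipher: hash the fixed plaintext under `key`, truncated to 16 bits."""
    h = hashlib.sha256(P0 + key.to_bytes(2, "big")).digest()
    return int.from_bytes(h[:2], "big")

def step(k):
    """One chain step: encrypt, then 'reduce' the ciphertext back to a key."""
    return encrypt(k) % KEY_SPACE

# Precomputation phase: walk chains and keep only their endpoints.
table = {}
for start in range(0, KEY_SPACE, CHAIN_LEN):
    k = start
    for _ in range(CHAIN_LEN):
        k = step(k)
    table[k] = start

def lookup(ciphertext):
    """Online phase: walk forward until a checkpoint hits, then replay the chain."""
    k = ciphertext % KEY_SPACE
    for _ in range(CHAIN_LEN):
        if k in table:                       # hit a checkpoint
            cand = table[k]
            for _ in range(CHAIN_LEN):       # replay the chain to find the key
                if encrypt(cand) == ciphertext:
                    return cand
                cand = step(cand)
        k = step(k)
    return None                              # not covered (chains merge and collide)

secret = 12345
found = lookup(encrypt(secret))
if found is not None:
    print("recovered a key:", found, encrypt(found) == encrypt(secret))
else:
    print("key not covered by this small table - coverage is probabilistic")
```

Note that coverage is probabilistic: chains merge and overwrite each other, which is exactly the kind of problem the rainbow-table refinements (and careful parameter choices) are designed to tame.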

On the Shoulders of Giants

We have glossed over many important ideas in an effort to give the gist of how rainbow tables offer improvements over brute force search. In practice, generating the required set of tables to allow key recovery with high probability is quite tricky. Conveniently for Nohl and his team, the work of determining appropriate parameters for generating an effective set of tables was done previously by The Hacker’s Choice (THC), including the special optimisations that give rise to the name rainbow tables.

The THC project failed to produce a set of public A5/1 tables (apparently for legal reasons), and Nohl’s project is to use the THC base to re-compute the tables in an open and distributed fashion. One innovation is to compute the tables using CUDA, a parallel computing architecture developed by the graphics co-processor company NVIDIA. Nohl estimates that computing the tables on a single-threaded Intel-like processor would require 100,000 years, but only a matter of months on 80 CUDA nodes. So Nohl’s project (homepage here) is to find volunteers who will download optimized A5/1 code for the construction of the tables. It is estimated that the total computational effort to produce the tables is 2^{57}.

Tables in the Cloud

Nohl emphasizes that the tables are computed through a volunteer distributed computation and then made available to all and sundry, through BitTorrent for example. The point is not to centrally reconstitute the tables – in fact he wants to avoid that. He does not want a central point or legal entity where the project could potentially be shut down, and this also helps to provide anonymity for the volunteers if they seek it. In any case, this sounds a little like the defence that Pirate Bay unsuccessfully used in their recent prosecution – we don’t distribute illegal music, only the index for where you can find it.

Impact and Responses

Public rainbow tables for A5/1 will provide a definite and public demonstration of the shortcomings of GSM encryption - currently relied on by 3 billion subscribers in over 200 countries, representing about 80% of the mobile telephony market. While it has been known for some time that A5/1 is weak, there is nothing like an actual demonstration to drive the point home, as was the case in the breaking of 56-bit DES or more recently with the Cold Boot attack.

Potentially all GSM communication, including voice and SMS, as well as newer security services that rely on GSM as an out-of-band mechanism for authentication and authorization, will be under threat. Nohl’s advice, and the main reason for undertaking the project, is to raise awareness of the need to upgrade to more secure encryption. The stronger A5/3 algorithm is being phased in during upgrades to 3G networks, providing 128-bit encryption. However A5/3 may only be used to encrypt data traffic, and voice traffic will remain with A5/1 until a full upgrade is made.

The GSM Association (GSMA) has politely scoffed at the project, claiming that the practical complexities of carrying out actual attacks, in particular intercepting the required GSM traffic, are being underestimated. However I think the project team will rise to the challenge.

Nohl hopes to have proof-of-concept tables by the end of 2009, and you can see some graphs on the main project page showing current computational results. In risk terms the project promises to show that what is perceived as a high-impact/low-probability event is actually medium- to high-probability. We should have more news by Christmas.

You can find additional information on this topic from the FreeMind map I created for this post, rendered into Flash here.

Wednesday, September 2, 2009

New US Digital Border Search directives

The US Department of Homeland Security (DHS) announced last week new directives to enhance and clarify oversight for searches of computers and other electronic media at U.S. ports of entry, justified as "a critical step designed to bolster the Department’s efforts to combat transnational crime and terrorism while protecting privacy and civil liberties". Border searches as an issue flared up last year when a group including the Electronic Frontier Foundation (EFF), the American Civil Liberties Union and the Business Travel Coalition, published an open letter calling on the House Committee on Homeland Security to limit searches to be appropriate and non-invasive. The purpose of the latest policies is
To provide guidance and standard operating procedures for searching, reviewing, retaining, and sharing information contained in computers, disks, drives, tapes, mobile phones and other communication devices, cameras, music and other media players, and any other electronic or digital devices, encountered by U.S. Customs and Border Protection (CBP) at the border
Notice that the scope of devices that can be searched is quite broad. And well it might be, since the policy is designed to detect electronic evidence relating to terrorism, human trafficking, bulk cash smuggling, contraband, and child pornography, as well as other run-of-the-mill crimes. Where practical, these searches will be conducted in the presence of a supervisor and the owner of the electronic device.

Devices should only be detained if there is probable cause to believe that the device contains evidence of a crime that border authorities are authorized to prosecute. Devices may be “detained” for up to 5 days without justification, and this period can be extended in the case of extenuating circumstances, but requires supervisor approval. The policy also makes clear that your device or a copy of its contents can be sent to another location for additional assistance, which includes requests for translation, decryption, or general subject matter consulting. Such requests must be approved by a supervisor, transmitted securely, and processed within 15 days.

The DHS reports that between Oct. 1, 2008, and Aug. 11, 2009, border authorities encountered more than 221 million travellers at U.S. ports of entry. Amongst these travellers, approximately 1,000 laptop searches were performed, and fewer than 50 of these were detailed searches. So using these figures, your chances of being searched were less than one percent of one percent. However, I think the chances of being searched would be higher if the figures were restricted to travellers who were actually carrying a laptop or some other visible electronic device.
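A quick back-of-the-envelope check of those odds, using the DHS figures quoted above:

```python
# Search odds implied by the DHS figures: roughly 1,000 laptop searches
# among more than 221 million travellers over about ten months.
travellers = 221_000_000
laptop_searches = 1_000

rate = laptop_searches / travellers
print(f"{rate:.7f}")            # about 0.0000045, or roughly 1 in 220,000
print(rate < 0.01 * 0.01)       # True: well under one percent of one percent
```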

Tuesday, August 25, 2009

Solo Desktop Factorization of an RSA-512 Key

It was recently announced that Benjamin Moody has factored an RSA-512 bit key in 73 days using only public software and his desktop computer. The first RSA-512 factorization in 1999 required the equivalent of 8,400 MIPS years over an elapsed time of about 7 months. The announcement was made all the more intriguing in that it came from a mailing list associated with a user forum for Texas Instruments calculators.

The public key in question is used to verify signed OS binaries for the TI 83 Plus and the factorization means that

… any operating system can be cryptographically signed in a manner identical to that of the original TI-OS. Third party operating systems can thus be loaded on any 83+ calculators without the use of any extra software (that was mentioned in recent news). Complete programming freedom has finally been achieved on the TI-83 Plus!

The original post from Moody was not very informative, but the subsequent Q & A thread drew out more details. Moody used public source factoring software on a dual-core Athlon64 at 1900 MHz. Just under 5 gigabytes of disk was required and about 2.5 gigabytes of RAM for the sieving process. The final computation involved finding the null space of a 5.4 million x 5.4 million matrix.
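The number field sieve machinery behind such a factorization (sieving, then the huge matrix step) is far beyond a blog snippet, but the end result, recovering the primes p and q from N, can be illustrated with a much simpler method on a toy modulus. The sketch below uses Pollard's rho, which is hopeless against RSA-512 but splits a small composite instantly; the 13-digit modulus is invented for illustration:

```python
# Pollard's rho factorisation: not the number field sieve Moody used, but
# a compact way to see what "factoring N" actually recovers.
from math import gcd

def pollard_rho(n, c=1):
    """Return a nontrivial factor of composite n (Floyd cycle detection)."""
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n          # tortoise: one step
        y = (y * y + c) % n          # hare: two steps
        y = (y * y + c) % n
        d = gcd(abs(x - y), n)
    return d if d != n else pollard_rho(n, c + 1)   # rare failure: retry

p, q = 1_000_003, 1_000_033          # small primes standing in for RSA factors
n = p * q
f = pollard_rho(n)
print(f, n // f)                     # recovers {p, q} in some order
```

With p and q in hand, an attacker can re-derive the private key and, as in the TI case, sign arbitrary binaries that verify under the original public key.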

Most security people would rightly claim that RSA-512 is known to provide little security as a key length; however, this does not mean that 512-bit public keys are not in use today by companies other than Texas Instruments. By the way, someone is offering an RSA-512 factoring service with a “price indication” of $5000.

CORRECTION, Feb 19, 2010: The original post said the factoring was done in 73 hours when the correct value is 73 days, as pointed out in the comment below. Thank you to Samuel for pointing out the mistake.

Self-Destructing Digital Data with Vanish

University of Washington researchers recently announced a system for permanently deleting data from the internet. The solution, called Vanish, can be used for example to delete all copies of an email cached at intermediate relay sites and ISPs during transmission from sender to receiver. Advertising that Vanish provides self-destructing data conjures up the digital equivalent of a tape that bursts into flames after its message has been played. But data protected by Vanish neither self-destructs nor does the system actively seek out data at third party sites for deletion.

Vanish works by encrypting data with a secret key (say an AES-256 key), splitting the key into secret shares, and then storing the shares at a randomly selected set of nodes in the distributed hash table (DHT) of a public P2P network. In this way Vanish creates an encrypted object for inclusion in email, for example, that the sender can transmit to a receiver or group of receivers.
When a receiver opens the encrypted object, Vanish attempts to access a sufficient number of DHT nodes with shares so that the key can be recovered and the data decrypted.

The self-destructing aspect is that the key shares will be deleted as part of the natural node churn in the DHT, quite independent of the actions of both the sender and the receiver. The lifetime of a share is about 8 to 9 hours in the DHT, after which there is a low probability that there will be a sufficient number of shares to reach the recovery threshold.

So the encrypted data does not self-destruct - but rather key recovery information is placed in volatile public storage that is very likely to delete that information after a short delay as part of its normal operation. And with that deletion also disappears logical access to all copies of the unencrypted data.
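The split-and-recover mechanics are classic Shamir threshold secret sharing. Here is a minimal sketch; the prime field and the 7-of-10 parameters are illustrative choices of mine, not taken from the Vanish implementation:

```python
# Shamir t-of-n secret sharing over a prime field: the key is the constant
# term of a random degree (t-1) polynomial; shares are points on the curve.
import random

PRIME = 2**521 - 1      # a Mersenne prime, comfortably bigger than a 256-bit key

def split(secret, n, t):
    """Split `secret` into n shares; any t of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 reconstructs the constant term."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(2**256)        # stand-in for the AES-256 data key
shares = split(key, n=10, t=7)        # scatter these across DHT nodes
assert recover(shares[:7]) == key     # 7 surviving shares: key recovered
assert recover(shares[:6]) != key     # below threshold: reconstruction fails
print("threshold recovery works")
```

Once churn deletes enough shares to drop below the threshold, the remaining shares reveal essentially nothing about the key, and with it goes access to every copy of the ciphertext.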

The full paper that describes the Vanish system has extensive results from experimenting with the Vuze P2P network, as well as other local simulations, examining the interplay of various parameters such as the number of shares versus the deletion rate.

Monday, August 24, 2009

Twitter in the Land of Power Laws

This is my first post after a long summer holiday and I am glad to say that my blog has still been receiving a reasonable number of visits in the absence of new content, albeit some visits were barely more than a glance.

Gartner has released a collection of Hype Cycles for various technology niches, as reported by Eric Auchard at Reuters for example. The particular Hype Cycle below is for Emerging Technologies.

[Gartner Hype Cycle for Emerging Technologies, 2009]

I was immediately surprised to see Quantum Computing registering in the Technology Trigger region, which is used to denote technologies that are 10 years or more out from acceptance. I think this is something of an understatement as I argued here.

Auchard picked up on microblogging (read Twitter) being positioned near the peak of expectations, and therefore about to experience the full G-force of descending into the Trough of Disillusionment. I am not really sure from what perspective Gartner is making this prediction, since a recent study released by Sysomos shows that user growth has more than doubled this year – in fact, over 70% of Twitter users joined in 2009.


The report shows that almost every aspect of Twitter is operating under a power law. In the case of new users this power law means exponential growth, whereas for other measures the power law typically means domination by the few. For example, 92% of people follow fewer than 100 other people, but 1% of people follow more than 1,000 others. Less than 1% of people have more than 1,000 followers, and more than 90% have fewer than 100. Interestingly, 21% of people with a registered Twitter account have never made a tweet. Most people make only one tweet per day, but just over 1% make at least 10 on average. More generally, Sysomos observed that there is a 75/5 rule in operation, meaning that 75% of activity is accounted for by 5% of the Twitter user base.
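A 75/5 rule is just what you would expect if per-user activity follows a Zipf-style power law. A small sketch (the exponent of 1 is my choice to make the point, not fitted to the Sysomos data):

```python
# If the r-th most active user tweets in proportion to 1/r (Zipf's law with
# exponent 1), the most active 5% of 100,000 users produce about 75% of all
# tweets, close to the 75/5 rule reported above.
n = 100_000
activity = [1 / rank for rank in range(1, n + 1)]   # sorted, most active first

share = sum(activity[: n // 20]) / sum(activity)
print(f"top 5% of users account for {share:.0%} of activity")   # about 75%
```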


Is Gartner right then? Perhaps in the words of Neil Postman the majority of Twitter users are just “amusing themselves to death”, while a few Twitter users are really tweeting themselves to death. Where is the business case here? But as I argued in The Sub-Time Crisis in Web 2.0, we are not heading for collapse:

Unlike our current financial structures, the web is not at threat of collapsing, though many foot soldiers will fall by the way (they see themselves as pioneers, but in fact they are easily replaced). Our informational structures are not hierarchical but relational, and as such, are much more resilient to the removal of individuals. It is not the case that there are eager underlings waiting to replace leaders – the underlings are here and functioning already.

Web 2.0 losses will largely go unnoticed. New users/readers, whose information experience begins today, are being added constantly. They are essentially unaware of what happened last month and will remain that way. Joining involves no generational wait, no corporate ladder to be climbed. Everyone joins and progresses simultaneously. This turnover goes largely unnoticed since leavers are dwarfed by joiners.

I am quite sure that Twitter will weather the Trough of Disillusionment – in fact, it already has I would say. Gartner is as overly optimistic of Quantum Computing as it is pessimistic of Microblogging.

Related Posts

Saturday, July 4, 2009

How will my loved ones break my password?

Just a few days ago I posted about a new Swiss web service from DataInherit to manage the life cycle of your sensitive data and credentials. Coincidentally Cory Doctorow has an article in the Guardian this week on the same topic, fretting about passwords being carried off with loved ones into the next life. While creating a will with his wife, Doctorow was stumped by how to deal with his data, and specifically the secrets that protect that data. His various hard disks are protected by AES-128 bit encryption and a passphrase that is unlikely to succumb to anything less than quantum leaps in quantum computing. So while Doctorow feels safe against attacks on his data, he wonders about the following scenario:

But what if I were killed or incapacitated before I managed to hand the passphrase over to an executor or solicitor who could use them to unlock all this stuff that will be critical to winding down my affairs – or keeping them going, in the event that I'm incapacitated?

After considering several technical and non-technical approaches he finally decided on the following solution

I'd split the passphrase in two, and give half of it to my wife, and the other half to my parents' lawyer in Toronto. The lawyer is out of reach of a British court order, and my wife's half of the passphrase is useless without the lawyer's half (and she's out of reach of a Canadian court order).

Doctorow remarks that the surprising outcome of this process was the realisation that we are missing a well-known service for handling key escrow in an era of military grade encryption being available to home users. He concludes that “you need to figure this stuff out, before you get hit by a bus and doom your digital life to crypto oblivion”. I think that DataInherit will be giving him a call.