Business Continuity Test Scenarios at the Speed of Light?

The more business continuity test scenarios you can run in your IT systems, the closer you can get to a bullet-proof organisation. Of course, that doesn’t mean you’ll necessarily achieve such a Holy Grail; it might just mean you’re a little less far from it than when you started. And then there’s all that data to be crunched. Big Data is a hot item at the moment, with huge volumes available from systems recording company operations and customer interactions. However, unless you can significantly speed up processing and analysis, you might still be churning through that data when test scenario “N” becomes an unfortunate reality. What’s the solution?

Optical computing, or processing at the speed of light, would be nice. Photons zip around much faster than electrons, and calculations for business continuity test scenarios should be correspondingly faster for the same quantity of data. Unfortunately, despite considerable buzz in the 1980s, optical computing never made its way to large-scale production for business use. However, IT vendors, having noted that Moore’s law is still going strong, are milking conventional transistor/chip technology to produce accelerations of a different kind.

The new buzz is “in-memory computing”. It’s technology that keeps data close to where they need to be processed, in main memory next to the processor, thus avoiding the comparatively long times for fetching them from other parts of the system, like hard drives. The most dramatic claims are for shortening processing times from days to minutes, or even less. Systems and software vendors like IBM, Microsoft, Oracle and SAP now all have “in-memory” offerings available to speed up access to data. Even if Einstein says you can’t move at the speed of light (you become infinitely heavy), most organisations would settle for moving through their business continuity test scenarios at a good fraction of it.
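The principle behind those vendor claims can be illustrated with a toy sketch. The following is not any vendor’s actual engine, just a hypothetical aggregation over simulated transaction records, run once by streaming each record from a file on disk and once over data already resident in main memory. The record count and file format are made up for illustration.

```python
# Toy comparison: disk-based pass vs. in-memory pass over the same records.
# Assumption: a simple "id,amount" CSV stands in for an operational data store.
import os
import tempfile
import time

N = 200_000  # hypothetical number of transaction records

# Write N records to disk, simulating the operational store.
fd, path = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w") as f:
    for i in range(N):
        f.write(f"{i},{i % 100}\n")

# Disk-based pass: every record is fetched and parsed from the file system.
start = time.perf_counter()
disk_total = 0
with open(path) as f:
    for line in f:
        disk_total += int(line.split(",")[1])
disk_time = time.perf_counter() - start

# In-memory approach: load once, then query straight from RAM.
with open(path) as f:
    amounts = [int(line.split(",")[1]) for line in f]

start = time.perf_counter()
ram_total = sum(amounts)
ram_time = time.perf_counter() - start

print(f"disk pass:   {disk_time:.4f}s, total={disk_total}")
print(f"memory pass: {ram_time:.4f}s, total={ram_total}")
os.remove(path)
```

Both passes compute the same total; the second one skips the fetch-and-parse cost because the data already sit next to the processor, which is the essence of the in-memory pitch (real products add compression, columnar layouts and parallelism on top).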