Implications of #Pillar’s acquisition by #Oracle on #storage space and specialist players http://bit.ly/iq1oAk
#PwC sponsoring the development of #XBRL taxonomy to promote disclosure transparency in sustainability data http://bit.ly/mPHIK7 #green
With IBM’s 100th birthday and Lockheed’s recent purchase of what is billed as the first commercial quantum computer (QC), there has been some recent press on what the next frontier in computing looks like. The pundits seem to agree that viable, practical quantum computing would be the next big step after silicon.
Though there are challenges to be overcome, such as the near-absolute-zero temperatures needed for the quantum processor to operate, innovation over time will lower the cost and broaden the use of such devices. So which areas will benefit the most from such a computing paradigm? Multiple ones, though cryptography has gotten the most ink, and it may well be the breakthrough that Artificial Intelligence has been waiting for.
One area that would benefit immensely is system testing. For large, complex systems such as those in commercial airplanes, the number of permutations and combinations of possible scenarios is overwhelming by today’s standards. Inevitably, there is always a combination that was not tested and, as remote as the chance may be, it could result in a catastrophic situation.
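As a rough, hypothetical illustration of how quickly that scenario space explodes (the component and state counts below are invented, not taken from any real aircraft system):

# Python sketch: exhaustive coverage of independent components,
# each with a handful of possible states, grows exponentially.
def exhaustive_scenarios(num_components, states_per_component):
    """Number of distinct scenarios if every combination must be tested."""
    return states_per_component ** num_components

print(exhaustive_scenarios(10, 4))   # 4^10, roughly 1 million scenarios
print(exhaustive_scenarios(40, 4))   # 4^40, roughly 1.2e24 scenarios, far beyond exhaustive testing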
As complexity increases, and with it the validation burden, QC offers a fundamentally different perspective: a practical way to attack “NP-complete” problems, the class of problems that are effectively intractable to solve exhaustively on a classical computer. Ironically, this raises its own challenge of ‘who is checking the checker’ — that is, validating that the QC processor itself is functioning as intended.
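For context on what “NP-complete” means in practice — finding a solution is the hard part, while checking a proposed one is cheap — here is a toy, hypothetical SAT-style verifier; the formula and assignment are made up purely for illustration:

# Each clause is a list of literals: n means "variable n is True",
# -n means "variable n is False". A formula is satisfied when every
# clause has at least one true literal.
def verify_assignment(clauses, assignment):
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

clauses = [[1, -2], [2, 3], [-1, -3]]          # (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
assignment = {1: True, 2: True, 3: False}
print(verify_assignment(clauses, assignment))  # True; the check runs in time linear in the formula size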
Interesting times indeed.
After #Buzz, #Google’s latest foray into social networks - Google+ http://bloom.bg/jaelUq -> will this impact #fb usage?
better late than never, @BlueArc files for IPO http://reut.rs/jXU1Wi <- window closing due to emergence of all SSD storage devices #NAS
RT @bigdata: SPROUT: Probabilistic DB for use when the source data has inaccuracies or gaps http://goo.gl/wYNNO #bigdata #database
Vendor hype notwithstanding, the notion of seamless provisioning and accessibility of computing across private and public clouds “at will” is far from reality. In fact, one could posit that the current evolution is somewhat similar to Enron and its bandwidth-trading scheme towards the end. At one time, there was hype around how movies and related content would be streamed across the pipes to people in their homes. What prevented this from becoming a reality, apart from a few missing networking breakthroughs, was the last mile that the telcos controlled. All they had to do was withhold investment, ensuring sub-optimal throughput to consumer homes.
Qck analysis of passwords used 4 #PSN accnts shows ppl reusing 1 password 4 multiple sites; no surprise thr! http://bit.ly/jVu1f0 #security
Recently, there have been several announcements (five new offerings in May 2011 alone) from established vendors such as IBM, EMC, NetApp and others around their commitment and support for Hadoop. Of course, apart from the software and hardware bundles, one can go to Cloudera to license a supported version and build one’s own infrastructure around it.
So what is the impact on the current vendor ecosystem? In some ways it is analogous to when Oracle introduced a supported version of Linux (then branded ‘Unbreakable Linux’). In a nutshell, the aim was to strengthen the relationship with the end customer while maximizing the land grab in terms of IT real estate. It did not matter whose Linux distribution a client was running, as long as it was not Windows. That meant more money for Oracle licenses while mitigating the likelihood of SQL Server, Exchange, and .NET development tools gaining traction.
Similarly, this seems to be a play by the DW vendors to gain capabilities around the transformation and load dimensions of ETL. The extract portion is not as relevant in the Big Data paradigm, as most of the data sources (aka generators) use proprietary data stores or simply stream raw data into flat files.
This has some rather interesting ramifications for ETL vendors. Independent ones, such as Informatica, are launching products that address real-time user requirements (e.g., Ultra Messaging), while those that are part of a larger product suite (e.g., IBM’s InfoSphere Service Director) are attempting to add value by exposing existing enterprise data stores to more event-driven data consumers.
Overall, the traditional premium for ETL products is fading, and the marketplace is ripe for consolidation. For current enterprises, given Moore’s Law, most data movement that is facilitated via ETL can be handled by more real-time integration suites. For the boundary conditions and data sets in what is termed ‘Big Data’ (e.g., web clickstreams), Hadoop-centric tools will be more cost effective.
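As a minimal, hypothetical sketch of what ‘Hadoop-centric’ looks like in practice, the transform step can run as a Hadoop Streaming mapper directly over raw clickstream flat files; the tab-separated field layout below is assumed purely for illustration, and a companion reducer would sum the counts for bulk load into the warehouse, so no separate extract step is needed:

#!/usr/bin/env python
# Hadoop Streaming mapper: reads raw clickstream lines on stdin and
# emits url<TAB>1 pairs on stdout for a reducer to aggregate.
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) < 3:
        continue                          # skip malformed records
    timestamp, user_id, url = fields[:3]  # only the url is used in this toy example
    print("%s\t%d" % (url, 1))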
As data volumes continue to grow and decision cycle times shrink, batch approaches such as Hadoop will be replaced by frameworks and architectures that are more ‘real time’. Google already offers ‘real-time’ search. How ready is your enterprise for that?
#Docomo’s research project to leverage aggregate mobile location for #city planning and #disaster response http://bit.ly/lQGwwQ
#Chase and #Wells start issuing #EMV enabled CCs Yay! http://bit.ly/kMCcSO #security #chipandpin