With near-infinite capacity scaling and pay-per-use billing, the cloud is an ideal model for Big Data. Here is an overview of five Hadoop-in-the-cloud offerings.
In their 2015 forecast, IDC analysts estimate that 90% of the data generated by connected objects will be stored in the cloud. The cloud, with its ability to provision 10, 100, or 1,000 virtual machines on demand in a matter of seconds, lends itself perfectly to Big Data applications.
There is no longer any need to buy an oversized cluster of machines to handle the heaviest workloads. All the cloud heavyweights now have a Hadoop offering in their catalog, and a market for Big Data as a Service (BDaaS) or Hadoop as a Service (HaaS) has gradually taken shape. Preinstalled Hadoop nodes, automated cluster management tools: everything is ready to host the data and the processing algorithms to apply to it.
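To illustrate how little setup a HaaS offering requires, here is a minimal sketch using the AWS CLI to provision a managed Hadoop cluster on Amazon EMR. The cluster name, key pair, region, and instance sizes are hypothetical placeholders, not values from the article:

```shell
# Minimal sketch: provision a 3-node managed Hadoop cluster on Amazon EMR.
# All concrete values (name, key pair, instance types, region) are placeholders.
aws emr create-cluster \
  --name "bigdata-demo" \
  --release-label emr-5.36.0 \
  --applications Name=Hadoop \
  --instance-type m5.xlarge \
  --instance-count 3 \
  --ec2-attributes KeyName=my-keypair \
  --use-default-roles \
  --region eu-west-1
```

The provider takes care of installing the Hadoop nodes and orchestrating the cluster; once it is up, data and processing jobs can be submitted to it directly, which is the "everything is ready" promise described above.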
Flexibility and performance await you
In a recent study, Accenture assessed four deployment models for a Hadoop architecture: the bare-metal approach, where the company buys and manages its servers end to end; specialized servers (Hadoop appliances); hosted servers; and a 100% cloud-based Hadoop-as-a-Service platform. Based on three realistic usage scenarios, the study shows the superiority of the cloud model in terms of price/performance. The performance cost of running on virtualized machines in the cloud is offset by the fine-grained tuning that their administration interfaces allow.
Startups challenge the cloud giants
The first on-demand Hadoop offerings emerged several years ago, and the technology and interfaces are now mature enough to exploit this kind of infrastructure sustainably and efficiently. Amazon, Google, and Microsoft each have their own offering, Oracle is preparing to launch its own, and several startups, such as Altiscale and Qubole, have managed to raise tens of millions of dollars to launch their own HaaS services.