

No Memory required for Big Data

A 64-bit system can, in theory, address up to 16 exabytes of RAM, and machines with 128 GB of RAM or more are becoming common in this era of Cloud Computing and Big Data. Even so, Big Data sets are growing too large to fit into memory on even the most heavily provisioned machines, and in some cases they do not fit in RAM even after the workload is spread across a cluster. Researchers at MIT built a cluster called BlueDBM that uses Solid-State Drives (SSDs) to get around the memory problem. They also moved some of the computational power off the servers and onto chips attached to the flash storage. By pre-processing parts of the data on the flash drives before passing it back to the servers, these chips make distributed computation much more efficient, and they avoid the overhead of running a full operating system. Read more at: http://www.itworld.com/article/2947839/big-data/mit-comes-up-with-a-no-memory-solution-for-big-data.html
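
The general idea of pushing work to where the data sits can be illustrated with a short sketch. The Python example below is only a rough analogy, not BlueDBM's actual design (which runs the pre-processing on dedicated chips wired to the flash drives): a "storage-side" function filters records as they stream off disk, so the "server-side" function only aggregates a much smaller result instead of loading the whole data set into RAM. The file name events.csv and its columns are made up for the example.

# Illustrative sketch of near-data ("in-storage") pre-processing.
# Not BlueDBM's code; it only mimics the principle of filtering data
# where it is stored so the server receives a reduced stream.

import csv
from typing import Iterator

def storage_side_prefilter(path: str, min_value: float) -> Iterator[dict]:
    """Runs 'next to' the data: streams records from disk and yields only
    those matching the predicate, instead of shipping the whole file."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["value"]) >= min_value:
                yield row

def server_side_aggregate(records: Iterator[dict]) -> float:
    """The server only sees the pre-filtered stream and does the final math."""
    total = count = 0.0
    for r in records:
        total += float(r["value"])
        count += 1
    return total / count if count else 0.0

if __name__ == "__main__":
    # Write a tiny sample file so the sketch runs end to end.
    with open("events.csv", "w", newline="") as f:
        f.write("id,value\n1,50\n2,150\n3,200\n")
    avg = server_side_aggregate(storage_side_prefilter("events.csv", 100.0))
    print(f"Average of pre-filtered values: {avg:.2f}")  # -> 175.00

In BlueDBM the equivalent of storage_side_prefilter runs in hardware on the flash devices themselves, which is what removes both the data-movement cost and the operating-system overhead mentioned above.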
