SigmaWay Blog

SigmaWay Blog aggregates original and third-party content for site users. It features articles on Process Improvement, Lean Six Sigma, Analytics, Market Intelligence, Training, IT Services, and the industries SigmaWay caters to.

No Memory required for Big Data

A 64-bit system can address up to 16 exabytes of RAM, and machines with 128 GB of RAM or more are becoming common in this era of Cloud Computing and Big Data. Even so, Big Data sets are outgrowing what even heavily provisioned machines can hold: in some cases the data no longer fits in RAM even after it is spread across a cluster. Researchers at MIT built a cluster called BlueDBM that uses Solid-State Drives (SSDs) to sidestep the memory problem. They also moved some of the computational power off the servers and onto chips attached to the flash drives. By pre-processing parts of the data on the flash drives before passing the results back to the servers, these chips make distributed computation far more efficient, and because they run no operating system, they avoid that overhead as well. Read more at:
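The near-data processing idea described above can be illustrated with a toy sketch (all names here are hypothetical illustrations, not the actual BlueDBM code): each storage node filters its shard of records locally, so only the matching rows ever travel to the server.

```python
# Toy illustration of near-data processing: storage nodes pre-filter
# their shards, and the server only aggregates the filtered results.

def storage_node_scan(records, predicate):
    """Runs on (or near) the flash device: filter before transferring."""
    return [r for r in records if predicate(r)]

def server_query(shards, predicate):
    """The server merges the already-filtered partial results."""
    results = []
    for shard in shards:
        results.extend(storage_node_scan(shard, predicate))
    return results

# Example: three storage nodes, each holding a shard of (id, value) rows.
shards = [
    [(1, 5), (2, 42)],
    [(3, 7), (4, 99)],
    [(5, 42), (6, 1)],
]
hits = server_query(shards, lambda row: row[1] > 40)
# Only the three matching rows cross the network, not all six.
```

The design choice being simulated is where the filter runs: pushing it to the storage node shrinks the data transferred to the server, which is the efficiency gain the MIT researchers targeted.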

Wednesday, 12 August 2020
