SigmaWay Blog

SigmaWay Blog aggregates original and third-party content for site users. It covers articles on Process Improvement, Lean Six Sigma, Analytics, Market Intelligence, Training, IT Services, and the industries SigmaWay serves.

Banks Depend on Data

Organizations need databases to store data safely. Database technologies now range from NoSQL and RDBMS frameworks to in-memory databases, each suited to different problems. These help banks deliver faster response times and effective analysis, leading to better customer experience and retention. By using a middle layer on top of multiple databases, banks can rapidly assemble information. For more, read the article by Nanda Kumar (CEO, SunTec Business Solutions): https://www.finextra.com/blogposting/12478/making-data-work-for-banks
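The "middle layer" idea can be sketched as a thin façade that answers one query by gathering data from several backend stores. This is a hypothetical illustration (the store names and record layout are invented for the example), not SunTec's actual product:

```python
# Illustrative sketch of a middle layer fronting several data stores.
# Plain in-memory dicts stand in for an RDBMS, a NoSQL store, and a
# cache (hypothetical names and records).

class DataLayer:
    """Single query interface over multiple heterogeneous backends."""

    def __init__(self, backends):
        # backends: mapping of store name -> dict of customer_id -> record
        self.backends = backends

    def customer_profile(self, customer_id):
        # Merge whatever each store knows about the customer into one view.
        profile = {}
        for name, store in self.backends.items():
            record = store.get(customer_id)
            if record:
                profile.update(record)
        return profile

backends = {
    "core_rdbms": {"c42": {"name": "A. Banker", "branch": "Mumbai"}},
    "nosql_events": {"c42": {"last_login": "2016-03-01"}},
    "cache": {"c42": {"segment": "premium"}},
}
layer = DataLayer(backends)
print(layer.customer_profile("c42"))
```

The façade hides which backend holds which attribute, which is what lets new stores be added without changing the calling applications.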



Big data is good for databases

Mention databases and what comes to mind? Data management, storage, tables, RDBMS… But this is changing, and the reason is big data. Data has become so big that conventional methods and tools are no longer enough to manage it. So what to do? We need database technologies that can deal with big data. Hadoop is the most popular such technology: a data-centric platform that is highly scalable and used to run applications in parallel. Another alternative to RDBMS is the use of NoSQL platforms like MongoDB. Read the full article here: http://www.infoworld.com/article/3003647/database/how-big-data-is-changing-the-database-landscape-for-good.html
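The parallel, data-centric model Hadoop popularized can be sketched as a toy map/reduce word count. This is a single-process illustration of the programming model only; Hadoop itself distributes the map and reduce phases across a cluster:

```python
# Toy map/reduce word count illustrating the Hadoop programming model.
from collections import Counter
from itertools import chain

def map_phase(chunks):
    # Each "mapper" turns its chunk of text into (word, 1) pairs.
    return [[(word.lower(), 1) for word in chunk.split()] for chunk in chunks]

def reduce_phase(mapped):
    # The "reducer" sums the counts for each word across all mappers.
    totals = Counter()
    for word, count in chain.from_iterable(mapped):
        totals[word] += count
    return dict(totals)

chunks = ["big data needs big tools", "hadoop handles big data"]
counts = reduce_phase(map_phase(chunks))
print(counts["big"])  # -> 3
```

Because each mapper works only on its own chunk, the map phase can run on many machines at once, which is the source of Hadoop's scalability.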


Big Data for Small Businesses

There is a huge amount of data out there, and searching for ways to use it can seem daunting. Although very little of this data is immediately useful, more and more of it can be mined for useful patterns and information. This useful portion of the data is what "Big Data" refers to. For now, Big Data is a good way for a company to gain an advantage, but within the next ten years its use will be inevitable. Cloud computing and technologies like Hadoop and NoSQL give businesses and entrepreneurs the tools to analyze the right data sets. Experts on Big Data offer some useful tips on how to use this technology.
Know the problem you are trying to solve.
A better way to think about Big Data is as a tool to solve challenges. Once the problem is identified, the data can be used to find a solution.
Start small and grow.
It is a good idea to run a trial analysis and see if it solves the problem. If it doesn't, you haven't risked much; if it succeeds, you will come out with useful insights.
Choose the right data.
Although it can be difficult to find the right data, data should be combined from different sources to obtain the most useful sets.
Move fast.
Big Data is not only about analyzing information but also about acting on it in real time, so that all parts of the company are moving towards a common target and can make the most of the available data.
Read More at: http://www.forbes.com/sites/mikemontgomery/2015/05/07/small-businesses-shouldnt-fear-big-data/




Where the Data from IoT Disappears

When it comes to the Internet of Things (IoT), one of the biggest challenges is managing the data created by devices and its flow in and out of the system a developer creates. This data is vital for an efficient system. In the initial step, the data is created and enters the Internet. In the second step, a central system holds on to this data, organizing it constantly. In the third stage, old data is stored for future needs. NoSQL platforms like Cassandra are a suitable fit for IoT systems. Distributed database systems like Cassandra have no 'primary' server that controls the rest; each node in a cluster can handle incoming transactions. Read at: http://www.wired.com/2015/03/internet-things-data-go/
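The masterless design can be illustrated with a toy hash ring: each node owns a slice of a hash space, so any node can compute where a key belongs without asking a primary server. This is a simplified model for illustration only (real Cassandra clusters use configurable partitioners and virtual nodes), not the driver API:

```python
# Toy hash-ring sketch of masterless key placement, Cassandra-style.
import hashlib

NODES = ["node-a", "node-b", "node-c"]

def ring_position(key):
    # Deterministic position on a 0 .. 2^32 ring derived from the key.
    return int(hashlib.md5(key.encode()).hexdigest(), 16) % (2**32)

def owner(key, nodes=NODES):
    # Each node owns an equal contiguous slice of the ring.
    slice_size = 2**32 // len(nodes)
    return nodes[min(ring_position(key) // slice_size, len(nodes) - 1)]

for sensor_id in ["sensor-1", "sensor-2", "sensor-3"]:
    print(sensor_id, "->", owner(sensor_id))
```

Because placement is a pure function of the key, every node reaches the same answer independently, which is why there is no single point of failure for routing writes.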


Choosing the Right Database

Today's databases should be flexible, deliver extreme performance, and handle humongous data volumes. So database architects have come up with NoSQL and NewSQL alternatives to relational database management systems (RDBMS). To choose among these three, a fundamental understanding of all three technologies is needed. RDBMS can handle thousands of transactions per second, but the new face of online transaction processing (OLTP) in scenarios such as real-time advertising, fraud detection, multi-player games, and risk analysis, to name a few, involves close to a million transactions per second -- a pace that traditional RDBMS has problems dealing with. These problems can be addressed by NoSQL and NewSQL. NoSQL database management systems store data in a variety of formats; most NoSQL products give up ACID guarantees to achieve data storage flexibility. NewSQL systems retain both SQL and ACID but overcome the performance overhead of RDBMS. To choose the right type of database, the following questions have to be answered:
To what extent do you rely on data in terms of storage, processing, and analysis?
How important are the scale, flexibility, and performance aspects of a DBMS?
What is your level of investment in incumbent technologies?
Read more at:
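The ACID guarantee that RDBMS and NewSQL retain can be seen concretely with SQLite, which ships with Python: a transfer that fails partway through rolls back atomically, so the data never ends up half-updated. A minimal sketch with an invented accounts table:

```python
# ACID atomicity demo: a failed transfer rolls back as a whole.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])
conn.commit()

def transfer(conn, src, dst, amount):
    try:
        with conn:  # transaction: commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                         (amount, src))
            cur = conn.execute("SELECT balance FROM accounts WHERE id = ?", (src,))
            if cur.fetchone()[0] < 0:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                         (amount, dst))
    except ValueError:
        pass  # the whole transaction was rolled back

transfer(conn, "alice", "bob", 500)  # fails: alice only has 100
print(conn.execute("SELECT balance FROM accounts WHERE id='alice'").fetchone()[0])  # -> 100
```

Many NoSQL stores relax exactly this guarantee in exchange for flexible schemas and horizontal scale, which is the trade-off the article asks you to weigh.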



How can Banks utilize Big Data?

Proficiency in Big Data provides a competitive advantage to banks. Banks have too often depended on traditional techniques such as aggregation and normalization of data, which result in several weaknesses, such as a lack of flexibility in responding to upstream and downstream data changes. Data lineage may be lost after aggregation and summarization, and data governance is likely weakened when several constituents retain responsibility for an extended, multi-stage data flow. These weaknesses are detrimental to the success of big data initiatives, so a new approach is required. Big data represents a new way for banks to interact with and leverage their data. As a result, banks need to shift the paradigm for designing, developing, deploying, and maintaining big data solutions, with new approaches to data storage (e.g., NoSQL databases) and mature distributed-computation frameworks (e.g., Hadoop). The approach to Big Data implementation also needs to change, through rapid, iterative, and incremental deployment of solutions that align with the speed at which the underlying data are measured, understood, and parsed. This will take banks to an acceptable level of competency and capability. Read more at:
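The lineage problem mentioned above can be made concrete with a toy example: a plain aggregate throws away where each number came from, while a lineage-aware aggregate keeps the source references alongside the summary. The record layout and system names here are invented for illustration:

```python
# Toy illustration of data lineage lost vs. preserved during aggregation.

def aggregate_plain(records):
    # Traditional summarization: lineage to source systems is discarded.
    return sum(r["amount"] for r in records)

def aggregate_with_lineage(records):
    # Keep the summary and the provenance of every contributing record.
    return {
        "total": sum(r["amount"] for r in records),
        "sources": [(r["system"], r["id"]) for r in records],
    }

records = [
    {"system": "core_banking", "id": "t1", "amount": 250},
    {"system": "cards", "id": "t7", "amount": 120},
]
print(aggregate_with_lineage(records))
```

With lineage attached, a downstream user of the total can trace any figure back to its originating system, which is the governance property the traditional multi-stage flows tend to lose.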



Apache Cassandra gets in-memory option with DataStax Enterprise 4.0

DataStax has updated its Apache Cassandra-based NoSQL database. An in-memory computing feature has been added to increase the performance of online applications. DataStax Enterprise 4.0 includes improved search, an updated version of the OpsCenter visual monitoring tool, and a certified version of Cassandra 2.0. Performance benchmarks conducted by DataStax on the in-memory feature show significant improvements in speed. To know more on this topic, go through the article by Toby Wolpe, senior reporter at ZDNet in London:



Cassandra 2.0: The next generation of big data

Facebook released Cassandra, its NoSQL big data distributed data store, to open source. It can power massive data sets quickly and reliably without compromising performance. The latest version includes several new features and improvements. To know more about Cassandra 2.0, follow Steven J. Vaughan-Nichols's article:

