
SigmaWay Blog

SigmaWay Blog aggregates original and third-party content for site users. It features articles on Process Improvement, Lean Six Sigma, Analytics, Market Intelligence, Training, IT Services, and the industries that SigmaWay serves.

Fraud Detection With The Help Of Data Analysis

Fraud is one of the major issues in today's business world. The cost per dollar of fraud losses was found to have risen 8% over the previous year, and online retailers were the worst hit, as fraudulent transactions grew 32.1% in 2015. Today, data analysis helps rescue organizations from fraud by allowing companies to create fraud risk profiles and then use existing data to identify potentially fraudulent activity. This article explores the data analysis techniques commonly used to uncover fraudulent transactions: pattern recognition, outlier detection, regression analysis, semantic modeling, neural networks, and anomaly detection. Read more at: http://it.toolbox.com/blogs/itmanagement/how-to-detect-fraud-using-data-analysis-74726
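As a rough illustration of the outlier-detection technique named above, here is a minimal sketch with hypothetical transaction amounts and an illustrative z-score threshold (z-scores are only one of many possible methods; the article does not prescribe one):

```python
import statistics

def flag_outliers(amounts, z_threshold=2.5):
    """Flag transactions whose amount deviates strongly from the mean.

    Returns the indices of amounts whose |z-score| exceeds the threshold.
    """
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    if stdev == 0:
        return []
    return [i for i, amount in enumerate(amounts)
            if abs(amount - mean) / stdev > z_threshold]

# Mostly routine purchases, plus one suspiciously large transaction.
transactions = [25.0, 30.5, 27.0, 22.0, 31.0, 29.5, 26.0, 5000.0, 28.0, 24.5]
print(flag_outliers(transactions))  # → [7]
```

A single extreme value inflates the standard deviation and can mask itself at stricter thresholds, which is why robust alternatives such as median absolute deviation are often preferred in practice.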

 


Data Mining and its Importance

Data mining sounds like a monotonous activity performed on a pile of information, requiring little oversight. However, in the words of Professor Uwe Aickelin of the University of Nottingham, it is "a discipline that blurs the lines between artificial intelligence, machine learning, statistics and other cutting-edge disciplines to unearth the golden nuggets that lurk within data." He explains that data mining is the effort to extract valuable information from unstructured or 'messy' data. Where statistics alone fails to recognize patterns, evolutionary computation and machine learning are required. Industries have begun to understand the need to make sense of the large amounts of data out there, and data mining is more important now than ever. Read more at: http://www.gizmodo.com.au/2015/05/why-data-mining-is-so-important/


How casinos are betting on big data

Billions of dollars are lost by gamblers every year along the Vegas Strip, but some casino operators are taking strides to soften the blow of serious gambling losses and are leveraging big data to keep customers coming back, according to one executive. "They could win a lot or they lose a lot or they could have something in the middle. So we do try to make sure that people don't have really unfortunate visits," said Caesars Entertainment Chairman and CEO Gary Loveman on Big Data Download. Caesars and other casino operators offer loyalty programs. As gamblers spend, companies gather data on those spending trends. Customers also receive tailored incentives for gambling and spending.

"We give you very tangible and immediate benefits for doing so. So we give you meals, and hotel rooms and limousines and show tickets. You share with us information on what you've been doing, what sorts of transactions you've made," said Loveman, whose company is the biggest U.S. casino operator.

Caesars employs about 200 data experts at its Flamingo Hotel alone. They scour through data on the types of games customers have played, which hotels they have stayed at, and where they have been dining. So the next time you visit a casino, expect a suddenly friendlier slot machine after you hit a losing streak.

Read the complete report here:  http://www.cnbc.com/id/101027330


An overview of Text Mining

Text mining, sometimes referred to as "text analytics" or text data mining, is one way to make qualitative or "unstructured" data usable by a computer: it is the process of deriving high-quality information from text. High-quality information is typically derived by devising patterns and trends through means such as statistical pattern learning. Text mining usually involves structuring the input text, deriving patterns within the structured data, and finally evaluating and interpreting the output. 'High quality' in text mining usually refers to some combination of relevance, novelty, and interestingness. Typical text mining tasks include text categorization, text clustering, concept/entity extraction, and sentiment analysis. Text analysis involves information retrieval, analysis of word frequency distributions, pattern recognition, tagging, information extraction, data mining techniques including link and association analysis, visualization, and predictive analytics. The main goal is, essentially, to turn text into data for analysis via natural language processing (NLP) and analytical methods. To read more about text mining: http://www.scientificcomputing.com/blogs/2014/01/text-mining-next-data-frontier.
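The first two steps described above, structuring the input text and deriving a pattern from it, can be sketched in a few lines. This is a toy example (the stopword list and sample sentence are hypothetical) that computes a word-frequency distribution, the simplest pattern mentioned in the entry:

```python
import re
from collections import Counter

# A tiny illustrative stopword list; real systems use much larger ones.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "is", "in", "for", "on", "into"}

def tokenize(text):
    """Structure raw text: lowercase, strip punctuation, drop stopwords."""
    words = re.findall(r"[a-z']+", text.lower())
    return [w for w in words if w not in STOPWORDS]

def term_frequencies(text, top_n=5):
    """Derive a pattern: the most frequent terms in the structured text."""
    return Counter(tokenize(text)).most_common(top_n)

doc = ("Fraud detection relies on data. Data analysis turns raw data "
       "into patterns, and patterns into decisions.")
print(term_frequencies(doc, top_n=2))  # → [('data', 3), ('patterns', 2)]
```

Evaluation and interpretation, the final step, would then judge whether the extracted terms are relevant, novel, and interesting for the task at hand.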


Extracting insights from mobile data

Mobile phones serve a dual purpose in the context of Big Data. Each mobile phone, non-smartphones included, creates numerous types of data every day: call detail records, SMS data, and geo-location data. Smartphones also generate log data via mobile applications, financial transaction data from mobile banking and shopping, and social media data from updates to Facebook, Twitter, and other social networks. The volume of mobile data, and the velocity at which it is created, will only grow as the global population and mobile phone penetration rates rise and the use of social media expands. When analyzed effectively, this data can give insight into customer opinion, behavior, and even physical movement patterns. Because of the sheer number of mobile phones in use, big data specialists can tap mobile analytics to better see such patterns across large populations and sub-segments of customers, to improve engagement strategies and the delivery of services. The data becomes especially valuable for analytic purposes when combined with external data sources, such as weather and economic data, which allow analysts to relate macro-level trends to targeted sub-segments of customers. To read more: http://wikibon.org/blog/the-dual-role-of-mobile-devices-for-big-data/
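To make the idea of extracting movement and activity patterns from call detail records concrete, here is a minimal sketch over entirely hypothetical CDRs (the record layout, tower names, and timestamps are invented for illustration):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical call detail records (CDRs): caller, cell tower, timestamp.
cdrs = [
    ("user_a", "tower_1", "2015-06-01 08:15"),
    ("user_b", "tower_1", "2015-06-01 08:40"),
    ("user_a", "tower_2", "2015-06-01 12:05"),
    ("user_c", "tower_1", "2015-06-01 08:55"),
]

def calls_per_tower_hour(records):
    """Aggregate call volume by (tower, hour): a basic activity pattern."""
    counts = defaultdict(int)
    for _, tower, ts in records:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
        counts[(tower, hour)] += 1
    return dict(counts)

print(calls_per_tower_hour(cdrs))
# → {('tower_1', 8): 3, ('tower_2', 12): 1}
```

Joining such aggregates against an external source, for example hourly weather observations keyed by the same hour, is what lets analysts relate macro-level trends to specific customer sub-segments.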


Analytics to combat fraudsters

Fraudsters are more competent, better organized, and more inventive than ever before. Their fraud schemes involve complex networks of individuals, accounts, and events. The evidence for these schemes may exist on multiple systems, span various data types, and deliberately conceal the underlying activity, so an analyst has abundant investigative leads across these systems but no straightforward way to join the data or the results. To prevent and uncover fraud, one needs a solution that is more sophisticated than the fraudsters. A basic step in fraud detection analytics is visualizing the patterns in your data between people, places, systems, and events. These data mining and deep-analysis capabilities provide more context and better information, enabling more accurate data segmentation and labeling, which further improves pattern recognition. To read more about it: http://www.21ct.com/solutions/fraud-detection-analytics/.
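One common way to surface the patterns between people, accounts, and events described above is to link records that share attributes and then group them into connected components. Here is a minimal sketch over hypothetical accounts (the attribute values and the union-find grouping are illustrative, not the vendor's method):

```python
from collections import defaultdict

# Hypothetical accounts and the attributes (phone, address) each one uses.
accounts = {
    "acct_1": {"phone:555-0101", "addr:12 Oak St"},
    "acct_2": {"phone:555-0101", "addr:99 Elm St"},
    "acct_3": {"addr:12 Oak St"},
    "acct_4": {"phone:555-0202"},
}

def fraud_rings(accts):
    """Group accounts linked by any shared attribute (connected components)."""
    by_attr = defaultdict(list)
    for acct, attrs in accts.items():
        for attr in attrs:
            by_attr[attr].append(acct)

    # Union-find over accounts, with path halving.
    parent = {a: a for a in accts}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    for linked in by_attr.values():
        for other in linked[1:]:
            parent[find(linked[0])] = find(other)

    rings = defaultdict(set)
    for a in accts:
        rings[find(a)].add(a)
    return [ring for ring in rings.values() if len(ring) > 1]

print(fraud_rings(accounts))
# One ring: acct_1, acct_2, acct_3 linked via shared phone/address; acct_4 stands alone.
```

In a visualization tool, each ring would appear as a cluster of nodes joined through shared phone numbers or addresses, which is exactly the kind of context that improves segmentation and labeling.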
