Rose Technologies

CIO Technology Priorities

11/8/2012

[Image: CIO technology priorities]

Percolator, Dremel and Pregel: Alternatives to Hadoop

7/18/2012

Hadoop (a framework where code is turned into map and reduce jobs, and Hadoop runs the jobs) is great at crunching data, yet inefficient for analyzing data, because each time you add, change or manipulate data you must stream over the entire dataset.

In most organizations data is always growing, changing, and being manipulated, so the time needed to analyze it increases significantly.

As a result, to process large and diverse data sets, ad-hoc analytics or graph data structures, there must be better alternatives to Hadoop / MapReduce.

Google (whose MapReduce and GFS papers inspired Hadoop) thought so and architected a better, faster data-crunching ecosystem that includes Percolator, Dremel and Pregel. Google is one of the key innovators in large-scale architecture.
Percolator is a system for incrementally processing updates to a large data set. By replacing a batch-based indexing system with an indexing system based on incremental processing using Percolator, you significantly speed up the process and reduce the time to analyze data.

Percolator's architecture provides horizontal scalability and resilience. It reduces indexing latency (the time between crawling a page and its availability in the index) by a factor of 100 and simplifies the algorithm. The big advantage of Percolator is that indexing time is now proportional to the size of the page being indexed, not to the size of the whole existing index.
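The contrast with batch reindexing can be sketched with a toy inverted index (illustrative Python only; the function names and data structures are invented for this example and are not Percolator's actual API):

```python
# Hedged sketch: batch vs. incremental (Percolator-style) indexing.
# All names here are illustrative, not Percolator's API.

documents = {}   # doc_id -> text
index = {}       # word -> set of doc_ids containing it

def batch_reindex():
    """MapReduce-style: rebuild the whole index from the full corpus,
    so cost grows with the size of the entire dataset."""
    index.clear()
    for doc_id, text in documents.items():
        for word in text.split():
            index.setdefault(word, set()).add(doc_id)

def on_document_changed(doc_id, new_text):
    """Percolator-style trigger: fired per change, touches only the
    changed document, so cost is proportional to the page size."""
    old_text = documents.get(doc_id, "")
    for word in set(old_text.split()):
        index.get(word, set()).discard(doc_id)
    documents[doc_id] = new_text
    for word in set(new_text.split()):
        index.setdefault(word, set()).add(doc_id)
```

The incremental path never touches the rest of the corpus, which is the property the paragraph above describes.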

See: http://research.google.com/pubs/pub36726.html
Dremel is for ad hoc analytics: a scalable, interactive ad-hoc query system for analysis of read-only nested data. By combining multi-level execution trees with a columnar data layout, it can run aggregation queries over trillion-row tables in seconds, about 100 times faster than MapReduce, and it scales to thousands of CPUs and petabytes of data.

Dremel fills a role similar to Pig and Hive. Yet while Hive and Pig rely on MapReduce for query execution, Dremel uses a query execution engine based on aggregator trees.
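A toy illustration of why a columnar layout speeds up aggregation (illustrative Python, not Dremel's actual storage format): the aggregation only has to scan the one column it needs.

```python
# Hedged sketch: row layout vs. column layout for an aggregation query.
# The data and field names are invented for the example.
rows = [
    {"country": "US", "latency_ms": 120},
    {"country": "DE", "latency_ms": 95},
    {"country": "US", "latency_ms": 101},
]

# Row layout: every field of every row is read to sum one column.
total_row = sum(r["latency_ms"] for r in rows)

# Column layout: the same data stored column by column; the query
# scans only the queried column and skips "country" entirely.
columns = {"country": ["US", "DE", "US"],
           "latency_ms": [120, 95, 101]}
total_col = sum(columns["latency_ms"])
```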

See: http://research.google.com/pubs/pub36632.html
Pregel is a system for large-scale graph processing and graph data analysis. It is designed to execute graph algorithms faster with simpler code: it computes over large graphs much faster than alternatives, and its application programming interface is easy to use.

Pregel is architected for efficient, scalable and fault-tolerant implementation on clusters of thousands of commodity computers, and its synchronous superstep model makes reasoning about programs easier. Distribution-related details are hidden behind an abstract API. The result is a framework for processing large graphs that is expressive and easy to program.

See: http://kowshik.github.com/JPregel/pregel_paper.pdf
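The vertex-centric, superstep-driven model can be sketched in a few lines. This is an illustrative Python simulation of the maximum-value example from the Pregel paper (every vertex ends up holding the largest value in the graph); it is not Pregel's actual API.

```python
# Hedged sketch of Pregel-style computation: vertices exchange
# messages in synchronous supersteps until no messages remain.
def pregel_max(values, edges):
    """values: vertex_id -> int; edges: vertex_id -> list of neighbors."""
    # Superstep 0: every vertex sends its value to its neighbors.
    messages = {}
    for v, val in values.items():
        for n in edges.get(v, []):
            messages.setdefault(n, []).append(val)
    # Later supersteps: a vertex updates itself only when it learns
    # a larger value, then propagates it; otherwise it stays silent.
    while messages:
        outbox = {}
        for v, incoming in messages.items():
            new_val = max(incoming)
            if new_val > values[v]:
                values[v] = new_val
                for n in edges.get(v, []):
                    outbox.setdefault(n, []).append(new_val)
        messages = outbox
    return values
```

The barrier between supersteps (here, swapping `messages` for `outbox`) is the "implied synchronicity" that makes these programs easy to reason about.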

Hadoop Internal Software Architecture

7/17/2012

[Diagram: Hadoop internal software architecture]
The main components include:
  • Hadoop. A Java software framework to support data-intensive distributed applications 
  • ZooKeeper. A highly reliable distributed coordination service 
  • MapReduce. A flexible parallel data processing framework for large data sets 
  • HDFS. The Hadoop Distributed File System 
  • Oozie. A workflow scheduler for Hadoop (including MapReduce) jobs 
  • HBase. A distributed key-value / column-family database 
  • Hive. A high-level, SQL-like language built on top of MapReduce for analyzing large data sets 
  • Pig. Enables the analysis of large data sets using Pig Latin, a high-level language compiled into MapReduce for parallel data processing. 
See Hadoop Documentation: http://bit.ly/LqkJTP

Modern BI Architecture & Analytical Ecosystems

7/11/2012

[Diagram: modern BI architecture and analytical ecosystem]
The goal is to design and build a data warehouse / business intelligence (BI) architecture that provides a flexible, multi-faceted analytical ecosystem for each unique organization.

A traditional BI architecture has analytical processing first pass through a data warehouse. 

In the new, modern BI architecture, data reaches users through a multiplicity of organization data structures, each tailored to the type of content it contains and the type of user who wants to consume it.

The data revolution (big and small data sets) provides significant improvements. New tools like Hadoop allow organizations to cost-effectively consume and analyze large volumes of semi-structured data. In addition, the modern architecture complements traditional top-down data delivery methods with more flexible, bottom-up approaches that promote predictive or exploratory analytics and rapid application development.

In the above diagram, the objects in blue represent traditional data architecture. Objects in pink represent the new modern BI architecture, which includes Hadoop, NoSQL databases, high-performance analytical engines (e.g. analytical appliances, MPP databases, in-memory databases), and interactive, in-memory visualization tools.

Most source data now flows through Hadoop, which primarily acts as a staging area and online archive. This is especially true for semi-structured data, such as log files and machine-generated data, but also for some structured data that cannot be cost-effectively stored and processed in SQL engines (e.g. call center records). 

From Hadoop, data is fed into a data warehousing hub, which often distributes data to downstream systems, such as data marts, operational data stores, and analytical sandboxes of various types, where users can query the data using familiar SQL-based reporting and analysis tools.

Today, data scientists analyze raw data inside Hadoop by writing MapReduce programs in Java and other languages. In the future, users will be able to query and process Hadoop data using familiar SQL-based data integration and query tools.
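The staging-then-warehouse flow described above can be sketched end to end. This is a hedged Python illustration: the log format, records and table schema are invented for the example, with a parsing step standing in for Hadoop's staging role and an in-memory SQLite database standing in for the warehouse.

```python
# Hedged sketch of the flow above: semi-structured machine logs are
# refined in a staging step, then loaded into a SQL engine where
# analysts can use familiar SQL-based query tools.
import sqlite3

raw_logs = [
    "2012-07-01 GET /home 200",
    "2012-07-01 GET /pricing 404",
    "2012-07-02 GET /home 200",
]

# Staging step (Hadoop's role): turn raw lines into structured records.
records = []
for line in raw_logs:
    day, method, path, status = line.split()
    records.append((day, method, path, int(status)))

# Warehouse step: load the refined records and query them with SQL.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE hits (day TEXT, method TEXT, path TEXT, status INT)")
db.executemany("INSERT INTO hits VALUES (?, ?, ?, ?)", records)
rows = db.execute(
    "SELECT path, COUNT(*) FROM hits WHERE status = 200 GROUP BY path"
).fetchall()
```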

The modern BI architecture can analyze large volumes and new sources of data and is a significantly better platform for data alignment, consistency and flexible predictive analytics.

Thus, the new BI architecture provides a modern analytical ecosystem featuring both top-down and bottom-up data flows that meet all requirements for reporting and analysis.

In the top-down world, source data is processed, refined, and stamped with a predefined data structure--typically a dimensional model--and then consumed by casual users using SQL-based reporting and analysis tools. In this domain, IT developers create data and semantic models so business users can get answers to known questions and executives can track performance of predefined metrics. Here, design precedes access. The top-down world also takes great pains to align data along conformed dimensions and deliver clean, accurate data. The goal is to deliver a consistent view of the business entities so users can spend their time making decisions instead of arguing about the origins and validity of data artifacts.

Creating a uniform view of the business from heterogeneous sets of data is not easy. It takes time, money, and patience, often more than most department heads and business analysts are willing to tolerate. They often abandon the top-down world for the underworld of spreadmarts and data shadow systems. Using whatever tools are readily available and cheap, these data-hungry users create their own views of the business. Eventually, they spend more time collecting and integrating data than analyzing it, undermining both their productivity and a consistent view of business information.

The bottom-up world is a different process. The modern BI architecture creates an analytical ecosystem that brings prodigal data users back into the fold. It allows an organization to perform true ad hoc exploration (predictive or exploratory analytics) and promotes the rapid development of analytical applications using in-memory departmental tools. In a bottom-up environment, users can't anticipate the questions they will ask on a daily or weekly basis or the data they'll need to answer those questions. Often, the data they need doesn't yet exist in the data warehouse.

The modern BI architecture creates analytical sandboxes that let power users explore corporate and local data on their own terms. These sandboxes include Hadoop, virtual partitions inside a data warehouse, and specialized analytical databases that offload data or analytical processing from the data warehouse or handle new untapped sources of data, such as Web logs or machine data. The new environment also gives department heads the ability to create and consume dashboards built with in-memory visualization tools that point both to a corporate data warehouse and other independent sources.

Combining top-down and bottom-up worlds is challenging but doable with determined commitment. 

BI professionals need to guard data semantics while opening access to data. 

Business users need to commit to adhering to data standards. 

Further, well designed data governance programs are an absolute requirement.

Data Flow and Processes Compared

7/10/2012

[Image: data flow and processes compared]

Getting Value from Big Data

7/7/2012

[Audio: getting_value_from_big_data.mp3 (12,578 KB)]

Gartner's Yvonne Genovese reviews the popular term "Big Data" and why IT Leaders should act now.

Pattern-Based Strategy: Getting Value from Big Data

"Big data" refers to the growth in the volume of data in organizations. Understanding how to use Pattern-Based Strategy to seek, model and adapt to patterns contained in big data will be a critical IT and business skill.

Big Data Value Potential Index

7/5/2012

[Image: Big Data value potential index]
New academic research suggests that companies using this kind of “big data” and business analytics to guide their decisions are more productive and have higher returns on equity than competitors that do not. As big data changes the game for virtually all industries, it will tilt the playing field, favoring some over others. 
Top Benefits of Analytics

Analytics is about having the right information and insight to create better business outcomes. Business analytics means leaders know where to find new revenue opportunities and which product or service offerings are most likely to address market requirements. It means the ability to quickly access the right data points to evaluate key performance and revenue indicators when building successful growth strategies. And it means recognizing regulatory, reputational, and operational risks before they become realities.

1) Having the knowledge you need: Analytics delivers insightful information in context so decision makers have the right information where, when and how they need it.

2) Making better, faster decisions: Analytics provides decision makers throughout the organization with the interactive, self-service environment needed for exploration and analysis.

3) Optimizing business performance: Analytics enables decision makers to easily measure and monitor financial and operational business performance, analyze results, predict outcomes and plan for better business results.

4) Uncovering new business opportunities: Analytics delivers new insights that help the organization maximize customer and product profitability, minimize customer churn, detect fraud and increase campaign effectiveness.

What is Big Data?

7/3/2012

Big data is a term applied to data sets whose size is beyond the ability of commonly used software tools to capture, manage, and process the data within a tolerable elapsed time. Big data is a popular term used to describe the exponential growth, availability and use of information, both structured and unstructured.  

Technologies today not only support the collection and storage of large amounts of data, they provide the ability to understand and take advantage of its full value, which helps organizations run more efficiently and profitably. For instance, with big data and big data analytics, it is possible to:

  • Analyze millions of SKUs to determine optimal prices that maximize profit and clear inventory.
  • Recalculate entire risk portfolios in minutes and understand future possibilities to mitigate risk.
  • Mine customer data for insights that drive new strategies for customer acquisition, retention, campaign optimization and next best offers.
  • Quickly identify customers who matter the most.
  • Generate retail coupons at the point of sale based on the customer's current and past purchases, ensuring a higher redemption rate.
  • Send tailored recommendations to mobile devices at just the right time, while customers are in the right location to take advantage of offers.
  • Analyze data from social media to detect new market trends and changes in demand.
  • Use clickstream analysis and data mining to detect fraudulent behavior.
  • Determine root causes of failures, issues and defects by investigating user sessions, network logs and machine sensors.

Until recently, organizations have been limited to using subsets of their data, or they were constrained to simplistic analyses because the sheer volumes of data overwhelmed their processing platforms. What is the point of collecting and storing terabytes of data if you can't analyze it in full context, or if you have to wait hours or days to get results? On the other hand, not all business questions are better answered by bigger data.

A number of recent technology advancements are enabling organizations to make the most of big data and big data analytics:

  • Cheap, abundant storage and server processing capacity.
  • Faster processors.
  • Affordable large-memory capabilities and distributed processing frameworks such as Hadoop.
  • New storage and processing technologies designed specifically for large data volumes, including unstructured data.
  • Parallel processing, clustering, MPP, virtualization, large grid environments, high connectivity and high throughputs.
  • Cloud computing and other flexible resource allocation arrangements.

Big data technologies not only support the ability to collect large amounts of data, they provide the ability to understand it and take advantage of its value. The goal of all organizations with access to large data collections should be to harness the most relevant data and use it for optimized decision making.

• As much as 80% of the world's data is now in unstructured formats, much of it created and held on the web. This data is increasingly associated with cloud-based services used outside enterprise IT. The part of Big Data expected to drive explosive growth and new value is the unstructured data arising mostly from these external sources.

• Data sets are growing at a staggering pace, expected to grow by 100% every year for at least the next 5 years. 

• Most of this data is unstructured or semi-structured, generated by servers, network devices, social media, and distributed sensors. 

• "Big Data" refers to such data because the volume (petabytes and exabytes), the type (semi- and unstructured, distributed), and the speed of growth (exponential) make traditional data storage and analytics tools insufficient and cost-prohibitive. 

• An entirely new set of processing and analytic systems is required for Big Data; Apache Hadoop is one example of a Big Data processing system that has gained significant popularity and acceptance.

• According to a recent McKinsey Big Data report, Big Data can provide up to USD $300 billion annual value to the US Healthcare industry, and can increase US retail operating margins by up to 60%. It’s no surprise that Big Data analytics is quickly becoming a critical priority for large enterprises across all verticals.

Big data characteristics

Volume: there is a lot of data to be analyzed and/or the analysis is extremely intense; either way, a lot of hardware is needed.

Variety: the data is not organized into simple, regular patterns as in a table; rather text, images and highly varied structures—or structures unknown in advance—are typical.

Velocity: the data comes into the data management system rapidly and often requires quick analysis or decision making.

Drivers 

Volume, variety, velocity, and complexity of incoming data streams

Growth of the "Internet of Things" results in an explosion of new data 

Commoditization of inexpensive terabyte-scale storage hardware makes storage less costly, so why not store it?

Increasingly, enterprises need to store non-traditional and unstructured data in a way that is easily queried

Desire to integrate all the data into a single source

The power of compression

Challenges

Data comes from many different sources (enterprise apps, web, search, video, mobile, social conversations and sensors) 

All of this information has been getting increasingly difficult to store in traditional relational databases and even data warehouses

Unstructured or semi-structured text is difficult to query. How does one query a table with a billion rows?

Culture, skills, and business processes

Conceptual Data Modeling

Data Quality Management

Implications

Emerging capabilities to process vast quantities of structured and unstructured data are bringing about changes in technology and business landscapes.

As data sets get bigger and the time allotted to their processing shrinks, look for ever more innovative technology to help organizations glean the insights they'll need to face an increasingly data-driven future.

What is Hadoop?

The most well-known technology used for Big Data is Hadoop. It was inspired by Google's publications on MapReduce, GoogleFS and BigTable. Because Hadoop can be hosted on commodity hardware (usually Linux servers with one or two CPUs and a few TB of disk, without any RAID replication technology), it allows organizations to store huge quantities of data (petabytes or even more) at very low cost compared to SAN systems.

Hadoop is an open-source implementation of Google's MapReduce framework. It is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. It is part of the Apache project sponsored by the Apache Software Foundation: http://hadoop.apache.org/. 

The Hadoop “brand” contains many different tools. Two of them are core parts of Hadoop:

Hadoop Distributed File System (HDFS) is a virtual file system that looks like any other file system, except that when you store a file on HDFS, the file is split into many blocks, and each block is replicated and stored on three servers by default (the replication factor is configurable) for fault tolerance.
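The splitting-and-replication behavior can be sketched as follows. This is a toy illustration only: the function, block size and node names are invented, and real HDFS blocks are tens of megabytes (64 MB by default in early versions), with rack-aware rather than round-robin replica placement.

```python
# Hedged sketch of HDFS-style block splitting and replication.
def place_blocks(file_bytes, block_size, datanodes, replication=3):
    """Split a file into fixed-size blocks and assign each block's
    replicas to distinct datanodes (naive round-robin placement)."""
    placements = []
    for offset in range(0, len(file_bytes), block_size):
        block = file_bytes[offset:offset + block_size]
        replicas = [datanodes[(offset // block_size + r) % len(datanodes)]
                    for r in range(replication)]
        placements.append((block, replicas))
    return placements
```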

Hadoop MapReduce is a way to split every request into smaller requests that are sent to many small servers, allowing a truly scalable use of CPU power.
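The model is easiest to see in the canonical word-count example, simulated here in plain Python. Real Hadoop jobs are written against its Java MapReduce API; this sketch only mirrors the map, shuffle and reduce phases.

```python
# Hedged sketch of the MapReduce model (word count), in plain Python.
from itertools import groupby

def map_phase(documents):
    # map: each input record emits (key, value) pairs
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def reduce_phase(pairs):
    # shuffle: group pairs by key, as the framework does between phases
    counts = {}
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        # reduce: combine all values emitted for one key
        counts[key] = sum(v for _, v in group)
    return counts

counts = reduce_phase(map_phase(["big data", "big graphs"]))
```

In a real cluster, the map and reduce calls run in parallel on many machines; the logic per record is exactly this simple.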

What problems can Hadoop solve?

• The Hadoop framework is used by major players including Google, Yahoo, IBM, eBay, LinkedIn and Facebook, largely for applications involving search engines and advertising. Linux is the preferred operating system, but Hadoop can also work with Windows, BSD and OS X.

• The Hadoop platform was designed to solve problems where you have a lot of data — perhaps a mixture of complex and structured data — and it doesn't fit nicely into tables. It's for situations where you want to run analytics that are deep and computationally extensive, like clustering and targeting. That's exactly what Google was doing when it was indexing the web and examining user behavior to improve performance algorithms. 

• Hadoop applies to a bunch of markets. In finance, if you want to do accurate portfolio evaluation and risk analysis, you can build sophisticated models that are hard to jam into a database engine. But Hadoop can handle it. In online retail, if you want to deliver better search answers to your customers so they're more likely to buy the thing you show them, that sort of problem is well addressed by the platform Google built. 

Big Data Market

The Big Data market is on the verge of a rapid growth spurt that will see it top the USD $50 billion mark worldwide within the next five years.

As of early 2012, the Big Data market stands at just over USD $5 billion based on related software, hardware, and services revenue. Increased interest in and awareness of the power of Big Data and related analytic capabilities to gain competitive advantage and to improve operational efficiencies, coupled with developments in the technologies and services that make Big Data a practical reality, will result in a super-charged CAGR of 58% between now and 2017.
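As a sanity check on those figures, simple compound-growth arithmetic with the numbers quoted above does land near the $50 billion mark:

```python
# Projecting the quoted market size forward at the quoted CAGR.
base = 5.0          # market size in USD billions, early 2012
cagr = 0.58         # 58% compound annual growth rate
years = 5           # early 2012 through 2017
projected = base * (1 + cagr) ** years
# projected comes out to roughly 49 (USD billions)
```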

Vertical Perspective

Enhancing Fraud Detection for Banks and Credit Card Companies Scenario

• Build up-to-date models from transactional data to feed real-time risk-scoring systems for fraud detection.

Requirements

• Analyze volumes of data with response times that are not possible today.

• Apply analytic models to individual client, not just client segment. 

Benefits

• Detect transaction fraud in progress and allow fraud models to be updated in hours rather than weeks.

Social Media Analysis for Products, Services and Brands Scenario

• Monitor data from various sources such as blogs, boards, news feeds, tweets, and social media for information pertinent to the brand and products, as well as competitors.

Requirement

• Extract and aggregate relevant topics and relationships, discover patterns, and reveal up-and-coming topics and trends.

Benefits

• Brand management for marketing campaigns; brand protection for ad placement networks.

Store Clustering Analysis in the Retail Industry Scenario

• A retailer with a large number of stores needs to understand cluster patterns of shoppers. 

Requirement

• Use shopping patterns across multiple characteristics, such as location, income and family size, for better product placement:

Age range
Education
Income
Children
Assets
Urbanicity 

Benefits

• Store-specific clustering of products; clustering specific types of products by location.

Healthcare and Energy Industry Scenario

IBM Stream Computing for Smarter Healthcare

IBM Watson pairs natural language processing with predictive root cause analysis.

InfoSphere Streams based analytics can alert hospital staff of impending life threatening infections in premature infants up to 24 hours earlier than current practices.

Vestas Wind Systems uses IBM big data analytics software and powerful IBM systems to improve wind turbine placement for optimal energy output.

Data Virtualization Wave 2012

7/1/2012

[Image: Data Virtualization Wave 2012]

Business Intelligence Technologies

6/14/2012

[Image: business intelligence technologies]

    Cloud Gateways
    Cloud Strategies Online Collaboration
    Cluster Architectures
    Cognitive Computing
    Collaboration
    Computational Experiments
    Computer Platforms
    Conference
    Connectivity
    Content
    Content Analytics
    Core Technology Rankings
    Corporate Learning Systems
    Corporate Telephony
    Cost
    Crm
    Crm Multichannel Campaign Management
    Customer Communications Management
    Customer Management Contact Center Bpo
    Customer Relationship Management
    Customer Service Contact Centers
    Customization
    Cybernetic Employees
    Cybernetic Era
    Data
    Data Analytics Lifecycle
    Data Archiving
    Database
    Database Auditing
    Database Management Systems
    Data Center
    Data Center. Database
    Data Center Outsourcing
    Data Center Outsourcing And Infrastructure Utility Services
    Data Growth
    Data Integration Tools
    Data Loss Prevention
    Data Management Stack
    Data Mining
    Data Quality
    Data Quality Tools
    Data Science
    Data Science
    Data Silos
    Data Stack
    Data Theft
    Data Virtualization
    Data Visualization
    Data Volume Variety Velocity
    Data Volume Variety Velocity Veracity
    Data Warehouse
    Data Warehouse Database Management Systems
    Deep Learning
    Dido
    Digital Subterfuge
    Document Output
    Dr. David Ferrucci
    Dr. John Kelly
    Ecm
    E Commerce
    E-Commerce
    E Discovery Software
    Emerging Technologies And Trends
    Employee-Owned Device Program
    Employee Performance Management
    Endpoint Protection Platforms
    Enterprise Architecture Management Suites
    Enterprise Architecture Tools
    Enterprise Content Management
    Enterprise Data Warehousing Platforms
    Enterprise Mobile Application Development
    Enterprise Resource Planning
    Enterprise Service Bus
    Enterprise Social Platforms
    Erp
    Erp Demonstrations
    Financial Services
    Forecasting
    Forrester
    Fraud Detection
    Future It
    Galaxy
    Galaxy Nexus
    Gale-Shapley Algorithm
    Gartner
    Global It Infrastructure Outsourcing 2011 Leaders
    Global Knowledge Networks
    Global Network Service Providers
    Google Glasses
    Google Wallet
    Hadoop
    Hadoop Technology Stack
    Hadoop Technology Stack
    Hardware As A Service
    Hbase
    Health Care And Big Data
    Hidden Markov Models
    High Performance Computing
    High-performance Computing
    Human Resources
    Iaas
    Ibm
    Ibm Big Data Platform
    IBM's Watson
    Iconsumer
    Information
    Information Capabilities Framework
    Information Management
    Information Workers
    Infosphere Streams
    Infrastructure As A Service
    Infrastructure Utility Services
    In-memory Grid
    Innovation
    Integrated It Portfolio Analysis Applications
    Integrated Software Quality Suites
    Internet
    Internet Of Things
    Internet Trends 2011
    Ipad
    Iphone
    Iphone 4s
    It Innovation Wave
    Jeff Hammerbacher
    Job Search
    Key Performance Indicators
    Kindle Fire Tablet
    Lambda Architecture
    Lifi
    Long Term Evolution Network Infrastructure
    Machine Data
    Machine Learning
    Machine Learning
    Magic Quadrant
    Mainframe
    Managed Hosting
    Managed Security Providers
    Manufacturing
    Mariadb
    Marketing Resource Management
    Marketing Resource Management
    Mark Weiser
    Master Data
    Master Data Management
    Maxent Classifiers
    Mdm
    Media Tablet
    Microsoft Big Data Platform
    Microsoft Dynamics Ax
    Mlbase
    Mobile
    Mobile App Internet
    Mobile Application Development
    Mobile Business Application Priorities
    Mobile Business Intelligence
    Mobile Collaboration
    Mobile Consumer Application Platforms
    Mobile Data Protection
    Mobile Development Tool Selection
    Mobile Device Management
    Mobile Device Management Software Magic Quadrant 2011
    Mobile Devices
    Mobile Internet Trends
    Mobile Payments
    Mobile Payment System
    Modular Disk Arrays
    Modular Systems
    Mysql
    Naive Bayes
    Natural Language Processing
    Network
    Networked Society
    Network Firewalls
    Network Infrastructure
    Network Virtualization
    N-gram Language Modeling
    Non-Computer Traffic
    Nosql Database
    Operating System
    Oracle
    Paas
    Pioneering The Science Of Information
    Platform As A Service
    Predictive Analytics
    Prescriptive Analytics
    Primary Storage Reduction Technologies
    Python
    Real Time Analytics
    Real-time Analytics
    Real-time Bidding Ad Exchange
    Recommendation Engines
    Retail Marketing Analytics
    Rim
    Risk
    R Language
    Robotics
    Saas
    Sales Force Automation
    Sap Big Data Platform
    Scala
    Scenario-Based Enterprise Performance Management (EPM)
    Search
    Security
    Security Information & Event Management
    Selection Process
    Self-Service Business Intelligence
    Sensors
    Server Virtualization
    Service Oriented Architecture
    Smart City
    Smarter Computing
    Smartphones
    Social Media
    Software As A Service
    Sony Tablet S
    Spark
    Sports Analytics
    Spying
    Steve Jobs
    Storage Virtualization
    Storm
    Strategy
    Stream Processing
    Survey Most Important It Priorities
    Symantec
    Tablet
    Tablets
    Technology
    Technology Industry Report Card
    Technology Innovation
    Technology M&A Deals
    Technology Sourcing
    Text Mining
    Ubiquitous Computing
    User Authentications
    Vector-space Models
    Vendor Due Diligence
    Vertical Industry It Growth
    Videoconferencing
    Virtual Desktops
    Virtualization
    Virtual Work
    Visualization
    Wan Optimization
    Watson
    Wave
    Wearable Device
    Web Conferencing
    Web Content Management
    Web Hosting
    Windows Mobile
    Wireless
    Wireless Data
    Wireless Technologies
    Workload Optimization

    RSS Feed

Powered by Create your own unique website with customizable templates.