Introduction to Business Analytics
Hokey Min
In the era of the knowledge economy, getting the right information to decision makers at the right
time is critical to business success. One such effort is the growing use of business analytics.
Generally speaking, business analytics refers to the broad use of quantitative techniques such as
statistics, data mining, optimization tools, and simulation, supported by query and reporting
mechanisms, to help decision makers make more informed decisions within a closed-loop
framework that seeks continuous process improvement through monitoring and learning. Business
analytics also helps the decision maker predict future business activities based on the analysis of
historical patterns in past business activities. For example, a nearby grocery chain, such as Kroger,
might frequently issue discount coupons tailored to each customer based on his or her past
shopping patterns. This practice encourages the customer to keep buying favorite items at a
discount, while building customer loyalty. It is possible because a smart use of business analytics
allows the grocery store to figure out which items are likely to be purchased by which customer on
the next shopping trip. Likewise, the application potential of business analytics is enormous, given
the abundance of data available from digital and mobile sources.
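To make the grocery example concrete, below is a minimal sketch, in Python, of how a store might rank the items a given customer is most likely to buy on the next trip based on past purchase frequencies. The sample transactions and the top_items_for_customer function are illustrative assumptions, not Kroger's actual method; real retailers use far richer predictive models.

    from collections import Counter

    # Illustrative transaction history as (customer_id, item) pairs.
    # In practice this would come from point-of-sale or loyalty-card data.
    transactions = [
        ("C001", "yogurt"), ("C001", "bananas"), ("C001", "yogurt"),
        ("C001", "coffee"), ("C002", "diapers"), ("C002", "formula"),
        ("C002", "diapers"), ("C001", "yogurt"),
    ]

    def top_items_for_customer(history, customer_id, n=3):
        """Rank items by how often this customer bought them in the past.

        A simple frequency score stands in for the far richer predictive
        models a real retailer would use.
        """
        counts = Counter(item for cust, item in history if cust == customer_id)
        return counts.most_common(n)

    # Items most likely to appear in customer C001's next basket, and
    # therefore candidates for a tailored discount coupon.
    print(top_items_for_customer(transactions, "C001"))
    # [('yogurt', 3), ('bananas', 1), ('coffee', 1)]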
Although business analytics has been rapidly gaining popularity among practitioners and
academicians alike in the recent past, its conceptual foundation has existed for centuries. One of
the first forms of business analytics may be statistics, whose uses can be traced back at least to
biblical times in ancient Egypt, Babylon, and Rome. Whatever its exact origins, its longevity may
be attributed to its usefulness in helping the policy maker (including ancient rulers or kings) make
better decisions. In other words, whatever form business analytics may take, it should help us
answer the following fundamental questions critical for decision making:
1. What happened?
Is there scientific evidence indicating the validity and usefulness of our changed
practices?
By answering the preceding questions, business analytics aims to accomplish several goals:
Gaining insights into business practices and customer behaviors: Business analytics is
designed to transform unstructured, nonstandardized big data originating from multiple
sources into meaningful information that supports better business decisions.
Improving predictability: By deriving insights into customer behavioral patterns and market
trends, business analytics can improve the organization’s ability to forecast demand more
accurately.
Identifying risk: With growing complexity and uncertainty resulting from the globalization
of business activities, many organizations face the daunting task of managing risk.
Risk cannot be managed without identifying it and then preparing for it. Business analytics
can function as an early warning system for detecting the signs or symptoms of potential
troubles by dissecting business patterns (e.g., shrinking market share, a higher rate of
customer defection, declining stock price).
Improving the effectiveness of communication: Through its query and reporting mechanism,
business analytics can not only speed up reporting procedures but also provide user-friendly
reports, including “what-if” scenarios (a small illustration appears after this list). Such
reports can be a valuable communication tool among decision makers and thus help the
management team make more timely and accurate business decisions.
Enhancing operating efficiency: By helping the decision maker understand how the business
works and where the greatest business opportunities lie, business analytics can decrease the
chances of making poor investment decisions and misallocating the company’s resources,
thus improving the company’s operating efficiency.
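As a small illustration of the “what-if” reports mentioned above, the sketch below compares projected revenue under a few hypothetical discount scenarios. The baseline figures and the assumed demand lifts are invented for the example and are not drawn from any real report.

    # Hypothetical "what-if" scenario report: projected revenue under
    # different discount levels. All baseline figures are invented.
    BASELINE_UNITS = 10_000
    BASELINE_PRICE = 4.50  # dollars per unit

    def projected_revenue(discount_rate, demand_lift):
        """Revenue if price is cut by discount_rate and demand rises by demand_lift."""
        price = BASELINE_PRICE * (1 - discount_rate)
        units = BASELINE_UNITS * (1 + demand_lift)
        return price * units

    scenarios = [
        ("no discount",  0.00, 0.00),
        ("5% discount",  0.05, 0.08),   # assumed 8% demand lift
        ("10% discount", 0.10, 0.15),   # assumed 15% demand lift
    ]

    for name, discount, lift in scenarios:
        print(f"{name:12s} -> projected revenue ${projected_revenue(discount, lift):,.0f}")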
[Table: The use of facts, to support your conclusions versus to form your own opinions and beliefs]
Given the importance of analytical thinking to the successful application of business analytics,
the development of analytical thinking (or nurturing analytical thinking skills) should precede the
adoption of business analytics. The following summarizes ways to develop analytical thinking in
a systematic manner:
1. Identify: Before leveraging big data as a driver of business value, we need to first identify
who will be using the data. The identification of target users will allow us to determine
which type of data is worthy of consideration for collection and storage. Afterward, we need
to identify where the data is coming from, who is creating those data, and where the content
lives.
2. Filter: Depending on the purpose of data usage, we need to determine which data is relevant
for analysis and which data should be thrown out. This step will prevent the decision maker
from using outdated and irrelevant (misleading) data.
3. Analyze: After a manageable number of datasets are created, the next step to take is to
determine which information should be extracted from the given datasets to solve particular
decision problems encountered by the business executive. For instance, if you are interested
in finding the optimal routes for package delivery couriers, you need to extract information
about the costs/distances associated with each route option and capacity limits of each
available courier. The type of information that you are looking for will dictate the choice of
various data analysis tools such as descriptive, exploratory, inferential, predictive, causal,
and mechanistic analysis. According to Smith (2013), descriptive analysis focuses on the
quantitative summary of data features (e.g., mean, median, standard deviation, frequency,
cumulative percentage); a brief illustration appears after this list. Exploratory analysis is
intended to find previously unknown relationships. Inferential analysis is designed to test
theories about the nature of the world in general (or some part of it) based on samples of
“subjects” taken from the world (or some part of it). Predictive analysis is intended to make
predictions about future events using current facts and historical trends. Causal analysis aims
to find out what happens to one variable when another variable is changed. Mechanistic
analysis helps us understand how changes in certain variables can lead to transformations in
other variables through iterative experiments.
4. Disseminate: After insightful information is extracted from data analysis, this information
should be transmitted to the right person at the right location at the right time. Especially for
confidential or proprietary information, information security should be ensured to avoid
information breaches.
5. Update: To avoid a wrong decision stemming from outdated information, we should
constantly keep data updated and monitored for veracity.
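As a brief illustration of the descriptive analysis mentioned in step 3, the following sketch computes the summary measures listed by Smith (2013), that is, mean, median, standard deviation, frequency, and cumulative percentage, for a hypothetical set of delivery-route costs. The data values are invented for illustration only.

    import statistics
    from collections import Counter

    # Hypothetical per-route delivery costs (in dollars), invented for illustration.
    route_costs = [120, 95, 120, 140, 95, 110, 120, 160, 95, 130]

    # Descriptive analysis: a quantitative summary of the data's features.
    print("mean:              ", statistics.mean(route_costs))
    print("median:            ", statistics.median(route_costs))
    print("standard deviation:", round(statistics.stdev(route_costs), 2))

    # Frequency and cumulative percentage of each distinct cost value.
    freq = Counter(route_costs)
    total = len(route_costs)
    cumulative = 0
    for value, count in sorted(freq.items()):
        cumulative += count
        print(f"cost {value}: frequency={count}, cumulative={100 * cumulative / total:.0f}%")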
Following a closed loop of deriving value from big data and then gaining insights into business
decision problems, we should not overlook the process of operationalizing big data, that is,
putting the insights gleaned from big data into practice in the form of reusable solutions. One
way of facilitating that process is to develop visual reporting mechanisms (e.g., summary
reports, graphical charts, tables, dashboards) that can be easily shared by the team of decision
makers and stakeholders without technical expertise. More importantly, since intelligent insights
obtained from big data analysis may not be shared throughout the organization, we first need to
break down the visible or invisible silos that disconnect some organizational units, partially or
wholly, from the rest of the organization.
1. Data screening: Some data were collected and stored by the company not necessarily
because they would be useful, but because they had to be captured and kept by government
or company mandates. For example, we are required by the healthcare law to retain patient
data for 7 years for adults and 25 years for children. Some portions of those data may be
useful for tracking the patient’s health history and immunization records, but others, such as
the height or weight of patients at a certain point of their lives, may not present many clues
for their current health conditions. As such, data screening should begin with determining the
relevance of stored data to the intended data usage (e.g., medical diagnosis, vaccine
immunization against infectious disease outbreaks, market trend forecasts). After relevant data
are identified, those data should be cleaned by removing any outliers and faulty records (e.g., a
200-year-old human being) from the analysis; a minimal screening sketch appears after this list.
2. Data standardization: It will be difficult for us to make sense of incompatible data in
different formats (e.g., Excel versus SPSS) or measurement units (e.g., dollar versus yuan).
So the main purpose of data standardization is to make data consistent and clear. Herein,
what we mean by “consistent” is ensuring that the output (data analysis result) is reliable so
that related data can be identified using a common terminology and format. What we mean
by “clear” is to ensure that the data can be easily understood by those who are not involved
with the data analysis process (Oracle 2015). Also, data standardization ensures that the
analyzed data can be shared across the enterprise.
3. Data analysis: With the standardized data, the next step to take is to figure out what that
data means by describing, condensing, inspecting, recapping, and modeling it. Since raw
data by itself means nothing to the decision maker, it is important to select the proper data
analysis tools to interpret what the data tell us. In a broad sense, there are two
types of data analysis tools: qualitative approach and quantitative approach. In general, a
qualitative approach aims to develop (usually not predefined) concepts and insights useful
for explaining natural phenomena from holistic, speculative, and descriptive views. It often
deals with data that is not easily converted to numbers and its analysis relies heavily on field
observations, interviews, archives, transcriptions, audio/video recordings, and focus group
discussions. For example, this approach has proven useful for segmenting unfamiliar
markets, understanding customer responses to new products, differentiating a company
brand from its competition, and repositioning a product after its market image has gone
stale (Mariampolski 2001). On the other hand, a quantitative approach aims to make sense
of numerical data (numbers) by evaluating it mathematically and reporting its analysis
results in numerical terms. A quantitative approach is primarily concerned with finding
clear patterns of evidence to either support or contradict a preconceived notion or
hypothetical idea that is formulated based on the abstract representation of real-world
situations. As such, it is helpful for fact finding.
These two approaches can be further broken down into many categories, as shown in Figure
1.1. Since categories belonging to the quantitative approach are discussed in detail in
Chapter 3, “Business Analytics Models,” we will briefly introduce well-known categories
of the qualitative approach here. These include narrative analysis, which is intended to
reflect on the subjective accounts of field texts such as stories, folklore, life experiences,
letters, conversations, diaries, and journals presented by people in different contexts and
from different viewpoints (see, e.g., Reissman 2008). Thus, narrative analysis is not
interested in verifying whether field texts are true. Grounded theory develops a set of
general but flexible guidelines for collecting and analyzing qualitative data to construct
theories “grounded” in the data themselves and foster seeing the data in fresh ways
(Charmaz 2006). It usually starts with an examination of a single case from a predefined
population to formulate a general statement and then proceeds to the examination of another
case to see whether it fits the general statement. This process continues until all the cases of
predefined population fit that statement. Stated simply, content analysis is a systematic and
objective way of interpreting message characteristics from a wide range of text data (e.g.,
words, phrases) obtained from the careful examination of human interactions; character
portrayals in TV commercials, films, and novels; the computer-driven investigation of word
usage in news releases and political speeches; and so forth (Neuendorf 2002). Discourse
analysis aims to analyze the use of “discourse” (i.e., language beyond the level of a
sentence; language behavior linked to social practices; language as a system of thought) in
any communicated messages. Thus, it can be defined as the analysis of language “beyond
the sentence” (Tannen 2015). Discourse analysis allows us to make sense of what we are
hearing or reading based on the analysis of every word spoken by a particular individual,
the timing of her words, and the general topic she addresses when she utters those
words. Domain analysis is intended to discover patterns that exist in cultural behaviors,
social situations, and cultural artifacts in the group from whom the data was
collected. Conversation analysis uncovers details of conversational interaction among
people under the premise that conversation can give us a sense of who we are to one another
and that conversational interaction is sequentially organized, and talk can be analyzed in
terms of the process of social interaction rather than in terms of motives or social status
(Holstein and Gubrium 2000). Conversation analysis allows us to capture shifts in the meaning
of the language a person speaks, changes in nuance, and the conveyance of nonverbal
messages.
4. Data reporting: To turn analysis results into actionable insights, the results should be
presented to the intended users in such a way that they can be accessed in real time and
understood without much technical expertise. Thus, the results should be rendered in tabular,
graphical, and other visual formats. Some data visualization tools that support data
presentation, such as Instant-Atlas, Fusion-Charts, and Visualize Free, include dynamic
interactive features that enable the user to see alternative results under “what-if” scenarios.
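As a minimal illustration of the data screening step (item 1 above), the sketch below keeps only the fields relevant to the intended usage and removes faulty records such as an implausible patient age. The field names, sample records, and the age cutoff are assumptions made for the example, not an actual cleaning pipeline.

    # Hypothetical patient records; field names and values are invented for illustration.
    records = [
        {"patient_id": 1, "age": 42,  "immunizations": ["flu"],     "height_cm": 180},
        {"patient_id": 2, "age": 200, "immunizations": ["measles"], "height_cm": 165},  # faulty age
        {"patient_id": 3, "age": 7,   "immunizations": [],          "height_cm": 120},
    ]

    # Fields considered relevant to the intended usage (e.g., immunization tracking);
    # other fields, such as height, are screened out for this analysis.
    RELEVANT_FIELDS = ("patient_id", "age", "immunizations")
    MAX_PLAUSIBLE_AGE = 120  # assumed cutoff for faulty data such as a 200-year-old patient

    def screen(records):
        """Keep only relevant fields and drop records with implausible values."""
        cleaned = []
        for rec in records:
            if not 0 <= rec["age"] <= MAX_PLAUSIBLE_AGE:
                continue  # remove outliers / faulty data from the analysis
            cleaned.append({field: rec[field] for field in RELEVANT_FIELDS})
        return cleaned

    print(screen(records))
    # Only patients 1 and 3 survive screening, with the irrelevant height field dropped.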
As discussed, extracting actual business value from data overload is an onerous task that should
be performed in a systematic, ordered fashion. To maximize the efficiency and effectiveness of
data extraction, it is desirable that business executives develop clear data management policies
and procedural guidelines to follow based on the aforementioned steps.
In addition, every time data are collected and transmitted to multiple parties, data security and
privacy issues arise. Especially in a global setting, varying privacy laws on data collection and
sharing in different countries can make the use of business analytics extremely challenging for
multinational firms. For example, the European Union (EU) has been working to finalize
sweeping regulations that set higher privacy standards for personal data collection and analysis
in EU countries (Burns 2015). These kinds of new measures, which were released in draft form
in 2012 and are expected to be finalized in late 2015 or in early 2016, will heighten already
stringent EU privacy protections and subsequently limit the use of business analytics.
Furthermore, given the many different business analytics platforms and tools designed for
particular functionalities (e.g., data discovery, data streaming, data processing, ad hoc reporting
and querying), firms embracing business analytics will face the technical challenge of managing,
consolidating, and unifying various incompatible business analytics platforms, tools, and
database management systems.
Despite the growing significance of business analytics to global business success, a recent SAP
survey reported that a mere 27% of U.S. firms had a plan for the use of business analytics or any
form of business intelligence tools, and only 13.5% of the surveyed firms used business analytics
on a daily or ongoing basis (Primault 2012). The lack of business analytics application may be
attributed to the user’s unfamiliarity with the tool, unproven benefits, implementation cost
concerns and hassles, internal resistance against the adoption of a newly introduced tool, and
difficulty in leveraging it as a competitive differentiator. To overcome these hurdles, potential
users of business analytics should identify key success factors and then formulate a business
analytics implementation strategy as part of their global business strategy. Figure 1.2 displays a
list of key success factors for business analytics.
As discussed previously, the best way to maximize the value of business analytics is to
incorporate it into the global business strategy and then develop specific action plans for its
successful implementation.
Bibliography
A.T. Kearney (2013), “Big Data and the Creative Destruction of Today’s Business
Models,” https://www.atkearney.com/strategic-it/ideas-insights/article/-/asset_publisher/
LCcgOeS4t85g/content/big-data-and-the-creative-destruction-of-today-s-business-models/
10192, retrieved on May 30, 2015.
Holstein, J. A., and J. F. Gubrium (2000), The Self We Live By: Narrative Identity in a
Postmodern World, New York, NY: Oxford University Press.
IBM (2012), Bringing Big Data to the Enterprise, Unpublished Report, Armonk, NY: IBM
Watson Foundation.
Neuendorf, K. A. (2002), The Content Analysis Guidebook, Thousand Oaks, CA: Sage
Publications, Inc.
Oracle (2015), Oracle Enterprise Data Quality for Product Data Knowledge Studio Reference
Guide, https://docs.oracle.com/cd/E35636_01/doc.11116/e29134/stan_data.htm, retrieved on
June 8, 2015.
Smith, J. (2013), “Six Types of Analyses Every Data Scientist Should Know,” Data Scientist
Insights, http://datascientistinsights.com/2013/01/29/six-types-of-analyses-every-data-scientist-
should-know/, retrieved on June 5, 2015.