QAB Mod 5: Data Analysis, Report and Decision Making
In your organizational or business data analysis, you must begin with the right question(s). Questions
should be measurable, clear and concise. Design your questions to either qualify or disqualify potential
solutions to your specific problem or opportunity.
For example, start with a clearly defined problem: A government contractor is experiencing rising
costs and is no longer able to submit competitive contract proposals. One of many questions to solve
this business problem might include: Can the company reduce its staff without compromising quality?
This step breaks down into two sub-steps: A) Decide what to measure, and B) Decide how to measure
it.
A) Decide What To Measure
Using the government contractor example, consider what kind of data you’d need to answer your key
question. In this case, you’d need to know the number and cost of current staff and the percentage of
time they spend on necessary business functions. In answering this question, you likely need to answer
many sub-questions (e.g., Are staff currently under-utilized? If so, what process improvements would
help?). Finally, in your decision on what to measure, be sure to include any reasonable objections any
stakeholders might have (e.g., If staff are reduced, how would the company respond to surges in
demand?).
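To make "deciding what to measure" concrete for the contractor example, here is a minimal sketch in Python; the roles, costs and utilization figures are invented for illustration only and are not drawn from any real dataset.

```python
# Hypothetical staffing records; roles, costs and utilization figures are invented.
staff = [
    {"role": "Analyst A", "annual_cost": 90_000, "utilization": 0.85},
    {"role": "Analyst B", "annual_cost": 80_000, "utilization": 0.60},
    {"role": "Engineer C", "annual_cost": 120_000, "utilization": 0.95},
]

total_cost = sum(s["annual_cost"] for s in staff)
avg_utilization = sum(s["utilization"] for s in staff) / len(staff)
under_utilized = [s["role"] for s in staff if s["utilization"] < 0.70]

print(f"Total annual staff cost: {total_cost}")
print(f"Average utilization: {avg_utilization:.0%}")
print(f"Potentially under-utilized: {under_utilized}")
```

Even a toy calculation like this makes the measurement questions explicit: which costs count, over what time frame, and what utilization threshold signals under-use.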
B) Decide How To Measure It
Thinking about how you measure your data is just as important, especially before the data collection
phase, because your measuring process either backs up or discredits your analysis later on. Key
questions to settle at this step include your time frame, your unit of measure, and which factors to
include in the measurement.
With your question clearly defined and your measurement priorities set, now it’s time to collect your
data. As you collect and organize your data, remember to keep these important points in mind:
● Before you collect new data, determine what information could be collected from existing
databases or sources on hand. Collect this data first.
● Determine a file storing and naming system ahead of time to help all tasked team members
collaborate. This process saves time and prevents team members from collecting the same
information twice.
● If you need to gather data via observation or interviews, then develop an interview template
ahead of time to ensure consistency and save time.
● Keep your collected data organized in a log with collection dates and add any source notes as
you go (including any data normalization performed). This practice validates your conclusions
down the road.
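One lightweight way to keep the collection log described above is a simple structured file. The sketch below is a hedged illustration only: the column names and entries are assumptions, not a standard format.

```python
import csv
from datetime import date

# Hypothetical log entries; the column names are an assumption, not a standard.
log_entries = [
    {"dataset": "payroll_2023.xlsx", "source": "HR database",
     "collected_on": date(2024, 1, 15).isoformat(),
     "notes": "salaries normalized to annual USD"},
    {"dataset": "timesheets_q4.csv", "source": "project tracking tool",
     "collected_on": date(2024, 1, 16).isoformat(),
     "notes": "hours aggregated per employee per week"},
]

# Write the log so every team member records collections the same way.
with open("data_collection_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(log_entries[0].keys()))
    writer.writeheader()
    writer.writerows(log_entries)
```

Recording the source, collection date and any normalization in one place is what later lets you validate your conclusions.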
As you manipulate data, you may find you have exactly the data you need, but more likely you will
need to revise your original question or collect more data. Either way, this initial analysis of trends,
correlations, variations and outliers helps you focus your data analysis on better answering your
question and on addressing any objections others might have.
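A minimal sketch of this initial screening step, assuming pandas is available; the column names and values are invented for illustration. It computes a pairwise correlation and flags rows that sit far from the column means.

```python
import pandas as pd

# Invented example data: monthly labour hours versus proposal preparation cost.
df = pd.DataFrame({
    "labour_hours": [160, 155, 170, 165, 300, 158, 162],
    "proposal_cost": [40_000, 39_000, 43_000, 41_500, 75_000, 39_800, 40_700],
})

# Pairwise correlation between the two measures.
print(df.corr())

# Flag rows more than two standard deviations from the column mean
# (a common screening threshold for small samples).
z_scores = (df - df.mean()) / df.std()
print(df[(z_scores.abs() > 2).any(axis=1)])
```

A screen like this quickly shows whether the data can answer your question or whether you need to go back and collect more.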
During this step, data analysis tools and software are extremely helpful. Visio is useful for process
visualization, while Minitab and Stata are good packages for advanced statistical data analysis.
However, in most cases, nothing quite compares to Microsoft Excel as an everyday decision-making
tool, so it is worth reviewing a primer on the functions Excel offers for data analysis before starting
this step.
After analyzing your data and possibly conducting further research, it is finally time to interpret your
results. As you interpret your analysis, keep in mind that you can never prove a hypothesis true; you
can only fail to reject it. No matter how much data you collect, chance could always interfere with
your results.
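To make the "fail to reject" idea concrete, here is a minimal sketch using SciPy's two-sample t-test on invented before/after cost figures. A large p-value means we fail to reject the null hypothesis of no difference; it does not prove the hypothesis true.

```python
from scipy import stats

# Invented proposal-cost samples (in $ thousands) before and after a process change.
before = [41.2, 39.8, 40.5, 42.0, 40.9, 41.5]
after = [40.1, 39.5, 40.8, 41.0, 39.9, 40.3]

t_stat, p_value = stats.ttest_ind(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Using a conventional 5% significance level.
if p_value < 0.05:
    print("Reject the null hypothesis of equal mean costs.")
else:
    print("Fail to reject the null hypothesis; chance could explain the difference.")
```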
As you interpret the results of your data, ask yourself these key questions: Does the data answer your
original question, and how? Does it help you defend against likely objections? Are there limitations or
angles you have not considered? If your interpretation of the data holds up under all of these questions
and considerations, then you have likely come to a productive conclusion. The only remaining step is
to use the results of your data analysis process to decide your best course of action.
By following these five steps in your data analysis process, you make better decisions for your
business or government agency because your choices are backed by data that has been robustly
collected and analyzed. With practice, your data analysis gets faster and more accurate – meaning you
make better, more informed decisions to run your organization most effectively.
Q.3. Explain the core differences between qualitative and quantitative data analysis.
Ans. Core differences between qualitative and quantitative data analysis:
• Qualitative data analysis is less standardised with the wide variety in approaches to qualitative
research matched by the many approaches to data analysis, while quantitative researchers
choose from a specialised, standard set of data analysis techniques;
• The results of qualitative data analysis guide subsequent data collection, and analysis is thus a
less-distinct final stage of the research process than quantitative analysis, where data analysis
does not begin until all data have been collected and condensed into numbers;
• Qualitative researchers create new concepts and theory by blending together empirical and
abstract concepts, while quantitative researchers manipulate numbers in order to test a
hypothesis with variable constructs; and
• Qualitative data analysis is in the form of words, which are relatively imprecise, diffuse and
context based, but quantitative researchers use the language of statistical relationships in
analysis.
Groupware-Groupware is software that runs on a network so that teams can work on joint projects,
and it allows people from different departments to access data jointly. For example, if the accounting,
finance, sales, and production departments have to coordinate their efforts to come up with a viable
product within a tight budget, they will be served well by groupware. This software is of immense use
for efficient and effective completion of specific team projects.
Neural Networks-Neural Networks are designed to trace patterns in a set of data and generalize
therefrom. This software enables sales forecasts, stock market predictions, detection of weather
patterns, and the like. California Scientific Software's Brainmaker, used for managing investments
by recognizing patterns and trends influencing stock prices, can be cited as a specific example.
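As an illustration of how neural-network software learns patterns and then generalizes, the hedged sketch below uses scikit-learn's MLPRegressor rather than Brainmaker, and the data are invented: a small network is fitted to a noisy upward trend and then asked to predict a point it has not seen.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Invented data: a noisy upward trend standing in for, say, a price series.
rng = np.random.default_rng(0)
X = np.arange(50, dtype=float).reshape(-1, 1)
y = 2.0 * X.ravel() + rng.normal(scale=3.0, size=50)

# A small feed-forward network; the architecture is chosen arbitrarily for the sketch.
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X, y)

# Ask the fitted network to generalize to a point it has not seen.
print(model.predict([[55.0]]))
```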
CAM/CAD-Computer-aided manufacturing (CAM) software helps engineers to design the
manufacturing components and directs the production of the product. Computer-aided design (CAD)
software creates and displays complex drawings with precision, enabling experimentation with
different designs. CAD/CAM software that integrates the two has been in use for a long time in
manufacturing and production units of organizations. Design sophistication and product development
are made possible by this program, and this software is extensively used by manufacturing
organizations.
A research report typically contains the following sections:
• an abstract,
• introduction,
• methodology,
• results,
• discussion, and
• references.
The Abstract
The abstract is an overview of the research study and is typically two to four paragraphs in length.
Think of it as an executive summary that distills the key elements of the remaining sections into a few
sentences.
Introduction
The introduction provides the key question that the researcher is attempting to answer and a review of
any literature that is relevant. In addition, the researcher will provide a rationale for why the research
is important and will present a hypothesis that attempts to answer the key question. Lastly, the
introduction should summarize the state of the key question following the completion of the research.
For example, are there any important issues or questions still open?
Methodology
The methodology section of the research report is arguably the most important for two reasons. First,
it allows readers to evaluate the quality of the research and, second, it provides the details by which
another researcher may replicate and validate the findings. (1)
Typically, the information in the methodology section is arranged in chronological order with the most
important information at the top of each section.
Ideally, the description of the methodology doesn't force you to refer to other documents; however, if
the author is relying on existing methods, they will be referenced.
Results
In longer research papers, the results section contains the data and perhaps a short introduction.
Typically the interpretation of the data and the analysis is reserved for the discussion section.
Discussion
The discussion section is where the results of the study are interpreted and evaluated against the
existing body of research literature. In addition, should any anomalies be found in the results, this is
where the authors will point them out. Lastly, the discussion section will attempt to connect the
results to the bigger picture and show how the results might be applied. (3)
References
This section provides a list of each author and paper cited in the research report. Any fact, idea, or
direct quotation used in the report should be cited and referenced.
Search terms
Search terms were designed to identify publications that discuss the evaluation or assessment of
quality or excellence of research that is done in a transdisciplinary research (TDR) context. The search
strategy favored sensitivity over specificity to ensure that we captured the relevant information.
Databases searched
ISI Web of Knowledge (WoK) and Scopus were searched between 26 June 2013 and 6 August 2013.
The combined searches yielded 15,613 unique citations. Additional searches to update the first
searches were carried out in June 2014 and March 2015, for a total of 19,402 titles scanned. Google
Scholar (GS) was searched separately by two reviewers during each search period. The first reviewer’s
search was done on 2 September 2013 (Search 1) and 3 September 2013 (Search 2), yielding 739 and
745 titles, respectively. The second reviewer’s search was done on 19 November 2013 (Search 1) and
25 November 2013 (Search 2), yielding 769 and 774 titles, respectively. A third search done on 17
March 2015 by one reviewer yielded 98 new titles. Reviewers found high redundancy between the
WoK/Scopus searches and the GS searches.
Supplementary searches
References in included articles were reviewed to identify additional relevant literature. td-net's 'Tour
d'Horizon of Literature' lists important inter- and transdisciplinary publications collected through an
invitation to experts in the field to submit publications (td-net 2014). Six additional articles were
identified via the supplementary search.
Limitations of coverage
The review was limited to English-language published articles and material available through internet
searches. There was no systematic way to search the gray (unpublished) literature, but relevant
material identified through supplementary searches was included.
Inclusion of articles
This study sought articles that review, critique, discuss, and/or propose principles, criteria, indicators,
and/or measures for the evaluation of quality relevant to TDR. As noted, this yielded a large number of
titles. We then selected only those articles with an explicit focus on the meaning of IDR and/or TDR
quality and how to achieve, measure or evaluate it. Inclusion and exclusion criteria were developed
through an iterative process of trial article screening and discussion within the research team.
Critical appraisal
In typical systematic reviews, individual articles are appraised to ensure that they are adequate for
answering the research question and to assess the methods of each study for susceptibility to bias that
could influence the outcome of the review (Petticrew and Roberts 2006). Most papers included in this
review are theoretical and methodological papers, not empirical studies. Most do not have explicit
methods that can be appraised with existing quality assessment frameworks. Our critical appraisal
considered four criteria adapted from Spencer et al. (2003): (1) relevance to the review question, (2)
clarity and logic of how information in the paper was generated, (3) significance of the contribution
(are new ideas offered?), and (4) generalizability (is the context specified; do the ideas apply in other
contexts?). Disagreements were discussed to reach consensus.
Types of case studies include the following:
• Critical instance case studies gather data to examine a single instance of unique interest.
• Cumulative case studies gather data from many case studies to answer audit/evaluation questions.
• Exploratory case studies gather data both to describe conditions and to generate hypotheses for
future investigation.
• Illustrative case studies gather data to describe and add realism and/or in-depth examples.
• Program effects case studies gather data to examine causality and usually involve multiple sites
and methods.
• Program implementation case studies gather data on program operations, often at multiple
program sites.
Q.9. Explain the characteristics of scientific research.
Ans. The characteristics of scientific research are as follows: 1. Objectivity 2. Verifiability 3. Ethical
Neutrality 4. Systematic Exploration 5. Reliability 6. Precision 7. Accuracy 8. Abstractness 9.
Predictability.
1. Objectivity:
Scientific knowledge is objective. Objectivity simply means the ability to see and accept facts as they
are, not as one might wish them to be. To be objective, one has to guard against one's own biases,
beliefs, wishes, values and preferences. Objectivity demands that one set aside all subjective
considerations and prejudices.
2. Verifiability:
Science rests upon sense data, i.e., data gathered through our senses—eye, ear, nose, tongue and touch.
Scientific knowledge is based on verifiable evidence (concrete factual observations) so that other
observers can observe, weigh or measure the same phenomena and check the observations for
accuracy.
Is there a God? Is the 'Varna' system ethical? Such questions, like those pertaining to the existence of
the soul, heaven or hell, are not scientific questions because they cannot be treated factually. The
evidence regarding their existence cannot be gathered through our senses. Science does not have
answers for everything. It deals only with those questions about which verifiable evidence can be found.
3. Ethical Neutrality:
Science is ethically neutral. It only seeks knowledge. How this knowledge is to be used is determined
by societal values. Knowledge can be put to differing uses. Knowledge about atomic energy can be
used to cure diseases or to wage atomic warfare.
Ethical neutrality does not mean that the scientist has no values. It only means that he must not
allow his values to distort the design and conduct of his research. Thus, scientific knowledge is
value-neutral or value-free.
4. Systematic Exploration:
Scientific research adopts a certain sequential procedure, an organised plan or design of research for
collecting and analysing facts about the problem under study. Generally, this plan includes a few
scientific steps: formulation of a hypothesis, collection of facts, analysis of facts (classification, coding
and tabulation), and scientific generalisation and prediction.
5. Reliability:
Scientific knowledge must occur under the prescribed circumstances not once but repeatedly; it must
be reproducible under the stated circumstances anywhere and at any time. Conclusions based on
casual recollections are not very reliable.
Information generated through data collection, analysis and monitoring serves many different users:
● planners have clear information requirements as they seek to identify and solve problems as
outlined above
● implementers of projects need to monitor what they are achieving and the impacts of their
activities, in order to adjust their activities
● policy-makers, like planners, must be informed about problems, their causes, and means of
overcoming them
● donors seek to take actions to support other stakeholders and at the same time pursue their
own, sometimes contradictory, objectives
● service agencies (such as extension and research organisations, and input and output marketing
companies, for example) need to make decisions about how to invest and what activities to
engage in
● academics can be major users of information