
Course Name: Research Methodology

Module 5: Data Analysis, Report & Decision Making
Q.1. Explain what data analysis involves.
Ans. Data analysis involves:

• Inference – the use of reasoning to reach a conclusion based on evidence;
• A public method or process – researchers reveal their study design in some way;
• Comparison as a central process – identification of patterns or aspects that are similar or different; and
• Striving to avoid errors, false conclusions and misleading inferences.

Q.2. Explain the process of data analysis.


Ans. Data analysis process-

Step 1: Define Your Questions

In your organizational or business data analysis, you must begin with the right question(s). Questions
should be measurable, clear and concise. Design your questions to either qualify or disqualify potential
solutions to your specific problem or opportunity.

For example, start with a clearly defined problem: A government contractor is experiencing rising
costs and is no longer able to submit competitive contract proposals. One of many questions to solve
this business problem might include: Can the company reduce its staff without compromising quality?

Step 2: Set Clear Measurement Priorities

This step breaks down into two sub-steps: A) Decide what to measure, and B) Decide how to measure
it.
A) Decide What To Measure

Using the government contractor example, consider what kind of data you’d need to answer your key
question. In this case, you’d need to know the number and cost of current staff and the percentage of
time they spend on necessary business functions. In answering this question, you likely need to answer
many sub-questions (e.g., Are staff currently under-utilized? If so, what process improvements would
help?). Finally, in your decision on what to measure, be sure to include any reasonable objections any
stakeholders might have (e.g., If staff are reduced, how would the company respond to surges in
demand?).

B) Decide How To Measure It

Thinking about how you measure your data is just as important, especially before the data collection
phase, because your measuring process either backs up or discredits your analysis later on. Key
questions to ask for this step include:

● What is your time frame? (e.g., annual versus quarterly costs)
● What is your unit of measure? (e.g., USD versus Euro)
● What factors should be included? (e.g., annual salary alone versus annual salary plus the cost of staff benefits; see the sketch after this list)
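For illustration, the short Python sketch below uses hypothetical staff figures and a hypothetical 30% benefits rate (none of these numbers come from the contractor example) to show how the choice of factors changes the measured cost:

```python
# Minimal sketch with hypothetical figures: measuring annual staff cost
# using salary alone versus salary plus the cost of benefits.
staff = [
    {"name": "Analyst A", "annual_salary": 60_000, "benefits_rate": 0.30},
    {"name": "Analyst B", "annual_salary": 75_000, "benefits_rate": 0.30},
]

salary_only = sum(s["annual_salary"] for s in staff)
fully_loaded = sum(s["annual_salary"] * (1 + s["benefits_rate"]) for s in staff)

print(f"Salary only:       {salary_only:,.0f} USD")   # 135,000 USD
print(f"Salary + benefits: {fully_loaded:,.0f} USD")  # 175,500 USD
```

Depending on which figure you treat as "staff cost", the answer to the key question can look quite different, which is why the factors must be agreed before data collection begins.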

Step 3: Collect Data

With your question clearly defined and your measurement priorities set, now it’s time to collect your
data. As you collect and organize your data, remember to keep these important points in mind:

● Before you collect new data, determine what information could be collected from existing
databases or sources on hand. Collect this data first.
● Determine a file storing and naming system ahead of time to help all tasked team members
collaborate. This process saves time and prevents team members from collecting the same
information twice.
● If you need to gather data via observation or interviews, then develop an interview template
ahead of time to ensure consistency and save time.
● Keep your collected data organized in a log with collection dates and add any source notes as you go (including any data normalization performed). This practice validates your conclusions down the road; a minimal log sketch follows this list.
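As a minimal illustration of the last point, a collection log can be as simple as an appended CSV file. The file name, column layout and entries below are hypothetical, not a prescribed format:

```python
import csv
from datetime import date

# Append one entry per collection activity: date, source, and any notes
# (including normalization performed). File and column choices are illustrative.
with open("collection_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([
        date.today().isoformat(),                         # collection date
        "HR payroll export, FY2023",                      # source (hypothetical)
        "Converted monthly salaries to annual figures",   # normalization note
    ])
```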

Step 4: Analyze Data


After you’ve collected the right data to answer your question from Step 1, it’s time for deeper data
analysis. Begin by manipulating your data in a number of different ways, such as plotting it out and
finding correlations or by creating a pivot table in Excel. A pivot table lets you sort and filter data by
different variables and lets you calculate the mean, maximum, minimum and standard deviation of
your data.
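The same idea can be sketched outside Excel. For example, a minimal Python/pandas version (using made-up cost data, so the column names and numbers are purely illustrative) builds a pivot table and the summary statistics mentioned above:

```python
import pandas as pd

# Hypothetical cost records; every value here is made up for illustration.
df = pd.DataFrame({
    "department": ["Ops", "Ops", "Finance", "Finance", "Sales", "Sales"],
    "quarter":    ["Q1",  "Q2",  "Q1",      "Q2",      "Q1",    "Q2"],
    "cost_usd":   [120_000, 135_000, 80_000, 82_000, 95_000, 110_000],
})

# Pivot table: summarize costs by department and quarter.
pivot = df.pivot_table(index="department", columns="quarter",
                       values="cost_usd", aggfunc="sum")
print(pivot)

# Mean, maximum, minimum and standard deviation of the cost data.
print(df["cost_usd"].agg(["mean", "max", "min", "std"]))
```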

As you manipulate data, you may find you have the exact data you need, but more likely, you might
need to revise your original question or collect more data. Either way, this initial analysis of trends,
correlations, variations and outliers helps you focus your data analysis on better answering your question and addressing any objections others might have.

During this step, data analysis tools and software are extremely helpful. Minitab and Stata are good software packages for advanced statistical data analysis, and Visio is useful for visualizing processes and data flows. However, in most cases, nothing quite compares to Microsoft Excel as an everyday decision-making tool, so a review or primer on the functions Excel offers for data analysis is worth the time before you start.

Step 5: Interpret Results

After analyzing your data and possibly conducting further research, it's finally time to interpret your results. As you interpret your analysis, keep in mind that you can never prove a hypothesis true; rather, you can only fail to reject it. This means that no matter how much data you collect, chance could always interfere with your results.
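As a hedged illustration of "failing to reject" a hypothesis, the sketch below runs a two-sample t-test from SciPy on made-up cost samples; the numbers and the 0.05 significance level are assumptions for the example only:

```python
from scipy import stats

# Hypothetical proposal costs (in thousands of USD) before and after a change.
before = [410, 425, 398, 440, 415, 432]
after  = [395, 402, 388, 410, 399, 405]

t_stat, p_value = stats.ttest_ind(before, after)
alpha = 0.05  # assumed significance level

if p_value < alpha:
    print(f"p = {p_value:.3f}: reject the null hypothesis of equal mean costs.")
else:
    print(f"p = {p_value:.3f}: fail to reject the null hypothesis; "
          "the data do not show a clear difference.")
```

Note that even a small p-value does not prove the hypothesis; it only says the observed difference would be unlikely if chance alone were at work.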

As you interpret the results of your data, ask yourself these key questions:

● Does the data answer your original question? How?
● Does the data help you defend against any objections? How?
● Are there any limitations on your conclusions, any angles you haven't considered?

If your interpretation of the data holds up under all of these questions and considerations, then you
likely have come to a productive conclusion. The only remaining step is to use the results of your data
analysis process to decide your best course of action.

By following these five steps in your data analysis process, you make better decisions for your
business or government agency because your choices are backed by data that has been robustly
collected and analyzed. With practice, your data analysis gets faster and more accurate – meaning you
make better, more informed decisions to run your organization most effectively.
Q.3. Explain the core differences between qualitative and quantitative data analysis.
Ans. Core differences between qualitative and quantitative data analysis-

• Qualitative data analysis is less standardised; the wide variety of approaches to qualitative research is matched by the many approaches to data analysis, while quantitative researchers choose from a specialised, standard set of data analysis techniques;

• The results of qualitative data analysis guide subsequent data collection, and analysis is thus a
less-distinct final stage of the research process than quantitative analysis, where data analysis
does not begin until all data have been collected and condensed into numbers;

• Qualitative researchers create new concepts and theory by blending together empirical and
abstract concepts, while quantitative researchers manipulate numbers in order to test a
hypothesis with variable constructs; and

• Qualitative data analysis is in the form of words, which are relatively imprecise, diffuse and
context based, but quantitative researchers use the language of statistical relationships in
analysis.

Q.4. Explain the software used for data analysis.


Ans. We now list some of the software programs and examine their application in the different areas
of business. As you read through these, ponder how researchers might be helped by the different
software.

Groupware: Groupware is software that runs on a network so that teams can work on joint projects, and it allows people from different departments to access data jointly. For example, if the accounting, finance, sales, and production departments have to coordinate their efforts to come up with a viable product within a tight budget, they will be served well by groupware. This software is of immense use for the efficient and effective completion of specific team projects.

Neural Networks: Neural networks are designed to trace patterns in a set of data and generalize from them. This software enables sales forecasts, stock market predictions, detection of weather patterns, and the like. California Scientific Software's BrainMaker, used for managing investments by recognizing patterns and trends influencing stock prices, can be cited as a specific example.
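BrainMaker itself is proprietary, but the general idea of a neural network tracing a pattern in data and generalising from it can be sketched roughly with scikit-learn on synthetic data. Everything below is an assumption made for illustration, not a reconstruction of BrainMaker:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic data: the target depends nonlinearly on two input features.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2   # the hidden pattern to learn

# A small feed-forward network learns the pattern from examples.
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
model.fit(X, y)

# Generalisation: predict for an input the network has never seen.
print(model.predict([[0.3, -0.4]]))
```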
CAM/CAD: Computer-aided manufacturing (CAM) software helps engineers to design manufacturing components and directs the production of the product. Computer-aided design (CAD) software creates and displays complex drawings with precision, enabling experimentation with different designs. CAD/CAM software that integrates the two has been in use for a long time in the manufacturing and production units of organizations. Design sophistication and product development are made possible by this program, and this software is extensively used by manufacturing organizations.

Q.5. Explain the integral parts of a business research report.

Ans. The integral parts of a research report are as follows:

• Abstract,
• Introduction,
• Methodology,
• Results,
• Discussion, and
• References.
The Abstract

The abstract is an overview of the research study and is typically two to four paragraphs in length.
Think of it as an executive summary that distills the key elements of the remaining sections into a few
sentences.

Introduction

The introduction provides the key question that the researcher is attempting to answer and a review of
any literature that is relevant. In addition, the researcher will provide a rationale for why the research
is important and will present a hypothesis that attempts to answer the key question. Lastly, the
introduction should summarize the state of the key question following the completion of the research.
For example, are there any important issues or questions still open?

Methodology

The methodology section of the research report is arguably the most important for two reasons. First, it allows readers to evaluate the quality of the research and, second, it provides the details by which another researcher may replicate and validate the findings. (1)

Typically, the information in the methodology section is arranged in chronological order, with the most important information at the top of each section.

Ideally, the description of the methodology doesn't force you to refer to other documents; however, if the author is relying on existing methods, they will be referenced.

Results

In longer research papers, the results section contains the data and perhaps a short introduction.
Typically the interpretation of the data and the analysis is reserved for the discussion section.

Discussion

The discussion section is where the results of the study are interpreted and evaluated against the existing body of research literature. In addition, should there be any anomalies found in the results, this is where the authors will point them out. Lastly, the discussion section will attempt to connect the results to the bigger picture and show how the results might be applied. (3)

References

This section provides a list of each author and paper cited in the research report. Any fact, idea, or
direct quotation used in the report should be cited and referenced.

Q.6. Explain oral presentation.


Ans. An oral presentation is a formal, research-based presentation of your work. Presentations happen
in a range of different places. For instance, if you work at a company that assigns people to teams to
collaborate on projects, your project team might give an oral presentation of your progress on a
particular project. If you work with a nonprofit organization that hosts an annual meeting at which the
organization shares its activities, budget, and goals with funders and community members, you might
give an oral presentation delivering that information. Learning how to construct and deliver an
effective oral presentation is a useful skill. In this context, we’re referring to oral presentations given
to report on a research project and your research findings.

Q.7. Explain methods of assessing research quality.


Ans. Systematic review
Systematic review is a rigorous, transparent, and replicable methodology that has become widely used
to inform evidence-based policy, management, and decision making. Systematic reviews follow a
detailed protocol with explicit inclusion and exclusion criteria to ensure a repeatable and
comprehensive review of the target literature. Review protocols are shared and often published as peer
reviewed articles before undertaking the review to invite critique and suggestions. Systematic reviews
are most commonly used to synthesize knowledge on an empirical question by collating data and analyses from a series of comparable studies, though methods used in systematic reviews are continually evolving and are increasingly being developed to explore a wider diversity of questions.

Search terms
Search terms were designed to identify publications that discuss the evaluation or assessment of the quality or excellence of research that is done in a transdisciplinary research (TDR) context. The search strategy favored sensitivity over specificity to ensure that we captured the relevant information.

Databases searched
ISI Web of Knowledge (WoK) and Scopus were searched between 26 June 2013 and 6 August 2013.
The combined searches yielded 15,613 unique citations. Additional searches to update the first searches were carried out in June 2014 and March 2015, for a total of 19,402 titles scanned. Google
Scholar (GS) was searched separately by two reviewers during each search period. The first reviewer’s
search was done on 2 September 2013 (Search 1) and 3 September 2013 (Search 2), yielding 739 and
745 titles, respectively. The second reviewer’s search was done on 19 November 2013 (Search 1) and
25 November 2013 (Search 2), yielding 769 and 774 titles, respectively. A third search done on 17
March 2015 by one reviewer yielded 98 new titles. Reviewers found high redundancy between the
WoK/Scopus searches and the GS searches.

Targeted journal searches


Highly relevant journals, including Research Evaluation, Evaluation and Program Planning,
Scientometrics, Research Policy, Futures, American Journal of Evaluation, Evaluation Review, and
Evaluation, were comprehensively searched using broader, more inclusive search strings that would
have been unmanageable for the main database search.

Supplementary searches
References in included articles were reviewed to identify additional relevant literature. td-net's 'Tour d'Horizon of Literature' lists important inter- and transdisciplinary publications collected through an invitation to experts in the field to submit publications (td-net 2014). Six additional articles were identified via these supplementary searches.

Limitations of coverage
The review was limited to English-language published articles and material available through internet
searches. There was no systematic way to search the gray (unpublished) literature, but relevant
material identified through supplementary searches was included.

Inclusion of articles
This study sought articles that review, critique, discuss, and/or propose principles, criteria, indicators, and/or measures for the evaluation of quality relevant to TDR. As noted, this yielded a large number of titles. We then selected only those articles with an explicit focus on the meaning of interdisciplinary research (IDR) and/or TDR quality and how to achieve, measure or evaluate it. Inclusion and exclusion criteria were developed through an iterative process of trial article screening and discussion within the research team.

Critical appraisal
In typical systematic reviews, individual articles are appraised to ensure that they are adequate for
answering the research question and to assess the methods of each study for susceptibility to bias that
could influence the outcome of the review (Petticrew and Roberts 2006). Most papers included in this
review are theoretical and methodological papers, not empirical studies. Most do not have explicit
methods that can be appraised with existing quality assessment frameworks. Our critical appraisal
considered four criteria adapted from Spencer et al. (2003): (1) relevance to the review question, (2) clarity and logic of how information in the paper was generated, (3) significance of the contribution (are new ideas offered?), and (4) generalizability (is the context specified; do the ideas apply in other contexts?). Disagreements were discussed to reach consensus.

Data extraction and management


The review sought information on: arguments for or against expanding definitions of research quality, purposes for research quality evaluation, principles of research quality, criteria for research quality assessment, indicators and measures of research quality, and processes for evaluating TDR.

Data synthesis and TDR framework design
Our aim was to synthesize ideas, definitions, and recommendations for TDR quality criteria into a
comprehensive and generalizable framework for the evaluation of quality in TDR. Key ideas were
extracted from each article and summarized in an Excel database. We classified these ideas into
themes and ultimately into overarching principles and associated criteria of TDR quality organized as
a rubric (Wickson and Carew 2014). Definitions of each principle and criterion were developed and rubric statements formulated based on the literature and our experience. These criteria (adjusted appropriately to be applied ex ante or ex post) are intended to be used to assess a TDR project. The reviewer should consider whether the project fully satisfies, partially satisfies, or fails to satisfy each criterion.
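To make the rubric logic concrete, here is a deliberately simplified and entirely hypothetical sketch of how such scoring might be recorded; the principle and criterion names are placeholders, not the framework actually developed in the review:

```python
# Hypothetical rubric: each criterion is rated as fully, partially, or
# failing to satisfy, and ratings are rolled up per principle.
SCORES = {"fully": 2, "partially": 1, "fails": 0}

assessment = {  # placeholder principles and criteria for illustration only
    "Relevance":   {"problem orientation": "fully", "engagement": "partially"},
    "Credibility": {"clear methods": "fully", "reflexivity": "fails"},
}

for principle, criteria in assessment.items():
    total = sum(SCORES[rating] for rating in criteria.values())
    print(f"{principle}: {total}/{2 * len(criteria)} points")
```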

Q.8. Explain the types of case studies.


Ans. Types of case studies:

• Critical instance case studies gather data to examine a single instance of unique interest and/or to perform a limited test on an assertion about a strategy, program, or problem.
• Cumulative case studies gather data from many case studies to answer audit/evaluation questions.
• Exploratory case studies gather data both to describe conditions and to generate hypotheses for future investigation.
• Illustrative case studies gather data to describe and add realism and/or in-depth examples about a program or policy.
• Program effects case studies gather data to examine causality and usually involve multiple program sites and multiple audit/evaluation methods.
• Program implementation case studies gather data on program operations, often at multiple program sites.
Q.9. Explain the characteristics of scientific research.
Ans. The characteristics of scientific research are as follows: 1. Objectivity 2. Verifiability 3. Ethical
Neutrality 4. Systematic Exploration 5. Reliability 6. Precision 7. Accuracy 8. Abstractness 9.
Predictability.
1. Objectivity:

Scientific knowledge is objective. Objectivity simply means the ability to see and accept facts as they are, not as one might wish them to be. To be objective, one has to guard against one's own biases, beliefs, wishes, values and preferences. Objectivity demands that one must set aside all sorts of subjective considerations and prejudices.

2. Verifiability:

Science rests upon sense data, i.e., data gathered through our senses—eye, ear, nose, tongue and touch.
Scientific knowledge is based on verifiable evidence (concrete factual observations) so that other observers can observe, weigh or measure the same phenomena and check the observations for accuracy.

Questions such as 'Is there a God?' or 'Is the Varna system ethical?', or questions pertaining to the existence of the soul, heaven or hell, are not scientific questions because they cannot be treated factually. The evidence regarding their existence cannot be gathered through our senses. Science does not have answers for everything. It deals only with those questions about which verifiable evidence can be found.

3. Ethical Neutrality:

Science is ethically neutral. It only seeks knowledge. How this knowledge is to be used, is determined
by societal values. Knowledge can be put to differing uses. Knowledge about atomic energy can be
used to cure diseases or to wage atomic warfare.

Ethical neutrality does not mean that the scientist has no values. It only means that the scientist must not allow his or her values to distort the design and conduct of the research. Thus, scientific knowledge is value-neutral or value-free.

4. Systematic Exploration:

Scientific research adopts a certain sequential procedure, an organised plan or design of research for collecting and analysing facts about the problem under study. Generally, this plan includes a few scientific steps: formulation of a hypothesis, collection of facts, analysis of facts (classification, coding and tabulation), and scientific generalisation and prediction.

5. Reliability:
Scientific knowledge must occur under the prescribed circumstances not once but repeatedly. It is
reproducible under the circumstances stated anywhere and anytime. Conclusions based on casual
recollections are not very reliable.

Q.10. Mention the qualities of useful information.


Ans. What are the qualities of 'good' or useful information?
The definition of 'good' information varies between different users of information. Therefore, it may be helpful to consider who might be the users of the information generated by research:

● planners have clear information requirements as they seek to identify and solve problems as outlined above;
● implementers of projects need to monitor what they are achieving and the impacts of their activities, in order to adjust their activities;
● policy-makers, like planners, must be informed about problems, their causes, and the means of overcoming them;
● donors seek to take actions to support other stakeholders and at the same time pursue their own, sometimes contradictory, objectives;
● service agencies (such as extension and research organisations, and input and output marketing companies, for example) need to make decisions about how to invest and what activities to engage in; and
● academics can be major users of information.
