DIT Short Course - Week 1 Webinar Slides

This document discusses research methods for doctoral-level research, comparing qualitative and quantitative approaches. It covers key topics like developing research questions, experimental design, ethnography, case studies, and surveys. The goal is to help students understand what is required for rigorous doctoral research and to compare different methodologies.


Researching at the doctoral level

Webinar: Qualitative and quantitative research methods and techniques in computing

Craig S Wright
School of Computing and Mathematics
Charles Sturt University, NSW 2678
[email protected]
When thinking about your research problem:
• Is it significant?
• Are you really interested in it?
• Is it novel?
• Is it an important area?
– High cost, high risk?
• Can it be studied?
• Is it relevant?
Overview
• This series of lectures is designed to help you understand what is required in doctoral-level research.

• In this series, we will review a variety of research methods:
– Experimental design,
– ethnography,
– case study,
– survey methods.

• We will compare qualitative and quantitative research methods and techniques in computing.
Fundamental Goals...
• Of Science:
– To Understand, To Predict, To Control

• Of Scientists:
– To communicate discoveries and findings to a community of peers
Introduction to Research
• Think about research – engineering / science
• Outcomes Model designed to put boundaries
around your area of study and expertise
• Variable identification
• Understanding rigor – correct methods for any
type of research design
• Enhance enjoyment in reading research articles
• Understand the challenge of the words so easily
used, “evidence-based practice.”

Designing Research
Dimensions of Analysis
• Research Purposes - theoretical or applied?
• Research Problems - what questions are asked?
• Research Settings - simulated or natural?
• Research Investigators - background and training
• Research Methods - a continuum
– Experimental, Ethnography, Case study, Survey
Topic overview
This topic looks at the research methods in detail.
• Research Purposes - theoretical or applied?
• Research Problems - what questions are asked?
• Research Settings - simulated or natural?
• Research Investigators - background and training
• Research Methods - a continuum
– Experimental, Ethnography, Case study, Survey
Definition of the Research Problem

[Figure: the real-world context, the researcher's philosophical position, and the overall research questions are checked against one another for match or mismatch; where they match, they lead to the research objective.]
Research Methodologies
A continuum rather than “either/or”
• Qualitative
– Goal: To understand, predict
• Descriptive accounts
• Similarities and contrasts
– Applied and theoretical
– Research …
• Quantitative
– Goal: To predict and control
• Measure and evaluate
• Generalize to population, reproduction
– Basic and theoretical
– Hypothesis testing
– Lab study
Types of Research Methods:
(all have rules of evidence!)

Qualitative
– Grounded theory
– Ethnography
– Critical feminist theory
– Phenomenology
– Content analysis
– Models of analysis: fidelity to text or words of interviewees

Quantitative
– Non-experimental or descriptive
– Experimental or randomized controlled trials
– Models of analysis: parametric vs. non-parametric
Evaluating Research
• Validity
– A concern for most social scientists is the complex nature of the
phenomena under study: human behavior.
– Multiple perspectives are required in order to adequately reflect the
richness of these complexities.

• Reliability
– Consistency, Replicability

• Usefulness or Value of Investigation


– Contribution to knowledge
– Advance THEORY and PRACTICE in discipline
Overall Research Questions
• They tell you what you want to focus on and what you
want to know
• They set the rough boundaries of the research: you will
study some issues in some context with some actors
• They are oriented towards action and process
• The way they are (implicitly) formulated will determine
research strategy later on
• They set the vision for the research project and help focus activities
Validity and Reliability
• Both Quantitative and Qualitative research
designs seek reliable and valid results. For
example:
– Quantitative Reliability: Data that are consistent or
stable as indicated by the researcher's ability to
replicate the findings.
– Qualitative: Validity of findings is paramount, so that the data are representative of a true and full picture of the constructs under investigation.
Part Versus Whole
• “Whole” is often greater than “Parts”
• It is a non-trivial matter to infer the behavior of the whole
from the behavior of its parts
– Quantitative research designs strive to identify and
isolate specific variables within the context (seeking
correlation, relationships, causality) of the study.
– Qualitative design focuses on a holistic view of what
is being studied (via documents, case histories,
observations and interviews).
Data Collection
• Quantitative
– Emphasis on numerical data, measurable variables
– Data is collected under controlled conditions in order
to rule out the possibility that variables other than the
one under study can account for the relationships
identified
• Qualitative
– Emphasis on observation and interpretation.
– Data are collected within the context of their natural
occurrence.
Static and Dynamic
• Quantitative
– The accumulation of facts and causes of behavior
through careful isolation, measurement and
evaluation of variables.
– Predictability and Control over time.
• Qualitative
– Concerned with the changing and dynamic nature of
reality.
– Understanding a Point in time
Range of Research Methods

• Experimental design
• Ethnography
• Case study
• Survey
Experimental Design

• Hypothesis testing
• Independent and Dependent Variables
– For example - predictor: method of instruction; resulting differences: math performance
• Sampling of Population
• Experimental and Controlled Conditions
• Random assignment (see the sketch after this list)
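To make the random-assignment step concrete, here is a minimal sketch in Python, assuming a simple two-condition design; the participant identifiers, group labels, and fixed seed are illustrative assumptions, not part of the slides.

```python
import random

# Hypothetical participant identifiers; in practice these come from your sample.
participants = [f"P{i:02d}" for i in range(1, 21)]

# Fix the seed so the assignment can be reproduced and reported.
random.seed(42)
shuffled = participants[:]
random.shuffle(shuffled)

# Split the shuffled list into equally sized experimental and control groups.
half = len(shuffled) // 2
assignment = {p: "experimental" for p in shuffled[:half]}
assignment.update({p: "control" for p in shuffled[half:]})

for p in sorted(assignment):
    print(p, assignment[p])
```

Recording the seed and the assignment procedure supports the "careful description of the sampling procedure" point on the next slide.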
Experimental Research

• The researcher does something to the subjects or objects of research, and then attempts to determine the effects of these actions
• Reporting
– Careful description of the sampling procedure
– Inferential statistics, effect size, and so on.
Ethnography

• Defined: a picture of the “way of life” of some identifiable group of people
• Anthropology - “doing fieldwork”, “going native”
• Preoccupied with culture, and how people interact with each other
• Qualitative Methodology - both a research process and a product
– Outcome: an ethnographic account
Field Research Techniques
• An Inquiry Process of multiple methods:
– Participant observation
• privileged, active participant
• passive observer
– Interviewing
• key informants, structured, unstructured
• groups, surveys and questionnaires
– Making and using records
• historical documents, archives, written records
Case Study
• Understanding the intricate complexity,
idiosyncrasy of one particular case
– investigation of a “bounded system”
– Some entity deemed worthy of close watch
• a single child, a single classroom, a single school, a single
national program…
• Goals
– Understand and report the uniqueness of individual
cases (both commonalities and differences)
– Usually no attempt to represent case by single or
multiple “scores”
Case Study Methods
• Mostly Social
– ASKING - Interviews
• Gather narrative and testimony
– WATCHING - Observations
– SEARCHING - Written records and artifacts
• Reporting
– Develop a conceptual structure, look for patterns,
consistencies, repetitions, and manifestations
pertinent to your research question(s)
Validity and Reliability
• There are many different stories to be told
– Different researchers have different questions to
answer, different conceptualizations of the situation,
and set different boundaries for the case
• Generalizability: What is true of one case is
often true about other cases
– Consistencies can be found - predictability
– How many cases are needed before patterns
emerge? It depends...
Survey Research Methods
• Purpose and Goal
– Describe specific characteristics of a large group of
persons, objects, or institutions
– Understand present conditions, rather than the
effects of particular intervention (as in experimental
research)
• Sample of Population
– Groups of interest are well defined and chosen using
well defined rules
– Representativeness
Survey Methods
• Mail
– postage and printing costs, participation rate

• Telephone
– sampling, wage and time costs, participation rates

• Face-to-Face
– wage and time costs, participation rates, like structured interview

• Web-based
– anytime, anywhere, cost effective
Issues in Survey Construction
• Item (question) and scale construction
• Pilot Testing and revision
• Sampling procedures (see the sampling sketch after this list)
• Analysis and reporting of results
• Generalizability
– Drawing conclusions about the conditions, attitudes,
opinions, or status of a population of persons, objects,
institutions, or other entities.
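As one way to operationalise the sampling-procedures point above, the sketch below draws a proportional stratified random sample with pandas; the sampling frame, the `sector` stratum, and the 20% fraction are assumptions invented for illustration.

```python
import pandas as pd

# Hypothetical sampling frame: one row per institution, with a stratum label.
frame = pd.DataFrame({
    "institution_id": range(1, 101),
    "sector": ["university"] * 40 + ["college"] * 35 + ["industry"] * 25,
})

# Draw a 20% sample within each stratum, so every sector is represented
# in proportion to its share of the population (representativeness).
sample = frame.groupby("sector", group_keys=False).sample(frac=0.2, random_state=1)
print(sample["sector"].value_counts())
```

A fixed `random_state` makes the draw reproducible, which helps when reporting the sampling procedure.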
Results
• Academics promoted the use of both
quantitative and qualitative measures to report
on “quality”
– QUANTITY OF:
• Journal publications, conference presentations, books and
book chapters, awards, grants, budget, and so on…
– QUALITY OF:
• Reputation of publication, reputation of granting agency,
quality of conference, peer reviews of research programs,…
• Quality of institutions that hire graduate students
• Societal benefit of research
Research Objective
One can distinguish mainly three objectives or purposes of a research project:

• To explain the causality between different observations, or the reasons behind a certain situation concerning the phenomenon
• To explore a vague problem or a new area of research
• To describe, i.e., observe and visualise the situation of certain phenomena

The research objective does not automatically define a quantitative or qualitative logic.
Research Objective

The research questions implicitly determine the research objective, and together they indicate quantitative vs. qualitative research:
• WHAT questions of a descriptive nature, in the sense of “how much” or “how many”, call for a quantitative approach
• WHAT questions of an explanatory or exploratory nature call for a qualitative approach
• HOW questions and WHY questions call for a qualitative approach

Qualitative research is needed when we want to come to terms with the meaning, not the “right” or “wrong”, of the phenomena under investigation.

Source: PhD Seminar Series, Qualitative Research Methodology, K.E. Soderquist
DEFINITION OF THE RESEARCH PROBLEM
The “Full” Process Model of Methodological Choice

[Figure: a chain of elements linked by match/mismatch checks:
– Real world context ↔ Philosophical position ↔ Overall research questions → Research objective
– Research objective ↔ Unit of analysis ↔ Research strategy
– Relation to theory: Deduction – predefinition and test of a theoretical model (theory extension); Induction – determination of theory from observations (theory development)
– Methodological choice: qualitative or quantitative methods combined with deductive or inductive logic, plus data collection and data analysis methods appropriate for the chosen methodology]
Where do ideas come from?
• Literature reviews
• Newspaper stories
• Being a research assistant
• Mentors/teachers
• Fellow students
• Patients
• Field experience
• Experts in the field

Build your area of expertise from multiple sources.


Uses of Substruction
• Critique a published study
• Plan a new study

Substruction
• A strategy to help you understand the theory and methods (operational system) in a research study
• Applies to empirical, quantitative research studies
• “Substruction” is not a dictionary word; it has an inductive meaning (constructing) and a deductive meaning (deconstructing)
• Heuristic
Substruction

[Figure:
Theory (theoretical system): Construct → Concept – deductive (qualitative)
Methods (operational system): Measures → Scaling/data analysis – inductive (quantitative)]
Substruction:
Building Blocks or Statements of Relationships

• Construct – axiom – Construct (e.g. pain and quality of life)
• Concept – proposition – Concept (e.g. intensity and functional status)
• Measure – hypothesis – Measure (e.g. 10 cm scale and mobility scale)
Statements of Relationships
• Postulate: a statement of the relationship between a construct and its concepts
• Construct: Security attacks consist of three concepts
• Concepts: intensity, location, duration
Substruction:
Research Design Perspective

Focus of study (RCT?):
• Covariates Z – severity of attack, used for risk adjustment (analysis of covariance; see the sketch below)
• Independent variable X – the treatment
• Dependent variable Y – how is it measured?
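A minimal sketch of the analysis-of-covariance idea above, using statsmodels; the variable names, the two treatment conditions, and the simulated data are assumptions made up for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 80

# Hypothetical data: treatment X (two incident-response conditions),
# covariate Z (attack severity), outcome Y (e.g. recovery time).
df = pd.DataFrame({
    "X": rng.choice(["standard", "new"], size=n),
    "Z": rng.normal(5, 2, size=n),
})
df["Y"] = 10 + 3 * (df["X"] == "new") + 1.5 * df["Z"] + rng.normal(0, 2, size=n)

# ANCOVA expressed as a linear model: the treatment effect on Y is
# estimated while adjusting for the covariate Z.
model = smf.ols("Y ~ C(X) + Z", data=df).fit()
print(model.summary())
```

The coefficient on `C(X)` is the treatment effect adjusted for attack severity, which corresponds to the risk adjustment the slide describes.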
Substruction: Theoretical System, an example
Hacker Intervention Study

[Figure: constructs and concepts in the study include incident management (post-incident handling, containment, eradication, communication), severity of attack, class of user, system recovery, length of intrusion, and system owner satisfaction.]
Substruction: Operational System

Attack intensity
• Instrument: scale (low to high attack threshold)
• Scale: continuous or discrete?

Functional status
• Instrument: 1-5 scale, 1 = low and 5 = high function
• Scale: continuous or discrete?
Scaling
Discrete: non-parametric tests (chi-square)
• Nominal – e.g. gender
• Ordinal – e.g. low, medium, high income
Continuous: parametric tests (t or F tests)
• Interval – e.g. Likert scale, 1-5 functionality
• Ratio – e.g. money, timing
(See the sketch after this slide for how the level of scaling drives the choice of test.)
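A minimal sketch, on made-up data, of how the level of scaling drives the choice between a non-parametric chi-square test and a parametric t-test; the variables and numbers are illustrative only.

```python
import numpy as np
from scipy import stats

# Nominal (discrete) data: counts of two groups across two categories
# -> non-parametric chi-square test of independence.
contingency = np.array([[30, 20],
                        [15, 35]])
chi2, p_chi, dof, _ = stats.chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, p = {p_chi:.3f}")

# Interval/ratio (continuous) data: scores from two independent groups
# -> parametric two-sample t-test.
rng = np.random.default_rng(1)
group_a = rng.normal(3.2, 0.8, size=40)
group_b = rng.normal(3.6, 0.8, size=40)
t, p_t = stats.ttest_ind(group_a, group_b)
print(f"t = {t:.2f}, p = {p_t:.3f}")
```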
Issues
• What is the conceptual basis of the study?
• What are the major concepts and their
relationships?
• Are the proposed relationships among the
constructs and concepts logical and defensible?
• How are the concepts measured? valid?
reliable?
• What is the level of scaling and does it relate to
the appropriate statistical or data analytical
plan?
• Is there logical consistency between the
theoretical system and the operational system?
Literature Review
• We review the literature in order to
understand the theoretical and operational
systems relevant to our area of interest.
• What is known about the constructs and
concepts in our area of interest?
• What theories are proposed that link our
variables of interest?

Literature Review
• What is known?
• What is not known?

• Resources
– The IEEE, Journals, Industry
– Library Data Bases

Literature Review:
How to combine, synthesize, and demonstrate direction?

[Figure: Topic, branching into Study 1, Study 2, Study 3]
Literature Review

[Figure: Topic, branching into Study 1, Study 2, Study 3]
Table 1. Outline of study variables related to your topic

Columns: Studies | Covariates – Z | Interventions – independent variable X | Outcomes – dependent variable Y
Rows (to be filled in from the reviewed studies): Zia (2009); Wright (2013); etc.
Table 2. Threats to validity of research studies related to topic

Columns: Author (year) | Type of Design | Diagram | Statistical Conclusion Validity | Construct Validity of Cause & Effect | Internal Validity | External Validity
Zia (2009): RCT; diagram: O X1 O / O X2 O / O O; n/a
Wright (2013): –
Table 3. Instruments

Columns: Studies | Instrument | # items | Validity | Reliability | Utility
Zia (2009): Questionnaire
Wright (2003): –
Table 4. Power analysis for literature review on topic

Columns: Studies | Sample Size | Alpha | Power | Effect Size
Zia (2009): 32 (experimental) and 40 (control) | 0.05 | 0.60 | estimated at medium
Wright (2013): –
(See the sketch after this table for how such figures can be checked.)
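The power figures tabulated above can be sanity-checked in Python; a minimal sketch using statsmodels, where the sample sizes, alpha, and “medium” effect size (taken as Cohen's d = 0.5) mirror the illustrative Zia (2009) row rather than real data.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Achieved power for an independent two-group t-test:
# n1 = 32 (experimental), n2 = 40 (control), alpha = 0.05, medium effect (d = 0.5).
power = analysis.solve_power(effect_size=0.5, nobs1=32, ratio=40 / 32, alpha=0.05)
print(f"achieved power ~ {power:.2f}")

# Sample size per group needed to reach 0.80 power at the same effect size.
n_needed = analysis.solve_power(effect_size=0.5, power=0.80, alpha=0.05)
print(f"n per group for 80% power ~ {n_needed:.0f}")
```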
Literature Synthesis
• Synthesis - what we know and do not
know
• Strengths – rigor, types of design,
instruments?
• Weaknesses – lack of rigor, no RCTs, poorly developed instruments
• Future needs – what is the next step?

Research Designs
Qualitative
Quantitative

Research Design: Qualitative
• Ethnography
• Phenomenology
• Hermeneutics
• Grounded Theory
• Historical
• Case Study
• Narrative

Rigor in Qualitative Research
• Dependability
• Credibility
• Transferability
• Confirmability

Types of Quantitative Research
Designs
• We will focus on RIGOR:

– Experimental

– Non-experimental

X, Y, Z notation
• Z = covariate
– e.g. severity of attack
• X = independent variable (intervention)
– e.g. intrusion management
• Y = dependent variable (outcome)
– e.g. quality of system recovery
Types of Quantitative Research Designs
– Descriptive: X? Y? Z?
• What is X, Y, and Z?

– Correlational: r_xy.z
• Is there a relationship between X and Y, controlling for Z? (See the sketch after this list.)

– Causal: ΔX → ΔY?
• Does a change in X cause a change in Y?
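As a rough illustration of the r_xy.z notation above (the correlation between X and Y with Z held constant), the sketch below computes a partial correlation by residualising X and Y on Z; the simulated data and coefficients are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200

# Simulated variables: Z influences both X and Y, plus a direct X -> Y effect.
Z = rng.normal(size=n)
X = 0.7 * Z + rng.normal(size=n)
Y = 0.5 * X + 0.6 * Z + rng.normal(size=n)

def residuals(a, b):
    """Residuals of a after removing a linear fit on b."""
    slope, intercept = np.polyfit(b, a, 1)
    return a - (slope * b + intercept)

# Partial correlation r_xy.z: correlate the parts of X and Y not explained by Z.
r_xy = np.corrcoef(X, Y)[0, 1]
r_xy_z = np.corrcoef(residuals(X, Z), residuals(Y, Z))[0, 1]
print(f"zero-order r_xy = {r_xy:.2f}, partial r_xy.z = {r_xy_z:.2f}")
```

Comparing the zero-order and partial correlations shows how much of the X–Y relationship is carried by the covariate Z.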
Rigor in Quantitative Research
• Theoretical Grounding: Axioms & postulates
– substruction-validity of hypothesized
relationships
• Design validity (internal & external) of
research design; Instrument validity and
reliability
• Statistical assumptions met (scaling, normal
curve, linear relationship, etc.)

(Note: Polit & Beck: reliability, validity, generalizability, objectivity)
[Figure: Literature Review → Study Aims → Study Question → Study Hypothesis]
Aim, Question, and Hypothesis
• Study Aim: To explore whether it is possible to reduce patient falls for the elderly in nursing homes.
• Study Question: Does putting a “sitter” in a
patient room reduce the incidence of falls?
• Study Hypothesis:
Null: H0: There is no difference between patients
who have a “sitter” and those who do not in the
incidence of falls.
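One way the null hypothesis above could be tested, assuming falls are recorded as a count of patients who fell in each group (the numbers below are invented), is a two-proportion z-test with statsmodels; a chi-square test on the 2x2 table would give an equivalent result.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical data: number of patients who fell, out of the total in each group.
falls = [8, 19]        # sitter group, no-sitter group
patients = [120, 118]

# H0: the incidence of falls is the same with and without a sitter.
z, p = proportions_ztest(count=falls, nobs=patients)
print(f"z = {z:.2f}, p = {p:.3f}")
```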
Thank you
Reading and Study
Recommended readings
Reading 1
• Woodley, A. (2004). Getting and analysing quantitative data. PREST Core
Module A3. Commonwealth of Learning. Available online
http://www.col.org/SiteCollectionDocuments/A3.pdf [accessed on 8 March
2012]
Reading 2
• Walker, R. (2004). Getting and analysing qualitative data. PREST Core
Module A4. Commonwealth of Learning. Available online
http://www.col.org/SiteCollectionDocuments/A4.pdf [accessed on 8 March
2012]
Reading 3
• Spratt, C., Walker, R., & Robinson, B. (2004). Mixed research methods.
PREST Core Module A5. Commonwealth of Learning. Available online
http://www.col.org/SiteCollectionDocuments/A5.pdf [accessed on 8 March
2012]
Reading and Study
Recommended readings
Reading 4
• Sokolowski, J.A., & Banks, C. M. (2009). Principles of Modelling and
Simulation: A Multidisciplinary Approach. Hoboken, NJ. USA. Wiley.
(Available online eBook from CSU library)
Reading 5
• Maria, A. (1997). Introduction to modelling and simulation. Proceedings of
the 29th conference on Winter simulation (WSC 97). IEEE Computer
Society. ACM Digital Library.
Reading 6
• Arsham, H. (n.d.). Systems Simulation: The Shortest Route to Applications.
Available online http://home.ubalt.edu/ntsbarsh/simulation/sim.htm
[Accessed on 8 March 2012]
Reading and Study
Recommended readings
Reading 7
• McGeoch, C. C. (2007). Experimental Algorithms. Communications of the
ACM. Volume 50, Issue 11. ACM Digital Library.
Reading 8
• Johnson, D. S. (2001). A Theoretician’s Guide to the Experimental
Analysis of Algorithms. Available online
http://www2.research.att.com/~dsj/papers/experguide.pdf [accessed on 8 March 2012]
Reading 9
• Tichy, W. F. (1998). Should Computer Scientists Experiment More? IEEE Computer. Volume 31, No. 5. IEEE Xplore.
Reading and Study
Recommended readings
Reading 10
• Mytkowicz, T., Diwan, A., Hauswirth, M., & Sweeney, P. F. (2009).
Producing wrong data without doing anything obviously wrong!
Proceedings of the 14th international conference on Architectural support
for programming languages and operating systems. ACM, NY. USA. ACM
Digital Library.
Reading 11
• Wang, B., Yao, Y., Himmelspach, J., Ewald, R., Uhrmacher, A. M. (2009).
Experimental analysis of logical process simulation algorithms in JAMES
II. Proceedings of the 2009 Winter Simulation Conference. ACM Digital
Library.
Reading 12
• Feitelson, D. G. (2006). Experimental computer science: The Need for a
Cultural Change. Available online
http://www.cs.huji.ac.il/~feit/papers/exp05.pdf [Accessed on 8 March 2012]
Reading and Study
Recommended readings
Reading 13
• Booth, W. C., Colomb, G. G., & Williams, J. M. (2008). The Craft of
Research (3rd Ed). Chicago, USA. The University of Chicago Press.
(Pages 177-269)
• (Available online eBook from CSU library)
Reading 14
• Hamalainen, W. (2006). Scientific Writing for Computer Science Students.
Available online. http://www.cs.joensuu.fi/pages/whamalai/sciwri/sciwri.pdf
• [Accessed on 12 March 2012]
Reading 15
• Kaiser, G., Partridge, C., Roy, S., Siegel E., Stolfo, S., Trevisan, L.,
Yemini, Y., Zadok, E., & Craveiro, J. (n.d.). Writing Technical Articles.
Available online. http://www.cs.columbia.edu/~hgs/etc/writing-style.html
[Accessed on 12 March 2012]
Other Resources
• Bowen, K. A. (1996). The Sin of Omission -Punishable
by Death to Internal Validity: An Argument for Integration
of Qualitative and Quantitative Research Methods to
Strengthen Internal Validity. [On-line]. Available:
http://trochim.human.cornell.edu/gallery/bowen/hss691.htm
• Edyburn, D. L. (1998). The Electronic Scholar:
Enhancing Research Productivity with Technology.
Prentice-Hall: Columbus, OH.
• Jaeger, R. M. (1997). Complementary Research
Methods for Research in Education, (2nd ed). American
Educational Research Association: Washington, DC.
