Data Analysis: Parametric vs. Non-Parametric Tests

This document discusses parametric and non-parametric statistical tests. Parametric tests make assumptions about the population parameters and include the paired-samples t-test, independent samples t-test, one-way ANOVA, Pearson correlation, and regression analysis. Non-parametric tests make fewer assumptions. Specific parametric tests are then described in more detail, including how to conduct and interpret their results. Pearson correlation is explained as measuring the strength of association between two variables. Regression analysis can be used to predict one variable from another.

DATA ANALYSIS

PARAMETRIC VS. NON-PARAMETRIC TESTS

LEE G. BARAQUIA
DATA ANALYSIS

Parametric Tests – statistical tests that make assumptions about the parameters (defining properties) of the population.
 A parameter is a measure that describes a population.
 A fixed characteristic of a population, based on all the elements of the population, is termed a parameter.
DATA ANALYSIS

Parametric Tests
 Paired-samples T-test
 Independent Samples T-test
 One-way ANOVA
 Pearson Correlation
 Regression Analysis
PAIRED-SAMPLES T-TEST

 The Paired Samples t Test is also known as the Dependent t Test, Paired t Test, or Repeated Measures t Test.
 It compares two means that are from the same
individual, object, or related units.
 The two means typically represent two different times
(e.g., pre-test and post-test with an intervention between
the two time points) or two different but related
conditions or units (e.g., left and right ears, twins).
 Purpose: To determine whether there is statistical
evidence that the mean difference between paired
observations on a particular outcome is significantly
different from zero.
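As a concrete illustration (not from the original slides), here is a minimal sketch of a paired-samples t-test in Python, assuming SciPy is available; the pre-test and post-test scores are invented example data.

from scipy import stats

# Invented pre/post scores for the same eight learners
pre_test  = [72, 65, 80, 58, 74, 69, 77, 63]
post_test = [78, 70, 85, 64, 79, 73, 82, 66]

# Paired-samples (dependent) t-test
t_stat, p_value = stats.ttest_rel(pre_test, post_test)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# A small p-value (e.g., p < .05) is usually taken as evidence that the
# mean pre/post difference is significantly different from zero.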
INDEPENDENT SAMPLES T-TEST

 The independent t-test is also called the two-sample t-test, independent-samples t-test, or Student's t-test.
 It is an inferential statistical test that determines whether there is a statistically significant difference between the means of two unrelated groups that are normally distributed.
 To run an independent t-test, you need: (1) One
independent, categorical variable that has two
levels/groups and (2) One continuous dependent
variable.
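A minimal sketch of an independent-samples t-test with SciPy, again on invented scores for two unrelated groups; equal_var=False requests Welch's version, which does not assume equal variances.

from scipy import stats

# Invented scores for two unrelated groups
# (one categorical independent variable with two levels)
group_a = [23, 27, 31, 25, 29, 30, 26]
group_b = [20, 22, 25, 24, 21, 23, 26]

t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")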
ONE-WAY ANALYSIS OF VARIANCE (ANOVA)

 One-way ANOVA is used to compare normally distributed variables across more than two groups.
 It is used to compare the means between the groups and determines whether any of those means are statistically significantly different from each other.
 ANOVA cannot tell you which specific groups were statistically significantly different from each other, only that at least two groups were. To determine which specific groups differed from each other, you need to use a post hoc test.
 If your data meet the assumption of homogeneity of variances, use Tukey's honestly significant difference (HSD) post hoc test. Scheffé's test is customarily used with unequal sample sizes.
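A minimal one-way ANOVA sketch with SciPy on three invented groups, followed by Tukey's HSD post hoc test (scipy.stats.tukey_hsd, available in SciPy 1.8 or later) to see which specific pairs of groups differ.

from scipy import stats

# Invented scores for three groups
group1 = [85, 86, 88, 75, 78, 94]
group2 = [81, 85, 87, 86, 79, 88]
group3 = [91, 92, 93, 85, 87, 84]

# Omnibus test: are any of the group means different?
f_stat, p_value = stats.f_oneway(group1, group2, group3)
print(f"F = {f_stat:.3f}, p = {p_value:.4f}")

# Post hoc: pairwise comparisons to find which groups differ
print(stats.tukey_hsd(group1, group2, group3))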
PEARSON CORRELATION

 The Pearson product-moment correlation coefficient (or Pearson correlation coefficient, for short) is a measure of the strength of a linear association between two variables and is denoted by r.
 It attempts to draw a line of best fit through the data of the two variables, and the Pearson correlation coefficient, r, indicates how far away all these data points are from this line of best fit.
 The Pearson r can take a range of values from +1 to -1. A value of 0
indicates that there is no association between the two variables.
 A value greater than 0 indicates a positive association; that is, as
the value of one variable increases, so does the value of the other
variable.
 A value less than 0 indicates a negative association; that is, as the
value of one variable increases, the value of the other variable
decreases.
[Scatter plots: positive relationship, negative relationship, no relationship]
PEARSON CORRELATION

If r = 0, there is no association or correlation between the two variables.
If 0 < |r| < 0.25, the correlation is weak.
If 0.25 ≤ |r| < 0.75, the correlation is intermediate.
If 0.75 ≤ |r| < 1, the correlation is strong.
If |r| = 1, the correlation is perfect.
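A minimal sketch of computing Pearson's r with SciPy on invented data; the sign of r gives the direction of the association and its magnitude can be read against the ranges above.

from scipy import stats

# Invented paired observations on two continuous variables
hours_studied = [1, 2, 3, 4, 5, 6, 7, 8]
exam_score    = [60, 63, 68, 70, 74, 77, 83, 86]

r, p_value = stats.pearsonr(hours_studied, exam_score)
print(f"r = {r:.3f}, p = {p_value:.4f}")
# r near +1: strong positive association; near -1: strong negative;
# near 0: little or no linear association.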
Correlation Vs. Regression

 Correlation tells you whether there is an association between x and y, but it doesn't describe the relationship or allow you to predict one variable from the other.
 To do this, we need REGRESSION ANALYSIS!
REGRESSION ANALYSES

 Regression: a technique concerned with predicting some variables by knowing others.
 Simple Linear Regression - a statistical model that utilizes one quantitative independent variable "X" to predict the quantitative dependent variable "Y."
 Multiple Linear Regression - a statistical model that utilizes two or more quantitative and qualitative explanatory variables (x1, ..., xp) to predict a quantitative dependent variable Y.
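A minimal simple linear regression sketch using scipy.stats.linregress on invented data; the fitted intercept and slope define a prediction equation of the same form as the grade-prediction example on the next slide.

from scipy import stats

# Invented data: hours of study (X) and final grade (Y)
hours_of_study = [1, 2, 3, 5, 6, 8, 10, 12]
final_grade    = [62, 66, 70, 75, 79, 85, 91, 98]

result = stats.linregress(hours_of_study, final_grade)
print(f"grade = {result.intercept:.2f} + {result.slope:.2f} * hours")
print(f"R^2 = {result.rvalue**2:.3f}")

# Predict the grade for someone who studies 7 hours
predicted = result.intercept + result.slope * 7
print(f"predicted grade = {predicted:.2f}")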
REGRESSION MODEL
REGRESSION ANALYSES

 Predicted final grade in class = 59.95 + 3.17*(hours of study)
 Predict the final grade of…
Someone who studies for 12 hours:
Final grade = 59.95 + (3.17*12)
Final grade = 97.99
Someone who studies for 1 hour:
Final grade = 59.95 + (3.17*1)
Final grade = 63.12
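The slide's arithmetic can be checked directly; this small snippet (not from the original) simply re-evaluates the given equation, with intercept 59.95 and slope 3.17 per hour of study.

def predicted_grade(hours):
    # Prediction equation from the slide: 59.95 + 3.17 * (hours of study)
    return 59.95 + 3.17 * hours

print(f"{predicted_grade(12):.2f}")  # 97.99
print(f"{predicted_grade(1):.2f}")   # 63.12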
THE END…..

THANK YOU VERY MUCH!!!
