
REGRESSION

LINEAR - manual and Excel; only two variables

y = mx + c

y = beta0 + beta1*x

CURVILINEAR - SPSS; the degree of x is higher than 1

y = beta0 + beta1*x + beta2*x^2

MULTIPLE - SPSS; y = beta0 + beta1*x1 + beta2*x2 + ...

Interaction: y = beta0 + b1*x1 + b2*x2 + b3*x3 (where x3 = x1*x2)

R^2 - coefficient of determination

value ranges from 0 to 1

0 - no relationship

1 - perfect fit

R - correlation coefficient; value ranges from -1 to +1

Y^ (y-hat) - estimated / predicted value

y - original / observed value

SSE - sum of squared errors = sum of (y - Y^)^2  [original - predicted]^2

SSR - regression sum of squares = sum of (Y^ - ybar)^2  [predicted - average]^2

SST - total sum of squares = sum of (y - ybar)^2  [original - average]^2  {SST = SSR + SSE}

m = sum of (x - xbar)(y - ybar) / sum of (x - xbar)^2

c = ybar - m*xbar

Y^ = mx + c (predicted values)

R^2= SSR/SST

R = (SSR/SST)^0.5, taking the sign of m

e.g. if R^2 = 0.90, 90% of the variability in Y can be explained by X and the rest remains unexplained
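
As a sketch of the manual calculation, the short Python snippet below (with made-up x and y values) computes m, c, the predicted values, SSE, SSR, SST, R^2 and R exactly as defined above.

```python
import numpy as np

x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)  # assumed example data
y = np.array([2, 4, 5, 4, 5, 7, 8, 9, 10, 12], dtype=float)

xbar, ybar = x.mean(), y.mean()

# slope m = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2), intercept c = ybar - m*xbar
m = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
c = ybar - m * xbar

y_hat = m * x + c                    # predicted values

sse = np.sum((y - y_hat) ** 2)       # [original - predicted]^2
ssr = np.sum((y_hat - ybar) ** 2)    # [predicted - average]^2
sst = np.sum((y - ybar) ** 2)        # [original - average]^2 = SSR + SSE

r_squared = ssr / sst
r = np.sign(m) * np.sqrt(r_squared)  # R takes the sign of m
print(m, c, r_squared, r)
```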


ANOVA - Analysis of Variance (example with n = 10)

Source               DoF         MS (mean square)   F value
SSR                  1           MSR = SSR/DoF      F = MSR/MSE
SSE (= SST - SSR)    8  (n - 2)  MSE = SSE/DoF
SST                  9  (n - 1)

(MSE estimates sigma^2; the individual coefficients are tested with the t-distribution)

Alpha = 0.05 (mostly)
F_crit = FINV(alpha, DoF of SSR, DoF of SSE)
P-value = FDIST(F_calc, DoF of SSR, DoF of SSE)
H0: m = 0 (no relationship)
H1: m ≠ 0 (can also be one-sided: m > 0, or m < 0, i.e. as x increases Y decreases)
If alpha is smaller than the p-value, accept (fail to reject) H0; if the p-value is smaller than alpha, reject H0.
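
A small Python sketch of the same F-test, assuming n = 10 and illustrative SSR/SSE values; scipy's f.ppf and f.sf play the role of Excel's FINV and FDIST here.

```python
from scipy import stats

n = 10
ssr, sse = 85.0, 7.5                 # illustrative values (e.g. from a fit like the one above)
df_ssr, df_sse = 1, n - 2            # 1 and 8, as in the table

msr = ssr / df_ssr                   # mean square regression
mse = sse / df_sse                   # mean square error (estimates sigma^2)
f_calc = msr / mse

alpha = 0.05
f_crit = stats.f.ppf(1 - alpha, df_ssr, df_sse)   # like Excel's FINV(alpha, df1, df2)
p_val = stats.f.sf(f_calc, df_ssr, df_sse)        # like Excel's FDIST(F, df1, df2)

# reject H0 (m = 0) when p_val < alpha, equivalently when f_calc > f_crit
print(f_calc, f_crit, p_val, p_val < alpha)
```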

Excel Sequence

1. Data
2. Data Analysis
3. Regression

SPSS Sequence

1. Analyse
2. Regression
3. Linear
All variables should be of measure type Scale.
Standardized coefficient: uses z-scores, (x - avg of x) / std of x
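
A tiny sketch of the standardization step with assumed example values; it just converts a column to z-scores using the formula above (sample standard deviation assumed).

```python
import numpy as np

x = np.array([10.0, 12.0, 15.0, 18.0, 20.0])   # assumed example values
z = (x - x.mean()) / x.std(ddof=1)             # (x - avg of x) / std of x
print(z)
```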

SPSS

Model summary R and R^2

Adjusted R^2 - R^2 adjusted for the number of predictors and the sample size

Durbin-Watson - checks autocorrelation in the residuals; the closer the value is to 2, the better

ANOVA table - the 'Sig.' value is the p-value

Coefficients table - the first two values:

1. Constant (intercept)
2. m value (slope)

When x changes by 1 unit, y changes by m units.

Correlations table - shows how one variable is related to another

Diagonal values will always be 1

It shows the relationship between the 2 variables; a negative coefficient shows an inverse relationship.
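
A quick illustration with made-up data: the correlation matrix has 1s on the diagonal, and a negative off-diagonal value signals the inverse relationship.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])        # assumed example values
y = np.array([10.0, 8.0, 7.0, 5.0, 2.0])       # decreases as x increases

print(np.corrcoef(x, y))                       # 2x2 matrix: 1s on the diagonal,
                                               # negative off-diagonal = inverse relationship
```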

CURVILINEAR
The only difference is the power of x

Need to find x^2

Y=b0+b1x+b2x^2

Plot the graph and check whether it is curved, if this is not stated in the question

Only in SPSS:

Select X and Y
Regression
Curve Estimation

To convert it into a multiple regression we create a new computed variable:

x^2 becomes a column x2
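
A minimal sketch of the same idea in Python (assumed example data, statsmodels used for the fit): build a second column x2 = x^2 and run an ordinary multiple regression on [x, x2].

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 30)
y = 3 + 2 * x + 0.5 * x**2 + rng.normal(0, 1, x.size)   # made-up curvy data

X = sm.add_constant(np.column_stack([x, x**2]))   # columns: constant, x, x2 (= x^2)
fit = sm.OLS(y, X).fit()
print(fit.params)                                 # roughly b0, b1, b2
print(fit.rsquared)
```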

SPSS

Analyse
Regression
Linear
Variable selection is the same as for linear regression.

MULTIPLE REGRESSION: y = b0 + b1*x1 + b2*x2 + b3*x3

We can use any of the variable-entry methods.

We have 5 methods:
1. Enter - considers all variables at once
2. Stepwise - enters one variable, checks whether it is significant, then keeps or rejects it (the variable with the higher correlation is entered first)
3. Forward - like stepwise; variables are added one after another in order of contribution
4. Backward - starts with all the variables and removes the weakest (least significant) one at a time
5. Remove

R^2 and adjusted R^2 should be close to each other (the gap must be small), and Durbin-Watson should be close to 2.
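
A minimal sketch of the "Enter" approach with assumed example data: fit all predictors at once with statsmodels, then read off R^2, adjusted R^2 and the Durbin-Watson statistic.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
x1, x2, x3 = rng.normal(size=(3, 30))             # assumed example predictors
y = 1 + 2 * x1 - 1.5 * x2 + 0.5 * x3 + rng.normal(size=30)

X = sm.add_constant(np.column_stack([x1, x2, x3]))
fit = sm.OLS(y, X).fit()                          # "Enter": all variables at once

print(fit.rsquared, fit.rsquared_adj)             # R^2 vs adjusted R^2 (gap should be small)
print(durbin_watson(fit.resid))                   # should be close to 2
```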

Using an interaction term:

x3 = x1 * x2

SPSS

Transform
Compute Variable
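
A short sketch of the interaction step with assumed data: compute x3 = x1 * x2 (the Compute Variable step) and include it as an extra predictor.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x1 = rng.normal(size=40)                          # assumed example data
x2 = rng.normal(size=40)
x3 = x1 * x2                                      # the interaction term (Compute Variable step)
y = 1 + 2 * x1 + 3 * x2 + 1.5 * x3 + rng.normal(size=40)

X = sm.add_constant(np.column_stack([x1, x2, x3]))
print(sm.OLS(y, X).fit().params)                  # roughly b0, b1, b2, b3
```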

QUALITATIVE DATA

Code the qualitative data into quantitative data using a dummy variable (0/1)
(regression always needs scaled data)
Transform - Recode into Different Variables
Then run multiple regression
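
A small sketch of the dummy-coding step in pandas with made-up data; SPSS's Recode into Different Variables does the same job.

```python
import pandas as pd

df = pd.DataFrame({"gender": ["M", "F", "F", "M"],   # assumed qualitative column
                   "score": [70, 82, 75, 68]})

df["gender_F"] = (df["gender"] == "F").astype(int)   # dummy variable: 1 if F, else 0
print(df)
```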
ANOVA
Tests whether the data depend on the factor (treatment) or not.

1. Completely randomized design (a single treatment factor: each column is one treatment and all the data in a column belong to that treatment) - SST, SSTR, SSE
2. Randomized block design (adds a blocking factor: each row is a block) - the above + SSBL
3. Factorial design (every factor can interact with every other) - the above + SSI

STEPS
SST = sum of (y - ybar)^2
SSTR = sum over treatments of n_j * (treatment average - grand average)^2
SSE = SST - SSTR
MS = SS / DoF (MSTR = SSTR/(k-1), MSE = SSE/(n-k))
F = MSTR / MSE
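
A minimal sketch of these steps for a completely randomized design with three made-up treatment groups; scipy's f_oneway is used only to cross-check the hand computation.

```python
import numpy as np
from scipy import stats

# three made-up treatment groups (columns of a CRD table)
groups = [np.array([5.0, 6.0, 7.0]),
          np.array([8.0, 9.0, 10.0]),
          np.array([4.0, 5.0, 6.0])]

all_y = np.concatenate(groups)
grand_mean = all_y.mean()

sst = np.sum((all_y - grand_mean) ** 2)
sstr = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
sse = sst - sstr

k, n = len(groups), all_y.size
mstr = sstr / (k - 1)                 # MS for treatments
mse = sse / (n - k)                   # MS for error
f_calc = mstr / mse

print(f_calc, stats.f.sf(f_calc, k - 1, n - k))   # F and its p-value
print(stats.f_oneway(*groups))                    # cross-check with scipy
```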

Interaction (block / factorial designs)

Same steps as above, plus SSBL and SSI
