Title: | Better Statistics for OLS and Binomial Logistic Regression |
---|---|
Description: | Provides squared semi-partial correlations, tolerance, Mahalanobis distance, Likelihood Ratio Chi-Square, and pseudo R-square for OLS and binomial logistic regression. Aberson, C. L. (2022) <doi:10.31234/osf.io/s2yqn>. |
Authors: | Chris Aberson |
Maintainer: | Chris Aberson <[email protected]> |
License: | GNU General Public License version 3 |
Version: | 0.3.0 |
Built: | 2024-11-17 03:54:47 UTC |
Source: | https://github.com/chrisaberson/betterreg |
Compare Dependent Coefficients in Multiple Regression with up to Five Predictors. Requires a loaded data file, the dependent and predictor variable names, and the number of predictors.
depbcomp( data = NULL, y = NULL, x1 = NULL, x2 = NULL, x3 = NULL, x4 = NULL, x5 = NULL, numpred = NULL, comps = "abs" )
data | name of data file
y | dependent variable name
x1 | first predictor variable name
x2 | second predictor variable name
x3 | third predictor variable name
x4 | fourth predictor variable name
x5 | fifth predictor variable name
numpred | number of predictors
comps | Type of comparison, "abs" for absolute values or "raw" for raw coefficients
Comparing Dependent Coefficients in Multiple Regression
depbcomp(data=testreg,y="y",x1="x1",x2="x2",x3="x3",x4="x4",x5="x5", numpred=5,comps="abs")
Comparing Independent Coefficients in Multiple Regression
indbcomp(model1 = NULL, model2 = NULL, comps = "abs", pred = NULL)
model1 | summary of first model (see example for how to summarize)
model2 | summary of second model (see example for how to summarize)
comps | Type of comparison, "abs" for absolute values or "raw" for raw coefficients
pred | number of predictors
Comparing Independent Coefficients in Multiple Regression
y_1 <- rnorm(200); x1_1 <- rnorm(200); x2_1 <- rnorm(200)
y_2 <- rnorm(200); x1_2 <- rnorm(200); x2_2 <- rnorm(200)
df1 <- as.data.frame(cbind(y_1, x1_1, x2_1))
df2 <- as.data.frame(cbind(y_2, x1_2, x2_2))
model1_2 <- summary(lm(y_1 ~ x1_1 + x2_1, data = df1))
model2_2 <- summary(lm(y_2 ~ x1_2 + x2_2, data = df2))
indbcomp(model1 = model1_2, model2 = model2_2, comps = "abs", pred = 2)
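The underlying test is a standard large-sample comparison of coefficients from independent samples: a z statistic on the difference between the two estimates, scaled by the pooled standard errors. A minimal base-R sketch (simulated data; this illustrates the standard formula, not necessarily indbcomp's exact internals):

```r
# Sketch: z test for coefficients estimated in two independent samples,
# z = (b1 - b2) / sqrt(se1^2 + se2^2). Data are simulated for illustration.
set.seed(1)
d1 <- data.frame(x = rnorm(200)); d1$y <- 0.5 * d1$x + rnorm(200)
d2 <- data.frame(x = rnorm(200)); d2$y <- 0.2 * d2$x + rnorm(200)
c1 <- summary(lm(y ~ x, data = d1))$coefficients["x", ]
c2 <- summary(lm(y ~ x, data = d2))$coefficients["x", ]
z  <- (c1["Estimate"] - c2["Estimate"]) /
      sqrt(c1["Std. Error"]^2 + c2["Std. Error"]^2)
p  <- 2 * pnorm(-abs(z))   # two-tailed p value
```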
Compute Likelihood Ratio Chi-square for Binomial Logistic Regression with up to 10 predictors
LRchi( data = NULL, y = NULL, x1 = NULL, x2 = NULL, x3 = NULL, x4 = NULL, x5 = NULL, x6 = NULL, x7 = NULL, x8 = NULL, x9 = NULL, x10 = NULL, numpred = NULL )
data | name of data file
y | dependent variable name
x1 | first predictor variable name
x2 | second predictor variable name
x3 | third predictor variable name
x4 | fourth predictor variable name
x5 | fifth predictor variable name
x6 | sixth predictor variable name
x7 | seventh predictor variable name
x8 | eighth predictor variable name
x9 | ninth predictor variable name
x10 | tenth predictor variable name
numpred | number of predictors
LRchi(data=testlog, y="dv", x1="iv1", x2="iv2",numpred=2)
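The likelihood-ratio chi-square is the drop in deviance from the intercept-only model to the fitted model, with degrees of freedom equal to the number of predictors. A base-R sketch of that calculation (simulated data stand in for the package's testlog):

```r
# Sketch: LR chi-square = null deviance - residual deviance of a glm fit.
set.seed(1)
iv1 <- rnorm(100); iv2 <- rnorm(100)
dv  <- rbinom(100, 1, plogis(0.8 * iv1 - 0.5 * iv2))
fit <- glm(dv ~ iv1 + iv2, family = binomial())
LR  <- fit$null.deviance - fit$deviance   # chi-square statistic
df  <- fit$df.null - fit$df.residual      # degrees of freedom (2 here)
p   <- pchisq(LR, df, lower.tail = FALSE)
```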
Compute Mahalanobis Distance for Multiple Regression
Mahal(model = NULL, pred = NULL, values = 5)
model | name of model
pred | number of predictors
values | number of Mahalanobis distance values to print (highest values); default is 5
Mahalanobis distance to detect multivariate outliers
mymodel <- lm(y ~ x1 + x2 + x3 + x4, data = testreg)
Mahal(model = mymodel, pred = 4, values = 10)
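Mahalanobis distance measures how far each case sits from the centroid of the predictors, accounting for their covariance; large values flag multivariate outliers. A base-R sketch using stats::mahalanobis() (simulated predictors stand in for testreg):

```r
# Sketch: squared Mahalanobis distances for each case, with a chi-square
# cutoff at df = number of predictors for flagging outliers.
set.seed(1)
X <- as.data.frame(matrix(rnorm(200 * 4), ncol = 4,
                          dimnames = list(NULL, paste0("x", 1:4))))
d2 <- mahalanobis(X, colMeans(X), cov(X))
cutoff <- qchisq(0.999, df = 4)         # common .001 criterion
head(sort(d2, decreasing = TRUE), 10)   # the ten largest distances
```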
Compute squared semi-partial correlations for Multiple Regression
parts(model = NULL, pred = NULL)
model | name of model
pred | number of predictors
Squared semi-partial correlations for multiple regression with up to 10 predictors
mymodel <- lm(y ~ x1 + x2 + x3 + x4 + x5, data = testreg)
parts(model = mymodel, pred = 5)
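A predictor's squared semi-partial correlation is the unique portion of variance it explains: the drop in R-squared when that predictor is removed from the full model. A sketch of that definition (simulated data; this shows the standard formula, not necessarily parts()'s exact internals):

```r
# Sketch: sr2 for x1 = R-squared(full model) - R-squared(model without x1).
set.seed(1)
dat <- data.frame(x1 = rnorm(200), x2 = rnorm(200))
dat$y <- 0.5 * dat$x1 + 0.3 * dat$x2 + rnorm(200)
r2_full    <- summary(lm(y ~ x1 + x2, data = dat))$r.squared
r2_reduced <- summary(lm(y ~ x2, data = dat))$r.squared
sr2_x1     <- r2_full - r2_reduced   # squared semi-partial for x1
```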
Pseudo R-square Values for Binomial Logistic Regression
pseudo(model = NULL)
model | name of model
Pseudo R-square Values for Logistic Regression
mymodel <- glm(dv ~ iv1 + iv2 + iv3 + iv4, data = testlog, family = binomial())
pseudo(model = mymodel)
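The commonly reported pseudo R-square values are all functions of the null and residual deviances. A sketch of the standard McFadden, Cox-Snell, and Nagelkerke formulas (simulated data stand in for testlog; pseudo() may report additional indices):

```r
# Sketch: pseudo R-squares from glm deviances.
set.seed(1)
iv1 <- rnorm(150); dv <- rbinom(150, 1, plogis(iv1))
fit <- glm(dv ~ iv1, family = binomial())
n   <- length(dv)
mcfadden   <- 1 - fit$deviance / fit$null.deviance
coxsnell   <- 1 - exp((fit$deviance - fit$null.deviance) / n)
nagelkerke <- coxsnell / (1 - exp(-fit$null.deviance / n))
```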
R-square change for Hierarchical Multiple Regression
R2change(model1 = NULL, model2 = NULL)
model1 | first regression model
model2 | second regression model
mymodel1 <- lm(y ~ x1 + x2, data = testreg)
mymodel2 <- lm(y ~ x1 + x2 + x3 + x4, data = testreg)
R2change(model1 = mymodel1, model2 = mymodel2)
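For nested OLS models, the R-squared change is the difference in the two models' R-squared values, and its significance test is the same F test that base R's anova() performs on the pair of fits. A sketch (simulated data stand in for testreg):

```r
# Sketch: R-squared change between nested models; anova() gives the F test.
set.seed(1)
dat <- data.frame(x1 = rnorm(300), x2 = rnorm(300), x3 = rnorm(300))
dat$y <- 0.4 * dat$x1 + 0.3 * dat$x3 + rnorm(300)
m1 <- lm(y ~ x1 + x2, data = dat)       # step 1
m2 <- lm(y ~ x1 + x2 + x3, data = dat)  # step 2 adds x3
r2change <- summary(m2)$r.squared - summary(m1)$r.squared
print(anova(m1, m2))                    # F test of the change
```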
A dataset to test logistic regression functions
testlog
A data frame with 164 rows and 11 variables:
DV
1st predictor
2nd predictor
3rd predictor
4th predictor
5th predictor
6th predictor
7th predictor
8th predictor
9th predictor
10th predictor
A dataset to test regression functions
testreg
A data frame with 1000 rows and 6 variables:
DV
1st predictor
2nd predictor
3rd predictor
4th predictor
5th predictor
Compute tolerance for Multiple Regression
tolerance(model = NULL)
model | name of model
Tolerance for multiple regression
mymodel <- lm(y ~ x1 + x2 + x3 + x4 + x5, data = testreg)
tolerance(model = mymodel)
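A predictor's tolerance is 1 minus the R-squared from regressing that predictor on all the others (the reciprocal of its VIF); values near 0 signal problematic collinearity. A sketch of the standard definition for a single predictor (simulated data; not necessarily tolerance()'s exact internals):

```r
# Sketch: tolerance for x1 = 1 - R-squared of lm(x1 ~ other predictors).
set.seed(1)
dat <- data.frame(x2 = rnorm(250), x3 = rnorm(250))
dat$x1 <- 0.6 * dat$x2 + rnorm(250)   # x1 deliberately correlated with x2
tol_x1 <- 1 - summary(lm(x1 ~ x2 + x3, data = dat))$r.squared
```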