
Inference for Regression

by David Spade, PhD

    Learning Material

    • PDF: Foliensatz 10 Statstics II David Spade.pdf
    • PDF: Lecture Overview

    About the Lecture

    The lecture Inference for Regression by David Spade, PhD, is from the course Statistics Part 2. It contains the following chapters:

    • Inference for Regression
    • Example: Body Fat
    • Pitfalls to Avoid

    Included Quiz Questions

    1. The residuals can be viewed as estimates of the mean value of the response variable for each value of the explanatory variable.
    2. The response variable is assumed to have a normal distribution for each value of the explanatory variable.
    3. By performing linear regression, we are estimating the mean value of the response variable for each value of the explanatory variable.
    4. The fitted values of the response variable are used as estimates of the mean value of the response for each value of the explanatory variable.

    1. The test statistic follows a t-distribution with n−2 degrees of freedom.
    2. The test statistic follows a normal distribution with mean 0 and variance 1.
    3. The test statistic follows a t-distribution with n degrees of freedom.
    4. The test statistic follows a t-distribution with n−1 degrees of freedom.

    1. The scatterplot of the response variable against the explanatory variable shows random scatter about 0.
    2. The residuals are nearly normal.
    3. The plot does not thicken.
    4. The observations are independent of each other.

    1. We quantify the spread around the regression line by using the standard error of the slope.
    2. We quantify the spread around the regression line by using the standard error of the intercept term.
    3. We quantify the spread around the regression line by using the mean value of the explanatory variable.
    4. We quantify the spread around the regression line by using only the sample standard deviation of the residuals.

    1. Fitting a linear regression to data that are not linear will not have a negative effect on the inference procedures for a slope.
    2. Being careful of thickening plots will not have a negative effect on the inference procedures for a slope.
    3. Making sure that the residuals are nearly normal will not have a negative effect on the inference procedures for a slope.
    4. Being careful of outliers and influential points will not have a negative effect on the inference procedures for a slope.
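
    Several of the quiz items above concern the t-test for the slope: the test statistic is the estimated slope divided by its standard error and is compared with a t-distribution on n − 2 degrees of freedom, and the spread around the regression line is summarized by the residual standard deviation and the standard error of the slope. The sketch below is not part of the lecture; it uses made-up numbers purely to illustrate how those quantities fit together.

```python
# Minimal sketch (not from the lecture): a slope t-test for simple linear
# regression, using made-up numbers chosen only for illustration.
import numpy as np
from scipy import stats

x = np.array([34.0, 36.5, 38.0, 40.0, 42.5, 44.0, 46.5, 48.0])  # hypothetical waist sizes
y = np.array([12.1, 14.8, 16.0, 19.5, 21.2, 24.9, 26.3, 29.8])  # hypothetical % body fat

n = len(x)
b1, b0 = np.polyfit(x, y, 1)      # least-squares slope and intercept
fitted = b0 + b1 * x              # estimates of the mean response at each x
residuals = y - fitted

# Residual standard deviation, computed with n - 2 degrees of freedom
s_e = np.sqrt(np.sum(residuals**2) / (n - 2))

# Standard error of the slope: how much the slope would vary from sample to sample
se_b1 = s_e / np.sqrt(np.sum((x - x.mean()) ** 2))

# t-statistic for H0: slope = 0, compared with a t-distribution on n - 2 df
t_stat = b1 / se_b1
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)

print(f"slope = {b1:.3f}, SE(slope) = {se_b1:.3f}")
print(f"t = {t_stat:.2f} on {n - 2} df, two-sided p = {p_value:.4f}")
```

    The same quantities appear in the coefficient table of a standard regression routine (for example, statsmodels' OLS summary reports the slope, its standard error, the t-statistic, and the p-value directly).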

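    The remaining items list the conditions to check before trusting the slope test: residuals scattered about 0 with no pattern, nearly normal residuals, no thickening of the residual plot, and independent observations (independence is usually judged from how the data were collected rather than from a plot). These checks are normally done graphically with residual plots; the sketch below, again on hypothetical data, uses rough numeric stand-ins for those plots and does not come from the lecture itself.

```python
# Minimal sketch (not from the lecture): informal, numeric stand-ins for the
# usual graphical checks of the regression conditions, on hypothetical data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = np.linspace(30, 50, 40)
y = 1.2 * x - 25 + rng.normal(scale=2.0, size=x.size)  # simulated, roughly linear data

b1, b0 = np.polyfit(x, y, 1)
fitted = b0 + b1 * x
residuals = y - fitted

# 1. Random scatter about 0: correlate the residuals with a centered quadratic
#    term; a value far from 0 suggests leftover curvature in the residual plot.
curvature = np.corrcoef(residuals, (x - x.mean()) ** 2)[0, 1]

# 2. Nearly normal residuals: Shapiro-Wilk as a rough stand-in for a normal
#    quantile plot of the residuals.
_, shapiro_p = stats.shapiro(residuals)

# 3. "The plot does not thicken": compare residual spread for the lower and
#    upper halves of the fitted values.
order = np.argsort(fitted)
half = x.size // 2
lower, upper = residuals[order[:half]], residuals[order[half:]]
spread_ratio = upper.std(ddof=1) / lower.std(ddof=1)

print(f"curvature check (corr with quadratic term): {curvature:.3f}")
print(f"Shapiro-Wilk p-value for residual normality: {shapiro_p:.3f}")
print(f"spread ratio, upper vs. lower fitted values: {spread_ratio:.2f}")
```
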
    Author of lecture Inference for Regression

    David Spade, PhD

    Customer reviews

    5.0 of 5 stars (1 rating)