Your question is a bit hard to understand, but I'll do my best. Sometimes taking logarithms of your data will improve a linear fit. If you have two sets of data, X and Y, and they don't seem to fit a linear relationship, a log transform may produce one. Example: suppose your data correctly fits the power-law model y = a*X^m. Taking logs of both sides gives log(y) = log(a) + m*log(x). So plotting Y* against X*, where Y* is the log of Y and X* is the log of X, and performing a linear regression, you obtain a slope and intercept. Your intercept is log(a). If you are using log base 10, then a (in the model) = 10^(intercept value) and m is the slope of the log-log line.
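A minimal sketch of that log-log fit in Python, using made-up data generated from a known power law so the recovered parameters can be checked:

```python
import numpy as np

# Hypothetical data from the power law y = a * x**m,
# with a = 2.0 and m = 1.5 chosen just for illustration.
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = 2.0 * x ** 1.5

# Fit a straight line to the log-transformed data:
# log10(y) = m * log10(x) + log10(a)
slope, intercept = np.polyfit(np.log10(x), np.log10(y), 1)

m = slope            # exponent of the power law
a = 10 ** intercept  # coefficient, recovered from the intercept

print(m, a)  # recovers m = 1.5, a = 2.0 (up to rounding)
```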
True.
Yes.
True, linear regression is useful for modeling the position of an object in free fall: for an object dropped from rest, s = (1/2)g*t^2, so regressing position against t^2 (rather than t) gives a straight line whose slope estimates g/2.
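A short sketch of that regression, with noiseless made-up free-fall data so the fitted slope can be checked against g/2:

```python
import numpy as np

# Free-fall position data for an object dropped from rest: s = 0.5 * g * t**2.
g = 9.81
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
s = 0.5 * g * t ** 2

# The model is linear in t**2, so regress s against t**2;
# the fitted slope estimates g/2.
slope, intercept = np.polyfit(t ** 2, s, 1)
print(2 * slope)  # recovers g = 9.81 (up to rounding)
```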
The value depends on the slope of the line.
It could be any value.
One of the main reasons for doing so is to check that the assumption that the errors are independent and identically distributed is true. If that is not the case, then simple linear regression is not an appropriate model.
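A minimal sketch of such a check in Python, with hypothetical data made non-linear on purpose so the residuals reveal the violated assumption:

```python
import numpy as np

# Fit a straight line, then inspect the residuals. The data here is
# deliberately quadratic, so the line is a poor model.
x = np.arange(10, dtype=float)
y = x ** 2  # the true relationship is non-linear

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# With i.i.d. errors, residuals should look like random scatter around 0.
# Here they trace a U-shape, so adjacent residuals are positively
# correlated: evidence that simple linear regression is inappropriate.
r = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]
print(r)  # strongly positive serial correlation
```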
The solution of a system of two linear equations in two variables is a pair of values that makes both equations true.
It means that there is no set of values for the variables such that all the linear equations are simultaneously true.
There are several possible explanations. Leaving aside the two most obvious reasons, calculation error and attempted extrapolation, there are the following possibilities: the true relationship is non-linear; a relevant variable has been omitted; the observations are very variable, leading to a very large residual error; or there is not enough variation in the independent (or predictor) variable, so that Sxx is very small.
You are trying to find a set of values such that, if those values are substituted for the variables, every equation in the system is true.
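A small sketch of this idea in Python, using a made-up two-equation system solved with numpy:

```python
import numpy as np

# Solve the system:
#   2x + 3y = 8
#    x -  y = -1
# A solution is a set of values that makes every equation true at once.
A = np.array([[2.0, 3.0],
              [1.0, -1.0]])
b = np.array([8.0, -1.0])

x, y = np.linalg.solve(A, b)
print(x, y)  # x = 1.0, y = 2.0

# Substituting the values back confirms both equations hold.
assert np.allclose(A @ np.array([x, y]), b)
```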
A solution.
Solution set