A problem in regression analysis is multicollinearity, which is to say moderate or high correlations among the independent variables. [Multicollinearity and Micronumerosity, Bryan Caplan | EconLog | Library of Economics and Liberty]
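For readers new to the term, here is a minimal Python sketch (with made-up data, not drawn from the quoted source) of what such correlations among independent variables look like when inspected directly:

```python
import numpy as np
import pandas as pd

# Made-up predictors: x2 is constructed to track x1 closely, x3 is independent.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=200)   # nearly a copy of x1
x3 = rng.normal(size=200)

X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

# Pairwise correlations among the independent variables; the value near 1
# between x1 and x2 is the "moderate or high correlation" described above.
print(X.corr().round(2))
```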
RE: your comments on my use of the term multicollinearity, see my Monday blog 9/25/05 for a "Little Primer on multicollinearity" at: http://freedomspeace.blogspot.com/ This is not to say you should read such a primer, but simply to bring it to your attention. [Multicollinearity and Micronumerosity, Bryan Caplan | EconLog | Library of Economics and Liberty]
And what about the common trend in CO2 and Solar, no multicollinearity problems? (jae) [IPCC and Solar Correlations « Climate Audit]
Evidence of multicollinearity is consistent with that hypothesis and certainly not a “problem” for it. [Groveman and Landsberg « Climate Audit]
In short, whether or not multicollinearity is a problem depends on what hypothesis you are trying to test. [Groveman and Landsberg « Climate Audit]
In my experience, interaction variables are kitchen-sink-type regressors that induce severe multicollinearity and give spurious results. [I'm Shocked,]
Goldberger's main point: people who use statistics often talk as if multicollinearity (high correlations between independent variables) biases results. [Multicollinearity and Micronumerosity, Bryan Caplan | EconLog | Library of Economics and Liberty]
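Goldberger's point can be checked with a small simulation; the sketch below uses invented coefficients and sample sizes and assumes only NumPy. Collinear regressors leave the ordinary-least-squares estimates unbiased, they only widen their spread:

```python
import numpy as np

# Toy simulation: repeatedly draw collinear data, fit OLS, and compare the
# average of the estimates with the true coefficients. All numbers invented.
rng = np.random.default_rng(1)
true_beta = np.array([1.0, 2.0, -1.0])          # intercept, x1, x2
n, reps = 100, 2000
draws = []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)  # highly collinear with x1
    X = np.column_stack([np.ones(n), x1, x2])
    y = X @ true_beta + rng.normal(size=n)
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    draws.append(beta_hat)

draws = np.array(draws)
print("mean estimate:", draws.mean(axis=0).round(2))  # ~ true_beta (no bias)
print("spread (std): ", draws.std(axis=0).round(2))   # inflated for x1 and x2
```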
But if we had to use historical data instead, we would solve the problem of multicollinearity by using factor analysis or partial least squares, both of which combine the data into fewer, but independent, predictors. [Multicollinearity and Micronumerosity, Bryan Caplan | EconLog | Library of Economics and Liberty]
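A hedged sketch of that remedy, assuming scikit-learn is available; the data and component counts are illustrative only, not taken from the quoted source:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

# Invented data: two highly correlated predictors plus one independent one.
rng = np.random.default_rng(2)
x1 = rng.normal(size=150)
X = np.column_stack([x1,
                     x1 + 0.1 * rng.normal(size=150),
                     rng.normal(size=150)])
y = X @ np.array([1.0, 1.0, 0.5]) + rng.normal(size=150)

# Factor-analytic route: PCA rewrites the correlated columns as orthogonal components.
components = PCA(n_components=2).fit_transform(X)
print(np.corrcoef(components, rowvar=False).round(2))  # off-diagonals ~ 0

# Partial least squares builds components that are also predictive of y.
pls = PLSRegression(n_components=2).fit(X, y)
print(round(pls.score(X, y), 3))                       # R^2 of the reduced fit
```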
In theory, PLS is applied in situations of multicollinearity, but the MBH network has many series that are essentially white noise; thus the proxies are surprisingly close to orthogonal in the early networks, and there are blocks of orthogonal series in the later network. [More on Bürger et al 2006 « Climate Audit]
In the context of an inverse regression, you have to think long and hard about whether a procedure for regression of effect upon causes (tree ring ~ temperature + precipitation), where you want orthogonality, can be transmogrified into an inverse regression of cause upon effect in the style of dendroclimatologists (temperature ~ bristlecones + Gaspé + …), where you actually want multicollinearity (i.e. a signal). [More on "Naturally Orthogonal" « Climate Audit]
Given the multicollinearity problems of my model, here is what I think I know. [The Aleph Blog]
Using three different indicators of the same type - momentum, for example - results in the multiple counting of the same information, a condition known in statistics as multicollinearity. [Using Technical Indicators To Develop Trading Strategies - Yahoo! Finance]
Our data met assumptions of no outliers, homogeneity within the variance-covariance matrices (values were all within a factor of ten of each other), and the absence of multicollinearity of explanatory variables. [PLoS ONE Alerts: New Articles]
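A common way to check that last assumption is the variance inflation factor (VIF); the sketch below assumes statsmodels and uses invented variable names, not the data behind the quoted study:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Invented explanatory variables; a common rule of thumb flags VIF above ~5-10.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "elevation": rng.normal(size=120),
    "rainfall": rng.normal(size=120),
})
df["temperature"] = -0.8 * df["elevation"] + 0.2 * rng.normal(size=120)

exog = np.column_stack([np.ones(len(df)), df.to_numpy()])  # intercept first
for i, name in enumerate(df.columns, start=1):
    print(name, round(variance_inflation_factor(exog, i), 2))
```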
Regarding the analysis of the cost of regulations: rather than suggesting that regression methodology should not be used at all, our critics should recognize that the methodology comes with inherent issues of multicollinearity that can result in sign reversal. [The Earth Times Online Newspaper]
Peter Kennedy's "Guide to Econometrics" is also very good on this subject, pointing out that, with a small number of exceptions (he doesn't use the example, but John Lott's work on the 2000 Florida elections, where he more or less intentionally constructed a model so as to be collinear, would be one of the exceptions), multicollinearity is a feature of the data, not of the model, and to claim that "the data are biased" brings into sharp relief how silly most critiques of multicollinearity are. [Multicollinearity and Micronumerosity, Bryan Caplan | EconLog | Library of Economics and Liberty]
Wilson and Luckman argue that this orthogonality is a good thing in that it avoids multicollinearity; I’m inclined to say that it’s a bad thing if you’re trying to extract a “signal”. [More North American Upper Treeline: Wilson-Luckman 2002, 2003 « Climate Audit]
Everyone in the regression world is so used to viewing multicollinearity as their enemy and something to be feared that it’s easy to lose sight of the fact that multicollinearity is exactly what you want when you have a network of pseudoproxies and orthogonality is your enemy. [Groveman and Landsberg « Climate Audit]
Here's what you need to do with your aborted regression: (1) control for state effects by using panel data or dummy variables for 49 of the states (not 50, so as to avoid perfect multicollinearity); (2) control for the dual-causality problem by using an instrumental variable to estimate the beer tax; (3) add per-capita income as an (exogenous) regressor (it's basic econometric theory that consumption of a good is a function of its price AND income; not regressing against income leads to substantial omitted-variable bias); and (4) run your regressions using the natural logs of the variables (this better models real life, as you probably know). [A Failed Experiment « PubliCola]
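A rough sketch of steps (1), (3), and (4), assuming statsmodels and a made-up data frame with hypothetical column names; the instrumental-variable step (2) would require a two-stage estimator and is not shown:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented panel-style data; every column name here is hypothetical.
rng = np.random.default_rng(4)
n = 300
df = pd.DataFrame({
    "state": rng.choice(["WA", "OR", "CA", "ID"], size=n),
    "beer_tax": rng.uniform(0.1, 1.0, size=n),
    "income": rng.uniform(20_000, 80_000, size=n),
})
df["consumption"] = np.exp(1.0
                           - 0.3 * np.log(df["beer_tax"])
                           + 0.5 * np.log(df["income"])
                           + rng.normal(scale=0.1, size=n))

# C(state) expands into state dummies and drops one level automatically, which
# is what avoids perfect multicollinearity with the intercept; the log
# transforms give the elasticity-style specification suggested in step (4).
fit = smf.ols("np.log(consumption) ~ np.log(beer_tax) + np.log(income) + C(state)",
              data=df).fit()
print(fit.params.round(3))
```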