http://stats.stackexchange.com/questions/30486/when-does-lasso-select-correlated-predictors
http://stats.stackexchange.com/questions/86269/what-is-the-effect-of-having-correlated-predictors-in-a-multiple-regression-mode
In the second post, the top answer demonstrates by simulation that with a sample size of 1000, ordinary least squares can still recover the correct relationship even when the two predictors are highly correlated (correlation coefficient = 0.95).
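A minimal simulation sketch of this point (the true coefficients, seed, and noise level here are my own assumed values, not taken from the post): with n = 1000 and corr(x1, x2) ≈ 0.95, OLS still recovers coefficients close to the truth.

```python
import numpy as np

# Assumed setup: two predictors with correlation ~0.95,
# true model y = 2*x1 - 1*x2 + noise, n = 1000 observations.
rng = np.random.default_rng(0)
n = 1000
x1 = rng.standard_normal(n)
x2 = 0.95 * x1 + np.sqrt(1 - 0.95**2) * rng.standard_normal(n)  # corr(x1, x2) ~ 0.95
y = 2.0 * x1 - 1.0 * x2 + rng.standard_normal(n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 2))  # estimates should be close to [0, 2, -1]
```

Collinearity inflates the standard errors of the estimates (here by a variance inflation factor of roughly 1/(1 - 0.95²) ≈ 10), but with 1000 observations the estimates remain close to the true values.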
It is also a good idea to include correlated variables in the same model if they affect the response variable through different mechanisms (i.e., one effect is not mediated by the other). Including a correlated variable may even change the sign of a regression coefficient. As shown in the post http://stats.stackexchange.com/questions/78828/is-there-a-difference-between-controlling-for-and-ignoring-other-variables-i, including the variable X2, which is negatively correlated with X1, flips the sign of the estimated relationship between Y and X1.
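A hypothetical illustration of this sign flip (the coefficients and correlation strength are assumptions for the sketch, not the numbers from the linked post): y depends positively on x1 (true coefficient +1), but because x2 is strongly negatively correlated with x1 and has a large positive effect on y, the simple regression of y on x1 alone yields a negative slope.

```python
import numpy as np

# Assumed model: y = 1*x1 + 2*x2 + noise, with corr(x1, x2) strongly negative.
rng = np.random.default_rng(1)
n = 5000
x1 = rng.standard_normal(n)
x2 = -0.9 * x1 + 0.4 * rng.standard_normal(n)   # x2 negatively correlated with x1
y = 1.0 * x1 + 2.0 * x2 + 0.5 * rng.standard_normal(n)

# Ignoring x2: the slope on x1 absorbs x2's effect through their correlation,
# approximately 1 + 2*(-0.9) = -0.8, i.e. the wrong sign.
slope_ignoring = np.polyfit(x1, y, 1)[0]

# Controlling for x2: the coefficient on x1 recovers its true sign (+1).
X = np.column_stack([np.ones(n), x1, x2])
slope_controlling = np.linalg.lstsq(X, y, rcond=None)[0][1]
print(round(slope_ignoring, 2), round(slope_controlling, 2))
```

This is the same "controlling for vs. ignoring" distinction the post discusses: the marginal and partial coefficients of x1 answer different questions and can disagree even in sign.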
In summary: (1) if the dataset is large (~1000 observations), collinearity may not be a problem in linear regression; (2) include all relevant variables (chosen based on domain knowledge) in the model to reveal the true relationship between the predictors and the response; (3) a statistical correlation between two variables does not by itself reveal the mechanistic link, which must be established from other scientific knowledge.