I am curious to find the braking distance for a car on a road.
In attempting to find this out, I found that the braking distance for a car on a flat road is $$ d = \frac{v^2}{2\mu g} $$ where $v$ is the initial speed, $\mu$ is the coefficient of road friction (CRF) between the tires and the road, $g$ is the acceleration due to gravity, and $d$ is the braking distance. However, this book mentioned that $\mu$ varies with velocity. Right now, I'm having trouble finding said coefficient of friction.
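For concreteness, here is a worked example, assuming a typical wet-asphalt value of $\mu \approx 0.4$ and an initial speed of $v = 22.2\ \text{m/s}$ (about 80 km/h) — both numbers are my own assumptions, not from the book:

$$ d = \frac{v^2}{2\mu g} = \frac{(22.2\ \text{m/s})^2}{2 \cdot 0.4 \cdot 9.81\ \text{m/s}^2} \approx 63\ \text{m} $$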
This transportation engineering site gave several examples of CRFs at various speeds on a rainy road, but I'd like to know whether those values had to be measured empirically or whether they can be derived somehow.
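For example, if such a site reported that a car braking from $v = 30\ \text{m/s}$ stops in about $d = 115\ \text{m}$ (illustrative numbers I'm making up, not taken from the site), the braking-distance formula can be inverted to recover the implied average coefficient:

$$ \mu = \frac{v^2}{2gd} = \frac{(30\ \text{m/s})^2}{2 \cdot 9.81\ \text{m/s}^2 \cdot 115\ \text{m}} \approx 0.40 $$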
(EDIT: Of course $\mu$ varies with $v$; there wouldn't be a table of values at different speeds if it didn't.)
In short, my question is this: Is there a general method of calculating $\mu$ for a car on a flat (0% grade), wet, asphalt-covered road? Or must $\mu$ be determined from empirical data?
(Note: this isn't really a homework question; if it were, I'd just guesstimate the answer from the data in the second link. I just want to understand the data better, if possible, rather than simply having the values.)