
This question has been asked before...

(Does the force of kinetic friction increase with the relative speed of the objects involved? If not, why not?),

...but there's a very specific part I'm struggling with, and it's not addressed in that post.

Part of what causes friction is the breaking of microscopic ridges in the materials sliding on each other, protrusions that catch and need to be dislodged.

[Figure: microscopic ridges on the two sliding surfaces catching on each other]

Let's say it takes a certain amount of force to break each of these ridges.

If the two surfaces are moving faster relative to each other, they would need to dislodge more of these protrusions in the same amount of time. It seems to me, then, that kinetic friction (at least between rough surfaces) should be velocity dependent. However, the standard explanations say it isn't.

Why?

Thank you!

joshuaronis
  • The original reason I asked this question was that a problem in my physics book (Halliday, Resnick, Krane) asked us to find the acceleration of a slipping wheel, and the intended realization was that if the wheel is slipping, its current angular velocity doesn't matter; all that matters is the normal force and the coefficient of kinetic friction. – joshuaronis Dec 28 '18 at 20:49

1 Answer


The instantaneous force depends only on how many ridges are currently being broken or deformed. If the number of ridges that are "caught" at any given moment doesn't depend on velocity, then you wouldn't expect the required force to either. If you go faster, you're through one bump and on to the next in less time, but that doesn't change the force required to get through each bump. The fact that you're breaking through more bumps per second does mean you dissipate linearly more power--breaking more bumps in the same amount of time takes proportionally more energy--but that is exactly what a constant coefficient of kinetic friction predicts: power is force times velocity.
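To make the scaling concrete, here is a toy numerical sketch of this picture. The parameters (ridge density, energy per ridge) are made-up illustrative numbers, not values from the thread:

```python
# Toy model (illustrative, assumed numbers): n ridges per metre of sliding,
# each costing a fixed energy E_bump to break free of.
n_per_metre = 1e6   # assumed ridge density (ridges per metre)
E_bump = 1e-7       # assumed energy to break one ridge (joules)

def friction_force(v):
    """Ridge-breaking friction force at sliding speed v (m/s)."""
    bumps_per_second = n_per_metre * v     # faster sliding -> more bumps per second
    power = bumps_per_second * E_bump      # energy dissipated per second
    return power / v                       # F = P / v = n * E_bump, independent of v

for v in (0.1, 1.0, 10.0):
    F = friction_force(v)
    print(f"v = {v:5.1f} m/s   F = {F:.3f} N   P = {F * v:.3f} W")
```

The force comes out the same at every speed, while the dissipated power grows linearly with speed, matching the constant-coefficient picture.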

Ben51
  • I would just note that there is no fundamental physical law that says the coefficient of kinetic friction can't depend on velocity. It's an empirical fact that for most materials, before they get too hot, it doesn't depend on velocity much. But it's nice that the model of little ridges that each require a certain amount of force to dislodge doesn't predict a wildly different behavior. – Ben51 Nov 24 '18 at 16:01
  • Thanks for your answer, and it's probably my fault, but I don't really understand it. To me, the fact that the instantaneous force doesn't change should have nothing to do with it. I mean, if you have a car driving against the wind, and the surface of the car doesn't change, at any instant the same number of wind particles are going to be hitting the car. But more wind means higher resistance to its movement, a higher force. I think what you mean to say is that the amount of momentum needed to break a bump doesn't change, but if more bumps break per second, the force would still be greater. – joshuaronis Nov 24 '18 at 16:25
  • I picture the amount of energy required to break each bump as constant--that's what's called the toughness of a material. This comes from assuming that it takes a certain amount of force to cause a deflection, and that a deflection of a given size is required to make the break. – Ben51 Nov 24 '18 at 16:32
  • The drag force on a car does increase with velocity, actually as velocity squared. Each particle of air imparts a certain impulse, and as you increase velocity, that impulse grows linearly as does the number of collisions per unit time. With sliding friction, the impulse of breaking each bump decreases with increasing velocity, because the same force is applied for a shorter time. So the increase in bumps per unit time is balanced by the decrease in impulse per bump. – Ben51 Nov 24 '18 at 16:37
  • Thanks Ben, I'm upvoting your answer, but I'm still confused. Why are bumps easier to break with increasing velocity? What do you mean the same force is applied for a shorter time...? – joshuaronis Nov 24 '18 at 16:51
    If we say each bump exerts a set amount of resistive force, and that force is maintained over a certain distance (far enough to break free of the bump), then the force is applied for a shorter time when you move faster--the same distance at higher velocity means less time. I would say that bumps are not easier to break with increasing velocity--they are equally hard, requiring the same amount of energy to break; but that equates to less momentum transfer per bump. – Ben51 Nov 24 '18 at 16:57
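The impulse balance discussed in these comments can be sketched numerically. This is my own toy comparison with assumed parameters (air density, frontal area, ridge density, energy per bump), intended only to show the scaling, not realistic values:

```python
# Toy comparison (assumed numbers): why air drag grows as v**2 while the
# ridge-breaking model gives a speed-independent force.

def drag_force(v, rho=1.2, area=1.0):
    # Each air parcel delivers an impulse that grows like v, and parcels
    # arrive per second at a rate that also grows like v, so F ~ v**2.
    return rho * area * v**2

def ridge_force(v, n_per_metre=1e6, e_bump=1e-7):
    # Each bump takes the same energy E to break. With F * d = E over the
    # break distance d, and contact time t = d / v, the impulse per bump
    # is F * t = E / v: it shrinks as speed grows (d cancels out).
    impulse_per_bump = e_bump / v
    bumps_per_second = n_per_metre * v
    return impulse_per_bump * bumps_per_second  # = n * E, independent of v

for v in (1.0, 2.0, 4.0):
    print(f"v = {v:.1f} m/s   drag = {drag_force(v):6.2f} N   "
          f"friction = {ridge_force(v):.3f} N")
```

Doubling the speed quadruples the drag force, while the ridge-breaking force stays put: the growth in bumps per second is exactly cancelled by the shrinking impulse per bump.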