Eric Olson responded to this claim in his paper "Rate of Time's Passage". His counterargument, in essence, is that it is impossible for time to pass at one second per second because a rate is a ratio between two quantities. In the case of one second per second, you would not get a rate: the second units would cancel and you'd be left with a bare number as the quotient, namely one, which is not a rate of anything. Olson concludes from this not only that time does not flow but that any A-theory of time must be false, because A-theories entail time's passing at some rate or other.
This doesn't seem like a good argument at all. The reason we cannot gauge time's passage as a quantifiable rate is that we have no standard to measure it against. The rate of change of anything (movement, temperature, etc.) can be measured against some standard, such as the movement of a clock's hands, but what would we gauge the passage of time against?
Olson anticipates this answer and says it is no good: just as we can use something's own length to know its length, we can use time to gauge its own passage, yet the result would still be nonsense because we would get bare numbers instead of true rates. He says it makes sense to say that the Standard Meter is one meter long, and that we know this because it is the same length as itself.
Here's my response. First, length is not a rate; it is a property of the thing itself (a genuine property, in fact). Rates are comparisons or relations between events, so right off the bat there is a false analogy. Second, if we had a suitable measure for time's passage, such as a "supertime", we could gauge time's passage or flow against supertime and get something like 1 second per supersecond, which is a rate and not a bare number, because the units (time vs. supertime) are different and do not cancel. But wouldn't supertime then need a super-supertime as a standard against which to measure its own flow, and so on ad infinitum? No, because supertime could use time as its standard; which one we use is arbitrary. Third, scientists use time to gauge time's passage all the time without any problem.
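To make the dimensional-analysis point concrete, here is a toy Python sketch (the Quantity class is invented purely for illustration, not something from Olson's paper): dividing seconds by seconds cancels the units and leaves a bare number, while dividing seconds by superseconds leaves the units intact, which is what a genuine rate looks like.

```python
# Toy illustration of the dimensional-analysis point; the Quantity class is
# invented for this example, not drawn from any source.
from dataclasses import dataclass, field

@dataclass
class Quantity:
    value: float
    units: dict = field(default_factory=dict)  # e.g. {"second": 1}

    def __truediv__(self, other):
        # Subtract the divisor's unit exponents from the dividend's;
        # identical units cancel out entirely.
        units = dict(self.units)
        for u, exp in other.units.items():
            units[u] = units.get(u, 0) - exp
            if units[u] == 0:
                del units[u]
        return Quantity(self.value / other.value, units)

one_second = Quantity(1.0, {"second": 1})
one_supersecond = Quantity(1.0, {"supersecond": 1})

print(one_second / one_second)       # units={} -> a bare number, not a rate
print(one_second / one_supersecond)  # units={'second': 1, 'supersecond': -1} -> a genuine rate
```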
Consider relativity. Suppose I am at rest on Earth and you are on a fast spaceship heading to Mars. For every 4 seconds my clock registers, your clock registers only 1 second (time dilation). Call my inertial reference frame R1 and yours R2. In this case, your clock runs at 0.25 seconds per second (the sketch below uses the Lorentz factor to work out how fast you'd have to be going for this dilation to occur). But the "seconds" in the divisor and the dividend do not cancel, because they are relative to different inertial reference frames. We get something like 0.25 R2 seconds per 1 R1 second, and that is not problematic at all.
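Here is that calculation, a short sketch of the standard special-relativity arithmetic (nothing beyond the time-dilation formula and the 4-to-1 ratio stated above):

```python
import math

c = 299_792_458.0   # speed of light, m/s
gamma = 4.0         # dilation factor: 4 of my seconds per 1 of yours

# gamma = 1 / sqrt(1 - v^2/c^2)  =>  v = c * sqrt(1 - 1/gamma^2)
v = c * math.sqrt(1.0 - 1.0 / gamma**2)

print(f"Required speed: {v:.3e} m/s (about {v / c:.3f} c)")
# -> roughly 2.9e8 m/s, about 0.968 c
```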
We cannot gauge time's passage as a whole because we have no available and suitable standard, but that doesn't mean no such standard is possible. The problem is analogous to the old ponderous childhood puzzle: if everything became twice as large at the same time, how would we know? In this case: if everything flowed at twice its normal rate (assuming there is a normal rate), how would we know? That's an epistemological problem, to be resolved with the appropriate standards.