I agree with both of you that a timer is sufficient in most cases and is more suited to B4A, but I noticed that it was better to choose another solution in two cases: for a short, quick animation (my first attempts with a timer were not satisfying for my SlidingSidebar class, for example) and when regularity is critical (e.g. for a metronome). As the timer fires sometimes early, sometimes late, it's far too irregular for an audio application. A lag above 20 ms is noticeable to any experienced musician (as I am) and disturbing when you play a fast song. Audio applications for PC or Mac offer a latency (hardware + software) below 10 ms and almost no deviation during a whole session. The same is expected from a mobile application (otherwise the application is just a toy).
If you look at the code of a game engine, you'll see that it doesn't trust the regularity of the timer at all (when it uses one; some rely only on a main loop). Sprites and other animated objects usually have a speed in pixels per second, and their position is computed from the time elapsed since the previous frame. You don't see code like sp.x = sp.x + 2, which assumes a very regular timer, but things like sp.x = sp.x + 50 * deltaTime. If you've ever wondered why x and y are float values in the Android API, now you know.
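To make the idea concrete, here is a minimal plain-Java sketch of frame-rate-independent movement. The Sprite class and its fields are hypothetical, invented for illustration (not B4A or the Android API); the point is only that position depends on elapsed time, not on the number of ticks.

```java
public class DeltaTimeDemo {
    // Hypothetical sprite: position in pixels (a float, like the Android API)
    // and a speed expressed in pixels per second.
    static class Sprite {
        float x = 0f;
        float speed = 50f;
    }

    // Advance the sprite by the elapsed time, not by a fixed step.
    static void update(Sprite sp, float deltaSeconds) {
        sp.x = sp.x + sp.speed * deltaSeconds; // sp.x += 50 * deltaTime
    }

    public static void main(String[] args) {
        Sprite sp = new Sprite();
        // Deliberately irregular frame times (16 ms, 40 ms, 10 ms):
        // the final position depends only on the total elapsed time (66 ms),
        // not on how regularly the ticks arrived.
        float[] deltas = {0.016f, 0.040f, 0.010f};
        for (float d : deltas) update(sp, d);
        System.out.println(sp.x); // ~3.3 pixels (50 px/s * 0.066 s)
    }
}
```

In a real loop, deltaSeconds would be measured with something like System.nanoTime() between frames; here the deltas are hard-coded so the result is easy to check.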
Regarding CPU consumption, it is obvious that my example, which does nothing, is a complete waste of CPU time. But in a real app, this loop is used to call code, and most of the CPU time is consumed by that code, not by the loop itself. Keep in mind that the purpose of this loop is also to indicate precisely where we are relative to the start or to the previous tick, so as to estimate the progress of an animation, for example. A timer does not provide this information (fortunately, since Honeycomb, we have TimeAnimator in Java; we can even set a frame delay to force it to respect a given pace).
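As a small illustration of that last point, here is a hedged plain-Java sketch of the kind of progress estimation a loop makes possible: knowing exactly where we are between the start of an animation and now, whenever the tick happens to fire. The method name and timestamps are hypothetical, chosen for the example.

```java
public class ProgressDemo {
    // Progress of an animation of totalMs duration, clamped to [0, 1].
    // A late or early tick doesn't matter: we compute from real elapsed time.
    static float progress(long startMs, long nowMs, long totalMs) {
        float p = (nowMs - startMs) / (float) totalMs;
        return Math.min(1f, Math.max(0f, p));
    }

    public static void main(String[] args) {
        long start = 1000; // pretend the animation started at t = 1000 ms
        // Whenever our loop iteration actually runs, we know the exact progress:
        System.out.println(progress(start, 1150, 300)); // 150/300 -> 0.5
        System.out.println(progress(start, 1400, 300)); // past the end -> 1.0
    }
}
```

In a real app the current time would come from System.currentTimeMillis() or System.nanoTime(); this is essentially what TimeAnimator hands you for free via the totalTime and deltaTime parameters of its TimeListener.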