Would it be bad to leave behind our Flash roots?

I am working with Tyler on Stealth, our high-performance component framework. After reading this article on performance by Arno Gourdol of Adobe, I began wondering whether we should leave behind our Flash roots of motion and timeline design by defaulting the frame rate to 0 in our Stealth-based applications.

A frame rate makes great sense for games or timeline-based animations, but do we need it in applications? We can update the screen on mouse moves, rollovers, etc. with the MouseEvent.updateAfterEvent instance method. And for transitions and tweening, a tween class could run a Timer for the duration of the animation and call TimerEvent.updateAfterEvent from its tick handler. Then the screen would only refresh when it needs to, and performance would be greatly increased. Seems like it makes sense. Would this be something to add to Flex? Would it give us the performance we need/want for mobile applications and more responsive desktop applications?
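
Something like the following is what I have in mind for the tween side. It’s only a rough sketch, the target, duration, and tick interval are made up, and (as the update below shows) the Timer itself still depends on the frame rate, but it shows the shape of the idea:

    // Rough sketch: drive a fade with a Timer and repaint with
    // TimerEvent.updateAfterEvent() instead of relying on the frame rate.
    import flash.display.Sprite;
    import flash.events.TimerEvent;
    import flash.utils.Timer;
    import flash.utils.getTimer;

    var target:Sprite = new Sprite(); // placeholder display object to fade out
    addChild(target);                 // assume it is on the display list
    var duration:int = 300;           // tween length in milliseconds
    var startTime:int;
    var ticker:Timer = new Timer(10); // tick roughly every 10 ms while tweening

    function beginFade():void {
        startTime = getTimer();
        ticker.addEventListener(TimerEvent.TIMER, onTick);
        ticker.start();
    }

    function onTick(event:TimerEvent):void {
        var t:Number = Math.min((getTimer() - startTime) / duration, 1);
        target.alpha = 1 - t;     // apply the tweened value
        event.updateAfterEvent(); // render now instead of waiting for a frame
        if (t >= 1) {
            ticker.stop();
            ticker.removeEventListener(TimerEvent.TIMER, onTick);
        }
    }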

Any foreseeable drawbacks? What do you think?

Update: I did some testing and it seems that the Timer class is directly influenced by the frame rate. With a frame rate of 0, a timer that should fire immediately (set to 0 ms) doesn’t fire for 20 seconds! With the frame rate at 0.1 it fires after about 2 seconds, at a frame rate of 1 after about 145 ms, and at any frame rate over 4 it seems to be around the same (10 ms – 30 ms, probably depending on what the OS is currently doing).
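
The test itself was nothing fancy; a throwaway snippet along these lines (not my exact code) reproduces it if you rerun it with different stage.frameRate values:

    // Measure how long a "fire immediately" Timer actually takes at the
    // current frame rate.
    import flash.events.TimerEvent;
    import flash.utils.Timer;
    import flash.utils.getTimer;

    var started:int = getTimer();
    var probe:Timer = new Timer(0, 1); // 0 ms delay, fire once

    probe.addEventListener(TimerEvent.TIMER, function(event:TimerEvent):void {
        trace("Timer fired after " + (getTimer() - started) + " ms");
    });
    probe.start();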

MouseEvent updating and such happens as it should, however, so as long as you start off with a frame rate of 4 so that the app can initialize visually, you could drop it down to 0 until a tween is needed and then bump it up to 4 for the duration of the tween. The rest of the visual changes can respond to mouse events (resize, click, rollover, etc.). Or leaving it at 4 frames per second probably isn’t too bad on the CPU; either way, you’d have to call updateAfterEvent when needed.
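
The bumping part itself would be tiny. This is just a sketch (assume it runs somewhere with access to the stage, and the idle value is only a stand-in for “as low as it will go”):

    import flash.display.Stage;

    const IDLE_FPS:Number = 0.01; // near zero; stand-in for "as low as it will go"
    const TWEEN_FPS:Number = 4;   // enough for Timers to fire promptly

    function beginTween(stage:Stage):void {
        stage.frameRate = TWEEN_FPS; // wake the player up for the animation
        // ... run the Timer-driven tween, calling updateAfterEvent() each tick ...
    }

    function endTween(stage:Stage):void {
        stage.frameRate = IDLE_FPS;  // drop back down once the tween completes
    }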

3 Responses to “Would it be bad to leave behind our Flash roots?”

  1. Josh Says:

    The Timer issue is really the only thing that immediately comes to mind when thinking about what could be affected. If I remember right, Timers can fire at their fastest 10x per frame, so if your FPS is set to something really low, it’s definitely going to affect your Timers. Those numbers you’re seeing make sense. (Of course, all that other crap like CPU speed, how busy your machine is, etc. plays a role too.)

    The only other thing to consider, though, is whether you ever want to use any animations that are timeline-based. Sure, you can tween everything using Timers if you want, but if you import animated content that some designer painstakingly created using the timeline, you’d want to have a higher frame rate so the animation looks nice & pretty.

  2. Jacob Wright Says:

    Yeah, maybe the default frame rate could be 4 (testing on systems other than my own required, of course) for most RIA/desktop/mobile applications, and then it could be set higher for games and apps that import Flash animations.

    The key to allowing this would be to build tweening and animation frameworks that use the Timer and updateAfterEvent, and to have the invalidation framework use updateAfterEvent (you wouldn’t want to do it on every mouse move if there are no visual changes happening).
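
    Roughly what I mean for the invalidation side, as a hypothetical sketch (the class and names here are made up):

        package {
            import flash.display.Sprite;
            import flash.events.MouseEvent;

            // Sketch: mark the component dirty when a visual property changes,
            // and only repaint from the mouse handler when something is dirty.
            public class InvalidatingSprite extends Sprite {
                private var dirty:Boolean = false;

                public function InvalidatingSprite() {
                    addEventListener(MouseEvent.MOUSE_MOVE, onMouseMove);
                }

                public function invalidate():void {
                    dirty = true; // a visual property changed; a repaint is needed
                }

                private function onMouseMove(event:MouseEvent):void {
                    if (dirty) {
                        event.updateAfterEvent(); // repaint now, regardless of frame rate
                        dirty = false;
                    }
                }
            }
        }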

  3. DonMoir Says:

    About a year and a half ago, I spent several days beating Flash over the head in an effort to figure out the frame rate. Used timers, updateAfterEvent, etc. No luck. I spied on it, I brought it into a C++ container and beat on it some more. I did get it to update quickly from C++ with the frame rate set to 1, but this was a hack and had other problems, so of course it was no good.

    My conclusion was that Adobe needs to get rid of the frame rate. It is a real bottleneck.

    I filed a bug report on this after I was done testing. This report goes into more detail.