Fabrizio Giudici is a Senior Java Architect with long Java experience in the industrial field. He runs Tidalwave, his own consultancy company, and has contributed to Java success stories in a number of fields, including Formula One. Fabrizio often appears as a speaker at international Java conferences such as JavaOne and Devoxx, and is a member of JUG Milano and the NetBeans Dream Team.

Stop Watches Anyone? (Or "About Continuous Profiling")

One of the most interesting talks at the latest CommunityOne in San Francisco, in the NetBeans track, was the speech by NetBeans lead architect Yarda Tulach about the performance improvements in the latest NetBeans IDE 6.1. Yarda also talked about the impact of performance tuning on the development process, and one of his main points was that you should do continuous profiling. In other words, you might be happy with the performance of a certain feature; then you start fixing bugs, adding new functions, and so on, without realizing that performance is slowly degrading, until at a certain point it has degraded so much that you need a major refactoring. It is far better to keep performance under control continuously, so that you are immediately aware when something is getting slower.

I've actually experienced this the hard way during blueMarine development, something I can no longer afford as I get closer to a real release. My beta testers have been very clear about their performance expectations for some features, and one of the real, "political" disadvantages of Java is that the persistent myth that "Java is slow", while false, creates additional problems if you get something wrong in this area. Be inefficient just once in your code, and somebody will soon dismiss your desktop application as "Java crap"; that is, even if the error is yours, and you can actually fix it, you risk a perpetual rejection of your application because of the technology it is built with.

So I've started taking some periodic, manual performance measurements in the most important areas of blueMarine. So far I run some special load tests, extract the numbers, and put them into an OpenOffice spreadsheet where I can track progress, see trends, and make comparisons between figures (the very first example was discussed on my blog a couple of months ago).

Now I'm seeing the usual problem with "continuous" tasks: it takes longer and longer, and the more performance tests I add, the worse it gets. So this must be automated in some way, and possibly integrated with Hudson.
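As a rough illustration of the kind of harness this automation implies (the class and method names below are my own invention, not anything from blueMarine), a tiny recorder can time repeated runs of a task and emit a min/max/avg CSV row that a spreadsheet, or a Hudson job appending to a history file, can track over time:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a load-test recorder: run a task several times,
// collect the elapsed times, and emit min/max/avg as one CSV row.
public class PerfRecorder {
    private final List<Long> samples = new ArrayList<Long>();

    public void measure(Runnable task) {
        long start = System.nanoTime();
        task.run();
        samples.add((System.nanoTime() - start) / 1000000L); // nanos -> millis
    }

    public long min() {
        long m = Long.MAX_VALUE;
        for (long s : samples) m = Math.min(m, s);
        return m;
    }

    public long max() {
        long m = Long.MIN_VALUE;
        for (long s : samples) m = Math.max(m, s);
        return m;
    }

    public double avg() {
        long total = 0;
        for (long s : samples) total += s;
        return (double) total / samples.size();
    }

    // One row per build, appended to a history file the spreadsheet imports.
    public String toCsvRow(String label) {
        return label + "," + min() + "," + max() + "," + avg();
    }
}
```

A continuous-integration job could run the load tests headlessly and append each row to a file, turning the manual spreadsheet step into part of the build.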

Of the many problems with continuous profiling, the very first is how to collect data. Now, NetBeans has an excellent profiler (apart from a few bugs that are causing me trouble in my specific RCP scenario), but in my opinion this is a different thing:

  1. The Java profiler is a specific development tool, while I want profiling data to be collectable at any time with the production code, even by a (beta) tester. While it's true that the Java 6 profiler is much easier to set up, it's still a JDK tool that would scare a casual user; what I want is a specific UI options pane where the user can tick a checkbox and possibly select the functional area to profile.
  2. The Java profiler is not "semantic aware". What I mean is that it can tell me how many times, and for how long, a certain method (say, loadImage()) has been executed, while I need, for example, to segment the collected data by parameters (for instance, the image type and so on). Please read the example below for a better clarification of the point.
  3. The Java profiler collects too much data for a "continuous profiling" approach, which of course also impacts performance. With the Java profiler it's perfectly possible to fine-tune the profiling points, but again this is a task for a programmer, not for a (beta) user.
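To make point #2 concrete, here is a hedged sketch (the class and its methods are my own invention) of what "semantic aware" timing could look like: samples are aggregated under a caller-supplied key, such as the image type handled by loadImage(), instead of being lumped together per method:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: timings are aggregated per semantic key (e.g. the
// image type), so "loadImage/TIFF" and "loadImage/JPEG" show up as
// separate statistics instead of a single loadImage() bucket.
public class SemanticStopWatch {
    private static class Stat {
        long count;
        long totalMillis;
        long maxMillis;
    }

    private final Map<String, Stat> stats = new HashMap<String, Stat>();

    public void record(String key, long elapsedMillis) {
        Stat s = stats.get(key);
        if (s == null) {
            s = new Stat();
            stats.put(key, s);
        }
        s.count++;
        s.totalMillis += elapsedMillis;
        s.maxMillis = Math.max(s.maxMillis, elapsedMillis);
    }

    public long count(String key) {
        return stats.get(key).count;
    }

    public long max(String key) {
        return stats.get(key).maxMillis;
    }

    public double average(String key) {
        Stat s = stats.get(key);
        return (double) s.totalMillis / s.count;
    }
}
```

An instrumented loadImage() would call record("loadImage/" + imageType, elapsed) at the end of each invocation; the checkbox in the options pane would simply toggle whether record() stores anything, keeping the overhead negligible when profiling is off.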

Of the previous points, I'd say the strongest is #2, considering that improvements in the Java profiler may weaken #1 and #3 as time goes on.

Published at DZone with permission of Fabrizio Giudici, author and DZone MVB.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)


Geertjan Wielenga replied on Wed, 2008/06/18 - 8:24am

Great article. Thanks!

Fabrizio Giudici replied on Wed, 2008/06/18 - 8:39am

BTW, comments were broken for a few hours. I'm kindly asking people that tried to comment and failed to retry now. Thanks.

Tim Boudreau replied on Wed, 2008/06/18 - 9:05am

Have you looked at the Timers API in NetBeans? I think it may still be in "friend" mode (i.e. only JARs that it knows about can call it, but you can get added to the list), since it's not in the official javadoc. If you're using a dev build, you can pull up a Run Time Watches window (not present in release builds). The team redoing the Java editor created this for doing exactly the kind of continuous profiling it sounds like you're talking about. I don't know for sure that it's applicable, but I think it might be, and it should be low-overhead. I don't know too much about it, but it sounds like it might be what you're looking for: it's all about collecting timings for things that should remain fast at runtime.

Not the same as a unit test for performance regression (hard to do on all platforms, particularly with imaging, where you're dealing with platform- and hardware-specific stuff, but still a worthy goal and doable in the abstract).

Our biggest effort in NetBeans 6.1 was not improving performance, but finding ways to make sure that someone cannot commit a change that regresses performance and not know it.  It's a hard problem.

Jan Lahoda in Prague is, AFAIK, the author of the Timers API and its UI.  I'll point him at this thread.


Vincent DABURON replied on Wed, 2008/06/18 - 3:54pm


When I see your statistics, msec (val/min/max/avg), I remember the "Java Application Monitor" API, or JAMon V1.0.



Please, Fabrizio, have a look at the JAMon API before creating a new API.

If there is something missing, maybe Steve Souza, the principal maintainer of JAMon, could add your feature?


If you want to add a JUnit timed performance test, you can use the JUnitPerf API.


For example, to create a timed test that waits for the completion of the ExampleTestCase.testOneSecondResponse() method and then fails if the elapsed time exceeds 1 second, use:

// JUnitPerf's TimedTest decorates an existing JUnit test with a time limit.
import junit.framework.Test;
import com.clarkware.junitperf.TimedTest;

long maxElapsedTime = 1000; // milliseconds
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test timedTest = new TimedTest(testCase, maxElapsedTime);


Vincent D.



William Louth replied on Sun, 2008/07/20 - 7:31am

JAMon is incredibly slow, but that is relative to what you are doing and what you expect.

StopWatches are so 80's.  

It's all about resource metering and billing, a concept very applicable to our present and future: cloud computing.

There is already an API that solves this problem: JXInsight Probes Open API.

