  Object-Oriented Software Metrics - Afterword and further reading

Afterword
None of us are angels - we all slip up from time to time -

  • The code that you were forced to change ('for our most important customer') the Friday afternoon before you went on holiday
  • The code that you inherited but never really understood and were too proud to admit it
  • The code that you inherited that was a complete rat's nest that you've never dared to tackle
  • The code that you didn't write unit tests for because it was 'so trivial'
  • The code that you wrote a hundred different ways and when you finally got it right you didn't dare to tidy it up in case it broke

We've all done these kinds of things - and we will almost certainly all continue to do them. That's alright - as long as you know what you have to do and you have the items on your 'To Do' list. That list is always bigger than the available time. But don't worry - the thing about the stuff on your to do list is that you will end up revisiting it - because it'll break - and you'll get a chance to cross it off the list. But you'll be kicking yourself as you do it. There's no real way out of this situation - we don't have infinite time or infinite patience. You can mitigate it though - write unit tests as you write or maintain code, and write them as you find bugs. And always measure your code - no matter which measurements you choose to use, use them regularly and record and compare the results. As time passes you will see improvement, and sometimes, when you do a lot of work that has no visible benefit to the end user, there is the satisfaction of seeing some metric change for the better.
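The 'write tests as you find bugs' habit can be very lightweight. Here is a minimal JUnit 4 sketch of the kind of regression test worth writing at the moment a bug is fixed - the PriceRules class and its discount rule are invented purely for illustration:

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical example - PriceRules and its discount rule are invented.
public class PriceRulesTest {

    // The code under test, inlined so the example is self-contained.
    static class PriceRules {
        static double applyDiscount(double price, double rate) {
            // The old bug: a rate of 0.0 was treated as 'missing' and the
            // method returned 0. The tests below pin the fixed behaviour down.
            return price * (1.0 - rate);
        }
    }

    @Test
    public void zeroDiscountLeavesPriceUnchanged() {
        assertEquals(100.0, PriceRules.applyDiscount(100.0, 0.0), 1e-9);
    }

    @Test
    public void tenPercentDiscountReducesPrice() {
        assertEquals(90.0, PriceRules.applyDiscount(100.0, 0.10), 1e-9);
    }
}
```

The point is not the arithmetic - it is that the fixed behaviour is now pinned down, so the next hurried Friday-afternoon change cannot silently reintroduce the bug.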

There are other areas where metrics can help. For example, distributed software development in Java is a fact of life. Mergers and acquisitions have resulted in companies having geographically scattered development teams, and outsourcing to offshore locations such as India has exacerbated this trend. Development managers have to pull the efforts of all these teams together into a seamless whole. This has led to an increased interest in products that can help this process - just how do you assess the quality of code that has arrived from a group of programmers that you know absolutely nothing about? Laborious line-by-line review is the most effective way, but it is too slow, and the tedious nature of the exercise means that even fairly major bugs can slip through. Products such as JHawk seek to address this by using automated object-oriented software metrics, which can direct you quickly to areas of code that may be suspect. In the case of JHawk we give you the source code (Full licenses only), which allows you to modify the metrics used to suit your own Java 'house style'. Giving these tools to the external developers means that they get an opportunity to fix their Java code before they send it to you, saving you a lot of work. There will always be arguments about which metrics are valuable and in which situations. Products like JHawk, which allow you to configure the metrics reporting to suit your own requirements, can help you focus on those metrics that you have found to be really important to your development processes. The JHawk command line interface that comes with all versions of JHawk allows you to integrate metrics measurement into your automated build processes.

OK - But where do I start?

I've been designing and programming Object-Oriented systems in Smalltalk, Java and C++ since 1989, and in my experience there are a few very simple rules for using metrics -

  • Start off slowly - pick a handful of metrics at each level. If you produce too much information you just won't read it. In the case of JHawk I would suggest using the UI at first rather than diving in immediately with an automated report using the Command Line interface. Start off at the class level and look for those classes with large numbers of methods, high maximum Cyclomatic Complexity and high Fan Out. Within these classes, at the method level, look at the number of lines of code, the Cyclomatic Complexity and the number of class references. Don't just pick a metric level and look at everything above it - take the top 10 offenders in each of the metrics that you have chosen. Looking at these entities will tell you a lot about whether your code has problems and why. Just as everywhere has a 'house style' for good code, they often have a 'house style' for bad code - I don't doubt that you'll see the same 'crimes' committed across the code. Once you've established where your problems might lie you can decide what levels of particular metrics you find acceptable. If a large percentage of your code exceeds those levels you should initially set the levels higher, otherwise the programmers will feel that the task is impossible. You can then bring the levels down as the amount of code exceeding them reduces.
  • When you set levels, set them for guidance - do not be overly prescriptive. You must, however, have a set of guidelines that says it is alright to exceed a metric level in certain circumstances - and then clearly define those circumstances.
  • Automate as much of your quality measurement and analysis as possible - if it takes too long it won't get done. (JHawk just analysed 4000 Java classes in 120 seconds on my PC - a 2.66GHz P4 with 1Gb RAM.) Also be careful that you don't produce too much data - otherwise it won't get read, never mind acted on.
  • Improving your code as a result of this analysis will almost certainly mean refactoring it. Don't attempt refactoring without having a full, reliable set of unit tests such as those that you can create with JUnit.
  • Never compromise a good design without a good reason. Use tools, metrics and rigorous analysis of the results to do things 'by fact'.
  • If you learn even one new thing from using a tool then it's been worth the effort/money, since you can apply that new learning time and time again (I know that sounds like a sales pitch!)
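As a rough illustration of the 'top 10 offenders' triage described above, the sketch below approximates McCabe's Cyclomatic Complexity as 1 + the number of decision points, counted naively with a keyword regex. This is not how JHawk computes it - a real metrics tool parses the source properly - and the method names and bodies here are invented:

```java
import java.util.*;
import java.util.regex.*;
import java.util.stream.Collectors;

// Rough sketch: rank methods by an approximate Cyclomatic Complexity.
// CC is approximated as 1 + (number of branching keywords/operators) -
// a crude stand-in for real parsing, for illustration only.
public class ComplexityTriage {

    static final Pattern DECISION =
        Pattern.compile("\\b(if|for|while|case|catch)\\b|&&|\\|\\|");

    static int approxCyclomaticComplexity(String methodSource) {
        Matcher m = DECISION.matcher(methodSource);
        int decisions = 0;
        while (m.find()) decisions++;
        return decisions + 1;   // straight-line code has complexity 1
    }

    // Report the top N offenders rather than everything - keep the output small.
    static List<String> topOffenders(Map<String, String> methodSources, int n) {
        return methodSources.entrySet().stream()
            .sorted(Comparator.comparingInt(
                (Map.Entry<String, String> e) ->
                    -approxCyclomaticComplexity(e.getValue())))
            .limit(n)
            .map(Map.Entry::getKey)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, String> methods = new LinkedHashMap<>();
        methods.put("trivialGetter", "return name;");
        // 4 decision points (if, &&, for, if) -> approximate CC of 5
        methods.put("validate",
            "if (a && b) { for (int i = 0; i < n; i++) { if (c) {} } }");
        System.out.println("validate CC ~= "
            + approxCyclomaticComplexity(methods.get("validate")));
        System.out.println("worst offender: " + topOffenders(methods, 1));
    }
}
```

The same top-N idea applies to any of the metrics mentioned above - lines of code, Fan Out, class references - the value is in looking at the worst few entities first rather than drowning in a full report.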

If you are interested in Java metrics you might also be interested in the topics that we are publishing in our Sidebars papers.

References
Halstead calculations - http://yunus.hun.edu.tr/~sencer/complexity.html

The standard text on object-oriented metrics: Brian Henderson-Sellers, Object-Oriented Metrics: Measures of Complexity, Prentice-Hall, 1996.

Eminently common-sense Java programming - every Java programmer will learn something they didn't know from this book: Joshua Bloch, Effective Java: Programming Language Guide, Addison-Wesley, 2001.

A practitioner's view of coupling - this also provides excellent descriptions of Instability and Abstractness: Andrew Glover, 'In pursuit of code quality: Code quality for software architects'.

The following links are of a more academic nature but are valuable if you have a deeper interest in object-oriented metrics -

http://user.cs.tu-berlin.de/~fetcke/metrics/oo.html (Links to main object oriented software metrics sites)

http://ivs.cs.uni-magdeburg.de/sw-eng/us/bibliography/bib_main.shtml (Bibliography subdivided by type of metric)


Contact Us

All Content © 2017 Virtual Machinery   All Rights Reserved.