How to Start Testing A Large Legacy Application

Facilitated by Paul O'Keeffe.

What Happened

This session morphed into a discussion at the other end of the spectrum: achieving 100% test coverage, what that actually means, and whether attempting such a thing is even wise.

A Tighter Definition of Code Coverage Numbers

What does 100% test coverage actually mean?

  • Type of coverage - line, branch or path (see the example after this list).
  • Tests being run - unit, integration, functional or some combination of these.
  • Code being covered - all production code, perhaps with some acceptable exclusions.
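
To make the first of these concrete, here is a small, purely illustrative Java example (the class is an assumption, not something from the session). A single unit test that calls apply(100.0, true) executes every line of the method, giving 100% line coverage, yet it never takes the branch where isMember is false, so branch coverage is only 50%. Path coverage is stricter again, requiring every combination of branch outcomes through a method to be exercised.

  // Hypothetical example: full line coverage does not imply full branch coverage.
  public class Discount {

      // Applies an optional 10% member discount to a price.
      public double apply(double price, boolean isMember) {
          double result = price;
          if (isMember) {           // a test with isMember == true runs every line,
              result = price * 0.9; // but the "false" branch is never exercised
          }
          return result;
      }
  }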

Shooting for 100%

We discussed one greenfield Java project which achieved 100% branch coverage of all production code, except for a thin wrapping layer around external third party libraries, running only fairly tight unit tests. Integration and functional tests were not counted towards coverage. This was done in an attempt to enforce test driving of production code, since it would be nearly impossible to achieve this result without having done so. It succeeded in this respect, but at the cost of a reasonably large amount of fairly brittle test code to maintain, due to the tight coupling between the tests and implementation details within the production classes.
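
As a purely illustrative sketch of that brittleness (the classes and the JUnit 4 usage below are assumptions, not code from the project): the test pins down not just the result but the exact number of calls made to a collaborator, so a harmless internal refactoring, such as caching the rate, breaks the test even though the observable behaviour is unchanged.

  import static org.junit.Assert.assertEquals;

  import java.util.ArrayList;
  import java.util.List;
  import org.junit.Test;

  public class PriceCalculatorTest {

      // Hypothetical boundary interface and class under test.
      interface RateSource {
          double vatRate();
      }

      static class PriceCalculator {
          private final RateSource rates;

          PriceCalculator(RateSource rates) {
              this.rates = rates;
          }

          double gross(double net) {
              return net * (1 + rates.vatRate());
          }
      }

      @Test
      public void addsVatUsingExactlyOneRateLookup() {
          final List<String> calls = new ArrayList<String>();
          RateSource spy = new RateSource() {
              public double vatRate() {
                  calls.add("vatRate");
                  return 0.2;
              }
          };
          assertEquals(120.0, new PriceCalculator(spy).gross(100.0), 0.001);
          // Coupled to an implementation detail: how many lookups were made.
          assertEquals(1, calls.size());
      }
  }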

Hurdles

Achieving this level of coverage was made more difficult when the code needed to call through to external library code which was not designed for testability. This includes the vast majority of all third party libraries and the JDK in particular! Language constructs which make achieving full test coverage more difficult include:

  • Referencing concrete classes instead of interfaces.
  • Direct use of constructors, rather than factories.
  • Final classes.
  • Static methods.

Generally, any construct that makes it harder to replace real dependencies with test versions makes testing tricky.
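
For illustration, here is a small hypothetical example (the class names are not from the session) built from exactly these constructs. Neither the static UUID.randomUUID() call nor the directly constructed AuditLog can be swapped for a test version, so a unit test is forced to exercise the real implementations, and any hard-to-provoke behaviour inside them cannot be covered from here.

  import java.util.UUID;

  public class OrderService {

      public String createOrder() {
          // Static method on a final JDK class: a test cannot substitute a
          // predictable id here.
          String id = UUID.randomUUID().toString();

          // Direct constructor call to a concrete collaborator: the test is
          // forced to run the real AuditLog as well.
          AuditLog log = new AuditLog();
          log.record("created order " + id);
          return id;
      }
  }

  // Concrete collaborator; if it were also final, even subclassing it to
  // create a test double would not be an option.
  class AuditLog {
      void record(String message) {
          System.out.println(message);
      }
  }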

Jumping the Hurdles

The project discussed wrapped all third party APIs that used untestable constructs in a thin proxy layer, which automatically translated them as follows (a sketch of one such wrapper follows the list):

  • Concrete/final classes -> interfaces.
  • Constructors -> choice of factory classes or methods.
  • Static methods -> instance methods.
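
For illustration, one hand coded entry in such a layer might look roughly like this, using System.currentTimeMillis() as the untestable JDK API (the Clock, SystemClock and Clocks names are assumptions, not the project's actual wrappers):

  // Boundary interface: instance methods only, no statics, no concrete types.
  public interface Clock {
      long currentTimeMillis();
  }

  // Production implementation: the only place that touches the static JDK call.
  class SystemClock implements Clock {
      public long currentTimeMillis() {
          return System.currentTimeMillis();
      }
  }

  // Factory, so production code never constructs the wrapper directly.
  final class Clocks {
      private Clocks() {
      }

      public static Clock system() {
          return new SystemClock();
      }
  }

Production code asks the factory for a Clock and calls an instance method, so a test can supply its own Clock returning a fixed time.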

Initially this layer was hand coded and was itself excluded from the code being measured for test coverage. However, inconsistencies and untested logic began to creep into this layer. To solve this, the wrappers were instead generated at runtime using dynamic proxies implementing hand coded interfaces for the desired APIs. This later evolved into the [http://code.google.com/p/proxymatic/wiki/AutoBoundary AutoBoundary] module of the Proxymatic open source project.
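
The runtime generated version of that idea can be sketched roughly as follows using a JDK dynamic proxy; this is a simplified illustration, not the actual Proxymatic/AutoBoundary implementation, and the Wrappers name is an assumption.

  import java.lang.reflect.InvocationHandler;
  import java.lang.reflect.Method;
  import java.lang.reflect.Proxy;

  public final class Wrappers {

      private Wrappers() {
      }

      // Returns an implementation of the hand coded boundary interface that
      // forwards each call to the matching public method on the wrapped
      // third party object, looked up reflectively by name and parameter types.
      @SuppressWarnings("unchecked")
      public static <T> T wrap(final Object target, Class<T> boundaryInterface) {
          InvocationHandler handler = new InvocationHandler() {
              public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
                  Method real = target.getClass()
                          .getMethod(method.getName(), method.getParameterTypes());
                  return real.invoke(target, args);
              }
          };
          return (T) Proxy.newProxyInstance(
                  boundaryInterface.getClassLoader(),
                  new Class<?>[] { boundaryInterface },
                  handler);
      }
  }

In production the proxy delegates reflectively to the real third party object; in a test the hand coded interface is simply implemented by a stub, so the reflection never runs.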

With such a layer, it is possible to replace/mock all third party code for the purposes of testing, making it easy to reproduce all behaviours, including hard to test exception conditions, thus making 100% coverage for the remaining production code possible.
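
For example (all names here are hypothetical), a disk full failure, which is awkward to provoke against the real file system, becomes a one line stub once the code under test depends only on a boundary interface:

  import static org.junit.Assert.assertEquals;

  import java.io.IOException;
  import org.junit.Test;

  public class ReportSaverTest {

      // Hypothetical boundary interface over the real file-writing API.
      interface FileGateway {
          void write(String path, String content) throws IOException;
      }

      // Hypothetical production class under test.
      static class ReportSaver {
          private final FileGateway files;

          ReportSaver(FileGateway files) {
              this.files = files;
          }

          String save(String report) {
              try {
                  files.write("/reports/latest.txt", report);
                  return "saved";
              } catch (IOException e) {
                  return "failed: " + e.getMessage();
              }
          }
      }

      @Test
      public void reportsAFailureWhenTheDiskIsFull() {
          FileGateway alwaysFails = new FileGateway() {
              public void write(String path, String content) throws IOException {
                  throw new IOException("disk full");
              }
          };
          assertEquals("failed: disk full", new ReportSaver(alwaysFails).save("q3 numbers"));
      }
  }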

How To Apply This To Legacy Application Testing

With all this in mind, the discussion returned to the question of testing legacy applications. Surprisingly, it turned out that the 100% coverage approach and tools could be applied in legacy situations, if you think of the existing code as third party code. We figured you could start by test driving all new code and wrapping all existing code in the proxy layer, and then gradually move old code over to the new approach piece by piece. Untangling concrete/static/final dependencies would be aided by interposing the wrapper layer at the appropriate points.
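
A rough sketch of that first step (all class names are hypothetical, not from the discussion): the legacy class is treated exactly like an untestable third party API, hidden behind a hand coded interface, and all new, test driven code depends only on that interface.

  // Existing legacy code: static, concrete, not designed for testability.
  class LegacyCustomerDao {
      static String loadName(long customerId) {
          // ... talks to the real database ...
          return "name-from-db";
      }
  }

  // Hand coded boundary interface for the slice of the legacy API that new code needs.
  interface CustomerGateway {
      String loadName(long customerId);
  }

  // Thin wrapper: the only untested glue, kept deliberately trivial.
  class LegacyCustomerGateway implements CustomerGateway {
      public String loadName(long customerId) {
          return LegacyCustomerDao.loadName(customerId);
      }
  }

  // New, test driven code depends only on the interface, never on the legacy class.
  class GreetingService {
      private final CustomerGateway customers;

      GreetingService(CustomerGateway customers) {
          this.customers = customers;
      }

      String greet(long customerId) {
          return "Hello, " + customers.loadName(customerId);
      }
  }

Over time, the logic inside LegacyCustomerDao can be re-implemented behind CustomerGateway piece by piece, with the wrapper shrinking as it goes.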