Visualisation of Quality

Main Points

- Measurement vs quality
- Standardisation: define it, compare against it
- Target the audience
- Automatic trend gathering
- Faster classification of raised issues
- Need data trends to show improvement
- Motivation and sales

Detailed Notes

Jason Yip: Had a problem with the quality of an automation system. The perception was that the environments were unstable. Highlight the problem more explicitly, e.g. the environments are down 60% of the time; multiply that by the number of testers to show how big the problem is. Putting the data out there to make it explicit got some action. Built a Rails app to show uptime for the test environments, then built a dashboard to visualise it. The story was more about measurement than visualisation, told from an environments perspective.
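
A rough way to make such a cost explicit, in the spirit of the 60% example. A minimal sketch; the function and all numbers are illustrative, not from the session:

    # Back-of-the-envelope cost of unstable test environments.
    # The function and all numbers are illustrative, not from the session.
    def lost_tester_hours(downtime_fraction, testers, hours_per_day, days):
        """Tester hours blocked while environments are down."""
        return downtime_fraction * testers * hours_per_day * days

    # e.g. environments down 60% of the time, 10 testers, a 20-day iteration
    print(lost_tester_hours(0.6, testers=10, hours_per_day=8, days=20))  # 960.0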

One of the attendees uses visualisation to track code quality; the report is generated as part of the CI build.

Performance testing - prints out the defects and pins them to the PM's door.

How do you communicate a message?

Need to define what needs to be measured.

e.g. a required level of cyclomatic complexity. How do we communicate this?
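
One way to make such a target concrete is to flag code that breaches it. A minimal sketch, assuming the third-party Python 'radon' library; the threshold of 10 and the file name are illustrative:

    # Flag functions whose cyclomatic complexity exceeds an agreed level.
    # Assumes the third-party 'radon' library; the limit of 10 is illustrative.
    from radon.complexity import cc_visit

    THRESHOLD = 10

    def over_threshold(source):
        """Return (name, complexity) for each block above THRESHOLD."""
        return [(block.name, block.complexity)
                for block in cc_visit(source)
                if block.complexity > THRESHOLD]

    source = open("example_module.py").read()  # hypothetical file to check
    for name, score in over_threshold(source):
        print(f"{name}: complexity {score} exceeds the agreed level of {THRESHOLD}")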

You need to be able to fix a problem once it is visualised, and you need a targeted goal so that you can watch it get better.

Visualisation - leverage the 80/20 rule and bug clusters.
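
A sketch of the idea: rank files by bug count and see how few of them account for most of the bugs. The per-file counts below are made up:

    # Pareto view of bug clusters: which files account for ~80% of the bugs?
    # The per-file bug counts are made-up illustrative data.
    from collections import Counter

    bugs_per_file = Counter({
        "billing.py": 42, "auth.py": 31, "report.py": 6,
        "utils.py": 4, "models.py": 3, "views.py": 2,
    })

    total = sum(bugs_per_file.values())
    running = 0
    for filename, count in bugs_per_file.most_common():
        running += count
        print(f"{filename:12} {count:3} bugs ({running / total:5.1%} cumulative)")
        if running / total >= 0.8:  # the "80" in 80/20
            break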

Personal Software Process - understanding what you do wrong.

Showing where problems are in the code.

Trends may become very obvious once you start tracking.

Tracking Code Quality

CI build status - large build screens near every team.

Clover dashboard - shows coverage and metrics; quickly highlights outliers.

Tree map - shows outliers and hotspots. Tracking build and test times is essential; Bamboo shows this.
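
A minimal sketch of watching that trend: flag builds whose duration drifts above a rolling baseline. The durations and the 20% rule are illustrative:

    # Flag builds whose duration drifts well above the recent baseline.
    # Durations and the 20% drift rule are made-up illustrative choices.
    durations = [300, 310, 305, 298, 320, 410, 415]  # seconds per build

    WINDOW = 5
    for i in range(WINDOW, len(durations)):
        baseline = sum(durations[i - WINDOW:i]) / WINDOW
        if durations[i] > 1.2 * baseline:
            print(f"build {i}: {durations[i]}s vs ~{baseline:.0f}s baseline - investigate")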

Fisheye - author line counts.

Commit comments - can link back to the issue tracker.
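
For example, JIRA-style issue keys can be pulled out of commit messages with a simple pattern. A minimal sketch; the key format and messages are illustrative:

    # Extract JIRA-style issue keys (e.g. PROJ-123) from commit messages
    # so commits can be linked back to the issue tracker.
    import re

    ISSUE_KEY = re.compile(r"\b[A-Z][A-Z0-9]+-\d+\b")

    commits = [  # illustrative commit messages
        "PROJ-123: fix flaky login test",
        "Tidy imports",
        "Address review comments on PROJ-130 and PROJ-131",
    ]

    for message in commits:
        keys = ISSUE_KEY.findall(message)
        print(message, "->", keys or "no linked issue")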

Viewing code coverage in the code review tool - a mash-up between Clover and Crucible.

Test-to-code ratio: LOC in tests vs. LOC in production code. Could potentially be plotted against code coverage. Eric used this chart to demonstrate what happened when the team changed.
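
A minimal sketch of computing the ratio, assuming tests live under test/ and production code under src/; the layout and the non-blank-line counting rule are assumptions:

    # Test-to-code ratio: non-blank LOC under test/ vs. under src/.
    # The directory layout and "non-blank line" rule are assumptions.
    from pathlib import Path

    def loc(paths):
        return sum(1 for p in paths
                     for line in p.read_text(errors="ignore").splitlines()
                     if line.strip())

    test_loc = loc(Path(".").glob("test/**/*.py"))
    code_loc = loc(Path(".").glob("src/**/*.py"))

    if code_loc:
        print(f"tests: {test_loc} LOC, code: {code_loc} LOC, "
              f"ratio {test_loc / code_loc:.2f}")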

Coverage alone is not a useful measure of test quality; it is an indicator, not a measure.

CodeCity - eye candy: a first-person-shooter-style landscape generated from your codebase.

Trend of open bugs; the most-voted-for bugs or features.

Relate bug fixes to code.

Evenly spread functional tests versus testing concentrated around bug clusters.

What question are you asking? Where are all the problems occurring? Map a functional breakdown of your app against the number of bugs occurring in each function. Do risk analysis up front: what are the most important areas of the code?

Could you do this in JIRA? Bamboo 2.1 has tighter integration with JIRA, e.g. how many changes related to an issue resulted in a build failure.

Steve Hayes: Using the Google Chart API to plot some statistics. Metrics: unit_coverage, functional_coverage, unit_test_count, unit_test_failures, lines_of_code. Mainly works as a change agent, getting people to change the way they work; graphs are good for helping to initiate change.
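
A sketch of how such a chart URL could be assembled. The Google Image Charts API shown here has since been retired, and the metric values are made up:

    # Build a Google Image Charts URL for a unit_test_count trend.
    # The API is long retired; the values below are made-up illustrative data.
    from urllib.parse import urlencode

    unit_test_count = [120, 135, 150, 149, 171, 190]  # one sample per build

    params = {
        "cht": "lc",                                  # line chart
        "chs": "500x200",                             # image size
        "chtt": "unit_test_count",                    # chart title
        "chd": "t:" + ",".join(map(str, unit_test_count)),
        "chds": "a",                                  # auto-scale the axis
        "chxt": "x,y",                                # show both axes
    }
    print("https://chart.googleapis.com/chart?" + urlencode(params))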

Marty Andrews: Used a pairing matrix to show why silos of knowledge were occurring. Showed a trend of who was working on what. Also looked at who was blogging in response to whose blog posts - very similar to a pod of dolphins.
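
A pairing matrix can be as simple as counting sessions per pair. A minimal sketch with made-up data, where the pairs that never appear reveal the silos:

    # Pairing matrix: count how often each pair worked together.
    # The pairing log is made-up illustrative data; missing pairs reveal silos.
    from collections import Counter

    sessions = [
        ("alice", "bob"), ("alice", "bob"), ("carol", "dave"),
        ("alice", "carol"), ("bob", "dave"),
    ]

    matrix = Counter(tuple(sorted(pair)) for pair in sessions)
    for (a, b), n in matrix.most_common():
        print(f"{a} + {b}: {n} session(s)")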

Google Motion Chart - a bubble diagram animated over time, e.g. GDP and health versus time.

Release metrics - show them on the retrospective timeline.

A classic metric for test managers: defect detection rate.
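
The session did not spell out the formula; one common definition, assumed here, is defects found before release divided by all defects found, including escapes:

    # Defect detection rate: share of defects caught before release.
    # This formula is one common definition, assumed here for illustration.
    def defect_detection_rate(found_in_test, found_in_production):
        total = found_in_test + found_in_production
        return found_in_test / total if total else 0.0

    # e.g. 90 defects caught in test, 10 escaped to production
    print(f"{defect_detection_rate(90, 10):.0%}")  # 90%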

Eye candy is a good sales tool.

Likes the bug cluster data, but it needs to be married up with the correct data, e.g. bugs found versus bugs closed.
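
A tiny sketch of that trend, with made-up weekly counts:

    # Net open bugs over time from weekly found/closed counts (made-up data).
    found  = [12, 9, 15, 7, 5]
    closed = [8, 10, 11, 9, 8]

    open_bugs, total = [], 0
    for f, c in zip(found, closed):
        total += f - c
        open_bugs.append(total)
    print(open_bugs)  # [4, 3, 7, 5, 2]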

Rex Black: Time worked, estimated versus actual. Tests fixed, expected versus actual.

Blocked test cases. Avatars that show emotion - communicate emotional state as well as progress (a Niko-niko calendar?). An explicit blockage board.

Be careful that you are showing the right message. Story boards. Link risks to what you are doing.