MuffinCups | 5 years ago
1) Transactions per second
2) Total time to process an HTTP request and return a response
3) Timing of specific sections of code
4) Exception counts
5) Counts of specific function calls (get as detailed as you need here)
6) Response time as viewed from an external requester (with percentile breakdown)
7) Version numbers
8) Anything else you feel is important to know about your software
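Items 3–6 above can be covered by a small in-process registry. This is a minimal sketch with hypothetical names (not the commenter's actual tooling), assuming counters for request and exception counts plus recorded durations with a nearest-rank percentile summary:

```python
import math
import time
from collections import defaultdict

class Metrics:
    """Minimal in-process metrics registry: counters and timing samples."""

    def __init__(self):
        self.counters = defaultdict(int)   # e.g. request counts, exception counts
        self.timings = defaultdict(list)   # e.g. per-section durations in seconds

    def incr(self, name, n=1):
        self.counters[name] += n

    def record(self, name, seconds):
        self.timings[name].append(seconds)

    def percentile(self, name, p):
        # Nearest-rank percentile over all recorded samples.
        samples = sorted(self.timings[name])
        k = max(0, min(len(samples) - 1, math.ceil(p / 100.0 * len(samples)) - 1))
        return samples[k]

metrics = Metrics()

def handle_request():
    """Wrap real request handling with timing and exception counting."""
    start = time.perf_counter()
    try:
        pass  # real request processing would go here
    except Exception:
        metrics.incr("exceptions")
        raise
    finally:
        metrics.incr("requests")
        metrics.record("response_time", time.perf_counter() - start)
```

In practice these values would be periodically flushed to whatever stats backend the fleet uses, so canary and baseline boxes can be compared from the same dashboard.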
I always deploy canaries and compare the stats of the canary box to those of the boxes still running the previous version. Any difference in stats must be investigated and explained as valid before a new version can be deployed beyond the canary.
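The canary gate described above amounts to a per-metric comparison with a tolerance. A hedged sketch, assuming aggregated metric snapshots from each fleet and a hypothetical relative-difference threshold (the commenter does not specify one):

```python
def diff_stats(baseline, canary, threshold=0.10):
    """Flag metrics whose canary value deviates from baseline by more
    than the given relative threshold; flagged metrics must be
    investigated before the rollout proceeds past canary."""
    flagged = {}
    for name, base_value in baseline.items():
        canary_value = canary.get(name, 0.0)
        if base_value == 0:
            # Any nonzero value against a zero baseline is suspicious.
            if canary_value != 0:
                flagged[name] = (base_value, canary_value)
            continue
        if abs(canary_value - base_value) / abs(base_value) > threshold:
            flagged[name] = (base_value, canary_value)
    return flagged
```

For example, a p99 latency jump from 20 ms to 35 ms would be flagged while a 1% wobble in requests per second would pass.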
The greatest tool I have ever created for regression testing I call the "A/B test". I record incoming network traffic right off the wire using "ngrep". I have a tool that plays this traffic back to multiple destinations and compares the response from each. Any difference in response between the old version and the new version must be explained and must not be caused by a bug. The tool also reads log files and other forms of output and does the same kind of comparison. Obviously some things always differ: random numbers in responses, unique cookies, timestamps that can be off by a second, and so on. Those things are stripped out before comparing.
This tool comes close to guaranteeing that no unintended change of behavior slips through in a new release. The only ways something can slip through are an odd request that never appeared in the recorded traffic, or a difference hiding in the data we don't compare (because it always differs, like a random number). I run 10 million requests through this A/B test, and a "no unexplained differences" result is required before fully deploying.
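The normalize-then-compare step above can be sketched like this. The patterns and placeholder names are hypothetical, not the commenter's actual tool; the idea is just that known-volatile fields (timestamps, session cookies, random tokens) are masked in both responses before they are diffed:

```python
import re

# Hypothetical patterns for fields that legitimately differ on every run.
VOLATILE = [
    (re.compile(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}"), "<TIMESTAMP>"),
    (re.compile(r"session=[A-Za-z0-9]+"), "session=<COOKIE>"),
    (re.compile(r'"nonce": ?"[^"]+"'), '"nonce": "<RANDOM>"'),
]

def normalize(response: str) -> str:
    """Mask volatile fields so only meaningful differences remain."""
    for pattern, placeholder in VOLATILE:
        response = pattern.sub(placeholder, response)
    return response

def responses_match(old: str, new: str) -> bool:
    """True if the old-version and new-version responses agree once
    volatile fields are masked; any False result needs an explanation."""
    return normalize(old) == normalize(new)
```

With this in place, replaying the same recorded request against both versions and calling `responses_match` on the pair is the whole comparison loop; timestamps that differ by a second compare equal, while a real behavioral change still shows up.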