We have a couple of old apps with a development cycle like this:

1. A business person calls the developer and describes a problem.
2. The developer codes and tests a solution against the production database.
3. The developer applies the changes to the production system.
4. The developer checks in the code changes.

How many problems did you spot? No unit tests, no CI, no safeguards against corrupted data, no rollback plan: yes, yes, yes. But the biggest problem of all? The business thinks this is great!
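To make the missing safety net concrete, here is a minimal sketch of the kind of automated regression test this cycle skips. The function and its rules are hypothetical stand-ins for the business logic being changed directly against production; the point is that even a few assertions like these would catch a regression before deployment.

```python
import unittest

# Hypothetical business rule standing in for logic that, in these apps,
# is edited and "tested" directly against the production database.
def apply_discount(total, customer_tier):
    """Return the order total after applying the tier discount."""
    rates = {"standard": 0.0, "silver": 0.05, "gold": 0.10}
    return round(total * (1 - rates.get(customer_tier, 0.0)), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_gold_tier_gets_ten_percent_off(self):
        self.assertEqual(apply_discount(100.0, "gold"), 90.0)

    def test_unknown_tier_gets_no_discount(self):
        # A regression here would silently change customer billing.
        self.assertEqual(apply_discount(100.0, "unknown"), 100.0)

if __name__ == "__main__":
    unittest.main()
```

Run under any CI server on check-in, a suite like this turns "the developer tested it against production" into a repeatable, reviewable gate.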
The business perceives that they:
- Can request a change without any analysis of its impact, and without describing it in much detail
- Don't have to worry about justifying the change from a business value point of view
- Don't have to test in those annoying staging environments, which are too hard to set up and maintain
- Don't have to worry about asking for the wrong thing - because they can just ask for something else straightaway
- Don't have to wait too long for code changes to be deployed
- Still have the right to complain if they find a critical bug
What's more, the developer also thinks this is great! They get to be a customer-focused team player with the respect of the business: these developers are often regarded as 'the best' by the business. Not to mention a certain amount of perceived job security, because 'this system would fall over without me to maintain it'.
Are we serving the business well with this arrangement? To some extent this depends on the level of the developer doing the work, but in most cases a developer who allows this to happen is a C-level developer who thinks they're an A or a B and therefore that 'the normal rules of development do not apply to me'. This, in turn, means the application's code quality diminishes over time, and as more changes are made, more changes become necessary to fix regression bugs. Eventually this 'fast turnaround', 'responsive' mode of operation builds up the technical debt of the application; it becomes legacy, harder and slower to maintain, and eventually needs to be replaced.
How do we turn this situation around? With difficulty! By the time you've got to this situation (or inherited it, like me), it may be too late to save the code. If it's not too late, it requires educating the business to understand the long-term damage to the health of the application, and it requires educating or (more usually) replacing the developer supporting the application. It also requires someone with analysis skills to understand the business domain in which the application sits and the role of the application in the business process, and to translate the business change requests into development requirements and test cases. All of this is perceived as a costly overhead, especially when 'we just call Bob and he fixes it now, but you want us to fill in forms and do testing...'. If it IS too late to save the code, it may even be better to let the current sub-standard process continue and expend the energy on the replacement system.