Friday, January 10, 2014

IT jobs

When computers first appeared, almost everything revolved around the programmer - it took a programmer to write the code to make the computer actually run, to write the compiler to interpret that code, to write the code the compiler itself ran on, and so on. Over time, things were abstracted away. Chip sets were standardised, then standard operating systems were introduced, then development platforms and databases made development and storage easier, then whole industry-standard packages meant programmers weren't relied on to deliver a business function. 25 years ago a company might have considered writing its own software for handling accounts payable, 15 years ago it might have considered building a content management system for its web site, 10 years ago it might have considered building an enterprise service bus. All of those decisions, if made today, would be considered crazy.

In the meantime, hardware has also been steadily moving from an engineer-centric world to a commoditised virtual world - physical boxes to virtual machines, and now to virtual data centres in the cloud. 10 years ago you might have needed 4 weeks' notice to procure a new server; now it can take minutes. In fact most of the software applications we develop now are deployed using a scorched-earth policy: a new server is spun up, code is deployed, tests are run, DNS is switched over, and the old server is decommissioned - all automatically, in minutes.
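To make that concrete, here's a rough sketch of the kind of script such a cycle implies. The helper functions are deliberately just stubs standing in for whatever cloud or provisioning API you actually use - this is an illustration of the shape of the process, not any particular tool.

    # A sketch of a "scorched earth" deployment cycle: build a fresh server, verify it,
    # switch traffic over, then destroy the old one. The helpers are hypothetical stubs.

    def provision_server():
        print("Spinning up a fresh server...")
        return "server-new"

    def deploy_code(server, artifact):
        print(f"Deploying {artifact} to {server}")

    def run_smoke_tests(server):
        print(f"Running smoke tests on {server}")
        return True  # assume the build is healthy for this sketch

    def switch_dns(server):
        print(f"Pointing DNS at {server}")

    def decommission(server):
        print(f"Destroying {server}")

    def scorched_earth_deploy(artifact, old_server):
        new_server = provision_server()
        deploy_code(new_server, artifact)
        if not run_smoke_tests(new_server):
            decommission(new_server)          # a failed build never takes traffic
            raise RuntimeError("Smoke tests failed; old server left untouched")
        switch_dns(new_server)                # cut traffic over to the new box
        decommission(old_server)              # the old server is never patched, just destroyed

    scorched_earth_deploy("myapp-build-42.tar.gz", "server-old")

The point is that the server itself is disposable; nothing is ever patched in place.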

What does this mean for IT workers? My guess is that network engineers should consider acquiring other skills - DevOps being a natural progression. Developers will continue to be needed, but should expect more and more of the work to be 'filling in the gaps' between off-the-shelf systems (be that integration or functional gaps). Developers with a few strings to their bow will be in demand, especially around service buses, emerging technologies (graph DBs, mobile), and platforms (Salesforce, Dynamics, SharePoint). Of course, if you're near retirement age and know COBOL there will still be demand until at least 2023!

Friday, February 15, 2013

Sign of the times



I noticed today that GanttHead - a site I regularly visit, (mainly) full of well-written and thought-provoking articles - has changed its name to ProjectManagement. A subtle but clear sign that linear-planning PM methods are on the way out. The web site announcement stated:
gantthead.com is now ProjectManagement.com
Why the change?
Project Management is changing. When gantthead launched in 2000, every project worth managing was run using a gantt chart.
But times change. The change in our name is a recognition that many PMs who would benefit from being one of us may not even know what a gantt chart is.

Monday, July 30, 2012

The hidden cost of planning, precision, and predictability


There appears to be a perception that it would be irresponsible for us to do any work without a very clear idea of what and how we are going to deliver, and how much this will cost. Whilst I'm not at odds with this idea, it crowds out all other notions of responsibility. Consider:
  • Is it responsible to make no decisions until all the facts are known even if, by delaying, it is too late to act?
  • Is it responsible to avoid proposing potentially risky courses of action which may yield high returns, because we're unsure of the effort/cost involved?
  • Is it responsible to invest time and energy defining solutions to problems rather than delivering solutions to problems?
Most sane people would agree that all of the above show some levels of irresponsible behaviour, but all too often this exact behaviour is hidden behind the notion that having a clear and precise plan is the only responsible course of action. Now, a clear and precise plan is a great thing to have, especially if you can come up with it at almost no cost, but as the military says ‘No plan survives first contact with the enemy’, and the corollary ‘if your attack is going to plan, it’s an ambush’…

So what is the responsible approach? I’d suggest it’s a compromise, i.e. that the responsible thing is to
  • Balance risks with rewards. If I know something will cost between $100 and $200, I don't need to go into any more detail if the return is $300. I should switch to delivery mode immediately and get that benefit asap, rather than spend time and effort working out that it should cost $173.24. On the other hand, if the return is $150 I might want to do more investigation (there's a small sketch of this decision rule after this list).
  • Balance planning with action. Have a rough plan for the long term and a detailed plan only for the immediate short term, possibly with other levels between. Spend most of our time delivering value against the short term plan, and revise the long term plan less frequently.
  • Balance precision with effort. Being precise is admirable, but if the effort required to be precise is too high then the benefits of precision are undermined. E.g. if it costs me $20 to determine an investment will be between $100 and $200 (for a total cost of between $120 and $220) and $50 to know the cost is $173.24 (for a total cost of $223.24), I've realised no benefit from the extra effort involved in gaining that precision.
  • Balance predictability with adaptability. Having a predictable outcome is admirable, but if it means missing out on opportunities to change course and deliver a better value outcome the advantage is undermined. Knowing that I can spend $150 to achieve a return of $300 is great, but if it means missing out on an opportunity to increase that return to $400 for the same cost I have failed to maximise my effort.
  • Balance budgets with benefits. Rather than try to define a deliverable with a cost, define the benefit associated with an outcome and set a budget accordingly. If the budget appears unachievable reconsider the approach, but if it does appear achievable start delivering and continue to monitor the budget and benefits until the crossover point is reached – i.e. when the incremental benefit of more work is not worth the cost that will be incurred.
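As a toy illustration of the 'risks with rewards' and 'precision with effort' points above, the little function below (using made-up numbers matching the examples in the list) asks the only question that matters: could a more precise estimate actually change the go/no-go decision?

    def worth_estimating_further(cost_low, cost_high, expected_return):
        """True only if a more precise estimate could change the go/no-go decision."""
        if expected_return >= cost_high:
            return False  # profitable even in the worst case: switch to delivery mode now
        if expected_return <= cost_low:
            return False  # a loss even in the best case: don't pay to sharpen the estimate
        return True       # the decision genuinely depends on where the real cost falls

    print(worth_estimating_further(100, 200, 300))  # False - the $300 return case above
    print(worth_estimating_further(100, 200, 150))  # True  - the $150 return case warrants more investigation

If the answer is no, any money spent sharpening the estimate buys nothing.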
That is to say that whilst there's benefit to planning, precision, and predictability, they're useless unless coupled with taking action, delivering benefits, and being adaptable. Of course, whilst you're stuck in an environment where every investment has to be proposed and approved through a lengthy bureaucratic process, this balanced approach is difficult to achieve; but unless the status quo is challenged, effort will continue to be wasted and opportunities missed.

Wednesday, May 30, 2012

Misunderstandings of agile


A list of common misunderstandings of Agile development
  • Incremental vs iterative development. There's a big difference between incrementally developing an app (which it appears many people are doing when I ask them about their agile process) and iteratively developing an app. Incremental development normally involves building one piece of the app at a time and stitching the pieces together as they're built. I'm not saying this is a bad idea, it's just not 'at the core' of agile. Iterative development, on the other hand, involves putting the 'bare minimum' end-to-end functionality in place, then revisiting that same functionality over and over again, adding features (and business value) to the application on each pass (or iteration). Of the two methods, incremental development can result in 'gold plating' of the first pieces developed at the expense of the quality of the later ones. Iterative development, by contrast, quickly allows the business to view the end-to-end functionality and identify which parts of the system need further work, focusing development on the best value. Iterative development relies far more on automated unit testing – since the code is reworked many times we need those tests to ensure we're not introducing regressions. Because of the nature of iterative development, the resulting code tends to be simpler, standardised, resilient to change, well factored, and well understood – in ideal shape to be supported in the long term.
  • There's no way to manage scope (aka how can we pin the client down on what they want without a requirements document?). In agile projects the scope is not fixed, but the project constraints (time, cost) are. Waterfallers tend to hear the first part of that statement but not the second. Yes, the business can add requirements late in a project, but they may need to drop some other as-yet undeveloped requirement of a similar cost to develop. Scope is not constrained but the overall budget is. If there is benefit in increasing the budget to deliver more valuable requirements then do so - as you would for a change request (CR) in a waterfall project.
  • The project never ends. Let’s first look at the definition of ‘ends’ – an agile project should end as soon as the business value of any new functionality delivered in the next sprint/iteration is less than the cost of delivering that sprint/iteration. With this definition of course some projects would never start since the first one or two sprints would deliver little value on their own, so we only generally apply this rule once we have some functionality working. Does this mean a project could never end – well yes – but what’s wrong with that if it’s delivering more and more value? 
  •  If the developers can choose their tasks, they’ll only choose the ‘fun’ ones. To some extent this is true, but remember that the tasks are created each sprint and need to be completed within the sprint. Developers could pick off the ‘fun’ ones to start with, but very quickly they’d find they have to pick up the rest. Also, because of the natural and usually obvious dependencies between tasks there would be a certain amount of team pressure to pick up the tasks that are urgent rather than letting individuals decide for themselves.
  • Agile has too many unproductive practices – TDD, refactoring. Personally, I'm very dubious of anyone saying they practise agile development but don't have automated builds, tests, and high code coverage. If you're doing projects without these processes I suspect you're doing incremental rather than iterative development, to avoid the need to refactor 'delivered' functionality. If you avoid refactoring then there's less need for automated tests to confirm that nothing has broken after you've refactored. But in order to avoid refactoring you either have to compromise your design (creating add-ons rather than spotting opportunities for reuse), or you are not getting the benefits of agile by iterating on the same pieces of functionality. Saying automated testing is unnecessary is like painting the exterior of a building without scaffolding. For small buildings you can probably get away with it, but with anything larger the scaffolding will make you more productive in the long run and will probably result in a better quality finish than using a ladder. (There's a small example of that test 'scaffolding' at the end of this post.)
  • Agile works better for smaller projects. I'm actually surprised by the number of people who have stated this opinion: 'It's fine for <that small project you're working on> but you wouldn't use it to build <the air traffic control system / MRI scanning system I'm working on>'. Actually, I'll bet those complex systems would be perfect candidates for agile development. Though they may require more thorough testing before being put into production than a run-of-the-mill business system, they have too much complexity to design solutions for at the start of the project. Agile's sweet spot is projects with complex requirements that can't be worked out up front. If you already know the requirements up front you might as well just write a spec and use a waterfall approach.
  • Refactoring should be unnecessary with some more design up front. This is missing the point. The assumption is that not all of the future requirements can be known up front (otherwise you should just write a spec and use waterfall). If you accept that currently unknown requirements will materialise during the project, affecting unknown parts of the system, then you can accept that it is impossible to design the whole system up front, and that attempting to do so is wasted time since the designs will necessarily have to change later. Refactoring should be seen as a positive activity rather than a 'time waster'. Revisiting and working on the same code should make it simpler, promote reuse, and ensure more eyes look over it – in short, make it easier to maintain and enhance in the long term.
  •  It’s too unpredictable. Done poorly, agile is very unpredictable. Agile processes require discipline by the team to frequently estimate at the micro (task) level and the macro (release) level. Estimation at the task level helps to track sprint progress. Velocity measurements help to predict the release schedule, but only if the definition of ‘done’ is clear and rigorous. However, agile projects can be far more predictable than waterfall projects when managed correctly. The main reason for this is that agile projects track the progress of the product itself (we know at the end of each sprint how complete the system is), whereas waterfall projects track activities against a schedule, which is sometimes a very poor representation of real progress (we may have completed 80% of the work, but the product is in some unknown state of completeness – maybe 50%, maybe 90% - we won’t really know until we’ve completed testing). 
  • It's an excuse to avoid documentation. Some people certainly behave as if this is their belief, and it is generally true that less documentation is produced on an agile project. On waterfall projects documentation is treated as equivalent to process – therefore if you are lacking documentation you are obviously lacking process. On agile projects the written artefacts are more numerous, much smaller, and created over the duration of the project. In agile, requirements are written down and acceptance criteria are defined – what more is necessary? You might be inclined to produce some screen mock-ups – but that would require knowing all the functionality that will be displayed on that screen up front, rather than iterating towards the design. You might want to document test results – but why bother? If they pass, all is well; if they fail, you raise it with the developer and/or add another product backlog item, depending on when it was found. End user and support documentation is generally valuable, but it should be created as part of the deliverables of the project, and it tends to be overlooked on both agile and waterfall projects. How many requirements documents from waterfall projects are valuable after the software is launched?
I tried to list 10 misunderstandings but only reached 9. Maybe you can suggest the tenth one yourself?
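As a footnote to the scaffolding point above, here's a minimal sketch of the kind of automated test that makes repeated refactoring safe. The function and the business rule are entirely hypothetical; the point is simply that once such tests exist, the implementation can be reworked on every iteration without fear of silent regressions.

    # Hypothetical business rule plus the regression "scaffolding" that protects it.
    import unittest

    def apply_discount(price, customer_type):
        """Hypothetical rule: gold customers get 10% off, everyone else pays full price."""
        if customer_type == "gold":
            return round(price * 0.9, 2)
        return price

    class ApplyDiscountTests(unittest.TestCase):
        def test_gold_customers_get_ten_percent_off(self):
            self.assertEqual(apply_discount(100.0, "gold"), 90.0)

        def test_other_customers_pay_full_price(self):
            self.assertEqual(apply_discount(100.0, "standard"), 100.0)

    if __name__ == "__main__":
        unittest.main()

Rewrite apply_discount as often as you like; the tests tell you immediately if the behaviour has changed.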

Tuesday, April 24, 2012

Responsive Support


We have a couple of old apps with a development cycle like this: a business person calls the developer and describes a problem; the developer codes and tests a solution against the production database; the developer applies the changes to the production system; the developer checks in the code changes. How many problems did you spot? No unit tests, no CI, no safeguards against corrupted data, no rollback plan - yes, yes, yes. But the biggest problem of all? The business thinks this is great!
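For contrast, here's a sketch of about the smallest safeguard this cycle skips: making the data change inside a transaction and doing a dry run before committing anything. The sqlite3 in-memory database, the table, and the 'fix' below are purely illustrative stand-ins, not the real system.

    import sqlite3

    def apply_fix(conn, dry_run=True):
        """Apply a hypothetical data fix inside a transaction; commit only when told to."""
        cur = conn.cursor()
        cur.execute("UPDATE invoices SET status = 'paid' WHERE status = 'payed'")
        print(f"Rows affected: {cur.rowcount}")
        if dry_run:
            conn.rollback()   # inspect the effect, change nothing
        else:
            conn.commit()

    # Illustrated against an in-memory database rather than production data.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE invoices (id INTEGER, status TEXT)")
    conn.executemany("INSERT INTO invoices VALUES (?, ?)", [(1, "payed"), (2, "paid")])
    conn.commit()
    apply_fix(conn, dry_run=True)    # first pass: see what would change
    apply_fix(conn, dry_run=False)   # only then apply it for real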

The business perceives that they:
  1. Can request a change without requiring any analysis of the impact of the change, or describing it in much detail
  2. Don't have to worry about justifying the change from a business value point of view
  3. Don't have to test in those annoying staging environments, which are too hard to maintain and set up
  4. Don't have to worry about asking for the wrong thing - because they can just ask for something else straightaway
  5. Don't have to wait too long for code changes to be deployed
  6. Still have the right to complain if they find a critical bug
Essentially we've absolved them of the responsibility to carefully consider the change being requested, justify the change, and thoroughly test the change before it's deployed, whilst giving them a fast turnaround.

What's more, the developer also thinks this is great! They get to be a customer-focused team player with the respect of the business - these developers are often regarded as 'the best' by the business. Not to mention a certain amount of perceived job security, because 'this system would fall over without me to maintain it'.

Are we servicing the business well with this arrangement? To some extent this depends on the level of developer doing the work, but in most cases a developer who allows this to happen is a C-level developer who thinks they're an A or a B, and therefore that 'the normal rules of development do not apply to me'. This, in turn, means the application's code quality diminishes over time, and as more changes are made, more changes become necessary to fix regression bugs. Eventually this 'fast turnaround', 'responsive' mode of operation builds up the technical debt of the application; it becomes legacy, harder and slower to maintain, and eventually needs to be replaced.

How do we turn this situation around? With difficulty! By the time you've got to this situation (or inherited it like me) it may be too late to save the code. If it's not too late it requires education of the business to understand the long term damage to the health of the application, and it requires education or (more usual) replacement of the developer supporting the application. It also requires someone with analysis skills to understand the business domain in which the application sits and the role of the application in the business process and to translate the business change requests into development requirements / test cases. All of this is perceived as a costly overhead - especially when 'we just call Bob and he fixes it now, but you want us to fill in forms and do testing...'. If it IS too late to save the code it may even be better to let the current sub-standard process continue and expend energy on the replacement system.

Tuesday, April 17, 2012

User stories are not a breakdown of a requirements document


I've been interviewing a few business analysts recently (yes, we still need them despite having nominal business product owners). Many of them claim some agile experience in their past and manage to mention the words 'incremental development', but very few appear to think in terms of 'iterative development'.

A brief word on incremental vs iterative here: Iterative development involves going back to already written functionality, reviewing it, and deciding what, if anything, needs to be done to improve it. Incremental development is developing a large system in pieces – this is not the same. Compare...

[Image: incrementally building piece by piece with a clear upfront design]

[Image: iteratively improving on an initial vague design]
(And just calling sprints 'iterations' does not make your process iterative!!!!)

I know back when I was a BA in a waterfall environment I tried very hard to think of everything during the requirements gathering phase so I could include it in the requirements specification, and a good BA should naturally do so. But in transitioning to an agile environment this often results in BAs simply ‘compartmentalising’ a traditional requirements document into smaller chunks and calling them User Stories.

So what’s the problem with this?
  1. It removes the ability for the business to prioritise the functionality. Imagine a text search function – a BA may specify the multitude of different ways the search should return results given various common search operators (and / or / + / etc) in one user story. This negates the option to have a story for just a basic text search and a lower priority one for more advanced search options (the business may decide the benefits of the advanced search are not as great as another feature). 
  2. It increases the chances of gold plating. Good BAs can always think of improvements that _could_ be made to software and they tend to put all that knowledge in one story. But the _best_ BAs will recognise which improvement has the biggest business benefit, and will separate the stories based on potential business priority, rather than a blanket statement like 'The text search should be like Google'.
  3. It slows perceived progress. Instead of spending 1 day getting a basic search working and demonstrable to the end users (roughly the sort of thing sketched at the end of this post), the developers have to spend a week getting the indexing and search engine customised, which in turn prevents them from delivering other features. The value of agile is the fast feedback mechanism it provides; the more we do to stay in that zone the better.
  4. It makes you blind to opportunities. If you've already thought of the 'solution' at the start of the project, you're not going to look for alternative better solutions later on. That's why vagueness is sometimes a good thing - it encourages delaying decisions until the last responsible moment - the moment at which you have the most information available to make the best decision. E.g. 'with the feedback from users of the basic search we think we should add a date filter rather than improving the text search options'.
I encourage BAs and product owners, in the early sprints of a product, to write stories that will probably never be released into production. By this I don't mean that the stories should lack precision or functional coherence, simply that whilst they may be basic enough to show how something could be delivered and to form a basis for further discussion on the improvements required, they will probably need more 'feature' added before release - or the business may decide the basic version is adequate, of course!
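To illustrate the text search example above, here's roughly the sort of 'basic search' story that can be built and demonstrated in a day. Everything here (the data, the function) is hypothetical; the advanced-operator story (and / or / +) stays in the backlog as a separate, separately prioritised item.

    def basic_search(documents, query):
        """Return documents containing every word of the query, ignoring case."""
        words = query.lower().split()
        return [doc for doc in documents if all(w in doc.lower() for w in words)]

    documents = [
        "Invoice overdue for ACME Ltd",
        "Meeting notes: search feature demo",
        "ACME Ltd contract renewal",
    ]
    print(basic_search(documents, "acme ltd"))  # simple, demonstrable, potentially releasable

Whether the advanced operators (or a date filter instead) ever get built is a prioritisation decision the business can make after seeing this in action.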

Saturday, April 07, 2012

[off topic] Carcassonne on the iPhone/Pad

Downloaded Carcassonne a few weeks ago on the iPad - the game of the year in 2001, consisting of tiles and wooden 'followers'. Like all the best games, it's deceptively simple yet very rich in game-play strategies. In fact, I hadn't realised just how rich it was until playing the built-in AI players. Initially I lost frequently to the 'Easy' players, but I eventually learned the strategies to overcome them and managed to get my level at least on par with the strong players, without really understanding the subtle changes I'd made to my game play.


Haven't tried my new found powers on any humans yet, though I'm pretty sure I'm a better player now than a month ago.

The app also lets you play against human opponents, or a mix of AI and human players, and has a very nice 'solo' game with completely different rules, which is addictive. Highly recommended all round.