When computers were first around, almost everything revolved around the programmer: it took a programmer to write the code that made the computer actually run, to write the compiler that interpreted it, to write the code the compiler itself ran on, and so on. Over time, things were abstracted away. Chip sets were standardised, then standard operating systems were introduced, then development platforms and databases made development and storage easier, then whole industry-standard packages meant programmers were no longer relied on to deliver a business function. 25 years ago a company may have considered writing its own software for handling accounts payable; 15 years ago it may have considered building a content management system for its web site; 10 years ago it may have considered building an enterprise service bus. All of those decisions, if made today, would be considered crazy.
In the meantime, hardware has also been steadily moving from an engineer-centric world to a commoditised virtual one: from physical boxes to virtual machines, and now to virtual data centres in the cloud. 10 years ago you might have needed four weeks' notice to procure a new server; now it can take minutes. In fact, most of the software applications we develop now are deployed using a scorched-earth policy: a new server is spun up, code is deployed, tests are run, DNS is switched over, and the old server is decommissioned, all automatically and within minutes.
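To make that concrete, here is a minimal sketch of the scorched-earth (sometimes called immutable or blue-green) deployment flow just described. All of the server names, image names, and helper functions are hypothetical placeholders standing in for a real cloud provider's SDK; the point is only the orchestration order: provision, deploy, test, switch, decommission.

```python
import time

def provision_server(image: str) -> str:
    """Spin up a fresh server from a known-good image (placeholder)."""
    print(f"Provisioning new server from image {image}...")
    return "server-new-01"  # hypothetical server ID

def deploy_code(server: str, artifact: str) -> None:
    """Push the build artifact onto the new server (placeholder)."""
    print(f"Deploying {artifact} to {server}")

def run_smoke_tests(server: str) -> bool:
    """Run smoke tests against the new server before it takes traffic."""
    print(f"Running smoke tests against {server}")
    return True  # pretend they passed

def switch_dns(record: str, server: str) -> None:
    """Point the DNS record at the new server (placeholder)."""
    print(f"Switching {record} to {server}")

def decommission(server: str) -> None:
    """Tear down a server once it no longer takes traffic (placeholder)."""
    print(f"Decommissioning {server}")

def scorched_earth_deploy(artifact: str, old_server: str) -> None:
    new_server = provision_server(image="base-image")
    deploy_code(new_server, artifact)
    if not run_smoke_tests(new_server):
        # The old server never stopped serving, so a failure just means
        # cleaning up the new box and leaving everything as it was.
        decommission(new_server)
        raise RuntimeError("Smoke tests failed; old server left in place")
    switch_dns("www.example.com", new_server)
    time.sleep(1)  # placeholder for waiting out DNS TTL / draining connections
    decommission(old_server)

if __name__ == "__main__":
    scorched_earth_deploy("app-v42.tar.gz", old_server="server-old-07")
```

The appeal of this approach is that the old server keeps serving traffic until the new one has proven itself, so a failed deployment costs nothing but the few minutes spent provisioning a box that gets thrown away.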
What does this mean for IT workers? My guess is that network engineers should consider other skills, with DevOps as a natural progression. Developers will continue to be needed, but should expect more and more work 'filling in the gaps' between off-the-shelf systems (be these integration gaps or functional gaps). Developers with a few strings to their bow will be in demand, especially around service buses, emerging technologies (graph databases, mobile), and platforms (Salesforce, Dynamics, SharePoint). Of course, if you're near retirement age and know COBOL, there will still be demand until at least 2023!