Working in IT is great fun. I love technology (always have) and can remember my first programming exploits on the Commodore PET around 1979/1980. One thing I learned early on about this industry is that things never stand still. In fact, technology changes so much that someone once told me 25% of our IT knowledge becomes irrelevant each year. Take 4-5 years out and you're almost starting from scratch. While I think that's an overestimate (because experience also counts for a lot), I do think there is a real half-life to any technology learned.
So what does that mean? Well, looking at news articles like this (link), you'd think that IT was in decline. But that's far from the truth. In reality, large corporations are simply shedding staff whose experience is in technologies less relevant to their core business. These people don't get retrained; they are "let go" and new employees are hired in their place. It's a cyclical shift that happens across the industry all the time.
What the numbers in the above article show is perhaps the rate at which new technologies come along and get adopted in the industry. This churn is something I've focused on over the years, with a huge investment in learning new technologies. Many of the things I learned I never work with at all (certain programming languages, mainframe systems), but they did add to my overall work experience and development. That said, the time needed to keep up with new technologies seems to increase each year.
Many people might think this phenomenon is new, but that's far from the truth. Alvin Toffler described in his 1970 book Future Shock many of the experiences we have today with technology and information overload. Professional development is important for everyone, and it's a lifelong task.