There are many horror stories about the future of IT. As a published science fiction and horror writer, I’ve even written a few of those stories. But for the real world, where do I see the future of improvement in IT?
- Standardizing the Standards
The proliferation of standards will slow, simply because too much complexity becomes impossible to manage. In the longer term, standards will merge, common ones will be adopted, and less flexible standards will be dropped.
An addendum I would add to this is the selection of a single standards organization for most purposes, though that organization would most likely be ISO rather than ANSI.
- Managing software management and updating our process of updates
When the first and the middle of a month roll around, support call volume spikes. Many software packages push updates simultaneously or within the same window. When dependent updates don’t sync, chaos can ensue, and when updates all try to run at once, they consume bandwidth and computing resources.
One future area for improvement in IT is updating our processes of software updates: related software groups should coordinate their releases so that one group’s changes don’t adversely impact another’s, and so that updates arrive in the least frustrating sequence for users. Managing software management with greater collaboration among software vendors will help users while decreasing vulnerabilities and support demands.
An addendum I would add to this is the rise of firmware and software updates for “internet appliances”: routers, digital devices that automatically archive every email passing over the network, and so forth. With more “smart” devices in need of updates, ensuring that they don’t all try to call home on the first of the month is critical to avoid shutting networks down in a digital traffic jam.
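The “digital traffic jam” above is a classic thundering-herd problem, and one common mitigation is jitter: each device picks a random offset within an agreed window instead of phoning home at the same instant. A minimal sketch in Python (the function name, the first-of-the-month release day, and the one-week spread are illustrative assumptions, not any vendor’s actual scheduler):

```python
import random
from datetime import datetime, timedelta

def next_update_check(now, base_day=1, spread_days=7, seed=None):
    """Pick a randomized update check-in time instead of having every
    device phone home at midnight on the release day.

    now:         current time (passed in so the logic is testable)
    base_day:    day of month when updates become available (assumed policy)
    spread_days: window over which devices spread their check-ins
    seed:        per-device seed so each device jitters differently
    """
    rng = random.Random(seed)
    # Find the next occurrence of the release day.
    if now.day >= base_day:
        month = now.month % 12 + 1
        year = now.year + (1 if now.month == 12 else 0)
    else:
        month, year = now.month, now.year
    release = datetime(year, month, base_day)
    # Jitter: each device picks a uniformly random offset in the window,
    # turning one synchronized spike into a week-long trickle.
    jitter = timedelta(seconds=rng.uniform(0, spread_days * 86400))
    return release + jitter
```

Each appliance seeds from something device-unique (a serial number, say), so the fleet’s check-ins smear across the whole window rather than stacking up at midnight.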
- Smart AI testing software that can act dumb
Test software can already run through many transactions. Artificial intelligence is approaching higher-level functioning and may one day match that of humans. Before it decides to take over the world, it could be put to work imitating human behavior on software, albeit at a much higher speed. AIs can analyze the activity logs kept on many work computers, which record human behavior, and then imitate the common mistakes people make: leaving critical fields blank, transposing digits, clicking on something that looks interesting but is really malicious. In short, we’ll be using smart AI to act like dumb (average) humans for software testing purposes.
An addendum I would add to this is that you cannot make an artificial intelligence do everything a human can do in testing. As Einstein reportedly joked, there may be limits to the universe, but not to human ingenuity and creativity.
- Deflating bloated IT
There were jokes that Microsoft had a secret vested interest in CPU manufacturers, adding features to its operating system that sucked up memory and processing power to force users to buy more powerful computers. The rise of portable devices has moved some computing off the personal device and onto the server. I expect functionality levels to remain the same, since users won’t accept fewer options. But vendors may make functionality customizable, installing only the options and add-ons users request and reducing the size of their OS or software in the process. Simplicity will reduce bandwidth and CPU demands. The software vendors who move toward “lean” software first will be better positioned for the small-device market, while allowing larger systems to run faster and more smoothly.
An addendum to this will be the need to reduce the sheer number of software tools on many PCs, so that the drive to simplify the user interface and eliminate human decision making doesn’t make the hardware too slow. We also need to reduce the information flowing between servers and apps to only what is necessary, instead of essentially handing over a credit report to make a purchase or performing a super-duper secret handshake to connect to a database.
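The “credit report to make a purchase” problem has a simple technical expression: whitelist the fields an operation actually needs and drop everything else before it leaves the client. A toy sketch, with a hypothetical purchase payload (the field names are illustrative assumptions):

```python
# Only the fields a purchase actually needs, instead of shipping the
# whole customer record to the server.
PURCHASE_FIELDS = {"customer_id", "item_id", "quantity", "payment_token"}

def minimal_payload(record, allowed=PURCHASE_FIELDS):
    """Strip a full record down to the agreed-upon minimum set of fields."""
    return {k: v for k, v in record.items() if k in allowed}
```

The same whitelist doubles as documentation of the interface contract: anything not on the list simply never crosses the wire, which trims bandwidth and shrinks the attack surface at the same time.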