Past and present of the computer software world
Computer languages and technologies are living things. Their life lies in the minds of their creators and in the service they offer to the people using them. Like everything else alive, they move towards securing their existence and evolving to rule the world. Well, at least that's the principle. As time goes by, technology evolves. The new balances make certain technologies obsolete and create new ones that reflect the new requirements.
This article is not a history lesson from day one. Its purpose is to examine the prevailing technologies of the present, how they relate to older versions of themselves, and how they relate to their users.
Contrary to what many people believe, the term "cloud" in the computing world has nothing to do with actual clouds. It refers to a centralized networked service available to groups of people. The name invites us to think of it as something romantic: people spread over an area, with a little fluffy cloud in the middle that they can all share at the same time.
The notion of the cloud is not a new thing. In the 1960s and 1970s there was the mainframe-and-terminal idea, which is almost identical to what we call the "cloud" today. Back then, computers were huge machines with extremely low computing power compared to modern computers. The mainframe was a central, powerful computer that shared its power with its users. Users accessed it through very simple machines, the terminals, which were essentially a monitor and a keyboard connected to the mainframe. Many years have passed since then, and today we are going back to those ideas under different names. The funny thing is that people who supposedly know a lot call this innovation. So today we have "remote desktops", VNC, TeamViewer, and the new Google Chrome OS, which are all modern versions of the old mainframe-terminal idea.
The centralized power of the cloud's design worried people a lot back then. Today, people seem far less worried. The difference is easily explained: in the '60s and '70s, very few people understood this technology. They were the ones building those machines, so they had the academic background to understand the risks. Today, the vast majority of people building and using software have a very limited understanding of the internet's mechanics. That explains their ignorance, which in turn explains the lack of fear.
Of course, that is the dark side of the cloud. There are also good angles. "Economies of scale", as economists like to call them, are the number one advantage of the cloud. Large numbers of customers bring costs down to silly figures. That creates a number of different problems, but as far as short-sighted customers are concerned, the low prices are just awesome. The unquestionable truth is that many new technologies and businesses have flourished thanks to this good aspect of the cloud. The low barrier to entry allows young programmers and entrepreneurs to test their ideas. As I said before, this also creates problems, but overall, technology has benefited a great deal so far and will probably continue to do so for a long time.
The hardware wars
Only a few years back, hardware giants were fighting fearsome battles over CPU power, memory, and PCI card throughput. Then came the wars over the awesome new technology of thin monitors (LCD, TFT, plasma, etc.), which replaced CRT monitors (the heavy ones with the long tails). Today, the epic battles are fought over smartphones, tablets, and lately smart watches and glasses. 3-D printing, drones, and robots are next in line.
(I often catch myself wondering when we are going to be as thrilled about cultural evolution as we are about technology.)
Software developers sometimes follow hardware evolution and sometimes lead it. Most of the time, the market leads both.
The hardware wars have a huge impact on software developers, who don't know what to deal with first. Make programs for the web, for the desktop, for mobile, for the server side, for hardware? It's a huge mess that paralyses software developers. It's no surprise that you can't find good programmers nowadays. Their focus is scattered over a large number of technologies, reducing the time they have to excel at any one of them. And even if someone decides to focus on just a few technologies and become a guru, what if those technologies become obsolete in a few years? This sad truth forces software houses and freelance developers to divide their time and their expertise.
Developing for the platform
Another characteristic of our times, and another factor that divides software developers' focus, is the wide range of platforms their programs must target.
Until the 1990s there was MS Windows. If you knew how to make programs for MS Windows, you were one happy programmer. Then the Linux revolution came. Of course, Linux used to be invisible to big companies, but not to young developers moved by the noble motives of the open source movement. Then internet browsers started becoming smart enough to host applications instead of plain text pages. Then Apple rebooted, and its new generation of hardware attracted big money and, therefore, software developers.
Several cloud services have become so huge that some developers also need to know how to make Facebook apps and Twitter apps, how to use Amazon's web services, and so on. These cloud services can be seen as platforms in their own right, just as if each were an operating system.
In our days we have the revolution of smartphones, tablets, smart watches, and smart glasses. This means that our collection of must-know technologies expands to Android with Java and iOS with Objective-C.
Then there are the CMSs. Content Management Systems like Wordpress, Joomla! and Drupal are efforts to simplify the website-building process for ordinary internet users. The number of factors involved in that process is so large that these systems are hardly "simplifications" or "automations". Nevertheless, they have managed to open the web to a large number of non-tech-savvy users. This has expanded the web, with notable side effects of course.
Lately we are watching the establishment of browser apps, particularly Chrome Apps. This is another type of platform, one that lets software developers write a single app that runs on a large number of computers without caring about the operating system.
One large portion of the computing world is game development. Game development has two special characteristics: as an industry it involves a lot of money, and it requires a lot of processing power.
Until the 1990s, game development was absolute chaos, because many different hardware vendors each had their own software platforms. Then Microsoft unified them with DirectX. Over the last few years the scene has changed again, as powerful gaming consoles require specially made software. Apple's impact on the hardware business also split the pie. And today we have a new breakthrough: internet browsers. New generations of browsers can open direct channels to graphics and sound cards, essentially taking over DirectX's role.
And that's not the end of the latest developments. Microsoft's unfortunate recent changes to Windows, and its inability to adjust to the new smartphone landscape, have pushed people to replace Windows with other devices. A small side effect with great importance is a recent development: Valve is creating a brand-new Linux-based OS for PCs that directly challenges Microsoft's dominance. This will certainly push other game software houses to make similar moves.
That is the sentiment of many software developers. End-users are not as sympathetic: they want more, so there will be more.
So that was an attempt to summarize the computing technology landscape of 2013 in a few lines. Endless technologies, limited developers, and a chaotic but at the same time thrilling future ahead.