In the '60s and '70s, bell-bottom trousers were all the rage. I remember seeing pictures of my father wearing these trousers that were shaped oh so weirdly – they would balloon out towards the feet, like a bell. Hence the name bell-bottom. Another rage at the time was oversized sunglasses. Movies made during that era showcased the heroine, in particular, sporting huge eyewear. The glasses would cover about 30-40% of the face, not just the eyes. They might as well have been called face wear rather than eye wear. While bell-bottom and parallel trousers have stayed away from male fashion for a while now, the eyewear has definitely remained, or had a rebirth of sorts. Nowadays, it is common for women to sport oversized eyewear that covers the eyes and then some of the face.
I cannot claim to have lived in the era of computers that would occupy an entire room – that was the stuff of lore in textbooks. The technology industry has stayed away from reverting to that era, and while no one has seen the future, it is unlikely we will return to those kinds of processing machines in the time to come. I have read about the time when people would work on dumb terminals that connected to a server which had all the intelligence built in. This was because replicating the "intelligence", or processors, across multiple machines was prohibitively expensive, and understandably, no one wanted to keep investing in expensive hardware. The fundamental principle was that a central system maintained the relevant information while people continued to work on machines that carried far less of the load.
In the world of computer networks built on infrastructure productised by companies such as Cisco, Juniper, Alcatel-Lucent and hundreds of others that have fallen by the wayside, the principle was to build infrastructure that would route information from one part of the world to another ever faster. This led to faster and better chips being designed that worked only for their designed purpose. Think of the development of cars – from the old days of Aston Martins with their old-style engines to the sleek multitude of machines available nowadays to transport us. With networking, the focus was on producing faster and better integrated circuits, customized to perform tasks that would otherwise take more time when done purely in software. Just as Formula 1 cars have to be tweaked to suit the circuit they are racing on, and horses are chosen for races based on the race course, companies invested heavily in custom-built hardware solutions to tackle the problem of speed and faster processing.
The focus nowadays has shifted once more to software, moving away from the principle of horses for courses. With Network Functions Virtualization (NFV), the idea is to move away from custom-built hardware solutions to commonly available, off-the-shelf hardware, and let software components deal with the requirements of speed and scale. This approach is a return to the old days of dumb terminals with the intelligence in a central component or server. With NFV, the dumb hardware would be responsible for moving data quickly, while the intelligence of where that data is to be moved would be housed in separate, commonly available hardware providing directions to the intellectually challenged hardware, if you will. What goes around comes around even in the tech world, after all.
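The split described above – dumb forwarding hardware taking directions from a central brain – can be sketched in a few lines of toy code. This is a minimal illustration, not any real NFV or controller API; all the class and method names here are made up for the example.

```python
# Toy sketch of the NFV-style split: a "dumb" forwarding element that only
# moves packets, and a central controller that holds all the routing
# intelligence. All names are illustrative, not a real framework.

class Controller:
    """Central brain: decides where traffic for each destination goes."""

    def __init__(self):
        self.routes = {}  # destination address -> output port

    def install_route(self, destination, port):
        self.routes[destination] = port

    def lookup(self, destination):
        # Returns None (a "drop" decision) when no route is known.
        return self.routes.get(destination)


class ForwardingElement:
    """Dumb data plane: asks the controller, then just moves the packet."""

    def __init__(self, controller):
        self.controller = controller

    def forward(self, packet):
        port = self.controller.lookup(packet["dst"])
        if port is None:
            return "dropped"
        return f"sent {packet['payload']} out port {port}"


controller = Controller()
controller.install_route("10.0.0.2", 3)

switch = ForwardingElement(controller)
print(switch.forward({"dst": "10.0.0.2", "payload": "hello"}))   # sent hello out port 3
print(switch.forward({"dst": "192.168.1.1", "payload": "lost"}))  # dropped
```

The forwarding element does no thinking of its own – swap the controller's routes and its behaviour changes instantly, which is precisely the appeal of keeping the intelligence in one central, easily updated place.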