In these weekly articles I have been writing about how computers are now doing more and more of the work that humans did in the past. I remember visiting the Columban headquarters in Omaha in the United States in the late 1960s and seeing a room-size computer which had been installed there to process the addresses of those who were supporting the Columbans. These massive computers were not meant for individuals.
One of the most important moments in the development of personal computers came in 1971, when Intel, then a relatively small company, introduced the 4004 chip, the first commercially available microprocessor. It contained the sophisticated electronic circuits that made it possible to process and make sense of vast amounts of data. The chip was seen at the time as a technological miracle, equipped with 2,300 tiny transistors, each about 10,000 nanometres across.
Fast forward 44 years to 2015, when Intel, by then a world leader in computer chipmaking, released its Skylake chip. The number of transistors, estimated at between 1.5 and 2 billion, is simply mindboggling. The progress in chipmaking by companies such as Intel and the Taiwan Semiconductor Manufacturing Company is astounding, and it has given computers enormous power to analyse data.
In the 1960s, Gordon Moore, who went on to co-found Intel, observed that the number of components which could be crammed into an integrated circuit was doubling every year. He later revised this estimate to a doubling every two years, and the phenomenon became known as “Moore’s Law.”1 By shrinking components, Moore’s Law also made computers much smaller and, therefore, more available to individuals. But can Moore’s Law keep delivering smaller and more powerful chips? It is becoming clear that this cannot continue indefinitely, for both physical and financial reasons. The cost of building the factories which make these tiny structures now runs at anything up to $10 billion. So it appears that the miniaturisation of chips will eventually run its course.
But increasing computer power does not have to end there. At the moment chips are flat, like single-storey buildings. Some researchers are talking about stacking components on top of each other. They believe this would allow designers to cram in more components, in the same way as multi-storey buildings can accommodate more people than single-storey ones.2 In fact, this is already happening at the research centres of the South Korean electronics giant, Samsung.
Samsung now sells computer chips which are stacked in several layers. Scientists are also weighing up whether quantum computing could be used to speed up processing. At the moment, the feeling is that, while it might be usable in well-protected data centres, it will not be built into small, vulnerable laptops or smart phones. Instead, laptops and smart phones would connect to these massive data centres. In this way, “computing will become a utility that is tapped on demand like electricity or water is today.”3 In a way this is already happening with what is known as “cloud computing.” As the technology develops, it will become possible to remove the cumbersome hardware that does the computational heavy lifting from laptops and smart phones and locate it in these data centres.
This is already happening. The computer company Apple has developed a voice-powered personal assistant known as Siri. Even a simple request, such as locating a Chinese restaurant in a particular area of a city, is far beyond the computing power of any current iPhone. So the iPhone merely forwards the request to an Apple data centre. When the computers at the data centre have located the restaurant, they send the information back to the iPhone. Other companies, such as Samsung, are now making voice-controlled televisions.
Another major concern for these devices is energy efficiency. Much research is going into improving batteries and lengthening their life.
While Moore’s Law has its limits and will not continue forever, computing power will continue to increase in other ways. For example, the computer of 2050 will “consist of a system of tiny chips embedded in everything from your kitchen counter to your car. Most of them will have access to vast amounts of computing power delivered wirelessly through the internet and we will interact with them by speaking to the room.”4
Columban Fr Sean McDonagh, a missionary in the Philippines for many years, has worked tirelessly to improve the care of the earth, and has published numerous books. He was an advisor to Pope Francis on preparing his encyclical, Laudato Si’.
1 Tim Cross, “Vanishing point: For decades computers have got smaller and more powerful, enabling huge scientific progress. But this can’t go on for ever. What happens when they stop shrinking?” The Guardian, 26 January 2017, p. 18.
2 Ibid.