This article was contributed by Neil Thompson, a researcher at MIT’s Computer Science and Artificial Intelligence Lab and the Digital Economy Initiative.
As computer applications become more complex and datasets grow, the environmental impact of information technology is increasing. Historically, this wasn’t a big deal because growing computing demands were offset by improvements in hardware efficiency, colloquially known as Moore’s law. But as hardware improvements diminish, another (often invisible) source of efficiency is taking center stage: improved algorithms.
Our growing appetite for computing can be seen in the proliferation of data centers, which can span millions of square feet and consume vast amounts of electricity. The International Energy Agency estimates that data centers account for 1% of global energy consumption and 0.3% of all global CO2 emissions. Absent ways to make processing more efficient, this damage will only grow as we face bigger and bigger big data problems in our increasingly sensor-laden world.
In a recent study, Yash Sherry (a researcher affiliated with MIT Sloan) and I examined the speed at which algorithms improve and compared it to what was historically the most important counterweight to the growing appetite for computation, Moore’s law. Driven by the miniaturization of the building blocks of computer hardware, Moore’s law has provided many decades of large year-over-year improvements in computation efficiency. Just as increased agricultural productivity has fueled world population growth, increased hardware productivity has fueled global computing growth.
But if Moore’s law is the flashy sibling who is always in the news, algorithm improvement is the one working behind the scenes.
Algorithms are the recipes that tell computers what to do, and in what order. And while Moore’s law has given us computers that can perform far more operations per second, improved algorithms provide better recipes for doing more with each of those operations, and the benefits can be enormous. For example, imagine you are Google Maps and you need to find the shortest route among 1,000 popular places people travel to. Calculating it with an old algorithm could easily take a million times more computation than using a more modern version. Another example we documented is text matching, such as when search engines look for keywords on web pages or lawyers search for particular references in legal documents. Better algorithms can easily search 100 times faster than the originals, reducing both processing time and power consumption.
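To make the shortest-route example concrete, here is a minimal sketch of one of the classic “better recipes” for that problem: Dijkstra’s algorithm with a priority queue, which avoids the exhaustive path enumeration a naive approach would require. The road network and travel times below are made up purely for illustration; this is not the algorithm Google Maps actually ships, just an instance of the general technique.

```python
import heapq

def dijkstra(graph, start):
    """Shortest travel time from `start` to every reachable node.
    `graph` maps each node to a list of (neighbor, edge_weight) pairs."""
    dist = {start: 0}
    heap = [(0, start)]            # priority queue of (distance, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue               # stale queue entry; a shorter path was found
        for neighbor, weight in graph[node]:
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

# Hypothetical road network, travel times in minutes
roads = {
    "A": [("B", 5), ("C", 2)],
    "C": [("B", 1), ("D", 7)],
    "B": [("D", 4)],
    "D": [],
}
print(dijkstra(roads, "A"))  # → {'A': 0, 'B': 3, 'C': 2, 'D': 7}
```

Note that the direct A→B road (5 minutes) loses to the detour through C (2 + 1 = 3 minutes); the priority queue lets the algorithm discover this without ever enumerating every possible route, which is exactly where the million-fold savings on large inputs come from.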
But while individual examples can be impressive, we wanted a broader view. For this study, we went through 57 textbooks and more than a thousand research papers to find the algorithms that computer scientists consider most important. From these we extracted 113 “algorithm families” (sets of algorithms that solve the same problem in different ways) that the textbooks had highlighted as most important. For each of the 113, we tracked every time a new algorithm was proposed for that problem, from the 1940s to the present.
So how does algorithm improvement compare to hardware improvement? For big data problems, 43% of algorithm families had year-over-year improvements equal to or greater than the gains from Moore’s law. Of these, 14% had improvements that dramatically outpaced those that came from better hardware. These gains completely transformed what was feasible in those areas, making it possible to tackle problems in ways that no hardware improvement could. Equally important for our current era of growing data sizes: the benefits of algorithm improvement grow larger as the problems being addressed grow bigger.
Companies and research labs at the forefront of computer science are already responding to the need to invest in better algorithms. The average organization dedicates 6% to 10% of its IT developers to creating new algorithms and 11% to 20% to improving existing ones, which represents a very large investment. Other organizations, accustomed to buying new hardware as their only way to improve computing, will increasingly need to follow the lead of these algorithmic front-runners to stay competitive.
The growing importance of algorithms is part of a larger shift in what drives progress in computing. Historically, improvement came mostly from hardware, but with the end of Moore’s law, that is changing. Algorithm improvement will instead come to the fore, providing the engine for tackling new, harder computing problems.
But pushing the frontiers of computing is just one of the benefits of better algorithms. The other is efficiency. For those in government or academia, or simply those concerned about the sustainability of computing, better algorithms are an ideal option: they allow us to achieve the same results at significantly reduced environmental cost.
Neil Thompson is a researcher at MIT’s Computer Science and Artificial Intelligence Laboratory and the Digital Economy Initiative. Previously, he was an assistant professor of innovation and strategy at the MIT Sloan School of Management, where he co-directed the Experimental Innovation Lab. Thompson has advised businesses and government on the future of Moore’s law and served on a National Academies panel on transformational technologies and scientific reliability.
Welcome to the VentureBeat community!
DataDecisionMakers is the place where experts, including data engineers, can share data-related information and innovations.
If you want to read about cutting-edge ideas, up-to-date information, best practices, and the future of data and data tech, join us at DataDecisionMakers.

You might even consider contributing an article of your own!