Energy is the key to technology - but where will we find it?
"Watson, are you a he, she or an it?"
That is the question I asked the Watson supercomputer on a recent trip to its home at a New York IBM research lab. The computer is arguably the closest humanity has come to producing artificial intelligence, and is possibly best known for besting champions of the US quiz show Jeopardy. Built from a room full of ultra-advanced IBM servers, Watson is capable of evaluating complex and nuanced questions, and even learning from its mistakes.
Ask a human a similar question, with factual and philosophical nuances, and he or she will burn the energy equivalent of about three grains of rice to produce an answer; Watson, on the other hand, consumes the energy of a small town. While battling on Jeopardy, the humans consumed roughly 600 calories; Watson stressed the power grid of the entire studio.
What Watson really has to say - what the computer represents by its sheer existence - is that power consumption is a fundamental constraint on technological advancement. On a micro scale, technology is outpacing the ability of batteries to keep up. What good is a fast phone that can only stay on for 30 minutes?
On a macro scale, innovation is stressing power grids and contributing significantly to greenhouse-gas production. According to a recent New York Times article, digital warehouses and data centres worldwide use about the same amount of electricity as the output of 30 nuclear power plants.
If the question at the heart of technology development is efficiency, the answer may lie in focused innovation. At the Advanced Technology Investment Company (Atic), we have partnered with the Semiconductor Research Corporation, a leading university-research consortium, to drive local innovation in exactly this area - what we call minimum energy electronic systems.
Atic has invested more than Dh100 million in local innovation, this year supporting research initiatives spanning Khalifa University, UAE University, American University of Sharjah, Masdar Institute and New York University Abu Dhabi. And these initiatives have already produced remarkable results.
One research vein at Khalifa University has made significant progress on transistors able to perform efficiently at ultra-high temperatures. This drastically reduces the need for bulky cooling systems that consume significant amounts of energy, both in personal computers and in massive data centres. The Khalifa team has spent the past year refining the designs, and with this round of grants hopes to begin building and testing platforms in real-world scenarios.
At the American University of Sharjah, there has been significant progress in energy-autonomous circuits - systems able to harvest energy from ambient radio waves such as mobile phone signals, Wi-Fi and TV broadcasts. Last year, the team unveiled a calculator able to run solely on harvested radio energy.
While successful, that technology only operated at 8 per cent efficiency. Looking forward, the team believes it can work towards systems operating at 40 per cent, providing power for technologies contributing to oil and gas, infrastructure and defence.
A separate initiative between Atic, Masdar Institute and the state of Saxony in Germany supports twin research labs in Abu Dhabi and Dresden working in tandem to advance 3D semiconductor stacking technology.
The technology stacks multiple semiconductor dies to boost speed and efficiency in a single processor. It will be used in a recently announced GlobalFoundries mobile processor, and is expected to improve battery life by 40 to 60 per cent compared with today's two-dimensional technology.
So, what does Watson really have to say? What do the roughly 30 billion watts of power drawn by data centres worldwide really mean? From a power perspective, technology is hitting a very real wall. Abu Dhabi and the UAE are playing a role in breaking through that barrier.
Sami Issa is the executive director of the Technology Ecosystem unit of Atic