You’ve heard of smartphones, smart appliances and smart TVs, and now you’re hearing about things getting “smart” on a much larger scale.
The smart city buzz has officially arrived. Governments and private investors alike are suddenly racing to turn their cities into connected, high-tech hubs.
Investing in technology to improve the overall standard of living has arguably been going on for hundreds of years. And that, as one prominent smart city scholar says, “is an economic and political challenge, not a technology trend.”
Yet something is happening now that didn’t seem possible 200, 100 or even 20 years ago. Suddenly we’re beginning to understand how investing in technology, and implementing it in the right ways, can improve social and economic conditions for urban populations, and with them the overall quality of life.
Here’s why we’re at a point in time where the smart city concept is now tangible, and no longer just a pipe dream:
It’s the technology, stupid
While the concept of the smart city doesn’t depend on any particular technological trend, it does hinge on investment in technology. And the simple fact is that computing power has finally advanced enough that the investment needed to transform a city into something intelligent can actually pay off.
In 1965, Gordon E. Moore first proposed that the number of transistors in an integrated circuit would double at a regular clip (a pace he later pegged at every two years), meaning computational power would essentially do the same. Turns out, he wasn’t wrong.
In the early 1970s, that meant a cutting-edge microchip could hold a few thousand transistors. Today’s most powerful chips pack some 30 billion into a space the size of your fingertip.
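Moore’s projection is just compound doubling, which a few lines of Python can sketch. The starting point used here (the Intel 4004’s roughly 2,300 transistors in 1971) and the 2017 endpoint are illustrative assumptions, not figures from this article:

```python
def projected_transistors(start_year: int, start_count: int, year: int) -> int:
    """Project a transistor count assuming one doubling every two years."""
    doublings = (year - start_year) // 2
    return start_count * 2 ** doublings

# Starting from ~2,300 transistors in 1971 (an assumed baseline),
# a doubling every two years lands in the tens of billions by 2017,
# in the same ballpark as today's biggest chips.
count_2017 = projected_transistors(1971, 2_300, 2017)
print(f"{count_2017:,}")  # roughly 19.3 billion
```

Twenty-three doublings over 46 years is what turns thousands of transistors into tens of billions, which is the scale shift the smart city bet depends on.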
As MIT Technology Review points out, the idea of a smart city took off in the early 2000s as the tech bubble grew, and “countries including China, South Korea and the United Arab Emirates hired developers to transform large swaths of land into photogenic cities stuffed with the latest innovations.” But all those projects fell short.
Even in just the last 20 years, the size, power and ubiquity of technology have reshaped the landscape, making the goals of using technology to better manage energy and transportation, and to lure an educated, affluent workforce to urban areas, far more attainable.
Artificial intelligence and machine learning also offer the ability to mine the enormous amounts of data that smart-city sensors will produce, and actually make sense of it. Previously, this would have been too daunting a process to even consider managing well.
Urban populations are booming
The biggest innovations happen when there’s a need for them. Right now, there’s an urgent need for cities to utilize technology to improve city management.
Populations across the world are increasingly urban. The U.N. estimates that by 2045, 6 billion people will be living in urban centers. As the world’s cities swell, challenges around transportation, food, water and employment will become more and more difficult.
Cue up human innovation
In 1968, biologist Paul Ehrlich predicted that the world’s population growth was quickly becoming untenable, and that by the 1980s massive starvation would kill hundreds of millions. His book, The Population Bomb, was based on sound statistics and forecasting, but while starvation certainly still exists, it is nowhere near the widespread scale Ehrlich predicted.
But perhaps ringing the alarm bell set things in motion. We began managing resources better, thinking about how to make food production more efficient and implementing regulations and policies that allowed the population to boom.
Smart cities will work now because they have to. We can see the challenges on the horizon, and we have the foresight to begin implementing solutions now that will keep the coming urban boom from becoming a bomb.