Today, analytics, artificial intelligence (AI), and machine learning (ML) have become big business. Harvard Business Review estimates that, over the course of the 2020s, these technologies will add $13 trillion to the global economy, impacting virtually every sector in the process.
One of the biggest drivers of the value added by AI/ML will be smart cities: cities that leverage these technologies to deliver improved services for citizens. Smart cities promise data-driven decisions for essential public services like sanitation, transportation, and communications. In this way, they can improve the quality of life for both the general public and public sector employees, while also reducing environmental footprints and delivering public services more efficiently and cost-effectively.
Whether it be improved traffic flow, better waste collection practices, video surveillance, or maintenance schedules for infrastructure, the smart city represents a cleaner, safer, and more affordable future for our urban centers. But realizing these benefits will require us to redefine our approach to networking, data storage, and the systems that underpin and connect both: a new and dynamic approach to computing and storage.
Providing bottomless storage for the urban environment
In practice, the smart city will require vast arrays of interconnected devices, whether sensors, networked vehicles, or machinery for service delivery. These will all generate an ever-growing quantity and variety of data that must be processed, stored, and made accessible to the rest of the smart city’s network for both ongoing tasks and city-wide analytics. While a smart city may not need access to all the relevant data at once, historical data may need to be recalled at any time to help train and calibrate ML models or perform detailed analytics.
All of this means that a traditional system architecture that processes data through a central enterprise data center, whether on-premises or in the cloud, cannot meet the scaling or performance requirements of the smart city.
This is because, given its geographic distance from the places where data is generated and used, a centralized store cannot be counted on to provide the rapid and reliable service needed for smart city analytics or service delivery. Ultimately, the smart city will demand a decentralized approach to data storage, one in which data from the devices, sensors, and applications that serve the smart city is analyzed and processed locally before being transferred to an enterprise data center or the cloud, reducing latency and response times.
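To make the edge-first pattern concrete, here is a minimal sketch (not drawn from any particular smart city platform) of a hypothetical local aggregation step: raw sensor readings are summarized at the edge, so that only compact summaries need to cross the network to the central data center.

```python
from statistics import mean

def summarize_readings(readings, window=5):
    """Aggregate raw sensor readings locally into windowed summaries,
    so only the summaries travel onward to a central store."""
    summaries = []
    for i in range(0, len(readings), window):
        window_vals = readings[i:i + window]
        summaries.append({
            "count": len(window_vals),
            "mean": mean(window_vals),
            "max": max(window_vals),
        })
    return summaries

# Hypothetical per-second readings from a local traffic sensor.
raw = [12, 14, 13, 15, 40, 11, 12, 13, 14, 12]
summaries = summarize_readings(raw)
# Only 2 summary records leave the edge instead of 10 raw readings;
# the raw data can still be retained locally for later recall.
print(len(raw), "->", len(summaries))
```

The same trade-off applies at any scale: the edge keeps the full-resolution data for recall, while the central store receives a much smaller, latency-friendly stream.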
To achieve the cost-effectiveness needed at the data variety and volume a smart city will generate, cities will need access to “bottomless clouds”: storage arrangements where prices per terabyte are so low that development and IT teams won’t need to worry about the costs of provisioning for smart city infrastructure. This gives teams the ability to store all the data they need without draining their budget, or having to arbitrarily shrink the data pool they can draw from for smart city applications and analytics.
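A back-of-the-envelope calculation shows why the per-terabyte price dominates this decision. The figures below are purely illustrative assumptions, not quoted prices from any provider:

```python
def monthly_storage_cost(total_tb, price_per_tb_month):
    """Estimate monthly storage spend from capacity and unit price.
    All inputs are illustrative; real pricing varies by provider and tier."""
    return total_tb * price_per_tb_month

# Hypothetical example: 500 TB of accumulated city sensor data,
# compared at two illustrative price points per TB per month.
higher_tier = monthly_storage_cost(500, 20.0)  # e.g. a conventional tier
bottomless = monthly_storage_cost(500, 4.0)    # e.g. a low-cost object store
print(higher_tier, bottomless)
```

At these assumed prices the same data pool costs five times less to retain, which is the difference between keeping everything for future ML training and being forced to discard it.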
Freeing up resources for the smart city with IaaS
Infrastructure-as-a-service (IaaS) is based on a simple principle: users should only pay for the resources they actually use. For computing and storage, this will be essential to economically deliver on the vision of the smart city, which must meet an ever-expanding need for provisioning while keeping public sector costs down.
For the smart city in particular, IaaS offers managed, on-demand, and secure edge computing and storage services. IaaS will furnish cities with the components needed to deliver on their vision, whether storage, virtualization environments, or network infrastructure. By scaling provisioning up and down with demand, while offloading the procurement and administrative burden of handling the actual hardware to a specialist third party, smart cities can benefit from the economies of scale that have underpinned much of the cloud computing revolution over the past decades.
In fact, IaaS may be the only way to ensure that the data of the smart city is stored and delivered reliably. While handling infrastructure in-house may be tempting from a security perspective, market competition between IaaS providers incentivizes better service provision from all angles, whether customer experience, reliability and redundancy, or the latest standards in security.
Delivering the smart city is a 21st century necessity
The world’s top cities are already transforming to keep up with ever-expanding populations, and in turn their ever-expanding needs. Before we know it, various sectors of urban life will have to be connected through intelligent technology to optimize the use of shared resources – not because we want to, but because we need to.
Whether it be a question of social justice, fiscal prudence, or environmental conscience, intelligently allocating and using the city’s resources is the central challenge facing our urban centers this century. But the smart city can only be delivered through a smart approach to data handling and storage. Optimizing a city’s cloud infrastructure and guaranteeing cost-effective, quality provisioning through IaaS will be essential to deliver on the promise of the smart city, and thus to meet some of our time’s most pressing challenges.