
Digitization as the key enabler
The demand for digital technologies from Gen-Y and, even more so, Gen-Z consumers, who are willing to share and consume data and live in an always-connected world where social networks are trusted more than well-known brands, has been a big pull factor for digitization initiatives. On the other side, there has been a tremendous push for digitization from enabling technologies such as IoT, cloud, and pervasive connectivity. To top it all, digitization made business sense because innovative thinking and disruptive business models challenged the status quo.
If data is to be analyzed using computing power, it first has to be digitized, and that is where IoT technologies and sensors play a critical role. They generate information in a format that can be processed easily and that serves as a critical input to the business decision-making process. This data could be the GPS location of a taxi or customer in the case of Uber, environmental data from machine sensors, or video feeds from different cameras.
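As a minimal, hypothetical sketch of what such a digitized reading can look like, the Python snippet below serializes a single GPS data point from a vehicle into JSON. The field names (vehicle_id, speed_kmh, and so on) and values are illustrative assumptions, not tied to Uber or any specific platform:

```python
import json
from datetime import datetime, timezone

# Hypothetical digitized GPS reading from a vehicle sensor.
# Field names and values are illustrative only.
reading = {
    "vehicle_id": "cab-1042",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "latitude": 37.7749,
    "longitude": -122.4194,
    "speed_kmh": 42.5,
}

# Serializing to a structured format such as JSON is what makes the
# reading easy for downstream analytics systems to parse and process.
payload = json.dumps(reading)
print(payload)
```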
Data is being generated by machines, devices, sensors, wearables, social media, live streams, connected devices, and smart cars; the list is endless. International Data Corporation (IDC), a global provider of market intelligence and a premier research firm on consumer technology markets, estimates that more than 11 billion devices are connected to the internet today, and that number is growing at 4,800 devices per minute. The amount of data being generated is astonishing. As an example, it was reported that Ford's new Ferrari-fighting supercar, the Ford GT, has more than 50 sensors and 28 microprocessors built in, and a single car is capable of generating more than 100 GB of data per hour. It is not only about where the data is being generated, but also about the sheer volume being generated.
According to studies from IDC, IBM, and venture capital organizations such as DN Capital, 2.5 Exabytes (1 Exabyte = 10^18 bytes) of data is being generated per day, and all of this data needs to be processed to extract meaningful information that can be used to make decisions. Hence, it is important to transfer data to where it can be processed, and the network that transports this data becomes a critical part of the digital foundation.
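To put that figure in perspective, the short sketch below converts 2.5 Exabytes per day into an aggregate per-second rate. It is a rough back-of-the-envelope estimate using only the numbers cited above:

```python
# Back-of-the-envelope check of the 2.5 Exabytes/day figure cited above.
EXABYTE = 10**18                     # bytes
data_per_day = 2.5 * EXABYTE
seconds_per_day = 24 * 60 * 60

bytes_per_second = data_per_day / seconds_per_day
terabits_per_second = bytes_per_second * 8 / 10**12

print(f"{bytes_per_second:.2e} bytes/s")      # ~2.89e+13 bytes/s
print(f"{terabits_per_second:,.0f} Tbit/s")   # ~231 Tbit/s aggregate, worldwide
```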
The data being created is so huge in volume that processing it in near real time needs massive computing resources: much bigger data centers and a lot more storage and compute. However, another trait of this data is that it does not arrive at a steady rate but comes in spikes; an organization might have a lot of data to process on a given day, week, or season, and far less during the next. Building a data center to handle everything on its own premises would mean building the infrastructure for peak capacity, driving down utilization levels and leading to inefficiencies and hence higher costs. In today's ever-so-competitive world, no business can afford to bear these costs upfront or tolerate such under-utilization of assets. This has led to the rise of cloud models, where pooled resources are paid for per use rather than for full-time deployment.
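A simple, hypothetical calculation illustrates the point. The demand figures and unit cost below are made up purely to show how spiky demand drives down utilization of peak-sized infrastructure and favors a pay-per-use model:

```python
# Hypothetical weekly demand, in TB processed per day, with one spike day.
daily_demand_tb = [10, 12, 11, 80, 15, 9, 13]

peak_capacity_tb = max(daily_demand_tb)
avg_demand_tb = sum(daily_demand_tb) / len(daily_demand_tb)

# On-premises: infrastructure sized for the peak sits mostly idle.
utilization = avg_demand_tb / peak_capacity_tb
print(f"Average utilization of peak-sized infrastructure: {utilization:.0%}")

# Pay-per-use: cost tracks actual demand instead of peak capacity.
cost_per_tb = 1.0  # arbitrary unit cost for comparison
on_prem_cost = peak_capacity_tb * cost_per_tb * len(daily_demand_tb)
cloud_cost = sum(daily_demand_tb) * cost_per_tb
print(f"Peak-provisioned cost: {on_prem_cost:.0f}, pay-per-use cost: {cloud_cost:.0f}")
```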
Like any planning process, digitization starts with a preparation phase in which organizations need a clearly defined digital strategy outlining the areas they want to focus on for competitive advantage and the digital capabilities they want to build along the value chain. Since the focus areas evolve with the changing market environment while the underlying infrastructure either exists today or has to be built today, it is imperative that this infrastructure can meet the needs of the digital wave as new use cases are incorporated.
The next section discusses the evolving technologies at the infrastructure layer that enable the success of digitization initiatives in an organization.