Volume

The amount of data being created has increased exponentially over the last decade and continues to do so. Two decades ago, the typical amount of storage on a computer was measured in megabytes. Today, a few terabytes aren’t uncommon, especially when you recognize that approximately 2.5 quintillion bytes of data are created each day. That works out to roughly 29 terabytes per second worldwide, or more than 300 MB per person, per day. A quintillion is a 1 followed by 18 zeros: 1,000,000,000,000,000,000. As the velocity of data generation increases, driven by more devices and improved technology, the volume of data increases correspondingly, and it will continue to grow daily. Consider the kind of infrastructure required to handle even 1 percent of that amount of data; managing even a small fraction of it is not something individuals or small companies could ever scale large enough to handle. The cloud did not arise by chance to solve such scenarios. Rather, cloud services originated to address the need for a cost‐effective way to manage the growth of data volume.
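As a quick back‐of‐the‐envelope check of the figures above, the sketch below derives the per‐second and per‐person rates from the 2.5‐quintillion‐bytes‐per‐day figure. The world population value is an assumption (roughly 7.7 billion) and not from the original text:

```python
# Sanity-check the daily data-creation figure quoted above.
BYTES_PER_DAY = 2.5e18          # ~2.5 quintillion bytes created per day
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day
POPULATION = 7.7e9              # assumed world population (~7.7 billion)

bytes_per_second = BYTES_PER_DAY / SECONDS_PER_DAY
bytes_per_person_per_day = BYTES_PER_DAY / POPULATION

print(f"{bytes_per_second / 1e12:.1f} TB created per second, globally")
print(f"{bytes_per_person_per_day / 1e6:.0f} MB created per person, per day")
```

Running this prints roughly 28.9 TB per second and about 325 MB per person per day, which is the scale of data the rest of this section is concerned with.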

Data Locations

Companies and individuals have not always had a global perspective, and many still do not. The location of the customers who want access to your data, and the location of the data producers, does matter. A company’s customers can be local, regional, national, and/or global, and you want the data they use to be as close to them as possible. It is simply too slow for a customer in India to access a database in Texas; if that is the only option, you will lose your customers in India. To keep them, you would place a database in India. Decades ago, that would have been a very costly and time‐consuming undertaking. Today, however, cloud service providers have datacenters in almost every country, and a database can be provisioned and brought online in minutes.

Devices that produce data may be stationary, move a short distance, or move around the globe. A lamppost is not going to move, so its data could reasonably be sent to a datastore in the same city, which would be very effective. However, if you have a logistics company that moves goods nationwide, the distance between the truck (i.e., the data producer) and the datastore might result in some data loss. The same goes for moving goods internationally by ship or airplane. Knowing where these vehicles are, and perhaps their speed, is valuable data, especially in a logistics industry that runs on just‐in‐time (JIT) delivery models. Again, the presence of cloud service provider datacenters in most countries around the world helps you capture this piece of business intelligence reliably.
