Volume
The amount of data being created has increased exponentially over the last decade and continues to do so. Two decades ago the typical amount of storage on a computer was measured in megabytes. Today, a few terabytes aren't uncommon, especially when you recognize that approximately 2.5 quintillion bytes of data are created each day. A quintillion is a 1 followed by 18 zeros: 1,000,000,000,000,000,000. That works out to roughly 29 terabytes of new data every second, or a few kilobytes per second for every person on Earth. As the velocity of data generation increases, driven by more devices and improved technology, the volume of data increases with it, day after day. Consider the kind of infrastructure required to handle even 1 percent of that amount of data. Managing even a small fraction of it is not something an individual or a small company could ever scale to handle. The cloud did not arise by chance; cloud services originated to provide a cost‐effective way to manage this growth in data volume.
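The per-second figures above are a straightforward division. As a quick sanity check, here is a minimal back-of-the-envelope sketch; the world population value is an assumption (roughly 8 billion), not a figure from the text:

```python
# Back-of-the-envelope check of the daily data-creation figure.
BYTES_PER_DAY = 2.5e18          # 2.5 quintillion bytes created daily (from the text)
WORLD_POPULATION = 8e9          # rough world population (assumption)
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day

total_per_second = BYTES_PER_DAY / SECONDS_PER_DAY
per_person_per_second = total_per_second / WORLD_POPULATION

print(f"Total: {total_per_second / 1e12:.1f} TB per second")
print(f"Per person: {per_person_per_second / 1e3:.1f} KB per second")
```

Running this prints a total of about 29 TB per second, and a few kilobytes per second for each person on Earth.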
Data Locations
Companies and individuals have not always had a global perspective, and many still do not. The location of the customers who access your data, and of the devices that produce it, matters. A company's customers can be local, regional, national, or global, and you want the data they use to be as close to them as possible. It is simply too slow for a customer in India to access a database in Texas; if that were the only option, you would lose your customers in India. To keep them, you would place a database in India. Decades ago, that would have been a costly and time‐consuming undertaking. Today, however, cloud service providers have datacenters in almost every country, and a new database can be provisioned and brought online in minutes.
Devices that produce data may be stationary, move a short distance, or travel around the globe. A lamppost is not going to move, so a database of lampposts hosted in the same city would be very effective. However, if you run a logistics company that moves goods nationwide, the distance between the truck (i.e., the data producer) and the datastore might result in some data loss. The same applies to moving goods internationally by ship or airplane. Knowing where these vehicles are, and perhaps how fast they are moving, is valuable information, especially in a logistics industry that runs on just‐in‐time (JIT) delivery models. Again, the existence of cloud service provider datacenters in most countries around the world helps you capture this business intelligence reliably.