Public policies and commercial norms could constrain our ability to realize the full potential of big data.
We are just at the beginning of the big data revolution. What was merely a prediction a few years ago is happening today. Not only are we creating data at exponential rates, but we are also starting to figure out how to harness it, albeit at a nascent stage. The growth of the cloud and the evolution of artificial intelligence (AI) technology are allowing us to see the real value and benefits of what big data can offer. And now that we can harness big data, we are also beginning to understand the public policies and commercial norms that are constraining society from realizing the full potential of this resource.
Why is big data such a big deal?
Every heartbeat, every slight change in the weather, and every rotation around the sun generates unimaginable amounts of data. Until recently, however, we were not able to capture much of it. In fact, 90% of the data that exists today was created in the past two years, and this trend will continue for the foreseeable future.
Grasping the amount of data being created and captured today is next to impossible. Virtually every new industrial system being built today has hundreds, if not thousands, of sensors, and we are retrofitting older systems with sensor technology as well. Globally, the discrete manufacturing, transportation, and logistics industries are each expected to spend $40 billion on IoT technologies in 2020 alone, up from $10 billion in 2015.
Rolls-Royce, the premier engine manufacturer, is diving headfirst into this IoT revolution by putting sensors on its jet engines. A Boeing 787 with a Rolls-Royce engine generates an average of 500GB of data per flight, with long-haul trips producing several terabytes. Sensors aren’t just for manufacturing and industrial machines; they are also being deployed in agriculture. Today, farmers are using robots, drones, and wireless sensors to collect data about farm operations and weather. It is estimated that by 2050, an average farm will generate 4.1 million data points every day.
Data overload — what to do with it all?
All the data being collected isn’t simply going to waste. The convergence of high-capacity computing power, advances in AI, improvements in wireless technology, and the scale offered by cloud computing allows us to process these massive data troves. This means the terabytes of data collected from a 787 engine making a transatlantic flight can be sent to a cloud computing data center, sometimes in real time, and analyzed in milliseconds.
The results are being used for virtually every purpose. By combining all this data and applying artificial intelligence, Boeing can more accurately predict when mechanical parts will fail, improving safety and maintenance while enhancing the efficiency of the engine in flight. The same data could also be used by aviation regulators to monitor safety issues.
In agriculture, by using artificial intelligence to combine their own data sets with macro-weather data, farmers will be able to manage water usage more efficiently, limit the overuse of fertilizer, and better predict crop yields. These data sets could also help governments plan agriculture policy and improve how investors predict market fluctuations.
Are we ready for the data blitz?
Advances in technology are making all this possible, but is society ready for the phenomenon? Existing laws and policies, as well as antiquated contractual relationships, can add friction to efforts to make the most of the data being collected. Whether the issue is intellectual property rights, privacy rights, or the simple fact that data exists in thousands of different formats, the roadblocks preventing effective use of data are everywhere.
My farming example highlights the potential problem. Farmers are producing data that would be very helpful to both governments and researchers, and in many, if not most, cases these data sets are more valuable to farmers when shared with others than when kept to themselves. At the same time, what mechanisms can farmers use to protect potentially commercially sensitive information while still sharing the data? And if the data is made available, what principles and practices should govern how the government uses it to deliver better public services? The same questions arise for government-generated macro-weather data: How can it be shared in such a way that private companies and researchers can access it? And if an entity does share its data, how can we encourage a common format for distribution?
Any changes in policy or business relationships must take a long-term approach that maintains incentives to collect data, while continuing to improve how data are processed. This is in addition to the larger societal issues of protecting privacy and ensuring nefarious actors are not given a platform to abuse these systems.
Tim Molino is a technology consultant at Peck Madigan Jones in Washington, D.C. He provides policy, strategy, and political advice on technology-related issues, including intellectual property, antitrust, privacy, and cybersecurity. Molino’s tech policy expertise began when he was Chief Counsel to Senator, and current Presidential candidate, Amy Klobuchar (D-MN), where he was responsible for advising the Senator on technology-related issues. His other tech experience includes serving as a Policy Director for BSA | The Software Alliance, the leading software trade association, and working as a practicing patent attorney.