Two big trends are driving data center requirements for the oil and gas industry.
The first is big data. In 2011, Chevron's internal data traffic was more than 1.5 terabytes per day. While many industries generate big data through customer sales, the oil and gas industry generates big data through seismic surveys.
The second trend affecting oil and gas industry data centers is the industrial Internet of Things. Devices are smarter, and sensors attached to oil and gas extraction equipment or well-monitoring equipment capture massive amounts of data from well sites.
This data is used to help the industry locate oil and gas and manage the production process. Some of the data needs to be accessed at well sites, which are often in remote locations. The data also needs to be shared with routine business operational units, such as accounting, located in more comfortable climates.
One of the major data center requirements for processing all that data is high-performance computing (HPC). Oil industry spending on HPC is projected to reach $1.16 billion in 2017.
Data centers that support HPC provide racks of massively parallel servers with multi-core CPUs. Special-purpose accelerators and GPUs are used to achieve performance that standard CPUs can't provide.
The high density generates more heat, so data centers in the oil and gas industry often operate at the higher end of ASHRAE standards. Power and cooling demands are high.
The challenge of the industry's big data isn't only processing speed. High bandwidth capacity is needed to capture and transport seismic data, and to enable real-time manipulation of it.
The large volume of data is also an issue. Unlike many industries, where historical data can be taken offline and archived, historical data in the oil and gas industry is actively used to inform current decision-making. Far from archiving historical data, many companies are converting paper-based maps and records to digital form, which creates additional storage requirements. New analytical methods generate multiple views of the same data, all of which must be stored.
There's often a significant amount of metadata as well, which can create difficulty in designing storage arrays. Metadata is usually stored in small files, with the referenced data in large files; the two file sizes are not always handled equally well by the same devices.
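To make the mixed-file-size problem concrete, here is a minimal sketch of the kind of inventory audit a storage planner might run: it bins files into "small" (metadata-like) and "large" (bulk seismic data) classes so that IOPS and capacity needs can be sized separately. The 1 MiB threshold, the function name, and the sample file names are all illustrative assumptions, not industry standards.

```python
# Sketch: classify a file inventory into small (metadata-like) and large
# (bulk data) files. Small files dominate IOPS demand; large files
# dominate throughput and capacity demand.

SMALL_FILE_THRESHOLD = 1 << 20  # 1 MiB cutoff (hypothetical)

def summarize_inventory(files):
    """files: iterable of (name, size_in_bytes) pairs.

    Returns per-class file counts and total bytes.
    """
    summary = {
        "small": {"count": 0, "bytes": 0},
        "large": {"count": 0, "bytes": 0},
    }
    for _name, size in files:
        cls = "small" if size < SMALL_FILE_THRESHOLD else "large"
        summary[cls]["count"] += 1
        summary[cls]["bytes"] += size
    return summary

# Hypothetical inventory: one seismic volume plus its metadata sidecars.
inventory = [
    ("survey_042.segy", 250 * (1 << 30)),  # 250 GiB seismic volume
    ("survey_042.meta.json", 12 * 1024),   # 12 KiB metadata sidecar
    ("well_log.xml", 48 * 1024),           # 48 KiB well log
]
report = summarize_inventory(inventory)
```

An audit like this can show, for example, that metadata accounts for nearly all file operations while contributing almost no capacity, which argues for placing it on a separate, latency-optimized tier rather than on arrays tuned for streaming large seismic files.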
High Availability and Security
Around-the-clock operations make high availability and redundancy key. Effective systems monitoring is critical. Data backup must be fast and reliable, and a disaster recovery plan must be in place to restore operations in case of failure. Physical premises security should use advanced authentication methods such as biometrics.
Build Custom, Go Modular, or Outsource
Some customers in the oil and gas industry choose not to build their own data centers, preferring to outsource to service providers with expertise. Cloud providers with HPC infrastructure can provide the industry the capacity for big data analytics on an as-needed basis.
Other companies choose to use a modular data center, which offers scalability plus easy installation in the remote locations where wells are placed. Modular data centers also offer flexibility in responding to technical and economic changes. Rack space can be added without interrupting current operations. Modular data centers also make it easier for oil and gas companies to comply with environmental requirements, as most construction takes place off-site. The off-premises construction also reduces the number of permits needed and makes modular data centers quicker to deploy.
Despite worries about global warming and a push for clean energy, crude oil production has steadily increased over the last five years. Value-added resellers (VARs) that can help companies meet the big data requirements of data centers in the oil and gas industry will tap into a growing market, one that needs help understanding how to best manage and utilize the massive volumes of data it's now generating.