Oil & Gas News

Thursday, 28 May 2009

TNK-BP's Exploration Data Management Program

TNK-BP's exploration and production strategy focuses on applying new technology to turn the Company's huge resources into proven reserves. The Company's investment in seismic should be supported by solutions ensuring secure information storage, and its investment in exploration should be supported by solutions ensuring data reliability and accessibility.

Oleg Bantyukov (ONBantyukov@tnk-bp.com), Data Quality Improvement Section Head, IT and Database Dept., TNNC

Pavel Potapov (PAPotapov@tnk-bp.com), Acting Head of Archive Systems Section, IT and Database Dept., TNNC

Data Management Organization

The Data Management Organization of the Tyumen Petroleum Research Center (TNNC) is in charge of developing a quality data management system in TNK-BP (see "Data Management: the Future is Defined by the Newly Established Organization", Innovator 20). Today it manages all exploration and production data flows within the Company and supports all of TNK-BP's Performance and Business Units. The Organization provides over 40 services on corporate exploration and geological and geophysical (G&G) databases and archives to users from all subdivisions of the Company.

Creating the TNK-BP Seismic Archive

One of the priority tasks for TNNC data management specialists is to develop a corporate seismic archive. Seismic data is currently stored in the IT and Database Dept., TNNC, on a specially allocated 500 GB disc array, as well as in the PCMS seismic data management system. However, these resources are not sufficient, and up to 75 percent of the information is stored on single-copy magnetic tapes. Under standard conditions, such records lose their properties after five to seven years of storage. Thus, within several years the Company may lose up to 25 percent of the acquired seismic data if it does not provide the right storage conditions. Moreover, growing data volumes, random data storage on multiple media, data duplication and the lack of a consolidated corporate storage system hamper efficient work with the information and create additional risk of data loss. All of this dictated the need for a comprehensive shared information system to manage the seismic data and store both the primary seismic information and the results of its interpretation.

Over the last two years, TNNC has made major efforts to create and equip the Company's seismic archive, which is to start working in 2009. In summer 2008, a core storage facility was commissioned; it is now being equipped, with racks purchased to store the seismic data storage media (Fig. 1) and their installation planned for the next spring. Furthermore, terms of reference have been developed and approved for an indexing system for the seismic data storage media, with installation planned to begin in December 2008. After that, the storage media will be marked and indexed. The system will make it possible to identify the location of the required data in 3D mode, showing the numbers of the room and the shelf (a toy sketch of such an index follows below).

In January, a hardware and software complex will be shipped from Finland to expand the disc space for data storage and provide backup. In 2009, it is planned to equip the seismic data storage with a ventilation and humidification system to ensure reliable, long-term media storage, complete the data indexation, arrange a centralized search system for the initial seismic data storage media, and complete the media bar-coding.
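To illustrate the kind of media index described above, here is a minimal sketch in Python: each tape's barcode maps to the survey it holds and a physical location (room, rack, shelf), so an operator can locate the required media quickly. All names, the record layout, and the example values are hypothetical, not TNNC's actual system.

```python
# Hypothetical sketch of a barcode-to-location index for seismic
# storage media; not TNNC's actual indexing system.
from dataclasses import dataclass


@dataclass(frozen=True)
class MediaLocation:
    room: int
    rack: int
    shelf: int


class SeismicMediaIndex:
    """Maps tape barcodes to survey names and storage locations."""

    def __init__(self) -> None:
        self._index: dict[str, tuple[str, MediaLocation]] = {}

    def register(self, barcode: str, survey: str, loc: MediaLocation) -> None:
        # One entry per physical tape; re-registering overwrites.
        self._index[barcode] = (survey, loc)

    def locate(self, barcode: str) -> tuple[str, MediaLocation]:
        if barcode not in self._index:
            raise KeyError(f"Unknown media barcode: {barcode}")
        return self._index[barcode]


if __name__ == "__main__":
    index = SeismicMediaIndex()
    index.register("TP-000123", "3D survey, 2007", MediaLocation(room=2, rack=14, shelf=3))
    survey, loc = index.locate("TP-000123")
    print(f"{survey}: room {loc.room}, rack {loc.rack}, shelf {loc.shelf}")
```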
Data Quality Means Operations Quality

Another priority in data management is to ensure the quality of the G&G data. The lack of appropriate processes in the Company's PUs affected data quality and integrity and delayed its download into the Corporate Database (CDB). Inconsistent information flows caused massive duplication of both the initial information and the interpretation results, which led to sidetracking and pilot drilling and caused unjustified expenses for the Company.

Data quality and integrity are also negatively affected by the fact that PU specialists do not have a tool to check and visualize operational G&G data coming from the contractors. That is why the key objective for the TNNC Data Management Organization in this field is to develop tools and software to control the quality and reliability of the information downloaded into the CDB. The Data Quality Improvement Section within the TNNC IT and Database Dept. is in charge of this work.

The Section initiated the development of software to convert unstructured G&G and exploration and production data into the Company's standard format and provided it to the geophysical survey contractors. For the first time ever, the Company has developed regulations for the submitted data and the tools to convert the data into the desired format. Thus, File Inkl View includes a standard algorithm to calculate directional survey parameters from tool-measured parameters such as depth, angle and azimuth; the average angle method is used to calculate the trajectory (a minimal sketch of this calculation follows at the end of this section). VDL (variable density log) Converter converts unstructured files containing cementing quality findings into structured WDEF files. Another tool, PGIS (Development Logging) Converter, converts unstructured files containing well log control findings into structured WDEF files. Templates for the created files are generated based on the appropriate Corporate Technical Standards.

An effective tool was developed for PU specialists to evaluate input data quality against defined criteria and visualize the acquired data in 3D mode. File Inkl View is designed for directional survey data (Fig. 2). When analyzing the well data, the user can easily change the borehole image scale and dimensional orientation to view the trajectory from all sides. The software provides batch control of structured files, so the quality of the provided geophysical data is assessed within minutes.

FileTest is used to process structured text files containing well data in LAS (Log ASCII Standard) format, versions 1.2, 2.0 and 3.0. PGIS Test checks the structured WDEF files containing well log data for certain types of errors, a list that will be expanded over time. Another tool, VDL Test, checks unstructured files containing cementing quality findings; it helps identify gross errors in cement bond log findings at the initial stage and ensures that quality data are submitted to the CDB. All of these tools support both individual and batch testing.

New Solutions to Ensure Data Quality and Integrity

To control the integrity of incoming file data and track the information flow, TNNC specialists have developed the ArchiveShare data flow management system. It consists of two subsystems. The registering subsystem automatically receives the incoming data and registers it in its own incoming database; the data sources may be an e-mail box, DVDs, hard drives, or FTP. The received data are then placed in dedicated file resources, where they become available for further work. The web subsystem helps visualize these data and provides a set of functions to facilitate and manage the data flow. Moreover, it uses e-mail to notify users of key events, such as a move to the next stage of data processing or a holdback.
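As promised above, here is a minimal sketch of the average angle method for computing a well trajectory from directional survey stations (measured depth, inclination angle, azimuth). This is the textbook calculation, not TNK-BP's File Inkl View implementation; the example stations are invented.

```python
# Textbook average angle method for a directional survey;
# illustrative only, not the File Inkl View algorithm.
import math


def average_angle_trajectory(stations):
    """Convert stations of (md, inc_deg, azi_deg) to (tvd, north, east).

    Each consecutive pair of stations is treated as a straight segment
    inclined at the average of the two inclinations and azimuths.
    (Naive azimuth averaging; real tools handle the 0/360 degree wrap.)
    """
    tvd = north = east = 0.0
    path = [(tvd, north, east)]
    for (md1, inc1, azi1), (md2, inc2, azi2) in zip(stations, stations[1:]):
        dmd = md2 - md1  # measured depth increment along the borehole
        inc = math.radians((inc1 + inc2) / 2.0)
        azi = math.radians((azi1 + azi2) / 2.0)
        tvd += dmd * math.cos(inc)
        north += dmd * math.sin(inc) * math.cos(azi)
        east += dmd * math.sin(inc) * math.sin(azi)
        path.append((tvd, north, east))
    return path


# Example: three stations of a gently deviating well.
surveys = [(0.0, 0.0, 0.0), (500.0, 4.0, 45.0), (1000.0, 8.0, 50.0)]
for tvd, north, east in average_angle_trajectory(surveys):
    print(f"TVD {tvd:8.1f} m  N {north:7.1f} m  E {east:7.1f} m")
```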
To control data quality and integrity, the CDB includes tools for comprehensive assessment of the information in a data array. They help accumulate a knowledge hub of studies which, in turn, improves the quality of data testing. View Inkl is used to display and visually assess the quality of G&G information downloaded into the BASPRO Database, including data on directional surveys, segregations, layer intersection coordinates, wellhead coordinates, altitude, and correction of magnetic variation. The software enables users to track the path of an individual well or a whole well pad.

Export Inkl is designed for modeling specialists. It helps obtain directional survey data for a PU from the BASPRO Database. This can be done both in the technical standard format (to submit data to regulators or contractors) and in a format ready for download into modeling software (subject to correction of magnetic variation).

In 2009, data management will become much more effective upon implementation of the technical standards and software for data quality control. The Company will be able to promptly track depletion of the remaining hydrocarbon reserves, simulate well interventions for enhanced oil recovery more accurately, and identify the most efficient and cost-effective options for reservoir development.
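A minimal sketch of the magnetic variation (declination) correction that, per the description above, Export Inkl applies to azimuths before export. The function name and example values are illustrative, not the tool's actual interface.

```python
# Hypothetical helper for the magnetic declination correction
# mentioned above; not Export Inkl's actual code.
def true_azimuth(magnetic_azimuth_deg: float, declination_deg: float) -> float:
    """Convert a magnetic azimuth to a true (geographic) azimuth.

    True azimuth = magnetic azimuth + magnetic declination
    (declination positive east, negative west), normalized to [0, 360).
    """
    return (magnetic_azimuth_deg + declination_deg) % 360.0


# Example: a magnetic azimuth of 47.5 deg with +12.3 deg east declination.
print(true_azimuth(47.5, 12.3))  # 59.8
```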
Labels: data management, Exploration, Information storage, production, Russia, seismic, TNK-BP, Tyumen

posted by The Rogtec Team @ 17:07