Acoustic fibre optics in wells
What you can do with seismic in the cloud
Is better subsurface knowledge the key to improving the financial viability of offshore operations?
February / March 2015
Issue 53
Events 2015
Non-seismic Geophysics
London, 19 Feb 2015
Admission from £20
www.d-e-j.com
www.findingpetroleum.com
Editor
Karl Jeffery
jeffery@d-e-j.com
Tel +44 208 150 5292
Conference Producer
Panas Kalliantas
pdkalliantas@d-e-j.com
Tel +44 208 150 5295
Richard McIntyre
mcintyre@d-e-j.com
Tel +44 (0) 208 150 5291

Production
Wai Cheung
wai@tankeroperator.com

Subscriptions:
£250 for personal subscription, £795 for corporate subscription.
E-mail: subs@d-e-j.com

This takeover was underpinned by a profound understanding of North Sea geology, of Yet-to-Find volumes, of undeveloped discoveries, of upcoming development projects, and of producing fields. In point of fact, BP probably understood Britoil's acreage and fields better than Britoil did itself.

This 1986-1988 work was almost entirely analogue: paper composite logs, paper seismic sections, hand-drawn maps, tracing paper, light-tables, occasional use of the digitizing table (remember those?). It worked!
Leaders
The technology will also introduce new real-time factors which should be considered, such as current news events around economic instability, political unrest and natural disasters.
Cognitive computing systems can help in exploration and production by helping individuals to better interpret big data and then
make informed decisions based on that data.
Subsurface
Mark Stevens, Director of Communications, GDS, Oceaneering.
Usually Oceaneering has about 100 simultaneous video feeds across the company.
The data files are getting larger all the time,
with more high definition and 3D cameras
and high-resolution video compression formats.
Meanwhile there are many limitations to
physical media (such as DVDs, USB
drives). It is hard to manage and share files.
A lot of companies are asking for video of
drilling data, particularly looking at the drill
floor, to record the drill pipe going through
the floor of the drill ship and back out again.
Benefits of video
Oil and gas companies are using video in
many different ways.
Subsea, companies use video for monitoring
ROV operations, observing rotary brush
cleaning operations, subsea X-ray imaging
(digital radiography) and general long term
asset monitoring.
The video is proving particularly useful in
subsea cleaning work. You can capture
videos before and after cleaning, to see how
effective the cleaning was, and then work
out a better cleaning schedule.
If the video is of a pipeline inspection, all of the video can be geographically tagged, so you can automatically bring up the video inspection image of a certain section of pipeline.
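To illustrate, here is a minimal sketch of how such a geotagged lookup might work, assuming each video segment is simply tagged with the pipeline kilometre point (KP) range it covers; the segment names and index are invented for illustration, not Oceaneering's actual system:

    # Hypothetical sketch: find the video segment covering a pipeline
    # kilometre point (KP). Segment names and KP ranges are invented.
    from bisect import bisect_right

    segments = [                      # (start_kp, end_kp, video_file)
        (0.0, 1.2, "survey_seg_001.mp4"),
        (1.2, 2.6, "survey_seg_002.mp4"),
        (2.6, 4.0, "survey_seg_003.mp4"),
    ]

    def segment_for_kp(kp):
        """Return the video file whose KP range contains the given point."""
        starts = [start for start, _, _ in segments]
        i = bisect_right(starts, kp) - 1
        if i >= 0 and kp <= segments[i][1]:
            return segments[i][2]
        return None

    print(segment_for_kp(1.9))   # -> survey_seg_002.mp4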
Archival infrastructure
The Video Vault solution is made available
as a hosted cloud service, for a tiered daily-rate or monthly price depending on application, including maintenance. It is also available as a bundled hardware solution for on-premises installation. This means you can
store the data on your local servers, or you
can host it on a standard commercial
archival service like Google. Clients never
have to see any physical storage media.
If you are concerned about storing your data on standard commercial cloud services, you could use Video Vault with Amazon's GovCloud.
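For teams experimenting with this kind of archiving themselves, a minimal sketch of pushing a video file to an S3-compatible store follows, using the AWS boto3 library; the bucket, key and file names are illustrative assumptions, not Video Vault's actual interface:

    # Hypothetical sketch: archive an inspection video to an S3-compatible
    # store. Bucket, key and filename are invented for illustration.
    import boto3

    s3 = boto3.client("s3", region_name="us-gov-west-1")  # AWS GovCloud region
    s3.upload_file(
        Filename="rov_inspection_2015-02-01.mp4",
        Bucket="example-video-archive",
        Key="subsea/rov_inspection_2015-02-01.mp4",
    )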
Data acquired in the South Georgia Rift Basin (USA) using two UNIVIB trucks was able to bring out a structure at 10,000 feet
UAVs
Companies are now showing interest in developing unmanned aerial vehicles (UAVs) which can fly around the spread of wireless devices, download data over wi-fi and do basic quality control on it, he said.

Purchasing a fixed-wing UAV which can carry a 5 lb payload and fly for 2 hours costs about $150,000, he said. But it might be able to quality control 9-10 lines in a 2 hour period, compared to 1-2 lines using conventional methods.

"By integrating UAVs with cable-less systems, powerful status QC and noise monitoring can be achieved simply and even more efficiently, without the need for complex radio infrastructure," he said.

"But there are still regulatory obstacles to using UAVs in many countries," he said.

Inova's G3i HD cabled seismic recording system - designed for high productivity
Delegates at Digital Energy Journal's Nov 27 Aberdeen conference, "Better ways to manage seismic data"

"I've had people coming to me saying, 'I can't see my seismic' that they've loaded. I said, 'You've loaded it in Norway, but it's in the UK sector, because you got the wrong UTM.' So the rule is: get it loaded by someone who actually knows how to do this. Not everyone has that skill."

Companies spend millions acquiring seismic, and then they forget to do the critical publishing of the final piece of work.

If you get the data catalogued, loaded, verified and ready to interpret, then you have done your job right.

Have your archive strategy in place, because you never know when you might need to access it again.

Whether or not you find oil, the seismic interpretation data needs to be captured and published.

And when all this is done, and the entire package has been tied up quite neatly, the last bit is to archive or sell! Whether it is archiving in your own internal storage system or an external one, you need to make that decision.
IT
So the geophysicist is on the workstation, and wants to access the data, but [the network is] so slow it's driving them crazy.

You can do as much as you want to clean up those data, but if the [network] you have got is not up to the job, then what's the point?

Work with IT. Make sure that you have everything in place, like network connectivity, so that when the geophysicists are actually interpreting the data, they don't have to wait 10 minutes for each inline to display.

Geophysicists are a valuable commodity in themselves, so wouldn't you prefer they had the tools to deliver projects on time, rather than sitting around waiting?
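A back-of-envelope sketch makes the point about network speed; the inline dimensions below are assumptions for illustration, not figures from the talk:

    # Back-of-envelope sketch: time to pull one seismic inline over the
    # network. The inline dimensions are illustrative assumptions only.
    traces_per_inline = 2000       # crosslines in one inline
    samples_per_trace = 1500       # e.g. 6 s record at 4 ms sampling
    bytes_per_sample = 4           # 32-bit float samples

    inline_bytes = traces_per_inline * samples_per_trace * bytes_per_sample
    for mbit_s in (100, 1000, 10000):       # network speeds in Mbit/s
        seconds = inline_bytes * 8 / (mbit_s * 1e6)
        print(f"{mbit_s:>5} Mbit/s: {seconds:.2f} s per inline")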
You can view Jane's talk on video at
www.d-e-j.com/video/1228.aspx
Octio produces a system with permanent digital cables on the seabed, which
can make regular seismic surveys and monitor how the reservoir is changing.
"You can see how the cracks develop subsurface. This will give the operational teams notice that something has to be done about the injection," he said.
It is reasonably easy to set up a business
proposition which is reasonably sound, he
said.
Drill cuttings
The system is being used on the Oseberg field (140 km northwest of Bergen) to monitor the injection of drill cuttings and waste water.
Drill cuttings are milled very finely at the
platform and injected into the reservoir.
This provides a much less expensive option
for managing drill cuttings than transport
back to land.
"The average cost for a North Sea field for transportation of cuttings and water waste is close to $20m a year," he said. The alternative is to ensure safe injection, for $2m to $3m.
But Norway has zero tolerance for any waste pollution. "If you can't ensure safe injection you have to transfer all the fluids onshore," he said.
Developing a better business case for permanent reservoir seismic monitoring - Helge Brandsaeter, president, OCTIO

Most oil and gas staff are busy meeting their short-term objectives, and don't have time for longer-term ones.

Overburden problems
The seismic recording can help spot problems leading to possible overburden leakage.

Seabed infrastructure
The data from the seabed sensors is transmitted back to shore.

The installation of the system basically comes down to the cost of leasing vessels, and installation cost is a third of the lifetime operating costs.

The data communications infrastructure can also be used for any other subsea equipment. "If you build such an area-wide infrastructure we can use it for all types of communications and types of sensors," he says. "You can drop down a sensor and communicate to surface."

"We have made basically an Ethernet on the seafloor; you can interface any system to us," he said.
Making progress with recording seismic in wells with fibre optics

David Hill, chief technology officer, OptaSense
How it works
The technology works by firing a pulse of
light (laser) into the fibre.
"The glass fibre is the purest material man has ever made, but there is enough inhomogeneity in the molecular structure to cause a small amount of light backscatter," he said.
The backscatter appears to be random, but it
stays relatively constant if the fibre is not
disturbed.
But if there is a tiny strain on the fibre, which can include a strain caused by a noise, the backscatter pattern changes slightly. With a calculation involving the speed of light, you can work out at which part of the fibre the event happened.
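A minimal sketch of that time-of-flight calculation follows; the group refractive index and arrival time are illustrative assumptions, not OptaSense figures:

    # Hedged sketch of the distance calculation described above.
    C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
    GROUP_INDEX = 1.468        # typical group index of silica fibre (assumed)

    def position_along_fibre(round_trip_seconds):
        """Distance (m) from the interrogator to the scattering point."""
        v = C_VACUUM / GROUP_INDEX           # speed of light in the fibre
        return v * round_trip_seconds / 2    # halved: pulse goes out and back

    # Backscatter arriving 49 microseconds after the pulse was fired:
    print(f"{position_along_fibre(49e-6):.0f} m")   # roughly 5,000 m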
Data
The company is trying to come up with a
standard way to define the data, so the data
can be transferred between systems.
The data files can be enormous, with one well generating "a terabyte a day without any problem", he says. "That's only going to get orders of magnitude worse as technology progresses."
To keep data files manageable, it is essential
to process the data at source, so you are only
transferring the much smaller processed data
files, he said.
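A short sketch shows why the raw volumes are so large; the channel count and sample rate here are assumptions for illustration, not OptaSense figures:

    # Illustrative sketch of why a DAS well can produce a terabyte a day.
    # Channel count and sample rate are assumptions, not vendor figures.
    channels = 5000          # sensing points along the well
    sample_rate_hz = 10_000  # acoustic samples per second per channel
    bytes_per_sample = 2     # 16-bit samples

    bytes_per_day = channels * sample_rate_hz * bytes_per_sample * 86_400
    print(f"{bytes_per_day / 1e12:.1f} TB per day")  # about 8.6 TB raw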
In 2015, version 4 of the OptaSense DAS system will be made available, with an extra 6 dB of signal-to-noise improvement and better spatial resolution, he said.
Other applications
OptaSense's biggest business application for the technology so far is pipeline monitoring. It is currently installed on 12,000 km of pipeline, to detect potentially damaging activity along the pipeline and to detect leaks.
The technology is also installed around factories, to monitor for people climbing over or cutting fences.
It is also being used for condition monitoring, such as monitoring the condition of risers.
He was speaking at the Digital Energy Journal conference in Aberdeen on November 27,
Better Ways to Manage Seismic Data.
The benefits of cloud data are well publicised, but perhaps none of them motivates activity as much as short-term economic gain: the financial case for storing data in the cloud.
"Once seismic data has been moved to the cloud, you can gain many extra benefits, such as being able to do Hadoop processing and automatic data indexing" - Henri Blondelle, global business development manager, CGG Data Management Services
Indexing
You can use sophisticated cloud-based analytics tools which can interpret text.

You can use analytics tools such as Tibco's Spotfire or Tableau Software. These tools make it easy to share the results of the analytics, unlike analytics run over Petrel, where someone needs to log in to a Petrel workstation to view them.

"Many interpreters prefer to use something that looks like Excel - like Spotfire or Tableau," he says.
Teradata and Hortonworks provide a range of statistical tools you can use.
You can store your seismic data in a Hadoop-based storage solution, which gives you the ability to do some Hadoop analytics, including seismic processing and interpretation, without downloading the data.
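As a hedged illustration of analytics running where the data sits, here is a minimal PySpark sketch; it assumes traces have already been exported from SEG-Y into Parquet rows of (trace_id, samples), which is a simplification for illustration, not CGG's actual workflow:

    # Hedged sketch: RMS amplitude per trace, computed where the data lives.
    # Assumes traces pre-converted to Parquet rows (trace_id, samples).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("trace-rms").getOrCreate()

    traces = spark.read.parquet("hdfs:///seismic/survey_a/traces.parquet")

    def rms(row):
        s = row["samples"]
        return (row["trace_id"], (sum(x * x for x in s) / len(s)) ** 0.5)

    traces.rdd.map(rms).toDF(["trace_id", "rms"]) \
          .write.parquet("hdfs:///seismic/survey_a/trace_rms.parquet")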
CGG has also looked at ways to make seismic data easier to catalogue and index automatically.
Indexing data the classical way generally involves opening it, reading it, and entering the relevant information in a database, which can be time-consuming if you have millions of files to index. As a result, the task is often not done.
So perhaps if the documents could be
analysed and indexed automatically, oil companies would index documents which otherwise would not be indexed at all.
"These tools are already used by US law firms to automatically classify reports on cases," he said.
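A minimal sketch of that kind of automatic classification, using scikit-learn; the report snippets and labels below are invented for illustration:

    # Hypothetical sketch of auto-indexing: train on a few labelled report
    # snippets, then classify the unlabelled backlog.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_docs = [
        "final processed stack velocity model horizon picks",
        "well test pressure build-up flow rates choke",
        "core photographs lithology porosity permeability",
    ]
    train_labels = ["seismic", "well_test", "core_analysis"]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(train_docs, train_labels)

    print(model.predict(["interval velocities and migrated stack volumes"]))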
View Henri's talk on video at
www.d-e-j.com/video/1223.aspx
The classic internet search engine, digital library and enterprise search have traditionally focused on precision and ranking. The rationale is that as long as the specific web page or document you were seeking is on that first page, it does not matter how many results are returned.

This approach has been incredibly successful, leading to Internet search engines like Google attracting a crowd nearing one billion users a week, of which 94 per cent never click past the first page of search results.

Faceted search
But increasingly with Internet search, smart
algorithms recommend or suggest related information, trying to predict what we need or
may find interesting.
In addition, social networks undoubtedly aid discovery. However, some researchers feel the overuse of historical usage and activity data within suggestion algorithms may place us in a "filter bubble", constraining potential serendipitous encounters.
Enterprise search
In an enterprise environment, significant frustration still exists, as the success seen on the Internet seems harder to replicate inside an organization. Factors behind unsatisfactory retrieval include investment levels, organizational culture, the nature of workplace tasks, information governance and interventions, small crowds, information structure and permissions, along with the information behaviours of staff and management.
Need to be surprising
Recent research by Robert Gordon University, published in the Journal of Information Science, identified certain information needs with respect to faceted search refiners.

The research was conducted using word co-occurrence stimuli generated from data provided by the Society of Petroleum Engineers, the Geological Society of London and the American Geological Institute.

Initial results were promising. In an observational study of 53 geoscientists in two oil and gas organizations, 41 per cent felt the current search interfaces used by their organization facilitated serendipity to a moderate/large extent, increasing to 73 per cent with the introduction of certain algorithmically generated filters.

A surprising word association may be detached from any initial specific intent, the surprising nature of the association enticing the searcher to drill down further, which may lead to a serendipitous encounter.
Enhancing creativity
What is surprising to one person may not be to another, as suggested filter terms are compared with each searcher's own cognitive map, like a game of spot the difference.
The challenge with text co-occurrence is to decide what to present to the user: minimizing distraction while offering potential surprises, and combining this with traditional controlled vocabulary (taxonomy) metadata approaches.
Companies that adopt such practices may experience more happy accidents in the user interface than those which do not.
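One hedged way to score "surprising" co-occurring terms is pointwise mutual information (PMI), which rewards word pairs seen together more often than chance; the tiny corpus below is invented for illustration, not the study's data:

    # Hedged sketch: rank candidate facet refiners by word co-occurrence.
    from collections import Counter
    from itertools import combinations
    from math import log

    docs = [
        "turbidite reservoir north sea",
        "turbidite fan deepwater reservoir",
        "carbonate reservoir middle east",
        "north sea chalk carbonate",
    ]

    word_counts, pair_counts = Counter(), Counter()
    for doc in docs:
        words = set(doc.split())
        word_counts.update(words)
        pair_counts.update(frozenset(p) for p in combinations(sorted(words), 2))

    n = len(docs)
    def pmi(a, b):
        """PMI > 0 means the pair co-occurs more often than chance."""
        joint = pair_counts[frozenset((a, b))] / n
        return log(joint / ((word_counts[a] / n) * (word_counts[b] / n)))

    print(f"{pmi('turbidite', 'reservoir'):.2f}")   # 0.29
    print(f"{pmi('north', 'sea'):.2f}")             # 0.69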
Analytics on Hadoop
SAS and Cloudera recently announced technologies that move analytic functions directly within a Hadoop cluster.

For many organizations, establishing an enterprise data hub using Hadoop will be a cost-effective solution for capturing all data, structured and unstructured, in a secure, managed environment. When paired with additional technology applications to ensure data quality, and to visualize and analyze the data effectively, Hadoop is ready for prime time.

Software companies like Cloudera and SAS are working together to provide processes and technologies that accelerate data-driven insights.
Dave Cotten, whose team at Cloudera supports many US oil and gas companies, says that Cloudera's oil and gas clients are realizing multiple revenue-generating and cost-saving opportunities.

From real-time field operations feedback improving reservoir yields, to full-fidelity electronic well record management, to mining internal and public data to determine optimal well spacing, customers are obtaining deeper insights at lower costs, provided by Hadoop in an enterprise data hub.
There are many more exciting things the industry could do if the seismic data systems were on the cloud. One US company put all of its seismic data onto disk, then hired six students from the Colorado School of Mines and gave them access to the entire seismic library, telling them to "go and find some stuff".
"It is a mystery to me why we have kept the standalone workstation going as long as we have," Mr Holmes said.
Geophysicists still work on personal workstations, where they spend 20 minutes loading
up all their data every morning. If they could
work directly on a cloud system it would be a
lot faster.
A side-effect of the growth of big data systems is that many companies now have multiple systems for storing data, including their normal archiving systems, high performance computing (HPC) environments and Hadoop environments.
They might have the same data file in all of
these systems. If they back up the data in each
environment multiple times, they can end up
with many copies of the data. One company
worked out they would have 17 copies of all
of their data, if everything had gone well, he
said.
As data volumes get bigger, keeping 17
copies of everything will get very expensive.
"If we have any chance of surviving the next few years, it's going to be crucial that we have a single instance of our data," he said. Otherwise, companies will make a fortune selling you vast amounts of storage you don't need.
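A minimal sketch of a first step towards a single instance: group files by content hash so duplicate copies can be found (the path and file pattern are illustrative assumptions):

    # Hedged sketch: detect duplicate copies of files by content hash.
    import hashlib
    from pathlib import Path
    from collections import defaultdict

    def sha256(path, chunk=1 << 20):
        """Hash a file in 1 MB chunks so large files fit in memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    by_hash = defaultdict(list)
    for p in Path("/data/seismic").rglob("*.sgy"):   # illustrative path
        by_hash[sha256(p)].append(p)

    for digest, copies in by_hash.items():
        if len(copies) > 1:
            print(f"{len(copies)} copies of {digest[:12]}: {copies}")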
A new term has been invented, "next generation data fabric", which describes the enterprise architecture for storing and managing information, he said.
Companies will also use "object storage", which means that the analytics tools can understand the different data storage systems you are using.
Developments at LMKR
Subsurface data and modelling company LMKR reports that it has formed a partnership with petroWEB, an oil and gas data and information management company with offices in Colorado, Canada and Houston.
The agreement is for LMKR's "GeoGraphix"
subsurface interpretation system to integrate
with petroWEB's "Enterprise DB" exploration
and production data management system,
built on the PPDM model.
Enterprise DB can serve as a corporate well
master, well log repository and well file management system.
By putting Enterprise DB together with GeoGraphix, you have a single system for managing large volumes of subsurface data, LMKR
says.
LMKR has also formed a technology partnership with LUMINA Geophysical, a company based in Houston which provides special tools for quantitative interpretation of the subsurface, based on a mathematical method called spectral decomposition. This allows more geological information to be extracted from seismic data.

Optique
Professor Waaler believes that this system
will be different to other big data solutions, in
that it will focus on understanding the complexity (including the variety) of the data,
where most other big data solutions just focus
on working with large data volumes.
"Optique .. addresses trustworthiness by
showing where data came from and how it
has changed, providing transparency for the
end user," he said.
At the moment, geologists and engineers need to involve the IT department if they would like to pose a complex query to their databases, but with the Optique system they can get answers in minutes, he believes.
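As a hedged illustration of the ontology-mediated style of query Optique aims at, here is a minimal sketch using the open-source rdflib library; the ontology file and vocabulary are invented for illustration and are not Optique's actual interface:

    # Hedged sketch: query an RDF graph with a domain vocabulary, so the
    # question reads in subsurface terms rather than database tables.
    from rdflib import Graph

    g = Graph()
    g.parse("subsurface_ontology_sample.ttl", format="turtle")  # sample file

    results = g.query("""
        PREFIX ex: <http://example.org/subsurface#>
        SELECT ?well ?porosity WHERE {
            ?well a ex:ExplorationWell ;
                  ex:hasCoreSample/ex:porosity ?porosity .
            FILTER (?porosity > 0.2)
        }
    """)
    for well, porosity in results:
        print(well, porosity)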
"This will open up new exploratory and interactive ways of working, as users get more relevant data sets in a shorter time."
The results of the research are planned to be presented in Høvik, Norway, in early 2015, with an aim of attracting more companies to get involved, and ultimately developing methods and technology which will be used by the industry mainstream.
"We will deliver a good concept, but this will not be something that can be delivered to the industry two years from now," Professor Waaler said.
"I hope that by then [early 2015] we have
something so impressive that the industry will
want to continue to fund this project. I am optimistic."
We've got you covered.

As you strive to explore more of the world's oceans, it's important to choose the right maritime satellite communications partner. Intellian has proved time and time again that its high-performance VSAT, Sat TV and FleetBroadband systems can be relied upon to keep you connected to the world, even in the most extreme locations.

Innovative designs, built to perform, engineered to last and simple to use: Intellian continuously produces patented technology to deliver performance-leading RF capability. Seamless integration and superior remote management and control capability make Intellian the preferred choice for the world's leading service providers.

intelliantech.com
Data problems
It is too common for operators not to have
accurate records of what they have installed
on their rigs, he said.
"We see inspection reports stating missing certificates, missing history, impossible-to-trace parts in the material master, and missing documentation and numbering information."
This might be expected when you consider that it is common for operators to take delivery of an oil rig or FPSO without receiving the associated documents, part information and materials lists in a usable format.
The problem can be ignored until it is time to do modifications, but at that point engineers can spend 2-3 hours gathering specification data (tags) for each component before they can put together a purchase order for new materials.
Every time any information is missing, you need to search for it and involve colleagues, which takes up hours of expensive time. Poor information will also lead to incorrect purchases, increasing the cost even more.
On greenfield projects, it should be much
easier to gather necessary information.
But operators are often overwhelmed by the amount of data. The operator may deal with only a small number of contractors directly, but each contractor will go on to send hundreds (or thousands) of purchase orders to its suppliers and manufacturers, which generates an enormous amount of documentation.
Sometimes documents for a single component (such as a motor) will be sent back to
the operator many times, because this component is used as part of many different
pieces of equipment.
The information is sent by email, which
means it easily gets lost, and no-one is sure if
it is correct.
Meanwhile, the operator's project staff do not necessarily have an incentive to make sure that the data is good, because after the project they will move on to a new project, and leave the data problem to the company's operations staff.
It can take many thousands of man-hours to put together a complete parts database for a new offshore asset, he said, so it is no surprise that the work is often not done, and the asset is handed to operations staff with only 30 per cent of the information available.

"We end up with poor and missing information in the material master," he said.
As the industry has to lower costs, the EPC (Engineering, Procurement and Construction) companies must work more efficiently and at the same time deliver better quality. This is what Sharecat's products are tailored for, Mr Drageset said.
"We also see drilling companies struggling with procedures and routines to build up information correctly," he said. A lot of data is entered only as free text, leaving no possibility of retrieving crucial information for later maintenance, modifications and purchasing of equipment and parts.
Sharecat
Sharecat's service is to reduce the overall workload, by maintaining and continually updating a shared catalogue of part data and delivering quality data to clients.
So, for example, data about a part such as a specific ABB motor, which might be used in thousands of different pieces of equipment on many different offshore assets, only needs to be entered once and can be re-used many times.
Sharecat provides templates which can be given to engineering contractors so they know what data they need to provide, and which can be automatically uploaded into the Sharecat system.
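A minimal sketch of such template-driven checking follows; the field names are invented for illustration and are not Sharecat's actual templates:

    # Hedged sketch: report which required attributes are missing or blank
    # in a supplier's part record, given a template of required fields.
    MOTOR_TEMPLATE = {"manufacturer", "model", "voltage", "rpm",
                      "ex_rating", "certificate_no"}

    def missing_fields(record, template=MOTOR_TEMPLATE):
        """Return required attributes absent or empty in a part record."""
        return sorted(f for f in template if not record.get(f))

    record = {"manufacturer": "ABB", "model": "M3AA 132", "voltage": "690 V"}
    print(missing_fields(record))   # ['certificate_no', 'ex_rating', 'rpm']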
eEnable Your Supply Chain

Founded in 2000, OFS Portal is an organization which consists of diverse supplier members who are committed to promoting eCommerce and reducing cost. We have a non-profit objective to ensure we promote the best approaches for the industry. In addition to advocating strong protection for the security and confidentiality of electronic data, OFS Portal has gained the trust and confidence of the entire upstream oil and gas industry. We do this through our proactive advocacy approach toward best practices to reduce costs and complexity while increasing the speed of adoption.
Our Community: Anadarko Petroleum, Anderson Energy, Antero Resources, Apache Corporation, ARC Resources, Arrington Oil & Gas, Atinum E&P, Bahrain Petroleum Company, Baytex Energy, BHP Billiton, Bill Barrett Corp, Black Hills Exploration & Production, Bonanza Creek Energy, Bonavista Petroleum, BP, Breitburn Energy, Cabot Oil & Gas, Caelus Energy, Cairn India, Canadian Natural Resources, Cantera Energy, Cardinal Energy, Carrizo Oil & Gas, Cenovus Energy, Chesapeake Energy, Chevron, Chief Oil & Gas, Citrus Energy, Clayton Williams Energy, CML Exploration, COG Operating, Common Resources, Compton Petroleum, ConocoPhillips, Consol Energy, Contango Oil & Gas, Continental Resources, Corex Resources, Crescent Point Energy, Crew Energy, DCP Midstream LP, Dee Three Exploration, Denbury Resources, Devon Energy, Diamondback Energy, DistributionNow, E&B Natural Resources, Eagle Rock Energy, Eclipse Resources, Ember Resources, Emerald Oil, Encana, Endurance Energy, Energen, Energy XXI, Enerplus, Enervest, ENI, EOG Resources, EP Energy E&P, EXCO Resources, ExxonMobil, Fairways Offshore E&P, Fidelity E&P, Fieldwood Energy, FIML Natural Resources, Forest Oil, Freeport-McMoRan O&G, Gastar Exploration USA, Gear Energy, GeoSouthern Energy, Great Western O&G, Grizzly Oil Sands, Halcon Resources, Harvest Operations, Hess Corp, Highmount Exploration & Production, Hilcorp Energy, Hunt Oil, Huntington Energy, Husky Oil Operations, Indigo Minerals, Jetta Operating, Jones Energy, Laredo Petroleum, Le Norman Operating, Legacy Reserves, Legado Resources, Linn Operating, Long Run Exploration, MacPherson Energy, Marathon, Matador Production, Mewbourne Oil, Murphy Exploration & Production, NAL Resources, Nearburg Producing, Nexen, Noble Energy, Northern Blizzard Resources, Northstar Offshore Group, Oasis Petroleum, OMERS Energy, Oxy, Pacesetter Directional Drilling, Parsley Energy, PDC Energy, Pengrowth, Penn Virginia, Perpetual Energy, Petro-Hunt, Petrobank Energy & Resources, Petrobras, PetroQuest Energy, Peyto Exploration, Pinecrest Energy, Pioneer Natural Resources, Post Rock Energy, Progress Energy, QEP Resources, Quantum Resources, Quicksilver Resources, Regent Resources, Repsol, Resolute Natural Resources, Rex Energy, Rife Resources, Rio Oil & Gas, Rock Energy, Rosetta Resources, Sabine Oil & Gas, Samson Energy, Sandridge Energy, Shell, Sheridan Production, Sinopec Daylight Energy, SM Energy, Southwestern Energy, Statoil, Stone Energy, Sure Energy, Surge Energy, Swift Energy, Talisman Energy, Tamarack Valley Energy, TAQA North, Tecpetrol (US), Teine Energy, Total, Tourmaline Oil, Trey Resources, Trilogy Energy, Trioil Resources, Tug Hill Operating, Tundra Oil & Gas, Twin Butte Energy, Ultra Resources, Urban O&G, Vanguard Natural Resources, Vantage Energy, Velvet Energy, Venoco, Vermilion Energy, West Valley Energy, Whiting Petroleum, Wildhorse Resources, Windsor Energy, WPX Energy, XTO Energy, Yates Petroleum.
Our Members: Halliburton, Trican Well Service Ltd.
Delegates at Digital Energy Journal's November Aberdeen conference, "Doing more with Offshore Engineering Data"
Set requirements
With Sharecat, you easily know whether you have the information you need, and if you don't, you can inform your suppliers.

It is important to make sure suppliers are aware in advance of what they will need to provide. "You have to set up all the requirements at the beginning of the project, and how they are supposed to be delivered," he said.
iQx has also been helpful in operations, enabling easy checking of abnormal well behaviour against relevant offset wells, and making it easy to capture experiences in a new way, involving the whole team and ensuring that the quality of these experiences is up to the company's standards.
Dr Arap Ratan Ray
The Real Time Data Convergence for Multiscreen Displays (RTDCMD) framework gathers data from multiple sources scattered over
large, geographically separated areas.
This real-time data, once gathered, converges
into one central point before analytics are applied.
The data is then filtered to determine what data is relevant to whom, before being shared with different stakeholders in various locations. This filtration is essential, as different stakeholders will be interested in different data points and perspectives; tailoring the data that is presented therefore speeds up the concerned person's understanding, by limiting the amount of irrelevant data to sift through.
Due to the sheer magnitude of an oil field, the amount of data that could potentially be generated risks swamping the stakeholder if left unfiltered. In contrast, applying analytics instantly makes the data more manageable, enabling staff to react and take decisions more efficiently in response to the data presented to them.
The use of multiscreen devices that are now
popular with end users has amplified the
transformation in industrial monitoring. If we
look at how this applies to the oil and gas industry, the different stakeholders for the plant
as well as field personnel use different visualisation screens and hand-held devices. The
emerging IT and operational technology (OT)
ecosystem consists of devices, sensors, real
time data, analytic engines, always-on mobile
networks and powerful mobile applications.
Central database
The key to successfully using this ecosystem
is to accumulate the data in a large and central
database. The central database enables businesses to have access to a single source of
truth. Above this is a layer of real time analytics that distributes actionable insights to different screens for consumption based on user
roles and needs.
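A minimal sketch of that role-based filtering layer follows; the role names and tag scheme are invented for illustration, not the RTDCMD specification:

    # Hedged sketch: route each reading from the central store only to the
    # stakeholder roles that have declared an interest in that data point.
    INTERESTS = {
        "production_engineer": {"flow_rate", "wellhead_pressure"},
        "maintenance": {"vibration", "pump_temperature"},
        "hse": {"gas_detection"},
    }

    def route(reading):
        """Yield (role, reading) pairs for every interested stakeholder."""
        for role, tags in INTERESTS.items():
            if reading["tag"] in tags:
                yield role, reading

    for role, r in route({"tag": "vibration", "value": 4.2, "unit": "mm/s"}):
        print(role, r["value"], r["unit"])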
Benefits
Data is presented to users from a consistent database, enabling reliable collaboration.
Technologies
The ability to integrate data and systems at all
three levels (machine, plant, enterprise) is reliant on the strength of technical competencies and partnerships around the RTDCMD
framework. The extensive ecosystem of partnerships required for this includes engineering and automation specialists and
Information Technology providers.
For example, a very large oil and gas major
with global operations leveraged RTDCMD
technology by creating a Collaborative Work
Environment (CWE). RTDCMD can be particularly helpful in complex environments
where safety is a critical factor. M2M platforms and applications, with RTDCMD as the backbone, can orchestrate access controls and authorisation levels to help prevent accidents and security lapses.
ClampOn has services to measure subsea vibration, sand production and leakages. Other typical subsea vibration jobs include vibration on large subsea flapper valves, vibration on subsea flowlines and vibration from chemical injections.

Two experienced ClampOn Service Engineers were mobilized and met the support vessel.
Engineering Information
As It Should Be
Datum360 delivers Software as a Service (SaaS) and
consultancy to help Oil & Gas companies specify, capture
and manage engineering information for capital-intensive
projects and operations.
www.datum360.com
info@datum360.com
MAPPING STANDARDS:
A CORE COMPETENCY
OF EVERY GEOSCIENTIST
Maps are a canvas used to express complex situations to help support difficult decisions.

In exploring the subsurface, maps serve a number of important purposes: recording and storing information; supporting the analysis of a range of subsurface data; and presenting and communicating information and understanding. Map creation should be a core competency of every geoscientist, used to express complex situations to help support difficult decisions.

Our consultants can help E&P companies define and implement appropriate mapping standards that will help geoscientists present a clear, consistent and concise suite of maps for a variety of purposes. Having defined mapping standards enables geoscientists to spend more of their time focusing on the technical content.

Petrosys is a powerful subsurface mapping system that brings all your critical knowledge together on one mapping canvas. Our approach to surface modeling enables you to resolve complex challenges and to communicate the geological information necessary for decision makers to take the right action. Learn more at www.petrosys.com.au/transcend.