Using Software for Workflow Management

Alex Levitski


A lot has been written recently about the drastic changes taking place in the mineral exploration industry. Most of these changes stem from the fact that the exploration community has stumbled onto the proverbial “information super-highway”. The focus is shifting from data collection to working with data to achieve project outcomes. A great deal of geo-data, of all kinds and shapes, has already accumulated in various archives and repositories. The tricky part is finding these data, making them compatible with one another, and processing them in an effective and consistent way. The resulting data set guides you to your exploration targets. Not surprisingly, the geo-software market is flooded with tools that support this new approach: gridding applications such as Surfer, GIS such as ArcGIS and MapInfo, and highly specialized, powerful mapping systems such as Oasis montaj.

These data management applications enable you to harness the different types of data to achieve your goal. What about the different types of users? What, for example, will happen if you give a geoscientist several chunks of data in different formats and ask him or her to process these data? Given time and a proper set of tools, he or she will convert, sort, filter, and otherwise tweak the data, and finally present them to the world in a uniform format that will make perfect exploration sense – to this geoscientist. In all likelihood, this format will not make the same sense to any of his or her colleagues. On the other hand, if you give the same set of data to several different geoscientists, they will most probably re-process and use these data in totally different ways.

A direct consequence of the exploration community joining the IT-ruled world is the division of responsibilities. Gone are the days when the same bearded geo-genius would plan a survey, do the field work, and then process the collected data and write a report. Today, surveys are planned by suit-and-tie office dwellers, field measurements are performed by designated field teams, and the data are processed by number-crunchers at a computer center. These individuals do not necessarily work in the same office and meet twice a day by the coffee machine. They might be (and often are) in different cities, countries, and even time zones. They might not know each other personally. They might come from totally different educational and professional backgrounds. The only way to make the data flow from one team to another consistent and efficient is to instill rigid processing rules – in other words, to use workflow management.

Several months ago, I was contracted by a North European exploration company to carry out a geophysical survey, and assist in a geochemical survey, at their gold-bearing concession in East Africa. In the concession region, gold is associated with disseminated sulphides hosted in quartz veins, which are located in certain rock types within the local metamorphic belt. Strategically, not a very complex task: magnetic measurements to delineate geological structures, then IP over the structurally promising areas to find the sulphide-rich zones, then soil and rock sampling for gold over the sulphides.

The real complexity was introduced by my customer’s organizational structure and by my position within it. I was stuck between the very old-school, methodical, “by-the-book” European management, and a vigorous, inexperienced, slightly fatalistic African field team. My duties included training the field team in the necessary data acquisition techniques, receiving and processing the field data in Toronto, and submitting reports and recommendations to the management.

The management’s approach to any task was “The rules must be observed. If the task takes a month instead of a week – so be it.” Conversely, the field team’s approach to the same task was “The job must be done before the end of the working day. If some rules get bent or broken in the process – so be it.” My approach – forged in Russia and hardened in Canada – is “The result must make sense and be ready by the deadline. If this means a sleepless night or two – so be it.” It was clear that, to bring all of us to a common denominator, we had to have a well-defined workflow. I ordered Geosoft’s Oasis montaj software and got busy building this workflow.

Oasis montaj offers a rich set of features that can be used to define and enforce workflows. Some of the features I used were:
  • Database and map templates – a template unambiguously defines how the data are imported, stored, processed, presented, and exported. If all the involved parties use the same template, you are not likely to be surprised by date/time values presented as integers in a database, or to spend hours wondering why the northern half of your site is shown on a map in ARC 1960 and the southern half in WGS-84.

  • Sets of default values for databases, grids, maps, etc. – these serve as a safety net for operations and entities not covered by templates. If one group along the workflow chain introduces an exotic entity, such as a multi-component 3D view, another group can use the default values to decipher and reproduce that entity.

  • Quality control tool (for IP data) – helps quickly identify and analyze invalid or questionable field data. For example, one of the tricks IP can play is slipping negative potential values into the IP curve; the minus sign in these values is not apparent on the IP receiver screen. The montaj QC tool makes these values stand out (a minimal sketch of this kind of check appears after this list).

  • “Locking” mechanism for database columns – prevents users from making inadvertent changes to the column values. This is a foolproof method to preserve the “vanilla” field data across the numerous interpretation procedures.
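To make the QC idea concrete, here is a minimal sketch of that kind of check, written in Python with pandas. The language, the column names, and the sample values are purely illustrative assumptions of mine and are not part of the montaj workflow or schema; the point is only that negative primary-voltage readings can be flagged automatically rather than hunted for by eye.

```python
import pandas as pd

# Hypothetical field export: one row per IP reading.
# Column names ("line", "station", "vp_mv") are illustrative, not the montaj schema.
readings = pd.DataFrame({
    "line":    [100, 100, 100, 200],
    "station": [0, 25, 50, 0],
    "vp_mv":   [312.4, -1.7, 298.9, 305.2],   # primary voltage in millivolts
})

# A negative primary voltage is easy to miss on the receiver screen;
# flag such readings so they stand out before any interpretation begins.
suspect = readings[readings["vp_mv"] < 0]

if suspect.empty:
    print("No negative Vp values found.")
else:
    print(f"{len(suspect)} suspect reading(s) with negative Vp:")
    print(suspect.to_string(index=False))
```

In our project the montaj QC tool performs this kind of check inside the project database itself; what matters for the workflow is that the check is automated and applied the same way by every team.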

So far, the established workflow is doing its job – keeping everybody gainfully busy and delivering results across the globe on time, with little need for explanations. The measure of success, as I see it, is the average number of weekly phone calls from one team to another – it is pretty close to zero. Hopefully, our workflow will hold when gold starts showing in the survey area and the fun begins for real.