
What Are Flows?

Flows · July 30, 2025

Flows are a batch processing system: they are how Danomics handles many of the repetitive tasks that make up your day-to-day workflows. Whatever you are trying to accomplish, there's a Flow for that!

Flows will be a new concept to many users, but the general idea is that Flows comprise individual tools that work in series to complete a task. Let's take the example of making a set of grids for porosity and water saturation on a zone-by-zone basis. You would need to do the following:

Step | Description | Flow Tool
1 | Get the log data | LogInput
2 | Calculate PhiT and Sw | CpiLogCalc
3 | Average them by zone | CpiLogCalc (as above)
4 | Grid the results | PointsToGrid
5 | Write out the grid | GridOutput

The Flow for accomplishing this is shown below.

This Flow does the following: it brings the well log data into the Flow using the LogInput tool. The CpiLogCalc tool then pulls in the petrophysical interpretation and zone information to calculate the average PhiT and average Sw; in this case, CpiLogCalc outputs points on a per-zone, per-curve basis. PointsToGrid then converts the points to a grid, and GridOutput writes a multigrid file that contains the grids for both the average porosity and the average water saturation for every zone (you can write hundreds of grids in one Flow like this!).
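If it helps to think of a Flow in code, here is a minimal Python sketch of the pipeline above. The function names mirror the Flow tools in the table, but everything else (the functions, their parameters, and the data passed between them) is a hypothetical stand-in, not the actual Danomics API.

```python
# Hypothetical sketch of a Flow: tools run in series, each
# consuming the previous tool's output. Not the Danomics API.

def log_input(database):
    """Stand-in for LogInput: load the well log data."""
    return {"wells": database}

def cpi_log_calc(data, equations):
    """Stand-in for CpiLogCalc: compute per-zone average points."""
    data["points"] = {eq: "per-zone averages" for eq in equations}
    return data

def points_to_grid(data):
    """Stand-in for PointsToGrid: grid the per-zone points."""
    data["grids"] = [f"grid:{name}" for name in data["points"]]
    return data

def grid_output(data, path):
    """Stand-in for GridOutput: write one multigrid file."""
    print(f"writing {len(data['grids'])} grids to {path}")

# Run the tools in series.
flow = log_input("my_log_db")
flow = cpi_log_calc(flow, ["PhiT_avg", "Sw_avg"])
flow = points_to_grid(flow)
grid_output(flow, "results.grid")
```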

This Flow is also completely reusable. Want to update your interpretation and remake all the grids? No problem. Just update the interpretation in your CPI and then re-run this Flow. Want to use it on a different log database or petrophysical interpretation? No problem; swap out the relevant info in each Flow tool (or make a copy of the Flow and swap it out there, so you still have your original Flow to reference).

To help you learn this process, two examples are provided below.

Gridding Petrophysical Results

Building on the example above, let's say we wanted to automatically detect and remove outliers, grid the results, apply smoothing, and write out the smoothed and unsmoothed versions for comparison. To do this, we'd build a Flow structured as follows:

Step | Description | Flow Tool
1 | Get the log data | LogInput
2 | Calculate the properties | CpiLogCalc
3 | Average and/or sum by zone | CpiLogCalc
4 | Remove outliers | PointsSpatialFilter
5 | Grid the results | PointsToGrid
6 | Write unsmoothed grid | GridOutput
7 | Smooth the grids | GridSmooth
8 | Write smoothed grid | GridOutput

This Flow will look like this:

Once this Flow is run, it will write out two multigrid files: one before smoothing and one after, each containing the average porosity and water saturation for each zone.
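The key structural point is that one tool's output can feed more than one downstream tool: the first GridOutput writes the grids before GridSmooth runs, and a second GridOutput writes them after. A minimal Python sketch of that branch, with entirely hypothetical function names and data:

```python
# Hypothetical sketch of the branch in this Flow; these are
# illustrative stand-ins, not the actual Flow tools.

def grid_output(grids, path):
    print(f"writing {len(grids)} grids to {path}")

def grid_smooth(grids):
    return [f"smoothed({g})" for g in grids]

grids = ["PhiT_avg.ZoneA", "Sw_avg.ZoneA"]              # from PointsToGrid
grid_output(grids, "results_raw.grid")                  # step 6
grid_output(grid_smooth(grids), "results_smooth.grid")  # steps 7 and 8
```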

What if you wanted to update it to include properties such as net pay and net reservoir thickness? No problem; just add those properties and re-run it. All that changes is the set of config equations you select. And if you want to add OOIP, hydrocarbon pore volume, and average properties for all of your input curves? No problem; just add those as well, as shown here:

Now this Flow is gridding 11 properties for the 10 zones defined in your petrophysical interpretation (CPI). This means the resulting .grid files will contain 110 grids each, and all of this was done by running one Flow.
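The 110-grid count falls straight out of the per-property, per-zone structure: one grid per (property, zone) pair. A quick sketch, using hypothetical property and zone names but the counts from the example:

```python
# One grid per (property, zone) pair; names are placeholders.
properties = [f"prop_{i}" for i in range(1, 12)]  # 11 gridded properties
zones = [f"Zone_{j}" for j in range(1, 11)]       # 10 zones in the CPI

grids = [f"{p}.{z}" for p in properties for z in zones]
print(len(grids))  # 110 grids in each multigrid file
```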

Log Cleanup Flow

Another common task is cleaning log data. Although this can be done in the petrophysical interpretation, it is often beneficial to do some basic cleanup and pre-processing ahead of time. An example of such a Flow is shown here:

In this Flow we undertake the following steps:

Step | Description | Flow Tool
1 | Get the log data | LogInput
2 | Fix depth curve problems | FixLogDepthProblems
3 | Convert units to standards | LogUnitsConversion
4 | Handle SGRD name collisions | InferSGRDLogType
5 | Handle ambiguous lithology references | InferPorosityLogType
6 | Handle ambiguous resistivity types | InferResistivityLogType
7 | Baseline the SP curve | SPCurveBaselining
8 | Remove flatspots in log data | NullRepeatedLogSamples
9 | Remove entirely null curves | RemoveAllNullLogs
10 | Write out a new log database | LogOutput

This Flow performs 8 processing tasks and can be used on any log database. Like our other Flows, it can also be easily modified to include more processing steps. For example, let's say you wanted to resample all of your logs onto a 0.5' depth step: you could add a LogResample tool. Let's also say you noticed gamma ray curves with negative values: you could use a LogMath tool to null out those values. The new Flow would look like this:
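Conceptually, the added LogMath step is just a per-sample rule: null out any negative gamma ray value. A small Python illustration of that rule, where NaN stands in for the log null value (this is a hypothetical sketch, not the LogMath tool itself):

```python
import math

def null_negative_samples(curve):
    """Replace negative samples with null (NaN here)."""
    return [math.nan if v < 0 else v for v in curve]

gr = [45.2, 60.1, -3.0, 88.4]
print(null_negative_samples(gr))  # [45.2, 60.1, nan, 88.4]
```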

You are probably now starting to understand how Flows can be assembled, modified, and re-used to make your workflows more efficient. For more examples, please look at the links on the Flows Help Articles page.

Flows for (Almost) Anything

If you analyze the parts of your workflow that are common across all of your projects, you will notice patterns. Flows will help you streamline those parts of your workflow. Here are some of the areas where we see Flows being especially useful:

  • Gridding data for maps (e.g., structure, isopach, or reservoir property maps)
  • Cleaning well log data
  • Renaming well logs en masse for export and archiving
  • Building machine learning powered Flows for predicting missing logs
  • Identifying landing zones across 1000s of horizontal wells
  • Filtering and finding data that meet certain criteria (e.g., all the wells that have a full triple combo across the Wolfcamp C formation)
  • Finding wells that have a certain formation top (e.g., all wells that have a Wolfcamp C and Wolfcamp D top pick)
  • Extracting values from grids, 3D property models, or seismic data
  • Building custom Flows using Python

The list above is just meant to give you a few ideas of how you can use Flows. With a bit of patience and imagination, you can do almost anything with Flows.
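As one concrete illustration of the filtering ideas above, a custom Python step along the lines of the sketch below could flag wells that have both a Wolfcamp C and a Wolfcamp D top pick. The data layout and names here are entirely hypothetical:

```python
# Hypothetical well -> tops mapping; not a real Danomics structure.
wells = {
    "W-1": {"Wolfcamp A", "Wolfcamp C", "Wolfcamp D"},
    "W-2": {"Wolfcamp C"},
    "W-3": {"Wolfcamp C", "Wolfcamp D"},
}

required = {"Wolfcamp C", "Wolfcamp D"}
matches = [well for well, tops in wells.items() if required <= tops]
print(matches)  # ['W-1', 'W-3']
```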
