Handling Large Datasets

Working with large amounts of data can cause out-of-memory errors, re-drawing issues, and sluggish behaviour. Long-term data collection from permanent downhole gauges and high-frequency sampling are becoming increasingly common in the field, and can produce far more data than is required for pressure transient analysis.

The maximum amount of data that can be imported into WellTest depends on how much random access memory (RAM) the computer has, as well as how much of that RAM other running programs are consuming.

These steps help reduce memory-related issues in WellTest:

1. Reduce the amount of data to be imported into WellTest.

  • When extracting data from a database, use queries that return enough data for a good interpretation, but not so much that performance suffers. Extracting hundreds of thousands of points from millions yields a dataset large enough for further interactive filtering in WellTest without losing signatures in the data (see the first sketch after this list).
  • When dealing with flat files, such as ASCII files, most wireline companies can provide data at a lower frequency than the gauge sampling frequency. If only full-resolution files are available, the data can also be thinned before import (see the second sketch after this list).
  • Use IHS ValiData™, a separate program, to reduce data before bringing it into WellTest. ValiData can handle approximately five times more data than WellTest.
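
Coarse decimation can often be pushed into the extraction query itself, so the full dataset never leaves the database. The sketch below is a minimal illustration using Python's built-in sqlite3 module to stay self-contained; the table and column names (gauge_data, ts, pressure), the date range, and the use of SQLite's rowid are all assumptions, so adapt the query to your historian's actual schema and SQL dialect.

```python
# Sketch: server-side reduction of a gauge-data query before import.
# gauge_data, ts, and pressure are hypothetical names; rowid is
# SQLite-specific. Substitute your database's schema and dialect.
import sqlite3

KEEP_EVERY = 20  # keep 1 point in 20, e.g. ~100k points from ~2M

conn = sqlite3.connect("historian.db")
rows = conn.execute(
    """
    SELECT ts, pressure
    FROM gauge_data
    WHERE ts BETWEEN ? AND ?    -- restrict to the test window
      AND (rowid % ?) = 0      -- coarse decimation in the query itself
    ORDER BY ts
    """,
    ("2023-01-01", "2023-03-01", KEEP_EVERY),
).fetchall()
conn.close()
print(f"Extracted {len(rows)} points for import into WellTest")
```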
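For flat files, a simple pre-import pass can thin the data while preserving transients. The following sketch assumes a hypothetical two-column "time,pressure" ASCII layout with a header row; it keeps every Nth point plus any point where pressure moves sharply, so signatures survive the decimation. Adjust the parsing and thresholds for your wireline company's actual export format and units.

```python
# Sketch: thinning a high-frequency ASCII gauge file before import.
# The "time,pressure" layout, file names, and thresholds are assumptions.
KEEP_EVERY = 10   # baseline decimation factor
DELTA_P = 5.0     # always keep points with a large pressure jump,
                  # so transient signatures are not lost

def thin_gauge_file(src: str, dst: str) -> None:
    with open(src) as fin, open(dst, "w") as fout:
        fout.write(fin.readline())            # copy the header row
        last_p = None
        for i, line in enumerate(fin):
            _t, p_str = line.strip().split(",")
            p = float(p_str)
            # Keep the point if it falls on the decimation grid or if
            # pressure has moved sharply since the last kept point.
            if (i % KEEP_EVERY == 0
                    or last_p is None
                    or abs(p - last_p) >= DELTA_P):
                fout.write(line)
                last_p = p

thin_gauge_file("gauge_raw.csv", "gauge_thinned.csv")
```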

2. Reduce the amount of data in WellTest.

  • Filter gauges. The filter in WellTest's Data Management tab is designed for data reduction and should be used when dealing with very large datasets. See Filtering in Data Management for additional information; a generic sketch of this kind of reduction follows this list.
  • In extreme cases, where re-drawing issues or sluggish behaviour prevents the interpretation of data, deleting Data Management gauges reduces the memory footprint of the file. To do this, right-click the gauge tab above the grid and select "Delete Gauge".
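
To illustrate what a data-reduction filter does, the sketch below keeps points roughly evenly spaced in log time, which suits pressure transient analysis because the data are viewed on log scales. This is a generic example, not WellTest's actual filter algorithm; the points_per_decade parameter and the synthetic drawdown data are assumptions for demonstration only.

```python
# Sketch: a log-spaced reduction filter of the kind useful for pressure
# transient analysis. Generic illustration only, not WellTest's filter.
import math

def log_filter(times, pressures, points_per_decade=50):
    """Keep points roughly evenly spaced in log10(elapsed time)."""
    kept_t, kept_p = [times[0]], [pressures[0]]
    t0 = times[0]
    next_log = None
    for t, p in zip(times[1:], pressures[1:]):
        elapsed = t - t0
        if elapsed <= 0:
            continue
        lg = math.log10(elapsed)
        # Keep a point each time elapsed time crosses the next log step.
        if next_log is None or lg >= next_log:
            kept_t.append(t)
            kept_p.append(p)
            next_log = lg + 1.0 / points_per_decade
    return kept_t, kept_p

# Example: 1,000,000 one-second samples reduce to a few hundred points.
t = list(range(1_000_000))
p = [30_000 - 100 * math.log10(x + 1) for x in t]
ft, fp = log_filter(t, p)
print(f"{len(t)} points reduced to {len(ft)}")
```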