
Charles P. Oden


Phone: 303-800-2000 x02



Current Affiliations

  • Director of Research and Development, Earth Science Systems, LLC (email: )
  • Adjunct Assistant Professor, Colorado School of Mines (email: )
  • Proprietor, Mercury Geophysics (email: )

Research Interests


In geophysics, it is rare that we can directly measure the quantity of interest (e.g. hydraulic permeability, oil content, or ore grade). Rather, these quantities must be inferred. To make these inferences, geophysics draws on various sciences and technologies, such as physics and electrical engineering (sensors), geology and petrophysics (rock properties), and computer science (process modeling and numerical algorithms). By combining the latest ideas from each of these fields, we are more likely to create better methods for estimating subsurface parameters than we would by working within a single discipline. When we consider the difficulties we face in quantifying, understanding, managing, and protecting our natural resources, we must conclude that multi-disciplinary solutions are simply required (besides, it's more fun than working in a single field all of the time!).

High-Resolution Near-Surface Geophysics

  1. Wireless sensor networks (WSNs) are being developed for seismic, radar, EM induction, magnetic, and spectral IP measurements. Miniature wireless sensors are easier to deploy and less invasive than conventional instruments, which makes them well suited for permanent installation to monitor changes over long periods. Individual sensors may harvest energy (ambient vibrations, light, RF energy, heat, etc.) from their surroundings for long-term service-free operation. These sensors have many potential uses, including precision agriculture, infrastructure monitoring, and vadose zone studies.
  2. Usually, multiple types of geophysical measurements are needed to ‘pin down’ the parameter of interest. For example, resistivity measurements alone are inadequate for tracking conductive contaminant plumes because soil resistivity depends on many parameters (e.g. moisture content, clay content, porosity, and temperature). Since WSNs incorporate small, inexpensive sensors, a single sensor node can make multiple types of measurements, which lowers the cost of collecting the data needed to determine the parameter of interest.
  3. Tensor gradiometer EM induction measurements are useful in UXO detection and discrimination. These multi-component data sets can be inverted for the location, size, and shape of ferrous and conductive objects more quickly and with less uncertainty than more traditional one- or two-component data.
  4. Variable antenna coupling and variable surface topography are environmental effects that can adversely affect GPR data. The effects of variable antenna coupling can be accounted for with proper system calibration and data processing. The effects of surface topography can be accounted for with a high-contrast migration technique that allows velocity variations in three dimensions. More work is needed to reduce the calibration effort and computational burden of these techniques.
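The ambiguity described in item 2 can be illustrated with Archie's law, which relates bulk resistivity to porosity and water saturation. This is only a sketch: the parameter values below (a = 1, m = n = 2, and the chosen pore-water resistivity) are common textbook defaults, not measurements from any particular site.

```python
def archie_resistivity(rw, phi, sw, a=1.0, m=2.0, n=2.0):
    """Bulk resistivity (ohm-m) from Archie's law: Rt = a * Rw / (phi**m * sw**n).

    rw  : pore-water resistivity (ohm-m)
    phi : porosity (fraction)
    sw  : water saturation (fraction)
    """
    return a * rw / (phi ** m * sw ** n)

# Two different subsurface states produce the same bulk resistivity (12.5 ohm-m),
# so a resistivity measurement alone cannot separate porosity from saturation.
wet = archie_resistivity(rw=0.5, phi=0.20, sw=1.0)   # fully saturated, lower porosity
dry = archie_resistivity(rw=0.5, phi=0.25, sw=0.8)   # partially saturated, higher porosity
```

A second, independent measurement (e.g. dielectric permittivity, which responds strongly to moisture content) is what breaks this degeneracy.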

Data Processing Techniques

  1. Geophysicists are continually trying to glean more information from their data sets. Rather than simply locating subsurface objects, we want to know more about them (i.e. we found the pipe, but is it corroded?). For example, seismic and GPR data processing methods are often concerned only with the kinematic nature of the waves, yet there is very useful information in their amplitudes. Using GPR, the amount of corrosion on a pipe may be estimated by deconvolution – provided we properly determine the amplitude and direction of the antenna emissions into the ground. This in turn requires antenna calibrations that may be a function of variable ground coupling.
  2. Various references cite a roughly 30-fold increase in computing speed when using GPUs (graphics processing units) for kernels commonly used in geophysical processing (FFT, matrix operations, wavelet transform, Radon transform, etc.). Can this speed be translated into sharper images (e.g. iterative migration) or smaller confidence intervals (e.g. more complete exploration of non-linear objective functions in global optimization problems)?
  3. Ray tracing on GPUs can help migrate data through complex or high-contrast velocity structures.
  4. What are the limitations to interferometric techniques in lossy media? Under what conditions might we obtain true amplitudes using interferometry?
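The deconvolution mentioned in item 1 is commonly implemented as a water-level-stabilized spectral division. A minimal sketch follows; the three-sample wavelet and the stabilization level are illustrative assumptions, not a calibrated antenna response.

```python
import numpy as np

def waterlevel_deconv(trace, wavelet, level=1e-3):
    """Estimate reflectivity by dividing out the source wavelet in the
    frequency domain, clamping the wavelet power spectrum at a fraction
    ('level') of its maximum to avoid dividing by near-zero values."""
    n = len(trace)
    T = np.fft.rfft(trace, n)
    W = np.fft.rfft(wavelet, n)
    power = (W * W.conj()).real
    power = np.maximum(power, level * power.max())
    return np.fft.irfft(T * W.conj() / power, n)

# Synthetic example: a short wavelet convolved with two reflectors.
wavelet = np.array([1.0, -0.6, 0.2])
reflectivity = np.zeros(64)
reflectivity[20], reflectivity[35] = 1.0, -0.5
trace = np.convolve(reflectivity, wavelet)[:64]
recovered = waterlevel_deconv(trace, wavelet, level=1e-6)  # spikes back at 20 and 35
```

The water level trades resolution for stability: raising it suppresses noise amplification at frequencies where the wavelet is weak, but smears the recovered amplitudes – which is why the amplitude fidelity of the result depends on knowing the wavelet (i.e. the antenna emission) well.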

Instrumentation

  1. Emerging technologies are fueling the development of new measurements and methods. Generally, sensors are becoming smaller, more sensitive, and less costly. Newer instruments utilize embedded computing to improve data quality and provide more information in the field. Improved survey techniques allow data to be collected more efficiently and with less expense. Currently, I am developing new wireless sensor networks using MEMS accelerometers, miniature magnetometers, tunable GPR antennas, and solid state radiation detectors.
  2. Oftentimes, instruments are not properly calibrated before use. Since most geophysical inverse problems are ill-conditioned, proper instrument calibration plays a huge role in estimating the properties of interest. Furthermore, it is not uncommon for instruments to be operated outside their range of valid calibration, and some have mistakenly interpreted non-linear or hysteretic instrument response as geophysical anomalies. Sometimes calibrations do not account for the full variation in an instrument’s response. For instance, the response of GPR systems using ground-coupled antennas changes with varying ground coupling, and this effect is rarely accounted for.
  3. It is rarely possible to create an instrument that responds only to the quantity of interest. Therefore, we must understand what our instruments actually respond to, in addition to what they are supposed to measure. We need to know the sources of noise in our measurements (interference, geologic, environmental, electrical, digital, etc.). What are their statistics? How do they affect the processed data? Finally, not all instruments are created equal: there are usually differences between measurements made by equipment from different manufacturers that supposedly measure the same thing.
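As a toy illustration of the calibration issue in item 2, the sketch below fits a gain/offset calibration by least squares. The linear sensor model and its parameter values are made up for the example; a real instrument with a non-linear or hysteretic response is exactly the case where such a fit fails outside the calibrated range.

```python
import numpy as np

def fit_linear_calibration(raw, reference):
    """Least-squares fit of the model: reference ≈ gain * raw + offset."""
    A = np.column_stack([raw, np.ones_like(raw)])
    gain, offset = np.linalg.lstsq(A, reference, rcond=None)[0]
    return gain, offset

# Calibrate against known reference values spanning the instrument's valid range.
reference = np.linspace(0.0, 10.0, 11)
raw = (reference - 0.5) / 2.0          # simulated raw readings (true gain 2, offset 0.5)
gain, offset = fit_linear_calibration(raw, reference)
calibrated = gain * raw + offset       # recovers the reference values
```

Applying `gain` and `offset` to readings outside the span of `reference` is extrapolation, and any unmodeled non-linearity there shows up directly as a spurious anomaly in the data.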

Borehole Geophysics

  1. Existing down-hole tools do not always provide measurements that can be turned into reasonable estimates of a rock property (such as ore grade). New down-hole tools can be developed to answer specific questions (e.g. ore grade, Poisson's ratio in unconsolidated sediments). For instance, neutron activation tools can be tailored to measure the concentration of specific suites of elements, and sonic tools with a variable excitation frequency can be tuned to excite the desired modes in the borehole. By exciting only Stoneley waves, the attenuation of these waves can be measured more precisely and used to infer permeability.
  2. There are usually several methods for estimating a given rock property from well logs (e.g. density porosity, sonic porosity, Archie's Law). Each method carries its own assumptions, and sometimes it is necessary to use multiple methods to determine which assumptions are valid. Alternatively, new methods can be created with less constraining assumptions. For example, consider pump tests that are used to estimate the hydraulic properties of an aquifer. In such a test, the well is produced at a constant rate while the flow from a producing interval is measured. Producing a well at a constant rate with varying drawdown can be difficult or time consuming. An algorithm that accepts varying drawdown and pumping rates removes this difficulty.
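The density- and sonic-porosity methods mentioned in item 2 can be sketched as follows. Each assumes a known matrix and fluid response; the default values below are common textbook numbers for a clean, water-filled quartz sandstone and are only illustrative.

```python
def density_porosity(rho_bulk, rho_matrix=2.65, rho_fluid=1.0):
    """Porosity from the bulk-density log: (rho_ma - rho_b) / (rho_ma - rho_f).
    Densities in g/cc; defaults assume a quartz matrix and fresh water."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

def sonic_porosity(dt, dt_matrix=55.5, dt_fluid=189.0):
    """Porosity from the Wyllie time-average equation:
    (dt - dt_ma) / (dt_f - dt_ma), with slownesses in microseconds/ft."""
    return (dt - dt_matrix) / (dt_fluid - dt_matrix)

# For a clean 20%-porosity water-filled sandstone the two estimates agree;
# disagreement between them flags a violated assumption (gas, shale, etc.).
phi_density = density_porosity(rho_bulk=2.32)
phi_sonic = sonic_porosity(dt=82.2)
```

Comparing the two estimates is one practical way to test which set of assumptions holds in a given interval.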



Created by: coden. Last Modification: Friday 02 of January, 2009 16:04:43 CST by coden.