Overview
The archive includes:
- Phase picks for the analyzed seismic event (230115_210804_DAS.pha)
- HDF5 DAS data file corresponding to the studied seismic event (SR_DS_2023-01-15_21-06-51_UTC.h5)
- Coordinates of sensing points located along the surface loop (Coordinates_Loop.txt)
- Python processing script supporting data loading from .h5 files, waveform visualization in the time–frequency and frequency–wavenumber domains, and advanced signal processing methods including filtering, beamforming, and cross-correlation (load_plot_process_h5data.py)
The provided processing workflow exploits the three-dimensional configuration of the fiber-optic cable, including both the near-surface section and the downhole installation in the monitoring well. As described in the associated publication, the workflow is tailored to seismic monitoring applications, enabling wavefield analysis, estimation of source parameters (including wave-field directionality, moment magnitude, stress drop), and visualization of wave propagation across different array geometries.
Workflow Features
- Load and parse HDF5 DAS files using febus_optics_lib
- Slice and segment DAS data into well and surface loop sections
- Apply bandpass and f-k filtering
- Plot filtered sections and surface/well waveform data
- Compute beamforming backazimuth and incidence angle estimates
- Convert strain rate to ground acceleration
- Estimate source parameters (Mw, stress drop) using amplitude spectrum fitting
- Visualize correlation and lag matrices between channels
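One of the steps listed above, converting strain rate to ground acceleration, can be sketched with the standard plane-wave approximation a(t) ≈ -c · ε̇(t), where c is the apparent phase velocity along the fiber. This is a generic illustration, not necessarily the exact conversion implemented in the script:

```python
import numpy as np

def strain_rate_to_acceleration(strain_rate, c_app):
    """Convert DAS strain rate to ground acceleration under a
    plane-wave assumption: a(t) ~= -c_app * strain_rate(t),
    where c_app is the apparent phase velocity along the fiber (m/s)."""
    return -c_app * np.asarray(strain_rate)

# Example: 1 micro-strain/s at an apparent velocity of 2000 m/s
a = strain_rate_to_acceleration(np.array([1e-6]), 2000.0)
```

The apparent velocity depends on the incidence angle of the wavefield relative to the cable, so in practice it must be estimated (e.g., from the beamforming results) before the conversion.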
Dependencies
Make sure the following Python libraries are installed:
pip install obspy numpy scipy matplotlib febus-optics-lib
Input Files
- .h5 DAS data files (e.g., SR_DS_2023-01-15_21-06-51_UTC.h5)
- Assumes a fiber layout with horizontal (surface loop) and vertical (borehole) sections
- Refers to phase picks stored in 230115_210804_DAS.pha
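The surface-loop coordinates can be read with plain NumPy. The column layout below (x, y, elevation, whitespace-separated) is an assumption based on the Notes section and should be verified against the actual Coordinates_Loop.txt:

```python
import numpy as np

def load_coordinates(source):
    """Load sensing-point coordinates.
    Assumed format (verify against the actual Coordinates_Loop.txt):
    whitespace-separated x, y, elevation columns, one sensing point per row."""
    coords = np.loadtxt(source)
    return coords[:, 0], coords[:, 1], coords[:, 2]

# Usage with the archive file:
# x, y, z = load_coordinates("Coordinates_Loop.txt")
```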
How to Run
Set the main parameters at the top of the script:
DAS_DATA_pathname : full path to the folder containing the .h5 file to be loaded into memory
file2import : name of the file to be loaded into memory, e.g. 'SR_DS_2023-01-15_21-06-51_UTC.h5'
qdt : queried date-time around which the data segment to be processed is extracted, e.g. UTCDateTime('2023-01-15T21:08:03.000000Z')
querried_timespan : the data are segmented in time between qdt - querried_timespan[0] and qdt + querried_timespan[1] (in seconds)
querried_distspan : the data are loaded for this range of offsets (in meters); keep None to load the full dataset
The beamforming parameters and the source-parameter evaluation parameters define the ranges of tested values.
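An illustrative parameter block could look like the following. The variable names come from the list above; the path and span values are placeholders to be replaced with your own:

```python
from obspy import UTCDateTime

# Placeholders: adapt path and spans to your setup
DAS_DATA_pathname = "/path/to/data/"                    # folder containing the .h5 file
file2import = "SR_DS_2023-01-15_21-06-51_UTC.h5"        # file name within that folder
qdt = UTCDateTime("2023-01-15T21:08:03.000000Z")        # queried date-time
querried_timespan = (5.0, 25.0)    # seconds before/after qdt (illustrative values)
querried_distspan = None           # offset range in meters; None = full dataset
```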
Then simply run:
python load_plot_process_h5data.py
The script will:
- Load and preprocess the DAS data
- Apply filtering
- Plot waveforms and f-k domain
- Estimate slowness, moment magnitude, and stress drop
- Export results and plots
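The f-k filtering step applied above can be sketched as a minimal fan filter in the frequency-wavenumber domain: transform the channel-time section with a 2-D FFT, zero out components whose apparent velocity |f/k| falls below a cutoff, and transform back. This is a generic sketch, not the script's exact implementation:

```python
import numpy as np

def fk_filter(data, dx, dt, vmin):
    """Minimal f-k fan filter: zero out frequency-wavenumber components
    with apparent velocity |f/k| below vmin (m/s).
    data: 2-D array (n_channels, n_samples); dx in meters, dt in seconds."""
    nch, nt = data.shape
    spec = np.fft.fft2(data)
    k = np.fft.fftfreq(nch, d=dx)            # wavenumber axis (cycles/m)
    f = np.fft.fftfreq(nt, d=dt)             # frequency axis (Hz)
    kk, ff = np.meshgrid(k, f, indexing="ij")
    mask = np.abs(ff) >= vmin * np.abs(kk)   # keep |f/k| >= vmin
    return np.real(np.fft.ifft2(spec * mask))
```

A signal that is identical on all channels lives entirely at k = 0 and passes through unchanged, while slow, steeply dipping arrivals are suppressed.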
Output
- Time-domain and frequency-domain plots of waveforms
- Filtered waveform streams (e.g., BP, BP+FK, BP+FK_P, BP+FK_S)
- Correlation and lag matrices
- Estimated source parameters stored in RESULTS.npz
- Final beamforming/backazimuth results
- Summary plots
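The exported RESULTS.npz can be reopened with NumPy. The field names inside the archive are not documented here, so inspect them before relying on specific keys:

```python
import numpy as np

def inspect_results(path):
    """Load all arrays stored in a .npz results file into a dict.
    Field names inside RESULTS.npz are not documented here,
    so list the keys before relying on any of them."""
    with np.load(path) as data:
        return {name: data[name] for name in data.files}

# Usage: results = inspect_results("RESULTS.npz"); print(sorted(results))
```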
Notes
- Ensure that coordinates (x, y, elevation) are defined for each channel, since they are required for the delay computations.
- Parameters like slowness ranges, frequency bands, and source models can be adapted to specific sites or datasets.
- This script is optimized for the context of the study, but it can be adapted to other settings.
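The per-channel delays mentioned in the notes can be illustrated with a plane-wave model: given a slowness, backazimuth, and incidence angle, the delay at each sensing point follows from the dot product of its (x, y, elevation) position with the propagation direction. The sign and angle conventions below are one common choice and may differ from those used in the script:

```python
import numpy as np

def plane_wave_delays(coords, slowness, backazimuth_deg, incidence_deg):
    """Travel-time delays (s) of a plane wave at each channel, relative to
    the coordinate origin.
    coords: (n, 3) array of (x=east, y=north, elevation) in meters.
    slowness: total slowness in s/m.
    Conventions (assumed): backazimuth clockwise from north toward the
    source; incidence measured from vertical; wave arriving from below."""
    baz = np.deg2rad(backazimuth_deg)
    inc = np.deg2rad(incidence_deg)
    # Unit propagation vector pointing from the source toward the array
    direction = np.array([
        -np.sin(baz) * np.sin(inc),   # x (east)
        -np.cos(baz) * np.sin(inc),   # y (north)
        -np.cos(inc),                 # z (up)
    ])
    return slowness * np.asarray(coords) @ direction
```

Beamforming then scans candidate slowness/backazimuth/incidence values, shifts each channel by the predicted delay, and picks the combination that maximizes stack coherence.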