The Capel and Faust basins lie at water depths of 1,500-3,000 m, about 800 km east of Brisbane. Geoscience Australia began a petroleum prospectivity study of these remote frontier basins with the acquisition of 2D geophysical data (seismic reflection, refraction, gravity, magnetic and multi-beam bathymetry) across an area of 87,000 km2 during 2006/07. The approach mapped the complex distribution of sub-basins and determined sediment thickness by integrating traditional 2D time-domain seismic interpretation techniques with 3D mapping, visualisation and gravity modelling. Forward and inverse 3D gravity models were used to inform the seismic interpretation process and to test the seismic basement pick. The gravity models comprised three sediment layers with inferred average densities of 1.85, 2.13 and 2.31 t/m3 overlying a pre-rift basement of density 2.54 t/m3, itself considered to consist of older basin material intruded by igneous rocks. Travel times of interpreted seismic horizons were converted to the depth domain using a quadratic function derived from ray-tracing forward modelling of refraction data, supplemented by stacking interval velocities; densities for gravity modelling were inferred from the same velocity models. These models suggest that sediment of average velocity 3.5 km/s reaches a thickness exceeding 6 km in the northwest of the area, and they map the extent and depth of sediment in these basins for the first time. The results of the study confirm that sediment thickness in the Capel and Faust basins is sufficient in places for potential petroleum generation.
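The time-to-depth step above lends itself to a small illustration. The sketch below applies a quadratic travel-time-to-depth function of the form described; the coefficients are hypothetical placeholders, not the values derived from the refraction modelling.

```python
import numpy as np

# Quadratic time-to-depth conversion of the form described above:
# depth (km) = a*t^2 + b*t + c, with t the two-way travel time (s).
# Coefficients are illustrative placeholders, NOT the study's values.
A, B, C = 0.12, 1.4, 0.0

def twt_to_depth(twt_s):
    """Convert two-way travel time (s) to depth (km) via a quadratic."""
    twt_s = np.asarray(twt_s, dtype=float)
    return A * twt_s**2 + B * twt_s + C

# Example: depth-convert horizons picked at 1-4 s TWT
print(twt_to_depth([1.0, 2.0, 3.0, 4.0]))
```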
-
To follow
-
Geoscience Australia supports the exploration and development of offshore oil and gas resources, and the establishment of Australia's national representative system of marine protected areas, through the provision of spatial information about the physical and biological character of the seabed. Central to this approach is the prediction of Australia's seabed biodiversity from spatially continuous data on physical seabed properties. However, information on these properties is usually collected at sparsely distributed discrete locations, particularly in the deep ocean. Methods for generating spatially continuous information from point samples therefore become essential tools. Such methods are, however, often data- or even variable-specific, and it is difficult to select an appropriate method for any given dataset. Improving the accuracy of these physical data for biodiversity prediction, by searching for the most robust spatial interpolation methods for predicting physical seabed properties, is essential to better inform resource management practices. We therefore conducted a simulation experiment to compare the performance of statistical and mathematical methods of spatial interpolation, using samples of seabed mud content across the Australian margin. Five factors that affect the accuracy of spatial interpolation were considered: 1) region; 2) statistical method; 3) sample density; 4) search neighbourhood; and 5) sample stratification by geomorphic provinces. Bathymetry, distance-to-coast and slope were used as secondary variables. Here we report only the results of the comparison of 14 methods (37 sub-methods) using samples of seabed mud content, with five levels of sample density, across the southwest Australian margin. The results of the simulation experiment can be applied to spatial data modelling of various physical parameters in different disciplines and are relevant to a variety of resource management applications for Australia's marine region.
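As a minimal sketch of how one such comparison can be scored, the following evaluates the inverse-distance-squared control under 10-fold cross-validation using the relative mean absolute error. The data are synthetic and the helper names are ours; this is not the study's code.

```python
import numpy as np
from sklearn.model_selection import KFold

def ids_predict(xy_train, z_train, xy_test, power=2.0):
    """Inverse distance squared (IDS) prediction at the test locations."""
    d = np.linalg.norm(xy_test[:, None, :] - xy_train[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)            # guard against zero distances
    w = 1.0 / d**power
    return (w @ z_train) / w.sum(axis=1)

def cv_rmae(xy, z, predictor, n_splits=10, seed=0):
    """10-fold cross-validated relative mean absolute error (%)."""
    errors = []
    for train, test in KFold(n_splits, shuffle=True, random_state=seed).split(xy):
        pred = predictor(xy[train], z[train], xy[test])
        errors.append(np.abs(pred - z[test]))
    return 100.0 * np.concatenate(errors).mean() / z.mean()

# Synthetic stand-in for mud-content samples (easting, northing -> mud %)
rng = np.random.default_rng(42)
xy = rng.uniform(0.0, 100.0, size=(500, 2))
z = 50.0 + 20.0 * np.sin(xy[:, 0] / 15.0) + rng.normal(0.0, 5.0, 500)
print(f"IDS RMAE: {cv_rmae(xy, z, ids_predict):.1f}%")
```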
-
In response to the devastating Indian Ocean Tsunami (IOT) of 26 December 2004, Geoscience Australia (GA) developed a framework for tsunami risk modelling, and the outputs of this methodology have been used by emergency managers throughout Australia. For GA to be confident in the information being provided to the various stakeholders, validation of the model and methodology is required. While the huge loss of life from the tsunami was tragic, the IOT provided a unique opportunity to record the impact of a tsunami on the coast of Western Australia. Eight months after the tsunami, a post-disaster survey was conducted at various locations along the coast, and maximum run-up was determined from direct observational evidence or anecdotal accounts. In addition, tide gauges located in harbours along the coast recorded the tsunami, providing a time-series record of the wave heights and frequency of the event. This study employs GA's tsunami hazard modelling methodology to simulate a tsunami scenario based on the source parameters of the Boxing Day earthquake of 2004. The model results are compared with observational evidence from satellite altimetry, inundation surveys and tide gauge data for Geraldton, a community on the Western Australian coast. Results show that the tsunami model provides good estimates of wave height in deep water and of run-up in inundated areas, and, importantly, it matches the timing of the first wave arrivals. However, the model fails to reproduce the time series of wave heights observed by a tide gauge in Geraldton harbour. It does, however, replicate the occurrence of a late-arriving (16 hours after first arrival) packet of high-frequency waves. This observation is encouraging, since this particular wave packet has been noted elsewhere in the Indian Ocean and caused havoc in harbours many hours after the initial waves had arrived and dissipated.
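One simple, generic way to quantify the arrival-timing agreement reported above is to cross-correlate the modelled and observed gauge records. The sketch below uses synthetic series and is not GA's validation code.

```python
import numpy as np

def arrival_lag_minutes(observed, modelled, dt_minutes=1.0):
    """Timing offset between two gauge records via the lag maximising
    their cross-correlation. Positive = model arrives early."""
    obs = observed - observed.mean()
    mod = modelled - modelled.mean()
    xcorr = np.correlate(obs, mod, mode="full")
    return (np.argmax(xcorr) - (len(mod) - 1)) * dt_minutes

# Synthetic 1-minute records (metres above still-water level)
t = np.arange(24 * 60)                            # 24 h at 1-min sampling
observed = 0.5 * np.exp(-((t - 300) / 60.0)**2)   # peak ~5 h into record
modelled = 0.5 * np.exp(-((t - 290) / 60.0)**2)   # model ~10 min early
print(arrival_lag_minutes(observed, modelled))    # ~10.0
```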
-
This Agreements ontology is designed to model 'agreements': social contracts that include licenses, laws, contracts, Memoranda of Understanding, standards and definitional metadata. Its purpose is to support data sharing by making explicit the relationships between agreements and data, and between agreements and Agents (people and organisations). Eventually it will also help with the interplay between different classes of agreements. We think of this ontology as a 'middle' ontology: one that specialises well-known, abstract, upper ontologies and can be used fairly widely, but is expected to be used in particular contexts in conjunction with detailed, domain-specific, lower ontologies. We have tried to rely on existing agent, data manipulation, metadata and licence ontologies where possible. As such, we specialise the ORG and FOAF ontologies; the PROV ontology; the Dublin Core Terms RDF schema and DCAT ontology; and the ODRS vocabulary and Creative Commons RDF data models for those areas, respectively.
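A hypothetical fragment, built with rdflib, of what such a specialisation might look like: the namespace URI and class names are placeholders for illustration, and subclassing prov:Entity is one plausible modelling choice, not a statement of the published ontology.

```python
from rdflib import Graph, Literal, Namespace, OWL, RDF, RDFS

# Hypothetical namespace; the published ontology's URI will differ.
AGR = Namespace("http://example.org/agreements#")
PROV = Namespace("http://www.w3.org/ns/prov#")

g = Graph()
g.bind("agr", AGR)
g.bind("prov", PROV)

# One plausible choice: agreements as specialisations of prov:Entity,
# so PROV's attribution and derivation relations apply to them directly.
g.add((AGR.Agreement, RDF.type, OWL.Class))
g.add((AGR.Agreement, RDFS.subClassOf, PROV.Entity))
g.add((AGR.License, RDFS.subClassOf, AGR.Agreement))
g.add((AGR.License, RDFS.label, Literal("License")))

print(g.serialize(format="turtle"))
```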
-
Machine learning methods such as random forest (RF) have shown superior performance in various disciplines, but had not previously been applied to the spatial interpolation of environmental variables. In this study, we compared the performance of 23 methods, including RF, support vector machine (SVM), ordinary kriging (OK), inverse distance squared (IDS), and their combinations (i.e., RFOK, RFIDS, SVMOK and SVMIDS), using mud content samples from the southwest Australian margin. We also tested the sensitivity of the combined methods to the input variables and the accuracy of averaging the predictions of the most accurate methods. The accuracy of the methods was assessed using 10-fold cross-validation. The spatial patterns of the predictions of the most accurate methods were also visually examined for their validity. This study confirmed the effectiveness of RF, especially in combination with OK or IDS, and also confirmed the sensitivity of RF and its combined methods to the input variables. Averaging the predictions of the most accurate methods showed no significant improvement in predictive accuracy. Visual examination proved to be an essential step in assessing the spatial predictions. This study has opened an alternative source of methods for the spatial interpolation of environmental properties.
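The combined methods pair a learner on the secondary variables with an interpolator applied to its residuals. Below is a minimal sketch of the RFOK idea (random forest plus ordinary kriging of the RF residuals) using scikit-learn and PyKrige; the function and variable names are ours, and the hyper-parameters are illustrative, not those of the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from pykrige.ok import OrdinaryKriging

def rfok_predict(x, y, covars, z, xq, yq, covars_q):
    """RFOK sketch: a random forest fitted on the secondary variables,
    plus ordinary kriging of the RF residuals at the sample locations.
    Names and hyper-parameters are illustrative, not the study's."""
    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    rf.fit(covars, z)
    residuals = z - rf.predict(covars)
    ok = OrdinaryKriging(x, y, residuals, variogram_model="spherical")
    kriged_resid, _ = ok.execute("points", xq, yq)
    return rf.predict(covars_q) + kriged_resid
```

RFIDS, SVMOK and SVMIDS presumably follow the same residual-correction pattern, with the learner or the interpolator swapped out.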
-
In this study, we conducted a simulation experiment to identify robust spatial interpolation methods using samples of seabed mud content from the Geoscience Australia Marine Samples database. Because of noise in the samples, criteria were developed and applied for data quality control. Five factors that affect the accuracy of spatial interpolation were considered: 1) region; 2) statistical method; 3) sample density; 4) search neighbourhood; and 5) sample stratification. Bathymetry, distance-to-coast and slope were used as secondary variables. Ten-fold cross-validation was used to assess prediction accuracy, measured using mean absolute error, root mean square error, relative mean absolute error (RMAE) and relative root mean square error (RRMSE). The effects of these factors on prediction accuracy were analysed using generalised linear models. Prediction accuracy depends on the method, sample density, sample stratification, search window size, data variation and study region. No single method was consistently superior across all scenarios. Three sub-methods were more accurate than the control (inverse distance squared) in the north and northeast regions, and 12 sub-methods were more accurate in the southwest region. A combined method, random forest and ordinary kriging (RFOK), proved the most robust, based on accuracy and on visual examination of the prediction maps. This method is novel, with an RMAE up to 17% lower than that of the control. The RMAE of the best method is 15% lower in two regions, and 30% lower in the remaining region, than that of the best methods in previously published studies, further highlighting the robustness of the methods developed. The outcomes of this study can be applied to the modelling of a wide range of physical properties for improved marine biodiversity prediction. The limitations of this study are discussed, and a number of suggestions are provided for further studies.
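The four accuracy measures reduce to two base errors and their relative forms. A minimal sketch, assuming the common convention that the relative measures are normalised by the observed mean:

```python
import numpy as np

def accuracy_metrics(obs, pred):
    """MAE, RMSE and their relative forms, expressed as a percentage of
    the observed mean (an assumed, common normalisation)."""
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    mae = np.mean(np.abs(pred - obs))
    rmse = np.sqrt(np.mean((pred - obs) ** 2))
    return {"MAE": mae, "RMSE": rmse,
            "RMAE_%": 100.0 * mae / obs.mean(),
            "RRMSE_%": 100.0 * rmse / obs.mean()}
```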
-
For many basins along the western Australian margin, knowledge of basement and crustal structure is limited, yet both play an important role in controlling basin evolution. To provide new insight into these fundamental features of a continental margin, we present the results of process-oriented gravity modelling along a NW-SE profile across the Browse Basin through the Brecknock field. Process-oriented gravity modelling considers the rifting, sedimentation and magmatism that led to the present-day gravity field. By backstripping the sediment load under different isostatic assumptions (i.e. a range of flexural rigidities), the crustal structure associated with rifting can be inferred. Combining the gravity anomalies caused by rifting and sedimentation, and comparing them with the observed gravity, provides insight into the presence of magmatic underplating, the location of the continent-ocean boundary and the thermal history of a margin. For an effective elastic thickness of 25 km, backstripping syn- and post-rift sediments (Jurassic and younger) along the Browse Basin profile suggests moderate Jurassic stretching (beta = 1-2) and shows that rifting and sedimentation generally explain the observed free-air gravity signature. The gravity fit is reasonable for most of the Scott Plateau and Caswell Sub-basin, but over the Leveque Shelf and Wilson Spur the predicted gravity is less than observed and the predicted Moho is shallower than suggested by seismic refraction data. These misfits suggest the presence of magmatic underplating beneath the Leveque Shelf and the outermost parts of the basin, an inference that has mixed support from refraction and crustal-scale seismic reflection data.
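Two quantities central to this workflow are easy to compute directly: the flexural rigidity implied by a given effective elastic thickness, and the crustal thinning implied by a stretching factor beta. The sketch below uses standard textbook parameter values, which are assumptions rather than the paper's:

```python
# Textbook defaults for Young's modulus (Pa) and Poisson's ratio;
# these are assumptions, not the paper's parameter values.
E, NU = 70e9, 0.25

def flexural_rigidity(te_km):
    """Flexural rigidity D = E*Te^3 / (12*(1 - nu^2)), in N m."""
    te_m = te_km * 1e3
    return E * te_m**3 / (12.0 * (1.0 - NU**2))

def stretched_crust_km(tc_km=35.0, beta=1.5):
    """Crustal thickness after uniform (McKenzie) stretching by factor beta."""
    return tc_km / beta

print(f"D(Te=25 km) = {flexural_rigidity(25.0):.2e} N m")
print(f"35 km crust stretched by beta=2: {stretched_crust_km(beta=2.0):.1f} km")
```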
-
The inversion analyses presented in our paper, and now extended in this Reply, were ultimately only one part of the AEM system selection process for the BHMAR project. Both the Derivative and the Inversion analyses are by nature theoretical, and it is impossible, in a theoretical analysis, to capture all of the aspects relevant to real surveys with little margin for error and politically constrained time frames. In reality, neither the Derivative nor the Inversion analysis provided the degree of certainty required (by the project manager and client) to ascertain whether any of the candidate AEM systems could map the key managed aquifer recharge (MAR) targets recognised in the study area. Consequently, a decision was made to acquire data over a test line with the two systems (SkyTEM and TEMPEST) that performed best in the Derivative and Inversion analysis studies. This approach was vindicated, with distinctly different performance observed between the two systems, especially when compared against borehole and ground geophysical and hydrogeological data over known targets. Data were inverted both with the contractors' software and with reference software common to all systems, and the results were compared. Ultimately, it was the test line, particularly in the near-surface (top 20 metres), that made the SkyTEM system stand out as the best system for the particular targets in the project area. SkyTEM mapped the key multi-layered hydrostratigraphy and the water-quality variability in the key aquifer that defined the MAR targets, although the TEMPEST system performed better at depths exceeding 100 metres. Importantly, the SkyTEM system also mapped numerous subtle fault offsets in the shallow near-surface. These structures were critical to mapping recharge and inter-aquifer leakage pathways. Further analysis has demonstrated that selection of the most appropriate AEM system and inversion can result in order-of-magnitude differences in estimates of potential groundwater resources. The acquisition of SkyTEM data was an outstanding success, demonstrating the capability of AEM systems to provide high-resolution data for the rapid mapping and assessment of groundwater and strategic aquifer storages in Australia's complex and highly salinised floodplain environments. The SkyTEM data were used successfully to identify 14 major new groundwater targets and multiple MAR targets, and these have been validated by an extensive drilling program (Lawrie et al., 2012a-e). Increasingly, the demand from clients for higher certainty in project decision-making, and for quantified errors, will see the development of new comparative system-analysis approaches such as the Inversion analysis approach documented in our initial paper. Ultimately, system fly-offs are likely in high-profile projects where budgets permit.
-
Increasing knowledge of ocean current patterns in the Torres Strait region is of direct interest to Indigenous communities and to industries such as fisheries and shipping that operate in the region. Ocean circulation in Torres Strait influences nearly all aspects of the ecosystem, including sediment transport and turbidity patterns, primary production in the water column and bottom sediments, and recruitment patterns for organisms with pelagic phases in their life cycles. This study is the first attempt to describe the water circulation and transport patterns across Torres Strait and the adjacent gulfs and seas on time scales from hours to years. It also provides a framework for an embedded model describing sediment transport processes (described in Margvelashvili and Saint-Cast, 2006). The circulation model incorporated realistic atmospheric and oceanographic forcing, including winds, waves, tides, and large-scale regional circulation taken from global model outputs. Simulations covered a hindcast period of eight years, allowing the tidal, seasonal and interannual flow characteristics to be investigated. Results demonstrated that instantaneous current patterns were strongly dominated by the barotropic tide and its spring-neap cycle. However, longer-term transport through Torres Strait was mainly controlled by seasonal winds, which switch from north-westerly monsoon winds in summer to south-easterly trades in winter. Model results were relatively insensitive to internal model parameters; model performance, however, was strongly dependent on the quality of the forcing fields. For example, the prediction of low-frequency inner-shelf currents improved substantially when temperature and salinity forcing based on average seasonal climatologies was replaced by forcing from global model outputs. Uncertainties in the tidal component of the forcing also limited model skill, particularly for predictions west of Cape York, which were strongly dependent on the sea levels imposed along the open boundary in the Gulf of Carpentaria.