  • In the last century, coastal erosion has caused significant damage to property and infrastructure in NSW. Extreme erosion can be caused by individual extreme storms, or by multi-storm 'clusters' which induce disproportionate erosion by limiting the time for inter-storm shoreline recovery. Statistical changes in storm wave properties also occur in association with seasonal and ENSO (El Niño-Southern Oscillation) cycles, and a number of studies suggest the latter affects the mean shoreline position and the likelihood of extreme erosion in NSW. Quantification of site-specific erosion hazards is necessary to support coastal management, with probabilistic or risk-based approaches being particularly attractive because they avoid reliance on arbitrarily chosen 'design' events. Callaghan et al. (2008) developed a methodology for probabilistic erosion hazard assessment on sandy shorelines, combining a probabilistic model of storm waves with a deterministic shoreline evolution model. The probability of the shoreline eroding past a given position (over a given timeframe) may be quantified, and epistemic uncertainties (e.g., those associated with our limited knowledge of the frequency of very large storms) are accounted for with bootstrapping. Herein we develop a probabilistic model of the storm wave climate at Old Bar, NSW, for use in a coastal erosion hazard assessment. A novel aspect of the model is that it accounts for the impacts of ENSO and seasonality on storm wave properties and on the frequency of storm events. We establish relationships between ENSO, seasonality and storm waves in the area using 30 years of wave observations, and extend the statistical framework of Callaghan et al. (2008) to account for these factors. This study is a key component of the Bushfire and Natural Hazards CRC Project "Resilience to clustered disaster events on the coast - storm surge". Reference: Callaghan et al. (2008) Statistical simulation of wave climate and extreme beach erosion. Coastal Engineering, 55, 375-390.
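
A minimal Python sketch of the general approach described in this abstract: storm arrivals drawn from a Poisson model whose rate varies with season and an ENSO index, plus a bootstrap over an observed storm catalogue to represent epistemic uncertainty. The functional forms, parameter values and synthetic wave-height catalogue are assumptions for illustration only, not the fitted Old Bar model.

```python
# Illustrative sketch only: seasonal- and ENSO-modulated storm frequency, and a
# bootstrap of an observed storm catalogue. All parameter values and functional
# forms are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(42)

def storm_rate(month, soi, base_rate=1.2, seasonal_amp=0.5, enso_coeff=-0.05):
    """Monthly storm-arrival rate (events/month) modulated by season and ENSO.

    month: 1-12; soi: Southern Oscillation Index (negative values ~ El Nino).
    The sinusoidal seasonal term and exponential SOI term are placeholders.
    """
    seasonal = 1.0 + seasonal_amp * np.cos(2 * np.pi * (month - 6) / 12)
    enso = np.exp(enso_coeff * soi)
    return base_rate * seasonal * enso

def simulate_storm_counts(months, soi_series):
    """Draw Poisson storm counts for each month of a synthetic record."""
    rates = np.array([storm_rate(m, s) for m, s in zip(months, soi_series)])
    return rng.poisson(rates)

# Bootstrap a (synthetic) catalogue of storm wave heights (m) to quantify
# uncertainty in an extreme-value statistic relevant to erosion hazard.
observed_hs = rng.gumbel(loc=4.0, scale=0.8, size=300)   # placeholder catalogue
boot_q99 = [np.quantile(rng.choice(observed_hs, size=observed_hs.size, replace=True), 0.99)
            for _ in range(1000)]
print("99th-percentile Hs, bootstrap 90% interval:",
      np.quantile(boot_q99, [0.05, 0.95]))
```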

  • Robust methods for generating spatially continuous data from point locations of physical seabed properties are essential for accurate biodiversity prediction. For many national-scale applications, spatially continuous seabed sediment data are typically derived from sparsely and unevenly distributed point locations, particularly in the deep ocean, due to the expense and practical limitations of acquiring samples. Methods for deriving spatially continuous data are usually data- and variable-specific, making it difficult to select an appropriate method for any given physical seabed property. To improve the spatial modelling of physical seabed properties, this study compared the results of a variety of methods for deriving spatially continuous mud content data for the southwest margin of Australia (523,400 km²), based on 177 sparsely and unevenly distributed point samples. For some methods, secondary variables were also used in the analysis, including bathymetry, distance-to-coast, seabed slope and geomorphic province (i.e., shelf, slope, etc.). Effects of sample density were also investigated. The predictive performance of the methods was assessed using 10-fold cross-validation and visual examination. A combined method (random forest and ordinary kriging: RFrf) proved the most accurate, with a relative mean absolute error (RMAE) up to 17% lower than the control. No threshold sample density was detected; as sample density increased, so did the accuracy of the method. The RMAE of the most accurate method is about 30% lower than that of the best methods in previous publications, further highlighting the robustness of the method developed in this study. The results show that significant improvements in the accuracy of spatially continuous seabed property data can be achieved through the application of an appropriate interpolation method. The outcomes of this study can be applied to the modelling of a wide range of physical properties for improved marine biodiversity prediction.
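
The combined random-forest-plus-ordinary-kriging approach with 10-fold cross-validation described in this abstract can be sketched as follows. This is a minimal illustration assuming hypothetical column names (`x`, `y`, `mud` and the covariates) and generic model settings; it is not the published workflow. It requires scikit-learn and pykrige.

```python
# Illustrative sketch only: random forest fitted on secondary variables, ordinary
# kriging of its residuals, evaluated by 10-fold cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold
from sklearn.metrics import mean_absolute_error
from pykrige.ok import OrdinaryKriging

def rf_plus_ok_cv(df, covariates=("bathy", "dist_coast", "slope"), target="mud"):
    """10-fold CV of RF-on-covariates plus ordinary kriging of RF residuals.

    df: pandas DataFrame with coordinate columns 'x', 'y', the target and covariates
    (hypothetical names). Returns a relative mean absolute error.
    """
    kf = KFold(n_splits=10, shuffle=True, random_state=0)
    preds = np.empty(len(df))
    for train_idx, test_idx in kf.split(df):
        train, test = df.iloc[train_idx], df.iloc[test_idx]
        rf = RandomForestRegressor(n_estimators=500, random_state=0)
        rf.fit(train[list(covariates)], train[target])
        resid = train[target] - rf.predict(train[list(covariates)])
        ok = OrdinaryKriging(train["x"].values, train["y"].values, resid.values,
                             variogram_model="spherical")
        resid_pred, _ = ok.execute("points", test["x"].values, test["y"].values)
        preds[test_idx] = rf.predict(test[list(covariates)]) + resid_pred
    mae = mean_absolute_error(df[target], preds)
    return mae / df[target].mean()   # relative MAE against the sample mean
```

Kriging the residuals adds back the spatial autocorrelation that the covariates alone cannot explain, which is the general rationale for combining a machine-learning predictor with a geostatistical method.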

  • This Agreements ontology is designed to model 'agreements', which are social contracts that include licences, laws, contracts, Memoranda of Understanding, standards and definitional metadata. Its purpose is to support data sharing by making explicit the relationships between agreements and data, and between agreements and Agents (people and organisations). Eventually it will also help with the interplay between different classes of agreements. We think of this ontology as a 'middle' ontology, that is, one which specialises well-known, abstract, upper ontologies and is able to be used fairly widely, but is expected to be used in particular contexts in conjunction with detailed, domain-specific, lower ontologies. We have tried to rely on existing agent, data manipulation, metadata and licence ontologies where possible. As such, we specialise the ORG and FOAF ontologies; the PROV ontology; the Dublin Core Terms RDF schema and DCAT ontology; and the ODRS vocabulary and Creative Commons RDF data models for those areas, respectively.
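
As a rough illustration of how data might be described using an agreements-style ontology alongside FOAF and Dublin Core Terms, the Python (rdflib) sketch below links a dataset to a licence treated as a kind of agreement, and to a publishing organisation. The namespace URI and the class name `agr:Licence` are hypothetical placeholders, not terms actually defined by this ontology.

```python
# Illustrative sketch only: expressing an agreement (a licence) over a dataset
# with rdflib. The 'agr' namespace and its class names are placeholders.
from rdflib import Graph, Namespace, URIRef, RDF
from rdflib.namespace import FOAF, DCTERMS

AGR = Namespace("http://example.org/agreements#")   # placeholder namespace

g = Graph()
g.bind("agr", AGR)
g.bind("foaf", FOAF)
g.bind("dct", DCTERMS)

licence = URIRef("http://example.org/licences/cc-by-4.0")
dataset = URIRef("http://example.org/datasets/seabed-sediments")
agency = URIRef("http://example.org/org/data-custodian")

# A licence is modelled as one kind of agreement governing use of a dataset.
g.add((licence, RDF.type, AGR.Licence))        # hypothetical class
g.add((dataset, DCTERMS.license, licence))
g.add((agency, RDF.type, FOAF.Organization))
g.add((dataset, DCTERMS.publisher, agency))

print(g.serialize(format="turtle"))
```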

  • The Paterson National Geoscience Agreement project is using a number of tools to better understand the time-space evolution of the northwest Paterson Orogen in Western Australia. One of these tools, 3D GeoModeller, is an emerging technology that constructs three-dimensional (3D) volumetric models based on a range of geological information. The Paterson project is using 3D GeoModeller to build geologically constrained 3D models for the northwest Paterson Orogen. This report documents the model-building capability and benefits of 3D GeoModeller and highlights some of the geological insights gained from the model-building exercise. The principal benefit of 3D GeoModeller is that it provides geoscientists with a rapid tool for testing multiple working hypotheses. The Cottesloe Syncline district was selected as the focus for a trial of the 3D GeoModeller software. The 3D model was built by members of the Paterson Project, as well as model-building specialists within Geoscience Australia. The resultant Cottesloe Syncline model, including two-dimensional sections, maps and images, was exported from 3D GeoModeller and transformed into Virtual Reality Modelling Language (VRML) format, enabling a wide audience to view the model using readily available software.

  • The term 'modelling while interpreting' refers to the use of 3D models during the interpretation of reflection seismic data in order to inform that process. Rather than using 3D models at the final stage of a project just to display results, new software tools are emerging that enable the development of 3D models in parallel with the seismic interpretation work. These tools provide additional means to help interpreters make informed decisions, such as where to pick basement, and to check the 3D integrity of their geological models. Applications of this new workflow are illustrated through a recently completed petroleum prospectivity assessment of the Capel and Faust frontier deep-water basins, located 800 km east of Brisbane. Geoscience Australia acquired 2D geophysical data across these basins in 2007 and subsequently mapped the complex distribution of sub-basins by integrating 2D time-domain seismic interpretation with 3D gravity modelling. Forward and inverse 3D gravity models were used to inform the seismic interpretation and test the seismic basement pick. The identification of basement was problematic due to a lack of wells and the likelihood that acoustic basement represented older sedimentary material intruded by igneous rocks. Sonobuoy refraction data were modelled to convert travel times to depth and to estimate densities. Modelling gravity while interpreting reflection seismic data improved confidence in the mapping of the extent and thickness of sediments in these basins, and has the potential to be used more widely in mapping projects to reduce exploration risk.
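
The logic of testing a basement pick against gravity data can be illustrated, in greatly simplified form, by the infinite-slab (Bouguer) approximation, in which a sediment layer of thickness h and density contrast Δρ relative to basement produces a gravity effect of 2πGΔρh. The Python sketch below is only a first-order sanity check of that reasoning; the study itself used full 3D forward and inverse gravity modelling, and the thickness and density values here are assumptions.

```python
# Toy illustration: gravity effect of a sedimentary layer via the infinite-slab
# (Bouguer) approximation, delta_g = 2*pi*G*delta_rho*h. Real workflows use full
# 3D forward and inverse modelling constrained by seismic interpretation.
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def slab_gravity_mgal(thickness_m, density_contrast_kgm3):
    """Gravity effect (mGal) of an infinite slab of given thickness and density contrast."""
    return 2 * np.pi * G * density_contrast_kgm3 * thickness_m * 1e5  # 1 m/s^2 = 1e5 mGal

# Assumed numbers: 3 km of sediments, 300 kg/m^3 less dense than basement.
print(f"{slab_gravity_mgal(3000.0, -300.0):.1f} mGal")   # roughly -38 mGal
```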

  • The sediment-hosted Nifty Cu deposit is located 450 km east of Port Hedland in the Yeneena Basin of the Paterson Orogen in Western Australia. It is hosted within interbedded black carbonaceous shales and dolomitised micrites of the Broadhurst Formation. The host rocks have been folded and metamorphosed to lower greenschist facies during the Miles Orogeny (see also Czarnota et al., this volume). Textural relationships of the ore to the host rock suggest syn-deformational (Miles Orogeny) timing of mineralisation (see also van der Wacht et al., this volume). Primary chalcopyrite preferentially replaces dolomitised micrite beds, occurs in black shales within the axial plane foliation, or occurs as breccia infill. The ore and silica-dolomite alteration envelopes trend from the keel of the Nifty Syncline up the steeply dipping limb of the fold. There are two high-grade ore trends (>1% Cu): one strikes NE-SW, parallel to the fold axis, and the other strikes N-S, across the axis of the fold. Given the inference that Nifty is a structurally controlled deposit that formed late in, or after, the establishment of the fold architecture, the question is why high-grade ore is located in the keel and towards one limb of the asymmetric Nifty Syncline. Assuming that post-folding dilation focussed the flow of mineralising fluid(s), 2D and 3D coupled deformation/fluid-flow simulations were carried out to examine why Nifty is in a syncline and what the controls on the high-grade ore trends may be.

2D models: The 2D model geometry consists of a three-layer stratigraphy folded into a series of asymmetric folds. The three-layer model represents the camp-scale lithostratigraphy, consisting of (i) a moderately competent and moderately permeable siltstone, (ii) a strong and permeable carbonate, and (iii) a weak and impermeable shale. Under contraction at hydrostatic pore pressure, this material layering focused fluid flow down the fluid-pressure gradient, from the hinge of the syncline and up the steeply dipping limb of the fold, driven by dilation higher up the limb. This dilation is a consequence of a shear band that developed along the shallowly dipping limb of the fold, above the competent carbonate unit, and intersected the steeply dipping limb of the syncline adjacent to the syncline hinge. Models run using the same geometry, but with the stratigraphy varied to the mine sequence of shale-carbonate-shale, showed focusing of fluid flow into anticlinal fold closures. This is a consequence of shear strain localisation below the competent carbonate unit and the intersection of the resultant shear band with the carbonate unit adjacent to the anticlinal fold closure. This scenario does not explain why Nifty is in a syncline. However, this model may explain why the Telfer ore deposit, hosted in sediments with a similar competency contrast to this model (i.e., a sandstone unit between two weak carbonate units), is situated in a dome fold closure adjacent to the steeply dipping limb of the fold. Other models run on symmetrical folds showed similar results to the two models outlined above; however, the shear bands in these models do not have preferentially shallowly dipping fold limbs to localise on.

3D models: A simple three-layer 3D model of the Nifty Syncline was constructed to examine the effects of (i) the nearby Vines Fault, which was active during the Miles Orogeny as a major dextral strike-slip fault, and (ii) the NW-SE directed Paterson Orogeny. Applying a dextral strike-slip velocity boundary condition parallel to the NNW orientation of the Vines Fault produced high-strain zones and associated dilation broadly coincident with the second direction of high-grade ore trends. Deformation under the Paterson stress field, i.e., perpendicular to the fold axis, resulted in shear strain localisation along inflections in the fold axis, away from regions of mineralisation...
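
The fluid-focusing argument above rests on Darcy flow directed down fluid-pressure gradients toward dilating (lower-pressure, higher-permeability) zones such as shear bands. The toy Python sketch below illustrates only that relationship; it is not the coupled deformation/fluid-flow code used in the study, and the permeability and pressure-gradient values are placeholders.

```python
# Toy illustration of Darcy's law: fluid flux is directed down the overpressure
# gradient, so flow focuses toward dilating, low-pressure, high-permeability
# zones. Values are placeholder assumptions.
import numpy as np

def darcy_flux(permeability_m2, viscosity_pas, grad_overpressure_pa_per_m):
    """Darcy flux (m/s) for a given permeability, fluid viscosity and overpressure gradient."""
    return -(permeability_m2 / viscosity_pas) * grad_overpressure_pa_per_m

# A dilating shear band (higher permeability, lower pore pressure) draws flow
# from the fold hinge: pressure decreases toward the band (negative gradient).
k_band, k_host = 1e-14, 1e-17          # permeability, m^2 (assumed)
mu = 1e-3                              # Pa.s, water-like fluid
grad = -5e3                            # Pa/m toward the band (assumed)
print("flux in shear band:", darcy_flux(k_band, mu, grad), "m/s")
print("flux in host rock :", darcy_flux(k_host, mu, grad), "m/s")
```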

  • The Tanami 3D model covers a 300x300 km area of the Tanami region, primarily in the Northern Territory but also extending into Western Australia. The model incorporates the whole of the crust from the topographic surface down to the Moho.

  • Describes the use of reactive transport geochemical modelling to predict the geophysical signatures of alteration. A comparison between the reactive transport results and the geophysical response above a known deposit is also described.

  • Tsunami inundation models are computationally intensive and require high-resolution elevation data in the nearshore and coastal environment. In general, this limits their practical application to scenario assessments at discrete communities. This paper explores the use of moderate-resolution (250 m) bathymetry data to support computationally cheaper modelling to assess nearshore tsunami hazard. Comparison with high-resolution models using the best available elevation data demonstrates that moderate-resolution models are valid at depths greater than 10 m in relatively low-sloping, uniform shelf environments; however, in steeper and more complex shelf environments they are only valid at depths of 20 m or greater. In contrast, arrival times show much less sensitivity to resolution. It is demonstrated that modelling using 250 m resolution data can be useful in assisting emergency managers and planners to prioritise communities for more detailed inundation modelling, by reducing uncertainty surrounding the effects of shelf morphology on tsunami propagation. However, it is not valid for modelling tsunami inundation.
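
The lower sensitivity of arrival times to bathymetric resolution is consistent with the long-wave speed relation c = sqrt(g h): travel time is an integral of 1/c along the path and is dominated by the deeper water where coarse and fine datasets agree. The short Python sketch below illustrates this arrival-time estimate for an assumed shoaling profile; it is not the propagation or inundation model used in the study.

```python
# Illustrative sketch: tsunami long-wave speed c = sqrt(g*h) and travel time
# across a bathymetry profile, t = integral of dx / c. The profile is assumed.
import numpy as np

g = 9.81  # m/s^2

def travel_time_seconds(distance_m, depth_m):
    """Travel time along a profile given distances (m) and corresponding water depths (m)."""
    speed = np.sqrt(g * np.asarray(depth_m, dtype=float))
    return np.trapz(1.0 / speed, distance_m)

# Assumed 100 km profile shoaling from 4000 m to 10 m depth.
x = np.linspace(0.0, 100e3, 1001)
h = np.linspace(4000.0, 10.0, 1001)
print(f"approximate travel time: {travel_time_seconds(x, h) / 60:.1f} minutes")
```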

  • Conductivity-depth estimates generated using the 1D Geoscience Australia layered earth inversion algorithm (GA-LEI) have been released to the public domain. The GA-LEI has been shown to provide useful mapping of subsurface conductivity features in the Paterson region, for example paleovalleys, unconformities and faults. GA-LEI interpretations have been supported by independent borehole conductivity logs and lithological drill-hole information. Geoscience Australia Record 2010/12, Geological and energy implications of the Paterson Province airborne electromagnetic (AEM) survey, Western Australia, summarises the AEM processing, inversion, interpretation and implications for mineral exploration using the 1D GA-LEI. There is an inherent assumption in the GA-LEI algorithm that the earth can be represented by a set of 1D layers which extend to infinite distance in the horizontal plane. This layered-earth assumption has some limitations, and has been demonstrated to create artefacts when applied to heterogeneous 3D geological features. 3D inversion methods can potentially overcome some of the limitations of 1D inversion methods, reducing the artefacts of a 1D earth assumption. However, 3D inversions require much greater computational resources than 1D methods because they have to solve many large systems of equations. In addition, a large sensitivity matrix is computed, which increases memory requirements, and the process must be repeated for multiple iterations. This computational expense has generally limited the application of 3D inversion to AEM datasets and restricted its practicality as a general mapping tool. The EMVision® inversion developed by TechnoImaging presents a method of running a 3D inversion with a runtime comparable to 1D inversion methods. The EMVision® algorithm uses a moving footprint to limit the number of data points needed as input to the inversion at any one location. A background conductivity model is chosen to represent the far-field response of the earth, and the data points within the AEM footprint are treated as anomalies with respect to the background. In 2010, Geoscience Australia decided that a comparison of the GA-LEI with the EMVision® inversion would be useful, both for geological interpretation and for assessing the benefits of 3D inversion of AEM data. A subset of the regional Paterson AEM dataset around the Kintyre uranium deposit was provided to TechnoImaging to create a 3D inversion using EMVision® software. The data subset was a combination of GA data and data owned by Cameco Corporation, and the cost of the inversion by TechnoImaging was shared by both parties. Under the terms of the agreement between Cameco Corporation and Geoscience Australia, there was a moratorium on data release until 2012.
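
The computational cost described in this abstract (repeatedly assembling a large sensitivity matrix and solving a large linear system at each iteration) is characteristic of regularised Gauss-Newton style inversions in general. The generic Python sketch below shows a single damped update step; it is not the GA-LEI or EMVision® algorithm, and `forward` and `jacobian` are placeholder callables standing in for an electromagnetic forward model and its sensitivities.

```python
# Generic illustration of one damped Gauss-Newton update, of the kind used in
# many EM inversions: build the sensitivity (Jacobian) matrix J, then solve a
# regularised normal-equations system for the model update. Not the GA-LEI or
# EMVision implementation; 'forward' and 'jacobian' are placeholders.
import numpy as np

def gauss_newton_step(model, data, forward, jacobian, damping=1e-2):
    """Return the updated model vector after one damped Gauss-Newton iteration."""
    residual = data - forward(model)                  # data misfit
    J = jacobian(model)                               # sensitivity matrix, shape (n_data, n_model)
    lhs = J.T @ J + damping * np.eye(model.size)      # regularised normal equations
    rhs = J.T @ residual
    return model + np.linalg.solve(lhs, rhs)
```

For a full 3D mesh, J has one column per model cell and one row per datum, which is why memory use and run time grow rapidly with model size; limiting the data and cells considered around each sounding, as a moving-footprint scheme does, is one way such costs can be kept closer to 1D levels.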