Australia's National Geospatial Reference System (NGRS) is a continually evolving system of infrastructure, data, software and knowledge. The NGRS serves the broader community by providing an accurate foundation for positioning, and consequently for all spatial data. The NGRS is administered by the Intergovernmental Committee on Surveying and Mapping (ICSM) and maintained by its Federal and State jurisdictions. Increasingly, the role of Global Navigation Satellite Systems (GNSS) in positioning has required the globalisation of national coordinate systems. In the early 1990s ICSM endorsed the adoption of the Geocentric Datum of Australia (GDA94), which was aligned to the International Terrestrial Reference Frame (ITRF) with a stated uncertainty of 30 mm horizontally and 50 mm vertically. Since that time, crustal deformation and the demand for higher accuracies have meant that GDA94 no longer adequately serves user requirements. The ITRF has continued to evolve in accuracy and distribution to the extent that it now requires very accurate modelling of linear and non-linear crustal deformation. Even the Australian plate, which has long been considered rigid, is now considered to be deforming at levels detectable by modern geodesy. Consequently, infrastructure development programs such as AuScope have been implemented to ensure that crustal deformation can be better measured. The AuScope program also aims to improve the accuracy of the ITRF by contributing to the next generation of the Global Geodetic Observing System (GGOS) in our region. This approach will ensure that the ITRF continues to evolve and that Australia's NGRS is integrally connected to it with equivalent accuracies. Ultimately this will remove the need for national reference systems, with a globally homogeneous and stable reference system (e.g. the ITRF) being far more beneficial to society. This paper reviews Australia's contribution to GGOS and how this impacts on positioning in Australia.
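The linear part of this deformation is commonly described as a rigid-plate rotation about an Euler pole, which gives each site a constant velocity used to propagate coordinates between epochs. A minimal sketch of that idea follows; the pole parameters and site coordinates are illustrative placeholders, not published plate-motion values.

```python
# Minimal sketch: propagating Cartesian coordinates to a new epoch with a
# rigid-plate (linear) velocity derived from an Euler pole. Pole values and
# site coordinates below are illustrative placeholders only.
import numpy as np

def euler_pole_velocity(site_xyz, pole_lat_deg, pole_lon_deg, rate_deg_per_myr):
    """Site velocity (m/yr) from a plate rotation: v = omega x r."""
    rate_rad_per_yr = np.radians(rate_deg_per_myr) / 1.0e6
    lat, lon = np.radians(pole_lat_deg), np.radians(pole_lon_deg)
    omega = rate_rad_per_yr * np.array([
        np.cos(lat) * np.cos(lon),
        np.cos(lat) * np.sin(lon),
        np.sin(lat),
    ])
    return np.cross(omega, site_xyz)

def propagate(site_xyz, velocity, epoch_from, epoch_to):
    """Linear propagation of Cartesian coordinates between epochs."""
    return site_xyz + velocity * (epoch_to - epoch_from)

# Example: an illustrative site in central Australia (Cartesian metres),
# propagated from epoch 1994.0 to 2020.0 under an assumed pole.
site = np.array([-4052052.0, 4212836.0, -2545105.0])
v = euler_pole_velocity(site, pole_lat_deg=32.0, pole_lon_deg=38.0, rate_deg_per_myr=0.63)
print(propagate(site, v, 1994.0, 2020.0))
```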
-
Government Geological Surveys study the Earth at the regional, province or national scale, and acquire vast volumes of technically complex data. These data must be high quality, fit for purpose, durable, and readily accessible and usable by industry. Increasingly, users require the geological information contained within the data as well as the data itself. High-performance computers facilitate a step-change in advanced processing and modelling of large, complex data, and will help Government deliver more sophisticated products to industry and researchers. Data enhancement and manipulation are no longer limited by the computational effort required, and there are no artificial limits to the size of the data or model, or the data resolution, that can be processed. Geoscience Australia is collaborating with the National Computational Infrastructure facility (NCI) at the Australian National University to develop advanced methods for extracting the maximum geological information from large data volumes. The new methods include: modelling of potential field data in spherical coordinates to create continental-scale reference models of density and magnetic susceptibility; inversion of magnetotelluric tensor data to a full 3D mesh of resistivities; and Monte Carlo inversion of AEM responses to assess the reliability and sensitivity of conductivity-depth images. These algorithms are being implemented in a new Virtual Geophysical Laboratory where government data and advanced processing methods are brought together in a single high-performance computing environment.
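The last of these lends itself to a compact illustration: an ensemble of layered conductivity models accepted or rejected against the data expresses how well each depth interval is constrained. Below is a minimal sketch of that Metropolis-style sampling, assuming a placeholder forward operator rather than an actual AEM response calculation.

```python
# Minimal sketch of Monte Carlo (Metropolis) sampling over a layered-earth
# conductivity model, illustrating how an ensemble can express the reliability
# of a conductivity-depth image. The forward model is a stand-in only.
import numpy as np

rng = np.random.default_rng(42)

def forward(log_cond):
    """Placeholder forward operator mapping layer log-conductivities to 'data'."""
    kernel = np.linspace(1.0, 0.2, log_cond.size)   # crude depth weighting
    return np.cumsum(kernel * np.exp(log_cond))

observed = forward(np.log(np.array([0.05, 0.2, 0.02])))   # synthetic truth
noise_sd = 0.05 * np.abs(observed) + 1e-3

def log_likelihood(log_cond):
    resid = (forward(log_cond) - observed) / noise_sd
    return -0.5 * np.sum(resid ** 2)

# Metropolis random walk over the model space
model = np.log(np.full(3, 0.1))
ll = log_likelihood(model)
samples = []
for _ in range(20000):
    proposal = model + rng.normal(scale=0.1, size=model.size)
    ll_prop = log_likelihood(proposal)
    if np.log(rng.uniform()) < ll_prop - ll:
        model, ll = proposal, ll_prop
    samples.append(np.exp(model))

samples = np.array(samples[5000:])                  # discard burn-in
print("posterior mean conductivity per layer:", samples.mean(axis=0))
print("posterior std per layer:", samples.std(axis=0))
```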
-
This paper analyses the magnetic field differences between the Gnangara and Gingin geomagnetic observatory sites. It was first presented as a poster at the XVth IAGA Workshop on Geomagnetic Observatory Instruments, Data Acquisition and Processing in San Fernando, Spain, 4-14 June 2012. It will be submitted as an extended abstract for the workshop proceedings volume.
-
The Petrel Sub-basin Marine Survey was undertaken in May 2012 by Geoscience Australia and the Australian Institute of Marine Science to support assessment of CO2 storage potential in the Bonaparte Basin. The aim of the sub-bottom profiling was to acquire high-resolution data to investigate regional seal breaches and potential fluid pathways. The sub-bottom profiler data were acquired aboard the AIMS RV Solander, a total of 51 lines and 654 line km. Acquisition employed a Squid 2000 sparker and a 24-channel GeoEel streamer. A group interval of 3.125 m and a shot interval of 6.25 m resulted in 6-fold stacked data. Record length was 500 ms, sampled every 0.25 ms. Rough sea conditions during the trade winds resulted in obvious relative motion between source and streamer. Multichannel seismic reflection processing compensated for most of the limitations of sparker acquisition. A front-end mute and band-pass filter removed low-frequency noise. Non-surface-consistent trim statics corrected for the relative motion of sparker and streamer, aligning reflections pre-stack and improving signal-to-noise ratio. Post-stack minimum entropy deconvolution both suppressed ghosting and enhanced high frequencies (>1000 Hz). Vertical resolution of better than 1 m allowed delineation of multiple episodes of channelling in the top 100 m of sediment. Imaging of small channels was improved by collapsing diffractions with finite difference migration.
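As a quick check, the stated geometry is consistent with the reported fold under the standard CMP fold relation; a short worked calculation follows.

```python
# Worked check of the stacking fold implied by the stated geometry, using the
# standard CMP fold relation: fold = channels * group_interval / (2 * shot_interval).
channels = 24
group_interval_m = 3.125
shot_interval_m = 6.25

fold = channels * group_interval_m / (2 * shot_interval_m)
print(fold)   # 6.0, consistent with the 6-fold stack reported above
```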
-
In most cases a single pixel in a satellite image contains information from more than one type of land cover. One challenge is to decompose a pixel with mixed spectral readings into a set of endmembers and estimate the corresponding abundance fractions. The linear spectral unmixing model assumes that the spectral reading of a single pixel is a linear combination of spectral readings from a set of endmembers. Most linear spectral unmixing algorithms rely on spectral signatures from endmembers in pre-defined libraries obtained from previous on-ground studies. Therefore, the applications of these algorithms are restricted to images whose extent and acquisition time coincide with those of the endmember library. We propose a linear spectral unmixing algorithm which is able to identify a set of endmembers from the actual image of the studied area. Existing spectral libraries are used as training sets to infer a model which determines the class labels of the derived image-based endmembers. The advantage of such an approach is that it is capable of performing consistent spectral unmixing in areas with no established endmember libraries. Testing has been conducted on a Landsat 7 ETM+ image subset of the Gwydir region acquired on 22 June 2008. Three land cover classes were specified for this test: bare soil, green vegetation and non-photosynthetic vegetation. A set of 150 endmember samples and a number of ground abundance observations were obtained from a corresponding field trip. The study successfully identified an endmember set from the image for the specified land cover classes. For most test points, the spectral unmixing and the estimated abundances are consistent with the ground validation data. From the 20th International Congress on Modelling and Simulation (MODSIM2013).
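The mixing model itself is a small constrained least-squares problem: the pixel spectrum is expressed as a non-negative combination of endmember spectra whose fractions sum to approximately one. A minimal sketch follows, using made-up endmember reflectances rather than values from any library.

```python
# Minimal sketch of the linear mixing model with a constrained least-squares
# abundance estimate (non-negative, approximately sum-to-one via an augmented
# row). Endmember spectra below are illustrative numbers only.
import numpy as np
from scipy.optimize import nnls

# Columns: bare soil, green vegetation, non-photosynthetic vegetation
# Rows: six reflective bands (illustrative reflectances)
endmembers = np.array([
    [0.12, 0.03, 0.09],
    [0.18, 0.05, 0.14],
    [0.25, 0.04, 0.22],
    [0.30, 0.45, 0.30],
    [0.38, 0.25, 0.42],
    [0.33, 0.12, 0.38],
])

# Synthetic mixed pixel: 50% soil, 30% green vegetation, 20% non-photosynthetic
pixel = 0.5 * endmembers[:, 0] + 0.3 * endmembers[:, 1] + 0.2 * endmembers[:, 2]

# Enforce sum-to-one softly by appending a heavily weighted row of ones.
delta = 100.0
E_aug = np.vstack([endmembers, delta * np.ones(endmembers.shape[1])])
p_aug = np.append(pixel, delta)

abundances, _ = nnls(E_aug, p_aug)
print(abundances)   # approximately [0.5, 0.3, 0.2]
```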
-
Atmospheric CO2 perturbations from simulated leaks have been used to determine the minimum statistically significant emissions that can be detected above background concentrations using a single atmospheric station. The study uses high-precision CO2 measurements from the Arcturus atmospheric monitoring station in the Bowen Basin, Australia. A statistical model of the observed CO2 signal was constructed, combining both a regression and a time series model. A non-parametric goodness-of-fit approach using the Kolmogorov-Smirnov (KS) test was then used to test whether simulated perturbations can be detected against the modelled expected value of the background for certain hours of the day and for particular seasons. The KS test calculates the probability that the modelled leak perturbation could be caused by natural variation in the background. Using pre-whitened data and selecting optimum test conditions, minimum detectable leaks located 1 km from the measurement station were estimated at 22 tonnes per day (tpd) for an area source of 100 m x 100 m and 14 tpd for a point source, at a KS cutoff defined by the formal p-value of 0.05. These are very large leaks located only 1 km from the station, and they carry a high false alarm rate of 56%. An alternative p-value could be chosen to reduce the false alarm rate, but then the minimum detectable leaks are larger. A long-term, single-station monitoring program that is unconstrained by prior information on the possible direction or magnitude of a leak, and based solely on detection of perturbations of CO2 due to leakage above a (naturally noisy) background signal, is likely to take one or more years to detect leaks of the order of 10 kt p.a. The sensitivity of detection of a leak above a background signal could be greatly improved through the installation of additional atmospheric monitoring stations or through greater prior knowledge about the location and size of a suspected leak.
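The detection step reduces to comparing pre-whitened background residuals with a window that includes a simulated perturbation. A minimal sketch of that two-sample KS comparison follows; the noise level and perturbation size are illustrative, not the Arcturus statistics.

```python
# Minimal sketch of the detection idea: compare pre-whitened background CO2
# residuals with a window containing a simulated leak perturbation, using a
# two-sample Kolmogorov-Smirnov test. All numbers are illustrative only.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

background = rng.normal(0.0, 1.5, size=2000)         # ppm residuals after model removal
perturbation_ppm = 1.0                                # assumed mean enhancement from a leak
leak_window = rng.normal(0.0, 1.5, size=200) + perturbation_ppm

stat, p_value = ks_2samp(background, leak_window)
detected = p_value < 0.05                             # formal p-value cutoff, as above
print(f"KS statistic={stat:.3f}, p={p_value:.2e}, detected={detected}")
```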
-
Reducing uncertainty at an early stage of resource development is a key requirement for attracting project finance. Risk analysis frameworks exist in the petroleum industry for quantifying risk and expected returns (Newendorp, 1975; Suslick et al., 2009). For deep Enhanced Geothermal Systems (EGS) and Hot Sedimentary Aquifers (HSA), there is limited knowledge and experience available from in-the-ground projects with which to make informed estimates of the likelihood of outcomes for incorporation into a risk analysis framework. Modelling that incorporates uncertainty analysis, based on a library of EGS and HSA geothermal reservoirs together with proxy data, could be used to develop a Geothermal Play Systems framework for assessing reservoir risk and ranking prospects.
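In such a framework, uncertain reservoir parameters are sampled and propagated into a probability of meeting a commercial threshold, which can then be used to rank prospects. A minimal sketch of that Monte Carlo step is given below; the distributions, the thermal-power proxy and the threshold are placeholders rather than calibrated play statistics.

```python
# Minimal sketch of propagating reservoir-parameter uncertainty into a
# probability of commercial success. Distributions and threshold are
# illustrative placeholders, not calibrated geothermal play statistics.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

temperature_c = rng.normal(180.0, 20.0, n)             # reservoir temperature (deg C)
flow_rate_kg_s = rng.lognormal(np.log(40.0), 0.5, n)   # production flow rate (kg/s)

# Crude thermal-power proxy: P [MW_th] ~ m_dot * c_p * (T - T_reject)
c_p = 4.2e-3          # MJ/(kg K), water
t_reject_c = 70.0
power_mw = flow_rate_kg_s * c_p * np.maximum(temperature_c - t_reject_c, 0.0)

threshold_mw = 20.0   # assumed minimum commercial thermal output
p_success = np.mean(power_mw > threshold_mw)
print(f"P(thermal power > {threshold_mw} MW) = {p_success:.2f}")
```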
-
Knowledge of the degree of damage to residential structures expected from severe wind is used to study the benefits of adaptation strategies developed in response to expected changes in wind severity due to climate change, to inform the insurance industry, and to provide emergency services with estimates of expected damage. A series of heuristic wind vulnerability curves for Australian residential structures has been developed for the National Wind Exposure project. In order to provide rigour to the heuristic curves and to enable quantitative assessment of adaptation strategies, Geoscience Australia, in collaboration with James Cook University and JDH Consulting, has commenced work to produce a simulation tool that quantitatively assesses damage to buildings from severe wind. The simulation tool accounts for variability in wind profile, shielding, structural strength, pressure coefficients, building orientation, building weights, debris damage and water ingress via a Monte Carlo approach. The software takes a component-based approach to modelling building vulnerability. It is based on the premise that overall building damage is strongly related to the failure of key connections and members. If these failures can be ascertained, and associated damage from debris and water penetration reliably estimated, scenarios of complete building damage can be assessed. This approach has been developed with varying degrees of rigour by researchers around the world and is best practice for the insurance industry. This project involves the integration of existing Australian work and the development of additional key components required to complete the process.
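At its core, the component-based Monte Carlo approach samples wind loads and connection capacities, flags the connections that fail, and summarises the result as a damage index. A minimal sketch of that loop follows; the load model and strength distributions are illustrative only, not the project's engineering models.

```python
# Minimal sketch of a component-based Monte Carlo wind damage simulation:
# sample wind loads and connection strengths, flag failures, and summarise a
# simple damage index. All distributions and coefficients are illustrative.
import numpy as np

rng = np.random.default_rng(7)
n_sims = 10_000
n_connections = 40                                   # e.g. roof connections in one house

gust_speed = rng.normal(55.0, 5.0, n_sims)           # site gust wind speed (m/s)
cp = 0.9                                             # assumed net pressure coefficient
q = 0.5 * 1.2 * gust_speed ** 2 * cp / 1000.0        # wind pressure (kPa)

tributary_area = 0.8                                 # m^2 per connection
load_kn = q[:, None] * tributary_area                # load on each connection (kN)

# Lognormal connection capacities, sampled per simulation and per connection
capacity_kn = rng.lognormal(np.log(2.0), 0.25, (n_sims, n_connections))

failed = load_kn > capacity_kn
damage_index = failed.mean(axis=1)                   # fraction of failed connections

print("mean damage index:", damage_index.mean())
print("P(more than half of connections fail):", (damage_index > 0.5).mean())
```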
-
Legacy product - no abstract available
-
Australian Organic Geochemistry Conference, Townsville, 12-14 July, 2000