External Publication
-
Demonstrates the application of gamma-ray spectrometry and DEM modelling to mapping regolith materials and predicting salt stores.
-
Article
-
The oil and gas exploration and development industry is a significant Australian industry. In 2000 the value of oil and gas produced was $10.5 billion, meaning that Australia remained more than self-sufficient in petroleum, contributing to economic activity and avoiding the balance-of-payments pressure that importing that amount of petroleum would represent. There is thus an incentive to maintain a healthy petroleum exploration and production industry. R&D for the upstream petroleum industry, however, needs to be targeted to the requirements of the differing facets of the industry under the diverse conditions in which the industry operates or could operate. These conditions include changes in oil prices and perceptions of prospectivity, uncertain access to gas markets, and the effects of international agreements such as the United Nations Framework Convention on Climate Change. Different petroleum companies also have differing exploration and production portfolios and different needs. Petroleum service industry companies try to meet the industry's needs. Governments have their own goals in promoting and regulating the industry and derive considerable revenues from economic rent applied to reserves held by the Crown. In this context, a range of scenarios was considered in a planning process prioritising future needs for petroleum R&D in Australia: two groups of senior petroleum industry, research and government representatives carried out scenario planning workshops in 1998 and 1999 to define scenarios and associated R&D priorities, assisting in planning and identifying opportunities for petroleum R&D. The results of this study highlight core areas of R&D that are required under most of the scenarios; these are considered the highest priority and high priority areas.
Given the long time frame (in the order of 10 years) needed to develop and maintain R&D capability, this highlights for government, academia and industry the sustained effort needed to develop and maintain capability, particularly in these core areas of R&D. In 1998 and 1999, when the workshops that formed the basis of this study were undertaken, Australia was arguably in the 'low oil and gas price' scenario. This scenario puts an onus on government to support regional studies to promote exploration and most priority areas of petroleum R&D. Under this scenario, support from industry is aimed substantially at reducing cost. Although oil prices have since increased, coincident increases in stock market pressure for competitive profits from the industry have arguably left the industry in 2001 still in the low oil and gas price scenario. Thus there remains a strong need to maintain a local petroleum R&D capability to meet Australia's needs.
-
The next decade promises an exponential increase in volumes of open data from Earth observing satellites (EOS). The ESA Sentinels, the Japan Meteorological Agency's Himawari 8/9 geostationary satellites, and various NASA missions, to name just a few, will produce petabyte-scale datasets of national and global significance. If we are to cope with this deluge of data we must embrace the paradigm shift that is 'big data'. This paradigm shift requires a fundamental change in the way we manage and interact with our data, from traditional ad-hoc and labour-intensive methods to new High Performance Data (HPD) models in which data are well organised and co-located with High Performance Computing (HPC) facilities: we now take the compute power and algorithms to the data instead of downloading the data to our own computers. To meet this challenge Geoscience Australia (GA) has developed the Australian Geoscience Data Cube (AG-DC), hosted at the National Computational Infrastructure (NCI). The AG-DC is a data management system for large-scale multi-dimensional data which allows efficient spatial and temporal analyses of continental-scale geospatial datasets, including those produced by EOS. Initial work on the AG-DC has focused on developing a proof of concept using the 25-year archive of calibrated Landsat data to demonstrate a new way of interacting with large volumes of geoscientific data to derive valuable information in a timely fashion. The AG-DC is now being further developed through a collaboration involving GA, CSIRO and the NCI to fully realise the vision of an integrated and operational HPD infrastructure that will enhance Australia's ability to maximise the value and impact of geoscience data to meet the social, economic and environmental challenges we face now and into the future.
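The kind of analysis a data cube enables can be sketched in miniature: pixel-aligned observations stacked along a time axis turn temporal statistics into simple array reductions. This is not the AG-DC API; the array shapes, values and the cloud-mask step below are invented purely for illustration.

```python
import numpy as np

# Hypothetical illustration of the data-cube idea: a (time, y, x) stack of
# pixel-aligned reflectance observations.
rng = np.random.default_rng(1)
cube = rng.uniform(0.0, 1.0, size=(25, 4, 4))  # 25 acquisitions over a 4x4 tile
cube[3, :, :] = np.nan                         # e.g. one cloud-masked scene

# Per-pixel temporal median, ignoring masked acquisitions -- a typical
# continental-scale composite computed as a single reduction over the time axis.
median_composite = np.nanmedian(cube, axis=0)  # shape (4, 4)
```

Because the data are already organised this way and co-located with compute, the same reduction scales from a 4x4 toy tile to a continental Landsat archive without changing the analysis code.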
-
In several Australian regions severe wind is projected to increase in severity as a result of climate change. This poses problems for existing buildings that are already structurally substandard under the present climate and represent a high community risk. An increased likelihood of severe wind gusts will greatly exacerbate this, as damage increases very sharply with increasing wind speed. Increasing wind hazard also presents challenges for regulators who set the design standards for future building construction. Key to both adapting legacy structures and constructing future infrastructure compatible with future hazard is a reliable means of quantifying the benefits of adaptation strategies. This presentation describes work led by Geoscience Australia in collaboration with James Cook University and JDH Consulting. With funding contributions from the Federal Department of Climate Change, a simulation tool is being developed and refined that quantitatively assesses damage to specific building systems as a result of severe wind exposure. The simulation tool accounts for variability in wind profile, shielding, structural strength, pressure coefficients, building orientation, building component weights, debris damage and water ingress via a Monte Carlo simulation approach. The software takes a component-based approach to modelling building vulnerability, based on the premise that overall building damage is strongly related to the failure of key connections and members. If these failures can be ascertained, and the associated damage from debris and water penetration reliably estimated, scenarios of complete building damage can be assessed and quantified in repair terms. Further, the building elements primarily responsible for failure can be identified, and a range of adaptation measures simulated to quantitatively assess the benefits of structural and architectural changes now and into the future.
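The Monte Carlo idea can be illustrated with a deliberately simplified sketch (this is not the Geoscience Australia tool): connection capacity and shielding are drawn at random for each trial, and the failure fraction at a given gust speed is estimated by counting trials in which the effective gust exceeds the sampled capacity. All distributions and parameter values here are illustrative assumptions.

```python
import random

def simulate_failure_fraction(gust_speed, n_trials=20_000, seed=1):
    """Estimate the fraction of simulated buildings whose key roof
    connection fails at a given gust speed (m/s). Toy model only."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        capacity = rng.gauss(60.0, 8.0)     # gust speed at which the connection fails
        shielding = rng.uniform(0.85, 1.0)  # upwind shielding reduces the effective gust
        if gust_speed * shielding > capacity:
            failures += 1
    return failures / n_trials
```

Because damage rises very sharply with wind speed, a modest increase in gust likelihood produces a disproportionate increase in expected failures; an adaptation measure such as a strengthened connection can be compared in the same framework simply by shifting the capacity distribution.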
-
The IAG (International Association of Geodesy) Working Group (WG) on 'Regional Dense Velocity Fields' aims at densifying the ITRF (International Terrestrial Reference Frame) and creating a dense velocity field based on regional and global GNSS networks. With the goal of generating a high-quality solution for a core network, several newly reprocessed global and regional cumulative position and velocity solutions were submitted to the WG. In order to reach a consensus on discontinuity epochs for stations common to several networks (an issue which was problematic in previous submissions), the new submissions were restricted to the core networks over which the analyst has full control, so that ITRF2008 discontinuities could be applied. The 3D-RMS of the agreement of the new solutions with the ITRF2008 (after outlier rejection) varies between 0.6 and 1.1 mm/yr; it is extremely good for some solutions, while others still require more iterations to reach the required level of precision. Generally the cause of these disagreements has been identified, and they often originate in the use of different data time spans in the ITRF2008 and the submitted solution. As a next step, the WG expects to generate and use a discontinuity database complementing the ITRF2008 one and to identify and resolve the sources of disagreement. In addition, several of the regional solutions will be reprocessed to embed the regional network in a global network and reduce the error induced by the network effect. More details on the WG are available from http://epncb.oma.be/IAG/.
-
Spatial interpolation methods have been applied in many disciplines. Many factors affect the performance of the methods, but there are no consistent findings about their effects. In this study, we focus mainly on comparative studies in the environmental sciences to assess the performance of spatial interpolation methods and to quantify the impacts of data properties on that performance. Two new measures are proposed to compare the performance of methods applied to multiple variables with different units/scales. A total of 53 comparative studies were assessed, and the performance of the 61 methods/sub-methods compared in these studies is analysed. The impacts of sample density, data variation, and sampling design on the estimations of 32 methods are quantified using data derived from their application to 80 variables. Inverse distance weighting (IDW), ordinary kriging (OK), and ordinary co-kriging (OCK) are the most frequently used methods. Data variation is a dominant factor and has significant impacts on the performance of the methods. As the variation increases, the accuracy of all methods decreases, and the magnitude of the decrease is method-dependent. Gradient plus inverse distance squared (GIDS), OCK and regression residual kriging (RK-C) are less sensitive to data variation. An irregularly-spaced sampling design might improve the accuracy of estimation. The effect of sampling density on the performance of the methods is found not to be significant. The implications of these findings are discussed.
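As a concrete reference point, inverse distance weighting, the most frequently used method in the studies above, estimates the value at an unsampled location as a distance-weighted average of the observations, with weights 1/d^p. A minimal 2D sketch (function and variable names are our own; p = 2, the "inverse distance squared" case, is the default):

```python
import math

def idw(points, values, query, power=2.0, eps=1e-12):
    """Inverse distance weighting: weighted average of `values` at sampled
    `points`, with weights 1 / distance**power from the `query` location."""
    num = 0.0
    den = 0.0
    for (x, y), v in zip(points, values):
        d = math.hypot(x - query[0], y - query[1])
        if d < eps:              # query coincides with a sample point
            return v
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den
```

Note that IDW is an exact interpolator (it reproduces the data at sample points) and its estimates are bounded by the observed minimum and maximum, which partly explains its sensitivity to data variation reported above.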
-
The Australian Government formally releases new offshore exploration areas at the annual APPEA conference. This year, thirty-one areas plus two special areas in five offshore basins are being released for work program bidding. Closing dates for bid submissions are either six or twelve months after the release date, i.e. 3 December 2009 or 29 April 2010, depending on the exploration status of these areas and on data availability. The 2009 Release Areas are located in Commonwealth waters offshore Northern Territory, Western Australia, South Australia and Victoria, comprising intensively explored areas close to existing production as well as new frontiers. As usual, the North West Shelf features very prominently and is complemented by new areas along the southern margin, including frontier exploration areas in the Ceduna Sub-basin (Bight Basin) and the Otway Basin. The Bonaparte Basin is represented by one Release Area in the Malita Graben, while five areas are available in the under-explored southern Browse Basin. A total of 14 areas are being released in the Carnarvon Basin, located within the Dampier Sub-basin (eight areas), the Rankin Platform (three small blocks) and the Northern Exmouth Plateau (three large blocks), which is considered a deep-water frontier. In the south, six large areas are on offer in the Ceduna Sub-basin and five areas of varying sizes are being released in the Otway Basin, including a deep-water frontier offshore Victoria. The Special Release Areas are located in the Petrel Sub-basin, Bonaparte Basin, offshore Northern Territory, and encompass the Turtle/Barnett oil discoveries.
-
The program package escript is a Python module for solving mathematical modelling problems. It is based on the finite element method (FEM) and scales to thousands of cores on compute clusters. In this paper we discuss an extension to escript for solving large-scale inversion problems, in particular the joint inversion of magnetic and gravity data. In contrast to conventional inversion programs, escript avoids assembling the (in general dense) sensitivity matrix, which is problematic for large-scale problems. Moreover, we show how the FEM approach can easily be used to solve the adjoint problems required for the gradient calculation of the cost function. We demonstrate the application of the algorithm to field data using hundreds of cores.
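The matrix-free idea can be sketched for a linear forward operator: the gradient of a least-squares cost function needs only one forward application and one adjoint application per evaluation, never the sensitivity matrix itself. This is a generic NumPy illustration, not escript's API; in the FEM setting each operator application would be a PDE solve.

```python
import numpy as np

def gradient_via_adjoint(m, d_obs, forward, adjoint):
    """Matrix-free gradient of J(m) = 0.5 * ||F(m) - d_obs||^2 for a linear
    forward operator F. `forward` applies F, `adjoint` applies its transpose;
    the dense sensitivity matrix is never assembled."""
    residual = forward(m) - d_obs
    return adjoint(residual)

# Toy verification against the explicitly assembled operator (illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))          # stand-in for the sensitivity matrix
m = rng.standard_normal(3)               # model parameters
d_obs = rng.standard_normal(5)           # observed data
g_matfree = gradient_via_adjoint(m, d_obs, lambda x: A @ x, lambda r: A.T @ r)
g_dense = A.T @ (A @ m - d_obs)          # the assembled-matrix gradient
```

The storage saving is the point: for a joint magnetic/gravity inversion the assembled sensitivity matrix grows with (number of data) x (number of cells), while the matrix-free form only ever holds model- and data-sized vectors.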
-
Abstract for the IGNSS 2015 conference: A Global Navigation Satellite System (GNSS) antenna calibration facility has been established at Geoscience Australia for determining individual antenna calibrations, as well as aiding the establishment of type-mean calibrations as used by the International GNSS Service (IGS). Studies have highlighted the importance of accounting for the variation in individual antenna calibrations for high-precision positioning applications. In order to use individual antenna calibrations reliably, the repeatability of the calibration needs to be well understood. In this paper we give an overview of the repeatability of calibrations for different antenna types. We also present a case study on the application of an individual GNSS antenna calibration in Australia and its effect upon positioning.