  • Stations on the Australian continent receive a rich mixture of ambient seismic noise from the surrounding oceans and from the numerous small earthquakes in the earthquake belts to the north in Indonesia and to the east in Tonga-Kermadec, as well as from more distant source zones. The noise field at a seismic station contains information about the structure in the vicinity of the site, and this can be exploited by applying an autocorrelation procedure to the continuous records. By creating stacked autocorrelograms of the ground motion at a single station, information on crustal properties can be extracted in the form of a signal that includes the crustal reflection response convolved with the autocorrelation of the combined effect of source excitation and the instrument response. After suitable high-pass filtering, the reflection component can be extracted to reveal the most prominent reflectors in the lower crust, which often correspond to the reflection at the Moho. Because the reflection signal is stacked from arrivals over a wide range of slownesses, the reflection response is somewhat diffuse, but it is still sufficient to provide useful constraints on the local crust beneath a seismic station. Continuous vertical-component records from 223 stations (permanent and temporary) across the continent have been processed using autocorrelograms of 6-hour running windows with subsequent stacking. A distinctive pulse with a time offset between 8 and 30 s from zero is found in the autocorrelation results, with frequency content between 1.5 and 4 Hz, suggesting P-wave multiples trapped in the crust. Synthetic modelling, with control of multiple phases, shows that a local Ppmp phase can be recovered with the autocorrelation approach. This approach can be used for crustal property extraction using just vertical-component records, and effective results can be obtained with temporary deployments of just a few months.
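
The abstract describes the processing chain in outline only; the Python sketch below illustrates the general idea of windowed autocorrelation with filtering and stacking. Only the 6-hour window and the 1.5-4 Hz band are taken from the abstract; the function name, non-overlapping windowing, band-pass choice and normalisation are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of single-station autocorrelation stacking, assuming a
# continuous vertical-component trace `data` sampled at `fs` Hz.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def stacked_autocorrelogram(data, fs, window_hours=6.0,
                            band=(1.5, 4.0), max_lag_s=40.0):
    """Stack autocorrelations of running windows of a continuous record."""
    nwin = int(window_hours * 3600 * fs)      # samples per 6-hour window
    nlag = int(max_lag_s * fs)                # positive lags to keep
    sos = butter(4, band, btype='bandpass', fs=fs, output='sos')
    stack = np.zeros(nlag)
    count = 0
    for start in range(0, len(data) - nwin + 1, nwin):
        seg = data[start:start + nwin].astype(float)
        seg -= seg.mean()
        seg = sosfiltfilt(sos, seg)           # isolate the 1.5-4 Hz band
        # one-sided autocorrelation via FFT (zero-padded, positive lags)
        spec = np.fft.rfft(seg, n=2 * nwin)
        acf = np.fft.irfft(spec * np.conj(spec))[:nlag]
        if acf[0] > 0:
            stack += acf / acf[0]             # normalise by the zero lag
            count += 1
    return stack / max(count, 1)
```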

  • Geoscience Australia is supporting the exploration and development of offshore oil and gas resources and the establishment of Australia's national representative system of marine protected areas through the provision of spatial information about the physical and biological character of the seabed. Central to this approach is prediction of Australia's seabed biodiversity from spatially continuous data of physical seabed properties. However, information for these properties is usually collected at sparsely distributed discrete locations, particularly in the deep ocean. Thus, methods for generating spatially continuous information from point samples become essential tools. Such methods are, however, often data- or even variable-specific, and it is difficult to select an appropriate method for any given dataset. Improving the accuracy of these physical data for biodiversity prediction, by searching for the most robust spatial interpolation methods to predict physical seabed properties, is essential to better inform resource management practices. To this end, we conducted a simulation experiment to compare the performance of statistical and mathematical methods for spatial interpolation using samples of seabed mud content across the Australian margin. Five factors that affect the accuracy of spatial interpolation were considered: 1) region; 2) statistical method; 3) sample density; 4) searching neighbourhood; and 5) sample stratification by geomorphic provinces. Bathymetry, distance-to-coast and slope were used as secondary variables. In this study, we report only the results of the comparison of 14 methods (37 sub-methods) using samples of seabed mud content with five levels of sample density across the southwest Australian margin. The results of the simulation experiment can be applied to spatial data modelling of various physical parameters in different disciplines and are relevant to a variety of resource management applications for Australia's marine region.
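
As a rough illustration of how interpolation methods can be scored against point samples by cross-validation, the sketch below compares three simple SciPy interpolators on synthetic data standing in for seabed mud content. It is not the experimental design described above (which spanned 14 methods and 37 sub-methods, with secondary variables); the data, fold count and variable names are hypothetical.

```python
# Illustrative cross-validated comparison of simple interpolators.
import numpy as np
from scipy.interpolate import griddata

def cv_rmse(x, y, z, method, n_folds=10, seed=0):
    """Cross-validated RMSE of a scipy.interpolate.griddata method."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(z))
    folds = np.array_split(idx, n_folds)
    errs = []
    for test in folds:
        train = np.setdiff1d(idx, test)
        pred = griddata((x[train], y[train]), z[train],
                        (x[test], y[test]), method=method)
        ok = ~np.isnan(pred)                  # drop points outside the hull
        errs.append((pred[ok] - z[test][ok]) ** 2)
    return np.sqrt(np.concatenate(errs).mean())

# Synthetic samples standing in for mud content (percent) at random sites.
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 100, 500), rng.uniform(0, 100, 500)
mud = 50 + 20 * np.sin(x / 15) * np.cos(y / 20) + rng.normal(0, 5, 500)
for method in ('nearest', 'linear', 'cubic'):
    print(method, round(cv_rmse(x, y, mud, method), 2))
```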

  • Geoscience Australia has developed a number of open source risk models to estimate hazard, damage or financial loss to residential communities from natural hazards; these are used to underpin disaster risk reduction activities. Two of these models will be discussed here: the Earthquake Risk Model (EQRM) and a hydrodynamic model called ANUGA, developed in collaboration with the ANU. Both models have been developed in Python using scientific and GIS packages such as Shapely, Numeric and SciPy. This presentation will outline key lessons learnt in developing scientific software in Python. Methods of maintaining and assessing code quality will be discussed: (1) what makes a good unit test; (2) how defects in the code were discovered quickly by being able to visualise the output data; and (3) how characterisation tests, which describe the actual behaviour of a system, are useful for finding unintended system changes. The challenges involved in optimising and parallelising Python code will also be presented. This is particularly important in scientific simulations, as they use considerable computational resources and involve large data sets. The focus will be on profiling, NumPy, using C code, and parallelisation of applications to run on clusters. Reduction of memory use by using a class to represent a group of items, instead of a single item, will also be discussed.
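
Two of the points above lend themselves to a short illustration: the memory-saving pattern of one object representing a group of items (a structure-of-arrays layout), and a characterisation test that pins down current behaviour. The sketch below is a hedged example of both; the class, attribute and test names are invented for illustration and are not taken from the EQRM or ANUGA code bases.

```python
# Sketch of the "one object for a group of items" pattern.
import numpy as np

class BuildingPortfolio:
    """Holds attributes for many buildings as NumPy arrays rather than
    creating one Python object per building, which cuts per-object
    overhead and lets calculations be vectorised across the group."""

    def __init__(self, latitude, longitude, floor_area, year_built):
        self.latitude = np.asarray(latitude, dtype=float)
        self.longitude = np.asarray(longitude, dtype=float)
        self.floor_area = np.asarray(floor_area, dtype=float)
        self.year_built = np.asarray(year_built, dtype=int)

    def __len__(self):
        return self.latitude.size

    def replacement_cost(self, cost_per_m2):
        # One vectorised operation over the whole portfolio.
        return self.floor_area * cost_per_m2

# A characterisation test records the system's current output so that
# unintended changes are caught, even where no specification exists.
def test_replacement_cost_characterisation():
    p = BuildingPortfolio([0.0], [0.0], [200.0], [1990])
    assert p.replacement_cost(1500.0)[0] == 300000.0
```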

  • The tragic events of the Indian Ocean tsunami on 26 December 2004 highlighted the need for reliable and effective alert and response systems for tsunami threats to Australian communities. Geoscience Australia has established collaborative partnerships with state and federal emergency management agencies to support better preparedness and to improve community awareness of tsunami risks.

  • One of the important inputs to a probabilistic seismic hazard assessment is the expected rate at which earthquakes occur within the study region. The rate of earthquakes is a function of the rate at which the crust is being deformed, mostly by tectonic stresses. This paper will present two contrasting methods of estimating the strain rate at the scale of the Australian continent. The first method is based on statistically analysing the recently updated national earthquake catalogue, while the second uses a geodynamic model of the Australian plate and the forces that act upon it. For the first method, we show a couple of examples of the strain rates predicted across Australia using different statistical techniques. However, no matter which technique is used, the measurable seismic strain rates are typically in the range of 10^-16 s^-1 to around 10^-18 s^-1, depending on location. By contrast, the geodynamic model predicts a much more uniform strain rate of around 10^-17 s^-1 across the continent. The level of uniformity of the true distribution of long-term strain rate in Australia is likely to be somewhere between these two extremes. Neither estimate is consistent with the Australian plate being completely rigid and free from internal deformation (i.e. a strain rate of exactly zero). This paper will also give an overview of how this kind of work affects the national earthquake hazard map and how future high-precision geodetic estimates of strain rate should help to reduce the uncertainty in this important parameter for probabilistic seismic hazard assessments.
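
The abstract does not name the statistical techniques applied to the catalogue, but one widely used way of converting an earthquake catalogue into a strain rate is a Kostrov-style summation of seismic moment over a crustal volume and time window. The sketch below only illustrates that style of calculation; the shear modulus, seismogenic thickness, zone size and magnitudes are all assumed values, not figures from the study.

```python
# Illustrative Kostrov-style scalar strain-rate estimate from a catalogue.
import numpy as np

MU = 3.0e10                  # shear modulus, Pa (assumed)
THICKNESS = 15.0e3           # seismogenic thickness, m (assumed)

def moment_from_mw(mw):
    """Hanks & Kanamori scalar moment in N*m from moment magnitude."""
    return 10.0 ** (1.5 * np.asarray(mw) + 9.05)

def kostrov_strain_rate(mw, area_m2, years):
    """Scalar strain rate ~ sum(M0) / (2 * mu * volume * time)."""
    volume = area_m2 * THICKNESS
    seconds = years * 365.25 * 24 * 3600.0
    return moment_from_mw(mw).sum() / (2.0 * MU * volume * seconds)

# A hypothetical 500 km x 500 km zone with a modest 50-year catalogue:
mags = [5.5, 5.0, 4.8, 4.5, 4.5, 4.2]
rate = kostrov_strain_rate(mags, area_m2=(500e3) ** 2, years=50.0)
print(f"{rate:.1e} s^-1")    # of order 10^-18 s^-1 for these assumed inputs
```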

  • Geoscience Australia (GA) has been acquiring both broadband and long-period magnetotelluric (MT) data over the last few years along deep seismic reflection survey lines across Australia, often in collaboration with the State and Territory geological surveys and the University of Adelaide. Recently, a new three-dimensional (3D) inversion code has become available from Oregon State University. This code is parallelised and has been compiled on the NCI supercomputer at the Australian National University. Much of the structure of the Earth in the regions of the seismic surveys is complex and 3D, and MT data acquired along profiles in such regions are better imaged using 3D code rather than 1D or 2D codes. Preliminary conductivity models produced from the Youanmi MT survey in Western Australia correlate well with interpreted seismic structures and contain more geological information than previous 2D models. GA has commenced a program to re-model previously acquired MT data with the new code, to provide more robust information on the conductivity structure of the shallow to deep Earth in the vicinity of the seismic transects.

  • Natural hazards such as floods, dam breaks, storm surges and tsunamis impact communities around the world every year. To reduce the impact, accurate modelling is required to predict where water will go, and at what speed, before the event has taken place. ANUGA is free, open source software created to model water flow arising from these events. The resulting knowledge can be used to reduce loss of life and damage to property in communities affected by such disasters by providing vital input to evacuation plans, structural mitigation options and planning. The software was developed collaboratively by the Australian National University (ANU) and Geoscience Australia (GA) and is available at http://sourceforge.net/projects/anuga. ANUGA solves the non-linear shallow water wave equations using the finite volume method with dynamic time stepping. A major capability of ANUGA is that it can model the process of wetting and drying as water enters and leaves an area. This means it is suitable for simulating water flow onto a beach or dry land and around structures such as buildings. ANUGA is also capable of modelling complex flows involving shock waves and rapidly changing flow speeds (transitions from subcritical to supercritical flow). ANUGA is a robust software package that contains over 800 unit tests. It has been validated against wave tank experiments [1], and model outputs for the 2004 Indian Ocean tsunami have compared very well with a run-up survey at Patong Beach, Thailand. This particular activity has also underpinned the results provided to Australian emergency managers responsible for tsunami risk. This presentation will outline the key components of ANUGA and examples of its application to a range of hydrodynamic hazards, as well as a sample of validation outputs.
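
For readers unfamiliar with ANUGA, a minimal run follows the pattern below, modelled on the style of ANUGA's published beach run-up examples: build a triangular mesh, set elevation, friction and initial stage, attach boundary conditions, and step the solver forward. Exact call signatures can vary between ANUGA versions, so treat this as a sketch rather than a definitive script; the mesh size, slope and boundary values are chosen only for illustration.

```python
# Minimal ANUGA-style run-up sketch (illustrative, not from the paper).
import anuga

# Triangular mesh of finite volume cells over a 1.0 x 1.0 domain.
domain = anuga.rectangular_cross_domain(40, 40, len1=1.0, len2=1.0)
domain.set_name('runup_sketch')               # name of the output .sww file

# Bed sloping down towards the right; the shallow left side starts dry.
domain.set_quantity('elevation', lambda x, y: -x / 2)
domain.set_quantity('friction', 0.01)         # Manning roughness
domain.set_quantity('stage', -0.2)            # initial water surface level

# Reflective walls everywhere except a raised water level on the deep
# (right-hand) side, which drives run-up onto the dry beach at the left.
Bw = anuga.Dirichlet_boundary([-0.1, 0.0, 0.0])   # stage, x-mom, y-mom
Br = anuga.Reflective_boundary(domain)
domain.set_boundary({'left': Br, 'right': Bw, 'top': Br, 'bottom': Br})

# Evolve with adaptive (dynamic) time stepping; wetting and drying of
# cells at the shoreline is handled internally by the solver.
for t in domain.evolve(yieldstep=0.5, finaltime=10.0):
    domain.print_timestepping_statistics()
```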

  • Natural hazards such as floods, dam breaks, storm surges and tsunamis impact communities around the world every year. To reduce the impact, accurate modelling is required to predict where water will go, and at what speed, before the event has taken place.

  • Models of seabed sediment mobilisation by waves and currents over Australia's continental shelf environment are used to examine whether disturbance regimes exist in the context of the intermediate disturbance hypothesis (IDH). Our study shows that it is feasible to model the frequency and magnitude of seabed disturbance in relation to the dominant energy source (wave-dominated, tide-dominated or tropical-cyclone-dominated shelf). Areas are mapped where the recurrence interval of disturbance events is comparable to the rate of ecological succession, which meets the criteria defined for a disturbance regime. We focus our attention on high-energy, patch-clearing events, defined as exceeding a Shields (bed shear stress) parameter value of 0.25. Using known rates of ecological succession for different substrate types (gravel, sand, mud), predictions are made of the spatial distribution of a dimensionless ecological disturbance index (ED), given as ED = FA (ES/RI), where ES is the ecological succession rate for different substrates, RI is the recurrence interval of disturbance events and FA is the fraction of the frame of reference (surface area) disturbed. Maps for the Australian continental shelf show small patches of ED-defined seafloor distributed around the continent, on both the inner and outer shelf. The patterns differ for wave-dominated shelves (patches on the outer shelf trending parallel to the coast), tide-dominated shelves (patches crossing the middle shelf trending normal to the coast) and cyclone-dominated shelves (large oval-shaped patches crossing all depths). Only a small portion of the shelf (perhaps ~10%) is characterised by a disturbance regime as defined here. To our knowledge, this is the first time such an analysis has been attempted for any continental shelf on Earth.
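
A minimal worked example of the index defined above, assuming ES is expressed as a succession time scale in years so that ES/RI is dimensionless; the substrate values used are illustrative only and do not come from the study.

```python
# Worked example of the disturbance index ED = FA * (ES / RI).
def ecological_disturbance(fa, es_years, ri_years):
    """ED = FA * (ES / RI); FA is the fraction of the area disturbed."""
    return fa * (es_years / ri_years)

# Hypothetical sandy patch: 30% of the area disturbed per event, a
# ~5-year succession time, and events recurring every 2 years on average.
print(ecological_disturbance(fa=0.3, es_years=5.0, ri_years=2.0))  # 0.75
```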