  • A key component of Geoscience Australia's marine program involves developing products that contain spatial information about the seabed for Australia's marine jurisdiction. This spatial information is derived from sparse or unevenly distributed samples collected over a number of years using many different sampling methods. Spatial interpolation methods are used to generate spatially continuous information from the point samples. These methods are, however, often data- or even variable-specific, and it is difficult to select an appropriate method for any given dataset. Machine learning methods, like random forest (RF) and support vector machine (SVM), have proven to be among the most accurate methods in disciplines such as bioinformatics and terrestrial ecology. However, they have rarely been applied to the spatial interpolation of environmental variables using point samples. To improve the accuracy of spatial interpolations to better represent the seabed environment for a variety of applications, including prediction of biodiversity and surrogacy research, Geoscience Australia has since 2008 conducted two simulation experiments to compare the performance of 14 mathematical and statistical methods for predicting seabed mud content for three regions (i.e., Southwest, North, Northeast) of Australia's marine jurisdiction. This study confirms the effectiveness of applying machine learning methods to spatial data interpolation, especially in combination with ordinary kriging (OK) or inverse distance squared (IDS), and also confirms the effectiveness of averaging the predictions of these combined methods. Moreover, an alternative source of methods for spatial interpolation of both marine and terrestrial environmental properties using point survey samples has been identified, with associated improvements in accuracy over commonly used methods.
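The record above does not spell out how RF is combined with IDS, so the following is only a minimal sketch of one plausible hybrid: fit RF on environmental covariates and interpolate its residuals with inverse-distance-squared weighting. It assumes scikit-learn and NumPy are available; the function names and toy arguments are illustrative, not the actual Geoscience Australia implementation.

```python
# Minimal sketch of a random forest / inverse-distance-squared (IDS) hybrid
# interpolator, in the spirit of the RF-IDS combination described above.
# Variable names and data are hypothetical, not the GA datasets.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def idw(xy_known, residuals, xy_query, power=2, k=12):
    """Inverse-distance-weighted interpolation of residuals (power=2 ~ IDS)."""
    preds = np.empty(len(xy_query))
    for i, q in enumerate(xy_query):
        d = np.hypot(*(xy_known - q).T)        # distances to known samples
        if np.any(d == 0):                     # query coincides with a sample
            preds[i] = residuals[d == 0][0]
            continue
        idx = np.argsort(d)[:k]                # k nearest samples
        w = 1.0 / d[idx] ** power
        preds[i] = np.sum(w * residuals[idx]) / np.sum(w)
    return preds

def rfids_predict(X_train, xy_train, y_train, X_query, xy_query):
    """RF on covariates, plus IDS interpolation of the RF residuals."""
    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    rf.fit(X_train, y_train)
    resid = y_train - rf.predict(X_train)
    return rf.predict(X_query) + idw(xy_train, resid, xy_query)
```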

  • Tsunami inundation models are computationally intensive and require high-resolution elevation data in the nearshore and coastal environment. In general, this limits their practical application to scenario assessments at discrete communities. This study explores the use of moderate-resolution (250 m) bathymetry data to support computationally cheaper modelling to assess nearshore tsunami hazard. Comparison with high-resolution models using the best available elevation data demonstrates that moderate-resolution models are valid (errors in wave height < 20%) at depths greater than 10 m in relatively low-sloping, uniform shelf environments. However, in steeper and more complex shelf environments they are only valid at depths of 20 m or greater. Modelled arrival times show much less sensitivity to data resolution than wave heights and current velocities. It is demonstrated that modelling using 250 m resolution data can be useful in assisting emergency managers and planners to prioritise communities for more detailed inundation modelling by reducing uncertainty surrounding the effects of shelf morphology on tsunami propagation. However, it is not valid for modelling tsunami inundation. Further research is needed to define minimum elevation data requirements for modelling inundation and to inform decisions on the acquisition of high-quality elevation data.
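As a small illustration of the validity criterion quoted above (wave-height error under 20%, applied only beyond a depth cut-off), the sketch below flags the model output points where a moderate-resolution run would be accepted against a high-resolution reference. The array names and the way the thresholds are applied are assumptions for the sketch, not the study's actual workflow.

```python
# Illustrative check of the validity criterion described above: compare wave
# heights from a moderate-resolution (250 m) run against a high-resolution
# reference and flag points meeting the <20% error rule at sufficient depth.
import numpy as np

def valid_points(h_moderate, h_reference, depth, max_error=0.20, min_depth=10.0):
    """Return a boolean mask of points where the moderate-resolution model is
    accepted: relative wave-height error below max_error and water depth at
    least min_depth (10 m for gentle shelves, ~20 m for complex shelves)."""
    rel_error = np.abs(h_moderate - h_reference) / h_reference
    return (rel_error < max_error) & (depth >= min_depth)
```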

  • One of the main outputs of the Earthquake Hazard project at Geoscience Australia is the national earthquake hazard map. The map is one of the key components of Australia's earthquake loading standard, AS1170.4. One of the important inputs to the map is the rate at which earthquakes occur in various parts of the continent. This is a function of the strain rate, or the rate of deformation, currently being experienced in different parts of Australia. This paper presents two contrasting methods of estimating the strain rate, and thus the seismicity, using the latest results from the seismology and geodynamic modelling programs within the project. The first method is based on a fairly traditional statistical analysis of an updated catalogue of Australian earthquakes. Strain rates, where measurable, were in the range of 10^-16 s^-1 to around 10^-18 s^-1 and were highly variable across the continent. By contrast, the second method uses a geodynamic numerical model of the Australian plate to determine its rate of deformation. This model predicted a somewhat more uniform strain rate of around 10^-17 s^-1 across the continent. The true distribution of long-term strain rate in Australia probably lies somewhere between these two extremes in uniformity, but is likely of about this magnitude. This presentation will also give an overview of how this kind of work could be incorporated into future versions of the national earthquake hazard map in both the short and long term.
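The abstract states that earthquake rates are a function of strain rate without giving the link. One standard way of making that link, included here only as a hedged illustration and not necessarily the relation used in this work, is a Kostrov-type balance between the strain rate averaged over a crustal volume and the seismic moment released within it.

```latex
% Kostrov-type relation between average strain rate and seismic moment rate
% (a standard link; that this exact form was used in the work above is an
% assumption of this note, not a statement from the abstract).
% \dot{\varepsilon}_{ij}: average strain-rate tensor over crustal volume V
% \mu: shear modulus, \dot{M}_{ij}^{(k)}: moment-rate tensor of event k
\dot{\varepsilon}_{ij} = \frac{1}{2\mu V}\sum_{k}\dot{M}_{ij}^{(k)},
\qquad
\dot{M}_{0} \approx 2\,\mu\,V\,\dot{\varepsilon}
```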

  • The tragic events of the Indian Ocean tsunami on 26 December 2004 highlighted the need for reliable and effective alert and response systems for tsunami threats to Australian communities. Geoscience Australia has established collaborative partnerships with state and federal emergency management agencies to support better preparedness and to improve community awareness of tsunami risks.

  • This folder contains WindRiskTech data used in preliminary stages of the National Wind Risk Assessment. The data are synthetic TC event sets, generated by a statistical-dynamical model of TCs that can be applied to general circulation models (GCMs) to provide projections of TC activity. Output from two GCMs is available here - the NCAR CCSM3 and the GFDL CM2.1 models. For each, there are a number of scenarios (based on the SRES scenarios from AR4 and previous IPCC reports) and time periods (the time periods are not the same for the A1B scenario). For each model, scenario and time period, the data are a set of 1000 TC track files in tab-delimited format contained in the huur.zip files in each sub-folder. The output folder contains the output of running TCRM (pre-2011 version) on each of the datasets.
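The record does not document the columns of the tab-delimited track files, so the sketch below only shows one plausible way to unpack an archive of them with the Python standard library and pandas; the archive name follows the record ("huur.zip") and no column names are assumed.

```python
# Sketch of loading the zipped, tab-delimited TC track files described above.
# The track file format is not documented in the record, so the files are read
# without assuming column names; adjust once the format is known.
import zipfile
import pandas as pd

def load_tracks(zip_path="huur.zip"):
    """Read every tab-delimited track file in the archive into one DataFrame."""
    tracks = []
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            with zf.open(name) as fh:
                df = pd.read_csv(fh, sep="\t", header=None)
                df["track_file"] = name          # keep the source file name
                tracks.append(df)
    return pd.concat(tracks, ignore_index=True)
```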

  • The major tsunamis of the last few years in the southern hemisphere have raised awareness of the possibility of potentially damaging tsunami reaching Australia and countries in the Southwest Pacific region. Here we present a probabilistic hazard assessment for Australia and for the SOPAC countries in the Southwest Pacific for tsunami generated by subduction zone earthquakes. To conduct a probabilistic tsunami hazard assessment, we first need to estimate the likelihood of a tsunamigenic earthquake occurring. Here we will discuss and present our method of estimating the likely return period of a major megathrust earthquake on each of the subduction zones surrounding the Pacific. Our method is based on the global rate of occurrence of such events and on the rate of convergence and geometry of each particular subduction zone. This allows us to create a synthetic catalogue of possible megathrust earthquakes in the region with associated probabilities for each event. To calculate the resulting tsunami for each event, we create a library of "unit source" tsunami for a set of 100 km x 50 km unit sources along each subduction zone. For each unit source, we calculate the sea floor deformation by modelling the slip along the unit source as a dislocation in a stratified, linear elastic half-space. This sea floor deformation is then fed into a tsunami propagation model to calculate the wave height off the coast for each unit source. Our propagation model uses a staggered-grid, finite-difference scheme to solve the linear shallow water wave equations for tsunami propagation. The tsunami from any earthquake in the synthetic catalogue can then be quickly calculated by summing the unit-source tsunami from all the unit sources that fall within the rupture zone of the earthquake. The results of these calculations can then be combined with our estimate of the probability of the earthquake to produce hazard maps showing (for example) the probability of a tsunami exceeding a given height offshore from a given stretch of coastline. These hazard maps can then be used to guide emergency managers to focus their planning efforts on the regions and countries most likely to experience a catastrophic tsunami.
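Because the propagation model is linear, the superposition step described above reduces to summing precomputed unit-source waveforms. The sketch below shows that step only; the dictionary keys, the use of slip weights and the toy usage are illustrative assumptions, not the actual implementation.

```python
# Minimal sketch of the unit-source superposition described above: the
# offshore waveform for a catalogue event is assembled by summing the
# precomputed waveforms of the unit sources within its rupture area.
import numpy as np

def event_waveform(unit_waveforms, rupture_unit_ids, slip_weights=None):
    """Sum unit-source waveforms (time series at one offshore point) for the
    unit sources covered by an event's rupture, weighted by slip if given."""
    if slip_weights is None:
        slip_weights = {uid: 1.0 for uid in rupture_unit_ids}
    total = np.zeros_like(next(iter(unit_waveforms.values())))
    for uid in rupture_unit_ids:
        total += slip_weights[uid] * unit_waveforms[uid]
    return total

# Usage (toy data): two unit sources contribute to one catalogue event.
# t = np.linspace(0, 3600, 721)
# unit_waveforms = {"zone_01": np.sin(t / 600), "zone_02": 0.5 * np.sin(t / 500)}
# eta = event_waveform(unit_waveforms, ["zone_01", "zone_02"])
```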

  • Random forest (RF) is one of the top-performing methods in predictive modelling. Because of its high predictive accuracy, we introduced it into spatial statistics by combining it with existing spatial interpolation methods, resulting in a few hybrid methods and improved prediction accuracy when applied to marine environmental datasets (Li et al., 2011). The superior performance of these hybrid methods was partially attributed to the features of RF, one component of the hybrids. One of these features, inherited from its constituent trees, is the ability to deal with irrelevant inputs. It is also argued that the performance of RF is not much influenced by parameter choices, so the hybrids presumably share this feature as well. However, these assumptions have not been tested for the spatial interpolation of environmental variables. In this study, we experimentally examined these assumptions using seabed sand and gravel content datasets on the northwest Australian marine margin. Four sets of input variables and two choices of the 'number of variables randomly sampled as candidates at each split' were tested in terms of predictive accuracy. The input variable sets range from the six predictors alone to combinations of these predictors and derived variables, including the second and third orders and/or possible two-way interactions of these six predictors. These derived predictors were regarded as redundant and irrelevant variables because they are correlated with the six predictors and because RF can perform implicit variable selection and can model complex interactions among predictors. The results derived from this experiment are analysed, discussed and compared with previous findings. The outcomes of this study have both practical and theoretical importance for predicting environmental variables.
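The 'number of variables randomly sampled as candidates at each split' is the mtry parameter of R's randomForest (max_features in scikit-learn). The sketch below shows the general shape of such a sensitivity test by cross-validating two settings over several predictor sets; the DataFrame, column names and target are placeholders, not the GA sand and gravel datasets.

```python
# Sketch of the kind of sensitivity test described above: compare predictive
# accuracy across predictor sets and two settings of the 'variables sampled at
# each split' parameter (mtry in R's randomForest; max_features here).
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

def compare_settings(df, predictor_sets, target="gravel", mtry_choices=(2, "sqrt")):
    """Return mean 10-fold CV R^2 for each (predictor set, max_features) pair."""
    results = {}
    for set_name, cols in predictor_sets.items():
        for mtry in mtry_choices:
            rf = RandomForestRegressor(n_estimators=500, max_features=mtry,
                                       random_state=0)
            scores = cross_val_score(rf, df[cols], df[target],
                                     cv=10, scoring="r2")
            results[(set_name, mtry)] = scores.mean()
    return results

# Usage (hypothetical column names):
# predictor_sets = {"six_predictors": ["bathy", "dist_coast", "slope",
#                                      "relief", "lat", "lon"]}
# compare_settings(samples_df, predictor_sets)
```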

  • Geoscience Australia has collaboratively developed a number of open source software models and tools to estimate hazard, impact and risk to communities for a range of natural hazards, to support disaster risk reduction in Australia and the region. These models and tools include ANUGA, EQRM, TCRM, TsuDAT, RICS and FiDAT. This presentation will discuss the drivers for developing these models and tools using open source software and the benefits to end-users in the emergency management and planning community as well as the broader research community. Progress and plans for these models and tools will also be outlined, in particular those that take advantage of the availability of high-performance computing, cloud computing, web services and global initiatives such as the Global Earthquake Model.

  • The Attorney-General's Department has supported Geoscience Australia to develop inundation models for four east coast communities with a view to building the tsunami planning and preparation capacity of the Jurisdictions. The aim of this document and accompanying DVD is to report on the approach adopted by each Jurisdiction and the modelling outcomes, and to supply the underpinning computer scripts and input data.

  • Lagrangian stochastic (LS) forward modelling of CO2 plumes from above-surface release experiments conducted at the GA-CO2CRC Ginninderra GHG controlled release facility demonstrated that small surface leaks are likely to disperse rapidly and are unlikely to be detected at heights greater than 4 m; this was verified using a rotorcraft to map out the plume. The CO2-sensing rotorcraft unmanned aerial vehicle (RUAV) developed at the Australian National University, Canberra, is equipped with a CO2 sensor (3 ppm accuracy and 2 s response time), a GPS, lidar and a communication module. It was developed to detect, locate and quantify CO2 gas leaks. The choice of a rotorcraft UAV allows slower flight speeds than a fixed-wing UAV, and the electric motor enables flight times of 12 min. During the experiments, gaseous CO2 (100 kg per day) was released from a small diffuse source located in the middle of the paddock of the controlled release facility, and the RUAV, flying repeatedly over the CO2 source at a height of a few metres, recorded CO2 concentrations up to 85 ppm above background. Meteorological parameters measured continuously at the site were input into the LS model. The mapped horizontal and vertical CO2 concentrations established the need to be close to the ground in order to detect CO2 leakage using aerial techniques. Using the rotorcraft as a mobile sensor could be an expedient mechanism for detecting plumes over large areas, and would be important for early detection of CO2 leaks arising from CO2 geological storage activities.
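The record above describes LS forward modelling without detailing the scheme, so the following is only a minimal one-dimensional Langevin (Ornstein-Uhlenbeck) random-walk sketch of how such particle dispersion models advance a plume. The turbulence statistics, time step and release geometry are illustrative constants, not the model or conditions used at Ginninderra.

```python
# Minimal sketch of a Lagrangian stochastic (Langevin) particle model of the
# kind used for forward plume modelling: mean wind in x, stochastic vertical
# velocity, reflection at the ground. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def ls_plume(n_particles=5000, n_steps=600, dt=0.1,
             u_mean=2.0, sigma_w=0.3, tau=50.0, source_height=0.1):
    """Release particles at a near-surface source and track (x, z) positions."""
    x = np.zeros(n_particles)
    z = np.full(n_particles, source_height)
    w = rng.normal(0.0, sigma_w, n_particles)       # vertical velocity
    for _ in range(n_steps):
        # Langevin update: relaxation toward zero plus random forcing that
        # keeps the velocity variance near sigma_w**2
        w += -w * dt / tau + sigma_w * np.sqrt(2 * dt / tau) * rng.normal(size=n_particles)
        x += u_mean * dt
        z += w * dt
        ground = z < 0                               # reflect at the surface
        z[ground] = -z[ground]
        w[ground] = -w[ground]
    return x, z
```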