  • Spatial interpolation methods for generating spatially continuous data from point samples of environmental variables are essential for environmental management and conservation. They fall into three groups: non-geostatistical methods (e.g., inverse distance weighting), geostatistical methods (e.g., ordinary kriging) and combined/hybrid methods (e.g., regression kriging); their performance is often data-specific (Li and Heap, 2008). Because of the robustness of machine learning methods such as random forest and support vector machine in data mining, we introduced them into spatial statistics by applying them to the spatial prediction of seabed mud content in combination with existing spatial interpolation methods (Li et al., 2011). This development can be viewed as an extension of the combined methods from statistics into the machine learning field. These applications significantly improved prediction accuracy and opened an alternative source of methods for spatial interpolation. Given that they have only been applied to one variable, several questions remain: are they dataset-specific? How reliable are their predictions for different datasets and variables? Could other machine learning methods (such as boosted regression trees) further improve the spatial interpolations? To address these questions, we experimentally compared the predictions of several methods for sand content on the southwest Australian marine margin, testing a variety of existing spatial interpolation methods, machine learning methods and their combinations. In this study, we discuss the experimental results and the value of this advancement in spatial interpolation, visually examine the spatial predictions, and compare the results with findings in previous publications. The outcomes of this study can be applied to the spatial prediction of marine and terrestrial environmental variables.
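    A minimal sketch of the hybrid approach described above, assuming scikit-learn: a random forest models the trend from covariates, and its residuals are interpolated with a hand-rolled inverse distance weighting (IDW) step. The toy data and covariate choices are illustrative assumptions, not the study's actual implementation.

        # Hybrid "random forest + IDW of residuals" sketch (toy data).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
            """Inverse-distance-weighted interpolation of values at query points."""
            d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
            w = 1.0 / (d + eps) ** power
            return (w * values).sum(axis=1) / w.sum(axis=1)

        rng = np.random.default_rng(0)
        xy = rng.uniform(0, 100, size=(200, 2))                     # sample locations
        covariates = np.c_[xy, rng.normal(size=(200, 3))]           # location + 3 predictors (e.g., depth, slope)
        sand = 50 + 0.2 * xy[:, 0] + rng.normal(scale=5, size=200)  # synthetic sand content (%)

        rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(covariates, sand)
        residuals = sand - rf.predict(covariates)

        # Prediction at new locations = RF trend + IDW-interpolated residuals.
        xy_new = rng.uniform(0, 100, size=(50, 2))
        cov_new = np.c_[xy_new, rng.normal(size=(50, 3))]
        prediction = rf.predict(cov_new) + idw(xy, residuals, xy_new)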

  • Within the general trend of post-Eocene cooling, the largest and oldest outlet of the East Antarctic Ice Sheet underwent a change from ice-cliff to ice-stream and/or ice-shelf dynamics, with an associated switch from line-source to fan sedimentation. Available geological data reveal little about the causes of these changes in ice dynamics during the Miocene Epoch, or about their subsequent effects on Pliocene-Pleistocene ice-sheet history. Ice-sheet numerical modeling reveals that bed morphology was probably responsible for driving changes in both ice-sheet extent and dynamics in the Lambert-Amery system at Prydz Bay. The modeling shows how the topography and bathymetry of the Lambert graben and Prydz Bay control ice-sheet extent and flow. The changes in bathymetric volume required for shelf-edge glaciation correlate well with the Prydz Channel fan sedimentation history. This suggests a negative feedback between erosion and glaciation, whereby the current graben is overdeepened to such an extent that shelf-edge glaciation is no longer possible, even if a Last Glacial Maximum environment recurs. We conclude that the erosional history of the Lambert graben and Prydz Bay, in combination with the uplift histories of the surrounding mountains, is responsible for the evolution of this section of the East Antarctic Ice Sheet, once the necessary initial climatic conditions for glaciation were achieved at the start of the Oligocene Epoch.

  • Geoscience Australia has collaboratively developed a number of open source software models and tools to estimate hazard, impact and risk to communities for a range of natural hazards, to support disaster risk reduction in Australia and the region. These models and tools include ANUGA, EQRM, TCRM, TsuDAT, RICS and FiDAT. This presentation will discuss the drivers for developing these models and tools using open source software, and the benefits to end-users in the emergency management and planning community as well as the broader research community. Progress and plans for these models and tools will also be outlined, in particular those that take advantage of the availability of high performance computing, cloud computing, web services and global initiatives such as the Global Earthquake Model.

  • The Attorney-General's Department has supported Geoscience Australia to develop inundation models for four east coast communities, with a view to building the tsunami planning and preparation capacity of the jurisdictions. The aim of this document and the accompanying DVD is to report on the approach adopted by each jurisdiction and the modelling outcomes, and to supply the underpinning computer scripts and input data.

  • The tragic events of the Indian Ocean tsunami on 26 December 2004 highlighted the need for reliable and effective alert and response systems for tsunami threats to Australian communities. Geoscience Australia has established collaborative partnerships with state and federal emergency management agencies to support better preparedness and to improve community awareness of tsunami risks.

  • Modelling the effects of natural hazards such as riverine flooding, storm surges and tsunami on the built environment is critical for understanding their economic and social impact on urban communities. Geoscience Australia and the Australian National University are developing a hydrodynamic inundation modelling tool called ANUGA to help simulate the impact of these hazards.
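    For context, a minimal ANUGA usage sketch: a small rectangular domain with a sloping bed, reflective walls and a fixed-stage inflow boundary. The mesh size, bed function and boundary values are illustrative assumptions, not a real study configuration.

        # Toy ANUGA run: shallow water flow over a plane sloping bed.
        import anuga

        domain = anuga.rectangular_cross_domain(20, 10, len1=200.0, len2=100.0)
        domain.set_quantity('elevation', lambda x, y: -x / 50.0)  # bed sloping downward in x
        domain.set_quantity('friction', 0.03)                     # Manning roughness (assumed)
        domain.set_quantity('stage', 0.0)                         # initial water surface

        Bd = anuga.Dirichlet_boundary([0.5, 0.0, 0.0])  # 0.5 m stage imposed on the left edge
        Br = anuga.Reflective_boundary(domain)
        domain.set_boundary({'left': Bd, 'right': Br, 'top': Br, 'bottom': Br})

        # Evolve the flow for two minutes of model time, reporting every 10 s.
        for t in domain.evolve(yieldstep=10.0, finaltime=120.0):
            print(domain.timestepping_statistics())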

  • Effective disaster risk reduction is founded on knowledge of the underlying risk. While methods and tools for assessing risk from specific hazards or to individual assets are generally well developed, our ability to holistically assess risk to a community across a range of hazards and elements at risk remains limited. Developing a holistic view of risk requires interdisciplinary collaboration amongst a wide range of hazard scientists, engineers and social scientists, as well as engagement of a range of stakeholders. This paper explores these challenges, examining common and contrasting issues sampled from applications addressing earthquake, tsunami, volcano, severe wind, flood and sea-level rise in projects in Australia, Indonesia and the Philippines. Key issues range from the availability of appropriate risk assessment tools and data to the ability of communities to implement appropriate risk reduction measures. Quantifying risk requires information on the hazard, the exposure and the vulnerability. Often the knowledge of the hazard is reasonably well constrained, but exposure information (e.g., people and their assets) and measures of vulnerability (i.e., susceptibility to injury or damage) are inconsistent or unavailable. To fill these gaps, Geoscience Australia has developed computational models and tools which are open and freely available. As the knowledge gaps become smaller, the need is growing to go beyond the quantification of risk to the provision of tools to aid in selecting the most appropriate risk reduction strategies (e.g., evacuation plans, building retrofits, insurance, or land use) to build community resilience.
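    As a schematic of the hazard, exposure and vulnerability framing above, a toy annualised-loss calculation; every number and the damage curve are invented for illustration.

        # Toy "risk = f(hazard, exposure, vulnerability)" sketch.
        import numpy as np

        intensity = np.array([0.2, 0.4, 0.6, 0.8])      # hazard levels, e.g. PGA in g (assumed)
        p_annual = np.array([0.1, 0.02, 0.005, 0.001])  # annual probability of each level (assumed)
        exposure = 5e8                                  # replacement value of assets, $ (assumed)

        def damage_ratio(x):
            """Hypothetical mean damage ratio as a function of hazard intensity."""
            return np.clip(1.5 * x - 0.2, 0.0, 1.0)

        expected_annual_loss = np.sum(p_annual * damage_ratio(intensity) * exposure)
        print(f"Expected annual loss: ${expected_annual_loss:,.0f}")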

  • In response to the devastating Indian Ocean Tsunami (IOT) of 26 December 2004, Geoscience Australia developed a framework for tsunami risk modelling. The outputs from this methodology have been used by emergency managers throughout Australia to plan and prepare for future events. For Geoscience Australia to be confident in the information provided to the various stakeholders, validation of the model and methodology is required. Tsunami modelling at Geoscience Australia employs a hybrid approach which couples two models at the continental shelf. First, we use an elastic dislocation model to simulate the initial sea-floor displacement of an earthquake source. The tsunami is then propagated across the deep ocean using URSGA, a finite difference model that solves the non-linear shallow water wave equations across nested grids. We stop this model at the 100 m water depth contour and couple it to a detailed inundation modelling tool, ANUGA, developed by Geoscience Australia and the Australian National University. ANUGA also solves the non-linear shallow water wave equations, using a finite volume method; it incorporates bottom friction coefficients and can resolve hydraulic shocks and the wetting and drying process. While the huge loss of life from the 2004 Indian Ocean tsunami was tragic, it provided a unique opportunity to record the impact of a large tsunami event. Information gained from post-tsunami surveys and tide gauge recordings at Patong Bay, Thailand and Geraldton, Western Australia is used to validate our tsunami inundation modelling methodology. Using these two locations lets us assess the performance of our models at near-source and distal locations. In addition, wave heights observed in the deep ocean from satellite altimetry are used to validate our deep-water propagation model.
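    For reference, one common conservative form of the nonlinear shallow water equations that solvers of this kind discretise (generic notation with Manning bed friction; not the exact formulation of URSGA or ANUGA):

        \begin{aligned}
        \frac{\partial h}{\partial t} + \frac{\partial (uh)}{\partial x} + \frac{\partial (vh)}{\partial y} &= 0,\\
        \frac{\partial (uh)}{\partial t} + \frac{\partial}{\partial x}\left(u^2 h + \tfrac{1}{2} g h^2\right) + \frac{\partial (uvh)}{\partial y} &= -g h \frac{\partial z}{\partial x} - \frac{g n^2 u \sqrt{u^2 + v^2}}{h^{1/3}},\\
        \frac{\partial (vh)}{\partial t} + \frac{\partial (uvh)}{\partial x} + \frac{\partial}{\partial y}\left(v^2 h + \tfrac{1}{2} g h^2\right) &= -g h \frac{\partial z}{\partial y} - \frac{g n^2 v \sqrt{u^2 + v^2}}{h^{1/3}},
        \end{aligned}

    where h is the water depth, (u, v) the depth-averaged velocity, z the bed elevation, g the gravitational acceleration and n Manning's roughness coefficient.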

  • The dataset contains three grids. Each of the ArcINFO grids is an output of a fine-scale hydrodynamic model, the Simulating WAves Nearshore (SWAN) model (Booij et al., 1999; Ris et al., 1999). The grids describe the modelled maximum orbital velocity (m/s), which can be used as an estimate of seabed exposure in Jervis Bay.
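    Near-bed orbital velocity of the kind these grids describe can be approximated from linear wave theory; a sketch, with all inputs as illustrative assumptions (SWAN's internal computation, which works on full wave spectra, differs in detail):

        # Peak near-bed orbital velocity from linear wave theory.
        import numpy as np

        def wavenumber(T, depth, g=9.81, iterations=100):
            """Solve the dispersion relation omega^2 = g k tanh(k h) by damped fixed-point iteration."""
            omega = 2.0 * np.pi / T
            k = omega**2 / g  # deep-water first guess
            for _ in range(iterations):
                k = 0.5 * (k + omega**2 / (g * np.tanh(k * depth)))
            return k

        def bottom_orbital_velocity(H, T, depth):
            """Peak orbital velocity (m/s) at the bed for wave height H (m), period T (s), depth (m)."""
            k = wavenumber(T, depth)
            return np.pi * H / (T * np.sinh(k * depth))

        print(bottom_orbital_velocity(H=1.5, T=8.0, depth=15.0))  # ~0.4 m/s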

  • The Tropical Cyclone Risk Model (TCRM) is a statistical-parametric model of tropical cyclone behaviour and effects. A statistical model is used to generate synthetic tropical cyclone events; these are then combined with a parametric wind field model to produce estimates of cyclonic wind hazard.
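    As an illustration of the parametric wind field idea, a sketch of the widely used Holland (1980) radial gradient wind profile; the parameter values are invented, and TCRM's own wind field options differ in detail:

        # Holland (1980) parametric gradient wind profile (illustrative parameters).
        import numpy as np

        def holland_gradient_wind(r, dp, rmax, B, lat, rho=1.15):
            """Gradient-level wind speed (m/s) at radius r (m) from the cyclone centre.

            dp: central pressure deficit (Pa); rmax: radius of maximum winds (m);
            B: Holland shape parameter (~1.0-2.5); lat: latitude in degrees.
            """
            f = 2.0 * 7.292e-5 * np.sin(np.radians(lat))  # Coriolis parameter
            x = (rmax / r) ** B
            return np.sqrt(B * dp * x * np.exp(-x) / rho + (r * f / 2.0) ** 2) - np.abs(r * f / 2.0)

        r = np.linspace(5e3, 300e3, 60)  # radii from 5 km to 300 km
        v = holland_gradient_wind(r, dp=5000.0, rmax=40e3, B=1.6, lat=-18.0)
        print(f"Peak wind {v.max():.1f} m/s near r = {r[v.argmax()] / 1000:.0f} km")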