Abstract
-
Typological analysis of Australian coastal aquifers
-
In 1994, the United Nations Regional Cartographic Conference for Asia and the Pacific resolved to establish a Permanent Committee comprising national surveying and mapping agencies to address the concept of establishing a common geographic information infrastructure for the region. This resolution subsequently led to the establishment of the Permanent Committee for GIS Infrastructure for Asia and the Pacific (PCGIAP). One of the goals of the PCGIAP was to establish and maintain a precise understanding of the relationship between permanent geodetic stations across the region. To this end, campaign-style geodetic GPS observations, coordinated by Geoscience Australia, have been undertaken throughout the region since 1997. In this presentation, we discuss the development of an Asia-Pacific regional reference frame based on the PCGIAP GPS campaign data, which now includes data from 417 non-IGS GPS stations and provides long-term crustal deformation estimates for over 200 GPS stations throughout the region. We present and evaluate: our combination strategy, with particular emphasis on the alignment of the solution onto the International Terrestrial Reference Frame (ITRF); the sensitivity of the solution to reference frame site selection; the treatment of regional co-seismic and post-seismic deformation; and the Asia-Pacific contribution to the International Association of Geodesy (IAG) Working Group on "Regional Dense Velocity Fields". The level of consistency of the coordinate estimates with respect to ITRF2005 is 6, 5 and 15 mm in the east, north and up components, respectively, while the velocity estimates are consistent at 2, 2 and 6 mm/yr in the same components.
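As an illustrative aside (not the combination software used for the PCGIAP solution), the alignment of a regional solution onto the ITRF can be sketched as a least-squares seven-parameter Helmert transformation estimated over a set of reference-frame sites; the station coordinates below are made-up values.

```python
# Minimal sketch of a 7-parameter Helmert alignment of a regional GPS
# solution onto ITRF, estimated by linear least squares over reference-frame
# sites.  Illustrative only: the coordinates are synthetic and the real
# PCGIAP combination uses dedicated geodetic software.
import numpy as np

def helmert_design(xyz):
    """Design matrix rows for [tx, ty, tz, rx, ry, rz, scale] (small angles)."""
    rows = []
    for x, y, z in xyz:
        rows += [[1, 0, 0,  0,  z, -y, x],
                 [0, 1, 0, -z,  0,  x, y],
                 [0, 0, 1,  y, -x,  0, z]]
    return np.array(rows)

def estimate_helmert(regional_xyz, itrf_xyz):
    """Least-squares Helmert parameters mapping regional coordinates onto ITRF."""
    A = helmert_design(regional_xyz)
    d = (itrf_xyz - regional_xyz).ravel()           # coordinate differences
    params, *_ = np.linalg.lstsq(A, d, rcond=None)
    residuals = d - A @ params                      # per-component misfit
    return params, residuals.reshape(-1, 3)

# Hypothetical example with three reference-frame sites (ECEF metres):
regional = np.array([[-4052052.0, 4212836.0, -2545105.0],
                     [-2409154.1, 4878507.3, -3354402.2],
                     [-4460996.0, 2682557.1, -3674443.9]])
itrf = regional + np.array([[0.012, -0.008, 0.015],
                            [0.010, -0.007, 0.013],
                            [0.011, -0.009, 0.016]])
params, res = estimate_helmert(regional, itrf)
print("Helmert parameters:", params)
```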
-
Imagine you are an incident controller viewing a computer screen which depicts the likely spread of a bushfire that has just started. The display shows houses and other structures in the fire's path, and even the demographics of the people living in the area - such as the number of people, their age spread, whether the household has independent transport, and whether English is their second language. In addition, imagine that you can quantify and display the uncertainty in both the fire weather and the type and state of the vegetation, enabling the delivery of a range of simulations relating to the expected fire spread and impact. You would be able to address the 'what if' scenarios as the event unfolds and reject those scenarios that are no longer plausible. The advantages of such a simulation system in making speedy, well-informed decisions have been considered by a group of Bushfire CRC researchers who have collaborated to produce a 'proof of concept' system, initially for use in addressing three case studies. The system has the working name FireDST (Fire Impact and Risk Evaluation Decision Support Tool). FireDST links various databases and models, including the Phoenix RapidFire fire prediction model and a building vulnerability assessment model (radiant heat and ember attack), as well as infrastructure and demographic databases. The information is assembled into an integrated simulation framework through a geographical information system (GIS) interface. Pre-processed information, such as the factors that determine the local and regional wind and the typical response of buildings to fire, is linked to the buildings through a database, along with census-derived social and economic information. This presentation provides an overview of the FireDST simulation 'proof of concept' tool and walks through a sample probabilistic simulation constructed using the tool.
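The kind of probabilistic 'what if' framework described above can be sketched, very loosely, as an ensemble of fire-spread runs over sampled weather and fuel uncertainties; the toy spread model, asset list and parameter ranges below are placeholders and bear no relation to Phoenix RapidFire or the FireDST code itself.

```python
# Illustrative ensemble loop for a probabilistic fire-impact simulation.
# Everything here is a placeholder: FireDST couples Phoenix RapidFire with
# vulnerability and demographic databases through a GIS interface.
import random

ASSETS = {"house_A": 3.5, "house_B": 7.0, "school": 10.0}  # km downwind of ignition

def fire_reach_km(wind_kmh, fuel_dryness, hours=4.0):
    """Toy spread model: distance reached downwind after `hours`."""
    rate_kmh = 0.05 * wind_kmh * fuel_dryness      # crude rate-of-spread proxy
    return rate_kmh * hours

def run_ensemble(n_runs=1000, seed=1):
    random.seed(seed)
    hits = {name: 0 for name in ASSETS}
    for _ in range(n_runs):
        wind = max(random.gauss(45.0, 10.0), 0.0)  # uncertain fire weather
        dryness = random.uniform(0.6, 1.0)         # uncertain fuel state
        reach = fire_reach_km(wind, dryness)
        for name, dist in ASSETS.items():
            if reach >= dist:
                hits[name] += 1
    return {name: hits[name] / n_runs for name in ASSETS}

# Fraction of ensemble members in which each asset lies within the fire run.
print(run_ensemble())
```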
-
Floodplain vegetation can be degraded by both too much and too little water as a result of river regulation. Over-regulation and increased use of groundwater in these landscapes can exacerbate the effects of natural climate variability. Prolonged flooding of woody plants has been found to induce a number of physiological disturbances, such as early stomatal closure and inhibition of photosynthesis. However, drought conditions can also result in leaf biomass reduction and sapwood area decline. Depending on the species, different inundation and drought tolerances are observed. This paper focuses specifically on differing lake-level management practices in order to assess the associated environmental impacts. In western NSW, two Eucalyptus species, River Red Gum (E. camaldulensis) and Black Box (E. largiflorens), have well-documented tolerances, and both are located on the fringes of lakes in the Menindee Lakes Storage Scheme. Flows to these lakes have been controlled since 1960 and lake levels monitored since 1979. Pre-regulation aerial photos indicate a significant change to the distribution of lake-floor and fringing vegetation in response to increased inundation frequency and duration. In addition, by coupling historic lake water-level data with Landsat satellite imagery, the spatial and temporal vegetation response to different water regimes has been observed. Two flood events specifically investigated are the 2010/11 and 1990 floods. Results from this analysis provide historic examples of vegetation response to lake regulation, including whether recorded inundation duration and frequency resulted in positive or negative impacts, the time delay until effects become evident, the duration of the observed response, and general recovery/reversal times. These findings can be used to inform ongoing water management decisions.
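A minimal sketch of how a Landsat-derived vegetation index might be coupled with a historic lake-level record is given below; the NDVI calculation is standard, but the zone-mean reflectances, lake levels and lag analysis are synthetic illustrations rather than the study's actual workflow.

```python
# Sketch of coupling a Landsat NDVI time series with historic lake levels.
# Assumes reflectance has already been averaged over the fringing-vegetation
# zone per scene; all arrays below are synthetic placeholders.
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index from red/NIR reflectance."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def lagged_correlation(veg_index, lake_level, max_lag=6):
    """Pearson correlation of zone-mean NDVI with lake level at lags 0..max_lag,
    indicating the delay before a vegetation response becomes evident."""
    out = {}
    for lag in range(max_lag + 1):
        v = veg_index[lag:]
        w = lake_level[:len(lake_level) - lag] if lag else lake_level
        out[lag] = float(np.corrcoef(v, w)[0, 1])
    return out

# Synthetic example: 40 scenes of zone-mean red/NIR reflectance and lake levels.
rng = np.random.default_rng(0)
levels = 58 + 2 * np.sin(np.linspace(0, 4 * np.pi, 40))            # m (made up)
nir = 0.35 + 0.04 * np.roll(np.sin(np.linspace(0, 4 * np.pi, 40)), 3)
red = 0.18 + 0.01 * rng.standard_normal(40)
veg = ndvi(red, nir)                                               # per-scene NDVI
print(lagged_correlation(veg, levels))
```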
-
Monitoring changes in the spatial distribution and health of biotic habitats requires spatially extensive surveys repeated through time. Although a number of habitat distribution mapping methods have been successful in clear, shallow-water coastal environments (e.g. aerial photography and Landsat imagery) and deeper marine environments (e.g. multibeam and sidescan sonar), these methods fail in highly turbid and shallow environments such as many estuarine ecosystems. To map, model and predict key biotic habitats (seagrasses, green and red macroalgae, polychaete mounds [Ficopomatus enigmaticus] and mussel clumps [Mytilus edulis]) across a range of open and closed estuarine systems on the south-west coast of Western Australia, we integrated post-processed underwater video data with interpolated physical and spatial variables using Random Forest models. Predictive models and associated standard deviation maps were developed from fine-scale habitat cover data. Models performed well for spatial predictions of benthic habitats, with 79-90% of variation explained by depth, latitude, longitude and water quality parameters. The results of this study refine existing baseline maps of estuarine habitats and highlight the importance of biophysical processes driving plant and invertebrate species distribution within estuarine ecosystems. This study also shows that machine-learning techniques, now commonly used in terrestrial systems, have important applications in coastal marine ecosystems. When applied to video data, these techniques provide a valuable approach to mapping and managing ecosystems that are too turbid for optical methods or too shallow for acoustic methods.
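A minimal scikit-learn sketch of the Random Forest approach is shown below; the predictor names and synthetic data are placeholders for the study's video-derived habitat cover and interpolated physical variables, not its actual dataset.

```python
# Minimal sketch of a Random Forest habitat-cover model of the kind described
# above, using scikit-learn.  Predictors and data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.uniform(0.2, 5.0, n),       # depth (m)
    rng.uniform(-33.6, -32.5, n),   # latitude
    rng.uniform(115.0, 115.8, n),   # longitude
    rng.uniform(5, 45, n),          # salinity
    rng.uniform(1, 15, n),          # turbidity
])
# Synthetic seagrass cover driven mostly by depth and salinity.
y = np.clip(80 - 12 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 5, n), 0, 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out sites:", round(model.score(X_test, y_test), 2))
print("Variable importances:", model.feature_importances_.round(2))
```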
-
The term "Smartline" refers to a GIS line map format which can allow rapid capture of diverse coastal data into a single consistently classified map, which in turn can be readily analysed for many purposes. This format has been used to create a detailed nationally-consistent coastal geomorphic map of Australia, which is currently being used for the National Coastal Vulnerability Assessment (NCVA) as part of the underpinning information for understanding the vulnerability to sea level rise and other climate change influenced hazards such as storm surge. The utility of the Smartline format results from application of a number of key principles. A hierarchical form- and fabric-based (rather than morpho-dynamic) geomorphic classification is used to classify coastal landforms in shore-parallel tidal zones relating to but not necessarily co-incident with the GIS line itself. Together with the use of broad but geomorphically-meaningful classes, this allows Smartline to readily import coastal data from a diversity of differently-classified prior sources into one consistent map. The resulting map can be as spatially detailed as the available data sources allow, and can be used in at least two key ways: Firstly, Smartline can work as a source of consistently classified information which has been distilled out of a diversity of data sources and presented in a simple format from which required information can be rapidly extracted using queries. Given the practical difficulty many coastal planners and managers face in accessing and using the vast amount of primary coastal data now available in Australia, Smartline can provide the means to assimilate and synthesise all this data into more usable forms.
-
This presentation will provide an overview of some of the work currently being undertaken at Geoscience Australia (GA) as part of the National Coastal Vulnerability Assessment (NCVA), funded by the Department of Climate Change (DCC). The presentation will summarise the methodology applied and highlight the key issues, including limitations and data gaps.
-
A detailed study to estimate magnetic bottom depths beneath north Queensland has been made using the continent-wide high-resolution airborne total magnetic intensity (TMI) data of Australia (a source dataset for the World Digital Magnetic Anomaly Map, WDMAM). Magnetisation of the lithosphere is generally assumed to be insignificant below the Moho crust/mantle boundary due to compositional changes. However, in regions of high temperatures in the lower crust, the bottom depth of magnetisation may be significantly shallower than the Moho because temperatures exceed the Curie point of the dominant magnetic mineralogy. This study uses modelling of the azimuthally averaged log power spectrum of the TMI data to determine bottom depths. Two methods are considered and compared: slope fitting and automated fitting of the full spectrum. Several issues in successfully using these methods have been addressed, such as magnetisation type, the size of the data window, the location of the spectral peak, the sensitivities of the spectral parameters and the choice of optimisation algorithm. The TMI data have an initial grid resolution of 80 m, with an appropriate IGRF removed. These data are reduced to the pole, upward continued by 1 km, sub-sampled to a 1 km grid spacing, and a first-order polynomial trend is removed prior to the spectral analysis. Calculated magnetic bottom depths are compared both with published data on the depth to Moho and with other model interpretations of the area, including heat flow modelling.
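A hedged sketch of the slope-fitting (centroid) variant of the spectral method is given below; the synthetic spectrum, wavenumber bands and layer depths are illustrative only and are not drawn from the study's data.

```python
# Illustrative slope-fitting sketch of the spectral (centroid) approach to
# magnetic bottom-depth estimation (Zb = 2*Z0 - Zt).  The layer model, bands
# and synthetic spectrum are placeholders; the study also evaluates automated
# fitting of the full spectrum.
import numpy as np

def depth_from_slope(k, ln_amp):
    """Depth (km, positive down) from the slope of a log-amplitude-spectrum
    segment, using ln A ~ const - k*z for radial wavenumber k in rad/km."""
    slope = np.polyfit(k, ln_amp, 1)[0]
    return -slope

def bottom_depth(k, power, top_band, centroid_band):
    """Top depth Zt from ln(sqrt(P)) in a high-wavenumber band, centroid depth
    Z0 from ln(sqrt(P)/k) in a low-wavenumber band, then Zb = 2*Z0 - Zt."""
    ln_amp = 0.5 * np.log(power)
    t = (k >= top_band[0]) & (k <= top_band[1])
    c = (k >= centroid_band[0]) & (k <= centroid_band[1])
    z_top = depth_from_slope(k[t], ln_amp[t])
    z_cen = depth_from_slope(k[c], ln_amp[c] - np.log(k[c]))
    return 2.0 * z_cen - z_top

# Synthetic azimuthally averaged spectrum for a magnetic layer with its top at
# 2 km and bottom at 30 km; the low-wavenumber approximation makes the
# recovered bottom depth approximate rather than exact.
k = np.linspace(0.002, 1.0, 500)            # rad/km
power = np.exp(-2 * k * 2.0) * (1 - np.exp(-k * (30.0 - 2.0))) ** 2
print("Estimated bottom depth (km):",
      round(bottom_depth(k, power, top_band=(0.4, 1.0),
                         centroid_band=(0.002, 0.02)), 1))
```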
-
North Queensland Geodynamic and Mineral System Synthesis
-
When considering structural design with regard to wind loading, the Australian building code, through the Australia/New Zealand Wind Actions Standard (AS/NZS 1170.2, 2002), as well as the wind engineering community in general, relies to a significant extent on the peak wind gust speed observations collected over more than 60 years by the Bureau of Meteorology (BoM). The wind-loading performance (resilience) of our infrastructure is therefore based primarily on the Dines anemometer interpretation of the peak gust wind speed. In the early 1990s, BoM commenced a program to replace the ageing pressure-tube Dines anemometer with the Synchrotac and Almos cup anemometers. This paper presents the results of a reanalysis of the current BoM peak wind gust database for the non-cyclonic region (Region A) of AS/NZS 1170.2 (2002). We compare estimates of the 500-year return period (RP) peak wind gust hazard magnitude derived from observing records of varying length at 31 "Region A" BoM sites. Region A was considered for this initial study because its records contain a significant number of extreme events (synoptic or thunderstorm) over decadal time scales (i.e. extremes not dominated by one or two tropical cyclone events). To isolate the issue of anemometer replacement, only wind stations located at airports (consistent exposure) and with more than 30 years of record were considered. The methodology was formulated to explore the consistency of peak wind gust measurements given the issues surrounding equipment upgrading. Comparison of results indicated that the recent period (1990-2006) appears to show a reduction in significant events (13 of 31 sites have a mean 500-year RP below the 95% confidence limit for the 500-year RP estimate using the total record). Future plans are to calibrate some existing Dines instruments in situ in an effort to provide sufficient information to fully specify their dynamic response over the range of operating conditions.
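For illustration, a return-period analysis of this kind can be sketched by fitting an extreme-value (Gumbel) distribution to annual maximum gusts and bootstrapping a confidence interval for the 500-year return level; the synthetic annual maxima below stand in for a BoM station record and do not reproduce the study's results.

```python
# Sketch of a 500-year return-period gust estimate: fit a Gumbel distribution
# to annual maximum gusts and bootstrap a confidence interval.  The synthetic
# "annual maxima" are placeholders for a BoM station record.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
annual_max_gust = stats.gumbel_r.rvs(loc=28.0, scale=4.5, size=45,
                                     random_state=rng)   # m/s, synthetic

def return_level(maxima, rp_years=500):
    """Gust speed with annual exceedance probability 1/rp_years."""
    loc, scale = stats.gumbel_r.fit(maxima)
    return stats.gumbel_r.ppf(1.0 - 1.0 / rp_years, loc=loc, scale=scale)

def bootstrap_ci(maxima, rp_years=500, n_boot=2000, alpha=0.05):
    """Percentile bootstrap interval for the return level."""
    levels = [return_level(rng.choice(maxima, size=len(maxima), replace=True),
                           rp_years) for _ in range(n_boot)]
    return np.percentile(levels, [100 * alpha / 2, 100 * (1 - alpha / 2)])

print("500-yr RP gust (m/s):", round(return_level(annual_max_gust), 1))
print("95% CI:", bootstrap_ci(annual_max_gust, 500).round(1))
```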