  • The Australian National Gravity Database (ANGD) contains over 1.8 million gravity observations from over 2,000 surveys conducted in Australia over the last 80 years. Three processes are required to correct these observations for the effects of the surrounding topography: firstly a Bouguer correction (Bullard A), which approximates the topography as an infinite horizontal slab (the standard slab formula is sketched below); secondly a correction to that horizontal slab for the curvature of the Earth (Bullard B); and thirdly a terrain correction (Bullard C), which accounts for the undulations in the surrounding topography. These three corrections together produce complete Bouguer anomalies. Since February 2008, a spherical cap Bouguer anomaly calculation has been applied to data extracted from the ANGD. This calculation applies the Bullard A and Bullard B corrections. Terrain corrections (Bullard C) have now been calculated for all terrestrial gravity observations in the ANGD, allowing the calculation of complete Bouguer anomalies. These terrain corrections were calculated using the Shuttle Radar Topography Mission 3 arc-second digital elevation data. The complete Bouguer anomalies calculated for the ANGD provide users of the data with a more accurate representation of crustal density variations through the application of a more accurate Earth model to the gravity observations.
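
    A minimal sketch of the Bullard A term referred to above, using the textbook infinite-slab formula and the conventional 2670 kg/m³ reduction density. The function name, density value and example height are illustrative assumptions; this is not the ANGD processing code.

```python
# Sketch of the Bullard A (infinite horizontal slab) correction.
# Assumes the conventional 2670 kg/m^3 reduction density; textbook formula only.
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO = 2670.0           # conventional crustal density, kg/m^3

def bullard_a_mgal(height_m: float) -> float:
    """Infinite-slab (Bouguer) correction in mGal for a station height_m above the datum."""
    correction_si = 2.0 * math.pi * G * RHO * height_m   # m/s^2
    return correction_si / 1.0e-5                         # 1 mGal = 1e-5 m/s^2

# A station 500 m above the datum gives roughly 56 mGal of slab correction.
print(f"{bullard_a_mgal(500.0):.1f} mGal")
```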

  • Discusses reasons to use the Australian Stratigraphic Units Database (ASUD), and new features of the web query page and reports

  • In this age of state-of-the-art devices producing analytical results with little input from analytical specialists, how do we know that the results produced are correct? When reporting the result of a measurement of a physical quantity, it is obligatory that some quantitative indication of the quality of the result be given so that those who use it can assess its reliability. Without such an indication, measurement results cannot be compared, either among themselves or with reference values given in a specification or standard. It is therefore necessary that there be a readily implemented, easily understood, and generally accepted procedure for characterising the quality of a result of a measurement, that is, for evaluating and expressing its 'uncertainty' (a generic example is sketched below). The concept of 'uncertainty' as a quantifiable attribute is relatively new in the history of measurement, although error and error analysis have long been part of the practice of measurement science, or 'metrology'. It is now widely recognised that, when all of the known or suspected components of error have been evaluated and the appropriate corrections have been applied, there still remains an uncertainty about the correctness of the stated result, that is, a doubt about how well the result of the measurement represents the value of the quantity being measured. This presentation will discuss the latest practices for the production of 'reliable' geochemical data that are associated with small measurement uncertainties, and will provide an overview of current understanding of metrological traceability and the proper use of reference materials. Correct use of reference materials will be discussed, as well as the role of measurement uncertainty and how it is affected by such issues as sample preparation, sample heterogeneity and data acquisition.
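
    A minimal sketch of how independent standard uncertainty components are commonly combined in quadrature and reported with a coverage factor, in the GUM style. The component values and the k = 2 coverage factor are illustrative assumptions, not figures from the presentation.

```python
# Minimal sketch: combine independent relative standard uncertainties (%) in
# quadrature and report an expanded uncertainty. Component values are illustrative.
import math

def combined_standard_uncertainty(components_pct: list[float]) -> float:
    """Root-sum-of-squares of independent relative standard uncertainties (%)."""
    return math.sqrt(sum(u ** 2 for u in components_pct))

components = [0.8, 1.2, 0.5]   # e.g. sample preparation, heterogeneity, data acquisition
u_c = combined_standard_uncertainty(components)
U = 2.0 * u_c                   # expanded uncertainty with coverage factor k = 2
print(f"u_c = {u_c:.2f} %, U (k=2) = {U:.2f} %")
```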

  • Proceedings of the Second National Forum on GIS in the Geosciences, 29 - 31 March 1995, held at the National Library of Australia.

  • Improvements to the Australian Crustal Temperature Image

  • Marine science is expensive. Duplication of research activities is potentially money wasted. Not being aware of other marine science studies could call into question the validity of findings made in single-discipline studies. A simple means of discovery is needed. The development of Earth Browsers (principally Google Earth) and KML (Keyhole Markup Language) files offers a possible solution. Google Earth is easy to use, and KML files are relatively simple ASCII, XML-tagged files that can encode locations (points, lines and polygons), relevant metadata for presentation in descriptive 'balloons', and links to digital sources (data, publications, web pages, etc.). A suite of studies will be presented showing how information relating to investigations at a point (e.g. an observation platform), along a line (e.g. a shipborne survey) or over a region (e.g. satellite imagery) can be presented in a small (10 Kbyte) file; a minimal example is sketched below. The information will cover a range of widely used data types including seismic data, underwater video, image files, documents and spreadsheets. All will be sourced directly from the web and can be downloaded from within the browser to one's desktop for analysis with appropriate applications. To be useful, this methodology requires data and metadata to be properly managed, and a degree of cooperation between major marine science organisations, which could become 'sponsors' of the principal marine science disciplines (i.e. oceanography, marine biology, geoscience). This need not be a complex task in many cases. The partitioning of the sciences is not important, so long as the information is managed effectively and its existence is widely advertised. KML files provide a simple way of achieving this. The various discipline-based KML files could be hosted by an umbrella organisation such as the AODCJF, enabling it to become a 'one-stop shop' for marine science data.
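
    A minimal sketch of how compact such a file can be: it writes one KML placemark with a description 'balloon' and a link to a data source. The survey name, coordinates, file name and URL are placeholders, not records from an actual marine science catalogue.

```python
# Sketch: write a minimal KML placemark with a metadata balloon and a data link.
# The survey name, coordinates and URL are placeholders.
from xml.sax.saxutils import escape

def placemark(name: str, lon: float, lat: float, description_html: str) -> str:
    return (
        "<Placemark>"
        f"<name>{escape(name)}</name>"
        f"<description><![CDATA[{description_html}]]></description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
    + placemark(
        "Example observation platform",   # hypothetical record
        151.2, -33.9,
        '<p>Underwater video, example deployment.</p>'
        '<p><a href="http://example.org/data.zip">Download data</a></p>',
    )
    + "</Document></kml>"
)

with open("marine_survey.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```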

  • Although there are several resources for storing and accessing geochronological data, there is no standard format for exchanging geochronology data among users. Current systems are an inefficient mixture of comma-delimited text files, Excel spreadsheets and PDFs that assume prior specialist knowledge and force the user to laboriously, and potentially erroneously, extract the required data manually. With increasing demands for data interoperability, this situation is becoming intolerable not only among researchers, but also at the funding agency level. Geoscience Australia and partners are developing a standard data exchange format for geochronological data based on XML (eXtensible Markup Language), a technology that has been demonstrated in other geological data applications and is an important aspect of emerging international geoscience data format standards; an illustrative record is sketched below. This presentation will discuss developments at Geoscience Australia and the opportunities for participation. Keywords: geochronology, data management, metadata, standards.
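
    A minimal sketch of what an XML-encoded geochronology record might look like, serialised with the Python standard library. The element names, attributes and values are hypothetical and do not reflect the schema under development at Geoscience Australia.

```python
# Sketch: serialise a single hypothetical geochronology analysis to XML.
# Element names, attributes and values are illustrative only, not the GA schema.
import xml.etree.ElementTree as ET

analysis = ET.Element("geochronAnalysis", id="sample-001")
ET.SubElement(analysis, "method").text = "SHRIMP U-Pb zircon"
ET.SubElement(analysis, "material").text = "zircon"
age = ET.SubElement(analysis, "age", unit="Ma")
ET.SubElement(age, "value").text = "1592"
ET.SubElement(age, "uncertainty", level="2sigma").text = "8"
location = ET.SubElement(analysis, "location")
ET.SubElement(location, "latitude").text = "-30.75"
ET.SubElement(location, "longitude").text = "136.50"

ET.indent(analysis)  # pretty-printing requires Python 3.9+
print(ET.tostring(analysis, encoding="unicode"))
```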

  • A compilation of datasets gathered for the Central Gawler Gold subproject was released at the Gawler Craton: State of Play 2004 conference, held in Adelaide on 4 - 6 August 2004. This presentation gives examples of some of the more recent datasets available in the compilation, such as new AEM, crystalline basement, gravity, magnetics and worm layers.

  • This data is part of the series of maps that covers the whole of Australia at a scale of 1:250 000 (1 cm on a map represents 2.5 km on the ground) and comprises 513 maps. This is the largest scale at which published topographic maps cover the entire continent. Data is downloadable in various distribution formats.
