  • Approximately 75% of Australia is covered by public-domain airborne gamma-ray spectrometric surveys. However, all of the older surveys are in units of counts per second (c/s), and their data values depend on the survey instrumentation and acquisition parameters. Many of the newer surveys were also inadequately calibrated, with the result that data values on adjacent surveys are not necessarily comparable. This limits the usefulness of these data. Geoscience Australia and the State Geological Surveys are working towards establishing a national baseline database of Australian gamma-ray spectrometric data that is consistent with the global radioelement baseline. This will be achieved by: (a) ensuring consistency in the calibration and processing of new gamma-ray spectrometric data through the use of standard processing procedures and calibration facilities that are tied to the global datum; and (b) adjusting older surveys to the global datum through back-calibration and automatic grid merging. Surveys that are registered to the same datum are easily merged into regional compilations, which facilitate the recognition and interpretation of broad-scale regional features and allow lessons learnt in one area to be more easily applied to other areas.
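The back-calibration step mentioned in this abstract amounts to fitting a linear transform between a legacy survey recorded in c/s and an overlapping survey that is already tied to the global radioelement datum. The sketch below is a minimal illustration of that idea only, not the cited work's method; the sample values, variable names, and the simple least-squares levelling are assumptions for demonstration.

```python
import numpy as np

# Hypothetical values sampled at coincident grid nodes in the overlap zone:
# 'old_cps' is the legacy survey in counts per second; 'ref_conc' is the
# neighbouring survey already expressed on the global radioelement datum.
old_cps = np.array([120.0, 155.0, 98.0, 210.0, 175.0, 140.0])
ref_conc = np.array([1.9, 2.5, 1.5, 3.4, 2.8, 2.2])   # e.g. eU in ppm

# Fit ref_conc ~ gain * old_cps + offset by least squares.
gain, offset = np.polyfit(old_cps, ref_conc, 1)

# Apply the fitted transform to the whole legacy grid to place it on the datum.
legacy_grid = np.array([[110.0, 130.0],
                        [160.0, 190.0]])
levelled_grid = gain * legacy_grid + offset

print(f"gain = {gain:.4f}, offset = {offset:.4f}")
print(levelled_grid)
```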

  • Although there are several resources for storing and accessing geochronological data, there is no standard format for exchanging geochronology data among users. Current systems are an inefficient mixture of comma-delimited text files, Excel spreadsheets and PDFs that assume prior specialist knowledge and force the user to extract the required data manually, a laborious and error-prone process. With increasing demands for data interoperability, this situation is becoming intolerable not only among researchers, but also at the funding agency level. Geoscience Australia and partners are developing a standard data exchange format for geochronological data based on XML (eXtensible Markup Language) technology, which has been demonstrated in other geological data applications and is an important aspect of emerging international geoscience data format standards. This presentation will discuss developments at Geoscience Australia and the opportunities for participation. Key words: Geochronology, data management, metadata, standards.
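To make the idea of an XML-based exchange record concrete, the sketch below builds a small geochronology record with Python's standard library. The element names and values are hypothetical placeholders; the actual exchange schema described in the abstract is still under development and is not reproduced here.

```python
import xml.etree.ElementTree as ET

# Hypothetical element names illustrating the general shape of an XML
# geochronology record -- not the schema being developed by Geoscience Australia.
analysis = ET.Element("GeochronologyAnalysis", attrib={"id": "EXAMPLE-0001"})
ET.SubElement(analysis, "sampleID").text = "SAMPLE-42"
ET.SubElement(analysis, "method").text = "SHRIMP U-Pb zircon"
age = ET.SubElement(analysis, "age", attrib={"unit": "Ma"})
ET.SubElement(age, "value").text = "2705"
ET.SubElement(age, "uncertainty", attrib={"level": "2-sigma"}).text = "8"
ET.SubElement(analysis, "laboratory").text = "Example laboratory"

# Serialise to a string that could be exchanged between systems.
print(ET.tostring(analysis, encoding="unicode"))
```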

  • The Australian National Marine Data Group (ANMDG) was formed by the Heads of Marine Agencies (HOMA) to promote improved interchange of marine data in Australia. The ANMDG held a workshop of practitioners in May 2002 with the intention of identifying major areas of interest and tasks for working groups to address in order to make progress with the development of marine data interchange in Australia. This Proceedings CD contains the presentations by speakers in the form of PowerPoint slides and a few Acrobat documents. It was distributed to participants in the workshop.

  • This extended abstract describes the 1:1 million scale Surface Geology of Northern Territory digital dataset and advances in digital data delivery via WMS/WFS services and the GeoSciML geological data model.
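The data delivery mentioned here relies on standard OGC web services. The sketch below shows how a client might request features from a WFS endpoint serving GeoSciML; the endpoint URL is a placeholder and the feature type name is assumed, while the request parameters follow the standard WFS 1.1.0 key-value encoding.

```python
import urllib.parse
import urllib.request

# Placeholder endpoint -- substitute the actual service URL.
endpoint = "https://example.gov.au/geoserver/wfs"

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "gsml:MappedFeature",   # assumed GeoSciML feature type
    "maxFeatures": "10",
}
url = endpoint + "?" + urllib.parse.urlencode(params)

# The response is GML/GeoSciML-encoded XML describing the mapped geology.
with urllib.request.urlopen(url) as response:
    print(response.read()[:500])
```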

  • Part-page article on matters relating to Australian stratigraphy. This column discusses what constitutes a publication for the purpose of establishing and formalising stratigraphic units. ISSN 0312-4711

  • In this age of state-of-the-art devices producing analytical results with little input from analytical specialists, how do we know that the results produced are correct? When reporting the result of a measurement of a physical quantity, it is obligatory that some quantitative indication of the quality of the result be given so that those who use it can assess its reliability. Without such an indication, measurement results cannot be compared, either among themselves or with reference values given in a specification or standard. It is therefore necessary that there be a readily implemented, easily understood, and generally accepted procedure for characterising the quality of a result of a measurement, that is, for evaluating and expressing its 'uncertainty'. The concept of 'uncertainty' as a quantifiable attribute is relatively new in the history of measurement, although error and error analysis have long been part of the practice of measurement science or 'metrology'. It is now widely recognised that, when all of the known or suspected components of error have been evaluated and the appropriate corrections have been applied, there still remains an uncertainty about the correctness of the stated result, that is, a doubt about how well the result of the measurement represents the value of the quantity being measured. This presentation will discuss the latest practices for the production of 'reliable' geochemical data that are associated with small measurement uncertainties, and will provide an overview of current understanding of metrological traceability and the proper use of reference materials. Correct use of reference materials will be discussed, as well as the role of measurement uncertainty and how it is affected by such issues as sample preparation, sample heterogeneity and data acquisition.
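As a concrete illustration of 'uncertainty' as a quantifiable attribute, the sketch below combines independent uncertainty components in quadrature (the simplified, uncorrelated case of the usual propagation-of-uncertainty approach) and expands the result with a coverage factor of 2. The component names and values are invented for demonstration and are not from the cited presentation.

```python
import math

# Hypothetical standard uncertainty components for a single analyte,
# each expressed as a relative standard uncertainty in percent.
components = {
    "sample preparation": 1.2,
    "sample heterogeneity": 2.0,
    "calibration / reference material": 0.8,
    "instrument counting statistics": 0.5,
}

# Uncorrelated components combine in quadrature (root-sum-of-squares).
combined = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty at approximately 95% confidence (coverage factor k = 2).
expanded = 2 * combined

print(f"combined u_c = {combined:.2f} %, expanded U (k=2) = {expanded:.2f} %")
```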

  • Marine science is expensive. Duplication of research activities is potentially money wasted. Not being aware of other marine science studies could call into question the validity of findings made in single-discipline studies. A simple means of discovery is needed. The development of Earth Browsers (principally Google Earth) and KML (Keyhole Markup Language) files offers a possible solution. Google Earth is easy to use, and KML files are relatively simple, ASCII, XML-tagged files that can encode locations (points, lines and polygons), relevant metadata for presentation in descriptive 'balloons', and links to digital sources (data, publications, web pages, etc.). A suite of studies will be presented showing how information relating to investigations at a point (e.g. observation platform), along a line (e.g. ship-borne survey) or over a region (e.g. satellite imagery) can be presented in a small (10 Kbyte) file. The information will cover a range of widely used data types including seismic data, underwater video, image files, documents and spreadsheets. All will be sourced directly from the web and can be downloaded from within the browser to one's desktop for analysis with appropriate applications. To be useful, this methodology requires data and metadata to be properly managed, and a degree of cooperation between major marine science organizations, which could become 'sponsors' of the principal marine science disciplines (i.e. oceanography, marine biology, geoscience). This need not be a complex task in many cases. The partitioning of the sciences is not important, so long as the information is managed effectively and its existence is widely advertised. KML files provide a simple way of achieving this. The various discipline-based KML files could be hosted by an umbrella organization such as the AODCJF, enabling it to become a 'one-stop-shop' for marine science data.
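To show how little is involved in encoding a point investigation with a descriptive balloon and a link to a source document, the sketch below writes a minimal KML placemark with the standard library. The station name, coordinates, and linked report URL are invented placeholders, not data from the cited studies.

```python
# Minimal single-placemark KML file; all specifics are placeholders.
placemark_name = "Hypothetical observation platform"
lon, lat = 151.21, -33.86
description = (
    "Moored instrument site.<br/>"
    '<a href="https://example.gov.au/survey-report.pdf">Survey report (PDF)</a>'
)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{placemark_name}</name>
    <description><![CDATA[{description}]]></description>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
</kml>"""

# The resulting file can be opened directly in Google Earth.
with open("marine_site.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```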

  • Codes, guidelines, and standard practices for naming and describing Australian stratigraphic units have been discussed for more than 60 years since the Australian and New Zealand Association for the Advancement of Science (ANZAAS) set up a Research Committee on Stratigraphic Nomenclature in 1946. Like today's Australian Stratigraphy Commission, its aims were 'to encourage the orderly use of names and definitions for stratigraphic units'. .......

  • We propose an automated capture system that follows the fundamental scientific methodology. It starts with the instrument that captures the data, uses web services to make standardised data reduction programs more widely accessible, and finally uses internationally agreed data transfer standards to make geochemical data seamlessly accessible online from a series of internationally distributed certified repositories. The Australian National Data Service (http://www.ands.org.au/) is funding a range of data capture solutions to ensure that the data creation and data capture phases of research are fully integrated, enabling effective ingestion into research data and metadata stores at the institution or elsewhere. ANDS is also developing a national discovery service that provides access, with rich context, to data held in institutional stores. No data are stored in this system, only metadata with pointers back to the original data. This enables researchers to retain custody of their own data while still providing access to many repositories at once. Such a system will require standardisation at all phases of the process of analytical geochemistry. The geochemistry community needs to work together to develop standards for attributes as the data are collected from the instrument, to develop more standardised processing of the raw data, and to agree on what is required for publishing. An online collaborative workspace such as this would be ideal for geochemical data, and the provision of standardised, open-source software would greatly enhance the persistence of individual geochemistry data collections and facilitate reuse and repurposing. This conforms to the guidelines from Geoinformatics for Geochemistry (http://www.geoinfogeochem.org/), which require metadata on how the samples were analysed.
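The "metadata with pointers back to the original data" idea can be illustrated with a small discovery record. The sketch below is an assumed layout only; the field names, values, and repository URL are hypothetical and do not represent the ANDS schema.

```python
import json

# Hypothetical discovery-service record: descriptive metadata plus a pointer
# back to the original data held in the institutional repository.
record = {
    "title": "U-Pb zircon analyses, hypothetical batch 2011-07",
    "custodian": "Example University geochronology laboratory",
    "analytical_method": "SHRIMP U-Pb zircon",
    "instrument": "SHRIMP II",                                # captured at the instrument
    "processing_software": "standardised reduction program (assumed)",
    "data_location": "https://repository.example.edu.au/datasets/12345",  # pointer only
    "licence": "CC-BY 4.0",
}

# A discovery service would harvest records like this; no measurement data are
# copied, so researchers keep their own data while remaining discoverable.
print(json.dumps(record, indent=2))
```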