Information management
-
Legacy product - no abstract available
-
The love affair geoscientists have had with their PCs leads many to think that a do-it-yourself approach can carry us into the dotcom era. However, the secret to the success of major online businesses is their mastery of the `backend': the logical, physical and human infrastructure that forms the foundation of their web sites. These businesses know that their customers are best served by focusing on the hard part, the backend. Attractive web pages get customers in, but what keeps them returning is the quality, quantity and timeliness of the content behind the web site. Most successful dotcom companies have restructured, or been built from the ground up, to provide the best possible backends. Geological surveys must do likewise to survive.
-
The Christmas Island Geographic Information System (CIGIS) is a collection of spatial data, viewing and analysis tools dealing with Christmas Island. The data include orthophotography and topographic, mining, cultural and environmental features of the island. Compilation of the data and its organisation into a GIS, together with documentation, was undertaken by the Australian Geological Survey Organisation (AGSO) at the request of the Territories Office, Department of Transport and Regional Services (DOTRS). The data are presented in both ESRI ArcView and ArcExplorer projects. The ArcView projects require a licensed copy of ArcView; ArcExplorer is a free viewer and is distributed with the CIGIS CD-ROM. Data are stored as ESRI shapefiles and are therefore readily usable with most modern GIS applications. Data were received from a variety of custodians, in many cases with no accompanying documentation, which made it difficult for AGSO to interpret, translate and document the data. AGSO has attempted to include metadata for all datasets to ANZLIC core metadata standards, but the value of this is limited by the poor initial documentation. In addition to limited documentation, many datasets had inconsistent spatial accuracy. The CIGIS comprises four main CD-ROMs with additional CD-ROMs containing full-colour orthophotography. A hard-copy user guide is distributed with the main CD-ROM set.
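The abstract above notes that the data are distributed as ESRI shapefiles, a well-documented open format. As an illustration only (this code is not part of the CIGIS product), the following sketch shows how a shapefile's fixed 100-byte main-file header can be inspected with nothing but the Python standard library, assuming the layout given in ESRI's shapefile technical description; the demonstration bytes and bounding box are synthetic.

```python
import struct

def read_shapefile_header(buf: bytes) -> dict:
    """Parse the fixed 100-byte header of an ESRI shapefile (.shp).

    Field layout follows ESRI's published shapefile technical description:
    big-endian file code and file length, little-endian version,
    shape type and bounding box.
    """
    if len(buf) < 100:
        raise ValueError("shapefile header is 100 bytes")
    file_code, = struct.unpack_from(">i", buf, 0)      # always 9994
    if file_code != 9994:
        raise ValueError("not a shapefile")
    length_words, = struct.unpack_from(">i", buf, 24)  # length in 16-bit words
    version, shape_type = struct.unpack_from("<ii", buf, 28)
    xmin, ymin, xmax, ymax = struct.unpack_from("<4d", buf, 36)
    return {
        "file_length_bytes": length_words * 2,
        "version": version,          # always 1000
        "shape_type": shape_type,    # e.g. 1 = point, 3 = polyline, 5 = polygon
        "bbox": (xmin, ymin, xmax, ymax),
    }

# Build a tiny synthetic header for demonstration (no real data file needed).
demo = struct.pack(">i", 9994) + b"\x00" * 20 + struct.pack(">i", 50)
demo += struct.pack("<ii", 1000, 5)                         # version, polygon
demo += struct.pack("<4d", 105.53, -10.57, 105.72, -10.41)  # illustrative bbox
demo += struct.pack("<4d", 0, 0, 0, 0)                      # Z and M ranges unused
header = read_shapefile_header(demo)
```

In practice a real GIS library would be used, but the point of the sketch is why the format travels so well between applications: the header and record layout are fixed and publicly specified.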
-
GRID Computing - enabling the next generation of Solid Earth and Environmental Research in Australia
Earth comprises systems of enormous complexity that sustain all life and control the distribution of our mineral, energy and water resources. Earth scientists are increasingly moving away from single-domain research into isolated parts of these intricate systems, towards multidisciplinary, computationally intensive, integrated methodologies that model and simulate the real-world complexity of earth systems science. Simultaneously, developments in information technology are increasing the capacity of computational systems to credibly simulate complex systems. Real-world Solid Earth and Environmental Science data sets are extremely heterogeneous, complex and large, currently in the order of terabytes (10^12 bytes). The size and complexity of geoscience data sets are also increasing exponentially, as more powerful modern computing systems combine with enhanced engineering capacity to design and build automated instruments that collect more data and new data types. We are rapidly moving into an era when earth scientists will need the capacity to analyse petabyte (10^15 bytes) databases if they are to realistically model and simulate complex earth processes. Although digital geoscientific data sets are becoming increasingly available over the Internet, current Internet technologies only allow for the downloading of data (if the connection is fast enough): integration, processing and analysis then have to take place locally. As data sets grow larger and more complex, large computational resources are required to process them effectively. Such resources are increasingly available only to the major industry players, which in turn creates a strong bias against small-to-medium enterprises as well as many university researchers.
For those that do not have access to large-scale computing resources, analysis of these voluminous data sets has to be compromised by dividing the data set into smaller units, accepting sub-optimal solutions and/or introducing sub-optimal approximations. It is clear that if we are to begin grappling with accurate analysis of large-scale geoscientific data sets to enable sustainable management of our mineral, energy and water resources, then current computational infrastructures are no longer viable.
-
The Australian National Marine Data Group was formed by the Heads of Marine Agencies (HOMA) to promote improved interchange of marine data in Australia. The ANMDG held a workshop of practitioners in May 2002 with the intention of identifying major areas of interest and tasks for working groups to address in order to make progress with development of marine data interchange in Australia. This Proceedings CD contains the presentations by speakers in the form of PowerPoint slides and a few Acrobat documents. It was distributed to participants in the workshop.
-
We have completed a new Web interface that makes it easier for AGSO's clients to find and order products sold by the AGSO Sales Centre. The new system is on AGSO's Web site at http://www.agso.gov.au/databases/catalog.html. Alternatively, from AGSO's home page at http://www.agso.gov.au, click on the `Products' button and select `AGSO Products' from the pull-down menu of online databases. The new interface is similar to the `Products Database' it replaces, but is based on the `AGSO Catalog', a new metadata system designed to keep track of all of AGSO's outputs - including products, publications, datasets and resources. The new interface will be followed shortly by a Web interface for finding publications, papers and articles by AGSO staff members.
-
The important role of information management in improving baseline data for natural hazards has been demonstrated through a collaborative pilot project between Geoscience Australia, Mineral Resources Tasmania and the University of Wollongong. The result is a 'virtual' landslide database that makes full use of diverse data held across three levels of government and has enabled landslide data to be collated and accessed from a single source. Such a system establishes the foundation for a powerful, coordinated information resource in Australia and provides a sound basis for greater investment in data collection. This paper highlights the capacity to extend the methodology across all hazards and describes one solution for building a sound knowledge base on natural disasters and disaster risk reduction.
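The essence of a 'virtual' database of the kind described above is that each custodian keeps its own records while a single query collates them on the fly. A hedged sketch of that pattern, in which the agency names, record fields and values are purely illustrative and not the actual schema of the landslide database:

```python
# Each source keeps its own records; nothing is copied into a central store.
# Custodian names and record contents below are illustrative only.
FEDERATED_SOURCES = {
    "national": [{"id": "GA-001", "locality": "Thredbo", "year": 1997}],
    "state":    [{"id": "MRT-042", "locality": "Lawrence Vale", "year": 1911}],
    "research": [{"id": "UOW-007", "locality": "Wollongong", "year": 1998}],
}

def query_all(predicate):
    """Run one predicate across every source; tag each hit with its custodian."""
    hits = []
    for source, records in FEDERATED_SOURCES.items():
        for rec in records:
            if predicate(rec):
                hits.append({**rec, "custodian": source})
    return hits

recent = query_all(lambda r: r["year"] >= 1990)
```

The design choice worth noting is that the single point of access is the query layer, not a copied data set, so each level of government retains custodianship while users see one coordinated resource.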
-
We propose an automated capture system that follows the fundamental scientific methodology. It starts with the instrument that captures the data, uses web services to make standardised data reduction programs more widely accessible, and finally uses internationally agreed data transfer standards to make geochemical data seamlessly accessible online from a series of internationally distributed certified repositories. The Australian National Data Service (http://www.ands.org.au/) is funding a range of data capture solutions to ensure that the data creation and data capture phases of research are fully integrated, enabling effective ingestion into research data and metadata stores at the home institution or elsewhere. It is also developing a national discovery service that enables access, with rich context, to data held in institutional stores. No data are stored in this system, only metadata with pointers back to the original data. This enables researchers to keep their own data while still exposing many repositories to a single search. Such a system will require standardisation at every phase of analytical geochemistry: the community needs to agree on the attributes captured as data come off the instrument, on more standardised processing of the raw data, and on what is required for publication. An online collaborative workspace such as this would be ideal for geochemical data, and the provision of standardised, open-source software would greatly enhance the persistence of individual geochemistry data collections and facilitate their reuse and repurposing. This conforms to the guidelines from Geoinformatics for Geochemistry (http://www.geoinfogeochem.org/), which require metadata on how the samples were analysed.
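The metadata-only discovery model described above, a registry that holds descriptions and pointers but never the data itself, can be sketched as follows. The field names, titles and repository URIs here are assumptions for illustration, not the actual ANDS record schema:

```python
# Sketch of a metadata-only discovery service: the registry never holds
# data, just descriptive metadata plus a URI pointing back to the
# repository that does. All names and URIs below are illustrative.
registry = []

def register(title: str, keywords: set, data_uri: str) -> None:
    """Add a metadata record; the data itself stays at data_uri."""
    registry.append({"title": title, "keywords": keywords, "uri": data_uri})

def discover(keyword: str) -> list:
    """Return pointers to matching collections held elsewhere."""
    return [r["uri"] for r in registry if keyword in r["keywords"]]

register("SHRIMP zircon U-Pb analyses", {"geochronology", "zircon"},
         "https://repository.example.org/collections/123")
register("Whole-rock XRF major elements", {"geochemistry", "xrf"},
         "https://repository.example.org/collections/456")

matches = discover("zircon")
```

Because only pointers leave the registry, researchers keep custody of their data, yet one search spans every participating repository, which is exactly the trade-off the abstract describes.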