  • Digital technology and the Internet have contributed to the information explosion and, in part, to the widespread increase in the use of spatial information. Community needs for geoscientific information have consequently extended beyond the traditional areas of mineral and petroleum exploration, and geoscience is now recognised by society as having a part to play in achieving social wellbeing and environmental outcomes. This paper examines whether geoscience data providers are playing their part in the data explosion. It looks at how geoscience can be applied to real-world problems and questions whether the data providers are up to the mark in satisfying the immediate expectations of users as well as in initiating new areas of application. The discussion incorporates issues of price, accessibility, formats and data assemblage in relation to a hierarchy of needs for decision making.

  • Earth comprises systems of enormous complexity that sustain all life and control the distribution of our mineral, energy and water resources. Earth scientists are increasingly moving away from single-domain research aimed at understanding isolated parts of these intricate systems towards multidisciplinary, computationally intensive, integrated methodologies for modelling and simulating the real-world complexities of earth systems science. Simultaneously, developments in information technology are increasing the capacity of computational systems to credibly simulate complex systems. Real-world Solid Earth and Environmental Science data sets are extremely heterogeneous, complex and large, and are currently in the order of terabytes (10^12 bytes). The size and complexity of geoscience data sets are also increasing exponentially, as more powerful computing systems combine with enhanced engineering capacity to design and build automated instruments that collect more data and new data types. We are rapidly moving into an era in which Earth Scientists will need the capacity to analyse petabyte (10^15 bytes) databases if they are to realistically model and simulate complex earth processes. Although digital geoscientific data sets are becoming increasingly available over the Internet, current Internet technologies only allow for the downloading of data (if the connection is fast enough); integration, processing and analysis then have to take place locally. As data sets get larger and more complex, large computational resources are required to process them effectively. Such resources are increasingly available only to the major industry players, which in turn creates a strong bias against Small to Medium Enterprises, as well as many university researchers. Those without access to large-scale computing resources must compromise their analysis of these voluminous data sets by dividing a data set into smaller units, accepting sub-optimal solutions and/or introducing approximations. It is clear that if we are to begin grappling with accurate analysis of large-scale geoscientific data sets to enable sustainable management of our mineral, energy and water resources, then current computational infrastructures are no longer viable.
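
    The preceding entry mentions dividing large data sets into smaller units when large-scale computing resources are unavailable. The sketch below shows one such block-wise strategy in Python; the file name, data type and block size are illustrative assumptions rather than part of any system described above.

        import numpy as np

        CHUNK = 10_000_000  # samples per block; chosen only for illustration

        def blockwise_mean(path, dtype=np.float32):
            """Compute a global mean by streaming a large binary grid block by
            block, so the whole data set never has to fit in memory."""
            data = np.memmap(path, dtype=dtype, mode="r")
            total, count = 0.0, 0
            for start in range(0, data.size, CHUNK):
                block = np.asarray(data[start:start + CHUNK], dtype=np.float64)
                total += block.sum()
                count += block.size
            return total / count

        # e.g. blockwise_mean("gravity_grid.bin") touches a multi-gigabyte
        # file one block at a time instead of loading it whole.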

  • Geoscience Australia (GA) produces geoscientific and geospatial data for the benefit of the Australian government and community: to inform public policy, to promote development of Australia's economy, to assist environmental management and to help manage and mitigate natural hazards. Users of GA's data want to know that the data are produced to known standards using open and accountable processes and that they come from a unique and reliable source. Single Point of Truth (SPOT) is Geoscience Australia's standard for the processes that produce data. The SPOT methodology describes a consistent approach to transforming an existing data theme into a SPOT; the same methodology can be used to develop a SPOT for a new data theme.

  • Legacy product - no abstract available

  • A national dataset of more than 73,000 mineral occurrences, providing the name, latitude/longitude, map sheet name and number, commodities of interest and source reference for each occurrence.
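
    As a minimal sketch of what one record in such a dataset might look like, the Python structure below uses field names inferred from the description above; the actual schema, datum and field types are not given here and may differ.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class MineralOccurrence:
            name: str               # occurrence name
            latitude: float         # decimal degrees (datum assumed, not stated)
            longitude: float
            map_sheet_name: str
            map_sheet_number: str
            commodities: List[str]  # commodities of interest, e.g. ["Au", "Cu"]
            source_reference: str   # source reference for the occurrence

        # Purely hypothetical example record:
        example = MineralOccurrence(
            name="Example Prospect", latitude=-23.5, longitude=133.9,
            map_sheet_name="Alice Springs", map_sheet_number="SF53-14",
            commodities=["Au"], source_reference="Hypothetical reference, 1998")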

  • The recording of continuous waveform data presents different challenges from the recording of event-triggered, segmented data or of semi-continuous yet offline data. Many formats in use today derive their origins from the earlier imperatives of such systems. This article briefly classifies these formats so as to better appreciate the requirements of continuous formats. A comparison is then made between continuous formats and the format adopted for use in the Australian National Seismic Network (ANSN). This is followed by a detailed treatment of the CD 1 format and its use and adaptation within the ANSN. Some contextual background on networking is provided, and the article concludes with a section on where the ANSN may go with CD 1 in the future. An appendix explains data conversion on the GDAS system.
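
    As a rough sketch of the distinction the article starts from, the code below either keeps every sample in fixed-length frames (continuous recording) or keeps only trigger-gated segments (event-triggered recording). The frame length, window lengths and trigger ratio are arbitrary illustrative values, and nothing here reflects the actual CD 1 packet layout or the ANSN's detection logic.

        import numpy as np

        FRAME_LEN = 400                  # samples per continuous frame (arbitrary)
        STA, LTA, RATIO = 50, 500, 3.0   # toy short/long windows and trigger ratio

        def continuous_frames(samples):
            """Continuous recording: every sample is retained, in fixed frames."""
            return [samples[i:i + FRAME_LEN]
                    for i in range(0, len(samples), FRAME_LEN)]

        def triggered_segments(samples):
            """Event-triggered recording: keep a window only where a simple
            STA/LTA ratio exceeds a threshold (a toy detector)."""
            x = np.abs(np.asarray(samples, dtype=float))
            segments, i = [], LTA
            while i < len(x):
                sta = x[i - STA:i].mean()
                lta = x[i - LTA:i].mean()
                if lta > 0 and sta / lta > RATIO:
                    segments.append(samples[i:i + FRAME_LEN])
                    i += FRAME_LEN
                else:
                    i += STA
            return segments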

  • This Record describes the development and use of the version of the AGSO Library catalogue accessible from AGSO's World Wide Web site. The Library's in-house catalogue of books and serials is part of the Datatrek Professional Series library management system. Although this system operates from a Novell network server, the lack of integration of AGSO's Novell networks, and the lack of access to them from the Unix system, mean that direct access to the Datatrek system is available only in the Library itself. The Datatrek system also has no provision for access, either direct or indirect, from the Internet. In the interests of making the Library catalogue more readily available to both AGSO's staff and AGSO's clients, a decision was made to develop a version of the Datatrek catalogue as an Oracle database and to make that version available on the World Wide Web with a forms-based search interface usable in any Web browser.
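
    As a loose illustration of how a forms-based search page can map onto a relational version of a library catalogue, the sketch below turns two form fields into a single parameterised query. The table and column names are invented for the example (the Record does not describe the Oracle schema), and Python's sqlite3 module stands in for an Oracle driver.

        import sqlite3  # stand-in for an Oracle driver in this sketch

        def search_catalogue(conn, title_term, author_term):
            """Translate two search-form fields into one parameterised query."""
            sql = ("SELECT title, author, year FROM catalogue "
                   "WHERE title LIKE ? AND author LIKE ? ORDER BY year DESC")
            cur = conn.execute(sql, ("%" + title_term + "%",
                                     "%" + author_term + "%"))
            return cur.fetchall()

        # Usage against a throwaway in-memory catalogue:
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE catalogue (title TEXT, author TEXT, year INTEGER)")
        conn.execute("INSERT INTO catalogue VALUES "
                     "('Geology of an example region', 'A. Author', 1995)")
        print(search_catalogue(conn, "example", ""))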