Perry, S.; Jordan, T.
Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates toward science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.
Hagerty, M. T.; Lomax, A.; Hellman, S. B.; Whitmore, P.; Weinstein, S.; Hirshorn, B. F.; Knight, W. R.
Tsunami warning centers must rapidly decide whether an earthquake is likely to generate a destructive tsunami in order to issue a tsunami warning quickly after a large event. For very large events (Mw > 8 or so), magnitude and location alone are sufficient to warrant an alert. However, for events of smaller magnitude (e.g., Mw ~ 7.5), particularly for so-called "tsunami earthquakes", magnitude alone is insufficient to issue an alert, and other measurements must be rapidly made and used to assess tsunamigenic potential. The Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). We (ISTI) are responsible for implementing the seismic monitoring components in this new system, including real-time seismic data collection and seismic processing. The seismic data processor includes a variety of methods aimed at real-time discrimination of tsunamigenic events, including: Mwp, Me, slowness (Theta), W-phase, mantle magnitude (Mm), array processing, and finite-fault inversion. In addition, it can designate earthquake scenarios and play the resulting synthetic seismograms through the processing system. Thus, it is also a convenient tool that integrates research and monitoring and may be used to calibrate and tune the real-time monitoring system. Here we show results of the automated processing system for a large dataset of subduction zone earthquakes containing recent tsunami earthquakes; we examine the accuracy of the various discrimination methods and discuss issues related to their successful real-time application.
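The triage logic described above (magnitude and location alone for the largest events, additional discriminants such as the slowness parameter Theta for borderline ones) can be sketched as a simple decision rule. This is an illustrative assumption, not the operational PTWC/NTWC logic; the function name, thresholds, and alert levels are made up for the example:

```python
def assess_tsunamigenic(mw, theta=None):
    """Hypothetical advisory level from magnitude Mw and, if available,
    the energy-to-moment slowness parameter Theta (a more negative Theta
    suggests a slow, potentially tsunamigenic rupture).
    Thresholds here are illustrative, not operational values."""
    if mw >= 8.0:
        # Very large events: magnitude and location alone justify a warning.
        return "warning"
    if mw >= 7.0:
        # Intermediate events: additional discriminants are needed.
        if theta is not None and theta <= -5.5:
            # Anomalously slow rupture: possible "tsunami earthquake".
            return "warning"
        return "advisory"
    return "no-alert"

print(assess_tsunamigenic(8.3))              # warning
print(assess_tsunamigenic(7.5, theta=-6.0))  # warning (slow rupture)
print(assess_tsunamigenic(7.5, theta=-4.0))  # advisory
```

In a real system, each discriminant (Mwp, Me, Theta, W-phase, Mm) would contribute its own measurement with its own latency, and the rule would be re-evaluated as estimates arrive.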
Levy, Gad; Blumberg, Nehemia; Kreiss, Yitshak; Ash, Nachman; Merin, Ofer
Following the January 2010 earthquake in Haiti, the Israel Defense Force Medical Corps dispatched a field hospital unit. A specially tailored information technology solution was deployed within the hospital. The solution included a hospital administration system as well as a complete electronic medical record. A light-weight picture archiving and communication system was also deployed. During 10 days of operation, the system registered 1111 patients. The network and system up times were more than 99.9%. Patient movements within the hospital were noted, and an online command dashboard screen was generated. Patient care was delivered using the electronic medical record. Digital radiographs were acquired and transmitted to stations throughout the hospital. The system helped to introduce order in an otherwise chaotic situation and enabled adequate utilization of scarce medical resources by continually gathering information, analyzing it, and presenting it to the decision-making command level. The establishment of electronic medical records promoted the adequacy of medical treatment and facilitated continuity of care. This experience in Haiti supports the feasibility of deploying information technologies within a field hospital operation. Disaster response teams and agencies are encouraged to consider the use of information technology as part of their contingency plans. PMID:20962123
IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.
Fan, Hong; Guo, Dan; Li, Huaiyuan
In this paper a web information extraction method is presented that identifies a variety of thematic events using an event knowledge framework derived from text training, and then applies syntactic analysis to extract the key information of each event. By combining the semantic information of the text with domain knowledge of the event, the method makes the extraction of information of interest more accurate. Web-based earthquake news extraction is taken as an example. The paper first outlines the overall approach, then details the key algorithms and the experiments on seismic event extraction. Finally, accuracy analysis and evaluation experiments demonstrate that the proposed method is a promising way of mining hot events.
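The key-information extraction stage can be illustrated with a minimal pattern-based sketch. The paper's actual method uses a trained event knowledge framework plus syntactic analysis; the regular expressions and field names below are simplified assumptions standing in for that machinery:

```python
import re

# Illustrative patterns for pulling key earthquake-event fields out of a
# news sentence. These are stand-ins for the paper's syntactic analysis.
PATTERNS = {
    "magnitude": re.compile(r"\bmagnitude[- ](\d+(?:\.\d+)?)", re.I),
    "depth_km":  re.compile(r"depth of (\d+(?:\.\d+)?) ?km", re.I),
    "location":  re.compile(r"struck (?:near )?([A-Z][\w ,]+?)(?: on|\.|,)"),
}

def extract_event(text):
    """Return a dict of whichever event fields the patterns can find."""
    event = {}
    for field, pat in PATTERNS.items():
        m = pat.search(text)
        if m:
            event[field] = m.group(1).strip()
    return event

news = ("A magnitude 6.1 earthquake struck near Chengdu on Monday "
        "at a depth of 14 km, local media reported.")
print(extract_event(news))
# -> {'magnitude': '6.1', 'depth_km': '14', 'location': 'Chengdu'}
```

A knowledge-framework approach generalizes this by learning which lexical and syntactic contexts signal each slot, rather than hand-writing one pattern per field.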
Perry, S.; Benthien, M.; Jordan, T. H.
The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), 3D visualization software that was prototyped by interns the previous year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.
Degroot, R. M.; Jordan, T. H.; Benthien, M. L.; Ihrig, M.; Berti, R.
UseIT is one of the three undergraduate research programs sponsored by the Southern California Earthquake Center (SCEC). The program allows students to work in multi-disciplinary collaborative teams to tackle a scientific “Grand Challenge.” The topic varies each year but it always entails performing computer science research that is needed by earthquake scientists, educators, and other target audiences. The program allows undergraduates to use the advanced tools of information technology to solve important problems in interdisciplinary earthquake research. Since the program began in 2002, 145 students have participated in UseIT. The program stresses problem solving and interdisciplinary cross training. A key aspect of the UseIT program is its flexible, yet structured, team approach. Students share their diverse skills and interests, creating a powerful synergy through this peer mentoring. The majority of UseIT interns have considerable computer science skill or aptitude, but successful UseIT interns have hailed from nearly three-dozen disciplines, all class levels, and all skill levels. Successful UseIT interns have in common a willingness to step outside their comfort zones and try new things. During the 2009 internship the focus of the program was to deliver SCEC Virtual Display of Objects (VDO) images and animations of faults and earthquake sequences to SCEC, the Earthquake Country Alliance, and other virtual organizations via a content management system that captures the metadata and guides the user. SCEC-VDO is the SCEC intern-developed visualization software that allows the user to see earthquake related phenomena in three and four dimensions. The 2009 Grand Challenge had special relevance for the interns because the products they created were used for The Great California ShakeOut. This talk will discuss lessons learned from this program, how it addresses the needs of the 21st century STEM work force, and highlights of the 2009 internship.
Prieto Castrillo, F.; Boton Fernandez, M.
This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation, and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits the GRID technology and its main advantages (parallel intensive computing, huge storage capacity, and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and by a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements/changes to the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The metadata of these records is also stored in the GRID federated database. This metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding
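The per-accelerogram job described above can be sketched in miniature: integrate a damped single-degree-of-freedom oscillator under a ground-acceleration record and report the peak displacement. The paper's framework uses Java classes and a richer model; this pure-Python central-difference sketch, with made-up structural parameters and a synthetic record, only illustrates the numerical core:

```python
import math
import random

def sdof_peak_displacement(accel, dt, freq_hz=1.0, zeta=0.05):
    """Central-difference integration of u'' + 2*zeta*w*u' + w^2*u = -a_g(t);
    returns the peak absolute displacement over the record.
    (freq_hz and zeta are illustrative structural parameters.)"""
    w = 2.0 * math.pi * freq_hz
    u_prev = u = 0.0
    peak = 0.0
    for a_g in accel:
        v = (u - u_prev) / dt                        # backward-difference velocity
        acc = -a_g - 2.0 * zeta * w * v - w * w * u  # equation of motion
        u_next = 2.0 * u - u_prev + dt * dt * acc
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

# Synthetic accelerogram: decaying random shaking, a stand-in for the
# artificially generated records stored in the Storage Element.
random.seed(1)
dt = 0.01
record = [random.uniform(-1.0, 1.0) * math.exp(-0.5 * i * dt) for i in range(2000)]
print("peak displacement:", round(sdof_peak_displacement(record, dt), 4))
```

In the GRID workflow, one such integration would run per selected accelerogram, with the response and peak displacement written back to the SE.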
Hellman, S. B.; Lisowski, S.; Baker, B.; Hagerty, M.; Lomax, A.; Leifer, J. M.; Thies, D. A.; Schnackenberg, A.; Barrows, J.
Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). While this project was funded by NOAA to solve a specific problem, the requirements that the delivered system be both open source and easily maintainable have resulted in the creation of a variety of open source (OS) software packages. The open source software is now complete, and here we present the OS software that has been funded by NOAA for the benefit of the entire seismic community. The design architecture comprises three distinct components: (1) the user interface, (2) the real-time data acquisition and processing system, and (3) the scientific algorithm library. The system follows a modular design with loose coupling between components. We now identify the major project constituents. The user interface, CAVE, is written in Java and is compatible with the existing National Weather Service (NWS) open source graphical system AWIPS. The selected real-time seismic acquisition and processing system is the open source SeisComp3 (sc3). The seismic library (libseismic) contains numerous custom-written and wrapped open source seismic algorithms (e.g., ML/mb/Ms/Mwp, mantle magnitude (Mm), W-phase moment tensor, body-wave moment tensor, finite-fault inversion, array processing). The seismic library is organized in a way (function naming and usage) that will be familiar to users of Matlab. The seismic library extends sc3 so that it can be called by the real-time system, but it can also be driven and tested outside of sc3, for example, by ObsPy or Earthworm. To unify the three principal components we have developed a flexible and lightweight communication layer called SeismoEdex.
Jordan, T. H.; Scec/Itr Collaboration
The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, IRIS, and the USGS, has received a large five-year grant from the NSF's ITR Program and its Geosciences Directorate to build a new information infrastructure for earthquake science. In many respects, the SCEC/ITR Project presents a microcosm of the IT efforts now being organized across the geoscience community, including the EarthScope initiative. The purpose of this presentation is to discuss the experience gained by the project thus far and lay out the challenges that lie ahead; our hope is to encourage cross-discipline collaboration in future IT advancements. Project goals have been formulated in terms of four "computational pathways" related to seismic hazard analysis (SHA). For example, Pathway 1 involves the construction of an open-source, object-oriented, and web-enabled framework for SHA computations that can incorporate a variety of earthquake forecast models, intensity-measure relationships, and site-response models, while Pathway 2 aims to utilize the predictive power of wavefield simulation in modeling time-dependent ground motion for scenario earthquakes and constructing intensity-measure relationships. The overall goal is to create a SCEC "community modeling environment" or collaboratory that will comprise the curated (on-line, documented, maintained) resources needed by researchers to develop and use these four computational pathways. Current activities include (1) the development and verification of the computational modules, (2) the standardization of data structures and interfaces needed for syntactic interoperability, (3) the development of knowledge representation and management tools, (4) the construction of SCEC computational and data grid testbeds, and (5) the creation of user interfaces for knowledge-acquisition, code execution, and visualization. I will emphasize the increasing role of standardized
Marcum, Deanna; Boss, Richard
Discusses a problem commonly encountered in library automation projects: the conversion from existing card catalog formats to machine readable catalog (MARC) records. Catalog formats, the advantages of full versus limited records, changing computer technology, the advantages of full MARC records, and record standardization are among the topics…
The 2008 MS 8.0 Wenchuan earthquake is one of the deadliest in recent human history. This earthquake not only united the whole world in helping local people through the difficult time, it also fostered significant global cooperation to study the event from various aspects, including pre-seismic phenomena (such as seismicity, gravity, electromagnetic fields, well water levels, radon levels in water, etc.), co-seismic phenomena (fault slip, landslides, damage to man-made structures, etc.), and post-seismic phenomena (such as aftershocks and changing well water levels), as well as the disaster relief efforts. In the last four years, more than 300 scientific articles have been published in peer-reviewed journals; among them about 50% are published in Chinese, 30% in English, and about 20% in both languages. These studies have advanced our understanding of earthquake science in general. They have also sparked open debates in many aspects. Notably, the role of the Zipingpu reservoir (built not long before the earthquake) in the triggering of this monstrous earthquake is still one of many continuing debates. Given that all these articles are sporadically spread across different journals, numerous issues, and different languages, it can be very inefficient, sometimes impossible, to dig out the information that is needed. The Earthquake Research Group at the Chengdu University of Technology (ERGCDUT) has initiated an effort to develop an information platform to collect and analyze scientific research on or related to this earthquake, the hosting faults, and the surrounding tectonic regions. A preliminary website has been set up for this purpose: http://www.wenchuaneqresearch.org. Up to this point (July 2012), articles published in 6 Chinese journals and 7 international journals have been collected. Articles are listed journal by journal, and also grouped by content into four major categories, including pre-seismic events, co-seismic events, post
Xu, J. H.; Nie, G. Z.; Xu, X.
Acquiring disaster information quickly after an earthquake is crucial for disaster and emergency rescue management. This study examines a digital social network - an earthquake disaster information reporting network - for rapid collection of earthquake disaster information. Based on the network, the rapid disaster information collection method is expounded in this paper. The structure and components of the reporting network are introduced. Then the working principles of the reporting network are discussed, in which rapid collection of disaster information is realised by using Global System for Mobile Communications (GSM) messages to report the disaster information and a Geographic Information System (GIS) to analyse and extract useful disaster information. This study introduces some key technologies behind these working principles, including methods for the mass sending and receiving of SMS for disaster management, the grouping management method for the reporting network, brief disaster information codes, and the GIS modelling of the reporting network. Finally, a city earthquake disaster information quick reporting system is developed, and with the support of this system the reporting network obtained good results in a real earthquake and in earthquake drills. This method is a semi-real-time disaster information collection method which extends current SMS-based methods and meets the needs of small and some moderate earthquakes.
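The "brief disaster information codes" mentioned above suggest compact, machine-parseable SMS messages that the centre can turn into structured records for GIS analysis. The field layout, delimiter, and example message below are illustrative assumptions, not the actual coding scheme used by the system:

```python
# Hypothetical field layout for a semicolon-delimited disaster-report SMS.
FIELDS = ("reporter_id", "intensity", "building_damage", "casualties", "note")

def parse_report_sms(text):
    """Split a coded report SMS into a dict keyed by FIELDS,
    converting the numeric fields to integers where possible."""
    parts = [p.strip() for p in text.split(";")]
    record = dict(zip(FIELDS, parts))
    for key in ("intensity", "casualties"):
        if record.get(key, "").isdigit():
            record[key] = int(record[key])
    return record

sms = "R1024; 7; partial-collapse; 3; cracks in school wall"
print(parse_report_sms(sms))
```

Once parsed, each record can be joined to the reporter's registered location and fed into the GIS layer for spatial analysis of the felt area and damage distribution.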
Pulinets, S. A.; Krankowski, A.; Hernandez-Pajares, M.; Liu, J. Y. G.; Hattori, K.; Davidenko, D.; Ouzounov, D.
The existence of ionospheric anomalies before earthquakes is now widely accepted. These phenomena have started to be considered by the GPS community as a way to mitigate GPS signal degradation over territories where an earthquake is in preparation. The question is still open whether they could be useful for seismology and for short-term earthquake forecasting. More than a decade of intensive studies has proved that ionospheric anomalies registered before earthquakes are initiated by processes in the boundary layer of the atmosphere over the earthquake preparation zone and are induced in the ionosphere by electromagnetic coupling through the Global Electric Circuit. A multiparameter approach based on the Lithosphere-Atmosphere-Ionosphere Coupling model has demonstrated that earthquake forecasting is possible only if we consider the final stage of earthquake preparation in a multidimensional space where every dimension is one of many precursors in an ensemble, and they are synergistically connected. We demonstrate approaches developed in different countries (Russia, Taiwan, Japan, Spain, and Poland, within the framework of the ISSI and ESA projects) to identify the ionospheric precursors. They are also useful for determining all three parameters necessary for earthquake forecasting: the impending earthquake's epicenter position, expectation time, and magnitude. These parameters are calculated using different technologies of GPS signal processing: time series, correlation, spectral analysis, ionospheric tomography, wave propagation, etc. Results obtained by the different teams demonstrate a high level of statistical significance and physical justification, which gives us reason to suggest these methodologies for practical validation.
Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.
The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, this collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community. By definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves and that, therefore, eyewitnesses can be considered as ground motion sensors. Flashsourcing discriminates felt
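The core of flashsourcing, detecting the eyewitness-driven traffic surge in the first minutes after a felt earthquake, can be sketched as a simple trailing-baseline anomaly detector. The window size, surge factor, and traffic numbers below are illustrative assumptions, not EMSC's actual detector:

```python
def detect_surge(hits_per_minute, baseline_window=10, factor=5.0):
    """Return indices of minutes whose hit count exceeds `factor` times
    the mean of the preceding `baseline_window` minutes."""
    surges = []
    for i in range(baseline_window, len(hits_per_minute)):
        baseline = sum(hits_per_minute[i - baseline_window:i]) / baseline_window
        if baseline > 0 and hits_per_minute[i] > factor * baseline:
            surges.append(i)
    return surges

# Ten quiet minutes, then the post-earthquake rush of eyewitness visits.
traffic = [40, 38, 42, 41, 39, 40, 43, 37, 41, 40, 900, 1500, 600]
print(detect_surge(traffic))  # -> [10, 11]
```

Note that minute 12 is not flagged because the surge itself inflates the trailing baseline; a production detector would freeze the baseline once a surge begins, and would also exploit the geolocation of visitors, whose arrival times track the seismic wavefront as described above.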
Suarez-Villa, L; Walrod, W
This study explores the relationship between industrial location geography, metropolitan patterns, and earthquake disasters. Production losses from the 1994 Northridge earthquake to the Los Angeles Basin's most important high-technology industrial sector are evaluated in the context of that area's polycentric metropolitan form. Locations for each of the Los Angeles Basin's 1,126 advanced electronics manufacturing establishments were identified and mapped, providing an indication of the patterns and clusters of the industry. An extensive survey of those establishments gathered information on disruptions from the Northridge earthquake. Production losses were then estimated, based on the sampled plants' lost workdays and the earthquake's distance-decay effects. A conservative estimate of total production losses to establishments in seven four-digit SIC advanced electronics industrial groups placed their value at US$220.4 million. Based on this estimate of losses, it is concluded that the Northridge earthquake's economic losses were much higher than initially anticipated. PMID:10204286
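The estimation approach described above, scaling each plant's lost output by a distance-decay effect from the epicentre, can be sketched in a back-of-the-envelope form. The exponential decay law, decay constant, and plant figures are made-up illustrations, not the study's actual model or data:

```python
import math

def plant_loss(daily_output_usd, lost_workdays, dist_km, decay_km=50.0):
    """Lost output scaled by an assumed exponential distance-decay factor."""
    return daily_output_usd * lost_workdays * math.exp(-dist_km / decay_km)

plants = [
    # (daily output in $, lost workdays, distance from epicentre in km)
    (120_000, 5, 10),
    (80_000, 3, 40),
    (200_000, 2, 90),
]
total = sum(plant_loss(*p) for p in plants)
print(f"estimated total production loss: ${total:,.0f}")
```

Summing such per-plant estimates over all surveyed establishments, weighted to the full population of 1,126, is the general shape of how a sector-wide figure like the US$220.4 million estimate could be built up.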
Thompson, K. J.; Krantz, D. H.
The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." However, psychology research identifies a large gap between lay and expert perception of risk for various hazards, and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to the public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. Kahneman, D & Tversky, A (1979). Prospect
Wright, William F.; Hawkins, Donald T.
This selective annotated bibliography lists 86 references on the following topics: future technology for libraries, library automation, paperless information systems; computer conferencing and electronic mail, videotext systems, videodiscs, communications technology, networks, information retrieval, cataloging, microcomputers, and minicomputers.…
Snyder, Cathrine E.; And Others
Eight papers address technological, behavioral, and philosophical aspects of the application of information technology to training. Topics include instructional technology centers, intelligent training systems, distance learning, automated task analysis, training system selection, the importance of instructional methods, formative evaluation and…
Taylor, M. J.; Jones, R. P.; Haggerty, J.; Gresty, D.
In this paper we discuss an approach to the teaching of information technology law to higher education computing students that attempts to prepare them for professional computing practice. As information technology has become ubiquitous its interactions with the law have become more numerous. Information technology practitioners, and in particular…
A learning unit about earthquakes includes activities for primary grade students, including making inferences and defining operationally. Task cards are included for independent study on earthquake maps and earthquake measuring. (CB)
An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...
The Information Technology Resources Assessment (ITRA) is being published as a companion document to the Department of Energy (DOE) FY 1994--FY 1998 Information Resources Management Long-Range Plan. This document represents a collaborative effort between the Office of Information Resources Management and the Office of Energy Research that was undertaken to achieve, in part, the Technology Strategic Objective of IRM Vision 21. An integral part of this objective, technology forecasting provides an understanding of the information technology horizon and presents a perspective and focus on technologies of particular interest to DOE program activities. Specifically, this document provides site planners with an overview of the status and use of new information technology for their planning consideration.
Earle, P. S.; Wald, D. J.; Benz, H.; Sipkin, S.; Dewey, J.; Allen, T.; Jaiswal, K.; Buland, R.; Choy, G.; Hayes, G.; Hutko, A.
Immediately after detecting the May 12th, 2008 Mw 7.9 Wenchuan Earthquake, the USGS National Earthquake Information Center (NEIC) began a coordinated effort to understand and communicate the earthquake's seismological characteristics, tectonic context, and humanitarian impact. NEIC's initial estimates of magnitude and location were distributed within 30 minutes of the quake by e-mail and text message to 70,000 users via the Earthquake Notification System. The release of these basic parameters automatically triggered the generation of more sophisticated derivative products that were used by relief and government agencies to plan their humanitarian response to the disaster. Body-wave and centroid moment tensors identified the earthquake's mechanism. Predictive ShakeMaps provided the first estimates of the geographic extent and amplitude of shaking. The initial automated population exposure estimate generated and distributed by the Prompt Assessment of Global Earthquakes for Response (PAGER) system stated that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater), indicating a large-scale disaster had occurred. NEIC's modeling of the mainshock and aftershocks was continuously refined and expanded. The length and orientation of the fault were determined from aftershocks, finite-fault models, and back-projection source imaging. Firsthand accounts of shaking intensity were collected and mapped by the "Did You Feel It" system. These results were used to refine our ShakeMaps and PAGER exposure estimates providing a more accurate assessment of the extent and enormity of the disaster. The products were organized and distributed in an event-specific summary poster and via the USGS Earthquake Program web pages where they were viewed by millions and reproduced by major media outlets (over 1/2 billion hits were served that month). Rather than just a point showing magnitude and epicenter, several of the media's schematic maps
Walter, Edward J.
Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)
Pakiser, Louis C.
One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…
Roper, Paul J.; Roper, Jere Gerard
Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)
VanBiervliet, Alan; Parette, Howard P., Jr.
The Arkansas Technology Information System (ARTIS) was developed to fill a significant void in existing systems of technical support to Arkansans with disabilities by creating and maintaining a consumer-responsive statewide system of data storage and retrieval regarding assistive technology and services. ARTIS goals also include establishment of a…
This book deals with advances in telecommunications, artificial intelligence, supercomputers, personal computers, and the use of information technology. It focuses on computer crime, privacy, the impact of new technology on women and on the Third World, "smart" weapons, and the future of work.
The emphasis in Information Technology (IT) development has shifted from technology management to information management, and the tools of information management are increasingly at the disposal of end-users, people who deal with information. Moreover, the interactive capabilities of technologies such as hypertext, scientific visualization, virtual reality, video conferencing, and even database management systems have placed in the hands of users a significant amount of discretion over how these resources will be used. The emergence of high-performance networks, as well as network operating systems, improved interoperability, and platform independence of applications will eliminate technical barriers to the use of data, increase the power and range of resources that can be used cooperatively, and open up a wealth of possibilities for new applications. The very scope of these prospects for the immediate future is a problem for the IT planner or administrator. Technology procurement and implementation, integration of new technologies into the existing infrastructure, cost recovery and usage of networks and networked resources, training issues, and security concerns such as data protection and access to experiments are just some of the issues that need to be considered in the emerging IT environment. As managers we must use technology to improve competitiveness. When procuring new systems, we must take advantage of scalable resources. New resources such as distributed file systems can improve access to and efficiency of existing operating systems. In addition, we must assess opportunities to improve information worker productivity and information management through technologies such as distributed computational visualization and teleseminar applications.
Takahashi, I.; Nakamura, H.; Suzuki, W.; Kunugi, T.; Aoi, S.; Fujiwara, H.
J-RISQ (Japan Real-time Information System for earthquake) has been under development at NIED to support appropriate first actions after large earthquakes. When an earthquake occurs, seismic intensities (SI) are first calculated at each observation station and sent to the Data Management Center at different times. The system begins its first estimation when the number of stations observing an SI of 2.5 or larger exceeds a threshold. It estimates the SI distribution, exposed population, and earthquake damage to buildings using basic estimation data, such as subsurface amplification factors, population, and building information, accumulated in J-SHIS (Japan Seismic Hazard Information Station), a public portal for seismic hazard information across Japan developed by NIED. The estimation is performed for each 250 m square mesh, and the estimated data are finally converted into information for each municipality. Since October 2013, we have made the estimated SI, exposed population, etc. available to the public through a website making full use of maps and tables. In the previous system, we sometimes could not inspect information for the areas surrounding the region of strong motion, or the details of areas of interest, and could not confirm whether the presented information was the latest without accessing the website. J-RISQ has been improved by introducing the following functions to address these problems and to promote use at the local and personal level; a website in English has also been released.
・Specific areas can be focused on and their information inspected in enlarged form.
・The estimated information can be downloaded in the form of KML.
・The estimated information is updated automatically and provided as the latest version.
・The newest information can be inspected using RSS readers or RSS-capable browsers.
・Dedicated pages for smartphones have been prepared.
The information estimated
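The trigger step described in the abstract (begin estimation once enough stations report an SI of 2.5 or larger) can be sketched as follows. The station names, the minimum-station count, and the toy per-mesh amplification step are illustrative assumptions, not NIED's actual implementation.

```python
SI_TRIGGER = 2.5    # seismic intensity at which a station counts as "observing"
MIN_STATIONS = 5    # assumed number of triggered stations needed to start estimation

def should_start_estimation(station_si):
    """station_si: mapping of station id -> latest observed seismic intensity."""
    triggered = [sid for sid, si in station_si.items() if si >= SI_TRIGGER]
    return len(triggered) >= MIN_STATIONS

def estimate_mesh_si(bedrock_si, amplification):
    """Toy per-mesh estimate for a 250 m cell: SI is roughly log-scaled,
    so a subsurface amplification factor acts as an additive correction."""
    return bedrock_si + amplification

obs = {"ST01": 3.1, "ST02": 2.7, "ST03": 4.0, "ST04": 2.9, "ST05": 3.3}
print(should_start_estimation(obs))  # True: five stations at or above 2.5
```

The same predicate can be re-evaluated as late-arriving station reports come in, matching the abstract's note that data reach the Data Management Center at different times.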
Jüngling, Sebastian; Schroeder, Matthias; Lühr, Birger-Gottfried; Woith, Heiko; Wächter, Joachim
This year's Information Technology Resources Assessment (ITRA) is something of a departure from traditional practice. Past assessments have concentrated on developments in fundamental technology, particularly with respect to hardware. They form an impressive chronicle of decreasing cycle times, increasing densities, decreasing costs (or, equivalently, increasing capacity and capability per dollar spent), and new system architectures, with a leavening of operating systems and languages. Past assessments have aimed -- and succeeded -- at putting information technology squarely in the spotlight; by contrast, in the first part of this assessment, we would like to move it to the background, and encourage the reader to reflect less on the continuing technological miracles of miniaturization in space and time and more on the second- and third-order implications of some possible workplace applications of these miracles. This Information Technology Resources Assessment is intended to provide a sense of technological direction for planners in projecting the hardware, software, and human resources necessary to support the diverse IT requirements of the various components of the DOE community. It is also intended to provide a sense of our new understanding of the place of IT in our organizations.
Edelson, Burton I.; Pelton, Joseph N.; Bostian, Charles W.; Brandon, William T.; Chan, Vincent W. S.; Hager, E. Paul; Helm, Neil R.; Jennings, Raymond D.; Kwan, Robert K.; Mahle, Christoph E.
NASA and the National Science Foundation (NSF) commissioned a panel of U.S. experts to study the international status of satellite communications systems and technology. The study covers emerging systems concepts, applications, services, and the attendant technologies. The panel members traveled to Europe, Japan, and Russia to gather information firsthand. They visited 17 sites in Europe, 20 in Japan, and 4 in Russia. These included major manufacturers, government organizations, service providers, and associated research and development facilities. The panel's report was reviewed by the sites visited, by the panel, and by representatives of U.S. industry. The report details the information collected and compares it to U.S. activities.
Bensen, G. D.; Meertens, C. M.; Sheehan, A. F.
Some recent research at UNAVCO and the University of Colorado has focused on Rocky Mountain tectonics and on Information Technology (IT) in the areas of data visualization and distributed data serving. At UNAVCO, we are participating in the geodynamics work of the Rocky Mountain Testbed of GEON, an NSF-funded IT research project (www.geongrid.org). As part of this work, a variety of seismic tomography models, GPS velocity vector data, strain rate models, and other data have been recompiled into a standard format. These data and models are being incorporated into our OPeNDAP server and the Integrated Data Viewer (IDV). OPeNDAP servers are platform-independent, self-describing distributed data servers allowing easy access for a wide audience. The IDV is a freely distributed visualization and analysis tool developed by UCAR with several powerful capabilities, such as online collaboration and a variety of 1-D, 2-D, and 3-D viewing options. Necessary solid-earth viewing capabilities (earthquakes, focal mechanisms, faults, etc.) are currently being added to the IDV. Both our OPeNDAP server and visualization tool are being integrated into the GEON portal, a website for data searching, analysis, and visualization. Designing and implementing such systems now allows us to be better prepared for the volumes of data anticipated from various EarthScope projects. As part of the scientific research for GEON, we have also begun investigations of Colorado seismicity. The 1992 Rocky Mountain Front IRIS/PASSCAL seismic experiment recorded many local earthquakes. We have begun to locate these events and are working to create focal mechanisms and stress-drop calculations for this region. These will aid in improving seismic hazard and risk assessments for the rapidly growing Rocky Mountain population. New IT capabilities will help augment the quality of this work by sharing the data with a larger audience, providing a means to view and analyze integrated data, and quickly
Context: Damage and loss of life sustained during an earthquake result from falling structures and flying glass and objects. To address these and other problems, new information technology and systems can improve crisis management and crisis response. The most important factor in managing a crisis is readiness before the disaster, supported by useful data. Aims: This study aimed to describe the Earthquake Information Management Systems (EIMS) in India, Afghanistan, and Iran, and how EIMS can reduce destruction through crisis management. Materials and Methods: This study was an analytical comparison in which data were collected by questionnaire, observation, and checklist. The population was the EIMS of the selected countries. Sources of information were staff in related organizations, scientific documentation, and the Internet. For data analysis, the Criteria Rating Technique, the Delphi Technique, and descriptive methods were used. Results: Findings showed that the EIMS in India (Disaster Information Management System), Afghanistan (Management Information for Natural Disasters), and Iran are decentralized. The Indian state has organized an expert group to examine issues of disaster-reduction strategy. In Iran, there was no useful and efficient EIMS for evaluating earthquake information. Conclusions: The outcomes make clear that an information system can influence decisions only if it is relevant, reliable, and available to decision-makers in a timely fashion. It is therefore necessary to reform and design a model. The model contains the responsible organizations and their functions. PMID:23555130
Healthcare executives facing the challenges of delivering quality care and controlling costs must consider the role information technology systems can play in meeting those challenges. To make the best use of information system expenditures, organizations must carefully plan how to finance system acquisitions. Some options that should be considered are paying cash, financing, financing "soft" costs, leasing, credit warehousing and early acceptance financing, and tax-exempt and conduit financing. PMID:10154097
Will, Barbara, Ed.
Describes six information technology projects in California libraries, including Internet access in public libraries; digital library developments at the University of California, Berkeley; the World Wide Web home page for the state library; Pacific Bell's role in statewide connectivity; state government initiatives; and services of the state…
Couch, Carl J.
Presents a sociohistorical analysis of the development of a set of basic information technologies, namely, abstract symbols for quantities of space and time, and the formulation of computational strategies. Shows how numbers and geometry can be seen to reflect increases in a society's knowledge and social organization. (ARH)
Schroeder, M.; Stender, V.; Jüngling, S.
J. BOOKER; M. MEYER; ET AL
The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R and D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.
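The idea of combining an expert-judgment prior with sparse test data and tracking the resulting performance estimate can be illustrated with a minimal conjugate Bayesian update. The Beta-Binomial model and all numbers below are illustrative assumptions in the spirit of the approach described above, not the PREDICT methodology itself.

```python
# Illustrative sketch: integrate expert judgment (encoded as a prior) with
# sparse pass/fail test data to track an evolving reliability estimate.

def update_reliability(prior_a, prior_b, successes, failures):
    """Conjugate Beta(a, b) update for a pass/fail reliability metric.
    Returns the posterior parameters and posterior mean reliability."""
    a = prior_a + successes
    b = prior_b + failures
    return a, b, a / (a + b)

# Expert elicitation encoded as Beta(18, 2): "roughly 90% reliable,
# with weight equivalent to 20 prior trials" (assumed numbers).
a, b, r = update_reliability(18, 2, successes=4, failures=0)
print(round(r, 3))  # 0.917: four clean tests nudge the estimate upward
```

Because the posterior becomes the next prior, the same call can be repeated as the system under change accumulates more evidence, which is the "tracking" aspect the abstract emphasizes.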
Describes methods to access current earthquake information from the National Earthquake Information Center. Enables students to build genuine learning experiences using real data from earthquakes that have recently occurred. (JRH)
Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and what lessons were learned in handling this emergency are discussed. The problem of loose asbestos is addressed. (GR)
Bansal, Brijesh; Verma, Mithila
Science and Technology (S & T) interventions are considered to be very important in any effort related to earthquake risk reduction. Their three main components are: earthquake forecast, assessment of earthquake hazard, and education and awareness. In India, although the efforts towards earthquake forecast were initiated about two decades ago, systematic studies started recently with the launch of a National Program on Earthquake Precursors. The quantification of seismic hazard, which is imperative in the present scenario, started in India with the establishment of first seismic observatory in 1898 and since then a substantial progress has been made in this direction. A dedicated education and awareness program was initiated about 10 years ago to provide earthquake education and create awareness amongst the students and society at large. The paper highlights significant S & T efforts made in India towards reduction of risk due to future large earthquakes.
Benchmarking is a methodology for searching out industry best practices that lead to superior performance. It is exchanging information, not just with any organization, but with organizations known to be the best within PNNL, in industry, or in dissimilar industries with equivalent functions. It is used as a continuous improvement tool for business and technical processes, products, and services. Information technology--comprising all computer and electronic communication products and services--underpins the development and/or delivery of many PNNL products and services. This document describes the Pacific Northwest National Laboratory's (PNNL's) approach to information technology (IT) benchmarking. The purpose is to engage other organizations in the collaborative process of benchmarking in order to improve the value of IT services provided to customers. The document's intended audience consists of other US Department of Energy (DOE) national laboratories and their IT staff. Although the individual participants must define the scope of collaborative benchmarking, an outline of IT service areas for possible benchmarking is described.
Yin, L.; Heaton, T. H.
Most current Earthquake Early Warning (EEW) technologies focus on real-time analysis of wave amplitudes. These waveform-based techniques have two major drawbacks: a tradeoff between magnitude and distance estimation for onsite algorithms, and alert latency for network algorithms. We propose an alternative EEW algorithm that combines the efficiency of onsite algorithms with the accuracy of network algorithms, providing the fastest possible alert at the moment a station triggers. This is achieved by using observed seismicity from the network as prior information to predict short-term seismic hazard, and then using trigger information from the onsite station as a likelihood to estimate earthquake probability and hypocenter location. The algorithm has a number of advantages. First, because of its independent data source, its results can be multiplied directly with the results of other algorithms, such as those based on GPS and waveform data, under a Bayesian framework to obtain a posterior probability function. Second, it is especially beneficial for regions with sparse station density, where seismic signals take longer to arrive at the nearest stations. Lastly, it can significantly speed up the warning process during aftershock sequences, swarm earthquake sequences, and mainshocks preceded by foreshocks. The concept can be further extended to network-based algorithms to incorporate waveform data as it arrives at more stations.
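The prior-times-likelihood combination described above can be sketched over a discretized set of candidate source locations. The grid of candidate cells, the prior rates, and the likelihood values below are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a Bayesian update over candidate hypocenter cells:
# a short-term hazard prior built from observed network seismicity is
# multiplied by a trigger likelihood from the onsite station, then
# renormalized to a posterior probability over cells.

def posterior_location(prior_rate, trigger_likelihood):
    """Element-wise Bayes update over candidate hypocenter cells."""
    post = [p * l for p, l in zip(prior_rate, trigger_likelihood)]
    total = sum(post)
    return [x / total for x in post]

# Three candidate source cells near the triggered station (toy values).
prior = [0.7, 0.2, 0.1]        # from recent seismicity rates (e.g., an aftershock zone)
likelihood = [0.1, 0.6, 0.3]   # from the onsite station's trigger observation
post = posterior_location(prior, likelihood)
print([round(p, 3) for p in post])  # the middle cell becomes the most probable source
```

Because the result is itself a probability distribution, it can be multiplied with the output of an independent algorithm (e.g., one using GPS data) and renormalized again, which is the Bayesian chaining the abstract describes.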
Living in postindustrial, 21st-century society means being surrounded by the accoutrements of information technology. Information technology is in people's offices, cars and homes. One third of adults do not deal well with information technology, according to the research of Larry Rosen, psychology professor, author, and pundit. Rosen is the Paul…
The full impact of the current information technology and networking revolution remains unknown, but the experiences of organizations and individuals who are using the tools and resources offered by information technology suggest that it may change our social fabric. Some of the current and emerging trends in information technology include: the…
Eggert, Silke; Fohringer, Joachim
Natural disasters like earthquakes require a fast response from local authorities. Well-trained rescue teams have to be available, equipment and technology have to be set up and ready, and information has to be directed to the right places so that headquarters can manage the operation precisely. The main goal is to reach the most affected areas in a minimum of time. But even with the best preparation, there will always be uncertainty about what really happened in the affected area. Modern geophysical sensor networks provide high-quality data. These measurements, however, map only disjoint values at their respective locations for a limited set of parameters. Using observations from witnesses is one approach to enhancing measured sensor values ("humans as sensors"). Such observations are increasingly disseminated via social media platforms. These "social sensors" offer several advantages over conventional sensors, e.g., high mobility, high versatility of captured parameters, and rapid distribution of information. Moreover, the amount of data offered by social media platforms is quite extensive. We analyze messages distributed via Twitter after major earthquakes to obtain rapid information on what eyewitnesses report from the epicentral area. We use this information to (a) quickly learn about damage and losses to support fast disaster response, and (b) densify geophysical networks in areas with sparse information to gain more detailed insight into felt intensities. We present a case study from the Mw 7.1 Philippines (Bohol) earthquake of Oct. 15, 2013. We extract Twitter messages, so-called tweets, containing one or more specified keywords from the semantic field of "earthquake" and use them for further analysis. For the time frame of Oct. 15 to Oct. 18 we obtain a database of 50,000 tweets in total, of which 2,900 are geo-localized and 470 have a photo attached. Analyses for both national level and locally for
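The extraction step described above (filtering messages for keywords from the semantic field of "earthquake" and separating the geo-localized subset) can be sketched as follows. The keyword list and the message format are illustrative assumptions, not the authors' pipeline.

```python
import re

# Assumed keyword set; "lindol" is the Filipino word for earthquake,
# relevant for the Bohol case study.
KEYWORDS = {"earthquake", "quake", "tremor", "lindol"}

def matches(text):
    """True if any keyword appears as a whole word, ignoring case and punctuation."""
    tokens = re.findall(r"\w+", text.lower())
    return any(k in tokens for k in KEYWORDS)

tweets = [
    {"text": "Strong earthquake felt in Bohol!", "geo": (9.85, 124.14)},
    {"text": "Traffic is terrible today", "geo": None},
    {"text": "Lindol! Everyone ran outside", "geo": None},
]
hits = [t for t in tweets if matches(t["text"])]
geolocalized = [t for t in hits if t["geo"] is not None]
print(len(hits), len(geolocalized))  # 2 matching tweets, 1 with coordinates
```

Tokenizing with `\w+` rather than splitting on whitespace matters here: it strips the trailing punctuation that would otherwise prevent "Lindol!" from matching the keyword.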
Describes applications of new technology to information handling on the basis of two trends--the miniaturization of electronic circuits and the shift from analog to digital modes of communication. Information technologies discussed are microcomputers, word processors, telecommunications, storage technologies, databases, videotex, and teletext.…
Potter, Calvin J.; Lohr, Neah J.; Klein, Jim; Sorensen, Richard J.
Intended to help library media specialists, technology educators, and curriculum planning teams identify where specific information and technology competencies might best fit into the assessed content areas of the curriculum, this document presents a matrix that identifies the correlation between Wisconsin's Information and Technology Literacy…
Nevertheless, basic earthquake-related information has always been of consuming interest to the public and the media in this part of California (fig. 2). So it is not surprising that earthquake prediction continues to be a significant research program at the laboratory. Several of the current spectrum of projects related to prediction are discussed below.
Kruse, F. A.; Kim, A. M.; Runyon, S. C.; Carlisle, Sarah C.; Clasen, C. C.; Esterline, C. H.; Jalobeanu, A.; Metcalf, J. P.; Basgall, P. L.; Trask, D. M.; Olsen, R. C.
The Naval Postgraduate School (NPS) Remote Sensing Center (RSC) and research partners have completed a remote sensing pilot project in support of California post-earthquake-event emergency response. The project goals were to dovetail emergency management requirements with remote sensing capabilities to develop prototype map products for improved earthquake response. NPS coordinated with emergency management services and first responders to compile information about essential elements of information (EEI) requirements. A wide variety of remote sensing datasets including multispectral imagery (MSI), hyperspectral imagery (HSI), and LiDAR were assembled by NPS for the purpose of building imagery baseline data and to demonstrate the use of remote sensing to derive ground surface information for use in planning, conducting, and monitoring post-earthquake emergency response. Worldview-2 data were converted to reflectance, orthorectified, and mosaicked for most of Monterey County, CA. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data acquired at two spatial resolutions were atmospherically corrected and analyzed in conjunction with the MSI data. LiDAR data at point densities from 1.4 points/m2 to over 40 points/m2 were analyzed to determine digital surface models. The multimodal data were then used to develop change detection approaches and products and other supporting information. Analysis results from these data along with other geographic information were used to identify and generate multi-tiered products tied to the level of post-event communications infrastructure (internet access + cell, cell only, no internet/cell). Technology transfer of these capabilities to local and state emergency response organizations gives emergency responders new tools in support of post-disaster operational scenarios.
This article begins with an introduction to recent developments in information technology, including investment activities related to the technology in Europe, Japan, and the United States. It then deals with the challenging issues of access to electronic information of the U.S. government, fee or free for electronic information in publicly…
Bernardino, M. J.; Hayes, G. P.; Dannemann, F.; Benz, H.
One of the main missions of the United States Geological Survey (USGS) National Earthquake Information Center (NEIC) is the dissemination of information to national and international agencies, scientists, and the general public through various products such as ShakeMap and earthquake summary posters. During the summer of 2012, undergraduate and graduate student interns helped to update and improve our series of regional seismicity posters and regional tectonic summaries. The "Seismicity of the Earth (1900-2007)" poster placed over a century's worth of global seismicity data in the context of plate tectonics, highlighting regions that have experienced great (M8.0+) earthquakes, and the tectonic settings of those events. This endeavor became the basis for a series of more regionalized seismotectonic posters that focus on major subduction zones and their associated seismicity, including the Aleutian and Caribbean arcs. The first round of these posters was inclusive of events through 2007, and was made with the intent of being continually updated. Each poster includes a regional tectonic summary, a seismic hazard map, focal depth cross-sections, and a main map that illustrates the following: the main subduction zone and other physiographic features, seismicity, and rupture zones of historic great earthquakes. Many of the existing regional seismotectonic posters have been updated and new posters highlighting regions of current seismological interest have been created, including the Sumatra and Java arcs, the Middle East region, and the Himalayas (all of which are currently in review). These new editions include updated lists of earthquakes, expanded tectonic summaries, updated relative plate motion vectors, and major crustal faults. These posters thus improve upon previous editions that included only brief tectonic discussions of the most prominent features and historic earthquakes, and which did not systematically represent non-plate-boundary faults. Regional tectonic
Anthony, Denise; Campbell, Andrew T.; Candon, Thomas; Gettinger, Andrew; Kotz, David; Marsch, Lisa A.; Molina-Markham, Andrés; Page, Karen; Smith, Sean W.; Gunter, Carl A.; Johnson, M. Eric
Dartmouth College’s Institute for Security, Technology, and Society conducted three workshops on securing information technology in healthcare, attended by a diverse range of experts in the field. This article summarizes the three workshops. PMID:25379030
Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.
In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.
Schultz, Mark D.
According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools warm air generated by the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat from the rack of information technology equipment.
Ewart, R. W.
As new information technology options become available, office automation systems are being introduced along with campus networks, and management information service requirements are evolving. Eight common strategies found in 10 American institutions are reported. (Author/MLW)
Cowen, A R; Denney, J P
On January 25, 1 week after the most devastating earthquake in Los Angeles history, the Southern California Hospital Council released the following status report: 928 patients evacuated from damaged hospitals. 805 beds available (136 critical, 669 noncritical). 7,757 patients treated/released from EDs. 1,496 patients treated/admitted to hospitals. 61 dead. 9,309 casualties. Where do we go from here? We are still waiting for the "big one." We'll do our best to be ready when Mother Nature shakes, rattles and rolls. The efforts of Los Angeles City Fire Chief Donald O. Manning cannot be overstated. He maintained department command of this major disaster and is directly responsible for implementing the fire department's Disaster Preparedness Division in 1987. Through the chief's leadership and ability to forecast consequences, the city of Los Angeles was better prepared than ever to cope with this horrendous earthquake. We also pay tribute to the men and women who are out there each day, where "the rubber meets the road." PMID:10133439
Heath, Melissa Allen; Dean, Brenda
Over the past decade, catastrophic earthquakes have garnered international attention regarding the need for improving immediate and ongoing support services for disrupted communities. Following the December 26, 2004 Indonesian earthquake, the Indian Ocean tsunami was responsible for displacing millions and taking the lives of an estimated 320,000…
Technology Teacher, 1992
This learning module gives background information on earthquakes, their measurement, and sociocultural impact. A design brief contains context, objectives, challenge to students, evaluation method, student quiz, outcomes, glossary, and eight references. (SK)
Li, Boren; Wu, Jianping; Pan, Mao; Huang, Jing
In hazard management, earthquake researchers have utilized GIS to ease the process of managing disasters, and WebGIS to assess hazards and seismic risk. Although such systems provide a visual analysis platform based on GIS technology, they lack a general approach to extending WebGIS for processing dynamic data, especially real-time data. In this paper, we propose a novel real-time 3D visual earthquake information publishing model based on WebGIS and a digital globe to improve the ability of WebGIS-based systems to process real-time data. On the basis of this model, we implement a real-time 3D earthquake information publishing system, EqMap3D. The system can not only publish real-time earthquake information but also display these data and their background geoscience information in a 3D scene. It provides a powerful tool for display, analysis, and decision-making for researchers and administrators, and facilitates better communication between geoscience researchers and the interested public.
Compares technology predictions from around 1989 with the technology of 2002. Discusses the place of computer-based assessment today, computer-scored testing, computer-administered formal assessment, Internet-based formal assessment, computerized adaptive tests, placement tests, informal assessment, electronic portfolios, information management,…
A flood of new electronic technologies promises to usher in the Information Age and alter economic and social structures. Telematics, a potent combination of telecommunications and computer technologies, could eventually bring huge volumes of information to great numbers of people by making large data bases accessible to computer terminals in…
Comptroller General of the U.S., Washington, DC.
Addressed to the new administration and the Congress, this summary report on Federal Government information management and technology issues begins by describing the environment in which information technology has been managed. Arguing that effective government depends directly on effective automation to support programs and initiatives, the…
Schultz, Mark D.
According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.
Davies, Philip H. J.
Addresses the use of information technology for intelligence and information warfare in the context of national security and reviews the status of clandestine collection. Discusses hacking, human agent collection, signal interception, covert action, counterintelligence and security, and communications between intelligence producers and consumers…
This paper first evaluates the earthquake prediction method (1999) used by the US Geological Survey as the lead example and also reviews recent models. Secondly, it points out the ongoing debate on the predictability of earthquake recurrences and lists the main claims of both sides. The traditional methods and the "frequentist" approach used in determining earthquake probabilities cannot end the complaints that earthquakes are unpredictable. It is argued that the prevailing "crisis" in seismic research corresponds to the Pre-Maxent Age of the current situation. The period of Kuhnian "crisis" should give rise to a new paradigm based on an information-theoretic framework including the inverse problem, Maxent, and Bayesian methods. The paper aims to show that information-theoretic methods can provide the required "Methodica Firma" for earthquake prediction models.
The context for learning, education, and arts has altered dramatically, as has the cultural environment for educators and those involved in artistic and creative activities. Several crucial developments have transformed the terrain of technology, education, art, and culture, profoundly affecting not only the social and political structure of…
Patton, John M.; Ketchum, David C.; Guy, Michelle R.
This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.
When looking at the history of technology, we can see that not all inventions are of equal importance. Only a few technologies have the potential to start a new branching series (specifically, by increasing diversity), have a lasting impact on human life and ultimately become turning points. Technological transitions correspond to times and places in the past when a large number of novel artefact forms or behaviours appeared together or in rapid succession. Why does that happen? Is technological change continuous and gradual, or does it occur in sudden leaps and bounds? The evolution of information technology (IT) allows for a quantitative and theoretical approach to technological transitions. The value of information systems experiences sudden changes (i) when we learn how to use the technology, (ii) when we accumulate a large amount of information, and (iii) when communities of practice create and exchange free information. The coexistence of gradual improvements and discontinuous technological change is a consequence of the asymmetric relationship between complexity and hardware and software. Using a cultural evolution approach, we suggest that sudden changes in the organization of ITs depend on the high costs of maintaining and transmitting reliable information. This article is part of the themed issue 'The major synthetic evolutionary transitions'. PMID:27431527
The development and implementation of an institutional framework to guide the management and use of information technologies (computing, office automation, and telecommunications) at Mount Royal College in Calgary, Alberta, are described. (Author/MLW)
Wald, David J.; Hayes, Gavin P.; Benz, Harley M.; Earle, Paul; Briggs, Richard W.
The M 9.0 11 March 2011 Tohoku, Japan, earthquake and associated tsunami near the east coast of the island of Honshu caused tens of thousands of deaths and potentially over one trillion dollars in damage, resulting in one of the worst natural disasters ever recorded. The U.S. Geological Survey National Earthquake Information Center (USGS NEIC), through its responsibility to respond to all significant global earthquakes as part of the National Earthquake Hazards Reduction Program, quickly produced and distributed a suite of earthquake information products to inform emergency responders, the public, the media, and the academic community of the earthquake's potential impact and to provide scientific background for the interpretation of the event's tectonic context and potential for future hazard. Here we present a timeline of the NEIC response to this devastating earthquake in the context of rapidly evolving information emanating from the global earthquake-response community. The timeline includes both internal and publicly distributed products, the relative timing of which highlights the inherent tradeoffs between the requirement to provide timely alerts and the necessity for accurate, authoritative information. The timeline also documents the iterative and evolutionary nature of the standard products produced by the NEIC and includes a behind-the-scenes look at the decisions, data, and analysis tools that drive our rapid product distribution.
Jimbo, Masahito; Nease, Donald E; Ruffin, Mack T; Rana, Gurpreet K
Information technology is rapidly advancing and making its way into many primary care settings. The technology may provide the means to increase the delivery of cancer preventive services. The aim of this systematic review is to examine the literature on information technology impacts on the delivery of cancer preventive services in primary care offices. Thirty studies met our selection criteria. Technology interventions studied to date have been limited to some type of reminder to either patients or providers. Patient reminders have been mailed before appointments, mailed unrelated to an appointment, mailed after a missed appointment, or given at the time of an appointment. Telephone call interventions have not used technology to automate the calls. Provider interventions have been primarily computer-generated reminders at the time of an appointment. However, there has been limited use of computer-generated audits, feedback, or report cards. The effectiveness of information technology on increasing cancer screening was modest at best. The full potential of information technology to unload the provider-patient face-to-face encounter has not been examined. There is critical need to study these new technologic approaches to understand the impact and acceptance by providers and patients. PMID:16449184
McNulty, Tom, Ed.
Four issues of this newsletter on information technology and disabilities (ITD) contain the following articles: "Developing an Accessible Online Public Access Catalog at the Washington Talking Book and Braille Library" (Charles Hamilton); "Assistive Technology in the Science Laboratory: A Talking Laboratory Work Station for Visually Impaired…
Anderson, John A.
The role of information technology (IT) is changing and is becoming more important to the overall success of colleges today. The structure of IT has not changed much through the years, but a growing number of institutions are merging multiple areas of technology back into a single IT organization. The model of IT explored in…
Haag, Stephen; Keen, Peter
This textbook is designed for a one-semester introductory course in which the goal is to give students a foundation in the basics of information technology (IT). It focuses on how the technology works, issues relating to its use and development, how it can lend personal and business advantages, and how it is creating a globally networked society.…
Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team
A devastating earthquake had been predicted for May 11, 2011 in Rome. This prediction was never released officially by anyone, but it grew up on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions. Indeed, around May 11, 2011, a planetary alignment was actually expected, and this lent credibility to the earthquake prediction among the public. During the previous months, INGV was overwhelmed with requests for information about this supposed prediction from Roman inhabitants and tourists. Given the considerable media impact of this expected earthquake, INGV decided to organize an Open Day at its headquarters in Rome for people who wanted to learn more about Italian seismicity and the earthquake as a natural phenomenon. The Open Day was preceded by a press conference two days before, in which we talked about this prediction, presented the Open Day, and had a scientific discussion with journalists about earthquake prediction and, more generally, the real problem of seismic risk in Italy. About 40 journalists from newspapers, local and national TV stations, press agencies and web news outlets attended the press conference, and hundreds of articles appeared in the following days, advertising the 11 May Open Day. INGV opened to the public all day long (9am - 9pm) with the following program: i) meetings with INGV researchers to discuss scientific issues; ii) visits to the seismic monitoring room, open 24h/7 all year; iii) guided tours through interactive exhibitions on earthquakes and Earth's deep structure; iv) lectures on general topics from the social impact of rumors to seismic risk reduction; v) 13 new videos on the YouTube.com/INGVterremoti channel to explain the earthquake process and give updates on various aspects of seismic monitoring in Italy; vi) distribution of books and brochures. Surprisingly, more than 3000 visitors came to visit INGV
Ohio State Dept. of Education, Columbus.
This profile includes a comprehensive set of information technology competencies that are grounded in core academic subject areas and built around four occupational clusters (information services and support, network systems, programming and software development, and interactive media) that reflect the job opportunities and skills required for…
Dyson, Laurel, Ed.; Hendriks, Max, Ed.; Grant, Stephen, Ed.
Information Technology and Indigenous People provides theoretical and empirical information related to the planning and execution of IT projects aimed at serving indigenous people. It explores many cultural concerns with IT implementation, including language issues and questions of cultural appropriateness, and brings together cutting-edge…
Computers open the door to an ever-expanding arena of knowledge and technology. Most nurses practicing in the perianesthesia setting were educated before the computer era, and many fear computers and the associated technology. Frequently, the greatest difficulty is finding the resources and knowing what questions to ask. The following is the first in a series of articles on computers and information technology. This article discusses the computer hardware needed to get the novice started or the experienced user upgraded to access new technologies and the Internet. Future articles will discuss start-up and common software applications, getting up to speed on the information superhighway, and other technologies that will broaden our knowledge and expand our personal and professional world. PMID:9543967
The Southern California Earthquake Center's Fault Information System (FIS) provides a single point of access to fault-related data and models from multiple databases and datasets. The FIS is built of computer code, metadata and Web interfaces based on Web services technology, which enables queries and data interchange irrespective of computer software or platform. Currently we have working prototypes of programmatic and browser-based access. The first-generation FIS may be searched and downloaded live, by automated processes as well as interactively by humans using a browser. Users get ASCII data in plain text or encoded in XML. Via the Earthquake Information Technology (EIT) Interns (Juve and others, this meeting), we are also testing the effectiveness of querying multiple databases using a fault database ontology. For more than a decade, the California Geological Survey (CGS), SCEC, and the U.S. Geological Survey (USGS) have put considerable, shared resources into compiling and assessing published fault data, then providing the data on the Web. Several databases now exist, with different formats, datasets, purposes, and users, in various stages of completion. When fault databases were first envisioned, the full power of today's internet was not yet recognized, and the databases became the Web equivalents of review papers, where one could read an overview summation of a fault, then copy and paste pertinent data. Today, numerous researchers also require rapid queries and downloads of data. Consequently, the first components of the FIS are MySQL databases that deliver numeric values from earlier, text-based databases. Another essential service provided by the FIS is visualization of fault representations such as those in SCEC's Community Fault Model. The long-term goal is to provide a standardized, open-source, platform-independent visualization technique. Currently, the FIS makes available fault model viewing software for users with access to Matlab or Java3D
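As a sketch of what programmatic access to XML-encoded fault data can look like, the snippet below parses a small fault record with Python's standard library. The element and attribute names here are invented for illustration; they are not the actual FIS schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML payload in the spirit of an FIS download;
# the tag names and attributes are illustrative only.
payload = """
<faults>
  <fault name="San Andreas (Mojave)">
    <slipRate units="mm/yr">30.0</slipRate>
    <dip units="deg">90</dip>
  </fault>
</faults>
"""

root = ET.fromstring(payload)
for fault in root.findall("fault"):
    # findtext returns the element's text content as a string
    rate = float(fault.findtext("slipRate"))
    print(fault.get("name"), rate)
```

A client could fetch such a payload from a web-service endpoint and feed the parsed values directly into a hazard calculation, which is the kind of automated query-and-download workflow the abstract describes.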
When Eric Calais, professor of geophysics in Purdue University's Department of Earth and Atmospheric Sciences, first learned about the 12 January strike-slip earthquake along a portion of the Enriquillo-Plantain Garden fault zone (EPGFZ) in Haiti, he knew right away that it would be a shallow event and a large event, very close to the capital city of Port-au-Prince. Having worked in Haiti, he also was aware that the poor nation lacks seismic and building construction codes. "My immediate reaction was, 'This is going to be a total nightmare and a huge disaster for Haiti,'" Calais, who also is a researcher at the French National Center for Scientific Research, told Eos. The main earthquake, currently estimated at magnitude 7.0, occurred at 21:53:10 UTC at a depth of 13 kilometers, just 25 kilometers outside of Port-au-Prince, the U.S. Geological Survey (USGS) reports. Since then, there have been dozens of aftershocks, many of them above magnitude 5.0; these aftershocks could continue for weeks or even months, according to USGS (see Figure 1). In recent decades, there had not been a major earthquake along the approximately 600-kilometer-long EPGFZ (named after its end points in Jamaica and the Dominican Republic), although seismologists indicate that large earthquakes in 1860, 1770, and earlier likely originated along that system.
Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing
At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot quickly and effectively obtain the precise information needed for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information, based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information from high-resolution remote sensing.
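The overall-accuracy figures quoted above (90.38% versus 76.84%) are conventionally computed as the fraction of correctly classified pixels, i.e. the diagonal sum of a confusion matrix divided by its total. A minimal sketch, with invented counts for illustration:

```python
def overall_accuracy(confusion):
    """Overall accuracy = correctly classified pixels / all pixels.

    `confusion[i][j]` counts pixels of true class i assigned to
    class j, so the diagonal holds the correct assignments.
    """
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total


# Illustrative two-class matrix (collapsed vs. intact); the counts
# are made up and are not the study's actual data.
cm = [[47, 3],
      [5, 45]]
print(overall_accuracy(cm))  # 0.92
```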
ATTIC (Alternative Treatment Technology Information Center) is an on-line computer database and repository for information on remediation and treatment technologies. It contains several of EPA's technology databases, including the Treatment Technology Database, the RREL (Risk Redu...
Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Fréderic; Caroline, Etivant
Historical earthquakes are only known to us through written recollections, and so seismologists have long experience interpreting the reports of eyewitnesses, which probably explains why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered the digital nervous system, comprising digital veins and intertwined sensors that capture the pulse of our planet in near real-time. How can both seismology and the public benefit from this new monitoring system? This paper presents the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We also describe how public expectations within tens of seconds of ground shaking are the basis of improved, diversified information tools which integrate this user-generated content. Special attention is given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application and browser add-on, which deals with the only earthquakes that matter to the public: felt and damaging earthquakes. In conclusion we demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.
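A flashcrowd detector of the kind described can be illustrated as a toy threshold rule over per-minute website hit counts. The window length and trigger factor below are assumptions for illustration, not EMSC's actual parameters:

```python
def detect_flashcrowd(hits_per_min, window=10, factor=5.0):
    """Flag minutes whose traffic exceeds `factor` times the
    trailing-window average: a toy stand-in for flashcrowd
    detection on website traffic (parameters are illustrative).
    """
    alerts = []
    for i in range(window, len(hits_per_min)):
        baseline = sum(hits_per_min[i - window:i]) / window
        if baseline > 0 and hits_per_min[i] > factor * baseline:
            alerts.append(i)  # minute index of the traffic spike
    return alerts


# Steady background traffic, then a sudden surge of visitors
# (the kind of jump a felt earthquake produces within minutes).
traffic = [20, 22, 19, 21, 20, 23, 18, 22, 21, 20, 400, 380, 25]
print(detect_flashcrowd(traffic))  # [10, 11]
```

A production system would of course also account for diurnal traffic cycles and geolocate the incoming visitors, but the core signal is this same abrupt departure from baseline.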
Lauro, Claudia; Avanzini, Marco
its activity in 1981 and consists of 7 stations equipped with seismometers and digital acquisition technology, working 24 hours per day. Moreover, a network of 9 accelerometers has been set up in southern Trentino, where most of the seismic events are concentrated. All the information recorded at each station flows to the "Data Acquisition Central Office", where the data are checked, processed and recorded. The Geological Service manages the seismometric network and elaborates and publishes information regarding the seismicity of the area and its surroundings. In case of an earthquake, the "Seismic Alert", an automatic alarm system, is activated for Civil Protection purposes. The "Seismic Alert" is managed by "Antilope", the consortium of the Eastern Alpine seismometric networks. Seismotectonics is another research field pursued by this Geological Service, to investigate the formation mechanism of earthquakes and estimate the causative tectonic stress in relation to the main tectonic structures of the region and of the whole Alpine chain. Hence the Trento case study presented in this exhibition illustrates the general methodology used to understand the "seismic behaviour" of a region. Finally, this exhibition sector also presents the activity of the Trento Civil Protection in the Abruzzo region, where a dramatic seismic event occurred on 6th April 2009, describing the investigation of the still-occurring surface deformations. This activity is part of a general framework in which the Trento Province provided first aid and assistance to the local communities. The collaboration between the Natural Science Tridentino Museum and the Geological Service of Trento, already fruitful in field geological research, has also been effective in this project of science communication. In the future the two institutions could collaborate on other main themes of the relationship between science and society, regarding the dissemination of Earth Sciences.
Choate, Larry; And Others
A tech prep/associate degree program in information technology was developed to prepare workers for entry into and advancement in occupations entailing applications of scientific principles and higher mathematics in situations involving various office machines. According to the articulation agreement reached, students from five country regional…
Plowman, Travis S.
Considers the impact of information technology on academic integrity. Highlights include machines versus man; honor codes and student cheating; copyrights for digital data; authoring versus writing; intuitive software; and an example and analysis of the use of AutoSummary in Microsoft Word 97 to create a summary of a published article. (Contains…
Ekstrom, Joseph J.; Gorka, Sandra; Kamali, Reza; Lawson, Eydie; Lunt, Barry; Miller, Jacob; Reichgelt, Han
The last twenty years has seen the development of demand for a new type of computing professional, which has resulted in the emergence of the academic discipline of Information Technology (IT). Numerous colleges and universities across the country and abroad have responded by developing programs without the advantage of an existing model for…
McNulty, Tom, Ed.
This document consists of all issues/pages of the electronic journal "Information Technology and Disabilities" published during 1996, i.e., a total of 13 ITD articles: (1) "New CSUF (California State University at Fullerton) Braille Transcription Center Promotes Access to Postsecondary Instructional Materials for the California State University…
McNulty, Tom, Ed.
Four issues of this newsletter on information technology and disabilities (ITD) contain the following articles: "Building an Accessible CD-ROM Reference Station" (Rochelle Wyatt and Charles Hamilton); "Development of an Accessible User Interface for People Who Are Blind or Vision Impaired as Part of the Re-Computerisation of Royal Blind Society…
In this study, the Information Technologies Certificate Program, which is based on synchronous and asynchronous communication methods over the Internet and offered through the cooperation of the Middle East Technical University Computer Engineering Department and Continuing Education Center, was examined. This online certificate program started in May 1998 and it is…
de Rubeis, Valerio; Sbarra, Paola; Sebaste, Beppe; Tosi, Patrizia
The experience of collecting data on earthquake effects and diffusing information to the public, carried out through the site "haisentitoilterremoto.it" (didyoufeelit) managed by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), has shown constantly growing interest from Italian citizens. Started in 2007, the site has collected more than 520,000 compiled intensity questionnaires, producing intensity maps of almost 6,000 earthquakes. One of the most distinctive features of this experience is its bi-directional information exchange. Every person can record observed effects of an earthquake and, at the same time, look at the generated maps. Seismologists, on the other side, can see each earthquake described in real time through its effects across the whole territory. In this way people, giving punctual information, receive global information from the community, mediated and interpreted through seismological knowledge. The relationship among seismologists, mass media and civil society is thus deep and rich. The presence of almost 20,000 permanent subscribers distributed across the whole Italian territory, alerted in case of an earthquake, has reinforced participation: each subscriber is constantly informed by the seismologists, through e-mail, about events occurring in his or her area, even those of very small magnitude. The "alert" service serves as a reminder that earthquakes are a continuously present phenomenon, while also showing that high-magnitude events are very rare. This kind of information is helpful as it is fully complementary to that given by the media. We analyze the effects of our activity on society and mass media. Knowledge of seismic phenomena is present in each person, rooted in fear and in ideas of death and destruction, often with the deep belief that occurrence is very rare. This position feeds refusal and repression. When a strong earthquake occurs, surprise immediately changes into shock and desperation. A
McBride, S.; Tilley, E. N.; Johnston, D. M.; Becker, J.; Orchiston, C.
This research evaluates the public earthquake education information produced prior to the Canterbury earthquake sequence (2010-present) and examines communication lessons to create recommendations for improving the implementation of these types of campaigns in future. The research comes from the practitioner perspective of someone who worked on these campaigns in Canterbury prior to the earthquake sequence and who was also the Public Information Manager Second in Command during the earthquake response in February 2011. Documents created prior to the earthquake sequence, specifically those addressing seismic risk, were analyzed for how closely they aligned with best-practice academic research, using a "best practice matrix" created by the researcher. Readability tests and word counts were also employed to assist with triangulation of the data, as was practitioner involvement. This research also outlines the lessons learned by practitioners and explores their experiences in creating these materials and how they perceive them now, given all that has happened since the inception of the booklets. The findings showed these documents lacked many of the attributes of best practice. The overly long, jargon-filled text contained few positive outcome-expectancy messages and probably failed to persuade anyone that earthquakes were a real threat in Canterbury. Paradoxically, these booklets may have created fatalism in the publics who read them. While the overall intention was positive, for scientists to explain earthquakes, tsunami, landslides and other risks to encourage the public to prepare for these events, the implementation could be greatly improved. This final component of the research highlights points of improvement in implementation for more successful campaigns in future. The importance of preparedness and science information campaigns can lie not only in preparing the population but also in development of
Guptill, Stephen C.
Computerized geographic information systems (GISs) are emerging as the spatial data handling tools of choice for solving complex geographical problems. However, few guidelines exist for assisting potential users in identifying suitable hardware and software. A process to be followed in evaluating the merits of GIS technology is presented. Related standards and guidelines, software functions, hardware components, and benchmarking are discussed. By making users aware of all aspects of adopting GIS technology, they can decide if GIS is an appropriate tool for their application and, if so, which GIS should be used.
The Environmental Handbook Series is designed to overcome the deficiency of information utility and transfer. Each of the works in this series brings together information in an area and format that is useful to both public and private sector needs. It is meant to serve as a basic reference document that will stand for a period of time and help to enrich decisionmaking and research in the interface of energy and the environment. This particular handbook deals with environmental characterization data for the energy technologies and presents the data in a format for use by DOE policy analysts. This treatment includes not only the actual information base, but also a preface which explains the present concept, the historical growth of the program, and the new direction for improved utility. The information base, itself, is constantly being enhanced and is republished periodically as necessary. The specific energy systems for which environmental/technology characterization information is provided are grouped as follows: nuclear energy; coal; petroleum; gas; synthetic fuels; solar energy; geothermal energy; and hydroelectricity.
Graves, Sara; Knoblock, Craig A.; Lannom, Larry
This report provides the results of a panel study conducted into the technology requirements for information management in support of application domains of particular government interest, including digital libraries, mission operations, and scientific research. The panel concluded that it was desirable to have a coordinated program of R&D that pursues a science of information management focused on an environment typified by applications of government interest - highly distributed with very large amounts of data and a high degree of heterogeneity of sources, data, and users.
Zhang, Hudan; Wu, Heng
This paper proposes a new viewpoint on building a CSPMS (Coal-mine Safety Production Management System) by means of information technology. The system, whose core is a four-grade automatically triggered warning system, achieves the goal that information transmission is smooth, lossless and timely. At the same time, the system provides a comprehensive, collective technology platform for various public management organizations and coal-mine production units to deal with safety management, advance warning, unexpected incidents, preplan implementation, and resource deployment at different levels. The system's database will effectively support the related national industry's resource control, planning, statistics and taxation, and the construction of laws and regulations.
Information Technology (IT) Security Risk Management is a critical task for the organization to protect against the loss of confidentiality, integrity and availability of IT resources. As systems become more complex and diverse and attacks from intrusions and malicious content increase, it is becoming increasingly difficult to manage IT security risk. This paper describes a two-pronged approach to addressing IT security risk and risk management in the organization: 1) an institutional enterprise approach, and 2) a project life cycle approach.
Following the 2010 Haiti earthquake, more than two million people moved to temporary camps, most of which arose spontaneously in the days after the earthquake. This study focuses on the material assistance people in five Port-au-Prince camps reported receiving, noting the differences between assistance from formal aid agencies and from 'informal' sources such as family. Seven weeks after the earthquake, 32% of camp dwellers reported receiving no assistance whatsoever; 55% had received formal aid, typically a tent or tarpaulins; and 40% had received informal aid, usually in the form of cash transfers from family living abroad. While people were grateful for any material aid, cash was more frequently considered timely and more effective than aid-in-kind. Should this study be indicative of the greater displaced population, aid agencies should consider how they might make better use of cash transfers as an aid modality. PMID:24601934
The Physics of Information Technology explores the familiar devices that we use to collect, transform, transmit, and interact with electronic information. Many such devices operate surprisingly close to fundamental physical limits. Understanding how such devices work, and how they can (and cannot) be improved, requires deep insight into the character of physical law as well as engineering practice. The book starts with an introduction to units, forces, and the probabilistic foundations of noise and signaling, then progresses through the electromagnetics of wired and wireless communications, and the quantum mechanics of electronic, optical, and magnetic materials, to discussions of mechanisms for computation, storage, sensing, and display. This self-contained volume will help both physical scientists and computer scientists see beyond the conventional division between hardware and software to understand the implications of physical theory for information manipulation.
Tarr, A.; Benz, H.; Earle, P.; Wald, D. J.
Earthquake Summary Posters (ESPs), a new product of the U.S. Geological Survey's Earthquake Program, are produced at the National Earthquake Information Center (NEIC) in Golden. The posters consist of rapidly generated, GIS-based maps made following significant earthquakes worldwide (typically M>7.0, or events of significant media/public interest). ESPs consolidate, in an attractive map format, a large-scale epicentral map, several auxiliary regional overviews (showing tectonic and geographical setting, seismic history, seismic hazard, and earthquake effects), depth sections (as appropriate), a table of regional earthquakes, and a summary of the regional seismic history and tectonics. The immediate availability of the latter text summaries has been facilitated by the Rapid, Accurate Tectonic Summaries (RATS) produced at NEIC and posted on the web following significant events. The rapid production of ESPs has been facilitated by generating, during the past two years, regional templates for tectonic areas around the world, organizing the necessary spatially referenced data for the map base and the thematic layers that overlay the base. These GIS databases enable scripted Arc Macro Language (AML) production of the routine elements of the maps (for example, background seismicity, tectonic features, and probabilistic hazard maps). However, other elements of the maps are earthquake-specific and are produced manually to reflect new data, earthquake effects, and special characteristics. By the end of this year, approximately 85% of the Earth's seismic zones will be covered for generating future ESPs. During the past year, 13 posters were completed, comparable to the yearly average expected for significant earthquakes. Each year, all ESPs will be published on a CD in PDF format as an Open-File Report. In addition, each is linked to the special event earthquake pages on the USGS Earthquake Program web site (http://earthquake.usgs.gov). Although three formats
The 2010 earthquake in Haiti, which killed an estimated 316,000 people, offered many lessons in mass-fatality management (MFM). The dissertation defined MFM in seeking information and in recovery, preservation, identification, and disposition of human remains. Specifically, it examined how mass fatalities were managed in Haiti, how affected…
In this article, the author discusses the likelihood of major earthquakes in both the western and eastern United States, as well as the level of preparedness of each region of the U.S. for a major earthquake. Current technology in both earthquake-resistant design and earthquake detection is described. Governmental programs for earthquake hazard reduction are outlined and critiqued.
MacFarlane, Alistair G J
Technology is the sum of the ways in which social groups manipulate order in the world to achieve their ends. It enables our active engagement with the world. Technology is central to our present well-being and vital for our future survival. As such it needs a coherent world view, a conceptual framework which will enable the fundamental problems that it poses for society to be approached in an illuminating way. Furthermore, such an approach, while remaining convincing, must not be overwhelmed by an ever-increasing welter of specialization and diversity of application. It is the purpose of the set of papers presented here to examine some key aspects of such a conceptual framework; not in the sense of offering a fully worked out philosophy of technology--that would be a huge and complex undertaking--but rather by considering some key topics. Subsidiary aims are to survey important relevant areas, to identify key sources that can provide access points for further study, and to consider some possible future developments. Major, coherent domains of activity are characterized by a few, fundamental, extensively used and essentially unifying concepts. Technology is such a domain, and its fundamental concepts are information, knowledge and agency. The following sections give a synoptic overview of the material presented in this theme issue, and set it within a wider context. PMID:12952675
NASA's Information Technology (IT) resources and IT support continue to be a growing and integral part of all NASA missions. Furthermore, the growing IT support requirements are becoming more complex and diverse. The following are a few examples of the growing complexity and diversity of NASA's IT environment. NASA is conducting basic IT research in the Intelligent Synthesis Environment (ISE) and Intelligent Systems (IS) Initiatives. IT security, infrastructure protection, and privacy of data are requiring more and more management attention and an increasing share of the NASA IT budget. Outsourcing of IT support is becoming a key element of NASA's IT strategy as exemplified by Outsourcing Desktop Initiative for NASA (ODIN) and the outsourcing of NASA Integrated Services Network (NISN) support. Finally, technology refresh is helping to provide improved support at lower cost. Recently the NASA Automated Data Processing (ADP) Consolidation Center (NACC) upgraded its bipolar technology computer systems with Complementary Metal Oxide Semiconductor (CMOS) technology systems. This NACC upgrade substantially reduced the hardware maintenance and software licensing costs, significantly increased system speed and capacity, and reduced customer processing costs by 11 percent.
Tuttle, Mark S.
Non-health-care uses of information technology (IT) provide important lessons for health care informatics that are often overlooked because of the focus on the ways in which health care is different from other domains. Eight examples of IT use outside health care provide a context in which to examine the content and potential relevance of these lessons. Drawn from personal experience, five books, and two interviews, the examples deal with the role of leadership, academia, the private sector, the government, and individuals working in large organizations. The interviews focus on the need to manage technologic change. The lessons shed light on how to manage complexity, create and deploy standards, empower individuals, and overcome the occasional “wrongness” of conventional wisdom. One conclusion is that any health care informatics self-examination should be outward-looking and focus on the role of health care IT in the larger context of the evolving uses of IT in all domains. PMID:10495095
Ruppert, N. A.; Hansen, R. A.
The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, with the largest event of M6.6 in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform allows an integrated system complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit particular needs of the AEIC. Real-time earthquake locations and magnitudes are determined within 2-5 minutes of the event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of the occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, the ANSS website via QDDS submissions, and through e-mail, cell phone and pager notifications, fax broadcasts and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0 and posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted at critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.
... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet....m. The primary purpose of this meeting is to receive information on NEHRP earthquake...
Flournoy, Don M.
Satellites will operate more like wide area broadband computer networks in the 21st Century. Space-based information and communication technologies will therefore be a lot more accessible and functional for the individual user. These developments are the result of earth-based telecommunication and computing innovations being extended to space. The author predicts that the broadband Internet will eventually be available on demand to users of terrestrial networks wherever they are. Earth and space communication assets will be managed as a single network. Space networks will assure that online access is ubiquitous. No matter whether users are located in cities or in remote locations, they will always be within reach of a node on the Internet. Even today, scalable bandwidth can be delivered to active users when moving around in vehicles on the ground, or aboard ships at sea or in the air. Discussion of the innovative technologies produced by NASA's Advanced Communications Technology Satellite (1993-2004) demonstrates future capabilities of satellites that make them uniquely suited to serve as nodes on the broadband Internet.
Gottlich, Gretchen; Meyer, John M.; Nelson, Michael L.; Bianco, David J.
NASA Langley Research Center's product is aerospace research information. To this end, Langley uses information technology tools in three distinct ways. First, information technology tools are used in the production of information via computation, analysis, data collection and reduction. Second, information technology tools assist in streamlining business processes, particularly those that are primarily communication based. By applying these information tools to administrative activities, Langley spends fewer resources on managing itself and can allocate more resources for research. Third, Langley uses information technology tools to disseminate its aerospace research information, resulting in faster turn around time from the laboratory to the end-customer.
Since the Wenchuan earthquake in 2008, dramatic progress on earthquake early warning (EEW) has been made by the Institute of Care-life (ICL) in China. ICL's research on EEW covers choosing appropriate sensors, methods of installing the sensors, automatic processing methods for the seismic waves used in EEW, and methods of delivering EEW warnings to the public, schools and life-line projects. ICL innovatively applies distributed computing and cloud computing technology. So far, ICL has deployed over 5,500 EEW sensors in China, five times the number of EEW sensors in Japan, covering more than 2.1 million square kilometers. Since June 2011, over 5,000 earthquakes, 28 of them destructive, have triggered the EEWS with no false alerts. The root mean square (RMS) error of the magnitude for the 28 destructive quakes is 0.32. In addition, innovative work has been done to suppress false and missed alarms, which pushes forward the application of EEW in China. The technology is now also being applied in Nepal.
Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.
The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
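The detection idea described above (the keyword tweet rate rising far above a background of less than one per hour) can be sketched as a simple sliding-window trigger. The background rate, window length, and trigger factor below are illustrative placeholders, not USGS-calibrated values:

```python
from collections import deque

def make_spike_detector(background_per_min=1.0 / 60.0, factor=10.0, window_s=60.0):
    """Return a callable that flags a burst when keyword-tweet arrivals in
    the trailing window exceed `factor` times the expected background count
    (floored at one tweet). All thresholds are illustrative, not USGS values."""
    times = deque()
    expected = background_per_min * (window_s / 60.0)
    threshold = factor * max(expected, 1.0)

    def observe(t_seconds):
        times.append(t_seconds)
        # Drop arrivals that have fallen out of the trailing window.
        while times and times[0] < t_seconds - window_s:
            times.popleft()
        return len(times) > threshold

    return observe

# Usage: a burst of 20 "earthquake" tweets within a minute trips the trigger.
detect = make_spike_detector()
burst = [detect(float(t)) for t in range(100, 120)]
```

A real system would additionally geolocate the triggering tweets to outline the felt area, as the abstract describes for the Morgan Hill event.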
Tomei, Lawrence, Ed.
"Integrating Information & Communications Technologies Into the Classroom" examines topics critical to business, computer science, and information technology education, such as: school improvement and reform, standards-based technology education programs, data-driven decision making, and strategic technology education planning. This book also…
Zama, Shinsaku; Endo, Makoto; Takanashi, Ken'ichi; Araiba, Kiminori; Sekizawa, Ai; Hosokawa, Masafumi; Jeong, Byeong-Pyo; Hisada, Yoshiaki; Murakami, Masahiro
situation of overall damage to the city and the necessity of evacuation with optimal timing and access. According to the evaluation by the city staff through the experiments, information technology can be used to rationally implement initial responses just after a large earthquake, although the systems used in the experiments need some improvement.
Malapile, Sandy; Keengwe, Jared
This article explores major issues related to Information Communication Technology (ICT) in education and technology planning. Using the diffusion of innovation theory, the authors examine technology planning opportunities and challenges in Developing countries (DCs), technology planning trends in schools, and existing technology planning models…
Askew, Scott; Bluethmann, William; Alder, Ken; Ambrose, Robert
Robonaut, NASA's humanoid robot, is designed to work as both an astronaut assistant and, in certain situations, an astronaut surrogate. This highly dexterous robot performs complex tasks under telepresence control that could previously only be carried out directly by humans. Currently with 47 degrees of freedom (DOF), Robonaut is a state-of-the-art human-size telemanipulator system. While many of Robonaut's embedded components have been custom designed to meet packaging or environmental requirements, the primary computing systems used in Robonaut are currently commercial-off-the-shelf (COTS) products that have some correlation to flight-qualified computer systems. This loose coupling of information technology (IT) resources allows Robonaut to exploit cost-effective solutions while floating the technology base to take advantage of the rapid pace of IT advances. These IT systems utilize a software development environment that is compatible both with COTS hardware and with flight-proven computing systems, preserving the majority of software development for a flight system. The ability to use highly integrated and flexible COTS software development tools improves productivity while minimizing redesign for a space flight system. Further, the flexibility of Robonaut's software and communication architecture has allowed it to become a widely used distributed development testbed for integrating new capabilities and furthering experimental research.
Warmkessel, Marjorie M.
The language of information technology is discussed, with a focus on accessibility in the information society. The metaphors of information technology as an "information superhighway" or "infobahn" are analyzed; limitations of the "road system" and developments of Internet systems are considered. The concept of connectivity of the rhizome in "A…
Townsend, P. D.
Progress in electronics and optics offers faster computers and rapid communication via the internet, matched by ever larger and evolving storage systems. Instinctively one assumes that this must be totally beneficial. However, advances in software and storage media frequently progress in ways incompatible with earlier systems, and economics and commercial pressures rarely guarantee backward compatibility; instead, the industries actively choose to force users to purchase new systems and software. Thus we are moving forward with new technological variants that may have access only to the most recent systems, and we will have lost the earlier alternatives. The reality is that increased processing speed and storage capacity are matched by an equally rapid decline in the access and survival lifetime of older information. This pattern is not limited to modern electronic systems but is evident throughout history, from writing on stone and clay tablets to papyrus and paper. It is equally evident in image systems, from painting, through film, to magnetic tapes and digital cameras. In sound recording we have variously progressed from wax discs to vinyl, magnetic tape and CD formats. In each case the need for better definition and greater capacity has forced the earlier systems into oblivion. Indeed, proposed interactive music systems could similarly relegate music CDs to specialist collections. The article tracks some of these examples and discusses the consequences, noting that this information loss is further compounded by developments in language and changes in the cultural views of different societies.
Cua, G. B.; Gasparini, P.; Giardini, D.; Zschau, J.; Filangieri, A. R.; Reakt Wp7 Team
The primary objective of European FP7 project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction) is to improve the efficiency of real-time earthquake risk mitigation methods and their capability of protecting structures, infrastructures, and populations. REAKT aims to address the issues of real-time earthquake hazard and response from end-to-end, with efforts directed along the full spectrum of methodology development in earthquake forecasting, earthquake early warning, and real-time vulnerability systems, through optimal decision-making, and engagement and cooperation of scientists and end users for the establishment of best practices for use of real-time information. Twelve strategic test cases/end users throughout Europe have been selected. This diverse group of applications/end users includes civil protection authorities, railway systems, hospitals, schools, industrial complexes, nuclear plants, lifeline systems, national seismic networks, and critical structures. The scale of target applications covers a wide range, from two school complexes in Naples, to individual critical structures, such as the Rion Antirion bridge in Patras, and the Fatih Sultan Mehmet bridge in Istanbul, to large complexes, such as the SINES industrial complex in Portugal and the Thessaloniki port area, to distributed lifeline and transportation networks and nuclear plants. Some end-users are interested in in-depth feasibility studies for use of real-time information and development of rapid response plans, while others intend to install real-time instrumentation and develop customized automated control systems. From the onset, REAKT scientists and end-users will work together on concept development and initial implementation efforts using the data products and decision-making methodologies developed with the goal of improving end-user risk mitigation. The aim of this scientific/end-user partnership is to ensure that scientific efforts are applicable to operational
Bignami, Christian; Stramondo, Salvatore; Pierdicca, Nazzareno
The APhoRISM (Advanced PRocedure for volcanIc and Seismic Monitoring) project is an FP7-funded project that aims at developing and testing two new methods to combine Earth Observation satellite data from different sensors with ground data for seismic and volcanic risk management. The objective is to demonstrate that these two types of data, appropriately managed and integrated, can provide new, improved products useful for seismic and volcanic crisis management. One of the two methods deals with earthquakes and concerns the generation of maps to address the detection and estimation of damage caused by an earthquake. The method is named APE: A Priori information for Earthquake damage mapping. The use of satellite data to investigate earthquake damage is not in itself an innovative issue; a wide literature and many projects have addressed it, but the proposed approaches are usually based only on change-detection techniques and/or classification algorithms. The novelty of APhoRISM-APE lies in the exploitation of a priori information derived from: - InSAR time series measuring surface movements - shakemaps obtained from seismological data - vulnerability information. This a priori information is then integrated with a change-detection map from Earth observation satellite sensors (either optical or Synthetic Aperture Radar) to improve accuracy and to limit false alarms.
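As a toy illustration of how a priori information can limit false alarms from change detection alone, one might fuse a change-detection score with a shaking-and-vulnerability prior in log-odds. The logistic form, the equal weights, and the MMI scaling below are assumptions made for this sketch, not the actual APhoRISM-APE algorithm:

```python
import math

def damage_posterior(change_score, shaking_mmi, vulnerability):
    """Toy log-odds fusion of a change-detection score (0-1) with a priori
    information: shakemap intensity (MMI) and building vulnerability (0-1).
    The weights and logistic form are illustrative assumptions only."""
    eps = 1e-6

    def logit(p):
        return math.log((p + eps) / (1.0 - p + eps))

    # Prior damage plausibility: grows with shaking above MMI 5, scaled by
    # vulnerability; a low prior suppresses change-detection false alarms.
    prior = vulnerability * min(max((shaking_mmi - 5.0) / 5.0, 0.0), 1.0)
    score = 0.5 * logit(change_score) + 0.5 * logit(prior)
    return 1.0 / (1.0 + math.exp(-score))

# Strong change signal over shaken, vulnerable buildings vs. unshaken ones.
likely = damage_posterior(0.9, 9.0, 0.9)
false_alarm = damage_posterior(0.9, 5.0, 0.1)
```

The design point is only that the same change-detection evidence yields a high posterior where the prior supports damage and a low one where it does not.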
Fischer, Kasper D.
Guy, Michelle R.; Patton, John M.; Fee, Jeremy; Hearne, Mike; Martinez, Eric; Ketchum, D.; Worden, Charles; Quitoriano, Vince; Hunter, Edward; Smoczyk, Gregory; Schwarz, Stan
It is important to note that this document provides a brief introduction to the work of dozens of software developers and IT specialists, spanning in many cases more than a decade. References to significant amounts of supporting documentation, code, and information are supplied within.
Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.
In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at the Fingerprints Youth Museum in Hemet. These projects, and involvement with the San Bernardino County Museum in Redlands beginning in 2007, led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, including its evolution into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities, from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at the Kidspace Museum in Pasadena and the San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and on safety and security. This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this
Washington State Dept. of Information Services, Olympia. Policy and Regulation Div.
The Strategic Information Technology Plan of Washington is introduced and explained. The plan is mandated by state law to create a new framework for communication and collaboration to bring together agency technology planning with the achievement of statewide information technology goals and strategies. It provides a point of reference for the…
Liu, Leping, Ed.; Johnson, D. LaMont, Ed.; Maddux, Cleborne D., Ed.; Henderson, Norma J., Ed.
This book contains the following articles on evaluating and assessing educational information technology: (1) "Assessing Learning in the New Age of Information Technology in Education" (Leping Liu, D. LaMont Johnson, Cleborne D. Maddux, and Norma J. Henderson); (2) "Instruments for Assessing the Impact of Technology in Education" (Rhonda…
Mackey, R.; Some, R.; Aljabri, A.
Presented in this paper is a modified interpretation of the traditional TRLs aimed solely at information technology. The intent of this new set of definitions is twofold: First, to enable a definitive measurement of progress among developing information technologies for spacecraft; and second, to clarify particular challenges and requirements that must be met as these technologies are validated in increasingly realistic environments.
U.S. Geological Survey; Spall, Henry, (Edited By); Schnabel, Diane C.
Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers. The Secretary of the Interior has determined that the publication of this periodical is necessary in the transaction of the public business required by law of this Department. Use of funds for printing this periodical has been approved by the Office of Management and Budget through June 30, 1989. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.
Kasahara, A.; Yagi, Y.
The rupture process of an earthquake derived from geophysical observations is important information for understanding the nature of the earthquake and assessing seismic hazard. Finite fault inversion is the method commonly applied to construct a seismic source model. In conventional inversion, the fault is approximated by a simple fault surface even though the rupture of a real earthquake may propagate along a non-planar, complex fault; the complex rupture kinematics is approximated by a limited set of model parameters that only represent slip on a simple fault surface. This oversimplification may produce a biased, and hence misleading, solution. The Mw 7.7 left-lateral strike-slip earthquake that occurred in southwestern Pakistan on 2013-09-24 is an exemplary event demonstrating this bias. For this earthquake, northeastward rupture propagation was suggested by a finite fault inversion of teleseismic body and long-period surface waves with a single planar fault (USGS). However, the surface displacement field measured from cross-correlation of optical satellite images, together with back-projection imaging, revealed that the rupture propagated unilaterally toward the southwest on a non-planar fault (Avouac et al., 2014). To mitigate the bias, a more flexible source parameterization should be employed. We extended the multi-time-window finite fault method to represent rupture kinematics on a complex fault. Each spatio-temporal knot has five degrees of freedom and is able to represent arbitrary strike, dip, rake, moment release rate, and CLVD component; detailed fault geometry for the source fault is not required in our method. The method incorporates a data covariance matrix that accounts for uncertainty in the Green's functions (Yagi and Fukahata, 2011) to obtain a stable solution. Preliminary results show southwestward rupture propagation and a change in focal mechanism that is consistent with the fault trace. The results suggest the usefulness of flexible source parameterization for the inversion of complex events.
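The core numerical step in a multi-time-window inversion is a regularized linear least-squares fit of observed waveforms to a superposition of Green's functions, one per spatio-temporal knot. The sketch below uses synthetic Green's functions and a single mechanism component; all sizes, the smoothing weight, and the noise level are illustrative stand-ins (a real inversion would carry five components per knot and a full data covariance matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
n_knots, n_data = 12, 60

# Hypothetical Green's functions: each column is the waveform produced by
# unit moment release at one spatio-temporal knot.
G = rng.normal(size=(n_data, n_knots))
m_true = np.zeros(n_knots)
m_true[3:7] = [1.0, 2.0, 1.5, 0.5]          # a compact rupture patch
d = G @ m_true + 0.05 * rng.normal(size=n_data)

# Regularized least squares: minimize ||G m - d||^2 + beta^2 ||L m||^2,
# with L a first-difference smoothing operator along the knots.
beta = 0.1
L = np.diff(np.eye(n_knots), axis=0)         # shape (n_knots-1, n_knots)
A = np.vstack([G, beta * L])
b = np.concatenate([d, np.zeros(n_knots - 1)])
m_est, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With enough data and light smoothing, `m_est` recovers the imposed moment-release pattern; increasing `beta` trades resolution for stability, which is the same trade-off the data-covariance formulation manages in a principled way.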
Describes policies formulated by the Chinese government that use information technology to facilitate the development of scientific and technical information activities. Highlights include online information retrieval; the construction of databases, including Chinese language, numeric, and Chinese trade and technology databases; the development of…
Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.
The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised its long-term evaluation of the forthcoming large earthquake along the Nankai Trough: the next earthquake is estimated at M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough, in terms of a probabilistic approach, on the basis of ERC (2013)'s report. The probabilistic tsunami hazard assessment that we applied proceeds as follows. (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSAs) given by ERC (2013); the characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving the nonlinear, finite-amplitude, long-wave equations, with advection and bottom-friction terms, by a finite-difference method; run-up computation on land is included. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T = 88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with mean T and aperiodicity alpha = 0.24. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSAs, following a probability re-distribution concept (ERC, 2014); each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in that subgroup. Note that this re-distribution of the probability is only tentative, because present seismology cannot yet provide knowledge deep enough to constrain it; an epistemic logic-tree approach may be required in the future. (5) We synthesize a number of tsunami hazard curves at every evaluation point on the coasts by integrating the information about the 30-year occurrence
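The renewal-process step can be sketched numerically: the BPT (Brownian passage time) distribution is the inverse Gaussian with mean recurrence interval T and aperiodicity alpha, and the 30-year conditional probability given elapsed time t_e since the last event is [F(t_e + 30) - F(t_e)] / [1 - F(t_e)]. The elapsed time used below is an illustrative assumption, not the ERC's actual computation.

```python
import math

def _phi(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mean, alpha):
    """CDF of the BPT (inverse Gaussian) distribution with the given
    mean recurrence interval and aperiodicity alpha."""
    if t <= 0:
        return 0.0
    lam = mean / (alpha * alpha)  # inverse-Gaussian shape parameter
    a = math.sqrt(lam / t)
    return (_phi(a * (t / mean - 1.0))
            + math.exp(2.0 * lam / mean) * _phi(-a * (t / mean + 1.0)))

def conditional_prob(elapsed, horizon, mean, alpha):
    """P(event in (elapsed, elapsed + horizon] | no event by `elapsed`)."""
    f_e = bpt_cdf(elapsed, mean, alpha)
    return (bpt_cdf(elapsed + horizon, mean, alpha) - f_e) / (1.0 - f_e)

# Illustrative: mean interval 88.2 yr, alpha 0.24, ~68 yr elapsed since 1946.
p30 = conditional_prob(68.0, 30.0, 88.2, 0.24)
```

With these illustrative inputs the conditional probability comes out in the 60-70% range, the same order as the ERC's published P30, though the agreement should not be over-read given the assumed elapsed time.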
Wolf, Milton, Ed.; And Others
Includes two articles that discuss science fiction and future possibilities in information technology: "'Jurassic Park' and Al Jolson: Thinking about the Information Revolution" (Connie Willis) and "The Good and the Bad: Outlines of Tomorrow" (David Brin). (LRW)
Waves of enthusiasm for technological innovations that promise to revitalize teaching and learning are at least a century old. Unfortunately, the record of accomplishment for the many varieties of hardware and software introduced into schools over the decades is remarkably thin. Today's promoters of technology in education tend to forget similar…
Although the health sciences will benefit from many of the advances in information technology that are applied to a wide variety of research areas, information technology is of particular importance to health care delivery. Development of computerized patient records will enhance the efficiency, effectiveness, and distribution of health care. As managed care programs develop, population-based information will be of increasing importance to health care providers and to the public health community. The capacity to transmit this information… 3 refs.
Goldstein, Charles M.
Provides basic information on videodisks and potential applications, including inexpensive online storage, random access graphics to complement online information systems, hybrid network architectures, office automation systems, and archival storage. (JN)
Bankes, Steve; And Others
Discusses the scope of the Information Revolution, considers initiatives for harnessing information technology, and proposes a research agenda. Seven appendices detail specific initiatives relating to a global communication network, a Council for North American Information, the news media, a pan-European security information agency, multinational…
Zoeller, G.; Holschneider, M.
In recent publications, it has been shown that earthquake catalogs are useful for estimating the maximum expected earthquake magnitude in a future time horizon Tf. However, earthquake catalogs alone do not allow one to estimate the maximum possible magnitude M (Tf = ∞) in a study area. Therefore, we focus on the question of which data might be helpful to constrain M. Assuming a doubly-truncated Gutenberg-Richter law and independent events, optimal estimates of M depend solely on the largest observed magnitude μ, regardless of all the other details in the catalog. For other models of the frequency-magnitude relation, this result holds approximately. We show that the maximum observed magnitude μT in a known time interval T in the past provides the most powerful information on M, in terms of the smallest confidence intervals. However, if high levels of confidence are required, the upper bound of the confidence interval may diverge. Geological or tectonic data, e.g. strain rates, might be helpful if μT is not available; but these quantities can only serve as proxies for μT and will always lead to a higher degree of uncertainty and, therefore, to larger confidence intervals for M.
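The dependence on the largest observed magnitude alone can be made concrete. Under a doubly-truncated Gutenberg-Richter law with n independent events above m0, P(max ≤ μ | M) = F_M(μ)^n; a (1 - alpha) upper confidence bound on M is the value at which this probability falls to alpha, and when F_M(μ)^n stays above alpha even as M → ∞, the bound diverges, exactly as the abstract notes. A minimal sketch (parameter values are illustrative, and this is not the authors' estimator):

```python
def gr_cdf(m, m0, m_max, b=1.0):
    """Doubly truncated Gutenberg-Richter CDF on [m0, m_max]."""
    num = 1.0 - 10.0 ** (-b * (m - m0))
    den = 1.0 - 10.0 ** (-b * (m_max - m0))
    return num / den

def upper_bound_mmax(mu, n, m0, alpha=0.05, b=1.0, m_cap=12.0):
    """(1 - alpha) upper confidence bound on the maximum possible magnitude M,
    given that mu is the largest of n observed magnitudes above m0.
    Returns None when the bound diverges (P(max <= mu | M)^n exceeds
    alpha for every finite M)."""
    # Limit of F_M(mu)^n as M -> infinity:
    if (1.0 - 10.0 ** (-b * (mu - m0))) ** n > alpha:
        return None  # no finite upper bound at this confidence level
    lo, hi = mu, m_cap          # F_M(mu)^n decreases as M grows
    for _ in range(100):        # bisection on M
        mid = 0.5 * (lo + hi)
        if gr_cdf(mu, m0, mid) ** n > alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, with m0 = 5, b = 1, and μ = 7.5, a catalog of 100 events yields no finite 95% bound, while 1000 events yields a bound near M ≈ 8.8, illustrating how the bound depends on μ and n but on nothing else in the catalog.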
Alemna, A. Anaba
Outlines current and past library and information education and training in West Africa, focusing on the impacts of technological advancement; suggests information technologies that training programs should be able to access; and discusses formulating new curricula, staffing requirements, and further implications for the countries in this region…
ABLEDATA, Silver Spring, MD.
This directory lists sources for the funding of assistive technology for people with disabilities. Introductory information urges determination of what assistive technology is needed and the gathering of all necessary information (such as primary and secondary disabilities, employment history, income and expenses, and health insurance) prior to…
Herling, Thomas J.; Merskin, Debra
Since little empirical research has been conducted on adoption of currently available information technology by the advertising industry, a study explored the extent of advertising agencies' adoption of selected information technologies such as online database services and electronic mail. The study discussed data from earlier studies and analyzed…
de Stricker, Ulla
Presents observations about developments in information technology that will influence the information industry and libraries of the future. Discusses search engine capabilities; push technology; electronic commerce; WebTV; and optical discs with links to Web sites. Ten figures provide illustrations and charts. (AEF)
Describes several types of user devices (computers, laptops, personal digital assistants, telephones), explaining that they serve as a translator between technology's internal representation of information and what can be perceived, processed, and used by humans. Also addresses the use of information technology devices by people with disabilities…
Wang, Lin; Rau, Pei-Luen Patrick; Salvendy, Gavriel
This study investigated variables contributing to older adults' information technology acceptance through a survey, which was used to find factors explaining and predicting older adults' information technology acceptance behaviors. Four factors, including needs satisfaction, perceived usability, support availability, and public acceptance, were…
In recent years, students, workers, and jobseekers have received mixed signals about the job market for information technology. Periods of strong job growth have been punctuated by brief periods of employment declines. Optimism about information technology (commonly referred to as IT) as a career field has been tempered by concerns about job…
Information technology is driving business and industry into the future. This is the essence of reengineering, process innovation, downsizing, etc. Non-profits, schools, libraries, etc. need to follow or risk inefficiency. However, to get their fair share of information technology, they need help with funding. PMID:10187237
Hardman, John; Tu, Eugene (Technical Monitor)
The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology-focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).
Hayes, G.P.; Earle, P.S.; Benz, H.M.; Wald, D.J.; Briggs, R.W.
For the first time in a formal journal publication, this article presents a timeline of NEIC response to a major global earthquake. We outline the key observations of the earthquake made by the NEIC and its partner agencies, discuss how these analyses evolved, and outline when and how this information was released to the public and to other internal and external parties. Our goal in the presentation of this material is to provide a detailed explanation of the issues faced in the response to a rare, giant earthquake. We envisage that the timeline format of this presentation can highlight technical and procedural successes and shortcomings, which may in turn help prompt research by our academic partners and further improvements to our future response efforts. We have shown how NEIC response efforts have significantly improved over the past six years since the great 2004 Sumatra-Andaman earthquake. We are optimistic that the research spawned from this disaster, and the unparalleled dense and diverse data sets that have been recorded, can lead to similar, and necessary, improvements in the future.
Discusses a unit on nuclear technology which is taught in a physics class. Explains the unit design, implementation process, demonstrations used, and topics of discussion that include light and optics, naturally and artificially produced sources of radioactivity, nuclear equations, isotopes and half-lives, and power-generating nuclear reactors.…
Altan, O.; Toz, G.; Kulur, S.; Seker, D.; Volz, S.; Fritsch, D.; Sester, M.
After a catastrophe like an earthquake, one of the most important problems is providing shelter and housing for the homeless. To this end, it is necessary to decide whether a building is still habitable, or whether it has to be renovated or even torn down. A prerequisite for such decisions is detailed knowledge about the status of the building. Earlier earthquakes revealed problems in the processes of documenting and analysing building damage, as these demanded much effort in terms of time and manpower. The main difficulties appeared to stem from the analogue damage assessments, which created a great variety of unstructured information that had to be aligned to allow further analysis. Apart from that, documentation of damage effects was not detailed and could only be carried out at the disaster site. The aim of this study is to make an improvement, using a combination of Geographic Information Systems (GIS) as a management and data analysis tool and photogrammetry as a documentation method. Photogrammetric data acquisition is achieved using a CCD camera and the digital photogrammetric software package PICTRAN by Technet. The information system part is the GIS package ArcView by ESRI. The combination of rapid data acquisition and GIS offers a quick assessment of the situation and the possibility of objective and holistic analysis. This is the prerequisite for quickly initiating appropriate measures to help people.
Denning, Peter J.
It is easy to extrapolate current trends to see where technologies relating to information systems in astrophysics and other disciplines will be by the end of the decade. These technologies include miniaturization, multiprocessing, software technology, networking, databases, graphics, pattern computation, and interdisciplinary studies. It is less easy to see what limits our current paradigms place on our thinking about technologies that will allow us to understand the laws governing very large systems about which we have large data sets. Three limiting paradigms are as follows: saving all the bits collected by instruments or generated by supercomputers; obtaining technology for information compression, storage, and retrieval off the shelf; and the linear model of innovation. We must extend these paradigms to meet our goals for information technology at the end of the decade.
This slide presentation reviews how information technology supports the Human Research Facility (HRF), and specifically the uses the contractor has for the information. It includes information about the contractor, the HRF, some of the experiments performed using the HRF on board the Shuttle, and overviews of the data architecture and of the software, both commercial and specially developed for the specific experiments.
Norris, Donald M.
Partnerships are examined using information gathered in interviews with over 20 chief information officers or chief academic computing officers at universities considered leaders in information technology. Issues addressed include why partnerships are formed; four types; how they are formed; the importance of vendor contract; and impact.…
Leung, Shirley; Bisom, Diane
The objectives of this survey were: to gather information on the development of institutional information technology policies and guidelines for responsible computing and use of electronic information; to identify the scope of such policies and guidelines; and to determine the role of the library in the development and/or use of the policies and…
Purpose--The purpose of this paper is to study the provisions of information technology (IT) for the development of academic resources and to examine the effect of IT in academic institutions for sharing information. Design/methodology/approach--The paper examines the role of IT in sharing information in academic institutions and explores the IT…
Danovitch, Judith H.; Alzahabi, Reem
Although children are often exposed to technological devices early in life, little is known about how they evaluate these novel sources of information. In two experiments, children aged 3, 4, and 5 years old ("n" = 92) were presented with accurate and inaccurate computer informants, and they subsequently relied on information provided by…
Wolff, Jennifer L; Darer, Jonathan D; Larsen, Kevin L
Health information technology has been embraced as a strategy to facilitate patients' access to their health information and engagement in care. However, not all patients are able to access, or are capable of using, a computer or mobile device. Although family caregivers assist individuals with some of the most challenging and costly health needs, their role in health information technology is largely undefined and poorly understood. This perspective discusses challenges and opportunities of engaging family caregivers through the use of consumer-oriented health information technology. We compile existing evidence to make the case that involving family caregivers in health information technology as desired by patients is technically feasible and consistent with the principles of patient-centered and family-centered care. We discuss how more explicit and purposeful engagement of family caregivers in health information technology could advance clinical quality and patient safety by increasing the transparency, accuracy, and comprehensiveness of patient health information across settings of care. Finally, we describe how clarifying and executing patients' desires to involve family members or friends through health information technology would provide family caregivers greater legitimacy, convenience, and timeliness in health system interactions, and facilitate stronger partnerships between patients, family caregivers, and health care professionals. PMID:26311198
Kay, Peg, Ed.; Powell, Patricia, Ed.
Developed by the Institute for Computer Sciences and Technology and the Defense Intelligence Agency with input from other federal agencies, this detailed document contains the 1983 technical forecast for the information processing industry through 1997. Part I forecasts the underlying technologies of hardware and software, discusses changes in the…
Bajcsy, Ruzena, Dr.
The Information Age is transforming our economy and our lives. In its pathbreaking 1999 report to President Clinton, the Presidential Information Technology Advisory Committee (PITAC) outlined the ten crucial ways that new technologies are transforming society in the U.S. It is clear that the Federal government will need to provide the critical R&D investments that will help retain and bolster the U.S. technological lead in the 21st century. These investments will also support efforts to make new technologies and their benefits available to all U.S. citizens.
VanDalsem, William R.
The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.
Bossu, R.; Etivant, C.; Roussel, F.; Mazet-Roux, G.; Steed, R.
Smartphone applications have swiftly become one of the most popular tools for rapid delivery of earthquake information to the public. Wherever users are, they can be automatically informed when an earthquake has struck simply by setting a magnitude threshold and an area of interest: no need to browse the internet, the information reaches them automatically and instantaneously. One question remains: are the provided earthquake notifications always relevant for the public? A while after damaging earthquakes, many eyewitnesses uninstall the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of notifications relate to unfelt earthquakes, only increasing anxiety among the population with each new update. Felt and damaging earthquakes are the ones of societal importance, even when of small magnitude. The LastQuake app and Twitter feed (QuakeBot) focus on these earthquakes that matter to the public by collating different information threads covering tsunamigenic, damaging, and felt earthquakes. Non-seismic detections and macroseismic questionnaires collected online are combined to identify felt earthquakes regardless of their magnitude. The non-seismic detections include Twitter earthquake detections, developed by the USGS, in which the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the internet to investigate the cause of the shaking they have just felt. We will present the identification process for felt earthquakes, the smartphone application, and the 27 automatically generated tweets, and how, by providing better public services, we collect more data from citizens.
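The flashsourcing idea, detecting a felt earthquake from a sudden surge of visits to the earthquake-information website, can be sketched as a simple rate-spike detector. The window lengths, spike factor, and minimum hit count below are illustrative assumptions, not the EMSC's actual parameters.

```python
from collections import deque

class SurgeDetector:
    """Flags a surge when the hit rate in a short window exceeds `factor`
    times the longer-term baseline rate (illustrative sketch only)."""

    def __init__(self, short_s=60, long_s=3600, factor=5.0, min_hits=20):
        self.short_s, self.long_s = short_s, long_s
        self.factor, self.min_hits = factor, min_hits
        self.hits = deque()  # hit timestamps in seconds, oldest first

    def add_hit(self, t):
        self.hits.append(t)
        # Drop hits that have fallen out of the long (baseline) window
        while self.hits and self.hits[0] < t - self.long_s:
            self.hits.popleft()

    def surging(self, now):
        recent = sum(1 for h in self.hits if h >= now - self.short_s)
        if recent < self.min_hits:
            return False  # too few hits to call a surge
        baseline = len(self.hits) / self.long_s  # hits/s over long window
        return recent / self.short_s > self.factor * baseline
```

Feeding steady background traffic keeps the detector quiet; a burst of hits concentrated in the short window, as when eyewitnesses converge on the site after shaking, trips it.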
Hughes, M.J.; Lau, P.K.S.
The Information Management Technology Architecture (TA) is being driven by the business objectives of reducing costs and improving effectiveness. The strategy is to reduce the cost of computing through standardization. The Lockheed Martin Idaho Technologies Company (LMITCO) TA is a set of standards and products for use at the Idaho National Engineering Laboratory (INEL). The TA will provide direction for information management resource acquisitions, development of information systems, formulation of plans, and resolution of issues involving LMITCO computing resources. Exceptions to the preferred products may be granted by the Information Management Executive Council (IMEC). Certain implementation and deployment strategies are inherent in the design and structure of the LMITCO TA. These include: migration from centralized toward distributed computing; deployment of the networks, servers, and other information technology infrastructure components necessary for a more integrated information technology support environment; increased emphasis on standards to make it easier to link systems and to share information; and improved use of the company's investment in desktop computing resources. The intent is for the LMITCO TA to be a living document, constantly reviewed to take advantage of industry directions to reduce costs while balancing technological diversity with business flexibility.
Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)
The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are to: provide seamless access to NASA resources, including ground-, air-, and space-based distributed information technology resources, so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design the next-generation space vehicles; provide high-rate data delivery from these assets directly to users for missions; develop goal-oriented, human-centered systems; and research, develop, and evaluate revolutionary technology.
Obara, Kazushige; Kato, Aitaro
Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.
Wigand, R. T.
The work environment surrounding integrated office systems is reviewed. The known effects of automated office technologies are synthesized, and their known impact on work efficiency is reviewed. These effects are explored with regard to their impact on networks, work flow and processes, as well as organizational structure and power. Particular emphasis is given to structural changes due to the introduction of newer information technologies in organizations. The new information technologies have restructured the average organization's middle ranks and, as a consequence, those ranks have shrunk drastically. Organizational pyramids have flattened, with fewer levels, since executives have realized that they can obtain the needed information directly and more quickly via the new technologies and do not have to rely on middle-level managers. Power shifts typically accompany the introduction of these technologies, resulting in the generation of a new form of organizational power.
Women working in higher education information technology (IT) organizations and those seeking leadership positions in these organizations face a double challenge in overcoming the traditionally male-dominated environments of higher education and IT. Three women higher education chief information officers (CIOs) provided their perspectives,…
... From the Federal Register Online via the Government Publishing Office ] DEPARTMENT OF HEALTH AND HUMAN SERVICES Health Resources and Services Administration Health Information Technology Implementation AGENCY: Health Resources and Services Administration (HRSA), Department of Health and Human Services...
... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF HEALTH AND HUMAN SERVICES Health Resources and Services Administration Health Information Technology Implementation AGENCY: Health Resources and Services Administration, HHS. ACTION: Notice of Noncompetitive...
Dong, Laigen; Shan, Jie; Ye, Yuanxin
It is important to grasp damage information in stricken areas after an earthquake in order to perform rescue and recovery activities quickly. Recent research into remote sensing techniques has shown significant ability to generate quality damage information. Methods based only on post-earthquake data are especially widely researched because pre-earthquake reference data do not exist for many cities of the world. This paper addresses a method for detecting damaged buildings using only post-event satellite imagery, so that scientists and researchers can take advantage of the ability of helicopters and airplanes to fly over the damage quickly. Statistics of line segments extracted from post-event satellite imagery, such as mean length (ML) and weighted tilt angle standard deviation (WTASD), are used to discriminate damaged from undamaged buildings.
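The two statistics named above can be computed directly from the extracted line segments. The sketch below assumes segments given as endpoint pairs, tilt measured from horizontal, and length-proportional weights; that weighting is a plausible reading of "weighted", not necessarily the paper's exact definition.

```python
import math

def segment_stats(segments):
    """Mean length (ML) and length-weighted tilt-angle standard deviation
    (WTASD) for line segments given as ((x1, y1), (x2, y2)) pairs."""
    lengths, angles = [], []
    for (x1, y1), (x2, y2) in segments:
        dx, dy = x2 - x1, y2 - y1
        lengths.append(math.hypot(dx, dy))
        # Tilt angle in [0, 180) degrees, insensitive to segment direction
        angles.append(math.degrees(math.atan2(dy, dx)) % 180.0)
    total = sum(lengths)
    ml = total / len(lengths)
    mean_a = sum(w * a for w, a in zip(lengths, angles)) / total
    var = sum(w * (a - mean_a) ** 2 for w, a in zip(lengths, angles)) / total
    return ml, math.sqrt(var)
```

Intact building edges tend to produce long segments with a few dominant orientations (low WTASD), while rubble produces short segments at scattered angles, which is the intuition behind using these two statistics as damage discriminants.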
Recent technological progress in the generation, manipulation and detection of individual single photons has opened a new scientific field of photonic quantum information. This progress includes the realization of single photon switches, photonic quantum circuits with specific functions, and the application of novel photonic states to novel optical metrology beyond the limits of standard optics. In this review article, the recent developments and current status of photonic quantum information technology are overviewed based on the author's past and recent works. PMID:26755398
A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.
During the year since the founding of Squibb College in 1988, the Instructional Technology department has been charged with planning and implementing information technology systems for both office productivity and training. Decisions made, obstacles encountered, and progress achieved during that year are discussed, and the impact of the first…
Henry, Nicholas L.
Discusses the dilemmas created by photocopying, computing, and other neopublishing technologies in light of existing copyright laws; analyzes the costs and benefits of computer-based information storage and retrieval systems and photocopying technologies; and suggests guidelines for developing future public policy. (JR)
Discusses the so-called third industrial revolution, or the information revolution. Topics addressed include the progression of the revolution in the U.S. economy, in Europe, and in Third World countries; the empowering technologies, including digital switches, optical fiber, semiconductors, CD-ROM, networks, and combining technologies; and future…
Huff, L. A.; Moreland, J.; Allison, R.; Elia, J.; Jerdee, B.
The objective of this study is to develop approaches for improved Line Replaceable Unit (LRU) internal communications, utilizing state-of-the-art techniques and technology, in order to reduce the growing number of interconnects within LRUs. Worst-case LRU data transfer requirements were established by analyzing the internal signal routing, data rates, and duty cycles of the F-16 Fire Control Computer (FCC) and the Programmable Signal Processor (PSP). It was determined that 25 Mword/second is adequate for card-to-backplane (serial) transfers. Candidate designs for meeting these requirements were developed and then subjected to an extensive trade-off analysis. This analysis ultimately yielded the selection of Switched Network Electro-Optical (serial) and Electro-Optical Air-Gap (parallel) as the preferred approaches. The interface pin count per module of the recommended designs has been reduced to approximately 40. This is substantially lower than the average of 250 connections per module in most conventional approaches and fulfills the primary objective of this program. Further, the zero-insertion-force air-gap interfaces directly support modular architectures and enhance the prospects of making two-level maintenance concepts a practical reality.
Furlong, K. P.; Benz, H.; Hayes, G. P.; Villasenor, A.
Although most would agree that the occurrence of natural disaster events such as earthquakes, volcanic eruptions, and floods can provide effective learning opportunities for natural hazards-based courses, implementing compelling materials into the large-enrollment classroom environment can be difficult. These natural hazard events derive much of their learning potential from their real-time nature, and in the modern 24/7 news-cycle where all but the most devastating events are quickly out of the public eye, the shelf life for an event is quite limited. To maximize the learning potential of these events requires that both authoritative information be available and course materials be generated as the event unfolds. Although many events such as hurricanes, flooding, and volcanic eruptions provide some precursory warnings, and thus one can prepare background materials to place the main event into context, earthquakes present a particularly confounding situation of providing no warning, but where context is critical to student learning. Attempting to implement real-time materials into large enrollment classes faces the additional hindrance of limited internet access (for students) in most lecture classrooms. In Earth 101 Natural Disasters: Hollywood vs Reality, taught as a large enrollment (150+ students) general education course at Penn State, we are collaborating with the USGS’s National Earthquake Information Center (NEIC) to develop efficient means to incorporate their real-time products into learning activities in the lecture hall environment. Over time (and numerous events) we have developed a template for presenting USGS-produced real-time information in lecture mode. The event-specific materials can be quickly incorporated and updated, along with key contextual materials, to provide students with up-to-the-minute current information. In addition, we have also developed in-class activities, such as student determination of population exposure to severe ground
Gray, Martha M.
In 1959 the Director of the National Bureau of Standards declared “The emergence of science and technology as the paramount concern of the Nation in the 20th century … demanded the highest order of measurement competence, in order to provide the standards and measurement techniques on which maintenance of scientific progress depended.” Since 1959, information technology has emerged as having a global impact on all facets of industry. However, the “standards and measurement techniques” needed to maintain the scientific progress of information technology into the next century may not be in place. This paper discusses the current state of software metrics.
Clark, Thomas A.; Lipa, Brian E. G.; Macera, Anthony R.; Staskevich, Gennady R.
The Joint Battlespace Infosphere (JBI) Information Management (IM) services provide information exchange and persistence capabilities that support tailored, dynamic, and timely access to required information, enabling near real-time planning, control, and execution for DoD decision making. JBI IM services will be built on a substrate of network centric core enterprise services and, when transitioned, will establish an interoperable information space that aggregates, integrates, fuses, and intelligently disseminates relevant information to support effective warfighter business processes. This virtual information space provides individual users with information tailored to their specific functional responsibilities and provides a highly tailored repository of, or access to, information that is designed to support a specific Community of Interest (COI), geographic area, or mission. Critical to effective operation of JBI IM services is the implementation of repositories, where data, represented as information, is persisted for quick and easy retrieval. This paper will address information representation, persistence, and retrieval using existing database technologies to manage structured data in Extensible Markup Language (XML) format as well as unstructured data in an IM services-oriented environment. Three basic categories of database technologies will be compared and contrasted: Relational, XML-Enabled, and Native XML. These technologies have diverse properties such as maturity, performance, query language specifications, indexing, and retrieval methods. We will describe our application of these evolving technologies within the context of a JBI Reference Implementation (RI) by providing some hopefully insightful anecdotes and lessons learned along the way. This paper will also outline future directions, promising technologies and emerging COTS products that can offer more powerful information management representations, better persistence mechanisms and
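The "XML-Enabled" middle ground the abstract names can be illustrated with a minimal sketch. This is not the JBI implementation; the message format and element names (`coi`, `priority`, `body`) are invented for illustration. A hybrid design shreds a few indexable fields into relational columns while keeping the full document for XML-style navigation:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical message; element names are illustrative only.
DOC = "<msg><coi>ISR</coi><priority>2</priority><body>sensor track</body></msg>"

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages ("
    " id INTEGER PRIMARY KEY,"
    " coi TEXT,"          # shredded field: enables fast relational queries
    " priority INTEGER,"  # shredded field: indexable, typed
    " xml TEXT)"          # full document retained for round-trip fidelity
)

def store(doc: str) -> None:
    """Shred a few indexable fields out of the XML; keep the rest opaque."""
    root = ET.fromstring(doc)
    conn.execute(
        "INSERT INTO messages (coi, priority, xml) VALUES (?, ?, ?)",
        (root.findtext("coi"), int(root.findtext("priority")), doc),
    )

store(DOC)

# Relational-style retrieval on the indexed columns...
row = conn.execute("SELECT xml FROM messages WHERE coi = ?", ("ISR",)).fetchone()
# ...then XML-style navigation on the stored document.
body = ET.fromstring(row[0]).findtext("body")
print(body)  # sensor track
```

A pure relational design would shred every element into columns, while a native XML store would index and query the documents directly (e.g., via XPath/XQuery) without the shredding step; the trade-offs are the maturity, performance, and query-language differences the abstract lists.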
Gore, Albert, Jr.
This statement by Senator Albert Gore, Jr., on introduction of the Information Infrastructure and Technology Act of 1992 highlights examples of applications of high-performance computing, the components of the Information Infrastructure Development Program (i.e., education, libraries, manufacturing, and health care), and participating agencies. A…
Allen, Kenneth C.; And Others
Communication and information technologies can reduce the barriers of distance and space that disadvantage rural areas. This report defines a set of distinct voice, computer, and video telecommunication services; describes several rural information applications that make use of these services; and surveys various wireline and wireless systems and…
Discussion of the use of technology in Australian schools to improve information access and educational outcomes focuses on the impact on school library resource centers. Topics discussed include online information services; electronic bulletin boards; electronic mail via microcomputers; and optical storage media, including CD-ROM, hypermedia, and…
While the dominance of Information Technology Outsourcing (ITO) as a sourcing strategy would seem to indicate successful and well-informed practice, frequent examples of unraveled engagements highlight the associated risks. Successful instances of outsourcing suggest that governance mechanisms effectively manage the related risks. This…
Nursing homes are considered lagging behind in adopting health information technology (HIT). Many studies have highlighted the use of HIT as a means of improving health care quality. However, these studies overwhelmingly do not provide empirical information proving that HIT can actually achieve these improvements. The main research goal of this…
Gilbert, Steven W.
Discusses issues affecting the fields of information technology, intellectual property, and education. Four main needs are addressed: (1) new economic mechanisms beyond copyright and patent; (2) new codes of ethics for education; (3) effective representation for creator/producers and users of information; and (4) a forum for the voice of…
Elam, Jimmy H.
To enhance the information technology literacy of optometry students, the Southern College of Optometry (Tennessee) developed an academic assignment, the Electronic Media Paper, in which second-year students must search two different electronic media for information. Results suggest Internet use for searching may be a useful tool for specific…
Like it or not, an institution's IT infrastructure is a matter with which institutional strategic planners must concern themselves. Information systems represent a significant investment, they perform mission-critical functions, and the appropriate use of information and learning technologies can have a critical part to play in delivering against…
Truckee Meadows Community Coll., Sparks, NV.
This document represents a major component of Nevada's Truckee Meadows Community College (TMCC) strategic planning activities and elaborates on the technology functions found in the college strategic plan. Information resources at TMCC are grouped into five areas: (1) administrative computing, the area of information processing that supports the…
This dissertation aims to investigate how advanced information technologies cope with the various demands of disaster response. It consists of three essays on the exploration of micro-blogging and FOSS environments. The first essay looks at the usage of micro-blogging in the aftermath of the massive 2008 China earthquake and explores the…
Rico, H.; Hauksson, E.; Thomas, E.; Friberg, P.; Given, D.
The California Integrated Seismic Network (CISN) Display is part of a Web-enabled earthquake notification system alerting users in near real-time of seismicity, and also valuable geophysical information following a large earthquake. It will replace the Caltech/USGS Broadcast of Earthquakes (CUBE) and Rapid Earthquake Data Integration (REDI) Display as the principal means of delivering graphical earthquake information to users at emergency operations centers, and other organizations. Features distinguishing the CISN Display from other GUI tools are a stateful client/server relationship, a scalable message format supporting automated hyperlink creation, and a configurable platform-independent client with a GIS mapping tool; supporting the decision-making activities of critical users. The CISN Display is the front-end of a client/server architecture known as the QuakeWatch system. It is comprised of the CISN Display (and other potential clients), message queues, server, server "feeder" modules, and messaging middleware, schema and generators. It is written in Java, making it platform-independent, and offering the latest in Internet technologies. QuakeWatch's object-oriented design allows components to be easily upgraded through a well-defined set of application programming interfaces (APIs). Central to the CISN Display's role as a gateway to other earthquake products is its comprehensive XML-schema. The message model starts with the CUBE message format, but extends it by provisioning additional attributes for currently available products, and those yet to be considered. The supporting metadata in the XML-message provides the data necessary for the client to create a hyperlink and associate it with a unique event ID. Earthquake products deliverable to the CISN Display are ShakeMap, Ground Displacement, Focal Mechanisms, Rapid Notifications, OES Reports, and Earthquake Commentaries. Leveraging the power of the XML-format, the CISN Display provides prompt access to
Rico, H.; Hauksson, E.; Thomas, E.; Friberg, P.; Frechette, K.; Given, D.
can give Emergency Response managers the information needed to allocate limited personnel and resources after a major event. The shaking intensity shape files may be downloaded out-of-band to the client computer, and with the GIS mapping tool, users can plot organizational assets on the CISN Display map and analyze their inventory against potentially damaged areas. Lastly, in support of a robust design is a well-established and reliable set of communication protocols. To achieve a stateful server connection and messaging via a signaling channel, the application uses the Common Object Request Broker Architecture (CORBA). The client responds to keep-alive signals from the server, and alerts users of changes in the connection status. This full-featured messaging service will allow the system to trigger a reconnect strategy whenever the client detects a loss of connectivity. This sets the CISN Display apart from its predecessors, which do not provide a failover mechanism or a state of connection. Thus, by building on past programming successes and advances in proven Internet technologies, the CISN Display will augment the emergency responder's ability to make informed decisions following a potentially damaging earthquake.
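The keep-alive and reconnect behavior described above can be sketched independently of CORBA. This is a minimal illustration, not the CISN Display implementation; the 30-second timeout, class name, and method names are all assumptions:

```python
import time

KEEPALIVE_TIMEOUT = 30.0  # seconds; an assumed value, not from the CISN design

class ConnectionMonitor:
    """Track server keep-alive signals and flag when a reconnect is needed."""

    def __init__(self, timeout: float = KEEPALIVE_TIMEOUT, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock
        self.last_seen = clock()
        self.connected = True

    def on_keepalive(self) -> None:
        # Server signaled over the signaling channel; refresh the liveness clock.
        self.last_seen = self.clock()
        self.connected = True

    def check(self) -> bool:
        """Return True if still connected; on timeout, mark disconnected so the
        client can alert the user and start its reconnect strategy."""
        if self.clock() - self.last_seen > self.timeout:
            self.connected = False
        return self.connected

# Simulated clock so the sketch is deterministic.
t = [0.0]
mon = ConnectionMonitor(timeout=30.0, clock=lambda: t[0])
mon.on_keepalive()
t[0] = 10.0
print(mon.check())  # True: keep-alive seen recently
t[0] = 45.0
print(mon.check())  # False: timeout elapsed, trigger reconnect
```

The design choice mirrored here is the one the abstract emphasizes: because the client tracks connection state explicitly, a lost connection is a detectable event rather than silent staleness, which is what the CUBE/REDI predecessors lacked.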
Shabalova, I P; Dzhangirova, T V; Kasoian, K T
The lecture is devoted to the urgent problem of improving the quality of cytological diagnosis by reducing the factor of subjectivism through the introduction of up-to-date computer information technologies into the cytologist's practice. Its main lines, from the standardization of cytological specimen preparation to the registration of a cytologist's opinion and the assessment of specialists' work quality at laboratories that successfully use the capacities of current information systems, are described. Information technology capabilities to improve the interpretation of the cellular composition of cytological specimens are detailed. PMID:20799410
... ADMINISTRATION Information Collection; Implementation of Information Technology Security Provision AGENCY... new information collection requirement regarding Implementation of Information Technology Security... forms of information technology. DATES: Submit comments on or before February 13, 2012....
National Science Foundation. Washington, DC. Div. of Information Science and Technology.
This volume contains the reports of three working groups which were convened separately over a 3-year period at the request of the Advisory Committee for the Division of Information Science and Technology of the National Science Foundation to obtain the opinion of experts concerning research opportunities and trends in information science and…
Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter
In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.
Procuniar, Molly; Murphy, Sue
Although medical technology is making great strides in improved diagnosis and treatment, the technologies used to document, communicate, and manage those activities are limiting its progress by converting clinicians into computer operators. In an environment of nurse and doctor shortages, reducing their efficiency is counterproductive. Technology in healthcare that does not serve patients by improving cost, quality, or care delivery is technology that serves no purpose. Requiring clinicians to chart away from the bedside using technologies that do not feel intuitive, such as keyboarding and mouse use, reduces efficiency of workflow, impedes direct care, and increases the cost of training. Intuitive forms of technology such as surface technology, voice-activated charting, or digital pens, if embraced, could cause significant changes in healthcare workflows. Clinicians could be more focused on direct care and less utilized in clerical activity. The time it takes to access information could be decreased exponentially--and the opportunities to interact with that information would present a nearly endless horizon. This impact would be especially crucial in high acuity areas and emergency patient care situations. In short, technology should embrace familiar, natural movements and develop intuitive interfaces to improve effectiveness in the healthcare market of the future. PMID:18999045
... OFFICE Health Information Technology Policy Committee Appointment AGENCY: Government Accountability... Act of 2009 (ARRA) established the Health Information Technology Policy Committee to make recommendations on the implementation of a nationwide health information technology infrastructure to the...
... HUMAN SERVICES Health Resources and Services Administration Rural Health Information Technology Network... award under the Rural Health Information Technology Network Development Grant (RHITND) to Grace... relinquishing its fiduciary responsibilities for the Rural Health Information Technology Network...
Mahoney, Mary Ellen
No one would deny the need to transform health care. Information technology is capable of transforming health care organizations and delivering measurable value. However, these organizations will have to deploy effective, proactive strategies for managing information and adapting to the opportunities the technology offers. If, for example, an organization wants to become paperless, its information strategy must include appropriate tools to store and access unstructured data components of the medical record as well as structured data. An Electronic Document Management System (EDMS) is a critical element of this strategy. Also, a plan for managing change must be developed to mitigate technology risks. This can be realized through the development of a clear vision of the future and strong leadership, among other key items. PMID:12402636
Walter, John T.
Management's dilemma, when allocating financial resources towards the improvement of technological readiness and IT flexibility within their organizations, is to control financial risk and maximize IT effectiveness. Technological readiness is people's propensity to embrace and use technology. Its drivers are optimism, innovativeness, discomfort,…
Peterson, John (Editor)
A team was formed to assess NASA Office of Space Science (OSS) information technology research and development activities. These activities were reviewed for their relevance to OSS missions, for their potential for using products better supplied by industry or other government agencies, and for recommending an information technology (IT) infusion strategy for appropriate products for OSS missions. Assessment scope and methodology are presented. IT needs and interests for future OSS missions and current NASA IT research and development (R&D) are discussed. Non-NASA participants provide overviews of some of their IT R&D programs. Implementation and infusion issues and the findings and recommendations of the assessment team are presented.
Recent technological progress in the generation, manipulation and detection of individual single photons has opened a new scientific field of photonic quantum information. This progress includes the realization of single photon switches, photonic quantum circuits with specific functions, and the application of novel photonic states to novel optical metrology beyond the limits of standard optics. In this review article, the recent developments and current status of photonic quantum information technology are overviewed based on the author’s past and recent works. PMID:26755398
The goal of this paper is to identify and briefly describe major existing and near term information technologies that could have a positive impact on the topics being discussed at this conference by helping to manage the data of global change science and helping global change scientists conduct their research. Desktop computer systems have changed dramatically during the past seven years. Faster data processing can be expected in the future through full development of traditional serial computer architectures. Some other proven information technologies may be currently underutilized by global change scientists. Relational database management systems and good organization of data through the use of thoughtful database design would enable the scientific community to better share and maintain quality research data. Custodians of the data should use rigorous data administration to ensure integrity and long term value of the data resource. Still other emerging information technologies that involve the use of artificial intelligence, parallel computer architectures, and new sensors for data collection will be in relatively common use in the near term and should become part of the global science community's technical toolkit. Consideration should also be given to the establishment of Information Analysis Centers to facilitate effective organization and management of interdisciplinary data and the prototype testing and use of advanced information technology to facilitate rapid and cost-effective integration of these tools into global change science. 8 refs.
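The database-design point above can be made concrete with a small sketch. This is an illustrative schema, not one from the paper; the table and column names and the sample values are invented. The idea is that normalized tables store station metadata once, so observations stay consistent and shareable:

```python
import sqlite3

# Normalized design: site metadata lives in one table, observations in
# another, joined by a key -- the "thoughtful database design" being advocated.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE site (
        site_id INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        lat     REAL,
        lon     REAL
    );
    CREATE TABLE measurement (
        site_id  INTEGER REFERENCES site(site_id),
        obs_date TEXT,
        variable TEXT,   -- e.g. 'co2_ppm'; names here are illustrative
        value    REAL
    );
""")
conn.execute("INSERT INTO site VALUES (1, 'Station A', 19.5, -155.6)")
conn.executemany(
    "INSERT INTO measurement VALUES (1, ?, 'co2_ppm', ?)",
    [("1990-01-01", 353.8), ("1991-01-01", 354.9)],  # invented sample values
)

# A join reconstructs a self-describing record for sharing.
row = conn.execute("""
    SELECT s.name, m.obs_date, m.value
    FROM measurement m JOIN site s USING (site_id)
    ORDER BY m.obs_date LIMIT 1
""").fetchone()
print(row)  # ('Station A', '1990-01-01', 353.8)
```

Rigorous data administration, in this sketch, would amount to enforcing the foreign-key and NOT NULL constraints and documenting the `variable` vocabulary so the data retain long-term value.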
California Earthquake Clearinghouse: Advocating for, and Advancing, Collaboration and Technology Interoperability, Between the Scientific and Emergency Response Communities, to Produce Actionable Intelligence for Situational Awareness, and Decision Support
Rosinski, A.; Beilin, P.; Colwell, J.; Hornick, M.; Glasscoe, M. T.; Morentz, J.; Smorodinsky, S.; Millington, A.; Hudnut, K. W.; Penn, P.; Ortiz, M.; Kennedy, M.; Long, K.; Miller, K.; Stromberg, M.
The Clearinghouse provides emergency management and response professionals and the scientific and engineering communities with prompt information on ground failure, structural damage, and other consequences from significant seismic events such as earthquakes or tsunamis. Clearinghouse activations include participation from Federal, State and local government, law enforcement, fire, EMS, emergency management, public health, environmental protection, the military, public and non-governmental organizations, and the private sector. For the August 24, 2014 S. Napa earthquake, over 100 people from 40 different organizations participated during the 3-day Clearinghouse activation. Every organization has its own role and responsibility in disaster response; however all require authoritative data about the disaster for rapid hazard assessment and situational awareness. The Clearinghouse has been proactive in fostering collaboration and sharing Essential Elements of Information across disciplines. The Clearinghouse-led collaborative promotes the use of standard formats and protocols to allow existing technology to transform data into meaningful incident-related content and to enable data to be used by the largest number of participating Clearinghouse partners, thus providing responding personnel with enhanced real-time situational awareness, rapid hazard assessment, and more informed decision-making in support of response and recovery. The Clearinghouse efforts address national priorities outlined in USGS Circular 1242, Plan to Coordinate NEHRP post-earthquake investigations, and S. 740-Geospatial Data Act of 2015, Sen. Orrin Hatch (R-UT), to streamline and coordinate geospatial data infrastructure, maximizing geospatial data in support of the Robert T. Stafford Act. Finally, the US Dept. of Homeland Security, Geospatial Management Office, recognized the Clearinghouse's data sharing efforts as a Best Practice to be included in the forthcoming 2015 HLS Geospatial Concept of Operations.
Wink, Donald J.
The National Science Foundation was the original organizational leader for the Internet, and it is still engaged in funding research and infrastructure related to the use of networked information. As it is written in the strategic plan for the Directorate for Computer and Information Science and Engineering, "These technologies promise to have at least as great an impact as did the invention of written language thousands of years ago."
... HUMAN SERVICES Office of the National Coordinator for Health Information Technology; Health Information... Electronic Health Records (EHRs) AGENCY: Health Information Technology (HIT) Policy Committee, Office of the National Coordinator for Health Information Technology (ONC), Department of Health and Human Services...
... Census Bureau Proposed Information Collection; Comment Request; Information and Communication Technology... 2012 Information and Communication Technology Survey (ICTS). The annual survey collects data on two... of information and communication technology equipment and software (computers and...
... Census Bureau Proposed Information Collection; Comment Request; Information and Communication Technology... Bureau plans to conduct the 2013 through 2015 Information and Communication Technology Survey (ICTS). The... leases and rental payments) for four types of information and communication technology equipment...
This paper opens with the following questions: "How prepared are you as a student affairs professional for information communication technology (ICT)? Do you understand such concepts as portals, e-business, Napster, computer use policies, and wireless communication? Will student affairs be shaped by ICT or will student affairs help shape ICT on…
Spicer, Donald Z.
For at least the last quarter century, enterprises--including higher education institutions--have increasingly relied on Information Technology Services (ITS) for business functions. As a result, IT organizations have had to develop the discipline of production operations as well as recovery procedures to respond when those operations are…
CAUSE, Boulder, CO.
Eight papers from the 1987 CAUSE conference's Track VII, Outstanding Applications, are presented. They include: "Image Databases in the University" (Reid Kaplan and Gordon Mathieson); "Using Information Technology for Travel Management at the University of Michigan" (Robert E. Russell and John C. Hufziger); "On-Line Access to University Policies…
This book presents an overview of the present status of the use of library automation hardware and software in Pakistan. The following 20 articles are included: (1) "The Status of Library Automation in Pakistan"; (2) "Promoting Information Technology in Pakistan: the Netherlands Library Development Project"; (3) "Library Software in Pakistan"; (4)…
Green, Kenneth C.
Data from a 1995 survey on campus computing indicate a major gain in the proportion of colleges and universities using information technology as an instructional resource. Four educators respond to this news and examine possible trends and issues to be addressed. (MSE)
Afshari, Mojgan; Bakar, Kamariah Abu; Luan, Wong Su; Samah, Bahaman Abu; Fooi, Foo Say
Leadership is an important component in guiding the teaching-learning process. Principals, as school leaders, have a major responsibility for initiating and implementing school change through the use of Information and Communication Technology (ICT) and can facilitate complex decisions to integrate it into learning, teaching and school administration.…
Mercado, Marina I.
Explores library-related implications of the U.S. Department of Justice's investigations into the operations of Microsoft and Intel and suggests that developing a broader understanding of information technology marketing is crucial to the short- and long-term future of libraries. (MES)
Hundley, Stephen P.
Most higher-education institutions cannot compete with business/industry for information-technology (IT) workers, but can level the playing field by capitalizing on workers' desire for professional-development opportunities. Strategies include using the appeal of the institution's mission and its role as a training ground for future IT workers; redesigning…
This paper aims at explaining the outcomes of information technology education for international students using anthropological theories of cultural schemas. Even though computer science and engineering are usually assumed to be culture-independent, the practice in classrooms seems to indicate that learning patterns depend on culture. The…
Kitzmiller, Rebecca Rutherford
Background: Hospital adoption of health information technology (HIT) systems is promoted as essential to decreasing medical error and their associated 44,000 annual deaths and $17 billion in healthcare costs (Institute of Medicine, 2001; Kohn, Corrigan, & Donaldson, 1999). Leading national healthcare groups, such as the Institute of Medicine,…
Ghasemiyeh, Rahim; Li, Feng
This paper evaluates the impacts of the Internet on organizational structures and identifies new forms of organizations in light of information technology (IT) advances. Four traditional forms of organizations are summarized, i.e., the bureaucratic hierarchy, the entrepreneurial organization, the matrix organization, and the adhocracy. The…
Describes the role and purpose of the Office of Technology Assessment (OTA) and its relationship to Congress. A chain of congressional activities is developed which links the major events within selected committees to the current OTA assessment of federal information dissemination, and issues and implications of this study are addressed. (14…
Issues concerning the introduction of information technology in education were studied as part of a project sponsored by the Organisation for Economic Co-operation and Development (OECD). The following areas were explored: policies for schools and higher education and the context for the policies; implementation strategies; the impact on learning;…
Abras, Chadia N
Education up to the latter part of the 20th century used strict methods of instruction delivery, relying mostly on tried theories in cognition and social learning. Approaches in constructivism and collaborative learning affirm the success of existing methods of delivering curriculum, yet they also validate the use of information technology as a vehicle to improve student learning. PMID:22787924
Chan, Steven King-Lun
Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…
This document presents a logical and realistic plan to implement the Information Technology (IT) Standards Program throughout the Department of Energy (DOE). It was developed by DOE Chief Information Officer (CIO) staff, with participation from many other individuals throughout the DOE complex. The DOE IT Standards Program coordinates IT standards activities Department-wide, including implementation of standards to support the DOE Information Architecture. The Program is voluntary, participatory, and consensus-based. The intent is to enable accomplishment of the DOE mission, and the Program is applicable to all DOE elements, both Federal and contractor. The purpose of this document is to describe the key elements of the DOE IT Standards Program.
Donnellan, A.; Lyzenga, G.; Argus, D.; Peltzer, G.; Parker, J.; Webb, F.; Heflin, M.; Zumberge, J.
Global Positioning System (GPS) data are useful for understanding both interseismic and postseismic deformation. Models of GPS data suggest that the lower crust, lateral heterogeneity, and fault slip, all provide a role in the earthquake cycle.
Friedman, Richard H
The Medicaid Information Technology Architecture (MITA) is a roadmap and tool-kit for States to transform their Medicaid Management Information System (MMIS) into an enterprise-wide, beneficiary-centric system. MITA will enable State Medicaid agencies to align their information technology (IT) opportunities with their evolving business needs. It also addresses long-standing issues of interoperability, adaptability, and data sharing, including clinical data, across organizational boundaries by creating models based on nationally accepted technical standards. Perhaps most significantly, MITA allows State Medicaid Programs to actively participate in the DHHS Secretary's vision of a transparent health care market that utilizes electronic health records (EHRs), ePrescribing and personal health records (PHRs). PMID:17427840
This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.
Masic, Izet; Pandza, Haris; Toromanovic, Selim; Masic, Fedja; Sivic, Suad; Zunic, Lejla; Masic, Zlatan
Advances in medicine in recent decades correlate significantly with advances in information technology. Modern information technologies (IT) have enabled faster, more reliable, and more comprehensive data collection. These technologies have also begun to generate a large amount of irrelevant information, which represents a limiting factor and a real, growing gap between medical knowledge on one hand and the ability of doctors to follow its growth on the other. Furthermore, in our environment, the term technology is generally reserved for its technical component. Education means learning, teaching, or the process of acquiring skills or modifying behavior through various exercises. Traditionally, medical education meant the oral, practical and more passive transfer of knowledge and skills from educators to students and health professionals. For the clinical disciplines, principles such as “learning at the bedside,” aided by the medical literature, are of special importance. These techniques put students in contact with their teachers and refer them to the appropriate literature. The disadvantage of these educational methods lies in the fact that teachers often do not have enough time. Additionally, they are poorly suited to the horizontal and vertical integration of teaching, foster little or no self-education, and produce low skill levels and poor integration of education with the real social environment. In this paper the authors describe the application of modern IT in medical education, its advantages and disadvantages compared with traditional ways of education. PMID:23408471
Nam, Hyo Suk; Park, Eunjeong
Background and Purpose Information technology and mobile devices may be beneficial and useful in many aspects of stroke management, including recognition of stroke, transport and triage of patients, emergent stroke evaluation at the hospital, and rehabilitation. In this review, we address the contributions of information technology and mobile health to stroke management. Summary of Issues Rapid detection and triage are essential for effective thrombolytic treatment. Awareness of stroke warning signs and responses to stroke could be enhanced by using mobile applications. Furthermore, prehospital assessment and notification could be streamlined for use in telemedicine and teleradiology. A mobile telemedicine system for assessing National Institutes of Health Stroke Scale scores has shown high correlation with, and faster assessment than, the face-to-face method. Because the benefits of thrombolytic treatment are time-dependent, treatment should be initiated as quickly as possible. In-hospital communication between multidisciplinary team members can be enhanced using information technology. A computerized in-hospital alert system using computerized physician-order entry was shown to be effective in reducing the time intervals from hospital arrival to medical evaluation and thrombolytic treatment. Mobile devices can also be used as supplementary tools for neurologic examination and clinical decision-making. In post-stroke rehabilitation, virtual reality and telerehabilitation are helpful. Mobile applications might be useful for public awareness, lifestyle modification, and education/training of healthcare professionals. Conclusions Information technology and mobile health are useful tools for management of stroke patients from the acute period to rehabilitation. Further improvement of technology will change and enhance stroke prevention and treatment. PMID:24396807
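The time intervals that the in-hospital alert system above is designed to shorten (e.g., door-to-imaging and door-to-needle times) are straightforward to compute from timestamped workflow events. A minimal sketch, using hypothetical event names and times:

```python
# Hypothetical sketch: computing door-to-imaging and door-to-needle intervals
# from timestamped in-hospital events. Event names and times are illustrative,
# not from any specific CPOE system.
from datetime import datetime

events = {
    "arrival":      datetime(2024, 1, 1, 10, 0),   # patient reaches the ED
    "ct_completed": datetime(2024, 1, 1, 10, 18),  # brain imaging done
    "tpa_bolus":    datetime(2024, 1, 1, 10, 42),  # thrombolytic started
}

def interval_minutes(events, start, end):
    """Elapsed minutes between two recorded workflow events."""
    return (events[end] - events[start]).total_seconds() / 60.0

door_to_imaging = interval_minutes(events, "arrival", "ct_completed")
door_to_needle = interval_minutes(events, "arrival", "tpa_bolus")
```

Tracking these metrics per patient is what allows an alert system's effect on treatment delay to be measured.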
... electronic and information technology that meets the standards at 36 CFR part 1194 would impose an undue... 29 Labor 9 2012-07-01 2012-07-01 false Electronic and information technology requirements. 2205... ELECTRONIC AND INFORMATION TECHNOLOGY § 2205.135 Electronic and information technology requirements. (a)...
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Information technology... SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY General 39.104 Information technology services. When acquiring information technology services, solicitations must not describe...
... 29 Labor 4 2014-07-01 2014-07-01 false Electronic and information technology requirements. 1615... INFORMATION TECHNOLOGY § 1615.135 Electronic and information technology requirements. (a) Development, procurement, maintenance, or use of electronic and information technology.—When developing,...
Rebitzer, James B; Rege, Mari; Shepard, Christopher
We investigate whether information technology (IT) can help physicians more efficiently acquire new knowledge in a clinical environment characterized by information overload. We combine analysis of data from a randomized trial with a theoretical model of the influence that IT has on the acquisition of new medical knowledge. Although the theoretical framework we develop is conventionally microeconomic, the model highlights the non-market and non-pecuniary influence activities that have been emphasized in the sociological literature on technology diffusion. We report three findings. First, empirical evidence and theoretical reasoning suggest that computer-based decision support will speed the diffusion of new medical knowledge when physicians are coping with information overload. Second, spillover effects will likely lead to "underinvestment" in this decision support technology. Third, alternative financing strategies common to new IT, such as the use of marketing dollars to pay for the decision support systems, may lead to undesirable outcomes if physician information overload is sufficiently severe and if there is significant ambiguity in how best to respond to the clinical issues identified by the computer. This is the first paper to analyze empirically and theoretically how computer-based decision support influences the acquisition of new knowledge by physicians. PMID:19548513
Bates, David W.; Evans, R. Scott; Murff, Harvey; Stetson, Peter D.; Pizziferri, Lisa; Hripcsak, George
Context: Although patient safety is a major problem, most health care organizations rely on spontaneous reporting, which detects only a small minority of adverse events. As a result, problems with safety have remained hidden. Chart review can detect adverse events in research settings, but it is too expensive for routine use. Information technology techniques can detect some adverse events in a timely and cost-effective way, in some cases early enough to prevent patient harm. Objective: To review methodologies of detecting adverse events using information technology, reports of studies that used these techniques to detect adverse events, and study results for specific types of adverse events. Design: Structured review. Methodology: English-language studies that reported using information technology to detect adverse events were identified using standard techniques. Only studies that contained original data were included. Main Outcome Measures: Adverse events, with specific focus on nosocomial infections, adverse drug events, and injurious falls. Results: Tools such as event monitoring and natural language processing can inexpensively detect certain types of adverse events in clinical databases. These approaches already work well for some types of adverse events, including adverse drug events and nosocomial infections, and are in routine use in a few hospitals. In addition, it appears likely that these techniques will be adaptable in ways that allow detection of a broad array of adverse events, especially as more medical information becomes computerized. Conclusion: Computerized detection of adverse events will soon be practical on a widespread basis. PMID:12595401
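The event-monitoring approach described above screens clinical databases for trigger patterns, such as an antidote order or an out-of-range lab value in a patient on a high-risk drug, that suggest a possible adverse event. A minimal sketch of such a rule-based monitor, with hypothetical rules, thresholds, and data:

```python
# Minimal sketch of a rule-based adverse-drug-event (ADE) monitor.
# The trigger rules and thresholds are illustrative assumptions, not the
# rules of any specific hospital system described in the study.
from dataclasses import dataclass

@dataclass
class MedicationOrder:
    patient_id: str
    drug: str

@dataclass
class LabResult:
    patient_id: str
    test: str
    value: float

def detect_ade_signals(orders, labs):
    """Flag patients whose records match simple ADE trigger rules.

    Example triggers: a supratherapeutic INR in a patient on warfarin,
    or an antidote order (naloxone) suggesting an opioid overdose.
    """
    on_warfarin = {o.patient_id for o in orders if o.drug == "warfarin"}
    signals = []
    for lab in labs:
        if lab.patient_id in on_warfarin and lab.test == "INR" and lab.value > 4.0:
            signals.append((lab.patient_id, "possible warfarin overanticoagulation"))
    for order in orders:
        if order.drug == "naloxone":
            signals.append((order.patient_id, "antidote ordered: possible opioid ADE"))
    return signals

orders = [MedicationOrder("p1", "warfarin"), MedicationOrder("p2", "naloxone")]
labs = [LabResult("p1", "INR", 5.2), LabResult("p1", "INR", 2.1)]
signals = detect_ade_signals(orders, labs)
```

Running such rules continuously against order-entry and laboratory databases is what makes this kind of detection timely and inexpensive compared with manual chart review.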
The digital revolution affects the environment on several levels. Most directly, information and communications technology (ICT) has environmental impacts through the manufacturing, operation, and disposal of devices and network equipment, but it also provides ways to mitigate energy use, for example through smart buildings and teleworking. At a broader system level, ICTs influence economic growth and bring about technological and societal change. Managing the direct impacts of ICTs is more complex than just producing efficient devices: the energetically expensive manufacturing process and the increasing proliferation of devices also need to be taken into account. PMID:22094696
Weinger, Matthew B; Abbott, Patricia A; Wears, Robert L
Current research suggests that the rate of adoption of health information technology (HIT) is low, and that HIT may not have the touted beneficial effects on quality of care or costs. The twin issues of the failure of HIT adoption and of HIT efficacy stem primarily from a series of fallacies about HIT. We discuss 12 HIT fallacies and their implications for design and implementation. These fallacies must be understood and addressed for HIT to yield better results. Foundational cognitive and human factors engineering research and development are essential to better inform HIT development, deployment, and use. PMID:20962121