Sample records for sophisticated information processing

  1. Examining Candidate Information Search Processes: The Impact of Processing Goals and Sophistication.

    ERIC Educational Resources Information Center

    Huang, Li-Ning

    2000-01-01

    Investigates how 4 different information-processing goals, varying on the dimensions of effortful versus effortless and impression-driven versus non-impression-driven processing, and individual difference in political sophistication affect the depth at which undergraduate students process candidate information and their decision-making strategies.…

  2. A regional assessment of information technology sophistication in Missouri nursing homes.

    PubMed

    Alexander, Gregory L; Madsen, Richard; Wakefield, Douglas

    2010-08-01

    To provide a state profile of information technology (IT) sophistication in Missouri nursing homes. Primary survey data were collected from December 2006 to August 2007. A descriptive, exploratory cross-sectional design was used to investigate dimensions of IT sophistication (technological, functional, and integration) related to resident care, clinical support, and administrative processes. Each dimension was used to describe the clinical domains and demographics (ownership, regional location, and bed size). The final sample included 185 nursing homes. A wide range of IT sophistication is being used in administrative and resident care management processes, but very little in clinical support activities. Evidence suggests nursing homes in Missouri are expanding use of IT beyond traditional administrative and billing applications to patient care and clinical applications. This trend is important to provide support for capabilities which have been implemented to achieve national initiatives for meaningful use of IT in health care settings.

  3. Securing Information with Complex Optical Encryption Networks

    DTIC Science & Technology

    2015-08-11

    Keywords: network security, network vulnerability, multi-dimensional processing, optoelectronic devices. ...optoelectronic devices and systems should be analyzed before the retrieval; any hostile hacker will need to possess multi-disciplinary scientific...sophisticated optoelectronic principles and systems where he/she needs to process the information. However, in the military applications, most military...

  4. Handbook of automated data collection methods for the National Transit Database

    DOT National Transportation Integrated Search

    2003-10-01

    In recent years, with the increasing sophistication and capabilities of information processing technologies, there has been a renewed interest on the part of transit systems to tap the rich information potential of the National Transit Database (NTD)...

  5. Emotional Intelligence: New Ability or Eclectic Traits?

    ERIC Educational Resources Information Center

    Mayer, John D.; Salovey, Peter; Caruso, David R.

    2008-01-01

    Some individuals have a greater capacity than others to carry out sophisticated information processing about emotions and emotion-relevant stimuli and to use this information as a guide to thinking and behavior. The authors have termed this set of abilities emotional intelligence (EI). Since the introduction of the concept, however, a schism has…

  6. How Do Students Regulate their Learning of Complex Systems with Hypermedia?

    ERIC Educational Resources Information Center

    Azevedo, Roger; Seibert, Diane; Guthrie, John T.; Cromley, Jennifer G.; Wang, Huei-yu; Tron, Myriam

    This study examined the role of different goal-setting instructional interventions in facilitating students' shift to more sophisticated mental models of the circulatory system as indicated by both performance and process data. Researchers adopted the information processing model of self-regulated learning of P. Winne and colleagues (1998, 2001)…

  7. The Neuroscience of Dance and the Dance of Neuroscience: Defining a Path of Inquiry

    ERIC Educational Resources Information Center

    Dale, J. Alexander; Hyatt, Janyce; Hollerman, Jeff

    2007-01-01

    The neural processes of a person comprehending or creating music have intrigued neuroscientists and prompted them to examine the processing of information and emotion with some of the most recent and sophisticated techniques in the brain sciences (see, for example, Zatorre and his colleagues' work). These techniques and the excitement of studying…

  8. There's gold in them thar' databases.

    PubMed

    Gillespie, G

    2000-11-01

    Some health care organizations are using sophisticated data mining applications to unearth hidden truths buried in their online clinical and financial information. But the lack of a standard clinical vocabulary and standard work processes is an obstacle CIOs must blast through to reach their treasure.

  9. The New Paradox of the College Textbook.

    ERIC Educational Resources Information Center

    Lichtenberg, James

    1992-01-01

    As college textbooks have become more attractive, sophisticated, and useful, the textbook industry is suffering from high costs, increased popularity of used books, effects of rapidly advancing information and instructional technology, the atypical business structure of the college textbook market, and changing textbook development processes. (MSE)

  10. A Methodology for Distributing the Corporate Database.

    ERIC Educational Resources Information Center

    McFadden, Fred R.

    The trend to distributed processing is being fueled by numerous forces, including advances in technology, corporate downsizing, increasing user sophistication, and acquisitions and mergers. Increasingly, the trend in corporate information systems (IS) departments is toward sharing resources over a network of multiple types of processors, operating…

  11. The Role of Self-Regulated Learning in Fostering Students' Conceptual Understanding of Complex Systems with Hypermedia

    ERIC Educational Resources Information Center

    Azevedo, Roger; Guthrie, John T.; Seibert, Diane

    2004-01-01

    This study examines the role of self-regulated learning (SRL) in facilitating students' shifts to more sophisticated mental models of the circulatory system as indicated by both performance and process data. We began with Winne and colleagues' information processing model of SRL (Winne, 2001; Winne & Hadwin, 1998) and used it to examine how…

  12. Processing power limits social group size: computational evidence for the cognitive costs of sociality

    PubMed Central

    Dávid-Barrett, T.; Dunbar, R. I. M.

    2013-01-01

    Sociality is primarily a coordination problem. However, the social (or communication) complexity hypothesis suggests that the kinds of information that can be acquired and processed may limit the size and/or complexity of social groups that a species can maintain. We use an agent-based model to test the hypothesis that the complexity of information processed influences the computational demands involved. We show that successive increases in the kinds of information processed allow organisms to break through the glass ceilings that otherwise limit the size of social groups: larger groups can only be achieved at the cost of more sophisticated kinds of information processing that are disadvantageous when optimal group size is small. These results simultaneously support both the social brain and the social complexity hypotheses. PMID:23804623

  13. The evolution of educational information systems and nurse faculty roles.

    PubMed

    Nelson, Ramona; Meyers, Linda; Rizzolo, Mary Anne; Rutar, Pamela; Proto, Marcia B; Newbold, Susan

    2006-01-01

    Institutions of higher education are purchasing and/or designing sophisticated administrative information systems to manage such functions as the application, admissions, and registration process, grants management, student records, and classroom scheduling. Although faculty also manage large amounts of data, few automated systems have been created to help faculty improve teaching and learning through the management of information related to individual students, the curriculum, educational programs, and program evaluation. This article highlights the potential benefits that comprehensive educational information systems offer nurse faculty.

  14. Molecular Thermodynamics for Cell Biology as Taught with Boxes

    ERIC Educational Resources Information Center

    Mayorga, Luis S.; Lopez, Maria Jose; Becker, Wayne M.

    2012-01-01

    Thermodynamic principles are basic to an understanding of the complex fluxes of energy and information required to keep cells alive. These microscopic machines are nonequilibrium systems at the micron scale that are maintained in pseudo-steady-state conditions by very sophisticated processes. Therefore, several nonstandard concepts need to be…

  15. When not to copy: female fruit flies use sophisticated public information to avoid mated males

    NASA Astrophysics Data System (ADS)

    Loyau, Adeline; Blanchet, Simon; van Laere, Pauline; Clobert, Jean; Danchin, Etienne

    2012-10-01

    Semen limitation (lack of semen to fertilize all of a female's eggs) imposes high fitness costs to female partners. Females should therefore avoid mating with semen-limited males. This can be achieved by using public information extracted from watching individual males' previous copulating activities. This adaptive preference should be flexible given that semen limitation is temporary. We first demonstrate that the number of offspring produced by males Drosophila melanogaster gradually decreases over successive copulations. We then show that females avoid mating with males they just watched copulating and that visual public cues are sufficient to elicit this response. Finally, after males were given time to replenish their sperm reserves, females no longer avoided the males they had previously seen copulating. These results suggest that female fruit flies may have evolved sophisticated behavioural processes of resistance to semen-limited males, and demonstrate unsuspected adaptive context-dependent mate choice in an invertebrate.

  16. Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    Wilson, Gregory S.; Huntress, Wesley T.

    1990-01-01

    The rationale behind Mission to Planet Earth is presented, and the program plan is described in detail. NASA and its interagency and international partners will place satellites carrying advanced sensors in strategic earth orbits to collect multidisciplinary data. A sophisticated data system will process and archive an unprecedentedly large amount of information about the earth and how it functions as a system. Attention is given to the space observatories, the data and information systems, and the interdisciplinary research.

  17. Data management in pattern recognition and image processing systems

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1976-01-01

    Data management considerations are important to any system which handles large volumes of data or where the manipulation of data is technically sophisticated. A particular problem is the introduction of image-formatted files into the mainstream of data processing application. This report describes a comprehensive system for the manipulation of image, tabular, and graphical data sets which involve conversions between the various data types. A key characteristic is the use of image processing technology to accomplish data management tasks. Because of this, the term 'image-based information system' has been adopted.

  18. Information technology sophistication in nursing homes.

    PubMed

    Alexander, Gregory L; Wakefield, Douglas S

    2009-07-01

    There is growing recognition that a more sophisticated information technology (IT) infrastructure is needed to improve the quality of nursing home care in the United States. The purpose of this study was to explore the concept of IT sophistication in nursing homes considering the level of technological diversity, maturity and level of integration in resident care, clinical support, and administration. Twelve IT stakeholders were interviewed from 4 nursing homes considered to have high IT sophistication using focus groups and key informant interviews. Common themes were derived using qualitative analytics and axial coding from field notes collected during interviews and focus groups. Respondents echoed the diversity of the innovative IT systems being implemented; these included resident alerting mechanisms for clinical decision support, enhanced reporting capabilities of patient-provider interactions, remote monitoring, and networking among affiliated providers. Nursing home IT is in its early stages of adoption; early adopters are beginning to realize benefits across clinical domains including resident care, clinical support, and administrative activities. The most important thread emerging from these discussions was the need for further interface development between IT systems to enhance integrity and connectivity. The study shows that some early adopters of sophisticated IT systems in nursing homes are beginning to achieve added benefit for resident care, clinical support, and administrative activities.

  19. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The process through which representative air carriers decide whether or not to purchase flight equipment was investigated, as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition from a wholly informal process in its earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  20. An Examination of the Use of Accounting Information Systems and the Success of Small Businesses in South Carolina

    ERIC Educational Resources Information Center

    Saracina, Tara H.

    2012-01-01

    The purpose of this study was to explore the relationship between the use and sophistication of accounting information systems (AISs) and the success of small businesses in South Carolina. Additionally, this study explored the variables that influence South Carolinian small business owners/managers in the extent of adoption (sophistication) of…

  1. Effective Materials Property Information Management for the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2010-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in industry, research organizations and government agencies. In part these are fuelled by the demands for higher efficiency in material testing, product design and development and engineering analysis. But equally important, organizations are being driven to employ sophisticated methods and software tools for managing their mission-critical materials information by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Furthermore, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analysis approaches, particularly for composite materials, requires both processing of much larger volumes of test data for development of constitutive models and much more complex materials data input requirements for Computer-Aided Engineering (CAE) software. And finally, the globalization of engineering processes and outsourcing of design and development activities generates much greater needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately, material property management systems have kept pace with the growing user demands. They have evolved from hard copy archives, through simple electronic databases, to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access control, version control, and quality control; (ii) a wide range of data import, export and analysis capabilities; (iii) mechanisms for ensuring that all data is traceable to its pedigree sources: details of testing programs, published sources, etc.; (iv) tools for searching, reporting and viewing the data; and (v) access to the information via a wide range of interfaces, including web browsers, rich clients, programmatic access and clients embedded in third-party applications, such as CAE systems. This paper discusses the important requirements for advanced material data management systems as well as the future challenges and opportunities, such as automated error checking, automated data quality assessment and characterization, identification of gaps in data, as well as functionalities and business models to keep users returning to the source: to generate user demand to fuel database growth and maintenance.

  2. Beyond mind-reading: multi-voxel pattern analysis of fMRI data.

    PubMed

    Norman, Kenneth A; Polyn, Sean M; Detre, Greg J; Haxby, James V

    2006-09-01

    A key challenge for cognitive neuroscience is determining how mental representations map onto patterns of neural activity. Recently, researchers have started to address this question by applying sophisticated pattern-classification algorithms to distributed (multi-voxel) patterns of functional MRI data, with the goal of decoding the information that is represented in the subject's brain at a particular point in time. This multi-voxel pattern analysis (MVPA) approach has led to several impressive feats of mind reading. More importantly, MVPA methods constitute a useful new tool for advancing our understanding of neural information processing. We review how researchers are using MVPA methods to characterize neural coding and information processing in domains ranging from visual perception to memory search.

  3. Nurse Assistant Communication Strategies About Pressure Ulcers in Nursing Homes.

    PubMed

    Alexander, Gregory L

    2015-07-01

    There is growing recognition of benefits of sophisticated information technology (IT) in nursing homes (NHs). In this research, we explore strategies nursing assistants (NAs) use to communicate pressure ulcer prevention practices in NHs with variable IT sophistication measures. Primary qualitative data were collected during focus groups with NAs in 16 NHs located across Missouri. NAs (n = 213) participated in 31 focus groups. Three major themes referencing communication strategies for pressure ulcer prevention were identified, including Passing on Information, Keeping Track of Needs and Information Access. NAs use a variety of strategies to prioritize care, and strategies are different based on IT sophistication level. NA work is an important part of patient care. However, little information about their work is included in communication, leaving patient records incomplete. NAs' communication is becoming increasingly important in the care of the millions of chronically ill elders in NHs. © The Author(s) 2014.

  4. Nurse Assistant Communication Strategies about Pressure Ulcers in Nursing Homes

    PubMed Central

    Alexander, Gregory L.

    2018-01-01

    There is growing recognition of benefits of sophisticated information technology (IT) in nursing homes. In this research, we explore strategies nurse assistants use to communicate pressure ulcer prevention practices in nursing homes with variable IT sophistication measures. Primary qualitative data were collected during focus groups with nursing assistants in 16 nursing homes located across Missouri. Nursing assistants (n=213) participated in 31 focus groups. Three major themes referencing communication strategies for pressure ulcer prevention were identified, including Passing on Information, Keeping Track of Needs and Information Access. Nurse assistants use a variety of strategies to prioritize care, and strategies differ based on IT sophistication level. Nursing assistant work is an important part of patient care. However, little information about their work is included in communication, leaving patient records incomplete. Nursing assistants’ communication is becoming increasingly important in the care of the millions of chronically ill elders in nursing homes. PMID:25331206

  5. The arrival of economic evidence in managed care formulary decisions: the unsolicited request process.

    PubMed

    Neumann, Peter J

    2005-07-01

    Managed care plans have traditionally resisted using economic evidence explicitly in drug formulary decisions, even as they used ever more aggressive and sophisticated processes for managing care. In recent years, this has changed as health plans have begun to adopt evidence-based and value-based formulary submission guidelines. The guidelines have the potential to serve as a national unifying template for pharmacy and therapeutics committees to consider clinical and economic information in a systematic and rigorous fashion. However, many questions remain about their use and about the nature of communications (called "unsolicited requests") from plans to drug companies for information. This article describes the unsolicited request process and its potential impact on the use of economic evidence in formulary decisions.

  6. Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.

    2014-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for robust automated materials information management system(s) enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic, and/or multi-scale models requires both the processing of large volumes of test data and the complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, material information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both point-wise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system to enable the connections at various length scales to be made between experimental data and corresponding multiscale modeling toolsets to enable ICME. In particular, NASA Glenn's efforts towards establishing such a database for capturing constitutive modeling behavior for both monolithic and composite materials are discussed.

  7. Landauer in the Age of Synthetic Biology: Energy Consumption and Information Processing in Biochemical Networks

    NASA Astrophysics Data System (ADS)

    Mehta, Pankaj; Lang, Alex H.; Schwab, David J.

    2016-03-01

    A central goal of synthetic biology is to design sophisticated synthetic cellular circuits that can perform complex computations and information processing tasks in response to specific inputs. The tremendous advances in our ability to understand and manipulate cellular information processing networks raises several fundamental physics questions: How do the molecular components of cellular circuits exploit energy consumption to improve information processing? Can one utilize ideas from thermodynamics to improve the design of synthetic cellular circuits and modules? Here, we summarize recent theoretical work addressing these questions. Energy consumption in cellular circuits serves five basic purposes: (1) increasing specificity, (2) manipulating dynamics, (3) reducing variability, (4) amplifying signal, and (5) erasing memory. We demonstrate these ideas using several simple examples and discuss the implications of these theoretical ideas for the emerging field of synthetic biology. We conclude by discussing how it may be possible to overcome these limitations using "post-translational" synthetic biology that exploits reversible protein modification.

  8. The spatial and temporal organization of ubiquitin networks

    PubMed Central

    Grabbe, Caroline; Husnjak, Koraljka; Dikic, Ivan

    2013-01-01

    In the past decade, the diversity of signals generated by the ubiquitin system has emerged as a dominant regulator of biological processes and propagation of information in the eukaryotic cell. A wealth of information has been gained about the crucial role of spatial and temporal regulation of ubiquitin species of different lengths and linkages in the nuclear factor-κB (NF-κB) pathway, endocytic trafficking, protein degradation and DNA repair. This spatiotemporal regulation is achieved through sophisticated mechanisms of compartmentalization and sequential series of ubiquitylation events and signal decoding, which control diverse biological processes not only in the cell but also during the development of tissues and entire organisms. PMID:21448225

  9. Efficient and secure outsourcing of genomic data storage.

    PubMed

    Sousa, João Sá; Lefebvre, Cédric; Huang, Zhicong; Raisaro, Jean Louis; Aguilar-Melchor, Carlos; Killijian, Marc-Olivier; Hubaux, Jean-Pierre

    2017-07-26

    Cloud computing is becoming the preferred solution for efficiently dealing with the increasing amount of genomic data. Yet, outsourcing storage and processing sensitive information, such as genomic data, comes with important concerns related to privacy and security. This calls for new sophisticated techniques that ensure data protection from untrusted cloud providers and that still enable researchers to obtain useful information. We present a novel privacy-preserving algorithm for fully outsourcing the storage of large genomic data files to a public cloud and enabling researchers to efficiently search for variants of interest. In order to protect data and query confidentiality from possible leakage, our solution exploits optimal encoding for genomic variants and combines it with homomorphic encryption and private information retrieval. Our proposed algorithm is implemented in C++ and was evaluated on real data as part of the 2016 iDash Genome Privacy-Protection Challenge. Results show that our solution outperforms the state-of-the-art solutions and enables researchers to search over millions of encrypted variants in a few seconds. As opposed to prior beliefs that sophisticated privacy-enhancing technologies (PETs) are impractical for real operational settings, our solution demonstrates that, in the case of genomic data, PETs are very efficient enablers.

  10. Training & Personnel Systems Technology. R&D Program Description FY 84-85.

    DTIC Science & Technology

    1984-04-01

    performance requirements in terms of rapid response times, high rates of information processing, and complex decision making that tax the capabilities...makers to make linguistic and format changes to texts to enhance general literacy rates, (d) begin integrating human and animal data on stress effects...systems are being integrated into the force at unprecedented rates; arrival of this sophisticated, high-technology equipment will coincide with increased...

  11. Visual Display Principles for C3I System Tasks

    DTIC Science & Technology

    1993-06-01

    early in the design process is now explicitly recognized in military R&D policy, as evidenced by the Navy’s HARDMAN and the Army’s MANPRINT programs...information): required sampling rate for each battlefield area, target type, and sensor type, etc.? Change detection aids: Where is the enemy?...increasing load and sophistication for operators and decisionmakers...automated measurement and scoring (% hits, miss distances, attrition rates, etc.)...

  12. Spanish Language Processing at University of Maryland: Building Infrastructure for Multilingual Applications

    DTIC Science & Technology

    2001-09-01

    translation of the Spanish original sentence. Acquiring bilingual dictionary entries: In addition to building and applying the more sophisticated LCS...porting LCS lexicons to new languages, as described above, and are also useful by themselves in improving dictionary-based cross-language information...hold much of the time. Moreover, lexical dependencies have proven to be instrumental in advances in monolingual syntactic analysis...

  13. Technology for the product and process data base

    NASA Technical Reports Server (NTRS)

    Barnes, R. D.

    1984-01-01

    The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.

  14. Development of a general-purpose, integrated knowledge capture and delivery system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, A.G.; Freer, E.B.

    1991-01-01

    KATIE (Knowledge-Based Assistant for Troubleshooting Industrial Equipment) was first conceived as a solution for maintenance problems. In the area of process control, maintenance technicians have become responsible for increasingly complicated equipment and an overwhelming amount of associated information. The sophisticated distributed control systems have proven to be such a drastic change for technicians that they are forced to rely on the engineer for troubleshooting guidance. Because it is difficult for a knowledgeable engineer to be readily available for troubleshooting, maintenance personnel wish to capture the information provided by the engineer. The solution provided has two stages. First, a specific complicated system was chosen as a test case. An effort was made to gather all available system information in some form. Second, a method of capturing and delivering this collection of information was developed. Several features were desired for this knowledge capture/delivery system (KATIE). Creation of the knowledge base needed to be independent of the delivery system. The delivery path needed to be as simple as possible for the technician, and the capture, or authoring, system could provide very sophisticated features. It was decided that KATIE should be as general as possible, not internalizing specifics about the first implementation. The knowledge bases created needed to be completely separate from KATIE and needed to have a modular structure so that each type of information (rules, procedures, manuals, symptoms) could be encapsulated individually.

  15. Transformation as a Design Process and Runtime Architecture for High Integrity Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bespalko, S.J.; Winter, V.L.

    1999-04-05

    We have discussed two aspects of creating high integrity software that greatly benefit from the availability of transformation technology, which in this case is manifest by the requirement for a sophisticated backtracking parser. First, because of the potential for correctly manipulating programs via small changes, an automated non-procedural transformation system can be a valuable tool for constructing high assurance software. Second, modeling the process of translating data into information as a (perhaps context-dependent) grammar leads to an efficient, compact implementation. From a practical perspective, the transformation process should begin in the domain language in which a problem is initially expressed. Thus in order for a transformation system to be practical it must be flexible with respect to domain-specific languages. We have argued that transformation applied to specification results in a highly reliable system. We also attempted to briefly demonstrate that transformation technology applied to the runtime environment will result in a safe and secure system. We thus believe that the sophisticated multi-lookahead backtracking parsing technology is central to the task of being in a position to demonstrate the existence of high integrity software (HIS).

  16. Mapa Sociolinguistico. Analisis demolinguistico de la Comunidad Autonoma Vasca derivado del padron de 1986 (Sociolinguistic Map. Demolinguistic analysis of the Autonomous Basque Community derived from the 1986 Census).

    ERIC Educational Resources Information Center

    Basque Autonomous Community, Vitoria (Spain). General Secretariat of Linguistic Policy.

    Sociolinguistic data are presented in the form of sophisticated maps and tables in this pioneering study on the status of the Basque language. Based on information collected from the 1986 census, the major demographic characteristics of Basque are examined in order to ascertain the factors and processes that have contributed to its current status.…

  17. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    PubMed

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

    Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
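    The take-the-best heuristic mentioned in the abstract can be illustrated in a few lines. This is a minimal sketch of the classic deterministic TTB, not the authors' probabilistic variant; cue names and validities are invented for the example.

    ```python
    # Deterministic take-the-best (TTB): inspect cues in descending validity
    # order and decide on the first cue that discriminates between options.
    # Cue names and validity values below are illustrative only.

    def ttb_choice(cues_a, cues_b, validities):
        """Compare options A and B cue by cue, most valid cue first.

        cues_a, cues_b: dicts mapping cue name -> 0/1 cue value.
        validities: dict mapping cue name -> validity (probability the
        cue points to the correct option when it discriminates).
        Returns 'A', 'B', or 'guess' if no cue discriminates.
        """
        for cue in sorted(validities, key=validities.get, reverse=True):
            a, b = cues_a[cue], cues_b[cue]
            if a != b:                      # first discriminating cue decides
                return 'A' if a > b else 'B'
        return 'guess'

    validities = {'cue1': 0.9, 'cue2': 0.8, 'cue3': 0.7}
    print(ttb_choice({'cue1': 1, 'cue2': 0, 'cue3': 1},
                     {'cue1': 1, 'cue2': 1, 'cue3': 0},
                     validities))   # cue1 ties, cue2 discriminates -> B
    ```

    The probabilistic version studied in the paper replaces the deterministic decision at each cue with an error probability that is constrained to increase down the cue hierarchy.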

  18. Experimental demonstration of graph-state quantum secret sharing.

    PubMed

    Bell, B A; Markham, D; Herrera-Martí, D A; Marin, A; Wadsworth, W J; Rarity, J G; Tame, M S

    2014-11-21

    Quantum communication and computing offer many new opportunities for information processing in a connected world. Networks using quantum resources with tailor-made entanglement structures have been proposed for a variety of tasks, including distributing, sharing and processing information. Recently, a class of states known as graph states has emerged, providing versatile quantum resources for such networking tasks. Here we report an experimental demonstration of graph state-based quantum secret sharing--an important primitive for a quantum network with applications ranging from secure money transfer to multiparty quantum computation. We use an all-optical setup, encoding quantum information into photons representing a five-qubit graph state. We find that one can reliably encode, distribute and share quantum information amongst four parties, with various access structures based on the complex connectivity of the graph. Our results show that graph states are a promising approach for realising sophisticated multi-layered communication protocols in quantum networks.

  19. Interpreting and Presenting Data to Management. Air Professional File Number 36.

    ERIC Educational Resources Information Center

    Clagett, Craig A.

    Guidelines are offered to institutional researchers and planning analysts for presenting research results in formats and levels of sophistication that are accessible to top management. Fundamental principles include: (1) know what is needed; (2) know when the information is needed; (3) match format to analytical sophistication and learning…

  20. Enhanced Information Retrieval Using AJAX

    NASA Astrophysics Data System (ADS)

    Kachhwaha, Rajendra; Rajvanshi, Nitin

    2010-11-01

    Information Retrieval deals with the representation, storage, organization of, and access to information items. The representation and organization of information items should provide the user with easy access to the information. With the rapid development of the Internet, large amounts of digitally stored information are readily available on the World Wide Web. This information is so huge that it becomes increasingly difficult and time consuming for users to find the information relevant to their needs. The explosive growth of information on the Internet has greatly increased the need for information retrieval systems. However, most search engines use conventional information retrieval systems. An information system needs to implement sophisticated pattern-matching tools to determine contents at a faster rate. AJAX has recently emerged as a new tool with which the information retrieval process can become fast, so that information reaches the user at a faster pace compared to conventional retrieval systems.

  1. Real-time face and gesture analysis for human-robot interaction

    NASA Astrophysics Data System (ADS)

    Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd

    2010-05-01

    Human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, or gestures. Facial expressions and gestures are one of the main nonverbal communication mechanisms and pass large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, a real-time capable processing and recognition of facial expressions, hand and head gestures are of great importance. We present a system that is tackling these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Applying this model, different kinds of information are extracted from the image data and afterwards handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head and facial-related features, low-level image features regarding the human hand (optical flow, Hu-moments) are also stored in the RTDB for the evaluation process of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed via different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. On the other hand, for the facial expressions classical decision trees or more sophisticated support vector machines are used for the classification process. These obtained results of the classification processes are again handed over to the RTDB, where other processes (like a Dialog Management Unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.
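    The HMM-based gesture recognition described above rests on standard decoding algorithms such as Viterbi. The following is a toy two-state sketch (states, transition and emission probabilities are invented), not the paper's actual models.

    ```python
    # Toy Viterbi decoder: find the most likely hidden-state sequence for
    # an observation sequence under a small HMM. The 'still'/'nod' model
    # here is illustrative only.
    import math

    def viterbi(obs, states, start_p, trans_p, emit_p):
        """Return the most likely hidden-state path for the observations."""
        V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
              for s in states}]
        path = {s: [s] for s in states}
        for o in obs[1:]:
            V.append({})
            new_path = {}
            for s in states:
                # best predecessor state for landing in s with emission o
                prob, prev = max(
                    (V[-2][p] + math.log(trans_p[p][s]) + math.log(emit_p[s][o]), p)
                    for p in states)
                V[-1][s] = prob
                new_path[s] = path[prev] + [s]
            path = new_path
        best = max(states, key=lambda s: V[-1][s])
        return path[best]

    states = ('still', 'nod')
    start_p = {'still': 0.7, 'nod': 0.3}
    trans_p = {'still': {'still': 0.8, 'nod': 0.2},
               'nod':   {'still': 0.3, 'nod': 0.7}}
    emit_p = {'still': {'low': 0.9, 'high': 0.1},
              'nod':   {'low': 0.2, 'high': 0.8}}
    print(viterbi(('low', 'high', 'high'), states, start_p, trans_p, emit_p))
    # -> ['still', 'nod', 'nod']
    ```

    In a gesture classifier, one such HMM is typically trained per gesture and classification picks the model with the highest sequence likelihood.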

  2. CRIB; the mineral resources data bank of the U.S. Geological Survey

    USGS Publications Warehouse

    Calkins, James Alfred; Kays, Olaf; Keefer, Eleanor K.

    1973-01-01

    The recently established Computerized Resources Information Bank (CRIB) of the U.S. Geological Survey is expected to play an increasingly important role in the study of United States' mineral resources. CRIB provides a rapid means for organizing and summarizing information on mineral resources and for displaying the results. CRIB consists of a set of variable-length records containing the basic information needed to characterize one or more mineral commodities, a mineral deposit, or several related deposits. The information consists of text, numeric data, and codes. Some topics covered are: name, location, commodity information, geology, production, reserves, potential resources, and references. The data are processed by the GIPSY program, which performs all the processing tasks needed to build, operate, and maintain the CRIB file. The sophisticated retrieval program allows the user to make highly selective searches of the files for words, parts of words, phrases, numeric data, word ranges, numeric ranges, and others, and to interrelate variables by logic statements to any degree of refinement desired. Three print options are available, or the retrieved data can be passed to another program for further processing.
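    The selective retrieval the CRIB abstract describes, interrelating fields by logic statements, can be caricatured as predicate filtering over variable-length records. The field names and records below are invented for illustration; this is not the GIPSY query syntax.

    ```python
    # Sketch of selective retrieval over variable-length records:
    # combine conditions with AND/OR logic, as the abstract describes.
    # Records and field names are invented for the example.

    records = [
        {'name': 'Deposit A', 'commodity': 'copper', 'reserves_kt': 120},
        {'name': 'Deposit B', 'commodity': 'gold',   'reserves_kt': 3},
        {'name': 'Deposit C', 'commodity': 'copper', 'reserves_kt': 45},
    ]

    def search(records, predicate):
        """Return the records satisfying a logic predicate."""
        return [r for r in records if predicate(r)]

    hits = search(records,
                  lambda r: r['commodity'] == 'copper' and r['reserves_kt'] > 50)
    print([r['name'] for r in hits])   # -> ['Deposit A']
    ```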

  3. Sender–receiver systems and applying information theory for quantitative synthetic biology

    PubMed Central

    Barcena Menendez, Diego; Senthivel, Vivek Raj; Isalan, Mark

    2015-01-01

    Sender–receiver (S–R) systems abound in biology, with communication systems sending information in various forms. Information theory provides a quantitative basis for analysing these processes and is being applied to study natural genetic, enzymatic and neural networks. Recent advances in synthetic biology are providing us with a wealth of artificial S–R systems, giving us quantitative control over networks with a finite number of well-characterised components. Combining the two approaches can help to predict how to maximise signalling robustness, and will allow us to make increasingly complex biological computers. Ultimately, pushing the boundaries of synthetic biology will require moving beyond engineering the flow of information and towards building more sophisticated circuits that interpret biological meaning. PMID:25282688
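    The quantitative basis the abstract refers to is, at its simplest, the mutual information between sender and receiver. A minimal sketch for a binary symmetric channel (the signal prior and error rate are illustrative parameters, not values from the paper):

    ```python
    # Mutual information I(S;R) for a binary symmetric sender-receiver
    # channel: I(S;R) = H(R) - H(R|S), in bits.
    import math

    def h2(p):
        """Binary entropy in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def mutual_information(p_signal, error_rate):
        """p_signal: P(sender emits 1); error_rate: P(bit is flipped)."""
        q = p_signal * (1 - error_rate) + (1 - p_signal) * error_rate  # P(R=1)
        return h2(q) - h2(error_rate)

    # A 10% error rate costs roughly half the channel's one bit of capacity.
    print(round(mutual_information(0.5, 0.1), 3))   # -> 0.531
    ```

    Maximising this quantity over sender behaviour is what "signalling robustness" optimisation amounts to in such analyses.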

  4. Commercial applications for optical data storage

    NASA Astrophysics Data System (ADS)

    Tas, Jeroen

    1991-03-01

    Optical data storage has spurred the market for document imaging systems. These systems are increasingly being used to electronically manage the processing, storage and retrieval of documents. Applications range from straightforward archives to sophisticated workflow management systems. The technology is developing rapidly and within a few years optical imaging facilities will be incorporated in most of the office information systems. This paper gives an overview of the status of the market, the applications and the trends of optical imaging systems.

  5. A Macro-Level Analysis of SRL Processes and Their Relations to the Acquisition of a Sophisticated Mental Model of a Complex System

    ERIC Educational Resources Information Center

    Greene, Jeffrey Alan; Azevedo, Roger

    2009-01-01

    In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…

  6. Reliability-Based Model to Analyze the Performance and Cost of a Transit Fare Collection System.

    DOT National Transportation Integrated Search

    1985-06-01

    The collection of transit system fares has become more sophisticated in recent years, with more flexible structures requiring more sophisticated fare collection equipment to process tickets and admit passengers. However, this new and complex equipmen...

  7. Reinventing Solutions to Systems of Linear Differential Equations: A Case of Emergent Models Involving Analytic Expressions

    ERIC Educational Resources Information Center

    Rasmussen, Chris; Blumenfeld, Howard

    2007-01-01

    An enduring challenge in mathematics education is to create learning environments in which students generate, refine, and extend their intuitive and informal ways of reasoning to more sophisticated and formal ways of reasoning. Pressing concerns for research, therefore, are to detail students' progressively sophisticated ways of reasoning and…

  8. Case management information systems: how to put the pieces together now and beyond year 2000.

    PubMed

    Matthews, Pamela

    2002-01-01

    The case management process is a critical management and operational component in the delivery of customer services across the patient care continuum. Case management has transcended time and will continue to be a viable infrastructure process for successful organizations in the future. A key component of the case management infrastructure is information systems and technology support. Case management challenges include effective deployment and use of systems and technology. As more sophisticated, integrated systems are made available, case managers can use these tools to continue to expand effectively beyond the patient's episodic event to provide greater levels of cradle-to-grave management of healthcare. This article explores methods for defining case management system needs and identifying automation options available to the case manager.

  9. Document Examination: Applications of Image Processing Systems.

    PubMed

    Kopainsky, B

    1989-12-01

    Dealing with images is a familiar business for an expert in questioned documents: microscopic, photographic, infrared, and other optical techniques generate images containing the information he or she is looking for. A recent method for extracting most of this information is digital image processing, ranging from simple contrast and contour enhancement to the advanced restoration of blurred texts. When combined with a sophisticated physical imaging system, an image processing system has proven to be a powerful and fast tool for routine non-destructive scanning of suspect documents. This article reviews frequent applications, comprising techniques to increase legibility, two-dimensional spectroscopy (ink discrimination, alterations, erased entries, etc.), comparison techniques (stamps, typescript letters, photo substitution), and densitometry. Computerized comparison of handwriting is not included. Copyright © 1989 Central Police University.

  10. Theory of Mind: Did Evolution Fool Us?

    PubMed Central

    Devaine, Marie; Hollard, Guillaume; Daunizeau, Jean

    2014-01-01

    Theory of Mind (ToM) is the ability to attribute mental states (e.g., beliefs and desires) to other people in order to understand and predict their behaviour. If others are rewarded to compete or cooperate with you, then what they will do depends upon what they believe about you. This is the reason why social interaction induces recursive ToM, of the sort “I think that you think that I think, etc.”. Critically, recursion is the common notion behind the definition of sophistication of human language, strategic thinking in games, and, arguably, ToM. Although sophisticated ToM is believed to have high adaptive fitness, broad experimental evidence from behavioural economics, experimental psychology and linguistics point towards limited recursivity in representing other’s beliefs. In this work, we test whether such apparent limitation may not in fact be proven to be adaptive, i.e. optimal in an evolutionary sense. First, we propose a meta-Bayesian approach that can predict the behaviour of ToM sophistication phenotypes who engage in social interactions. Second, we measure their adaptive fitness using evolutionary game theory. Our main contribution is to show that one does not have to appeal to biological costs to explain our limited ToM sophistication. In fact, the evolutionary cost/benefit ratio of ToM sophistication is non trivial. This is partly because an informational cost prevents highly sophisticated ToM phenotypes to fully exploit less sophisticated ones (in a competitive context). In addition, cooperation surprisingly favours lower levels of ToM sophistication. Taken together, these quantitative corollaries of the “social Bayesian brain” hypothesis provide an evolutionary account for both the limitation of ToM sophistication in humans as well as the persistence of low ToM sophistication levels. PMID:24505296

  11. Theory of mind: did evolution fool us?

    PubMed

    Devaine, Marie; Hollard, Guillaume; Daunizeau, Jean

    2014-01-01

    Theory of Mind (ToM) is the ability to attribute mental states (e.g., beliefs and desires) to other people in order to understand and predict their behaviour. If others are rewarded to compete or cooperate with you, then what they will do depends upon what they believe about you. This is the reason why social interaction induces recursive ToM, of the sort "I think that you think that I think, etc.". Critically, recursion is the common notion behind the definition of sophistication of human language, strategic thinking in games, and, arguably, ToM. Although sophisticated ToM is believed to have high adaptive fitness, broad experimental evidence from behavioural economics, experimental psychology and linguistics point towards limited recursivity in representing other's beliefs. In this work, we test whether such apparent limitation may not in fact be proven to be adaptive, i.e. optimal in an evolutionary sense. First, we propose a meta-Bayesian approach that can predict the behaviour of ToM sophistication phenotypes who engage in social interactions. Second, we measure their adaptive fitness using evolutionary game theory. Our main contribution is to show that one does not have to appeal to biological costs to explain our limited ToM sophistication. In fact, the evolutionary cost/benefit ratio of ToM sophistication is non trivial. This is partly because an informational cost prevents highly sophisticated ToM phenotypes to fully exploit less sophisticated ones (in a competitive context). In addition, cooperation surprisingly favours lower levels of ToM sophistication. Taken together, these quantitative corollaries of the "social Bayesian brain" hypothesis provide an evolutionary account for both the limitation of ToM sophistication in humans as well as the persistence of low ToM sophistication levels.

  12. Use telecommunications for real-time process control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zilberman, I.; Bigman, J.; Sela, I.

    1996-05-01

    Process operators need real-time, accurate information to monitor and control product streams and to optimize unit operations. The challenge is how to cost-effectively install sophisticated analytical equipment in harsh environments such as process areas and maintain system reliability. Incorporating telecommunications technology with near infrared (NIR) spectroscopy may be the bridge to help operations achieve their online control goals. Coupling communications fiber optics with NIR analyzers enables the probe and sampling system to remain in the field and crucial analytical equipment to be remotely located in a general purpose area without specialized protection provisions. The case histories show how two refineries used NIR spectroscopy online to track octane levels for reformate streams.

  13. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  14. Economic analysis of linking operating room scheduling and hospital material management information systems for just-in-time inventory control.

    PubMed

    Epstein, R H; Dexter, F

    2000-08-01

    Operating room (OR) scheduling information systems can decrease perioperative labor costs. Material management information systems can decrease perioperative inventory costs. We used computer simulation to investigate whether using the OR schedule to trigger purchasing of perioperative supplies is likely to further decrease perioperative inventory costs, as compared with using sophisticated, stand-alone material management inventory control. Although we designed the simulations to favor financially linking the information systems, we found that this strategy would be expected to decrease inventory costs substantively only for items of high price ($1000 each) and volume (>1000 used each year). Because expensive items typically have different models and sizes, each of which is used by a hospital less often than this, for almost all items there will be no benefit to making daily adjustments to the order volume based on booked cases. We conclude that, in a hospital with a sophisticated material management information system, OR managers will probably achieve greater cost reductions from focusing on negotiating less expensive purchase prices for items than on trying to link the OR information system with the hospital's material management information system to achieve just-in-time inventory control.
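    The comparison in the abstract can be caricatured in a few lines: a single-item simulation contrasting a fixed reorder-point policy with ordering driven by the next day's booked cases. All parameters are invented; this is not the authors' simulation model.

    ```python
    # Toy single-item inventory simulation: fixed reorder-point replenishment
    # vs. schedule-driven ordering with perfect knowledge of the day's demand.
    # Par levels, reorder point, and demand distribution are illustrative.
    import random

    def simulate(days, policy, seed=0):
        """Return average units held per day under the given policy."""
        rng = random.Random(seed)
        demand = [rng.randint(0, 4) for _ in range(days)]
        stock, held = 10, 0
        for d in demand:
            if policy == 'reorder_point':
                if stock <= 5:
                    stock += 10          # replenish to par when low
            else:                        # 'schedule_driven'
                stock = max(stock, d)    # order exactly what booked cases need
            stock -= min(d, stock)
            held += stock
        return held / days

    avg_rp = simulate(365, 'reorder_point')
    avg_sd = simulate(365, 'schedule_driven')
    print(avg_sd < avg_rp)               # schedule-driven holds less stock
    ```

    The toy model only shows the direction of the effect; the paper's point is that the resulting holding-cost saving is material only for unusually expensive, high-volume items.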

  15. Movement-related cortical magnetic fields associated with self-paced tongue protrusion in humans.

    PubMed

    Maezawa, Hitoshi; Oguma, Hidetoshi; Hirai, Yoshiyuki; Hisadome, Kazunari; Shiraishi, Hideaki; Funahashi, Makoto

    2017-04-01

    Sophisticated tongue movements are coordinated finely via cortical control. We elucidated the cortical processes associated with voluntary tongue movement. Movement-related cortical fields were investigated during self-paced repetitive tongue protrusion. Surface tongue electromyograms were recorded to determine movement onset. To identify the location of the primary somatosensory cortex (S1), tongue somatosensory evoked fields were measured. The readiness fields (RFs) over both hemispheres began prior to movement onset and culminated in the motor fields (MFs) around movement onset. These signals were followed by transient movement evoked fields (MEFs) after movement onset. The MF and MEF peak latencies and magnitudes were not different between the hemispheres. The MF current sources were located in the precentral gyrus, suggesting they were located in the primary motor cortex (M1); this was contrary to the MEF sources, which were located in S1. We conclude that the RFs and MFs mainly reflect the cortical processes for the preparation and execution of tongue movement in the bilateral M1, without hemispheric dominance. Moreover, the MEFs may represent proprioceptive feedback from the tongue to bilateral S1. Such cortical processing related to the efferent and afferent information may aid in the coordination of sophisticated tongue movements. Copyright © 2016 Elsevier Ireland Ltd and Japan Neuroscience Society. All rights reserved.

  16. Genealogical approaches to ethical implications of informational assimilative integrated discovery systems (AIDS) in business

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pharhizgar, K.D.; Lunce, S.E.

    1994-12-31

    Development of knowledge-based technological acquisition techniques and customers' information profiles are known as assimilative integrated discovery systems (AIDS) in modern organizations. These systems have access through processing to both deep and broad domains of information in modern societies. Through these systems organizations and individuals can predict future trend probabilities and events concerning their customers. AIDSs are new techniques which produce new information that informants can use without the help of the knowledge sources, because of the existence of highly sophisticated computerized networks. This paper has analyzed the danger and side effects of misuse of information through illegal, unethical and immoral access to the database in an integrated and assimilative information system as described above. Cognivistic mapping, pragmatistic informational design gathering, and holistic classifiable and distributive techniques are potentially abusive systems whose outputs can be easily misused by businesses when researching the firm's customers.

  17. Effective Materials Property Information Management for the 21st Century

    NASA Technical Reports Server (NTRS)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2009-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fueled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are being driven by the need for consistency, quality and traceability of data, as well as control of access to sensitive information such as proprietary data. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of the economy often generates great needs for sharing a single "gold source" of materials information between members of global engineering teams in extended supply chains. Fortunately, material property management systems have kept pace with the growing user demands and evolved to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data "pedigree" traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems, future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, as well as functionalities and business models to fuel database growth and maintenance are discussed.

  18. Historical data recording for process computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, J.C.; Sellars, H.L.

    1981-11-01

    Computers have been used to monitor and control chemical and refining processes for more than 15 years. During this time, there has been a steady growth in the variety and sophistication of the functions performed by these process computers. Early systems were limited to maintaining only current operating measurements, available through crude operator's consoles or noisy teletypes. The value of retaining a process history, that is, a collection of measurements over time, became apparent, and early efforts produced shift and daily summary reports. The need for improved process historians which record, retrieve and display process information has grown as process computers assume larger responsibilities in plant operations. This paper describes newly developed process historian functions that have been used on several of Du Pont's in-house process monitoring and control systems. 3 refs.
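    A process historian of the kind described reduces, at its core, to recording timestamped samples and retrieving them over a time window. A minimal sketch (the tag name and capacity are invented):

    ```python
    # Minimal process historian: append timestamped measurements to a
    # bounded buffer and retrieve a tag's samples within a time window.
    from collections import deque

    class Historian:
        def __init__(self, capacity=1000):
            self._buf = deque(maxlen=capacity)   # oldest samples roll off

        def record(self, t, tag, value):
            self._buf.append((t, tag, value))

        def retrieve(self, tag, t_start, t_end):
            return [(t, v) for (t, tg, v) in self._buf
                    if tg == tag and t_start <= t <= t_end]

    h = Historian()
    for t in range(10):
        h.record(t, 'reactor_temp', 300 + t)
    print(h.retrieve('reactor_temp', 3, 5))   # -> [(3, 303), (4, 304), (5, 305)]
    ```

    Production historians add compression, interpolation, and persistent storage on top of this basic record/retrieve pattern.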

  19. Frequency domain laser velocimeter signal processor

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Murphy, R. Jay

    1991-01-01

    A new scheme for processing signals from laser velocimeter systems is described. The technique utilizes the capabilities of advanced digital electronics to yield a signal processor capable of operating in the frequency domain maximizing the information obtainable from each signal burst. This allows a sophisticated approach to signal detection and processing, with a more accurate measurement of the chirp frequency resulting in an eight-fold increase in measurable signals over the present high-speed burst counter technology. Further, the required signal-to-noise ratio is reduced by a factor of 32, allowing measurements within boundary layers of wind tunnel models. Measurement accuracy is also increased up to a factor of five.
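    The core frequency-domain operation the abstract describes, estimating a burst's dominant frequency from its spectrum, can be sketched with a direct DFT peak search. Sample rate and signal below are invented; a real processor would use an FFT and sub-bin interpolation.

    ```python
    # Estimate a burst's dominant frequency by locating the largest
    # magnitude bin of its discrete Fourier transform.
    import cmath
    import math

    def dominant_frequency(signal, sample_rate):
        """Return the frequency (Hz) of the largest positive-frequency bin."""
        n = len(signal)
        best_k, best_mag = 0, 0.0
        for k in range(1, n // 2):           # skip DC, positive bins only
            s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            if abs(s) > best_mag:
                best_k, best_mag = k, abs(s)
        return best_k * sample_rate / n

    fs = 1000.0                               # illustrative sample rate (Hz)
    sig = [math.sin(2 * math.pi * 125.0 * t / fs) for t in range(64)]
    print(dominant_frequency(sig, fs))        # -> 125.0
    ```

    Resolving the burst in the frequency domain rather than counting zero crossings is what gives such processors their tolerance to low signal-to-noise ratios.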

  20. Students' Scientific Epistemic Beliefs, Online Evaluative Standards, and Online Searching Strategies for Science Information: The Moderating Role of Cognitive Load Experience

    NASA Astrophysics Data System (ADS)

    Hsieh, Ya-Hui; Tsai, Chin-Chung

    2014-06-01

    The purpose of this study is to examine the moderating role of cognitive load experience between students' scientific epistemic beliefs and information commitments, which refer to online evaluative standards and online searching strategies. A total of 344 science-related major students participated in this study. Three questionnaires were used to ascertain the students' scientific epistemic beliefs, information commitments, and cognitive load experience. Structural equation modeling was then used to analyze the moderating effect of cognitive load, with the results revealing its significant moderating effect. The relationships between sophisticated scientific epistemic beliefs and the advanced evaluative standards used by the students were significantly stronger for low than for high cognitive load students. Moreover, considering the searching strategies that the students used, the relationships between sophisticated scientific epistemic beliefs and advanced searching strategies were also stronger for low than for high cognitive load students. However, for the high cognitive load students, only one of the sophisticated scientific epistemic belief dimensions was found to positively associate with advanced evaluative standard dimensions.

  1. A novel paradigm for telemedicine using the personal bio-monitor.

    PubMed

    Bhatikar, Sanjay R; Mahajan, Roop L; DeGroff, Curt

    2002-01-01

    The foray of solid-state technology into the medical field has yielded an arsenal of sophisticated healthcare tools. Personal, portable computing power coupled with the information superhighway opens up the possibility of sophisticated healthcare management that will affect the medical field just as much. The full synergistic potential of three interwoven technologies: (1) compact electronics, (2) the World Wide Web, and (3) Artificial Intelligence is yet to be realized. The system presented in this paper integrates these technologies synergistically, providing a new paradigm for healthcare. Our idea is to deploy internet-enabled, intelligent, handheld personal computers for medical diagnosis. The salient features of the 'Personal Bio-Monitor' we envisage are: (1) utilization of the peripheral signals of the body, which may be acquired non-invasively and with ease, for diagnosis of medical conditions; (2) an Artificial Neural Network (ANN) based approach to diagnosis; (3) configuration of the diagnostic device as a handheld for personal use; and (4) internet connectivity, following the emerging Bluetooth protocol, for prompt conveyance of information to a patient's healthcare provider via the World Wide Web. The proposal is substantiated with an intelligent handheld device developed by the investigators for pediatric cardiac auscultation. This device performed accurate diagnoses of cardiac abnormalities in pediatric patients, using an artificial neural network to process heart sounds acquired by a low-frequency microphone, and transmitted its diagnosis to a desktop PC via infrared. The idea of the Personal Bio-Monitor presented here has the potential to streamline healthcare by optimizing two valuable resources: physicians' time and sophisticated equipment time. We show with our prototype that the elements of such a system are in place. Our novel contribution is the synergistic integration of compact electronics technology, artificial neural network methodology, and the wireless web, resulting in a revolutionary new paradigm for healthcare management.

  2. Iontronics

    NASA Astrophysics Data System (ADS)

    Chun, Honggu; Chung, Taek Dong

    2015-07-01

    Iontronics is an emerging technology based on the sophisticated control of ions as signal carriers, bridging solid-state electronics and biological systems. It is found in nature, e.g., in the information transduction and processing of the brain, in which neurons are dynamically polarized or depolarized by ion transport across cell membranes. This suggests the operating principle of aqueous circuits made of predesigned structures and functional materials that characteristically interact with ions of various charge, mobility, and affinity. Working in aqueous environments, iontronic devices offer profound implications for biocompatible or biodegradable logic circuits for sensing, eco-friendly monitoring, and brain-machine interfacing. Furthermore, iontronics based on multi-ionic carriers sheds light on futuristic biomimetic information processing. In this review, we survey the historical achievements and the current state of iontronics with regard to theory, fabrication, integration, and applications, concluding with comments on where the technology may advance.

  3. Three Essays on Information Technology Security Management in Organizations

    ERIC Educational Resources Information Center

    Gupta, Manish

    2011-01-01

    The increasing complexity and sophistication of ever-evolving information technologies have spurred unique and unprecedented challenges for organizations seeking to protect their information assets. Companies suffer significant financial and reputational damage due to ineffective information technology security management, which has extensively been shown to…

  4. Effective Materials Property Information Management for the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Cebon, David; Barabash, Oleg M

    2011-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fuelled by demands for higher efficiency in material testing, product design, and engineering analysis. Equally important, organizations are driven by the needs for consistency, quality, and traceability of data, as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic, and multi-scale engineering analyses requires both the processing of large volumes of test data for the development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. Finally, the globalization of the economy often generates a great need to share a single gold source of materials information between members of global engineering teams in extended supply chains. Fortunately, materials property management systems have kept pace with growing user demands and evolved into versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export, and analysis capabilities; (iii) data pedigree traceability mechanisms; (iv) data searching, reporting, and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems, as well as future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, and functionalities and business models to fuel database growth and maintenance, are discussed.

  5. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers.

    PubMed

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues, since in Finnish vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks. However, musically sophisticated speakers do show enhanced pitch discrimination compared to Finnish speakers with less musical experience, and greater duration modulation in a complex task. These results are consistent with a ceiling effect for certain sound features that corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in real-world musical situations. These results have implications for research into the specificity of plasticity in the auditory system, as well as into the effects of the interaction of specific language features with musical experience.

  6. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers

    PubMed Central

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues, since in Finnish vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks. However, musically sophisticated speakers do show enhanced pitch discrimination compared to Finnish speakers with less musical experience, and greater duration modulation in a complex task. These results are consistent with a ceiling effect for certain sound features that corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in real-world musical situations. These results have implications for research into the specificity of plasticity in the auditory system, as well as into the effects of the interaction of specific language features with musical experience. PMID:28450829

  7. Changing epistemological beliefs: the unexpected impact of a short-term intervention.

    PubMed

    Kienhues, Dorothe; Bromme, Rainer; Stahl, Elmar

    2008-12-01

    Previous research has shown that sophisticated epistemological beliefs exert a positive influence on students' learning strategies and learning outcomes. This gives a clear educational relevance to studies on the development of methods for promoting a change in epistemological beliefs and making them more sophisticated. To investigate the potential for influencing domain-specific epistemological beliefs through a short instructional intervention. On the basis of their performance on a preliminary survey of epistemological beliefs, 58 students at a German university (87.7% females) with a mean age of 21.86 years (SD=2.88) were selected. Half of them had more naive beliefs and the other half had more sophisticated ones. Participants were randomly assigned to one of two groups: one whose epistemological beliefs were challenged through refutational epistemological instruction or another receiving non-challenging informational instruction. The treatment effect was assessed by comparing pre- and post-instructional scores on two instruments tapping different layers of epistemological beliefs (DEBQ and CAEB). Data were subjected to factor analyses and analyses of variance. According to the CAEB, the naive group receiving the refutational epistemological instruction changed towards a more sophisticated view, whereas the sophisticated students receiving the informational instruction changed towards an unintended, more naive standpoint. According to the DEBQ, all research groups except the naive refutational group revealed changes towards a more naive view. This study indicates the possibility of changing domain-specific epistemological beliefs through a short-term intervention. However, it questions the stability and elaborateness of domain-specific epistemological beliefs, particularly when domain knowledge is shallow.

  8. Pyramidal neurovision architecture for vision machines

    NASA Astrophysics Data System (ADS)

    Gupta, Madan M.; Knopf, George K.

    1993-08-01

    The vision system employed by an intelligent robot must be active: active in the sense that it must be capable of selectively acquiring the minimal amount of relevant information for a given task. An efficient active vision system architecture, based loosely upon the parallel-hierarchical (pyramidal) structure of the biological visual pathway, is presented in this paper. Although the computational architecture of the proposed pyramidal neuro-vision system is far less sophisticated than that of the biological visual pathway, it does retain some essential features, such as the converging multilayered structure of its biological counterpart. In terms of visual information processing, the neuro-vision system is constructed from a hierarchy of several interactive computational levels, where each level contains one or more nonlinear parallel processors. Computationally efficient vision machines can be developed by utilizing both parallel and serial information processing techniques within the pyramidal computing architecture. A computer simulation of a pyramidal vision system for active scene surveillance is presented.
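    The converging multilayered structure described above can be sketched as a simple image pyramid, where each level is a coarser, block-averaged view of the one below. This is an illustrative toy; the paper's levels contain nonlinear parallel processors, which plain averaging omits:

```python
import numpy as np

def build_pyramid(image, levels):
    """Build a converging image pyramid by repeated 2x2 block averaging.

    Each level halves the resolution, mimicking the coarse-to-fine,
    multilayered convergence of the biological visual pathway.
    """
    pyramid = [image]
    for _ in range(levels - 1):
        img = pyramid[-1]
        h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2  # crop odd dims
        img = img[:h, :w]
        coarser = (img[0::2, 0::2] + img[1::2, 0::2] +
                   img[0::2, 1::2] + img[1::2, 1::2]) / 4.0
        pyramid.append(coarser)
    return pyramid

levels = build_pyramid(np.ones((64, 64)), 4)
print([lvl.shape for lvl in levels])  # [(64, 64), (32, 32), (16, 16), (8, 8)]
```

    An attention mechanism can then inspect the cheap coarse levels first and descend to finer levels only where something relevant appears, which is what makes such an architecture "active".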

  9. The interactive digital video interface

    NASA Technical Reports Server (NTRS)

    Doyle, Michael D.

    1989-01-01

    A frequent complaint in computer-oriented trade journals is that current hardware technology is progressing so quickly that software developers cannot keep up. An example of this phenomenon can be seen in the field of microcomputer graphics. To exploit the advantages of new mechanisms of information storage and retrieval, new approaches must be taken towards incorporating existing programs as well as developing entirely new applications. A particular area of need is the correlation of discrete image elements to textual information. The interactive digital video (IDV) interface embodies a new concept in software design which addresses these needs. The IDV interface is a patented, device- and language-independent process for identifying image features on a digital video display, which allows a number of different processes to be keyed to that identification. Its capabilities include the correlation of discrete image elements to relevant text information and the correlation of these image features to other images as well as to program control mechanisms. Sophisticated interrelationships can be set up between images, text, and program control mechanisms.

  10. New method for identifying features of an image on a digital video display

    NASA Astrophysics Data System (ADS)

    Doyle, Michael D.

    1991-04-01

    The MetaMap process extends the concept of direct-manipulation human-computer interfaces to new limits. Its specific capabilities include the correlation of discrete image elements to relevant text information and the correlation of these image features to other images as well as to program control mechanisms. The correlation is accomplished through reprogramming of both the color map and the image so that discrete image elements comprise unique sets of color indices. This allows the correlation to be accomplished with very efficient data storage and program execution times. Image databases adapted to this process become object-oriented as a result. Very sophisticated interrelationships can be set up between images, text, and program control mechanisms using this process. An application of this interfacing process to the design of an interactive atlas of medical histology, as well as other possible applications, are described. The MetaMap process is protected by U. S. patent #4
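    The color-index correlation at the heart of the process can be sketched roughly as follows. This is a hypothetical illustration of the idea, not the patented implementation; the feature name and index values are invented:

```python
import numpy as np

# Sketch of a color-index lookup in the spirit of MetaMap: the indexed
# image stores one palette index per pixel, and each index is reserved
# for one image feature, so a pixel lookup resolves directly to the
# feature's text record with no per-pixel metadata storage.
indexed_image = np.zeros((4, 4), dtype=np.uint8)
indexed_image[1:3, 1:3] = 7          # this feature region uses index 7

feature_text = {7: "islet of Langerhans: endocrine cell cluster"}

def describe_pixel(row, col):
    """Return the text linked to the feature under the given pixel."""
    return feature_text.get(int(indexed_image[row, col]), "background")

print(describe_pixel(2, 2))   # inside the feature region
print(describe_pixel(0, 0))   # outside it
```

    Because the index also selects a palette entry, the same mechanism can highlight a feature on screen (by rewriting its color map entries) without touching the image data itself.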

  11. Deciding to opt out of childhood vaccination mandates.

    PubMed

    Gullion, Jessica Smartt; Henry, Lisa; Gullion, Greg

    2008-01-01

    We explore the attitudes and beliefs of parents who consciously choose not to vaccinate their children and the ways in which these parents process information on the pros and cons of vaccines. In-depth, semistructured interviews were conducted. The study population consisted of 25 parents who do not vaccinate their children, identified through snowball and targeted sampling. Participants were asked about their processes and actions when choosing not to vaccinate their children. Interviews were taped and transcribed, and the content was analyzed for emergent themes. Two predominant themes emerged in our data: a desire to collect information on vaccines and trust issues with the medical community. Evidence of sophisticated data collection and information processing was a repeated theme in the interview data. Simultaneously, while participants placed a high value on scientific knowledge, they also expressed high levels of distrust of the medical community. The challenge for public health is to balance scientific data with popular epidemiology and to maintain legitimacy. Understanding the differences in lay versus expert knowledge has implications for crafting health messages. How experts frame knowledge for consumption has an important impact on this group and their decision-making processes.

  12. Smart Manufacturing Technologies and Data Analytics for Improving Energy Efficiency in Industrial Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nimbalkar, Sachin U.; Guo, Wei; Wenning, Thomas J.

    Smart manufacturing and advanced data analytics can help the manufacturing sector unlock energy efficiency from the equipment level to the entire manufacturing facility and the whole supply chain. These technologies can make manufacturing industries more competitive, with intelligent communication systems, real-time energy savings, and increased energy productivity. Smart manufacturing can give all employees in an organization the actionable information they need, when they need it, so that each person can contribute to the optimal operation of the corporation through informed, data-driven decision making. This paper examines smart technologies and data analytics approaches for improving energy efficiency and reducing energy costs in process-supporting energy systems. It dives into energy-saving improvement opportunities through smart manufacturing technologies and sophisticated data collection and analysis. The energy systems covered in this paper include those with motors and drives, fans, pumps, air compressors, steam, and process heating.

  13. Earth Science Informatics Comes of Age

    NASA Technical Reports Server (NTRS)

    Jodha, Siri; Khalsa, S.; Ramachandran, Rahul

    2014-01-01

    The volume and complexity of Earth science data have steadily increased, placing ever-greater demands on researchers, software developers, and data managers tasked with handling such data. Additional demands arise from requirements being levied by funding agencies and governments to better manage, preserve, and provide open access to data. Fortunately, over the past 10-15 years significant advances in information technology, such as increased processing power, advanced programming languages, more sophisticated and practical standards, and near-ubiquitous internet access, have made the jobs of those acquiring, processing, distributing, and archiving data easier. These advances have also led to an increasing number of individuals entering the field of informatics as it applies to Geoscience and Remote Sensing. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of data, information, and knowledge. Informatics also encompasses the use of computers and computational methods to support decision-making and other applications for societal benefit.

  14. International Guide to Highway Transportation Information: Volume 1 - Highway Transportation Libraries and Information Centers

    DOT National Transportation Integrated Search

    2013-01-01

    The FHWA Road Weather Management Program partnered with Utah DOT to develop and implement advanced traveler information strategies during weather events. UDOT already has one of the most sophisticated Traffic Operations Centers (TOCs) in the country ...

  15. High-Tech Conservation: Information-Age Tools Have Revolutionized the Work of Ecologists.

    ERIC Educational Resources Information Center

    Chiles, James R.

    1992-01-01

    Describes a new direction for conservation efforts influenced by the advance of the information age and the introduction of many technologically sophisticated information collecting devices. Devices include microscopic computer chips, miniature electronic components, and Earth-observation satellite. (MCO)

  16. 75 FR 63884 - Self-Regulatory Organizations; Municipal Securities Rulemaking Board; Order Approving Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-18

    ... unfairly allows institutional and sophisticated investors to more easily access information about a... information directly to EMMA is a more efficient way of disseminating information to investors, noting that... in the level of investor protection provided by the MSRB's information systems and [[Page 63886...

  17. A Snapshot of Serial Rape: An Investigation of Criminal Sophistication and Use of Force on Victim Injury and Severity of the Assault.

    PubMed

    de Heer, Brooke

    2016-02-01

    Prior research on rapes reported to law enforcement has identified criminal sophistication and the use of force against the victim as possible unique identifiers of serial rape versus one-time rape. This study sought to contribute to the current literature on reported serial rape by investigating how the rapist's level of criminal sophistication and use of force were associated with two important outcomes of rape: victim injury and the overall severity of the assault. In addition, it was evaluated whether rapist and victim ethnicity affected these relationships. A nationwide sample of serial rape cases reported to law enforcement, collected by the Federal Bureau of Investigation (FBI), was analyzed (108 rapists, 543 victims). Results indicated that serial rapists typically used a limited amount of force against the victim and displayed a high degree of criminal sophistication. In addition, the more criminally sophisticated the perpetrator was, the more sexual acts he performed on his victim. Finally, rapes involving a White rapist and a White victim were found to exhibit higher levels of criminal sophistication and were more severe in terms of the number and types of sexual acts committed. These findings provide a more in-depth understanding of serial rape that can inform both academics and practitioners in the field about contributors to victim injury and the severity of the assault. © The Author(s) 2014.

  18. Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs

    NASA Astrophysics Data System (ADS)

    O'Connor, Rory V.

    This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at the development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies, with the focus on SMEs that develop web applications as Management Information Systems rather than e-commerce sites, informational sites, online communities, or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.

  19. Trail pheromones: an integrative view of their role in social insect colony organization.

    PubMed

    Czaczkes, Tomer J; Grüter, Christoph; Ratnieks, Francis L W

    2015-01-07

    Trail pheromones do more than simply guide social insect workers from point A to point B. Recent research has revealed additional ways in which they help to regulate colony foraging, often via positive and negative feedback processes that influence the exploitation of the different resources that a colony has knowledge of. Trail pheromones are often complementary or synergistic with other information sources, such as individual memory. Pheromone trails can be composed of two or more pheromones with different functions, and information may be embedded in the trail network geometry. These findings indicate remarkable sophistication in how trail pheromones are used to regulate colony-level behavior, and how trail pheromones are used and deployed at the individual level.

  20. Digital video steganalysis exploiting collusion sensitivity

    NASA Astrophysics Data System (ADS)

    Budhia, Udit; Kundur, Deepa

    2004-09-01

    In this paper we present an effective steganalysis technique for digital video sequences based on the collusion attack. Steganalysis is the process of detecting, with high probability and low complexity, the presence of covert data in multimedia. Existing algorithms for steganalysis target the detection of covert information in still images; when applied directly to video sequences these approaches are suboptimal. In this paper, we present a method that overcomes this limitation by using redundant information present in the temporal domain to detect covert messages in the form of Gaussian watermarks. Our gains are achieved by exploiting the collusion attack, which has recently been studied in the field of digital video watermarking, and more sophisticated pattern recognition tools. Applications of our scheme include cybersecurity and cyberforensics.
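    The collusion idea, estimating the cover from temporally redundant frames so that a marked frame's residual stands out, can be sketched as follows. This is an illustrative toy, not the paper's detector; the frame counts, noise levels, and threshold are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Near-static clip: the same scene plus small per-frame camera noise,
# with an additive Gaussian watermark embedded in frame 0 only.
scene = rng.uniform(0, 255, size=(64, 64))
frames = np.stack([scene + rng.normal(0, 1, scene.shape) for _ in range(16)])
watermark = rng.normal(0, 5, scene.shape)
frames[0] += watermark                       # frame 0 carries covert data

def residual_var(frames, k):
    """Variance of frame k's residual against the average of the others.

    Averaging the other frames (the collusion step) estimates the cover;
    an unusually energetic residual suggests an embedded watermark.
    """
    others = np.delete(frames, k, axis=0).mean(axis=0)
    return (frames[k] - others).var()

print("covert data suspected:",
      residual_var(frames, 0) > 5 * residual_var(frames, 1))
```

    In practice the decision would come from a trained classifier over residual statistics rather than a fixed variance ratio, but the temporal redundancy is what makes video easier to steganalyze than still images.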

  1. Managers and Information Technology: A Case Study of Organizational Change in County Government.

    ERIC Educational Resources Information Center

    Rosenbaum, Howard

    1992-01-01

    Discussion of the interrelationship of information technology, managers, and users, and the structure of public sector organizations focuses on a case study that examined the organizational changes in the information services department of a large, urban county government as a result of the use of sophisticated information technologies. (11…

  2. Recent advances in non-LTE stellar atmosphere models

    NASA Astrophysics Data System (ADS)

    Sander, Andreas A. C.

    2017-11-01

    In the last decades, stellar atmosphere models have become a key tool in understanding massive stars. Applied for spectroscopic analysis, these models provide quantitative information on stellar wind properties as well as fundamental stellar parameters. The intricate non-LTE conditions in stellar winds dictate the development of adequately sophisticated model atmosphere codes. The increase in both computational power and our understanding of physical processes in stellar atmospheres has led to increasing complexity in the models. As a result, codes have emerged that can tackle a wide range of stellar and wind parameters. After a brief address of the fundamentals of stellar atmosphere modeling, the current stage of clumped and line-blanketed model atmospheres will be discussed. Finally, the path for the next generation of stellar atmosphere models will be outlined. Apart from discussing multi-dimensional approaches, I will emphasize the coupling of hydrodynamics with a sophisticated treatment of the radiative transfer. This next generation of models will be able to predict wind parameters from first principles, which could open new doors for our understanding of the various facets of massive star physics, evolution, and death.

  3. Data storage and retrieval.

    PubMed

    Kalisman, M; Kalisman, A

    1986-07-01

    The entire face of modern medical and surgical practice is being significantly affected by the application of technological developments to the practice of surgery--developments that will tie together such areas as information management and processing, robotics, communication networks, and computerized surgical equipment. The achievements in these areas will create a sophisticated, fully automatic system that will assist the plastic surgeon in many aspects of work, such as regular office activities, doctor-patient interaction, professional updating, communication, and even assistance during the operative process itself. It will be as simple as dialing a telephone today. When it is necessary to consult with other colleagues, a combined vocal and visual consulting network with other medical centers, as well as consulting computerized expert systems, will be available day and night as part of the communication services. The plastic surgical expert systems will store valuable information, based on the knowledge of the best human experts, on any important subtopics and will be accessed in a very friendly way. This will be an invaluable tool for residents in training, for emergency room work, and for simply getting a second opinion, even for the more experienced practitioner. All the electronic mail, professional magazines, and any other required professional information will flow between central and personal retrieval systems. The doctor, at a desired time in the privacy and comfort of his or her own home or office, can read the mail, make required changes, and store, send back, or distribute information, all in a speedy and efficient manner. The simulation of a planned surgery will give the surgeon the ability to prepare, and will prevent difficulties during complicated procedures through the luxury of a dry run, without any sequelae if certain expected outcomes fail to materialize. The preprogrammed control of sophisticated surgical equipment and the use of robotics will generate new operational possibilities for more complicated surgeries, which are now prevented owing to the surgeon's physical limitations. Information urgently required during the operation as a result of an unexpected situation will be available immediately from storage and retrieval systems, and real-time vocal and visual consulting with expert colleagues, often in remote locations, will bring the operative process itself into a new era.

  4. Relative cue encoding in the context of sophisticated models of categorization: Separating information from categorization.

    PubMed

    Apfelbaum, Keith S; McMurray, Bob

    2015-08-01

    Traditional studies of human categorization often treat the processes of encoding features and cues as peripheral to the question of how stimuli are categorized. However, in domains where the features and cues are less transparent, how information is encoded prior to categorization may constrain our understanding of the architecture of categorization. This is particularly true in speech perception, where acoustic cues to phonological categories are ambiguous and influenced by multiple factors. Here, it is crucial to consider the joint contributions of the information in the input and the categorization architecture. We contrasted accounts that argue for raw acoustic information encoding with accounts that posit that cues are encoded relative to expectations, and investigated how two categorization architectures-exemplar models and back-propagation parallel distributed processing models-deal with each kind of information. Relative encoding, akin to predictive coding, is a form of noise reduction, so it can be expected to improve model accuracy; however, like predictive coding, the use of relative encoding in speech perception by humans is controversial, so results are compared to patterns of human performance, rather than on the basis of overall accuracy. We found that, for both classes of models, in the vast majority of parameter settings, relative cues greatly helped the models approximate human performance. This suggests that expectation-relative processing is a crucial precursor step in phoneme categorization, and that understanding the information content is essential to understanding categorization processes.
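    The contrast between raw and expectation-relative cue encoding can be sketched with a toy talker-normalization example. This is hypothetical, not the authors' models or data; all parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# A raw acoustic cue (e.g., a frequency in Hz) is confounded by talker
# variation; encoding it relative to a per-talker expectation strips
# that variance before categorization, a form of noise reduction.
talker_means = rng.uniform(150, 250, size=20)      # Hz, one per talker
category_shift = np.tile([0.0, 40.0], 10)          # two phoneme categories
raw_cue = talker_means + category_shift + rng.normal(0, 5, 20)

relative_cue = raw_cue - talker_means              # expectation-relative

def separation(cue):
    """Between-category mean distance over within-category spread."""
    a, b = cue[category_shift == 0], cue[category_shift == 40]
    return abs(a.mean() - b.mean()) / np.sqrt((a.var() + b.var()) / 2)

print("relative encoding separates categories better:",
      separation(relative_cue) > separation(raw_cue))
```

    Either an exemplar model or a connectionist network categorizes the cleaner relative cues more easily, which is why the encoding step matters independently of the categorization architecture.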

  5. Relative cue encoding in the context of sophisticated models of categorization: Separating information from categorization

    PubMed Central

    McMurray, Bob

    2014-01-01

    Traditional studies of human categorization often treat the processes of encoding features and cues as peripheral to the question of how stimuli are categorized. However, in domains where the features and cues are less transparent, how information is encoded prior to categorization may constrain our understanding of the architecture of categorization. This is particularly true in speech perception, where acoustic cues to phonological categories are ambiguous and influenced by multiple factors. Here, it is crucial to consider the joint contributions of the information in the input and the categorization architecture. We contrasted accounts that argue for raw acoustic information encoding with accounts that posit that cues are encoded relative to expectations, and investigated how two categorization architectures—exemplar models and back-propagation parallel distributed processing models—deal with each kind of information. Relative encoding, akin to predictive coding, is a form of noise reduction, so it can be expected to improve model accuracy; however, like predictive coding, the use of relative encoding in speech perception by humans is controversial, so results are compared to patterns of human performance, rather than on the basis of overall accuracy. We found that, for both classes of models, in the vast majority of parameter settings, relative cues greatly helped the models approximate human performance. This suggests that expectation-relative processing is a crucial precursor step in phoneme categorization, and that understanding the information content is essential to understanding categorization processes. PMID:25475048

  6. Tailoring Information Strategies for Developing Countries: Some Latin American Experiences.

    ERIC Educational Resources Information Center

    Crowther, Warren

    This article addresses the conditions of developing countries which must be taken into account in developing information strategies for their public and educational institutions or projects. Its central argument is that newer information science concepts, although they demand technological and conceptual sophistication, can be useful in the…

  7. A Snapshot of the Electronic Transmission and Processing of Prescriptions project in the Iranian Social Security Organization

    PubMed Central

    Moghaddam, Ramin; Badredine, Hala

    2006-01-01

    The Iranian Social Security Organization (ISSO) aims to enable the sharing of health-related information in a secure environment, by means of reliable data delivered at the right time, to improve the health of insured people throughout the country. There are around 7,000 pharmacies throughout the country with which ISSO has contracted in order to deliver seamless services to 30 million insured people. Managing the huge volume of prescriptions on a scientific basis, while considering the financial pressure of rising medication costs, certainly requires sophisticated business process reengineering using ICT; this work is expected to be completed in the ISSO within the next few months. PMID:17238655

  8. Analysis of Qualitative Interviews about the Impact of Information Technology on Pressure Ulcer Prevention Programs: Implications for the Wound Ostomy Continence Nurse

    PubMed Central

    Shepherd, Marilyn Murphy; Wipke-Tevis, Deidre D.; Alexander, Gregory L.

    2015-01-01

    Purpose: The purpose of this study was to compare pressure ulcer prevention programs in 2 long-term care (LTC) facilities with diverse information technology sophistication (ITS), one with high sophistication and one with low sophistication, and to identify implications for the Wound Ostomy Continence (WOC) Nurse. Design: Secondary analysis of narrative data obtained from a mixed methods study. Subjects and Setting: The study setting was 2 LTC facilities in the Midwestern United States. The sample comprised 39 staff from the 2 facilities, including 26 from the high ITS facility and 13 from the low ITS facility. Respondents included Certified Nurse Assistants, Certified Medical Technicians, Restorative Medical Technicians, Social Workers, Registered Nurses, Licensed Practical Nurses, Information Technology staff, Administrators, and Directors. Methods: This study is a secondary analysis of interviews regarding communication and education strategies in two long-term care agencies. This analysis focused on focus group interviews, which included both direct and non-direct care providers. Results: Eight themes (codes) were identified in the analysis. Three themes are presented individually with exemplars of communication and education strategies. The analysis revealed specific differences between the high ITS and low ITS facilities with regard to education and communication involving pressure ulcer prevention. These differences have direct implications for WOC nurses consulting in the LTC setting. Conclusions: Findings from this study suggest that effective strategies for staff education and communication regarding pressure ulcer prevention differ based on the level of ITS within a given facility. Specific strategies for education and communication are suggested for agencies with high ITS and agencies with low ITS. PMID:25945822

  9. Mechanical break junctions: enormous information in a nanoscale package.

    PubMed

    Natelson, Douglas

    2012-04-24

    Mechanical break junctions, particularly those in which a metal tip is repeatedly moved in and out of contact with a metal film, have provided many insights into electronic conduction at the atomic and molecular scale, most often by averaging over many possible junction configurations. This averaging throws away a great deal of information, and Makk et al. in this issue of ACS Nano demonstrate that, with both simulated and real experimental data, more sophisticated two-dimensional analysis methods can reveal information otherwise obscured in simple histograms. As additional measured quantities come into play in break junction experiments, including thermopower, noise, and optical response, these more sophisticated analytic approaches are likely to become even more powerful. While break junctions are not directly practical for useful electronic devices, they are incredibly valuable tools for unraveling the electronic transport physics relevant for ultrascaled nanoelectronics.
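
    The advantage of two-dimensional analysis over simple histograms can be sketched with synthetic traces (hypothetical plateau shapes, not the Makk et al. data): a 1D conductance histogram discards where along the pull each count occurred, while a 2D displacement-conductance histogram preserves that correlation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic break-junction traces: conductance vs. electrode displacement.
# Each "pull" has a plateau of high conductance that breaks at a random point.
n_traces, n_points = 200, 100
displacement = np.linspace(0.0, 1.0, n_points)
traces = []
for _ in range(n_traces):
    plateau_end = rng.uniform(0.3, 0.6)                  # breaking point varies per pull
    g = np.where(displacement < plateau_end, 1.0, 0.01)  # plateau, then tunneling regime
    g = g * np.exp(rng.normal(0.0, 0.05, n_points))      # multiplicative noise
    traces.append(g)
traces = np.array(traces)
log_g = np.log10(traces)

# 1D histogram: conductance counts only (averages over displacement).
hist_1d, _ = np.histogram(log_g, bins=40)

# 2D histogram: the same counts, resolved along displacement, so the
# position and spread of the plateau survive the averaging.
x = np.tile(displacement, n_traces)
hist_2d, _, _ = np.histogram2d(x, log_g.ravel(), bins=(20, 40))
```

Both histograms contain the same total counts; only the 2D version retains where along the pull each conductance value was observed.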

  10. Accomplishing Mars exploration goals by returning a simple "locality" sample

    NASA Astrophysics Data System (ADS)

    McKay, G.; Draper, D.; Bogard, D.; Agee, C.; Ming, D.; Jones, J.

    A major stumbling block to a Mars sample return (MSR) mission is cost. This problem is greatly exacerbated by using elaborate rovers, sophisticated on-board instruments, and complex sample selection techniques to maximize diversity. We argue that many key science goals of the Mars Exploration Program may be accomplished by returning a simple "locality" sample from a well-chosen landing site. Such a sample, collected by a simple scoop, would consist of local regolith containing soil, windblown fines, and lithic fragments (plus Martian atmosphere). Even the simplest sample return mission could revolutionize our understanding of Mars, without the need for expensive rovers or sophisticated on-board instruments. We expect that by the time a MSR mission could be flown, information from the Mars Odyssey, Mars Express, 2003 Mars Exploration Rovers, and 2005 Mars Reconnaissance Orbiter will be sufficient to choose a good landing site. Returned samples of Martian regolith have the potential to answer key questions of fundamental importance to the Mars Exploration Program: The search for life; the role and history of water and other volatiles; interpreting remotely-sensed spectral data; and understanding the planet as a system. A locality sample can further the search for life by identifying trace organics, biogenic elements and their isotopic compositions, evidence for water such as hydrous minerals or cements, the Martian soil oxidant, and trace biomarkers. Learning the nature and timing of atmosphere-soil-rock interactions will improve understanding of the role and history of water. An atmosphere sample will reveal fundamental information about current atmospheric processes. Information about the mineralogy and lithology of sample materials, the extent of impact gardening, and the nature of dust coatings and alteration rinds will provide much-needed ground truth for interpreting remotely-sensed data, including Mars Pathfinder.
Basic planetology questions that might be answered include the compositions and ages of the highlands or lowlands, and how wet Mars was at various times in its history. By bringing a simple locality sample back for analysis in the world's best labs, using the world's most sophisticated state-of-the-art instruments, we can make breakthrough progress in addressing fundamental questions about Mars.

  11. MorphoHawk: Geometric-based Software for Manufacturing and More

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keith Arterburn

    2001-04-01

    Hollywood movies portray facial recognition as a perfected technology, but the reality is that sophisticated computers and algorithmic calculations are far from perfect. In fact, the most sophisticated and successful computer for recognizing faces and other imagery is already the human brain, with more than 10 billion nerve cells. Beginning at birth, humans process data and connect optical and sensory experiences, creating an unparalleled accumulation of data that allows people to associate images with life experiences, emotions, and knowledge. Computers are powerful, rapid, and tireless, but still cannot compare to the highly sophisticated relational calculations and associations that the human computer can produce in connecting ‘what we see with what we know.’

  12. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  13. Automated speech understanding: the next generation

    NASA Astrophysics Data System (ADS)

    Picone, J.; Ebel, W. J.; Deshmukh, N.

    1995-04-01

    Modern speech understanding systems merge interdisciplinary technologies from Signal Processing, Pattern Recognition, Natural Language, and Linguistics into a unified statistical framework. These systems, which have applications in a wide range of signal processing problems, represent a revolution in Digital Signal Processing (DSP). Once a field dominated by vector-oriented processors and linear algebra-based mathematics, the current generation of DSP-based systems rely on sophisticated statistical models implemented using a complex software paradigm. Such systems are now capable of understanding continuous speech input for vocabularies of several thousand words in operational environments. The current generation of deployed systems, based on small vocabularies of isolated words, will soon be replaced by a new technology offering natural language access to vast information resources such as the Internet, and provide completely automated voice interfaces for mundane tasks such as travel planning and directory assistance.

  14. Publicly disclosed information about the quality of health care: response of the US public

    PubMed Central

    Schneider, E; Lieberman, T

    2001-01-01

    Public disclosure of information about the quality of health plans, hospitals, and doctors continues to be controversial. The US experience of the past decade suggests that sophisticated quality measures and reporting systems that disclose information on quality have improved the process and outcomes of care in limited ways in some settings, but these efforts have not led to the "consumer choice" market envisaged. Important reasons for this failure include limited salience of objective measures to consumers, the complexity of the task of interpretation, and insufficient use of quality results by organised purchasers and insurers to inform contracting and pricing decisions. Nevertheless, public disclosure may motivate quality managers and providers to undertake changes that improve the delivery of care. Efforts to measure and report information about quality should remain public, but may be most effective if they are targeted to the needs of institutional and individual providers of care. Key Words: public disclosure; quality of health care; quality improvement PMID:11389318

  15. Curriculum Design in Health Education

    ERIC Educational Resources Information Center

    Conceicao, Simone C. O.; Colby, Holly; Juhlmann, Anne; Johaningsmeir, Sarah

    2011-01-01

    While health care providers are knowledgeable of health conditions and of the information patients need to make appropriate health decisions and follow health providers' recommendations, they lack information about adult teaching and learning and appropriate curriculum design. Adult educators can contribute more sophisticated skills in program…

  16. Natural language processing and advanced information management

    NASA Technical Reports Server (NTRS)

    Hoard, James E.

    1989-01-01

    Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides a natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object-oriented database (OODB). OODBs have, apparently, an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system and an inherent capacity to use them in an effective way.

  17. Science Language Accommodation in Elementary School Read-Alouds

    NASA Astrophysics Data System (ADS)

    Glass, Rory; Oliveira, Alandeom W.

    2014-03-01

    This study examines the pedagogical functions of accommodation (i.e. provision of simplified science speech) in science read-aloud sessions facilitated by five elementary teachers. We conceive of read-alouds as communicative events wherein teachers, faced with the task of orally delivering a science text of relatively high linguistic complexity, open up an alternate channel of communication, namely oral discussion. By doing so, teachers grant students access to a simplified linguistic input, a strategy designed to promote student comprehension of the textual contents of children's science books. It was found that nearly half (46%) of the read-aloud time was allotted to discussion, which featured a higher percentage of less sophisticated words and reduced use of more sophisticated vocabulary than the books themselves, through communicative strategies such as simplified rewording, simplified definition, and simplified questioning. Further, aloud reading of more linguistically complex books required longer periods of discussion and an increased degree of teacher oral input and accommodation. We also found evidence of reversed simplification (i.e. sophistication), leading to student uptake of scientific language. The main significance of this study is that it reveals that teacher talk serves two often competing pedagogical functions: accessible communication of scientific information to students, and promotion of student acquisition of the specialized language of science. It also underscores the importance of giving analytical consideration to the simplification-sophistication dimension of science classroom discourse, as well as the potential of computer-based analysis of classroom discourse to inform science teaching.

  18. Benchmarking and beyond. Information trends in home care.

    PubMed

    Twiss, Amanda; Rooney, Heather; Lang, Christine

    2002-11-01

    With today's benchmarking concepts and tools, agencies have the unprecedented opportunity to use information as a strategic advantage. Because agencies are demanding more and better information, benchmark functionality has grown increasingly sophisticated. Agencies now require a new type of analysis, focused on high-level executive summaries while reducing the current "data overload."

  19. Technology and informal education: what is taught, what is learned.

    PubMed

    Greenfield, Patricia M

    2009-01-02

    The informal learning environments of television, video games, and the Internet are producing learners with a new profile of cognitive skills. This profile features widespread and sophisticated development of visual-spatial skills, such as iconic representation and spatial visualization. A pressing social problem is the prevalence of violent video games, leading to desensitization, aggressive behavior, and gender inequity in opportunities to develop visual-spatial skills. Formal education must adapt to these changes, taking advantage of new strengths in visual-spatial intelligence and compensating for new weaknesses in higher-order cognitive processes: abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination. These develop through the use of an older technology, reading, which, along with audio media such as radio, also stimulates imagination. Informal education therefore requires a balanced media diet using each technology's specific strengths in order to develop a complete profile of cognitive skills.

  20. Study on the supply chain of an enterprise based on axiomatic design

    NASA Astrophysics Data System (ADS)

    Fan, Shu-hai; Lin, Chao-qun; Ji, Chun; Zhou, Ce; Chen, Peng

    2018-06-01

    This paper first expounds the basic theory of axiomatic design, and then designs and improves an enterprise supply chain through its two design axioms (the independence axiom and the information axiom). Under the independence axiom, the designer must identify the needs and problems to be solved, determine the top-level goals, decompose those goals, and derive the corresponding design equations. In applying the information axiom, the concept of a cloud model is used to quantify the amount of information, and two candidate schemes are evaluated and compared. Finally, through axiomatic design, the best solution for improving the supply chain design is obtained. Axiomatic design is a generic, systematic, and sophisticated approach to design that addresses the needs of different customers, and using it to improve the level of supply chain management is creative. As a mature method, it makes the process efficient and convenient.
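
    The information axiom mentioned above is conventionally formulated as I = log2(1/p), where p is the probability that a design satisfies its functional requirement and the scheme with the lowest total information content is preferred. A minimal sketch (the probabilities below are invented purely for illustration, not drawn from the paper):

```python
import math

def information_content(p_success):
    # Information axiom: I = log2(1 / p); lower total information
    # means a higher probability of satisfying all requirements.
    return math.log2(1.0 / p_success)

# Hypothetical probabilities that each functional requirement of a
# supply chain scheme is met (illustrative values only).
scheme_a = [0.95, 0.90, 0.85]
scheme_b = [0.99, 0.80, 0.90]

total_a = sum(information_content(p) for p in scheme_a)
total_b = sum(information_content(p) for p in scheme_b)

# The preferred scheme is the one with less total information content.
best = "A" if total_a < total_b else "B"
```

Because the per-requirement terms add, a single low-probability requirement (here the 0.80 in scheme B) can outweigh several strong ones, which is exactly the trade-off the axiom is meant to expose.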

  1. Handling of huge multispectral image data volumes from a spectral hole burning device (SHBD)

    NASA Astrophysics Data System (ADS)

    Graff, Werner; Rosselet, Armel C.; Wild, Urs P.; Gschwind, Rudolf; Keller, Christoph U.

    1995-06-01

    We use chlorin-doped polymer films at low temperatures as the primary imaging detector. Based on the principles of persistent spectral hole burning, this system is capable of storing spatial and spectral information simultaneously in one exposure with extremely high resolution. The sun as an extended light source has been imaged onto the film. The information recorded amounts to tens of GBytes. This data volume is read out by scanning the frequency of a tunable dye laser and reading the images with a digital CCD camera. For acquisition, archival, processing, and visualization, we use MUSIC (MUlti processor System with Intelligent Communication), a single instruction multiple data parallel processor system equipped with the necessary I/O facilities. The huge amount of data requires the development of sophisticated algorithms to efficiently calibrate the data and to extract useful and new information for solar physics.

  2. Development of Cloud and Precipitation Property Retrieval Algorithms and Measurement Simulators from ASR Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, Gerald G.

    What has made the ASR program unique is the amount of information that is available. The suite of recently deployed instruments significantly expands the scope of the program (Mather and Voyles, 2013). The breadth of this information allows us to pose sophisticated process-level questions. Our ASR project, now entering its third year, has been about developing algorithms that use this information in ways that fully exploit the new capacity of the ARM data streams. Using optimal estimation (OE) and Markov Chain Monte Carlo (MCMC) inversion techniques, we have developed methodologies that allow us to use multiple radar frequency Doppler spectra along with lidar and passive constraints, where data streams can be added or subtracted efficiently and algorithms can be reformulated for various combinations of hydrometeors by exchanging sets of empirical coefficients. These methodologies have been applied to boundary layer clouds, mixed phase snow cloud systems, and cirrus.
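
    The MCMC machinery the abstract refers to can be sketched in miniature: a Metropolis sampler inverting a toy linear forward model from noisy observations (the hydrometeor physics, multi-frequency spectra, and empirical coefficients of the actual algorithms are far beyond this illustration):

```python
import math
import random

random.seed(1)

# Toy forward model: each observation y = 2 * x plus Gaussian noise (sigma = 0.5).
true_x = 3.0
obs = [2.0 * true_x + random.gauss(0.0, 0.5) for _ in range(50)]

def log_likelihood(x):
    # Gaussian log-likelihood of the observations given parameter x
    # (constant terms omitted, since only differences matter below).
    return sum(-0.5 * ((y - 2.0 * x) / 0.5) ** 2 for y in obs)

# Metropolis sampler: propose a random step, accept with probability
# min(1, likelihood ratio), otherwise keep the current state.
x = 0.0
ll_x = log_likelihood(x)
samples = []
for _ in range(5000):
    prop = x + random.gauss(0.0, 0.2)
    ll_prop = log_likelihood(prop)
    if math.log(random.random()) < ll_prop - ll_x:
        x, ll_x = prop, ll_prop
    samples.append(x)

# Discard burn-in, then summarize the posterior.
posterior_mean = sum(samples[1000:]) / len(samples[1000:])
```

The chain drifts from its poor starting point toward the region supported by the data and then samples around it, so the post-burn-in mean recovers the true parameter; real retrievals replace the linear model with radiative-transfer or scattering forward models and sample many parameters jointly.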

  3. Less is (sometimes) more in cognitive engineering: the role of automation technology in improving patient safety

    PubMed Central

    Vicente, K

    2003-01-01

    

    There is a tendency to assume that medical error can be stamped out by automation. Technology may improve patient safety, but cognitive engineering research findings in several complex safety critical systems, including both aviation and health care, show that more is not always better. Less sophisticated technological systems can sometimes lead to better performance than more sophisticated systems. This "less is more" effect arises because safety critical systems are open systems where unanticipated events are bound to occur. In these contexts, decision support provided by a technological aid will be less than perfect because there will always be situations that the technology cannot accommodate. Designing sophisticated automation that suggests an uncertain course of action seems to encourage people to accept the imperfect advice, even though information to decide independently on a better course of action is available. It may be preferable to create more modest designs that merely provide feedback about the current state of affairs or that critique human generated solutions than to rush to automate by creating sophisticated technological systems that recommend (fallible) courses of action. PMID:12897363

  4. Electro-triggering and electrochemical monitoring of dopamine exocytosis from a single cell by using ultrathin electrodes based on Au nanowires

    NASA Astrophysics Data System (ADS)

    Kang, Mijeong; Yoo, Seung Min; Gwak, Raekeun; Eom, Gayoung; Kim, Jihwan; Lee, Sang Yup; Kim, Bongsoo

    2015-12-01

    A sophisticated set of an Au nanowire (NW) stimulator-Au NW detector system is developed for electrical cell stimulation and electrochemical analysis of subsequent exocytosis with very high spatial resolution. Dopamine release from a rat pheochromocytoma cell is more strongly stimulated by a more negative voltage pulse. This system could help to improve the therapeutic efficacy of electrotherapies by providing valuable information on their healing mechanism. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr06021d

  5. User-centered requirements engineering in health information systems: a study in the hemophilia field.

    PubMed

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2012-06-01

    The use of sophisticated information and communication technologies (ICTs) in the health care domain is a way to improve the quality of services. However, there are also hazards associated with the introduction of ICTs in this domain, and a great number of projects have failed due to the lack of systematic consideration of human and other non-technology issues throughout the design or implementation process, particularly in the requirements engineering process. This paper presents the methodological approach followed in the design process of a web-based information system (WbIS) for managing the clinical information in hemophilia care, which integrates the values and practices of user-centered design (UCD) activities into the principles of software engineering, particularly in the phase of requirements engineering (RE). This process followed a paradigm that combines grounded theory for data collection with an evolutionary design based on constant development and refinement of the generic domain model, using three well-known methodological approaches in a triangulated design: (a) object-oriented system analysis; (b) task analysis; and (c) prototyping. This approach seems to be a good solution for the requirements engineering process in this particular case of the health care domain, since the inherent weaknesses of the individual methods are reduced and emergent requirements are easier to elicit. Moreover, the requirements triangulation matrix offers the opportunity to look across the results of all the methods used and decide which requirements are critical for the system's success. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  6. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  7. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  8. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  9. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  10. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  11. The politics of participation in watershed modeling.

    PubMed

    Korfmacher, K S

    2001-02-01

    While researchers and decision-makers increasingly recognize the importance of public participation in environmental decision-making, there is less agreement about how to involve the public. One of the most controversial issues is how to involve citizens in producing scientific information. Although this question is relevant to many areas of environmental policy, it has come to the fore in watershed management. Increasingly, the public is becoming involved in the sophisticated computer modeling efforts that have been developed to inform watershed management decisions. These models typically have been treated as technical inputs to the policy process. However, model-building itself involves numerous assumptions, judgments, and decisions that are relevant to the public. This paper examines the politics of public involvement in watershed modeling efforts and proposes five guidelines for good practice for such efforts. Using these guidelines, I analyze four cases in which different approaches to public involvement in the modeling process have been attempted and make recommendations for future efforts to involve communities in watershed modeling. Copyright 2001 Springer-Verlag

  12. Hydrographic processing considerations in the “Big Data” age: An overview of technology trends in ocean and coastal surveys

    NASA Astrophysics Data System (ADS)

    Holland, M.; Hoggarth, A.; Nicholson, J.

    2016-04-01

    The quantity of information generated by survey sensors for ocean and coastal zone mapping has reached the “Big Data” age. This is influenced by the number of survey sensors available to conduct a survey, high data resolution, commercial availability, as well as an increased use of autonomous platforms. The number of users of sophisticated survey information is also growing with the increase in data volume. This is leading to a greater demand and broader use of the processed results, which includes marine archeology, disaster response, and many other applications. Data processing and exchange techniques are evolving to ensure this increased accuracy in acquired data meets the user demand, and leads to an improved understanding of the ocean environment. This includes the use of automated processing, models that maintain the best possible representation of varying resolution data to reduce duplication, as well as data plug-ins and interoperability standards. Through the adoption of interoperable standards, data can be exchanged between stakeholders and used many times in any GIS to support an even wider range of activities. The growing importance of Marine Spatial Data Infrastructure (MSDI) is also contributing to the increased access of marine information to support sustainable use of ocean and coastal environments. This paper offers an industry perspective on trends in hydrographic surveying and processing, and the increased use of marine spatial data.

  13. Sudden infant death syndrome: a cybernetic etiology.

    PubMed

    ben-Aaron, M

    2003-01-01

    The brain's processes, by hypothesis, involve information processing by an extraordinarily complex, highly sophisticated, self-organizing cybernetic system embedded in the central nervous system. This cybernetic system generates itself in successive stages. Breathing is, by default, an autonomous function, but breath control is learned. If there is not a smooth transfer of function at the time when a successor system (one that enables autonomous breathing to be overridden by voluntary control) takes over, breathing may cease without any overt cause being detectable, even with a thorough postmortem examination. If conditions are such that, at that point, the infant's body lacks the strength to resume breathing under autonomic control, Sudden Infant Death Syndrome may result. The theory explains why infants are at greater risk if they sleep face down.

  14. Conceptions of Scientific Knowledge Influence Learning of Academic Skills: Epistemic Beliefs and the Efficacy of Information Literacy Instruction

    ERIC Educational Resources Information Center

    Rosman, Tom; Peter, Johannes; Mayer, Anne-Kathrin; Krampen, Günter

    2018-01-01

    The present article investigates the effects of epistemic beliefs (i.e. beliefs about the nature of knowledge and knowing) on the effectiveness of information literacy instruction (i.e. instruction on how to search for scholarly information in academic settings). We expected psychology students with less sophisticated beliefs (especially…

  15. Requirements for SPIRES II. An External Specification for the Stanford Public Information Retrieval System.

    ERIC Educational Resources Information Center

    Parker, Edwin B.

    SPIRES (Stanford Public Information Retrieval System) is a computerized information storage and retrieval system intended for use by students and faculty members who have little knowledge of computers but who need rapid and sophisticated retrieval and analysis. The functions and capabilities of the system from the user's point of view are…

  16. A strategic informatics approach to autoverification.

    PubMed

    Jones, Jay B

    2013-03-01

    Autoverification is rapidly expanding with increased functionality provided by middleware tools. It is imperative that autoverification of laboratory test results be viewed as a process evolving into a broader, more sophisticated form of decision support, which will require strategic planning to form a foundational tool set for the laboratory. One must strategically plan to expand autoverification in the future to include a vision of instrument-generated order interfaces, reflexive testing, and interoperability with other information systems. It is hoped that the observations, examples, and opinions expressed in this article will stimulate such short-term and long-term strategic planning. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Simulating motivated cognition

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    A research effort to develop a sophisticated computer model of human behavior is described. A computer framework of motivated cognition was developed. Motivated cognition focuses on the motivations or affects that provide the context and drive in human cognition and decision making. A conceptual architecture of the human decision-making approach, from the perspective of information processing in the human brain, is developed in diagrammatic form. A preliminary version of such a diagram is presented. This architecture is then used as a vehicle for successfully constructing a computer program simulating Dweck and Leggett's findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior.

  18. Wires in the soup: quantitative models of cell signaling

    PubMed Central

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, necessitating sophisticated computational modeling, coupled with precise experimentation, for their unraveling. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655

  19. Life on Mars? 1: The chemical environment

    NASA Technical Reports Server (NTRS)

    Banin, A.; Mancinelli, R. L.

    1995-01-01

    The origin of life, at its abiotic evolutionary stage, requires a combination of constituents and environmental conditions that enable the synthesis of complex replicating macromolecules from simpler monomeric molecules. It is very likely that the early stages of this evolutionary process were spontaneous, rapid and widespread on the surface of the primitive Earth, resulting in the formation of quite sophisticated living organisms within less than a billion years. To what extent did such conditions prevail on Mars? Two companion papers will review and discuss the available information related to the chemical, physical and environmental conditions on Mars and assess it from the perspective of potential exobiological evolution.

  20. Precision agriculture and information technology

    Treesearch

    Daniel L. Schmoldt

    2001-01-01

    As everyone knows, knowledge (along with its less-sophisticated sibling, information) is power. For a long time, detailed knowledge (in agriculture) has been generally inaccessible or was prohibitively expensive to acquire. Advances in electronics, communications, and software over the past several decades have removed those earlier impediments. Inexpensive sensors and...

  1. Education in the Information Age.

    ERIC Educational Resources Information Center

    Hay, Lee

    1983-01-01

    This essay considers the revolutionized education of a projected future of cheap and sophisticated technology. Predictions include a redefinition of literacy and basic skills and a restructuring of educational delivery employing computers to dispense information in order to free teachers to work directly with students on cognitive development.…

  2. Prior Consent: Not-So-Strange Bedfellows Plan Library/Computing Partnerships.

    ERIC Educational Resources Information Center

    McDonough, Kristin

    The increasing sophistication of information technologies and the nearly universal access to computing have blurred distinctions among information delivery units on college campuses, forcing institutions to rethink the separate organizational structures that evolved when computing in academe was more localized and less prevalent. Experiences in…

  3. Designing and examining e-waste recycling process: methodology and case studies.

    PubMed

    Li, Jinhui; He, Xin; Zeng, Xianlai

    2017-03-01

    Increasing concerns over resource depletion and environmental pollution have made it imperative that electrical and electronic waste (e-waste) be tackled in an environmentally sound manner. Recycling process development is regarded as the most effective and fundamental way to solve the e-waste problem. Based on global achievements in e-waste recycling over the past 15 years, we first propose a theory for designing an e-waste recycling process, covering measurement of e-waste recyclability and selection of a recycling process. We then summarize the indicators and tools, in terms of resource, environmental, and economic dimensions, for examining an e-waste recycling process. Drawing on the extensive experience and information accumulated in e-waste management, spent lithium-ion batteries and waste printed circuit boards are chosen as case studies to implement and verify the proposed method. The theory and results presented in this work can contribute to future e-waste management toward best available techniques and best environmental practices.

  4. [Biological evaluation within a risk management process].

    PubMed

    Zhuang, Fei; Ding, Biao

    2007-07-01

    Bio-evaluation within the medical device quality/risk management system is a risk analysis and assessment process. On the basis of data from characterization of materials, scientific literature, application history, bio-toxicology testing and so on, and weighing benefit against risk, bio-evaluation reaches a conclusion to "take" or "quit" the product design. There is no "zero risk", though "no toxicity" is always the most desirable conclusion in a testing report. Application history data are the most comprehensive information available, since no testing system can "clone" the human body. In addition, capital cost has to be taken into account when bringing sophisticated testing technologies into the evaluation system. From an examination of FDA CDRH's #G95-1 and the changes to ISO 10993-1, a trend toward integrating bio-evaluation into a quality/risk management process can be discerned.

  5. Evolution of an Information Competency Requirement for Undergraduates

    ERIC Educational Resources Information Center

    Walsh, Tiffany R.

    2011-01-01

    University at Buffalo undergraduate students are required to complete a non-credit-bearing information competency assessment prior to graduation, preferably within their first year of study. Called the "Library Skills Workbook," this assessment has evolved from a short, print-based quiz into a sophisticated, multi-module tutorial and…

  6. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  7. Using Interactive Computer to Communicate Scientific Information.

    ERIC Educational Resources Information Center

    Selnow, Gary W.

    1988-01-01

    Asks whether the computer is another channel of communication, if its interactive qualities make it an information source, or if it is an undefined hybrid. Concludes that computers are neither the medium nor the source but will in the future provide the possibility of a sophisticated interaction between human intelligence and artificial…

  8. Towards an Enterprise Level Measure of Security

    ERIC Educational Resources Information Center

    Marchant, Robert L.

    2013-01-01

    Vulnerabilities of Information Technology (IT) infrastructure have grown at least at the same pace as the sophistication and complexity of the technology that is the cornerstone of our IT enterprises. Despite massively increased funding for research, for development, and to support deployment of Information Assurance (IA) defenses, the damages…

  9. Another Fine MeSH: Clinical Medicine Meets Information Science.

    ERIC Educational Resources Information Center

    O'Rourke, Alan; Booth, Andrew; Ford, Nigel

    1999-01-01

    Discusses evidence-based medicine (EBM) and the need for systematic use of databases like MEDLINE with more sophisticated search strategies to optimize the retrieval of relevant papers. Describes an empirical study of hospital libraries that examined requests for information and search strategies using both structured and unstructured forms.…

  10. Information Literacy as Foundational: Determining Competence

    ERIC Educational Resources Information Center

    DeMars, Christine; Cameron, Lynn; Erwin, T. Dary

    2003-01-01

    Finding, accessing, and determining the credibility of information are skills most people would deem necessary for the college educated person, if not the average citizen, to possess today. At the same time, educators, as well as constituents of educational institutions are asking for better and more sophisticated assessment instruments of…

  11. Quantum communication and information processing

    NASA Astrophysics Data System (ADS)

    Beals, Travis Roland

    Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.

  12. Integration of the stratigraphic aspects of very large sea-floor databases using information processing

    USGS Publications Warehouse

    Jenkins, Clinton N.; Flocks, J.; Kulp, M.; ,

    2006-01-01

    Information-processing methods are described that integrate the stratigraphic aspects of large and diverse collections of sea-floor sample data. They efficiently convert common types of sea-floor data into database and GIS (geographical information system) tables, visual core logs, stratigraphic fence diagrams and sophisticated stratigraphic statistics. The input data are held in structured documents, essentially written core logs that are particularly efficient to create from raw input datasets. Techniques are described that permit efficient construction of regional databases consisting of hundreds of cores. The sedimentological observations in each core are located by their downhole depths (metres below sea floor - mbsf) and also by a verbal term that describes the sample 'situation' - a special fraction of the sediment or position in the core. The main processing creates a separate output event for each instance of top, bottom and situation, assigning top-base mbsf values from numeric or, where possible, from word-based relative locational information such as 'core catcher' in reference to sampler device, and recovery or penetration length. The processing outputs represent the sub-bottom as a sparse matrix of over 20 sediment properties of interest, such as grain size, porosity and colour. They can be plotted in a range of core-log programs including an in-built facility that better suits the requirements of sea-floor data. Finally, a suite of stratigraphic statistics are computed, including volumetric grades, overburdens, thicknesses and degrees of layering. © The Geological Society of London 2006.
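
    The core-log-to-events conversion described above can be sketched in miniature. The line format, property names, and values below are invented for illustration; they are not the paper's actual structured-document schema.

```python
import re

# Hypothetical written core log: "<top>-<base>m <property>=<value>" per line,
# with depths in metres below sea floor (mbsf).
LOG = """0.00-0.35m grainsize=silt
0.35-1.20m grainsize=fine_sand
1.20-1.25m colour=dark_grey"""

EVENT = re.compile(r"(?P<top>[\d.]+)-(?P<base>[\d.]+)m\s+(?P<prop>\w+)=(?P<val>\w+)")

def parse_core_log(text):
    """Turn written core-log lines into (top, base, property, value) events."""
    events = []
    for line in text.splitlines():
        m = EVENT.match(line.strip())
        if m:
            events.append((float(m["top"]), float(m["base"]), m["prop"], m["val"]))
    return events

for ev in parse_core_log(LOG):
    print(ev)
```

    Real core logs also need the word-based locators the paper mentions ('core catcher', recovery length), which this toy format omits.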

  13. A transfer of technology from engineering: use of ROC curves from signal detection theory to investigate information processing in the brain during sensory difference testing.

    PubMed

    Wichchukit, Sukanya; O'Mahony, Michael

    2010-01-01

    This article reviews a beneficial effect of technology transfer from Electrical Engineering to Food Sensory Science. Specifically, it reviews the recent adoption in Food Sensory Science of the receiver operating characteristic (ROC) curve, a tool that is incorporated in the theory of signal detection. Its use allows the information processing that takes place in the brain during sensory difference testing to be studied and understood. The review deals with how Signal Detection Theory, also called Thurstonian modeling, led to the adoption of a more sophisticated way of analyzing the data from sensory difference tests, by introducing the signal-to-noise ratio, d', as a fundamental measure of perceived small sensory differences. Generally, the method of computation of d' is a simple matter for some of the better known difference tests like the triangle, duo-trio and 2-AFC. However, there are occasions when these tests are not appropriate and other tests like the same-different and the A Not-A test are more suitable. Yet, for these, it is necessary to understand how the brain processes information during the test before d' can be computed. It is for this task that the ROC curve has a particular use. © 2010 Institute of Food Technologists®
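
    As a minimal illustration of the computation the review refers to: under the equal-variance signal detection model, the sensitivity index for a yes-no (A Not-A) design is d' = z(hit rate) - z(false-alarm rate). The rates below are invented.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Equal-variance sensitivity index: d' = z(H) - z(F)."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Invented example: 80% hits, 20% false alarms
print(round(d_prime(0.80, 0.20), 3))  # -> 1.683
```

    Sweeping the response criterion while holding d' fixed traces out the ROC curve the article discusses.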

  14. The Effects of Gender and Type of Inquiry Curriculum on Sixth Grade Students' Science Process Skills and Epistemological Beliefs in Science

    NASA Astrophysics Data System (ADS)

    Zaleta, Kristy L.

    The purpose of this study was to investigate the impact of gender and type of inquiry curriculum (open or structured) on science process skills and epistemological beliefs in science of sixth grade students. The current study took place in an urban northeastern middle school. The researcher utilized a sample of convenience comprised of 303 sixth grade students taught by four science teachers on separate teams. The study employed mixed methods with a quasi-experimental design, pretest-posttest comparison group with 17 intact classrooms of students. Students' science process skills and epistemological beliefs in science (source, certainty, development, and justification) were measured before and after the intervention, which exposed different groups of students to different types of inquiry (structured or open). Differences between comparison and treatment groups and between male and female students were analyzed after the intervention, on science process skills, using a two-way analysis of covariance (ANCOVA), and, on epistemological beliefs in science, using a two-way multivariate analysis of covariance (MANCOVA). Responses from two focus groups of open inquiry students were cycle coded and examined for themes and patterns. Quantitative measurements indicated that girls scored significantly higher on science process skills than boys, regardless of type of inquiry instruction. Neither gender nor type of inquiry instruction predicted students' epistemological beliefs in science after accounting for students' pretest scores. The dimension Development accounted for 10.6% of the variance in students' science process skills. Qualitative results indicated that students with sophisticated epistemological beliefs expressed engagement with the open-inquiry curriculum. Students in both the sophisticated and naive beliefs groups identified challenges with the curriculum and improvement in learning as major themes. 
The types of challenges identified differed between the groups: sophisticated beliefs group students focused on their insecurity of not knowing how to complete the activities correctly, and naive beliefs group students focused on the amount of work and how long it took them to complete it. The description of the improvement in learning was at a basic level for the naive beliefs group and at a more complex level for the sophisticated beliefs group. Implications for researchers and educators are discussed.

  15. Marshall information retrieval and display system (MIRADS)

    NASA Technical Reports Server (NTRS)

    Groover, J. L.; Jones, S. C.; King, W. L.

    1974-01-01

    Program for data management system allows sophisticated inquiries while utilizing simplified language. Online system is composed of several programs. System is written primarily in COBOL with routines in ASSEMBLER and FORTRAN V.

  16. Multi-disciplinary communication networks for skin risk assessment in nursing homes with high IT sophistication.

    PubMed

    Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M

    2014-08-01

    The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between care givers and patient outcomes in these settings deserve further attention. In this research, we describe a mixed method approach to explore communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions: as more processes in resident care and administrative activities were supported by technology, the number of observed unique interactions fell. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities, including providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings in this study confirm prior research that as technology support (resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery. 
These results provide evidence for improving the design and implementation of IT in long term care systems to support communication and associated resident outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox

    NASA Astrophysics Data System (ADS)

    Harris, A. T., III; Goodman, J.; Justice, B.

    2014-12-01

    As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.

  18. Evaluating coastal and river valley communities evacuation network performance using macroscopic productivity.

    DOT National Transportation Integrated Search

    2017-06-30

    The ever-increasing processing speed and computational power of computers and simulation systems has led to correspondingly larger, more sophisticated representations of evacuation traffic processes. Today, micro-level analyses can be conducted for m...

  19. Critical Infrastructure Protection II, The International Federation for Information Processing, Volume 290.

    NASA Astrophysics Data System (ADS)

    Papa, Mauricio; Shenoi, Sujeet

    The information infrastructure -- comprising computers, embedded devices, networks and software systems -- is vital to day-to-day operations in every sector: information and telecommunications, banking and finance, energy, chemicals and hazardous materials, agriculture, food, water, public health, emergency services, transportation, postal and shipping, government and defense. Global business and industry, governments, indeed society itself, cannot function effectively if major components of the critical information infrastructure are degraded, disabled or destroyed. Critical Infrastructure Protection II describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. Areas of coverage include: - Themes and Issues - Infrastructure Security - Control Systems Security - Security Strategies - Infrastructure Interdependencies - Infrastructure Modeling and Simulation This book is the second volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of twenty edited papers from the Second Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection held at George Mason University, Arlington, Virginia, USA in the spring of 2008.

  20. Cybersecurity: The heat is on.

    PubMed

    Morrissey, John

    2015-10-01

    Breaches of confidential patient information are proliferating and the culprits are more sophisticated and sinister than ever. Hospitals and health systems are using smarter and faster tactics to stay one step ahead of the bad guys.

  1. Missing data exploration: highlighting graphical presentation of missing pattern.

    PubMed

    Zhang, Zhongheng

    2015-12-01

    Functions shipped with base R can fulfill many tasks of missing data handling. However, because the data volume of electronic medical record (EMR) systems is typically very large, more sophisticated methods may be helpful in data management. This article focuses on missing data handling using advanced techniques. There are three types of missing data: missing completely at random (MCAR), missing at random (MAR) and not missing at random (NMAR). This classification depends on how missing values are generated. Two packages, Multivariate Imputation by Chained Equations (MICE) and Visualization and Imputation of Missing Values (VIM), provide sophisticated functions to explore missing data patterns. In particular, the VIM package is especially helpful for visual inspection of missing data. Finally, correlation analysis provides information on the dependence of missingness on other variables, which is useful in subsequent imputations.
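
    The missingness-pattern tabulation these packages provide can be sketched with the Python standard library (a rough analogue of what MICE's md.pattern and VIM's aggr summarize graphically); the records below are invented.

```python
from collections import Counter

# Toy EMR-style rows with None marking missing values (invented data)
records = [
    {"age": 64, "bp": 120, "lactate": None},
    {"age": 71, "bp": None, "lactate": 2.1},
    {"age": None, "bp": None, "lactate": 3.4},
    {"age": 58, "bp": 135, "lactate": 1.8},
]
cols = ["age", "bp", "lactate"]

# Per-variable missing counts
per_var = {c: sum(r[c] is None for r in records) for c in cols}

# Missingness patterns: a tuple of 0/1 flags per row, then counted
patterns = Counter(tuple(int(r[c] is None) for c in cols) for r in records)

print(per_var)
for pat, n in sorted(patterns.items()):
    print(pat, n)
```

    Inspecting which patterns dominate is a first, informal check on whether missingness looks related to other variables (MAR/NMAR) or not (MCAR).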

  2. Epistemic Beliefs and Conceptual Understanding in Biotechnology: A Case Study

    NASA Astrophysics Data System (ADS)

    Rebello, Carina M.; Siegel, Marcelle A.; Witzig, Stephen B.; Freyermuth, Sharyn K.; McClure, Bruce A.

    2012-04-01

    The purpose of this investigation was to explore students' epistemic beliefs and conceptual understanding of biotechnology. Epistemic beliefs can influence reasoning, how individuals evaluate information, and informed decision-making abilities. These skills are important for an informed citizenry that will participate in debates regarding areas in science such as biotechnology. We report on an in-depth case study analysis of three undergraduate, non-science majors in a biotechnology course designed for non-biochemistry majors. We selected participants who performed above average and below average on the first in-class exam. Data from multiple sources—interviews, exams, and a concept instrument—were used to construct (a) individual profiles and (b) a cross-case analysis of our participants' conceptual development and epistemic beliefs from two different theoretical perspectives—Women's Ways of Knowing and the Reflective Judgment Model. Two independent trained researchers coded all case records independently for both theoretical perspectives, with resultant initial Cohen's kappa values above .715 (substantial agreement), and then reached consensus on the codes. Results indicate that a student with a more sophisticated epistemology demonstrated greater conceptual understanding at the end of the course than a student with a less sophisticated epistemology, even though the latter performed higher initially. Moreover, a student with a less sophisticated epistemology and low initial conceptual performance did not demonstrate gains in overall conceptual understanding. These results suggest the need for instructional interventions fostering the epistemological development of learners in order to facilitate their conceptual growth.

  3. Advances in borehole geophysics for hydrology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, P.H.

    1982-01-01

    Borehole geophysical methods provide vital subsurface information on rock properties, fluid movement, and the condition of engineered borehole structures. Within the first category, salient advances include the continuing improvement of the borehole televiewer, refinement of the electrical conductivity dipmeter for fracture characterization, and the development of a gigahertz-frequency electromagnetic propagation tool for water saturation measurements. The exploration of the rock mass between boreholes remains a challenging problem with high potential; promising methods are now incorporating high-density spatial sampling and sophisticated data processing. Flow-rate measurement methods appear adequate for all but low-flow situations. At low rates the tagging method seems the most attractive. The current exploitation of neutron-activation techniques for tagging means that the wellbore fluid itself is tagged, thereby eliminating the mixing of an alien fluid into the wellbore. Another method uses the acoustic noise generated by flow through constrictions and in and behind casing to detect and locate flaws in the production system. With the advent of field-recorded digital data, the interpretation of logs from sedimentary sequences is now reaching a sophisticated level with the aid of computer processing and the application of statistical methods. Lagging behind are interpretive schemes for the low-porosity, fracture-controlled igneous and metamorphic rocks encountered in the geothermal reservoirs and in potential waste-storage sites. Progress is being made on the general problem of fracture detection by use of electrical and acoustical techniques, but the reliable definition of permeability continues to be an elusive goal.

  4. NASA Blue Team: Determining Operational Security Posture of Critical Systems and Networks

    NASA Technical Reports Server (NTRS)

    Alley, Adam David

    2016-01-01

    The emergence of cybersecurity has increased the focus on security risks to Information Technology (IT) assets beyond traditional Information Assurance (IA) concerns: more sophisticated threats have arisen from a growing range of sources as advanced hacker tools and techniques have proliferated, broadening the attack surface across globally interconnected networks.

  5. HUC--A User Designed System for All Recorded Knowledge and Information.

    ERIC Educational Resources Information Center

    Hilton, Howard J.

    This paper proposes a user designed system, HUC, intended to provide a single index and retrieval system covering all recorded knowledge and information capable of being retrieved from all modes of storage, from manual to the most sophisticated retrieval system. The concept integrates terminal hardware, software, and database structure to allow…

  6. The neural basis of deception in strategic interactions

    PubMed Central

    Volz, Kirsten G.; Vogeley, Kai; Tittgemeyer, Marc; von Cramon, D. Yves; Sutter, Matthias

    2015-01-01

    Communication based on informational asymmetries abounds in politics, business, and almost any other form of social interaction. Informational asymmetries may create incentives for the better-informed party to exploit her advantage by misrepresenting information. Using a game-theoretic setting, we investigate the neural basis of deception in human interaction. Unlike in most previous fMRI research on deception, the participants decide themselves whether to lie or not. We find activation within the right temporo-parietal junction (rTPJ), the dorsal anterior cingulate cortex (ACC), the (pre)cuneus (CUN), and the anterior frontal gyrus (aFG) when contrasting lying with truth telling. Notably, our design also allows for an investigation of the neural foundations of sophisticated deception through telling the truth—when the sender does not expect the receiver to believe her (true) message. Sophisticated deception triggers activation within the same network as plain lies, i.e., we find activity within the rTPJ, the CUN, and aFG. We take this result to show that brain activation can reveal the sender's veridical intention to deceive others, irrespective of whether in fact the sender utters the factual truth or not. PMID:25729358

  7. 3-D model-based Bayesian classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soenneland, L.; Tenneboe, P.; Gehrmann, T.

    1994-12-31

    The challenging task of the interpreter is to integrate different pieces of information and combine them into an earth model. The sophistication level of this earth model might vary from the simplest geometrical description to the most complex set of reservoir parameters related to the geometrical description. Obviously the sophistication level also depends on the completeness of the available information. The authors describe the interpreter's task as a mapping between the observation space and the model space. The information available to the interpreter exists in observation space, and the task is to infer a model in model space. It is well known that this inversion problem is non-unique, so any attempt to find a solution depends on constraints being added in some manner. The solution will obviously depend on which constraints are introduced, and it would be desirable to allow the interpreter to modify the constraints in a problem-dependent manner. The authors present a probabilistic framework that gives the interpreter the tools to integrate the different types of information and produce constrained solutions. The constraints can be adapted to the problem at hand.

  8. The neural basis of deception in strategic interactions.

    PubMed

    Volz, Kirsten G; Vogeley, Kai; Tittgemeyer, Marc; von Cramon, D Yves; Sutter, Matthias

    2015-01-01

    Communication based on informational asymmetries abounds in politics, business, and almost any other form of social interaction. Informational asymmetries may create incentives for the better-informed party to exploit her advantage by misrepresenting information. Using a game-theoretic setting, we investigate the neural basis of deception in human interaction. Unlike in most previous fMRI research on deception, the participants decide themselves whether to lie or not. We find activation within the right temporo-parietal junction (rTPJ), the dorsal anterior cingulate cortex (ACC), the (pre)cuneus (CUN), and the anterior frontal gyrus (aFG) when contrasting lying with truth telling. Notably, our design also allows for an investigation of the neural foundations of sophisticated deception through telling the truth-when the sender does not expect the receiver to believe her (true) message. Sophisticated deception triggers activation within the same network as plain lies, i.e., we find activity within the rTPJ, the CUN, and aFG. We take this result to show that brain activation can reveal the sender's veridical intention to deceive others, irrespective of whether in fact the sender utters the factual truth or not.

  9. The future of high energy gamma ray astronomy and its potential astrophysical implications

    NASA Technical Reports Server (NTRS)

    Fichtel, C. E.

    1982-01-01

    Future satellites should carry instruments having over an order of magnitude greater sensitivity than those flown thus far as well as improved energy and angular resolution. The information to be obtained from these experiments should greatly enhance knowledge of: the very energetic and nuclear processes associated with compact objects; the structure of our galaxy; the origin and dynamic pressure effects of the cosmic rays; the high energy particles and energetic processes in other galaxies; and the degree of matter-antimatter symmetry of the universe. The relevant aspects of extragalactic gamma ray phenomena are emphasized along with the instruments planned. The high energy gamma ray results of forthcoming programs such as GAMMA-1 and the Gamma Ray Observatory should justify even more sophisticated telescopes. These advanced instruments might be placed on the space station currently being considered by NASA.

  10. Complex and differential glial responses in Alzheimer's disease and ageing.

    PubMed

    Rodríguez, José J; Butt, Arthur M; Gardenal, Emanuela; Parpura, Vladimir; Verkhratsky, Alexei

    2016-01-01

    Glial cells and their association with neurones are fundamental for brain function. The emergence of complex neurone-glial networks assures rapid information transfer, creating a sophisticated circuitry where both types of neural cells work in concert, serving different activities. All glial cells, represented by astrocytes, oligodendrocytes, microglia and NG2-glia, are essential for brain homeostasis and defence. Thus, glia are key not only to normal central nervous system (CNS) function but also to its dysfunction, being directly associated with all forms of neuropathological processes. Therefore, the progression and outcome of neurological and neurodegenerative diseases depend on glial reactions. In this review, we provide a concise account of recent data obtained from both human material and animal models demonstrating the pathological involvement of glia in neurodegenerative processes, including Alzheimer's disease (AD), as well as in physiological ageing.

  11. Can An Evolutionary Process Create English Text?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H.

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
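    The target-free idea the abstract emphasizes (selection on a fitness landscape rather than toward one pre-specified phrase) can be illustrated with a minimal hill-climbing sketch, not the study's actual algorithm: here fitness is the fraction of a candidate string's character bigrams that occur in a reference corpus, so many different strings can score well and no single target string exists. All names and parameters are illustrative.

```python
import random
import string

def bigrams(text):
    """Set of consecutive character pairs in `text`."""
    return {text[i:i + 2] for i in range(len(text) - 1)}

def fitness(candidate, corpus_bigrams):
    """Fraction of the candidate's bigrams found in the reference corpus:
    a crude fitness landscape with no fixed target string."""
    cand = bigrams(candidate)
    return sum(1 for b in cand if b in corpus_bigrams) / max(len(cand), 1)

def evolve(corpus, length=20, steps=2000, seed=0):
    """Mutate one character at a time, keeping mutants that are not worse."""
    rnd = random.Random(seed)
    alphabet = string.ascii_lowercase + " "
    corp = bigrams(corpus)
    current = "".join(rnd.choice(alphabet) for _ in range(length))
    for _ in range(steps):
        pos = rnd.randrange(length)
        mutant = current[:pos] + rnd.choice(alphabet) + current[pos + 1:]
        if fitness(mutant, corp) >= fitness(current, corp):
            current = mutant
    return current
```

Because fitness rewards any corpus-like bigram, distinct runs converge to different high-fitness strings, which is the qualitative point the study makes against fixed-target demonstrations.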

  12. Analysis of a municipal wastewater treatment plant using a neural network-based pattern analysis

    USGS Publications Warehouse

    Hong, Y.-S.T.; Rosen, Michael R.; Bhamidimarri, R.

    2003-01-01

    This paper addresses the problem of how to capture the complex relationships that exist between process variables and to diagnose the dynamic behaviour of a municipal wastewater treatment plant (WTP). Due to the complex biological reaction mechanisms, the highly time-varying, and multivariable aspects of the real WTP, the diagnosis of the WTP are still difficult in practice. The application of intelligent techniques, which can analyse the multi-dimensional process data using a sophisticated visualisation technique, can be useful for analysing and diagnosing the activated-sludge WTP. In this paper, the Kohonen Self-Organising Feature Maps (KSOFM) neural network is applied to analyse the multi-dimensional process data, and to diagnose the inter-relationship of the process variables in a real activated-sludge WTP. By using component planes, some detailed local relationships between the process variables, e.g., responses of the process variables under different operating conditions, as well as the global information is discovered. The operating condition and the inter-relationship among the process variables in the WTP have been diagnosed and extracted by the information obtained from the clustering analysis of the maps. It is concluded that the KSOFM technique provides an effective analysing and diagnosing tool to understand the system behaviour and to extract knowledge contained in multi-dimensional data of a large-scale WTP. ?? 2003 Elsevier Science Ltd. All rights reserved.
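    The KSOFM clustering described above can be sketched with a toy Kohonen self-organizing map in plain Python; the grid size, decay schedules, and two-cluster test data below are illustrative assumptions, not the parameters or process data used in the study.

```python
import math
import random

def train_som(data, grid=(3, 3), epochs=40, lr0=0.5, sigma0=1.0, seed=0):
    """Train a toy self-organizing map on rows of `data` (lists of floats)."""
    rnd = random.Random(seed)
    h, w = grid
    dim = len(data[0])
    weights = [[[rnd.random() for _ in range(dim)] for _ in range(w)]
               for _ in range(h)]
    total = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in data:
            # Best-matching unit: node whose weight vector is closest to x.
            bi, bj = min(((i, j) for i in range(h) for j in range(w)),
                         key=lambda ij: sum((weights[ij[0]][ij[1]][k] - x[k]) ** 2
                                            for k in range(dim)))
            # Linearly decaying learning rate and neighborhood radius.
            frac = step / total
            lr = lr0 * (1 - frac)
            sigma = sigma0 * (1 - frac) + 1e-3
            # Pull every node toward x, weighted by a Gaussian of grid distance.
            for i in range(h):
                for j in range(w):
                    d2 = (i - bi) ** 2 + (j - bj) ** 2
                    g = lr * math.exp(-d2 / (2 * sigma ** 2))
                    for k in range(dim):
                        weights[i][j][k] += g * (x[k] - weights[i][j][k])
            step += 1
    return weights

def best_unit(weights, x):
    """Grid coordinates of the best-matching unit for observation x."""
    h, w, dim = len(weights), len(weights[0]), len(x)
    return min(((i, j) for i in range(h) for j in range(w)),
               key=lambda ij: sum((weights[ij[0]][ij[1]][k] - x[k]) ** 2
                                  for k in range(dim)))
```

After training, mapping each observation to its best-matching unit gives the kind of 2-D visualization of multi-dimensional process data the abstract describes: distinct operating conditions land on different regions of the map.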

  13. A ganglion-cell-based primary image representation method and its contribution to object recognition

    NASA Astrophysics Data System (ADS)

    Wei, Hui; Dai, Zhi-Long; Zuo, Qing-Song

    2016-10-01

    A visual stimulus is represented by the biological visual system at several levels, from low to high: photoreceptor cells, ganglion cells (GCs), lateral geniculate nucleus cells, and visual cortical neurons. Retinal GCs at the early level need to represent raw data only once, yet must meet a wide range of diverse requests from different vision-based tasks. This means the information representation at this level is general and not task-specific. Neurobiological findings have attributed this universal adaptation to GCs' receptive field (RF) mechanisms. To develop a highly efficient image representation method that can facilitate information processing and interpretation at later stages, here we design a computational model to simulate the GC's non-classical RF. This new image representation method can extract major structural features from raw data and is consistent with other statistical measures of the image. Based on the new representation, the performance of other state-of-the-art algorithms in contour detection and segmentation can be upgraded remarkably. This work concludes that applying a sophisticated representation scheme at an early stage is an efficient and promising strategy in visual information processing.

  14. Three-dimensional quick response code based on inkjet printing of upconversion fluorescent nanoparticles for drug anti-counterfeiting

    NASA Astrophysics Data System (ADS)

    You, Minli; Lin, Min; Wang, Shurui; Wang, Xuemin; Zhang, Ge; Hong, Yuan; Dong, Yuqing; Jin, Guorui; Xu, Feng

    2016-05-01

    Medicine counterfeiting is a serious issue worldwide, involving potentially devastating health repercussions, so advanced anti-counterfeit technology for drugs has aroused intensive interest. However, existing anti-counterfeit technologies suffer from drawbacks such as high cost, complex fabrication processes, sophisticated operation, and an inability to authenticate drug ingredients. In this contribution, we developed a smart-phone-recognized upconversion fluorescent three-dimensional (3D) quick response (QR) code for tracking and anti-counterfeiting of drugs. We first formulated three colored inks incorporating upconversion nanoparticles with RGB (i.e., red, green, and blue) emission colors. Using a modified inkjet printer, we printed a series of colors by precisely regulating the overlap of these three inks. Meanwhile, we developed a multilayer printing and splitting technology, which significantly increases the information storage capacity per unit area. As an example, we directly printed the upconversion fluorescent 3D QR code on the surface of drug capsules. The 3D QR code consisted of three different color layers, each layer encoding information on a different aspect of the drug. A smart phone app was designed to decode the multicolor 3D QR code, providing the authenticity and related information of drugs. The developed technology possesses merits of low cost, ease of operation, high throughput, and high information capacity, and thus holds great potential for drug anti-counterfeiting. Electronic supplementary information (ESI) available: calculating details of UCNP content per 3D QR code and decoding process of the 3D QR code. See DOI: 10.1039/c6nr01353h

  15. Online Process Scaffolding and Students' Self-Regulated Learning with Hypermedia.

    ERIC Educational Resources Information Center

    Azevedo, Roger; Cromley, Jennifer G.; Thomas, Leslie; Seibert, Diane; Tron, Myriam

    This study examined the role of different scaffolding instructional interventions in facilitating students' shift to more sophisticated mental models as indicated by both performance and process data. Undergraduate students (n=53) were randomly assigned to 1 of 3 scaffolding conditions (adaptive content and process scaffolding (ACPS), adaptive…

  16. A synthetic mammalian electro-genetic transcription circuit.

    PubMed

    Weber, Wilfried; Luzi, Stefan; Karlsson, Maria; Sanchez-Bustamante, Carlota Diaz; Frey, Urs; Hierlemann, Andreas; Fussenegger, Martin

    2009-03-01

    Electric signal processing has evolved to manage rapid information transfer in neuronal networks and muscular contraction in multicellular organisms and controls the most sophisticated man-built devices. Using a synthetic biology approach to assemble electronic parts with genetic control units engineered into mammalian cells, we designed an electric power-adjustable transcription control circuit able to integrate the intensity of a direct current over time, to translate the amplitude or frequency of an alternating current into an adjustable genetic readout or to modulate the beating frequency of primary heart cells. Successful miniaturization of the electro-genetic devices may pave the way for the design of novel hybrid electro-genetic implants assembled from electronic and genetic parts.

  17. Developments in cognitive neuroscience: I. Conflict, compromise, and connectionism.

    PubMed

    Westen, Drew; Gabbard, Glen O

    2002-01-01

    The strength of psychoanalysis has always been its understanding of affect and motivation. Contemporary developments in cognitive neuroscience offer possibilities of integrating sophisticated, experimentally informed models of thought and memory with an understanding of dynamically and clinically meaningful processes. Aspects of contemporary theory and research in cognitive neuroscience are integrated with psychoanalytic theory and technique, particularly theories of conflict and compromise. After a description of evolving models of the mind in cognitive neuroscience, several issues relevant to psychoanalytic theory and practice are addressed. These include the nature of representations, the interaction of cognition and affect, and the mechanisms by which the mind unconsciously forges compromise solutions that best fit multiple cognitive and affective-motivational constraints.

  18. Sensors Applications, Volume 4, Sensors for Automotive Applications

    NASA Astrophysics Data System (ADS)

    Marek, Jiri; Trah, Hans-Peter; Suzuki, Yasutoshi; Yokomori, Iwao

    2003-07-01

    An international team of experts from the leading companies in this field gives a detailed picture of existing as well as future applications. They discuss in detail current technologies, design and construction concepts, market considerations and commercial developments. Topics covered include vehicle safety, fuel consumption, air conditioning, emergency control, traffic control systems, and electronic guidance using radar and video. Meeting the growing need for comprehensive information on the capabilities, potentials and limitations of modern sensor systems, Sensors Applications is a book series covering the use of sophisticated technologies and materials for the creation of advanced sensors, and their implementation in the key areas of process monitoring, building control, health care, automobiles, aerospace, environmental technology and household appliances.

  19. A synthetic mammalian electro-genetic transcription circuit

    PubMed Central

    Weber, Wilfried; Luzi, Stefan; Karlsson, Maria; Sanchez-Bustamante, Carlota Diaz; Frey, Urs; Hierlemann, Andreas; Fussenegger, Martin

    2009-01-01

    Electric signal processing has evolved to manage rapid information transfer in neuronal networks and muscular contraction in multicellular organisms and controls the most sophisticated man-built devices. Using a synthetic biology approach to assemble electronic parts with genetic control units engineered into mammalian cells, we designed an electric power-adjustable transcription control circuit able to integrate the intensity of a direct current over time, to translate the amplitude or frequency of an alternating current into an adjustable genetic readout or to modulate the beating frequency of primary heart cells. Successful miniaturization of the electro-genetic devices may pave the way for the design of novel hybrid electro-genetic implants assembled from electronic and genetic parts. PMID:19190091

  20. Silica-on-silicon waveguide quantum circuits.

    PubMed

    Politi, Alberto; Cryan, Martin J; Rarity, John G; Yu, Siyuan; O'Brien, Jeremy L

    2008-05-02

    Quantum technologies based on photons will likely require an integrated optics architecture for improved performance, miniaturization, and scalability. We demonstrate high-fidelity silica-on-silicon integrated optical realizations of key quantum photonic circuits, including two-photon quantum interference with a visibility of 94.8 +/- 0.5%; a controlled-NOT gate with an average logical basis fidelity of 94.3 +/- 0.2%; and a path-entangled state of two photons with fidelity of >92%. These results show that it is possible to directly "write" sophisticated photonic quantum circuits onto a silicon chip, which will be of benefit to future quantum technologies based on photons, including information processing, communication, metrology, and lithography, as well as the fundamental science of quantum optics.

  1. Advances in Mid-Infrared Spectroscopy for Chemical Analysis

    NASA Astrophysics Data System (ADS)

    Haas, Julian; Mizaikoff, Boris

    2016-06-01

    Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.

  2. TOF-SIMS imaging technique with information entropy

    NASA Astrophysics Data System (ADS)

    Aoyagi, Satoka; Kawashima, Y.; Kudo, Masahiro

    2005-05-01

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is capable, in principle, of chemical imaging of proteins on insulated samples. However, selecting the specific peaks related to a particular protein, which are necessary for chemical imaging, out of numerous candidates has been difficult without an appropriate spectrum-analysis technique. Therefore, multivariate analysis techniques such as principal component analysis (PCA), and analysis with mutual information as defined by information theory, have been applied to interpret SIMS spectra of protein samples. In this study, mutual information was applied to select specific peaks related to proteins in order to obtain chemical images. Proteins on insulated materials were measured with TOF-SIMS, and the SIMS spectra were then analyzed by means of a method based on comparison using mutual information. A chemical map of each protein was obtained using the specific peaks selected on the basis of their mutual information values. The resulting TOF-SIMS images of proteins on the materials provide useful information on protein adsorption properties, the optimality of immobilization processes, and reactions between proteins. Thus, chemical images of proteins by TOF-SIMS help in understanding interactions between material surfaces and proteins and in developing sophisticated biomaterials.
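    The peak-selection idea (ranking peaks by the mutual information between a peak's presence and the sample identity) can be sketched as follows; the binarization threshold and the spectra/label layout are hypothetical simplifications, not the study's actual pipeline.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits, estimated from paired discrete observations."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

def rank_peaks(spectra, labels, threshold=0.5):
    """Rank peaks by MI between binarized peak intensity and the sample label.

    `spectra` is a list of intensity vectors (one per measurement); `labels`
    marks which protein each measurement came from. Peaks whose presence
    tracks the label get high MI and are the ones used for chemical imaging.
    """
    n_peaks = len(spectra[0])
    scores = []
    for p in range(n_peaks):
        binarized = [1 if s[p] >= threshold else 0 for s in spectra]
        scores.append((mutual_information(binarized, labels), p))
    return sorted(scores, reverse=True)
```

A peak that fires only for one protein attains the full entropy of the label (1 bit for two proteins), while a peak present in every spectrum carries no information about sample identity.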

  3. Evolving health information technology and the timely availability of visit diagnoses from ambulatory visits: a natural experiment in an integrated delivery system.

    PubMed

    Bardach, Naomi S; Huang, Jie; Brand, Richard; Hsu, John

    2009-07-17

    Health information technology (HIT) may improve health care quality and outcomes, in part by making information available in a timelier manner. However, few studies document the changes in timely availability of data with the use of a sophisticated electronic medical record (EMR), or describe how the timely availability of data might differ across types of EMR. We hypothesized that timely availability of data would improve with use of increasingly sophisticated forms of HIT. We used a historical observation design (2004-2006) with electronic data from office visits in an integrated delivery system with three types of HIT: Basic, Intermediate, and Advanced. We calculated the monthly percentage of visits using the various types of HIT for entry of visit diagnoses into the delivery system's electronic database, and the time between the visit and the availability of the visit diagnoses in the database. In January 2004, when only Basic HIT was available, 10% of office visits had diagnoses entered on the same day as the visit and 90% within a week; 85% of office visits used paper forms for recording visit diagnoses and 16% used Basic HIT at that time. By December 2006, 95% of all office visits had diagnoses available on the same day as the visit, when 98% of office visits used some form of HIT for entry of visit diagnoses (Advanced HIT for 67% of visits). Use of HIT systems is associated with dramatic increases in the timely availability of diagnostic information, though the effects may vary with the sophistication of the HIT system. Timely clinical data are critical for real-time population surveillance, and valuable for routine clinical care.

  4. Data breach locations, types, and associated characteristics among US hospitals.

    PubMed

    Gabriel, Meghan Hufstader; Noblin, Alice; Rutherford, Ashley; Walden, Amanda; Cortelyou-Ward, Kendall

    2018-02-01

    The objectives of this study were to describe the locations in hospitals where data are breached, the types of breaches that occur most often at hospitals, and the hospital characteristics, including health information technology (IT) sophistication and biometric security capabilities, that may predict large data breaches affecting 500 or more patients. Office for Civil Rights breach data from healthcare providers regarding breaches that affected 500 or more individuals from 2009 to 2016 were linked with hospital characteristics from the Health Information Management Systems Society and the American Hospital Association Health IT Supplement databases. Descriptive statistics were used to characterize hospitals with and without breaches, data breach type, and the location/mode of data breaches in hospitals. Multivariate logistic regression analysis explored hospital characteristics that predicted a data breach affecting at least 500 patients, including area characteristics, region, health system membership, size, type, biometric security use, health IT sophistication, and ownership. Of all types of healthcare providers, hospitals accounted for approximately one-third of all data breaches, and hospital breaches affected the largest number of individuals. Paper and films were the most frequent location of breached data, occurring in 65 hospitals during the study period, whereas network servers were the least common location, although their breaches affected the most patients overall. Adjusted multivariate results showed significant associations between data breach occurrence and some hospital characteristics, including type and size, but not others, including health IT sophistication or biometric use for security. Hospitals should conduct routine audits to identify their vulnerabilities before a breach occurs. Additionally, information security systems should be implemented concurrently with health information technologies.
Improving access control and prioritizing patient privacy will be important steps in minimizing future breaches.

  5. Patient empowerment by the means of citizen-managed Electronic Health Records: web 2.0 health digital identity scenarios.

    PubMed

    Falcão-Reis, Filipa; Correia, Manuel E

    2010-01-01

    With the advent of more sophisticated and comprehensive healthcare information systems, system builders are becoming more interested in patient interaction and in what patients can do to help improve their own health care. Information systems nowadays play a crucial and fundamental role in hospital workflows, thus providing great opportunities to introduce and improve upon "patient empowerment" processes for the personalization and management of Electronic Health Records (EHRs). In this paper, we present generic patient-privacy control mechanism scenarios based on the Extended OpenID (eOID), a user-centric digital identity provider previously developed by our group, which leverages a secured OpenID 2.0 infrastructure with the recently released Portuguese Citizen Card (CC) for secure authentication in a distributed health information environment. eOID also takes advantage of OAuth assertion-based mechanisms to implement patient-controlled, secure, qualified role-based access to the EHR by third parties.

  6. Controls for Burning Solid Wastes

    ERIC Educational Resources Information Center

    Toro, Richard F.; Weinstein, Norman J.

    1975-01-01

    Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…

  7. Budget Limits Prompt Humming Hive Togetherness.

    ERIC Educational Resources Information Center

    Dain, Jo Anne

    1983-01-01

    Describes Truckee Meadows Community College's word processing center, in which students are trained in modern word processing techniques on the same equipment that meets the college's needs for a sophisticated computerized system. Considers equipment, safeguards, advantages, and current and potential uses of the center. (DMM)

  8. An Ecological Framework for Cancer Communication: Implications for Research

    PubMed Central

    Intille, Stephen S; Zabinski, Marion F

    2005-01-01

    The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. 
This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614

  9. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    PubMed Central

    Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico

    2005-01-01

    Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to access these tools without programming skills and with basic computer requirements. Conclusion We have created useful tools focused on the management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298
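
The parsing task the authors tackled in Visual Basic can be illustrated in a few lines of any scripting language. The sketch below, in Python, parses a hypothetical TaqMan results export into per-sample genotype calls; the column names (`Well`, `Sample`, `Assay`, `Call`) and call labels are illustrative only, since real exports vary by instrument and software version:

```python
import csv
import io

def parse_taqman_export(text):
    """Parse a hypothetical TaqMan results export (CSV) into
    {sample: genotype call}. Column names are illustrative only."""
    calls = {}
    for row in csv.DictReader(io.StringIO(text)):
        calls[row["Sample"]] = row["Call"]
    return calls

example = """Well,Sample,Assay,Call
A1,S001,rs0000,Allele1/Allele1
A2,S002,rs0000,Allele1/Allele2
A3,S003,rs0000,Undetermined
"""

calls = parse_taqman_export(example)
# Wells with an undetermined call can be flagged for repeat genotyping.
repeats = [s for s, c in calls.items() if c == "Undetermined"]
```

Flagging `Undetermined` wells for repeat genotyping is exactly the kind of bookkeeping that becomes error-prone when done by hand at scale.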

  10. Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems

    NASA Astrophysics Data System (ADS)

    Dogan, Firat; Atilgan, Yasemin

    Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capacities at the information technology level. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization, and all the other strong aspects of information technology bring the necessary instruments to almost every desk. The special software and sophisticated supercomputer environments of the last decade now serve individual needs inside "tiny smart boxes" at reasonable prices. However, resistance to learning new computerized environments, insufficient training, and other old habits prevent effective utilization of IT resources by specialists in the health sector. In this paper, all aspects of former and current developments in surgery planning and simulation tools are presented, and future directions and expectations are investigated for better electronic health care systems.

  11. A New Virtual and Remote Experimental Environment for Teaching and Learning Science

    NASA Astrophysics Data System (ADS)

    Lustigova, Zdena; Lustig, Frantisek

    This paper describes how a scientifically exact and problem-solving-oriented remote and virtual science experimental environment might help to build a new strategy for science education. The main features are: remote observation and control of real-world phenomena; their processing and evaluation; verification of hypotheses combined with the development of critical thinking, supported by sophisticated tools for searching, classifying, and storing relevant information; and a collaborative environment supporting argumentative writing and teamwork, public presentations, and defense of achieved results - all either in real presence, in telepresence, or in a combination of both. Only then can a real understanding of generalized science laws and their consequences be developed. This science learning and teaching environment (called ROL - Remote and Open Laboratory) has been developed and used by Charles University in Prague since 1996. It has been offered to science students in both formal and informal learning and, since 2003, also to science teachers within their professional development studies.

  12. Social Media and Education: Reconceptualizing the Boundaries of Formal and Informal Learning

    ERIC Educational Resources Information Center

    Greenhow, Christine; Lewin, Cathy

    2016-01-01

    It is argued that social media has the potential to bridge formal and informal learning through participatory digital cultures. Exemplars of sophisticated use by young people support this claim, although the majority of young people adopt the role of consumers rather than full participants. Scholars have suggested the potential of social media for…

  13. Determining the Effectiveness of Various Delivery Methods in an Information Technology/Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Kovacs, Paul J.; Scarpino, John; Turchek, John C.

    2010-01-01

    The emergence of increasingly sophisticated communication technologies and the media-rich extensions of the World Wide Web have prompted universities to use alternatives to the traditional classroom teaching and learning methods. This demand for alternative delivery methods has led to the development of a wide range of eLearning techniques.…

  14. Bio-Intelligence: A Research Program Facilitating the Development of New Paradigms for Tomorrow's Patient Care

    NASA Astrophysics Data System (ADS)

    Phan, Sieu; Famili, Fazel; Liu, Ziying; Peña-Castillo, Lourdes

    The advancement of omics technologies, in concert with enabling information technology, has accelerated biological research to a new realm with blazing speed and sophistication. The progression from the limited single-gene assay to the high-throughput microarray assay, and from the laborious manual counting of base pairs to robot-assisted genome-sequencing machinery, are two examples. More sophisticated still, recent developments in literature mining and artificial intelligence have allowed researchers to construct complex gene networks, unraveling many formidable biological puzzles. To harness these emerging technologies to their full potential in medical applications, the Bio-intelligence program at the Institute for Information Technology, National Research Council Canada, aims to develop and exploit artificial intelligence and bioinformatics technologies to facilitate the development of intelligent decision support tools and systems that improve patient care - for early detection, accurate diagnosis/prognosis of disease, and better personalized therapeutic management.

  15. Missing data exploration: highlighting graphical presentation of missing pattern

    PubMed Central

    2015-01-01

    Functions shipped with base R can fulfill many tasks of missing data handling. However, because the data volume of electronic medical record (EMR) systems is typically very large, more sophisticated methods may be helpful in data management. The article focuses on missing data handling using advanced techniques. There are three types of missing data: missing completely at random (MCAR), missing at random (MAR), and not missing at random (NMAR). This classification depends on how missing values are generated. Two packages, Multivariate Imputation by Chained Equations (MICE) and Visualization and Imputation of Missing Values (VIM), provide sophisticated functions to explore the missing data pattern. In particular, the VIM package is especially helpful for visual inspection of missing data. Finally, correlation analysis provides information on the dependence of missingness on other variables; such information is useful in subsequent imputations. PMID:26807411
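
The pattern-exploration step the abstract attributes to the R packages MICE and VIM can be sketched language-agnostically. The following Python fragment (synthetic records, hypothetical field names) tabulates per-variable missing counts and the frequency of each observed/missing pattern, the same summary a VIM aggregation plot visualizes:

```python
from collections import Counter

def missing_patterns(records, variables):
    """Summarize missing-data patterns: per-variable missing counts and
    the frequency of each observed/missing pattern across records."""
    per_var = {v: sum(r.get(v) is None for r in records) for v in variables}
    patterns = Counter(
        tuple("obs" if r.get(v) is not None else "miss" for v in variables)
        for r in records
    )
    return per_var, patterns

# Synthetic EMR-style records; field names are illustrative only.
records = [
    {"age": 63, "bp": 120, "glucose": None},
    {"age": 57, "bp": None, "glucose": None},
    {"age": 70, "bp": 131, "glucose": 5.4},
]
per_var, patterns = missing_patterns(records, ["age", "bp", "glucose"])
```

Pattern counts like these are the starting point for deciding whether a MCAR, MAR, or NMAR mechanism is plausible before imputation.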

  16. Information System Incidents: The Development of a Damage Assessment Model

    DTIC Science & Technology

    1999-12-01

    Cyber criminals use creativity, knowledge, software, and hardware to attack and infiltrate information systems (IS) in order to copy, delete, or...the Internet led to an increase in cyber criminals and a variety of cyber crimes such as attacks, intrusions, introduction of viruses, and data theft...organizations on information systems is contributing to the increased number of cyber criminals. Additionally, the growing sophistication and availability of

  17. To Share or Not to Share: A Cross-Sectional Study on Health Information Sharing and Its Determinants Among Chinese Rural Chronic Patients.

    PubMed

    Fu, Hang; Dong, Dong; Feng, Da; He, Zhifei; Tang, Shangfeng; Fu, Qian; Feng, Zhanchun

    2017-10-01

    To examine the determinants of health information sharing among rural Chinese chronic patients. Two large population-based surveys in rural China were carried out from July 2011 to April 2012. Data used in this study were secondary data drawn from the two survey databases. A binary logistic regression analysis was employed to discover the impact of demographic characteristics, level of health literacy, and other factors on respondents' health information sharing behavior. Among the total 1,324 participants, 63.6% shared health information with others. Among all significant predictors, those who acquire health information via family and friends have 6.0 times the odds of sharing health information compared with those who do not. Participants who have more than six household members, with middle and high levels of health knowledge, and who are moderately involved in discussions or settlements of village affairs are also more likely to share health information. The reliance on interpersonal communication channels for health information, household size, the patients' preexisting health knowledge, and their activity in village affairs are crucial determinants for health information sharing among rural chronic patients. A more sophisticated model needs to be established to reveal the complex processes of health information communication.
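
The reported effect ("6.0 times the odds") comes from a multivariable logistic regression; the underlying odds-ratio arithmetic can be shown on an unadjusted 2x2 table. The counts below are synthetic, chosen only so the cross-product ratio works out to 6.0, and are not the study's data:

```python
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Cross-product odds ratio from a 2x2 table.
    Rows: exposure (e.g. acquires health information via family/friends).
    Columns: outcome (shares health information: yes/no)."""
    return (exposed_yes * unexposed_no) / (exposed_no * unexposed_yes)

# Illustrative counts only: odds of sharing are 120/20 = 6.0 in the
# exposed group and 100/100 = 1.0 in the unexposed group.
or_ = odds_ratio(120, 20, 100, 100)
```

A multivariable model produces the same quantity per predictor, adjusted for the other covariates, by exponentiating the fitted regression coefficients.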

  18. Word Processing and Its Implications for Business Communications Courses.

    ERIC Educational Resources Information Center

    Kruk, Leonard B.

    Word processing, a systematic approach to office work, is currently based on the use of sophisticated dictating and typing machines. The word processing market is rapidly increasing with the paper explosion brought on by such factors as increasing governmental regulation, Internal Revenue Service requirements, and the need for stockholders to be…

  19. Enhancing Student Learning of Enterprise Integration and Business Process Orientation through an ERP Business Simulation Game

    ERIC Educational Resources Information Center

    Seethamraju, Ravi

    2011-01-01

    The sophistication of the integrated world of work and increased recognition of business processes as critical corporate assets require graduates to develop "process orientation" and an "integrated view" of business. Responding to these dynamic changes in business organizations, business schools are also continuing to modify…

  20. An audit of alcohol brand websites.

    PubMed

    Gordon, Ross

    2011-11-01

    The study investigated the nature and content of alcohol brand websites in the UK. The research involved an audit of the websites of the 10 leading alcohol brands by sales in the UK across four categories: lager, spirits, Flavoured Alcoholic Beverages and cider/perry. Each site was visited twice over a 1-month period with site features and content recorded using a pro-forma. The content of websites was then reviewed against the regulatory codes governing broadcast advertising of alcohol. It was found that 27 of 40 leading alcohol brands had a dedicated website. Sites featured sophisticated content, including sports and music sections, games, downloads and competitions. Case studies of two brand websites demonstrate the range of content features on such sites. A review of the application of regulatory codes covering traditional advertising found some content may breach the codes. Study findings illustrate the sophisticated range of content accessible on alcohol brand websites. When applying regulatory codes covering traditional alcohol marketing channels it is apparent that some content on alcohol brand websites would breach the codes. This suggests the regulation of alcohol brand websites may be an issue requiring attention from policymakers. Further research in this area would help inform this process. © 2010 Australasian Professional Society on Alcohol and other Drugs.

  1. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems for objects functioning under extreme conditions are examined, along with the impact of the effectiveness of the operator's activity on the system as a whole. The necessity of creating complex simulation models reflecting the operator's activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. The main subsystems of the algorithmic model of the human as a controlling system are implemented in software, and specialized software for data processing and analysis has been developed. An original computer model of the human as a tracking system has been implemented. A model of the unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the Matlab-Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed in simplified form as an automatic control system consisting of three main interconnected subsystems: sensory organs (perception sensors), the central nervous system, and executive organs (muscles of the arms, legs, and back). A theoretical data model for predicting the level of the operator's information load in ergatic systems is proposed; it allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of the operator's activity during takeoff, based on Petri nets, has been synthesized.
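
The paper's operator model is a Matlab-Simulink implementation; as a rough illustration of the general idea (not the authors' model), a compensatory tracking loop can be sketched with three textbook elements: a reaction-time delay, a proportional control law, and a first-order actuator lag. All parameter values below are arbitrary:

```python
def simulate_tracking(target, gain=2.0, tau=0.4, delay_steps=3, dt=0.05):
    """Toy discrete-time human-operator tracking loop (illustrative only)."""
    output = 0.0
    history = [0.0] * delay_steps   # reaction-time delay line
    trace = []
    for t in target:
        error = t - output                       # perceived tracking error
        history.append(error)
        delayed = history.pop(0)                 # delayed perception
        command = gain * delayed                 # central "control law"
        output += dt / tau * (command - output)  # first-order actuator lag
        trace.append(output)
    return trace

# Track a unit step target for 10 s of simulated time.
trace = simulate_tracking([1.0] * 200)
```

With a purely proportional control law the loop settles at gain/(1 + gain) of the target, i.e. 2/3 here; integral action or a higher gain would reduce that steady-state error.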

  2. Managing Credit Card Expenses: Nova Southeastern University Shares Cost-Saving Techniques.

    ERIC Educational Resources Information Center

    Peskin, Carol Ann

    1994-01-01

    Nova Southeastern University, Florida, has implemented a variety of techniques of cost containment for campus credit card transactions. These include restricted card acceptance parameters, careful merchant rate negotiation, increased automation of transaction processing, and sophisticated processing techniques. The university has demonstrated…

  3. Assessment of input uncertainty by seasonally categorized latent variables using SWAT

    USDA-ARS?s Scientific Manuscript database

    Watershed processes have been explored with sophisticated simulation models for the past few decades. It has been stated that uncertainty attributed to alternative sources such as model parameters, forcing inputs, and measured data should be incorporated during the simulation process. Among varyin...

  4. 3D Texture Features Mining for MRI Brain Tumor Identification

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Saba, Tanzila; Nayer, Fatima; Syed, Afraz Zahra

    2014-03-01

    Medical image segmentation is the process of extracting regions of interest and dividing an image into individually meaningful, homogeneous components. These components have a strong relationship with the objects of interest in the image. Medical image segmentation is a mandatory initial step for computer-aided diagnosis and therapy. It is a challenging task because of the complex nature of medical images, and successful medical image analysis depends heavily on segmentation accuracy. Texture is one of the major features used to identify regions of interest in an image or to classify an object, but 2D texture features yield poor classification results. Hence, this paper presents 3D feature extraction using texture analysis, with SVM as the segmentation technique, in the testing methodologies.
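
First-order 3D statistics are the simplest members of the texture-feature family such methods draw on; richer descriptors (e.g. 3D co-occurrence features) follow the same extract-then-classify pattern. A minimal numpy sketch on a synthetic volume, not the paper's actual feature set:

```python
import numpy as np

def texture_features_3d(volume):
    """First-order 3D texture statistics for a volume of interest.
    A simplified stand-in for richer 3D texture descriptors."""
    v = volume.astype(float)
    grad = np.gradient(v)  # intensity gradients along each of the 3 axes
    return {
        "mean": v.mean(),
        "variance": v.var(),
        "gradient_energy": sum((g ** 2).mean() for g in grad),
    }

rng = np.random.default_rng(0)
roi = rng.normal(100.0, 5.0, size=(8, 8, 8))  # synthetic MRI patch
feats = texture_features_3d(roi)
```

Feature vectors of this kind would then be passed to an SVM for the actual classification step.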

  5. Integrated Modeling for Environmental Assessment of Ecosystem Services

    EPA Science Inventory

    The U.S. Environmental Protection Agency uses environmental models to inform rulemaking and policy decisions at multiple spatial and temporal scales. In this study, several sophisticated modeling technologies are seamlessly integrated to facilitate a baseline assessment of the re...

  6. Handheld spectrometers: the state of the art

    NASA Astrophysics Data System (ADS)

    Crocombe, Richard A.

    2013-05-01

    "Small" spectrometers fall into three broad classes: small versions of laboratory instruments, providing data, subsequently processed on a PC; dedicated analyzers, providing actionable information to an individual operator; and process analyzers, providing quantitative or semi-quantitative information to a process controller. The emphasis of this paper is on handheld dedicated analyzers. Many spectrometers have historically been large, possibly fragile, expensive and complicated to use. The challenge over the last dozen years, as instruments have moved into the field, has been to make spectrometers smaller, affordable, rugged, easy-to-use, but most of all capable of delivering actionable results. Actionable results can dramatically improve the efficiency of a testing process and transform the way business is done. There are several keys to this handheld spectrometer revolution. Consumer electronics has given us powerful mobile platforms, compact batteries, clearly visible displays, new user interfaces, etc., while telecom has revolutionized miniature optics, sources and detectors. While these technologies enable miniature spectrometers themselves, actionable information has demanded the development of rugged algorithms for material confirmation, unknown identification, mixture analysis and detection of suspicious materials in unknown matrices. These algorithms are far more sophisticated than the 'correlation' or 'dot-product' methods commonly used in benchtop instruments. Finally, continuing consumer electronics advances now enable many more technologies to be incorporated into handheld spectrometers, including Bluetooth, wireless, WiFi, GPS, cameras and bar code readers, and the continued size shrinkage of spectrometer 'engines' leads to the prospect of dual technology or 'hyphenated' handheld instruments.
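
The 'dot-product' baseline the author contrasts with the newer rugged algorithms is just a cosine similarity between the unknown and a library spectrum. A minimal sketch on synthetic spectra (the Gaussian bands are illustrative):

```python
import numpy as np

def spectral_match(unknown, reference):
    """Classic 'dot-product' (cosine) spectral match score,
    the baseline method handheld algorithms improve upon."""
    u = unknown / np.linalg.norm(unknown)
    r = reference / np.linalg.norm(reference)
    return float(np.dot(u, r))

x = np.linspace(0.0, 10.0, 200)
reference = np.exp(-(x - 4) ** 2) + 0.5 * np.exp(-(x - 7) ** 2 / 0.5)
same = spectral_match(reference * 3.0, reference)    # scaling-invariant
other = spectral_match(np.exp(-(x - 2) ** 2), reference)
```

Because the score is normalized, it is invariant to overall intensity scaling: one reason the method is simple but brittle for mixtures and low-signal matrices.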

  7. Scientific meaning of meanings: quests for discoveries concerning our cultural ills.

    PubMed

    Patterson, C C

    1998-08-01

    This paper outlines pioneering concepts of fundamental physical and emotional features of the human brain which served as primary operators. These have developed during the past 10,000 years, giving rise to our present global megacultures and their various ancestral culture progenitors. Essential points are these: (1) Biological evolution endowed the human brain (quite inadvertently and unintentionally) with enormous latent powers for complex and sophisticated abstract ratiocinations. (2) Magnitudes of these latent powers grew exponentially with linear enlargements of brain size during the evolution of the genetic ancestors of Homo sapiens sapiens (Hss) during the past 3 million years, but these latent powers never materialized in utilized forms within the environmental contexts in which they evolved. (3) These sophisticated, abstract ratiocinations, both latent powers and operative forms in today's Hss brain, are divided between two major categories: utilitarian thinking and nonutilitarian thinking. (4) These two different types of thinking processes are carried out within separate, different regional combinations of neuronal biochemical entities within the same individual brain. (5) Sensitivities of abstract, sophisticated ratiocination processes within the human brain to influences from communication interactions with other human brains are exponentially greater in comparison with any other species of central nervous system in the earth's biosphere. This makes the brain population density the utmost critical factor, and determines the character of human thought within interacting populations of brains at a given time and place within a particular culture. (6) Abrupt increases of sedentary brain population densities, unnaturally greater by orders of magnitude than those that existed previously in biological evolutionary contexts, were engendered by the inauguration of agricultural practices 10,000 years ago. 
This enabled latent powers of the human brain used for complex and sophisticated abstract ratiocinations to become manifest in materialized forms of usage within relatively large groups of humans living in certain regions of the earth. (7) Thinking processes of the utilitarian category within brains living in such regions guided and dominated the development of sophisticated and complex social hierarchies and institutions, forms of communication, technologies, and cultures since that time. This dominating factor relegated thinking processes within the nonutilitarian categories of those brains to subservient roles during those developments. (8) Nonutilitarian abstract ratiocinations possess a potential for proper adjudication and guidance of utilitarian abstract ratiocinations in the latter's development of culture. However, lack of the former's proper role in cultural developments since the beginning of the Holocene interglacial era has resulted in the imprisonment of Hss as aliens in an intellectual hell on a foreign planet.

  8. Building Automation and the Contextualization of Information Technology: The Journey of a Midwestern Community College in the U.S.

    ERIC Educational Resources Information Center

    Grandgenett, Neal; Perry, Pam; Pensabene, Thomas; Wegner, Karen; Nirenberg, Robert; Pilcher, Phil; Otterpohl, Candi

    2018-01-01

    The buildings in which people work, live, and spend their leisure time are increasingly embedded with sophisticated information technology (IT). This article describes the approach of Metropolitan Community College (MCC) in Omaha, Nebraska of the United States to provide an occupational context to some of their IT coursework by organizing IT…

  9. Development of a Web-Enabled Informatics Platform for Manipulation of Gene Expression Data

    DTIC Science & Technology

    2004-12-01

    genomic platforms such as metabolomics and proteomics, and to federated databases for knowledge management. A successful SBIR Phase I completed...measurements that require sophisticated bioinformatic platforms for data archival, management, integration, and analysis if researchers are to derive...web-enabled bioinformatic platform consisting of a Laboratory Information Management System (LIMS), an Analysis Information Management System (AIMS

  10. Enhancing critical thinking with case studies and nursing process.

    PubMed

    Neill, K M; Lachat, M F; Taylor-Panek, S

    1997-01-01

    Challenged to enhance critical thinking concepts in a sophomore nursing process course, faculty expanded the lecture format to include group explorations of patient case studies. The group format facilitated a higher level of analysis of patient cases and more sophisticated applications of nursing process. This teaching strategy was a positive learning experience for students and faculty.

  11. Recent developments in turbomachinery component materials and manufacturing challenges for aero engine applications

    NASA Astrophysics Data System (ADS)

    Srinivas, G.; Raghunandana, K.; Satish Shenoy, B.

    2018-02-01

    In recent years, the enhancement of turbomachinery material performance has played a vital role, especially in aircraft air-breathing engines such as the turbojet, turboprop, turboshaft, and turbofan. Transonic-flow engines in particular require highly sophisticated materials that can sustain the entire thrust created by the engine. The main objective of this paper is to give an overview of present cost-effective and technologically capable processes for turbomachinery component materials. The main focus is on electro-physical, photonic additive/removal, and electrochemical processes for the manufacture of turbomachinery parts. Aeronautical propulsion-based technologies are reviewed thoroughly with respect to surface reliability, geometrical precision, material removal, and deposition rates for highly strengthened composite materials and for difficult-to-cut dedicated steels and titanium- and nickel-based alloys. The paper covers past aeronautical and propulsion manufacturing technologies, current sophisticated technologies, and future challenging material processing techniques. It also gives a brief description of shaping and coating processes for turbomachinery components in aeromechanical applications.

  12. The Media as an Invaluable Tool for Informal Earth System Science Education

    NASA Astrophysics Data System (ADS)

    James, E.; Gautier, C.

    2001-12-01

    One of the most widely utilized avenues for educating the general public about the Earth's environment is the media, be it print, radio or broadcast. Accurate and effective communication of issues in Earth System Science (ESS), however, is significantly hindered by the public's relative scientific illiteracy. Discussion of ESS concepts requires the laying down of a foundation of complex scientific information, which must first be conveyed to an incognizant audience before any strata of sophisticated social context can be appropriately considered. Despite such a substantial obstacle to be negotiated, the environmental journalist is afforded the unique opportunity of providing a broad-reaching informal scientific education to a largely scientifically uninformed population base. This paper will review the tools used by various environmental journalists to address ESS issues and consider how successful each of these approaches has been at conveying complex scientific messages to a general audience lacking sufficient scientific sophistication. Different kinds of media materials used to this effect will be analyzed for their ideas and concepts conveyed, as well as their effectiveness in reaching the public at large.

  13. Not Merely Experiential: Unconscious Thought Can Be Rational

    PubMed Central

    Garrison, Katie E.; Handley, Ian M.

    2017-01-01

    Individuals often form more reasonable judgments from complex information after a period of distraction vs. deliberation. This phenomenon has been attributed to sophisticated unconscious thought during the distraction period that integrates and organizes the information (Unconscious Thought Theory; Dijksterhuis and Nordgren, 2006). Yet, other research suggests that experiential processes are strengthened during the distraction (relative to deliberation) period, accounting for the judgment and decision benefit. We tested between these possibilities, hypothesizing that unconscious thought is distinct from experiential processes, and independently contributes to judgments and decisions during a distraction period. Using an established paradigm, Experiment 1 (N = 319) randomly induced participants into an experiential or rational mindset, after which participants received complex information describing three roommates to then consider consciously (i.e., deliberation) or unconsciously (i.e., distraction). Results revealed superior roommate judgments (but not choices) following distraction vs. deliberation, consistent with Unconscious Thought Theory. Mindset did not have an influence on roommate judgments. However, planned tests revealed a significant advantage of distraction only within the rational-mindset condition, which is contrary to the idea that experiential processing alone facilitates complex decision-making during periods of distraction. In a second experiment (N = 136), we tested whether effects of unconscious thought manifest for a complex analytical reasoning task for which experiential processing would offer no advantage. As predicted, participants in an unconscious thought condition outperformed participants in a control condition, suggesting that unconscious thought can be analytical. In sum, the current results support the existence of unconscious thinking processes that are distinct from experiential processes, and can be rational. 
Thus, the experiential vs. rational nature of a process might not cleanly delineate conscious and unconscious thought. PMID:28729844

  14. Not Merely Experiential: Unconscious Thought Can Be Rational.

    PubMed

    Garrison, Katie E; Handley, Ian M

    2017-01-01

    Individuals often form more reasonable judgments from complex information after a period of distraction vs. deliberation. This phenomenon has been attributed to sophisticated unconscious thought during the distraction period that integrates and organizes the information (Unconscious Thought Theory; Dijksterhuis and Nordgren, 2006). Yet, other research suggests that experiential processes are strengthened during the distraction (relative to deliberation) period, accounting for the judgment and decision benefit. We tested between these possibilities, hypothesizing that unconscious thought is distinct from experiential processes, and independently contributes to judgments and decisions during a distraction period. Using an established paradigm, Experiment 1 (N = 319) randomly induced participants into an experiential or rational mindset, after which participants received complex information describing three roommates to then consider consciously (i.e., deliberation) or unconsciously (i.e., distraction). Results revealed superior roommate judgments (but not choices) following distraction vs. deliberation, consistent with Unconscious Thought Theory. Mindset did not have an influence on roommate judgments. However, planned tests revealed a significant advantage of distraction only within the rational-mindset condition, which is contrary to the idea that experiential processing alone facilitates complex decision-making during periods of distraction. In a second experiment (N = 136), we tested whether effects of unconscious thought manifest for a complex analytical reasoning task for which experiential processing would offer no advantage. As predicted, participants in an unconscious thought condition outperformed participants in a control condition, suggesting that unconscious thought can be analytical. In sum, the current results support the existence of unconscious thinking processes that are distinct from experiential processes, and can be rational. 
Thus, the experiential vs. rational nature of a process might not cleanly delineate conscious and unconscious thought.

  15. Phase processing for quantitative susceptibility mapping of regions with large susceptibility and lack of signal.

    PubMed

    Fortier, Véronique; Levesque, Ives R

    2018-06-01

    Phase processing impacts the accuracy of quantitative susceptibility mapping (QSM). Techniques for phase unwrapping and background removal have been proposed and demonstrated mostly in brain. In this work, phase processing was evaluated in the context of large susceptibility variations (Δχ) and negligible signal, in particular for susceptibility estimation using the iterative phase replacement (IPR) algorithm. Continuous Laplacian, region-growing, and quality-guided unwrapping were evaluated. For background removal, Laplacian boundary value (LBV), projection onto dipole fields (PDF), sophisticated harmonic artifact reduction for phase data (SHARP), variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP), regularization enabled sophisticated harmonic artifact reduction for phase data (RESHARP), and 3D quadratic polynomial field removal were studied. Each algorithm was quantitatively evaluated in simulation and qualitatively in vivo. Additionally, IPR-QSM maps were produced to evaluate the impact of phase processing on the susceptibility in the context of large Δχ with negligible signal. Quality-guided unwrapping was the most accurate technique, whereas continuous Laplacian performed poorly in this context. All background removal algorithms tested resulted in important phase inaccuracies, suggesting that techniques used for brain do not translate well to situations where large Δχ and no or low signal are expected. LBV produced the smallest errors, followed closely by PDF. Results suggest that quality-guided unwrapping should be preferred, with PDF or LBV for background removal, for QSM in regions with large Δχ and negligible signal. This reduces the susceptibility inaccuracy introduced by phase processing. Accurate background removal remains an open question. Magn Reson Med 79:3103-3113, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
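
The wrapping problem that all of the compared unwrapping algorithms address can be shown in one dimension with numpy; the robust 3D methods studied here (quality-guided, region-growing, Laplacian) generalize this simple path-following idea:

```python
import numpy as np

# The true phase exceeds pi, the measured phase wraps into (-pi, pi],
# and unwrapping restores continuity along the sampling path.
true_phase = np.linspace(0.0, 6 * np.pi, 100)
wrapped = np.angle(np.exp(1j * true_phase))  # measured (wrapped) phase
unwrapped = np.unwrap(wrapped)               # 1D path-following unwrap
```

`np.unwrap` assumes adjacent samples differ by less than pi; quality-guided and region-growing methods relax this by choosing the order in which voxels are visited, which is why they cope better with noisy, low-signal regions.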

  16. Transition from isotropic to digitated growth modulates network formation in Physarum polycephalum

    NASA Astrophysics Data System (ADS)

    Vogel, David; Gautrais, Jacques; Perna, Andrea; Sumpter, David J. T.; Deneubourg, Jean-Louis; Dussutour, Audrey

    2017-01-01

    Some organisms, including fungi, ants, and slime molds, explore their environment and forage by forming interconnected networks. The plasmodium of the slime mold Physarum polycephalum is a large unicellular amoeboid organism that grows a tubular spatial network through which nutrients, body mass, and chemical signals are transported. Individual plasmodia are capable of sophisticated behaviours such as optimizing their network connectivity and dynamics using only decentralized information processing. In this study, we used a population of plasmodia that interconnect through time to analyse the dynamical interactions between growth of individual plasmodia and global network formation. Our results showed how initial conditions, such as the distance between plasmodia, their size, or the presence and quality of food, affect the emerging network connectivity.

  17. Biomolecular computing systems: principles, progress and potential.

    PubMed

    Benenson, Yaakov

    2012-06-12

    The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.

  18. SigmaCLIPSE = presentation management + NASA CLIPS + SQL

    NASA Technical Reports Server (NTRS)

    Weiss, Bernard P., Jr.

    1990-01-01

    SigmaCLIPSE provides an expert systems and 'intelligent' data base development program for diverse systems integration environments that require support for automated reasoning and expert systems technology, presentation management, and access to 'intelligent' SQL data bases. The SigmaCLIPSE technology, with its integrated ability to access 4th generation application development and decision support tools through a portable SQL interface, comprises a sophisticated software development environment for solving knowledge engineering and expert systems development problems in information intensive commercial environments -- financial services, health care, and distributed process control -- where the expert system must be extendable -- a major architectural advantage of NASA CLIPS. SigmaCLIPSE is a research effort intended to test the viability of merging SQL data bases with expert systems technology.

  19. Cyber crime: can a standard risk analysis help in the challenges facing business continuity managers?

    PubMed

    Vande Putte, Danny; Verhelst, Marc

    Risk management has never been easy. Finding efficient mitigating measures is not always straightforward. Finding measures for cyber crime, however, is a really huge challenge because cyber threats are changing all the time. As the sophistication of these threats is growing, their impact increases. Moreover, society and its economy have become increasingly dependent on information and communication technologies. Standard risk analysis methodologies will help to score the cyber risk and to place it in the risk tolerance matrix. This will allow business continuity managers to figure out if there is still a gap with the maximum tolerable outage for time-critical business processes and if extra business continuity measures are necessary to fill the gap.
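
    The scoring-and-placement step the abstract describes can be sketched in a few lines. The 1-5 likelihood and impact scales and the two threshold values below are hypothetical illustrations, not taken from any particular risk methodology.

```python
# Minimal sketch of scoring a risk and placing it in a risk-tolerance
# matrix. Scales (1-5) and thresholds (6, 15) are hypothetical.
def risk_zone(likelihood, impact, tolerance=6, capacity=15):
    """likelihood and impact are scored 1 (low) to 5 (high)."""
    score = likelihood * impact
    if score <= tolerance:
        return "acceptable"            # within risk tolerance
    if score <= capacity:
        return "monitor"               # check gap vs. maximum tolerable outage
    return "mitigate"                  # extra business-continuity measures

# e.g. a fast-evolving cyber threat with high impact on a time-critical process
zone = risk_zone(likelihood=4, impact=5)
```

    A risk landing in the "mitigate" zone is exactly the case the authors highlight: a gap remains against the maximum tolerable outage, so additional business continuity measures are needed.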

  20. Flexible Proton-Gated Oxide Synaptic Transistors on Si Membrane.

    PubMed

    Zhu, Li Qiang; Wan, Chang Jin; Gao, Ping Qi; Liu, Yang Hui; Xiao, Hui; Ye, Ji Chun; Wan, Qing

    2016-08-24

    Ion-conducting materials have received considerable attention for their applications in fuel cells, electrochemical devices, and sensors. Here, flexible indium zinc oxide (InZnO) synaptic transistors with multiple presynaptic inputs gated by proton-conducting phosphorosilicate glass-based electrolyte films are fabricated on ultrathin Si membranes. Transient characteristics of the proton-gated InZnO synaptic transistors are investigated, indicating stable proton-gating behaviors. Short-term synaptic plasticities are mimicked on the proposed proton-gated synaptic transistors. Furthermore, synaptic integration regulations are mimicked on the proposed synaptic transistor networks. Spiking logic modulations are realized based on the transition between superlinear and sublinear synaptic integration. The multigate-coupled flexible proton-gated oxide synaptic transistors may be of interest for neuroinspired platforms with sophisticated spatiotemporal information processing.

  1. Generation of an arbitrary concatenated Greenberger-Horne-Zeilinger state with single photons

    NASA Astrophysics Data System (ADS)

    Chen, Shan-Shan; Zhou, Lan; Sheng, Yu-Bo

    2017-02-01

    The concatenated Greenberger-Horne-Zeilinger (C-GHZ) state is a new kind of logic-qubit entangled state, which may have extensive applications in future quantum communication. In this letter, we propose a protocol for constructing an arbitrary C-GHZ state with single photons. We exploit the cross-Kerr nonlinearity for this purpose. This protocol has some advantages over previous protocols. First, it only requires two kinds of cross-Kerr nonlinearities to generate single phase shifts  ±θ. Second, it is not necessary to use sophisticated m-photon Toffoli gates. Third, this protocol is deterministic and can be used to generate an arbitrary C-GHZ state. This protocol may be useful in future quantum information processing based on the C-GHZ state.

  2. Talent Development Gamification in Talent Selection Assessment Centres

    ERIC Educational Resources Information Center

    Tansley, Carole; Hafermalz, Ella; Dery, Kristine

    2016-01-01

    Purpose: The purpose of this paper is to examine the relationship between the use of sophisticated talent selection processes such as gamification and training and development interventions designed to ensure that candidates can successfully navigate the talent assessment process. Gamification is the application of game elements to non-game…

  3. Acousto-Optic Tunable Filter for Time-Domain Processing of Ultra-Short Optical Pulses,

    DTIC Science & Technology

    The application of acousto-optic tunable filters for shaping of ultra-fast pulses in the time domain is analyzed and demonstrated. With the rapid advance of acousto-optic tunable filter (AOTF) technology, the opportunity for sophisticated signal processing capabilities arises. AOTFs offer unique

  4. Tools for Achieving TQE.

    ERIC Educational Resources Information Center

    Latta, Raymond F.; Downey, Carolyn J.

    This book presents a wide array of sophisticated problem-solving tools and shows how to use them in a humanizing way that involves all stakeholders in the process. Chapter 1 develops the rationale for educational stakeholders to consider quality tools. Chapter 2 highlights three quality group-process tools--brainstorming, the nominal group…

  5. Proteomic Analysis of the Human Olfactory Bulb.

    PubMed

    Dammalli, Manjunath; Dey, Gourav; Madugundu, Anil K; Kumar, Manish; Rodrigues, Benvil; Gowda, Harsha; Siddaiah, Bychapur Gowrishankar; Mahadevan, Anita; Shankar, Susarla Krishna; Prasad, Thottethodi Subrahmanya Keshava

    2017-08-01

    The importance of olfaction to human health and disease is often underappreciated. Olfactory dysfunction has been reported in association with a host of common complex diseases, including neurological diseases such as Alzheimer's disease and Parkinson's disease. For health, olfaction or the sense of smell is also important for most mammals, for optimal engagement with their environment. Indeed, animals have developed sophisticated olfactory systems to detect and interpret the rich information presented to them to assist in day-to-day activities such as locating food sources, differentiating food from poisons, identifying mates, promoting reproduction, avoiding predators, and averting death. In this context, the olfactory bulb is a vital component of the olfactory system receiving sensory information from the axons of the olfactory receptor neurons located in the nasal cavity and the first place that processes the olfactory information. We report in this study original observations on the human olfactory bulb proteome in healthy subjects, using a high-resolution mass spectrometry-based proteomic approach. We identified 7750 nonredundant proteins from human olfactory bulbs. Bioinformatics analysis of these proteins showed their involvement in biological processes associated with signal transduction, metabolism, transport, and olfaction. These new observations provide a crucial baseline molecular profile of the human olfactory bulb proteome, and should assist the future discovery of biomarker proteins and novel diagnostics associated with diseases characterized by olfactory dysfunction.

  6. HPC Annual Report: Emulytics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crussell, Jonathan; Boote, Jeffrey W.; Fritz, David Jakob

    Networked Information Technology systems play a key role supporting critical government, military, and private computer installations. Many of today's critical infrastructure systems have strong dependencies on secure information exchange among geographically dispersed facilities. As operations become increasingly dependent on the information exchange, they also become targets for exploitation. The need to protect data and defend these systems from external attack has become increasingly vital, while the nature of the threats has grown sophisticated and pervasive, making the challenges daunting. Enter Emulytics.

  7. Physiological Capacities: Estimating an Athlete's Potential.

    ERIC Educational Resources Information Center

    Lemon, Peter W. R.

    1982-01-01

    Several simple performance tests are described for assessing an athlete's major energy-producing capabilities. The tests are suitable for mass screening because they are easy to administer, require no sophisticated equipment, and can be done quickly. Information for evaluating test results is included. (PP)

  8. Highway Safety Information System guidebook for the Utah state data files. Volume 2 : single variable tabulations

    DOT National Transportation Integrated Search

    1996-07-01

    The increasingly sophisticated demands placed on transportation planning models by the 1990 Clean Air Act Amendments (CAAA), the 1991 Intermodal Surface Transportation Efficiency Act (ISTEA), and to a lesser extent some earlier legislation, have led ...

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onyeji, Ijeoma; Bazilian, Morgan; Bronk, Chris

    Both the number and security implications of sophisticated cyber attacks on companies providing critical energy infrastructures are increasing. As power networks and, to a certain extent, oil and gas infrastructure both upstream and downstream, are becoming increasingly integrated with information communication technology systems, they are growing more susceptible to cyber attacks.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siegel, S.

    An increased level of mathematical sophistication will be needed in the future to be able to handle the spectrum of information as it comes from a broad array of biological systems and other sources. Classification will be an increasingly complex and difficult issue. Several projects that are discussed are being developed by the US Department of Health and Human Services (DHHS), including a directory of risk assessment projects and a directory of exposure information resources.

  11. Spacecraft software training needs assessment research

    NASA Technical Reports Server (NTRS)

    Ratcliff, Shirley; Golas, Katharine

    1990-01-01

    The problems were identified, along with their causes and potential solutions, that the management analysts were encountering in performing their jobs. It was concluded that sophisticated training applications would provide the most effective solution to a substantial portion of the analysts' problems. The remainder could be alleviated through the introduction of tools that could help make retrieval of the needed information from the vast and complex information resources feasible.

  12. Parallel processing via a dual olfactory pathway in the honeybee.

    PubMed

    Brill, Martin F; Rosenbaum, Tobias; Reus, Isabelle; Kleineidam, Christoph J; Nawrot, Martin P; Rössler, Wolfgang

    2013-02-06

    In their natural environment, animals face complex and highly dynamic olfactory input. Thus vertebrates as well as invertebrates require fast and reliable processing of olfactory information. Parallel processing has been shown to improve processing speed and power in other sensory systems and is characterized by extraction of different stimulus parameters along parallel sensory information streams. Honeybees possess an elaborate olfactory system with unique neuronal architecture: a dual olfactory pathway comprising a medial projection-neuron (PN) antennal lobe (AL) protocerebral output tract (m-APT) and a lateral PN AL output tract (l-APT) connecting the olfactory lobes with higher-order brain centers. We asked whether this neuronal architecture serves parallel processing and employed a novel technique for simultaneous multiunit recordings from both tracts. The results revealed response profiles from a high number of PNs of both tracts to floral, pheromonal, and biologically relevant odor mixtures tested over multiple trials. PNs from both tracts responded to all tested odors, but with different characteristics indicating parallel processing of similar odors. Both PN tracts were activated by widely overlapping response profiles, which is a requirement for parallel processing. The l-APT PNs had broad response profiles suggesting generalized coding properties, whereas the responses of m-APT PNs were comparatively weaker and less frequent, indicating higher odor specificity. Comparison of response latencies within and across tracts revealed odor-dependent latencies. We suggest that parallel processing via the honeybee dual olfactory pathway provides enhanced odor processing capabilities serving sophisticated odor perception and olfactory demands associated with a complex olfactory world of this social insect.

  13. Characterization of cytochrome c as marker for retinal cell degeneration by uv/vis spectroscopic imaging

    NASA Astrophysics Data System (ADS)

    Hollmach, Julia; Schweizer, Julia; Steiner, Gerald; Knels, Lilla; Funk, Richard H. W.; Thalheim, Silko; Koch, Edmund

    2011-07-01

    Retinal diseases like age-related macular degeneration have become an important cause of visual loss, owing to increasing life expectancy and lifestyle habits. Because no satisfactory treatment exists, early diagnosis and prevention are the only possibilities to stop the degeneration. The protein cytochrome c (cyt c) is a suitable marker for degeneration processes and apoptosis because it is a part of the respiratory chain and involved in the apoptotic pathway. The determination of the local distribution and oxidative state of cyt c in living cells allows the characterization of cell degeneration processes. Since cyt c exhibits characteristic absorption bands between 400 and 650 nm wavelength, uv/vis in situ spectroscopic imaging was used for its characterization in retinal ganglion cells. The large amount of data, consisting of spatial and spectral information, was processed by multivariate data analysis. The challenge consists in the identification of the molecular information of cyt c. Baseline correction, principal component analysis (PCA) and cluster analysis (CA) were performed in order to identify cyt c within the spectral dataset. The combination of PCA and CA reveals cyt c and its oxidative state. The results demonstrate that uv/vis spectroscopic imaging in conjunction with sophisticated multivariate methods is a suitable tool to characterize cyt c under in situ conditions.
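
    The PCA-plus-cluster-analysis pipeline described above can be sketched with numpy alone. The spectra below are synthetic: two pixel populations with different (purely illustrative) band positions stand in for the two oxidative states; band centres, widths, and cluster count are assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for uv/vis spectral-image pixels: two populations whose
# absorption bands differ (illustrative positions, not real cyt c bands).
wl = np.linspace(400, 650, 120)
band_a = np.exp(-((wl - 550.0) / 12.0) ** 2)
band_b = np.exp(-((wl - 520.0) / 12.0) ** 2)
spectra = np.vstack([band_a + 0.02 * rng.standard_normal((50, wl.size)),
                     band_b + 0.02 * rng.standard_normal((50, wl.size))])

# PCA via SVD of the mean-centred data; keep the first two score components.
centred = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U[:, :2] * S[:2]

def kmeans(points, k=2, iters=25):
    """Plain Lloyd k-means with farthest-point initialisation."""
    c = [points[0]]
    for _ in range(k - 1):
        d = np.min(((points[:, None, :] - np.array(c)[None]) ** 2).sum(-1), axis=1)
        c.append(points[np.argmax(d)])
    c = np.array(c)
    for _ in range(iters):
        labels = np.argmin(((points[:, None, :] - c[None]) ** 2).sum(-1), axis=1)
        c = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(scores)   # cluster assignment per pixel in PC space
```

    Clustering in the low-dimensional score space rather than on the raw spectra is the key design choice: it suppresses channel noise while preserving the between-population band difference.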

  14. Power processing

    NASA Technical Reports Server (NTRS)

    Schwarz, F. C.

    1971-01-01

    Processing of electric power has been presented as a discipline that draws on almost every field of electrical engineering, including system and control theory, communications theory, electronic network design, and power component technology. The cost of power processing equipment, which often equals that of expensive, sophisticated, and unconventional sources of electrical energy, such as solar batteries, is a significant consideration in the choice of electric power systems.

  15. Automatic Neural Processing of Disorder-Related Stimuli in Social Anxiety Disorder: Faces and More

    PubMed Central

    Schulz, Claudia; Mothes-Lasch, Martin; Straube, Thomas

    2013-01-01

    It has been proposed that social anxiety disorder (SAD) is associated with automatic information processing biases resulting in hypersensitivity to signals of social threat such as negative facial expressions. However, the nature and extent of automatic processes in SAD on the behavioral and neural level is not entirely clear yet. The present review summarizes neuroscientific findings on automatic processing of facial threat but also other disorder-related stimuli such as emotional prosody or negative words in SAD. We review initial evidence for automatic activation of the amygdala, insula, and sensory cortices as well as for automatic early electrophysiological components. However, findings vary depending on tasks, stimuli, and neuroscientific methods. Only few studies set out to examine automatic neural processes directly and systematic attempts are as yet lacking. We suggest that future studies should: (1) use different stimulus modalities, (2) examine different emotional expressions, (3) compare findings in SAD with other anxiety disorders, (4) use more sophisticated experimental designs to investigate features of automaticity systematically, and (5) combine different neuroscientific methods (such as functional neuroimaging and electrophysiology). Finally, the understanding of neural automatic processes could also provide hints for therapeutic approaches. PMID:23745116

  16. Alpha spectrometric characterization of process-related particle size distributions from active particle sampling at the Los Alamos National Laboratory uranium foundry

    NASA Astrophysics Data System (ADS)

    Plionis, A. A.; Peterson, D. S.; Tandon, L.; LaMont, S. P.

    2010-03-01

    Uranium particles within the respirable size range pose a significant hazard to the health and safety of workers. Significant differences in the deposition and incorporation patterns of aerosols within the respirable range can be identified and integrated into sophisticated health physics models. Data characterizing the uranium particle size distribution resulting from specific foundry-related processes are needed. Using personal air sampling cascade impactors, particles collected from several foundry processes were sorted by activity median aerodynamic diameter onto various Marple substrates. After an initial gravimetric assessment of each impactor stage, the substrates were analyzed by alpha spectrometry to determine the uranium content of each stage. Alpha spectrometry provides rapid nondestructive isotopic data that can distinguish process uranium from natural sources and the degree of uranium contribution to the total accumulated particle load. In addition, the particle size bins utilized by the impactors provide adequate resolution to determine whether a process particle size distribution is lognormal, bimodal, or trimodal. Data on process uranium particle size values and distributions facilitate the development of more sophisticated and accurate models for internal dosimetry, resulting in an improved understanding of foundry worker health and safety.
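
    Stage-resolved activity data of the kind described reduce to an activity median aerodynamic diameter (AMAD) and geometric standard deviation (GSD) by interpolating the cumulative activity distribution. The sketch below uses hypothetical impactor cut-points and simulates the cumulative fractions from a lognormal (AMAD 5 µm, GSD 2); none of the numbers come from the study.

```python
import numpy as np
from math import erf, log, sqrt

def lognorm_cdf(d, median, gsd):
    """CDF of a lognormal aerosol size distribution parameterised by its
    median diameter and geometric standard deviation (GSD)."""
    return 0.5 * (1.0 + erf(log(d / median) / (sqrt(2.0) * log(gsd))))

# Hypothetical impactor cut-points (µm); cumulative activity fractions are
# simulated from a lognormal with AMAD 5 µm and GSD 2 for illustration.
cuts = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 9.0, 15.0, 25.0])
cum_frac = np.array([lognorm_cdf(d, 5.0, 2.0) for d in cuts])

def percentile_diameter(cuts, cum_frac, p):
    """Log-linear interpolation of the p-th percentile diameter."""
    return float(np.exp(np.interp(p, cum_frac, np.log(cuts))))

amad = percentile_diameter(cuts, cum_frac, 0.50)       # ~5 µm
gsd = percentile_diameter(cuts, cum_frac, 0.84) / amad # d84/d50 estimate
```

    With real stage data, departures of the cumulative curve from a single straight line on log-probability axes are what flag the bimodal or trimodal distributions the abstract mentions.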

  17. Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake

    NASA Astrophysics Data System (ADS)

    Hough, Susan E.

    2008-07-01

    The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful to exploit fully the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations done by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts—and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

  18. Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hough, Susan E.

    2008-07-08

    The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful to exploit fully the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations done by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts - and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

  19. Neo-Sophistic Rhetorical Theory: Sophistic Precedents for Contemporary Epistemic Rhetoric.

    ERIC Educational Resources Information Center

    McComiskey, Bruce

    Interest in the sophists has recently intensified among rhetorical theorists, culminating in the notion that rhetoric is epistemic. Epistemic rhetoric has its first and deepest roots in sophistic epistemological and rhetorical traditions, so that the view of rhetoric as epistemic is now being dubbed "neo-sophistic." In epistemic…

  20. Pure sources and efficient detectors for optical quantum information processing

    NASA Astrophysics Data System (ADS)

    Zielnicki, Kevin

    Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on optimizing the detection efficiency of visible light photon counters (VLPCs), a single-photon detection technology that is also capable of resolving photon number states. We report a record-breaking quantum efficiency of 91 +/- 3% observed with our detection system. Both sources and detectors are independently interesting physical systems worthy of study, but together they promise to enable entire new classes and applications of information based on quantum mechanics.

  1. Sophisticated Clean Air Strategies Required to Mitigate Against Particulate Organic Pollution

    NASA Astrophysics Data System (ADS)

    Grigas, T.; Ovadnevaite, J.; Ceburnis, D.; Moran, E.; McGovern, F. M.; Jennings, S. G.; O'Dowd, C.

    2017-03-01

    Since the 1980s, measures mitigating the impact of transboundary air pollution have been implemented successfully as evidenced in the 1980-2014 record of atmospheric sulphur pollution over the NE-Atlantic, a key region for monitoring background northern-hemisphere pollution levels. The record reveals a 72-79% reduction in annual-average airborne sulphur pollution (SO4 and SO2, respectively) over the 35-year period. The NE-Atlantic, as observed from the Mace Head research station on the Irish coast, can be considered clean for 64% of the time during which sulphate dominates PM1 levels, contributing 42% of the mass, and for the remainder of the time, under polluted conditions, a carbonaceous (organic matter and Black Carbon) aerosol prevails, contributing 60% to 90% of the PM1 mass and exhibiting a trend whereby its contribution increases with increasing pollution levels. The carbonaceous aerosol is known to be diverse in source and nature and requires sophisticated air pollution policies underpinned by sophisticated characterisation and source apportionment capabilities to inform selective emissions-reduction strategies. Inauspiciously, however, this carbonaceous concoction is not measured in regulatory Air Quality networks.

  2. Sophisticated Clean Air Strategies Required to Mitigate Against Particulate Organic Pollution.

    PubMed

    Grigas, T; Ovadnevaite, J; Ceburnis, D; Moran, E; McGovern, F M; Jennings, S G; O'Dowd, C

    2017-03-17

    Since the 1980s, measures mitigating the impact of transboundary air pollution have been implemented successfully as evidenced in the 1980-2014 record of atmospheric sulphur pollution over the NE-Atlantic, a key region for monitoring background northern-hemisphere pollution levels. The record reveals a 72-79% reduction in annual-average airborne sulphur pollution (SO4 and SO2, respectively) over the 35-year period. The NE-Atlantic, as observed from the Mace Head research station on the Irish coast, can be considered clean for 64% of the time during which sulphate dominates PM1 levels, contributing 42% of the mass, and for the remainder of the time, under polluted conditions, a carbonaceous (organic matter and Black Carbon) aerosol prevails, contributing 60% to 90% of the PM1 mass and exhibiting a trend whereby its contribution increases with increasing pollution levels. The carbonaceous aerosol is known to be diverse in source and nature and requires sophisticated air pollution policies underpinned by sophisticated characterisation and source apportionment capabilities to inform selective emissions-reduction strategies. Inauspiciously, however, this carbonaceous concoction is not measured in regulatory Air Quality networks.

  3. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.

  4. Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised

    NASA Technical Reports Server (NTRS)

    Key, Jeffrey R.; Schweiger, Axel J.

    1998-01-01

    Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible, medium-spectral-resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast, neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of FluxNet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes, while the efficiency and simplicity of FluxNet make it ideal for operational use.

  5. Research in speech communication.

    PubMed

    Flanagan, J

    1995-10-24

    Advances in digital speech processing are now supporting application and deployment of a variety of speech technologies for human/machine communication. In fact, new businesses are rapidly forming around these technologies. But these capabilities are of little use unless society can afford them. Happily, explosive advances in microelectronics over the past two decades have assured affordable access to this sophistication as well as to the underlying computing technology. The research challenges in speech processing remain in the traditionally identified areas of recognition, synthesis, and coding. These three areas have typically been addressed individually, often with significant isolation among the efforts. But they are all facets of the same fundamental issue: how to represent and quantify the information in the speech signal. This implies deeper understanding of the physics of speech production, the constraints that the conventions of language impose, and the mechanism for information processing in the auditory system. In ongoing research, therefore, we seek more accurate models of speech generation, better computational formulations of language, and realistic perceptual guides for speech processing, along with ways to coalesce the fundamental issues of recognition, synthesis, and coding. Successful solution will yield the long-sought dictation machine, high-quality synthesis from text, and the ultimate in low bit-rate transmission of speech. It will also open the door to language-translating telephony, where the synthetic foreign translation can be in the voice of the originating talker.

  6. Appreciation of the nature of light demands enhancement over the prevailing scientific epistemology

    NASA Astrophysics Data System (ADS)

    Roychoudhuri, Chandrasekhar

    2011-09-01

    Based on attempts to resolve the problem of various self-contradictory assumptions behind the prevailing belief in single-photon interference, we have analyzed the process steps behind our experimental measurements and named the process the Interaction Process Mapping Epistemology (IPM-E). This has helped us recognize that the quantum mechanical Measurement Problem has a much more universal and deeper root in nature. Our scientific theorization process suffers from a Perpetual Information Challenge (PIC), which cannot be overcome by elegant and/or sophisticated mathematical theories alone. Iterative, imaginative application of IPM-E needs to be used as a metaphorical analytical continuation to fill in the missing information gaps. IPM-E has also guided us to recognize the generic NIW principle (Non-Interaction of Waves) in the linear domain, not explicitly recognized in current books and literature. Superposition effects become manifest through light-matter interactions. A detecting dipole is stimulated by multiple superposed beams; it sums the simultaneous multiple stimulations into a single resultant undulation, which then guides the resultant energy exchange. The consequent transformation in the detector corresponds to the observed fringes. These neither represent interference of light nor the selective arrival or non-arrival of photons at the detector. Photons do not possess any force of mutual interaction to generate their redistribution. Implementation of IPM-E requires us to recognize the subjective interpretation propensity with which we are burdened due to our evolutionary successes.

  7. How Adolescents Search for and Appraise Online Health Information: A Systematic Review.

    PubMed

    Freeman, Jaimie L; Caldwell, Patrina H Y; Bennett, Patricia A; Scott, Karen M

    2018-04-01

    To conduct a systematic review of the evidence concerning whether and how adolescents search for online health information and the extent to which they appraise the credibility of information they retrieve. A systematic search of online databases (MEDLINE, EMBASE, PsycINFO, ERIC) was performed. Reference lists of included papers were searched manually for additional articles. Included were studies on whether and how adolescents searched for and appraised online health information, where adolescent participants were aged 13-18 years. Thematic analysis was used to synthesize the findings. Thirty-four studies met the inclusion criteria. In line with the research questions, 2 key concepts were identified within the papers: whether and how adolescents search for online health information, and the extent to which adolescents appraise online health information. Four themes were identified regarding whether and how adolescents search for online health information: use of search engines, difficulties in selecting appropriate search strings, barriers to searching, and absence of searching. Four themes emerged concerning the extent to which adolescents appraise the credibility of online health information: evaluation based on Web site name and reputation, evaluation based on first impression of Web site, evaluation of Web site content, and absence of a sophisticated appraisal strategy. Adolescents are aware of the varying quality of online health information. Strategies used by individuals for searching and appraising online health information differ in their sophistication. It is important to develop resources to enhance search and appraisal skills and to collaborate with adolescents to ensure that such resources are appropriate for them. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. 77 FR 71461 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ... Management and Budget (``OMB'') a request for approval of extension of the existing collection of information... determinations regarding the financial status of the customer, a bank employee's statutory disqualification... determination regarding a customer's high net worth or institutional status or suitability or sophistication...

  9. Sensor Web for Spatio-Temporal Monitoring of a Hydrological Environment

    NASA Technical Reports Server (NTRS)

    Delin, K. A.; Jackson, S. P.; Johnson, D. W.; Burleigh, S. C.; Woodrow, R. R.; McAuley, M.; Britton, J. T.; Dohm, J. M.; Ferre, T. P. A.; Ip, Felipe

    2004-01-01

    The Sensor Web is a macroinstrument concept that allows for the spatio-temporal understanding of an environment through coordinated efforts between multiple numbers and types of sensing platforms, including, in its most general form, both orbital and terrestrial and both fixed and mobile. Each of these platforms, or pods, communicates within its local neighborhood and thus distributes information to the instrument as a whole. The result of sharing and continual processing of this information among all the Sensor Web elements will result in an information flow and a global perception of and reactive capability to the environment. As illustrated, the Sensor Web concept also allows for the recursive notion of a web of webs with individual distributed instruments possibly playing the role of a single node point on a larger Sensor Web instrument. In particular, the fusion of inexpensive, yet sophisticated, commercial technology from both the computation and telecommunication revolutions has enabled the development of practical, fielded, and embedded in situ systems that have been the focus of the NASA/JPL Sensor Webs Project (http://sensorwebs.jpl.nasa.gov/). These Sensor Webs are complete systems consisting of not only the pod elements that wirelessly communicate among themselves, but also interfacing and archiving software that allows for easy use by the end-user. Previous successful deployments have included environments as diverse as coastal regions, Antarctica, and desert areas. The Sensor Web has broad implications for Earth and planetary science and will revolutionize the way experiments and missions are conceived and performed. As part of our current efforts to develop a macrointelligence within the system, we have deployed a Sensor Web at the Central Avra Valley Storage and Recovery Project (CAVSARP) facility located west of Tucson, AZ. This particular site was selected because it is ideal for studying spatio-temporal phenomena and for providing a test site for more sophisticated hydrological studies in the future.

  10. A Public Involvement Road Map

    DOT National Transportation Integrated Search

    1998-09-16

    In order to have effective public involvement, governments need a road map for the decision-making process. Yet, citizens from small and medium-sized cities frequently do not have the resources to use sophisticated technology for public involve...

  11. That Elusive, Eclectic Thing Called Thermal Environment: What a Board Should Know About It

    ERIC Educational Resources Information Center

    Schutte, Frederick

    1970-01-01

    Discussion of proper thermal environment for protection of sophisticated educational equipment such as computer and data-processing machines, magnetic tapes, closed-circuit television and video tape communications systems.

  12. Active learning strategies for the deduplication of electronic patient data using classification trees.

    PubMed

    Sariyar, M; Borg, A; Pommerening, K

    2012-10-01

    Supervised record linkage methods often require a clerical review to gain informative training data. Active learning means to actively prompt the user to label data with special characteristics in order to minimise the review costs. We conducted an empirical evaluation to investigate whether a simple active learning strategy using binary comparison patterns is sufficient or if string metrics together with a more sophisticated algorithm are necessary to achieve high accuracies with a small training set. Based on medical registry data with different numbers of attributes, we used active learning to acquire training sets for classification trees, which were then used to classify the remaining data. Active learning for binary patterns means that every distinct comparison pattern represents a stratum from which one item is sampled. Active learning for patterns consisting of the Levenshtein string metric values uses an iterative process where the most informative and representative examples are added to the training set. In this context, we extended the active learning strategy by Sarawagi and Bhamidipaty (2002). On the original data set, active learning based on binary comparison patterns leads to the best results. When dropping four or six attributes, using string metrics leads to better results. In both cases, not more than 200 manually reviewed training examples are necessary. In record linkage applications where only forename, name and birthday are available as attributes, we suggest the sophisticated active learning strategy based on string metrics in order to achieve highly accurate results. We recommend the simple strategy if more attributes are available, as in our study. In both cases, active learning significantly reduces the amount of manual involvement in training data selection compared to usual record linkage settings. Copyright © 2012 Elsevier Inc. All rights reserved.
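The two sampling strategies compared in this record can be sketched in a few lines. This is a minimal illustration, not the study's actual pipeline: the record attributes and helper names below are hypothetical, and the downstream classification-tree training is not reproduced. A sketch of the simple strategy (one candidate pair per distinct binary comparison pattern, each pattern forming a stratum), with a Levenshtein distance available for the string-metric variant:

```python
from itertools import combinations

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def binary_pattern(rec_a, rec_b):
    """Agreement (1) / disagreement (0) per attribute of a record pair."""
    return tuple(int(x == y) for x, y in zip(rec_a, rec_b))

def one_per_pattern(records):
    """Simple active-learning sampling: keep one candidate pair per
    distinct binary comparison pattern (each pattern is a stratum)."""
    strata = {}
    for a, b in combinations(records, 2):
        strata.setdefault(binary_pattern(a, b), (a, b))
    return strata  # pairs to send for clerical review
```

Each sampled pair would then be labelled by a reviewer and used to train a classification tree; the string-metric variant described above replaces the 0/1 agreements with Levenshtein-based similarity values and adds an iterative selection of the most informative and representative examples.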

  13. Building Geographic Information System Capacity in Local Health Departments: Lessons From a North Carolina Project

    PubMed Central

    Miranda, Marie Lynn; Silva, Jennifer M.; Overstreet Galeano, M. Alicia; Brown, Jeffrey P.; Campbell, Douglas S.; Coley, Evelyn; Cowan, Christopher S.; Harvell, Dianne; Lassiter, Jenny; Parks, Jerry L.; Sandelé, Wanda

    2005-01-01

    State government, university, and local health department (LHD) partners collaborated to build the geographic information system (GIS) capacity of 5 LHDs in North Carolina. Project elements included procuring hardware and software, conducting individualized and group training, developing data layers, guiding the project development process, coordinating participation in technical conferences, providing ongoing project consultation, and evaluating project milestones. The project provided health department personnel with the skills and resources required to use sophisticated information management systems, particularly those that address spatial dimensions of public health practice. This capacity-building project helped LHDs incorporate GIS technology into daily operations, resulting in improved time and cost efficiency. Keys to success included (1) methods training rooted in problems specific to the LHD, (2) required project identification by LHD staff with associated timelines for development, (3) ongoing technical support as staff returned to home offices after training, (4) subgrants to LHDs to ease hardware and software resource constraints, (5) networks of relationships among LHDs and other professional GIS users, and (6) senior LHD leadership who supported the professional development activities being undertaken by staff. PMID:16257950

  14. Assessing the impact of graphical quality on automatic text recognition in digital maps

    NASA Astrophysics Data System (ADS)

    Chiang, Yao-Yi; Leyk, Stefan; Honarvar Nazari, Narges; Moghaddam, Sima; Tan, Tian Xiang

    2016-08-01

    Converting geographic features (e.g., place names) in map images into a vector format is the first step for incorporating cartographic information into a geographic information system (GIS). With the advancement in computational power and algorithm design, map processing systems have been considerably improved over the last decade. However, the fundamental map processing techniques such as color image segmentation, (map) layer separation, and object recognition are sensitive to minor variations in graphical properties of the input image (e.g., scanning resolution). As a result, most map processing results would not meet user expectations if the user does not "properly" scan the map of interest, pre-process the map image (e.g., using compression or not), and train the processing system, accordingly. These issues could slow down the further advancement of map processing techniques as such unsuccessful attempts create a discouraged user community, and less sophisticated tools would be perceived as more viable solutions. Thus, it is important to understand what kinds of maps are suitable for automatic map processing and what types of results and process-related errors can be expected. In this paper, we shed light on these questions by using a typical map processing task, text recognition, to discuss a number of map instances that vary in suitability for automatic processing. We also present an extensive experiment on a diverse set of scanned historical maps to provide measures of baseline performance of a standard text recognition tool under varying map conditions (graphical quality) and text representations (that can vary even within the same map sheet). Our experimental results help the user understand what to expect when a fully or semi-automatic map processing system is used to process a scanned map with certain (varying) graphical properties and complexities in map content.

  15. Evolution of grid-wide access to database resident information in ATLAS using Frontier

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Bujor, F.; de Stefano, J.; Dewhurst, A. L.; Dykstra, D.; Front, D.; Gallas, E.; Gamboa, C. F.; Luehring, F.; Walker, R.

    2012-12-01

    The ATLAS experiment deployed Frontier technology worldwide during the initial year of LHC collision data taking to enable user analysis jobs running on the Worldwide LHC Computing Grid to access database resident data. Since that time, the deployment model has evolved to optimize resources, improve performance, and streamline maintenance of Frontier and related infrastructure. In this presentation we focus on the specific changes in the deployment and improvements undertaken, such as the optimization of cache and launchpad location, the use of RPMs for more uniform deployment of underlying Frontier related components, improvements in monitoring, optimization of fail-over, and an increasing use of a centrally managed database containing site specific information (for configuration of services and monitoring). In addition, analysis of Frontier logs has allowed us a deeper understanding of problematic queries and understanding of use cases. Use of the system has grown beyond user analysis and subsystem specific tasks such as calibration and alignment, extending into production processing areas, such as initial reconstruction and trigger reprocessing. With a more robust and tuned system, we are better equipped to satisfy the still growing number of diverse clients and the demands of increasingly sophisticated processing and analysis.

  16. Quantum entanglement and informational activities of biomolecules

    NASA Astrophysics Data System (ADS)

    Al-Shargi, Hanan; Berkovich, Simon

    2009-03-01

    Our model of the holographic Universe [1] explains the surprising property of quantum entanglement and reveals its biological implications. The suggested holographic mechanism handles 2D slices of the physical world as a whole. Fitting this simple holistic process in the Procrustean bed of individual particle interactions leads to intricacies of quantum theory with an unintelligible protrusion of distant correlations. The holographic medium imposes dependence of quantum effects on absolute positioning. Testing this prediction for a non-exponential radioactive decay could resolutely point to outside "memory." The essence of Life is in the sophistication of macromolecules. Distinctions in biological information processing of nucleotides in DNA and amino acids in proteins are related to entropies of their structures. Randomness of genetic configurations as exposed by their maximal entropy is characteristic of passive identification rather than active storage functionality. Structural redundancy of proteins shows their operability, of which the different foldings of prions are most indicative. Folding of one prion can reshape another prion without a direct contact, appearing like "quantum entanglement" or "teleportation." Testing the surmised influence of absolute orientation on the prion reshaping can uncover the latency effects in the "mad cow" disease. 1. Simon Berkovich, TR-GWU-CS-07-006, http://www.cs.gwu.edu/research/reports.php

  17. Ontology-based coupled optimisation design method using state-space analysis for the spindle box system of large ultra-precision optical grinding machine

    NASA Astrophysics Data System (ADS)

    Wang, Qianren; Chen, Xing; Yin, Yuehong; Lu, Jian

    2017-08-01

    With the increasing complexity of mechatronic products, traditional empirical or step-by-step design methods are facing great challenges, with various factors and different stages having become inevitably coupled during the design process. Management of massive information or big data, as well as the efficient operation of information flow, is deeply involved in the process of coupled design. Designers have to address increasingly sophisticated situations when coupled optimisation is also engaged. Aiming at overcoming the difficulties involved in conducting the design of the spindle box system of an ultra-precision optical grinding machine, this paper proposes a coupled optimisation design method based on state-space analysis, with the design knowledge represented by ontologies and their semantic networks. An electromechanical coupled model integrating the mechanical structure, control system and driving system of the motor is established, mainly concerning the stiffness matrices of the hydrostatic bearings, ball screw nut and rolling guide sliders. The effectiveness and precision of the method are validated by the simulation results of the natural frequency and deformation of the spindle box when applying an impact force to the grinding wheel.

  18. Finding and Recommending Scholarly Articles

    NASA Astrophysics Data System (ADS)

    Kurtz, Michael J.; Henneken, Edwin A.

    2014-05-01

    The rate at which scholarly literature is being produced has been increasing at approximately 3.5 percent per year for decades. This means that during a typical 40-year career the amount of new literature produced each year increases by a factor of four. The methods scholars use to discover relevant literature must change. Just like everybody else involved in information discovery, scholars are confronted with information overload. Two decades ago, this discovery process essentially consisted of paging through abstract books, talking to colleagues and librarians, and browsing journals: a time-consuming process, which could take even longer if material had to be shipped from elsewhere. Now much of this discovery process is mediated by online scholarly information systems. All these systems are relatively new, and all are still changing. They all share a common goal: to provide their users with access to the literature relevant to their specific needs. To achieve this, each system responds to actions by the user by displaying articles which the system judges relevant to the user's current needs. Recently, search systems which use particularly sophisticated methodologies to recommend a few specific papers to the user have been called "recommender systems". These methods are in line with the current use of the term "recommender system" in computer science. We do not adopt this definition; rather, we view systems like these as components in a larger whole, which is presented by the scholarly information systems themselves. In what follows we view the recommender system as an aspect of the entire information system: one which combines the massive memory capacities of the machine with the cognitive abilities of the human user to achieve a human-machine synergy.

  19. Climate Science Communications - Video Visualization Techniques

    NASA Astrophysics Data System (ADS)

    Reisman, J. P.; Mann, M. E.

    2010-12-01

    Communicating climate science is challenging due to its complexity. But, as they say, a picture is worth a thousand words. Visualization techniques can be merely graphical or can combine multimedia so as to make graphs come alive in context with other visual and auditory cues. This can also make the information come alive in a way that better communicates what the science is all about. What types of graphics to use depends on your audience: some graphs are great for scientists, but if you are trying to communicate to a less sophisticated audience, certain visuals translate information in a more easily perceptible manner. Hollywood techniques and style can be applied to these graphs to give them even more impact. Video is one of the most powerful communication tools in its ability to combine visual and audio through time. Adding music and visual cues such as pans and zooms can greatly enhance the ability to communicate your concepts. Video software ranges from relatively simple to very sophisticated. In reality, you don't need the best tools to get your point across. In fact, with relatively inexpensive software, you can put together powerful videos that more effectively convey the science you are working on with greater sophistication, and in an entertaining way. We will examine some basic techniques to increase the quality of video visualization to make it more effective in communicating complexity. If a picture is worth a thousand words, a decent video with music and a bit of narration is priceless.

  20. Roman sophisticated surface modification methods to manufacture silver counterfeited coins

    NASA Astrophysics Data System (ADS)

    Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.

    2017-11-01

    By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to systematically manipulate alloys, chemically and metallurgically, at a micro scale to produce adherent precious metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed firstly at saving expensive metals as much as possible, allowing profitable large-scale production at a lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade, and/or a combination of social, political and economic factors that demanded a change in money supply. Finally, some information on corrosion products has been obtained, which is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.

  1. Information Technology in Complex Health Services

    PubMed Central

    Southon, Frank Charles Gray; Sauer, Chris; Dampney, Christopher Noel Grant (Kit)

    1997-01-01

    Objective: To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. Design: A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Measurements: Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Results: Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. Conclusion: The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case. PMID:9067877

  2. Information technology in complex health services: organizational impediments to successful technology transfer and diffusion.

    PubMed

    Southon, F C; Sauer, C; Grant, C N

    1997-01-01

    To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case.

  3. Geophysical phenomena classification by artificial neural networks

    NASA Technical Reports Server (NTRS)

    Gough, M. P.; Bruckner, J. R.

    1995-01-01

    Space science information systems involve accessing vast databases. There is a need for an automatic process by which properties of the whole data set can be assimilated and presented to the user. Where data are in the form of spectrograms, phenomena can be detected by pattern recognition techniques. Presented are the first results obtained by applying unsupervised Artificial Neural Networks (ANNs) to the classification of magnetospheric wave spectra. The networks used here were a simple unsupervised Hamming network run on a PC and a more sophisticated CALM network run on a Sparc workstation. The ANNs were compared in their geophysical data recognition performance. CALM networks offer such qualities as fast learning, superiority in generalizing, the ability to continuously adapt to changes in the pattern set, and the possibility to modularize the network to allow the inter-relation between phenomena and data sets. This work is the first step toward an information system interface being developed at Sussex, the Whole Information System Expert (WISE). Phenomena in the data are automatically identified and provided to the user in the form of a data occurrence morphology, the Whole Information System Data Occurrence Morphology (WISDOM), along with relationships to other parameters and phenomena.
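The simple Hamming network mentioned in this record classifies a binary input by selecting the stored exemplar with the smallest Hamming distance (equivalently, the largest number of agreeing components, which a MAXNET stage then picks out). A minimal sketch of that selection rule; the bipolar exemplar vectors below are illustrative placeholders, not actual wave-spectra classes:

```python
def hamming_classify(x, exemplars):
    """Return the index of the exemplar closest to x in Hamming distance.
    x and each exemplar are equal-length sequences of +1/-1 (bipolar) values."""
    # Matching score = number of agreeing components; the winner-take-all
    # (MAXNET) stage of a Hamming network simply selects the largest score.
    scores = [sum(int(xi == ei) for xi, ei in zip(x, e)) for e in exemplars]
    return max(range(len(scores)), key=scores.__getitem__)

# Two illustrative stored patterns (hypothetical spectral classes).
EXEMPLARS = [
    (1, 1, -1, -1, 1, -1),
    (-1, -1, 1, 1, -1, 1),
]
```

An input agreeing with the first exemplar in five of six components is assigned class 0. Adaptive unsupervised variants such as the CALM network mentioned above go further by updating the stored patterns as new spectra arrive, which this fixed-exemplar sketch does not attempt.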

  4. Reasons in Support of Data Security and Data Security Management as Two Independent Concepts: A New Model.

    PubMed

    Moghaddasi, Hamid; Sajjadi, Samad; Kamkarhaghighi, Mehran

    2016-01-01

    Any information which is generated and saved needs to be protected against accidental or intentional losses and manipulations if it is to be used by the intended users in due time. As information technology has spread, information managers have adopted numerous measures to achieve data security within data storage systems. The "data security models" presented thus far have unanimously highlighted the significance of data security management. For further clarification, the current study first introduces the "needs and improvement" cycle; the study will then present some independent definitions, together with a support umbrella, in an attempt to shed light on data security management. Data security focuses on three features or attributes known as integrity, identity of sender(s) and identity of receiver(s). Management in data security follows an endless evolutionary process, to keep up with new developments in information technology and communication. In this process, management develops new characteristics with greater capabilities to achieve better data security. The characteristics, continuously increasing in number, with a special focus on control, are as follows: private zone, confidentiality, availability, non-repudiation, possession, accountability, authenticity, authentication and auditability. Data security management steadily progresses, resulting in more sophisticated features. The developments are in line with new developments in information and communication technology and novel advances in intrusion detection systems (IDS). Attention to differences between data security and data security management by international organizations such as the International Organization for Standardization (ISO) and the International Telecommunication Union (ITU) is necessary if information quality is to be enhanced.

  5. Reasons in Support of Data Security and Data Security Management as Two Independent Concepts: A New Model

    PubMed Central

    Moghaddasi, Hamid; Kamkarhaghighi, Mehran

    2016-01-01

    Introduction: Any information which is generated and saved needs to be protected against accidental or intentional losses and manipulations if it is to be used by the intended users in due time. As information technology has spread, information managers have adopted numerous measures to achieve data security within data storage systems. Background: The “data security models” presented thus far have unanimously highlighted the significance of data security management. For further clarification, the current study first introduces the “needs and improvement” cycle; the study will then present some independent definitions, together with a support umbrella, in an attempt to shed light on data security management. Findings: Data security focuses on three features or attributes known as integrity, identity of sender(s) and identity of receiver(s). Management in data security follows an endless evolutionary process, to keep up with new developments in information technology and communication. In this process, management develops new characteristics with greater capabilities to achieve better data security. The characteristics, continuously increasing in number, with a special focus on control, are as follows: private zone, confidentiality, availability, non-repudiation, possession, accountability, authenticity, authentication and auditability. Conclusion: Data security management steadily progresses, resulting in more sophisticated features. The developments are in line with new developments in information and communication technology and novel advances in intrusion detection systems (IDS). Attention to differences between data security and data security management by international organizations such as the International Organization for Standardization (ISO) and the International Telecommunication Union (ITU) is necessary if information quality is to be enhanced. PMID:27857823

  6. Comparing the Cognitive Process of Circular Causality in Two Patients with Strokes through Qualitative Analysis.

    PubMed

    Derakhshanrad, Seyed Alireza; Piven, Emily; Ghoochani, Bahareh Zeynalzadeh

    2017-10-01

    Walter J. Freeman pioneered the neurodynamic model of brain activity when he described the brain dynamics for cognitive information transfer as the process of circular causality at intention, meaning, and perception (IMP) levels. This view contributed substantially to the establishment of the Intention, Meaning, and Perception Model of Neuro-occupation in occupational therapy. As described by the model, IMP levels are three components of the brain dynamics system, with nonlinear connections that enable cognitive function to be processed in a circular causality fashion, known as Cognitive Process of Circular Causality (CPCC). Although considerable research has been devoted to studying brain dynamics with sophisticated computerized imaging techniques, less attention has been paid to studying it by investigating the adaptation process of thoughts and behaviors. To explore how CPCC manifests in thinking and behavioral patterns, a qualitative case study was conducted on two matched female participants with strokes, who were of comparable ages, affected sides, and other characteristics, except for their resilience and motivational behaviors. CPCC was compared by matrix analysis between the two participants, using content analysis with pre-determined categories. Different patterns of thinking and behavior may have arisen due to disparate regulation of CPCC between the two participants.

  7. 77 FR 59029 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-25

    ... information to the Office of Management and Budget (``OMB'') for extension and approval. Regulation R, Rule... dealer) to notify the bank if the broker or dealer makes certain determinations regarding the financial... worth or institutional status or suitability or sophistication standing as well as a bank employee's...

  8. Identifying Productive Resources in Secondary School Students' Discourse about Energy

    ERIC Educational Resources Information Center

    Harrer, Benedikt

    2013-01-01

    A growing program of research in science education acknowledges the beginnings of disciplinary reasoning in students' ideas and seeks to inform instruction that responds productively to these disciplinary progenitors in the moment to foster their development into sophisticated scientific practice. This dissertation examines secondary school…

  9. Phonetics Information Base and Lexicon

    ERIC Educational Resources Information Center

    Moran, Steven Paul

    2012-01-01

    In this dissertation, I investigate the linguistic and technological challenges involved in creating a cross-linguistic data set to undertake phonological typology. I then address the question of whether more sophisticated, knowledge-based approaches to data modeling, coupled with a broad cross-linguistic data set, can extend previous typological…

  10. Multiple Hypnotizabilities: Differentiating the Building Blocks of Hypnotic Response

    ERIC Educational Resources Information Center

    Woody, Erik Z.; Barnier, Amanda J.; McConkey, Kevin M.

    2005-01-01

    Although hypnotizability can be conceptualized as involving component subskills, standard measures do not differentiate them from a more general unitary trait, partly because the measures include limited sets of dichotomous items. To overcome this, the authors applied full-information factor analysis, a sophisticated analytic approach for…

  11. Environmental Scanning Practices in Junior, Technical, and Community Colleges.

    ERIC Educational Resources Information Center

    Friedel, Janice N.; Rosenberg, Dana

    1993-01-01

    Reports results of a 1991 national survey of environmental scanning practices at two-year institutions. Examines sophistication of scanning efforts; personnel involved; and methods of collecting, compiling, interpreting, communicating, and using scan information. Finds scanning practices in use at 41% of the 601 responding institutions. (PAA)

  12. Cartoon Violence: Is It as Detrimental to Preschoolers as We Think?

    ERIC Educational Resources Information Center

    Peters, Kristen M.; Blumberg, Fran C.

    2002-01-01

    Critically reviews research on effects of cartoon violence on children's moral understanding and behavior to enable early childhood educators and parents to make informed decisions about what constitutes potentially harmful television viewing. Focuses on preschoolers' limited comprehension of television content and relatively sophisticated moral…

  13. Modeling, simulation, and analysis of optical remote sensing systems

    NASA Technical Reports Server (NTRS)

    Kerekes, John Paul; Landgrebe, David A.

    1989-01-01

    Remote Sensing of the Earth's resources from space-based sensors has evolved in the past 20 years from a scientific experiment to a commonly used technological tool. The scientific applications and engineering aspects of remote sensing systems have been studied extensively. However, most of these studies have been aimed at understanding individual aspects of the remote sensing process while relatively few have studied their interrelations. A motivation for studying these interrelationships has arisen with the advent of highly sophisticated configurable sensors as part of the Earth Observing System (EOS) proposed by NASA for the 1990's. Two approaches to investigating remote sensing systems are developed. In one approach, detailed models of the scene, the sensor, and the processing aspects of the system are implemented in a discrete simulation. This approach is useful in creating simulated images with desired characteristics for use in sensor or processing algorithm development. A less complete, but computationally simpler method based on a parametric model of the system is also developed. In this analytical model the various informational classes are parameterized by their spectral mean vector and covariance matrix. These class statistics are modified by models for the atmosphere, the sensor, and processing algorithms and an estimate made of the resulting classification accuracy among the informational classes. Application of these models is made to the study of the proposed High Resolution Imaging Spectrometer (HRIS). The interrelationships among observational conditions, sensor effects, and processing choices are investigated with several interesting results.

  14. Development of a fusion approach selection tool

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Zeng, Y.

    2015-06-01

    During the last decades, the number and quality of available remote sensing satellite sensors for Earth observation have grown significantly. The amount of available multi-sensor images, along with their increased spatial and spectral resolution, provides new challenges to Earth scientists. With a Fusion Approach Selection Tool (FAST) the remote sensing community would obtain access to an optimized and improved image processing technology. Remote sensing image fusion is a means of producing images containing information that is not inherent in any single image alone. In the meantime the user has access to sophisticated commercialized image fusion techniques, plus the option to tune the parameters of each individual technique to match the anticipated application. This leaves the operator with an uncountable number of options for combining remote sensing images, to say nothing of the selection of the appropriate images, resolution and bands. Image fusion can be a machine- and time-consuming endeavour. In addition it requires knowledge about remote sensing, image fusion, digital image processing and the application. FAST shall provide the user with a quick overview of processing flows to choose from to reach the target. FAST will ask for the available images, application parameters and desired information, then process this input to produce a workflow that quickly obtains the best results. It will optimize data and image fusion techniques. It provides an overview of the possible results from which the user can choose the best. FAST will enable even inexperienced users to use advanced processing methods to maximize the benefit of multi-sensor image exploitation.

  15. Long Term Value of Apollo Samples: How Fundamental Understanding of a Body Takes Decades of Study

    NASA Astrophysics Data System (ADS)

    Borg, L. E.; Gaffney, A. M.; Kruijer, T. K.; Sio, C. K.

    2018-04-01

    Fundamental understanding of a body evolves as more sophisticated technology is applied to a progressively better understood sample set. Sample diversity is required to understand many geologic processes.

  16. A biomedical information system for retrieval and manipulation of NHANES data.

    PubMed

    Mukherjee, Sukrit; Martins, David; Norris, Keith C; Jenders, Robert A

    2013-01-01

    The retrieval and manipulation of data from large public databases like the U.S. National Health and Nutrition Examination Survey (NHANES) may require sophisticated statistical software and significant expertise that may be unavailable in the university setting. In response, we have developed the Data Retrieval And Manipulation System (DReAMS), an automated information system that handles all processes of data extraction and cleaning and then joins different subsets to produce analysis-ready output. The system is a browser-based data warehouse application in which the input data from flat files or operational systems are aggregated in a structured way so that the desired data can be read, recoded, queried and extracted efficiently. The current pilot implementation of the system provides access to a limited portion of the NHANES database. We plan to increase the amount of data available through the system in the near future and to extend the techniques to other large databases from the CDU archive, which currently holds about 53 databases.

  17. Acquisition and review of diagnostic images for use in medical research and medical testing examinations via the Internet

    NASA Astrophysics Data System (ADS)

    Pauley, Mark A.; Dalrymple, Glenn V.; Zhu, Quiming; Chu, Wei-Kom

    2000-12-01

    With the continued centralization of medical care into large, regional centers, there is a growing need for a flexible, inexpensive, and secure system to rapidly provide referring physicians in the field with the results of the sophisticated medical tests performed at these facilities. Furthermore, the medical community has long recognized the need for a system with similar characteristics to maintain and upgrade patient case sets for oral and written student examinations. With the move toward filmless radiographic instrumentation, the widespread and growing use of digital methods and the Internet, both of these processes can now be realized. This article describes the conceptual development and testing of a protocol that allows users to transmit, modify, remotely store and display the images and textual information of medical cases via the Internet. We also discuss some of the legal issues we encountered regarding the transmission of medical information; these issues have had a direct impact on the implementation of the results of this project.

  18. The evolving role of supply chain management technology in healthcare.

    PubMed

    Langabeer, Jim

    2005-01-01

    The healthcare supply chain is a vast, disintegrated network of products and players, loosely held together by manual and people-intensive processes. Managing the flow of information, supplies, equipment, and services from manufacturers to distributors to providers of care is especially difficult in clinical supply chains, compared with more technology-intense industries like consumer goods or industrial manufacturing. As supplies move downstream towards hospitals and clinics, the quality and robustness of accompanying management and information systems used to manage these products deteriorates significantly. Technology that provides advanced planning, synchronization, and collaboration upstream at the large supply manufacturers and distributors rarely is used at even the world's larger and more sophisticated hospitals. This article outlines the current state of healthcare supply chain management technologies, addresses potential reasons for the lack of adoption of technologies and provides a roadmap for the evolution of technology for the future. This piece is based on both quantitative and qualitative research assessments of the healthcare supply chain conducted during the last two years.

  19. Orbit transfer rocket engine technology program: Automated preflight methods concept definition

    NASA Technical Reports Server (NTRS)

    Erickson, C. M.; Hertzberg, D. W.

    1991-01-01

    The possibility of automating preflight engine checkouts on orbit transfer engines is discussed. The minimum requirements in terms of information and processing necessary to assess the engine's integrity and readiness to perform its mission were first defined. A variety of ways for remotely obtaining that information were generated. The sophistication of these approaches varied from a simple preliminary power up, where the engine is fired up for the first time, to the most advanced approach where the sensor and operational history data system alone indicates engine integrity. The critical issues and benefits of these methods were identified, outlined, and prioritized. The technology readiness of each of these automated preflight methods was then rated on a NASA Office of Exploration scale used for comparing technology options for future mission choices. Finally, estimates were made of the remaining cost to advance the technology for each method to a level where the system validation models have been demonstrated in a simulated environment.

  20. Role of data warehousing in healthcare epidemiology.

    PubMed

    Wyllie, D; Davies, J

    2015-04-01

    Electronic storage of healthcare data, including individual-level risk factors for both infectious and other diseases, is increasing. These data can be integrated at hospital, regional and national levels. Data sources that contain risk factor and outcome information for a wide range of conditions offer the potential for efficient epidemiological analysis of multiple diseases. Opportunities may also arise for monitoring healthcare processes. Integrating diverse data sources presents epidemiological, practical, and ethical challenges. For example, diagnostic criteria, outcome definitions, and ascertainment methods may differ across the data sources. Data volumes may be very large, requiring sophisticated computing technology. Given the large populations involved, perhaps the most challenging aspect is how informed consent can be obtained for the development of integrated databases, particularly when it is not easy to demonstrate their potential. In this article, we discuss some of the ups and downs of recent projects as well as the potential of data warehousing for antimicrobial resistance monitoring. Copyright © 2015. Published by Elsevier Ltd.

  1. The data acquisition and reduction challenge at the Large Hadron Collider.

    PubMed

    Cittolin, Sergio

    2012-02-28

    The Large Hadron Collider detectors are technological marvels that resemble, in functionality, three-dimensional digital cameras with 100 Mpixels, capable of observing proton-proton (pp) collisions at the crossing rate of 40 MHz. Data handling limitations at the recording end imply the selection of only one pp event out of each 10^5. The readout and processing of this huge amount of information, along with the selection of the best approximately 200 events every second, is carried out by a trigger and data acquisition system, supplemented by a sophisticated control and monitor system. This paper presents an overview of the challenges that the development of these systems has presented over the past 15 years. It concludes with a short historical perspective, some lessons learnt and a few thoughts on the future.
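
    The scale of the reduction quoted above is easy to check with back-of-the-envelope arithmetic (the figures are the approximate rates from the abstract, not exact detector specifications):

```python
# 40 MHz of pp bunch crossings, of which only ~1 in 10^5 can be recorded.
crossing_rate_hz = 40_000_000
rejection_factor = 100_000

recorded_rate_hz = crossing_rate_hz / rejection_factor
print(recorded_rate_hz)  # → 400.0
```

    That first-stage output of a few hundred events per second is what the trigger then winnows to the best ~200 events every second.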

  2. Neuroimaging for psychotherapy research: Current trends

    PubMed Central

    WEINGARTEN, CAROL P.; STRAUMAN, TIMOTHY J.

    2014-01-01

    Objective This article reviews neuroimaging studies that inform psychotherapy research. An introduction to neuroimaging methods is provided as background for the increasingly sophisticated breadth of methods and findings appearing in psychotherapy research. Method We compiled and assessed a comprehensive list of neuroimaging studies of psychotherapy outcome, along with selected examples of other types of studies that also are relevant to psychotherapy research. We emphasized magnetic resonance imaging (MRI) since it is the dominant neuroimaging modality in psychological research. Results We summarize findings from neuroimaging studies of psychotherapy outcome, including treatment for depression, obsessive-compulsive disorder (OCD), and schizophrenia. Conclusions The increasing use of neuroimaging methods in the study of psychotherapy continues to refine our understanding of both outcome and process. We suggest possible directions for future neuroimaging studies in psychotherapy research. PMID:24527694

  3. Conservative zonal schemes for patched grids in 2 and 3 dimensions

    NASA Technical Reports Server (NTRS)

    Hessenius, Kristin A.

    1987-01-01

    The computation of flow over complex geometries, such as realistic aircraft configurations, poses difficult grid generation problems for computational aerodynamicists. The creation of a traditional, single-module grid of acceptable quality about an entire configuration may be impossible even with the most sophisticated of grid generation techniques. A zonal approach, wherein the flow field is partitioned into several regions within which grids are independently generated, is a practical alternative for treating complicated geometries. This technique not only alleviates the problems of discretizing a complex region, but also facilitates a block processing approach to computation thereby circumventing computer memory limitations. The use of such a zonal scheme, however, requires the development of an interfacing procedure that ensures a stable, accurate, and conservative calculation for the transfer of information across the zonal borders.

  4. Physical realizability of continuous-time quantum stochastic walks

    NASA Astrophysics Data System (ADS)

    Taketani, Bruno G.; Govia, Luke C. G.; Wilhelm, Frank K.

    2018-05-01

    Quantum walks are a promising methodology that can be used to both understand and implement quantum information processing tasks. The quantum stochastic walk is a recently developed framework that combines the concept of a quantum walk with that of a classical random walk, through open system evolution of a quantum system. Quantum stochastic walks have been shown to have applications in as far reaching fields as artificial intelligence. However, there are significant constraints on the kind of open system evolutions that can be realized in a physical experiment. In this work, we discuss the restrictions on the allowed open system evolution and the physical assumptions underpinning them. We show that general direct implementations would require the complete solution of the underlying unitary dynamics and sophisticated reservoir engineering, thus weakening the benefits of experimental implementation.

  5. Toward Scalable Trustworthy Computing Using the Human-Physiology-Immunity Metaphor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hively, Lee M; Sheldon, Frederick T

    The cybersecurity landscape consists of an ad hoc patchwork of solutions. Optimal cybersecurity is difficult for various reasons: complexity, immense data and processing requirements, resource-agnostic cloud computing, practical time-space-energy constraints, inherent flaws in 'Maginot Line' defenses, and the growing number and sophistication of cyberattacks. This article defines the high-priority problems and examines the potential solution space. In that space, achieving scalable trustworthy computing and communications is possible through real-time knowledge-based decisions about cyber trust. This vision is based on the human-physiology-immunity metaphor and the human brain's ability to extract knowledge from data and information. The article outlines future steps toward scalable trustworthy systems requiring a long-term commitment to solve the well-known challenges.

  6. A reference model for space data system interconnection services

    NASA Astrophysics Data System (ADS)

    Pietras, John; Theis, Gerhard

    1993-03-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  7. A reference model for space data system interconnection services

    NASA Technical Reports Server (NTRS)

    Pietras, John; Theis, Gerhard

    1993-01-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  8. A Collaborative Reasoning Maintenance System for a Reliable Application of Legislations

    NASA Astrophysics Data System (ADS)

    Tamisier, Thomas; Didry, Yoann; Parisot, Olivier; Feltz, Fernand

    Decision support systems are nowadays used to disentangle all kinds of intricate situations and perform sophisticated analysis. Moreover, they are applied in areas where the knowledge can be heterogeneous, partially un-formalized, implicit, or diffuse. The representation and management of this knowledge become the key point in ensuring the proper functioning of the system and keeping an intuitive view of its expected behavior. This paper presents a generic architecture for implementing knowledge-based systems used in collaborative business, where the knowledge is organized into different databases, according to the usage, persistence and quality of the information. This approach is illustrated with Cadral, a customizable automated tool built on this architecture and used for processing family benefits applications at the National Family Benefits Fund of the Grand-Duchy of Luxembourg.

  9. Intravital Fluorescence Videomicroscopy to Study Tumor Angiogenesis and Microcirculation1

    PubMed Central

    Vajkoczy, Peter; Ullrich, Axel; Menger, Michael D

    2000-01-01

    Abstract Angiogenesis and microcirculation play a central role in growth and metastasis of human neoplasms, and, thus, represent a major target for novel treatment strategies. Mechanistic analysis of processes involved in tumor vascularization, however, requires sophisticated in vivo experimental models and techniques. Intravital microscopy allows direct assessment of tumor angiogenesis, microcirculation and overall perfusion. Its application to the study of tumor-induced neovascularization further provides information on molecular transport and delivery, intra- and extravascular cell-to-cell and cell-to-matrix interaction, as well as tumor oxygenation and metabolism. With the recent advances in the field of bioluminescence and fluorescent reporter genes, appropriate for in vivo imaging, the intravital fluorescent microscopic approach has to be considered a powerful tool to study microvascular, cellular and molecular mechanisms of tumor growth. PMID:10933068

  10. End User Evaluations

    NASA Astrophysics Data System (ADS)

    Jay, Caroline; Lunn, Darren; Michailidou, Eleni

    As new technologies emerge, and Web sites become increasingly sophisticated, ensuring they remain accessible to disabled and small-screen users is a major challenge. While guidelines and automated evaluation tools are useful for informing some aspects of Web site design, numerous studies have demonstrated that they provide no guarantee that the site is genuinely accessible. The only reliable way to evaluate the accessibility of a site is to study the intended users interacting with it. This chapter outlines the processes that can be used throughout the design life cycle to ensure Web accessibility, describing their strengths and weaknesses, and discussing the practical and ethical considerations that they entail. The chapter also considers an important emerging trend in user evaluations: combining data from studies of “standard” Web use with data describing existing accessibility issues, to drive accessibility solutions forward.

  11. The young person's guide to the PDB.

    PubMed

    Minor, Wladek; Dauter, Zbigniew; Jaskolski, Mariusz

    The Protein Data Bank (PDB), created in 1971 when merely seven protein crystal structures were known, today holds over 120,000 experimentally-determined three-dimensional models of macromolecules, including gigantic structures comprised of hundreds of thousands of atoms, such as ribosomes and viruses. Most of the deposits come from X-ray crystallography experiments, with important contributions also made by NMR spectroscopy and, recently, by the fast-growing Cryo-Electron Microscopy. Although the determination of a macromolecular crystal structure is now facilitated by advanced experimental tools and by sophisticated software, it is still a highly complicated research process requiring specialized training, skill, experience and a bit of luck. Understanding the plethora of structural information provided by the PDB requires that its users (consumers) have at least a rudimentary initiation. This is the purpose of this educational overview.

  12. Editorial Comments, 1974-1986: The Case For and Against the Use of Computer-Assisted Decision Making

    PubMed Central

    Weaver, Robert R.

    1987-01-01

    Journal editorials are an important medium for communicating information about medical innovations. Evaluative statements contained in editorials pertain to the innovation's technical merits, as well as its probable economic, social and political, and ethical consequences. This information will either promote or impede the subsequent diffusion of innovations. This paper analyzes the evaluative information contained in thirty editorials that pertain to the topic of computer-assisted decision making (CDM). Most editorials agree that CDM technology is effective and economical in performing routine clinical tasks; controversy surrounds the use of more sophisticated CDM systems for complex problem solving. A few editorials argue that the innovation should play an integral role in transforming the established health care system. Most, however, maintain that it can or should be accommodated within the existing health care framework. Finally, while few editorials discuss the ethical ramifications of CDM technology, those that do suggest that it will contribute to more humane health care. The editorial analysis suggests that CDM technology aimed at routine clinical tasks will experience rapid diffusion. In contrast, the diffusion of more sophisticated CDM systems will, in the foreseeable future, likely be sporadic at best.

  13. Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity.

    PubMed

    Lizier, Joseph T; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail

    2011-02-01

    The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. The method is distinguished by its use of asymmetric, multivariate, information-theoretic analysis, which captures not only directional and non-linear relationships but also collective interactions. Importantly, the method is able to estimate multivariate information measures from relatively little data. We demonstrate the method by analyzing functional magnetic resonance imaging time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. This analysis reveals a tiered structure, with known movement-planning regions driving visual and motor control regions. We also examine the changes in this structure as the difficulty of the tracking task is increased, and find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning, and between motor cortex and the cerebellum, which is involved in the fine-tuning of motor control. These methods are likely to find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
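    The central quantity in the abstract above, transfer entropy, can be estimated directly from discretized time series. The sketch below is a minimal plug-in (histogram) estimator for the pairwise, 1-step-history case, not the authors' multivariate estimator; the bin count and synthetic signals are illustrative:

    ```python
    import numpy as np

    def transfer_entropy(source, target, bins=4):
        """Plug-in transfer entropy TE(source -> target) with 1-step histories:
        TE = H(T_next | T_past) - H(T_next | T_past, S_past), in bits."""
        s = np.digitize(source, np.histogram_bin_edges(source, bins)[1:-1])
        t = np.digitize(target, np.histogram_bin_edges(target, bins)[1:-1])
        t_next, t_past, s_past = t[1:], t[:-1], s[:-1]

        def joint_entropy(*vars_):
            # Shannon entropy of the joint distribution of discrete variables.
            _, counts = np.unique(np.stack(vars_, axis=1), axis=0,
                                  return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        # Expand the conditional entropies into joint entropies:
        # H(Tn|Tp) - H(Tn|Tp,Sp) = H(Tn,Tp) - H(Tp) - H(Tn,Tp,Sp) + H(Tp,Sp)
        return (joint_entropy(t_next, t_past) - joint_entropy(t_past)
                - joint_entropy(t_next, t_past, s_past)
                + joint_entropy(t_past, s_past))

    # Synthetic check: y is driven by x's immediate past,
    # so TE(x -> y) should greatly exceed TE(y -> x).
    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y = np.roll(x, 1) + 0.1 * rng.normal(size=5000)
    te_xy, te_yx = transfer_entropy(x, y), transfer_entropy(y, x)
    ```

    The asymmetry of the measure (te_xy large, te_yx near zero) is what lets it indicate direction; the paper's multivariate extension additionally conditions on, and measures into, sets of variables to capture collective interactions.
    
    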

  14. Development of a Learning Progression for the Formation of the Solar System

    ERIC Educational Resources Information Center

    Plummer, Julia D.; Palma, Christopher; Flarend, Alice; Rubin, KeriAnn; Ong, Yann Shiou; Botzer, Brandon; McDonald, Scott; Furman, Tanya

    2015-01-01

    This study describes the process of defining a hypothetical learning progression (LP) for astronomy around the big idea of "Solar System formation." At the most sophisticated level, students can explain how the formation process led to the current Solar System by considering how the planets formed from the collapse of a rotating cloud of…

  15. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    ERIC Educational Resources Information Center

    Sun, Daner; Looi, Chee-Kit

    2013-01-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…

  16. Precipitation links (PrecipLinks) - a prototype directory for precipitation information

    NASA Technical Reports Server (NTRS)

    Velanthapillia, Balendran; Stocker, Erich Franz

    2006-01-01

    This poster describes a web directory of research-oriented precipitation links. In this era of sophisticated search engines and web agents, it might seem counterproductive to establish such a directory of links. However, entering "precipitation" into a search engine like Google yields over one million hits. To further exacerbate this situation, many of the returned links are dead, duplicates of other links, incomplete, or only marginally related to precipitation research or even the broader precipitation area. Sometimes following the linked URL causes the browser to lose context and be unable to return to the original page. Even using more sophisticated search-engine query parameters or agents, while reducing the overall return, does not eliminate the other issues listed. As part of the development of the measurement-based Precipitation Processing System (PPS), which will support Tropical Rainfall Measuring Mission (TRMM) version 7 reprocessing and the Global Precipitation Measurement (GPM) mission, a precipitation links (PrecipLinks) facility is being developed. PrecipLinks is intended to share the locations of other sites that contain information or data pertaining to precipitation research. Potential contributors can log on to the PrecipLinks website and register their site for inclusion in the directory. The price of inclusion is the requirement to place a link back to PrecipLinks on the webpage that is registered. This ensures that users will be able to return easily to PrecipLinks regardless of any context issues browsers might have. Perhaps more importantly, users visiting one site they know can be referred to a location listing many other sites with which they might not be familiar. PrecipLinks is designed to have a very flat structure with three categories (information, data, and services); this poster summarizes these categories and the reasons for their selection. Providers may register multiple pages to which they wish to direct users. However, each page may be attached to only one of these categories. Each page to which they refer users will also have a return link to PrecipLinks. The poster describes the operation of the system, both the automated and the human processes, and provides images of the various steps in registration and use.

  17. Science Teachers' Perspectives about Climate Change

    ERIC Educational Resources Information Center

    Dawson, Vaille

    2012-01-01

    Climate change and its effects are likely to present challenging problems for future generations of young people. It is important for Australian students to understand the mechanisms and consequences of climate change. If students are to develop a sophisticated understanding, then science teachers need to be well-informed about climate change…

  18. Lecture Comprehension and Note-Taking for L2 Students.

    ERIC Educational Resources Information Center

    Fahmy, Jane Jackson; Bilton, Linda

    Most information is still conveyed to university students through lectures. This necessitates that students have sophisticated listening and note-taking skills, and poses additional difficulties for non-native students. To identify areas for improvement, science lectures in English in the Sultanate of Oman were analyzed. The relationship between…

  19. A Statistical Study on Higher Educational Institutions in India

    ERIC Educational Resources Information Center

    Neelaveni, C.; Manimaran, S.

    2014-01-01

    This study aims to observe the increased effectiveness of Higher Educational Institutions in India and its competitiveness. It proposes to develop the interest in enhancing the quality in Educational Institutions. It is monitored and evaluated through rapid growth of information technology, which makes sophisticated data collection possible. This…

  20. Colleges Get Free Web Pages but with a Catch: Advertising.

    ERIC Educational Resources Information Center

    Blumenstyk, Goldie

    1999-01-01

    Striving to streamline student services and improve electronic communication, colleges and universities are signing on with companies that offer sophisticated World Wide Web sites through which students can accomplish basic administrative functions and receive information. The sites are often free of charge but also feature advertising messages…

  1. The relational clinical database: a possible solution to the star wars in registry systems.

    PubMed

    Michels, D K; Zamieroski, M

    1990-12-01

    In summary, having data from other service areas available in a relational clinical database could resolve many of the problems existing in today's registry systems. Uniting sophisticated information systems into a centralized database system could definitely be a corporate asset in managing the bottom line.

  2. Mapping where We Live and Play with GPS Technology

    ERIC Educational Resources Information Center

    Gentry, Deborah J.

    2006-01-01

    As a result of technological advances such as the Global Positioning System (GPS) and the Geographic Information System (GIS), mapping practices and applications have become far more sophisticated. This article suggests family and consumer sciences students and professionals consider using GPS technology to map their communities as a strategy to…

  3. Chronicle of Higher Education. Volume 51, Number 12, November 12, 2004

    ERIC Educational Resources Information Center

    Chronicle of Higher Education, 2004

    2004-01-01

    "Chronicle of Higher Education" presents an abundant source of news and information for college and university faculty members and administrators. This November 12, 2004 issue of "Chronicle for Higher Education" includes the following articles: (1) "The Transcendent Role of Chaplains" (Schaper, Donna); (2) "Offbeat Director's Sophistication Isn't…

  4. Information Technology and Fair Use

    ERIC Educational Resources Information Center

    Farmer, Lesley

    2011-01-01

    Intellectual pursuit and the recognition of ideas is a central concept. Copyrights protect the rights of intellectual creators while balancing those rights with the needs for access. As technologies have expanded, and production has become more sophisticated, the legal regulations surrounding their use have become more complex. With the advent of…

  5. Theorizing "Why" in E-Learning--A Frontier for Cognitive Engagement

    ERIC Educational Resources Information Center

    Mason, Jon

    2012-01-01

    "Asking why" is an important foundation of inquiry and fundamental to the development of reasoning skills and learning. Despite this, and despite the relentless and often disruptive nature of innovations in information and communications technology (ICT), sophisticated tools that directly support this basic act of learning appear to be…

  6. Don't Let "Phishers" Steal from You

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2005-01-01

    Many people still have not heard about the many ways identity theft via bogus e-mail links, or "phishing," is escalating, with criminals becoming ever more brazen and sophisticated in their online schemes to trick people into revealing their personal information. The tricksters are getting trickier. One of the newest scares involves…

  7. An Innovation in Children's T.V. the Infinity Factory

    ERIC Educational Resources Information Center

    La Luz, 1977

    1977-01-01

    "Infinity Factory" is a slick, fast-paced, sophisticated series aimed at teaching mathematics fundamentals with a unique and arresting approach. The 30 minutes of live-action skits, brief filmed documentaries, and animation sequences explore common sense math concepts and present useful information showing math at work in everyday life. (NQ)

  8. Science Results from the Mars Exploration Rover Mission

    ScienceCinema

    Squyres, Steven

    2017-12-11

    One of the most important scientific goals of the mission was to find and identify a variety of rocks and soils that provide evidence of the past presence of water on the planet. To obtain this information, Squyres is studying the data obtained on Mars by several sophisticated scientific instruments.

  9. Applications of satellite remote sensing to forested ecosystems

    Treesearch

    Louis R. Iverson; Robin Lambert Graham; Elizabeth A. Cook; Elizabeth A. Cook

    1989-01-01

    Since the launch of the first civilian earth-observing satellite in 1972, satellite remote sensing has provided increasingly sophisticated information on the structure and function of forested ecosystems. Forest classification and mapping, common uses of satellite data, have improved over the years as a result of more discriminating sensors, better classification...

  10. The Surveillance of Teachers and the Simulation of Teaching

    ERIC Educational Resources Information Center

    Page, Damien

    2017-01-01

    Just as surveillance in general has become more sophisticated, penetrative and ubiquitous, so has the surveillance of teachers. Enacted through an assemblage of strategies such as learning walks, parental networks, student voice and management information systems, the surveillance of teachers has proliferated as a means of managing the risks of…

  11. High Performance Liquid Chromatography of Some Analgesic Compounds: An Instrumental Analysis Experiment.

    ERIC Educational Resources Information Center

    Haddad, Paul; And Others

    1983-01-01

    Background information, procedures, and results are provided for an experiment demonstrating techniques of solvent selection, gradient elution, pH control, and ion-pairing in the analysis of an analgesic mixture using reversed-phase liquid chromatography on an octadecylsilane column. Although developed using sophisticated/expensive equipment, less…

  12. Laser Pointers: Low-Cost, Low-Tech Innovative, Interactive Instruction Tool

    ERIC Educational Resources Information Center

    Zdravkovska, Nevenka; Cech, Maureen; Beygo, Pinar; Kackley, Bob

    2010-01-01

    This paper discusses the use of laser pointers at the Engineering and Physical Sciences Library, University of Maryland, College Park, as a personal response system (PRS) tool to encourage student engagement in and interactivity with one-shot, lecture-based information literacy sessions. Unlike more sophisticated personal response systems like…

  13. A Performance Support Tool for Cisco Training Program Managers

    ERIC Educational Resources Information Center

    Benson, Angela D.; Bothra, Jashoda; Sharma, Priya

    2004-01-01

    Performance support systems can play an important role in corporations by managing and allowing distribution of information more easily. These systems run the gamut from simple paper job aids to sophisticated computer- and web-based software applications that support the entire corporate supply chain. According to Gery (1991), a performance…

  14. New Technologies Extend the Reach of Many College Fund Raisers.

    ERIC Educational Resources Information Center

    Nicklin, Julie L.

    1992-01-01

    Increasingly, colleges are using new technologies, often expensive, to improve fund-raising capacity among small-scale donors. Techniques include computerized screening of prospective donors based on personal information, automatic dialing and phone-bank worker training, and sophisticated direct-mail tactics. Concern about privacy and loss of the…

  15. Technology Acceptance in Social Work Education: Implications for the Field Practicum

    ERIC Educational Resources Information Center

    Colvin, Alex Don; Bullock, Angela N.

    2014-01-01

    The exponential growth and sophistication of new information and computer technology (ICT) have greatly influenced human interactions and provided new metaphors for understanding the world. The acceptance and integration of ICT into social work field education are examined here using the technological acceptance model. This article also explores…

  16. Standards for Privacy in Medical Information Systems: A Technico-Legal Revolution

    PubMed Central

    Brannigan, Vincent; Beier, Bernd

    1990-01-01

    The treatment of non-poor patients in hospitals creates a conflict between the privacy expectations of the patients and the historical traditions and administrative convenience of the hospital. Resolution of this problem requires a sophisticated theory of privacy, and one possible solution includes consensus standards for data privacy.

  17. How You Can Protect Public Access Computers "and" Their Users

    ERIC Educational Resources Information Center

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  18. Conceptual ecological models to guide integrated landscape monitoring of the Great Basin

    USGS Publications Warehouse

    Miller, D.M.; Finn, S.P.; Woodward, Andrea; Torregrosa, Alicia; Miller, M.E.; Bedford, D.R.; Brasher, A.M.

    2010-01-01

    The Great Basin Integrated Landscape Monitoring Pilot Project was developed in response to the need for a monitoring and predictive capability that addresses changes in broad landscapes and waterscapes. Human communities and needs are nested within landscapes formed by interactions among the hydrosphere, geosphere, and biosphere. Understanding the complex processes that shape landscapes and deriving ways to manage them sustainably while meeting human needs require sophisticated modeling and monitoring. This document summarizes current understanding of ecosystem structure and function for many of the ecosystems within the Great Basin using conceptual models. The conceptual ecosystem models identify key ecological components and processes, identify external drivers, develop a hierarchical set of models that address both site and landscape attributes, inform regional monitoring strategy, and identify critical gaps in our knowledge of ecosystem function. The report also illustrates an approach for temporal and spatial scaling from site-specific models to landscape models and for understanding cumulative effects. Eventually, conceptual models can provide a structure for designing monitoring programs, interpreting monitoring and other data, and assessing the accuracy of our understanding of ecosystem functions and processes.

  19. Professional judgement and decision-making in adventure sports coaching: the role of interaction.

    PubMed

    Collins, Loel; Collins, Dave

    2016-01-01

    This qualitative study presents the view that coaching practice places demands on the coach's adaptability and flexibility. These requirements for being adaptive and flexible are met through a careful process of professional judgement and decision-making based on context-appropriate bodies of knowledge. Adventure sports coaches were selected for study on the basis that adventure sports create a hyper-dynamic environment in which these features can be examined. Thematic analysis revealed that coaches were generally well informed and practised with respect to the technical aspects of their sporting disciplines. Less positively, however, they often relied on ad hoc contextualisation of generalised theories of coaching practice to respond to the hyper-dynamic environments encountered in adventure sports. We propose that coaching practice reflects the demands of the environment, individual learning needs of the students and the task at hand. Together, these factors outwardly resemble a constraints-led approach but, we suggest, actually reflect manipulation of these parameters from a cognitive rather than an ecological perspective. This process is facilitated by a refined judgement and decision-making process, sophisticated epistemology and an explicit interaction of coaching components.

  20. IPAD: Integrated Programs for Aerospace-vehicle Design

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.

    1985-01-01

    Early work was performed to apply data base technology in support of the management of engineering data in the design and manufacturing environments. The principal objective of the IPAD project is to develop a computer software system for use in the design of aerospace vehicles. Two prototype systems are created for this purpose. Relational Information Manager (RIM) is a successful commercial product. The IPAD Information Processor (IPIP), a much more sophisticated system, is still under development.

  1. CISN ShakeAlert: Using early warnings for earthquakes in California

    NASA Astrophysics Data System (ADS)

    Vinci, M.; Hellweg, M.; Jones, L. M.; Khainovski, O.; Schwartz, K.; Lehrer, D.; Allen, R. M.; Neuhauser, D. S.

    2009-12-01

    Educated users who have developed response plans and procedures are just as important to an earthquake early warning (EEW) system as the algorithms and computers that process the data and produce the warnings. In Japan, for example, implementation of the EEW system that now provides advance alerts of ground shaking included intense outreach efforts to both institutional and individual recipients. Alerts are now used in automatic control systems that stop trains, place sensitive equipment in safe mode, and isolate hazards while the public takes cover. In California, the California Integrated Seismic Network (CISN) is now developing and implementing components of a prototype EEW system, ShakeAlert. As this processing system is developed, we are inviting a suite of prospective users from critical industries and institutions throughout California to partner with us in developing useful ShakeAlert products and procedures. At the same time, we will support their efforts to determine and implement appropriate responses to an early warning of earthquake shaking. As a first step, in collaboration with BART, we have developed a basic system allowing BART's operations center to receive real-time ground-shaking information from more than 150 seismic stations operating in the San Francisco Bay Area. BART engineers are implementing a display system for this information. Later phases will include the development of improved response procedures utilizing this information. We plan to continue this collaboration to include more sophisticated information from the prototype CISN ShakeAlert system.

  2. Image segmentation using hidden Markov Gauss mixture models.

    PubMed

    Pyun, Kyungsuk; Lim, Johan; Won, Chee Sun; Gray, Robert M

    2007-07-01

    Image segmentation is an important tool in image processing and can serve as an efficient front end to sophisticated algorithms and thereby simplify subsequent processing. We develop a multiclass image segmentation method using hidden Markov Gauss mixture models (HMGMMs) and provide examples of segmentation of aerial images and textures. HMGMMs incorporate supervised learning, fitting the observation probability distribution given each class by a Gauss mixture estimated using vector quantization with a minimum discrimination information (MDI) distortion. We formulate the image segmentation problem using a maximum a posteriori criterion and find the hidden states that maximize the posterior density given the observation. We estimate both the hidden Markov parameters and hidden states using a stochastic expectation-maximization algorithm. Our results demonstrate that HMGMM provides better classification in terms of Bayes risk and spatial homogeneity of the classified objects than do several popular methods, including classification and regression trees, learning vector quantization, causal hidden Markov models (HMMs), and multiresolution HMMs. The computational load of HMGMM is similar to that of the causal HMM.
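    The supervised MAP classification step at the heart of the abstract above can be sketched without the hidden-Markov spatial coupling: fit one Gaussian per class by maximum likelihood (a one-component stand-in for the paper's Gauss mixtures), then assign each feature vector to the class with the highest posterior. All feature values and class labels below are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Two synthetic "texture" classes, each described by a 2-D feature vector
    # (think local mean and local variance of a patch); values are illustrative.
    train = {0: rng.normal([0.2, 1.0], 0.1, size=(500, 2)),
             1: rng.normal([0.8, 0.5], 0.1, size=(500, 2))}

    # Supervised step: fit one Gaussian per class by maximum likelihood.
    params = {c: (x.mean(axis=0), np.cov(x.T)) for c, x in train.items()}

    def log_gauss(x, mean, cov):
        # Row-wise log density of a multivariate normal.
        d = x - mean
        quad = np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d)
        _, logdet = np.linalg.slogdet(cov)
        return -0.5 * (quad + logdet + x.shape[1] * np.log(2 * np.pi))

    def map_classify(pixels, params, priors=None):
        # MAP rule: choose the class maximizing log prior + log likelihood.
        classes = sorted(params)
        priors = priors or {c: 1.0 / len(classes) for c in classes}
        scores = np.stack([np.log(priors[c]) + log_gauss(pixels, *params[c])
                           for c in classes])
        return np.array(classes)[scores.argmax(axis=0)]

    # Held-out feature vectors drawn from the same two classes.
    test_pixels = np.vstack([rng.normal([0.2, 1.0], 0.1, size=(100, 2)),
                             rng.normal([0.8, 0.5], 0.1, size=(100, 2))])
    labels = map_classify(test_pixels, params)
    accuracy = (labels == np.repeat([0, 1], 100)).mean()
    ```

    The HMGMM method layers two refinements on this rule: a Gauss mixture (rather than a single Gaussian) per class, and a hidden Markov model that couples neighboring labels so the segmentation is spatially coherent.
    
    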

  3. REDIdb: an upgraded bioinformatics resource for organellar RNA editing sites.

    PubMed

    Picardi, Ernesto; Regina, Teresa M R; Verbitskiy, Daniil; Brennicke, Axel; Quagliariello, Carla

    2011-03-01

    RNA editing is a post-transcriptional molecular process whereby the information in a genetic message is modified from that in the corresponding DNA template by means of nucleotide substitutions, insertions and/or deletions. It occurs mostly in organelles, by clade-specific, diverse and unrelated biochemical mechanisms. RNA editing events have been annotated in primary databases such as GenBank and, at a more sophisticated level, in the specialized databases REDIdb, dbRES and EdRNA. At present, REDIdb is the only freely available database that focuses on the organellar RNA editing process and annotates each editing modification in its biological context. Here we present an updated and upgraded release of REDIdb with a web interface refurbished with graphical and computational facilities that improve RNA editing investigations. Details of the REDIdb features and novelties are illustrated and compared to other RNA editing databases. REDIdb can be freely queried at http://biologia.unical.it/py_script/REDIdb/. Copyright © 2010 Elsevier B.V. and Mitochondria Research Society. All rights reserved.

  4. Installed Base as a Facilitator for User-Driven Innovation: How Can User Innovation Challenge Existing Institutional Barriers?

    PubMed Central

    Andersen, Synnøve Thomassen; Jansen, Arild

    2012-01-01

    The paper addresses an ICT-based, user-driven innovation process in the health sector in rural areas of Norway. The empirical base is the introduction of a new model for psychiatric health provision. This model is supported by a technical solution based on mobile phones, aimed at helping communication between professional health personnel and patients. This innovation was made possible through the use of standard mobile technology rather than more sophisticated systems. The users were heavily involved in the development work. Our analysis shows that thinking in terms of simple, small-scale solutions, and taking the users' needs and premises as the point of departure rather than focusing on advanced technology, made the implementation process possible. We show that by combining theory on information infrastructures, user-oriented system development, and innovation in a three-layered analytical framework, we can explain the interrelationship between the technical, organizational, and health-professional factors that made this innovation a success. PMID:23304134

  5. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged: different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  6. Information Systems for NASA's Aeronautics and Space Enterprises

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1998-01-01

    The aerospace industry is being challenged to reduce costs and development time as well as utilize new technologies to improve product performance. Information technology (IT) is the key to providing revolutionary solutions to the challenges posed by the increasing complexity of NASA's aeronautics and space missions and the sophisticated nature of the systems that enable them. The NASA Ames vision is to develop technologies enabling the information age, expanding the frontiers of knowledge for aeronautics and space, improving America's competitive position, and inspiring future generations. Ames' missions to accomplish that vision include: 1) performing research to support the American aviation community through the unique integration of computation, experimentation, simulation and flight testing; 2) studying the health of our planet, understanding living systems in space and the origins of the universe, and developing technologies for space flight; and 3) researching, developing and delivering information technologies and applications. Information technology may be defined as the use of advanced computing systems to generate data, analyze data, transform data into knowledge, and serve as an aid in the decision-making process. The knowledge from transformed data can be displayed in visual, virtual and multimedia environments. The decision-making process can be fully autonomous or aided by cognitive processes, i.e., computational aids designed to leverage human capacities. IT systems can learn as they go, developing the capability to make decisions or aid the decision-making process on the basis of experience gained from limited data inputs. In the future, information systems will be used to aid space mission synthesis, support virtual aerospace system design, aid damaged aircraft during landing, perform robotic surgery, and monitor the health and status of spacecraft and planetary probes. NASA Ames, through the Center of Excellence for Information Technology Office, is leading the effort in pursuit of revolutionary, IT-based approaches to satisfying NASA's aeronautics and space requirements. The objective of the effort is to incorporate information technologies within each of the Agency's four Enterprises, i.e., Aeronautics and Space Transportation Technology, Earth Science, Human Exploration and Development of Space, and Space Science. The end results of these efforts for Enterprise programs and projects should be reduced cost, enhanced mission capability and expedited mission completion.

  7. SAW chirp filter technology for satellite on-board processing applications

    NASA Astrophysics Data System (ADS)

    Shaw, M. D.; Miller, N. D. J.; Malarky, A. P.; Warne, D. H.

    1989-11-01

    Market growth in the area of thin route satellite communications services has led to consideration of nontraditional system architectures requiring sophisticated on-board processing functions. Surface acoustic wave (SAW) technology exists today which can provide implementation of key on-board processing subsystems by using multicarrier demodulators. This paper presents a review of this signal processing technology, along with a brief review of dispersive SAW device technology as applied to the implementation of multicarrier demodulators for on-board signal processing.

  8. Temporal consistent depth map upscaling for 3DTV

    NASA Astrophysics Data System (ADS)

    Schwarz, Sebastian; Sjöström, Mårten; Olsson, Roger

    2014-03-01

    The ongoing success of three-dimensional (3D) cinema fuels increasing efforts to spread the commercial success of 3D to new markets. The possibility of a convincing 3D experience at home, such as three-dimensional television (3DTV), has generated a great deal of interest within the research and standardization community. A central issue for 3DTV is the creation and representation of 3D content. Acquiring scene depth information is a fundamental task in computer vision, yet complex and error-prone. Dedicated range sensors, such as the Time-of-Flight camera (ToF), can simplify the scene depth capture process and overcome shortcomings of traditional solutions, such as active or passive stereo analysis. Admittedly, currently available ToF sensors deliver only a limited spatial resolution. However, sophisticated depth upscaling approaches use texture information to match depth and video resolution. At Electronic Imaging 2012 we proposed an upscaling routine based on error energy minimization, weighted with edge information from an accompanying video source. In this article we develop our algorithm further. By adding temporal consistency constraints to the upscaling process, we reduce disturbing depth jumps and flickering artifacts in the final 3DTV content. Temporal consistency in depth maps enhances the 3D experience, leading to a wider acceptance of 3D media content. More content in better quality can boost the commercial success of 3DTV.
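    The idea of texture-guided depth upscaling can be illustrated with a much simpler relative of the abstract's method, joint bilateral upsampling (not the authors' error-energy minimization): each high-resolution depth estimate is a weighted average of low-resolution depth samples, where the weights combine spatial distance with guide-image similarity so that depth edges snap to texture edges. All sizes, sigmas, and the toy scene below are illustrative:

    ```python
    import numpy as np

    def joint_bilateral_upsample(depth_lo, guide_hi, factor,
                                 sigma_s=1.0, sigma_r=0.1):
        """Upsample a low-res depth map using a high-res guide image."""
        H, W = guide_hi.shape
        h_lo, w_lo = depth_lo.shape
        out = np.zeros((H, W))
        for y in range(H):
            for x in range(W):
                yl, xl = y / factor, x / factor   # position on the low-res grid
                y0, x0 = int(yl), int(xl)
                wsum = vsum = 0.0
                for dy in (-1, 0, 1):             # 3x3 low-res neighborhood
                    for dx in (-1, 0, 1):
                        yy, xx = y0 + dy, x0 + dx
                        if 0 <= yy < h_lo and 0 <= xx < w_lo:
                            # Guide pixel at this sample's high-res location.
                            gy = min(int(yy * factor), H - 1)
                            gx = min(int(xx * factor), W - 1)
                            w_s = np.exp(-((yl - yy) ** 2 + (xl - xx) ** 2)
                                         / (2 * sigma_s ** 2))
                            w_r = np.exp(-(guide_hi[y, x] - guide_hi[gy, gx]) ** 2
                                         / (2 * sigma_r ** 2))
                            wsum += w_s * w_r
                            vsum += w_s * w_r * depth_lo[yy, xx]
                out[y, x] = vsum / wsum if wsum > 0 else depth_lo[y0, x0]
        return out

    # Toy scene: a vertical step edge shared by the guide image and true depth.
    guide = np.zeros((16, 16)); guide[:, 8:] = 1.0
    depth_true = np.where(guide > 0.5, 2.0, 1.0)
    depth_lo = depth_true[::4, ::4]               # simulated low-res ToF capture
    up = joint_bilateral_upsample(depth_lo, guide, factor=4)
    err = np.abs(up - depth_true).mean()
    ```

    Because the range term suppresses depth samples from the far side of the texture edge, the upscaled depth edge lands exactly on the guide edge, whereas plain interpolation would smear it; the paper's contribution then extends such spatial weighting with temporal consistency across frames.
    
    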

  9. Human machine interface by using stereo-based depth extraction

    NASA Astrophysics Data System (ADS)

    Liao, Chao-Kang; Wu, Chi-Hao; Lin, Hsueh-Yi; Chang, Ting-Ting; Lin, Tung-Yang; Huang, Po-Kuan

    2014-03-01

  10. A NEW LAND-SURFACE MODEL IN MM5

    EPA Science Inventory

    There has recently been a general realization that more sophisticated modeling of land-surface processes can be important for mesoscale meteorology models. Land-surface models (LSMs) have long been important components in global-scale climate models because of their more compl...

  11. Instructional Systems Development

    ERIC Educational Resources Information Center

    Watson, Russell

    The United States Army, confronted with sophisticated defense machinery and entry-level soldiers with low educational backgrounds, selected a systems approach to training that was developed in 1975 by Florida State University. Instructional Systems Development (ISD), a five-phase process encompassing the entire educational environment, is…

  12. Sophisticated Clean Air Strategies Required to Mitigate Against Particulate Organic Pollution

    PubMed Central

    Grigas, T.; Ovadnevaite, J.; Ceburnis, D.; Moran, E.; McGovern, F. M.; Jennings, S. G.; O’Dowd, C.

    2017-01-01

    Since the 1980s, measures mitigating the impact of transboundary air pollution have been implemented successfully as evidenced in the 1980–2014 record of atmospheric sulphur pollution over the NE-Atlantic, a key region for monitoring background northern-hemisphere pollution levels. The record reveals a 72–79% reduction in annual-average airborne sulphur pollution (SO4 and SO2, respectively) over the 35-year period. The NE-Atlantic, as observed from the Mace Head research station on the Irish coast, can be considered clean for 64% of the time during which sulphate dominates PM1 levels, contributing 42% of the mass, and for the remainder of the time, under polluted conditions, a carbonaceous (organic matter and Black Carbon) aerosol prevails, contributing 60% to 90% of the PM1 mass and exhibiting a trend whereby its contribution increases with increasing pollution levels. The carbonaceous aerosol is known to be diverse in source and nature and requires sophisticated air pollution policies underpinned by sophisticated characterisation and source apportionment capabilities to inform selective emissions-reduction strategies. Inauspiciously, however, this carbonaceous concoction is not measured in regulatory Air Quality networks. PMID:28303958

  13. Student Thinking Processes. The Influence of Immediate Computer Access on Students' Thinking. First- and Second-Year Findings. ACOT Report #3.

    ERIC Educational Resources Information Center

    Tierney, Robert J.

    This 2-year longitudinal study explored whether computers promote more sophisticated thinking, and examined how students' thinking changes as they become experienced computer users. The first-year study examined the thinking process of four ninth-grade Apple Classrooms of Tomorrow (ACOT) students. The second-year study continued following these…

  14. Optimizing Performance Through Sleep-Wake Homeostasis: Integrating Physiological and Neurobehavioral Data via Ambulatory Acquisition in Laboratory and Field Environments

    DTIC Science & Technology

    2009-04-18

    intake and sophisticated signal processing of electroencephalographic (EEG), electrooculographic (EOG), electrocardiographic (ECG), and electromyographic (EMG) physiological signals. It also has markedly...ambulatory physiological acquisition and quantitative signal processing; (2) Brain Amp MR Plus 32 and BrainVision Recorder Professional Software Package for

  15. Multi-Sensor Data Fusion Project

    DTIC Science & Technology

    2000-02-28

    seismic network by detecting T phases generated by underground events (generally earthquakes) and associating these phases to seismic events. The...between underwater explosions (H), underground sources, mostly earthquake-generated (T), and noise detections (N). The phases classified as H are the only...processing for infrasound sensors is most similar to seismic array processing with the exception that the detections are based on a more sophisticated

  16. Numerical Order Processing in Children: From Reversing the Distance-Effect to Predicting Arithmetic

    ERIC Educational Resources Information Center

    Lyons, Ian M.; Ansari, Daniel

    2015-01-01

    Recent work has demonstrated that how we process the relative order--ordinality--of numbers may be key to understanding how we represent numbers symbolically, and has proven to be a robust predictor of more sophisticated math skills in both children and adults. However, it remains unclear whether numerical ordinality is primarily a by-product of…

  17. A Meta-Analytic Study of the Neural Systems for Auditory Processing of Lexical Tones.

    PubMed

    Kwok, Veronica P Y; Dan, Guo; Yakpo, Kofi; Matthews, Stephen; Fox, Peter T; Li, Ping; Tan, Li-Hai

    2017-01-01

    The neural systems of lexical tone processing have been studied for many years. However, previous findings have been mixed with regard to the hemispheric specialization for the perception of linguistic pitch patterns in native speakers of tonal languages. In this study, we performed two activation likelihood estimation (ALE) meta-analyses, one on neuroimaging studies of auditory processing of lexical tones in tonal languages (17 studies), and the other on auditory processing of lexical information in non-tonal languages as a control analysis for comparison (15 studies). The lexical tone ALE analysis showed significant brain activations in bilateral inferior prefrontal regions, bilateral superior temporal regions and the right caudate, while the control ALE analysis showed significant cortical activity in the left inferior frontal gyrus and left temporo-parietal regions. However, we failed to obtain significant differences from the contrast analysis between the two auditory conditions, which might be caused by the limited number of studies available for comparison. Although the current study lacks evidence to argue for a lexical tone-specific activation pattern, our results provide clues and directions for future investigations on this topic; more sophisticated methods are needed to explore this question in more depth.

  18. A Meta-Analytic Study of the Neural Systems for Auditory Processing of Lexical Tones

    PubMed Central

    Kwok, Veronica P. Y.; Dan, Guo; Yakpo, Kofi; Matthews, Stephen; Fox, Peter T.; Li, Ping; Tan, Li-Hai

    2017-01-01

    The neural systems of lexical tone processing have been studied for many years. However, previous findings have been mixed with regard to the hemispheric specialization for the perception of linguistic pitch patterns in native speakers of tonal languages. In this study, we performed two activation likelihood estimation (ALE) meta-analyses, one on neuroimaging studies of auditory processing of lexical tones in tonal languages (17 studies), and the other on auditory processing of lexical information in non-tonal languages as a control analysis for comparison (15 studies). The lexical tone ALE analysis showed significant brain activations in bilateral inferior prefrontal regions, bilateral superior temporal regions and the right caudate, while the control ALE analysis showed significant cortical activity in the left inferior frontal gyrus and left temporo-parietal regions. However, we failed to obtain significant differences from the contrast analysis between the two auditory conditions, which might be caused by the limited number of studies available for comparison. Although the current study lacks evidence to argue for a lexical tone-specific activation pattern, our results provide clues and directions for future investigations on this topic; more sophisticated methods are needed to explore this question in more depth. PMID:28798670
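    The core of the ALE method these two records describe is to model each reported activation focus as a three-dimensional Gaussian and combine the foci voxel-wise. A toy sketch of that step follows; real ALE analyses use empirically calibrated kernel widths, sample-size weighting, and permutation-based significance thresholds, none of which are modelled here.

```python
import numpy as np

def ale_map(foci, grid_shape, sigma=2.0):
    """Toy activation likelihood estimation (ALE) sketch.

    Each reported focus (a voxel coordinate) is modelled as a 3-D
    Gaussian probability blob; per-voxel values from all foci are
    combined with the ALE union formula 1 - prod(1 - p_i).
    """
    zz, yy, xx = np.indices(grid_shape)
    ale = np.zeros(grid_shape)
    for (z, y, x) in foci:
        d2 = (zz - z) ** 2 + (yy - y) ** 2 + (xx - x) ** 2
        p = np.exp(-d2 / (2.0 * sigma ** 2))
        ale = 1.0 - (1.0 - ale) * (1.0 - p)  # probabilistic union
    return ale
```

    The union formula keeps every voxel value in [0, 1] and makes overlapping foci reinforce each other, which is what produces the convergence clusters a meta-analysis then tests for significance.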

  19. Research in speech communication.

    PubMed Central

    Flanagan, J

    1995-01-01

    Advances in digital speech processing are now supporting application and deployment of a variety of speech technologies for human/machine communication. In fact, new businesses are rapidly forming about these technologies. But these capabilities are of little use unless society can afford them. Happily, explosive advances in microelectronics over the past two decades have assured affordable access to this sophistication as well as to the underlying computing technology. The research challenges in speech processing remain in the traditionally identified areas of recognition, synthesis, and coding. These three areas have typically been addressed individually, often with significant isolation among the efforts. But they are all facets of the same fundamental issue--how to represent and quantify the information in the speech signal. This implies deeper understanding of the physics of speech production, the constraints that the conventions of language impose, and the mechanism for information processing in the auditory system. In ongoing research, therefore, we seek more accurate models of speech generation, better computational formulations of language, and realistic perceptual guides for speech processing--along with ways to coalesce the fundamental issues of recognition, synthesis, and coding. Successful solution will yield the long-sought dictation machine, high-quality synthesis from text, and the ultimate in low bit-rate transmission of speech. It will also open the door to language-translating telephony, where the synthetic foreign translation can be in the voice of the originating talker. PMID:7479806

  20. Informatics in radiology: RADTF: a semantic search-enabled, natural language processor-generated radiology teaching file.

    PubMed

    Do, Bao H; Wu, Andrew; Biswal, Sandip; Kamaya, Aya; Rubin, Daniel L

    2010-11-01

    Storing and retrieving radiology cases is an important activity for education and clinical research, but this process can be time-consuming. In the process of structuring reports and images into organized teaching files, incidental pathologic conditions not pertinent to the primary teaching point can be omitted, as when a user saves images of an aortic dissection case but disregards the incidental osteoid osteoma. An alternate strategy for identifying teaching cases is text search of reports in radiology information systems (RIS), but retrieved reports are unstructured, teaching-related content is not highlighted, and patient identifying information is not removed. Furthermore, searching unstructured reports requires sophisticated retrieval methods to achieve useful results. An open-source, RadLex®-compatible teaching file solution called RADTF, which uses natural language processing (NLP) methods to process radiology reports, was developed to create a searchable teaching resource from the RIS and the picture archiving and communication system (PACS). The NLP system extracts and de-identifies teaching-relevant statements from full reports to generate a stand-alone database, thus converting existing RIS archives into an on-demand source of teaching material. Using RADTF, the authors generated a semantic search-enabled, Web-based radiology archive containing over 700,000 cases with millions of images. RADTF combines a compact representation of the teaching-relevant content in radiology reports and a versatile search engine with the scale of the entire RIS-PACS collection of case material. ©RSNA, 2010
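    The report-processing step this abstract describes — extracting teaching-relevant statements and removing patient identifiers — can be illustrated with a deliberately naive, regex-based sketch. RADTF itself uses a full NLP system; the identifier patterns and keyword list below are invented for illustration and cover only a tiny fraction of what real de-identification requires.

```python
import re

# Illustrative identifier patterns (a real de-identifier covers far more).
PATTERNS = [
    (re.compile(r"\b(?:MRN|Acc(?:ession)?)[#:\s]*\d+\b", re.I), "[ID]"),
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
]

def deidentify(text):
    """Replace simple identifier patterns with neutral placeholders."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

def teaching_sentences(report, keywords=("classic", "characteristic", "diagnostic")):
    """Keep only sentences containing teaching-flavoured keywords, de-identified."""
    sentences = re.split(r"(?<=[.!?])\s+", report)
    return [deidentify(s) for s in sentences
            if any(k in s.lower() for k in keywords)]
```

    Running this over a report keeps the pedagogically interesting sentence and drops the rest, which is the shape of the stand-alone teaching database the abstract describes.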

  1. The changing flow of management information systems in long-term care facilities.

    PubMed

    Stokes, D F

    1997-08-01

    Over the past three decades, the long-term care community has seen continual increases in the complexity and sophistication of management information systems. These changes have been brought about by the ever-increasing demands on owners and managers to provide accurate and timely data to both regulators and financial investors. The evolution of these systems has increased rapidly in recent years as the nation attempts to reinvent the funding mechanisms for long-term care.

  2. A Model for School Board Operation.

    ERIC Educational Resources Information Center

    Hickcox, Edward S.; And Others

    A school board must operate in such a way that it can cope with the increasingly larger size, complex organization, and sophisticated programs of schools. The relationships among the community, board, and school can be viewed as component parts of a system. Formal and informal lines of communication exist among these parts--between the community…

  3. The RCM: A Resource Management and Program Budgeting Approach for State and Local Educational Agencies.

    ERIC Educational Resources Information Center

    Chambers, Jay G.; Parrish, Thomas B.

    The Resource Cost Model (RCM) is a resource management system that combines the technical advantages of sophisticated computer simulation software with the practical benefits of group decision making to provide detailed information about educational program costs. The first section of this document introduces the conceptual framework underlying…

  4. The Online Classroom: Teaching with the Internet.

    ERIC Educational Resources Information Center

    Cotton, Eileen Giuffre

    Presenting a wide array of Internet addresses and sample lessons, this book shows how teachers can integrate the Internet into their K-12 curriculum to actively involve students. The ideas and lessons in the book help students to communicate with people in faraway places; gather information from around the globe; develop sophisticated research…

  5. Prediction of School Performance from the Minnesota Child Development Inventory: Implications for Preschool Screening.

    ERIC Educational Resources Information Center

    Colligan, Robert C.

    Almost all preschool screening programs depend entirely on information and observations obtained during a brief evaluative session with the child. However, the logistics involved in managing large numbers of parents and children, the use of volunteers having varying degrees of sophistication or competency in assessment, the reliability and…

  6. Perspectives On... Roaches, Guerrillas, and ''Librarians on the Loose''

    ERIC Educational Resources Information Center

    Macke, Barbara

    2005-01-01

    Sophistication and accessibility can be dangerous bedfellows, especially when it comes to information sources. One of the real challenges in undergraduate academic libraries can be connecting students in a meaningful way to the resources that will be most helpful and understandable for them. This article discusses how techniques borrowed from the…

  7. Ecosystem Services in Environmental Science Literacy

    ERIC Educational Resources Information Center

    Ruppert, John Robert

    2015-01-01

    Human beings depend on a set of benefits that emerge from functioning ecosystems, termed Ecosystem Services (ES), and make decisions in everyday life that affect these ES. Recent advancements in science have led to an increasingly sophisticated understanding of ES and how they can be used to inform environmental decision-making. Following suit, US…

  8. Anabat bat detection system: description and maintenance manual.

    Treesearch

    Douglas W. Waldren

    2000-01-01

    Anabat bat detection systems record ultrasonic bat calls on cassette tape by using a sophisticated ultrasonic microphone and cassette tape interface. This paper describes equipment setup and some maintenance issues. The layout and function of display panels are presented with special emphasis on how to use this information to troubleshoot equipment problems. The...

  9. Test Design with Cognition in Mind

    ERIC Educational Resources Information Center

    Gorin, Joanna S.

    2006-01-01

    One of the primary themes of the National Research Council's 2001 book "Knowing What Students Know" was the importance of cognition as a component of assessment design and measurement theory (NRC, 2001). One reaction to the book has been an increased use of sophisticated statistical methods to model cognitive information available in test data.…

  10. On the Nets. Comparing Web Browsers: Mosaic, Cello, Netscape, WinWeb and InternetWorks Life.

    ERIC Educational Resources Information Center

    Notess, Greg R.

    1995-01-01

    World Wide Web browsers are compared by speed, setup, hypertext transfer protocol (HTTP) handling, management of file transfer protocol (FTP), telnet, gopher, and wide area information server (WAIS); bookmark options; and communication functions. Netscape has the most features, the fastest retrieval, and sophisticated bookmark capabilities. (JMV)

  11. Supervisory Control Information Management Research (SCIMR)

    DTIC Science & Technology

    2013-06-01

    ...Station (VSCS) serves as a multi-faceted facilitator in areas ranging from research to combat missions. The result, consequentially, is an increase in the...success. Developed with this in mind, VSCS effectively integrates sophisticated advancements for the purpose of strengthening the collaborative

  12. Don't Get "Phished" out of Cyberspace

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    Identity theft via bogus e-mail links, or "phishing," is escalating, with criminals becoming ever more brazen and sophisticated in their online schemes to trick people into revealing their personal information. People do get scammed. Phishing messages that appear to be sent by trusted companies dupe 3 percent of the people who receive…

  13. Factors Affecting Utilization of Information Output of Computer-Based Modeling Procedures in Local Government Organizations.

    ERIC Educational Resources Information Center

    Komsky, Susan

    Fiscal Impact Budgeting Systems (FIBS) are sophisticated computer based modeling procedures used in local government organizations, whose results, however, are often overlooked or ignored by decision makers. A study attempted to discover the reasons for this situation by focusing on four factors: potential usefulness, faith in computers,…

  14. Aeronautics and Space Report of the President: 1977 Activities.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    The national programs in aeronautics and space made steady progress in 1977 toward their long-term objectives. In aeronautics the goals were improved performance, energy efficiency, and safety in aircraft. In space the goals were: (1) better remote sensing systems to generate more sophisticated information about the Earth's environment; (2)…

  15. Organizational and Technological Strategies for Higher Education in the Information Age. CAUSE Professional Paper Series, #13.

    ERIC Educational Resources Information Center

    Ernst, David J.; And Others

    This paper examines five key trends impacting higher education administration: (1) traditional funding sources are flat or decreasing; (2) public expectations and state mandates are calling for more reporting requirements and accountability; (3) consumer expectations demand more sophisticated services requiring greater access to date; (4) evolving…

  16. Child-Centered, Family-Sensitive Schools: An Educator's Guide to Family Dynamics.

    ERIC Educational Resources Information Center

    Garanzini, Michael J.

    An increasing number of children have family problems that interfere with their ability to learn at school. This book provides information about developing a clearer and more sophisticated child-centered school and classroom by making teachers and administrators more knowledgeable about the varieties of students' family structures, both healthy…

  17. Modeling current climate conditions for forest pest risk assessment

    Treesearch

    Frank H. Koch; John W. Coulston

    2010-01-01

    Current information on broad-scale climatic conditions is essential for assessing potential distribution of forest pests. At present, sophisticated spatial interpolation approaches such as the Parameter-elevation Regressions on Independent Slopes Model (PRISM) are used to create high-resolution climatic data sets. Unfortunately, these data sets are based on 30-year...

  18. The Online Classroom: Teaching with the Internet. 2nd Edition.

    ERIC Educational Resources Information Center

    Cotton, Eileen Guiffre

    Presenting a wide array of Internet addresses and sample lessons, this book shows how teachers can integrate the Internet into their K-12 curriculum to actively involve students. The ideas and lessons in the book help students to communicate with people in faraway places; gather information from around the globe; develop sophisticated research…

  19. How Commercial Banks Use the World Wide Web: A Content Analysis.

    ERIC Educational Resources Information Center

    Leovic, Lydia K.

    New telecommunications vehicles expand the possible ways that business is conducted. The hypermedia portion of the Internet, the World Wide Web, is such a telecommunications device. The Web is presently one of the most flexible and dynamic methods for electronic information dissemination. The level of technological sophistication necessary to…

  20. Measuring Skills for the 21st Century. Education Sector Reports

    ERIC Educational Resources Information Center

    Silva, Elena

    2008-01-01

    Leaders in government, business, and higher education are calling for today's students to show a mastery of broader and more sophisticated skills like evaluating and analyzing information and thinking creatively about how to solve real-world problems. Standing in the way of incorporating such skills into teaching and learning are widespread…

  1. Students' Choice of Universities in Germany: Structure, Factors and Information Sources Used

    ERIC Educational Resources Information Center

    Obermeit, Katrin

    2012-01-01

    Student recruitment is an increasingly important topic for universities worldwide. But in order to develop sophisticated recruitment strategies, recruitment officers need to have a clear understanding of how and why students choose colleges. This review compares the German and US research concerning university choice models, choice criteria and…

  2. Sophistry, the Sophists and modern medical education.

    PubMed

    Macsuibhne, S P

    2010-01-01

    The term 'sophist' has become a term of intellectual abuse in both general discourse and that of educational theory. However the actual thought of the fifth century BC Athenian-based philosophers who were the original Sophists was very different from the caricature. In this essay, I draw parallels between trends in modern medical educational practice and the thought of the Sophists. Specific areas discussed are the professionalisation of medical education, the teaching of higher-order characterological attributes such as personal development skills, and evidence-based medical education. Using the specific example of the Sophist Protagoras, it is argued that the Sophists were precursors of philosophical approaches and practices of enquiry underlying modern medical education.

  3. Windows to the soul: vision science as a tool for studying biological mechanisms of information processing deficits in schizophrenia.

    PubMed

    Yoon, Jong H; Sheremata, Summer L; Rokem, Ariel; Silver, Michael A

    2013-10-31

    Cognitive and information processing deficits are core features and important sources of disability in schizophrenia. Our understanding of the neural substrates of these deficits remains incomplete, in large part because the complexity of impairments in schizophrenia makes the identification of specific deficits very challenging. Vision science presents unique opportunities in this regard: many years of basic research have led to detailed characterization of relationships between structure and function in the early visual system and have produced sophisticated methods to quantify visual perception and characterize its neural substrates. We present a selective review of research that illustrates the opportunities for discovery provided by visual studies in schizophrenia. We highlight work that has been particularly effective in applying vision science methods to identify specific neural abnormalities underlying information processing deficits in schizophrenia. In addition, we describe studies that have utilized psychophysical experimental designs that mitigate generalized deficit confounds, thereby revealing specific visual impairments in schizophrenia. These studies contribute to accumulating evidence that early visual cortex is a useful experimental system for the study of local cortical circuit abnormalities in schizophrenia. The high degree of similarity across neocortical areas of neuronal subtypes and their patterns of connectivity suggests that insights obtained from the study of early visual cortex may be applicable to other brain regions. We conclude with a discussion of future studies that combine vision science and neuroimaging methods. These studies have the potential to address pressing questions in schizophrenia, including the dissociation of local circuit deficits vs. impairments in feedback modulation by cognitive processes such as spatial attention and working memory, and the relative contributions of glutamatergic and GABAergic deficits.

  4. Artificial intelligence in a mission operations and satellite test environment

    NASA Technical Reports Server (NTRS)

    Busse, Carl

    1988-01-01

    A Generic Mission Operations System using Expert System technology to demonstrate the potential of Artificial Intelligence (AI) automated monitor and control functions in a Mission Operations and Satellite Test environment will be developed at the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL). Expert system techniques in a real time operation environment are being studied and applied to science and engineering data processing. Advanced decommutation schemes and intelligent display technology will be examined to develop imaginative improvements in rapid interpretation and distribution of information. The Generic Payload Operations Control Center (GPOCC) will demonstrate improved data handling accuracy, flexibility, and responsiveness in a complex mission environment. The ultimate goal is to automate repetitious mission operations, instrument, and satellite test functions by the applications of expert system technology and artificial intelligence resources and to enhance the level of man-machine sophistication.

  5. The young person’s guide to the PDB*

    PubMed Central

    Minor, Wladek; Dauter, Zbigniew; Jaskolski, Mariusz

    2017-01-01

    The Protein Data Bank (PDB), created in 1971 when merely seven protein crystal structures were known, today holds over 120,000 experimentally-determined three-dimensional models of macromolecules, including gigantic structures comprising hundreds of thousands of atoms, such as ribosomes and viruses. Most of the deposits come from X-ray crystallography experiments, with important contributions also made by NMR spectroscopy and, recently, by the fast-growing cryo-electron microscopy. Although the determination of a macromolecular crystal structure is now facilitated by advanced experimental tools and by sophisticated software, it is still a highly complicated research process requiring specialized training, skill, experience and a bit of luck. Understanding the plethora of structural information provided by the PDB requires that its users (consumers) have at least a rudimentary initiation. This is the purpose of this educational overview. PMID:28132477
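    As a first hands-on step with the structural data this overview introduces, the fixed-column ATOM records of a legacy PDB-format file can be parsed in a few lines. This is a minimal sketch covering only atom names and coordinates; the full wwPDB format has many more record types and fields (occupancy, B-factor, chain, residue, and so on).

```python
def parse_atoms(pdb_text):
    """Extract (atom_name, x, y, z) from ATOM/HETATM records.

    In the legacy PDB format, the atom name sits in columns 13-16 and
    the x, y, z coordinates in columns 31-38, 39-46 and 47-54
    (1-indexed), per the wwPDB file-format specification.
    """
    atoms = []
    for line in pdb_text.splitlines():
        if line.startswith("ATOM") or line.startswith("HETATM"):
            atoms.append((line[12:16].strip(),
                          float(line[30:38]),
                          float(line[38:46]),
                          float(line[46:54])))
    return atoms
```

    Fixed-column slicing rather than whitespace splitting is essential here, because adjacent fields in crowded records can run together without separating spaces.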

  6. Network Security Validation Using Game Theory

    NASA Astrophysics Data System (ADS)

    Papadopoulou, Vicky; Gregoriades, Andreas

    Non-functional requirements (NFR) such as network security have recently gained widespread attention in distributed information systems. Despite their importance, however, there is no systematic approach to validate these requirements given the complexity and uncertainty characterizing modern networks. Traditionally, network security requirements specification has been the result of a reactive process. This, however, limited the immunity property of the distributed systems that depended on these networks. Security requirements specification needs a proactive approach. Networks' infrastructure is constantly under attack by hackers and malicious software that aim to break into computers. To combat these threats, network designers need sophisticated security validation techniques that will guarantee the minimum level of security for their future networks. This paper presents a game-theoretic approach to security requirements validation. An introduction to game theory is presented along with an example that demonstrates the application of the approach.
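    The validation idea in this abstract — treating defender and attacker as players and choosing the defence that holds up best against the worst-case attack — can be sketched in a few lines. This is a simplified pure-strategy minimax, not the paper's formulation; real game-theoretic security analyses typically solve for mixed-strategy Nash equilibria, and the defence/attack labels and payoffs below are invented for illustration.

```python
# Toy attacker-vs-defender game: payoff[i][j] is the damage the
# defender expects when choosing defence i against attack j.
def minimax_defence(payoff):
    """Pick the defence minimising worst-case expected damage."""
    worst = [max(row) for row in payoff]          # worst case per defence
    best = min(range(len(payoff)), key=worst.__getitem__)
    return best, worst[best]

payoff = [
    [2, 9],   # defence 0: firewall only (invented numbers)
    [4, 5],   # defence 1: firewall + intrusion detection
]
choice, damage = minimax_defence(payoff)  # -> defence 1, worst-case damage 5
```

    The point of the worked example is that defence 0 looks better against attack 0 but is catastrophic against attack 1; a worst-case criterion therefore validates defence 1 as the safer requirement.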

  7. Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.

    PubMed

    O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark

    2012-01-01

    Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.
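    The traceability described in this abstract — barcoded items whose status is recorded at every pipeline step — can be illustrated with a minimal in-memory sketch. This is purely illustrative: the actual pipeline uses commercial LIMS software, and the class, barcode and status names below are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Sample:
    barcode: str
    species: str
    status: str = "isolated"
    history: list = field(default_factory=list)  # prior statuses, in order

class MiniLims:
    """Barcode-keyed sample registry with a traceable status history."""

    def __init__(self):
        self._samples = {}

    def register(self, barcode, species):
        self._samples[barcode] = Sample(barcode, species)

    def advance(self, barcode, new_status):
        """Move a sample to the next pipeline stage, keeping its trail."""
        s = self._samples[barcode]
        s.history.append(s.status)
        s.status = new_status

    def status(self, barcode):
        return self._samples[barcode].status
```

    Keeping the full status trail per barcode is what makes a run auditable: any sequencing dropout can be traced back through PCR and extraction to the original colony.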

  8. Effects of Direct and Indirect Instruction on Fostering Decision-Making Competence in Socioscientific Issues

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2013-04-01

    In this study, the effects of different learning environments on the promotion of decision-making competence for the socioscientific issue of genetically modified crops are investigated. The comparison focuses on direct versus indirect instruction. A sophisticated decision-making strategy was presented to the directly instructed experimental group (1), which had to apply it correctly, whereas the indirectly instructed students (2) had to invent an appropriate strategy by themselves, based on the given information and the structure of the problem context. Group discussions are analysed qualitatively in order (1) to outline how the given strategy was understood and how its results were reflected on by the students, and (2) to explore the characteristics of the invented strategies and their degree of complexity. Results indicate that the direct instruction of complex decision-making strategies may lead to a lack of understanding of the decision process when the given strategy is applied, and may therefore cause rejection of the final decision. Indirectly instructed students were able to invent sophisticated decision-making strategies containing compensatory trade-offs. It is concluded that when complex decision-making strategies are instructed directly, essential elements of reflection have to be integrated in order to gain greater transparency. Accordingly, empirical evidence has been found to consider indirect instruction a possible way to foster decision-making strategies for complex socioscientific issues, even if compensatory procedures are considered necessary.

  9. Transforming Microbial Genotyping: A Robotic Pipeline for Genotyping Bacterial Strains

    PubMed Central

    Velayudhan, Vimalkumar; Murphy, Ronan A.; Achtman, Mark

    2012-01-01

    Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single-colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, two to three people have processed >200,000 items. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost. PMID:23144721

  10. A new approach to design and use of management information.

    PubMed

    Daft, R L; MacIntosh, N B

    1978-01-01

    Information that is both accurate and timely is probably the most important resource managers need to make sound decisions about the problems and issues facing their organizations. Unfortunately, sophisticated information systems often fail to meet this need. Managers complain that the data produced by information systems arrive too late, are too general, and lack accuracy. Daft and MacIntosh studied the system problems of a number of organizations, discovering that understanding their work activities is critical to the design of successful information systems. The authors also considered the volume of information, the preciseness of information, and the way in which it is handled by users to develop a model describing information systems. The article illustrates how the model was applied successfully in four case situations.

  11. CD-ROM And Knowledge Integration

    NASA Astrophysics Data System (ADS)

    Rann, Leonard S.

    1988-06-01

    As the title of this paper suggests, it is about CD-ROM technology and the structuring of massive databases. Even more, it is about the impact CD-ROM has had on the publication of massive amounts of information, and the unique qualities of the medium that allow for the most sophisticated computer retrieval techniques ever used. I am not drawing on experience as a pedant in the educational field, but rather as a software and database designer who has worked with CD-ROM since its inception. I will give examples from my company's current applications, as well as discuss some of the challenges that face information publishers in the future. In particular, I have a belief about what the most valuable outlet that can be created using CD-ROM will be: the CD-ROM is particularly suited for the mass delivery of information systems and databases that either require or utilize a large amount of computational preprocessing to achieve a real-time or interactive response. Until the advent of CD-ROM technology, this level of sophistication in publication was virtually impossible. I will explain this further later in the paper. First, I will discuss the salient features of CD-ROM that make it unique in the world of data storage for electronic publishing.

  12. A multiscale forecasting method for power plant fleet management

    NASA Astrophysics Data System (ADS)

    Chen, Hongmei

    In recent years, the electric power industry has been challenged by a high level of uncertainty and volatility brought on by deregulation and globalization. A power producer must minimize life cycle cost while meeting stringent safety and regulatory requirements and fulfilling customer demand for high reliability. Therefore, to achieve true system excellence, this work creates a more sophisticated system-level decision-making process, supported by a more accurate forecasting system, to manage diverse and often widely dispersed generation units as a single, easily scaled and deployed fleet and to fully utilize the critical assets of a power producer. The process takes into account the time horizon of each of the major decision actions taken in a power plant and develops methods for sharing information between them. These decisions are highly interrelated, and no optimal operation can be achieved without sharing information across the overall process. The process includes a forecasting system that provides information for planning under uncertainty. A new forecasting method is proposed that combines several modeling techniques at the different time scales of the forecast quantities. It can not only take advantage of abundant historical data but also account for the impact of pertinent driving forces in the external business environment, achieving more accurate forecasting results. Block bootstrap is then used to measure the bias in the estimate of the expected life cycle cost that will actually be needed to run the business of a power plant in the long term. Finally, scenario analysis is used to provide a composite picture of future developments for decision making and strategic planning. The decision-making process is applied to a typical power producer chosen to represent challenging customer demand during high-demand periods.
The process enhances system excellence by providing more accurate market information, evaluating the impact of external business environment, and considering cross-scale interactions between decision actions. Along with this process, system operation strategies, maintenance schedules, and capacity expansion plans that guide the operation of the power plant are optimally identified, and the total life cycle costs are estimated.
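    The block-bootstrap bias measurement described above can be sketched as follows; the cost series, block length, and statistic are hypothetical stand-ins, since the study's data are not given here.

```python
# Moving-block bootstrap sketch: resample contiguous blocks so that
# short-range dependence in the cost series is preserved, then compare
# the resampled statistics with the point estimate to gauge bias.
# All numbers below are hypothetical.
import random

def moving_block_bootstrap(series, block_len, rng):
    """Build one resample by concatenating randomly chosen blocks."""
    n = len(series)
    sample = []
    while len(sample) < n:
        s = rng.randrange(n - block_len + 1)   # random block start
        sample.extend(series[s:s + block_len])
    return sample[:n]                          # truncate to original length

def bootstrap_bias(series, stat, block_len=3, reps=500, seed=0):
    """Estimate bias of `stat` as mean(resampled stats) - stat(series)."""
    rng = random.Random(seed)
    point = stat(series)
    resampled = [stat(moving_block_bootstrap(series, block_len, rng))
                 for _ in range(reps)]
    return sum(resampled) / reps - point

# Hypothetical annual life-cycle-cost figures:
costs = [102, 98, 105, 110, 101, 99, 107, 104, 100, 103]
print(bootstrap_bias(costs, lambda xs: sum(xs) / len(xs)))
```

    Resampling blocks rather than individual observations keeps the serial correlation of the cost series intact, which is the reason a block bootstrap, rather than an ordinary bootstrap, suits time-dependent life cycle cost data.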

  13. The Use of Electronic Data Capture Tools in Clinical Trials: Web-Survey of 259 Canadian Trials

    PubMed Central

    Jonker, Elizabeth; Sampson, Margaret; Krleža-Jerić, Karmela; Neisa, Angelica

    2009-01-01

    Background Electronic data capture (EDC) tools provide automated support for data collection, reporting, query resolution, randomization, and validation, among other features, for clinical trials. There is a trend toward greater adoption of EDC tools in clinical trials, but there is also uncertainty about how many trials are actually using this technology in practice. A systematic review of EDC adoption surveys conducted up to 2007 concluded that only 20% of trials are using EDC systems, but previous surveys had weaknesses. Objectives Our primary objective was to estimate the proportion of phase II/III/IV Canadian clinical trials that used an EDC system in 2006 and 2007. The secondary objectives were to investigate the factors that can have an impact on adoption and to develop a scale to assess the extent of sophistication of EDC systems. Methods We conducted a Web survey to estimate the proportion of trials that were using an EDC system. The survey was sent to the Canadian site coordinators for 331 trials. We also developed and validated a scale using Guttman scaling to assess the extent of sophistication of EDC systems. Trials using EDC were compared by the level of sophistication of their systems. Results We had a 78.2% response rate (259/331) for the survey. It is estimated that 41% (95% CI 37.5%-44%) of clinical trials were using an EDC system. Trials funded by academic institutions, government, and foundations were less likely to use an EDC system compared to those sponsored by industry. Also, larger trials tended to be more likely to adopt EDC. The EDC sophistication scale had six levels, a coefficient of reproducibility of 0.901 (P < .001), and a coefficient of scalability of 0.79. There was no difference in sophistication based on the funding source, but pediatric trials were more likely to use a sophisticated EDC system.
Conclusion The adoption of EDC systems in clinical trials in Canada is higher than the literature indicated: a large proportion of clinical trials in Canada use some form of automated data capture system. To inform future adoption, research should gather stronger evidence on the costs and benefits of using different EDC systems. PMID:19275984
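    The coefficient of reproducibility reported above can be computed as sketched below; the response matrix and item ordering are a made-up toy example, not the survey data, and the survey's own coefficients (0.901 and 0.79) are not reproduced here.

```python
# Guttman coefficient of reproducibility sketch: CR = 1 - errors / responses,
# where errors are deviations from the ideal cumulative pattern implied by
# each respondent's total score. The data below are hypothetical.

def coefficient_of_reproducibility(responses):
    """responses: list of 0/1 lists, items ordered from easiest (most
    widely adopted EDC feature) to hardest. For each row, the ideal
    Guttman pattern is 1...1 0...0 with as many 1s as the row total."""
    errors = 0
    total = 0
    for row in responses:
        score = sum(row)
        ideal = [1] * score + [0] * (len(row) - score)
        errors += sum(a != b for a, b in zip(row, ideal))
        total += len(row)
    return 1 - errors / total

systems = [
    [1, 1, 1, 0, 0, 0],  # perfectly scalable response pattern
    [1, 1, 0, 1, 0, 0],  # two deviations from the ideal pattern
    [1, 0, 0, 0, 0, 0],  # perfectly scalable response pattern
]
print(round(coefficient_of_reproducibility(systems), 3))  # → 0.889
```

    A coefficient near 1 indicates the items form a cumulative hierarchy, which is what allows the survey to treat its six levels as an ordered sophistication scale.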

  14. The Future of Weapons of Mass Destruction: Their Nature and Role in 2030

    DTIC Science & Technology

    2014-06-01

    substantial improvements are al- lowed under the rubric of life extension. Other states are not so constrained and may find different ways to develop pure...The foregoing capabilities do not involve genetic manipulation or bioen- gineering; they utilize longstanding biological knowledge and processes. More...sophisticated understanding of biological systems (genomic and proteomic infor- mation) and processes ( genetic modification, bioengineering) for

  15. Optimality in mono- and multisensory map formation.

    PubMed

    Bürck, Moritz; Friedel, Paul; Sichert, Andreas B; Vossen, Christine; van Hemmen, J Leo

    2010-07-01

    In the struggle for survival in a complex and dynamic environment, nature has developed a multitude of sophisticated sensory systems. In order to exploit the information provided by these sensory systems, higher vertebrates reconstruct the spatio-temporal environment from each of the sensory systems at their disposal. That is, for each modality the animal computes a neuronal representation of the outside world, a monosensory neuronal map. Here we present a universal framework that allows one to calculate the specific layout of the involved neuronal network by means of a general mathematical principle, viz., stochastic optimality. To illustrate the use of this theoretical framework, we provide a step-by-step tutorial on how to apply our model. In so doing, we present a spatial and a temporal example of optimal stimulus reconstruction that underline the advantages of our approach. That is, given known physical signal transmission and rudimentary knowledge of the detection process, our approach allows one to estimate the achievable performance and to predict neuronal properties of biological sensory systems. Finally, information from different sensory modalities has to be integrated so as to gain a unified perception of reality for further processing, e.g., for distinct motor commands. We briefly discuss concepts of multimodal interaction and how a multimodal space can evolve by alignment of monosensory maps.

  16. Sharing Digital Data

    ERIC Educational Resources Information Center

    Benedis-Grab, Gregory

    2011-01-01

    Computers have changed the landscape of scientific research in profound ways. Technology has always played an important role in scientific experimentation--through the development of increasingly sophisticated tools, the measurement of elusive quantities, and the processing of large amounts of data. However, the advent of social networking and the…

  17. Television camera as a scientific instrument

    NASA Technical Reports Server (NTRS)

    Smokler, M. I.

    1970-01-01

    Rigorous calibration program, coupled with a sophisticated data-processing program that introduced compensation for system response to correct photometry, geometric linearity, and resolution, converted a television camera to a quantitative measuring instrument. The output data are in the forms of both numeric printout records and photographs.

  18. TREATMENT OF VOLATILE ORGANIC COMPOUNDS IN WASTE GASES USING A TRICKLING BIOFILTER SYSTEM: A MODELING APPROACH

    EPA Science Inventory

    Biofiltration represents a novel strategy for controlling VOC emissions from a variety of industrial processes. As commercial applications of these systems increase, sophisticated theoretical models will be useful in establishing design criteria for providing insights into impor...

  19. Statistical mechanics of complex economies

    NASA Astrophysics Data System (ADS)

    Bardoscia, Marco; Livan, Giacomo; Marsili, Matteo

    2017-04-01

    In the pursuit of ever increasing efficiency and growth, our economies have evolved to remarkable degrees of complexity, with nested production processes feeding each other in order to create products of greater sophistication from less sophisticated ones, down to raw materials. The engine of such an expansion has been competitive markets that, according to general equilibrium theory (GET), achieve efficient allocations under specific conditions. We study large random economies within the GET framework, as templates of complex economies, and we find that a non-trivial phase transition occurs: the economy freezes in a state where all production processes collapse when either the number of primary goods or the number of available technologies falls below a critical threshold. As in other examples of phase transitions in large random systems, this is an unintended consequence of the growth in complexity. Our findings suggest that the Industrial Revolution can be regarded as a sharp transition between different phases, but also imply that well-developed economies can collapse if too many intermediate goods are introduced.

  20. A scheme to calculate higher-order homogenization as applied to micro-acoustic boundary value problems

    NASA Astrophysics Data System (ADS)

    Vagh, Hardik A.; Baghai-Wadji, Alireza

    2008-12-01

    Current technological challenges in materials science and the high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases, straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to unsatisfactorily ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities, various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves sophisticated averaging and asymptotic order analysis to obtain solutions. In the majority of cases, only zero-order terms are constructed because of the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher-order terms, thus guaranteeing higher accuracy and greater robustness of the numerical results. We present

  1. Definition of variables required for comprehensive description of drug dosage and clinical pharmacokinetics.

    PubMed

    Medem, Anna V; Seidling, Hanna M; Eichler, Hans-Georg; Kaltschmidt, Jens; Metzner, Michael; Hubert, Carina M; Czock, David; Haefeli, Walter E

    2017-05-01

    Electronic clinical decision support systems (CDSS) require drug information that can be processed by computers. The goal of this project was to determine and evaluate a compilation of variables that comprehensively capture the information contained in the summary of product characteristic (SmPC) and unequivocally describe the drug, its dosage options, and clinical pharmacokinetics. An expert panel defined and structured a set of variables and drafted a guideline to extract and enter information on dosage and clinical pharmacokinetics from textual SmPCs as published by the European Medicines Agency (EMA). The set of variables was iteratively revised and evaluated by data extraction and variable allocation of roughly 7% of all centrally approved drugs. The information contained in the SmPC was allocated to three information clusters consisting of 260 variables. The cluster "drug characterization" specifies the nature of the drug. The cluster "dosage" provides information on approved drug dosages and defines corresponding specific conditions. The cluster "clinical pharmacokinetics" includes pharmacokinetic parameters of relevance for dosing in clinical practice. A first evaluation demonstrated that, despite the complexity of the current free text SmPCs, dosage and pharmacokinetic information can be reliably extracted from the SmPCs and comprehensively described by a limited set of variables. By proposing a compilation of variables well describing drug dosage and clinical pharmacokinetics, the project represents a step forward towards the development of a comprehensive database system serving as information source for sophisticated CDSS.

  2. A health plan report card for dentistry.

    PubMed

    Bader, J D; Shugars, D A; Hayden, W J; White, B A

    1996-01-01

    Employers are demanding information about the performance of the health care plans they purchase for their employees. As a result, "report cards" are now beginning to appear that provide standardized, population-based comparison data on managed medical care plans' quality of care, access and member satisfaction, utilization, and financial status. Although report cards for dental care plans have not yet been developed, it is likely that purchasers will soon expect such performance information. A prototype report card for dental managed care plans is proposed in an effort to facilitate the development of a consensus standard for dentistry. The thirty-eight measures proposed for the report card are designed to be obtainable with a realistic level of additional effort in most dental practices. They were selected to provide data on questions of importance to purchasers and to assess processes and outcomes that are important because there is strong evidence for their effectiveness. The rationale for the measures is discussed, as are the steps required to develop more sophisticated measures. While the responsibility for procuring the information needed for dental report cards will lie initially with administrators of dental care plans, it is likely that in the near future individual practitioners will be expected to supply this information to both individual patients and potential contractors.

  3. A New Approach for Combining Time-of-Flight and RGB Cameras Based on Depth-Dependent Planar Projective Transformations

    PubMed Central

    Salinas, Carlota; Fernández, Roemi; Montes, Héctor; Armada, Manuel

    2015-01-01

    Image registration for sensor fusion is a valuable technique for acquiring 3D and colour information for a scene. Nevertheless, this process normally relies on feature-matching techniques, which is a drawback for combining sensors that are not able to deliver common features. The combination of ToF and RGB cameras is an instance of that problem. Typically, the fusion of these sensors is based on computing the extrinsic parameters of the coordinate transformation between the two cameras. This leads to a loss of colour information because of the low resolution of the ToF camera, and sophisticated algorithms are required to minimize this issue. This work proposes a method for registering sensors with non-common features that avoids the loss of colour information. The depth information is used as a virtual feature for estimating a depth-dependent homography lookup table (Hlut). The homographies are computed from sets of ground control points in 104 images. Since the distances from the control points to the ToF camera are known, the working distance of each element of the Hlut is estimated. Finally, two series of experimental tests were carried out in order to validate the capabilities of the proposed method. PMID:26404315
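    A minimal sketch of how such a depth-dependent homography lookup table might be consulted at run time; the table entries, working distances, and pixel shifts below are hypothetical, not the calibrated values from the paper.

```python
# Depth-dependent homography lookup sketch: pick the homography whose
# calibrated working distance is closest to the measured depth, then
# map the pixel through it. All table values are hypothetical.

def apply_homography(H, x, y):
    """Map pixel (x, y) through a 3x3 homography (row-major nested lists)."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w          # homogeneous normalization

def register_pixel(hlut, depth, x, y):
    """Select the Hlut entry with the nearest working distance to `depth`."""
    d, H = min(hlut, key=lambda entry: abs(entry[0] - depth))
    return apply_homography(H, x, y)

# Hypothetical two-entry table: the near plane shifts pixels by (5, 3),
# the far plane by (2, 1), mimicking depth-dependent parallax.
hlut = [
    (1.0, [[1, 0, 5], [0, 1, 3], [0, 0, 1]]),
    (3.0, [[1, 0, 2], [0, 1, 1], [0, 0, 1]]),
]
print(register_pixel(hlut, 1.2, 10, 20))  # → (15.0, 23.0)
```

    Interpolating between neighbouring table entries instead of snapping to the nearest one would smooth the registration across depths; the nearest-entry rule is used here only to keep the sketch short.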

  4. Severe Thunderstorm and Tornado Warnings at Raleigh, North Carolina.

    NASA Astrophysics Data System (ADS)

    Hoium, Debra K.; Riordan, Allen J.; Monahan, John; Keeter, Kermit K.

    1997-11-01

    The National Weather Service issues public warnings for severe thunderstorms and tornadoes when these storms appear imminent. A study of the warning process was conducted at the National Weather Service Forecast Office at Raleigh, North Carolina, from 1994 through 1996. The purpose of the study was to examine the decision process by documenting the types of information leading to decisions to warn or not to warn and by describing the sequence and timing of events in the development of warnings. It was found that the evolution of warnings followed a logical sequence beginning with storm monitoring and proceeding with increasingly focused activity. For simplicity, information input to the process was categorized as one of three types: ground truth, radar reflectivity, or radar velocity. Reflectivity, velocity, and ground truth were all equally likely to initiate the investigation process. This investigation took an average of 7 min, after which either a decision was made not to warn or new information triggered the warning. Decisions not to issue warnings were based more on ground truth and reflectivity than on radar velocity products. Warnings with investigations of more than 2 min were more likely to be triggered by radar reflectivity than by velocity or ground truth. Warnings with a shorter investigation time, defined here as "immediate trigger warnings," were less frequently based on velocity products and more on ground truth information. Once the decision was made to warn, it took an average of 2.1 min to prepare the warning text. In 85% of cases when warnings were issued, at least one contact was made to emergency management officials or storm spotters in the warned county. Reports of severe weather were usually received soon after the warning was transmitted; almost half of these arrived within 30 min after issue.
A total of 68% were received during the severe weather episode, but some of these storm reports later proved false according to Storm Data. Even though the WSR-88D is a sophisticated tool, ground truth information was found to be a vital part of the warning process. However, the data did not indicate that population density was statistically correlated with either the number of warnings issued or the verification rate.

  5. 77 FR 71860 - Self-Regulatory Organizations; EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-04

    ... RLP Approval Order, which account for the difference of assumed information and sophistication level... Members that utilize Retail Orders. Flag ZA is proposed to be yielded for those Members that use Retail... is proposed to be yielded for those Members that use Retail Orders that remove liquidity from EDGX...

  6. Arsenic speciation in solids using X-ray absorption spectroscopy

    USGS Publications Warehouse

    Foster, Andrea L.; Kim, Chris S.

    2014-01-01

    One of the most important aims of this review is to clarify the different types of analysis that are performed on As-XAS spectra, and to describe the benefits, drawbacks, and limitations of each. Arsenic XAS spectra are analyzed to obtain one or more of the following types of information (in increasing order of sophistication):

  7. Reasoning about Benefits, Costs, and Risks of Chemical Substances: Mapping Different Levels of Sophistication

    ERIC Educational Resources Information Center

    Cullipher, S.; Sevian, H.; Talanquer, V.

    2015-01-01

    The ability to evaluate options and make informed decisions about problems in relevant contexts is a core competency in science education that requires the use of both domain-general and discipline-specific knowledge and reasoning strategies. In this study we investigated the implicit assumptions and modes of reasoning applied by individuals with…

  8. Advanced Telemetry System Development.

    DTIC Science & Technology

    Progress in advanced telemetry system development is described. Discussions are included of studies leading to the specification for design...characteristics of adaptive and analytical telemetry systems in which the information efficiently utilizes the data channel capacity. Also discussed are...Progress indicates that further sophistication of existing designs in telemetry will be less advantageous than the development of new systems of

  9. Data mining: sophisticated forms of managed care modeling through artificial intelligence.

    PubMed

    Borok, L S

    1997-01-01

    Data mining is a recent development in computer science that combines artificial intelligence algorithms and relational databases to discover patterns automatically, without the use of traditional statistical methods. Work with data mining tools in health care is in a developmental stage that holds great promise, given the combination of demographic and diagnostic information.

  10. Searching for Significance in Unstructured Data: Text Mining with Leximancer

    ERIC Educational Resources Information Center

    Thomas, David A.

    2014-01-01

    Scholars in many knowledge domains rely on sophisticated information technologies to search for and retrieve records and publications pertinent to their research interests. But what is a scholar to do when a search identifies hundreds of documents, any of which might be vital or irrelevant to his or her work? The problem is further complicated by…

  11. Robert Sabuda and Matthew Reinhart: A Cut Above

    ERIC Educational Resources Information Center

    Patton, Jessica Rae

    2006-01-01

    No one could argue the appeal for kids and adults alike of pop-up books. This article features two pop-up book author-artists, Robert Sabuda and Matthew Reinhart, whose books are in a league apart, with their stunning production values, well-written narratives, informative content and the sheer sophistication of the movable art. The two pioneered…

  12. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  13. Multiresource inventories incorporating GIS, GPS, and database management systems

    Treesearch

    Loukas G. Arvanitis; Balaji Ramachandran; Daniel P. Brackett; Hesham Abd-El Rasol; Xuesong Du

    2000-01-01

    Large-scale natural resource inventories generate enormous data sets. Their effective handling requires a sophisticated database management system. Such a system must be robust enough to efficiently store large amounts of data and flexible enough to allow users to manipulate a wide variety of information. In a pilot project, related to a multiresource inventory of the...

  14. Teaching the History of Technical Communication: A Lesson with Franklin and Hoover

    ERIC Educational Resources Information Center

    Todd, Jeff

    2003-01-01

    The first part of this article shows that research in the history of technical communication has increased in quantity and sophistication over the last 20 years. Scholarship that describes how to teach with that information, however, has not followed, even though teaching the history of the field is a need recognized by several scholars. The…

  15. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  16. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  17. Annotated Bibliography on Science and Mathematics Education in Sub-Saharan Africa.

    ERIC Educational Resources Information Center

    Case, John H.

    This bibliography is intended to provide a source of information on what has been written on science and mathematics education in Africa until August 1967. The works included range from the level of the post-graduate thesis to articles in local teaching journals covering a range of topics from sophisticated research to teachers talking among…

  18. Comparative Longitudinal Consumer Information Research in Germany and the United States. German Studies Notes.

    ERIC Educational Resources Information Center

    Thorelli, Hans B.; And Others

    The paper presents the text of a business and economics session at a conference on recent sociopolitical and socioeconomic developments in West Germany and the United States. Intended as a contribution to the dialogue between the two societies, the paper focuses on differences between average and sophisticated consumers within each country and…

  19. Music as Active Information Resource for Players in Video Games

    ERIC Educational Resources Information Center

    Nagorsnick, Marian; Martens, Alke

    2015-01-01

    In modern video games, music can come in different shapes: it can be developed on a very high compositional level, with sophisticated sound elements like in professional film music; it can be developed on a very coarse level, underlying special situations (like danger or attack); it can also be automatically generated by sound engines. However, in…

  20. Factors Impacting University Instructors' and Students' Perceptions of Course Effectiveness and Technology Integration in the Age of Web 2.0

    ERIC Educational Resources Information Center

    Venkatesh, Vivek; Rabah, Jihan; Fusaro, Magda; Couture, Annie; Varela, Wynnpaul; Alexander, Kristopher

    2016-01-01

    We are witnessing the integration of increasingly sophisticated information and communication technologies (ICTs) in higher education settings. Understanding learners' and instructors' perceptions of their proficiency and use of ICTs is critical to the success of their integration in universities. Using a theoretical framework grounded in…

  1. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  2. Sharp Focus on Film; A Brief Guide to Research on Movies and Movie People.

    ERIC Educational Resources Information Center

    Smith, Phil; Sasse, Margo

    An attempt to increase individuals' sophistication in finding information about movies is made in this guide. Strategies to uncover reviews, criticism, biographies and other film data in the University of California at San Diego libraries are outlined. The first major section deals with approaches to the card catalog and suggests ways of getting…

  3. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    ERIC Educational Resources Information Center

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  4. Geographic Information Systems: A Primer

    DTIC Science & Technology

    1990-10-01

Approved for public release; distribution unlimited. ...utilizing sophisticated integrated databases (usually vector-based), avoid the indirect value coding scheme by recognizing names or direct magnitudes... intricate involvement required by the operator in order to establish a functional coding scheme. A simple raster system, in which cell values indicate

  5. Inventors in the Making

    ERIC Educational Resources Information Center

    Murray, Jenny; Bartelmay, Kathy

    2005-01-01

Can second-grade students construct an understanding of sophisticated science processes and explore physics concepts while creating their own inventions? Yes! Students accomplished this and much more through a month-long project in which they used Legos and Robolab, the Lego computer programming software, to create their own inventions. One…

  6. ENDOCRINE DISRUPTING COMPOUNDS: PROCESSES FOR REMOVAL FROM DRINKING WATER AND WASTEWATER

    EPA Science Inventory

    Although the list of potentially harmful substances is still being compiled and more sophisticated laboratory tests for detection of endocrine disrupting chemicals (EDCs) are being developed, an initial list of known EDCs has been made and an array of drinking water and wastewate...

  7. Applications in Digital Image Processing

    ERIC Educational Resources Information Center

    Silverman, Jason; Rosen, Gail L.; Essinger, Steve

    2013-01-01

    Students are immersed in a mathematically intensive, technological world. They engage daily with iPods, HDTVs, and smartphones--technological devices that rely on sophisticated but accessible mathematical ideas. In this article, the authors provide an overview of four lab-type activities that have been used successfully in high school mathematics…

  8. A Critical Review of Some Qualitative Research Methods Used to Explore Rater Cognition

    ERIC Educational Resources Information Center

    Suto, Irenka

    2012-01-01

    Internationally, many assessment systems rely predominantly on human raters to score examinations. Arguably, this facilitates the assessment of multiple sophisticated educational constructs, strengthening assessment validity. It can introduce subjectivity into the scoring process, however, engendering threats to accuracy. The present objectives…

  9. Agriculture and the Community: The Sociological Perspective.

    ERIC Educational Resources Information Center

    Heffernan, William D.; Campbell, Rex R.

    Emergence of a dual agricultural system, need for sophisticated knowledge and equipment, declining importance of labor, and geographic and organizational concentration of the production and processing of certain commodities are creating changes in rural communities. While some changes will have negative social/economic impacts, the importance of…

  10. NASA Remote Sensing Data in Earth Sciences: Processing, Archiving, Distribution, Applications at the GES DISC

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory G.

    2005-01-01

The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the major Distributed Active Archive Centers (DAACs) archiving and distributing remote sensing data from NASA's Earth Observing System. Beyond simply providing data, the GES DISC/DAAC has developed various value-adding processing services. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the users' algorithms. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into the web-based tools (avoiding unnecessary downloads of the full dataset). The existing data management infrastructure at the GES DISC supports a wide spectrum of options, from subsetting data spatially and/or by parameter to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. Shifting the processing and data management burden from users to the GES DISC allows scientists to concentrate on science, while the GES DISC handles the data management and data processing at a lower cost. Several examples of successful partnerships with scientists in the area of data processing and mining are presented.
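The spatial and parameter subsetting service described above can be sketched in miniature. The record layout, bounding box, and function below are illustrative assumptions, not the actual GES DISC interface:

```python
# Sketch of spatial + parameter subsetting of a gridded dataset, in the
# spirit of the GES DISC services described above. The data layout and
# function name are hypothetical, not the real GES DISC API.

def subset(records, bbox, parameters):
    """Keep records inside bbox=(lat_min, lat_max, lon_min, lon_max),
    retaining only the requested parameters plus coordinates."""
    lat_min, lat_max, lon_min, lon_max = bbox
    out = []
    for rec in records:
        if lat_min <= rec["lat"] <= lat_max and lon_min <= rec["lon"] <= lon_max:
            out.append({k: rec[k] for k in ("lat", "lon", *parameters)})
    return out

# Hypothetical grid cells with two parameters per cell
grid = [
    {"lat": 10.0, "lon": 20.0, "aerosol": 0.12, "ozone": 280.0},
    {"lat": 55.0, "lon": 20.0, "aerosol": 0.30, "ozone": 310.0},
    {"lat": 12.5, "lon": 22.5, "aerosol": 0.18, "ozone": 290.0},
]

# Subset spatially (5-15N, 15-25E) and by parameter (aerosol only)
result = subset(grid, bbox=(5.0, 15.0, 15.0, 25.0), parameters=["aerosol"])
print(result)
```

Performing this reduction close to the archive, rather than at the user's end, is what avoids shipping the full dataset over the network.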

  11. Development of Smart Precision Forest in Conifer Plantation in Japan Using Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Katoh, M.; Deng, S.; Takenaka, Y.; Cheung, K.; Oono, K.; Horisawa, M.; Hyyppä, J.; Yu, X.; Liang, X.; Wang, Y.

    2017-10-01

    Currently, the authors are planning to launch a consortium effort toward Japan's first smart precision forestry project using laser data and to develop this technology throughout the country. Smart precision forestry information gathered using the Nagano model (laser scanning from aircraft, drone, and backpack) is being developed to improve the sophistication of forest information, reduce labor-intensive work, maintain sustainable timber productivity, and facilitate supply chain management by laser sensing information in collaboration with industry, academia, and government. In this paper, we outline the research project and the technical development situation of unmanned aerial vehicle laser scanning.

  12. The study and implementation of the wireless network data security model

    NASA Astrophysics Data System (ADS)

    Lin, Haifeng

    2013-03-01

In recent years, the rapid development of Internet technology and the advent of the information age have created strong demand for information products and for the information technology market. In particular, network security requirements have become more sophisticated. This paper analyzes data security vulnerabilities in wireless networks and lists the serious defects and related problems in current wireless network frameworks. It proposes a virtual private network (VPN) technology and a wireless network security defense structure, and presents an intrusion detection model and corresponding detection strategies for wireless networks.

  13. Today's CIO: catalyst for managed care change.

    PubMed

    Sanchez, P

    1997-05-01

As the impact of managed care increases and capitation becomes all-pervasive, healthcare providers' attention to cost control will intensify. For integrated delivery networks (IDNs) to be competitive, today's CIO must leverage managed care as a catalyst for change, and use a sophisticated information system toolset as the means to an integrated end. One area many CIOs target for fast results and maximum cost savings is resource management. This article reviews how Dick Escue, chief information officer at Baptist Memorial Health Care Corporation (Memphis, TN), uses electronic information management systems to integrate and conserve the resources of Baptist's widespread healthcare organization.

  14. The Understanding and Interpretation of Innovative Technology-Enabled Multidimensional Physical Activity Feedback in Patients at Risk of Future Chronic Disease

    PubMed Central

    Western, Max J.; Peacock, Oliver J.; Stathi, Afroditi; Thompson, Dylan

    2015-01-01

Background Innovative physical activity monitoring technology can be used to depict rich visual feedback that encompasses the various aspects of physical activity known to be important for health. However, it is unknown whether patients who are at risk of chronic disease would understand such sophisticated personalised feedback or whether they would find it useful and motivating. The purpose of the present study was to determine whether technology-enabled multidimensional physical activity graphics and visualisations are comprehensible and usable for patients at risk of chronic disease. Method We developed several iterations of graphics depicting minute-by-minute activity patterns and integrated physical activity health targets. Subsequently, patients at moderate/high risk of chronic disease (n=29) and healthcare practitioners (n=15) from South West England underwent full 7-day activity monitoring followed by individual semi-structured interviews in which they were asked to comment on their own personalised visual feedback. Framework analysis was used to gauge their interpretation of the personalised feedback, graphics and visualisations. Results We identified two main components, focussing on (a) the interpretation of feedback designs and data and (b) the impact of personalised visual physical activity feedback on facilitation of health behaviour change. Participants demonstrated a clear ability to understand the sophisticated personal information, plus an enhanced physical activity knowledge. They reported that receiving multidimensional feedback was motivating and could be usefully applied to facilitate their efforts in becoming more physically active. Conclusion Multidimensional physical activity feedback can be made comprehensible, informative and motivational by using appropriate graphics and visualisations.
There is an opportunity to exploit the full potential created by technological innovation and provide sophisticated personalised physical activity feedback as an adjunct to support behaviour change. PMID:25938455

  15. Modeling of outpatient prescribing process in iran: a gateway toward electronic prescribing system.

    PubMed

    Ahmadi, Maryam; Samadbeik, Mahnaz; Sadoughi, Farahnaz

    2014-01-01

Implementation of an electronic prescribing system can overcome many problems of the paper prescribing system and provide numerous opportunities for more effective and advantageous prescribing. Successful implementation of such a system requires complete and deep understanding of the work content, human resources, and workflow of paper prescribing. The current study was designed to model the current business process of outpatient prescribing in Iran and clarify the different actions during this process. In order to describe the prescribing process and the system features in Iran, the methodology of business process modeling and analysis was used in the present study. The results of the process documentation were analyzed using a conceptual model of workflow elements and the technique of modeling "As-Is" business processes. Analysis of the current (as-is) prescribing process demonstrated that Iran stood at the first level of sophistication in the graduated levels of electronic prescribing, namely electronic prescription reference, and that there were problematic areas including bottlenecks, redundant and duplicated work, concentration of decision nodes, and communicative weaknesses among stakeholders of the process. Using information technology in some activities of medication prescription in Iran has not eliminated the dependence of the stakeholders on paper-based documents and prescriptions. Therefore, it is necessary to implement proper system programming in order to support change management and solve the problems in the existing prescribing process. To this end, a suitable basis should be provided for the reorganization and improvement of the prescribing process for future electronic systems.

  16. Translational Cognition for Decision Support in Critical Care Environments: A Review

    PubMed Central

    Patel, Vimla L.; Zhang, Jiajie; Yoskowitz, Nicole A.; Green, Robert; Sayan, Osman R.

    2008-01-01

    The dynamic and distributed work environment in critical care requires a high level of collaboration among clinical team members and a sophisticated task coordination system to deliver safe, timely and effective care. A complex cognitive system underlies the decision-making process in such cooperative workplaces. This methodological review paper addresses the issues of translating cognitive research to clinical practice with a specific focus on decision-making in critical care, and the role of information and communication technology to aid in such decisions. Examples are drawn from studies of critical care in our own research laboratories. Critical care, in this paper, includes both intensive (inpatient) and emergency (outpatient) care. We define translational cognition as the research on basic and applied cognitive issues that contribute to our understanding of how information is stored, retrieved and used for problem-solving and decision-making. The methods and findings are discussed in the context of constraints on decision-making in real world complex environments and implications for supporting the design and evaluation of decision support tools for critical care health providers. PMID:18343731

  17. Translational cognition for decision support in critical care environments: a review.

    PubMed

    Patel, Vimla L; Zhang, Jiajie; Yoskowitz, Nicole A; Green, Robert; Sayan, Osman R

    2008-06-01

    The dynamic and distributed work environment in critical care requires a high level of collaboration among clinical team members and a sophisticated task coordination system to deliver safe, timely and effective care. A complex cognitive system underlies the decision-making process in such cooperative workplaces. This methodological review paper addresses the issues of translating cognitive research to clinical practice with a specific focus on decision-making in critical care, and the role of information and communication technology to aid in such decisions. Examples are drawn from studies of critical care in our own research laboratories. Critical care, in this paper, includes both intensive (inpatient) and emergency (outpatient) care. We define translational cognition as the research on basic and applied cognitive issues that contribute to our understanding of how information is stored, retrieved and used for problem-solving and decision-making. The methods and findings are discussed in the context of constraints on decision-making in real-world complex environments and implications for supporting the design and evaluation of decision support tools for critical care health providers.

  18. Value added data archiving

    NASA Technical Reports Server (NTRS)

    Berard, Peter R.

    1993-01-01

    Researchers in the Molecular Sciences Research Center (MSRC) of Pacific Northwest Laboratory (PNL) currently generate massive amounts of scientific data. The amount of data that will need to be managed by the turn of the century is expected to increase significantly. Automated tools that support the management, maintenance, and sharing of this data are minimal. Researchers typically manage their own data by physically moving datasets to and from long term storage devices and recording a dataset's historical information in a laboratory notebook. Even though it is not the most efficient use of resources, researchers have tolerated the process. The solution to this problem will evolve over the next three years in three phases. PNL plans to add sophistication to existing multilevel file system (MLFS) software by integrating it with an object database management system (ODBMS). The first phase in the evolution is currently underway. A prototype system of limited scale is being used to gather information that will feed into the next two phases. This paper describes the prototype system, identifies the successes and problems/complications experienced to date, and outlines PNL's long term goals and objectives in providing a permanent solution.

  19. Theory of Remote Image Formation

    NASA Astrophysics Data System (ADS)

    Blahut, Richard E.

    2004-11-01

    In many applications, images, such as ultrasonic or X-ray signals, are recorded and then analyzed with digital or optical processors in order to extract information. Such processing requires the development of algorithms of great precision and sophistication. This book presents a unified treatment of the mathematical methods that underpin the various algorithms used in remote image formation. The author begins with a review of transform and filter theory. He then discusses two- and three-dimensional Fourier transform theory, the ambiguity function, image construction and reconstruction, tomography, baseband surveillance systems, and passive systems (where the signal source might be an earthquake or a galaxy). Information-theoretic methods in image formation are also covered, as are phase errors and phase noise. Throughout the book, practical applications illustrate theoretical concepts, and there are many homework problems. The book is aimed at graduate students of electrical engineering and computer science, and practitioners in industry. Presents a unified treatment of the mathematical methods that underpin the algorithms used in remote image formation Illustrates theoretical concepts with reference to practical applications Provides insights into the design parameters of real systems

  20. User Needs and Assessing the Impact of Low Latency NASA Earth Observation Data Availability on Societal Benefit

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.; Carroll, Mark L.; Escobar, Vanessa M.

    2014-01-01

    Since the advent of NASA's Earth Observing System, knowledge of the practical benefits of Earth science data has grown considerably. The community using NASA Earth science observations in applications has grown significantly, with increasing sophistication to serve national interests. Data latency, or how quickly communities receive science observations after acquisition, can have a direct impact on the applications and usability of the information. This study was conducted to determine how users are incorporating NASA data into applications and operational processes to benefit society beyond scientific research, as well as to determine the need for data latency of less than 12 h. The results of the analysis clearly show the significant benefit to society of serving the needs of the agricultural, emergency response, environmental monitoring and weather communities who use rapidly delivered, accurate Earth science data. The study also showed the potential of expanding the communities who use low latency NASA science data products to provide new ways of transforming data into information. These benefits can be achieved with a clear and consistent NASA policy on product latency.

  1. Extracting Semantic Building Models from Aerial Stereo Images and Conversion to Citygml

    NASA Astrophysics Data System (ADS)

    Sengul, A.

    2012-07-01

The collection of geographic data is of primary importance for the creation and maintenance of a GIS. Traditionally, the acquisition of 3D information has been the task of photogrammetry using aerial stereo images. Digital photogrammetric systems employ sophisticated software to extract digital terrain models or to plot 3D objects. The demand for 3D city models leads to new applications and new standards. City Geography Mark-up Language (CityGML), a concept for modelling and exchange of 3D city and landscape models, defines the classes and relations for the most relevant topographic objects in cities and regional models with respect to their geometric, topological, and semantic properties. It is now increasingly accepted, since it fulfils the prerequisites required e.g. for risk analysis, urban planning, and simulations. There is a need to include existing 3D information derived from photogrammetric processes in CityGML databases. To fill this gap, this paper reports on a framework transferring data plotted with Erdas LPS and Stereo Analyst for ArcGIS software to CityGML using Safe Software's Feature Manipulation Engine (FME).
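A minimal sketch of the kind of CityGML document such a conversion targets can be produced with the Python standard library. The namespace URIs follow CityGML 2.0; the single `measuredHeight` element is a simplified illustration, not the full geometry and appearance output an FME workflow would generate:

```python
# Emit a skeletal CityGML-style building record (illustrative only).
import xml.etree.ElementTree as ET

# CityGML 2.0 core and building-module namespaces
CORE = "http://www.opengis.net/citygml/2.0"
BLDG = "http://www.opengis.net/citygml/building/2.0"

ET.register_namespace("core", CORE)
ET.register_namespace("bldg", BLDG)

# CityModel -> cityObjectMember -> Building, as in the CityGML class model
model = ET.Element(f"{{{CORE}}}CityModel")
member = ET.SubElement(model, f"{{{CORE}}}cityObjectMember")
building = ET.SubElement(member, f"{{{BLDG}}}Building")
ET.SubElement(building, f"{{{BLDG}}}measuredHeight").text = "12.5"

xml_text = ET.tostring(model, encoding="unicode")
print(xml_text)
```

A real conversion would additionally carry LOD geometry, addresses, and appearance data for each building.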

  2. University Students' Knowledge Structures and Informal Reasoning on the Use of Genetically Modified Foods: Multidimensional Analyses

    NASA Astrophysics Data System (ADS)

    Wu, Ying-Tien

    2013-10-01

    This study aims to provide insights into the role of learners' knowledge structures about a socio-scientific issue (SSI) in their informal reasoning on the issue. A total of 42 non-science major university students' knowledge structures and informal reasoning were assessed with multidimensional analyses. With both qualitative and quantitative analyses, this study revealed that those students with more extended and better-organized knowledge structures, as well as those who more frequently used higher-order information processing modes, were more oriented towards achieving a higher-level informal reasoning quality. The regression analyses further showed that the "richness" of the students' knowledge structures explained 25 % of the variation in their rebuttal construction, an important indicator of reasoning quality, indicating the significance of the role of students' sophisticated knowledge structure in SSI reasoning. Besides, this study also provides some initial evidence for the significant role of the "core" concept within one's knowledge structure in one's SSI reasoning. The findings in this study suggest that, in SSI-based instruction, science instructors should try to identify students' core concepts within their prior knowledge regarding the SSI, and then they should try to guide students to construct and structure relevant concepts or ideas regarding the SSI based on their core concepts. Thus, students could obtain extended and well-organized knowledge structures, which would then help them achieve better learning transfer in dealing with SSIs.

  3. Towards Web-based representation and processing of health information

    PubMed Central

    Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J

    2009-01-01

Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and of the methods used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, the data sources description, the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It offered a new solution for better health data representation and an initial exploration of Web-based processing of health information. Conclusion The designed HERXML proved to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding the trends of diseases, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445

  4. 3D Visualization Development of SIUE Campus

    NASA Astrophysics Data System (ADS)

    Nellutla, Shravya

Geographic Information Systems (GIS) have progressed from traditional map-making to a modern technology in which information can be created, edited, managed and analyzed. Like any other models, maps are simplified representations of the real world. Hence visualization plays an essential role in the applications of GIS. The use of sophisticated visualization tools and methods, especially three-dimensional (3D) modeling, has been rising considerably due to the advancement of technology. There are currently many off-the-shelf technologies available in the market to build 3D GIS models. One of the objectives of this research was to examine the available ArcGIS software and its extensions for 3D modeling and visualization and use them to depict a real-world scenario. Furthermore, with the advent of the web, a platform for accessing and sharing spatial information on the Internet, it is possible to generate interactive online maps. Integrating Internet capacity with GIS functionality redefines the process of sharing and processing spatial information. Enabling a 3D map online requires off-the-shelf GIS software, 3D model builders, a web server, web applications and client-server technologies. Such environments are either complicated or expensive because of the amount of hardware and software involved. Therefore, the second objective of this research was to investigate and develop a simpler yet cost-effective 3D modeling approach that uses available ArcGIS suite products and free 3D computer graphics software for designing 3D world scenes. Both ArcGIS Explorer and ArcGIS Online are used to demonstrate the way of sharing and distributing 3D geographic information on the Internet. A case study of the development of a 3D campus for Southern Illinois University Edwardsville is demonstrated.

  5. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with a lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to use them effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. It offers a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.
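The basic single-SNP operation that tools like SimHap automate can be illustrated with a textbook allelic chi-square test on a 2x2 case-control allele table. This is a generic statistics sketch under made-up allele counts, not the SimHap implementation:

```python
# Allelic chi-square test for a single SNP in a case-control design.
# Counts are hypothetical; the statistic has 1 degree of freedom.

def allelic_chi_square(case_alleles, control_alleles):
    """case_alleles / control_alleles: (count_A, count_a) allele counts.
    Returns the Pearson chi-square statistic for the 2x2 table."""
    table = [list(case_alleles), list(control_alleles)]
    row = [sum(r) for r in table]                      # row totals
    col = [table[0][j] + table[1][j] for j in range(2)]  # column totals
    total = sum(row)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# e.g. 120 A / 80 a alleles in cases vs 90 A / 110 a in controls
stat = allelic_chi_square((120, 80), (90, 110))
print(round(stat, 3))  # prints 9.023
```

A statistic this large against the 1-df chi-square distribution indicates a significant allele-frequency difference between cases and controls; real tools add p-values, covariate adjustment, and haplotype phasing on top of this core calculation.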

  6. Cardiovascular imaging environment: will the future be cloud-based?

    PubMed

    Kawel-Boehm, Nadine; Bluemke, David A

    2017-07-01

In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed and distributed. Besides basic assessment of volume and function in cardiac magnetic resonance imaging, for example, more sophisticated quantitative analysis is requested, requiring specific software. Many institutions cannot afford the various types of software, or provide the expertise, needed to perform sophisticated analyses. Areas covered: Various cloud services exist for data storage and analysis specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid the purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud-based software by the consumer, or the complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option to cope with the storage of large image datasets and to offer sophisticated cardiovascular image analysis for institutions of all sizes.

  7. Groundwater modeling in integrated water resources management--visions for 2020.

    PubMed

    Refsgaard, Jens Christian; Højberg, Anker Lajer; Møller, Ingelise; Hansen, Martin; Søndergaard, Verner

    2010-01-01

    Groundwater modeling is undergoing a change from traditional stand-alone studies toward being an integrated part of holistic water resources management procedures. This is illustrated by the development in Denmark, where comprehensive national databases for geologic borehole data, groundwater-related geophysical data, geologic models, as well as a national groundwater-surface water model have been established and integrated to support water management. This has enhanced the benefits of using groundwater models. Based on insight gained from this Danish experience, a scientifically realistic scenario for the use of groundwater modeling in 2020 has been developed, in which groundwater models will be a part of sophisticated databases and modeling systems. The databases and numerical models will be seamlessly integrated, and the tasks of monitoring and modeling will be merged. Numerical models for atmospheric, surface water, and groundwater processes will be coupled in one integrated modeling system that can operate at a wide range of spatial scales. Furthermore, the management systems will be constructed with a focus on building credibility of model and data use among all stakeholders and on facilitating a learning process whereby data and models, as well as stakeholders' understanding of the system, are updated to currently available information. The key scientific challenges for achieving this are (1) developing new methodologies for integration of statistical and qualitative uncertainty; (2) mapping geological heterogeneity and developing scaling methodologies; (3) developing coupled model codes; and (4) developing integrated information systems, including quality assurance and uncertainty information that facilitate active stakeholder involvement and learning.

  8. Application of Advanced Signal Processing Techniques to Angle of Arrival Estimation in ATC Navigation and Surveillance Systems

    DTIC Science & Technology

    1982-06-23

    Systems Research and Development Service, Washington, D.C. 20591. The work reported in this document was...consider sophisticated signal processing techniques as an alternative method of improving system performance. Some work in this area has already taken place...demands on the frequency spectrum. As noted in Table 1-1, there has been considerable work on advanced signal processing in the MLS context

  9. A visual analytic framework for data fusion in investigative intelligence

    NASA Astrophysics Data System (ADS)

    Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David

    2014-05-01

    Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend toward increasing sophistication of data fusion algorithms and loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework for developing an alternative data fusion architecture. This idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, in which the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.

  10. Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL

    NASA Technical Reports Server (NTRS)

    Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo

    2011-01-01

    Repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are often so large and poorly structured that they have outgrown our capability to manually process their contents and extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experiences exploring such methods in three areas, historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL, have shown that obtaining useful results requires substantial unanticipated effort, from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates on some of these benefits and the important lessons we learned from preparing and applying text mining to large unstructured system artifacts at JPL, aiming to benefit future text mining applications in similar problem domains and, we hope, in broader areas of application.
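
    As a toy illustration of the preprocessing burden described here: even the simplest term-frequency representation of raw artifact text needs cleaning (lowercasing, tokenizing, stop-word removal) before any mining algorithm can use it. The tokenizer and stop-word list below are deliberately minimal assumptions, not JPL's pipeline.

```python
# Minimal text-preprocessing sketch: lowercase, tokenize, drop stop
# words, and count terms. Real pipelines need far more (stemming,
# domain vocabularies, artifact-format stripping), which is exactly
# the unanticipated effort the authors report.
import re
from collections import Counter

STOP = {"the", "a", "of", "to", "and", "is", "in"}

def term_counts(doc):
    tokens = re.findall(r"[a-z]+", doc.lower())
    return Counter(t for t in tokens if t not in STOP)

counts = term_counts("The anomaly in the telemetry is a risk to the mission.")
```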

  11. Citizens' Jury and Elder Care: Public Participation and Deliberation in Long-Term Care Policy in Thailand.

    PubMed

    Chuengsatiansup, Komatra; Tengrang, Kanisorn; Posayanonda, Tipicha; Sihapark, Siranee

    2018-02-16

    Health care policies for the elderly are complex, multidimensional, and contextually circumscribed. While engagement of health experts, economists, health care administrators, and political leaders is generally viewed as instrumental to the success and sustainability of eldercare programs, the elders themselves are often viewed as passive recipients of care and not included in the policy processes. Experiences and expectations from users' perspectives can be invaluable information for policy formulation and systems design. This paper examines a participatory policy process using a "citizens' jury" to promote public engagement in eldercare policy. The process was initiated by the National Health Commission Office in Thailand to explore how a citizens' jury as a model for civic deliberation can be utilized to provide sophisticated policy recommendations on long-term care policies for the elderly. The objectives of this paper are to (1) examine how public participation in health policy can be actualized through the citizens' jury as an operational model, (2) understand the strengths and weaknesses of the ways the idea was implemented, and (3) provide recommendations for further use of the model. Details of how a citizens' jury was deployed are discussed, with recommendations for further use provided at the end.

  12. Single-Scale Fusion: An Effective Approach to Merging Images.

    PubMed

    Ancuti, Codruta O; Ancuti, Cosmin; De Vleeschouwer, Christophe; Bovik, Alan C

    2017-01-01

    Due to its robustness and effectiveness, multi-scale fusion (MSF) based on the Laplacian pyramid decomposition has emerged as a popular technique that has shown utility in many applications. Guided by several intuitive measures (weight maps), the MSF process is versatile and straightforward to implement. However, the number of pyramid levels increases with the image size, which implies sophisticated data management and memory accesses, as well as additional computations. Here, we introduce a simplified formulation that reduces MSF to a single-level process. Starting from the MSF decomposition, we explain both mathematically and intuitively (visually) a way to simplify the classical MSF approach with minimal loss of information. The resulting single-scale fusion (SSF) solution is a close approximation of the MSF process that eliminates important redundant computations. It also provides insights regarding why MSF is so effective. While our simplified expression is derived in the context of high dynamic range imaging, we show its generality on several well-known fusion-based applications, such as image compositing, extended depth of field, medical imaging, and blending thermal (infrared) images with visible light. Besides visual validation, quantitative evaluations demonstrate that our SSF strategy is able to yield results that are highly competitive with traditional MSF approaches.
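
    The core ingredient of weight-map-guided fusion can be sketched at a single level as a per-pixel weighted blend. This naive sketch shows only the ingredients (input images plus normalized weight maps), not the authors' Laplacian-pyramid derivation of SSF.

```python
# Naive single-level fusion sketch: blend images per pixel using
# normalized weight maps. Inputs are tiny illustrative 2x2 arrays.

def fuse(images, weights):
    """Per-pixel weighted fusion of equally sized grayscale images.

    images, weights: lists of 2-D lists (rows of pixel values).
    """
    rows, cols = len(images[0]), len(images[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            total_w = sum(w[r][c] for w in weights)  # normalization
            out[r][c] = sum(img[r][c] * w[r][c]
                            for img, w in zip(images, weights)) / total_w
    return out

a = [[0.0, 1.0], [0.5, 0.5]]
b = [[1.0, 0.0], [0.5, 0.5]]
wa = [[1.0, 3.0], [1.0, 1.0]]   # e.g. a saliency-style weight map
wb = [[1.0, 1.0], [1.0, 1.0]]
fused = fuse([a, b], [wa, wb])
```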

  13. Ground Processing of Data From the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Sturdevant, Kathryn; Noble, David

    2006-01-01

    A computer program implements the Earth side of the protocol that governs the transfer of data files generated by the Mars Exploration Rovers. It also provides tools for viewing data in these files and integrating data-product files into automated and manual processes. It reconstitutes files from telemetry data packets. Even if only one packet is received, metadata provide enough information to enable this program to identify and use partial data products. This software can generate commands to acknowledge received files and retransmit missed parts of files, or it can feed a manual process to make decisions about retransmission. The software uses an Extensible Markup Language (XML) data dictionary to provide a generic capability for displaying files of basic types, and uses external "plug-in" application programs to provide more sophisticated displays. This program makes data products available with very low latency, and can trigger automated actions when complete or partial products are received. The software is easy to install and use. The only system requirement for installing the software is a Java J2SE 1.4 platform. Several instances of the software can be executed simultaneously on the same machine.
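
    A minimal sketch of the packet-to-file reconstitution idea described above, under assumed packet fields (offset, payload, total length) rather than the actual flight protocol: even partial receipt identifies exactly which byte ranges are missing and could be requested for retransmission.

```python
# Sketch of reconstituting a data product from telemetry packets.
# Each packet carries (offset, payload bytes, total product length),
# so a single packet is enough to place its data and flag the gaps.

def reassemble(packets):
    """Return (data, missing_ranges) from (offset, payload, total_len) packets."""
    total = packets[0][2]
    buf = bytearray(total)
    have = [False] * total
    for offset, payload, _ in packets:
        buf[offset:offset + len(payload)] = payload
        for i in range(offset, offset + len(payload)):
            have[i] = True
    # Collect contiguous missing ranges as half-open (start, end) pairs.
    missing, start = [], None
    for i, ok in enumerate(have + [True]):
        if not ok and start is None:
            start = i
        elif ok and start is not None:
            missing.append((start, i))
            start = None
    return bytes(buf), missing

# Two packets of a 10-byte product arrive; bytes 4-7 are missing.
data, gaps = reassemble([(0, b"MARS", 10), (8, b"OK", 10)])
```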

  14. Speech comprehension training and auditory and cognitive processing in older adults.

    PubMed

    Pichora-Fuller, M Kathleen; Levitt, Harry

    2012-12-01

    To provide a brief history of speech comprehension training systems and an overview of research on auditory and cognitive aging as background to recommendations for future directions for rehabilitation. Two distinct domains were reviewed: one concerning technological and the other concerning psychological aspects of training. Historical trends and advances in these 2 domains were interrelated to highlight converging trends and directions for future practice. Over the last century, technological advances have influenced both the design of hearing aids and training systems. Initially, training focused on children and those with severe loss for whom amplification was insufficient. Now the focus has shifted to older adults with relatively little loss but difficulties listening in noise. Evidence of brain plasticity from auditory and cognitive neuroscience provides new insights into how to facilitate perceptual (re-)learning by older adults. There is a new imperative to complement training to increase bottom-up processing of the signal with more ecologically valid training to boost top-down information processing based on knowledge of language and the world. Advances in digital technologies enable the development of increasingly sophisticated training systems incorporating complex meaningful materials such as music, audiovisual interactive displays, and conversation.

  15. CANFAR + Skytree: Mining Massive Datasets as an Essential Part of the Future of Astronomy

    NASA Astrophysics Data System (ADS)

    Ball, Nicholas M.

    2013-01-01

    The future study of large astronomical datasets, consisting of hundreds of millions to billions of objects, will be dominated by large computing resources, and by analysis tools of the necessary scalability and sophistication to extract useful information. Significant effort will be required to fulfil their potential as a provider of the next generation of science results. To date, computing systems have allowed either sophisticated analysis of small datasets, e.g., most astronomy software, or simple analysis of large datasets, e.g., database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. This allows the full sophistication of the huge fields of data mining and machine learning to be applied to the hundreds of millions of objects that make up current large datasets. CANFAR works by utilizing virtual machines, which appear to the user as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement carries far more flexibility than other cloud systems, because it enables the user to immediately install and run the same code that they already utilize for science on their desktop. We demonstrate the utility of the CANFAR + Skytree system by showing science results obtained, including assigning photometric redshifts with full probability density functions (PDFs) to a catalog of approximately 133 million galaxies from the MegaPipe reductions of the Canada-France-Hawaii Telescope Legacy Wide and Deep surveys. Each PDF is produced nonparametrically from 100 instances of the photometric parameters for each galaxy, generated by perturbing within the errors on the measurements. Hence, we produce, store, and assign redshifts to a catalog of over 13 billion object instances. This catalog is comparable in size to those expected from next-generation surveys, such as the Large Synoptic Survey Telescope. The CANFAR + Skytree system is open for use by any interested member of the astronomical community.
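
    The perturbation scheme described above can be sketched generically: draw instances of each measured quantity within its quoted error, push them through the estimator, and let the spread of outputs form an empirical PDF. The linear "estimator" below is a hypothetical stand-in for a trained machine learning model, not Skytree's.

```python
# Monte Carlo sketch of nonparametric PDF construction: perturb an
# observed magnitude within its measurement error N times, map each
# instance through an estimator, and summarize the resulting spread.
import random

def perturbed_estimates(mag, err, estimator, n=100, seed=42):
    rng = random.Random(seed)  # seeded for reproducibility
    return [estimator(rng.gauss(mag, err)) for _ in range(n)]

# Stand-in estimator: a toy linear magnitude-redshift relation.
estimates = perturbed_estimates(22.0, 0.1, lambda m: 0.1 * (m - 20.0))
mean_z = sum(estimates) / len(estimates)
```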

  16. In Praise of the Sophists.

    ERIC Educational Resources Information Center

    Gibson, Walker

    1993-01-01

    Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

  17. Development of Course Material in a Multi-Author Environment

    ERIC Educational Resources Information Center

    Schlotter, Michael

    2009-01-01

    Software for text processing and presentation design is becoming increasingly sophisticated. Nevertheless, it is difficult to find a good solution for collaborative writing of technical course material, allowing the creation of high quality lecture notes and presentation slides from a single source. This article presents a new editing framework…

  18. Using ROI to Demonstrate Performance Value in the Public Sector

    ERIC Educational Resources Information Center

    Phillips, Jack J.; Phillips, Patti

    2009-01-01

    The demand for accountability through measurement continues to increase. Although much progress has been made, it remains an issue that challenges even the most sophisticated and progressive performance improvement function. This article provides an overview of best practices in the return-on-investment process and describes how applying these…

  19. Emerging Uses of Computer Technology in Qualitative Research.

    ERIC Educational Resources Information Center

    Parker, D. Randall

    The application of computer technology in qualitative research and evaluation ranges from simple word processing to sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…

  20. Does Adaptive Scaffolding Facilitate Students' Ability to Regulate their Learning with Hypermedia?

    ERIC Educational Resources Information Center

    Azevedo, Roger; Cromley, Jennifer G.; Seibert, Diane

    2004-01-01

    Is adaptive scaffolding effective in facilitating students' ability to regulate their learning of complex science topics with hypermedia? We examined the role of different scaffolding instructional interventions in facilitating students' shift to more sophisticated mental models as indicated by both performance and process data. Undergraduate…

  1. Individual Differences in Boys' and Girls' Timing and Tempo of Puberty: Modeling Development with Nonlinear Growth Models

    ERIC Educational Resources Information Center

    Marceau, Kristine; Ram, Nilam; Houts, Renate M.; Grimm, Kevin J.; Susman, Elizabeth J.

    2011-01-01

    Pubertal development is a nonlinear process progressing from prepubescent beginnings through biological, physical, and psychological changes to full sexual maturity. To tether theoretical concepts of puberty with sophisticated longitudinal, analytical models capable of articulating pubertal development more accurately, we used nonlinear…

  2. "It's All Human Error!": When a School Science Experiment Fails

    ERIC Educational Resources Information Center

    Viechnicki, Gail Brendel; Kuipers, Joel

    2006-01-01

    This paper traces the sophisticated negotiations to re-inscribe the authority of Nature when a school science experiment fails during the enactment of a highly rated science curriculum unit. Drawing on transcriptions from classroom videotapes, we identify and describe four primary patterns of interaction that characterize this process, arguing…

  3. Chinese College Students' English Reading Comprehension in Silent and Loud Reading-Mode

    ERIC Educational Resources Information Center

    Jiang, Yan

    2015-01-01

    In language teaching, emphasis is usually placed on students' reading comprehension, because reading comprehension remains one of the most important factors in their English language learning. Research shows, however, that reading comprehension is a sophisticated process, and many students have met difficulties in constructing meaning from writing…

  4. Learning Through Technology. ZIFF Papiere 26.

    ERIC Educational Resources Information Center

    Wedemeyer, Charles A.

    Advances in educational technology have brought about changes in the scope of learning facilitated by technology, the roles of teachers and learners, and the sophistication of the processes used in developing instruction which will be communicated by technology. This paper considers these issues from the viewpoint of the learner. The first section…

  5. Irresistible

    ERIC Educational Resources Information Center

    Curio, Michele

    2005-01-01

    One of Michele Curio's favorite art lessons is creating a resist using oil pastels under black tempera paint. The process produces dramatic and creative results with a high success rate even for the most art-challenged students. The artworks have a sophisticated, painterly quality that is achieved with more control and less mess than direct…

  6. Application of large-scale, multi-resolution watershed modeling framework using the Hydrologic and Water Quality System (HAWQS)

    USDA-ARS?s Scientific Manuscript database

    In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...

  7. Hearing: An Overlooked Fact in Relationship to Dyslexia.

    ERIC Educational Resources Information Center

    Johansen, Kjeld

    Sophisticated neurological research shows that early problems with auditory perception can result in long-range negative effects for the linguistic processes in general, and such long-range effects must be assumed to be correlated with induced degenerative changes in the auditory system and perhaps in the brain's linguistic sector. This research…

  8. CONTROL OF MICROBIAL CONTAMINANTS AND DISINFECTION BY-PRODUCTS (DBPS): COST AND PERFORMANCE

    EPA Science Inventory

    The USEPA is in the process of developing a sophisticated regulatory strategy in an attempt to balance the complex trade-offs in risks associated with controlling disinfectants and disinfection by-products (D/DBPs) in drinking water. EPA first attempted to control DBPs in 1974, w...

  9. Effects of soil moisture on the diurnal pattern of pesticide emission: Numerical simulation and sensitivity analysis

    USDA-ARS?s Scientific Manuscript database

    Accurate prediction of pesticide volatilization is important for the protection of human and environmental health. Due to the complexity of the volatilization process, sophisticated predictive models are needed, especially for dry soil conditions. A mathematical model was developed to allow simulati...

  10. Middle School Children's Problem-Solving Behavior: A Cognitive Analysis from a Reading Comprehension Perspective

    ERIC Educational Resources Information Center

    Pape, Stephen J.

    2004-01-01

    Many children read mathematics word problems and directly translate them to arithmetic operations. More sophisticated problem solvers transform word problems into object-based or mental models. Subsequent solutions are often qualitatively different because these models differentially support cognitive processing. Based on a conception of problem…

  11. Program Assessment: Getting to a Practical How-To Model

    ERIC Educational Resources Information Center

    Gardiner, Lorraine R.; Corbitt, Gail; Adams, Steven J.

    2010-01-01

    The Association to Advance Collegiate Schools of Business (AACSB) International's assurance of learning (AoL) standards require that schools develop a sophisticated continuous-improvement process. The authors review various assessment models and develop a practical, 6-step AoL model based on the literature and the authors' AoL-implementation…

  12. Families, Risk, and Competence.

    ERIC Educational Resources Information Center

    Lewis, Michael, Ed.; Feiring, Candice, Ed.

    The problems of studying families arise from the difficulty in studying systems in which there are multiple elements interacting with each other and with the child. This book attests to the growing sophistication of the conceptualization and measurement techniques for understanding family processes. Chapters in the first part of the book,…

  13. Overview of artificial neural networks.

    PubMed

    Zou, Jinming; Han, Yi; So, Sung-Sau

    2008-01-01

    The artificial neural network (ANN), or simply neural network, is a machine learning method evolved from the idea of simulating the human brain. The data explosion in modern drug discovery research requires sophisticated analysis methods to uncover the hidden causal relationships between single or multiple responses and a large set of properties. The ANN is one of many versatile tools to meet the demand in drug discovery modeling. Compared to a traditional regression approach, the ANN is capable of modeling complex nonlinear relationships. The ANN also has excellent fault tolerance and is fast and highly scalable with parallel processing. This chapter introduces the background of ANN development and outlines the basic concepts crucially important for understanding more sophisticated ANNs. Several commonly used learning methods and network setups are discussed briefly at the end of the chapter.
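
    A minimal sketch of the building block behind such networks: a single logistic neuron trained by gradient descent on a toy linearly separable task (logical OR). Real ANNs compose many of these units into layers; all hyperparameters here are illustrative.

```python
# One sigmoid neuron trained by squared-error gradient descent.
import math

def train_neuron(samples, epochs=2000, lr=0.5):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            z = w[0] * x[0] + w[1] * x[1] + b
            y = 1.0 / (1.0 + math.exp(-z))       # sigmoid activation
            g = (y - target) * y * (1.0 - y)      # dE/dz for squared error
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

# Learn logical OR, a linearly separable toy problem.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_neuron(data)
```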

  14. Optical Processing Techniques For Pseudorandom Sequence Prediction

    NASA Astrophysics Data System (ADS)

    Gustafson, Steven C.

    1983-11-01

    Pseudorandom sequences are series of apparently random numbers generated, for example, by linear or nonlinear feedback shift registers. An important application of these sequences is in spread spectrum communication systems, in which, for example, the transmitted carrier phase is digitally modulated rapidly and pseudorandomly and in which the information to be transmitted is incorporated as a slow modulation in the pseudorandom sequence. In this case the transmitted information can be extracted only by a receiver that uses for demodulation the same pseudorandom sequence used by the transmitter, and thus this type of communication system has a very high immunity to third-party interference. However, if a third party can predict in real time the probable future course of the transmitted pseudorandom sequence given past samples of this sequence, then interference immunity can be significantly reduced.. In this application effective pseudorandom sequence prediction techniques should be (1) applicable in real time to rapid (e.g., megahertz) sequence generation rates, (2) applicable to both linear and nonlinear pseudorandom sequence generation processes, and (3) applicable to error-prone past sequence samples of limited number and continuity. Certain optical processing techniques that may meet these requirements are discussed in this paper. In particular, techniques based on incoherent optical processors that perform general linear transforms or (more specifically) matrix-vector multiplications are considered. Computer simulation examples are presented which indicate that significant prediction accuracy can be obtained using these transforms for simple pseudorandom sequences. However, the useful prediction of more complex pseudorandom sequences will probably require the application of more sophisticated optical processing techniques.
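
    For the linear case, the structure those matrix-vector processors exploit is easy to show in software: each new bit of a linear feedback shift register is a GF(2) inner product (an XOR of tapped bits) of the preceding output bits, so prediction from an intact history is exact. The taps and seed below are arbitrary illustrations.

```python
# Fibonacci LFSR generation and exact linear prediction of its output.

def lfsr(seed, taps, n):
    """Generate n output bits; taps are state indices XORed into feedback."""
    state = list(seed)
    out = []
    for _ in range(n):
        out.append(state[-1])          # emit the last stage
        fb = 0
        for t in taps:
            fb ^= state[t]             # GF(2) inner product with tap mask
        state = [fb] + state[:-1]      # shift; feedback enters the front
    return out

def predict_next(history, taps):
    """Predict the next bit from past outputs alone: the shift structure
    gives the recurrence out[k] = XOR over taps t of out[k - 1 - t]."""
    bit = 0
    for t in taps:
        bit ^= history[-1 - t]
    return bit

seq = lfsr([1, 0, 1, 1], [0, 3], 20)
predictions = [predict_next(seq[:k], [0, 3]) for k in range(4, 20)]
```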

  15. Viewing Marine Bacteria, Their Activity and Response to Environmental Drivers from Orbit

    PubMed Central

    Grimes, D. Jay; Ford, Tim E.; Colwell, Rita R.; Baker-Austin, Craig; Martinez-Urtaza, Jaime; Subramaniam, Ajit; Capone, Douglas G.

    2014-01-01

    Satellite-based remote sensing of marine microorganisms has become a useful tool in predicting human health risks associated with these microscopic targets. Early applications were focused on harmful algal blooms, but more recently methods have been developed to interrogate the ocean for bacteria. As satellite-based sensors have become more sophisticated and our ability to interpret information derived from these sensors has advanced, we have progressed from merely making fascinating pictures from space to developing process models with predictive capability. Our understanding of the role of marine microorganisms in primary production and global elemental cycles has been vastly improved as has our ability to use the combination of remote sensing data and models to provide early warning systems for disease outbreaks. This manuscript will discuss current approaches to monitoring cyanobacteria and vibrios, their activity and response to environmental drivers, and will also suggest future directions. PMID:24477922

  16. pH-programmable DNA logic arrays powered by modular DNAzyme libraries.

    PubMed

    Elbaz, Johann; Wang, Fuan; Remacle, Francoise; Willner, Itamar

    2012-12-12

    Nature performs complex information processing, such as the programmed transformation of versatile stem cells into targeted functional cells. Man-made molecular circuits are, however, unable to mimic such sophisticated biomachineries. To reach these goals, it is essential to construct programmable modular components that can be triggered by environmental stimuli to perform different logic circuits. We report on the unprecedented design of artificial pH-programmable DNA logic arrays, constructed from modular libraries of Mg(2+)- and UO(2)(2+)-dependent DNAzyme subunits and their substrates. By the appropriate modular design of the DNA computation units, pH-programmable logic arrays of various complexities are realized, and the arrays can be erased, reused, and/or reprogrammed. Such systems may be implemented in the near future for nanomedical applications by pH-controlled regulation of cellular functions, or may be used to control biotransformations stimulated by bacteria.

  17. Positron emission tomography (PET) advances in neurological applications

    NASA Astrophysics Data System (ADS)

    Sossi, V.

    2003-09-01

    Positron Emission Tomography (PET) is a functional imaging modality used in brain research to map in vivo neurotransmitter and receptor activity and to investigate glucose utilization or blood flow patterns both in healthy and disease states. Such research is made possible by the wealth of radiotracers available for PET, by the fact that metabolic and kinetic parameters of particular processes can be extracted from PET data and by the continuous development of imaging techniques. In recent years great advancements have been made in the areas of PET instrumentation, data quantification and image reconstruction that allow for more detailed and accurate biological information to be extracted from PET data. It is now possible to quantitatively compare data obtained either with different tracers or with the same tracer under different scanning conditions. These sophisticated imaging approaches enable detailed investigation of disease mechanisms and system response to disease and/or therapy.

  18. Self-assembled peptide nanostructures for functional materials

    NASA Astrophysics Data System (ADS)

    Sardan Ekiz, Melis; Cinar, Goksu; Aref Khalily, Mohammad; Guler, Mustafa O.

    2016-10-01

    Nature is an important inspirational source for scientists, and presents complex and elegant examples of adaptive and intelligent systems created by self-assembly. Significant effort has been devoted to understanding these sophisticated systems. The self-assembly process enables us to create supramolecular nanostructures with high order and complexity, and peptide-based self-assembling building blocks can serve as suitable platforms to construct nanostructures showing diverse features and applications. In this review, peptide-based supramolecular assemblies will be discussed in terms of their synthesis, design, characterization and application. Peptide nanostructures are categorized based on their chemical and physical properties and will be examined by rationalizing the influence of peptide design on the resulting morphology and the methods employed to characterize these high order complex systems. Moreover, the application of self-assembled peptide nanomaterials as functional materials in information technologies and environmental sciences will be reviewed by providing examples from recently published high-impact studies.

  19. Business model for sensor-based fall recognition systems.

    PubMed

    Fachinger, Uwe; Schöpke, Birte

    2014-01-01

    AAL systems require, in addition to sophisticated and reliable technology, adequate business models for their launch and sustainable establishment. This paper presents the basic features of alternative business models for a sensor-based fall recognition system which was developed within the context of the "Lower Saxony Research Network Design of Environments for Ageing" (GAL). The models were developed in parallel with the R&D process, with successive adaptation and concretization. An overview of the basic features (i.e. nine partial models) of the business model is given, and the mutually exclusive alternatives for each partial model are presented. The partial models are interconnected, and the combinations of compatible alternatives lead to consistent alternative business models. However, at the current state, only initial concepts of alternative business models can be deduced. The next step will be to gather additional information to work out more detailed models.

  20. CDF trigger interface board 'FRED'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, M.; Dell' Orso, M.; Giannetti, P.

    1985-08-01

We describe FASTBUS boards which interface sixteen different trigger interrupts to the Collider Detector Facility (CDF) data acquisition system. The boards are known to CDF by the acronym 'FRED'. The data acquisition scheme for CDF allows for up to 16 different parts of the detector, called 'Partitions', to run independently. Four partitions are reserved for physics runs and sophisticated calibration and debugging: they use the common Level 1 and Level 2 trigger logic and have access to information from all the components of the CDF detector. These four partitions are called "CDF Partitions". The remaining twelve partitions have no access to the common trigger logic and provide their own Level 1 and Level 2 signals: they are called "Autonomous Partitions". FRED collects and interprets signals from independent parts of the CDF trigger system and delivers Level 1 and Level 2 responses to the Trigger Supervisors (FASTBUS masters which control the data acquisition process in each partition).

  1. Research on the ride comfort of elevator monitoring using smartphone

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Sun, Xiaowei; Xie, Zhao; Su, Wensheng; Xue, Zhigang; Zhao, Xuefeng

    2017-04-01

With the rapid development of high-rise buildings, demand for faster elevators keeps growing, and vibration amplitude increases with running speed. Elevator vibration has become an important factor affecting ride comfort; strong vibration can also disturb the normal operation of the elevator and even cause accidents, so it is necessary to study the vibration characteristics of elevators. In recent years, smartphones have developed rapidly; equipped with a variety of sophisticated sensors, they offer powerful data processing and transmission capacity. In this paper, the authors present an elevator comfort monitoring method based on a smartphone. A monitoring app records acceleration and inclination information using the MEMS sensors embedded in the smartphone. A confirmatory test on an elevator was then designed; experimental results show that the smartphone-based elevator comfort monitoring method is stable and reliable.
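
The comfort assessment described above can be approximated directly from accelerometer samples. A minimal sketch in Python (not the authors' app; the function names, sample values, and the unweighted-RMS simplification are ours — a real ISO 2631-1 assessment applies frequency weighting before banding):

```python
import math

def rms_acceleration(samples):
    """Root-mean-square of acceleration samples (m/s^2)."""
    return math.sqrt(sum(a * a for a in samples) / len(samples))

def comfort_label(rms, thresholds=(0.315, 0.63, 1.0)):
    """Map an RMS value to a coarse comfort category. Thresholds loosely
    follow the ISO 2631-1 comfort bands (illustrative only)."""
    labels = ("comfortable", "a little uncomfortable",
              "fairly uncomfortable", "uncomfortable")
    for limit, label in zip(thresholds, labels):
        if rms < limit:
            return label
    return labels[-1]

accel = [0.10, -0.12, 0.08, -0.09, 0.11, -0.10]  # vertical samples, gravity removed
rms = rms_acceleration(accel)
print(round(rms, 3), comfort_label(rms))   # prints 0.101 comfortable
```

The same computation runs unchanged on phone-side Python runtimes, since it needs only the standard library.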

  2. Complement and innate immune evasion strategies of the human pathogenic fungus Candida albicans.

    PubMed

    Luo, Shanshan; Skerka, Christine; Kurzai, Oliver; Zipfel, Peter F

    2013-12-15

    Candida albicans is a medically important fungus that can cause a wide range of diseases ranging from superficial infections to disseminated disease, which manifests primarily in immuno-compromised individuals. Despite the currently applied anti-fungal therapies, both mortality and morbidity caused by this human pathogenic fungus are still unacceptably high. Therefore new prophylactic and therapeutic strategies are urgently needed to prevent fungal infection. In order to define new targets for combating fungal disease, there is a need to understand the immune evasion strategies of C. albicans in detail. In this review, we summarize different sophisticated immune evasion strategies that are utilized by C. albicans. The description of the molecular mechanisms used for immune evasion does on one hand help to understand the infection process, and on the other hand provides valuable information to define new strategies and diagnostic approaches to fight and interfere with Candida infections. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Viewing marine bacteria, their activity and response to environmental drivers from orbit: satellite remote sensing of bacteria.

    PubMed

    Grimes, D Jay; Ford, Tim E; Colwell, Rita R; Baker-Austin, Craig; Martinez-Urtaza, Jaime; Subramaniam, Ajit; Capone, Douglas G

    2014-04-01

    Satellite-based remote sensing of marine microorganisms has become a useful tool in predicting human health risks associated with these microscopic targets. Early applications were focused on harmful algal blooms, but more recently methods have been developed to interrogate the ocean for bacteria. As satellite-based sensors have become more sophisticated and our ability to interpret information derived from these sensors has advanced, we have progressed from merely making fascinating pictures from space to developing process models with predictive capability. Our understanding of the role of marine microorganisms in primary production and global elemental cycles has been vastly improved as has our ability to use the combination of remote sensing data and models to provide early warning systems for disease outbreaks. This manuscript will discuss current approaches to monitoring cyanobacteria and vibrios, their activity and response to environmental drivers, and will also suggest future directions.

  4. Performance Enhancement Strategies for Multi-Block Overset Grid CFD Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement strategies on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the roles of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Details of a sophisticated graph partitioning technique for grid grouping are also provided. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.
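
The grid-grouping idea — packing many overset grids onto fewer processors while keeping loads balanced — can be illustrated with a simple greedy heuristic. This is only a sketch (the paper's graph partitioning technique also accounts for inter-grid communication, which this ignores), and the grid sizes are invented:

```python
import heapq

def group_grids(grid_sizes, num_procs):
    """Greedy load balancing: assign each grid (largest first) to the
    currently least-loaded processor. Returns per-processor grid lists."""
    heap = [(0, p) for p in range(num_procs)]   # (load, processor id)
    heapq.heapify(heap)
    groups = [[] for _ in range(num_procs)]
    for grid, size in sorted(enumerate(grid_sizes), key=lambda x: -x[1]):
        load, p = heapq.heappop(heap)           # least-loaded processor
        groups[p].append(grid)
        heapq.heappush(heap, (load + size, p))
    return groups

sizes = [90, 70, 40, 40, 30, 20]   # grid point counts (made up)
print(group_grids(sizes, 2))       # → [[0, 3, 5], [1, 2, 4]]
```

The resulting loads (150 vs. 140 points) are close to balanced; in the paper's setting, balance quality directly bounds parallel efficiency.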

  5. The AAO fiber instrument data simulator

    NASA Astrophysics Data System (ADS)

    Goodwin, Michael; Farrell, Tony; Smedley, Scott; Heald, Ron; Heijmans, Jeroen; De Silva, Gayandhi; Carollo, Daniela

    2012-09-01

The fiber instrument data simulator is an in-house software tool that simulates detector images of fiber-fed spectrographs developed by the Australian Astronomical Observatory (AAO). In addition to helping validate the instrument designs, the resulting simulated images are used to develop the required data reduction software. Example applications that have benefited from the tool's usage are the HERMES and SAMI instrumental projects for the Anglo-Australian Telescope (AAT). Given the sophistication of these projects, an end-to-end data simulator that accurately models the predicted detector images is required. The data simulator encompasses all aspects of the transmission and optical aberrations of the light path: from the science object, through the atmosphere, telescope, fibers, spectrograph and finally the camera detectors. The simulator runs under a Linux environment that uses pre-calculated information derived from ZEMAX models and processed data from MATLAB. In this paper, we discuss aspects of the model, the software, example simulations and verification.

  6. Evolving political science. Biological adaptation, rational action, and symbolism.

    PubMed

    Tingley, Dustin

    2006-01-01

    Political science, as a discipline, has been reluctant to adopt theories and methodologies developed in fields studying human behavior from an evolutionary standpoint. I ask whether evolutionary concepts are reconcilable with standard political-science theories and whether those concepts help solve puzzles to which these theories classically are applied. I find that evolutionary concepts readily and simultaneously accommodate theories of rational choice, symbolism, interpretation, and acculturation. Moreover, phenomena perennially hard to explain in standard political science become clearer when human interactions are understood in light of natural selection and evolutionary psychology. These phenomena include the political and economic effects of emotion, status, personal attractiveness, and variations in information-processing and decision-making under uncertainty; exemplary is the use of "focal points" in multiple-equilibrium games. I conclude with an overview of recent research by, and ongoing debates among, scholars analyzing politics in evolutionarily sophisticated terms.

  7. Child maltreatment: the collaboration of child welfare, mental health, and judicial system.

    PubMed

    Butler, S; Atkinson, L; Magnatta, M; Hood, E

    1995-03-01

The alliance of child welfare, mental health, and legal systems has received little empirical attention, despite the magnitude of its impact on children and families. We examined the congruence of child protection agencies' legal positions, court clinic recommendations, and judicial dispositions in a sample of 59 contested child maltreatment cases. Placement recommendations/decisions among all three systems were highly correlated, although the relationship was not so strong as to undermine the independence of any one system. Where there was disagreement between successive evaluations, it was in the direction of enhancing family integrity and parental access rights. We advanced three hypotheses to account for our findings: (a) changes in successive recommendations reflect the increasing sophistication of the assessment process; (b) changes reflect increasing distance from the family's ecology and are therefore increasingly ill informed; and (c) the changes are purely probabilistic, reflecting a drift toward the societal status quo.

  8. Experiences with a generator tool for building clinical application modules.

    PubMed

    Kuhn, K A; Lenz, R; Elstner, T; Siegele, H; Moll, R

    2003-01-01

To elaborate main system characteristics and relevant deployment experiences for the health information system (HIS) Orbis/OpenMed, which is in widespread use in Germany, Austria, and Switzerland. In a deployment phase of 3 years in a 1,200-bed university hospital, during which the system underwent significant improvements, the system's functionality and its software design have been analyzed in detail. We focus on an integrated CASE tool for generating embedded clinical applications and for incremental system evolution. We present a participatory and iterative software engineering process developed for efficient utilization of such a tool. The system's functionality is comparable to other commercial products' functionality; its components are embedded in a vendor-specific application framework, and standard interfaces are being used for connecting subsystems. The integrated generator tool is a remarkable feature; it became a key factor of our project. Tool-generated applications are workflow enabled and embedded into the overall database schema. Rapid prototyping and iterative refinement are supported, so application modules can be adapted to the users' work practice. We consider tools supporting an iterative and participatory software engineering process highly relevant for health information system architects. The potential of a system to continuously evolve and to be effectively adapted to changing needs may be more important than sophisticated but hard-coded HIS functionality. More work will focus on HIS software design and on software engineering. Methods and tools are needed for quick and robust adaptation of systems to health care processes and changing requirements.

  9. Demonstrating the Value of Fine-resolution Optical Data for Minimising Aliasing Impacts on Biogeochemical Models of Surface Waters

    NASA Astrophysics Data System (ADS)

    Chappell, N. A.; Jones, T.; Young, P.; Krishnaswamy, J.

    2015-12-01

There is increasing awareness that under-sampling may have resulted in the omission of important physicochemical information present in water quality signatures of surface waters - thereby affecting interpretation of biogeochemical processes. For dissolved organic carbon (DOC) and nitrogen this under-sampling can now be avoided using UV-visible spectroscopy measured in-situ and continuously at a fine resolution, e.g. 15 minutes ("real time"). Few methods are available to extract biogeochemical process information directly from such high-frequency data. Jones, Chappell & Tych (2014 Environ Sci Technol: 13289-97) developed one such method using optically-derived DOC data based upon a sophisticated time-series modelling tool. Within this presentation we extend the methodology to quantify the minimum sampling interval required to avoid distortion of model structures and parameters that describe fundamental biogeochemical processes. This shifting of parameters, which results from under-sampling, is called "aliasing". We demonstrate that storm dynamics at a variety of sites dominate over diurnal and seasonal changes and that these must be characterised by sampling that may be sub-hourly to avoid aliasing. This is considerably shorter than that used by other water quality studies examining aliasing (e.g. Kirchner 2005 Phys Rev: 069902). The modelling approach presented is being developed into a generic tool to calculate the minimum sampling for water quality monitoring in systems driven primarily by hydrology. This is illustrated with fine-resolution, optical data from watersheds in temperate Europe through to the humid tropics.
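
The aliasing effect the abstract targets is easy to demonstrate numerically: sampling a storm-scale oscillation at an interval that is a multiple of its period makes the oscillation vanish from the record. A toy illustration (unrelated to the authors' time-series modelling tool; the 2 h "storm" signal and the zero-crossing diagnostic are invented for this sketch):

```python
import math

def sample(period_h, interval_h, duration_h, phase=0.1):
    """Sample sin(2*pi*t/period + phase) every interval_h hours."""
    n = int(duration_h / interval_h)
    return [math.sin(2 * math.pi * (i * interval_h) / period_h + phase)
            for i in range(n)]

def zero_crossings(xs):
    """Count sign changes - a crude proxy for apparent oscillation rate."""
    return sum(1 for a, b in zip(xs, xs[1:]) if a * b < 0)

# A 2 h storm-scale oscillation observed for 48 h:
fine = sample(2.0, 0.25, 48)    # 15 min interval resolves the oscillation
coarse = sample(2.0, 6.0, 48)   # 6 h interval hits the same phase every
                                # time, so the signal aliases to a constant
print(zero_crossings(fine), zero_crossings(coarse))
```

The coarse series shows zero crossings at all: an analyst would conclude, wrongly, that nothing varies between days, which is exactly the distortion of model structure the study quantifies.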

  10. The economics of health information technology in medication management: a systematic review of economic evaluations.

    PubMed

    O'Reilly, Daria; Tarride, Jean-Eric; Goeree, Ron; Lokker, Cynthia; McKibbon, K Ann

    2012-01-01

To conduct a systematic review and synthesis of the evidence surrounding the cost-effectiveness of health information technology (HIT) in the medication process. Peer-reviewed electronic databases and gray literature were searched to identify studies on HIT used to assist in the medication management process. Articles including an economic component were reviewed for further screening. For this review, full cost-effectiveness analyses, cost-utility analyses and cost-benefit analyses, as well as cost analyses, were eligible for inclusion and synthesis. The 31 studies included were heterogeneous with respect to the HIT evaluated, setting, and economic methods used. Thus the data could not be synthesized, and a narrative review was conducted. Most studies evaluated computer decision support systems in hospital settings in the USA, and only five of the studies performed full economic evaluations. Most studies merely provided cost data; however, useful economic data involves far more input. A full economic evaluation includes a full enumeration of the costs, synthesized with the outcomes of the intervention. The quality of the economic literature in this area is poor. A few studies found that HIT may offer cost advantages despite their increased acquisition costs. However, given the uncertainty that surrounds the costs and outcomes data, and limited study designs, it is difficult to reach any definitive conclusion as to whether the additional costs and benefits represent value for money. Sophisticated concurrent prospective economic evaluations need to be conducted to address whether HIT interventions in the medication management process are cost-effective.

  11. Comparing clinical automated, medical record, and hybrid data sources for diabetes quality measures.

    PubMed

    Kerr, Eve A; Smith, Dylan M; Hogan, Mary M; Krein, Sarah L; Pogach, Leonard; Hofer, Timothy P; Hayward, Rodney A

    2002-10-01

    Little is known about the relative reliability of medical record and clinical automated data, sources commonly used to assess diabetes quality of care. The agreement between diabetes quality measures constructed from clinical automated versus medical record data sources was compared, and the performance of hybrid measures derived from a combination of the two data sources was examined. Medical records were abstracted for 1,032 patients with diabetes who received care from 21 facilities in 4 Veterans Integrated Service Networks. Automated data were obtained from a central Veterans Health Administration diabetes registry containing information on laboratory tests and medication use. Success rates were higher for process measures derived from medical record data than from automated data, but no substantial differences among data sources were found for the intermediate outcome measures. Agreement for measures derived from the medical record compared with automated data was moderate for process measures but high for intermediate outcome measures. Hybrid measures yielded success rates similar to those of medical record-based measures but would have required about 50% fewer chart reviews. Agreement between medical record and automated data was generally high. Yet even in an integrated health care system with sophisticated information technology, automated data tended to underestimate the success rate in technical process measures for diabetes care and yielded different quartile performance rankings for facilities. Applying hybrid methodology yielded results consistent with the medical record but required less data to come from medical record reviews.
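
The agreement levels this study reports can be quantified with a chance-corrected statistic such as Cohen's kappa. A minimal sketch (the abstract does not name its agreement statistic, and the patient-level data below are invented):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two ratings of the same cases,
    corrected for the agreement expected by chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum(pa[c] * pb[c] for c in set(rater_a) | set(rater_b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Did each patient pass a process measure, per chart review vs automated data?
chart     = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
automated = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
print(round(cohens_kappa(chart, automated), 3))   # prints 0.6
```

In this toy example the automated source misses two chart-documented successes, mirroring the study's finding that automated data tend to underestimate success rates on technical process measures.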

  12. Subconscious detection of threat as reflected by an enhanced response bias.

    PubMed

    Windmann, S; Krüger, T

    1998-12-01

    Neurobiological and cognitive models of unconscious information processing suggest that subconscious threat detection can lead to cognitive misinterpretations and false alarms, while conscious processing is assumed to be perceptually and conceptually accurate and unambiguous. Furthermore, clinical theories suggest that pathological anxiety results from a crude preattentive warning system predominating over more sophisticated and controlled modes of processing. We investigated the hypothesis that subconscious detection of threat in a cognitive task is reflected by enhanced "false signal" detection rather than by selectively enhanced discrimination of threat items in 30 patients with panic disorder and 30 healthy controls. We presented a tachistoscopic word-nonword discrimination task and a subsequent recognition task and analyzed the data by means of process-dissociation procedures. In line with our expectations, subjects of both groups showed more false signal detection to threat than to neutral stimuli as indicated by an enhanced response bias, whereas indices of discriminative sensitivity did not show this effect. In addition, patients with panic disorder showed a generally enhanced response bias in comparison to healthy controls. They also seemed to have processed the stimuli less elaborately and less differentially. Results are consistent with the assumption that subconscious threat detection can lead to misrepresentations of stimulus significance and that pathological anxiety is characterized by a hyperactive preattentive alarm system that is insufficiently controlled by higher cognitive processes. Copyright 1998 Academic Press.
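
The distinction the study draws — response bias versus discriminative sensitivity — comes from signal detection theory. A sketch of the standard computation (the hit and false-alarm rates below are hypothetical, not the study's data):

```python
from statistics import NormalDist

def dprime_and_bias(hit_rate, fa_rate):
    """Signal-detection sensitivity (d') and response bias (criterion c).
    A more liberal bias (more 'signal' responses, hence more false alarms)
    gives a more negative c; d' is unchanged by a pure bias shift."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    c = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, c

# Hypothetical rates: same sensitivity, but threat words draw more false alarms.
neutral = dprime_and_bias(0.70, 0.20)
threat = dprime_and_bias(0.80, 0.31)
print([round(v, 2) for v in neutral], [round(v, 2) for v in threat])
```

Here d' stays near 1.37 for both conditions while c flips from positive to negative for threat — the signature pattern the study reports: enhanced "false signal" detection of threat without a change in discriminative sensitivity.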

  13. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact.
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
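
A spatial density map of the kind the service produces is typically a kernel density surface over the geocoded cases. A minimal sketch (the paper does not specify its estimator; the coordinates, bandwidth, and grid below are invented):

```python
import math

def density_grid(points, xs, ys, bandwidth=0.5):
    """Gaussian kernel density surface over geocoded case locations.
    Returns grid[row][col] = sum of kernels, proportional to case density."""
    inv2h2 = 1.0 / (2 * bandwidth ** 2)
    return [[sum(math.exp(-((x - px) ** 2 + (y - py) ** 2) * inv2h2)
                 for px, py in points)
             for x in xs] for y in ys]

# Hypothetical geocoded TB cases (projected coordinates, km):
cases = [(1.0, 1.0), (1.2, 0.9), (1.1, 1.1), (3.0, 3.0)]
xs = [0.5 + 0.5 * i for i in range(6)]   # grid: 0.5 .. 3.0 km
ys = xs
grid = density_grid(cases, xs, ys)
peak = max((v, (x, y)) for row, y in zip(grid, ys) for v, x in zip(row, xs))
print(round(peak[0], 2), peak[1])   # prints 2.87 (1.0, 1.0)
```

The density peak falls on the three clustered cases rather than the isolated one, which is exactly how such maps single out the most affected areas; a geoprocessing web service would return this grid (e.g. as a raster) to the map viewer.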

  14. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact.
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392

  15. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed Central

    Gulotta, M

    1995-01-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given. PMID:8580361

  16. What do we know about Indonesian tropical lakes? Insights from high frequency measurement

    NASA Astrophysics Data System (ADS)

    Budi Santoso, Arianto; Triwisesa, Endra; Fakhrudin, Muh.; Harsono, Eko; Agita Rustini, Hadiid

    2018-02-01

When measuring ecological variables in lakes, sampling frequency is critical to capturing an environmental pattern. The discrete sampling of traditional monitoring programs is likely to leave vital knowledge gaps in the understanding of processes, particularly those with fine temporal-scale characteristics. The development of high frequency measurement offers a sophisticated range of information, recording events in lakes at a finer time scale. We present physical indices of the deep tropical Lake Maninjau obtained from an OnLine Monitoring System (OLM). It is revealed that Lake Maninjau mostly has a diurnal thermal stratification pattern. The calculated lake stability (Schmidt stability), however, follows a seasonal pattern; low in December-January and around August, and high in May and September. Using a 3D numerical model simulation (ELCOM), we infer how wind and solar radiation intensity control the lake's temperature profiles. In this review, we highlight the need to establish high frequency measurement in Indonesian tropical lakes to better understand their unique processes and to support the authorities' decision making in maximizing the provision of ecosystem services supplied by lakes and reservoirs.
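
Schmidt stability, the index computed in the study, is the mechanical energy needed to mix the water column to isothermy. A heavily simplified sketch for a column of constant cross-sectional area (a real calculation weights each layer by the lake's hypsographic area-depth curve; the temperature profile below is invented):

```python
G = 9.81  # gravitational acceleration, m s^-2

def water_density(t):
    """Freshwater density (kg m^-3) from temperature (deg C),
    Thiesen-Scheel-Diesselhorst approximation."""
    return 1000 * (1 - (t + 288.9414) / (508929.2 * (t + 68.12963))
                   * (t - 3.9863) ** 2)

def schmidt_stability(depths, temps, dz=1.0):
    """Simplified Schmidt stability (J m^-2) for a constant-area column:
    ~0 for a fully mixed column, larger for stronger stratification."""
    rho = [water_density(t) for t in temps]
    z_v = sum(depths) / len(depths)   # centre of volume (constant area)
    return G * dz * sum((z - z_v) * r for z, r in zip(depths, rho))

depths = list(range(40))                                  # 1 m layers, 0-39 m
stratified = [29.0 if z < 10 else 26.0 for z in depths]   # warm epilimnion
print(round(schmidt_stability(depths, stratified), 1))
```

With warm, light water floating on cooler deep water the index is on the order of 10^3 J m^-2, and it collapses to ~0 for an isothermal profile — the diurnal build-up and decay of exactly this quantity is what the OLM's high-frequency record resolves.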

  17. Graphene oxide windows for in situ environmental cell photoelectron spectroscopy.

    PubMed

    Kolmakov, Andrei; Dikin, Dmitriy A; Cote, Laura J; Huang, Jiaxing; Abyaneh, Majid Kazemian; Amati, Matteo; Gregoratti, Luca; Günther, Sebastian; Kiskinova, Maya

    2011-08-28

    The performance of new materials and devices often depends on processes taking place at the interface between an active solid element and the environment (such as air, water or other fluids). Understanding and controlling such interfacial processes require surface-specific spectroscopic information acquired under real-world operating conditions, which can be challenging because standard approaches such as X-ray photoelectron spectroscopy generally require high-vacuum conditions. The state-of-the-art approach to this problem relies on unique and expensive apparatus including electron analysers coupled with sophisticated differentially pumped lenses. Here, we develop a simple environmental cell with graphene oxide windows that are transparent to low-energy electrons (down to 400 eV), and demonstrate the feasibility of X-ray photoelectron spectroscopy measurements on model samples such as gold nanoparticles and aqueous salt solution placed on the back side of a window. These proof-of-principle results show the potential of using graphene oxide, graphene and other emerging ultrathin membrane windows for the fabrication of low-cost, single-use environmental cells compatible with commercial X-ray and Auger microprobes as well as scanning or transmission electron microscopes.

  18. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed

    Gulotta, M

    1995-11-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given.

  19. Classifier dependent feature preprocessing methods

    NASA Astrophysics Data System (ADS)

    Rodriguez, Benjamin M., II; Peterson, Gilbert L.

    2008-04-01

In mobile applications, computational complexity is an issue that keeps sophisticated algorithms from being implemented on these devices. This paper provides an initial solution to applying pattern recognition systems on mobile devices by combining existing preprocessing algorithms for recognition. In pattern recognition systems, it is essential to properly apply feature preprocessing tools prior to training classification models in an attempt to reduce computational complexity and improve the overall classification accuracy. The feature preprocessing tools extended for the mobile environment are feature ranking, feature extraction, data preparation and outlier removal. Most desktop systems today are capable of processing a majority of the available classification algorithms without concern for processing time, while the same is not true on mobile platforms. As an application of pattern recognition for mobile devices, the recognition system targets the problem of steganalysis, determining if an image contains hidden information. The measure of performance shows that feature preprocessing increases the overall steganalysis classification accuracy by an average of 22%. The methods in this paper are tested on a workstation and a Nokia 6620 (Symbian operating system) camera phone with similar results.
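
Two of the preprocessing steps named above — outlier removal and feature ranking — can be sketched compactly. This is an illustration only (a crude Fisher-style separation score stands in for whatever ranking the authors used, and the feature values are invented):

```python
from statistics import mean, stdev

def remove_outliers(rows, z_max=3.0):
    """Drop samples with any feature more than z_max std devs from its mean."""
    cols = list(zip(*rows))
    mus = [mean(c) for c in cols]
    sds = [stdev(c) or 1.0 for c in cols]   # guard constant columns
    return [r for r in rows
            if all(abs(v - m) / s <= z_max for v, m, s in zip(r, mus, sds))]

def rank_features(rows, labels):
    """Rank features by |difference of class means| / overall std
    (a crude Fisher-style score); higher = more discriminative."""
    scores = []
    for j, col in enumerate(zip(*rows)):
        a = [v for v, y in zip(col, labels) if y == 0]
        b = [v for v, y in zip(col, labels) if y == 1]
        s = stdev(col) or 1.0
        scores.append((abs(mean(a) - mean(b)) / s, j))
    return [j for _, j in sorted(scores, reverse=True)]

rows = [[0.1, 3.0], [0.2, 3.2], [0.9, 3.1], [1.0, 2.9]]   # clean vs stego features
labels = [0, 0, 1, 1]
print(rank_features(rows, labels))   # prints [0, 1]: feature 0 separates classes
```

Keeping only the top-ranked features is what makes classifier training feasible within a phone's compute budget, at the cost of the modest tuning effort the paper describes.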

  20. Different coding strategies for the perception of stable and changeable facial attributes.

    PubMed

    Taubert, Jessica; Alais, David; Burr, David

    2016-09-01

    Perceptual systems face competing requirements: improving signal-to-noise ratios of noisy images, by integration; and maximising sensitivity to change, by differentiation. Both processes occur in human vision, under different circumstances: they have been termed priming, or serial dependencies, leading to positive sequential effects; and adaptation or habituation, which leads to negative sequential effects. We reasoned that for stable attributes, such as the identity and gender of faces, the system should integrate: while for changeable attributes like facial expression, it should also engage contrast mechanisms to maximise sensitivity to change. Subjects viewed a sequence of images varying simultaneously in gender and expression, and scored each as male or female, and happy or sad. We found strong and consistent positive serial dependencies for gender, and negative dependency for expression, showing that both processes can operate at the same time, on the same stimuli, depending on the attribute being judged. The results point to highly sophisticated mechanisms for optimizing use of past information, either by integration or differentiation, depending on the permanence of that attribute.
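
The opposing sequential effects described — assimilation toward past stimuli for stable attributes, repulsion for changeable ones — can be quantified with a simple signed-error statistic. A toy simulation (the 0.3 pull strength and the statistic itself are our illustration, not the study's analysis):

```python
import random

random.seed(1)
stims = [random.uniform(-1, 1) for _ in range(500)]   # stimulus value per trial

def serial_pull(stims, responses):
    """Mean response error signed by the direction of the previous stimulus:
    > 0 means assimilation (positive serial dependence, integration),
    < 0 means repulsion (negative dependence, adaptation)."""
    terms = []
    for prev, cur, resp in zip(stims, stims[1:], responses[1:]):
        direction = 1 if prev > cur else -1
        terms.append(direction * (resp - cur))
    return sum(terms) / len(terms)

# Integration: responses drawn toward the previous stimulus (stable attribute).
attract = [stims[0]] + [cur + 0.3 * (prev - cur)
                        for prev, cur in zip(stims, stims[1:])]
# Differentiation: responses pushed away from the previous stimulus.
repel = [stims[0]] + [cur - 0.3 * (prev - cur)
                      for prev, cur in zip(stims, stims[1:])]
print(serial_pull(stims, attract) > 0, serial_pull(stims, repel) < 0)   # True True
```

Applied to the same stimulus sequence, the statistic comes out positive for the integrating observer and negative for the adapting one — mirroring the study's finding that gender judgments show positive dependencies while expression judgments show negative ones.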

  1. Platonic Dialogue, Maieutic Method and Critical Thinking

    ERIC Educational Resources Information Center

    Leigh, Fiona

    2007-01-01

    In this paper I offer a reading of one of Plato's later works, the "Sophist", that reveals it to be informed by principles comparable on the face of it with those that have emerged recently in the field of critical thinking. As a development of the famous Socratic method of his teacher, I argue, Plato deployed his own pedagogical method, a…

  2. Making Effective Video Tutorials: An Investigation of Online Written and Video Help Tutorials in Mathematics for Preservice Elementary School Teachers

    ERIC Educational Resources Information Center

    Gawlik, Christina L.

    2009-01-01

    Online assessments afford many advantages for teachers and students. Okolo (2006) stated, "As the power, sophistication, and availability of technology have increased in the classroom, online assessments have become a viable tool for providing the type of frequent and dynamic assessment information that educators need to guide instructional…

  3. Young Learners: An Examination of the Psychometric Properties of the Early Literacy Knowledge and Skills Instrument

    ERIC Educational Resources Information Center

    Chan, Man Ching Esther

    2015-01-01

    The Early Literacy Knowledge and Skills (ELKS) instrument was informed by the work of Ferreiro and Teberosky based on the notion that young children could be differentiated according to levels of sophistication in their understanding of the rules of written language. As an initial step to evaluate the instrument for teaching purposes, the present…

  4. Analysis of the Effects of Individual Differences on Cognitive Performance for the Development of Military Socio-Cultural Performance Moderators

    ERIC Educational Resources Information Center

    Bagley, Katherine G.

    2012-01-01

    Technological devices are ubiquitous in nearly every facet of society. There are substantial investments made in organizations on a daily basis to improve information technology. From a military perspective, the ultimate goal of these highly sophisticated devices is to assist soldiers in achieving mission success across dynamic and often chaotic…

  5. Fruit Bats, Cats, and Naked Mole Rats: Lifelong Learning at the Zoo. ERIC/CSMEE Digest.

    ERIC Educational Resources Information Center

    Thomson, Barbara S.; Diem, Jason J.

    An informal study found that zoo visitors want to know not just the name, weight, and age of animals in a collection, but also about diet, reproduction, life span, and behavioral characteristics. What kinds of learning opportunities, beyond enhanced signage, can be offered to the sophisticated new breed of visitors in zoos, aquariums, and nature…

  6. Patterns of Sophistication and Naivety: Some Features of Anthropological Approaches to the Study of Education. Occasional Paper No. 22.

    ERIC Educational Resources Information Center

    Erickson, Frederick

    The limits and boundaries of anthropology are briefly discussed, along with a general description of lay attitudes towards the field. A research case is given to illustrate the way in which anthropological study methods can contribute to educational research. Noted among these contributions is an informed distrust that anthropologists exhibit…

  7. Military Training. Its Effectiveness for Technical Specialties Is Unknown. Report to the Secretary of Defense.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Program Evaluation and Methodology Div.

    A study examined the information collected by the Department of Defense on both the quality of its new recruits and the effectiveness of its training in preparing recruits to operate in a technologically sophisticated environment. It found that data were collected at a recruit's entrance to military life, during and upon completion of formal…

  8. Besides Google: Guiding Gifted Elementary Students onto the Entrance Ramp of the Information Superhighway

    ERIC Educational Resources Information Center

    Schneider, Joan

    2009-01-01

    For gifted students, the power of the Internet is its vastness. Students can access extensive resources that far exceed the collections in their classrooms or school library. Especially with the rapid growth of the Internet during the last decade, the gateway to a rich array of sophisticated resources is literally a click away. Curriculum content…

  9. A Role for Marketing in College Admissions. Papers Presented at the Colloquium on College Admissions, May 16-18, 1976.

    ERIC Educational Resources Information Center

    College Entrance Examination Board, New York, NY.

    This collection stresses the need for informed and more sophisticated marketing techniques for college admissions officers to help them cope with the decreasing number of prospective college students. The importance of the college admissions office is increasing as admissions becomes a more crucial element to the colleges' financial well-being.…

  10. The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.

    PubMed

    Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L

    2017-06-01

    To test for significant differences in information technology sophistication (ITS) in US nursing homes (NH) based on location. We administered a primary survey from January 2014 to July 2015 to NH in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) among 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location. Mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimensions and domains. Least square means and Tukey's method were used for multiple comparisons. The survey yielded 815/1,799 responses (45% response rate). In every health care domain (resident care, clinical support, and administrative activities), statistically significant differences in facility ITS occurred between more populated (metropolitan or micropolitan) and less populated (small town or rural) areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, results are encouraging, as ITS in other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.

  11. Nursing leadership in academic nursing: The wisdom of development and the development of wisdom.

    PubMed

    Pesut, Daniel J; Thompson, Sarah A

    The purpose of this article is to discuss insights derived from adult cognitive developmental theories and relate the insights to vertical leadership development in academic nursing contexts. Equipped with developmental understanding, academic leaders are in a better position to support the vertical leadership development of one's self, faculty, peers, and colleagues. From a cognitive developmental perspective, the authors reason that as leaders develop, grow, and evolve, sense-making becomes more sophisticated and nuanced, resulting in the development of wisdom. Leadership wisdom is a function of horizontal development (acquisition of information, skills, and competencies) and vertical development (the development of more complex and sophisticated ways of thinking). Ways to enhance vertical development and sense-making to cultivate wisdom are discussed. Principles and practices that promote vertical development in self and others deepen performance expectations of those in the academy and support personal professional development and organizational success. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Sparsity based target detection for compressive spectral imagery

    NASA Astrophysics Data System (ADS)

    Boada, David Alberto; Arguello Fuentes, Henry

    2016-09-01

    Hyperspectral imagery provides significant information about the spectral characteristics of objects and materials present in a scene. It enables object and feature detection, classification, or identification based on the acquired spectral characteristics. However, it relies on sophisticated acquisition and data-processing systems able to acquire, process, store, and transmit hundreds or thousands of image bands from a given area of interest, which demands enormous resources in terms of storage, computation, and I/O throughput. Specialized optical architectures have been developed for the compressed acquisition of spectral images using a reduced set of coded measurements, in contrast to traditional architectures that need a complete set of measurements of the data cube, thereby easing the storage and acquisition limitations. Despite this improvement, if any processing is desired, the image must first be reconstructed by an inverse algorithm, which is also an expensive task. In this paper, a sparsity-based algorithm for target detection in compressed spectral images is presented. Specifically, the target detection model adapts a sparsity-based target detector to work in the compressive domain, modifying the sparse representation basis in the compressive sensing problem by means of over-complete training dictionaries and a wavelet basis representation. Simulations show that the presented method can achieve even better detection results than state-of-the-art methods.
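    The general idea of sparsity-based target detection can be illustrated with a greedy sparse decomposition over a small dictionary. This is a hedged toy sketch, not the paper's detector: the dictionary atoms, the plain matching-pursuit solver, and the "target atom dominates" decision rule are all assumptions made for the example.

    ```python
    # Toy sparsity-based detection: decompose a pixel spectrum over a
    # dictionary containing a known target signature plus a background
    # atom, and declare "target" if the target atom captures most energy.
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def normalize(v):
        n = dot(v, v) ** 0.5
        return [x / n for x in v]

    def matching_pursuit(signal, atoms, iterations=3):
        """Greedy sparse decomposition: repeatedly project the residual
        onto the best-matching atom and subtract that component."""
        residual = list(signal)
        coeffs = [0.0] * len(atoms)
        for _ in range(iterations):
            projections = [dot(residual, a) for a in atoms]
            k = max(range(len(atoms)), key=lambda i: abs(projections[i]))
            coeffs[k] += projections[k]
            residual = [r - projections[k] * a
                        for r, a in zip(residual, atoms[k])]
        return coeffs

    target = normalize([1.0, 2.0, 3.0, 2.0])      # hypothetical target spectrum
    background = normalize([1.0, 1.0, 1.0, 1.0])  # flat background atom
    atoms = [target, background]

    pixel = [x * 0.9 + 0.1 for x in target]       # mostly target + offset
    coeffs = matching_pursuit(pixel, atoms)
    is_target = abs(coeffs[0]) > abs(coeffs[1])
    ```

    In the compressive setting the paper describes, the same decomposition would run against a measurement-domain dictionary rather than the raw spectra.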

  13. A simulator study on information requirements for precision hovering

    NASA Technical Reports Server (NTRS)

    Lemons, J. L.; Dukes, T. A.

    1975-01-01

    A fixed-base simulator study of an advanced helicopter instrument display utilizing translational acceleration, velocity, and position information is reported. The simulation involved piloting a heavy helicopter using the Integrated Trajectory Error Display (ITED) in a precision hover task. The test series explored two basic areas. The effect on hover accuracy of adding acceleration information was of primary concern. Also of interest was the operators' ability to use degraded information derived from less sophisticated sources. The addition of translational acceleration to a display containing velocity and position information did not appear to improve hover performance significantly. However, displayed acceleration information seemed to increase the damping of the man-machine system. Finally, the pilots could use translational information synthesized from attitude and angular acceleration as effectively as perfect acceleration information.

  14. Coupling mRNA processing with transcription in time and space

    PubMed Central

    Bentley, David L.

    2015-01-01

    Maturation of mRNA precursors often occurs simultaneously with their synthesis by RNA polymerase II (Pol II). The co-transcriptional nature of mRNA processing has permitted the evolution of coupling mechanisms that coordinate transcription with mRNA capping, splicing, editing and 3′ end formation. Recent experiments using sophisticated new methods for analysis of nascent RNA have provided important insights into the relative amount of co-transcriptional and post-transcriptional processing, the relationship between mRNA elongation and processing, and the role of the Pol II carboxy-terminal domain (CTD) in regulating these processes. PMID:24514444

  15. Using genetic information while protecting the privacy of the soul.

    PubMed

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  16. Expanding rural primary care training by employing information technologies: the need for participation by medical reference librarians.

    PubMed

    Coggan, J M; Crandall, L A

    1995-01-01

    The use of rural sites to train badly needed primary care providers requires access to sophisticated medical information not traditionally available outside of academic health centers. Medical reference librarians can play a key role in the development of primary care training sites in rural settings. Electronic information technologies, with proactive support from medical reference librarians, can provide current and detailed information without concern for distance from the health science center library. This paper discusses recent developments in technology, describes current challenges to the application of this technology in rural settings, and provides policy recommendations for medical reference librarians to enhance rural primary care training.

  17. The Master Lens Database and The Orphan Lenses Project

    NASA Astrophysics Data System (ADS)

    Moustakas, Leonidas

    2012-10-01

    Strong gravitational lenses are uniquely suited for the study of dark matter structure and substructure within massive halos of many scales, act as gravitational telescopes for distant faint objects, and can give powerful and competitive cosmological constraints. While hundreds of strong lenses are known to date, spanning five orders of magnitude in mass scale, thousands will be identified this decade. To fully exploit the power of these objects presently, and in the near future, we are creating the Master Lens Database. This is a clearinghouse of all known strong lens systems, with a sophisticated and modern database of uniformly measured and derived observational and lens-model derived quantities, using archival Hubble data across several instruments. This Database enables new science that can be done with a comprehensive sample of strong lenses. The operational goal of this proposal is to develop the process and the code to semi-automatically stage Hubble data of each system, create appropriate masks of the lensing objects and lensing features, and derive gravitational lens models, to provide a uniform and fairly comprehensive information set that is ingested into the Database. The scientific goal for this team is to use the properties of the ensemble of lenses to make a new study of the internal structure of lensing galaxies, and to identify new objects that show evidence of strong substructure lensing, for follow-up study. All data, scripts, masks, model setup files, and derived parameters, will be public, and free. The Database will be accessible online and through a sophisticated smartphone application, which will also be free.

  18. Computer programs: Information retrieval and data analysis, a compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  19. Information intervention in the pharmaceutical sciences.

    PubMed

    Chatfield, Amy J; Romero, Rebecca M; Haworth, Ian S

    2012-01-01

    Professional guidelines state that higher-order thinking skills are a desirable outcome of pharmacy education. In this context, courses in pharmaceutics at the University of Southern California are taught in a learner-centered manner that requires use of chemical reference sources and interpretation of physicochemical information for drug molecules. To facilitate these activities, a librarian worked with faculty to design a class on reference sources and primary literature. Students believed the librarian instruction was beneficial. After the intervention, faculty fielded fewer information-related questions and the librarian received more sophisticated questions. The class emphasizes the importance of collaboration between librarians and faculty in achieving these results.

  20. SMS-Based Learning in Tertiary Education: Achievement and Attitudinal Outcomes

    ERIC Educational Resources Information Center

    Katz, Yaacov J.

    2013-01-01

    SMS delivery platforms are being increasingly used at the university level to enhance student achievement as well as traits and attitudes related to the learning process. SMS delivery provides access to learning materials without being limited by space or time and sophisticated technological advances in SMS delivery have led to enhanced learner…

  1. GiveMe Shelter: A People-Centred Design Process for Promoting Independent Inquiry-Led Learning in Engineering

    ERIC Educational Resources Information Center

    Dyer, Mark; Grey, Thomas; Kinnane, Oliver

    2017-01-01

    It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologist with access to sophisticated computers and software that can often perform complex calculations that were previously the responsibility of engineers. Not surprisingly, this development raises serious questions about…

  2. Peer Dynamics among Marquesan School-Aged Children.

    ERIC Educational Resources Information Center

    Martini, Mary

    This research describes an observation study of 100 children, ages 9-13 years, on the island of 'Ua Pou, Marquesas Islands, French Polynesia. The children were in a French government boarding school in the main valley of the island. Complex, sophisticated group processes among the Marquesan children were observed. The role structures of the group…

  3. Cognitive Pathways: Analysis of Students' Written Texts for Science Understanding

    ERIC Educational Resources Information Center

    Grimberg, Bruna Irene; Hand, Brian

    2009-01-01

    The purpose of this study was to reconstruct writers' reasoning process as reflected in their written texts. The codes resulting from the text analysis were related to cognitive operations, ranging from simple to more sophisticated ones. The sequence of the cognitive operations as the text unfolded represents the writer's cognitive pathway at the…

  4. Gender, Race and Class in Media. A Text-Reader.

    ERIC Educational Resources Information Center

    Dines, Gail, Ed.; Humez, Jean M., Ed.

    This reader is intended to introduce undergraduates to the richness, sophistication, and diversity that characterize contemporary media scholarship. Another goal is to take the mystery out of the idea of media culture by examining its production, construction, and the meaning-making processes through which media imagery and messages help shape our…

  5. Deep FIFO Surge Buffer

    NASA Technical Reports Server (NTRS)

    Temple, Gerald; Siegel, Marc; Amitai, Zwie

    1991-01-01

    A first-in/first-out (FIFO) buffer temporarily stores short surges of data generated by a data-acquisition system at an excessively high rate and releases the data at a lower rate suitable for processing by a computer. Size and complexity are reduced, and capacity enhanced, by use of newly developed, sophisticated integrated circuits and by a "byte-folding" scheme that doubles the effective depth and data rate.
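    The byte-folding idea can be modeled in software. This is an assumption-laden toy model of the hardware concept, not the Tech Brief's circuit: two 8-bit bytes are packed into one 16-bit word on the way in, so the same number of FIFO storage locations holds twice as many bytes.

    ```python
    # Toy software model of a byte-folded FIFO surge buffer: bursts of
    # bytes are folded two-per-word into a deque, then drained later at
    # a lower rate. All names and sizes here are illustrative.
    from collections import deque

    class ByteFoldedFIFO:
        def __init__(self, word_capacity):
            self.words = deque()          # underlying FIFO of 16-bit words
            self.word_capacity = word_capacity
            self._pending = None          # byte awaiting its pair

        def push(self, byte):
            """Accept one byte of a high-rate surge; fold pairs into words."""
            if self._pending is None:
                self._pending = byte
            else:
                if len(self.words) >= self.word_capacity:
                    raise OverflowError("surge exceeds buffer depth")
                self.words.append((self._pending << 8) | byte)
                self._pending = None

        def pop(self):
            """Release one word (two bytes) at the lower downstream rate."""
            word = self.words.popleft()
            return [(word >> 8) & 0xFF, word & 0xFF]

    fifo = ByteFoldedFIFO(word_capacity=4)
    for b in [1, 2, 3, 4]:          # burst of 4 bytes occupies only 2 words
        fifo.push(b)
    out = fifo.pop() + fifo.pop()   # drained later at the slower rate
    ```

    The burst of four bytes occupies only two underlying words, which is the "doubled effective depth" the abstract refers to.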

  6. Toward a Sociology of Criminological Theory

    ERIC Educational Resources Information Center

    Hauhart, Robert C.

    2012-01-01

    It is a truism to remind ourselves that scientific theory is a human product subject to many of the same social processes that govern other social acts. Science, however, whether social or natural, pretends to claim a higher mission, a more sophisticated methodology, and more consequential and reliable outcomes than human efforts arising from…

  7. The Role of Co-Regulated Learning during Students' Understanding of Complex Systems with Hypermedia.

    ERIC Educational Resources Information Center

    Azevedo, Roger; Cromley, Jennifer G.; Seibert, Diane; Tron, Myriam

    This study examined the role of different scaffolding instructional interventions in facilitating students' shift to more sophisticated mental models as indicated by both performance and process data. Undergraduate students (n=51) were randomly assigned to use of one of three scaffolding conditions (adaptive scaffolding (AS), fixed scaffolding…

  8. Argument, Counterargument, and Integration? Patterns of Argument Reappraisal in Controversial Classroom Discussions

    ERIC Educational Resources Information Center

    Gronostay, Dorothee

    2016-01-01

    Being challenged by opposing views in a controversial discussion can stimulate the production of more elaborate and sophisticated argumentations. According to the model of argument reappraisal (Leitão, 2000), such processes require transactivity, meaning that students do not only give reasons to support their own position (e.g., pro/contra…

  9. The Role of Salience in the Extraction of Algebraic Rules

    ERIC Educational Resources Information Center

    Endress, Ansgar D.; Scholl, Brian J.; Mehler, Jacques

    2005-01-01

    Recent research suggests that humans and other animals have sophisticated abilities to extract both statistical dependencies and rule-based regularities from sequences. Most of this research stresses the flexibility and generality of such processes. Here the authors take up an equally important project, namely, to explore the limits of such…

  10. CONTROL OF MICROBIAL CONTAMINANTS AND DISINFECTION BY-PRODUCTS IN DRINKING WATER: COST AND PERFORMANCE

    EPA Science Inventory

    The U.S. Environmental Protection Agency (U.S. EPA) is in the process of developing a sophisticated regulatory strategy in an attempt to balance the risks associated with disinfectants and disinfection by-products (D/DBP) in drinking water. A major aspect of this strategy is the...

  11. Integrating CAD/CAM in Automation and Materials Handling

    ERIC Educational Resources Information Center

    Deal, Walter F.; Jones, Catherine E.

    2012-01-01

    Humans by their very nature are users of tools, materials, and processes as a part of their survival and existence. As humans have progressed over time, their civilizations and societies have changed beyond imagination and have moved from hunters and gatherers of food and materials for survival to sophisticated societies with complex social and…

  12. The Role of the Social Scientist in the School of Education

    ERIC Educational Resources Information Center

    Schlechty, Philip C.; Morrison, James L.

    1977-01-01

    A conflict exists within schools of education over the content of foundations courses: should they stress the relationship of education to society, or should they focus on developing the sophistication of the teacher? The authors recommend that social researchers in education examine problems of learning motivation, instructional processes, and…

  13. SNOWMIP2: An evaluation of forest snow process simulations

    Treesearch

    Richard Essery; Nick Rutter; John Pomeroy; Robert Baxter; Manfred Stahli; David Gustafsson; Alan Barr; Paul Bartlett; Kelly Elder

    2009-01-01

    Models of terrestrial snow cover, or snow modules within land surface models, are used in many meteorological, hydrological, and ecological applications. Such models were developed first, and have achieved their greatest sophistication, for snow in open areas; however, huge tracts of the Northern Hemisphere both have seasonal snow cover and are forested (Fig. 1)....

  14. Meeting the Social and Legal Needs of Urban Indians: An Experimental Program.

    ERIC Educational Resources Information Center

    Halverson, Lowell K.; Garrow, Tom

    Approximately 40 percent of America's Indians live in urban environments; of these, about 12,000 live in Seattle, Washington. With no representation in local government, and lacking the power and cultural sophistication to make the political process work for them, many Indian emigres have developed an almost institutionalized distrust of and…

  15. Security Systems Commissioning: An Old Trick for Your New Dog

    ERIC Educational Resources Information Center

    Black, James R.

    2009-01-01

    Sophisticated, software-based security systems can provide powerful tools to support campus security. By nature, such systems are flexible, with many capabilities that can help manage the process of physical protection. However, the full potential of these systems can be overlooked because of unfamiliarity with the products, weaknesses in security…

  16. Using Statistical Natural Language Processing for Understanding Complex Responses to Free-Response Tasks

    ERIC Educational Resources Information Center

    DeMark, Sarah F.; Behrens, John T.

    2004-01-01

    Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and to generate task-level scoring…

  17. Nanoarchitectonics for Controlling the Number of Dopant Atoms in Solid Electrolyte Nanodots.

    PubMed

    Nayak, Alpana; Unayama, Satomi; Tai, Seishiro; Tsuruoka, Tohru; Waser, Rainer; Aono, Masakazu; Valov, Ilia; Hasegawa, Tsuyoshi

    2018-02-01

    Controlling movements of electrons and holes is the key task in developing today's highly sophisticated information society. As transistors reach their physical limits, the semiconductor industry is seeking the next alternative to sustain its economy and to unfold a new era of human civilization. In this context, a completely new information token, i.e., ions instead of electrons, is promising. The current trend in solid-state nanoionics for applications in energy storage, sensing, and brain-type information processing, requires the ability to control the properties of matter at the ultimate atomic scale. Here, a conceptually novel nanoarchitectonic strategy is proposed for controlling the number of dopant atoms in a solid electrolyte to obtain discrete electrical properties. Using α-Ag2+δS nanodots with a finite number of nonstoichiometry excess dopants as a model system, a theory matched with experiments is presented that reveals the role of physical parameters, namely, the separation between electrochemical energy levels and the cohesive energy, underlying atomic-scale manipulation of dopants in nanodots. This strategy can be applied to different nanoscale materials as their properties strongly depend on the number of doping atoms/ions, and has the potential to create a new paradigm based on controlled single atom/ion transfer. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Adaptive Quadrature Detection for Multicarrier Continuous-Variable Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Gyongyosi, Laszlo; Imre, Sandor

    2015-03-01

    We propose adaptive quadrature detection for multicarrier continuous-variable quantum key distribution (CVQKD). A multicarrier CVQKD scheme uses Gaussian subcarrier continuous variables for conveying information and Gaussian sub-channels for the transmission. The proposed multicarrier detection scheme dynamically adapts to the sub-channel conditions using corresponding statistics provided by our sophisticated sub-channel estimation procedure. The sub-channel estimation phase determines the transmittance coefficients of the sub-channels; this information is then used in the adaptive quadrature decoding process. We define a technique called subcarrier spreading to estimate the transmittance conditions of the sub-channels with a theoretical error minimum in the presence of Gaussian noise. We introduce the terms of single and collective adaptive quadrature detection. We also extend the results to a multiuser multicarrier CVQKD scenario. We prove the achievable error probabilities and signal-to-noise ratios, and quantify the attributes of the framework. The adaptive detection scheme makes it possible to utilize the extra resources of multicarrier CVQKD and to maximize the amount of transmittable information. This work was partially supported by the GOP-1.1.1-11-2012-0092 (Secure quantum key distribution between two units on optical fiber network) project sponsored by the EU and European Structural Fund, and by the COST Action MP1006.
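    The general flavor of pilot-based sub-channel estimation can be sketched classically. This is a hedged illustration of the generic idea only, not the paper's "subcarrier spreading" method: the pilot sequence, noise level, and least-squares estimator below are all assumptions for the example.

    ```python
    # Toy estimate of a Gaussian sub-channel's transmittance T from known
    # pilot amplitudes observed through the channel with additive noise.
    import random

    def estimate_transmittance(pilot, received):
        """Least-squares estimate of the amplitude scaling sqrt(T)
        applied by the sub-channel, so T = (sum(p*r)/sum(p*p))**2."""
        scale = (sum(p * r for p, r in zip(pilot, received))
                 / sum(p * p for p in pilot))
        return scale ** 2

    random.seed(0)
    true_T = 0.64                      # illustrative sub-channel transmittance
    pilot = [1.0, -1.0, 2.0, -2.0, 1.5, -1.5] * 50
    received = [true_T ** 0.5 * p + random.gauss(0.0, 0.05) for p in pilot]
    T_hat = estimate_transmittance(pilot, received)
    ```

    Averaging over many pilots suppresses the Gaussian noise, so the estimate lands close to the true transmittance; in the multicarrier scheme such per-sub-channel estimates would then weight the quadrature decoding.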

  19. Neurobiology of Empathy and Callousness: Implications for the Development of Antisocial Behavior

    PubMed Central

    Shirtcliff, Elizabeth A.; Vitacco, Michael J.; Graf, Alexander R.; Gostisha, Andrew J.; Merz, Jenna L.; Zahn-Waxler, Carolyn

    2009-01-01

    Information on the neurobiology of empathy and callousness provides clinicians an opportunity to develop sophisticated understanding of mechanisms underpinning antisocial behavior and its counterpart, moral decision making. This paper provides an integrated in-depth review of hormones (e.g., peripheral steroid hormones like cortisol) and brain structures (e.g., insula, anterior cingulate cortex, and amygdala) implicated in empathy, callousness and psychopathic-like behavior. The overarching goal of this paper is to relate these hormones and brain structures to moral decision-making. This review will begin in the brain, but will then integrate information about biological functioning in the body, specifically stress-reactivity. Our aim is to integrate understanding of neural processes with hormones like cortisol, both of which have demonstrated relationships to empathy, psychopathy, and antisocial behavior. The review proposes neurobiological impairments in individuals who display little empathy are not necessarily due to a reduced ability to understand the emotions of others. Instead, evidence suggests individuals who show little arousal to the distress of others likewise show decreased physiological arousal to their own distress; one manifestation of reduced stress reactivity may be a dysfunction in empathy which supports psychopathic-like constructs (e.g., callousness). This integration will assist in the development of objective methodologies that can inform and monitor treatment interventions focused on decreasing antisocial behavior. PMID:19319834

  20. Three-dimensional quick response code based on inkjet printing of upconversion fluorescent nanoparticles for drug anti-counterfeiting.

    PubMed

    You, Minli; Lin, Min; Wang, Shurui; Wang, Xuemin; Zhang, Ge; Hong, Yuan; Dong, Yuqing; Jin, Guorui; Xu, Feng

    2016-05-21

    Medicine counterfeiting is a serious issue worldwide, involving potentially devastating health repercussions. Advanced anti-counterfeit technology for drugs has therefore aroused intensive interest. However, existing anti-counterfeit technologies are associated with drawbacks such as high cost, complex fabrication processes, sophisticated operation, and incapability of authenticating drug ingredients. In this contribution, we developed a smart-phone-recognition-based upconversion fluorescent three-dimensional (3D) quick response (QR) code for tracking and anti-counterfeiting of drugs. We first formulated three colored inks incorporating upconversion nanoparticles with RGB (i.e., red, green, and blue) emission colors. Using a modified inkjet printer, we printed a series of colors by precisely regulating the overlap of these three inks. Meanwhile, we developed a multilayer printing and splitting technology, which significantly increases the information storage capacity per unit area. As an example, we directly printed the upconversion fluorescent 3D QR code on the surface of drug capsules. The 3D QR code consisted of three different color layers, with each layer encoded with information on a different aspect of the drug. A smart phone app was designed to decode the multicolor 3D QR code, providing the authenticity and related information of drugs. The developed technology possesses merits in terms of low cost, ease of operation, high throughput, and high information capacity, and thus holds great potential for drug anti-counterfeiting.
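    The layer-per-color-channel concept behind such a 3D code can be sketched in software. This is our own toy encoding, not the paper's printing or upconversion chemistry: three binary module layers are merged into one RGB pattern and later split back out, with all layer contents and the threshold chosen for illustration.

    ```python
    # Toy model of a three-layer "3D" code: one binary layer per RGB
    # channel, merged into color modules and split back for decoding.
    def merge_layers(red, green, blue):
        """Each layer is a flat list of 0/1 modules; a module becomes an
        RGB triple with 255 in the channels whose layer bit is set."""
        return [(255 * r, 255 * g, 255 * b)
                for r, g, b in zip(red, green, blue)]

    def split_layers(pixels, threshold=128):
        """Recover the three binary layers by thresholding each channel."""
        red   = [1 if p[0] >= threshold else 0 for p in pixels]
        green = [1 if p[1] >= threshold else 0 for p in pixels]
        blue  = [1 if p[2] >= threshold else 0 for p in pixels]
        return red, green, blue

    layer_r = [1, 0, 1, 1]   # e.g. manufacturer code (illustrative)
    layer_g = [0, 1, 1, 0]   # e.g. batch number (illustrative)
    layer_b = [1, 1, 0, 0]   # e.g. expiry date (illustrative)
    pixels = merge_layers(layer_r, layer_g, layer_b)
    decoded = split_layers(pixels)
    ```

    Splitting by channel recovers each layer independently, which is why stacking three layers triples the information capacity per unit area.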
