48 CFR 15.202 - Advisory multi-step process.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System, 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204)... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...
Improving Program Performance through Management Information. A Workbook.
ERIC Educational Resources Information Center
Bienia, Nancy
Designed specifically for state and local managers and supervisors who plan, direct, and operate child support enforcement programs, this workbook provides a four-part, step-by-step process for identifying needed information and methods of using the information to operate an effective program. The process consists of: (1) determining what…
Perez, Susan L; Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L
2015-07-20
Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios, consistent with influenza or bacterial meningitis, and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant's information-seeking process. A total of 78 Internet health information seekers, ranging in age from 21 to 35 years, who experienced barriers to accessing health care services participated. We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching for an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high-capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites.
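As an aside for readers who want to experiment with such coding schemes, here is a minimal sketch of how the pattern classification described above might be expressed in code. The step codes and grouping criteria paraphrase the abstract; the step-count threshold and the exact decision order are hypothetical placeholders, not the authors' actual rules.

```python
# Illustrative only: steps are coded 'H' (hypothesis testing),
# 'E' (evidence gathering), 'A' (action/treatment seeking).

def classify_pattern(steps, short_search_max=3):
    """Assign a coded search pattern to a System 1- or System 2-style group."""
    few_steps = len(steps) <= short_search_max   # brevity of the search
    explored = 'H' in steps and 'E' in steps     # hypothesis plus symptoms
    ends_in_action = bool(steps) and steps[-1] == 'A'
    if explored or (not few_steps and not ends_in_action):
        return "System 2"   # slow, deliberative processing
    return "System 1"       # rapid, automatic processing

print(classify_pattern(['H', 'E', 'E', 'H', 'A']))  # -> System 2
print(classify_pattern(['A']))                      # -> System 1
```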
A method for tailoring the information content of a software process model
NASA Technical Reports Server (NTRS)
Perkins, Sharon; Arend, Mark B.
1990-01-01
A framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
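Since the six steps amount to a chain of mappings, a few lines of code can make the flow from step 2 to step 5 concrete. This is a minimal sketch with invented placeholder needs, criteria, and product names; none of them come from the report.

```python
# Step 2: a (made-up) quality needs profile for the software to be built.
quality_needs = {"reliability": "high", "maintainability": "medium"}

needs_to_criteria = {                    # step 3: needs -> quality criteria
    "reliability": ["fault tolerance", "accuracy"],
    "maintainability": ["modularity", "self-descriptiveness"],
}
criteria_to_products = {                 # step 4: criteria -> processes/products
    "fault tolerance": ["error-handling design notes"],
    "accuracy": ["verification test reports"],
    "modularity": ["module interface specifications"],
    "self-descriptiveness": ["inline documentation standards"],
}

criteria = [c for need in quality_needs for c in needs_to_criteria[need]]
products = sorted({p for c in criteria for p in criteria_to_products[c]})
print(products)                          # step 5: tailored information products
```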
The RiverFish Approach to Business Process Modeling: Linking Business Steps to Control-Flow Patterns
NASA Astrophysics Data System (ADS)
Zuliane, Devanir; Oikawa, Marcio K.; Malkowski, Simon; Alcazar, José Perez; Ferreira, João Eduardo
Despite the recent advances in the area of Business Process Management (BPM), today’s business processes have largely been implemented without clearly defined conceptual modeling. This results in growing difficulties for identification, maintenance, and reuse of rules, processes, and control-flow patterns. To mitigate these problems in future implementations, we propose a new approach to business process modeling using conceptual schemas, which represent hierarchies of concepts for rules and processes shared among collaborating information systems. This methodology bridges the gap between conceptual model description and identification of actual control-flow patterns for workflow implementation. We identify modeling guidelines that are characterized by clear phase separation, step-by-step execution, and process building through diagrams and tables. The separation of business process modeling in seven mutually exclusive phases clearly delimits information technology from business expertise. The sequential execution of these phases leads to the step-by-step creation of complex control-flow graphs. The process model is refined through intuitive table and diagram generation in each phase. Not only does the rigorous application of our modeling framework minimize the impact of rule and process changes, but it also facilitates the identification and maintenance of control-flow patterns in BPM-based information system architectures.
Algorithmic and heuristic processing of information by the nervous system.
Restian, A
1980-01-01
Starting from the fact that the nervous system must discover the information it needs, the author describes the way it decodes the received message. The logical circuits of the nervous system, which submit the received signals to a process through which the information they carry is discovered step by step, participate in decoding the message. The received signals, as information, can be processed algorithmically or heuristically. Algorithmic processing is done according to precise rules, which must be applied step by step. By algorithmic processing, the nervous system develops somatic and vegetative reflexes such as the control of blood pressure, heart rate, or water metabolism. When it lacks precise rules for processing information, or when algorithmic processing would require a very long time, the nervous system must use heuristic processing. This is the feature that differentiates the human brain from the electronic computer, which can work only according to extremely precise rules. The human brain can work according to less precise rules because it can resort to trial-and-error operations and because it works according to a form of logic. Working with higher-order signals, which represent the class of all the lower-order signals from which they derive, the human brain need not perform all the operations it would have to perform on the lower-order signals themselves. The brain therefore tries to subject the received signals to as intensive a "superization" as possible. All information processing, and especially heuristic processing, is accompanied by a certain affective color, and the brain cannot operate without it. Emotions, passions, and sentiments usually compensate for the imprecision of heuristic programmes. Finally, the author shows that the study of informational, and especially heuristic, processes can contribute to a better understanding of the transition from neurological to psychological activity.
ERIC Educational Resources Information Center
Frazier, Thomas W.; Youngstrom, Eric A.
2006-01-01
In this article, the authors illustrate a step-by-step process of acquiring and integrating information according to the recommendations of evidence-based practices. A case example models the process, leading to specific recommendations regarding instruments and strategies for evidence-based assessment (EBA) of attention-deficit/hyperactivity…
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.; Weinbaum, Rebecca K.
2016-01-01
Recently, several authors have attempted to make the literature review process more transparent by providing a step-by-step guide to conducting literature reviews. However, although these works are very informative, none of them delineate how to display information extracted from literature reviews in a reader-friendly and visually appealing…
Enhanced Traceability for Bulk Processing of Sentinel-Derived Information Products
NASA Astrophysics Data System (ADS)
Lankester, Thomas; Hubbard, Steven; Knowelden, Richard
2016-08-01
The advent of widely available, systematically acquired and advanced Earth observations from the Sentinel platforms is spurring development of a wide range of derived information products. Whilst welcome, this rapid rate of development inevitably leads to some processing instability as algorithms and production steps are required to evolve accordingly. To mitigate this instability, the provenance of EO-derived information products needs to be traceable and transparent. Airbus Defence and Space (Airbus DS) has developed the Airbus Processing Cloud (APC) as a virtualised processing farm for bulk production of EO-derived data and information products. The production control system of the APC transforms internal configuration control information into an INSPIRE metadata file containing a stepwise set of processing steps and data source elements that provide the complete and transparent provenance of each product generated.
Lights, Camera, Action: Facilitating the Design and Production of Effective Instructional Videos
ERIC Educational Resources Information Center
Di Paolo, Terry; Wakefield, Jenny S.; Mills, Leila A.; Baker, Laura
2017-01-01
This paper outlines a rudimentary process intended to guide faculty in K-12 and higher education through the steps involved to produce video for their classes. The process comprises four steps: planning, development, delivery and reflection. Each step is infused with instructional design information intended to support the collaboration between…
Zimmerman, Margaret S
2018-01-01
This paper explores the reproductive health-related information seeking of low-income women that has been found to be affected by digital divide disparities. A survey conducted with 70 low-income women explores what information sources women use for reproductive health-related information seeking, what process they go through to find information, and if they are using sources that they trust. The findings of this study detail a two-step information-seeking process that typically includes a preference for personal, informal sources. Women of this income group often rely upon sources that they do not consider credible. While there have been many studies on the end effects of a lack of accurate and accessible reproductive health information, little research has been conducted to examine the reproductive healthcare information-seeking patterns of women who live in poverty.
Evaluating and selecting an information system, Part 1.
Neal, T
1993-01-01
Initial steps in the process of evaluating and selecting a computerized information system for the pharmacy department are described. The first step in the selection process is to establish a steering committee and a project committee. The steering committee oversees the project, providing policy guidance, making major decisions, and allocating budgeted expenditures. The project committee conducts the departmental needs assessment, identifies system requirements, performs day-to-day functions, evaluates vendor proposals, trains personnel, and implements the system chosen. The second step is the assessment of needs in terms of personnel, workload, physical layout, and operating requirements. The needs assessment should be based on the department's mission statement and strategic plan. The third step is the development of a request for information (RFI) and a request for proposal (RFP). The RFI is a document designed for gathering preliminary information from a wide range of vendors; this general information is used in deciding whether to send the RFP to a given vendor. The RFP requests more detailed information and gives the purchaser's exact specifications for a system; the RFP also includes contractual information. To help ensure project success, many institutions turn to computer consultants for guidance. The initial steps in selecting a computerized pharmacy information system are establishing computerization committees, conducting a needs assessment, and writing an RFI and an RFP. A crucial early decision is whether to seek a consultant's expertise.
Knowledge, as the Result of the Processed Information by Human's Sub-particles (sub-strings)/Mind in our Brain
NASA Astrophysics Data System (ADS)
Gholibeigian, Hassan
In my vision, there are four animated sub-particles (matter, plant, animal, and human sub-particles) as the origin of life and the creator of momentum in each fundamental particle (string). They communicate with the dimension of information, which is nested with space-time, to get a package of information in each Planck time. They are the link-point between the dimension of information and space-time. The sub-particle, which identifies its fundamental particle, processes the package of information to find its next step. Processed information is always carried by fundamental particles as the history of the universe and enhances its entropy. My proposed formula for calculating the number of packages is I = t_P^(-1) · τ, where t_P is the Planck time and τ is the fundamental particle's lifetime. For example, a photon needs to process about 1.8 × 10^43 packages of information to find its path in one second. Each process takes place faster than the speed of light. In our bodies, human sub-particles (sub-strings) communicate with the dimension of information and get packages of information, including standard ethics, to process and to find their next step. The processed information transforms into knowledge in our mind. This knowledge is always carried by us.
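For what it is worth, the quoted figure is simply the number of Planck times in one second, which can be checked with two lines of arithmetic (assuming τ = 1 s and the standard value of the Planck time):

```python
# Back-of-envelope check of the quoted figure, I = tau / t_P with tau = 1 s.
t_P = 5.39e-44           # Planck time in seconds
tau = 1.0                # interval considered: one second
packages = tau / t_P
print(f"{packages:.2e}") # -> 1.86e+43, consistent with the quoted 1.8e43
```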
Wang, Degeng
2008-01-01
Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently described in the literature as a surprise and/or a technical difficulty. Protein and RNA represent different steps of the multi-stepped cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers: multi-step information flow from the storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, a common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to the computer primary memory and the computer cache memory, respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks: biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing computer programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with a goal of optimizing system performance. Cellular counterparts can be easily identified for these regulatory techniques. In other words, this comparative study attempts to use theoretical insight from computer system design principles as a catalyst to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In the context of this bird's-eye view, the discrepancy between protein and RNA abundance becomes a logical observation one would expect. It is suggested that this discrepancy, when interpreted in the context of system operation, serves as a potential source of information to decipher the regulatory logic underlying biochemical network operation. PMID:18757239
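To see why the discrepancy is expected in any production-degradation pipeline, consider a toy steady-state calculation (not from the paper): with per-gene translation and degradation rates, two genes with identical RNA levels can maintain very different protein levels.

```python
# Toy illustration with arbitrary rate constants (1/h):
# dRNA/dt = k_tx - d_rna*RNA, dProt/dt = k_tl*RNA - d_prot*Prot.
genes = {
    #         k_tx, d_rna, k_tl, d_prot
    "geneA": (10.0, 1.0,   5.0,  0.1),
    "geneB": (10.0, 1.0,   0.5,  1.0),
}
for name, (k_tx, d_rna, k_tl, d_prot) in genes.items():
    rna = k_tx / d_rna                # steady-state RNA level
    protein = k_tl * rna / d_prot     # steady-state protein level
    print(name, "RNA:", rna, "protein:", protein, "ratio:", protein / rna)
# Same RNA level, 100-fold different protein level: the "discrepancy".
```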
Information in general medical practices: the information processing model.
Crowe, Sarah; Tully, Mary P; Cantrill, Judith A
2010-04-01
The need for effective communication and handling of secondary care information in general practices is paramount. The aim was to explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of both clinical and administrative practice staff. This was a qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using NVivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.
Hand and goods judgment algorithm based on depth information
NASA Astrophysics Data System (ADS)
Li, Mingzhu; Zhang, Jinsong; Yan, Dan; Wang, Qin; Zhang, Ruiqi; Han, Jing
2016-03-01
A tablet computer with a depth camera and a color camera is mounted on a traditional shopping cart, and the two cameras capture the inside of the cart. In shopping cart monitoring, it is very important to determine whether the customer's hand moves goods into or out of the shopping cart. This paper establishes a basic framework for judging whether the hand is empty. It includes a hand extraction process based on depth information, a skin color model built using WPCA (Weighted Principal Component Analysis), an algorithm for judging handheld products based on motion and skin color information, and a statistical process. Within this framework, the first step ensures the integrity of the hand information and effectively avoids the influence of sleeves and other debris; the second step accurately extracts skin color and eliminates interference from similar colors, is little affected by lighting, and has the advantages of fast computation and high efficiency; and the third step greatly reduces noise interference and improves accuracy.
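As a concrete starting point, here is a minimal sketch of the first step only, depth-range thresholding for hand extraction; the depth range is an invented placeholder, and the WPCA skin-color and motion steps are omitted.

```python
import numpy as np

def extract_hand_mask(depth_mm, near=400, far=900):
    """Keep pixels whose depth falls inside the expected hand range (mm)."""
    return (depth_mm > near) & (depth_mm < far)

# Synthetic 240x320 depth frame standing in for the cart-mounted camera.
depth = np.random.default_rng(0).uniform(300, 1500, size=(240, 320))
mask = extract_hand_mask(depth)
print("candidate hand pixels:", int(mask.sum()))
```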
Eureka: Six Easy Steps to Research Success
ERIC Educational Resources Information Center
Hubel, Joy Alter
2005-01-01
Eureka is similar to the Big6™ research skills model by Michael Eisenberg and Bob Berkowitz, as both methods simplify the complex process of critical information gathering into six user-friendly steps. The six research steps of Eureka are presented.
ERIC Educational Resources Information Center
Stanford, Linda
This course curriculum is intended for use by community college instructors and administrators in implementing an advanced information processing course. It builds on the skills developed in the previous information processing course but goes one step further by requiring students to perform in a simulated office environment and improve their…
Initial Crisis Reaction and Poliheuristic Theory
ERIC Educational Resources Information Center
DeRouen, Karl, Jr.; Sprecher, Christopher
2004-01-01
Poliheuristic (PH) theory models foreign policy decisions using a two-stage process. The first step eliminates alternatives on the basis of a simplifying heuristic. The second step involves a selection from among the remaining alternatives and can employ a more rational and compensatory means of processing information. The PH model posits that…
ERIC Educational Resources Information Center
Stanford, Linda
This course curriculum is intended for use in an advanced information processing course. It builds on the skills developed in the previous information processing course but goes one step further by requiring students to perform in a simulated office environment and improve their decision-making skills. This volume contains two parts of the…
ERIC Educational Resources Information Center
Alfonseca, Enrique; Rodriguez, Pilar; Perez, Diana
2007-01-01
This work describes a framework that combines techniques from Adaptive Hypermedia and Natural Language processing in order to create, in a fully automated way, on-line information systems from linear texts in electronic format, such as textbooks. The process is divided into two steps: an "off-line" processing step, which analyses the source text,…
Information Systems Technician Rating Stakeholders: Implications for Effective Performance
2011-01-01
DeSanctis, and Borge Obel. (2006). Organizational Design: A Step-by-Step Approach. Cambridge, UK: Cambridge University Press. Carroll, G. R., and M… manpower, personnel, and training processes for managing the information systems technician (IT) rating and the effects of these different stakeholders… Strategic Human Resource Management and Management Structure… Organizational
Neikter, Susanna Allgurin; Rehnqvist, Nina; Rosén, Måns; Dahlgren, Helena
2009-12-01
The aim of this study was to facilitate effective internal and external communication of an international network and to explore how to support communication and work processes in health technology assessment (HTA). STRUCTURE AND METHODS: The European network for Health Technology Assessment (EUnetHTA) connected sixty-four HTA Partner organizations from thirty-three countries. User needs in the different steps of the HTA process were the starting point for developing an information system. A step-wise, interdisciplinary, creative approach was used in developing practical tools. An Information Platform facilitated the exchange of scientific information between Partners and with external target groups. More than 200 virtual meetings were set up during the project using an e-meeting tool. A Clearinghouse prototype was developed with the intent of offering a single point of access to HTA-relevant information. This evolved into a next step not planned from the outset: developing a running HTA Information System including several Web-based tools to support communication and daily HTA processes. A communication strategy guided the communication effort, focusing on practical tools, creating added value, involving stakeholders, and avoiding duplication of effort. Modern technology enables a new information infrastructure for HTA. The potential of information and communication technology was used as a strategic tool. Several target groups were represented among the Partners, which supported collaboration and made it easier to identify user needs. A distinctive visual identity made it easier to gain and maintain visibility on a limited budget.
Apply creative thinking of decision support in electrical nursing record.
Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung
2006-01-01
The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. In the nursing process, the nurse collects a great deal of data and information. The amount of data and information may exceed the amount the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in the planning of nursing care, owing to the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid in the nursing process, they only provide the most fundamental decision support, i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step by step, but it does not assist the nurse in the decision-making process. The decision process of how to generate nursing diagnoses from data and how to individualize the care plans still remains with the nurse. The purpose of this study is to develop a pilot structure in an electronic nursing record system, integrated with international nursing standards, for improving the proficiency and accuracy of the plan of care in the clinical pathway process. The proposed pilot system not only assists student nurses and nurses who are novices in nursing practice, but also experts who need to work in a practice area with which they are not familiar.
Emergence of Coding and its Specificity as a Physico-Informatic Problem
NASA Astrophysics Data System (ADS)
Wills, Peter R.; Nieselt, Kay; McCaskill, John S.
2015-06-01
We explore the origin-of-life consequences of the view that biological systems are demarcated from inanimate matter by their possession of referential information, which is processed computationally to control choices of specific physico-chemical events. Cells are cybernetic: they use genetic information in processes of communication and control, subjecting physical events to a system of integrated governance. The genetic code is the most obvious example of how cells use information computationally, but the historical origin of the usefulness of molecular information is not well understood. Genetic coding made information useful because it imposed a modular metric on the evolutionary search and thereby offered a general solution to the problem of finding catalysts of any specificity. We use the term "quasispecies symmetry breaking" to describe the iterated process of self-organisation whereby the alphabets of distinguishable codons and amino acids increased, step by step.
Bibliographic Instruction in a Step-by-Step Approach.
ERIC Educational Resources Information Center
Soash, Richard L.
1992-01-01
Describes an information search process based on Kuhlthau's model that was used to teach bibliographic research to ninth grade students. A research test to ensure that students are familiar with basic library skills is presented, forms for helping students narrow the topic and evaluate materials are provided, and a research process checklist is…
IONIO Project: Computer-mediated Decision Support System and Communication in Ocean Science
NASA Astrophysics Data System (ADS)
Oddo, Paolo; Acierno, Arianna; Cuna, Daniela; Federico, Ivan; Galati, Maria Barbara; Awad, Esam; Korres, Gerasimos; Lecci, Rita; Manzella, Giuseppe M. R.; Merico, Walter; Perivoliotis, Leonidas; Pinardi, Nadia; Shchekinova, Elena; Mannarini, Gianandrea; Vamvakaki, Chrysa; Pecci, Leda; Reseghetti, Franco
2013-04-01
A Decision Support System is composed of four main steps. The first is the definition of the problem: the issue to be covered and the decisions to be taken. Different causes can provoke different problems; for each cause or its effects, it is necessary to define a list of information and/or data required to take the better decision. The second step is the determination of the sources from which the information/data needed for decision-making can be obtained, and of who holds that information. Furthermore, it must be possible to evaluate the quality of the sources to see which of them can provide the best information, and to identify the mode and format in which the information is presented. The third step concerns the processing of knowledge, i.e., whether the information/data are fit for purpose. It has to be decided which parts of the information/data need to be used, what additional data or information it is necessary to access, and how information can best be presented to make it possible to understand the situation and take decisions. Finally, the decision-making process is an interactive and inclusive process involving all concerned parties, whose different views must be taken into consideration. A knowledge-based discussion forum is necessary to reach a consensus. A decision-making process needs to be examined closely, refined, and modified to meet differing needs over time. The report presents the legal framework and knowledge base for a science-based decision support system and briefly explores some of the skills that enhance the quality of the decisions taken.
Manpower Information Manual. A Manual for Local Planning.
ERIC Educational Resources Information Center
Allred, Marcus D.; Myers, Christine F.
The step-by-step procedures contained in this manual are intended to develop a simple information system that can be used to collect and process the best possible factual data on the manpower needs of the community served by an educational institution, so that long-range planning of vocational curriculum and guidance can be based on what the jobs…
Joint Extraction of Entities and Relations Using Reinforcement Learning and Deep Learning.
Feng, Yuntian; Zhang, Hongjun; Hao, Wenning; Chen, Gang
2017-01-01
We use both reinforcement learning and deep learning to simultaneously extract entities and relations from unstructured texts. For reinforcement learning, we model the task as a two-step decision process. Deep learning is used to automatically capture the most important information from unstructured texts, which represents the state in the decision process. By designing the reward function per step, our proposed method can pass the information of entity extraction to relation extraction and obtain feedback in order to extract entities and relations simultaneously. Firstly, we use a bidirectional LSTM to model the context information, which realizes preliminary entity extraction. On the basis of the extraction results, an attention-based method can represent the sentences that include the target entity pair to generate the initial state in the decision process. Then we use a Tree-LSTM to represent relation mentions to generate the transition state in the decision process. Finally, we employ the Q-Learning algorithm to obtain the control policy π in the two-step decision process. Experiments on ACE2005 demonstrate that our method attains better performance than the state-of-the-art method and achieves a 2.4% increase in recall score.
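The two-step decision process can be sketched with a tabular stand-in. The toy below is schematic rather than the authors' implementation: in the paper the states come from bidirectional LSTM and Tree-LSTM representations, whereas here the states, actions, and reward are placeholders.

```python
import random

ACTIONS = {0: ["keep_entity", "drop_entity"], 1: ["rel_A", "rel_B"]}
Q = {}                                  # Q[(step, state)][action] -> value
alpha, gamma, eps = 0.5, 0.9, 0.1       # learning rate, discount, exploration

def q_table(step, state):
    return Q.setdefault((step, state), {a: 0.0 for a in ACTIONS[step]})

def reward(step, action):               # placeholder reward design
    return 1.0 if action in ("keep_entity", "rel_A") else 0.0

def choose(step, state):
    if random.random() < eps:           # epsilon-greedy exploration
        return random.choice(ACTIONS[step])
    table = q_table(step, state)
    return max(table, key=table.get)

random.seed(0)
for episode in range(500):
    state = "entity_state"              # step 0: entity-level decision
    for step in (0, 1):
        action = choose(step, state)
        next_state = "relation_state"   # step 1 uses the transition state
        future = max(q_table(1, next_state).values()) if step == 0 else 0.0
        table = q_table(step, state)
        table[action] += alpha * (reward(step, action) + gamma * future - table[action])
        state = next_state

print({key: max(tbl, key=tbl.get) for key, tbl in Q.items()})
# -> {(0, 'entity_state'): 'keep_entity', (1, 'relation_state'): 'rel_A'}
```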
2014-06-01
and Coastal Data Information Program (CDIP). This User's Guide includes step-by-step instructions for accessing the GLOS/GLCFS database via WaveNet… access, processing and analysis tool; part 3 – CDIP database. ERDC/CHL CHETN-xx-14. Vicksburg, MS: U.S. Army Engineer Research and Development Center
First Processing Steps and the Quality of Wild and Farmed Fish
Borderías, Antonio J; Sánchez-Alonso, Isabel
2011-01-01
The first processing steps of fish are species-dependent, with practices common to wild and farmed fish. Fish farming does, however, have certain advantages over traditional fisheries in that the processor can influence postmortem biochemistry and various quality parameters. This review summarizes information about the primary processing of fish based on the influence of catching, slaughtering, bleeding, gutting, washing, and filleting. Recommendations are given for the correct primary processing of fish. PMID:21535702
Impact of modellers' decisions on hydrological a priori predictions
NASA Astrophysics Data System (ADS)
Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.
2013-07-01
The purpose of this paper is to stimulate a re-thinking of how we, the catchment hydrologists, could become reliable forecasters. A group of catchment modellers predicted the hydrological response of a man-made 6 ha catchment in its initial phase (Chicken Creek) without having access to the observed records. They used conceptually different model families. Their modelling experience differed largely. The prediction exercise was organized in three steps: (1) for the 1st prediction modellers received a basic data set describing the internal structure of the catchment (somewhat more complete than usually available to a priori predictions in ungauged catchments). They did not obtain time series of stream flow, soil moisture or groundwater response. (2) Before the 2nd improved prediction they inspected the catchment on-site and attended a workshop where the modellers presented and discussed their first attempts. (3) For their improved 3rd prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step 1. Here, we detail the modeller's decisions in accounting for the various processes based on what they learned during the field visit (step 2) and add the final outcome of step 3 when the modellers made use of additional data. We document the prediction progress as well as the learning process resulting from the availability of added information. For the 2nd and 3rd step, the progress in prediction quality could be evaluated in relation to individual modelling experience and costs of added information. We learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as adding data for improving parameters needed to satisfy model requirements.
Computational mate choice: theory and empirical evidence.
Castellano, Sergio; Cadeddu, Giorgia; Cermelli, Paolo
2012-06-01
The present review is based on the thesis that mate choice results from information-processing mechanisms governed by computational rules and that, to understand how females choose their mates, we should identify the sources of information and how they are used to make decisions. We describe mate choice as a three-step computational process, and for each step we present theories and review empirical evidence. The first step is a perceptual process. It describes the acquisition of evidence, that is, how females use multiple cues and signals to assign an attractiveness value to prospective mates (the preference function hypothesis). The second step is a decisional process. It describes the construction of the decision variable (DV), which integrates evidence (private information by direct assessment), priors (public information), and value (perceived utility) of prospective mates into a quantity that is used by a decision rule (DR) to produce a choice. We make the assumption that females are optimal Bayesian decision makers and we derive a formal model of the DV that can explain the effects of preference functions, mate copying, social context, and females' state and condition on the patterns of mate choice. The third step of the mating decision is a deliberative process that depends on the DRs. We identify two main categories of DRs (absolute and comparative rules) and review the normative models of mate sampling tactics associated with them. We highlight the limits of the normative approach and present a class of computational models (sequential-sampling models) that are based on the assumption that DVs accumulate noisy evidence over time until a decision threshold is reached. These models force us to rethink the dichotomy between comparative and absolute decision rules, between discrimination and recognition, and even between rational and irrational choice. Since they have a robust biological basis, we think they may represent a useful theoretical tool for behavioural ecologists interested in integrating proximate and ultimate causes of mate choice.
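The sequential-sampling idea behind the third step can be illustrated in a few lines of simulation. The sketch below is a generic drift-diffusion-style accumulator with arbitrary parameters, not a specific model from the review.

```python
import random

def sample_decision(drift=0.1, noise=1.0, threshold=5.0, max_steps=10_000):
    """Accumulate noisy evidence until a decision threshold is crossed."""
    dv = 0.0                                   # decision variable
    for t in range(1, max_steps + 1):
        dv += drift + random.gauss(0.0, noise) # evidence plus noise per step
        if dv >= threshold:
            return "accept", t
        if dv <= -threshold:
            return "reject", t
    return "undecided", max_steps

random.seed(1)
print(sample_decision())   # e.g. ('accept', t): the choice and its latency
```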
Communication in diagnostic radiology: meeting the challenges of complexity.
Larson, David B; Froehle, Craig M; Johnson, Neil D; Towbin, Alexander J
2014-11-01
As patients and information flow through the imaging process, value is added step-by-step when information is acquired, interpreted, and communicated back to the referring clinician. However, radiology information systems are often plagued with communication errors and delays. This article presents theories and recommends strategies to continuously improve communication in the complex environment of modern radiology. Communication theories, methods, and systems that have proven their effectiveness in other environments can serve as models for radiology.
Comprehension of Multiple Documents with Conflicting Information: A Two-Step Model of Validation
ERIC Educational Resources Information Center
Richter, Tobias; Maier, Johanna
2017-01-01
In this article, we examine the cognitive processes that are involved when readers comprehend conflicting information in multiple texts. Starting from the notion of routine validation during comprehension, we argue that readers' prior beliefs may lead to a biased processing of conflicting information and a one-sided mental model of controversial…
ERIC Educational Resources Information Center
Sternberg, Robert J.
1979-01-01
An information-processing framework is presented for understanding intelligence. Two levels of processing are discussed: the steps involved in solving a complex intellectual task, and higher-order processes used to decide how to solve the problem. (MH)
Melatonin: a universal time messenger.
Erren, Thomas C; Reiter, Russel J
2015-01-01
Temporal organization plays a key role in humans, and presumably all species on Earth. A core building block of the chronobiological architecture is the master clock, located in the suprachiasmatic nuclei [SCN], which organizes "when" things happen in sub-cellular biochemistry, cells, organs and organisms, including humans. Conceptually, time messaging should follow a 5-step cascade. While abundant evidence suggests how steps 1 through 4 work, step 5, "how is central time information transmitted throughout the body?", awaits elucidation. Step 1: Light provides information on environmental (external) time. Step 2: The ocular interfaces between light and biological (internal) time are intrinsically photosensitive retinal ganglion cells [ipRGCs] and rods and cones. Step 3: Via the retinohypothalamic tract, external time information reaches the light-dependent master clock in the brain, viz. the SCN. Step 4: The SCN translate environmental time information into biological time and distribute this information to numerous brain structures via a melanopsin-based network. Step 5: Melatonin, we propose, transmits, or is a messenger of, internal time information to all parts of the body to allow the temporal organization which is orchestrated by the SCN. Key reasons why we expect melatonin to have such a role include: first, melatonin, as the chemical expression of darkness, is centrally involved in time- and timing-related processes such as encoding clock and calendar information in the brain; second, melatonin travels throughout the body without limits and is thus a ubiquitous molecule. The chemical conservation of melatonin in all tested species could make this molecule a candidate for a universal time messenger, possibly constituting a legacy of an all-embracing evolutionary history.
[Information system for supporting the Nursing Care Systematization].
Malucelli, Andreia; Otemaier, Kelly Rafaela; Bonnet, Marcel; Cubas, Marcia Regina; Garcia, Telma Ribeiro
2010-01-01
The importance, relevance, and necessity of implementing the Nursing Care Systematization in the different environments of professional practice is an unquestionable fact. Taking this as a principle, the motivation emerged for the development of an information system to support the Nursing Care Systematization, based on the Nursing Process steps and Human Needs, using the language of nursing diagnoses, interventions, and outcomes for the documentation of professional practice. This paper describes the methodological steps and results of the information system development: requirements elicitation, modeling, object-relational mapping, implementation, and system validation.
Strategic Information Systems Planning.
ERIC Educational Resources Information Center
Rowley, Jennifer
1995-01-01
Strategic Information Systems Planning (SISP) is the process of establishing a program for implementation and use of information systems in ways that will optimize effectiveness of information resources and use them to support the objectives of the organization. Basic steps in SISP methodology are outlined. (JKP)
Calculation tool for transported geothermal energy using two-step absorption process
Kyle Gluesenkamp
2016-02-01
This spreadsheet allows the user to calculate parameters relevant to the techno-economic performance of a two-step absorption process to transport low-temperature geothermal heat some distance (1-20 miles) for use in building air conditioning. The parameters included are (1) the energy density of aqueous LiBr and LiCl solutions, (2) the transportation cost of trucking the solution, and (3) the equipment cost for the required chillers and cooling towers in the two-step absorption approach. More information is available in the included public report: "A Technical and Economic Analysis of an Innovative Two-Step Absorption System for Utilizing Low-Temperature Geothermal Resources to Condition Commercial Buildings"
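The spreadsheet's core logic reduces to back-of-envelope arithmetic. The sketch below uses made-up numbers purely for illustration; the actual energy densities and costs should be taken from the cited report.

```python
# All values are invented placeholders, not the tool's real inputs.
energy_density_kwh_per_m3 = 80.0   # assumed for a concentrated LiBr solution
truck_volume_m3 = 30.0
cost_per_mile_usd = 3.0
distance_miles = 10.0

energy_per_trip_kwh = energy_density_kwh_per_m3 * truck_volume_m3
transport_cost_usd = cost_per_mile_usd * distance_miles * 2   # round trip
print(f"{energy_per_trip_kwh:.0f} kWh delivered per trip,"
      f" ${transport_cost_usd / energy_per_trip_kwh:.3f}/kWh transport cost")
```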
Graphical modeling and query language for hospitals.
Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris
2013-01-01
So far there has been little evidence that implementation of health information technologies (HIT) is leading to health care cost savings. One reason for this lack of impact likely lies in the complexity of business process ownership in hospitals. The goal of our research is to develop a business model-based method for hospital use which would allow doctors to retrieve ad-hoc information directly from various hospital databases. We have developed a special domain-specific process modelling language called MedMod. Formally, we define the MedMod language as a profile on UML Class diagrams, but we also demonstrate it on examples, where we explain the semantics of all its elements informally. Moreover, we have developed the Process Query Language (PQL), which is based on the MedMod process definition language. The purpose of PQL is to allow a doctor to query (filter) runtime data of a hospital's processes described using MedMod. The MedMod language tries to overcome deficiencies in existing process modeling languages by allowing the user to specify the loosely-defined sequence of steps to be performed in a clinical process. The main advantages of PQL lie in two areas, usability and efficiency: 1) data are viewed through the "glasses" of a familiar process; 2) the simple and easy-to-perceive means of setting filtering conditions require no more expertise than using spreadsheet applications; 3) the dynamic response to each step in the construction of the complete query shortens the learning curve greatly and reduces the error rate; and 4) the selected means of filtering and data retrieval allow queries to be executed in O(n) time with respect to the size of the dataset. We plan to continue this project with three further steps. First, we are planning to develop user-friendly graphical editors for the MedMod process modeling and query languages. The second step is to evaluate the usability of the proposed language and tool, involving physicians from several hospitals in Latvia and working with real data from these hospitals. Our third step is to develop an efficient implementation of the query language.
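The abstract does not show PQL syntax, so the sketch below uses plain Python to illustrate the single-pass, O(n) filtering the authors describe; the record fields and filter conditions are invented.

```python
# Hypothetical process-instance records (field names are made up).
patients = [
    {"step": "triage", "wait_min": 35, "ward": "A"},
    {"step": "surgery", "wait_min": 10, "ward": "B"},
    {"step": "triage", "wait_min": 55, "ward": "A"},
]

# "Show triage cases in ward A that waited longer than 30 minutes":
# one pass over the records, hence O(n) in dataset size.
matches = [p for p in patients
           if p["step"] == "triage" and p["ward"] == "A" and p["wait_min"] > 30]
print(matches)
```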
ERIC Educational Resources Information Center
Wolery, Mark; Brashers, Margaret Sigalove; Neitzel, Jennifer C.
2002-01-01
This article explains how educators can use the ecological congruence assessment process for identifying functional goals for young children with disabilities. Process steps include: teacher collects information about functioning in usual classroom activities, routines, and transitions; summarizes the collected information; and shares the…
Single-Receiver GPS Phase Bias Resolution
NASA Technical Reports Server (NTRS)
Bertiger, William I.; Haines, Bruce J.; Weiss, Jan P.; Harvey, Nathaniel E.
2010-01-01
Existing software has been modified to yield the benefits of integer-fixed double-differenced GPS phase ambiguities when processing data from a single GPS receiver with no access to any other GPS receiver data. When the double-differenced combination of phase biases can be fixed reliably, a significant improvement in solution accuracy is obtained. This innovation uses a large global set of GPS receivers (40 to 80 receivers) to solve for the GPS satellite orbits and clocks (along with any other parameters). In this process, integer ambiguities are fixed and information on the ambiguity constraints is saved. For each GPS transmitter/receiver pair, the process saves the arc start and stop times, the wide-lane average value for the arc, the standard deviation of the wide lane, and the dual-frequency phase bias after bias fixing for the arc. The second step of the process uses the orbit and clock information, the bias information from the global solution, and only data from the single receiver to resolve double-differenced phase combinations. It is called "resolved" instead of "fixed" because constraints are introduced into the problem with a finite data weight to better account for possible errors. A receiver in orbit has much shorter continuous passes of data than a receiver fixed to the Earth; the method has parameters to account for this. In particular, differences in drifting wide-lane values must be handled differently. The first step of the process is automated, using two JPL software sets, Longarc and Gipsy-Oasis. The resulting orbit/clock and bias information files are posted on anonymous ftp for use by any licensed Gipsy-Oasis user. The second step is implemented in the Gipsy-Oasis executable, gd2p.pl, which automates the entire process, including fetching the information from anonymous ftp.
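The per-arc wide-lane bookkeeping can be illustrated with synthetic data. The Melbourne-Wübbena combination used below is standard GPS practice, but the observables are simulated and nothing here reproduces the JPL implementation.

```python
import numpy as np

# Melbourne-Wuebbena wide-lane observable for one transmitter/receiver arc.
c = 299_792_458.0
f1, f2 = 1575.42e6, 1227.60e6            # GPS L1/L2 frequencies (Hz)
lam_wl = c / (f1 - f2)                   # wide-lane wavelength, ~0.862 m

def widelane_cycles(L1, L2, P1, P2):     # phases/pseudoranges in meters
    mw = (f1*L1 - f2*L2)/(f1 - f2) - (f1*P1 + f2*P2)/(f1 + f2)
    return mw / lam_wl

rng = np.random.default_rng(0)
rho = 2.2e7                              # geometric range (m), synthetic
N1, N2 = 11, 4                           # integer carrier ambiguities
L1 = rho + (c/f1)*N1 + rng.normal(0, 0.01, 100)   # phase, ~1 cm noise
L2 = rho + (c/f2)*N2 + rng.normal(0, 0.01, 100)
P1 = rho + rng.normal(0, 0.5, 100)                # code, ~0.5 m noise
P2 = rho + rng.normal(0, 0.5, 100)

n_wl = widelane_cycles(L1, L2, P1, P2)
print(f"arc wide-lane mean {n_wl.mean():.2f} cycles, sigma {n_wl.std():.2f}")
# A mean near the integer N1 - N2 = 7 with small sigma supports bias fixing.
```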
NASA Astrophysics Data System (ADS)
Vandenbroucke, D.; Vancauwenberghe, G.
2016-12-01
The European Union Location Framework (EULF), as part of the Interoperable Solutions for European Public Administrations (ISA) Programme of the EU (EC DG DIGIT), aims to enhance the interactions between governments, businesses and citizens by embedding location information into e-Government processes. The challenge remains to find scientifically sound and at the same time practicable approaches to estimate or measure the impact of location enablement of e-Government processes on the performance of those processes. A method has been defined to estimate process performance in terms of variables describing the efficiency, the effectiveness, and the quality of the output of the work processes. A series of use cases have been identified, corresponding to existing e-Government work processes in which location information could bring added value. In a first step, the processes are described by means of BPMN (Business Process Model and Notation) to better understand the process steps, the actors involved, the spatial data flows, as well as the required input and the generated output. In a second step, the processes are assessed in terms of the (sub-optimal) use of location information and the potential enhancement of the process by better integrating location information and services. Process performance is measured ex ante (before using location-enabled e-Government services) and ex post (after the integration of such services) in order to estimate and measure the impact of location information. The paper describes the method for performance measurement and highlights how the method is applied to one use case, i.e. the process of traffic safety monitoring. The use case is analysed and assessed in terms of location enablement and its potential impact on process performance. The results of applying the methodology to the use case revealed that performance is highly impacted by factors such as the way location information is collected, managed and shared throughout the process, and the degree to which spatial data are harmonized. The work also led to the formulation of some recommendations to enrich the BPMN standard with additional methods for annotating processes, and to a proposal to develop tools for automatic process performance measurement. In that context, some planned future work is highlighted as well.
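At its core, the ex-ante/ex-post comparison reduces to computing indicator deltas, as in the sketch below; the indicator names and values are fabricated for illustration.

```python
# Toy before/after comparison of process performance indicators.
ex_ante = {"minutes_per_case": 42.0, "error_rate": 0.08, "cases_per_day": 55}
ex_post = {"minutes_per_case": 31.0, "error_rate": 0.05, "cases_per_day": 70}

for kpi in ex_ante:
    change = (ex_post[kpi] - ex_ante[kpi]) / ex_ante[kpi] * 100
    print(f"{kpi}: {ex_ante[kpi]} -> {ex_post[kpi]} ({change:+.1f}%)")
```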
Impact of modellers' decisions on hydrological a priori predictions
NASA Astrophysics Data System (ADS)
Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.
2014-06-01
In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the records needed for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their predictions in three steps, with additional information provided prior to each step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models, and their modelling experience differed largely. Hence, they encountered two problems: (i) simulating discharge for an ungauged catchment and (ii) using models that were developed for catchments that are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response, and therefore had to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs of obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and the costs of added information. In this qualitative analysis of a statistically small number of predictions, we learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as added data for improving the parameters needed to satisfy model requirements.
The systematic review as a research process in music therapy.
Hanson-Abromeit, Deanna; Sena Moore, Kimberly
2014-01-01
Music therapists are challenged to present evidence on the efficacy of music therapy treatment and to incorporate the best available research evidence in making informed healthcare and treatment decisions. Higher standards of evidence can come from a variety of sources, including systematic reviews. This article defines and describes a range of research review methods using examples from music therapy and related literature, with emphasis on the systematic review. In addition, the authors provide a detailed overview of methodological processes for conducting and reporting systematic reviews in music therapy. The systematic review process is described in five steps. Step 1 identifies the research plan and operationalized research question(s). Step 2 illustrates the identification and organization of the existing literature related to the question(s). Step 3 details coding of data extracted from the literature. Step 4 explains the synthesis of coded findings and analysis to answer the research question(s). Step 5 describes the strength-of-evidence evaluation and the presentation of results for practice recommendations. Music therapists are encouraged to develop and conduct systematic reviews. This methodology contributes to the credibility of review outcomes and can determine how information is interpreted and used by clinicians, clients or patients, and policy makers. A systematic review is a methodologically rigorous research method used to organize and evaluate extant literature related to a clinical problem. Systematic reviews can assist music therapists in managing the ever-increasing literature, making well-informed evidence-based practice and research decisions, and translating existing music-based and nonmusic-based literature to clinical practice and research development. © the American Music Therapy Association 2014. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Giebel, Clarissa M; Challis, David; Hooper, Nigel M; Ferris, Sally
2018-03-01
In order to increase the efficacy of psychosocial interventions in dementia, a step-by-step process of translating evidence and engaging the public should be adhered to. This paper describes such a process, involving a two-stage focus group with people with dementia (PwD), informal carers, and staff. Based on previous evidence, general aspects of effective interventions were drawn out. These were tested in the first stage of focus groups, one with informal carers and PwD and one with staff. Findings from this stage helped shape the intervention further by specifying its content. In the second stage, participants were consulted about the detailed components. The extant evidence base and focus groups helped to identify six practical and situation-specific elements worthy of consideration in planning such an intervention, including underlying theory and personal motivations for participation. Carers, PwD, and staff highlighted the importance of rapport between practitioners and PwD prior to commencing the intervention. It was also considered important that the intervention be personalised to each individual. This paper shows how valuable public involvement can be to intervention development, and outlines a process of public involvement for future intervention development. The next step would be to formally test the intervention.
Qualitative Features Extraction from Sensor Data using Short-time Fourier Transform
NASA Technical Reports Server (NTRS)
Amini, Abolfazl M.; Figueroa, Fernando
2004-01-01
The information gathered from sensors is used to determine the health of a sensor. Once a normal mode of operation is established, any deviation from the normal behavior indicates a change. This change may be due to a malfunction of the sensor(s) or of the system (or process). The step-up and step-down features, as well as the sensor disturbances, are assumed to be exponential. An RC network is used to model the main process, which is defined by a step-up (charging), drift, and step-down (discharging). The sensor disturbances and a spike are added while the system is in drift. The system runs for a period of at least three time constants of the main process every time a process feature occurs (e.g., a step change). The short-time Fourier transform of the signal is taken using the Hamming window. Three window widths are used. The DC value is removed from the windowed data prior to taking the FFT. The resulting three-dimensional spectral plots provide good time-frequency resolution. The results indicate distinct shapes corresponding to each process feature.
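A minimal sketch of this analysis is given below; the sample rate, time constant, and the three window widths are illustrative assumptions, and scipy's `stft` with `detrend='constant'` stands in for the per-window DC removal described above.

```python
# Sketch: RC step-up (charging) then step-down (discharging), analyzed with
# a Hamming-windowed STFT at three window widths; the DC value is removed
# from each windowed segment before the FFT via detrend='constant'.
import numpy as np
from scipy.signal import stft

fs = 100.0                       # sample rate, Hz (assumed)
tau = 5.0                        # RC time constant, s (assumed)
t = np.arange(0, 6 * tau, 1 / fs)
sig = np.where(t < 3 * tau,
               1 - np.exp(-t / tau),            # charging step-up
               np.exp(-(t - 3 * tau) / tau))    # discharging step-down

for nperseg in (64, 128, 256):   # three window widths
    f, frames, Z = stft(sig, fs=fs, window='hamming',
                        nperseg=nperseg, detrend='constant')
    print(nperseg, Z.shape)      # (frequencies, time frames) per width
```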
A Reference Unit on Home Vegetable Gardening.
ERIC Educational Resources Information Center
McCully, James S., Comp.; And Others
Designed to provide practical, up-to-date, basic information on home gardening for vocational agriculture students with only a limited knowledge of vegetable gardening, this reference unit includes step-by-step procedures for planning, planting, cultivating, harvesting, and processing vegetables in a small plot. Topics covered include plot…
A Systematic Approach to Subgroup Classification in Intellectual Disability
ERIC Educational Resources Information Center
Schalock, Robert L.; Luckasson, Ruth
2015-01-01
This article describes a systematic approach to subgroup classification based on a classification framework and sequential steps involved in the subgrouping process. The sequential steps are stating the purpose of the classification, identifying the classification elements, using relevant information, and using clearly stated and purposeful…
Multi-Mission Automated Task Invocation Subsystem
NASA Technical Reports Server (NTRS)
Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.
2009-01-01
Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
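The rule-driven, event-driven behavior described above can be sketched as follows. This is an illustrative toy, not the MATIS code; the rule format, checkpoint file, and event fields are assumptions.

```python
# Toy event-driven workflow manager: run a plug-in action when a
# user-defined rule matches an event, and checkpoint enough state after
# each step that processing can resume after a restart (fail-safe).
import json, pathlib

CHECKPOINT = pathlib.Path("checkpoint.json")    # hypothetical file name

rules = [   # project-specific rules: event name + condition -> action
    {"event": "file_arrived",
     "when": lambda e: e["type"] == "raw",
     "action": lambda e: print("launching pipeline on", e["path"])},
]

def handle(event):
    for rule in rules:
        if event["name"] == rule["event"] and rule["when"](event):
            rule["action"](event)
            CHECKPOINT.write_text(json.dumps(event))   # retain restart info

handle({"name": "file_arrived", "type": "raw", "path": "/data/obs001.dat"})
```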
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-21
... knowledge transfer, technology transition, and technology diffusion steps, along with numerous... promising research discoveries and ideas for advanced, high-value-added products and processes with existing...
Brentner, Laura B; Eckelman, Matthew J; Zimmerman, Julie B
2011-08-15
The use of algae as a feedstock for biodiesel production is a rapidly growing industry, in the United States and globally. A life cycle assessment (LCA) is presented that compares various methods, either proposed or under development, for algal biodiesel to inform the most promising pathways for sustainable full-scale production. For this analysis, the system is divided into five distinct process steps: (1) microalgae cultivation, (2) harvesting and/or dewatering, (3) lipid extraction, (4) conversion (transesterification) into biodiesel, and (5) byproduct management. A number of technology options are considered for each process step and various technology combinations are assessed for their life cycle environmental impacts. The optimal option for each process step is selected yielding a best case scenario, comprised of a flat panel enclosed photobioreactor and direct transesterification of algal cells with supercritical methanol. For a functional unit of 10 GJ biodiesel, the best case production system yields a cumulative energy demand savings of more than 65 GJ, reduces water consumption by 585 m³ and decreases greenhouse gas emissions by 86% compared to a base case scenario typical of early industrial practices, highlighting the importance of technological innovation in algae processing and providing guidance on promising production pathways.
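The combinatorial structure of the assessment (one technology option per process step, scored over all combinations) might be sketched as below; the option names and per-step impact numbers are placeholders, not values from the study.

```python
# Enumerate one technology option per process step and find the combination
# with the lowest total impact. All numbers are made-up placeholders.
from itertools import product

options = {   # per-step cumulative energy demand, GJ per functional unit
    "cultivation": {"open_pond": 12.0, "flat_panel_PBR": 8.0},
    "dewatering":  {"centrifuge": 9.0, "flocculation": 4.0},
    "extraction":  {"solvent": 6.0, "direct_none": 0.0},
    "conversion":  {"conventional": 5.0, "supercritical_MeOH": 7.0},
    "byproducts":  {"landfill": 3.0, "anaerobic_digestion": -2.0},
}

best = min(product(*(step.items() for step in options.values())),
           key=lambda combo: sum(impact for _, impact in combo))
print([name for name, _ in best], sum(impact for _, impact in best), "GJ")
```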
Expedited vocational assessment under the sequential evaluation process. Final rules.
2012-07-25
We are revising our rules to give adjudicators the discretion to proceed to the fifth step of the sequential evaluation process for assessing disability when we have insufficient information about a claimant's past relevant work history to make the findings required for step 4. If an adjudicator finds at step 5 that a claimant may be unable to adjust to other work existing in the national economy, the adjudicator will return to the fourth step to develop the claimant's work history and make a finding about whether the claimant can perform his or her past relevant work. We expect that this new expedited process will not disadvantage any claimant or change the ultimate conclusion about whether a claimant is disabled, but it will promote administrative efficiency and help us make more timely disability determinations and decisions.
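The conditional step ordering of the expedited process can be made explicit in a short control-flow sketch; the predicate names below are illustrative stand-ins, not SSA terminology.

```python
# Control-flow sketch of the expedited sequential evaluation: skip to step 5
# when work-history information is insufficient, and return to step 4 only
# if the step-5 finding suggests the claimant may be unable to adjust.
def work_history_sufficient(c):  return c.get("history_complete", False)
def can_perform_past_work(c):    return c.get("past_work_ok", False)
def can_adjust_to_other_work(c): return c.get("other_work_ok", False)
def develop_work_history(c):     c["history_complete"] = True

def evaluate(claim):
    if work_history_sufficient(claim) and can_perform_past_work(claim):
        return "not disabled"                    # decided at step 4
    if can_adjust_to_other_work(claim):
        return "not disabled"                    # decided at step 5
    develop_work_history(claim)                  # return to step 4
    return "not disabled" if can_perform_past_work(claim) else "disabled"

print(evaluate({"other_work_ok": False, "past_work_ok": False}))
```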
The Industrial Process System Assessment (IPSA) methodology is a multiple step allocation approach for connecting information from the production line level up to the facility level and vice versa using a multiscale model of process systems. The allocation procedure assigns inpu...
Inside the Black Box: Tracking Decision-Making in an Action Research Study
ERIC Educational Resources Information Center
Smith, Cathryn
2017-01-01
Action research has been described as "designing the plane while flying it" (Herr & Anderson, 2005, p. 69). A black box documented the researcher's decisions while facilitating leadership development sessions with teacher leaders. Ten process folio steps informed the study through six iterations. Planning steps included a design…
ARES - A New Airborne Reflective Emissive Spectrometer
2005-10-01
Information and Management System (DIMS), an automated processing environment with robot archive interface as established for the handling of satellite data...consisting of geocoded ground reflectance data. All described processing steps will be integrated in the automated processing environment DIMS to assure a
Text block identification in restoration process of Javanese script damage
NASA Astrophysics Data System (ADS)
Himamunanto, A. R.; Setyowati, E.
2018-05-01
Generally, a sheet of a document contains two objects of information, namely text and image. The text block area in a sheet of manuscript is a vital object because the restoration process is done only on this object. Text block (text area) identification therefore becomes an important step before restoration. This paper describes the steps leading to the restoration of Javanese script damage. The process stages are: pre-processing, identification of text blocks, segmentation, damage identification, and restoration. The test results based on the input manuscript "Hamong Tani" show that the system works with a success rate of 82.07%.
The dynamics of team cognition: A process-oriented theory of knowledge emergence in teams.
Grand, James A; Braun, Michael T; Kuljanin, Goran; Kozlowski, Steve W J; Chao, Georgia T
2016-10-01
Team cognition has been identified as a critical component of team performance and decision-making. However, theory and research in this domain continues to remain largely static; articulation and examination of the dynamic processes through which collectively held knowledge emerges from the individual- to the team-level is lacking. To address this gap, we advance and systematically evaluate a process-oriented theory of team knowledge emergence. First, we summarize the core concepts and dynamic mechanisms that underlie team knowledge-building and represent our theory of team knowledge emergence (Step 1). We then translate this narrative theory into a formal computational model that provides an explicit specification of how these core concepts and mechanisms interact to produce emergent team knowledge (Step 2). The computational model is next instantiated into an agent-based simulation to explore how the key generative process mechanisms described in our theory contribute to improved knowledge emergence in teams (Step 3). Results from the simulations demonstrate that agent teams generate collectively shared knowledge more effectively when members are capable of processing information more efficiently and when teams follow communication strategies that promote equal rates of information sharing across members. Lastly, we conduct an empirical experiment with real teams participating in a collective knowledge-building task to verify that promoting these processes in human teams also leads to improved team knowledge emergence (Step 4). Discussion focuses on implications of the theory for examining team cognition processes and dynamics as well as directions for future research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
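A toy agent-based sketch of the communication-rate mechanism examined in Step 3 is given below; the dynamics and parameters are illustrative stand-ins, not the authors' computational model.

```python
# Toy simulation: a knowledge item becomes collectively held once some agent
# shares it; teams whose members share at equal rates converge on more
# shared knowledge than teams dominated by a single frequent speaker.
import random

def simulate(share_rates, steps=400, seed=1):
    rng = random.Random(seed)
    agents = [set(range(i * 10, i * 10 + 10))       # unique starting items
              for i in range(len(share_rates))]
    for _ in range(steps):
        speaker = rng.choices(range(len(agents)), weights=share_rates)[0]
        item = rng.choice(sorted(agents[speaker]))  # share one known item
        for knowledge in agents:
            knowledge.add(item)
    return len(set.intersection(*agents))           # collectively held items

print("equal sharing:  ", simulate([1, 1, 1, 1, 1]))
print("unequal sharing:", simulate([10, 1, 1, 1, 1]))
```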
ERIC Educational Resources Information Center
Salisbury, Fiona A.; Karasmanis, Sharon; Robertson, Tracy; Corbin, Jenny; Hulett, Heather; Peseta, Tai L.
2012-01-01
Information literacy is an essential component of the La Trobe University inquiry/research graduate capability and it provides the skill set needed for students to take their first steps on the path to engaging with academic information and scholarly communication processes. A deep learning approach to information literacy can be achieved if…
Informed consent process: A step further towards making it meaningful!
Kadam, Rashmi Ashish
2017-01-01
The informed consent process is the cornerstone of ethics in clinical research. Obtaining informed consent from patients participating in clinical research is an important legal and ethical imperative for clinical trial researchers. Although informed consent is an important process in clinical research, its effectiveness and validity are always a concern. Issues related to the understanding, comprehension, competence, and voluntariness of clinical trial participants may adversely affect the informed consent process. Communicating highly technical, complex, and specialized clinical trial information to participants with limited literacy, diverse sociocultural backgrounds, diminished autonomy, and debilitating diseases is a difficult task for clinical researchers. It is therefore essential to investigate and adopt innovative communication strategies to enhance understanding of clinical trial information among participants. This review article examines the challenges that affect the informed consent process and explores various innovative strategies to enhance the consent process. PMID:28828304
WaveNet: A Web-Based Metocean Data Access, Processing, and Analysis Tool. Part 3 - CDIP Database
2014-06-01
and Analysis Tool; Part 3 – CDIP Database by Zeki Demirbilek, Lihwa Lin, and Derek Wilson. PURPOSE: This Coastal and Hydraulics Engineering...Technical Note (CHETN) describes coupling of the Coastal Data Information Program (CDIP) database to WaveNet, the first module of MetOcnDat (Meteorological...provides a step-by-step procedure to access, process, and analyze wave and wind data from the CDIP database. BACKGROUND: WaveNet addresses a basic
Non-cellulosic polysaccharides from cotton fibre are differently impacted by textile processing.
Runavot, Jean-Luc; Guo, Xiaoyuan; Willats, William G T; Knox, J Paul; Goubet, Florence; Meulewaeter, Frank
2014-01-01
Cotton fibre is mainly composed of cellulose, although non-cellulosic polysaccharides play key roles during fibre development and are still present in the harvested fibre. This study aimed at determining the fate of non-cellulosic polysaccharides during cotton textile processing. We analyzed non-cellulosic cotton fibre polysaccharides during different steps of cotton textile processing using GC-MS, HPLC and comprehensive microarray polymer profiling to obtain monosaccharide and polysaccharide amounts and linkage compositions. Additionally, in situ detection was used to obtain information on polysaccharide localization and accessibility. We show that pectic and hemicellulosic polysaccharide levels decrease during cotton textile processing and that some processing steps have more impact than others. Pectins and arabinose-containing polysaccharides are strongly impacted by the chemical treatments, with most being removed during bleaching and scouring. However, some forms of pectin are more resistant than others. Xylan and xyloglucan are affected in later processing steps and to a lesser extent, whereas callose showed a strong resistance to the chemical processing steps. This study shows that non-cellulosic polysaccharides are differently impacted by the treatments used in cotton textile processing with some hemicelluloses and callose being resistant to these harsh treatments.
A Signal for the Need to Restructure the Learning Process.
ERIC Educational Resources Information Center
Breivik, Patricia Senn
1991-01-01
Although the U.S. will not disintegrate tomorrow if information literacy and resource-based learning remain underfunded, today's disadvantaged groups will fall further behind, as a new "information elite" emerges. The American Library Association's 1989 information literacy report is one step toward creating a national agenda for…
Development of DKB ETL module in case of data conversion
NASA Astrophysics Data System (ADS)
Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.
2018-05-01
Modern scientific experiments involve producing huge volumes of data, which requires new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a valuable amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue requiring unconventional solutions. One of the tasks is to integrate metadata from different repositories into some kind of central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, is aimed at providing fast and easy access to significant information about LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of particular pipelines arranging data flow from data sources to the main DKB storage. The data transformation process, represented by a single pipeline, can be considered as a number of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.
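The pipeline decomposition described above, in which each transformation step is an individual program module, can be sketched as below; the step names and record shapes are illustrative, not the actual DKB modules.

```python
# Each ETL step is an independent module (here, a function); a pipeline is
# the ordered composition arranging data flow from sources to the storage.
def aggregate(records):     # gather metadata from a source repository
    return [r for r in records if r is not None]

def transform(records):     # map records to the current data model
    return [{"dataset": r["name"], "size": r.get("bytes", 0)} for r in records]

STORAGE = []                # stands in for the main storage

def load(records):          # write standardized records to storage
    STORAGE.extend(records)
    return records

def run_pipeline(records, steps):
    for step in steps:      # each step is an individual program module
        records = step(records)
    return records

run_pipeline([{"name": "data15_13TeV", "bytes": 42}, None],
             [aggregate, transform, load])
print(STORAGE)
```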
ERIC Educational Resources Information Center
Denham, Susanne A.; Bassett, Hideko Hamada; Way, Erin; Kalb, Sara; Warren-Khot, Heather; Zinsser, Katherine
2014-01-01
Young children's social information processing (SIP) encompasses a series of steps by which they make sense of encounters with other persons; cognitive and emotional aspects of SIP often predict adjustment in school settings. More attention is needed, however, to the development of preschoolers' SIP and its potential foundations. To this end, a…
Articulation of Phonologically Similar Items Disrupts Free Recall of Nonwords
ERIC Educational Resources Information Center
Nishiyama, Ryoji; Ukita, Jun
2013-01-01
The present study sought to clarify whether phonological similarity of encoded information impairs free recall performance (the phonological similarity effect: PSE) for nonwords. Five experiments examined the influence of the encoding process on the PSE in a step-by-step fashion, by using lists that consisted of phonologically similar (decoy)…
NASA Technical Reports Server (NTRS)
Brower, S. J.; Ridd, M. K.
1984-01-01
The use of the Environmental Protection Agency (EPA) Enviropod camera system is detailed in this handbook which contains a step-by-step guide for mission planning, flights, film processing, indexing, and documentation. Information regarding Enviropod equipment and specifications is included.
Reading Assessment: A Primer for Teachers and Tutors.
ERIC Educational Resources Information Center
Caldwell, JoAnne Schudt
This primer provides the basic information that teachers and tutors need to get started on the complex process of reading assessment. Designed for maximum utility in today's standards-driven classroom, the primer presents simple, practical assessment strategies that are based on theory and research. It takes teachers step by step through learning…
The Costs and Potential Benefits of Alternative Scholarly Publishing Models
ERIC Educational Resources Information Center
Houghton, John W.
2011-01-01
Introduction: This paper reports on a study undertaken for the UK Joint Information Systems Committee (JISC), which explored the economic implications of alternative scholarly publishing models. Rather than simply summarising the study's findings, this paper focuses on the approach and presents a step-by-step account of the research process,…
Mapping Saldana's Coding Methods onto the Literature Review Process
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.; Frels, Rebecca K.; Hwang, Eunjin
2016-01-01
Onwuegbuzie and Frels (2014) provided a step-by-step guide illustrating how discourse analysis can be used to analyze literature. However, more works of this type are needed to address the way that counselor researchers conduct literature reviews. Therefore, we present a typology for coding and analyzing information extracted for literature…
Strategies for developing competency models.
Marrelli, Anne F; Tondora, Janis; Hoge, Michael A
2005-01-01
There is an emerging trend within healthcare to introduce competency-based approaches in the training, assessment, and development of the workforce. The trend is evident in various disciplines and specialty areas within the field of behavioral health. This article is designed to inform those efforts by presenting a step-by-step process for developing a competency model. An introductory overview of competencies, competency models, and the legal implications of competency development is followed by a description of the seven steps involved in creating a competency model for a specific function, role, or position. This modeling process is drawn from advanced work on competencies in business and industry.
40 CFR 161.162 - Description of production process.
Code of Federal Regulations, 2010 CFR
2010-07-01
... applicant must submit information on the production (reaction) processes used to produce the active... continuous (a single reaction process from starting materials to active ingredient), but is accomplished in...) A flow chart of the chemical equations of each intended reaction occurring at each step of the...
40 CFR 161.162 - Description of production process.
Code of Federal Regulations, 2011 CFR
2011-07-01
... applicant must submit information on the production (reaction) processes used to produce the active... continuous (a single reaction process from starting materials to active ingredient), but is accomplished in...) A flow chart of the chemical equations of each intended reaction occurring at each step of the...
Framework for Integrating Science Data Processing Algorithms Into Process Control Systems
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Crichton, Daniel J.; Chang, Albert Y.; Foster, Brian M.; Freeborn, Dana J.; Woollard, David M.; Ramirez, Paul M.
2011-01-01
A software framework called PCS Task Wrapper is responsible for standardizing the setup, process initiation, execution, and file management tasks surrounding the execution of science data algorithms, which are referred to by NASA as Product Generation Executives (PGEs). PGEs codify a scientific algorithm, some step in the overall scientific process involved in a mission science workflow. The PCS Task Wrapper provides a stable operating environment to the underlying PGE during its execution lifecycle. If the PGE requires a file, or metadata regarding the file, the PCS Task Wrapper is responsible for delivering that information to the PGE in a manner that meets its requirements. If the PGE requires knowledge of upstream or downstream PGEs in a sequence of executions, that information is also made available. Finally, if information regarding disk space, or node information such as CPU availability, etc., is required, the PCS Task Wrapper provides this information to the underlying PGE. After this information is collected, the PGE is executed, and its output Product file and Metadata generation is managed via the PCS Task Wrapper framework. The innovation is responsible for marshalling output Products and Metadata back to a PCS File Management component for use in downstream data processing and pedigree. In support of this, the PCS Task Wrapper leverages the PCS Crawler Framework to ingest (during pipeline processing) the output Product files and Metadata produced by the PGE. The architectural components of the PCS Task Wrapper framework include PGE Task Instance, PGE Config File Builder, Config File Property Adder, Science PGE Config File Writer, and PCS Met file Writer. This framework serves as the unifying bridge between the execution of a step in the overall processing pipeline and the available PCS component services, as well as the information that they collectively manage.
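The wrapper lifecycle described above might look like the following sketch; this illustrates the pattern under assumed names and file conventions, and is not the actual PCS Task Wrapper API.

```python
# Sketch of a task wrapper: stage inputs and metadata for a PGE, execute it
# in a stable working directory, then marshal output products back to a
# file-management component for downstream processing.
import json, pathlib, subprocess, tempfile

def run_pge(pge_cmd, input_files, metadata, file_manager):
    workdir = pathlib.Path(tempfile.mkdtemp(prefix="pge_"))
    # deliver required files/metadata in the form the PGE expects
    (workdir / "pge_config.json").write_text(
        json.dumps({"inputs": input_files, "metadata": metadata}))
    result = subprocess.run(pge_cmd, cwd=workdir, capture_output=True)
    if result.returncode != 0:
        raise RuntimeError(result.stderr.decode())
    # crawl the work directory and ingest products (the Crawler Framework
    # role in the description above); ".product" is a hypothetical suffix
    products = sorted(str(p) for p in workdir.glob("*.product"))
    file_manager.extend(products)
    return products
```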
An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi
NASA Astrophysics Data System (ADS)
Deng, D.-P.; Lemmens, R.
2011-08-01
The Web is changing the way people share and communicate information because of the emergence of various Web technologies that enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to the different production methods, UGGC often cannot fit into formal geographic information models; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. This ontology-based process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study implements this process on Twitter messages relevant to the Japan Earthquake disaster. Using this process, we extract disaster relief information from Twitter messages and develop a knowledge base for GeoSPARQL queries on disaster relief information.
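The Formalization and Deployment steps could be prototyped with an RDF toolkit such as rdflib (our assumption; the abstract does not tie the process to a specific library), and the vocabulary and example values below are hypothetical.

```python
# Formalize extracted tweet content as RDF triples, then query it with
# SPARQL. The ex: vocabulary and the example tweet are hypothetical.
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/relief#")
g = Graph()
tweet = URIRef("http://example.org/tweet/1")
g.add((tweet, EX.reports, Literal("water shortage")))
g.add((tweet, EX.nearPlace, Literal("Sendai")))

for row in g.query("""
        PREFIX ex: <http://example.org/relief#>
        SELECT ?t ?need
        WHERE { ?t ex:reports ?need ; ex:nearPlace "Sendai" . }"""):
    print(row.t, row.need)
```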
ERIC Educational Resources Information Center
Peace Corps, Washington, DC. Information Collection and Exchange Div.
Focusing on the production and utilization of printing processes in constructing effective visuals for teaching, this bulletin contains articles on the silk screening stencil process, use of a similar process with a portable mimeograph, and the hectograph process. The first article lists equipment needed to make a silk screen, steps in building…
Willinger, Ulrike; Deckert, Matthias; Schmöger, Michaela; Schaunig-Busch, Ines; Formann, Anton K; Auff, Eduard
2017-12-01
Metaphor is a specific type of figurative language that is used in various important fields such as work with children in clinical or teaching contexts. The aim of the study was to investigate the developmental course, developmental steps, and possible cognitive predictors regarding metaphor processing in childhood and early adolescence. One hundred sixty-four typically developing children (7-year-olds, 9-year-olds) and early adolescents (11-year-olds) were tested for metaphor identification, comprehension, comprehension quality, and preference by the Metaphoric Triads Task as well as for analogical reasoning, information processing speed, cognitive flexibility under time pressure, and cognitive flexibility without time pressure. Metaphor identification and comprehension increased consecutively with age. Eleven-year-olds showed significantly higher metaphor comprehension quality and preference scores than seven- and nine-year-olds, whilst these younger age groups did not differ. Age, cognitive flexibility under time pressure, information processing speed, analogical reasoning, and cognitive flexibility without time pressure significantly predicted metaphor comprehension. Metaphorical language ability shows an ongoing development and seemingly changes qualitatively at the beginning of early adolescence. These results can possibly be explained by a greater synaptic reorganization in early adolescents. Furthermore, cognitive flexibility under time pressure and information processing speed possibly facilitate the ability to adapt metaphor processing strategies in a flexible, quick, and appropriate way.
Ren, Xiaojun; Deng, Ruijie; Wang, Lida; Zhang, Kaixiang
2017-01-01
RNA splicing, which mainly involves two transesterification steps, is a fundamental process of gene expression and its abnormal regulation contributes to serious genetic diseases. Antisense oligonucleotides (ASOs) are genetic control tools that can be used to specifically control genes through alteration of the RNA splicing pathway. Despite intensive research, how ASOs or various other factors influence the multiple processes of RNA splicing still remains obscure. This is largely due to an inability to analyze the splicing efficiency of each step in the RNA splicing process with high sensitivity. We addressed this limitation by introducing a padlock probe-based isothermal amplification assay to achieve quantification of the specific products in different splicing steps. With this amplified assay, the roles that ASOs play in RNA splicing inhibition in the first and second steps could be distinguished. We identified that 5′-ASO could block RNA splicing by inhibiting the first step, while 3′-ASO could block RNA splicing by inhibiting the second step. This method provides a versatile tool for assisting efficient ASO design and discovering new splicing modulators and therapeutic drugs. PMID:28989608
Fraccaro, Paolo; Vigo, Markel; Balatsoukas, Panagiotis; Buchan, Iain E; Peek, Niels; van der Veer, Sabine N
2018-03-01
Patient portals are considered valuable conduits for supporting patients' self-management. However, it is unknown why they often fail to impact on health care processes and outcomes. This may be due to a scarcity of robust studies focusing on the steps that are required to induce improvement: users need to effectively interact with the portal (step 1) in order to receive information (step 2), which might influence their decision-making (step 3). We aimed to explore this potential knowledge gap by investigating to what extent each step has been investigated for patient portals, and explore the methodological approaches used. We performed a systematic literature review using Coiera's information value chain as a guiding theoretical framework. We searched MEDLINE and Scopus by combining terms related to patient portals and evaluation methodologies. Two reviewers selected relevant papers through duplicate screening, and one extracted data from the included papers. We included 115 articles. The large majority (n = 104) evaluated aspects related to interaction with patient portals (step 1). Usage was most often assessed (n = 61), mainly by analysing system interaction data (n = 50), with most authors considering participants as active users if they logged in at least once. Overall usability (n = 57) was commonly assessed through non-validated questionnaires (n = 44). Step 2 (information received) was investigated in 58 studies, primarily by analysing interaction data to evaluate usage of specific system functionalities (n = 34). Eleven studies explicitly assessed the influence of patient portals on patients' and clinicians' decisions (step 3). Whereas interaction with patient portals has been extensively studied, their influence on users' decision-making remains under-investigated. Methodological approaches to evaluating usage and usability of portals showed room for improvement. To unlock the potential of patient portals, more (robust) research should focus on better understanding the complex process of how portals lead to improved health and care. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Designing a fixed-blade gang ripsaw arbor with a pencil
Charles J. Gatchell
1996-01-01
This paper presents a step-by-step procedure for designing the "best" sequence of saw spacings for a fixed-blade gang ripsaw arbor. Using the information contained in a cutting bill and knowledge of the lumber width distributions to be processed, thousands of possible saw spacing sequences can be reduced to a few good ones.
Screening for Usher Syndrome: A Hands-On Guide for School Nurses.
ERIC Educational Resources Information Center
Houghton, Joan; Coonts, Teresa; Jordan, Beth; Schafer, Jacqueline, Ed.
This manual was written specifically to help school nurses conduct screenings for Usher syndrome, a genetic condition that involves deafness or hearing loss and the progressive loss of vision. It provides information on the step-by-step process of how to conduct a screening, the actual forms needed for a screening, and resources for referring…
Using Movement to Teach Academics: The Mind and Body as One Entity
ERIC Educational Resources Information Center
Minton, Sandra
2008-01-01
This book was developed to help teach curriculum through the use of movement and dance, while giving students a chance to use their creative problem-solving skills. The text describes a step-by-step process through which instructors and students can learn to transform academic concepts into actions and dances. Theoretical information is also…
Small Craft Advisory!: Cardboard Boat Challenges Students' Research, Design and Construction Skills
ERIC Educational Resources Information Center
Griffis, Kurt; Brand, Lance; Shackelford, Ray
2006-01-01
Throughout history, people have moved themselves and cargo across water in boats and other types of vessels. Most vessels are developed using a technological design process, which typically involves problem solving and a series of steps. The designer documents each step to provide an accurate record of accomplishments and information to guide…
Planning, Promoting and Passing School Tax Issues. [Revised Edition].
ERIC Educational Resources Information Center
Whitman, Robert L.; Pittner, Nicholas A.
This book provides Ohio citizens with information on school tax issues and levy campaigning. The material is presented in a structured, step-by-step process that lends itself to practical application in preparing a levy. This book is a guide to understanding various tax issues, tax reduction factors, and the changing tax duplicate that affects…
Schulze, H Georg; Turner, Robin F B
2015-06-01
High-throughput information extraction from large numbers of Raman spectra is becoming an increasingly taxing problem due to the proliferation of new applications enabled by advances in instrumentation. Fortunately, in many of these applications, the entire process can be automated, yielding reproducibly good results with significant time and cost savings. Information extraction consists of two stages: preprocessing and analysis. We focus here on the preprocessing stage, which typically involves several steps, such as calibration, background subtraction, baseline flattening, artifact removal, smoothing, and so on, before the resulting spectra can be further analyzed. Because the results of some of these steps can affect the performance of subsequent ones, attention must be given to the sequencing of steps, the compatibility of these sequences, and the propensity of each step to generate spectral distortions. We outline here important considerations to effect full automation of Raman spectral preprocessing: what is considered full automation; putative general principles to effect full automation; the proper sequencing of processing and analysis steps; conflicts and circularities arising from sequencing; and the need for, and approaches to, preprocessing quality control. These considerations are discussed and illustrated with biological and biomedical examples reflecting both successful and faulty preprocessing.
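Making each preprocessing step a function and the sequence an explicit argument, as sketched below, is one way to expose the ordering decisions the authors discuss; the individual steps are simple stand-ins, not a validated Raman pipeline.

```python
# The pipeline order is explicit data, so alternative sequencings can be
# compared (or rejected as incompatible) before batch processing.
import numpy as np
from scipy.signal import medfilt, savgol_filter

def despike(y):                       # artifact (spike) removal
    return medfilt(y, kernel_size=5)

def flatten_baseline(y):              # polynomial baseline flattening
    x = np.arange(y.size)
    return y - np.polyval(np.polyfit(x, y, 3), x)

def smooth(y):                        # Savitzky-Golay smoothing
    return savgol_filter(y, window_length=11, polyorder=3)

def preprocess(y, steps):
    for step in steps:                # the sequencing is a design decision
        y = step(y)
    return y

spectrum = np.random.default_rng(0).normal(0, 1, 200) + np.linspace(0, 5, 200)
clean = preprocess(spectrum, [despike, flatten_baseline, smooth])
```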
NASA Technical Reports Server (NTRS)
Callender, E. D.; Farny, A. M.
1983-01-01
Problem Statement Language/Problem Statement Analyzer (PSL/PSA) applications, which were once a one-step process in which product system information was immediately translated into PSL statements, have in light of experience been shown to result in inconsistent representations. These shortcomings have prompted the development of an intermediate step, designated the Product System Information Model (PSIM), which provides a basis for the mutual understanding of customer terminology and the formal, conceptual representation of that product system in a PSA data base. The PSIM is initially captured as a paper diagram, followed by formal capture in the PSL/PSA data base.
Materials And Processes Technical Information System (MAPTIS) LDEF materials data base
NASA Technical Reports Server (NTRS)
Funk, Joan G.; Strickland, John W.; Davis, John M.
1993-01-01
A preliminary Long Duration Exposure Facility (LDEF) Materials Data Base was developed by the LDEF Materials Special Investigation Group (MSIG). The LDEF Materials Data Base is envisioned to eventually contain the wide variety and vast quantity of materials data generated from LDEF. The data are searchable by optical, thermal, and mechanical properties, exposure parameters (such as atomic oxygen flux), and author(s) or principal investigator(s). The LDEF Materials Data Base was incorporated into the Materials and Processes Technical Information System (MAPTIS). MAPTIS is a collection of materials data which has been computerized and is available to engineers, designers, and researchers in the aerospace community involved in the design and development of spacecraft and related hardware. The LDEF Materials Data Base is described, and step-by-step example searches using the data base are included. Information on how to become an authorized user of the system is also provided.
Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach
2008-06-01
develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step...previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations, the
A methodology to event reconstruction from trace images.
Milliet, Quentin; Delémont, Olivier; Sapin, Eric; Margot, Pierre
2015-03-01
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider the images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step rely on the previous step. However, the methodology is not linear; it is a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase, wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bielik, M.; Vozar, J.; Hegedus, E.; Celebration Working Group
2003-04-01
This contribution reports preliminary results from the first-arrival P-wave seismic tomographic processing of data measured along the profiles CEL01, CEL04, CEL05, CEL06, CEL09, and CEL11. These profiles were measured in the framework of the seismic project CELEBRATION 2000. Data acquisition and geometric parameters of the processed profiles, the principle of the tomographic processing, the particular processing steps, and the program parameters are described. Characteristic data for all profiles (shot points, geophone points, total profile length, sampling, sensors, and record lengths) are given. The fast program package developed by C. Zelt was applied for the tomographic velocity inversion. This process consists of several steps. The first step is the creation of the starting velocity field, for which the calculated arrival times are modelled by the method of finite differences. The next step is minimization of the differences between the measured and modelled arrival times until the deviation is small. The equivalency problem was eliminated by including a priori information in the starting velocity field; this information consists of the depth to the pre-Tertiary basement, estimates of the overlying sedimentary velocities from well logging and other seismic velocity data, etc. After checking the reciprocal times, the pickings were corrected. The final result of the processing is a reliable travel-time curve set consistent with the reciprocal times. We carried out picking of the travel-time curves and enhancement of the signal-to-noise ratio on the seismograms using the PROMAX program system. The tomographic inversion was carried out by a so-called 3D/2D procedure taking 3D wave propagation into account: a corridor along the profile, containing the outlying shot points and geophone points, was defined, and 3D processing was carried out within this corridor. The preliminary results indicate seismically anomalous zones within the crust and the uppermost part of the upper mantle in an area comprising the Western Carpathians, the North European platform, the Pannonian basin, and the Bohemian Massif.
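The iterative minimization of travel-time residuals at the heart of such an inversion can be illustrated with a toy example. The real processing uses finite-difference travel-time modelling and 3D corridors; this sketch substitutes straight rays and a Kaczmarz-style row-by-row update, with made-up geometry and times.

```python
# Toy travel-time tomography: update cell slownesses ray by ray until the
# misfit between observed and modelled first arrivals is small.
import numpy as np

G = np.array([[1.0, 1.0, 0.0],          # ray path lengths through 3 cells, km
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
t_obs = np.array([0.50, 0.55, 0.45])    # observed first arrivals, s
s = np.full(3, 0.20)                    # starting slowness field, s/km

for _ in range(100):                    # Kaczmarz sweeps over the rays
    for g, t in zip(G, t_obs):
        s += g * (t - g @ s) / (g @ g)  # shrink this ray's residual
print("recovered velocities, km/s:", 1.0 / s)
```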
(On)line dancing: choosing an appropriate distance education partner.
Menn, Mindy; Don Chaney, J
2014-05-01
Online-delivered distance education is a burgeoning component of professional development and continuing education. Distance education programs allow individuals to learn in a different location and/or at a different time from fellow learners, thereby increasing the flexibility and number of learning options. Selecting the "right" program for personal development from the ever-growing body of online-delivered education is an individualized decision that can become an overwhelming and challenging process. This Tool presents four important definitions for navigating distance education program description materials and outlines a five-step process to assist in identifying an appropriate program for personal development. The five-step process includes key questions and points to consider while conducting a candid self-assessment, identifying and investigating distance education programs, and then compiling information, comparing programs, and prioritizing a list of programs suitable for application. Furthermore, this Tool highlights important websites for distance education degree program reviews, accreditation information, and open educational resources.
Application of The APA Practice Guidelines on Suicide to Clinical Practice.
Jacobs, Douglas G; Brewer, Margaret L
2006-06-01
This article presents charts from The American Psychiatric Association Practice Guideline for the Assessment and Treatment of Patients with Suicidal Behaviors, part of the Practice Guidelines for the Treatment of Psychiatric Disorders Compendium, and a summary of the assessment information in a format that can be used in routine clinical practice. Four steps in the assessment process are presented: the use of a thorough psychiatric examination to obtain information about the patient's current presentation, history, diagnosis, and to recognize suicide risk factors therein; the necessity of asking very specific questions about suicidal ideation, intent, plans, and attempts; the process of making an estimation of the patient's level of suicide risk is explained; and the use of modifiable risk and protective factors as the basis for treatment planning is demonstrated. Case reports are used to clarify use of each step in this process.
Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L
2012-11-01
Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated by employing process formalism (or formal process definition), a technique derived from software engineering, to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.
Research to Go: Taking an Information Literacy Credit Course Online
ERIC Educational Resources Information Center
Long, Jessica; Burke, John J.; Tumbleson, Beth
2012-01-01
Adapting an existing face-to-face information literacy course that teaches undergraduates how to successfully conduct research and creating an online or hybrid version is a multi-step process. It begins with a desire to reach more students and help them achieve academic success. The primary learning outcomes for any information literacy course are…
21 CFR 212.50 - What production and process controls must I have?
Code of Federal Regulations, 2012 CFR
2012-04-01
... must have adequate production and process controls to ensure the consistent production of a PET drug... all steps in the PET drug production process. The master production and control records must include the following information: (1) The name and strength of the PET drug; (2) If applicable, the name and...
21 CFR 212.50 - What production and process controls must I have?
Code of Federal Regulations, 2014 CFR
2014-04-01
... must have adequate production and process controls to ensure the consistent production of a PET drug... all steps in the PET drug production process. The master production and control records must include the following information: (1) The name and strength of the PET drug; (2) If applicable, the name and...
21 CFR 212.50 - What production and process controls must I have?
Code of Federal Regulations, 2013 CFR
2013-04-01
... must have adequate production and process controls to ensure the consistent production of a PET drug... all steps in the PET drug production process. The master production and control records must include the following information: (1) The name and strength of the PET drug; (2) If applicable, the name and...
Public Participation Procedure in Integrated Transport and Green Infrastructure Planning
NASA Astrophysics Data System (ADS)
Finka, Maroš; Ondrejička, Vladimír; Jamečný, Ľubomír; Husár, Milan
2017-10-01
The dialogue among decision makers and stakeholders is a crucial part of any decision-making process, particularly in integrated transportation planning and planning of green infrastructure, where a multitude of actors is present. Although the theory of public participation is well developed after several decades of research, there is still a lack of practical guidelines due to the specificity of public participation challenges. The paper presents a model of public participation for integrated transport and green infrastructure planning for the international project TRANSGREEN, covering the area of five European countries: Slovakia, the Czech Republic, Austria, Hungary, and Romania. The challenge of the project is to coordinate the efforts of public actors and NGOs in an international environment in the oftentimes precarious projects of transport infrastructure building and green infrastructure development. The project aims at developing an environmentally friendly and safe international transport network. The proposed public participation procedure consists of five main steps: spread of information (passive), collection of information (consultation), intermediate discussion, engagement, and partnership (empowerment). The initial spread of information is a process of communicating with the stakeholders, informing and educating them, and it is based on their willingness to be informed. The methods used in this stage are public displays, newsletters, or press releases. The second step, consultation, is based on conveying the opinions of stakeholders to the decision makers. Polls, surveys, public hearings, or written responses are examples of the multitude of ways to achieve this objective, and the main principle is the openness of stakeholders. The third step is intermediate discussion, where all sides are invited to a dialogue using tools such as public meetings, workshops, or urban walks. The fourth step is engagement, based on humble negotiation, arbitration, and mediation; the collaborative skill needed here is dealing with conflicts. The final step in the procedure is partnership and empowerment, employing methods such as multi-actor decision making, voting, or referenda. The leading principle is cooperation. In this ultimate step, the stakeholders become decision makers themselves, and the success factor is continuous evaluation.
ERIC Educational Resources Information Center
Bruess, Clint E.; Laing, Susan J.
This module covers in nine lessons the anatomy and physiology of the male and female reproductive systems, the birth process, healthy pregnancy, birthing choices, and contraceptive methods. The book provides detailed teacher information sheets, reproducible diagrams and a step-by-step approach to teaching about these topics with candor and ease.…
ERIC Educational Resources Information Center
Ohly, Sandra; Plückthun, Laura; Kissel, Dorothea
2017-01-01
The development of novel and useful ideas is a process that can be described in multiple steps, including information gathering, generating ideas and evaluating ideas. We evaluated a university course that was developed based on design thinking principles which employ similar steps. Our results show that the course was not effective in enhancing…
ERIC Educational Resources Information Center
Stodden, Robert A.; Boone, Rosalie
1986-01-01
Discusses the role of teachers in providing vocational assessment to disabled students. Steps in this process include (1) establish planning team and conduct information search, (2) define purpose, (3) establish basic considerations, (4) formulate assessment model, (5) establish implementation focus, and (6) pilot test and evaluate assessment…
Eini C. Lowell; Dennis R. Becker; Robert Rummer; Debra Larson; Linda Wadleigh
2008-01-01
This research provides an important step in the conceptualization and development of an integrated wildfire fuels reduction system from silvicultural prescription, through stem selection, harvesting, in-woods processing, transport, and market selection. Decisions made at each functional step are informed by knowledge about subsequent functions. Data on the resource...
Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum
Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi
2016-01-01
During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from thermal information alone is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in the thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step, while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781
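To make the CCA step concrete, here is a minimal Python sketch that fits a canonical correlation model between paired (randomly generated, hypothetical) thermal and visible image vectors and reconstructs the visible appearance of a new thermal input. It illustrates the general technique only, not the authors' exact whole-image-plus-patch pipeline.

```python
# Minimal sketch of CCA-based cross-spectrum reconstruction; the data and
# dimensions are hypothetical stand-ins, not the paper's face dataset.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
X_thermal = rng.normal(size=(200, 64))  # flattened thermal face vectors
Y_visible = rng.normal(size=(200, 64))  # paired visible-spectrum vectors

cca = CCA(n_components=16)
cca.fit(X_thermal, Y_visible)

# Project training thermal images into the shared canonical space, then
# learn a least-squares map from canonical scores to visible pixels.
Xc, _ = cca.transform(X_thermal, Y_visible)
B, *_ = np.linalg.lstsq(Xc, Y_visible - Y_visible.mean(axis=0), rcond=None)

# "First step" analogue: reconstruct a whole image from a new thermal input.
x_new = rng.normal(size=(1, 64))
y_hat = cca.transform(x_new) @ B + Y_visible.mean(axis=0)
# The paper's second step would repeat the same idea on local patches.
```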
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Kennedy, John M.; White, Terry F.
1991-01-01
Phase 1 of a four-part study was undertaken to study the use of scientific and technical information (STI) by U.S. aerospace engineers and scientists. Specific attention was paid to institutional and socioeconomic variables and to the step-by-step process of information gathering used by the respondents. Data were collected by means of three self-administered mail-back questionnaires. The approximately 34,000 members of the American Institute of Aeronautics and Astronautics (AIAA) served as the study population. More than 65 percent of the randomly selected respondents returned the questionnaires in each of the three groups. Respondents relied more heavily upon informal sources of information than formal sources and turned to librarians and other technical information specialists only when they did not obtain results via informal means or their own formal searches.
An industrial ecology approach to municipal solid waste ...
Municipal solid waste (MSW) can be viewed as a feedstock for industrial ecology inspired conversions of wastes to valuable products and energy. The industrial ecology principle of symbiotic processes using waste streams to create value-added products is applied to MSW, with examples suggested for various residual streams. A methodology is presented for considering individual waste-to-energy or waste-to-product system synergies, evaluating the economic and environmental issues associated with each system. The methodology's steps include identifying waste streams, specific waste components of interest, and conversion technologies, plus determining the economic and environmental effects of using wastes and changes due to transport, administrative handling, and processing. In addition to presenting the methodology, technologies for various MSW input streams are categorized as commercialized or demonstrated, providing summarized information to organizations that are considering processes for MSW; such organizations can also follow the methodology to analyze processes of interest. The paper presents information useful for analyzing the sustainability of alternatives for the management of municipal solid waste.
Pre-eruptive magmatic processes re-timed using a non-isothermal approach to magma chamber dynamics.
Petrone, Chiara Maria; Bugatti, Giuseppe; Braschi, Eleonora; Tommasini, Simone
2016-10-05
Constraining the timescales of pre-eruptive magmatic processes in active volcanic systems is paramount to understand magma chamber dynamics and the triggers for volcanic eruptions. Temporal information of magmatic processes is locked within the chemical zoning profiles of crystals but can be accessed by means of elemental diffusion chronometry. Mineral compositional zoning testifies to the occurrence of substantial temperature differences within magma chambers, which often bias the estimated timescales in the case of multi-stage zoned minerals. Here we propose a new Non-Isothermal Diffusion Incremental Step model to take into account the non-isothermal nature of pre-eruptive processes, deconstructing the main core-rim diffusion profiles of multi-zoned crystals into different isothermal steps. The Non-Isothermal Diffusion Incremental Step model represents a significant improvement in the reconstruction of crystal lifetime histories. Unravelling stepwise timescales at contrasting temperatures provides a novel approach to constraining pre-eruptive magmatic processes and greatly increases our understanding of magma chamber dynamics.
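The bookkeeping behind an incremental-step approach can be sketched briefly: with an Arrhenius diffusivity D(T) = D0·exp(-Ea/RT), the width of a core-rim diffusion profile scales with the D(T_i)·t_i accumulated over the isothermal steps. The parameters and thermal history below are placeholder assumptions, not the paper's calibrations.

```python
# Sketch: accumulate diffusive relaxation over isothermal steps.
# D0, Ea, and the (temperature, duration) steps are hypothetical.
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def diffusivity(T, D0=1e-6, Ea=250e3):
    """Arrhenius diffusivity in m^2/s (placeholder parameters)."""
    return D0 * np.exp(-Ea / (R * T))

year = 3.156e7  # seconds per year
steps = [(1423.0, 5.0), (1373.0, 20.0), (1323.0, 40.0)]  # (K, years)

# The profile width goes as the square root of the accumulated D*t.
Dt = sum(diffusivity(T) * t * year for T, t in steps)
print(f"effective diffusion length ~ {np.sqrt(Dt) * 1e6:.0f} micrometers")
```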
Pre-eruptive magmatic processes re-timed using a non-isothermal approach to magma chamber dynamics
Petrone, Chiara Maria; Bugatti, Giuseppe; Braschi, Eleonora; Tommasini, Simone
2016-01-01
Constraining the timescales of pre-eruptive magmatic processes in active volcanic systems is paramount to understand magma chamber dynamics and the triggers for volcanic eruptions. Temporal information of magmatic processes is locked within the chemical zoning profiles of crystals but can be accessed by means of elemental diffusion chronometry. Mineral compositional zoning testifies to the occurrence of substantial temperature differences within magma chambers, which often bias the estimated timescales in the case of multi-stage zoned minerals. Here we propose a new Non-Isothermal Diffusion Incremental Step model to take into account the non-isothermal nature of pre-eruptive processes, deconstructing the main core-rim diffusion profiles of multi-zoned crystals into different isothermal steps. The Non-Isothermal Diffusion Incremental Step model represents a significant improvement in the reconstruction of crystal lifetime histories. Unravelling stepwise timescales at contrasting temperatures provides a novel approach to constraining pre-eruptive magmatic processes and greatly increases our understanding of magma chamber dynamics. PMID:27703141
MIRADS-2 Implementation Manual
NASA Technical Reports Server (NTRS)
1975-01-01
The Marshall Information Retrieval and Display System (MIRADS), a data base management system designed to provide the user with a set of generalized file capabilities, is presented. The system provides a wide variety of ways to process the contents of the data base and includes capabilities to search, sort, compute, update, and display the data. The process of creating, defining, and loading a data base is generally called the loading process. The steps in the loading process, which include (1) structuring, (2) creating, (3) defining, and (4) implementing the data base for use by MIRADS, are defined. The execution of several computer programs is required to successfully complete all steps of the loading process. The MIRADS library must be established as a cataloged mass storage file as the first step in MIRADS implementation; the procedure for establishing the library is given. The system is currently operational for the UNIVAC 1108 computer system utilizing the Executive Operating System. All procedures relate to the use of MIRADS on the U-1108 computer.
NASA Technical Reports Server (NTRS)
Cobb, Sharon
2017-01-01
NASA has a phased approach to ensure our nation's leadership in space exploration, beginning in Earth orbit, developing our skills in lunar space, and extending those skills and technologies to a human mission to Mars. We're currently in Phase 0, using the ISS to better understand living and working in space. You may have heard about our "twin study" with astronauts Scott and Mark Kelly that's giving us valuable information on the effects of microgravity environments on the human body during long stays in LEO. During Phase 1 in the 2020s, SLS will be used to lift the pieces of a "deep space gateway" outpost to lunar orbit. Developing and operating the gateway will get us to Mars in a step-by-step fashion, with lessons learned in each phase of the process informing the next steps. The first step in moving humans farther into the solar system is completing and flying SLS and Orion.
Combined process automation for large-scale EEG analysis.
Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E
2012-01-01
Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
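A minimal sketch of such a linked pipeline is shown below, with an assumed sampling rate, band edges, and threshold rule standing in for the authors' parameters.

```python
# Sketch of linked EEG processing steps: band filtering, crude spike
# detection, and power spectral density. Parameters are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 1000.0  # Hz, assumed sampling rate
eeg = np.random.default_rng(1).normal(size=int(60 * fs))  # stand-in trace

# Step 2: user-defined band frequency waveform (here 4-12 Hz).
b, a = butter(4, [4 / (fs / 2), 12 / (fs / 2)], btype="band")
band = filtfilt(b, a, eeg)

# Steps 3-4: threshold-crossing spike detection and quantification.
thresh = 4 * band.std()
spikes = np.flatnonzero((band[1:] >= thresh) & (band[:-1] < thresh))
print(f"{spikes.size} threshold crossings detected")

# Step 5: power spectral density.
freqs, psd = welch(band, fs=fs, nperseg=2048)
```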
Learning and study strategies correlate with medical students' performance in anatomical sciences.
Khalil, Mohammed K; Williams, Shanna E; Gregory Hawkins, H
2018-05-06
Much of the content delivered during medical students' preclinical years is assessed nationally by such testing as the United States Medical Licensing Examination® (USMLE®) Step 1 and Comprehensive Osteopathic Medical Licensing Examination® (COMLEX-USA®) Step 1. Improvement of students' study/learning strategy skills is associated with academic success in internal and external (USMLE Step 1) examinations. This research explores the strength of association between Learning and Study Strategies Inventory (LASSI) scores and student performance in the anatomical sciences and USMLE Step 1 examinations. The LASSI inventory assesses learning and study strategies based on ten subscale measures. These subscales include three components of strategic learning: skill (Information processing, Selecting main ideas, and Test strategies), will (Anxiety, Attitude, and Motivation) and self-regulation (Concentration, Time management, Self-testing, and Study aids). During second year (M2) orientation, 180 students (Classes of 2016, 2017, and 2018) were administered the LASSI survey instrument. Pearson Product-Moment correlation analyses identified significant associations between five of the ten LASSI subscales (Anxiety, Information processing, Motivation, Selecting main ideas, and Test strategies) and students' performance in the anatomical sciences and USMLE Step 1 examinations. Identification of students lacking these skills within the anatomical sciences curriculum allows targeted interventions, which not only maximize academic achievement in an aspect of an institution's internal examinations, but in the external measure of success represented by USMLE Step 1 scores. Anat Sci Educ 11: 236-242. © 2017 American Association of Anatomists.
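For readers unfamiliar with the statistic, the subscale-to-score association reported here is an ordinary Pearson correlation; a sketch with invented numbers:

```python
# Sketch of a Pearson correlation between one hypothetical LASSI
# subscale and Step 1 scores; all data are invented.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
test_strategies = rng.normal(60, 10, size=180)                # subscale score
step1 = 220 + 0.8 * test_strategies + rng.normal(0, 12, 180)  # exam score

r, p = pearsonr(test_strategies, step1)
print(f"r = {r:.2f}, p = {p:.2g}")
```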
A preliminary model of work during initial examination and treatment planning appointments.
Irwin, J Y; Torres-Urquidy, M H; Schleyer, T; Monaco, V
2009-01-10
Objective: This study's objective was to formally describe the work process for charting and treatment planning in general dental practice to inform the design of a new clinical computing environment. Methods: Using a process called contextual inquiry, researchers observed 23 comprehensive examination and treatment planning sessions during 14 visits to 12 general US dental offices. For each visit, field notes were analysed and reformulated as formalised models. Subsequently, each model type was consolidated across all offices and visits. Interruptions to the workflow, called breakdowns, were identified. Results: Clinical work during dental examination and treatment planning appointments is a highly collaborative activity involving dentists, hygienists and assistants. Personnel with multiple overlapping roles complete complex multi-step tasks supported by a large and varied collection of equipment, artifacts and technology. Most of the breakdowns were related to technology, which interrupted the workflow, caused rework and increased the number of steps in work processes. Conclusion: Current dental software could be significantly improved with regard to its support for communication and collaboration, workflow, information design and presentation, information content, and data entry.
CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 3
2012-06-01
OMG) standard Business Process Modeling and Notation (BPMN) [6] graphical notation. I will address each of these: identify and document steps...to a value stream map using BPMN and textual process narratives. The resulting process narratives or process metadata includes key information...objectives. Once the processes are identified we can graphically document them, capturing the process using BPMN (see Figure 1). The BPMN models
Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study
2007-06-01
Modeling Notation (BPMN) • Business Process Definition Metamodel (BPDM). A Business Process (BP) is a defined sequence of steps to be executed in...enterprise applications, to evaluate the capabilities of suppliers, and to compare against the competition. BPMN standardizes flowchart diagrams that
Development of an E-mail Application Seemit and its Utilization in an Information Literacy Course
NASA Astrophysics Data System (ADS)
Kita, Toshihiro; Miyazaki, Makoto; Nakano, Hiroshi; Sugitani, Kenichi; Akiyama, Hidenori
We have developed a simple e-mail application named Seemit, designed for use in information literacy courses. It has the necessary and sufficient functionality of an e-mail application and was developed so that the basic operations and mechanisms of e-mail transfer can be learned easily. It is equipped with a function to automatically set the configuration of the user's SMTP/POP servers, e-mail address, etc. The process of transferring e-mail via SMTP and POP can be demonstrated step by step, showing the actual messages passed during the client-server interaction. We have used Seemit in a university-wide information literacy course attended by about 1800 students.
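The SMTP/POP round trip that Seemit walks students through can be sketched with Python's standard smtplib and poplib modules. Host names and credentials below are placeholders, and this is not Seemit's own code.

```python
# Sketch of the e-mail transfer steps demonstrated in class: submit a
# message over SMTP, then retrieve it over POP3. Servers are placeholders.
import poplib
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "student@example.ac.jp"
msg["To"] = "teacher@example.ac.jp"
msg["Subject"] = "Exercise 1"
msg.set_content("Hello from the information literacy course.")

# Client -> SMTP server: EHLO, MAIL FROM, RCPT TO, DATA.
with smtplib.SMTP("smtp.example.ac.jp") as smtp:
    smtp.send_message(msg)

# Client -> POP server: USER/PASS, STAT, RETR, QUIT.
pop = poplib.POP3("pop.example.ac.jp")
pop.user("student")
pop.pass_("password")
count, _ = pop.stat()
if count:
    for line in pop.retr(count)[1]:  # lines of the newest message
        print(line.decode("utf-8", "replace"))
pop.quit()
```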
NASA Astrophysics Data System (ADS)
von Ruette, Jonas; Lehmann, Peter; Fan, Linfeng; Bickel, Samuel; Or, Dani
2017-04-01
Landslides and subsequent debris flows initiated by rainfall represent a ubiquitous natural hazard in steep mountainous regions. We integrated a landslide hydro-mechanical triggering model and associated debris flow runout pathways with a graphical user interface (GUI) to represent these natural hazards in a wide range of catchments across the globe. The STEP-TRAMM GUI provides process-based locations and sizes of landslide patterns using digital elevation models (DEM) from the SRTM database (30 m resolution) linked with soil maps from the global SoilGrids database (250 m resolution) and satellite-based information on rainfall statistics for the selected region. In a preprocessing step STEP-TRAMM models the soil depth distribution and complements soil information to jointly capture key hydrological and mechanical properties relevant to representing local soil failure. In the presentation we will discuss features of this publicly available platform and compare landslide and debris flow patterns for different regions considering representative intense rainfall events. Model outcomes will be compared for different spatial and temporal resolutions to test the applicability of web-based information on elevation and rainfall for hazard assessment.
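As background to the hydro-mechanical triggering idea, the sketch below evaluates the classic infinite-slope factor of safety with pore pressure, a textbook criterion of the kind such models build on; it is not the STEP-TRAMM formulation, and all parameter values are assumptions.

```python
# Sketch: infinite-slope factor of safety,
# FS = (c' + (gamma*z*cos^2(b) - u) * tan(phi')) / (gamma*z*sin(b)*cos(b)).
# All soil parameters are placeholder assumptions.
import numpy as np

def factor_of_safety(slope_deg, z, c=5e3, phi_deg=30.0,
                     gamma=18e3, pore_pressure=0.0):
    b, phi = np.radians(slope_deg), np.radians(phi_deg)
    resisting = c + (gamma * z * np.cos(b) ** 2 - pore_pressure) * np.tan(phi)
    driving = gamma * z * np.sin(b) * np.cos(b)
    return resisting / driving

# Rainfall raises pore pressure and pushes FS toward failure (FS < 1).
print(factor_of_safety(35.0, z=2.0, pore_pressure=0.0))
print(factor_of_safety(35.0, z=2.0, pore_pressure=10e3))
```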
Lamm, Claus; Windischberger, Christian; Moser, Ewald; Bauer, Herbert
2007-07-15
Subjects deciding whether two objects presented at angular disparity are identical or mirror versions of each other usually show response times that linearly increase with the angle between objects. This phenomenon has been termed mental rotation. While there is widespread agreement that parietal cortex plays a dominant role in mental rotation, reports concerning the involvement of motor areas are less consistent. From a theoretical point of view, activation in motor areas suggests that mental rotation relies upon visuo-motor rather than visuo-spatial processing alone. However, the type of information that is processed by motor areas during mental rotation remains unclear. In this study we used event-related fMRI to assess whether activation in parietal and dorsolateral premotor areas (dPM) during mental rotation is distinctively related to processing spatial orientation information. Using a newly developed task paradigm we explicitly separated the processing steps (encoding, mental rotation proper and object matching) required by mental rotation tasks and additionally modulated the amount of spatial orientation information that had to be processed. Our results show that activation in dPM during mental rotation is not strongly modulated by the processing of spatial orientation information, and that activation in dPM areas is strongest during mental rotation proper. The latter finding suggests that dPM is involved in more generalized processes such as visuo-spatial attention and movement anticipation. We propose that solving mental rotation tasks is heavily dependent upon visuo-motor processes and evokes neural processing that may be considered as an implicit simulation of actual object rotation.
Singh, Sonal
2013-01-01
Background: Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes. Methods: This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation. Discussion: Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences. PMID:24555077
Maruthur, Nisa M; Joy, Susan; Dolan, James; Segal, Jodi B; Shihab, Hasan M; Singh, Sonal
2013-01-01
Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes. This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation. Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences.
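To illustrate the mechanics behind steps three and four of this protocol, the sketch below turns a pairwise comparison matrix into priority weights via the principal eigenvector and checks Saaty's consistency ratio; the 3x3 matrix is a made-up example, not the study's eight criteria.

```python
# Sketch of AHP weight derivation from a pairwise comparison matrix.
# The matrix entries are invented for illustration.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()  # priority weights summing to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
cr = ci / 0.58                        # Saaty random index for n = 3
print(weights, f"CR = {cr:.3f}")      # CR < 0.1 is conventionally acceptable
```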
Sensorimotor and Cognitive Predictors of Impaired Gait Adaptability in Older People.
Caetano, Maria Joana D; Menant, Jasmine C; Schoene, Daniel; Pelicioni, Paulo H S; Sturnieks, Daina L; Lord, Stephen R
2017-09-01
The ability to adapt gait when negotiating unexpected hazards is crucial to maintain stability and avoid falling. This study investigated whether impaired gait adaptability in a task including obstacle and stepping targets is associated with cognitive and sensorimotor capacities in older adults. Fifty healthy older adults (74±7 years) were instructed to either (a) avoid an obstacle at usual step distance or (b) step onto a target at either a short or long step distance projected on a walkway two heel strikes ahead and then continue walking. Participants also completed cognitive and sensorimotor function assessments. Stroop test and reaction time performance significantly discriminated between participants who did and did not make stepping errors, and poorer Trail-Making test performance predicted shorter penultimate step length in the obstacle avoidance condition. Slower reaction time predicted poorer stepping accuracy; increased postural sway, weaker quadriceps strength, and poorer Stroop and Trail-Making test performances predicted increased number of steps taken to approach the target/obstacle and shorter step length; and increased postural sway and higher concern about falling predicted slower step velocity. Superior executive function, fast processing speed, and good muscle strength and balance were all associated with successful gait adaptability. Processing speed appears particularly important for precise foot placements; cognitive capacity for step length adjustments; and early and/or additional cognitive processing involving the inhibition of a stepping pattern for obstacle avoidance. This information may facilitate fall risk assessments and fall prevention strategies. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
An intraorganizational model for developing and spreading quality improvement innovations.
Kellogg, Katherine C; Gainer, Lindsay A; Allen, Adrienne S; OʼSullivan, Tatum; Singer, Sara J
Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group's traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread.
An intraorganizational model for developing and spreading quality improvement innovations
Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.
2017-01-01
Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788
Medicare Part D Beneficiaries' Plan Switching Decisions and Information Processing.
Han, Jayoung; Urmie, Julie
2017-03-01
Medicare Part D beneficiaries tend not to switch plans despite the government's efforts to engage beneficiaries in the plan switching process. Understanding current and alternative plan features is a necessary step to make informed plan switching decisions. This study explored beneficiaries' plan switching using a mixed-methods approach, with a focus on the concept of information processing. We found large variation in beneficiary comprehension of plan information among both switchers and nonswitchers. Knowledge about alternative plans was especially poor, with only about half of switchers and 2 in 10 nonswitchers being well informed about plans other than their current plan. We also found that helpers had a prominent role in plan decision making: nearly twice as many switchers as nonswitchers worked with helpers for their plan selection. Our study suggests that easier access to helpers, as well as helpers' extensive involvement in the decision-making process, promotes informed plan switching decisions.
Åhlfeldt, Rose-Mharie; Persson, Anne; Rexhepi, Hanife; Wåhlander, Kalle
2016-12-01
This article presents and illustrates the main features of a proposed process-oriented approach for patient information distribution in future health care information systems, by using a prototype of a process support system. The development of the prototype was based on the Visuera method, which includes five defined steps. The results indicate that a visualized prototype is a suitable tool for illustrating both the opportunities and constraints of future ideas and solutions in e-Health. The main challenges for developing and implementing a fully functional process support system concern both technical and organizational/management aspects. © The Author(s) 2015.
Peres Penteado, Alissa; Fábio Maciel, Rafael; Erbs, João; Feijó Ortolani, Cristina Lucia; Aguiar Roza, Bartira; Torres Pisa, Ivan
2015-01-01
The entire kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no defined theoretical map describing this process. From such a representation it is possible to perform analyses, such as identifying bottlenecks and the information and communication technologies (ICTs) that support the process. The aim of this study was to analyze and represent the kidney transplantation workflow using business process modeling notation (BPMN) and then to identify the ICTs involved in the process. This study was conducted in eight steps, including document analysis and professional evaluation. The results include the BPMN model of the kidney transplantation process in Brazil and the identification of ICTs. We found that the process suffers significant delays because many different ICTs are involved, which can leave information poorly integrated.
Investigation of correlation classification techniques
NASA Technical Reports Server (NTRS)
Haskell, R. E.
1975-01-01
A two-step classification algorithm for processing multispectral scanner data was developed and tested. The first step is a single pass clustering algorithm that assigns each pixel, based on its spectral signature, to a particular cluster. The output of that step is a cluster tape in which a single integer is associated with each pixel. The cluster tape is used as the input to the second step, where ground truth information is used to classify each cluster using an iterative method of potentials. Once the clusters have been assigned to classes the cluster tape is read pixel-by-pixel and an output tape is produced in which each pixel is assigned to its proper class. In addition to the digital classification programs, a method of using correlation clustering to process multispectral scanner data in real time by means of an interactive color video display is also described.
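A minimal modern analogue of the two-step scheme, with k-means standing in for the single-pass clustering and a majority vote over ground truth standing in for the iterative method of potentials, might look like:

```python
# Sketch: cluster pixel spectra, then label each cluster from ground
# truth. Data, band count, and class count are invented.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
pixels = rng.normal(size=(10000, 4))    # 4-band spectral signatures
truth = rng.integers(0, 3, size=10000)  # ground-truth classes

# Step 1: the "cluster tape" -- one cluster id per pixel.
clusters = KMeans(n_clusters=12, n_init=10).fit_predict(pixels)

# Step 2: assign each cluster to its majority ground-truth class,
# then relabel every pixel through that mapping.
cluster_to_class = {c: np.bincount(truth[clusters == c]).argmax()
                    for c in range(12)}
classified = np.vectorize(cluster_to_class.get)(clusters)
```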
Next Steps After Your Diagnosis: Finding Information and Support
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Kennedy, John M.; White, Terry F.
1991-01-01
Phase 1 of a four-part study was undertaken to investigate the use of scientific and technical information (STI) by U.S. aerospace engineers and scientists. Specific attention was paid to institutional and sociometric variables and to the step-by-step process of information gathering used by the respondents. Data were collected by means of three self-administered mail-back questionnaires. The approximately 34,000 members of the American Institute of Aeronautics and Astronautics served as the study population. More than 65 percent of the randomly selected respondents returned the questionnaires in each of the three groups. Respondents relied more heavily on informal sources of information than formal sources and turned to librarians and other technical information specialists only when they did not obtain results via informal means or their own formal searches. The report includes frequency distributions for the questions.
33 CFR 385.11 - Implementation process for projects.
Code of Federal Regulations, 2010 CFR
2010-07-01
... figure 1 in Appendix A of this part. Typical steps in this process involve: (a) Project Management Plan. The Project Management Plan describes the activities, tasks, and responsibilities that will be used to... effectiveness of the project and to provide information that will be used for the adaptive management program. ...
Needs Assessment: Who Needs It?
ERIC Educational Resources Information Center
Hays, Donald G.; Linn, Joan K.
This monograph addresses the issue of needs assessment in the educational process and how it applies to the school counselor's role. The authors provide information on the process of needs assessment, from the initial step of obtaining commitment to the final outcome of improved program planning and development. Using an example common to many…
44 CFR 9.6 - Decision-making process.
Code of Federal Regulations, 2012 CFR
2012-10-01
... HOMELAND SECURITY GENERAL FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.6 Decision-making process. (a) Purpose. The purpose of this section is to set out the floodplain management and wetlands... light of the information gained in Steps 4 and 5. FEMA shall not act in a floodplain or wetland unless...
44 CFR 9.6 - Decision-making process.
Code of Federal Regulations, 2011 CFR
2011-10-01
... HOMELAND SECURITY GENERAL FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.6 Decision-making process. (a) Purpose. The purpose of this section is to set out the floodplain management and wetlands... light of the information gained in Steps 4 and 5. FEMA shall not act in a floodplain or wetland unless...
44 CFR 9.6 - Decision-making process.
Code of Federal Regulations, 2013 CFR
2013-10-01
... HOMELAND SECURITY GENERAL FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.6 Decision-making process. (a) Purpose. The purpose of this section is to set out the floodplain management and wetlands... light of the information gained in Steps 4 and 5. FEMA shall not act in a floodplain or wetland unless...
44 CFR 9.6 - Decision-making process.
Code of Federal Regulations, 2014 CFR
2014-10-01
... HOMELAND SECURITY GENERAL FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.6 Decision-making process. (a) Purpose. The purpose of this section is to set out the floodplain management and wetlands... light of the information gained in Steps 4 and 5. FEMA shall not act in a floodplain or wetland unless...
33 CFR 385.11 - Implementation process for projects.
Code of Federal Regulations, 2011 CFR
2011-07-01
... figure 1 in Appendix A of this part. Typical steps in this process involve: (a) Project Management Plan. The Project Management Plan describes the activities, tasks, and responsibilities that will be used to... effectiveness of the project and to provide information that will be used for the adaptive management program. ...
33 CFR 385.11 - Implementation process for projects.
Code of Federal Regulations, 2014 CFR
2014-07-01
... figure 1 in appendix A of this part. Typical steps in this process involve: (a) Project Management Plan. The Project Management Plan describes the activities, tasks, and responsibilities that will be used to... effectiveness of the project and to provide information that will be used for the adaptive management program. ...
33 CFR 385.11 - Implementation process for projects.
Code of Federal Regulations, 2012 CFR
2012-07-01
... figure 1 in Appendix A of this part. Typical steps in this process involve: (a) Project Management Plan. The Project Management Plan describes the activities, tasks, and responsibilities that will be used to... effectiveness of the project and to provide information that will be used for the adaptive management program. ...
33 CFR 385.11 - Implementation process for projects.
Code of Federal Regulations, 2013 CFR
2013-07-01
... figure 1 in Appendix A of this part. Typical steps in this process involve: (a) Project Management Plan. The Project Management Plan describes the activities, tasks, and responsibilities that will be used to... effectiveness of the project and to provide information that will be used for the adaptive management program. ...
Implementing an ROI Measurement Process at Dell Computer.
ERIC Educational Resources Information Center
Tesoro, Ferdinand
1998-01-01
This return-on-investment (ROI) evaluation study determined the business impact of the sales negotiation training course to Dell Computer Corporation. A five-step ROI measurement process was used: Plan-Develop-Analyze-Communicate-Leverage. The corporate sales information database was used to compare pre- and post-training metrics for both training…
A fusion approach for coarse-to-fine target recognition
NASA Astrophysics Data System (ADS)
Folkesson, Martin; Grönwall, Christina; Jungert, Erland
2006-04-01
A fusion approach in a query-based information system is presented. The system is designed for querying multimedia data bases, and is here applied to target recognition using heterogeneous data sources. The recognition process is coarse-to-fine, with an initial attribute estimation step and a following matching step. Several sensor types and algorithms are involved in each of these two steps. The matching results are observed to be independent of the origin of the estimation results, which allows data to be distributed between algorithms in an intermediate fusion step without risk of data incest. This increases the overall chance of recognising the target. An implementation of the system is described.
An Update on the NASA Planetary Science Division Research and Analysis Program
NASA Astrophysics Data System (ADS)
Richey, Christina; Bernstein, Max; Rall, Jonathan
2015-01-01
Introduction: NASA's Planetary Science Division (PSD) solicits its Research and Analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD will be changing the structure of the program elements under which the majority of planetary science R&A is done. Major changes include the creation of five core research program elements aligned with PSD's strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2014 submission changes: All PSD programs will use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed by the proposal, a brief description of the methodology to be used to address the science goals and objectives, and the relevance of the proposed research to the call to which it is submitted. Additional Information: Additional details will be provided on the Cassini Data Analysis Program, the Exoplanets Research Program, and the Discovery Data Analysis Program, for which Dr. Richey is the Lead Program Officer.
Strategic, Organizational and Standardization Aspects of Integrated Information Systems. Volume 6.
1987-12-01
...reasons (such as the desired level of processing power and the amount of storage space), organizational reasons (such as each department obtaining its...of processing power falls, Abbott can afford to subordinate efficient processing for organizational effectiveness. 4. Steps in an Analytical Process
ASPECTS: an automation-assisted SPE method development system.
Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu
2013-07-01
A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combinations of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly reduces the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
An algorithmic approach to crustal deformation analysis
NASA Technical Reports Server (NTRS)
Iz, Huseyin Baki
1987-01-01
In recent years the analysis of crustal deformation measurements has become important as a result of current improvements in geodetic methods and an increasing amount of theoretical and observational data provided by several earth sciences. A first-generation data analysis algorithm which combines a priori information with current geodetic measurements was proposed. Relevant methods which can be used in the algorithm were discussed. Prior information is the unifying feature of this algorithm. Some of the problems which may arise through the use of a priori information in the analysis were indicated and preventive measures were demonstrated. The first step in the algorithm is the optimal design of deformation networks. The second step identifies the descriptive model of the deformation field. The final step is the improved estimation of deformation parameters. Although deformation parameters are estimated in the process of model discrimination, they can be further improved by the use of a priori information about them. According to the proposed algorithm this information must first be tested against the estimates calculated using the sample data only. Null-hypothesis testing procedures were developed for this purpose. Six different estimators which employ a priori information were examined. Emphasis was put on the case when the prior information is wrong, and analytical expressions for possible improvements under incompatible prior information were derived.
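One generic estimator of this kind combines prior parameter values with new observations in weighted least squares, x_hat = (A'PA + P0)^-1 (A'Pl + P0*x0); the sketch below uses placeholder matrices and is not any of the paper's six specific estimators.

```python
# Sketch: weighted least squares with a priori parameter information.
# Design matrix, noise levels, and priors are placeholders.
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(30, 3))                  # design matrix of the new survey
x_true = np.array([2.0, -1.0, 0.5])
l = A @ x_true + rng.normal(0, 0.1, size=30)  # observations

P = np.eye(30) / 0.1**2                # observation weight matrix
x0 = np.array([1.8, -0.9, 0.6])        # prior deformation parameters
P0 = np.eye(3) / 0.2**2                # prior weights (inverse covariance)

x_hat = np.linalg.solve(A.T @ P @ A + P0, A.T @ P @ l + P0 @ x0)
print(x_hat)  # pulled between the survey-only solution and the prior x0
```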
Coeur d'Alene Tribal Production Facility, Volume I of III, 2002-2003 Progress Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anders, Paul
2003-01-01
In fulfillment of the NWPPC's 3-Step Process for the implementation of new hatcheries in the Columbia Basin, this Step 1 submission package to the Council includes four items: (1) Cover letter from the Coeur d'Alene Tribe, Interdisciplinary Team Chair, and the USFWS; (2) References to key information (Attachments 1-4); (3) The updated Master Plan for the Tribe's native cutthroat restoration project; and (4) Appendices. In support of the Master Plan submitted by the Coeur d'Alene Tribe, the reference chart (Item 2) was developed to allow reviewers to quickly access information necessary for accurate peer review. The Northwest Power Planning Council identified pertinent issues to be addressed in the master planning process for new artificial production facilities. References to this key information are provided in three attachments: (1) NWPPC Program language regarding the Master Planning Process, (2) Questions Identified in the September 1997 Council Policy, and (3) Program language identified by the Council's Independent Scientific Review Panel (ISRP). To meet the need for off-site mitigation for fish losses on the mainstem Columbia River, in a manner consistent with the objectives of the Council's Program, the Coeur d'Alene Tribe is proposing that the BPA fund the design, construction, operation, and maintenance of a trout production facility located adjacent to Coeur d'Alene Lake on the Coeur d'Alene Indian Reservation. The updated Master Plan (Item 3) represents the needs associated with the re-evaluation of the Coeur d'Alene Tribe's Trout Production Facility (No. 199004402). This plan addresses issues and concerns expressed by the NWPPC as part of the issue summary for the Mountain Columbia provincial review and the 3-step hatchery review process. Finally, Item 4 (Appendices) documents the 3-Step process correspondence to date between the Coeur d'Alene Tribe and additional relevant entities. Item 4 provides a chronological account of previous ISRP reviews, official Coeur d'Alene fisheries program responses to a series of ISRP reviews, master planning documentation, and annual reports dating back to 1990. Collectively, the materials provided by the Coeur d'Alene Tribe in this Step-1 submission package comprehensively assess key research, habitat improvement activities, and hatchery production issues to best protect and enhance native cutthroat trout populations and the historically and culturally important tribal fisheries they support.
Access information and tools to support the CHP project development process, including identifying if your facility is a good fit for CHP, the steps involved with CHP project development, and policies and incentives supportive of CHP.
Technology Readiness Level Guidebook
DOT National Transportation Integrated Search
2017-09-01
This guidebook provides the necessary information for conducting a Technology Readiness Level (TRL) Assessment. TRL Assessments are a tool for determining the maturity of technologies and identifying next steps in the research process. This guidebook...
Developing an orientation program.
Edwards, K
1999-01-01
When the local area experienced tremendous growth and change, the radiology department at Maury Hospital in Columbia, Tennessee looked seriously at its orientation process in preparation for hiring additional personnel. It was an appropriate time for the department to review its orientation process and to develop a manual to serve as both a tool for supervisors and an ongoing reference for new employees. To gather information for the manual, supervisors were asked to identify information they considered vital for new employees to know concerning the daily operations of the department, its policies and procedures, the organizational structure of the hospital, and hospital and departmental computer systems. That information became the basis of the orientation manual, and provided an introduction to the hospital and radiology department; the structure of the organization; an overview of the radiology department; personnel information; operating procedures and computer systems; and various policies and procedures. With the manual complete, the radiology department concentrated on an orientation process that would meet the needs of supervisors who said they had trouble remembering the many details necessary to teach new employees. A pre-orientation checklist was developed, which contained the many details supervisors must handle between the time an employee is hired and arrives for work. The next step was the creation of a checklist for use by the supervisor during a new employee's first week on the job. A final step in the hospital's orientation program is to have each new employee evaluate the entire orientation process. That information is then used to update and revise the manual.
Using Learning Analytics to Characterize Student Experimentation Strategies in Engineering Design
ERIC Educational Resources Information Center
Vieira, Camilo; Goldstein, Molly Hathaway; Purzer, Senay; Magana, Alejandra J.
2016-01-01
Engineering design is a complex process both for students to participate in and for instructors to assess. Informed designers use the key strategy of conducting experiments as they test ideas to inform next steps. Conversely, beginning designers experiment less, often with confounding variables. These behaviours are not easy to assess in…
ERIC Educational Resources Information Center
Bedford, Denise A. D.
2015-01-01
The knowledge life cycle is applied to two core capabilities of library and information science (LIS) education--teaching, and research and development. The knowledge claim validation, invalidation and integration steps of the knowledge life cycle are translated to learning, unlearning and relearning processes. Mixed methods are used to determine…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
... and by educating the public, especially young people, about tobacco products and the dangers their use... identified. When FDA receives tobacco-specific adverse event and product problem information, it will use the... quality problem, or product use error occurs. This risk identification process is the first necessary step...
A Guide to Networking for K-12 Schools.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
The purpose of this guide is to provide basic networking information and planning assistance for technology coordinators and others involved in building networks for K-12 schools. The information in this guide focuses on the first few steps in the networking process. It reviews planning considerations and network design issues facing educators who…
Pre-School Students' Informal Acquisitions Regarding the Concepts of Point and Straight Line
ERIC Educational Resources Information Center
Orbay, Keziban; Develi, Mehmet Hikmet
2015-01-01
This study aimed to investigate the informal cognitive structures regarding "point" and "straight line"--two basic and undefined terms of geometry--in children registered in preschool, the step preceding the formal in-class education process. The study was conducted with the participation of 50 children enrolled in nursery,…
Interactivity Between Proteges and Scientists in an Electronic Mentoring Program
ERIC Educational Resources Information Center
Bonnett, Cara; Wildemuth, Barbara M.; Sonnenwald, Diane H.
2006-01-01
Interactivity is defined by Henri (1992) as a three-step process involving communication of information, a response to this information, and a reply to that first response. It is a key dimension of computer-mediated communication, particularly in the one-on-one communication involved in an electronic mentoring program. This report analyzes the…
A Framework for Integrating Environmental Justice in Regulatory Analysis
Nweke, Onyemaechi C.
2011-01-01
With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235
MO-D-213-01: Workflow Monitoring for a High Volume Radiation Oncology Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laub, S; Dunn, M; Galbreath, G
2015-06-15
Purpose: Implement a center-wide communication system that increases interdepartmental transparency and accountability while decreasing redundant work and treatment delays by actively monitoring treatment planning workflow. Methods: Intake Management System (IMS), a program developed by ProCure Treatment Centers Inc., is a multi-function database that stores treatment planning process information. It was devised to work with the oncology information system (Mosaiq) to streamline interdepartmental workflow. Each step in the treatment planning process is visually represented and timelines for completion of individual tasks are established within the software. The currently active step of each patient's planning process is highlighted either red or green according to whether the initially allocated amount of time has passed for the given process. This information is displayed as a Treatment Planning Process Monitor (TPPM), which is shown on screens in the relevant departments throughout the center. This display also includes the individuals who are responsible for each task. IMS is driven by Mosaiq's quality checklist (QCL) functionality. Each step in the workflow is initiated by a Mosaiq user sending the responsible party a QCL assignment. IMS is connected to Mosaiq, and the sending or completing of a QCL updates the associated field in the TPPM to the appropriate status. Results: Approximately one patient a week is identified during the workflow process as needing to have his/her treatment start date modified or resources re-allocated to address the most urgent cases. Being able to identify a realistic timeline for planning each patient and having multiple departments communicate their limitations and time constraints allows quality plans to be developed and implemented without overburdening any one department. Conclusion: Monitoring the progression of the treatment planning process has increased transparency between departments, which enables efficient communication. Having built-in timelines allows easy prioritization of tasks and resources and facilitates effective time management.
Alvarsson, Jonathan; Andersson, Claes; Spjuth, Ola; Larsson, Rolf; Wikberg, Jarl E S
2011-05-20
Compound profiling and drug screening generate large amounts of data and are generally based on microplate assays. Current information systems used for handling these data are mainly commercial, closed source, expensive, and heavyweight; there is a need for a flexible, lightweight, open system for handling plate design, data validation, and data preparation. A Bioclipse plugin consisting of a client part and a relational database was constructed. A multiple-step, point-and-click plate layout interface was implemented inside Bioclipse. The system contains a data validation step, where outliers can be removed, and finally produces a plate report with all relevant calculated data, including dose-response curves. The resulting system, named Brunn, is capable of handling the data from microplate assays: it can create dose-response curves and calculate IC50 values. Using a system of this sort facilitates work in the laboratory. Being able to reuse already constructed plates and plate layouts, by starting out from an earlier step in the plate layout design process, saves time and cuts down on error sources.
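The dose-response output described above centers on fitting a sigmoid curve and reading off the IC50. The following sketch shows one common way to do this, a four-parameter logistic fit; it is not Brunn code, and the doses and responses below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_param_logistic(dose, bottom, top, ic50, hill):
    """4-parameter logistic dose-response curve; ic50 is the inflection dose."""
    return bottom + (top - bottom) / (1.0 + (dose / ic50) ** hill)

dose = np.array([0.01, 0.1, 1.0, 10.0, 100.0])       # e.g. uM (invented)
response = np.array([98.0, 92.0, 55.0, 12.0, 4.0])   # % of control (invented)

params, _ = curve_fit(four_param_logistic, dose, response,
                      p0=[0.0, 100.0, 1.0, 1.0])
print(f"estimated IC50: {params[2]:.2f} uM")
```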
ERIC Educational Resources Information Center
Vine, Rita
2001-01-01
Explains how to train users in effective Web searching. Discusses challenges of teaching Web information retrieval; a framework for information searching; choosing the right search tools for users; the seven-step lesson planning process; tips for delivering group Internet training; and things that help people work faster and smarter on the Web.…
Context-based virtual metrology
NASA Astrophysics Data System (ADS)
Ebersbach, Peter; Urbanowicz, Adam M.; Likhachev, Dmitriy; Hartig, Carsten; Shifrin, Michael
2018-03-01
Hybrid and data feed-forward methodologies are well established for advanced optical process control solutions in high-volume semiconductor manufacturing. Appropriate information from previous measurements, transferred into advanced optical models at subsequent steps, enhances the accuracy and precision of the measured topographic (thicknesses, critical dimensions, etc.) and material parameters. In some cases, hybrid or feed-forward data are missing or invalid for individual dies or for a whole wafer. We focus on virtual metrology approaches that re-create hybrid or feed-forward data inputs in high-volume manufacturing. We discuss reconstruction of missing data inputs based on various interpolation and extrapolation schemes, using information about the wafer's process history. Moreover, we demonstrate a data reconstruction approach based on machine learning techniques utilizing the optical model and measured spectra. Finally, we investigate metrics that allow one to assess the error margin of virtual data inputs.
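One of the interpolation schemes alluded to above can be sketched as spatial interpolation across measured dies. The snippet below is a minimal illustration assuming a smooth wafer-scale trend; the coordinates and thickness values are synthetic, not fab data, and this is not the production algorithm.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
xy = rng.uniform(-140, 140, size=(60, 2))            # measured die centers (mm)
thickness = 100 + 0.02 * xy[:, 0] + 0.01 * xy[:, 1]  # smooth wafer-scale trend (nm)

missing_die = np.array([[12.5, -30.0]])              # die with an invalid measurement
estimate = griddata(xy, thickness, missing_die, method="linear")
fallback = griddata(xy, thickness, missing_die, method="nearest")
# Linear interpolation can return NaN outside the convex hull; fall back then.
print(estimate if not np.isnan(estimate[0]) else fallback)
```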
Process modeling of a HLA research lab
NASA Astrophysics Data System (ADS)
Ribeiro, Bruna G. C.; Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.
2017-11-01
Bioinformatics has provided tremendous breakthroughs in the field of molecular biology. All this evolution has generated a large volume of biological data that increasingly requires the use of computing for analysis and storage. The identification of human leukocyte antigen (HLA) genotypes is critical to the success of organ transplants in humans. HLA typing involves not only laboratory tests but also DNA sequencing, with the participation of several professionals responsible for different stages of the process. Thus, the objective of this paper is to map the main steps of HLA typing in a laboratory specialized in performing such procedures, analyzing each process and proposing solutions to speed up these steps and avoid mistakes.
ERIC Educational Resources Information Center
Pace, Diana; Witucki, Laurie; Blumreich, Kathleen
2008-01-01
This paper describes the rationale and the step by step process for setting up a WISE (Women in Science and Engineering) learning community at one institution. Background information on challenges for women in science and engineering and the benefits of a learning community for female students in these major areas are described. Authors discuss…
Clustering-based spot segmentation of cDNA microarray images.
Uslan, Volkan; Bucak, Ihsan Ömür
2010-01-01
Microarrays are widely used because they provide useful information about thousands of gene expressions simultaneously. In this study, the segmentation step of microarray image processing has been implemented. Clustering-based methods, fuzzy c-means and k-means, have been applied for the segmentation step, which separates the spots from the background. The experiments show that fuzzy c-means segmented the spots of the microarray image more accurately than k-means.
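For readers unfamiliar with the method, a compact fuzzy c-means on pixel intensities captures the spot/background separation described here. The implementation below is a generic sketch run on a synthetic spot image, not the authors' code.

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=100):
    """Cluster 1-D data x into c fuzzy clusters; returns centers, memberships."""
    rng = np.random.default_rng(1)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                          # memberships sum to 1 per pixel
    p = 2.0 / (m - 1.0)
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)       # weighted cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / (d ** p * np.sum(d ** (-p), axis=0))  # standard FCM update
    return centers, u

# Synthetic 16x16 patch: bright circular spot on a dark noisy background.
yy, xx = np.mgrid[:16, :16]
img = 20 + 10 * np.random.default_rng(2).random((16, 16))
img[(yy - 8) ** 2 + (xx - 8) ** 2 < 25] += 120
centers, u = fuzzy_cmeans(img.ravel())
spot_mask = (u.argmax(axis=0) == centers.argmax()).reshape(16, 16)
print(spot_mask.sum(), "pixels assigned to the spot")
```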
NASA Astrophysics Data System (ADS)
Kandel, D. D.; Western, A. W.; Grayson, R. B.
2004-12-01
Mismatches in scale between the fundamental processes, the model and supporting data are a major limitation in hydrologic modelling. Surface runoff generation via infiltration excess and the process of soil erosion are fundamentally short time-scale phenomena and their average behaviour is mostly determined by the short time-scale peak intensities of rainfall. Ideally, these processes should be simulated using time-steps of the order of minutes to appropriately resolve the effect of rainfall intensity variations. However, sub-daily data support is often inadequate and the processes are usually simulated by calibrating daily (or even coarser) time-step models. Generally process descriptions are not modified but rather effective parameter values are used to account for the effect of temporal lumping, assuming that the effect of the scale mismatch can be counterbalanced by tuning the parameter values at the model time-step of interest. Often this results in parameter values that are difficult to interpret physically. A similar approach is often taken spatially. This is problematic as these processes generally operate or interact non-linearly. This indicates a need for better techniques to simulate sub-daily processes using daily time-step models while still using widely available daily information. A new method applicable to many rainfall-runoff-erosion models is presented. The method is based on temporal scaling using statistical distributions of rainfall intensity to represent sub-daily intensity variations in a daily time-step model. This allows the effect of short time-scale nonlinear processes to be captured while modelling at a daily time-step, which is often attractive due to the wide availability of daily forcing data. The approach relies on characterising the rainfall intensity variation within a day using a cumulative distribution function (cdf). This cdf is then modified by various linear and nonlinear processes typically represented in hydrological and erosion models. The statistical description of sub-daily variability is thus propagated through the model, allowing the effects of variability to be captured in the simulations. This results in cdfs of various fluxes, the integration of which over a day gives respective daily totals. Using 42-plot-years of surface runoff and soil erosion data from field studies in different environments from Australia and Nepal, simulation results from this cdf approach are compared with the sub-hourly (2-minute for Nepal and 6-minute for Australia) and daily models having similar process descriptions. Significant improvements in the simulation of surface runoff and erosion are achieved, compared with a daily model that uses average daily rainfall intensities. The cdf model compares well with a sub-hourly time-step model. This suggests that the approach captures the important effects of sub-daily variability while utilizing commonly available daily information. It is also found that the model parameters are more robustly defined using the cdf approach compared with the effective values obtained at the daily scale. This suggests that the cdf approach may offer improved model transferability spatially (to other areas) and temporally (to other periods).
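The gain from the cdf approach can be illustrated with a toy calculation: infiltration-excess runoff is a thresholded, nonlinear function of intensity, so integrating over a within-day intensity distribution gives a different answer from applying the daily mean. The sketch below assumes an exponential intensity distribution and a constant infiltration capacity, both simplifications for illustration only, not the paper's actual model.

```python
import numpy as np

mean_intensity = 2.0   # mm/h, daily-average rainfall intensity
capacity = 5.0         # mm/h, infiltration capacity

# Daily-mean approach: no runoff at all, since mean intensity < capacity.
runoff_daily = max(mean_intensity - capacity, 0.0)

# Distribution approach: E[max(I - capacity, 0)] over I ~ Exp(mean_intensity),
# evaluated numerically from the intensity pdf.
i = np.linspace(capacity, 200.0, 20000)
pdf = np.exp(-i / mean_intensity) / mean_intensity
runoff_cdf = float(np.sum((i - capacity) * pdf) * (i[1] - i[0]))

print(f"daily-mean runoff rate:       {runoff_daily:.3f} mm/h")
print(f"distribution-based runoff:    {runoff_cdf:.3f} mm/h")  # > 0
```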
Danker, Jared F; Anderson, John R
2007-04-15
In naturalistic algebra problem solving, the cognitive processes of representation and retrieval are typically confounded, in that transformations of the equations typically require retrieval of mathematical facts. Previous work using cognitive modeling has associated activity in the prefrontal cortex with the retrieval demands of algebra problems and activity in the posterior parietal cortex with the transformational demands of algebra problems, but these regions tend to behave similarly in response to task manipulations (Anderson, J.R., Qin, Y., Sohn, M.-H., Stenger, V.A., Carter, C.S., 2003. An information-processing model of the BOLD response in symbol manipulation tasks. Psychon. Bull. Rev. 10, 241-261; Qin, Y., Carter, C.S., Silk, E.M., Stenger, A., Fissell, K., Goode, A., Anderson, J.R., 2004. The change of brain activation patterns as children learn algebra equation solving. Proc. Natl. Acad. Sci. 101, 5686-5691). With this study we attempt to isolate activity in these two regions by using a multi-step algebra task in which transformation (parietal) is manipulated in the first step and retrieval (prefrontal) is manipulated in the second step. Counter to our initial predictions, both brain regions were differentially active during both steps. We designed two cognitive models, one encompassing our initial assumptions and one in which both processes were engaged during both steps. The first model provided a poor fit to the behavioral and neural data, while the second model fit both well. This simultaneously emphasizes the strong relationship between retrieval and representation in mathematical reasoning and demonstrates that cognitive modeling can serve as a useful tool for understanding task manipulations in neuroimaging experiments.
Risk perception and decision processes underlying informed consent to research participation.
Reynolds, William W; Nelson, Robert M
2007-11-01
According to the rational choice model, informed consent should consist of a systematic, step-by-step evaluation of all information pertinent to the treatment or research participation decision. Research shows that people frequently deviate from this normative model, however, employing decision-making shortcuts, or heuristics. In this paper we report findings from a qualitative study of 32 adolescents and (their) 31 parents who were recruited from two Northeastern US hospitals and asked to consider the risks of and make hypothetical decisions about research participation. The purpose of this study was to increase our understanding of how diabetic and at-risk adolescents (i.e., those who are obese and/or have a family history of diabetes) and their parents perceive risks and make decisions about research participation. Using data collected from adolescents and parents, we identify heuristic decision processes in which participant perceptions of risk magnitude, which are formed quickly and intuitively and appear to be based on affective responses to information, are far more prominent and central to the participation decision than are perceptions of probability. We discuss participants' use of decision-making heuristics in the context of recent research on affect and decision processes, and we consider the implications of these findings for researchers.
Debelle, Aurelien; Boulle, Alexandre; Chartier, Alain; ...
2014-11-25
We present a combination of experimental and computational evaluations of disorder level and lattice swelling in ion-irradiated materials. Information obtained from X-ray diffraction experiments is compared to X-ray diffraction data generated using atomic-scale simulations. The proposed methodology, which can be applied to a wide range of crystalline materials, is used to study the amorphization process in irradiated SiC. Results show that this process can be divided into two steps. In the first step, point defects and small defect clusters are produced and generate both large lattice swelling and high elastic energy. In the second step, enhanced coalescence of defects and defect clusters occurs to limit this increase in energy, which rapidly leads to complete amorphization.
Tracing information flow on a global scale using Internet chain-letter data
Liben-Nowell, David; Kleinberg, Jon
2008-01-01
Although information, news, and opinions continuously circulate in the worldwide social network, the actual mechanics of how any single piece of information spreads on a global scale have been a mystery. Here, we trace such information-spreading processes at a person-by-person level using methods to reconstruct the propagation of massively circulated Internet chain letters. We find that rather than fanning out widely, reaching many people in very few steps according to “small-world” principles, the progress of these chain letters proceeds in a narrow but very deep tree-like pattern, continuing for several hundred steps. This suggests a new and more complex picture for the spread of information through a social network. We describe a probabilistic model based on network clustering and asynchronous response times that produces trees with this characteristic structure on social-network data. PMID:18353985
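A stripped-down version of such a spreading process can be simulated in a few lines. The sketch below is an illustrative branching process with asynchronous exponential response delays; near-critical forwarding produces narrow, deep trees of the kind the paper describes. It omits the paper's network clustering, and all parameters are invented rather than fitted values.

```python
import heapq
import random

def simulate(forward_prob=0.13, contacts=8, cap=50000, rng=None):
    """Grow one propagation tree; returns {node_id: depth}."""
    rng = rng or random.Random()
    depth = {0: 0}
    events = [(0.0, 0)]                      # (response time, sender id)
    next_id = 1
    while events and next_id < cap:
        t, sender = heapq.heappop(events)
        for _ in range(contacts):            # each contact may forward the letter
            if rng.random() < forward_prob:
                depth[next_id] = depth[sender] + 1
                heapq.heappush(events, (t + rng.expovariate(1.0), next_id))
                next_id += 1
    return depth

rng = random.Random(7)
best = max((simulate(rng=rng) for _ in range(500)), key=len)
print(f"largest tree: {len(best)} nodes, depth {max(best.values())}")
```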
Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude
2011-07-01
Operating rooms (ORs) are resource-intense and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process from the decision for surgery to discharge. Each process step was analyzed in 3 domains, ie, personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and reduction in number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferrable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Two-stage, dilute sulfuric acid hydrolysis of wood : an investigation of fundamentals
John F. Harris; Andrew J. Baker; Anthony H. Conner; Thomas W. Jeffries; James L. Minor; Roger C. Pettersen; Ralph W. Scott; Edward L Springer; Theodore H. Wegner; John I. Zerbe
1985-01-01
This paper presents a fundamental analysis of the processing steps in the production of methanol from southern red oak (Quercus falcata Michx.) by two-stage dilute sulfuric acid hydrolysis. Data for hemicellulose and cellulose hydrolysis are correlated using models. This information is used to develop and evaluate a process design.
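The kinetic models referred to are commonly of the Saeman first-order type: polysaccharide hydrolyzes to sugar, which in turn degrades. The sketch below integrates that two-step scheme with arbitrary illustrative rate constants, not the paper's fitted values for southern red oak.

```python
import numpy as np

# Saeman-type scheme: polysaccharide --k1--> sugar --k2--> decomposition products.
k1, k2 = 0.10, 0.03          # 1/min, illustrative rate constants
t = np.linspace(0, 60, 601)  # minutes

cellulose = np.exp(-k1 * t)  # fraction of initial polysaccharide remaining
# Analytic solution of dG/dt = k1*C - k2*G with G(0) = 0:
sugar = k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))

t_peak = t[np.argmax(sugar)]
print(f"max sugar yield {sugar.max():.2f} (fraction) at t = {t_peak:.1f} min")
```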
Schultz, David; Ambike, Archana; Logie, Sean Kevin; Bohner, Katherine E; Stapleton, Laura M; Vanderwalde, Holly; Min, Christopher B; Betkowski, Jennifer A
2010-07-01
Crick and Dodge's (Psychological Bulletin 115:74-101, 1994) social information processing model has proven very useful in guiding research focused on aggressive and peer-rejected children's social-cognitive functioning. Its application to early childhood, however, has been much more limited. The present study responds to this gap by developing and validating a video-based assessment tool appropriate for early childhood, the Schultz Test of Emotion Processing-Preliminary Version (STEP-P). One hundred twenty-five Head Start preschool children participated in the study. More socially competent children more frequently attributed sadness to the victims of provocation and labeled aggressive behaviors as both morally unacceptable and less likely to lead to positive outcomes. More socially competent girls labeled others' emotions more accurately. More disruptive children more frequently produced physically aggressive solutions to social provocations, and more disruptive boys less frequently interpreted social provocations as accidental. The STEP-P holds promise as an assessment tool that assesses knowledge structures related to the SIP model in early childhood.
NASA Astrophysics Data System (ADS)
Or, D.; von Ruette, J.; Lehmann, P.
2017-12-01
Landslides and subsequent debris flows initiated by rainfall represent a common natural hazard in mountainous regions. We integrated a landslide hydro-mechanical triggering model with a simple model for debris-flow runout pathways and developed a graphical user interface (GUI) to represent these natural hazards at catchment scale at any location. The STEP-TRAMM GUI provides process-based estimates of the initiation locations and sizes of landslide patterns based on digital elevation models (SRTM) linked with high-resolution global soil maps (SoilGrids, 250 m resolution) and satellite-based information on rainfall statistics for the selected region. In the preprocessing phase, the STEP-TRAMM model estimates the soil depth distribution to supplement other soil information for delineating key hydrological and mechanical properties relevant to representing local soil failure. We illustrate this publicly available GUI and modeling platform by simulating the effects of deforestation on landslide hazards in several regions and comparing model outcomes with satellite-based information.
Information Handling in Selected Academic Libraries of the Caribbean.
ERIC Educational Resources Information Center
Rodriguez, Ketty
1988-01-01
Describes a survey that examined the extent of library technical processes automation within academic libraries at 10 Caribbean universities. Existing conditions, steps in progress, and plans for future automation are discussed. (8 references) (CLB)
Lyme Disease Tests: MedlinePlus Lab Test Information
... and Human Services; Lyme Disease: Transmission [updated 2015 Mar 4; cited 2017 Dec 28]; [about 2 screens]. ... Disease: Two-step Laboratory Testing Process [updated 2015 Mar 26; cited 2017 Dec 28]; [about 2 screens]. ...
Communication and the Social Representation of Scientific Knowledge.
ERIC Educational Resources Information Center
Lievrouw, Leah A.
1990-01-01
Examines the process of disseminating scientific information to the public. Explores the particular steps and strategies that scientists use in taking research findings to a popular audience. Examines the popularization of cold-fusion research. (RS)
Information systems in healthcare - state and steps towards sustainability.
Lenz, R
2009-01-01
To identify core challenges and first steps on the way to sustainable information systems in healthcare. Recent articles on healthcare information technology and related articles from medical informatics and computer science were reviewed and analyzed. Core challenges that could not be solved over the years are identified. The two core problem areas are process integration, meaning to effectively embed IT systems into routine workflows, and systems integration, meaning to reduce the effort for interconnecting independently developed IT components. Standards for systems integration have improved considerably, but their usefulness is limited where system evolution is needed. Sustainable healthcare information systems should be based on system architectures that support system evolution and avoid costly system replacements every five to ten years. Some basic principles for the design of such systems are separation of concerns, loose coupling, deferred systems design, and service-oriented architectures.
Automated synthesis of image processing procedures using AI planning techniques
NASA Technical Reports Server (NTRS)
Chien, Steve; Mortensen, Helen
1994-01-01
This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985; Pemberthy & Weld, 1992; Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives the unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. The result is output as an executable image processing program that can then be run to fill the processing request.
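The core idea, deriving unspecified processing steps from goals and operator models, can be illustrated with a toy planner. The operators below are invented placeholders, not MVP's actual VICAR program models, and breadth-first search stands in for MVP's planning machinery.

```python
from collections import deque

# Each operator: name -> (required image states, produced image states).
OPERATORS = {
    "radiometric_correction": ({"raw"}, {"radiometrically_corrected"}),
    "geometric_correction":   ({"radiometrically_corrected"}, {"geometrically_corrected"}),
    "mosaic":                 ({"geometrically_corrected"}, {"mosaicked"}),
}

def plan(initial, goal):
    """BFS over sets of achieved states; returns a list of operator names or None."""
    frontier = deque([(frozenset(initial), [])])
    seen = {frozenset(initial)}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:
            return steps
        for name, (pre, post) in OPERATORS.items():
            if pre <= state:
                nxt = frozenset(state | post)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None

print(plan({"raw"}, {"mosaicked"}))
# -> ['radiometric_correction', 'geometric_correction', 'mosaic']
```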
NASA Technical Reports Server (NTRS)
Brumfield, M. L. (Compiler)
1984-01-01
A plan to develop a space technology experiments platform (STEP) was examined. NASA Langley Research Center held a STEP Experiment Requirements Workshop on June 29 and 30 and July 1, 1983, at which experiment proposers were invited to present more detailed information on their experiment concepts and requirements. A feasibility and preliminary definition study was conducted, and the preliminary definition of STEP capabilities, experiment concepts, and expected requirements for support services is presented. The preliminary definition of STEP capabilities based on a detailed review of potential experiment requirements is investigated. Topics discussed include: Shuttle on-orbit dynamics; effects of the space environment on damping materials; an erectable beam experiment; technology for development of very large solar array deployers; a thermal energy management process experiment; photovoltaic concentrator pointing dynamics and plasma interactions; vibration isolation technology; and flight tests of a synthetic aperture radar antenna using STEP.
Calibration process of highly parameterized semi-distributed hydrological model
NASA Astrophysics Data System (ADS)
Vidmar, Andrej; Brilly, Mitja
2017-04-01
Hydrological phenomena take place in the hydrological system, which is governed by nature, and are essentially stochastic. These phenomena are unique, non-recurring, and changeable across space and time. Since any river basin, with its own natural characteristics, and any hydrological event therein are unique, calibration is a complex process that has not been researched enough. Calibration is the procedure of determining model parameters that are not well known: input and output variables and the mathematical model expressions are known, while only some parameters are unknown and must be determined by calibrating the model. The software used for hydrological modelling nowadays is equipped with sophisticated calibration algorithms that give the modeller little opportunity to manage the process, and the results are often not the best. We therefore developed a procedure for an expert-driven calibration process. We use the HBV-light-CLI hydrological model, which has a command-line interface, and couple it with PEST. PEST is a parameter estimation tool that is widely used in groundwater modelling and can also be applied to surface waters. A calibration process managed directly by an expert affects the outcome of the inversion procedure in proportion to the expert's knowledge and achieves better results than leaving the procedure entirely to the selected optimization algorithm. The first step is to properly define the spatial characteristics and structural design of the semi-distributed model, including all morphological and hydrological phenomena such as karstic, alluvial, and forest areas; this step requires the modeller's geological, meteorological, hydraulic, and hydrological knowledge. The second step is to set initial parameter values to their preferred values based on expert knowledge; in this step we also define all parameter and observation groups. Peak data are essential in the calibration process if we are mainly interested in flood events, and each sub-catchment in the model has its own observation group. The third step is to set appropriate bounds on the parameters within their range of realistic values. The fourth step is to use singular value decomposition (SVD), which ensures that PEST maintains numerical stability regardless of how ill-posed the inverse problem is. The fifth step is to run PWTADJ1, which creates a new PEST control file in which weights are adjusted so that each observation group contributes equally to the total objective function; this prevents the information content of any group from being invisible to the inversion process. The sixth step is to add Tikhonov regularization to the PEST control file by running the ADDREG1 utility (Doherty, 2013); ADDREG1 automatically provides a prior-information equation for each parameter in which the preferred value of that parameter is equated to its initial value. The last step is to run PEST. We run BeoPEST, a parallel version of PEST that can run on multiple computers simultaneously over TCP communications, which speeds up the calibration process. A case study with calibration and validation results for the model will be presented.
An Update on the NASA Planetary Science Division Research and Analysis Program
NASA Astrophysics Data System (ADS)
Bernstein, Max; Richey, Christina; Rall, Jonathan
2015-11-01
Introduction: NASA’s Planetary Science Division (PSD) solicits its research and analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD changed the structure of the program elements under which the majority of planetary science R&A is done. Major changes included the creation of five core research program elements aligned with PSD’s strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2015 submission changes: All PSD programs will continue to use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed by the proposal, a brief description of the methodology to be used to address the science goals and objectives, and the relevance of the proposed research to the call to which it is submitted.
77 FR 42556 - Proposed Information Collection (Notice of Disagreement) Activity: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-19
... information technology. Title: Notice of Disagreement, VA Form 21-0958. OMB Control Number: 2900-NEW. Type of... decision issued by Regional Office. This is the first step in the appeal process. The respondent may or may not continue with an appeal to the Board of Veterans Appeals (BVA). If the Veteran opts to continue to...
ERIC Educational Resources Information Center
Johnston, Katrina M.
2013-01-01
School information systems (SIS) have the potential to cause a change in a school's technical, structural, psycho-social, and managerial systems. Implementation of a technological innovation such as an SIS is not a one-step occurrence; it is a process that occurs over time. Implementing any technological innovation involves active learning…
The US Response to Bologna: Expanding Knowledge, First Steps of Convergence
ERIC Educational Resources Information Center
Adelman, Clifford
2010-01-01
The roads of incoming information to the US higher education system about the Bologna Process are varied and numerous. They include not only the on-line and traditional trade press, but also conferences of national organisations. Whether anyone remembers much of that information, on the other hand, is an open question, as a limited survey…
ERIC Educational Resources Information Center
Attaullah; Johnson, Jane S.
1991-01-01
Outlines the various steps involved in the process of modernizing the information storage and retrieval systems within the library of the Northwest Frontier Province Agricultural University (NWFP AU) in Peshawar, Pakistan. Implementation of services for faculty, students, and provincial agricultural scientists is discussed, and staff training is…
ERIC Educational Resources Information Center
Russo, James; Hopkins, Sarah
2017-01-01
This paper outlines a seven-step process for developing problem-solving tasks informed by cognitive load theory. Through an example of a task developed for Year 2 students, we show how this approach can be used to produce challenging mathematical tasks that aim to optimise cognitive load for each student.
NASA Astrophysics Data System (ADS)
Curiac, Daniel-Ioan; Pachia, Mihai
2015-05-01
Information security represents the cornerstone of every data processing system that resides in an organisation's trusted network, implementing all necessary protocols, mechanisms and policies to be one step ahead of possible threats. Starting from the need to strengthen the set of security services, in this article we introduce a new and innovative process named controlled information destruction (CID) that is meant to secure sensitive data that are no longer needed for the organisation's future purposes but would be very damaging if revealed. The disposal of this type of data has to be controlled carefully in order to delete not only the information itself but also all its splinters spread throughout the network, thus denying any possibility of recovering the information after its alleged destruction. This process leads to a modified model of information assurance and also reconfigures the architecture of any information security management system. The scheme we envisioned relies on a reshaped information lifecycle, which reveals the impact of the CID procedure directly upon the information states.
Timing paradox of stepping and falls in ageing: not so quick and quick(er) on the trigger
Mille, Marie‐Laure
2016-01-01
Physiological and degenerative changes affecting human standing balance are major contributors to falls with ageing. During imbalance, stepping is a powerful protective action for preserving balance that may be voluntarily initiated in recognition of a balance threat, or be induced by an externally imposed mechanical or sensory perturbation. Paradoxically, with ageing and falls, initiation slowing of voluntary stepping is observed together with perturbation-induced steps that are triggered as fast as or faster than for younger adults. While age-associated changes in sensorimotor conduction, central neuronal processing and cognitive functions are linked to delayed voluntary stepping, alterations in the coupling of posture and locomotion may also prolong step triggering. It is less clear, however, how these factors may explain the accelerated triggering of induced stepping. We present a conceptual model that addresses this issue. For voluntary stepping, a disruption in the normal coupling between posture and locomotion may underlie step-triggering delays through suppression of the locomotion network based on an estimation of the evolving mechanical state conditions for stability. During induced stepping, accelerated step initiation may represent an event-triggering process whereby stepping is released according to the occurrence of a perturbation rather than to the specific sensorimotor information reflecting the evolving instability. In this case, errors in the parametric control of induced stepping and its effectiveness in stabilizing balance would be likely to occur. We further suggest that there is a residual adaptive capacity with ageing that could be exploited to improve paradoxical triggering and other changes in protective stepping to impact fall risk. PMID:26915664
NASA Astrophysics Data System (ADS)
Bykovskii, Yurii A.; Eloev, E. N.; Kukharenko, K. L.; Panin, A. M.; Solodovnikov, N. P.; Torgashin, A. N.; Arestova, E. L.
1995-10-01
An acousto-optical system for input, display, and coherent-optical processing of information was implemented experimentally. The information transmission capacity, the structure of the information fluxes, and the efficiency of spaceborne telemetric systems were taken into account. The number of equivalent frequency-resolved channels corresponded to the structure of a telemetric frame of a two-step switch. The number of intensity levels of laser radiation corresponded to the scale of changes in the parameters. Use was made of the technology of a liquid optical contact between a wedge-shaped piezoelectric transducer made of lithium niobate and an anisotropic light-and-sound guide made of paratellurite with asymmetric scattering geometry. The simplest technique for optical filtering of multiparameter signals was analysed.
Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak
2015-07-01
This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) systematic review methodology, the authors reviewed published papers between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Introduction to Remote Sensing Image Registration
NASA Technical Reports Server (NTRS)
Le Moigne, Jacqueline
2017-01-01
For many applications, accurate and fast image registration of large amounts of multi-source data is the first necessary step before subsequent processing and integration. Image registration is defined by several steps, and each step can be approached by various methods which all present diverse advantages and drawbacks depending on the type of data, the type of application, the a priori information known about the data, and the type of accuracy that is required. This paper first presents a general overview of remote sensing image registration and then goes over a few specific methods and their applications.
High-throughput screening of chromatographic separations: IV. Ion-exchange.
Kelley, Brian D; Switzer, Mary; Bastek, Patrick; Kramarczyk, Jack F; Molnar, Kathleen; Yu, Tianning; Coffman, Jon
2008-08-01
Ion-exchange (IEX) chromatography steps are widely applied in protein purification processes because of their high capacity, selectivity, robust operation, and well-understood principles. Optimization of IEX steps typically involves resin screening and selection of the pH and counterion concentrations of the load, wash, and elution steps. Time and material constraints associated with operating laboratory columns often preclude evaluating more than 20-50 conditions during early stages of process development. To overcome this limitation, a high-throughput screening (HTS) system employing a robotic liquid handling system and 96-well filterplates was used to evaluate various operating conditions for IEX steps for monoclonal antibody (mAb) purification. A screening study for an adsorptive cation-exchange step evaluated eight different resins. Sodium chloride concentrations defining the operating boundaries of product binding and elution were established at four different pH levels for each resin. Adsorption isotherms were measured for 24 different pH and salt combinations for a single resin. An anion-exchange flowthrough step was then examined, generating data on mAb adsorption for 48 different combinations of pH and counterion concentration for three different resins. The mAb partition coefficients were calculated and used to estimate the characteristic charge of the resin-protein interaction. Host cell protein and residual Protein A impurity levels were also measured, providing information on selectivity within this operating window. The HTS system shows promise for accelerating process development of IEX steps, enabling rapid acquisition of large datasets addressing the performance of the chromatography step under many different operating conditions. (c) 2008 Wiley Periodicals, Inc.
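The characteristic-charge estimate mentioned above follows from the law-of-mass-action picture of ion exchange, in which log(partition coefficient) falls roughly linearly with log(counterion concentration) with slope -z. The sketch below fits that line; the data points are invented for illustration, not measurements from the paper.

```python
import numpy as np

salt_mM = np.array([50.0, 75.0, 100.0, 150.0, 200.0])  # counterion concentration
kp = np.array([120.0, 35.0, 14.0, 3.8, 1.5])           # measured partition coefficients

# Linear fit in log-log space; the characteristic charge is minus the slope.
slope, intercept = np.polyfit(np.log10(salt_mM), np.log10(kp), 1)
print(f"characteristic charge z ~= {-slope:.1f}")
```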
RN, CIO: an executive informatics career.
Staggers, Nancy; Lasome, Caterina E M
2005-01-01
The Chief Information Officer (CIO) position is a viable new career track for clinical informaticists. Nurses, especially informatics nurses, are uniquely positioned for the CIO role because of their operational knowledge of clinical processes, communication skills, systems thinking abilities, and knowledge about information structures and processes. This article describes essential knowledge and skills for the CIO executive position. Competencies not typical to nurses can be learned and developed, particularly strategic visioning and organizational finesse. This article concludes by describing career development steps toward the CIO position: leadership and management; healthcare operations; organizational finesse; and informatics knowledge, processes, methods, and structures.
NASA Astrophysics Data System (ADS)
Raj, Rahul; Hamm, Nicholas Alexander Samuel; van der Tol, Christiaan; Stein, Alfred
2016-03-01
Gross primary production (GPP) can be separated from flux tower measurements of net ecosystem exchange (NEE) of CO2. This is used increasingly to validate process-based simulators and remote-sensing-derived estimates of simulated GPP at various time steps. Proper validation includes the uncertainty associated with this separation. In this study, uncertainty assessment was done in a Bayesian framework. It was applied to data from the Speulderbos forest site, The Netherlands. We estimated the uncertainty in GPP at half-hourly time steps, using a non-rectangular hyperbola (NRH) model for its separation from the flux tower measurements. The NRH model provides a robust empirical relationship between radiation and GPP. It includes the degree of curvature of the light response curve, radiation and temperature. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. We defined the prior distribution of each NRH parameter and used Markov chain Monte Carlo (MCMC) simulation to estimate the uncertainty in the separated GPP from the posterior distribution at half-hourly time steps. This time series also allowed us to estimate the uncertainty at daily time steps. We compared the informative with the non-informative prior distributions of the NRH parameters and found that both choices produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
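The separation step can be sketched end to end: an NRH light response plus a constant respiration term is fitted to half-hourly NEE with a random-walk Metropolis sampler, and the posterior for GPP follows. Everything below (priors, proposal widths, synthetic data) is illustrative, not the study's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def nrh_gpp(par_i, alpha, gpp_max, theta):
    """Non-rectangular hyperbola light response of GPP."""
    s = alpha * par_i + gpp_max
    return (s - np.sqrt(s * s - 4.0 * theta * alpha * par_i * gpp_max)) / (2.0 * theta)

# Synthetic "measurements": NEE = respiration - GPP + noise.
par = rng.uniform(0, 1800, 300)  # umol photons m-2 s-1
true = dict(alpha=0.05, gpp_max=25.0, theta=0.7, resp=3.0)
nee = true["resp"] - nrh_gpp(par, true["alpha"], true["gpp_max"], true["theta"])
nee += rng.normal(0, 1.0, par.size)

def log_post(p):
    alpha, gpp_max, theta, resp = p
    if not (0 < alpha < 0.2 and 0 < gpp_max < 60 and 0 < theta < 1 and 0 < resp < 15):
        return -np.inf                       # flat priors with hard bounds
    resid = nee - (resp - nrh_gpp(par, alpha, gpp_max, theta))
    return -0.5 * np.sum(resid ** 2)         # Gaussian likelihood, sigma = 1

p = np.array([0.03, 20.0, 0.5, 5.0])
lp = log_post(p)
samples = []
for _ in range(20000):                       # random-walk Metropolis
    q = p + rng.normal(0, [0.002, 0.5, 0.02, 0.1])
    lq = log_post(q)
    if np.log(rng.random()) < lq - lp:
        p, lp = q, lq
    samples.append(p.copy())

post = np.array(samples[5000:])              # discard burn-in
gpp_1000 = nrh_gpp(1000.0, post[:, 0], post[:, 1], post[:, 2])
print(f"GPP at PAR=1000: {gpp_1000.mean():.1f} +/- {gpp_1000.std():.1f}")
```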
Industrial implementation of spatial variability control by real-time SPC
NASA Astrophysics Data System (ADS)
Roule, O.; Pasqualini, F.; Borde, M.
2016-10-01
Advanced technology nodes require more and more information to get the wafer process well set up. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing based on a large amount of process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to bring spatial variability under control, in real time, with our SPC (Statistical Process Control) system. This paper will outline the architecture of an integrated process control system for shape monitoring in 3D, implemented in the waferfab.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-08-25
The directive provides an overview of the information gathering tools under CERCLA section 104(e) and 122(e)(3), and focuses on the steps to be taken throughout the information gathering process to ensure that EPA is in the strongest position possible to enforce the information gathering or subpoena. The guidance replaces existing guidance entitled, Policy on Enforcing Information Requests in Hazardous Waste Cases, dated September 10, 1984, to the extent that the previous guidance addresses information gathering under CERCLA section 104(e), directive no. 9834.4.
Space Medicine in the Human System Integration Process
NASA Technical Reports Server (NTRS)
Scheuring, Richard A.
2010-01-01
This slide presentation reviews the importance of integrating space medicine into the human system for lunar exploration. There is a review of historical precedent with reference to lunar surface operations. The integration process is reviewed in a chart showing the steps from research to requirements development, requirements integration, design, verification, and operations, with lessons learned feeding back information and new items for research. These steps are reviewed in view of specific space medical issues. Some of the testing of the operations is undertaken in an environment that is an analog of the exploration environment. Several of these analog environments are reviewed, with some discussion of the benefits of using an analog environment to test the processes that are derived.
Extraction of Qualitative Features from Sensor Data Using Windowed Fourier Transform
NASA Technical Reports Server (NTRS)
Amini, Abolfazl M.; Figueroa, Fenando
2003-01-01
In this paper, we use Matlab to model the health monitoring of a system through the information gathered from sensors. This implies assessment of the condition of the system components. Once a normal mode of operation is established, any deviation from the normal behavior indicates a change. This change may be due to a malfunction of an element, a qualitative change, or a change due to a problem with another element in the network. For example, if one sensor indicates that the temperature in the tank has experienced a step change, then a pressure sensor associated with the process in the tank should also experience a step change. The step up and step down, as well as the sensor disturbances, are assumed to be exponential. An RC network is used to model the main process, which is step-up (charging), drift, and step-down (discharging). Sensor disturbances and a spike are added while the system is in drift. The system is allowed to run for a period equal to three time constants of the main process before changes occur. Each point of the signal is then taken together with a trailing segment of previously collected data. Two trailing lengths of data are selected, one equal to two time constants of the main process and the other equal to two time constants of the sensor disturbance. Next, the DC component is removed from each set of data, the data are passed through a window, and the spectrum of each set is calculated. In order to extract features, the signal power, peak, and spectrum are plotted versus time. The results indicate distinct shapes corresponding to each process. The study is also carried out for a number of Gaussian-distributed noisy cases.
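The windowed-spectrum features described here can be reproduced on a synthetic signal. The Python sketch below builds a charge/drift/discharge signal with a spike, slides a trailing Hann window over it, and tracks the windowed power and spectral peak; time constants and amplitudes are illustrative, not the paper's settings.

```python
import numpy as np

fs = 100.0                                   # samples per second
tau = 2.0                                    # main-process time constant (s)
t = np.arange(0, 18, 1 / fs)

# Step-up (charging) until t = 6 s, then step-down (discharging), plus a spike.
sig = np.where(t < 6, 1 - np.exp(-t / tau), np.exp(-(t - 6) / tau))
sig = sig + 0.3 * np.exp(-((t - 9) ** 2) / 0.001)

win_len = int(2 * tau * fs)                  # trailing window: two time constants
window = np.hanning(win_len)
power, peak = [], []
for i in range(win_len, t.size):
    seg = sig[i - win_len:i]
    seg = (seg - seg.mean()) * window        # remove DC, apply window
    spec = np.abs(np.fft.rfft(seg))
    power.append(float(np.sum(spec ** 2)))
    peak.append(float(spec.max()))

i_pow = int(np.argmax(power))
print(f"windowed power peaks at t = {t[win_len:][i_pow]:.2f} s, "
      f"spectral peak there = {peak[i_pow]:.2f}")
```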
Anokye, Nana Kwame; Pokhrel, Subhash; Buxton, Martin; Fox-Rushby, Julia
2013-06-01
Little is known about the correlates of meeting recommended levels of participation in physical activity (PA) and how this understanding informs public health policies on behaviour change. To analyse who meets the recommended level of participation in PA in males and females separately by applying 'process' modelling frameworks (single vs. sequential 2-step process). Using the Health Survey for England 2006, (n = 14 142; ≥ 16 years), gender-specific regression models were estimated using bivariate probit with selectivity correction and single probit models. A 'sequential, 2-step process' modelled participation and meeting the recommended level separately, whereas the 'single process' considered both participation and level together. In females, meeting the recommended level was associated with degree holders [Marginal effect (ME) = 0.013] and age (ME = -0.001), whereas in males, age was a significant correlate (ME = -0.003 to -0.004). The order of importance of correlates was similar across genders, with ethnicity being the most important correlate in both males (ME = -0.060) and females (ME = -0.133). In females, the 'sequential, 2-step process' performed better (ρ = -0.364, P < 0.001) than that in males (ρ = 0.154). The degree to which people undertake the recommended level of PA through vigorous activity varies between males and females, and the process that best predicts such decisions, i.e. whether it is a sequential, 2-step process or a single-step choice, is also different for males and females. Understanding this should help to identify subgroups that are less likely to meet the recommended level of PA (and hence more likely to benefit from any PA promotion intervention).
Free energy of steps using atomistic simulations
NASA Astrophysics Data System (ADS)
Freitas, Rodrigo; Frolov, Timofey; Asta, Mark
The properties of solid-liquid interfaces are known to play critical roles in solidification processes. Particular importance attaches to the thermodynamic quantities that describe the equilibrium state of these surfaces. For example, in the solid-liquid-vapor heteroepitaxial growth of semiconductor nanowires, the crystal nucleation process on the faceted solid-liquid interface is influenced by the solid-liquid and vapor-solid interfacial free energies, and also by the free energies of the associated steps at these faceted interfaces. Crystal-growth theories and mesoscale simulation methods depend on quantitative information about these properties, which are often poorly characterized by experimental measurements. In this work we propose an extension of the capillary fluctuation method for calculating the free energy of steps on faceted crystal surfaces. From equilibrium atomistic simulations of steps on (111) surfaces of copper, we accurately computed the step free energy for different step orientations. We show that the step free energy remains finite at all temperatures up to the melting point and that the results agree with the better-established method of thermodynamic integration once finite-size effects are taken into account. The research of RF and MA at UC Berkeley was supported by the US National Science Foundation (Grant No. DMR-1105409). TF acknowledges support through a postdoctoral fellowship from the Miller Institute for Basic Research in Science.
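The analysis step behind the capillary fluctuation method rests on equipartition: for a step of length L, the Fourier modes of the step profile satisfy <|h_k|^2> = kB*T / (L * stiffness * k^2). The sketch below draws synthetic mode amplitudes with a known stiffness and recovers it with that estimator, purely to illustrate the data analysis; it does not replace the atomistic simulations, and units are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
kT, L, stiff_true = 1.0, 100.0, 0.5
n_modes, n_snapshots = 20, 4000

k = 2 * np.pi * np.arange(1, n_modes + 1) / L
var = kT / (L * stiff_true * k ** 2)             # equipartition variance per mode
h_k = rng.normal(0, np.sqrt(var), size=(n_snapshots, n_modes))  # synthetic modes

stiff_est = kT / (L * k ** 2 * h_k.var(axis=0))  # one stiffness estimate per mode
print(f"recovered stiffness: {stiff_est.mean():.3f} (true {stiff_true})")
```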
Extraction of Data from a Hospital Information System to Perform Process Mining.
Neira, Ricardo Alfredo Quintano; de Vries, Gert-Jan; Caffarel, Jennifer; Stretton, Erin
2017-01-01
The aim of this work is to share our experience in extracting relevant data from a hospital information system in preparation for a research study using process mining techniques. The steps performed were: research definition, mapping of the normative processes, identification of table and field names in the database, and extraction of the data. We then offer lessons learned during the data extraction phase. Any errors made in the extraction phase will propagate and have implications for subsequent analyses. Thus, it is essential to take the time needed and devote sufficient attention to detail to perform all activities with the goal of ensuring high quality of the extracted data. We hope this work will be informative for other researchers planning and executing data extraction for process mining research studies.
[Verbal patient information through nurses--a case of stroke patients].
Christmann, Elli; Holle, Regina; Schüssler, Dörte; Beier, Jutta; Dassen, Theo
2004-06-01
The article presents the results of theoretical work in the field of nursing education on the topic of verbal patient information through nurses, in the case of stroke patients. The literature review and analysis show that there is a general shortage of (stroke) patient information and a lack of successful concepts and strategies for verbal (stroke) patient information through nurses in hospitals. The authors have developed a theoretical basis for health information as a nursing intervention, which represents a model of health information as a "communicational teach-and-learn process" that is generally applicable to all patients. Health information takes place as a separate nursing intervention within a non-public, face-to-face communication situation and within the steps model of the nursing process. Health information is seen as a learning process for patients and nurses alike. We consider learning as information production (constructivism) and information processing (cognitivism). Both processes are influenced by various factors, including the patient's illness situation and personality, the information content, and the environment. For successful health information, it is necessary to attend to these aspects, which can be realized through a constructivist understanding of didactics. An evaluation study is needed to test our concept of health information.
Moving Digital Libraries into the Student Learning Space: The GetSmart Experience
ERIC Educational Resources Information Center
Marshall, Byron B.; Chen, Hsinchun; Shen, Rao; Fox, Edward A.
2006-01-01
The GetSmart system was built to support theoretically sound learning processes in a digital library environment by integrating course management, digital library, and concept mapping components to support a constructivist, six-step, information search process. In the fall of 2002 more than 100 students created 1400 concept maps as part of…
ERIC Educational Resources Information Center
Strelnikov, Kuzma
2007-01-01
This article aims to provide a theoretical framework to elucidate the neurophysiological underpinnings of deviance detection as reflected by mismatch negativity. A six-step model of the information processing necessary for deviance detection is proposed. In this model, predictive coding of learned regularities is realized by means of long-term…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
... frequency of use and frequency of condition) as well as expert input, a better approach for mass outreach... process, including transparency, stakeholder input, and leadership; and Expert involvement to inform and... BPCA Web site, http://bpca.nichd.nih.gov . As a final step in the process, the NICHD, with input from...
Pocket Pal: A Graphic Arts Digest for Printers and Advertising Production Managers. Tenth Edition.
ERIC Educational Resources Information Center
1970
In this digest of information about printing a brief survey of the history of printing precedes detailed explanations of the processes and the materials involved in printing. The four major printing processes--letterpress, gravure, offset lithography, and screen--are explained. Steps in preparing art and copy for printing, including selection of…
Measurement of the bystander intervention model for bullying and sexual harassment.
Nickerson, Amanda B; Aloe, Ariel M; Livingston, Jennifer A; Feeley, Thomas Hugh
2014-06-01
Although peer bystanders can exacerbate or prevent bullying and sexual harassment, research has been hindered by the absence of a validated assessment tool to measure the process and sequential steps of the bystander intervention model. A measure was developed based on the five steps of Latané and Darley's (1970) bystander intervention model applied to bullying and sexual harassment. Confirmatory factor analysis with a sample of 562 secondary school students confirmed the five-factor structure of the measure. Structural equation modeling revealed that all the steps were influenced by the previous step in the model, as the theory proposed. In addition, the bystander intervention measure was positively correlated with empathy, attitudes toward bullying and sexual harassment, and awareness of bullying and sexual harassment facts. This measure can be used for future research and to inform intervention efforts related to the process of bystander intervention for bullying and sexual harassment. Copyright © 2014 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
Building a common pipeline for rule-based document classification.
Patterson, Olga V; Ginter, Thomas; DuVall, Scott L
2013-01-01
Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA AS based pipeline for classification. Our proposed methodology coupled with the general-purpose solution provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.
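A minimal instance of the concept-plus-context rules such a pipeline supports might look as follows; the concept terms, negation cues, and window size are invented examples, not the authors' rule set, and this sketch stands outside any UIMA machinery.

```python
import re

CONCEPT = re.compile(r"\b(pneumonia|pneumonitis)\b", re.I)
NEGATION = re.compile(r"\b(no|denies|without|negative for)\b", re.I)

def classify(note: str) -> str:
    """Return 'positive' if a non-negated concept mention is found."""
    for match in CONCEPT.finditer(note):
        window = note[max(0, match.start() - 40):match.start()]
        if not NEGATION.search(window):      # simple context (negation) check
            return "positive"
    return "negative"

print(classify("Chest X-ray consistent with pneumonia."))       # positive
print(classify("The patient denies cough; no pneumonia seen.")) # negative
```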
Migliavacca, Mirco; Meroni, Michele; Busetto, Lorenzo; Colombo, Roberto; Zenone, Terenzio; Matteucci, Giorgio; Manca, Giovanni; Seufert, Guenther
2009-01-01
In this paper we present results obtained in the framework of a regional-scale analysis of the carbon budget of poplar plantations in Northern Italy. We explored the ability of the process-based model BIOME-BGC to estimate the gross primary production (GPP) using an inverse modeling approach exploiting eddy covariance and satellite data. We firstly present a version of BIOME-BGC coupled with the radiative transfer models PROSPECT and SAILH (named PROSAILH-BGC) with the aims of i) improving the BIOME-BGC description of the radiative transfer regime within the canopy and ii) allowing the assimilation of remotely-sensed vegetation index time series, such as MODIS NDVI, into the model. Secondly, we present a two-step model inversion for optimization of model parameters. In the first step, some key ecophysiological parameters were optimized against data collected by an eddy covariance flux tower. In the second step, important information about phenological dates and about standing biomass were optimized against MODIS NDVI. Results obtained showed that the PROSAILH-BGC allowed simulation of MODIS NDVI with good accuracy and that we described better the canopy radiation regime. The inverse modeling approach was demonstrated to be useful for the optimization of ecophysiological model parameters, phenological dates and parameters related to the standing biomass, allowing good accuracy of daily and annual GPP predictions. In summary, this study showed that assimilation of eddy covariance and remote sensing data in a process model may provide important information for modeling gross primary production at regional scale. PMID:22399948
Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C
2015-03-30
Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, understanding vocal expression is a potentially important biometric index of information processing, not only across but within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual-attention experimental task in which participants provided natural speech while simultaneously engaged in a baseline, medium or high nonverbal processing-load task. Objective, automated, and computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall, and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and provide important information for the development of an automated, inexpensive and noninvasive biometric measure of information processing.
NASA Astrophysics Data System (ADS)
Castagnoli, Giuseppe
2018-03-01
The usual representation of quantum algorithms, limited to the process of solving the problem, is physically incomplete. We complete it in three steps: (i) extending the representation to the process of setting the problem, (ii) relativizing the extended representation to the problem solver, to whom the problem setting must be concealed, and (iii) symmetrizing the relativized representation for time reversal to represent the reversibility of the underlying physical process. The third step projects the input state of the representation, where the problem solver is completely ignorant of the setting and thus of the solution of the problem, onto one where she knows half of the solution (half of the information specifying it when the solution is an unstructured bit string). Completing the physical representation shows that the number of computation steps (oracle queries) required to solve any oracle problem in an optimal quantum way should be that of a classical algorithm endowed with advance knowledge of half of the solution.
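For the special case of unstructured search, the final claim can be checked by counting: a classical solver that already knows half of the n solution bits has 2^(n/2) = sqrt(N) candidates left, which matches Grover's O(sqrt(N)) oracle-query scaling. A small arithmetic illustration (values chosen arbitrarily):

```python
import math

# Classical search over N = 2^n items with half of the n bits known leaves
# 2^(n/2) candidates -- the same sqrt(N) scaling as Grover's algorithm.
for n in (8, 16, 24):
    N = 2 ** n
    classical_half_informed = 2 ** (n // 2)  # candidates once half is known
    grover = math.isqrt(N)                   # ~ sqrt(N) oracle queries
    print(n, N, classical_half_informed, grover)
```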
Timing paradox of stepping and falls in ageing: not so quick and quick(er) on the trigger.
Rogers, Mark W; Mille, Marie-Laure
2016-08-15
Physiological and degenerative changes affecting human standing balance are major contributors to falls with ageing. During imbalance, stepping is a powerful protective action for preserving balance that may be voluntarily initiated in recognition of a balance threat, or be induced by an externally imposed mechanical or sensory perturbation. Paradoxically, with ageing and falls, slowed initiation of voluntary stepping is observed together with perturbation-induced steps that are triggered as fast as, or faster than, in younger adults. While age-associated changes in sensorimotor conduction, central neuronal processing and cognitive functions are linked to delayed voluntary stepping, alterations in the coupling of posture and locomotion may also prolong step triggering. It is less clear, however, how these factors may explain the accelerated triggering of induced stepping. We present a conceptual model that addresses this issue. For voluntary stepping, a disruption in the normal coupling between posture and locomotion may underlie step-triggering delays through suppression of the locomotion network based on an estimation of the evolving mechanical state conditions for stability. During induced stepping, accelerated step initiation may represent an event-triggering process whereby stepping is released according to the occurrence of a perturbation rather than to the specific sensorimotor information reflecting the evolving instability. In this case, errors in the parametric control of induced stepping and its effectiveness in stabilizing balance would be likely to occur. We further suggest that there is a residual adaptive capacity with ageing that could be exploited to improve paradoxical triggering and other changes in protective stepping to impact fall risk.
Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols.
Liu, Zhengchun; Liu, Yi; Kim, Eunkyoung; Bentley, William E; Payne, Gregory F
2016-07-19
The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM-0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information.
ERIC Educational Resources Information Center
Braune, Rolf; Foshay, Wellesley R.
1983-01-01
The proposed three-step strategy for research on human information processing--concept hierarchy analysis, analysis of example sets to teach relations among concepts, and analysis of problem sets to build a progressively larger schema for the problem space--may lead to practical procedures for instructional design and task analysis. Sixty-four…
Glaser, John P.
2008-01-01
Partners Healthcare and its affiliated hospitals have a long track record of accomplishments in clinical information systems implementations and research. Seven ideas have shaped the information systems strategies and tactics at Partners: centrality of processes, organizational partnerships, progressive incrementalism, agility, architecture, embedded research, and engaging the field. This article reviews the ideas and discusses the rationale and steps taken to put the ideas into practice. PMID:18308978
ERIC Educational Resources Information Center
Blau, Vera; Reithler, Joel; van Atteveldt, Nienke; Seitz, Jochen; Gerretsen, Patty; Goebel, Rainer; Blomert, Leo
2010-01-01
Learning to associate auditory information of speech sounds with visual information of letters is a first and critical step for becoming a skilled reader in alphabetic languages. Nevertheless, it remains largely unknown which brain areas subserve the learning and automation of such associations. Here, we employ functional magnetic resonance…
ERIC Educational Resources Information Center
Feinberg, Lynn
2008-01-01
Assessment is a critical step in determining appropriate support services. This article discusses "caregiver assessment," a systematic process of gathering information to describe a caregiving situation. Caregiver assessment identifies the particular problems, needs, resources, and strengths of the family caregiver and approaches issues from the…
Guz, Nataliia; Halámek, Jan; Rusling, James F.; Katz, Evgeny
2014-01-01
A biocatalytic cascade based on enzyme-catalyzed reactions, activated by several biomolecular input signals and producing an output signal after each reaction step, was developed as an example of a logically reversible information processing system. The model system was designed to mimic the operation of concatenated AND logic gates with optically readable output signals generated at each step of the logic operation. Implications include concurrent bioanalyses and data interpretation for medical diagnostics. PMID:24748446
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
A Geometry Based Infra-structure for Computational Analysis and Design
NASA Technical Reports Server (NTRS)
Haimes, Robert
1997-01-01
The computational steps traditionally taken for most engineering analysis (CFD, structural analysis, etc.) are: Surface Generation - usually by employing a CAD system; Grid Generation - preparing the volume for the simulation; Flow Solver - producing the results at the specified operational point; and Post-processing Visualization - interactively attempting to understand the results. For structural analysis, integrated systems can be obtained from a number of commercial vendors. For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data was transmitted between phases via files. Specifically, the problems with this procedure are: (1) File based. Information flows from one step to the next via data files with formats specified for that procedure. (2) 'Good' Geometry. A bottleneck in getting results from a solver is the construction of proper geometry to be fed to the grid generator. With 'good' geometry a grid can be constructed in tens of minutes (even with a complex configuration) using unstructured techniques. (3) One-Way communication. All information travels on from one phase to the next. Until this process can be automated, more complex problems such as multi-disciplinary analysis, or using the above procedure for design, become prohibitive.
Intracranial Cortical Responses during Visual–Tactile Integration in Humans
Quinn, Brian T.; Carlson, Chad; Doyle, Werner; Cash, Sydney S.; Devinsky, Orrin; Spence, Charles; Halgren, Eric
2014-01-01
Sensory integration of touch and sight is crucial to perceiving and navigating the environment. While recent evidence from other sensory modality combinations suggests that low-level sensory areas integrate multisensory information at early processing stages, little is known about how the brain combines visual and tactile information. We investigated the dynamics of multisensory integration between vision and touch using the high spatial and temporal resolution of intracranial electrocorticography in humans. We present a novel, two-step metric for defining multisensory integration. The first step compares the sum of the unisensory responses with the bimodal response to identify multisensory interactions. The second step eliminates the possibility that double addition of sensory responses could be misinterpreted as interactions. Using these criteria, averaged local field potentials and high-gamma-band power demonstrate a functional processing cascade whereby sensory integration occurs late, both anatomically and temporally, in the temporo-parieto-occipital junction (TPOJ) and dorsolateral prefrontal cortex. Results further suggest two neurophysiologically distinct and temporally separated integration mechanisms in TPOJ, while providing direct evidence for local suppression as a dominant mechanism for synthesizing visual and tactile input. These results tend to support earlier concepts of multisensory integration as relatively late and centered in tertiary multimodal association cortices. PMID:24381279
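A minimal numpy sketch of the first step of the metric, comparing the bimodal response to the sum of the unisensory responses; the trial counts, signal values, and sub-additive outcome are synthetic stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
trials, samples = 40, 200  # per-condition trials x time points

# Synthetic evoked responses (e.g., high-gamma power) for one channel.
visual = rng.normal(1.0, 0.3, (trials, samples))
tactile = rng.normal(0.8, 0.3, (trials, samples))
bimodal = rng.normal(1.5, 0.3, (trials, samples))  # sub-additive by design

# Step 1: compare the bimodal mean to the sum of the unisensory means.
summed = visual.mean(axis=0) + tactile.mean(axis=0)
interaction = bimodal.mean(axis=0) - summed  # negative => suppression

print(f"mean interaction: {interaction.mean():.2f} "
      "(negative values indicate sub-additive, suppressive integration)")
```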
Designing Health Information Technology Tools to Prevent Gaps in Public Health Insurance.
Hall, Jennifer D; Harding, Rose L; DeVoe, Jennifer E; Gold, Rachel; Angier, Heather; Sumic, Aleksandra; Nelson, Christine A; Likumahuwa-Ackman, Sonja; Cohen, Deborah J
2017-06-23
Changes in health insurance policies have increased coverage opportunities, but enrollees are required to reapply annually for benefits, which, if not managed appropriately, can lead to insurance gaps. Electronic health records (EHRs) can automate processes for assisting patients with health insurance enrollment and re-enrollment. We describe community health centers' (CHCs) workflow, documentation, and tracking needs for assisting families with insurance application processes, and the health information technology (IT) tool components that were developed to meet those needs. We conducted a qualitative study using semi-structured interviews and observation of clinic operations and insurance application assistance processes. Data were analyzed using a grounded theory approach. We diagrammed workflows and shared information with a team of developers who built the EHR-based tools. Four steps in the insurance assistance workflow were common among CHCs: 1) identifying patients for public health insurance application assistance; 2) completing and submitting the public health insurance application, when clinic staff met with patients to collect requisite information and helped them apply for benefits; 3) tracking public health insurance approval to monitor for decisions; and 4) assisting with annual health insurance reapplication. We developed EHR-based tools to support clinical staff with each of these steps. CHCs are uniquely positioned to help patients and families with public health insurance applications. CHCs have invested in staff to assist patients with insurance applications and help prevent coverage gaps. To best assist patients and to foster efficiency, EHR-based insurance tools need comprehensive, timely, and accurate health insurance information.
Patterson, Brandon J; Bakken, Brianne K; Doucette, William R; Urmie, Julie M; McDonough, Randal P
The evolving health care system necessitates pharmacy organizations' adjustments by delivering new services and establishing inter-organizational relationships. One approach supporting pharmacy organizations in making changes may be informal learning by technicians, pharmacists, and pharmacy owners. Informal learning is characterized by a four-step cycle including intent to learn, action, feedback, and reflection. This framework helps explain individual and organizational factors that influence learning processes within an organization as well as the individual and organizational outcomes of those learning processes. A case study was conducted of an Iowa independent community pharmacy with years of experience in offering patient care services. Nine semi-structured interviews with pharmacy personnel revealed initial evidence in support of the informal learning model in practice. Future research could investigate more fully the informal learning model in the delivery of patient care services in community pharmacies.
Idle waves in high-performance computing
NASA Astrophysics Data System (ADS)
Markidis, Stefano; Vencels, Juris; Peng, Ivy Bo; Akhmetova, Dana; Laure, Erwin; Henri, Pierre
2015-01-01
The vast majority of parallel scientific applications distribute computation among processes that are in a busy state when computing and in an idle state when waiting for information from other processes. We identify the propagation of idle waves through the processes of scientific applications with local information exchange between neighbouring processes. Idle waves are nondispersive and have a phase velocity inversely proportional to the average busy time. The physical mechanism enabling the propagation of idle waves is the local synchronization between two processes due to remote data dependency. This study provides a description of the large number of processes in parallel scientific applications as a continuous medium. This work is also a step towards an understanding of how localized idle periods can affect remote processes, leading to the degradation of global performance in parallel scientific applications.
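A toy synchronous simulation of this mechanism, assuming a ring of processes in which each step blocks on a message from the left neighbour; the process count, busy time, and single injected delay are invented for the sketch:

```python
import numpy as np

P, T, busy = 16, 30, 1.0   # processes, time steps, busy time per step
finish = np.zeros((T, P))
idle = np.zeros((T, P))

for t in range(1, T):
    for i in range(P):
        left = finish[t - 1, (i - 1) % P]        # message from left neighbour
        start = max(finish[t - 1, i], left)      # block until it arrives
        idle[t, i] = start - finish[t - 1, i]    # time spent waiting
        delay = 5.0 if (t == 1 and i == 0) else 0.0  # single injected slowdown
        finish[t, i] = start + busy + delay

# The injected delay reappears one rank further at each step: an idle wave
# travelling at ~1 rank per busy period, i.e. velocity ~ 1/busy.
for t in range(2, 8):
    print(t, int(np.argmax(idle[t])), round(float(idle[t].max()), 1))
```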
Riaz, Qaiser; Vögele, Anna; Krüger, Björn; Weber, Andreas
2015-01-01
A number of previous works have shown that information about a subject is encoded in sparse kinematic information, such as that revealed by so-called point light walkers. With the work at hand, we extend these results to classifications of soft biometrics from inertial sensor recordings at a single body location from a single step. We recorded accelerations and angular velocities of 26 subjects using inertial measurement units (IMUs) attached at four locations (chest, lower back, right wrist and left ankle) when performing standardized gait tasks. The collected data were segmented into individual walking steps. We trained random forest classifiers in order to estimate soft biometrics (gender, age and height). We applied two different validation methods to the process, 10-fold cross-validation and subject-wise cross-validation. For all three classification tasks, we achieve high accuracy values for all four sensor locations. From these results, we can conclude that the data of a single walking step (6D: accelerations and angular velocities) allow for a robust estimation of the gender, height and age of a person. PMID:26703601
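A sketch of the classification setup in scikit-learn, contrasting 10-fold with subject-wise cross-validation; the features and labels are synthetic stand-ins for the per-step IMU descriptors, not the paper's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, GroupKFold

rng = np.random.default_rng(2)
n_subjects, steps_per_subject = 26, 20
subjects = np.repeat(np.arange(n_subjects), steps_per_subject)

# Synthetic per-step features, e.g. summary statistics of the 6D signal
# (3-axis acceleration + 3-axis angular velocity) over one segmented step.
X = rng.normal(size=(subjects.size, 24))
gender = rng.integers(0, 2, n_subjects)[subjects]  # per-subject label

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# 10-fold CV mixes steps of the same subject across folds...
print(cross_val_score(clf, X, gender, cv=10).mean())
# ...subject-wise CV keeps each subject's steps in a single fold,
# the stricter estimate of generalization to unseen people.
print(cross_val_score(clf, X, gender, cv=GroupKFold(n_splits=5),
                      groups=subjects).mean())
```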
Chained Kullback-Leibler Divergences
Pavlichin, Dmitri S.; Weissman, Tsachy
2017-01-01
We define and characterize the “chained” Kullback-Leibler divergence $\min_w [D(p\|w) + D(w\|q)]$, minimized over all intermediate distributions $w$, and the analogous $k$-fold chained K-L divergence $\min [D(p\|w_{k-1}) + \cdots + D(w_2\|w_1) + D(w_1\|q)]$, minimized over the entire path $(w_1, \ldots, w_{k-1})$. This quantity arises in a large deviations analysis of a Markov chain on the set of types – the Wright-Fisher model of neutral genetic drift: a population with allele distribution $q$ produces offspring with allele distribution $w$, which then produce offspring with allele distribution $p$, and so on. The chained divergences enjoy some of the same properties as the K-L divergence (like joint convexity in the arguments) and appear in $k$-step versions of some of the same settings as the K-L divergence (like information projections and a conditional limit theorem). We further characterize the optimal $k$-step “path” of distributions appearing in the definition and apply our findings in a large deviations analysis of the Wright-Fisher process. We make a connection to information geometry via the previously studied continuum limit, where the number of steps tends to infinity, and the limiting path is a geodesic in the Fisher information metric. Finally, we offer a thermodynamic interpretation of the chained divergence (as the rate of operation of an appropriately defined Maxwell’s demon) and we state some natural extensions and applications (a $k$-step mutual information and $k$-step maximum likelihood inference). We release code for computing the objects we study. PMID:29130024
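A numeric sketch of the basic object: minimizing $D(p\|w) + D(w\|q)$ over the intermediate distribution $w$, with the simplex handled through a softmax parameterization. The distributions and optimizer choice are illustrative; this is not the authors' released code:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import rel_entr, softmax

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.2, 0.3, 0.5])

def chained_kl(z):
    w = softmax(z)  # keep w on the probability simplex
    return rel_entr(p, w).sum() + rel_entr(w, q).sum()

res = minimize(chained_kl, x0=np.zeros(3), method="BFGS")
w_opt = softmax(res.x)
kl_direct = rel_entr(p, q).sum()
print("optimal intermediate w:", np.round(w_opt, 3))
# Taking w = p or w = q shows the chained value never exceeds D(p||q).
print(f"chained divergence {res.fun:.4f} <= direct K-L {kl_direct:.4f}")
```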
Automating the evaluation of flood damages: methodology and potential gains
NASA Astrophysics Data System (ADS)
Eleutério, Julian; Martinez, Edgar Daniel
2010-05-01
The evaluation of flood damage potential consists of three main steps: assessing and processing data, combining data, and calculating potential damages. The first step consists of modelling hazard and assessing vulnerability. In general, this step of the evaluation demands more time and investment than the others. The second step of the evaluation consists of combining spatial data on hazard with spatial data on vulnerability. A Geographic Information System (GIS) is a fundamental tool in the realization of this step. GIS software allows the simultaneous analysis of spatial and matrix data. The third step of the evaluation consists of calculating potential damages by means of damage functions or contingent analysis. All steps demand time and expertise. However, the last two steps must be realized several times when comparing different management scenarios. In addition, uncertainty analysis and sensitivity tests are made during the second and third steps of the evaluation. The feasibility of these steps could be relevant in the choice of the extent of the evaluation. Low feasibility could lead to choosing not to evaluate uncertainty or to limiting the number of scenario comparisons. Several computer models have been developed over time in order to evaluate flood risk. GIS software is largely used to carry out flood risk analyses. The software is used to combine and process different types of data, and to visualise the risk and the evaluation results. The main advantages of using a GIS in these analyses are: the possibility of "easily" realising the analyses several times, in order to compare different scenarios and study uncertainty; the generation of datasets which could be used at any time in the future to support territorial decision making; and the possibility of adding information over time to update the dataset and make other analyses. However, these analyses require personnel specialisation and time. The use of GIS software to evaluate flood risk requires personnel with a double professional specialisation: the professional should be proficient in GIS software and in flood damage analysis (which is already a multidisciplinary field). Great effort is necessary in order to correctly evaluate flood damages, and updating and improving the evaluation over time becomes a difficult task. The automation of this process should bring great advances in flood management studies over time, especially for public utilities. This study has two specific objectives: (1) show the entire process of automation of the second and third steps of flood damage evaluations; and (2) analyse the potential gains in terms of time and expertise needed in the analysis. A programming language is used within GIS software in order to automate the combination of hazard and vulnerability data and the calculation of potential damages. We discuss the overall process of flood damage evaluation. The main result of this study is a computational tool which allows significant operational gains in flood loss analyses. We quantify these gains by means of a hypothetical example. The tool significantly reduces the time of analysis and the need for expertise. An indirect gain is that sensitivity and cost-benefit analyses can be more easily realised.
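A minimal numpy sketch of the automated second and third steps: overlaying a hazard grid with vulnerability grids and applying depth-damage functions. The grids, asset classes, and damage curves below are invented for illustration, not the study's tool:

```python
import numpy as np

# Step 2: combine hazard (water depth per cell) with vulnerability
# (asset class and value per cell); step 3: apply depth-damage functions.
depth = np.array([[0.0, 0.5, 1.2],
                  [0.3, 0.8, 2.0]])            # m, from hydraulic model
asset_value = np.array([[0, 100, 250],
                        [80, 120, 300]])       # value per cell (k-euro)
asset_class = np.array([[0, 1, 1],
                        [1, 2, 2]])            # 0=none, 1=housing, 2=industry

def damage_fraction(depth, cls):
    """Piecewise-linear depth-damage curves per asset class (illustrative)."""
    caps = {0: 0.0, 1: 1.0, 2: 0.8}            # max damageable fraction
    rates = {0: 0.0, 1: 0.5, 2: 0.35}          # fraction per metre of depth
    return np.minimum(rates[cls] * depth, caps[cls])

frac = np.zeros_like(depth)
for cls in np.unique(asset_class):
    mask = asset_class == cls
    frac[mask] = damage_fraction(depth[mask], cls)

damage = frac * asset_value
print(f"total potential damage: {damage.sum():.0f} k-euro")
```

Because the overlay and the damage calculation are plain array operations, rerunning them for a new scenario or a perturbed input (for sensitivity analysis) costs seconds rather than a manual GIS session.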
Jaeger, Johannes; Irons, David; Monk, Nick
2008-10-01
Positional specification by morphogen gradients is traditionally viewed as a two-step process. A gradient is formed and then interpreted, providing a spatial metric independent of the target tissue, similar to the concept of space in classical mechanics. However, the formation and interpretation of gradients are coupled, dynamic processes. We introduce a conceptual framework for positional specification in which cellular activity feeds back on positional information encoded by gradients, analogous to the feedback between mass-energy distribution and the geometry of space-time in Einstein's general theory of relativity. We discuss how such general relativistic positional information (GRPI) can guide systems-level approaches to pattern formation.
Point Clouds to Indoor/outdoor Accessibility Diagnosis
NASA Astrophysics Data System (ADS)
Balado, J.; Díaz-Vilariño, L.; Arias, P.; Garrido, I.
2017-09-01
This work presents an approach to automatically detect structural floor elements such as steps or ramps in the immediate environment of buildings, elements that may affect the accessibility of buildings. The methodology is based on Mobile Laser Scanner (MLS) point cloud and trajectory information. First, the street is segmented into stretches along the trajectory of the MLS in order to work in regular spaces. Next, the lower region of each stretch (the ground zone) is selected as the ROI, and normal, curvature and tilt are calculated for each point. With this information, points in the ROI are classified as horizontal, inclined or vertical. Points are refined and grouped into structural elements using raster processing and connected components, in different phases for each type of previously classified point. Finally, the trajectory data are used to distinguish between road and sidewalks. Adjacency information is used to classify structural elements as steps, ramps, curbs and curb-ramps. The methodology is tested in a real case study, consisting of 100 m of an urban street. Ground elements are correctly classified in an acceptable computation time. Steps and ramps are also exported to GIS software to enrich building models from OpenStreetMap with information about accessible/inaccessible entrances and their locations.
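A minimal sketch of the point-classification stage: deriving tilt from per-point normals and thresholding into the three classes. The normals and angle thresholds are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

# Classify ground points as horizontal / inclined / vertical from normals.
normals = np.array([[0.0, 0.0, 1.0],    # flat ground
                    [0.3, 0.0, 0.95],   # ramp-like slope
                    [1.0, 0.0, 0.05]])  # step riser / curb face
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

# Tilt = angle between the normal and the vertical axis.
tilt = np.degrees(np.arccos(np.clip(np.abs(normals[:, 2]), -1, 1)))

labels = np.full(len(tilt), "inclined", dtype=object)
labels[tilt < 10] = "horizontal"   # near-flat: sidewalk, road
labels[tilt > 75] = "vertical"     # riser or curb
print(list(zip(np.round(tilt, 1), labels)))
```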
Using a contextualized sensemaking model for interaction design: A case study of tumor contouring.
Aselmaa, Anet; van Herk, Marcel; Laprie, Anne; Nestle, Ursula; Götz, Irina; Wiedenmann, Nicole; Schimek-Jasch, Tanja; Picaud, Francois; Syrykh, Charlotte; Cagetti, Leonel V; Jolnerovski, Maria; Song, Yu; Goossens, Richard H M
2017-01-01
Sensemaking theories help designers understand the cognitive processes of a user when he/she performs a complicated task. This paper introduces a two-step approach to incorporating sensemaking support within the design of health information systems by: (1) modeling the sensemaking process of physicians while performing a task, and (2) identifying software interaction design requirements that support sensemaking based on this model. The two-step approach is presented based on a case study of the tumor contouring clinical task for radiotherapy planning. In the first step of the approach, a contextualized sensemaking model was developed to describe the sensemaking process based on the goal, the workflow and the context of the task. In the second step, based on a research software prototype, an experiment was conducted in which eight physicians each performed three contouring tasks. Four types of navigation interactions and five types of interaction sequence patterns were identified by analyzing the interaction log data gathered from those twenty-four cases. Further in-depth study of each of the navigation interactions and interaction sequence patterns in relation to the contextualized sensemaking model revealed five main areas of design improvement to increase sensemaking support. Outcomes of the case study indicate that the proposed two-step approach was beneficial for gaining a deeper understanding of the sensemaking process during the task, as well as for identifying design requirements for better sensemaking support.
Hill, Jacqueline J; Kuyken, Willem; Richards, David A
2014-11-20
Stepped care is recommended and implemented as a means to organise depression treatment. Compared with alternative systems, it is assumed to achieve equivalent clinical effects and greater efficiency. However, no trials have examined these assumptions. A fully powered trial of stepped care compared with intensive psychological therapy is required but a number of methodological and procedural uncertainties associated with the conduct of a large trial need to be addressed first. STEPS (Developing stepped care treatment for depression) is a mixed methods study to address uncertainties associated with a large-scale evaluation of stepped care compared with high-intensity psychological therapy alone for the treatment of depression. We will conduct a pilot randomised controlled trial with an embedded process study. Quantitative trial data on recruitment, retention and the pathway of patients through treatment will be used to assess feasibility. Outcome data on the effects of stepped care compared with high-intensity therapy alone will inform a sample size calculation for a definitive trial. Qualitative interviews will be undertaken to explore what people think of our trial methods and procedures and the stepped care intervention. A minimum of 60 patients with Major Depressive Disorder will be recruited from an Improving Access to Psychological Therapies service and randomly allocated to receive stepped care or intensive psychological therapy alone. All treatments will be delivered at clinic facilities within the University of Exeter. Quantitative patient-related data on depressive symptoms, worry and anxiety and quality of life will be collected at baseline and 6 months. The pilot trial and interviews will be undertaken concurrently. Quantitative and qualitative data will be analysed separately and then integrated. The outcomes of this study will inform the design of a fully powered randomised controlled trial to evaluate the effectiveness and efficiency of stepped care. Qualitative data on stepped care will be of immediate interest to patients, clinicians, service managers, policy makers and guideline developers. A more informed understanding of the feasibility of a large trial will be obtained than would be possible from a purely quantitative (or qualitative) design. Current Controlled Trials ISRCTN66346646 registered on 2 July 2014.
Gonzalez-Sanchez, M Beatriz; Lopez-Valeiras, Ernesto; Morente, Manuel M; Fernández Lago, Orlando
2013-10-01
Current economic conditions and budget constraints in publicly funded biomedical research have brought about a renewed interest in analyzing the cost and economic viability of research infrastructures. However, there are no proposals for specific cost accounting models for these types of organizations in the international scientific literature. The aim of this paper is to present the basis of a cost analysis model useful for any biobank regardless of the human biological samples that it stores for biomedical research. The development of a unique cost model for biobanks can be a complicated task due to the diversity of the biological samples they store. Different types of samples (DNA, tumor tissues, blood, serum, etc.) require different production processes. Nonetheless, the common basic steps of the production process can be identified. Thus, the costs incurred in each step can be analyzed in detail to provide cost information. Six stages and four cost objects were obtained by taking the production processes of biobanks belonging to the Spanish National Biobank Network as a starting point. Templates and examples are provided to help managers to identify and classify the costs involved in their own biobanks to implement the model. The application of this methodology will provide accurate information on cost objects, along with useful information to give an economic value to the stored samples, to analyze the efficiency of the production process and to evaluate the viability of some sample collections.
Seven Steps for Success: Selecting IT Consultants
ERIC Educational Resources Information Center
Moriarty, Daniel F.
2004-01-01
Information technology (IT) presents community colleges with both powerful opportunities and formidable challenges. The prospects of expedited and more efficient business processes, greater student access through distance learning, improved communication, and strengthened relationships with students can embolden the most hesitant college…
The AskA Starter Kit: How To Build and Maintain Digital Reference Services.
ERIC Educational Resources Information Center
Lankes, R. David; Kasowitz, Abby S.
This Starter Kit is designed to help organizations and individuals who wish to offer human-mediated information services via the Internet to users in the K-12 community. A six-step process is proposed for organizations to follow in creating an "AskA" service. This process addresses all aspects involved in building and maintaining an AskA…
Erika L. Rowland; Jennifer E. Davison; Lisa J. Graumlich
2011-01-01
Assessing the impact of climate change on species and associated management objectives is a critical initial step for engaging in the adaptation planning process. Multiple approaches are available. While all possess limitations to their application associated with the uncertainties inherent in the data and models that inform their results, conducting and incorporating...
Garcí A-de-León-Chocano, Ricardo; Sáez, Carlos; Muñoz-Soler, Verónica; Garcí A-de-León-González, Ricardo; García-Gómez, Juan M
2015-12-01
This is the first paper of a series of two regarding the construction of data quality (DQ) assured repositories for the reuse of information on infant feeding from birth until two years of age. This first paper justifies the need for such repositories and describes the design of a process to construct them from Electronic Health Records (EHRs). As a result, Part 1 proposes a computational process to obtain quality-assured datasets represented by a canonical structure extracted from raw data from multiple EHRs. For this, 13 steps were defined to ensure the harmonization, standardization, completion, de-duplication, and consistency of the dataset content. Moreover, the quality of the input and output data for each of these steps is controlled according to eight DQ dimensions: predictive value, correctness, duplication, consistency, completeness, contextualization, temporal stability and spatial stability. The second paper of the series will describe the application of this computational process to construct the first quality-assured repository for the reuse of information on infant feeding in the perinatal period, aimed at the monitoring of clinical activities and research.
Tools to support evidence-informed public health decision making.
Yost, Jennifer; Dobbins, Maureen; Traynor, Robyn; DeCorby, Kara; Workentine, Stephanie; Greco, Lori
2014-07-18
Public health professionals are increasingly expected to engage in evidence-informed decision making to inform practice and policy decisions. Evidence-informed decision making involves the use of research evidence along with expertise, existing public health resources, knowledge about community health issues, the local context and community, and the political climate. The National Collaborating Centre for Methods and Tools has identified a seven step process for evidence-informed decision making. Tools have been developed to support public health professionals as they work through each of these steps. This paper provides an overview of tools used in three Canadian public health departments involved in a study to develop capacity for evidence-informed decision making. As part of a knowledge translation and exchange intervention, a Knowledge Broker worked with public health professionals to identify and apply tools for use with each of the steps of evidence-informed decision making. The Knowledge Broker maintained a reflective journal and interviews were conducted with a purposive sample of decision makers and public health professionals. This paper presents qualitative analysis of the perceived usefulness and usability of the tools. Tools were used in the health departments to assist in: question identification and clarification; searching for the best available research evidence; assessing the research evidence for quality through critical appraisal; deciphering the 'actionable message(s)' from the research evidence; tailoring messages to the local context to ensure their relevance and suitability; deciding whether and planning how to implement research evidence in the local context; and evaluating the effectiveness of implementation efforts. Decision makers provided descriptions of how the tools were used within the health departments and made suggestions for improvement. Overall, the tools were perceived as valuable for advancing and sustaining evidence-informed decision making. Tools are available to support the process of evidence-informed decision making among public health professionals. The usability and usefulness of these tools for advancing and sustaining evidence-informed decision making are discussed, including recommendations for the tools' application in other public health settings beyond this study. Knowledge and awareness of these tools may assist other health professionals in their efforts to implement evidence-informed practice.
A Combinatorial Geometry Computer Description of the M9 ACE (Armored Combat Earthmover) Vehicle
1984-12-01
program requires as input the M9 target descriptions as processed by the Geometric Information for Targets (GIFT) computer code. The first step is...model of the target. This COM-GEOM target description is used as input to the Geometric Information For Targets (GIFT) computer code. Among other...things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit
Angus, Lynne
2012-01-01
This paper addresses the fundamental contributions of client narrative disclosure in psychotherapy and its importance for the elaboration of new emotional meanings and self understanding in the context of Emotion-focused therapy (EFT) of depression. An overview of the multi-methodological steps undertaken to empirically investigate the contributions of client story telling, emotional differentiation and meaning-making processes (Narrative Processes Coding System; Angus et al., 1999) in EFT treatments of depression is provided, followed by a summary of key research findings that informed the development of a narrative-informed approach to Emotion-focused therapy of depression (Angus & Greenberg, 2011). Finally, the clinical practice and training implications of adopting a research-informed approach to working with narrative and emotion processes in EFT are described, and future research directions discussed.
NASA Astrophysics Data System (ADS)
Raj, R.; Hamm, N. A. S.; van der Tol, C.; Stein, A.
2015-08-01
Gross primary production (GPP), separated from flux tower measurements of net ecosystem exchange (NEE) of CO2, is increasingly used to validate process-based simulators and remote sensing-derived estimates of simulated GPP at various time steps. Proper validation should include the uncertainty associated with this separation at different time steps. This can be achieved by using a Bayesian framework. In this study, we estimated the uncertainty in GPP at half-hourly time steps. We used a non-rectangular hyperbola (NRH) model to separate GPP from flux tower measurements of NEE at the Speulderbos forest site, The Netherlands. The NRH model included the variables that influence GPP, in particular radiation and temperature. In addition, the NRH model provided a robust empirical relationship between radiation and GPP by including the degree of curvature of the light response curve. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. Adopting a Bayesian approach, we defined the prior distribution of each NRH parameter. Markov chain Monte Carlo (MCMC) simulation was used to update the prior distribution of each NRH parameter. This allowed us to estimate the uncertainty in the separated GPP at half-hourly time steps, yielding the posterior distribution of GPP at each half hour and allowing the quantification of uncertainty. The time series of posterior distributions thus obtained allowed us to estimate the uncertainty at daily time steps. We compared informative with non-informative prior distributions of the NRH parameters. The results showed that both choices of prior produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
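A compact sketch of this idea using the standard NRH light-response form, GPP(I) = (aI + Pmax - sqrt((aI + Pmax)^2 - 4*theta*a*I*Pmax)) / (2*theta), fitted with a plain Metropolis sampler. The flat priors, fixed noise level, synthetic data, and sampler settings are simplifying assumptions, not the study's exact setup:

```python
import numpy as np

rng = np.random.default_rng(3)

def nrh(par, alpha, pmax, theta):
    """Non-rectangular hyperbola light response of GPP."""
    s = alpha * par + pmax
    return (s - np.sqrt(s**2 - 4 * theta * alpha * par * pmax)) / (2 * theta)

par = rng.uniform(0, 1500, 200)                      # PAR samples
gpp_obs = nrh(par, 0.05, 25, 0.7) + rng.normal(0, 1.0, par.size)

def log_post(p):
    alpha, pmax, theta = p
    if not (0 < alpha < 1 and 0 < pmax < 60 and 0 < theta < 1):
        return -np.inf                               # flat (non-informative) prior
    resid = gpp_obs - nrh(par, alpha, pmax, theta)
    return -0.5 * np.sum(resid**2)                   # Gaussian noise, sigma = 1

# Plain Metropolis sampler (a stand-in for the study's MCMC scheme).
chain, cur = [], np.array([0.03, 20.0, 0.5])
cur_lp = log_post(cur)
for _ in range(20000):
    prop = cur + rng.normal(0, [0.002, 0.5, 0.02])
    lp = log_post(prop)
    if np.log(rng.uniform()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    chain.append(cur)
samples = np.array(chain[5000:])                     # drop burn-in

# Posterior of GPP at one half-hourly PAR value -> uncertainty estimate.
gpp_post = nrh(800.0, *samples.T)
print(f"GPP at PAR=800: {gpp_post.mean():.1f} +/- {gpp_post.std():.1f}")
```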
Information Fluxes as Concept for Categorizations of Life
NASA Astrophysics Data System (ADS)
Hildenbrand, Georg; Hausmann, M.
2012-05-01
Definitions of life are controversially discussed; however, they mostly depend on bio-evolutionary arguments. Here, we propose a systematic, theoretical approach to the question of what life is, by categorization and classification of different levels of life. This approach is mainly based on the analysis of the information flux occurring in systems suspected of being alive, and on the analysis of their power of environmental control. In a first step, we show that all biological definitions of life can be derived from basic physical principles of entropy (the number of possible states of a thermodynamic system) and of the energy needed for controlling entropic development. In a next step, we discuss how any process in which an information flux is generated, regardless of its materialization, is defined and related to classical definitions of life. In a third step, we resume the proposed classification scheme in its most basic form, looking only for the existence of data storage, its processing, and its environmental control. We include a short discussion of how the materialization of information fluxes can take place depending on the special properties of the four basic physical forces. Having done all this, we are able to put a classification catalogue in everybody's hands with which one can categorize the kind of life one is talking about, thus overcoming the obstacles deriving from the seemingly simple question of whether something is alive or not. On its most basic level as presented here, our scheme offers a categorization for fire, crystals, prions, viruses, spores, up to cells and even tardigrada and cryostases.
SED16 autonomous star tracker night sky testing
NASA Astrophysics Data System (ADS)
Foisneau, Thierry; Piriou, Véronique; Perrimon, Nicolas; Jacob, Philippe; Blarre, Ludovic; Vilaire, Didier
2017-11-01
The SED16 is an autonomous multi-mission star tracker which delivers the three-axis satellite attitude in an inertial reference frame and the satellite angular velocity with no prior information. The qualification process of this star sensor includes five validation steps using an optical star simulator, a digitized image simulator and a night sky test setup. The night sky testing was the final step of the qualification process, during which all the functions of the star tracker were used in almost nominal conditions: autonomous acquisition of the attitude and autonomous tracking of ten stars. These tests were performed at Calern on the premises of the OCA (Observatoire de la Côte d'Azur). The test setup and the test results are described after a brief review of the sensor's main characteristics and qualification process.
The effect of external forces on discrete motion within holographic optical tweezers.
Eriksson, E; Keen, S; Leach, J; Goksör, M; Padgett, M J
2007-12-24
Holographic optical tweezers is a widely used technique for manipulating the individual positions of optically trapped micron-sized particles in a sample. The trap positions are changed by updating the holographic image displayed on a spatial light modulator. The updating process takes a finite time, resulting in a temporary decrease of the intensity, and thus the stiffness, of the optical trap. We have investigated this change in trap stiffness during the updating process by studying the motion of an optically trapped particle in a fluid flow. We found a highly nonlinear dependence of the change in trap stiffness on step size. For step sizes up to approximately 300 nm the trap stiffness decreases. Above 300 nm the change in trap stiffness remains constant for all step sizes up to one particle radius. This information is crucial for optical force measurements using holographic optical tweezers.
2016-01-01
This review aimed to set out the process of a systematic review of genome-wide association studies in order to conduct and apply a genome-wide meta-analysis (GWMA). The process consists of five steps: searching and selection, extraction of related information, evaluation of validity, meta-analysis by type of genetic model, and evaluation of heterogeneity. In contrast to intervention meta-analyses, a GWMA has to evaluate Hardy–Weinberg equilibrium (HWE) in the third step and conduct meta-analyses under five potential genetic models (dominant, recessive, homozygote contrast, heterozygote contrast, and allelic contrast) in the fourth step. The ‘genhwcci’ and ‘metan’ commands of STATA evaluate HWE and calculate a summary effect size, respectively. A meta-regression using the ‘metareg’ command of STATA should be conducted to evaluate factors related to heterogeneity. PMID:28092928
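Python equivalents of the two computations named above, mirroring the kind of HWE check that ‘genhwcci’ performs and the inverse-variance fixed-effect pooling that ‘metan’ performs; the genotype counts and per-study effects are invented:

```python
import numpy as np
from scipy.stats import chi2

# Step 3: Hardy-Weinberg equilibrium check in controls (chi-square, 1 df).
aa, ab, bb = 210, 480, 310                 # observed genotype counts
n = aa + ab + bb
p = (2 * aa + ab) / (2 * n)                # allele frequency
exp = n * np.array([p**2, 2 * p * (1 - p), (1 - p) ** 2])
x2 = (((np.array([aa, ab, bb]) - exp) ** 2) / exp).sum()
print(f"HWE chi-square = {x2:.2f}, p = {chi2.sf(x2, df=1):.3f}")

# Step 4: fixed-effect summary of per-study log odds ratios for one
# genetic model (e.g., allelic contrast).
log_or = np.array([0.18, 0.25, 0.10, 0.31])
se = np.array([0.08, 0.12, 0.09, 0.15])
w = 1 / se**2
pooled = (w * log_or).sum() / w.sum()
print(f"pooled OR = {np.exp(pooled):.2f}")
```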
Negotiating Decisions during Informed Consent for Pediatric Phase I Oncology Trials
Marshall, Patricia A.; Magtanong, Ruth V.; Leek, Angela C.; Hizlan, Sabahat; Yamokoski, Amy D.; Kodish, Eric D.
2012-01-01
During informed consent conferences (ICCs) for Phase I trials, oncologists must present complex information while addressing concerns. Research on communication that evolves during ICCs remains largely unexplored. We examined communication during ICCs for pediatric Phase I cancer trials using a stratified random sample from six pediatric cancer centers. A grounded theory approach identified key communication steps and factors influencing the negotiation of decisions for trial participation. Analysis suggests that during ICCs, families, patients, and clinicians exercise choice and control by negotiating micro-decisions in two broad domains: drug logic and logistics, and administration/scheduling. Micro-decisions unfold in a four-step communication process: (1) introduction of an issue; (2) response; (3) negotiation of the issue; and (4) resolution and decision. Negotiation over smaller micro-decisions is prominent in ICCs and merits further study. PMID:22565583
IRIS Toxicological Review of Tetrahydrofuran (THF) ...
EPA is releasing the draft report, Toxicological Review of Tetrahydrofuran, which was distributed to Federal agencies and White House Offices for comment during the Science Discussion step of the IRIS Assessment Development Process. Comments received from other Federal agencies and White House Offices are provided alongside the external peer review panel comments. EPA is undertaking an Integrated Risk Information System (IRIS) health assessment for tetrahydrofuran. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment paradigm, i.e., hazard identification and dose-response evaluation. IRIS assessments are used in combination with situation-specific exposure assessment information to evaluate potential public health risk associated with environmental contaminants.
NASA Technical Reports Server (NTRS)
Souther, J. W.
1981-01-01
The need to teach informational writing as a decision-making process is discussed. Situational analysis, its relationship to decisions in writing, and the need for relevant assignments are considered. Teaching students to ask the right questions is covered. The need to teach writing responsiveness is described. Three steps to get started and four teaching techniques are described. The information needs of the 'expert' and the 'manager' are contrasted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
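The reported trade-offs can be made concrete with a simple expected-time model: with once-through time t, backward-iteration probability p, and rework fraction r, a geometric number of passes gives an expected step time of t(1 + r*p/(1-p)). This is one plausible reading of the parameters named above, not the DART team's actual community model:

```python
def expected_step_time(t_once: float, p_iter: float, rework: float) -> float:
    """Expected time for one process step under repeated backward iteration.

    Assumes a geometric number of passes (probability p_iter of iterating)
    and that each repeat costs rework * t_once. A simple reading of the
    parameters named in the abstract, not the DART community model.
    """
    expected_repeats = p_iter / (1 - p_iter)
    return t_once * (1 + rework * expected_repeats)

print(expected_step_time(10.0, 0.4, 0.6))  # baseline meshing step: 14.0
print(expected_step_time(5.0, 0.4, 0.6))   # halve once-through time: 7.0
print(expected_step_time(10.0, 0.2, 0.6))  # halve iteration odds: 11.5
print(expected_step_time(10.0, 0.4, 0.3))  # halve rework fraction: 12.0
```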
Chlorpyrifos residual behaviors in field crops and transfers during duck pellet feed processing.
Li, Rui; Wei, Wei; He, Liang; Hao, Lili; Ji, Xiaofeng; Zhou, Yu; Wang, Qiang
2014-10-22
Chlorpyrifos is an organophosphorus pesticide widely used on agricultural crops (including food) and animal feeds in China, resulting in heavy contamination. Many studies have focused on the effects of food processing on chlorpyrifos removal, but little information is available on feed-processing steps. Here, chlorpyrifos residual behaviors in field crops and its transfers during duck pellet feed-processing steps were evaluated. In field trials, the highest residues for rice grain, shelled corn, and soybean seed were 12.0, 0.605, and 0.220 mg/kg, respectively. Residues of all rice grain samples and about half of the shelled corn samples exceeded the maximum residue limits (MRLs) of China, and five soybean seed samples exceeded the MRL of China. Chlorpyrifos residue was reduced by 38.2% in brown rice after the raw rice grain was hulled. The residue in bran increased by 71.2% after milling from brown rice. During the squashing step, the residue was reduced by 73.8% in soybean meal. Residues were reduced significantly (23.7-36.8%) during granulation of the rice, maize, and soybean products. By comparison, the grinding process showed only limited influence on chlorpyrifos removal (<10%). The residues of duck pellet feeds produced from the highly contaminated raw materials of this study were 1.01 mg/kg (maize-soybean feed) and 3.20 mg/kg (rice-soybean feed), much higher than the generally accepted limit (0.1 mg/kg) for animal feeding. Chlorpyrifos residues were removed significantly by the processing steps of pellet feeds, but the residue level of the raw materials was the determining factor for the safety of duck feeding.
Stein, Karin; Hindin, Michelle J; Chou, Doris; Say, Lale
2017-02-01
Female genital mutilation (FGM) is a harmful traditional practice that can have a profound impact on the health and well-being of girls and women who undergo the procedure. In recent years, owing to international migration, healthcare providers worldwide are increasingly confronted with the need to provide adequate health care to this population. Recognizing this situation, the WHO recently developed the first evidence-based guidelines on the management of health complications from FGM. To inform the guideline recommendations, an expert-driven, two-step process was conducted. The first step consisted of developing and ranking a list of priority research questions for the evidence retrieval. The second step involved conducting a series of systematic reviews and qualitative data syntheses. In the present paper, we first describe the methodology used in the development and ranking of the research questions (step 1) and then detail the common methodology for the systematic reviews and qualitative evidence syntheses (step 2). © 2017 International Federation of Gynecology and Obstetrics. The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.
Wagner, Robert M [Knoxville, TN; Daw, Charles S [Knoxville, TN; Green, Johney B [Knoxville, TN; Edwards, Kevin D [Knoxville, TN
2008-10-07
This invention is a method of achieving stable, optimal mixtures of HCCI and SI in practical gasoline internal combustion engines comprising the steps of: characterizing the combustion process based on combustion process measurements, determining the ratio of conventional and HCCI combustion, determining the trajectory (sequence) of states for consecutive combustion processes, and determining subsequent combustion process modifications using said information to steer the engine combustion toward desired behavior.
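The patented method reads as a per-cycle feedback loop: measure, characterize, estimate the HCCI/SI ratio, track the state trajectory, then modify the next cycle. The Python sketch below illustrates that loop under stated assumptions; the heat-release metric, the estimator, the toy cycle model, and the proportional steering rule are all invented for illustration and are not the patented algorithm.

```python
import numpy as np

def simulate_cycle(actuator):
    """Toy stand-in for an in-cylinder heat-release measurement (assumption)."""
    t = np.linspace(0.0, 1.0, 100)
    sharpness = 1.0 + 5.0 * actuator          # more actuator -> sharper, HCCI-like burn
    return np.exp(-20.0 * sharpness * (t - 0.5) ** 2)

def characterize(heat_release):
    """Step 1: reduce the combustion measurement to summary metrics."""
    hr = np.asarray(heat_release, dtype=float)
    return {"peak": hr.max(), "total": hr.sum()}

def hcci_ratio(metrics):
    """Step 2: hypothetical estimator of the HCCI fraction of combustion."""
    return float(np.clip(10.0 * metrics["peak"] / (metrics["total"] + 1e-9), 0.0, 1.0))

def steer(trajectory, target=0.5, gain=0.2):
    """Steps 3-4: use the trajectory of consecutive combustion states to choose
    a modification that steers the engine toward the desired behavior."""
    return gain * (target - trajectory[-1])

trajectory, actuator = [], 0.3                 # actuator is a made-up control variable
for cycle in range(200):
    measurement = simulate_cycle(actuator)
    trajectory.append(hcci_ratio(characterize(measurement)))
    actuator = float(np.clip(actuator + steer(trajectory), 0.0, 1.0))
```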
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Randall Mark
Information is given on waste generation at TA-55 and remediation needed to meet WIPP acceptance criteria, including the role of nitrate salts. Breaching of a particular waste-filled drum is reviewed, along with an accident analysis and steps for corrective actions and improved process management.
41 CFR 102-2.80 - What steps must an agency take to deviate from the FMR?
Code of Federal Regulations, 2010 CFR
2010-07-01
... deviate from the FMR? (a) Consult informally with appropriate GSA program personnel to learn more about... consultation process may also highlight reasons why an agency would not be permitted to deviate from the FMR; e...
41 CFR 102-2.80 - What steps must an agency take to deviate from the FMR?
Code of Federal Regulations, 2011 CFR
2011-01-01
... deviate from the FMR? (a) Consult informally with appropriate GSA program personnel to learn more about... consultation process may also highlight reasons why an agency would not be permitted to deviate from the FMR; e...
Self-assembly and continuous growth of hexagonal graphene flakes on liquid Cu
NASA Astrophysics Data System (ADS)
Cho, Seong-Yong; Kim, Min-Sik; Kim, Minsu; Kim, Ki-Ju; Kim, Hyun-Mi; Lee, Do-Joong; Lee, Sang-Hoon; Kim, Ki-Bum
2015-07-01
Graphene growth on liquid Cu has received great interest, owing to the self-assembly behavior of hexagonal graphene flakes with aligned orientation and to the possibility of forming a single grain of graphene through a commensurate growth of these graphene flakes. Here, we propose and demonstrate a two-step growth process which allows the formation of self-assembled, completely continuous graphene on liquid Cu. After the formation of full coverage on the liquid Cu, grain boundaries were revealed via selective hydrogen etching and the original grain boundaries were clearly resolved. This result indicates that, while the flakes self-assembled with the same orientation, there still remain structural defects, gaps and voids that were not resolved by optical microscopy or scanning electron microscopy. To overcome this limitation, the two-step growth process was employed, consisting of a sequential process of a normal single-layer graphene growth and self-assembly process with a low carbon flux, followed by the final stage of graphene growth at a high degree of supersaturation with a high carbon flux. Continuity of the flakes was verified via hydrogen etching and a NaCl-assisted oxidation process, as well as by measuring the electrical properties of the graphene grown by the two-step process. Two-step growth can provide a continuous graphene layer, but commensurate stitching should be further studied. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr03352g
Bates, Imelda; Boyd, Alan; Smith, Helen; Cole, Donald C
2014-03-03
Despite increasing investment in health research capacity strengthening efforts in low and middle income countries, published evidence to guide the systematic design and monitoring of such interventions is very limited. Systematic processes are important to underpin capacity strengthening interventions because they provide stepwise guidance and allow for continual improvement. Our objective here was to use evidence to inform the design of a replicable but flexible process to guide health research capacity strengthening that could be customized for different contexts, and to provide a framework for planning, collecting information, making decisions, and improving performance. We used peer-reviewed and grey literature to develop a five-step pathway for designing and evaluating health research capacity strengthening programmes, tested in a variety of contexts in Africa. The five steps are: i) defining the goal of the capacity strengthening effort, ii) describing the optimal capacity needed to achieve the goal, iii) determining the existing capacity gaps compared to the optimum, iv) devising an action plan to fill the gaps and associated indicators of change, and v) adapting the plan and indicators as the programme matures. Our paper describes three contrasting case studies of organisational research capacity strengthening to illustrate how our five-step approach works in practice. Our five-step pathway starts with a clear goal and objectives, making explicit the capacity required to achieve the goal. Strategies for promoting sustainability are agreed with partners and incorporated from the outset. Our pathway for designing capacity strengthening programmes focuses not only on technical, managerial, and financial processes within organisations, but also on the individuals within organisations and the wider system within which organisations are coordinated, financed, and managed. Our five-step approach is flexible enough to generate and utilise ongoing learning. We have tested and critiqued our approach in a variety of organisational settings in the health sector in sub-Saharan Africa, but it needs to be applied and evaluated in other sectors and continents to determine the extent of transferability.
A three-talk model for shared decision making: multistage consultation process
Durand, Marie Anne; Song, Julia; Aarts, Johanna; Barr, Paul J; Berger, Zackary; Cochran, Nan; Frosch, Dominick; Galasiński, Dariusz; Gulbrandsen, Pål; Han, Paul K J; Härter, Martin; Kinnersley, Paul; Lloyd, Amy; Mishra, Manish; Perestelo-Perez, Lilisbeth; Scholl, Isabelle; Tomori, Kounosuke; Trevena, Lyndal; Witteman, Holly O; Van der Weijden, Trudy
2017-01-01
Objectives To revise an existing three-talk model for learning how to achieve shared decision making, and to consult with relevant stakeholders to update and obtain wider engagement. Design Multistage consultation process. Setting Key informant group, communities of interest, and survey of clinical specialties. Participants 19 key informants, 153 member responses from multiple communities of interest, and 316 responses to an online survey from medically qualified clinicians from six specialties. Results After extended consultation over three iterations, we revised the three-talk model by making changes to one talk category, adding the need to elicit patient goals, providing a clear set of tasks for each talk category, and adding suggested scripts to illustrate each step. A new three-talk model of shared decision making is proposed, based on “team talk,” “option talk,” and “decision talk,” to depict a process of collaboration and deliberation. Team talk places emphasis on the need to provide support to patients when they are made aware of choices, and to elicit their goals as a means of guiding decision making processes. Option talk refers to the task of comparing alternatives, using risk communication principles. Decision talk refers to the task of arriving at decisions that reflect the informed preferences of patients, guided by the experience and expertise of health professionals. Conclusions The revised three-talk model of shared decision making depicts conversational steps, initiated by providing support when introducing options, followed by strategies to compare and discuss trade-offs, before deliberation based on informed preferences. PMID:29109079
The Ph.D. Process - A Student's Guide to Graduate School in the Sciences
NASA Astrophysics Data System (ADS)
Bloom, Dale F.; Karp, Jonathan D.; Cohen, Nicholas
1999-02-01
The Ph.D. Process offers the essential guidance that students in the biological and physical sciences need to get the most out of their years in graduate school. Drawing upon the insights of numerous current and former graduate students, this book presents a rich portrayal of the intellectual and emotional challenges inherent in becoming a scientist, and offers the informed, practical advice a "best friend" would give about each stage of the graduate school experience. What are the best strategies for applying to a graduate program? How are classes conducted? How should I choose an advisor and a research project? What steps can I take now to make myself more "employable" when I get my degree? What goes on at the oral defense? Through a balanced, thorough examination of issues ranging from lab etiquette to stress management, the authors--each a Ph.D. in the sciences--provide the vital information that will allow students to make informed decisions all along the way to the degree. Headlined sections within each chapter make it fast and easy to look up any subject, while dozens of quotes describing personal experiences in graduate programs from people in diverse scientific fields contribute invaluable real-life expertise. Special attention is also given to the needs of international students. Read in advance, this book prepares students for each step of the graduate school experience that awaits them. Read during the course of a graduate education, it serves as a handy reference covering virtually all major issues and decisions a doctoral candidate is likely to face. The Ph.D. Process is the one book every graduate student in the biological and physical sciences can use to stay a step ahead, from application all the way through graduation.
Impact of user influence on information multi-step communication in a micro-blog
NASA Astrophysics Data System (ADS)
Wu, Yue; Hu, Yong; He, Xiao-Hai; Deng, Ken
2014-06-01
User influence is generally considered one of the most critical factors affecting information cascading spreading. Based on this common assumption, this paper proposes a theoretical model to examine user influence on multi-step information communication in a micro-blog. The steps of information communication are divided into first-step and non-first-step, and user influence is classified into five dimensions. Actual data from the Sina micro-blog is collected to construct the model using an approach based on structural equations with the Partial Least Squares (PLS) technique. Our experimental results indicate that the number of fans and their authority significantly impact first-step communication. LeaderRank has a positive impact on both first-step and non-first-step communication. Moreover, global centrality and weight of friends are positively related to non-first-step communication, whereas authority is found to have much less relation to it.
Gaia DR2 documentation Chapter 3: Astrometry
NASA Astrophysics Data System (ADS)
Hobbs, D.; Lindegren, L.; Bastian, U.; Klioner, S.; Butkevich, A.; Stephenson, C.; Hernandez, J.; Lammers, U.; Bombrun, A.; Mignard, F.; Altmann, M.; Davidson, M.; de Bruijne, J. H. J.; Fernández-Hernández, J.; Siddiqui, H.; Utrilla Molina, E.
2018-04-01
This chapter of the Gaia DR2 documentation describes the models and processing steps used for the astrometric core solution, namely, the Astrometric Global Iterative Solution (AGIS). The inputs to this solution rely heavily on the basic observables (or astrometric elementaries), which were pre-processed and discussed in Chapter 2, the results of which were published in Fabricius et al. (2016). The models consist of reference systems and time scales; assumed linear stellar motion and relativistic light deflection; and fundamental constants and the transformation of coordinate systems. Higher-level inputs, such as planetary and solar system ephemerides, Gaia tracking and orbit information, and initial quasar catalogues and BAM data, are all needed for the processing described here. The astrometric calibration models are outlined, followed by the detailed processing steps which give AGIS its name. We also present a basic quality assessment and validation of the scientific results (for details, see Lindegren et al. 2018).
To Trace a Law: Use of Library Materials in a Classroom Exercise.
ERIC Educational Resources Information Center
Shannon, Michael Owen
A legislative history shows the various stages in the process of enacting laws. In order to follow the legislative process the student is asked to select a topic of interest and research the various steps as a bill becomes law. Then he is given descriptions of some current and standard reference works which will help him find information on the…
ERIC Educational Resources Information Center
Luckin, Rosemary; Clark, Wilma; Avramides, Katerina; Hunter, Jade; Oliver, Martin
2017-01-01
In this paper we review the literature on teacher inquiry (TI) to explore the possibility that this process can equip teachers to investigate students' learning as a step towards the process of formative assessment. We draw a distinction between formative assessment and summative forms of assessment [CRELL. (2009). The transition to computer-based…
NASA Astrophysics Data System (ADS)
Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Zaitsev, Alexandr V.; Voloshin, Victor M.
2001-03-01
Historical information on the origin and development of an algebra-logical apparatus, 'equivalental algebra', for describing neural-network paradigms and algorithms is considered; it unifies neural network (NN) theory, linear algebra, and generalized neurobiology extended to the matrix case. A survey of 'equivalental models' of neural networks and associative memory is given, and new, modified matrix-tensor neuro-logical equivalental models (MTNLEMs) are offered with double adaptive-equivalental weighing (DAEW) for spatially non-invariant recognition (SNIR) and space-invariant recognition (SIR) of 2D images (patterns). It is shown that MTNLEMs with DAEW are the most general: they can describe NN processes both within the frames of known paradigms and within a new 'equivalental' paradigm of the non-interaction type, and computation in NNs using the offered MTNLEMs reduces to two-step and multi-step algorithms with step-by-step matrix-tensor procedures (for SNIR) and procedures for defining space-dependent equivalental functions from two images (for SIR).
Spatial Data Integration Using Ontology-Based Approach
NASA Astrophysics Data System (ADS)
Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.
2015-12-01
In today's world, the need for spatial data has become so crucial that many organizations have begun to produce such data themselves. In some circumstances, the need for real-time integrated data requires a sustainable mechanism for real-time integration, as in disaster management situations that require obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between different organizations' data. To solve this issue, we introduce an ontology-based method that provides sharing and integration capabilities for existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. Our approach consists of three steps. In the first step, the objects in a relational database are identified, the semantic relationships between them are modelled, and the ontology of each database is created. In the second step, the resulting ontology is inserted into the database, and the relationships of each ontology class are inserted into newly created columns in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy, and the data remain unchanged, so existing legacy applications can still be used.
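As a rough sketch of the first two steps (lifting relational records into an ontology and storing the triples), the snippet below maps rows of a hypothetical roads table to RDF with rdflib; the table schema, class names, and namespace are assumptions, not the authors' databases.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/spatial#")   # assumed ontology namespace
rows = [  # stand-in for rows read from a relational 'roads' table
    {"id": "road_12", "name": "Main St", "length_km": 3.4},
    {"id": "road_47", "name": "River Rd", "length_km": 1.1},
]

g = Graph()
g.bind("ex", EX)
for row in rows:
    subject = EX[row["id"]]
    g.add((subject, RDF.type, EX.Road))                    # class from the ontology
    g.add((subject, RDFS.label, Literal(row["name"])))     # semantic relationship
    g.add((subject, EX.lengthKm, Literal(row["length_km"])))

print(g.serialize(format="turtle"))
```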
Photogrammetry in 3d Modelling of Human Bone Structures from Radiographs
NASA Astrophysics Data System (ADS)
Hosseinian, S.; Arefi, H.
2017-05-01
Photogrammetry can have a great impact on the success of medical processes for diagnosis, treatment, and surgery. Precise 3D models, which can be achieved by photogrammetry, considerably improve the results of orthopedic surgeries and processes. The usual 3D imaging techniques, computed tomography (CT) and magnetic resonance imaging (MRI), have some limitations, such as being usable only in non-weight-bearing positions, costs, high radiation dose (for CT), and restrictions for patients with ferromagnetic implants or objects in their bodies (for MRI). 3D reconstruction of bony structures from biplanar X-ray images is a reliable and accepted alternative for achieving accurate 3D information with low-dose radiation in weight-bearing positions. This information can be obtained from multi-view radiographs by using photogrammetry. The primary step for 3D reconstruction of human bone structures from medical X-ray images is calibration, which is done by applying principles of photogrammetry. After the calibration step, 3D reconstruction can be done using efficient methods with different levels of automation. Because X-ray images differ in nature from optical images, the calibration step of stereoradiography presents distinct challenges in medical applications. In this paper, after demonstrating the general steps and principles of 3D reconstruction from X-ray images, calibration methods for 3D reconstruction from radiographs are compared and assessed from a photogrammetric point of view, considering various metrics such as camera models, calibration objects, accuracy, availability, patient-friendliness, and cost.
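The calibration step discussed here is, in generic photogrammetric terms, the estimation of a projection matrix from known 3D-2D correspondences. As a minimal sketch of that principle (not of any specific stereoradiography method assessed in the paper), the direct linear transformation (DLT) can be written as:

```python
import numpy as np

def dlt_projection_matrix(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P with x ~ P X from >= 6 correspondences."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)        # least-squares null space of A
    return vt[-1].reshape(3, 4)        # solution is defined up to scale
```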
NASA Astrophysics Data System (ADS)
Zhang, Dongqing; Liu, Yuan; Noble, Jack H.; Dawant, Benoit M.
2016-03-01
Cochlear Implants (CIs) are electrode arrays that are surgically inserted into the cochlea. Individual contacts stimulate frequency-mapped nerve endings, thus replacing the natural electro-mechanical transduction mechanism. CIs are programmed post-operatively by audiologists, but this is currently done using behavioral tests, without imaging information that permits relating electrode position to inner ear anatomy. We have recently developed a series of image processing steps that permit the segmentation of the inner ear anatomy and the localization of individual contacts. We have proposed a new programming strategy that uses this information, and we have shown in a study with 68 participants that 78% of long-term recipients preferred the programming parameters determined with this new strategy. A limiting factor to the large-scale evaluation and deployment of our technique is the amount of user interaction still required in some of the steps used in our sequence of image processing algorithms. One such step is the rough registration of an atlas to target volumes prior to the use of automated intensity-based algorithms when the target volumes have very different fields of view and orientations. In this paper we propose a solution to this problem. It relies on a random forest-based approach to automatically localize a series of landmarks. Our results obtained from 83 images with 132 registration tasks show that automatic initialization of an intensity-based algorithm proves to be a reliable technique to replace the manual step.
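In the paper, a random forest localizes landmarks automatically; given matched landmark sets in the atlas and target images, a rigid starting transform then follows in closed form. The sketch below shows only that second, generic step (the standard Kabsch/Procrustes solution) as one plausible way to initialize an intensity-based algorithm; it is not the authors' forest model.

```python
import numpy as np

def rigid_from_landmarks(atlas_pts, target_pts):
    """Least-squares rotation R and translation t with target ~ R @ atlas + t."""
    A = np.asarray(atlas_pts, dtype=float)
    B = np.asarray(target_pts, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                  # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```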
A Case Study Approach to Marine and Aquatic Issues.
ERIC Educational Resources Information Center
Snively, Gloria
1993-01-01
Suggests using case studies of resource management conflict involving marine and aquatic resource issues to increase student involvement in decision-making processes. Provides information for a potential case involving oyster farms and six steps to help students explore problems and make decisions. (MDH)
CVISN guide to top-level design : preliminary version P1.1
DOT National Transportation Integrated Search
1999-06-25
The CVISN (Commercial Vehicle Information Systems and Networks) guide to top-level design shows the initial steps in the process of deploying CVISN in a state. For purposes of these guides, top-level design encompasses setting the scope of the projec...
Liu, Hsiu-Chu; Li, Hsing; Chang, Hsin-Fei; Lu, Mei-Rou; Chen, Feng-Chuan
2015-01-01
Learning from the experience of another medical center in Taiwan, Kaohsiung Municipal Kai-Suan Psychiatric Hospital has changed its nursing informatics system step by step over the past year and a half. Ethical considerations informed the original idea of implementing barcodes on test tube labels to identify psychiatric patients. The main aims of this project are to keep patient information confidential and to transport samples effectively. The primary nurses used different worksheets during this project to ensure acceptance of the new barcode system. In the previous two years, errors in the blood testing process had been as high as 11,000 out of 14,000 events per year, resulting in wasted resources. The actions taken by the nurses and the implementation of the new barcode system can improve clinical nursing care quality, patient safety, and efficiency, while decreasing costs due to human error.
Method for network analyzation and apparatus
Bracht, Roger B.; Pasquale, Regina V.
2001-01-01
A portable network analyzer and method having multiple channel transmit and receive capability for real-time monitoring of processes which maintains phase integrity, requires low power, is adapted to provide full vector analysis, provides output frequencies of up to 62.5 MHz and provides fine sensitivity frequency resolution. The present invention includes a multi-channel means for transmitting and a multi-channel means for receiving, both in electrical communication with a software means for controlling. The means for controlling is programmed to provide a signal to a system under investigation which steps consecutively over a range of predetermined frequencies. The resulting received signal from the system provides complete time domain response information by executing a frequency transform of the magnitude and phase information acquired at each frequency step.
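The frequency-to-time claim can be illustrated in a few lines: step a stimulus over consecutive frequencies, record the complex (magnitude and phase) response at each step, and apply an inverse transform. The two-path device model and all numbers below are assumptions for illustration, not the patented hardware.

```python
import numpy as np

n, df, f0 = 256, 0.25e6, 0.5e6
freqs = f0 + df * np.arange(n)                    # consecutive stepped frequencies (Hz)

dt = 1.0 / (n * df)                               # time spacing of the IFFT bins
delays, gains = [26 * dt, 70 * dt], [1.0, 0.35]   # hypothetical two-path device

# Complex (magnitude and phase) response recorded at each frequency step.
spectrum = sum(g * np.exp(-2j * np.pi * freqs * d) for g, d in zip(gains, delays))

impulse = np.fft.ifft(spectrum)                   # complete time-domain response
top_bins = np.sort(np.argsort(np.abs(impulse))[-2:])
print("recovered delays (us):", top_bins * dt * 1e6)   # ~0.41 and ~1.09
```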
Using Lattice Topology Information to Investigate Persistent Scatterers at Facades in Urban Areas
NASA Astrophysics Data System (ADS)
Schack, L.; Soergel, U.
2013-05-01
Modern spaceborne SAR sensors like TerraSAR-X offer ground resolution of up to one meter in range and azimuth direction. Buildings, roads, bridges, and other man-made structures often appear in such data as regular patterns of strong and temporally stable points (Persistent Scatterers, PS). As one step in the process of unveiling what object structure actually causes the PS (i.e., their physical nature), we compare those regular structures in SAR data to their correspondences in optical imagery. We use lattices as a common data representation for visible facades. By exploiting the topology information given by the lattices, we can complete gaps in the structures, which is one step towards understanding the complex scattering characteristics of distinct facade objects.
NASA Astrophysics Data System (ADS)
Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.
2017-12-01
The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions of agro-climatological outcomes. There are four primary steps to developing agro-climatology assumptions: (1) understanding the climatology, (2) evaluating current climate modes, (3) interpreting forecast information, and (4) incorporating monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data based on remote sensing and field information can characterize the start of season and remain integral monitoring tools throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption-building process. Satellite-based rainfall and normalized difference vegetation index (NDVI)-based products support both the climatology and monitoring steps; sea-surface temperature data and knowledge of the global climate system inform the climate modes; and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate agro-climatology assumptions that feed into food security assessments. We have also developed a knowledge base for over 80 countries that provides rainfall and NDVI-based products, including annual and seasonal summaries, historical anomalies, coefficient of variation, and number of years below 70% of annual or seasonal averages. These products provide a quick look for analysts to assess the agro-climatology of a country.
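These per-country summaries reduce to simple per-pixel statistics over the historical stack. A minimal numpy sketch, assuming a years-by-pixels array of seasonal rainfall totals (the array layout and names are illustrative, not the EROS tool's interface):

```python
import numpy as np

def seasonal_summaries(seasonal_totals, current_season):
    """seasonal_totals: (n_years, n_pixels) historical seasonal rainfall;
    current_season: (n_pixels,) totals for the season being monitored."""
    clim_mean = seasonal_totals.mean(axis=0)
    anomaly_pct = 100.0 * current_season / clim_mean             # percent of average
    cv = seasonal_totals.std(axis=0, ddof=1) / clim_mean         # coefficient of variation
    below_70 = (seasonal_totals < 0.7 * clim_mean).sum(axis=0)   # years below 70% of average
    return anomaly_pct, cv, below_70
```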
NASA Planning for Orion Multi-Purpose Crew Vehicle Ground Operations
NASA Technical Reports Server (NTRS)
Letchworth, Gary; Schlierf, Roland
2011-01-01
The NASA Orion Ground Processing Team was originally formed by the Kennedy Space Center (KSC) Constellation (Cx) Project Office's Orion Division to define, refine and mature pre-launch and post-landing ground operations for the Orion human spacecraft. The multidisciplinary KSC Orion team consisted of KSC civil servant, SAIC, Productivity Apex, Inc. and Boeing-CAPPS engineers, project managers and safety engineers, as well as engineers from Constellation's Orion Project and Lockheed Martin Orion Prime contractor. The team evaluated the Orion design configurations as the spacecraft concept matured between Systems Design Review (SDR), Systems Requirement Review (SRR) and Preliminary Design Review (PDR). The team functionally decomposed pre-launch and post-landing steps at three levels of detail, or tiers, beginning with functional flow block diagrams (FFBDs). The third tier FFBDs were used to build logic networks and nominal timelines. Orion ground support equipment (GSE) was identified and mapped to each step. This information was subsequently used in developing lower level operations steps in a Ground Operations Planning Document PDR product. Subject matter experts for each spacecraft and GSE subsystem were used to define 5th-95th percentile processing times for each FFBD step, using the Delphi Method. Discrete event simulations used this information and the logic network to provide processing timeline confidence intervals for launch rate assessments. The team also used the capabilities of the KSC Visualization Lab, the FFBDs and knowledge of the spacecraft, GSE and facilities to build visualizations of Orion pre-launch and post-landing processing at KSC. Visualizations were a powerful tool for communicating planned operations within the KSC community (i.e., Ground Systems design team), and externally to the Orion Project, Lockheed Martin spacecraft designers and other Constellation Program stakeholders during the SRR to PDR timeframe. Other operations planning tools included Kaizen/Lean events, mockups and human factors analysis. The majority of products developed by this team are applicable as KSC prepares 21st Century Ground Systems for the Orion Multi-Purpose Crew Vehicle and Space Launch System.
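A crude stand-in for the discrete-event runs: sample each FFBD step's duration from a distribution fitted to the Delphi 5th/95th percentile estimates and read confidence intervals off the simulated totals. The normal fit, the serial step chain, and every number below are illustrative assumptions, not the team's actual model.

```python
import numpy as np

rng = np.random.default_rng(42)

# (p5, p95) processing-time estimates per serial step, in hours (made-up numbers).
step_estimates = [(4.0, 10.0), (8.0, 20.0), (2.0, 6.0), (12.0, 30.0)]

n_trials = 100_000
totals = np.zeros(n_trials)
for p5, p95 in step_estimates:
    mean = 0.5 * (p5 + p95)
    sigma = (p95 - p5) / (2 * 1.645)           # normal fit to the 5th/95th percentiles
    totals += np.maximum(rng.normal(mean, sigma, n_trials), 0.0)

lo, hi = np.percentile(totals, [5, 95])
print(f"90% confidence interval for total processing time: {lo:.1f}-{hi:.1f} h")
```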
Savla, Jill J; Fisher, Brian T; Faerber, Jennifer A; Huang, Yuan-Shung V; Mercer-Rosa, Laura
2017-12-12
The surgical strategy for neonates with tetralogy of Fallot (TOF) consists of complete or staged repair. Assessing the comparative effectiveness of these approaches is facilitated by a large multicenter cohort. We propose a novel process for cohort assembly using the Pediatric Health Information System (PHIS), an administrative database that contains clinical and billing data for inpatient and emergency department stays from tertiary children's hospitals. A 4-step process was used to identify neonates with TOF: (1) screen neonates in PHIS with International Classification of Diseases-9 (ICD-9) diagnosis or procedure codes for TOF; (2) include patients with TOF procedures before 30 days of age; (3) exclude patients with missing 2-year follow-up data; (4) analyze patients' 2-year surgery sequence patterns, exclude patients inconsistent with a treatment strategy for TOF, and designate patients as complete or staged repair. Manual chart review at 1 PHIS center was performed to validate this process. Between January 2004 and March 2015, 5862 patients were identified in step 1. Step 2 of cohort assembly excluded 3425 patients (58%); step 3 excluded 148 patients (3%); and step 4 excluded 54 patients (1%). The final cohort consisted of 2235 neonates with TOF from 45 hospitals. Manual chart review of 336 patients showed a positive predictive value for accurate PHIS identification of 44% after step 1 and 97% after step 4. This systematic cohort identification algorithm resulted in a high positive predictive value to appropriately categorize patients. This carefully assembled cohort offers a unique opportunity for future studies in neonatal TOF outcomes.
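In data-wrangling terms, the 4-step assembly is a chain of row filters. The sketch below mimics it in pandas on a toy frame; the column names and the step-4 strategy check are assumptions (745.2 is the ICD-9 family for TOF, but the paper's actual code lists and sequence logic are richer).

```python
import pandas as pd

def is_tof_strategy(sequence):
    """Hypothetical stand-in for the paper's step-4 surgery-sequence check."""
    return sequence in {("complete",), ("shunt", "complete")}

# One row per patient, with assumed columns; this is not the actual PHIS schema.
df = pd.DataFrame({
    "icd9": [["745.2"], ["745.2"], ["746.0"]],
    "age_at_procedure_days": [12, 45, 10],
    "has_2yr_followup": [True, True, True],
    "surgery_sequence": [("shunt", "complete"), ("complete",), ("complete",)],
})

step1 = df[df["icd9"].apply(lambda codes: "745.2" in codes)]      # screen on TOF codes
step2 = step1[step1["age_at_procedure_days"] < 30]                # neonatal procedure
step3 = step2[step2["has_2yr_followup"]]                          # follow-up available
cohort = step3[step3["surgery_sequence"].apply(is_tof_strategy)]  # consistent strategy
```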
16 CFR 1101.33 - Reasonable steps to assure information release is fair in the circumstances.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Reasonable steps to assure information... PRODUCT SAFETY ACT Reasonable Steps Commission Will Take To Assure Information It Discloses Is Accurate... of the Acts It Administers § 1101.33 Reasonable steps to assure information release is fair in the...
Quality and efficiency successes leveraging IT and new processes.
Chaiken, Barry P; Christian, Charles E; Johnson, Liz
2007-01-01
Today, healthcare annually invests billions of dollars in information technology, including clinical systems, electronic medical records and interoperability platforms. While continued investment and parallel development of standards are critical to secure exponential benefits from clinical information technology, intelligent and creative redesign of processes through path innovation is necessary to deliver meaningful value. Reports from two organizations included in this report review the steps taken to reinvent clinical processes that best leverage information technology to deliver safer and more efficient care. Good Samaritan Hospital, Vincennes, Indiana, implemented electronic charting, point-of-care bar coding of medications prior to administration, and integrated clinical documentation for nursing, laboratory, radiology and pharmacy. Tenet Healthcare, during its implementation and deployment of multiple clinical systems across several hospitals, focused on planning that included team-based process redesign. In addition, Tenet constructed valuable and measurable metrics that link outcomes with its strategic goals.
On vertical seismic profile processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tariel, P.; Michon, D.
1984-10-01
From the wealth of information which can be deduced from VSP, the information most directly comparable to well logs is considered: P-wave and S-wave interval velocity, acoustic impedance, and the velocity ratio γ = Vs/Vp. This information not only allows better interpretation of surface seismic sections but also improves processing. For these results to be usable, a number of precautions must be taken during acquisition and processing; the sampling in depth should be chosen in such a way that aliasing phenomena do not unnecessarily limit the spectra during the separation of upwards and downwards travelling waves. True amplitudes should be respected and checked by recording of signatures, and the interference of upwards and downwards travelling waves should be taken into account for the picking of first arrivals. The different steps in processing and the combination of results in the interpretation of surface seismic results are described with actual records.
Invariance algorithms for processing NDE signals
NASA Astrophysics Data System (ADS)
Mandayam, Shreekanth; Udpa, Lalita; Udpa, Satish S.; Lord, William
1996-11-01
Signals obtained in a variety of nondestructive evaluation (NDE) processes capture information not only about the characteristics of the flaw but also about variations in the specimen's material properties. Such signal changes may be viewed as anomalies that could obscure defect-related information. An example of this situation occurs during in-line inspection of gas transmission pipelines. The magnetic flux leakage (MFL) method is used to conduct noninvasive measurements of the integrity of the pipe wall. The MFL signals contain information both about the permeability of the pipe wall and about the dimensions of the flaw. Similar operational effects can be found in other NDE processes. This paper presents algorithms to render NDE signals invariant to selected test parameters while retaining defect-related information. Wavelet transform based neural network techniques are employed to develop the invariance algorithms. The invariance transformation is shown to be a necessary pre-processing step for subsequent defect characterization and visualization schemes. Results demonstrating the successful application of the method are presented.
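The authors' scheme combines wavelet transforms with neural networks; as a much simpler illustration of the invariance idea only, the sketch below removes the slowly varying component of an MFL-like signal (a stand-in for permeability drift) with PyWavelets, keeping the defect-scale detail. This is an assumed baseline, not the paper's trained transformation.

```python
import numpy as np
import pywt

def suppress_slow_variation(signal, wavelet="db4", level=4):
    """Zero the coarsest approximation band and reconstruct the signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])        # drop slow, permeability-like trend
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

x = np.linspace(0, 1, 512)
drift = 0.8 * np.sin(2 * np.pi * 1.5 * x)             # slow material variation
defect = np.exp(-((x - 0.6) ** 2) / 2e-4)             # narrow flaw signature
cleaned = suppress_slow_variation(drift + defect)     # defect peak survives
```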
Social Information Processing in Deaf Adolescents.
Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R
2016-07-01
The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment. Psychological Bulletin, 115, 74-101) reformulated six-stage model. It consisted of a structured interview after watching 18 scenes of situations depicting participation in a peer group or provocations by peers. Participants included 32 deaf and 20 hearing adolescents and young adults aged between 13 and 21 years. Deaf adolescents and adults had lower scores than hearing participants in all the steps of the SIP model (coding, interpretation, goal formulation, response generation, response decision, and representation). However, deaf girls and women had better scores on social adjustment and on some SIP skills than deaf male participants. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean
2014-01-01
Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
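The abstract does not disclose the model family, so the sketch below assumes the simplest possible version of the predictive-analytics step: a cross-validated linear regression from an attribute measured on single-step affinity eluate to the same attribute after the full multi-step train, run here on synthetic placeholder data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))     # attributes measured on affinity-purified samples (synthetic)
y = 0.9 * X[:, 0] + 0.1 * rng.normal(size=40)   # same CQA after the multi-step train (synthetic)

model = LinearRegression()
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
model.fit(X, y)                  # final model used to predict drug-substance quality
```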
Single photon emission from plasma treated 2D hexagonal boron nitride.
Xu, Zai-Quan; Elbadawi, Christopher; Tran, Toan Trong; Kianinia, Mehran; Li, Xiuling; Liu, Daobin; Hoffman, Timothy B; Nguyen, Minh; Kim, Sejeong; Edgar, James H; Wu, Xiaojun; Song, Li; Ali, Sajid; Ford, Mike; Toth, Milos; Aharonovich, Igor
2018-05-03
Artificial atomic systems in solids are becoming increasingly important building blocks in quantum information processing and scalable quantum nanophotonic networks. Amongst numerous candidates, 2D hexagonal boron nitride has recently emerged as a promising platform hosting single photon emitters. Here, we report a number of robust plasma and thermal annealing methods for fabrication of emitters in tape-exfoliated hexagonal boron nitride (hBN) crystals. A two-step process comprising Ar plasma etching and subsequent annealing in Ar is highly robust, and yields an eight-fold increase in the concentration of emitters in hBN. The initial plasma-etching step generates emitters that suffer from blinking and bleaching, whereas the two-step process yields emitters that are photostable at room temperature with emission wavelengths greater than ∼700 nm. Density functional theory modeling suggests that the emitters might be associated with defect complexes that contain oxygen. This is further confirmed by generating the emitters via annealing hBN in air. Our findings advance the present understanding of the structure of quantum emitters in hBN and enhance the nanofabrication toolkit needed to realize integrated quantum nanophotonic circuits.
Frames of reference in action plan recall: influence of hand and handedness.
Seegelke, Christian; Hughes, Charmayne M L; Wunsch, Kathrin; van der Wel, Robrecht; Weigelt, Matthias
2015-10-01
Evidence suggests that people are more likely to recall features of previous plans and use them for subsequent movements, rather than generating action plans from scratch for each movement. The information used for plan recall during object manipulation tasks is stored in extrinsic (object-centered) rather than intrinsic (body-centered) coordinates. The present study examined whether action plan recall processes are influenced by manual asymmetries. Right-handed (Experiment 1) and left-handed (Experiment 2) participants grasped a plunger from a home position using either the dominant or the non-dominant hand and placed it at one of the three target positions located at varying heights (home-to-target moves). Subsequently, they stepped sideways down from a podium (step-down podium), onto a podium (step-up podium), or without any podium present (no podium), before returning the plunger to the home platform using the same hand (target-back-to-home moves). The data show that, regardless of hand and handedness, participants grasped the plunger at similar heights during the home-to-target and target-back-to-home moves, even if they had to adopt quite different arm postures to do so. Thus, these findings indicate that the information used for plan recall processes in sequential object manipulation tasks is stored in extrinsic coordinates and in an effector-independent manner.
Cloud, Richard N; Kingree, J B
2008-01-01
Researchers have observed that a majority of addicted persons who are encouraged and facilitated by treatment providers to attend twelve-step (TS) programs either drop out or use twelve-step programs only sporadically following treatment. This is troubling given considerable evidence of TS program benefits associated with regular weekly attendance and the ubiquitous reliance by treatment professionals on these programs to provide important support services. This chapter reviews and advances a theory of TS utilization and dose that is supported by prior research, multivariate models, and scales that predict the risk of underutilizing TS meetings. Advancing theory should organize and clarify the process of initial utilization, guide intervention development, and improve adherence to TS program referrals, all of which should lead to improved treatment planning and better outcomes. Three theories are integrated to explain processes that may influence TS program dose: the health belief model, self-determination theory (motivational theory), and a person-in-organization cultural-fit theory. Four multidimensional scales developed specifically to predict participation are described. Implications for practice and future research are considered in a final discussion. Information contained in this chapter raises awareness of the need for TS-focused treatments to focus on achieving weekly attendance during and after treatment.
Lavery, Richard; Zakrzewska, Krystyna; Beveridge, David; Bishop, Thomas C.; Case, David A.; Cheatham, Thomas; Dixit, Surjit; Jayaram, B.; Lankas, Filip; Laughton, Charles; Maddocks, John H.; Michon, Alexis; Osman, Roman; Orozco, Modesto; Perez, Alberto; Singh, Tanya; Spackova, Nada; Sponer, Jiri
2010-01-01
It is well recognized that base sequence exerts a significant influence on the properties of DNA and plays a significant role in protein–DNA interactions vital for cellular processes. Understanding and predicting base sequence effects requires an extensive structural and dynamic dataset which is currently unavailable from experiment. A consortium of laboratories was consequently formed to obtain this information using molecular simulations. This article describes results providing information not only on all 10 unique base pair steps, but also on all possible nearest-neighbor effects on these steps. These results are derived from simulations of 50–100 ns on 39 different DNA oligomers in explicit solvent and using a physiological salt concentration. We demonstrate that the simulations are converged in terms of helical and backbone parameters. The results show that nearest-neighbor effects on base pair steps are very significant, implying that dinucleotide models are insufficient for predicting sequence-dependent behavior. Flanking base sequences can notably lead to base pair step parameters in dynamic equilibrium between two conformational sub-states. Although this study only provides limited data on next-nearest-neighbor effects, we suggest that such effects should be analyzed before attempting to predict the sequence-dependent behavior of DNA. PMID:19850719
2018-01-01
Background Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, intervention development process, or evaluation. Objective To support a successful implementation of eHealth tools in the whole WHP processes, we introduce a concept of WHP (life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. Methods We developed a life cycle model of WHP based on the World Health Organization (WHO) model of healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. Results eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can support by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility to give automatic feedback about health parameters, create incentive systems, or bring together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. Conclusions Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the workplace are needed to secure quality and reach sustainable results. PMID:29475828
Graci, Valentina; Rabuffetti, Marco; Frigo, Carlo; Ferrarin, Maurizio
2017-02-01
The importance of peripheral visual information during stair climbing, and how peripheral visual information is weighted as a function of step number, is unclear. Previous authors postulated that knowledge of predictable characteristics of the steps may decrease reliance on foveal vision and transfer the online visual guidance of stair climbing to peripheral vision. Hence, the aim of this study was to investigate if and how occlusion of the lower peripheral visual field influenced stair climbing and whether peripheral visual information was weighted differently between steps. Ten young adult male participants ascended a 5-step staircase under 2 visual conditions: full vision (FV) and lower visual occlusion (LO). Kinematic data (100 Hz) were collected. The effect of Vision and Step condition on vertical forefoot clearance was examined with a repeated-measures 2-way ANOVA. Tukey's HSD test was used for post-hoc comparisons. A significant Vision x Step interaction and a main effect of Step were found (p<=0.04): vertical forefoot clearance was greater in the LO than the FV condition only on the 1st and 2nd steps (p<0.013), and on the last step compared to the other steps (p<0.01). These findings suggest that online peripheral visual information is more relevant when negotiating the first two steps rather than the end of a staircase, and that the steps subsequent to the first few may require different information, likely based on proprioception or working memory of the step height. Copyright © 2016 Elsevier B.V. All rights reserved.
Environmental factor(tm) system: RCRA hazardous waste handler information (on CD-ROM). Data file
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
Environmental Factor(trademark) RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity, and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management, and minimization by companies who are large quantity generators; and (3) Data on the waste management practices of treatment, storage, and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status, and more. (2) View compliance information - dates of evaluation, violation, enforcement, and corrective action. (3) Look up facilities by waste processing categories of marketing, transporting, processing, and energy recovery. (4) Use owner/operator information and names, titles, and telephone numbers of project managers for prospecting. (5) Browse detailed data on TSD facility and large quantity generators' activities such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains databases, search and retrieval software on two CD-ROMs, an installation diskette, and a User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving, and exporting.
Vagos, Paula; Rijo, Daniel; Santos, Isabel M
2016-04-01
Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years old) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two measures for attribution of intent were produced, including hostile and neutral; along with 3 emotion measures, focused on negative emotional states; 8 response evaluation measures; and 4 response decision measures, including prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys seemed to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, and assertiveness and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems a valuable alternative for evaluating social information processing, even if it is essential to continue investigation into its internal and external validity. (c) 2016 APA, all rights reserved.
Preparing for the Integration of Emerging Technologies.
ERIC Educational Resources Information Center
Dyrli, Odvard Egil; Kinnaman, Daniel E.
1994-01-01
Discussion of the process of integrating new technologies into schools considers the evolution of technology, including personal computers, CD-ROMs, hypermedia, and networking/communications; the transition from Industrial-Age to Information-Age schools; and the logical steps of transition. Sidebars discuss a networked multimedia pilot project and…
A Probabilistic Approach to Crosslingual Information Retrieval
2001-06-01
language expansion step can be performed before the translation process. Implemented as a call to the INQUERY function get_modified_query with one of the...database consists of American English while the dictionary is British English. Therefore, e.g. the Spanish word basura is translated to rubbish and
Fundraising for Early Childhood Programs: Getting Started and Getting Results.
ERIC Educational Resources Information Center
Finn, Matia
Designed to assist practitioners serving young children and their families, this book contains information about methods of raising money and managing nonprofit organizations. Following the first chapter's introductory definition of important terms associated with the fundraising process, chapter 2 discusses some prerequisite steps required before…
Investigating Students' Similarity Judgments in Organic Chemistry
ERIC Educational Resources Information Center
Graulich, N.; Bhattacharyya, G.
2017-01-01
Organic chemistry is possibly the most visual science of all chemistry disciplines. The process of scientific inquiry in organic chemistry relies on external representations, such as Lewis structures, mechanisms, and electron arrows. Information about chemical properties or driving forces of mechanistic steps is not available through direct…
Photosynthesis. Agricultural Lesson Plans.
ERIC Educational Resources Information Center
Southern Illinois Univ., Carbondale. Dept. of Agricultural Education and Mechanization.
This lesson plan is intended for use in conducting classes on photosynthesis. Presented first are an attention step/problem statement and a series of questions and answers designed to convey general information about photosynthesis. The following topics are among those discussed: the photosynthesis process and its importance, the organisms that…
Vocabulary, Grammar, Sex, and Aging
ERIC Educational Resources Information Center
Moscoso del Prado Martín, Fermín
2017-01-01
Understanding the changes in our language abilities along the lifespan is a crucial step for understanding the aging process both in normal and in abnormal circumstances. Besides controlled experimental tasks, it is equally crucial to investigate language in unconstrained conversation. I present an information-theoretical analysis of a corpus of…
How Does Your Organic Garden Grow?
ERIC Educational Resources Information Center
Reemer, Rita, Ed.
A complete organic gardening cycle--from soil composition to harvesting--can be conducted using the 14 activities suggested in this teacher's guide. Background information, questions for discussion, and related activities are presented for each step in the process. The activities, useful for elementary grade students, are titled: Starting the…
Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali
2017-06-01
Traffic accidents are one of the more important national and international issues, and their consequences are significant at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified via literature retrieved from the Internet and based on the inclusion criteria. Review of the literature was performed until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency staff, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data were collected using a questionnaire based on the first-step data; validity and reliability were determined by content validity and a Cronbach's alpha of 0.75. Data were analyzed using the decision Delphi technique. GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, as well as analyzing these data, were the most important capabilities of GIS in traffic accident information management. Storing and retrieving descriptive and spatial data, providing statistical analyses in table, chart, and zoning formats, managing ill-structured problems, determining the cost-effectiveness of decisions, and prioritizing their implementation were the most important GIS capabilities that can be efficient in the management of traffic accident information.
NASA Astrophysics Data System (ADS)
Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.
2017-12-01
In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all available data. However, classic inverse modeling frameworks typically make use only of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and to combine it with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists of two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists of selecting sites similar to the target site. We use clustering methods to select similar sites based on observable hydrogeological features. The second step is data assimilation; it consists of assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of these methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R package, thereby facilitating easy use by other practitioners.
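A hedged sketch of this two-step prior construction (in Python rather than the authors' R package; the clustering choice, the normal-normal pooling rule, and all numbers below are illustrative assumptions, not the published implementation):

    import numpy as np
    from sklearn.cluster import KMeans

    def select_similar_sites(features, target_idx, n_clusters=2):
        """Step 1: cluster sites on observable features; keep the target's cluster."""
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
        return [i for i, l in enumerate(labels)
                if l == labels[target_idx] and i != target_idx]

    def informative_prior(site_means, site_vars):
        """Step 2 (toy normal-normal hierarchy): pool similar sites into a prior.
        Between-site variance is added to the mean within-site variance so the
        prior reflects inter-site variability, as a hierarchical model would."""
        return np.mean(site_means), np.var(site_means, ddof=1) + np.mean(site_vars)

    # Five hypothetical sites described by two observable features; site 0 is the target.
    features = np.array([[1.0, 0.2], [1.1, 0.3], [5.0, 2.0], [0.9, 0.25], [5.2, 1.9]])
    site_means = np.array([0.50, 0.55, 2.10, 0.60, 2.00])    # e.g. mean log-conductivity
    site_vars = np.array([0.010, 0.020, 0.500, 0.015, 0.400])
    similar = select_similar_sites(features, target_idx=0)
    mu0, tau2 = informative_prior(site_means[similar], site_vars[similar])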
NASA Astrophysics Data System (ADS)
Wan, Jiangping; Jones, James D.
2013-11-01
The Warfield version of systems science supports a wide variety of application areas, and is useful to practitioners who use the work program of complexity (WPOC) tool. In this article, WPOC is applied to information technology service management (ITSM) for managing the complexity of projects. In discussing the application of WPOC to ITSM, we discuss several steps of WPOC. The discovery step of WPOC consists of a description process and a diagnosis process. During the description process, 52 risk factors are identified, which are then narrowed to 20 key risk factors. All of this is done by interviews and surveys. Root risk factors (the most basic risk factors) consist of 11 kinds of common 'mindbugs', which are selected from an interpretive structural model. This is achieved by empirical analysis of 25 kinds of mindbugs. (A lesser aim of this research is to affirm that these mindbugs, developed from a Western mindset, have corresponding relevance in a completely different culture: the People's Republic of China.) During the diagnosis process, the relationships among the root risk factors in the implementation of the ITSM project are identified. The resolution step of WPOC consists of a design process and an implementation process. During the design process, issues related to the ITSM application are compared to both e-Government operation and maintenance, and software process improvement. The ITSM knowledge support structure is also designed at this time. During the implementation process, 10 keys to the successful implementation of ITSM projects are identified.
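The interpretive-structural-modelling step named above is, at its core, the transitive closure (reachability matrix) of a pairwise influence matrix among factors, which is then partitioned into levels. A minimal sketch of that core computation follows (the 4x4 influence matrix is invented, not the study's mindbug data):

    import numpy as np

    def reachability(adj):
        """Warshall-style transitive closure of a boolean influence matrix,
        including self-reachability, as used to build an ISM hierarchy."""
        r = adj | np.eye(len(adj), dtype=bool)
        for k in range(len(r)):
            r |= np.outer(r[:, k], r[k, :])   # i->k and k->j implies i->j
        return r

    # Invented example: factor 0 influences 1, 1 influences 2, 2 influences 3.
    adj = np.array([[0, 1, 0, 0],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1],
                    [0, 0, 0, 0]], dtype=bool)
    print(reachability(adj).astype(int))   # upper-triangular chain of influence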
Foerster, Rebecca M.; Carbone, Elena; Schneider, Werner X.
2014-01-01
Evidence for long-term memory (LTM)-based control of attention has been found during the execution of highly practiced multi-step tasks. However, does LTM directly control attention, or are working memory (WM) processes involved? In the present study, this question was investigated with a dual-task paradigm. Participants executed either a highly practiced visuospatial sensorimotor task (speed stacking) or a verbal task (high-speed poem reciting), while maintaining visuospatial or verbal information in WM. Results revealed unidirectional and domain-specific interference. Neither speed stacking nor high-speed poem reciting was influenced by WM retention. Stacking disrupted the retention of visuospatial locations, but did not modify memory performance for verbal material (letters). Reciting reduced the retention of verbal material substantially, whereas it affected memory performance for visuospatial locations to a smaller degree. We suggest that the selection of task-relevant information from LTM for the execution of overlearned multi-step tasks recruits domain-specific WM. PMID:24847304
On salesmen and tourists: Two-step optimization in deterministic foragers
NASA Astrophysics Data System (ADS)
Maya, Miguel; Miramontes, Octavio; Boyer, Denis
2017-02-01
We explore a two-step optimization problem in random environments, the so-called restaurant-coffee shop problem, where a walker aims at visiting the nearest and better restaurant in an area and then moving to the nearest and better coffee shop. This is an extension of the Tourist Problem, a one-step optimization dynamics that can be viewed as a deterministic walk in a random medium. A certain amount of heterogeneity in the values of the resources to be visited causes the emergence of power-law distributions for the steps performed by the walker, similar to a Lévy flight. The fluctuations of the step lengths tend to decrease as a consequence of multiple-step planning, thus reducing the foraging uncertainty. We find that the first and second steps of each planned movement play very different roles in heterogeneous environments. The two-step process improves the foraging efficiency only slightly compared to the one-step optimization, at a much higher computational cost. We discuss the implications of these findings for animal and human mobility, in particular in relation to the computational effort that informed agents should deploy to solve search problems.
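A minimal numerical contrast between the one-step and two-step strategies can be sketched as follows (the value-minus-distance score is an assumption for illustration; the paper's resource values and planning costs are not reproduced):

    import numpy as np

    rng = np.random.default_rng(1)
    walker = np.zeros(2)
    rest = rng.uniform(0, 10, (20, 2));  rest_val = rng.exponential(1.0, 20)
    cafe = rng.uniform(0, 10, (20, 2));  cafe_val = rng.exponential(1.0, 20)

    def score(frm, to, value):
        return value - np.linalg.norm(to - frm)   # prefer close and valuable

    # One-step planning: greedy restaurant, then greedy coffee shop from there.
    r1 = max(range(20), key=lambda i: score(walker, rest[i], rest_val[i]))
    c1 = max(range(20), key=lambda j: score(rest[r1], cafe[j], cafe_val[j]))

    # Two-step planning: choose the (restaurant, coffee shop) pair jointly,
    # at a cost of 20 x 20 evaluations instead of 20 + 20.
    r2, c2 = max(((i, j) for i in range(20) for j in range(20)),
                 key=lambda p: score(walker, rest[p[0]], rest_val[p[0]])
                             + score(rest[p[0]], cafe[p[1]], cafe_val[p[1]]))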
TUNS/TCIS information model/process model
NASA Technical Reports Server (NTRS)
Wilson, James
1992-01-01
An Information Model comprises graphical and textual notation suitable for describing and defining the problem domain - in our case, TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provides an easy-to-understand methodology for expressing the entities in the problem space, the relationships between entities, and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.
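As a toy rendering of the two components such a model combines (entity names below are invented placeholders, not the actual TUNS/TCIS entities), a data dictionary of attributed entities plus ERD-style relationships might be represented as:

    from dataclasses import dataclass, field

    @dataclass
    class Entity:
        name: str
        attributes: dict = field(default_factory=dict)  # attribute -> definition (data dictionary)

    @dataclass
    class Relationship:
        source: str
        target: str
        cardinality: str  # e.g. "1:N", as an ERD edge would carry

    # Invented placeholder entities for illustration only.
    station = Entity("GroundStation", {"station_id": "unique key"})
    request = Entity("TrackingRequest", {"request_id": "unique key",
                                         "issued_at": "UTC timestamp"})
    erd = [Relationship("GroundStation", "TrackingRequest", "1:N")]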
Combining Static Analysis and Model Checking for Software Analysis
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)
2003-01-01
We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial-order information, which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial-order reduction. At each step of this iterative process, the static analysis computes optimistic information, which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial-order information is safe and the whole state space is explored.
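A hedged pseudocode sketch of this iteration, with stub analyses standing in for the real static analyzer and model checker, shows why monotone accumulation of aliasing facts drives the loop to a fixed point:

    def static_analysis(program, aliasing):
        # Stub: independence (partial-order) facts shrink as aliasing is learned.
        return frozenset(p for p in program["independent"] if p not in aliasing)

    def model_check(program, partial_order):
        # Stub: exploring the reduced space reveals some true aliases.
        return frozenset(program["true_aliases"]) & partial_order

    def analyze(program):
        aliasing = frozenset()
        partial_order = static_analysis(program, aliasing)
        while True:
            new_aliasing = aliasing | model_check(program, partial_order)
            new_partial_order = static_analysis(program, new_aliasing)
            if (new_aliasing, new_partial_order) == (aliasing, partial_order):
                return partial_order, aliasing   # fixed point: reduction now safe
            aliasing, partial_order = new_aliasing, new_partial_order

    toy = {"independent": {"a||b", "c||d", "e||f"}, "true_aliases": {"c||d"}}
    print(analyze(toy))   # "c||d" is dropped from the reduction once discovered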
An Analysis of the Role of ATC in the AILS Concept
NASA Technical Reports Server (NTRS)
Waller, Marvin C.; Doyle, Thomas M.; McGee, Frank G.
2000-01-01
Airborne information for lateral spacing (AILS) is a concept for making approaches to closely spaced parallel runways in instrument meteorological conditions (IMC). Under the concept, each equipped aircraft will assume responsibility for accurately managing its flight path along the approach course and maintaining separation from aircraft on the parallel approach. This document presents the results of an analysis of the AILS concept from an Air Traffic Control (ATC) perspective. The process has been examined in a step-by-step manner to determine the ATC system support necessary to safely conduct closely spaced parallel approaches using the AILS concept. The analysis resulted in recognizing a number of issues related to integrating the process into the airspace system and proposes operating procedures.
Summarizing health inequalities in a Balanced Scorecard. Methodological considerations.
Auger, Nathalie; Raynault, Marie-France
2006-01-01
The association between social determinants and health inequalities is well recognized. What are now needed are tools to assist in disseminating such information. This article describes how the Balanced Scorecard may be used for summarizing data on health inequalities. The process begins by selecting appropriate social groups and indicators, and is followed by the measurement of differences across person, place, or time. The next step is to decide whether to focus on absolute versus relative inequality. The last step is to determine the scoring method, including whether to address issues of depth of inequality.
Standardization of Performance Tests: A Proposal for Further Steps.
1986-07-01
...obviously demand substantial attention can sometimes be time-shared perfectly. Wickens describes cases in which skilled pianists can time-share sight-reading... effects of divided attention on information processing in tracking. Journal of Experimental Psychology, 1, 1-13. Wickens, C.D. (1984). Processing resources... attention he regards focused/divided attention tasks (e.g. dichotic listening, dual-task situations) as theoretically useful. From his point of view good...
Lee, Young Han
2012-01-01
The objectives are (1) to introduce an easy open-source macro program as connection software and (2) to illustrate its practical uses in the radiologic reading environment by simulating the radiologic reading process. The simulation is a set of radiologic reading processes representing practical tasks in the radiologic reading room. The principal processes are: (1) viewing radiologic images on the Picture Archiving and Communicating System (PACS), (2) connecting to the HIS/EMR (Hospital Information System/Electronic Medical Record) system, (3) making an automatic radiologic reporting system, and (4) recording and recalling information on interesting cases. This simulation environment was designed using an open-source macro program as connection software. The simulation performed well on the Windows-based PACS workstation. Radiologists practiced the steps of the simulation comfortably by utilizing the macro-powered radiologic environment. This macro program could automate several cumbersome manual steps in the radiologic reading process. The program successfully acts as connection software for the PACS software, EMR/HIS, spreadsheets, and various other input devices in the radiologic reading environment. A user-friendly, efficient radiologic reading environment can be established by utilizing an open-source macro program as connection software. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
IBES: a tool for creating instructions based on event segmentation
Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra
2013-01-01
Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool. PMID:24454296
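The validation logic, matching author-chosen step boundaries against independently obtained event boundaries, can be sketched simply (times and the one-second tolerance are invented for illustration):

    def matched(step_times, event_times, tol=1.0):
        """True for each instruction step that has an event boundary within tol seconds."""
        return [any(abs(s - e) <= tol for e in event_times) for s in step_times]

    steps = [4.8, 12.1, 20.5]        # author-chosen instruction-step boundaries (s)
    events = [5.0, 9.7, 12.0, 20.0]  # boundaries from the event-segmentation task (s)
    print(matched(steps, events))    # -> [True, True, True]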
Hosseinpoor, Ahmad Reza; Nambiar, Devaki; Tawilah, Jihane; Schlotheuber, Anne; Briot, Benedicte; Bateman, Massee; Davey, Tamzyn; Kusumawardani, Nunik; Myint, Theingi; Nuryetty, Mariet Tetty; Prasetyo, Sabarinah; Suparmi; Floranita, Rustini
Inequalities in health represent a major problem in many countries, including Indonesia. Addressing health inequality is a central component of the Sustainable Development Goals and a priority of the World Health Organization (WHO). WHO provides technical support for health inequality monitoring among its member states. Following a capacity-building workshop in the WHO South-East Asia Region in 2014, Indonesia expressed interest in incorporating health-inequality monitoring into its national health information system. This article details the capacity-building process for national health inequality monitoring in Indonesia, discusses successes and challenges, and considers how this process may be adapted and implemented in other countries and settings. We outline the key capacity-building activities undertaken in Indonesia between April 2016 and December 2017 (a series of workshops, meetings, and related processes) and present the four key outcomes. At each stage, a range of stakeholders with access to the relevant data and capacity for data analysis, interpretation and reporting was engaged, under the stewardship of state agencies. Key steps in strengthening health inequality monitoring included capacity building in (1) identification of the health topics/areas of interest, (2) mapping data sources and identifying gaps, (3) conducting equity analyses using raw datasets, and (4) interpreting and reporting inequality results. As a result, Indonesia developed its first national report on the state of health inequality. A number of peer-reviewed manuscripts on various aspects of health inequality in Indonesia have also been developed. The capacity-building process undertaken in Indonesia is designed to be adaptable to other contexts. Capacity building for health inequality monitoring among countries is a critical step toward strengthening equity-oriented national health information systems and eventually tackling health inequities.
Motivation for health information seeking and processing about clinical trial enrollment.
Yang, Z Janet; McComas, Katherine; Gay, Geri; Leonard, John P; Dannenberg, Andrew J; Dillon, Hildy
2010-07-01
Low patient accrual in clinical trials poses serious concerns for the advancement of medical science in the United States. Past research has identified health communication as a crucial step in overcoming barriers to enrollment. However, few communication scholars have studied this problem from a sociopsychological perspective to understand what motivates people to look for or pay attention to information about clinical trial enrollment. This study applies the model of Risk Information Seeking and Processing (RISP) to this context of health decision making. By recognizing the uncertainties embedded in clinical trials, we view clinical trial enrollment as a case study of risk. With data from a random-digit-dial telephone survey of 500 adults living in the United States, we used structural equation modeling to test the central part of the RISP model. In particular, we examined the role of optimistic feelings, as a type of positive affect, in motivating information seeking and processing. Our results indicated that rather than exerting an indirect influence on information seeking through motivating a psychological need for more information, optimistic feelings have more direct relationships with information seeking and processing. Similarly, informational subjective norms also exhibit a more direct relationship with information seeking and processing. These results suggest merit in applying the RISP model to study health decision making related to clinical trial enrollment. Our findings also render practical implications on how to improve communication about clinical trial enrollment.
Scientific Workflows + Provenance = Better (Meta-)Data Management
NASA Astrophysics Data System (ADS)
Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.
2013-12-01
The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and, more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata. DataONE is a federation of member nodes that store data and metadata for discovery and access. By enriching metadata with provenance information, search and reuse of data are enhanced, and the 'social life' of data (being the product of many workflow runs, different people, etc.) is revealed. We are currently prototyping a provenance repository (PBase) to demonstrate what can be achieved with advanced provenance queries. The ProvExplorer and ProPub tools support advanced ad-hoc querying and visualization of provenance as well as customized provenance publications (e.g., to address privacy issues, or to focus provenance on relevant details). In a parallel line of work, we are exploring ways to add provenance support to widely used scripting platforms (e.g. R and Python) and then expose that information via D-PROV.
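A minimal illustration of pairing prospective with retrospective provenance follows (a toy dictionary structure invented for this sketch, not the D-PROV schema):

    # Prospective provenance: the workflow's declared steps.
    workflow = {"steps": ["fetch", "regrid", "plot"]}

    # Retrospective provenance: one record per executed step.
    trace = [
        {"step": "fetch",  "started": "2013-10-01T12:00:00Z", "inputs": ["url1"]},
        {"step": "regrid", "started": "2013-10-01T12:05:30Z", "inputs": ["raw.nc"]},
    ]

    def explain(step_name):
        """Report the executions at or before a given workflow step."""
        order = workflow["steps"].index
        return [r for r in trace if order(r["step"]) <= order(step_name)]

    print(explain("regrid"))   # both recorded executions precede or equal 'regrid'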
Lenderink, Bertil W; Egberts, Toine C G
2004-08-01
Recent reports and studies of errors in the medication process have raised the awareness of the threat to public health. An essential step in this multi-stage process is the actual administration of a medicine to the patient. The closed loop system is thought to be a way of preventing medication errors. Current information technology can facilitate this process. This article describes the way barcode technology is being used to facilitate medication administration registration on several wards in our hospital and nursing home.
[Role of medical information processing for quality assurance in obstetrics].
Selbmann, H K
1983-06-01
The paradigm of problem-oriented assurance of the professional quality of medical care is a kind of "control loop" consisting of the following 5 steps: routine observation, identification of the problem, analysis of the problem, translation of problem solutions into daily practice, and control as to whether the problem has been solved or eliminated. Medical data processing, which involves documentation, electronic data processing and statistics, can make substantial contributions especially to the steps of observation, identification of the problem, and follow-up control. Perinatal data collection, which has already been introduced in 6 Länder of the Federal Republic of Germany, has supplied ample proof of this. These operations were conducted under the heading "internal clinical quality assurance with external aid". The clinics that participated in this programme were given the necessary aid for self-observation (questionnaires, clinical statistics), and they were also given comparative data to help them identify problems (clinical profiles, etc.). It is entirely left to the responsibility of the clinics themselves -- voluntary cooperation and guaranteed anonymity being a matter of course -- to draw their own consequences from the collected data and to translate these into everyday clinical practice.
NIMH Prototype Management Information System for Community Mental Health Centers
Wurster, Cecil R.; Goodman, John D.
1980-01-01
Various approaches to centralized support of computer applications in health care are described. The NIMH project to develop a prototype Management Information System (MIS) for community mental health centers is presented and discussed as a centralized development of an automated data processing system for multiple user organizations. The NIMH program is summarized, the prototype MIS is characterized, and steps taken to provide for the differing needs of the mental health centers are highlighted.
A three-talk model for shared decision making: multistage consultation process.
Elwyn, Glyn; Durand, Marie Anne; Song, Julia; Aarts, Johanna; Barr, Paul J; Berger, Zackary; Cochran, Nan; Frosch, Dominick; Galasiński, Dariusz; Gulbrandsen, Pål; Han, Paul K J; Härter, Martin; Kinnersley, Paul; Lloyd, Amy; Mishra, Manish; Perestelo-Perez, Lilisbeth; Scholl, Isabelle; Tomori, Kounosuke; Trevena, Lyndal; Witteman, Holly O; Van der Weijden, Trudy
2017-11-06
Objectives: To revise an existing three-talk model for learning how to achieve shared decision making, and to consult with relevant stakeholders to update and obtain wider engagement. Design: Multistage consultation process. Setting: Key informant group, communities of interest, and survey of clinical specialties. Participants: 19 key informants, 153 member responses from multiple communities of interest, and 316 responses to an online survey from medically qualified clinicians from six specialties. Results: After extended consultation over three iterations, we revised the three-talk model by making changes to one talk category, adding the need to elicit patient goals, providing a clear set of tasks for each talk category, and adding suggested scripts to illustrate each step. A new three-talk model of shared decision making is proposed, based on "team talk," "option talk," and "decision talk," to depict a process of collaboration and deliberation. Team talk places emphasis on the need to provide support to patients when they are made aware of choices, and to elicit their goals as a means of guiding decision making processes. Option talk refers to the task of comparing alternatives, using risk communication principles. Decision talk refers to the task of arriving at decisions that reflect the informed preferences of patients, guided by the experience and expertise of health professionals. Conclusions: The revised three-talk model of shared decision making depicts conversational steps, initiated by providing support when introducing options, followed by strategies to compare and discuss trade-offs, before deliberation based on informed preferences. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Changing Careers: Steps to Success.
ERIC Educational Resources Information Center
Sikula, Lola
This book is intended to assist adults who are contemplating changing careers or actually doing so. The information, exercises, questionnaires, guides, and bibliographies included in it are designed to give adults the coping skills to adapt to and work through the personal, financial, and other changes that accompany the career change process. The…
Bereaved Employee: Returning to Work
... Out-of-town relatives return home. Children go back to school and grieving adults must get back to work. For some, returning to work is ... one is a long, slow process, but getting back into a routine is an important step in the journey.
Implications of New and Changing Occupations for Instructional Development.
ERIC Educational Resources Information Center
Russell, Jill Frymier
A study was conducted to determine what occupations nationally are new and changing and if they need curriculum development at the vocational education level. The process used to conduct this study involved four steps: identifying new and changing occupations, collecting information about the occupations, locating available instructional…
One-Step "Change" and "Compare" Word Problems: Focusing on Eye-Movements
ERIC Educational Resources Information Center
Moutsios-Rentzos, Andreas; Stamatis, Panagiotis J.
2015-01-01
Introduction. In this study, we focus on the relationship between the students' mathematical thinking and their non-mechanically identified eye-movements with the purpose to gain deeper understanding about the students' reasoning processes and to investigate the feasibility of incorporating eye-movement information in everyday pedagogy. Method.…
The Archivists' Toolkit: Another Step toward Streamlined Archival Processing
ERIC Educational Resources Information Center
Westbrook, Bradley D.; Mandell, Lee; Shepherd, Kelcy; Stevens, Brian; Varghese, Jason
2006-01-01
The Archivists' Toolkit is a software application currently in development and designed to support the creation and management of archival information. This article summarizes the development of the application, including some of the problems the application is designed to resolve. Primary emphasis is placed on describing the application's…
Surviving Accreditation: A QIAS Ideas Bank. Accreditation and Beyond Series, Volume I.
ERIC Educational Resources Information Center
Ferry, Jan
This publication provides information on the accreditation process for early childhood education and care providers participating in the Quality Improvement and Accreditation System (QIAS), developed by the National Childcare Accreditation Council of Australia. The publication is divided into sections corresponding to steps in the…
The New Guide to Utility Ratemaking.
ERIC Educational Resources Information Center
American Gas Association, Arlington, VA. Educational Services.
This booklet focuses on state regulations of gas, electricity, water, and telephone services. Section 1 describes the basic steps in a rate case, procedures followed, and key terms used in explaining these processes. Included information highlights preparing and tracking a rate case in terms of: (1) preliminary events; (2) the staff's position and…
A Call for Strategic Planning: The Two-Year College Imperative.
ERIC Educational Resources Information Center
Masoner, David J.; Essex, Nathan L.
1987-01-01
Addresses the imperative for strategic and tactical planning to support the viability of the two-year college. Describes a process for approaching strategic planning, comprising the following steps: self-identification, self-analysis, analysis of service area, informed decision making, and the development of a marketing plan. (CBC)
Communique: Resources for Practicing Counselors, Vol. 2, No. 8.
ERIC Educational Resources Information Center
Walz, Garry R., Ed.
This issue of Communique, a newsletter providing resource information for practicing counselors, features an article describing two non-verbal group counseling techniques for the elementary school counselor; a description of value clarification including a definition of values, the steps in the value clarification process, and specific value…
Teacher as Writer: Entering the Professional Conversation.
ERIC Educational Resources Information Center
Dahl, Karin L., Ed.
This book, featuring teacher writers from all levels of education, offers consciousness-raising stories of the teachers' first steps toward authorship, advice for all aspects of the writing process, suggestions for conducting writing groups, and a wealth of insider information on how to develop quality articles for professional journals and get…
Get Started: Energy Efficiency Makes More Sense Than Ever.
ERIC Educational Resources Information Center
Alban, Josh; Drabick, J. R.
2003-01-01
Describes the benefits of making school building more energy efficient. Provides examples of physical retrofits and behavioral changes to save energy costs. Describes four-step process to create an energy efficiency plan. Includes resources and information such as U.S. Department of Energy's Energy STAR program (www.energystar.gov). (PKP)
SCORE A: A Student Research Paper Writing Strategy.
ERIC Educational Resources Information Center
Korinek, Lori; Bulls, Jill A.
1996-01-01
A mnemonic strategy for writing a research paper is explained. "SCORE A" reminds the student to select a subject, create categories, obtain sources, read and take notes, evenly organize the information, and apply process writing steps. Implementation of the strategy with five eighth graders with learning disabilities is reported. (DB)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-09
...Washington, DC 20230. SUPPLEMENTARY INFORMATION: The Agenda topics to be discussed are: U.S. Small Business Administration State Trade and Export Promotion (STEP) Grants Process. Christine L. Turner, Assistant U.S. Trade...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... on small businesses). Our initial plan, as the first step in the assessment process, was to interview..., and for-hire recreational fishing operations (charter and party/head boat operations)--with questions... and fishing industry interviews were completed. The commercial fisheries interviews were not begun due...
Measuring diagnoses: ICD code accuracy.
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-10-01
To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
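The metric-computation step lends itself to a compact illustration. A hedged toy version follows (Python; the field names and the two metrics are invented for illustration, not taken from the method):

    process_steps = [
        {"name": "stamp", "cycle_s": 30, "value_added": True},
        {"name": "queue", "cycle_s": 90, "value_added": False},
        {"name": "weld",  "cycle_s": 45, "value_added": True},
    ]

    total = sum(s["cycle_s"] for s in process_steps)
    value_added = sum(s["cycle_s"] for s in process_steps if s["value_added"])
    metrics = {"total_cycle_s": total,                     # summed over all steps
               "value_added_ratio": value_added / total}   # a common lean metric
    print(metrics)   # {'total_cycle_s': 165, 'value_added_ratio': 0.4545...}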
NASA Astrophysics Data System (ADS)
Wallace, Jon Michael
2003-10-01
Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
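Since the screening step ranks input parameters against the whole vector of component responses at once, a simplified stand-in can be sketched with ordinary canonical correlation analysis (the study's Approximate CCA is not reproduced here; the data and the ranking rule below are invented):

    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))   # five candidate input parameters
    Y = np.column_stack([
        2.0 * X[:, 0] + 0.1 * rng.normal(size=200),       # response 1
        X[:, 1] - X[:, 0] + 0.1 * rng.normal(size=200),   # response 2
    ])

    cca = CCA(n_components=2).fit(X, Y)
    importance = np.abs(cca.x_weights_).sum(axis=1)   # aggregate weight per parameter
    print(np.argsort(importance)[::-1])               # parameters 0 and 1 should rank first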
Development of STEP-NC Adaptor for Advanced Web Manufacturing System
NASA Astrophysics Data System (ADS)
Ajay Konapala, Mr.; Koona, Ramji, Dr.
2017-08-01
Information systems play a key role in the modern era of Information Technology. Rapid developments in IT and global competition call for many changes in the basic CAD/CAM/CAPP/CNC manufacturing chain of operations. ‘STEP-NC’, an enhancement to STEP for operating CNC machines, creates new opportunities for collaborative, concurrent, adaptive work across the manufacturing chain of operations. Schemas and data models defined by ISO 14649 in liaison with the ISO 10303 standards make a STEP-NC file rich with feature-based information, rather than the mere point-to-point information of the G/M-code format. But one needs a suitable information system to understand and modify these files. Various STEP-NC information systems are reviewed to assess the suitability of STEP-NC for web manufacturing. The present work also deals with the development of an adaptor which imports a STEP-NC file, organizes its information, allows modifications to entity values, and finally generates a new STEP-NC file to export. The system is designed and developed to work on the web, to gain the additional benefits the web offers and to be part of a proposed ‘Web based STEP-NC manufacturing platform’, which is under development and described as future scope.
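For a sense of what the adaptor's import step involves at minimum, a sketch follows (Python; the regex and sample entity are illustrative assumptions and handle only simple single-line ISO 10303-21 instances, not the full Part 21 grammar a real adaptor would need):

    import re

    P21_LINE = re.compile(r"#(\d+)\s*=\s*(\w+)\s*\((.*)\);")

    def parse_p21(text):
        """Map entity id -> (type, raw argument string) for simple one-line instances."""
        return {int(m[1]): (m[2], m[3]) for m in P21_LINE.finditer(text)}

    sample = "#10=MACHINING_WORKINGSTEP('WS 1',#18,#24,#30,$);"
    entities = parse_p21(sample)
    print(entities[10][0])   # MACHINING_WORKINGSTEP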
Smits, Rochelle; Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William
2014-03-14
Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases.
The role of deep-water sedimentary processes in shaping a continental margin: The Northwest Atlantic
Mosher, David C.; Campbell, D.C.; Gardner, J.V.; Piper, D.J.W.; Chaytor, Jason; Rebesco, M.
2017-01-01
The tectonic history of a margin dictates its general shape; however, its geomorphology is generally transformed by deep-sea sedimentary processes. The objective of this study is to show the influences of turbidity currents, contour currents and sediment mass failures on the geomorphology of the deep-water northwestern Atlantic margin (NWAM) between Blake Ridge and Hudson Trough, spanning about 32° of latitude and extending from the shelf edge to the abyssal plain. This assessment is based on new multibeam echosounder data, global bathymetric models and sub-surface geophysical information. The deep-water NWAM is divided into four broad geomorphologic classifications based on bathymetric shape: graded, above-grade, stepped and out-of-grade. These shapes were created as a function of the balance between sediment accumulation and removal, which in turn was related to sedimentary processes and slope accommodation. This descriptive method of classifying continental margins, while being non-interpretative, is more informative than the conventional continental shelf, slope and rise classification, and better facilitates interpretation concerning dominant sedimentary processes. Areas of the margin dominated by turbidity currents and slope by-pass developed graded slopes. If sediments did not by-pass the slope, due to accommodation, then an above-grade or stepped slope resulted. Geostrophic currents created sedimentary bodies of a variety of forms and positions along the NWAM. Detached drifts form linear, above-grade slopes along their crests from the shelf edge to the deep basin. Plastered drifts formed stepped slope profiles. Sediment mass failure has had a variety of consequences for the margin morphology; large mass failures created out-of-grade profiles, whereas smaller mass failures tended to remain on the slope and formed above-grade profiles at trough-mouth fans, or nearly graded profiles, such as offshore Cape Fear.
Global phenomena from local rules: Peer-to-peer networks and crystal steps
NASA Astrophysics Data System (ADS)
Finkbiner, Amy
Even simple, deterministic rules can generate interesting behavior in dynamical systems. This dissertation examines some real-world systems for which fairly simple, locally defined rules yield useful or interesting properties in the system as a whole. In particular, we study routing in peer-to-peer networks and the motion of crystal steps. Peers can vary by three orders of magnitude in their capacities to process network traffic. This heterogeneity inspires our use of "proportionate load balancing," where each peer provides resources in proportion to its individual capacity. We provide an implementation that employs small, local adjustments to bring the entire network into a global balance. Analytically and through simulations, we demonstrate the effectiveness of proportionate load balancing on two routing methods for de Bruijn graphs, introducing a new "reversed" routing method which performs better than standard forward routing in some cases. The prevalence of peer-to-peer applications prompts companies to locate the hosts participating in these networks. We explore the use of supervised machine learning to identify peer-to-peer hosts, without using application-specific information. We introduce a model for "triples," which exploits information about nearly contemporaneous flows to give a statistical picture of a host's activities. We find that triples, together with measurements of inbound vs. outbound traffic, can capture most of the behavior of peer-to-peer hosts. An understanding of crystal surface evolution is important for the development of modern nanoscale electronic devices. The most commonly studied surface features are steps, which form at low temperatures when the crystal is cut close to a plane of symmetry. Step bunching, when steps arrange into widely separated clusters of tightly packed steps, is one important step phenomenon. We analyze a discrete model for crystal steps, in which the motion of each step depends on the two steps on either side of it. We find a time-dependence term for the motion that does not appear in continuum models, and we determine an explicit dependence on step number.
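As a hedged illustration of such locally coupled step dynamics (the velocity law below is an assumed attachment-limited toy rule, not the dissertation's model), a discrete step-flow update can be sketched in a few lines:

    import numpy as np

    def step_velocities(x):
        """x: sorted step positions; each interior step moves according to the
        widths of its two neighbouring terraces (end steps held fixed)."""
        w = np.diff(x)                      # terrace widths
        v = np.zeros_like(x)
        v[1:-1] = 0.5 * (w[1:] - w[:-1])    # a wider terrace ahead pulls the step forward
        return v

    rng = np.random.default_rng(2)
    x = np.cumsum(np.r_[0.0, 1.0 + 0.1 * rng.normal(size=9)])   # 10 slightly uneven steps
    for _ in range(100):                    # explicit Euler time stepping
        x += 0.1 * step_velocities(x)       # spacing relaxes toward uniform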
Neural correlates of phonetic convergence and speech imitation.
Garnier, Maëva; Lamalle, Laurent; Sato, Marc
2013-01-01
Speakers unconsciously tend to mimic their interlocutor's speech during communicative interaction. This study aims at examining the neural correlates of phonetic convergence and deliberate imitation, in order to explore whether imitation of phonetic features, deliberate, or unconscious, might reflect a sensory-motor recalibration process. Sixteen participants listened to vowels with pitch varying around the average pitch of their own voice, and then produced the identified vowels, while their speech was recorded and their brain activity was imaged using fMRI. Three degrees and types of imitation were compared (unconscious, deliberate, and inhibited) using a go-nogo paradigm, which enabled the comparison of brain activations during the whole imitation process, its active perception step, and its production. Speakers followed the pitch of voices they were exposed to, even unconsciously, without being instructed to do so. After being informed about this phenomenon, 14 participants were able to inhibit it, at least partially. The results of whole brain and ROI analyses support the fact that both deliberate and unconscious imitations are based on similar neural mechanisms and networks, involving regions of the dorsal stream, during both perception and production steps of the imitation process. While no significant difference in brain activation was found between unconscious and deliberate imitations, the degree of imitation, however, appears to be determined by processes occurring during the perception step. Four regions of the dorsal stream: bilateral auditory cortex, bilateral supramarginal gyrus (SMG), and left Wernicke's area, indeed showed an activity that correlated significantly with the degree of imitation during the perception step.
NASA Astrophysics Data System (ADS)
Capar, Laure
2013-04-01
Within the framework of the transnational project GeoMol, geophysical and geological information on the entire Molasse Basin and on the Po Basin is gathered to build consistent cross-border 3D geological models based on borehole evidence and seismic data. Benefiting from important progress in seismic processing, these new models will provide answers to various questions regarding the usage of subsurface resources, such as geothermal energy, CO2 and gas storage, and oil and gas production, and will support decision-making by national and local administrations as well as by industry. More than 28 000 km of 2D seismic lines are compiled, reprocessed and harmonized. This work faces various problems: the vertical drop of more than 700 meters between the west and the east of the Molasse Basin (and, to a lesser extent, within the Po Plain), the heterogeneities of the substratum, the large disparities in the period and parameters of seismic acquisition, and, depending on availability, the use of two types of seismic data, raw and processed. The main challenge is to harmonize all lines to the same reference level, amplitude and stage of signal processing from France to Austria, spanning more than 1000 km, to avoid misfits at crossing points between seismic lines and artifacts at the country borders, facilitating the interpretation of the various geological layers in the Molasse Basin and Po Basin. A generalized stratigraphic column for the two basins is set up, representing all geological layers relevant to subsurface usage. This stratigraphy constitutes the harmonized framework for seismic reprocessing. In general, processed seismic data is available on paper at the stack stage, and the information required to take these seismic lines to the final stage of processing, the migration step, consists of the datum plane and the replacement velocity. However, several datum planes and replacement velocities were used in previous processing projects. Our processing sequence is first to digitize the data, so as to have them in SEG-Y format. The second step is to apply post-stack processing to obtain good data quality before the final migration step. The third step is the final migration, using optimized migration velocities, and the fourth step is the post-migration processing. In the case of raw seismic data, the information required for processing is made accessible, such as observer logs, coordinates and field seismic data. The processing sequence used to obtain the final usable version of the seismic line is based on a pre-stack time migration, and a complex processing sequence is applied. One main issue is dealing with the significant changes in topography along the seismic lines and with the uppermost, roughly twenty-meter-thick layer, the low velocity zone (LVZ) or weathered zone, where lateral velocity variations occur and disturb the wave propagation and therefore the seismic signal. In seismic processing, this is addressed with static corrections, which remove the effects of these lateral velocity variations and of the topography (see the sketch after this record). Another key point is the proper determination of root-mean-square velocities for migration, to improve the final result of seismic processing. Within GeoMol, generalized 3D models of stacking velocities are calculated in order to perform a rapid time-depth conversion. In the end, all seismic lines of the project GeoMol will be at the same level of processing, the migration level.
But to tie all these lines together, a single appropriate datum plane and replacement velocity for the entire Molasse Basin and Po Plain, respectively, have to be carefully set up to avoid misties at crossing points. The reprocessing and use of these 28 000 km of seismic lines in the project GeoMol provide the pivotal database for building a 3D framework model for regional subsurface information on the Alpine foreland basins (cf. Rupf et al. 2013, EGU2013-8924). The project GeoMol is co-funded by the Alpine Space Program as part of the European Territorial Cooperation 2007-2013. The project integrates partners from Austria, France, Germany, Italy, Slovenia and Switzerland and runs from September 2012 to June 2015. Further information at www.geomol.eu. The GeoMol seismic interpretation team: Roland Baumberger (swisstopo), Agnès Brenot (BRGM), Alessandro Cagnoni (RLB), Renaud Couëffe (BRGM), Gabriel Courrioux (BRGM), Chiara D'Ambrogi (ISPRA), Chrystel Dezayes (BRGM), Charlotte Fehn (LGRB), Sunseare Gabalda (BRGM), Gregor Götzl (GBA), Andrej Lapanje (GeoZS), Stéphane Marc (BRGM), Alberto Martini (RER-SGSS), Fabio Carlo Molinari (RER-SGSS), Edgar Nitsch (LGRB), Robert Pamer (LfU BY), Marco Pantaloni (ISPRA), Sebastian Pfleiderer (GBA), Andrea Piccin (RLB), Nils Oesterling (swisstopo), Isabel Rupf (LGRB), Uta Schulz (LfU BY), Yves Simeon (BRGM), Günter Sökol (LGRB), Heiko Zumsprekel (LGRB)
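The static corrections referenced in the abstract above can be illustrated with a minimal sketch of a simple elevation (datum) static; the elevations, datum and replacement velocity below are illustrative assumptions, not GeoMol project parameters.

```python
# Each station's travel time through the column of rock between its
# elevation and the datum plane is computed with the replacement
# velocity; a trace is shifted by the sum of its source and receiver
# statics, as if both sat on the common datum.

def station_static_s(elevation_m: float, datum_m: float,
                     v_replacement_m_s: float) -> float:
    # One-way time (s) between the station and the datum plane.
    return (elevation_m - datum_m) / v_replacement_m_s

def trace_static_s(src_elev_m: float, rcv_elev_m: float,
                   datum_m: float, v_replacement_m_s: float) -> float:
    # Total static shift applied to one trace.
    return (station_static_s(src_elev_m, datum_m, v_replacement_m_s)
            + station_static_s(rcv_elev_m, datum_m, v_replacement_m_s))

# Example: source at 750 m, receiver at 700 m, datum at 500 m,
# replacement velocity 2500 m/s -> 0.18 s total shift.
print(trace_static_s(750.0, 700.0, 500.0, 2500.0))
```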
NASA Astrophysics Data System (ADS)
Kawamura, M.; Umeda, K.; Ohi, T.; Ishimaru, T.; Niizato, T.; Yasue, K.; Makino, H.
2007-12-01
We have developed a formal evaluation method to assess the potential impact of natural phenomena (earthquakes and faulting; volcanism; uplift, subsidence, denudation and sedimentation; climatic and sea-level changes) on a High Level Radioactive Waste (HLW) disposal system. In 2000, we had developed perturbation scenarios in a generic and conservative sense and illustrated the potential impact on a HLW disposal system. As a result of that work, two points were highlighted for consideration in subsequent work: improving the scenarios from the viewpoints of reality, transparency, traceability and consistency, and avoiding extreme conservatism. We have therefore developed a new procedure for describing such perturbation scenarios, based on further studies of the characteristics of these natural perturbation phenomena in Japan. The approach to describing a perturbation scenario is developed in five steps. Step 1: Description of the potential processes of the phenomena and their impacts on the geological environment. Step 2: Characterization of potential changes of the geological environment in terms of T-H-M-C (Thermal - Hydrological - Mechanical - Chemical) processes; the focus is on specific T-H-M-C parameters that influence geological barrier performance, utilizing the input from Step 1. Step 3: Classification of potential influences, based on similarity of T-H-M-C perturbations; this leads to the development of perturbation scenarios to serve as a basis for consequence analysis. Step 4: Establishing models and parameters for performance assessment. Step 5: Calculation and assessment. This study focuses on identifying the key T-H-M-C processes associated with perturbations at Step 2. The framework has two advantages. The first is that it assures traceability throughout the scenario construction process, facilitating the production and structuring of suitable records. The second is that it provides effective elicitation and organization of information from a wide range of earth-science investigations within a performance assessment context. In this framework, scenario development proceeds in a stepwise manner, to ensure clear identification of the impact of the processes associated with these phenomena on a HLW disposal system. Output is organized to create credible scenarios with the required transparency, consistency, traceability and adequate conservatism. In this presentation, the potential impact of natural phenomena from the viewpoint of performance assessment for HLW disposal will be discussed and modeled using this approach.
Magnetic Memory from Site Isolated Dy(III) on Silica Materials
2017-01-01
Achieving magnetic remanence at single isolated metal sites dispersed at the surface of a solid matrix has been envisioned as a key step toward information storage and processing in the smallest unit of matter. Here, we show that isolated Dy(III) sites distributed at the surface of silica nanoparticles, prepared with a simple and scalable two-step process, show magnetic remanence and display a hysteresis loop open at liquid 4He temperature, in contrast to the molecular precursor which does not display any magnetic memory. This singular behavior is achieved through the controlled grafting of a tailored Dy(III) siloxide complex on partially dehydroxylated silica nanoparticles followed by thermal annealing. This approach allows control of the density and the structure of isolated, “bare” Dy(III) sites bound to the silica surface. During the process, all organic fragments are removed, leaving the surface as the sole ligand, promoting magnetic remanence. PMID:28386602
Magnetic memory from site isolated Dy(III) on silica materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allouche, Florian; Lapadula, Giuseppe; Siddiqi, Georges
Achieving magnetic remanence at single isolated metal sites dispersed at the surface of a solid matrix has been envisioned as a key step toward information storage and processing in the smallest unit of matter. Here, we show that isolated Dy(III) sites distributed at the surface of silica nanoparticles, prepared with a simple and scalable two-step process, show magnetic remanence and display a hysteresis loop open at liquid 4He temperature, in contrast to the molecular precursor which does not display any magnetic memory. This singular behavior is achieved through the controlled grafting of a tailored Dy(III) siloxide complex on partially dehydroxylated silica nanoparticles followed by thermal annealing. This approach allows control of the density and the structure of isolated, “bare” Dy(III) sites bound to the silica surface. Throughout the process, all organic fragments are removed, leaving the surface as the sole ligand, promoting magnetic remanence.
Magnetic memory from site isolated Dy(III) on silica materials
Allouche, Florian; Lapadula, Giuseppe; Siddiqi, Georges; ...
2017-02-22
Achieving magnetic remanence at single isolated metal sites dispersed at the surface of a solid matrix has been envisioned as a key step toward information storage and processing in the smallest unit of matter. Here, we show that isolated Dy(III) sites distributed at the surface of silica nanoparticles, prepared with a simple and scalable two-step process, show magnetic remanence and display a hysteresis loop open at liquid 4He temperature, in contrast to the molecular precursor which does not display any magnetic memory. This singular behavior is achieved through the controlled grafting of a tailored Dy(III) siloxide complex on partially dehydroxylated silica nanoparticles followed by thermal annealing. This approach allows control of the density and the structure of isolated, “bare” Dy(III) sites bound to the silica surface. Throughout the process, all organic fragments are removed, leaving the surface as the sole ligand, promoting magnetic remanence.
Photonic Quantum Networks formed from NV− centers
Nemoto, Kae; Trupke, Michael; Devitt, Simon J.; Scharfenberger, Burkhard; Buczak, Kathrin; Schmiedmayer, Jörg; Munro, William J.
2016-01-01
In this article we present a simple repeater scheme based on the negatively-charged nitrogen vacancy centre in diamond. Each repeater node is built from modules comprising an optical cavity containing a single NV−, with one nuclear spin from 15N as quantum memory. The module uses only deterministic processes and interactions to achieve high fidelity operations (>99%), and modules are connected by optical fiber. In the repeater node architecture, the photon-mediated processes between modules can in principle be deterministic; however, current limitations on optical components make these processes probabilistic but heralded. Our resource-modest repeater architecture contains two modules at each node, and the repeater nodes are then connected by entangled photon pairs. We discuss the performance of such a quantum repeater network with modest resources and then incorporate more resource-intensive strategies step by step. Our architecture should allow large-scale quantum information networks with existing or near-future technology. PMID:27215433
Photonic Quantum Networks formed from NV(-) centers.
Nemoto, Kae; Trupke, Michael; Devitt, Simon J; Scharfenberger, Burkhard; Buczak, Kathrin; Schmiedmayer, Jörg; Munro, William J
2016-05-24
In this article we present a simple repeater scheme based on the negatively-charged nitrogen vacancy centre in diamond. Each repeater node is built from modules comprising an optical cavity containing a single NV(-), with one nuclear spin from (15)N as quantum memory. The module uses only deterministic processes and interactions to achieve high fidelity operations (>99%), and modules are connected by optical fiber. In the repeater node architecture, the photon-mediated processes between modules can in principle be deterministic; however, current limitations on optical components make these processes probabilistic but heralded. Our resource-modest repeater architecture contains two modules at each node, and the repeater nodes are then connected by entangled photon pairs. We discuss the performance of such a quantum repeater network with modest resources and then incorporate more resource-intensive strategies step by step. Our architecture should allow large-scale quantum information networks with existing or near-future technology.
Davis, Thomas D
2017-01-01
Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable preference for informal-interactive tools, a preference that is cognitively necessary, sufficient, and stand-alone, requiring neither supplementation nor balancing by formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution to the century-old attempt to understand how and why social workers evaluate their practice the way they do.
Toouli, George; Georgiou, Andrew; Westbrook, Johanna
2012-01-01
It is expected that health information technology (HIT) will deliver a safer, more efficient and effective health care system. The aim of this study was to undertake a qualitative and video-ethnographic examination of the impact of information technologies on work processes in the reception area of a microbiology department, to ascertain what changed, how it changed and the impact of the change. The setting for this study was the microbiology laboratory of a large tertiary hospital in Sydney. The study consisted of qualitative (interview and focus group) data and observation sessions for the period August 2005 to October 2006, along with video footage shot in three sessions covering the original system and the two stages of the Cerner implementation. Data analysis was assisted by NVivo software, and process maps were produced from the video footage. Two laboratory information systems were observed in the video footage, with computerized provider order entry introduced four months later. Process maps highlighted the large number of pre-data-entry steps in the original system, whilst the newer system incorporated many of these steps into the data entry stage. However, any time saved with the new system was offset by the requirement to complete some data entry of patient information not previously required. Other changes noted included the change of responsibilities for the reception staff and the physical changes required to accommodate the increased activity around the data entry area. Implementing a new HIT is always an exciting time for any environment, but ensuring that the implementation goes smoothly and with minimal trouble requires the administrator and their team to plan well in advance for staff training, physical layout and possible staff resource reallocation.
Theoretical analysis of Lumry-Eyring models in differential scanning calorimetry
Sanchez-Ruiz, Jose M.
1992-01-01
A theoretical analysis of several protein denaturation models (Lumry-Eyring models) that include a rate-limited step leading to an irreversibly denatured state of the protein (the final state) has been carried out. The differential scanning calorimetry transitions predicted for these models can be broadly classified into four groups: situations A, B, C, and C′. (A) The transition is calorimetrically irreversible, but the rate-limited, irreversible step takes place at a significant rate only at temperatures slightly above those corresponding to the transition; equilibrium thermodynamics analysis is permissible. (B) The transition is distorted by the occurrence of the rate-limited step; nevertheless, it contains thermodynamic information about the reversible unfolding of the protein, which could be obtained upon appropriate data treatment. (C) The heat absorption is entirely determined by the kinetics of formation of the final state and no thermodynamic information can be extracted from the calorimetric transition; the rate-determining step is the irreversible process itself. (C′) The same as C, but in this case the rate-determining step is a previous step in the unfolding pathway. It is shown that ligand and protein concentration effects on transitions corresponding to situation C (strongly rate-limited transitions) are similar to those predicted by equilibrium thermodynamics for simple reversible unfolding models. It has been widely held in recent literature that experimentally observed ligand and protein concentration effects support the applicability of equilibrium thermodynamics to irreversible protein denaturation. The theoretical analysis reported here disfavors this claim. PMID:19431826
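For a rough feel of situation C, here is a minimal numerical sketch of a strongly rate-limited DSC transition, in which heat absorption is governed entirely by a first-order irreversible step with an Arrhenius rate constant; all parameter values are assumptions for illustration, not values from the paper.

```python
# Situation C: d(alpha)/dT = k(T) * (1 - alpha) / v, with excess heat
# capacity Cp_exc(T) = dH * d(alpha)/dT, integrated by forward Euler.

import numpy as np

R = 8.314        # J/(mol K)
dH = 300e3       # J/mol, assumed denaturation enthalpy
Ea = 250e3       # J/mol, assumed activation energy
T_star = 330.0   # K, temperature where k = 1/min (assumed)
v = 1.0          # K/min scan rate

def k_per_min(T):
    # Arrhenius rate constant, parameterized by T_star where k = 1/min.
    return np.exp((Ea / R) * (1.0 / T_star - 1.0 / T))

T = np.arange(300.0, 360.0, 0.01)
alpha = np.zeros_like(T)               # fraction converted to final state
for i in range(1, len(T)):
    dT = T[i] - T[i - 1]
    dadT = k_per_min(T[i - 1]) * (1.0 - alpha[i - 1]) / v
    alpha[i] = min(1.0, alpha[i - 1] + dadT * dT)

Cp_exc = dH * np.gradient(alpha, T)    # J/(mol K)
print("apparent Tm (peak) =", T[np.argmax(Cp_exc)], "K")
```

In this kinetically controlled regime the apparent Tm shifts with the scan rate v, one of the signatures distinguishing situation C from equilibrium-controlled transitions.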
Safeguarding the process of drug administration with an emphasis on electronic support tools
Seidling, Hanna M; Lampert, Anette; Lohmann, Kristina; Schiele, Julia T; Send, Alexander J F; Witticke, Diana; Haefeli, Walter E
2013-01-01
Aims The aim of this work is to understand the process of drug administration and identify points in the workflow that resulted in interventions by clinical information systems in order to improve patient safety. Methods To identify a generic way to structure the drug administration process we performed peer-group discussions and supplemented these discussions with a literature search for studies reporting errors in drug administration and strategies for their prevention. Results We concluded that the drug administration process might consist of up to 11 sub-steps, which can be grouped into the four sub-processes of preparation, personalization, application and follow-up. Errors in drug handling and administration are diverse and frequent and in many cases not caused by the patient him/herself, but by family members or nurses. Accordingly, different prevention strategies have been set in place with relatively few approaches involving e-health technology. Conclusions A generic structuring of the administration process and particular error-prone sub-steps may facilitate the allocation of prevention strategies and help to identify research gaps. PMID:24007450
Digital image processing of vascular angiograms
NASA Technical Reports Server (NTRS)
Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.
1975-01-01
The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.
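As an illustration of the edge-location step, here is a minimal sketch, not the JPL pipeline, that places the two vessel edges on one scanline at the extrema of the optical-density gradient, assuming the contrast-filled vessel appears as a ridge of high density.

```python
# Locate vessel edges on a single scanline of digitized optical density.

import numpy as np

def edge_positions(density: np.ndarray) -> tuple:
    """Left edge at the steepest rise into the lumen shadow, right edge
    at the steepest fall out of it (assumes a bright-ridge vessel)."""
    g = np.gradient(density.astype(float))
    return int(np.argmax(g)), int(np.argmin(g))

# Synthetic profile: background density 0.2, vessel plateau 1.0.
profile = np.concatenate([np.full(40, 0.2), np.linspace(0.2, 1.0, 5),
                          np.full(30, 1.0), np.linspace(1.0, 0.2, 5),
                          np.full(40, 0.2)])
print(edge_positions(profile))   # indices near the two ramps
```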
Army Program Value Added Analysis 90-97 (VAA 90-97)
1991-08-01
affordability or duplication of capability. The AHP process appears to hold the greatest possibilities in this regard. 1-11. OTHER KEY FINDINGS a. The ... to provide the logical skeleton in which to build an alternative's effectiveness value. The analytical hierarchy process (AHP) is particularly ... likely to be, at first cut, very fuzzy. Thus, the issue clarification step is inherently iterative. As the analyst gathers more and more information in
NASA Astrophysics Data System (ADS)
Gholibeigian, Hassan; Gholibeigian, Ghasem; Amirshahkarami, Azim; Gholibeigian, Kazem
2017-01-01
Four animated sub-particles (sub-strings), as the origin of life and generators of momentum (vibration) of elementary particles (strings), are proposed to communicate in order to transfer information for processing and to prepare fundamental particles for the next step. This means that information may be a "dimension" of nature in which fundamental particles, dark matter/energy and space-time float, listening to its whispering and receiving quantum information packages about their conditions and laws. Thus, the communication of information, which began before the spark of the B.B. (Convection Bang), may be a "fundamental symmetry" of nature, because it leads to other symmetries and supersymmetry as well as other phenomena. The processed information is always carried by fundamental particles as the preserved history and entropy of the Universe. Information therefore would not be destroyed, lost or released by a black hole; rather, the fundamental particles involved in thermal radiation and in electromagnetic and gravitational fields carry processed information as they are emitted from the black hole, while being communicated from the fifth dimension for their new movement. AmirKabir University of Technology, Tehran, Iran.
NASA Astrophysics Data System (ADS)
Renschler, C.; Sheridan, M. F.; Patra, A. K.
2008-05-01
The impact and consequences of extreme geophysical events (hurricanes, floods, wildfires, volcanic flows, mudflows, etc.) on properties and processes should be continuously assessed by a well-coordinated interdisciplinary research and outreach approach addressing risk assessment and resilience. Communication between the various involved disciplines and stakeholders is the key to a successful implementation of an integrated risk management plan. These issues become apparent at the level of decision support tools for extreme events/disaster management in natural and managed environments. The Geospatial Project Management Tool (GeoProMT) is a collaborative platform for research and training to document and communicate the fundamental steps in transforming information for extreme events at various scales for analysis and management. GeoProMT is an internet-based interface for the management of shared geo-spatial and multi-temporal information such as measurements, remotely sensed images, and other GIS data. This tool enhances collaborative research activities and the ability to assimilate data from diverse sources by integrating information management. This facilitates a better understanding of natural processes and enhances the integrated assessment of resilience against both the slow and fast onset of hazard risks. Fundamental to understanding and communicating complex natural processes are: (a) representation of the spatiotemporal variability, extremes, and uncertainty of environmental properties and processes in the digital domain; (b) transformation of their spatiotemporal representation across scales (e.g., interpolation, aggregation, disaggregation) during data processing and modeling in the digital domain; and designing and developing tools for (c) geo-spatial data management, (d) geo-spatial process modeling and effective implementation, and (e) supporting decision- and policy-making in natural resources and hazard management at various spatial and temporal scales of interest. GeoProMT is useful for researchers, practitioners, and decision-makers because it provides an integrated environmental system assessment and data management approach that considers the spatial and temporal scales and variability in natural processes. Particularly at the occurrence or onset of extreme events, it can utilize the latest data sources that are available at variable scales, combine them with existing information, and update assessment products such as risk and vulnerability assessment maps. Because integrated geo-spatial assessment requires careful consideration of all the steps in utilizing data, modeling and decision-making formats, each step in the sequence must be assessed in terms of how information is being scaled. At the process scale, various geophysical models (e.g., TITAN, LAHARZ, or many other examples) are appropriate for incorporation in the tool. Some examples that illustrate our approach include: 1) coastal parishes impacted by Hurricane Rita (Southwestern Louisiana), 2) a watershed affected by extreme rainfall-induced debris-flows (Madison County, Virginia; Panabaj, Guatemala; Casita, Nicaragua), and 3) the potential for pyroclastic flows to threaten a city (Tungurahua, Ecuador). This research was supported by the National Science Foundation.
Employing the Intelligence Cycle Process Model Within the Homeland Security Enterprise
2013-12-01
the Iraq anti-war movement, a former U.S. Congresswoman, the U.S. Treasury Department and hip hop bands to spread Sharia law in the U.S. A Virginia ... challenges remain with threat notification, access to information, and database management of information that may have contributed to the 2013 Boston ... The FBI said it took a number of investigative steps to check on the request, including looking at his travel history, checking databases for
Genomic medicine in the military
De Castro, Mauricio; Biesecker, Leslie G; Turner, Clesson; Brenner, Ruth; Witkop, Catherine; Mehlman, Maxwell; Bradburne, Chris; Green, Robert C
2016-01-01
The announcement of the Precision Medicine Initiative was an important step towards establishing the use of genomic information as part of the wider practice of medicine. The US military has been exploring the role that genomic information will have in health care for service members (SMs) and its integration into the continuum of military medicine. An important part of the process is establishing robust safeguards to protect SMs from genetic discrimination in the era of exome/genome sequencing. PMID:29263806
Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali
2017-01-01
Background Traffic accidents are one of the more important national and international issues, and their consequences matter at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. Objective The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. Methods This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified via literature retrieved from the Internet and based on the inclusion criteria. Review of the literature was performed until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population consisted of hospital managers, police, emergency staff, statisticians, and IT experts in trauma, emergency and police centers. Sampling was purposive. Data was collected using a questionnaire based on the first-step data; validity and reliability were determined by content validity and a Cronbach's alpha of 75%. Data was analyzed using the decision Delphi technique. Results GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and analyzing these data, were the most important capabilities of GIS in traffic accident information management. Conclusion Storing and retrieving descriptive and spatial data; providing statistical analysis in table, chart and zoning formats; managing ill-structured issues; determining the cost-effectiveness of decisions; and prioritizing their implementation were the most important capabilities of GIS that can be efficient in the management of traffic accident information. PMID:28848627
2013-01-01
Background The goal of many proteomics experiments is to determine the abundance of proteins in biological samples, and the variation thereof in various physiological conditions. High-throughput quantitative proteomics, specifically label-free LC-MS/MS, allows rapid measurement of thousands of proteins, enabling large-scale studies of various biological systems. Prior to analyzing these information-rich datasets, raw data must undergo several computational processing steps. We present a method to address one of the essential steps in proteomics data processing - the matching of peptide measurements across samples. Results We describe a novel method for label-free proteomics data alignment with the ability to incorporate previously unused aspects of the data, particularly ion mobility drift times and product ion information. We compare the results of our alignment method to PEPPeR and OpenMS, and compare alignment accuracy achieved by different versions of our method utilizing various data characteristics. Our method results in increased match recall rates and similar or improved mismatch rates compared to PEPPeR and OpenMS feature-based alignment. We also show that the inclusion of drift time and product ion information results in higher recall rates and more confident matches, without increases in error rates. Conclusions Based on the results presented here, we argue that the incorporation of ion mobility drift time and product ion information are worthy pursuits. Alignment methods should be flexible enough to utilize all available data, particularly with recent advancements in experimental separation methods. PMID:24341404
Benjamin, Ashlee M; Thompson, J Will; Soderblom, Erik J; Geromanos, Scott J; Henao, Ricardo; Kraus, Virginia B; Moseley, M Arthur; Lucas, Joseph E
2013-12-16
The goal of many proteomics experiments is to determine the abundance of proteins in biological samples, and the variation thereof in various physiological conditions. High-throughput quantitative proteomics, specifically label-free LC-MS/MS, allows rapid measurement of thousands of proteins, enabling large-scale studies of various biological systems. Prior to analyzing these information-rich datasets, raw data must undergo several computational processing steps. We present a method to address one of the essential steps in proteomics data processing--the matching of peptide measurements across samples. We describe a novel method for label-free proteomics data alignment with the ability to incorporate previously unused aspects of the data, particularly ion mobility drift times and product ion information. We compare the results of our alignment method to PEPPeR and OpenMS, and compare alignment accuracy achieved by different versions of our method utilizing various data characteristics. Our method results in increased match recall rates and similar or improved mismatch rates compared to PEPPeR and OpenMS feature-based alignment. We also show that the inclusion of drift time and product ion information results in higher recall rates and more confident matches, without increases in error rates. Based on the results presented here, we argue that the incorporation of ion mobility drift time and product ion information are worthy pursuits. Alignment methods should be flexible enough to utilize all available data, particularly with recent advancements in experimental separation methods.
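To illustrate the matching step the two records above describe, here is a minimal sketch of greedy, tolerance-gated feature matching across two runs using m/z, retention time, and ion-mobility drift time; the tolerances and the greedy strategy are assumptions for illustration, not the published algorithm.

```python
# Match peptide features across two LC-MS/MS runs: a feature in run A
# claims the closest unclaimed feature in run B that falls inside all
# three tolerance windows.

from dataclasses import dataclass

@dataclass
class Feature:
    mz: float      # mass-to-charge
    rt: float      # retention time, min
    drift: float   # ion-mobility drift time, ms

def match(features_a, features_b, mz_ppm=10.0, rt_tol=0.5, drift_tol=0.2):
    taken, pairs = set(), []
    for i, fa in enumerate(features_a):
        best, best_d = None, None
        for j, fb in enumerate(features_b):
            if j in taken:
                continue
            if abs(fa.mz - fb.mz) / fa.mz * 1e6 > mz_ppm:
                continue
            if abs(fa.rt - fb.rt) > rt_tol or abs(fa.drift - fb.drift) > drift_tol:
                continue
            d = abs(fa.rt - fb.rt) + abs(fa.drift - fb.drift)
            if best is None or d < best_d:
                best, best_d = j, d
        if best is not None:
            taken.add(best)
            pairs.append((i, best))
    return pairs

run_a = [Feature(500.25, 30.1, 5.0), Feature(620.33, 45.2, 7.1)]
run_b = [Feature(500.251, 30.3, 5.1), Feature(880.40, 12.0, 9.0)]
print(match(run_a, run_b))   # [(0, 0)]
```

The drift-time gate is what the abstract argues for: it rejects co-eluting features of similar mass that differ in mobility, raising match confidence without loosening the other tolerances.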
Environmental Factor(tm) system: RCRA hazardous waste handler information (on cd-rom). Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-04-01
Environmental Factor(tm) RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management and minimization by companies who are large quantity generators; and (3) Data on the waste management practices of treatment, storage and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status and more; (2) View compliance information - dates of evaluation, violation, enforcement and corrective action; (3) Look up facilities by waste processing categories of marketing, transporting, processing and energy recovery; (4) Use owner/operator information and names, titles and telephone numbers of project managers for prospecting; and (5) Browse detailed data on TSD facility and large quantity generators' activities such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains databases, search and retrieval software on two CD-ROMs, an installation diskette and User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving and exporting. Hotline support is also available for no additional charge.
Environmental Factor{trademark} system: RCRA hazardous waste handler information
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1999-03-01
Environmental Factor{trademark} RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management and minimization by companies who are large quantity generators; and (3) Data on the waste management practices of treatment, storage and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status and more; (2) View compliance information -- dates of evaluation, violation, enforcement and corrective action; (3) Look up facilities by waste processing categories of marketing, transporting, processing and energy recovery; (4) Use owner/operator information and names, titles and telephone numbers of project managers for prospecting; and (5) Browse detailed data on TSD facility and large quantity generators' activities such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains databases, search and retrieval software on two CD-ROMs, an installation diskette and User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving and exporting. Hotline support is also available for no additional charge.
Effects of rewiring strategies on information spreading in complex dynamic networks
NASA Astrophysics Data System (ADS)
Ally, Abdulla F.; Zhang, Ning
2018-04-01
Recent advances in networks and communication services have attracted much interest in understanding information spreading in social networks. Consequently, numerous studies have been devoted to providing effective and accurate models for mimicking information spreading. However, knowledge of how to spread information faster and more widely remains a contentious issue, and most existing works are based on static networks, which fail to capture the dynamism of the entities that participate in information spreading. Using the SIR epidemic model, this study explores and compares the effects of two rewiring models (based on Fermi-Dirac and linear functions) on information spreading in scale-free and small-world networks. Our results show that for all the rewiring strategies, the spreading influence grows with time but stabilizes in a steady state at later time steps. This means that information spreading takes off during the initial spreading steps, after which the spreading prevalence settles toward its equilibrium, with the majority of the population having recovered and thus no longer affecting the spreading. Meanwhile, the rewiring strategy based on the Fermi-Dirac distribution function to some extent impedes the spreading process; nevertheless, the structure of the networks still supports the spreading, even at a low spreading rate. The worst case occurs when the spreading rate is extremely small. The results emphasize that although such networks play a large role in supporting the spreading, the role of the parameters cannot simply be ignored. The probability of high-degree neighbors being informed grows much faster under the rewiring strategy based on the linear function than under the one based on the Fermi-Dirac distribution function. Clearly, the rewiring model based on the linear function generates the fastest spreading across the networks. Therefore, if we are interested in speeding up the spreading process in stochastic modeling, the linear function may play a pivotal role.
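A minimal simulation sketch of the setup described above follows: SIR spreading on a small-world network in which susceptible nodes may rewire away from infected neighbors with a probability given by a Fermi-Dirac function. The exact rewiring rule and all parameter values are assumptions for illustration, not the paper's specification.

```python
import math
import random
import networkx as nx

def fermi(x, beta=1.0):
    # Fermi-Dirac function used here as a rewiring probability.
    return 1.0 / (1.0 + math.exp(beta * x))

def sir_with_rewiring(G, beta_inf=0.1, gamma=0.05, steps=200, seed=1):
    rng = random.Random(seed)
    state = {n: "S" for n in G}
    state[rng.choice(list(G))] = "I"          # single initial spreader
    for _ in range(steps):
        infected = [n for n in G if state[n] == "I"]
        for i in infected:
            for nb in list(G[i]):             # snapshot: edges change below
                if state[nb] == "S" and rng.random() < beta_inf:
                    state[nb] = "I"           # transmission
                elif state[nb] == "S" and rng.random() < fermi(G.degree(nb) - G.degree(i)):
                    # susceptible node rewires away from its infected neighbor
                    candidates = [m for m in G
                                  if m not in (nb, i) and not G.has_edge(nb, m)]
                    if candidates:
                        G.remove_edge(nb, i)
                        G.add_edge(nb, rng.choice(candidates))
        for i in infected:
            if rng.random() < gamma:
                state[i] = "R"                # recovery
    return sum(s != "S" for s in state.values())   # ever infected (I or R)

G = nx.watts_strogatz_graph(500, 6, 0.1, seed=42)
print("ever infected:", sir_with_rewiring(G))
```

Swapping fermi() for a linear probability clipped to [0, 1] gives the second rewiring strategy the abstract compares.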
NASA Astrophysics Data System (ADS)
Alshakova, E. L.
2017-01-01
A program in the AutoLISP language makes it possible to generate parametric drawings automatically while working in the AutoCAD software product. Students study the development of programs in the AutoLISP language using a methodical complex containing methodical instructions in which real examples of the creation of images and drawings are realized. The methodical instructions contain the reference information necessary for performing the offered tasks. The method of step-by-step development of the program is the basis for training in AutoLISP programming: the program draws the elements of a detail drawing by means of specially created functions, whose argument values are entered in the same sequence in which AutoCAD issues prompts when the corresponding command is executed in the editor. The process of program design is thus reduced to the step-by-step formation of functions and of the sequence of their calls. The author considers the development of AutoLISP programs for the creation of parametric drawings of details of a defined design, where the user enters the dimensions of the detail's elements. These programs generate variants of the tasks for the graphic works performed in the educational process of the "Engineering Graphics" and "Engineering and Computer Graphics" disciplines. Individual tasks allow students to develop skills of independent work in reading and creating drawings, as well as in 3D modeling.
Applying soil property information for watershed assessment.
NASA Astrophysics Data System (ADS)
Archer, V.; Mayn, C.; Brown, S. R.
2017-12-01
The Forest Service uses a priority watershed scheme to guide where to direct watershed restoration work. Initial assessment was done across the nation following the watershed condition framework process. This assessment method uses soils information for a three-step ranking across each 12-digit hydrologic unit; however, the soil information used in the assessment may not provide adequate detail to guide work on the ground. Modern remote sensing information and terrain derivatives that model environmental gradients hold promise for showing the influence of soil-forming factors on watershed processes. These fine-resolution data products enable the disaggregation of coarse-scale soils mapping to show continuous soil property information across a watershed. When this information is coupled with geomorphic and geologic information, watershed specialists can better understand the controlling influences on drainage within watersheds and focus on where watershed restoration projects can have the most success. A case study on the application of this work shows where road restoration may be most effective.
Pressure modulates the self-cleavage step of the hairpin ribozyme
NASA Astrophysics Data System (ADS)
Schuabb, Caroline; Kumar, Narendra; Pataraia, Salome; Marx, Dominik; Winter, Roland
2017-03-01
The ability of certain RNAs, denoted as ribozymes, to not only store genetic information but also catalyse chemical reactions gave support to the RNA world hypothesis as a putative step in the development of early life on Earth. This, however, might have evolved under extreme environmental conditions, including the deep sea with pressures in the kbar regime. Here we study pressure-induced effects on the self-cleavage of hairpin ribozyme by following structural changes in real-time. Our results suggest that compression of the ribozyme leads to an accelerated transesterification reaction, being the self-cleavage step, although the overall process is retarded in the high-pressure regime. The results reveal that favourable interactions between the reaction site and neighbouring nucleobases are strengthened under pressure, resulting therefore in an accelerated self-cleavage step upon compression. These results suggest that properly engineered ribozymes may also act as piezophilic biocatalysts in addition to their hitherto known properties.
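The pressure dependence described above is conventionally analysed with the transition-state-theory relation below (a standard result, not a formula quoted from the paper), where ΔV‡ is the activation volume:

```latex
\ln k(p) = \ln k(p_0) - \frac{\Delta V^{\ddagger}\,(p - p_0)}{R T}
```

A negative ΔV‡, meaning a transition state more compact than the reactant, makes the rate constant grow with pressure, consistent with the accelerated self-cleavage upon compression, while steps with positive activation volumes are retarded instead.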
Carter, Richard J.; Wiesner, Karoline
2018-01-01
As a step towards understanding pre-evolutionary organization in non-genetic systems, we develop a model to investigate the emergence and dynamics of proto-autopoietic networks in an interacting population of simple information processing entities (automata). Our simulations indicate that dynamically stable strongly connected networks of mutually producing communication channels emerge under specific environmental conditions. We refer to these distinct organizational steady states as information niches. In each case, we measure the information content by the Shannon entropy, and determine the fitness landscape, robustness and transition pathways for information niches subjected to intermittent environmental perturbations under non-evolutionary conditions. By determining the information required to generate each niche, we show that niche transitions are only allowed if accompanied by an equal or increased level of information production that arises internally or via environmental perturbations that serve as an exogenous source of population diversification. Overall, our simulations show how proto-autopoietic networks of basic information processors form and compete, and under what conditions they persist over time or go extinct. These findings may be relevant to understanding how inanimate systems such as chemically communicating protocells can initiate the transition to living matter prior to the onset of contemporary evolutionary and genetic mechanisms. PMID:29343630
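The information measure used above is the Shannon entropy; a minimal sketch of its computation over a distribution of channel states:

```python
import math

def shannon_entropy(probs):
    """H = -sum p_i * log2(p_i), skipping zero-probability states."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.25, 0.25]))   # 1.5 bits
```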
Sensation-to-Cognition Cortical Streams in Attention-Deficit/Hyperactivity Disorder
Carmona, Susana; Hoekzema, Elseline; Castellanos, Francisco X.; García-García, David; Lage-Castellanos, Agustín; Van Dijk, Koene R.A.; Navas-Sánchez, Francisco J.; Martínez, Kenia; Desco, Manuel; Sepulcre, Jorge
2015-01-01
We sought to determine whether functional connectivity streams that link sensory, attentional, and higher-order cognitive circuits are atypical in attention-deficit/hyperactivity disorder (ADHD). We applied a graph-theory method to the resting-state functional magnetic resonance imaging data of 120 children with ADHD and 120 age-matched typically developing children (TDC). Starting in unimodal primary cortex—visual, auditory, and somatosensory—we used stepwise functional connectivity to calculate functional connectivity paths at discrete numbers of relay stations (or link-step distances). First, we characterized the functional connectivity streams that link sensory, attentional, and higher-order cognitive circuits in TDC and found that systems do not reach the level of integration achieved by adults. Second, we searched for stepwise functional connectivity differences between children with ADHD and TDC. We found that, at the initial steps of sensory functional connectivity streams, patients display significant enhancements of connectivity degree within neighboring areas of primary cortex, while connectivity to attention-regulatory areas is reduced. Third, at subsequent link-step distances from primary sensory cortex, children with ADHD show decreased connectivity to executive processing areas and increased degree of connections to default mode regions. Fourth, in examining medication histories in children with ADHD, we found that children medicated with psychostimulants present functional connectivity streams with higher degree of connectivity to regions subserving attentional and executive processes compared to medication-naïve children. We conclude that predominance of local sensory processing and lesser influx of information to attentional and executive regions may reduce the ability to organize and control the balance between external and internal sources of information in ADHD. PMID:25821110
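Stepwise functional connectivity can be sketched with matrix powers: in one common formulation (assumed here as a simplification; the study's actual pipeline involves thresholded resting-state correlation matrices and further normalization), the number of k-step paths between regions is given by the k-th power of the binarized connectivity matrix.

```python
# Stepwise degree at link-step distance k: total number of k-step paths
# reaching each region from a set of seed regions (e.g. primary cortex).

import numpy as np

def stepwise_degree(A: np.ndarray, seeds, k: int) -> np.ndarray:
    """A: (n, n) binary adjacency matrix; seeds: seed region indices."""
    Ak = np.linalg.matrix_power(A, k)   # (A^k)[i, j] counts k-step paths
    return Ak[seeds, :].sum(axis=0)

# Toy example: a chain of 5 regions with the seed at region 0.
A = (np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)).astype(int)
print(stepwise_degree(A, [0], 3))   # -> [0 2 0 1 0]
```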
Tomson, Tanja; Zary, Nabil
2014-01-01
Introduction. Antimicrobial resistance is a global health issue. Studies have shown that improved antibiotic prescription education among healthcare professionals reduces mistakes during the antibiotic prescription process. The aim of this study was to investigate novel educational approaches that, through the use of Augmented Reality technology, could make use of the real physical context and thereby enrich the educational process of antibiotics prescription. The objective is to investigate which type of information related to antibiotics could be used in an augmented reality application for antibiotics education. Methods. This study followed the Design-Based Research Methodology, composed of the following main steps: problem analysis, investigation of the information that should be visualized for the training session, and finally the involvement of the end users in the development and evaluation processes of the prototype. Results. Two of the most important aspects of the antibiotic prescription process to represent in an augmented reality application are the antibiotic guidelines and the side effects. Moreover, this study showed how this information could be visualized from a mobile device using an Augmented Reality scanner and antibiotic drug boxes as markers. Discussion. In this study we investigated the usage of objects from a real physical context, such as drug boxes, and how they could be used as educational resources. The logical next steps are to examine how this approach of combining physical and virtual contexts through Augmented Reality applications could contribute to the improvement of competencies among healthcare professionals and its impact on the decrease of antibiotic resistance. PMID:25548733
Nifakos, Sokratis; Tomson, Tanja; Zary, Nabil
2014-01-01
Introduction. Antimicrobial resistance is a global health issue. Studies have shown that improved antibiotic prescription education among healthcare professionals reduces mistakes during the antibiotic prescription process. The aim of this study was to investigate novel educational approaches that, through the use of Augmented Reality technology, could make use of the real physical context and thereby enrich the educational process of antibiotics prescription. The objective is to investigate which type of information related to antibiotics could be used in an augmented reality application for antibiotics education. Methods. This study followed the Design-Based Research Methodology, composed of the following main steps: problem analysis, investigation of the information that should be visualized for the training session, and finally the involvement of the end users in the development and evaluation processes of the prototype. Results. Two of the most important aspects of the antibiotic prescription process to represent in an augmented reality application are the antibiotic guidelines and the side effects. Moreover, this study showed how this information could be visualized from a mobile device using an Augmented Reality scanner and antibiotic drug boxes as markers. Discussion. In this study we investigated the usage of objects from a real physical context, such as drug boxes, and how they could be used as educational resources. The logical next steps are to examine how this approach of combining physical and virtual contexts through Augmented Reality applications could contribute to the improvement of competencies among healthcare professionals and its impact on the decrease of antibiotic resistance.
A Systematic Method for Reviewing and Analyzing Health Information on Consumer-Oriented Websites.
Rew, Lynn; Saenz, Ashley; Walker, Lorraine O
2018-05-29
A discussion of a proposed method for analyzing the quality of consumer-oriented websites that provide health-related information. The quality of health information available to consumers online varies widely. In an effort to improve the quality of online information, experts have undertaken systematic reviews on selected health topics; however, no standardized comprehensive methodology currently exists for such review. An eight-step method is recommended, comprising the following steps: (1) select the topic; (2) determine the purpose of the analysis; (3) select search terms and engines; (4) develop and apply website inclusion and exclusion criteria; (5) develop processes and tools to manage search results; (6) specify measures of quality; (7) compute readability; (8) evaluate websites. Each of these steps is illustrated in relation to the health topic of gynecomastia, a physical and mental health challenge for many adolescent males and young men. Although most extant analyses of consumer-oriented websites have focused on disease conditions and their treatment, website-analysis methodology would encourage analyses that fall into the nursing care domain. The method outlined in this paper is intended to provide nurses and others who work with specific patient populations with the tools needed for website analytic studies. Such studies provide a foundation for making recommendations about quality websites, as well as identifying gaps in online information for health consumers.
Johannessen, Liv Karen; Obstfelder, Aud; Lotherington, Ann Therese
2013-05-01
The purpose of this paper is to explore the making and scaling of information infrastructures, as well as how the conditions for scaling a component may change for the vendor. The first research question is how the making and scaling of a healthcare information infrastructure can be done and by whom. The second question is what scope for manoeuvre there might be for vendors aiming to expand their market. This case study is based on an interpretive approach, whereby data is gathered through participant observation and semi-structured interviews. A case study of the making and scaling of an electronic system for general practitioners ordering laboratory services from hospitals is described as comprising two distinct phases. The first may be characterized as an evolving phase, when development, integration and implementation were achieved in small steps, and the vendor, together with end users, had considerable freedom to create the solution according to the users' needs. The second phase was characterized by a large-scale procurement process over which regional healthcare authorities exercised much more control and the needs of groups other than the end users influenced the design. The making and scaling of healthcare information infrastructures is not simply a process of evolution, in which the end users use and change the technology. It also consists of large steps, during which different actors, including vendors and healthcare authorities, may make substantial contributions. This process requires work, negotiation and strategies. The conditions for the vendor may change dramatically, from considerable freedom and close relationships with users and customers in the small-scale development, to losing control of the product and being required to engage in more formal relations with customers in the wider public healthcare market. Onerous procurement processes may be one of the reasons why large-scale implementation of information projects in healthcare is difficult and slow.
2014-01-01
Background Despite increasing investment in health research capacity strengthening efforts in low and middle income countries, published evidence to guide the systematic design and monitoring of such interventions is very limited. Systematic processes are important to underpin capacity strengthening interventions because they provide stepwise guidance and allow for continual improvement. Our objective here was to use evidence to inform the design of a replicable but flexible process to guide health research capacity strengthening that could be customized for different contexts, and to provide a framework for planning, collecting information, making decisions, and improving performance. Methods We used peer-reviewed and grey literature to develop a five-step pathway for designing and evaluating health research capacity strengthening programmes, tested in a variety of contexts in Africa. The five steps are: i) defining the goal of the capacity strengthening effort, ii) describing the optimal capacity needed to achieve the goal, iii) determining the existing capacity gaps compared to the optimum, iv) devising an action plan to fill the gaps and associated indicators of change, and v) adapting the plan and indicators as the programme matures. Our paper describes three contrasting case studies of organisational research capacity strengthening to illustrate how our five-step approach works in practice. Results Our five-step pathway starts with a clear goal and objectives, making explicit the capacity required to achieve the goal. Strategies for promoting sustainability are agreed with partners and incorporated from the outset. Our pathway for designing capacity strengthening programmes focuses not only on technical, managerial, and financial processes within organisations, but also on the individuals within organisations and the wider system within which organisations are coordinated, financed, and managed. Conclusions Our five-step approach is flexible enough to generate and utilise ongoing learning. We have tested and critiqued our approach in a variety of organisational settings in the health sector in sub-Saharan Africa, but it needs to be applied and evaluated in other sectors and continents to determine the extent of transferability. PMID:24581148
Method for modeling social care processes for national information exchange.
Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit
2012-01-01
Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.
Gibbs, Jo; Sutcliffe, Lorna J; Gkatzidou, Voula; Hone, Kate; Ashcroft, Richard E; Harding-Esch, Emma M; Lowndes, Catherine M; Sadiq, S Tariq; Sonnenberg, Pam; Estcourt, Claudia S
2016-07-22
Despite considerable international eHealth impetus, there is no guidance on the development of online clinical care pathways. Advances in diagnostics now enable self-testing with home diagnosis, to which comprehensive online clinical care could be linked, facilitating completely self-directed, remote care. We describe a new framework for developing complex online clinical care pathways and its application to clinical management of people with genital chlamydia infection, the commonest sexually transmitted infection (STI) in England. Using the existing evidence-base, guidelines and examples from contemporary clinical practice, we developed the eClinical Care Pathway Framework, a nine-step iterative process. Step 1: define the aims of the online pathway; Step 2: define the functional units; Step 3: draft the clinical consultation; Step 4: expert review; Step 5: cognitive testing; Step 6: user-centred interface testing; Step 7: specification development; Step 8: software testing, usability testing and further comprehension testing; Step 9: piloting. We then applied the Framework to create a chlamydia online clinical care pathway (Online Chlamydia Pathway). Use of the Framework elucidated content and structure of the care pathway and identified the need for significant changes in sequences of care (Traditional: history, diagnosis, information versus Online: diagnosis, information, history) and prescribing safety assessment. The Framework met the needs of complex STI management and enabled development of a multi-faceted, fully-automated consultation. The Framework provides a comprehensive structure on which complex online care pathways such as those needed for STI management, which involve clinical services, public health surveillance functions and third party (sexual partner) management, can be developed to meet national clinical and public health standards. The Online Chlamydia Pathway's standardised method of collecting data on demographics and sexual behaviour, with potential for interoperability with surveillance systems, could be a powerful tool for public health and clinical management.
A distributed fault-detection and diagnosis system using on-line parameter estimation
NASA Technical Reports Server (NTRS)
Guo, T.-H.; Merrill, W.; Duyar, A.
1991-01-01
The development of a model-based fault-detection and diagnosis system (FDD) is reviewed. The system can be used as an integral part of an intelligent control system. It determines the faults of a system by comparing the measurements of the system with a priori information represented by the system model. The method of modeling a complex system is described and a description of diagnosis models which include process faults is presented. There are three distinct classes of fault modes covered by the system performance model equation: actuator faults, sensor faults, and performance degradation. A system equation for a complete model that describes all three classes of faults is given. The strategy for detecting the fault and estimating the fault parameters using a distributed on-line parameter identification scheme is presented. A two-step approach is proposed. The first step is composed of a group of hypothesis testing modules (HTMs) running in parallel, one to test each class of faults. The second step is the fault diagnosis module, which checks all the information obtained from the HTM level, isolates the fault, and determines its magnitude. The proposed FDD system was demonstrated by applying it to detect actuator and sensor faults added to a simulation of the Space Shuttle Main Engine. The simulation results show that the proposed FDD system can adequately detect the faults and estimate their magnitudes.
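A minimal sketch of the two-step architecture just described, with hypothetical fault classes, model outputs, and scoring; it is not the authors' implementation, which estimates fault parameters by distributed on-line identification.

```python
import numpy as np

# Hypothetical per-fault-class model predictions; in the paper each hypothesis
# testing module (HTM) runs in parallel to test one class of faults.
measured = np.array([1.0, 1.2, 0.9])
fault_models = {
    "actuator":    np.array([0.9, 1.1, 1.0]),
    "sensor":      np.array([0.2, 0.3, 0.1]),
    "degradation": np.array([0.5, 0.6, 0.4]),
}

def htm_score(predicted):
    # Step 1: each HTM scores how well its fault hypothesis explains the
    # measurements (lower residual energy = better explanation).
    return float(np.mean((measured - predicted) ** 2))

# Step 2: the diagnosis module checks all HTM outputs and isolates the fault.
scores = {fault: htm_score(pred) for fault, pred in fault_models.items()}
print("isolated fault:", min(scores, key=scores.get))
```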
Real-time traffic sign detection and recognition
NASA Astrophysics Data System (ADS)
Herbschleb, Ernst; de With, Peter H. N.
2009-01-01
The continuous growth of imaging databases increasingly requires analysis tools for extraction of features. In this paper, a new architecture for the detection of traffic signs is proposed. The architecture is designed to process a large database with tens of millions of images with a resolution up to 4,800×2,400 pixels. Because of the size of the database, both high reliability and high throughput are required. The novel architecture consists of a three-stage algorithm with multiple steps per stage, combining both color and specific spatial information. The first stage contains an area-limitation step which is critical to both the detection rate and the overall processing time. The second stage locates suggestions for traffic signs using recently published feature processing. The third stage contains a validation step to enhance the reliability of the algorithm. During this stage, the traffic signs are recognized. Experiments show a convincing detection rate of 99%. With respect to computational speed, the throughput for line-of-sight images of 800×600 pixels is 35 Hz and for panorama images it is 4 Hz. Our novel architecture outperforms existing algorithms with respect to both detection rate and throughput.
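A toy skeleton of such a three-stage pipeline, assuming a crude red-dominance mask for the area-limitation step; the thresholds and stage internals are placeholders, not the published algorithm.

```python
import numpy as np

def stage1_area_limitation(img):
    # Stage 1: color-based area limitation -- keep pixels whose red channel
    # dominates, a crude stand-in for the performance-critical first step.
    r, g, b = (img[..., i].astype(int) for i in range(3))
    return (r > g + 40) & (r > b + 40)

def stage2_candidates(mask, min_pixels=50):
    # Stage 2: propose sign candidates from the limited area (placeholder for
    # the feature-processing step of the real architecture).
    return [{"pixels": int(mask.sum())}] if mask.sum() >= min_pixels else []

def stage3_validate(candidates, max_pixels=10_000):
    # Stage 3: validate the surviving candidates before recognition.
    return [c for c in candidates if c["pixels"] <= max_pixels]

img = np.zeros((100, 100, 3), dtype=np.uint8)
img[40:60, 40:60, 0] = 255  # synthetic red patch standing in for a sign
print(stage3_validate(stage2_candidates(stage1_area_limitation(img))))
```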
Writing to Learn across the Curriculum. Fastback 209.
ERIC Educational Resources Information Center
Myers, John W.
Intended for use by secondary school teachers in all subject areas, this booklet provides research based information designed to make writing a learning process. Following brief discussions of the writing-to-learn concept, the importance of writing in all curricular areas, and steps in developing a writing across the curriculum program, the…
Linking Project Procedure Manual for Using Dumb-Barcode Linking on GEAC.
ERIC Educational Resources Information Center
Condron, Lyn
This procedure manual is designed to assist cataloging staff members at a university library through the 10-step process of barcoding and linking books classified by the Library of Congress system to the library's GEAC online computer system. A brief introduction provides background information on the project. The procedures involved in each…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-30
... CDC-2011-0008] Assessing the Current Research, Policy, and Practice Environment in Public Health... information helpful to assess the current research, policy, and practice environment in public health genomics. HHS/CDC is currently leading a process to assess the most important steps for public health genomics...
State-of-the-Art Facility: A Planning Process.
ERIC Educational Resources Information Center
Day, C. William; Speicher, A. Dean
Chief executive officers of school districts and facility planners must assume the role of change agent to meet the information needs of the 21st century. Public school learning, which will serve more groupings of people on a continual basis, will be disseminated through media learning centers. Management should follow six steps in planning…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-14
... an article of food (other than infant formula and dietary supplements) for which there is a... identify any risks, harms, or other dangers to health for all FDA-regulated human and animal products, the... occurs. This risk identification process is the first necessary step that allows the Agency to gather the...
Code of Federal Regulations, 2010 CFR
2010-04-01
... decision-making process and the reasons for using its emergency action authority. Information on steps... have clear procedures and guidelines for decision-making regarding emergency intervention in the market, including procedures and guidelines to avoid conflicts of interest while carrying out such decision-making...
2006-06-01
KMO) for the CFMCC staff. That officer had a daily meeting with all of the CFMCC's collateral duty knowledge managers (KM) to discuss information...analyses of process steps) and mentored by the KMO, could enhance knowledge creation and utilization while not jeopardizing work flows. Clearly in
How to Program a Domain Independent Tracer for Explanations
ERIC Educational Resources Information Center
Ishizaka, Alessio; Lusti, Markus
2006-01-01
Explanations are essential in the teaching process. Tracers are one possibility to provide students with explanations in an intelligent tutoring system. Their development can be divided into four steps: (a) the definition of the trace model; (b) the extraction of the information from this model; (c) the analysis and abstraction of the extracted…
Action Research: Its Origins and Early Application.
ERIC Educational Resources Information Center
Cook, Stuart W.
This paper contains informal remarks on action research in social psychology from its post World War II origins to its current status. Kurt Lewin first described action research in the 1946 article, "Action Research and Minority Problems," as a three-step process of program planning, program execution, and follow-up evaluation. Ronald Lippitt and…
ERIC Educational Resources Information Center
Aerospace Industries Association of America, Inc., Washington, DC.
The Aerospace Industries Association (AIA) examined its member companies and their existing university relationships as an initial step in the process of strengthening these ties. Information drawn from background research, interviews (with company representatives and university, government, and private sector spokesmen), and a formal survey of…
What to Do When a Bad Teacher Doesn't Get Better.
ERIC Educational Resources Information Center
Dennis, Bruce L.
1990-01-01
Responsible administrators are obligated to confront poor teacher performance. Guides principals through 12 steps to take in the confrontation process that include the following: gathering information, waiting for a specific incident, developing a file, meeting with the teacher, helping the teacher to improve, and working with the teacher union.…
Altan, Irem; Charbonneau, Patrick; Snell, Edward H.
2016-01-01
Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed. PMID:26792536
NASA Astrophysics Data System (ADS)
Sabbatini, S.; Fratini, G.; Arriga, N.; Papale, D.
2012-04-01
Eddy Covariance (EC) is the only technologically available direct method to measure carbon and energy fluxes between ecosystems and atmosphere. However, uncertainties related to this method have not been exhaustively assessed yet, including those deriving from post-field data processing. The latter arise because there is no exact processing sequence established for any given situation, and the sequence itself is long and complex, with many processing steps and options available. However, the consistency and inter-comparability of flux estimates may be largely affected by the adoption of different processing sequences. The goal of our work is to quantify the uncertainty introduced in each processing step by the fact that different options are available, and to study how the overall uncertainty propagates throughout the processing sequence. We propose an easy-to-use methodology to assign a confidence level to the calculated fluxes of energy and mass, based on the adopted processing sequence, and on available information such as the EC system type (e.g. open vs. closed path), the climate and the ecosystem type. The proposed methodology synthesizes the results of a massive full-factorial experiment. We use one year of raw data from 15 European flux stations and process them so as to cover all possible combinations of the available options across a selection of the most relevant processing steps. The 15 sites have been selected to be representative of different ecosystems (forests, croplands and grasslands), climates (mediterranean, nordic, arid and humid) and instrumental setup (e.g. open vs. closed path). The software used for this analysis is EddyPro™ 3.0 (www.licor.com/eddypro). The critical processing steps, selected on the basis of the different options commonly used in the FLUXNET community, are: angle of attack correction; coordinate rotation; trend removal; time lag compensation; low- and high- frequency spectral correction; correction for air density fluctuations; and length of the flux averaging interval. We illustrate the results of the full-factorial combination relative to a subset of the selected sites with particular emphasis on the total uncertainty at different time scales and aggregations, as well as a preliminary analysis of the most critical steps for their contribution to the total uncertainties and their potential relation with site set-up characteristics and ecosystem type.
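The full-factorial design can be sketched in a few lines; the option lists below are an illustrative subset standing in for the EddyPro settings enumerated in the study.

```python
from itertools import product

# Illustrative subset of processing steps and their alternative options.
options = {
    "angle_of_attack": ["none", "corrected"],
    "rotation": ["double", "planar_fit"],
    "detrending": ["block_average", "linear"],
    "time_lag": ["constant", "covariance_maximization"],
    "spectral_correction": ["analytic", "in_situ"],
}

# One processing sequence per combination of options (2**5 = 32 here); the
# spread of fluxes across sequences estimates the processing uncertainty.
sequences = [dict(zip(options, combo)) for combo in product(*options.values())]
print(len(sequences), "candidate processing sequences")
```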
Consistency of internal fluxes in a hydrological model running at multiple time steps
NASA Astrophysics Data System (ADS)
Ficchi, Andrea; Perrin, Charles; Andréassian, Vazken
2016-04-01
Improving hydrological models remains a difficult task and many ways can be explored, among which one can find the improvement of spatial representation, the search for more robust parametrization, the better formulation of some processes or the modification of model structures by trial-and-error procedure. Several past works indicate that model parameters and structure can be dependent on the modelling time step, and there is thus some rationale in investigating how a model behaves across various modelling time steps, to find solutions for improvements. Here we analyse the impact of data time step on the consistency of the internal fluxes of a rainfall-runoff model run at various time steps, by using a large data set of 240 catchments. To this end, fine time step hydro-climatic information at sub-hourly resolution is used as input of a parsimonious rainfall-runoff model (GR) that is run at eight different model time steps (from 6 minutes to one day). The initial structure of the tested model (i.e. the baseline) corresponds to the daily model GR4J (Perrin et al., 2003), adapted to be run at variable sub-daily time steps. The modelled fluxes considered are interception, actual evapotranspiration and intercatchment groundwater flows. Observations of these fluxes are not available, but the comparison of modelled fluxes at multiple time steps gives additional information for model identification. The joint analysis of flow simulation performance and consistency of internal fluxes at different time steps provides guidance to the identification of the model components that should be improved. Our analysis indicates that the baseline model structure is to be modified at sub-daily time steps to warrant the consistency and realism of the modelled fluxes. For the baseline model improvement, particular attention is devoted to the interception model component, whose output flux showed the strongest sensitivity to modelling time step. The dependency of the optimal model complexity on time step is also analysed. References: Perrin, C., Michel, C., Andréassian, V., 2003. Improvement of a parsimonious model for streamflow simulation. Journal of Hydrology, 279(1-4): 275-289. DOI:10.1016/S0022-1694(03)00225-7
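A toy illustration of why internal model behaviour depends on the time step: the same explicit-Euler linear reservoir, run over the same forcing at three time steps, drains different volumes. This is a generic numerical example, not the GR4J structure.

```python
def drained_volume(p_rate, t_end, dt, k=0.5):
    # Explicit-Euler linear reservoir: dS/dt = P - k*S, outflow Q = k*S.
    s, drained, t = 0.0, 0.0, 0.0
    while t < t_end:
        p = p_rate if t < 1.0 else 0.0      # rain falls only on the first day
        s += (p - k * s) * dt
        drained += k * s * dt
        t += dt
    return drained

for dt in (1.0, 1 / 24, 1 / 240):           # daily, hourly, 6-min steps
    print(f"dt = {dt:.4f} d -> drained volume {drained_volume(10.0, 10.0, dt):.3f}")
```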
One-step fabrication of multifunctional micromotors
NASA Astrophysics Data System (ADS)
Gao, Wenlong; Liu, Mei; Liu, Limei; Zhang, Hui; Dong, Bin; Li, Christopher Y.
2015-08-01
Although artificial micromotors have undergone tremendous progress in recent years, their fabrication normally requires complex steps or expensive equipment. In this paper, we report a facile one-step method based on an emulsion solvent evaporation process to fabricate multifunctional micromotors. By simultaneously incorporating various components into an oil-in-water droplet, upon emulsification and solidification, a sphere-shaped, asymmetric, and multifunctional micromotor is formed. Some of the attractive functions of this model micromotor include autonomous movement in high ionic strength solution, remote control, enzymatic disassembly and sustained release. This one-step, versatile fabrication method can be easily scaled up and therefore may have great potential in mass production of multifunctional micromotors for a wide range of practical applications. Electronic supplementary information (ESI) available: Videos S1-S4 and Fig. S1-S3. See DOI: 10.1039/c5nr03574k
Sheldon, Lisa Kennedy; Ellington, Lee
2008-11-01
This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of how nurses respond to patients and further develop nursing theories. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions with implications for nursing care and patient outcomes.
Quantifying social influence in an online cultural market.
Krumme, Coco; Cebrian, Manuel; Pickard, Galen; Pentland, Sandy
2012-01-01
We revisit experimental data from an online cultural market in which 14,000 users interact to download songs, and develop a simple model that can explain seemingly complex outcomes. Our results suggest that individual behavior is characterized by a two-step process--the decision to sample and the decision to download a song. Contrary to conventional wisdom, social influence is material to the first step only. The model also identifies the role of placement in mediating social signals, and suggests that in this market with anonymous feedback cues, social influence serves an informational rather than normative role.
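A toy simulation of the two-step account: social signals (page placement, prior downloads) enter only the sampling probability, while the download decision depends on song quality alone. All coefficients are invented for illustration.

```python
import random

def download_rate(quality, rank, prior_downloads, trials=20_000):
    downloads = 0
    for _ in range(trials):
        # Step 1: decide to sample -- the only step where social influence acts.
        p_sample = min(1.0, 0.05 + 0.3 / rank + 0.005 * prior_downloads)
        # Step 2: decide to download -- driven by the song itself.
        if random.random() < p_sample and random.random() < quality:
            downloads += 1
    return downloads / trials

print(download_rate(quality=0.6, rank=1, prior_downloads=40))   # salient song
print(download_rate(quality=0.6, rank=30, prior_downloads=0))   # buried song
```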
Ten steps to successful poster presentation.
Hardicre, Jayne; Devitt, Patric; Coad, Jane
Receiving a letter confirming that your poster has been accepted for presentation at a conference can evoke mixed emotions. Joy, panic, fear and dread are among the many possible emotions, and this is not exclusive to first-time presenters. Developing an effective poster presentation is a skill that you can learn, and it can provide a rewarding way to present your work in a manner less intimidating than oral presentation (Shelledy, 2004). The key to successful poster presentation is meticulous, timely, well-informed preparation. This article outlines ten steps to help guide you through the process to maximize your success.
Reflectance of vegetation, soil, and water
NASA Technical Reports Server (NTRS)
Wiegand, C. L. (Principal Investigator)
1973-01-01
There are no author-identified significant results in this report. This report deals with the selection of the best channels from the 24-channel aircraft data to represent crop and soil conditions. A three-step procedure has been developed that involves using univariate statistics and an F-ratio test to indicate the best 14 channels. From the 14, the 10 best channels are selected by a multivariate stochastic process. The third step involves the pattern recognition procedures developed in the data analysis plan. Indications are that the procedures in use are satisfactory and will extract the desired information from the data.
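The univariate first step can be sketched with scikit-learn's F-test feature selection on synthetic stand-in data; the multivariate second step and the pattern-recognition third step are not shown.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic stand-in for 24-channel aircraft data: 200 samples, 3 classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 24))
y = rng.integers(0, 3, size=200)

# Step 1: univariate F-ratio test keeps the best 14 of 24 channels.
selector = SelectKBest(f_classif, k=14).fit(X, y)
print("best 14 channels:", np.flatnonzero(selector.get_support()))
```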
Bao, X Y; Huang, W J; Zhang, K; Jin, M; Li, Y; Niu, C Z
2018-04-18
There is a huge amount of diagnostic and treatment information in electronic medical records (EMRs), a concrete manifestation of clinicians' actual diagnosis and treatment details. Many episodes in EMRs, such as chief complaints, present illness, past history, differential diagnoses, diagnostic imaging and surgical records, reflect details of the clinical process and are written as natural-language Chinese narrative text. How to extract effective information from these Chinese narrative text data and organize it in tabular form for medical research analysis, enabling the practical use of real-world clinical data, is a difficult problem in Chinese medical data processing. Based on the EMR narrative text data of a tertiary hospital in China, a method for learning customized information extraction rules and performing rule-based information extraction is proposed. The overall method consists of three steps: (1) Step 1, a random sample of 600 records (including the history of present illness, past history, personal history, family history, etc.) was extracted from the electronic medical record data as a raw corpus. Using our Chinese clinical narrative text annotation platform, trained clinicians and nurses marked the tokens and phrases in the corpus to be extracted (with a history of diabetes as the example). (2) Step 2, based on the annotated clinical text corpus, extraction templates were first summarized and induced. These templates were then rewritten as extraction rules using Perl regular expressions. Using these extraction rules as the basic knowledge base, we developed Perl extraction packages for extracting data from the EMR text. Finally, the extracted data items were organized in tabular format for later use in clinical research or hospital surveillance. (3) In the final step, the proposed method was evaluated and validated within the National Clinical Service Data Integration Platform, and the extraction results were checked by combined manual and automated verification, demonstrating the effectiveness of the method. For all patients with diabetes as a diagnosed disease in the Department of Endocrinology of the hospital, 1 436 patients were discharged in 2015 in total, and the extraction of diabetes history from their medical records achieved a recall of 87.6%, a precision of 99.5%, and an F-score of 0.93. For a 10% sample (1 223 patients in total) of patients with diabetes discharged by August 2017 in the same department, the extraction of diabetes history achieved a recall of 89.2%, a precision of 99.2%, and an F-score of 0.94. This study mainly adopts a combination of natural language processing and rule-based information extraction, and designs and implements an algorithm for extracting customized information from unstructured Chinese electronic medical record text. It achieves better results than existing work.
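A hypothetical Python analogue of the rule-based step: a small knowledge base of regular expressions (here one invented rule for a diabetes-history mention) applied to narrative text and flattened into a tabular row. The actual study used Perl regular expressions induced from an annotated corpus.

```python
import re

# One invented extraction rule; the real rule base was induced from 600
# annotated records and written as Perl regular expressions.
RULES = {
    "diabetes_history": re.compile(r"糖尿病(病史|史)?(\d+)?(年|月)?"),
}

def extract_row(narrative: str) -> dict:
    """Apply every rule and return a tabular row of matched spans."""
    return {field: (m.group(0) if (m := pat.search(narrative)) else None)
            for field, pat in RULES.items()}

print(extract_row("患者糖尿病史10年，长期口服二甲双胍。"))
# {'diabetes_history': '糖尿病史10年'}
```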
Systems Maintenance Automated Repair Tasks (SMART)
NASA Technical Reports Server (NTRS)
Schuh, Joseph; Mitchell, Brent; Locklear, Louis; Belson, Martin A.; Al-Shihabi, Mary Jo Y.; King, Nadean; Norena, Elkin; Hardin, Derek
2010-01-01
SMART is a uniform automated discrepancy analysis and repair-authoring platform that improves technical accuracy and timely delivery of repair procedures for a given discrepancy (see figure a). SMART will minimize data errors, create uniform repair processes, and enhance the existing knowledge base of engineering repair processes. This innovation is the first tool developed that links the hardware specification requirements with the actual repair methods, sequences, and required equipment. SMART is flexibly designed to be usable by multiple engineering groups requiring decision analysis, and by any work authorization and disposition platform (see figure b). The organizational logic creates the link between specification requirements of the hardware and the specific procedures required to repair discrepancies. The first segment in the SMART process uses a decision analysis tree to define all the permutations of component/subcomponent/discrepancy/repair on the hardware. The second segment uses a repair matrix to define the steps and sequences for any repair defined in the decision tree. This segment also allows for the selection of specific steps from multivariable steps. SMART will also be able to interface with outside databases and to store information from them to be inserted into the repair-procedure document. Some of the steps will be identified as optional, and would only be used based on the location and the current configuration of the hardware. The output from this analysis would be sent to a work authoring system in the form of a predefined sequence of steps containing required actions, tools, parts, materials, certifications, and specific requirements controlling quality, functional requirements, and limitations.
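The two segments lend themselves to simple data structures: a decision tree keyed by component/subcomponent/discrepancy permutations, and a repair matrix of ordered, optionally conditional steps. Everything below is a hypothetical illustration, not SMART's actual schema.

```python
# Segment 1: decision analysis tree over component/subcomponent/discrepancy.
decision_tree = {
    ("panel", "rivet", "crack"): "repair_A",
    ("panel", "rivet", "corrosion"): "repair_B",
}

# Segment 2: repair matrix -- ordered steps, some optional by configuration.
repair_matrix = {
    "repair_A": [
        {"action": "prepare surface", "optional": False},
        {"action": "install doubler per spec", "optional": False},
        {"action": "apply protective coating", "optional": True},
    ],
}

def author_procedure(component, sub, discrepancy, config):
    repair = decision_tree[(component, sub, discrepancy)]
    return [s["action"] for s in repair_matrix[repair]
            if not s["optional"] or config.get("coating_required", False)]

print(author_procedure("panel", "rivet", "crack", {"coating_required": True}))
```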
NASA Astrophysics Data System (ADS)
Purwoko, Saad, Noor Shah; Tajudin, Nor'ain Mohd
2017-05-01
This study aims to: i) develop problem-solving questions on Linear Equations Systems of Two Variables (LESTV) based on the levels of the IPT Model; ii) determine students' levels of information-processing skill in solving LESTV problems; iii) describe how students process information while solving LESTV problems; and iv) explain students' cognitive processes in solving LESTV problems. The study involves three phases: i) development of LESTV problem questions based on the Tessmer Model; ii) a quantitative survey analyzing students' levels of information-processing skill; and iii) a qualitative case study analyzing students' cognitive processes. The population of the study was 545 eighth-grade students, represented by a sample of 170 students from five junior high schools in the Hilir Barat Zone, Palembang (Indonesia), chosen using cluster sampling. Fifteen of these students were drawn as a sample for the interview sessions, which continued until information saturation was reached. The data were collected using the LESTV problem-solving test and the interview protocol. The quantitative data were analyzed using descriptive statistics, while the qualitative data were analyzed using content analysis. The findings indicated that students' cognitive processing reached only the step of identifying external sources and fluently executing algorithms in short-term memory. Only 15.29% of students could retrieve type A information and 5.88% could retrieve type B information from long-term memory. The implication is that the developed LESTV problems validated the IPT Model for assessing students at different levels of the hierarchy.
Thermal sensors to control polymer forming. Challenge and solutions
NASA Astrophysics Data System (ADS)
Lemeunier, F.; Boyard, N.; Sarda, A.; Plot, C.; Lefèvre, N.; Petit, I.; Colomines, G.; Allanic, N.; Bailleul, J. L.
2017-10-01
Many thermal sensors have been used for many years to better understand and control material forming processes, especially polymer processing. Due to technical constraints (high pressure, sealing, sensor dimensions…), the thermal measurement is often performed in the tool or close to its surface; it therefore gives only partial and disturbed information. Reliable information about the heat flux exchanged between the tool and the material during the process would be very helpful for improving process control and fostering the development of new materials. In this work, we present several sensors developed in labs to study the molding steps of forming processes. The analysis of the thermal measurements obtained (temperature, heat flux) shows the sensitivity threshold that thermal sensors require to detect the rate of thermal reaction on-line. Based on these data, we present new sensor designs which have been patented.
Kinetics in the real world: linking molecules, processes, and systems.
Kohse-Höinghaus, Katharina; Troe, Jürgen; Grabow, Jens-Uwe; Olzmann, Matthias; Friedrichs, Gernot; Hungenberg, Klaus-Dieter
2018-04-25
Unravelling elementary steps, reaction pathways, and kinetic mechanisms is key to understanding the behaviour of many real-world chemical systems that span from the troposphere or even interstellar media to engines and process reactors. Recent work in chemical kinetics provides detailed information on the reactive changes occurring in chemical systems, often on the atomic or molecular scale. The optimisation of practical processes, for instance in combustion, catalysis, battery technology, polymerisation, and nanoparticle production, can profit from a sound knowledge of the underlying fundamental chemical kinetics. Reaction mechanisms can combine information gained from theory and experiments to enable the predictive simulation and optimisation of the crucial process variables and influences on the system's behaviour that may be exploited for both monitoring and control. Chemical kinetics, as one of the pillars of Physical Chemistry, thus contributes importantly to understanding and describing natural environments and technical processes and is becoming increasingly relevant for interactions in and with the real world.
Cutting, Elizabeth M.; Overby, Casey L.; Banchero, Meghan; Pollin, Toni; Kelemen, Mark; Shuldiner, Alan R.; Beitelshees, Amber L.
2015-01-01
Delivering genetic test results to clinicians is a complex process. It involves many actors and multiple steps, requiring all of these to work together in order to create an optimal course of treatment for the patient. We used information gained from focus groups in order to illustrate the current process of delivering genetic test results to clinicians. We propose a business process model and notation (BPMN) representation of this process for a Translational Pharmacogenomics Project being implemented at the University of Maryland Medical Center, so that personalized medicine program implementers can identify areas to improve genetic testing processes. We found that the current process could be improved to reduce input errors, better inform and notify clinicians about the implications of certain genetic tests, and make results more easily understood. We demonstrate our use of BPMN to improve this important clinical process for CYP2C19 genetic testing in patients undergoing invasive treatment of coronary heart disease. PMID:26958179
Parametric Grid Information in the DOE Knowledge Base: Data Preparation, Storage, and Access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hipp, James R.; Moore, Susan G.; Myers, Stephen C.
The parametric grid capability of the Knowledge Base provides an efficient, robust way to store and access interpolatable information which is needed to monitor the Comprehensive Nuclear Test Ban Treaty. To meet both the accuracy and performance requirements of operational monitoring systems, we use a new approach which combines the error estimation of kriging with the speed and robustness of Natural Neighbor Interpolation (NNI). The method involves three basic steps: data preparation (DP), data storage (DS), and data access (DA). The goal of data preparation is to process a set of raw data points to produce a sufficient basis for accurate NNI of value and error estimates in the Data Access step. This basis includes a set of nodes and their connectedness, collectively known as a tessellation, and the corresponding values and errors that map to each node, which we call surfaces. In many cases, the raw data point distribution is not sufficiently dense to guarantee accurate error estimates from the NNI, so the original data set must be densified using a newly developed interpolation technique known as Modified Bayesian Kriging. Once appropriate kriging parameters have been determined by variogram analysis, the optimum basis for NNI is determined in a process we call mesh refinement, which involves iterative kriging, new node insertion, and Delaunay triangle smoothing. The process terminates when an NNI basis has been calculated which will fit the kriged values within a specified tolerance. In the data storage step, the tessellations and surfaces are stored in the Knowledge Base, currently in a binary flatfile format but perhaps in the future in a spatially-indexed database. Finally, in the data access step, a client application makes a request for an interpolated value, which triggers a data fetch from the Knowledge Base through the libKBI interface, a walking triangle search for the containing triangle, and finally the NNI interpolation.
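A rough sketch of the tessellation-plus-surfaces idea using SciPy. SciPy does not ship Natural Neighbor Interpolation, so barycentric linear interpolation over a Delaunay tessellation stands in for NNI here, and the surface values are synthetic.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

# Data preparation: scattered nodes plus a synthetic value surface.
pts = np.random.default_rng(1).uniform(0.0, 10.0, size=(50, 2))
values = np.sin(pts[:, 0]) + 0.1 * pts[:, 1]

# Data storage: the tessellation and its node-mapped surface.
tess = Delaunay(pts)

# Data access: locate the containing triangle and interpolate.
interp = LinearNDInterpolator(tess, values)
print(interp(5.0, 5.0))
```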
NASA Technical Reports Server (NTRS)
Duxbury, J. H.
1983-01-01
JPL's Scientific Data Analysis System (SDAS), which will process IRAS data and produce a catalogue of perhaps a million infrared sources in the sky, as well as other information for astronomical records, is described. The purposes of SDAS are discussed, and the major SDAS processors are shown in block diagram form. The catalogue processing is addressed, mentioning the basic processing steps which will be applied to raw detector data. Signal reconstruction and conversion to astrophysical units, source detection, source confirmation, data management, and survey data products are considered in detail.
GUILD: GUidance for Information about Linking Data sets†
Gilbert, Ruth; Lafferty, Rosemary; Hagger-Johnson, Gareth; Zhang, Li-Chun; Smith, Peter; Dibben, Chris; Goldstein, Harvey
2018-01-01
Record linkage of administrative and survey data is increasingly used to generate evidence to inform policy and services. Although a powerful and efficient way of generating new information from existing data sets, errors related to data processing before, during and after linkage can bias results. However, researchers and users of linked data rarely have access to information that can be used to assess these biases or take them into account in analyses. As linked administrative data are increasingly used to provide evidence to guide policy and services, linkage error, which disproportionately affects disadvantaged groups, can undermine evidence for public health. We convened a group of researchers and experts from government data providers to develop guidance about the information that needs to be made available about the data linkage process, by data providers, data linkers, analysts and the researchers who write reports. The guidance goes beyond recommendations for information to be included in research reports. Our aim is to raise awareness of information that may be required at each step of the linkage pathway to improve the transparency, reproducibility, and accuracy of linkage processes, and the validity of analyses and interpretation of results. PMID:28369581
Godson, Richard H.
1974-01-01
GEOPAC consists of a series of subroutines primarily intended to process potential-field geophysical data, but other types of data can also be used with the program. The package contains routines to reduce, store, process and display information in two-dimensional or three-dimensional form. Input and output formats are standardized, and temporary disk storage permits data sets to be processed by several subroutines in one job step. The subroutines are link-edited in an overlay mode to form one program, and they can be executed by submitting a card containing the subroutine name in the input stream.
Metal- matrix composite processing technologies for aircraft engine applications
NASA Astrophysics Data System (ADS)
Pank, D. R.; Jackson, J. J.
1993-06-01
Titanium metal-matrix composites (MMC) are prime candidate materials for aerospace applications because of their excellent high-temperature longitudinal strength and stiffness and low density compared with nickel- and steel-base materials. This article examines the steps GE Aircraft Engines (GEAE) has taken to develop an induction plasma deposition (IPD) processing method for the fabrication of Ti6242/SiC MMC material. Information regarding process methodology, microstructures, and mechanical properties of consolidated MMC structures is presented. The work presented was funded under the GE Aircraft Engines IR&D program.
Horan, Thomas A; Daniels, Susan M; Feldman, Sue S
2009-07-01
The disability community could benefit significantly from the widespread adoption of health information technology, in particular from its ability to streamline and accelerate processing of the estimated 3 million disability benefits applications filed with the Social Security Administration each year. Disability determination is an inefficient, largely paper-based process requiring large volumes of clinical data compiled from multiple provider sources. That, coupled with a lack of transparency within the process, adds unnecessary delays and expense. The objective of this paper is to outline the case for how personal health records, particularly those populated with information from provider-held electronic health records and payer claims data, offer a means to achieve financial savings from shortened disability determination processes, as well as a tool for disability health self-management and care coordination. Drawing from research and policy forums and testimony before the American Health Information Community, the importance of including the disability community as the nation moves forward with health information technology initiatives is explored. Our research suggests that systemwide improvements such as the Nationwide Health Information Network and other such health information technology initiatives could be used to bring benefits to the disability community. The time has come to use health information technology initiatives so that federal policy makers can take steps to reduce the inefficiencies in the Social Security Administration disability determination process while improving the program's value to those who need it the most.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soliman, A; Safigholi, H; Sunnybrook Health Sciences Center, Toronto, ON
Purpose: To propose a new method that provides a positive contrast visualization of the prostate brachytherapy seeds using the phase information from MR images. Additionally, the feasibility of using the processed phase information to distinguish seeds from calcifications is explored. Methods: A gel phantom was constructed using 2% agar dissolved in 1 L of distilled water. Contrast agents were added to adjust the relaxation times. Four iodine-125 (Eckert & Ziegler SML86999) dummy seeds were placed at different orientations with respect to the main magnetic field (B0). Calcifications were obtained from a sheep femur cortical bone due to its close similarity to human bone tissue composition. Five samples of calcifications were shaped into different dimensions with lengths ranging between 1.2 – 6.1 mm. MR imaging was performed on a 3T Philips Achieva using an 8-channel head coil. Eight images were acquired at eight echo-times using a multi-gradient echo sequence. Spatial resolution was 0.7 × 0.7 × 2 mm, TR/TE/dTE = 20.0/2.3/2.3 ms and BW = 541 Hz/pixel. Complex images were acquired and fed into a two-step processing pipeline: the first step includes phase unwrapping and background phase removal using a Laplacian operator (Wei et al. 2013). The second step applies a specific phase mask on the resulting tissue phase from the first step to provide the desired positive contrast of the seeds and to, potentially, differentiate them from the calcifications. Results: The phase processing was performed in less than 30 seconds. The proposed method successfully resulted in a positive contrast of the brachytherapy seeds. Additionally, the final processed phase image showed a difference between the appearance of seeds and calcifications. However, the shape of the seeds was slightly distorted compared to the original dimensions. Conclusion: It is feasible to provide a positive contrast of the seeds from MR images using Laplacian operator-based phase processing.
Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn
2009-01-01
The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented. PMID:18971242
Meadows, Anthony; Wimpenny, Katherine
2017-07-01
Although clinical improvisation continues to be an important focus of music therapy research and practice, less attention has been given to integrating qualitative research in this area. As a result, this knowledge base tends to be contained within specific areas of practice rather than integrated across practices and approaches. This qualitative research synthesis profiles, integrates, and re-presents qualitative research focused on the ways music therapists and clients engage in, and make meaning from, clinical improvisation. Further, as a conduit for broadening dialogues, opening up this landscape fully, and sharing our response to the analysis and interpretation process, we present an arts-informed re-presentation of this synthesis. Following an eight-step methodological sequence, 13 qualitative studies were synthesized. This included reciprocal and refutational processes associated with synthesizing the primary studies, and additional steps associated with an arts-informed representation. Three themes, professional artistry, performing self, and meaning-making, are presented. Each theme is explored and exemplified through the selected articles, and discussed within a larger theoretical framework. An artistic re-presentation of the data is also presented. Music therapists use complex frameworks through which to engage clients in, and make meaning from, improvisational experiences. Artistic representation of the findings offers an added dimension to the synthesis process, challenging our understanding of representation, and thereby advancing synthesis methodology. © the American Music Therapy Association 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
NASA Astrophysics Data System (ADS)
Abramov, G. V.; Emeljanov, A. E.; Ivashin, A. L.
Theoretical bases for modeling a digital control system with information transfer via a multiple-access channel and a regular quantization cycle are presented. The theory of dynamic systems with random changes of structure, including elements of the theory of Markov random processes, is used for the mathematical description of a network control system. The characteristics of such control systems are derived. Experimental research on these control systems is carried out.
High Resolution Soil Water from Regional Databases and Satellite Images
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Smelyanskly, Vadim N.; Coughlin, Joseph; Dungan, Jennifer; Clancy, Daniel (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on the ways in which plant growth can be inferred from satellite data and then used to infer soil water. There are several steps in this process, the first of which is the acquisition of data from satellite observations and relevant information databases such as the State Soil Geographic Database (STATSGO). Probabilistic analysis and inversion with Bayes' theorem then reveal the sources of uncertainty. The Markov chain Monte Carlo method is also used.
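A minimal Metropolis-Hastings sketch of the MCMC machinery mentioned above, sampling one soil-water parameter under an invented Gaussian likelihood; the target density is illustrative only.

```python
import math, random

def log_post(theta, obs=0.3, sigma=0.1):
    # Invented posterior: flat prior on [0, 1] times a Gaussian likelihood.
    if not 0.0 <= theta <= 1.0:
        return -math.inf
    return -0.5 * ((obs - theta) / sigma) ** 2

theta, samples = 0.5, []
for _ in range(10_000):
    proposal = theta + random.gauss(0.0, 0.05)
    accept_p = math.exp(min(0.0, log_post(proposal) - log_post(theta)))
    if random.random() < accept_p:
        theta = proposal
    samples.append(theta)

print("posterior mean ~", sum(samples) / len(samples))  # close to 0.3
```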
An automated workflow for parallel processing of large multiview SPIM recordings
Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel
2016-01-01
Summary: Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated, and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585
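Because the time points are independent, the parallelization pattern is simple; this local process-pool sketch illustrates it, whereas the actual pipeline lets snakemake dispatch the per-time-point jobs to a cluster.

```python
from multiprocessing import Pool

def process_timepoint(t):
    # Placeholder for per-time-point registration/fusion work.
    return f"time point {t} processed"

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        for message in pool.map(process_timepoint, range(12)):
            print(message)
```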
Measuring Diagnoses: ICD Code Accuracy
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-01-01
Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999
Using Geographic Information Systems for Exposure Assessment in Environmental Epidemiology Studies
Nuckols, John R.; Ward, Mary H.; Jarup, Lars
2004-01-01
Geographic information systems (GIS) are being used with increasing frequency in environmental epidemiology studies. Reported applications include locating the study population by geocoding addresses (assigning mapping coordinates), using proximity analysis of contaminant source as a surrogate for exposure, and integrating environmental monitoring data into the analysis of the health outcomes. Although most of these studies have been ecologic in design, some have used GIS in estimating environmental levels of a contaminant at the individual level and to design exposure metrics for use in epidemiologic studies. In this article we discuss fundamentals of three scientific disciplines instrumental to using GIS in exposure assessment for epidemiologic studies: geospatial science, environmental science, and epidemiology. We also explore how a GIS can be used to accomplish several steps in the exposure assessment process. These steps include defining the study population, identifying source and potential routes of exposure, estimating environmental levels of target contaminants, and estimating personal exposures. We present and discuss examples for the first three steps. We discuss potential use of GIS and global positioning systems (GPS) in the last step. On the basis of our findings, we conclude that the use of GIS in exposure assessment for environmental epidemiology studies is not only feasible but can enhance the understanding of the association between contaminants in our environment and disease. PMID:15198921
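A minimal sketch of the proximity-analysis surrogate: great-circle distance from a geocoded residence to a contaminant source, turned into a crude inverse-distance exposure weight. Coordinates and weighting are invented for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two geocoded points (Earth radius 6371 km).
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

residence, source = (40.010, -105.270), (40.050, -105.200)
d = haversine_km(*residence, *source)
print(f"distance {d:.2f} km, inverse-distance weight {1.0 / max(d, 0.1):.3f}")
```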
Neural correlates of phonetic convergence and speech imitation
Garnier, Maëva; Lamalle, Laurent; Sato, Marc
2013-01-01
Speakers unconsciously tend to mimic their interlocutor's speech during communicative interaction. This study aims at examining the neural correlates of phonetic convergence and deliberate imitation, in order to explore whether imitation of phonetic features, deliberate or unconscious, might reflect a sensory-motor recalibration process. Sixteen participants listened to vowels with pitch varying around the average pitch of their own voice, and then produced the identified vowels, while their speech was recorded and their brain activity was imaged using fMRI. Three degrees and types of imitation were compared (unconscious, deliberate, and inhibited) using a go-nogo paradigm, which enabled the comparison of brain activations during the whole imitation process, its active perception step, and its production. Speakers followed the pitch of voices they were exposed to, even unconsciously, without being instructed to do so. After being informed about this phenomenon, 14 participants were able to inhibit it, at least partially. The results of whole brain and ROI analyses support the fact that both deliberate and unconscious imitations are based on similar neural mechanisms and networks, involving regions of the dorsal stream, during both perception and production steps of the imitation process. While no significant difference in brain activation was found between unconscious and deliberate imitations, the degree of imitation appears to be determined by processes occurring during the perception step. Four regions of the dorsal stream (bilateral auditory cortex, bilateral supramarginal gyrus (SMG), and left Wernicke's area) indeed showed activity that correlated significantly with the degree of imitation during the perception step. PMID:24062704
Zamanzadeh, Vahid; Ghahramanian, Akram; Rassouli, Maryam; Abbaszadeh, Abbas; Alavi-Majd, Hamid; Nikanfar, Ali-Reza
2015-01-01
Introduction: The importance of content validity in instrument psychometrics and its relevance to reliability have made it an essential step in instrument development. This article attempts to give an overview of the content validity process and to explain the complexity of this process by introducing an example. Methods: We carried out a methodological study to examine the content validity of a patient-centered communication instrument through a two-step process (development and judgment). The first step comprised domain determination, sampling (item generation), and instrument formation; in the second step, the content validity ratio, content validity index, and modified kappa statistic were computed. Suggestions of the expert panel and item impact scores were used to examine the instrument's face validity. Results: From a set of 188 items, the content validity process identified seven dimensions: trust building (eight items), informational support (seven items), emotional support (five items), problem solving (seven items), patient activation (10 items), intimacy/friendship (six items), and spirituality strengthening (14 items). The content validity study revealed that this instrument enjoys an appropriate level of content validity. The overall content validity index of the instrument using the universal agreement approach was low; however, the instrument can be advocated with respect to the high number of content experts, which makes consensus difficult, and the high value of the S-CVI with the average approach, which was equal to 0.93. Conclusion: This article illustrates acceptable quantitative indices for the content validity of a new instrument and outlines their use during the design and psychometric evaluation of a patient-centered communication measuring instrument. PMID:26161370
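The indices named above follow standard formulas (Lawshe's content validity ratio; the item-level CVI and its chance-corrected modified kappa as popularized by Polit and Beck). A minimal sketch, using a hypothetical 10-expert panel rather than the study's data:

```python
from math import comb

def cvr(n_essential: int, n_experts: int) -> float:
    """Lawshe's content validity ratio for one item."""
    half = n_experts / 2
    return (n_essential - half) / half

def i_cvi(ratings, cutoff=3):
    """Item-level CVI: share of experts rating relevance >= cutoff on a 4-point scale."""
    return sum(1 for r in ratings if r >= cutoff) / len(ratings)

def modified_kappa(icvi, n_experts, n_agree):
    """I-CVI adjusted for the probability of chance agreement."""
    pc = comb(n_experts, n_agree) * 0.5 ** n_experts
    return (icvi - pc) / (1 - pc)

# Hypothetical relevance ratings from a 10-expert panel for three items.
items = {"trust_1": [4, 4, 3, 4, 4, 3, 4, 4, 4, 4],
         "info_1":  [4, 3, 3, 4, 2, 4, 4, 3, 4, 4],
         "emo_1":   [2, 3, 4, 2, 3, 4, 3, 2, 4, 3]}
icvis = {item: i_cvi(r) for item, r in items.items()}
s_cvi_ave = sum(icvis.values()) / len(icvis)                   # average approach
s_cvi_ua = sum(v == 1.0 for v in icvis.values()) / len(icvis)  # universal agreement
print(icvis)
print("S-CVI/Ave =", round(s_cvi_ave, 2), " S-CVI/UA =", round(s_cvi_ua, 2))
print("CVR(8 of 10 essential) =", cvr(8, 10),
      " kappa* for info_1 =", round(modified_kappa(icvis["info_1"], 10, 9), 2))
```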
Semi-autonomous remote sensing time series generation tool
NASA Astrophysics Data System (ADS)
Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher
2017-10-01
High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Due to limitations of satellite architecture and frequent cloud cover, the availability of daily high-spatial-resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework for a Geographic Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. First, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created: one to perform all the pre-processing steps on various satellite data, and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or they can be combined to perform both processes in one go. This tool can handle most of the known geodata formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. The tool is developed as a common platform with an interface that provides many functions to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.
Jimenez, Paulino; Bregenzer, Anita
2018-02-23
Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, the intervention development process, and evaluation. To support a successful implementation of eHealth tools in the whole WHP process, we introduce a concept of WHP (the life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. We developed the life cycle model of WHP based on the World Health Organization (WHO) model of the healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can provide support by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility to give automatic feedback about health parameters, create incentive systems, or bring together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the workplace are needed to secure quality and reach sustainable results. ©Paulino Jimenez, Anita Bregenzer. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 23.02.2018.
Kasper, Jürgen; Köpke, Sascha; Mühlhauser, Ingrid; Heesen, Christoph
2006-07-01
This study analyses the comprehension and emotional responses of people suffering from multiple sclerosis when provided with an evidence-based information module. It is a core module of a comprehensive decision aid about immunotherapy. The core module is designed to enable patients to process scientific uncertainty without adverse effects. It considers existing standards for risk communication and presentation of data. Using a mailing approach we investigated 169 patients with differing courses of disease in a before-after design. Items addressed competence in processing relative and absolute risk information and patients' emotional response to the tool, comprising grade of familiarity with the information, understanding, relevance, emotional arousal, and certainty. Overall, numeracy improved (p < 0.001), although 99 of 169 patients did not complete the numeracy task correctly. Understanding depended on the relevance related to the course of disease. A moderate level of uncertainty was induced. No adverse emotional responses were observed, either in those who comprehended the information or in those who did not develop numeracy skills. In conclusion, the tool supports people suffering from multiple sclerosis in processing evidence-based medical information and scientific uncertainty without burdening them emotionally. This study is an example of the documentation of an important step in the development process of a complex intervention.
NASA Technical Reports Server (NTRS)
Kempler, Steven; Teng, William; Acker, James; Belvedere, Deborah; Liu, Zhong; Leptoukh, Gregory
2010-01-01
In support of the NASA Energy and Water Cycle Study (NEWS), the Collaborative Energy and Water Cycle Information Services (CEWIS), sponsored by NEWS Program Manager Jared Entin, was initiated to develop an evolving set of community-based data and information services that would facilitate users to locate, access, and bring together multiple distributed heterogeneous energy and water cycle datasets. The CEWIS workshop, June 15-16, 2010, at NASA/GSFC, was the initial step of the process, starting with identifying and scoping the issues, as defined by the community.
NASA Astrophysics Data System (ADS)
Hubbard, Stephen; Kostic, Svetlana; Englert, Rebecca; Coutts, Daniel; Covault, Jacob
2017-04-01
Recent bathymetric observations of fjord prodeltas in British Columbia, Canada, reveal evidence for multi-phase channel erosion and deposition. These processes are interpreted to be related to the upstream migration of upper-flow-regime bedforms, namely cyclic steps. We integrate data from high-resolution bathymetric surveys and monitoring to inform morphodynamic numerical models of turbidity currents and associated bedforms in the Squamish prodelta. These models are applied to the interpretation of upper-flow-regime bedforms, including cyclic steps, antidunes, and/or transitional bedforms, in Late Cretaceous submarine conduit strata of the Nanaimo Group at Gabriola Island, British Columbia. In the Squamish prodelta, as bedforms migrate, >90% of the deposits are reworked, making morphology- and facies-based recognition challenging. Sedimentary bodies are 5-30 m long, 0.5-2 m thick and <30 m wide. The Nanaimo Group comprises scour fills of similar scale composed of structureless sandstone, with laminated siltstone locally overlying basal erosion surfaces. Backset stratification is locally observed; packages of 2-4 backset beds, each of which is up to 60 cm thick and up to 15 m long (along dip), commonly share composite basal erosion surfaces. Numerous scour fills are recognized over thin stratigraphic sections (<4 m), indicating limited aggradation and preservation of the bedforms. Preliminary morphodynamic numerical modeling indicates that Squamish and Nanaimo bedforms could be transitional upper-flow-regime bedforms between cyclic steps and antidunes. It is likely that cyclic steps and related upper-flow-regime bedforms are common in strata deposited on high-gradient submarine slopes. Evidence for updip-migrating cyclic steps and related deposits informs a revised interpretation of a high-gradient setting dominated by supercritical flow, or alternating supercritical and subcritical flow, in the Nanaimo Group. Integrating direct observations, morphodynamic numerical modeling, and outcrop characterization better constrains fundamental processes that operate in deep-water depositional systems; our analyses aim to further elucidate the stratigraphy and preservation potential of upper-flow-regime bedforms.
Kielar, Ania Z; El-Maraghi, Robert H; Schweitzer, Mark E
2010-08-01
In Canada, equal access to health care is the goal, but this is associated with wait times. Wait times should be fair rather than uniform, taking into account the urgency of the problem as well as the time an individual has already waited. In November 2004, the Ontario government began addressing this issue. One of the first steps was to institute benchmarks reflecting "acceptable" wait times for CT and MRI. A public Web site was developed indicating wait times at each Local Health Integration Network. Since the start of the Wait Time Information Program, there has been a sustained reduction in wait times for Ontarians requiring CT and MRI. The average wait time for a CT scan went from 81 days in September 2005 to 47 days in September 2009. For MRI, the average wait time was reduced from 120 to 105 days. Increases in the number of patient scans have been achieved by purchasing new CT and MRI scanners, expanding hours of operation, and improving patient throughput using strategies learned from the Lean initiative, based on Toyota's manufacturing philosophy for car production. Institution-specific changes in booking procedures have been implemented. Concurrently, government guidelines have been developed to ensure accountability for monies received. The Ontario Wait Time Information Program is an innovative first step in improving fair and equitable access to publicly funded imaging services. There have been reductions in wait times for both CT and MRI. As various new processes are implemented, further review of each step will be necessary to determine its efficacy. Copyright 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Edge detection and localization with edge pattern analysis and inflection characterization
NASA Astrophysics Data System (ADS)
Jiang, Bo
2012-05-01
In general, edges are considered to be abrupt changes or discontinuities in the intensity distribution of a two-dimensional image signal. The accuracy of front-end edge detection methods in image processing affects the eventual success of higher-level pattern analysis downstream. To generalize edge detectors designed from a simple ideal step-function model to the real distortions found in natural images, this research proposes an edge detection algorithm based on one-dimensional edge pattern analysis, in which edges are composed of three basic patterns: ramp, impulse, and step. After mathematical analysis, general rules for edge representation based upon the classification of edge types into these three categories (ramp, impulse, and step, or RIS) are developed to reduce detection and localization errors, especially the "double edge" effect that is one important drawback of derivative methods. However, when applying one-dimensional edge patterns to two-dimensional image processing, a new issue naturally arises: the edge detector should correctly mark inflections or junctions of edges. Research on human visual perception of objects and on information theory has pointed out that a pattern lexicon of "inflection micro-patterns" carries more information than a straight line. Research on scene perception likewise suggests that contours carrying more information are a more important factor in determining the success of scene categorization. Inflections and junctions are therefore extremely useful features, whose accurate description and reconstruction are significant in solving correspondence problems in computer vision. Consequently, aside from the adoption of edge pattern analysis, inflection and junction characterization is also utilized to extend the traditional derivative edge detection algorithm. Experiments were conducted to test these propositions about edge detection and localization accuracy. The results support the idea that these improvements are effective in enhancing the accuracy of edge detection and localization.
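A minimal sketch of the RIS idea, classifying a one-dimensional intensity profile as ramp, impulse, or step from its gradient runs; the thresholds and run-length heuristics here are illustrative choices, not the dissertation's algorithm:

```python
import numpy as np

def classify_edge(profile, grad_thresh=0.2):
    """Heuristic 1-D RIS classification: ramp, impulse, or step.

    A step shows one short run of large same-sign gradient; a ramp shows a
    long run; an impulse shows adjacent runs of opposite sign.
    """
    g = np.gradient(profile.astype(float))
    active = np.abs(g) > grad_thresh * np.abs(g).max()
    runs, start = [], None            # contiguous significant-gradient runs
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            runs.append((start, i, np.sign(g[start:i].sum())))
            start = None
    if start is not None:
        runs.append((start, len(g), np.sign(g[start:].sum())))
    if len(runs) >= 2 and runs[0][2] != runs[1][2]:
        return "impulse"
    if runs and (runs[0][1] - runs[0][0]) > 5:
        return "ramp"
    return "step" if runs else "none"

step = np.r_[np.zeros(20), np.ones(20)]
ramp = np.r_[np.zeros(15), np.linspace(0, 1, 10), np.ones(15)]
impulse = np.r_[np.zeros(19), [1.0], np.zeros(20)]
print(classify_edge(step), classify_edge(ramp), classify_edge(impulse))
```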
NASA Astrophysics Data System (ADS)
Krysta, M.; Kusmierczyk-Michulec, J.; Nikkinen, M.; Carter, J. A.
2011-12-01
In order to support its mission of monitoring compliance with the treaty banning nuclear explosions, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) operates four global networks of, respectively, seismic, infrasound, and hydroacoustic sensors and air samplers accompanied by radionuclide detectors. The role of the International Data Centre (IDC) of the CTBTO is to associate the signals detected in the monitoring networks with the physical phenomena which emitted these signals, by forming events. One of the aspects of associating detections with emitters is the problem of inferring the sources of radionuclides from the detections made at CTBTO radionuclide network stations. This task is particularly challenging because the average transport distance between a release point and detectors is large. Complex processes of turbulent diffusion are responsible for efficient mixing and consequently for decreasing the information content of detections with increasing distance from the source. The problem is generally addressed in a two-step process. In the first step, an atmospheric transport model establishes a link between the detections and the regions of possible source location. In the second step this link is inverted to infer source information from the detections. In this presentation, we will discuss enhancements of the presently used regression-based inversion algorithm to reconstruct a source of radionuclides. To this end, modern inversion algorithms accounting for prior information and appropriately regularizing an under-determined reconstruction problem will be briefly introduced. Emphasis will be on the CTBTO context and the choice of inversion methods. An illustration of the first tests will be provided using a framework of twin experiments, i.e., fictitious detections in the CTBTO radionuclide network generated with an atmospheric transport model.
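The second, inversion step can be illustrated with the simplest regularized regression. Assuming a hypothetical source-receptor sensitivity matrix H of the kind an atmospheric transport model would provide, a Tikhonov-regularized least-squares solve stabilizes the under-determined reconstruction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensitivity matrix H (detections x source cells); x is the
# unknown release field, y the vector of detections.
n_det, n_src = 12, 40
H = rng.random((n_det, n_src))
x_true = np.zeros(n_src)
x_true[17] = 5.0                                     # a single point release
y = H @ x_true + 0.01 * rng.standard_normal(n_det)   # noisy detections

# Tikhonov regularization: min ||Hx - y||^2 + lam*||x||^2, with closed form
# x = (H^T H + lam*I)^(-1) H^T y; lam trades data fit against stability.
lam = 0.1
x_hat = np.linalg.solve(H.T @ H + lam * np.eye(n_src), H.T @ y)
print("true peak cell:", int(x_true.argmax()),
      " estimated peak cell:", int(x_hat.argmax()))
```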
Low NO sub x heavy fuel combustor concept program
NASA Technical Reports Server (NTRS)
Russell, P.; Beal, G.; Hinton, B.
1981-01-01
A gas turbine technology program to improve and optimize the staged rich-lean low-NOx combustor concept is described. Subscale combustor tests were run to develop design information for optimization of the fuel preparation, rich burn, quick air quench, and lean burn steps of the combustion process. The program provides information for the design of high pressure, full scale gas turbine combustors capable of providing environmentally clean combustion of minimally processed and synthetic fuels. It is concluded that liquid fuel atomization and mixing, rich zone stoichiometry, rich zone liner cooling, rich zone residence time, and quench zone stoichiometry are important considerations in the design and scale-up of the rich-lean combustor.
Information adaptive system of NEEDS. [of NASA End to End Data System
NASA Technical Reports Server (NTRS)
Howle, W. M., Jr.; Kelly, W. L.
1979-01-01
The NASA End-to-End Data System (NEEDS) program was initiated by NASA to significantly improve the state of the art in acquisition, processing, and distribution of space-acquired data for the mid-1980s and beyond. The information adaptive system (IAS) is a program element under NEEDS Phase II which addresses sensor-specific processing on board the spacecraft. The IAS program is a logical first step toward smart sensors, and IAS developments, particularly the system components and key technology improvements, are applicable to future smart sensor efforts. The paper describes the design goals and functional elements of the IAS. In addition, the schedule for IAS development and demonstration is discussed.
Method for extracting long-equivalent wavelength interferometric information
NASA Technical Reports Server (NTRS)
Hochberg, Eric B. (Inventor)
1991-01-01
A process for extracting long-equivalent wavelength interferometric information from a two-wavelength polychromatic or achromatic interferometer. The process comprises the steps of simultaneously recording a non-linear sum of two different frequency visible light interferograms on a high resolution film and then placing the developed film in an optical train for Fourier transformation, low pass spatial filtering and inverse transformation of the film image to produce low spatial frequency fringes corresponding to a long-equivalent wavelength interferogram. The recorded non-linear sum irradiance derived from the two-wavelength interferometer is obtained by controlling the exposure so that the average interferogram irradiance is set at either the noise level threshold or the saturation level threshold of the film.
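The long equivalent wavelength produced by the low-pass filtering follows the standard two-wavelength relation Λ = λ1·λ2/|λ1 − λ2|; a one-liner with illustrative wavelengths (not those specified in the patent):

```python
def equivalent_wavelength_nm(lam1_nm: float, lam2_nm: float) -> float:
    """Equivalent (synthetic) wavelength of a two-wavelength interferogram."""
    return lam1_nm * lam2_nm / abs(lam1_nm - lam2_nm)

# e.g. a HeNe 632.8 nm line combined with a 532 nm line
print(equivalent_wavelength_nm(632.8, 532.0))  # ~3340 nm, i.e. ~3.3 um fringes
```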
A First Step Forward: Context Assessment
ERIC Educational Resources Information Center
Conner, Ross F.; Fitzpatrick, Jody L.; Rog, Debra J.
2012-01-01
In this chapter, we revisit and expand the context framework of Debra Rog, informed by three cases and by new aspects that we have identified. We then propose a way to move the framework into action, making context explicit. Based on the framework's components, we describe and illustrate a process we label context assessment (CA), which provides a…
Rep. Dreier, David [R-CA-26
2011-07-27
House - 07/28/2011 On agreeing to the resolution Agreed to by the Yeas and Nays: 238 - 186 (Roll no. 663).
Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction
ERIC Educational Resources Information Center
Zoanetti, Nathan
2010-01-01
This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…
77 FR 43606 - Preliminary Damage Assessment for Individual Assistance Operations Manual (9327.2-PR)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-25
... site at http://www.fema.gov . The proposed and final manual, all related Federal Register Notices, and... for conducting IA PDAs is to identify the impact, type, and extent of disaster damages and to... to recover. The PDA is an important first step in the disaster declaration process. The information...
Improving the role of vulnerability assessments In decision support for effective climate adaptation
Linda A. Joyce; Constance I. Millar
2014-01-01
Vulnerability assessments (VA) have been proposed as an initial step in a process to develop and implement adaptation management for climate change in forest ecosystems. Scientific understanding of the effects of climate change is an ever-accumulating knowledge base. Synthesizing information from this knowledge base in the context of our understanding of ecosystem...
ERIC Educational Resources Information Center
Glogger, Inga; Schwonke, Rolf; Holzapfel, Lars; Nuckles, Matthias; Renkl, Alexander
2012-01-01
Recently, there have been efforts to rethink assessment. Instead of informing about (relatively stable) learner characteristics, assessment should assist instruction by looking at the learning process, facilitating feedback about what students' next step in learning could be. Similarly, new forms of strategy assessment aim at capturing…
A Promising Step for Improving Career Service Delivery: Comment on Sampson et al. (2000).
ERIC Educational Resources Information Center
Jepsen, David A.
2000-01-01
Presents a positive response to Sampson et al.'s article (this issue [2000]) describing a cognitive-information processing (CIP) framework useful for improving career services. Asserts that the authors strike an appropriate tone of optimism and caution that matches the article author's own experience as a practitioner and a teacher of the CIP…
ERIC Educational Resources Information Center
Murray, Nancy; Kelder, Steve; Parcel, Guy; Orpinas, Pamela
1998-01-01
Describes development of an intervention program for Hispanic parents to reduce violence by increased monitoring of their middle school students. Program development used a five-step guided intervention mapping process. Student surveys and parent interviews provided data to inform program design. Intervention mapping ensured involvement with the…
Five Steps for Structuring Data-Informed Conversations and Action in Education. REL 2013-001
ERIC Educational Resources Information Center
Kekahio, Wendy; Baker, Myriam
2013-01-01
Using data strategically to guide decisions and actions can have a positive effect on education practices and processes. This facilitation guide shows education data teams how to move beyond simply reporting data to applying data to direct strategic action. Using guiding questions, suggested activities, and activity forms, this guide provides…
ERIC Educational Resources Information Center
Adkins, Florence E.
2015-01-01
This qualitative phenomenological research study used narrative inquiry to examine police officer perceptions of effective school responses to active shooting scenarios. Creswell's (2013) six step process for analyzing and interpreting qualitative data was used to examine the interview information. The study results support the idea that changes…
Ensuring a Bright Future for Babies: How to Advocate Effectively for Infants and Toddlers
ERIC Educational Resources Information Center
Rappaport, Debbie M.; Yarbrough, Karen
2006-01-01
Rappaport and Yarbrough describe three steps for early childhood professionals to take toward becoming effective advocates for infants and toddlers: (1) gather information about the public policy process; (2) learn to communicate effectively about the early years; and (3) build and maintain relationships with allies who can make policy decisions.…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... Bonnethead sharks. SUMMARY: The SEDAR assessment of the HMS stocks of Atlantic Sharpnose and Bonnethead sharks will consist of one workshop and a series of Webinars. See SUPPLEMENTARY INFORMATION. DATES: The... status of fish stocks in the Southeast Region. SEDAR is a multi-step process including: (1) Data...
ERIC Educational Resources Information Center
Kirk, Jamie; Jahoda, Andrew; Pert, Carol
2008-01-01
Recent research has examined the relevance of the social information processing model of aggression to individuals with intellectual disability (ID). This study investigated the "response access" and "response decision" steps of this model. Photo stories were used to compare aggressive and nonaggressive individuals' beliefs about the outcomes of…
Process, including PSA and membrane separation, for separating hydrogen from hydrocarbons
Baker, Richard W.; Lokhandwala, Kaaeid A.; He, Zhenjie; Pinnau, Ingo
2001-01-01
An improved process for separating hydrogen from hydrocarbons. The process includes a pressure swing adsorption step, a compression/cooling step and a membrane separation step. The membrane step relies on achieving a methane/hydrogen selectivity of at least about 2.5 under the conditions of the process.
NASA Astrophysics Data System (ADS)
Hottenhuis, M. H. J.; Lucasius, C. B.
1988-09-01
Quantitative information about the influence of impurities on the crystal growth process of potassium hydrogen phthalate from its aqueous solution was obtained at two levels: microscopic and macroscopic. At the microscopic level, detailed in situ observations of spiral steps at the (010) face were performed. The velocity of these steps was measured both in a "clean" and in a contaminated solution, where the influence of a number of different impurities was investigated. This resulted in a measure of the effectiveness of step retardation for each of these impurities. From the same microscopic observations it was determined how these effectiveness factors were influenced by the supersaturation σ, the saturation temperature Ts of the solution, and the concentration cimp of the impurity that was used. At the macroscopic level, ICP (inductively coupled plasma) measurements were carried out in order to determine the distribution coefficient of the same impurities. In these measurements, the influence of the impurity concentration and the supersaturation on the distribution coefficient kD was again determined.
Breast cancer mitosis detection in histopathological images with spatial feature extraction
NASA Astrophysics Data System (ADS)
Albayrak, Abdülkadir; Bilgin, Gökhan
2013-12-01
In this work, cellular mitosis detection in histopathological images has been investigated. Mitosis detection is a very expensive and time-consuming process. The development of digital imaging in pathology has enabled a reasonable and effective solution to this problem. Segmentation of digital images provides easier analysis of cell structures in histopathological data. To differentiate normal and mitotic cells in histopathological images, the feature extraction step is crucial for system accuracy. A mitotic cell has more distinctive textural dissimilarities than other normal cells. Hence, it is important to incorporate spatial information in the feature extraction or post-processing steps. As a main part of this study, the Haralick texture descriptor has been proposed with different spatial window sizes in the RGB and La*b* color spaces, so that the spatial dependencies of normal and mitotic cellular pixels can be evaluated within different pixel neighborhoods. Extracted features are compared across various sample sizes by Support Vector Machines using the k-fold cross-validation method. The presented results show that separation accuracy between mitotic and non-mitotic cellular pixels improves with increasing spatial window size.
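A sketch of the described pipeline (GLCM-based Haralick features fed to an SVM with k-fold cross-validation), using random placeholder patches instead of real histopathology, so the printed accuracies are only a wiring check; function names follow scikit-image 0.19+ naming:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def haralick_features(patch, distances=(1,), angles=(0, np.pi / 2)):
    """GLCM-based texture descriptors for one grayscale uint8 patch."""
    glcm = graycomatrix(patch, distances, angles, levels=256,
                        symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Hypothetical 32x32 patches: label 1 = mitotic, 0 = non-mitotic.
rng = np.random.default_rng(1)
patches = rng.integers(0, 256, size=(60, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 2, size=60)

X = np.array([haralick_features(p) for p in patches])
scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5)  # k-fold CV
print("fold accuracies:", np.round(scores, 2))  # ~chance on random data
```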
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-03-01
This module covers EPA's Superfund community involvement program, a set of requirements under the National Contingency Plan (NCP) designed to ensure that the public is informed about site conditions and given the opportunity to comment on the proposed remedy of a Superfund site. The NCP serves to uphold the public's right to voice opinions and express concerns about Superfund site activities. EPA must involve communities throughout the Superfund process, particularly at critical decision-making steps in the process.
NASA Technical Reports Server (NTRS)
Diorio, Kimberly A.; Voska, Ned (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define system; Identify human-machine; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.
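The "determine likelihood/effects" and "evaluate risk" steps amount to risk scoring of each identified error. A common generic FMEA formalization, shown here with hypothetical actions and 1-5 scales (the presentation's exact scales and risk formula are not given in the abstract), multiplies likelihood, severity, and detectability into a risk priority number:

```python
# Hypothetical human-action failure modes scored on 1-5 scales; in classic
# FMEA, higher detectability scores mean the error is harder to detect.
failure_modes = [
    # (action, error, likelihood, severity, detectability)
    ("open LOX valve", "valve opened out of sequence", 2, 5, 3),
    ("record pressure", "transcription error", 4, 2, 2),
]

for action, error, likelihood, severity, detect in failure_modes:
    rpn = likelihood * severity * detect  # risk priority number
    flag = "  -> mitigate" if rpn >= 20 else ""  # illustrative action threshold
    print(f"{action!r} / {error!r}: RPN={rpn}{flag}")
```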
Software for rapid prototyping in the pharmaceutical and biotechnology industries.
Kappler, Michael A
2008-05-01
The automation of drug discovery methods continues to develop, especially techniques that process information, represent workflow and facilitate decision-making. The magnitude of data and the plethora of questions in pharmaceutical and biotechnology research give rise to the need for rapid prototyping software. This review describes the advantages and disadvantages of three solutions: Competitive Workflow, Taverna and Pipeline Pilot. Each of these systems processes large amounts of data, integrates diverse systems and assists novice programmers and human experts in critical decision-making steps.
Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart
2010-03-01
MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.
Stewart, W F; Stewart, P A
1994-09-01
The strength and credibility of evidence from occupational case-control studies largely depend on the validity and precision with which the work history is reported and the exposure is assessed. We discuss the two steps which ultimately lead to an exposure decision. The first step involves the exchange between the respondent and an interviewer. The latter is usually naïve to occupations and workplace exposures and, as such, is limited to asking generic and open-ended questions about the workplace. Often, this type of information is too nonspecific to assess exposure. In the second step, an expert reviews the information reported on each occupation and decides on exposure status without contacting either the interviewer or respondent. Exposure assessment is not, therefore, integrated with data collection and, in fact, is usually not initiated until after all the interviews are completed. As such, the exposure expert does not have an opportunity to resolve questions before making the exposure decision. To improve the quality and specificity of data collected, we have developed over 40 sets of closed-ended questions (branch questions) which are specific to defined occupations. These branch questions, incorporated into a computer-assisted telephone interview, are asked if selected occupations or their synonyms are reported. Second, to link the data collection process with the assessment process, we have developed a procedure called SCORE (Subject Corrected Occupational Report) which provides the industrial hygienist with a cost-efficient method to ask questions directly of respondents. Shortly after each interview is completed, a computerized version of the work history is reviewed by the industrial hygienist, who develops questions when more information is needed. Subsequently, respondents are mailed a form listing their reported work history along with the questions. After two mailings, 73% of participants in a pilot study returned the SCORE form.
Coupling image processing and stress analysis for damage identification in a human premolar tooth.
Andreaus, U; Colloca, M; Iacoviello, D
2011-08-01
Non-carious cervical lesions are characterized by the loss of dental hard tissue at the cement-enamel junction (CEJ). Excessive stresses are therefore generated in the cervical region of the tooth that cause disruption of the bonds between the hydroxyapatite crystals, leading to crack formation and eventual loss of enamel and the underlying dentine. Damage identification was performed by image analysis techniques, which allowed changes in the teeth to be quantitatively assessed. A computerized two-step procedure was generated and applied to the first left maxillary human premolar. In the first step, dental images were digitally processed by a segmentation method in order to identify the damage. The considered morphological properties were the enamel thickness, the total enamel area, and the number of fragments into which the enamel was chipped. The information retrieved by processing the section images allowed the stress investigation to be oriented toward selected portions of the tooth. In the second step, a three-dimensional finite element model based on CT images of both the tooth and the periodontal ligament was employed to compare the changes occurring in the stress distributions in normal occlusion and malocclusion. The stress states were analyzed exclusively in the critical zones designated in the first step. The risk of failure at the CEJ and of crack initiation at the dentin-enamel junction was also estimated through the quantification of first and third principal stresses, von Mises stress, and normal and tangential stresses. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Reed-Jones, James G; Reed-Jones, Rebecca J; Hollands, Mark A
2014-04-30
The useful field of view (UFOV) is the visual area from which information is obtained at a brief glance. While studies have examined the effects of increased cognitive load on the visual field, none has specifically examined the effects of postural control or locomotor activity on the UFOV. The current study aimed to examine the effects of postural demand and locomotor activity on UFOV performance in healthy young adults. Eleven participants were tested on three modified UFOV tasks (central processing, peripheral processing, and divided attention) while seated, standing, and stepping in place. Across all postural conditions, participants showed no difference in their central or peripheral processing. However, in the divided-attention task (reporting the letter in central vision and the target location in peripheral vision amongst distracter items), a main effect of posture condition on peripheral target accuracy was found for targets at 57° of eccentricity (p=.037). The mean accuracy decreased from 80.5% (standing) to 74% (seated) to 56.3% (stepping). These findings show that postural demands do affect UFOV divided-attention performance. In particular, the size of the useful field of view significantly decreases when stepping. This finding has important implications for how the results of a UFOV test are used to evaluate the general size of the UFOV during varying activities, as the traditional seated test procedure may overestimate the size of the UFOV during locomotor activities. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
To analyse a trace or not? Evaluating the decision-making process in the criminal investigation.
Bitzer, Sonja; Ribaux, Olivier; Albertini, Nicola; Delémont, Olivier
2016-05-01
In order to broaden our knowledge and understanding of the decision steps in the criminal investigation process, we started by evaluating the decision to analyse a trace and the factors involved in this decision step. This decision step is embedded in the complete criminal investigation process, involving multiple decision and triaging steps. Considering robbery cases occurring in a geographic region during a 2-year period, we have studied the factors influencing the decision to submit biological traces, directly sampled on the scene of the robbery or on collected objects, for analysis. The factors were categorised into five knowledge dimensions: strategic, immediate, physical, criminal, and utility; decision tree analysis was then carried out. Factors in each category played a role in the decision to analyse a biological trace. Interestingly, factors involving information available prior to the analysis are of importance, such as the fact that a positive result (a profile suitable for comparison) is already available in the case, or that a suspect has been identified through traditional police work before analysis. One factor that was taken into account but was not significant is the matrix of the trace; hence, the decision to analyse a trace is not influenced by this variable. The decision to analyse a trace is thus very complex, and many of the tested variables were taken into account. The decisions are often made on a case-by-case basis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
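A toy version of the decision tree analysis, with hypothetical encodings of the five knowledge dimensions; the feature names and data below are invented for illustration, not drawn from the study:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# One row per trace; target: trace submitted for analysis (1) or not (0).
feature_names = ["suspect_identified", "profile_in_case", "on_object",
                 "seriousness", "utility"]
X = [
    [1, 0, 0, 2, 2], [0, 0, 1, 1, 1], [0, 1, 1, 2, 0],
    [1, 1, 0, 0, 0], [0, 0, 0, 2, 2], [1, 0, 1, 1, 2],
    [0, 1, 0, 0, 1], [0, 0, 1, 2, 2],
]
y = [0, 1, 0, 0, 1, 1, 0, 1]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=feature_names))  # human-readable splits
```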
Muncy, Nathan M; Hedges-Muncy, Ariana M; Kirwan, C Brock
2017-01-01
Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that these steps have an effect on the volumetric output. To date, studies have compared between and not within pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, where each participant contributed three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline five times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (MRI scanner consistency), and repeated pipeline runs (algorithmic consistency). A main effect of pipeline step was detected, and interestingly an interaction between pipeline step and ROI exists. No effect for either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing.
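The repeated-measures test for a main effect of pipeline step can be sketched with statsmodels, here on synthetic volumes rather than the OASIS data (the step names and effect sizes are placeholders):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Synthetic long-format data: hippocampal volume (mm^3) for 10 subjects,
# measured after each of 4 hypothetical pre-processing steps.
rng = np.random.default_rng(2)
steps = ["raw", "skullstrip", "normalize", "segment"]
rows = [{"subject": s, "step": st,
         "volume": 4000 + 50 * i + rng.normal(0, 30)}  # built-in step effect
        for s in range(10) for i, st in enumerate(steps)]
df = pd.DataFrame(rows)

# One-way repeated-measures ANOVA: does pipeline step shift volume estimates?
print(AnovaRM(df, depvar="volume", subject="subject", within=["step"]).fit())
```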
NASA Astrophysics Data System (ADS)
Ailianou, Artemis
New and promising treatments for coronary heart disease are enabled by vascular scaffolds made of poly(L-lactic acid) (PLLA), as demonstrated by Abbott Vascular's bioresorbable vascular scaffold. PLLA is a semicrystalline polymer whose degree of crystallinity and crystalline microstructure depend on the thermal and deformation history during processing. In turn, the semicrystalline morphology determines scaffold strength and biodegradation time. However, spatially-resolved information about the resulting material structure (crystallinity and crystal orientation) is needed to interpret in vivo observations. The first manufacturing step of the scaffold is tube expansion in a process similar to injection blow molding. Spatial uniformity of the tube microstructure is essential for the consistent production and performance of the final scaffold. For implantation into the artery, solid-state deformation below the glass transition temperature is imposed on a laser-cut subassembly to crimp it into a small diameter. Regions of localized strain during crimping are implicated in deployment behavior. To examine the semicrystalline microstructure development of the scaffold, we employed complementary techniques of scanning electron and polarized light microscopy, wide-angle X-ray scattering, and X-ray microdiffraction. These techniques enabled us to assess the microstructure at the micro and nano length scale. The results show that the expanded tube is very uniform in the azimuthal and axial directions and that radial variations are more pronounced. The crimping step dramatically changes the microstructure of the subassembly by imposing extreme elongation and compression. Spatial information on the degree and direction of chain orientation from X-ray microdiffraction data gives insight into the mechanism by which the PLLA dissipates the stresses during crimping, without fracture. Finally, analysis of the microstructure after deployment shows that it is inherited from the crimping step and contributes to the scaffold's successful implantation in vivo.
Sensation-to-cognition cortical streams in attention-deficit/hyperactivity disorder.
Carmona, Susana; Hoekzema, Elseline; Castellanos, Francisco X; García-García, David; Lage-Castellanos, Agustín; Van Dijk, Koene R A; Navas-Sánchez, Francisco J; Martínez, Kenia; Desco, Manuel; Sepulcre, Jorge
2015-07-01
We sought to determine whether functional connectivity streams that link sensory, attentional, and higher-order cognitive circuits are atypical in attention-deficit/hyperactivity disorder (ADHD). We applied a graph-theory method to the resting-state functional magnetic resonance imaging data of 120 children with ADHD and 120 age-matched typically developing children (TDC). Starting in unimodal primary cortex-visual, auditory, and somatosensory-we used stepwise functional connectivity to calculate functional connectivity paths at discrete numbers of relay stations (or link-step distances). First, we characterized the functional connectivity streams that link sensory, attentional, and higher-order cognitive circuits in TDC and found that systems do not reach the level of integration achieved by adults. Second, we searched for stepwise functional connectivity differences between children with ADHD and TDC. We found that, at the initial steps of sensory functional connectivity streams, patients display significant enhancements of connectivity degree within neighboring areas of primary cortex, while connectivity to attention-regulatory areas is reduced. Third, at subsequent link-step distances from primary sensory cortex, children with ADHD show decreased connectivity to executive processing areas and increased degree of connections to default mode regions. Fourth, in examining medication histories in children with ADHD, we found that children medicated with psychostimulants present functional connectivity streams with higher degree of connectivity to regions subserving attentional and executive processes compared to medication-naïve children. We conclude that predominance of local sensory processing and lesser influx of information to attentional and executive regions may reduce the ability to organize and control the balance between external and internal sources of information in ADHD. © 2015 Wiley Periodicals, Inc.
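Stepwise functional connectivity can be formalized as counting walks of increasing length from seed regions; one common implementation uses powers of the binarized adjacency matrix. A toy sketch follows (the study's actual matrices are thresholded resting-state correlations, not this hand-built graph):

```python
import numpy as np

def stepwise_degree(adj, seed_idx, n_steps):
    """Count walks of each length k from seed regions to every node.

    Powers of the binarized adjacency matrix give the number of
    length-k walks, a simple stand-in for stepwise connectivity.
    """
    A = (adj > 0).astype(float)
    np.fill_diagonal(A, 0)
    reach, Ak = [], np.eye(len(A))
    for _ in range(n_steps):
        Ak = Ak @ A                              # walks one link-step longer
        reach.append(Ak[seed_idx].sum(axis=0))   # walks from seeds to each node
    return np.array(reach)                       # shape: (n_steps, n_nodes)

# Toy 6-node graph; nodes 0-1 play the role of primary sensory seeds.
adj = np.array([[0,1,1,0,0,0], [1,0,1,0,0,0], [1,1,0,1,0,0],
                [0,0,1,0,1,1], [0,0,0,1,0,1], [0,0,0,1,1,0]])
print(stepwise_degree(adj, seed_idx=[0, 1], n_steps=3))
```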
Gary, Robin H.; Wilson, Zachary D.; Archuleta, Christy-Ann M.; Thompson, Florence E.; Vrabel, Joseph
2009-01-01
During 2006-09, the U.S. Geological Survey, in cooperation with the National Atlas of the United States, produced a 1:1,000,000-scale (1:1M) hydrography dataset comprising streams and waterbodies for the entire United States, including Puerto Rico and the U.S. Virgin Islands, for inclusion in the recompiled National Atlas. This report documents the methods used to select, simplify, and refine features in the 1:100,000-scale (1:100K) (1:63,360-scale in Alaska) National Hydrography Dataset to create the national 1:1M hydrography dataset. Custom tools and semi-automated processes were created to facilitate generalization of the 1:100K National Hydrography Dataset (1:63,360-scale in Alaska) to 1:1M on the basis of existing small-scale hydrography datasets. The first step in creating the new 1:1M dataset was to address feature selection and optimal data density in the streams network. Several existing methods were evaluated. The production method that was established for selecting features for inclusion in the 1:1M dataset uses a combination of the existing attributes and network in the National Hydrography Dataset and several of the concepts from the methods evaluated. The process for creating the 1:1M waterbodies dataset required a similar approach to that used for the streams dataset. Geometric simplification of features was the next step. Stream reaches and waterbodies indicated in the feature selection process were exported as new feature classes and then simplified using a geographic information system tool. The final step was refinement of the 1:1M streams and waterbodies. Refinement was done through the use of additional geographic information system tools.
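The geometric simplification step in such generalization workflows typically relies on Douglas-Peucker line simplification; a minimal shapely sketch with a hypothetical reach (the USGS production tolerances are not specified here):

```python
from shapely.geometry import LineString

# A hypothetical 1:100K stream reach (coordinates in map units).
reach = LineString([(0, 0), (1, 0.1), (2, -0.1), (3, 0.05), (4, 0), (10, 0.2)])

# Douglas-Peucker generalization: tolerance controls how much detail
# survives at the smaller target scale.
generalized = reach.simplify(tolerance=0.15, preserve_topology=True)
print(len(reach.coords), "->", len(generalized.coords), "vertices")
```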
Application of a responsive evaluation approach in medical education.
Curran, Vernon; Christopher, Jeanette; Lemire, Francine; Collins, Alice; Barrett, Brendan
2003-03-01
This paper reports on the usefulness of a responsive evaluation model in evaluating the clinical skills assessment and training (CSAT) programme at the Faculty of Medicine, Memorial University of Newfoundland, Canada. The purpose of this paper is to introduce the responsive evaluation approach, ascertain its utility, feasibility, propriety and accuracy in a medical education context, and discuss its applicability as a model for medical education programme evaluation. Robert Stake's original 12-step responsive evaluation model was modified and reduced to five steps, including: (1) stakeholder audience identification, consultation and issues exploration; (2) stakeholder concerns and issues analysis; (3) identification of evaluative standards and criteria; (4) design and implementation of evaluation methodology; and (5) data analysis and reporting. This modified responsive evaluation process was applied to the CSAT programme and a meta-evaluation was conducted to evaluate the effectiveness of the approach. The responsive evaluation approach was useful in identifying the concerns and issues of programme stakeholders, solidifying the standards and criteria for measuring the success of the CSAT programme, and gathering rich and descriptive evaluative information about educational processes. The evaluation was perceived to be human resource dependent in nature, yet was deemed to have been practical, efficient and effective in uncovering meaningful and useful information for stakeholder decision-making. Responsive evaluation is derived from the naturalistic paradigm and concentrates on examining the educational process rather than predefined outcomes of the process. Responsive evaluation results are perceived as having more relevance to stakeholder concerns and issues, and therefore more likely to be acted upon. Conducting an evaluation that is responsive to the needs of these groups will ensure that evaluative information is meaningful and more likely to be used for programme enhancement and improvement.
Contreras-López, Orlando; Moyano, Tomás C; Soto, Daniela C; Gutiérrez, Rodrigo A
2018-01-01
The rapid increase in the availability of transcriptomics data generated by RNA sequencing represents both a challenge and an opportunity for biologists without bioinformatics training. The challenge is handling, integrating, and interpreting these data sets. The opportunity is to use this information to generate testable hypothesis to understand molecular mechanisms controlling gene expression and biological processes (Fig. 1). A successful strategy to generate tractable hypotheses from transcriptomics data has been to build undirected network graphs based on patterns of gene co-expression. Many examples of new hypothesis derived from network analyses can be found in the literature, spanning different organisms including plants and specific fields such as root developmental biology.In order to make the process of constructing a gene co-expression network more accessible to biologists, here we provide step-by-step instructions using published RNA-seq experimental data obtained from a public database. Similar strategies have been used in previous studies to advance root developmental biology. This guide includes basic instructions for the operation of widely used open source platforms such as Bio-Linux, R, and Cytoscape. Even though the data we used in this example was obtained from Arabidopsis thaliana, the workflow developed in this guide can be easily adapted to work with RNA-seq data from any organism.
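The core of such a workflow (correlate, threshold, build the graph) fits in a few lines; this sketch uses toy expression values and an arbitrary |r| cutoff in place of the guide's full Bio-Linux/R/Cytoscape pipeline:

```python
import numpy as np
import networkx as nx

# Toy expression matrix: rows = genes, columns = RNA-seq samples
# (in practice, normalized counts, e.g. log-transformed TPM).
rng = np.random.default_rng(3)
genes = [f"g{i}" for i in range(6)]
expr = rng.normal(size=(6, 12))
expr[1] = expr[0] + rng.normal(0, 0.1, 12)   # make g1 co-expressed with g0

corr = np.corrcoef(expr)                     # gene-gene Pearson correlations

# Undirected co-expression graph: connect genes with |r| above a cutoff.
G = nx.Graph()
G.add_nodes_from(genes)
cutoff = 0.8
for i in range(len(genes)):
    for j in range(i + 1, len(genes)):
        if abs(corr[i, j]) >= cutoff:
            G.add_edge(genes[i], genes[j], weight=round(corr[i, j], 2))
print(G.edges(data=True))                    # expect at least (g0, g1)
```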
Tawilah, Jihane; Schlotheuber, Anne; Bateman, Massee; Davey, Tamzyn; Kusumawardani, Nunik; Myint, Theingi; Nuryetty, Mariet Tetty; Prasetyo, Sabarinah; Suparmi; Floranita, Rustini
2018-01-01
Background: Inequalities in health represent a major problem in many countries, including Indonesia. Addressing health inequality is a central component of the Sustainable Development Goals and a priority of the World Health Organization (WHO). WHO provides technical support for health inequality monitoring among its member states. Following a capacity-building workshop in the WHO South-East Asia Region in 2014, Indonesia expressed interest in incorporating health-inequality monitoring into its national health information system. Objectives: This article details the capacity-building process for national health inequality monitoring in Indonesia, discusses successes and challenges, and how this process may be adapted and implemented in other countries/settings. Methods: We outline key capacity-building activities undertaken between April 2016 and December 2017 in Indonesia and present the four key outcomes of this process. Results: The capacity-building process entailed a series of workshops, meetings, activities, and processes undertaken between April 2016 and December 2017. At each stage, a range of stakeholders with access to the relevant data and capacity for data analysis, interpretation and reporting was engaged with, under the stewardship of state agencies. Key steps to strengthening health inequality monitoring included capacity building in (1) identification of the health topics/areas of interest, (2) mapping data sources and identifying gaps, (3) conducting equity analyses using raw datasets, and (4) interpreting and reporting inequality results. As a result, Indonesia developed its first national report on the state of health inequality. A number of peer-reviewed manuscripts on various aspects of health inequality in Indonesia have also been developed. Conclusions: The capacity-building process undertaken in Indonesia is designed to be adaptable to other contexts. Capacity building for health inequality monitoring among countries is a critical step for strengthening equity-oriented national health information systems and eventually tackling health inequities. PMID:29569528
JSC Pharmacy Services for Remote Operations
NASA Technical Reports Server (NTRS)
Stoner, Paul S.; Bayuse, Tina
2005-01-01
The Johnson Space Center Pharmacy began operating in March of 2003. The pharmacy serves in two main capacities: directly providing medications and services in support of the medical clinics at the Johnson Space Center, physician travel kits for NASA flight surgeon staff, and remote operations, such as the clinics in Devon Island, Star City and Moscow; and indirectly providing medications and services for the International Space Station and Space Shuttle medical kits. Process changes that occurred and continue to evolve following the installation of the new JSC Pharmacy, and the process of stocking medications for each of the aforementioned areas, are discussed. Methods: The incorporation of pharmacy involvement to provide services for remote operations and supplying medical kits was evaluated. The first step was to review the current processes and work the JSC Pharmacy into the existing system. The second step was to provide medications to these areas. Considerations for the timeline of expiring medications for shipment are reviewed with each request. The third step was the development of a process to provide accountability for the medications. Results: The JSC Pharmacy utilizes a pharmacy management system to document all medications leaving the pharmacy. Challenges inherent to providing medications to remote areas were encountered. A process has been designed to incorporate usage into the electronic medical record upon return of the information from these remote areas. This is an evolving program and several areas have been identified for further improvement.
Adaptive multi-step Full Waveform Inversion based on Waveform Mode Decomposition
NASA Astrophysics Data System (ADS)
Hu, Yong; Han, Liguo; Xu, Zhuo; Zhang, Fengjiao; Zeng, Jingwen
2017-04-01
Full Waveform Inversion (FWI) can be used to build high-resolution velocity models, but there are still many challenges in seismic field data processing. The most difficult problem is how to recover the long-wavelength components of subsurface velocity models when seismic data lack low-frequency information and long offsets. To solve this problem, we propose to use the Waveform Mode Decomposition (WMD) method to reconstruct low-frequency information for FWI and obtain a smooth model, so that the initial-model dependence of FWI can be reduced. In this paper, we use the adjoint-state method to calculate the gradient for Waveform Mode Decomposition Full Waveform Inversion (WMDFWI). Through illustrative numerical examples, we show that the low-frequency information reconstructed by the WMD method is very reliable. WMDFWI, in combination with the adaptive multi-step inversion strategy, can obtain more faithful and accurate final inversion results. Numerical examples show that even if the initial velocity model is far from the true model and lacks low-frequency information, we can still obtain good inversion results with the WMD method. Numerical examples of anti-noise tests show that the adaptive multi-step inversion strategy for WMDFWI has a strong ability to resist Gaussian noise. The WMD method is promising for land seismic FWI, because it can reconstruct the low-frequency information, lower the dominant frequency in the adjoint source, and resist noise.
Use of Single-Cysteine Variants for Trapping Transient States in DNA Mismatch Repair.
Friedhoff, Peter; Manelyte, Laura; Giron-Monzon, Luis; Winkler, Ines; Groothuizen, Flora S; Sixma, Titia K
2017-01-01
DNA mismatch repair (MMR) is necessary to prevent incorporation of polymerase errors into the newly synthesized DNA strand, as they would be mutagenic. In humans, errors in MMR cause a predisposition to cancer, called Lynch syndrome. The MMR process is performed by a set of ATPases that transmit, validate, and couple information to identify which DNA strand requires repair. To understand the individual steps in the repair process, it is useful to be able to study these large molecular machines structurally and functionally. However, the steps and states are highly transient; therefore, the methods to capture and enrich them are essential. Here, we describe how single-cysteine variants can be used for specific cross-linking and labeling approaches that allow trapping of relevant transient states. Analysis of these defined states in functional and structural studies is instrumental to elucidate the molecular mechanism of this important DNA MMR process. © 2017 Elsevier Inc. All rights reserved.
Advanced Extraction of Spatial Information from High Resolution Satellite Data
NASA Astrophysics Data System (ADS)
Pour, T.; Burian, J.; Miřijovský, J.
2016-06-01
In this paper the authors processed five satellite images of five different Central European cities taken by five different sensors. The aim of the paper was to find methods and approaches leading to evaluation and spatial data extraction from areas of interest. For this reason, data were first pre-processed using image fusion, mosaicking, and segmentation processes. The results passed to the next step were two polygon layers: the first representing single objects and the second representing city blocks. In the second step, the polygon layers were classified and exported into the Esri shapefile format. Classification was partly hierarchical and expert-based, and partly based on the SEaTH tool for separability analysis and thresholding. Final results, along with visual previews, were attached to the original thesis. Results are evaluated visually and statistically in the last part of the paper. In the discussion, the authors describe the difficulties of working with large data volumes taken by different sensors and differing also thematically.
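SEaTH-style separability analysis rests on the Jeffries-Matusita distance between class feature distributions; a minimal sketch assuming 1-D Gaussian class models and hypothetical class statistics:

```python
import numpy as np

def jeffries_matusita(m1, s1, m2, s2):
    """J-M separability of two classes with 1-D Gaussian feature models.

    B is the Bhattacharyya distance; J = 2(1 - e^-B) ranges from 0
    (inseparable) to 2 (fully separable).
    """
    b = (0.25 * (m1 - m2) ** 2 / (s1 ** 2 + s2 ** 2)
         + 0.5 * np.log((s1 ** 2 + s2 ** 2) / (2 * s1 * s2)))
    return 2 * (1 - np.exp(-b))

# Hypothetical mean/std of one object feature (e.g. NDVI) for two classes.
print(round(jeffries_matusita(0.62, 0.05, 0.31, 0.08), 3))  # ~1.87: well separated
```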
Branching dynamics of viral information spreading.
Iribarren, José Luis; Moro, Esteban
2011-10-01
Despite its importance for the propagation of rumors and innovations, peer-to-peer collaboration, social networking, and marketing, the dynamics of information spreading is not well understood. Since the diffusion depends on heterogeneous patterns of human behavior and is driven by the participants' decisions, its propagation dynamics shows surprising properties not explained by traditional epidemic or contagion models. Here we present a detailed analysis of real viral marketing campaigns in which tracking the propagation of a controlled message allowed us to analyze the structure and dynamics of a diffusion graph involving over 31,000 individuals. We found that information spreading displays a non-Markovian branching dynamics that can be modeled by a two-step Bellman-Harris branching process, which generalizes the static models known in the literature and incorporates the high variability of human behavior. It accurately explains all the features of information propagation below the "tipping point" and can be used for prediction and management of viral information spreading processes.
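The paper's fitted parameters are not reproduced here, but a generic Bellman-Harris cascade, in which each participant forwards the message to a random number of contacts after a random waiting time, can be simulated in a few lines; the branching ratio and delay distribution below are placeholders:

```python
# Toy Bellman-Harris branching simulation of a viral cascade: each
# individual forwards to Poisson(R) others after a log-normal delay.
# Parameter values are placeholders, not the paper's fitted estimates.
import heapq
import numpy as np

rng = np.random.default_rng(1)
R = 0.8               # mean secondary transmissions (subcritical cascade)
MU, SIGMA = 1.5, 1.0  # log-normal response-time parameters (days)

events = [(0.0, "seed")]
times = []
while events:
    t, _ = heapq.heappop(events)
    times.append(t)
    for _ in range(rng.poisson(R)):
        heapq.heappush(events, (t + rng.lognormal(MU, SIGMA), "fwd"))

print(f"cascade size: {len(times)}, duration: {max(times):.1f} days")
```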
Röthlisberger, Fabian; Boes, Stefan; Rubinelli, Sara; Schmitt, Klaus; Scheel-Sailer, Anke
2017-06-26
The admission process of patients to a hospital is the starting point for inpatient services. To optimize the quality of health service provision, a good understanding of the patient admission workflow in a clinic is needed. The aim of this study was to identify challenges and potential improvements in the admission process of spinal cord injury patients at a specialized rehabilitation clinic from the perspective of an interdisciplinary team of health professionals. Semi-structured interviews with eight health professionals (medical doctors, physical therapists, occupational therapists, nurses) at the Swiss Paraplegic Centre (acute and rehabilitation clinic) were conducted based on a maximum-variety purposive sampling strategy. The interviews were analyzed using a thematic analysis approach. The interviewees described the challenges and potential improvements in this admission process, focusing on five themes. First, the characteristics of the patient, including his/her health condition, personality, and family, influence different areas of the admission process. Improvements in the exchange of information between the hospital and the patient could speed up and simplify the admission process. In addition, challenges and potential improvements were found concerning rehabilitation planning, the organization of the admission process, and interdisciplinary work. This study identified five themes of challenges and potential improvements in the admission process of spinal cord injury patients at a specialized rehabilitation clinic. When planning adaptations of process steps in one area, awareness of the effects in other areas is necessary. Improved pre-admission information would be a first important step toward optimizing the admission process. A common IT system providing an interdisciplinary overview and possibilities for interdisciplinary exchange would support the management of the admission process. Managers of other hospitals can supplement the results of this study with their own process analyses to improve their own patient admission processes.
Page, Grier P; Coulibaly, Issa
2008-01-01
Microarrays are a very powerful tool for quantifying the amount of RNA in samples; however, their ability to query essentially every gene in a genome, which can number in the tens of thousands, presents analytical and interpretative problems. As a result, a variety of software and web-based tools have been developed to help with these issues. This article highlights and reviews some of the tools for the first steps in the analysis of a microarray study. We have aimed for a balance between free and commercial systems. We have organized the tools by topic, including image processing tools (Section 2), power analysis tools (Section 3), image analysis tools (Section 4), database tools (Section 5), databases of functional information (Section 6), annotation tools (Section 7), statistical and data mining tools (Section 8), and dissemination tools (Section 9).
NASA Astrophysics Data System (ADS)
Rahman, Md M.; Antani, Sameer K.; Demner-Fushman, Dina; Thoma, George R.
2015-03-01
This paper presents a novel approach to biomedical image retrieval that maps image regions to local concepts and represents images in a weighted, entropy-based concept feature space. The term concept refers to perceptually distinguishable visual patches that are identified locally in image regions and can be mapped to a glossary of imaging terms. The visual significance (e.g., visualness) of concepts is measured as the Shannon entropy of pixel values in image patches and is used to refine the feature vector. Moreover, the system can assist the user in interactively selecting a region of interest (ROI) and searching for similar image ROIs. A spatial verification step is used as post-processing to improve retrieval results based on location information. The hypothesis that such approaches improve biomedical image retrieval is validated through experiments on a data set of 450 lung CT images extracted from journal articles in four different collections.
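The visualness measure described above is a per-patch Shannon entropy. A minimal sketch of that computation (patch size, bin count, and the random stand-in image are assumptions for illustration) is:

```python
# Sketch: Shannon entropy of pixel values in image patches, used as a
# "visualness" weight per the abstract (patch/bin sizes assumed).
import numpy as np

def patch_entropy(patch: np.ndarray, bins: int = 32) -> float:
    """Shannon entropy (bits) of the gray-level histogram of a patch."""
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
image = rng.random((256, 256))      # stand-in for a lung CT slice
size = 32                           # assumed patch size
weights = np.array([[patch_entropy(image[i:i+size, j:j+size])
                     for j in range(0, 256, size)]
                    for i in range(0, 256, size)])
# High-entropy patches would receive larger weights in the feature vector.
print("per-patch visualness weights (bits):\n", weights.round(2))
```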
The preliminary Long Duration Exposure Facility (LDEF) materials data base
NASA Technical Reports Server (NTRS)
Funk, Joan G.; Strickland, John W.; Davis, John M.
1992-01-01
A preliminary Long Duration Exposure Facility (LDEF) Materials Data Base was developed by the LDEF Materials Special Investigation Group (MSIG). The LDEF Materials Data Base is envisioned to eventually contain the wide variety and vast quantity of materials data generated for LDEF. The data is searchable by optical, thermal, and mechanical properties, exposure parameters (such as atomic oxygen flux), and author(s) or principal investigator(s). The LDEF Materials Data Base was incorporated into the Materials and Processes Technical Information System (MAPTIS). MAPTIS is a collection of materials data which was computerized and is available to engineers, designers, and researchers in the aerospace community involved in the design and development of spacecraft and related hardware. This paper describes the LDEF Materials Data Base and includes step-by-step example searches using the data base. Information on how to become an authorized user of the system is included.
Implementation of the fugitive emissions system program: The OxyChem experience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deshmukh, A.
An overview is provided of the Fugitive Emissions System (FES) that has been implemented at Occidental Chemical in conjunction with the computer-based maintenance system called PassPort®, developed by Indus Corporation. The goal of the PassPort® FES program has been to interface with facilities data, equipment information, work standards, and work orders. Along the way, several implementation hurdles had to be overcome before a monitoring and regulatory system could be standardized for the appropriate maintenance, process, and environmental groups. This presentation includes a step-by-step account of several case studies that developed during the implementation of the FES system.
Computer-assisted techniques to evaluate fringe patterns
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Bhat, Gopalakrishna K.
1992-01-01
Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. The availability of digital image processing systems makes it possible to use digital techniques for fringe analysis. In the past, there have been several developments in one-dimensional and two-dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase-stepping (quasi-heterodyning) technique. This paper presents some new developments in two-dimensional fringe analysis, including a phase-stepping technique supplemented by the carrier fringe method and a two-dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
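The paper's contribution combines phase stepping with a carrier; the sketch below shows only the standard four-step (quasi-heterodyne) phase recovery on which such methods build, applied to a synthetic phase map:

```python
# Standard four-step phase-stepping recovery (the building block of
# quasi-heterodyne fringe analysis); the test phase map is synthetic.
import numpy as np

x, y = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
phi_true = 4 * np.pi * (x**2 + 0.5 * y)       # synthetic phase, radians
A, B = 1.0, 0.7                               # background and modulation

# Four interferograms with pi/2 phase steps
I = [A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4)]

# Wrapped phase from the four-step formula: atan2(I4 - I2, I1 - I3)
phi_wrapped = np.arctan2(I[3] - I[1], I[0] - I[2])

err = np.angle(np.exp(1j * (phi_wrapped - phi_true)))  # wrap-aware error
print("max phase error (rad):", np.abs(err).max())
```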
Experimental entanglement of 25 individually accessible atomic quantum interfaces.
Pu, Yunfei; Wu, Yukai; Jiang, Nan; Chang, Wei; Li, Chang; Zhang, Sheng; Duan, Luming
2018-04-01
A quantum interface links the stationary qubits in a quantum memory with flying photonic qubits in optical transmission channels and constitutes a critical element for the future quantum internet. Entanglement of quantum interfaces is an important step for the realization of quantum networks. Through heralded detection of photon interference, we generate multipartite entanglement between 25 (or 9) individually addressable quantum interfaces in a multiplexed atomic quantum memory array and confirm genuine 22-partite (or 9-partite) entanglement. This experimental entanglement of a record-high number of individually addressable quantum interfaces makes an important step toward the realization of quantum networks, long-distance quantum communication, and multipartite quantum information processing.
Brief International Cognitive Assessment for MS (BICAMS): international standards for validation.
Benedict, Ralph H B; Amato, Maria Pia; Boringa, Jan; Brochet, Bruno; Foley, Fred; Fredrikson, Stan; Hamalainen, Paivi; Hartung, Hans; Krupp, Lauren; Penner, Iris; Reder, Anthony T; Langdon, Dawn
2012-07-16
An international expert consensus committee recently recommended a brief battery of tests for cognitive evaluation in multiple sclerosis. The Brief International Cognitive Assessment for MS (BICAMS) battery includes tests of mental processing speed and memory. Recognizing that resources for validation will vary internationally, the committee identified validation priorities, to facilitate international acceptance of BICAMS. Practical matters pertaining to implementation across different languages and countries were discussed. Five steps to achieve optimal psychometric validation were proposed. In Step 1, test stimuli should be standardized for the target culture or language under consideration. In Step 2, examiner instructions must be standardized and translated, including all information from manuals necessary for administration and interpretation. In Step 3, samples of at least 65 healthy persons should be studied for normalization, matched to patients on demographics such as age, gender and education. The objective of Step 4 is test-retest reliability, which can be investigated in a small sample of MS and/or healthy volunteers over 1-3 weeks. Finally, in Step 5, criterion validity should be established by comparing MS and healthy controls. At this time, preliminary studies are underway in a number of countries as we move forward with this international assessment tool for cognition in MS.
MacDonald, Goldie; Garcia, Danyael; Zaza, Stephanie; Schooley, Michael; Compton, Don; Bryant, Terry; Bagnol, Lulu; Edgerly, Cathy; Haverkate, Rick
2006-01-01
The Steps to a HealthierUS Cooperative Agreement Program (Steps Program) enables funded communities to implement chronic disease prevention and health promotion efforts to reduce the burden of diabetes, obesity, asthma, and related risk factors. At both the national and community levels, investment in surveillance and program evaluation is substantial. Public health practitioners engaged in program evaluation planning often identify desired outcomes, related indicators, and data collection methods but may pay only limited attention to an overarching vision for program evaluation among participating sites. We developed a set of foundational elements to provide a vision of program evaluation that informs the technical decisions made throughout the evaluation process. Given the diversity of activities across the Steps Program and the need for coordination between national- and community-level evaluation efforts, our recommendations to guide program evaluation practice are explicit yet leave room for site-specific context and needs. Staff across the Steps Program must consider these foundational elements to prepare a formal plan for program evaluation. Attention to each element moves the Steps Program closer to well-designed and complementary plans for program evaluation at the national, state, and community levels.
Making Information Useful: Engagement in the Sustained National Climate Assessment Process
NASA Astrophysics Data System (ADS)
Lough, G. C.; Cloyd, E.
2015-12-01
Creation of actionable information requires that the producers of that information understand the needs of the intended users and decision makers. To that end, the U.S. Global Change Research Program's sustained National Climate Assessment process includes a focus on engaging users through an inclusive, broad-based, and ongoing process. Such a process provides opportunities for scientific experts and decision makers to share knowledge about the climate-related issues, impacts, and potential response actions that are most important in a particular region or sector. It is also highly transparent, in order to produce results that are credible, salient, and legitimate for both scientists and decision makers, and therefore genuinely useful. To put these principles into practice, USGCRP pursues a broad-based engagement strategy that invites participation from user and stakeholder communities and considers methods for communicating with potential users at every step. The strategy elicits contributions to help shape the framing of the assessment process and products, improve the transparency of the process, and increase the utility of the final information. Specific user inputs are gathered through workshops, public comment opportunities, town hall meetings, presentations, requests for information, submitted documents, and open meetings. Further, a network of contributors self-organizes around topics of interest to extend assessment activities to a wider range of user groups. Here, we describe the outcomes of these innovations in assessment engagement and identify clear successes, notable surprises, future evaluation needs, and areas for new ideas.
Reconstruction dynamics of recorded holograms in photochromic glass.
Mihailescu, Mona; Pavel, Eugen; Nicolae, Vasile B
2011-06-20
We have investigated the dynamics of the record-erase process of holograms in photochromic glass using continuous-wave Nd:YVO₄ laser radiation (λ=532 nm). A two-dimensional microgrid pattern was formed and visualized in the photochromic glass, and the decay of its diffraction efficiency versus time (during the reconstruction step) gave us information (D, Δn) about the diffusion process inside the material. The recording and reconstruction processes were carried out in an off-axis setup, and images of the reconstructed object were recorded by a CCD camera. Measurements on reconstructed object images, using holograms recorded at different incident laser powers, revealed a two-stage process in the silver atom kinetics.
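The abstract reports extracting D from the decay of diffraction efficiency. A common model for diffusive washout of a sinusoidal grating, assumed here rather than taken from the paper, is η(t) = η₀ exp(−2t/τ) with τ = Λ²/(4π²D); a fitting sketch with placeholder numbers:

```python
# Sketch: extracting a diffusion coefficient from diffraction-efficiency
# decay, assuming the standard grating-washout model
#   eta(t) = eta0 * exp(-2 t / tau),  tau = Lambda**2 / (4 pi**2 D).
# The model choice and all numbers are illustrative, not from the paper.
import numpy as np
from scipy.optimize import curve_fit

Lambda = 2e-6                        # grating period, m (assumed)
t = np.linspace(0, 600, 50)          # s
tau_true = 150.0
rng = np.random.default_rng(0)
eta = 0.05 * np.exp(-2 * t / tau_true) \
      * (1 + 0.03 * rng.standard_normal(t.size))   # noisy synthetic data

model = lambda t, eta0, tau: eta0 * np.exp(-2 * t / tau)
(eta0, tau), _ = curve_fit(model, t, eta, p0=(0.05, 100.0))
D = Lambda**2 / (4 * np.pi**2 * tau)
print(f"tau = {tau:.0f} s  ->  D = {D:.2e} m^2/s")
```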
The contract process: a methodology for negotiation. Part I.
Kleinschmidt, W M
1990-05-01
This is the first of a three-part series on the contract process for acquiring a hospital information system product. Part One addresses negotiation methodology: points that will facilitate effective negotiation. Part Two will cover contract contents, focusing on those topics that must be included in a good contract. Part Three will discuss contract philosophy and contract management, subjects that are critical to the good rapport buyers and vendors want. The adversarial approach to the contract process is not the best one. Rather, the process should be treated as a step in building a partnership and a relationship in which both parties win.
Wiggins, Paul A
2015-07-21
This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
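The frequentist information criterion is the paper's own contribution and is not reproduced here; the sketch below shows only the generic likelihood machinery such methods plug into: a single mean-shift change point in Gaussian noise located by a maximum-likelihood scan, with a BIC-style penalty standing in for the FIC.

```python
# Generic single change-point detection for a Gaussian mean shift.
# Maximum-likelihood scan over candidate change points; a BIC-style
# penalty stands in for the paper's frequentist information criterion.
import numpy as np

def best_changepoint(x: np.ndarray) -> int:
    n = x.size
    best = (np.inf, -1)
    for k in range(2, n - 2):
        # residual sum of squares with a separate mean on each segment
        rss = ((x[:k] - x[:k].mean())**2).sum() \
            + ((x[k:] - x[k:].mean())**2).sum()
        cost = n * np.log(rss / n) + 2 * np.log(n)   # penalized likelihood
        if cost < best[0]:
            best = (cost, k)
    return best[1]

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 1.0, 300),   # e.g., a motor dwell
                    rng.normal(1.5, 1.0, 200)])  # after a step
print("estimated change point:", best_changepoint(x))
```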
Advancing the science of forensic data management
NASA Astrophysics Data System (ADS)
Naughton, Timothy S.
2002-07-01
Many individual elements comprise a typical forensics process. Collecting evidence, analyzing it, and using the results to draw conclusions are all mutually distinct endeavors. Different physical locations and personnel are involved, juxtaposed against an acute need for security and data integrity. Using digital technologies and the Internet's ubiquity, these diverse elements can be conjoined using digital data as the common element. The result is a new data management process that can be applied to serve all elements of the community. The first step is recognition of a forensics lifecycle. Evidence gathering, analysis, storage, and use in legal proceedings are actually distinct parts of a single end-to-end process, and thus it is hypothesized that a single data system can accommodate each constituent phase using common network and security protocols. This paper introduces the idea of a web-based Central Data Repository. Its cornerstone is anywhere, anytime Internet upload, viewing, and report distribution. Archives exist indefinitely after being created, and high-strength security and encryption protect data and ensure that subsequent case file additions do not violate chain-of-custody or other handling provisions. Several legal precedents have been established for using digital information in courts of law, and in fact, effective prosecution of cyber crimes absolutely relies on its use. An example is a US Department of Agriculture division's use of digital images to back up its inspection process, with pictures and information retained on secure servers to enforce the Perishable Agricultural Commodities Act. Forensics is a cumulative process. Secure, web-based data management solutions, such as the Central Data Repository postulated here, can support each process step. Logically marrying digital technologies with Internet accessibility should help nurture a thought process to explore alternatives that make forensics data accessible to authorized individuals, whenever and wherever they need it.
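The text leaves the repository's integrity mechanism unspecified; one way to make later case-file additions tamper-evident is a hash-chained audit log, sketched here as an assumed technique rather than the system actually described:

```python
# Illustrative hash-chained audit log for case-file additions (one way to
# make later additions tamper-evident; an assumed technique, not the
# repository's documented mechanism).
import hashlib
import json
import time

def add_entry(chain: list, case_id: str, action: str) -> None:
    prev = chain[-1]["digest"] if chain else "0" * 64
    record = {"case": case_id, "action": action,
              "t": time.time(), "prev": prev}
    payload = json.dumps(record, sort_keys=True).encode()
    record["digest"] = hashlib.sha256(payload).hexdigest()
    chain.append(record)

def verify(chain: list) -> bool:
    prev = "0" * 64
    for rec in chain:
        body = {k: v for k, v in rec.items() if k != "digest"}
        if body["prev"] != prev:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != rec["digest"]:
            return False
        prev = rec["digest"]
    return True

log = []
add_entry(log, "case-042", "evidence uploaded")
add_entry(log, "case-042", "analysis report attached")
print("chain intact:", verify(log))
```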
Vannieuwenborg, Frederic; Van der Auwermeulen, Thomas; Van Ooteghem, Jan; Jacobs, An; Verbrugge, Sofie; Colle, Didier
2016-10-31
In response to the increasing pressure of the societal challenge posed by a graying society, a wave of new Information and Communication Technology (ICT)-supported care services (eCare) can now be observed. Their common goal is to increase the quality of care while decreasing its costs. Smart Care Platforms (SCPs), installed in the homes of care-dependent people, foster the interoperability of these services and offer a set of complementary eCare services on one platform. These eCare services could not only result in higher-quality care for care receivers, but also offer care providers opportunities to optimize their processes. The objective of the study was to identify and describe the expected added values and impacts of integrating SCPs into current home care delivery processes for all actors. In addition, the potential economic impact of SCP deployment is quantified from the perspective of home care organizations. Semistructured and informal interviews, focus groups, and cocreation workshops with service providers, managers of home care organizations, and formal and informal care providers led to the identification of the added values of SCP integration. In a second step, process breakdown analyses of home care provisioning allowed defining the operational impact for home care organizations. Impacts on 2 different process steps of providing home care were quantified. After modeling the investment, an economic evaluation compared the business-as-usual (BAU) scenario with the integrated SCP scenario. The added value of SCP integration for all actors involved in home care was identified. Most impacts were qualitative, such as increased peace of mind, better quality of care, strengthened involvement in care provisioning, and more transparent care communication. For home care organizations, integrating SCPs could lead to a decrease of 38% in the current annual expenses for two administrative process steps, namely care rescheduling and billing for care provisioning. Although integrating SCPs into home care processes could affect the quality of life of both the care receiver and the informal caregiver, only scarce and weak evidence was found to support this assumption. In contrast, there is evidence indicating a lack of impact on the care receiver's quality of life while the cost of care provisioning increases. However, our cost-benefit quantification model shows that integrating SCPs into home care provisioning could lead to a considerable decrease in the costs of administrative care tasks. Because of this cost-decreasing impact, we believe that the integration of SCPs will be driven by home care organizations rather than the care receivers themselves. ©Frederic Vannieuwenborg, Thomas Van der Auwermeulen, Jan Van Ooteghem, An Jacobs, Sofie Verbrugge, Didier Colle. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 31.10.2016.
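To make the reported figure concrete, a back-of-the-envelope version of the comparison is sketched below; only the 38% reduction on the two administrative steps comes from the abstract, while the baseline and platform costs are hypothetical placeholders:

```python
# Back-of-the-envelope BAU vs. SCP comparison. Only the 38% reduction on
# the two administrative steps comes from the abstract; the baseline and
# platform costs are hypothetical placeholders.
annual_admin = 120_000.0    # EUR/yr: rescheduling + billing (hypothetical)
scp_annualized = 25_000.0   # EUR/yr: platform investment (hypothetical)

savings = 0.38 * annual_admin
net = savings - scp_annualized
print(f"gross savings: {savings:,.0f} EUR/yr, net: {net:,.0f} EUR/yr")
```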
Teaching concept analysis to graduate nursing students.
Schiller, Catharine J
2018-04-01
To provide guidance to educators who use the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011), in their graduate nursing curriculum. While graduate nursing curricula often include a concept analysis assignment, there is a paucity of literature to assist educators in guiding students through this challenging process. This article details one way for educators to assist graduate nursing students in learning how to undertake each step of the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011). Using examples, the article walks the reader through the Walker and Avant (2011) concept analysis process and addresses those issues commonly encountered by educators along the way. Having clear information about the steps involved in developing a concept analysis will make it easier for educators to incorporate it into their graduate nursing curriculum and to effectively guide students on their journey through this process. © 2018 Wiley Periodicals, Inc.
Bidding-based autonomous process planning and scheduling
NASA Astrophysics Data System (ADS)
Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.
1995-08-01
Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling, which can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information, including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a way that is consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system, with task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
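A minimal sketch of one such contract-net bidding round, with the machine capabilities, bid formula, and award rule all assumed for illustration:

```python
# Minimal contract-net bidding round: the shop floor manager broadcasts a
# required process; capable machines bid; the lowest bid wins the task.
# Capabilities, the bid formula, and the award rule are assumed here.
from dataclasses import dataclass

@dataclass
class Machine:
    name: str
    processes: set
    rate: float        # cost per hour
    backlog: float     # hours already scheduled

    def bid(self, process: str, hours: float):
        if process not in self.processes:
            return None                               # cannot perform: no bid
        finish = self.backlog + hours
        return (self.rate * hours + 5.0 * finish, finish)  # cost + lateness

shop = [Machine("mill-1", {"milling", "drilling"}, 80.0, 6.0),
        Machine("mill-2", {"milling"}, 60.0, 12.0),
        Machine("lathe-1", {"turning"}, 70.0, 2.0)]

task = ("milling", 3.0)                    # broadcast: process, est. hours
bids = [(m.bid(*task), m) for m in shop]
bids = [(b, m) for b, m in bids if b is not None]
(score, finish), winner = min(bids, key=lambda bm: bm[0][0])
winner.backlog = finish                    # winner updates its schedule
print(f"{winner.name} wins (score {score:.0f}), new backlog {finish} h")
```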
NASA Astrophysics Data System (ADS)
Belabbassi, L.; Garzio, L. M.; Smith, M. J.; Knuth, F.; Vardaro, M.; Kerfoot, J.
2016-02-01
The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of deployed oceanographic sensors. The Pioneer Array in the Atlantic Ocean off the coast of New England hosts 10 moorings and 6 gliders. Each mooring is outfitted with 6 to 19 different instruments telemetering more than 1000 data streams. These data are available to science users to collaborate on common scientific goals such as water quality monitoring and measuring the variability of continental shelf processes and coastal-open ocean exchanges. To serve this purpose, the acquired datasets undergo an iterative, multi-step quality assurance and quality control procedure automated to work with all types of data. Data processing involves several stages, including a fundamental pre-processing step in which the data are prepared for processing. This takes a considerable amount of processing time and is often not given enough thought in development initiatives. The volume and complexity of OOI data necessitate the development of a systematic diagnostic tool to enable the management of a comprehensive data information system for the OOI arrays. We present two examples to demonstrate the current OOI pre-processing diagnostic tool. First, Data Filtering is used to identify incomplete, incorrect, or irrelevant parts of the data and then replace, modify, or delete the coarse data; this provides consistency with similar datasets in the system. Second, Data Normalization organizes the database into fields and tables to minimize redundancy and dependency. At the end of this step, the data are stored in one place, reducing the risk of data inconsistency and promoting easy and efficient mapping to the database.
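The OOI pipeline itself is not reproduced here; a toy version of the two steps, filtering out-of-range values and then normalizing records into keyed tables, might look like the following (field names and the valid range are assumptions, not OOI's schema):

```python
# Toy version of the two pre-processing steps described above (illustrative
# only; field names and the valid range are assumed, not OOI's schema).
import pandas as pd

raw = pd.DataFrame({
    "mooring":    ["CP01", "CP01", "CP02", "CP02"],
    "instrument": ["ctd-3", "ctd-3", "ctd-7", "ctd-7"],
    "temp_C":     [12.1, -999.0, 11.4, 58.0],   # -999 fill value, 58 spike
})

# 1. Data Filtering: flag and drop incomplete/incorrect values
valid = raw["temp_C"].between(-5.0, 40.0)       # assumed gross-range test
clean = raw[valid].copy()

# 2. Data Normalization: split into keyed tables to remove redundancy
instruments = clean[["mooring", "instrument"]].drop_duplicates() \
                                              .reset_index(drop=True)
instruments["instrument_id"] = instruments.index
obs = clean.merge(instruments)[["instrument_id", "temp_C"]]
print(instruments, obs, sep="\n\n")
```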
Kim, Jeong-Nam; Oh, Yu Won; Krishna, Arunima
2018-01-01
This study proposes the idea of justificatory information forefending, a cognitive process by which individuals accept information that confirms their preexisting health beliefs, and reject information that is dissonant with their attitudes. In light of the sheer volume of often contradictory information related to health that is frequently highlighted by the traditional media, this study sought to identify antecedents and outcomes of this justificatory information forefending. Results indicate that individuals who are exposed to contradictory health information, currently engage in risky health behavior, are comfortable using the Internet to search for information, and are currently taking steps to maintain their health are likely to actively select health information that confirms their preexisting notions about their health, and to reject information that is contradictory to their beliefs. Additionally, individuals who engage in justificatory information forefending were also found to continue to engage in risky health behavior. Implications for theory and practice are discussed.
Medical image registration based on normalized multidimensional mutual information
NASA Astrophysics Data System (ADS)
Li, Qi; Ji, Hongbing; Tong, Ming
2009-10-01
Registration of medical images is an essential research topic in medical image processing and applications, and in particular a preliminary and key step for multimodality image fusion. This paper offers a solution to medical image registration based on normalized multi-dimensional mutual information. First, an affine transformation with translational and rotational parameters is applied to the floating image. Then, ordinal features are extracted by ordinal filters with different orientations to represent spatial information in the medical images. Integrating the ordinal features with pixel intensities, the normalized multi-dimensional mutual information is defined as the similarity criterion to register multimodality images. Finally, an immune algorithm is used to search for the registration parameters. The experimental results demonstrate the effectiveness of the proposed registration scheme.
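The paper's criterion augments pixel intensities with ordinal features; the sketch below shows only the standard intensity-based normalized mutual information, NMI = (H(A)+H(B))/H(A,B), computed from a joint histogram, on synthetic images:

```python
# Standard intensity-based normalized mutual information from a joint
# histogram (the paper additionally folds in ordinal features, omitted here).
import numpy as np

def nmi(a: np.ndarray, b: np.ndarray, bins: int = 32) -> float:
    """NMI = (H(A) + H(B)) / H(A, B); higher means better aligned."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    h = lambda p: -(p[p > 0] * np.log(p[p > 0])).sum()
    return (h(px) + h(py)) / h(pxy)

rng = np.random.default_rng(0)
fixed = rng.random((128, 128))
aligned = fixed + 0.05 * rng.standard_normal(fixed.shape)
shifted = np.roll(aligned, 8, axis=1)       # simulated misregistration
print(f"NMI aligned: {nmi(fixed, aligned):.3f}")
print(f"NMI shifted: {nmi(fixed, shifted):.3f}")
```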
The Internet as a New Tool in the Rehabilitation Process of Patients—Education in Focus
Forczek, Erzsébet; Makra, Péter; Sik Lanyi, Cecilia; Bari, Ferenc
2015-01-01
In the article we deal with the rehabilitation of patients using information technology, especially Internet support. We concentrate on two main areas in the IT support of rehabilitation: one is the support for individual therapy; the other is providing patients with information, which is the basic step in emphasising individual responsibility. In the development of rehabilitation programmes, the knowledge of the IT professional and the therapist is central, while in the IT support of web guidance, medical expertise plays the primary role. The degree of assistance involved in the rehabilitation process also depends on the IT knowledge of medical professionals (general practitioners, nursing staff). The knowledge required in healing and development processes is imparted to professionals by special (full-time) university training. It was a huge challenge for us to teach web-based information organisation skills to doctors and nurses, and it is also a complex task to put forward such an IT viewpoint to information specialists in order to create the foundations of cooperation between IT and healthcare professionals. PMID:25711359
Stepped frequency ground penetrating radar
Vadnais, Kenneth G.; Bashforth, Michael B.; Lewallen, Tricia S.; Nammath, Sharyn R.
1994-01-01
A stepped-frequency ground-penetrating radar system is described, comprising an RF signal-generating section capable of producing stepped-frequency signals in equally spaced increments of time and frequency over a preselected bandwidth, which serves as a common RF signal source for both the transmit and receive portions of the system. In the transmit portion, the signal is processed into in-phase and quadrature signals, which are amplified and transmitted toward a target. The signals reflected from the target are received by a receive antenna and mixed with a reference signal from the common RF source in a mixer, whose output is fed through a low-pass filter. The DC output, after amplification and demodulation, is digitized and converted into a frequency-domain signal by a Fast Fourier Transform. A plot of the frequency-domain signals from all of the stepped frequencies broadcast toward and received from the target yields information concerning the range (distance) and cross section (size) of the target.
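A minimal numerical sketch of why the final transform yields range: synthesize the per-step I/Q mixer outputs of a single point target across the frequency steps and apply an inverse FFT (all parameters are assumed for illustration):

```python
# Why the final transform yields range: synthesize the I/Q mixer outputs
# of a point target over N frequency steps, then transform to a range
# profile. All parameters are assumed for illustration.
import numpy as np

c = 3e8
f0, df, N = 200e6, 2e6, 128          # start freq, step size, step count
R_true = 14.0                        # target range, m

f = f0 + df * np.arange(N)
iq = np.exp(-1j * 2 * np.pi * f * 2 * R_true / c)   # per-step I/Q sample

profile = np.abs(np.fft.ifft(iq))    # synthetic range profile
dr = c / (2 * N * df)                # range-bin spacing = c / (2 * bandwidth)
print(f"peak at {np.argmax(profile) * dr:.1f} m (true {R_true} m)")
```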