7 CFR 1794.11 - Apply NEPA early in the planning process.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 12 2011-01-01 Apply NEPA early in the planning process. 1794.11... National Environmental Policy Act § 1794.11 Apply NEPA early in the planning process. The environmental review process requires early coordination with and involvement of RUS. Applicants should consult with...
7 CFR 1794.11 - Apply NEPA early in the planning process.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 12 2010-01-01 Apply NEPA early in the planning process. 1794.11... National Environmental Policy Act § 1794.11 Apply NEPA early in the planning process. The environmental review process requires early coordination with and involvement of RUS. Applicants should consult with...
The amount of ergonomics and user involvement in 151 design processes.
Kok, Barbara N E; Slegers, Karin; Vink, Peter
2012-01-01
Ergonomics, usability and user-centered design are terms that are well known among designers. Yet, products often seem to fail to meet the users' needs, resulting in a gap between expected and experienced usability. To understand the possible causes of this gap, the actions taken by the designer during the design process are studied in this paper. This can show whether and how certain actions influence the user-friendliness of the design products. The aim of this research was to understand whether ergonomic principles and methods are included in the design process, whether users are involved in this process, and whether the designer's experience (in ergonomics/user involvement) has an effect on the usability of the end product. In this study the design processes of 151 tangible products created by design students were analyzed. In 75% of the cases some ergonomic principles were applied, while users were involved in only one third of the design cases. Hardly any correlation was found between the designers' experience with ergonomic principles and the way they applied them, and no correlation was found between the designers' experience with user involvement and the extent to which users were involved in the design process.
A Tutorial Design Process Applied to an Introductory Materials Engineering Course
ERIC Educational Resources Information Center
Rosenblatt, Rebecca; Heckler, Andrew F.; Flores, Katharine
2013-01-01
We apply a "tutorial design process", which has proven to be successful for a number of physics topics, to design curricular materials or "tutorials" aimed at improving student understanding of important concepts in a university-level introductory materials science and engineering course. The process involves the identification…
Process dissociation and mixture signal detection theory.
DeCarlo, Lawrence T
2008-11-01
The process dissociation procedure was developed in an attempt to separate different processes involved in memory tasks. The procedure naturally lends itself to a formulation within a class of mixture signal detection models. The dual process model is shown to be a special case. The mixture signal detection model is applied to data from a widely analyzed study. The results suggest that a process other than recollection may be involved in the process dissociation procedure.
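For readers unfamiliar with the procedure, the classic dual-process estimates are obtained from inclusion/exclusion performance via P(inclusion) = R + (1 - R)A and P(exclusion) = (1 - R)A; the mixture signal detection model discussed above generalizes this. A minimal Python sketch of the standard textbook estimates (the example rates are illustrative, not data from the article):

    def process_dissociation_estimates(p_inclusion, p_exclusion):
        """Classic dual-process estimates from inclusion/exclusion rates.

        Assumes P(incl) = R + (1 - R) * A and P(excl) = (1 - R) * A,
        where R is recollection and A the automatic component.
        """
        recollection = p_inclusion - p_exclusion
        if recollection >= 1.0:
            raise ValueError("recollection estimate must be below 1")
        automatic = p_exclusion / (1.0 - recollection)
        return recollection, automatic

    # Example: inclusion rate 0.70, exclusion rate 0.30 -> R = 0.40, A = 0.50
    print(process_dissociation_estimates(0.70, 0.30))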
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation followed by stochastic modification of the adapted model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
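To make the hybrid structure concrete, here is a minimal, illustrative Python sketch of an analytic model augmented with a neural residual. It is only a toy: the process model, the data, and the fixed-random-feature network fitted by ordinary least squares are all assumptions, and the patent's scaled equation error minimization, stochastic qualification, and SPRT validation are not reproduced.

    import numpy as np

    def analytic_model(x):
        # Assumed known process physics: a simple first-order response.
        return 1.0 - np.exp(-x)

    def true_process(x):
        # "True" plant with an extra effect the analytic model misses.
        return analytic_model(x) + 0.1 * np.sin(3.0 * x)

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 4.0, 200)
    y = true_process(x) + 0.01 * rng.standard_normal(x.size)

    # Neural residual: fixed random hidden layer, output weights fitted by
    # ordinary least squares (a stand-in for the patent's training scheme).
    W = rng.standard_normal((1, 20))
    b = rng.standard_normal(20)
    hidden = np.tanh(x[:, None] @ W + b)
    w_out, *_ = np.linalg.lstsq(hidden, y - analytic_model(x), rcond=None)

    def sqna_predict(x_new):
        # Analytic part plus learned residual correction.
        return analytic_model(x_new) + np.tanh(x_new[:, None] @ W + b) @ w_out

    print(float(np.max(np.abs(sqna_predict(x) - y))))  # fit residual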
How High Is the Tramping Track? Mathematising and Applying in a Calculus Model-Eliciting Activity
ERIC Educational Resources Information Center
Yoon, Caroline; Dreyfus, Tommy; Thomas, Michael O. J.
2010-01-01
Two complementary processes involved in mathematical modelling are mathematising a realistic situation and applying a mathematical technique to a given realistic situation. We present and analyse work from two undergraduate students and two secondary school teachers who engaged in both processes during a mathematical modelling task that required…
Malfait, Simon; Van Hecke, Ann; Hellings, Johan; De Bodt, Griet; Eeckloo, Kristof
2017-02-01
In many health care systems, strategies are currently deployed to engage patients and other stakeholders in decisions affecting hospital services. In this paper, a model for stakeholder involvement is presented and evaluated in three Flemish hospitals. In the model, a stakeholder committee advises the hospital's board of directors on themes of strategic importance. The aim was to study the hospitals' internal decision processes in order to identify the impact of a stakeholder involvement committee on strategic themes in hospital decision-making. A retrospective analysis of the decision processes was conducted in three hospitals that implemented a stakeholder committee. The analysis consisted of process and outcome evaluation. Fifteen themes were discussed in the stakeholder committees, of which 11 resulted in considerable change; none of these were on a strategic level. The theoretical model was not applied as initially developed, but was altered by each hospital. Consequently, the decision processes differed between the hospitals. Despite alteration of the model, the stakeholder committee showed a meaningful impact in all hospitals on the operational level. As a result of the differences in decision processes, three factors could be identified as facilitators for success: (1) a close interaction with the board of executives, (2) the inclusion of themes with a more practical and patient-oriented nature, and (3) the elaboration of decisions on lower echelons of the organization. To effectively influence the organization's public accountability, hospitals should involve stakeholders in the decision-making process of the organization. The model of a stakeholder committee was not applied as initially developed and did not affect the strategic decision-making processes in the involved hospitals; results show impact only at the operational level in the participating hospitals. More research is needed connecting stakeholder involvement with hospital governance.
43 CFR 10010.9 - Apply NEPA early.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 43 Public Lands: Interior 2 2011-10-01 Apply NEPA early. 10010.9 Section 10010.9... Initiating the NEPA Process § 10010.9 Apply NEPA early. (a) The Commission will initiate early consultation... early with interested private parties and organizations, including when the Commission's own involvement...
Beta-decay half-lives for short neutron rich nuclei involved into the r-process
NASA Astrophysics Data System (ADS)
Panov, I.; Lutostansky, Yu; Thielemann, F.-K.
2018-01-01
The beta-strength function model based on Finite Fermi-Systems Theory is applied to calculations of beta-decay half-lives for short-lived neutron-rich nuclei involved in the r-process. It is shown that the accuracy of the calculated beta-decay half-lives of short-lived neutron-rich nuclei improves with increasing neutron excess, so the calculated half-lives can be used for modeling the nucleosynthesis of heavy nuclei in the r-process.
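For context, the textbook relation linking a beta-strength function to the half-life (general background, not the specific formulation of the Finite Fermi-Systems model used here) is

    1/T_{1/2} = \int_0^{Q_\beta} S_\beta(E) \, f(Z, Q_\beta - E) \, dE,

where S_\beta(E) is the beta-strength function, f is the Fermi phase-space integral, and Q_\beta is the decay energy. As the neutron excess grows, Q_\beta increases and the integral samples a broader part of the strength function, which is one common explanation for the improved accuracy noted above.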
Palma, José M; Ruiz, Carmelo; Corpas, Francisco J
2018-01-01
Nitric oxide (NO) is involved in many physiological plant processes, including germination, growth and development of roots, flower setting and development, senescence, and fruit ripening. In the latter process, NO has been reported to play a role opposite to that of ethylene. Thus, treatment of fruits with NO may delay ripening, independently of whether they are climacteric or nonclimacteric. Different methods of applying NO to plant systems have been reported, involving sodium nitroprusside, NONOates, DETANO, or GSNO, to investigate the physiological and molecular consequences. In this chapter a method to treat plant materials with NO is provided, using bell pepper fruits as a model. This method is cheap, free of side effects, and easy to apply, since it only requires common chemicals and tools available in any biology laboratory.
Estimating costs and performance of systems for machine processing of remotely sensed data
NASA Technical Reports Server (NTRS)
Ballard, R. J.; Eastwood, L. F., Jr.
1977-01-01
This paper outlines a method for estimating computer processing times and costs incurred in producing information products from digital remotely sensed data. The method accounts for both computation and overhead, and may be applied to any serial computer. The method is applied to estimate the cost and computer time involved in producing Level II Land Use and Vegetative Cover Maps for a five-state midwestern region. The results show that the amount of data to be processed overloads some example computer systems, but that the processing is feasible on others.
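As a rough illustration of a computation-plus-overhead estimate of this kind (all rates, the overhead fraction, and the price below are placeholder assumptions, not figures from the paper):

    def processing_cost(n_pixels, ops_per_pixel, machine_ops_per_sec,
                        overhead_fraction=0.3, dollars_per_cpu_hour=50.0):
        """Rough compute-time and cost estimate; every parameter is an
        illustrative placeholder, not a value from the study."""
        compute_seconds = n_pixels * ops_per_pixel / machine_ops_per_sec
        total_seconds = compute_seconds * (1.0 + overhead_fraction)
        cost_dollars = total_seconds / 3600.0 * dollars_per_cpu_hour
        return total_seconds, cost_dollars

    # e.g. ~10^9 pixels classified at 500 operations per pixel
    # on a 10^8 ops/s machine
    print(processing_cost(1e9, 500, 1e8))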
Kaizen: The Search for Quality.
ERIC Educational Resources Information Center
Zimmerman, William J.
1991-01-01
The Japanese concept of Kaizen (continuous improvement) may be applied to higher education institutions. Focus is on improvement of the products produced, the process by which they are delivered, and the people involved in the products and the process. (SK)
Fostering Faculty Leadership in the Institutional Assessment Process.
ERIC Educational Resources Information Center
La Potin, Armand S.; Haessig, Carolyn J.
1999-01-01
Applies John P. Kotter's eight-stage process as a model to demonstrate how faculty leadership can evolve in an institutional-assessment process that promotes change in campus tradition. Supports the idea that because faculty are responsible for student learning, the process of involving them as leaders will enhance the quality of the outcomes.…
Process Dissociation and Mixture Signal Detection Theory
ERIC Educational Resources Information Center
DeCarlo, Lawrence T.
2008-01-01
The process dissociation procedure was developed in an attempt to separate different processes involved in memory tasks. The procedure naturally lends itself to a formulation within a class of mixture signal detection models. The dual process model is shown to be a special case. The mixture signal detection model is applied to data from a widely…
DOT National Transportation Integrated Search
1999-09-01
This booklet presents some of the successes of the community-sensitive transportation facility development process. Although a comprehensive process is described here, not every project involves the full range of steps. By applying the techniques out...
A Systems Approach to Nitrogen Delivery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goins, Bobby
A systems based approach will be used to evaluate the nitrogen delivery process. This approach involves principles found in Lean, Reliability, Systems Thinking, and Requirements. This unique combination of principles and thought process yields a very in depth look into the system to which it is applied. By applying a systems based approach to the nitrogen delivery process there should be improvements in cycle time, efficiency, and a reduction in the required number of personnel needed to sustain the delivery process. This will in turn reduce the amount of demurrage charges that the site incurs. In addition there should be less frustration associated with the delivery process.
Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes
NASA Astrophysics Data System (ADS)
Cropper, A. E.; Wang, Z.
1995-08-01
Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter being a result of the traceability possibilities through a process involving mixing and splitting of material.
Low friction and galling resistant coatings and processes for coating
Johnson, Roger N.
1987-01-01
The present invention describes coating processes and the resultant coated articles for use in high temperature sodium environments, such as those found in liquid metal fast breeder reactors and their associated systems. The substrate to which the coating is applied may be either an iron base or nickel base alloy. The coating itself is applied to the substrate by electro-spark deposition techniques which result in metallurgical bonding between the coating and the substrate. One coating according to the present invention involves electro-spark depositing material from a cemented chromium carbide electrode and an aluminum electrode. Another coating according to the present invention involves electro-spark depositing material from a cemented chromium carbide electrode and a nickel-base hardfacing alloy electrode.
Du, Xiaoming; Chen, Lin; Zhou, Ke
2012-10-01
Converging evidence from neuroimaging as well as lesion and transcranial magnetic stimulation (TMS) studies has been obtained for the involvement of right ventral posterior parietal cortex (PPC) in exogenous orienting. However, the contribution of dorsal PPC to attentional orienting, particularly endogenous orienting, is still under debate. In an informative peripheral cueing paradigm, in which the exogenous and endogenous orienting can be studied in relative isolation within a single task, we applied TMS over sub-regions of dorsal PPC to explore their possible distinct involvement in exogenous and endogenous processes. We found that disruption of the left posterior intraparietal sulcus (pIPS) weakened the attentional effects of endogenous orienting, but did not affect exogenous processes. In addition, TMS applied over the right superior parietal lobule (SPL) resulted in an overall increase in reaction times. The present study provides the causal evidence that the left pIPS plays a crucial role in voluntary orienting of visual attention, while right SPL is involved in the processing of arousal and/or vigilance. Copyright © 2011 Wiley Periodicals, Inc.
Heuristics as a Basis for Assessing Creative Potential: Measures, Methods, and Contingencies
ERIC Educational Resources Information Center
Vessey, William B.; Mumford, Michael D.
2012-01-01
Studies of creative thinking skills have generally measured a single aspect of creativity, divergent thinking. A number of other processes involved in creative thought have been identified. Effective execution of these processes is held to depend on the strategies applied in process execution, or heuristics. In this article, we review prior…
Systems Engineering, Quality and Testing
NASA Technical Reports Server (NTRS)
Shepherd, Christena C.
2015-01-01
AS9100 has little to say about how to apply a Quality Management System (QMS) to aerospace test programs. There is little in the quality engineering Body of Knowledge that applies to testing, unless it is nondestructive examination or some type of lab or bench testing. If one examines how the systems engineering processes are implemented throughout a test program, and how these processes can be mapped to AS9100, a number of areas for involvement of the quality professional are revealed.
Medical Staff Involvement in Nursing Homes: Development of a Conceptual Model and Research Agenda
Shield, Renée; Rosenthal, Marsha; Wetle, Terrie; Tyler, Denise; Clark, Melissa; Intrator, Orna
2013-01-01
Medical staff (physicians, nurse practitioners, physicians' assistants) involvement in nursing homes (NH) is limited by professional guidelines, government policies, regulations, and reimbursements, creating bureaucratic burden. The conceptual NH Medical Staff Involvement Model, based on our mixed methods research, applies the Donabedian structure-process-outcomes framework to the NH, identifying measures for a coordinated research agenda. Quantitative surveys and qualitative interviews conducted with medical directors, administrators, directors of nursing, other experts, residents, and family members, together with Minimum Data Set, the Online Certification and Reporting System, and Medicare Part B claims data related to NH structure, process, and outcomes, were analyzed. NH control of medical staff, or structure, affects medical staff involvement in care processes and is associated with better outcomes (e.g. symptom management, appropriate transitions, satisfaction). The Model identifies measures clarifying the impact of NH medical staff involvement on care processes and resident outcomes and has strong potential to inform regulatory policies. PMID:24652944
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software was a potential cause or was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Interference detection and correction applied to incoherent-scatter radar power spectrum measurement
NASA Technical Reports Server (NTRS)
Ying, W. P.; Mathews, J. D.; Rastogi, P. K.
1986-01-01
A median-filter-based interference detection and correction technique is evaluated, and its application to Arecibo incoherent scatter radar D-region ionospheric power spectrum measurements is discussed. The method can be extended to other kinds of data when the statistics involved in the process are still valid.
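A minimal Python sketch of a median-filter interference screen in this spirit; the kernel width and threshold are illustrative choices, not the values used on the Arecibo data:

    import numpy as np
    from scipy.signal import medfilt

    def correct_interference(power_spectrum, kernel=9, threshold=5.0):
        """Flag points far above a running median and replace them."""
        baseline = medfilt(power_spectrum, kernel_size=kernel)
        deviation = power_spectrum - baseline
        scale = np.median(np.abs(deviation)) + 1e-12  # robust spread estimate
        contaminated = deviation > threshold * scale
        return np.where(contaminated, baseline, power_spectrum), contaminated

    # Example: a flat spectrum with one interference spike.
    spectrum = np.ones(64)
    spectrum[20] = 50.0
    cleaned, flags = correct_interference(spectrum)
    print(int(flags.sum()), float(cleaned[20]))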
Applying Psychology in Local Authority Emergency Planning Processes
ERIC Educational Resources Information Center
Posada, Susan E.
2006-01-01
This article describes the work of two EPs involved in a multi-agency project to produce Local Authority (LA) guidelines on psycho/social support following critical incidents and disasters. EPs were involved as participant observers during a simulation of setting up and running a LA reception centre for evacuees. A questionnaire was then…
Application of a High-Fidelity Icing Analysis Method to a Model-Scale Rotor in Forward Flight
NASA Technical Reports Server (NTRS)
Narducci, Robert; Orr, Stanley; Kreeger, Richard E.
2012-01-01
An icing analysis process involving the loose coupling of OVERFLOW-RCAS for rotor performance prediction with LEWICE3D for thermal analysis and ice accretion is applied to a model-scale rotor for validation. The process offers high-fidelity rotor analysis for non-iced and iced rotor performance evaluation that accounts for the interaction of nonlinear aerodynamics with blade elastic deformations. Ice accumulation prediction also involves loosely coupled data exchanges between OVERFLOW and LEWICE3D to produce accurate ice shapes. Validation of the process uses data collected in the 1993 icing test involving Sikorsky's Powered Force Model. Non-iced and iced rotor performance predictions are compared to experimental measurements, as are predicted ice shapes.
Statistically Qualified Neuro-Analytic system and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
Re-engineering: a prescription for hospitals.
Bolton, C; Gordon, J R
1994-01-01
Previously applied mostly in large, private sector corporations, "re-engineering" is fast becoming a tool that hospitals can use to break away from the old to find a new and better way of doing things. Re-engineering, however, first requires strong leadership which is committed to employee involvement and re-inventing the process design to meet the needs of the customers. Once the transition has been completed, the processes and the organization must continue to be managed differently. This article reviews the processes involved in re-engineering, and discusses the implementation of the initiative at the Sunnybrook Health Science Centre in Toronto.
The Market Responses to the Government Regulation of Chlorinated Solvents: A Policy Analysis
1988-10-01
in the process of statistical estimation of model parameters. The results of the estimation process applied to chlorinated solvent markets show the... policy context for this research. Section III provides analysis necessary to understand the chemicals involved, their production processes and costs, and
A Systematic Review of Sensory Processing Interventions for Children with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Case-Smith, Jane; Weaver, Lindy L.; Fristad, Mary A.
2015-01-01
Children with autism spectrum disorders often exhibit co-occurring sensory processing problems and receive interventions that target self-regulation. In current practice, sensory interventions apply different theoretic constructs, focus on different goals, use a variety of sensory modalities, and involve markedly disparate procedures. Previous…
One approach to predictive modeling of biological contamination of recreational waters and drinking water sources involves applying process-based models that consider microbial sources, hydrodynamic transport, and microbial fate. Fecal indicator bacteria such as enterococci have ...
Strength enhancement process for prealloyed powder superalloys
NASA Technical Reports Server (NTRS)
Waters, W. J.; Freche, J. C.
1977-01-01
A technique involving superplastic processing and high pressure autoclaving was applied to a nickel base prealloyed powder alloy. Tensile strengths as high as 2865 MN/sq m at 480 C were obtained with as-superplastically deformed material. Appropriate treatments yielding materials with high temperature tensile and stress rupture strengths were also devised.
ERIC Educational Resources Information Center
Frederik, Hans; Hasanefendic, Sandra; van der Sijde, Peter
2017-01-01
In this paper, we analyse 53 Dutch accreditation reports in the field of information technology to assess the mechanisms of the reported involvement of the professional field in the undergraduate programmes of universities of applied sciences. The results of qualitative content analysis reveal a coupling effect in reporting on mechanisms of…
Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process
2012-10-01
involves early use of systems engineering and technical analyses to supplement the existing operational analysis techniques currently used in... complexity, and costs of systems now being developed require tight coupling between operational requirements stated in the CDD, system requirements... Fleischer. Keywords: Capability Development, Competitive Prototyping, Knowledge Points, Early Systems Engineering
ERIC Educational Resources Information Center
Tsai, Chia-Wen
2015-01-01
This research investigated, via quasi-experiments, the effects of web-based co-regulated learning (CRL) on developing students' computing skills. Two classes of 68 undergraduates in a one-semester course titled "Applied Information Technology: Data Processing" were chosen for this research. The first class (CRL group, n = 38) received…
Sensitizing Children to the Social and Emotional Mechanisms Involved in Racism: A Program Evaluation
ERIC Educational Resources Information Center
Triliva, Sofia; Anagnostopoulou, Tanya; Vleioras, Georgios
2014-01-01
This paper describes and discusses the results of an intervention aiming to sensitize children to the social and emotional processes involved in racism. The intervention was applied and evaluated in 10 Greek elementary schools. The goals and the intervention methods of the program modules are briefly outlined and the results of the program…
ERIC Educational Resources Information Center
Wareham, Todd
2017-01-01
In human problem solving, there is a wide variation between individuals in problem solution time and success rate, regardless of whether or not this problem solving involves insight. In this paper, we apply computational and parameterized analysis to a plausible formalization of extended representation change theory (eRCT), an integration of…
Classification of processes involved in sharing individual participant data from clinical trials
Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena
2018-01-01
Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing. PMID:29623192
Strategic Communication: The Meaning is in the People
2011-03-06
to collect, share, and apply knowledge. Civilization has been possible only through the process of human communication. Fredrick Williams... to affect SC. Overview of the Communication Process: Communication (human communication, at least) is something people do. To understand the human... societies. It involves influencing each other and being informed. In order to understand the human communication process, one must understand how people
Salutogenic service user involvement in nursing research: a case study.
Mjøsund, Nina Helen; Vinje, Hege Forbech; Eriksson, Monica; Haaland-Øverby, Mette; Jensen, Sven Liang; Kjus, Solveig; Norheim, Irene; Portaasen, Inger-Lill; Espnes, Geir Arild
2018-05-12
The aim was to explore the process of involving mental healthcare service users in a mental health promotion research project as research advisors and to articulate features of the collaboration which encouraged and empowered the advisors to make significant contributions to the research process and outcome. There is an increasing interest in evaluating aspects of service user involvement in nursing research. Few descriptions exist of features that enable meaningful service user involvement. We draw on experiences from conducting research which used the methodology interpretative phenomenological analysis to explore how persons with mental disorders perceived mental health. Aside from the participants in the project, five research advisors with service user experience were involved in the entire research process. We applied a case study design to explore the ongoing processes of service user involvement. Documents and texts produced while conducting the project (2012-2016), as well as transcripts from multistage focus group discussions with the research advisors, were analysed. The level of involvement was dynamic and varied throughout the different stages of the research process. Six features (leadership, meeting structure, role clarification, being members of a team, a focus on possibilities, and being seen and treated as holistic individuals) were guiding principles for a salutogenic service user involvement. These features strengthened the advisors' perception of themselves as valuable and competent contributors. Significant contributions from research advisors were promoted by facilitating the process of involvement. A supporting structure and atmosphere were consistent with a salutogenic service user involvement.
Modular multiapertures for light sensors
NASA Technical Reports Server (NTRS)
Rizzo, A. A.
1977-01-01
Process involves electroplating multiaperture masks as a unit, eliminating alignment and assembly difficulties previously encountered. Technique may be applied to masks in automated and surveillance light systems when a precise, wide-angle field of view is needed.
Piezoelectric Ceramics and Their Applications
ERIC Educational Resources Information Center
Flinn, I.
1975-01-01
Describes the piezoelectric effect in ceramics and presents a quantitative representation of this effect. Explains the processes involved in the manufacture of piezoelectric ceramics, the materials used, and the situations in which they are applied. (GS)
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 2 2010-07-01 Definition. 257.3 Section 257.3 National Defense... SERVICE OF PROCESS § 257.3 Definition. Service of Process. When applied to the filing of a court action... delivery of a subpoena for any other reason whether or not the matter involves the United States. ...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 2 2011-07-01 Definition. 257.3 Section 257.3 National Defense... SERVICE OF PROCESS § 257.3 Definition. Service of Process. When applied to the filing of a court action... delivery of a subpoena for any other reason whether or not the matter involves the United States. ...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 Definition. 257.3 Section 257.3 National Defense... SERVICE OF PROCESS § 257.3 Definition. Service of Process. When applied to the filing of a court action... delivery of a subpoena for any other reason whether or not the matter involves the United States. ...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 2 2013-07-01 Definition. 257.3 Section 257.3 National Defense... SERVICE OF PROCESS § 257.3 Definition. Service of Process. When applied to the filing of a court action... delivery of a subpoena for any other reason whether or not the matter involves the United States. ...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 Definition. 257.3 Section 257.3 National Defense... SERVICE OF PROCESS § 257.3 Definition. Service of Process. When applied to the filing of a court action... delivery of a subpoena for any other reason whether or not the matter involves the United States. ...
Should Humanism Approach Be Applied in English as a Second Language (ESL) Classrooms?
ERIC Educational Resources Information Center
Ling, Lee Yi; Jin, Ng Yu; Tong, Chong Seng; Tarmizi, Mohd. Ariff
2014-01-01
In the process of learning, many elements must fall into place together in order to enhance effectiveness. These elements include not only environmental factors but also learners' mentality, which involves their feelings, needs and interests. The humanism approach is one which caters to these elements required by learners' learning process through emphasis on…
Coding the Composing Process: A Guide for Teachers and Researchers.
ERIC Educational Resources Information Center
Perl, Sondra
Designed for teachers and researchers interested in the study of the composing process, this guide introduces a method of analysis that can be applied to data from a range of different cases. Specifically, the guide offers a simple, direct coding scheme for describing the movements occurring during composing that involves four procedures: teaching…
Gender Differences in the Motivational Processing of Facial Beauty
ERIC Educational Resources Information Center
Levy, Boaz; Ariely, Dan; Mazar, Nina; Chi, Won; Lukas, Scott; Elman, Igor
2008-01-01
Gender may be involved in the motivational processing of facial beauty. This study applied a behavioral probe, known to activate brain motivational regions, to healthy heterosexual subjects. Matched samples of men and women were administered two tasks: (a) key pressing to change the viewing time of average or beautiful female or male facial…
Experiencing the Progress Report: An Analysis of Gender and Administration in Doctoral Candidature
ERIC Educational Resources Information Center
Mewburn, Inger; Cuthbert, Denise; Tokareva, Ekaterina
2014-01-01
Most universities around the world put in place administrative processes and systems to manage student progress. These processes usually involve filling out standardised forms and instruments: managerial tools intended to increase transparency, promote efficiency and ensure fairness by applying the same standards to all. The progress report is a…
The Technology Adoption Process Model and Self-Efficacy of Distance Education Students
ERIC Educational Resources Information Center
Olson, Joel D.; Appunn, Frank D.
2017-01-01
The technology adoption process model (TAPM) is applied to a new synchronous conference technology with 27 asynchronous courses involving 520 participants and 17 instructors. The TAPM resulted from a qualitative study reviewing webcam conference technology adoption. The TAPM is now tested using self-efficacy as the dependent variable. The…
Cognitive Processes in Interpreting the Contour-Line Portrayal of Terrain Relief.
ERIC Educational Resources Information Center
Cross, Kenneth D.; And Others
Designed to gain a more thorough understanding of the cognitive processes involved and apply this knowledge in defining improved teaching strategies, this study of contour interpretation (referred to as "position fixing") required 12 subjects to locate their position on a map after being transported, blindfolded, to test sites where…
Towards a Periodical and Monograph Price Index. AIR Forum 1980 Paper.
ERIC Educational Resources Information Center
Belanger, Charles H.; Lavallee, Lise
The steps involved in tailoring a periodical and monograph price index to a university library are examined, as are the difficulties involved in applying a simple methodology such as a price index when the data base has not been organized to play an active role in the decision-making process. The following topics are addressed: the shifting of…
Han, Kyung-Ja; Kim, Hesook Suzie; Kim, Mae-Ja; Hong, Kyung-Ja; Park, Sungae; Yun, Soon-Nyoung; Song, Misoon; Jung, Yoenyi; Kim, Haewon; Kim, Dong-Oak Debbie; Choi, Heejung; Kim, Kyungae
2007-06-01
The purpose of the paper is to discover the patterns and processes of decision-making in clinical nursing practice. A set of think-aloud data from five critical care nurses during 40 to 50 minutes of caregiving in intensive care units was obtained and analyzed by applying the procedures recommended by Ericsson and Simon for protocol analysis. Four thinking processes preceding action were identified (reviewing, validation, consideration, and rationalization), which together with action constituted the various sorts of thoughts in which the nurses were engaged during patient care. In addition, three patterns of sequential streaming of thinking (short, intermediate, long) were identified, revealing various ways the nurses dealt with clinical situations involving nursing tasks and responsibilities. This study specifies the initial categories of thoughts for each of the processes and the various patterns with which these processes are sequentially combined, providing insights into the ways nurses think about problems and address their concerns. The findings suggest that thinking in clinical practice involves more than focused decision-making and reasoning, and needs to be examined from a broader perspective.
Posterior parietal cortex mediates encoding and maintenance processes in change blindness.
Tseng, Philip; Hsu, Tzu-Yu; Muggleton, Neil G; Tzeng, Ovid J L; Hung, Daisy L; Juan, Chi-Hung
2010-03-01
It is commonly accepted that right posterior parietal cortex (PPC) plays an important role in updating spatial representations, directing visuospatial attention, and planning actions. However, recent studies suggest that right PPC may also be involved in processes that are more closely associated with our visual awareness as its activation level positively correlates with successful conscious change detection (Beck, D.M., Rees, G., Frith, C.D., & Lavie, N. (2001). Neural correlates of change detection and change blindness. Nature Neuroscience, 4, 645-650.). Furthermore, disruption of its activity increases the occurrences of change blindness, thus suggesting a causal role for right PPC in change detection (Beck, D.M., Muggleton, N., Walsh, V., & Lavie, N. (2006). Right parietal cortex plays a critical role in change blindness. Cerebral Cortex, 16, 712-717.). In the context of a 1-shot change detection paradigm, we applied transcranial magnetic stimulation (TMS) during different time intervals to elucidate the temporally precise involvement of PPC in change detection. While subjects attempted to detect changes between two image sets separated by a brief time interval, TMS was applied either during the presentation of picture 1 when subjects were encoding and maintaining information into visual short-term memory, or picture 2 when subjects were retrieving information relating to picture 1 and comparing it to picture 2. Our results show that change blindness occurred more often when TMS was applied during the viewing of picture 1, which implies that right PPC plays a crucial role in the processes of encoding and maintaining information in visual short-term memory. In addition, since our stimuli did not involve changes in spatial locations, our findings also support previous studies suggesting that PPC may be involved in the processes of encoding non-spatial visual information (Todd, J.J. & Marois, R. (2004). Capacity limit of visual short-term memory in human posterior parietal cortex. Nature, 428, 751-754.). Copyright (c) 2009 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Young, Gerald W.; Clemons, Curtis B.
2004-01-01
The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.
Applications Integration Strategy in the Mission Development Process
NASA Astrophysics Data System (ADS)
Cox, E. L., Jr.
2016-12-01
NASA's Earth Science Applied Science Program has worked for the past four to five years with the Earth Science Division's Flight Program to cultivate an understanding of the importance of satellite remote sensing impacts on decision-making policy and decision support tools utilized by academia, state and local governments, other government agencies, private sector companies, and non-profit organizations. It has long been recognized that applications projects and studies in areas such as Health and Air Quality, Water Resources, Disasters, and Ecological Forecasting have benefited from and been enhanced through the use of satellite remote sensing. Applications researchers often use remote sensing data once it becomes available after the post-launch evaluation phase, in whatever format and level of fidelity is available. The results from the many applications projects over the years have been significant, and there are countless examples of improvements and enhancements to operational systems and decision-making policies in the Applied Sciences community. However, feedback received from the applications community regarding the need for improved data availability, latency, processing, and formatting, among other areas, prompted the idea of applied science involvement early in the life cycle of mission development. Over time, the Applied Science Program personnel have learned a great deal from the flight mission development life cycle process and recognized key areas of alignment. This presentation will discuss specific aspects of applied science that investigators should consider when proposing to future announcements involving an applications dimension. The Program's experience with user community needs, decision-making requirements, and stakeholder operations requirements will be highlighted.
A quantitative framework for estimating risk of collision between marine mammals and boats
Martin, Julien; Sabatier, Quentin; Gowan, Timothy A.; Giraud, Christophe; Gurarie, Eliezer; Calleson, Scott; Ortega-Ortiz, Joel G.; Deutsch, Charles J.; Rycyk, Athena; Koslovsky, Stacie M.
2016-01-01
By applying encounter rate theory to the case of boat collisions with marine mammals, we gained new insights about encounter processes between wildlife and watercraft. Our work emphasizes the importance of considering uncertainty when estimating wildlife mortality. Finally, our findings are relevant to other systems and ecological processes involving the encounter between moving agents.
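For background, the classical two-dimensional encounter-rate expression from kinetic theory, which encounter-rate frameworks of this kind typically build on (shown as context, not as this paper's own formula), is

    \lambda = 2 R \rho \, \bar{v}_{\mathrm{rel}},

where R is the combined encounter radius, \rho is the density of moving vessels, and \bar{v}_{\mathrm{rel}} is the mean relative speed between vessel and animal; the framework above emphasizes, in particular, accounting for uncertainty in such estimates.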
Code of Federal Regulations, 2010 CFR
2010-10-01
... the floodplain management and wetlands protection decision-making process, as set out below in § 9.6...) Decision-making involving certain categories of actions. The provisions set forth in this regulation are... apply steps 1, 2, 4, 5 and 8 of the decision-making process (§§ 9.7, 9.8, 9.10 and 9.11, see § 9.6...
This NESHAP applies to facilities using pressure or thermal treatment processes involving wood preservatives containing chromium, arsenic, dioxins, or methylene chloride. Page includes rule summary, rule history and additional documents.
This NESHAP applies to facilities using pressure or thermal treatment processes involving wood preservatives containing chromium, arsenic, dioxins, or methylene chloride. Includes Federal Register citations, rule history, and additional resources.
NASA Astrophysics Data System (ADS)
Iramina, Keiji; Ge, Sheng; Hyodo, Akira; Hayami, Takehito; Ueno, Shoogo
2009-04-01
In this study, we applied transcranial magnetic stimulation (TMS) to investigate the temporal aspect of the functional processing of visual attention. Although it has been known that the right posterior parietal cortex (PPC) has a role in certain visual search tasks, there is little knowledge about the temporal aspect of this involvement. Three visual search tasks of different difficulty were carried out: the "easy feature task," the "hard feature task," and the "conjunction task." To investigate the temporal aspect of the PPC's involvement in visual search, we applied various stimulus onset asynchronies (SOAs) and measured the reaction time of the visual search. The magnetic stimulation was applied over the right PPC or the left PPC with a figure-eight coil. The results show that the reaction times of the hard feature task are longer than those of the easy feature task. When SOA=150 ms, compared with the no-TMS condition, there was a significant increase in target-present reaction time when TMS pulses were applied. We conclude that the right PPC was involved in the visual search at about SOA=150 ms after visual stimulus presentation. The magnetic stimulation to the right PPC disturbed the processing of the visual search, whereas magnetic stimulation to the left PPC had no effect on the processing of the visual search.
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
A framework of knowledge creation processes in participatory simulation of hospital work systems.
Andersen, Simone Nyholm; Broberg, Ole
2017-04-01
Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.
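A toy illustration of the kind of relation a process mining analysis starts from (directly-follows counts over an activity log). The log below is invented, with activity labels based on the three activities named in the abstract:

    from collections import Counter

    # Made-up event log: one activity sequence per simulation session,
    # using the three activities named in the abstract as labels.
    event_log = [
        ["share_experience", "experiment_scenario", "reflect_on_ergonomics",
         "experiment_scenario", "reflect_on_ergonomics"],
        ["share_experience", "reflect_on_ergonomics", "experiment_scenario"],
    ]

    # Directly-follows counts: the basic relation many process-discovery
    # algorithms are built on.
    directly_follows = Counter()
    for trace in event_log:
        directly_follows.update(zip(trace, trace[1:]))

    for (a, b), n in directly_follows.most_common():
        print(f"{a} -> {b}: {n}")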
Generic equilibration dynamics of planar defects in trapped atomic superfluids
Scherpelz, Peter; Padavić, Karmela; Murray, Andy; ...
2015-03-18
Here, we investigate equilibration processes shortly after sudden perturbations are applied to ultracold trapped superfluids. We show the similarity of phase imprinting and localized density depletion perturbations, both of which are initially found to produce "phase walls". These planar defects are associated with a sharp gradient in the phase. Importantly, they relax following a quite general sequence. Our studies, based on simulations of the complex time-dependent Ginzburg-Landau equation, address the challenge posed by these experiments: how a superfluid eventually eliminates a spatially extended planar defect. The processes involved are necessarily more complex than equilibration involving simpler line vortices. An essential mechanism for relaxation involves repeated formation and loss of vortex rings near the trap edge.
ERIC Educational Resources Information Center
Dodd, Carol Ann
This study explores a technique for evaluating teacher education programs in terms of teaching competencies, as applied to the Indiana University Mathematics Methods Program (MMP). The evaluation procedures formulated for the study include a process product design in combination with a modification of Pophan's performance test paradigm and Gage's…
Verdejo-Román, Juan; Fornito, Alex; Soriano-Mas, Carles; Vilar-López, Raquel; Verdejo-García, Antonio
2017-02-01
Overvaluation of palatable food is a primary driver of obesity, and is associated with brain regions of the reward system. However, it remains unclear if this network is specialized in food reward, or generally involved in reward processing. We used functional magnetic resonance imaging (fMRI) to characterize functional connectivity during processing of food and monetary rewards. Thirty-nine adults with excess weight and 37 adults with normal weight performed the Willingness to Pay for Food task and the Monetary Incentive Delay task in the fMRI scanner. A data-driven graph approach was applied to compare whole-brain, task-related functional connectivity between groups. Excess weight was associated with decreased functional connectivity during the processing of food rewards in a network involving primarily frontal and striatal areas, and increased functional connectivity during the processing of monetary rewards in a network involving principally frontal and parietal areas. These two networks were topologically and anatomically distinct, and were independently associated with BMI. The processing of food and monetary rewards involve segregated neural networks, and both are altered in individuals with excess weight. Copyright © 2016 Elsevier Inc. All rights reserved.
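A minimal sketch of one common way to build a functional connectivity graph from ROI time series (correlation plus thresholding); the study's data-driven graph approach is more involved, and the threshold here is arbitrary:

    import numpy as np

    def connectivity_graph(roi_timeseries, edge_threshold=0.3):
        """Correlation-based functional connectivity graph.

        roi_timeseries: array of shape (n_timepoints, n_rois).
        Returns the correlation matrix and a binary adjacency matrix.
        """
        corr = np.corrcoef(roi_timeseries, rowvar=False)
        adjacency = (np.abs(corr) > edge_threshold).astype(int)
        np.fill_diagonal(adjacency, 0)
        return corr, adjacency

    # Example with random data: 200 timepoints, 10 regions of interest.
    rng = np.random.default_rng(1)
    corr, adj = connectivity_graph(rng.standard_normal((200, 10)))
    print(adj.sum() // 2, "edges")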
ERIC Educational Resources Information Center
Plonsky, Luke; Brown, Dan
2015-01-01
Applied linguists have turned increasingly in recent years to meta-analysis as the preferred means of synthesizing quantitative research. The first step in the meta-analytic process involves defining a domain of interest. Despite its apparent simplicity, this step involves a great deal of subjectivity on the part of the meta-analyst. This article…
A comparison of the environmental impact of different AOPs: risk indexes.
Giménez, Jaime; Bayarri, Bernardí; González, Óscar; Malato, Sixto; Peral, José; Esplugas, Santiago
2014-12-31
Today, environmental impact associated with pollution treatment is a matter of great concern. A method is proposed for evaluating environmental risk associated with Advanced Oxidation Processes (AOPs) applied to wastewater treatment. The method is based on the type of pollution (wastewater, solids, air or soil) and on materials and energy consumption. An Environmental Risk Index (E), constructed from numerical criteria provided, is presented for environmental comparison of processes and/or operations. The Operation Environmental Risk Index (EOi) for each of the unit operations involved in the process and the Aspects Environmental Risk Index (EAj) for process conditions were also estimated. Relative indexes were calculated to evaluate the risk of each operation (E/NOP) or aspect (E/NAS) involved in the process, and the percentage of the maximum achievable for each operation and aspect was found. A practical application of the method is presented for two AOPs: photo-Fenton and heterogeneous photocatalysis with suspended TiO2 in Solarbox. The results report the environmental risks associated with each process, so that AOPs tested and the operations involved with them can be compared.
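A purely illustrative aggregation of how operation and aspect scores might be combined into an overall index and relative indexes; the operation/aspect labels and scores are invented, and the paper's actual numerical criteria and weighting are not reproduced:

    # Invented scores for illustration only.
    operation_scores = {"photoreactor": 3, "pumping": 1, "chemical_dosing": 2}
    aspect_scores = {"energy_use": 4, "reagent_consumption": 2, "sludge": 1}

    E = sum(operation_scores.values()) + sum(aspect_scores.values())  # overall index
    E_per_operation = E / len(operation_scores)   # relative index, E/N_OP
    E_per_aspect = E / len(aspect_scores)         # relative index, E/N_AS

    print(E, round(E_per_operation, 2), round(E_per_aspect, 2))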
Ventilation in the patient with unilateral lung disease.
Thomas, A R; Bryce, T L
1998-10-01
Severe ULD presents a challenge in ventilator management because of the marked asymmetry in the mechanics of the two lungs. The asymmetry may result from significant decreases or increases in the compliance of the involved lung. Traditional ventilator support may fail to produce adequate gas exchange in these situations and has the potential to cause further deterioration. Fortunately, conventional techniques can be safely and effectively applied in the majority of cases without having to resort to less familiar and potentially hazardous forms of support. In those circumstances when conventional ventilation is unsuccessful in restoring adequate gas exchange, lateral positioning and ILV have proved effective at improving and maintaining gas exchange. Controlled trials to guide clinical decision making are lacking. In patients who have processes associated with decreased compliance in the involved lung, lateral positioning may be a simple method of improving gas exchange but is associated with many practical limitations. ILV in these patients is frequently successful when differential PEEP is applied with the higher pressure to the involved lung. In patients in whom the pathology results in distribution of ventilation favoring the involved lung, particularly BPF, ILV can be used to supply adequate support while minimizing flow through the fistula and allowing it to close. The application of these techniques should be undertaken with an understanding of the pathophysiology of the underlying process; the reported experience with these techniques, including indications and successfully applied methods; and the potential problems encountered with their use. Fortunately, these modalities are infrequently required, but they provide a critical means of support when conventional techniques fail.
ERIC Educational Resources Information Center
Veloo, Arsaythamby; Krishnasamy, Hariharan N.; Harun, Hana Mulyani
2015-01-01
The purpose of this study is to determine gender differences and type of learning approaches among Universiti Utara Malaysia (UUM) undergraduate students in English writing performance. The study involved 241 (32.8% male & 67.2% female) undergraduate students of UUM who were taking the Process Writing course. This study uses a Two-Factor Study…
ERIC Educational Resources Information Center
Kozbelt, Aaron; Dexter, Scott; Dolese, Melissa; Meredith, Daniel; Ostrofsky, Justin
2015-01-01
We applied computer-based text analyses of regressive imagery to verbal protocols of individuals engaged in creative problem-solving in two domains: visual art (23 experts, 23 novices) and computer programming (14 experts, 14 novices). Percentages of words involving primary process and secondary process thought, plus emotion-related words, were…
Biomedical device innovation methodology: applications in biophotonics
NASA Astrophysics Data System (ADS)
Beswick, Daniel M.; Kaushik, Arjun; Beinart, Dylan; McGarry, Sarah; Yew, Ming Khoon; Kennedy, Brendan F.; Maria, Peter Luke Santa
2018-02-01
The process of medical device innovation involves an iterative method that focuses on designing innovative, device-oriented solutions that address unmet clinical needs. This process has been applied to the field of biophotonics with many notable successes. Device innovation begins with identifying an unmet clinical need and evaluating this need through a variety of lenses, including currently existing solutions for the need, stakeholders who are interested in the need, and the market that will support an innovative solution. Only once the clinical need is understood in detail can the invention process begin. The ideation phase often involves multiple levels of brainstorming and prototyping with the aim of addressing technical and clinical questions early and in a cost-efficient manner. Once potential solutions are found, they are tested against a number of known translational factors, including intellectual property, regulatory, and reimbursement landscapes. Only when the solution matches the clinical need should the next phase of building a "to market" strategy begin. Most aspects of the innovation process can be conducted relatively quickly and without significant capital expense. This white paper focuses on key points of the medical device innovation method and how the field of biophotonics has been applied within this framework to generate clinical and commercial success.
Gerfo, Emanuele Lo; Oliveri, Massimiliano; Torriero, Sara; Salerno, Silvia; Koch, Giacomo; Caltagirone, Carlo
2008-01-31
We investigated the differential role of two frontal regions in the processing of grammatical and semantic knowledge. Given the documented specificity of the prefrontal cortex for the grammatical class of verbs, and of the primary motor cortex for the semantic class of action words, we sought to investigate whether the prefrontal cortex is also sensitive to semantic effects, and whether the motor cortex is also sensitive to grammatical class effects. We used repetitive transcranial magnetic stimulation (rTMS) to suppress the excitability of a portion of left prefrontal cortex (first experiment) and of the motor area (second experiment). In the first experiment we found that rTMS applied to the left prefrontal cortex delays the retrieval of action verbs, but is not critical for retrieval of state verbs and state nouns. In the second experiment we found that rTMS applied to the left motor cortex delays the processing of action words, both nouns and verbs, while it is not critical for the processing of state words. These results support the notion that left prefrontal and motor cortex are involved in the process of action word retrieval. Left prefrontal cortex subserves processing of both grammatical and semantic information, whereas motor cortex contributes to the processing of semantic representation of action words without any involvement in the representation of grammatical categories.
ERIC Educational Resources Information Center
Clayton, Lynette; Robinson, Luther D.
1971-01-01
Observations based on psychodrama with deaf people, relating to interaction between people and the communication process, are made. How role training skills, which involve some of the skills of psychodrama, can be applied by professionals in vocational and social learning situations is illustrated. (KW)
Greener and Sustainable Trends in Synthesis of Organics and Nanomaterials
Trends in greener and sustainable process development during the past 25 years are abridged involving the use of alternate energy inputs (mechanochemistry, ultrasound- or microwave irradiation), photochemistry, and greener reaction media as applied to synthesis of organics and na...
Resilience as Regulation of Developmental and Family Processes
MacPhee, David; Lunkenheimer, Erika; Riggs, Nathaniel
2015-01-01
Resilience can be defined as establishing equilibrium subsequent to disturbances to a system caused by significant adversity. When families experience adversity or transitions, multiple regulatory processes may be involved in establishing equilibrium, including adaptability, regulation of negative affect, and effective problem-solving skills. The authors’ resilience-as-regulation perspective integrates insights about the regulation of individual development with processes that regulate family systems. This middle-range theory of family resilience focuses on regulatory processes across levels that are involved in adaptation: whole-family systems such as routines and sense of coherence; coregulation of dyads involving emotion regulation, structuring, and reciprocal influences between social partners; and individual self-regulation. Insights about resilience-as-regulation are then applied to family-strengthening interventions that are designed to promote adaptation to adversity. Unresolved issues are discussed in relation to resilience-as-regulation in families, in particular how risk exposure is assessed, interrelations among family regulatory mechanisms, and how families scaffold the development of children’s resilience. PMID:26568647
Twenty-First Century Research Needs in Electrostatic Processes Applied to Industry and Medicine
NASA Technical Reports Server (NTRS)
Mazumder, M. K.; Sims, R. A.; Biris, A. S.; Srirama, P. K.; Saini, D.; Yurteri, C. U.; Trigwell, S.; De, S.; Sharma, R.
2005-01-01
From the early 20th-century, Nobel Prize-winning (1923) experiments with charged oil droplets, resulting in the discovery of the elementary electronic charge by Robert Millikan, to the early 21st century Nobel Prize (2002) awarded to John Fenn for his invention of electrospray ionization mass spectrometry and its applications to proteomics, electrostatic processes have been successfully applied to many areas of industry and medicine. Generation, transport, deposition, separation, analysis, and control of charged particles involved in the four states of matter (solid, liquid, gas, and plasma) are of interest in many industrial and biomedical processes. In this paper, we briefly discuss some of the applications and research needs involving charged particles in industrial and medical applications including: (1) Generation and deposition of unipolarly charged dry powder without the presence of ions or excessive ozone, (2) Control of the tribocharging process for consistent and reliable charging, (3) Thin-film (less than 25 micrometers) powder coating and powder coating on insulative surfaces, (4) Fluidization and dispersion of fine powders, (5) Mitigation of Mars dust, (6) Effect of particle charge on the lung deposition of inhaled medical aerosols, (7) Nanoparticle deposition, and (8) Plasma/Corona discharge processes. A brief discussion on the measurements of charged particles and suggestions for research needs are also included.
Transcranial Direct Current Stimulation Effects on Semantic Processing in Healthy Individuals.
Joyal, Marilyne; Fecteau, Shirley
2016-01-01
Semantic processing allows us to use conceptual knowledge about the world. It has been associated with a large distributed neural network that includes the frontal, temporal and parietal cortices. Recent studies using transcranial direct current stimulation (tDCS) have also contributed to investigating semantic processing. The goal of this article was to review studies investigating semantic processing in healthy individuals with tDCS and discuss findings from these studies in line with neuroimaging results. Based on functional magnetic resonance imaging studies assessing semantic processing, we predicted that tDCS applied over the inferior frontal gyrus, middle temporal gyrus, and posterior parietal cortex would impact semantic processing. We conducted a search on Pubmed and selected 27 articles in which tDCS was used to modulate semantic processing in healthy subjects. We analysed each article according to these criteria: demographic information, experimental outcomes assessing semantic processing, study design, and effects of tDCS on semantic processes. From the 27 reviewed studies, 8 found main effects of stimulation. In addition to these 8 studies, 17 studies reported an interaction between stimulus types and stimulation conditions (e.g. incoherent functional, but not instrumental, actions were processed faster when anodal tDCS was applied over the posterior parietal cortex as compared to sham tDCS). Results suggest that regions in the frontal, temporal, and parietal cortices are involved in semantic processing. tDCS can modulate some aspects of semantic processing and provide information on the functional roles of brain regions involved in this cognitive process. Copyright © 2016 Elsevier Inc. All rights reserved.
Photoactivated methods for enabling cartilage-to-cartilage tissue fixation
NASA Astrophysics Data System (ADS)
Sitterle, Valerie B.; Roberts, David W.
2003-06-01
The present study investigates whether photoactivated attachment of cartilage can provide a viable method for more effective repair of damaged articular surfaces by providing an alternative to sutures, barbs, or fibrin glues for initial fixation. Unlike artificial materials, biological constructs do not possess the initial strength for press-fitting and are instead sutured or pinned in place, typically inducing even more tissue trauma. A possible alternative involves the application of a photosensitive material, which is then photoactivated with a laser source to attach the implant and host tissues together in either a photothermal or photochemical process. The photothermal version of this method shows potential, but has been almost entirely applied to vascularized tissues. Cartilage, however, exhibits several characteristics that produce appreciable differences between applying and refining these techniques when compared to previous efforts involving vascularized tissues. Preliminary investigations involving photochemical photosensitizers based on singlet oxygen and electron transfer mechanisms are discussed, and characterization of the photodynamic effects on bulk collagen gels as a simplified model system using FTIR is performed. Previous efforts using photothermal welding applied to cartilaginous tissues are reviewed.
NASA Astrophysics Data System (ADS)
Kuncser, A.; Antohe, S.; Kuncser, V.
2017-02-01
Peculiarities of the magnetization reversal process in cylindrical Ni-Cu soft magnetic nanowires with dominant shape anisotropy are analyzed via both static and time-dependent micromagnetic simulations. A reversible process involving a coherent-like spin rotation is always observed for magnetic fields applied perpendicularly to the easy axis, whereas nucleation of domain walls occurs for fields applied along the easy axis. Simple criteria for distinguishing between a Stoner-Wohlfarth type rotation and a nucleation mechanism in systems with uniaxial magnetic anisotropy are discussed. Superposed reversal mechanisms can be in action for magnetic fields applied at arbitrary angles with respect to the easy axis, provided the axial field component is strong enough for nucleation to occur. The dynamics of the domain wall, involving two different stages (nucleation and propagation), is discussed with respect to initial computing conditions and orientations of the magnetic field. A nucleation time of about 3 ns and corkscrew domain walls propagating with a constant velocity of about 150 m/s are obtained in the case of Ni-Cu alloy (Ni-rich) nanowires with diameters of 40 nm and high aspect ratios.
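To make the coherent-rotation versus nucleation distinction concrete, the textbook Stoner-Wohlfarth (single-domain, coherent rotation) relations are recalled below; these are standard results and are not the paper's micromagnetic model.

    % Energy density for uniaxial anisotropy K_u with field H applied at angle \psi
    % to the easy axis, and magnetization at angle \theta from the easy axis:
    E(\theta) = K_u \sin^2\theta \;-\; \mu_0 M_s H \cos(\psi - \theta)
    % Anisotropy field and the coherent-rotation (astroid) switching field:
    H_K = \frac{2 K_u}{\mu_0 M_s}, \qquad
    H_{\mathrm{sw}}(\psi) = \frac{H_K}{\left(\cos^{2/3}\psi + \sin^{2/3}\psi\right)^{3/2}}

In this idealized picture, fields exceeding H_sw at a given angle switch the magnetization by coherent rotation, whereas in the simulated nanowires an axial field component strong enough to nucleate a domain wall changes the reversal mechanism.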
Effects Based Operations: Applying Network Centric Warfare in Peace, Crisis, and War
2003-01-01
understanding what factors play in the cognitive process of the leadership (e.g., the survival of the organization), and then threatening those...concept. To address the “why” of the stimulus and response, we must understand something of the cognitive processes involved in observing and responding to...between the cognitive domain and the information domain. It is the problem of communicating understanding
King, James Claude
1976-01-13
The disclosure is directed to a method for processing quartz used in fabricating crystal resonators such that transient frequency change of resonators exposed to pulse irradiation is virtually eliminated. The method involves heating the crystal quartz in a hydrogen-free atmosphere while simultaneously applying an electric field in the Z-axis direction of the crystal. The electric field is maintained during the cool-down phase of the process.
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Recapturing Graphite-Based Fuel Element Technology for Nuclear Thermal Propulsion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trammell, Michael P; Jolly, Brian C; Miller, James Henry
ORNL is currently recapturing graphite-based fuel forms for Nuclear Thermal Propulsion (NTP). This effort involves research and development on materials selection, extrusion, and coating processes to produce fuel elements representative of historical ROVER and NERVA fuel. Initially, lab-scale specimens were fabricated using surrogate oxides to develop processing parameters that could be applied to full-length NTP fuel elements. Progress toward understanding the effect of these processing parameters on surrogate fuel microstructure is presented.
Listening to sound patterns as a dynamic activity
NASA Astrophysics Data System (ADS)
Jones, Mari Riess
2003-04-01
The act of listening to a series of sounds created by some natural event is described as involving an entrainment-like process that transpires in real time. Some aspects of this dynamic process are suggested. In particular, real-time attending is described in terms of an adaptive synchronization activity that permits a listener to target attending energy to forthcoming elements within an acoustical pattern (e.g., music, speech, etc.). Also described are several experiments that illustrate features of this approach as it applies to attending to music-like patterns. These involve listeners' responses to changes in either the timing or the pitch structure (or both) of various acoustical sequences.
Cognitive pragmatics of language disorders in adults.
Davis, G Albyn
2007-05-01
Cognitive pragmatics is the study of the mental structures and processes involved in the use of language in communicative contexts. Paradigms of cognitive psychology (off-line and on-line) have been applied to the study of the abilities to go beyond the literal (inference) and derive meaning in relation to context (e.g., metaphor and sarcasm). These pragmatic functions have been examined for the involvement of processes of meaning activation, embellishment, and revision. Clinical investigators have explored abilities and deficits in acquired aphasia, right hemisphere dysfunction, and closed head injury. This article reviews and provides some analysis of clinical studies that are consistent with the themes constituting cognitive pragmatics.
Ethical questions for resource managers.
G.H. Reeves; D.L. Bottom; M.H. Brookes
1992-01-01
The decisions of natural resource managers are not simply scientific issues but involve fundamental questions of ethics. Conflicts in fisheries management, forestry, and other applied sciences arise from social and economic factors that affect natural resource values. Administrative processes, cost-benefit analyses, and various management "myths" have been...
Predictive failure analysis: planning for the worst so that it never happens!
Hipple, Jack
2008-01-01
This article reviews an alternative approach to failure analysis involving a deliberate saboteurial approach rather than a checklist approach to disaster and emergency preparedness. This process is in the form of an algorithm that is easily applied to any planning situation.
ERIC Educational Resources Information Center
Garner, Waunita L.
The paper explains how Project LIFE (Language Improvement to Facilitate Education) has applied the principles of programed instruction in developing language materials for language handicapped children, especially the hearing impaired. Early strategy decisions are said to have involved obtaining a teaching machine which would be equipped with a…
Remote Sensing as a Demonstration of Applied Physics.
ERIC Educational Resources Information Center
Colwell, Robert N.
1980-01-01
Provides information about the field of remote sensing, including discussions of geo-synchronous and sun-synchronous remote-sensing platforms, the actual physical processes and equipment involved in sensing, the analysis of images by humans and machines, and inexpensive, small scale methods, including aerial photography. (CS)
Baracat-Pereira, Maria Cristina; de Oliveira Barbosa, Meire; Magalhães, Marcos Jorge; Carrijo, Lanna Clicia; Games, Patrícia Dias; Almeida, Hebréia Oliveira; Sena Netto, José Fabiano; Pereira, Matheus Rodrigues; de Barros, Everaldo Gonçalves
2012-06-01
The enrichment and isolation of proteins are considered limiting steps in proteomic studies. Identification of proteins whose expression is transient, those that are of low-abundance, and of natural peptides not described in databases, is still a great challenge. Plant extracts are in general complex, and contaminants interfere with the identification of proteins involved in important physiological processes, such as plant defense against pathogens. This review discusses the challenges and strategies of separomics applied to the identification of low-abundance proteins and peptides in plants, especially in plants challenged by pathogens. Separomics is described as a group of methodological strategies for the separation of protein molecules for proteomics. Several tools have been used to remove highly abundant proteins from samples and also non-protein contaminants. The use of chromatographic techniques, the partition of the proteome into subproteomes, and an effort to isolate proteins in their native form have allowed the isolation and identification of rare proteins involved in different processes.
Baracat-Pereira, Maria Cristina; de Oliveira Barbosa, Meire; Magalhães, Marcos Jorge; Carrijo, Lanna Clicia; Games, Patrícia Dias; Almeida, Hebréia Oliveira; Sena Netto, José Fabiano; Pereira, Matheus Rodrigues; de Barros, Everaldo Gonçalves
2012-01-01
The enrichment and isolation of proteins are considered limiting steps in proteomic studies. Identification of proteins whose expression is transient, those that are of low-abundance, and of natural peptides not described in databases, is still a great challenge. Plant extracts are in general complex, and contaminants interfere with the identification of proteins involved in important physiological processes, such as plant defense against pathogens. This review discusses the challenges and strategies of separomics applied to the identification of low-abundance proteins and peptides in plants, especially in plants challenged by pathogens. Separomics is described as a group of methodological strategies for the separation of protein molecules for proteomics. Several tools have been used to remove highly abundant proteins from samples and also non-protein contaminants. The use of chromatographic techniques, the partition of the proteome into subproteomes, and an effort to isolate proteins in their native form have allowed the isolation and identification of rare proteins involved in different processes. PMID:22802713
PUBLIC AND PATIENT INVOLVEMENT IN HEALTH TECHNOLOGY ASSESSMENT: A FRAMEWORK FOR ACTION.
Abelson, Julia; Wagner, Frank; DeJean, Deirdre; Boesveld, Sarah; Gauvin, François-Pierre; Bean, Sally; Axler, Renata; Petersen, Stephen; Baidoobonso, Shamara; Pron, Gaylene; Giacomini, Mita; Lavis, John
2016-01-01
As health technology assessment (HTA) organizations in Canada and around the world seek to involve the public and patients in their activities, frameworks to guide decisions about whom to involve, through which mechanisms, and at what stages of the HTA process have been lacking. The aim of this study was to describe the development and outputs of a comprehensive framework for involving the public and patients in a government agency's HTA process. The framework was informed by a synthesis of international practice and published literature, a dialogue with local, national and international stakeholders, and the deliberations of a government agency's public engagement subcommittee in Ontario, Canada. The practice and literature synthesis failed to identify a single, optimal approach to involving the public and patients in HTA. Choice of methods should be considered in the context of each HTA stage, goals for incorporating societal and/or patient perspectives into the process, and relevant societal and/or patient values at stake. The resulting framework is structured around four actionable elements: (i) guiding principles and goals for public and patient involvement (PPI) in HTA, (ii) the establishment of a common language to support PPI efforts, (iii) a flexible array of PPI approaches, and (iv) on-going evaluation of PPI to inform adjustments over time. A public and patient involvement framework has been developed for implementation in a government agency's HTA process. Core elements of this framework may apply to other organizations responsible for HTA and health system quality improvement.
Analysis of hot forming of a sheet metal component made of advanced high strength steel
NASA Astrophysics Data System (ADS)
Demirkaya, Sinem; Darendeliler, Haluk; Gökler, Mustafa İlhan; Ayhaner, Murat
2013-05-01
To provide reduction in weight while maintaining crashworthiness and to decrease the fuel consumption of vehicles, thinner components made of Advanced High Strength Steels (AHSS) are being increasingly used in the automotive industry. However, AHSS cannot be formed easily at room temperature (i.e. cold forming). The alternative process involves heating, hot forming and subsequent quenching. The A-pillar upper reinforcement of a vehicle is currently being produced by cold forming of DP600 steel sheet with a thickness of 1.8 mm. In this study, the possible decrease in the thickness of this particular part by using 22MnB5 as an appropriate AHSS material and applying this alternative process is investigated. The proposed process involves deep drawing, trimming, heating, sizing, cooling and piercing operations. Both the current production process and the proposed process are analyzed by the finite element method. The die geometry, blank holding forces and the design of the cooling channels for the cooling process are determined numerically. It is shown that the particular part made of 22MnB5 steel sheet with a thickness of 1.2 mm can be successfully produced by applying the proposed process sequence and can be used without sacrificing crashworthiness. With the use of the 22MnB5 steel with a thickness of 1.2 mm instead of DP600 sheet metal with a thickness of 1.8 mm, the weight is reduced by approximately 33%.
Cleaning process for EUV optical substrates
Weber, Frank J.; Spiller, Eberhard A.
1999-01-01
A cleaning process for surfaces with very demanding cleanliness requirements, such as extreme-ultraviolet (EUV) optical substrates. Proper cleaning of optical substrates prior to applying reflective coatings thereon is very critical in the fabrication of the reflective optics used in EUV lithographic systems, for example. The cleaning process involves ultrasonic cleaning in acetone, methanol, and a pH neutral soap, such as FL-70, followed by rinsing in de-ionized water and drying with dry filtered nitrogen in conjunction with a spin-rinse.
Addition to the Lewis Chemical Equilibrium Program to allow computation from coal composition data
NASA Technical Reports Server (NTRS)
Sevigny, R.
1980-01-01
Changes made to the Coal Gasification Project are reported. The program was originally developed for equilibrium combustion calculations in rocket engines. It can be applied directly to the entrained flow coal gasification process. The particular problem addressed is the reduction of the coal data into a form suitable for the program, since the manual process is involved and error-prone. A similar problem in relating the normal output of the program to parameters meaningful to the coal gasification process is also addressed.
Jia, Lei; Dickter, Cheryl L; Luo, Junlong; Xiao, Xiao; Yang, Qun; Lei, Ming; Qiu, Jiang; Zhang, Qinglin
2012-01-01
Stereotyping involves two processes in which first, social stereotypes are activated (stereotype activation), and then, stereotypes are applied to given targets (stereotype application). Previous behavioral studies have suggested that these two processes are independent of each other and may have different mechanisms. As few psychophysiological studies have given an integrated account of these stages in stereotyping so far, this study utilized a trait categorization task in which event-related potentials (ERPs) were used to explore the brain mechanisms associated with the processes of stereotype activation and its application. The behavioral (reaction time) and electrophysiological data showed that stereotype activation and application were elicited respectively in an affective valence identification subtask and in a semantic content judgment subtask. The electrophysiological results indicated that the categorization processes involved in stereotype activation to quickly identify stereotypic and nonstereotypic information were quite different from those involved in the application. During the process of stereotype activation, a P2 and N2 effect was observed, indicating that stereotype activation might be facilitated by an early attentional bias. Also, a late positive potential (LPP) was elicited, suggesting that social expectancy violation might be involved. During the process of the stereotype application, electrophysiological data showed a P2 and P3 effect, indicating that stereotype application might be related to the rapid social knowledge identification in semantic representation and thus may be associated with an updating of existing stereotypic contents or a motivation to resolve the inconsistent information. This research strongly suggested that different mechanisms are involved in the stereotype activation and application processes.
Software Process Improvement Journey: IBM Australia Application Management Services
2005-03-01
learned from its successes and mistakes and then applied that learning to the next project. ... worldwide requirements for project management and quality; it was the organization's staff members who played a part in the development of the ... environment and that it involves personnel from a variety of areas, ideally not part of the group that developed the technology or process
The lexical processing of abstract and concrete nouns.
Papagno, Costanza; Fogliata, Arianna; Catricalà, Eleonora; Miniussi, Carlo
2009-03-31
Recent activation studies have suggested different neural correlates for processing concrete and abstract words. However, the precise localization is far from being defined. One reason for the heterogeneity of these results could lie in the extreme variability of experimental paradigms, ranging from explicit semantic judgments to lexical decision tasks (auditory and/or visual). The present study explored the processing of abstract/concrete nouns by using repetitive Transcranial Magnetic Stimulation (rTMS) and a lexical decision paradigm in neurologically-unimpaired subjects. Four sites were investigated: left inferior frontal, bilaterally posterior-superior temporal and left posterior-inferior parietal. An interference effect on accuracy was found for abstract words when rTMS was applied over the left temporal site, while for concrete words accuracy decreased when rTMS was applied over the right temporal site. Accuracy for abstract words, but not for concrete words, decreased after frontal stimulation as compared to the sham condition. These results suggest that abstract lexical entries are stored in the posterior part of the left superior temporal gyrus and possibly in the left inferior frontal gyrus, while the regions involved in storing concrete items include the right temporal cortex. It cannot be excluded, however, that additional areas, not tested in this experiment, are involved in processing both concrete and abstract nouns.
Application of Technology to Cognitive Development.
ERIC Educational Resources Information Center
Wilson, Louise
This report presents a summary of research being conducted at the University of Minnesota in which new technologies are being applied to development of cognition in hearing impaired learners. The study involved an application of concept analysis, information-processing theories, and group-based interactive technology in the teaching of…
29 CFR 1915.1026 - Chromium (VI).
Code of Federal Regulations, 2012 CFR
2012-07-01
... a specific process, operation, or activity involving chromium cannot release dusts, fumes, or mists... 29 Labor 7 2012-07-01 2012-07-01 false Chromium (VI). 1915.1026 Section 1915.1026 Labor... § 1915.1026 Chromium (VI). (a) Scope. (1) This standard applies to occupational exposures to chromium (VI...
29 CFR 1915.1026 - Chromium (VI).
Code of Federal Regulations, 2011 CFR
2011-07-01
... a specific process, operation, or activity involving chromium cannot release dusts, fumes, or mists... 29 Labor 7 2011-07-01 2011-07-01 false Chromium (VI). 1915.1026 Section 1915.1026 Labor... § 1915.1026 Chromium (VI). (a) Scope. (1) This standard applies to occupational exposures to chromium (VI...
29 CFR 1915.1026 - Chromium (VI).
Code of Federal Regulations, 2013 CFR
2013-07-01
... a specific process, operation, or activity involving chromium cannot release dusts, fumes, or mists... 29 Labor 7 2013-07-01 2013-07-01 false Chromium (VI). 1915.1026 Section 1915.1026 Labor... § 1915.1026 Chromium (VI). (a) Scope. (1) This standard applies to occupational exposures to chromium (VI...
29 CFR 1915.1026 - Chromium (VI).
Code of Federal Regulations, 2014 CFR
2014-07-01
... a specific process, operation, or activity involving chromium cannot release dusts, fumes, or mists... 29 Labor 7 2014-07-01 2014-07-01 false Chromium (VI). 1915.1026 Section 1915.1026 Labor... § 1915.1026 Chromium (VI). (a) Scope. (1) This standard applies to occupational exposures to chromium (VI...
Developing an Experiential Learning Program: Milestones and Challenges
ERIC Educational Resources Information Center
Austin, M. Jill; Rust, Dianna Zeh
2015-01-01
College and University faculty members have increasingly adopted experiential learning teaching methods that are designed to engage students in the learning process. Experiential learning is simply defined as "hands-on" learning and may involve any of the following activities: service learning, applied learning in the discipline,…
Factors Influencing Recruitment in Educational Psychology
ERIC Educational Resources Information Center
Frederickson, Norah
2003-01-01
This paper reports an investigation of the factors that educational psychologists in training (EPiTs) look for when applying for jobs in educational psychology services. Relevant literature on "job attraction" is reviewed and a three-stage research process employed. This involved a focus group approach to questionnaire generation…
Real-Time Teaching: Lessons from Katrina
ERIC Educational Resources Information Center
Phillips, Antoinette S.; Phillips, Carl R.
2008-01-01
Professors strive constantly to find ways for students to apply what they are learning in the classroom, thereby reinforcing principles being taught and increasing student interest and involvement in the learning process. Hurricane Katrina's devastating impact on the Gulf Coast had wide-ranging consequences. As a result, many individuals…
ERIC Educational Resources Information Center
Dowling-Sendor, Benjamin
2000-01-01
In a case involving use of physical force against a misbehaving eighth grader, a federal court judge concluded that the teacher's conduct did not violate the student's due-process rights, after applying the "shock the conscience" test. However, the case proceeded to trial, since district policy supported such force. (MLH)
An experimental study on particle effects in liquid sheets
NASA Astrophysics Data System (ADS)
Sauret, Alban; Troger, Anthony; Jop, Pierre
2017-06-01
Many industrial processes, such as surface coating or liquid transport in tubes, involve liquid sheets or thin films of suspensions. In these situations, the thickness of the liquid film becomes comparable to the particle size, which leads to unexpected dynamics. In addition, the classical constitutive rheological law for suspensions cannot be applied as the continuum approximation is no longer valid. Here, we consider experimentally a transient particle-laden liquid sheet that expands radially. We characterize the influence of the particles on the shape of the liquid film and the atomization process. We highlight that the presence of particles modifies the thickness and stability of the liquid sheet. Our study suggests that the influence of particles through capillary effects can modify significantly the dynamics of processes that involve suspensions and particles confined in liquid films.
Estimation of bladder wall location in ultrasound images.
Topper, A K; Jernigan, M E
1991-05-01
A method of automatically estimating the location of the bladder wall in ultrasound images is proposed. Obtaining this estimate is intended to be the first stage in the development of an automatic bladder volume calculation system. The first step in the bladder wall estimation scheme involves globally processing the images using standard image processing techniques to highlight the bladder wall. Separate processing sequences are required to highlight the anterior bladder wall and the posterior bladder wall. The sequence to highlight the anterior bladder wall involves Gaussian smoothing and second differencing followed by zero-crossing detection. Median filtering followed by thresholding and gradient detection is used to highlight as much of the rest of the bladder wall as was visible in the original images. Then a 'bladder wall follower'--a line follower with rules based on the characteristics of ultrasound imaging and the anatomy involved--is applied to the processed images to estimate the bladder wall location by following the portions of the bladder wall which are highlighted and filling in the missing segments. The results achieved using this scheme are presented.
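A minimal sketch of the two highlighting sequences described above is given below, using numpy/scipy; all parameter values (smoothing width, filter size, threshold) are placeholders rather than those of the paper, and the rule-based wall follower is not reproduced.

    import numpy as np
    from scipy import ndimage

    def highlight_anterior(img, sigma=2.0):
        """Gaussian smoothing + second differencing + zero-crossing detection."""
        smoothed = ndimage.gaussian_filter(img.astype(float), sigma=sigma)
        second_diff = np.diff(smoothed, n=2, axis=0)            # second difference along the depth axis
        zero_cross = np.abs(np.diff(np.sign(second_diff), axis=0)) > 0
        return zero_cross

    def highlight_posterior(img, size=5, thresh=0.3):
        """Median filtering + thresholding + gradient detection."""
        filtered = ndimage.median_filter(img.astype(float), size=size)
        binary = filtered > thresh * filtered.max()
        gy, gx = np.gradient(binary.astype(float))
        return np.hypot(gx, gy) > 0

    img = np.random.rand(128, 128)        # placeholder for an ultrasound B-mode image
    anterior = highlight_anterior(img)
    posterior = highlight_posterior(img)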
BENCH-SCALE PROCESS EVALUATION OF REBURNING AND SORBENT INJECTION FOR IN-FURNACE NOX/SOX REDUCTION
The report gives results of combining reburning with the injection of calcium-based sorbents to investigate the potential for combined NOx and SOx reduction. Reburning, applied to pulverized-coal-fired utility boilers, involves injecting a secondary fuel above the main firing zon...
Measuring Student Engagement during Collaboration
ERIC Educational Resources Information Center
Halpin, Peter F.; von Davier, Alina A.; Hao, Jiangang; Liu, Lei
2017-01-01
This article addresses performance assessments that involve collaboration among students. We apply the Hawkes process to infer whether the actions of one student are associated with increased probability of further actions by his/her partner(s) in the near future. This leads to an intuitive notion of engagement among collaborators, and we consider…
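As a hedged sketch of the modeling idea, the snippet below evaluates a mutually exciting (two-stream) Hawkes conditional intensity with an exponential kernel, so that a partner's recent actions raise a student's instantaneous action rate; parameter values and event times are invented, and this is not the article's exact specification.

    import numpy as np

    def intensity(t, own_events, partner_events, mu, alpha_self, alpha_cross, beta):
        # Conditional intensity of one student's action stream at time t
        own = np.array([s for s in own_events if s < t])
        other = np.array([s for s in partner_events if s < t])
        self_term = alpha_self * beta * np.exp(-beta * (t - own)).sum() if own.size else 0.0
        cross_term = alpha_cross * beta * np.exp(-beta * (t - other)).sum() if other.size else 0.0
        return mu + self_term + cross_term

    events_a = [1.0, 4.2]          # student A's action times (arbitrary units)
    events_b = [3.8, 3.9, 4.0]     # partner B's action times
    # A's intensity just after B's burst of actions reflects cross-excitation:
    print(intensity(4.1, events_a, events_b, mu=0.2, alpha_self=0.1, alpha_cross=0.5, beta=2.0))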
40 CFR 60.560 - Applicability and designation of affected facilities.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) manufacturing processes. (i) Affected facilities with a design capacity to produce less than 1,000 Mg/yr (1,102... Performance for Volatile Organic Compound (VOC) Emissions from the Polymer Manufacturing Industry § 60.560... apply to affected facilities involved in the manufacture of polypropylene, polyethylene, polystyrene, or...
Helping Teachers Use Research Findings: The Consumer-Validation Process.
ERIC Educational Resources Information Center
Eaker, Robert E.; Huffman, James O.
A program stressing teacher involvement and classroom implementation of educational research findings is described. The program was designed to familiarize teachers with current findings, have them apply the findings in their classrooms, analyze their own teaching behavior, and critically evaluate the findings in terms of their applicability to…
Understanding First Law of Thermodynamics through Activities
ERIC Educational Resources Information Center
Pathare, Shirish; Huli, Saurabhee; Ladage, Savita; Pradhan, H. C.
2018-01-01
The first law of thermodynamics involves several types of energies and many studies have shown that students lack awareness of them. They have difficulties in applying the law to different thermodynamic processes. These observations were confirmed in our pilot studies, carried out with students from undergraduate colleges across the whole of…
Applying Cluster Analysis to Physics Education Research Data
ERIC Educational Resources Information Center
Springuel, R. Padraic
2010-01-01
One major thrust of Physics Education Research (PER) is the identification of student ideas about specific physics concepts, both correct ideas and those that differ from the expert consensus. Typically the research process of eliciting the spectrum of student ideas involves the administration of specially designed questions to students. One major…
A New Lean Paradigm in Higher Education: A Case Study
ERIC Educational Resources Information Center
Doman, Mark S.
2011-01-01
Purpose: This case study aims to demonstrate that lean principles and practices utilized in industry can be successfully applied to improve higher education administrative processes through an innovative and engaging learning experience involving undergraduate students. Design/methodology/approach: This is a first-hand account by the instructor of…
Steps in the open space planning process
Stephanie B. Kelly; Melissa M. Ryan
1995-01-01
This paper presents the steps involved in developing an open space plan. The steps are generic in that the methods may be applied to communities of various sizes. The intent is to provide a framework to develop an open space plan that meets Massachusetts requirements for funding of open space acquisition.
Plasma Spraying of Ceramics with Particular Difficulties in Processing
NASA Astrophysics Data System (ADS)
Mauer, G.; Schlegel, N.; Guignard, A.; Jarligo, M. O.; Rezanka, S.; Hospach, A.; Vaßen, R.
2015-01-01
Emerging new applications and growing demands of plasma-sprayed coatings initiate the development of new materials. Regarding ceramics, often complex compositions are employed to achieve advanced material properties, e.g., high thermal stability, low thermal conductivity, high electronic and ionic conductivity as well as specific thermo-mechanical properties and microstructures. Such materials, however, often involve particular difficulties in processing by plasma spraying. The inhomogeneous dissociation and evaporation behavior of individual constituents can lead to changes in the chemical composition and the formation of secondary phases in the deposited coatings. Hence, undesired effects on the coating characteristics are encountered. In this work, examples of such challenging materials are investigated, namely pyrochlores applied for thermal barrier coatings as well as perovskites for gas separation membranes. In particular, new plasma spray processes like suspension plasma spraying and plasma spray-physical vapor deposition are considered. In some cases, plasma diagnostics are applied to analyze the processing conditions.
Process Improvement for Interinstitutional Research Contracting
Logan, Jennifer; Bjorklund, Todd; Whitfield, Jesse; Reed, Peggy; Lesher, Laurie; Sikalis, Amy; Brown, Brent; Drollinger, Sandy; Larrabee, Kristine; Thompson, Kristie; Clark, Erin; Workman, Michael; Boi, Luca
2015-01-01
Introduction: Sponsored research increasingly requires multiinstitutional collaboration. However, research contracting procedures have become more complicated and time consuming. The perinatal research units of two colocated healthcare systems sought to improve their research contracting processes. Methods: The Lean Process, a management practice that iteratively involves team members in root cause analyses and process improvement, was applied to the research contracting process, initially using Process Mapping and then developing Problem Solving Reports. Results: Root cause analyses revealed that the longest delays were the individual contract legal negotiations. In addition, the “business entity” was the research support personnel of both healthcare systems whose “customers” were investigators attempting to conduct interinstitutional research. Development of mutually acceptable research contract templates and language, chain of custody templates, and process development and refinement formats decreased the Notice of Grant Award to Purchase Order time from a mean of 103.5 days in the year prior to Lean Process implementation to 45.8 days in the year after implementation (p = 0.004). Conclusions: The Lean Process can be applied to interinstitutional research contracting with significant improvement in contract implementation. PMID:26083433
Process Improvement for Interinstitutional Research Contracting.
Varner, Michael; Logan, Jennifer; Bjorklund, Todd; Whitfield, Jesse; Reed, Peggy; Lesher, Laurie; Sikalis, Amy; Brown, Brent; Drollinger, Sandy; Larrabee, Kristine; Thompson, Kristie; Clark, Erin; Workman, Michael; Boi, Luca
2015-08-01
Sponsored research increasingly requires multiinstitutional collaboration. However, research contracting procedures have become more complicated and time consuming. The perinatal research units of two colocated healthcare systems sought to improve their research contracting processes. The Lean Process, a management practice that iteratively involves team members in root cause analyses and process improvement, was applied to the research contracting process, initially using Process Mapping and then developing Problem Solving Reports. Root cause analyses revealed that the longest delays were the individual contract legal negotiations. In addition, the "business entity" was the research support personnel of both healthcare systems whose "customers" were investigators attempting to conduct interinstitutional research. Development of mutually acceptable research contract templates and language, chain of custody templates, and process development and refinement formats decreased the Notice of Grant Award to Purchase Order time from a mean of 103.5 days in the year prior to Lean Process implementation to 45.8 days in the year after implementation (p = 0.004). The Lean Process can be applied to interinstitutional research contracting with significant improvement in contract implementation. © 2015 Wiley Periodicals, Inc.
1H NMR-based metabolic profiling for evaluating poppy seed rancidity and brewing.
Jawień, Ewa; Ząbek, Adam; Deja, Stanisław; Łukaszewicz, Marcin; Młynarz, Piotr
2015-12-01
Poppy seeds are widely used in household and commercial confectionery. The aim of this study was to demonstrate the application of metabolic profiling for industrial monitoring of the molecular changes which occur during minced poppy seed rancidity and brewing processes performed on raw seeds. Both forms of poppy seeds were obtained from a confectionery company. Proton nuclear magnetic resonance (1H NMR) was applied as the analytical method of choice together with multivariate statistical data analysis. Metabolic fingerprinting was applied as a bioprocess control tool to monitor the trajectory of change during rancidity and the progression of brewing. Low-molecular-weight compounds were found to be statistically significant biomarkers of these bioprocesses. Changes in the concentrations of chemical compounds were explained relative to the biochemical processes and external conditions. The results provide valuable and comprehensive information for a better understanding of the biology of rancidity and brewing, while demonstrating the potential of NMR spectroscopy combined with multivariate data analysis tools for quality control in food industries involved in the processing of oilseeds.
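As a hedged sketch of the kind of multivariate analysis involved, the snippet below runs a PCA on simulated, binned 1H NMR spectra so that a scores plot would trace the trajectory of change during storage; the study's actual preprocessing (bucketing, scaling) and statistical models are not reproduced.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    n_samples, n_bins = 20, 300                 # e.g., 20 sampling times x 300 spectral buckets
    storage_time = np.linspace(0, 1, n_samples)
    spectra = rng.normal(size=(n_samples, n_bins))
    spectra[:, 40] += 5 * storage_time          # a hypothetical bucket growing as rancidity proceeds

    scores = PCA(n_components=2).fit_transform(spectra)
    # Plotting scores[:, 0] against storage_time would visualize the trajectory of change.
    print(scores[:3])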
Event-related desynchronization (ERD) in the alpha band during a hand mental rotation task.
Chen, Xiaogang; Bin, Guangyu; Daly, Ian; Gao, Xiaorong
2013-04-29
Recent studies have demonstrated that mentally rotating the hands involves participants engaging in motor imagery processing. However, far less is known about the possible neurophysiological basis of such processing. To contribute to a better understanding of hand mental rotation processing, event-related spectral perturbation (ERSP) methods were applied to electroencephalography (EEG) data collected from participants mentally rotating their hands. Time-frequency analyses revealed that alpha-band power suppression was larger over central-parietal regions. This is in accordance with motor imagery findings suggesting that the motor regions may be involved in processing or detection of kinaesthetic information. Furthermore, the presence of a significant negative correlation between reaction times (RTs) and alpha-band power suppression over central regions is illustrated. These findings are consistent with the neural efficiency hypothesis, which proposes the non-use of many brain regions irrelevant for the task performance as well as the more focused use of specific task-related regions in individuals with better performance. These results indicate that ERSP provides some independent insights into the mental rotation process and further confirms that parietal and motor cortices are involved in mental rotation. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
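For reference, alpha-band ERD is commonly expressed as the percentage power change relative to a pre-stimulus baseline; the sketch below follows that convention on simulated data, with band edges, sampling rate, and window lengths chosen as placeholders rather than taken from the study.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250                                            # sampling rate in Hz (assumed)
    b, a = butter(4, [8, 13], btype="bandpass", fs=fs)  # alpha band, 8-13 Hz

    def erd_percent(epoch, baseline_samples):
        # epoch: 1-D single-trial EEG trace; samples before baseline_samples are pre-stimulus
        alpha = filtfilt(b, a, epoch)
        power = alpha ** 2
        ref = power[:baseline_samples].mean()           # reference (baseline) power
        act = power[baseline_samples:].mean()           # power during the task interval
        return 100.0 * (act - ref) / ref                # negative values indicate desynchronization

    trial = np.random.randn(3 * fs)                     # 3 s of simulated EEG
    print(erd_percent(trial, baseline_samples=fs))      # 1 s baseline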
NASA Astrophysics Data System (ADS)
Vagh, Hardik A.; Baghai-Wadji, Alireza
2008-12-01
Current technological challenges in materials science and high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to unsatisfactorily ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out the sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves a sophisticated averaging and asymptotic order analysis to obtain solutions. In the majority of the cases only zero-order terms are constructed due to the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher order terms, and thus, guaranteeing higher accuracy and greater robustness of the numerical results. We present
Multi-Criteria Decision Making For Determining A Simple Model of Supplier Selection
NASA Astrophysics Data System (ADS)
Harwati
2017-06-01
Supplier selection is a decision involving many criteria. Supplier selection models usually involve more than five main criteria and more than 10 sub-criteria; in fact, many models include more than 20 criteria. The large number of criteria involved in supplier selection models sometimes makes them difficult to apply in practice. This research focuses on designing a supplier selection model that is easy and simple to apply in a company. The Analytical Hierarchy Process (AHP) is used to weight the criteria. The analysis shows that four criteria are sufficient for a simple, easy-to-use supplier selection model: price (weight 0.4), shipment (weight 0.3), quality (weight 0.2), and service (weight 0.1). A real-case simulation shows that the simple model provides the same decision as a more complex model.
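Once the criterion weights are fixed, the final selection step reduces to a weighted sum of each supplier's criterion scores. A minimal sketch using the weights reported above follows; the per-criterion supplier scores are hypothetical and would in practice come from AHP pairwise comparisons or normalized performance data.

    weights = {"price": 0.4, "shipment": 0.3, "quality": 0.2, "service": 0.1}

    suppliers = {                                   # hypothetical normalized scores in [0, 1]
        "Supplier A": {"price": 0.7, "shipment": 0.9, "quality": 0.6, "service": 0.8},
        "Supplier B": {"price": 0.8, "shipment": 0.6, "quality": 0.9, "service": 0.5},
    }

    totals = {name: sum(weights[c] * scores[c] for c in weights) for name, scores in suppliers.items()}
    best = max(totals, key=totals.get)
    print(totals, "->", best)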
Automaticity of phonological and semantic processing during visual word recognition.
Pattamadilok, Chotiga; Chanoine, Valérie; Pallier, Christophe; Anton, Jean-Luc; Nazarian, Bruno; Belin, Pascal; Ziegler, Johannes C
2017-04-01
Reading involves activation of phonological and semantic knowledge. Yet, the automaticity of the activation of these representations remains subject to debate. The present study addressed this issue by examining how different brain areas involved in language processing responded to a manipulation of bottom-up (level of visibility) and top-down information (task demands) applied to written words. The analyses showed that the same brain areas were activated in response to written words whether the task was symbol detection, rime detection, or semantic judgment. This network included posterior, temporal and prefrontal regions, which clearly suggests the involvement of orthographic, semantic and phonological/articulatory processing in all tasks. However, we also found interactions between task and stimulus visibility, which reflected the fact that the strength of the neural responses to written words in several high-level language areas varied across tasks. Together, our findings suggest that the involvement of phonological and semantic processing in reading is supported by two complementary mechanisms. First, an automatic mechanism that results from a task-independent spread of activation throughout a network in which orthography is linked to phonology and semantics. Second, a mechanism that further fine-tunes the sensitivity of high-level language areas to the sensory input in a task-dependent manner. Copyright © 2017 Elsevier Inc. All rights reserved.
Scaling and scale invariance of conservation laws in Reynolds transport theorem framework
NASA Astrophysics Data System (ADS)
Haltas, Ismail; Ulusoy, Suleyman
2015-07-01
Scale invariance is the case where the solution of a physical process at a specified time-space scale can be linearly related to the solution of the processes at another time-space scale. Recent studies investigated the scale invariance conditions of hydrodynamic processes by applying the one-parameter Lie scaling transformations to the governing equations of the processes. Scale invariance of a physical process is usually achieved under certain conditions on the scaling ratios of the variables and parameters involved in the process. The foundational axioms of hydrodynamics are the conservation laws, namely, conservation of mass, conservation of linear momentum, and conservation of energy from continuum mechanics. They are formulated using the Reynolds transport theorem. Conventionally, the Reynolds transport theorem formulates the conservation equations in integral form. Yet, the differential form of the conservation equations can also be derived for an infinitesimal control volume. In the formulation of the governing equation of a process, one or more of the conservation laws and, sometimes, a constitutive relation are combined. Differential forms of the conservation equations are used in the governing partial differential equations of the processes. Therefore, differential conservation equations constitute the fundamentals of the governing equations of the hydrodynamic processes. Applying the one-parameter Lie scaling transformation to the conservation laws in the Reynolds transport theorem framework, instead of applying it to the governing partial differential equations, may lead to more fundamental conclusions on the scaling and scale invariance of the hydrodynamic processes. This study will investigate the scaling behavior and scale invariance conditions of the hydrodynamic processes by applying the one-parameter Lie scaling transformation to the conservation laws in the Reynolds transport theorem framework.
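For reference, the integral Reynolds transport theorem for an intensive density phi and the generic form of a one-parameter Lie scaling of the variables can be written as below; this is a standard textbook statement, not the paper's derivation, and the exponents are constrained by requiring the transformed conservation equation to keep the same form.

    % Reynolds transport theorem (integral form) and a one-parameter scaling
    \frac{\mathrm{d}}{\mathrm{d}t}\int_{V(t)} \phi \,\mathrm{d}V
      = \int_{V(t)} \frac{\partial \phi}{\partial t}\,\mathrm{d}V
      + \int_{S(t)} \phi\,(\mathbf{u}\cdot\mathbf{n})\,\mathrm{d}S,
    \qquad
    t^{*} = \lambda^{\alpha_t}\, t, \quad
    \mathbf{x}^{*} = \lambda^{\alpha_x}\, \mathbf{x}, \quad
    \phi^{*} = \lambda^{\alpha_\phi}\, \phi .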
Seismic data compression speeds exploration projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galibert, P.Y.
As part of an ongoing commitment to ensure industry-wide distribution of its revolutionary seismic data compression technology, Chevron Petroleum Technology Co. (CPTC) has entered into licensing agreements with Compagnie Generale de Geophysique (CGG) and other seismic contractors for use of its software in oil and gas exploration programs. CPTC expects use of the technology to be far-reaching to all of its industry partners involved in seismic data collection, processing, analysis and storage. Here, CGG--one of the world's leading seismic acquisition and processing companies--talks about its success in applying the new methodology to replace full on-board seismic processing. Chevron's technology is already being applied on large off-shore 3-D seismic surveys. Worldwide, CGG has acquired more than 80,000 km of seismic data using the data compression technology.
Participatory Design in Gerontechnology: A Systematic Literature Review.
Merkel, Sebastian; Kucharski, Alexander
2018-05-19
Participatory design (PD) is widely used within gerontechnology, but there is no common understanding about which methods are used for what purposes. This review aims to examine what different forms of PD exist in the field of gerontechnology and how these can be categorized. We conducted a systematic literature review covering several databases. The search strategy was based on 3 elements: (1) participatory methods and approaches with (2) older persons aiming at developing (3) technology for older people. Our final review included 26 studies representing a variety of technologies designed/developed and methods/instruments applied. According to the technologies, the publications reviewed can be categorized into 3 groups: studies that (1) use already existing technology with the aim of finding new ways of use; (2) aim at creating new devices; or (3) test and/or modify prototypes. The implementation of PD depends on why a participatory approach is applied, who is involved as future user(s), when those future users are involved, and how they are incorporated into the innovation process. There are multiple ways, methods, and instruments to integrate users into the innovation process. Which methods should be applied depends on the context. However, most studies do not evaluate whether participatory approaches lead to better acceptance and/or use of the co-developed products. Therefore, participatory design should follow a comprehensive strategy, starting with the users' needs and ending with an evaluation of whether the applied methods have led to better results.
Dynamic of particle-laden liquid sheet
NASA Astrophysics Data System (ADS)
Sauret, Alban; Jop, Pierre; Troger, Anthony
2016-11-01
Many industrial processes, such as surface coating or liquid transport in tubes, involve liquid sheets or thin liquid films of suspensions. In these situations, the thickness of the liquid film becomes comparable to the particle size, which leads to unexpected dynamics. In addition, the classical constitutive rheological law cannot be applied as the continuum approximation is no longer valid. Here, we consider experimentally a transient free liquid sheet that expands radially. We characterize the influence of the particles on the shape of the liquid film as a function of time and the atomization process. We highlight that the presence of particles modifies the thickness and the stability of the liquid sheet. Our study suggests that the influence of particles through capillary effects can modify significantly the dynamics of processes that involve suspensions and particles confined in liquid films.
van Limburg, Maarten; Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette
2015-08-13
It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. This paper demonstrates the potential of several stakeholder-oriented analysis methods, illustrating their practical application with Infectionmanager as an example case. In this paper, we aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation. We divided business modeling into 4 main research steps. For stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analysis, we performed a "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach to implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and introduces an early focus on implementation, in which stakeholders help to co-create the basis necessary for successful uptake of the eHealth technology.
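As a rough illustration of the ranking/analytic hierarchy process mentioned in Step 2, the sketch below computes AHP priority weights from a pairwise-comparison matrix using the standard eigenvector method; the criteria and judgment values are invented for illustration and are not taken from the study.

import numpy as np

# Hypothetical Saaty-style pairwise-comparison matrix for three stakeholder criteria.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights: principal right eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio (CR) checks whether the judgments are acceptably consistent;
# RI = 0.58 is the standard random index for a 3x3 matrix, and CR < 0.1 is usually acceptable.
lambda_max = eigvals.real[principal]
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 0.58

print("weights:", weights.round(3), "CR:", round(cr, 3))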
Currency arbitrage detection using a binary integer programming model
NASA Astrophysics Data System (ADS)
Soon, Wanmei; Ye, Heng-Qing
2011-04-01
In this article, we examine the use of a new binary integer programming (BIP) model to detect arbitrage opportunities in currency exchanges. This model showcases an excellent application of mathematics to the real world. The concepts involved are easily accessible to undergraduate students with basic knowledge in Operations Research. Through this work, students can learn to link several types of basic optimization models, namely linear programming, integer programming and network models, and apply the well-known sensitivity analysis procedure to accommodate realistic changes in the exchange rates. Beginning with a BIP model, we discuss how it can be reduced to an equivalent but considerably simpler model, where an efficient algorithm can be applied to find the arbitrages and incorporate the sensitivity analysis procedure. A simple comparison is then made with a different arbitrage detection model. This exercise helps students learn to apply basic Operations Research concepts to a practical real-life example, and provides insights into the processes involved in Operations Research model formulations.
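The snippet below is a minimal illustration of the arbitrage condition that such a model encodes: converting around a currency cycle should not return more than one unit. It enumerates three-currency cycles by brute force rather than solving the BIP formulation described in the article, and the exchange rates are hypothetical.

from itertools import permutations

# Hypothetical exchange-rate table: rate[a][b] = units of currency b received per unit of a.
currencies = ["USD", "EUR", "GBP", "JPY"]
rate = {
    "USD": {"USD": 1.0, "EUR": 0.92, "GBP": 0.79, "JPY": 151.0},
    "EUR": {"USD": 1.09, "EUR": 1.0, "GBP": 0.86, "JPY": 164.5},
    "GBP": {"USD": 1.27, "EUR": 1.17, "GBP": 1.0, "JPY": 191.2},
    "JPY": {"USD": 0.0067, "EUR": 0.0061, "GBP": 0.0052, "JPY": 1.0},
}

# An arbitrage exists when converting around a cycle returns more than 1 unit.
# A BIP model would select the cycle by optimising 0/1 edge variables; here we
# simply enumerate all three-currency cycles as an illustration.
for a, b, c in permutations(currencies, 3):
    product = rate[a][b] * rate[b][c] * rate[c][a]
    if product > 1.0:
        print(f"arbitrage: {a}->{b}->{c}->{a}, factor {product:.4f}")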
Interservice Procedures for Instructional Systems Development: Executive Summary and Model.
ERIC Educational Resources Information Center
Branson, Robert K.
The document is the last of a five-part series focusing in minute detail on the processes involved in the formulation of an instructional systems development (ISD) program for military interservice training that will adequately train individuals to do a particular job and which can also be applied to any interservice curriculum development…
ERIC Educational Resources Information Center
van Urk, Felix; Grant, Sean; Bonell, Chris
2016-01-01
The use of explicit programme theory to guide evaluation is widely recommended. However, practitioners and other partnering stakeholders often initiate programmes based on implicit theories, leaving researchers to explicate them before commencing evaluation. The current study aimed to apply a systematic method to undertake this process. We…
The Dynamics of Climate Change: A Case Study in Organisational Learning
ERIC Educational Resources Information Center
Wasdell, David
2011-01-01
Purpose: Based in the discipline of applied consultancy-research, this paper seeks to present a synthesis-review of the social dynamics underlying the stalled negotiations of the United Nations Framework Convention on Climate Change. Its aim is to enhance understanding of the processes involved, to offer a working agenda to the organizational…
Social Dynamics in the Preschool
ERIC Educational Resources Information Center
Martin, Carol Lynn; Fabes, Richard A.; Hanish, Laura D.; Hollenstein, Tom
2005-01-01
In this paper, we consider how concepts from dynamic systems (such as attractors, repellors, and self-organization) can be applied to the study of young children's peer relationships. We also consider how these concepts can be used to explore basic issues involving early peer processes. We use the dynamical systems approach called state space grid…
Manufacturing Systems. Curriculum Guide for Technology Education.
ERIC Educational Resources Information Center
Lloyd, Theodore J.
This curriculum for a 1-semester or 1-year course in manufacturing is designed to give students experience in applying knowledge from other courses and some basic production skills as they become involved in a manufacturing enterprise. Course content is organized around the laboratory activities necessary to organize and operate a process to mass…
Using and Applying Mathematics
ERIC Educational Resources Information Center
Knight, Rupert
2011-01-01
The Nobel prize winning physicist Richard Feynman (2007) famously enthused about "the pleasure of finding things out". In day-to-day classroom life, however, it is easy to lose and undervalue this pleasure in the process, as opposed to products, of mathematics. Finding things out involves a journey and is often where the learning takes place.…
Set Theory Applied to Uniquely Define the Inputs to Territorial Systems in Emergy Analyses
The language of set theory can be utilized to represent the emergy involved in all processes. In this paper we use set theory in an emergy evaluation to ensure an accurate representation of the inputs to territorial systems. We consider a generic territorial system and we describ...
Globalizing the Curriculum: How to Incorporate a Global Perspective into Your Courses
ERIC Educational Resources Information Center
Rusciano, Frank Louis
2014-01-01
Integrating a "global perspective" into courses necessarily involves examining whether traditional disciplinary assumptions still apply in a global context and, if not, how they need to be translated in order to remain relevant. In this article, the author maps out this process by tracing one "intellectual journey" in order to…
ERIC Educational Resources Information Center
Pogorelova, Elena V.; Yakhneeva, Irina V.; Agafonova, Anna N.; Prokubovskaya, Alla O.
2016-01-01
The relevance of the analyzed issue is caused by the need to study the process of transformation of marketing in e-commerce, as the active involvement of business organizations in the field of e-business is often accompanied by problems of applying the usual marketing tools in a virtual environment. The article seeks to identify changes in the…
Introduce Construction Technology through Home Inspection
ERIC Educational Resources Information Center
Wiggins, Enrique R.
2007-01-01
Introducing technology education students to the field of home inspection gives them a great opportunity to learn about and apply construction technology content. In working with his 8th-grade students, the author covers the purpose of a home inspection, the dynamic of home inspections, the process involved in inspecting schools and homes and…
Planning Guide for Career Academies
ERIC Educational Resources Information Center
Dayton, Charles
2010-01-01
A career academy is a small learning community within a high school, which selects a subset of students and teachers for a two-, three-, or four-year period. Students enter through a voluntary process; they must apply and be accepted, with parental knowledge and support. A career academy involves teachers from different subjects working together…
Conditions in the Reader that Affect His Embodiment of the Text.
ERIC Educational Resources Information Center
Myers, Jeanette S.
Three factors in the reader have a generalized effect on all perception, including reading: competence, purpose, and set. Competence involves applying past learning to new learning through transference, understanding the conventions of different types of texts, and transforming the text through the perceptual process into a new entity. Competent…
Theory of molecular rate processes in the presence of intense laser radiation
NASA Technical Reports Server (NTRS)
George, T. F.; Zimmerman, I. H.; Devries, P. L.; Yuan, J.-M.; Lam, K.-S.; Bellum, J. C.; Lee, H.-W.; Slutsky, M. S.; Lin, J.-T.
1979-01-01
The present paper deals with the influence of intense laser radiation on gas-phase molecular rate processes. Representations of the radiation field, the particle system, and the interaction involving these two entities are discussed from a general rather than abstract point of view. The theoretical methods applied are outlined, and the formalism employed is illustrated by application to a variety of specific processes. Quantum mechanical and semiclassical treatments of representative atom-atom and atom-diatom collision processes in the presence of a field are examined, and examples of bound-continuum processes and heterogeneous catalysis are discussed within the framework of both quantum-mechanical and semiclassical theories.
Identifying pathogenic processes by integrating microarray data with prior knowledge
2014-01-01
Background It is of great importance to identify molecular processes and pathways that are involved in disease etiology. Although there has been an extensive use of various high-throughput methods for this task, pathogenic pathways are still not completely understood. Often the set of genes or proteins identified as altered in genome-wide screens shows a poor overlap with canonical disease pathways. These findings are difficult to interpret, yet crucial in order to improve the understanding of the molecular processes underlying the disease progression. We present a novel method for identifying groups of connected molecules from a set of differentially expressed genes. These groups represent functional modules sharing common cellular function and involve signaling and regulatory events. Specifically, our method makes use of Bayesian statistics to identify groups of co-regulated genes based on the microarray data, where external information about molecular interactions and connections is used as priors in the group assignments. Markov chain Monte Carlo sampling is used to search for the most reliable grouping. Results Simulation results showed that the method improved the ability to identify correct groups compared to traditional clustering, especially for small sample sizes. Applied to a microarray heart failure dataset, the method found one large cluster with several genes important for the structure of the extracellular matrix and a smaller group with many genes involved in carbohydrate metabolism. The method was also applied to a microarray dataset on melanoma cancer patients with or without metastasis, where the main cluster was dominated by genes related to keratinocyte differentiation. Conclusion Our method found clusters overlapping with known pathogenic processes, but also pointed to new connections extending beyond the classical pathways. PMID:24758699
Liu, Chao; Abu-Jamous, Basel; Brattico, Elvira; Nandi, Asoke K
2017-03-01
In the past decades, neuroimaging of humans has gained a position of status within neuroscience, and data-driven approaches and functional connectivity analyses of functional magnetic resonance imaging (fMRI) data are increasingly favored to depict the complex architecture of human brains. However, the reliability of these findings is jeopardized by too many analysis methods and sometimes too few samples used, which leads to discord among researchers. We propose a tunable consensus clustering paradigm that aims at overcoming the clustering methods selection problem as well as reliability issues in neuroimaging by means of first applying several analysis methods (three in this study) on multiple datasets and then integrating the clustering results. To validate the method, we applied it to a complex fMRI experiment involving affective processing of hundreds of music clips. We found that brain structures related to visual, reward, and auditory processing have intrinsic spatial patterns of coherent neuroactivity during affective processing. The comparisons between the results obtained from our method and those from each individual clustering algorithm demonstrate that our paradigm has notable advantages over traditional single clustering algorithms in being able to evidence robust connectivity patterns even with complex neuroimaging data involving a variety of stimuli and affective evaluations of them. The consensus clustering method is implemented in the R package "UNCLES" available on http://cran.r-project.org/web/packages/UNCLES/index.html .
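As a generic sketch of the consensus idea (not the UNCLES package itself), the code below builds a co-association matrix from repeated K-means runs and extracts a consensus partition by hierarchical clustering; the data are synthetic placeholders standing in for preprocessed fMRI features.

import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))          # placeholder data (e.g., voxel or region features)
n_samples, k, n_runs = X.shape[0], 3, 20

# Co-association matrix: fraction of runs in which two items share a cluster.
coassoc = np.zeros((n_samples, n_samples))
for run in range(n_runs):
    labels = KMeans(n_clusters=k, n_init=10, random_state=run).fit_predict(X)
    coassoc += (labels[:, None] == labels[None, :]).astype(float)
coassoc /= n_runs

# Consensus partition: hierarchical clustering on the co-association distance.
dist = 1.0 - coassoc
np.fill_diagonal(dist, 0.0)
consensus = fcluster(linkage(squareform(dist), method="average"), t=k, criterion="maxclust")
print(consensus)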
Motivating Company Personnel by Applying the Semi-self-organized Teams Principle
NASA Astrophysics Data System (ADS)
Kumlander, Deniss
Nowadays, the only way to improve the stability of the software development process in a rapidly evolving global environment is to be innovative and to involve professionals in projects, motivating them with both material and non-material factors. In this paper, self-organized teams are discussed. Unfortunately, not all kinds of organizations can benefit directly from agile methods, including the use of self-organized teams. The paper proposes semi-self-organized teams, presenting them as a new and promising motivating factor that preserves many of the positive aspects of being self-organized and partly agile while requiring compliance with less strict conditions. Semi-self-organized teams are reliable, at least in the short term, and are simple to organize and support.
Optical Fourier diffractometry applied to degraded bone structure recognition
NASA Astrophysics Data System (ADS)
Galas, Jacek; Godwod, Krzysztof; Szawdyn, Jacek; Sawicki, Andrzej
1993-09-01
Image processing and recognition methods are useful in many fields. This paper presents a hybrid optical and digital method applied to the recognition of pathological changes in bones affected by metabolic bone diseases. The trabecular bone structure, recorded by X ray on photographic film, is analyzed in a new type of computer-controlled diffractometer. The set of image parameters extracted from the diffractogram is evaluated by statistical analysis. The synthetic image descriptors in discriminant space, constructed by discriminant analysis on the basis of 3 training groups of images (control, osteoporosis, and osteomalacia groups), allow us to recognize bone samples with degraded bone structure and to identify the disease. About 89% of the images were classified correctly. After an optimization process, this method will be verified in medical investigations.
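A minimal sketch of the discriminant-analysis step, assuming generic feature vectors in place of the diffractogram-derived image parameters; the three groups mirror the control, osteoporosis, and osteomalacia training groups, but the data here are synthetic.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
# Placeholder feature vectors standing in for diffractogram-derived image parameters,
# with three groups: 0 = control, 1 = osteoporosis, 2 = osteomalacia.
X = np.vstack([rng.normal(loc=m, size=(30, 6)) for m in (0.0, 0.8, 1.6)])
y = np.repeat([0, 1, 2], 30)

# Two discriminant axes span the "discriminant space"; prediction assigns each
# sample to the most likely group in that space.
lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
scores = lda.transform(X)                     # synthetic image descriptors
print("training accuracy:", round(lda.score(X, y), 2))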
Oscillatory Reduction in Option Pricing Formula Using Shifted Poisson and Linear Approximation
NASA Astrophysics Data System (ADS)
Nur Rachmawati, Ro'fah; Irene; Budiharto, Widodo
2014-03-01
Options are derivative instruments that can help investors improve their expected return and minimize risk. However, the Black-Scholes formula generally used to determine the price of an option does not involve a skewness factor and is difficult to apply computationally because it produces oscillation for skewness values close to zero. In this paper, we construct an option pricing formula that involves skewness by modifying the Black-Scholes formula using a Shifted Poisson model and transforming it into the form of a Linear Approximation in the complete market to reduce the oscillation. The results show that the Linear Approximation formula can predict the price of an option very accurately and successfully reduces the oscillations in the calculation process.
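For reference, the baseline (skewness-free) Black-Scholes call price that the paper modifies is sketched below; the Shifted Poisson and Linear Approximation corrections from the study are not reproduced, and the parameter values are illustrative only.

from math import log, sqrt, exp
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """Standard Black-Scholes European call price (no skewness correction)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

# Illustrative parameters only.
print(round(black_scholes_call(S=100, K=105, T=0.5, r=0.03, sigma=0.25), 4))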
Uncertainties in Forecasting Streamflow using Entropy Theory
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2017-12-01
Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts are always accompanied by uncertainties, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecast results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated during these processes.
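One ingredient of the described approach, identifying periodicity from the spectral density and the correlation structure, can be sketched as follows on a synthetic monthly series; the data and parameter choices are assumptions for illustration only.

import numpy as np
from scipy.signal import periodogram

# Synthetic monthly "streamflow" with an annual (12-month) cycle plus noise.
rng = np.random.default_rng(1)
n = 600
t = np.arange(n)
flow = 50 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, n)

# Periodogram: the dominant frequency should sit near 1/12 cycles per month.
freqs, power = periodogram(flow, fs=1.0)
dominant = freqs[np.argmax(power[1:]) + 1]          # skip the zero-frequency term
print("dominant period ~", round(1 / dominant, 1), "months")

# Sample autocorrelation, used to characterise the correlation structure of the series.
x = flow - flow.mean()
acf = np.correlate(x, x, mode="full")[n - 1:] / (x.var() * n)
print("lag-12 autocorrelation:", round(acf[12], 2))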
When do combinatorial mechanisms apply in the production of inflected words?
Cholin, Joana; Rapp, Brenda; Miozzo, Michele
2010-01-01
A central question for theories of inflected word processing is to determine under what circumstances compositional procedures apply. Some accounts (e.g., the dual-mechanism model; Clahsen, 1999 ) propose that compositional processes only apply to verbs that take productive affixes. For all other verbs, inflected forms are assumed to be stored in the lexicon in a nondecomposed manner. This account makes clear predictions about the consequences of disruption to the lexical access mechanisms involved in the spoken production of inflected forms. Briefly, it predicts that nonproductive forms (which require lexical access) should be more affected than productive forms (which, depending on the language task, may not). We tested these predictions through the detailed analysis of the spoken production of a German-speaking individual with an acquired lexical impairment resulting from a stroke. Analyses of response accuracy, error types, and frequency effects revealed that combinatorial processes are not restricted to verbs that take productive inflections. On this basis, we propose an alternative account, the stem-based assembly model (SAM), which posits that combinatorial processes may be available to all stems and not only to those that combine with productive affixes.
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control, and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
Manufacturing Precise, Lightweight Paraboloidal Mirrors
NASA Technical Reports Server (NTRS)
Hermann, Frederick Thomas
2006-01-01
A process for fabricating a precise, diffraction-limited, ultra-lightweight, composite-material (matrix/fiber) paraboloidal telescope mirror has been devised. Unlike the traditional process of fabrication of heavier glass-based mirrors, this process involves a minimum of manual steps and subjective judgment. Instead, this process involves objectively controllable, repeatable steps; hence, this process is better suited for mass production. Other processes that have been investigated for fabrication of precise composite-material lightweight mirrors have resulted in print-through of fiber patterns onto reflecting surfaces, and have not provided adequate structural support for maintenance of stable, diffraction-limited surface figures. In contrast, this process does not result in print-through of the fiber pattern onto the reflecting surface and does provide a lightweight, rigid structure capable of maintaining a diffraction-limited surface figure in the face of changing temperature, humidity, and air pressure. The process consists mainly of the following steps: 1. A precise glass mandrel is fabricated by conventional optical grinding and polishing. 2. The mandrel is coated with a release agent and covered with layers of a carbon-fiber composite material. 3. The outer surface of the outer layer of the carbon-fiber composite material is coated with a surfactant chosen to provide for the proper flow of an epoxy resin to be applied subsequently. 4. The mandrel as thus covered is mounted on a temperature-controlled spin table. 5. The table is heated to a suitable temperature and spun at a suitable speed as the epoxy resin is poured onto the coated carbon-fiber composite material. 6. The surface figure of the optic is monitored and adjusted by use of traditional Ronchi, Foucault, and interferometric optical measurement techniques while the speed of rotation and the temperature are adjusted to obtain the desired figure. The proper selection of surfactant, speed of rotation, viscosity of the epoxy, and temperature makes it possible to obtain the desired diffraction-limited, smooth (1/50th wave) parabolic outer surface, suitable for reflective coating. 7. A reflective coat is applied by use of conventional coating techniques. 8. Once the final figure is set, a lightweight structural foam is applied to the rear of the optic to ensure stability of the figure.
Involving young people in health promotion, research and policy-making: practical recommendations.
Aceves-Martins, Magaly; Aleman-Diaz, Aixa Y; Giralt, Montse; Solà, Rosa
2018-05-18
Youth is a dynamic and complex transition period in life where many factors jeopardise its present and future health. Youth involvement enables young people to influence processes and decisions that affect them, leading to changes in themselves and their environment (e.g. peers, services, communities and policies); this strategy could be applied to improve health and prevent diseases. Nonetheless, scientific evidence of involving youth in health-related programmes is scarce. The aim of this paper is to describe youth involvement as a health promotion strategy and to compile practical recommendations for health promoters, researchers and policy-makers interested in successful involvement of young people in health-related programmes. These suggestions aim to encourage a positive working synergy between adults and youth during the development, implementation and evaluation of policies, research and/or health promotion efforts that target adolescents.
Use of direct washing of chemical dispense nozzle for defect control
NASA Astrophysics Data System (ADS)
Linnane, Michael; Mack, George; Longstaff, Christopher; Winter, Thomas
2006-03-01
Demands for continued defect reduction in 300mm IC manufacturing are driving process engineers to examine all aspects of the chemical apply process for improvement. Historically, the defect contribution from photoresist apply nozzles has been minimized through a carefully controlled process of "dummy dispenses" to keep the photoresist in the tip "fresh" and remove any solidified material, a preventive maintenance regime involving periodic cleaning or replacing of the nozzles, and reliance on a pool of solvent within the nozzle storage block to keep the photoresist from solidifying at the nozzle tip. The industry standard has worked well for the most part but has limitations in terms of cost effectiveness and absolute defect elimination. In this study, we investigate the direct washing of the chemical apply nozzle to reduce defects seen on the coated wafer. Data are presented on how direct washing of the chemical dispense nozzle can be used to reduce coating-related defects, reduce material costs through the reduction of "dummy dispense", and reduce equipment downtime related to nozzle cleaning or replacement.
Six Sigma methods applied to cryogenic coolers assembly line
NASA Astrophysics Data System (ADS)
Ventre, Jean-Marc; Germain-Lacour, Michel; Martin, Jean-Yves; Cauquil, Jean-Marc; Benschop, Tonny; Griot, René
2009-05-01
Six Sigma methods have been applied to the manufacturing process of a rotary Stirling cooler, the RM2. The name of the project is NoVa, as the main goal of the Six Sigma approach is to reduce variability (No Variability). The project was based on the DMAIC guideline, following five stages: Define, Measure, Analyse, Improve, Control. The objective was set on the rate of coolers passing the performance test at the first attempt, with a goal value of 95%. A team was gathered involving people and skills acting on the RM2 manufacturing line. Measurement System Analysis (MSA) was applied to the test bench, and results of the gage R&R study show that measurement is one of the root causes of variability in the RM2 process. Two more root causes were identified by the team after process mapping analysis: the regenerator filling factor and the cleaning procedure. Causes of measurement variability were identified and eradicated, as shown by new gage R&R results. Experimental results show that the regenerator filling factor impacts process variability and affects yield. An improved process was established after a new calibration process for the test bench, a new filling procedure for the regenerator, and an additional cleaning stage were implemented. The objective of 95% of coolers passing the performance test at the first attempt was reached and maintained for a significant period. The RM2 manufacturing process is now managed according to Statistical Process Control based on control charts. Improvements in process capability have enabled the introduction of a sample testing procedure before delivery.
Hybrid flotation--membrane filtration process for the removal of heavy metal ions from wastewater.
Blöcher, C; Dorda, J; Mavrov, V; Chmiel, H; Lazaridis, N K; Matis, K A
2003-09-01
A promising process for the removal of heavy metal ions from aqueous solutions involves first bonding the metals to a special bonding agent and then separating the loaded bonding agents from the wastewater stream by separation processes. For the separation stage, a new hybrid process of flotation and membrane separation has been developed in this work by integrating specially designed submerged microfiltration modules directly into a flotation reactor. This made it possible to combine the advantages of both flotation and membrane separation while overcoming their limitations. The feasibility of this hybrid process was proven using powdered synthetic zeolites as bonding agents. Stable fluxes of up to 80 l m⁻² h⁻¹ were achieved with the ceramic flat-sheet multi-channel membranes applied at low transmembrane pressure (<100 mbar). The process was applied at lab scale to treat wastewater from the electronics industry. All toxic metals in question, namely copper, nickel and zinc, were reduced from initial concentrations of 474, 3.3 and 167 mg l⁻¹, respectively, to below 0.05 mg l⁻¹, consistently meeting the discharge limits.
Importance of joint efforts for balanced process of designing and education
NASA Astrophysics Data System (ADS)
Mayorova, V. I.; Bannova, O. K.; Kristiansen, T.-H.; Igritsky, V. A.
2015-06-01
This paper discusses the importance of a strategic planning and design process when developing long-term space exploration missions, both robotic and manned. The discussion begins with a review of current and/or traditional international perspectives on space development at the American, Russian and European space agencies. Some analogies and comparisons will be drawn from the analysis of several international student collaborative programs: the Summer International workshops at the Bauman Moscow State Technical University, the International European Summer Space School "Future Space Technologies and Experiments in Space", and the summer school at Stuttgart University in Germany. The paper will focus on a discussion of how to optimize design and planning processes for successful space exploration missions and will highlight the importance of the following: understanding the connectivity between different levels of human beings and machinery; a simultaneous mission planning approach; reflections and correlations between the disciplines involved in planning and executing space exploration missions; and knowledge gained from different disciplines and through cross-applying and re-applying design approaches between various space-related fields of study and research. The conclusions will summarize the benefits and complications of applying a balanced design approach at all levels of the design process. Analysis of the successes and failures of organizational efforts in space endeavors is used as a methodological approach to identify key questions to be researched, as they often cause many planning and design processing problems.
Improving quality of care in substance abuse treatment using five key process improvement principles
Hoffman, Kim A.; Green, Carla A.; Ford, James H.; Wisdom, Jennifer P.; Gustafson, David H.; McCarty, Dennis
2012-01-01
Process and quality improvement techniques have been successfully applied in health care arenas, but efforts to institute these strategies in alcohol and drug treatment are underdeveloped. The Network for the Improvement of Addiction Treatment (NIATx) teaches participating substance abuse treatment agencies to use process improvement strategies to increase client access to, and retention in, treatment. NIATx recommends five principles to promote organizational change: 1) Understand and involve the customer; 2) Fix key problems; 3) Pick a powerful change leader; 4) Get ideas from outside the organization; and 5) Use rapid-cycle testing. Using case studies, supplemented with cross-agency analyses of interview data, this paper profiles participating NIATx treatment agencies that illustrate application of each principle. Results suggest that the most successful organizations integrate and apply most, if not all, of the five principles as they develop and test change strategies. PMID:22282129
NASA Astrophysics Data System (ADS)
Ebadi, H.; Saeedian, M.; Ausloos, M.; Jafari, G. R.
2016-11-01
The Boolean network is one successful model for investigating discrete complex systems such as gene interaction phenomena. The dynamics of a Boolean network, controlled with Boolean functions, is usually considered to be a Markovian (memory-less) process. However, both the self-organizing features of biological phenomena and their intelligent nature should raise some doubt about ignoring the history of their time evolution. Here, we extend the Markovian Boolean network approach by incorporating the effect of memory on the dynamics. This can be explored by modifying Boolean functions into non-Markovian functions, for example by making the usual threshold function, one of the most widely applied Boolean functions, non-Markovian. By applying the non-Markovian threshold function to the dynamical process of the yeast cell cycle network, we discover a power-law-like memory with more robust dynamics than in the Markovian case.
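A minimal sketch of a threshold Boolean network in which each update is fed a time-averaged history of states rather than only the previous state (the Markovian case corresponds to a history length of one); the interaction weights and parameters below are random placeholders, not the yeast cell-cycle network used in the study.

import numpy as np

rng = np.random.default_rng(2)
n_nodes, horizon, steps = 8, 3, 30

# Random signed interaction weights (activation/inhibition), as in a threshold network.
W = rng.choice([-1, 0, 1], size=(n_nodes, n_nodes))
state = rng.integers(0, 2, size=n_nodes)
history = [state.copy()]

def threshold_update(inputs):
    """Usual threshold rule: a node turns on if its summed input is positive,
    off if negative, and keeps its current value at exactly zero."""
    total = W @ inputs
    new = history[-1].copy()
    new[total > 0] = 1
    new[total < 0] = 0
    return new

for _ in range(steps):
    # Non-Markovian variant: feed the average of the last `horizon` states
    # instead of only the most recent one (horizon = 1 recovers the Markovian case).
    memory_input = np.mean(history[-horizon:], axis=0)
    history.append(threshold_update(memory_input))

print(np.array(history[-5:]))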
Adsorption of phenolic compound by aged-refuse.
Xiaoli, Chai; Youcai, Zhao
2006-09-01
The adsorption of phenol, 2-chlorophenol, 4-chlorophenol and 2,4-dichlorophenol by aged-refuse has been studied. Adsorption isotherms have been determined for phenol, 2-chlorophenol, 4-chlorophenol and 2,4-dichlorophenol, and the data fit well to the Freundlich equation. The chlorinated phenols are adsorbed more strongly than phenol, and the adsorption capacity has an obvious relationship with the number and position of chlorine substituents. The experimental data suggest that both partitioning and chemical adsorption are involved in the adsorption process. Pseudo-first-order and pseudo-second-order models were applied to investigate the adsorption kinetics, and the results show that the data fit the pseudo-second-order model. More than one step is involved in the adsorption process, and the overall rate appears to be controlled by the chemical reaction. The thermodynamic analysis indicates that the adsorption is spontaneous and endothermic.
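For reference, the standard textbook forms of the models named above are given below; the symbols are generic, and the study's fitted parameter values are not reproduced.

% Freundlich isotherm: q_e = adsorbed amount at equilibrium, C_e = equilibrium
% concentration, K_F and n = empirical constants.
\[
  q_e = K_F \, C_e^{1/n}
  \qquad\Longleftrightarrow\qquad
  \log q_e = \log K_F + \tfrac{1}{n}\log C_e
\]
% Pseudo-second-order kinetic model: q_t = adsorbed amount at time t,
% k_2 = rate constant; the linearised form is fitted to t/q_t versus t.
\[
  \frac{dq_t}{dt} = k_2 \,(q_e - q_t)^2
  \qquad\Longrightarrow\qquad
  \frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
\]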
Redesigning care at the Flinders Medical Centre: clinical process redesign using "lean thinking".
Ben-Tovim, David I; Bassham, Jane E; Bennett, Denise M; Dougherty, Melissa L; Martin, Margaret A; O'Neill, Susan J; Sincock, Jackie L; Szwarcbord, Michael G
2008-03-17
The Flinders Medical Centre (FMC) Redesigning Care program began in November 2003; it is a hospital-wide process improvement program applying an approach called "lean thinking" (developed in the manufacturing sector) to health care. To date, the FMC has involved hundreds of staff from all areas of the hospital in a wide variety of process redesign activities. The initial focus of the program was on improving the flow of patients through the emergency department, but the program quickly spread to involve the redesign of managing medical and surgical patients throughout the hospital, and to improving major support services. The program has fallen into three main phases, each of which is described in this article: "getting the knowledge"; "stabilising high-volume flows"; and "standardising and sustaining". Results to date show that the Redesigning Care program has enabled the hospital to provide safer and more accessible care during a period of growth in demand.
[Systematization of nursing care in the obstetrical center].
dos Santos, Raquel Bezerra; Ramos, Karla da Silva
2012-01-01
This is a descriptive and exploratory study with a quantitative approach, aiming to propose a protocol for the systematization of nursing care for women in the process of giving birth in the Obstetrical Center of a public hospital in Recife, Pernambuco, Brazil. A semi-structured instrument was applied to forty women in the process of giving birth in order to obtain the nursing history, from which the nursing diagnoses were identified, based on the International Classification for Nursing Practice (ICNP®), version 1, and their respective results and nursing interventions were established. The protocol consists of two stages: the first is the nursing consultation, which involves the anamnesis and physical examination; the second involves the judicious identification of the nursing diagnoses, which will guide the planning of the nursing care to provide individualized attention to women in the process of giving birth, using a universal terminology.
Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven
2006-01-01
Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; and (3) the development of a stream-classification tool and a hydrologic assessment tool. Four computer software tools have been developed.
NASA Technical Reports Server (NTRS)
Chen, W. T.
1972-01-01
Technology developed for signal and data processing was applied to diagnostic techniques in the area of phonocardiography (PCG), the graphic recording of the sounds of the heart generated by the functioning of the aortic and ventricular valves. The relatively broad bandwidth of the PCG signal (20 to 2000 Hz) was reduced to less than 100 Hz by the use of a heart sound envelope. The process involves full-wave rectification of the PCG signal, envelope detection of the rectified wave, and low-pass filtering of the resultant envelope.
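A minimal sketch of that processing chain, assuming a synthetic stand-in for the PCG trace and an illustrative 20 Hz envelope cutoff; the filter choices in the original work may differ.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 4000                                   # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / fs)
# Synthetic stand-in for a PCG trace: short bursts around 60 Hz mimicking heart sounds.
pcg = np.sin(2 * np.pi * 60 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.95)

# 1. Full-wave rectification.
rectified = np.abs(pcg)

# 2-3. Envelope detection via low-pass filtering of the rectified signal;
#      a 20 Hz cutoff keeps the envelope well under the 100 Hz target bandwidth.
b, a = butter(4, 20 / (fs / 2), btype="low")
envelope = filtfilt(b, a, rectified)

print("first envelope samples:", envelope[:5].round(4))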
Cost and efficiency lead to increased value for the patient and bottom line for the practice.
Dahl, Owen J
2009-01-01
Understanding how much it costs to provide a service is a basic premise of any business. In addition, healthcare is in need of improved processes to provide and increase value to the patient. This can be accomplished by the application of principles called Six Sigma and Lean Management. Today's medical practice leader must be aware of the costs of doing business and be able to apply proven management principles to the processes involved in providing patient care.
ERIC Educational Resources Information Center
Ellefson, Michelle R.; Brinker, Rebecca A.; Vernacchio, Vincent J.; Schunn, Christian D.
2008-01-01
Gene expression is a difficult topic for students to learn and comprehend, at least partially because it involves various biochemical structures and processes occurring at the microscopic level. Designer Bacteria, a design-based learning (DBL) unit for high-school students, applies principles of DBL to the teaching of gene expression. Throughout…
The Motivational Factor of Erasmus Students at the University
ERIC Educational Resources Information Center
Fombona, Javier; Rodríguez, Celestino; Sevillano, Ángeles Pascual
2013-01-01
This study involved 377 ERASMUS students from the University of Oviedo in an academic year. An ad-hoc questionnaire was applied in on-line format to determine students' perceptions and opinions and to understand the motivations that impel them to participate in these activities and their degree of satisfaction. The study analyzes the process of…
Richard A. Vercoe; M. Welch-Devine; Dean Hardy; J.A. Demoss; S.N. Bonney; K. Allen; Peter Brosius; D. Charles; B. Crawford; S. Heisel; Nik Heynen; R.G. de Jesus-Crespo; N. Nibbelink; L. Parker; Cathy Pringle; A. Shaw; L. Van Sant
2014-01-01
We applied an integrative framework to illuminate and discuss the complexities of exurbanization in Macon County, North Carolina. The case of Macon County, North Carolina, highlights the complexity involved in addressing issues of exurbanization in the Southern Appalachian region. Exurbanization, the process by which urban residents move into rural areas in search of...
ERIC Educational Resources Information Center
Lee, Yeung Chung; Grace, Marcus
2010-01-01
Education for scientific literacy entails the development of scientific knowledge and the ability to apply this knowledge and value judgments to decisions about real-life issues. This paper reports an attempt to involve secondary level biology students in making decisions about an authentic socio-scientific issue--that of bat conservation--through…
Predicting bending stiffness of randomly oriented hybrid panels
Laura Moya; William T.Y. Tze; Jerrold E. Winandy
2010-01-01
This study was conducted to develop a simple model to predict the bending modulus of elasticity (MOE) of randomly oriented hybrid panels. The modeling process involved three modules: the behavior of a single layer was computed by applying micromechanics equations, layer properties were adjusted for densification effects, and the entire panel was modeled as a three-...
ERIC Educational Resources Information Center
Barnett, Dori
2012-01-01
A qualitative grounded theory study examined how practicing professionals involved in the ED identification process reconstructed the category of "emotional disturbance" as it applied to students in an alternative educational setting. A grounded theory integrates six emergent themes and essentially reframes the existing ED criteria in contemporary…
ERIC Educational Resources Information Center
Field, M. J.; Harrison, A. B.
Quality circles attempt to satisfy both task and personal needs through staff involvement in solving work-related problems. This paper summarizes quality circle theory, applies it to school settings, and suggests a framework for introducing the process to educational institutions. After briefly defining quality circles, the article presents two…
A Model of Resource Allocation in Public School Districts: A Theoretical and Empirical Analysis.
ERIC Educational Resources Information Center
Chambers, Jay G.
This paper formulates a comprehensive model of resource allocation in a local public school district. The theoretical framework specified could be applied equally well to any number of local public social service agencies. Section 1 develops the theoretical model describing the process of resource allocation. This involves the determination of the…
Researchers Apply Lesson Study: A Cycle of Lesson Planning, Implementation, and Revision
ERIC Educational Resources Information Center
Regan, Kelley S.; Evmenova, Anya S.; Kurz, Leigh Ann; Hughes, Melissa D.; Sacco, Donna; Ahn, Soo Y.; MacVittie, Nichole; Good, Kevin; Boykin, Andrea; Schwartzer, Jessica; Chirinos, David S.
2016-01-01
Scripted lesson plans and/or professional development alone may not be sufficient to encourage teachers to reflect on the quality of their teaching and improve their teaching. One learning tool that teachers may use to improve their teaching is Lesson Study (LS). LS is a collaborative process involving educators, based on concepts of iteration and…
Interplay of Computer and Paper-Based Sketching in Graphic Design
ERIC Educational Resources Information Center
Pan, Rui; Kuo, Shih-Ping; Strobel, Johannes
2013-01-01
The purpose of this study is to investigate student designers' attitude and choices towards the use of computers and paper sketches when involved in a graphic design process. 65 computer graphic technology undergraduates participated in this research. A mixed method study with survey and in-depth interviews was applied to answer the research…
Creativity, self creation, and the treatment of mental illness.
Rothenberg, A
2006-06-01
This paper examines how an understanding of systematic findings about creative processes involved in art, literature, and science can be applied to the effective treatment of mental illness. These findings and applications are illustrated by particular reference to the work of the poet Sylvia Plath and the treatment of a patient who aspired to become a writer.
ERIC Educational Resources Information Center
Smith, Rebekah E.; Bayen, Ute J.
2006-01-01
Event-based prospective memory involves remembering to perform an action in response to a particular future event. Normal younger and older adults performed event-based prospective memory tasks in 2 experiments. The authors applied a formal multinomial processing tree model of prospective memory (Smith & Bayen, 2004) to disentangle age differences…
Cardona, Alvaro; Nieto, Emmanuel; Mejía, Luz M
2010-01-01
This study performs an academic exercise aimed at applying the analytical categories from the governance approach developed by Marc Hufty et al. to understand the relationships among social actors in an investigation and intervention project studying socioeconomic conditions and seeking to guarantee health insurance continuity for workers who had lost their jobs in the city of Medellin, Colombia, from 2004 to 2007. A process of investigation and intervention was examined as a case study in which the researchers were among the actors involved. Characterising the stakeholders included: their level of inclusion/involvement in the problem; their power to influence public policy proposals; their perceptions and the characteristics, power and dynamics of their proposals regarding the problem of unemployment and health insurance after losing one's work; and the characteristics of their interaction with other actors. The results showed that the four analytical dimensions proposed by Hufty (actors, social norms, nodal points and processes) were useful for describing and understanding the interaction of the actors involved in the research and intervention proposal being analysed here (i.e. the case study). It was concluded that the analytical governance framework proposed by Hufty was useful for understanding how the social subjects interacted, the rules describing their interaction, the most important nodes of interaction, and the progress achieved whilst implementing the intervention proposal.
Baxter, Ruth; Taylor, Natalie; Kellar, Ian; Lawton, Rebecca
2016-01-01
Background The positive deviance approach focuses on those who demonstrate exceptional performance, despite facing the same constraints as others. ‘Positive deviants’ are identified and hypotheses about how they succeed are generated. These hypotheses are tested and then disseminated within the wider community. The positive deviance approach is being increasingly applied within healthcare organisations, although limited guidance exists and different methods, of varying quality, are used. This paper systematically reviews healthcare applications of the positive deviance approach to explore how positive deviance is defined, the quality of existing applications and the methods used within them, including the extent to which staff and patients are involved. Methods Peer-reviewed articles, published prior to September 2014, reporting empirical research on the use of the positive deviance approach within healthcare, were identified from seven electronic databases. A previously defined four-stage process for positive deviance in healthcare was used as the basis for data extraction. Quality assessments were conducted using a validated tool, and a narrative synthesis approach was followed. Results 37 of 818 articles met the inclusion criteria. The positive deviance approach was most frequently applied within North America, in secondary care, and to address healthcare-associated infections. Research predominantly identified positive deviants and generated hypotheses about how they succeeded. The approach and processes followed were poorly defined. Research quality was low, articles lacked detail and comparison groups were rarely included. Applications of positive deviance typically lacked staff and/or patient involvement, and the methods used often required extensive resources. Conclusion Further research is required to develop high quality yet practical methods which involve staff and patients in all stages of the positive deviance approach. The efficacy and efficiency of positive deviance must be assessed and compared with other quality improvement approaches. PROSPERO registration number CRD42014009365. PMID:26590198
STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitch, S.H.; Morris, J.W.
1962-12-15
Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)
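As a modern, minimal analogue of the analysis-of-variance evaluation described, the snippet below runs a one-way ANOVA across three hypothetical processing conditions; the measurements are invented for illustration only.

from scipy.stats import f_oneway

# Hypothetical figure-of-merit measurements for elements made with three
# different processing conditions (purely illustrative values).
condition_a = [3.1, 3.4, 3.0, 3.3, 3.2]
condition_b = [3.6, 3.8, 3.5, 3.7, 3.9]
condition_c = [3.2, 3.1, 3.3, 3.0, 3.2]

f_stat, p_value = f_oneway(condition_a, condition_b, condition_c)
# A small p-value suggests the processing condition has a significant effect.
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")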
End-to-end performance analysis using engineering confidence models and a ground processor prototype
NASA Astrophysics Data System (ADS)
Kruse, Klaus-Werner; Sauer, Maximilian; Jäger, Thomas; Herzog, Alexandra; Schmitt, Michael; Huchler, Markus; Wallace, Kotska; Eisinger, Michael; Heliere, Arnaud; Lefebvre, Alain; Maher, Mat; Chang, Mark; Phillips, Tracy; Knight, Steve; de Goeij, Bryan T. G.; van der Knaap, Frits; Van't Hof, Adriaan
2015-10-01
The European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) are co-operating to develop the EarthCARE satellite mission with the fundamental objective of improving the understanding of the processes involving clouds, aerosols and radiation in the Earth's atmosphere. The EarthCARE Multispectral Imager (MSI) is relatively compact for a space borne imager. As a consequence, the immediate point-spread function (PSF) of the instrument will be mainly determined by the diffraction caused by the relatively small optical aperture. In order to still achieve a high contrast image, de-convolution processing is applied to remove the impact of diffraction on the PSF. A Lucy-Richardson algorithm has been chosen for this purpose. This paper will describe the system setup and the necessary data pre-processing and post-processing steps applied in order to compare the end-to-end image quality with the L1b performance required by the science community.
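A minimal sketch of the Richardson-Lucy (Lucy-Richardson) iteration on a synthetic image and Gaussian PSF; this shows the generic algorithm only, not the EarthCARE ground-processor implementation, and the PSF and data are assumptions.

import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30):
    """Minimal Richardson-Lucy deconvolution (non-blind, known PSF)."""
    estimate = np.full(image.shape, 0.5)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + 1e-12)        # avoid division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Synthetic test: blur a point-like scene with a small Gaussian PSF, then deconvolve.
x = np.zeros((64, 64)); x[32, 32] = 1.0
g = np.exp(-((np.arange(9) - 4) ** 2) / 4.0)
psf = np.outer(g, g); psf /= psf.sum()
blurred = fftconvolve(x, psf, mode="same")
restored = richardson_lucy(blurred, psf)
print("peak sharpened from", blurred.max().round(3), "to", restored.max().round(3))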
Processes involved in solving mathematical problems
NASA Astrophysics Data System (ADS)
Shahrill, Masitah; Putri, Ratu Ilma Indra; Zulkardi; Prahmana, Rully Charitas Indra
2018-04-01
This study examines one of the instructional practice features utilized within the Year 8 mathematics lessons in Brunei Darussalam. The codes from the TIMSS 1999 Video Study were applied and strictly followed, and from the 183 mathematics problems recorded, there were 95 problems with a solution presented during the public segments of the video-recorded lesson sequences of the four sampled teachers. The analyses involved, firstly, identifying the processes related to the mathematical problem statements and, secondly, examining the different processes used in solving the mathematical problems for each problem publicly completed during the lessons. The findings revealed that for three of the teachers, their problem statements coded as 'using procedures' ranged from 64% to 83%, while the remaining teacher had 40% of his problem statements coded as 'making connections'. The processes used when solving the problems were mainly 'using procedures', and none of the problems were coded as 'giving results only'. Furthermore, all four teachers made the relevant connections in solving the problems given to their respective students.
Thinking before sinning: reasoning processes in hedonic consumption
de Witt Huberts, Jessie; Evers, Catharine; de Ridder, Denise
2014-01-01
Whereas hedonic consumption is often labeled as impulsive, findings from self-licensing research suggest that people sometimes rely on reasons to justify hedonic consumption. Although the concept of self-licensing assumes the involvement of reasoning processes, this has not been demonstrated explicitly. Two studies investigated whether people indeed rely on reasons to allow themselves a guilty pleasure. Participants were exposed to a food temptation after which passive and active reasoning was assessed by asking participants to indicate the justifications that applied to them for indulging in that temptation (Study 1) or having them construe reasons to consume the hedonic product (Study 2). Regression analyses indicated that higher levels of temptation predicted the number of reasons employed and construed to justify consumption. By providing evidence for the involvement of reasoning processes, these findings support the assumption of self-licensing theory that temptations not only exert their influence by making us more impulsive, but can also facilitate gratification by triggering deliberative reasoning processes. PMID:25408680
Intelligent monitoring and control of semiconductor manufacturing equipment
NASA Technical Reports Server (NTRS)
Murdock, Janet L.; Hayes-Roth, Barbara
1991-01-01
The use of AI methods to monitor and control semiconductor fabrication in a state-of-the-art manufacturing environment called the Rapid Thermal Multiprocessor is described. Semiconductor fabrication involves many complex processing steps with limited opportunities to measure process and product properties. By applying additional process and product knowledge to that limited data, AI methods augment classical control methods by detecting abnormalities and trends, predicting failures, diagnosing, planning corrective action sequences, explaining diagnoses or predictions, and reacting to anomalous conditions that classical control systems typically would not correct. Research methodology and issues are discussed, and two diagnosis scenarios are examined.
Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing
NASA Technical Reports Server (NTRS)
Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane
2012-01-01
Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation describes work being conducted by the Applied Sciences Program Office at NASA-Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing process code, developed with open source software, on a local prototype platform, and then transitioning this code, with its associated environment requirements, to an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar type data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions, and then applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings that were observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
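For concreteness, the index products named above are simple band-math operations, e.g. NDVI = (NIR − Red)/(NIR + Red) and NDMI = (NIR − SWIR)/(NIR + SWIR); a minimal sketch with placeholder band arrays (real pipelines would read the bands from the cloud store):

```python
# Band-math sketch for the index products mentioned above (NDVI, NDMI, band stacking).
# Band arrays are synthetic stand-ins for co-registered image bands.
import numpy as np

def normalized_difference(a, b):
    """Generic normalized-difference index, e.g. NDVI = (NIR - Red) / (NIR + Red)."""
    a = a.astype("float64")
    b = b.astype("float64")
    denom = a + b
    return np.divide(a - b, denom, out=np.zeros_like(denom), where=denom != 0)

red = np.random.randint(0, 255, (512, 512))
nir = np.random.randint(0, 255, (512, 512))
swir = np.random.randint(0, 255, (512, 512))

ndvi = normalized_difference(nir, red)    # vegetation index
ndmi = normalized_difference(nir, swir)   # moisture index
stacked = np.stack([red, nir, swir])      # simple "band stacking" into one array
```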
Wu, Jing-Shan; Lo, Hsin-Yi; Li, Chia-Cheng; Chen, Feng-Yuan; Hsiang, Chien-Yun; Ho, Tin-Yun
2017-08-15
Electroacupuncture (EA) has been applied to treat and prevent diseases for years. However, the molecular events that occur in both the acupunctured site and the internal organs after EA stimulation have not been clarified. Here we applied transcriptomic analysis to explore the gene expression signatures after EA stimulation. Mice received EA stimulation at ST36 for 15 min, and nine tissues were collected three hours later for microarray analysis. We found that EA affected the expression of genes not only in the acupunctured site but also in the internal organs. EA commonly affected biological networks involved in cytoskeleton and cell adhesion, and also regulated unique process networks in specific organs, such as γ-aminobutyric acid-ergic neurotransmission in brain and inflammation process in lung. In addition, EA affected the expression of genes related to various diseases, such as neurodegenerative diseases in brain and obstructive pulmonary diseases in lung. This report applied, for the first time, a global comprehensive genome-wide approach to analyze the gene expression profiles of the acupunctured site and internal organs after EA stimulation. The connection between gene expression signatures, biological processes, and diseases might provide a basis for prediction and explanation of the therapeutic potentials of acupuncture in organs.
Applying the compound Poisson process model to the reporting of injury-related mortality rates.
Kegler, Scott R
2007-02-16
Injury-related mortality rate estimates are often analyzed under the assumption that case counts follow a Poisson distribution. Certain types of injury incidents occasionally involve multiple fatalities, however, resulting in dependencies between cases that are not reflected in the simple Poisson model and which can affect even basic statistical analyses. This paper explores the compound Poisson process model as an alternative, emphasizing adjustments to some commonly used interval estimators for population-based rates and rate ratios. The adjusted estimators involve relatively simple closed-form computations, which in the absence of multiple-case incidents reduce to familiar estimators based on the simpler Poisson model. Summary data from the National Violent Death Reporting System are referenced in several examples demonstrating application of the proposed methodology.
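To make the variance adjustment concrete: if the total case count is a sum of cases over independent incidents, its variance under a compound Poisson model is estimated by the sum of squared cases per incident rather than by the count itself, which widens the usual Poisson-based interval. A minimal sketch with invented numbers, using a simple normal-approximation interval (not necessarily the exact estimators of the paper):

```python
# Sketch of a variance-adjusted rate interval under a compound Poisson model.
# All numbers are invented for illustration.
import math

incidents = 400            # number of fatal incidents observed
cases = 460                # total deaths (some incidents involve multiple deaths)
sum_sq = 620               # sum over incidents of (deaths per incident)^2
population = 5_000_000     # person-years at risk

rate = cases / population                 # crude mortality rate
var_poisson = cases / population**2       # simple Poisson variance of the rate
var_compound = sum_sq / population**2     # compound Poisson: variance from sum of Y_i^2

for label, var in [("Poisson", var_poisson), ("compound Poisson", var_compound)]:
    half = 1.96 * math.sqrt(var)
    print(f"{label}: rate = {rate:.2e}, 95% CI = ({rate - half:.2e}, {rate + half:.2e})")
```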
Sebastian, Alexandra; Rössler, Kora; Wibral, Michael; Mobascher, Arian; Lieb, Klaus; Jung, Patrick; Tüscher, Oliver
2017-10-04
In stimulus-selective stop-signal tasks, the salient stop signal needs attentional processing before genuine response inhibition is completed. Differential prefrontal involvement in attentional capture and response inhibition has been linked to the right inferior frontal junction (IFJ) and ventrolateral prefrontal cortex (VLPFC), respectively. Recently, it has been suggested that stimulus-selective stopping may be accomplished by the following different strategies: individuals may selectively inhibit their response only upon detecting a stop signal (independent discriminate then stop strategy) or unselectively whenever detecting a stop or attentional capture signal (stop then discriminate strategy). Alternatively, the discrimination process of the critical signal (stop vs attentional capture signal) may interact with the go process (dependent discriminate then stop strategy). Those different strategies might differentially involve attention- and stopping-related processes that might be implemented by divergent neural networks. This should lead to divergent activation patterns and, if disregarded, interfere with analyses in neuroimaging studies. To clarify this crucial issue, we studied 87 human participants of both sexes during a stimulus-selective stop-signal task and performed strategy-dependent functional magnetic resonance imaging analyses. We found that, regardless of the strategy applied, outright stopping displayed indistinguishable brain activation patterns. However, during attentional capture different strategies resulted in divergent neural activation patterns with variable activation of right IFJ and bilateral VLPFC. In conclusion, the neural network involved in outright stopping is ubiquitous and independent of strategy, while different strategies impact on attention-related processes and underlying neural network usage. Strategic differences should therefore be taken into account particularly when studying attention-related processes in stimulus-selective stopping. SIGNIFICANCE STATEMENT Dissociating inhibition from attention has been a major challenge for the cognitive neuroscience of executive functions. Selective stopping tasks have been instrumental in addressing this question. However, recent theoretical, cognitive and behavioral research suggests that different strategies are applied in successful execution of the task. The underlying strategy-dependent neural networks might differ substantially. Here, we show evidence that, regardless of the strategy used, the neural network involved in outright stopping is ubiquitous. However, significant differences can only be found in the attention-related processes underlying those different strategies. Thus, when studying attentional processing of salient stop signals, strategic differences should be considered. In contrast, the neural networks implementing outright stopping seem less or not at all affected by strategic differences. Copyright © 2017 the authors 0270-6474/17/379786-10$15.00/0.
What motivates professionals to engage in the accreditation of healthcare organizations?
Greenfield, David; Pawsey, Marjorie; Braithwaite, Jeffrey
2011-02-01
Motivated staff are needed to improve quality and safety in healthcare organizations. Stimulating and engaging staff to participate in accreditation processes is a considerable challenge. The purpose of this study was to explore the experiences of health executives, managers and frontline clinicians who participated in organizational accreditation processes: what motivated them to engage, and what benefits accrued? The setting was a large public teaching hospital undergoing a planned review of its accreditation status. A research protocol was employed to conduct semi-structured interviews with a purposive sample of 30 staff with varied organizational roles, from different professions, to discuss their involvement in accreditation. Thematic analysis of the data was undertaken. The analysis identified three categories, each with sub-themes: accreditation response (reactions to accreditation and the value of surveys); survey issues (participation in the survey, learning through interactions and constraints) and documentation issues (self-assessment report, survey report and recommendations). Participants' occupational role focuses their attention to prioritize aspects of the accreditation process. Their motivations to participate and the benefits that accrue to them can be positively self-reinforcing. Participants have a desire to engage collaboratively with colleagues to learn and validate their efforts to improve. Participation in the accreditation process promoted a quality and safety culture that crossed organizational boundaries. The insights into worker motivation can be applied to engage staff to promote learning, overcome organizational boundaries and improve services. The findings can be applied to enhance involvement with accreditation and, more broadly, to other quality and safety activities.
Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.
2014-01-01
Background Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results Neurons involved in memory processing (“Functional Cell Types” or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297
Review of computational fluid dynamics applications in biotechnology processes.
Sharma, C; Malhotra, D; Rathore, A S
2011-01-01
Computational fluid dynamics (CFD) is well established as a tool of choice for solving problems that involve one or more of the following phenomena: flow of fluids, heat transfer, mass transfer, and chemical reaction. Unit operations that are commonly utilized in biotechnology processes are often complex and as such would greatly benefit from application of CFD. The thirst for deeper process and product understanding that has arisen out of initiatives such as quality by design provides further impetus toward the usefulness of CFD for problems that may otherwise require extensive experimentation. Not surprisingly, there has been increasing interest in applying CFD toward a variety of applications in biotechnology processing in the last decade. In this article, we will review applications in the major unit operations involved with processing of biotechnology products. These include fermentation, centrifugation, chromatography, ultrafiltration, microfiltration, and freeze drying. We feel that the future applications of CFD in biotechnology processing will focus on establishing CFD as a tool of choice for providing process understanding that can then be used to guide more efficient and effective experimentation. This article puts special emphasis on the work done in the last 10 years. © 2011 American Institute of Chemical Engineers
Manuel Colunga-Garcia; Roger A. Magarey; Robert A. Haack; Stuart H. Gage; Jiaquo Qi
2010-01-01
Urban areas are hubs of international transport and therefore are major gateways for exotic pests. Applying an urban gradient to analyze this pathway could provide insight into the ecological processes involved in human-mediated invasions. We defined an urban gradient for agricultural and forest ecosystems in the contiguous United States to (1) assess whether...
ERIC Educational Resources Information Center
Leffert, Beatrice G.
From the perspective of a reading consultant, the processes of thinking and reading apply to efficient learning. Language teachers should know: (1) the difference between surface structure and deep meaning of an utterance, (2) the importance of "affect" on learning: the reader's personal involvement with the material and with its presentation,…
Which Sweetener Is Best for Yeast? An Inquiry-Based Learning for Conceptual Change
ERIC Educational Resources Information Center
Cherif, Abour H.; Siuda, JoElla E.; Kassem, Sana; Gialamas, Stefanos; Movahedzadeh, Farahnaz
2017-01-01
One way to help students understand the scientific inquiry process, and how it applies in investigative research, is to involve them in scientific investigation. An example of this would be letting them come to their own understanding of how different variables (e.g., starting products) can affect outcomes (e.g., variable quality end products)…
ERIC Educational Resources Information Center
Schonert-Reichl, Kimberly A.; Kitil, M. Jennifer; Hanson-Peterson, Jennifer
2017-01-01
Social and emotional learning, or SEL, involves the processes through which individuals acquire and effectively apply the knowledge, attitudes, and skills necessary to understand and manage their emotions, feel and show empathy for others, establish and achieve positive goals, develop and maintain positive relationships, and make responsible…
USDA-ARS?s Scientific Manuscript database
The National Animal Disease Center (NADC) conducts basic and applied research on endemic animal diseases of high priority that adversely affect U.S. livestock production or trade. Experiments conducted at this Center vary in range and scope, with a subset involving synthetic or recombinant nucleic a...
Overview 1993: Computational applications
NASA Technical Reports Server (NTRS)
Benek, John A.
1993-01-01
Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.
Texture measurement of shaped material by impulse acoustic microscopy
Eyraud; Nadal; Gondard
2000-03-01
All the microstructural parameters involved in metallurgical processes are difficult to determine directly on a shaped material. The aim of this paper is to use an impulse line-focus acoustic microscope (LFAM) as a non-destructive alternative to X-ray diffraction for measuring texture of slightly anisotropic materials. We apply it to characterize the rolling and annealing texture for tantalum sheets.
ERIC Educational Resources Information Center
Dronkers, Jaap; Avram, Silvia
2010-01-01
We apply propensity score matching to the estimation of differential school effectiveness between the publicly funded private sector and the public sector in a sample of 26 countries. This technique allows us to distinguish between school choice and school effectiveness processes and thus to account for selectivity issues involved in the…
Cross Sections From Scalar Field Theory
NASA Technical Reports Server (NTRS)
Norbury, John W.; Dick, Frank; Norman, Ryan B.; Nasto, Rachel
2008-01-01
A one pion exchange scalar model is used to calculate differential and total cross sections for pion production through nucleon-nucleon collisions. The collisions involve intermediate delta particle production and decay to nucleons and a pion. The model provides the basic theoretical framework for scalar field theory and can be applied to particle production processes where the effects of spin can be neglected.
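For orientation only, the generic tree-level structure of a one-pion-exchange calculation in scalar field theory can be sketched as below; this is an illustrative textbook 2 → 2 form, not the authors' full expression, which involves intermediate delta production and a three-body final state:

```latex
% Illustrative tree-level one-pion-exchange amplitude and 2 -> 2 cross section in a
% scalar theory with coupling g (a generic sketch, not the paper's delta-production model).
\mathcal{M}(t) \;=\; \frac{g^{2}}{t - m_{\pi}^{2}}, \qquad
\frac{d\sigma}{d\Omega} \;=\; \frac{1}{64\pi^{2} s}\,
\frac{|\mathbf{p}_{f}|}{|\mathbf{p}_{i}|}\, \bigl|\mathcal{M}(t)\bigr|^{2}, \qquad
\sigma \;=\; \int \frac{d\sigma}{d\Omega}\, d\Omega .
```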
ERIC Educational Resources Information Center
Lindquist, David H.
2012-01-01
Examining history from the perspective of investigators who wrestle with involved scenarios for which no simple answers exist, or from which no obvious conclusions can be drawn, allows students to understand the historiographic process and the complex nature of historical events, while gaining valuable practice in applying analytical and critical…
Struckmann, Verena; Panteli, Dimitra; Legido-Quigley, Helena; Risso-Gill, Isabelle; McKee, Martin; Busse, Reinhard
2015-08-01
In 1974, the European Economic Community established mutual recognition of medical qualifications obtained in any of its member states. Subsequently, a series of directives has elaborated on the initial provisions, with the most recent enacted in 2013. However, greater movement of physicians across borders and some high-profile scandals have raised questions about how to prevent a physician sanctioned in one country from simply moving to another, without undermining the principle of free movement. A survey of key informants in 11 European Union (EU) member states was supplemented by a review of peer-reviewed and grey literature, with the results validated by independent reviewers. It examined processes, adjudicative and disciplinary measures that are in place to evaluate physicians about whom concerns arise, and related sanctions, along with other aspects of professional standards and regulation. Overall, responses varied greatly between participating countries, with respect to the institutions responsible for the regulation of medical professions, the investigation processes in place, and the terminology used in each member state. While the types of sanction (removal from the register of medical professionals and/or licence revocation, suspension, dismissal, reprimand, warnings, fines, as well as additional education and training) applied are similar, both the roles of the individuals involved and the level of public disclosure of information vary considerably. However, some key features, such as the involvement of professional peers in disciplinary panels and the involvement of courts in criminal cases, are similar in most member states studied. Given the variation in the regulatory context, individuals and processes involved that is illustrated by our findings, a common understanding of definitions of what constitutes competence to practise, its impairment and its potential impact on patient safety becomes particularly important. Public disclosure of disciplinary outcomes is already applied by some member states, but additional measures should be considered to protect medical professionals from undue consequences. © Royal College of Physicians 2015. All rights reserved.
Obermeier, Christian; Hosseini, Bashir; Friedt, Wolfgang; Snowdon, Rod
2009-01-01
Background Serial analysis of gene expression (LongSAGE) was applied for gene expression profiling in seeds of oilseed rape (Brassica napus ssp. napus). The usefulness of this technique for detailed expression profiling in a non-model organism was demonstrated for the highly complex, neither fully sequenced nor annotated genome of B. napus by applying a tag-to-gene matching strategy based on Brassica ESTs and the annotated proteome of the closely related model crucifer A. thaliana. Results Transcripts from 3,094 genes were detected at two time-points of seed development, 23 days and 35 days after pollination (DAP). Differential expression showed a shift from gene expression involved in diverse developmental processes including cell proliferation and seed coat formation at 23 DAP to more focussed metabolic processes including storage protein accumulation and lipid deposition at 35 DAP. The most abundant transcripts at 23 DAP were coding for diverse protease inhibitor proteins and proteases, including cysteine proteases involved in seed coat formation and a number of lipid transfer proteins involved in embryo pattern formation. At 35 DAP, transcripts encoding napin, cruciferin and oleosin storage proteins were most abundant. Over both time-points, 18.6% of the detected genes were matched by Brassica ESTs identified by LongSAGE tags in antisense orientation. This suggests a strong involvement of antisense transcript expression in regulatory processes during B. napus seed development. Conclusion This study underlines the potential of transcript tagging approaches for gene expression profiling in Brassica crop species via EST matching to annotated A. thaliana genes. Limits of tag detection for low-abundance transcripts can today be overcome by ultra-high throughput sequencing approaches, so that tag-based gene expression profiling may soon become the method of choice for global expression profiling in non-model species. PMID:19575793
Color enhancement of landsat agricultural imagery: JPL LACIE image processing support task
NASA Technical Reports Server (NTRS)
Madura, D. P.; Soha, J. M.; Green, W. B.; Wherry, D. B.; Lewis, S. D.
1978-01-01
Color enhancement techniques were applied to LACIE LANDSAT segments to determine if such enhancement can assist analysis in crop identification. The procedure involved increasing the color range by removing correlation between components. First, a principal component transformation was performed, followed by contrast enhancement to equalize component variances, followed by an inverse transformation to restore familiar color relationships. Filtering was applied to lower order components to reduce color speckle in the enhanced products. Use of single acquisition and multiple acquisition statistics to control the enhancement were compared, and the effects of normalization investigated. Evaluation is left to LACIE personnel.
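A compact sketch of the enhancement sequence described above (principal component transform, equalization of component variances, inverse transform to restore familiar color relationships), run on a synthetic three-band array rather than actual LACIE LANDSAT segments:

```python
# Decorrelation-stretch sketch of the enhancement procedure described above.
# The image array is synthetic; real use would load co-registered multispectral bands.
import numpy as np

bands = np.random.rand(3, 256, 256)           # placeholder for 3 co-registered bands
flat = bands.reshape(3, -1)
mean = flat.mean(axis=1, keepdims=True)
centered = flat - mean

cov = np.cov(centered)                        # band-to-band covariance
eigvals, eigvecs = np.linalg.eigh(cov)        # principal component transform
pcs = eigvecs.T @ centered

pcs /= pcs.std(axis=1, keepdims=True)         # equalize component variances
pcs *= np.sqrt(eigvals.max())                 # common scale (one simple choice)

enhanced = (eigvecs @ pcs) + mean             # inverse transform restores color relations
enhanced = enhanced.reshape(bands.shape)
```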
Matero, Sanni; van Den Berg, Frans; Poutiainen, Sami; Rantanen, Jukka; Pajander, Jari
2013-05-01
The manufacturing of tablets involves many unit operations that possess multivariate and complex characteristics. The interactions between the material characteristics and process-related variation are presently not comprehensively analyzed due to univariate detection methods. As a consequence, current best practice to control a typical process is to not allow process-related factors to vary, i.e. to lock the production parameters. The problem related to the lack of sufficient process understanding is still there: the variation within process and material properties is an intrinsic feature and cannot be compensated for with constant process parameters. Instead, a more comprehensive approach based on the use of multivariate tools for investigating processes should be applied. In the pharmaceutical field these methods are referred to as Process Analytical Technology (PAT) tools that aim to achieve a thorough understanding and control over the production process. PAT provides a framework for measurement as well as data analysis and control for in-depth understanding, leading to more consistent and safer drug products with fewer batch rejections. In the optimal situation, by applying these techniques, destructive end-product testing could be avoided. In this paper the most prominent multivariate data analysis measuring tools within tablet manufacturing and basic research on operations are reviewed. Copyright © 2013 Wiley Periodicals, Inc.
A simple and inexpensive retainer for overdenture prosthesis
Kumar, Lakshya; Rao, Jitendra; Yadav, Akanksha
2013-01-01
This article describes a clinical case report of a 65-year-old male patient in which an overdenture was fabricated using a simple, logical and inexpensive retentive device. The described mandibular overdenture involves a simple modification in the coping design and a wire lock mechanism which was fabricated during denture processing. The problems associated with copings were overcome by putting the patient on a regimen wherein topical fluoride was applied to the abutment every week. The denture, fabricated with a wire lock mechanism, was highly retentive and stable. The patient was highly satisfied with the outcome of the treatment. PMID:23861281
Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J
2008-04-01
Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
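A minimal sketch of the Taguchi machinery named above: an L4(2³) orthogonal array, larger-is-better signal-to-noise ratios, and main-effect estimates. The factor assignments and yields are invented for illustration:

```python
# Taguchi-style sketch: L4 orthogonal array, larger-is-better S/N ratios, main effects.
# Factor assignments and responses are invented for illustration only.
import numpy as np

# L4 orthogonal array: 4 runs, 3 two-level factors (levels coded 0/1).
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

# Replicated responses (e.g. product yield) for each run.
y = np.array([[72, 75], [80, 78], [65, 67], [83, 85]], dtype=float)

# Larger-is-better signal-to-noise ratio: -10 * log10( mean(1 / y^2) ).
sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))

for factor in range(L4.shape[1]):
    effect = sn[L4[:, factor] == 1].mean() - sn[L4[:, factor] == 0].mean()
    print(f"factor {factor}: main effect on S/N = {effect:.2f} dB")
# The largest |effect| points to the most influential factor; the level with the
# higher mean S/N is the preferred setting.
```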
Kellman, Philip J; Massey, Christine M; Son, Ji Y
2010-04-01
Learning in educational settings emphasizes declarative and procedural knowledge. Studies of expertise, however, point to other crucial components of learning, especially improvements produced by experience in the extraction of information: perceptual learning (PL). We suggest that such improvements characterize both simple sensory and complex cognitive, even symbolic, tasks through common processes of discovery and selection. We apply these ideas in the form of perceptual learning modules (PLMs) to mathematics learning. We tested three PLMs, each emphasizing different aspects of complex task performance, in middle and high school mathematics. In the MultiRep PLM, practice in matching function information across multiple representations improved students' abilities to generate correct graphs and equations from word problems. In the Algebraic Transformations PLM, practice in seeing equation structure across transformations (but not solving equations) led to dramatic improvements in the speed of equation solving. In the Linear Measurement PLM, interactive trials involving extraction of information about units and lengths produced successful transfer to novel measurement problems and fraction problem solving. Taken together, these results suggest (a) that PL techniques have the potential to address crucial, neglected dimensions of learning, including discovery and fluent processing of relations; (b) PL effects apply even to complex tasks that involve symbolic processing; and (c) appropriately designed PL technology can produce rapid and enduring advances in learning. Copyright © 2009 Cognitive Science Society, Inc.
TU-C-218-01: Effective Medical Imaging Physics Education.
Sprawls, P
2012-06-01
A practical and applied knowledge of physics and the associated technology is required for the clinically effective and safe use of the various medical imaging modalities. This is needed by all involved in the imaging process, including radiologists, especially residents in training, technologists, and physicists who provide consultation on optimum and safe procedures and as educators for the other imaging professionals. This area of education is undergoing considerable change and evolution for three reasons: 1. Increasing capabilities and complexity of medical imaging technology and procedures, 2. Expanding scope and availability of educational resources, especially on the internet, and 3. A significant increase in our knowledge of the mental learning process and the design of learning activities to optimize effectiveness and efficiency, especially for clinically applied physics. This course will address those three issues by providing guidance on establishing appropriate clinically focused learning outcomes, a review of the brain function for enhancing clinically applied physics, and the design and delivery of effective learning activities beginning with the classroom and continuing through learning physics during the clinical practice of radiology. Characteristics of each type of learning activity will be considered with respect to effectiveness and efficiency in achieving appropriate learning outcomes. A variety of available resources will be identified and demonstrated for use in the different phases of the learning process. A major focus is on enhancing the role of the medical physicist in clinical radiology both as a resource and educator with contemporary technology being the tool, but not the teacher. Learning objectives: 1. Develop physics learning objectives that will support effective and safe medical imaging procedures. 2. Understand specific brain functions that are involved in learning and applying physics. 3. Describe the characteristics and development of mental knowledge structures for applied clinical physics. 4. List the established levels of learning and associate each with specific functions that can be performed. 5. Analyze the different types of learning activities (classroom, individual study, clinical, etc.) with respect to effectiveness and efficiency. 6. Design and provide a comprehensive physics education program with each activity optimized with respect to outcomes and available resources. © 2012 American Association of Physicists in Medicine.
Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette
2015-01-01
Background It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. Objective This paper demonstrates the potential of several stakeholder-oriented analysis methods; their practical application is illustrated using Infectionmanager as an example case. In this paper, we aim to demonstrate how business modeling, with the focus on stakeholder involvement, is used to co-create an eHealth implementation. Methods We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Results Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. Conclusions The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach for implementing eHealth. Business modeling becomes an active part in the entire development process of eHealth and starts an early focus on implementation, in which stakeholders help to co-create the basis necessary for a satisfying success and uptake of the eHealth technology. PMID:26272510
Chen, Jianjun; Frey, H Christopher
2004-12-15
Methods for optimization of process technologies considering the distinction between variability and uncertainty are developed and applied to case studies of NOx control for Integrated Gasification Combined Cycle systems. Existing methods of stochastic optimization (SO) and stochastic programming (SP) are demonstrated. A comparison of SO and SP results provides the value of collecting additional information to reduce uncertainty. For example, an expected annual benefit of 240,000 dollars is estimated if uncertainty can be reduced before a final design is chosen. SO and SP are typically applied to uncertainty. However, when applied to variability, the benefit of dynamic process control is obtained. For example, an annual savings of 1 million dollars could be achieved if the system is adjusted to changes in process conditions. When variability and uncertainty are treated distinctively, a coupled stochastic optimization and programming method and a two-dimensional stochastic programming method are demonstrated via a case study. For the case study, the mean annual benefit of dynamic process control is estimated to be 700,000 dollars, with a 95% confidence range of 500,000 dollars to 940,000 dollars. These methods are expected to be of greatest utility for problems involving a large commitment of resources, for which small differences in designs can produce large cost savings.
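A toy numerical sketch of the "value of collecting additional information" idea mentioned above: compare committing to a single design before the uncertainty is resolved with choosing the best design per scenario afterwards. The designs, scenarios, payoffs and probabilities are invented; the paper's IGCC and NOx-control models are far more detailed:

```python
# Toy value-of-information sketch: here-and-now versus wait-and-see decisions.
# Designs, scenarios, and payoffs are invented numbers.
import numpy as np

payoff = np.array([          # annual net benefit (arbitrary units)
    # scenario A, B, C
    [5.0, 2.0, 1.0],         # design 1
    [3.0, 3.0, 3.0],         # design 2
])
prob = np.array([0.3, 0.4, 0.3])   # scenario probabilities

here_and_now = (payoff @ prob).max()          # best single design, chosen now
wait_and_see = payoff.max(axis=0) @ prob      # best design in each scenario
value_of_information = wait_and_see - here_and_now
print(f"Expected value of resolving the uncertainty first: {value_of_information:.2f}")
```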
Recruiting the next generation: applying a values-based approach to recruitment.
Ritchie, Georgina; Ashworth, Lisa; Bades, Annette
2018-05-02
The qualified district nurse (DN) role demands high levels of leadership. Attracting the right candidates to apply for the Specialist Practice Qualification District Nursing (SPQDN) education programme is essential to ensure fitness to practice on qualification. Anecdotal evidence suggested that the traditional panel interview discouraged candidates from applying, and a need to improve the quality of the overall interview process was identified by the authors. The University of Central Lancashire in partnership with Lancashire Care NHS Foundation Trust adopted the National Values Based Recruitment (VBR) Framework to select candidates to gain entry onto the SPQDN course. This involved using 'selection centres' of varying activities including a multiple mini interview, written exercise, group discussion, and portfolio review with scores attached to each centre. The ultimate aim of utilising VBR was to align personal and professional values to both the nursing profession and the Trust whilst allowing a fairer assessment process. An evaluation of the VBR recruitment process demonstrated a 100% pass rate for the course and 100% satisfaction with the interview process reported by all 16 candidates over three academic years. Interviewer feedback showed deeper insight into the candidates' skills and values aligned with the core values and skills required by future District Nurse leaders within the Trust.
Participatory design of healthcare technology with children.
Sims, Tara
2018-02-12
Purpose There are many frameworks and methods for involving children in design research. Human-Computer Interaction provides rich methods for involving children when designing technologies. The paper aims to discuss these issues. Design/methodology/approach This paper examines various approaches to involving children in design, considering whether users view children as study objects or active participants. Findings The BRIDGE method is a sociocultural approach to product design that views children as active participants, enabling them to contribute to the design process as competent and resourceful partners. An example is provided, in which BRIDGE was successfully applied to developing upper limb prostheses with children. Originality/value Approaching design in this way can provide children with opportunities to develop social, academic and design skills and to develop autonomy.
45 CFR 46.401 - To what do these regulations apply?
Code of Federal Regulations, 2014 CFR
2014-10-01
... this subpart. However, the exemption at § 46.101(b)(2) for research involving survey or interview... HUMAN SUBJECTS Additional Protections for Children Involved as Subjects in Research § 46.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects...
45 CFR 46.401 - To what do these regulations apply?
Code of Federal Regulations, 2010 CFR
2010-10-01
... this subpart. However, the exemption at § 46.101(b)(2) for research involving survey or interview... HUMAN SUBJECTS Additional Protections for Children Involved as Subjects in Research § 46.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects...
45 CFR 46.401 - To what do these regulations apply?
Code of Federal Regulations, 2013 CFR
2013-10-01
... this subpart. However, the exemption at § 46.101(b)(2) for research involving survey or interview... HUMAN SUBJECTS Additional Protections for Children Involved as Subjects in Research § 46.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects...
45 CFR 46.401 - To what do these regulations apply?
Code of Federal Regulations, 2012 CFR
2012-10-01
... this subpart. However, the exemption at § 46.101(b)(2) for research involving survey or interview... HUMAN SUBJECTS Additional Protections for Children Involved as Subjects in Research § 46.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects...
45 CFR 46.401 - To what do these regulations apply?
Code of Federal Regulations, 2011 CFR
2011-10-01
... this subpart. However, the exemption at § 46.101(b)(2) for research involving survey or interview... HUMAN SUBJECTS Additional Protections for Children Involved as Subjects in Research § 46.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects...
NASA Astrophysics Data System (ADS)
Vieceli, Nathália; Nogueira, Carlos A.; Pereira, Manuel F. C.; Durão, Fernando O.; Guimarães, Carlos; Margarido, Fernanda
2018-01-01
The recovery of lithium from hard rock minerals has received increased attention given the high demand for this element. Therefore, this study optimized an innovative process, which does not require a high-temperature calcination step, for lithium extraction from lepidolite. Mechanical activation and acid digestion were suggested as crucial process parameters, and experimental design and response-surface methodology were applied to model and optimize the proposed lithium extraction process. The promoting effect of amorphization and the formation of lithium sulfate hydrate on lithium extraction yield were assessed. Several factor combinations led to extraction yields that exceeded 90%, indicating that the proposed process is an effective approach for lithium recovery.
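As an illustration of the response-surface step, a minimal sketch fitting a second-order model of extraction yield in two coded factors (labelled here, hypothetically, as mechanical-activation intensity and acid dosage) and locating its predicted optimum; the design points and yields are invented:

```python
# Response-surface sketch: second-order model of extraction yield in two coded factors.
# Design points and yields are invented for illustration only.
import numpy as np

x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.4, 1.4, 0, 0])     # coded factor 1 (e.g. activation)
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.4, 1.4])     # coded factor 2 (e.g. acid dosage)
y = np.array([62, 70, 75, 88, 91, 90, 92, 60, 84, 68, 86])  # extraction yield (%)

# Quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Evaluate the fitted surface on a grid to find the predicted optimum.
g1, g2 = np.meshgrid(np.linspace(-1.4, 1.4, 81), np.linspace(-1.4, 1.4, 81))
G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                     (g1 * g2).ravel(), (g1**2).ravel(), (g2**2).ravel()])
pred = G @ coef
best = pred.argmax()
print(f"Predicted optimum yield {pred[best]:.1f}% at "
      f"x1 = {g1.ravel()[best]:.2f}, x2 = {g2.ravel()[best]:.2f}")
```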
NASA Technical Reports Server (NTRS)
Raiman, Laura B.
1992-01-01
Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization .
Application of computational fluid mechanics to atmospheric pollution problems
NASA Technical Reports Server (NTRS)
Hung, R. J.; Liaw, G. S.; Smith, R. E.
1986-01-01
One of the most noticeable effects of air pollution on the properties of the atmosphere is the reduction in visibility. This paper reports the results of investigations of the fluid dynamical and microphysical processes involved in the formation of advection fog on aerosols from combustion-related pollutants acting as condensation nuclei. The effects of a polydisperse aerosol distribution on the condensation/nucleation processes which cause the reduction in visibility are studied. This study demonstrates how computational fluid mechanics and heat transfer modeling can be applied to simulate the life cycle of atmospheric pollution problems.
Mapping land use changes in the carboniferous region of Santa Catarina, report 2
NASA Technical Reports Server (NTRS)
Valeriano, D. D. (Principal Investigator); Bitencourtpereira, M. D.
1983-01-01
The techniques applied to MSS-LANDSAT data in the land-use mapping of the Criciuma region (Santa Catarina state, Brazil) are presented along with the results of a classification accuracy estimate tested on the resulting map. The MSS-LANDSAT digital data processing involves noise suppression, feature selection and a hybrid classifier. The accuracy test is made through comparisons with aerial photographs of sampled points. The utilization of digital processing to map the classes agricultural lands, forest lands and urban areas is recommended, while the coal refuse areas should be mapped visually.
A survey of Applied Psychological Services' models of the human operator
NASA Technical Reports Server (NTRS)
Siegel, A. I.; Wolf, J. J.
1979-01-01
A historical perspective is presented in terms of the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task oriented and message oriented models are included. Two other recent efforts are summarized which deal with visual information processing. They involve not whole model development but a family of subroutines customized to add the human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.
Automated image processing of Landsat II digital data for watershed runoff prediction
NASA Technical Reports Server (NTRS)
Sasso, R. R.; Jensen, J. R.; Estes, J. E.
1977-01-01
Digital image processing of Landsat data from a 230 sq km area was examined as a possible means of generating soil cover information for use in the watershed runoff prediction of Kern County, California. The soil cover information included data on brush, grass, pasture lands and forests. A classification accuracy of 94% for the Landsat-based soil cover survey suggested that the technique could be applied to the watershed runoff estimate. However, problems involving the survey of complex mountainous environments may require further attention.
Enhanced nitrogen diffusion induced by atomic attrition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ochoa, E.A.; Figueroa, C.A.; Czerwiec, T.
2006-06-19
Nitrogen diffusion in steel is enhanced by prior atomic attrition with low energy xenon ions. The noble gas bombardment generates nanoscale surface texture and stress in the material. The atomic attrition increases nitrogen diffusion at lower temperatures than those normally used in standard processes. The stress causes binding energy shifts of the Xe 3d5/2 electron core level. Control of surface texture and stress by heavy ion bombardment may be applied to several plasma processes where diffusing species are involved.
MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes
Williams, B.K.
1988-01-01
Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
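A value-iteration sketch for a small discounted, finite-state, finite-action MDP in the spirit of the value-improvement procedure described above (the policy-improvement variant is not shown); the two-state "harvest" transition probabilities and rewards are invented:

```python
# Value-iteration sketch for a small discounted MDP (toy two-state harvest problem).
# Transition probabilities and rewards are invented for illustration.
import numpy as np

# P[a, s, s'] = transition probability; R[s, a] = immediate reward.
P = np.array([
    [[0.9, 0.1],        # action 0 ("low harvest")
     [0.4, 0.6]],
    [[0.6, 0.4],        # action 1 ("high harvest")
     [0.1, 0.9]],
])
R = np.array([[1.0, 3.0],   # state 0: reward of action 0, action 1
              [0.5, 2.0]])  # state 1
gamma = 0.95                # discount factor

V = np.zeros(2)
for _ in range(1000):
    # Q[s, a] = R[s, a] + gamma * sum_{s'} P[a, s, s'] * V[s']
    Q = R + gamma * np.tensordot(P, V, axes=([2], [0])).T
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

policy = Q.argmax(axis=1)   # optimal action for each state
print("optimal policy:", policy, " state values:", V_new.round(3))
```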
Melo, E Correa
2003-08-01
The author describes the reasons why evaluation processes should be applied to the Veterinary Services of Member Countries, either for trade in animals and animal products and by-products between two countries, or for establishing essential measures to improve the Veterinary Service concerned. The author also describes the basic elements involved in conducting an evaluation process, including the instruments for doing so. These basic elements centre on the following: designing a model, or desirable image, against which a comparison can be made; establishing a list of processes to be analysed and defining the qualitative and quantitative mechanisms for this analysis; and establishing a multidisciplinary evaluation team and developing a process for standardising the evaluation criteria.
NASA Astrophysics Data System (ADS)
von Bilderling, Catalina; Caldarola, Martín; Masip, Martín E.; Bragas, Andrea V.; Pietrasanta, Lía I.
2017-01-01
The adhesion of cells to the extracellular matrix is a hierarchical, force-dependent, multistage process that evolves at several temporal scales. An understanding of this complex process requires a precise measurement of forces and its correlation with protein responses in living cells. We present a method to quantitatively assess live cell responses to a local and specific mechanical stimulus. Our approach combines atomic force microscopy with fluorescence imaging. Using this approach, we evaluated the recruitment of adhesion proteins such as vinculin, focal adhesion kinase, paxillin, and zyxin triggered by applying forces in the nN regime to live cells. We observed in real time the development of nascent adhesion sites, evident from the accumulation of early adhesion proteins at the position where the force was applied. We show that the method can be used to quantify the recruitment characteristic times for adhesion proteins in the formation of focal complexes. We also found a spatial remodeling of the mature focal adhesion protein zyxin as a function of the applied force. Our approach allows the study of a variety of complex biological processes involved in cellular mechanotransduction.
Can dual processing theory explain physics students' performance on the Force Concept Inventory?
NASA Astrophysics Data System (ADS)
Wood, Anna K.; Galloway, Ross K.; Hardy, Judy
2016-12-01
According to dual processing theory there are two types, or modes, of thinking: system 1, which involves intuitive and nonreflective thinking, and system 2, which is more deliberate and requires conscious effort and thought. The Cognitive Reflection Test (CRT) is a widely used and robust three item instrument that measures the tendency to override system 1 thinking and to engage in reflective, system 2 thinking. Each item on the CRT has an intuitive (but wrong) answer that must be rejected in order to answer the item correctly. We therefore hypothesized that performance on the CRT may give useful insights into the cognitive processes involved in learning physics, where success involves rejecting the common, intuitive ideas about the world (often called misconceptions) and instead carefully applying physical concepts. This paper presents initial results from an ongoing study examining the relationship between students' CRT scores and their performance on the Force Concept Inventory (FCI), which tests students' understanding of Newtonian mechanics. We find that a higher CRT score predicts a higher FCI score for both precourse and postcourse tests. However, we also find that the FCI normalized gain is independent of CRT score. The implications of these results are discussed.
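The "normalized gain" referred to above is the usual Hake gain, g = (post − pre)/(max − pre). A small sketch of how CRT scores might be related to FCI pre-test scores and gains (all scores are invented purely to illustrate the computation; a 30-item FCI maximum is assumed):

```python
# Sketch: compute FCI normalized gain and its correlation with CRT score.
# Scores are invented; a 30-item FCI and 3-item CRT are assumed.
import numpy as np
from scipy import stats

crt = np.array([0, 1, 1, 2, 2, 3, 3, 3])               # CRT score (0-3)
fci_pre = np.array([10, 12, 14, 15, 18, 20, 22, 24])   # FCI pre-test (out of 30)
fci_post = np.array([19, 20, 21, 22, 23, 25, 25, 27])  # FCI post-test

gain = (fci_post - fci_pre) / (30 - fci_pre)            # Hake normalized gain

r_pre, p_pre = stats.pearsonr(crt, fci_pre)
r_gain, p_gain = stats.pearsonr(crt, gain)
print(f"CRT vs FCI pre-test:    r = {r_pre:.2f} (p = {p_pre:.3f})")
print(f"CRT vs normalized gain: r = {r_gain:.2f} (p = {p_gain:.3f})")
```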
Hovasse, Agnès; Bruneel, Odile; Casiot, Corinne; Desoeuvre, Angélique; Farasin, Julien; Hery, Marina; Van Dorsselaer, Alain; Carapito, Christine; Arsène-Ploetze, Florence
2016-01-01
The acid mine drainage (AMD) impacted creek of the Carnoulès mine (Southern France) is characterized by acid waters with a high heavy metal content. The microbial community inhabiting this AMD was extensively studied using isolation, metagenomic and metaproteomic methods, and the results showed that a natural arsenic (and iron) attenuation process involving the arsenite oxidase activity of several Thiomonas strains occurs at this site. A sensitive quantitative Selected Reaction Monitoring (SRM)-based proteomic approach was developed for detecting and quantifying the two subunits of the arsenite oxidase and RpoA of two different Thiomonas groups. Using this approach combined with FISH and pyrosequencing-based 16S rRNA gene sequence analysis, it was established here for the first time that these Thiomonas strains are ubiquitously present in minor proportions in this AMD and that they express the key enzymes involved in natural remediation processes at various locations and time points. In addition to these findings, this study also confirms that targeted proteomics applied at the community level can be used to detect weakly abundant proteins in situ. PMID:26870729
TMS affects moral judgment, showing the role of DLPFC and TPJ in cognitive and emotional processing.
Jeurissen, Danique; Sack, Alexander T; Roebroeck, Alard; Russ, Brian E; Pascual-Leone, Alvaro
2014-01-01
Decision-making involves a complex interplay of emotional responses and reasoning processes. In this study, we use TMS to explore the neurobiological substrates of moral decisions in humans. To examine the effects of TMS on the outcome of a moral decision, we compare the decision outcome of moral-personal and moral-impersonal dilemmas to each other and examine the differential effects of applying TMS over the right DLPFC or right TPJ. In this comparison, we find that the TMS-induced disruption of the DLPFC during the decision process affects the outcome of the moral-personal judgment, while TMS-induced disruption of TPJ affects only moral-impersonal conditions. In other words, we find a double-dissociation between DLPFC and TPJ in the outcome of a moral decision. Furthermore, we find that TMS-induced disruption of the DLPFC during non-moral, moral-impersonal, and moral-personal decisions leads to lower ratings of regret about the decision. Our results are in line with the dual-process theory and suggest a role for both the emotional response and cognitive reasoning process in moral judgment. Both the emotional and cognitive processes were shown to be involved in the decision outcome.
Eliciting expert opinion for economic models: an applied example.
Leal, José; Wordsworth, Sarah; Legood, Rosa; Blair, Edward
2007-01-01
Expert opinion is considered as a legitimate source of information for decision-analytic modeling where required data are unavailable. Our objective was to develop a practical computer-based tool for eliciting expert opinion about the shape of the uncertainty distribution around individual model parameters. We first developed a prepilot survey with departmental colleagues to test a number of alternative approaches to eliciting opinions on the shape of the uncertainty distribution around individual parameters. This information was used to develop a survey instrument for an applied clinical example. This involved eliciting opinions from experts to inform a number of parameters involving Bernoulli processes in an economic model evaluating DNA testing for families with a genetic disease, hypertrophic cardiomyopathy. The experts were cardiologists, clinical geneticists, and laboratory scientists working with cardiomyopathy patient populations and DNA testing. Our initial prepilot work suggested that the more complex elicitation techniques advocated in the literature were difficult to use in practice. In contrast, our approach achieved a reasonable response rate (50%), provided logical answers, and was generally rated as easy to use by respondents. The computer software user interface permitted graphical feedback throughout the elicitation process. The distributions obtained were incorporated into the model, enabling the use of probabilistic sensitivity analysis. There is clearly a gap in the literature between theoretical elicitation techniques and tools that can be used in applied decision-analytic models. The results of this methodological study are potentially valuable for other decision analysts deriving expert opinion.
Detecting Inappropriate Access to Electronic Health Records Using Collaborative Filtering.
Menon, Aditya Krishna; Jiang, Xiaoqian; Kim, Jihoon; Vaidya, Jaideep; Ohno-Machado, Lucila
2014-04-01
Many healthcare facilities enforce security on their electronic health records (EHRs) through a corrective mechanism: some staff nominally have almost unrestricted access to the records, but there is a strict ex post facto audit process for inappropriate accesses, i.e., accesses that violate the facility's security and privacy policies. This process is inefficient, as each suspicious access has to be reviewed by a security expert, and is purely retrospective, as it occurs after damage may have been incurred. This motivates automated approaches based on machine learning using historical data. Previous attempts at such a system have successfully applied supervised learning models to this end, such as SVMs and logistic regression. While providing benefits over manual auditing, these approaches ignore the identity of the users and patients involved in a record access. Therefore, they cannot exploit the fact that a patient whose record was previously involved in a violation has an increased risk of being involved in a future violation. Motivated by this, in this paper, we propose a collaborative filtering inspired approach to predicting inappropriate accesses. Our solution integrates both explicit and latent features for staff and patients, the latter acting as a personalized "finger-print" based on historical access patterns. The proposed method, when applied to real EHR access data from two tertiary hospitals and a file-access dataset from Amazon, shows not only significantly improved performance compared to existing methods, but also provides insights as to what indicates an inappropriate access.
Detecting Inappropriate Access to Electronic Health Records Using Collaborative Filtering
Menon, Aditya Krishna; Jiang, Xiaoqian; Kim, Jihoon; Vaidya, Jaideep; Ohno-Machado, Lucila
2013-01-01
Many healthcare facilities enforce security on their electronic health records (EHRs) through a corrective mechanism: some staff nominally have almost unrestricted access to the records, but there is a strict ex post facto audit process for inappropriate accesses, i.e., accesses that violate the facility’s security and privacy policies. This process is inefficient, as each suspicious access has to be reviewed by a security expert, and is purely retrospective, as it occurs after damage may have been incurred. This motivates automated approaches based on machine learning using historical data. Previous attempts at such a system have successfully applied supervised learning models to this end, such as SVMs and logistic regression. While providing benefits over manual auditing, these approaches ignore the identity of the users and patients involved in a record access. Therefore, they cannot exploit the fact that a patient whose record was previously involved in a violation has an increased risk of being involved in a future violation. Motivated by this, in this paper, we propose a collaborative filtering inspired approach to predicting inappropriate accesses. Our solution integrates both explicit and latent features for staff and patients, the latter acting as a personalized “finger-print” based on historical access patterns. The proposed method, when applied to real EHR access data from two tertiary hospitals and a file-access dataset from Amazon, shows not only significantly improved performance compared to existing methods, but also provides insights as to what indicates an inappropriate access. PMID:24683293
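As a purely illustrative sketch of the collaborative-filtering idea described above (not the authors' system; the data, factor dimension and threshold are invented), a latent-factor model can score how typical a given staff-patient access is:

```python
# Minimal sketch: score (staff, patient) accesses with a latent-factor model.
# Accesses consistent with historical access patterns get high scores; unusually
# low scores can be flagged for audit. All names and thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_staff, n_patients, k = 50, 200, 5

# Historical access counts (staff x patients); in practice this comes from audit logs.
A = rng.poisson(0.2, size=(n_staff, n_patients)).astype(float)

# Truncated SVD gives latent "fingerprints" for staff and patients.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
staff_factors = U[:, :k] * s[:k]          # (n_staff, k)
patient_factors = Vt[:k, :].T             # (n_patients, k)

def access_score(staff_id: int, patient_id: int) -> float:
    """Predicted affinity of this staff member for this patient's record."""
    return float(staff_factors[staff_id] @ patient_factors[patient_id])

# Flag accesses whose predicted affinity falls below a chosen percentile.
threshold = np.percentile(staff_factors @ patient_factors.T, 5)
print(access_score(3, 17), "suspicious" if access_score(3, 17) < threshold else "ok")
```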
Toward a Framework for Dynamic Service Binding in E-Procurement
NASA Astrophysics Data System (ADS)
Ashoori, Maryam; Eze, Benjamin; Benyoucef, Morad; Peyton, Liam
In an online environment, an E-Procurement process should be able to react and adapt in near real-time to changes in suppliers, requirements, and regulations. WS-BPEL is an emerging standard for process automation, but is oriented towards design-time binding of services. This limitation can be addressed by designing an extension to WS-BPEL that supports the automation of flexible e-Procurement processes. Our proposed framework will support dynamic acquisition of procurement services from different suppliers dealing with changing procurement requirements. The proposed framework is illustrated by applying it to health care, where different health insurance providers could be involved in procuring medication for patients.
A factory concept for processing and manufacturing with lunar material
NASA Technical Reports Server (NTRS)
Driggers, G. W.
1977-01-01
A conceptual design for an orbital factory sized to process 1.5 million metric tons per year of raw lunar fines into 0.3 million metric tons of manufacturing materials is presented. A conservative approach involving application of present earth-based technology leads to a design devoid of new inventions. Earth based counterparts to the factory machinery were used to generate subsystem masses and lumped parameters for volume and mass estimates. The results are considered to be conservative since technologies more advanced than those assumed are presently available in many areas. Some attributes of potential space processing technologies applied to material refinement and component manufacture are discussed.
Chemical surface deposition of ultra-thin semiconductors
McCandless, Brian E.; Shafarman, William N.
2003-03-25
A chemical surface deposition process for forming an ultra-thin semiconducting film of Group IIB-VIA compounds onto a substrate. This process eliminates particulates formed by homogeneous reactions in bath, dramatically increases the utilization of Group IIB species, and results in the formation of a dense, adherent film for thin film solar cells. The process involves applying a pre-mixed liquid coating composition containing Group IIB and Group VIA ionic species onto a preheated substrate. Heat from the substrate causes a heterogeneous reaction between the Group IIB and VIA ionic species of the liquid coating composition, thus forming a solid reaction product film on the substrate surface.
Ingham, Richard J; Battilocchio, Claudio; Fitzpatrick, Daniel E; Sliwinski, Eric; Hawkins, Joel M; Ley, Steven V
2015-01-01
Performing reactions in flow can offer major advantages over batch methods. However, laboratory flow chemistry processes are currently often limited to single steps or short sequences due to the complexity involved with operating a multi-step process. Using new modular components for downstream processing, coupled with control technologies, more advanced multi-step flow sequences can be realized. These tools are applied to the synthesis of 2-aminoadamantane-2-carboxylic acid. A system comprising three chemistry steps and three workup steps was developed, having sufficient autonomy and self-regulation to be managed by a single operator. PMID:25377747
Martin, Graham P; McNicol, Sarah; Chew, Sarah
2013-01-01
Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) are a new UK initiative to promote collaboration between universities and healthcare organisations in carrying out and applying the findings of applied health research. But they face significant, institutionalised barriers to their success. This paper seeks to analyse these challenges and discuss prospects for overcoming them. The paper draws on in-depth qualitative interview data from the first round of an ongoing evaluation of one CLAHRC to understand the views of different stakeholders on its progress so far, challenges faced, and emergent solutions. The breadth of CLAHRCs' missions seems crucial to mobilise the diverse stakeholders needed to succeed, but also produces disagreement about what the prime goal of the Collaborations should be. A process of consensus building is necessary to instil a common vision among CLAHRC members, but deep-seated institutional divisions continue to orient them in divergent directions, which may need to be overcome through other means. This analysis suggests some of the key means by which those involved in joint enterprises such as CLAHRCs can achieve consensus and action towards a current goal, and offers recommendations for those involved in their design, commissioning and performance management.
Applying for ethical approval for research: the main issues.
Gelling, Leslie
2016-01-13
The need to obtain research ethical approval is common to all research involving human participants. This approval must be obtained before research participants can be approached and before data collection can begin. The process of ethical review is one way that research participants can be confident that possible risks have been considered, minimised and deemed acceptable. This article outlines some of the main issues researchers should consider when planning an application for research ethical approval by answering the following six questions: 'Do I need research ethical approval?', 'How many applications will I need to make?', 'Where should I apply for research ethical approval?', 'What do I need to include in my application?', 'What do research ethics committees look for?' and 'What other approvals might I need?' Answering these questions will enable researchers to navigate the ethical review process.
Analyzing gene expression data in mice with the Neuro Behavior Ontology.
Hoehndorf, Robert; Hancock, John M; Hardy, Nigel W; Mallon, Ann-Marie; Schofield, Paul N; Gkoutos, Georgios V
2014-02-01
We have applied the Neuro Behavior Ontology (NBO), an ontology for the annotation of behavioral gene functions and behavioral phenotypes, to the annotation of more than 1,000 genes in the mouse that are known to play a role in behavior. These annotations can be explored by researchers interested in genes involved in particular behaviors and used computationally to provide insights into the behavioral phenotypes resulting from differences in gene expression. We developed the OntoFUNC tool and have applied it to enrichment analyses over the NBO to provide high-level behavioral interpretations of gene expression datasets. The resulting increase in the number of gene annotations facilitates the identification of behavioral or neurologic processes by assisting the formulation of hypotheses about the relationships between genes, processes, and phenotypic manifestations resulting from behavioral observations.
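A minimal sketch of the kind of over-representation test that typically underlies such ontology-based enrichment analyses (a generic illustration, not the OntoFUNC implementation; all counts are hypothetical):

```python
# Illustrative over-representation (hypergeometric) test for one annotated gene set.
from scipy.stats import hypergeom

population = 20000        # genes assayed on the array
annotated = 1000          # genes annotated to a given NBO behaviour term
selected = 300            # differentially expressed genes in the experiment
overlap = 45              # selected genes that carry the annotation

# P(X >= overlap) under sampling without replacement.
p_value = hypergeom.sf(overlap - 1, population, annotated, selected)
print(f"enrichment p-value: {p_value:.3g}")
```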
Universality Results for Multi-phase Hele-Shaw Flows
NASA Astrophysics Data System (ADS)
Daripa, Prabir
2013-03-01
Saffman-Taylor instability is a well-known, viscosity-driven instability of an interface separating two immiscible fluids. We study linear stability of displacement processes in a Hele-Shaw cell involving an arbitrary number of immiscible fluid phases. This is a problem involving many interfaces. Universal stability results have been obtained for this multi-phase immiscible flow in the sense that the results hold for an arbitrary number of interfaces. These stability results have been applied to design displacement processes that are considerably less unstable than the pure Saffman-Taylor case. In particular, we derive a universal formula that gives the specific values of the viscosities of the fluid layers corresponding to the smallest unstable band. Other similar universal results will also be presented. The talk is based on the following paper. This work was supported by the Qatar National Research Fund (a member of The Qatar Foundation).
Applied metrology in the production of superconducting model magnets for particle accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferradas Troitino, Jose; Bestmann, Patrick; Bourcey, Nicolas
2017-12-22
The production of superconducting magnets for particle accelerators involves high-precision assemblies and tight tolerances in order to achieve the requirements for their appropriate performance. It is therefore essential to have strict control of and traceability over the geometry of each component of the system, and also to be able to compensate for inherent deviations arising from the production process.
Sean N. Gordon; Gallo Kirsten
2011-01-01
Assessments of watershed condition for aquatic and riparian species often have to rely on expert opinion because of the complexity of establishing statistical relationships among the many factors involved. Such expert-based assessments can be difficult to document and apply consistently over time and space. We describe and reflect on the process of developing a...
The Systems Test Architect: Enabling The Leap From Testable To Tested
2016-09-01
engineering process requires an interdisciplinary approach, involving both technical and managerial disciplines applied to the synthesis and integration...relationship between the technical and managerial aspects of systems engineering. TP-2003-020-01 describes measurement as having the following...it is evident that DOD makes great strides to tackle both the managerial and technical aspects of test and evaluation within the systems
Uncertainty Analysis Principles and Methods
2007-09-01
error source. The Data Processor converts binary coded numbers to values, performs D/A curve fitting and applies any correction factors that may be... describes the stages or modules involved in the measurement process. We now need to identify all relevant error sources and develop the mathematical...
Halámek, Jan; Zhou, Jian; Halámková, Lenka; Bocharova, Vera; Privman, Vladimir; Wang, Joseph; Katz, Evgeny
2011-11-15
Biomolecular logic systems processing biochemical input signals and producing "digital" outputs in the form of YES/NO were developed for analysis of physiological conditions characteristic of liver injury, soft tissue injury, and abdominal trauma. Injury biomarkers were used as input signals for activating the logic systems. Their normal physiological concentrations were defined as the logic-0 level, while their pathologically elevated concentrations were defined as logic-1 values. Since the input concentrations applied as logic 0 and 1 values were not sufficiently different, the corresponding low and high output signals (0 and 1 outputs) were separated by only a small gap, making their discrimination difficult. Coupled enzymatic reactions functioning as a biomolecular signal processing system with a built-in filter property were developed. The filter process involves a partial back-conversion of the optical-output-signal-yielding product, but only at its low concentrations, thus allowing the proper discrimination between 0 and 1 output values.
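Purely as an illustration of the filtering idea (not the enzyme kinetics of the reported system; the Hill coefficient and half-saturation value are arbitrary assumptions), a sigmoidal transfer function shows how suppressing low product concentrations widens the separation between 0 and 1 outputs:

```python
# Hill-type (sigmoidal) transfer function as a stand-in for a built-in biochemical filter.
import numpy as np

def filtered_output(product: np.ndarray, k_half: float = 0.5, n: float = 4.0) -> np.ndarray:
    """Near zero below k_half, saturating toward 1 above it."""
    return product**n / (k_half**n + product**n)

raw = np.array([0.35, 0.45, 0.55, 0.65])      # poorly separated "0" vs "1" signals
print(np.round(filtered_output(raw), 3))       # low values pushed down, high pushed up
```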
Greenwood, Taylor J; Lopez-Costa, Rodrigo I; Rhoades, Patrick D; Ramírez-Giraldo, Juan C; Starr, Matthew; Street, Mandie; Duncan, James; McKinstry, Robert C
2015-01-01
The marked increase in radiation exposure from medical imaging, especially in children, has caused considerable alarm and spurred efforts to preserve the benefits but reduce the risks of imaging. Applying the principles of the Image Gently campaign, data-driven process and quality improvement techniques such as process mapping and flowcharting, cause-and-effect diagrams, Pareto analysis, statistical process control (control charts), failure mode and effects analysis, "lean" or Six Sigma methodology, and closed feedback loops led to a multiyear program that has reduced overall computed tomographic (CT) examination volume by more than fourfold and concurrently decreased radiation exposure per CT study without compromising diagnostic utility. This systematic approach involving education, streamlining access to magnetic resonance imaging and ultrasonography, auditing with comparison with benchmarks, applying modern CT technology, and revising CT protocols has led to a more than twofold reduction in CT radiation exposure between 2005 and 2012 for patients at the authors' institution while maintaining diagnostic utility. (©)RSNA, 2015.
Molinos-Senante, María; Gómez, Trinidad; Caballero, Rafael; Hernández-Sancho, Francesc; Sala-Garrido, Ramón
2015-11-01
The selection of the most appropriate wastewater treatment (WWT) technology is a complex problem since many alternatives are available and many criteria are involved in the decision-making process. To deal with this challenge, the analytic network process (ANP) is applied for the first time to rank a set of seven WWT technology set-ups for secondary treatment in small communities. A major advantage of ANP is that it incorporates interdependent relationships between elements. Results illustrated that extensive technologies, constructed wetlands and pond systems are the most preferred alternatives by WWT experts. The sensitivity analysis performed verified that the ranking of WWT alternatives is very stable since constructed wetlands are almost always placed in the first position. This paper showed that ANP analysis is suitable to deal with complex decision-making problems, such as the selection of the most appropriate WWT system contributing to better understand the multiple interdependences among elements involved in the assessment. Copyright © 2015 Elsevier B.V. All rights reserved.
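For readers unfamiliar with ANP/AHP, one core building block is the derivation of priority weights from a pairwise comparison matrix via its principal eigenvector; the sketch below uses hypothetical comparison values and is not the authors' full network model:

```python
# Priority weights for three alternatives from a Saaty-scale pairwise comparison matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Consistency ratio check (random index RI = 0.58 for a 3x3 matrix).
ci = (eigvals.real.max() - len(A)) / (len(A) - 1)
print("weights:", np.round(weights, 3), "CR:", round(ci / 0.58, 3))
```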
Burger, Gerhard A.; Danen, Erik H. J.; Beltman, Joost B.
2017-01-01
Epithelial–mesenchymal transition (EMT), the process by which epithelial cells can convert into motile mesenchymal cells, plays an important role in development and wound healing but is also involved in cancer progression. It is increasingly recognized that EMT is a dynamic process involving multiple intermediate or “hybrid” phenotypes rather than an “all-or-none” process. However, the role of EMT in various cancer hallmarks, including metastasis, is debated. Given the complexity of EMT regulation, computational modeling has proven to be an invaluable tool for cancer research, i.e., to resolve apparent conflicts in experimental data and to guide experiments by generating testable hypotheses. In this review, we provide an overview of computational modeling efforts that have been applied to regulation of EMT in the context of cancer progression and its associated tumor characteristics. Moreover, we identify possibilities to bridge different modeling approaches and point out outstanding questions in which computational modeling can contribute to advance our understanding of pathological EMT. PMID:28824874
Process for preparing tapes from thermoplastic polymers and carbon fibers
NASA Technical Reports Server (NTRS)
Chung, Tai-Shung (Inventor); Furst, Howard (Inventor); Gurion, Zev (Inventor); McMahon, Paul E. (Inventor); Orwoll, Richard D. (Inventor); Palangio, Daniel (Inventor)
1986-01-01
The instant invention involves a process for use in preparing tapes or rovings, which are formed from a thermoplastic material used to impregnate longitudinally extended bundles of carbon fibers. The process involves the steps of (a) gas spreading a tow of carbon fibers; (b) feeding the spread tow into a crosshead die; (c) impregnating the tow in the die with a thermoplastic polymer; (d) withdrawing the impregnated tow from the die; and (e) gas cooling the impregnated tow with a jet of air. The crosshead die useful in the instant invention includes a horizontally extended, carbon fiber bundle inlet channel, means for providing melted polymer under pressure to the die, means for dividing the polymeric material flowing into the die into an upper flow channel and a lower flow channel disposed above and below the moving carbon fiber bundle, means for applying the thermoplastic material from both the upper and lower channels to the fiber bundle, and means for withdrawing the resulting tape from the die.
Interactive genetic counseling role-play: a novel educational strategy for family physicians.
Blaine, Sean M; Carroll, June C; Rideout, Andrea L; Glendon, Gord; Meschino, Wendy; Shuman, Cheryl; Telner, Deanna; Van Iderstine, Natasha; Permaul, Joanne
2008-04-01
Family physicians (FPs) are increasingly involved in delivering genetic services. Familiarization with aspects of genetic counseling may enable FPs to help patients make informed choices. We explored interactive role-play as a means of raising FPs' awareness of the process and content of genetic counseling. FPs attending two large Canadian family medicine conferences in 2005 were eligible -- 93 participated. FPs discussed a case during a one-on-one session with a genetic counselor. Evaluation involved pre- and post-intervention questionnaires. FPs' baseline genetic knowledge was self-rated as uniformly poor. Baseline confidence was highest in eliciting family history and providing psychosocial support and lowest in discussing risks/benefits of genetic testing and the counseling process. Post-intervention, 80% of FPs had better appreciation of family history and 97% indicated this was an effective learning experience. Role-play with FPs is effective in raising awareness of the process and content of genetic counseling and may be applied to other health disciplines.
A Systems Engineering Approach to Quality Assurance for Aerospace Testing
NASA Technical Reports Server (NTRS)
Shepherd, Christena C.
2015-01-01
On the surface, it appears that AS9100 has little to say about how to apply a Quality Management System (QMS) to major aerospace test programs (or even smaller ones). It also appears that there is little in the quality engineering Body of Knowledge (BOK) that applies to testing, unless it is nondestructive examination (NDE), or some type of lab or bench testing associated with the manufacturing process. However, if one examines: a) how the systems engineering (SE) processes are implemented throughout a test program; and b) how these SE processes can be mapped to the requirements of AS9100, a number of areas for involvement of the quality professional are revealed. What often happens is that quality assurance during a test program is limited to inspections of the test article; what could be considered a manufacturing al fresco approach. This limits the quality professional and is a disservice to the programs and projects, since there are a number of ways that quality can enhance critical processes, and support efforts to improve risk reduction, efficiency and effectiveness.
Davis, J P; Akella, S; Waddell, P H
2004-01-01
Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists/statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for UPGMA and neighbor joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred relative to PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to accelerate phylogenetics algorithm performance not only on the desktop but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
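For reference, UPGMA itself is simply average-linkage agglomerative clustering of a distance matrix; the software-only sketch below uses SciPy with hypothetical taxa and distances, and is not the custom-hardware implementation discussed above:

```python
# UPGMA on a small distance matrix via SciPy's average-linkage clustering.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage

taxa = ["taxonA", "taxonB", "taxonC", "taxonD"]   # hypothetical taxa
D = np.array([[0.0, 0.3, 0.5, 0.6],
              [0.3, 0.0, 0.4, 0.5],
              [0.5, 0.4, 0.0, 0.2],
              [0.6, 0.5, 0.2, 0.0]])

# 'average' linkage on a condensed distance matrix corresponds to UPGMA.
tree = linkage(squareform(D), method="average")
print(tree)            # each row: merged cluster i, cluster j, height, new cluster size
```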
Beyond Inhibition: A Dual-Process Perspective to Renew the Exploration of Binge Drinking
Lannoy, Séverine; Billieux, Joël; Maurage, Pierre
2014-01-01
Binge drinking is a widespread alcohol-consumption pattern in youth and is linked to cognitive consequences, mostly for executive functions. However, other crucial factors remain less explored in binge drinking and notably the emotional-automatic processes. Dual-process model postulates that addictive disorders are not only due to impaired reflective system (involved in deliberate behaviors), but rather to an imbalance between under-activated reflective system and over-activated affective-automatic one (involved in impulsive behaviors). This proposal has been confirmed in alcohol-dependence, but has not been tested in binge drinking. The observation of comparable impairments in binge drinking and alcohol-dependence led to the “continuum hypothesis,” suggesting similar deficits across different alcohol-related disorders. In this perspective, applying the dual-process model to binge drinking might renew the understanding of this continuum hypothesis. A three-axes research agenda will be proposed, exploring: (1) the affective-automatic system in binge drinking; (2) the systems’ interactions and imbalance in binge drinking; (3) the evolution of this imbalance in the transition between binge drinking and alcohol-dependence. PMID:24926251
[Comparison of Coptidis Rhizoma processed with different ginger juice based on metabolomics].
Zhong, Ling-Yun; Su, Dan; Zhu, Jing; Deng, Yu-Fen
2016-07-01
To investigate the effects of two different ginger juices on the medicinal properties of Coptidis Rhizoma (CR) by using UPLC-MS-TOF. The rats were fed with decoction of raw CR (RCR), CR processed with ginger juice from fresh ginger (CRGJFG), CR processed with ginger juice from Zinger (CRGJZ), ginger juice from fresh ginger (GJFG) and ginger juice from Zinger (GJZ), and then their urine was collected at different time points for metabolomics analysis. Peakview™ 1.7 software was applied to analyze the total ion current under positive ion mode; Markerview™ 2.0 software was applied for principal component analysis (PCA). The possible biomarkers were screened and their content changes were described according to the searching results in the Scifinder and Chemspider databases and related literature reports. The results showed that CR processed with different ginger juice would produce different effects on energy metabolism. Nine possible biomarkers relating to medicinal properties were found: sarcosine, hippuric acid, creatinine, kynurenine, tyrosine, L-tryptophan, nicotinic acid, arachidonic acid and L-proline. L-tryptophan, kynurenine and nicotinic acid were involved in the metabolism of tryptophan; sarcosine, creatinine, L-proline and tyrosine were involved in arginine and proline metabolism; the content of arachidonic acid in urine, the precursor of leukotriene B4, from high to low was CRGJZ, CRGJFG and RCR. The contents of all biomarkers in the GJZ group were higher than those in the GJFG group, indicating that the cold nature of CR gradually decreased in the following order: RCR, CRGJZ and CRGJFG, resulting in different anti-inflammatory effects of the samples. The results were consistent with the conclusion that GJFG has a hot nature and GJZ has a warm nature. The study provided a scientific basis for the proper use of different ginger juices as processing assistants. Copyright© by the Chinese Pharmaceutical Association.
An Ensemble Framework Coping with Instability in the Gene Selection Process.
Castellanos-Garzón, José A; Ramos, Juan; López-Sánchez, Daniel; de Paz, Juan F; Corchado, Juan M
2018-03-01
This paper proposes an ensemble framework for gene selection, which is aimed at addressing instability problems presented in the gene filtering task. The complex process of gene selection from gene expression data faces different instability problems from the informative gene subsets found by different filter methods. This makes the identification of significant genes by the experts difficult. The instability of results can come from filter methods, gene classifier methods, different datasets of the same disease and multiple valid groups of biomarkers. Even though a wide range of approaches has been proposed, the complexity of this problem remains a challenge today. This work proposes a framework involving five stages of gene filtering to discover biomarkers for diagnosis and classification tasks. This framework performs a process of stable feature selection, facing the problems above and, thus, providing a more suitable and reliable solution for clinical and research purposes. Our proposal involves a process of multistage gene filtering, in which several ensemble strategies for gene selection were added in such a way that different classifiers simultaneously assess gene subsets to face instability. Firstly, we apply an ensemble of recent gene selection methods to obtain diversity in the genes found (stability according to filter methods). Next, we apply an ensemble of known classifiers to filter genes relevant to all classifiers at a time (stability according to classification methods). The achieved results were evaluated in two different datasets of the same disease (pancreatic ductal adenocarcinoma), in search of stability according to the disease, for which promising results were achieved.
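A minimal sketch of the multi-filter ensemble idea (illustrative only, not the authors' five-stage framework; the synthetic dataset and the choice of two filters are assumptions):

```python
# Keep only the genes that two independent filter methods agree on.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import f_classif, mutual_info_classif

X, y = make_classification(n_samples=60, n_features=500, n_informative=20, random_state=0)

def top_k(scores: np.ndarray, k: int = 50) -> set:
    return set(np.argsort(scores)[::-1][:k])

anova_scores, _ = f_classif(X, y)
mi_scores = mutual_info_classif(X, y, random_state=0)
stable_genes = top_k(anova_scores) & top_k(mi_scores)
print(f"{len(stable_genes)} genes selected by both filters:", sorted(stable_genes)[:10])
```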
Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene
2017-01-01
In the field of applied research in heritage science, the use of multivariate approaches is still quite limited, and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper is aimed at disseminating the use of suitable multivariate methodologies and proposes a procedural workflow applied to a representative group of case studies of considerable importance for conservation purposes, as a guideline on the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162
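A small sketch of how PCA scores become chemical maps for hyperspectral FTIR data (a generic illustration with random data; the array shapes are assumptions, not the paper's case studies):

```python
# Turn PCA scores of a pixels-by-wavenumbers matrix back into per-PC score maps.
import numpy as np
from sklearn.decomposition import PCA

h, w, n_wavenumbers = 64, 64, 400
cube = np.random.rand(h, w, n_wavenumbers)          # stand-in for an FTIR image cube

spectra = cube.reshape(-1, n_wavenumbers)           # one row per pixel
scores = PCA(n_components=3).fit_transform(spectra)

# Reshaping each score column back to the image grid gives a "score map" per PC.
score_maps = [scores[:, i].reshape(h, w) for i in range(3)]
print(score_maps[0].shape)                          # (64, 64)
```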
Involvement of astrocyte metabolic coupling in Tourette syndrome pathogenesis.
de Leeuw, Christiaan; Goudriaan, Andrea; Smit, August B; Yu, Dongmei; Mathews, Carol A; Scharf, Jeremiah M; Verheijen, Mark H G; Posthuma, Danielle
2015-11-01
Tourette syndrome is a heritable neurodevelopmental disorder whose pathophysiology remains unknown. Recent genome-wide association studies suggest that it is a polygenic disorder influenced by many genes of small effect. We tested whether these genes cluster in cellular function by applying gene-set analysis using expert curated sets of brain-expressed genes in the current largest available Tourette syndrome genome-wide association data set, involving 1285 cases and 4964 controls. The gene sets included specific synaptic, astrocytic, oligodendrocyte and microglial functions. We report association of Tourette syndrome with a set of genes involved in astrocyte function, specifically in astrocyte carbohydrate metabolism. This association is driven primarily by a subset of 33 genes involved in glycolysis and glutamate metabolism through which astrocytes support synaptic function. Our results indicate for the first time that the process of astrocyte-neuron metabolic coupling may be an important contributor to Tourette syndrome pathogenesis.
Involvement of astrocyte metabolic coupling in Tourette syndrome pathogenesis
de Leeuw, Christiaan; Goudriaan, Andrea; Smit, August B; Yu, Dongmei; Mathews, Carol A; Scharf, Jeremiah M; Scharf, J M; Pauls, D L; Yu, D; Illmann, C; Osiecki, L; Neale, B M; Mathews, C A; Reus, V I; Lowe, T L; Freimer, N B; Cox, N J; Davis, L K; Rouleau, G A; Chouinard, S; Dion, Y; Girard, S; Cath, D C; Posthuma, D; Smit, J H; Heutink, P; King, R A; Fernandez, T; Leckman, J F; Sandor, P; Barr, C L; McMahon, W; Lyon, G; Leppert, M; Morgan, J; Weiss, R; Grados, M A; Singer, H; Jankovic, J; Tischfield, J A; Heiman, G A; Verheijen, Mark H G; Posthuma, Danielle
2015-01-01
Tourette syndrome is a heritable neurodevelopmental disorder whose pathophysiology remains unknown. Recent genome-wide association studies suggest that it is a polygenic disorder influenced by many genes of small effect. We tested whether these genes cluster in cellular function by applying gene-set analysis using expert curated sets of brain-expressed genes in the current largest available Tourette syndrome genome-wide association data set, involving 1285 cases and 4964 controls. The gene sets included specific synaptic, astrocytic, oligodendrocyte and microglial functions. We report association of Tourette syndrome with a set of genes involved in astrocyte function, specifically in astrocyte carbohydrate metabolism. This association is driven primarily by a subset of 33 genes involved in glycolysis and glutamate metabolism through which astrocytes support synaptic function. Our results indicate for the first time that the process of astrocyte-neuron metabolic coupling may be an important contributor to Tourette syndrome pathogenesis. PMID:25735483
Influencing organizations to promote health: applying stakeholder theory.
Kok, Gerjo; Gurabardhi, Zamira; Gottlieb, Nell H; Zijlstra, Fred R H
2015-04-01
Stakeholder theory may help health promoters to make changes at the organizational and policy level to promote health. A stakeholder is any individual, group, or organization that can influence an organization. The organization that is the focus for influence attempts is called the focal organization. The more salient a stakeholder is and the more central in the network, the stronger the influence. As stakeholders, health promoters may use communicative, compromise, deinstitutionalization, or coercive methods through an ally or a coalition. A hypothetical case study, involving adolescent use of harmful legal products, illustrates the process of applying stakeholder theory to strategic decision making. © 2015 Society for Public Health Education.
Natural fracture systems on planetary surfaces: Genetic classification and pattern randomness
NASA Technical Reports Server (NTRS)
Rossbacher, Lisa A.
1987-01-01
One method for classifying natural fracture systems is by fracture genesis. This approach involves the physics of the formation process, and it has been used most frequently in attempts to predict subsurface fractures and petroleum reservoir productivity. This classification system can also be applied to larger fracture systems on any planetary surface. One problem in applying this classification system to planetary surfaces is that it was developed for relatively small-scale fractures that would influence porosity, particularly as observed in a core sample. Planetary studies also require consideration of large-scale fractures. Nevertheless, this system offers some valuable perspectives on fracture systems of any size.
What can individual differences reveal about face processing?
Yovel, Galit; Wilmer, Jeremy B.; Duchaine, Brad
2014-01-01
Faces are probably the most widely studied visual stimulus. Most research on face processing has used a group-mean approach that averages behavioral or neural responses to faces across individuals and treats variance between individuals as noise. However, individual differences in face processing can provide valuable information that complements and extends findings from group-mean studies. Here we demonstrate that studies employing an individual differences approach—examining associations and dissociations across individuals—can answer fundamental questions about the way face processing operates. In particular these studies allow us to associate and dissociate the mechanisms involved in face processing, tie behavioral face processing mechanisms to neural mechanisms, link face processing to broader capacities and quantify developmental influences on face processing. The individual differences approach we illustrate here is a powerful method that should be further explored within the domain of face processing as well as fruitfully applied across the cognitive sciences. PMID:25191241
NASA Technical Reports Server (NTRS)
Van Dongen, Hans P A.; Dinges, David F.
2003-01-01
The two-process model of sleep regulation has been applied successfully to describe, predict, and understand sleep-wake regulation in a variety of experimental protocols such as sleep deprivation and forced desynchrony. A non-linear interaction between the homeostatic and circadian processes was reported when the model was applied to describe alertness and performance data obtained during forced desynchrony. This non-linear interaction could also be due to intrinsic non-linearity in the metrics used to measure alertness and performance, however. Distinguishing these possibilities would be of theoretical interest, but could also have important implications for the design and interpretation of experiments placing sleep at different circadian phases or varying the duration of sleep and/or wakefulness. Although to date no resolution to this controversy has been found, here we show that the issue can be addressed with existing data sets. The interaction between the homeostatic and circadian processes of sleep-wake regulation was investigated using neurobehavioural performance data from a laboratory experiment involving total sleep deprivation. The results provided evidence of an actual non-linear interaction between the homeostatic and circadian processes of sleep-wake regulation for the prediction of waking neurobehavioural performance.
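A schematic sketch of the two-process model in its common textbook form (the parameter values and the particular non-linear coupling are illustrative assumptions, not the study's fitted model):

```python
# Homeostatic process S builds during wake; circadian process C oscillates with ~24 h
# period. Combine them additively (linear) or with an S-dependent circadian amplitude,
# one simple way to express a non-linear interaction.
import numpy as np

hours = np.arange(0, 40, 0.25)               # time awake (h), e.g. total sleep deprivation
tau_rise = 18.2                               # homeostatic time constant (h), assumed
S = 1 - np.exp(-hours / tau_rise)             # saturating build-up of sleep pressure
C = np.sin(2 * np.pi * (hours - 18) / 24)     # circadian process; phase is illustrative

alertness_linear = -S + 0.3 * C                   # additive combination
alertness_nonlinear = -S + (0.15 + 0.3 * S) * C   # circadian amplitude grows with S

print(alertness_linear[:4], alertness_nonlinear[:4])
```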
NASA Astrophysics Data System (ADS)
Goienetxea Uriarte, A.; Ruiz Zúñiga, E.; Urenda Moris, M.; Ng, A. H. C.
2015-05-01
Discrete Event Simulation (DES) is nowadays widely used to support decision makers in system analysis and improvement. However, the use of simulation for improving stochastic logistic processes is not common among healthcare providers. The process of improving healthcare systems involves the necessity to deal with trade-off optimal solutions that take into consideration a multiple number of variables and objectives. Complementing DES with Multi-Objective Optimization (SMO) creates a superior base for finding these solutions and, in consequence, facilitates the decision-making process. This paper presents how SMO has been applied for system improvement analysis in a Swedish Emergency Department (ED). A significant number of input variables, constraints and objectives were considered when defining the optimization problem. As a result of the project, the decision makers were provided with a range of optimal solutions which considerably reduces the length of stay and waiting times for ED patients. SMO has proved to be an appropriate technique to support healthcare system design and improvement processes. A key factor for the success of this project has been the involvement and engagement of the stakeholders during the whole process.
Edlmann, Ellie; Giorgi-Coll, Susan; Whitfield, Peter C; Carpenter, Keri L H; Hutchinson, Peter J
2017-05-30
Chronic subdural haematoma (CSDH) is an encapsulated collection of blood and fluid on the surface of the brain. Historically considered a result of head trauma, recent evidence suggests there are more complex processes involved. Trauma may be absent or very minor and does not explain the progressive, chronic course of the condition. This review focuses on several key processes involved in CSDH development: angiogenesis, fibrinolysis and inflammation. The characteristic membrane surrounding the CSDH has been identified as a source of fluid exudation and haemorrhage. Angiogenic stimuli lead to the creation of fragile blood vessels within membrane walls, whilst fibrinolytic processes prevent clot formation resulting in continued haemorrhage. An abundance of inflammatory cells and markers have been identified within the membranes and subdural fluid and are likely to contribute to propagating an inflammatory response which stimulates ongoing membrane growth and fluid accumulation. Currently, the mainstay of treatment for CSDH is surgical drainage, which has associated risks of recurrence requiring repeat surgery. Understanding of the underlying pathophysiological processes has been applied to developing potential drug treatments. Ongoing research is needed to identify if these therapies are successful in controlling the inflammatory and angiogenic disease processes leading to control and resolution of CSDH.
Andrés-Toro, B; Girón-Sierra, J M; Fernández-Blanco, P; López-Orozco, J A; Besada-Portas, E
2004-04-01
This paper describes empirical research on the modelling, optimization and supervisory control of beer fermentation. Conditions in the laboratory were made as similar as possible to brewery industry conditions. Since mathematical models that consider realistic industrial conditions were not available, a new mathematical model design involving industrial conditions was first developed. Batch fermentations are multiobjective dynamic processes that must be guided along optimal paths to obtain good results. The paper describes a direct way to apply a Pareto set approach with multiobjective evolutionary algorithms (MOEAs). Optimal ways of driving these processes were successfully identified. Once obtained, the mathematical fermentation model was used to optimize the fermentation process by using an intelligent control based on certain rules.
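As a generic illustration of the Pareto-set idea underlying any MOEA-based approach (not the authors' algorithm; the candidate objective values are invented):

```python
# Pareto-dominance filtering: keep the non-dominated trade-offs among candidate solutions.
import numpy as np

def pareto_front(points: np.ndarray) -> np.ndarray:
    """Return indices of non-dominated points (all objectives minimized)."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p) for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(i)
    return np.array(keep)

# e.g. (fermentation time, residual by-product) for a few candidate temperature profiles
candidates = np.array([[120, 0.08], [130, 0.05], [150, 0.05], [125, 0.06]])
print(pareto_front(candidates))   # indices of the non-dominated trade-offs
```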
34 CFR 97.401 - To what do these regulations apply?
Code of Federal Regulations, 2013 CFR
2013-07-01
... involving survey or interview procedures or observations of public behavior does not apply to research... ED Protections for Children Who Are Subjects in Research § 97.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects conducted or supported by the...
34 CFR 97.401 - To what do these regulations apply?
Code of Federal Regulations, 2011 CFR
2011-07-01
... involving survey or interview procedures or observations of public behavior does not apply to research... ED Protections for Children Who Are Subjects in Research § 97.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects conducted or supported by the...
34 CFR 97.401 - To what do these regulations apply?
Code of Federal Regulations, 2010 CFR
2010-07-01
... involving survey or interview procedures or observations of public behavior does not apply to research... ED Protections for Children Who Are Subjects in Research § 97.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects conducted or supported by the...
34 CFR 97.401 - To what do these regulations apply?
Code of Federal Regulations, 2012 CFR
2012-07-01
... involving survey or interview procedures or observations of public behavior does not apply to research... ED Protections for Children Who Are Subjects in Research § 97.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects conducted or supported by the...
34 CFR 97.401 - To what do these regulations apply?
Code of Federal Regulations, 2014 CFR
2014-07-01
... involving survey or interview procedures or observations of public behavior does not apply to research... ED Protections for Children Who Are Subjects in Research § 97.401 To what do these regulations apply? (a) This subpart applies to all research involving children as subjects conducted or supported by the...
Bona, Silvia; Cattaneo, Zaira; Silvanto, Juha
2016-01-01
The right occipital face area (rOFA) is known to be involved in face discrimination based on local featural information. Whether this region is also involved in global, holistic stimulus processing is not known. We used fMRI-guided transcranial magnetic stimulation (TMS) to investigate whether rOFA is causally implicated in stimulus detection based on holistic processing, by the use of Mooney stimuli. Two studies were carried out: In Experiment 1, participants performed a detection task involving Mooney faces and Mooney objects; Mooney stimuli lack distinguishable local features and can be detected solely via holistic processing (i.e. at a global level) with top-down guidance from previously stored representations. Experiment 2 required participants to detect shapes which are recognized via bottom-up integration of local (collinear) Gabor elements and was performed to control for specificity of rOFA's implication in holistic detection. In Experiment 1, TMS over rOFA and rLO impaired detection of all stimulus categories, with no category-specific effect. In Experiment 2, shape detection was impaired when TMS was applied over rLO but not over rOFA. Our results demonstrate that rOFA is causally implicated in the type of top-down holistic detection required by Mooney stimuli and that such role is not face-selective. In contrast, rOFA does not appear to play a causal role in detection of shapes based on bottom-up integration of local components, demonstrating that its involvement in processing non-face stimuli is specific for holistic processing. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Statistical Models and Inference Procedures for Structural and Materials Reliability
1990-12-01
Some general stress-strength models were also developed and applied to the failure of systems subject to cyclic loading. Involved in the failure of... process control ideas and sequential design and analysis methods. Finally, smooth nonparametric quantile function estimators were studied. All
An Evaluation of Teaching-Learning of Drawing at School of Applied Arts, Takoradi Polytechnic
ERIC Educational Resources Information Center
Ofori-Anyinam, Sampong; Andrews, Amoako-Temeng; Ankrah, Owusu-Ansah
2016-01-01
Drawing is described as the basis of all art work when an art idea is conceived. It can only materialize into concrete form when it has gone through a process of designing, which basically involves drawing. The ability of an artist to draw is paramount in the art profession. The basis for selecting students to pursue an art programme is their…
Engineering Design Handbook: Environmental Series. Part Five. Glossary of Environmental Terms
1975-07-31
temperature, surface. shield. In cables, the metallic layer applied over the dielectric, or group of dielectrics, composed of woven, braided, or served... greases, thereby reducing corrosion of metals and hardening of seals. active vibration isolation and absorption systems. Servomechanism-type systems... Usually refers to gravel or crushed rock. Sometimes called road metal (in England) (Ref. 1). aging. A gradual process involving physical
Know the Network, Knit the Network: Applying SNA to N2C2 Maturity Model Experiments
2010-06-01
Networks (COINS) 2009. Procedia - Social and Behavioral Sciences (2009). Snijders, Tom A.B., Christian E. G. Steglich and Michael Schweinberger... patterning that create social structures. As an interdisciplinary behavioural science specialty, SNA defends that social actors are interdependent... production of social science data involve a process of interpretation. To carry out such interpretation robustly it is understood that it is imperative to
Developing nursing care plans.
Ballantyne, Helen
2016-02-24
This article aims to enhance nurses' understanding of nursing care plans, reflecting on the past, present and future use of care planning. This involves consideration of the central theories of nursing and discussion of nursing models and the nursing process. An explanation is provided of how theories of nursing may be applied to care planning, in combination with clinical assessment tools, to ensure that care plans are context specific and patient centred.
ERIC Educational Resources Information Center
St. John, Edward P.; Loescher, Siri; Jacob, Stacy; Cekic, Osman; Kupersmith, Leigh; Musoba, Glenda Droogsma
A growing number of schools are exploring the prospect of applying for funding to implement a Comprehensive School Reform (CSR) model. But the process of selecting a CSR model can be complicated because it frequently involves self-study and a review of models to determine which models best meet the needs of the school. This study guide is intended…
Pérez, Rosa Ana; Albero, Beatriz; Tadeo, José Luis; Sánchez-Brunete, Consuelo
2016-11-01
A rapid extraction procedure is presented for the determination of five endocrine-disrupting compounds (estrone, ethinylestradiol, bisphenol A, triclosan, and 2-ethylhexylsalicylate) in water samples. The analysis involves a two-step extraction procedure that combines dispersive liquid-liquid microextraction (DLLME) with dispersive micro-solid phase extraction (D-μ-SPE), using magnetic nanoparticles, followed by in situ derivatization in the injection port of a gas chromatograph coupled to triple quadrupole mass spectrometry. The use of uncoated or oleate-coated Fe3O4 nanoparticles as sorbent in the extraction process was evaluated and compared. The main parameters involved in the extraction process were optimized by applying experimental designs. Uncoated Fe3O4 nanoparticles were selected in order to simplify the procedure and make it more cost-effective. DLLME was carried out at pH 3 for 2 min, followed by the addition of the nanoparticles for D-μ-SPE, with an extraction time of 1 min. Analysis of spiked water samples from different sources gave satisfactory recovery results for all the compounds, with detection limits ranging from 7 to 180 ng l⁻¹. Finally, the procedure was applied to tap, well, and river water. Graphical abstract: Diagram of the extraction method using magnetic nanoparticles (MNPs).
Guidelines for safe handling of hazardous drugs: A systematic review
Bernabeu-Martínez, Mari A.; Ramos Merino, Mateo; Santos Gago, Juan M.; Álvarez Sabucedo, Luis M.; Wanden-Berghe, Carmina
2018-01-01
Objective: To review the scientific literature related to the safe handling of hazardous drugs (HDs). Method: Critical analysis of works retrieved from MEDLINE, the Cochrane Library, Scopus, CINHAL, Web of Science and LILACS using the terms "Hazardous Substances", "Antineoplastic Agents" and "Cytostatic Agents", applying "Humans" and "Guidelines" as filters. Date of search: January 2017. Results: In total, 1100 references were retrieved, and from those, 61 documents were selected based on the inclusion and exclusion criteria: 24 (39.3%) documents related to recommendations about HDs; 27 (44.3%) about antineoplastic agents, and 10 (33.3%) about other types of substances (monoclonal antibodies, gene medicine and other chemical and biological agents). In 14 (23.3%) guides, all the stages in the manipulation process involving a risk due to exposure were considered. Only one guide addressed all stages of the handling process of HDs (including stages with and without the risk of exposure). The most described stages were drug preparation (41 guides, 67.2%), staff training and/or patient education (38 guides, 62.3%), and administration (37 guides, 60.7%). No standardized informatics system was found that ensured quality management, traceability and minimization of the risks associated with these drugs. Conclusions: Most of the analysed guidelines limit their recommendations to the manipulation of antineoplastics. The most frequently described activities were preparation, training, and administration. It would be convenient to apply ICTs (Information and Communications Technologies) to manage processes involving HDs in a more complete and simpler fashion. PMID:29750798
Guidelines for safe handling of hazardous drugs: A systematic review.
Bernabeu-Martínez, Mari A; Ramos Merino, Mateo; Santos Gago, Juan M; Álvarez Sabucedo, Luis M; Wanden-Berghe, Carmina; Sanz-Valero, Javier
2018-01-01
To review the scientific literature related to the safe handling of hazardous drugs (HDs). Critical analysis of works retrieved from MEDLINE, the Cochrane Library, Scopus, CINHAL, Web of Science and LILACS using the terms "Hazardous Substances", "Antineoplastic Agents" and "Cytostatic Agents", applying "Humans" and "Guidelines" as filters. Date of search: January 2017. In total, 1100 references were retrieved, and from those, 61 documents were selected based on the inclusion and exclusion criteria: 24 (39.3%) documents related to recommendations about HDs; 27 (44.3%) about antineoplastic agents, and 10 (33.3%) about other types of substances (monoclonal antibodies, gene medicine and other chemical and biological agents). In 14 (23.3%) guides, all the stages in the manipulation process involving a risk due to exposure were considered. Only one guide addressed all stages of the handling process of HDs (including stages with and without the risk of exposure). The most described stages were drug preparation (41 guides, 67.2%), staff training and/or patient education (38 guides, 62.3%), and administration (37 guides, 60.7%). No standardized informatics system was found that ensured quality management, traceability and minimization of the risks associated with these drugs. Most of the analysed guidelines limit their recommendations to the manipulation of antineoplastics. The most frequently described activities were preparation, training, and administration. It would be convenient to apply ICTs (Information and Communications Technologies) to manage processes involving HDs in a more complete and simpler fashion.
Investigation of autofocus algorithms for brightfield microscopy of unstained cells
NASA Astrophysics Data System (ADS)
Wu, Shu Yu; Dugan, Nazim; Hennelly, Bryan M.
2014-05-01
In the past decade there has been significant interest in image processing for brightfield cell microscopy. Much of the previous research on image processing for microscopy has focused on fluorescence microscopy, including cell counting, cell tracking, cell segmentation and autofocusing. Fluorescence microscopy provides functional image information but requires the use of labels in the form of chemical stains or dyes. For some applications, where the biochemical integrity of the cell must remain unchanged so that sensitive chemical testing can later be applied, staining must be avoided. For this reason the challenge of processing images of unstained cells has become a topic of increasing attention. These cells are often effectively transparent and appear to have a homogeneous intensity profile when they are in focus. Brightfield microscopy is the most universally available and most widely used form of optical microscopy, and for this reason we are interested in investigating image processing of unstained cells recorded using a standard brightfield microscope. In this paper we investigate a range of different autofocus metrics applied to unstained bladder cancer cell lines imaged with a standard inverted brightfield microscope using objectives with high magnification and numerical aperture. We present a number of conclusions on the optimum metrics and the manner in which they should be applied for this application.
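The specific focus metrics compared in the study are not enumerated in the abstract; as a hedged illustration only, the sketch below (Python, with hypothetical function names and grayscale frames as NumPy arrays) shows two commonly used passive autofocus measures, the variance of the Laplacian and the Tenengrad gradient measure, scored across a focal stack.

```python
import numpy as np
from scipy import ndimage

def variance_of_laplacian(img):
    # In-focus images carry more high-frequency content, so the Laplacian
    # response has a larger variance near best focus.
    return float(ndimage.laplace(img.astype(float)).var())

def tenengrad(img):
    # Sobel gradient-magnitude focus measure.
    gx = ndimage.sobel(img.astype(float), axis=0)
    gy = ndimage.sobel(img.astype(float), axis=1)
    return float(np.mean(gx**2 + gy**2))

def best_focus_index(z_stack, metric=variance_of_laplacian):
    # z_stack: sequence of 2D grayscale frames captured at different focal positions.
    scores = [metric(frame) for frame in z_stack]
    return int(np.argmax(scores))
```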
Strakova, Eva; Zikova, Alice; Vohradsky, Jiri
2014-01-01
A computational model of gene expression was applied to a novel test set of microarray time series measurements to reveal regulatory interactions between transcriptional regulators represented by 45 sigma factors and the genes expressed during germination of a prokaryote Streptomyces coelicolor. Using microarrays, the first 5.5 h of the process was recorded in 13 time points, which provided a database of gene expression time series on genome-wide scale. The computational modeling of the kinetic relations between the sigma factors, individual genes and genes clustered according to the similarity of their expression kinetics identified kinetically plausible sigma factor-controlled networks. Using genome sequence annotations, functional groups of genes that were predominantly controlled by specific sigma factors were identified. Using external binding data complementing the modeling approach, specific genes involved in the control of the studied process were identified and their function suggested.
Leveraging design thinking to build sustainable mobile health systems.
Eckman, Molly; Gorski, Irena; Mehta, Khanjan
Mobile health, or mHealth, technology has the potential to improve health care access in the developing world. However, the majority of mHealth projects do not expand beyond the pilot stage. A core reason is that they do not account for the individual needs and wants of those involved. A collaborative approach is needed to integrate the perspectives of all stakeholders into the design and operation of mHealth endeavours. Design thinking is a methodology used to develop and evaluate novel concepts for systems. With roots in participatory processes and self-determined pathways, design thinking provides a compelling framework for understanding and applying the needs of diverse stakeholders to mHealth project development through a highly iterative process. The methodology presented in this article provides a structured approach for applying design thinking principles to assess the feasibility of novel mHealth endeavours during early conceptualisation.
Applying Current Concepts in Pain-Related Brain Science to Dance Rehabilitation.
Wallwork, Sarah B; Bellan, Valeria; Moseley, G Lorimer
2017-03-01
Dance involves exemplary sensory-motor control, which is subserved by sophisticated neural processing at the spinal cord and brain level. Such neural processing is altered in the presence of nociception and pain, and the adaptations within the central nervous system that are known to occur with persistent nociception or pain have clear implications for movement and, indeed, risk of further injury. Recent rapid advances in our understanding of the brain's representation of the body and the role of cortical representations, or "neurotags," in bodily protection and regulation have given rise to new strategies that are gaining traction in sports medicine. Those strategies are built on the principles that govern the operation of neurotags and focus on minimizing the impact of pain, injury, and immobilization on movement control and optimal performance. Here we apply empirical evidence from the chronic pain clinical neurosciences to introduce new opportunities for rehabilitation after dance injury.
A guide to structural factors for advanced composites used on spacecraft
NASA Technical Reports Server (NTRS)
Vanwagenen, Robert
1989-01-01
The use of composite materials in spacecraft systems is constantly increasing. Although the areas of composite design and fabrication are maturing, they remain distinct from the same activities performed using conventional materials and processes. This has led to some confusion regarding the precise meaning of the term 'factor of safety' as it applies to these structures. In addition, composite engineering introduces terms such as 'knock-down factors' to further modify material properties for design purposes. This guide is intended to clarify these terms as well as their use in the design of composite structures for spacecraft. It is particularly intended to be used by the engineering community not involved in the day-to-day composites design process. An attempt is also made to explain the wide range of factors of safety encountered in composite designs as well as their relationship to the 1.4 factor of safety conventionally applied to metallic structures.
Fracture Tests of Etched Components Using a Focused Ion Beam Machine
NASA Technical Reports Server (NTRS)
Kuhn, Jonathan, L.; Fettig, Rainer K.; Moseley, S. Harvey; Kutyrev, Alexander S.; Orloff, Jon; Powers, Edward I. (Technical Monitor)
2000-01-01
Many optical MEMS device designs involve large arrays of thin (0.5 to 1 micron) components subjected to high stresses due to cyclic loading. These devices are fabricated from a variety of materials, and the properties strongly depend on size and processing. Our objective is to develop standard and convenient test methods that can be used to measure the properties of large numbers of witness samples for every device we build. In this work we explore a variety of fracture test configurations for 0.5 micron thick silicon nitride membranes machined using the Reactive Ion Etching (RIE) process. Testing was completed using an FEI 620 dual focused ion beam milling machine. Static loads were applied using a probe, and dynamic loads were applied through a piezo-electric stack mounted at the base of the probe. Results from the tests are presented and compared, and their application to predicting the fracture probability of large arrays of devices is considered.
Cárdenas-García, Maura; González-Pérez, Pedro Pablo
2013-03-01
Apoptotic cell death plays a crucial role in development and homeostasis. This process is driven by mitochondrial permeabilization and activation of caspases. In this paper we adopt a tuple spaces-based modelling and simulation approach and show how it can be applied to the simulation of this intracellular signalling pathway. Specifically, we aim to explore and understand the complex interaction patterns of the apoptotic caspases and the role of the mitochondria. As a first approximation, using the tuple spaces-based in silico approach, we model and simulate both the extrinsic and intrinsic apoptotic signalling pathways and the interactions between them. During apoptosis, mitochondrial proteins released from the mitochondria into the cytosol are decisively involved in the process. If the decision is to die, there is normally no return from this point; cancer cells, however, offer resistance to mitochondrial induction.
Cárdenas-García, Maura; González-Pérez, Pedro Pablo
2013-04-11
Apoptotic cell death plays a crucial role in development and homeostasis. This process is driven by mitochondrial permeabilization and activation of caspases. In this paper we adopt a tuple spaces-based modelling and simulation approach and show how it can be applied to the simulation of this intracellular signalling pathway. Specifically, we aim to explore and understand the complex interaction patterns of the apoptotic caspases and the role of the mitochondria. As a first approximation, using the tuple spaces-based in silico approach, we model and simulate both the extrinsic and intrinsic apoptotic signalling pathways and the interactions between them. During apoptosis, mitochondrial proteins released from the mitochondria into the cytosol are decisively involved in the process. If the decision is to die, there is normally no return from this point; cancer cells, however, offer resistance to mitochondrial induction.
Farias, Diego Carlos; Araujo, Fernando Oliveira de
2017-06-01
Hospitals are complex organizations which, in addition to the technical assistance expected in the context of treatment and prevention of health hazards, also require good management practices aimed at improving efficiency in their core business. However, in administrative terms, recurrent conflicts arise involving technical and managerial areas. Thus, this article sets out to conduct a review of the scientific literature pertaining to the themes of hospital management and projects that have been applied in the hospital context. In terms of methodology, the study adopts the webibliomining method of collection and systematic analysis of knowledge in indexed journal databases. The results show a greater interest on the part of researchers in seeking a more vertically and horizontally dialogical administration, better definition of work processes, innovative technological tools to support the management process and, finally, the possibility of applying project management methodologies in collaboration with hospital management.
Discontinuity Detection in the Shield Metal Arc Welding Process
Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros
2017-01-01
This work proposes a new methodology for the detection of discontinuities in the weld bead applied in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors, a microphone and a piezoelectric sensor, that acquire acoustic emissions generated during welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. The approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify with high accuracy the three proposed weld bead classes: desirable weld bead, shrinkage cavity and burn-through discontinuities. Experimental results illustrate the system's high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make the use of this system feasible in industrial environments. This approach presented 96.6% overall accuracy. Given the simplicity of the equipment involved, this system can be applied in the metal transformation industries. PMID:28489045
Discontinuity Detection in the Shield Metal Arc Welding Process.
Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros
2017-05-10
This work proposes a new methodology for the detection of discontinuities in the weld bead applied in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors, a microphone and a piezoelectric sensor, that acquire acoustic emissions generated during welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. The approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify with high accuracy the three proposed weld bead classes: desirable weld bead, shrinkage cavity and burn-through discontinuities. Experimental results illustrate the system's high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make the use of this system feasible in industrial environments. This approach presented 96.6% overall accuracy. Given the simplicity of the equipment involved, this system can be applied in the metal transformation industries.
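The exact feature extraction and SVM settings are not given in these abstracts; the following minimal sketch, with placeholder data and hypothetical variable names, only illustrates the general pattern of training an SVM on acoustic-emission feature vectors for the three weld bead classes.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: rows are feature vectors extracted from the microphone and
# piezoelectric signals; labels 0 = desirable bead, 1 = shrinkage cavity,
# 2 = burn-through (illustrative only, not the study's dataset).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = rng.integers(0, 3, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```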
Weismann, Wittgenstein and the homunculus fallacy.
Smit, Harry
2010-09-01
A problem that has troubled both neo-Darwinists and neo-Lamarckians is whether instincts involve knowledge. This paper discusses the contributions to this problem of the evolutionary biologist August Weismann and the philosopher Ludwig Wittgenstein. Weismann discussed an empirical homunculus fallacy: Lamarck's theory mistakenly presupposes a homunculus in the germ cells. Wittgenstein discussed a conceptual homunculus fallacy which applies to Lamarck's theory: it is mistaken to suppose that knowledge is stored in the brain or DNA. The upshot of these two fallacies is that instincts arise through a neo-Darwinian process but are not cognitions in the sense that they involve (the recollection of stored) knowledge. Although neo-Lamarckians have rightly argued that learning processes may contribute to the development of instincts, their ideas about the role of knowledge in the evolution and development of instincts are mistaken. Copyright © 2010 Elsevier Ltd. All rights reserved.
Timing of translation in cross-language qualitative research.
Santos, Hudson P O; Black, Amanda M; Sandelowski, Margarete
2015-01-01
Although there is increased understanding of language barriers in cross-language studies, the point at which language transformation processes are applied in research is inconsistently reported, or treated as a minor issue. Differences in translation timeframes raise methodological issues related to the material to be translated, as well as for the process of data analysis and interpretation. In this article we address methodological issues related to the timing of translation from Portuguese to English in two international cross-language collaborative research studies involving researchers from Brazil, Canada, and the United States. One study entailed late-phase translation of a research report, whereas the other study involved early phase translation of interview data. The timing of translation in interaction with the object of translation should be considered, in addition to the language, cultural, subject matter, and methodological competencies of research team members. © The Author(s) 2014.
NASA Astrophysics Data System (ADS)
McCune, Matthew; Shafiee, Ashkan; Forgacs, Gabor; Kosztin, Ioan
2014-03-01
Cellular Particle Dynamics (CPD) is an effective computational method for describing and predicting the time evolution of biomechanical relaxation processes of multicellular systems. A typical example is the fusion of spheroidal bioink particles during post bioprinting structure formation. In CPD cells are modeled as an ensemble of cellular particles (CPs) that interact via short-range contact interactions, characterized by an attractive (adhesive interaction) and a repulsive (excluded volume interaction) component. The time evolution of the spatial conformation of the multicellular system is determined by following the trajectories of all CPs through integration of their equations of motion. CPD was successfully applied to describe and predict the fusion of 3D tissue construct involving identical spherical aggregates. Here, we demonstrate that CPD can also predict tissue formation involving uneven spherical aggregates whose volumes decrease during the fusion process. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
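The CPD force field and integrator are not specified in the abstract; as a loose illustration only, the sketch below assumes a Lennard-Jones-like short-range potential (repulsive core for excluded volume, attractive well for adhesion) and an overdamped Brownian update, which captures the general structure of following CP trajectories.

```python
import numpy as np

def pair_force(r_vec, eps=1.0, sigma=1.0, r_cut=2.5):
    # Short-range contact interaction: repulsive core (excluded volume) plus an
    # attractive well (adhesion), modelled here with a Lennard-Jones-like form.
    r = np.linalg.norm(r_vec)
    if r == 0.0 or r >= r_cut:
        return np.zeros_like(r_vec)
    sr6 = (sigma / r) ** 6
    magnitude = 24.0 * eps * (2.0 * sr6**2 - sr6) / r
    return magnitude * r_vec / r

def brownian_step(positions, dt=1e-3, gamma=1.0, kT=0.1, rng=np.random.default_rng()):
    # One overdamped update of all cellular particles (CPs).
    forces = np.zeros_like(positions)
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            f = pair_force(positions[i] - positions[j])
            forces[i] += f
            forces[j] -= f
    noise = np.sqrt(2.0 * kT * dt / gamma) * rng.normal(size=positions.shape)
    return positions + dt * forces / gamma + noise
```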
Grotheer, Mareike; Ambrus, Géza Gergely; Kovács, Gyula
2016-05-15
Recent research suggests the existence of a visual area selectively processing numbers in the human inferior temporal cortex (number form area (NFA); Abboud et al., 2015; Grotheer et al., 2016; Shum et al., 2013). The NFA is thought to be involved in the preferential encoding of numbers over false characters, letters and non-number words (Grotheer et al., 2016; Shum et al., 2013), independently of the sensory modality (Abboud et al., 2015). However, it is not yet clear if this area is mandatory for normal number processing. The present study exploited the fact that high-resolution fMRI can be applied to identify the NFA individually (Grotheer et al., 2016) and tested if transcranial magnetic stimulation (TMS) of this area interferes with stimulus processing in a selective manner. Double-pulse TMS targeted at the right NFA significantly impaired the detection of briefly presented and masked Arabic numbers in comparison to vertex stimulation. This suggests the NFA to be necessary for fluent number processing. Surprisingly, TMS of the NFA also impaired the detection of Roman letters. On the other hand, stimulation of the lateral occipital complex (LO) had neither an effect on the detection of numbers nor on letters. Our results show, for the first time, that the NFA is causally involved in the early visual processing of numbers as well as of letters. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Russell, E. E.; Chandos, R. A.; Kodak, J. C.; Pellicori, S. F.; Tomasko, M. G.
1974-01-01
The constraints that are imposed on the Outer Planet Missions (OPM) imager design are of critical importance. Imager system modeling analyses define important parameters and systematic means for trade-offs applied to specific Jupiter orbiter missions. Possible image sequence plans for Jupiter missions are discussed in detail. Considered is a series of orbits that allow repeated near encounters with three of the Jovian satellites. The data handling involved in the image processing is discussed, and it is shown that only minimal processing is required for the majority of images for a Jupiter orbiter mission.
An Overview of the State of the Art in Atomistic and Multiscale Simulation of Fracture
NASA Technical Reports Server (NTRS)
Saether, Erik; Yamakov, Vesselin; Phillips, Dawn R.; Glaessgen, Edward H.
2009-01-01
The emerging field of nanomechanics is providing a new focus in the study of the mechanics of materials, particularly in simulating fundamental atomic mechanisms involved in the initiation and evolution of damage. Simulating fundamental material processes using first principles in physics strongly motivates the formulation of computational multiscale methods to link macroscopic failure to the underlying atomic processes from which all material behavior originates. This report gives an overview of the state of the art in applying concurrent and sequential multiscale methods to analyze damage and failure mechanisms across length scales.
Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes
NASA Astrophysics Data System (ADS)
Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico
2017-12-01
Chemical compositional simulation of enhanced oil recovery and surfactant enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of the newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows applying a sequential solution approach to solve the governing equations separately and implicitly. Through its application to the numerical experiment using a wettability alteration model and comparisons with existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.
Using hypnosis to help deaf children help themselves: report of two cases.
Kohen, D P; Mann-Rinehart, P; Schmitz, D; Wills, L M
1998-04-01
This is a report of deaf children who demonstrated the ability to quickly learn hypnotic skills and apply them effectively to the management of their problems. The children were taught hypnosis through American Sign Language, their preferred mode of communication. As with hypnosis with hearing children, we focused upon induction with fantasy and imaginative involvement, creation in imagination of a metaphor for, or imagery of, the desired outcome, and associated sense of pride (ego-strengthening), positive expectation, and teaching self-hypnosis to emphasize the importance of repeated, daily practice. Case examples presented are an 11-year-old deaf girl who used hypnosis to eliminate multiple warts, and a 9-year-old deaf boy with mild developmental disability whose self-hypnosis skills were applied to the management of myoclonus. In the former, the clinician is also the sign language communicator and in the latter, a professional sign language interpreter and parent are both intimately involved in the communication and hypnosis process.
Implementation of a Web-Based Collaborative Process Planning System
NASA Astrophysics Data System (ADS)
Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi
Under the networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining and assembling, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be finished collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, shared resources provided by cooperative enterprises in the course of collaboration are classified into seven classes. A reconfigurable and extendable resource object model is then built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.
Lightweight Concrete Produced Using a Two-Stage Casting Process.
Yoon, Jin Young; Kim, Jae Hong; Hwang, Yoon Yi; Shin, Dong Kyu
2015-03-25
The type of lightweight aggregate and its volume fraction in a mix determine the density of lightweight concrete. Minimizing the density obviously requires a higher volume fraction, but this usually causes aggregate segregation in a conventional mixing process. This paper proposes a two-stage casting process to produce a lightweight concrete. This process involves placing lightweight aggregates in a frame and then filling the remaining interstitial voids with cementitious grout. The casting process results in the lowest density of lightweight concrete, which consequently has low compressive strength. The irregularly shaped aggregates compensate for this weak point in terms of strength, while the round-shaped aggregates provide a strength of 20 MPa. Therefore, the proposed casting process can be applied to manufacturing non-structural elements and structural composites requiring a very low density and a strength of at most 20 MPa.
Falcone, U; Gilardi, Luisella; Pasqualini, O; Santoro, S; Coffano, Elena
2010-01-01
Exposure to carcinogens is still widespread in working environments. For the purpose of defining priorities for intervention, it is necessary to estimate the number and the geographic distribution of workers potentially exposed to carcinogens. It could therefore be useful to test the use of tools and information sources already available in order to map the distribution of exposure to carcinogens. Formaldehyde is used as an example of an occupational carcinogen in this study. The study aimed at verifying and investigating the potential of 3 integrated databases, MATline, CAREX, and the company databases resulting from occupational accident and disease claims (INAIL), in order to estimate the number of workers exposed to formaldehyde and map their distribution in the Piedmont Region. The list of manufacturing processes involving exposure to formaldehyde was obtained from MATline; for each process the number of firms and employees was obtained from the INAIL archives. By applying the prevalence of exposed workers obtained with CAREX, an estimate of exposure for each process was determined. A map of the distribution of employees associated with a specific process was produced using ArcView GIS software. It was estimated that more than 13,000 employees are exposed to formaldehyde in the Piedmont Region. The manufacture of furniture was identified as the process with the highest number of workers exposed to formaldehyde (3,130), followed by metal workers (2,301 exposed) and synthetic resin processing (1,391 exposed). The results obtained from the integrated use of databases provide a basis for defining the priority of preventive interventions required in the industrial processes involving exposure to carcinogens in the Piedmont Region.
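The estimation step reduces to multiplying, for each manufacturing process, the number of employees from the INAIL archives by the CAREX exposure prevalence; the snippet below is a hedged illustration with made-up placeholder figures, not the study's actual inputs.

```python
# Exposed workers per process: employees (INAIL) x exposure prevalence (CAREX).
# The figures below are illustrative placeholders only.
processes = {
    "furniture manufacture":      (20000, 0.15),
    "metal working":              (30000, 0.08),
    "synthetic resin processing": (7000, 0.20),
}
for name, (employees, prevalence) in processes.items():
    print(f"{name}: ~{round(employees * prevalence)} exposed")
```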
Effects of Pump-turbine S-shaped Characteristics on Transient Behaviours: Model Setup
NASA Astrophysics Data System (ADS)
Zeng, Wei; Yang, Jiandong; Hu, Jinhong
2017-04-01
Pumped storage stations undergo numerous transition processes, which make the pump turbines pass through the unstable S-shaped region. Hydraulic transients in the S-shaped region have normally been investigated through numerical simulations, while field experiments generally involve high risks and are difficult to perform. In this research, a pumped storage model composed of a piping system, two model units, two electrical control systems, a measurement system and a collection system was set up to study the transition processes. The model platform can be applied to simulate almost any hydraulic transition process that occurs in real power stations, such as load rejection, startup, frequency control and grid connection.
Loncke, Filip T; Campbell, Jamie; England, Amanda M; Haley, Tanya
2006-02-15
Message generation is a complex process involving a number of subprocesses, including the selection of which modes to use. When expressing a message, human communicators typically use a combination of modes, a phenomenon often termed multimodality. This article explores the use of models that explain multimodality as an explanatory framework for augmentative and alternative communication (AAC). Multimodality is analysed from a communication, psycholinguistic, and cognitive perspective. Theoretical and applied topics within AAC can be explained or described within the multimodality framework, considering iconicity, simultaneous communication, lexical organization, and compatibility of communication modes. Consideration of multimodality is critical to understanding underlying processes in individuals who use AAC and in those who interact with them.
Fabrication of boron sputter targets
Makowiecki, Daniel M.; McKernan, Mark A.
1995-01-01
A process for fabricating high density boron sputtering targets with sufficient mechanical strength to function reliably at typical magnetron sputtering power densities and at normal process parameters. The process involves the fabrication of a high density boron monolith by hot isostatically compacting high purity (99.9%) boron powder, machining the boron monolith to its final dimensions, and brazing the finished boron piece to a matching boron carbide (B4C) piece by placing aluminum foil between them and applying pressure and heat in a vacuum. An alternative is the application of aluminum metallization to the back of the boron monolith by vacuum deposition. Also, a titanium based vacuum braze alloy can be used in place of the aluminum foil.
Solder extrusion pressure bonding process and bonded products produced thereby
Beavis, L.C.; Karnowsky, M.M.; Yost, F.G.
1992-06-16
Disclosed is a process for production of soldered joints which are highly reliable and capable of surviving 10,000 thermal cycles between about -40 C and 110 C. The process involves interposing a thin layer of a metal solder composition between the metal surfaces of members to be bonded and applying heat and up to about 1000 psi compression pressure to the superposed members, in the presence of a reducing atmosphere, to extrude the major amount of the solder composition, together with contaminants including fluxing gases and air, from between the members being bonded, to form a very thin, strong intermetallic bonding layer having a thermal expansion tolerant with that of the bonded members.
Virkki, Tuija
2015-06-01
This article examines social and health care professionals' views, based on their encounters with both victims and perpetrators, on the division of responsibility in the process of ending intimate partner violence. Applying discourse analysis to focus group discussions with a total of 45 professionals on solutions to the problem, several positions of responsible agency in which professionals place themselves and their clients are identified. The results suggest that one key to understanding the complexities involved in violence intervention lies in a more adequate theorization of the temporal and intersubjective dimensions of the process of assigning responsibility for the problem. © The Author(s) 2015.
15 CFR 712.1 - Round to zero rule that applies to activities involving Schedule 1 chemicals.
Code of Federal Regulations, 2011 CFR
2011-01-01
... CHEMICAL WEAPONS CONVENTION REGULATIONS ACTIVITIES INVOLVING SCHEDULE 1 CHEMICALS § 712.1 Round to zero rule that applies to activities involving Schedule 1 chemicals. Facilities that produce, export or... activities involving Schedule 1 chemicals. 712.1 Section 712.1 Commerce and Foreign Trade Regulations...
Gianelo, M.C.S.; Polizzelo, J.C.; Chesca, D.; Mattiello-Sverzut, A.C.
2015-01-01
The aim of this study was to determine the effects of intermittent passive manual stretching on various proteins involved in force transmission in skeletal muscle. Female Wistar weanling rats were randomly assigned to 5 groups: 2 control groups containing 21- and 30-day-old rats that received neither immobilization nor stretching, and 3 test groups that received 1) passive stretching over 3 days, 2) immobilization for 7 days and then passive stretching over 3 days, or 3) immobilization for 7 days. Maximal plantar flexion in the right hind limb was imposed, and the stretching protocol of 10 repetitions of 30 s stretches was applied. The soleus muscles were harvested and processed for HE and picrosirius staining; immunohistochemical analysis of collagen types I, III, IV, desmin, and vimentin; and immunofluorescence labeling of dystrophin and CD68. The numbers of desmin- and vimentin-positive cells were significantly decreased compared with those in the control following immobilization, regardless of whether stretching was applied (P<0.05). In addition, the semi-quantitative analysis showed that collagen type I was increased and type IV was decreased in the immobilized animals, regardless of whether the stretching protocol was applied. In conclusion, the largest changes in response to stretching were observed in muscles that had been previously immobilized, and the stretching protocol applied here did not mitigate the immobilization-induced muscle changes. Muscle disuse adversely affected several proteins involved in the transmission of forces between the intracellular and extracellular compartments. Thus, the 3-day rehabilitation period tested here did not provide sufficient time for the muscles to recover from the disuse maladaptations in animals undergoing postnatal development. PMID:26648091
Gianelo, M C S; Polizzelo, J C; Chesca, D; Mattiello-Sverzut, A C
2016-02-01
The aim of this study was to determine the effects of intermittent passive manual stretching on various proteins involved in force transmission in skeletal muscle. Female Wistar weanling rats were randomly assigned to 5 groups: 2 control groups containing 21- and 30-day-old rats that received neither immobilization nor stretching, and 3 test groups that received 1) passive stretching over 3 days, 2) immobilization for 7 days and then passive stretching over 3 days, or 3) immobilization for 7 days. Maximal plantar flexion in the right hind limb was imposed, and the stretching protocol of 10 repetitions of 30 s stretches was applied. The soleus muscles were harvested and processed for HE and picrosirius staining; immunohistochemical analysis of collagen types I, III, IV, desmin, and vimentin; and immunofluorescence labeling of dystrophin and CD68. The numbers of desmin- and vimentin-positive cells were significantly decreased compared with those in the control following immobilization, regardless of whether stretching was applied (P<0.05). In addition, the semi-quantitative analysis showed that collagen type I was increased and type IV was decreased in the immobilized animals, regardless of whether the stretching protocol was applied. In conclusion, the largest changes in response to stretching were observed in muscles that had been previously immobilized, and the stretching protocol applied here did not mitigate the immobilization-induced muscle changes. Muscle disuse adversely affected several proteins involved in the transmission of forces between the intracellular and extracellular compartments. Thus, the 3-day rehabilitation period tested here did not provide sufficient time for the muscles to recover from the disuse maladaptations in animals undergoing postnatal development.
Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M
2009-10-15
A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate through new or additional clinical testing that the safety and efficacy of the product remain unchanged. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space. This means that a manufacturer can optimize the process within the submitted ranges after the product has entered the market, which allows flexible processes. In this article, the applicability of this concept of the process design space is investigated for the cultivation process step of a vaccine against whooping cough disease. An experimental design (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data from all processes are integrated into a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied to an undefined biological product such as a whole cell vaccine. The approach chosen for model development described here allows on-line monitoring and control of cultivation batches in order to assure in real time that a process is running within the process design space.
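The abstract does not state which multivariate technique underlies the batch monitoring model; a common choice in PAT applications is a PCA model of historical in-specification batches with a Hotelling T² statistic for new batches, sketched below under that assumption with placeholder data.

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder training set: each row is one historical in-specification batch,
# unfolded into a (time x variable) vector of on-line measurements.
rng = np.random.default_rng(1)
X_good = rng.normal(size=(30, 200))
mu, sd = X_good.mean(axis=0), X_good.std(axis=0) + 1e-9
pca = PCA(n_components=3).fit((X_good - mu) / sd)

def hotelling_t2(x_new):
    # Distance of a new batch from the normal-operation model; large values
    # flag a batch drifting outside the modelled design space.
    scores = pca.transform(((x_new - mu) / sd).reshape(1, -1))[0]
    return float(np.sum(scores**2 / pca.explained_variance_))

print(hotelling_t2(rng.normal(size=200)))
```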
NASA Astrophysics Data System (ADS)
Mwaniki, M. W.; Kuria, D. N.; Boitt, M. K.; Ngigi, T. G.
2017-04-01
Image enhancements lead to improved performance and increased accuracy of feature extraction, recognition, identification, classification and hence change detection. This increases the utility of remote sensing for environmental applications and aids disaster monitoring of geohazards involving large areas. The main aim of this study was to compare the effect of image enhancement applied to synthetic aperture radar (SAR) data and Landsat 8 imagery in landslide identification and mapping. The methodology involved pre-processing Landsat 8 imagery, image co-registration and despeckling of the SAR data, after which the Landsat 8 imagery was enhanced by Principal and Independent Component Analysis (PCA and ICA), a spectral index involving bands 7 and 4, and a False Colour Composite (FCC) built from the components bearing the most geologic information. The SAR data were processed using textural and edge filters, and SAR incoherence was computed. The enhanced spatial, textural and edge information from the SAR data was combined with the spectral information from the Landsat 8 imagery during the knowledge-based classification. The methodology was tested in the central highlands of Kenya, characterized by rugged terrain and frequent rainfall-induced landslides. The results showed that the SAR data complemented the Landsat 8 data, whose spectral information was enriched by the FCC with enhanced geologic information. The SAR classification depicted landslides along the ridges and lineaments, important information lacking in the Landsat 8 image classification. The success of landslide identification and classification was attributed to the enhancement of geologic features by spectral, textural and roughness properties.
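The exact form of the band 7/band 4 spectral index is not given in the abstract; one plausible reading, sketched below as an assumption, is a normalized difference of the two bands (SWIR2 and red in Landsat 8).

```python
import numpy as np

def band_7_4_index(band7, band4, eps=1e-9):
    # Normalized difference of Landsat 8 band 7 (SWIR2) and band 4 (red),
    # one plausible form of an index highlighting exposed soil and rock.
    b7 = band7.astype(float)
    b4 = band4.astype(float)
    return (b7 - b4) / (b7 + b4 + eps)

# Usage (hypothetical arrays): index = band_7_4_index(swir2_band, red_band)
```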
A numerical model to simulate foams during devolatilization of polymers
NASA Astrophysics Data System (ADS)
Khan, Irfan; Dixit, Ravindra
2014-11-01
Customers often demand that the polymers sold in the market have low levels of volatile organic compounds (VOC). Some of the processes for making polymers involve the removal of volatiles to the levels of parts per million (devolatilization). During this step the volatiles are phase-separated out of the polymer through a combination of heating and applying lower pressure, creating a foam with the pure polymer in the liquid phase and the volatiles in the gas phase. The efficiency of the devolatilization process depends on accurately predicting the onset of solvent phase change in the polymer and volatiles mixture based on the processing conditions. However, due to the complex relationship between the polymer properties and the processing conditions, this is not trivial. In this work, a bubble-scale model is coupled with a bulk-scale transport model to simulate the processing conditions of polymer devolatilization. The bubble-scale model simulates nucleation and bubble growth based on classical nucleation theory and the popular "influence volume approach". As such, it provides information on the bubble size distribution and number density inside the polymer at any given time and position. This information is used to predict the bulk properties of the polymer and its behavior under the applied processing conditions. Initial results of this modeling approach will be presented.
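The abstract cites classical nucleation theory; a hedged sketch of the homogeneous-nucleation barrier and rate, with placeholder material parameters and a hypothetical kinetic prefactor, is given below.

```python
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant, J/K

def homogeneous_nucleation_rate(sigma, delta_p, temperature, prefactor=1e30):
    # Classical nucleation theory: free-energy barrier for a critical bubble,
    # Delta G* = 16 * pi * sigma^3 / (3 * delta_p^2), and an Arrhenius-type rate.
    # The kinetic prefactor is a placeholder; it depends on the polymer/volatile system.
    dG_star = 16.0 * np.pi * sigma**3 / (3.0 * delta_p**2)
    return prefactor * np.exp(-dG_star / (k_B * temperature))

# Placeholder values: surface tension 5 mN/m, 50 bar supersaturation, 450 K.
print(homogeneous_nucleation_rate(0.005, 5.0e6, 450.0))
```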
Howlett, Owen; McKinstry, Carol; Lannin, Natasha A
2018-04-01
Allied health professionals frequently use surveys to collect data for clinical practice and service improvement projects. Careful development and piloting of purpose-designed surveys is important to ensure that they measure what is intended (that is, that respondents correctly interpret survey items when responding). Cognitive interviewing is a specific technique that can improve the design of self-administered surveys. The aim of this study was to describe the use of the cognitive interviewing process to improve survey design, in this case a purpose-designed online survey evaluating staff use of functional electrical stimulation. A qualitative study involving one round of cognitive interviewing with three occupational therapists and three physiotherapists was conducted. The cognitive interviewing process identified 11 issues with the draft survey which could potentially influence the validity and quality of responses. The issues raised included difficulties with: processing the question in order to respond, determining a response to the question, retrieving relevant information from memory, and comprehending the written question. Twelve survey amendments were made following the cognitive interviewing process, comprising four additions, seven revisions and one correction. The cognitive interviewing process, applied during the development of a purpose-designed survey, enabled the identification of potential problems and informed revisions to the survey prior to its use. © 2017 Occupational Therapy Australia.
Steginga, Suzanne K; Occhipinti, Stefano
2004-01-01
The study investigated the utility of the Heuristic-Systematic Processing Model as a framework for the investigation of patient decision making. A total of 111 men recently diagnosed with localized prostate cancer were assessed using Verbal Protocol Analysis and self-report measures. Study variables included men's use of nonsystematic and systematic information processing, desire for involvement in decision making, and the individual differences of health locus of control, tolerance of ambiguity, and decision-related uncertainty. Most men (68%) preferred that decision making be shared equally between them and their doctor. Men's use of the expert opinion heuristic was related to men's verbal reports of decisional uncertainty and having a positive orientation to their doctor and medical care; a desire for greater involvement in decision making was predicted by a high internal locus of health control. Trends were observed for systematic information processing to increase when the heuristic strategy used was negatively affect laden and when men were uncertain about the probabilities for cure and side effects. There was a trend for decreased systematic processing when the expert opinion heuristic was used. Findings were consistent with the Heuristic-Systematic Processing Model and suggest that this model has utility for future research in applied decision making about health.
An ERP investigation of visual word recognition in syllabary scripts.
Okano, Kana; Grainger, Jonathan; Holcomb, Phillip J
2013-06-01
The bimodal interactive-activation model has been successfully applied to understanding the neurocognitive processes involved in reading words in alphabetic scripts, as reflected in the modulation of ERP components in masked repetition priming. In order to test the generalizability of this approach, in the present study we examined word recognition in a different writing system, the Japanese syllabary scripts hiragana and katakana. Native Japanese participants were presented with repeated or unrelated pairs of Japanese words in which the prime and target words were both in the same script (within-script priming, Exp. 1) or were in the opposite script (cross-script priming, Exp. 2). As in previous studies with alphabetic scripts, in both experiments the N250 (sublexical processing) and N400 (lexical-semantic processing) components were modulated by priming, although the time course was somewhat delayed. The earlier N/P150 effect (visual feature processing) was present only in "Experiment 1: Within-script priming", in which the prime and target words shared visual features. Overall, the results provide support for the hypothesis that visual word recognition involves a generalizable set of neurocognitive processes that operate in similar manners across different writing systems and languages, as well as pointing to the viability of the bimodal interactive-activation framework for modeling such processes.
An ERP Investigation of Visual Word Recognition in Syllabary Scripts
Okano, Kana; Grainger, Jonathan; Holcomb, Phillip J.
2013-01-01
The bi-modal interactive-activation model has been successfully applied to understanding the neuro-cognitive processes involved in reading words in alphabetic scripts, as reflected in the modulation of ERP components in masked repetition priming. In order to test the generalizability of this approach, the current study examined word recognition in a different writing system, the Japanese syllabary scripts Hiragana and Katakana. Native Japanese participants were presented with repeated or unrelated pairs of Japanese words where the prime and target words were both in the same script (within-script priming, Experiment 1) or were in the opposite script (cross-script priming, Experiment 2). As in previous studies with alphabetic scripts, in both experiments the N250 (sub-lexical processing) and N400 (lexical-semantic processing) components were modulated by priming, although the time-course was somewhat delayed. The earlier N/P150 effect (visual feature processing) was present only in Experiment 1 where prime and target words shared visual features. Overall, the results provide support for the hypothesis that visual word recognition involves a generalizable set of neuro-cognitive processes that operate in a similar manner across different writing systems and languages, as well as pointing to the viability of the bi-modal interactive activation framework for modeling such processes. PMID:23378278
Walter, Alexander I; Helgenberger, Sebastian; Wiek, Arnim; Scholz, Roland W
2007-11-01
Most Transdisciplinary Research (TdR) projects combine scientific research with the building of decision making capacity for the involved stakeholders. These projects usually deal with complex, societally relevant, real-world problems. This paper focuses on TdR projects, which integrate the knowledge of researchers and stakeholders in a collaborative transdisciplinary process through structured methods of mutual learning. Previous research on the evaluation of TdR has insufficiently explored the intended effects of transdisciplinary processes on the real world (societal effects). We developed an evaluation framework for assessing the societal effects of transdisciplinary processes. Outputs (measured as procedural and product-related involvement of the stakeholders), impacts (intermediate effects connecting outputs and outcomes) and outcomes (enhanced decision making capacity) are distinguished as three types of societal effects. Our model links outputs and outcomes of transdisciplinary processes via the impacts using a mediating variables approach. We applied this model in an ex post evaluation of a transdisciplinary process. 84 out of 188 agents participated in a survey. The results show significant mediation effects of the two impacts "network building" and "transformation knowledge". These results indicate an influence of a transdisciplinary process on the decision making capacity of stakeholders, especially through social network building and the generation of knowledge relevant for action.
Heuristics Applied in the Development of Advanced Space Mission Concepts
NASA Technical Reports Server (NTRS)
Nilsen, Erik N.
1998-01-01
Advanced mission studies are the first step in determining the feasibility of a given space exploration concept. A space scientist develops a science goal in the exploration of space. This may be a new observation method, a new instrument or a mission concept to explore a solar system body. In order to determine the feasibility of a deep space mission, a concept study is convened to determine the technology needs and estimated cost of performing that mission. Heuristics are one method of defining viable mission and systems architectures that can be assessed for technology readiness and cost. Developing a viable architecture depends to a large extent upon extending the existing body of knowledge, and applying it in new and novel ways. These heuristics have evolved over time to include methods for estimating technical complexity, technology development, cost modeling and mission risk in the unique context of deep space missions. This paper examines the processes involved in performing these advanced concept studies, and analyzes the application of heuristics in the development of an advanced in-situ planetary mission. The Venus Surface Sample Return mission study provides a context for the examination of the heuristics applied in the development of the mission and systems architecture. This study is illustrative of the effort involved in the initial assessment of an advanced mission concept, and the knowledge and tools that are applied.
Evidence of automatic processing in sequence learning using process-dissociation
Mong, Heather M.; McCabe, David P.; Clegg, Benjamin A.
2012-01-01
This paper proposes a way to apply process dissociation to sequence learning, as an extension of the approach used by Destrebecqz and Cleeremans (2001). Participants were trained on two sequences separated from each other by a short break. Following training, participants self-reported their knowledge of the sequences. A recognition test was then performed which required discrimination of the two trained sequences, either under instructions to call any sequence encountered in the experiment "old" (the inclusion condition), or only sequence fragments from one half of the experiment "old" (the exclusion condition). The recognition test elicited automatic and controlled process estimates using the process dissociation procedure, and suggested both processes were involved. Examining the underlying processes supporting performance may provide more information on the fundamental aspects of the implicit and explicit constructs than has been attainable through awareness testing. PMID:22679465
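The standard process-dissociation estimates used in this line of work follow from P(old | inclusion) = C + A(1 - C) and P(old | exclusion) = A(1 - C), where C is the controlled and A the automatic contribution; a minimal computation is sketched below with illustrative response rates.

```python
def process_dissociation_estimates(p_inclusion, p_exclusion):
    # Controlled (C) and automatic (A) estimates from the two instruction conditions:
    #   P(old | inclusion) = C + A * (1 - C)
    #   P(old | exclusion) = A * (1 - C)
    controlled = p_inclusion - p_exclusion
    automatic = p_exclusion / (1.0 - controlled) if controlled < 1.0 else float("nan")
    return controlled, automatic

# Example: 0.70 "old" responses under inclusion, 0.35 under exclusion.
print(process_dissociation_estimates(0.70, 0.35))   # -> (0.35, ~0.538)
```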
Fontana, Silvia Alicia; Raimondi, Waldina; Rizzo, María Laura
2014-09-05
Sleep quality not only refers to sleeping well at night, but also includes appropriate daytime functioning. Poor quality of sleep can affect a variety of attention processes. The aim of this investigation was to evaluate the relationship between the perceived quality of sleep and selective attention in a group of college students. A descriptive cross-sectional study was carried out in a group of 52 Argentinian college students of the Universidad Adventista del Plata. The Pittsburgh Sleep Quality Index, the Continuous Performance Test and the Trail Making Test were applied. The main results indicate that students sleep an average of 6.48 hours. In general, half of the population tested had a good quality of sleep; however, the dispersion seen in some components demonstrates the heterogeneity of the sample in these variables. The attention processes evaluated showed different levels of alteration in the total sample: the greatest variability was detected in the selective-attention and divided-attention processes, while a lower percentage of alteration was observed in the attention-support process. Poor quality of sleep has more impact on the subprocesses with greater participation of corticocortical circuits (selective and divided attention) and greater involvement of the prefrontal cortex. Fewer difficulties were found in the attention-support processes, which rely on subcortical regions and have less frontal involvement.
Method and apparatus for continuous electrophoresis
Watson, Jack S.
1992-01-01
A method and apparatus for conducting continuous separation of substances by electrophoresis are disclosed. The process involves electrophoretic separation combined with couette flow in a thin volume defined by opposing surfaces. By alternating the polarity of the applied potential and producing reciprocating short rotations of at least one of the surfaces relative to the other, small increments of separation accumulate to cause substantial, useful segregation of electrophoretically separable components in a continuous flow system.
Lignin Formation and the Effects of Gravity: A New Approach
NASA Technical Reports Server (NTRS)
Lewis, Norman G.
1997-01-01
Two aspects of considerable importance in the enigmatic processes associated with lignification have made excellent progress. The first is that, even in a microgravity environment, compression wood formation, and hence altered lignin deposition, can be induced upon mechanically bending the stems of woody gymnosperms. It now needs to be established if an organism reorientating its woody stem tissue will generate this tissue in microgravity, in the absence of externally applied pressure. If it does not, then gravity has no effect on its formation, and instead it results from alterations in the stress gradient experienced by the organism impacted. The second area of progress involves establishing how the biochemical pathway to lignin is regulated, particularly with respect to selective monolignol biosynthesis. This is an important question since individual monomer deposition occurs in a temporally and spatially specific manner. In this regard, the elusive metabolic switch between E-p-coumaryl alcohol and E-coniferyl alcohol synthesis has been detected, the significance of which now needs to be defined at the enzyme and gene level. Switching between monolignol synthesis is important, since it is viewed to be a consequence of different perceptions by plants in the gravitational load experienced, and thus in the control of the type of lignification response. Additional experiments also revealed the rate-limiting processes involved in monolignol synthesis, and suggest that a biological system (involving metabolite concentrations, as well as enzymatic and gene (in)activation processes) is involved, rather than a single rate-limiting step.
Flores, Walter; Ruano, Ana Lorena; Funchal, Denise Phe
2009-01-01
Social participation has been understood in many different ways, and there are even typologies classifying participation by the degree of a population's control in decision making. Participation can vary from a symbolic act, which does not involve decision making, to processes in which it constitutes the principal tool for redistributing power within a population. This article argues that analyzing social participation from a perspective of power relations requires knowledge of the historical, social, and economic processes that have characterized the social relations in a specific context. Applying such an analysis to Guatemala reveals asymmetrical power relations characterized by a long history of repression and political violence. The armed conflict during the second half of the 20th century had devastating consequences for a large portion of the population as well as the country's social leadership. The ongoing violence resulted in negative psychosocial effects among the population, including mistrust toward institutions and low levels of social and political participation. Although Guatemala made progress in creating spaces for social participation in public policy after signing the Peace Accords in 1996, the country still faces after-effects of the conflict. One important task for the organizations that work in the field of health and the right to health is to help regenerate the social fabric and to rebuild trust between the state and its citizens. Such regeneration involves helping the population gain the skills, knowledge, and information needed in order to participate in and affect formal political processes that are decided and promoted by various public entities, such as the legislative and executive branches, municipal governments, and political parties. This process also applies to other groups that build citizenship through participation, such as neighborhood organizations and school and health committees.
Applying Game Thinking to Slips, Trips and Falls Prevention.
Dewick, Paul; Stanmore, Emma
2017-01-01
Gamification is about the way in which 'game thinking' can engage participants and change behaviours in real, non-game contexts. This paper explores how game thinking can be applied to help prevent slips, trips and falls (STF), which are the largest cause of accidental death in older people across Europe. The paper contributes to the assistive technology, digital health and computer science/human behaviour communities by responding to a gap in the literature for papers detailing the innovation process of developing interventions to improve health and quality of life. The aim of the paper is of interest to the many stakeholders involved in enabling older people to live independent, confident, healthy and safe lives in the community.
Applicability of Macroscopic Wear and Friction Laws on the Atomic Length Scale.
Eder, S J; Feldbauer, G; Bianchi, D; Cihak-Bayr, U; Betz, G; Vernes, A
2015-07-10
Using molecular dynamics, we simulate the abrasion process of an atomically rough Fe surface with multiple hard abrasive particles. By quantifying the nanoscopic wear depth in a time-resolved fashion, we show that Barwell's macroscopic wear law can be applied at the atomic scale. We find that in this multiasperity contact system, the Bowden-Tabor term, which describes the friction force as a function of the real nanoscopic contact area, can predict the kinetic friction even when wear is involved. From this the Derjaguin-Amontons-Coulomb friction law can be recovered, since we observe a linear dependence of the contact area on the applied load in accordance with Greenwood-Williamson contact mechanics.
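A hedged numerical illustration of the Bowden-Tabor relation and its reduction to Amontons-Coulomb friction under Greenwood-Williamson contact (real contact area proportional to load) follows; the material parameters are placeholders, not values from the simulations.

```python
def bowden_tabor_friction(real_contact_area, shear_strength):
    # Kinetic friction as interfacial shear strength times real contact area.
    return shear_strength * real_contact_area

def amontons_coulomb_friction(normal_load, shear_strength, contact_pressure):
    # Greenwood-Williamson contact: real area grows linearly with load,
    # A_real = N / p, so F = (tau / p) * N, i.e. a load-independent
    # friction coefficient mu = tau / p.
    return shear_strength * (normal_load / contact_pressure)

# Placeholder values (not from the study): tau = 1 GPa, p = 10 GPa, N = 100 nN.
print(amontons_coulomb_friction(100e-9, 1e9, 10e9))   # -> 1e-8 N, i.e. mu = 0.1
```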
A methodology proposal for collaborative business process elaboration using a model-driven approach
NASA Astrophysics Data System (ADS)
Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé
2015-05-01
Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).
Developing cloud-based Business Process Management (BPM): a survey
NASA Astrophysics Data System (ADS)
Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh
2018-03-01
In today’s highly competitive business environment, modern enterprises struggle to cut unnecessary costs, eliminate waste and deliver substantial benefits to the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article applies cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically measurable resources over the internet as an IT resource service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.
Discrete post-processing of total cloud cover ensemble forecasts
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian
2017-04-01
This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression; the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference: Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review, 144, 2565-2577.
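A minimal sketch of the discrete post-processing idea described above, using multinomial logistic regression on summary features of the raw ensemble. The feature construction and synthetic data are hypothetical and are not the authors' implementation:

```python
# Illustrative sketch: multinomial logistic regression post-processing of
# total cloud cover ensemble forecasts (categories 0..8 okta).
# Synthetic data for illustration; real applications would use station observations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_cases, n_members = 5000, 50
raw_ensemble = rng.uniform(0, 8, size=(n_cases, n_members))   # raw ensemble forecasts (okta)

# Summary features of the ensemble: mean and spread (a common, simple choice).
X = np.column_stack([raw_ensemble.mean(axis=1), raw_ensemble.std(axis=1)])
# Synthetic "observed" total cloud cover categories, for illustration only.
y = np.clip(np.rint(raw_ensemble.mean(axis=1) + rng.normal(0, 1.5, n_cases)), 0, 8).astype(int)

# With more than two classes and the default lbfgs solver, scikit-learn fits a multinomial model.
model = LogisticRegression(max_iter=1000)
model.fit(X[:4000], y[:4000])

# Predicted category probabilities for the held-out verification cases.
probs = model.predict_proba(X[4000:])
print(probs.shape)                          # (1000, number of observed categories)
print(model.score(X[4000:], y[4000:]))      # simple categorical accuracy
```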
NASA Astrophysics Data System (ADS)
Bascetin, A.
2007-04-01
The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located at the Seyitomer region in Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. Also, it is found that the decision process is systematic and that using the proposed model can reduce the time taken to select an optimal method.
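A minimal sketch of the AHP priority-weight computation referenced above, using the generic principal-eigenvector approach with an illustrative pairwise comparison matrix (the criteria and judgments are hypothetical, not the paper's):

```python
# Illustrative AHP sketch: derive criteria weights from a pairwise comparison
# matrix via the principal eigenvector, and check judgment consistency.
import numpy as np

# Hypothetical pairwise comparisons for 4 reclamation criteria (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 3.0, 1/2],
    [1/5, 1/3, 1.0, 1/4],
    [1/2, 2.0, 4.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized priority weights

n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)                 # consistency index
ri = 0.90                                    # random index for n = 4 (commonly tabulated value)
cr = ci / ri                                 # consistency ratio; < 0.1 is usually deemed acceptable
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```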
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoover, Mark D.; Myers, David S.; Cash, Leigh J.
The National Council on Radiation Protection and Measurements (NCRP) has established NCRP Scientific Committee 2-6 to develop a report on the current state of knowledge and guidance for radiation safety programs involved with nanotechnology. Nanotechnology is the understanding and control of matter at the nanoscale, at dimensions between approximately 1 and 100 nanometers, where unique phenomena enable novel applications. While the full report is in preparation, this article presents and applies an informatics-based decision-making framework and process through which the radiation protection community can anticipate that nano-enabled applications, processes, nanomaterials, and nanoparticles are likely to become present or are already present in radiation-related activities; recognize specific situations where environmental and worker safety, health, well-being, and productivity may be affected by nano-related activities; evaluate how radiation protection practices may need to be altered to improve protection; control information, interpretations, assumptions, and conclusions to implement scientifically sound decisions and actions; and confirm that desired protection outcomes have been achieved. This generally applicable framework and supporting process can be continuously applied to achieve health and safety at the convergence of nanotechnology and radiation-related activities.
Hoover, Mark D; Myers, David S; Cash, Leigh J; Guilmette, Raymond A; Kreyling, Wolfgang G; Oberdörster, Günter; Smith, Rachel; Cassata, James R; Boecker, Bruce B; Grissom, Michael P
2015-02-01
The National Council on Radiation Protection and Measurements (NCRP) established NCRP Scientific Committee 2-6 to develop a report on the current state of knowledge and guidance for radiation safety programs involved with nanotechnology. Nanotechnology is the understanding and control of matter at the nanoscale, at dimensions between ∼1 and 100 nm, where unique phenomena enable novel applications. While the full report is in preparation, this paper presents and applies an informatics-based decision-making framework and process through which the radiation protection community can anticipate that nano-enabled applications, processes, nanomaterials, and nanoparticles are likely to become present or are already present in radiation-related activities; recognize specific situations where environmental and worker safety, health, well-being, and productivity may be affected by nano-related activities; evaluate how radiation protection practices may need to be altered to improve protection; control information, interpretations, assumptions, and conclusions to implement scientifically sound decisions and actions; and confirm that desired protection outcomes have been achieved. This generally applicable framework and supporting process can be continuously applied to achieve health and safety at the convergence of nanotechnology and radiation-related activities.
Orlandini, S; Pasquini, B; Caprini, C; Del Bubba, M; Squarcialupi, L; Colotta, V; Furlanetto, S
2016-09-30
A comprehensive strategy involving the use of mixture-process variable (MPV) approach and Quality by Design principles has been applied in the development of a capillary electrophoresis method for the simultaneous determination of the anti-inflammatory drug diclofenac and its five related substances. The selected operative mode consisted in microemulsion electrokinetic chromatography with the addition of methyl-β-cyclodextrin. The critical process parameters included both the mixture components (MCs) of the microemulsion and the process variables (PVs). The MPV approach allowed the simultaneous investigation of the effects of MCs and PVs on the critical resolution between diclofenac and its 2-deschloro-2-bromo analogue and on analysis time. MPV experiments were used both in the screening phase and in the Response Surface Methodology, making it possible to draw MCs and PVs contour plots and to find important interactions between MCs and PVs. Robustness testing was carried out by MPV experiments and validation was performed following International Conference on Harmonisation guidelines. The method was applied to a real sample of diclofenac gastro-resistant tablets. Copyright © 2016 Elsevier B.V. All rights reserved.
Collaborative simulation method with spatiotemporal synchronization process control
NASA Astrophysics Data System (ADS)
Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian
2016-10-01
When designing a complex mechatronics system, such as a high-speed train, it is relatively difficult to simulate the entire system's dynamic behavior effectively because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal unsynchronization among the multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling simulation of a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can be used to simulate the subsystem interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high-speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.
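A minimal sketch of a time-synchronized coupling loop in the spirit of the approach described above: a generic fixed-macro-step co-simulation coordinator with toy subsystem models. This is an assumption-laden illustration, not the paper's coupler or its control algorithm:

```python
# Illustrative co-simulation coordinator: two subsystems exchange interface
# variables at synchronized macro time steps so that neither drifts ahead in time.
from dataclasses import dataclass

@dataclass
class Subsystem:
    state: float

    def step(self, u: float, dt: float) -> float:
        # Toy first-order dynamics driven by the coupling input u (placeholder model).
        self.state += dt * (-0.5 * self.state + u)
        return self.state            # interface output sent to the other subsystem

def cosimulate(sub_a: Subsystem, sub_b: Subsystem, t_end: float, dt: float):
    t, y_a, y_b = 0.0, sub_a.state, sub_b.state
    while t < t_end:
        # Both subsystems advance over the same macro step using the values
        # exchanged at the synchronization point (Jacobi-style coupling).
        y_a_next = sub_a.step(u=y_b, dt=dt)
        y_b_next = sub_b.step(u=y_a, dt=dt)
        y_a, y_b = y_a_next, y_b_next
        t += dt                      # shared simulation clock keeps the subsystems synchronized
    return y_a, y_b

print(cosimulate(Subsystem(1.0), Subsystem(0.0), t_end=5.0, dt=0.01))
```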
Hoover, Mark D.; Myers, David S.; Cash, Leigh J.; ...
2015-01-01
The National Council on Radiation Protection and Measurements (NCRP) has established NCRP Scientific Committee 2-6 to develop a report on the current state of knowledge and guidance for radiation safety programs involved with nanotechnology. Nanotechnology is the understanding and control of matter at the nanoscale, at dimensions between approximately 1 and 100 nanometers, where unique phenomena enable novel applications. While the full report is in preparation, this article presents and applies an informatics-based decision-making framework and process through which the radiation protection community can anticipate that nano-enabled applications, processes, nanomaterials, and nanoparticles are likely to become present or are already present in radiation-related activities; recognize specific situations where environmental and worker safety, health, well-being, and productivity may be affected by nano-related activities; evaluate how radiation protection practices may need to be altered to improve protection; control information, interpretations, assumptions, and conclusions to implement scientifically sound decisions and actions; and confirm that desired protection outcomes have been achieved. This generally applicable framework and supporting process can be continuously applied to achieve health and safety at the convergence of nanotechnology and radiation-related activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wantuck, P. J.; Hollen, R. M.
2002-01-01
This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation needs of many Laboratory programs.
Modeling Healthcare Processes Using Commitments: An Empirical Evaluation.
Telang, Pankaj R; Kalia, Anup K; Singh, Munindar P
2015-01-01
The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach to a breast cancer diagnosis process adapted from an HHS committee report, and present the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma, whereas members of the second group first applied Comma and then Traditional-HL7, each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality as validated via Student's t-test at the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel.
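As a hedged illustration of the kind of significance testing reported above, a two-sample Student's t-test at the 10% level can be run as follows (the scores are synthetic, not the study's data):

```python
# Illustrative two-sample t-test, as used to compare two modeling approaches
# on a measure such as flexibility (synthetic scores for illustration only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
scores_group_a = rng.normal(loc=7.0, scale=1.5, size=24)   # hypothetical per-subject scores
scores_group_b = rng.normal(loc=6.0, scale=1.5, size=23)

result = stats.ttest_ind(scores_group_a, scores_group_b)
alpha = 0.10                                               # 10% significance level
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}, "
      f"significant: {result.pvalue < alpha}")
```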
Modeling Healthcare Processes Using Commitments: An Empirical Evaluation
2015-01-01
The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach to a breast cancer diagnosis process adapted from an HHS committee report, and present the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma, whereas members of the second group first applied Comma and then Traditional-HL7, each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality as validated via Student’s t-test at the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel. PMID:26539985
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin L. Kenney; Kara G. Cafferty; Jacob J. Jacobson
The U.S. Department of Energy promotes the production of liquid fuels from lignocellulosic biomass feedstocks by funding fundamental and applied research that advances the state of technology in biomass sustainable supply, logistics, conversion, and overall system sustainability. As part of its involvement in this program, Idaho National Laboratory (INL) investigates the feedstock logistics economics and sustainability of these fuels. Between 2000 and 2012, INL quantified the economics and sustainability of moving biomass from the field or stand to the throat of the conversion process using conventional equipment and processes. All previous work to 2012 was designed to improve the efficiency and decrease costs under conventional supply systems. The 2012 programmatic target was to demonstrate a biomass logistics cost of $55/dry ton for woody biomass delivered to a fast pyrolysis conversion facility. The goal was achieved by applying field and process demonstration unit-scale data from harvest, collection, storage, preprocessing, handling, and transportation operations into INL’s biomass logistics model.
Chakraverty, S; Sahoo, B K; Rao, T D; Karunakar, P; Sapra, B K
2018-02-01
Modelling radon transport in the earth's crust is a useful tool for investigating changes in geophysical processes prior to an earthquake event. Radon transport is generally modeled through the deterministic advection-diffusion equation. However, in order to determine the magnitudes of the parameters governing these processes from experimental measurements, it is necessary to investigate the role of uncertainties in these parameters. The present paper investigates this aspect by incorporating interval uncertainties in transport parameters, such as the soil diffusivity and advection velocity, occurring in the radon transport equation as applied to the soil matrix. The predictions made with interval arithmetic have been compared and discussed against the results of the classical deterministic model. The practical applicability of the model is demonstrated through a case study involving radon flux measurements at the soil surface with an accumulator deployed in steady-state mode. It is possible to detect the presence of very low levels of advection by applying uncertainty bounds to the variations in the observed concentration data in the accumulator. The results are further discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
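For reference, a commonly used one-dimensional form of the deterministic advection-diffusion equation for radon in the soil matrix is sketched below; the notation is assumed here rather than taken from the paper, and the interval approach replaces parameters such as D and v by interval bounds:

```latex
% One-dimensional radon transport in soil gas (illustrative standard form):
% C(z,t): radon activity concentration, D: effective diffusion coefficient,
% v: advection velocity, \phi: radon generation rate, \lambda: radon decay constant.
\frac{\partial C}{\partial t}
  = D \frac{\partial^2 C}{\partial z^2}
  - v \frac{\partial C}{\partial z}
  + \phi
  - \lambda C ,
\qquad
D \in [\underline{D}, \overline{D}], \quad
v \in [\underline{v}, \overline{v}] \quad \text{(interval-valued parameters).}
```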
Minehunting sonar system research and development
NASA Astrophysics Data System (ADS)
Ferguson, Brian
2002-05-01
Sea mines have the potential to threaten the freedom of the seas by disrupting maritime trade and restricting the freedom of maneuver of navies. The acoustic detection, localization, and classification of sea mines involves a sequence of operations starting with the transmission of a sonar pulse and ending with an operator interpreting the information on a sonar display. A recent improvement to the process stems from the application of neural networks to the computer-aided detection of sea mines. The advent of ultrawideband sonar transducers together with pulse compression techniques offers a thousandfold increase in the bandwidth-time product of conventional minehunting sonar transmissions, enabling stealth mines to be detected at longer ranges. These wideband signals also enable mines to be imaged at safe standoff distances by applying tomographic image reconstruction techniques. The coupling of wideband transducer technology with synthetic aperture processing enhances the resolution of side scan sonars in both the cross-track and along-track directions. The principles on which conventional and advanced minehunting sonars are based are reviewed, and the results of applying novel sonar signal processing algorithms to high-frequency sonar data collected in Australian waters are presented.
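A minimal numerical sketch of the pulse-compression idea mentioned above: matched filtering of a wideband linear-FM transmission sharpens the range response. All waveform parameters are illustrative textbook values, not a description of any fielded sonar:

```python
# Illustrative pulse compression: correlate a received echo with a replica of
# the transmitted linear-FM (chirp) pulse to compress it in time.
import numpy as np

fs = 100_000.0                      # sample rate (Hz), illustrative value
T = 0.01                            # pulse length (s)
f0, f1 = 5_000.0, 25_000.0          # chirp start/stop frequencies (Hz)
t = np.arange(0, T, 1 / fs)
k = (f1 - f0) / T                   # chirp rate
tx = np.cos(2 * np.pi * (f0 * t + 0.5 * k * t**2))   # transmitted pulse replica

# Simulated received signal: delayed, attenuated echo buried in noise.
rx = np.zeros(4096)
delay = 1200
rx[delay:delay + tx.size] += 0.3 * tx
rx += 0.1 * np.random.default_rng(0).standard_normal(rx.size)

# Matched filter = correlation with the transmitted replica; the peak marks the echo.
compressed = np.correlate(rx, tx, mode="valid")
print("estimated delay (samples):", int(np.argmax(np.abs(compressed))))
```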
Scaling Laws Applied to a Modal Formulation of the Aeroservoelastic Equations
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.
2002-01-01
A method of scaling is described that easily converts the aeroelastic equations of motion of a full-sized aircraft into ones of a wind-tunnel model. To implement the method, a set of rules is provided for the conversion process involving matrix operations with scale factors. In addition, a technique for analytically incorporating a spring mounting system into the aeroelastic equations is also presented. As an example problem, a finite element model of a full-sized aircraft is introduced from the High Speed Research (HSR) program to exercise the scaling method. With a set of scale factor values, a brief outline is given of a procedure to generate the first-order aeroservoelastic analytical model representing the wind-tunnel model. To verify the scaling process as applied to the example problem, the root-locus patterns from the full-sized vehicle and the wind-tunnel model are compared to see if the root magnitudes scale with the frequency scale factor value. Selected time-history results are given from a numerical simulation of an active-controlled wind-tunnel model to demonstrate the utility of the scaling process.
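A small numerical sketch of the kind of check described above: after applying scale factors to the modal mass and stiffness matrices, the natural frequencies (and hence the root magnitudes) should scale by the frequency scale factor. This is a generic two-degree-of-freedom illustration with assumed scale factors, not the HSR model or the paper's conversion rules:

```python
# Illustrative scaling check on a 2-DOF modal system: eigenfrequencies of the
# scaled (wind-tunnel) model should equal k_freq times those of the full-scale system.
import numpy as np

M = np.diag([2.0, 1.0])                       # full-scale generalized mass (arbitrary units)
K = np.array([[300.0, -100.0],
              [-100.0, 200.0]])               # full-scale generalized stiffness

k_mass, k_stiff = 0.001, 0.25                 # assumed scale factors on mass and stiffness
k_freq = np.sqrt(k_stiff / k_mass)            # implied frequency scale factor

def natural_freqs(M, K):
    # omega^2 are the eigenvalues of M^-1 K for an undamped modal system.
    return np.sqrt(np.linalg.eigvals(np.linalg.solve(M, K)).real)

w_full = np.sort(natural_freqs(M, K))
w_model = np.sort(natural_freqs(k_mass * M, k_stiff * K))
print(w_model / w_full, "expected:", k_freq)  # each ratio should equal k_freq
```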
NASA Astrophysics Data System (ADS)
Luo, Y.; Nissen-Meyer, T.; Morency, C.; Tromp, J.
2008-12-01
Seismic imaging in the exploration industry is often based upon ray-theoretical migration techniques (e.g., Kirchhoff) or other ideas which neglect some fraction of the seismic wavefield (e.g., wavefield continuation for acoustic-wave first arrivals) in the inversion process. In a companion paper we discuss the possibility of solving the full physical forward problem (i.e., including visco- and poroelastic, anisotropic media) using the spectral-element method. With such a tool at hand, we can readily apply the adjoint method to tomographic inversions, i.e., iteratively improving an initial 3D background model to fit the data. In the context of this inversion process, we draw connections between kernels in adjoint tomography and basic imaging principles in migration. We show that the images obtained by migration are nothing but particular kinds of adjoint kernels (mainly density kernels). Migration is basically a first step in the iterative inversion process of adjoint tomography. We apply the approach to basic 2D problems involving layered structures, overthrusting faults, topography, salt domes, and poroelastic regions.
Visual enhancement of unmixed multispectral imagery using adaptive smoothing
Lemeshewsky, G.P.; Rahman, Z.-U.; Schowengerdt, R.A.; Reichenbach, S.E.
2004-01-01
Adaptive smoothing (AS) has been previously proposed as a method to smooth uniform regions of an image, retain contrast edges, and enhance edge boundaries. The method is an implementation of the anisotropic diffusion process which results in a gray scale image. This paper discusses modifications to the AS method for application to multi-band data which results in a color segmented image. The process was used to visually enhance the three most distinct abundance fraction images produced by the Lagrange constraint neural network learning-based unmixing of Landsat 7 Enhanced Thematic Mapper Plus multispectral sensor data. A mutual information-based method was applied to select the three most distinct fraction images for subsequent visualization as a red, green, and blue composite. A reported image restoration technique (partial restoration) was applied to the multispectral data to reduce unmixing error, although evaluation of the performance of this technique was beyond the scope of this paper. The modified smoothing process resulted in a color segmented image with homogeneous regions separated by sharpened, coregistered multiband edges. There was improved class separation with the segmented image, which has importance to subsequent operations involving data classification.
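A minimal sketch of the underlying anisotropic (Perona-Malik-type) diffusion step referenced above, applied to a single band; this is a generic textbook formulation, not the modified AS algorithm of the paper:

```python
# Illustrative Perona-Malik-style anisotropic diffusion: smooth homogeneous
# regions while preserving contrast edges, iterated on one image band.
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=20.0, lam=0.2):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences to the four nearest neighbours.
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping conductance: small across strong gradients (edges).
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

band = np.random.default_rng(0).random((64, 64)) * 255.0   # placeholder single band
smoothed = anisotropic_diffusion(band)
print(band.std(), smoothed.std())   # variability drops in uniform regions
```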
Scheduling multirobot operations in manufacturing by truncated Petri nets
NASA Astrophysics Data System (ADS)
Chen, Qin; Luh, J. Y.
1995-08-01
Scheduling of operational sequences in manufacturing processes is one of the important problems in automation. Methods of applying Petri nets to model and analyze the problem with constraints on precedence relations, multiple resources allocation, etc. have been available in literature. Searching for an optimum schedule can be implemented by combining the branch-and-bound technique with the execution of the timed Petri net. The process usually produces a large Petri net which is practically not manageable. This disadvantage, however, can be handled by a truncation technique which divides the original large Petri net into several smaller size subnets. The complexity involved in the analysis of each subnet individually is greatly reduced. However, when the locally optimum schedules of the resulting subnets are combined together, it may not yield an overall optimum schedule for the original Petri net. To circumvent this problem, algorithms are developed based on the concepts of Petri net execution and modified branch-and-bound process. The developed technique is applied to a multi-robot task scheduling problem of the manufacturing work cell.
[Social inequality and participation in aging urban societies].
Rüssler, H; Köster, D; Heite, E; Stiel, J
2013-06-01
The social and political participation of elderly people is characterized by social inequality. Participation processes normally consolidate and intensify the exclusion of senior citizens having low incomes and low educational qualifications. In the research and development project "Quality of Life of Elderly People in Living Quarters" being conducted by Dortmund University of Applied Sciences and Arts, one of the questions being examined is whether and to what extent socially disadvantaged elderly people in a social space typical of the Ruhr region (reference area Gelsenkirchen-Schalke) can be included in the shaping of their quarter. This paper is based on the results of a quantitative, written survey (cross-section) on the subjects of quality of life and participation, and on a trend analysis measuring the effects of participation processes initiated on the elderly persons involved. The results of the study show that it is possible to involve socially disadvantaged elderly people in participation processes geared to the specific social space. They also indicate that elderly people from different income groups increase their social capital in the context of enabling structures.
Mo, Zhenghai; Feng, Gang; Su, Wenchuan; Liu, Zhuangzhuang; Peng, Fangren
2018-02-05
Pecan (Carya illinoinensis), as a popular nut tree, has been widely planted in China in recent years. Grafting is an important technique for its cultivation. For a successful grafting, graft union development generally involves the formation of callus and vascular bundles at the graft union. To explore the molecular mechanism of graft union development, we applied high-throughput RNA sequencing to investigate the transcriptomic profiles of the graft union at four timepoints (0 days, 8 days, 15 days, and 30 days) during the pecan grafting process. After de novo assembly, 83,693 unigenes were obtained, and 40,069 of them were annotated. A total of 12,180 differentially expressed genes were identified in response to grafting. Genes involved in hormone signaling, cell proliferation, xylem differentiation, cell elongation, secondary cell wall deposition, programmed cell death, and reactive oxygen species (ROS) scavenging showed significant differential expression during the graft union developmental process. In addition, we found that auxin, cytokinin, and gibberellin accumulated at the graft unions during the grafting process. These results will aid in our understanding of successful grafting in the future.
Mo, Zhenghai; Feng, Gang; Su, Wenchuan; Liu, Zhuangzhuang; Peng, Fangren
2018-01-01
Pecan (Carya illinoinensis), as a popular nut tree, has been widely planted in China in recent years. Grafting is an important technique for its cultivation. For a successful grafting, graft union development generally involves the formation of callus and vascular bundles at the graft union. To explore the molecular mechanism of graft union development, we applied high-throughput RNA sequencing to investigate the transcriptomic profiles of the graft union at four timepoints (0 days, 8 days, 15 days, and 30 days) during the pecan grafting process. After de novo assembly, 83,693 unigenes were obtained, and 40,069 of them were annotated. A total of 12,180 differentially expressed genes were identified in response to grafting. Genes involved in hormone signaling, cell proliferation, xylem differentiation, cell elongation, secondary cell wall deposition, programmed cell death, and reactive oxygen species (ROS) scavenging showed significant differential expression during the graft union developmental process. In addition, we found that auxin, cytokinin, and gibberellin accumulated at the graft unions during the grafting process. These results will aid in our understanding of successful grafting in the future. PMID:29401757
How orangutans (Pongo pygmaeus) innovate for water.
Russon, Anne E; Kuncoro, Purwo; Ferisa, Agnes; Handayani, Dwi Putri
2010-02-01
We report an observational field study that aimed to identify innovative processes in rehabilitant orangutans' (Pongo pygmaeus) water innovations on Kaja Island, Central Kalimantan, Indonesia. We tested for the basic model of innovating (make small changes to old behavior), 4 contributors (apply old behavior to new ends, accidents, independent working out, social cross-fertilization), development, and social rank. Focal observations of Kaja rehabilitants' behavior over 20 months yielded 18 probable innovations from among 44 water variants. We identified variants by function and behavioral grain, innovations by prevalence, and innovative processes by relations between innovations, other behaviors, and social encounters. Findings indicate innovating by small changes and some involvement of all 4 contributors; midrank orangutans were the most innovative; and rehabilitants' adolescent age profile, orphaning, and intense sociality probably enhanced innovativeness. Important complexities include: orangutan innovating may favor certain behavioral levels and narrowly defined similarities, and it may constitute a phase-like process involving a succession of changes and contributors. Discussion focuses on links with great ape cognition and parallels with innovating in humans and other nonhuman species.
Frank Gilbreth and health care delivery method study driven learning.
Towill, Denis R
2009-01-01
The purpose of this article is to look at method study, as devised by the Gilbreths at the beginning of the twentieth century, which found early application in hospital quality assurance and surgical "best practice". It has since become a core activity in all modern methods, as applied to healthcare delivery improvement programmes. The article traces the origin of what is now variously called "business process re-engineering", "business process improvement" and "lean healthcare" etc., by different management gurus back to the century-old pioneering work of Frank Gilbreth. The outcome is a consistent framework involving "width", "length" and "depth" dimensions within which healthcare delivery systems can be analysed, designed and successfully implemented to achieve better and more consistent performance. Healthcare method (saving time plus saving motion) study is best practised as a co-joint action learning activity "owned" by all "players" involved in the re-engineering process. However, although process mapping is a key step forward, in itself it is no guarantee of effective re-engineering. It is not even the beginning of the end of the change challenge, although it should be the end of the beginning. What is needed is innovative exploitation of method study within a healthcare organisational learning culture accelerated via the Gilbreth Knowledge Flywheel. It is shown that effective healthcare delivery pipeline improvement is anchored in a team approach involving all "players" in the system, especially physicians. A comprehensive process study, constructive dialogue, proper and highly professional re-engineering plus managed implementation are essential components. Experience suggests "learning" is thereby achieved via "natural groups" actively involved in healthcare processes. The article provides a proven method for exploiting Gilbreths' outputs and their many successors in enabling more productive evidence-based healthcare delivery as summarised in the "learn-do-learn-do" feedback loop in the Gilbreth Knowledge Flywheel.
[Consensus conferences in Israel--a collaborative model for national policy making].
Tal, Orna; Oberlander, Shira; Siebzehner, Miri I
2014-07-01
The determination of an integrated national policy on controversial issues is a challenge for health systems worldwide. A common method to reach agreements for national policies in different countries throughout the world is group discussion that involves all stakeholders. A structured model of discussion on medical technologies started in the 1970s, mostly in North America, spread to Europe and, in the last decade, also crossed borders to India, South America and Israel. Public discussion in the format of a consensus conference is a complex process that includes a thorough literature review for technology assessment, combining academic information using a technique of close consultation with experts, extensive panel discussion and dialogue with representatives of the public. At the end of the process a broad consensus is determined, facilitating national-level policy implementation. The multiple factors involved, the issues addressed, the nature of the health system where the intended results will be applied, as well as political and social characteristics, produce variations among different countries. Therefore, this process requires flexibility in adjusting the classic model according to emerging needs. The advantages of this method include encouraging the appropriate utilization of existing technologies, contemporary assessment by leading experts, alignment among all involved parties, public sharing and more. The initial model of the consensus conference was implemented in an orderly, systematic, structured process which allowed broad discussion and thorough preparation. The disadvantages are its complexity, length and cost. In order to cope with the dynamics of the health system in Israel, which forces policymakers to make decisions in real time, parts of the model were adjusted to address the issues arising in the system. Hence, a new process was developed, a derivative of the original Israeli model, with an emphasis on professional reviews, group discussion, and involvement of leading factors in the system. The participation of patients and the public in the process requires a thorough examination.
van Lent, Wineke A M; de Beer, Relinde D; van Harten, Wim H
2010-08-31
Benchmarking is one of the methods used in business that is applied to hospitals to improve the management of their operations. International comparison between hospitals can explain performance differences. As there is a trend towards specialization of hospitals, this study examines the benchmarking process and the success factors of benchmarking in international specialized cancer centres. Three independent international benchmarking studies on operations management in cancer centres were conducted. The first study included three comprehensive cancer centres (CCC); three chemotherapy day units (CDU) were involved in the second study; and four radiotherapy departments were included in the final study. Per multiple case study a research protocol was used to structure the benchmarking process. After reviewing the multiple case studies, the resulting description was used to study the research objectives. We adapted and evaluated existing benchmarking processes through formalizing stakeholder involvement and verifying the comparability of the partners. We also devised a framework to structure the indicators to produce a coherent indicator set and better improvement suggestions. Evaluating the feasibility of benchmarking as a tool to improve hospital processes led to mixed results. Case study 1 resulted in general recommendations for the organizations involved. In case study 2, the combination of benchmarking and lean management led in one CDU to a 24% increase in bed utilization and a 12% increase in productivity. Three radiotherapy departments of case study 3 were considering implementing the recommendations. Additionally, success factors were identified, such as a well-defined and small project scope, partner selection based on clear criteria, stakeholder involvement, simple and well-structured indicators, analysis of both the process and its results, and adaptation of the identified better working methods to one's own setting. The improved benchmarking process and the success factors can produce relevant input to improve the operations management of specialty hospitals.
2010-01-01
Background Benchmarking is one of the methods used in business that is applied to hospitals to improve the management of their operations. International comparison between hospitals can explain performance differences. As there is a trend towards specialization of hospitals, this study examines the benchmarking process and the success factors of benchmarking in international specialized cancer centres. Methods Three independent international benchmarking studies on operations management in cancer centres were conducted. The first study included three comprehensive cancer centres (CCC); three chemotherapy day units (CDU) were involved in the second study; and four radiotherapy departments were included in the final study. Per multiple case study a research protocol was used to structure the benchmarking process. After reviewing the multiple case studies, the resulting description was used to study the research objectives. Results We adapted and evaluated existing benchmarking processes through formalizing stakeholder involvement and verifying the comparability of the partners. We also devised a framework to structure the indicators to produce a coherent indicator set and better improvement suggestions. Evaluating the feasibility of benchmarking as a tool to improve hospital processes led to mixed results. Case study 1 resulted in general recommendations for the organizations involved. In case study 2, the combination of benchmarking and lean management led in one CDU to a 24% increase in bed utilization and a 12% increase in productivity. Three radiotherapy departments of case study 3 were considering implementing the recommendations. Additionally, success factors were identified, such as a well-defined and small project scope, partner selection based on clear criteria, stakeholder involvement, simple and well-structured indicators, analysis of both the process and its results, and adaptation of the identified better working methods to one's own setting. Conclusions The improved benchmarking process and the success factors can produce relevant input to improve the operations management of specialty hospitals. PMID:20807408
Schaub, Jochen; Clemens, Christoph; Kaufmann, Hitto; Schulz, Torsten W
2012-01-01
Development of efficient bioprocesses is essential for cost-effective manufacturing of recombinant therapeutic proteins. To achieve further process improvement and process rationalization, comprehensive data analysis of both process data and phenotypic cell-level data is essential. Here, we present a framework for advanced bioprocess data analysis consisting of multivariate data analysis (MVDA), metabolic flux analysis (MFA), and pathway analysis for mapping of large-scale gene expression data sets. This data analysis platform was applied in a process development project with an IgG-producing Chinese hamster ovary (CHO) cell line in which the maximal product titer could be increased from about 5 to 8 g/L. Principal component analysis (PCA), k-means clustering, and partial least-squares (PLS) models were applied to analyze the macroscopic bioprocess data. MFA and gene expression analysis revealed intracellular information on the characteristics of high-performance cell cultivations. By MVDA, for example, correlations between several essential amino acids and the product concentration were observed. Also, the processes could be grouped into those driven rather by cell-specific productivity and those driven by process control. By MFA, phenotypic characteristics in glycolysis, glutaminolysis, the pentose phosphate pathway, the citrate cycle, the coupling of amino acid metabolism to the citrate cycle, and the energy yield could be identified. By gene expression analysis, 247 deregulated metabolic genes were identified which are involved, inter alia, in amino acid metabolism, transport, and protein synthesis.
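A minimal sketch of the multivariate data analysis layer described above (PCA, k-means clustering, and a PLS model on a process-data matrix). The variable names and data are synthetic, and this is not the authors' pipeline:

```python
# Illustrative MVDA sketch: PCA and k-means on macroscopic process data, plus a
# PLS model relating process variables (e.g. amino acid levels) to product titer.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))                 # 60 cultivation runs x 12 process variables (synthetic)
titer = X[:, 0] * 0.8 + X[:, 3] * 0.5 + rng.normal(0, 0.3, 60)   # synthetic product titer

scores = PCA(n_components=2).fit_transform(X)          # low-dimensional overview of the runs
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

pls = PLSRegression(n_components=2).fit(X, titer)      # correlate process variables with titer
print("cluster sizes:", np.bincount(clusters))
print("PLS R^2:", round(pls.score(X, titer), 2))
```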
A network perspective on the processes of empowered organizations.
Neal, Zachary P
2014-06-01
Organizational empowerment is a multi-faceted concept that involves processes occurring both within and between organizations that facilitate achievement of their goals. This paper takes a closer look at three interorganizational processes that lead to empowered organizations: building alliances, getting the word out, and capturing others' attention. These processes are located within the broader nomological network of empowerment and organizational empowerment, and are linked to particular patterns of interorganizational relationships that facilitate organizations' ability to engage in them. A new network-based measure, γ-centrality, is introduced to capture the particular network structure associated with each process to be assessed. It is demonstrated first in a hypothetical organizational network, then applied to take a closer look at organizational empowerment in the context of a coordinating council composed of human service agencies. The paper concludes with a discussion of the implications of relationships between these processes, and the potential for unintended consequences in the empowerment of organizations.
Selecting reasonable future land use scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allred, W.E.; Smith, R.W.
1995-12-31
This paper examines a process to help select the most reasonable future land use scenarios for hazardous waste and/or low-level radioactive waste disposal sites. The process involves evaluating future land use scenarios by applying selected criteria currently used by commercial mortgage companies to determine the feasibility of obtaining a loan for purchasing such land. The basis for the process is that only land use activities for which a loan can be obtained will be considered. To examine the process, a low-level radioactive waste site, the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory, is used as an example. The authors suggest that the process is a very precise, comprehensive, and systematic (common sense) approach for determining reasonable future use of land. Implementing such a process will help enhance the planning, decisionmaking, safe management, and cleanup of present and future disposal facilities.
Lightweight Concrete Produced Using a Two-Stage Casting Process
Yoon, Jin Young; Kim, Jae Hong; Hwang, Yoon Yi; Shin, Dong Kyu
2015-01-01
The type of lightweight aggregate and its volume fraction in a mix determine the density of lightweight concrete. Minimizing the density obviously requires a higher volume fraction, but this usually causes aggregate segregation in a conventional mixing process. This paper proposes a two-stage casting process to produce a lightweight concrete. This process involves placing lightweight aggregates in a frame and then filling in the remaining interstitial voids with cementitious grout. The casting process results in the lowest density of lightweight concrete, which consequently has low compressive strength. The irregularly shaped aggregates compensate for the weak point in terms of strength, while the round-shaped aggregates provide a strength of 20 MPa. Therefore, the proposed casting process can be applied for manufacturing non-structural elements and structural composites requiring a very low density and a strength of at most 20 MPa. PMID:28788007
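As a hedged illustration of the density arithmetic implied above, the bulk density of the two-stage cast concrete can be approximated by a rule of mixtures over the pre-placed aggregate packing fraction. The symbols and example values below are assumptions for illustration, not the paper's data:

```latex
% Approximate density of two-stage cast lightweight concrete:
% \phi: volume fraction occupied by pre-placed lightweight aggregate,
% \rho_a: aggregate particle density, \rho_g: density of the infiltrating grout.
\rho_c \approx \phi\,\rho_a + (1-\phi)\,\rho_g
% Illustrative numbers: \phi = 0.6, \rho_a = 800~\mathrm{kg/m^3}, \rho_g = 1900~\mathrm{kg/m^3}
% give \rho_c \approx 0.6(800) + 0.4(1900) = 1240~\mathrm{kg/m^3}.
```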
Dementia Grief: A Theoretical Model of a Unique Grief Experience
Blandin, Kesstan; Pepin, Renee
2016-01-01
Previous literature reveals a high prevalence of grief in dementia caregivers before the physical death of the person with dementia that is associated with stress, burden, and depression. To date, theoretical models and therapeutic interventions with grief in caregivers have not adequately considered the grief process, but instead have focused on grief as a symptom that manifests within the process of caregiving. The Dementia Grief Model explicates the unique process of pre-death grief in dementia caregivers. In this paper we introduce the Dementia Grief Model, describe the unique characteristics of dementia grief, and present the psychological states associated with the process of dementia grief. The model explicates an iterative grief process involving three states – separation, liminality, and re-emergence – each with a dynamic mechanism that facilitates or hinders movement through the dementia grief process. Finally, we offer potential applied research questions informed by the model. PMID:25883036
DOE Office of Scientific and Technical Information (OSTI.GOV)
Overman, N. R.; Whalen, S. A.; Bowden, M. E.
Shear Assisted Processing and Extrusion (ShAPE), a novel processing route that combines high shear and extrusion conditions, was evaluated as a processing method to densify melt-spun magnesium alloy (AZ91E) flake materials. This study illustrates the microstructural regimes and transitions in crystallographic texture that occur as a result of applying simultaneous linear and rotational shear during extrusion. Characterization of the flake precursor and extruded tube was performed using scanning and transmission electron microscopy, x-ray diffraction and microindentation techniques. Results show a unique transition in the orientation of basal texture development. Despite the high temperatures involved during processing, uniform grain refinement and material homogenization are observed. These results forecast the ability to implement the ShAPE processing approach for a broader range of materials with novel microstructures and high performance.
Cravo-Laureau, Cristiana; Duran, Robert
2014-01-01
Coastal marine sediments, where important biological processes take place, supply essential ecosystem services. By their location, such ecosystems are particularly exposed to human activities, as evidenced by the recent Deepwater Horizon disaster. This catastrophe revealed the importance of better understanding the microbial processes involved in hydrocarbon degradation in marine sediments, raising strong interest within the scientific community. During the last decade, several studies have shown the key role played by microorganisms in determining the fate of hydrocarbons in oil-polluted sediments, but only a few have taken the full complexity of the sediment into consideration. Marine coastal sediment ecosystems are characterized by remarkable heterogeneity, harbor high biodiversity and are subjected to fluctuations in environmental conditions, especially to important oxygen oscillations due to tides. Thus, for understanding the fate of hydrocarbons in such environments, it is crucial to study microbial activities, taking into account sediment characteristics, physical-chemical factors (electron acceptors, temperature), nutrient and co-metabolite availability, as well as sediment reworking due to bioturbation activities. Key information could be collected from in situ studies, which provide an overview of microbial processes, but it is difficult to integrate all the parameters involved. Microcosm experiments allow some mechanisms involved in hydrocarbon degradation to be dissected in depth but exclude environmental complexity. To overcome these limitations, strategies have been developed for studying natural microbial communities subjected to oil pollution by creating experiments as close as possible to environmental conditions. We present here a review of these approaches, their results and limitations, as well as the promising future of applying "omics" approaches to characterize in depth the microbial communities and metabolic networks involved in hydrocarbon degradation. In addition, we present the main conclusions of our studies in this field. PMID:24575083
40 CFR 26.401 - To what does this subpart apply?
Code of Federal Regulations, 2011 CFR
2011-07-01
... § 26.101(b)(2) for research involving survey or interview procedures or observations of public behavior... SUBJECTS Observational Research: Additional Protections for Children Involved as Subjects in Observational Research Conducted or Supported by EPA § 26.401 To what does this subpart apply? (a) This subpart applies...
40 CFR 26.401 - To what does this subpart apply?
Code of Federal Regulations, 2012 CFR
2012-07-01
... § 26.101(b)(2) for research involving survey or interview procedures or observations of public behavior... SUBJECTS Observational Research: Additional Protections for Children Involved as Subjects in Observational Research Conducted or Supported by EPA § 26.401 To what does this subpart apply? (a) This subpart applies...
40 CFR 26.401 - To what does this subpart apply?
Code of Federal Regulations, 2010 CFR
2010-07-01
... § 26.101(b)(2) for research involving survey or interview procedures or observations of public behavior... SUBJECTS Observational Research: Additional Protections for Children Involved as Subjects in Observational Research Conducted or Supported by EPA § 26.401 To what does this subpart apply? (a) This subpart applies...
40 CFR 26.401 - To what does this subpart apply?
Code of Federal Regulations, 2014 CFR
2014-07-01
... § 26.101(b)(2) for research involving survey or interview procedures or observations of public behavior... SUBJECTS Observational Research: Additional Protections for Children Involved as Subjects in Observational Research Conducted or Supported by EPA § 26.401 To what does this subpart apply? (a) This subpart applies...
40 CFR 26.401 - To what does this subpart apply?
Code of Federal Regulations, 2013 CFR
2013-07-01
... § 26.101(b)(2) for research involving survey or interview procedures or observations of public behavior... SUBJECTS Observational Research: Additional Protections for Children Involved as Subjects in Observational Research Conducted or Supported by EPA § 26.401 To what does this subpart apply? (a) This subpart applies...
Clarification of vaccines: An overview of filter based technology trends and best practices.
Besnard, Lise; Fabre, Virginie; Fettig, Michael; Gousseinov, Elina; Kawakami, Yasuhiro; Laroudie, Nicolas; Scanlan, Claire; Pattnaik, Priyabrata
2016-01-01
Vaccines are derived from a variety of sources including tissue extracts, bacterial cells, virus particles, recombinant mammalian, yeast and insect cell produced proteins and nucleic acids. The most common method of vaccine production is based on an initial fermentation process followed by purification. Production of vaccines is a complex process involving many different steps and processes. Selection of the appropriate purification method is critical to achieving desired purity of the final product. Clarification of vaccines is a critical step that strongly impacts product recovery and subsequent downstream purification. There are several technologies that can be applied for vaccine clarification. Selection of a harvesting method and equipment depends on the type of cells, product being harvested, and properties of the process fluids. These techniques include membrane filtration (microfiltration, tangential-flow filtration), centrifugation, and depth filtration (normal flow filtration). Historically vaccine harvest clarification was usually achieved by centrifugation followed by depth filtration. Recently membrane based technologies have gained prominence in vaccine clarification. The increasing use of single-use technologies in upstream processes necessitated a shift in harvest strategies. This review offers a comprehensive view on different membrane based technologies and their application in vaccine clarification, outlines the challenges involved and presents the current state of best practices in the clarification of vaccines. Copyright © 2015 Elsevier Inc. All rights reserved.
Criteria for identifying the molecular basis of the engram (CaMKII, PKMzeta).
Lisman, John
2017-11-29
The engram refers to the molecular changes by which a memory is stored in the brain. Substantial evidence suggests that memory involves learning-dependent changes at synapses, a process termed long-term potentiation (LTP). Thus, understanding the storage process that underlies LTP may provide insight into how the engram is stored. LTP involves induction, maintenance (storage), and expression sub-processes; special tests are required to specifically reveal properties of the storage process. The strongest of these is the Erasure test, in which a transiently applied agent that attacks a putative storage molecule may lead to persistent erasure of previously induced LTP/memory. Two major hypotheses have been proposed for LTP/memory storage: the CaMKII and PKM-zeta hypotheses. After discussing the tests that can be used to identify the engram (Necessity test, Saturation/Occlusion test, Erasure test), the status of these hypotheses is evaluated, based on the literature on LTP and memory-guided behavior. Review of the literature indicates that all three tests noted above support the CaMKII hypothesis when done at both the LTP level and at the behavioral level. Taken together, the results strongly suggest that the engram is stored by an LTP process in which CaMKII is a critical memory storage molecule.
Wavelet transform analysis of transient signals: the seismogram and the electrocardiogram
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anant, K.S.
1997-06-01
In this dissertation I quantitatively demonstrate how the wavelet transform can be an effective mathematical tool for the analysis of transient signals. The two key signal processing applications of the wavelet transform, namely feature identification and representation (i.e., compression), are shown by solving important problems involving the seismogram and the electrocardiogram. The seismic feature identification problem involved locating in time the P and S phase arrivals. Locating these arrivals accurately (particularly the S phase) has been a constant issue in seismic signal processing. In Chapter 3, I show that the wavelet transform can be used to locate both the P as well as the S phase using only information from single-station three-component seismograms. This is accomplished by using the basis function (wavelet) of the wavelet transform as a matching filter and by processing information across scales of the wavelet domain decomposition. The "pick" time results are quite promising as compared to analyst picks. The representation application involved the compression of the electrocardiogram, which is a recording of the electrical activity of the heart. Compression of the electrocardiogram is an important problem in biomedical signal processing due to transmission and storage limitations. In Chapter 4, I develop an electrocardiogram compression method that applies vector quantization to the wavelet transform coefficients. The best compression results were obtained by using orthogonal wavelets, due to their ability to represent a signal efficiently. Throughout this thesis the importance of choosing wavelets based on the problem at hand is stressed. In Chapter 5, I introduce a wavelet design method that uses linear prediction in order to design wavelets that are geared to the signal or feature being analyzed. The use of these designed wavelets in a test feature identification application led to positive results. The methods developed in this thesis (the feature identification methods of Chapter 3, the compression methods of Chapter 4, and the wavelet design methods of Chapter 5) are general enough to be easily applied to other transient signals.
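To make the matched-filter idea concrete, the sketch below applies a continuous wavelet transform to a synthetic one-component trace and takes the largest jump in cross-scale energy as a crude arrival pick. It is an illustration only, not the dissertation's algorithm: the sampling rate, wavelet ('morl'), scale range and synthetic signal are all assumptions, and real P/S picking on three-component seismograms is considerably more involved.

```python
# Crude wavelet-based arrival picking on a synthetic trace (illustrative only).
import numpy as np
import pywt

fs = 100.0                                    # sampling rate (Hz), assumed
t = np.arange(0, 20, 1 / fs)
trace = 0.05 * np.random.randn(t.size)        # background noise
onset = 8.0                                   # true arrival time (s)
mask = t >= onset
trace[mask] += np.exp(-(t[mask] - onset) * 3) * np.sin(2 * np.pi * 5 * (t[mask] - onset))

scales = np.arange(1, 64)                     # range of scales, assumed
coefs, _ = pywt.cwt(trace, scales, 'morl', sampling_period=1 / fs)

energy = np.sum(coefs ** 2, axis=0)           # energy summed across scales
pick_index = int(np.argmax(np.diff(energy)))  # largest jump in cross-scale energy
print(f"picked arrival ~ {t[pick_index]:.2f} s (true onset {onset:.1f} s)")
```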
On whether mirror neurons play a significant role in processing affective prosody.
Ramachandra, Vijayachandra
2009-02-01
Several behavioral and neuroimaging studies have indicated that both right and left cortical structures and a few subcortical ones are involved in processing affective prosody. Recent investigations have shown that the mirror neuron system plays a crucial role in several higher-level functions such as empathy, theory of mind, and language, but no studies so far link the mirror neuron system with affective prosody. This paper speculates that the mirror neuron system, which serves as a common neural substrate for different higher-level functions, may play a significant role in processing affective prosody via its connections with the limbic lobe. Empirical research applying electrophysiological and neuroimaging techniques is needed to assess whether the mirror neuron system underlies affective prosody in humans.
Responding mindfully to distressing psychosis: A grounded theory analysis.
Abba, Nicola; Chadwick, Paul; Stevenson, Chris
2008-01-01
This study investigates the psychological process involved when people with current distressing psychosis learn to respond mindfully to unpleasant psychotic sensations (voices, thoughts, and images). Sixteen participants were interviewed on completion of a mindfulness group program. Grounded theory methodology was used to generate a theory of the core psychological process, using a systematically applied set of methods linking analysis with data collection. The resulting theory describes the experience of relating differently to psychosis through a three-stage process: centering in awareness of psychosis; allowing voices, thoughts, and images to come and go without reaction or struggle; and reclaiming power through acceptance of psychosis and the self. The conceptual and clinical applications of the theory and its limits are discussed.
Fabrication of boron sputter targets
Makowiecki, D.M.; McKernan, M.A.
1995-02-28
A process is disclosed for fabricating high density boron sputtering targets with sufficient mechanical strength to function reliably at typical magnetron sputtering power densities and at normal process parameters. The process involves the fabrication of a high density boron monolith by hot isostatically compacting high purity (99.9%) boron powder, machining the boron monolith into the final dimensions, and brazing the finished boron piece to a matching boron carbide (B4C) piece by placing aluminum foil between them and applying pressure and heat in a vacuum. An alternative is the application of aluminum metallization to the back of the boron monolith by vacuum deposition. Also, a titanium-based vacuum braze alloy can be used in place of the aluminum foil. 7 figs.
Innovation and design approaches within prospective ergonomics.
Liem, André; Brangier, Eric
2012-01-01
In this conceptual article the topic of "Prospective Ergonomics" will be discussed within the context of innovation, design thinking and design processes & methods. Design thinking is essentially a human-centred innovation process that emphasises observation, collaboration, interpretation, visualisation of ideas, rapid concept prototyping and concurrent business analysis, which ultimately influences innovation and business strategy. The objective of this project is to develop a roadmap for innovation, involving consumers, designers and business people in an integrative process, which can be applied to product, service and business design. A theoretical structure comprising Innovation perspectives (1), Worldviews supported by rationalist-historicist and empirical-idealistic dimensions (2) and Models of "design" reasoning (3) precedes the development and classification of existing methods as well as the introduction of new ones.
Biskupek, Johannes; Kaiser, Ute; Falk, Fritz
2008-06-01
In this study, we describe the transport of gold (Au) nanoparticles from the surface into crystalline silicon (Si) covered by silicon oxide (SiO(2)) as revealed by in situ high-resolution transmission electron microscopy. Complete crystalline Au nanoparticles sink through the SiO(2) layer into the Si substrate when high-dose electron irradiation is applied and temperature is raised above 150 degrees C. Above temperatures of 250 degrees C, the Au nanoparticles finally dissolve into fragments accompanied by crystallization of the amorphized Si substrate around these fragments. The transport process is explained by a wetting process followed by Stokes motion. Modelling this process yields boundaries for the interface energies involved.
Laser beam heat method reported
NASA Astrophysics Data System (ADS)
Tsuchiya, Hachiro; Goto, Hidekazu
1988-07-01
An outline of research involving a processing method utilizing laser-induced thermochemistry was presented, with the CO2 laser processing of ceramics in CF4 gas used as a practical processing example. It has become clear that laser processing of ceramics with high efficiency and high precision will be possible by utilizing thermochemical processes, but it is not believed that the present method is the best one, and it is not yet clear that it can be applied to commercial processing. The processing characteristics of this method are expected to change greatly with the combination of atmospheric gas and material, so it is important to conduct tests on various combinations. However, improvement and development should become possible by theoretically confirming the basic processes involved, especially the thermochemical process between the solid surface and the atmospheric gas molecules. In practice, the thermochemical process on the solid surface is quite complicated. For example, it was confirmed that when thermochemically processing a Si monocrystal in CF4 gas, the processing speed changed by at least a factor of 10 as the gas pressure and the concentration of mixed O2 gas were varied. Conversely, the fact that this method is complicated, with many unexplained points and room for research, suggests the possibility of its being applied to various fields; in this sense, quantitative confirmation of its basic process is an important problem to be solved in the future.
A meta-analysis of fMRI studies on Chinese orthographic, phonological, and semantic processing.
Wu, Chiao-Yi; Ho, Moon-Ho Ringo; Chen, Shen-Hsing Annabel
2012-10-15
A growing body of neuroimaging evidence has shown that Chinese character processing recruits activation that differs from that of alphabetic languages due to its unique linguistic features. As more investigations on Chinese character processing have recently become available, we applied a meta-analytic approach to summarize previous findings and examined the neural networks for orthographic, phonological, and semantic processing of Chinese characters independently. The activation likelihood estimation (ALE) method was used to analyze eight studies in the orthographic task category, eleven in the phonological and fifteen in the semantic task categories. Converging activation among the three language-processing components was found in the left middle frontal gyrus, the left superior parietal lobule and the left mid-fusiform gyrus, suggesting a common sub-network underlying the character recognition process regardless of the task nature. With increasing task demands, the left inferior parietal lobule and the right superior temporal gyrus were specialized for phonological processing, while the left middle temporal gyrus was involved in semantic processing. Functional dissociation was identified in the left inferior frontal gyrus, with the posterior dorsal part for phonological processing and the anterior ventral part for semantic processing. Moreover, bilateral involvement of the ventral occipito-temporal regions was found for both phonological and semantic processing. The results provide better understanding of the neural networks underlying Chinese orthographic, phonological, and semantic processing, and consolidate the findings of additional recruitment of the left middle frontal gyrus and the right fusiform gyrus for Chinese character processing as compared with the universal language network that has been based on alphabetic languages. Copyright © 2012 Elsevier Inc. All rights reserved.
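As a rough illustration of how ALE combines studies, the sketch below models each study's reported focus as a Gaussian blob on a one-dimensional stand-in for the brain and computes the voxel-wise union of the modeled activation maps. The coordinates, kernel width and number of studies are invented; real ALE works on 3-D coordinate maps with sample-size-dependent kernels and permutation-based thresholding.

```python
# Toy 1-D activation likelihood estimation (illustrative, not the GingerALE pipeline).
import numpy as np

grid = np.linspace(-60, 60, 121)              # 1-D stand-in for voxel coordinates (mm)
study_foci = [-42.0, -38.0, -45.0, 10.0]      # one reported focus per study, invented
fwhm = 10.0
sigma = fwhm / 2.355

modeled_maps = []
for focus in study_foci:
    ma = np.exp(-0.5 * ((grid - focus) / sigma) ** 2)   # modeled activation map
    modeled_maps.append(np.clip(ma, 0.0, 1.0))

# ALE value at each voxel: union of the per-study modeled activation maps.
ale = 1.0 - np.prod([1.0 - ma for ma in modeled_maps], axis=0)
peak = grid[np.argmax(ale)]
print(f"ALE peak at {peak:.1f} mm, value {ale.max():.3f}")
```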
Quality measurement and benchmarking of HPV vaccination services: a new approach.
Maurici, Massimo; Paulon, Luca; Campolongo, Alessandra; Meleleo, Cristina; Carlino, Cristiana; Giordani, Alessandro; Perrelli, Fabrizio; Sgricia, Stefano; Ferrante, Maurizio; Franco, Elisabetta
2014-01-01
A new measurement process based upon a well-defined mathematical model was applied to evaluate the quality of human papillomavirus (HPV) vaccination centers in 3 of 12 Local Health Units (ASLs) within the Lazio Region of Italy. The quality aspects considered for evaluation were communicational efficiency, organizational efficiency and comfort. The overall maximum achievable value was 86.10%, while the HPV vaccination quality scores for ASL1, ASL2 and ASL3 were 73.07%, 71.08%, and 67.21%, respectively. With this new approach it is possible to represent the probabilistic reasoning of a stakeholder who evaluates the quality of a healthcare provider. All ASLs had margins for improvements and optimal quality results can be assessed in terms of better performance conditions, confirming the relationship between the resulting quality scores and HPV vaccination coverage. The measurement process was structured into three steps and involved four stakeholder categories: doctors, nurses, parents and vaccinated women. In Step 1, questionnaires were administered to collect different stakeholders' points of view (i.e., subjective data) that were elaborated to obtain the best and worst performance conditions when delivering a healthcare service. Step 2 of the process involved the gathering of performance data during the service delivery (i.e., objective data collection). Step 3 of the process involved the elaboration of all data: subjective data from step 1 are used to define a "standard" to test objective data from step 2. This entire process led to the creation of a set of scorecards. Benchmarking is presented as a result of the probabilistic meaning of the evaluated scores.
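A minimal sketch of the scoring logic described above, not the published model: stakeholder-derived best and worst performance bounds (step 1) normalize the observed service data (step 2), and the per-aspect scores are weighted and aggregated into a single quality percentage (step 3). The aspect names, bounds, observations and weights are hypothetical.

```python
# Illustrative three-step quality scoring: bounds from stakeholders, observed data, aggregate score.
aspects = {
    # aspect: (worst bound, best bound, observed value, weight) -- all hypothetical
    "communicational_efficiency": (0.0, 10.0, 7.2, 0.4),
    "organizational_efficiency":  (0.0, 10.0, 6.8, 0.4),
    "comfort":                    (0.0, 10.0, 6.5, 0.2),
}

def aspect_score(worst, best, observed):
    """Normalize an observed value onto [0, 1] between the worst and best bounds."""
    return max(0.0, min(1.0, (observed - worst) / (best - worst)))

overall = sum(w * aspect_score(lo, hi, obs) for lo, hi, obs, w in aspects.values())
print(f"overall quality score: {overall * 100:.2f}%")
```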
Planning and executing motions for multibody systems in free-fall. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Cameron, Jonathan M.
1991-01-01
The purpose of this research is to develop an end-to-end system that can be applied to a multibody system in free-fall to analyze its possible motions, save those motions in a database, and design a controller that can execute those motions. A goal is for the process to be highly automated and involve little human intervention. Ideally, the output of the system would be data and algorithms that could be put in ROM to control the multibody system in free-fall. The research applies to more than just robots in space. It applies to any multibody system in free-fall. Mathematical techniques from nonlinear control theory were used to study the nature of the system dynamics and its possible motions. Optimization techniques were applied to plan motions. Image compression techniques were proposed to compress the precomputed motion data for storage. A linearized controller was derived to control the system while it executes preplanned trajectories.
Perspective: Evolutionary design of granular media and block copolymer patterns
NASA Astrophysics Data System (ADS)
Jaeger, Heinrich M.; de Pablo, Juan J.
2016-05-01
The creation of new materials "by design" is a process that starts from desired materials properties and proceeds to identify requirements for the constituent components. Such a process is challenging because it inverts the typical modeling approach, which starts from given micro-level components to predict macro-level properties. We describe how to tackle this inverse problem using concepts from evolutionary computation. These concepts have widespread applicability and open up new opportunities for design as well as discovery. Here we apply them to design tasks involving two very different classes of soft materials, shape-optimized granular media and nanopatterned block copolymer thin films.
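The inverse-design loop can be sketched as a toy evolutionary search: a population of candidate shape-parameter vectors is scored against a target macroscopic property and the best candidates are mutated into the next generation. The fitness function below is a stand-in for what would, in practice, be a granular-media or block-copolymer simulation; all numbers are assumptions.

```python
# Toy evolutionary search for an inverse materials-design problem (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
target = 0.8                                      # desired macro-level property, assumed

def simulate_property(params):
    """Placeholder for an expensive simulation of the material property."""
    return params.mean() - 0.1 * params.var()

def fitness(params):
    return -abs(simulate_property(params) - target)

pop = rng.uniform(0, 1, size=(30, 6))             # 30 candidates, 6 shape parameters
for generation in range(50):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]       # keep the 10 best candidates
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.05, (30, 6))
    pop = np.clip(children, 0.0, 1.0)             # mutate and keep parameters in bounds

best = max(pop, key=fitness)
print(f"best property {simulate_property(best):.3f} vs target {target}")
```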
The Utility of EEG Band Power Analysis in the Study of Infancy and Early Childhood
Saby, Joni N.; Marshall, Peter J.
2012-01-01
Research employing electroencephalographic (EEG) techniques with infants and young children has flourished in recent years due to increased interest in understanding the neural processes involved in early social and cognitive development. This review focuses on the functional characteristics of the alpha, theta, and gamma frequency bands in the developing EEG. Examples of how analyses of EEG band power have been applied to specific lines of developmental research are also discussed. These examples include recent work on the infant mu rhythm and action processing, frontal alpha asymmetry and approach-withdrawal tendencies, and EEG power measures in the study of early psychosocial adversity. PMID:22545661
Vascularization strategies for tissue engineers.
Dew, Lindsey; MacNeil, Sheila; Chong, Chuh Khiun
2015-01-01
All tissue-engineered substitutes (with the exception of cornea and cartilage) require a vascular network to provide the nutrient and oxygen supply needed for their survival in vivo. Unfortunately the process of vascular ingrowth into an engineered tissue can take weeks to occur naturally and during this time the tissues become starved of essential nutrients, leading to tissue death. This review initially gives a brief overview of the processes and factors involved in the formation of new vasculature. It then summarizes the different approaches that are being applied or developed to overcome the issue of slow neovascularization in a range of tissue-engineered substitutes. Some potential future strategies are then discussed.
NASA Astrophysics Data System (ADS)
Wyrick, Jonathan; Einstein, T. L.; Bartels, Ludwig
2015-03-01
We present a method of analyzing the results of density functional modeling of molecular adsorption in terms of an analogue of molecular orbitals. This approach permits intuitive chemical insight into the adsorption process. Applied to a set of anthracene derivatives (anthracene, 9,10-anthraquinone, 9,10-dithioanthracene, and 9,10-diselenonanthracene), we follow the electronic states of the molecules that are involved in the bonding process and correlate them to both the molecular adsorption geometry and the species' diffusive behavior. We additionally provide computational code to easily repeat this analysis on any system.
Energy geotechnics: Advances in subsurface energy recovery, storage, exchange, and waste management
McCartney, John S.; Sanchez, Marcelo; Tomac, Ingrid
2016-02-17
Energy geotechnics involves the use of geotechnical principles to understand and engineer the coupled thermo-hydro-chemo-mechanical processes encountered in collecting, exchanging, storing, and protecting energy resources in the subsurface. In addition to research on these fundamental coupled processes and characterization of relevant material properties, applied research is being performed to develop analytical tools for the design and analysis of different geo-energy applications. In conclusion, the aims of this paper are to discuss the fundamental physics and constitutive models that are common to these different applications, and to summarize recent advances in the development of relevant analytical tools.
Solder extrusion pressure bonding process and bonded products produced thereby
Beavis, Leonard C.; Karnowsky, Maurice M.; Yost, Frederick G.
1992-01-01
Production of soldered joints which are highly reliable and capable of surviving 10,000 thermal cycles between about -40 °C and 110 °C. Process involves interposing a thin layer of a metal solder composition between the metal surfaces of members to be bonded and applying heat and up to about 1000 psi compression pressure to the superposed members, in the presence of a reducing atmosphere, to extrude the major amount of the solder composition, contaminants including fluxing gases and air, from between the members being bonded, to form a very thin, strong intermetallic bonding layer having a thermal expansion tolerant with that of the bonded members.
Energy geotechnics: Advances in subsurface energy recovery, storage, exchange, and waste management
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCartney, John S.; Sanchez, Marcelo; Tomac, Ingrid
Energy geotechnics involves the use of geotechnical principles to understand and engineer the coupled thermo-hydro-chemo-mechanical processes encountered in collecting, exchanging, storing, and protecting energy resources in the subsurface. In addition to research on these fundamental coupled processes and characterization of relevant material properties, applied research is being performed to develop analytical tools for the design and analysis of different geo-energy applications. In conclusion, the aims of this paper are to discuss the fundamental physics and constitutive models that are common to these different applications, and to summarize recent advances in the development of relevant analytical tools.
Solder extrusion pressure bonding process and bonded products produced thereby
NASA Astrophysics Data System (ADS)
Beavis, L. C.; Karnowsky, M. M.; Yost, F. G.
1990-04-01
The soldered joints produced are highly reliable and capable of surviving 10,000 thermal cycles between about -40 and 110 °C. The process involves interposing a thin layer of a metal solder composition between the metal surfaces of members to be bonded and applying heat and up to about 1000 psi compression pressure to the superposed members, in the presence of a reducing atmosphere, to extrude the major amount of the solder composition, contaminants including fluxing gases and air, from between the members being bonded, to form a very thin, strong intermetallic bonding layer having a thermal expansion tolerant with that of the bonded members.
Laser-Induced-Emission Spectroscopy In Hg/Ar Discharge
NASA Technical Reports Server (NTRS)
Maleki, Lutfollah; Blasenheim, Barry J.; Janik, Gary R.
1992-01-01
Laser-induced-emission (LIE) spectroscopy used to probe low-pressure mercury/argon discharge to determine influence of mercury atoms in metastable 6(Sup3)P(Sub2) state on emission of light from discharge. LIE used to study all excitation processes affected by metastable population, including possible effects on excitation of atoms, ions, and buffer gas. Technique applied to emissions of other plasmas. Provides data used to make more-accurate models of such emissions, exploited by lighting and laser industries and by laboratories studying discharges. Also useful in making quantitative measurements of relative rates and cross sections of direct and two-step collisional processes involving metastable level.
Dai, Ru H.; Chen, Hsueh-Chih; Chan, Yu C.; Wu, Ching-Lin; Li, Ping; Cho, Shu L.; Hu, Jon-Fan
2017-01-01
It is well accepted that humor comprehension involves incongruity detection and resolution, which then induce a feeling of amusement. However, this three-stage model of humor processing does not apply to absurd humor (so-called nonsense humor). Absurd humor contains an unresolvable incongruity but can still induce a feeling of mirth. In this study, we used functional magnetic resonance imaging (fMRI) to identify the neural mechanisms of absurd humor. Specifically, we aimed to investigate the neural substrates associated with the complete resolution of incongruity-resolution humor and the partial resolution of absurd humor. Based on the fMRI data, we propose a dual-path model of incongruity-resolution and absurd verbal humor. According to this model, the detection and resolution of the incongruity in incongruity-resolution humor activate the temporo-parietal junction (TPJ), implicated in the integration of multiple sources of information, and the precuneus, likely involved in perspective taking. The appreciation of incongruity-resolution humor activates the posterior cingulate cortex (PCC), implicated in autobiographical or event memory retrieval, and the parahippocampal gyrus (PHG), reflecting the feeling of amusement. By contrast, the partial resolution of absurd humor elicits greater activation in the fusiform gyrus, which has been implicated in word processing, the inferior frontal gyrus (IFG) for the process of incongruity resolution, and the superior temporal gyrus (STG) for pragmatic awareness. PMID:28484402
Diversity of neuropsin (KLK8)-dependent synaptic associativity in the hippocampal pyramidal neuron
Ishikawa, Yasuyuki; Tamura, Hideki; Shiosaka, Sadao
2011-01-01
Hippocampal early (E-) long-term potentiation (LTP) and long-term depression (LTD) elicited by a weak stimulus normally fade within 90 min. Late (L-) LTP and LTD elicited by strong stimuli continue for >180 min and require new protein synthesis to persist. If a strong tetanus is applied once to synaptic inputs, even a weak tetanus applied to another synaptic input can evoke persistent LTP. A synaptic tag is hypothesized to enable the capture of newly synthesized synaptic molecules. This process, referred to as synaptic tagging, occurs not only between the same processes (i.e. E- and L-LTP; E- and L-LTD) but also between different processes (i.e. E-LTP and L-LTD; E-LTD and L-LTP) induced at two independent synaptic inputs (cross-tagging). However, the mechanisms of synaptic tag setting remain unclear. In our previous study, we found that synaptic associativity in the hippocampal Schaffer collateral pathway depended on neuropsin (kallikrein-related peptidase 8 or KLK8), a plasticity-related extracellular protease. In the present study, we investigated how neuropsin participates in synaptic tagging and cross-tagging. We report that neuropsin is involved in synaptic tagging during LTP at basal and apical dendritic inputs. Moreover, neuropsin is involved in synaptic tagging and cross-tagging during LTP at apical dendritic inputs via integrin β1 and calcium/calmodulin-dependent protein kinase II signalling. Thus, neuropsin is a candidate molecule for the LTP-specific tag setting and regulates the transformation of E- to L-LTP during both synaptic tagging and cross-tagging. PMID:21646406
Damle, Aneel; Andrew, Nathan; Kaur, Shubjeet; Orquiola, Alan; Alavi, Karim; Steele, Scott R; Maykel, Justin
2016-07-01
Lean processes involve streamlining methods and maximizing efficiency. Well established in the manufacturing industry, they are increasingly being applied to health care. The objective of this study was to determine the feasibility and effectiveness of applying Lean principles to an academic medical center colonoscopy unit. Lean process improvement involved training endoscopy personnel, observing patients, mapping the value stream, analyzing patient flow, designing and implementing new processes, and finally re-observing the process. Our primary endpoint was total colonoscopy time (minutes from check-in to discharge) with secondary endpoints of individual segment times and unit colonoscopy capacity. A total of 217 patients were included (November 2013-May 2014), with 107 pre-Lean and 110 post-Lean intervention. Pre-Lean total colonoscopy time was 134 min. After implementation of the Lean process, mean colonoscopy time decreased by 10 % to 121 min (p = 0.01). The three steps of the process affected by the Lean intervention (time to achieve adequate sedation, time to recovery, and time to discharge) decreased from 3.7 to 2.4 min (p < 0.01), 4.0 to 3.4 min (p = 0.09), and 41.2 to 35.4 min (p = 0.05), respectively. Overall, unit capacity of colonoscopies increased from 39.6 per day to 43.6. Post-Lean patient satisfaction surveys demonstrated an average score of 4.5/5.0 (n = 73) regarding waiting time, 4.9/5.0 (n = 60) regarding how favorably this experience compared to prior colonoscopy experiences, and 4.9/5.0 (n = 74) regarding professionalism of staff. One hundred percent of respondents (n = 69) stated they would recommend our institution to a friend for colonoscopy. With no additional utilization of resources, a single Lean process improvement cycle increased productivity and capacity of our colonoscopy unit. We expect this to result in increased patient access and revenue while maintaining patient satisfaction. We believe these results are widely generalizable to other colonoscopy units as well as other process-based interventions in health care.
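As a hedged illustration of the pre/post comparison reported above, the sketch below runs a two-sample t-test on simulated total colonoscopy times centered on the published means (134 vs. 121 min); the actual study used the real per-patient times and its own statistical workflow.

```python
# Simulated pre/post Lean comparison of total colonoscopy times (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre_lean = rng.normal(loc=134, scale=30, size=107)    # minutes, simulated around reported mean
post_lean = rng.normal(loc=121, scale=30, size=110)

t_stat, p_value = stats.ttest_ind(pre_lean, post_lean, equal_var=False)
print(f"mean pre={pre_lean.mean():.1f} min, post={post_lean.mean():.1f} min, "
      f"t={t_stat:.2f}, p={p_value:.3f}")
```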
Ito, Jun; Herter, Thomas; Baidoo, Edward E K; Lao, Jeemeng; Vega-Sánchez, Miguel E; Michelle Smith-Moritz, A; Adams, Paul D; Keasling, Jay D; Usadel, Björn; Petzold, Christopher J; Heazlewood, Joshua L
2014-03-01
Understanding the intricate metabolic processes involved in plant cell wall biosynthesis is limited by difficulties in performing sensitive quantification of many involved compounds. Hydrophilic interaction liquid chromatography is a useful technique for the analysis of hydrophilic metabolites from complex biological extracts and forms the basis of this method to quantify plant cell wall precursors. A zwitterionic silica-based stationary phase has been used to separate hydrophilic nucleotide sugars involved in cell wall biosynthesis from milligram amounts of leaf tissue. Tandem mass spectrometry operating in selected reaction monitoring mode was used to quantify the nucleotide sugars. This method was highly repeatable and quantified 12 nucleotide sugars at low femtomole quantities, with linear responses over up to four orders of magnitude, up to several hundred picomoles. The method was also successfully applied to the analysis of purified leaf extracts from two model plant species with variations in their cell wall sugar compositions and indicated significant differences in the levels of 6 out of 12 nucleotide sugars. The plant nucleotide sugar extraction procedure was demonstrated to have good recovery rates with minimal matrix effects. The approach results in a significant improvement in sensitivity when applied to plant samples over currently employed techniques. Copyright © 2013 Elsevier Inc. All rights reserved.
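The quantification step can be illustrated with a simple external calibration curve: SRM peak areas for a dilution series of standards are fitted by linear regression and used to back-calculate an unknown amount. The standard amounts and peak areas below are invented for illustration and are not data from the study.

```python
# External calibration and back-calculation of an unknown amount (illustrative numbers).
import numpy as np

std_amount_fmol = np.array([5, 50, 500, 5000, 50000])        # injected standards (fmol)
std_peak_area = np.array([1.1e3, 1.0e4, 9.8e4, 1.02e6, 9.9e6])  # SRM peak areas, invented

slope, intercept = np.polyfit(std_amount_fmol, std_peak_area, 1)  # linear calibration fit

unknown_area = 2.4e5
unknown_fmol = (unknown_area - intercept) / slope
print(f"estimated amount: {unknown_fmol:.0f} fmol on column")
```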
Insights into microbial involvement in desert varnish formation retrieved from metagenomic analysis.
Lang-Yona, Naama; Maier, Stefanie; Macholdt, Dorothea S; Müller-Germann, Isabell; Yordanova, Petya; Rodriguez-Caballero, Emilio; Jochum, Klaus P; Al-Amri, Abdullah; Andreae, Meinrat O; Fröhlich-Nowoisky, Janine; Weber, Bettina
2018-02-28
Desert varnishes are dark rock coatings observed in arid environments and might resemble Mn-rich coatings found on Martian rocks. Their formation mechanism is not fully understood and the possible microbial involvement is under debate. In this study, we applied DNA metagenomic shotgun sequencing of varnish and surrounding soil to evaluate the composition of the microbial community and its potential metabolic function. We found that the α diversity was lower in varnish compared to soil samples (p value < 0.05), suggesting distinct populations with significantly higher abundance of Actinobacteria, Proteobacteria and Cyanobacteria within the varnish. Additionally, we observed increased levels of transition metal metabolic processes in varnish compared to soil samples. Nevertheless, potentially relevant enzymes for varnish formation were detected at low to insignificant levels in both niches, indicating no current direct microbial involvement in Mn oxidation. This finding is supported by quantitative genomic analysis, elemental analysis, fluorescence imaging and scanning transmission X-ray microscopy. We thus conclude that the distinct microbial communities detected in desert varnish originate from settled Aeolian microbes, which colonized this nutrient-enriched niche, and discuss possible indirect contributions of microorganisms to the formation of desert varnish. © 2018 Society for Applied Microbiology and John Wiley & Sons Ltd.
Informational analysis involving application of complex information system
NASA Astrophysics Data System (ADS)
Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael
The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. The analysis was applied to internal audit, integrating the accounting field with the information systems field. Technological advancements can provide improvements to the work performed by internal audit. Thus we aim to find, within the complex information system, priorities for the internal audit work of a major private institution of higher education. The method applied is quali-quantitative: strategic linguistic variables were defined and then transformed into quantitative values by means of matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to identify points which must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities in its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures to work in favor of the attainment of the objectives of the organization.
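A minimal sketch, not the authors' model, of how linguistic ratings can be turned into quantitative priorities: each audit area receives linguistic ratings on a few criteria, the ratings are mapped to membership values in [0, 1], and a fuzzy intersection (minimum) across criteria ranks the areas. Area names, criteria, ratings and the membership scale are hypothetical.

```python
# Fuzzy prioritization of audit areas from linguistic ratings (illustrative only).
linguistic_scale = {"low": 0.2, "medium": 0.5, "high": 0.8, "very high": 1.0}

audit_areas = {
    # area: (strategic relevance, perceived risk) -- hypothetical ratings
    "tuition billing":  ("high", "very high"),
    "procurement":      ("medium", "high"),
    "academic records": ("high", "medium"),
}

def priority(ratings):
    """Fuzzy intersection (minimum) of the memberships of all criteria."""
    return min(linguistic_scale[r] for r in ratings)

ranking = sorted(audit_areas.items(), key=lambda kv: priority(kv[1]), reverse=True)
for area, ratings in ranking:
    print(f"{area}: priority {priority(ratings):.1f}")
```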
Pre-genomic, genomic and post-genomic study of microbial communities involved in bioenergy.
Rittmann, Bruce E; Krajmalnik-Brown, Rosa; Halden, Rolf U
2008-08-01
Microorganisms can produce renewable energy in large quantities and without damaging the environment or disrupting food supply. The microbial communities must be robust and self-stabilizing, and their essential syntrophies must be managed. Pre-genomic, genomic and post-genomic tools can provide crucial information about the structure and function of these microbial communities. Applying these tools will help accelerate the rate at which microbial bioenergy processes move from intriguing science to real-world practice.
Marine geodetic control for geoidal profile mapping across the Puerto Rican Trench
NASA Technical Reports Server (NTRS)
Fubara, D. M.; Mourad, A. G.
1975-01-01
A marine geodetic control was established for the northern end of the geoidal profile mapping experiment across the Puerto Rican Trench by determining the three-dimensional geodetic coordinates of the four ocean-bottom mounted acoustic transponders. The data reduction techniques employed and analytical processes involved are described. Before applying the analytical techniques to the field data, they were tested with simulated data and proven to be effective in theory as well as in practice.
Role and interest of new technologies in data processing for space control centers
NASA Astrophysics Data System (ADS)
Denier, Jean-Paul; Caspar, Raoul; Borillo, Mario; Soubie, Jean-Luc
1990-10-01
The ways in which a multidisciplinary approach will improve space control centers are discussed. Electronic documentation, ergonomics of human computer interfaces, natural language, intelligent tutoring systems and artificial intelligence systems are considered and applied in the study of the Hermes flight control center. It is concluded that such technologies are best integrated into a classical operational environment rather than taking a revolutionary approach which would involve a global modification of the system.
The calcium binding properties and structure prediction of the Hax-1 protein.
Balcerak, Anna; Rowinski, Sebastian; Szafron, Lukasz M; Grzybowska, Ewa A
2017-01-01
Hax-1 is a protein involved in regulation of different cellular processes, but its properties and exact mechanisms of action remain unknown. In this work, using purified, recombinant Hax-1 and by applying an in vitro autoradiography assay, we have shown that this protein binds Ca2+. Additionally, we performed structure prediction analysis which shows that Hax-1 displays definitive structural features, such as two α-helices, short β-strands and four disordered segments.
Techno-economic analysis of a biomass depot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobson, Jacob Jordan; Lamers, Patrick; Roni, Mohammad Sadekuzzaman
2014-10-01
The U.S. Department of Energy (DOE) Bioenergy Technologies Office (BETO) promotes the production of an array of liquid fuels and fuel blendstocks from lignocellulosic biomass feedstocks by funding fundamental and applied research that advances the state of technology in biomass collection, conversion, and sustainability. As part of its involvement in this program, the Idaho National Laboratory (INL) investigates the technical, economic, and environmental performance of different feedstock supply systems and their impacts on the downstream conversion processes.
Contributions of in situ microscopy to the current understanding of stone biodeterioration.
de Los Ríos, Asunción; Ascaso, Carmen
2005-09-01
In situ microscopy consists of simultaneously applying several microscopy techniques without separating the biological component from its habitat. Over the past few years, this strategy has allowed characterization of the biofilms involved in biodeterioration processes affecting stone monuments and has revealed the biogeophysical and biogeochemical impact of the microbiota present. In addition, through in situ microscopy diagnosis, appropriate treatments can be designed to resolve the problems related to microbial colonization of stone monuments.
Applying Acquisition Lessons Learned to Operational Energy Initiatives
2013-03-01
Minakata, Daisuke; Mezyk, Stephen P; Jones, Jace W; Daws, Brittany R; Crittenden, John C
2014-12-02
Aqueous phase advanced oxidation processes (AOPs) produce hydroxyl radicals (HO•) which can completely oxidize electron-rich organic compounds. The proper design and operation of AOPs require that we predict the formation and fate of the byproducts and their associated toxicity. Accordingly, there is a need to develop a first-principles kinetic model that can predict the dominant reaction pathways that potentially produce toxic byproducts. We have published some of our efforts on predicting the elementary reaction pathways and the HO• rate constants. Here we develop linear free energy relationships (LFERs) that predict the rate constants for aqueous phase radical reactions. The LFERs relate experimentally obtained kinetic rate constants to quantum mechanically calculated aqueous phase free energies of activation. The LFERs have been applied to 101 reactions, including (1) HO• addition to 15 aromatic compounds; (2) addition of molecular oxygen to 65 carbon-centered aliphatic and cyclohexadienyl radicals; (3) disproportionation of 10 peroxyl radicals, and (4) unimolecular decay of nine peroxyl radicals. The LFER correlations predict the rate constants within a factor of 2 from the experimental values for HO• reactions and molecular oxygen addition, and a factor of 5 for peroxyl radical reactions. The LFERs and the elementary reaction pathways will enable us to predict the formation and initial fate of the byproducts in AOPs. Furthermore, our methodology can be applied to other environmental processes in which aqueous phase radical-involved reactions occur.
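A hedged sketch of how such an LFER is built and used: measured rate constants are regressed against quantum-chemically computed aqueous-phase free energies of activation, and the fitted line predicts the rate constant of a new compound from its computed barrier. The numbers below are invented for illustration.

```python
# Linear free energy relationship: log10(k) regressed on computed activation free energy.
import numpy as np

dG_act_kcal = np.array([3.5, 4.2, 5.0, 5.8, 6.5])       # computed barriers (kcal/mol), invented
k_exp = np.array([8.0e9, 5.5e9, 3.0e9, 1.2e9, 6.0e8])   # measured rate constants (M^-1 s^-1), invented

slope, intercept = np.polyfit(dG_act_kcal, np.log10(k_exp), 1)  # fit the LFER

dG_new = 4.8                                             # a new compound's computed barrier
k_pred = 10 ** (slope * dG_new + intercept)
print(f"predicted rate constant: {k_pred:.2e} M^-1 s^-1")
```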
Application of gamma-ray spectrometry in a NORM industry for its radiometrical characterization
NASA Astrophysics Data System (ADS)
Mantero, J.; Gázquez, M. J.; Hurtado, S.; Bolívar, J. P.; García-Tenorio, R.
2015-11-01
Industrial activities involving Naturally Occurring Radioactive Materials (NORM), such as oil/gas facilities, metal production, the phosphate industry and zircon treatment, are found among the most important industrial sectors worldwide, and the radiometric characterization of the materials involved in their production processes is essential in order to assess the potential radiological risk for workers and the natural environment. High-resolution gamma spectrometry is a versatile, non-destructive radiometric technique that makes the simultaneous determination of several radionuclides possible with little sample preparation. However, NORM samples cover a wide variety of densities and compositions, as opposed to the standards used in gamma efficiency calibration, which are either water-based solutions or standard/reference sources of similar composition. For that reason, self-absorption correction effects (especially in the low-energy range) must be considered individually for every sample. In this work an experimental and a semi-empirical methodology of self-absorption correction were applied to NORM samples, and the results obtained were compared critically in order to establish the best practice in relation to the circumstances of an individual laboratory. This methodology was applied to samples coming from a TiO2 factory (a NORM industry) located in the south-west of Spain, where the activity concentrations of several radionuclides from the uranium and thorium series were measured throughout the production process. These results are presented in this work.
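Under strongly simplifying assumptions, a semi-empirical self-absorption correction for a slab-like sample counted face-on can be sketched as follows: the mean photon survival fraction through the sample and through the calibration standard is modeled as (1 - exp(-mu*t))/(mu*t), and the efficiency-calibrated activity is scaled by their ratio. The attenuation coefficients, thickness and activity below are illustrative values, not data from this work.

```python
# Simple slab-geometry self-absorption correction (illustrative values only).
import numpy as np

def self_abs_factor(mu_linear, thickness_cm):
    """Average photon survival fraction through a slab of given thickness."""
    x = mu_linear * thickness_cm
    return (1.0 - np.exp(-x)) / x

mu_sample = 0.35     # linear attenuation coefficient of the NORM sample (1/cm), assumed
mu_standard = 0.20   # attenuation coefficient of the calibration standard (1/cm), assumed
thickness = 1.0      # sample thickness in the counting geometry (cm), assumed

correction = self_abs_factor(mu_standard, thickness) / self_abs_factor(mu_sample, thickness)
activity_corrected = 12.4 * correction   # Bq/kg obtained with the standard efficiency, illustrative
print(f"self-absorption correction factor: {correction:.3f}, "
      f"corrected activity: {activity_corrected:.1f} Bq/kg")
```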
45 CFR 46.201 - To what do these regulations apply?
Code of Federal Regulations, 2010 CFR
2010-10-01
... HUMAN SUBJECTS Additional Protections for Pregnant Women, Human Fetuses and Neonates Involved in... section, this subpart applies to all research involving pregnant women, human fetuses, neonates of...
45 CFR 46.201 - To what do these regulations apply?
Code of Federal Regulations, 2012 CFR
2012-10-01
... HUMAN SUBJECTS Additional Protections for Pregnant Women, Human Fetuses and Neonates Involved in... section, this subpart applies to all research involving pregnant women, human fetuses, neonates of...
45 CFR 46.201 - To what do these regulations apply?
Code of Federal Regulations, 2013 CFR
2013-10-01
... HUMAN SUBJECTS Additional Protections for Pregnant Women, Human Fetuses and Neonates Involved in... section, this subpart applies to all research involving pregnant women, human fetuses, neonates of...
45 CFR 46.201 - To what do these regulations apply?
Code of Federal Regulations, 2014 CFR
2014-10-01
... HUMAN SUBJECTS Additional Protections for Pregnant Women, Human Fetuses and Neonates Involved in... section, this subpart applies to all research involving pregnant women, human fetuses, neonates of...
45 CFR 46.201 - To what do these regulations apply?
Code of Federal Regulations, 2011 CFR
2011-10-01
... HUMAN SUBJECTS Additional Protections for Pregnant Women, Human Fetuses and Neonates Involved in... section, this subpart applies to all research involving pregnant women, human fetuses, neonates of...
Kim, Hongkeun
2018-03-15
Functional neuroimaging studies on episodic memory retrieval consistently indicated the activation of the precuneus (PCU), mid-cingulate cortex (MCC), and lateral intraparietal sulcus (latIPS) regions. Although studies typically interpreted these activations in terms of memory retrieval processes, resting-state functional connectivity data indicate that these regions are part of the frontoparietal control network, suggesting a more general, cross-functional role. In this regard, this study proposes a novel hypothesis which suggests that the parietal control network plays a strong role in accommodating the co-occurrence of externally directed cognition (EDC) and internally directed cognition (IDC), which are typically antagonistic to each other. To evaluate how well this dual cognitive processes hypothesis can account for parietal activation patterns during memory tasks, this study provides a cross-function meta-analysis involving 3 different memory paradigms, namely, retrieval success (hit > correct rejection), repetition enhancement (repeated > novel), and subsequent forgetting (forgotten > remembered). Common to these paradigms is that the target condition may involve both EDC (stimulus processing and motor responding) and IDC (intentional remembering, involuntary awareness of previous encounter, or task-unrelated thoughts) strongly, whereas the reference condition may involve EDC to a greater extent, but IDC to a lesser extent. Thus, the dual cognitive processes hypothesis predicts that each of these paradigms will activate similar, overlapping PCU, MCC, and latIPS regions. The results were fully consistent with the prediction, supporting the dual cognitive processes hypothesis. Evidence from relevant prior studies suggests that the dual cognitive processes hypothesis may also apply to non-memory domain tasks. Copyright © 2018 Elsevier B.V. All rights reserved.
Qiu, Yuchen; Yan, Shiju; Gundreddy, Rohith Reddy; Wang, Yunzhi; Cheng, Samuel; Liu, Hong; Zheng, Bin
2017-01-01
PURPOSE To develop and test a deep learning based computer-aided diagnosis (CAD) scheme of mammograms for classifying between malignant and benign masses. METHODS An image dataset involving 560 regions of interest (ROIs) extracted from digital mammograms was used. After down-sampling each ROI from 512×512 to 64×64 pixel size, we applied an 8 layer deep learning network that involves 3 pairs of convolution-max-pooling layers for automatic feature extraction and a multiple layer perceptron (MLP) classifier for feature categorization to process ROIs. The 3 pairs of convolution layers contain 20, 10, and 5 feature maps, respectively. Each convolution layer is connected with a max-pooling layer to improve the feature robustness. The output of the sixth layer is fully connected with a MLP classifier, which is composed of one hidden layer and one logistic regression layer. The network then generates a classification score to predict the likelihood of ROI depicting a malignant mass. A four-fold cross validation method was applied to train and test this deep learning network. RESULTS The results revealed that this CAD scheme yields an area under the receiver operation characteristic curve (AUC) of 0.696±0.044, 0.802±0.037, 0.836±0.036, and 0.822±0.035 for fold 1 to 4 testing datasets, respectively. The overall AUC of the entire dataset is 0.790±0.019. CONCLUSIONS This study demonstrates the feasibility of applying a deep learning based CAD scheme to classify between malignant and benign breast masses without a lesion segmentation, image feature computation and selection process. PMID:28436410
Qiu, Yuchen; Yan, Shiju; Gundreddy, Rohith Reddy; Wang, Yunzhi; Cheng, Samuel; Liu, Hong; Zheng, Bin
2017-01-01
To develop and test a deep learning based computer-aided diagnosis (CAD) scheme of mammograms for classifying between malignant and benign masses. An image dataset involving 560 regions of interest (ROIs) extracted from digital mammograms was used. After down-sampling each ROI from 512×512 to 64×64 pixel size, we applied an 8 layer deep learning network that involves 3 pairs of convolution-max-pooling layers for automatic feature extraction and a multiple layer perceptron (MLP) classifier for feature categorization to process ROIs. The 3 pairs of convolution layers contain 20, 10, and 5 feature maps, respectively. Each convolution layer is connected with a max-pooling layer to improve the feature robustness. The output of the sixth layer is fully connected with a MLP classifier, which is composed of one hidden layer and one logistic regression layer. The network then generates a classification score to predict the likelihood of ROI depicting a malignant mass. A four-fold cross validation method was applied to train and test this deep learning network. The results revealed that this CAD scheme yields an area under the receiver operation characteristic curve (AUC) of 0.696±0.044, 0.802±0.037, 0.836±0.036, and 0.822±0.035 for fold 1 to 4 testing datasets, respectively. The overall AUC of the entire dataset is 0.790±0.019. This study demonstrates the feasibility of applying a deep learning based CAD scheme to classify between malignant and benign breast masses without a lesion segmentation, image feature computation and selection process.
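A sketch of a network with the layout described above (three convolution/max-pooling pairs with 20, 10 and 5 feature maps, followed by an MLP with one hidden layer and a logistic output), written here in Keras for illustration; the kernel sizes, hidden-layer width, optimizer and training details are assumptions and may differ from the original implementation.

```python
# CNN + MLP classifier for 64x64 mammogram ROIs (architectural sketch, assumptions noted above).
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(20, (5, 5), activation='relu', input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(10, (5, 5), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(5, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation='relu'),      # hidden MLP layer, width assumed
    tf.keras.layers.Dense(1, activation='sigmoid'),    # malignancy likelihood score
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=[tf.keras.metrics.AUC()])
model.summary()
```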
Ianni, Elena; Geneletti, Davide
2010-11-01
This paper proposes a method to select forest restoration priority areas consistently with the key principles of the Ecosystem Approach (EA) and the Forest Landscape Restoration (FLR) framework. The methodology is based on the principles shared by the two approaches: acting at ecosystem scale, involving stakeholders, and evaluating alternatives. It proposes the involvement of social actors which have a stake in forest management through multicriteria analysis sessions aimed at identifying the most suitable forest restoration intervention. The method was applied to a study area in the native forests of Northern Argentina (the Yungas). Stakeholders were asked to identify alternative restoration actions, i.e. potential areas implementing FLR. Ten alternative fincas (estates derived from the Spanish land tenure system), differing in relation to ownership, management, land use, land tenure, and size, were evaluated. Twenty criteria were selected and classified into four groups: biophysical, social, economic and political. Finca Ledesma was the closest to the economic, social, environmental and political goals, according to the values and views of the actors involved in the decision. This study represented the first attempt to apply EA principles to forest restoration at landscape scale in the Yungas region. The benefits obtained by the application of the method were twofold: on one hand, researchers and local actors were forced to conceive the Yungas as a complex net of rights rather than as a sum of personal interests. On the other hand, the participatory multicriteria approach provided a structured process for collective decision-making in an area where it has never been implemented.
NASA Astrophysics Data System (ADS)
Ianni, Elena; Geneletti, Davide
2010-11-01
This paper proposes a method to select forest restoration priority areas consistently with the key principles of the Ecosystem Approach (EA) and the Forest Landscape Restoration (FLR) framework. The methodology is based on the principles shared by the two approaches: acting at ecosystem scale, involving stakeholders, and evaluating alternatives. It proposes the involvement of social actors which have a stake in forest management through multicriteria analysis sessions aimed at identifying the most suitable forest restoration intervention. The method was applied to a study area in the native forests of Northern Argentina (the Yungas). Stakeholders were asked to identify alternative restoration actions, i.e. potential areas implementing FLR. Ten alternative fincas—estates derived from the Spanish land tenure system—differing in relation to ownership, management, land use, land tenure, and size were evaluated. Twenty criteria were selected and classified into four groups: biophysical, social, economic and political. Finca Ledesma was the closest to the economic, social, environmental and political goals, according to the values and views of the actors involved in the decision. This study represented the first attempt to apply EA principles to forest restoration at landscape scale in the Yungas region. The benefits obtained by the application of the method were twofold: on one hand, researchers and local actors were forced to conceive the Yungas as a complex net of rights rather than as a sum of personal interests. On the other hand, the participatory multicriteria approach provided a structured process for collective decision-making in an area where it has never been implemented.
Introducing Interactive Teaching Styles into Astronomy Lectures
NASA Astrophysics Data System (ADS)
Deming, G. L.
1997-12-01
The majority of undergraduate students who take an astronomy class are non-science majors attempting to satisfy a science requirement. Often in these "scientific literacy" courses, facts are memorized for the exam and forgotten shortly afterwards. Scientific literacy courses should advance student skills toward processing information and applying higher order thinking rather than simple recall and memorization of facts. Thinking about material as it is presented, applying new knowledge to solve problems, and thinking critically about topics are objectives that many astronomy instructors hope their students are achieving. A course in astronomy is more likely to achieve such goals if students routinely participate in their learning. Interactive techniques can be quite effective even in large classes. Examples of activities are presented that involve using cooperative learning techniques, writing individual and group "minute papers," identifying and correcting misconceptions, including the whole class in a demonstration, and applying knowledge to new situations.
Korban, Zygmunt
2015-01-01
Occupational health and safety management systems apply audit examinations as an integral element. The examinations are used to verify whether the actions undertaken comply with the accepted regulations, whether they are implemented in a suitable way and whether they are effective. One of the earliest solutions of that type applied in the mining industry in Poland involved the application of audit research based on the MERIT survey (Management Evaluation Regarding Itemized Tendencies). A mathematical model applied in the survey facilitates the determination of assessment indexes WOPi for each of the assessed problem areas, which, among other things, can be used to set up problem area rankings and to determine an aggregate (synthetic) assessment. In the paper presented here, the assessment indexes WOPi were used to calculate a development measure, and the calculation process itself was supplemented with sensitivity analysis.
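Purely as an illustration (not the MERIT formulae), the sketch below combines per-area assessment indexes into a simple synthetic measure based on distance from the best-observed value, and perturbs the weights to hint at the kind of sensitivity analysis mentioned above; the index values and weighting scheme are assumptions.

```python
# Synthetic development measure from per-area assessment indexes (illustrative scheme only).
import numpy as np

wop = np.array([0.72, 0.58, 0.81, 0.64, 0.69])   # WOP_i per problem area, assumed values
weights = np.full(wop.size, 1.0 / wop.size)      # equal weights by default

def development_measure(indexes, w):
    """Weighted distance from the best-observed index, rescaled so 1 is best."""
    ideal = indexes.max()
    d = np.sqrt(np.sum(w * (ideal - indexes) ** 2))
    return 1.0 - d

print(f"synthetic measure: {development_measure(wop, weights):.3f}")

# Crude sensitivity check: perturb each weight by +10% and renormalize.
for i in range(wop.size):
    w = weights.copy()
    w[i] *= 1.1
    w /= w.sum()
    print(f"area {i + 1} weight +10% -> {development_measure(wop, w):.3f}")
```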
[Quality assurance and quality improvement. Personal experiences and intentions].
Roche, B G; Sommer, C
1995-01-01
In May 1994 we were selected by the Swiss surgical association to carry out a study on quality in the USA. During our travels we visited three types of institutions: hospitals, the National Institute of Standards and Technology, and industry (Johnson & Johnson). We were able to compare two types of quality programs: Quality Assurance (QA) and Continuous Quality Improvement (CQI). In traditional healthcare circles, QA is the process established to meet external regulatory requirements and to assure that patient care is consistent with established standards. In modern quality terms, QA outside of healthcare means designing a product or service, as well as controlling its production, so well that quality is inevitable. The idea of W. Edwards Deming is that there is never improvement just by inspection. He developed a theory based on 14 principles. Productive work is accomplished through processes, and understanding the variability of processes is a key to improving quality. Quality management sees each person in an organisation as part of one or more processes. The job of every worker is to receive the work of others, add value to that work, and supply it to the next person in the process. This is called the triple role of the worker: customer, processor, and supplier. The main source of quality defects is problems in the process. The old assumption is that quality fails when people do the right thing wrong; the new assumption is that, more often, quality failures arise when people do the wrong thing right. Exhortation, incentives and discipline of workers are unlikely to improve quality. If quality is failing when people do their jobs as designed, then exhorting them to do better is managerial nonsense. Modern quality theory is customer focused, with customers identified both internally and externally. The modern approach to quality is thoroughly grounded in scientific and statistical thinking. As in medicine, the symptom is a defect in quality: the process therapist must perform diagnostic tests, formulate hypotheses of cause, test those hypotheses, apply remedies, and assess the effect of the remedies. Total employee involvement is critical; power comes from enabling all employees to become involved in quality improvement. A great advantage of CQI is the prevention orientation of the concept. CQI promotes a collegial approach in which people learn how to work together to improve. CQI is, however, a time-consuming procedure. During our travels we learned the definition of quality as customer satisfaction. Building a CQI concept takes time, but all employees become involved in quality improvement. By applying CQI we may be able to dispense with traditional quality control programs.
A practical approach to programmatic assessment design.
Timmerman, A A; Dijkstra, J
2017-12-01
Assessment of complex tasks integrating several competencies calls for a programmatic design approach. As single instruments do not provide the information required to reach a robust judgment of integral performance, 73 guidelines for programmatic assessment design were developed. When simultaneously applying these interrelated guidelines, it is challenging to keep a clear overview of all assessment activities. The goal of this study was to provide practical support for applying a programmatic approach to assessment design, not bound to any specific educational paradigm. The guidelines were first applied in a postgraduate medical training setting, and a process analysis was conducted. This resulted in the identification of four steps for programmatic assessment design: evaluation, contextualisation, prioritisation and justification. Firstly, the (re)design process starts with sufficiently detailing the assessment environment and formulating the principal purpose. Key stakeholders with sufficient (assessment) expertise need to be involved in the analysis of strengths and weaknesses and identification of developmental needs. Central governance is essential to balance efforts and stakes with the principal purpose and decide on prioritisation of design decisions and selection of relevant guidelines. Finally, justification of assessment design decisions, quality assurance and external accountability close the loop, to ensure sound underpinning and continuous improvement of the assessment programme.
NASA Astrophysics Data System (ADS)
Ping, Owi Wei; Ahmad, Azhar; Adnan, Mazlini; Hua, Ang Kean
2017-05-01
Higher Order Thinking Skills (HOTS) is a new concept of education reform based on Bloom's Taxonomy. The concept concentrates on students' understanding in the learning process based on their own methods. HOTS questions train students to think creatively, critically and innovatively. The aim of this study was to identify students' proficiency in solving HOTS mathematics questions by using the i-Think map. The research took place in Sabak Bernam, Selangor. The method applied is a quantitative approach involving approximately all of the Standard Five students. A pre- and post-test was conducted before and after the intervention of using the i-Think map to solve the HOTS questions. The results indicate a significant improvement in the post-test, showing that applying the i-Think map enhances the students' ability to solve HOTS questions. The survey analysis showed that 90% of the students agreed that the i-Think map helped them analyse the questions carefully and use keywords in the map to solve them. In conclusion, this process helps students minimize mistakes when solving the questions. Therefore, teachers should guide students in applying a suitable i-Think map and methods for analysing the questions by finding the keywords.
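A pre/post design like the one above is usually analysed with a paired test on matched scores. The following is a minimal Python sketch under stated assumptions: the scores are hypothetical and the choice of a paired t-test is only an illustration, since the study's actual test statistics are not reproduced here.

```python
# Hypothetical paired pre/post comparison of HOTS scores (illustrative data only).
from scipy import stats

pre =  [4, 5, 3, 6, 4, 5, 2, 5, 4, 3]   # assumed pre-test scores per student
post = [7, 8, 6, 9, 7, 8, 5, 8, 7, 6]   # assumed post-test scores for the same students
t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.4f}")       # a small p-value supports a significant post-test gain
```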
NASA Astrophysics Data System (ADS)
Johnson, Andrew P.
1999-11-01
This dissertation is a study of students' model development processes in a physical science course for preservice elementary teachers. It details the models of magnetic materials developed and used by students during a unit on static electricity and magnetism. In this inquiry-based course, the class developed and formally accepted a model, in the form of diagrams and descriptions, that is very similar to the accepted magnetic domains model. They did this without textbooks or lectures on magnetism. Before adopting this model, however, most groups in the class temporarily used models involving opposite charges at the two ends of magnetized nails. How did the students do it? The explanation involves detailed study of the groups' interactions and use of structure in the classroom environment. This dissertation uses two theoretical frameworks to analyze interactions. It applies Yackel and Cobb's (1996) concepts of classroom social norms to characterize aspects of the classroom participation structure which affected groups' construction and declaration of models. It also applies distributed cognition ideas to analyze the sense-making conversations that small groups had when constructing group responses. This research found that conversations in one small group could be characterized into sixteen categories. Important categories included "extending ideas" which involved gradual deepening and elaboration of the group's understanding of their model(s), and "joint typing", an interactive process by which group members collaborated on typed statements or group diagrams and simultaneously developed common language for communicating their ideas to each other. Some of these categories of activity were closely connected to computer use. Also, four classroom norms are described. One small group social classroom norm involved group members developing a "common ground" consisting of agreed-upon group statements. Three sociophysics norms which characterize the whole class interactions as well as those of the small group involved a distinction between generalizations of phenomena and theoretical statements, class criteria for accepting evidence, and the obligation for each group to have a model of magnetic materials that they could support with acceptable evidence.
Genes involved in host-parasite interactions can be revealed by their correlated expression.
Reid, Adam James; Berriman, Matthew
2013-02-01
Molecular interactions between a parasite and its host are key to the ability of the parasite to enter the host and persist. Our understanding of the genes and proteins involved in these interactions is limited. To better understand these processes it would be advantageous to have a range of methods to predict pairs of genes involved in such interactions. Correlated gene expression profiles can be used to identify molecular interactions within a species. Here we have extended the concept to different species, showing that genes with correlated expression are more likely to encode proteins that directly or indirectly participate in host-parasite interaction. We go on to examine our predictions of molecular interactions between the malaria parasite and both its mammalian host and insect vector. Our approach could be applied to study any interaction between species, for example, between a host and its parasites or pathogens, but also symbiotic and commensal pairings.
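As a rough illustration of the correlated-expression idea, the sketch below scores candidate host-parasite gene pairs by the correlation of their expression profiles across shared sampling points. The data, array shapes, and the use of Spearman correlation are assumptions for illustration, not the authors' actual pipeline.

```python
# Rank candidate host-parasite gene pairs by cross-species expression correlation.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
parasite = rng.normal(size=(50, 8))   # 50 parasite genes x 8 shared time points (hypothetical)
host = rng.normal(size=(40, 8))       # 40 host genes x 8 shared time points (hypothetical)

pairs = []
for i, p_expr in enumerate(parasite):
    for j, h_expr in enumerate(host):
        rho, _ = spearmanr(p_expr, h_expr)
        pairs.append((rho, i, j))

# The most strongly correlated pairs are candidate interaction partners.
for rho, i, j in sorted(pairs, reverse=True)[:5]:
    print(f"parasite gene {i} ~ host gene {j}: rho = {rho:.2f}")
```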
Prager, Katrin; Freese, Jan
2009-02-01
Recent European regulations for rural development emphasise the requirement to involve stakeholder groups and other appropriate bodies in the policy-making process. This paper presents two cases involving stakeholder participation in agri-environmental development and policy making, targeted at different policy-making levels. One study was undertaken in Lower Saxony where a local partnership developed and tested an agri-environmental prescription, which was later included in the state's menu of agri-environmental schemes. In Sachsen-Anhalt, state-facilitated stakeholder workshops including a mathematical model were used to optimise the programme planning and budget allocation at the state level. Both studies aimed at improving the acceptance of agri-environmental schemes. The authors gauge the effectiveness of the two approaches and discuss what lessons can be learned. The experience suggests that the approaches can complement one another and could also be applied to rural policy making.
Fontes, Cristiano Hora; Budman, Hector
2017-11-01
A clustering problem involving multivariate time series (MTS) requires the selection of similarity metrics. This paper shows the limitations of the PCA similarity factor (SPCA) as a single metric in nonlinear problems where there are differences in magnitude of the same process variables due to expected changes in operation conditions. A novel method for clustering MTS based on a combination between SPCA and the average-based Euclidean distance (AED) within a fuzzy clustering approach is proposed. Case studies involving either simulated or real industrial data collected from a large scale gas turbine are used to illustrate that the hybrid approach enhances the ability to recognize normal and fault operating patterns. This paper also proposes an oversampling procedure to create synthetic multivariate time series that can be useful in commonly occurring situations involving unbalanced data sets. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
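A minimal sketch of the kind of hybrid dissimilarity described, combining a PCA similarity factor (SPCA) with an average-based Euclidean distance (AED), is shown below. The subspace dimension, the weight alpha, and the scaling of the distance term are assumptions, and the fuzzy clustering step itself is not reproduced.

```python
# Hybrid MTS dissimilarity: (1 - SPCA) blended with a scaled average-based distance.
import numpy as np

def pca_basis(x, k):
    x = x - x.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return vt[:k].T                              # columns span the k-dim principal subspace

def spca(x, y, k=2):
    u, v = pca_basis(x, k), pca_basis(y, k)
    return np.trace(u.T @ v @ v.T @ u) / k       # 1 = identical subspaces, 0 = orthogonal

def aed(x, y):
    return np.linalg.norm(x.mean(axis=0) - y.mean(axis=0))   # distance between variable means

def hybrid_dissimilarity(x, y, k=2, alpha=0.5, scale=1.0):
    return alpha * (1.0 - spca(x, y, k)) + (1.0 - alpha) * aed(x, y) / scale

rng = np.random.default_rng(1)
a = rng.normal(size=(200, 5))                    # two hypothetical multivariate time series
b = rng.normal(loc=0.5, size=(200, 5))
print(hybrid_dissimilarity(a, b))
```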
The role of power in health policy dialogues: lessons from African countries.
Mwisongo, Aziza; Nabyonga-Orem, Juliet; Yao, Theodore; Dovlo, Delanyo
2016-07-18
Policy-making is a dynamic process involving the interplay of various factors. Power and its role are among its core components. Though power plays a profound role in policy-making, empirical evidence suggests that health policy analysis has paid only limited attention to the role of power, particularly in policy dialogues. This exploratory study, which used qualitative methods, had the main aim of learning about and understanding policy dialogues in five African countries and how power influences such processes. Data were collected using key informant interviews. An interview guide was developed with standardised questions and probes on the policy dialogues in each country. This paper utilises these data plus document review to understand how power was manifested during the policy dialogues. Reference is made to the Arts and Tatenhove conceptual framework on power dimensions to understand how power featured during the policy dialogues in African health contexts. Arts and Tatenhove conceptualise power in policy-making in relational, dispositional and structural layers. Our study found that power was applied positively during the dialogues to prioritise agendas, fast-track processes, reorganise positions, focus attention on certain items and foster involvement of the community. Power was applied negatively during the dialogues, for example when position was used to control and shape dialogues, which limited innovation, and when knowledge power was used to influence decisions and the direction of the dialogues. Transitive power was used to challenge the government to think of implementation issues often forgotten during policy-making processes. Dispositional power was the most complex form of power, expressed both overtly and covertly. Structural power was manifested socially, culturally, politically, legally and economically. This study shows that we need to be cognisant of the role of power during policy dialogues and put mechanisms in place to manage its influence. There is a need for more research to determine how to channel the influence of power on policy-making processes positively, for example through interactive policy dialogues.
Particle Formation and Product Formulation Using Supercritical Fluids.
Knez, Željko; Knez Hrnčič, Maša; Škerget, Mojca
2015-01-01
Traditional methods for solids processing involve either high temperatures, necessary for melting or viscosity reduction, or hazardous organic solvents. Owing to the negative impact of the solvents on the environment, especially on living organisms, intensive research has focused on new, sustainable methods for the processing of these substances. Applying supercritical fluids for particle formation may produce powders and composites with special characteristics. Several processes for formation and design of solid particles using dense gases have been studied intensively. The unique thermodynamic and fluid-dynamic properties of supercritical fluids can be used also for impregnation of solid particles or for the formation of solid powderous emulsions and particle coating, e.g., for formation of solids with unique properties for use in different applications. We give an overview of the application of sub- and supercritical fluids as green processing media for particle formation processes and present recent advances and trends in development.
Dael, Nele; Sierro, Guillaume; Mohr, Christine
2013-01-01
The literature on developmental synesthesia has seen numerous sensory combinations, with surprisingly few reports on synesthesias involving affect. On the one hand, emotion, or more broadly affect, might be of minor importance to the synesthetic experience (e.g., Sinke et al., 2012). On the other hand, predictions on how affect could be relevant to the synesthetic experience remain to be formulated, in particular those that are driven by emotion theories. In this theoretical paper, we hypothesize that a priori studies on synesthesia involving affect will observe the following. Firstly, the synesthetic experience is not merely about discrete emotion processing or overall valence (positive, negative) but is determined by or even altered through cognitive appraisal processes. Secondly, the synesthetic experience changes temporarily on a quantitative level according to (i) the affective appraisal of the inducing stimulus or (ii) the current affective state of the individual. These hypotheses are inferred from previous theoretical and empirical accounts on synesthesia (including the few examples involving affect), different emotion theories, crossmodal processing accounts in synesthetes and non-synesthetes, and the presumed stability of the synesthetic experience. We hope that the current review will succeed in launching a new series of studies on “affective synesthesias.” We particularly hope that such studies will apply the same creativity in experimental paradigms as we have seen and still see when assessing and evaluating “traditional” synesthesias. PMID:24151478
Illeghems, Koen; Weckx, Stefan; De Vuyst, Luc
2015-09-01
A high-resolution functional metagenomic analysis of a representative single sample of a Brazilian spontaneous cocoa bean fermentation process was carried out to gain insight into its bacterial community functioning. By reconstruction of microbial meta-pathways based on metagenomic data, the current knowledge about the metabolic capabilities of bacterial members involved in the cocoa bean fermentation ecosystem was extended. Functional meta-pathway analysis revealed the distribution of the metabolic pathways between the bacterial members involved. The metabolic capabilities of the lactic acid bacteria present were most associated with the heterolactic fermentation and citrate assimilation pathways. The role of Enterobacteriaceae in the conversion of substrates was shown through the use of the mixed-acid fermentation and methylglyoxal detoxification pathways. Furthermore, several other potential functional roles for Enterobacteriaceae were indicated, such as pectinolysis and citrate assimilation. Concerning acetic acid bacteria, metabolic pathways were partially reconstructed, in particular those related to responses toward stress, explaining their metabolic activities during cocoa bean fermentation processes. Further, the in-depth metagenomic analysis unveiled functionalities involved in bacterial competitiveness, such as the occurrence of CRISPRs and potential bacteriocin production. Finally, comparative analysis of the metagenomic data with bacterial genomes of cocoa bean fermentation isolates revealed the applicability of the selected strains as functional starter cultures. Copyright © 2015 Elsevier Ltd. All rights reserved.
Semisolid Metal Processing Techniques for Nondendritic Feedstock Production
Mohammed, M. N.; Omar, M. Z.; Salleh, M. S.; Alhawari, K. S.; Kapranos, P.
2013-01-01
Semisolid metal (SSM) processing or thixoforming is widely known as a technology that involves the formation of metal alloys between solidus and liquidus temperatures. For the procedure to operate successfully, the microstructure of the starting material must consist of solid near-globular grains surrounded by a liquid matrix and a wide solidus-to-liquidus transition area. Currently, this process is industrially successful, generating a variety of products with high quality parts in various industrial sectors. Throughout the years since its inception, a number of technologies to produce the appropriate globular microstructure have been developed and applied worldwide. The main aim of this paper is to classify the presently available SSM technologies and present a comprehensive review of the potential mechanisms that lead to microstructural alterations during the preparation of feedstock materials for SSM processing. PMID:24194689
Interfacing LabVIEW With Instrumentation for Electronic Failure Analysis and Beyond
NASA Technical Reports Server (NTRS)
Buchanan, Randy K.; Bryan, Coleman; Ludwig, Larry
1996-01-01
The Laboratory Virtual Instrumentation Engineering Workstation (LabVIEW) software is designed such that equipment and processes related to control systems can be operationally linked and controlled through the use of a computer. Various processes within the failure analysis laboratories of NASA's Kennedy Space Center (KSC) demonstrate the need for modernization and, in some cases, automation, using LabVIEW. An examination of procedures and practices within the Failure Analysis Laboratory resulted in the conclusion that some device was necessary to bring potential users of LabVIEW to an operational level in minimum time. This paper outlines the process involved in creating a tutorial application to enable personnel to apply LabVIEW to their specific projects. Suggestions for furthering the extent to which LabVIEW is used are provided in the areas of data acquisition and process control.
Santos, Ana M C; Doria, Mara S; Meirinhos-Soares, Luís; Almeida, António J; Menezes, José C
2018-01-01
Microbial quality control of non-sterile drug products has been a concern to regulatory agencies and the pharmaceutical industry since the 1960s. Despite being an old challenge to companies, microbial contamination still affects a high number of manufacturers of non-sterile products. Consequences go well beyond the obvious direct costs related to batch rejections or product recalls, as human lives and a company's reputation are significantly impacted if such events occur. To better manage risk and establish effective mitigation strategies, it is necessary to understand the microbial hazards involved in non-sterile drug products manufacturing, be able to evaluate their potential impact on final product quality, and apply mitigation actions. Herein we discuss the most likely root causes involved in microbial contaminations referenced in warning letters issued by US health authorities and non-compliance reports issued by European health authorities over a period of several years. The quality risk management tools proposed were applied to the data gathered from those databases, and a generic risk ranking was provided based on a panel of non-sterile drug product manufacturers that was assembled and given the opportunity to perform the risk assessments. That panel identified gaps and defined potential mitigation actions, based on their own experience of potential risks expected for their processes. Major findings clearly indicate that the manufacturers affected by the warning letters should focus their attention on process improvements and microbial control strategies, especially those related to microbial analysis and raw material quality control. Additionally, the warning letters considered frequently referred to failures in quality-related issues, which indicates that the quality commitment should be reinforced at most companies to avoid microbiological contaminations. LAY ABSTRACT: Microbial contamination of drug products affects the quality of non-sterile drug products produced by numerous manufacturers, representing a major risk to patients. It is necessary to understand the microbial hazards involved in the manufacturing process and evaluate their impact on final product quality so that effective prevention strategies can be implemented. A risk-based classification of most likely root causes for microbial contamination found in the warning letters issued by the US Food and Drug Administration and the European Medicines Agency is proposed. To validate the likely root causes extracted from the warning letters, a subject matter expert panel made of several manufacturers was formed and consulted. A quality risk management approach to assess microbiological contamination of non-sterile drug products is proposed for the identification of microbial hazards involved in the manufacturing process. To enable ranking of microbial contamination risks, quality risk management metrics related to criticality and overall risk were applied. The results showed that manufacturers of non-sterile drug products should improve their microbial control strategy, with special attention to quality controls of raw materials, primary containers, and closures. Besides that, they should invest in a more robust quality system and culture. As a start, manufacturers may consider investigating their specific microbiological risks, addressing their sites' own microbial ecology, type of manufacturing processes, and dosage form characteristics, as these may lead to increased contamination risks.
Authorities should allow and enforce innovative, more comprehensive, and more effective approaches to in-process contamination monitoring and controls. © PDA, Inc. 2018.
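The paper's own risk metrics are not reproduced above; as a generic illustration of how microbial hazards can be ranked, the sketch below uses an FMEA-style risk priority number (severity x occurrence x detectability). The hazard list and scores are illustrative assumptions only.

```python
# Generic FMEA-style ranking of assumed microbial hazards (illustrative scores on 1-10 scales).
hazards = {
    "raw material bioburden":      {"severity": 9, "occurrence": 6, "detectability": 5},
    "water system contamination":  {"severity": 8, "occurrence": 4, "detectability": 4},
    "operator hygiene failure":    {"severity": 6, "occurrence": 5, "detectability": 6},
    "primary container integrity": {"severity": 7, "occurrence": 3, "detectability": 3},
}

def rpn(scores):
    return scores["severity"] * scores["occurrence"] * scores["detectability"]

# Hazards with the highest risk priority number are addressed first.
for name, scores in sorted(hazards.items(), key=lambda kv: rpn(kv[1]), reverse=True):
    print(f"{name:30s} RPN = {rpn(scores)}")
```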
Clinical informatics in undergraduate teaching of health informatics.
Pantazi, Stefan V; Pantazi, Felicia; Daly, Karen
2011-01-01
We are reporting on a recent experience with Health Informatics (HI) teaching at undergraduate degree level to an audience of HI and Pharmacy students. The important insight is that effective teaching of clinical informatics must involve highly interactive, applied components in addition to the traditional theoretical material. This is in agreement with general literature underlining the importance of simulations and role playing in teaching and is well supported by our student evaluation results. However, the viability and sustainability of such approaches to teaching hinges on significant course preparation efforts. These efforts consist of time-consuming investigations of informatics technologies, applications and systems followed by the implementation of workable solutions to a wide range of technical problems. In effect, this approach to course development is an involved process that relies on a special form of applied research whose technical complexity could explain the dearth of published reports on similar approaches in HI education. Despite its difficulties, we argue that this approach can be used to set a baseline for clinical informatics training at undergraduate level and that its implications for HI education in Canada are of importance.
Mechanism of thermal decomposition of K2FeO4 and BaFeO4: A review
NASA Astrophysics Data System (ADS)
Sharma, Virender K.; Machala, Libor
2016-12-01
This paper reviews the thermal decomposition of potassium ferrate(VI) (K2FeO4) and barium ferrate(VI) (BaFeO4) in air and nitrogen atmospheres. Mössbauer spectroscopy and nuclear forward scattering (NFS) of synchrotron radiation are reviewed to advance understanding of the electron-transfer processes involved in the reduction of ferrate(VI) to Fe(III) phases. Direct evidence of Fe(V) and Fe(IV) as intermediate iron species, obtained with the applied techniques, is given. Thermal decomposition of K2FeO4 involved Fe(V), Fe(IV), and K3FeO3 as intermediate species, while BaFeO3 (i.e. Fe(IV)) was the only intermediate species during the decomposition of BaFeO4. The nature of the ferrite species formed as the final Fe(III) products of the thermal decomposition of K2FeO4 and BaFeO4 under different conditions is evaluated. The steps of the mechanisms of thermal decomposition of ferrate(VI), which reasonably explain the experimental observations of the applied approaches in conjunction with thermal and surface techniques, are summarized.
Conformal mapping for multiple terminals
Wang, Weimin; Ma, Wenying; Wang, Qiang; Ren, Hao
2016-01-01
Conformal mapping is an important mathematical tool that can be used to solve various physical and engineering problems in many fields, including electrostatics, fluid mechanics, classical mechanics, and transformation optics. It is an accurate and convenient way to solve problems involving two terminals. However, when faced with problems involving three or more terminals, which are more common in practical applications, existing conformal mapping methods apply assumptions or approximations. A general exact method does not exist for a structure with an arbitrary number of terminals. This study presents a conformal mapping method for multiple terminals. Through an accurate analysis of boundary conditions, additional terminals or boundaries are folded into the inner part of a mapped region. The method is applied to several typical situations, and the calculation process is described for two examples of an electrostatic actuator with three electrodes and of a light beam splitter with three ports. Compared with previously reported results, the solutions for the two examples based on our method are more precise and general. The proposed method is helpful in promoting the application of conformal mapping in analysis of practical problems. PMID:27830746
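The multi-terminal folding construction itself is not reproduced here, but the two-terminal case it generalises can be illustrated directly: for coplanar electrodes occupying the negative and positive real axes, the map w = log z takes the upper half-plane to a strip in which the field is uniform, so the potential is simply proportional to arg z. The geometry and voltage in this sketch are assumptions.

```python
# Two-terminal conformal mapping example: electrode at potential v on x < 0, grounded on x > 0.
# Since w = log(z) maps the upper half-plane to a strip, phi = v * arg(z) / pi solves Laplace's
# equation with those boundary values.
import numpy as np

def potential(x, y, v=1.0):
    z = x + 1j * y
    return v * np.angle(z) / np.pi        # harmonic in the upper half-plane

print(potential(1.0, 1.0))                # 0.25 * v on the 45-degree ray
print(potential(-1.0, 1e-9))              # approaches v just above the charged electrode
```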
System for routine surface anthropometry using reprojection registration
NASA Astrophysics Data System (ADS)
Sadleir, R. J.; Owens, R. A.; Hartmann, P. E.
2003-11-01
Range data measurement can be usefully applied to non-invasive monitoring of anthropometric changes due to disease, healing or during normal physiological processes. We have developed a computer vision system that allows routine capture of biological surface shapes and accurate measurement of anthropometric changes, using a structured light stripe triangulation system. In many applications involving relocation of soft tissue for image-guided surgery or anthropometry it is neither accurate nor practical to apply fiducial markers directly to the body. This system features a novel method of achieving subject re-registration that involves application of fiducials by a standard data projector. Calibration of this reprojector is achieved using a variation of structured lighting techniques. The method allows accurate and comparable repositioning of elastic surfaces. Tests of repositioning using the reprojector found a significant improvement in subject registration compared to an earlier method which used video overlay comparison only. It has a current application to the measurement of breast volume changes in lactating mothers, but may be extended to any application where repeatable positioning and measurement is required.
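The core triangulation step in a light-stripe system is the intersection of a back-projected camera ray with the calibrated light plane. The sketch below illustrates that step only; the intrinsic matrix and plane parameters are hypothetical, and the reprojection registration described above is not modelled.

```python
# Laser-stripe triangulation sketch: back-project a pixel and intersect the ray with the light plane.
import numpy as np

K = np.array([[800.0,   0.0, 320.0],         # hypothetical pinhole intrinsics
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
plane_n = np.array([0.0, -0.7071, 0.7071])   # assumed light-plane normal (camera frame)
plane_d = 0.5                                # plane equation: n . X = d

def triangulate(u, v):
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # pixel -> viewing ray
    t = plane_d / (plane_n @ ray)                    # ray-plane intersection parameter
    return t * ray                                   # 3D point in camera coordinates

print(triangulate(400.0, 300.0))
```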
NASA Technical Reports Server (NTRS)
Glick, B. J.
1985-01-01
Techniques for classifying objects into groups or classes go under many different names including, most commonly, cluster analysis. Mathematically, the general problem is to find a best mapping of objects into an index set consisting of class identifiers. When an a priori grouping of objects exists, the process of deriving the classification rules from samples of classified objects is known as discrimination. When such rules are applied to objects of unknown class, the process is denoted classification. The specific problem addressed involves the group classification of a set of objects that are each associated with a series of measurements (ratio, interval, ordinal, or nominal levels of measurement). Each measurement produces one variable in a multidimensional variable space. Cluster analysis techniques are reviewed and methods for including geographic location, distance measures, and spatial pattern (distribution) as parameters in clustering are examined. For the case of patterning, measures of spatial autocorrelation are discussed in terms of the kind of data (nominal, ordinal, or interval scaled) to which they may be applied.
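One of the spatial autocorrelation measures alluded to, Moran's I, can be computed directly from interval-scaled values at point locations. The sketch below uses inverse-distance weights; the weighting scheme and the toy data are assumptions.

```python
# Moran's I with inverse-distance weights (toy data); I > 0 indicates clustering of similar values.
import numpy as np

def morans_i(values, coords):
    x = np.asarray(values, dtype=float)
    xy = np.asarray(coords, dtype=float)
    n = len(x)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    w = np.where(d > 0, 1.0 / d, 0.0)          # inverse-distance weights, zero on the diagonal
    z = x - x.mean()
    return n * np.sum(w * np.outer(z, z)) / (w.sum() * np.sum(z ** 2))

vals = [1.0, 1.2, 0.9, 3.1, 3.0, 2.8]
pts = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
print(morans_i(vals, pts))
```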
Multivariate statistical model for 3D image segmentation with application to medical images.
John, Nigel M; Kabuka, Mansur R; Ibrahim, Mohamed O
2003-12-01
In this article we describe a statistical model that was developed to segment brain magnetic resonance images. The statistical segmentation algorithm was applied after a pre-processing stage involving the use of a 3D anisotropic filter along with histogram equalization techniques. The segmentation algorithm makes use of prior knowledge and a probability-based multivariate model designed to semi-automate the process of segmentation. The algorithm was applied to images obtained from the Center for Morphometric Analysis at Massachusetts General Hospital as part of the Internet Brain Segmentation Repository (IBSR). The developed algorithm showed improved accuracy over the k-means, adaptive Maximum Apriori Probability (MAP), biased MAP, and other algorithms. Experimental results showing the segmentation and the results of comparisons with other algorithms are provided. Results are based on an overlap criterion against expertly segmented images from the IBSR. The algorithm produced average results of approximately 80% overlap with the expertly segmented images (compared with 85% for manual segmentation and 55% for other algorithms).
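The overlap criterion is not specified in detail above; one widely used variant is the Dice coefficient between the automatic and expert segmentations, sketched here on synthetic masks as an assumption about the metric.

```python
# Dice overlap between two binary segmentations (synthetic masks for illustration).
import numpy as np

def dice_overlap(seg_a, seg_b):
    a = np.asarray(seg_a, dtype=bool)
    b = np.asarray(seg_b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

auto = np.zeros((64, 64), dtype=bool);   auto[10:40, 10:40] = True
expert = np.zeros((64, 64), dtype=bool); expert[12:42, 12:42] = True
print(f"overlap = {dice_overlap(auto, expert):.2f}")
```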
Gomes, Maria Angélica da Conceição; Hauser-Davis, Rachel Ann; de Souza, Adriane Nunes; Vitória, Angela Pierre
2016-12-01
The accumulation of metals in different environmental compartments poses a risk to both the environment and biota health. In particular, the continuous increase of these elements in soil ecosystems is a major worldwide concern. Phytoremediation has been gaining more attention in this regard. This approach takes advantage of the unique and selective uptake capabilities of plant root systems and applies these natural processes alongside the translocation, bioaccumulation, and contaminant degradation abilities of the entire plant. Although it is a relatively recent technology, dating from the 1990s, it is already considered a green alternative solution with great potential for the problem of metal pollution. This review focuses on phytoremediation of metals from soil, sludge, wastewater and water, the different strategies applied, the biological and physico-chemical processes involved, and the advantages and limitations of each strategy. Special note is given to the use of transgenic species and phytoremediation of metallic nanoparticles. Copyright © 2016 Elsevier Inc. All rights reserved.
Marcondes, Freddy Beretta; de Vasconcelos, Rodrigo Antunes; Marchetto, Adriano; de Andrade, André Luis Lugnani; Filho, Américo Zoppi; Etchebehere, Maurício
2015-01-01
Objective: The objective of this study was to translate and culturally adapt the modified Rowe score for overhead athletes. Methods: The translation and cultural adaptation process initially involved the stages of translation, synthesis, back-translation, and revision by the Translation Group. The pre-final version of the questionnaire was then created; the areas “function” and “pain” were applied to 20 athletes who perform overhead movements and who had suffered SLAP lesions in the dominant shoulder, and the areas “active compression test and anterior apprehension test” and “motion” were applied to 15 health professionals. Results: During the translation process only minor modifications were made to the questionnaire in order to adapt it to Brazilian culture, without changing the semantics and the idiomatic concept originally described. Conclusion: The questionnaire was easily understood by the subjects of the study, making it possible to obtain the Brazilian version of the modified Rowe score for overhead athletes who underwent surgical treatment of a SLAP lesion. PMID:27047903
An Experimental Study of Applied Ground Loads in Landing
NASA Technical Reports Server (NTRS)
Milwitzky, Benjamin; Lindquist, Dean C; Potter, Dexter M
1955-01-01
Results are presented of an experimental investigation made of the applied ground loads and the coefficient of friction between the tire and the ground during the wheel spin-up process in impacts of a small landing gear under controlled conditions on a concrete landing strip in the Langley impact basin. The basic investigation included three major phases: impacts with forward speed at horizontal velocities up to approximately 86 feet per second, impacts with forward speed and reverse wheel rotation to simulate horizontal velocities up to about 273 feet per second, and spin-up drop tests for comparison with the other tests. In addition to the basic investigation, supplementary tests were made to evaluate the drag-load alleviating effects of prerotating the wheel before impact so as to reduce the relative velocity between the tire and ground. In the presentation of the results, an attempt has been made to interpret the experimental data so as to obtain some insight into the physical phenomena involved in the wheel spin-up process.
Halogen-Mediated Conversion of Hydrocarbons to Commodities.
Lin, Ronghe; Amrute, Amol P; Pérez-Ramírez, Javier
2017-03-08
Halogen chemistry plays a central role in the industrial manufacture of various important chemicals, pharmaceuticals, and polymers. It involves the reaction of halogens or halides with hydrocarbons, leading to intermediate compounds which are readily converted to valuable commodities. These transformations, predominantly mediated by heterogeneous catalysts, have long been successfully applied in the production of polymers. Recent discoveries of abundant conventional and unconventional natural gas reserves have revitalized strong interest in these processes as the most cost-effective gas-to-liquid technologies. This review provides an in-depth analysis of the fundamental understanding and applied relevance of halogen chemistry in polymer industries (polyvinyl chloride, polyurethanes, and polycarbonates) and in the activation of light hydrocarbons. The reactions of particular interest include halogenation and oxyhalogenation of alkanes and alkenes, dehydrogenation of alkanes, conversion of alkyl halides, and oxidation of hydrogen halides, with emphasis on the catalyst, reactor, and process design. Perspectives on the challenges and directions for future development in this exciting field are provided.
On the state of lithospheric stress in the absence of applied tectonic forces
McGarr, A.
1988-01-01
Numerous published analyses of the nontectonic state of stress are based on Hooke's law and the boundary condition of zero horizontal deformation. This approach has been used to determine the gravitational stress state as well as the effects of processes such as erosion and temperature changes on the state of lithospheric stress. The major disadvantage of these analyses involves the assumption of lateral constraint which seems unrealistic in view of the observational fact that the crust can deform horizontally in response to applied loads. If the same problems are addressed by assuming that the remote stress state is constant, instead of the condition of zero horizontal deformation, then the resulting stress states are entirely different and in good accord with observations. The processes of erosion and sedimentation have slight tendencies to increase and decrease, respectively, the state of deviatoric stress. Temperature changes have only minor effects on the stress state, as averaged over the thickness of the lithosphere. -from Author
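For reference, the lateral-constraint analyses criticised above rest on the standard result that, under zero horizontal strain and isotropic Hooke's law, the overburden induces a horizontal stress:

```latex
% Horizontal stress under the zero-horizontal-strain (lateral constraint) assumption,
% with Poisson's ratio \nu and vertical (overburden) stress \sigma_v = \rho g z:
\sigma_h = \frac{\nu}{1-\nu}\,\sigma_v = \frac{\nu}{1-\nu}\,\rho g z .
```

With a Poisson's ratio near 0.25 this gives a horizontal stress of roughly one third of the vertical stress, hence large deviatoric stresses at depth, which is one way the constrained boundary condition and the constant remote-stress assumption lead to entirely different stress states.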
Fan, Hong Jin; Knez, Mato; Scholz, Roland; Hesse, Dietrich; Nielsch, Kornelius; Zacharias, Margit; Gösele, Ulrich
2007-04-01
The Kirkendall effect has been widely applied for fabrication of nanoscale hollow structures, which involves an unbalanced counterdiffusion through a reaction interface. Conventional treatment of this process only considers the bulk diffusion of growth species and vacancies. In this letter, a conceptual extension is proposed: the development of the hollow interior undergoes two main stages. The initial stage is the generation of small Kirkendall voids intersecting the compound interface via a bulk diffusion process; the second stage is dominated by surface diffusion of the core material (viz., the fast-diffusing species) along the pore surface. This concept applies to spherical as well as cylindrical nanometer and microscale structures, and even to macroscopic bilayers. As supporting evidence, we show the results of a spinel-forming solid-state reaction of core-shell nanowires, as well as of a planar bilayer of ZnO-Al2O3 to illustrate the influence of surface diffusion on the morphology evolution.
Insights into software development in Japan
NASA Technical Reports Server (NTRS)
Duvall, Lorraine M.
1992-01-01
The interdependence of the U.S.-Japanese economies makes it imperative that we in the United States understand how business and technology developments take place in Japan. We can gain insight into these developments in software engineering by studying the context in which Japanese software is developed, the practices that are used, the problems encountered, the setting surrounding these problems, and the resolution of these problems. Context includes the technological and sociological characteristics of the software development environment, the software processes applied, personnel involved in the development process, and the corporate and social culture surrounding the development. Presented in this paper is a summary of results of a study that addresses these issues. Data for this study was collected during a three month visit to Japan where the author interviewed 20 software managers representing nine companies involved in developing software in Japan. These data are compared to similar data from the United States in which 12 managers from five companies were interviewed.
The role of syllabic structure in French visual word recognition.
Rouibah, A; Taft, M
2001-03-01
Two experiments are reported in which the processing units involved in the reading of French polysyllabic words are examined. A comparison was made between units following the maximal onset principle (i.e., the spoken syllable) and units following the maximal coda principle (i.e., the basic orthographic syllabic structure [BOSS]). In the first experiment, it took longer to recognize that a syllable was the beginning of a word (e.g., the FOE of FOETUS) than to make the same judgment of a BOSS (e.g., FOET). The fact that a BOSS plus one letter (e.g., FOETU) also took longer to judge than the BOSS indicated that the maximal coda principle applies to the units of processing in French. The second experiment confirmed this, using a lexical decision task with the different units being demarcated on the basis of color. It was concluded that the syllabic structure that is so clearly manifested in the spoken form of French is not involved in visual word recognition.
Leite, Valéria Rodrigues; Lima, Kenio Costa; de Vasconcelos, Cipriano Maia
2012-07-01
This article investigates the issue of funding and the decentralization process in order to examine the composition, application and management of resources in the healthcare area. The sample surveyed involved 14 municipalities in the state of Rio Grande do Norte, Brazil. The research involved data gathering of financial transfers, the municipality's own resources and primary healthcare expenses. Management analysis included a survey of local managers and counselors. It was seen that the Unified Health System is funded mainly by federal transfers and municipal revenues and to a far lesser extent by state resources. Funds have been applied predominantly in primary healthcare. The management process saw centralization of actions in the city governments. Municipal secretarial offices and councils comply partially with legislation, though they have problems with autonomy and social control. The results show that planning and management instruments are limited, due to the contradictions inherent to the institutional, political and cultural context of the region.
Cognitive and neural foundations of discrete sequence skill: a TMS study.
Ruitenberg, Marit F L; Verwey, Willem B; Schutter, Dennis J L G; Abrahamse, Elger L
2014-04-01
Executing discrete movement sequences typically involves a shift with practice from a relatively slow, stimulus-based mode to a fast mode in which performance is based on retrieving and executing entire motor chunks. The dual processor model explains the performance of (skilled) discrete key-press sequences in terms of an interplay between a cognitive processor and a motor system. In the present study, we tested and confirmed the core assumptions of this model at the behavioral level. In addition, we explored the involvement of the pre-supplementary motor area (pre-SMA) in discrete sequence skill by applying inhibitory 20 min 1-Hz off-line repetitive transcranial magnetic stimulation (rTMS). Based on previous work, we predicted pre-SMA involvement in the selection/initiation of motor chunks, and this was confirmed by our results. The pre-SMA was further observed to be more involved in more complex than in simpler sequences, while no evidence was found for pre-SMA involvement in direct stimulus-response translations or associative learning processes. In conclusion, support is provided for the dual processor model, and for pre-SMA involvement in the initiation of motor chunks. Copyright © 2014 Elsevier Ltd. All rights reserved.
Benefits and Perceptions of Public Health Accreditation Among Health Departments Not Yet Applying.
Heffernan, Megan; Kennedy, Mallory; Siegfried, Alexa; Meit, Michael
To identify the benefits and perceptions among health departments not yet participating in the public health accreditation program implemented by the Public Health Accreditation Board (PHAB). Quantitative and qualitative data were gathered via Web-based surveys of health departments that had not yet applied for PHAB accreditation (nonapplicants) and health departments that had been accredited for 1 year. Respondents were from 150 nonapplicant health departments and 57 health departments that had been accredited for 1 year. The majority of nonapplicant health departments are reportedly conducting a community health assessment (CHA), community health improvement plan (CHIP), and health department strategic plan, the three documents that are required to be in place before applying for PHAB accreditation. To develop these documents, most nonapplicants are reportedly referencing PHAB requirements. The most commonly reported perceived benefits of accreditation among health departments that planned to or were undecided about applying for accreditation were as follows: increased awareness of strengths and weaknesses, stimulated quality improvement (QI) and performance improvement activities, and increased awareness of/focus on QI. Nonapplicants that planned to apply reported a higher level of these perceived benefits. Compared with health departments that had been accredited for 1 year, nonapplicants were more likely to report that their staff had no or limited QI knowledge or familiarity. The PHAB accreditation program has influenced the broader public health field, not solely health departments that have undergone accreditation. Regardless of their intent to apply for accreditation, nonapplicant health departments are reportedly referencing PHAB guidelines for developing the CHA, CHIP, and health department strategic plan. Health departments may experience benefits associated with accreditation prior to their formal involvement in the PHAB accreditation process. The most common challenge for health departments applying for accreditation is identifying the time and resources to dedicate to the process.
Astronomical Instrumentation Systems Quality Management Planning: AISQMP
NASA Astrophysics Data System (ADS)
Goldbaum, Jesse
2017-06-01
The capability of small aperture astronomical instrumentation systems (AIS) to make meaningful scientific contributions has never been better. The purpose of AIS quality management planning (AISQMP) is to ensure the quality of these contributions such that they are both valid and reliable. The first step involved with AISQMP is to specify objective quality measures not just for the AIS final product, but also for the instrumentation used in its production. The next step is to set up a process to track these measures and control for any unwanted variation. The final step is continual effort applied to reducing variation and obtaining measured values near optimal theoretical performance. This paper provides an overview of AISQMP while focusing on objective quality measures applied to astronomical imaging systems.
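Tracking an objective quality measure and controlling for unwanted variation can be illustrated with a simple Shewhart-style control chart. The measure (nightly image FWHM), the data, and the 3-sigma limits below are illustrative assumptions rather than the AISQMP prescription.

```python
# Control-chart sketch: flag nights whose FWHM falls outside 3-sigma limits from a baseline period.
import numpy as np

fwhm = np.array([2.4, 2.6, 2.5, 2.7, 2.5, 2.6, 3.9, 2.5])   # arcsec, hypothetical nightly values
baseline = fwhm[:6]                          # assume the first six nights are in control
center = baseline.mean()
sigma = baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

for night, value in enumerate(fwhm, start=1):
    status = "ok" if lcl <= value <= ucl else "investigate"
    print(f"night {night}: FWHM = {value:.1f} arcsec ({status})")
```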
Deionization and desalination using electrostatic ion pumping
Bourcier, William L.; Aines, Roger D.; Haslam, Jeffery J.; Schaldach, Charlene M.; O'Brien, Kevin C.; Cussler, Edward
2013-06-11
The present invention provides a new method and apparatus/system for purifying ionic solutions, such as, for example, desalinating water, using engineered charged surfaces to sorb ions from such solutions. Surface charge is applied externally, and is synchronized with oscillatory fluid movements between substantially parallel charged plates. Ions are held in place during fluid movement in one direction (because they are held in the electrical double layer), and released for transport during fluid movement in the opposite direction by removing the applied electric field. In this way the ions, such as salt, are "ratcheted" across the charged surface from the feed side to the concentrate side. The process itself is very simple and involves only pumps, charged surfaces, and manifolds for fluid collection.
A Review of the Anaerobic Digestion of Fruit and Vegetable Waste.
Ji, Chao; Kong, Chui-Xue; Mei, Zi-Li; Li, Jiang
2017-11-01
Fruit and vegetable waste is an ever-growing global problem. Anaerobic digestion techniques have been developed that facilitate turning such waste into possible sources of energy and fertilizer, simultaneously helping to reduce environmental pollution. However, various problems are encountered in applying these techniques. The purpose of this study is to review local and overseas studies which focus on the use of anaerobic digestion to dispose of fruit and vegetable wastes, discuss the acidification problems and solutions in applying anaerobic digestion to fruit and vegetable wastes, and investigate the reactor design (comparing single phase with two phase) and the thermal pre-treatment for processing raw wastes. Furthermore, it analyses the dominant microorganisms involved at different stages of digestion and suggests a focus for future studies.
Tribological systems as applied to aircraft engines
NASA Technical Reports Server (NTRS)
Buckley, D. H.
1985-01-01
Tribological systems as applied to aircraft are reviewed. The importance of understanding the fundamental concepts involved in such systems is discussed. Basic properties of materials which can be related to adhesion, friction and wear are presented and correlated with tribology. Surface processes including deposition and treatment are addressed in relation to their present and future application to aircraft components such as bearings, gears and seals. Lubrication of components with both liquids and solids is discussed. Advances in both new liquid molecular structures and additives for those structures are reviewed and related to the needs of advanced engines. Solids and polymer composites are suggested for increasing use and ceramic coatings containing fluoride compounds are offered for the extreme temperatures encountered in such components as advanced bearings and seals.
Deionization and desalination using electrostatic ion pumping
Bourcier, William L [Livermore, CA; Aines, Roger D [Livermore, CA; Haslam, Jeffery J [Livermore, CA; Schaldach, Charlene M [Pleasanton, CA; O'Brien, Kevin C [San Ramon, CA; Cussler, Edward [Edina, MN
2011-07-19
The present invention provides a new method and apparatus/system for purifying ionic solutions, such as, for example, desalinating water, using engineered charged surfaces to sorb ions from such solutions. Surface charge is applied externally, and is synchronized with oscillatory fluid movements between substantially parallel charged plates. Ions are held in place during fluid movement in one direction (because they are held in the electrical double layer), and released for transport during fluid movement in the opposite direction by removing the applied electric field. In this way the ions, such as salt, are "ratcheted" across the charged surface from the feed side to the concentrate side. The process itself is very simple and involves only pumps, charged surfaces, and manifolds for fluid collection.
Astronomical Instrumentation Systems Quality Management Planning: AISQMP (Abstract)
NASA Astrophysics Data System (ADS)
Goldbaum, J.
2017-12-01
(Abstract only) The capability of small aperture astronomical instrumentation systems (AIS) to make meaningful scientific contributions has never been better. The purpose of AIS quality management planning (AISQMP) is to ensure the quality of these contributions such that they are both valid and reliable. The first step involved with AISQMP is to specify objective quality measures not just for the AIS final product, but also for the instrumentation used in its production. The next step is to set up a process to track these measures and control for any unwanted variation. The final step is continual effort applied to reducing variation and obtaining measured values near optimal theoretical performance. This paper provides an overview of AISQMP while focusing on objective quality measures applied to astronomical imaging systems.
Computation of NLO processes involving heavy quarks using Loop-Tree Duality
NASA Astrophysics Data System (ADS)
Driencourt-Mangin, Félix
2017-03-01
We present a new method to compute higher-order corrections to physical cross-sections, at Next-to-Leading Order and beyond. This method, based on the Loop Tree Duality, leads to locally integrable expressions in four dimensions. By introducing a physically motivated momentum mapping between the momenta involved in the real and the virtual contributions, infrared singularities naturally cancel at integrand level, without the need to introduce subtraction counter-terms. Ultraviolet singularities are dealt with by using dual representations of suitable counter-terms, with some subtleties regarding the self-energy contributions. As an example, we apply this method to compute the 1 → 2 decay rate in the context of a scalar toy model with massive particles.
Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy
2016-01-01
Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images reducing analysis times while removing user bias and subjectivity.
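The ImageJ plug-in itself is not reproduced here; as a rough Python analogue of automatic tube quantification, the sketch below skeletonizes a binarized tube image and counts junction (branch) points. The morphology pipeline and the threshold of three or more skeleton neighbours are assumptions.

```python
# Count branch points of a skeletonized tube mask (synthetic crossing "tubes" for illustration).
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def count_branch_points(binary_tubes):
    skel = skeletonize(binary_tubes.astype(bool))
    kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
    neighbours = convolve(skel.astype(int), kernel, mode="constant")
    return int(np.sum(skel & (neighbours >= 3)))    # skeleton pixels with 3+ neighbours are junctions

mask = np.zeros((50, 50), dtype=bool)
mask[25, 5:45] = True      # a horizontal "tube"
mask[5:45, 25] = True      # a crossing vertical "tube"
print(count_branch_points(mask))
```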
Grasser, Susanne; Schunko, Christoph; Vogl, Christian R
2016-10-10
Ethically sound research in applied ethnobiology should benefit local communities by giving them full access to research processes and results. Participatory research may ensure such access, but there has been little discussion on methodological details of participatory approaches in ethnobiological research. This paper presents and discusses the research processes and methods developed in the course of a three-year research project on wild plant gathering, the involvement of children as co-researchers and the project's indications for local impact. Research was conducted in the Grosses Walsertal Biosphere Reserve, Austria, between 2008 and 2010 in four research phases. In phase 1, 36 freelist interviews with local people and participant observation were conducted. In phase 2 school workshops were held in 14 primary school classes and their 189 children interviewed 506 family members with structured questionnaires. In phase 3, 27 children and two researchers co-produced participatory videos. In phase 4 indications for the impact of the project were investigated with questionnaires from ten children and with participant observation. Children participated in various ways in the research process, and the scientific output and local impact of the project were linked to the phases, degrees and methods of children's involvement. Children were increasingly involved in the project, from non-participation to decision-making. Scientific output was generated from participatory and non-participatory activities whereas local impact - on personal, familial, communal and institutional levels - was mainly generated through the participatory involvement of children as interviewers and as co-producers of videos. Creating scientific outputs from participatory video is little developed in ethnobiology, despite its potential. As ethnobotanists and ethnobiologists, if we are truly concerned about the impact and benefits of our research processes and results to local communities, the details of the research processes need to be deliberately planned and evaluated and then reported and discussed in academic publications.
NASA Astrophysics Data System (ADS)
Bianchizza, C.; Del Bianco, D.; Pellizzoni, L.; Scolobig, A.
2012-04-01
Flood risk mitigation decisions pose key challenges not only from a technical but also from a social, economic and political viewpoint. There is an increasing demand for improving the quality of these processes by including different stakeholders - and especially by involving the local residents in the decision making process - and by guaranteeing the actual improvement of local social capacities during and after the decision making. In this paper we analyse two case studies of flood risk mitigation decisions, Malborghetto-Valbruna and Vipiteno-Sterzing, in the Italian Alps. In both of them, mitigation works have been completed or planned, yet following completely different approaches especially in terms of responses of residents and involvement of local authorities. In Malborghetto-Valbruna an 'interventionist' approach (i.e. leaning towards a top down/technocratic decision process) was used to make decisions after the flood event that affected the municipality in the year 2003. In Vipiteno-Sterzing, a 'participatory' approach (i.e. leaning towards a bottom-up/inclusive decision process) was applied: decisions about risk mitigation measures were made by submitting different projects to the local citizens and by involving them in the decision making process. The analysis of the two case studies presented in the paper is grounded on the results of two research projects. Structured and in-depth interviews, as well as questionnaire surveys were used to explore residents' and local authorities' orientations toward flood risk mitigation. Also a SWOT analysis (Strengths, Weaknesses, Opportunities and Threats) involving key stakeholders was used to better understand the characteristics of the communities and their perception of flood risk mitigation issues. The results highlight some key differences between interventionist and participatory approaches, together with some implications of their adoption in the local context. Strengths and weaknesses of the two approaches, as well as key challenges for the future are also discussed.
ERIC Educational Resources Information Center
Martens, Marga A. W.; Janssen, Marleen J.; Ruijssenaars, Wied A. J. J. M.; Huisman, Mark; Riksen-Walraven, J. Marianne
2014-01-01
Introduction: In this study, we applied the Intervention Model for Affective Involvement (IMAI) to four participants who are congenitally deafblind and their 16 communication partners in 3 different settings (school, a daytime activities center, and a group home). We examined whether the intervention increased affective involvement between the…
Ortiz-Boyer, F; Tena, M T; Luque de Castro, M D; Valcárcel, M
1995-10-01
Methods are reported for the determination of tyrothricin and benzocaine by HPLC and menthol by GC in the analysis of throat lozenges (tablets) containing all three compounds. After optimization of the variables involved in both HPLC and GC the methods have been characterized and validated according to the guidelines of the Spanish Pharmacopoeia, and applied to both the monitoring of the manufacturing process and the quality control of the final product.
NASA Technical Reports Server (NTRS)
Kazem, Sayyed M.
1992-01-01
Materials and Processes 1 (MET 141) is offered to freshmen by the Mechanical Engineering Department at Purdue University. The goal of MET 141 is to broaden the technical background of students who have not had any college science courses. Hence, applied physics, chemistry, and mathematics are included and quantitative problem solving is involved. In the elementary metallography experiment of this course, the objectives are: (1) introduce the vocabulary and establish outlook; (2) make qualitative observations and quantitative measurements; (3) demonstrate the proper use of equipment; and (4) review basic mathematics and science.
Including customers in health service design.
Perrott, Bruce E
2013-01-01
This article will explore the concept and meaning of codesign as it applies to the delivery of health services. The results of a pilot study in health codesign will be used as a research based case discussion, thus providing a platform to suggest future research that could lead to building more robust knowledge of how the consumers of health services may be more effectively involved in the process of developing and delivering the type of services that are in line with expectations of the various stakeholder groups.
Martinez-Ariza, Guillermo; McConnell, Nicholas; Hulme, Christopher
2016-04-15
A cesium carbonate promoted three-component reaction of N-H containing heterocycles, primary or secondary amines, arylglyoxaldehydes, and anilines is reported. The key step involves a tandem sequence of N-1 addition of a heterocycle or an amine to preformed α-iminoketones, followed by an air- or oxygen-mediated oxidation to form α-oxo-acetamidines. The scope of the reaction is enticingly broad, and this novel methodology is applied toward the synthesis of various polycyclic heterocycles.
Learning just-in-time in medical informatics.
Sancho, J J; Sanz, F
2000-01-01
Just-in-time learning (JITL) methodology has been applied to many areas of knowledge acquisition and dissemination. The paradigm challenges the traditional classroom, course-oriented approach, with the aim of shortening learning time, increasing the efficiency of the learning process, improving availability and saving money. Information technology tools and platforms have been heavily involved in developing and delivering JITL. This paper discusses the main characteristics of JITL with regard to its implementation in teaching Medical Informatics.
An elementary research on wireless transmission of holographic 3D moving pictures
NASA Astrophysics Data System (ADS)
Takano, Kunihiko; Sato, Koki; Endo, Takaya; Asano, Hiroaki; Fukuzawa, Atsuo; Asai, Kikuo
2009-05-01
In this paper, a process for transmitting a sequence of holograms describing 3D moving objects over a wireless communication network is presented. The sequence of holograms is transformed into bit-stream data and then transmitted over wireless LAN and Bluetooth. It is shown that, by applying this technique, holographic data of 3D moving objects are transmitted with high quality and a relatively good reconstruction of the holographic images is achieved.
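The abstract above describes a serialize-and-transmit step but gives no encoding details. The following is a minimal sketch of that step only: hologram frames are packed into a byte stream and pushed over a TCP socket (the wireless LAN or Bluetooth link is transparent at this level). The host, port and framing header are hypothetical and not the authors' actual protocol.

import socket
import struct
import numpy as np

def send_hologram_sequence(frames, host="192.168.0.10", port=5000):
    """Serialize hologram frames (2D uint8 arrays) and stream them over TCP.
    Host/port are placeholders; the radio link is transparent at this level."""
    with socket.create_connection((host, port)) as sock:
        for frame in frames:
            data = frame.astype(np.uint8).tobytes()
            # Prefix each frame with its height, width and payload length
            header = struct.pack("!III", frame.shape[0], frame.shape[1], len(data))
            sock.sendall(header + data)

# Example (requires a listening receiver at the given address):
# frames = [np.random.randint(0, 256, (256, 256), dtype=np.uint8) for _ in range(10)]
# send_hologram_sequence(frames)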
Environmental sciences information storage and retrieval system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engstrom, D.E.; White, M.G.; Dunaway, P.B.
Reynolds Electrical and Engineering Co., Inc. (REECo), has since 1970 accumulated information relating to the AEC's Nevada Applied Ecology Group (NAEG) programs at the Nevada Test Site (NTS). These programs, involving extensive soil, vegetation, and small-animal studies, have generated informational data concerning the collecting, processing, analyzing, and shipping of sample materials to various program participants and contractors. Future plans include incorporation of Lawrence Livermore Laboratory's resuspension study data, REECo's on-site air data, and EPA's large-animal, off-site air, and off-site soil data. (auth)
NASA Astrophysics Data System (ADS)
Trimpin, Sarah; Lu, I.-Chung; Rauschenbach, Stephan; Hoang, Khoa; Wang, Beixi; Chubatyi, Nicholas D.; Zhang, Wen-Jing; Inutan, Ellen D.; Pophristic, Milan; Sidorenko, Alexander; McEwen, Charles N.
2018-02-01
Ionization processes have been discovered by which small and large as well as volatile and nonvolatile compounds are converted to gas-phase ions when associated with a matrix and exposed to sub-atmospheric pressure. Here, we discuss experiments further defining these simple and unexpected processes. Charge separation is found to be a common process for small molecule chemicals, solids and liquids, passed through an inlet tube from a higher to a lower pressure region, with and without heat applied. This charge separation process produces positively- and negatively-charged particles with widely different efficiencies depending on the compound and its physical state. Circumstantial evidence is presented suggesting that in the new ionization process, charged particles carry analyte into the gas phase, and desolvation of these particles produces the bare ions, similar to electrospray ionization, except that solid particles appear likely to be involved. This mechanistic proposition is in agreement with previous theoretical work related to ion emission from ice.
Application of participatory ergonomics to the redesign of the family-centred rounds process.
Xie, Anping; Carayon, Pascale; Cox, Elizabeth D; Cartmill, Randi; Li, Yaqiong; Wetterneck, Tosha B; Kelly, Michelle M
2015-01-01
Participatory ergonomics (PE) can promote the application of human factors and ergonomics (HFE) principles to healthcare system redesign. This study applied a PE approach to redesigning the family-centred rounds (FCR) process to improve family engagement. Various FCR stakeholders (e.g. patients and families, physicians, nurses, hospital management) were involved in different stages of the PE process. HFE principles were integrated in both the content (e.g. shared mental model, usability, workload consideration, systems approach) and process (e.g. top management commitment, stakeholder participation, communication and feedback, learning and training, project management) of FCR redesign. We describe activities of the PE process (e.g. formation and meetings of the redesign team, data collection activities, intervention development, intervention implementation) and present data on PE process evaluation. To demonstrate the value of PE-based FCR redesign, future research should document its impact on FCR process measures (e.g. family engagement, round efficiency) and patient outcome measures (e.g. patient satisfaction).
The composing process of technical writers: A preliminary study
NASA Technical Reports Server (NTRS)
Mair, D.; Roundy, N.
1981-01-01
The assumption that technical writers compose as do other writers is tested. The literature on the composing process, not limited to the pure or applied sciences, was reviewed, yielding three areas of general agreement. The composing process (1) consists of several stages, (2) is reflexive, and (3) may be mastered by means of strategies. Data on the ways technical writers compose were collected, and findings were related to the three areas of agreement. Questionnaires and interviews surveying 70 writers were used. The disciplines represented by these writers included civil, chemical, agricultural, geological, mechanical, electrical, and petroleum engineering, chemistry, hydrology, geology, and biology, as well as those providing consulting services or performing research. No technical editors or professional writers were surveyed, only technicians, engineers, and researchers whose jobs involved composing reports. Three pedagogical implications are included.
Impact of self-esteem and sex on stress reactions.
Kogler, Lydia; Seidel, Eva-Maria; Metzler, Hannah; Thaler, Hanna; Boubela, Roland N; Pruessner, Jens C; Kryspin-Exner, Ilse; Gur, Ruben C; Windischberger, Christian; Moser, Ewald; Habel, Ute; Derntl, Birgit
2017-12-08
Positive self-evaluation is a major psychological resource modulating stress coping behavior. Sex differences have been reported in self-esteem as well as stress reactions, but so far their interactions have not been investigated. Therefore, we investigated sex-specific associations of self-esteem and stress reaction on behavioral, hormonal and neural levels. We applied a commonly used fMRI-stress task in 80 healthy participants. Men compared to women showed higher activation during stress in hippocampus, precuneus, superior temporal gyrus (STG) and insula. Furthermore, men outperformed women in the stress task and had higher cortisol and testosterone levels than women after stress. Self-esteem had an impact on precuneus, insula and STG activation during stress across the whole group. During stress, men recruit regions associated with emotion and stress regulation, self-referential processing and cognitive control more strongly than women. Self-esteem affects stress processing, however in a sex-independent fashion: participants with lower self-esteem show higher activation of regions involved in emotion and stress regulation, self-referential processing and cognitive control. Taken together, our data suggest that men are more engaged during the applied stress task. Across women and men, lower self-esteem increases the effort in emotion and stress processing and cognitive control, possibly leading to self-related thoughts in stressful situations.
High-efficiency screen-printed belt co-fired solar cells on cast multicrystalline silicon
NASA Astrophysics Data System (ADS)
Upadhyaya, Ajay; Sheoran, Manav; Rohatgi, Ajeet
2005-01-01
High-efficiency 4cm2 untextured screen-printed solar cells were achieved on cast multicrystalline silicon. These cells were fabricated using a simple manufacturable process involving POCl3 diffusion for emitter, PECVD SiNx:H deposition for a single-layer antireflection coating and rapid co-firing of Ag grid, Al backcontact, and Al-BSF in a belt furnace. An optimized process sequence contributed to effective impurity gettering and defect passivation, resulting in high average bulk lifetimes in the range of 100-250 μs after the cell processing. The contact firing contributed to good ohmic contacts with low series resistance of <1Ωcm2, low backsurface recombination velocity of <500cm/s, and high fill factors of ˜0.78. These parameters resulted in 16.9% and 16.8% efficient untextured screen-printed cells with a single layer AR coating on heat exchanger method (HEM) and Baysix mc-Si. The identical process applied to the untextured float zone wafers gave an efficiency of 17.2%. The same optimized co-firing cycle, when applied to HEM mc-Si wafers with starting lifetimes varying over a wide range of 4-70 μs, resulted in cell efficiencies in the range of 16.5%-17%.
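The fill factors (~0.78) and efficiencies (~17%) quoted above follow from the standard photovoltaic relations. A minimal sketch of those relations is given below; the numerical I-V values are illustrative only and are not the measured data of the paper.

def fill_factor(v_mp, i_mp, v_oc, i_sc):
    """Fill factor from maximum-power-point and open-circuit/short-circuit values."""
    return (v_mp * i_mp) / (v_oc * i_sc)

def efficiency(v_oc, i_sc, ff, area_cm2, irradiance_mw_cm2=100.0):
    """Cell efficiency under standard 100 mW/cm^2 (AM1.5) illumination."""
    p_out_mw = v_oc * i_sc * ff * 1000.0     # V * A -> W -> mW
    return p_out_mw / (irradiance_mw_cm2 * area_cm2)

# Illustrative values (hypothetical, not the paper's measurements):
ff = fill_factor(v_mp=0.52, i_mp=0.128, v_oc=0.62, i_sc=0.136)   # ~0.79
eta = efficiency(v_oc=0.62, i_sc=0.136, ff=ff, area_cm2=4.0)     # ~0.166 -> 16.6%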
Rational approximations to rational models: alternative algorithms for category learning.
Sanborn, Adam N; Griffiths, Thomas L; Navarro, Daniel J
2010-10-01
Rational models of cognition typically consider the abstract computational problems posed by the environment, assuming that people are capable of optimally solving those problems. This differs from more traditional formal models of cognition, which focus on the psychological processes responsible for behavior. A basic challenge for rational models is thus explaining how optimal solutions can be approximated by psychological processes. We outline a general strategy for answering this question, namely to explore the psychological plausibility of approximation algorithms developed in computer science and statistics. In particular, we argue that Monte Carlo methods provide a source of rational process models that connect optimal solutions to psychological processes. We support this argument through a detailed example, applying this approach to Anderson's (1990, 1991) rational model of categorization (RMC), which involves a particularly challenging computational problem. Drawing on a connection between the RMC and ideas from nonparametric Bayesian statistics, we propose 2 alternative algorithms for approximate inference in this model. The algorithms we consider include Gibbs sampling, a procedure appropriate when all stimuli are presented simultaneously, and particle filters, which sequentially approximate the posterior distribution with a small number of samples that are updated as new data become available. Applying these algorithms to several existing datasets shows that a particle filter with a single particle provides a good description of human inferences.
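To make the particle-filter idea above concrete, here is a minimal sketch of a single-particle sequential assignment of binary-feature stimuli to clusters. The CRP concentration alpha, the Beta(1,1)-Bernoulli feature model, and the single-particle choice are illustrative assumptions; Anderson's coupling-parameter formulation and the authors' exact likelihoods are not reproduced here.

import numpy as np

def particle_filter_assignments(stimuli, alpha=1.0, rng=None):
    """Sequentially assign binary-feature stimuli to clusters with one particle.
    Prior over clusters follows a CRP with concentration alpha; feature
    likelihoods use Beta(1,1)-Bernoulli posterior predictives. Illustrative only."""
    rng = np.random.default_rng(rng)
    assignments, clusters = [], []            # clusters[k] = stimuli in cluster k
    for i, x in enumerate(stimuli):
        weights = []
        for members in clusters:              # existing clusters
            prior = len(members) / (i + alpha)
            counts = np.array(members).sum(axis=0)          # per-feature counts of 1s
            p_one = (counts + 1.0) / (len(members) + 2.0)   # Beta(1,1) predictive
            lik = np.prod(np.where(x == 1, p_one, 1.0 - p_one))
            weights.append(prior * lik)
        weights.append((alpha / (i + alpha)) * 0.5 ** len(x))   # new cluster
        weights = np.array(weights) / np.sum(weights)
        k = rng.choice(len(weights), p=weights)
        if k == len(clusters):
            clusters.append([])
        clusters[k].append(x)
        assignments.append(k)
    return assignments

# Example: four binary-feature stimuli forming two obvious groups
stimuli = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 1], [0, 0, 1]])
print(particle_filter_assignments(stimuli, alpha=1.0, rng=0))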
Risk assessment as standard work in design.
Morrill, Patricia W
2013-01-01
This case study article examines a formal risk assessment as part of the decision making process for design solutions in high-risk areas. The overview of the Failure Modes and Effects Analysis (FMEA) tool, with examples of its application in hospital building projects, demonstrates the benefit of these structured conversations. This article illustrates how two hospitals used FMEA when integrating operational processes with building projects: (1) an adjacency decision for an Intensive Care Unit (ICU); and (2) a distance concern for the handling of specimens from Surgery to the Lab. Both case studies involved interviews that exposed facility solution concerns. Just-in-time studies using FMEA followed the same risk assessment process, with the same workshop facilitator guiding structured conversations to analyze risks. In both cases, participants uncovered key areas of risk, enabling them to take the necessary next steps. While the focus of this article is not the actual design solution, it is apparent that the risk assessment brought clarity to the situations, resulting in prompt decision making about facility solutions. Hospitals are inherently risky environments; therefore, use of the formal risk assessment process, FMEA, is an opportunity for design professionals to apply more rigor to design decision making when facility solutions impact operations in high-risk areas. Keywords: case study, decision making, hospital, infection control, strategy, work environment.
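The abstract above does not describe how the risks were scored; classical FMEA practice quantifies each failure mode with a risk priority number (severity x occurrence x detection). The sketch below assumes that convention, and the failure modes and 1-10 scores are hypothetical, loosely inspired by the two cases.

failure_modes = [
    # (description, severity, occurrence, detection) -- hypothetical 1-10 scores
    ("Specimen delayed in transit from Surgery to Lab", 8, 4, 3),
    ("ICU adjacency increases cross-traffic past isolation rooms", 7, 3, 5),
]

def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA scoring: higher RPN = higher priority for redesign."""
    return severity * occurrence * detection

for name, s, o, d in sorted(failure_modes,
                            key=lambda m: -risk_priority_number(*m[1:])):
    print(f"RPN {risk_priority_number(s, o, d):3d}  {name}")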
Process and representation in graphical displays
NASA Technical Reports Server (NTRS)
Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne
1990-01-01
How people comprehend graphics is examined. Graphical comprehension involves the cognitive representation of information from a graphic display and the processing strategies that people apply to answer questions about graphics. Research on representation has examined both the features present in a graphic display and the cognitive representation of the graphic. The key features include the physical components of a graph, the relation between the figure and its axes, and the information in the graph. Tests of people's memory for graphs indicate that both the physical and informational aspect of a graph are important in the cognitive representation of a graph. However, the physical (or perceptual) features overshadow the information to a large degree. Processing strategies also involve a perception-information distinction. In order to answer simple questions (e.g., determining the value of a variable, comparing several variables, and determining the mean of a set of variables), people switch between two information processing strategies: (1) an arithmetic, look-up strategy in which they use a graph much like a table, looking up values and performing arithmetic calculations; and (2) a perceptual strategy in which they use the spatial characteristics of the graph to make comparisons and estimations. The user's choice of strategies depends on the task and the characteristics of the graph. A theory of graphic comprehension is presented.
Independent component processes underlying emotions during natural music listening
Zollinger, Nina; Elmer, Stefan; Jäncke, Lutz
2016-01-01
The aim of this study was to investigate the brain processes underlying emotions during natural music listening. To address this, we recorded high-density electroencephalography (EEG) from 22 subjects while presenting a set of individually matched whole musical excerpts varying in valence and arousal. Independent component analysis was applied to decompose the EEG data into functionally distinct brain processes. A k-means cluster analysis calculated on the basis of a combination of spatial (scalp topography and dipole location mapped onto the Montreal Neurological Institute brain template) and functional (spectra) characteristics revealed 10 clusters referring to brain areas typically involved in music and emotion processing, namely in the proximity of thalamic-limbic and orbitofrontal regions as well as at frontal, fronto-parietal, parietal, parieto-occipital, temporo-occipital and occipital areas. This analysis revealed that arousal was associated with a suppression of power in the alpha frequency range. On the other hand, valence was associated with an increase in theta frequency power in response to excerpts inducing happiness compared to sadness. These findings are partly compatible with the model proposed by Heller, arguing that the frontal lobe is involved in modulating valenced experiences (the left frontal hemisphere for positive emotions) whereas the right parieto-temporal region contributes to the emotional arousal. PMID:27217116
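A minimal sketch of the decompose-then-cluster pipeline described above is given below, using FastICA as a generic stand-in for the ICA variant used in the study. The channel count, sampling rate, synthetic data, and the use of spectra alone as clustering features are assumptions for brevity (the study combined scalp topography, dipole location and spectra).

import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_channels, n_samples = 32, 4096            # synthetic stand-in for EEG
eeg = rng.standard_normal((n_samples, n_channels))

# Decompose into independent component time courses (samples x components)
ica = FastICA(n_components=10, max_iter=500, random_state=0)
sources = ica.fit_transform(eeg)

# Characterize each component by its log power spectrum
spectra = np.log(np.abs(np.fft.rfft(sources, axis=0)) ** 2 + 1e-12).T

# Group components into functionally similar clusters
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(spectra)
print(labels)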
Database constraints applied to metabolic pathway reconstruction tools.
Vilaplana, Jordi; Solsona, Francesc; Teixido, Ivan; Usié, Anabel; Karathia, Hiren; Alves, Rui; Mateo, Jordi
2014-01-01
Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.
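As an illustration of the kind of measurement behind the tuning described above, the sketch below times a repeated read query against a MySQL server so that different server configurations can be compared. The table name, query and connection parameters are hypothetical stand-ins, not the schema or the tuned settings of the Biblio-MetReS/Homol-MetReS database, and the snippet requires the mysql-connector-python package and a reachable server.

import time
import mysql.connector   # pip install mysql-connector-python

def mean_query_latency(query, n_runs=100, **conn_kwargs):
    """Average wall-clock latency of a read query, e.g. to compare server tunings."""
    conn = mysql.connector.connect(**conn_kwargs)
    cur = conn.cursor()
    start = time.perf_counter()
    for _ in range(n_runs):
        cur.execute(query)
        cur.fetchall()
    elapsed = time.perf_counter() - start
    cur.close()
    conn.close()
    return elapsed / n_runs

# Hypothetical schema: a table of annotated genes shared by both applications
# latency = mean_query_latency(
#     "SELECT gene_id, annotation FROM genes WHERE organism_id = 9606",
#     host="localhost", user="metres", password="***", database="organisms")
# print(f"mean latency: {latency * 1000:.2f} ms")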
ERIC Educational Resources Information Center
Kennedy, Mark; Betts, Lucy; Dunn, Thomas; Sonuga-Barke, Edmund; Underwood, Jean
2015-01-01
Recent re-conceptualisation of paternal involvement (Pleck, J. H. (2010). Paternal involvement: Revised conceptualization and theoretical linkages with child outcomes. In M. Lamb (Ed.), "The role of the father in child development" (5th ed., pp. 67-107). London: Wiley), while proving fruitful, has yet to be applied to investigations into…
User-Centered Design Practices to Redesign a Nursing e-Chart in Line with the Nursing Process.
Schachner, María B; Recondo, Francisco J; González, Zulma A; Sommer, Janine A; Stanziola, Enrique; Gassino, Fernando D; Simón, Mariana; López, Gastón E; Benítez, Sonia E
2016-01-01
As part of the user-centered design (UCD) practices carried out at Hospital Italiano of Buenos Aires, the nursing e-chart user interface was redesigned in order to improve the quality of nursing process records, based on an adapted Virginia Henderson theoretical model and patient safety standards, and to fulfil Joint Commission accreditation requirements. UCD practices were applied as standardized and recommended for electronic medical record usability evaluation. Implementation of these practices yielded a series of prototypes in 5 iterative cycles of incremental improvements to achieve the usability goals; the prototypes were used and perceived as satisfactory by general care nurses. Nurses' involvement allowed a balance between their needs and institutional requirements.
Pretreatment of high solid microbial sludges
Rivard, Christopher J.; Nagle, Nicholas J.
1998-01-01
A process and apparatus for pretreating microbial sludges in order to enhance secondary anaerobic digestion. The pretreatment process involves disrupting the cellular integrity of municipal sewage sludge through a combination of thermal, explosive decompression and shear forces. The sludge is pressurized and pumped to a pretreatment reactor where it is mixed with steam to heat and soften the sludge. The pressure of the sludge is suddenly reduced and explosive decompression forces are imparted which partially disrupt the cellular integrity of the sludge. Shear forces are then applied to the sludge to further disrupt the cellular integrity of the sludge. Disrupting cellular integrity releases both soluble and insoluble organic constituents and thereby renders municipal sewage sludge more amenable to secondary anaerobic digestion.
Informed Consent to Research in Long-Term Care Settings
Jablonski, Rita A.; Bourbonniere, Meg; Kolanowski, Ann
2010-01-01
Informed consent to nursing home research is a two-tiered process that begins with obtaining the consent of a long-term care community at the institutional level and progresses to the engagement of individuals in the consent process. Drawing on a review of the literature and the authors’ research experiences and institutional review board service, this paper describes the practical implications of nurse investigators’ obligation to ensure informed consent among participants in long-term care research. Recommendations focus on applying a community consent model to long-term care research, promoting an evidence-based approach to the protection of residents with decisional impairment, and increasing investigators’ attention to ethical issues involving long-term care staff. PMID:20078005
NASA Astrophysics Data System (ADS)
Purwadi, D.; Nurlaily, I.
2018-03-01
Bringing environmental concerns into the focus of the innovation process expands the number of actors involved. Eco-innovation and the triple helix are frameworks often applied to analyse how environmental concerns are integrated into the innovation process and how different stakeholder groups interrelate. A case study of biofloc catfish farming in Yogyakarta is presented to demonstrate a possible approach for researching the success of triple helix frameworks. The case is considered on the basis of the results of a survey among farmers, academics and government. The paper concludes that creating a full triple helix encounters problems in practice. It also includes suggestions for further research on fisheries development.
NASA Astrophysics Data System (ADS)
Liu, Chih Hao; Skryabina, M. N.; Singh, Manmohan; Li, Jiasong; Wu, Chen; Sobol, E.; Larin, Kirill V.
2015-03-01
Current clinical methods of reconstruction surgery involve laser reshaping of nasal cartilage. The process of stress relaxation caused by laser heating is the primary mechanism for achieving nasal cartilage reshaping. Based on this, a rapid, non-destructive and accurate elasticity measurement would allow for a more robust reshaping procedure. In this work, we have utilized phase-stabilized swept source optical coherence elastography (PhS-SSOCE) to quantify the Young's modulus of porcine nasal septal cartilage during the relaxation process induced by heating. The results show that PhS-SSOCE was able to monitor changes in the elasticity of hyaline cartilage, and this method could potentially be applied in vivo during laser reshaping therapies.
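The abstract does not reproduce the estimator used to obtain Young's modulus. A relation commonly used in wave-based optical coherence elastography (stated here as an assumption, not necessarily the exact method of this paper) converts a measured shear or surface wave speed c_s and tissue density \rho into Young's modulus E for nearly incompressible tissue (Poisson ratio \nu \approx 0.5):

E = 2\rho c_s^{2}\,(1+\nu) \;\approx\; 3\rho c_s^{2}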
Songür, Rahime; Lurçi, Binnaz; Bayraktar, Emine; Mehmetoğlu, Ulkü; Demir, Ayhan S
2011-06-01
In this study, the production of enantiopure benzoin from rac-benzoin acetate was achieved by lipase catalyzed kinetic resolution combined with deracemization using Rhizopus oryzae (CBS111718). The growth cells were pretreated with 20 kHz and 30 kHz ultrasound irradiation and mechanical homogenization. Approximately 100% conversion and 96% enantiomeric excess of the product (S-benzoin) were obtained by applying 20 kHz ultrasound irradiation at pH 6. The deracemization process involves new and important processes that allow for the transformation of a racemate into a single stereoisomeric product in 100% theoretical yields. Moreover, the application of ultrasound increases the conversion rate by reducing mass transfer limitation.
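The 96% figure quoted above refers to the enantiomeric excess of the S-product, defined in the standard way from the concentrations (or peak areas) of the two enantiomers:

ee_{S} \;=\; \frac{[S]-[R]}{[S]+[R]} \times 100\%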
Influence of the boundary conditions on heat and mass transfer in spacer-filled channels
NASA Astrophysics Data System (ADS)
Ciofalo, M.; La Cerva, M. F.; Di Liberto, M.; Tamburini, A.
2017-11-01
The purpose of this study is to discuss some problems which arise in heat or mass transfer in complex channels, with special reference to the spacer-filled channels adopted in membrane processes. Among the issues addressed are the consistent definition of local and mean heat or mass transfer coefficients; the influence of the wall boundary conditions; and the influence of one-side versus two-side heat/mass transfer. Most of the results discussed were obtained by finite volume CFD simulations concerning heat transfer in Membrane Distillation or mass transfer in Electrodialysis and Reverse Electrodialysis, but many of the conclusions also apply to other processes involving geometrically complex channels.
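For reference, the local and mean transfer coefficients and the corresponding dimensionless groups mentioned above can be written, under common conventions (an assumption here, since the paper's exact definitions are not reproduced in the abstract), as:

h(x) = \frac{q''_{w}(x)}{T_{w}(x)-T_{b}(x)}, \qquad \bar{h} = \frac{1}{A}\int_{A} h \, \mathrm{d}A, \qquad \mathrm{Nu} = \frac{h\,d_{h}}{\lambda}, \qquad \mathrm{Sh} = \frac{k_{m}\,d_{h}}{D}

where q''_w is the wall heat flux, T_w and T_b the wall and bulk temperatures, d_h the hydraulic diameter, \lambda the fluid thermal conductivity, k_m the mass transfer coefficient and D the diffusivity.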
Mancosu, Pietro; Nicolini, Giorgia; Goretti, Giulia; De Rose, Fiorenza; Franceschini, Davide; Ferrari, Chiara; Reggiori, Giacomo; Tomatis, Stefano; Scorsetti, Marta
2018-05-01
Lean Six Sigma Methodology (LSSM) was introduced in industry to provide near-perfect services for large processes by reducing the occurrence of defects. LSSM was applied to redesign the 2D-2D breast repositioning process (Lean) through retrospective analysis of the database (Six Sigma). Breast patients with daily 2D-2D matching before RT were considered. The five DMAIC (define, measure, analyze, improve, and control) LSSM steps were applied. The process was retrospectively measured over 30 months (7/2014-12/2016) by querying the RT Record&Verify database. Two Lean instruments (Poka-Yoke and Visual Management) were considered for improving the process. The new procedure was checked over 6 months (1-6/2017). 14,931 consecutive shifts from 1342 patients were analyzed. Only 0.8% of patients presented median shifts >1 cm. The major observed discrepancy was the monthly percentage of fractions with almost zero shifts (AZS = 13.2% ± 6.1%). An Ishikawa fishbone diagram helped to define the main contributing causes of the discrepancy. A procedure harmonization involving a multidisciplinary team was defined to increase confidence in the matching procedure. AZS was reduced to 4.8% ± 0.6%. Furthermore, improved distribution symmetry (skewness decreased from 1.4 to 1.1) and outlier reduction, verified by a decrease in kurtosis, demonstrated better "normalization" of the procedure after applying LSSM. LSSM was implemented in an RT department, allowing the breast repositioning matching procedure to be redesigned. Copyright © 2018 Elsevier B.V. All rights reserved.
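A minimal sketch of the Measure/Control-phase metrics quoted above (percentage of almost-zero shifts, skewness, kurtosis) computed from a vector of daily couch shifts is given below. The 1 mm threshold used to flag an "almost zero" shift, and the synthetic data, are assumptions for illustration; the study's actual query and thresholds are not reproduced here.

import numpy as np
from scipy.stats import skew, kurtosis

def repositioning_metrics(shifts_mm, zero_threshold_mm=1.0):
    """Summarize a set of daily 2D-2D couch shifts (in mm) for process monitoring."""
    shifts = np.asarray(shifts_mm, dtype=float)
    azs = np.mean(np.abs(shifts) < zero_threshold_mm) * 100.0   # "almost zero shift" %
    return {"AZS_percent": azs,
            "skewness": skew(shifts),
            "kurtosis": kurtosis(shifts)}     # Fisher definition (normal -> 0)

# Illustrative data, not the 14,931 recorded shifts
rng = np.random.default_rng(1)
print(repositioning_metrics(rng.normal(3.0, 2.0, size=500)))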
Gilaie-Dotan, Sharon; Silvanto, Juha; Schwarzkopf, Dietrich S.; Rees, Geraint
2010-01-01
The occipital face area (OFA) is face-selective. This enhanced activation to faces could reflect either generic face and shape-related processing or high-level conceptual processing of identity. Here we examined these two possibilities using a state-dependent transcranial magnetic stimulation (TMS) paradigm. The lateral occipital (LO) cortex which is activated non-selectively by various types of objects served as a control site. We localized OFA and LO on a per-participant basis using functional MRI. We then examined whether TMS applied to either of these regions affected the ability of participants to decide whether two successively presented and physically different face images were of the same famous person or different famous people. TMS was applied during the delay between first and second face presentations to investigate whether neuronal populations in these regions played a causal role in mediating the behavioral effects of identity repetition. Behaviorally we found a robust identity repetition effect, with shorter reaction times (RTs) when identity was repeated, regardless of the fact that the pictures were physically different. Surprisingly, TMS applied over LO (but not OFA) modulated overall RTs, compared to the No-TMS condition. But critically, we found no effects of TMS to either area that were modulated by identity repetition. Thus, we found no evidence to suggest that OFA or LO contain neuronal representations selective for the identity of famous faces which play a causal role in identity processing. Instead, these brain regions may be involved in the processing of more generic features of their preferred stimulus categories. PMID:20631842
Looby, Mairead; Ibarra, Neysi; Pierce, James J; Buckley, Kevin; O'Donovan, Eimear; Heenan, Mary; Moran, Enda; Farid, Suzanne S; Baganz, Frank
2011-01-01
This study describes the application of quality by design (QbD) principles to the development and implementation of a major manufacturing process improvement for a commercially distributed therapeutic protein produced in Chinese hamster ovary cell culture. The intent of this article is to focus on QbD concepts, and provide guidance and understanding on how the various components combine together to deliver a robust process in keeping with the principles of QbD. A fed-batch production culture and a virus inactivation step are described as representative examples of upstream and downstream unit operations that were characterized. A systematic approach incorporating QbD principles was applied to both unit operations, involving risk assessment of potential process failure points, small-scale model qualification, design and execution of experiments, definition of operating parameter ranges and process validation acceptance criteria followed by manufacturing-scale implementation and process validation. Statistical experimental designs were applied to the execution of process characterization studies evaluating the impact of operating parameters on product quality attributes and process performance parameters. Data from process characterization experiments were used to define the proven acceptable range and classification of operating parameters for each unit operation. Analysis of variance and Monte Carlo simulation methods were used to assess the appropriateness of process design spaces. Successful implementation and validation of the process in the manufacturing facility and the subsequent manufacture of hundreds of batches of this therapeutic protein verifies the approaches taken as a suitable model for the development, scale-up and operation of any biopharmaceutical manufacturing process. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
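The Monte Carlo assessment of a design space mentioned above can be illustrated with a minimal sketch: sample operating parameters uniformly across their (proven acceptable) ranges and estimate the probability that a quality attribute stays within its acceptance limit. The parameter ranges, the response model and the acceptance limit below are invented for illustration and are not the characterization models of the study.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical proven acceptable ranges for two fed-batch parameters
temperature = rng.uniform(35.0, 37.5, n)     # degC
ph = rng.uniform(6.8, 7.2, n)

# Hypothetical response model for a quality attribute (e.g. percent aggregate)
aggregate = (1.5 + 0.8 * (temperature - 36.0) ** 2
             + 2.0 * np.abs(ph - 7.0)
             + rng.normal(0.0, 0.2, n))

acceptance_limit = 3.0   # percent, illustrative
p_pass = np.mean(aggregate <= acceptance_limit)
print(f"Estimated probability of meeting the acceptance criterion: {p_pass:.3f}")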
Cardoso, Teresa; Fonseca, Teresa; Pereira, Sofia; Lencastre, Luís
2003-12-01
The objective of the present study was to evaluate the opinion of Portuguese intensive care physicians regarding 'do-not-resuscitate' (DNR) orders and decisions to withhold/withdraw treatment. A questionnaire was sent to all physicians working on a full-time basis in all intensive care units (ICUs) registered with the Portuguese Intensive Care Society. A total of 266 questionnaires were sent and 175 (66%) were returned. Physicians from 79% of the ICUs participated. All participants stated that DNR orders are applied in their units, and 98.3% stated that decisions to withhold treatment and 95.4% stated that decisions to withdraw treatment are also applied. About three quarters indicated that only the medical group makes these decisions. Fewer than 15% of the responders stated that they involve nurses, 9% involve patients and fewer than 11% involve patients' relatives in end-of-life decisions. Physicians with more than 10 years of clinical experience more frequently indicated that they involve nurses in these decisions (P < 0.05), and agnostic/atheist doctors more frequently involve patients' relatives in decisions to withhold/withdraw treatment (P < 0.05). When asked about who they thought should be involved, more than 26% indicated nurses, more than 35% indicated the patient and more than 25% indicated patients' relatives. More experienced doctors more frequently felt that nurses should be involved (P < 0.05), and male doctors more frequently stated that patients' relatives should be involved in DNR orders (P < 0.05). When a decision to withdraw treatment is made, 76.8% of 151 respondents indicated that they would initiate palliative care; no respondent indicated that they would administer drugs to accelerate the expected outcome. The probability of survival from the acute episode and patients' wishes were the most important criteria influencing end-of-life decisions. These decisions are made only by the medical group in most of the responding ICUs, with little input from nursing staff, patients, or patients' relatives, although many respondents expressed a wish to involve them more in this process. Sex, experience and religious beliefs of the respondents influences the way in which these decisions are made.
NASA Astrophysics Data System (ADS)
Verrelst, Jochem; Malenovský, Zbyněk; Van der Tol, Christiaan; Camps-Valls, Gustau; Gastellu-Etchegorry, Jean-Philippe; Lewis, Philip; North, Peter; Moreno, Jose
2018-06-01
An unprecedented spectroscopic data stream will soon become available with forthcoming Earth-observing satellite missions equipped with imaging spectroradiometers. This data stream will open up a vast array of opportunities to quantify a diversity of biochemical and structural vegetation properties. The processing requirements for such large data streams require reliable retrieval techniques enabling the spatiotemporally explicit quantification of biophysical variables. With the aim of preparing for this new era of Earth observation, this review summarizes the state-of-the-art retrieval methods that have been applied in experimental imaging spectroscopy studies inferring all kinds of vegetation biophysical variables. Identified retrieval methods are categorized into: (1) parametric regression, including vegetation indices, shape indices and spectral transformations; (2) nonparametric regression, including linear and nonlinear machine learning regression algorithms; (3) physically based, including inversion of radiative transfer models (RTMs) using numerical optimization and look-up table approaches; and (4) hybrid regression methods, which combine RTM simulations with machine learning regression methods. For each of these categories, an overview of widely applied methods with application to mapping vegetation properties is given. In view of processing imaging spectroscopy data, a critical aspect involves the challenge of dealing with spectral multicollinearity. The ability to provide robust estimates, retrieval uncertainties and acceptable retrieval processing speed are other important aspects in view of operational processing. Recommendations towards new-generation spectroscopy-based processing chains for operational production of biophysical variables are given.
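The sketch below contrasts category (1), a parametric vegetation index (NDVI), with category (4), a hybrid method in which a machine learning regressor is trained on simulated reflectance-variable pairs. The synthetic reflectance-LAI relationship is only a stand-in for RTM (e.g. PROSAIL) simulations and is not real data.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def ndvi(nir, red):
    """Parametric retrieval (category 1): normalized difference vegetation index."""
    return (nir - red) / (nir + red)

# Hybrid retrieval (category 4): regressor trained on simulated spectra -> LAI.
rng = np.random.default_rng(0)
lai = rng.uniform(0.0, 6.0, 300)
red = 0.25 * np.exp(-0.4 * lai) + rng.normal(0, 0.01, lai.size)
nir = 0.20 + 0.05 * lai + rng.normal(0, 0.01, lai.size)
spectra = np.column_stack([red, nir])

gpr = GaussianProcessRegressor().fit(spectra, lai)
print("NDVI:", ndvi(nir[:3], red[:3]))
print("Hybrid LAI estimate:", gpr.predict(spectra[:3]))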
The role of clinician emotion in clinical reasoning: Balancing the analytical process.
Langridge, Neil; Roberts, Lisa; Pope, Catherine
2016-02-01
This review paper identifies and describes the role of clinicians' memory, emotions and physical responses in clinical reasoning processes. Clinical reasoning is complex and multi-factorial; key models of clinical reasoning within musculoskeletal physiotherapy are discussed, highlighting their omission of emotion and subsequent physical responses and how these can affect a clinician when making a decision. It is proposed that clinicians should consider the emotions associated with decision-making, especially when there is concern surrounding a presentation. Reflecting on practice in the clinical environment and subsequently applying this to a patient presentation should involve some acknowledgement of clinicians' physical responses and emotions, and how they may play a part in any decision made. Intuition and gut feeling are presented as distinct reasoning methods, and how these processes co-exist with more accepted reasoning, such as the hypothetico-deductive approach, is also discussed. Musculoskeletal physiotherapy should consider feelings, emotions and physical responses when applying reflective practice principles. Furthermore, clinicians dealing with difficult and challenging presentations should examine the emotional as well as the analytical experience when justifying decisions and learning from practice. Copyright © 2015 Elsevier Ltd. All rights reserved.