Sample records for individual processing steps

  1. Array automated assembly task low cost silicon solar array project, phase 2

    NASA Technical Reports Server (NTRS)

    Olson, C.

    1980-01-01

Analyses of solar cell and module process steps for throughput rate, cost effectiveness, and reproducibility are reported. In addition to the concentration on cell and module processing sequences, an investigation was made into the capability of using microwave energy in the diffusion, sintering, and thick film firing steps of cell processing. Although the entire process sequence was integrated, the steps are treated individually with test and experimental data, conclusions, and recommendations.

  2. Multiple dual mode counter-current chromatography with variable duration of alternating phase elution steps.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A; Shishilov, Oleg N

    2014-06-20

The multiple dual mode (MDM) counter-current chromatography separation processes consist of a succession of two isocratic counter-current steps and are characterized by the shuttle (forward and back) transport of the sample in chromatographic columns. In this paper, the improved MDM method based on variable duration of alternating phase elution steps has been developed and validated. The MDM separation processes with variable duration of phase elution steps are analyzed. Based on the cell model, analytical solutions are developed for impulse and non-impulse sample loading at the beginning of the column. Using the analytical solutions, a calculation program is presented to facilitate the simulation of MDM with variable duration of phase elution steps, which can be used to select optimal process conditions for the separation of a given feed mixture. Two options of the MDM separation are analyzed: (1) one-step solute elution, in which the sample is transferred forward and back with upper and lower phases inside the column until the desired separation of the components is reached, and then each individual component elutes entirely within one step; (2) multi-step solute elution, in which the fractions of individual components are collected over several steps. It is demonstrated that proper selection of the duration of individual cycles (phase flow times) can greatly increase the separation efficiency of CCC columns. Experiments were carried out using model mixtures of compounds from the GUESSmix with hexane/ethyl acetate/methanol/water solvent systems. The experimental results are compared to the predictions of the theory. A good agreement between theory and experiment has been demonstrated. Copyright © 2014 Elsevier B.V. All rights reserved.
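The Craig-type cell model underlying such calculation programs can be sketched as a discrete transfer scheme. The toy simulation below is illustrative only (the distribution constant `K`, phase ratio, and step schedule are assumed values, not the authors' program): solute in each cell equilibrates between phases, and alternating upper-phase (forward) and lower-phase (backward) transfers shuttle it along the column.

```python
import numpy as np

def mdm_simulate(n_cells=50, K=2.0, phase_ratio=1.0,
                 schedule=((30, +1), (20, -1), (30, +1))):
    """Toy Craig-type cell model of a multiple dual mode CCC run.

    K: distribution constant (upper/lower); phase_ratio: upper/lower
    phase volume per cell; schedule: (transfer count, direction) pairs,
    +1 = upper phase flows forward, -1 = lower phase flows backward.
    """
    # equilibrium fraction of solute in the upper phase of a cell
    f_upper = K * phase_ratio / (1.0 + K * phase_ratio)
    x = np.zeros(n_cells)
    x[0] = 1.0                       # impulse sample loading at the inlet
    eluted_fwd = eluted_back = 0.0
    for n_steps, direction in schedule:
        mobile_frac = f_upper if direction > 0 else 1.0 - f_upper
        for _ in range(n_steps):
            mobile = mobile_frac * x  # solute carried by the flowing phase
            x -= mobile
            if direction > 0:
                eluted_fwd += mobile[-1]    # leaves the column forward
                x[1:] += mobile[:-1]
            else:
                eluted_back += mobile[0]    # leaves the column backward
                x[:-1] += mobile[1:]
    return x, eluted_fwd, eluted_back
```

Running two solutes with different `K` values through the same schedule shows how the choice of phase flow times controls whether their bands separate before elution.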

  3. Patterning control strategies for minimum edge placement error in logic devices

    NASA Astrophysics Data System (ADS)

    Mulkens, Jan; Hanna, Michael; Slachter, Bram; Tel, Wim; Kubis, Michael; Maslow, Mark; Spence, Chris; Timoshkov, Vadim

    2017-03-01

In this paper we discuss the edge placement error (EPE) for multi-patterning semiconductor manufacturing. In a multi-patterning scheme the creation of the final pattern is the result of a sequence of lithography and etching steps, and consequently the contour of the final pattern accumulates the error contributions of the different process steps. We describe the fidelity of the final pattern in terms of EPE, which is defined as the relative displacement of the edges of two features from their intended target position. We discuss our holistic patterning optimization approach to understand and minimize the EPE of the final pattern. As an experimental test vehicle we use the 7-nm logic device patterning process flow as developed by IMEC. This patterning process is based on Self-Aligned-Quadruple-Patterning (SAQP) using ArF lithography, combined with line cut exposures using EUV lithography. The computational metrology method to determine EPE is explained. It will be shown that ArF to EUV overlay, CDU from the individual process steps, and local CD and placement of the individual pattern features are the important contributors. Based on the error budget, we developed an optimization strategy for each individual step and for the final pattern. Solutions include overlay and CD metrology based on angle resolved scatterometry, scanner actuator control to enable high order overlay corrections and computational lithography optimization to minimize imaging induced pattern placement errors of devices and metrology targets.
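A common first-order way to combine such an error budget is to add the independent contributors in quadrature (root sum of squares), with CD terms entering at half weight because a symmetric CD change moves each edge by half. This is only a generic budgeting sketch with hypothetical numbers; the paper's full EPE model may weight terms differently.

```python
import math

def epe_budget(overlay_nm, cdu_nm, local_cd_nm, local_placement_nm):
    """First-order EPE estimate: independent Gaussian contributors
    combined as a root sum of squares. CD variation terms are halved
    because each edge moves by half of a symmetric CD change.
    (Illustrative only; not the paper's full budget model.)"""
    terms = [overlay_nm, 0.5 * cdu_nm, 0.5 * local_cd_nm, local_placement_nm]
    return math.sqrt(sum(t * t for t in terms))

# hypothetical inputs: ArF-to-EUV overlay, global CDU,
# local CD variation, local placement error (all in nm)
epe = epe_budget(2.0, 1.5, 2.5, 1.0)
```

The quadrature form also shows why the authors optimize each step individually: shrinking the largest single term reduces the total far more than shrinking an already-small one.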

  4. A time to search: finding the meaning of variable activation energy.

    PubMed

    Vyazovkin, Sergey

    2016-07-28

This review deals with the phenomenon of variable activation energy frequently observed when studying the kinetics in the liquid or solid phase. This phenomenon commonly manifests itself through nonlinear Arrhenius plots or dependencies of the activation energy on conversion computed by isoconversional methods. Variable activation energy signifies a multi-step process and has the meaning of a collective parameter linked to the activation energies of individual steps. It is demonstrated that by using appropriate models of the processes, the link can be established in algebraic form. This allows one to analyze experimentally observed dependencies of the activation energy in a quantitative fashion and, as a result, to obtain activation energies of individual steps, to evaluate and predict other important parameters of the process, and generally to gain deeper kinetic and mechanistic insights. This review provides multiple examples of such analysis as applied to the processes of crosslinking polymerization, crystallization and melting of polymers, gelation, and solid-solid morphological and glass transitions. The use of appropriate computational techniques is discussed as well.
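How a multi-step process yields a variable activation energy can be seen with the simplest case of two parallel Arrhenius steps: the effective energy, defined as the local slope of the Arrhenius plot, drifts between the two step energies as temperature shifts the balance between channels. The pre-exponential factors and energies below are invented for illustration.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def rate(T, A1=1e10, E1=50e3, A2=1e18, E2=120e3):
    """Overall rate constant of two parallel single-step reactions,
    each obeying the Arrhenius law (illustrative parameters)."""
    return A1 * math.exp(-E1 / (R * T)) + A2 * math.exp(-E2 / (R * T))

def effective_Ea(T, dT=0.01):
    """Effective activation energy E = -R * d(ln k)/d(1/T),
    evaluated by a central finite difference."""
    lnk_lo, lnk_hi = math.log(rate(T - dT)), math.log(rate(T + dT))
    dinvT = 1.0 / (T + dT) - 1.0 / (T - dT)
    return -R * (lnk_hi - lnk_lo) / dinvT

# the effective energy rises from ~E1 toward ~E2 as the high-energy
# channel takes over, producing a curved Arrhenius plot
energies = {T: effective_Ea(T) / 1000.0 for T in (350.0, 450.0, 550.0)}
```

Analytically the slope works out to the rate-weighted mean (k1*E1 + k2*E2)/(k1 + k2), which is exactly the "collective parameter linked to the activation energies of individual steps" described above.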

  5. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
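The scheduling idea behind such workflow tools can be sketched in a few lines: subjects are mutually independent, so a pool of workers can process them concurrently, while steps within one subject form a dependency chain. The function body below is a placeholder, not a real pipeline.

```python
from concurrent.futures import ThreadPoolExecutor

def preprocess(subject_id):
    # Placeholder for one subject's post-processing chain
    # (e.g. skull stripping -> registration -> segmentation);
    # a real workflow would invoke the neuroimaging tools here.
    return f"{subject_id}: done"

subjects = [f"sub-{i:02d}" for i in range(1, 5)]

# Different subjects are independent, so they can be farmed out to
# workers; for CPU-bound steps a process pool or a cluster scheduler
# would replace this thread pool.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(preprocess, subjects))
```

Real tools such as Nipype add the missing piece this sketch omits: a dependency graph, so that independent steps of a single subject can also run in parallel.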

  6. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach.

    PubMed

    Perez, Susan L; Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-07-20

    Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant's information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. 
We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites.
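The coding scheme described above (hypothesis testing, evidence gathering, action/treatment seeking) lends itself to a simple feature-based classifier. The sketch below is a toy illustration of the kind of features the study's qualitative grouping relied on; the threshold and decision rule are invented, not the study's actual classification procedure.

```python
def classify(steps, short_threshold=3):
    """Toy classifier over coded search steps:
    'H' = hypothesis testing, 'E' = evidence gathering,
    'A' = action/treatment seeking.
    Returns a dual-processing label plus the features used."""
    features = {
        "n_steps": len(steps),
        "explored_before_acting": "H" in steps and "E" in steps,
        "ends_with_action": bool(steps) and steps[-1] == "A",
    }
    # longer searches that both hypothesize and gather evidence look
    # deliberative (System 2); short, direct searches look automatic
    deliberative = (features["n_steps"] > short_threshold
                    and features["explored_before_acting"])
    return ("System 2" if deliberative else "System 1"), features

label, feats = classify(list("HEEHA"))  # hypothesize, gather, then act
```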

  7. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach

    PubMed Central

    Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-01-01

    Background Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Objective Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Methods Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant’s information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. Results We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. 
Conclusions We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites. PMID:26194787

  8. 10 Steps for Implementing Change.

    ERIC Educational Resources Information Center

    Marsee, Jeff

    2002-01-01

    Offers steps for adapting the change process to institutional culture: align leadership style with organizational culture, don't overuse change missionaries, protect change agents, define the problem, maintain focus when the project drifts, identify and remove barriers before implementing action plans, assign responsibilities to individuals,…

  9. The Principal as Professional Development Leader

    ERIC Educational Resources Information Center

    Lindstrom, Phyllis H.; Speck, Marsha

    2004-01-01

    Individual teachers have the greatest effect on student performance. Principals, as professional development leaders, are in the best position to provide teachers with the professional development strategies they need to improve skills and raise student achievement. This book guides readers through a step-by-step process to formulate, implement,…

  10. Functional Communication Training in Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Battaglia, Dana

    2017-01-01

    This article explicitly addresses the correlation between communication and behavior, and describes how to provide intervention addressing these two overlapping domains using an intervention called functional communication training (FCT; E. G. Carr & Durand, 1985) in individuals with ASD. A step-by-step process is outlined with supporting…

  11. Understanding, Developing, and Writing Effective IEPs: A Step-by-Step Guide for Educators

    ERIC Educational Resources Information Center

    Pierangelo, Roger; Giuliani, George A.

    2007-01-01

    Creating and evaluating Individualized Education Programs (IEPs) for students with disabilities is a major responsibility for teachers and school leaders, yet the process involves legal components not always understood by educators. In "Understanding, Developing, and Writing Effective IEPs," legal and special education experts Roger…

  12. The Role of the School Nurse in the Special Education Process: Part 2: Eligibility Determination and the Individualized Education Program.

    PubMed

    Shannon, Robin Adair; Yonkaitis, Catherine Falusi

    2017-07-01

This is the second of two articles outlining the professional school nurse's role in the special education process for students with disabilities. The Individuals with Disabilities Education Improvement Act of 2004 mandates the special education process: identification, full and individual evaluation, eligibility determination, and development of the individual education program (IEP), including special education placement. Part 1 focused on the importance of the school nurse's role in student identification, response to intervention, and the full and individual evaluation. Part 2 highlights the school nurse's vital and unique contribution to the subsequent special education steps of eligibility determination, IEP development, and special education services placement and minutes.

  13. The management submodel of the Wind Erosion Prediction System

    USDA-ARS's Scientific Manuscript database

The Wind Erosion Prediction System (WEPS) is a process-based, daily time-step computer model that predicts soil erosion via simulation of the physical processes controlling wind erosion. WEPS comprises several individual modules (submodels) that reflect different sets of physical processes, ...

  14. Structural and chemical evolution of the CdS:O window layer during individual CdTe solar cell processing steps

    DOE PAGES

    Abbas, A.; Meysing, D. M.; Reese, M. O.; ...

    2017-12-01

Oxygenated cadmium sulfide (CdS:O) is often used as the n-type window layer in high-performance CdTe heterojunction solar cells. The as-deposited layer prepared by reactive sputtering is XRD amorphous, with a bulk composition of CdS0.8O1.2. Recently it was shown that this layer undergoes significant transformation during device fabrication, but the roles of the individual high temperature processing steps were unclear. In this work high resolution transmission electron microscopy coupled to elemental analysis was used to understand the evolution of the heterojunction region through the individual high temperature fabrication steps of CdTe deposition, CdCl2 activation, and back contact activation. It is found that during CdTe deposition by close spaced sublimation at 600 degrees C the CdS:O film undergoes recrystallization, accompanied by a significant (~30%) reduction in thickness. It is observed that oxygen segregates during this step, forming a bilayer morphology consisting of nanocrystalline CdS adjacent to the tin oxide contact and an oxygen-rich layer adjacent to the CdTe absorber. This bilayer structure is then lost during the 400 degrees C CdCl2 treatment where the film transforms into a heterogeneous structure with cadmium sulfate clusters distributed randomly throughout the window layer. The thickness of the window layer remains essentially unchanged after CdCl2 treatment, but a ~25 nm graded interfacial layer between CdTe and the window region is formed. Finally, the rapid thermal processing step used to activate the back contact was found to have a negligible impact on the structure or composition of the heterojunction region.

  15. Structural and chemical evolution of the CdS:O window layer during individual CdTe solar cell processing steps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbas, A.; Meysing, D. M.; Reese, M. O.

Oxygenated cadmium sulfide (CdS:O) is often used as the n-type window layer in high-performance CdTe heterojunction solar cells. The as-deposited layer prepared by reactive sputtering is XRD amorphous, with a bulk composition of CdS0.8O1.2. Recently it was shown that this layer undergoes significant transformation during device fabrication, but the roles of the individual high temperature processing steps were unclear. In this work high resolution transmission electron microscopy coupled to elemental analysis was used to understand the evolution of the heterojunction region through the individual high temperature fabrication steps of CdTe deposition, CdCl2 activation, and back contact activation. It is found that during CdTe deposition by close spaced sublimation at 600 degrees C the CdS:O film undergoes recrystallization, accompanied by a significant (~30%) reduction in thickness. It is observed that oxygen segregates during this step, forming a bilayer morphology consisting of nanocrystalline CdS adjacent to the tin oxide contact and an oxygen-rich layer adjacent to the CdTe absorber. This bilayer structure is then lost during the 400 degrees C CdCl2 treatment where the film transforms into a heterogeneous structure with cadmium sulfate clusters distributed randomly throughout the window layer. The thickness of the window layer remains essentially unchanged after CdCl2 treatment, but a ~25 nm graded interfacial layer between CdTe and the window region is formed. Finally, the rapid thermal processing step used to activate the back contact was found to have a negligible impact on the structure or composition of the heterojunction region.

  16. Experimental entanglement of 25 individually accessible atomic quantum interfaces.

    PubMed

    Pu, Yunfei; Wu, Yukai; Jiang, Nan; Chang, Wei; Li, Chang; Zhang, Sheng; Duan, Luming

    2018-04-01

    A quantum interface links the stationary qubits in a quantum memory with flying photonic qubits in optical transmission channels and constitutes a critical element for the future quantum internet. Entanglement of quantum interfaces is an important step for the realization of quantum networks. Through heralded detection of photon interference, we generate multipartite entanglement between 25 (or 9) individually addressable quantum interfaces in a multiplexed atomic quantum memory array and confirm genuine 22-partite (or 9-partite) entanglement. This experimental entanglement of a record-high number of individually addressable quantum interfaces makes an important step toward the realization of quantum networks, long-distance quantum communication, and multipartite quantum information processing.

  17. Safe and Effective Schooling for All Students: Putting into Practice the Disciplinary Provisions of the 1997 IDEA.

    ERIC Educational Resources Information Center

    Gable, Robert A.; Butler, C. J.; Walker-Bolton, Irene; Tonelson, Stephen W.; Quinn, Mary M.; Fox, James J.

    2003-01-01

    Virginia's statewide plan of educator preparation in functional behavioral assessment, as required under the Individuals with Disabilities Education Act, is described. The step-by-step training process facilitated positive academic and nonacademic outcomes for all students. Preliminary data support the effectiveness of both the content and…

  18. Safe Schools: A Planning Guide for Action. 2002 Edition.

    ERIC Educational Resources Information Center

    Abbott, Carol

    This publication summarizes research on the benefits of safe-school planning, provides examples of successful programs and strategies, and offers a step-by-step planning process that school teams can apply to their individual campuses and student populations. It also reflects new state and federal laws that established California's School Safety…

  19. Five Steps for Developing Effective Transition Plans for High School Students with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Szidon, Katherine; Ruppar, Andrea; Smith, Leann

    2015-01-01

    The Individuals With Disabilities Education Act (IDEA; 2006) requires schools to develop transition plans for students with disabilities, beginning at age 16, if not before. For students with autism spectrum disorder (ASD), the transition planning process includes unique considerations. This article describes five steps for developing effective…

  20. Beliefs about Aggression and Submissiveness: A Comparison of Aggressive and Nonaggressive Individuals with Mild Intellectual Disability

    ERIC Educational Resources Information Center

    Kirk, Jamie; Jahoda, Andrew; Pert, Carol

    2008-01-01

    Recent research has examined the relevance of the social information processing model of aggression to individuals with intellectual disability (ID). This study investigated the "response access" and "response decision" steps of this model. Photo stories were used to compare aggressive and nonaggressive individuals' beliefs about the outcomes of…

  1. Site-selective substitutional doping with atomic precision on stepped Al (111) surface by single-atom manipulation

    PubMed Central

    2014-01-01

In fabrication of nano- and quantum devices, it is sometimes critical to position individual dopants at certain sites precisely to obtain the specific or enhanced functionalities. With first-principles simulations, we propose a method for substitutional doping of an individual atom at a certain position on a stepped metal surface by single-atom manipulation. A selected atom at the step of the Al (111) surface could be extracted vertically with an Al trimer-apex tip, and then the dopant atom can be positioned at this site. The details of the entire process including potential energy curves are given, which suggests the reliability of the proposed single-atom doping method. PMID:24899871

  2. Site-selective substitutional doping with atomic precision on stepped Al (111) surface by single-atom manipulation.

    PubMed

    Chen, Chang; Zhang, Jinhu; Dong, Guofeng; Shao, Hezhu; Ning, Bo-Yuan; Zhao, Li; Ning, Xi-Jing; Zhuang, Jun

    2014-01-01

In fabrication of nano- and quantum devices, it is sometimes critical to position individual dopants at certain sites precisely to obtain the specific or enhanced functionalities. With first-principles simulations, we propose a method for substitutional doping of an individual atom at a certain position on a stepped metal surface by single-atom manipulation. A selected atom at the step of the Al (111) surface could be extracted vertically with an Al trimer-apex tip, and then the dopant atom can be positioned at this site. The details of the entire process including potential energy curves are given, which suggests the reliability of the proposed single-atom doping method.

  3. Group Analysis in MNE-Python of Evoked Responses from a Tactile Stimulation Paradigm: A Pipeline for Reproducibility at Every Step of Processing, Going from Individual Sensor Space Representations to an across-Group Source Space Representation

    PubMed Central

    Andersen, Lau M.

    2018-01-01

An important aim of an analysis pipeline for magnetoencephalographic data is that it allows the researcher to spend maximal effort on making the statistical comparisons that will answer the questions of the researcher, while in turn spending minimal effort on the intricacies and machinery of the pipeline. I here present a set of functions and scripts that allow for setting up a clear, reproducible structure for separating raw and processed data into folders and files such that minimal effort can be spent on: (1) double-checking that the right input goes into the right functions; (2) making sure that output and intermediate steps can be accessed meaningfully; (3) applying operations efficiently across groups of subjects; (4) re-processing data if changes to any intermediate step are desirable. Applying the scripts requires only general knowledge about the Python language. The data analyses are neural responses to tactile stimulations of the right index finger in a group of 20 healthy participants acquired from an Elekta Neuromag System. Two analyses are presented: going from individual sensor space representations to, respectively, an across-group sensor space representation and an across-group source space representation. The processing steps covered for the first analysis are filtering the raw data, finding events of interest in the data, epoching data, finding and removing independent components related to eye blinks and heart beats, calculating participants' individual evoked responses by averaging over epoched data and calculating a grand average sensor space representation over participants.
The second analysis starts from the participants' individual evoked responses and covers: estimating noise covariance, creating a forward model, creating an inverse operator, estimating distributed source activity on the cortical surface using a minimum norm procedure, morphing those estimates onto a common cortical template and calculating the patterns of activity that are statistically different from baseline. To estimate source activity, processing of the anatomy of subjects based on magnetic resonance imaging is necessary. The necessary steps are covered here: importing magnetic resonance images, segmenting the brain, estimating boundaries between different tissue layers, making fine-resolution scalp surfaces for facilitating co-registration, creating source spaces and creating volume conductors for each subject. PMID:29403349

  4. Group Analysis in MNE-Python of Evoked Responses from a Tactile Stimulation Paradigm: A Pipeline for Reproducibility at Every Step of Processing, Going from Individual Sensor Space Representations to an across-Group Source Space Representation.

    PubMed

    Andersen, Lau M

    2018-01-01

An important aim of an analysis pipeline for magnetoencephalographic data is that it allows the researcher to spend maximal effort on making the statistical comparisons that will answer the questions of the researcher, while in turn spending minimal effort on the intricacies and machinery of the pipeline. I here present a set of functions and scripts that allow for setting up a clear, reproducible structure for separating raw and processed data into folders and files such that minimal effort can be spent on: (1) double-checking that the right input goes into the right functions; (2) making sure that output and intermediate steps can be accessed meaningfully; (3) applying operations efficiently across groups of subjects; (4) re-processing data if changes to any intermediate step are desirable. Applying the scripts requires only general knowledge about the Python language. The data analyses are neural responses to tactile stimulations of the right index finger in a group of 20 healthy participants acquired from an Elekta Neuromag System. Two analyses are presented: going from individual sensor space representations to, respectively, an across-group sensor space representation and an across-group source space representation. The processing steps covered for the first analysis are filtering the raw data, finding events of interest in the data, epoching data, finding and removing independent components related to eye blinks and heart beats, calculating participants' individual evoked responses by averaging over epoched data and calculating a grand average sensor space representation over participants.
The second analysis starts from the participants' individual evoked responses and covers: estimating noise covariance, creating a forward model, creating an inverse operator, estimating distributed source activity on the cortical surface using a minimum norm procedure, morphing those estimates onto a common cortical template and calculating the patterns of activity that are statistically different from baseline. To estimate source activity, processing of the anatomy of subjects based on magnetic resonance imaging is necessary. The necessary steps are covered here: importing magnetic resonance images, segmenting the brain, estimating boundaries between different tissue layers, making fine-resolution scalp surfaces for facilitating co-registration, creating source spaces and creating volume conductors for each subject.

  5. Testing Components of a Self-Management Theory in Adolescents With Type 1 Diabetes Mellitus.

    PubMed

    Verchota, Gwen; Sawin, Kathleen J

The role of self-management in adolescents with type 1 diabetes mellitus is not well understood. The purpose of the research was to examine the relationship of key individual and family self-management theory, context, and process variables on proximal (self-management behaviors) and distal (hemoglobin A1c and diabetes-specific health-related quality of life) outcomes in adolescents with type 1 diabetes. A correlational, cross-sectional study was conducted to identify factors contributing to outcomes in adolescents with Type 1 diabetes and examine potential relationships between context, process, and outcome variables delineated in individual and family self-management theory. Participants were 103 adolescent-parent dyads (adolescents ages 12-17) with Type 1 diabetes from a Midwestern outpatient diabetes clinic. The dyads completed a self-report survey including instruments intended to measure context, process, and outcome variables from individual and family self-management theory. Using hierarchical multiple regression, context (depressive symptoms) and process (communication) variables explained 37% of the variance in self-management behaviors. Regimen complexity, the only significant predictor, explained 11% of the variance in hemoglobin A1c. Neither process variables nor self-management behaviors were significant. For the diabetes-specific health-related quality of life outcome, context (regimen complexity and depressive symptoms) explained 26% of the variance at Step 1; an additional 9% of the variance was explained when process (self-efficacy and communication) variables were added at Step 2; and 52% of the variance was explained when self-management behaviors were added at Step 3. In the final model, three variables were significant predictors: depressive symptoms, self-efficacy, and self-management behaviors.
The individual and family self-management theory can serve as a cogent theory for understanding key concepts, processes, and outcomes essential to self-management in adolescents and families dealing with Type 1 diabetes mellitus.
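Hierarchical multiple regression of the kind reported above enters predictor blocks in steps and asks how much variance each step adds. A minimal numpy sketch on synthetic data (the block names echo the study's variables, but the data and effect sizes are invented for illustration):

```python
import numpy as np

def r_squared(X, y):
    """Fit ordinary least squares (with intercept) and return R^2."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

def hierarchical_r2(blocks, y):
    """Enter predictor blocks step by step; report cumulative and incremental R^2."""
    steps, X, prev = [], np.empty((len(y), 0)), 0.0
    for block in blocks:
        X = np.column_stack([X, block])
        r2 = r_squared(X, y)
        steps.append((r2, r2 - prev))
        prev = r2
    return steps

rng = np.random.default_rng(0)
n = 103  # same n as the dyad sample, purely coincidental here
context = rng.normal(size=(n, 2))   # e.g. regimen complexity, depressive symptoms
process = rng.normal(size=(n, 2))   # e.g. self-efficacy, communication
behavior = rng.normal(size=(n, 1))  # self-management behaviors
y = (0.5 * context[:, 0] + 0.4 * process[:, 1]
     + 0.6 * behavior[:, 0] + rng.normal(size=n))

for i, (r2, dr2) in enumerate(hierarchical_r2([context, process, behavior], y), 1):
    print(f"step {i}: R^2 = {r2:.2f} (change = {dr2:.2f})")
```

Because each step only adds predictors, the cumulative R^2 can never decrease, and the incremental R^2 values sum to the final model's R^2.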

  6. Poisson point process modeling for polyphonic music transcription.

    PubMed

    Peeling, Paul; Li, Chung-fai; Godsill, Simon

    2007-04-01

    Peaks detected in the frequency domain spectrum of a musical chord are modeled as realizations of a nonhomogeneous Poisson point process. When several notes are superimposed to make a chord, the processes for individual notes combine to give another Poisson process, whose likelihood is easily computable. This avoids a data association step linking individual harmonics explicitly with detected peaks in the spectrum. The likelihood function is ideal for Bayesian inference about the unknown note frequencies in a chord. Here, maximum likelihood estimation of fundamental frequencies shows very promising performance on real polyphonic piano music recordings.
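The superposition property this abstract relies on (summed note processes form another Poisson process) makes the chord likelihood a simple sum of log-intensities at the detected peaks minus an integrated intensity. A toy sketch under assumed Gaussian harmonic bumps; all function names, widths, and rates here are invented, not the paper's model parameters:

```python
import math

def note_intensity(f, f0, n_harmonics=5, width=2.0, strength=8.0):
    """Invented intensity (expected peaks per Hz) for one note: Gaussian
    bumps at the harmonics of f0 plus a small background rate."""
    lam = 0.01
    for k in range(1, n_harmonics + 1):
        lam += strength * math.exp(-0.5 * ((f - k * f0) / width) ** 2)
    return lam

def chord_intensity(f, f0s):
    # Superposed Poisson processes: the intensities of the notes simply add.
    return sum(note_intensity(f, f0) for f0 in f0s)

def log_likelihood(peaks, f0s, f_max=2000.0, df=0.5):
    """Poisson point-process log-likelihood of the detected peak frequencies:
    sum of log-intensities at the peaks minus the integrated intensity."""
    integral = df * sum(chord_intensity(i * df, f0s) for i in range(int(f_max / df)))
    return sum(math.log(chord_intensity(p, f0s)) for p in peaks) - integral

# Peaks at the harmonics of 220 Hz and 277 Hz (an A3 + C#4 dyad).
peaks = [k * 220.0 for k in range(1, 6)] + [k * 277.0 for k in range(1, 6)]
print(log_likelihood(peaks, [220.0, 277.0]))  # hypothesized true chord
print(log_likelihood(peaks, [220.0, 300.0]))  # wrong second note scores lower
```

No data-association step is needed: every peak contributes through the summed intensity, whichever note produced it.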

  7. Engaged for Change: A Community-Engaged Process for Developing Interventions to Reduce Health Disparities.

    PubMed

    Rhodes, Scott D; Mann-Jackson, Lilli; Alonzo, Jorge; Simán, Florence M; Vissman, Aaron T; Nall, Jennifer; Abraham, Claire; Aronson, Robert E; Tanner, Amanda E

    2017-12-01

    The science underlying the development of individual, community, system, and policy interventions designed to reduce health disparities has lagged behind other innovations. Few models, theoretical frameworks, or processes exist to guide intervention development. Our community-engaged research partnership has been developing, implementing, and evaluating efficacious interventions to reduce HIV disparities for over 15 years. Based on our intervention research experiences, we propose a novel 13-step process designed to demystify and guide intervention development. Our intervention development process includes steps such as establishing an intervention team to manage the details of intervention development; assessing community needs, priorities, and assets; generating intervention priorities; evaluating and incorporating theory; developing a conceptual or logic model; crafting activities; honing materials; administering a pilot, noting its process, and gathering feedback from all those involved; and editing the intervention based on what was learned. Here, we outline and describe each of these 13 steps.

  8. STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitch, S.H.; Morris, J.W.

    1962-12-15

    Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described with respect to their application in determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)
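The core of the analysis-of-variance approach used above is a comparison of between-group to within-group variability. A from-scratch one-way ANOVA sketch on invented thermoelectric readings (the treatments and values are hypothetical, not SNAP 10A data):

```python
import numpy as np

def one_way_anova(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_obs = np.concatenate(groups)
    grand = all_obs.mean()
    k, n = len(groups), len(all_obs)
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical element output readings under three processing treatments.
rng = np.random.default_rng(1)
a = rng.normal(10.0, 1.0, 20)   # baseline process
b = rng.normal(10.1, 1.0, 20)   # step with negligible effect
c = rng.normal(13.0, 1.0, 20)   # step with a strong effect
print(f"F = {one_way_anova([a, b, c]):.1f}")
```

A large F flags at least one treatment as influential; follow-up comparisons would then isolate which processing step matters, mirroring how unnecessary steps were screened out above.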

  9. Experimental entanglement of 25 individually accessible atomic quantum interfaces

    PubMed Central

    Jiang, Nan; Chang, Wei; Li, Chang; Zhang, Sheng

    2018-01-01

    A quantum interface links the stationary qubits in a quantum memory with flying photonic qubits in optical transmission channels and constitutes a critical element for the future quantum internet. Entanglement of quantum interfaces is an important step for the realization of quantum networks. Through heralded detection of photon interference, we generate multipartite entanglement between 25 (or 9) individually addressable quantum interfaces in a multiplexed atomic quantum memory array and confirm genuine 22-partite (or 9-partite) entanglement. This experimental entanglement of a record-high number of individually addressable quantum interfaces marks an important step toward the realization of quantum networks, long-distance quantum communication, and multipartite quantum information processing. PMID:29725621

  10. (On)line dancing: choosing an appropriate distance education partner.

    PubMed

    Menn, Mindy; Don Chaney, J

    2014-05-01

    Online-delivered distance education is a burgeoning component of professional development and continuing education. Distance education programs allow individuals to learn in a different location and/or at a different time from fellow learners, thereby increasing the flexibility and number of learning options. Selecting the "right" program for personal development from the ever-growing body of online-delivered education is an individualized decision that can become an overwhelming and challenging process. This Tool presents four important definitions for navigating distance education program description materials and outlines a five-step process to assist in identifying an appropriate program for personal development. The five-step process includes key questions and points to consider while conducting a candid self-assessment, identifying and investigating distance education programs, and then compiling information, comparing programs, and prioritizing a list of programs suitable for application. Furthermore, this Tool highlights important websites for distance education degree program reviews, accreditation information, and open educational resources.

  11. Integrating social media and social marketing: a four-step process.

    PubMed

    Thackeray, Rosemary; Neiger, Brad L; Keller, Heidi

    2012-03-01

    Social media is a group of Internet-based applications that allows individuals to create, collaborate, and share content with one another. Practitioners can realize social media's untapped potential by incorporating it as part of the larger social marketing strategy, beyond promotion. Social media, if used correctly, may help organizations increase their capacity for putting the consumer at the center of the social marketing process. The purpose of this article is to provide a template for strategic thinking to successfully include social media as part of the social marketing strategy by using a four-step process.

  12. A systematic writing program as a tool in the grief process: part 1.

    PubMed

    Furnes, Bodil; Dysvik, Elin

    2010-12-06

    The basic aim of this paper is to suggest a flexible and individualized writing program as a tool for use during the grief process of bereaved adults. An open, qualitative approach following distinct steps was taken to gain a broad perspective on the grief and writing processes, as a platform for the writing program. Following several systematic methodological steps, we arrived at suggestions for the initiation of a writing program and its structure and substance, with appropriate guidelines. We believe that open and expressive writing, including free writing and focused writing, may have beneficial effects on a person experiencing grief. These writing forms may be undertaken and systematized through a writing program, with participation in a grief writing group and with diary writing, to achieve optimal results. A structured writing program might be helpful in promoting thought activities and as a tool to increase the coherence and understanding of individuals in the grief process. Our suggested program may also be a valuable guide to future program development and research.

  13. Interactions of double patterning technology with wafer processing, OPC and design flows

    NASA Astrophysics Data System (ADS)

    Lucas, Kevin; Cork, Chris; Miloslavsky, Alex; Luk-Pat, Gerry; Barnes, Levi; Hapli, John; Lewellen, John; Rollins, Greg; Wiaux, Vincent; Verhaegen, Staf

    2008-03-01

    Double patterning technology (DPT) is one of the main options for printing logic devices with half-pitch below 45 nm, and flash and DRAM memory devices with half-pitch below 40 nm. DPT methods decompose the original design intent into two individual masking layers, each patterned in a single exposure using existing 193 nm lithography tools. The results of the individual patterning layers combine to re-create the design intent pattern on the wafer. In this paper we study the interactions of DPT with lithography, mask synthesis, and physical design flows. Double exposure and etch patterning steps create complexity for both process and design flows. DPT decomposition is a critical software step that will be performed in physical design and also in mask synthesis. Decomposition includes cutting (splitting) original design intent polygons into multiple polygons where required, and coloring the resulting polygons. We evaluate the ability to meet key physical design goals: reducing circuit area; minimizing rework; ensuring DPT compliance; guaranteeing patterning robustness on individual layer targets; ensuring symmetric wafer results; and creating uniform wafer density for the individual patterning layers.
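At its core, the coloring step of DPT decomposition is a two-coloring (bipartiteness) problem on the conflict graph of features spaced below the single-exposure pitch limit. A minimal BFS sketch, not any production decomposition tool:

```python
from collections import deque

def two_color(n, conflicts):
    """Assign each of n features to mask A (0) or mask B (1) so that every
    conflicting pair (features below the single-exposure pitch limit) lands
    on different masks. Returns None if the conflict graph has an odd cycle,
    i.e. the layout is not DPT-compliant without cutting a polygon."""
    adj = [[] for _ in range(n)]
    for u, v in conflicts:
        adj[u].append(v)
        adj[v].append(u)
    color = [None] * n
    for start in range(n):
        if color[start] is not None:
            continue
        color[start] = 0
        q = deque([start])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if color[v] is None:
                    color[v] = 1 - color[u]
                    q.append(v)
                elif color[v] == color[u]:
                    return None  # odd cycle: stitching/cutting required
    return color

# Four features in a chain of conflicts: alternating masks work.
print(two_color(4, [(0, 1), (1, 2), (2, 3)]))  # [0, 1, 0, 1]
# A triangle of conflicts is an odd cycle: not decomposable without cuts.
print(two_color(3, [(0, 1), (1, 2), (2, 0)]))  # None
```

The "cutting" step mentioned above is what breaks such odd cycles: splitting one polygon into two stitched pieces removes an edge from the conflict graph and can restore two-colorability.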

  14. Prosopography, prosoporecognography and the Prosoporecognographical Chart.

    PubMed

    Santos-Filho, E F; Pereira, H B B

    2017-11-01

    Recognizing and identifying an individual based on his or her face is a technical and scientific challenge and the objective of our investigation. This article's goal is to establish a method, a foundation and an instrument for carrying out the process of recognizing and identifying an individual. Both the construction of the term and the deepening, conceptualization and epistemology of the process of describing and representing the face through a particular method of recognizing and identifying individuals are described in this article. The proposal of the Prosoporecognographical Chart is an important step in the facial-identification process, establishing taxonomic parameters for the phenotypic manifestations of the elements constituting the face. Based on the proposal presented here, the construction of a protocol for the process of recognizing and identifying an individual can be implemented computationally. Copyright © 2017. Published by Elsevier Ltd.

  15. The Problem of Existence

    DTIC Science & Technology

    1985-01-01

    (envisionment) produced by GIZMO. In the envisionment, Is indicates the set of quantity-conditioned individuals that exists during a situation ... envisionment step by step. In START, the initial state, GIZMO deduces that heat flow occurs, since there is assumed to be a temperature difference between the ... stove. GIZMO implements the basic operations of qualitative process theory, including an envisioner for making predictions and a program for ...

  16. Streamlining the Bankability Process using International Standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah; Repins, Ingrid L; Kelly, George

    NREL has supported the international efforts to create a streamlined process for documenting bankability and/or completion of each step of a PV project plan. IECRE was created for this purpose in 2014. This poster describes the goals, current status of this effort, and how individuals and companies can become involved.

  17. Implementation of Competency-Based Pharmacy Education (CBPE)

    PubMed Central

    Koster, Andries; Schalekamp, Tom; Meijerman, Irma

    2017-01-01

    Implementation of competency-based pharmacy education (CBPE) is a time-consuming, complicated process, which requires agreement on the tasks of a pharmacist, commitment, institutional stability, and a goal-directed developmental perspective of all stakeholders involved. In this article the main steps in the development of a fully developed competency-based pharmacy curriculum (bachelor, master) are described and tips are given for a successful implementation. After the choice to adopt CBPE is made and a competency framework is selected (step 1), intended learning outcomes are defined (step 2), followed by analysis of the required developmental trajectory (step 3) and the selection of appropriate assessment methods (step 4). Designing the teaching-learning environment involves the selection of learning activities, student experiences, and instructional methods (step 5). Finally, an iterative process of evaluation and adjustment of individual courses, and of the curriculum as a whole, is entered (step 6). Successful implementation of CBPE requires a system of effective quality management and continuous professional development as a teacher. This article also offers suggestions for the organization of CBPE and references to more detailed literature, in the hope of facilitating the implementation of CBPE. PMID:28970422

  18. A muscle-driven approach to restore stepping with an exoskeleton for individuals with paraplegia.

    PubMed

    Chang, Sarah R; Nandor, Mark J; Li, Lu; Kobetic, Rudi; Foglyano, Kevin M; Schnellenberger, John R; Audu, Musa L; Pinault, Gilles; Quinn, Roger D; Triolo, Ronald J

    2017-05-30

    Functional neuromuscular stimulation, lower limb orthosis, powered lower limb exoskeleton, and hybrid neuroprosthesis (HNP) technologies can restore stepping in individuals with paraplegia due to spinal cord injury (SCI). However, a self-contained, muscle-driven, controllable exoskeleton based on an implanted neural stimulator has not previously been demonstrated as a means of restoring walking; such an approach could enable system use outside the laboratory and be viable for long-term use or clinical testing. In this work, we designed and evaluated an untethered muscle-driven controllable exoskeleton to restore stepping in three individuals with paralysis from SCI. The self-contained HNP combined neural stimulation to activate the paralyzed muscles and generate joint torques for limb movements with a controllable lower limb exoskeleton to stabilize and support the user. An onboard controller processed exoskeleton sensor signals, determined appropriate exoskeletal constraints and stimulation commands for a finite state machine (FSM), and transmitted data over Bluetooth to an off-board computer for real-time monitoring and data recording. The FSM coordinated stimulation and exoskeletal constraints to enable functions, selected with a wireless finger switch user interface, for standing up, standing, stepping, or sitting down. In the stepping function, the FSM used a sensor-based gait event detector to determine transitions between gait phases of double stance, early swing, late swing, and weight acceptance. The HNP restored stepping in three individuals with motor complete paralysis due to SCI. The controller appropriately coordinated stimulation and exoskeletal constraints using the sensor-based FSM for subjects with different stimulation systems. The average ranges of motion at the hip and knee joints during walking were 8.5°-20.8° and 14.0°-43.6°, respectively. Walking speeds varied from 0.03 to 0.06 m/s, and cadences from 10 to 20 steps/min.
A self-contained muscle-driven exoskeleton was a feasible intervention to restore stepping in individuals with paraplegia due to SCI. The untethered hybrid system was capable of adjusting to different individuals' needs to appropriately coordinate exoskeletal constraints with muscle activation using a sensor-driven FSM for stepping. Further improvements for out-of-the-laboratory use should include implantation of plantar flexor muscles to improve walking speed and power assist as needed at the hips and knees to maintain walking as muscles fatigue.
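The sensor-driven gait-phase logic described in this record can be pictured as a small finite state machine. In the sketch below, the four phase names come from the abstract, but the event names, the transition table, and the noise-rejection behavior are invented for illustration:

```python
# Hypothetical transition table for the gait-phase FSM; the real controller
# also issues stimulation commands and exoskeletal joint constraints.
TRANSITIONS = {
    ("double_stance", "toe_off"): "early_swing",
    ("early_swing", "peak_knee_flexion"): "late_swing",
    ("late_swing", "heel_strike"): "weight_acceptance",
    ("weight_acceptance", "load_transferred"): "double_stance",
}

class GaitFSM:
    def __init__(self):
        self.state = "double_stance"

    def on_event(self, event):
        """Advance on a recognized gait event; ignore out-of-phase sensor noise."""
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is not None:
            self.state = nxt
            # Here the controller would update stimulation patterns and
            # exoskeletal constraints for the newly entered phase.
        return self.state

fsm = GaitFSM()
for ev in ["toe_off", "heel_strike", "peak_knee_flexion",
           "heel_strike", "load_transferred"]:
    print(f"{ev} -> {fsm.on_event(ev)}")
```

Keying transitions on (state, event) pairs is what lets a spurious heel strike detected mid-swing be ignored rather than corrupting the gait cycle.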

  19. MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit

    PubMed Central

    Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188

  20. MOCAT: a metagenomics assembly and gene prediction toolkit.

    PubMed

    Kultima, Jens Roat; Sunagawa, Shinichi; Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/.
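The modular architecture both MOCAT records describe, with exchangeable programs per step and per-step statistics, can be caricatured as a list of step functions threaded over the read set. Everything below is a toy stand-in, not MOCAT code:

```python
# Toy read-processing steps; each is an exchangeable function, as in a
# modular pipeline where users swap the program used at any stage.
def quality_control(reads):
    return [r for r in reads if len(r) >= 5]   # toy filter: drop short reads

def map_reads(reads):
    return [r for r in reads if "N" not in r]  # toy "mapping": drop ambiguous reads

def run_pipeline(reads, steps):
    """Run named steps in order, collecting a per-step summary statistic."""
    stats = {}
    for name, step in steps:
        reads = step(reads)
        stats[name] = len(reads)  # e.g. reads surviving each stage
    return reads, stats

reads = ["ACGTACGT", "ACG", "ACGTNNGT", "GGGTTTAA"]
out, stats = run_pipeline(reads, [("qc", quality_control), ("map", map_reads)])
print(stats)  # {'qc': 3, 'map': 2}
```

Because each step shares one signature (reads in, reads out), replacing the mapper or assembler is just a matter of passing a different function, which is the benchmarking-friendly property the abstract highlights.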

  1. Structured scaffolding for reflection and problem solving in diabetes self-management: qualitative study of mobile diabetes detective.

    PubMed

    Mamykina, Lena; Heitkemper, Elizabeth M; Smaldone, Arlene M; Kukafka, Rita; Cole-Lewis, Heather; Davidson, Patricia G; Mynatt, Elizabeth D; Tobin, Jonathan N; Cassells, Andrea; Goodman, Carrie; Hripcsak, George

    2016-01-01

    To investigate subjective experiences and patterns of engagement with a novel electronic tool for facilitating reflection and problem solving for individuals with type 2 diabetes, Mobile Diabetes Detective (MoDD). In this qualitative study, researchers conducted semi-structured interviews with individuals from economically disadvantaged communities and ethnic minorities who were participating in a randomized controlled trial of MoDD. The transcripts of the interviews were analyzed using inductive thematic analysis; usage logs were analyzed to determine how actively the study participants used MoDD. Fifteen participants in the MoDD randomized controlled trial were recruited for the qualitative interviews. Usage log analysis showed that, on average, during the 4 weeks of the study, the study participants logged into MoDD twice per week, reported 120 blood glucose readings, and set two behavioral goals. The qualitative interviews suggested that individuals used MoDD to follow the steps of the problem-solving process, from identifying problematic blood glucose patterns, to exploring behavioral triggers contributing to these patterns, to selecting alternative behaviors, to implementing these behaviors while monitoring for improvements in glycemic control. This qualitative study suggested that informatics interventions for reflection and problem solving can provide structured scaffolding for facilitating these processes by guiding users through the different steps of the problem-solving process and by providing them with context-sensitive evidence and practice-based knowledge related to diabetes self-management at each of those steps. This qualitative study suggested that MoDD was perceived as a useful tool in engaging individuals in self-monitoring, reflection, and problem solving. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  2. Using process elicitation and validation to understand and improve chemotherapy ordering and delivery.

    PubMed

    Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L

    2012-11-01

    Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated by employing process formalism (or formal process definition), a technique derived from software engineering, to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.

  3. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    PubMed

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

    Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, efficient IB preparation after cell disintegration is the first step toward optimal product recovery. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, because laboratory-scale methods require large amounts of material and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing regarding impurity depletion and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and on product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.

  4. A Post-It-Note Pedagogy: Investigating the "petit recit" in an Emergent Model of the Writing Process.

    ERIC Educational Resources Information Center

    Noble, Michael

    Perhaps writing is equated with process. But, there are too many complicating factors that make it difficult to evaluate the success or failure of prewriting and drafting assignments--the process and the value of each step is different for each individual. By teaching students to recognize the cultural contingencies of textuality, the status of…

  5. Automated array assembly, phase 2

    NASA Technical Reports Server (NTRS)

    Daiello, R. V.

    1979-01-01

    A manufacturing process suitable for the large-scale production of silicon solar array modules at a cost of less than $500/peak kW is described. Factors which control the efficiency of ion implanted silicon solar cells, screen-printed thick film metallization, spray-on antireflection coating process, and panel assembly are discussed. Conclusions regarding technological readiness or cost effectiveness of individual process steps are presented.

  6. A Process for Evaluating Student Records Management Software. ERIC/AE Digest.

    ERIC Educational Resources Information Center

    Vecchioli, Lisa

    This digest provides practical advice on evaluating software for managing student records. An evaluation of record-keeping software should start with a process to identify all of the individual needs the software product must meet in order to be considered for purchase. The first step toward establishing an administrative computing system is…

  7. Advances in polycrystalline thin-film photovoltaics for space applications

    NASA Technical Reports Server (NTRS)

    Lanning, Bruce R.; Armstrong, Joseph H.; Misra, Mohan S.

    1994-01-01

    Polycrystalline, thin-film photovoltaics represent one of the few (if not the only) renewable power sources that have the potential to satisfy the demanding technical requirements for future space applications. The demand in space is for deployable, flexible arrays with high power-to-weight ratios and long-term stability (15-20 years). In addition, there is also the demand that these arrays be produced by scalable, low-cost, high-yield processes. An approach to significantly reduce costs and increase reliability is to interconnect individual cells in series via monolithic integration. Both CIS and CdTe semiconductor films are optimum absorber materials for thin-film n-p heterojunction solar cells, having band gaps between 0.9 and 1.5 eV and demonstrated small-area efficiencies, with cadmium sulfide window layers, above 16.5 percent. Both CIS and CdTe polycrystalline thin-film cells have been produced on a laboratory scale by a variety of physical and chemical deposition methods, including evaporation, sputtering, and electrodeposition. Translating the laboratory processes that yield these high-efficiency, small-area cells into the design of a manufacturing process capable of producing 1-sq ft modules, however, requires a quantitative understanding of each individual step in the process and its effect on overall module performance. With proper quantification and understanding of material transport and reactivity for each individual step, a manufacturing process can be designed that is not 'reactor-specific' and can be controlled intelligently through the design parameters of the process. The objective of this paper is to present an overview of the current efforts at MMC to develop large-scale manufacturing processes for both CIS and CdTe thin-film polycrystalline modules.
CIS cells/modules are fabricated in a 'substrate configuration' by physical vapor deposition techniques and CdTe cells/modules are fabricated in a 'superstrate configuration' by wet chemical methods. Both laser and mechanical scribing operations are used to monolithically integrate (series interconnect) the individual cells into modules. Results will be presented at the cell and module development levels with a brief description of the test methods used to qualify these devices for space applications. The approach and development efforts are directed towards large-scale manufacturability of established thin-film, polycrystalline processing methods for large area modules with less emphasis on maximizing small area efficiencies.

  8. Analyzing developmental processes on an individual level using nonstationary time series modeling.

    PubMed

    Molenaar, Peter C M; Sinclair, Katerina O; Rovine, Michael J; Ram, Nilam; Corneal, Sherry E

    2009-01-01

    Individuals change over time, often in complex ways. Generally, studies of change over time have combined individuals into groups for analysis, which is inappropriate in most, if not all, studies of development. The authors explain how to identify appropriate levels of analysis (individual vs. group) and demonstrate how to estimate changes in developmental processes over time using a multivariate nonstationary time series model. They apply this model to describe the changing relationships between a biological son and father and a stepson and stepfather at the individual level. The authors also explain how to use an extended Kalman filter with iteration and smoothing estimator to capture how dynamics change over time. Finally, they suggest further applications of the multivariate nonstationary time series model and detail the next steps in the development of statistical models used to analyze individual-level data.
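A scalar Kalman filter tracking a random-walk regression coefficient gives the flavor of how time-varying dynamics like the dyad relationships above are estimated. This is a deliberately simple linear toy, far short of the iterated extended Kalman filter and smoother the authors describe, and all numbers are invented:

```python
import numpy as np

def kalman_tv_coefficient(y, x, q=0.01, r=1.0):
    """Track a time-varying coefficient b_t in y_t = b_t * x_t + noise,
    modeling b_t as a random walk with innovation variance q and
    observation noise variance r."""
    b, P = 0.0, 1.0  # state estimate and its variance
    est = []
    for yt, xt in zip(y, x):
        P += q                 # predict: random walk, uncertainty grows
        S = xt * P * xt + r    # innovation variance
        K = P * xt / S         # Kalman gain
        b += K * (yt - xt * b) # correct with the new observation
        P *= (1 - K * xt)
        est.append(b)
    return np.array(est)

rng = np.random.default_rng(2)
T = 300
x = rng.normal(size=T)
b_true = np.linspace(0.0, 2.0, T)  # the relationship strengthens over time
y = b_true * x + 0.5 * rng.normal(size=T)
b_hat = kalman_tv_coefficient(y, x, q=0.01, r=0.25)
print(f"final estimate {b_hat[-1]:.2f} (true value 2.0)")
```

A fixed-coefficient regression would average the early and late regimes together; the filter instead follows the drift, which is the nonstationarity the abstract argues group-level analysis misses.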

  9. Survey Methods for Educators: Collaborative Survey Development (Part 1 of 3). REL 2016-163

    ERIC Educational Resources Information Center

    Irwin, Clare W.; Stafford, Erin T.

    2016-01-01

    This guide describes a five-step collaborative process that educators can use with other educators, researchers, and content experts to write or adapt questions and develop surveys for education contexts. This process allows educators to leverage the expertise of individuals within and outside of their organization to ensure a high-quality survey…

  10. Large-Scale Traffic Microsimulation From An MPO Perspective

    DOT National Transportation Integrated Search

    1997-01-01

    One potential advancement of the four-step travel model process is the forecasting and simulation of individual activities and travel. A common concern with such an approach is that the data and computational requirements for a large-scale, regional ...

  11. Mechanism of muscle contraction based on stochastic properties of single actomyosin motors observed in vitro

    PubMed Central

    Kitamura, Kazuo; Tokunaga, Makio; Esaki, Seiji; Iwane, Atsuko Hikikoshi; Yanagida, Toshio

    2005-01-01

    We have previously measured the process of displacement generation by a single head of muscle myosin (S1) using scanning probe nanometry. Given that the myosin head was rigidly attached to a fairly large scanning probe, it was assumed to stably interact with an underlying actin filament without diffusing away as would be the case in muscle. The myosin head has been shown to step back and forth stochastically along an actin filament with actin monomer repeats of 5.5 nm and to produce a net movement in the forward direction. The myosin head underwent 5 forward steps to produce a maximum displacement of 30 nm per ATP at low load (<1 pN). Here, we measured the steps over a wide range of forces up to 4 pN. The size of the steps (∼5.5 nm) did not change as the load increased whereas the number of steps per displacement and the stepping rate both decreased. The rate of the 5.5-nm steps at various force levels produced a force-velocity curve of individual actomyosin motors. The force-velocity curve from the individual myosin heads was comparable to that reported in muscle, suggesting that the fundamental mechanical properties in muscle are basically due to the intrinsic stochastic nature of individual actomyosin motors. In order to explain multiple stochastic steps, we propose a model arguing that the thermally-driven step of a myosin head is biased in the forward direction by a potential slope along the actin helical pitch resulting from steric compatibility between the binding sites of actin and a myosin head. Furthermore, computer simulations show that multiple cooperating heads undergoing stochastic steps generate a long (>60 nm) sliding distance per ATP between actin and myosin filaments, i.e., the movement is loosely coupled to the ATPase cycle as observed in muscle. PMID:27857548
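The biased-random-walk picture in this abstract is easy to caricature with a Monte Carlo sketch. Only the 5.5 nm step size comes from the paper; the bias probabilities and step counts per ATP below are invented to illustrate the load dependence:

```python
import random

STEP_NM = 5.5  # actin monomer repeat; the step size itself is load-independent

def displacement_per_atp(p_forward, n_steps):
    """Net displacement from n biased +/-5.5 nm thermal steps in one ATP cycle."""
    return sum(STEP_NM if random.random() < p_forward else -STEP_NM
               for _ in range(n_steps))

random.seed(3)
trials = 2000
# Hypothetical regimes: low load gives a strong forward bias and more steps;
# high load weakens the bias and reduces the step count, as in the
# force-velocity behavior described above.
low = sum(displacement_per_atp(0.9, 5) for _ in range(trials)) / trials
high = sum(displacement_per_atp(0.6, 3) for _ in range(trials)) / trials
print(f"mean displacement: low load {low:.1f} nm, high load {high:.1f} nm")
```

The step size never changes between the two regimes; only the number of steps and the forward bias do, which is exactly how the abstract explains the force-velocity curve of individual actomyosin motors.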

  12. Mechanism of muscle contraction based on stochastic properties of single actomyosin motors observed in vitro.

    PubMed

    Kitamura, Kazuo; Tokunaga, Makio; Esaki, Seiji; Iwane, Atsuko Hikikoshi; Yanagida, Toshio

    2005-01-01

    We have previously measured the process of displacement generation by a single head of muscle myosin (S1) using scanning probe nanometry. Given that the myosin head was rigidly attached to a fairly large scanning probe, it was assumed to stably interact with an underlying actin filament without diffusing away as would be the case in muscle. The myosin head has been shown to step back and forth stochastically along an actin filament with actin monomer repeats of 5.5 nm and to produce a net movement in the forward direction. The myosin head underwent 5 forward steps to produce a maximum displacement of 30 nm per ATP at low load (<1 pN). Here, we measured the steps over a wide range of forces up to 4 pN. The size of the steps (∼5.5 nm) did not change as the load increased whereas the number of steps per displacement and the stepping rate both decreased. The rate of the 5.5-nm steps at various force levels produced a force-velocity curve of individual actomyosin motors. The force-velocity curve from the individual myosin heads was comparable to that reported in muscle, suggesting that the fundamental mechanical properties in muscle are basically due to the intrinsic stochastic nature of individual actomyosin motors. In order to explain multiple stochastic steps, we propose a model arguing that the thermally-driven step of a myosin head is biased in the forward direction by a potential slope along the actin helical pitch resulting from steric compatibility between the binding sites of actin and a myosin head. Furthermore, computer simulations show that multiple cooperating heads undergoing stochastic steps generate a long (>60 nm) sliding distance per ATP between actin and myosin filaments, i.e., the movement is loosely coupled to the ATPase cycle as observed in muscle.

  13. Change management in health care.

    PubMed

    Campbell, Robert James

    2008-01-01

This article introduces health care managers to the theories and philosophies of John Kotter and William Bridges, 2 leaders in the evolving field of change management. For Kotter, change has both an emotional and a situational component, and methods for managing each are expressed in his 8-step model (developing urgency, building a guiding team, creating a vision, communicating for buy-in, enabling action, creating short-term wins, not letting up, and making change stick). Bridges deals with change at a more granular, individual level, suggesting that change within a health care organization means that individuals must transition from one identity to a new identity when they are involved in a process of change. According to Bridges, transitions occur in 3 steps: endings, the neutral zone, and beginnings. The major steps and important concepts within each model are addressed, and examples are provided to demonstrate how health care managers can actualize the models within their health care organizations.

  14. Providing appropriate services to individuals in the community: a preliminary case-mix model for allocating personal care services.

    PubMed

    Phillips, Charles D; Dyer, James; Janousek, Vit; Halperin, Lisa; Hawes, Catherine

    2008-01-01

    Personal care services are often provided to clients in community settings through highly discretionary processes. Such processes provide little guidance for caseworkers concerning how public resources should be allocated. The results of such processes almost guarantee that individuals with very similar needs will receive very different levels of care resources. Such disparities in treatment open the door to inequity and ineffectiveness. One way to address this problem is through case-mix classification systems that allocate hours of care according to client needs. This paper outlines the preliminary steps taken by one state in its movement toward such a system.

  15. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Here, "process" means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.

  16. Mechanism of protein splicing of the Pyrococcus abyssi lon protease intein

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, Kevin M.; Schufreider, Ann K.; McGill, Melissa A.

    2010-12-17

Research highlights: • The Pyrococcus abyssi lon protease intein promotes efficient protein splicing. • Inteins with mutations that interfere with individual steps of splicing do not promote unproductive side reactions. • The intein splices with Lys in place of the highly conserved penultimate His. • The intein is flanked by a Gly-rich region at its C terminus that may increase the efficiency of the third step of splicing, Asn cyclization coupled to peptide bond cleavage. -- Abstract: Protein splicing is a post-translational process by which an intervening polypeptide, the intein, excises itself from the flanking polypeptides, the exteins, coupled to ligation of the exteins. The lon protease of Pyrococcus abyssi (Pab) is interrupted by an intein. When over-expressed as a fusion protein in Escherichia coli, the Pab lon protease intein can promote efficient protein splicing. Mutations that block individual steps of splicing generally do not lead to unproductive side reactions, suggesting that the intein tightly coordinates the splicing process. The intein can splice, although it has Lys in place of the highly conserved penultimate His, and mutants of the intein in the C-terminal region lead to the accumulation of stable branched-ester intermediate.

  17. Coastal Algorithms and On-Demand Processing - The Lessons Learnt from CoastColour for Sentinel 3

    NASA Astrophysics Data System (ADS)

    Brockmann, Carsten; Doerffer, Roland; Boettcher, Martin; Kramer, Uwe; Zuhlke, Marco; Pinnock, Simon

    2015-12-01

The ESA DUE CoastColour Project was initiated to provide water quality products for important coastal zones globally. A new five-component bio-optical model was developed and used in a 3-step approach for regional processing of ocean colour data. The L1P step consists of radiometric and geometric system corrections and top-of-atmosphere pixel classification, including cloud screening, sun glint risk masking and detection of floating vegetation. The second step includes the atmospheric correction and provides the L2R product, which comprises marine reflectances with error characterisation and normalisation. The third step is the in-water processing, which produces IOPs, the attenuation coefficient and water constituent concentrations. Each of these steps will benefit from the additional bands on OLCI. The five-component bio-optical model will already be used in the standard ESA processing of OLCI, and part of the pixel classification methods will also be part of the standard products. Other algorithm adaptations are in preparation. Another important advantage of the CoastColour approach is the highly configurable processing chain, which allows adaptation to the individual characteristics of the area of interest, temporal window, algorithm parametrisation and processing chain configuration. This flexibility is made available to data users through the CoastColour on-demand processing service. The complete global MERIS Full and Reduced Resolution data archive is accessible, covering the time range from 17 May 2002 until 8 April 2012, which is almost 200 TB of input data available online. The CoastColour on-demand processing service can serve as a model for hosted processing, where the software is moved to the data instead of moving the data to the users, which will be a challenge with the large amount of data coming from Sentinel 3.
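
    The three processing steps above (L1P, L2R, in-water) can be sketched as a chained pipeline. This is a purely structural illustration: the function names, dictionary keys, and the placeholder arithmetic are assumptions, not the actual radiative-transfer or bio-optical algorithms.

```python
def l1p_step(l1b_product):
    """Step 1 (L1P): system corrections plus top-of-atmosphere pixel
    classification (cloud, sun-glint risk, floating vegetation)."""
    out = dict(l1b_product)
    out["flags"] = {"cloud": False, "glint_risk": False, "floating_veg": False}
    return out

def l2r_step(l1p_product):
    """Step 2 (L2R): atmospheric correction yielding marine reflectances.
    The 0.9 factor is a placeholder, not a real correction."""
    out = dict(l1p_product)
    out["marine_reflectance"] = [r * 0.9 for r in out["toa_radiance"]]
    return out

def l2w_step(l2r_product):
    """Step 3: in-water processing to IOPs / concentrations (placeholder
    retrieval: a band average stands in for the bio-optical inversion)."""
    out = dict(l2r_product)
    out["chl"] = sum(out["marine_reflectance"]) / len(out["marine_reflectance"])
    return out

# Chained exactly as in the 3-step approach described above
product = l2w_step(l2r_step(l1p_step({"toa_radiance": [0.1, 0.2, 0.3]})))
```

    Keeping each step a pure function over a product dictionary is what makes the chain configurable per region, as the on-demand service described above requires.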

  18. Enhancement of soil retention for phenanthrene in binary cationic gemini and nonionic surfactant mixtures: characterizing two-step adsorption and partition processes through experimental and modeling approaches.

    PubMed

    Zhao, Shan; Huang, Gordon; An, Chunjiang; Wei, Jia; Yao, Yao

    2015-04-09

The enhancement of soil retention for phenanthrene (PHE) through the addition of a binary mixture of cationic gemini (12-2-12) and nonionic surfactants (C12E10) was investigated. The maximum apparent sorption coefficient Kd(*) reached 4247.8 mL/g through the addition of mixed 12-2-12 gemini and C12E10 surfactants, which was markedly higher than the sum of the results in the presence of individual 12-2-12 gemini (1148.6 mL/g) or C12E10 (210.0 mL/g) surfactant. However, the sorption of 12-2-12 gemini was inhibited by increasing C12E10 dose, and a higher initial 12-2-12 gemini dose showed a higher "desorption" rate. The present study also addressed the sorption behavior of the single 12-2-12 gemini surfactant at the soil/aqueous interface. The sorption isotherm was divided into two steps to elucidate the sorption process, and sorption schematics were proposed to elaborate the growth of surfactant aggregates corresponding to the various steps of the sorption isotherm. Finally, a two-step adsorption and partition model (TAPM) was developed to simulate the sorption process. Analysis of the equilibrium data indicated that the sorption isotherms of 12-2-12 gemini were better fitted by the TAPM. Thermodynamic calculations confirmed that the 12-2-12 gemini sorption at the soil/aqueous interface was spontaneous and exothermic from 288 to 308 K. Copyright © 2014 Elsevier B.V. All rights reserved.
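
    The apparent sorption coefficient used above is a simple ratio, and the synergy claim can be checked directly against the numbers reported in the abstract. The function and variable names below are illustrative; the Kd* values are those stated in the abstract.

```python
def apparent_kd(sorbed_ug_per_g, aqueous_ug_per_ml):
    """Apparent sorption coefficient Kd* (mL/g): soil-phase concentration
    divided by aqueous-phase concentration at equilibrium."""
    return sorbed_ug_per_g / aqueous_ug_per_ml

# Kd* values reported in the abstract (mL/g)
kd_mixed, kd_gemini, kd_c12e10 = 4247.8, 1148.6, 210.0

# Synergy: the binary mixture exceeds the sum of the single-surfactant values
synergistic = kd_mixed > kd_gemini + kd_c12e10
```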

  19. Chemical kinetic simulation of kerosene combustion in an individual flame tube.

    PubMed

    Zeng, Wen; Liang, Shuang; Li, Hai-Xia; Ma, Hong-An

    2014-05-01

The use of detailed chemical reaction mechanisms of kerosene is still very limited in analyzing the combustion process in the combustion chamber of an aircraft engine. In this work, a new reduced chemical kinetic mechanism for n-decane, which was selected as a surrogate fuel for kerosene, containing 210 elemental reactions (including 92 reversible and 26 irreversible reactions) and 50 species, was developed, and the ignition and combustion characteristics of this fuel in both a shock tube and a flat-flame burner were kinetically simulated using this reduced mechanism. The computed results were validated against experimental data: the calculated ignition delay times at pressures of 12 and 50 bar and equivalence ratios of 1.0 and 2.0, respectively, and the mole fractions of the main reactants and main products agree well with experimental data. The combustion processes in the individual flame tube of a heavy-duty gas turbine combustor were simulated by coupling this reduced mechanism for the surrogate fuel n-decane, and a one-step reaction mechanism for the surrogate fuel C12H23, into computational fluid dynamics software. The reduced mechanism showed clear advantages over the one-step mechanism in simulating the ignition and combustion processes in the individual flame tube.
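
    Shock-tube ignition-delay data like those used for validation above are conventionally correlated with an Arrhenius-type expression. The sketch below shows the standard form; the coefficients A, n, and Ea are illustrative assumptions, not a fit to the n-decane mechanism.

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol K)

def ignition_delay(temp_k, pressure_bar, a=1.0e-4, n=1.0, e_a=1.2e5):
    """Arrhenius-type correlation tau = A * p**-n * exp(Ea / (R*T)),
    the standard form for comparing shock-tube ignition-delay data.
    A, n and Ea here are illustrative placeholders."""
    return a * pressure_bar ** -n * math.exp(e_a / (R_GAS * temp_k))
```

    The functional form captures the qualitative trends in such data: delay decreases with both temperature and pressure.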

  20. Chemical kinetic simulation of kerosene combustion in an individual flame tube

    PubMed Central

    Zeng, Wen; Liang, Shuang; Li, Hai-xia; Ma, Hong-an

    2013-01-01

    The use of detailed chemical reaction mechanisms of kerosene is still very limited in analyzing the combustion process in the combustion chamber of the aircraft engine. In this work, a new reduced chemical kinetic mechanism for fuel n-decane, which selected as a surrogate fuel for kerosene, containing 210 elemental reactions (including 92 reversible reactions and 26 irreversible reactions) and 50 species was developed, and the ignition and combustion characteristics of this fuel in both shock tube and flat-flame burner were kinetic simulated using this reduced reaction mechanism. Moreover, the computed results were validated by experimental data. The calculated values of ignition delay times at pressures of 12, 50 bar and equivalence ratio is 1.0, 2.0, respectively, and the main reactants and main products mole fractions using this reduced reaction mechanism agree well with experimental data. The combustion processes in the individual flame tube of a heavy duty gas turbine combustor were simulated by coupling this reduced reaction mechanism of surrogate fuel n-decane and one step reaction mechanism of surrogate fuel C12H23 into the computational fluid dynamics software. It was found that this reduced reaction mechanism is shown clear advantages in simulating the ignition and combustion processes in the individual flame tube over the one step reaction mechanism. PMID:25685503

  1. Evaluation and treatment of students with difficulties passing the Step examinations.

    PubMed

    Laatsch, Linda

    2009-05-01

The author designed this retrospective case series study both to systematically examine characteristics of individuals referred for treatment after multiple failures on the United States Medical Licensing Examinations (USMLE) Step 1 or 2 administered by the National Board of Medical Examiners and to evaluate treatment effectiveness in a uniform sample. Six medical students referred to rehabilitation psychology met selection criteria. All students completed the requisite neuropsychological, academic, and psychological testing to identify cognitive and emotional strengths and weaknesses. All six underwent individualized cognitive rehabilitation (CR) with a primary focus on reading fluency and accuracy. All participants improved on a quantitative measure of reading speed and accuracy, and five of the six passed their next USMLE Step examination in spite of past failures. Medical students with identified difficulties in reading fluency, but no history of a learning disability, may benefit from systematic CR that addresses cognitive weaknesses related to test-taking abilities. The strong relationships between language and reading skills and the USMLE Step examinations suggest that some students may fail these examinations because of a relative weakness in language processing and reading fluency that may prohibit their successful completion of the Step examinations.

  2. Does a microprocessor-controlled prosthetic knee affect stair ascent strategies in persons with transfemoral amputation?

    PubMed

    Aldridge Whitehead, Jennifer M; Wolf, Erik J; Scoville, Charles R; Wilken, Jason M

    2014-10-01

Stair ascent can be difficult for individuals with transfemoral amputation because of the loss of knee function. Most individuals with transfemoral amputation use either a step-to-step (nonreciprocal, advancing one stair at a time) or skip-step strategy (nonreciprocal, advancing two stairs at a time), rather than a step-over-step (reciprocal) strategy, because step-to-step and skip-step allow the leading intact limb to do the majority of work. A new microprocessor-controlled knee (Ottobock X2®) uses flexion/extension resistance to allow step-over-step stair ascent. We compared self-selected stair ascent strategies between conventional and X2® prosthetic knees, examined between-limb differences, and differentiated stair ascent mechanics between X2® users and individuals without amputation. We also determined which factors are associated with differences in knee position during initial contact and swing within X2® users. Fourteen individuals with transfemoral amputation participated in stair ascent sessions while using conventional and X2® knees. Ten individuals without amputation also completed a stair ascent session. Lower-extremity stair ascent joint angles, moments, and powers and ground reaction forces were calculated using inverse dynamics during self-selected strategy and cadence and controlled cadence using a step-over-step strategy. One individual with amputation self-selected a step-over-step strategy while using a conventional knee, while 10 individuals self-selected a step-over-step strategy while using X2® knees. Individuals with amputation used greater prosthetic knee flexion during initial contact (32.5°, p = 0.003) and swing (68.2°, p = 0.001) with higher intersubject variability while using X2® knees compared to conventional knees (initial contact: 1.6°, swing: 6.2°). The increased prosthetic knee flexion while using X2® knees normalized knee kinematics to individuals without amputation during swing (88.4°, p = 0.179) but not during initial contact (65.7°, p = 0.002). Prosthetic knee flexion during initial contact and swing was positively correlated with prosthetic limb hip power during pull-up (r = 0.641, p = 0.046) and push-up/early swing (r = 0.993, p < 0.001), respectively. Participants with transfemoral amputation were more likely to self-select a step-over-step strategy similar to individuals without amputation while using X2® knees than conventional prostheses. Additionally, the increased prosthetic knee flexion used with X2® knees placed large power demands on the hip during pull-up and push-up/early swing. A modified strategy that uses less knee flexion can be used to allow step-over-step ascent in individuals with less hip strength.

  3. Survey Methods for Educators: Selecting Samples and Administering Surveys (Part 2 of 3). REL 2016-160

    ERIC Educational Resources Information Center

    Pazzaglia, Angela M.; Stafford, Erin T.; Rodriguez, Sheila M.

    2016-01-01

    This guide describes a five-step collaborative process that educators can use with other educators, researchers, and content experts to write or adapt questions and develop surveys for education contexts. This process allows educators to leverage the expertise of individuals within and outside of their organization to ensure a high-quality survey…

  4. Survey Methods for Educators: Analysis and Reporting of Survey Data (Part 3 of 3). REL 2016-164

    ERIC Educational Resources Information Center

    Pazzaglia, Angela M.; Stafford, Erin T.; Rodriguez, Sheila M.

    2016-01-01

    This guide describes a five-step collaborative process that educators can use with other educators, researchers, and content experts to write or adapt questions and develop surveys for education contexts. This process allows educators to leverage the expertise of individuals within and outside of their organization to ensure a high-quality survey…

  5. The AskA Starter Kit: How To Build and Maintain Digital Reference Services.

    ERIC Educational Resources Information Center

    Lankes, R. David; Kasowitz, Abby S.

    This Starter Kit is designed to help organizations and individuals who wish to offer human-mediated information services via the Internet to users in the K-12 community. A six-step process is proposed for organizations to follow in creating an "AskA" service. This process addresses all aspects involved in building and maintaining an AskA…

  6. A multistep general theory of transition to addiction.

    PubMed

    Piazza, Pier Vincenzo; Deroche-Gamonet, Véronique

    2013-10-01

Several theories propose alternative explanations for drug addiction. We propose a general theory of transition to addiction that synthesizes knowledge generated in the field of addiction into a unitary explanatory frame. Transition to addiction results from a sequential three-step interaction between (1) individual vulnerability and (2) degree/amount of drug exposure. The first step, sporadic recreational drug use, is a learning process mediated by overactivation of neurobiological substrates of natural rewards that allows most individuals to perceive drugs as highly rewarding stimuli. The second step, intensified, sustained, escalated drug use, occurs in some vulnerable individuals who have a hyperactive dopaminergic system and impaired prefrontal cortex function. Sustained and prolonged drug use induces incentive sensitization and an allostatic state that makes drugs strongly wanted and needed. Habit formation can also contribute to stabilizing sustained drug use. The last step, loss of control of drug intake and full addiction, is due to a second vulnerable phenotype. This loss-of-control-prone phenotype is triggered by long-term drug exposure and characterized by long-lasting loss of synaptic plasticity in reward areas in the brain that induces a form of behavioral crystallization resulting in loss of control of drug intake. Because of behavioral crystallization, drugs are now not only wanted and needed but also pathologically mourned when absent. This general theory demonstrates that drug addiction is a true psychiatric disease caused by a three-step interaction between vulnerable individuals and amount/duration of drug exposure.

  7. Individualizing drug dosage with longitudinal data.

    PubMed

    Zhu, Xiaolu; Qu, Annie

    2016-10-30

We propose a two-step procedure to personalize drug dosage over time under the framework of a log-linear mixed-effect model. We model patients' heterogeneity using subject-specific random effects, which are treated as the realizations of an unspecified stochastic process. We extend the conditional quadratic inference function to estimate both fixed-effect coefficients and individual random effects on a longitudinal training data sample in the first step and propose an adaptive procedure to estimate new patients' random effects and provide dosage recommendations for new patients in the second step. An advantage of our approach is that we do not impose any distribution assumption on estimating random effects. Moreover, the new approach can accommodate more general time-varying covariates corresponding to random effects. We show in theory and numerical studies that the proposed method is more efficient compared with existing approaches, especially when covariates are time varying. In addition, a real data example of a clozapine study confirms that our two-step procedure leads to more accurate drug dosage recommendations. Copyright © 2016 John Wiley & Sons, Ltd.
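
    The two-step structure above can be sketched with a toy log-linear model. This is an assumption-laden simplification: ordinary least squares stands in for the paper's conditional quadratic inference function, and the random effect is reduced to a single subject-specific intercept.

```python
import numpy as np

def fit_fixed_effects(dose, log_resp):
    """Step 1: population-level fit of log(response) = b0 + b1*dose on
    training data (OLS here; the paper uses a conditional quadratic
    inference function and makes no distributional assumption)."""
    X = np.column_stack([np.ones(len(dose)), dose])
    beta, *_ = np.linalg.lstsq(X, log_resp, rcond=None)
    return beta  # [intercept, dose slope]

def personal_offset(beta, dose_new, log_resp_new):
    """Step 2: estimate a new patient's random intercept from a few of
    their own observations (residual mean against the population fit)."""
    pred = beta[0] + beta[1] * np.asarray(dose_new)
    return float(np.mean(np.asarray(log_resp_new) - pred))

def recommend_dose(beta, b_i, target_log_resp):
    """Invert the personalized log-linear model for the target response."""
    return float((target_log_resp - beta[0] - b_i) / beta[1])
```

    The point of the second step is that only the patient-specific offset is re-estimated for a new patient; the population coefficients from the training sample are reused.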

  8. A step-by-step translation of evidence into a psychosocial intervention for everyday activities in dementia: a focus group study.

    PubMed

    Giebel, Clarissa M; Challis, David; Hooper, Nigel M; Ferris, Sally

    2018-03-01

    In order to increase the efficacy of psychosocial interventions in dementia, a step-by-step process translating evidence and public engagement should be adhered to. This paper describes such a process by involving a two-stage focus group with people with dementia (PwD), informal carers, and staff. Based on previous evidence, general aspects of effective interventions were drawn out. These were tested in the first stage of focus groups, one with informal carers and PwD and one with staff. Findings from this stage helped shape the intervention further specifying its content. In the second stage, participants were consulted about the detailed components. The extant evidence base and focus groups helped to identify six practical and situation-specific elements worthy of consideration in planning such an intervention, including underlying theory and personal motivations for participation. Carers, PwD, and staff highlighted the importance of rapport between practitioners and PwD prior to commencing the intervention. It was also considered important that the intervention would be personalised to each individual. This paper shows how valuable public involvement can be to intervention development, and outlines a process of public involvement for future intervention development. The next step would be to formally test the intervention.

  9. Pareto genealogies arising from a Poisson branching evolution model with selection.

    PubMed

    Huillet, Thierry E

    2014-02-01

We study a class of coalescents derived from a sampling procedure out of N i.i.d. Pareto(α) random variables, normalized by their sum, including β-size-biasing on total length effects (β < α). Depending on the range of α we derive the large N limit coalescents structure, leading either to a discrete-time Poisson-Dirichlet (α, -β) Ξ-coalescent (α ∈ [0, 1)), or to a family of continuous-time Beta(2 - α, α - β) Λ-coalescents (α ∈ [1, 2)), or to the Kingman coalescent (α ≥ 2). We indicate that this class of coalescent processes (and their scaling limits) may be viewed as the genealogical processes of some forward in time evolving branching population models including selection effects. In such constant-size population models, the reproduction step, which is based on a fitness-dependent Poisson Point Process with scaling power-law(α) intensity, is coupled to a selection step consisting of sorting out the N fittest individuals issued from the reproduction step.
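
    The underlying sampling step, N i.i.d. Pareto(α) variables normalized by their sum, is easy to sketch. The function name is illustrative; the heavy-tail behaviour it exposes (for small α one variable can carry a macroscopic share of the total) is what drives the multiple-merger Ξ- and Λ-coalescent limits described above.

```python
import random

def pareto_weights(n, alpha, seed=0):
    """Draw n i.i.d. Pareto(alpha) variables and normalize by their sum,
    giving the random offspring frequencies the coalescent is sampled
    from. For alpha < 2 the largest variable can dominate the sum."""
    rng = random.Random(seed)
    x = [rng.paretovariate(alpha) for _ in range(n)]
    s = sum(x)
    return [xi / s for xi in x]
```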

  10. Recruitment of new physicians, part II: the interview.

    PubMed

    Harolds, Jay A

    2013-06-01

    A careful, expertly done recruitment process is very important in having a successful group. Selecting a search committee, deciding what characteristics the group wants in a new person, evaluating the candidate's curriculum vitae, speaking to the individual on the phone or during a meeting, and calling references are important steps in selecting the top candidates for a group. The interview at the practice site is the next step, and it is critical. Many tips for planning and conducting a successful interview are given in this article.

  11. Comparison of pedometer and accelerometer derived steps in older individuals with Parkinson's disease or osteoporosis under free-living conditions.

    PubMed

    Wallén, Martin Benka; Dohrn, Ing-Mari; Ståhle, Agneta; Franzén, Erika; Hagströmer, Maria

    2014-10-01

    To compare self-reported pedometer steps with accelerometer steps under free-living conditions in individuals with Parkinson's disease (PD) or osteoporosis (OP). Seventy-three individuals with PD and 71 individuals with OP wore a pedometer (Yamax LS2000) and an accelerometer (ActiGraph GT1M/GT3X+) simultaneously for one week. Fifty-one individuals with PD (72.6 ± 5.3 years) and 61 with OP (75.6 ± 5.3 years) provided simultaneously recorded data for 3-7 consecutive days. Pedometer steps were significantly lower than accelerometer steps in the PD group (p = .002) but not in the OP group (p = .956). Bland-Altman plots demonstrated wide limits of agreement between the instruments in both PD (range = 6,911 steps) and OP (range = 6,794 steps). These results suggest that the ActiGraph GT1M/GT3X+ should be preferred over the Yamax LS2000 for the assessment of steps in both research and clinical evaluations, particularly in individuals with PD or altered gait.
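
    The Bland-Altman limits of agreement used above are the mean between-method difference plus or minus 1.96 standard deviations. A minimal sketch, with illustrative step counts rather than the study's data:

```python
import statistics

def bland_altman(method_a, method_b):
    """Mean difference and 95% limits of agreement (mean +/- 1.96 SD of
    the paired differences) between two step-count methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    mean_diff = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of paired differences
    return mean_diff, (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)
```

    A near-zero mean difference with wide limits, as reported for both groups above, means the instruments agree on average but not reliably for an individual day.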

  12. Planning that works: Empowerment through stakeholder focused interactive planning (SFIP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, J.E.; Ison, S.A.

    1994-12-31

This paper describes a powerful planning tool that can enable government, private industries, and public interest organizations to actualize their visions through sound decision making. The stakeholder focused interactive planning model is designed to integrate and ultimately gain stakeholder investment in the success of attainment of their vision. The only concessions required of the planning organization using this process are the acceptance of the premise that sustained vision success requires the support of both internal and external stakeholders, and that each step in the process must be used as a validation of the previous step and is essential to the completion of the next step. What is stakeholder/public involvement? It is the process in which the stakeholders' (both internal and external) values, interests, and expectations are included in decision-making processes. The primary goal of public involvement efforts is to include all those who have a stake in the decision, whether or not they have already been identified. Stakeholders are individuals, contractors, clients, suppliers, public organizations, state and local governments, Indian tribes, federal agencies, and other parties affected by decisions.

  13. Classification of processes involved in sharing individual participant data from clinical trials.

    PubMed

    Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena

    2018-01-01

Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing.

  14. Classification of processes involved in sharing individual participant data from clinical trials

    PubMed Central

    Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena

    2018-01-01

    Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing. PMID:29623192

  15. Reactions and Transport: Diffusion, Inertia, and Subdiffusion

    NASA Astrophysics Data System (ADS)

    Méndez, Vicenç; Fedotov, Sergei; Horsthemke, Werner

    Particles, such as molecules, atoms, or ions, and individuals, such as cells or animals, move in space driven by various forces or cues. In particular, particles or individuals can move randomly, undergo velocity jump processes or spatial jump processes [333]. The steps of the random walk can be independent or correlated, unbiased or biased. The probability density function (PDF) for the jump length can decay rapidly or exhibit a heavy tail. Similarly, the PDF for the waiting time between successive jumps can decay rapidly or exhibit a heavy tail. We will discuss these various possibilities in detail in Chap. 3. Below we provide an introduction to three transport processes: standard diffusion, transport with inertia, and anomalous diffusion.
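The alternatives sketched here, rapidly decaying versus heavy-tailed waiting-time PDFs, can be illustrated with a toy 1-D continuous-time random walk; the unit step length, the Pareto tail exponent, and all other parameters below are illustrative assumptions, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def ctrw_msd(waiting_time, n_walkers=2000, t_max=100.0):
    """Mean squared displacement of 1-D continuous-time random walkers at
    time t_max; unit steps (+1/-1) separated by random waiting times."""
    sq = 0.0
    for _ in range(n_walkers):
        t, x = 0.0, 0.0
        while True:
            t += waiting_time()
            if t > t_max:
                break
            x += 1.0 if rng.random() < 0.5 else -1.0
        sq += x * x
    return sq / n_walkers

# Exponential waiting times (finite mean) -> normal diffusion, MSD ~ t.
msd_normal = ctrw_msd(lambda: rng.exponential(1.0))

# Heavy-tailed (Pareto, tail exponent 0.7, infinite mean) -> subdiffusion.
msd_sub = ctrw_msd(lambda: 1.0 + rng.pareto(0.7))

print(msd_normal, msd_sub)
```

With the heavy tail, a few very long waiting periods dominate, so the walkers take far fewer steps by t_max and the MSD grows sublinearly in time.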

  16. The effect of external forces on discrete motion within holographic optical tweezers.

    PubMed

    Eriksson, E; Keen, S; Leach, J; Goksör, M; Padgett, M J

    2007-12-24

    Holographic optical tweezers is a widely used technique to manipulate the individual positions of optically trapped micron-sized particles in a sample. The trap positions are changed by updating the holographic image displayed on a spatial light modulator. The updating process takes a finite time, resulting in a temporary decrease of the intensity, and thus the stiffness, of the optical trap. We have investigated this change in trap stiffness during the updating process by studying the motion of an optically trapped particle in a fluid flow. We found a highly nonlinear behavior of the change in trap stiffness vs. changes in step size. For step sizes up to approximately 300 nm the trap stiffness is decreasing. Above 300 nm the change in trap stiffness remains constant for all step sizes up to one particle radius. This information is crucial for optical force measurements using holographic optical tweezers.
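The abstract does not say how trap stiffness was extracted; one standard calibration, the equipartition method, recovers stiffness from the variance of the bead's thermal position fluctuations via kappa = kB*T / <x^2>. A minimal sketch with assumed, illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

kB_T = 4.11e-21       # thermal energy at room temperature, J
kappa_true = 1.0e-6   # assumed trap stiffness, N/m (illustrative)

# In equilibrium, the bead position in a harmonic trap is Gaussian with
# variance kB*T / kappa (equipartition theorem). Simulate such "data":
x = rng.normal(0.0, np.sqrt(kB_T / kappa_true), size=200_000)

# Recover the stiffness from the measured positional variance.
kappa_est = kB_T / np.var(x)
print(kappa_est)
```

A drop in trap intensity during the hologram update would show up directly as a larger positional variance, i.e. a smaller inferred kappa.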

  17. The meaning of suffering in drug addiction and recovery from the perspective of existentialism, Buddhism and the 12-Step program.

    PubMed

    Chen, Gila

    2010-09-01

    The aim of the current article was to examine the meaning of suffering in drug addiction and in the recovery process. Negative emotions may cause primary suffering that can drive an individual toward substance abuse. At the same time, drugs only provide temporary relief, and over time, the pathological effects of the addiction worsen causing secondary suffering, which is a motivation for treatment. The 12-Step program offers a practical way to cope with suffering through a process of surrender. The act of surrender sets in motion a conversion experience, which involves a self-change including reorganization of one's identity and meaning in life. This article is another step toward understanding one of the several factors that contribute to the addict's motivation for treatment. This knowledge may be helpful for tailoring treatment that addresses suffering as a factor that initiates treatment motivation and, in turn, treatment success.

  18. Implementing Competency-Based Medical Education in a Postgraduate Family Medicine Residency Training Program: A Stepwise Approach, Facilitating Factors, and Processes or Steps That Would Have Been Helpful.

    PubMed

    Schultz, Karen; Griffiths, Jane

    2016-05-01

    In 2009-2010, the postgraduate residency training program at the Department of Family Medicine, Queen's University, wrestled with the practicalities of competency-based medical education (CBME) implementation when its accrediting body, the College of Family Physicians of Canada, introduced the competency-based Triple C curriculum. The authors used a stepwise approach to implement CBME; the steps were to (1) identify objectives, (2) identify competencies, (3) map objectives and competencies to learning experiences and assessment processes, (4) plan learning experiences, (5) develop an assessment system, (6) collect and interpret data, (7) adjust individual residents' training programs, and (8) distribute decisions to stakeholders. The authors also note overarching processes, costs, and facilitating factors and processes or steps that would have been helpful for CBME implementation. Early outcomes are encouraging. Residents are being directly observed more often with increased documented feedback about performance based on explicit competency standards (24,000 data points for 150 residents from 2013 to 2015). These multiple observations are being collated in a way that is allowing the identification of patterns of performance, red flags, and competency development trajectory. Outliers are being identified earlier, resulting in earlier individualized modification of their residency training program. The authors will continue to provide and refine faculty development, are developing an entrustable professional activity field note app for handheld devices, and are undertaking research to explore what facilitates learners' competency development, what increases assessors' confidence in making competence decisions, and whether residents are better trained as a result of CBME implementation.

  19. Automatic localization of landmark sets in head CT images with regression forests for image registration initialization

    NASA Astrophysics Data System (ADS)

    Zhang, Dongqing; Liu, Yuan; Noble, Jack H.; Dawant, Benoit M.

    2016-03-01

    Cochlear Implants (CIs) are electrode arrays that are surgically inserted into the cochlea. Individual contacts stimulate frequency-mapped nerve endings thus replacing the natural electro-mechanical transduction mechanism. CIs are programmed post-operatively by audiologists but this is currently done using behavioral tests without imaging information that permits relating electrode position to inner ear anatomy. We have recently developed a series of image processing steps that permit the segmentation of the inner ear anatomy and the localization of individual contacts. We have proposed a new programming strategy that uses this information and we have shown in a study with 68 participants that 78% of long term recipients preferred the programming parameters determined with this new strategy. A limiting factor to the large scale evaluation and deployment of our technique is the amount of user interaction still required in some of the steps used in our sequence of image processing algorithms. One such step is the rough registration of an atlas to target volumes prior to the use of automated intensity-based algorithms when the target volumes have very different fields of view and orientations. In this paper we propose a solution to this problem. It relies on a random forest-based approach to automatically localize a series of landmarks. Our results obtained from 83 images with 132 registration tasks show that automatic initialization of an intensity-based algorithm proves to be a reliable technique to replace the manual step.

  20. Practical Programming.

    ERIC Educational Resources Information Center

    Pipe, Peter

    Programed instruction causes the student to take an active role in the instructional process and stimulates interchange between student and teacher. Since it adjusts itself to individual differences in students' learning rates, it can have delegated to it some parts of a teacher's task. Characteristics of programed instruction are small steps,…

  1. Viral capsomere structure, surface processes and growth kinetics in the crystallization of macromolecular crystals visualized by in situ atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Malkin, A. J.; Kuznetsov, Yu. G.; McPherson, A.

    2001-11-01

    In situ atomic force microscopy (AFM) was used to investigate surface evolution during the growth of single crystals of turnip yellow mosaic virus (TYMV), cucumber mosaic virus (CMV) and glucose isomerase. Growth of these crystals proceeded by two-dimensional (2D) nucleation. For glucose isomerase, from supersaturation dependencies of tangential step rates and critical step length, the kinetic coefficients of the steps and the surface free energy of the step edge were calculated for different crystallographic directions. The molecular structure of the step edges, the adsorption of individual virus particles and their aggregates, and the initial stages of formation of 2D nuclei on the surfaces of TYMV and CMV crystals were recorded. The surfaces of individual TYMV virions within crystals were visualized, and hexameric and pentameric capsomers of the T=3 capsids were clearly resolved. This, so far as we are aware, is the first direct visualization of the capsomere structure of a virus by AFM. In the course of recording the in situ development of the TYMV crystals, a profound restructuring of the surface arrangement was observed. This transformation was highly cooperative in nature, but the transitions were unambiguous and readily explicable in terms of an organized loss of classes of virus particles from specific lattice positions.
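The kinetic coefficients mentioned above come from the linear step-growth law v = beta * Omega * (C - Ce), where v is the tangential step velocity, Omega the molecular volume, and C - Ce the solute excess. A sketch of how beta could be extracted from step-velocity data; all numbers below are made up, not measurements from the paper:

```python
import numpy as np

# Illustrative (not measured) data: step velocity vs. solute excess.
omega = 1.0e-25                                          # molecular volume, m^3 (assumed)
c_excess = np.array([0.5, 1.0, 1.5, 2.0, 2.5]) * 1e20    # C - Ce, molecules/m^3
v_step = np.array([1.1, 2.0, 3.1, 3.9, 5.0]) * 1e-9      # step velocity, m/s

# v = beta * omega * (C - Ce): least-squares slope through the origin.
slope = np.sum(v_step * c_excess) / np.sum(c_excess ** 2)
beta = slope / omega          # kinetic coefficient of the step, m/s
print(beta)
```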

  2. Simple Sample Processing Enhances Malaria Rapid Diagnostic Test Performance

    PubMed Central

    Davis, K. M.; Gibson, L. E.; Haselton, F. R.; Wright, D. W.

    2016-01-01

    Lateral flow immunochromatographic rapid diagnostic tests (RDTs) are the primary form of medical diagnostic used for malaria in underdeveloped nations. Unfortunately, many of these tests do not detect asymptomatic malaria carriers. In order for eradication of the disease to be achieved, this problem must be solved. In this study, we demonstrate enhancement in the performance of six RDT brands when a simple sample-processing step is added to the front of the diagnostic process. Greater than a 4-fold RDT signal enhancement was observed as a result of the sample processing step. This lowered the limit of detection for RDT brands to submicroscopic parasitemias. For the best performing RDTs the limits of detection were found to be as low as 3 parasites/μL. Finally, through individual donor samples, the correlations between donor source, WHO panel detection scores and RDT signal intensities were explored. PMID:24787948

  3. Simple sample processing enhances malaria rapid diagnostic test performance.

    PubMed

    Davis, K M; Gibson, L E; Haselton, F R; Wright, D W

    2014-06-21

    Lateral flow immunochromatographic rapid diagnostic tests (RDTs) are the primary form of medical diagnostic used for malaria in underdeveloped nations. Unfortunately, many of these tests do not detect asymptomatic malaria carriers. In order for eradication of the disease to be achieved, this problem must be solved. In this study, we demonstrate enhancement in the performance of six RDT brands when a simple sample-processing step is added to the front of the diagnostic process. Greater than a 4-fold RDT signal enhancement was observed as a result of the sample processing step. This lowered the limit of detection for RDT brands to submicroscopic parasitemias. For the best performing RDTs the limits of detection were found to be as low as 3 parasites per μL. Finally, through individual donor samples, the correlations between donor source, WHO panel detection scores and RDT signal intensities were explored.

  4. Motors and Their Tethers: The Role of Secondary Binding Sites in Processive Motility

    PubMed Central

    Kincaid, Margaret M.; King, Stephen J.

    2007-01-01

    Cytoskeletal motors convert the energy from binding and hydrolyzing ATP into conformational changes that direct movement along a cytoskeletal polymer substrate. These enzymes utilize different mechanisms to generate long-range motion on the order of a micron or more that is required for functions ranging from muscle contraction to transport of growth factors along a nerve axon. Several of the individual cytoskeletal motors are processive, meaning that they have the ability to take sequential steps along their polymer substrate without dissociating from the polymer. This ability to maintain contact with the polymer allows individual motors to move cargos quickly from one cellular location to another. Many of the processive motors have now been found to utilize secondary binding sites that aid in motor processivity. PMID:17172850
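Processivity as described here is often modeled as a fixed per-step detachment probability, so the number of steps before release is geometrically distributed; a secondary binding site can be pictured as lowering that probability. A toy simulation, with all numbers assumed rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_run_length(p_detach, step_nm=8.0, n_motors=5000):
    """Average distance travelled before detachment, assuming the motor
    detaches independently after each step with probability p_detach."""
    steps = rng.geometric(p_detach, size=n_motors)  # steps until detachment
    return step_nm * steps.mean()

run_plain  = mean_run_length(p_detach=0.01)    # motor alone (assumed rate)
run_tether = mean_run_length(p_detach=0.002)   # with secondary site (assumed rate)
print(run_plain, run_tether)
```

Since the geometric mean is 1/p, lowering the detachment probability five-fold lengthens the mean run roughly five-fold, which is the qualitative effect attributed to tethering.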

  5. A Neuro-Mechanical Model Explaining the Physiological Role of Fast and Slow Muscle Fibres at Stop and Start of Stepping of an Insect Leg

    PubMed Central

    Toth, Tibor Istvan; Grabowska, Martyna; Schmidt, Joachim; Büschges, Ansgar; Daun-Gruhn, Silvia

    2013-01-01

    Stop and start of stepping are two basic actions of the musculo-skeletal system of a leg. Although they are basic phenomena, they require the coordinated activities of the leg muscles. However, little is known of the details of how these activities are generated by the interactions between the local neuronal networks controlling the fast and slow muscle fibres at the individual leg joints. In the present work, we aim at uncovering some of those details using a suitable neuro-mechanical model. It is an extension of the model in the accompanying paper and now includes all three antagonistic muscle pairs of the main joints of an insect leg, together with their dedicated neuronal control, as well as common inhibitory motoneurons and the residual stiffness of the slow muscles. This model enabled us to study putative processes of intra-leg coordination during stop and start of stepping. We also made use of the effects of sensory signals encoding the position and velocity of the leg joints. Where experimental observations are available, the corresponding simulation results are in good agreement with them. Our model makes detailed predictions as to the coordination processes of the individual muscle systems both at stop and start of stepping. In particular, it reveals a possible role of the slow muscle fibres at stop in accelerating the convergence of the leg to its steady-state position. These findings lend our model physiological relevance and can therefore be used to elucidate details of the stop and start of stepping in insects, and perhaps in other animals, too. PMID:24278108

  6. Wheat mill stream properties for discrete element method modeling

    USDA-ARS?s Scientific Manuscript database

    A discrete phase approach based on individual wheat kernel characteristics is needed to overcome the limitations of previous statistical models and accurately predict the milling behavior of wheat. As a first step to develop a discrete element method (DEM) model for the wheat milling process, this s...

  7. The Role of Emotions in Employee Creativity.

    ERIC Educational Resources Information Center

    Higgins, Lexis F.; And Others

    1992-01-01

    This paper examines research on influences of emotions on creativity, describes how feelings impact an individual's ability and willingness to function creatively, and discusses the implications for management of creativity in the employment setting. A four-step model of the creative process is discussed, and two sources (proximal and distal) of…

  8. Nonviolent Communication: A Humanizing Ecclesial and Educational Practice

    ERIC Educational Resources Information Center

    Latini, Theresa F.

    2009-01-01

    This article presents Nonviolent Communication (NVC) as a humanizing ecclesial and educational practice. NVC is a four-step process of communication designed to facilitate empathy and honesty between individuals and groups. Through an interdisciplinary dialogue with Reformed theology, this article argues that NVC is one concrete means of living as…

  9. Marketing: A Definition for Community Colleges.

    ERIC Educational Resources Information Center

    Kotler, Philip; Goldgehn, Leslie A.

    1981-01-01

    Defines marketing and discusses the eight steps of the marketing process. Emphasizes the necessity of having one individual or committee responsible for coordinating marketing functions. Notes that marketing's success depends on its acceptance by all levels of the institution. Lists the benefits of implementing marketing in a community college.…

  10. A Process-Centered Tool for Evaluating Patient Safety Performance and Guiding Strategic Improvement

    DTIC Science & Technology

    2005-01-01

    next patient safety steps in individual health care organizations. The low priority given to Category 3 (Focus on patients, other customers, and...presents a patient safety applicator tool for implementing and assessing patient safety systems in health care institutions. The applicator tool consists...the survey rounds. The study addressed three research questions: 1. What critical processes should be included in health care patient safety systems

  11. Phase 1 of the automated array assembly task of the low cost silicon solar array project

    NASA Technical Reports Server (NTRS)

    Coleman, M. G.; Pryor, R. A.; Grenon, L. A.; Lesk, I. A.

    1977-01-01

    The state of technology readiness for the automated production of solar cells and modules is reviewed. Individual process steps and process sequences for making solar cells and modules were evaluated both technically and economically. High efficiency with a suggested cell goal of 15% was stressed. It is concluded that the technology exists to manufacture solar cells which will meet program goals.

  12. Monetary reward processing in obese individuals with and without binge eating disorder.

    PubMed

    Balodis, Iris M; Kober, Hedy; Worhunsky, Patrick D; White, Marney A; Stevens, Michael C; Pearlson, Godfrey D; Sinha, Rajita; Grilo, Carlos M; Potenza, Marc N

    2013-05-01

    An important step in obesity research involves identifying neurobiological underpinnings of nonfood reward processing unique to specific subgroups of obese individuals. Nineteen obese individuals seeking treatment for binge eating disorder (BED) were compared with 19 non-BED obese individuals (OB) and 19 lean control subjects (LC) while performing a monetary reward/loss task that parses anticipatory and outcome components during functional magnetic resonance imaging. Differences in regional activation were investigated in BED, OB, and LC groups during reward/loss prospect, anticipation, and notification. Relative to the LC group, the OB group demonstrated increased ventral striatal and ventromedial prefrontal cortex activity during anticipatory phases. In contrast, the BED group relative to the OB group demonstrated diminished bilateral ventral striatal activity during anticipatory reward/loss processing. No differences were observed between the BED and LC groups in the ventral striatum. Heterogeneity exists among obese individuals with respect to the neural correlates of reward/loss processing. Neural differences in separable groups with obesity suggest that multiple, varying interventions might be important in optimizing prevention and treatment strategies for obesity. Copyright © 2013 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  13. A unified engineering model of the first stroke in downward negative lightning

    NASA Astrophysics Data System (ADS)

    Nag, Amitabh; Rakov, Vladimir A.

    2016-03-01

    Each stroke in a negative cloud-to-ground lightning flash is composed of downward leader and upward return stroke processes, which are usually modeled individually. The first stroke leader is stepped and starts with preliminary breakdown (PB) which is often viewed as a separate process. We present the first unified engineering model for computing the electric field produced by a sequence of PB, stepped leader, and return stroke processes, serving to transport negative charge to ground. We assume that a negatively charged channel extends downward in a stepped fashion during both the PB and leader stages. Each step involves a current wave that propagates upward along the newly formed channel section. Once the leader attaches to ground, an upward propagating return stroke neutralizes the charge deposited along the channel. Model-predicted electric fields are in reasonably good agreement with simultaneous measurements at both near (hundreds of meters, electrostatic field component is dominant) and far (tens of kilometers, radiation field component is dominant) distances from the lightning channel. Relations between the features of computed electric field waveforms and model input parameters are examined. It appears that peak currents associated with PB pulses are similar to return stroke peak currents, and the observed variation of electric radiation field peaks produced by leader steps at different heights above ground is influenced by the ground corona space charge.
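The far-field (radiation) term that dominates at tens of kilometers can be sketched with the classic transmission-line model, E(t) = v * i(t - D/c) / (2*pi*eps0*c^2*D); the bi-exponential current waveform and the propagation speed v below are generic illustrative assumptions, not the authors' unified model:

```python
import numpy as np

eps0 = 8.854e-12   # vacuum permittivity, F/m
c = 3.0e8          # speed of light, m/s
v = 1.5e8          # assumed current-wave speed along the channel, m/s

def channel_base_current(t, i0=30e3, tau1=2e-6, tau2=50e-6):
    """Simple bi-exponential current waveform (illustrative parameters)."""
    return np.where(t < 0, 0.0, i0 * (np.exp(-t / tau2) - np.exp(-t / tau1)))

def radiation_field(t, D):
    """Magnitude of the transmission-line-model radiation field at ground
    distance D, evaluated at retarded time t - D/c."""
    return v * channel_base_current(t - D / c) / (2 * np.pi * eps0 * c ** 2 * D)

t = np.linspace(0, 200e-6, 4000)
E_50km = radiation_field(t, 50e3)   # field waveform 50 km from the channel
print(E_50km.max())
```

The 1/D fall-off of this term is why the radiation component dominates the measured field at far distances, while the electrostatic term dominates at hundreds of meters.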

  14. Performance assessment in algebra learning process

    NASA Astrophysics Data System (ADS)

    Lestariani, Ida; Sujadi, Imam; Pramudya, Ikrar

    2017-12-01

    The purpose of this research is to describe the implementation of performance assessment in the algebra learning process. The subject of this research is a mathematics educator of class X at SMAN 1 Ngawi. This is a descriptive qualitative study. Data were collected through observation, interviews, and documentation, and analysed through data reduction, data presentation, and conclusion drawing. The results indicate that the educator applies performance assessment in three steps: 1) preparing individual worksheets and group worksheets, 2) preparing assessment rubrics for the individual and group worksheets, and 3) applying the performance assessment rubric to learners' performance results on individual or group tasks.

  15. MO-D-213-01: Workflow Monitoring for a High Volume Radiation Oncology Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laub, S; Dunn, M; Galbreath, G

    2015-06-15

    Purpose: Implement a center-wide communication system that increases interdepartmental transparency and accountability while decreasing redundant work and treatment delays by actively monitoring treatment planning workflow. Methods: Intake Management System (IMS), a program developed by ProCure Treatment Centers Inc., is a multi-function database that stores treatment planning process information. It was devised to work with the oncology information system (Mosaiq) to streamline interdepartmental workflow. Each step in the treatment planning process is visually represented and timelines for completion of individual tasks are established within the software. The currently active step of each patient’s planning process is highlighted either red or green according to whether the initially allocated amount of time has passed for the given process. This information is displayed as a Treatment Planning Process Monitor (TPPM), which is shown on screens in the relevant departments throughout the center. This display also includes the individuals who are responsible for each task. IMS is driven by Mosaiq’s quality checklist (QCL) functionality. Each step in the workflow is initiated by a Mosaiq user sending the responsible party a QCL assignment. IMS is connected to Mosaiq and the sending or completing of a QCL updates the associated field in the TPPM to the appropriate status. Results: Approximately one patient a week is identified during the workflow process as needing to have his/her treatment start date modified or resources re-allocated to address the most urgent cases. Being able to identify a realistic timeline for planning each patient and having multiple departments communicate their limitations and time constraints allows for quality plans to be developed and implemented without overburdening any one department.
Conclusion: Monitoring the progression of the treatment planning process has increased transparency between departments, which enables efficient communication. Having built-in timelines allows easy prioritization of tasks and resources and facilitates effective time management.
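The red/green step-status logic described for the TPPM can be sketched as follows; all step names, owners, and time allocations are hypothetical, not taken from IMS:

```python
from dataclasses import dataclass

@dataclass
class PlanningStep:
    name: str
    owner: str              # responsible party shown on the monitor
    allocated_hours: float  # timeline established within the software
    elapsed_hours: float = 0.0
    done: bool = False

    @property
    def status(self) -> str:
        """Green while within the allocated time, red once it is exceeded."""
        return "green" if self.elapsed_hours <= self.allocated_hours else "red"

# Hypothetical workflow; names and hours are illustrative only.
workflow = [
    PlanningStep("CT simulation", "Sim therapist", 4.0, 3.0, done=True),
    PlanningStep("Contouring",    "Physician",     8.0, 10.0),
    PlanningStep("Dose planning", "Dosimetrist",  16.0),
]

# The monitor highlights the currently active (first unfinished) step.
active = next(s for s in workflow if not s.done)
print(active.name, active.status)
```

In the real system the `done`/`elapsed` state would be driven by Mosaiq QCL send/complete events rather than set by hand.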

  16. The dynamics of team cognition: A process-oriented theory of knowledge emergence in teams.

    PubMed

    Grand, James A; Braun, Michael T; Kuljanin, Goran; Kozlowski, Steve W J; Chao, Georgia T

    2016-10-01

    Team cognition has been identified as a critical component of team performance and decision-making. However, theory and research in this domain continues to remain largely static; articulation and examination of the dynamic processes through which collectively held knowledge emerges from the individual- to the team-level is lacking. To address this gap, we advance and systematically evaluate a process-oriented theory of team knowledge emergence. First, we summarize the core concepts and dynamic mechanisms that underlie team knowledge-building and represent our theory of team knowledge emergence (Step 1). We then translate this narrative theory into a formal computational model that provides an explicit specification of how these core concepts and mechanisms interact to produce emergent team knowledge (Step 2). The computational model is next instantiated into an agent-based simulation to explore how the key generative process mechanisms described in our theory contribute to improved knowledge emergence in teams (Step 3). Results from the simulations demonstrate that agent teams generate collectively shared knowledge more effectively when members are capable of processing information more efficiently and when teams follow communication strategies that promote equal rates of information sharing across members. Lastly, we conduct an empirical experiment with real teams participating in a collective knowledge-building task to verify that promoting these processes in human teams also leads to improved team knowledge emergence (Step 4). Discussion focuses on implications of the theory for examining team cognition processes and dynamics as well as directions for future research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
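A toy agent-based version of the simulation result, that equal sharing rates beat communication dominated by one member, might look like the following; the team size, item counts, and broadcast rule are assumptions for illustration, not the authors' computational model:

```python
import numpy as np

rng = np.random.default_rng(3)

def avg_rounds_to_shared(speak_prob, n_items=30, n_trials=20, max_rounds=5000):
    """Toy knowledge-building team: each agent starts with a private share of
    n_items; each round one agent (chosen with probability speak_prob) broadcasts
    one item it knows. Returns the mean rounds until all knowledge is shared."""
    n_agents = len(speak_prob)
    results = []
    for _ in range(n_trials):
        know = [set(range(i * n_items // n_agents, (i + 1) * n_items // n_agents))
                for i in range(n_agents)]
        for r in range(1, max_rounds + 1):
            speaker = rng.choice(n_agents, p=speak_prob)
            item = rng.choice(sorted(know[speaker]))  # broadcast a known item
            for s in know:
                s.add(item)
            if all(len(s) == n_items for s in know):
                break
        results.append(r)
    return float(np.mean(results))

equal     = avg_rounds_to_shared([1/3, 1/3, 1/3])    # equal sharing rates
dominated = avg_rounds_to_shared([0.9, 0.05, 0.05])  # one member dominates
print(equal, dominated)
```

The dominated team is slow because the quiet members' private items are rarely broadcast, mirroring the paper's finding that equal rates of information sharing improve knowledge emergence.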

  17. Quantifying multi-dimensional attributes of human activities at various geographic scales based on smartphone tracking.

    PubMed

    Zhou, Xiaolu; Li, Dongying

    2018-05-09

    Advancements in location-aware technologies and in information and communication technology over the past decades have furthered our knowledge of the interaction between human activities and the built environment. An increasing number of studies have collected data regarding individual activities to better understand how the environment shapes human behavior. Despite this growing interest, some challenges exist in collecting and processing individuals' activity data, e.g., capturing people's precise environmental contexts and analyzing data at multiple spatial scales. In this study, we propose and implement an innovative system that integrates smartphone-based step tracking with an app and the sequential tile scan techniques to collect and process activity data. We apply the OpenStreetMap tile system to aggregate positioning points at various scales. We also propose duration, step and probability surfaces to quantify the multi-dimensional attributes of activities. Results show that, by running the app in the background, smartphones can measure multi-dimensional attributes of human activities, including space, duration, step, and location uncertainty at various spatial scales. By coordinating the Global Positioning System (GPS) sensor with the accelerometer, the app conserves battery power that would otherwise be drained quickly by the GPS sensor. Based on a test dataset, we were able to detect the recreational center and sports center as the space where the user was most active, among other places visited. The methods provide techniques to address key issues in analyzing human activity data. The system can support future studies on behavioral and health consequences related to individuals' environmental exposure.
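The OpenStreetMap tile aggregation step uses the standard "slippy map" mapping from (latitude, longitude, zoom) to integer tile indices, so positioning points that fall in the same tile can be counted together at any chosen scale. The sample fixes below are illustrative:

```python
import math
from collections import Counter

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Standard OpenStreetMap 'slippy map' tile indices for a coordinate."""
    lat = math.radians(lat_deg)
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    # asinh(tan(lat)) equals ln(tan(lat) + sec(lat)) in the usual formula.
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

# Aggregate raw positioning fixes into tiles at a chosen scale (zoom 14);
# nearby fixes typically land in the same tile and are counted together.
points = [(31.05, -81.45), (31.051, -81.449), (31.2, -81.5)]  # illustrative fixes
counts = Counter(latlon_to_tile(lat, lon, 14) for lat, lon in points)
print(counts)
```

Lowering the zoom level coarsens the tiles, which is how the same point set can be summarized at multiple spatial scales.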

  18. Comparing Acceptance and Commitment Group Therapy and 12-Steps Narcotics Anonymous in Addict's Rehabilitation Process: A Randomized Controlled Trial.

    PubMed

    Azkhosh, Manoochehr; Farhoudianm, Ali; Saadati, Hemn; Shoaee, Fateme; Lashani, Leila

    2016-10-01

    Objective: Substance abuse is a socio-psychological disorder. The aim of this study was to compare the effectiveness of acceptance and commitment therapy with 12-steps Narcotics Anonymous on psychological well-being of opiate-dependent individuals in addiction treatment centers in Shiraz, Iran. Method: This was a randomized controlled trial. Data were collected at entry into the study and at post-test and follow-up visits. The participants were selected from opiate-addicted individuals who were referred to addiction treatment centers in Shiraz. Sixty individuals were evaluated according to inclusion/exclusion criteria and were divided into three equal groups randomly (20 participants per group). One group received acceptance and commitment group therapy (twelve 90-minute sessions), the second group was provided with the 12-steps Narcotics Anonymous program, and the control group received the usual methadone maintenance treatment. During the treatment process, seven participants dropped out. Data were collected using the psychological well-being questionnaire and AAQ questionnaire in the three groups at pre-test, post-test and follow-up visits. Data were analyzed using repeated measures analysis of variance. Results: Repeated measures analysis of variance revealed that the mean difference between the three groups was significant (P<0.05) and that the acceptance and commitment therapy group showed improvement relative to the NA and control groups on psychological well-being and psychological flexibility. Conclusion: The results of this study revealed that acceptance and commitment therapy can be helpful in enhancing positive emotions and increasing psychological well-being of addicts who seek treatment.

  19. Effects of coarse-graining on fluctuations in gene expression

    NASA Astrophysics Data System (ADS)

    Pedraza, Juan; Paulsson, Johan

    2008-03-01

    Many cellular components are present in such low numbers per cell that random births and deaths of individual molecules can cause significant 'noise' in concentrations. But biochemical events do not necessarily occur in steps of individual molecules. Some processes are greatly randomized when synthesis or degradation occurs in large bursts of many molecules in a short time interval. Conversely, each birth or death of a macromolecule could involve several small steps, creating a memory between individual events. Here we present generalized theory for stochastic gene expression, formulating the variance in protein abundance in terms of the randomness of the individual events, and discuss the effective coarse-graining of the molecular hardware. We show that common molecular mechanisms produce gestation and senescence periods that can reduce noise without changing average abundances, lifetimes, or any concentration-dependent control loops. We also show that single-cell experimental methods that are now commonplace in cell biology do not discriminate between qualitatively different stochastic principles, but that this in turn makes them better suited for identifying which components introduce fluctuations.
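The randomizing effect of bursty synthesis can be seen in a minimal Gillespie simulation of production and first-order degradation; the rates, burst-size distribution, and sampling scheme below are illustrative assumptions, not the paper's generalized theory:

```python
import numpy as np

rng = np.random.default_rng(4)

def fano_factor(burst_mean, k_prod=1.0, k_deg=0.01, t_end=20000.0):
    """Gillespie simulation: production events arrive at rate k_prod and add a
    geometric burst of molecules; each molecule decays at rate k_deg.
    Returns variance/mean (Fano factor) of the sampled copy number."""
    t, n = 0.0, 0
    next_sample, samples = 2000.0, []   # skip the initial transient
    while t < t_end:
        r_prod, r_deg = k_prod, k_deg * n
        total = r_prod + r_deg
        t += rng.exponential(1.0 / total)
        while next_sample < t and next_sample < t_end:
            samples.append(n)           # state is constant between events
            next_sample += 10.0
        if rng.random() < r_prod / total:
            n += rng.geometric(1.0 / burst_mean)  # burst of new molecules
        else:
            n -= 1
    s = np.array(samples, dtype=float)
    return s.var() / s.mean()

fano_single = fano_factor(burst_mean=1.0)   # one molecule per event: Poisson-like
fano_burst  = fano_factor(burst_mean=10.0)  # bursts of ~10 molecules
print(fano_single, fano_burst)
```

Single-molecule production gives a Fano factor near 1, while bursts inflate it well above 1 without changing the relationship between rate constants and lifetimes, the kind of coarse-graining effect the abstract describes.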

  20. Artificial neural networks to model formulation-property correlations in the process of inline-compounding on an injection moulding machine

    NASA Astrophysics Data System (ADS)

    Moritzer, Elmar; Müller, Ellen; Martin, Yannick; Kleeschulte, Rainer

    2015-05-01

    Today the global market poses great challenges for industrial product development. Complexity, diversity of variants, flexibility and individuality are just some of the features that products have to offer today. In addition, the product series have shorter lifetimes. Because of their high capacity for adaption, polymers are increasingly able to displace traditional materials such as wood, glass and metals from various fields of application. Polymers can only be used to substitute other materials, however, if they are optimally suited to the applications in question. Hence, product-specific material development is becoming increasingly important. Integrating the compounding step in the injection moulding process permits a more efficient and faster development process for a new polymer formulation, making it possible to create new product-specific materials. This process is called inline-compounding on an injection moulding machine. The entire process sequence is supported by software from Bayer Technology called Product Design Workbench (PDWB), which provides assistance in all the individual steps from data management, via analysis and model compilation, right through to the optimization of the formulation and the design of experiments. The software is based on artificial neural networks and can model the formulation-property correlations and thus enable different formulations to be optimized. In the study presented, the workflow and the modelling with the software are presented.
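PDWB itself is proprietary, but the underlying idea, fitting a small feed-forward network to formulation-property data, can be sketched in plain NumPy on synthetic data; the network size, learning rate, and target function are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "formulation -> property" data: two ingredient fractions mapped
# to one property by an unknown smooth function (illustrative only).
X = rng.uniform(0, 1, size=(200, 2))
y = (np.sin(3 * X[:, 0]) + X[:, 1] ** 2)[:, None]

# One hidden tanh layer, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)        # loss before training

for _ in range(2000):
    h, pred = forward(X)
    err = (pred - y) / len(X)            # gradient of (1/2)*MSE w.r.t. pred
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = err @ W2.T * (1 - h ** 2)       # backprop through tanh
    gW1 = X.T @ dh;  gb1 = dh.sum(0)
    W2 -= lr * gW2;  b2 -= lr * gb2
    W1 -= lr * gW1;  b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)
print(loss0, loss)
```

Once such a surrogate model is trained, candidate formulations can be screened against it far faster than by running new compounding trials, which is the optimization role the software plays in the described workflow.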

  1. Perforating Thin Metal Sheets

    NASA Technical Reports Server (NTRS)

    Davidson, M. E.

    1985-01-01

Sheets only a few mils thick bonded together, punched, then debonded. Three-step process yields perforated sheets of metal. (1): Individual sheets bonded together to form laminate. (2): Laminate perforated in desired geometric pattern. (3): After baking, laminate separates into individual sheets. Developed for fabricating conductive layer on blankets that collect and remove ions; however, perforated foils have other applications - as conductive surfaces on insulating materials; stiffeners and conductors in plastic laminates; reflectors in antenna dishes; supports for thermal blankets; lightweight grille cover materials; and material for mockup of components.

  2. The Processes Involved in Designing Software.

    DTIC Science & Technology

    1980-08-01

repeats itself at the next level, terminating with a plan whose individual steps can be executed to solve the initial problem. Hayes-Roth and Hayes-Roth...that the original design problem is decomposed into a collection of well-structured subproblems under the control of some type of executive process...given element to refine further, the schema is assumed to execute to completion, developing a solution model for that element and refining it into a

  3. New Tests to Measure Individual Differences in Matching and Labelling Facial Expressions of Emotion, and Their Association with Ability to Recognise Vocal Emotions and Facial Identity

    PubMed Central

    Palermo, Romina; O’Connor, Kirsty B.; Davis, Joshua M.; Irons, Jessica; McKone, Elinor

    2013-01-01

Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing “individual differences” – that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach’s alphas of .77 and .76, respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity). PMID:23840821

  4. New tests to measure individual differences in matching and labelling facial expressions of emotion, and their association with ability to recognise vocal emotions and facial identity.

    PubMed

    Palermo, Romina; O'Connor, Kirsty B; Davis, Joshua M; Irons, Jessica; McKone, Elinor

    2013-01-01

Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing "individual differences"--that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach's alphas of .77 and .76, respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity).

  5. Reduced step length reduces knee joint contact forces during running following anterior cruciate ligament reconstruction but does not alter inter-limb asymmetry.

    PubMed

    Bowersock, Collin D; Willy, Richard W; DeVita, Paul; Willson, John D

    2017-03-01

Anterior cruciate ligament reconstruction is associated with early onset knee osteoarthritis. Running is a typical activity following this surgery, but elevated knee joint contact forces are thought to contribute to osteoarthritis degenerative processes. It is therefore clinically relevant to identify interventions to reduce contact forces during running among individuals after anterior cruciate ligament reconstruction. The primary purpose of this study was to evaluate the effect of reducing step length during running on patellofemoral and tibiofemoral joint contact forces among people with a history of anterior cruciate ligament reconstruction. Inter-limb knee joint contact force differences during running were also examined. 18 individuals at an average of 54.8 months after unilateral anterior cruciate ligament reconstruction ran in 3 step length conditions (preferred, -5%, -10%). Bilateral patellofemoral, tibiofemoral, and medial tibiofemoral compartment peak force, loading rate, impulse, and impulse per kilometer were evaluated between step length conditions and limbs using separate 2 factor analyses of variance. Reducing step length 5% decreased patellofemoral, tibiofemoral, and medial tibiofemoral compartment peak force, impulse, and impulse per kilometer bilaterally. A 10% step length reduction further decreased peak forces and force impulses, but did not further reduce force impulses per kilometer. Tibiofemoral joint impulse, impulse per kilometer, and patellofemoral joint loading rate were lower in the previously injured limb compared to the contralateral limb. Running with a shorter step length is a feasible clinical intervention to reduce knee joint contact forces during running among people with a history of anterior cruciate ligament reconstruction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Analyzing Learner Language: Towards a Flexible Natural Language Processing Architecture for Intelligent Language Tutors

    ERIC Educational Resources Information Center

    Amaral, Luiz; Meurers, Detmar; Ziai, Ramon

    2011-01-01

    Intelligent language tutoring systems (ILTS) typically analyze learner input to diagnose learner language properties and provide individualized feedback. Despite a long history of ILTS research, such systems are virtually absent from real-life foreign language teaching (FLT). Taking a step toward more closely linking ILTS research to real-life…

  7. Substrate-Related Factors Affecting Enzymatic Saccharification of Lignocelluloses: Our Recent Understanding

    Treesearch

    Shao-Yuan Leu; J.Y. Zhu

    2013-01-01

    Enzymatic saccharification of cellulose is a key step in conversion of plant biomass to advanced biofuel and chemicals. Many substrate-related factors affect saccharification. Rather than examining the role of each individual factor on overall saccharification efficiency, this study examined how each factor affects the three basic processes of a heterogeneous...

  8. Behavioral Talk-Write as a Method for Teaching Technical Editing.

    ERIC Educational Resources Information Center

    Gilbertsen, Michael; Killingsworth, M. Jimmie

    1987-01-01

Presents a process-oriented method for teachers of stylistic editing workshops that allows them to (1) focus on individual students, (2) start with students' basic repertory of responses and build from there, (3) work with freely emitted behavior, (4) ensure frequent and brief responses, and (5) achieve desired behavior through sequential steps.…

  9. Teaching Social Problem Solving to Individuals with Mental Retardation

    ERIC Educational Resources Information Center

    Crites, Steven A.; Dunn, Caroline

    2004-01-01

    The purpose of this study was to determine effectiveness of a problem-solving curriculum for transition-age students with mental retardation. The interactive training program Solving Your Problems (Browning, n.d.) was used to teach a five-step process for solving problems. Results indicate participants in the training group were able to use the…

  10. Scholastic Audits. Research Brief

    ERIC Educational Resources Information Center

    Walker, Karen

    2009-01-01

What is a scholastic audit? The purpose of the audit is to help individual schools and districts improve. The focus is on gathering data and preparing recommendations that can be used to guide school improvement initiatives. Scholastic audits use a multi-step approach and include: (1) Preparing for the Audit; (2) Audit process; (3) Audit report;…

  11. SU-F-T-99: Data Visualization From a Treatment Planning Tracking System for Radiation Oncology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cline, K; Kabat, C; Li, Y

    2016-06-15

Purpose: A treatment planning process tracker database with input forms and a TV-viewable display webpage was developed and implemented in our clinic to collect time data points throughout the process. Tracking plan times is important because it directly affects the patient quality of care. Simply, the longer a patient waits after their initial simulation CT for treatment to begin, the more time the cancer has to progress. The tracker helps to drive workflow through the clinic, while the data collected can be used to understand and manage the process to find and eliminate inefficiencies. Methods: The overall process steps tracked are CT-simulation, mark patient, draw normal contours, draw target volumes, create plan, and review/approve plan. Time stamps for task completion were extracted and used to generate a set of clinic metrics, among which include average time for each step in the process split apart by type of treatment, average time to completion for plans started in a given week, and individual overall completion time per plan. Results: Trends have been tracked for fourteen weeks of clinical data (196 plans). On average, drawing normal contours and target volumes is taking 2–5 times as long as creating the plan itself. This is potentially an issue because it could mean the process is taking too long initially, and it could be forcing the planning step to be done in a short amount of time. We also saw from our graphs that there appears to be no clear trend on the average amount of time per plan week-to-week. Conclusion: A tracker of this type has the potential to provide insight into how time is utilized in our clinic. By equipping our dosimetrists, radiation oncologists, and physicists with individualized metric sets, the tracker can help provide visibility and drive workflow. Funded in part by CPRIT (RP140105).
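Metrics like these can be derived directly from the per-step completion time stamps; a minimal sketch (step names and the time format are illustrative, not the clinic's actual schema):

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical step names standing in for the tracked workflow stages.
STEPS = ["ct_sim", "mark", "normal_contours", "target_volumes", "plan", "approve"]

def step_durations(plan):
    """Hours spent in each step, taken as the gap between consecutive
    completion time stamps (plan = {step: 'YYYY-MM-DD HH:MM'})."""
    ts = [datetime.strptime(plan[s], "%Y-%m-%d %H:%M") for s in STEPS]
    return {STEPS[i + 1]: (ts[i + 1] - ts[i]).total_seconds() / 3600
            for i in range(len(ts) - 1)}

def average_by_step(plans):
    """Mean duration of each step across many plans."""
    acc = defaultdict(list)
    for p in plans:
        for step, hours in step_durations(p).items():
            acc[step].append(hours)
    return {s: sum(v) / len(v) for s, v in acc.items()}
```

Averaging these per-step durations, split by treatment type, yields the kind of week-to-week metric set the tracker displays.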

  12. Rallying the troops: a four-step guide to preparing a residency program for short-term weather emergencies.

    PubMed

    Chow, Grant V; Hayashi, Jennifer; Hirsch, Glenn A; Christmas, Colleen

    2011-04-01

    Weather emergencies present a multifaceted challenge to residents and residency programs. Both the individual trainee and program may be pushed to the limits of physical and mental strain, potentially jeopardizing core competencies of patient care and professionalism. Although daunting, the task of preparing for these events should be a methodical process integrated into every residency training program. The core elements of emergency preparation with regard to inpatient services include identifying and staffing critical positions, motivating residents to consider the needs of the group over those of the individual, providing for basic needs, and planning activities in order to preserve team morale and facilitate recovery. The authors outline a four-step process in preparing a residency program for an anticipated short-term weather emergency. An example worksheet for emergency planning is included. With adequate preparation, residency training programs can maintain the highest levels of patient care, professionalism, and esprit de corps during weather emergencies. When managed effectively, emergencies may present an opportunity for professional growth and a sense of unity for those involved.

  13. Preventing Unintended Disclosure of Personally Identifiable Data Following Anonymisation.

    PubMed

    Smith, Chris

    2017-01-01

    Errors and anomalies during the capture and processing of health data have the potential to place personally identifiable values into attributes of a dataset that are expected to contain non-identifiable values. Anonymisation focuses on those attributes that have been judged to enable identification of individuals. Attributes that are judged to contain non-identifiable values are not considered, but may be included in datasets that are shared by organisations. Consequently, organisations are at risk of sharing datasets that unintendedly disclose personally identifiable values through these attributes. This would have ethical and legal implications for organisations and privacy implications for individuals whose personally identifiable values are disclosed. In this paper, we formulate the problem of unintended disclosure following anonymisation, describe the necessary steps to address this problem, and discuss some key challenges to applying these steps in practice.

  14. Integrated ecotechnology approach towards treatment of complex wastewater with simultaneous bioenergy production.

    PubMed

    Hemalatha, Manupati; Sravan, J Shanthi; Yeruva, Dileep Kumar; Venkata Mohan, S

    2017-10-01

Sequential integration of three stage diverse biological processes was studied by exploiting the individual process advantage towards enhanced treatment of complex chemical based wastewater. A successful attempt to integrate sequence batch reactor (SBR) with bioelectrochemical treatment (BET) and finally with microalgae treatment was studied. The sequential integration showed individual substrate degradation (COD) of 55% in SBR, 49% in BET and 56% in microalgae, accounting for a consolidated treatment efficiency of 90%. Nitrate removal efficiency of 25% was observed in SBR, 31% in BET and 44% in microalgae, with a total efficiency of 72%. The SBR treated effluents fed to BET with the electrode intervention showed TDS removal. BET exhibited relatively higher process performance than SBR. The integration approach significantly overcame the individual process limitations along with value addition as biomass (1.75 g/L), carbohydrates (640 mg/g), lipids (15%) and bioelectricity. The study provides a strategy of combining SBR as a pretreatment step for the BET process, with final polishing by microalgae cultivation, achieving the benefits of enhanced wastewater treatment along with value addition. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Personalized Physical Activity Coaching: A Machine Learning Approach

    PubMed Central

    Dijkhuis, Talko B.; van Ittersum, Miriam W.; Velthuijsen, Hugo

    2018-01-01

Living a sedentary lifestyle is one of the major causes of numerous health problems. To encourage employees to lead a less sedentary life, the Hanze University started a health promotion program. One of the interventions in the program was the use of an activity tracker to record participants' daily step count. The daily step count served as input for a fortnightly coaching session. In this paper, we investigate the possibility of automating part of the coaching procedure on physical activity by providing personalized feedback throughout the day on a participant’s progress in achieving a personal step goal. The gathered step count data was used to train eight different machine learning algorithms to make hourly estimations of the probability of achieving a personalized, daily steps threshold. In 80% of the individual cases, the Random Forest algorithm was the best performing algorithm (mean accuracy = 0.93, range = 0.88–0.99, and mean F1-score = 0.90, range = 0.87–0.94). To demonstrate the practical usefulness of these models, we developed a proof-of-concept Web application that provides personalized feedback about whether a participant is expected to reach his or her daily threshold. We argue that the use of machine learning could become an invaluable asset in the process of automated personalized coaching. The individualized algorithms allow for predicting physical activity during the day and provide the possibility to intervene in time. PMID:29463052
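The study trained Random Forest models; as a stdlib-only stand-in, the same kind of hourly estimate can be sketched as a nearest-neighbour frequency over past days (the function name, data shapes, and k value are assumptions, not the paper's implementation):

```python
def hourly_goal_probability(history, hour, steps_so_far, k=3):
    """Estimate the probability of reaching the daily step goal.
    history: list of (steps_by_hour, reached_goal) pairs for past days,
    where steps_by_hour maps hour -> cumulative step count and
    reached_goal is 1 or 0. Returns the fraction of the k past days
    whose count at `hour` was closest to steps_so_far that hit the goal."""
    ranked = sorted(history, key=lambda day: abs(day[0][hour] - steps_so_far))
    nearest = ranked[:k]
    return sum(reached for _, reached in nearest) / len(nearest)
```

A real deployment would refit per participant, which is what makes the feedback personalized; this baseline only illustrates the input/output shape of such a predictor.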

  16. An algorithm for identification and classification of individuals with type 1 and type 2 diabetes mellitus in a large primary care database

    PubMed Central

    Sharma, Manuj; Petersen, Irene; Nazareth, Irwin; Coton, Sonia J

    2016-01-01

Background: Research into diabetes mellitus (DM) often requires a reproducible method for identifying and distinguishing individuals with type 1 DM (T1DM) and type 2 DM (T2DM). Objectives: To develop a method to identify individuals with T1DM and T2DM using UK primary care electronic health records. Methods: Using data from The Health Improvement Network primary care database, we developed a two-step algorithm. The first algorithm step identified individuals with potential T1DM or T2DM based on diagnostic records, treatment, and clinical test results. We excluded individuals with records for rarer DM subtypes only. For individuals to be considered diabetic, they needed to have at least two records indicative of DM; one of which was required to be a diagnostic record. We then classified individuals with T1DM and T2DM using the second algorithm step. A combination of diagnostic codes, medication prescribed, age at diagnosis, and whether the case was incident or prevalent were used in this process. We internally validated this classification algorithm through comparison against an independent clinical examination of The Health Improvement Network electronic health records for a random sample of 500 DM individuals. Results: Out of 9,161,866 individuals aged 0–99 years from 2000 to 2014, we classified 37,693 individuals with T1DM and 418,433 with T2DM, while 1,792 individuals remained unclassified. A small proportion were classified with some uncertainty (1,155 [3.1%] of all individuals with T1DM and 6,139 [1.5%] with T2DM) due to unclear health records. During validation, manual assignment of DM type based on clinical assessment of the entire electronic record and algorithmic assignment led to equivalent classification in all instances. Conclusion: The majority of individuals with T1DM and T2DM can be readily identified from UK primary care electronic health records. Our approach can be adapted for use in other health care settings. PMID:27785102
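The two-step logic can be sketched as a pair of rules (record types, code names, field names, and the age threshold below are illustrative placeholders, not the validated algorithm's actual criteria):

```python
def identify_dm(patient):
    """Step 1: a candidate DM case needs at least two DM-indicative
    records, one of which must be a diagnostic record (simplified rule)."""
    recs = patient["records"]          # e.g. ["diagnosis", "therapy", "test"]
    return len(recs) >= 2 and "diagnosis" in recs

def classify_dm(patient):
    """Step 2: assign T1DM/T2DM from type-specific codes, medication,
    and age at diagnosis (hypothetical thresholds for illustration)."""
    if not identify_dm(patient):
        return None                    # not considered diabetic
    if "t1dm_code" in patient["codes"]:
        return "T1DM"
    if "t2dm_code" in patient["codes"]:
        return "T2DM"
    # No type-specific code: fall back on treatment and age at diagnosis.
    if patient["insulin_only"] and patient["age_at_diagnosis"] < 35:
        return "T1DM"
    return "T2DM"
```

The published algorithm also weighs incident versus prevalent status; that dimension is omitted here for brevity.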

  17. An algorithm for identification and classification of individuals with type 1 and type 2 diabetes mellitus in a large primary care database.

    PubMed

    Sharma, Manuj; Petersen, Irene; Nazareth, Irwin; Coton, Sonia J

    2016-01-01

    Research into diabetes mellitus (DM) often requires a reproducible method for identifying and distinguishing individuals with type 1 DM (T1DM) and type 2 DM (T2DM). To develop a method to identify individuals with T1DM and T2DM using UK primary care electronic health records. Using data from The Health Improvement Network primary care database, we developed a two-step algorithm. The first algorithm step identified individuals with potential T1DM or T2DM based on diagnostic records, treatment, and clinical test results. We excluded individuals with records for rarer DM subtypes only. For individuals to be considered diabetic, they needed to have at least two records indicative of DM; one of which was required to be a diagnostic record. We then classified individuals with T1DM and T2DM using the second algorithm step. A combination of diagnostic codes, medication prescribed, age at diagnosis, and whether the case was incident or prevalent were used in this process. We internally validated this classification algorithm through comparison against an independent clinical examination of The Health Improvement Network electronic health records for a random sample of 500 DM individuals. Out of 9,161,866 individuals aged 0-99 years from 2000 to 2014, we classified 37,693 individuals with T1DM and 418,433 with T2DM, while 1,792 individuals remained unclassified. A small proportion were classified with some uncertainty (1,155 [3.1%] of all individuals with T1DM and 6,139 [1.5%] with T2DM) due to unclear health records. During validation, manual assignment of DM type based on clinical assessment of the entire electronic record and algorithmic assignment led to equivalent classification in all instances. The majority of individuals with T1DM and T2DM can be readily identified from UK primary care electronic health records. Our approach can be adapted for use in other health care settings.

  18. Parsing the roles of neck-linker docking and tethered head diffusion in the stepping dynamics of kinesin.

    PubMed

    Zhang, Zhechun; Goldtzvik, Yonathan; Thirumalai, D

    2017-11-14

    Kinesin walks processively on microtubules (MTs) in an asymmetric hand-over-hand manner consuming one ATP molecule per 16-nm step. The individual contributions due to docking of the approximately 13-residue neck linker to the leading head (deemed to be the power stroke) and diffusion of the trailing head (TH) that contributes in propelling the motor by 16 nm have not been quantified. We use molecular simulations by creating a coarse-grained model of the MT-kinesin complex, which reproduces the measured stall force as well as the force required to dislodge the motor head from the MT, to show that nearly three-quarters of the step occurs by bidirectional stochastic motion of the TH. However, docking of the neck linker to the leading head constrains the extent of diffusion and minimizes the probability that kinesin takes side steps, implying that both the events are necessary in the motility of kinesin and for the maintenance of processivity. Surprisingly, we find that during a single step, the TH stochastically hops multiple times between the geometrically accessible neighboring sites on the MT before forming a stable interaction with the target binding site with correct orientation between the motor head and the [Formula: see text] tubulin dimer.

  19. Analysis, design, fabrication, and performance of three-dimensional braided composites

    NASA Astrophysics Data System (ADS)

    Kostar, Timothy D.

    1998-11-01

    Cartesian 3-D (track and column) braiding as a method of composite preforming has been investigated. A complete analysis of the process was conducted to understand the limitations and potentials of the process. Knowledge of the process was enhanced through development of a computer simulation, and it was discovered that individual control of each track and column and multiple-step braid cycles greatly increases possible braid architectures. Derived geometric constraints coupled with the fundamental principles of Cartesian braiding resulted in an algorithm to optimize preform geometry in relation to processing parameters. The design of complex and unusual 3-D braids was investigated in three parts: grouping of yarns to form hybrid composites via an iterative simulation; design of composite cross-sectional shape through implementation of the Universal Method; and a computer algorithm developed to determine the braid plan based on specified cross-sectional shape. Several 3-D braids, which are the result of variations or extensions to Cartesian braiding, are presented. An automated four-step braiding machine with axial yarn insertion has been constructed and used to fabricate two-step, double two-step, four-step, and four-step with axial and transverse yarn insertion braids. A working prototype of a multi-step braiding machine was used to fabricate four-step braids with surrogate material insertion, unique hybrid structures from multiple track and column displacement and multi-step cycles, and complex-shaped structures with constant or varying cross-sections. Braid materials include colored polyester yarn to study the yarn grouping phenomena, Kevlar, glass, and graphite for structural reinforcement, and polystyrene, silicone rubber, and fasteners for surrogate material insertion. A verification study for predicted yarn orientation and volume fraction was conducted, and a topological model of 3-D braids was developed. 
The solid model utilizes architectural parameters, generated from the process simulation, to determine the composite elastic properties. Methods of preform consolidation are investigated and the results documented. The extent of yarn deformation (packing) resulting from preform consolidation was investigated through cross-sectional micrographs. The fiber volume fraction of select hybrid composites was measured and representative unit cells are suggested. Finally, a comparison study of the elastic performance of Kevlar/epoxy and carbon/Kevlar hybrid composites was conducted.

  20. Parallel processing of embossing dies with ultrafast lasers

    NASA Astrophysics Data System (ADS)

    Jarczynski, Manfred; Mitra, Thomas; Brüning, Stephan; Du, Keming; Jenke, Gerald

    2018-02-01

Functionalization of surfaces equips products and components with new features like hydrophilic behavior, adjustable gloss level, light management properties, etc. Small feature sizes demand diffraction-limited spots and adapted fluence for different materials. Through the availability of high power fast repeating ultrashort pulsed lasers and efficient optical processing heads delivering diffraction-limited small spot size of around 10 μm it is feasible to achieve fluences higher than an adequate patterning requires. Hence, parallel processing is becoming of interest to increase the throughput and allow mass production of micro machined surfaces. The first step on the roadmap of parallel processing for cylinder embossing dies was realized with an eight-spot processing head based on a ns-fiber laser with passive optical beam splitting, individual spot switching by acousto-optical modulation and an advanced imaging. Patterning of cylindrical embossing dies shows a high efficiency of nearly 80%, diffraction-limited and equally spaced spots with pitches down to 25 μm achieved by a compression using cascaded prism arrays. Due to the nanosecond laser pulses the ablation shows the typical surrounding material deposition of a hot process. In the next step the processing head was adapted to a picosecond-laser source and the 500 W fiber laser was replaced by an ultrashort pulsed laser with 300 W, 12 ps and a repetition frequency of up to 6 MHz. This paper presents details about the processing head design and the analysis of ablation rates and patterns on steel, copper and brass dies. Furthermore, it gives an outlook on scaling the parallel processing head from eight to 16 individually switched beamlets to increase processing throughput and optimized utilization of the available ultrashort pulsed laser energy.

  1. Is impaired control of reactive stepping related to falls during inpatient stroke rehabilitation?

    PubMed

    Mansfield, Avril; Inness, Elizabeth L; Wong, Jennifer S; Fraser, Julia E; McIlroy, William E

    2013-01-01

    Individuals with stroke fall more often than age-matched controls. Although many focus on the multifactorial nature of falls, the fundamental problem is likely the ability for an individual to generate reactions to recover from a loss of balance. Stepping reactions to recover balance are particularly important to balance recovery, and individuals with stroke have difficulty executing these responses to prevent a fall following a loss of balance. The purpose of this study is to determine if characteristics of balance recovery steps are related to falls during inpatient stroke rehabilitation. We conducted a retrospective review of individuals with stroke attending inpatient rehabilitation (n = 136). Details of falls experienced during inpatient rehabilitation were obtained from incident reports, nursing notes, and patient interviews. Stepping reactions were evoked using a "release-from-lean" postural perturbation. Poisson regression was used to determine characteristics of stepping reactions that were related to increased fall frequency relative to length of stay. In all, 20 individuals experienced 29 falls during inpatient rehabilitation. The characteristics of stepping reactions significantly related to increased fall rates were increased frequency of external assistance to prevent a fall to the floor, increased frequency of no-step responses, increased frequency of step responses with inadequate foot clearance, and delayed time to initiate stepping responses. Impaired control of balance recovery steps is related to increased fall rates during inpatient stroke rehabilitation. This study informs the specific features of stepping reactions that can be targeted with physiotherapy intervention during inpatient rehabilitation to improve dynamic stability control and potentially prevent falls.
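The study's Poisson regression treats length of stay as an exposure offset, so the underlying quantity is a fall rate per patient-time. A simplified sketch of that quantity and an unadjusted incidence-rate ratio (not the adjusted regression actually used):

```python
import math

def fall_rate(falls, days):
    """Falls per 100 patient-days (the exposure offset in a Poisson model)."""
    return 100.0 * falls / days

def rate_ratio(falls_a, days_a, falls_b, days_b):
    """Unadjusted incidence-rate ratio of group A vs. group B, with a
    95% confidence interval computed on the log scale."""
    rr = (falls_a / days_a) / (falls_b / days_b)
    se = math.sqrt(1.0 / falls_a + 1.0 / falls_b)  # SE of log(rr)
    lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
    return rr, (lo, hi)
```

Using the offset rather than raw fall counts is what lets patients with very different lengths of stay be compared on a common scale.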

  2. Quantifying social influence in an online cultural market.

    PubMed

    Krumme, Coco; Cebrian, Manuel; Pickard, Galen; Pentland, Sandy

    2012-01-01

    We revisit experimental data from an online cultural market in which 14,000 users interact to download songs, and develop a simple model that can explain seemingly complex outcomes. Our results suggest that individual behavior is characterized by a two-step process--the decision to sample and the decision to download a song. Contrary to conventional wisdom, social influence is material to the first step only. The model also identifies the role of placement in mediating social signals, and suggests that in this market with anonymous feedback cues, social influence serves an informational rather than normative role.
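The two-step structure implies that social influence changes how often a song is sampled, not whether a sampled song is downloaded; a minimal sketch of that decomposition (parameter values are hypothetical):

```python
def download_probability(p_sample, p_download_given_sample, influence=1.0):
    """Two-step model: social influence scales the sampling probability
    (capped at 1.0) but leaves the conditional download choice untouched."""
    p_s = min(1.0, influence * p_sample)
    return p_s * p_download_given_sample

# A song sampled 10% of the time and downloaded by 40% of samplers:
base = download_probability(0.10, 0.40)           # no social signal
boosted = download_probability(0.10, 0.40, 3.0)   # popularity cue tripled sampling
```

Total downloads rise with influence even though the song's "quality signal" (the conditional download rate) is identical in both cases.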

  3. Theoretical study of gas hydrate decomposition kinetics--model development.

    PubMed

    Windmeier, Christoph; Oellrich, Lothar R

    2013-10-10

In order to provide an estimate of the order of magnitude of intrinsic gas hydrate dissolution and dissociation kinetics, the "Consecutive Desorption and Melting Model" (CDM) is developed by applying only theoretical considerations. The process of gas hydrate decomposition is assumed to comprise two consecutive and repetitive quasi-chemical reaction steps. These are desorption of the guest molecule followed by local solid body melting. The individual kinetic steps are modeled according to the "Statistical Rate Theory of Interfacial Transport" and the Wilson-Frenkel approach. All missing required model parameters are directly linked to geometric considerations and a thermodynamic gas hydrate equilibrium model.
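The CDM's actual rate expressions come from the Statistical Rate Theory and Wilson-Frenkel treatments cited above; purely as a generic illustration of consecutive steps, two first-order steps in series combine so that their characteristic times add:

```python
def effective_rate(k_desorption, k_melting):
    """Two consecutive first-order steps: characteristic times (1/k) add,
    so the effective rate is the harmonic combination 1/(1/k1 + 1/k2).
    Rate constants here are arbitrary illustrative values, not CDM outputs."""
    return 1.0 / (1.0 / k_desorption + 1.0 / k_melting)
```

The slower step dominates: if melting is much faster than desorption, the overall rate approaches the desorption rate alone.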

  4. Apply creative thinking of decision support in electrical nursing record.

    PubMed

    Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung

    2006-01-01

    The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. During this process, the nurse collects a great deal of data and information, which may exceed the amount the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in planning nursing care, given the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid in the nursing process, they only provide the most fundamental decision support--i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step by step, but it does not assist the nurse in the decision-making process itself: the decisions about how to generate nursing diagnoses from data and how to individualize the care plans still remain with the nurse. The purpose of this study is to develop a pilot electronic nursing record system, integrated with international nursing standards, for improving the proficiency and accuracy of care planning in the clinical pathway process. The proposed pilot system assists not only student nurses and nurses who are novices in nursing practice, but also experts who need to work in a practice area with which they are not familiar.

  5. A case management tool for occupational health nurses: development, testing, and application.

    PubMed

    Mannon, J A; Conrad, K M; Blue, C L; Muran, S

    1994-08-01

    1. Case management is a process of coordinating an individual client's health care services to achieve optimal, quality care delivered in a cost effective manner. The case manager establishes a provider network, recommends treatment plans that assure quality and efficacy while controlling costs, monitors outcomes, and maintains a strong communication link among all the parties. 2. Through development of audit tools such as the one presented in this article, occupational health nurses can document case management activities and provide employers with measurable outcomes. 3. The Case Management Activity Checklist was tested using data from 61 firefighters' musculoskeletal injury cases. 4. The activities on the checklist are a step by step process: case identification/case disposition; assessment; return to work plan; resource identification; collaborative communication; and evaluation.

  6. Single Molecule Stepping and Structural Dynamics of Myosin X

    PubMed Central

    Sun, Yujie; Sato, Osamu; Ruhnow, Felix; Arsenault, Mark E.; Ikebe, Mitsuo; Goldman, Yale E.

    2010-01-01

    Myosin X is an unconventional myosin with puzzling motility properties. We studied the motility of dimerized myosin X using single molecule fluorescence techniques (polTIRF, FIONA, and Parallax) to measure the rotation angles and 3-dimensional position of the molecule during its walk. Myosin X was found to step processively in a hand-over-hand manner, following a left-handed helical path along both single actin filaments and bundles. Its step size and velocity are smaller on actin bundles than on individual filaments, suggesting that myosin X often steps onto neighboring filaments in a bundle. The data suggest that a previously postulated single α-helical domain mechanically extends the 3-IQ motif lever arm and that either the neck-tail hinge or the tail is flexible. These structural features, in conjunction with the membrane and microtubule binding domains, enable myosin X to perform multiple functions on varied actin structures in cells. PMID:20364131

  7. An industrial ecology approach to municipal solid waste ...

    EPA Pesticide Factsheets

    Municipal solid waste (MSW) can be viewed as a feedstock for industrial ecology inspired conversions of wastes to valuable products and energy. The industrial ecology principle of symbiotic processes using waste streams to create value-added products is applied to MSW, with examples suggested for various residual streams. A methodology is presented for considering individual waste-to-energy or waste-to-product system synergies, evaluating the economic and environmental issues associated with each system. The methodology includes steps for identifying waste streams, specific waste components of interest, and conversion technologies, as well as steps for determining the economic and environmental effects of using wastes and the changes due to transport, administrative handling, and processing. In addition to presenting the methodology, technologies for various MSW input streams are categorized as commercialized or demonstrated, providing summarized information to organizations that are considering processes for MSW; such organizations can also follow the methodology to analyze processes of interest. The factsheet presents information useful for analyzing the sustainability of alternatives for the management of municipal solid waste.

  8. Using a critical reflection process to create an effective learning community in the workplace.

    PubMed

    Walker, Rachel; Cooke, Marie; Henderson, Amanda; Creedy, Debra K

    2013-05-01

    Learning circles are an enabling process to critically examine and reflect on practices with the purpose of promoting individual and organizational growth and change. The authors adapted and developed a learning circle strategy to facilitate open discourse between registered nurses, clinical leaders, clinical facilitators and students, to critically reflect on practice experiences to promote a positive learning environment. This paper reports on an analysis of field notes taken during a critical reflection process used to create an effective learning community in the workplace. A total of 19 learning circles were conducted during in-service periods (that is, the time allocated for professional education between morning and afternoon shifts) over a 3 month period with 56 nurses, 33 students and 1 university-employed clinical supervisor. Participation rates ranged from 3 to 12 individuals per discussion. Ten themes emerged from content analysis of the clinical learning issues identified through the four-step model of critical reflection used in learning circle discussions. The four-step model of critical reflection allowed participants to reflect on clinical learning issues, and raise them in a safe environment that enabled topics to be challenged and explored in a shared and cooperative manner. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Investigating interactions between phospholipase B-Like 2 and antibodies during Protein A chromatography.

    PubMed

    Tran, Benjamin; Grosskopf, Vanessa; Wang, Xiangdan; Yang, Jihong; Walker, Don; Yu, Christopher; McDonald, Paul

    2016-03-18

    Purification processes for therapeutic antibodies typically exploit multiple and orthogonal chromatography steps in order to remove impurities, such as host-cell proteins. While the majority of host-cell proteins are cleared through purification processes, individual host-cell proteins such as Phospholipase B-like 2 (PLBL2) are more challenging to remove and can persist into the final purification pool even after multiple chromatography steps. With packed-bed chromatography runs using host-cell protein ELISAs and mass spectrometry analysis, we demonstrated that different therapeutic antibodies interact to varying degrees with host-cell proteins in general, and PLBL2 specifically. We then used a high-throughput Protein A chromatography method to further examine the interaction between our antibodies and PLBL2. Our results showed that the co-elution of PLBL2 during Protein A chromatography is highly dependent on the individual antibody and PLBL2 concentration in the chromatographic load. Process parameters such as antibody resin load density and pre-elution wash conditions also influence the levels of PLBL2 in the Protein A eluate. Furthermore, using surface plasmon resonance, we demonstrated that there is a preference for PLBL2 to interact with IgG4 subclass antibodies compared to IgG1 antibodies. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Enhanced pharmaceutical removal from water in a three step bio-ozone-bio process.

    PubMed

    de Wilt, Arnoud; van Gijn, Koen; Verhoek, Tom; Vergnes, Amber; Hoek, Mirit; Rijnaarts, Huub; Langenhoff, Alette

    2018-07-01

    Individual treatment processes like biological treatment or ozonation have their limitations for the removal of pharmaceuticals from secondary clarified effluents with high organic matter concentrations (i.e. 17 mg TOC/L). These limitations can be overcome by combining the two processes for cost-effective pharmaceutical removal. A three-step biological-ozone-biological (BO3B) treatment process was therefore designed for enhanced pharmaceutical removal from wastewater effluent. The first biological step removed 38% of the ozone-scavenging TOC, thus proportionally reducing the absolute ozone input for the subsequent ozonation. Complementarity between biological and ozone treatment, i.e. targeting different pharmaceuticals, resulted in cost-effective pharmaceutical removal by the overall BO3B process. At a low ozone dose of 0.2 g O3/g TOC and an HRT of 1.46 h in the biological reactors, the removal of 8 out of 9 pharmaceuticals exceeded 85%; the exception was metoprolol (60%). Testing various ozone doses and HRTs revealed that pharmaceuticals were ineffectively removed at 0.1 g O3/g TOC and an HRT of 0.3 h. At HRTs of 0.47 and 1.46 h, easily and moderately biodegradable pharmaceuticals such as caffeine, gemfibrozil, ibuprofen, naproxen and sulfamethoxazole were over 95% removed by biological treatment. The biorecalcitrant carbamazepine was completely ozonated at a dose of 0.4 g O3/g TOC. Ozonation products are likely biodegraded in the last biological reactor, as a 17% TOC removal was found there. No appreciable acute toxicity towards D. magna, P. subcapitata and V. fischeri was found after exposure to the influents and effluents of the individual BO3B reactors. The BO3B process is estimated to increase the yearly wastewater treatment tariff per population equivalent in the Netherlands by less than 10%. Overall, the BO3B process is a cost-effective treatment process for the removal of pharmaceuticals from secondary clarified effluents. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
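
    The proportionality noted above (the ozone dose is expressed per gram of TOC, so biologically removing scavenging TOC lowers the absolute ozone input) can be illustrated with a small calculation. The helper name and units are assumptions for illustration only.

```python
def ozone_demand(toc_mg_per_l, dose_g_o3_per_g_toc, bio_toc_removal=0.38):
    """Absolute ozone demand (mg O3/L) after a first biological step that
    removes a fraction of the ozone-scavenging TOC (38% in the study)."""
    residual_toc = toc_mg_per_l * (1 - bio_toc_removal)
    return residual_toc * dose_g_o3_per_g_toc

# the study's influent (17 mg TOC/L) at the low dose of 0.2 g O3/g TOC
demand = ozone_demand(17.0, 0.2)  # 2.108 mg O3/L, vs 3.4 mg O3/L without the bio step
```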

  11. Apparatus and process for freeform fabrication of composite reinforcement preforms

    NASA Technical Reports Server (NTRS)

    Yang, Junsheng (Inventor); Wu, Liangwei (Inventor); Liu, Junhai (Inventor); Jang, Bor Z. (Inventor)

    2001-01-01

    A solid freeform fabrication process and apparatus for making a three-dimensional reinforcement shape. The process comprises the steps of (1) operating a multiple-channel material deposition device for dispensing a liquid adhesive composition and selected reinforcement materials at predetermined proportions onto a work surface; (2) during the material deposition process, moving the deposition device and the work surface relative to each other in an X-Y plane defined by first and second directions and in a Z direction orthogonal to the X-Y plane so that the materials are deposited to form a first layer of the shape; (3) repeating these steps to deposit multiple layers for forming a three-dimensional preform shape; and (4) periodically hardening the adhesive to rigidize individual layers of the preform. These steps are preferably executed under the control of a computer system by taking additional steps of (5) creating a geometry of the shape on the computer with the geometry including a plurality of segments defining the preform shape and each segment being preferably coded with a reinforcement composition defining a specific proportion of different reinforcement materials; (6) generating programmed signals corresponding to each of the segments in a predetermined sequence; and (7) moving the deposition device and the work surface relative to each other in response to these programmed signals. Preferably, the system is also operated to generate a support structure for any un-supported feature of the 3-D preform shape.

  12. Droplet morphometry and velocimetry (DMV): a video processing software for time-resolved, label-free tracking of droplet parameters.

    PubMed

    Basu, Amar S

    2013-05-21

    Emerging assays in droplet microfluidics require the measurement of parameters such as drop size, velocity, trajectory, shape deformation, fluorescence intensity, and others. While micro particle image velocimetry (μPIV) and related techniques are suitable for measuring flow using tracer particles, no tool exists for tracking droplets at the granularity of a single entity. This paper presents droplet morphometry and velocimetry (DMV), a digital video processing software for time-resolved droplet analysis. Droplets are identified through a series of image processing steps which operate on transparent, translucent, fluorescent, or opaque droplets. The steps include background image generation, background subtraction, edge detection, small object removal, morphological close and fill, and shape discrimination. A frame correlation step then links droplets spanning multiple frames via a nearest neighbor search with user-defined matching criteria. Each step can be individually tuned for maximum compatibility. For each droplet found, DMV provides a time-history of 20 different parameters, including trajectory, velocity, area, dimensions, shape deformation, orientation, nearest neighbor spacing, and pixel statistics. The data can be reported via scatter plots, histograms, and tables at the granularity of individual droplets or by statistics accrued over the population. We present several case studies from industry and academic labs, including the measurement of 1) size distributions and flow perturbations in a drop generator, 2) size distributions and mixing rates in drop splitting/merging devices, 3) efficiency of single cell encapsulation devices, 4) position tracking in electrowetting operations, 5) chemical concentrations in a serial drop dilutor, 6) drop sorting efficiency of a tensiophoresis device, 7) plug length and orientation of nonspherical plugs in a serpentine channel, and 8) high throughput tracking of >250 drops in a reinjection system. Performance metrics show that the highest accuracy and precision are obtained when the video resolution is >300 pixels per drop. Analysis time increases proportionally with video resolution. The current version of the software provides throughputs of 2-30 fps, suggesting the potential for real time analysis.
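
    The frame correlation step described above, which links droplets across consecutive frames via a nearest neighbor search, can be sketched as a greedy centroid matcher. The function `link_droplets` and the `max_dist` gating threshold are illustrative assumptions, not DMV's actual implementation.

```python
import math

def link_droplets(prev, curr, max_dist=20.0):
    """Greedy nearest-neighbor matching of droplet centroids between two
    consecutive frames; droplets farther than max_dist stay unmatched."""
    links = {}    # index in prev -> index in curr
    used = set()  # droplets in curr already claimed by an earlier match
    for i, p in enumerate(prev):
        best, best_d = None, max_dist
        for j, c in enumerate(curr):
            if j in used:
                continue
            d = math.dist(p, c)  # Euclidean distance (Python 3.8+)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            links[i] = best
            used.add(best)
    return links

# two droplets move slightly between frames; a third appears far away
links = link_droplets([(0.0, 0.0), (10.0, 0.0)],
                      [(11.0, 1.0), (1.0, 0.0), (100.0, 100.0)])
# links == {0: 1, 1: 0}; the droplet at (100, 100) would start a new track
```

    A production tracker would refine this with the user-defined matching criteria mentioned in the abstract (size and shape similarity, predicted position), but the gated nearest-neighbor search is the core of the idea.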

  13. Combining area-based and individual-level data in the geostatistical mapping of late-stage cancer incidence.

    PubMed

    Goovaerts, Pierre

    2009-01-01

    This paper presents a geostatistical approach to incorporate individual-level data (e.g. patient residences) and area-based data (e.g. rates recorded at census tract level) into the mapping of late-stage cancer incidence, with an application to breast cancer in three Michigan counties. Spatial trends in cancer incidence are first estimated from census data using area-to-point binomial kriging. This prior model is then updated using indicator kriging and individual-level data. Simulation studies demonstrate the benefits of this two-step approach over methods (kernel density estimation and indicator kriging) that process only residence data.

  14. Casting Protocols for the Production of Open Cell Aluminum Foams by the Replication Technique and the Effect on Porosity

    PubMed Central

    Elizondo Luna, Erardo M.; Barari, Farzad; Woolley, Robert; Goodall, Russell

    2014-01-01

    Metal foams are interesting materials from both a fundamental understanding and practical applications point of view. Uses have been proposed, and in many cases validated experimentally, for light weight or impact energy absorbing structures, as high surface area heat exchangers or electrodes, as implants to the body, and many more. Although great progress has been made in understanding their structure-properties relationships, the large number of different processing techniques, each producing material with different characteristics and structure, means that understanding of the individual effects of all aspects of structure is not complete. The replication process, where molten metal is infiltrated between grains of a removable preform material, allows a markedly high degree of control and has been used to good effect to elucidate some of these relationships. Nevertheless, the process has many steps that are dependent on individual “know-how”, and this paper aims to provide a detailed description of all stages of one embodiment of this processing method, using materials and equipment that would be relatively easy to set up in a research environment. The goal of this protocol and its variants is to produce metal foams in an effective and simple way, giving the possibility to tailor the outcome of the samples by modifying certain steps within the process. By following this, open cell aluminum foams with pore sizes of 1–2.36 mm diameter and 61% to 77% porosity can be obtained. PMID:25548938

  15. Adaptation Criteria for the Personalised Delivery of Learning Materials: A Multi-Stage Empirical Investigation

    ERIC Educational Resources Information Center

    Thalmann, Stefan

    2014-01-01

    Personalised e-Learning represents a major step-change from the one-size-fits-all approach of traditional learning platforms to a more customised and interactive provision of learning materials. Adaptive learning can support the learning process by tailoring learning materials to individual needs. However, this requires the initial preparation of…

  16. 77 FR 43606 - Preliminary Damage Assessment for Individual Assistance Operations Manual (9327.2-PR)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-25

    ... site at http://www.fema.gov . The proposed and final manual, all related Federal Register Notices, and... for conducting IA PDAs is to identify the impact, type, and extent of disaster damages and to... to recover. The PDA is an important first step in the disaster declaration process. The information...

  17. Why Are Black Employers More Likely Than White Employers To Hire Blacks? Discussion Paper.

    ERIC Educational Resources Information Center

    Stoll, Michael A.; Raphael, Steven; Holzer, Harry J.

    This study investigated why black employers tend to hire blacks at higher rates than do white employers and examined individual steps in the hiring process, the role of the hiring agent's race, and the degree to which variation in black application rates related to differences in observable characteristics, such as an establishment's physical…

  18. Ten steps for managing organizational change.

    PubMed

    Bolton, L B; Aydin, C; Popolow, G; Ramseyer, J

    1992-06-01

    Managing interdepartmental relations in healthcare organizations is a major challenge for nursing administrators. The authors describe the implementation process of an organization-wide change effort involving individuals from departments throughout the medical center. These strategies can serve as a model to guide effective planning in other institutions embarking on change projects, resulting in smoother and more effective implementation of interdepartmental change.

  19. Evaluating ecological monitoring of civic environmental stewardship in the Green-Duwamish watershed, Washington

    Treesearch

    Jacob C. Sheppard; Clare M. Ryan; Dale J. Blahna

    2017-01-01

    The ecological outcomes of civic environmental stewardship are poorly understood, especially at scales larger than individual sites. In this study we characterized civic environmental stewardship programs in the Green-Duwamish watershed in King County, WA, and evaluated the extent to which stewardship outcomes were monitored. We developed a four-step process based on...

  20. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper summarizes the results to date of a Jet Propulsion Laboratory internally funded research task studying the costing process and parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation among the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and the mental models were found to cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps; the multiple forecasting steps involved either forecasting software size or making an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  1. Principles for building public-private partnerships to benefit food safety, nutrition, and health research

    PubMed Central

    Rowe, Sylvia; Alexander, Nick; Kretser, Alison; Steele, Robert; Kretsch, Molly; Applebaum, Rhona; Clydesdale, Fergus; Cummins, Deborah; Hentges, Eric; Navia, Juan; Jarvis, Ashley; Falci, Ken

    2013-01-01

    The present article articulates principles for effective public-private partnerships (PPPs) in scientific research. Recognizing that PPPs represent one approach for creating research collaborations and that there are other methods outside the scope of this article, PPPs can be useful in leveraging diverse expertise among government, academic, and industry researchers to address public health needs and questions concerned with nutrition, health, food science, and food and ingredient safety. A three-step process was used to identify the principles proposed herein: step 1) review of existing PPP guidelines, both in the peer-reviewed literature and at 16 disparate non-industry organizations; step 2) analysis of relevant successful or promising PPPs; and step 3) formal background interviews of 27 experienced, senior-level individuals from academia, government, industry, foundations, and non-governmental organizations. This process resulted in the articulation of 12 potential principles for establishing and managing successful research PPPs. The review of existing guidelines showed that guidelines for research partnerships currently reside largely within institutions rather than in the peer-reviewed literature. This article aims to introduce these principles into the literature to serve as a framework for dialogue and for future PPPs. PMID:24117791

  2. Costs and benefits of integrating information between the cerebral hemispheres: a computational perspective.

    PubMed

    Belger, A; Banich, M T

    1998-07-01

    Because interaction of the cerebral hemispheres has been found to aid task performance under demanding conditions, the present study examined how this effect is moderated by computational complexity, the degree of lateralization for a task, and individual differences in asymmetric hemispheric activation (AHA). Computational complexity was manipulated across tasks either by increasing the number of inputs to be processed or by increasing the number of steps to a decision. Comparison of within- and across-hemisphere trials indicated that the size of the between-hemisphere advantage increased as a function of task complexity, except for a highly lateralized rhyme decision task that can only be performed by the left hemisphere. Measures of individual differences in AHA revealed that when task demands and an individual's AHA both load on the same hemisphere, the ability to divide the processing between the hemispheres is limited. Thus, interhemispheric division of processing improves performance at higher levels of computational complexity only when the required operations can be divided between the hemispheres.

  3. Microfluidic Platform for Parallel Single Cell Analysis for Diagnostic Applications.

    PubMed

    Le Gac, Séverine

    2017-01-01

    Cell populations are heterogeneous: they can comprise different cell types or even cells at different stages of the cell cycle and/or of biological processes. Furthermore, molecular processes taking place in cells are stochastic in nature. Therefore, cellular analysis must be brought down to the single cell level to get useful insight into biological processes, and to access essential molecular information that would be lost when using a cell population analysis approach. Furthermore, to fully characterize a cell population, ideally, information both at the single cell level and on the whole cell population is required, which calls for analyzing each individual cell in a population in a parallel manner. This single cell level analysis approach is particularly important for diagnostic applications to unravel molecular perturbations at the onset of a disease, to identify biomarkers, and for personalized medicine, not only because of the heterogeneity of the cell sample, but also due to the availability of a reduced amount of cells, or even unique cells. This chapter presents a versatile platform meant for the parallel analysis of individual cells, with a particular focus on diagnostic applications and the analysis of cancer cells. We first describe one essential step of this parallel single cell analysis protocol, which is the trapping of individual cells in dedicated structures. Following this, we report different steps of a whole analytical process, including on-chip cell staining and imaging, cell membrane permeabilization and/or lysis using either chemical or physical means, and retrieval of the cell molecular content in dedicated channels for further analysis. This series of experiments illustrates the versatility of the herein-presented platform and its suitability for various analysis schemes and different analytical purposes.

  4. RNA editing in nascent RNA affects pre-mRNA splicing

    PubMed Central

    Hsiao, Yun-Hua Esther; Bahn, Jae Hoon; Yang, Yun; Lin, Xianzhi; Tran, Stephen; Yang, Ei-Wen; Quinones-Valdez, Giovanni

    2018-01-01

    In eukaryotes, nascent RNA transcripts undergo an intricate series of RNA processing steps to achieve mRNA maturation. RNA editing and alternative splicing are two major RNA processing steps that can introduce significant modifications to the final gene products. By tackling these processes in isolation, recent studies have enabled substantial progress in understanding their global RNA targets and regulatory pathways. However, the interplay between individual steps of RNA processing, an essential aspect of gene regulation, remains poorly understood. By sequencing the RNA of different subcellular fractions, we examined the timing of adenosine-to-inosine (A-to-I) RNA editing and its impact on alternative splicing. We observed that >95% A-to-I RNA editing events occurred in the chromatin-associated RNA prior to polyadenylation. We report about 500 editing sites in the 3′ acceptor sequences that can alter splicing of the associated exons. These exons are highly conserved during evolution and reside in genes with important cellular function. Furthermore, we identified a second class of exons whose splicing is likely modulated by RNA secondary structures that are recognized by the RNA editing machinery. The genome-wide analyses, supported by experimental validations, revealed remarkable interplay between RNA editing and splicing and expanded the repertoire of functional RNA editing sites. PMID:29724793

  5. RNA editing in nascent RNA affects pre-mRNA splicing.

    PubMed

    Hsiao, Yun-Hua Esther; Bahn, Jae Hoon; Yang, Yun; Lin, Xianzhi; Tran, Stephen; Yang, Ei-Wen; Quinones-Valdez, Giovanni; Xiao, Xinshu

    2018-06-01

    In eukaryotes, nascent RNA transcripts undergo an intricate series of RNA processing steps to achieve mRNA maturation. RNA editing and alternative splicing are two major RNA processing steps that can introduce significant modifications to the final gene products. By tackling these processes in isolation, recent studies have enabled substantial progress in understanding their global RNA targets and regulatory pathways. However, the interplay between individual steps of RNA processing, an essential aspect of gene regulation, remains poorly understood. By sequencing the RNA of different subcellular fractions, we examined the timing of adenosine-to-inosine (A-to-I) RNA editing and its impact on alternative splicing. We observed that >95% A-to-I RNA editing events occurred in the chromatin-associated RNA prior to polyadenylation. We report about 500 editing sites in the 3' acceptor sequences that can alter splicing of the associated exons. These exons are highly conserved during evolution and reside in genes with important cellular function. Furthermore, we identified a second class of exons whose splicing is likely modulated by RNA secondary structures that are recognized by the RNA editing machinery. The genome-wide analyses, supported by experimental validations, revealed remarkable interplay between RNA editing and splicing and expanded the repertoire of functional RNA editing sites. © 2018 Hsiao et al.; Published by Cold Spring Harbor Laboratory Press.

  6. Nonequilibrium thermodynamics and a fluctuation theorem for individual reaction steps in a chemical reaction network

    NASA Astrophysics Data System (ADS)

    Pal, Krishnendu; Das, Biswajit; Banerjee, Kinshuk; Gangopadhyay, Gautam

    2015-09-01

    We introduce an approach to the nonequilibrium thermodynamics of an open chemical reaction network in terms of the propensities of the individual elementary reactions and the corresponding reverse reactions. The method is a microscopic formulation of the dissipation function in terms of the relative entropy, or Kullback-Leibler distance, based on the analogy between a phase-space trajectory and the path of elementary reactions in a network of chemical processes. We also introduce a fluctuation theorem, valid for each opposite pair of elementary reactions, that is useful in determining the contribution of each sub-reaction to the nonequilibrium thermodynamics of the overall reaction. The methodology is applied to oligomeric enzyme kinetics under a chemiostatic condition that drives the reaction to a nonequilibrium steady state, for which we estimate how each step of the reaction is energy driven or entropy driven in its contribution to the overall reaction.
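
    For an opposite pair of elementary reactions with forward and reverse fluxes J+ and J-, the standard nonequilibrium expression for the entropy production rate (in units where k_B = 1) is sigma = (J+ - J-) ln(J+/J-), which is nonnegative and vanishes at detailed balance; the network total is the sum over pairs. A sketch with hypothetical function names, not the paper's code:

```python
import math

def step_entropy_production(j_plus, j_minus):
    """Entropy production rate of one opposite pair of elementary reactions
    from its forward and reverse fluxes (k_B = 1 units):
        sigma = (J+ - J-) * ln(J+ / J-) >= 0,
    zero exactly when the pair satisfies detailed balance (J+ == J-)."""
    return (j_plus - j_minus) * math.log(j_plus / j_minus)

def total_entropy_production(flux_pairs):
    """Total dissipation of the network: sum over all opposite pairs."""
    return sum(step_entropy_production(jp, jm) for jp, jm in flux_pairs)

# one driven step plus one equilibrated step: only the driven step dissipates
total = total_entropy_production([(2.0, 1.0), (3.0, 3.0)])  # == ln 2
```

    Decomposing the total this way is what lets one say, step by step, which sub-reactions carry the dissipation of the overall process.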

  7. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma, and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) was used to identify and solve problems. The laboratory turnaround time for individual tests, the total delay time in the sample reception area, and the percentage of steps involving risks of medical errors and biological hazards in the overall process were measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time for stat samples also improved from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  8. Quality control in the development of coagulation factor concentrates.

    PubMed

    Snape, T J

    1987-01-01

    Limitation of process change is a major factor contributing to assurance of quality in pharmaceutical manufacturing. This is particularly true in the manufacture of coagulation factor concentrates, for which presumptive testing for poorly defined product characteristics is an integral feature of finished product quality control. The development of new or modified preparations requires that this comfortable position be abandoned, and that the effect on finished product characteristics of changes to individual process steps (and components) be assessed. The degree of confidence in the safety and efficacy of the new product will be determined by, amongst other things, the complexity of the process alteration and the extent to which the results of finished product tests can be considered predictive. The introduction of a heat-treatment step for inactivation of potential viral contaminants in coagulation factor concentrates presents a significant challenge in both respects, quite independent of any consideration of assessment of the effectiveness of the viral inactivation step. These interactions are illustrated by some of the problems encountered with terminal dry heat-treatment (72 h at 80 °C) of factor VIII and prothrombin complex concentrates manufactured by the Blood Products Laboratory.

  9. CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergman, Rolf; Paget, Maria L.; Richman, Eric E.

    2011-03-31

    With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR® energy-efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement, in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements of light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed, step-by-step method for determining uncertainty in lumen measurements, developed in close cooperation with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described, and a spreadsheet format adapted for integrating-sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, covariances associated with input estimates, and the calculation of the result measurements. With this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.
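    A minimal sketch of the GUM-style combination such a spreadsheet performs, with invented relative uncertainty components (not CALiPER's actual entries) and sensitivity coefficients taken as 1:

```python
import math

# Illustrative GUM-style combination for a lumen measurement.
# Component values are invented for illustration; they are not the
# CALiPER spreadsheet's actual entries.
components = {
    "photometer_calibration": 0.008,       # relative standard uncertainties
    "sphere_spatial_nonuniformity": 0.005,
    "self_absorption_correction": 0.004,
    "current_measurement": 0.002,
}

# Combined standard uncertainty: root-sum-of-squares of independent,
# uncorrelated components (sensitivity coefficients taken as 1).
u_c = math.sqrt(sum(u**2 for u in components.values()))

# Expanded uncertainty at ~95% coverage (coverage factor k = 2).
k = 2
U = k * u_c
print(f"combined: {u_c:.4f}, expanded (k=2): {U:.4f}")
```

    Correlated inputs would add covariance terms to the sum, which is where the guidance on covariances in the report comes in.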

  10. Examining attrition rates at one specialty addiction treatment provider in the United States: a case study using a retrospective chart review.

    PubMed

    Loveland, David; Driscoll, Hilary

    2014-09-25

    Engaging individuals who have a substance use disorder (SUD) in treatment continues to be a challenge for the specialty addiction treatment field. Research has consistently revealed high rates of missed appointments at each step of the enrollment process: 1. between calling for services and assessment, 2. between assessment and enrollment, and 3. between enrollment and completion of treatment. Extensive research has examined each step of the process; however, there is limited research examining the overall attrition rate across all steps. A single case study of a specialty addiction treatment agency was used to examine the attrition rates across the first three steps of the enrollment process. Attrition rates were tracked between August 1, 2011 and July 31, 2012. The cohort included 1822 unique individuals who made an initial request for addiction treatment services. Monthly retrospective reviews of medical records, phone logs, and billing data were used to calculate attrition rates. Attrition rates reported in the literature were collected and compared to the rates found at the target agency. Median time between request for treatment and assessment was 6 days (mean 7.5), and between assessment and treatment enrollment 8 days (mean 12.5). An overall attrition rate of 80% was observed, including 45% between call and assessment, 32% between assessment and treatment enrollment (another 17% could not be determined), and 37% who left or were removed from treatment before 30 days. Women were less likely to complete 30 days of treatment than men. No other demographic characteristics were related to attrition rates. Only one out of every five people who requested treatment completed a minimum of 30 days of treatment. The attrition rate was high, yet similar to rates noted in the literature. Limitations of the single case study are noted. Attrition rates in the U.S. are high, with approximately 75% to 80% of treatment seekers disengaging at one of the multiple stages of the enrollment and treatment process. Significant changes in the system are needed to improve engagement rates.
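    Because the three steps are sequential, the step-wise rates compound multiplicatively. A quick check using the abstract's 45%, 32%, and 37% losses (and ignoring the 17% undetermined cases):

```python
# Step-wise attrition rates from the abstract: call -> assessment 45%,
# assessment -> enrollment 32%, enrollment -> 30 days of treatment 37%.
step_attrition = [0.45, 0.32, 0.37]

retained = 1.0
for loss in step_attrition:
    retained *= 1 - loss

overall_attrition = 1 - retained
print(f"retained: {retained:.1%}, overall attrition: {overall_attrition:.1%}")
```

    Multiplying the retention at each step gives roughly 24% retained, i.e. about 76% overall attrition, consistent with the reported ~80% once the undetermined cases are counted against retention.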

  11. Method of development of the program of forming of parametrical drawings of details in the AutoCAD software product

    NASA Astrophysics Data System (ADS)

    Alshakova, E. L.

    2017-01-01

    A program written in the AutoLISP language makes it possible to generate parametric drawings automatically while working in the AutoCAD software product. Students learn to develop AutoLISP programs using a methodical complex containing methodical instructions in which real examples of creating images and drawings are realized. The methodical instructions contain the reference information necessary to perform the offered tasks. Step-by-step development of the program is the basis for training in AutoLISP programming: the program draws the elements of a detail drawing by means of specially created functions whose argument values are written in the same sequence in which AutoCAD issues queries when the corresponding command is executed in the editor. The process of program design is thus reduced to the step-by-step formation of functions and of the sequence of their calls. The author considers the development of AutoLISP programs for the creation of parametric drawings of details of a defined design, where the user enters the dimensions of the detail's elements. These programs generate variants of the tasks for the graphic works performed in the educational process of the "Engineering Graphics" and "Engineering and Computer Graphics" disciplines. Individual tasks allow students to develop skills of independent work in reading and creating drawings, as well as in 3D modeling.

  12. Development and test of combustion chamber for Stirling engine heated by natural gas

    NASA Astrophysics Data System (ADS)

    Li, Tie; Song, Xiange; Gui, Xiaohong; Tang, Dawei; Li, Zhigang; Cao, Wenyu

    2014-04-01

    The combustion chamber is an important component of a Stirling engine heated by natural gas. In this paper, we develop a combustion chamber for a Stirling engine intended to generate 3-5 kWe of electric power. The combustion chamber includes three main components: the combustion module, the heat-exchange cavity, and the thermal head. Its distinctive feature is a structure that divides the combustion process and the heat-transfer process into two clearly separated steps that occur one after the other. Since the natural gas can mix fully with air before burning, combustion is easily completed without secondary air. The flame avoids contact with the thermal head of the Stirling engine, and the temperature fields can be easily controlled. The designed combustion chamber was manufactured and its performance tested in a two-step experiment. The first step demonstrated that the mixture of air and natural gas is easily ignited and that the flame burns stably. In the second step, the combustion heat flux reached 20 kW, and the energy utilization efficiency of the thermal head exceeded 0.5. These results show that the thermal performance of the combustion chamber meets the design goal. The designed combustion chamber can be applied to a real Stirling engine heated by natural gas that is to generate 3-5 kWe of electric power.

  13. Defining the Costs of Reusable Flexible Ureteroscope Reprocessing Using Time-Driven Activity-Based Costing.

    PubMed

    Isaacson, Dylan; Ahmad, Tessnim; Metzler, Ian; Tzou, David T; Taguchi, Kazumi; Usawachintachit, Manint; Zetumer, Samuel; Sherer, Benjamin; Stoller, Marshall; Chi, Thomas

    2017-10-01

    Careful decontamination and sterilization of reusable flexible ureteroscopes used in ureterorenoscopy cases prevent the spread of infectious pathogens to patients and technicians. However, inefficient reprocessing and unavailability of ureteroscopes sent out for repair can contribute to expensive operating room (OR) delays. Time-driven activity-based costing (TDABC) was applied to describe the time and costs involved in reprocessing. Direct observation and timing were performed for all steps in reprocessing of reusable flexible ureteroscopes following operative procedures. Estimated times needed for each step by which damaged ureteroscopes identified during reprocessing are sent for repair were characterized through interviews with purchasing analyst staff. Process maps were created for reprocessing and repair, detailing individual step times and their variances. Cost data for labor and disposables used were applied to calculate per-minute and average step costs. Ten ureteroscopes were followed through reprocessing. Process mapping for ureteroscope reprocessing averaged 229.0 ± 74.4 minutes, whereas sending a ureteroscope for repair required an estimated 143 minutes per repair. Most steps demonstrated low variance between timed observations. Ureteroscope drying was the longest and highest-variance step at 126.5 ± 55.7 minutes and was highly dependent on manual air flushing through the ureteroscope working channel and on ureteroscope positioning in the drying cabinet. Reprocessing costs totaled $96.13 per episode, including the cost of labor and disposable items. Utilizing TDABC delineates the full spectrum of costs associated with ureteroscope reprocessing and identifies areas for process improvement to drive value-based care. At our institution, ureteroscope drying was one clearly identified target area. Implementing training in ureteroscope drying technique could save up to 2 hours per reprocessing event, potentially preventing expensive OR delays.
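    The TDABC arithmetic itself is simple: each mapped step contributes its duration times the per-minute cost of the resources it uses. The step names, durations, and rates below are hypothetical placeholders, not the study's measured data (the abstract reports only the 229-minute average and the $96.13 episode cost, which also includes disposables):

```python
# Minimal TDABC sketch: process cost = sum over steps of
# (step minutes) x (per-minute cost of the resources the step uses).
# Step names, durations, and rates are hypothetical, not the study's data.
steps = [
    # (step, minutes, labor cost per minute in $)
    ("manual pre-clean and leak test", 20, 0.55),
    ("automated reprocessor cycle", 45, 0.30),
    ("drying in cabinet", 127, 0.15),
    ("packaging and transport to OR", 37, 0.40),
]

total_time = sum(minutes for _, minutes, _ in steps)
total_cost = sum(minutes * rate for _, minutes, rate in steps)
print(f"{total_time} min, labor cost ${total_cost:.2f}")
```

    Disposable items would be added as fixed per-episode costs on top of the time-driven labor total.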

  14. Platelet-rich plasma differs according to preparation method and human variability.

    PubMed

    Mazzocca, Augustus D; McCarthy, Mary Beth R; Chowaniec, David M; Cote, Mark P; Romeo, Anthony A; Bradley, James P; Arciero, Robert A; Beitzel, Knut

    2012-02-15

    Varying concentrations of blood components in platelet-rich plasma preparations may contribute to the variable results seen in recently published clinical studies. The purposes of this investigation were (1) to quantify the levels of platelets, growth factors, red blood cells, and white blood cells in so-called one-step (clinically used commercial devices) and two-step separation systems and (2) to determine the influence of three separate blood draws on the resulting components of platelet-rich plasma. Three different platelet-rich plasma (PRP) separation methods were used on blood samples from eight subjects (mean age [and standard deviation], 31.6 ± 10.9 years): two single-spin processes (PRPLP and PRPHP) and a double-spin process (PRPDS) were evaluated for concentrations of platelets, red and white blood cells, and growth factors. Additionally, the effect of three repetitive blood draws on platelet-rich plasma components was evaluated. The content and concentrations of platelets, white blood cells, and growth factors differed significantly among the separation methods. All separation techniques resulted in a significant increase in platelet concentration compared with native blood. Platelet and white blood-cell concentrations of the PRPHP procedure were significantly higher than those produced by the so-called single-step PRPLP and the so-called two-step PRPDS procedures, although significant differences between PRPLP and PRPDS were not observed. Comparing the results of the three blood draws with regard to the reliability of platelet and cell counts revealed wide intra-individual variations. Single-step procedures are capable of producing sufficient amounts of platelets for clinical usage. Within the evaluated procedures, platelet numbers and numbers of white blood cells differed significantly. The intra-individual results of platelet-rich plasma separations showed wide variations in platelet and cell numbers, as well as in levels of growth factors, regardless of separation method.

  15. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

    The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology. Copyright © 2012 Wiley-Liss, Inc.
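    As an illustration of the design-space idea only (the response-surface model, parameters, and limits below are invented, not the paper's characterisation data): a fitted model predicts a CQA from process parameters, and the design space is the region of parameter settings whose predictions meet the quality target.

```python
import itertools

# Toy design-space scan. A hypothetical response surface, fitted from DOE
# data, predicts a CQA from two process parameters; the design space is
# the set of settings whose prediction meets the target.
def predicted_cqa(temp_c, ph):
    # invented response-surface model (illustration only)
    return 95.0 - 0.8 * abs(temp_c - 37.0) - 6.0 * abs(ph - 7.1)

temps = [35.0, 36.0, 37.0, 38.0, 39.0]
phs = [6.9, 7.0, 7.1, 7.2, 7.3]
target = 92.9  # minimum acceptable CQA value (e.g. % monomer)

design_space = [
    (t, p) for t, p in itertools.product(temps, phs)
    if predicted_cqa(t, p) >= target
]
print(f"{len(design_space)} of {len(temps) * len(phs)} settings acceptable")
```

    The paper's global multi-step Design Space extends this idea across every unit operation, using predictive statistical models per step and verifying the limits experimentally in the robustness study.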

  16. An automated workflow for parallel processing of large multiview SPIM recordings

    PubMed Central

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-01-01

    Summary: Selective Plane Illumination Microscopy (SPIM) allows imaging of developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated, and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585
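    The property that makes the parallelization trivial, namely that each time point is processed independently of the others, can be sketched in plain Python. This is an illustration of the pattern, not the snakemake pipeline itself; `process_timepoint` is a placeholder:

```python
from multiprocessing import Pool

# Toy stand-in for one per-timepoint SPIM processing step. Each time
# point is independent, so the series can be farmed out in parallel;
# the function body is a placeholder, not part of the actual pipeline.
def process_timepoint(t):
    # in the real workflow: register views, fuse channels, deconvolve ...
    return f"timepoint_{t:04d}_fused"

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        results = pool.map(process_timepoint, range(4))
    print(results)
```

    snakemake generalizes this: it infers the per-timepoint jobs from file dependencies and dispatches them to workstation cores or cluster nodes.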

  17. An automated workflow for parallel processing of large multiview SPIM recordings.

    PubMed

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-04-01

    Selective Plane Illumination Microscopy (SPIM) allows imaging of developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated, and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  18. [Lessons learned from a distribution incident at the Alps-Mediterranean Division of the French Blood Establishment].

    PubMed

    Legrand, D

    2008-11-01

    The Alps-Mediterranean division of the French blood establishment (EFS Alpes-Mediterranée) has implemented a risk management program. Within this framework, the labile blood product distribution process was assessed to identify critical steps. Subsequently, safety measures were instituted including computer-assisted decision support, detailed written instructions and control checks at each step. Failure of these measures to prevent an incident underlines the vulnerability of the process to the human factor. Indeed root cause analysis showed that the incident was due to underestimation of the danger by one individual. Elimination of this type of risk will require continuous training, testing and updating of personnel. Identification and reporting of nonconformities will allow personnel at all levels (local, regional, and national) to share lessons and implement appropriate risk mitigation strategies.

  19. Scaling environmental change through the community level: a trait-based response-and-effect framework for plants

    Treesearch

    Katharine N. Suding; Sandra Lavorel; F. Stuart Chapin; Johannes H.C. Cornelissen; Sandra Diaz; Eric Garnier; Deborah Goldberg; David U. Hooper; Stephen T. Jackson; Marie-Laure Navas

    2008-01-01

    Predicting ecosystem responses to global change is a major challenge in ecology. A critical step in that challenge is to understand how changing environmental conditions influence processes across levels of ecological organization. While direct scaling from individual to ecosystem dynamics can lead to robust and mechanistic predictions, new approaches are needed to...

  20. 78 FR 6306 - Applications for New Awards; Educational Technology, Media, and Materials for Individuals With...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... DEPARTMENT OF EDUCATION Applications for New Awards; Educational Technology, Media, and Materials for Individuals With Disabilities Program--Stepping-Up Technology Implementation AGENCY: Office of... Information Educational Technology, Media, and Materials for Individuals With Disabilities Program--Stepping...

  1. Framework for enhancing clinical practice guidelines through continuous patient engagement.

    PubMed

    Armstrong, Melissa J; Rueda, Juan-David; Gronseth, Gary S; Mullins, C Daniel

    2017-02-01

    Patient engagement in clinical practice guideline (CPG) development is recommended by multiple institutions and instruments measuring guideline quality. Approaches to engaging patients, however, vary between oversight organizations, quality tools and guideline developers. We propose a ten-step framework outlining steps and options for patient engagement in guideline development, with the goal of highlighting steps for patient engagement and methods by which this can be achieved. This framework provides a model for continuous patient engagement in CPGs by outlining ten steps of guideline development occurring at the levels of the developer/committee and the individual guideline project. At the developer level, patients can assist in topic nomination (step 1), topic prioritization (step 2) and guideline development group selection (step 3). Within specific guideline projects, patients' opinions may be incorporated when framing the question (step 4), creating an analytic framework and research plan (step 5), conducting the systematic review and forming conclusions (step 6), developing recommendations (step 7) and in dissemination and implementation (step 8). At the end of the process, patients can again be engaged at the developer level by helping determine when guidelines need updating (step 9) and evaluating the developer's approach to patient engagement (step 10). Patient engagement at each CPG development step has different purposes, mechanisms, advantages and disadvantages, and implications for resource utilization. This framework can serve as a resource for guideline developers desiring to increase patient engagement and as a reference for researchers investigating engagement methodology at different steps of the CPG lifecycle. © 2016 The Authors. Health Expectations Published by John Wiley & Sons Ltd.

  2. [Process design in high-reliability organizations].

    PubMed

    Sommer, K-J; Kranz, J; Steffens, J

    2014-05-01

    Modern medicine is a highly complex service industry in which individual care providers are linked in a complicated network. This complexity and interconnectedness are associated with risks to patient safety. Other highly complex industries, such as commercial aviation, have succeeded in maintaining or even increasing their safety levels despite rapidly increasing passenger numbers. Standard operating procedures (SOPs), crew resource management (CRM), as well as operational risk evaluation (ORE) are historically developed and trusted parts of a comprehensive and systemic safety program. If medicine wants to make this quantum leap towards increased patient safety, it must intensively evaluate the results of other high-reliability industries and seek step-by-step implementation after critical assessment.

  3. Advancing the science of forensic data management

    NASA Astrophysics Data System (ADS)

    Naughton, Timothy S.

    2002-07-01

    Many individual elements comprise a typical forensics process. Collecting evidence, analyzing it, and using the results to draw conclusions are all mutually distinct endeavors. Different physical locations and personnel are involved, juxtaposed against an acute need for security and data integrity. Using digital technologies and the Internet's ubiquity, these diverse elements can be conjoined using digital data as the common element. The result is a new data management process that can be applied to serve all elements of the community. The first step is recognition of a forensics lifecycle. Evidence gathering, analysis, storage, and use in legal proceedings are actually distinct parts of a single end-to-end process; it is therefore hypothesized that a single data system can accommodate each constituent phase using common network and security protocols. This paper introduces the idea of a web-based Central Data Repository. Its cornerstone is anywhere, anytime Internet upload, viewing, and report distribution. Archives exist indefinitely after being created, and high-strength security and encryption protect data and ensure subsequent case file additions do not violate chain-of-custody or other handling provisions. Several legal precedents have been established for using digital information in courts of law; in fact, effective prosecution of cyber crimes absolutely relies on its use. An example is a US Department of Agriculture division's use of digital images to back up its inspection process, with pictures and information retained on secure servers to enforce the Perishable Agricultural Commodities Act. Forensics is a cumulative process. Secure, web-based data management solutions, such as the Central Data Repository postulated here, can support each process step. Logically marrying digital technologies with Internet accessibility should encourage the exploration of alternatives that make forensics data accessible to authorized individuals, whenever and wherever they need it.

  4. Kin groups and trait groups: population structure and epidemic disease selection.

    PubMed

    Fix, A G

    1984-10-01

    A Monte Carlo simulation based on the population structure of a small-scale human population, the Semai Senoi of Malaysia, has been developed to study the combined effects of group, kin, and individual selection. The population structure resembles D.S. Wilson's structured deme model in that local breeding populations (Semai settlements) are subdivided into trait groups (hamlets) that may be kin-structured and are not themselves demes. Additionally, settlement breeding populations are connected by two-dimensional stepping-stone migration approaching 30% per generation. Group and kin-structured group selection occur among hamlets, the survivors of which then disperse to breed within the settlement population. Genetic drift is modeled by the process of hamlet formation, individual selection as a deterministic process, and stepping-stone migration as either random or kin-structured migrant groups. The mechanism for group selection is epidemics of infectious disease, which can wipe out small hamlets, particularly if most adults become sick and social life collapses. Genetic resistance to a disease is an individual attribute; however, hamlet groups with several resistant adults are less likely to disintegrate and experience high social mortality. A specific human gene, hemoglobin E, which confers resistance to malaria, is studied as an example of the process. The results of the simulations show that high genetic variance among hamlet groups may be generated by moderate degrees of kin-structuring. This strong microdifferentiation provides the potential for group selection. The effect of group selection in this case is a rapid increase in gene frequencies across the total set of populations. In fact, group selection in concert with individual selection produced a faster rate of gene frequency increase among a set of 25 populations than the rate within a single unstructured population subject to deterministic individual selection. Such rapid evolution, with plausible rates of extinction, individual selection, and migration, and with a population structure realistic in its general form, has implications for specific human polymorphisms such as hemoglobin variants and for the more general problem of the tempo of evolution as well.
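    The epidemic group-selection mechanism described above can be sketched as a toy model: hamlets with more resistant adults are less likely to collapse when an epidemic strikes, so surviving hamlets carry the resistance allele at a higher frequency. All parameters below (epidemic probability, collapse threshold, hamlet sizes) are invented for illustration and are not the values of the Semai simulation:

```python
import random

random.seed(1)  # deterministic toy run

# Toy sketch of epidemic group selection: a hamlet survives an epidemic
# year if no epidemic strikes, or if enough adults are resistant to keep
# social life functioning. All parameters are invented for illustration.
def hamlet_survives(n_resistant, n_adults, epidemic_p=0.5):
    if random.random() > epidemic_p:
        return True
    return n_resistant / n_adults >= 0.3  # assumed collapse threshold

def carrier_freq(hamlets):
    return sum(r for r, _ in hamlets) / sum(n for _, n in hamlets)

# Hamlets as (resistant adults, total adults); the variance among hamlets
# is what group selection acts on.
hamlets = [(1, 10), (4, 10), (6, 10), (0, 10), (3, 10)]
before = carrier_freq(hamlets)
survivors = [h for h in hamlets if hamlet_survives(*h)]
after = carrier_freq(survivors) if survivors else 0.0
print(f"carrier frequency before: {before:.2f}, after: {after:.2f}")
```

    In a fuller model the survivors would then disperse and breed within the settlement, with drift, deterministic individual selection, and stepping-stone migration layered on top, as in the simulation described above.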

  5. Computerized procedures system

    DOEpatents

    Lipner, Melvin H.; Mundy, Roger A.; Franusich, Michael D.

    2010-10-12

    An online, data-driven computerized procedures system that guides an operator through a complex process facility's operating procedures. The system monitors plant data, processes the data and then, based upon this processing, presents the status of the current procedure step and/or substep to the operator. The system supports multiple users, and a single procedure definition supports several interface formats that can be tailored to the individual user. Layered security controls access privileges, and revisions are version controlled. The procedures run on a server that is platform independent of the user workstations it interfaces with, and the user interface supports diverse procedural views.

  6. Plantarflexion moment is a contributor to step length after-effect following walking on a split-belt treadmill in individuals with stroke and healthy individuals.

    PubMed

    Lauzière, Séléna; Miéville, Carole; Betschart, Martina; Duclos, Cyril; Aissaoui, Rachid; Nadeau, Sylvie

    2014-10-01

    To assess plantarflexion moment and hip joint moment after-effects following walking on a split-belt treadmill in healthy individuals and individuals post-stroke. Cross-sectional study. Ten healthy individuals (mean age 57.6 years (standard deviation; SD 17.2)) and twenty individuals post-stroke (mean age 49.3 years (SD 13.2)). Participants walked on an instrumented split-belt treadmill during 3 gait periods: i) baseline (tied-belt); ii) adaptation (split-belt); and iii) post-adaptation (tied-belt). Participants post-stroke performed the protocol with the paretic and nonparetic leg on the faster belt when belts were split. Kinematic data were recorded with the Optotrak system and ground reaction forces were collected via the instrumented split-belt treadmill. In both groups, the fast plantarflexion moment was reduced and the slow plantarflexion moment was increased from mid-stance to toe-off in the post-adaptation period. Significant relationships were found between the plantarflexion moment and contralateral step length. Split-belt treadmills could be useful for restoring step length symmetry in individuals post-stroke who present with a longer paretic step length because the use of this type of intervention increases paretic plantarflexion moments. This intervention might be less recommended for individuals post-stroke with a shorter paretic step length because it reduces the paretic plantarflexion moment.

  7. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Due to satellite architecture constraints and frequent cloud cover, daily high-spatial-resolution data are still far from reality. Generating remote sensing time series of high spatial and temporal resolution by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and requires multiple tools. In this paper, a framework for a Geographic Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates these difficulties by automating all the steps, enabling users to generate synthetic time series data with ease. First, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created: one to perform all the pre-processing steps on various satellite data, and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or combined to perform both processes in one go. The tool can handle most known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a good interface that provides many functionalities to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  8. Improving end of life care in care homes; an evaluation of the six steps to success programme.

    PubMed

    O'Brien, Mary; Kirton, Jennifer; Knighting, Katherine; Roe, Brenda; Jack, Barbara

    2016-06-03

    There are approximately 426,000 people residing within care homes in the UK. Residents often have complex trajectories of dying, which make it difficult for staff to manage their end-of-life care. There is growing recognition of the need to support care home staff in the care of these residents through increased educational initiatives. One such initiative is the Six Steps to Success programme. To evaluate the implementation of Six Steps with the first cohort of care homes to complete the end-of-life programme in the North West of England, a pragmatic evaluation methodology was implemented in 2012-2013 using multiple methods of qualitative data collection: an online questionnaire with facilitators (n = 16), interviews with facilitators (n = 9), and case studies of care homes that had completed the programme (n = 6). The evaluation explored the implementation approach and experiences of the programme facilitators and obtained a detailed account of the impact of Six Steps on individual care homes. The programme was based upon the National Health Service (NHS) End of Life Care (EoLC) Programme, The Route to Success in EoLC - Achieving Quality in Care Homes, and was flexibly designed so that it could be tailored to the geographical location and the individual cohort requirements. Facilitators provided comprehensive and flexible support to care homes. Challenges to programme success included a lack of time allocated to champions to devote to additional programme work, inappropriate staff selected as 'champions', and staff sickness/high staff turnover, all of which made it difficult to embed programme values. Benefits of completing the programme included improvement in advance care planning, improved staff communication/confidence when dealing with multi-disciplinary teams, improved end-of-life processes/documentation, and increased staff confidence through acquisition of new knowledge and new processes.
The findings suggested an overall positive impact from the programme. This flexibly designed programme continues to be dynamic, iteratively amended and improved, which may affect the direct transferability of the results to future cohorts.

  9. Experimental study on the stability and failure of individual step-pool

    NASA Astrophysics Data System (ADS)

    Zhang, Chendi; Xu, Mengzhen; Hassan, Marwan A.; Chartrand, Shawn M.; Wang, Zhaoyin

    2018-06-01

    Step-pools are one of the most common bedforms in mountain streams, and their stability and failure play a significant role in riverbed stability and fluvial processes. Given this importance, flume experiments were performed with a manually constructed step-pool model. The experiments were carried out with a constant flow rate to study features of step-pool stability as well as failure mechanisms. The results demonstrate that motion of the keystone grain (KS) caused 90% of the total failure events. The pool reached its maximum depth and either exhibited relative stability for a period before step failure, which was called the stable phase, or collapsed before its full development. The critical scour depth for the pool increased linearly with discharge until the trend was interrupted by step failure. Variability of the stable phase duration spanned one order of magnitude, whereas variability of pool scour depth was constrained within 50%. Step adjustment was detected in almost all of the runs with step-pool failure and was one or two orders of magnitude smaller than the diameter of the step stones. Two discharge regimes for step-pool failure were revealed: one regime captures threshold conditions and frames possible step-pool failure, whereas the second regime captures step-pool failure conditions and is the discharge of an exceptional event. In the transitional stage between the two discharge regimes, pool and step adjustment magnitude displayed relatively large variabilities, which resulted in feedbacks that extended the duration of step-pool stability. Step adjustment, which was a type of structural deformation, increased significantly before step failure. As a result, we consider step deformation as the direct explanation for step-pool failure rather than pool scour, which displayed relative stability during step deformations in our experiments.

  10. Use of Single-Cysteine Variants for Trapping Transient States in DNA Mismatch Repair.

    PubMed

    Friedhoff, Peter; Manelyte, Laura; Giron-Monzon, Luis; Winkler, Ines; Groothuizen, Flora S; Sixma, Titia K

    2017-01-01

    DNA mismatch repair (MMR) is necessary to prevent incorporation of polymerase errors into the newly synthesized DNA strand, as they would be mutagenic. In humans, errors in MMR cause a predisposition to cancer, called Lynch syndrome. The MMR process is performed by a set of ATPases that transmit, validate, and couple information to identify which DNA strand requires repair. To understand the individual steps in the repair process, it is useful to be able to study these large molecular machines structurally and functionally. However, the steps and states are highly transient; therefore, the methods to capture and enrich them are essential. Here, we describe how single-cysteine variants can be used for specific cross-linking and labeling approaches that allow trapping of relevant transient states. Analysis of these defined states in functional and structural studies is instrumental to elucidate the molecular mechanism of this important DNA MMR process. © 2017 Elsevier Inc. All rights reserved.

  11. Neutron diffraction measurement of residual stresses, dislocation density and texture in Zr-bonded U-10Mo “mini” fuel foils and plates

    DOE PAGES

    Brown, Donald William; Okuniewski, Maria A.; Sisneros, Thomas A.; ...

    2016-12-01

    Here, Al clad U-10Mo fuel plates are being considered for conversion of several research reactors from high-enriched to low-enriched U fuel. Neutron diffraction measurements of the textures, residual phase stresses, and dislocation densities in the individual phases of the mini-foils throughout several processing steps and following hot-isostatic pressing to the Al cladding, have been completed. Recovery and recrystallization of the bare U-10Mo fuel foil, as indicated by the dislocation density and texture, are observed depending on the state of the material prior to annealing and the duration and temperature of the annealing process. In general, the cladding procedure significantly reduces the dislocation density, but the final state of the clad plate, both texture and dislocation density, depends strongly on the final processing step of the fuel foil. In contrast, the residual stress state of the final plate is dominated by the thermal expansion mismatch of the constituent materials.

  13. Have sex or not? Lessons from bacteria.

    PubMed

    Lodé, T

    2012-01-01

    Sex is one of the greatest puzzles in evolutionary biology. A true meiotic process occurs only in eukaryotes, while in bacteria, gene transcription is fragmentary, so asexual reproduction in this case really means clonal reproduction. Sex could stem from a signal that leads to increased reproductive output of all interacting individuals and could be understood as a secondary consequence of primitive metabolic reactions. Meiotic sex evolved in proto-eukaryotes to solve a problem that bacteria did not have, namely a large amount of DNA material, occurring in an archaic step of proto-cell formation and genetic exchanges. Rather than providing selective advantages through reproduction, sex could be thought of as a series of separate events which combines step-by-step some very weak benefits of recombination, meiosis, gametogenesis and syngamy. Copyright © 2012 S. Karger AG, Basel.

  14. From Sequences to Insights in Microbial Ecology

    PubMed Central

    Knight, R.

    2010-01-01

    Rapid declines in the cost of sequencing have made large volumes of DNA sequence data available to individual investigators. Now, data analysis is the rate-limiting step: providing a user with sequences alone typically leads to bewilderment, frustration, and skepticism about the technology. In this talk, I focus on how to extract insights from 16S rRNA data, including key lab steps (barcoding and normalization) and which tools are available to perform routine but essential processing steps such as denoising, chimera detection, taxonomy assignment, and diversity analyses (including detection of biological clusters and gradients in the samples). Providing users with advice on these points and with a standard pipeline they can exploit (but modify if circumstances require) can greatly accelerate the rate of understanding, publication, and acquisition of funding for further studies.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Yueh-Ning; Hennebelle, Patrick; Chabrier, Gilles, E-mail: yueh-ning.lee@cea.fr

    Observations suggest that star formation in filamentary molecular clouds occurs in a two-step process, with the formation of filaments preceding that of prestellar cores and stars. Here, we apply the gravoturbulent fragmentation theory of Hennebelle and Chabrier to a filamentary environment, taking into account magnetic support. We discuss the induced geometrical effect on the cores, with a transition from 3D geometry at small scales to 1D at large ones. The model predicts the fragmentation behavior of a filament for a given mass per unit length (MpL) and level of magnetization. This core mass function (CMF) for individual filaments is then convolved with the distribution of filaments to obtain the final system CMF. The model yields two major results. (i) The filamentary geometry naturally induces a hierarchical fragmentation process, first into groups of cores, separated by a length equal to a few filament Jeans lengths, i.e., a few times the filament width. These groups then fragment into individual cores. (ii) Non-magnetized filaments with high MpL are found to fragment excessively, at odds with observations. This is resolved by taking into account the magnetic field (treated simply as additional pressure support). The present theory suggests two complementary modes of star formation: although small (spherical or filamentary) structures will collapse directly into prestellar cores, according to the standard Hennebelle–Chabrier theory, the large (filamentary) ones, the dominant population according to observations, will follow the aforedescribed two-step process.

  16. Toward the reconstitution of synthetic cell motility

    PubMed Central

    Siton-Mendelson, Orit; Bernheim-Groswasser, Anne

    2016-01-01

    ABSTRACT Cellular motility is a fundamental process essential for embryonic development, wound healing, immune responses, and tissue development. Cells mostly move by crawling on external, or inside, substrates, which can differ in their surface composition, geometry, and dimensionality. Cells can adopt different migration phenotypes, e.g., bleb-based and protrusion-based, depending on myosin contractility, surface adhesion, and cell confinement. In the past few decades, research on cell motility has focused on uncovering the major molecular players and their order of events. Despite major progress, our ability to infer the collective behavior from the molecular properties remains a major challenge, especially because cell migration integrates numerous chemical and mechanical processes that are coupled via feedbacks spanning a large range of time and length scales. For this reason, reconstituted model systems were developed. These systems allow for full control of the molecular constituents and various system parameters, thereby providing insight into their individual roles and functions. In this review we describe the various reconstituted model systems that were developed in the past decades. Because of the multiple steps involved in cell motility and the complexity of the overall process, most of the model systems focus on very specific aspects of the individual steps of cell motility. Here we describe the main advancements in cell motility reconstitution and discuss the main challenges toward the realization of a synthetic motile cell. PMID:27019160

  17. Stockpile Dismantlement Database Training Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-11-01

    This document, the Stockpile Dismantlement Database (SDDB) training materials, is designed to familiarize the user with the SDDB windowing system and the data entry steps for Component Characterization for Disposition. The foundation of information required for every part is depicted using numbered graphic and text steps. The individual entering data is led step by step through generic and specific examples. These training materials are intended to be supplements to individual on-the-job training.

  18. A two-step initial mass function:. Consequences of clustered star formation for binary properties

    NASA Astrophysics Data System (ADS)

    Durisen, R. H.; Sterzik, M. F.; Pickett, B. K.

    2001-06-01

    If stars originate in transient bound clusters of moderate size, these clusters will decay due to dynamic interactions in which a hard binary forms and ejects most or all of the other stars. When the cluster members are chosen at random from a reasonable initial mass function (IMF), the resulting binary characteristics do not match current observations. We find a significant improvement in the trends of binary properties from this scenario when an additional constraint is taken into account, namely that there is a distribution of total cluster masses set by the masses of the cloud cores from which the clusters form. Two distinct steps then determine final stellar masses: the choice of a cluster mass and the formation of the individual stars. We refer to this as a ``two-step'' IMF. Simple statistical arguments are used in this paper to show that a two-step IMF, combined with typical results from dynamic few-body system decay, tends to give better agreement between computed binary characteristics and observations than a one-step mass selection process.
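
The two-step mass selection can be illustrated with a small sampling sketch: first draw a cluster (cloud-core) mass, then draw individual stellar masses from an IMF until the cluster mass is used up. The log-normal cluster-mass distribution and the power-law IMF parameters below are generic assumptions, not the paper's exact choices:

```python
import random

random.seed(2)

def sample_imf(alpha=2.35, m_min=0.1, m_max=10.0):
    """Power-law IMF dN/dm ∝ m^-alpha via inverse-transform sampling."""
    u = random.random()
    a = 1.0 - alpha
    return (m_min**a + u * (m_max**a - m_min**a)) ** (1.0 / a)

def sample_cluster(m_cluster):
    """Step two: fill one cluster with stars drawn from the IMF."""
    stars, total = [], 0.0
    while total < m_cluster:
        m = sample_imf()
        stars.append(m)
        total += m
    return stars

def two_step_population(n_clusters=100, mean_log_mc=0.7, sigma=0.4):
    """Step one: log-normal distribution of total cluster masses."""
    clusters = []
    for _ in range(n_clusters):
        mc = 10 ** random.gauss(mean_log_mc, sigma)
        clusters.append(sample_cluster(mc))
    return clusters

clusters = two_step_population()
multiples = sum(1 for c in clusters if len(c) >= 2)
print(f"{multiples}/{len(clusters)} clusters have >= 2 members")
```

Because the cluster-mass draw caps the total stellar mass per decaying system, the membership statistics of each few-body system differ from a one-step draw of the same number of stars from the bare IMF.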

  19. Informal learning processes in support of clinical service delivery in a service-oriented community pharmacy.

    PubMed

    Patterson, Brandon J; Bakken, Brianne K; Doucette, William R; Urmie, Julie M; McDonough, Randal P

    The evolving health care system necessitates pharmacy organizations' adjustments by delivering new services and establishing inter-organizational relationships. One approach supporting pharmacy organizations in making changes may be informal learning by technicians, pharmacists, and pharmacy owners. Informal learning is characterized by a four-step cycle including intent to learn, action, feedback, and reflection. This framework helps explain individual and organizational factors that influence learning processes within an organization as well as the individual and organizational outcomes of those learning processes. A case study was conducted of an Iowa independent community pharmacy with years of experience in offering patient care services. Nine semi-structured interviews with pharmacy personnel revealed initial evidence in support of the informal learning model in practice. Future research could investigate more fully the informal learning model in delivery of patient care services in community pharmacies. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Come one, come all.

    PubMed

    Lee, Siu Sylvia

    2004-05-05

    Aging is a complex process that involves the gradual functional decline of many different tissues and cells. Gene expression microarray analysis provides a comprehensive view of the gene expression signature associated with age and is particularly valuable for understanding the molecular mechanisms that contribute to the aging process. However, because of the stochastic nature of the aging process, animals of the same chronological age often manifest great physiological differences. Therefore, profiling the gene expression pattern of a large population of aging animals risks either exaggerating or masking the changes in gene expression that correspond to physiological aging. In a recent paper, Golden and Melov surveyed the gene expression profiles of individual aging Caenorhabditis elegans, hoping to circumvent the problem of variability among worms of the same chronological age. This initial analysis of age-dependent gene expression in individual aging worms is an important step toward deciphering the molecular basis of physiological aging.

  1. A model for critical thinking measurement of dental student performance.

    PubMed

    Johnsen, David C; Finkelstein, Michael W; Marshall, Teresa A; Chalkley, Yvonne M

    2009-02-01

    The educational application of critical thinking has increased in the last twenty years with programs like problem-based learning. Performance measurement related to the dental student's capacity for critical thinking remains elusive, however. This article offers a model now in use to measure critical thinking applied to patient assessment and treatment planning across the four years of the dental school curriculum and across clinical disciplines. Two elements of the model are described: 1) a critical thinking measurement "cell," and 2) a list of minimally essential steps in critical thinking for patient assessment and treatment planning. Issues pertaining to this model are discussed: adaptations on the path from novice to expert, the role of subjective measurement, variations supportive of the model, and the correlation of individual and institutional assessment. The critical thinking measurement cell consists of interacting performance tasks and measures. The student identifies the step in the process (for example, chief complaint) with objective measurement; the student then applies the step to a patient or case with subjective measurement; the faculty member then combines the objective and subjective measurements into an evaluation on progress toward competence. The activities in the cell are then repeated until all the steps in the process have been addressed. A next task is to determine consistency across the four years and across clinical disciplines.

  2. Modular Self-Assembly of Protein Cage Lattices for Multistep Catalysis

    DOE PAGES

    Uchida, Masaki; McCoy, Kimberly; Fukuto, Masafumi; ...

    2017-11-13

    The assembly of individual molecules into hierarchical structures is a promising strategy for developing three-dimensional materials with properties arising from interaction between the individual building blocks. Virus capsids are elegant examples of biomolecular nanostructures, which are themselves hierarchically assembled from a limited number of protein subunits. Here, we demonstrate the bio-inspired modular construction of materials with two levels of hierarchy: the formation of catalytically active individual virus-like particles (VLPs) through directed self-assembly of capsid subunits with enzyme encapsulation, and the assembly of these VLP building blocks into three-dimensional arrays. The structure of the assembled arrays was successfully altered from an amorphous aggregate to an ordered structure, with a face-centered cubic lattice, by modifying the exterior surface of the VLP without changing its overall morphology, to modulate interparticle interactions. The assembly behavior and resultant lattice structure was a consequence of interparticle interaction between exterior surfaces of individual particles and thus independent of the enzyme cargos encapsulated within the VLPs. These superlattice materials, composed of two populations of enzyme-packaged VLP modules, retained the coupled catalytic activity in a two-step reaction for isobutanol synthesis. As a result, this study demonstrates a significant step toward the bottom-up fabrication of functional superlattice materials using a self-assembly process across multiple length scales and exhibits properties and function that arise from the interaction between individual building blocks.

  5. Mass production of silicon pore optics for ATHENA

    NASA Astrophysics Data System (ADS)

    Wille, Eric; Bavdaz, Marcos; Collon, Maximilien

    2016-07-01

    Silicon Pore Optics (SPO) provide high angular resolution with low effective area density, as required for the Advanced Telescope for High Energy Astrophysics (Athena). The x-ray telescope consists of several hundred SPO mirror modules. During the development of the process steps of the SPO technology, the specific requirements of a future mass production were considered right from the beginning. The manufacturing methods heavily utilise off-the-shelf equipment from the semiconductor industry, robotic automation, and parallel processing. This allows the present production flow to be scaled up in a cost-effective way to produce hundreds of mirror modules per year. Considering manufacturing predictions based on the current technology status, we present an analysis of the time and resources required for the Athena flight programme. This includes the full production process, starting with Si wafers up to the integration of the mirror modules. We present the times required for the individual process steps and identify the equipment required to produce two mirror modules per day. A preliminary timeline for building and commissioning the required infrastructure, and for flight model production of about 1000 mirror modules, is presented.
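
A rough consistency check of the figures quoted in this record (two mirror modules per day, about 1000 modules for flight production); the 250 working days per year is our assumption, not a number from the paper:

```python
# Back-of-envelope production span for the Athena flight programme.
modules_needed = 1000        # approximate flight-model count from the abstract
modules_per_day = 2          # target throughput from the abstract
working_days_per_year = 250  # assumed; not stated in the abstract

years = modules_needed / (modules_per_day * working_days_per_year)
print(f"flight production span: {years:.1f} years")  # 2.0 years
```

So the quoted throughput corresponds to roughly a two-year flight production campaign, consistent with "hundreds of mirror modules per year".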

  6. [ASSESSMENT OF EXTREME FACTORS OF SHIFT WORK IN ARCTIC CONDITIONS BY WORKERS WITH DIFFERENT REGULATORY PROCESSES].

    PubMed

    Korneeva, Ya A; Simonova, N N

    2016-01-01

    A person working on a shift basis in the Arctic is exposed daily to various extreme factors that are unavoidable in the oil and gas industry. To adapt to shift work, employees draw on various individual resources. The purpose of this research is to determine the personal resources that shift workers use to overcome adverse environmental factors in the Arctic. The study involved 191 builders of main gas pipelines working in shifts in the Tyumen region (shift length of 52 days), aged 23 to 59 (mean age 34.9 ± 8.1) years. Methods: psychological testing, questionnaires, observation, descriptive statistics, and stepwise discriminant analysis. Correlations were revealed between the subjective assessment of the majority of adverse climatic factors and the regulatory process "assessment of results"; production factors correlated with regulatory processes such as flexibility, autonomy, modeling, and the general level of self-regulation; social factors were more strongly associated with the severity of regulatory processes such as flexibility and assessment of results.

  7. Object Segmentation and Ground Truth in 3D Embryonic Imaging.

    PubMed

    Rajasekaran, Bhavna; Uriu, Koichiro; Valentin, Guillaume; Tinevez, Jean-Yves; Oates, Andrew C

    2016-01-01

    Many questions in developmental biology depend on measuring the position and movement of individual cells within developing embryos. Yet, tools that provide this data are often challenged by high cell density and their accuracy is difficult to measure. Here, we present a three-step procedure to address this problem. Step one is a novel segmentation algorithm based on image derivatives that, in combination with selective post-processing, reliably and automatically segments cell nuclei from images of densely packed tissue. Step two is a quantitative validation using synthetic images to ascertain the efficiency of the algorithm with respect to signal-to-noise ratio and object density. Finally, we propose an original method to generate reliable and experimentally faithful ground truth datasets: Sparse-dense dual-labeled embryo chimeras are used to unambiguously measure segmentation errors within experimental data. Together, the three steps outlined here establish a robust, iterative procedure to fine-tune image analysis algorithms and microscopy settings associated with embryonic 3D image data sets.
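
Step one of the procedure can be caricatured in a few lines: a deliberately simplified derivative-based pass on a synthetic image, shown here only to illustrate the idea of segmenting via image derivatives; it is not the authors' algorithm, and all parameters are assumptions:

```python
import numpy as np
from scipy import ndimage

def segment(image, sigma=2.0, edge_frac=0.3):
    """Smooth, locate high-gradient boundaries, fill the enclosed
    regions, and label connected components (toy derivative-based pass)."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
    grad = np.hypot(*np.gradient(smoothed))        # gradient magnitude
    edges = grad > edge_frac * grad.max()          # boundary pixels
    filled = ndimage.binary_fill_holes(edges)      # close object interiors
    labels, n = ndimage.label(filled)              # connected components
    return labels, n

# synthetic validation image (step two in spirit): two bright "nuclei"
img = np.zeros((64, 64))
yy, xx = np.mgrid[:64, :64]
img[(yy - 20) ** 2 + (xx - 20) ** 2 < 36] = 1.0
img[(yy - 45) ** 2 + (xx - 45) ** 2 < 36] = 1.0

labels, n = segment(img)
print(f"segmented {n} objects")
```

Running the pass on a synthetic image with known object count mirrors the paper's second step: validation against ground truth where the answer is known by construction.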

  8. Object Segmentation and Ground Truth in 3D Embryonic Imaging

    PubMed Central

    Rajasekaran, Bhavna; Uriu, Koichiro; Valentin, Guillaume; Tinevez, Jean-Yves; Oates, Andrew C.

    2016-01-01

    Many questions in developmental biology depend on measuring the position and movement of individual cells within developing embryos. Yet, tools that provide this data are often challenged by high cell density and their accuracy is difficult to measure. Here, we present a three-step procedure to address this problem. Step one is a novel segmentation algorithm based on image derivatives that, in combination with selective post-processing, reliably and automatically segments cell nuclei from images of densely packed tissue. Step two is a quantitative validation using synthetic images to ascertain the efficiency of the algorithm with respect to signal-to-noise ratio and object density. Finally, we propose an original method to generate reliable and experimentally faithful ground truth datasets: Sparse-dense dual-labeled embryo chimeras are used to unambiguously measure segmentation errors within experimental data. Together, the three steps outlined here establish a robust, iterative procedure to fine-tune image analysis algorithms and microscopy settings associated with embryonic 3D image data sets. PMID:27332860
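    The derivative-based segmentation idea in step one can be illustrated with a deliberately simplified toy, not the authors' actual algorithm: mark bright, locally flat pixels (low gradient magnitude, i.e. nucleus interiors), grow 4-connected components, and drop components below a size threshold as a stand-in for the selective post-processing. The `grad_thresh` and `min_size` parameters are invented for this sketch.

```python
import numpy as np

def segment_nuclei(img, grad_thresh=0.5, min_size=2):
    """Toy derivative-based segmentation: label bright, locally flat
    regions (nucleus interiors) and discard tiny components."""
    gy, gx = np.gradient(img.astype(float))
    mask = (img > img.mean()) & (np.hypot(gx, gy) < grad_thresh)
    labels = np.zeros(img.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack, pixels = [seed], []
        while stack:                       # grow a 4-connected component
            y, x = stack.pop()
            if (0 <= y < img.shape[0] and 0 <= x < img.shape[1]
                    and mask[y, x] and not labels[y, x]):
                labels[y, x] = current
                pixels.append((y, x))
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
        if len(pixels) < min_size:         # "selective post-processing"
            for y, x in pixels:
                labels[y, x], mask[y, x] = 0, False
            current -= 1
    return labels

img = np.zeros((12, 12))
img[1:5, 1:5] = 1.0    # two bright 4x4 "nuclei"
img[7:11, 7:11] = 1.0
labels = segment_nuclei(img)
```

    On this synthetic image only the flat interiors of the two squares survive the gradient test, yielding two labeled components, which mirrors step two of the paper: validating the segmenter against synthetic images of known content.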

  9. Automation of route identification and optimisation based on data-mining and chemical intuition.

    PubMed

    Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G

    2017-09-21

    Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.

  10. Development of DKB ETL module in case of data conversion

    NASA Astrophysics Data System (ADS)

    Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.

    2018-05-01

    Modern scientific experiments produce huge volumes of data, requiring new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a valuable amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue, requiring solutions outside the box. One of the tasks is to integrate metadata from different repositories into a central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about the LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of pipelines arranging data flow from data sources to the main DKB storage. The data transformation process represented by a single pipeline can be considered a sequence of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.
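    The pipeline-of-modules architecture described above can be sketched in a few lines. This is an illustrative sketch, not DKB code; the step names and record fields are invented:

```python
from functools import reduce

def run_pipeline(records, steps):
    """Feed records through a chain of independent transformation
    modules; each step takes and returns a list of record dicts."""
    return reduce(lambda recs, step: step(recs), steps, records)

# Hypothetical transformation steps for illustration.
def normalize_keys(recs):
    # transformation according to a common data model: lowercase keys
    return [{k.lower(): v for k, v in r.items()} for r in recs]

def tag_source(recs):
    # annotate each record with its (invented) provenance
    return [dict(r, source="primary") for r in recs]

out = run_pipeline([{"DatasetName": "data16_13TeV"}],
                   [normalize_keys, tag_source])
```

    Because each step is a self-contained function over records, modules can be developed, tested, and replaced individually, which is the property the article attributes to the DKB dataflow.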

  11. Effects of processing steps on the phenolic content and antioxidant activity of beer.

    PubMed

    Leitao, Céline; Marchioni, Eric; Bergaentzlé, Martine; Zhao, Minjie; Didierjean, Luc; Taidi, Behnam; Ennahar, Saïd

    2011-02-23

    A new analytical method (liquid chromatography-antioxidant, LC-AOx) was used that is intended to separate beer polyphenols and to determine the potential antioxidant activity of these constituents after they were allowed to react online with a buffered solution of the radical cation 2,2'-azinobis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS(•+)). Using the LC-AOx method, it was possible to demonstrate that the extent of the antioxidant activity was very much dependent on the phenolic compound considered. The method was also applied to the analysis of beer extracts and allowed the evaluation of their antioxidant activity at different steps of beer processing: brewing, boiling, and fermentation. This study showed that the total antioxidant activity remained unchanged throughout beer processing, as opposed to the polyphenolic content, which showed a 3-fold increase. Hopping and fermentation steps were the main causes of this increase. However, the increase measured after fermentation was attributed to a better extraction of polyphenols due to the presence of ethanol, rather than to a real increase in their content. Moreover, this method allowed the detection of three unknown antioxidant compounds, which accounted for 64 ± 4% of the total antioxidant activity of beer and were individually more efficient than caffeic acid and epicatechin.

  12. Process for manufacture of inertial confinement fusion targets and resulting product

    DOEpatents

    Masnari, Nino A.; Rensel, Walter B.; Robinson, Merrill G.; Solomon, David E.; Wise, Kensall D.; Wuttke, Gilbert H.

    1982-01-01

    An ICF target comprising a spherical pellet of fusion fuel surrounded by a concentric shell; and a process for manufacturing the same which includes the steps of forming hemispheric shells of a silicon or other substrate material, adhering the shell segments to each other with a fuel pellet contained concentrically therein, then separating the individual targets from the parent substrate. Formation of hemispheric cavities by deposition or coating of a mold substrate is also described. Coatings or membranes may also be applied to the interior of the hemispheric segments prior to joining.

  13. Laboratory procedures to generate viral metagenomes.

    PubMed

    Thurber, Rebecca V; Haynes, Matthew; Breitbart, Mya; Wegley, Linda; Rohwer, Forest

    2009-01-01

    This collection of laboratory protocols describes the steps to collect viruses from various samples with the specific aim of generating viral metagenome sequence libraries (viromes). Viral metagenomics, the study of uncultured viral nucleic acid sequences from different biomes, relies on several concentration, purification, extraction, sequencing and heuristic bioinformatic methods. No single technique can provide an all-inclusive approach, and therefore the protocols presented here will be discussed in terms of hypothetical projects. However, care must be taken to individualize each step depending on the source and type of viral particles. This protocol is a description of the processes we have successfully used to: (i) concentrate viral particles from various types of samples, (ii) eliminate contaminating cells and free nucleic acids and (iii) extract, amplify and purify viral nucleic acids. Overall, a sample can be processed to isolate viral nucleic acids suitable for high-throughput sequencing in approximately 1 week.

  14. Individual heterogeneity in life histories and eco-evolutionary dynamics

    PubMed Central

    Vindenes, Yngvild; Langangen, Øystein

    2015-01-01

    Individual heterogeneity in life history shapes eco-evolutionary processes, and unobserved heterogeneity can affect demographic outputs characterising life history and population dynamical properties. Demographic frameworks like matrix models or integral projection models represent powerful approaches to disentangle mechanisms linking individual life histories and population-level processes. Recent developments have provided important steps towards their application to study eco-evolutionary dynamics, but so far individual heterogeneity has largely been ignored. Here, we present a general demographic framework that incorporates individual heterogeneity in a flexible way, by separating static and dynamic traits (discrete or continuous). First, we apply the framework to derive the consequences of ignoring heterogeneity for a range of widely used demographic outputs. A general conclusion is that besides the long-term growth rate lambda, all parameters can be affected. Second, we discuss how the framework can help advance current demographic models of eco-evolutionary dynamics, by incorporating individual heterogeneity. For both applications numerical examples are provided, including an empirical example for pike. For instance, we demonstrate that predicted demographic responses to climate warming can be reversed by increased heritability. We discuss how applications of this demographic framework incorporating individual heterogeneity can help answer key biological questions that require a detailed understanding of eco-evolutionary dynamics. PMID:25807980
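    For readers unfamiliar with the demographic machinery, the long-term growth rate λ referred to above is the dominant eigenvalue of the projection matrix. A minimal numerical sketch, with made-up vital rates rather than anything from the paper:

```python
import numpy as np

# Two-stage projection matrix with invented vital rates:
# row 0 = fecundities, row 1 = survival/transition probabilities.
A = np.array([[0.0, 1.5],
              [0.5, 0.4]])

# Long-term growth rate lambda = dominant eigenvalue of A.
lam = max(abs(np.linalg.eigvals(A)))

# Sanity check: iterating the population vector converges to
# growth at rate lambda (power iteration).
n = np.array([1.0, 1.0])
for _ in range(100):
    n = A @ n
    n /= n.sum()          # renormalize to avoid overflow
growth = (A @ n).sum() / n.sum()
```

    Heterogeneity enters such models by expanding the state space (e.g. one block per static type); as the abstract notes, outputs other than λ can then shift even when λ itself does not.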

  15. Developing Poultry Facility Type Information from USDA Agricultural Census Data for Use in Epidemiological and Economic Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melius, C

    2007-12-05

    The epidemiological and economic modeling of poultry diseases requires knowing the size, location, and operational type of each poultry type operation within the US. At the present time, the only national database of poultry operations that is available to the general public is the USDA's 2002 Agricultural Census data, published by the National Agricultural Statistics Service, herein referred to as the 'NASS data'. The NASS data provides census data at the county level on poultry operations for various operation types (i.e., layers, broilers, turkeys, ducks, geese). However, the number of farms and sizes of farms for the various types are not independent since some facilities have more than one type of operation. Furthermore, some data on the number of birds represents the number sold, which does not represent the number of birds present at any given time. In addition, any data tabulated by NASS that could identify numbers of birds or other data reported by an individual respondent is suppressed by NASS and coded with a 'D'. To be useful for epidemiological and economic modeling, the NASS data must be converted into a unique set of facility types (farms having similar operational characteristics). The unique set must not double count facilities or birds. At the same time, it must account for all the birds, including those for which the data has been suppressed. Therefore, several data processing steps are required to work back from the published NASS data to obtain a consistent database for individual poultry operations. This technical report documents data processing steps that were used to convert the NASS data into a national poultry facility database with twenty-six facility types (7 egg-laying, 6 broiler, 1 backyard, 3 turkey, and 9 others, representing ducks, geese, ostriches, emus, pigeons, pheasants, quail, game fowl breeders and 'other'). The process involves two major steps. The first step defines the rules used to estimate the data that is suppressed within the NASS database. The first step is similar to the first step used to estimate suppressed data for livestock [Melius et al (2006)]. The second step converts the NASS poultry types into the operational facility types used by the epidemiological and economic model. We also define two additional facility types for high and low risk poultry backyards, and an additional two facility types for live bird markets and swap meets. The distribution of these additional facility types among counties is based on US population census data. The algorithm defining the number of premises and the corresponding distribution among counties and the resulting premises density plots for the continental US are provided.
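    The first step, estimating the suppressed ('D') entries, can be illustrated with a deliberately simplified allocation rule. The report's actual rules are more elaborate, so this is only a hypothetical sketch: distribute whatever remains of a known state total evenly over the suppressed counties.

```python
def fill_suppressed(county_counts, state_total):
    """county_counts: list of ints, with 'D' marking suppressed entries.
    Return the list with each 'D' replaced by an equal share of the
    remainder of the state total (toy allocation rule, not the report's)."""
    known = sum(c for c in county_counts if c != "D")
    hidden = [i for i, c in enumerate(county_counts) if c == "D"]
    share = (state_total - known) / len(hidden) if hidden else 0
    return [share if c == "D" else c for c in county_counts]

# Two counties published, two suppressed; state total 250 leaves 100
# birds to split between the suppressed counties.
filled = fill_suppressed([120, "D", 30, "D"], 250)
```

    The key property any such rule must preserve is the one the report states: every bird is accounted for without double counting, so the filled counties always sum back to the published state total.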

  16. Analysis of Strengths, Weaknesses, Opportunities, and Threats as a Tool for Translating Evidence into Individualized Medical Strategies (I-SWOT)

    PubMed Central

    von Kodolitsch, Yskert; Bernhardt, Alexander M.; Robinson, Peter N.; Kölbel, Tilo; Reichenspurner, Hermann; Debus, Sebastian; Detter, Christian

    2015-01-01

    Background It is the physicians’ task to translate evidence and guidelines into medical strategies for individual patients. Until today, however, there is no formal tool that is instrumental to perform this translation. Methods We introduce the analysis of strengths (S) and weaknesses (W) related to therapy with opportunities (O) and threats (T) related to individual patients as a tool to establish an individualized (I) medical strategy (I-SWOT). The I-SWOT matrix identifies four fundamental types of strategy. These comprise “SO” maximizing strengths and opportunities, “WT” minimizing weaknesses and threats, “WO” minimizing weaknesses and maximizing opportunities, and “ST” maximizing strengths and minimizing threats. Each distinct type of strategy may be considered for individualized medical strategies. Results We describe four steps of I-SWOT to establish an individualized medical strategy to treat aortic disease. In the first step, we define the goal of therapy and identify all evidence-based therapeutic options. In a second step, we assess strengths and weaknesses of each therapeutic option in a SW matrix form. In a third step, we assess opportunities and threats related to the individual patient, and in a final step, we use the I-SWOT matrix to establish an individualized medical strategy through matching “SW” with “OT”. As an example we present two 30-year-old patients with Marfan syndrome with identical medical history and aortic pathology. As a result of I-SWOT analysis of their individual opportunities and threats, we identified two distinct medical strategies in these patients. Conclusion I-SWOT is a formal but easy to use tool to translate medical evidence into individualized medical strategies. PMID:27069939

  17. Analysis of Strengths, Weaknesses, Opportunities, and Threats as a Tool for Translating Evidence into Individualized Medical Strategies (I-SWOT).

    PubMed

    von Kodolitsch, Yskert; Bernhardt, Alexander M; Robinson, Peter N; Kölbel, Tilo; Reichenspurner, Hermann; Debus, Sebastian; Detter, Christian

    2015-06-01

    It is the physicians' task to translate evidence and guidelines into medical strategies for individual patients. Until today, however, there is no formal tool that is instrumental to perform this translation. We introduce the analysis of strengths (S) and weaknesses (W) related to therapy with opportunities (O) and threats (T) related to individual patients as a tool to establish an individualized (I) medical strategy (I-SWOT). The I-SWOT matrix identifies four fundamental types of strategy. These comprise "SO" maximizing strengths and opportunities, "WT" minimizing weaknesses and threats, "WO" minimizing weaknesses and maximizing opportunities, and "ST" maximizing strengths and minimizing threats. Each distinct type of strategy may be considered for individualized medical strategies. We describe four steps of I-SWOT to establish an individualized medical strategy to treat aortic disease. In the first step, we define the goal of therapy and identify all evidence-based therapeutic options. In a second step, we assess strengths and weaknesses of each therapeutic option in a SW matrix form. In a third step, we assess opportunities and threats related to the individual patient, and in a final step, we use the I-SWOT matrix to establish an individualized medical strategy through matching "SW" with "OT". As an example we present two 30-year-old patients with Marfan syndrome with identical medical history and aortic pathology. As a result of I-SWOT analysis of their individual opportunities and threats, we identified two distinct medical strategies in these patients. I-SWOT is a formal but easy to use tool to translate medical evidence into individualized medical strategies.
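    The final matching step ("SW" against "OT") lends itself to a toy scoring sketch. This is purely illustrative, not a clinical tool, and the attribute tags are invented: a therapy's strengths are rewarded when they meet a patient's opportunities ("SO"), and its weaknesses penalized when they meet the patient's threats ("WT").

```python
def i_swot_score(strengths, weaknesses, opportunities, threats):
    """Reward S-O matches, penalize W-T matches (toy 'SO'/'WT' scoring)."""
    return (len(set(strengths) & set(opportunities))
            - len(set(weaknesses) & set(threats)))

# Hypothetical comparison of two therapy options for one patient.
patient_o = {"good surgical candidate", "values durability"}
patient_t = {"poor adherence to anticoagulation"}

surgery = i_swot_score({"values durability", "single intervention"},
                       {"perioperative risk"},
                       patient_o, patient_t)
medical = i_swot_score({"non-invasive"},
                       {"poor adherence to anticoagulation"},
                       patient_o, patient_t)
```

    With identical therapy profiles, two patients with different O/T sets can rank the options differently, which is exactly the point of the paper's two Marfan cases.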

  18. Origins of multicellular evolvability in snowflake yeast

    PubMed Central

    Ratcliff, William C.; Fankhauser, Johnathon D.; Rogers, David W.; Greig, Duncan; Travisano, Michael

    2015-01-01

    Complex life has arisen through a series of ‘major transitions’ in which collectives of formerly autonomous individuals evolve into a single, integrated organism. A key step in this process is the origin of higher-level evolvability, but little is known about how higher-level entities originate and gain the capacity to evolve as an individual. Here we report a single mutation that not only creates a new level of biological organization, but also potentiates higher-level evolvability. Disrupting the transcription factor ACE2 in Saccharomyces cerevisiae prevents mother–daughter cell separation, generating multicellular ‘snowflake’ yeast. Snowflake yeast develop through deterministic rules that produce geometrically defined clusters that preclude genetic conflict and display a high broad-sense heritability for multicellular traits; as a result they are preadapted to multicellular adaptation. This work demonstrates that simple microevolutionary changes can have profound macroevolutionary consequences, and suggests that the formation of clonally developing clusters may often be the first step to multicellularity. PMID:25600558

  19. Laser mass spectrometry for DNA fingerprinting for forensic applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.H.; Tang, K.; Taranenko, N.I.

    The application of DNA fingerprinting has become very broad in forensic analysis, patient identification, diagnostic medicine, and wildlife poaching, since every individual's DNA structure is identical within all tissues of their body. DNA fingerprinting was initiated by the use of restriction fragment length polymorphisms (RFLP). In 1987, Nakamura et al. found that a variable number of tandem repeats (VNTR) often occurred in the alleles. The probability of different individuals having the same number of tandem repeats in several different alleles is very low. Thus, the identification of VNTR from genomic DNA became a very reliable method for identification of individuals. DNA fingerprinting is a reliable tool for forensic analysis. In DNA fingerprinting, knowledge of the sequence of tandem repeats and restriction endonuclease sites can provide the basis for identification. The major steps for conventional DNA fingerprinting include (1) specimen processing, (2) amplification of selected DNA segments by PCR, and (3) gel electrophoresis to do the final DNA analysis. In this work we propose to use laser desorption mass spectrometry for fast DNA fingerprinting. The process and advantages are discussed.

  20. Those who hesitate lose: the relationship between assertiveness and response latency.

    PubMed

    Collins, L H; Powell, J L; Oliver, P V

    2000-06-01

    Individuals who are low in assertiveness may take longer to sort out, process, and state their own perceptions, attitudes and priorities, which puts them at a disadvantage in getting their needs met. The reason for this may not be inhibition in social situations or cognitive ability, but a lack of clarity regarding their own attitudes, opinions, preferences, goals, and priorities. 101 undergraduate students (57% women and 43% men) completed a demographics questionnaire, the Wonderlic Personnel Test, a self-monitoring scale, the Marlowe-Crowne Social Desirability Scale, the Rosenberg Self-esteem Scale, the College Self-expression Scale, and a test of the false-consensus effect. Response latencies to questions were measured. Individuals with higher scores on the Wonderlic Personnel Test answered items more quickly but, even when cognitive ability was controlled, individuals low in assertiveness still took significantly longer to respond to questions about themselves, their opinions, and their preferences. If individuals fall behind at this early step in the process of asserting themselves, then they may be more likely to miss opportunities to be assertive.

  1. [Active aging from the perspective of aged individuals who are functionally independent].

    PubMed

    Ferreira, Olivia Galvão Lucena; Maciel, Silvana Carneiro; Silva, Antonia Oliveira; dos Santos, Walberto Silva; Moreira, Maria Adelaide Silva P

    2010-12-01

    The objective of this study was to identify the social representations of the elderly regarding active aging. Semi-structured interviews were performed with 100 functionally independent aged individuals from João Pessoa, Paraiba, Brazil. The data was organized and analyzed using Alceste software. Results showed that the aged individuals' statements about active aging are permeated with positive contents. However, when aging is not associated with the word active, it is still represented as losses and disabilities. Despite the existence of losses during the process, active aging should be encouraged among the elderly, as it means living a quality, plentiful life. Maintaining the elderly functionally independent is the first step to achieving active aging and thus improving their quality of life.

  2. Personalized Pain Medicine: The Clinical Value of Psychophysical Assessment of Pain Modulation Profile

    PubMed Central

    Granovsky, Yelena; Yarnitsky, David

    2013-01-01

    Experimental pain stimuli can be used to simulate patients’ pain experience. We review recent developments in psychophysical pain testing, focusing on the application of the dynamic tests—conditioned pain modulation (CPM) and temporal summation (TS). Typically, patients with clinical pain of various types express either less efficient CPM or enhanced TS, or both. These tests can be used in prediction of incidence of acquiring pain and of its intensity, as well as in assisting the correct choice of analgesic agents for individual patients. This can help to shorten the commonly occurring long and frustrating process of adjusting analgesic agents to the individual patients. We propose that evaluating pain modulation can serve as a step forward in individualizing pain medicine. PMID:24228167

  3. Personalized pain medicine: the clinical value of psychophysical assessment of pain modulation profile.

    PubMed

    Granovsky, Yelena; Yarnitsky, David

    2013-01-01

    Experimental pain stimuli can be used to simulate patients' pain experience. We review recent developments in psychophysical pain testing, focusing on the application of the dynamic tests-conditioned pain modulation (CPM) and temporal summation (TS). Typically, patients with clinical pain of various types express either less efficient CPM or enhanced TS, or both. These tests can be used in prediction of incidence of acquiring pain and of its intensity, as well as in assisting the correct choice of analgesic agents for individual patients. This can help to shorten the commonly occurring long and frustrating process of adjusting analgesic agents to the individual patients. We propose that evaluating pain modulation can serve as a step forward in individualizing pain medicine.

  4. Quantifying the atomic-level mechanics of single long physisorbed molecular chains.

    PubMed

    Kawai, Shigeki; Koch, Matthias; Gnecco, Enrico; Sadeghi, Ali; Pawlak, Rémy; Glatzel, Thilo; Schwarz, Jutta; Goedecker, Stefan; Hecht, Stefan; Baratoff, Alexis; Grill, Leonhard; Meyer, Ernst

    2014-03-18

    Individual in situ polymerized fluorene chains 10-100 nm long linked by C-C bonds are pulled vertically from an Au(111) substrate by the tip of a low-temperature atomic force microscope. The conformation of the selected chains is imaged before and after manipulation using scanning tunneling microscopy. The measured force gradient shows strong and periodic variations that correspond to the step-by-step detachment of individual fluorene repeat units. These variations persist at constant intensity until the entire polymer is completely removed from the surface. Calculations based on an extended Frenkel-Kontorova model reproduce the periodicity and magnitude of these features and allow us to relate them to the detachment force and desorption energy of the repeat units. The adsorbed part of the polymer slides easily along the surface during the pulling process, leading to only small oscillations as a result of the high stiffness of the fluorenes and of their length mismatch with respect to the substrate surface structure. A significant lateral force also is caused by the sequential detachment of individual units. The gained insight into the molecule-surface interactions during sliding and pulling should aid the design of mechanoresponsive nanosystems and devices.

  5. The Influence of Task Complexity on Knee Joint Kinetics Following ACL Reconstruction

    PubMed Central

    Schroeder, Megan J.; Krishnan, Chandramouli; Dhaher, Yasin Y.

    2015-01-01

    Background Previous research indicates that subjects with anterior cruciate ligament reconstruction exhibit abnormal knee joint movement patterns during functional activities like walking. While the sagittal plane mechanics have been studied extensively, less is known about the secondary planes, specifically with regard to more demanding tasks. This study explored the influence of task complexity on functional joint mechanics in the context of graft-specific surgeries. Methods In 25 participants (10 hamstring tendon graft, 6 patellar tendon graft, 9 matched controls), three-dimensional joint torques were calculated using a standard inverse dynamics approach during level walking and stair descent. The stair descent task was separated into two functionally different sub-tasks—step-to-floor and step-to-step. The differences in external knee moment profiles were compared between groups; paired differences between the reconstructed and non-reconstructed knees were also assessed. Findings The reconstructed knees, irrespective of graft type, typically exhibited significantly lower peak knee flexion moments compared to control knees during stair descent, with the differences more pronounced in the step-to-step task. Frontal plane adduction torque deficits were graft-specific and limited to the hamstring tendon knees during the step-to-step task. Internal rotation torque deficits were also primarily limited to the hamstring tendon graft group during stair descent. Collectively, these results suggest that task complexity was a primary driver of differences in joint mechanics between anterior cruciate ligament reconstructed individuals and controls, and such differences were more pronounced in individuals with hamstring tendon grafts. 
Interpretation The mechanical environment experienced in the cartilage during repetitive, cyclical tasks such as walking and other activities of daily living has been argued to contribute to the development of degenerative changes to the joint and ultimately osteoarthritis. Given the task-specific and graft-specific differences in joint mechanics detected in this study, care should be taken during the rehabilitation process to mitigate these changes. PMID:26101055

  6. Decision Making Processes and Outcomes

    PubMed Central

    Hicks Patrick, Julie; Steele, Jenessa C.; Spencer, S. Melinda

    2013-01-01

    The primary aim of this study was to examine the contributions of individual characteristics and strategic processing to the prediction of decision quality. Data were provided by 176 adults, ages 18 to 93 years, who completed computerized decision-making vignettes and a battery of demographic and cognitive measures. We examined the relations among age, domain-specific experience, working memory, and three measures of strategic information search to the prediction of solution quality using a 4-step hierarchical linear regression analysis. Working memory and two measures of strategic processing uniquely contributed to the variance explained. Results are discussed in terms of potential advances to both theory and intervention efforts. PMID:24282638

  7. Semiotic individuation and Ernst Cassirer's challenge.

    PubMed

    Hoffmeyer, Jesper

    2015-12-01

    The concept of individuation has suffered from its being mostly connected with Jungian psychology or nominalist philosophy. In this paper, "individuation" will be understood rather as a process; and in particular, as a series of stages (morphological and/or cognitive) that an organism passes through during its lifespan. In most species, individuation is restricted to a short period in early life, as when birds acquire their species specific songs; while in humans - and a few other species of birds or mammals (although to a much lesser degree) - individuation is a life-long, open-ended process. In this understanding, individuation becomes narrowly connected to learning. And since learning necessarily depends on what is already learned, the trajectory of learning-based individuation is necessarily indefinite and dependent on the concrete chance events and steps whereby the process has proceeded. Semiotic individuation is a historical process, and this fact explains why systems biology, as established by Ludwig von Bertalanffy, has not been capable of meeting the hope, expressed long ago by Ernst Cassirer, of bridging the mechanicist-vitalist gap in biology. Instead, a semiotic approach is called for. Human individuation, moreover, is special in a very important sense: language use implies that humans from earliest childhood inescapably become entangled in an 'as-if-world', a virtual reality, a story about who we are and how our life 'here and now' belongs within our own life-history, as well as within the greater pattern of the world around us. Human individuation is thus a double-tracked process, consisting in an incessant reconciliation or negotiation between the virtual reality that we have constructed in our minds and mind-independent reality as it impresses itself upon our lives. Human life cannot therefore be defined by its uniqueness as a particular genetic combination, but must instead be defined by its uniqueness as a temporal outcome of semiotic individuation. Accordingly, this double-tracked character of human semiotic individuation implies that it is cast as just one particular outcome of a combinatorics with an infinite number of possible outcomes. It is suggested here that our ingrained feeling of possessing a free will is buried in this fact. Copyright © 2015. Published by Elsevier Ltd.

  8. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
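    The compile-and-summarize flow in the first claim can be sketched as follows. The field names and the two derived metrics are hypothetical; the patent does not specify them:

```python
def evaluate_process(steps):
    """Compile per-step data, then derive process metrics by summation:
    total cycle time and the fraction of it that is value-added."""
    total = sum(s["cycle_time"] for s in steps)
    value_added = sum(s["value_added_time"] for s in steps)
    return {"total_cycle_time": total,
            "value_added_ratio": value_added / total}

# Invented characterization parameters for two process steps (seconds).
metrics = evaluate_process([
    {"cycle_time": 30, "value_added_time": 10},   # machining
    {"cycle_time": 20, "value_added_time": 15},   # assembly
])
```

    Comparing such metrics computed under a batch configuration against the same metrics under a candidate lean configuration is the kind of evaluation data the second claimed method feeds into the transition decision.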

  9. Visual Recognition of the Elderly Concerning Risks of Falling or Stumbling Indoors in the Home

    PubMed Central

    Katsura, Toshiki; Miura, Norio; Hoshino, Akiko; Usui, Kanae; Takahashi, Yasuro; Hisamoto, Seiichi

    2011-01-01

Objective: The objective of this study was to examine how elderly individuals recognize dangers and obstacles within the home while walking, based on analyses of gaze point fixation. Materials and Methods: The rate of recognizing indoor dangers was compared among 30 elderly, 14 middle-aged and 11 young individuals using the Eye Mark Recorder. Results: 1) The elderly, middle-aged and young individuals all showed a high recognition rate of 100% or near 100% when ascending outdoor steps but a low rate of recognizing obstacles placed on the steps. They showed a recognition rate of about 60% when descending steps from residential premises to the street. The rate of recognizing middle steps was significantly lower in the elderly than in the young and middle-aged individuals. Regarding recognition indoors, all three groups showed a high recognition rate of nearly 100% when ascending stairs and a rate of 70%-90% when descending stairs; although the recognition rate in the elderly was lower than in the young and middle-aged individuals, the difference was not significant. 2) When moving indoors, all three groups showed a recognition rate of 70%-80%. The recognition rate was high for obstacles such as floors, televisions and chests of drawers but low for obstacles in the bathroom and steps on the path. The rate of recognizing the steps of doorsills dividing a Japanese-style room from the corridor, as well as obstacles in a Japanese-style room, was low, and in the elderly it was 40% or less. Conclusion: The rate of recognizing the steps of doorsills, as well as obstacles in a Japanese-style room, was lower in the elderly than in middle-aged or young individuals. PMID:25648876

  10. Nondestructive surface profiling of hidden MEMS using an infrared low-coherence interferometric microscope

    NASA Astrophysics Data System (ADS)

    Krauter, Johann; Osten, Wolfgang

    2018-03-01

There is a wide range of applications for micro-electro-mechanical systems (MEMS). The automotive and consumer market is the strongest driver for the growing MEMS industry. A 100% test of MEMS is particularly necessary since these devices are often used for safety-related purposes such as the ESP (Electronic Stability Program) system. The production of MEMS is a fully automated process, and the packaging and dicing steps generate 90% of the costs. Nowadays, an electrical test is carried out on each individual MEMS component before these steps. However, after encapsulation, MEMS are opaque to visible light, so further defects cannot be detected optically. Therefore, we apply an infrared low-coherence interferometer for the topography measurement of these hidden structures. A lock-in algorithm-based method is shown to calculate the object height and to reduce ghost steps due to the 2π ambiguity. Finally, measurements of different MEMS-based sensors are presented.
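    The abstract does not detail its lock-in algorithm; a closely related textbook approach is four-step phase-shifting interferometry, where the wrapped phase is recovered from four intensity samples and then unwrapped to remove 2π ghost steps. The infrared wavelength and the simulated surface below are assumptions for illustration:

```python
import numpy as np

def phase_from_four_steps(I0, I1, I2, I3):
    """Four-bucket phase-shifting: intensities at phase shifts 0, pi/2, pi, 3pi/2.
    With I_k = 1 + cos(phi + k*pi/2): I3 - I1 = 2 sin(phi), I0 - I2 = 2 cos(phi)."""
    return np.arctan2(I3 - I1, I0 - I2)

wavelength = 1.55e-6                      # assumed infrared wavelength (m)
true_height = np.linspace(0, 2e-6, 200)   # surface profile spanning > lambda/2
phi_true = 4 * np.pi * true_height / wavelength

# simulate the four detector frames
I = [1 + np.cos(phi_true + k * np.pi / 2) for k in range(4)]

phi = phase_from_four_steps(*I)                       # wrapped to (-pi, pi]
height = np.unwrap(phi) * wavelength / (4 * np.pi)    # remove 2*pi ghost steps
height -= height[0]                                   # reference to first point
```

In practice the unwrapping step is exactly where 2π ambiguities would otherwise appear as spurious λ/2 height jumps.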

  11. Modeling behavior dynamics using computational psychometrics within virtual worlds.

    PubMed

    Cipresso, Pietro

    2015-01-01

In case of fire in a building, how will people behave in the crowd? The behavior of each individual affects the behavior of others and, conversely, each one behaves considering both the crowd as a whole and the individual others. In this article, I propose a three-step method as a new way to study behavior dynamics. The first step relies on the creation of specific situations with standard techniques (such as mental imagery, text, video, and audio) and an advanced technique, Virtual Reality (VR), to manipulate experimental settings. The second step concerns the measurement of behavior in one, two, or many individuals, focusing on parameter extraction to provide information about the behavior dynamics. Finally, the third step uses the parameters collected and measured in the previous two steps to simulate possible scenarios with computational models, in order to forecast, understand, and explain behavior dynamics at the social level. An experimental study is also included to demonstrate the three-step method and a possible scenario.

  12. Training Rapid Stepping Responses in an Individual With Stroke

    PubMed Central

    Inness, Elizabeth L.; Komar, Janice; Biasin, Louis; Brunton, Karen; Lakhani, Bimal; McIlroy, William E.

    2011-01-01

    Background and Purpose Compensatory stepping reactions are important responses to prevent a fall following a postural perturbation. People with hemiparesis following a stroke show delayed initiation and execution of stepping reactions and often are found to be unable to initiate these steps with the more-affected limb. This case report describes a targeted training program involving repeated postural perturbations to improve control of compensatory stepping in an individual with stroke. Case Description Compensatory stepping reactions of a 68-year-old man were examined 52 days after left hemorrhagic stroke. He required assistance to prevent a fall in all trials administered during his initial examination because he showed weight-bearing asymmetry (with more weight borne on the more-affected right side), was unable to initiate stepping with the right leg (despite blocking of the left leg in some trials), and demonstrated delayed response times. The patient completed 6 perturbation training sessions (30–60 minutes per session) that aimed to improve preperturbation weight-bearing symmetry, to encourage stepping with the right limb, and to reduce step initiation and completion times. Outcomes Improved efficacy of compensatory stepping reactions with training and reduced reliance on assistance to prevent falling were observed. Improvements were noted in preperturbation asymmetry and step timing. Blocking the left foot was effective in encouraging stepping with the more-affected right foot. Discussion This case report demonstrates potential short-term adaptations in compensatory stepping reactions following perturbation training in an individual with stroke. Future work should investigate the links between improved compensatory step characteristics and fall risk in this vulnerable population. PMID:21511992

  13. Impact of modellers' decisions on hydrological a priori predictions

    NASA Astrophysics Data System (ADS)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2013-07-01

The purpose of this paper is to stimulate a re-thinking of how we, the catchment hydrologists, could become reliable forecasters. A group of catchment modellers predicted the hydrological response of a man-made 6 ha catchment in its initial phase (Chicken Creek) without having access to the observed records. They used conceptually different model families, and their modelling experience differed widely. The prediction exercise was organized in three steps: (1) for the 1st prediction, modellers received a basic data set describing the internal structure of the catchment (somewhat more complete than is usually available for a priori predictions in ungauged catchments). They did not obtain time series of stream flow, soil moisture or groundwater response. (2) Before the 2nd, improved prediction they inspected the catchment on-site and attended a workshop where the modellers presented and discussed their first attempts. (3) For their improved 3rd prediction they were offered additional data, charged pro forma with the costs of obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step 1. Here, we detail the modellers' decisions in accounting for the various processes based on what they learned during the field visit (step 2) and add the final outcome of step 3, when the modellers made use of the additional data. We document the prediction progress as well as the learning process resulting from the availability of added information. For the 2nd and 3rd steps, the progress in prediction quality could be evaluated in relation to individual modelling experience and the costs of added information.
We learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as adding data for improving parameters needed to satisfy model requirements.

  14. Grief and Group Recovery Following a Military Air Disaster

    DTIC Science & Technology

    1990-01-01

stages of human development, the unlabelled cells are meant to suggest individuals who either accelerate through phases more quickly than the norm...and far-reaching (Erikson, 1976; Titchener and Kapp, 1976). Such was apparently the case following the December 1988 crash of Pan Am flight 103 in...the normal processes of group recovery and reintegration after sudden, traumatic loss. The present study takes a necessary step in developing this

  15. Inertial Motion Capture Costume Design Study

    PubMed Central

    Szczęsna, Agnieszka; Skurowski, Przemysław; Lach, Ewa; Pruszowski, Przemysław; Pęszor, Damian; Paszkuta, Marcin; Słupik, Janusz; Lebek, Kamil; Janiak, Mateusz; Polański, Andrzej; Wojciechowski, Konrad

    2017-01-01

The paper describes a scalable, wearable multi-sensor system for motion capture based on inertial measurement units (IMUs). Each unit is composed of an accelerometer, a gyroscope and a magnetometer. The final quality of the captured motion depends on all the individual parts of the described system. The proposed system is a sequence of the following stages: sensor data acquisition, sensor orientation estimation, system calibration, pose estimation and data visualisation. Constructing the system’s architecture with the dataflow programming paradigm makes it easy to add, remove and replace data processing steps. The modular architecture of the system allows an effortless introduction of new sensor orientation estimation algorithms. The original contribution of the paper is the design study of the individual components used in the motion capture system. The two key steps of the system design are explored in this paper: the evaluation of sensors and of algorithms for orientation estimation. The three chosen algorithms were implemented and investigated as part of the experiment. Because the selection of the sensor has a significant impact on the final result, the sensor evaluation process is also explained and tested. The experimental results confirmed that the choice of sensor and orientation estimation algorithm affects the quality of the final results. PMID:28304337
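    The abstract does not name the three orientation estimation algorithms it compares. As a stand-in, a one-axis complementary filter illustrates the core idea behind IMU orientation estimation: fusing a smooth but drifting gyroscope integral with a noisy but drift-free accelerometer tilt. All numbers below are hypothetical:

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyro integration (smooth but drifts with bias) with the
    accelerometer tilt angle (noisy but drift-free). alpha weights the
    gyro path; (1 - alpha) slowly pulls the estimate back to the accel tilt."""
    angle = accel_angles[0]
    estimates = []
    for rate, tilt in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * tilt
        estimates.append(angle)
    return estimates

dt, n = 0.01, 1000
true_angle = 0.5                 # rad; sensor held still
gyro = [0.01] * n                # zero true rate plus a constant bias (rad/s)
accel = [true_angle] * n         # tilt recovered from the gravity vector
est = complementary_filter(gyro, accel, dt)

# pure gyro integration accumulates bias * time = 0.1 rad of drift
raw_integration = true_angle + sum(g * dt for g in gyro)
```

More capable estimators (e.g. Madgwick- or Kalman-style filters) follow the same fusion principle with full 3-D quaternions and the magnetometer for heading.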

  16. Kinematic, muscular, and metabolic responses during exoskeletal-, elliptical-, or therapist-assisted stepping in people with incomplete spinal cord injury.

    PubMed

    Hornby, T George; Kinnaird, Catherine R; Holleran, Carey L; Rafferty, Miriam R; Rodriguez, Kelly S; Cain, Julie B

    2012-10-01

Robotic-assisted locomotor training has demonstrated some efficacy in individuals with neurological injury and is slowly gaining clinical acceptance. Both exoskeletal devices, which control individual joint movements, and elliptical devices, which control endpoint trajectories, have been utilized with specific patient populations and are available commercially. No studies have directly compared training efficacy or patient performance during stepping between devices. The purpose of this study was to evaluate kinematic, electromyographic (EMG), and metabolic responses during elliptical- and exoskeletal-assisted stepping in individuals with incomplete spinal cord injury (SCI) compared with therapist-assisted stepping. Design: A prospective, cross-sectional, repeated-measures design was used. Participants with incomplete SCI (n=11) performed 3 separate bouts of exoskeletal-, elliptical-, or therapist-assisted stepping. Unilateral hip and knee sagittal-plane kinematics, lower-limb EMG recordings, and oxygen consumption were compared across stepping conditions and with control participants (n=10) during treadmill stepping. Exoskeletal stepping kinematics closely approximated normal gait patterns, whereas significantly greater hip and knee flexion postures were observed during elliptical-assisted stepping. Measures of kinematic variability indicated consistent patterns in control participants and during exoskeletal-assisted stepping, whereas therapist- and elliptical-assisted stepping kinematics were more variable. Despite specific differences, EMG patterns generally were similar across stepping conditions in the participants with SCI. In contrast, oxygen consumption was consistently greater during therapist-assisted stepping. Limitations: Limitations included a small sample size, lack of ability to evaluate kinetics during stepping, unilateral EMG recordings, and sagittal-plane kinematics only.
Despite specific differences in kinematics and EMG activity, metabolic activity was similar during stepping in each robotic device. Understanding potential differences and similarities in stepping performance with robotic assistance may be important in delivery of repeated locomotor training using robotic or therapist assistance and for consumers of robotic devices.

  17. Individualized Child-Focused Curriculum: A Differentiated Approach

    ERIC Educational Resources Information Center

    Gronlund, Gaye

    2016-01-01

How do you focus on each individual child in a full classroom? Learn to integrate individualized curriculum into daily practice with this step-by-step guide. Even good observers and documenters do not always use these insights to inform their curriculum planning. Using Developmental Studies, a new tool created and successfully field-tested by…

  18. Variability of Anticipatory Postural Adjustments During Gait Initiation in Individuals With Parkinson Disease.

    PubMed

    Lin, Cheng-Chieh; Creath, Robert A; Rogers, Mark W

    2016-01-01

In people with Parkinson disease (PD), difficulties with initiating stepping may be related to impairments of anticipatory postural adjustments (APAs). Increased variability in step length and step time has been observed in gait initiation in individuals with PD. In this study, we investigated whether the ability to generate consistent APAs during gait initiation is compromised in these individuals. Fifteen subjects with PD and 8 healthy control subjects were instructed to take rapid forward steps after a verbal cue. The changes in vertical force and ankle marker position were recorded via force platforms and a 3-dimensional motion capture system, respectively. Means, standard deviations, and coefficients of variation of both the timing and the magnitude of vertical force, as well as stepping variables, were calculated. During the postural phase of gait initiation, the interval was longer and the force modulation was smaller in subjects with PD. The variability of both timing and force modulation was larger in subjects with PD. Individuals with PD also took longer to complete the first step, but no significant differences were found between groups for the variability of step time, length, and speed. The increased variability of APAs during gait initiation in subjects with PD could affect posture-locomotion coupling and lead to start hesitation and even falls. Future studies are needed to investigate the effect of rehabilitation interventions on the variability of APAs during gait initiation in individuals with PD. Video abstract available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A119).
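    The variability measures used in such studies (standard deviations and coefficients of variation of APA timing and magnitude) reduce to a simple computation. The per-trial values below are hypothetical, not from the study:

```python
from statistics import mean, stdev

def coefficient_of_variation(samples):
    """CV = SD / mean: a scale-free variability index, which lets timing
    variability (seconds) and force-magnitude variability (newtons) be
    compared on the same footing."""
    return stdev(samples) / mean(samples)

# hypothetical APA durations for one subject across five stepping trials (s)
apa_duration_s = [0.42, 0.47, 0.40, 0.51, 0.45]
cv = coefficient_of_variation(apa_duration_s)
```

A between-group comparison would then contrast the distribution of such CVs in the PD and control groups.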

  19. Consequences of atomic layer etching on wafer scale uniformity in inductively coupled plasmas

    NASA Astrophysics Data System (ADS)

    Huard, Chad M.; Lanham, Steven J.; Kushner, Mark J.

    2018-04-01

Atomic layer etching (ALE) typically divides the etching process into two self-limited reactions. One reaction passivates a single layer of material while the second preferentially removes the passivated layer. As such, under ideal conditions the wafer scale uniformity of ALE should be independent of the uniformity of the reactant fluxes onto the wafers, provided all surface reactions are saturated. The passivation and etch steps should individually saturate asymptotically after a characteristic fluence of reactants has been delivered to each site. In this paper, results from a computational investigation are discussed regarding the uniformity of ALE of Si in Cl2 containing inductively coupled plasmas when the reactant fluxes are both non-uniform and non-ideal. In the parameter space investigated for inductively coupled plasmas, the local etch rate for continuous processing was proportional to the ion flux. When operated with saturated conditions (that is, both ALE steps are allowed to self-terminate), the ALE process is less sensitive to non-uniformities in the incoming ion flux than continuous etching. Operating ALE in a sub-saturation regime resulted in less uniform etching. It was also found that ALE processing with saturated steps requires a larger total ion fluence than continuous etching to achieve the same etch depth. This condition may result in increased resist erosion and/or damage to stopping layers using ALE. While these results demonstrate that ALE provides increased etch depth uniformity, they do not show an improved critical dimension uniformity in all cases. These possible limitations of ALE processing, together with the increased processing time, must be weighed in process optimization against the benefits of atomic resolution and improved uniformity.
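    The self-limiting behaviour described here can be captured by a toy first-order saturation model, where each step's surface coverage approaches 1 as coverage = 1 - exp(-fluence/Φ0). The fluxes, step times, and the single characteristic dose Φ0 below are illustrative assumptions, not the paper's plasma model:

```python
import math

def coverage(fluence, phi0):
    """Self-limited surface coverage after a delivered dose; saturates toward 1."""
    return 1.0 - math.exp(-fluence / phi0)

def etch_per_cycle(pass_flux, etch_flux, t_pass, t_etch, phi0=1.0, epc_max=1.0):
    """Etch depth per ALE cycle: the product of the completeness of the
    passivation and etch steps, each set by its delivered fluence = flux * time."""
    return epc_max * coverage(pass_flux * t_pass, phi0) * coverage(etch_flux * t_etch, phi0)

# assume 20% flux non-uniformity between wafer centre and edge
center = etch_per_cycle(1.0, 1.0, t_pass=8, t_etch=8)    # saturated steps
edge   = etch_per_cycle(0.8, 0.8, t_pass=8, t_etch=8)
saturated_spread = abs(center - edge) / center

center_sub = etch_per_cycle(1.0, 1.0, t_pass=0.5, t_etch=0.5)  # sub-saturated
edge_sub   = etch_per_cycle(0.8, 0.8, t_pass=0.5, t_etch=0.5)
sub_spread = abs(center_sub - edge_sub) / center_sub
```

Even this crude model reproduces the qualitative finding: saturated steps suppress flux non-uniformity, while sub-saturated operation inherits it almost one-for-one.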

  20. Evolution of resource cycling in ecosystems and individuals.

    PubMed

    Crombach, Anton; Hogeweg, Paulien

    2009-06-01

Resource cycling is a defining process in the maintenance of the biosphere. Microbial communities, ranging from simple to highly diverse, play a crucial role in this process. Yet the evolutionary adaptation and speciation of micro-organisms have rarely been studied in the context of resource cycling. In this study, our basic questions are how a community evolves its resource usage and how resource cycles are partitioned. We design a computational model in which a population of individuals evolves to take up nutrients and excrete waste. The waste of one individual is another's resource. Given a fixed amount of resources, this leads to resource cycles. We find that the shortest cycle dominates the ecological dynamics, and over evolutionary time its length is minimized. Initially a single lineage processes a long cycle of resources; later, crossfeeding lineages arise. The evolutionary dynamics that follow are determined by the strength of indirect selection for resource cycling. We study indirect selection by changing the spatial setting and the strength of direct selection. If individuals are fixed at lattice sites or direct selection is low, indirect selection results in lineages that structure their local environment, leading to 'smart' individuals and stable patterns of resource dynamics. The individuals are good at cycling resources themselves and do this with a short cycle. On the other hand, if individuals randomly change position at each time step, or direct selection is high, individuals are more prone to crossfeeding: an ecosystem-based solution with turbulent resource dynamics, and individuals that are less capable of cycling resources themselves. In a baseline model of ecosystem evolution we demonstrate different eco-evolutionary trajectories of resource cycling.
By varying the strength of indirect selection through the spatial setting and direct selection, the integration of information by the evolutionary process leads to qualitatively different outcomes, ranging from individual smartness to cooperative community structures.

  1. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    PubMed

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after stabilizing the process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. Some measures can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
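    Evaluating times "as individual processes" typically means an individuals (I-MR) control chart, whose 3-sigma limits are estimated from the average moving range. This is a generic sketch of that chart, not the authors' exact procedure, and the operative times are made-up numbers:

```python
def individuals_chart_limits(x):
    """Individuals (I-MR) chart limits: sigma is estimated as MRbar / 1.128,
    so the 3-sigma limits are xbar +/- 2.66 * MRbar."""
    xbar = sum(x) / len(x)
    mrbar = sum(abs(a - b) for a, b in zip(x[1:], x)) / (len(x) - 1)
    return xbar - 2.66 * mrbar, xbar, xbar + 2.66 * mrbar

# hypothetical normalized operative times (min) for one surgeon
times = [40, 42, 41, 43, 39, 44, 41, 40, 42, 43]
lcl, cl, ucl = individuals_chart_limits(times)

# points outside the limits would be excluded to stabilize the process
out_of_control = [t for t in times if not lcl <= t <= ucl]
```

After excluding out-of-control points and recomputing the limits, stabilized charts for different surgeons can be compared to identify a benchmark.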

  2. Mechanistic evaluation of the pros and cons of digital RT-LAMP for HIV-1 viral load quantification on a microfluidic device and improved efficiency via a two-step digital protocol.

    PubMed

    Sun, Bing; Shen, Feng; McCalla, Stephanie E; Kreutz, Jason E; Karymov, Mikhail A; Ismagilov, Rustem F

    2013-02-05

    Here we used a SlipChip microfluidic device to evaluate the performance of digital reverse transcription-loop-mediated isothermal amplification (dRT-LAMP) for quantification of HIV viral RNA. Tests are needed for monitoring HIV viral load to control the emergence of drug resistance and to diagnose acute HIV infections. In resource-limited settings, in vitro measurement of HIV viral load in a simple format is especially needed, and single-molecule counting using a digital format could provide a potential solution. We showed here that when one-step dRT-LAMP is used for quantification of HIV RNA, the digital count is lower than expected and is limited by the yield of desired cDNA. We were able to overcome the limitations by developing a microfluidic protocol to manipulate many single molecules in parallel through a two-step digital process. In the first step we compartmentalize the individual RNA molecules (based on Poisson statistics) and perform reverse transcription on each RNA molecule independently to produce DNA. In the second step, we perform the LAMP amplification on all individual DNA molecules in parallel. Using this new protocol, we increased the absolute efficiency (the ratio between the concentration calculated from the actual count and the expected concentration) of dRT-LAMP 10-fold, from ∼2% to ∼23%, by (i) using a more efficient reverse transcriptase, (ii) introducing RNase H to break up the DNA:RNA hybrid, and (iii) adding only the BIP primer during the RT step. We also used this two-step method to quantify HIV RNA purified from four patient samples and found that in some cases, the quantification results were highly sensitive to the sequence of the patient's HIV RNA. We learned the following three lessons from this work: (i) digital amplification technologies, including dLAMP and dPCR, may give adequate dilution curves and yet have low efficiency, thereby providing quantification values that underestimate the true concentration. 
Careful validation is essential before a method is considered to provide absolute quantification; (ii) the sensitivity of dLAMP to the sequence of the target nucleic acid necessitates additional validation with patient samples carrying the full spectrum of mutations; and (iii) for multistep digital amplification chemistries, such as a combination of reverse transcription with amplification, microfluidic devices may be used to decouple these steps from one another and to perform them under different, individually optimized conditions for improved efficiency.

  3. Potential radiological impact of tornadoes on the safety of Nuclear Fuel Services' West Valley Fuel Reprocessing Plant. Volume I. Tornado effects on head-end cell airflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holloway, L.J.; Andrae, R.W.

    1981-09-01

This report describes results of a parametric study of the impacts of a tornado-generated depressurization on airflow in the contaminated process cells within the presently inoperative Nuclear Fuel Services fuel reprocessing facility near West Valley, NY. The study involved the following tasks: (1) mathematical modeling of installed ventilation and abnormal exhaust pathways from the cells and prediction of tornado-induced airflows in these pathways; (2) mathematical modeling of individual cell flow characteristics and prediction of in-cell velocities induced by flows from step 1; and (3) evaluation of the results of steps 1 and 2 to determine whether any of the pathways investigated have the potential for releasing quantities of radioactively contaminated air from the main process cells. The study has concluded that in the event of a tornado strike, certain pathways from the cells have the potential to release radioactive materials to the atmosphere. Determination of the quantities of radioactive material released from the cells through pathways identified in step 3 is presented in Part II of this report.

  4. QuickRNASeq lifts large-scale RNA-seq data analyses to the next level of automation and interactive visualization.

    PubMed

    Zhao, Shanrong; Xi, Li; Quan, Jie; Xi, Hualin; Zhang, Ying; von Schack, David; Vincent, Michael; Zhang, Baohong

    2016-01-08

RNA sequencing (RNA-seq), a next-generation sequencing technique for transcriptome profiling, is being increasingly used, in part driven by the decreasing cost of sequencing. Nevertheless, the analysis of the massive amounts of data generated by large-scale RNA-seq remains a challenge. Multiple algorithms pertinent to basic analyses have been developed, and there is an increasing need to automate the use of these tools so as to obtain results in an efficient and user-friendly manner. Increased automation and improved visualization of the results will help make the results and findings of the analyses readily available to experimental scientists. By combining the best open source tools developed for RNA-seq data analyses and the most advanced web 2.0 technologies, we have implemented QuickRNASeq, a pipeline for large-scale RNA-seq data analyses and visualization. The QuickRNASeq workflow consists of three main steps. In Step #1, each individual sample is processed, including mapping RNA-seq reads to a reference genome, counting the numbers of mapped reads, quality control of the aligned reads, and SNP (single nucleotide polymorphism) calling. Step #1 is computationally intensive and can be processed in parallel. In Step #2, the results from individual samples are merged, and an integrated and interactive project report is generated. All analysis results in the report are accessible via a single HTML entry webpage. Step #3 is the data interpretation and presentation step. The rich visualization features implemented here allow end users to interactively explore the results of RNA-seq data analyses and to gain more insights into RNA-seq datasets. In addition, we used a real world dataset to demonstrate the simplicity and efficiency of QuickRNASeq in RNA-seq data analyses and interactive visualizations. The seamless integration of automated capabilities with interactive visualizations in QuickRNASeq is not available in other published RNA-seq pipelines.
The high degree of automation and interactivity in QuickRNASeq leads to a substantial reduction in the time and effort required prior to further downstream analyses and interpretation of the analyses findings. QuickRNASeq advances primary RNA-seq data analyses to the next level of automation, and is mature for public release and adoption.
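    The three-step structure (independent per-sample processing in parallel, then a merge into one project-level matrix) is a generic map-reduce pattern. This sketch substitutes toy read counting for real alignment and SNP calling; the sample data and gene names are invented:

```python
from concurrent.futures import ThreadPoolExecutor

def process_sample(sample):
    """Step 1 (per sample, embarrassingly parallel): in the real pipeline,
    alignment, read counting, QC and SNP calling. Here: mock gene counting."""
    name, reads = sample
    counts = {}
    for gene in reads:
        counts[gene] = counts.get(gene, 0) + 1
    return name, counts

def merge(results, genes):
    """Step 2: merge per-sample results into one genes-by-samples count matrix,
    the input to the project-level report."""
    return {g: {name: c.get(g, 0) for name, c in results} for g in genes}

samples = [("s1", ["TP53", "TP53", "BRCA1"]), ("s2", ["BRCA1"])]
with ThreadPoolExecutor() as pool:
    results = list(pool.map(process_sample, samples))   # Step 1 in parallel
matrix = merge(results, genes=["TP53", "BRCA1"])        # Step 2
```

Step 3, interactive visualization, would then render `matrix` into the HTML report; the key design point is that Step 1 has no cross-sample dependencies, so it scales with the number of workers.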

  5. Vocal acoustic analysis as a biometric indicator of information processing: implications for neurological and psychiatric disorders.

    PubMed

    Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C

    2015-03-30

Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, vocal expression is a potentially important biometric index of information processing, not only across individuals but within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual-attention experimental task in which participants provided natural speech while simultaneously engaged in a baseline, medium or high nonverbal processing-load task. Objective, automated, and computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and provide important information for the development of an automated, inexpensive and noninvasive biometric measure of information processing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Case-based medical informatics

    PubMed Central

    Pantazi, Stefan V; Arocha, José F; Moehr, Jochen R

    2004-01-01

Background: The "applied" nature distinguishes applied sciences from theoretical sciences. To emphasize this distinction, we begin with a general, meta-level overview of the scientific endeavor. We introduce the knowledge spectrum and four interconnected modalities of knowledge. In addition to the traditional differentiation between implicit and explicit knowledge, we outline the concepts of general and individual knowledge. We connect general knowledge with the "frame problem," a fundamental issue of artificial intelligence, and individual knowledge with another important paradigm of artificial intelligence, case-based reasoning, a method of individual knowledge processing that aims at solving new problems based on the solutions to similar past problems. We outline the fundamental differences between Medical Informatics and theoretical sciences and propose that Medical Informatics research should advance individual knowledge processing (case-based reasoning), and that natural language processing research is an important step towards this goal, one that may have ethical implications for patient-centered medicine. Discussion: We focus on fundamental aspects of decision-making, which connect human expertise with individual knowledge processing. We continue with a knowledge spectrum perspective on biomedical knowledge and conclude that case-based reasoning is the paradigm that can advance towards personalized healthcare and that can enable the education of patients and providers. We center the discussion of formal methods of knowledge representation around the frame problem. We propose a context-dependent view of the notion of "meaning" and advocate the need for case-based reasoning and natural language processing research.
In the context of memory-based knowledge processing, pattern recognition, comparison and analogy-making, we conclude that while humans seem to naturally support the case-based reasoning paradigm (memory of past experiences of problem-solving and powerful case matching mechanisms), technical solutions are challenging. Finally, we discuss the major challenges for a technical solution: case record comprehensiveness, organization of information on similarity principles, development of pattern recognition, and resolution of the ethical issues involved. Summary Medical Informatics is an applied science that should be committed to advancing patient-centered medicine through individual knowledge processing. Case-based reasoning is the technical solution that enables continuous individual knowledge processing and could be applied provided that the challenges and ethical issues that arise are addressed appropriately. PMID:15533257
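
    The case-based reasoning cycle described above (retrieve the most similar past case, reuse its solution) can be sketched minimally. The feature encoding, the overlap similarity, and the cases below are illustrative assumptions, not from the paper:

```python
# Minimal case-based reasoning sketch: retrieve the most similar past case
# and reuse its solution. Features, cases, and the similarity measure are
# hypothetical illustrations.

def similarity(a, b):
    # overlap similarity over feature dicts: fraction of agreeing shared keys
    shared = set(a) & set(b)
    return sum(1 for k in shared if a[k] == b[k]) / max(len(a), len(b))

def retrieve(case_base, new_case):
    # return the stored case whose features best match the new problem
    return max(case_base, key=lambda c: similarity(c["features"], new_case))

case_base = [
    {"features": {"fever": True, "cough": True}, "solution": "protocol A"},
    {"features": {"fever": False, "rash": True}, "solution": "protocol B"},
]
best = retrieve(case_base, {"fever": True, "cough": False})
```

    A real system would add the revise/retain steps of the cycle, updating the case base with each solved problem.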

  7. Image analysis and mathematical modelling for the supervision of the dough fermentation process

    NASA Astrophysics Data System (ADS)

    Zettel, Viktoria; Paquet-Durand, Olivier; Hecker, Florian; Hitzmann, Bernd

    2016-10-01

    The fermentation (proof) process of dough is one of the quality-determining steps in the production of baked goods. Besides the fluffiness, whose foundations are laid during fermentation, the flavour of the final product is strongly influenced during this production stage. However, until now no on-line measurement system has been available that can supervise this important process step. In this investigation the potential of an image analysis system is evaluated that enables the determination of the volume of fermenting dough pieces. The camera moves around the fermenting pieces and collects images of the objects from different angles (360° range). Using image analysis algorithms, the volume increase of individual dough pieces is determined. Based on a detailed mathematical description of the volume increase, which is based on the Bernoulli equation, the carbon dioxide production rate of the yeast cells and the diffusion of carbon dioxide, the fermentation process is supervised. Important process parameters, like the carbon dioxide production rate of the yeast cells and the dough viscosity, can be estimated after just 300 s of proofing. The mean percentage error for forecasting the further evolution of the relative volume of the dough pieces is just 2.3%. Therefore, a forecast of the further evolution can be performed and used for fault detection.
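
    The forecasting idea (fit a volume model to the first 300 s of proofing, then predict the further evolution) can be illustrated with a toy polynomial stand-in; the paper's actual model, based on the Bernoulli equation and carbon dioxide transport, is not reproduced here:

```python
# Toy stand-in for the volume forecast: least-squares fit of
# V(t) = 1 + a*t + b*t**2 to early relative-volume samples, then
# extrapolation. The model form and the synthetic data are assumptions.

def fit_quadratic(ts, vs):
    """Fit V(t) = 1 + a*t + b*t**2 by the 2x2 normal equations."""
    s_tt = sum(t * t for t in ts)
    s_t3 = sum(t ** 3 for t in ts)
    s_t4 = sum(t ** 4 for t in ts)
    r = [v - 1.0 for v in vs]                     # residual above V(0) = 1
    s_rt = sum(ri * t for ri, t in zip(r, ts))
    s_rt2 = sum(ri * t * t for ri, t in zip(r, ts))
    det = s_tt * s_t4 - s_t3 * s_t3
    a = (s_rt * s_t4 - s_rt2 * s_t3) / det
    b = (s_tt * s_rt2 - s_t3 * s_rt) / det
    return a, b

ts = list(range(0, 301, 60))                      # first 300 s of proofing
vs = [1 + 0.004 * t + 1e-6 * t * t for t in ts]   # synthetic relative volume
a, b = fit_quadratic(ts, vs)
forecast = 1 + a * 600 + b * 600 ** 2             # relative volume at t = 600 s
```
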

  8. Coulomb explosion: a novel approach to separate single-walled carbon nanotubes from their bundle.

    PubMed

    Liu, Guangtong; Zhao, Yuanchun; Zheng, Kaihong; Liu, Zheng; Ma, Wenjun; Ren, Yan; Xie, Sishen; Sun, Lianfeng

    2009-01-01

    A novel approach based on Coulomb explosion has been developed to separate single-walled carbon nanotubes (SWNTs) from their bundle. With this technique, we can readily separate a bundle of SWNTs into smaller bundles with uniform diameter as well as some individual SWNTs. The separated SWNTs have a typical length of several microns and form a nanotree at one end of the original bundle. More importantly, this separation procedure involves no surfactant and only a one-step physical process. The method greatly simplifies the subsequent fabrication of individual-SWNT or multiterminal SWNT devices and studies of their physical properties.

  9. Acceleration of linear stationary iterative processes in multiprocessor computers. II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romm, Ya.E.

    1982-05-01

    For pt. I, see Kibernetika, vol. 18, no. 1, p. 47 (1982); English translation in Cybernetics, vol. 18, no. 1, p. 54 (1982). Considers a reduced system of linear algebraic equations x = Ax + b, where A = (a_ij) is a real n×n matrix and b is a real n-vector. Existence and uniqueness of the solution are assumed, i.e., det(E − A) ≠ 0, where E is the identity matrix. The linear iterative process converging to x is x^(k+1) = f(x^(k)), k = 0, 1, 2, ..., where the operator f maps R^n into R^n. In considering implementation of the iterative process (IP) in a multiprocessor system, it is assumed that the number of processors is constant (various values are investigated), that the processors perform elementary binary arithmetic operations of addition and multiplication, and that the estimates include only the time of execution of arithmetic operations. With any parallelization of an individual iteration, the execution time of the IP is proportional to the number of sequential steps k+1. The author sets the task of reducing the number of sequential steps in the IP so as to execute it in a time proportional to a value smaller than k+1. He also sets the goal of formulating a method of accelerated bit serial-parallel execution of each successive step of the IP with, in the modification sought, a reduced number of steps executed in a time comparable to the switching time of logic elements. 6 references.
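
    The underlying iterative process x^(k+1) = Ax^(k) + b can be sketched in plain Python; the convergence condition (spectral radius of A below 1) and the example matrix are standard illustrations, not part of the paper's acceleration scheme:

```python
# Sketch of the linear stationary iterative process x^(k+1) = A x^(k) + b,
# which converges when the spectral radius of A is below 1. The paper's
# contribution (parallel/bit-serial acceleration) is not reproduced here.

def iterate(A, b, x0, steps):
    x = x0[:]
    n = len(b)
    for _ in range(steps):
        x = [sum(A[i][j] * x[j] for j in range(n)) + b[i] for i in range(n)]
    return x

A = [[0.5, 0.1],
     [0.2, 0.3]]      # contraction: row sums < 1, so the iteration converges
b = [1.0, 1.0]
x = iterate(A, b, [0.0, 0.0], steps=100)
# the fixed point satisfies x = A x + b
residual = max(abs(sum(A[i][j] * x[j] for j in range(2)) + b[i] - x[i])
               for i in range(2))
```

    Sequential execution takes time proportional to the number of steps, which is exactly the cost the paper seeks to reduce.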

  10. One Small Step for a Man: Estimation of Gender, Age and Height from Recordings of One Step by a Single Inertial Sensor

    PubMed Central

    Riaz, Qaiser; Vögele, Anna; Krüger, Björn; Weber, Andreas

    2015-01-01

    A number of previous works have shown that information about a subject is encoded in sparse kinematic information, such as the one revealed by so-called point light walkers. With the work at hand, we extend these results to classifications of soft biometrics from inertial sensor recordings at a single body location from a single step. We recorded accelerations and angular velocities of 26 subjects using inertial measurement units (IMUs) attached at four locations (chest, lower back, right wrist and left ankle) when performing standardized gait tasks. The collected data were segmented into individual walking steps. We trained random forest classifiers in order to estimate soft biometrics (gender, age and height). We applied two different validation methods to the process, 10-fold cross-validation and subject-wise cross-validation. For all three classification tasks, we achieve high accuracy values for all four sensor locations. From these results, we can conclude that the data of a single walking step (6D: accelerations and angular velocities) allow for a robust estimation of the gender, height and age of a person. PMID:26703601
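
    The subject-wise cross-validation used above (all steps of one subject held out together, so no subject appears in both training and test data) can be sketched as a splitting procedure; the subject labels are hypothetical:

```python
# Sketch of subject-wise cross-validation: all steps from one subject are
# held out together, so the classifier is never tested on a subject it has
# seen in training. Pure-Python illustration of the splitting logic only.

def subject_wise_folds(subject_ids):
    """Yield (train_idx, test_idx) pairs, one fold per distinct subject."""
    subjects = sorted(set(subject_ids))
    for s in subjects:
        test = [i for i, sid in enumerate(subject_ids) if sid == s]
        train = [i for i, sid in enumerate(subject_ids) if sid != s]
        yield train, test

# each recorded step is labelled with the subject it came from
ids = ["s1", "s1", "s2", "s2", "s3"]
folds = list(subject_wise_folds(ids))
```

    Plain 10-fold cross-validation, by contrast, may place steps of the same subject in both partitions, which typically inflates accuracy estimates.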

  11. Towards numerical prediction of cavitation erosion.

    PubMed

    Fivel, Marc; Franc, Jean-Pierre; Chandra Roy, Samir

    2015-10-06

    This paper is intended to provide a potential basis for a numerical prediction of cavitation erosion damage. The proposed method can be divided into two steps. The first step consists in determining the loading conditions due to cavitation bubble collapses. It is shown that individual pits observed on highly polished metallic samples exposed to cavitation for a relatively small time can be considered as the signature of bubble collapse. By combining pitting tests with an inverse finite-element modelling (FEM) of the material response to a representative impact load, loading conditions can be derived for each individual bubble collapse in terms of stress amplitude (in gigapascals) and radial extent (in micrometres). This step requires characterizing as accurately as possible the properties of the material exposed to cavitation. This characterization should include the effect of strain rate, which is known to be high in cavitation erosion (typically of the order of several thousand s(-1)). Nanoindentation techniques as well as compressive tests at high strain rate using, for example, a split Hopkinson pressure bar test system may be used. The second step consists in developing an FEM approach to simulate the material response to the repetitive impact loads determined in step 1. This includes a detailed analysis of the hardening process (isotropic versus kinematic) in order to properly account for fatigue as well as the development of a suitable model of material damage and failure to account for mass loss. Although the whole method is not yet fully operational, promising results are presented that show that such a numerical method might be, in the long term, an alternative to correlative techniques used so far for cavitation erosion prediction.

  12. Towards numerical prediction of cavitation erosion

    PubMed Central

    Fivel, Marc; Franc, Jean-Pierre; Chandra Roy, Samir

    2015-01-01

    This paper is intended to provide a potential basis for a numerical prediction of cavitation erosion damage. The proposed method can be divided into two steps. The first step consists in determining the loading conditions due to cavitation bubble collapses. It is shown that individual pits observed on highly polished metallic samples exposed to cavitation for a relatively small time can be considered as the signature of bubble collapse. By combining pitting tests with an inverse finite-element modelling (FEM) of the material response to a representative impact load, loading conditions can be derived for each individual bubble collapse in terms of stress amplitude (in gigapascals) and radial extent (in micrometres). This step requires characterizing as accurately as possible the properties of the material exposed to cavitation. This characterization should include the effect of strain rate, which is known to be high in cavitation erosion (typically of the order of several thousand s−1). Nanoindentation techniques as well as compressive tests at high strain rate using, for example, a split Hopkinson pressure bar test system may be used. The second step consists in developing an FEM approach to simulate the material response to the repetitive impact loads determined in step 1. This includes a detailed analysis of the hardening process (isotropic versus kinematic) in order to properly account for fatigue as well as the development of a suitable model of material damage and failure to account for mass loss. Although the whole method is not yet fully operational, promising results are presented that show that such a numerical method might be, in the long term, an alternative to correlative techniques used so far for cavitation erosion prediction. PMID:26442139

  13. Reactive Balance in Individuals With Chronic Stroke: Biomechanical Factors Related to Perturbation-Induced Backward Falling.

    PubMed

    Salot, Pooja; Patel, Prakruti; Bhatt, Tanvi

    2016-03-01

    An effective compensatory stepping response is the first line of defense for preventing a fall during sudden large external perturbations. The biomechanical factors that contribute to heightened fall risk in survivors of stroke, however, are not clearly understood. It is known that underlying sensorimotor and balance deficits poststroke predispose these individuals to a risk of falls during sudden external perturbations. The purpose of this study was to examine the mechanism of fall risk in survivors of chronic stroke when exposed to sudden, slip-like forward perturbations in stance. This was a cross-sectional study. Fourteen individuals with stroke, 14 age-matched controls (AC group), and 14 young controls (YC group) were exposed to large-magnitude forward stance perturbations. Postural stability was computed as center of mass (COM) position (XCOM/BOS) and velocity (ẊCOM/BOS) relative to the base of support (BOS) at first step lift-off (LO) and touch-down (TD) and at second step TD. Limb support was quantified as vertical hip descent (Zhip) from baseline after perturbation onset. All participants showed a backward balance loss, with 71% of the stroke group experiencing a fall compared with no falls in the control groups (AC and YC groups). At first step LO, no between-group differences in XCOM/BOS and ẊCOM/BOS were noted. At first step TD, however, the stroke group had a significantly posterior XCOM/BOS and backward ẊCOM/BOS compared with the control groups. At second step TD, individuals with stroke were still more unstable (more posterior XCOM/BOS and backward ẊCOM/BOS) compared with the AC group. Individuals with stroke also showed greater peak Zhip compared with the control groups. Furthermore, the stroke group took a larger number of steps with shorter step length and delayed step initiation compared with the control groups.
Although the study highlights the reactive balance deficits increasing fall risk in survivors of stroke compared with healthy adults, the study was restricted to individuals with chronic stroke only. It is likely that comparing compensatory stepping responses across different stages of recovery would enable clinicians to identify reactive balance deficits related to a specific stage of recovery. These findings suggest the inability of the survivors of stroke to regain postural stability with one or more compensatory steps, unlike their healthy counterparts. Such a response may expose them to a greater fall risk resulting from inefficient compensatory stepping and reduced vertical limb support. Therapeutic interventions for fall prevention, therefore, should focus on improving both reactive stepping and limb support. © 2016 American Physical Therapy Association.
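
    The stability measures used in this study (COM position and velocity expressed relative to the BOS) can be sketched with a toy sampled trajectory; the numbers below are illustrative, not the study's data:

```python
# Sketch of the stability measures above: centre-of-mass (COM) position and
# velocity relative to the base of support (BOS). The sampled trajectory is
# an illustrative assumption.

def com_relative_to_bos(com_x, bos_x, dt):
    """Return (position, velocity) of COM relative to BOS at the last sample."""
    rel = [c - b for c, b in zip(com_x, bos_x)]
    vel = (rel[-1] - rel[-2]) / dt          # backward finite difference
    return rel[-1], vel

com_x = [0.00, -0.02, -0.05, -0.09]   # COM moving backward (m)
bos_x = [0.00,  0.00,  0.00, -0.02]   # BOS shifts back at step touch-down
pos, vel = com_relative_to_bos(com_x, bos_x, dt=0.01)
```

    A negative relative position with negative velocity, as here, corresponds to the posterior XCOM/BOS and backward ẊCOM/BOS reported for the stroke group.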

  14. A multi-step system for screening and localization of hard exudates in retinal images

    NASA Astrophysics Data System (ADS)

    Bopardikar, Ajit S.; Bhola, Vishal; Raghavendra, B. S.; Narayanan, Rangavittal

    2012-03-01

    The number of people being affected by Diabetes mellitus worldwide is increasing at an alarming rate. Monitoring of the diabetic condition and its effects on the human body are therefore of great importance. Of particular interest is diabetic retinopathy (DR) which is a result of prolonged, unchecked diabetes and affects the visual system. DR is a leading cause of blindness throughout the world. At any point of time 25-44% of people with diabetes are afflicted by DR. Automation of the screening and monitoring process for DR is therefore essential for efficient utilization of healthcare resources and optimizing treatment of the affected individuals. Such automation would use retinal images and detect the presence of specific artifacts such as hard exudates, hemorrhages and soft exudates (that may appear in the image) to gauge the severity of DR. In this paper, we focus on the detection of hard exudates. We propose a two-step system that consists of a screening step that classifies retinal images as normal or abnormal based on the presence of hard exudates and a detection stage that localizes these artifacts in an abnormal retinal image. The proposed screening step automatically detects the presence of hard exudates with a high sensitivity and positive predictive value (PPV). The detection/localization step uses a k-means based clustering approach to localize hard exudates in the retinal image. Suitable feature vectors are chosen based on their ability to isolate hard exudates while minimizing false detections. The algorithm was tested on a benchmark dataset (DIARETDB1) and was seen to provide a superior performance compared to existing methods. The two-step process described in this paper can be embedded in a tele-ophthalmology system to aid with speedy detection and diagnosis of the severity of DR.
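
    The k-means based localization step can be illustrated with a toy one-dimensional clustering of grey levels, separating bright exudate-like pixels from background; the paper's actual feature vectors are richer:

```python
# Toy k-means (k = 2) on pixel intensities, sketching how clustering can
# separate bright exudate-like pixels from background. Feature choice and
# data are illustrative assumptions.

def kmeans_1d(values, centers, iters=20):
    for _ in range(iters):
        clusters = [[], []]
        for v in values:
            # assign each value to the nearest of the two centers
            clusters[0 if abs(v - centers[0]) <= abs(v - centers[1]) else 1].append(v)
        # recompute each center as the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

pixels = [12, 15, 10, 14, 240, 250, 245]   # grey levels; bright = candidate exudate
centers, clusters = kmeans_1d(pixels, centers=[0.0, 255.0])
```
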

  15. Catalyst Interface Engineering for Improved 2D Film Lift-Off and Transfer

    PubMed Central

    2016-01-01

    The mechanisms by which chemical vapor deposited (CVD) graphene and hexagonal boron nitride (h-BN) films can be released from a growth catalyst, such as widely used copper (Cu) foil, are systematically explored as a basis for an improved lift-off transfer. We show how intercalation processes allow the local Cu oxidation at the interface followed by selective oxide dissolution, which gently releases the 2D material (2DM) film. Interfacial composition change and selective dissolution can thereby be achieved in a single step or split into two individual process steps. We demonstrate that this method is not only highly versatile but also yields graphene and h-BN films of high quality regarding surface contamination, layer coherence, defects, and electronic properties, without requiring additional post-transfer annealing. We highlight how such transfers rely on targeted corrosion at the catalyst interface and discuss this in context of the wider CVD growth and 2DM transfer literature, thereby fostering an improved general understanding of widely used transfer processes, which is essential to numerous other applications. PMID:27934130

  16. Real-Time Decision Making and Aggressive Behavior in Youth: A Heuristic Model of Response Evaluation and Decision (RED)

    PubMed Central

    Fontaine, Reid Griffith; Dodge, Kenneth A.

    2009-01-01

    Considerable scientific and intervention attention has been paid to judgment and decision-making systems associated with aggressive behavior in youth. However, most empirical studies have investigated social-cognitive correlates of stable child and adolescent aggressiveness, and less is known about real-time decision making to engage in aggressive behavior. A model of real-time decision making must incorporate both impulsive actions and rational thought. The present paper advances a process model (response evaluation and decision; RED) of real-time behavioral judgments and decision making in aggressive youths with mathematical representations that may be used to quantify response strength. These components are a heuristic to describe decision making, though it is doubtful that individuals always mentally complete these steps. RED represents an organization of social-cognitive operations believed to be active during the response decision step of social information processing. The model posits that RED processes can be circumvented through impulsive responding. This article provides a description and integration of thoughtful, rational decision making and nonrational impulsivity in aggressive behavioral interactions. PMID:20802851

  17. Real-Time Decision Making and Aggressive Behavior in Youth: A Heuristic Model of Response Evaluation and Decision (RED).

    PubMed

    Fontaine, Reid Griffith; Dodge, Kenneth A

    2006-11-01

    Considerable scientific and intervention attention has been paid to judgment and decision-making systems associated with aggressive behavior in youth. However, most empirical studies have investigated social-cognitive correlates of stable child and adolescent aggressiveness, and less is known about real-time decision making to engage in aggressive behavior. A model of real-time decision making must incorporate both impulsive actions and rational thought. The present paper advances a process model (response evaluation and decision; RED) of real-time behavioral judgments and decision making in aggressive youths with mathematical representations that may be used to quantify response strength. These components are a heuristic to describe decision making, though it is doubtful that individuals always mentally complete these steps. RED represents an organization of social-cognitive operations believed to be active during the response decision step of social information processing. The model posits that RED processes can be circumvented through impulsive responding. This article provides a description and integration of thoughtful, rational decision making and nonrational impulsivity in aggressive behavioral interactions.

  18. An Accelerated Analytical Process for the Development of STR Profiles for Casework Samples.

    PubMed

    Laurin, Nancy; Frégeau, Chantal J

    2015-07-01

    Significant efforts are being devoted to the development of methods enabling rapid generation of short tandem repeat (STR) profiles in order to reduce turnaround times for the delivery of human identification results from biological evidence. Some of the proposed solutions are still costly and low throughput. This study describes the optimization of an analytical process enabling the generation of complete STR profiles (single-source or mixed profiles) for human identification in approximately 5 h. This accelerated process uses currently available reagents and standard laboratory equipment. It includes a 30-min lysis step, a 27-min DNA extraction using the Promega Maxwell® 16 System, DNA quantification in <1 h using the Qiagen Investigator® Quantiplex HYres kit, fast amplification (<26 min) of the loci included in AmpFℓSTR® Identifiler®, and analysis of the profiles on the 3500-series Genetic Analyzer. This combination of fast individual steps produces high-quality profiling results and offers a cost-effective alternative approach to rapid DNA analysis. © 2015 American Academy of Forensic Sciences.

  19. A componential model of human interaction with graphs: 1. Linear regression modeling

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Lewis, Robert

    1994-01-01

    Task analyses served as the basis for developing the Mixed Arithmetic-Perceptual (MA-P) model, which proposes (1) that people interacting with common graphs to answer common questions apply a set of component processes: searching for indicators, encoding the value of indicators, performing arithmetic operations on the values, making spatial comparisons among indicators, and responding; and (2) that the type of graph and user's task determine the combination and order of the components applied (i.e., the processing steps). Two experiments investigated the prediction that response time will be linearly related to the number of processing steps according to the MA-P model. Subjects used line graphs, scatter plots, and stacked bar graphs to answer comparison questions and questions requiring arithmetic calculations. A one-parameter version of the model (with equal weights for all components) and a two-parameter version (with different weights for arithmetic and nonarithmetic processes) accounted for 76%-85% of individual subjects' variance in response time and 61%-68% of the variance taken across all subjects. The discussion addresses possible modifications in the MA-P model, alternative models, and design implications from the MA-P model.
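
    The core MA-P prediction, response time as a linear function of the number of processing steps, corresponds to a simple least-squares fit; the step counts and times below are made up for illustration:

```python
# Sketch of the MA-P prediction: response time modelled as a linear
# function of the number of processing steps. One-predictor least squares
# with an intercept; the step counts and times are hypothetical.

def linfit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx        # (slope, intercept)

steps = [3, 5, 7, 9]               # hypothetical MA-P step counts per task
rts   = [1.1, 1.7, 2.3, 2.9]       # response times (s)
slope, intercept = linfit(steps, rts)
```

    The two-parameter version of the model would instead fit separate weights for arithmetic and nonarithmetic steps, i.e. two predictors rather than one.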

  20. Arsenic (+3 Oxidation State) Methyltransferase and the Methylation of Arsenicals

    PubMed Central

    Thomas, David J.; Li, Jiaxin; Waters, Stephen B.; Xing, Weibing; Adair, Blakely M.; Drobna, Zuzana; Devesa, Vicenta; Styblo, Miroslav

    2008-01-01

    Metabolic conversion of inorganic arsenic into methylated products is a multistep process that yields mono-, di-, and trimethylated arsenicals. In recent years, it has become apparent that formation of methylated metabolites of inorganic arsenic is not necessarily a detoxification process. Intermediates and products formed in this pathway may be more reactive and toxic than inorganic arsenic. Like all metabolic pathways, understanding the pathway for arsenic methylation involves identification of each individual step in the process and the characterization of the molecules which participate in each step. Among several arsenic methyltransferases that have been identified, arsenic (+3 oxidation state) methyltransferase is the one best characterized at the genetic and functional levels. This review focuses on phylogenetic relationships in the deuterostomal lineage for this enzyme and on the relation between genotype for arsenic (+3 oxidation state) methyltransferase and phenotype for conversion of inorganic arsenic to methylated metabolites. Two conceptual models for function of arsenic (+3 oxidation state) methyltransferase which posit different roles for cellular reductants in the conversion of inorganic arsenic to methylated metabolites are compared. Although each model accurately represents some aspects of the enzyme’s role in the pathway for arsenic methylation, neither model is a fully satisfactory representation of all the steps in this metabolic pathway. Additional information on the structure and function of the enzyme will be needed to develop a more comprehensive model for this pathway. PMID:17202581

  1. Yadage and Packtivity - analysis preservation using parametrized workflows

    NASA Astrophysics Data System (ADS)

    Cranmer, Kyle; Heinrich, Lukas

    2017-10-01

    Preserving data analyses produced by the collaborations at LHC in a parametrized fashion is crucial in order to maintain reproducibility and re-usability. We argue for a declarative description in terms of individual processing steps - “packtivities” - linked through a dynamic directed acyclic graph (DAG) and present an initial set of JSON schemas for such a description and an implementation - “yadage” - capable of executing workflows of analysis preserved via Linux containers.
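
    The execution semantics of such a DAG of packtivities (each step runs only once its prerequisites have finished) amount to a topological ordering. A minimal Kahn-style sketch, with hypothetical step names rather than the actual yadage JSON schema:

```python
# Sketch of executing "packtivity"-style steps linked through a DAG:
# each step runs only after its dependencies. Kahn-style topological
# ordering; the step names are illustrative, not the yadage schema.

from collections import deque

def topo_order(deps):
    """deps: {step: set of prerequisite steps}. Return an executable order."""
    indeg = {s: len(d) for s, d in deps.items()}
    children = {s: [] for s in deps}
    for s, d in deps.items():
        for p in d:
            children[p].append(s)
    queue = deque(s for s, n in indeg.items() if n == 0)
    order = []
    while queue:
        s = queue.popleft()
        order.append(s)
        for c in children[s]:
            indeg[c] -= 1
            if indeg[c] == 0:      # all prerequisites of c have run
                queue.append(c)
    return order

workflow = {"select": set(), "fit": {"select"}, "plot": {"fit", "select"}}
order = topo_order(workflow)
```

    In a parametrized workflow the graph itself can grow dynamically as steps complete, but the same readiness rule governs when each packtivity may run.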

  2. Branding your practice: twelve practical steps to creating lifelong patient relationships.

    PubMed

    Neely, Melinda Hinson

    2005-01-01

    The concept of branding is not limited to large companies. It can be successfully applied in medical practices to those individuals or groups that wish to establish a distinct identity in the marketplace. Branding a medical practice establishes a competitive advantage, ensures a more predictable flow of patients, and ultimately enhances patient satisfaction. This article conceptualizes the branding process and provides guidelines for implementation that are applicable to a variety of budgets.

  3. Efficient hybrid metrology for focus, CD, and overlay

    NASA Astrophysics Data System (ADS)

    Tel, W. T.; Segers, B.; Anunciado, R.; Zhang, Y.; Wong, P.; Hasan, T.; Prentice, C.

    2017-03-01

    With the advent of multiple patterning techniques in the semiconductor industry, metrology has progressively become a burden. With multiple patterning techniques such as Litho-Etch-Litho-Etch and Sidewall Assisted Double Patterning, the number of processing steps has increased significantly, and so has the number of metrology steps needed for both control and yield monitoring. The amount of metrology needed increases with each node, as more layers require multiple patterning and more patterning steps are needed per layer. In addition, there is the need for guided defect inspection, which in itself requires substantially denser focus, overlay, and CD metrology than before. Metrology efficiency will therefore be crucial to the next semiconductor nodes. ASML's emulated wafer concept offers a highly efficient method for hybrid metrology for focus, CD, and overlay. In this concept, metrology is combined with the scanner's sensor data in order to predict the on-product performance. The principle underlying the method is to isolate and estimate individual root causes, which are then combined to compute the on-product performance. The goal is to use all the information available to avoid ever-increasing amounts of metrology.

  4. Quartz resonator processing system

    DOEpatents

    Peters, Roswell D. M.

    1983-01-01

    Disclosed is a single chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member which is adapted to move a plurality of individual component sets of a flat pack resonator unit past discretely located processing stations in said chamber whereupon electrode deposition takes place followed by the placement of ceramic covers over a frame containing a resonator element and then to a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

  5. Characterizing the roles of changing population size and selection on the evolution of flux control in metabolic pathways.

    PubMed

    Orlenko, Alena; Chi, Peter B; Liberles, David A

    2017-05-25

    Understanding the genotype-phenotype map is fundamental to our understanding of genomes. Genes do not function independently, but rather as part of networks or pathways. In the case of metabolic pathways, flux through the pathway is an important next layer of biological organization up from the individual gene or protein. Flux control in metabolic pathways, reflecting the importance of mutation to individual enzyme genes, may be evolutionarily variable due to the role of mutation-selection-drift balance. The evolutionary stability of rate limiting steps and the patterns of inter-molecular co-evolution were evaluated in a simulated pathway with a system out of equilibrium due to fluctuating selection, population size, or positive directional selection, to contrast with those under stabilizing selection. Depending upon the underlying population genetic regime, fluctuating population size was found to increase the evolutionary stability of rate limiting steps in some scenarios. This result was linked to patterns of local adaptation of the population. Further, during positive directional selection, as with more complex mutational scenarios, an increase in the observation of inter-molecular co-evolution was observed. Differences in patterns of evolution when systems are in and out of equilibrium, including during positive directional selection may lead to predictable differences in observed patterns for divergent evolutionary scenarios. In particular, this result might be harnessed to detect differences between compensatory processes and directional processes at the pathway level based upon evolutionary observations in individual proteins. Detecting functional shifts in pathways reflects an important milestone in predicting when changes in genotypes result in changes in phenotypes.
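
    The notion of flux control can be illustrated with a toy linear pathway in which the steady-state flux is J = 1 / Σ(1/v_i); the flux control coefficient of step i is then (1/v_i) / Σ(1/v_j), so the slowest step dominates control. The enzyme rates below are made-up numbers, not the paper's simulation:

```python
# Toy illustration of flux control in a linear pathway. For a chain whose
# steady-state flux is J = 1 / sum(1/v_i), the flux control coefficient of
# step i is (1/v_i) / sum(1/v_j); the coefficients sum to 1 (summation
# theorem of metabolic control analysis). Rates are illustrative.

def flux(vs):
    return 1.0 / sum(1.0 / v for v in vs)

def control_coefficients(vs):
    total = sum(1.0 / v for v in vs)
    return [(1.0 / v) / total for v in vs]

vs = [10.0, 2.0, 50.0]        # step 2 is rate limiting
J = flux(vs)
C = control_coefficients(vs)
```

    Mutation-selection-drift balance acting on the individual enzymes shifts these rates, and with them which step carries most of the flux control.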

  6. Toward the Computational Representation of Individual Cultural, Cognitive, and Physiological State: The Sensor Shooter Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RAYBOURN,ELAINE M.; FORSYTHE,JAMES C.

    2001-08-01

    This report documents an exploratory FY 00 LDRD project that sought to demonstrate the first steps toward a realistic computational representation of the variability encountered in individual human behavior. Realism, as conceptualized in this project, required that the human representation address the underlying psychological, cultural, physiological, and environmental stressors. The present report outlines the researchers' approach to representing cognitive, cultural, and physiological variability of an individual in an ambiguous situation while faced with a high-consequence decision that would greatly impact subsequent events. The present project was framed around a sensor-shooter scenario as a soldier interacts with an unexpected target (two young Iraqi girls). A software model of the "Sensor Shooter" scenario from Desert Storm was developed in which the framework consisted of a computational instantiation of Recognition Primed Decision Making in the context of a Naturalistic Decision Making model [1]. Recognition Primed Decision Making was augmented with an underlying foundation based on our current understanding of human neurophysiology and its relationship to human cognitive processes. While the Gulf War scenario that constitutes the framework for the Sensor Shooter prototype is highly specific, the human decision architecture and the subsequent simulation are applicable to other problems similar in concept, intensity, and degree of uncertainty. The goal was to provide initial steps toward a computational representation of human variability in cultural, cognitive, and physiological state in order to attain a better understanding of the full depth of human decision-making processes in the context of ambiguity, novelty, and heightened arousal.

  7. [Staged oncological screening with TG test].

    PubMed

    Bakhlaev, I E; Ageenko, A I; Rolik, I S

    2006-01-01

    The authors present their analysis of screening methods used for early diagnosis of cancer of various localizations and for detection of high-risk individuals. They offer a program of step-by-step screening that makes it possible to cover a larger population with prophylactic examinations and to reduce the need for special examination methods. The TG test is a universal and highly informative indicator of a blastomatous process at any stage, including the preclinical one. In practice, the screening doubled the detection rate of oncopathology and allowed a three-fold reduction in diagnostic costs compared with standard methods of cancer diagnostics. The medical efficiency of the oncological screening is high; in one third of the examined patients a tumor is diagnosed at the preclinical stage.

  8. One-Step Synthesis of Monodisperse In-Doped ZnO Nanocrystals

    NASA Astrophysics Data System (ADS)

    Wang, Qing Ling; Yang, Ye Feng; He, Hai Ping; Chen, Dong Dong; Ye, Zhi Zhen; Jin, Yi Zheng

    2010-05-01

    A method for the synthesis of high-quality indium-doped zinc oxide (In-doped ZnO) nanocrystals was developed using a one-step ester elimination reaction based on alcoholysis of metal carboxylate salts. The resulting nearly monodisperse nanocrystals are well crystallized, with a crystal structure typically identical to that of wurtzite ZnO. Structural, optical, and elemental analyses of the products indicate the incorporation of indium into the host ZnO lattice. Individual nanocrystals with cubic structures were observed in the 5% In-ZnO reaction, owing to the relatively high reactivity of the indium precursors. Our study provides further insight into the growth of doped oxide nanocrystals and deepens understanding of the doping process in colloidal nanocrystal syntheses.

  9. A Minimal Optical Trapping and Imaging Microscopy System

    PubMed Central

    Hernández Candia, Carmen Noemí; Tafoya Martínez, Sara; Gutiérrez-Medina, Braulio

    2013-01-01

    We report the construction and testing of a simple and versatile optical trapping apparatus, suitable for visualizing individual microtubules (∼25 nm in diameter) and performing single-molecule studies, using a minimal set of components. This design is based on a conventional, inverted microscope, operating under plain bright field illumination. A single laser beam enables standard optical trapping and the measurement of molecular displacements and forces, whereas digital image processing affords real-time sample visualization with reduced noise and enhanced contrast. We have tested our trapping and imaging instrument by measuring the persistence length of individual double-stranded DNA molecules, and by following the stepping of single kinesin motor proteins along clearly imaged microtubules. The approach presented here provides a straightforward alternative for studies of biomaterials and individual biomolecules. PMID:23451216

  10. [R-ALERGO. Allergy-healthy routes in Valencia].

    PubMed

    Temes Cordovez, Rafael R; Moya Fuero, Alfonso; Martí Garrido, Jaume; Perales Chordá, Carolina; Díaz Palacios, Miguel; Hernández Fernández de Rojas, Dolores

    2016-01-01

    R-ALERGO is a project developed by researchers from the Universitat Politècnica de València and the Hospital Universitario La Fe (Valencia, Spain). The main objective of the project is to create a mobile application that identifies, within the city of Valencia, the most favorable routes for allergic individuals. The application is developed using nine environmental variables with a potential effect on the development of clinical manifestations in allergic individuals. The application combines spatial analysis based on network technology, implemented with geographic information system software. The first version is under evaluation for a Healthy app hallmark. The next step in this project is to design a clinical validation process to test its usefulness in allergic individuals. Copyright © 2015 SESPAS. Published by Elsevier Espana. All rights reserved.

  11. Association of activities of daily living with the load during step ascent motion in nursing home-residing elderly individuals.

    PubMed

    Masaki, Mitsuhiro; Ikezoe, Tome; Kamiya, Midori; Araki, Kojiro; Isono, Ryo; Kato, Takehiro; Kusano, Ken; Tanaka, Masayo; Sato, Syunsuke; Hirono, Tetsuya; Kita, Kiyoshi; Tsuboyama, Tadao; Ichihashi, Noriaki

    2018-04-19

    This study aimed to examine the association of independence in ADL with the loads during step ascent motion and other motor functions in 32 nursing home-residing elderly individuals. Independence in ADL was assessed using the functional independence measure (FIM). The loads at the upper (i.e., pulling up) and lower (i.e., pushing up) levels during the step ascent task were measured on a step ascent platform. Hip extensor, knee extensor, plantar flexor, and quadriceps setting strengths; lower extremity agility, using the stepping test; and hip and knee joint pain severities were measured. One-legged stance and functional reach distance for balance, and maximal walking speed, timed up-and-go (TUG) time, five-chair-stand time, and step ascent time were also measured to assess mobility. Stepwise regression analysis revealed that the load at pushing up during step ascent motion and the TUG time were significant and independent determinants of FIM score. FIM score decreased as the load at pushing up decreased and TUG time increased. The results suggest that, depending on task specificity, both the peak load at pushing up during step ascent motion and the TUG time can partially explain the FIM score for ADL in nursing home-residing elderly individuals. Lower extremity muscle strength, agility, pain, and balance measures did not add to the prediction.

  12. Empirical modeling of an alcohol expectancy memory network using multidimensional scaling.

    PubMed

    Rather, B C; Goldman, M S; Roehrich, L; Brannick, M

    1992-02-01

    Risk-related antecedent variables can be linked to later alcohol consumption by memory processes, and alcohol expectancies may be one relevant memory content. To advance research in this area, it would be useful to apply current memory models such as semantic network theory to explain drinking decision processes. We used multidimensional scaling (MDS) to empirically model a preliminary alcohol expectancy semantic network, from which a theoretical account of drinking decision making was generated. Subanalyses (PREFMAP) showed how individuals with differing alcohol consumption histories may have had different association pathways within the expectancy network. These pathways may have, in turn, influenced future drinking levels and behaviors while the person was under the influence of alcohol. All individuals associated positive/prosocial effects with drinking, but heavier drinkers indicated arousing effects as their highest-probability associates, whereas light drinkers expected sedation. An important early step in this MDS modeling process is the determination of iso-meaning expectancy adjective groups, which correspond to theoretical network nodes.

  13. Improving equitable access to imaging under universal-access medicine: the ontario wait time information program and its impact on hospital policy and process.

    PubMed

    Kielar, Ania Z; El-Maraghi, Robert H; Schweitzer, Mark E

    2010-08-01

    In Canada, equal access to health care is the goal, but this is associated with wait times. Wait times should be fair rather than uniform, taking into account the urgency of the problem as well as the time an individual has already waited. In November 2004, the Ontario government began addressing this issue. One of the first steps was to institute benchmarks reflecting "acceptable" wait times for CT and MRI. A public Web site was developed indicating wait times at each Local Health Integration Network. Since the start of the Wait Time Information Program, there has been a sustained reduction in wait times for Ontarians requiring CT and MRI. The average wait time for a CT scan went from 81 days in September 2005 to 47 days in September 2009. For MRI, the average wait time was reduced from 120 to 105 days. Increased scan volumes have been achieved by purchasing new CT and MRI scanners, expanding hours of operation, and improving patient throughput using strategies learned from the Lean initiative, based on Toyota's manufacturing philosophy for car production. Institution-specific changes in booking procedures have been implemented. Concurrently, government guidelines have been developed to ensure accountability for monies received. The Ontario Wait Time Information Program is an innovative first step in improving fair and equitable access to publicly funded imaging services. There have been reductions in wait times for both CT and MRI. As various new processes are implemented, further review of each step will be necessary to determine its individual efficacy. Copyright 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  14. Step-to-step spatiotemporal variables and ground reaction forces of intra-individual fastest sprinting in a single session.

    PubMed

    Nagahara, Ryu; Mizutani, Mirai; Matsuo, Akifumi; Kanehisa, Hiroaki; Fukunaga, Tetsuo

    2018-06-01

    We aimed to investigate the step-to-step spatiotemporal variables and ground reaction forces during the acceleration phase for characterising intra-individual fastest sprinting within a single session. Step-to-step spatiotemporal variables and ground reaction forces produced by 15 male athletes were measured over a 50-m distance during repeated (three to five) 60-m sprints using a long force platform system. Differences in measured variables between the fastest and slowest trials were examined at each step until the 22nd step using a magnitude-based inferences approach. There were possibly-most likely higher running speed and step frequency (2nd to 22nd steps) and shorter support time (all steps) in the fastest trial than in the slowest trial. Moreover, for the fastest trial there were likely-very likely greater mean propulsive force during the initial four steps and possibly-very likely larger mean net anterior-posterior force until the 17th step. The current results demonstrate that better sprinting performance within a single session is probably achieved by 1) a high step frequency (except the initial step) with short support time at all steps, 2) exerting a greater mean propulsive force during initial acceleration, and 3) producing a greater mean net anterior-posterior force during initial and middle acceleration.

  15. RECAPDOC - a questionnaire for the documentation of rehabilitation care utilization in individuals with disorders of consciousness in long-term care in Germany: development and pretesting.

    PubMed

    Klingshirn, Hanna; Mittrach, Rene; Braitmayer, Kathrin; Strobl, Ralf; Bender, Andreas; Grill, Eva; Müller, Martin

    2018-05-04

    A multitude of different rehabilitation interventions and other specific health care services are offered to individuals with disorders of consciousness in long-term care settings. To investigate the association between those services and patient-relevant outcomes, a specific instrument to document the utilization of those services is needed. The purpose of this study was to develop such a questionnaire, to be administered to caregivers in epidemiological studies or patient registries in Germany. The development process of the RECAPDOC questionnaire was carried out in three steps. Step 1 consisted of a systematic literature review and an online-based expert survey to define the general content. Step 2 was an expert interview to evaluate the preliminary content of the questionnaire. Step 3 was a pretest including cognitive interviews with caregivers. After each step, the results were combined into a new version of the questionnaire. The first version of the questionnaire included items on utilization of medical care, medical aids, and nursing and therapeutic care. The results of the expert interview led to the integration of five new items and the modification of six other items. The pretest led to some minor modifications of the questionnaire, since it was rated as feasible and acceptable. The final questionnaire consisted of 29 items covering the domains "living situation", "social insurance status", "utilisation of home health care", "domestic services", "outpatient health care", "specific diagnostic measures", "adaptive technologies", "medical aids" and "utilization of therapies". The experience of family support and the multidisciplinary collaboration of health professionals are also covered. The developed questionnaire is a first step toward making the situation of patients with disorders of consciousness in the long-term care setting accessible for evaluation in epidemiological studies and in the context of patient registries. However, further reliability and validity studies are needed.

  16. Kinematic, Muscular, and Metabolic Responses During Exoskeletal-, Elliptical-, or Therapist-Assisted Stepping in People With Incomplete Spinal Cord Injury

    PubMed Central

    Kinnaird, Catherine R.; Holleran, Carey L.; Rafferty, Miriam R.; Rodriguez, Kelly S.; Cain, Julie B.

    2012-01-01

    Background Robotic-assisted locomotor training has demonstrated some efficacy in individuals with neurological injury and is slowly gaining clinical acceptance. Both exoskeletal devices, which control individual joint movements, and elliptical devices, which control endpoint trajectories, have been utilized with specific patient populations and are available commercially. No studies have directly compared training efficacy or patient performance during stepping between devices. Objective The purpose of this study was to evaluate kinematic, electromyographic (EMG), and metabolic responses during elliptical- and exoskeletal-assisted stepping in individuals with incomplete spinal cord injury (SCI) compared with therapist-assisted stepping. Design A prospective, cross-sectional, repeated-measures design was used. Methods Participants with incomplete SCI (n=11) performed 3 separate bouts of exoskeletal-, elliptical-, or therapist-assisted stepping. Unilateral hip and knee sagittal-plane kinematics, lower-limb EMG recordings, and oxygen consumption were compared across stepping conditions and with control participants (n=10) during treadmill stepping. Results Exoskeletal stepping kinematics closely approximated normal gait patterns, whereas significantly greater hip and knee flexion postures were observed during elliptical-assisted stepping. Measures of kinematic variability indicated consistent patterns in control participants and during exoskeletal-assisted stepping, whereas therapist- and elliptical-assisted stepping kinematics were more variable. Despite specific differences, EMG patterns generally were similar across stepping conditions in the participants with SCI. In contrast, oxygen consumption was consistently greater during therapist-assisted stepping. Limitations Limitations included a small sample size, lack of ability to evaluate kinetics during stepping, unilateral EMG recordings, and sagittal-plane kinematics. 
Conclusions Despite specific differences in kinematics and EMG activity, metabolic activity was similar during stepping in each robotic device. Understanding potential differences and similarities in stepping performance with robotic assistance may be important in delivery of repeated locomotor training using robotic or therapist assistance and for consumers of robotic devices. PMID:22700537

  17. Catalysis-Enhancement via Rotary Fluctuation of F1-ATPase

    PubMed Central

    Watanabe, Rikiya; Hayashi, Kumiko; Ueno, Hiroshi; Noji, Hiroyuki

    2013-01-01

    Protein conformational fluctuations modulate the catalytic powers of enzymes. The frequency of conformational fluctuations may modulate the catalytic rate at individual reaction steps. In this study, we modulated the rotary fluctuation frequency of F1-ATPase (F1) by attaching probes with different viscous drag coefficients at the rotary shaft of F1. Individual rotation pauses of F1 between rotary steps correspond to the waiting state of a certain elementary reaction step of ATP hydrolysis. This allows us to investigate the impact of the frequency modulation of the rotary fluctuation on the rate of the individual reaction steps by measuring the duration of rotation pauses. Although phosphate release was significantly decelerated, the ATP-binding and hydrolysis steps were less sensitive or insensitive to the viscous drag coefficient of the probe. Brownian dynamics simulation based on a model similar to the Sumi-Marcus theory reproduced the experimental results, providing a theoretical framework for the role of rotational fluctuation in F1 rate enhancement. PMID:24268150

  18. Conflict and coordination in the provision of public goods: a conceptual analysis of continuous and step-level games.

    PubMed

    Abele, Susanne; Stasser, Garold; Chartier, Christopher

    2010-11-01

    Conflicts between individual and collective interests are ubiquitous in social life. Experimental studies have investigated the resolution of such conflicts using public goods games with either continuous or step-level payoff functions. Game theory and social interdependence theory identify consequential differences between these two types of games. Continuous function games are prime examples of social dilemmas because they always contain a conflict between individual and collective interests, whereas step-level games can be construed as social coordination games. Step-level games often provide opportunities for coordinated solutions that benefit both the collective and the individuals. For this and other reasons, the authors conclude that one cannot safely generalize results obtained from step-level to continuous-form games (or vice versa). Finally, the authors identify specific characteristics of the payoff function in public goods games that conceptually mark the transition from a pure dilemma to a coordination problem nested within a dilemma.

  19. Developing Livestock Facility Type Information from USDA Agricultural Census Data for Use in Epidemiological and Economic Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melius, C; Robertson, A; Hullinger, P

    2006-10-24

    The epidemiological and economic modeling of livestock diseases requires knowing the size, location, and operational type of each livestock facility within the US. At the present time, the only national database of livestock facilities that is available to the general public is the USDA's 2002 Agricultural Census data, published by the National Agricultural Statistics Service, herein referred to as the 'NASS data.' The NASS data provide facility data at the county level for various livestock types (i.e., beef cows, milk cows, cattle on feed, other cattle, total hogs and pigs, sheep and lambs, milk goats, and angora goats). However, the numbers and sizes of facilities for the various livestock types are not independent, since some facilities have more than one type of livestock, and some livestock are of more than one type (e.g., 'other cattle' that are being fed for slaughter are also 'cattle on feed'). In addition, any data tabulated by NASS that could identify numbers of animals or other data reported by an individual respondent is suppressed by NASS and coded with a 'D'. To be useful for epidemiological and economic modeling, the NASS data must be converted into a unique set of facility types (farms having similar operational characteristics). The unique set must not double count facilities or animals. At the same time, it must account for all the animals, including those for which the data have been suppressed. Therefore, several data processing steps are required to work back from the published NASS data to obtain a consistent database for individual livestock operations. This technical report documents the data processing steps that were used to convert the NASS data into a national livestock facility database with twenty-eight facility types. The process involves two major steps. The first step defines the rules used to estimate the data that are suppressed within the NASS database. The second step converts the NASS livestock types into the operational facility types used by the epidemiological and economic model. Comparison of the resulting database with an independent survey of farms in central California shows excellent agreement in the numbers of farms for the various facility types. This suggests that the NASS data are well suited for providing a consistent set of county-level information on facility numbers and sizes that can be used in epidemiological and economic models.
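
The first step above can be illustrated with a minimal sketch of one possible suppression-estimation rule. The even-split rule, the county names, and all numbers below are illustrative assumptions, not the actual procedure documented in the report.

```python
# Hedged sketch: estimating values suppressed (coded 'D') in a county-level
# table, by distributing the residual between the state total and the sum of
# reported counties evenly across the suppressed cells. This even-split rule
# is an assumption for illustration only.

def estimate_suppressed(state_total, county_values):
    """county_values maps county -> reported count, or 'D' if suppressed."""
    reported = sum(v for v in county_values.values() if v != "D")
    suppressed = [k for k, v in county_values.items() if v == "D"]
    residual = state_total - reported
    estimate = residual / len(suppressed) if suppressed else 0.0
    # Fill each suppressed cell with the even-split estimate.
    return {k: (v if v != "D" else estimate) for k, v in county_values.items()}

# Hypothetical county data: two counties report counts, two are suppressed.
counties = {"Alameda": 120, "Fresno": "D", "Kern": "D", "Yolo": 80}
filled = estimate_suppressed(300, counties)
assert filled["Fresno"] == 50.0        # residual of 100 split over 2 cells
assert sum(filled.values()) == 300     # state total is preserved
```

The key constraint the sketch preserves is the one named in the abstract: the filled-in database must still account for all the animals in the published totals.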

  20. Tracing the decision-making process of physicians with a Decision Process Matrix.

    PubMed

    Hausmann, Daniel; Zulian, Cristina; Battegay, Edouard; Zimmerli, Lukas

    2016-10-18

    Decision-making processes in a medical setting are complex, dynamic and under time pressure, often with serious consequences for a patient's condition. The principal aim of the present study was to trace and map the individual diagnostic process of real medical cases using a Decision Process Matrix (DPM). The naturalistic decision-making processes of 11 residents across a total of 55 medical cases were recorded in an emergency department, and a DPM was drawn up according to a semi-structured technique following four steps: 1) observing and recording relevant information throughout the entire diagnostic process, 2) assessing options in terms of suspected diagnoses, 3) drawing up an initial version of the DPM, and 4) verifying the DPM while adding the confidence ratings. The DPM comprised an average of 3.2 suspected diagnoses and 7.9 information units (cues). The following three-phase pattern could be observed: option generation, option verification, and final diagnosis determination. Residents strove for the highest possible level of confidence before making the final diagnosis (rated practically certain in two-thirds of the medical cases) or excluding suspected diagnoses (rated practically impossible in half of the cases). The following challenges have to be addressed in the future: real-time capturing of suspected diagnoses as they emerge in the physician's memory, definition of meaningful information units, and a more contemporary measurement of confidence. The DPM is a useful tool for tracing real, individual diagnostic processes. The methodological approach with the DPM allows further investigation of the underlying cognitive diagnostic processes on a theoretical level and improvement of individual clinical reasoning skills in practice.
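
To make the matrix structure concrete, here is a minimal sketch of one way such a matrix could be represented in code: rows are information units (cues) in the order observed, columns are suspected diagnoses, and cells hold confidence ratings. The class, the rating scale, and the sum-based aggregation rule are illustrative assumptions, not the instrument used in the study.

```python
# Hedged sketch of a cue-by-diagnosis matrix with confidence ratings.
# Scale and aggregation rule are assumptions for illustration.

class DecisionProcessMatrix:
    def __init__(self):
        self.cues = []        # information units, in order of observation
        self.diagnoses = []   # suspected diagnoses (options)
        self.ratings = {}     # (cue, diagnosis) -> confidence rating

    def observe(self, cue):
        self.cues.append(cue)

    def suspect(self, diagnosis):
        if diagnosis not in self.diagnoses:
            self.diagnoses.append(diagnosis)

    def rate(self, cue, diagnosis, confidence):
        self.ratings[(cue, diagnosis)] = confidence

    def final_diagnosis(self):
        # Assumed rule: pick the option with the highest summed confidence.
        return max(self.diagnoses, key=lambda d: sum(
            c for (_, dx), c in self.ratings.items() if dx == d))

# Hypothetical case: cues accumulate, options are rated, one option wins.
dpm = DecisionProcessMatrix()
dpm.observe("chest pain"); dpm.observe("elevated troponin")
dpm.suspect("ACS"); dpm.suspect("pulmonary embolism")
dpm.rate("chest pain", "ACS", 3)
dpm.rate("elevated troponin", "ACS", 5)
dpm.rate("chest pain", "pulmonary embolism", 2)
assert dpm.final_diagnosis() == "ACS"
```

The three-phase pattern reported in the abstract maps naturally onto this structure: `suspect` calls during option generation, `rate` calls during option verification, and `final_diagnosis` at determination.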

  1. Stimulant abuser groups to engage in 12-step: a multisite trial in the National Institute on Drug Abuse Clinical Trials Network.

    PubMed

    Donovan, Dennis M; Daley, Dennis C; Brigham, Gregory S; Hodgkins, Candace C; Perl, Harold I; Garrett, Sharon B; Doyle, Suzanne R; Floyd, Anthony S; Knox, Patricia C; Botero, Christopher; Kelly, Thomas M; Killeen, Therese K; Hayes, Carole; Kau'i Baumhofer, Nicole; Kau'ibaumhofer, Nicole; Seamans, Cindy; Zammarelli, Lucy

    2013-01-01

    The study evaluated the effectiveness of an 8-week combined group plus individual 12-step facilitative intervention on stimulant drug use and on 12-step meeting attendance and service. Multisite randomized controlled trial, with assessments at baseline, mid-treatment, end of treatment, and 3- and 6-month post-randomization follow-ups (FUs). Intensive outpatient substance treatment programs. Individuals with stimulant use disorders (n = 471) randomly assigned to treatment as usual (TAU) or to TAU into which the Stimulant Abuser Groups to Engage in 12-Step (STAGE-12) intervention was integrated. Urinalysis and self-reports of substance use and 12-step attendance and activities. Group sessions focused on increasing acceptance of 12-step principles; individual sessions incorporated an intensive referral procedure connecting participants to 12-step volunteers. Compared with TAU, STAGE-12 participants had significantly greater odds of self-reported stimulant abstinence during the active 8-week treatment phase; however, among those who had not achieved abstinence during this period, STAGE-12 participants had more days of use. STAGE-12 participants had lower Addiction Severity Index Drug Composite scores at the 3-month FU and a significant reduction in those scores from baseline to that point, attended 12-step meetings on a greater number of days during the early phase of active treatment, engaged in more other types of 12-step activities throughout the active treatment phase and the entire FU period, and had more days of self-reported service at meetings from mid-treatment through the 6-month FU. The present findings are mixed with respect to the impact of integrating the STAGE-12 intervention into intensive outpatient drug treatment, compared with TAU, on stimulant drug use. However, the results more clearly indicate that individuals in STAGE-12 had higher rates of 12-step meeting attendance and were engaged in more related activities throughout both the active treatment phase and the entire 6-month FU period than did those in TAU. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. [Steps to transform a necessity into a validated and useful screening tool for early detection of developmental problems in Mexican children].

    PubMed

    Rizzoli-Córdoba, Antonio; Delgado-Ginebra, Ismael

    A screening test is an instrument whose primary function is to identify individuals with a probable disease among an apparently healthy population, establishing risk of, or suspicion for, a disease. Caution must be taken when using a screening tool in order to avoid unrealistic measurements that delay an intervention for those who may benefit from it. Before introducing a screening test into clinical practice, it is necessary to verify that it has certain characteristics that make it useful. This "certification" process is called validation. The main objective of this paper is to describe the different steps that must be taken, from the identification of a need for early detection through the generation of a validated and reliable screening tool, using as an example the process for the modified version of the Child Development Evaluation Test (CDE, or Prueba EDI) in Mexico. Copyright © 2015 Hospital Infantil de México Federico Gómez. Publicado por Masson Doyma México S.A. All rights reserved.

  3. Minefield reconnaissance and detector system

    DOEpatents

    Butler, M.T.; Cave, S.P.; Creager, J.D.; Johnson, C.M.; Mathes, J.B.; Smith, K.J.

    1994-04-26

    A multi-sensor system is described for detecting the presence of objects on the surface of the ground or buried just under the surface, such as anti-personnel or anti-tank mines or the like. A remote sensor platform has a plurality of metal detector sensors and a plurality of short-pulse radar sensors. The remote sensor platform is remotely controlled from a processing and control unit, and signals from the remote sensor platform are sent to the processing and control unit, where they are individually evaluated in separate data analysis subprocess steps to obtain a probability score for each of the pluralities of sensors. These probability scores are combined in a fusion subprocess step by comparing the score set to a probability table derived from the historical incidence of object-present conditions given that score set. A decision-making rule is applied to produce an output, which is optionally provided to a marker subprocess for controlling a marker device to mark the location of found objects. 7 figures.
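
A minimal sketch of the fusion idea described in the record: each sensor's analysis step yields a discretized probability score, the score set indexes a table of historical object-present probabilities, and a decision rule turns the fused probability into a mark/no-mark output. The score levels, table values, and threshold are illustrative assumptions, not the patented method.

```python
# Hedged sketch of score-set fusion against a historical probability table.
# All numbers here are made up for illustration.

from itertools import product

# Discretized sensor scores: 0 = low, 1 = medium, 2 = high.
SCORE_LEVELS = (0, 1, 2)

# Hypothetical table: P(object present | metal score, radar score).
# In the patent this would come from historical incidence data.
FUSION_TABLE = {
    (m, r): min(1.0, 0.05 + 0.25 * m + 0.20 * r)
    for m, r in product(SCORE_LEVELS, repeat=2)
}

def fuse(metal_score: int, radar_score: int, threshold: float = 0.5) -> bool:
    """Decision rule: mark the location if the fused probability
    of an object being present meets the threshold."""
    return FUSION_TABLE[(metal_score, radar_score)] >= threshold

assert fuse(2, 1)        # high metal + medium radar -> mark
assert not fuse(0, 0)    # two low scores -> no mark
```

The table-lookup form mirrors the record's description: fusion compares the joint score set against historical evidence rather than combining the sensors' scores with a fixed formula at detection time.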

  4. Visibility Equalizer Cutaway Visualization of Mesoscopic Biological Models.

    PubMed

    Le Muzic, M; Mindek, P; Sorger, J; Autin, L; Goodsell, D; Viola, I

    2016-06-01

    In scientific illustrations and visualization, cutaway views are often employed as an effective technique for occlusion management in densely packed scenes. We propose a novel method for authoring cutaway illustrations of mesoscopic biological models. In contrast to existing cutaway algorithms, we take advantage of the specific nature of the biological models. These models consist of thousands of instances with a comparably smaller number of distinct types. Our method is a two-stage process. In the first stage, clipping objects are placed in the scene, creating a cutaway visualization of the model. During this process, a hierarchical list of stacked bars informs the user about the visibility distribution of the instances of each individual molecular type in the scene. In the second stage, the visibility of each molecular type is fine-tuned through these bars, which at this point act as interactive visibility equalizers. An evaluation of our technique with domain experts confirmed that our equalizer-based approach to visibility specification was valuable and effective for both scientific and educational purposes.
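
The two-stage process can be sketched roughly as follows: clipping first hides instances, then a per-type "equalizer" target restores or hides instances until each molecular type shows a desired visible fraction. The data layout, the random sampling rule, and all names are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of per-type visibility equalization after clipping.

import random

def apply_equalizer(instances, targets, rng=random.Random(0)):
    """instances: list of (type_name, visible_after_clipping) pairs.
    targets: type_name -> desired visible fraction in [0, 1]."""
    by_type = {}
    for i, (t, _) in enumerate(instances):
        by_type.setdefault(t, []).append(i)
    out = [vis for _, vis in instances]
    for t, idxs in by_type.items():
        want = round(targets.get(t, 1.0) * len(idxs))
        visible = [i for i in idxs if out[i]]
        hidden = [i for i in idxs if not out[i]]
        if len(visible) > want:        # hide surplus instances of this type
            for i in rng.sample(visible, len(visible) - want):
                out[i] = False
        elif len(visible) < want:      # restore clipped instances of this type
            for i in rng.sample(hidden, want - len(visible)):
                out[i] = True
    return out

# Hypothetical scene: 8 ribosomes (4 clipped away), 2 tRNAs (both visible).
scene = [("ribosome", True)] * 4 + [("ribosome", False)] * 4 + [("trna", True)] * 2
vis = apply_equalizer(scene, {"ribosome": 0.25, "trna": 1.0})
# The equalizer target leaves exactly 2 of the 8 ribosomes visible.
assert sum(1 for (t, _), v in zip(scene, vis) if t == "ribosome" and v) == 2
```

Counting visibility per type, rather than per instance, is what lets the stacked bars in the interface double as controls: dragging a bar only changes the fraction, and the system decides which individual instances to show or hide.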

  5. Visibility Equalizer Cutaway Visualization of Mesoscopic Biological Models

    PubMed Central

    Le Muzic, M.; Mindek, P.; Sorger, J.; Autin, L.; Goodsell, D.; Viola, I.

    2017-01-01

    In scientific illustrations and visualization, cutaway views are often employed as an effective technique for occlusion management in densely packed scenes. We propose a novel method for authoring cutaway illustrations of mesoscopic biological models. In contrast to existing cutaway algorithms, we take advantage of the specific nature of the biological models. These models consist of thousands of instances with a comparably smaller number of distinct types. Our method is a two-stage process. In the first stage, clipping objects are placed in the scene, creating a cutaway visualization of the model. During this process, a hierarchical list of stacked bars informs the user about the visibility distribution of the instances of each individual molecular type in the scene. In the second stage, the visibility of each molecular type is fine-tuned through these bars, which at this point act as interactive visibility equalizers. An evaluation of our technique with domain experts confirmed that our equalizer-based approach to visibility specification was valuable and effective for both scientific and educational purposes. PMID:28344374

  6. Balance confidence is related to features of balance and gait in individuals with chronic stroke

    PubMed Central

    Schinkel-Ivy, Alison; Wong, Jennifer S.; Mansfield, Avril

    2016-01-01

    Reduced balance confidence is associated with impairments in features of balance and gait in individuals with sub-acute stroke. However, an understanding of these relationships in individuals at the chronic stage of stroke recovery is lacking. This study aimed to quantify relationships between balance confidence and specific features of balance and gait in individuals with chronic stroke. Participants completed a balance confidence questionnaire and clinical balance assessment (quiet standing, walking, and reactive stepping) at 6 months post-discharge from inpatient stroke rehabilitation. Regression analyses were performed using balance confidence as a predictor variable and quiet standing, walking, and reactive stepping outcome measures as the dependent variables. Walking velocity was positively correlated with balance confidence, while medio-lateral centre of pressure excursion (quiet standing) and double support time, step width variability, and step time variability (walking) were negatively correlated with balance confidence. This study provides insight into the relationships between balance confidence and balance and gait measures in individuals with chronic stroke, suggesting that individuals with low balance confidence exhibited impaired control of quiet standing as well as walking characteristics associated with cautious gait strategies. Future work should identify the direction of these relationships to inform community-based stroke rehabilitation programs for individuals with chronic stroke, and determine the potential utility of incorporating interventions to improve balance confidence into these programs. PMID:27955809

  7. Nano-Evaluris: an inhalation and explosion risk evaluation method for nanoparticle use. Part I: description of the methodology

    NASA Astrophysics Data System (ADS)

    Bouillard, Jacques X.; Vignes, Alexis

    2014-02-01

    In this paper, an inhalation health and explosion safety risk assessment methodology for nanopowders is described. Since toxicological threshold limit values are still unknown for nanosized substances, detailed risk assessments of specific plants cannot yet be carried out. A simple approach based on occupational hazard/exposure bands expressed in mass concentrations is proposed for nanopowders. This approach is consolidated with an iso-surface toxicological scaling method which, although incomplete, has the merit of providing concentration threshold levels for which new metrological instruments should be developed for proper air monitoring to ensure safety. Whenever the processing or use of nanomaterials introduces a risk to the worker, a specific nano pictogram is proposed to inform the worker. Examples of risk assessment of process equipment (i.e., containment valves) handling various nanomaterials are provided. Explosion risks related to very reactive nanomaterials, such as aluminum nanopowders, can be assessed using this new analysis methodology adapted to nanopowders. To formalize and extend this approach, however, it is necessary to develop new, relevant standard apparatuses and to qualify individual and collective safety barriers with respect to health and explosion risks. In spite of these uncertainties, it appears, as shown in the second paper (Part II), that health and explosion risks, evaluated for given MWCNTs and aluminum nanoparticles, remain manageable in continuous fabrication mode, given the current individual and collective safety barriers that can be put in place. The authors underline, however, that particular attention must be paid to non-continuous modes of operation, such as process equipment cleaning steps, which are often under-analyzed and too often forgotten, yet are critical steps requiring vigilance to minimize potential toxic and explosion risks.

  8. High-throughput crystallization screening.

    PubMed

    Skarina, Tatiana; Xu, Xiaohui; Evdokimova, Elena; Savchenko, Alexei

    2014-01-01

    Protein structure determination by X-ray crystallography depends on obtaining a single protein crystal suitable for diffraction data collection. Due to this requirement, protein crystallization represents a key step in protein structure determination. The conditions for protein crystallization have to be determined empirically for each protein, making this step also a bottleneck in the structure determination process. Typical protein crystallization practice involves parallel setup and monitoring of a considerable number of individual protein crystallization experiments (also called crystallization trials). In these trials, aliquots of purified protein are mixed with a range of solutions composed of a precipitating agent, buffer, and sometimes an additive that have previously been successful in prompting protein crystallization. The individual chemical conditions in which a particular protein shows signs of crystallization are used as a starting point for further crystallization experiments, with the goal of optimizing the formation of individual protein crystals of sufficient size and quality for diffraction data collection. Thus the composition of the primary crystallization screen is critical for successful crystallization. Systematic analysis of crystallization experiments carried out on several hundred proteins as part of large-scale structural genomics efforts allowed us to optimize the protein crystallization protocol and identify a minimal set of 96 crystallization solutions (the "TRAP" screen) that, in our experience, led to crystallization of the maximum number of proteins.
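
    The combinatorial setup of a 96-condition screen can be sketched as follows. The component lists here are hypothetical placeholders; the actual TRAP screen conditions were selected empirically from successful trials, not generated on a grid:

```python
from itertools import product

# Hypothetical screen components, illustrative only
precipitants = ["PEG 3350 20%", "PEG 8000 15%", "(NH4)2SO4 2.0 M", "MPD 30%"]
buffers = ["citrate pH 4.5", "MES pH 6.0", "HEPES pH 7.5", "Tris pH 8.5"]
additives = [None, "NaCl 0.2 M", "MgCl2 0.2 M",
             "glycerol 5%", "LiCl 0.2 M", "CaCl2 0.2 M"]

# Full factorial grid: 4 precipitants x 4 buffers x 6 additives = 96 wells
screen = [
    {"precipitant": p, "buffer": b, "additive": a}
    for p, b, a in product(precipitants, buffers, additives)
]
print(len(screen))  # → 96
```

    Each dictionary corresponds to one well of a 96-well plate, set up in parallel against the same protein aliquot.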

  9. Aerobic Steps As Measured by Pedometry and Their Relation to Central Obesity

    PubMed Central

    DUCHEČKOVÁ, Petra; FOREJT, Martin

    2014-01-01

    Abstract Background The purpose of this study was to examine the relation between daily steps and aerobic steps, and anthropometric variables, using the waist-to-hip ratio (WHR) and waist-to-height ratio (WHtR). Methods The participants in this cross-sectional study had their measurements taken by a trained anthropologist and were then instructed to wear an Omron pedometer for seven consecutive days. A series of statistical tests (Mann-Whitney U test, Kruskal-Wallis ANOVA, multiple comparisons of z' values, and contingency tables) was performed to assess the relation between daily steps and aerobic steps, and anthropometric variables. Results A total of 507 individuals (380 females and 127 males) participated in the study. The average daily number of steps and aerobic steps was significantly lower in individuals with risky WHR and WHtR than in individuals with normal WHR (P=0.005) and WHtR (P=0.000). A comparison of age and anthropometric variables across aerobic-step activity categories was statistically significant for all the studied parameters. According to the contingency tables for normal steps, the low-activity category carries a 5.75x higher risk of WHtR>0.50 than the high-activity category. Conclusions Both normal and aerobic steps are significantly associated with central obesity and other body composition variables. This result is important for older people, who are more likely to perform low-intensity rather than moderate- or high-intensity activities. Our results also indicate that the risk of WHtR>0.50 can be reduced by almost 6x by increasing daily steps to more than 8985 steps per day. PMID:25927036
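
    The reported 5.75x figure is a relative risk derived from a 2x2 contingency table. A minimal sketch follows; the cell counts are hypothetical, chosen only to reproduce that ratio, since the abstract does not give the real counts:

```python
def relative_risk(events_exposed, total_exposed,
                  events_unexposed, total_unexposed):
    """Risk ratio: P(outcome | exposed) / P(outcome | unexposed)."""
    risk_exposed = events_exposed / total_exposed
    risk_unexposed = events_unexposed / total_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical counts: low-activity group 46/80 with WHtR > 0.50,
# high-activity group 10/100 with WHtR > 0.50
rr = relative_risk(46, 80, 10, 100)
print(f"relative risk = {rr:.2f}")  # → relative risk = 5.75
```

    Here the "exposed" group is the low-activity category and the outcome is WHtR>0.50, matching the comparison described in the abstract.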

  10. Single-Molecule Sensing with Nanopore Confinement: From Chemical Reactions to Biological Interactions.

    PubMed

    Lin, Yao; Ying, Yi-Lun; Gao, Rui; Long, Yi-Tao

    2018-03-25

    The nanopore generates an electrochemical confinement for single-molecule sensing that helps elucidate fundamental chemical principles at the nanoscale. By observing the generated ionic current, individual bond-making and bond-breaking steps, dynamic conformational changes of single biomolecules, and electron transfer processes that occur within the pore can be monitored with high temporal and current resolution. These single-molecule studies in nanopore confinement reveal information about fundamental chemical and biological processes that cannot be extracted from ensemble measurements. In this Concept article, we introduce and discuss electrochemical confinement effects on single-molecule covalent reactions, conformational dynamics of individual molecules, and host-guest interactions in protein nanopores. We then extend the concept of nanopore confinement to electrochemical redox reactions in solid-state nanopores for developing new sensing mechanisms. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Effects of processing on proximate and fatty acid compositions of six commercial sea cucumber species of Sri Lanka.

    PubMed

    Nishanthan, G; Kumara, P A D A; de Croos, M D S T; Prasada, D V P; Dissanayake, D C T

    2018-05-01

    Processing and its impacts on the proximate composition and fatty acid profile of six sea cucumber species (Bohadschia marmorata, Stichopus chloronotus, Holothuria spinifera, Thelenota anax, Holothuria scabra, and Bohadschia sp. 1) collected from the northwest coast of Sri Lanka were analyzed. Sea cucumbers are processed into bêche-de-mer by both domestic and industrial processors, following the same steps of cleaning, evisceration, first boiling, salting, second boiling, and drying. However, domestically processed bêche-de-mer consistently showed higher percentages of moisture, crude ash, and crude fat, and a lower percentage of crude protein, than industrially processed products. Although processing resulted in a significant reduction of total SFA and MUFA relative to fresh individuals of most of these species, total PUFA increased significantly in processed individuals, excluding Bohadschia species. Palmitic acid was the most dominant fatty acid in all these species, followed by eicosapentaenoic acid, which showed a significant increase in processed products, except in Bohadschia sp. 1. Total MUFA were higher than total SFA in all sea cucumber species, with the exceptions of Bohadschia sp. 1 and fresh S. chloronotus. These findings make a significant contribution to filling the gaps in existing information, as no previous information is available for species such as H. spinifera and S. chloronotus.

  12. Individual differences in perceiving and recognizing faces-One element of social cognition.

    PubMed

    Wilhelm, Oliver; Herzmann, Grit; Kunina, Olga; Danthiir, Vanessa; Schacht, Annekathrin; Sommer, Werner

    2010-09-01

    Recognizing faces swiftly and accurately is of paramount importance to humans as a social species. Individual differences in the ability to perform these tasks may therefore reflect important aspects of social or emotional intelligence. Although functional models of face cognition based on group and single case studies postulate multiple component processes, little is known about the ability structure underlying individual differences in face cognition. In 2 large individual differences experiments (N = 151 and N = 209), a broad variety of face-cognition tasks were tested and the component abilities of face cognition-face perception, face memory, and the speed of face cognition-were identified and then replicated. Experiment 2 also showed that the 3 face-cognition abilities are clearly distinct from immediate and delayed memory, mental speed, general cognitive ability, and object cognition. These results converge with functional and neuroanatomical models of face cognition by demonstrating the difference between face perception and face memory. The results also underline the importance of distinguishing between speed and accuracy of face cognition. Together our results provide a first step toward establishing face-processing abilities as an independent ability reflecting elements of social intelligence. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  13. Visually Guided Step Descent in Children with Williams Syndrome

    ERIC Educational Resources Information Center

    Cowie, Dorothy; Braddick, Oliver; Atkinson, Janette

    2012-01-01

    Individuals with Williams syndrome (WS) have impairments in visuospatial tasks and in manual visuomotor control, consistent with parietal and cerebellar abnormalities. Here we examined whether individuals with WS also have difficulties in visually controlling whole-body movements. We investigated visual control of stepping down at a change of…

  14. A Wandering Mind Does Not Stray Far from Home: The Value of Metacognition in Distant Search

    PubMed Central

    Kudesia, Ravi S.; Baer, Markus; Elfenbein, Hillary Anger

    2015-01-01

    When faced with a problem, how do individuals search for potential solutions? In this article, we explore the cognitive processes that lead to local search (i.e., identifying options closest to existing solutions) and distant search (i.e., identifying options of a qualitatively different nature than existing solutions). We suggest that mind wandering is likely to lead to local search because it operates by spreading activation from initial ideas to closely associated ideas. This reduces the likelihood of accessing a qualitatively different solution. However, instead of getting lost in thought, individuals can also step back and monitor their thoughts from a detached perspective. Such mindful metacognition, we suggest, is likely to lead to distant search because it redistributes activation away from initial ideas to other, less strongly associated, ideas. This hypothesis was confirmed across two studies. Thus, getting lost in thoughts is helpful when one is on the right track and needs only a local search whereas stepping back from thoughts is helpful when one needs distant search to produce a change in perspective. PMID:25974164

  15. Automation in biological crystallization.

    PubMed

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  16. Automation in biological crystallization

    PubMed Central

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  17. Process, including PSA and membrane separation, for separating hydrogen from hydrocarbons

    DOEpatents

    Baker, Richard W.; Lokhandwala, Kaaeid A.; He, Zhenjie; Pinnau, Ingo

    2001-01-01

    An improved process for separating hydrogen from hydrocarbons. The process includes a pressure swing adsorption step, a compression/cooling step and a membrane separation step. The membrane step relies on achieving a methane/hydrogen selectivity of at least about 2.5 under the conditions of the process.

  18. Perceived Chronic Stress Exposure Modulates Reward-Related Medial Prefrontal Cortex Responses to Acute Stress in Depression

    PubMed Central

    Kumar, Poornima; Slavich, George M.; Berghorst, Lisa H.; Treadway, Michael T.; Brooks, Nancy H.; Dutra, Sunny J.; Greve, Douglas N.; O'Donovan, Aoife; Bleil, Maria E.; Maninger, Nicole; Pizzagalli, Diego A.

    2015-01-01

    Introduction Major depressive disorder (MDD) is often precipitated by life stress and growing evidence suggests that stress-induced alterations in reward processing may contribute to such risk. However, no human imaging studies have examined how recent life stress exposure modulates the neural systems that underlie reward processing in depressed and healthy individuals. Methods In this proof-of-concept study, 12 MDD and 10 psychiatrically healthy individuals were interviewed using the Life Events and Difficulties Schedule (LEDS) to assess their perceived levels of recent acute and chronic life stress exposure. Additionally, each participant performed a monetary incentive delay task under baseline (no-stress) and stress (social-evaluative) conditions during functional MRI. Results Across groups, medial prefrontal cortex (mPFC) activation to reward feedback was greater during acute stress versus no-stress conditions in individuals with greater perceived stressor severity. Under acute stress, depressed individuals showed a positive correlation between perceived stressor severity levels and reward-related mPFC activation (r = 0.79, p = 0.004), whereas no effect was found in healthy controls. Moreover, for depressed (but not healthy) individuals, the correlations between the stress (r = 0.79) and no-stress (r = −0.48) conditions were significantly different. Finally, relative to controls, depressed participants showed significantly reduced mPFC grey matter, but functional findings remained when accounting for structural differences. Limitation Small sample size, which warrants replication. Conclusion Depressed individuals experiencing greater recent life stress recruited the mPFC more under stress when processing rewards. Our results represent an initial step toward elucidating mechanisms underlying stress sensitization and recurrence in depression. PMID:25898329

  19. The determination of measures of software reliability

    NASA Technical Reports Server (NTRS)

    Maxwell, F. D.; Corn, B. C.

    1978-01-01

    Measurement of software reliability was carried out during the development of data base software for a multi-sensor tracking system. The failure ratio and failure rate were found to be consistent measures. Trend lines could be established from these measurements that provide good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.

  20. Reliability measurement during software development. [for a multisensor tracking system

    NASA Technical Reports Server (NTRS)

    Hecht, H.; Sturm, W. A.; Trattner, S.

    1977-01-01

    During the development of data base software for a multi-sensor tracking system, reliability was measured. The failure ratio and failure rate were found to be consistent measures. Trend lines were established from these measurements that provided good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.

  1. Local symmetries and order-disorder transitions in small macroscopic Wigner islands.

    PubMed

    Coupier, Gwennou; Guthmann, Claudine; Noat, Yves; Jean, Michel Saint

    2005-04-01

    The influence of local order on the disordering scenario of small Wigner islands is discussed. A first disordering step is revealed by the time correlation functions and is linked to individual excitations resulting in configuration transitions, which are very sensitive to the local symmetries. This is followed by two other transitions, corresponding to orthoradial and radial diffusion, for which both individual and collective excitations play a significant role. Finally, we show that, contrary to large systems, the focus commonly placed on collective excitations in such small systems through the Lindemann criterion must be applied carefully in order to clearly identify the relative contributions to the whole disordering process.

  2. Collection and conversion of algal lipid

    NASA Astrophysics Data System (ADS)

    Lin, Ching-Chieh

    Sustainable economic activities mandate a significant replacement of fossil energy by renewable forms. Algae-derived biofuels are increasingly seen as an alternative source of energy with the potential to supplement the world's ever-increasing demand. Our primary objective was, once the algae had been cultivated, to eliminate or make more efficient the energy-intensive processing steps of collection, drying, grinding, and solvent extraction prior to conversion. To overcome this processing barrier, we propose to streamline the path from cultivated algae to biodiesel via algal biomass collection by sand filtration, cell rupturing with ozone, and immediate transesterification. To collect the algal biomass, the Chlorococcum aquaticum suspension was acidified to pH 3.3 to promote agglomeration prior to sand filtration. The algae-loaded filter bed was drained of free water, methanol was added, and the bed was ozonated for 2 min to rupture the cell membranes and accelerate release of the cellular contents. The methanol solution, now containing the dissolved lipid product, was collected by draining, while the filter bed was regenerated by further ozonation when needed. The results showed 95% collection of the algal biomass from the suspension and a 16% yield of lipid from the algae, as well as restoration of the filtration velocity of the sand bed via ozonation. The results further showed an increased lipid yield upon cell rupturing, and transesterified products composed entirely of fatty acid methyl ester (FAME) compounds, demonstrating that the rupture and transesterification processes can proceed consecutively in the same medium, requiring no separate steps of drying, extraction, and conversion. The FAME products from algae without exposure to ozone were mainly of 16 to 18 carbons containing up to 3 double bonds, while those from ozonated algae were smaller, highly saturated hydrocarbons. The new technique streamlines the individual steps from cultivated algal lipid to transesterified products and represents an improvement over existing energy-intensive steps.

  3. Navigating "Assisted Dying".

    PubMed

    Schipper, Harvey

    2016-02-01

    Carter is a bellwether decision, an adjudication on a narrow point of law whose implications are vast across society, and whose impact may not be realized for years. Coupled with Quebec's Act Respecting End-of-life Care, it has sharply changed the legal landscape with respect to actively ending a person's life. "Medically assisted dying" will be permitted under circumstances, and through processes, which have yet to be operationally defined. This decision carries with it moral assumptions, which mean that it will be difficult to reach a unifying consensus. For some, the decision and Act reflect a modern acknowledgement of individual autonomy. For others, allowing such acts is morally unspeakable. Having opened Pandora's box, the question becomes one of navigating a tolerable societal path. I believe it is possible to achieve a workable solution based on the core principle that "medically assisted dying" should be a very rarely employed last option, subject to transparent ongoing review, specifically as to why it was deemed necessary. My analysis is based on: 1. the societal conditions which have fostered demand for "assisted dying"; 2. actions in other jurisdictions; 3. Carter and Quebec Bill 52; 4. political considerations; 5. current medical practice. This leads to a series of recommendations regarding: 1. legislation and regulation; 2. the role of professional regulatory agencies; 3. medical professional education and practice; 4. public education; 5. health care delivery and palliative care. Given the weight of public opinion, and the legal steps already taken, a process for assisted dying is required. However, those legal and regulatory steps should only be considered a necessary and defensive first step in a two-stage process. The larger goal, the second step, is to drive the improvement of care, and thus minimize assisted dying.

  4. A practical and systematic approach to organisational capacity strengthening for research in the health sector in Africa.

    PubMed

    Bates, Imelda; Boyd, Alan; Smith, Helen; Cole, Donald C

    2014-03-03

    Despite increasing investment in health research capacity strengthening efforts in low and middle income countries, published evidence to guide the systematic design and monitoring of such interventions is very limited. Systematic processes are important to underpin capacity strengthening interventions because they provide stepwise guidance and allow for continual improvement. Our objective here was to use evidence to inform the design of a replicable but flexible process to guide health research capacity strengthening that could be customized for different contexts, and to provide a framework for planning, collecting information, making decisions, and improving performance. We used peer-reviewed and grey literature to develop a five-step pathway for designing and evaluating health research capacity strengthening programmes, tested in a variety of contexts in Africa. The five steps are: i) defining the goal of the capacity strengthening effort, ii) describing the optimal capacity needed to achieve the goal, iii) determining the existing capacity gaps compared to the optimum, iv) devising an action plan to fill the gaps and associated indicators of change, and v) adapting the plan and indicators as the programme matures. Our paper describes three contrasting case studies of organisational research capacity strengthening to illustrate how our five-step approach works in practice. Our five-step pathway starts with a clear goal and objectives, making explicit the capacity required to achieve the goal. Strategies for promoting sustainability are agreed with partners and incorporated from the outset. Our pathway for designing capacity strengthening programmes focuses not only on technical, managerial, and financial processes within organisations, but also on the individuals within organisations and the wider system within which organisations are coordinated, financed, and managed. Our five-step approach is flexible enough to generate and utilise ongoing learning. We have tested and critiqued our approach in a variety of organisational settings in the health sector in sub-Saharan Africa, but it needs to be applied and evaluated in other sectors and continents to determine the extent of transferability.

  5. Long-term Outcomes After Stepping Down Asthma Controller Medications: A Claims-Based, Time-to-Event Analysis.

    PubMed

    Rank, Matthew A; Johnson, Ryan; Branda, Megan; Herrin, Jeph; van Houten, Holly; Gionfriddo, Michael R; Shah, Nilay D

    2015-09-01

    Long-term outcomes after stepping down asthma medications are not well described. This study was a retrospective time-to-event analysis of individuals diagnosed with asthma who stepped down their asthma controller medications using a US claims database spanning 2000 to 2012. Four-month intervals were established and a step-down event was defined by a ≥ 50% decrease in days-supplied of controller medications from one interval to the next; this definition is inclusive of step-down that occurred without health-care provider guidance or as a consequence of a medication adherence lapse. Asthma stability in the period prior to step-down was defined by not having an asthma exacerbation (inpatient visit, ED visit, or dispensing of a systemic corticosteroid linked to an asthma visit) and having fewer than two rescue inhaler claims in a 4-month period. The primary outcome in the period following step-down was time-to-first asthma exacerbation. Thirty-two percent of the 26,292 included individuals had an asthma exacerbation in the 24-month period following step-down of asthma controller medication, though only 7% had an ED visit or hospitalization for asthma. The length of asthma stability prior to stepping down asthma medication was strongly associated with the risk of an asthma exacerbation in the subsequent 24-month period: < 4 months' stability, 44%; 4 to 7 months, 34%; 8 to 11 months, 30%; and ≥ 12 months, 21% (P < .001). In a large, claims-based, real-world study setting, 32% of individuals have an asthma exacerbation in the 2 years following a step-down event.
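
    The event definition above (a ≥50% decrease in controller days-supplied from one 4-month interval to the next) can be sketched as a simple scan over a claims history. The data here are hypothetical, for illustration only:

```python
def step_down_events(days_supplied):
    """Return the indices of intervals showing a >=50% drop in controller
    days-supplied relative to the preceding 4-month interval."""
    events = []
    for i in range(1, len(days_supplied)):
        prev, cur = days_supplied[i - 1], days_supplied[i]
        if prev > 0 and cur <= 0.5 * prev:
            events.append(i)
    return events

# Hypothetical claims history: controller days-supplied per 4-month interval
history = [120, 110, 115, 50, 60, 20]
print(step_down_events(history))  # → [3, 5]
```

    Note that, as in the study, this rule cannot distinguish a provider-guided step-down from a lapse in adherence; both register as events.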

  6. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images.

    PubMed

    Muncy, Nathan M; Hedges-Muncy, Ariana M; Kirwan, C Brock

    2017-01-01

    Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that these steps affect the volumetric output. To date, studies have compared between, not within, pipelines, so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. Our hypothesis was that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, each contributing three scans. All scans were pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (for MRI scanner consistency), and repeated pipeline runs (for algorithmic consistency). A main effect of pipeline step was detected, and interestingly, an interaction between pipeline step and ROI exists. No effect of either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing.

  7. Quick foot placement adjustments during gait are less accurate in individuals with focal cerebellar lesions.

    PubMed

    Hoogkamer, Wouter; Potocanac, Zrinka; Van Calenbergh, Frank; Duysens, Jacques

    2017-10-01

    Online gait corrections are frequently used to restore gait stability and prevent falling. They require shorter response times than voluntary movements which suggests that subcortical pathways contribute to the execution of online gait corrections. To evaluate the potential role of the cerebellum in these pathways we tested the hypotheses that online gait corrections would be less accurate in individuals with focal cerebellar damage than in neurologically intact controls and that this difference would be more pronounced for shorter available response times and for short step gait corrections. We projected virtual stepping stones on an instrumented treadmill while some of the approaching stepping stones were shifted forward or backward, requiring participants to adjust their foot placement. Varying the timing of those shifts allowed us to address the effect of available response time on foot placement error. In agreement with our hypothesis, individuals with focal cerebellar lesions were less accurate in adjusting their foot placement in reaction to suddenly shifted stepping stones than neurologically intact controls. However, the cerebellar lesion group's foot placement error did not increase more with decreasing available response distance or for short step versus long step adjustments compared to the control group. Furthermore, foot placement error for the non-shifting stepping stones was also larger in the cerebellar lesion group as compared to the control group. Consequently, the reduced ability to accurately adjust foot placement during walking in individuals with focal cerebellar lesions appears to be a general movement control deficit, which could contribute to increased fall risk. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Implantation of an ergonomics administration system in a company: report of an occupational therapist specialist in ergonomics.

    PubMed

    Moraes, Berla; Andrade, Valéria Sousa

    2012-01-01

    This article describes, step by step, the implementation of an ergonomics management system in a company between March 2009 and March 2011 by an occupational therapist specializing in ergonomics, based on the OHSAS 18001 guidelines and the Regulatory Norm 17 manual. The process began with the definition of five requisites based on the manual of application of Regulatory Norm 17: survey; individual transport and discharge of materials; workplace furniture; workplace equipment; and work environment and organization of the work, to be managed according to the OHSAS 18001 guidelines. The following steps were established: sensitization of the company's senior management; elaboration and institution of an ergonomics policy; development of ergonomics committees; ergonomic analysis of the work, with recommendations for ergonomic improvements; implementation of those improvements; and evaluation of the results. This experience suggests the importance not only of a guiding framework but also of professional qualification and of the company's participation in the implementation of an ergonomics management system.

  9. Improving Nutritional Status of Older Persons with Dementia Using a National Preventive Care Program.

    PubMed

    Johansson, L; Wijk, H; Christensson, L

    2017-01-01

    The aim of the study was to investigate the outcome of change in body weight associated with use of a structured preventive care process among persons with dementia assessed as at risk of malnutrition or malnourished. The preventive care process is a pedagogical model used in the Senior Alert (SA) quality register, where nutrition is one of the prioritized areas and includes four steps: assessment, analysis of underlying causes, actions performed and outcome. An analysis of data from SA with a pre-post design was performed. The participants were living in ordinary housing or special housing in Sweden. 1912 persons, 65 years and older, registered in both SA and the dementia quality register Svedem were included. A national preventive care program including individualized actions. The Mini Nutritional Assessment-Short Form was used to assess nutritional status at baseline. Body weight was measured during baseline and follow-up (7-106 days after baseline). 74.3% of the persons were malnourished or at risk of malnutrition. Those at risk of malnutrition or malnourished who were registered in all four steps of the preventive care process increased in body weight from baseline (Md 60.0 kg) to follow-up (Md 62.0 kg) (p=0.013). In those with incomplete registration, no increase in body weight was found. Using all steps in the structured preventive care process seems to improve nutritional status of persons with dementia assessed as at risk of malnutrition or malnourished. This study contributes to the development of evidence-based practice regarding malnutrition and persons with dementia.

  10. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; (4) setting up data access mechanisms; (5) setting up the data in data tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We describe the steps taken to automate the above process and realize efficiencies. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
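
    The curation steps above lend themselves to a simple ordered-checklist structure. A minimal sketch, with hypothetical step names paraphrased from the list (the actual ORNL DAAC/GHRC tooling is not described in the abstract):

```python
# Hypothetical tracker for the dataset-publication workflow; step names are
# paraphrased from the abstract, not the DAACs' real utilities.
from dataclasses import dataclass, field

STEPS = [
    "accept_package",
    "check_integrity",
    "fix_quality_issues",
    "assemble_metadata",
    "set_up_access",
    "register_in_catalogues",
    "mint_doi",
]

@dataclass
class DatasetRecord:
    """Tracks one dataset through the curation process, in order."""
    name: str
    completed: list = field(default_factory=list)

    def complete(self, step: str) -> None:
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"out of order: expected {expected!r}")
        self.completed.append(step)

    @property
    def published(self) -> bool:
        return len(self.completed) == len(STEPS)

record = DatasetRecord("field_campaign_2017")
for step in STEPS:
    record.complete(step)
print(record.published)  # True once every step has run in order
```

    Enforcing the step order is one way such a system can track individual datasets through curation and flag incomplete publications.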

  11. Expected values for pedometer-determined physical activity in older populations

    PubMed Central

    2009-01-01

    The purpose of this review is to update expected values for pedometer-determined physical activity in free-living healthy older populations. A search of the literature published since 2001 began with a keyword (pedometer, "step counter," "step activity monitor" or "accelerometer AND steps/day") search of PubMed, Cumulative Index to Nursing & Allied Health Literature (CINAHL), SportDiscus, and PsychInfo. An iterative process was then undertaken to abstract and verify studies of pedometer-determined physical activity (captured in terms of steps taken; distance only was not accepted) in free-living adult populations described as ≥ 50 years of age (studies that included samples which spanned this threshold were not included unless they provided at least some appropriately age-stratified data) and not specifically recruited based on any chronic disease or disability. We identified 28 studies representing at least 1,343 males and 3,098 females ranging in age from 50–94 years. Eighteen (or 64%) of the studies clearly identified using a Yamax pedometer model. Monitoring frames ranged from 3 days to 1 year; the modal length of time was 7 days (17 studies, or 61%). Mean pedometer-determined physical activity ranged from 2,015 steps/day to 8,938 steps/day. In those studies reporting such data, consistent patterns emerged: males generally took more steps/day than similarly aged females, steps/day decreased across study-specific age groupings, and BMI-defined normal weight individuals took more steps/day than overweight/obese older adults. The range of 2,000–9,000 steps/day likely reflects the true variability of physical activity behaviors in older populations. More explicit patterns, for example sex- and age-specific relationships, remain to be informed by future research endeavors. PMID:19706192

  12. In vitro analysis of human immunodeficiency virus particle dissociation: gag proteolytic processing influences dissociation kinetics.

    PubMed

    Müller, Barbara; Anders, Maria; Reinstein, Jochen

    2014-01-01

    Human immunodeficiency virus particles undergo a step of proteolytic maturation, in which the main structural polyprotein Gag is cleaved into its mature subunits matrix (MA), capsid (CA), nucleocapsid (NC) and p6. Gag proteolytic processing is accompanied by a dramatic structural rearrangement within the virion, which is necessary for virus infectivity and has been proposed to proceed through a sequence of dissociation and reformation of the capsid lattice. Morphological maturation appears to be tightly regulated, with sequential cleavage events and two small spacer peptides within Gag playing important roles by regulating the disassembly of the immature capsid layer and formation of the mature capsid lattice. In order to measure the influence of individual Gag domains on lattice stability, we established Förster resonance energy transfer (FRET) reporter virions and employed rapid kinetic FRET and light scatter measurements. This approach allowed us to measure dissociation properties of HIV-1 particles assembled in eukaryotic cells containing Gag proteins in different states of proteolytic processing. While the complex dissociation behavior of the particles prevented an assignment of kinetic rate constants to individual dissociation steps, our analyses revealed characteristic differences in the dissociation properties of the MA layer dependent on the presence of additional domains. The most striking effect observed here was a pronounced stabilization of the MA-CA layer mediated by the presence of the 14 amino acid long spacer peptide SP1 at the CA C-terminus, underlining the crucial role of this peptide for the resolution of the immature particle architecture.
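
    The reporter virions rely on FRET, whose steep distance dependence, E = 1/(1 + (r/R0)^6), is what makes the signal sensitive to lattice dissociation. A short illustration of that standard formula (generic, not the paper's analysis; the 5 nm Förster radius is an assumed example value):

```python
def fret_efficiency(r: float, r0: float) -> float:
    """Förster transfer efficiency for donor-acceptor distance r and
    Förster radius r0 (same units): E = 1 / (1 + (r/r0)**6)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# Efficiency collapses over a narrow range around r = r0 (50% transfer),
# so separating labeled subunits sharply reduces the FRET signal:
for r in (2.5, 5.0, 10.0):  # nm, assuming r0 = 5 nm
    print(f"r = {r:4.1f} nm -> E = {fret_efficiency(r, 5.0):.3f}")
```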

  13. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    PubMed

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and competitiveness with 1st generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinder the achievement of ethanol concentrations comparable with those obtained using 1st generation feedstocks (e.g. corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose/and or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars are discussed and compared with separate hydrolysis and fermentation (SHF) processes. 
The impact of individual technological steps on final process efficiency is emphasized and the potential for use of immobilized biocatalysts is considered. Copyright © 2014 Elsevier Inc. All rights reserved.
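
    The efficiency measures discussed (ethanol concentration, yield, productivity) are conventionally reported against the stoichiometric maximum: hydrolysis gains mass (1.111 g glucose per g glucan, from 180/162), and fermentation yields at most 0.511 g ethanol per g glucose. A worked sketch with invented fermentation numbers (the constants are standard stoichiometry; everything else is illustrative):

```python
# Illustrative ethanol-yield bookkeeping; the biomass load, glucan fraction,
# and measured output are invented, not values from the review.
GLUCAN_TO_GLUCOSE = 1.111   # hydration gain on hydrolysis (180/162)
GLUCOSE_TO_ETOH = 0.511     # stoichiometric maximum, g ethanol per g glucose

def theoretical_ethanol(biomass_g: float, glucan_frac: float) -> float:
    """Maximum ethanol (g) obtainable from the glucan fraction of a load."""
    return biomass_g * glucan_frac * GLUCAN_TO_GLUCOSE * GLUCOSE_TO_ETOH

biomass = 100.0                                            # g dry biomass
max_etoh = theoretical_ethanol(biomass, glucan_frac=0.40)  # 40% glucan
measured = 15.0                                            # g ethanol after 48 h
print(f"theoretical max: {max_etoh:.1f} g")
print(f"yield: {100 * measured / max_etoh:.0f}% of theoretical")
print(f"productivity: {measured / 48:.2f} g/h")
```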

  14. Failure mode and effects analysis: A community practice perspective.

    PubMed

    Schuller, Bradley W; Burns, Angi; Ceilley, Elizabeth A; King, Alan; LeTourneau, Joan; Markovic, Alexander; Sterkel, Lynda; Taplin, Brigid; Wanner, Jennifer; Albert, Jeffrey M

    2017-11-01

    To report our early experiences with failure mode and effects analysis (FMEA) in a community practice setting. The FMEA facilitator received extensive training at the AAPM Summer School. Early efforts focused on department education and emphasized the need for process evaluation in the context of high profile radiation therapy accidents. A multidisciplinary team was assembled with representation from each of the major department disciplines. Stereotactic radiosurgery (SRS) was identified as the most appropriate treatment technique for the first FMEA evaluation, as it is largely self-contained and has the potential to produce high impact failure modes. Process mapping was completed using breakout sessions, and then compiled into a simple electronic format. Weekly sessions were used to complete the FMEA evaluation. Risk priority number (RPN) values > 100 or severity scores of 9 or 10 were considered high risk. The overall time commitment was also tracked. The final SRS process map contained 15 major process steps and 183 subprocess steps. Splitting the process map into individual assignments was a successful strategy for our group. The process map was designed to contain enough detail such that another radiation oncology team would be able to perform our procedures. Continuous facilitator involvement helped maintain consistent scoring during FMEA. Practice changes were made responding to the highest RPN scores, and new resulting RPN scores were below our high-risk threshold. The estimated person-hour equivalent for project completion was 258 hr. This report provides important details on the initial steps we took to complete our first FMEA, providing guidance for community practices seeking to incorporate this process into their quality assurance (QA) program. 
Determining the feasibility of implementing complex QA processes into different practice settings will take on increasing significance as the field of radiation oncology transitions into the new TG-100 QA paradigm. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
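
    The screening rule used above is simple arithmetic: RPN = severity × occurrence × detectability, each scored 1-10, with a failure mode flagged when RPN > 100 or severity is 9 or 10. A sketch of that rule; the failure modes below are invented examples, not items from the authors' SRS process map:

```python
# FMEA risk screening as described in the abstract: RPN > 100 or
# severity >= 9 counts as high risk. Failure modes here are invented.
def is_high_risk(severity: int, occurrence: int, detectability: int) -> bool:
    rpn = severity * occurrence * detectability
    return rpn > 100 or severity >= 9

failure_modes = {
    "wrong isocenter coordinates": (9, 2, 3),  # flagged on severity alone
    "stale imaging used for plan": (6, 3, 4),  # RPN 72, below threshold
    "couch collision":             (4, 2, 2),  # RPN 16
}
for name, scores in failure_modes.items():
    print(name, "->", "HIGH RISK" if is_high_risk(*scores) else "ok")
```

    Note the severity-only trigger: a catastrophic but rare, detectable failure still demands mitigation even though its RPN is modest.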

  15. Validity of Different Activity Monitors to Count Steps in an Inpatient Rehabilitation Setting.

    PubMed

    Treacy, Daniel; Hassett, Leanne; Schurr, Karl; Chagpar, Sakina; Paul, Serene S; Sherrington, Catherine

    2017-05-01

    Commonly used activity monitors have been shown to be accurate in counting steps in active people; however, further validation is needed in slower walking populations. To determine the validity of activity monitors for measuring step counts in rehabilitation inpatients compared with visually observed step counts. To explore the influence of gait parameters, activity monitor position, and use of walkers on activity monitor accuracy. One hundred and sixty-six inpatients admitted to a rehabilitation unit with an average walking speed of 0.4 m/s (SD 0.2) wore 16 activity monitors (7 different devices in different positions) simultaneously during 6-minute and 6-m walks. The number of steps taken during the tests was also counted by a physical therapist. Gait parameters were assessed using the GAITRite system. To analyze the influence of different gait parameters, the percentage accuracy for each monitor was graphed against various gait parameters for each activity monitor. The StepWatch, the Fitbit One worn on the ankle, and the ActivPAL showed excellent agreement with the observed step count (ICC(2,1) = 0.98, 0.92, and 0.78, respectively). Other devices (Fitbit Charge, Fitbit One worn on hip, G-Sensor, Garmin Vivofit, Actigraph) showed poor agreement with the observed step count (ICC(2,1) = 0.12-0.40). Percentage agreement with observed step count was highest for the StepWatch (mean 98%). The StepWatch and the Fitbit One worn on the ankle maintained accuracy in individuals who walked more slowly and with shorter strides, but other devices were less accurate in these individuals. There were small numbers of participants for some gait parameters. The StepWatch showed the highest accuracy and closest agreement with observed step count. This device can be confidently used by researchers for accurate measurement of step counts in inpatient rehabilitation in individuals who walk slowly. 
If immediate feedback is desired, the Fitbit One when worn on the ankle would be the best choice for this population. © 2017 American Physical Therapy Association
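
    The agreement statistic used here, ICC(2,1) (two-way random effects, absolute agreement, single measures, per Shrout and Fleiss), can be computed directly from a subjects-by-raters matrix. A sketch with synthetic step counts (the data are invented; the formula is the standard one):

```python
import numpy as np

def icc_2_1(y: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    y has shape (n_subjects, k_raters), e.g. observed vs device step counts."""
    n, k = y.shape
    grand = y.mean()
    ss_rows = k * ((y.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((y.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_err = ((y - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# A monitor that reproduces the observed counts exactly gives ICC = 1;
# a constant bias lowers it, because the statistic penalizes disagreement
# in absolute value, not just in ranking.
observed = np.array([120.0, 310.0, 95.0, 240.0, 180.0])
perfect = np.column_stack([observed, observed])
biased = np.column_stack([observed, observed + 50.0])
print(round(icc_2_1(perfect), 3), round(icc_2_1(biased), 3))
```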

  16. Pharmaceutical 3D printing: Design and qualification of a single step print and fill capsule.

    PubMed

    Smith, Derrick M; Kapoor, Yash; Klinzing, Gerard R; Procopio, Adam T

    2018-06-10

    Fused deposition modeling (FDM) 3D printing (3DP) has the potential to change how we envision manufacturing in the pharmaceutical industry. A more common utilization for FDM 3DP is to build upon existing hot melt extrusion (HME) technology where the drug is dispersed in the polymer matrix. However, reliable manufacturing of drug-containing filaments remains a challenge, along with the limited range of active ingredients that can withstand the processing stresses involved in the HME process. To circumvent this obstacle, a single step FDM 3DP process was developed to manufacture thin-walled drug-free capsules which can be filled with dry or liquid drug product formulations. Drug release from these systems is governed by the combined dissolution of the FDM capsule 'shell' and the dosage form encapsulated in these shells. To prepare the shells, the 3D printer files (extension '.gcode') were modified by creating discrete zones, a so-called 'zoning process', with individual print parameters. Capsules printed without the zoning process resulted in macroscopic print defects and holes. X-ray computed tomography, finite element analysis and mechanical testing were used to guide the zoning process and printing parameters in order to manufacture consistent and robust capsule shell geometries. Additionally, dose consistencies of drug-containing liquid formulations were investigated in this work. Copyright © 2018 Elsevier B.V. All rights reserved.
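
    The zoning idea amounts to assigning different print parameters to different regions of the .gcode file. A toy illustration, assuming zones are defined by Z height and the parameter varied is the feed rate (G-code F value): the zone boundaries, speeds, and rewriting rule here are all invented, since the authors' actual .gcode edits are not specified in the abstract.

```python
import re

# Hypothetical "zoning": give each Z band of a capsule shell its own print
# speed (F parameter, mm/min). Boundaries and feeds are illustrative only.
ZONES = [(0.0, 2.0, 900), (2.0, 8.0, 1800), (8.0, 10.0, 600)]  # z_lo, z_hi, feed

def zone_feed(z: float) -> int:
    for z_lo, z_hi, feed in ZONES:
        if z_lo <= z < z_hi:
            return feed
    return ZONES[-1][2]

def rezone(gcode: str) -> str:
    """Rewrite F values on extrusion moves (G1 ... E...) by current Z."""
    z, out = 0.0, []
    for line in gcode.splitlines():
        m = re.search(r"Z([-\d.]+)", line)
        if m:
            z = float(m.group(1))          # track the active layer height
        if line.startswith("G1") and " E" in line:   # extrusion move
            line = re.sub(r"F\d+", f"F{zone_feed(z)}", line)
        out.append(line)
    return "\n".join(out)

sample = "G1 Z0.2 F3000\nG1 X10 Y0 E0.5 F1500\nG1 Z5.0 F3000\nG1 X0 Y0 E1.0 F1500"
print(rezone(sample))
```

    Travel moves keep their original feed; only extrusion moves inherit the zone's parameters, which is the kind of discrete, per-region control the abstract describes.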

  17. Are we really measuring empathy? Proposal for a new measurement framework.

    PubMed

    Coll, Michel-Pierre; Viding, Essi; Rütgen, Markus; Silani, Giorgia; Lamm, Claus; Catmur, Caroline; Bird, Geoffrey

    2017-12-01

    Empathy - currently defined as the sharing of another's affective state - has been the focus of much psychological and neuroscientific research in the last decade, much of which has been focused on ascertaining the empathic ability of individuals with various clinical conditions. However, most of this work tends to overlook the fact that empathy is the result of a complex process requiring a number of intermediate processing steps. It is therefore the case that describing an individual or group as 'lacking empathy' lacks specificity. We argue for an alternative measurement framework, in which we explain variance in empathic response in terms of individual differences in the ability to identify another's emotional state ('emotion identification'), and the degree to which identification of another's state causes a corresponding state in the self ('affect sharing'). We describe how existing empathy paradigms need to be modified in order to fit within this measurement framework, and illustrate the utility of this approach with reference to examples from both cognitive neuroscience and clinical psychology. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Lightning electromagnetic radiation field spectra in the interval from 0. 2 to 20 MHz

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willett, J.C.; Bailey, J.C.; Leteinturier, C.

    1990-11-20

    Average energy spectral densities are presented for the fast transitions in most of the components that produce large radiation field impulses from cloud-to-ground lightning; first and subsequent return strokes; stepped, dart-stepped, and 'chaotic' leaders; and 'characteristic' cloud pulses. A disagreement in the previous literature about the spectral energy radiated by return strokes at high frequencies is noted and explained. The authors show that the spectral amplitudes are not seriously distorted by propagation over less than 35 km of seawater, although as much as 45 km of such propagation does appear to produce significant attenuation above about 10 MHz. First and subsequent return strokes produce identical spectra between 0.2 and 20 MHz. The spectra of stepped and dart-stepped leader steps are nearly identical and are very similar to that of characteristic pulses. The spectra of leader steps also match return stroke spectra above 2-3 MHz after the former are increased by about 7 dB. The shapes of individual spectra do not depend on their amplitude, so the shapes of the average spectra are probably not distorted by the trigger thresholds used in the data acquisition. Return strokes are the strongest sources of radiation from cloud-to-ground lightning in the 0.2- to 20-MHz frequency range, although certain intracloud processes are stronger radiators above 8 MHz.

  19. Thermal behaviour and kinetics of coal/biomass blends during co-combustion.

    PubMed

    Gil, M V; Casal, D; Pevida, C; Pis, J J; Rubiera, F

    2010-07-01

    The thermal characteristics and kinetics of coal, biomass (pine sawdust) and their blends were evaluated under combustion conditions using a non-isothermal thermogravimetric method (TGA). Biomass was blended with coal in the range of 5-80 wt.% to evaluate their co-combustion behaviour. No significant interactions were detected between the coal and biomass, since no deviations from their expected behaviour were observed in these experiments. Biomass combustion takes place in two steps: between 200 and 360 degrees C the volatiles are released and burned, and at 360-490 degrees C char combustion takes place. In contrast, coal is characterized by only one combustion stage at 315-615 degrees C. The coal/biomass blends presented three combustion steps, corresponding to the sum of the biomass and coal individual stages. Several solid-state mechanisms were tested by the Coats-Redfern method in order to find out the mechanisms responsible for the oxidation of the samples. The kinetic parameters were determined assuming single separate reactions for each stage of thermal conversion. The combustion process of coal consists of one reaction, whereas, in the case of the biomass and coal/biomass blends, this process consists of two or three independent reactions, respectively. The results showed that the chemical first order reaction is the most effective mechanism for the first step of biomass oxidation and for coal combustion. However, diffusion mechanisms were found to be responsible for the second step of biomass combustion. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
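
    The Coats-Redfern method applied here linearizes the kinetics: for the first-order mechanism, g(α) = -ln(1 - α), and a plot of ln[g(α)/T²] against 1/T has slope -E/R. The sketch below checks that the linearization recovers a known activation energy from synthetic TGA data; the kinetic constants are invented for illustration, not the paper's fitted values.

```python
import numpy as np

# Coats-Redfern check with synthetic first-order TGA data:
# ln[g(a)/T^2] = ln(A*R/(b*E)) - E/(R*T), slope of the fit = -E/R.
R = 8.314                   # J/(mol K)
E_true = 100e3              # J/mol (invented)
A, beta = 1e5, 10 / 60.0    # pre-exponential 1/s, heating rate K/s (10 K/min)

T = np.linspace(620.0, 760.0, 200)                 # K
g = (A * R / (beta * E_true)) * T**2 * np.exp(-E_true / (R * T))
alpha = 1.0 - np.exp(-g)                           # conversion, stays in (0, 1)

y = np.log(-np.log(1.0 - alpha) / T**2)            # ln[g(a)/T^2]
slope, _ = np.polyfit(1.0 / T, y, 1)
E_fit = -slope * R
print(f"recovered E = {E_fit / 1e3:.1f} kJ/mol")
```

    The same linear fit, repeated with diffusion-type g(α) expressions, is how the competing solid-state mechanisms are compared in studies like this one.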

  20. Processing black mulberry into jam: effects on antioxidant potential and in vitro bioaccessibility.

    PubMed

    Tomas, Merve; Toydemir, Gamze; Boyacioglu, Dilek; Hall, Robert D; Beekwilder, Jules; Capanoglu, Esra

    2017-08-01

    Black mulberries (Morus nigra) were processed into jam on an industrialised scale, including the major steps of: selection of frozen black mulberries, adding glucose-fructose syrup and water, cooking, adding citric acid and apple pectin, removing seeds, and pasteurisation. Qualitative and quantitative determinations of antioxidants in black mulberry samples were performed using spectrophotometric methods, as well as HPLC- and LC-QTOF-MS-based measurements. These analyses included the determination of total polyphenolic content, % polymeric colour, total and individual anthocyanin contents, antioxidant capacity, and in vitro bioaccessibility in processing samples. Jam processing led to a significant reduction in total phenolics (88%), total flavonoids (89%), anthocyanins (97%), and antioxidant capacity (88-93%) (P < 0.05). Individual anthocyanin contents, determined using HPLC analysis, also showed a significant decrease (∼99% loss). In contrast, % recovery of bioaccessible total phenolics, anthocyanins, and antioxidant capacity (ABTS assay) increased after jam processing (16%, 12%, and 37%, respectively). Fruit processing resulted in losses of polyphenols, anthocyanins, and antioxidant capacity of black mulberry jam. Optimisation of food processing could help to protect the phenolic compounds in fruits which might be helpful for the food industry to minimise the antioxidant loss and improve the final product quality. © 2016 Society of Chemical Industry.

  1. Mechanistic equivalent circuit modelling of a commercial polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.

    2018-03-01

    Electrochemical impedance spectroscopy (EIS) has been widely used in the fuel cell field since it allows deconvolving the different physico-chemical processes that affect the fuel cell performance. Typically, EIS spectra are modelled using electric equivalent circuits. In this work, EIS spectra of an individual cell of a commercial PEM fuel cell stack were obtained experimentally. The goal was to obtain a mechanistic electric equivalent circuit in order to model the experimental EIS spectra. A mechanistic electric equivalent circuit is a semiempirical modelling technique based on obtaining an equivalent circuit that not only correctly fits the experimental spectra, but whose elements have a mechanistic physical meaning. In order to obtain the aforementioned electric equivalent circuit, 12 different models with defined physical meanings were proposed. These equivalent circuits were fitted to the obtained EIS spectra. A two-step selection process was performed. In the first step, a group of four circuits was preselected out of the initial list of 12, based on general fitting indicators such as the determination coefficient and the fitted parameter uncertainty. In the second step, one of the four preselected circuits was selected on account of the consistency of the fitted parameter values with the physical meaning of each parameter.
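
    The 12 candidate circuits are not specified in the abstract, but the idea of tying circuit elements to physical meaning can be illustrated with the simplest Randles-type circuit, Rs in series with (Rct parallel to Cdl): Rs is the high-frequency real-axis intercept (ohmic/membrane resistance), Rs + Rct the low-frequency intercept (adding charge-transfer resistance), and Cdl follows from the apex frequency of the semicircle, ω = 1/(Rct·Cdl). A sketch on a synthetic spectrum with invented parameter values:

```python
import numpy as np

# Minimal equivalent-circuit sketch: Rs in series with (Rct || Cdl).
# Parameter values are illustrative, not the commercial cell's.
Rs, Rct, Cdl = 0.01, 0.10, 0.5          # ohm, ohm, F

w = np.logspace(-1, 5, 2000)            # angular frequency, rad/s
Z = Rs + Rct / (1 + 1j * w * Rct * Cdl) # complex impedance spectrum

# Read the mechanistically meaningful parameters back off the spectrum:
Rs_est = Z.real[-1]                     # high-frequency real-axis intercept
Rct_est = Z.real[0] - Rs_est            # low-frequency intercept minus Rs
w_apex = w[np.argmax(-Z.imag)]          # -Im(Z) peaks at w = 1/(Rct*Cdl)
Cdl_est = 1.0 / (w_apex * Rct_est)
print(Rs_est, Rct_est, Cdl_est)
```

    In practice all 12 circuits would be least-squares fitted to the measured spectrum; the point of the sketch is that each fitted element of the winning circuit should map onto a feature of the data in this way.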

  2. Effectiveness of a smartphone app in increasing physical activity amongst male adults: a randomised controlled trial.

    PubMed

    Harries, Tim; Eslambolchilar, Parisa; Rettie, Ruth; Stride, Chris; Walton, Simon; van Woerden, Hugo C

    2016-09-02

    Smartphones are ideal for promoting physical activity in those with little intrinsic motivation for exercise. This study tested three hypotheses: H1 - receipt of social feedback generates higher step-counts than receipt of no feedback; H2 - receipt of social feedback generates higher step-counts than only receiving feedback on one's own walking; H3 - receipt of feedback on one's own walking generates higher step-counts than no feedback. A parallel group randomised controlled trial measured the impact of feedback on steps-counts. Healthy male participants (n = 165) aged 18-40 were given phones pre-installed with an app that recorded steps continuously, without the need for user activation. Participants carried these with them as their main phones for a two-week run-in and six-week trial. Randomisation was to three groups: no feedback (control); personal feedback on step-counts; group feedback comparing step-counts against those taken by others in their group. The primary outcome measure, steps per day, was assessed using longitudinal multilevel regression analysis. Control variables included attitude to physical activity and perceived barriers to physical activity. Fifty-five participants were allocated to each group; 152 completed the study and were included in the analysis: n = 49, no feedback; n = 53, individual feedback; n = 50, individual and social feedback. The study provided support for H1 and H3 but not H2. Receipt of either form of feedback explained 7.7 % of between-subject variability in step-count (F = 6.626, p < 0.0005). Compared to the control, the expected step-count for the individual feedback group was 60 % higher (effect on log step-count = 0.474, 95 % CI = 0.166-0.782) and that for the social feedback group, 69 % higher (effect on log step-count = 0.526, 95 % CI = 0.212-0.840). The difference between the two feedback groups (individual vs social feedback) was not statistically significant. 
Always-on smartphone apps that provide step-counts can increase physical activity in young to early-middle-aged men but the provision of social feedback has no apparent incremental impact. This approach may be particularly suitable for inactive people with low levels of physical activity; it should now be tested with this population.
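
    The percentage effects quoted above follow from exponentiating the log-scale regression coefficients, since a coefficient b on log step-count corresponds to a multiplicative factor exp(b) on raw counts:

```python
import math

# Back-transforming effects estimated on log step-counts: a coefficient b
# on the log scale is a multiplicative effect exp(b) on raw counts, i.e.
# a percentage change of 100 * (exp(b) - 1). Coefficients from the abstract.
def pct_change(b: float) -> float:
    return 100.0 * (math.exp(b) - 1.0)

print(f"individual feedback: +{pct_change(0.474):.1f}%")   # roughly 60%
print(f"social feedback:     +{pct_change(0.526):.1f}%")   # roughly 69%
# Confidence limits transform the same way:
lo, hi = (pct_change(x) for x in (0.166, 0.782))
print(f"individual feedback 95% CI: +{lo:.1f}% to +{hi:.1f}%")
```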

  3. DNA-Mediated Patterning of Single Quantum Dot Nanoarrays: A Reusable Platform for Single-Molecule Control

    NASA Astrophysics Data System (ADS)

    Huang, Da; Freeley, Mark; Palma, Matteo

    2017-03-01

    We present a facile strategy of general applicability for the assembly of individual nanoscale moieties in array configurations with single-molecule control. Combining the programming ability of DNA as a scaffolding material with a one-step lithographic process, we demonstrate the patterning of single quantum dots (QDs) at predefined locations on silicon and transparent glass surfaces: as proof of concept, clusters of either one, two, or three QDs were assembled in highly uniform arrays with a 60 nm interdot spacing within each cluster. Notably, the platform developed is reusable after a simple cleaning process and can be designed to exhibit different geometrical arrangements.

  4. A novel frequency analysis method for assessing K(ir)2.1 and Na(v)1.5 currents.

    PubMed

    Rigby, J R; Poelzing, S

    2012-04-01

    Voltage clamping is an important tool for measuring individual currents from an electrically active cell. However, it is difficult to isolate individual currents without pharmacological or voltage inhibition. Herein, we present a technique that involves inserting a noise function into a standard voltage step protocol, which allows one to characterize the unique frequency response of an ion channel at different step potentials. Specifically, we compute the fast Fourier transform for a family of current traces at different step potentials for the inward rectifying potassium channel, K(ir)2.1, and the channel encoding the cardiac fast sodium current, Na(v)1.5. Each individual frequency magnitude, as a function of voltage step, is correlated to the peak current produced by each channel. The correlation coefficient vs. frequency relationship reveals that these two channels are associated with some unique frequencies with high absolute correlation. The individual IV relationship can then be recreated using only the unique frequencies with magnitudes of high absolute correlation. Thus, this study demonstrates that ion channels may exhibit unique frequency responses.
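
    The analysis described (FFT each current trace, correlate each frequency bin's magnitude with the peak current across step potentials, then rebuild the IV relation from highly correlated bins) can be sketched on synthetic traces. The waveform and peak values below are invented stand-ins, not K(ir)2.1 recordings:

```python
import numpy as np

# Frequency-domain sketch: one trace per voltage step, each sharing a
# waveform shape but scaled by a (synthetic, all-positive) peak current.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512, endpoint=False)
template = np.exp(-t / 0.1)                       # shared activation waveform
peaks = np.array([0.5, 1.0, 2.0, 3.5, 5.0])       # peak current per step (nA)

traces = peaks[:, None] * template[None, :]
traces += 0.01 * rng.standard_normal(traces.shape)  # small recording noise

mags = np.abs(np.fft.rfft(traces, axis=1))        # (n_steps, n_freqs)
corr = np.array([np.corrcoef(mags[:, f], peaks)[0, 1]
                 for f in range(mags.shape[1])])  # per-bin correlation with IV

best = int(np.argmax(corr))                       # most IV-like frequency bin
iv_proxy = mags[:, best]                          # reconstructed (scaled) IV
r = np.corrcoef(iv_proxy, peaks)[0, 1]
print(f"bin {best}: r = {r:.3f}")
```

    With real channels the interesting result is that different channel types weight different bins, which is what lets the frequency signature distinguish K(ir)2.1 from Na(v)1.5.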

  5. Fully automated rodent brain MR image processing pipeline on a Midas server: from acquired images to region-based statistics.

    PubMed

    Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek

    2013-01-01

    Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.
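
    The final pipeline stage, region-based statistics, reduces to aggregating image intensities over the propagated parcellation (an integer label map). A toy numpy sketch; the arrays are small stand-ins for the registered rodent-brain volumes:

```python
import numpy as np

# Per-region mean intensity from a subject "image" and its propagated
# parcellation; toy 3x3 arrays stand in for full 3D volumes.
image = np.array([[10.0, 12.0, 30.0],
                  [11.0, 31.0, 29.0],
                  [50.0, 52.0, 51.0]])
labels = np.array([[1, 1, 2],
                   [1, 2, 2],
                   [3, 3, 3]])

stats = {int(region): float(image[labels == region].mean())
         for region in np.unique(labels)}
print(stats)  # {1: 11.0, 2: 30.0, 3: 51.0}
```

    The same masking pattern extends to any per-region statistic (volume, standard deviation, median) the pipeline reports to researchers.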

  6. Orbital construction support equipment

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Approximately 200 separate construction steps were defined for the three solar power satellite (SPS) concepts. Detailed construction scenarios were developed which describe the specific tasks to be accomplished, and identify general equipment requirements. The scenarios were used to perform a functional analysis, which resulted in the definition of 100 distinct SPS elements. These elements are the components, parts, subsystems, or assemblies upon which construction activities take place. The major SPS elements for each configuration are shown. For those elements, 300 functional requirements were identified in seven generic processes. Cumulatively, these processes encompass all functions required during SPS construction/assembly. Individually each process is defined such that it includes a specific type of activity. Each SPS element may involve activities relating to any or all of the generic processes. The processes are listed, and examples of the requirements defined for a typical element are given.

  7. Effects of Stand and Step Training with Epidural Stimulation on Motor Function for Standing in Chronic Complete Paraplegics

    PubMed Central

    Rejc, Enrico; Angeli, Claudia A.; Bryant, Nicole

    2017-01-01

    Abstract Individuals affected by motor complete spinal cord injury are unable to stand, walk, or move their lower limbs voluntarily; this diagnosis normally implies severe limitations for functional recovery. We have recently shown that the appropriate selection of epidural stimulation parameters was critical to promoting full-body, weight-bearing standing with independent knee extension in four individuals with chronic clinically complete paralysis. In the current study, we examined the effects of stand training and subsequent step training with epidural stimulation on motor function for standing in the same four individuals. After stand training, the ability to stand improved to different extents in the four participants. Step training performed afterwards substantially impaired standing ability in three of the four individuals. Improved standing ability generally coincided with continuous electromyography (EMG) patterns with constant levels of ground reaction forces. Conversely, poorer standing ability was associated with more variable EMG patterns that alternated EMG bursts with longer periods of negligible activity in most of the muscles. Stand and step training also differentially affected the amplitude modulation of evoked potentials induced by the sitting-to-standing transition. Finally, stand and step training with epidural stimulation were not sufficient to improve motor function for standing without stimulation. These findings show that the spinal circuitry of motor complete paraplegics can generate motor patterns effective for standing in response to task-specific training with optimized stimulation parameters. Conversely, step training can lead to neural adaptations resulting in impaired motor function for standing. PMID:27566051

  8. 12-Step Interventions and Mutual Support Programs for Substance Use Disorders: An Overview

    PubMed Central

    Donovan, Dennis M.; Ingalsbe, Michelle H.; Benbow, James; Daley, Dennis C.

    2013-01-01

    Social workers and other behavioral health professionals are likely to encounter individuals with substance use disorders in a variety of practice settings outside of specialty treatment. 12-Step mutual support programs represent readily available, no cost community-based resources for such individuals; however, practitioners are often unfamiliar with such programs. The present article provides a brief overview of 12-Step programs, the positive substance use and psychosocial outcomes associated with active 12-Step involvement, and approaches ranging from ones that can be utilized by social workers in any practice setting to those developed for specialty treatment programs to facilitate engagement in 12-Step meetings and recovery activities. The goal is to familiarize social workers with 12-Step approaches so that they are better able to make informed referrals that match clients to mutual support groups that best meet the individual’s needs and maximize the likelihood of engagement and positive outcomes. PMID:23731422

  9. Designing Scenarios for Controller-in-the-Loop Air Traffic Simulations

    NASA Technical Reports Server (NTRS)

    Kupfer, Michael; Mercer, Joey S.; Cabrall, Christopher; Callantine, Todd

    2013-01-01

    Well-prepared traffic scenarios contribute greatly to the success of controller-in-the-loop simulations. This paper describes each stage in the design process of realistic scenarios based on real-world traffic, to be used in the Airspace Operations Laboratory for simulations within the Air Traffic Management Technology Demonstration 1 effort. All steps are described, from the initial analysis of real-world traffic, through the editing of individual aircraft records in the scenario file, to the final testing of the scenarios before the simulation is conducted. The iterative nature of the design process and the various efforts necessary to reach the required fidelity, as well as the applied design strategies, challenges, and tools used during this process, are also discussed.

  10. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images

    PubMed Central

    2017-01-01

    Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that the steps have an effect on the volumetric output. To date, studies have compared between and not within pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, where each participant contributed three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (for MRI scanner consistency), and repeated pipeline runs (for algorithmic consistency). A main effect of pipeline step was detected and, interestingly, an interaction between pipeline step and ROI was also found. No effect of either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing. PMID:29023597
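    The volumetric measure tracked after each pipeline step reduces to a simple computation: count the voxels carrying an ROI's label in the parcellation and multiply by the voxel volume. A minimal sketch, with label values and voxel sizes that are illustrative rather than the study's:

```python
def roi_volume_mm3(label_map, roi_label, voxel_dims_mm):
    """Volume of one ROI in cubic millimetres.

    `label_map` is a flattened parcellation (one integer label per voxel);
    the volume is the count of voxels with `roi_label` times the voxel volume.
    """
    vx, vy, vz = voxel_dims_mm
    return label_map.count(roi_label) * vx * vy * vz
```

    Because pre-processing steps resample the data and shift intensities, the same ROI counted before and after a step can yield different volumes, which is exactly the effect the study quantifies.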

  11. The STEP model: Characterizing simultaneous time effects on practice for flight simulator performance among middle-aged and older pilots

    PubMed Central

    Kennedy, Quinn; Taylor, Joy; Noda, Art; Yesavage, Jerome; Lazzeroni, Laura C.

    2015-01-01

    Understanding the possible effects of the number of practice sessions (practice) and time between practice sessions (interval) among middle-aged and older adults in real world tasks has important implications for skill maintenance. Prior training and cognitive ability may impact practice and interval effects on real world tasks. In this study, we took advantage of existing practice data from five simulated flights among 263 middle-aged and older pilots with varying levels of flight expertise (defined by FAA proficiency ratings). We developed a new STEP (Simultaneous Time Effects on Practice) model to: (1) model the simultaneous effects of practice and interval on performance of the five flights, and (2) examine the effects of selected covariates (age, flight expertise, and three composite measures of cognitive ability). The STEP model demonstrated consistent positive practice effects, negative interval effects, and predicted covariate effects. Age negatively moderated the beneficial effects of practice. Additionally, cognitive processing speed and intra-individual variability (IIV) in processing speed moderated the benefits of practice and/or the negative influence of interval for particular flight performance measures. Expertise did not interact with either practice or interval. Results indicate that practice and interval effects occur in simulated flight tasks. However, processing speed and IIV may influence these effects, even among high functioning adults. Results have implications for the design and assessment of training interventions targeted at middle-aged and older adults for complex real world tasks. PMID:26280383
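    The core of the STEP model is that practice and interval act simultaneously and in opposite directions on performance. The toy function below mirrors only the direction of those effects (positive practice, negative interval); the functional form and every coefficient are invented for illustration and are not the fitted model:

```python
import math

def predicted_performance(n_sessions, interval_days,
                          baseline=50.0, practice_gain=8.0, interval_cost=0.5):
    """Toy flight-performance score: grows with the (log of the) number of
    practice sessions and shrinks with the time elapsed between sessions.
    All coefficients are made up for illustration only."""
    return (baseline
            + practice_gain * math.log(n_sessions)
            - interval_cost * interval_days)
```

    Moderators such as age or processing speed would enter such a model as interaction terms scaling `practice_gain` or `interval_cost`.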

  12. Carbon fluxes in tropical forest ecosystems: the value of Eddy-covariance data for individual-based dynamic forest gap models

    NASA Astrophysics Data System (ADS)

    Roedig, Edna; Cuntz, Matthias; Huth, Andreas

    2015-04-01

    The effects of climatic inter-annual fluctuations and human activities on the global carbon cycle are uncertain and currently a major issue in global vegetation models. Individual-based forest gap models, on the other hand, model vegetation structure and dynamics on a small spatial (<100 ha) and large temporal scale (>1000 years). They are well-established tools to reproduce successions of highly-diverse forest ecosystems and investigate disturbances such as logging or fire events. However, the parameterizations of the relationships between short-term climate variability and forest model processes are often uncertain in these models (e.g. daily variable temperature and gross primary production (GPP)) and cannot be constrained from forest inventories. We addressed this uncertainty and linked high-resolution Eddy-covariance (EC) data with an individual-based forest gap model. The forest model FORMIND was applied to three diverse tropical forest sites in the Amazonian rainforest. Species diversity was categorized into three plant functional types. The parameterizations for the steady state of biomass and forest structure were calibrated and validated with different forest inventories. The parameterizations of relationships between short-term climate variability and forest model processes were evaluated with EC data on a daily time step. The validations of the steady state showed that the forest model could reproduce biomass and forest structures from forest inventories. The daily estimations of carbon fluxes showed that the forest model reproduces GPP as observed by the EC method. Daily fluctuations of GPP were clearly reflected as a response to daily climate variability. Ecosystem respiration remains a challenge on a daily time step due to a simplified soil respiration approach. In the long term, however, the dynamic forest model is expected to estimate carbon budgets for highly-diverse tropical forests where EC measurements are rare.
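    The abstract does not give FORMIND's GPP formulation, so as a hedged illustration only: a common minimal way to let modeled daily GPP respond to daily climate variability is a saturating (Michaelis-Menten-type) light-response curve driven by incoming radiation. Parameter names and values below are invented, not FORMIND's:

```python
def daily_gpp(par, gpp_max=12.0, half_sat=400.0):
    """Saturating light-response curve for daily GPP.

    par: daily photosynthetically active radiation (arbitrary units).
    gpp_max: asymptotic GPP at light saturation (illustrative value).
    half_sat: PAR at which GPP reaches half of gpp_max (illustrative value).
    """
    return gpp_max * par / (par + half_sat)
```

    Under such a curve, day-to-day swings in radiation translate directly into GPP fluctuations, which is the behavior the EC comparison in the abstract evaluates.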

  13. The ability of individuals to assess population density influences the evolution of emigration propensity and dispersal distance.

    PubMed

    Poethke, Hans Joachim; Gros, Andreas; Hovestadt, Thomas

    2011-08-07

    We analyze the simultaneous evolution of emigration and settlement decisions for actively dispersing species differing in their ability to assess population density. Using an individual-based model we simulate dispersal as a multi-step (patch to patch) movement in a world consisting of habitat patches surrounded by a hostile matrix. Each such step is associated with the same mortality risk. Our simulations show that individuals following an informed strategy, where emigration (and settlement) probability depends on local population density, evolve a lower (natal) emigration propensity but disperse over significantly larger distances - i.e. postpone settlement longer - than individuals performing density-independent emigration. This holds especially when variation in environmental conditions is spatially correlated. Both effects can be traced to the informed individuals' ability to better exploit existing heterogeneity in reproductive chances. Yet, already moderate distance-dependent dispersal costs prevent the evolution of multi-step (long-distance) dispersal, irrespective of the dispersal strategy. Copyright © 2011 Elsevier Ltd. All rights reserved.
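    Two ingredients of the model above can be sketched compactly: an "informed" emigration rule that depends on local density, and the geometric decline of survival with the number of equally risky patch-to-patch steps. The logistic form of the density response is a common modeling choice for illustration, not necessarily the paper's exact functional form:

```python
import math

def emigration_prob_informed(density, capacity, slope=4.0):
    """Density-dependent ('informed') emigration: the probability of leaving
    rises toward 1 as local density exceeds carrying capacity. The logistic
    shape and slope value are illustrative assumptions."""
    return 1.0 / (1.0 + math.exp(-slope * (density / capacity - 1.0)))

def survival_after_steps(n_steps, step_mortality):
    """Each patch-to-patch step carries the same mortality risk, so the chance
    of completing a multi-step transfer declines geometrically with distance.
    This is why even moderate per-step costs suppress long-distance dispersal."""
    return (1.0 - step_mortality) ** n_steps
```

    For example, with 10% mortality per step, only about 73% of dispersers survive a three-step transfer, illustrating how distance-dependent costs penalize postponed settlement.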

  14. User Manual for Personnel Inventory Aging and Promotion Model

    DTIC Science & Technology

    2009-06-01

    increased by 12. Now, an SQL statement deletes records where [target] = NULL, and the model calculates the number of E8s that need to be promoted to...the run, the [Likelihood] and [Expected] tables are created. The first step in this process is to dynamically build an SQL statement, based on the...This table has individual-level, longitudinal records. Next, a dynamically built SQL statement based on the Number of Years, creates a new data
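    The fragment above only names its SQL operations, so the sketch below is a hedged reconstruction using sqlite3: the table and column values are invented, and note that the fragment's "[target] = NULL" must be written with `IS NULL` in standard SQL, since `= NULL` never matches:

```python
import sqlite3

# Hypothetical personnel table; names and data are invented for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE promotions (id INTEGER, grade TEXT, target INTEGER)")
cur.executemany("INSERT INTO promotions VALUES (?, ?, ?)",
                [(1, "E7", 2), (2, "E7", None), (3, "E8", 1)])

# "deletes records where [target] = NULL" -- standard SQL requires IS NULL.
cur.execute("DELETE FROM promotions WHERE target IS NULL")

# Promotions needed for a grade = sum of the remaining target values.
needed_e7 = cur.execute(
    "SELECT COALESCE(SUM(target), 0) FROM promotions WHERE grade = 'E7'"
).fetchone()[0]
```

    Building such statements dynamically (as the manual describes) would mean assembling the SQL text from model parameters before executing it.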

  15. Same-sex attraction: a model to aid nurses' understanding.

    PubMed

    Richardson, Brian

    2009-12-01

    Young people attracted to people of their own sex are at risk of bullying and discrimination. It is often difficult for them to find support, either emotionally or in relation to their health needs. This article explores a model to aid nurses in understanding the process individuals go through before coming to terms with their sexuality. The model also outlines the steps that nurses can take to enhance the care they provide for this vulnerable group of patients and clients.

  16. Adhesion Forces between Lewis(X) Determinant Antigens as Measured by Atomic Force Microscopy.

    PubMed

    Tromas, C; Rojo, J; de la Fuente, J M; Barrientos, A G; García, R; Penadés, S

    2001-01-01

    The adhesion forces between individual molecules of Lewis(X) trisaccharide antigen (Le(X)) have been measured in water and in calcium solution by using atomic force microscopy (AFM). These results demonstrate the self-recognition capability of this antigen, and reinforce the hypothesis that carbohydrate-carbohydrate interaction could be considered the first step in the cell-adhesion process in nature. Copyright © 2001 WILEY-VCH Verlag GmbH, Weinheim, Fed. Rep. of Germany.

  17. Carbonaceous Chondrite Thin Section Preparation

    NASA Technical Reports Server (NTRS)

    Harrington, R.; Righter, K.

    2017-01-01

    Carbonaceous chondrite meteorites have long posed a challenge for thin section makers. The variability in sample hardness among the different types, and sometimes within individual sections, creates the need for an adaptable approach at each step of the thin section making process. This poster will share some of the procedural adjustments that have proven to be successful at the NASA JSC Meteorite Thin Section Laboratory. These adjustments are modifications of preparation methods that have been in use for decades and therefore do not require investment in new technology or materials.

  18. A Serious Game for Anterior Cruciate Ligament Rehabilitation: Software Development Aspects and Game Engagement Assessment.

    PubMed

    Cordeiro d'Ornellas, Marcos; Santos Machado, Alison; de Moraes, Jefferson Potiguara; Cervi Prado, Ana Lúcia

    2017-01-01

    This work presents the steps for developing a serious game that allows interaction through natural gestures, whose main purpose is to contribute to the treatment of individuals who have suffered an injury to the anterior cruciate ligament (ACL). In addition to the serious game development process, an evaluation of the users' gaming experience was performed. Through this assessment, positive results were obtained in relation to various aspects of game engagement, confirming the playful quality of the activity.

  19. PANGEA: pipeline for analysis of next generation amplicons.

    PubMed

    Giongo, Adriana; Crabb, David B; Davis-Richardson, Austin G; Chauliac, Diane; Mobberley, Jennifer M; Gano, Kelsey A; Mukherjee, Nabanita; Casella, George; Roesch, Luiz F W; Walts, Brandon; Riva, Alberto; King, Gary; Triplett, Eric W

    2010-07-01

    High-throughput DNA sequencing can identify organisms and describe population structures in many environmental and clinical samples. Current technologies generate millions of reads in a single run, requiring extensive computational strategies to organize, analyze and interpret those sequences. A series of bioinformatics tools for high-throughput sequencing analysis, including pre-processing, clustering, database matching and classification, have been compiled into a pipeline called PANGEA. The PANGEA pipeline was written in Perl and can be run on Mac OS X, Windows or Linux. With PANGEA, sequences obtained directly from the sequencer can be processed quickly to provide the files needed for sequence identification by BLAST and for comparison of microbial communities. Two different sets of bacterial 16S rRNA sequences were used to show the efficiency of this workflow. The first set of 16S rRNA sequences is derived from various soils from Hawaii Volcanoes National Park. The second set is derived from stool samples collected from diabetes-resistant and diabetes-prone rats. The workflow described here allows the investigator to quickly assess libraries of sequences on personal computers with customized databases. PANGEA is provided for users as individual scripts for each step in the process or as a single script where all processes, except the chi-square step, are joined into one program called the 'backbone'.
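    PANGEA itself is written in Perl and its exact criteria are not given in the abstract, but the generic pre-processing stages it names can be sketched as follows (filter thresholds and rules here are illustrative assumptions, not PANGEA's):

```python
def quality_filter(reads, min_len=5):
    """Drop reads shorter than min_len or containing ambiguous bases ('N') --
    a generic pre-processing filter; PANGEA's own criteria may differ."""
    return [r for r in reads if len(r) >= min_len and "N" not in r]

def dereplicate(reads):
    """Collapse identical reads while keeping abundance counts, a common
    reduction step before clustering and BLAST matching."""
    counts = {}
    for r in reads:
        counts[r] = counts.get(r, 0) + 1
    return counts
```

    Chaining the two -- `dereplicate(quality_filter(raw_reads))` -- yields the reduced, counted sequence set that downstream database matching would consume.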

  20. Method for localizing and isolating an errant process step

    DOEpatents

    Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.

    2003-01-01

    A method for localizing and isolating an errant process includes the steps of retrieving from a defect image database a selection of images each image having image content similar to image content extracted from a query image depicting a defect, each image in the selection having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. A process step as a highest probable source of the defect according to the derived conditional probability distribution is then identified. A method for process step defect identification includes the steps of characterizing anomalies in a product, the anomalies detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
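    The core computation the method describes -- deriving a conditional probability distribution over process steps from the characterization data of retrieved similar defect images, then picking the most probable step -- can be sketched minimally. This stand-in uses only step labels; the patented method draws on richer defect characterization data:

```python
from collections import Counter

def likely_errant_step(retrieved_step_labels):
    """Estimate P(step | defect) by normalized counts over the process-step
    labels of the retrieved similar defect images, and return the step with
    the highest conditional probability together with the full distribution."""
    counts = Counter(retrieved_step_labels)
    total = sum(counts.values())
    dist = {step: n / total for step, n in counts.items()}
    return max(dist, key=dist.get), dist
```

    For instance, if most images similar to the query defect came from the etch step, etch is flagged as the highest-probability source of the defect.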

  1. Integration of eHealth Tools in the Process of Workplace Health Promotion: Proposal for Design and Implementation

    PubMed Central

    2018-01-01

    Background Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, intervention development process, or evaluation. Objective To support a successful implementation of eHealth tools in the whole WHP process, we introduce a concept of WHP (life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. Methods We developed a life cycle model of WHP based on the World Health Organization (WHO) model of healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. Results eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can provide support by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility to give automatic feedback about health parameters, create incentive systems, or bring together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. 
Conclusions Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the workplace are needed to secure quality and reach sustainable results. PMID:29475828

  2. Behavioral preference in sequential decision-making and its association with anxiety.

    PubMed

    Zhang, Dandan; Gu, Ruolei

    2018-06-01

    In daily life, people often make consecutive decisions before the ultimate goal is reached (i.e., sequential decision-making). However, this kind of decision-making has been largely overlooked in the literature. The current study investigated whether behavioral preference would change during sequential decisions, and the neural processes underlying the potential changes. For this purpose, we revised the classic balloon analogue risk task and recorded the electroencephalograph (EEG) signals associated with each step of decision-making. Independent component analysis performed on EEG data revealed that four EEG components elicited by periodic feedback in the current step predicted participants' decisions (gamble vs. no gamble) in the next step. In order of time sequence, these components were: bilateral occipital alpha rhythm, bilateral frontal theta rhythm, middle frontal theta rhythm, and bilateral sensorimotor mu rhythm. According to the information flows between these EEG oscillations, we proposed a brain model that describes the temporal dynamics of sequential decision-making. Finally, we found that the tendency to gamble (as well as the power intensity of bilateral frontal theta rhythms) was sensitive to the individual level of trait anxiety in certain steps, which may help understand the role of emotion in decision-making. © 2018 Wiley Periodicals, Inc.

  3. 25 CFR 15.11 - What are the basic steps of the probate process?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    25 CFR § 15.11 (Bureau of Indian Affairs, Department of the Interior, Probate of Indian...): What are the basic steps of the probate process? The basic steps of the probate process are: (a) We learn...

  4. Design and grayscale fabrication of beamfanners in a silicon substrate

    NASA Astrophysics Data System (ADS)

    Ellis, Arthur Cecil

    2001-11-01

    This dissertation addresses important first steps in the development of a grayscale fabrication process for multiple phase diffractive optical elements (DOEs) in silicon. Specifically, this process was developed through the design, fabrication, and testing of 1-2 and 1-4 beamfanner arrays for 5-micron illumination. The 1-2 beamfanner arrays serve as a test-of-concept and basic developmental step toward the construction of the 1-4 beamfanners. The beamfanners are 50 microns wide, and have features with dimensions of between 2 and 10 microns. The Iterative Annular Spectrum Approach (IASA) method, developed by Steve Mellin of UAH, and the Boundary Element Method (BEM) are the design and testing tools used to create the beamfanner profiles and predict their performance. Fabrication of the beamfanners required the techniques of grayscale photolithography and reactive ion etching (RIE). A 2-3 micron feature size 1-4 silicon beamfanner array was fabricated, but the small features and contact photolithographic techniques available prevented its construction to specifications. A second and more successful attempt was made in which both 1-4 and 1-2 beamfanner arrays were fabricated with a 5-micron minimum feature size. Photolithography for the UAH array was contracted to MEMS-Optical of Huntsville, Alabama. A repeatability study was performed, using statistical techniques, of 14 photoresist arrays and the subsequent RIE process used to etch the arrays in silicon. The variance in selectivity between the 14 processes was far greater than the variance between the individual etched features within each process. Specifically, the ratio of the variance of the selectivities averaged over each of the 14 etch processes to the variance of individual feature selectivities within the processes yielded a significance level below 0.1% by F-test, indicating that good etch-to-etch process repeatability was not attained.
    One of the 14 arrays had feature etch-depths close enough to design specifications for optical testing, but 5-micron IR illumination of the 1-4 and 1-2 beamfanners yielded no convincing results of beam splitting in the detector plane 340 microns from the surface of the beamfanner array.
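    The repeatability comparison described above is a variance-ratio (F) test: between-etch variance against within-etch variance. A self-contained sketch of the statistic, with small illustrative data rather than the dissertation's measurements:

```python
def f_ratio(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square. A large ratio means group-to-group (etch-to-etch)
    variation dominates within-group (feature-to-feature) variation."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(x for g in groups for x in g) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

    A significance level below 0.1% for this ratio, as reported above, indicates the etch runs differ far more from one another than features within a run do, i.e. poor run-to-run repeatability.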

  5. Effects of Gait Training With Body Weight Support on a Treadmill Versus Overground in Individuals With Stroke.

    PubMed

    Gama, Gabriela L; Celestino, Melissa L; Barela, José A; Forrester, Larry; Whitall, Jill; Barela, Ana M

    2017-04-01

    To investigate the effects of gait training with body weight support (BWS) on a treadmill versus overground in individuals with chronic stroke. Randomized controlled trial. University research laboratory. Individuals (N=28) with chronic stroke (>6mo from the stroke event). Participants were randomly assigned to receive gait training with BWS on a treadmill (n=14) or overground (n=14) 3 times a week for 6 weeks. Gait speed measured using the 10-meter walk test, endurance measured using the 6-minute walk test, functional independence measured using the motor domain of the FIM, lower limb recovery measured using the lower extremity domain of the Fugl-Meyer assessment, step length, step length symmetry ratio, and single-limb support duration. Measurements were obtained at baseline, immediately after the training session, and 6 weeks after the training session. At 1 week after the last training session, both groups improved in all outcome measures except paretic step length and step length symmetry ratio, which were improved only in the overground group (P=.01 and P=.01, respectively). At 6 weeks after the last training session, all improvements remained and the treadmill group also improved paretic step length (P<.001) but not step length symmetry ratio (P>.05). Individuals with chronic stroke equally improve gait speed and other gait parameters after 18 sessions of BWS gait training on either a treadmill or overground. Only the overground group improved step length symmetry ratio, suggesting a role of integrating overground walking into BWS interventions poststroke. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  6. Observational study of treatment space in individual neonatal cot spaces.

    PubMed

    Hignett, Sue; Lu, Jun; Fray, Mike

    2010-01-01

    Technology developments in neonatal intensive care units have increased the spatial requirements for clinical activities. Because the effectiveness of healthcare delivery is determined in part by the design of the physical environment and the spatial organization of work, it is appropriate to apply an evidence-based approach to architectural design. This study aimed to provide empirical evidence of the spatial requirements for an individual cot or incubator space. Observational data from 2 simulation exercises were combined with an expert review to produce a final recommendation. A validated 5-step protocol was used to collect data. Step 1 defined the clinical specialty and space. In step 2, data were collected with 28 staff members and 15 neonates to produce a simulation scenario representing the frequent and safety-critical activities. In step 3, 21 staff members participated in functional space experiments to determine the average spatial requirements. Step 4 incorporated additional data (eg, storage and circulation) to produce a spatial recommendation. Finally, the recommendation was reviewed in step 5 by a national expert clinical panel to consider alternative layouts and technology. The average space requirement for an individual neonatal intensive care unit cot (incubator) space was 13.5 m2 (or 145.3 ft2). The circulation and storage space requirements added in step 4 increased this to 18.46 m2 (or 198.7 ft2). The expert panel reviewed the recommendation and agreed that the average individual cot space (13.5 m2/[or 145.3 ft2]) would accommodate variance in working practices. Care needs to be taken when extrapolating this recommendation to multiple cot areas to maintain the minimum spatial requirement.

  7. Personal Finance. Predrafted Individual Short-Term Plan/Records (Secondary Level): Directions for Resource Teachers, Teachers and Aides.

    ERIC Educational Resources Information Center

    Flores, Merced, Comp.

    Developed by experienced migrant education teachers incorporating Sight and Sound Program concepts, this volume presents predrafted individual short-term Plan/Records for personal finance for secondary level students, plus step-by-step directions for their use by Oregon resource teachers, classroom teachers, and aides. This approach assumes that…

  8. Science. Predrafted Individual Short-Term Plan/Records (Secondary Level): Directions for Resource Teachers, Teachers and Aides.

    ERIC Educational Resources Information Center

    Flores, Merced, Comp.

    Developed by experienced migrant education teachers incorporating Sight and Sound Program concepts, this volume presents predrafted individual short-term Plan/Records for secondary level chemistry, biology, and physics, plus step-by-step directions for their use by Oregon resource teachers, classroom teachers, and aides. The approach assumes that…

  9. Prospective Associations Between Intervention Components and Website Engagement in a Publicly Available Physical Activity Website: The Case of 10,000 Steps Australia

    PubMed Central

    Corry, Kelly; Van Itallie, Anetta; Vandelanotte, Corneel; Caperchione, Cristina; Mummery, W Kerry

    2012-01-01

    Background Effectiveness of and engagement with website-delivered physical activity interventions is moderate at best. Increased exposure to Internet interventions is reported to increase their effectiveness; however, there is a lack of knowledge about which specific intervention elements are able to maintain website engagement. Objective To prospectively study the associations of website engagement and exposure to intervention components for a publicly available physical activity website (10,000 Steps Australia). Methods Between June and July 2006 a total of 348 members of 10,000 Steps completed a Web-based survey to collect demographic characteristics. Website engagement was subsequently assessed over a 2-year period and included engagement data on website components; individual challenges, team challenges, and virtual walking buddies; and indicators of website engagement (average steps logged, days logging steps, and active users). Results On average participants logged steps on 169 (SD 228.25) days. Over a 2-year period this equated to an average of 1.6 logons per week. Binary logistic regression showed that individuals who participated in individual challenges were more likely to achieve an average of 10,000 steps per day (odds ratio [OR] = 2.80, 95% confidence interval [CI] 1.45–5.40), log steps on a higher than average number of days (OR = 6.81, 95% CI 2.87–13.31), and remain an active user (OR = 4.36, 95% CI 2.17–8.71). Additionally, those using virtual walking buddies (OR = 5.83, 95% CI 1.27–26.80) and those of older age logged steps on a higher than average number of days. No significant associations were found for team challenges. Conclusions Overall engagement with the 10,000 Steps website was high, and the results demonstrate the relative effectiveness of interactive components to enhance website engagement. However, only exposure to the interactive individual challenge feature was positively associated with all website engagement indicators. 
More research is needed to examine the influence of intervention components on website engagement, as well as the relationship between website engagement and physical activity change. PMID:22260810
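    The engagement results above are reported as odds ratios with 95% confidence intervals, which derive from logistic-regression coefficients by exponentiation. A minimal sketch of that conversion (the input values in the test are illustrative, not the study's coefficients):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient `beta` and its standard
    error `se` into an odds ratio with a 95% confidence interval:
    OR = exp(beta), CI = exp(beta +/- z * se)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))
```

    An interval whose lower bound stays above 1 (as for the individual-challenge ORs reported above) indicates a statistically significant positive association.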

  10. Production of long-term global water vapor and liquid water data set using ultra-fast methods to assimilate multi-satellite and radiosonde observations

    NASA Technical Reports Server (NTRS)

    Vonderhaar, T. H.; Reinke, Donald L.; Randel, David L.; Stephens, Graeme L.; Combs, Cynthia L.; Greenwald, Thomas J.; Ringerud, Mark A.; Wittmeyer, Ian L.

    1993-01-01

    During the next decade, many programs and experiments under the Global Energy and Water Cycle Experiment (GEWEX) will utilize present day and future data sets to improve our understanding of the role of moisture in climate, and its interaction with other variables such as clouds and radiation. An important element of GEWEX will be the GEWEX Water Vapor Project (GVaP), which will eventually initiate a routine, real-time assimilation of the highest quality, global water vapor data sets including information gained from future data collection systems, both ground and space based. The comprehensive global water vapor data set being produced by METSAT Inc. uses a combination of ground-based radiosonde data, and infrared and microwave satellite retrievals. These data are needed to provide the desired foundation from which future GEWEX-related research, such as GVaP, can build. The first year of this project was designed to use a combination of the best available atmospheric moisture data, including radiosonde (balloon/aircraft/rocket), HIRS/MSU (TOVS), and SSM/I retrievals, to produce a one-year, global, high resolution data set of integrated column water vapor (precipitable water) with a horizontal resolution of 1 degree and a temporal resolution of one day. The time period of this pilot product was to be determined by the availability of all the input data sets; January 1988 through December 1988 was selected. In addition, a sample of vertically integrated liquid water content (LWC) was to be produced with the same temporal and spatial parameters, over ocean areas only. Three main steps are followed to produce a merged water vapor and liquid water product. Input data from radiosondes, TOVS, and SSM/I are quality checked in steps one and two. Processing is done in step two to generate individual total column water vapor and liquid water data sets. 
The third step, and final processing task, involves merging the individual output products to produce the integrated water vapor product. A final quality control is applied to the merged data sets.
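
    The three-step flow above can be sketched schematically. Everything here (the record layout, the plausibility cutoff, cell-wise averaging as the merge rule) is an invented stand-in for the actual METSAT processing:

```python
# Hypothetical sketch of the quality-check / process / merge sequence;
# names and thresholds are illustrative, not the actual processing code.

def quality_check(records, max_pw=80.0):
    # Steps 1-2: drop physically implausible precipitable-water values (mm)
    return [r for r in records if 0.0 <= r["pw"] <= max_pw]

def merge_sources(*sources):
    # Step 3: average all valid retrievals falling in the same 1-degree cell
    merged = {}
    for src in sources:
        for r in src:
            merged.setdefault(r["cell"], []).append(r["pw"])
    return {cell: sum(v) / len(v) for cell, v in merged.items()}

raob = quality_check([{"cell": (10, 20), "pw": 25.0},
                      {"cell": (10, 20), "pw": 999.0}])  # bad value dropped
tovs = quality_check([{"cell": (10, 20), "pw": 27.0}])
ssmi = quality_check([{"cell": (11, 20), "pw": 30.0}])
product = merge_sources(raob, tovs, ssmi)
```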

  11. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost driver in downstream processing, with high attrition costs, especially when protein A resin is retired before the end of its lifetime. Here we describe a feasibility study transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step while leaving the polishing steps unchanged, to minimize the process adaptations required compared to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in product quality and step yield comparable to the batch process. Productivity for the protein A step could be increased up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, cost of goods was compared for the protein A step between the hybrid process and the batch process, showing a 300% cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Towards combined global monthly gravity field solutions

    NASA Astrophysics Data System (ADS)

    Jaeggi, Adrian; Meyer, Ulrich; Beutler, Gerhard; Weigelt, Matthias; van Dam, Tonie; Mayer-Gürr, Torsten; Flury, Jakob; Flechtner, Frank; Dahle, Christoph; Lemoine, Jean-Michel; Bruinsma, Sean

    2014-05-01

    Currently, official GRACE Science Data System (SDS) monthly gravity field solutions are generated independently by the Centre for Space Research (CSR) and the German Research Centre for Geosciences (GFZ). Additional GRACE SDS monthly fields are provided by the Jet Propulsion Laboratory (JPL) for validation and outside the SDS by a number of other institutions worldwide. Although the adopted background models and processing standards have been harmonized more and more by the various processing centers during the past years, notable differences still exist, and users are largely left to decide on their own which model to choose for their individual applications. This situation seriously limits the accessibility of these valuable data. Combinations are well established in the area of other space geodetic techniques, such as the Global Navigation Satellite Systems (GNSS), Satellite Laser Ranging (SLR), and Very Long Baseline Interferometry (VLBI). Regularly comparing and combining space-geodetic products has tremendously increased the usefulness of the products in a wide range of disciplines and scientific applications. Therefore, we propose in a first step to mutually compare the large variety of available monthly GRACE gravity field solutions, e.g., by assessing the signal content over selected regions, by estimating the noise over the oceans, and by performing significance tests. We attempt to assign different solution characteristics to different processing strategies in order to identify subsets of solutions, which are based on similar processing strategies. Using these subsets we will in a second step explore ways to generate combined solutions, e.g., based on a weighted average of the individual solutions using empirical weights derived from pair-wise comparisons. 
We will also assess the quality of such a combined solution and discuss the potential benefits for the GRACE and GRACE-FO user community, but also address minimum processing requirements to be met by each analysis centre to enable a meaningful combination (either performed on the solution level or, preferably, on the normal equation level).
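
    The proposed combination, a weighted average with empirical weights from pair-wise comparisons, can be sketched as follows. Solutions are reduced to flat coefficient lists, and the noise proxy and weighting scheme are only illustrative assumptions, not any centre's actual method:

```python
# Illustrative sketch: combine monthly solutions as a weighted mean, with
# weights derived from pair-wise differences between centres' solutions.

def empirical_weights(solutions):
    # Noise proxy per centre: mean squared difference to the other solutions
    n = len(solutions)
    variances = []
    for i, si in enumerate(solutions):
        diffs = [sum((a - b) ** 2 for a, b in zip(si, sj)) / len(si)
                 for j, sj in enumerate(solutions) if j != i]
        variances.append(sum(diffs) / (n - 1))
    raw = [1.0 / v for v in variances]
    total = sum(raw)
    return [w / total for w in raw]

def combine(solutions):
    # Coefficient-wise weighted mean of the individual solutions
    weights = empirical_weights(solutions)
    return [sum(w * sol[k] for w, sol in zip(weights, solutions))
            for k in range(len(solutions[0]))]

# Two agreeing "centres" outweigh a third, discrepant one
combined = combine([[1.0, 1.0], [1.0, 1.0], [2.0, 2.0]])
```

    A combination on the normal-equation level, as preferred in the abstract, would instead stack and re-solve the centres' normal equations rather than average solved coefficients.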

  13. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advisory multi-step... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  14. Comparison of machinability of manganese alloyed austempered ductile iron produced using conventional and two step austempering processes

    NASA Astrophysics Data System (ADS)

    Hegde, Ananda; Sharma, Sathyashankara

    2018-05-01

    Austempered Ductile Iron (ADI) is a revolutionary material with high strength and hardness combined with optimum ductility and toughness. The discovery of the two-step austempering process has led to a superior combination of all the mechanical properties. However, because of the high strength and hardness of ADI, there is a concern regarding its machinability. In the present study, machinability of ADI produced using conventional and two-step heat treatment processes is assessed using tool life and surface roughness. Speed, feed and depth of cut are considered as the machining parameters in the dry turning operation. The machinability results along with the mechanical properties are compared for ADI produced using both conventional and two-step austempering processes. The results have shown that the two-step austempering process produced better toughness with good hardness and strength without sacrificing ductility. Addition of 0.64 wt% manganese did not cause any detrimental effect on the machinability of ADI, in either the conventional or the two-step process. Marginal improvements in tool life and surface roughness were observed in the two-step process compared to the conventional process.

  15. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction.

    PubMed

    Hill, Jon; Davis, Katie E

    2014-01-01

    Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity research. No easy-to-use, fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses a well-defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to supply at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree data and metadata into a single file. These data can then be manipulated according to a well-defined but flexible processing pipeline using either the GUI or a command-line tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analysis using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  16. The Internet as a New Tool in the Rehabilitation Process of Patients—Education in Focus

    PubMed Central

    Forczek, Erzsébet; Makra, Péter; Sik Lanyi, Cecilia; Bari, Ferenc

    2015-01-01

    In this article we deal with the rehabilitation of patients using information technology, especially Internet support. We concentrate on two main areas in the IT support of rehabilitation: one is the support of individual therapy, the other is providing patients with information, which is the basic step in emphasising individual responsibility. In the development of rehabilitation programmes, the knowledge of the IT professional and the therapist plays the primary role; in the IT support of web guidance, medical expertise plays the primary role. The degree of assistance involved in the rehabilitation process depends on the IT knowledge of medical professionals (general practitioners, nursing staff) as well. The knowledge required in healing and development processes is imparted to professionals by special (full-time) university training. It was a considerable challenge for us to teach web-based information organisation skills to doctors and nurses, and it is also a complex task to convey such an IT viewpoint to information specialists in order to create the foundations of cooperation between IT and healthcare professionals. PMID:25711359

  17. The Internet as a new tool in the rehabilitation process of patients--education in focus.

    PubMed

    Forczek, Erzsébet; Makra, Péter; Lanyi, Cecilia Sik; Bari, Ferenc

    2015-02-23

    In this article we deal with the rehabilitation of patients using information technology, especially Internet support. We concentrate on two main areas in the IT support of rehabilitation: one is the support of individual therapy, the other is providing patients with information, which is the basic step in emphasising individual responsibility. In the development of rehabilitation programmes, the knowledge of the IT professional and the therapist plays the primary role; in the IT support of web guidance, medical expertise plays the primary role. The degree of assistance involved in the rehabilitation process depends on the IT knowledge of medical professionals (general practitioners, nursing staff) as well. The knowledge required in healing and development processes is imparted to professionals by special (full-time) university training. It was a considerable challenge for us to teach web-based information organisation skills to doctors and nurses, and it is also a complex task to convey such an IT viewpoint to information specialists in order to create the foundations of cooperation between IT and healthcare professionals.

  18. A comparison of the effects of visual deprivation and regular body weight support treadmill training on improving over-ground walking of stroke patients: a multiple baseline single subject design.

    PubMed

    Kim, Jeong-Soo; Kang, Sun-Young; Jeon, Hye-Seon

    2015-01-01

    The body-weight-support treadmill (BWST) is commonly used for gait rehabilitation, but other forms of BWST are in development, such as the visual-deprivation BWST (VDBWST). In this study, we compare the effect of VDBWST training and conventional BWST training on spatiotemporal gait parameters for three individuals who had hemiparetic strokes. We used a single-subject experimental design, alternating multiple baselines across the individuals. We recruited three individuals with hemiparesis from stroke: two on the left side and one on the right. For the main outcome measures we assessed spatiotemporal gait parameters using GAITRite, including: gait velocity; cadence; step time of the affected side (STA); step time of the non-affected side (STN); step length of the affected side (SLA); step length of the non-affected side (SLN); step-time asymmetry (ST-asymmetry); and step-length asymmetry (SL-asymmetry). Gait velocity, cadence, SLA, and SLN increased from baseline after both interventions, whereas STA, ST-asymmetry, and SL-asymmetry decreased from baseline after the interventions. The VDBWST was significantly more effective than the BWST for increasing gait velocity and cadence and for decreasing ST-asymmetry. VDBWST is more effective than BWST for improving gait performance during rehabilitation for over-ground walking.

  19. Locating and parsing bibliographic references in HTML medical articles

    PubMed Central

    Zou, Jie; Le, Daniel; Thoma, George R.

    2010-01-01

    The set of references that typically appear toward the end of journal articles is sometimes, though not always, a field in bibliographic (citation) databases. But even if references do not constitute such a field, they can be useful as a preprocessing step in the automated extraction of other bibliographic data from articles, as well as in computer-assisted indexing of articles. Automation in data extraction and indexing to minimize human labor is key to the affordable creation and maintenance of large bibliographic databases. Extracting the components of references, such as author names, article title, journal name, publication date and other entities, is therefore a valuable and sometimes necessary task. This paper describes a two-step process using statistical machine learning algorithms, to first locate the references in HTML medical articles and then to parse them. Reference locating identifies the reference section in an article and then decomposes it into individual references. We formulate this step as a two-class classification problem based on text and geometric features. An evaluation conducted on 500 articles drawn from 100 medical journals achieves near-perfect precision and recall rates for locating references. Reference parsing identifies the components of each reference. For this second step, we implement and compare two algorithms. One relies on sequence statistics and trains a Conditional Random Field. The other focuses on local feature statistics and trains a Support Vector Machine to classify each individual word, followed by a search algorithm that systematically corrects low confidence labels if the label sequence violates a set of predefined rules. The overall performance of these two reference-parsing algorithms is about the same: above 99% accuracy at the word level, and over 97% accuracy at the chunk level. PMID:20640222

  20. Locating and parsing bibliographic references in HTML medical articles.

    PubMed

    Zou, Jie; Le, Daniel; Thoma, George R

    2010-06-01

    The set of references that typically appear toward the end of journal articles is sometimes, though not always, a field in bibliographic (citation) databases. But even if references do not constitute such a field, they can be useful as a preprocessing step in the automated extraction of other bibliographic data from articles, as well as in computer-assisted indexing of articles. Automation in data extraction and indexing to minimize human labor is key to the affordable creation and maintenance of large bibliographic databases. Extracting the components of references, such as author names, article title, journal name, publication date and other entities, is therefore a valuable and sometimes necessary task. This paper describes a two-step process using statistical machine learning algorithms, to first locate the references in HTML medical articles and then to parse them. Reference locating identifies the reference section in an article and then decomposes it into individual references. We formulate this step as a two-class classification problem based on text and geometric features. An evaluation conducted on 500 articles drawn from 100 medical journals achieves near-perfect precision and recall rates for locating references. Reference parsing identifies the components of each reference. For this second step, we implement and compare two algorithms. One relies on sequence statistics and trains a Conditional Random Field. The other focuses on local feature statistics and trains a Support Vector Machine to classify each individual word, followed by a search algorithm that systematically corrects low confidence labels if the label sequence violates a set of predefined rules. The overall performance of these two reference-parsing algorithms is about the same: above 99% accuracy at the word level, and over 97% accuracy at the chunk level.
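
    The reference-locating step above, identifying the reference section and decomposing it into individual references, can be caricatured with simple heuristics. The paper trains classifiers on text and geometric features; the rules below are only illustrative stand-ins for that machinery:

```python
import re

# Toy sketch of the locate-then-decompose idea; heuristic, not the paper's
# trained two-class classifier.

def locate_reference_section(lines):
    # Stand-in for step 1: find a "References" heading, return what follows
    for i, ln in enumerate(lines):
        if ln.strip().lower() in ("references", "bibliography"):
            return lines[i + 1:]
    return []

def split_references(section):
    # Decompose into individual references on leading "1.", "2." markers
    refs, current = [], []
    for ln in section:
        if re.match(r"^\s*\d+\.\s", ln) and current:
            refs.append(" ".join(current))
            current = []
        current.append(ln.strip())
    if current:
        refs.append(" ".join(current))
    return refs

article = ["Body text.", "References",
           "1. Smith J. Title. J Med 2001.",
           "2. Doe A. Other title.",
           "   Continued 2002."]
refs = split_references(locate_reference_section(article))
```

    The subsequent parsing step (CRF or per-word SVM with rule-based correction) would then label author, title, journal, and date fields within each extracted reference string.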

  1. Audiovisual integration increases the intentional step synchronization of side-by-side walkers.

    PubMed

    Noy, Dominic; Mouta, Sandra; Lamas, Joao; Basso, Daniel; Silva, Carlos; Santos, Jorge A

    2017-12-01

    When people walk side-by-side, they often synchronize their steps. To achieve this, individuals might cross-modally match audiovisual signals from the movements of the partner and kinesthetic, cutaneous, visual and auditory signals from their own movements. Because signals from different sensory systems are processed with noise and asynchronously, the challenge of the CNS is to derive the best estimate based on this conflicting information. This is currently thought to be done by a mechanism operating as a Maximum Likelihood Estimator (MLE). The present work investigated whether audiovisual signals from the partner are integrated according to MLE in order to synchronize steps during walking. Three experiments were conducted in which the sensory cues from a walking partner were virtually simulated. In Experiment 1 seven participants were instructed to synchronize with human-sized Point Light Walkers and/or footstep sounds. Results revealed highest synchronization performance with auditory and audiovisual cues. This was quantified by the time to achieve synchronization and by synchronization variability. However, this auditory dominance effect might have been due to artifacts of the setup. Therefore, in Experiment 2 human-sized virtual mannequins were implemented. Also, audiovisual stimuli were rendered in real-time and thus were synchronous and co-localized. All four participants synchronized best with audiovisual cues. For three of the four participants results point toward their optimal integration consistent with the MLE model. Experiment 3 yielded performance decrements for all three participants when the cues were incongruent. Overall, these findings suggest that individuals might optimally integrate audiovisual cues to synchronize steps during side-by-side walking. Copyright © 2017 Elsevier B.V. All rights reserved.
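
    The MLE model referred to above combines cues by weighting each estimate with its reliability (inverse variance). A minimal sketch with invented timing estimates:

```python
# Minimal sketch of MLE (inverse-variance) cue integration: each cue's
# estimate is weighted by its reliability; all numbers are illustrative.

def mle_integrate(estimates, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * e for w, e in zip(weights, estimates)) / total
    combined_var = 1.0 / total  # never worse than the most reliable cue
    return combined, combined_var

# Auditory (reliable) and visual (noisier) step-timing cues, in ms
est, var = mle_integrate([100.0, 120.0], [4.0, 16.0])
```

    The combined estimate is pulled toward the more reliable cue, and its variance is smaller than either single cue's, which is the signature of optimal integration tested in the experiments.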

  2. Perceived life stress exposure modulates reward-related medial prefrontal cortex responses to acute stress in depression.

    PubMed

    Kumar, Poornima; Slavich, George M; Berghorst, Lisa H; Treadway, Michael T; Brooks, Nancy H; Dutra, Sunny J; Greve, Douglas N; O'Donovan, Aoife; Bleil, Maria E; Maninger, Nicole; Pizzagalli, Diego A

    2015-07-15

    Major depressive disorder (MDD) is often precipitated by life stress, and growing evidence suggests that stress-induced alterations in reward processing may contribute to such risk. However, no human imaging studies have examined how recent life stress exposure modulates the neural systems that underlie reward processing in depressed and healthy individuals. In this proof-of-concept study, 12 MDD and 10 psychiatrically healthy individuals were interviewed using the Life Events and Difficulties Schedule (LEDS) to assess their perceived levels of recent acute and chronic life stress exposure. Additionally, each participant performed a monetary incentive delay task under baseline (no-stress) and stress (social-evaluative) conditions during functional MRI. Across groups, medial prefrontal cortex (mPFC) activation to reward feedback was greater during acute stress versus no-stress conditions in individuals with greater perceived stressor severity. Under acute stress, depressed individuals showed a positive correlation between perceived stressor severity levels and reward-related mPFC activation (r=0.79, p=0.004), whereas no effect was found in healthy controls. Moreover, for depressed (but not healthy) individuals, the correlations between the stress (r=0.79) and no-stress (r=-0.48) conditions were significantly different. Finally, relative to controls, depressed participants showed significantly reduced mPFC gray matter, but functional findings remained robust while accounting for structural differences. A limitation of this study is the small sample size, which warrants replication. Depressed individuals experiencing greater recent life stress recruited the mPFC more under stress when processing rewards. Our results represent an initial step toward elucidating mechanisms underlying stress sensitization and recurrence in depression. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. US Cystic Fibrosis Foundation and European Cystic Fibrosis Society consensus recommendations for the management of non-tuberculous mycobacteria in individuals with cystic fibrosis

    PubMed Central

    Olivier, Kenneth N; Saiman, Lisa; Daley, Charles L; Herrmann, Jean-Louis; Nick, Jerry A; Noone, Peadar G; Bilton, Diana; Corris, Paul; Gibson, Ronald L; Hempstead, Sarah E; Koetz, Karsten; Sabadosa, Kathryn A; Sermet-Gaudelus, Isabelle; Smyth, Alan R; van Ingen, Jakko; Wallace, Richard J; Winthrop, Kevin L; Marshall, Bruce C; Haworth, Charles S

    2016-01-01

    Non-tuberculous mycobacteria (NTM) are ubiquitous environmental organisms that can cause chronic pulmonary infection, particularly in individuals with pre-existing inflammatory lung disease such as cystic fibrosis (CF). Pulmonary disease caused by NTM has emerged as a major threat to the health of individuals with CF but remains difficult to diagnose and problematic to treat. In response to this challenge, the US Cystic Fibrosis Foundation (CFF) and the European Cystic Fibrosis Society (ECFS) convened an expert panel of specialists to develop consensus recommendations for the screening, investigation, diagnosis and management of NTM pulmonary disease in individuals with CF. Nineteen experts were invited to participate in the recommendation development process. Population, Intervention, Comparison, Outcome (PICO) methodology and systematic literature reviews were employed to inform draft recommendations. An anonymous voting process was used by the committee to reach consensus. All committee members were asked to rate each statement on a scale of: 0, completely disagree, to 9, completely agree; with 80% or more of scores between 7 and 9 being considered ‘good’ agreement. Additionally, the committee solicited feedback from the CF communities in the USA and Europe and considered the feedback in the development of the final recommendation statements. Three rounds of voting were conducted to achieve 80% consensus for each recommendation statement. Through this process, we have generated a series of pragmatic, evidence-based recommendations for the screening, investigation, diagnosis and treatment of NTM infection in individuals with CF as an initial step in optimising management for this challenging condition. PMID:26666259
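
    The stated consensus rule, 'good' agreement when at least 80% of the committee's 0-9 ratings fall between 7 and 9, can be expressed directly. The ratings below are invented for illustration:

```python
# Sketch of the consensus criterion described in the abstract; the committee
# scores here are hypothetical, not actual panel votes.

def good_agreement(scores, threshold=0.8):
    in_range = sum(1 for s in scores if 7 <= s <= 9)
    return in_range / len(scores) >= threshold

# Invented ratings from a 19-member panel for one recommendation statement
ratings = [9, 8, 7, 9, 8, 7, 9, 8, 7, 9, 8, 7, 9, 8, 7, 6, 5, 9, 8]
agreed = good_agreement(ratings)
```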

  4. Structure of Enhanced Cued Recall Task in the 7 Minute Screen Test.

    PubMed

    Mora-Simon, Sara; Ladera-Fernandez, Valentina; Garcia-Garcia, Ricardo; Patino-Alonso, María C; Perea-Bartolome, M Victoria; Unzueta-Arce, Jaime; Perez-Arechaederra, Diana; Rodriguez-Sanchez, Emiliano

    2017-01-01

    Episodic memory in the 7 Minute Screen is assessed by the Enhanced Cued Recall (ECR) test. The ECR test is composed of three phases, Identification, Immediate Recall, and Free and Cued Recall. However, just the last phase is considered for the total score. We believe that collecting the performance data of the Identification and Immediate Recall phases could provide information regarding possible difficulties or impairments in the different aspects involved in the temporal mnesic process of acquisition of new information, such as in working memory or visual identification. The objective was to assess the goodness of fit for the three phases of the ECR test using a Confirmatory Factor Analysis (CFA) to show if each phase is separated from each other as a different aspect that participates in the mnesic process. A total of 311 participants older than 65 years were included in this study. Confirmatory factor analyses were conducted for each individual phase. The analyses show that the ECR test consists of three separate phases that identify different steps of the mnesic process. Individual scores for each phase could allow for investigation of patient performance in different aspects of the memory process and could help in further neuropsychological assessment.

  5. Fine-scale movement decisions of tropical forest birds in a fragmented landscape.

    PubMed

    Gillies, Cameron S; Beyer, Hawthorne L; St Clair, Colleen Cassady

    2011-04-01

    The persistence of forest-dependent species in fragmented landscapes is fundamentally linked to the movement of individuals among subpopulations. The paths taken by dispersing individuals can be considered a series of steps built from individual route choices. Despite the importance of these fine-scale movement decisions, it has proved difficult to collect such data that reveal how forest birds move in novel landscapes. We collected unprecedented route information about the movement of translocated forest birds from two species in the highly fragmented tropical dry forest of Costa Rica. In this pasture-dominated landscape, forest remains in patches or riparian corridors, with lesser amounts of living fencerows and individual trees or "stepping stones." We used step selection functions to quantify how route choice was influenced by these habitat elements. We found that the amount of risk these birds were willing to take by crossing open habitat was context dependent. The forest-specialist Barred Antshrike (Thamnophilus doliatus) exhibited stronger selection for forested routes when moving in novel landscapes distant from its territory relative to locations closer to its territory. It also selected forested routes when its step originated in forest habitat. It preferred steps ending in stepping stones when the available routes had little forest cover, but avoided them when routes had greater forest cover. The forest-generalist Rufous-naped Wren (Campylorhynchus rufinucha) preferred steps that contained more pasture, but only when starting from non-forest habitats. Our results showed that forested corridors (i.e., riparian corridors) best facilitated the movement of a sensitive forest specialist through this fragmented landscape. They also suggested that stepping stones can be important in highly fragmented forests with little remaining forest cover. 
We expect that naturally dispersing birds and species with greater forest dependence would exhibit even stronger selection for forested routes than did the birds in our experiments.
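
    A step selection function of the kind used here compares each observed step against available alternatives, typically via a conditional-logit form. The covariates and coefficients below are hypothetical, chosen only to illustrate the mechanics, not fitted values from the study:

```python
import math

# Hedged sketch of a step selection function evaluated on one choice set.

def step_probabilities(steps, beta):
    # steps: covariate dicts for one choice set (observed + available steps)
    scores = [math.exp(sum(beta[k] * s[k] for k in beta)) for s in steps]
    total = sum(scores)
    return [sc / total for sc in scores]

# Hypothetical selection for forest cover and against open-habitat exposure
beta = {"forest_cover": 2.0, "open_distance": -1.0}
choice_set = [
    {"forest_cover": 0.9, "open_distance": 0.1},  # forested route
    {"forest_cover": 0.1, "open_distance": 0.8},  # open pasture route
]
probs = step_probabilities(choice_set, beta)
```

    In practice the coefficients are estimated by conditional logistic regression over many observed steps, each paired with its set of available alternatives.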

  6. Bivalves: From individual to population modelling

    NASA Astrophysics Data System (ADS)

    Saraiva, S.; van der Meer, J.; Kooijman, S. A. L. M.; Ruardij, P.

    2014-11-01

    An individual based population model for bivalves was designed, built and tested in a 0D approach, to simulate the population dynamics of a mussel bed located in an intertidal area. The processes at the individual level were simulated following the dynamic energy budget theory, whereas initial egg mortality, background mortality, food competition, and predation (including cannibalism) were additional population processes. Model properties were studied through the analysis of theoretical scenarios and by simulation of different mortality parameter combinations in a realistic setup, imposing environmental measurements. Realistic criteria were applied to narrow down the possible combination of parameter values. Field observations obtained in the long-term and multi-station monitoring program were compared with the model scenarios. The realistically selected modeling scenarios were able to reasonably reproduce the timing of some peaks in the individual abundances in the mussel bed and its size distribution, but the number of individuals was not well predicted. The results suggest that the mortality in the early life stages (egg and larvae) plays an important role in population dynamics, either by initial egg mortality, larvae dispersion, settlement failure or shrimp predation. Future steps include the coupling of the population model with a hydrodynamic and biogeochemical model to improve the simulation of egg/larvae dispersion, settlement probability, food transport and also to simulate the feedback of the organisms' activity on the water column properties, which will result in an improvement of the food quantity and quality characterization.
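
    The individual-based structure described above can be caricatured in a few lines. The fixed growth increment is a crude stand-in for the DEB model, and all parameters are invented for illustration:

```python
import random

# Toy individual-based sketch: each individual grows by a fixed increment
# and faces a background mortality probability at every time step.

def simulate(n0, steps, growth=0.1, mortality=0.05, seed=42):
    rng = random.Random(seed)
    sizes = [1.0] * n0  # initial individual sizes
    for _ in range(steps):
        # survivors of this step grow; the rest are removed
        sizes = [s + growth for s in sizes if rng.random() > mortality]
    return sizes

population = simulate(n0=100, steps=10)
```

    A full model would replace the growth increment with DEB state equations and add the stage-specific mortality, competition, and predation terms listed in the abstract.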

  7. Performance Assessment as a Diagnostic Tool for Science Teachers

    NASA Astrophysics Data System (ADS)

    Kruit, Patricia; Oostdam, Ron; van den Berg, Ed; Schuitema, Jaap

    2018-04-01

    Information on students' development of science skills is essential for teachers to evaluate and improve their own education, as well as to provide adequate support and feedback to the learning process of individual students. The present study explores and discusses the use of performance assessments as a diagnostic tool for formative assessment to inform teachers and guide instruction of science skills in primary education. Three performance assessments were administered to more than 400 students in grades 5 and 6 of primary education. Students performed small experiments using real materials while following the different steps of the empirical cycle. The mutual relationship between the three performance assessments is examined to provide evidence for the value of performance assessments as useful tools for formative evaluation. Differences in response patterns are discussed, and the diagnostic value of performance assessments is illustrated with examples of individual student performances. Findings show that the performance assessments were difficult for grades 5 and 6 students but that much individual variation exists regarding the different steps of the empirical cycle. Evaluation of scores as well as a more substantive analysis of students' responses provided insight into typical errors that students make. It is concluded that performance assessments can be used as a diagnostic tool for monitoring students' skill performance as well as to support teachers in evaluating and improving their science lessons.

  8. Facilitating improvements in laboratory report writing skills with less grading: a laboratory report peer-review process.

    PubMed

    Brigati, Jennifer R; Swann, Jerilyn M

    2015-05-01

    Incorporating peer-review steps in the laboratory report writing process provides benefits to students, but it also can create additional work for laboratory instructors. The laboratory report writing process described here allows the instructor to grade only one lab report for every two to four students, while giving the students the benefits of peer review and prompt feedback on their laboratory reports. Here we present the application of this process to a sophomore level genetics course and a freshman level cellular biology course, including information regarding class time spent on student preparation activities, instructor preparation, prerequisite student knowledge, suggested learning outcomes, procedure, materials, student instructions, faculty instructions, assessment tools, and sample data. T-tests comparing individual and group grading of the introductory cell biology lab reports yielded average scores that were not significantly different from each other (p = 0.13, n = 23 for individual grading, n = 6 for group grading). T-tests also demonstrated that average laboratory report grades of students using the peer-review process were not significantly different from those of students working alone (p = 0.98, n = 9 for individual grading, n = 6 for pair grading). While the grading process described here does not lead to statistically significant gains (or reductions) in student learning, it allows student learning to be maintained while decreasing instructor workload. This reduction in workload could allow the instructor time to pursue other high-impact practices that have been shown to increase student learning. Finally, we suggest possible modifications to the procedure for application in a variety of settings.

  9. Electric field-induced emission enhancement and modulation in individual CdSe nanowires.

    PubMed

    Vietmeyer, Felix; Tchelidze, Tamar; Tsou, Veronica; Janko, Boldizsar; Kuno, Masaru

    2012-10-23

    CdSe nanowires show reversible emission intensity enhancements when subjected to electric field strengths ranging from 5 to 22 MV/m. Under alternating positive and negative biases, emission intensity modulation depths of 14 ± 7% are observed. Individual wires are studied by placing them in parallel plate capacitor-like structures and monitoring their emission intensities via single nanostructure microscopy. Observed emission sensitivities are rationalized by the field-induced modulation of carrier detrapping rates from NW defect sites responsible for nonradiative relaxation processes. The exclusion of these states from subsequent photophysics leads to observed photoluminescence quantum yield enhancements. We quantitatively explain the phenomenon by developing a kinetic model to account for field-induced variations of carrier detrapping rates. The observed phenomenon allows direct visualization of trap state behavior in individual CdSe nanowires and represents a first step toward developing new optical techniques that can probe defects in low-dimensional materials.

  10. Justificatory Information Forefending in Digital Age: Self-Sealing Informational Conviction of Risky Health Behavior.

    PubMed

    Kim, Jeong-Nam; Oh, Yu Won; Krishna, Arunima

    2018-01-01

    This study proposes the idea of justificatory information forefending, a cognitive process by which individuals accept information that confirms their preexisting health beliefs, and reject information that is dissonant with their attitudes. In light of the sheer volume of often contradictory information related to health that is frequently highlighted by the traditional media, this study sought to identify antecedents and outcomes of this justificatory information forefending. Results indicate that individuals who are exposed to contradictory health information, currently engage in risky health behavior, are comfortable using the Internet to search for information, and are currently taking steps to maintain their health are likely to actively select health information that confirms their preexisting notions about their health, and to reject information that is contradictory to their beliefs. Additionally, individuals who engage in justificatory information forefending were also found to continue to engage in risky health behavior. Implications for theory and practice are discussed.

  11. A Novel and Simple Spike Sorting Implementation.

    PubMed

    Petrantonakis, Panagiotis C; Poirazi, Panayiota

    2017-04-01

Monitoring the activity of multiple individual neurons that fire spikes in the vicinity of an electrode, namely performing a Spike Sorting (SS) procedure, comprises one of the most important tools of contemporary neuroscience for reverse-engineering the brain. As recording-electrode technology rapidly evolves by integrating thousands of electrodes in a confined spatial setting, the algorithms used to monitor individual neurons from recorded signals have to become even more reliable and computationally efficient. In this work, we propose a novel framework for the SS approach in which a single-step processing of the raw (unfiltered) extracellular signal is sufficient for both the detection and the sorting of the activity of individual neurons. Despite its simplicity, the proposed approach exhibits performance comparable to state-of-the-art approaches, especially for spike detection in noisy signals, and paves the way for a new family of SS algorithms with the potential for multi-recording, fast, on-chip implementations.
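The spike-detection half of such a pipeline can be illustrated with a generic amplitude-threshold detector. This is not the authors' algorithm, only a minimal sketch: the threshold factor `k`, the refractory gap, and the toy trace are arbitrary assumptions.

```python
import statistics

def detect_spikes(signal, k=4.0, refractory=30):
    """Flag samples exceeding k robust standard deviations above the
    median, with a refractory gap between detected events."""
    med = statistics.median(signal)
    mad = statistics.median(abs(x - med) for x in signal)
    sigma = mad / 0.6745                # MAD -> std for Gaussian noise
    threshold = med + k * sigma
    spikes, last = [], -refractory
    for i, x in enumerate(signal):
        if x > threshold and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes

# Deterministic pseudo-noise trace with two large deflections added.
trace = [((i * 37) % 11 - 5) * 0.05 for i in range(300)]
trace[50] += 5.0
trace[200] += 5.0
print(detect_spikes(trace))             # only the two events stand out
```

A real implementation would detect on both polarities and extract waveform snippets for the sorting stage; the threshold logic above is the core idea.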

  12. Evolutionary fate of memory-one strategies in repeated prisoner's dilemma game in structured populations

    NASA Astrophysics Data System (ADS)

    Liu, Xu-Sheng; Wu, Zhi-Xi; Chen, Michael Z. Q.; Guan, Jian-Yue

    2017-07-01

We study an evolutionary spatial prisoner's dilemma game in which individuals employ a one-step memory mechanism when updating their strategies. In particular, during strategy updating each individual keeps in mind all the outcomes of the action pairs adopted by himself and each of his neighbors in the last interaction, and on this basis the individuals decide what actions to take in the next round. Computer simulation results imply that a win-stay-lose-shift-like strategy wins out of the memory-one strategy set in the stationary state. This result is robust over a large range of the payoff parameter and does not depend on the initial state of the system. Furthermore, theoretical analysis with mean-field and quasi-static approximations predicts the same result. Thus, our studies suggest that a win-stay-lose-shift-like strategy is a stable dominant strategy in the repeated prisoner's dilemma game in homogeneous structured populations.

  13. Algorithmic Mechanism Design of Evolutionary Computation.

    PubMed

    Pei, Yan

    2015-01-01

We consider the algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by the evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly in order to achieve the desired, preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to treat evolutionary computation design as an algorithmic mechanism design problem and to establish its fundamentals from this perspective. This paper is a first step towards that objective, implementing a strategy equilibrium solution (such as the Nash equilibrium) in an evolutionary computation algorithm.

  14. Algorithmic Mechanism Design of Evolutionary Computation

    PubMed Central

    2015-01-01

We consider the algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by the evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly in order to achieve the desired, preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to treat evolutionary computation design as an algorithmic mechanism design problem and to establish its fundamentals from this perspective. This paper is a first step towards that objective, implementing a strategy equilibrium solution (such as the Nash equilibrium) in an evolutionary computation algorithm. PMID:26257777

  15. Development of an impairment-based individualized treatment workflow using an iPad-based software platform.

    PubMed

    Kiran, Swathi; Des Roches, Carrie; Balachandran, Isabel; Ascenso, Elsa

    2014-02-01

Individuals with language and cognitive deficits following brain damage likely require long-term rehabilitation. Consequently, it is a huge practical problem to provide the continued communication therapy that these individuals require. The present project describes the development of an impairment-based individualized treatment workflow using a software platform called Constant Therapy. This article is organized into two sections. We will first describe the general methods of the treatment workflow for patients involved in this study. There are four steps in this process: (1) the patient's impairment is assessed using standardized tests, (2) the patient is assigned a specific and individualized treatment plan, (3) the patient practices the therapy at home and at the clinic, and (4) the clinician and the patient can analyze the results of the patient's performance remotely and monitor and alter the treatment plan accordingly. The second section provides four case studies that provide a representative sample of participants progressing through their individualized treatment plan. The preliminary results of the patient treatment provide encouraging evidence for the feasibility of a rehabilitation program for individuals with brain damage based on the iPad (Apple Inc., Cupertino, CA).

  16. Individualized In-Service Teacher Education. (Project IN-STEP). Evaluation Report, Phase II.

    ERIC Educational Resources Information Center

    Thurber, John C.

    Phase 2 of Project IN-STEP was conducted to revise, refine, and conduct further field testing of a new inservice teacher education model. The method developed (in Phase 1--see ED 003 905 for report) is an individualized, multi-media approach. Revision activities, based on feedback provided for Phase 1, include the remaking of six videotape…

  17. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, the multi-dimensional combination of operational variables were studied to quantify the impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.
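Process characterization of the kind described above studies multi-dimensional combinations of operational variables; a full-factorial grid of such combinations can be enumerated as below. The factor names and ranges are hypothetical, not the study's actual HIC parameters.

```python
from itertools import product

# Hypothetical HIC operating ranges (low, centre, high) to characterize.
factors = {
    "load_conductivity_mS_cm": (80, 95, 110),
    "load_pH": (6.5, 7.0, 7.5),
    "protein_load_g_L": (20, 30, 40),
}

# Every combination of factor levels becomes one characterization run.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))          # 3^3 = 27 runs
print(runs[0])
```

In practice a fractional-factorial or DOE design would trim this grid to the runs needed to resolve main effects and key interactions.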

  18. Precise turnaround time measurement of laboratory processes using radiofrequency identification technology.

    PubMed

    Mayer, Horst; Brümmer, Jens; Brinkmann, Thomas

    2011-01-01

To implement Lean Six Sigma in our central laboratory, we conducted a project to measure the single pre-analytical steps influencing the turnaround time (TAT) of emergency department (ED) serum samples. The traditional approach of extracting data from the Laboratory Information System (LIS) for a retrospective calculation of a mean TAT is not suitable for this purpose. Therefore, we used radiofrequency identification (RFID) chips for real-time tracking of individual samples at every pre-analytical step. 1,200 serum tubes were labelled with RFID chips and were provided to the emergency department. Three RFID receivers were installed in the laboratory: at the outlet of the pneumatic tube system, at the centrifuge, and in the analyser area. In addition, time stamps of sample entry at the automated sample distributor and of result communication from the analyser were collected from the LIS. 1,023 labelled serum tubes arrived at our laboratory, and 899 RFID tags were used for TAT calculation. The following transfer times were determined (median/95th percentile in min:sec): pneumatic tube system --> centrifuge (01:25/04:48), centrifuge --> sample distributor (14:06/5:33), sample distributor --> analysis system zone (02:39/15:07), analysis system zone --> result communication (12:42/22:21). Total TAT was 33:19/57:40 min:sec. Manual processes around centrifugation were identified as a major part of TAT, accounting for 44%/60% (median/95th percentile). RFID proved a robust, easy-to-use, and error-free technology that is not susceptible to interferences in the laboratory environment. With this study design we were able to measure significant variations in a single manual sample-transfer process. We showed that TAT is mainly influenced by manual steps around the centrifugation process and concluded that centrifugation should be integrated into solutions for total laboratory automation.
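The median/95th-percentile statistics reported for each step can be computed from per-step durations as in this sketch; the transit times below are invented for illustration, not the study's data.

```python
import statistics

def step_stats(durations_s):
    """Median and 95th percentile of one pre-analytical step, in seconds."""
    med = statistics.median(durations_s)
    p95 = statistics.quantiles(durations_s, n=20)[-1]   # 19th cut = 95th pct
    return med, p95

# Hypothetical tube-transit times (s) between two RFID checkpoints.
transit = [70, 85, 92, 88, 75, 110, 95, 260, 80, 90, 78, 84, 99, 105, 71,
           83, 87, 93, 76, 82]
med, p95 = step_stats(transit)
print(med, p95)
```

The single 260 s outlier barely moves the median but dominates the 95th percentile, which is why the study tracks both statistics per step.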

  19. An Original Stepwise Multilevel Logistic Regression Analysis of Discriminatory Accuracy: The Case of Neighbourhoods and Health

    PubMed Central

    Wagner, Philippe; Ghith, Nermin; Leckie, George

    2016-01-01

Background and Aim Many multilevel logistic regression analyses of “neighbourhood and health” focus on interpreting measures of association (e.g., the odds ratio, OR). In contrast, multilevel analysis of variance is rarely considered. We propose an original stepwise analytical approach that distinguishes between “specific” (measures of association) and “general” (measures of variance) contextual effects. Performing two empirical examples, we illustrate the methodology, interpret the results and discuss the implications of this kind of analysis in public health. Methods We analyse 43,291 individuals residing in 218 neighbourhoods in the city of Malmö, Sweden, in 2006. We study two individual outcomes (psychotropic drug use and choice of private vs. public general practitioner, GP) for which the relative importance of neighbourhood as a source of individual variation differs substantially. In Step 1 of the analysis, we evaluate the OR and the area under the receiver operating characteristic (AUC) curve for individual-level covariates (i.e., age, sex and individual low income). In Step 2, we assess general contextual effects using the AUC. Finally, in Step 3 the OR for a specific neighbourhood characteristic (i.e., neighbourhood income) is interpreted jointly with the proportional change in variance (PCV) and the proportion of ORs in the opposite direction (POOR) statistics. Results For both outcomes, information on individual characteristics (Step 1) provided low discriminatory accuracy (AUC = 0.616 for psychotropic drug use; AUC = 0.600 for choosing a private GP). Accounting for neighbourhood of residence (Step 2) only improved the AUC for choosing a private GP (+0.295 units). High neighbourhood income (Step 3) was strongly associated with choosing a private GP (OR = 3.50), but the PCV was only 11% and the POOR 33%. 
Conclusion Applying an innovative stepwise multilevel analysis, we observed that, in Malmö, the neighbourhood context per se had a negligible influence on individual use of psychotropic drugs but appeared to strongly condition individual choice of a private GP. However, the latter was only modestly explained by the socioeconomic circumstances of the neighbourhoods. Our analyses are based on real data and provide useful information for understanding neighbourhood-level influences in general, and on individual use of psychotropic drugs and choice of GP in particular. However, our primary aim is to illustrate how to perform and interpret a multilevel analysis of individual heterogeneity in social epidemiology and public health. Our study shows that neighbourhood “effects” are not properly quantified by reporting differences between neighbourhood averages but rather by measuring the share of the individual heterogeneity that exists at the neighbourhood level. PMID:27120054
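The AUC used in Steps 1 and 2 can be computed directly as the Mann-Whitney probability that a randomly drawn case outscores a randomly drawn non-case. A minimal sketch, with invented risk scores rather than the study's model predictions:

```python
def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a randomly drawn case
    outscores a randomly drawn non-case; ties count one half."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted risks for cases and non-cases.
cases, noncases = [0.9, 0.6, 0.6, 0.4], [0.5, 0.6, 0.3, 0.2]
print(auc(cases, noncases))   # -> 0.8125
```

This pairwise definition is what makes the AUC a natural "general contextual effect" measure: adding neighbourhood information improves it only if neighbourhood membership actually helps discriminate individuals.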

  20. An implementation framework for the feedback of individual research results and incidental findings in research.

    PubMed

    Thorogood, Adrian; Joly, Yann; Knoppers, Bartha Maria; Nilsson, Tommy; Metrakos, Peter; Lazaris, Anthoula; Salman, Ayat

    2014-12-23

    This article outlines procedures for the feedback of individual research data to participants. This feedback framework was developed in the context of a personalized medicine research project in Canada. Researchers in this domain have an ethical obligation to return individual research results and/or material incidental findings that are clinically significant, valid and actionable to participants. Communication of individual research data must proceed in an ethical and efficient manner. Feedback involves three procedural steps: assessing the health relevance of a finding, re-identifying the affected participant, and communicating the finding. Re-identification requires researchers to break the code in place to protect participant identities. Coding systems replace personal identifiers with a numerical code. Double coding systems provide added privacy protection by separating research data from personal identifying data with a third "linkage" database. A trusted and independent intermediary, the "keyholder", controls access to this linkage database. Procedural guidelines for the return of individual research results and incidental findings are lacking. This article outlines a procedural framework for the three steps of feedback: assessment, re-identification, and communication. This framework clarifies the roles of the researcher, Research Ethics Board, and keyholder in the process. The framework also addresses challenges posed by coding systems. Breaking the code involves privacy risks and should only be carried out in clearly defined circumstances. Where a double coding system is used, the keyholder plays an important role in balancing the benefits of individual feedback with the privacy risks of re-identification. Feedback policies should explicitly outline procedures for the assessment of findings, and the re-identification and contact of participants. The responsibilities of researchers, the Research Ethics Board, and the keyholder must be clearly defined. 
We provide general guidelines for keyholders involved in feedback. We also recommend that Research Ethics Boards should not be directly involved in the assessment of individual findings. Hospitals should instead establish formal, interdisciplinary clinical advisory committees to help researchers determine whether or not an uncertain finding should be returned.
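The double-coding arrangement described above can be sketched as two separated databases joined by a keyholder-controlled linkage table. All database names, codes, and records here are hypothetical, and a real system would add audit logging and REB documentation.

```python
# Researchers hold only coded data; the keyholder alone holds the linkage.
research_db = {"R-104": {"finding": "clinically significant variant",
                         "actionable": True}}
linkage_db = {"R-104": "P-77"}          # research code -> personal code
identity_db = {"P-77": {"name": "…", "contact": "…"}}  # held outside research

def reidentify(research_code, approved_by_reb):
    """Keyholder breaks the code only for an approved, actionable finding."""
    finding = research_db[research_code]
    if not (approved_by_reb and finding["actionable"]):
        raise PermissionError("re-identification not justified")
    return identity_db[linkage_db[research_code]]

print(reidentify("R-104", approved_by_reb=True))
```

Keeping `identity_db` physically and administratively separate from `research_db` is what makes the privacy guarantee meaningful: a breach of the research data alone re-identifies no one.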

  1. Moves Management for physician fundraising in a capital campaign.

    PubMed

    Lehner, Larry K

    2005-01-01

Hospitals are turning to philanthropy as a significant source of funding for capital programs, and physicians are a key resource. Through their own giving and their community-wide influence, physicians provide a high return on capital campaign investment. By adapting Moves Management, the premier method for prospecting and cultivation, development officers can achieve a high rate of participation by the hospital's physicians and, through them, attain increased community giving. Moves Management is defined as a process that involves managing a series of steps (moves) with identified prospects (the 10 percent who can give 90 percent). The number and type of steps depend upon the individual involved, such that each prospect is moved from attention to interest to desire to action, and then back to interest, until he or she has given everything he or she will or can give to the organization.

  2. A two-step approach for mining patient treatment pathways in administrative healthcare databases.

    PubMed

    Najjar, Ahmed; Reinharz, Daniel; Girouard, Catherine; Gagné, Christian

    2018-05-01

    Clustering electronic medical records allows the discovery of information on healthcare practices. Entries in such medical records are usually composed of a succession of diagnostics or therapeutic steps. The corresponding processes are complex and heterogeneous since they depend on medical knowledge integrating clinical guidelines, the physician's individual experience, and patient data and conditions. To analyze such data, we are first proposing to cluster medical visits, consultations, and hospital stays into homogeneous groups, and then to construct higher-level patient treatment pathways over these different groups. These pathways are then also clustered to distill typical pathways, enabling interpretation of clusters by experts. This approach is evaluated on a real-world administrative database of elderly people in Québec suffering from heart failures. Copyright © 2018 Elsevier B.V. All rights reserved.
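The two-step approach (first grouping individual medical events into homogeneous groups, then building and distilling higher-level pathways over them) can be sketched as follows. The visit codes and groupings are invented, and a simple frequency count stands in for the clustering algorithms used in the paper.

```python
from collections import Counter

# Step 1 (sketch): map raw visit records to homogeneous event groups.
# The real work clusters visits on diagnoses and acts; a lookup stands in.
event_group = {"ER": "emergency", "GP": "ambulatory", "CARD": "ambulatory",
               "HOSP": "inpatient"}

visits = {                       # hypothetical patient -> ordered visit codes
    "p1": ["GP", "ER", "HOSP", "GP"],
    "p2": ["GP", "ER", "HOSP", "GP"],
    "p3": ["CARD", "GP"],
}

# Step 2: build higher-level pathways over the groups, then distill the
# typical ones by frequency (a stand-in for clustering the pathways).
pathways = {p: tuple(event_group[v] for v in vs) for p, vs in visits.items()}
typical = Counter(pathways.values()).most_common()
print(typical[0])
```

Exact-match counting only works for identical sequences; the paper's pathway clustering is what lets near-identical trajectories fall into the same interpretable group.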

  3. Mechanistic details for cobalt catalyzed photochemical hydrogen production in aqueous solution: Efficiencies of the photochemical and non-photochemical steps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shan, Bing; Baine, Teera; Ma, Xuan Anh N.

    2013-04-17

The use of sunlight to drive chemical reactions that lead to the reduction of water to produce hydrogen is a potential avenue of solar energy utilization. There are many individual steps that take place in this process. This paper reports the investigation of a particular system that involves light-absorbing molecules, electron-donating agents, and a catalyst for water reduction to hydrogen. We evaluated the efficiency of the light-induced formation of a strong electron donor, the use of this donor to reduce the catalyst, and finally the efficiency of the catalyst in producing hydrogen from water. From this, the sources of efficiency loss could be clearly identified and used in the design of better systems to produce hydrogen from water.

  4. Process evaluation to explore internal and external validity of the "Act in Case of Depression" care program in nursing homes.

    PubMed

    Leontjevas, Ruslan; Gerritsen, Debby L; Koopmans, Raymond T C M; Smalbrugge, Martin; Vernooij-Dassen, Myrra J F J

    2012-06-01

A multidisciplinary, evidence-based care program to improve the management of depression in nursing home residents, "Act in case of Depression" (AiD), was implemented and tested using a stepped-wedge design in 23 nursing homes (NHs). Before the effect analyses, the aim was to evaluate AiD process data on sampling quality (recruitment and randomization, reach) and intervention quality (relevance and feasibility, and the extent to which AiD was performed), which can be used for understanding internal and external validity. In this article, a model is presented that divides process evaluation data into first- and second-order process data. Qualitative and quantitative data based on personal files of residents, interviews with nursing home professionals, and a research database were analyzed according to the following process evaluation components: sampling quality and intervention quality. Setting: nursing homes. The pattern of residents' informed consent rates differed between dementia special care units and somatic units during the study. The nursing home staff were satisfied with the AiD program and reported that it was feasible and relevant. With the exception of the first screening step (nursing staff members using a short observer-based depression scale), AiD components were not performed fully by NH staff as prescribed in the AiD protocol. Although NH staff found the program relevant and feasible and were satisfied with its content, individual AiD components may differ in feasibility. The results on sampling quality implied that statistical analyses of AiD effectiveness should account for the type of unit, whereas the findings on intervention quality implied that, next to the type of unit, analyses should account for the extent to which individual AiD program components were performed. In general, our first-order process data evaluation confirmed the internal and external validity of the AiD trial, and this evaluation enabled further statistical fine-tuning. 
The importance of evaluating the first-order process data before executing statistical effect analyses is thus underlined. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.

  5. Effects of toe-in and toe-in with wider step width on level walking knee biomechanics in varus, valgus, and neutral knee alignments.

    PubMed

    Bennett, Hunter J; Shen, Guangping; Cates, Harold E; Zhang, Songning

    2017-12-01

Increased peak external knee adduction moments exist for individuals with knee osteoarthritis and varus knee alignments compared to healthy and neutrally aligned counterparts. Walking with increased toe-in or increased step width has been used to successfully reduce the 1st and 2nd peak knee adduction moments, respectively, but the two modifications have not previously been combined or tested across all alignment groups. The purpose of this study was to compare toe-in only and toe-in with wider step width gait modifications in individuals with neutral, valgus, and varus alignments. Thirty-eight healthy participants with varus, neutral, or valgus frontal-plane knee alignment, confirmed through anteroposterior radiographs, performed level walking with normal, toe-in, and toe-in with wider step width gaits. A 3×3 (group × intervention) mixed-model repeated-measures ANOVA compared alignment groups and gait interventions (p < 0.05). The 1st peak knee adduction moment was reduced in both toe-in and toe-in with wider step width compared to normal gait. The 2nd peak adduction moment was increased in toe-in compared to normal and toe-in with wider step width. The adduction impulse was also reduced in toe-in and toe-in with wider step width compared to normal gait. Peak knee flexion and external rotation moments were increased in toe-in and toe-in with wider step width compared to normal gait. Although toe-in with wider step width appears to be a viable option for reducing peak adduction moments in varus alignments, sagittal- and transverse-plane knee loading should be monitored when implementing this gait modification strategy. Copyright © 2017 Elsevier B.V. All rights reserved.
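The outcome measures discussed (1st and 2nd peak adduction moments and the adduction impulse) can be extracted from a stance-phase waveform as in this sketch. The waveform values and the half-split definition of the two peaks are illustrative assumptions, not the study's processing pipeline.

```python
def peaks_and_impulse(kam, dt=0.01):
    """First and second peaks (max over each half of stance) and the
    adduction impulse (trapezoidal time integral) of a KAM waveform."""
    half = len(kam) // 2
    p1, p2 = max(kam[:half]), max(kam[half:])
    impulse = sum((a + b) * dt / 2 for a, b in zip(kam, kam[1:]))
    return p1, p2, impulse

# Illustrative biphasic stance-phase waveform sampled at 100 Hz (Nm/kg).
kam = [0.0, 0.2, 0.5, 0.4, 0.3, 0.35, 0.45, 0.25, 0.1, 0.0]
print(peaks_and_impulse(kam))
```

The impulse captures cumulative loading over stance, which is why a modification can lower both peaks yet leave total joint loading largely unchanged if stance time lengthens.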

  6. 3D Tree Dimensionality Assessment Using Photogrammetry and Small Unmanned Aerial Vehicles

    PubMed Central

    2015-01-01

    Detailed, precise, three-dimensional (3D) representations of individual trees are a prerequisite for an accurate assessment of tree competition, growth, and morphological plasticity. Until recently, our ability to measure the dimensionality, spatial arrangement, shape of trees, and shape of tree components with precision has been constrained by technological and logistical limitations and cost. Traditional methods of forest biometrics provide only partial measurements and are labor intensive. Active remote technologies such as LiDAR operated from airborne platforms provide only partial crown reconstructions. The use of terrestrial LiDAR is laborious, has portability limitations and high cost. In this work we capitalized on recent improvements in the capabilities and availability of small unmanned aerial vehicles (UAVs), light and inexpensive cameras, and developed an affordable method for obtaining precise and comprehensive 3D models of trees and small groups of trees. The method employs slow-moving UAVs that acquire images along predefined trajectories near and around targeted trees, and computer vision-based approaches that process the images to obtain detailed tree reconstructions. After we confirmed the potential of the methodology via simulation we evaluated several UAV platforms, strategies for image acquisition, and image processing algorithms. We present an original, step-by-step workflow which utilizes open source programs and original software. We anticipate that future development and applications of our method will improve our understanding of forest self-organization emerging from the competition among trees, and will lead to a refined generation of individual-tree-based forest models. PMID:26393926

  7. 3D Tree Dimensionality Assessment Using Photogrammetry and Small Unmanned Aerial Vehicles.

    PubMed

    Gatziolis, Demetrios; Lienard, Jean F; Vogs, Andre; Strigul, Nikolay S

    2015-01-01

    Detailed, precise, three-dimensional (3D) representations of individual trees are a prerequisite for an accurate assessment of tree competition, growth, and morphological plasticity. Until recently, our ability to measure the dimensionality, spatial arrangement, shape of trees, and shape of tree components with precision has been constrained by technological and logistical limitations and cost. Traditional methods of forest biometrics provide only partial measurements and are labor intensive. Active remote technologies such as LiDAR operated from airborne platforms provide only partial crown reconstructions. The use of terrestrial LiDAR is laborious, has portability limitations and high cost. In this work we capitalized on recent improvements in the capabilities and availability of small unmanned aerial vehicles (UAVs), light and inexpensive cameras, and developed an affordable method for obtaining precise and comprehensive 3D models of trees and small groups of trees. The method employs slow-moving UAVs that acquire images along predefined trajectories near and around targeted trees, and computer vision-based approaches that process the images to obtain detailed tree reconstructions. After we confirmed the potential of the methodology via simulation we evaluated several UAV platforms, strategies for image acquisition, and image processing algorithms. We present an original, step-by-step workflow which utilizes open source programs and original software. We anticipate that future development and applications of our method will improve our understanding of forest self-organization emerging from the competition among trees, and will lead to a refined generation of individual-tree-based forest models.

  8. Neuronal differentiation of human mesenchymal stem cells in response to the domain size of graphene substrates.

    PubMed

    Lee, Yoo-Jung; Seo, Tae Hoon; Lee, Seula; Jang, Wonhee; Kim, Myung Jong; Sung, Jung-Suk

    2018-01-01

    Graphene is a noncytotoxic monolayer platform with unique physical, chemical, and biological properties. It has been demonstrated that a graphene substrate may provide a promising biocompatible scaffold for stem cell therapy. Because chemical-vapor-deposited graphene has a two-dimensional polycrystalline structure, it is important to control the individual domain size to obtain desirable properties for the nanomaterial. However, the biological effects mediated by differences in graphene domain size have not yet been reported. On the basis of the control of graphene domain size achieved by one-step growth (1step-G, small domain) and two-step growth (2step-G, large domain) processes, we found that the neuronal differentiation of bone marrow-derived human mesenchymal stem cells (hMSCs) depended strongly on the graphene domain size. The density of defects at the domain boundaries was higher (×8.5) in 1step-G graphene, which also had a lower (by 13%) water-droplet contact angle than 2step-G graphene, leading to enhanced cell-substrate adhesion and upregulated neuronal differentiation of hMSCs. We confirmed that the strong interactions between cells and the defects at the domain boundaries in 1step-G graphene arise from the relatively high surface energy of these defects, and are stronger than the interactions between cells and the surrounding graphene surfaces. Our results may provide valuable information for the development of graphene-based scaffolds by clarifying which properties of the graphene domain influence cell adhesion efficacy and stem cell differentiation. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 106A: 43-51, 2018.

  9. Crowdsourced Curriculum Development for Online Medical Education.

    PubMed

    Shappell, Eric; Chan, Teresa M; Thoma, Brent; Trueger, N Seth; Stuntz, Bob; Cooney, Robert; Ahn, James

    2017-12-08

    In recent years online educational content, efforts at quality appraisal, and integration of online material into institutional teaching initiatives have increased. However, medical education has yet to develop large-scale online learning centers. Crowd-sourced curriculum development may expedite the realization of this potential while providing opportunities for innovation and scholarship. This article describes the current landscape, best practices, and future directions for crowdsourced curriculum development using Kern's framework for curriculum development and the example topic of core content in emergency medicine. A scoping review of online educational content was performed by a panel of subject area experts for each step in Kern's framework. Best practices and recommendations for future development for each step were established by the same panel using a modified nominal group consensus process. The most prevalent curriculum design steps were (1) educational content and (2) needs assessments. Identified areas of potential innovation within these steps included targeting gaps in specific content areas and developing underrepresented instructional methods. Steps in curriculum development without significant representation included (1) articulation of goals and objectives and (2) tools for curricular evaluation. By leveraging the power of the community, crowd-sourced curriculum development offers a mechanism to diffuse the burden associated with creating comprehensive online learning centers. There is fertile ground for innovation and scholarship in each step along the continuum of curriculum development. Realization of this paradigm's full potential will require individual developers to strongly consider how their contributions will align with the work of others.

  10. Crowdsourced Curriculum Development for Online Medical Education

    PubMed Central

    Chan, Teresa M; Thoma, Brent; Trueger, N Seth; Stuntz, Bob; Cooney, Robert; Ahn, James

    2017-01-01

    In recent years online educational content, efforts at quality appraisal, and integration of online material into institutional teaching initiatives have increased. However, medical education has yet to develop large-scale online learning centers. Crowd-sourced curriculum development may expedite the realization of this potential while providing opportunities for innovation and scholarship. This article describes the current landscape, best practices, and future directions for crowdsourced curriculum development using Kern’s framework for curriculum development and the example topic of core content in emergency medicine. A scoping review of online educational content was performed by a panel of subject area experts for each step in Kern’s framework. Best practices and recommendations for future development for each step were established by the same panel using a modified nominal group consensus process. The most prevalent curriculum design steps were (1) educational content and (2) needs assessments. Identified areas of potential innovation within these steps included targeting gaps in specific content areas and developing underrepresented instructional methods. Steps in curriculum development without significant representation included (1) articulation of goals and objectives and (2) tools for curricular evaluation. By leveraging the power of the community, crowd-sourced curriculum development offers a mechanism to diffuse the burden associated with creating comprehensive online learning centers. There is fertile ground for innovation and scholarship in each step along the continuum of curriculum development. Realization of this paradigm’s full potential will require individual developers to strongly consider how their contributions will align with the work of others. PMID:29464134

  11. A Novel Approach to Identifying Physical Markers of Cryo-Damage in Bull Spermatozoa

    PubMed Central

    Yoon, Sung-Jae; Kwon, Woo-Sung; Rahman, Md Saidur; Lee, June-Sub; Pang, Myung-Geol

    2015-01-01

    Cryopreservation is an efficient way to store spermatozoa and plays a critical role in the livestock industry as well as in clinical practice. During cryopreservation, cryo-stress causes substantial damage to spermatozoa. In the present study, the effects of cryo-stress at various cryopreservation steps, such as dilution/cooling, cryoprotectant addition, and freezing, were studied in spermatozoa collected from 9 individual bull testes. The motility (%), motion kinematics, capacitation status, mitochondrial activity, and viability of bovine spermatozoa at each step of the cryopreservation process were assessed using computer-assisted sperm analysis, Hoechst 33258/chlortetracycline fluorescence, rhodamine 123 staining, and the hypo-osmotic swelling test, respectively. The results demonstrate that the cryopreservation steps reduced motility (%), rapid speed (%), and mitochondrial activity, whereas medium/slow speed (%) and the acrosome reaction increased (P < 0.05). Differences (Δ) in the acrosome reaction were higher in the dilution/cooling step (P < 0.05), whereas differences (Δ) in motility, rapid speed, and non-progressive motility were higher during cryoprotectant addition and freezing than during dilution/cooling (P < 0.05). On the other hand, differences (Δ) in mitochondrial activity, viability, and progressive motility were higher in the freezing step (P < 0.05), while the difference (Δ) in the acrosome reaction was higher in dilution/cooling (P < 0.05). Based on these results, we propose that the freezing/thawing steps are the most critical in cryopreservation, and these findings may provide a logical basis for understanding cryo-damage. Moreover, these sperm parameters might be used as physical markers of sperm cryo-damage. PMID:25938413

  12. Accessory stimulus modulates executive function during stepping task

    PubMed Central

    Watanabe, Tatsunori; Koyama, Soichiro; Tanabe, Shigeo

    2015-01-01

    When multiple sensory modalities are simultaneously presented, reaction time can be reduced while interference increases. The purpose of this research was to examine the effects of task-irrelevant acoustic accessory stimuli simultaneously presented with visual imperative stimuli on executive function during stepping. Executive functions were assessed by analyzing temporal events and errors in the initial weight transfer of the postural responses prior to a step (anticipatory postural adjustment errors). Eleven healthy young adults stepped forward in response to a visual stimulus. We applied a choice reaction time task and the Simon task, which consisted of congruent and incongruent conditions. Accessory stimuli were randomly presented with the visual stimuli. Compared with trials without accessory stimuli, the anticipatory postural adjustment error rates were higher in trials with accessory stimuli in the incongruent condition, and the reaction times were shorter in trials with accessory stimuli in all the task conditions. Analyses after division of trials according to whether an anticipatory postural adjustment error occurred or not revealed that the reaction times of trials with anticipatory postural adjustment errors were reduced more than those of trials without anticipatory postural adjustment errors in the incongruent condition. These results suggest that accessory stimuli modulate the initial motor programming of stepping by lowering the decision threshold and, exclusively under spatial incompatibility, facilitate automatic response activation. The present findings advance the knowledge of intersensory judgment processes during stepping and may aid in the development of intervention and evaluation tools for individuals at risk of falls. PMID:25925321

  13. Global phenomena from local rules: Peer-to-peer networks and crystal steps

    NASA Astrophysics Data System (ADS)

    Finkbiner, Amy

    Even simple, deterministic rules can generate interesting behavior in dynamical systems. This dissertation examines some real world systems for which fairly simple, locally defined rules yield useful or interesting properties in the system as a whole. In particular, we study routing in peer-to-peer networks and the motion of crystal steps. Peers can vary by three orders of magnitude in their capacities to process network traffic. This heterogeneity inspires our use of "proportionate load balancing," where each peer provides resources in proportion to its individual capacity. We provide an implementation that employs small, local adjustments to bring the entire network into a global balance. Analytically and through simulations, we demonstrate the effectiveness of proportionate load balancing on two routing methods for de Bruijn graphs, introducing a new "reversed" routing method which performs better than standard forward routing in some cases. The prevalence of peer-to-peer applications prompts companies to locate the hosts participating in these networks. We explore the use of supervised machine learning to identify peer-to-peer hosts, without using application-specific information. We introduce a model for "triples," which exploits information about nearly contemporaneous flows to give a statistical picture of a host's activities. We find that triples, together with measurements of inbound vs. outbound traffic, can capture most of the behavior of peer-to-peer hosts. An understanding of crystal surface evolution is important for the development of modern nanoscale electronic devices. The most commonly studied surface features are steps, which form at low temperatures when the crystal is cut close to a plane of symmetry. Step bunching, when steps arrange into widely separated clusters of tightly packed steps, is one important step phenomenon. 
We analyze a discrete model for crystal steps, in which the motion of each step depends on the two steps on either side of it. We find a time-dependence term in the motion that does not appear in continuum models, and we determine an explicit dependence on step number.
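The idea of step motion governed by the two neighboring steps can be illustrated with a minimal sketch, assuming a hypothetical linear relaxation law (not the dissertation's actual model) in which each interior step moves toward its wider adjacent terrace:

```python
import numpy as np

def step_velocities(x, k=1.0):
    """Velocity of each step from its two neighboring terrace widths.

    x : sorted step positions. The linear law v_i = k * (w_right - w_left)
    is a hypothetical illustration of 'each step depends on the two
    steps on either side of it'; the end steps are held fixed.
    """
    w = np.diff(x)                      # terrace widths
    v = np.zeros_like(x)
    v[1:-1] = k * (w[1:] - w[:-1])      # interior steps only
    return v

def evolve(x, dt=1e-3, n_steps=1000):
    """Explicit-Euler evolution of the step train."""
    x = np.array(x, dtype=float)
    for _ in range(n_steps):
        x = x + dt * step_velocities(x)
    return x

# A slightly bunched step train relaxes toward uniform spacing
x0 = [0.0, 0.8, 2.2, 3.0, 4.0]
xf = evolve(x0)
```

Because this velocity reduces to a discrete Laplacian of the step positions, a bunched train relaxes toward uniform terrace widths; reproducing step bunching itself requires nonlinear or asymmetric attachment terms of the kind the dissertation analyzes.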

  14. Processing and mechanical characterization of alumina laminates

    NASA Astrophysics Data System (ADS)

    Montgomery, John K.

    2002-08-01

    Single-phase ceramics that combine property gradients or steps in monolithic bodies are sought as alternatives to ceramic composites made of dissimilar materials. This work describes novel processing methods to produce stepped-density (or laminated) alumina single-phase bodies that maintain their mechanical integrity. One arrangement consists of a stiff, dense bulk material with a thin, flaw tolerant, porous exterior layer. Another configuration consists of a lightweight, low-density bulk material with a thin, hard, wear resistant exterior layer. Alumina laminates with strong interfaces have been successfully produced in this work using two different direct-casting processes. Gelcasting is a useful near-net shape processing technique that has been combined with several techniques, such as reaction bonding of aluminum oxide and the use of starch as a fugitive filler, to produce stepped-density alumina laminates. The other direct casting process that has been developed in this work is thermoreversible gelcasting (TRG). This is a reversible gelation process that has been used to produce near-net shape dense ceramic bodies. Also, individual layers can be stacked together and heated to produce laminates. Bilayer laminate samples were produced with varied thickness of porous and dense layers. It was shown that due to the difference in modulus and hardness, transverse cracking is found upon Hertzian contact when the dense layer is on the exterior. In the opposite arrangement, compacted damage zones formed in the porous material and no damage occurred in the underlying dense layer. Flaw tolerant behavior of the porous exterior/dense underlayer was examined by measuring biaxial strength as a function of Vickers indentation load. It was found that the thinnest layer of porous material results in the greatest flaw tolerance. Also, higher strength was exhibited at large indentation loads when compared to dense monoliths.
The calculated stresses on the surfaces and interface afforded an explanation of the behavior that failure initiates at the interface between the layers for the thinnest configuration, rather than the sample surface.

  15. Top Ten Reasons for DEOX as a Front End to Pyroprocessing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    B.R. Westphal; K.J. Bateman; S.D. Herrmann

    A front end step is being considered to augment chopping during the treatment of spent oxide fuel by pyroprocessing. The front end step, termed DEOX for its emphasis on decladding via oxidation, employs high temperatures to promote the oxidation of UO2 to U3O8 via an oxygen carrier gas. During oxidation, the spent fuel experiences a 30% increase in lattice structure volume resulting in the separation of fuel from cladding with a reduced particle size. A potential added benefit of DEOX is the removal of fission products, either via direct release from the broken fuel structure or via oxidation and volatilization by the high temperature process. Fuel element chopping is the baseline operation to prepare spent oxide fuel for an electrolytic reduction step. Typical chopping lengths range from 1 to 5 mm for both individual elements and entire assemblies. During electrolytic reduction, uranium oxide is reduced to metallic uranium via a lithium molten salt. An electrorefining step is then performed to separate a majority of the fission products from the recoverable uranium. Although DEOX is based on a low temperature oxidation cycle near 500 °C, additional conditions have been tested to distinguish their effects on the process.[1] Both oxygen and air have been utilized during the oxidation portion followed by vacuum conditions to temperatures as high as 1200 °C. In addition, the effects of cladding on fission product removal have also been investigated with released fuel to temperatures greater than 500 °C.
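The quoted swelling can be sanity-checked from the stoichiometry (3 UO2 + O2 → U3O8) and theoretical densities. The densities below are handbook values assumed for illustration, not taken from the report; they give a molar-volume increase of roughly 36% per mole of uranium, the same order as the ~30% lattice expansion cited:

```python
# Theoretical-density check of the volume change when UO2 oxidizes to
# U3O8 (3 UO2 + O2 -> U3O8). Densities are assumed handbook values:
# rho(UO2) ~ 10.97 g/cm^3, rho(U3O8) ~ 8.38 g/cm^3.
M_U, M_O = 238.03, 16.00           # atomic masses, g/mol
M_UO2 = M_U + 2 * M_O              # 270.03 g/mol
M_U3O8 = 3 * M_U + 8 * M_O         # 842.09 g/mol

v_uo2 = M_UO2 / 10.97              # molar volume per mole of U, cm^3
v_u3o8 = (M_U3O8 / 3) / 8.38       # molar volume per mole of U, cm^3

increase = v_u3o8 / v_uo2 - 1.0    # fractional volume increase, ~0.36
```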

  16. Minefield reconnaissance and detector system

    DOEpatents

    Butler, Millard T.; Cave, Steven P.; Creager, James D.; Johnson, Charles M.; Mathes, John B.; Smith, Kirk J.

    1994-01-01

    A multi-sensor system (10) for detecting the presence of objects on the surface of the ground or buried just under the surface, such as anti-personnel or anti-tank mines or the like. A remote sensor platform (12) has a plurality of metal detector sensors (22) and a plurality of short pulse radar sensors (24). The remote sensor platform (12) is remotely controlled from a processing and control unit (14) and signals from the remote sensor platform (12) are sent to the processing and control unit (14) where they are individually evaluated in separate data analysis subprocess steps (34, 36) to obtain a probability "score" for each of the pluralities of sensors (22, 24). These probability scores are combined in a fusion subprocess step (38) by comparing score sets to a probability table (130) which is derived based upon the historical incidence of object present conditions given that score set. A decision making rule is applied to provide an output which is optionally provided to a marker subprocess (40) for controlling a marker device (76) to mark the location of found objects.
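The fusion subprocess described above, which combines per-sensor probability scores through a probability table derived from historical object-present incidence and then applies a decision rule, might be sketched as follows. The bin counts, table values, and threshold are illustrative assumptions, not the patent's actual data:

```python
# Hypothetical sketch of the fusion subprocess (38): quantize each
# sensor's probability score, look the score set up in a table of
# historical object-present frequencies, then apply a threshold rule.

def quantize(score, n_bins=3):
    """Map a probability score in [0, 1] to a discrete bin index."""
    return min(int(score * n_bins), n_bins - 1)

# P(object present | metal-detector bin, radar bin); values are
# made-up stand-ins for historically derived incidence rates.
PROB_TABLE = {
    (0, 0): 0.02, (0, 1): 0.10, (0, 2): 0.35,
    (1, 0): 0.12, (1, 1): 0.40, (1, 2): 0.70,
    (2, 0): 0.30, (2, 1): 0.75, (2, 2): 0.95,
}

def fuse(metal_score, radar_score, threshold=0.5):
    """Combine two sensor scores; return (probability, mark decision)."""
    key = (quantize(metal_score), quantize(radar_score))
    p = PROB_TABLE[key]
    return p, p >= threshold
```

In the patent's flow, a True decision would be passed on to the marker subprocess (40) to mark the location.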

  17. Psychoacoustic processing of test signals

    NASA Astrophysics Data System (ADS)

    Kadlec, Frantisek

    2003-10-01

    For the quantitative evaluation of electroacoustic system properties and for psychoacoustic testing it is possible to utilize harmonic signals with fixed frequency, sweeping signals, random signals or their combination. This contribution deals with the design of various test signals with emphasis on audible perception. During the digital generation of signals, some additional undesirable frequency components and noise are produced, which are dependent on signal amplitude and sampling frequency. A mathematical analysis describes the origin of this distortion. By proper selection of signal frequency and amplitude it is possible to minimize those undesirable components. An additional step is to minimize the audible perception of this signal distortion by the application of additional noise (dither). For signals intended for listening tests, a dither with a triangular or Gaussian probability density function was found to be most effective. Signals modified this way may be further improved by the application of noise shaping, which transposes those undesirable products into frequency regions where they are perceived less, according to psychoacoustic principles. The efficiency of individual processing steps was confirmed both by measurements and by listening tests. [Work supported by the Czech Science Foundation.]
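The dithering step can be sketched minimally, assuming triangular-PDF (TPDF) dither spanning ±1 LSB, one of the distributions the abstract found most effective; the tone frequency, sampling rate, and word length here are illustrative choices, not the author's test conditions:

```python
import numpy as np

def quantize_with_tpdf_dither(signal, lsb, rng=None):
    """Quantize to step `lsb` after adding triangular-PDF dither.

    TPDF dither spanning +/-1 LSB (the sum of two uniform variables)
    decorrelates the quantization error from the signal, turning
    signal-dependent distortion components into benign broadband noise.
    """
    rng = rng or np.random.default_rng(0)
    dither = (rng.uniform(-0.5, 0.5, signal.shape)
              + rng.uniform(-0.5, 0.5, signal.shape)) * lsb
    return np.round((signal + dither) / lsb) * lsb

# 1 kHz tone sampled at 48 kHz, quantized to a coarse 8-bit-like step
t = np.arange(4800) / 48000.0
x = 0.5 * np.sin(2 * np.pi * 1000.0 * t)
y = quantize_with_tpdf_dither(x, lsb=2 ** -7)
```

With ±1 LSB TPDF dither the worst-case instantaneous error is bounded by 1.5 LSB; noise shaping, as described above, would additionally filter this error toward frequency regions where it is perceived less.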

  18. Pathway of actin filament branch formation by Arp2/3 complex revealed by single-molecule imaging

    PubMed Central

    Smith, Benjamin A.; Daugherty-Clarke, Karen; Goode, Bruce L.; Gelles, Jeff

    2013-01-01

    Actin filament nucleation by actin-related protein (Arp) 2/3 complex is a critical process in cell motility and endocytosis, yet key aspects of its mechanism are unknown due to a lack of real-time observations of Arp2/3 complex through the nucleation process. Triggered by the verprolin homology, central, and acidic (VCA) region of proteins in the Wiskott-Aldrich syndrome protein (WASp) family, Arp2/3 complex produces new (daughter) filaments as branches from the sides of preexisting (mother) filaments. We visualized individual fluorescently labeled Arp2/3 complexes dynamically interacting with and producing branches on growing actin filaments in vitro. Branch formation was strikingly inefficient, even in the presence of VCA: only ∼1% of filament-bound Arp2/3 complexes yielded a daughter filament. VCA acted at multiple steps, increasing both the association rate of Arp2/3 complexes with mother filament and the fraction of filament-bound complexes that nucleated a daughter. The results lead to a quantitative kinetic mechanism for branched actin assembly, revealing the steps that can be stimulated by additional cellular factors. PMID:23292935
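The reported inefficiency, with only ~1% of filament-bound Arp2/3 complexes yielding a daughter filament, can be illustrated with a toy two-step Monte Carlo model; the binding and nucleation probabilities below are illustrative stand-ins, not the paper's fitted rate constants:

```python
import random

def simulate_branching(n_complexes, p_bind, p_nucleate, seed=0):
    """Toy two-step model: bind to a mother filament, then maybe nucleate.

    Returns (n_bound, n_daughters). VCA would act on both steps,
    raising p_bind (association with the mother filament) and
    p_nucleate (fraction of bound complexes that yield a daughter).
    """
    rng = random.Random(seed)
    n_bound = sum(rng.random() < p_bind for _ in range(n_complexes))
    n_daughters = sum(rng.random() < p_nucleate for _ in range(n_bound))
    return n_bound, n_daughters

# With ~1% nucleation per bound complex, daughter filaments remain
# rare even when binding events are common.
bound, daughters = simulate_branching(100_000, p_bind=0.3, p_nucleate=0.01)
```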

  19. Current status and challenges for automotive battery production technologies

    NASA Astrophysics Data System (ADS)

    Kwade, Arno; Haselrieder, Wolfgang; Leithoff, Ruben; Modlinger, Armin; Dietrich, Franz; Droeder, Klaus

    2018-04-01

    Production technology for automotive lithium-ion battery (LIB) cells and packs has improved considerably in the past five years. However, the transfer of developments in materials, cell design and processes from lab scale to production scale remains a challenge due to the large number of consecutive process steps and the significant impact of material properties, electrode compositions and cell designs on processes. This requires an in-depth understanding of the individual production processes and their interactions, and pilot-scale investigations into process parameter selection and prototype cell production. Furthermore, emerging process concepts must be developed at lab and pilot scale that reduce production costs and improve cell performance. Here, we present an introductory summary of the state-of-the-art production technologies for automotive LIBs. We then discuss the key relationships between process, quality and performance, as well as explore the impact of materials and processes on scale and cost. Finally, future developments and innovations that aim to overcome the main challenges are presented.

  20. High frequency copolymer ultrasonic transducer array of size-effective elements

    NASA Astrophysics Data System (ADS)

    Decharat, Adit; Wagle, Sanat; Habib, Anowarul; Jacobsen, Svein; Melandsø, Frank

    2018-02-01

    A layer-by-layer deposition method for producing dual-layer ultrasonic transducers from piezoelectric copolymers has been developed. The method uses a combination of customized and standard processing to obtain 2D array transducers with electrical connection of the individual elements routed directly to the rear of the substrate. A numerical model was implemented to study basic parameters affecting the transducer characteristics. Key elements of the array were characterized and evaluated, demonstrating its viability for 2D imaging. Signal reproducibility of the prototype array was studied by characterizing the variations of the center frequency (≈42 MHz) and bandwidth (≈25 MHz) of the acoustic signal. Object identification was also tested and parameterized by acoustic-field beamwidth as well as proper scan step size. Simple tests illustrating the benefit of a multi-element scan in lowering the inspection time were conducted. Structural imaging of a test structure underneath multi-layered wave media (glass plate and distilled water) was also performed. The prototype presented in this work is an important step towards realizing an inexpensive, compact array of individually operated copolymer transducers that can serve in a fast/volumetric high frequency (HF) ultrasonic scanning platform.

  1. Post-thymic maturation: young T cells assert their individuality.

    PubMed

    Fink, Pamela J; Hendricks, Deborah W

    2011-07-22

    T cell maturation was once thought to occur entirely within the thymus. Now, evidence is mounting that the youngest peripheral T cells in both mice and humans comprise a distinct population from their more mature, yet still naive, counterparts. These cells, termed recent thymic emigrants (RTEs), undergo a process of post-thymic maturation that can be monitored at the levels of cell phenotype and immune function. Understanding this final maturation step in the process of generating useful and safe T cells is of clinical relevance, given that RTEs are over-represented in neonates and in adults recovering from lymphopenia. Post-thymic maturation may function to ensure T cell fitness and self tolerance.

  2. The structuring of GMO release and evaluation in EU law.

    PubMed

    von Kries, Caroline; Winter, Gerd

    2012-04-01

    Genetically modified organisms (GMOs) and their behavior in the environment are complex and can only be assessed if the different components are distinguished. This article examines how EU law dissects the real causation processes from GMO release to various endpoints, analyses them individually, and then views them again in their entirety. In addition, the article describes how the intellectual process of assessment is divided into the steps of tiered generation, shared submission, and structured evaluation of relevant knowledge. The framework proposed for such an examination allows the identification of strengths and weaknesses of GMO risk assessment in the EU. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Spike-frequency adaptation in the inferior colliculus.

    PubMed

    Ingham, Neil J; McAlpine, David

    2004-02-01

    We investigated spike-frequency adaptation of neurons sensitive to interaural phase disparities (IPDs) in the inferior colliculus (IC) of urethane-anesthetized guinea pigs using a stimulus paradigm designed to exclude the influence of adaptation below the level of binaural integration. The IPD-step stimulus consists of a binaural 3,000-ms tone, in which the first 1,000 ms is held at a neuron's least favorable ("worst") IPD, adapting out monaural components, before being stepped rapidly to a neuron's most favorable ("best") IPD for 300 ms. After some variable interval (1-1,000 ms), IPD is again stepped to the best IPD for 300 ms, before being returned to a neuron's worst IPD for the remainder of the stimulus. Exponential decay functions fitted to the response to best-IPD steps revealed an average adaptation time constant of 52.9 ± 26.4 ms. Recovery from adaptation to best-IPD steps showed an average time constant of 225.5 ± 210.2 ms. Recovery time constants were not correlated with adaptation time constants. During the recovery period, adaptation to a 2nd best-IPD step followed similar kinetics to adaptation during the 1st best-IPD step. The mean adaptation time constant at stimulus onset (at worst IPD) was 34.8 ± 19.7 ms, similar to the 38.4 ± 22.1 ms recorded to contralateral stimulation alone. Individual time constants after stimulus onset were correlated with each other but not with time constants during the best-IPD step. We conclude that such binaurally derived measures of adaptation reflect processes that occur above the level of exclusively monaural pathways, and subsequent to the site of primary binaural interaction.
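Extracting an adaptation time constant by fitting an exponential decay, as described above, can be sketched on synthetic data; the 53 ms constant mirrors the reported mean of 52.9 ms, while the amplitude and baseline values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, tau, baseline):
    """Single-exponential adaptation: firing rate decays to a baseline."""
    return amplitude * np.exp(-t / tau) + baseline

# Synthetic firing-rate trace during a 300-ms best-IPD step with a
# 53-ms adaptation time constant (illustrative amplitude/baseline).
t = np.linspace(0.0, 300.0, 150)        # time within the step, ms
rate = decay(t, amplitude=80.0, tau=53.0, baseline=20.0)

popt, _ = curve_fit(decay, t, rate, p0=(50.0, 30.0, 10.0))
fitted_tau = popt[1]                    # recovered time constant, ms
```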

  4. Your company's secret change agents.

    PubMed

    Pascale, Richard Tanner; Sternin, Jerry

    2005-05-01

    Organizational change has traditionally come about through top-down initiatives such as hiring experts or importing best-of-breed practices. Such methods usually result in companywide rollouts of templates mandated from on high. These do little to get people excited. But within every organization, there are a few individuals who find unique ways to look at problems that seem impossible to solve. Although these change agents start out with the same tools and access to resources as their peers, they are able to see solutions where others do not. They find a way to bridge the divide between what is happening and what is possible. These positive deviants are the key, the authors believe, to a better way of creating organizational change. Your company can make the most of their methods by following six steps. In Step 1, Make the group the guru, the members of the community are engaged in the process of their own evolution. Step 2, Reframe through facts, entails restating the problem in a way that opens minds to new possibilities. Step 3, Make it safe to learn, involves creating an environment that supports innovative ideas. In Step 4, Make the problem concrete, the community combats abstraction by stating uncomfortable truths. In Step 5, Leverage social proof, the community looks to the larger society for examples of solutions that have worked in parallel situations. In Step 6, Confound the immune defense response, solutions are introduced organically from within the group in a way that promotes acceptance. Throughout the steps, the leader must suspend his or her traditional role in favor of more facilitatory practices. The positive-deviance approach has unearthed solutions to such complicated and diverse problems as malnutrition in Mali and human trafficking in East Java. This methodology can help solve even the most extreme dilemmas.

  5. Does footwear type impact the number of steps required to reach gait steady state?: an innovative look at the impact of foot orthoses on gait initiation.

    PubMed

    Najafi, Bijan; Miller, Daniel; Jarrett, Beth D; Wrobel, James S

    2010-05-01

    Many studies have attempted to better elucidate the effect of foot orthoses on gait dynamics. To our knowledge, most previous studies exclude the first few steps of gait and begin analysis at steady state walking. These unanalyzed steps of gait may contain important information about the dynamic and complex processes required to achieve equilibrium for a given gait velocity. The purpose of this study was to quantify gait initiation and determine how many steps were required to reach steady state walking under three footwear conditions: barefoot, habitual shoes, and habitual shoes with prefabricated foot orthoses. Fifteen healthy subjects walked 50 m at habitual speed in each condition. Wearing habitual shoes with the prefabricated orthoses enabled subjects to reach steady state walking in fewer steps (3.5 ± 2.0 steps) compared to the barefoot condition (5.2 ± 3.0 steps; p=0.02) as well as compared to the habitual shoes condition (4.7 ± 1.6 steps; p=0.05). Interestingly, the subjects' dynamic medial-lateral balance was significantly improved (22%, p<0.05) by using foot orthoses compared to other footwear conditions. These findings suggest that foot orthoses may help individuals reach steady state more quickly and with a better dynamic balance in the medial-lateral direction, independent of foot type. The findings of this pilot study may open new avenues for objectively assessing the impact of prescription footwear on dynamic balance and spatio-temporal parameters of gait. Further work to better assess the impact of foot orthoses on gait initiation in patients suffering from gait and instability pathologies may be warranted. Copyright 2010 Elsevier B.V. All rights reserved.
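One plausible way to count initiation steps is a banded-velocity criterion: find the first step from which every subsequent per-step velocity stays within a tolerance of the steady-state estimate. The abstract does not specify the authors' exact algorithm, so the rule, tolerance, and tail length below are illustrative assumptions:

```python
import numpy as np

def steps_to_steady_state(step_velocities, tol=0.05, tail=5):
    """Count gait-initiation steps before steady state is reached.

    Steady-state velocity is estimated from the last `tail` steps;
    the first step from which every subsequent velocity stays within
    a fractional tolerance `tol` of that estimate marks steady state.
    """
    v = np.asarray(step_velocities, dtype=float)
    v_ss = v[-tail:].mean()
    within = np.abs(v - v_ss) <= tol * v_ss
    for i in range(len(v)):
        if within[i:].all():
            return i            # number of initiation steps before i
    return len(v)

# Accelerating initiation followed by a steady plateau (m/s per step)
velocities = [0.6, 0.9, 1.1, 1.19, 1.21, 1.20, 1.19, 1.21, 1.20, 1.20]
n_init = steps_to_steady_state(velocities)
```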

  6. Monitoring one-electron photo-oxidation of guanine in DNA crystals using ultrafast infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Hall, James P.; Poynton, Fergus E.; Keane, Páraic M.; Gurung, Sarah P.; Brazier, John A.; Cardin, David J.; Winter, Graeme; Gunnlaugsson, Thorfinnur; Sazanovich, Igor V.; Towrie, Michael; Cardin, Christine J.; Kelly, John M.; Quinn, Susan J.

    2015-12-01

    To understand the molecular origins of diseases caused by ultraviolet and visible light, and also to develop photodynamic therapy, it is important to resolve the mechanism of photoinduced DNA damage. Damage to DNA bound to a photosensitizer molecule frequently proceeds by one-electron photo-oxidation of guanine, but the precise dynamics of this process are sensitive to the location and the orientation of the photosensitizer, which are very difficult to define in solution. To overcome this, ultrafast time-resolved infrared (TRIR) spectroscopy was performed on photoexcited ruthenium polypyridyl-DNA crystals, the atomic structure of which was determined by X-ray crystallography. By combining the X-ray and TRIR data we are able to define both the geometry of the reaction site and the rates of individual steps in a reversible photoinduced electron-transfer process. This allows us to propose an individual guanine as the reaction site and, intriguingly, reveals that the dynamics in the crystal state are quite similar to those observed in the solvent medium.

  7. Monitoring one-electron photo-oxidation of guanine in DNA crystals using ultrafast infrared spectroscopy.

    PubMed

    Hall, James P; Poynton, Fergus E; Keane, Páraic M; Gurung, Sarah P; Brazier, John A; Cardin, David J; Winter, Graeme; Gunnlaugsson, Thorfinnur; Sazanovich, Igor V; Towrie, Michael; Cardin, Christine J; Kelly, John M; Quinn, Susan J

    2015-12-01

    To understand the molecular origins of diseases caused by ultraviolet and visible light, and also to develop photodynamic therapy, it is important to resolve the mechanism of photoinduced DNA damage. Damage to DNA bound to a photosensitizer molecule frequently proceeds by one-electron photo-oxidation of guanine, but the precise dynamics of this process are sensitive to the location and the orientation of the photosensitizer, which are very difficult to define in solution. To overcome this, ultrafast time-resolved infrared (TRIR) spectroscopy was performed on photoexcited ruthenium polypyridyl-DNA crystals, the atomic structure of which was determined by X-ray crystallography. By combining the X-ray and TRIR data we are able to define both the geometry of the reaction site and the rates of individual steps in a reversible photoinduced electron-transfer process. This allows us to propose an individual guanine as the reaction site and, intriguingly, reveals that the dynamics in the crystal state are quite similar to those observed in the solvent medium.

  8. Processing of zero-derived words in English: an fMRI investigation.

    PubMed

    Pliatsikas, Christos; Wheeldon, Linda; Lahiri, Aditi; Hansen, Peter C

    2014-01-01

    Derivational morphological processes allow us to create new words from base forms (e.g., the verb punish → the noun punishment). The number of steps from the basic units to derived words often varies (e.g., multi-step nationality vs. zero-step bridge-V, i.e., zero-derivation; Aronoff, 1980). We compared the processing of one-step (soaking…

  9. Influenza Virus-Mediated Membrane Fusion: Determinants of Hemagglutinin Fusogenic Activity and Experimental Approaches for Assessing Virus Fusion

    PubMed Central

    Hamilton, Brian S.; Whittaker, Gary R.; Daniel, Susan

    2012-01-01

    Hemagglutinin (HA) is the viral protein that facilitates the entry of influenza viruses into host cells. This protein controls two critical aspects of entry: virus binding and membrane fusion. In order for HA to carry out these functions, it must first undergo a priming step, proteolytic cleavage, which renders it fusion competent. Membrane fusion commences from inside the endosome after a drop in lumenal pH and an ensuing conformational change in HA that leads to the hemifusion of the outer membrane leaflets of the virus and endosome, the formation of a stalk between them, followed by pore formation. Thus, the fusion machinery is an excellent target for antiviral compounds, especially those that target the conserved stem region of the protein. However, traditional ensemble fusion assays provide a somewhat limited ability to directly quantify fusion partly due to the inherent averaging of individual fusion events resulting from experimental constraints. Inspired by the gains achieved by single molecule experiments and analysis of stochastic events, recently-developed individual virion imaging techniques and analysis of single fusion events has provided critical information about individual virion behavior, discriminated intermediate fusion steps within a single virion, and allowed the study of the overall population dynamics without the loss of discrete, individual information. In this article, we first start by reviewing the determinants of HA fusogenic activity and the viral entry process, highlight some open questions, and then describe the experimental approaches for assaying fusion that will be useful in developing the most effective therapies in the future. PMID:22852045
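    A common strategy in single-virion fusion analysis (an assumption of this sketch, not a method stated in the article) is to infer the number of hidden intermediate steps from the distribution of per-virion lag times: N sequential rate-limited transitions with similar rates yield approximately gamma-distributed lags with shape ≈ N. A minimal sketch on simulated data:

```python
import numpy as np
from scipy import stats

# Simulated per-virion lag times between acidification and hemifusion,
# assuming 3 hidden sequential steps at 0.5 s^-1 each (invented values).
rng = np.random.default_rng(0)
true_steps, rate = 3, 0.5
lags = rng.gamma(shape=true_steps, scale=1.0 / rate, size=5000)

# Fit a gamma distribution with location fixed at 0; the fitted shape
# parameter estimates the number of intermediate steps per virion.
shape, _, scale = stats.gamma.fit(lags, floc=0)
n_steps = round(shape)
```

    Ensemble assays average these lags away; the per-virion distribution is exactly the "discrete, individual information" the abstract credits single-virion imaging with preserving.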

  10. Individual Rocks Segmentation in Terrestrial Laser Scanning Point Cloud Using Iterative Dbscan Algorithm

    NASA Astrophysics Data System (ADS)

    Walicka, A.; Jóźków, G.; Borkowski, A.

    2018-05-01

    Fluvial transport is an important aspect of hydrological and geomorphological studies. Knowledge of the movement parameters of different-size fractions is essential in many applications, such as exploring watercourse changes, calculating river-bed parameters, or investigating the frequency and nature of weather events. Traditional techniques used for fluvial transport investigations do not provide any information about the long-term horizontal movement of the rocks. This information can be gained by means of terrestrial laser scanning (TLS); however, this is a complex task consisting of several stages of data processing. In this study, a methodology for segmenting individual rocks from a TLS point cloud is proposed as the first step of a semi-automatic algorithm for detecting the movement of individual rocks. The proposed algorithm is executed in two steps. First, the point cloud is classified into rocks and background using only geometric information. Second, the DBSCAN algorithm is executed iteratively on the points classified as rocks until only one stone is detected in each segment. The number of rocks in each segment is determined using principal component analysis (PCA) and a simple derivative method for peak detection. As a result, several segments that correspond to individual rocks are formed. Numerical tests were executed on two test samples. The results of the semi-automatic segmentation were compared to results acquired by manual segmentation. The proposed methodology successfully segmented 76% and 72% of the rocks in test sample 1 and test sample 2, respectively.
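    The iterative clustering loop described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the single-rock test here is a simple cluster-size threshold rather than the paper's PCA-based peak detection, and all parameters are invented.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def segment_rocks(points, eps=0.05, min_samples=10, max_cluster_size=400):
    """Iterative DBSCAN: clusters larger than a plausible single-rock
    size are re-clustered with a tighter eps until each segment holds
    one rock (the size threshold stands in for the paper's PCA test)."""
    segments, queue = [], [(points, eps)]
    while queue:
        pts, cur_eps = queue.pop()
        labels = DBSCAN(eps=cur_eps, min_samples=min_samples).fit_predict(pts)
        for lab in set(labels) - {-1}:          # -1 marks noise points
            cluster = pts[labels == lab]
            if len(cluster) > max_cluster_size and cur_eps > 0.01:
                queue.append((cluster, cur_eps * 0.5))  # split oversized segment
            else:
                segments.append(cluster)
    return segments

# Two synthetic "rocks": dense 3D blobs 1 m apart.
rng = np.random.default_rng(1)
rock_a = rng.normal([0.0, 0.0, 0.0], 0.03, size=(200, 3))
rock_b = rng.normal([1.0, 0.0, 0.0], 0.03, size=(200, 3))
segments = segment_rocks(np.vstack([rock_a, rock_b]))
```

    Shrinking eps on oversized clusters is one simple way to realize the "iterate until one stone per segment" loop; the paper's PCA/derivative peak count would replace the size test.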

  11. A comparative study of one-step and two-step approaches for MAPbI3 perovskite layer and its influence on the performance of mesoscopic perovskite solar cell

    NASA Astrophysics Data System (ADS)

    Wang, Minhuan; Feng, Yulin; Bian, Jiming; Liu, Hongzhu; Shi, Yantao

    2018-01-01

    Mesoscopic perovskite solar cells (M-PSCs) were synthesized with MAPbI3 perovskite layers as light harvesters, grown by one-step and two-step solution processes, respectively. A comparative study was performed by quantitatively correlating the resulting device performance with the crystalline quality of the perovskite layers. Compared with the one-step counterpart, a pronounced improvement of 56.86% in steady-state power conversion efficiency (PCE) was achieved with the two-step process, which mainly resulted from a significant enhancement in fill factor (FF) from 48% to 77% without sacrificing the open-circuit voltage (Voc) or the short-circuit current (Jsc). The enhanced FF was attributed to reduced non-radiative recombination channels owing to the better crystalline quality and larger grain size of the two-step-processed perovskite layer. Moreover, the superiority of the two-step over the one-step process was demonstrated with rather good reproducibility.
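    As a plausibility check on the reported numbers (with placeholder Voc and Jsc, since only the FF change is quoted above), the standard relation PCE = Voc · Jsc · FF / Pin shows that the FF gain from 48% to 77% alone predicts a ~60% relative PCE improvement, the same order as the reported 56.86%:

```python
# PCE = (Voc * Jsc * FF) / P_in, with P_in = 100 mW/cm^2 (AM1.5G).
# Voc and Jsc are illustrative placeholders held fixed, as the article
# reports the FF change without sacrificing Voc or Jsc.
def pce(voc_v, jsc_ma_cm2, ff, p_in_mw_cm2=100.0):
    return voc_v * jsc_ma_cm2 * ff / p_in_mw_cm2 * 100  # percent

voc, jsc = 1.0, 20.0            # V, mA/cm^2 (placeholders)
pce_one_step = pce(voc, jsc, 0.48)
pce_two_step = pce(voc, jsc, 0.77)
gain = (pce_two_step - pce_one_step) / pce_one_step * 100
```

    The residual gap between ~60% and 56.86% would come from the small (unquoted) changes in Voc and Jsc.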

  12. Visual impairment, uncorrected refractive error, and accelerometer-defined physical activity in the United States.

    PubMed

    Willis, Jeffrey R; Jefferys, Joan L; Vitale, Susan; Ramulu, Pradeep Y

    2012-03-01

    To examine how accelerometer-measured physical activity is affected by visual impairment (VI) and uncorrected refractive error (URE). Cross-sectional study using data from the 2003-2004/2005-2006 National Health and Nutrition Examination Survey. Visual impairment was defined as better-eye postrefraction visual acuity worse than 20/40. Uncorrected refractive error was defined as better-eye presenting visual acuity of 20/50 or worse, improving to 20/40 or better with refraction. Adults older than 20 years with normal sight, URE, and VI were analyzed. The main outcome measures were steps per day and daily minutes of moderate or vigorous physical activity (MVPA). Five thousand seven hundred twenty-two participants (57.1%) had complete visual acuity and accelerometer data. Individuals with normal sight took an average of 9964 steps per day and engaged in an average of 23.5 minutes per day of MVPA, as compared with 9742 steps per day and 23.1 minutes per day of MVPA in individuals with URE (P > .50 for both) and 5992 steps per day and 9.3 minutes per day of MVPA in individuals with VI (P < .01 for both). In multivariable models, individuals with VI took 26% fewer steps per day (P < .01; 95% CI, 18%-34%) and spent 48% less time in MVPA (P < .01; 95% CI, 37%-57%) than individuals with normal sight. The decrement in steps and MVPA associated with VI equaled or exceeded that associated with self-reported chronic obstructive pulmonary disease, diabetes mellitus, arthritis, stroke, or congestive heart failure. Visual impairment, but not URE, affects physical activity to a degree equal to or greater than that of other serious medical conditions. The substantial decrement in physical activity observed in nonrefractive vision loss highlights a need for better strategies to safely improve mobility and increase physical activity in this group.

  13. Individualized Inservice Teacher Education (Project In-Step). Evaluation Report. Phase III.

    ERIC Educational Resources Information Center

    Thurber, John C.

    This is a report on the third phase of Project IN-STEP, which was intended to develop a viable model for individualized, multi-media in-service teacher education programs. (Phases I and II are reported in ED 033 905 and ED 042 709.) The rationale for Phase III was to see if the model could be successfully transferred to an area other than teaching…

  14. Effect of production management on semen quality during long-term storage in different European boar studs.

    PubMed

    Schulze, M; Kuster, C; Schäfer, J; Jung, M; Grossfeld, R

    2018-03-01

    The processing of ejaculates is a fundamental step in determining the fertilizing capacity of boar spermatozoa. The aim of the present study was to identify factors that affect the quality of boar semen doses. The production process during 1 day of semen processing in 26 European boar studs was monitored. In each boar stud, nine to 19 randomly selected ejaculates from 372 Pietrain boars were analyzed for sperm motility, acrosome and plasma membrane integrity, mitochondrial activity and thermo-resistance (TRT). Each ejaculate was monitored for production time and temperature at each step in semen processing using the custom software SEQU (version 1.7, Minitüb, Tiefenbach, Germany). The dilution of ejaculates with a short-term extender was completed in one step in 10 AI centers (n = 135 ejaculates), in two steps in 11 AI centers (n = 158 ejaculates) and in three steps in five AI centers (n = 79 ejaculates). Results indicated greater semen quality with one-step isothermal dilution than with multi-step dilution of AI semen doses (total motility at TRT day 7: 71.1 ± 19.2%, 64.6 ± 20.0% and 47.1 ± 27.1% for one-, two- and three-step dilution, respectively; P < .05). There was a marked advantage of the one-step isothermal dilution regarding time management, preservation suitability, stability and stress resistance. One-step dilution resulted in significantly shorter holding times of raw ejaculates and reduced the risk of mistakes owing to the smaller number of processing steps. These results lead to refined recommendations for boar semen processing. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Piaget and Organic Chemistry: Teaching Introductory Organic Chemistry through Learning Cycles

    NASA Astrophysics Data System (ADS)

    Libby, R. Daniel

    1995-07-01

    This paper describes the first application of the Piaget-based learning cycle technique (Atkin & Karplus, Sci. Teach. 1962, 29, 45-51) to an introductory organic chemistry course. It also presents the step-by-step process used to convert a lecture course into a discussion-based active learning course. The course is taught in a series of learning cycles. A learning cycle is a three-phase process that provides opportunities for students to explore new material and work with an instructor to recognize logical patterns in data, and to devise and test hypotheses. In this application, the first phase, exploration, involves out-of-class student evaluation of data in an attempt to identify significant trends and develop hypotheses that might explain the trends in terms of fundamental scientific principles. In the second phase, concept invention, the students and instructor work together in class to evaluate student hypotheses and find the concepts that best explain the data. The third phase, application, is an out-of-class application of the concept to new situations. The development of learning cycles from lecture notes is presented as an eight-step procedure. The process involves re-evaluation and restructuring of the course material to maintain a continuity of concept development according to the instructor's logic, dividing topics into individual concepts or techniques, and refocusing the presentation on large numbers of examples that can serve as data for students in their exploration and application activities. A sample learning cycle and suggestions for implementing learning cycles to a limited extent in existing courses are also provided.

  16. Combining Open-domain and Biomedical Knowledge for Topic Recognition in Consumer Health Questions.

    PubMed

    Mrabet, Yassine; Kilicoglu, Halil; Roberts, Kirk; Demner-Fushman, Dina

    2016-01-01

    Determining the main topics in consumer health questions is a crucial step in their processing as it allows narrowing the search space to a specific semantic context. In this paper we propose a topic recognition approach based on biomedical and open-domain knowledge bases. In the first step of our method, we recognize named entities in consumer health questions using an unsupervised method that relies on a biomedical knowledge base, UMLS, and an open-domain knowledge base, DBpedia. In the next step, we cast topic recognition as a binary classification problem of deciding whether a named entity is the question topic or not. We evaluated our approach on a dataset from the National Library of Medicine (NLM), introduced in this paper, and another from the Genetic and Rare Disease Information Center (GARD). The combination of knowledge bases outperformed the results obtained by individual knowledge bases by up to 16.5% F1 and achieved state-of-the-art performance. Our results demonstrate that combining open-domain knowledge bases with biomedical knowledge bases can lead to a substantial improvement in understanding user-generated health content.
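    The second step, deciding whether a recognized entity is the question topic, is a binary classification. The sketch below is a toy illustration with invented features (relative position in the question, mention frequency, a UMLS-disorder flag), not the authors' feature set:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy features per candidate entity (all hypothetical, for illustration):
# [relative position in question, mention frequency, is_UMLS_disorder]
X = np.array([
    [0.10, 2, 1],   # early, repeated, a disorder concept -> topic
    [0.80, 1, 0],   # late, mentioned once, generic       -> not topic
    [0.20, 3, 1],
    [0.90, 1, 0],
    [0.15, 2, 1],
    [0.70, 1, 0],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = entity is the question topic

clf = LogisticRegression().fit(X, y)
pred = clf.predict([[0.1, 2, 1]])[0]   # classify a new candidate entity
```

    In the paper's setting, the candidate entities would come from the UMLS/DBpedia recognition step rather than being hand-listed as here.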

  17. A method for real-time generation of augmented reality work instructions via expert movements

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Bhaskar; Winer, Eliot

    2015-03-01

    Augmented Reality (AR) offers tremendous potential for a wide range of fields including entertainment, medicine, and engineering. AR allows digital models to be integrated with a real scene (typically viewed through a video camera) to provide useful information in a variety of contexts. The difficulty of authoring and modifying scenes is one of the biggest obstacles to widespread adoption of AR: 3D models must be created, textured, oriented, and positioned to create the complex overlays viewed by a user, which often requires using multiple software packages in addition to performing model format conversions. In this paper, a new authoring tool is presented which uses a novel method to capture product assembly steps performed by a user with a depth+RGB camera. Through a combination of computer vision and image processing techniques, each individual step is decomposed into objects and actions. The objects are matched to those in a predetermined geometry library and the actions are turned into animated assembly steps. The resulting instruction set is then generated with minimal user input. A proof of concept is presented to establish the method's viability.
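    Matching a segmented object to a predetermined geometry library can be as simple as nearest-neighbour search over shape descriptors. The descriptors, library entries, and values below are all hypothetical; a real system would use far richer features:

```python
import numpy as np

# Hypothetical shape descriptors (bounding-box dims in m + point count)
# for a small geometry library of assembly parts.
library = {
    "bolt":    np.array([0.02, 0.02, 0.05, 300]),
    "bracket": np.array([0.10, 0.04, 0.02, 900]),
    "panel":   np.array([0.30, 0.20, 0.01, 2500]),
}

def match_object(descriptor):
    """Nearest-neighbour match of an observed descriptor to the library."""
    names = list(library)
    dists = [np.linalg.norm(descriptor - library[n]) for n in names]
    return names[int(np.argmin(dists))]

observed = np.array([0.09, 0.05, 0.02, 870])   # from one segmented step
match = match_object(observed)
```

    Once each step's objects are identified this way, the detected actions can be replayed on the library models as the animated overlay.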

  18. Load-dependent ADP binding to myosins V and VI: Implications for subunit coordination and function

    PubMed Central

    Oguchi, Yusuke; Mikhailenko, Sergey V.; Ohki, Takashi; Olivares, Adrian O.; De La Cruz, Enrique M.; Ishiwata, Shin'ichi

    2008-01-01

    Dimeric myosins V and VI travel long distances in opposite directions along actin filaments in cells, taking multiple steps in a “hand-over-hand” fashion. The catalytic cycles of both myosins are limited by ADP dissociation, which is considered a key step in the walking mechanism of these motors. Here, we demonstrate that external loads applied to individual actomyosin V or VI bonds asymmetrically affect ADP affinity, such that ADP binds weaker under loads assisting motility. Model-based analysis reveals that forward and backward loads modulate the kinetics of ADP binding to both myosins, although the effect is less pronounced for myosin VI. ADP dissociation is modestly accelerated by forward loads and inhibited by backward loads. Loads applied in either direction slow ADP binding to myosin V but accelerate binding to myosin VI. We calculate that the intramolecular load generated during processive stepping is ≈2 pN for both myosin V and myosin VI. The distinct load dependence of ADP binding allows these motors to perform different cellular functions. PMID:18509050
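    The asymmetric load dependence of a rate constant is often described with a Bell-type model, k(F) = k0 · exp(−F·d/kBT). This is a generic sketch with invented parameters, not the model or fitted values of the paper; assisting loads are given a negative sign, so they speed ADP release while resisting (backward) loads slow it, mirroring the asymmetry described above.

```python
import math

def bell_rate(k0, d_nm, force_pn, kBT_pN_nm=4.1):
    """Bell-model rate: a load F tilts the transition barrier by F*d,
    so k(F) = k0 * exp(-F*d/kBT) (forces in pN, distances in nm)."""
    return k0 * math.exp(-force_pn * d_nm / kBT_pN_nm)

k0, d = 12.0, 1.0                    # unloaded rate (s^-1), distance (nm); invented
k_assist = bell_rate(k0, d, -2.0)    # ~2 pN assisting load (negative sign)
k_resist = bell_rate(k0, d, +2.0)    # ~2 pN resisting (backward) load
```

    The 2 pN magnitude echoes the intramolecular load the authors calculate for processive stepping; distinct distance parameters d for ADP binding vs. release would reproduce the different sensitivities reported for the two myosins.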

  19. Diffraction Techniques in Structural Biology

    PubMed Central

    Egli, Martin

    2010-01-01

    A detailed understanding of chemical and biological function and the mechanisms underlying the activities ultimately requires atomic-resolution structural data. Diffraction-based techniques such as single-crystal X-ray crystallography, electron microscopy and neutron diffraction are well established and have paved the road to the stunning successes of modern-day structural biology. The major advances achieved in the last 20 years in all aspects of structural research, including sample preparation, crystallization, the construction of synchrotron and spallation sources, phasing approaches and high-speed computing and visualization, now provide specialists and non-specialists alike with a steady flow of molecular images of unprecedented detail. The present chapter combines a general overview of diffraction methods with a step-by-step description of the process of a single-crystal X-ray structure determination experiment, from chemical synthesis or expression to phasing and refinement, analysis and quality control. For novices it may serve as a stepping-stone to more in-depth treatises of the individual topics. Readers relying on structural information for interpreting functional data may find it a useful consumer guide. PMID:20517991

  20. A Coordinated Initialization Process for the Distributed Space Exploration Simulation

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David

    2007-01-01

    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation Requirements; 3) Nine-Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve "initialize" Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve "startup" Synchronization Point; and 13) Conclusions.
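    The nine-step sequence can be made explicit in code. `Federation` below is a hypothetical stand-in for an HLA RTI interface, used only to pin down the ordering of the steps; it performs no actual distributed simulation.

```python
# Plain-Python sketch of the nine-step initialization from the viewgraphs.
class Federation:
    """Hypothetical stand-in that just records each completed step."""
    def __init__(self):
        self.log = []

    def step(self, n, action):
        self.log.append((n, action))

def initialize(fed):
    fed.step(1, "create the federation")
    fed.step(2, "publish and subscribe")
    fed.step(3, "create object instances")
    fed.step(4, "confirm all federates have joined")
    fed.step(5, "achieve 'initialize' synchronization point")
    fed.step(6, "update object instances with initial data")
    fed.step(7, "wait for object reflections")
    fed.step(8, "set up time management")
    fed.step(9, "achieve 'startup' synchronization point")
    return fed.log

log = initialize(Federation())
```

    The two synchronization points (steps 5 and 9) are what make the process coordinated: no federate proceeds to data exchange or to free-running time advance until every federate has reached the same barrier.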

  1. Kinematic Differences During Single-Leg Step-Down Between Individuals With Femoroacetabular Impingement Syndrome and Individuals Without Hip Pain.

    PubMed

    Lewis, Cara L; Loverro, Kari L; Khuu, Anne

    2018-04-01

    Study Design Controlled laboratory study, case-control design. Background Despite recognition that femoroacetabular impingement syndrome (FAIS) is a movement-related disorder, few studies have examined dynamic unilateral tasks in individuals with FAIS. Objectives To determine whether movements of the pelvis and lower extremities in individuals with FAIS differ from those in individuals without hip pain during a single-leg step-down, and to analyze kinematic differences between male and female participants within groups. Methods Individuals with FAIS and individuals without hip pain performed a single-leg step-down while kinematic data were collected. Kinematics were evaluated at 60° of knee flexion. A linear regression analysis assessed the main effects of group, sex, and side, and the interaction of sex by group. Results Twenty individuals with FAIS and 40 individuals without hip pain participated. Individuals with FAIS performed the step-down with greater hip flexion (4.9°; 95% confidence interval [CI]: 0.5°, 9.2°) and anterior pelvic tilt (4.1°; 95% CI: 0.9°, 7.3°) than individuals without hip pain. Across groups, female participants performed the task with more hip flexion (6.1°; 95% CI: 1.7°, 10.4°), hip adduction (4.8°; 95% CI: 2.2°, 7.4°), anterior pelvic tilt (5.8°; 95% CI: 2.6°, 9.0°), pelvic drop (1.4°; 95% CI: 0.3°, 2.5°), and thigh adduction (2.7°; 95% CI: 1.3°, 4.2°) than male participants. Conclusion The results of this study suggest that individuals with FAIS have alterations in pelvic motion during a dynamic unilateral task. The noted altered movement patterns in the FAIS group may contribute to the development of hip pain and may be due to impairments that are modifiable through rehabilitation. J Orthop Sports Phys Ther 2018;48(4):270-279. Epub 6 Mar 2018. doi:10.2519/jospt.2018.7794.

  2. 12-step affiliation and attendance following treatment for comorbid substance dependence and depression: a latent growth curve mediation model.

    PubMed

    Worley, Matthew J; Tate, Susan R; McQuaid, John R; Granholm, Eric L; Brown, Sandra A

    2013-01-01

    Among substance-dependent individuals, comorbid major depressive disorder (MDD) is associated with greater severity and poorer treatment outcomes, but little research has examined mediators of posttreatment substance use outcomes within this population. Using latent growth curve models, the authors tested relationships between individual rates of change in 12-step involvement and substance use, utilizing posttreatment follow-up data from a trial of group Twelve-Step Facilitation (TSF) and integrated cognitive-behavioral therapy (ICBT) for veterans with substance dependence and MDD. Although TSF patients were higher on 12-step affiliation and meeting attendance at end-of-treatment as compared with ICBT, they also experienced significantly greater reductions in these variables during the year following treatment, ending at similar levels as ICBT. Veterans in TSF also had significantly greater increases in drinking frequency during follow-up, and this group difference was mediated by their greater reductions in 12-step affiliation and meeting attendance. Patients with comorbid depression appear to have difficulty sustaining high levels of 12-step involvement after the conclusion of formal 12-step interventions, which predicts poorer drinking outcomes over time. Modifications to TSF and other formal 12-step protocols or continued therapeutic contact may be necessary to sustain 12-step involvement and reduced drinking for patients with substance dependence and MDD.

  3. Optimizing The DSSC Fabrication Process Using Lean Six Sigma

    NASA Astrophysics Data System (ADS)

    Fauss, Brian

    Alternative energy technologies must become more cost effective to achieve grid parity with fossil fuels. Dye-sensitized solar cells (DSSCs) are an innovative third-generation photovoltaic technology demonstrating tremendous potential to become a revolutionary technology due to recent breakthroughs in cost of fabrication. This study focused on quality improvement measures undertaken to improve the fabrication of DSSCs and to enhance process efficiency and effectiveness. Several quality improvement methods were implemented to optimize the seven individual steps of the DSSC fabrication process. Lean Manufacturing's 5S method successfully increased efficiency in all of the processes. Six Sigma's DMAIC methodology was used to identify and eliminate the root causes of defects in the critical titanium dioxide deposition process. These optimizations resulted in the following significant improvements in the production process: 1. fabrication time of the DSSCs was reduced by 54%; 2. fabrication procedures were improved to the extent that all critical defects in the process were eliminated; 3. the quantity of functioning DSSCs fabricated was increased from 17% to 90%.

  4. Two-Step Plasma Process for Cleaning Indium Bonding Bumps

    NASA Technical Reports Server (NTRS)

    Greer, Harold F.; Vasquez, Richard P.; Jones, Todd J.; Hoenk, Michael E.; Dickie, Matthew R.; Nikzad, Shouleh

    2009-01-01

    A two-step plasma process has been developed as a means of removing surface oxide layers from indium bumps used in flip-chip hybridization (bump bonding) of integrated circuits. The two-step plasma process makes it possible to remove surface indium oxide, without incurring the adverse effects of the acid etching process.

  5. Predictors of 12-Step Attendance and Participation for Individuals With Stimulant Use Disorders.

    PubMed

    Hatch-Maillette, Mary; Wells, Elizabeth A; Doyle, Suzanne R; Brigham, Gregory S; Daley, Dennis; DiCenzo, Jessica; Donovan, Dennis; Garrett, Sharon; Horigian, Viviana E; Jenkins, Lindsay; Killeen, Therese; Owens, Mandy; Perl, Harold I

    2016-09-01

    Few studies have examined the effectiveness of 12-step peer recovery support programs for drug use disorders, especially stimulant use, and it is difficult to know how outcomes related to 12-step attendance and participation generalize to individuals with non-alcohol substance use disorders (SUDs). A clinical trial of 12-step facilitation (N=471) focusing on individuals with cocaine or methamphetamine use disorders allowed examination of four questions: Q1) To what extent do treatment-seeking stimulant users use 12-step programs, and which ones? Q2) Do factors previously found to predict 12-step participation among those with alcohol use disorders also predict participation among stimulant users? Q3) What specific baseline "12-step readiness" factors predict subsequent 12-step participation and attendance? And Q4) Does stimulant drug of choice differentially predict 12-step participation and attendance? The four outcome variables (attendance, speaking, duties at 12-step meetings, and other peer recovery support activities) were not related to baseline demographics or substance problem history or severity. Drug of choice was associated with differential days of Alcoholics Anonymous (AA) and Narcotics Anonymous (NA) attendance among those who reported attending, and cocaine users reported more days of attending AA or NA at 1-, 3- and 6-month follow-ups than did methamphetamine users. Pre-randomization measures of perceived benefit of 12-step groups predicted 12-step attendance at 3- and 6-month follow-ups. Pre-randomization 12-step attendance significantly predicted the number of other self-help activities at end-of-treatment, 3- and 6-month follow-ups. Pre-randomization perceived benefit and problem severity both predicted the number of self-help activities at end-of-treatment and 3-month follow-up. Pre-randomization perceived barriers to 12-step groups were negatively associated with self-help activities at end-of-treatment and 3-month follow-up. 
Whether or not one participated in any duties was predicted at all time points by pre-randomization involvement in self-help activities. The primary finding of this study is one of continuity: prior attendance and active involvement with 12-step programs were the main signs pointing to future involvement. Limitations and recommendations are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Predictors of 12-Step Attendance and Participation for Individuals with Stimulant Use Disorders

    PubMed Central

    Hatch-Maillette, Mary; Wells, Elizabeth A.; Doyle, Suzanne R.; Brigham, Gregory S.; Daley, Dennis; DiCenzo, Jessica; Donovan, Dennis; Garrett, Sharon; Horigian, Viviana E.; Jenkins, Lindsay; Killeen, Therese; Owens, Mandy; Perl, Harold I.

    2017-01-01

    Objective Few studies have examined the effectiveness of 12-step peer recovery support programs for drug use disorders, especially stimulant use, and it is difficult to know how outcomes related to 12-step attendance and participation generalize to individuals with non-alcohol substance use disorders (SUDs). Method A clinical trial of 12-step facilitation (N=471) focusing on individuals with cocaine or methamphetamine use disorders allowed examination of four questions: Q1) To what extent do treatment-seeking stimulant users use 12-step programs, and which ones? Q2) Do factors previously found to predict 12-step participation among those with alcohol use disorders also predict participation among stimulant users? Q3) What specific baseline "12-step readiness" factors predict subsequent 12-step participation and attendance? And Q4) Does stimulant drug of choice differentially predict 12-step participation and attendance? Results The four outcome variables (Attendance, Speaking, Duties at 12-step meetings, and other peer recovery support Activities) were not related to baseline demographics or substance problem history or severity. Drug of choice was associated with differential days of Alcoholics Anonymous (AA) and Narcotics Anonymous (NA) attendance among those who reported attending, and cocaine users reported more days of attending AA or NA at 1-, 3- and 6-month follow-ups than did methamphetamine users. Pre-randomization measures of Perceived Benefit of 12-step groups predicted 12-step Attendance at 3- and 6-month follow-ups. Pre-randomization 12-step Attendance significantly predicted the number of other Self-Help Activities at end-of-treatment, 3- and 6-month follow-ups. Pre-randomization Perceived Benefit and problem severity both predicted the number of Self-Help Activities at end-of-treatment and 3-month follow-up. 
Pre-randomization Perceived Barriers to 12-step groups were negatively associated with Self-Help Activities at end-of-treatment and 3-month follow-up. Whether or not one participated in any Duties was predicted at all time points by pre-randomization involvement in Self-Help Activities. Conclusions The primary finding of this study is one of continuity: prior attendance and active involvement with 12-step programs were the main signs pointing to future involvement. Limitations and Recommendations are discussed. PMID:27431050

  7. A practical and systematic approach to organisational capacity strengthening for research in the health sector in Africa

    PubMed Central

    2014-01-01

Background: Despite increasing investment in health research capacity strengthening efforts in low- and middle-income countries, published evidence to guide the systematic design and monitoring of such interventions is very limited. Systematic processes are important to underpin capacity strengthening interventions because they provide stepwise guidance and allow for continual improvement. Our objective here was to use evidence to inform the design of a replicable but flexible process to guide health research capacity strengthening that could be customized for different contexts, and to provide a framework for planning, collecting information, making decisions, and improving performance. Methods: We used peer-reviewed and grey literature to develop a five-step pathway for designing and evaluating health research capacity strengthening programmes, tested in a variety of contexts in Africa. The five steps are: i) defining the goal of the capacity strengthening effort, ii) describing the optimal capacity needed to achieve the goal, iii) determining the existing capacity gaps compared to the optimum, iv) devising an action plan to fill the gaps and associated indicators of change, and v) adapting the plan and indicators as the programme matures. Our paper describes three contrasting case studies of organisational research capacity strengthening to illustrate how our five-step approach works in practice. Results: Our five-step pathway starts with a clear goal and objectives, making explicit the capacity required to achieve the goal. Strategies for promoting sustainability are agreed with partners and incorporated from the outset. Our pathway for designing capacity strengthening programmes focuses not only on technical, managerial, and financial processes within organisations, but also on the individuals within organisations and the wider system within which organisations are coordinated, financed, and managed. 
Conclusions: Our five-step approach is flexible enough to generate and utilise ongoing learning. We have tested and critiqued our approach in a variety of organisational settings in the health sector in sub-Saharan Africa, but it needs to be applied and evaluated in other sectors and continents to determine the extent of transferability. PMID:24581148

  8. Factors associated with the use of cognitive aids in operating room crises: a cross-sectional study of US hospitals and ambulatory surgical centers.

    PubMed

    Alidina, Shehnaz; Goldhaber-Fiebert, Sara N; Hannenberg, Alexander A; Hepner, David L; Singer, Sara J; Neville, Bridget A; Sachetta, James R; Lipsitz, Stuart R; Berry, William R

    2018-03-26

Operating room (OR) crises are high-acuity events requiring rapid, coordinated management. Medical judgment and decision-making can be compromised in stressful situations, and clinicians may not experience a given crisis for many years. A cognitive aid (e.g., a checklist) for the most common types of crises in the OR may improve management during unexpected and rare events. While implementation strategies for routinely used innovations, including cognitive aids, are becoming better understood, the implementation of cognitive aids that are only rarely used is not yet well understood. We examined organizational context and implementation process factors influencing the use of cognitive aids for OR crises. We conducted a cross-sectional study using a Web-based survey of individuals who had downloaded OR cognitive aids from the websites of Ariadne Labs or Stanford University between January 2013 and January 2016. In this paper, we report on the experience of 368 respondents from US hospitals and ambulatory surgical centers. We analyzed the relationship of more successful implementation (measured as reported regular cognitive aid use during applicable clinical events) with organizational context and with participation in a multi-step implementation process. We used multivariable logistic regression to identify significant predictors of reported regular OR cognitive aid use during OR crises. In the multivariable logistic regression, small facility size was associated with a fourfold increase in the odds of a facility reporting more successful implementation (p = 0.0092). Completing more implementation steps was also significantly associated with more successful implementation; each implementation step completed was associated with just over 50% higher odds of more successful implementation (p ≤ 0.0001). More successful implementation was associated with leadership support (p < 0.0001) and dedicated time to train staff (p = 0.0189). 
Less successful implementation was associated with resistance among clinical providers to using cognitive aids (p < 0.0001), absence of an implementation champion (p = 0.0126), and unsatisfactory content or design of the cognitive aid (p = 0.0112). Successful implementation of cognitive aids in ORs was associated with a supportive organizational context and following a multi-step implementation process. Building strong organizational support and following a well-planned multi-step implementation process will likely increase the use of OR cognitive aids during intraoperative crises, which may improve patient outcomes.
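The odds-ratio arithmetic behind findings like "just over 50% higher odds per implementation step" can be sketched as follows. The coefficient value below is illustrative only, not the study's fitted model:

```python
import math

def odds_ratio(beta):
    """Odds ratio implied by a logistic-regression coefficient."""
    return math.exp(beta)

def odds_after_steps(base_odds, or_per_step, n_steps):
    """Odds of successful implementation after completing n additional steps."""
    return base_odds * or_per_step ** n_steps

# Illustrative value only: a coefficient of ln(1.5) corresponds to the
# "just over 50% higher odds per implementation step" reported in the study.
beta_step = math.log(1.5)
print(round(odds_ratio(beta_step), 2))          # 1.5
print(round(odds_after_steps(1.0, 1.5, 3), 3))  # 3.375
```

Because odds ratios multiply, three completed steps at OR = 1.5 each compound to roughly 3.4 times the baseline odds.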

  9. Kinetics of the electric double layer formation modelled by the finite difference method

    NASA Astrophysics Data System (ADS)

    Valent, Ivan

    2017-11-01

The dynamics of electric double layer formation in a 100 mM NaCl solution for sudden potential steps of 10 and 20 mV were simulated using the Poisson-Nernst-Planck theory and the VLUGR2 solver for partial differential equations. The approach used was verified by comparing the obtained steady-state solution with the available exact solution. The simulations allowed for a detailed analysis of the relaxation processes of the individual ions and the electric potential. Some computational aspects of the problem are discussed.
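A minimal sketch of the simulated process, using a simple explicit finite-difference scheme in dimensionless units rather than the paper's VLUGR2 solver; all parameter values are illustrative, not those of the study:

```python
import numpy as np

# 1D Poisson-Nernst-Planck relaxation after a potential step (dimensionless:
# lengths in Debye lengths, potential in kT/e, unit diffusivity).
N, L = 101, 10.0
dx = L / (N - 1)
dt = 0.2 * dx ** 2                  # explicit-Euler step within the CFL limit
V = 0.4                             # applied potential step at the wall

cp = np.ones(N)                     # cations, bulk concentration = 1
cm = np.ones(N)                     # anions

# Tridiagonal Laplacian for the Poisson solve on interior nodes
A = (np.diag(-2.0 * np.ones(N - 2))
     + np.diag(np.ones(N - 3), 1)
     + np.diag(np.ones(N - 3), -1))

def solve_poisson(cp, cm):
    """phi'' = -(cp - cm)/2, with phi(0) = V (electrode), phi(L) = 0 (bulk)."""
    b = -0.5 * (cp - cm)[1:-1] * dx ** 2
    b[0] -= V
    phi = np.empty(N)
    phi[0], phi[-1] = V, 0.0
    phi[1:-1] = np.linalg.solve(A, b)
    return phi

for _ in range(2000):
    phi = solve_poisson(cp, cm)
    E = -np.gradient(phi, dx)       # electric field
    for c, z in ((cp, +1), (cm, -1)):
        drift = z * c * E
        lap = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx ** 2
        ddrift = (drift[2:] - drift[:-2]) / (2 * dx)
        c[1:-1] += dt * (lap - ddrift)      # dc/dt = c'' - z (c E)'
        c[0] = c[1] - dx * z * c[1] * E[0]  # zero-flux (blocking) electrode
        c[-1] = 1.0                         # bulk reservoir

# Counter-ions accumulate at the positively stepped wall; co-ions are depleted.
print(cm[0] > 1.0 > cp[0])
```

The structure mirrors the paper's setup (coupled Nernst-Planck transport with a Poisson solve at every time step); the paper's solver, units, and boundary details differ.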

  10. I think I have a good idea: what do I do with it?

    PubMed

    Brigido, Stephen A

    2011-08-01

The orthopaedic device industry is an ever-changing market, often guided by creative surgeons who share the common goal of creating a solution to a problem. While becoming a surgeon-inventor can be both a challenging and a rewarding process, there are several steps that the individual must follow to create intellectual property. This article serves as a guide for the novice surgeon-inventor, intended as an early-stage reference for those interested in taking their "solution to a problem" to the device industry.

  11. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    PubMed

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-treatment symptom change were explored. TSPA allowed a prototypical process pattern to be identified, in which patients' alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, therapists' stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
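The core TSPA step, fitting a VAR(1) model to one individual's session-to-session ratings, can be sketched on synthetic data. The variable interpretation (alliance, self-efficacy) follows the abstract, but the data and fitting code below are ours, not the study's:

```python
import numpy as np

# Synthetic two-variable process time series (e.g., alliance and self-efficacy
# ratings across sessions) generated from a known VAR(1) model, then
# re-estimated by ordinary least squares: y_t = A y_{t-1} + e_t.
rng = np.random.default_rng(0)
T = 200                               # sessions (inflated here for a stable fit)

A_true = np.array([[0.5, 0.3],        # off-diagonal terms encode a temporal
                   [0.3, 0.5]])       # feedback loop between the two variables
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.05, size=2)

X, Y = y[:-1], y[1:]                  # lagged and current observations
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
A_hat = B.T                           # rows: equations, columns: lagged predictors
print(np.round(A_hat, 2))
```

In TSPA such individual coefficient matrices would then be aggregated across patients to look for prototypical patterns.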

  12. Facilitating Improvements in Laboratory Report Writing Skills with Less Grading: A Laboratory Report Peer-Review Process†

    PubMed Central

    Brigati, Jennifer R.; Swann, Jerilyn M.

    2015-01-01

    Incorporating peer-review steps in the laboratory report writing process provides benefits to students, but it also can create additional work for laboratory instructors. The laboratory report writing process described here allows the instructor to grade only one lab report for every two to four students, while giving the students the benefits of peer review and prompt feedback on their laboratory reports. Here we present the application of this process to a sophomore level genetics course and a freshman level cellular biology course, including information regarding class time spent on student preparation activities, instructor preparation, prerequisite student knowledge, suggested learning outcomes, procedure, materials, student instructions, faculty instructions, assessment tools, and sample data. T-tests comparing individual and group grading of the introductory cell biology lab reports yielded average scores that were not significantly different from each other (p = 0.13, n = 23 for individual grading, n = 6 for group grading). T-tests also demonstrated that average laboratory report grades of students using the peer-review process were not significantly different from those of students working alone (p = 0.98, n = 9 for individual grading, n = 6 for pair grading). While the grading process described here does not lead to statistically significant gains (or reductions) in student learning, it allows student learning to be maintained while decreasing instructor workload. This reduction in workload could allow the instructor time to pursue other high-impact practices that have been shown to increase student learning. Finally, we suggest possible modifications to the procedure for application in a variety of settings. PMID:25949758
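The reported comparison can be reproduced in form (not in data) with an independent two-sample t-test; the grade vectors below are invented, and the sample sizes echo the abstract:

```python
import numpy as np
from scipy import stats

# Invented grade data for illustration; the study's p-values came from its own
# lab-report scores (n = 23 individually graded, n = 6 group graded).
rng = np.random.default_rng(42)
individual_grades = rng.normal(85, 5, size=23)
group_grades = rng.normal(85, 5, size=6)

t_stat, p_value = stats.ttest_ind(individual_grades, group_grades)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A large p-value, as in the study, indicates no detectable difference between grading schemes at these sample sizes.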

  13. Source-Modeling Auditory Processes of EEG Data Using EEGLAB and Brainstorm.

    PubMed

    Stropahl, Maren; Bauer, Anna-Katharina R; Debener, Stefan; Bleichner, Martin G

    2018-01-01

Electroencephalography (EEG) source localization approaches are often used to disentangle the spatial patterns mixed up in scalp EEG recordings. However, approaches differ substantially between experiments, may be strongly parameter-dependent, and results are not necessarily meaningful. In this paper we provide a pipeline for EEG source estimation, from raw EEG data pre-processing using EEGLAB functions up to source-level analysis as implemented in Brainstorm. The pipeline is tested using a data set of 10 individuals performing an auditory attention task. The analysis approach estimates sources of 64-channel EEG data without the prerequisite of individual anatomies or individually digitized sensor positions. First, we show advanced EEG pre-processing using EEGLAB, which includes artifact attenuation using independent component analysis (ICA). ICA is a linear decomposition technique that aims to reveal the underlying statistical sources of mixed signals and is also a powerful tool to attenuate stereotypical artifacts (e.g., eye movements or heartbeat). Data submitted to ICA are pre-processed to facilitate good-quality decompositions. Aiming toward an objective approach to component identification, the semi-automatic CORRMAP algorithm is applied for the identification of components representing prominent and stereotypic artifacts. Second, we present a step-wise approach to estimate active sources of auditory cortex event-related processing, at the single-subject level. The presented approach assumes that no individual anatomy is available and therefore the default anatomy ICBM152, as implemented in Brainstorm, is used for all individuals. Individual noise modeling in this dataset is based on the pre-stimulus baseline period. For EEG source modeling we use the OpenMEEG algorithm as the underlying forward model based on the symmetric Boundary Element Method (BEM). 
We then apply the method of dynamical statistical parametric mapping (dSPM) to obtain physiologically plausible EEG source estimates. Finally, we show how to perform group level analysis in the time domain on anatomically defined regions of interest (auditory scout). The proposed pipeline needs to be tailored to the specific datasets and paradigms. However, the straightforward combination of EEGLAB and Brainstorm analysis tools may be of interest to others performing EEG source localization.
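The ICA-based artifact attenuation step can be illustrated outside EEGLAB. Here scikit-learn's FastICA stands in for EEGLAB's ICA, a three-channel recording is synthesized, and kurtosis stands in for CORRMAP as the component-selection rule; everything below is our illustration, not the paper's pipeline:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic 3-channel "recording": a 10 Hz "neural" sine, broadband noise,
# and a spiky periodic "blink" artifact, mixed by an unknown matrix.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
neural = np.sin(2 * np.pi * 10 * t)
noise = 0.5 * rng.standard_normal(t.size)
blink = (np.abs(t % 2.0 - 1.0) < 0.05) * 5.0
S = np.c_[neural, noise, blink]

A = rng.standard_normal((3, 3))        # unknown mixing matrix
X = S @ A.T                            # observed "scalp" channels

ica = FastICA(n_components=3, random_state=0)
sources = ica.fit_transform(X)         # estimated independent components

# Identify the artifact component by its kurtosis (spiky = very peaked),
# zero it out, and reconstruct the cleaned channels.
kurt = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2
sources[:, np.argmax(kurt)] = 0.0
X_clean = ica.inverse_transform(sources)
print(X_clean.shape)
```

Zeroing a component before back-projection is the same attenuation logic the abstract describes for stereotypical eye-movement and heartbeat artifacts.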

  14. Ocean Acidification Portends Acute Habitat Compression for Atlantic Cod (Gadus morhua) in a Physiologically-informed Metabolic Rate Model

    NASA Astrophysics Data System (ADS)

    Del Raye, G.; Weng, K.

    2011-12-01

    Ocean acidification affects organisms on a biochemical scale, yet its societal impacts manifest from changes that propagate through entire populations. Successful forecasting of the effects of ocean acidification therefore depends on at least two steps: (1) deducing systemic physiology based on subcellular stresses and (2) scaling individual physiology up to ecosystem processes. Predictions that are based on known biological processes (process-based models) may fare better than purely statistical models in both these steps because the latter are less robust to novel environmental conditions. Here we present a process-based model that uses temperature, pO2, and pCO2 to predict maximal aerobic scope in Atlantic cod. Using this model, we show that (i) experimentally-derived physiological parameters are sufficient to capture the response of cod aerobic scope to temperature and oxygen, and (ii) subcellular pH effects can be used to predict the systemic physiological response of cod to an acidified ocean. We predict that acute pH stress (on a scale of hours) could limit the mobility of Atlantic cod during diel vertical migration across a pCO2 gradient, promoting habitat compression. Finally, we use a global sensitivity analysis to identify opportunities for the improvement of model uncertainty as well as some physiological adaptations that could mitigate climate stresses on cod in the future.
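A deliberately simplified, hypothetical illustration of the modelling idea (not the authors' parameterization): maximum metabolic rate depends on temperature, oxygen availability, and a CO2 penalty; standard metabolic rate depends on temperature alone; aerobic scope is their difference. Every constant below is a placeholder:

```python
# All constants are invented placeholders, not fitted Atlantic cod values.
def aerobic_scope(T, pO2, pCO2, T_ref=12.0, q10=2.0, k_O2=5.0, k_CO2=80.0):
    """Toy process-based aerobic scope (arbitrary units).

    T in deg C; pO2 and pCO2 in arbitrary pressure units."""
    smr = 0.5 * q10 ** ((T - T_ref) / 10.0)     # standard metabolic rate
    o2_lim = pO2 / (k_O2 + pO2)                 # Michaelis-Menten O2 limitation
    co2_penalty = k_CO2 / (k_CO2 + pCO2)        # acidosis penalty on max rate
    mmr = 3.0 * q10 ** ((T - T_ref) / 10.0) * o2_lim * co2_penalty
    return mmr - smr

# Scope shrinks as pCO2 rises at fixed T and pO2 -- the mechanism behind the
# predicted habitat compression across a pCO2 gradient.
print(aerobic_scope(12, 20, 40) > aerobic_scope(12, 20, 400))
```

The point of the sketch is the structure (subcellular CO2 effects entering as a penalty on systemic maximal rate), which is the scaling step the abstract describes.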

  15. Three-dimensional rotation electron diffraction: software RED for automated data collection and data processing

    PubMed Central

    Wan, Wei; Sun, Junliang; Su, Jie; Hovmöller, Sven; Zou, Xiaodong

    2013-01-01

    Implementation of a computer program package for automated collection and processing of rotation electron diffraction (RED) data is described. The software package contains two computer programs: RED data collection and RED data processing. The RED data collection program controls the transmission electron microscope and the camera. Electron beam tilts at a fine step (0.05–0.20°) are combined with goniometer tilts at a coarse step (2.0–3.0°) around a common tilt axis, which allows a fine relative tilt to be achieved between the electron beam and the crystal in a large tilt range. An electron diffraction (ED) frame is collected at each combination of beam tilt and goniometer tilt. The RED data processing program processes three-dimensional ED data generated by the RED data collection program or by other approaches. It includes shift correction of the ED frames, peak hunting for diffraction spots in individual ED frames and identification of these diffraction spots as reflections in three dimensions. Unit-cell parameters are determined from the positions of reflections in three-dimensional reciprocal space. All reflections are indexed, and finally a list with hkl indices and intensities is output. The data processing program also includes a visualizer to view and analyse three-dimensional reciprocal lattices reconstructed from the ED frames. Details of the implementation are described. Data collection and data processing with the software RED are demonstrated using a calcined zeolite sample, silicalite-1. The structure of the calcined silicalite-1, with 72 unique atoms, could be solved from the RED data by routine direct methods. PMID:24282334
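The combination of coarse goniometer tilts with fine beam tilts can be sketched as a frame schedule. The step sizes follow the ranges quoted in the abstract (beam tilt 0.05-0.20°, goniometer tilt 2.0-3.0°); the schedule logic itself is our illustration, not the RED implementation:

```python
def tilt_schedule(gonio_start, gonio_end, gonio_step, beam_step):
    """Absolute tilt angles (deg) at which ED frames would be collected:
    fine beam tilts fill in between successive coarse goniometer tilts."""
    angles = []
    n_beam = round(gonio_step / beam_step)   # beam tilts per goniometer step
    gonio = gonio_start
    while gonio < gonio_end - 1e-9:
        for i in range(n_beam):
            angles.append(round(gonio + i * beam_step, 6))
        gonio += gonio_step
    angles.append(gonio_end)
    return angles

frames = tilt_schedule(-30.0, 30.0, 2.0, 0.10)
print(len(frames))   # 601 frames: 0.10-deg sampling over a 60-deg range
```

This is what allows a fine relative tilt over a large total range: the goniometer provides the coarse coverage and the beam tilt the fine sampling within each coarse step.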

  16. Three-dimensional rotation electron diffraction: software RED for automated data collection and data processing.

    PubMed

    Wan, Wei; Sun, Junliang; Su, Jie; Hovmöller, Sven; Zou, Xiaodong

    2013-12-01

    Implementation of a computer program package for automated collection and processing of rotation electron diffraction (RED) data is described. The software package contains two computer programs: RED data collection and RED data processing. The RED data collection program controls the transmission electron microscope and the camera. Electron beam tilts at a fine step (0.05-0.20°) are combined with goniometer tilts at a coarse step (2.0-3.0°) around a common tilt axis, which allows a fine relative tilt to be achieved between the electron beam and the crystal in a large tilt range. An electron diffraction (ED) frame is collected at each combination of beam tilt and goniometer tilt. The RED data processing program processes three-dimensional ED data generated by the RED data collection program or by other approaches. It includes shift correction of the ED frames, peak hunting for diffraction spots in individual ED frames and identification of these diffraction spots as reflections in three dimensions. Unit-cell parameters are determined from the positions of reflections in three-dimensional reciprocal space. All reflections are indexed, and finally a list with hkl indices and intensities is output. The data processing program also includes a visualizer to view and analyse three-dimensional reciprocal lattices reconstructed from the ED frames. Details of the implementation are described. Data collection and data processing with the software RED are demonstrated using a calcined zeolite sample, silicalite-1. The structure of the calcined silicalite-1, with 72 unique atoms, could be solved from the RED data by routine direct methods.

  17. Academic-Community Hospital Comparison of Vulnerabilities in Door-to-Needle Process for Acute Ischemic Stroke.

    PubMed

    Prabhakaran, Shyam; Khorzad, Rebeca; Brown, Alexandra; Nannicelli, Anna P; Khare, Rahul; Holl, Jane L

    2015-10-01

Although best practices have been developed for achieving door-to-needle (DTN) times ≤60 minutes for stroke thrombolysis, critical DTN process failures persist. We sought to compare these failures in the Emergency Department at an academic medical center and a community hospital. Failure modes, effects, and criticality analysis (FMECA) was used to identify system and process failures. Multidisciplinary teams involved in DTN care participated in moderated sessions at each site. As a result, DTN process maps were created and potential failures and their causes, frequency, severity, and existing safeguards were identified. For each failure, a risk priority number and criticality score were calculated; failures were then ranked, with the highest scores representing the most critical failures and targets for intervention. We detected a total of 70 failures in 50 process steps and 76 failures in 42 process steps at the community hospital and academic medical center, respectively. At the community hospital, critical failures included (1) delay in registration because of Emergency Department overcrowding, (2) incorrect triage diagnosis among walk-in patients, and (3) delay in obtaining consent for thrombolytic treatment. At the academic medical center, critical failures included (1) incorrect triage diagnosis among walk-in patients, (2) delay in stroke team activation, and (3) delay in obtaining computed tomographic imaging. Although the identification of common critical failures suggests opportunities for a generalizable process redesign, differences in the criticality and nature of failures must be addressed at the individual hospital level, to develop robust and sustainable solutions to reduce DTN time. © 2015 American Heart Association, Inc.
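The ranking step can be sketched directly. In the common formulation, each failure mode receives a risk priority number (RPN = severity × occurrence × detection) and a criticality score (severity × occurrence), and failures are sorted by score; the failure modes below paraphrase the abstract, while the 1-10 ratings are invented for illustration:

```python
# Invented ratings on the usual 1-10 scales; the study derived its own scores
# from multidisciplinary team sessions.
failures = [
    {"mode": "registration delay (ED overcrowding)", "sev": 8, "occ": 7, "det": 4},
    {"mode": "incorrect triage diagnosis (walk-in)",  "sev": 9, "occ": 5, "det": 6},
    {"mode": "delay obtaining thrombolysis consent",  "sev": 7, "occ": 6, "det": 3},
]

for f in failures:
    f["rpn"] = f["sev"] * f["occ"] * f["det"]          # risk priority number
    f["criticality"] = f["sev"] * f["occ"]             # criticality score

# Highest scores first: these are the targets for intervention.
for f in sorted(failures, key=lambda f: f["rpn"], reverse=True):
    print(f'{f["rpn"]:4d}  {f["criticality"]:3d}  {f["mode"]}')
```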

  18. Trunk, pelvis and hip biomechanics in individuals with femoroacetabular impingement syndrome: Strategies for step ascent.

    PubMed

    Diamond, Laura E; Bennell, Kim L; Wrigley, Tim V; Hinman, Rana S; Hall, Michelle; O'Donnell, John; Hodges, Paul W

    2018-03-01

Femoroacetabular impingement (FAI) syndrome is common among young active adults and a proposed risk factor for the future development of hip osteoarthritis. Pain is dominant and drives clinical decision-making. Evidence for altered hip joint function in this patient population is inconsistent, making the identification of treatment targets challenging. A broader assessment, considering adjacent body segments (i.e. pelvis, trunk) and individual movement strategies, may better inform treatment programs. This exploratory study aimed to compare trunk, pelvis, and hip biomechanics during step ascent between individuals with and without FAI syndrome. Fifteen participants diagnosed with symptomatic cam-type or combined (cam plus pincer) FAI who were scheduled for arthroscopic surgery, and 11 age- and sex-comparable pain- and disease-free individuals, underwent three-dimensional motion analysis during a step ascent task. Trunk, pelvis and hip biomechanics were compared between groups. Participants with FAI syndrome exhibited altered ipsilateral trunk lean and pelvic rise towards the symptomatic side during single-leg support compared to controls. Alterations were not uniformly adopted across all individuals with FAI syndrome; those who exhibited more pronounced alterations to frontal plane pelvis control tended to report pain during the task. There were minimal between-group differences for hip biomechanics. Exploratory data suggest biomechanics at the trunk and pelvis during step ascent differ between individuals with and without FAI syndrome. Those with FAI syndrome implement a range of proximal strategies for task completion, some of which may have relevance for rehabilitation. Longitudinal investigations of larger cohorts are required to evaluate hypothesized clinical and structural consequences. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. High- and low-level hierarchical classification algorithm based on source separation process

    NASA Astrophysics Data System (ADS)

    Loghmari, Mohamed Anis; Karray, Emna; Naceur, Mohamed Saber

    2016-10-01

    High-dimensional data applications have earned great attention in recent years. We focus on remote sensing data analysis on high-dimensional space like hyperspectral data. From a methodological viewpoint, remote sensing data analysis is not a trivial task. Its complexity is caused by many factors, such as large spectral or spatial variability as well as the curse of dimensionality. The latter describes the problem of data sparseness. In this particular ill-posed problem, a reliable classification approach requires appropriate modeling of the classification process. The proposed approach is based on a hierarchical clustering algorithm in order to deal with remote sensing data in high-dimensional space. Indeed, one obvious method to perform dimensionality reduction is to use the independent component analysis process as a preprocessing step. The first particularity of our method is the special structure of its cluster tree. Most of the hierarchical algorithms associate leaves to individual clusters, and start from a large number of individual classes equal to the number of pixels; however, in our approach, leaves are associated with the most relevant sources which are represented according to mutually independent axes to specifically represent some land covers associated with a limited number of clusters. These sources contribute to the refinement of the clustering by providing complementary rather than redundant information. The second particularity of our approach is that at each level of the cluster tree, we combine both a high-level divisive clustering and a low-level agglomerative clustering. This approach reduces the computational cost since the high-level divisive clustering is controlled by a simple Boolean operator, and optimizes the clustering results since the low-level agglomerative clustering is guided by the most relevant independent sources. 
At each new step, we thus obtain a finer partition that feeds back into the clustering process, enhancing semantic capabilities and yielding good identification rates.
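The two-stage idea (ICA as a dimensionality-reduction preprocessing step, then clustering of pixels in the reduced source space) can be sketched with off-the-shelf tools on synthetic data; the library choices and the toy "hyperspectral" data are ours, not the authors':

```python
import numpy as np
from sklearn.decomposition import FastICA
from scipy.cluster.hierarchy import linkage, fcluster

# Two latent "land cover" sources mixed into many spectral bands.
rng = np.random.default_rng(1)
n_pixels, n_bands = 300, 50
labels_true = rng.integers(0, 2, n_pixels)
sources = np.c_[labels_true, 1 - labels_true] + 0.05 * rng.standard_normal((n_pixels, 2))
mixing = rng.standard_normal((2, n_bands))
X = sources @ mixing + 0.01 * rng.standard_normal((n_pixels, n_bands))

# Step 1: reduce to mutually independent components (the preprocessing step).
Z = FastICA(n_components=2, random_state=0).fit_transform(X)

# Step 2: low-level agglomerative clustering in the reduced source space.
tree = linkage(Z, method="ward")
labels = fcluster(tree, t=2, criterion="maxclust")
print(len(set(labels)))
```

In the paper's scheme this low-level agglomerative stage is additionally guided by a high-level divisive stage at each level of the cluster tree; the sketch shows only the ICA-then-cluster backbone.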

  20. Impact of modellers' decisions on hydrological a priori predictions

    NASA Astrophysics Data System (ADS)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the records needed for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how they improved their predictions over three steps as information was added before each successive step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed largely. Hence, they encountered two problems: (i) simulating discharge for an ungauged catchment and (ii) using models developed for catchments that are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response and therefore had to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. 
For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of added information. In this qualitative analysis of a statistically small number of predictions we learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as adding data for improving parameters needed to satisfy model requirements.

  1. Using Geographic Information Systems for Exposure Assessment in Environmental Epidemiology Studies

    PubMed Central

    Nuckols, John R.; Ward, Mary H.; Jarup, Lars

    2004-01-01

    Geographic information systems (GIS) are being used with increasing frequency in environmental epidemiology studies. Reported applications include locating the study population by geocoding addresses (assigning mapping coordinates), using proximity analysis of contaminant source as a surrogate for exposure, and integrating environmental monitoring data into the analysis of the health outcomes. Although most of these studies have been ecologic in design, some have used GIS in estimating environmental levels of a contaminant at the individual level and to design exposure metrics for use in epidemiologic studies. In this article we discuss fundamentals of three scientific disciplines instrumental to using GIS in exposure assessment for epidemiologic studies: geospatial science, environmental science, and epidemiology. We also explore how a GIS can be used to accomplish several steps in the exposure assessment process. These steps include defining the study population, identifying source and potential routes of exposure, estimating environmental levels of target contaminants, and estimating personal exposures. We present and discuss examples for the first three steps. We discuss potential use of GIS and global positioning systems (GPS) in the last step. On the basis of our findings, we conclude that the use of GIS in exposure assessment for environmental epidemiology studies is not only feasible but can enhance the understanding of the association between contaminants in our environment and disease. PMID:15198921
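Proximity analysis as an exposure surrogate can be sketched as follows: each residence's great-circle distance to a contaminant source is computed and compared against a buffer. The coordinates and the 2 km buffer below are invented for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    R = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

source = (51.50, -0.12)                      # hypothetical emission source
homes = {"A": (51.51, -0.12), "B": (51.60, -0.30)}

# Classify each geocoded residence by a 2 km exposure buffer.
for ident, (lat, lon) in homes.items():
    d = haversine_km(lat, lon, *source)
    print(ident, round(d, 1), "exposed" if d <= 2.0 else "unexposed")
```

In a real study the buffer distance would be chosen from the environmental science of the contaminant's transport, and GIS overlay would replace this per-point loop.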

  2. Direct Sensor Orientation of a Land-Based Mobile Mapping System

    PubMed Central

    Rau, Jiann-Yeou; Habib, Ayman F.; Kersting, Ana P.; Chiang, Kai-Wei; Bang, Ki-In; Tseng, Yi-Hsing; Li, Yu-Hua

    2011-01-01

A land-based mobile mapping system (MMS) is flexible and useful for the acquisition of road environment geospatial information. It integrates a set of imaging sensors and a position and orientation system (POS). The positioning quality of such systems is highly dependent on the accuracy of the utilized POS. The major drawback of such systems is the elevated cost associated with high-end GPS/INS units, particularly the inertial system. The potential accuracy of the direct sensor orientation depends on the architecture and quality of the GPS/INS integration process as well as the validity of the system calibration (i.e., calibration of the individual sensors as well as the system mounting parameters). In this paper, a novel single-step procedure using integrated sensor orientation with a relative orientation constraint for the estimation of the mounting parameters is introduced. A comparative analysis between the proposed single-step and the traditional two-step procedure is carried out. Moreover, the estimated mounting parameters using the different methods are used in a direct geo-referencing procedure to evaluate their performance and the feasibility of the implemented system. Experimental results show that the proposed system using the single-step system calibration method can achieve high 3D positioning accuracy. PMID:22164015
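The direct geo-referencing step that the mounting parameters feed into is commonly written as r_ground = r_GNSS + R_INS (R_boresight p_cam + lever arm), where the boresight rotation and lever arm are exactly the mounting parameters being calibrated. A sketch with invented numbers:

```python
import numpy as np

def rot_z(deg):
    """Rotation about the z axis by the given angle in degrees."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def georeference(p_cam, R_ins, r_gnss, R_boresight, lever_arm):
    """Map a camera-frame point into ground (mapping-frame) coordinates."""
    return r_gnss + R_ins @ (R_boresight @ p_cam + lever_arm)

# All numeric values are invented for illustration.
R_boresight = rot_z(0.5)                  # small camera/IMU mounting misalignment
lever_arm = np.array([0.10, 0.00, 1.50])  # camera offset from GNSS antenna (m)
R_ins = rot_z(90.0)                       # vehicle attitude from the INS
r_gnss = np.array([1000.0, 2000.0, 50.0])

p = georeference(np.array([5.0, 0.0, 0.0]), R_ins, r_gnss, R_boresight, lever_arm)
print(np.round(p, 2))
```

Errors in the boresight rotation scale with object distance, which is why mounting-parameter calibration dominates the achievable positioning accuracy.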

  3. Energy transducing redox steps of the Na+-pumping NADH:quinone oxidoreductase from Vibrio cholerae

    PubMed Central

    Juárez, Oscar; Morgan, Joel E.; Nilges, Mark J.; Barquera, Blanca

    2010-01-01

    Na+-NQR is a unique respiratory enzyme that couples the free energy of electron transfer reactions to electrogenic pumping of sodium across the cell membrane. This enzyme is found in many marine and pathogenic bacteria where it plays an analogous role to the H+-pumping complex I. It has generally been assumed that the sodium pump of Na+-NQR operates on the basis of thermodynamic coupling between reduction of a single redox cofactor and the binding of sodium at a nearby site. In this study, we have defined the coupling to sodium translocation of individual steps in the redox reaction of Na+-NQR. Sodium uptake takes place in the reaction step in which an electron moves from the 2Fe-2S center to FMNC, while the translocation of sodium across the membrane dielectric (and probably its release into the external medium) occurs when an electron moves from FMNB to riboflavin. This argues against a single-site coupling model because the redox steps that drive these two parts of the sodium pumping process do not have any redox cofactor in common. The significance of these results for the mechanism of coupling is discussed, and we proposed that Na+-NQR operates through a novel mechanism based on kinetic coupling, mediated by conformational changes. PMID:20616050

  4. PANGEA: pipeline for analysis of next generation amplicons

    PubMed Central

    Giongo, Adriana; Crabb, David B; Davis-Richardson, Austin G; Chauliac, Diane; Mobberley, Jennifer M; Gano, Kelsey A; Mukherjee, Nabanita; Casella, George; Roesch, Luiz FW; Walts, Brandon; Riva, Alberto; King, Gary; Triplett, Eric W

    2010-01-01

    High-throughput DNA sequencing can identify organisms and describe population structures in many environmental and clinical samples. Current technologies generate millions of reads in a single run, requiring extensive computational strategies to organize, analyze and interpret those sequences. A series of bioinformatics tools for high-throughput sequencing analysis, including preprocessing, clustering, database matching and classification, have been compiled into a pipeline called PANGEA. The PANGEA pipeline was written in Perl and can be run on Mac OSX, Windows or Linux. With PANGEA, sequences obtained directly from the sequencer can be processed quickly to provide the files needed for sequence identification by BLAST and for comparison of microbial communities. Two different sets of bacterial 16S rRNA sequences were used to show the efficiency of this workflow. The first set of 16S rRNA sequences is derived from various soils from Hawaii Volcanoes National Park. The second set is derived from stool samples collected from diabetes-resistant and diabetes-prone rats. The workflow described here allows the investigator to quickly assess libraries of sequences on personal computers with customized databases. PANGEA is provided for users as individual scripts for each step in the process or as a single script where all processes, except the χ2 step, are joined into one program called the ‘backbone’. PMID:20182525
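The backbone idea (independent steps that can each run alone or be chained into one driver) can be sketched in Python with toy step internals; the real pipeline is a set of Perl scripts around BLAST, so everything below is a structural illustration only:

```python
def preprocess(reads, min_len=8):
    """Toy quality filter: drop short reads and reads with ambiguous 'N' bases."""
    return [r for r in reads if len(r) >= min_len and "N" not in r]

def cluster(reads):
    """Toy clustering: group identical sequences."""
    groups = {}
    for r in reads:
        groups.setdefault(r, []).append(r)
    return groups

def classify(groups, reference):
    """Toy database match: label each cluster by exact lookup (BLAST stand-in)."""
    return {seq: reference.get(seq, "unclassified") for seq in groups}

def backbone(reads, reference):
    """Chain the individual steps, as PANGEA's 'backbone' script does."""
    return classify(cluster(preprocess(reads)), reference)

reference = {"ACGTACGTAA": "Taxon_1"}   # hypothetical reference entry
reads = ["ACGTACGTAA", "ACGTACGTAA", "ACGNACGTAA", "ACGT"]
print(backbone(reads, reference))
```

Keeping each step an ordinary function mirrors the pipeline's design choice of shipping both individual scripts and a single joined driver.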

  5. Purification process for vertically aligned carbon nanofibers

    NASA Technical Reports Server (NTRS)

    Nguyen, Cattien V.; Delziet, Lance; Matthews, Kristopher; Chen, Bin; Meyyappan, M.

    2003-01-01

    Individual, free-standing, vertically aligned multiwall carbon nanotubes or nanofibers are ideal for sensor and electrode applications. Our plasma-enhanced chemical vapor deposition techniques for producing free-standing and vertically aligned carbon nanofibers use catalyst particles at the tip of the fiber. Here we present a simple purification process for the removal of iron catalyst particles at the tip of vertically aligned carbon nanofibers derived by plasma-enhanced chemical vapor deposition. The first step involves thermal oxidation in air, at temperatures of 200-400 degrees C, resulting in the physical swelling of the iron particles from the formation of iron oxide. Subsequently, the complete removal of the iron oxide particles is achieved with diluted acid (12% HCl). The purification process appears to be very efficient at removing all of the iron catalyst particles. Electron microscopy images and Raman spectroscopy data indicate that the purification process does not damage the graphitic structure of the nanotubes.

  6. An integrated process for the extraction of fuel and chemicals from marine macroalgal biomass

    NASA Astrophysics Data System (ADS)

    Trivedi, Nitin; Baghel, Ravi S.; Bothwell, John; Gupta, Vishal; Reddy, C. R. K.; Lali, Arvind M.; Jha, Bhavanath

    2016-07-01

    We describe an integrated process that can be applied to biomass of the green seaweed, Ulva fasciata, to allow the sequential recovery of four economically important fractions: mineral-rich liquid extract (MRLE), lipid, ulvan, and cellulose. The main benefits of our process are: a) its simplicity and b) the consistent yields obtained from the residual biomass after each successive extraction step. For example, dry Ulva biomass yields ~26% of its starting mass as MRLE, ~3% as lipid, ~25% as ulvan, and ~11% as cellulose, with the enzymatic hydrolysis and fermentation of the final cellulose fraction under optimized conditions producing ethanol at a competitive 0.45 g/g reducing sugar. These yields are comparable to those obtained by direct processing of the individual components from primary biomass. We propose that this integration of ethanol production and chemical feedstock recovery from macroalgal biomass could substantially enhance the sustainability of marine biomass use.
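As a quick worked example of the mass balance these yields imply, consider 1 kg of dry biomass. The ethanol figure below assumes, purely for illustration, that the cellulose fraction is hydrolysed to an equal mass of reducing sugar; the actual hydrolysis yield is not stated in the abstract.

```python
# Mass balance for 1 kg dry Ulva biomass using the quoted fractional yields.
DRY_BIOMASS_G = 1000.0
yields = {"MRLE": 0.26, "lipid": 0.03, "ulvan": 0.25, "cellulose": 0.11}

masses = {name: DRY_BIOMASS_G * frac for name, frac in yields.items()}

# Illustrative assumption: cellulose -> equal mass of reducing sugar,
# then 0.45 g ethanol per g reducing sugar (the figure quoted above).
ethanol_g = masses["cellulose"] * 0.45
```

Under these assumptions, 1 kg of dry biomass gives roughly 110 g of cellulose and on the order of 50 g of ethanol, alongside the MRLE, lipid, and ulvan fractions.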

  7. Single-Cell RT-PCR in Microfluidic Droplets with Integrated Chemical Lysis.

    PubMed

    Kim, Samuel C; Clark, Iain C; Shahi, Payam; Abate, Adam R

    2018-01-16

    Droplet microfluidics can identify and sort cells using digital reverse transcription polymerase chain reaction (RT-PCR) signals from individual cells. However, current methods require multiple microfabricated devices for enzymatic cell lysis and PCR reagent addition, making the process complex and prone to failure. Here, we describe a new approach that integrates all components into a single device. The method enables controlled exposure of isolated single cells to a high pH buffer, which lyses cells and inactivates reaction inhibitors but can be instantly neutralized with RT-PCR buffer. Using our chemical lysis approach, we distinguish individual cells' gene expression with data quality equivalent to more complex two-step workflows. Our system accepts cells and produces droplets ready for amplification, making single-cell droplet RT-PCR faster and more reliable.

  8. The extended Beer-Lambert theory for ray tracing modeling of LED chip-scaled packaging application with multiple luminescence materials

    NASA Astrophysics Data System (ADS)

    Yuan, Cadmus C. A.

    2015-12-01

    Optical ray tracing models have applied the Beer-Lambert method to single luminescence material systems to model the white light pattern produced from a blue LED light source. This paper extends the algorithm to a mixed multiple luminescence material system by introducing the equivalent excitation and emission spectra of the individual luminescence materials. The quantum efficiencies of the individual materials and the self-absorption of the multiple luminescence material system are considered as well. With this combination, researchers can model the luminescence characteristics of LED chip-scaled packaging (CSP), which offers simple process steps and freedom in the geometrical dimensions of the luminescence material. The method is first validated against experimental results; a parametric investigation is then conducted.
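The Beer-Lambert attenuation underlying such ray tracing can be sketched simply: with several absorbing materials mixed along one ray path, the transmitted fraction is exp(-Σᵢ μᵢcᵢL). This is a generic sketch of that law, not the paper's algorithm; the coefficients are invented.

```python
import math

def transmittance(path_len_mm, materials):
    """Beer-Lambert transmittance along one ray path.

    materials: list of (mu, c) pairs, where mu is an absorption
    coefficient (per mm per unit concentration) and c a concentration.
    """
    total_attenuation = sum(mu * c for mu, c in materials)
    return math.exp(-total_attenuation * path_len_mm)

# Hypothetical two-phosphor mix: (mu, concentration) pairs.
phosphors = [(0.8, 0.5), (0.3, 1.0)]
T = transmittance(2.0, phosphors)
```

A full CSP model would additionally re-emit the absorbed energy via each material's emission spectrum and quantum efficiency, which is the extension the abstract describes.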

  9. Model-based prognostics for batteries which estimates useful life and uses a probability density function

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor)

    2012-01-01

    This invention develops a mathematical model to describe battery behavior during individual discharge cycles as well as over its cycle life. The basis for the form of the model has been linked to the internal processes of the battery and validated using experimental data. Effects of temperature and load current have also been incorporated into the model. Subsequently, the model has been used in a Particle Filtering framework to make predictions of remaining useful life for individual discharge cycles as well as for cycle life. The prediction performance was found to be satisfactory as measured by performance metrics customized for prognostics for a sample case. The work presented here provides initial steps towards a comprehensive health management solution for energy storage devices.
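The particle-filtering idea described above can be illustrated with a minimal bootstrap filter tracking a fading battery capacity. The fade model, noise levels, and particle count are all invented for illustration; this is not the patented model.

```python
import math
import random

random.seed(0)

FADE = 0.01          # assumed per-cycle capacity fade (normalised units)
N = 500
particles = [1.0] * N  # all particles start at full capacity

def pf_step(particles, measurement, meas_sigma=0.02, proc_sigma=0.002):
    """One predict/update/resample cycle of a bootstrap particle filter."""
    # Predict: apply the deterministic fade model plus process noise.
    pred = [p - FADE + random.gauss(0.0, proc_sigma) for p in particles]
    # Update: weight each particle by the Gaussian likelihood of the
    # observed capacity measurement.
    w = [math.exp(-((measurement - p) ** 2) / (2 * meas_sigma ** 2))
         for p in pred]
    # Resample proportionally to weight.
    return random.choices(pred, weights=w, k=len(pred))

truth = 1.0
for cycle in range(50):
    truth -= FADE
    z = truth + random.gauss(0.0, 0.02)  # simulated noisy capacity reading
    particles = pf_step(particles, z)

estimate = sum(particles) / len(particles)  # capacity estimate after 50 cycles
```

Remaining-useful-life prediction then amounts to propagating the particles forward under the fade model until the estimate crosses a failure threshold.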

  10. Methanol synthesis on ZnO(0001{sup ¯}). IV. Reaction mechanisms and electronic structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frenzel, Johannes, E-mail: johannes.frenzel@theochem.rub.de; Marx, Dominik

    2014-09-28

    Methanol synthesis from CO and H{sub 2} over ZnO, which requires high temperatures and high pressures giving rise to a complex interplay of physical and chemical processes over this heterogeneous catalyst surface, is investigated using ab initio simulations. The redox properties of the surrounding gas phase are known to directly impact on the catalyst properties and thus set the overall catalytic reactivity of this easily reducible oxide material. In Paper III of our series [J. Kiss, J. Frenzel, N. N. Nair, B. Meyer, and D. Marx, J. Chem. Phys. 134, 064710 (2011)] we have qualitatively shown that for the partially hydroxylated and defective ZnO(0001{sup ¯}) surface there exists an intricate network of surface chemical reactions. In the present study, we employ advanced molecular dynamics techniques to resolve in detail this reaction network in terms of elementary steps on the defective surface, which is in stepwise equilibrium with the gas phase. The two individual reduction steps were investigated by ab initio metadynamics sampling of free energy landscapes in three-dimensional reaction subspaces. By also sampling adsorption and desorption processes and thus molecular species that are in the gas phase but close to the surface, our approach successfully generated several alternative pathways of methanol synthesis. The obtained results suggest an Eley-Rideal mechanism for both reduction steps, thus involving “near-surface” molecules from the gas phase, to give methanol preferentially over a strongly reduced catalyst surface, while important side reactions are of Langmuir-Hinshelwood type. Catalyst re-reduction by H{sub 2} stemming from the gas phase is a crucial process after each reduction step in order to maintain the catalyst's activity toward methanol formation and to close the catalytic cycle in some reaction channels.
Furthermore, the role of oxygen vacancies, side reactions, and spectator species is investigated, and mechanistic details are discussed based on extensive electronic structure analysis.

  11. US Cystic Fibrosis Foundation and European Cystic Fibrosis Society consensus recommendations for the management of non-tuberculous mycobacteria in individuals with cystic fibrosis.

    PubMed

    Floto, R Andres; Olivier, Kenneth N; Saiman, Lisa; Daley, Charles L; Herrmann, Jean-Louis; Nick, Jerry A; Noone, Peadar G; Bilton, Diana; Corris, Paul; Gibson, Ronald L; Hempstead, Sarah E; Koetz, Karsten; Sabadosa, Kathryn A; Sermet-Gaudelus, Isabelle; Smyth, Alan R; van Ingen, Jakko; Wallace, Richard J; Winthrop, Kevin L; Marshall, Bruce C; Haworth, Charles S

    2016-01-01

    Non-tuberculous mycobacteria (NTM) are ubiquitous environmental organisms that can cause chronic pulmonary infection, particularly in individuals with pre-existing inflammatory lung disease such as cystic fibrosis (CF). Pulmonary disease caused by NTM has emerged as a major threat to the health of individuals with CF but remains difficult to diagnose and problematic to treat. In response to this challenge, the US Cystic Fibrosis Foundation (CFF) and the European Cystic Fibrosis Society (ECFS) convened an expert panel of specialists to develop consensus recommendations for the screening, investigation, diagnosis and management of NTM pulmonary disease in individuals with CF. Nineteen experts were invited to participate in the recommendation development process. Population, Intervention, Comparison, Outcome (PICO) methodology and systematic literature reviews were employed to inform draft recommendations. An anonymous voting process was used by the committee to reach consensus. All committee members were asked to rate each statement on a scale of: 0, completely disagree, to 9, completely agree; with 80% or more of scores between 7 and 9 being considered 'good' agreement. Additionally, the committee solicited feedback from the CF communities in the USA and Europe and considered the feedback in the development of the final recommendation statements. Three rounds of voting were conducted to achieve 80% consensus for each recommendation statement. Through this process, we have generated a series of pragmatic, evidence-based recommendations for the screening, investigation, diagnosis and treatment of NTM infection in individuals with CF as an initial step in optimising management for this challenging condition. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  12. Implementing an organised cervical screening programme in the Republic of Moldova-Stakeholder identification and engagement.

    PubMed

    Davies, Philip; Valuta, Diana; Cojohari, Natalia; Sancho-Garnier, Helene

    2017-10-01

    Successfully implementing cervical screening programmes requires them to be adapted to the local context and have broad stakeholder support. This can be achieved by actively engaging local stakeholders in planning as well as implementing the programmes. The Moldovan government started implementing an organised cervical screening programme in 2010 with the first step being stakeholder identification and engagement. This process started by contacting easily identified stakeholders with each asked to recommend others and the process continued until no new ones were identified. Stakeholders were then involved in a series of individual and group meetings over a 2-year period to build confidence and encourage progressively greater engagement. In total, 87 individuals from 46 organisations were identified. Over the 2-year process, the individual and group meetings facilitated a change in stakeholder attitudes from disinterest, to acceptance and finally to active cooperation in designing the screening programme and preparing an implementation plan that were both well adapted to the Moldovan context. Developing the broad support needed to implement cervical screening programmes required ongoing interaction with stakeholders over an extended period. This interaction allowed stakeholder concerns to be identified and addressed, progress to be demonstrated, and stakeholders to be educated about organised screening programmes so they had the knowledge to progressively take greater responsibility and ownership. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. The generic MESSy submodel TENDENCY (v1.0) for process-based analyses in Earth system models

    NASA Astrophysics Data System (ADS)

    Eichinger, R.; Jöckel, P.

    2014-07-01

    The tendencies of prognostic variables in Earth system models are usually only accessible, e.g. for output, as a sum over all physical, dynamical and chemical processes at the end of one time integration step. Information about the contribution of individual processes to the total tendency is lost, if no special precautions are implemented. The knowledge on individual contributions, however, can be of importance to track down specific mechanisms in the model system. We present the new MESSy (Modular Earth Submodel System) infrastructure submodel TENDENCY and use it exemplarily within the EMAC (ECHAM/MESSy Atmospheric Chemistry) model to trace process-based tendencies of prognostic variables. The main idea is the outsourcing of the tendency accounting for the state variables from the process operators (submodels) to the TENDENCY submodel itself. In this way, a record of the tendencies of all process-prognostic variable pairs can be stored. The selection of these pairs can be specified by the user, tailor-made for the desired application, in order to minimise memory requirements. Moreover, a standard interface allows the access to the individual process tendencies by other submodels, e.g. for on-line diagnostics or for additional parameterisations, which depend on individual process tendencies. An optional closure test assures the correct treatment of tendency accounting in all submodels and thus serves to reduce the model's susceptibility to errors. TENDENCY is independent of the time integration scheme and therefore the concept is applicable to other model systems as well. Test simulations with TENDENCY show an increase of computing time for the EMAC model (in a setup without atmospheric chemistry) of 1.8 ± 1% due to the additional subroutine calls when using TENDENCY. Exemplary results reveal the dissolving mechanisms of the stratospheric tape recorder signal in height over time.
The separation of the tendency of the specific humidity into the respective processes (large-scale clouds, convective clouds, large-scale advection, vertical diffusion and methane oxidation) shows that the upward propagating water vapour signal dissolves mainly because of the chemical and the advective contribution. The TENDENCY submodel is part of version 2.42 or later of MESSy.
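The bookkeeping idea behind TENDENCY can be sketched in a few lines. This is not MESSy code: each process records its own contribution to a prognostic variable's tendency, and a closure test checks that the recorded contributions sum to the total tendency the time integration actually applied. All process names and values are illustrative.

```python
# Per-process tendency accounting with a closure test (illustrative only).
tendencies = {}  # process name -> tendency of specific humidity (kg/kg/s)

def record(process, value):
    """Accumulate one process's contribution to the total tendency."""
    tendencies[process] = tendencies.get(process, 0.0) + value

# Invented per-process contributions for one time step.
record("large_scale_clouds", -2.0e-9)
record("convective_clouds",  -1.5e-9)
record("advection",           4.0e-9)
record("vertical_diffusion",  0.5e-9)

total_applied = 1.0e-9  # total tendency used by the time integration

def closure_ok(tol=1e-15):
    """True if the recorded contributions sum to the applied total."""
    return abs(sum(tendencies.values()) - total_applied) < tol
```

If a submodel modified the state variable without going through `record`, the closure test would fail, which is exactly the kind of inconsistency the real submodel's optional closure test is designed to catch.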

  14. The generic MESSy submodel TENDENCY (v1.0) for process-based analyses in Earth System Models

    NASA Astrophysics Data System (ADS)

    Eichinger, R.; Jöckel, P.

    2014-04-01

    The tendencies of prognostic variables in Earth System Models are usually only accessible, e.g., for output, as a sum over all physical, dynamical and chemical processes at the end of one time integration step. Information about the contribution of individual processes to the total tendency is lost, if no special precautions are implemented. The knowledge on individual contributions, however, can be of importance to track down specific mechanisms in the model system. We present the new MESSy (Modular Earth Submodel System) infrastructure submodel TENDENCY and use it exemplarily within the EMAC (ECHAM/MESSy Atmospheric Chemistry) model to trace process-based tendencies of prognostic variables. The main idea is the outsourcing of the tendency accounting for the state variables from the process operators (submodels) to the TENDENCY submodel itself. In this way, a record of the tendencies of all process-prognostic variable pairs can be stored. The selection of these pairs can be specified by the user, tailor-made for the desired application, in order to minimise memory requirements. Moreover, a standard interface allows the access to the individual process tendencies by other submodels, e.g., for on-line diagnostics or for additional parameterisations, which depend on individual process tendencies. An optional closure test assures the correct treatment of tendency accounting in all submodels and thus serves to reduce the model's susceptibility to errors. TENDENCY is independent of the time integration scheme and therefore applicable to other model systems as well. Test simulations with TENDENCY show an increase of computing time for the EMAC model (in a setup without atmospheric chemistry) of 1.8 ± 1% due to the additional subroutine calls when using TENDENCY. Exemplary results reveal the dissolving mechanisms of the stratospheric tape recorder signal in height over time.
The separation of the tendency of the specific humidity into the respective processes (large-scale clouds, convective clouds, large-scale advection, vertical diffusion and methane oxidation) shows that the upward propagating water vapour signal dissolves mainly because of the chemical and the advective contribution. The TENDENCY submodel is part of version 2.42 or later of MESSy.

  15. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction

    PubMed Central

    2014-01-01

    Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity research. No easy-to-use, fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses a well-defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) using a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work. PMID:24891820

  16. Limits of acceptable change and natural resources planning: when is LAC useful, when is it not?

    Treesearch

    David N. Cole; Stephen F. McCool

    1997-01-01

    There are ways to improve the LAC process and its implementational procedures. One significant procedural modification is the addition of a new step. This step — which becomes the first step in the process — involves more explicitly defining goals and desired conditions. For other steps in the process, clarifications of concept and terminology are advanced, as are...

  17. Nursing students' understanding and enactment of resilience: a grounded theory study.

    PubMed

    Reyes, Andrew Thomas; Andrusyszyn, Mary-Anne; Iwasiw, Carroll; Forchuk, Cheryl; Babenko-Mould, Yolanda

    2015-11-01

    The aim of this study was to explore nursing students' understanding and enactment of resilience. Stress is considered to be a major factor affecting the health, well-being and academic performance of nursing students. Resilience has been extensively researched as a process that allows individuals to successfully adapt to adversity and develop positive outcomes as a result. However, relatively little is known about the resilience of nursing students. A constructivist, grounded theory qualitative design was used for this study. In-depth individual interviews were conducted with 38 nursing students enrolled in a four-year, integrated baccalaureate nursing degree programme at a university in Ontario, Canada. Face-to-face interviews were conducted from January to April 2012 using a semi-structured interview guide. The basic social process of 'pushing through' emerged as nursing students' understanding and enactment of resilience. Participants employed this process to withstand challenges in their academic lives. This process was comprised of three main phases: 'stepping into', 'staying the course' and 'acknowledging'. 'Pushing through' also included a transient 'disengaging' process where students were temporarily unable to push through their adversities. The process of 'pushing through' was based on a progressive trajectory, which implied that nursing students enacted the process to make progress in their academic lives and to attain goals. Study findings provide important evidence for understanding the phenomenon of resilience as a dynamic, contextual process that can be learnt and developed, rather than a static trait or personality characteristic. © 2015 John Wiley & Sons Ltd.

  18. Fast kinetics of chromatin assembly revealed by single-molecule videomicroscopy and scanning force microscopy

    PubMed Central

    Ladoux, Benoit; Quivy, Jean-Pierre; Doyle, Patrick; Roure, Olivia du; Almouzni, Geneviève; Viovy, Jean-Louis

    2000-01-01

    Fluorescence videomicroscopy and scanning force microscopy were used to follow, in real time, chromatin assembly on individual DNA molecules immersed in cell-free systems competent for physiological chromatin assembly. Within a few seconds, molecules are already compacted into a form exhibiting strong similarities to native chromatin fibers. In these extracts, the compaction rate is more than 100 times faster than expected from standard biochemical assays. Our data provide definite information on the forces involved (a few piconewtons) and on the reaction path. DNA compaction as a function of time revealed unique features of the assembly reaction in these extracts. They imply a sequential process with at least three steps, involving DNA wrapping as the final event. An absolute and quantitative measure of the kinetic parameters of the early steps in chromatin assembly under physiological conditions could thus be obtained. PMID:11114182

  19. Role of step stiffness and kinks in the relaxation of vicinal (001) with zigzag [110] steps

    NASA Astrophysics Data System (ADS)

    Mahjoub, B.; Hamouda, Ajmi BH.; Einstein, TL.

    2017-08-01

    We present a kinetic Monte Carlo study of the relaxation dynamics and steady state configurations of 〈110〉 steps on a vicinal (001) simple cubic surface. This system is interesting because 〈110〉 (fully kinked) steps have different elementary excitation energetics and favor step diffusion more than 〈100〉 (nominally straight) steps. In this study we show how this leads to different relaxation dynamics as well as to different steady state configurations, including that 2-bond breaking processes are rate determining for 〈110〉 steps in contrast to 3-bond breaking processes for 〈100〉 steps found in previous work [Surface Sci. 602, 3569 (2008)]. The analysis of the terrace-width distribution (TWD) shows a significant role of kink-generation-annihilation processes during the relaxation of steps: the kinetics of relaxation toward the steady state are much faster in the case of 〈110〉 zigzag steps, with a higher standard deviation of the TWD, in agreement with a decrease of step stiffness due to orientation. We conclude that smaller step stiffness leads inexorably to faster step dynamics towards the steady state. The step-edge anisotropy slows the relaxation of steps and increases the strength of step-step effective interactions.
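The rate-determining distinction above can be illustrated with a toy Gillespie-style kinetic Monte Carlo event selection: if detachment rates scale as exp(-nE/kT) with n broken bonds, 2-bond events dominate 3-bond events by a factor of exp(E/kT). The bond energy and event catalogue are invented; this is not the paper's simulation.

```python
import math
import random

random.seed(1)

E_OVER_KT = 3.0  # assumed bond energy in units of kT
rates = {
    "2-bond": math.exp(-2 * E_OVER_KT),  # e.g. detachment at a zigzag step
    "3-bond": math.exp(-3 * E_OVER_KT),  # e.g. detachment at a straight step
}

def pick_event():
    """Select one event with probability proportional to its rate."""
    total = sum(rates.values())
    r = random.random() * total
    for name, k in rates.items():
        r -= k
        if r <= 0:
            return name
    return name  # numerical fallback

counts = {"2-bond": 0, "3-bond": 0}
for _ in range(10000):
    counts[pick_event()] += 1
```

With E = 3 kT, the 2-bond channel is sampled roughly exp(3) ≈ 20 times more often, which is why the lower-barrier channel sets the relaxation kinetics.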

  20. A pilot randomized controlled trial evaluating motivationally matched pedometer feedback to increase physical activity behavior in older adults.

    PubMed

    Strath, Scott J; Swartz, Ann M; Parker, Sarah J; Miller, Nora E; Grimm, Elizabeth K; Cashin, Susan E

    2011-09-01

    Increasing physical activity (PA) levels in older adults represents an important public health challenge. The purpose of this study was to evaluate the feasibility of combining individualized motivational messaging with pedometer walking step targets to increase PA in previously inactive and insufficiently active older adults. In this 12-week intervention study older adults were randomized to 1 of 4 study arms: Group 1--control; Group 2--pedometer 10,000 step goal; Group 3--pedometer step goal plus individualized motivational feedback; or Group 4--everything in Group 3 augmented with biweekly telephone feedback. In total, 81 participants were randomized into the study; 61 completed it, with an average age of 63.8 ± 6.0 years. Group 1 did not differ in accumulated steps/day from Group 2 following the 12-week intervention. Participants in Groups 3 and 4 took on average 2159 (P < .001) and 2488 (P < .001) more steps/day, respectively, than those in Group 1 after the 12-week intervention. In this 12-week pilot randomized controlled trial, a pedometer feedback intervention partnered with individually matched motivational messaging was an effective strategy to significantly increase PA behavior in previously inactive and insufficiently active older adults.

  1. Effects of industrial cashew nut processing on anacardic acid content and allergen recognition by IgE.

    PubMed

    Mattison, Christopher P; Malveira Cavalcante, Jéfferson; Izabel Gallão, Maria; Sousa de Brito, Edy

    2018-02-01

    Cashew nuts are important both nutritionally and industrially, but can also cause food allergies in some individuals. The present study aimed to assess the effect(s) of industrial processing on anacardic acids and allergens present in cashew nuts. Sample analyses were performed using liquid chromatography coupled with mass spectrometry, SDS-PAGE and immunoassay. The anacardic acid concentration ranged from 6.2 to 82.6 mg/g during processing, and this variation was attributed to cashew nut shell liquid incorporation during storage and humidification. Dehydrated and selected samples did not significantly differ in anacardic acid content, having values similar to the raw sample. SDS-PAGE and immunoassay analysis with rabbit polyclonal sera and human IgE indicated only minor differences in protein solubility and antibody binding following processing steps. The findings indicate that appreciable amounts of anacardic acid remain in processed nuts, and that changes to cashew allergens during industrial processing may only mildly affect antibody recognition. Published by Elsevier Ltd.

  2. The contribution of temporary storage and executive processes to category learning.

    PubMed

    Wang, Tengfei; Ren, Xuezhu; Schweizer, Karl

    2015-09-01

    Three distinctly different working memory processes, temporary storage, mental shifting and inhibition, were proposed to account for individual differences in category learning. A sample of 213 participants completed a classic category learning task and two working memory tasks that were experimentally manipulated for tapping specific working memory processes. Fixed-links models were used to decompose data of the category learning task into two independent components representing basic performance and improvement in performance in category learning. Processes of working memory were also represented by fixed-links models. Next, the three working memory processes were linked to components of category learning. Results from modeling analyses indicated that temporary storage had a significant effect on basic performance and shifting had a moderate effect on improvement in performance. In contrast, inhibition showed no effect on any component of the category learning task. These results suggest that temporary storage and the shifting process play different roles in the course of acquiring new categories. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Experimental analysis for fabrication of high-aspect-ratio piezoelectric ceramic structure by micro-powder injection molding process

    NASA Astrophysics Data System (ADS)

    Han, Jun Sae; Gal, Chang Woo; Park, Jae Man; Kim, Jong Hyun; Park, Seong Jin

    2018-04-01

    Aspect ratio effects in the micro-powder injection molding process were experimentally analyzed for the fabrication of high-aspect-ratio piezoelectric ceramic structures. The mechanisms of critical defects were studied for the individual manufacturing steps. In the molding process, incomplete filling determines the critical aspect ratio of a micro pattern; this phenomenon was analyzed as a function of mold temperature for different pattern sizes and aspect ratios. In the demolding and drying processes, the capillary behavior of the sacrificial polymeric mold insert determines the critical aspect ratio of a micro pattern; slumping behavior was analyzed with respect to pattern dimensions. With the current system, micro PZT features are stable at aspect ratios below 5. Under optimized processing conditions, 20 μm and 40 μm ceramic rod arrays with an aspect ratio of 5 were successfully fabricated by the developed process. Further modifications needed to fabricate smaller and taller features are also specifically addressed.

  4. Function in the Human Connectome: Task-fMRI and Individual Differences in Behavior

    PubMed Central

    Barch, Deanna M.; Burgess, Gregory C.; Harms, Michael P.; Petersen, Steven E.; Schlaggar, Bradley L.; Corbetta, Maurizio; Glasser, Matthew F.; Curtiss, Sandra; Dixit, Sachin; Feldt, Cindy; Nolan, Dan; Bryant, Edward; Hartley, Tucker; Footer, Owen; Bjork, James M.; Poldrack, Russ; Smith, Steve; Johansen-Berg, Heidi; Snyder, Abraham Z.; Van Essen, David C.

    2014-01-01

    The primary goal of the Human Connectome Project (HCP) is to delineate the typical patterns of structural and functional connectivity in the healthy adult human brain. However, we know that there are important individual differences in such patterns of connectivity, with evidence that this variability is associated with alterations in important cognitive and behavioral variables that affect real world function. The HCP data will be a critical stepping-off point for future studies that will examine how variation in human structural and functional connectivity plays a role in adult and pediatric neurological and psychiatric disorders that account for a huge amount of public health resources. Thus, the HCP is collecting behavioral measures of a range of motor, sensory, cognitive and emotional processes that will delineate a core set of functions relevant to understanding the relationship between brain connectivity and human behavior. In addition, the HCP is using task-fMRI (tfMRI) to help delineate the relationships between individual differences in the neurobiological substrates of mental processing and both functional and structural connectivity, as well as to help characterize and validate the connectivity analyses to be conducted on the structural and functional connectivity data. This paper describes the logic and rationale behind the development of the behavioral, individual difference, and tfMRI batteries and provides preliminary data on the patterns of activation associated with each of the fMRI tasks, at both a group and individual level. PMID:23684877

  5. Generation of cell type-specific monoclonal antibodies for the planarian and optimization of sample processing for immunolabeling.

    PubMed

    Forsthoefel, David J; Waters, Forrest A; Newmark, Phillip A

    2014-12-21

    Efforts to elucidate the cellular and molecular mechanisms of regeneration have required the application of methods to detect specific cell types and tissues in a growing cohort of experimental animal models. For example, in the planarian Schmidtea mediterranea, substantial improvements to nucleic acid hybridization and electron microscopy protocols have facilitated the visualization of regenerative events at the cellular level. By contrast, immunological resources have been slower to emerge. Specifically, the repertoire of antibodies recognizing planarian antigens remains limited, and a more systematic approach is needed to evaluate the effects of processing steps required during sample preparation for immunolabeling. To address these issues and to facilitate studies of planarian digestive system regeneration, we conducted a monoclonal antibody (mAb) screen using phagocytic intestinal cells purified from the digestive tracts of living planarians as immunogens. This approach yielded ten antibodies that recognized intestinal epitopes, as well as markers for the central nervous system, musculature, secretory cells, and epidermis. In order to improve signal intensity and reduce non-specific background for a subset of mAbs, we evaluated the effects of fixation and other steps during sample processing. We found that fixative choice, treatments to remove mucus and bleach pigment, as well as methods for tissue permeabilization and antigen retrieval profoundly influenced labeling by individual antibodies. These experiments led to the development of a step-by-step workflow for determining optimal specimen preparation for labeling whole planarians as well as unbleached histological sections. We generated a collection of monoclonal antibodies recognizing the planarian intestine and other tissues; these antibodies will facilitate studies of planarian tissue morphogenesis. 
We also developed a protocol for optimizing specimen processing that will accelerate future efforts to generate planarian-specific antibodies, and to extend functional genetic studies of regeneration to post-transcriptional aspects of gene expression, such as protein localization or modification. Our efforts demonstrate the importance of systematically testing multiple approaches to species-specific idiosyncrasies, such as mucus removal and pigment bleaching, and may serve as a template for the development of immunological resources in other emerging model organisms.

  6. First Steps in Using Multi-Voxel Pattern Analysis to Disentangle Neural Processes Underlying Generalization of Spider Fear

    PubMed Central

    Visser, Renée M.; Haver, Pia; Zwitser, Robert J.; Scholte, H. Steven; Kindt, Merel

    2016-01-01

    A core symptom of anxiety disorders is the tendency to interpret ambiguous information as threatening. Using electroencephalography and blood oxygenation level dependent magnetic resonance imaging (BOLD-MRI), several studies have begun to elucidate brain processes involved in fear-related perceptual biases, but thus far mainly found evidence for general hypervigilance in high fearful individuals. Recently, multi-voxel pattern analysis (MVPA) has become popular for decoding cognitive states from distributed patterns of neural activation. Here, we used this technique to assess whether biased fear generalization, characteristic of clinical fear, is already present during the initial perception and categorization of a stimulus, or emerges during the subsequent interpretation of a stimulus. Individuals with low spider fear (n = 20) and high spider fear (n = 18) underwent functional MRI scanning while viewing series of schematic flowers morphing to spiders. In line with previous studies, individuals with high fear of spiders were behaviorally more likely to classify ambiguous morphs as spiders than individuals with low fear of spiders. Univariate analyses of BOLD-MRI data revealed stronger activation toward spider pictures in high fearful individuals compared to low fearful individuals in numerous areas. Yet, neither average activation, nor support vector machine classification (i.e., a form of MVPA) matched the behavioral results – i.e., a biased response toward ambiguous stimuli – in any of the regions of interest. This may point to limitations of the current design, and to challenges associated with classifying emotional and neutral stimuli in groups that differ in their judgment of emotionality. Improvements for future research are suggested. PMID:27303278
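    The decoding idea behind MVPA — classifying a cognitive state from a distributed pattern of voxel activations — can be illustrated with a minimal sketch. The study used support vector machine classification; this toy substitutes a nearest-centroid classifier, and the 3-voxel patterns and class labels are invented for illustration only:

    ```python
    import math

    def train_centroids(patterns, labels):
        """Average the voxel patterns belonging to each stimulus class."""
        sums, counts = {}, {}
        for p, y in zip(patterns, labels):
            acc = sums.setdefault(y, [0.0] * len(p))
            for i, v in enumerate(p):
                acc[i] += v
            counts[y] = counts.get(y, 0) + 1
        return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

    def classify(pattern, centroids):
        """Assign a new pattern to the class with the nearest centroid."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(centroids, key=lambda y: dist(pattern, centroids[y]))

    # Toy 3-voxel patterns: "flower" responses load on voxel 0, "spider" on voxel 2.
    train_x = [[1.0, 0.1, 0.0], [0.9, 0.0, 0.1], [0.0, 0.1, 1.0], [0.1, 0.0, 0.9]]
    train_y = ["flower", "flower", "spider", "spider"]
    cents = train_centroids(train_x, train_y)
    print(classify([0.8, 0.05, 0.1], cents))  # an ambiguous, flower-leaning morph
    ```

    A real MVPA analysis would train and test on independent scanner runs and cross-validate; this sketch only shows the pattern-to-label mapping at the core of the method.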

  7. Ten steps to successful software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, R. K.

    2003-01-01

This paper identifies ten steps for managing change that address organizational and cultural issues. Four of these steps are critical; if they are not performed, failure is almost guaranteed. This ten-step program emphasizes the alignment of business goals, change process goals, and the work performed by the employees of an organization.

  8. Validity of the SenseWear armband step count measure during controlled and free-living conditions.

    PubMed

    Lee, Joey Allen; Laurson, Kelly Rian

    2015-06-01

Advances in technology continue to provide numerous options for physical activity assessment. These advances necessitate evaluation of the validity of newly developed activity monitors being used in clinical and research settings. The purpose of this study was to validate the SenseWear Pro3 Armband (SWA) step counts during treadmill walking and free-living conditions. Study 1 observed 39 individuals (17 males, 22 females) wearing an SWA and a Yamax Digiwalker SW-701 pedometer (DIGI) during treadmill walking, utilizing manually counted steps as the criterion. Study 2 compared free-living step count data from 35 participants (17 males, 18 females) wearing the SWA and DIGI (comparison) for 3 consecutive days. During Study 1, the SWA underestimated steps by 16.0%, 10.7%, 5.6%, 6.1%, and 6.5% at speeds of 54 m/min, 67 m/min, 80 m/min, 94 m/min, and 107 m/min, respectively, compared to manually counted steps. During Study 2, the intraclass correlation (ICC) coefficient of mean steps/d between the SWA and DIGI was strong (r = 0.98, p < 0.001). Unlike Study 1, the SWA overestimated step counts during the 3-day wear period by an average of 1028 steps/d (or +11.3%) compared to the DIGI. When analyzed individually, the SWA consistently overestimated step counts for each day (p < 0.05). The SWA underestimates steps during treadmill walking and appears to overestimate steps during free-living conditions compared to the DIGI pedometer. Caution is warranted when using the SWA to count steps. Modifications are needed to enhance step counting accuracy.
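    The accuracy comparisons above reduce to a signed percent error of a device count against a criterion count. A minimal sketch; the treadmill pair (840 vs. 1000) is invented to match the reported -16.0% figure, and the free-living pair reconstructs the reported +1028 steps/d (+11.3%) difference, implying a DIGI baseline of roughly 9100 steps/d:

    ```python
    def percent_error(device_steps, criterion_steps):
        """Signed percent error of a device count relative to a criterion count."""
        return 100.0 * (device_steps - criterion_steps) / criterion_steps

    # Treadmill (criterion = manually counted steps): underestimation
    print(percent_error(840, 1000))
    # Free-living (comparison device = DIGI): overestimation
    print(round(percent_error(10128, 9100), 1))
    ```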

  9. Bistatic SAR: Signal Processing and Image Formation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahl, Daniel E.; Yocky, David A.

This report describes the significant processing steps used to take the raw digitized signals recorded by the bistatic synthetic aperture radar (SAR) hardware built for the NCNS Bistatic SAR project to a final bistatic SAR image. In general, the process steps herein are applicable to bistatic SAR signals that include the direct-path signal and the reflected signal. The steps include preprocessing, data extraction to form a phase history, and, finally, image formation. Various plots and values are shown at most steps to illustrate the processing for a bistatic COSMO-SkyMed collection gathered on June 10, 2013, on Kirtland Air Force Base, New Mexico.
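    The direct-path/reflected-signal relationship at the heart of bistatic processing can be sketched with a toy delay estimator. This is not the report's pipeline, only a minimal cross-correlation illustration: the 5-chip code and 3-sample delay are invented, and real SAR processing operates on complex-valued phase histories:

    ```python
    def cross_correlate(received, reference):
        """Slide the reference over the received samples and sum products;
        the peak lag estimates the reflected path's delay relative to the
        direct-path signal."""
        n, m = len(received), len(reference)
        return [sum(received[k + i] * reference[i] for i in range(m))
                for k in range(n - m + 1)]

    reference = [1, -1, 1, 1, -1]          # toy direct-path code
    echo = [0] * 3 + reference + [0] * 2   # reflected copy, delayed by 3 samples
    corr = cross_correlate(echo, reference)
    print(corr.index(max(corr)))           # estimated delay in samples
    ```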

  10. High-fidelity target sequencing of individual molecules identified using barcode sequences: de novo detection and absolute quantitation of mutations in plasma cell-free DNA from cancer patients.

    PubMed

    Kukita, Yoji; Matoba, Ryo; Uchida, Junji; Hamakawa, Takuya; Doki, Yuichiro; Imamura, Fumio; Kato, Kikuya

    2015-08-01

Circulating tumour DNA (ctDNA) is an emerging field of cancer research. However, current ctDNA analysis is usually restricted to one or a few mutation sites due to technical limitations. In the case of massively parallel DNA sequencers, the number of false positives caused by a high read error rate is a major problem. In addition, the final sequence reads do not represent the original DNA population due to the global amplification step during the template preparation. We established a high-fidelity target sequencing system of individual molecules identified in plasma cell-free DNA using barcode sequences; this system consists of the following two steps. (i) A novel target sequencing method that adds barcode sequences by adaptor ligation. This method uses linear amplification to eliminate the errors introduced during the early cycles of polymerase chain reaction. (ii) The monitoring and removal of erroneous barcode tags. This process involves the identification of the individual molecules that have been sequenced, allowing absolute quantitation of the number of mutations. Using plasma cell-free DNA from patients with gastric or lung cancer, we demonstrated that the system achieved near complete elimination of false positives and enabled de novo detection and absolute quantitation of mutations in plasma cell-free DNA. © The Author 2015. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
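    The barcode-tagging logic can be sketched as a consensus-calling step: reads sharing a barcode derive from one original molecule, so sequencer errors (present in only some reads of a family) are voted out, while true mutations (present in all of them) survive. A minimal illustration, not the authors' pipeline; the sequences, barcodes, and family-size threshold are invented:

    ```python
    from collections import defaultdict, Counter

    def consensus_by_barcode(reads, min_family=3):
        """Group reads by molecular barcode and call a per-position consensus.
        Families smaller than min_family are considered unreliable and dropped."""
        families = defaultdict(list)
        for barcode, seq in reads:
            families[barcode].append(seq)
        consensus = {}
        for barcode, seqs in families.items():
            if len(seqs) < min_family:
                continue
            consensus[barcode] = "".join(
                Counter(col).most_common(1)[0][0] for col in zip(*seqs))
        return consensus

    reads = [("AAGT", "ACGTT"), ("AAGT", "ACGTT"), ("AAGT", "ACGAT"),  # one read error
             ("CCTA", "ACCTT"), ("CCTA", "ACCTT"), ("CCTA", "ACCTT")]  # true variant
    print(sorted(consensus_by_barcode(reads).items()))
    ```

    Because each surviving consensus sequence corresponds to one original molecule, counting consensus sequences that carry a mutation gives an absolute (per-molecule) quantitation rather than a read-level frequency.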

  11. Development of a new multidimensional individual and interpersonal resilience measure for older adults.

    PubMed

    Martin, A'verria Sirkin; Distelberg, Brian; Palmer, Barton W; Jeste, Dilip V

    2015-01-01

    Develop an empirically grounded measure that can be used to assess family and individual resilience in a population of older adults (aged 50-99). Cross-sectional, self-report data from 1006 older adults were analyzed in two steps. The total sample was split into two subsamples and the first step identified the underlying latent structure through principal component exploratory factor analysis (EFA). The second step utilized the second half of the sample to validate the derived latent structure through confirmatory factor analysis (CFA). EFA produced an eight-factor structure that appeared clinically relevant for measuring the multidimensional nature of resilience. Factors included self-efficacy, access to social support network, optimism, perceived economic and social resources, spirituality and religiosity, relational accord, emotional expression and communication, and emotional regulation. CFA confirmed the eight-factor structure previously achieved with covariance between each of the factors. Based on these analyses we developed the multidimensional individual and interpersonal resilience measure, a broad assessment of resilience for older adults. This study highlights the multidimensional nature of resilience and introduces an individual and interpersonal resilience measure developed for older adults which is grounded in the individual and family resilience literature.

  12. Development of a New Multidimensional Individual and Interpersonal Resilience Measure for Older Adults

    PubMed Central

    Martin, A’verria Sirkin; Distelberg, Brian; Palmer, Barton W.; Jeste, Dilip V.

    2015-01-01

    Objectives Develop an empirically grounded measure that can be used to assess family and individual resilience in a population of older adults (aged 50-99). Methods Cross-sectional, self-report data from 1,006 older adults were analyzed in two steps. The total sample was split into two sub-samples and the first step identified the underlying latent structure through principal component Exploratory Factor Analysis (EFA). The second step utilized the second half of the sample to validate the derived latent structure through Confirmatory Factor Analysis (CFA). Results EFA produced an eight-factor structure that appeared clinically relevant for measuring the multidimensional nature of resilience. Factors included self-efficacy, access to social support network, optimism, perceived economic and social resources, spirituality and religiosity, relational accord, emotional expression and communication, and emotional regulation. CFA confirmed the eight-factor structure previously achieved with covariance between each of the factors. Based on these analyses we developed the Multidimensional Individual and Interpersonal Resilience Measure (MIIRM), a broad assessment of resilience for older adults. Conclusion This study highlights the multidimensional nature of resilience and introduces an individual and interpersonal resilience measure developed for older adults which is grounded in the individual and family resilience literature. PMID:24787701

  13. Conquering technophobia: preparing faculty for today.

    PubMed

    Richard, P L

    1997-01-01

The constantly changing world of technology creates excitement and an obligation for faculty of schools of nursing to address computer literacy in the curricula at all levels. The initial step in the process of meeting these goals was to assist the faculty in becoming computer literate so that they could foster and encourage the same in the students. The implementation of The Cure for Technophobia included basic and advanced computer skills designed to assist the faculty in becoming comfortable and competent computer users. The applications addressed included: introduction to Windows, electronic mail, word processing, presentation and database applications, library on-line searches of literature databases, introduction to internet browsers and a computerized testing program. Efforts were made to overcome barriers to computer literacy and promote the learning process. Familiar, competent, computer literate individuals were used to conduct the classes to accomplish this goal.

  14. Challenging evidence-based decision-making: a hypothetical case study about return to work.

    PubMed

    Aas, Randi W; Alexanderson, Kristina

    2012-03-01

A hypothetical case study about return to work was used to explore the process of translating research into practice. The method involved constructing a case study derived from the characteristics of a typical, sick-listed employee with non-specific low back pain in Norway. Next, the five-step evidence-based process, including the Patient, Intervention, Co-Interventions and Outcome framework (PICO), was applied to the case study. An inductive analysis produced 10 technical and more fundamental challenges to incorporating research into intervention decisions for an individual with comorbidity. A more dynamic, interactive approach to the evidence-based practice process is proposed. It is recommended that this approach, together with the 10 challenges, be validated with real-life cases, as the hypothetical case study may not be replicable. Copyright © 2011 John Wiley & Sons, Ltd.

  15. Environmental Adaptations, Ecological Filtering, and Dispersal Central to Insect Invasions.

    PubMed

    Renault, David; Laparie, Mathieu; McCauley, Shannon J; Bonte, Dries

    2018-01-07

    Insect invasions, the establishment and spread of nonnative insects in new regions, can have extensive economic and environmental consequences. Increased global connectivity accelerates rates of introductions, while climate change may decrease the barriers to invader species' spread. We follow an individual-level insect- and arachnid-centered perspective to assess how the process of invasion is influenced by phenotypic heterogeneity associated with dispersal and stress resistance, and their coupling, across the multiple steps of the invasion process. We also provide an overview and synthesis on the importance of environmental filters during the entire invasion process for the facilitation or inhibition of invasive insect population spread. Finally, we highlight important research gaps and the relevance and applicability of ongoing natural range expansions in the context of climate change to gain essential mechanistic insights into insect invasions.

  16. Additive Manufacturing Consolidation of Low-Cost Water Atomized Steel Powder Using Micro-Induction Sintering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, William G.; Rios, Orlando; U

ORNL worked with Grid Logic Inc to demonstrate micro induction sintering (MIS) and binder decomposition of steel powders. It was shown that MIS effectively emits spatially confined electromagnetic energy that is directly coupled to metallic powders, resulting in resistive heating of individual particles. The non-uniformity of particle morphology and distribution of the water-atomized steel powders resulted in inefficient transfer of energy. It was shown that adhering the particles together using polymer binders resulted in more efficient coupling. Using the MIS process, debinding and sintering could be done in a single step. When combined with another system, such as binder-jet printing, this could reduce the amount of required post-processing. An invention disclosure was filed on hybrid systems that use MIS to reduce the amount of required post-processing.

  17. Effective virus inactivation and removal by steps of Biotest Pharmaceuticals IGIV production process

    PubMed Central

    Dichtelmüller, Herbert O.; Flechsig, Eckhard; Sananes, Frank; Kretschmar, Michael; Dougherty, Christopher J.

    2012-01-01

    The virus validation of three steps of Biotest Pharmaceuticals IGIV production process is described here. The steps validated are precipitation and removal of fraction III of the cold ethanol fractionation process, solvent/detergent treatment and 35 nm virus filtration. Virus validation was performed considering combined worst case conditions. By these validated steps sufficient virus inactivation/removal is achieved, resulting in a virus safe product. PMID:24371563

  18. Stepping reaction time and gait adaptability are significantly impaired in people with Parkinson's disease: Implications for fall risk.

    PubMed

    Caetano, Maria Joana D; Lord, Stephen R; Allen, Natalie E; Brodie, Matthew A; Song, Jooeun; Paul, Serene S; Canning, Colleen G; Menant, Jasmine C

    2018-02-01

    Decline in the ability to take effective steps and to adapt gait, particularly under challenging conditions, may be important reasons why people with Parkinson's disease (PD) have an increased risk of falling. This study aimed to determine the extent of stepping and gait adaptability impairments in PD individuals as well as their associations with PD symptoms, cognitive function and previous falls. Thirty-three older people with PD and 33 controls were assessed in choice stepping reaction time, Stroop stepping and gait adaptability tests; measurements identified as fall risk factors in older adults. People with PD had similar mean choice stepping reaction times to healthy controls, but had significantly greater intra-individual variability. In the Stroop stepping test, the PD participants were more likely to make an error (48 vs 18%), took 715 ms longer to react (2312 vs 1517 ms) and had significantly greater response variability (536 vs 329 ms) than the healthy controls. People with PD also had more difficulties adapting their gait in response to targets (poorer stepping accuracy) and obstacles (increased number of steps) appearing at short notice on a walkway. Within the PD group, higher disease severity, reduced cognition and previous falls were associated with poorer stepping and gait adaptability performances. People with PD have reduced ability to adapt gait to unexpected targets and obstacles and exhibit poorer stepping responses, particularly in a test condition involving conflict resolution. Such impaired stepping responses in Parkinson's disease are associated with disease severity, cognitive impairment and falls. Copyright © 2017 Elsevier Ltd. All rights reserved.
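    The intra-individual variability measure reported above is simply the spread of a single participant's reaction times across trials. A minimal sketch with invented trial series; the PD series is constructed to average 2312 ms, the group mean reported in the abstract, but the individual values and spreads are illustrative only:

    ```python
    import statistics

    def rt_summary(reaction_times_ms):
        """Mean and intra-individual variability (population SD) of a
        participant's stepping reaction times across trials."""
        return statistics.mean(reaction_times_ms), statistics.pstdev(reaction_times_ms)

    control = [1500, 1520, 1540, 1510, 1530]   # fast and consistent
    pd_pt = [2200, 2800, 1900, 2500, 2160]     # slower and far more variable
    print(rt_summary(control))
    print(rt_summary(pd_pt))
    ```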

  19. Securing your financial future.

    PubMed

    Kachalia, Parag R

    2009-04-01

Securing one's financial future requires dedication and planning. A clear plan must be implemented and continually re-examined to ensure an individual remains on track to achieve this security. True success of the plan will be dependent upon taking the appropriate steps to protect one's assets against unfortunate events along with building assets with a clear end goal in mind. This article will cover the fundamental steps an individual can take to secure their financial future.

  20. ICME — A Mere Coupling of Models or a Discipline of Its Own?

    NASA Astrophysics Data System (ADS)

    Bambach, Markus; Schmitz, Georg J.; Prahl, Ulrich

    Technically, ICME — Integrated computational materials engineering — is an approach for solving advanced engineering problems related to the design of new materials and processes by combining individual materials and process models. The combination of models by now is mainly achieved by manual transformation of the output of a simulation to form the input to a subsequent one. This subsequent simulation is either performed at a different length scale or constitutes a subsequent step along the process chain. Is ICME thus just a synonym for the coupling of simulations? In fact, most ICME publications up to now are examples of the joint application of selected models and software codes to a specific problem. However, from a systems point of view, the coupling of individual models and/or software codes across length scales and along material processing chains leads to highly complex meta-models. Their viability has to be ensured by joint efforts from science, industry, software developers and independent organizations. This paper identifies some developments that seem necessary to make future ICME simulations viable, sustainable and broadly accessible and accepted. The main conclusion is that ICME is more than a multi-disciplinary subject but a discipline of its own, for which a generic structural framework has to be elaborated and established.
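    The model coupling the paper discusses — one simulation's output manually becoming the next simulation's input, across length scales or along a process chain — can be sketched as a two-stage pipeline. Both models here are invented toys (a made-up composition-to-grain-size map and a Hall-Petch-style strength estimate with arbitrary coefficients), not real ICME codes:

    ```python
    def microscale_model(composition_c):
        """Toy microscale step: map a composition parameter to a
        microstructure descriptor (grain size). Purely illustrative."""
        return {"grain_size_um": 50.0 / (1.0 + 10.0 * composition_c)}

    def macroscale_model(micro):
        """Toy macroscale step: Hall-Petch-style strength estimate from
        grain size, with arbitrary coefficients."""
        return 100.0 + 500.0 / micro["grain_size_um"] ** 0.5

    # ICME-style chaining: the microscale output forms the macroscale input.
    strength = macroscale_model(microscale_model(0.4))
    print(round(strength, 1))
    ```

    The systems-level point of the paper is that once many such hand-offs are chained, the meta-model's viability (consistent units, schemas, and interfaces between codes) becomes a discipline-level concern rather than a per-project scripting exercise.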

  1. Deliberate Practice as a Theoretical Framework for Interprofessional Experiential Education.

    PubMed

    Wang, Joyce M; Zorek, Joseph A

    2016-01-01

    The theory of deliberate practice has been applied to many skill-based performance activities. The primary aim of this project was to integrate synergistic principles from deliberate practice and consensus-derived competencies for interprofessional education into a framework upon which educational models to advance interprofessional experiential education (IEE) might be built. CINAHL, ERIC, and MEDLINE databases were searched using the keywords "deliberate practice" and "interprofessional education," both individually and in combination. Relevant articles were selected from the catalog based on support for the premise of the project. Defining characteristics of deliberate practice were distilled with particular emphasis on their application to the Interprofessional Education Collaborative's (IPEC) core competencies. Recommendations for IEE development were identified through the synthesis of deliberate practice principles and IPEC competencies. There is a high degree of synergy between deliberate practice principles and IPEC competencies. Our synthesis of the literature yielded a cyclical four-step process to advance IEE: (1) implement an IEE plan guided by the student's strengths/weaknesses and in consideration of the collaborative practice skills they wish to develop, (2) engage in IPE experiences that will challenge targeted skills according to the IEE plan, (3) embed frequent opportunities for student reflection and preceptor/team feedback within IEE plan, and (4) revise the IEE plan and the IPE experience based on insights gained during step 3. The cyclical four-step process synthesized through this literature review may be used to guide the development of new IEE models. 
The purposeful development of IEE models grounded in a theory that has already been operationalized in other skill-based performance areas is an important step to address expanding accreditation standards throughout the health professions mandating interprofessional education for pre-licensure health professional students.

  2. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by implementing interrupted uniaxial tensile testing experiments. Evolution of the mechanical properties as well as the microstructural features, such as twins and textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor and led to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of austenite. For the SIM, an Olson-Cohen-type equation was identified to analytically predict its formation during the multi-step forming process. The generated SIM was textured and weakened with increasing deformation.
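    The Olson-Cohen model mentioned above expresses the strain-induced martensite volume fraction as f = 1 - exp(-beta * (1 - exp(-alpha * strain))^n). A minimal sketch; the parameter values below are generic illustrations, not values fitted to SS 321 in this study:

    ```python
    import math

    def olson_cohen(strain, alpha=6.0, beta=4.0, n=4.5):
        """Olson-Cohen strain-induced martensite fraction.
        alpha: shear-band formation rate; beta: probability that a shear-band
        intersection nucleates martensite; n: fixed exponent.
        Parameter values are illustrative only."""
        shear_band_fraction = 1.0 - math.exp(-alpha * strain)
        return 1.0 - math.exp(-beta * shear_band_fraction ** n)

    for eps in (0.05, 0.15, 0.30):
        print(f"strain {eps:.2f}: martensite fraction {olson_cohen(eps):.3f}")
    ```

    The sigmoidal shape — little transformation at small strains, then rapid growth — is the characteristic the equation is fitted to when predicting SIM formation across forming steps.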

  3. From single steps to mass migration: the problem of scale in the movement ecology of the Serengeti wildebeest.

    PubMed

    Torney, Colin J; Hopcraft, J Grant C; Morrison, Thomas A; Couzin, Iain D; Levin, Simon A

    2018-05-19

A central question in ecology is how to link processes that occur over different scales. The daily interactions of individual organisms ultimately determine community dynamics, population fluctuations and the functioning of entire ecosystems. Observations of these multiscale ecological processes are constrained by various technological, biological or logistical issues, and there are often vast discrepancies between the scale at which observation is possible and the scale of the question of interest. Animal movement is characterized by processes that act over multiple spatial and temporal scales. Second-by-second decisions accumulate to produce annual movement patterns. Individuals influence, and are influenced by, collective movement decisions, which then govern the spatial distribution of populations and the connectivity of meta-populations. While the field of movement ecology is experiencing unprecedented growth in the availability of movement data, there remain challenges in integrating observations with questions of ecological interest. In this article, we present the major challenges of addressing these issues within the context of the Serengeti wildebeest migration, a keystone ecological phenomenon that crosses multiple scales of space, time and biological complexity. This article is part of the theme issue 'Collective movement ecology'. © 2018 The Author(s).

  4. How to implement information technology in the operating room and the intensive care unit.

    PubMed

    Meyfroidt, Geert

    2009-03-01

The number of operating rooms and intensive care units looking for a data management system to perform their increasingly complex tasks is rising. Although at this time only a minority is computerized, within the next few years many centres will start implementing information technology. The transition towards a computerized system is a major undertaking that will have a significant impact on workflow. This chapter reviews the present literature. Published papers on this subject are predominantly single- or multi-centre implementation reports. The general principles that should guide such a process are described. For healthcare institutions or individual practitioners that plan to undertake this venture, the implementation process is described in a practical, nine-step overview.

  5. The impact of experimental measurement errors on long-term viscoelastic predictions. [of structural materials

    NASA Technical Reports Server (NTRS)

    Tuttle, M. E.; Brinson, H. F.

    1986-01-01

The impact of errors in measured viscoelastic parameters on subsequent long-term viscoelastic predictions is numerically evaluated using the Schapery nonlinear viscoelastic model. Of the seven Schapery parameters, the results indicated that long-term predictions were most sensitive to errors in the power law parameter n. Although errors in the other parameters were significant as well, errors in n dominated all other factors at long times. The process of selecting an appropriate short-term test cycle so as to ensure an accurate long-term prediction was considered, and a short-term test cycle was selected using material properties typical for T300/5208 graphite-epoxy at 149 C. The process of selection is described, and its individual steps are itemized.
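    The dominance of the power-law parameter n at long times can be illustrated with a toy power-law creep compliance, D(t) = D0 + D1 * t^n: a small error in n is an error in an exponent, so its effect compounds as t grows. The parameter values below are invented, not the T300/5208 properties used in the paper:

    ```python
    def creep_compliance(t, D0=1.0, D1=0.1, n=0.19):
        """Power-law creep compliance D(t) = D0 + D1 * t**n (arbitrary units)."""
        return D0 + D1 * t ** n

    def prediction_error(t, n_true=0.19, n_err=0.02):
        """Percent error in the predicted compliance at time t caused by a
        small error in the fitted exponent n."""
        true = creep_compliance(t, n=n_true)
        pred = creep_compliance(t, n=n_true + n_err)
        return 100.0 * (pred - true) / true

    for t in (1e2, 1e4, 1e6):   # a short-term fit extrapolated to long times
        print(f"t = {t:.0e}: error {prediction_error(t):+.1f}%")
    ```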

  6. Physical activity measured using global positioning system tracking in non-small cell lung cancer: an observational study.

    PubMed

    Granger, Catherine L; Denehy, Linda; McDonald, Christine F; Irving, Louis; Clark, Ross A

    2014-11-01

Increasingly physical activity (PA) is being recognized as an important outcome in non-small cell lung cancer (NSCLC). We investigated PA using novel global positioning system (GPS) tracking in individuals with NSCLC and in a group of similar-aged healthy individuals. A prospective cross-sectional multicenter study. Fifty individuals with NSCLC from 3 Australian tertiary hospitals and 35 similar-aged healthy individuals without cancer were included. Individuals with NSCLC were assessed pretreatment. Primary measures were triaxial accelerometry (steps/day) and GPS tracking (outdoor PA behavior). Secondary measures were questionnaires assessing depression, motivation to exercise, and environmental barriers to PA. Between-group comparisons were analyzed using analysis of covariance. Individuals with NSCLC engaged in significantly less PA than similar-aged healthy individuals (mean difference 2363 steps/day, P = .007) and had higher levels of depression (P = .027) and lower motivation to exercise (P = .001). Daily outdoor walking time (P = .874) and distance travelled away from home (P = .883) were not different between groups. Individuals with NSCLC spent less time outdoors in their local neighborhood area (P < .001). A greater number of steps per day was seen in patients who were less depressed (r = .39) or had better access to nonresidential destinations such as shopping centers (r = .25). Global positioning system tracking appears to be a feasible methodology for adult cancer patients and holds promise for use in future studies investigating PA and or lifestyle behaviors. © The Author(s) 2014.

  7. Integration of eHealth Tools in the Process of Workplace Health Promotion: Proposal for Design and Implementation.

    PubMed

    Jimenez, Paulino; Bregenzer, Anita

    2018-02-23

Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, intervention development process, or evaluation. To support a successful implementation of eHealth tools in the whole WHP processes, we introduce a concept of WHP (life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. We developed a life cycle model of WHP based on the World Health Organization (WHO) model of healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can support WHP by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility to give automatic feedback about health parameters, create incentive systems, or bring together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. 
Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the workplace are needed to secure quality and reach sustainable results. ©Paulino Jimenez, Anita Bregenzer. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 23.02.2018.

  8. Individually Tailored, Adaptive Intervention to Manage Gestational Weight Gain: Protocol for a Randomized Controlled Trial in Women With Overweight and Obesity.

    PubMed

    Symons Downs, Danielle; Savage, Jennifer S; Rivera, Daniel E; Smyth, Joshua M; Rolls, Barbara J; Hohman, Emily E; McNitt, Katherine M; Kunselman, Allen R; Stetter, Christy; Pauley, Abigail M; Leonard, Krista S; Guo, Penghong

    2018-06-08

    High gestational weight gain is a major public health concern as it independently predicts adverse maternal and infant outcomes. Past interventions have had only limited success in effectively managing pregnancy weight gain, especially among women with overweight and obesity. Well-designed interventions are needed that take an individualized approach and target unique barriers to promote healthy weight gain. The primary aim of the study is to describe the study protocol for Healthy Mom Zone, an individually tailored, adaptive intervention for managing weight in pregnant women with overweight and obesity. The Healthy Mom Zone intervention, based on theories of planned behavior and self-regulation and a model of energy balance, includes components (eg, education, self-monitoring, physical activity/healthy eating behaviors) that are adapted over the course of the intervention (ie, increased in intensity) to better regulate weight gain. Decision rules inform when to adapt the intervention. In this randomized controlled trial, women are randomized to the intervention or a standard care control group. The intervention is delivered from approximately 8-36 weeks gestation and includes step-ups in dosage (ie, Step-up 1 = education + physical activity + healthy eating active learning [cooking/recipes]; Step-up 2 = Step-up 1 + portion size, physical activity; Step-up 3 = Step-up 1 + 2 + grocery store feedback, physical activity), with a maximum of 5 adaptations. Study measures are obtained at pre- and postintervention as well as daily (eg, weight), weekly (eg, energy intake/expenditure), and monthly (eg, psychological) over the study period. Analyses will include linear mixed-effects models, generalized estimating equations, and dynamical modeling to understand between-group and within-individual effects of the intervention on weight gain. Recruitment of 31 pregnant women with overweight and obesity occurred from January 2016 through July 2017. 
    Baseline data have been collected for all participants. To date, 24 participants have completed the intervention and postintervention follow-up assessments, 3 are currently in progress, 1 dropped out, and 3 women had early miscarriages and are no longer active in the study. Of the 24 participants, 13 women have completed the intervention to date, of which 1 (8%, 1/13) received only the baseline intervention, 3 (23%, 3/13) received baseline + step-up 1, 6 (46%, 6/13) received baseline + step-up 1 + step-up 2, and 3 (23%, 3/13) received baseline + step-up 1 + step-up 2 + step-up 3. Data analysis is still ongoing through spring 2018. This is one of the first intervention studies to use an individually tailored, adaptive design to manage weight gain in pregnancy. Results from this study will be useful in designing a larger randomized trial to examine the efficacy of this intervention and in developing strategies for clinical application. RR1-10.2196/9220. ©Danielle Symons Downs, Jennifer S Savage, Daniel E Rivera, Joshua M Smyth, Barbara J Rolls, Emily E Hohman, Katherine M McNitt, Allen R Kunselman, Christy Stetter, Abigail M Pauley, Krista S Leonard, Penghong Guo. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 08.06.2018.
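    As a rough illustration of how dosage step-up decision rules of this kind can work, the sketch below escalates the intervention when weekly weight gain exceeds a target range, capped at a maximum number of adaptations. The threshold, step numbering, and function names are hypothetical assumptions, not the trial's actual decision rules.

```python
# Hedged sketch of a dosage step-up decision rule (the target-range
# threshold and the cap are illustrative assumptions, not the trial's).
MAX_ADAPTATIONS = 5

def next_dosage(current_step, weekly_gain_kg, goal_high=0.5):
    """Return next week's intervention step (0 = baseline intervention).

    Escalate one step-up when weekly gain exceeds the target range,
    up to the maximum number of adaptations; otherwise hold steady.
    """
    if weekly_gain_kg > goal_high and current_step < MAX_ADAPTATIONS:
        return current_step + 1  # gain above range: step up the dosage
    return current_step          # within range or capped: keep dosage
```

    In the actual protocol the rules also govern which components are added at each step-up; here a single threshold stands in for that logic.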

  9. 3D Stacked Memory Final Report CRADA No. TC-0494-93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernhardt, A.; Beene, G.

    TI and LLNL demonstrated: (1) a process for the fabrication of 3-D memory using stacked DRAM chips, and (2) a fast prototyping process for 3-D stacks and MCMs. The metallization to route the chip pads to the sides of the die was carried out in a single high-speed masking step. The mask was not the usual physical one in glass and chrome, but was simply a computer file used to control the laser patterning process. Changes in either chip or customer circuit-board pad layout were easily and inexpensively accommodated, so that prototyping was a natural consequence of the laser patterning process. As in the current TI process, a dielectric layer was added to the wafer, and vias to the chip I/O pads were formed. All of the steps in Texas Instruments' earlier process that were required to gold bump the pads were eliminated, significantly reducing fabrication cost and complexity. Pads were created on the sides of the die, which became pads on the side of the stack. In order to extend the process to accommodate non-memory devices with substantially greater I/O than is required for DRAMs, pads were patterned on two sides of the memory stacks as a proof of principle. Stacking and bonding were done using modifications of the current TI process. After stacking and bonding, the pads on the sides of the dice were connected by application of a polyimide insulator film with laser ablation of the polyimide to form contacts to the pads. Then metallization was accomplished in the same manner as on the individual die.

  10. Standard work for room entry: Linking lean, hand hygiene, and patient-centeredness.

    PubMed

    O'Reilly, Kristin; Ruokis, Samantha; Russell, Kristin; Teves, Tim; DiLibero, Justin; Yassa, David; Berry, Hannah; Howell, Michael D

    2016-03-01

    Healthcare-associated infections are costly and fatal. Substantial front-line, administrative, regulatory, and research efforts have focused on improving hand hygiene. While broad agreement exists that hand hygiene is the most important single approach to infection prevention, compliance with hand hygiene is typically only about 40% (1). Our aim was to develop a standard process for room entry in the intensive care unit that improved compliance with hand hygiene and allowed for maximum efficiency. We recognized that hand hygiene is a single step in a substantially more complicated process of room entry. We applied Lean engineering techniques to develop a standard process that included both physical steps and standard communication elements from provider to patients and families, and created a physical environment to support this. We observed meaningful improvement in the performance of the new standard as well as time savings for clinical providers with each room entry. We also observed an increase in room entries that included verbal communication and an explanation of what the clinician was entering the room to do. The design and implementation of a standardized room entry process, and the creation of an environment that supports that new process, have resulted in measurable positive outcomes in the medical intensive care unit, including quality, patient experience, efficiency, and staff satisfaction. Designing a process, rather than viewing tasks that need to happen in close proximity in time (either serially or in parallel) as unrelated, simplifies work for staff and results in higher compliance with individual tasks. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Whole limb kinematics are preferentially conserved over individual joint kinematics after peripheral nerve injury

    PubMed Central

    Chang, Young-Hui; Auyang, Arick G.; Scholz, John P.; Nichols, T. Richard

    2009-01-01

    Biomechanics and neurophysiology studies suggest whole limb function to be an important locomotor control parameter. Inverted pendulum and mass-spring models greatly reduce the complexity of the legs and predict the dynamics of locomotion, but do not address how numerous limb elements are coordinated to achieve such simple behavior. As a first step, we hypothesized that whole limb kinematics are of primary importance and would be preferentially conserved over individual joint kinematics after neuromuscular injury. We used a well-established peripheral nerve injury model of cat ankle extensor muscles to generate two experimental injury groups with a predictable time course of temporary paralysis followed by complete muscle self-reinnervation. Mean trajectories of individual joint kinematics were altered as a result of deficits after injury. By contrast, mean trajectories of limb orientation and limb length remained largely invariant across all animals, even with paralyzed ankle extensor muscles, suggesting that changes in mean joint angles were coordinated as part of a long-term compensation strategy to minimize change in whole limb kinematics. Furthermore, at each measurement stage (pre-injury, paralytic, and self-reinnervated), step-by-step variance of individual joint kinematics was always significantly greater than that of limb orientation. Our results suggest that joint angle combinations are coordinated and selected to stabilize whole limb kinematics against short-term natural step-by-step deviations as well as long-term, pathological deviations created by injury. This may represent a fundamental compensation principle allowing animals to adapt to changing conditions with minimal effect on overall locomotor function. PMID:19837893
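    The step-by-step variance comparison reported above can be illustrated with a minimal sketch; the kinematic values below are synthetic stand-ins, not data from the study.

```python
def step_variance(values):
    """Variance of a kinematic variable across repeated steps."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

# Synthetic example: an individual joint angle varies far more from
# step to step than whole-limb orientation, the pattern reported above.
joint_angle_per_step = [60.0, 70.0, 55.0, 75.0]      # degrees, hypothetical
limb_orientation_per_step = [30.0, 31.0, 29.0, 30.0]  # degrees, hypothetical
```

    With these numbers, the joint-angle variance (62.5) dwarfs the limb-orientation variance (0.5), mirroring the qualitative finding that whole-limb kinematics are more tightly conserved.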

  12. Elite sprinting: are athletes individually step-frequency or step-length reliant?

    PubMed

    Salo, Aki I T; Bezodis, Ian N; Batterham, Alan M; Kerwin, David G

    2011-06-01

    The aim of this study was to investigate the step characteristics of the very best 100-m sprinters in the world to understand whether elite athletes are individually more reliant on step frequency (SF) or step length (SL). A total of 52 male elite-level 100-m races were recorded from publicly available television broadcasts, with 11 analyzed athletes performing in 10 or more races. For each run of each athlete, the average SF and SL over the whole 100-m distance were analyzed. To determine any SF or SL reliance for an individual athlete, the 90% confidence interval (CI) for the difference between the SF-time and SL-time relationships was derived using a criterion nonparametric bootstrapping technique. Athletes performed these races with various combinations of SF and SL reliance. Athlete A10 yielded the highest positive CI difference (SL reliance), with a value of 1.05 (CI = 0.50-1.53). The largest negative difference (SF reliance) occurred for athlete A11 at -0.60, with a CI range of -1.20 to 0.03. Previous studies have generally identified only one of these variables as the main reason for faster running velocities. However, this study showed that there is large variation in performance patterns among elite athletes and that, overall, SF or SL reliance is a highly individual occurrence. It is proposed that athletes should take this reliance into account in their training, with SF-reliant athletes needing to keep their neural system ready for fast leg turnover and SL-reliant athletes requiring more concentration on maintaining strength levels.
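    A minimal sketch of the kind of nonparametric bootstrap used to derive such a confidence interval is shown below: it resamples an athlete's races and compares the strength of the SL-time and SF-time relationships. The function names, the sign convention, and the synthetic data are illustrative assumptions, not the authors' actual procedure.

```python
import random

def corr(x, y):
    """Pearson correlation; returns 0.0 for degenerate (zero-variance) input."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    if sxx == 0.0 or syy == 0.0:
        return 0.0
    return sxy / (sxx * syy) ** 0.5

def reliance_ci(times, sf, sl, n_boot=2000, alpha=0.10, seed=1):
    """Bootstrap CI for |corr(SL, time)| - |corr(SF, time)| over one
    athlete's races: a clearly positive interval suggests SL reliance,
    a clearly negative one SF reliance (illustrative convention)."""
    rng = random.Random(seed)
    n = len(times)
    diffs = []
    for _ in range(n_boot):
        s = [rng.randrange(n) for _ in range(n)]  # resample race indices
        t = [times[i] for i in s]
        diffs.append(abs(corr([sl[i] for i in s], t))
                     - abs(corr([sf[i] for i in s], t)))
    diffs.sort()
    return diffs[int(alpha / 2 * n_boot)], diffs[int((1 - alpha / 2) * n_boot) - 1]
```

    With 10 or more races per athlete, as in the study, the resulting 90% interval can be inspected for a clear positive or negative offset from zero.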

  13. Preclinic group education sessions reduce waiting times and costs at public pain medicine units.

    PubMed

    Davies, Stephanie; Quintner, John; Parsons, Richard; Parkitny, Luke; Knight, Paul; Forrester, Elizabeth; Roberts, Mary; Graham, Carl; Visser, Eric; Antill, Tracy; Packer, Tanya; Schug, Stephan A

    2011-01-01

    To assess the effects of preclinic group education sessions and system redesign on tertiary pain medicine units and patient outcomes. Prospective cohort study. Two public hospital multidisciplinary pain medicine units. People with persistent pain. A system redesign from a "traditional" model (initial individual medical appointments) to a model that delivers group education sessions prior to individual appointments. Based on Patient Triage Questionnaires, patients were scheduled to attend Self-Training Educative Pain Sessions (STEPS), a two-day, eight-hour group education program, followed by optional patient-initiated clinic appointments. Outcome measures included: the number of patients completing STEPS who subsequently requested individual outpatient clinic appointment(s); wait times; unit cost per new patient referred; recurrent health care utilization; patient satisfaction; Global Perceived Impression of Change (GPIC); and pain management strategies utilized. Following STEPS, 48% of attendees requested individual outpatient appointments. Wait times reduced from 105.6 to 16.1 weeks at one pain unit and from 37.3 to 15.2 weeks at the second. Unit cost per new patient appointed reduced from $1,805 Australian Dollars (AUD) to AUD$541 (for STEPS). At 3 months, patients scored their satisfaction with "the treatment received for their pain" more positively than at baseline (change score = 0.88; P = 0.0003), GPIC improved (change score = 0.46; P < 0.0001), and the mean number of active strategies utilized increased by 4.12 per patient (P = 0.0004). The introduction of STEPS was associated with reduced wait times and costs at public pain medicine units and increased both the use of active pain management strategies and patient satisfaction. Wiley Periodicals, Inc.

  14. Separation process using pervaporation and dephlegmation

    DOEpatents

    Vane, Leland M.; Mairal, Anurag P.; Ng, Alvin; Alvarez, Franklin R.; Baker, Richard W.

    2004-06-29

    A process for treating liquids containing organic compounds and water. The process includes a pervaporation step in conjunction with a dephlegmation step to treat at least a portion of the permeate vapor from the pervaporation step. The process yields a membrane residue stream, a stream enriched in the more volatile component (usually the organic) as the overhead stream from the dephlegmator and a condensate stream enriched in the less volatile component (usually the water) as a bottoms stream from the dephlegmator. Any of these may be the principal product of the process. The membrane separation step may also be performed in the vapor phase, or by membrane distillation.

  15. Lithographic chip identification: meeting the failure analysis challenge

    NASA Astrophysics Data System (ADS)

    Perkins, Lynn; Riddell, Kevin G.; Flack, Warren W.

    1992-06-01

    This paper describes a novel method using stepper photolithography to uniquely identify individual chips for permanent traceability. A commercially available 1X stepper is used to mark chips with an identifier or 'serial number' which can be encoded with relevant information for the integrated circuit manufacturer. The permanent identification of individual chips can improve current methods of quality control, failure analysis, and inventory control. The need for this technology is escalating as manufacturers seek to provide six sigma quality control for their products and trace fabrication problems to their source. This need is especially acute for parts that fail after packaging and are returned to the manufacturer for analysis. Using this novel approach, failure analysis data can be tied back to a particular batch, wafer, or even a position within a wafer. Process control can be enhanced by identifying the root cause of chip failures. Chip identification also addresses manufacturers' concerns with increasing incidences of chip theft. Since chips currently carry no identification other than the manufacturer's name and part number, recovery efforts are hampered by the inability to determine the sales history of a specific packaged chip. A definitive identifier or serial number for each chip would address this concern. The results of chip identification (patent pending) are easily viewed through a low-power microscope. Batch number, wafer number, exposure step, and chip location within the exposure step can be recorded, as can dates and other items of interest. The chip identification procedure and processing requirements are described. Experimental testing and results are presented, and potential applications are discussed.

  16. Dataset for an analysis of communicative aspects of finance.

    PubMed

    Natalya Zavyalova

    2017-04-01

    The article describes a step-by-step strategy for designing a comprehensive view of a broad range of financial research topics. The strategy centers on analyzing the retrieval results of Serelex, a word processing system based on a semantic similarity measure. When designing a research topic, scientists usually draw on their individual background, relying in most cases on their own assumptions and hypotheses. The strategy introduced in the article highlights a method for identifying components of semantic maps that can lead to better coverage of any scientific topic under analysis. Using the research field of finance as an example, we show the practical and theoretical value of semantic similarity measurements, i.e., better coverage of the problems that might be included in a scientific analysis of the financial field. At the design stage of any research, scientists are not immune to an insufficient and thus erroneous spectrum of problems under analysis. According to St. Augustine's famous maxim, 'Fallor ergo sum', researchers' activities are driven along the way from one mistake to the next. However, this need not be the case for 21st-century science. Our strategy offers an innovative methodology by which the number of mistakes at the initial stage of any research may be significantly reduced. The data obtained were used in two articles (N. Zavyalova, 2017) [7], (N. Zavyalova, 2015) [8]. The second stage of our experiment analyzed the correlation between the language and the income level of the respondents. The article contains information about the data processing.
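    Semantic similarity measures of the kind underlying systems like Serelex are commonly built on comparing term context vectors. The sketch below shows a generic cosine similarity between sparse term vectors; it illustrates the general technique only and is not a reproduction of the Serelex model.

```python
import math

def cosine_similarity(vec_a, vec_b):
    """Cosine similarity between two sparse term vectors, given as
    dicts mapping context terms to weights (a generic measure, used
    here only to illustrate the idea of semantic similarity)."""
    dot = sum(w * vec_b.get(t, 0.0) for t, w in vec_a.items())
    na = math.sqrt(sum(w * w for w in vec_a.values()))
    nb = math.sqrt(sum(w * w for w in vec_b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

    Terms whose context vectors point in similar directions score near 1, unrelated terms near 0; ranking candidate terms by this score is one way to surface components of a semantic map.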

  17. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  18. Effect of the processing steps on compositions of table olive since harvesting time to pasteurization.

    PubMed

    Nikzad, Nasim; Sahari, Mohammad A; Vanak, Zahra Piravi; Safafar, Hamed; Boland-nazar, Seyed A

    2013-08-01

    Weight, oil, fatty acid, tocopherol, polyphenol, and sterol properties of 5 olive cultivars (Zard, Fishomi, Ascolana, Amigdalolia, and Conservalia) during the crude, lye treatment, washing, fermentation, and pasteurization steps were studied. Results showed: oil percentage was highest in Ascolana (crude step) and lowest in Fishomi (pasteurization step); during the processing steps, in all cultivars, oleic, palmitic, linoleic, and stearic acids predominated; the largest changes in saturated and unsaturated fatty acids occurred in the fermentation step; the highest and lowest ratios of ω3/ω6 were in Ascolana (washing step) and in Zard (pasteurization step), respectively; the highest and lowest tocopherol contents were in Amigdalolia and Fishomi, respectively, with the major damage occurring in the lye step; the highest and lowest polyphenol contents were in Ascolana (crude step) and in Zard and Ascolana (pasteurization step), respectively; the major damage among cultivars occurred during the lye step, in which polyphenol content was reduced to one-tenth of its initial value; sterols did not change during the steps. A review of olive patents shows that many fruit compositions, such as oil quality, fatty acid profile, and oil quantity and its fractions, can be changed by altering the cultivar and the process.

  19. Building dynamic population graph for accurate correspondence detection.

    PubMed

    Du, Shaoyi; Guo, Yanrong; Sanroma, Gerard; Ni, Dong; Wu, Guorong; Shen, Dinggang

    2015-12-01

    In medical imaging studies, there is an increasing trend toward discovering the intrinsic anatomical differences across individual subjects in a dataset, such as hand images for skeletal bone age estimation. Pair-wise matching is often used to detect correspondences between each individual subject and a pre-selected model image with manually placed landmarks. However, the large anatomical variability across individual subjects can easily compromise such a pair-wise matching step. In this paper, we present a new framework to simultaneously detect correspondences among a population of individual subjects by propagating all manually placed landmarks from a small set of model images through a dynamically constructed image graph. Specifically, we first establish graph links between models and individual subjects according to pair-wise shape similarity (the forward step). Next, we detect correspondences for the individual subjects with direct links to any of the model images, which is achieved by a new multi-model correspondence detection approach based on our recently published sparse point matching method. To correct inaccurate correspondences, we further apply an error detection mechanism to automatically detect wrong correspondences and then update the image graph accordingly (the backward step). After that, all subject images with detected correspondences are included in the set of model images, and the above two steps of graph expansion and error correction are repeated until accurate correspondences for all subject images are established. Evaluations on real hand X-ray images demonstrate that our proposed method, using a dynamic graph construction approach, can achieve much higher accuracy and robustness when compared with state-of-the-art pair-wise correspondence detection methods as well as a similar method using a static population graph. Copyright © 2015 Elsevier B.V. All rights reserved.
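    The forward/backward graph-expansion loop described above can be sketched as follows; the similarity, matching, and reliability functions are placeholder stand-ins for the paper's shape similarity, sparse point matching, and error detection components.

```python
# Minimal sketch of a forward/backward graph-expansion loop (the three
# callbacks are illustrative placeholders, not the paper's algorithms).
def propagate_landmarks(models, subjects, similarity, match, is_reliable):
    """models: dict name -> landmarks; subjects: dict name -> image data.
    Repeatedly links each unsolved subject to its most similar current
    model (forward step), keeps only matches passing the error check
    (backward step), and promotes solved subjects into the model set."""
    solved = dict(models)
    pending = dict(subjects)
    while pending:
        progress = False
        for name, image in list(pending.items()):
            # Forward step: link to the most similar current model.
            best = max(solved, key=lambda m: similarity(m, name))
            landmarks = match(solved[best], image)
            # Backward step: keep reliable correspondences, retry the rest
            # later once closer intermediate models have been promoted.
            if is_reliable(landmarks, image):
                solved[name] = landmarks
                del pending[name]
                progress = True
        if not progress:  # remaining subjects cannot be matched reliably
            break
    return solved
```

    The loop captures the key idea that hard subjects are revisited after easier subjects have joined the model set, shortening the anatomical gap to be bridged.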

  20. Bridging Health Care and the Workplace: Formulation of a Return-to-Work Intervention for Breast Cancer Patients Using an Intervention Mapping Approach.

    PubMed

    Désiron, Huguette A M; Crutzen, Rik; Godderis, Lode; Van Hoof, Elke; de Rijk, Angelique

    2016-09-01

    Purpose: An increasing number of breast cancer (BC) survivors of working age require return-to-work (RTW) support. The objective of this paper is to describe the development of an RTW intervention to be embedded in the care process, bridging the gap between hospital and workplace. Method: The Intervention Mapping (IM) approach was used, combining formative research results regarding RTW in BC patients with published insights on occupational therapy (OT) and RTW. Four development steps were taken, from needs assessment to the development of intervention components and materials. Results: A five-phase RTW intervention guided by a hospital-based occupational therapist is proposed: (1) assessing the worker, the usual work, and contextual factors that impact (re-)employment; (2) exploring the match/differences between the worker and the usual work; (3) establishing long-term goals, broken down into short-term goals; (4) setting up tailored actions that carefully implement the results of the preceding phases; (5) executing the program described in phase 4 step by step. The occupational therapist monitors, measures, and reviews goals and program steps to secure the tailor-made approach of each step of the intervention. Conclusion: The use of IM resulted in an RTW-oriented OT intervention. This unique intervention succeeds in matching individual BC patients' needs with the input of stakeholders at the hospital and the workplace.

  1. Surface Modified Particles By Multi-Step Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2006-01-17

    The present invention relates to a new class of surface modified particles and to a multi-step surface modification process for the preparation of the same. The multi-step surface functionalization process involves two or more reactions to produce particles that are compatible with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through organic linking groups.

  2. FTIR spectroscopic study on individual amino acid residues in the proton pumping process of bacteriorhodopsin

    NASA Astrophysics Data System (ADS)

    Liu, Xiaomei

    1998-05-01

    My thesis project has concentrated on clarifying the role of individual amino acids such as tyrosine, arginine and threonine in the active proton transferring process of Bacteriorhodopsin(bR). BR is a protein found in the purple membrane of Halobacteria salinarium. The main function of bR is to transfer a proton from the interior side of the cell to the external medium upon illumination by visible light. BR belongs to a family of retinal- containing membrane proteins which includes rhodopsin, a visual receptor found in the eye, and sensory rhodopsin I, a light receptor for phototaxis found in H. salinarium. Complete understanding of the proton transferring mechanism of bR can help explain the energy transduction and active ion transport in biological systems. This information also provides insight into other members of the retinal-containing protein family. To study the behavior of a single amino acid in a protein which consists of 248 amino acids, I employed the Fourier transform infrared (FTIR) difference spectroscopy technique. This was combined with the recently developed genetic engineering method of site directed isotope labeling (SDIL). As complementary work, I also characterized the vibrational properties of individual amino acids in various environments. Because of the high resolution and sensitivity of FTIR difference spectroscopy, along with the ability of SDIL to detect structural changes at the single amino acid level, we are able to determine changes in the structure of specific amino acids at different steps in bR photocycle. My research results provide strong evidence for a proton pump model. This model predicts the participation of tyrosine 185 and one or more threonines in a hydrogen bonded chain which can transfer proton across the membrane. My data also suggest a more accurate model for the proton release step which involves arginine 82.

  3. Data-based control of a multi-step forming process

    NASA Astrophysics Data System (ADS)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

    The fourth industrial revolution represents a new stage in the organization and management of the entire value chain. In the field of forming technology, however, it has so far arrived only gradually. In order to make a valuable contribution to the digital factory, the control of a multistage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how an interface between the different forming machines can be designed in a tangible way, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes that are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed for improved control of the subsequent process. On the basis of the scientific knowledge gained, forming operations can be made more robust and at the same time more flexible, creating the foundation for linking various production processes in an efficient way.
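    A minimal sketch of the data-based control idea: a measurement from the first forming step adjusts a parameter of the subsequent step through a simple proportional correction. The parameter names and the gain value are assumptions for illustration, not taken from the study.

```python
# Hedged sketch: feed a measurement from forming step 1 forward into a
# process parameter of step 2 (parameter names and gain are assumed).
def adjust_next_step(measured_thickness, nominal_thickness,
                     nominal_press_force, gain=0.8):
    """Return the press force for the subsequent forming step,
    compensated proportionally for the deviation observed in step 1."""
    deviation = measured_thickness - nominal_thickness
    return nominal_press_force * (1.0 + gain * deviation / nominal_thickness)
```

    In a real process chain the correction law would come from process models or experiments; the point here is only the data flow from one machine's measurement to the next machine's setpoint.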

  4. Targeting safety improvements through identification of incident origination and detection in a near-miss incident learning system.

    PubMed

    Novak, Avrey; Nyflot, Matthew J; Ermoian, Ralph P; Jordan, Loucille E; Sponseller, Patricia A; Kane, Gabrielle M; Ford, Eric C; Zeng, Jing

    2016-05-01

    Radiation treatment planning involves a complex workflow with multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but have neither examined where they are detected nor applied a near-miss risk index (NMRI) to gauge severity. From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned an NMRI score ranging from 0 to 4, reflecting potential harm to the patient (no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, of the 103 individual process steps, 35 were classified as safety barriers, i.e., process steps whose primary function is to catch errors. The safety barriers that most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Near-miss incidents most commonly originated within treatment planning (33%). 
    However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, vs an average NMRI of 1.5 for all events), specifically during the documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than those in other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, the therapist's chart check, and the review of portal images; however, most of the incidents that pass through a particular safety barrier are ones that the barrier is not designed to capture. Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation and to be detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
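    The per-category NMRI averaging described above can be sketched as follows; the incident records in the example are hypothetical, not drawn from the study's data.

```python
from collections import defaultdict

# Illustrative aggregation of near-miss incidents by workflow category,
# mirroring the averaging described above (records are hypothetical).
def average_nmri(incidents):
    """incidents: list of (origination_category, nmri_score) pairs.
    Returns the mean NMRI per category and the overall mean."""
    buckets = defaultdict(list)
    for category, score in incidents:
        buckets[category].append(score)
    per_category = {c: sum(s) / len(s) for c, s in buckets.items()}
    overall = sum(score for _, score in incidents) / len(incidents)
    return per_category, overall
```

    The same grouping applied to detection points, rather than origination points, yields the detection-side averages reported in the abstract.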

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novak, Avrey; Nyflot, Matthew J.; Ermoian, Ralph P.

    Purpose: Radiation treatment planning involves a complex workflow with multiple potential points of vulnerability. This study uses an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide departmental safety improvement efforts. Previous studies have examined where errors arise, but they have neither examined where errors are detected nor applied a near-miss risk index (NMRI) to gauge severity. Methods: From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned an NMRI score ranging from 0 to 4, reflecting potential harm to the patient (from no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, of the 103 individual process steps, 35 were classified as safety barriers: process steps whose primary function is to catch errors. The safety barriers that most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow areas passed through before detection, and average NMRI scores were compared. Results: Near-miss incidents most commonly originated within treatment planning (33%). 
However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, vs. an average of 1.5 across all events), specifically during documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than those in other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, therapist's chart check, and review of portal images; however, most incidents that pass through a particular safety barrier are of a type the barrier is not designed to capture. Conclusions: Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation and to be detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
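
The per-category severity analysis described above is, computationally, a group-by-and-average over incident records. A minimal sketch with invented incidents (category names follow the paper's workflow, but the scores are hypothetical):

```python
from collections import defaultdict

# Hypothetical near-miss incidents: (workflow category of origin, NMRI score 0-4).
incidents = [
    ("treatment planning", 1), ("treatment planning", 2),
    ("imaging for RT planning", 2), ("imaging for RT planning", 3),
    ("treatment delivery", 1), ("patient assessment", 0),
]

def average_nmri_by_category(incidents):
    """Group incidents by workflow category and average their NMRI scores."""
    scores = defaultdict(list)
    for category, nmri in incidents:
        scores[category].append(nmri)
    return {cat: sum(s) / len(s) for cat, s in scores.items()}

print(average_nmri_by_category(incidents))
```

The same grouping applied to the detection point instead of the origination point yields the detection-side severity profile.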

  6. Stochastic eco-evolutionary model of a prey-predator community.

    PubMed

    Costa, Manon; Hauzy, Céline; Loeuille, Nicolas; Méléard, Sylvie

    2016-02-01

    We are interested in the impact of natural selection in a prey-predator community. We introduce an individual-based model of the community that takes into account both prey and predator phenotypes. Our aim is to understand the phenotypic coevolution of prey and predators. The community evolves as a multi-type birth and death process with mutations. We first consider the infinite particle approximation of the process without mutation. In this limit, the process can be approximated by a system of differential equations. We prove the existence of a unique globally asymptotically stable equilibrium under specific conditions on the interaction among prey individuals. When mutations are rare, the community evolves on the mutational scale according to a Markovian jump process. This process describes the successive equilibria of the prey-predator community and extends the polymorphic evolutionary sequence to a coevolutionary framework. We then assume that mutations have a small impact on phenotypes and consider the evolution of monomorphic prey and predator populations. The limit of small mutation steps leads to a system of two differential equations which is a version of the canonical equation of adaptive dynamics for the prey-predator coevolution. We illustrate these different limits with an example of a prey-predator community that takes into account different prey defense mechanisms. We observe through simulations how these various prey strategies impact the community.
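
A multi-type birth and death process of this kind is typically simulated with the Gillespie algorithm: draw an exponential waiting time from the total event rate, then pick one event in proportion to its rate. The sketch below is a generic illustration with an invented event set and rate constants, not the paper's model:

```python
import random

def gillespie_prey_predator(prey, pred, t_max, seed=1):
    """Minimal Gillespie-style simulation of a two-type birth-death process.
    Events: prey birth, predator death, predation (prey death, with a chance
    of producing a new predator). All rate constants are illustrative."""
    b, d, p, conv = 1.0, 0.1, 0.005, 0.5
    rng = random.Random(seed)
    t = 0.0
    while t < t_max and prey > 0 and pred > 0:
        rates = [b * prey, d * pred, p * prey * pred]
        total = sum(rates)
        t += rng.expovariate(total)       # exponential waiting time
        r = rng.uniform(0.0, total)       # pick event proportional to its rate
        if r < rates[0]:
            prey += 1                     # prey reproduces
        elif r < rates[0] + rates[1]:
            pred -= 1                     # predator dies
        else:
            prey -= 1                     # predation event
            if rng.random() < conv:
                pred += 1                 # prey converted into a new predator
    return prey, pred

print(gillespie_prey_predator(prey=50, pred=10, t_max=2.0))
```

Adding rare mutation events to the same scheme gives the mutational time scale studied in the paper.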

  7. Intuitive and deliberate judgments are based on common principles.

    PubMed

    Kruglanski, Arie W; Gigerenzer, Gerd

    2011-01-01

    A popular distinction in cognitive and social psychology has been between intuitive and deliberate judgments. In dual-process theories of reasoning, this juxtaposition has aligned associative, unconscious, effortless, heuristic, and suboptimal processes (assumed to foster intuitive judgments) against rule-based, conscious, effortful, analytic, and rational processes (assumed to characterize deliberate judgments). In contrast, we provide convergent arguments and evidence for a unified theoretical approach to both intuitive and deliberate judgments. Both are rule-based, and in fact, the very same rules can underlie both intuitive and deliberate judgments. The important open question is that of rule selection, and we propose a 2-step process in which the task itself and the individual's memory constrain the set of applicable rules, whereas the individual's processing potential and the (perceived) ecological rationality of the rule for the task guide the final selection from that set. Deliberate judgments are not generally more accurate than intuitive judgments; in both cases, accuracy depends on the match between rule and environment: the rules' ecological rationality. Heuristics that are less effortful and that ignore part of the information can be more accurate than cognitive strategies that use more information and computation. The proposed framework adumbrates a unified approach that specifies the critical dimensions on which judgmental situations may vary and the environmental conditions under which rules can be expected to be successful.

  8. Functional Constructivism: In Search of Formal Descriptors.

    PubMed

    Trofimova, Irina

    2017-10-01

    The Functional Constructivism (FC) paradigm is an alternative to behaviorism and considers behavior as being generated every time anew, based on an individual's capacities, environmental resources and demands. Walter Freeman's work provided us with evidence supporting the FC principles. In this paper we make parallels between gradual construction processes leading to the formation of individual behavior and habits, and evolutionary processes leading to the establishment of biological systems. Referencing evolutionary theory, several formal descriptors of such processes are proposed. These FC descriptors refer to the most universal aspects for constructing consistent structures: expansion of degrees of freedom, integration processes based on internal and external compatibility between systems and maintenance processes, all given in four different classes of systems: (a) Zone of Proximate Development (poorly defined) systems; (b) peer systems with emerging reproduction of multiple siblings; (c) systems with internalized integration of behavioral elements ('cruise controls'); and (d) systems capable of handling low-probability, not yet present events. The recursive dynamics within this set of descriptors acting on (traditional) downward, upward and horizontal directions of evolution, is conceptualized as diagonal evolution, or di-evolution. Two examples applying these FC descriptors to taxonomy are given: classification of the functionality of neuro-transmitters and temperament traits; classification of mental disorders. The paper is an early step towards finding a formal language describing universal tendencies in highly diverse, complex and multi-level transient systems known in ecology and biology as 'contingency cycles.'

  9. Step Forward. Single Parent/Homemaker Annual Report for the Fiscal Year 1990-1991.

    ERIC Educational Resources Information Center

    Kentucky Tech Region 5, Elizabethtown.

    The Step Forward Single Parent/Homemaker Program in Elizabethtown, Kentucky, was developed to provide information on career opportunities and assist the target individuals in career assessment, career counseling, and goal setting in order to develop self-esteem and time management skills. During the second year of the Step Forward program in…

  10. Mechanical, thermal and morphological characterization of polycarbonate/oxidized carbon nanofiber composites produced with a lean 2-step manufacturing process.

    PubMed

    Lively, Brooks; Kumar, Sandeep; Tian, Liu; Li, Bin; Zhong, Wei-Hong

    2011-05-01

    In this study we report the advantages of a 2-step method that incorporates an additional process pre-conditioning step for rapid and precise blending of the constituents prior to the commonly used melt compounding method for preparing polycarbonate/oxidized carbon nanofiber composites. This additional step (equivalent to a manufacturing cell) involves the formation of a highly concentrated solid nano-nectar of polycarbonate/carbon nanofiber composite using a solution mixing process, followed by melt mixing with pure polycarbonate. This combined method yields excellent dispersion and improved mechanical and thermal properties as compared to the 1-step melt mixing method. The test results indicated that inclusion of carbon nanofibers into composites via the 2-step method resulted in a dramatically reduced (48% lower) coefficient of thermal expansion compared to that of pure polycarbonate, and 30% lower than that from the 1-step processing, at the same loading of 1.0 wt%. Improvements were also found in dynamic mechanical analysis and flexural mechanical properties. The 2-step approach is more precise and leads to better dispersion, higher quality, consistency, and improved performance in critical application areas. It is also consistent with Lean Manufacturing principles, in which manufacturing cells are linked together using fewer key resources, creating a smoother production flow. Therefore, this 2-step process can be more attractive for industry.

  11. All individuals are not created equal; accounting for interindividual variation in fitting life-history responses to toxicants.

    PubMed

    Jager, Tjalling

    2013-02-05

    The individuals of a species are not equal. These differences frustrate experimental biologists and ecotoxicologists who wish to study the response of a species (in general) to a treatment. In the analysis of data, differences between model predictions and observations on individual animals are usually treated as random measurement error around the true response. These deviations, however, are mainly caused by real differences between the individuals (e.g., differences in physiology and in initial conditions). Understanding these intraspecies differences, and accounting for them in the data analysis, will improve our understanding of the response to the treatment we are investigating and allow for a more powerful, less biased, statistical analysis. Here, I explore a basic scheme for statistical inference to estimate parameters governing stress that allows individuals to differ in their basic physiology. This scheme is illustrated using a simple toxicokinetic-toxicodynamic model and a data set for growth of the springtail Folsomia candida exposed to cadmium in food. This article should be seen as proof of concept; a first step in bringing more realism into the statistical inference for process-based models in ecotoxicology.
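
The core idea above, attributing scatter to real parameter differences between individuals rather than to measurement noise, can be sketched by fitting a growth parameter separately for each animal instead of one rate for the whole group. This toy example uses a linear growth model and invented data, not the paper's toxicokinetic-toxicodynamic model or the Folsomia candida data set:

```python
# Fit a per-individual growth rate by ordinary least squares on size vs. time.

def fit_rate(times, sizes):
    """Ordinary least-squares slope of size against time."""
    n = len(times)
    tbar = sum(times) / n
    sbar = sum(sizes) / n
    num = sum((t - tbar) * (s - sbar) for t, s in zip(times, sizes))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

times = [0, 7, 14, 21]                    # observation days
animals = {
    "A": [0.2, 0.9, 1.6, 2.3],            # fast grower
    "B": [0.2, 0.55, 0.9, 1.25],          # slow grower
}
rates = {name: fit_rate(times, sizes) for name, sizes in animals.items()}
print(rates)
```

Pooling both animals into one fit would average the two rates and inflate the apparent residual error; fitting per individual recovers the real between-individual variation.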

  12. Learning a constrained conditional random field for enhanced segmentation of fallen trees in ALS point clouds

    NASA Astrophysics Data System (ADS)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2018-06-01

    In this study, we present a method for improving the quality of automatic single fallen tree stem segmentation in ALS data by applying a specialized constrained conditional random field (CRF). The entire processing pipeline is composed of two steps. First, short stem segments of equal length are detected and a subset of them is selected for further processing, while in the second step the chosen segments are merged to form entire trees. The first step is accomplished using the specialized CRF defined on the space of segment labelings, capable of finding segment candidates which are easier to merge subsequently. To achieve this, the CRF considers not only the features of every candidate individually, but incorporates pairwise spatial interactions between adjacent segments into the model. In particular, pairwise interactions include a collinearity/angular deviation probability which is learned from training data as well as the ratio of spatial overlap, whereas unary potentials encode a learned probabilistic model of the laser point distribution around each segment. Each of these components enters the CRF energy with its own balance factor. To process previously unseen data, we first calculate the subset of segments for merging on a grid of balance factors by minimizing the CRF energy. Then, we perform the merging and rank the balance configurations according to the quality of their resulting merged trees, obtained from a learned tree appearance model. The final result is derived from the top-ranked configuration. We tested our approach on 5 plots from the Bavarian Forest National Park using reference data acquired in a field inventory. Compared to our previous segment selection method without pairwise interactions, an increase in detection correctness and completeness of up to 7 and 9 percentage points, respectively, was observed.
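
The balance-factor grid search described above can be illustrated with a toy CRF energy: unary costs for keeping each segment plus pairwise interaction terms, each weighted by its own balance factor, minimized exhaustively over binary labelings. All costs and weights below are invented; the paper's potentials are learned from training data:

```python
import itertools

# Toy stand-in for the segment-selection CRF: binary labels (keep/discard)
# for three stem-segment candidates. Unary costs and pairwise interaction
# values are invented for illustration.
unary = [-0.2, 0.8, -0.4]                 # per-segment cost of keeping it
pairwise = {(0, 1): -0.5, (1, 2): 0.3}    # interactions between adjacent segments

def energy(labels, w_unary, w_pair):
    """CRF energy: balance-factor-weighted sum of unary and pairwise terms."""
    e = sum(w_unary * u * l for u, l in zip(unary, labels))
    e += sum(w_pair * v * labels[i] * labels[j] for (i, j), v in pairwise.items())
    return e

def best_labeling(w_unary, w_pair):
    """Exhaustive energy minimization (fine for a handful of segments)."""
    return min(itertools.product([0, 1], repeat=len(unary)),
               key=lambda ls: energy(ls, w_unary, w_pair))

# Evaluate a grid of balance-factor configurations, as done before merging.
for w_u, w_p in [(1.0, 0.5), (1.0, 1.0), (0.5, 2.0)]:
    print((w_u, w_p), best_labeling(w_u, w_p))
```

In the paper the minimization is over far larger label spaces and each grid configuration's merged trees are then ranked by a learned appearance model; the sketch only shows the energy/grid structure.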

  13. Density functional theory and RRKM calculations of decompositions of the metastable E-2,4-pentadienal molecular ions.

    PubMed

    Solano Espinoza, Eduardo A; Vallejo Narváez, Wilmer E

    2010-07-01

    The potential energy profiles for the fragmentations that lead to [C(5)H(5)O](+) and [C(4)H(6)](+*) ions from the molecular ions [C(5)H(6)O](+*) of E-2,4-pentadienal were obtained from calculations at the UB3LYP/6-311++G(3df,3pd)//UB3LYP/6-31G(d,p) level of theory. Kinetic barriers and harmonic frequencies obtained by the density functional method were then employed in Rice-Ramsperger-Kassel-Marcus calculations of individual rate coefficients for a large number of reaction steps. The pre-equilibrium and rate-controlling step approximations were applied to different regions of the complex potential energy surface, allowing the overall rate of decomposition to be calculated and discriminated between three rival pathways: C-H bond cleavage, decarbonylation and cyclization. These processes should have to compete for an equilibrated mixture of four conformers of the E-2,4-pentadienal ions. The direct dissociation, however, can only become important in the high-energy regime. In contrast, loss of CO and cyclization are observable processes in the metastable kinetic window. The former involves a slow 1,2-hydrogen shift from the carbonyl group that is immediately followed by the formation of an ion-neutral complex which, in turn, decomposes rapidly to the s-trans-1,3-butadiene ion [C(4)H(6)](+*). The predominating metastable channel is the second one, that is, a multi-step ring closure which starts with a rate-limiting cis-trans isomerization. This process yields a mixture of interconverting pyran ions that dissociates to the pyrylium ions [C(5)H(5)O](+). These results can be used to rationalize the CID mass spectrum of E-2,4-pentadienal in a low-energy regime. Copyright © 2010 John Wiley & Sons, Ltd.

  14. Oxidation-driven surface dynamics on NiAl(100)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, Hailang; Chen, Xidong; Li, Liang

    Atomic steps, a defect common to all crystal surfaces, can play an important role in many physical and chemical processes. However, attempts to predict surface dynamics under nonequilibrium conditions are usually frustrated by poor knowledge of the atomic processes of surface motion arising from mass transport from/to surface steps. Using low-energy electron microscopy that spatially and temporally resolves oxide film growth during the oxidation of NiAl(100), we demonstrate that surface steps are impermeable to oxide film growth. The advancement of the oxide occurs exclusively on the same terrace and requires the coordinated migration of surface steps. The resulting piling up of surface steps ahead of the oxide growth front progressively impedes the oxide growth. This process is reversed during oxide decomposition. The migration of the substrate steps is found to be a surface-step version of the well-known Hele-Shaw problem, governed by detachment (attachment) of Al atoms at step edges induced by the oxide growth (decomposition). As a result, by comparing with the oxidation of NiAl(110), which exhibits unimpeded oxide film growth over substrate steps, we suggest that whenever steps are the source of atoms used for oxide growth they limit the oxidation process; when atoms are supplied from the bulk, the oxidation rate is not limited by the motion of surface steps.

  15. Oxidation-driven surface dynamics on NiAl(100)

    DOE PAGES

    Qin, Hailang; Chen, Xidong; Li, Liang; ...

    2014-12-29

    Atomic steps, a defect common to all crystal surfaces, can play an important role in many physical and chemical processes. However, attempts to predict surface dynamics under nonequilibrium conditions are usually frustrated by poor knowledge of the atomic processes of surface motion arising from mass transport from/to surface steps. Using low-energy electron microscopy that spatially and temporally resolves oxide film growth during the oxidation of NiAl(100), we demonstrate that surface steps are impermeable to oxide film growth. The advancement of the oxide occurs exclusively on the same terrace and requires the coordinated migration of surface steps. The resulting piling up of surface steps ahead of the oxide growth front progressively impedes the oxide growth. This process is reversed during oxide decomposition. The migration of the substrate steps is found to be a surface-step version of the well-known Hele-Shaw problem, governed by detachment (attachment) of Al atoms at step edges induced by the oxide growth (decomposition). As a result, by comparing with the oxidation of NiAl(110), which exhibits unimpeded oxide film growth over substrate steps, we suggest that whenever steps are the source of atoms used for oxide growth they limit the oxidation process; when atoms are supplied from the bulk, the oxidation rate is not limited by the motion of surface steps.

  16. Genomic prediction in a nuclear population of layers using single-step models.

    PubMed

    Yan, Yiyuan; Wu, Guiqin; Liu, Aiqiao; Sun, Congjiao; Han, Wenpeng; Li, Guangqi; Yang, Ning

    2018-02-01

    The single-step genomic prediction method has been proposed to improve the accuracy of genomic prediction by incorporating information from both genotyped and ungenotyped animals. The objective of this study is to compare the prediction performance of single-step models with 2-step models and pedigree-based models in a nuclear population of layers. A total of 1,344 chickens across 4 generations were genotyped with a 600 K SNP chip. Four traits were analyzed, i.e., body weight at 28 wk (BW28), egg weight at 28 wk (EW28), laying rate at 38 wk (LR38), and Haugh unit at 36 wk (HU36). In predicting offspring, individuals from generations 1 to 3 were used as training data and females from generation 4 were used as the validation set. The accuracies of breeding values predicted by pedigree BLUP (PBLUP), genomic BLUP (GBLUP), single-step GBLUP (SSGBLUP), and single-step blending (SSBlending) were compared for both genotyped and ungenotyped individuals. For genotyped females, GBLUP performed no better than PBLUP because of the small size of the training data, while the 2 single-step models predicted more accurately than the PBLUP model. The average predictive abilities of SSGBLUP and SSBlending were 16.0% and 10.8% higher than the PBLUP model across traits, respectively. Furthermore, the predictive abilities for ungenotyped individuals were also enhanced. The average improvements in predictive ability were 5.9% and 1.5% for the SSGBLUP and SSBlending models, respectively. It was concluded that single-step models, especially the SSGBLUP model, can yield more accurate prediction of genetic merit and are preferable for practical implementation of genomic selection in layers. © 2017 Poultry Science Association Inc.
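
The GBLUP-family models compared above share one computational core: predicting unobserved breeding values from observed phenotypes through a genomic relationship matrix G. A deliberately tiny pure-Python sketch of that kernel step, with an invented 2-animal G matrix and phenotypes (not the layer data; real G matrices are built from SNP genotypes, and single-step methods additionally blend in pedigree relationships):

```python
# Predict one validation animal's breeding value from 2 training animals.

def mat_vec(M, v):
    """Matrix-vector product for small lists of lists."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def inv2(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

G_tt = [[1.0, 0.3], [0.3, 1.0]]   # relationships among 2 training animals
G_vt = [[0.5, 0.2]]               # validation animal vs. training animals
y_t = [2.0, -1.0]                 # centered training phenotypes
lam = 0.5                         # residual-to-genetic variance ratio

# ghat_v = G_vt (G_tt + lam * I)^(-1) y_t
A = [[G_tt[i][j] + (lam if i == j else 0.0) for j in range(2)] for i in range(2)]
ghat = mat_vec(G_vt, mat_vec(inv2(A), y_t))
print(ghat)
```

The prediction is simply a relationship-weighted, shrunken combination of training phenotypes, which is why a validation animal more related to the training set (larger G_vt entries) receives a stronger prediction.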

  17. Precision nutrition - review of methods for point-of-care assessment of nutritional status.

    PubMed

    Srinivasan, Balaji; Lee, Seoho; Erickson, David; Mehta, Saurabh

    2017-04-01

    Precision nutrition encompasses prevention and treatment strategies for optimizing health that consider individual variability in diet, lifestyle, environment and genes by accurately determining an individual's nutritional status. This is particularly important as malnutrition now affects a third of the global population, with most of those affected or their care providers having limited means of determining their nutritional status. Similarly, program implementers often have no way of determining the impact or success of their interventions, thus hindering their scale-up. Exciting new developments in the area of point-of-care diagnostics promise to provide improved access to nutritional status assessment, as a first step towards enabling precision nutrition and tailored interventions at both the individual and community levels. In this review, we focus on the current advances in developing portable diagnostics for assessment of nutritional status at point-of-care, along with the numerous design challenges in this process and potential solutions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. One-Class Classification-Based Real-Time Activity Error Detection in Smart Homes.

    PubMed

    Das, Barnan; Cook, Diane J; Krishnan, Narayanan C; Schmitter-Edgecombe, Maureen

    2016-08-01

    Caring for individuals with dementia is frequently associated with extreme physical and emotional stress, which often leads to depression. Smart home technology and advances in machine learning techniques can provide innovative solutions to reduce caregiver burden. One key service that caregivers provide is prompting individuals with memory limitations to initiate and complete daily activities. We hypothesize that sensor technologies combined with machine learning techniques can automate the process of providing reminder-based interventions. The first step towards automated interventions is to detect when an individual faces difficulty with activities. We propose machine learning approaches based on one-class classification that learn normal activity patterns. When we apply these classifiers to activity patterns that were not seen before, the classifiers are able to detect activity errors, which represent potential prompt situations. We validate our approaches on smart home sensor data obtained from older adult participants, some of whom faced difficulties performing routine activities and thus committed errors.
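
The one-class idea above, learning only normal activity patterns and flagging anything too dissimilar, can be sketched with a nearest-neighbor novelty detector. The paper uses trained one-class classifiers on smart home sensor data; the feature vectors and threshold here are invented for illustration:

```python
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class OneClassNN:
    """Nearest-neighbor novelty detector: a pattern is flagged as a potential
    activity error if its distance to the closest normal training pattern
    exceeds a threshold. Only normal examples are ever seen in training."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.normal = []

    def fit(self, patterns):
        self.normal = list(patterns)

    def is_error(self, pattern):
        return min(euclidean(pattern, n) for n in self.normal) > self.threshold

# Hypothetical activity features: (duration, sensor events, room changes).
detector = OneClassNN(threshold=2.0)
detector.fit([(5, 12, 2), (6, 10, 2), (5, 11, 3)])
print(detector.is_error((5, 11, 2)))   # close to normal patterns
print(detector.is_error((20, 1, 9)))   # far from anything seen
```

A flagged pattern corresponds to a potential prompt situation: the point at which a reminder-based intervention would be triggered.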

  19. Optimization of airport security lanes

    NASA Astrophysics Data System (ADS)

    Chen, Lin

    2018-05-01

    The airport security management systems currently implemented around the world ensure the safety of passengers, but they may not be optimal. This paper seeks a better security system, one that maximizes security while minimizing inconvenience to passengers. First, we apply a Petri net model to identify the steps where the main bottlenecks lie. Based on average token counts and transition times, the most time-consuming steps of the security process can be found, including inspection of passengers' identification and documents, preparing belongings to be scanned, and retrieving belongings afterwards. We then develop a queuing model to identify the factors affecting those time-consuming steps. Effective improvement measures include converting the current system to a single-queue, multi-server arrangement, intelligently predicting the number of security checkpoints that should be opened, and building convenient biometric "green lanes". Furthermore, to test the theoretical results, we run the model on sample data, and the simulation results are consistent with those obtained through modeling. Finally, we apply our queuing model to a multi-cultural setting. The results suggest that, by quantifying and adjusting for the variance in wait time, the model can be applied to individuals with various customs and habits. Overall, our paper considers multiple factors, employs several models, and performs extensive calculations, making it practical and reliable for real-world use.
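
The single-queue, multi-server recommendation can be illustrated with the standard Erlang C formula for an M/M/c queue. This is textbook queueing theory rather than the paper's specific model, and the arrival and service rates below are invented:

```python
from math import factorial

def erlang_c_wait(lam, mu, c):
    """Mean waiting time in queue for an M/M/c system (one shared queue,
    c identical servers). lam: arrival rate, mu: per-server service rate."""
    a = lam / mu                      # offered load in Erlangs
    rho = a / c                       # server utilization, must be < 1
    if rho >= 1:
        raise ValueError("unstable queue: utilization >= 1")
    tail = a ** c / (factorial(c) * (1 - rho))
    p_wait = tail / (sum(a ** k / factorial(k) for k in range(c)) + tail)
    return p_wait / (c * mu - lam)    # Erlang C probability / (c*mu - lam)

# One shared queue feeding 3 checkpoints vs. 3 independent single-server
# lanes with arrivals split evenly among them:
print(erlang_c_wait(2.4, 1.0, 3))     # pooled queue
print(erlang_c_wait(0.8, 1.0, 1))     # each separate lane
```

At identical total load and utilization, the pooled single-queue system yields a markedly shorter mean wait, which is the quantitative basis for the single-queue, multi-server measure.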

  20. Gwyscan: a library to support non-equidistant scanning probe microscope measurements

    NASA Astrophysics Data System (ADS)

    Klapetek, Petr; Yacoot, Andrew; Grolich, Petr; Valtr, Miroslav; Nečas, David

    2017-03-01

    We present a software library and related methodology for enabling easy integration of adaptive-step (non-equidistant) scanning techniques into metrological scanning probe microscopes, or into scanning probe microscopes where individual x, y position data are recorded during measurements. Scanning with adaptive steps can reduce the amount of data collected in SPM measurements, leading to faster data acquisition, less data collection for a given analytical task, and lower sensitivity to mechanical and thermal drift. Implementing adaptive scanning routines in a custom-built microscope is not normally an easy task: regular data are much easier to handle for previewing (e.g. levelling) and storage. We present an environment that makes implementation of adaptive scanning easier for an instrument developer, specifically taking into account the data acquisition approaches used in high-accuracy microscopes such as those developed by National Metrology Institutes. This includes a library with algorithms written in C and LabVIEW for handling data storage, generating regular-mesh previews, and planning the scan path on the basis of different assumptions. A set of modules for the Gwyddion open source software for handling these data and for their further analysis is presented. Using this combination of data acquisition and processing tools, one can implement adaptive scanning relatively easily in an instrument that previously measured on a regular grid. The performance of the presented approach is shown and general non-equidistant data processing steps are discussed.
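
Adaptive-step scan planning of the kind the library supports can be sketched in one dimension: take small steps where the surface changes rapidly and large steps on flat regions. This is a generic illustration of the idea, not the Gwyscan API; the profile, step bounds, and slope scaling are invented:

```python
# Plan x positions along a 1-D height profile; step shrinks with local slope.

def plan_scan(profile, min_step, max_step, slope_scale):
    """Return sample positions: large steps on flat stretches, small steps
    where the local height difference (a crude slope estimate) is large."""
    xs = [0.0]
    x = 0.0
    last = len(profile) - 1
    while x < last:
        i = int(x)
        slope = abs(profile[min(i + 1, last)] - profile[i])
        step = max(min_step, min(max_step, slope_scale / (slope + 1e-9)))
        x += step
        xs.append(min(x, last))
    return xs

flat = [0.0] * 50
bumpy = [(-1) ** i * 1.0 for i in range(50)]
print(len(plan_scan(flat, 0.25, 4.0, 0.5)))   # few points needed
print(len(plan_scan(bumpy, 0.25, 4.0, 0.5)))  # dense sampling where slope is high
```

A real implementation would plan in two dimensions and feed the resulting non-equidistant (x, y) positions to the scanner; previews on a regular mesh are then regenerated by interpolation, as the library's preview routines do.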
