Sample records for multiple processes including

  1. Ultrascalable petaflop parallel supercomputer

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT]; Chen, Dong [Croton On Hudson, NY]; Chiu, George [Cross River, NY]; Cipolla, Thomas M [Katonah, NY]; Coteus, Paul W [Yorktown Heights, NY]; Gara, Alan G [Mount Kisco, NY]; Giampapa, Mark E [Irvington, NY]; Hall, Shawn [Pleasantville, NY]; Haring, Rudolf A [Cortlandt Manor, NY]; Heidelberger, Philip [Cortlandt Manor, NY]; Kopcsay, Gerard V [Yorktown Heights, NY]; Ohmacht, Martin [Yorktown Heights, NY]; Salapura, Valentina [Chappaqua, NY]; Sugavanam, Krishnan [Mahopac, NY]; Takken, Todd [Brewster, NY]

    2010-07-20

    A massively parallel supercomputer of petaOPS-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC) having up to four processing elements. The ASIC nodes are interconnected by multiple independent networks that optimally maximize the throughput of packet communications between nodes with minimal latency. The multiple networks may include three high-speed networks for parallel algorithm message passing including a Torus, collective network, and a Global Asynchronous network that provides global barrier and notification functions. These multiple independent networks may be collaboratively or independently utilized according to the needs or phases of an algorithm for optimizing algorithm processing performance. The use of a DMA engine is provided to facilitate message passing among the nodes without the expenditure of processing resources at the node.

  2. 42 CFR 137.327 - May multiple projects be included in a single construction project agreement?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false May multiple projects be included in a single construction project agreement? 137.327 Section 137.327 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF...-GOVERNANCE Construction Project Assumption Process § 137.327 May multiple projects be included in a single...

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tate, John G.; Richardson, Bradley S.; Love, Lonnie J.

    ORNL worked with the Schaeffler Group USA to explore additive manufacturing techniques that might be appropriate for prototyping of bearing cages. Multiple additive manufacturing techniques were investigated, including e-beam, binder jet and multiple laser based processes. The binder jet process worked best for printing the thin, detailed cages.

  4. Control of Visually Guided Saccades in Multiple Sclerosis: Disruption to Higher-Order Processes

    ERIC Educational Resources Information Center

    Fielding, Joanne; Kilpatrick, Trevor; Millist, Lynette; White, Owen

    2009-01-01

    Ocular motor abnormalities are a common feature of multiple sclerosis (MS), with more salient deficits reflecting tissue damage within brainstem and cerebellar circuits. However, MS may also result in disruption to higher level or cognitive control processes governing eye movement, including attentional processes that enhance the neural processing…

  5. Single photon detection with self-quenching multiplication

    NASA Technical Reports Server (NTRS)

    Zheng, Xinyu (Inventor); Cunningham, Thomas J. (Inventor); Pain, Bedabrata (Inventor)

    2011-01-01

    A photoelectronic device and an avalanche self-quenching process for a photoelectronic device are described. The photoelectronic device comprises a nanoscale semiconductor multiplication region and a nanoscale doped semiconductor quenching structure including a depletion region and an undepleted region. The photoelectronic device can act as a single photon detector or a single carrier multiplier. The avalanche self-quenching process allows electric field reduction in the multiplication region by movement of the multiplication carriers, thus quenching the avalanche.

  6. Developing image processing meta-algorithms with data mining of multiple metrics.

    PubMed

    Leung, Kelvin; Cunha, Alexandre; Toga, A W; Parker, D Stott

    2014-01-01

    People often use multiple metrics in image processing, but here we take a novel approach of mining the values of batteries of metrics on image processing results. We present a case for extending image processing methods to incorporate automated mining of multiple image metric values. Here by a metric we mean any image similarity or distance measure, and in this paper we consider intensity-based and statistical image measures and focus on registration as an image processing problem. We show how it is possible to develop meta-algorithms that evaluate different image processing results with a number of different metrics and mine the results in an automated fashion so as to select the best results. We show that the mining of multiple metrics offers a variety of potential benefits for many image processing problems, including improved robustness and validation.
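    As a rough illustration of the "score candidate results with several metrics, then mine the scores" idea described in this abstract, the following sketch computes mean squared error, normalized cross-correlation and a histogram-based mutual information for candidate registration results and selects a winner by a simple vote. It is not the authors' implementation; the function names are illustrative and only NumPy is assumed.

    ```python
    import numpy as np

    def mse(a, b):
        # Mean squared error: lower is better.
        return float(np.mean((a - b) ** 2))

    def ncc(a, b):
        # Normalized cross-correlation: higher is better.
        a0, b0 = a - a.mean(), b - b.mean()
        return float((a0 * b0).sum() / (np.linalg.norm(a0) * np.linalg.norm(b0) + 1e-12))

    def mutual_information(a, b, bins=32):
        # Histogram-based mutual information: higher is better.
        h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        p = h / h.sum()
        px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
        nz = p > 0
        return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

    def rank_results(reference, candidates):
        """Score each candidate registration result with several metrics and
        pick the candidate that wins the most metrics (simple vote)."""
        metrics = {"mse": (mse, -1), "ncc": (ncc, +1), "mi": (mutual_information, +1)}
        scores = {name: [sign * fn(reference, c) for c in candidates]   # signed so higher = better
                  for name, (fn, sign) in metrics.items()}
        votes = np.zeros(len(candidates))
        for vals in scores.values():
            votes[int(np.argmax(vals))] += 1
        return scores, int(np.argmax(votes))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        ref = rng.random((64, 64))
        cands = [ref + 0.05 * rng.standard_normal(ref.shape),
                 ref + 0.30 * rng.standard_normal(ref.shape)]
        scores, best = rank_results(ref, cands)
        print(scores, "best candidate:", best)
    ```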

  7. Children's representations of multiple family relationships: organizational structure and development in early childhood.

    PubMed

    Schermerhorn, Alice C; Cummings, E Mark; Davies, Patrick T

    2008-02-01

    The authors examine mutual family influence processes at the level of children's representations of multiple family relationships, as well as the structure of those representations. From a community sample with 3 waves, each spaced 1 year apart, kindergarten-age children (105 boys and 127 girls) completed a story-stem completion task, tapping representations of multiple family relationships. Structural equation modeling with autoregressive controls indicated that representational processes involving different family relationships were interrelated over time, including links between children's representations of marital conflict and reactions to conflict, between representations of security about marital conflict and parent-child relationships, and between representations of security in father-child and mother-child relationships. Mixed support was found for notions of increasing stability in representations during this developmental period. Results are discussed in terms of notions of transactional family dynamics, including family-wide perspectives on mutual influence processes attributable to multiple family relationships.

  8. Hierarchical, parallel computing strategies using component object model for process modelling responses of forest plantations to interacting multiple stresses

    Treesearch

    J. G. Isebrands; G. E. Host; K. Lenz; G. Wu; H. W. Stech

    2000-01-01

    Process models are powerful research tools for assessing the effects of multiple environmental stresses on forest plantations. These models are driven by interacting environmental variables and often include genetic factors necessary for assessing forest plantation growth over a range of different site, climate, and silvicultural conditions. However, process models are...

  9. Developing Image Processing Meta-Algorithms with Data Mining of Multiple Metrics

    PubMed Central

    Cunha, Alexandre; Toga, A. W.; Parker, D. Stott

    2014-01-01

    People often use multiple metrics in image processing, but here we take a novel approach of mining the values of batteries of metrics on image processing results. We present a case for extending image processing methods to incorporate automated mining of multiple image metric values. Here by a metric we mean any image similarity or distance measure, and in this paper we consider intensity-based and statistical image measures and focus on registration as an image processing problem. We show how it is possible to develop meta-algorithms that evaluate different image processing results with a number of different metrics and mine the results in an automated fashion so as to select the best results. We show that the mining of multiple metrics offers a variety of potential benefits for many image processing problems, including improved robustness and validation. PMID:24653748

  10. Multisite EPR oximetry from multiple quadrature harmonics.

    PubMed

    Ahmad, R; Som, S; Johnson, D H; Zweier, J L; Kuppusamy, P; Potter, L C

    2012-01-01

    Multisite continuous wave (CW) electron paramagnetic resonance (EPR) oximetry using multiple quadrature field modulation harmonics is presented. First, a recently developed digital receiver is used to extract multiple harmonics of field modulated projection data. Second, a forward model is presented that relates the projection data to unknown parameters, including linewidth at each site. Third, a maximum likelihood estimator of unknown parameters is reported using an iterative algorithm capable of jointly processing multiple quadrature harmonics. The data modeling and processing are applicable for parametric lineshapes under nonsaturating conditions. Joint processing of multiple harmonics leads to 2-3-fold acceleration of EPR data acquisition. For demonstration in two spatial dimensions, both simulations and phantom studies on an L-band system are reported. Copyright © 2011 Elsevier Inc. All rights reserved.
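    In generic terms (a sketch of the usual formulation, not the authors' exact forward model), jointly processing K quadrature harmonics amounts to a maximum likelihood fit of the unknown parameter vector theta (per-site linewidths and related lineshape parameters) to all harmonics at once:

        \hat{\theta} = \arg\max_{\theta} \sum_{k=1}^{K} \log p\!\left(y_k \mid \theta\right),

    where y_k denotes the k-th quadrature harmonic of the field-modulated projection data. Each extra harmonic adds information per scan, which is the basis of the reported 2-3-fold acceleration in acquisition.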

  11. LC-MS/MS and UPLC-UV evaluation of anthocyanins and anthocyanidins during rabbiteye blueberry juice processing

    USDA-ARS?s Scientific Manuscript database

    Blueberry juice processing includes multiple steps, each of which affects the chemical composition of the berries, including thermal degradation of anthocyanins. Not-from-concentrate juice was made by heating and enzyme-processing blueberries before pressing, followed by ultrafiltration and pasteurization. ...

  12. Video to Text (V2T) in Wide Area Motion Imagery

    DTIC Science & Technology

    2015-09-01

    microtext) or a document (e.g., using Sphinx or Apache NLP) as an automated approach [102]. Previous work in natural language full-text searching...language processing (NLP) based module. The heart of the structured text processing module includes the following seven key word banks...Features Tracker MHT Multiple Hypothesis Tracking MIL Multiple Instance Learning NLP Natural Language Processing OAB Online AdaBoost OF Optic Flow

  13. Using normalisation process theory to understand barriers and facilitators to implementing mindfulness-based stress reduction for people with multiple sclerosis.

    PubMed

    Simpson, Robert; Simpson, Sharon; Wood, Karen; Mercer, Stewart W; Mair, Frances S

    2018-01-01

    Objectives To study barriers and facilitators to implementation of mindfulness-based stress reduction for people with multiple sclerosis. Methods Qualitative interviews with 33 people with multiple sclerosis, 6 multiple sclerosis clinicians and 2 course instructors were used to explore barriers and facilitators to implementation of mindfulness-based stress reduction. Normalisation process theory provided the underpinning conceptual framework. Data were analysed deductively using normalisation process theory constructs (coherence, cognitive participation, collective action and reflexive monitoring). Results Key barriers included mismatched stakeholder expectations, lack of knowledge about mindfulness-based stress reduction, high levels of comorbidity and disability, and skepticism about embedding mindfulness-based stress reduction in routine multiple sclerosis care. Facilitators to implementation included introducing a pre-course orientation session and adapting mindfulness-based stress reduction to accommodate comorbidity and disability; participants suggested smaller, shorter classes, shortened practices, exclusion of mindful walking and more time with peers. Post-mindfulness-based stress reduction booster sessions may be required, and objective and subjective reports of benefit would increase clinician confidence in mindfulness-based stress reduction. Discussion Multiple sclerosis patients and clinicians know little about mindfulness-based stress reduction. Mismatched expectations are a barrier to participation, as is rigid application of mindfulness-based stress reduction in the context of disability. Course adaptations in response to patient needs would facilitate uptake and utilisation. Rendering access to mindfulness-based stress reduction rapid and flexible could facilitate implementation. Embedded outcome assessment is desirable.

  14. Efficient group decision making in workshop settings

    Treesearch

    Daniel L. Schmoldt; David L. Peterson

    2001-01-01

    Public land managers must treat multiple values coincidentally in time and space, which requires the participation of multiple resource specialists and consideration of diverse clientele interests in the decision process. This implies decision making that includes multiple participants, both internally and externally. Decades of social science research on decision...

  15. A multiple deficit model of reading disability and attention-deficit/hyperactivity disorder: searching for shared cognitive deficits.

    PubMed

    McGrath, Lauren M; Pennington, Bruce F; Shanahan, Michelle A; Santerre-Lemmon, Laura E; Barnard, Holly D; Willcutt, Erik G; Defries, John C; Olson, Richard K

    2011-05-01

    This study tests a multiple cognitive deficit model of reading disability (RD), attention-deficit/hyperactivity disorder (ADHD), and their comorbidity. A structural equation model (SEM) of multiple cognitive risk factors and symptom outcome variables was constructed. The model included phonological awareness as a unique predictor of RD and response inhibition as a unique predictor of ADHD. Processing speed, naming speed, and verbal working memory were modeled as potential shared cognitive deficits. Model fit indices from the SEM indicated satisfactory fit. Closer inspection of the path weights revealed that processing speed was the only cognitive variable with significant unique relationships to RD and ADHD dimensions, particularly inattention. Moreover, the significant correlation between reading and inattention was reduced to non-significance when processing speed was included in the model, suggesting that processing speed primarily accounted for the phenotypic correlation (or comorbidity) between reading and inattention. This study illustrates the power of a multiple deficit approach to complex developmental disorders and psychopathologies, particularly for exploring comorbidities. The theoretical role of processing speed in the developmental pathways of RD and ADHD and directions for future research are discussed. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  16. Cargo identification algorithms facilitating unmanned/unattended inspection at high throughput portals

    NASA Astrophysics Data System (ADS)

    Chalmers, Alex

    2007-10-01

    A simple model is presented of a possible inspection regimen applied to each leg of a cargo container's journey between its point of origin and destination. Several candidate modalities are proposed to be used at multiple remote locations to act as a pre-screen inspection as the target approaches a perimeter and as the primary inspection modality at the portal. Information from multiple data sets is fused to optimize the costs and performance of a network of such inspection systems. A series of image processing algorithms is presented that automatically process X-ray images of containerized cargo. The goal of this processing is to locate the container in a real-time stream of traffic traversing a portal without impeding the flow of commerce. Such processing may facilitate the inclusion of unmanned/unattended inspection systems in such a network. Several samples of the processing applied to data collected from deployed systems are included. Simulated data from a notional cargo inspection system with multiple sensor modalities and advanced data fusion algorithms are also included to show the potential increased detection and throughput performance of such a configuration.

  17. Analytical group decision making in natural resources: methodology and application

    Treesearch

    Daniel L. Schmoldt; David L. Peterson

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups...

  18. Mesoscopic Modeling of Blood Clotting: Coagulation Cascade and Platelets Adhesion

    NASA Astrophysics Data System (ADS)

    Yazdani, Alireza; Li, Zhen; Karniadakis, George

    2015-11-01

    The process of clot formation and growth at a site on a blood vessel wall involves a number of simultaneous multi-scale processes, including multiple chemical reactions in the coagulation cascade, species transport and flow. To model these processes we have incorporated advection-diffusion-reaction (ADR) of multiple species into an extended version of the Dissipative Particle Dynamics (DPD) method, which can be considered a coarse-grained Molecular Dynamics method. At the continuum level this is equivalent to the Navier-Stokes equation plus one advection-diffusion equation for each species. The chemistry of clot formation is now understood to be determined by mechanisms involving reactions among many species in dilute solution, where reaction rate constants and species diffusion coefficients in plasma are known. The role of blood particulates, i.e. red cells and platelets, in the clotting process is studied by including them separately and together in the simulations. An agonist-induced platelet activation mechanism is presented, while platelet adhesive dynamics based on a stochastic bond formation/dissociation process is included in the model.
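    The continuum statement in the abstract (Navier-Stokes plus one advection-diffusion equation per species) corresponds, in its standard generic form, to an advection-diffusion-reaction system:

        \frac{\partial c_i}{\partial t} + \mathbf{u}\cdot\nabla c_i = D_i\,\nabla^2 c_i + R_i(c_1,\dots,c_N), \qquad i = 1,\dots,N,

    where c_i is the concentration of the i-th coagulation species, u the plasma velocity field, D_i its diffusion coefficient in plasma and R_i the net production rate from the coagulation-cascade reactions.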

  19. GSBPP CAPSTONE REVIEW

    DTIC Science & Technology

    2016-12-01

    analysis from multiple sources, including the GSBPP exit survey, archived GSBPP capstones, faculty advisement data, faculty interviews, and a new GSBPP student survey in order to detail the capstone's process, content, and value to multiple stakeholders. The project team also employs the Plan-Do...

  20. System and process for pulsed multiple reaction monitoring

    DOEpatents

    Belov, Mikhail E

    2013-05-17

    A new pulsed multiple reaction monitoring process and system are disclosed that uses a pulsed ion injection mode for use in conjunction with triple-quadrupole instruments. The pulsed injection mode approach reduces background ion noise at the detector, increases amplitude of the ion signal, and includes a unity duty cycle that provides a significant sensitivity increase for reliable quantitation of proteins/peptides present at attomole levels in highly complex biological mixtures.

  1. Quantum filtering for multiple diffusive and Poissonian measurements

    NASA Astrophysics Data System (ADS)

    Emzir, Muhammad F.; Woolley, Matthew J.; Petersen, Ian R.

    2015-09-01

    We provide a rigorous derivation of a quantum filter for the case of multiple measurements being made on a quantum system. We consider a class of measurement processes which are functions of bosonic field operators, including combinations of diffusive and Poissonian processes. This covers the standard cases from quantum optics, where homodyne detection may be described as a diffusive process and photon counting may be described as a Poissonian process. We obtain a necessary and sufficient condition for any pair of such measurements taken at different output channels to satisfy a commutation relationship. Then, we derive a general, multiple-measurement quantum filter as an extension of a single-measurement quantum filter. As an application we explicitly obtain the quantum filter corresponding to homodyne detection and photon counting at the output ports of a beam splitter.
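    For orientation, the single-channel diffusive case that this work generalizes is the standard homodyne stochastic master equation (textbook form, unit detection efficiency, hbar = 1; this is not the multi-measurement filter derived in the paper):

        d\rho_t = -i[H,\rho_t]\,dt + \left(L\rho_t L^\dagger - \tfrac{1}{2}\{L^\dagger L,\rho_t\}\right)dt + \left(L\rho_t + \rho_t L^\dagger - \operatorname{Tr}\!\left[(L+L^\dagger)\rho_t\right]\rho_t\right)\left(dY_t - \operatorname{Tr}\!\left[(L+L^\dagger)\rho_t\right]dt\right),

    with H the system Hamiltonian, L the coupling operator of the monitored channel and Y_t the homodyne photocurrent.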

  2. Global interrupt and barrier networks

    DOEpatents

    Blumrich, Matthias A.; Chen, Dong; Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E; Heidelberger, Philip; Kopcsay, Gerard V.; Steinmacher-Burow, Burkhard D.; Takken, Todd E.

    2008-10-28

    A system and method for generating global asynchronous signals in a computing structure. Particularly, a global interrupt and barrier network is implemented that implements logic for generating global interrupt and barrier signals for controlling global asynchronous operations performed by processing elements at selected processing nodes of a computing structure in accordance with a processing algorithm; and includes the physical interconnecting of the processing nodes for communicating the global interrupt and barrier signals to the elements via low-latency paths. The global asynchronous signals respectively initiate interrupt and barrier operations at the processing nodes at times selected for optimizing performance of the processing algorithms. In one embodiment, the global interrupt and barrier network is implemented in a scalable, massively parallel supercomputing device structure comprising a plurality of processing nodes interconnected by multiple independent networks, with each node including one or more processing elements for performing computation or communication activity as required when performing parallel algorithm operations. One multiple independent network includes a global tree network for enabling high-speed global tree communications among global tree network nodes or sub-trees thereof. The global interrupt and barrier network may operate in parallel with the global tree network for providing global asynchronous sideband signals.

  3. Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Don J.

    2010-01-01

    Signal- and image-processing methods are commonly needed to extract information from the waves, improve resolution of, and highlight defects in an image. Since some similarity exists for all waveform-based nondestructive evaluation (NDE) methods, it would seem that a common software platform containing multiple signal- and image-processing techniques to process the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image series data. This software offers the user hundreds of basic and advanced signal- and image-processing capabilities including esoteric 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for signal- and image-processing capability so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility has been included to allow fitting of spectroscopy data such as from Raman spectroscopy. An extensive joint-time frequency module is included for analysis of non-stationary or transient data such as that from acoustic emission, vibration, or earthquake data.

  4. Manipulations of Choice Familiarity in Multiple-Choice Testing Support a Retrieval Practice Account of the Testing Effect

    ERIC Educational Resources Information Center

    Jang, Yoonhee; Pashler, Hal; Huber, David E.

    2014-01-01

    We performed 4 experiments assessing the learning that occurs when taking a test. Our experiments used multiple-choice tests because the processes deployed during testing can be manipulated by varying the nature of the choice alternatives. Previous research revealed that a multiple-choice test that includes "none of the above" (NOTA)…

  5. Nurse-led immunotreatment DEcision Coaching In people with Multiple Sclerosis (DECIMS) - Feasibility testing, pilot randomised controlled trial and mixed methods process evaluation.

    PubMed

    Rahn, A C; Köpke, S; Backhus, I; Kasper, J; Anger, K; Untiedt, B; Alegiani, A; Kleiter, I; Mühlhauser, I; Heesen, C

    2018-02-01

    Treatment decision-making is complex for people with multiple sclerosis. Providing in-depth information on the available options is rarely possible in routine neurologist encounters. The "nurse decision coach model" was developed to redistribute health professionals' tasks in supporting immunotreatment decision-making following the principles of informed shared decision-making. To test the feasibility of a decision coaching programme and recruitment strategies to inform the main trial. Feasibility testing and parallel pilot randomised controlled trial, accompanied by a mixed methods process evaluation. Two German multiple sclerosis university centres. People with suspected or relapsing-remitting multiple sclerosis facing immunotreatment decisions on first-line drugs were recruited. Randomisation to the intervention (n = 38) or control group (n = 35) was performed on a daily basis. Quantitative and qualitative process data were collected from people with multiple sclerosis, nurses and physicians. We report on the development and piloting of the decision coaching programme. It comprises a training course for multiple sclerosis nurses and the coaching intervention. The intervention consists of up to three structured nurse-led decision coaching sessions, access to an evidence-based online information platform (DECIMS-Wiki) and a final physician consultation. After feasibility testing, a pilot randomised controlled trial was performed. People with multiple sclerosis were randomised to the intervention or control group. The latter also had access to the DECIMS-Wiki, but otherwise received care as usual. Nurses were not blinded to group assignment, while people with multiple sclerosis and physicians were. The primary outcome was 'informed choice' after six months, including the sub-dimensions risk knowledge (after 14 days), attitude concerning immunotreatment (after physician consultation), and treatment uptake (after six months). Quantitative process evaluation data were collected via questionnaires. Qualitative interviews were performed with all nurses and a convenience sample of nine people with multiple sclerosis. 116 people with multiple sclerosis fulfilled the inclusion criteria and 73 (63%) were included. Groups were comparable at baseline. Data of 51 people with multiple sclerosis (70%) were available for the primary endpoint. In the intervention group, 15 of 31 (48%) people with multiple sclerosis achieved an informed choice after six months, compared with 6 of 20 (30%) in the control group. Process evaluation data illustrated a positive response towards the coaching programme as well as good acceptance. The pilot phase showed promising results concerning acceptability and feasibility of the intervention, which was well perceived by people with multiple sclerosis, most nurses and physicians. Delegating parts of the immunotreatment decision-making process to trained nurses has the potential to increase informed choice and participation as well as the effectiveness of patient-physician consultations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, H.

    In this dissertation we study a procedure which restarts a Markov process when the process is killed by some arbitrary multiplicative functional. The regenerative nature of this revival procedure is characterized through a Markov renewal equation. An interesting duality between the revival procedure and the classical killing operation is found. Under the condition that the multiplicative functional possesses an intensity, the generators of the revival process can be written down explicitly. An intimate connection is also found between the perturbation of the sample path of a Markov process and the perturbation of a generator (in Kato's sense). The applications of the theory include the study of processes like the piecewise-deterministic Markov process, the virtual waiting time process and the first entrance decomposition (taboo probability).

  7. Hierarchical analysis of species distributions and abundance across environmental gradients

    Treesearch

    Jeffery Diez; Ronald H. Pulliam

    2007-01-01

    Abiotic and biotic processes operate at multiple spatial and temporal scales to shape many ecological processes, including species distributions and demography. Current debate about the relative roles of niche-based and stochastic processes in shaping species distributions and community composition reflects, in part, the challenge of understanding how these processes...

  8. Key Steps in the Special Review Process

    EPA Pesticide Factsheets

    EPA uses this process when it has reason to believe that the use of a pesticide may result in unreasonable adverse effects on people or the environment. Steps include comprehensive risk and benefit analyses and multiple Position Documents.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

    This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.

  10. Sequential infiltration synthesis for enhancing multiple-patterning lithography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darling, Seth B.; Elam, Jeffrey W.; Tseng, Yu-Chih

    Simplified methods of multiple-patterning photolithography using sequential infiltration synthesis to modify the photoresist such that it withstands plasma etching better than unmodified resist and replaces one or more hard masks and/or a freezing step in MPL processes including litho-etch-litho-etch photolithography or litho-freeze-litho-etch photolithography.

  11. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    PubMed

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    Existing multiple physiological parameter real-time monitoring systems have problems such as insufficient server capacity for physiological data storage and analysis (so that data consistency cannot be guaranteed), poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution for clustered storage and processing of multiple physiological parameters based on cloud computing. Through our studies, batch processing for longitudinal analysis of patients' historical data was introduced. The process included resource virtualization in the IaaS layer of the cloud platform, construction of a real-time computing platform in the PaaS layer, reception and analysis of data streams in the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The result was real-time transmission, storage and analysis of large amounts of physiological data. The simulation test results showed that the remote multiple physiological parameter monitoring system based on the cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solved problems that exist in traditional remote medical services, including long turnaround time, poor real-time analysis performance and lack of extensibility. Technical support was thus provided for a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode of home health monitoring of multiple physiological parameters.

  12. Central Processing Dysfunctions in Children: A Review of Research.

    ERIC Educational Resources Information Center

    Chalfant, James C.; Scheffelin, Margaret A.

    Research on central processing dysfunctions in children is reviewed in three major areas. The first, dysfunctions in the analysis of sensory information, includes auditory, visual, and haptic processing. The second, dysfunction in the synthesis of sensory information, covers multiple stimulus integration and short-term memory. The third area of…

  13. Pulsed electric fields for pasteurization: defining processing conditions

    USDA-ARS?s Scientific Manuscript database

    Application of pulsed electric fields (PEF) technology in food pasteurization has been extensively studied. Optimal PEF treatment conditions for maximum microbial inactivation depend on multiple factors including PEF processing conditions, production parameters and product properties. In order for...

  14. The California general plan process and sustainable transportation planning

    DOT National Transportation Integrated Search

    2001-11-01

    This study reviewed the current and potential utility of California's General Plan process as a tool for promoting more sustainable local transportation systems The study used multiple methods to investigate this issue, including: 1. An extensive lit...

  15. Principles of Temporal Processing Across the Cortical Hierarchy.

    PubMed

    Himberger, Kevin D; Chien, Hsiang-Yun; Honey, Christopher J

    2018-05-02

    The world is richly structured on multiple spatiotemporal scales. In order to represent spatial structure, many machine-learning models repeat a set of basic operations at each layer of a hierarchical architecture. These iterated spatial operations - including pooling, normalization and pattern completion - enable these systems to recognize and predict spatial structure, while robust to changes in the spatial scale, contrast and noisiness of the input signal. Because our brains also process temporal information that is rich and occurs across multiple time scales, might the brain employ an analogous set of operations for temporal information processing? Here we define a candidate set of temporal operations, and we review evidence that they are implemented in the mammalian cerebral cortex in a hierarchical manner. We conclude that multiple consecutive stages of cortical processing can be understood to perform temporal pooling, temporal normalization and temporal pattern completion. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
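    As a toy illustration of two of the candidate operations named here (temporal pooling and temporal normalization; the pattern-completion step is omitted), the following sketch uses NumPy only and illustrative function names, not anything from the paper:

    ```python
    import numpy as np

    def temporal_pool(x, w):
        # Max over a trailing window of length w: a simple form of temporal pooling.
        return np.array([x[max(0, t - w + 1):t + 1].max() for t in range(len(x))])

    def temporal_normalize(x, w, eps=1e-8):
        # Divide by the trailing mean: a simple form of temporal gain control.
        return np.array([x[t] / (x[max(0, t - w + 1):t + 1].mean() + eps)
                         for t in range(len(x))])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        signal = rng.random(200) + np.linspace(0.0, 2.0, 200)   # noisy signal with slow drift
        pooled = temporal_pool(signal, w=10)          # smooths over fast fluctuations
        normalized = temporal_normalize(signal, w=50) # discounts the slow change in scale
        print(pooled[:5], normalized[:5])
    ```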

  16. New opportunities of real-world data from clinical routine settings in life-cycle management of drugs: example of an integrative approach in multiple sclerosis.

    PubMed

    Rothenbacher, Dietrich; Capkun, Gorana; Uenal, Hatice; Tumani, Hayrettin; Geissbühler, Yvonne; Tilson, Hugh

    2015-05-01

    The assessment and demonstration of a positive benefit-risk balance of a drug is a life-long process and includes specific data from preclinical, clinical development and post-launch experience. However, new integrative approaches are needed to enrich evidence from clinical trials and sponsor-initiated observational studies with information from multiple additional sources, including registry information and other existing observational data and, more recently, health-related administrative claims and medical records databases. To illustrate the value of this approach, this paper exemplifies such a cross-package approach to the area of multiple sclerosis, exploring also possible analytic strategies when using these multiple sources of information.

  17. Reliability study of high-brightness multiple single emitter diode lasers

    NASA Astrophysics Data System (ADS)

    Zhu, Jing; Yang, Thomas; Zhang, Cuipeng; Lang, Chao; Jiang, Xiaochen; Liu, Rui; Gao, Yanyan; Guo, Weirong; Jiang, Yuhua; Liu, Yang; Zhang, Luyan; Chen, Louisa

    2015-03-01

    In this study the chip bonding processes for various chips from various chip suppliers around the world have been optimized to achieve reliable, high-performance chip on sub-mount. These chip-on-sub-mounts include three types of bonding, reported below: 8xx nm 1.2W/10.0W indium-bonded lasers, 9xx nm 10W-20W AuSn-bonded lasers and 1470 nm 6W indium-bonded lasers. The MTTF@25 of the 9xx nm chip on sub-mount (COS) is calculated to be more than 203,896 hours. These chips from various chip suppliers are packaged into many multiple single emitter laser modules, using similar packaging techniques, with 2 to 7 emitters per module. A reliability study including aging tests is performed on those multiple single emitter laser modules. With the research team's 12 years of packaging design experience and techniques, precise optical and fiber alignment processes and superior chip bonding capability, we have achieved a total MTTF exceeding 177,710 hours of lifetime at a 60% confidence level for those multiple single emitter laser modules. Furthermore, a separate reliability study on wavelength-stabilized laser modules has shown that this wavelength-stabilized module packaging process is reliable as well.

  18. Universal principles governing multiple random searchers on complex networks: The logarithmic growth pattern and the harmonic law

    NASA Astrophysics Data System (ADS)

    Weng, Tongfeng; Zhang, Jie; Small, Michael; Harandizadeh, Bahareh; Hui, Pan

    2018-03-01

    We propose a unified framework to evaluate and quantify the search time of multiple random searchers traversing independently and concurrently on complex networks. We find that the intriguing behaviors of multiple random searchers are governed by two basic principles—the logarithmic growth pattern and the harmonic law. Specifically, the logarithmic growth pattern characterizes how the search time increases with the number of targets, while the harmonic law explores how the search time of multiple random searchers varies relative to that needed by individual searchers. Numerical and theoretical results demonstrate these two universal principles established across a broad range of random search processes, including generic random walks, maximal entropy random walks, intermittent strategies, and persistent random walks. Our results reveal two fundamental principles governing the search time of multiple random searchers, which are expected to facilitate investigation of diverse dynamical processes like synchronization and spreading.
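    A toy numerical check of the setting described above, restricted to generic random walks (the graph model and parameters are arbitrary choices, not the authors'): k independent walkers on a random graph, recording how the mean time for the first walker to hit a target shrinks as k grows.

    ```python
    import numpy as np

    def erdos_renyi_adjacency(n, p, rng):
        # Symmetric random adjacency matrix; assumes p is large enough that no node is isolated.
        a = np.triu(rng.random((n, n)) < p, 1)
        return a | a.T

    def first_hit_time(adj, k, target, rng, max_steps=100_000):
        """Steps until the first of k independent walkers (uniform random starts) reaches `target`."""
        n = adj.shape[0]
        pos = rng.integers(0, n, size=k)
        if np.any(pos == target):
            return 0
        for t in range(1, max_steps + 1):
            pos = np.array([rng.choice(np.flatnonzero(adj[v])) for v in pos])  # step to a random neighbor
            if np.any(pos == target):
                return t
        return max_steps

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        adj = erdos_renyi_adjacency(200, 0.05, rng)
        for k in (1, 2, 4, 8):
            times = [first_hit_time(adj, k, target=0, rng=rng) for _ in range(300)]
            print(f"{k} searcher(s): mean search time ~ {np.mean(times):.1f} steps")
    ```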

  19. Design requirements for SRB production control system. Volume 2: System requirements and conceptual description

    NASA Technical Reports Server (NTRS)

    1981-01-01

    In the development of the business system for the SRB automated production control system, special attention had to be paid to the unique environment posed by the space shuttle. The issues posed by this environment, and the means by which they were addressed, are reviewed. The change in management philosophy which will be required as NASA switches from one-of-a-kind launches to multiple launches is discussed. The implications of the assembly process for the business system are described. These issues include multiple missions, multiple locations and facilities, maintenance and refurbishment, multiple sources, and multiple contractors. The implications of these aspects for the automated production control system are reviewed, including an assessment of the six major subsystems, as well as four other subsystems. Some general system requirements which flow through the entire business system are described.

  20. On Holo-Hilbert Spectral Analysis: A Full Informational Spectral Representation for Nonlinear and Non-Stationary Data

    NASA Technical Reports Server (NTRS)

    Huang, Norden E.; Hu, Kun; Yang, Albert C. C.; Chang, Hsing-Chih; Jia, Deng; Liang, Wei-Kuang; Yeh, Jia Rong; Kao, Chu-Lan; Juan, Chi-Huang; Peng, Chung Kang

    2016-01-01

    The Holo-Hilbert spectral analysis (HHSA) method is introduced to cure the deficiencies of traditional spectral analysis and to give a full informational representation of nonlinear and non-stationary data. It uses a nested empirical mode decomposition and Hilbert-Huang transform (HHT) approach to identify intrinsic amplitude and frequency modulations often present in nonlinear systems. Comparisons are first made with traditional spectrum analysis, which usually achieved its results through convolutional integral transforms based on additive expansions of an a priori determined basis, mostly under linear and stationary assumptions. Thus, for non-stationary processes, the best one could do historically was to use the time- frequency representations, in which the amplitude (or energy density) variation is still represented in terms of time. For nonlinear processes, the data can have both amplitude and frequency modulations (intra-mode and inter-mode) generated by two different mechanisms: linear additive or nonlinear multiplicative processes. As all existing spectral analysis methods are based on additive expansions, either a priori or adaptive, none of them could possibly represent the multiplicative processes. While the earlier adaptive HHT spectral analysis approach could accommodate the intra-wave nonlinearity quite remarkably, it remained that any inter-wave nonlinear multiplicative mechanisms that include cross-scale coupling and phase-lock modulations were left untreated. To resolve the multiplicative processes issue, additional dimensions in the spectrum result are needed to account for the variations in both the amplitude and frequency modulations simultaneously. HHSA accommodates all the processes: additive and multiplicative, intra-mode and inter-mode, stationary and nonstationary, linear and nonlinear interactions. The Holo prefix in HHSA denotes a multiple dimensional representation with both additive and multiplicative capabilities.
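    A one-line example of the additive-versus-multiplicative distinction drawn here (a generic textbook identity, not taken from the paper): a multiplicatively modulated carrier

        x(t) = \left[1 + m\cos(\Omega t)\right]\cos(\omega t) = \cos(\omega t) + \frac{m}{2}\cos\big((\omega+\Omega)t\big) + \frac{m}{2}\cos\big((\omega-\Omega)t\big)

    appears in an additive (Fourier) expansion only indirectly, as sidebands at \omega \pm \Omega. Recovering the modulating frequency \Omega as a frequency in its own right requires an additional spectral dimension over the amplitude modulation, which is what the nested HHSA construction described above supplies.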

  1. On Holo-Hilbert spectral analysis: a full informational spectral representation for nonlinear and non-stationary data

    PubMed Central

    Huang, Norden E.; Hu, Kun; Yang, Albert C. C.; Chang, Hsing-Chih; Jia, Deng; Liang, Wei-Kuang; Yeh, Jia Rong; Kao, Chu-Lan; Juan, Chi-Hung; Peng, Chung Kang; Meijer, Johanna H.; Wang, Yung-Hung; Long, Steven R.; Wu, Zhauhua

    2016-01-01

    The Holo-Hilbert spectral analysis (HHSA) method is introduced to cure the deficiencies of traditional spectral analysis and to give a full informational representation of nonlinear and non-stationary data. It uses a nested empirical mode decomposition and Hilbert–Huang transform (HHT) approach to identify intrinsic amplitude and frequency modulations often present in nonlinear systems. Comparisons are first made with traditional spectrum analysis, which usually achieved its results through convolutional integral transforms based on additive expansions of an a priori determined basis, mostly under linear and stationary assumptions. Thus, for non-stationary processes, the best one could do historically was to use the time–frequency representations, in which the amplitude (or energy density) variation is still represented in terms of time. For nonlinear processes, the data can have both amplitude and frequency modulations (intra-mode and inter-mode) generated by two different mechanisms: linear additive or nonlinear multiplicative processes. As all existing spectral analysis methods are based on additive expansions, either a priori or adaptive, none of them could possibly represent the multiplicative processes. While the earlier adaptive HHT spectral analysis approach could accommodate the intra-wave nonlinearity quite remarkably, it remained that any inter-wave nonlinear multiplicative mechanisms that include cross-scale coupling and phase-lock modulations were left untreated. To resolve the multiplicative processes issue, additional dimensions in the spectrum result are needed to account for the variations in both the amplitude and frequency modulations simultaneously. HHSA accommodates all the processes: additive and multiplicative, intra-mode and inter-mode, stationary and non-stationary, linear and nonlinear interactions. The Holo prefix in HHSA denotes a multiple dimensional representation with both additive and multiplicative capabilities. PMID:26953180

  2. Conjoint Analysis: A Study of the Effects of Using Person Variables.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    Three statistical techniques--conjoint analysis, a multiple linear regression model, and a multiple linear regression model with a surrogate person variable--were used to estimate the relative importance of five university attributes for students in the process of selecting a college. The five attributes include: availability and variety of…

  3. Neuropsychological Predictors of Math Calculation and Reasoning in School-Aged Children

    ERIC Educational Resources Information Center

    Schneider, Dana Lynn

    2012-01-01

    After multiple reviews of the literature, which documented that multiple cognitive processes may be involved in mathematics ability and disability, Geary (1993) proposed a model that included three subtypes of math disability: Semantic, Procedural, and Visuospatial. A review of the extant literature produced three studies that examined Geary's…

  4. Using Comparison of Multiple Strategies in the Mathematics Classroom: Lessons Learned and Next Steps

    ERIC Educational Resources Information Center

    Durkin, Kelley; Star, Jon R.; Rittle-Johnson, Bethany

    2017-01-01

    Comparison is a fundamental cognitive process that can support learning in a variety of domains, including mathematics. The current paper aims to summarize empirical findings that support recommendations on using comparison of multiple strategies in mathematics classrooms. We report the results of our classroom-based research on using comparison…

  5. A Framework for Understanding Young Children with Severe Multiple Disabilities: The van Dijk Approach to Assessment.

    ERIC Educational Resources Information Center

    Nelson, Catherine; van Dijk, Jan; McDonnell, Andrea P.; Thompson, Kristina

    2002-01-01

    This article describes a framework for assessing young children with severe multiple disabilities. The assessment is child-led and examines underlying processes of learning, including biobehavioral state, orienting response, learning channels, approach-withdrawal, memory, interactions, communication, and problem solving. Case studies and a sample…

  6. A novel all-optical label processing based on multiple optical orthogonal codes sequences for optical packet switching networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chongfu; Qiu, Kun; Xu, Bo; Ling, Yun

    2008-05-01

    This paper proposes an all-optical label processing scheme that uses the multiple optical orthogonal codes sequences (MOOCS)-based optical label for optical packet switching (OPS) (MOOCS-OPS) networks. In this scheme, each MOOCS is a permutation or combination of the multiple optical orthogonal codes (MOOC) selected from the multiple-groups optical orthogonal codes (MGOOC). Following a comparison of different optical label processing (OLP) schemes, the principles of MOOCS-OPS network are given and analyzed. Firstly, theoretical analyses are used to prove that MOOCS is able to greatly enlarge the number of available optical labels when compared to the previous single optical orthogonal code (SOOC) for OPS (SOOC-OPS) network. Then, the key units of the MOOCS-based optical label packets, including optical packet generation, optical label erasing, optical label extraction and optical label rewriting etc., are given and studied. These results are used to verify that the proposed MOOCS-OPS scheme is feasible.
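    A back-of-the-envelope count of why sequences of codes enlarge the label space (illustrative numbers only; the paper's actual code parameters are not reproduced here), assuming Python 3.8+:

    ```python
    from math import comb, perm

    M, L = 8, 3   # hypothetical: 8 optical orthogonal codes in a group, labels built from 3 of them
    print("single-code (SOOC) labels:  ", M)           # one code per label
    print("combination (MOOCS) labels: ", comb(M, L))  # unordered selections of L distinct codes
    print("permutation (MOOCS) labels: ", perm(M, L))  # ordered sequences of L distinct codes
    ```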

  7. Exciton multiplication from first principles.

    PubMed

    Jaeger, Heather M; Hyeon-Deuk, Kim; Prezhdo, Oleg V

    2013-06-18

    Third-generation photovoltaics require demanding cost and power conversion efficiency standards, which may be achieved through efficient exciton multiplication. Therefore, generating more than one electron-hole pair from the absorption of a single photon has vast ramifications for solar power conversion technology. Unlike their bulk counterparts, irradiated semiconductor quantum dots exhibit efficient exciton multiplication, due to confinement-enhanced Coulomb interactions and slower nonradiative losses. The exact characterization of the complicated photoexcited processes within quantum-dot photovoltaics is a work in progress. In this Account, we focus on the photophysics of nanocrystals and investigate three constituent processes of exciton multiplication, including photoexcitation, phonon-induced dephasing, and impact ionization. We quantify the role of each process in exciton multiplication through ab initio computation and analysis of many-electron wave functions. The probability of observing a multiple exciton in a photoexcited state is proportional to the magnitude of electron correlation, where correlated electrons can be simultaneously promoted across the band gap. Energies of multiple excitons are determined directly from the excited state wave functions, defining the threshold for multiple exciton generation. This threshold is strongly perturbed in the presence of surface defects, dopants, and ionization. Within a few femtoseconds following photoexcitation, the quantum state loses coherence through interactions with the vibrating atomic lattice. The phase relationship between single excitons and multiple excitons dissipates first, followed by multiple exciton fission. Single excitons are coupled to multiple excitons through Coulomb and electron-phonon interactions, and as a consequence, single excitons convert to multiple excitons and vice versa. Here, exciton multiplication depends on the initial energy and coupling magnitude and competes with electron-phonon energy relaxation. Multiple excitons are generated through impact ionization within picoseconds. The basis of exciton multiplication in quantum dots is the collective result of photoexcitation, dephasing, and nonadiabatic evolution. Each process is characterized by a distinct time-scale, and the overall multiple exciton generation dynamics is complete by about 10 ps. Without relying on semiempirical parameters, we computed quantum mechanical probabilities of multiple excitons for small model systems. Because exciton correlations and coherences are microscopic quantum properties, results for small model systems can be extrapolated to larger, realistic quantum dots.

  8. In line monitoring of the preparation of water-in-oil-in-water (W/O/W) type multiple emulsions via dielectric spectroscopy.

    PubMed

    Beer, Sebastian; Dobler, Dorota; Gross, Alexander; Ost, Martin; Elseberg, Christiane; Maeder, Ulf; Schmidts, Thomas Michael; Keusgen, Michael; Fiebich, Martin; Runkel, Frank

    2013-01-30

    Multiple emulsions offer various applications in a wide range of fields such as pharmaceutics, cosmetics and food technology. Two features, encapsulation efficiency and prolonged stability, are known to have a great influence on multiple emulsion quality and utility. To achieve prolonged stability, the production of the emulsions has to be observed and controlled, preferably in line. In line measurements provide available parameters in a short time frame without the need for the sample to be removed from the process stream, thereby enabling continuous process control. In this study, information about the physical state of multiple emulsions obtained from dielectric spectroscopy (DS) is evaluated for this purpose. Results from dielectric measurements performed in line during the production cycle are compared to theoretically expected results and to well established off line measurements. Thus, a first step to include the production of multiple emulsions into the process analytical technology (PAT) guidelines of the Food and Drug Administration (FDA) is achieved. DS proved to be beneficial in determining the crucial stopping criterion, which is essential in the production of multiple emulsions. Stopping the process at a less-than-ideal point can severely lower the encapsulation efficiency and the stability, thereby lowering the quality of the emulsion. DS is also expected to provide further information about the multiple emulsion, such as encapsulation efficiency. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Resistance of Listeria monocytogenes biofilms to sanitizing agents

    USDA-ARS?s Scientific Manuscript database

    Listeria monocytogenes is notorious for its capacity to colonize the environment and equipment of food processing facilities and to persist in the processing plant ecosystem, sometimes for decades. Such persistence is mediated by multiple attributes of L. monocytogenes, including the pathogen’s capa...

  10. Connecting Photosynthesis and Cellular Respiration: Preservice Teachers' Conceptions

    ERIC Educational Resources Information Center

    Brown, Mary H.; Schwartz, Renee S.

    2009-01-01

    The biological processes of photosynthesis and plant cellular respiration include multiple biochemical steps, occur simultaneously within plant cells, and share common molecular components. Yet, learners often compartmentalize functions and specialization of cell organelles relevant to these two processes, without considering the interconnections…

  11. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models. Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.

  12. Techniques for improving transients in learning control systems

    NASA Technical Reports Server (NTRS)

    Chang, C.-K.; Longman, Richard W.; Phan, Minh

    1992-01-01

    A discrete modern control formulation is used to study the nature of the transient behavior of the learning process during repetitions. Several alternative learning control schemes are developed to improve the transient performance. These include a new method using an alternating sign on the learning gain, which is very effective in limiting peak transients and also very useful in multiple-input, multiple-output systems. Other methods include learning at an increasing number of points progressing with time, or an increasing number of points of increasing density.

  13. Manufacturing Technology Support (MATES). Task Order 0021: Air Force Technology and Industrial Base Research and Analysis, Subtask Order 06: Direct Digital Manufacturing

    DTIC Science & Technology

    2011-08-01

    industries and key players providing equipment include Flow and OMAX. The decision tree for waterjet machining is shown in Figure 28. Figure 28...about the melt pool. Process parameters including powder flow, laser power, and scan speed are adjusted accordingly • Multiple materials o BD...project.eu.com/home/home_page_static.jsp o Working with multiple partners; one is Cochlear. Using LMD or SLM to fabricate cochlear implants with 10

  14. Data-Base Software For Tracking Technological Developments

    NASA Technical Reports Server (NTRS)

    Aliberti, James A.; Wright, Simon; Monteith, Steve K.

    1996-01-01

    Technology Tracking System (TechTracS) computer program developed for use in storing and retrieving information on technology and related patent information developed under auspices of NASA Headquarters and NASA's field centers. Contents of data base include multiple scanned still images and quick-time movies as well as text. TechTracS includes word-processing, report-editing, chart-and-graph-editing, and search-editing subprograms. Extensive keyword searching capabilities enable rapid location of technologies, innovators, and companies. System performs routine functions automatically and serves multiple users.

  15. Multiplicative processes in visual cognition

    NASA Astrophysics Data System (ADS)

    Credidio, H. F.; Teixeira, E. N.; Reis, S. D. S.; Moreira, A. A.; Andrade, J. S.

    2014-03-01

    The Central Limit Theorem (CLT) is certainly one of the most important results in the field of statistics. The simple fact that the addition of many random variables can generate the same probability curve elucidated the underlying process for a broad spectrum of natural systems, ranging from the statistical distribution of human heights to the distribution of measurement errors, to mention a few. An extension of the CLT can be applied to multiplicative processes, where a given measure is the result of the product of many random variables. The statistical signature of these processes is rather ubiquitous, appearing in a diverse range of natural phenomena, including the distributions of incomes, body weights, rainfall, and fragment sizes in a rock crushing process. Here we corroborate results from previous studies which indicate the presence of multiplicative processes in a particular type of visual cognition task, namely, the visual search for hidden objects. Specifically, our results from eye-tracking experiments show that the distribution of fixation times during visual search obeys a log-normal pattern, while the fixational radii of gyration follow a power-law behavior.
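
    A short simulation, assuming only NumPy, of the multiplicative analogue of the CLT described above: the product of many independent positive factors is approximately log-normal, so its logarithm looks Gaussian. This illustrates the statistical signature only, not the authors' eye-tracking analysis.

        # Products of many positive random factors converge to a log-normal
        # distribution, the multiplicative analogue of the Central Limit Theorem.
        import numpy as np

        rng = np.random.default_rng(0)
        n_samples, n_factors = 100_000, 50

        # Each sample is the product of 50 independent positive factors.
        factors = rng.uniform(0.5, 1.5, size=(n_samples, n_factors))
        products = factors.prod(axis=1)

        # If the products are log-normal, their logarithms should look Gaussian.
        logs = np.log(products)
        print("skewness of log(product):", ((logs - logs.mean())**3).mean() / logs.std()**3)
        print("mean, std of log(product):", logs.mean(), logs.std())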

  16. An Advanced Multiple Alternatives Modeling Formulation for Determining Graduated Fiscal Support Strategies for Operational and Planned Educational Programs.

    ERIC Educational Resources Information Center

    Wholeben, Brent Edward

    A rationale is presented for viewing the decision-making process inherent in determining budget reductions for educational programs as most effectively modeled by a graduated funding approach. The major tenets of the graduated budget reduction approach to educational fiscal policy include the development of multiple alternative reduction plans, or…

  17. Parallel processing considerations for image recognition tasks

    NASA Astrophysics Data System (ADS)

    Simske, Steven J.

    2011-01-01

    Many image recognition tasks are well-suited to parallel processing. The most obvious example is that many imaging tasks require the analysis of multiple images. From this standpoint, then, parallel processing need be no more complicated than assigning individual images to individual processors. However, there are three less trivial categories of parallel processing that will be considered in this paper: parallel processing (1) by task; (2) by image region; and (3) by meta-algorithm. Parallel processing by task allows the assignment of multiple workflows (as diverse as optical character recognition [OCR], document classification, and barcode reading) to parallel pipelines. This can substantially decrease time to completion for the document tasks. For this approach, each parallel pipeline is generally performing a different task. Parallel processing by image region allows a larger imaging task to be sub-divided into a set of parallel pipelines, each performing the same task but on a different data set. This type of image analysis is readily addressed by a map-reduce approach. Examples include document skew detection and multiple face detection and tracking. Finally, parallel processing by meta-algorithm allows different algorithms to be deployed on the same image simultaneously. This approach may result in improved accuracy.
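
    A hedged sketch of the second category, parallel processing by image region: the same task runs on independent tiles of one image (map), and the per-tile results are combined (reduce). The tile size and the toy gradient-energy task are placeholders, not from the paper.

        # Parallel processing by image region: one task applied to independent
        # tiles of a single image, followed by a reduction over the tile results.
        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def tile_task(tile):
            gy, gx = np.gradient(tile.astype(float))
            return float(np.hypot(gx, gy).sum())   # per-region "edge energy"

        def split_tiles(image, tile=256):
            h, w = image.shape
            return [image[r:r + tile, c:c + tile]
                    for r in range(0, h, tile) for c in range(0, w, tile)]

        if __name__ == "__main__":
            image = np.random.default_rng(1).integers(0, 256, size=(1024, 1024), dtype=np.uint8)
            with ProcessPoolExecutor() as pool:            # map step: one tile per worker
                partial = list(pool.map(tile_task, split_tiles(image)))
            print("total edge energy:", sum(partial))      # reduce step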

  18. CHALLENGES OF PROCESSING BIOLOGICAL DATA FOR INCORPORATION INTO A LAKE EUTROPHICATION MODEL

    EPA Science Inventory

    A eutrophication model is in development as part of the Lake Michigan Mass Balance Project (LMMBP). Successful development and calibration of this model required the processing and incorporation of extensive biological data. Data were drawn from multiple sources, including nutrie...

  19. The retrovirus RNA trafficking granule: from birth to maturity

    PubMed Central

    Cochrane, Alan W; McNally, Mark T; Mouland, Andrew J

    2006-01-01

    Post-transcriptional events in the life of an RNA including RNA processing, transport, translation and metabolism are characterized by the regulated assembly of multiple ribonucleoprotein (RNP) complexes. At each of these steps, there is the engagement and disengagement of RNA-binding proteins until the RNA reaches its final destination. For retroviral genomic RNA, the final destination is the capsid. Numerous studies have provided crucial information about these processes and serve as the basis for studies on the intracellular fate of retroviral RNA. Retroviral RNAs are like cellular mRNAs but their processing is more tightly regulated by multiple cis-acting sequences and the activities of many trans-acting proteins. This review describes the viral and cellular partners that retroviral RNA encounters during its maturation that begins in the nucleus, focusing on important events including splicing, 3' end-processing, RNA trafficking from the nucleus to the cytoplasm and finally, mechanisms that lead to its compartmentalization into progeny virions. PMID:16545126

  20. SIG: a general-purpose signal processing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, D.; Azevedo, S.

    1986-02-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. It also accommodates other representations for data such as transfer function polynomials. Signal processing operations include digital filtering, auto/cross spectral density, transfer function/impulse response, convolution, Fourier transform, and inverse Fourier transform. Graphical operations provide display of signals and spectra, including plotting, cursor zoom, families of curves, and multiple viewport plots. SIG provides two user interfaces with a menu mode for occasional users and a command mode for more experienced users. Capability exists for multiple commands per line, command files with arguments, commenting lines, defining commands, automatic execution for each item in a repeat sequence, etc. SIG is presently available for VAX (VMS), VAX (BERKELEY 4.2 UNIX), SUN (BERKELEY 4.2 UNIX), DEC-20 (TOPS-20), LSI-11/23 (TSX), and DEC PRO 350 (TSX). 4 refs., 2 figs.
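
    SIG itself is a command-driven program for the platforms listed above; the snippet below is only a modern NumPy/SciPy illustration of two of the listed operations (digital filtering and auto-spectral density), not SIG's command syntax.

        # Illustration of digital filtering and auto-spectral density, two of the
        # signal processing operations named in the abstract (not SIG commands).
        import numpy as np
        from scipy import signal

        fs = 1000.0                                   # sample rate, Hz
        t = np.arange(0, 2.0, 1 / fs)
        x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

        b, a = signal.butter(4, 100, btype="low", fs=fs)       # digital low-pass filter
        filtered = signal.filtfilt(b, a, x)

        freqs, psd = signal.welch(filtered, fs=fs, nperseg=1024)  # auto-spectral density
        print("spectral peak near", freqs[np.argmax(psd)], "Hz")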

  1. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
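
    A toy sketch of the final integration step, simple model averaging with equal weights; the candidate estimates, standard errors, and weights are invented and are not from the Iberian hake assessment.

        # Equal-weight model averaging across candidate stock assessment models.
        # Total variance combines within-model and between-model components.
        import numpy as np

        # Hypothetical spawning stock biomass estimates (kt) and standard errors from
        # candidate models that differ in growth / natural mortality / catchability assumptions.
        estimates = np.array([52.0, 61.5, 48.3, 57.2])
        std_errs  = np.array([ 6.0,  8.5,  5.1,  7.0])
        weights   = np.full(len(estimates), 1.0 / len(estimates))

        mean = np.sum(weights * estimates)
        variance = np.sum(weights * (std_errs**2 + (estimates - mean)**2))
        print(f"model-averaged SSB: {mean:.1f} kt (sd {np.sqrt(variance):.1f})")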

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damtie, Fikeraddis A., E-mail: Fikeraddis.Damtie@teorfys.lu.se; Wacker, Andreas, E-mail: Andreas.Wacker@fysik.lu.se; Karki, Khadga J., E-mail: Khadga.Karki@chemphys.lu.se

    Multiple exciton generation (MEG) is a process in which more than one electron-hole pair is generated per absorbed photon. It allows us to increase the efficiency of solar energy harvesting. Experimental studies have shown a multiple exciton generation yield of 1.2 in isolated colloidal quantum dots. However, real photoelectric devices require the extraction of electron-hole pairs to electric contacts. We provide a systematic study of the corresponding quantum coherent processes, including extraction and injection, and show that a proper design of extraction and injection rates enhances the yield significantly, up to values around 1.6.

  3. Living with Multiple Health Problems: What Older Adults Should Know

    MedlinePlus

    ... treatments may take longer than others to show benefits. You should also decide if you want to make all of your care decisions on your own or include others in the decision-making process. These can include spouses, family members, or ...

  4. Constraints and Approach for Selecting the Mars Surveyor '01 Landing Site

    NASA Technical Reports Server (NTRS)

    Golombek, M.; Bridges, N.; Gilmore, M.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; Smith, J.; Weitz, C.

    1999-01-01

    There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough and defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities.

  5. Constraints, Approach and Present Status for Selecting the Mars Surveyor 2001 Landing Site

    NASA Technical Reports Server (NTRS)

    Golombek, M.; Anderson, F.; Bridges, N.; Briggs, G.; Gilmore, M.; Gulick, V.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; hide

    1999-01-01

    There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough, defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities.

  6. Control of aliphatic halogenated DBP precursors with multiple drinking water treatment processes: Formation potential and integrated toxicity.

    PubMed

    Zhang, Yimeng; Chu, Wenhai; Yao, Dechang; Yin, Daqiang

    2017-08-01

    The comprehensive control efficiency for the formation potentials (FPs) of a range of regulated and unregulated halogenated disinfection by-products (DBPs) (including carbonaceous DBPs (C-DBPs), nitrogenous DBPs (N-DBPs), and iodinated DBPs (I-DBPs)) with the multiple drinking water treatment processes, including pre-ozonation, conventional treatment (coagulation-sedimentation, pre-sand filtration), ozone-biological activated carbon (O3-BAC) advanced treatment, and post-sand filtration, was investigated. The potential toxic risks of DBPs, obtained by combining their FPs and toxicity values, were also evaluated. The results showed that the multiple drinking water treatment processes had superior performance in removing organic/inorganic precursors and reducing the formation of a range of halogenated DBPs. Therein, ozonation significantly removed bromide and iodide, and thus reduced the formation of brominated and iodinated DBPs. The removal of organic carbon and nitrogen precursors by the conventional treatment processes was substantially improved by O3-BAC advanced treatment, and thus prevented the formation of chlorinated C-DBPs and N-DBPs. However, BAC filtration leads to the increased formation of brominated C-DBPs and N-DBPs due to the increase of bromide/DOC and bromide/DON. After the whole multiple treatment processes, the rank order for integrated toxic risk values caused by these halogenated DBPs was haloacetonitriles (HANs)≫haloacetamides (HAMs)>haloacetic acids (HAAs)>trihalomethanes (THMs)>halonitromethanes (HNMs)≫I-DBPs (I-HAMs and I-THMs). I-DBPs failed to cause high integrated toxic risk because of their very low FPs. The significantly higher integrated toxic risk value caused by HANs compared with other halogenated DBPs cannot be ignored. Copyright © 2017. Published by Elsevier B.V.

  7. The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing

    DTIC Science & Technology

    2017-08-01

    access to the GPU for general purpose processing. CUDA is designed to work easily with multiple programming languages, including Fortran. CUDA is a...

  8. Noise limitations in optical linear algebra processors.

    PubMed

    Batsell, S G; Jong, T L; Walkup, J F; Krile, T F

    1990-05-10

    A general statistical noise model is presented for optical linear algebra processors. A statistical analysis which includes device noise, the multiplication process, and the addition operation is undertaken. We focus on those processes which are architecturally independent. Finally, experimental results which verify the analytical predictions are also presented.

  9. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. I. Formal theory

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We derive a formal theory of noisy Poisson processes with multiple outcomes. We obtain simple, compact expressions for the probability distribution function of arbitrarily complex composite events and its moments. We illustrate the utility of the theory by analyzing properties of coincidence and covariance photoelectron-photoion detection involving single-ionization events. The results and techniques introduced in this work are directly applicable to more general coincidence and covariance experiments, including multiple ionization and multiple-ion fragmentation pathways.
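
    A Monte Carlo sketch of the kind of process analyzed: per-shot counts in two channels share a Poisson-distributed correlated component plus independent Poisson backgrounds, and the shot-to-shot covariance recovers the correlated rate. The rates are invented, and this is not the authors' formal derivation.

        # Covariance detection: counts in two channels share a common Poisson
        # "true coincidence" component plus independent Poisson backgrounds.
        import numpy as np

        rng = np.random.default_rng(42)
        n_shots = 200_000
        rate_true, rate_bg_e, rate_bg_i = 0.05, 0.30, 0.20   # mean counts per shot

        true = rng.poisson(rate_true, n_shots)                # correlated events
        electrons = true + rng.poisson(rate_bg_e, n_shots)    # channel 1
        ions      = true + rng.poisson(rate_bg_i, n_shots)    # channel 2

        cov = np.cov(electrons, ions)[0, 1]
        print(f"estimated correlated rate: {cov:.4f} (true value {rate_true})")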

  10. Utilization of multiple-criteria decision analysis (MCDA) to support healthcare decision-making FIFARMA, 2016

    PubMed Central

    Drake, Julia I.; de Hart, Juan Carlos Trujillo; Monleón, Clara; Toro, Walter; Valentim, Joice

    2017-01-01

    Background and objectives: MCDA is a decision-making tool with increasing use in the healthcare sector, including HTA (Health Technology Assessment). By applying multiple criteria, including innovation, in a comprehensive, structured and explicit manner, MCDA fosters a transparent, participative, consistent decision-making process that takes into consideration the values of all stakeholders. This paper by FIFARMA (Latin American Federation of Pharmaceutical Industry) proposes the deliberative (partial) MCDA as a more pragmatic, agile approach, especially when newly implemented. Methods: Literature review including real-world examples of effective MCDA implementation in healthcare decision making in both the public and private sector worldwide and in LA. Results and conclusion: It is the view of FIFARMA that MCDA should strongly be considered as a tool to support HTA and broader healthcare decision making, such as the contracts and tenders process, in order to foster transparency, fairness, and collaboration amongst stakeholders. PMID:29081919
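
    For illustration only, a minimal weighted-sum scoring sketch, one common MCDA aggregation; it is not the deliberative (partial) MCDA approach proposed by FIFARMA, and the criteria, weights, and scores below are invented.

        # Weighted-sum MCDA: each alternative's criterion scores are combined
        # using explicit weights agreed by the stakeholder panel.
        criteria_weights = {"clinical benefit": 0.4, "innovation": 0.2,
                            "budget impact": 0.25, "equity": 0.15}

        # Each alternative is scored 0-10 per criterion (hypothetical values).
        alternatives = {
            "therapy A": {"clinical benefit": 8, "innovation": 9, "budget impact": 4, "equity": 6},
            "therapy B": {"clinical benefit": 6, "innovation": 5, "budget impact": 8, "equity": 7},
        }

        for name, scores in alternatives.items():
            total = sum(criteria_weights[c] * scores[c] for c in criteria_weights)
            print(f"{name}: weighted score {total:.2f}")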

  11. Role of Proangiogenic Factors in Immunopathogenesis of Multiple Sclerosis.

    PubMed

    Hamid, Kabir Magaji; Mirshafiey, Abbas

    2016-02-01

    Angiogenesis is a complex and balanced process in which new blood vessels form from preexisting ones by sprouting, splitting, growth and remodeling. This phenomenon plays a vital role in many physiological and pathological processes. However, disturbance of this physiological process can play a role in the pathogenesis of some chronic inflammatory diseases, including multiple sclerosis (MS) in humans and its animal model. Although the relation between abnormal blood vessels and MS lesions was established in previous studies, the role of pathological angiogenesis remains unclear. In this study, the link between proangiogenic factors and multiple sclerosis pathogenesis was examined by conducting a systematic review. Thus, we searched the English medical literature via the PubMed, ISI Web of Knowledge, Medline and Virtual Health Library (VHL) databases. In this review, we describe direct and indirect roles of some proangiogenic factors in MS pathogenesis and report the association of these factors with pathological and inflammatory angiogenesis.

  12. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R.; Bingham, Philip R.

    2006-10-03

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
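
    A hedged sketch of the Fourier-space steps described in the claim, assuming NumPy: transform the hologram, shift the spectrum so the spatial-heterodyne carrier sits at the origin, apply a digital filter that cuts the remaining terms (including the original origin), and inverse transform. The carrier location and filter radius are placeholders, not values from the patent.

        # Fourier-space demodulation of a spatially-heterodyned hologram (sketch).
        import numpy as np

        def demodulate(hologram, carrier_row, carrier_col, filter_radius):
            spectrum = np.fft.fft2(hologram)
            # Shift so the carrier peak sits at the new origin of Fourier space.
            shifted = np.roll(spectrum, shift=(-carrier_row, -carrier_col), axis=(0, 1))
            # Digital filter: keep a disk around the new origin, cutting the other terms.
            rows, cols = hologram.shape
            fy = np.fft.fftfreq(rows)[:, None] * rows
            fx = np.fft.fftfreq(cols)[None, :] * cols
            mask = (fy**2 + fx**2) <= filter_radius**2
            return np.fft.ifft2(shifted * mask)    # complex image: amplitude and phase

        # Example with a synthetic 512x512 hologram and an assumed carrier at (64, 64).
        holo = np.random.default_rng(0).random((512, 512))
        obj_wave = demodulate(holo, carrier_row=64, carrier_col=64, filter_radius=40)
        print(obj_wave.shape, obj_wave.dtype)

    Because the two holograms share a single digital image, the same routine would be run twice with the two different carrier frequencies set by the two reference/object beam angles.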

  13. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-09-09

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.

  14. Method and apparatus for optimized processing of sparse matrices

    DOEpatents

    Taylor, Valerie E.

    1993-01-01

    A computer architecture for processing a sparse matrix is disclosed. The apparatus stores a value-row vector corresponding to nonzero values of a sparse matrix. Each of the nonzero values is located at a defined row and column position in the matrix. The value-row vector includes a first vector including nonzero values and delimiting characters indicating a transition from one column to another. The value-row vector also includes a second vector which defines row position values in the matrix corresponding to the nonzero values in the first vector and column position values in the matrix corresponding to the column position of the nonzero values in the first vector. The architecture also includes a circuit for detecting a special character within the value-row vector. Matrix-vector multiplication is executed on the value-row vector. This multiplication is performed by multiplying an index value of the first vector value by a column value from a second matrix to form a matrix-vector product which is added to a previous matrix-vector product.
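
    The patent's value-row encoding with delimiting characters is specific to that architecture; as a generic illustration of the underlying technique, here is a sparse matrix-vector multiply in the common CSR layout (values, column indices, row pointers). This is a sketch of the general idea, not the patented format.

        # Sparse matrix-vector multiply over a CSR-encoded matrix: only the stored
        # nonzeros are multiplied, and each partial product is accumulated per row.
        def csr_matvec(values, col_idx, row_ptr, x):
            y = [0.0] * (len(row_ptr) - 1)
            for row in range(len(y)):
                for k in range(row_ptr[row], row_ptr[row + 1]):
                    y[row] += values[k] * x[col_idx[k]]
            return y

        # 3x3 matrix [[1, 0, 2], [0, 3, 0], [4, 0, 5]] stored as CSR.
        values  = [1.0, 2.0, 3.0, 4.0, 5.0]
        col_idx = [0, 2, 1, 0, 2]
        row_ptr = [0, 2, 3, 5]
        print(csr_matvec(values, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # -> [3.0, 3.0, 9.0]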

  15. PrimerDesign-M: A multiple-alignment based multiple-primer design tool for walking across variable genomes

    DOE PAGES

    Yoon, Hyejin; Leitner, Thomas

    2014-12-17

    Analyses of entire viral genomes or mtDNA require comprehensive design of many primers across their genomes. In addition, simultaneous optimization of several DNA primer design criteria may improve overall experimental efficiency and downstream bioinformatic processing. To achieve these goals, we developed PrimerDesign-M. It includes several options for multiple-primer design, allowing researchers to efficiently design walking primers that cover long DNA targets, such as entire HIV-1 genomes, and it optimizes primers simultaneously, informed by genetic diversity in multiple alignments and by experimental design constraints given by the user. PrimerDesign-M can also design primers that include DNA barcodes and minimize primer dimerization. PrimerDesign-M finds optimal primers for highly variable DNA targets and facilitates design flexibility by suggesting alternative designs to adapt to experimental conditions.

  16. Sustained multifocal attentional enhancement of stimulus processing in early visual areas predicts tracking performance.

    PubMed

    Störmer, Viola S; Winther, Gesche N; Li, Shu-Chen; Andersen, Søren K

    2013-03-20

    Keeping track of multiple moving objects is an essential ability of visual perception. However, the mechanisms underlying this ability are not well understood. We instructed human observers to track five or seven independent randomly moving target objects amid identical nontargets and recorded steady-state visual evoked potentials (SSVEPs) elicited by these stimuli. Visual processing of moving targets, as assessed by SSVEP amplitudes, was continuously facilitated relative to the processing of identical but irrelevant nontargets. The cortical sources of this enhancement were localized to areas including early visual cortex V1-V3 and motion-sensitive area MT, suggesting that the sustained multifocal attentional enhancement during multiple object tracking already operates at hierarchically early stages of visual processing. Consistent with this interpretation, the magnitude of attentional facilitation during tracking in a single trial predicted the speed of target identification at the end of the trial. Together, these findings demonstrate that attention can flexibly and dynamically facilitate the processing of multiple independent object locations in early visual areas and thereby allow for tracking of these objects.

  17. The role of speed versus working memory in predicting learning new information in multiple sclerosis.

    PubMed

    Chiaravalloti, Nancy D; Stojanovic-Radic, Jelena; DeLuca, John

    2013-01-01

    The most common cognitive impairments in multiple sclerosis (MS) have been documented in specific domains, including new learning and memory, working memory, and information processing speed. However, little attempt has been made to increase our understanding of their relationship to one another. While recent studies have shown that processing speed impacts new learning and memory abilities in MS, the role of working memory in this relationship has received less attention. The present study examines the relative contribution of impaired working memory versus processing speed in new learning and memory functions in MS. Participants consisted of 51 individuals with clinically definite MS. Participants completed two measures of processing speed, two measures of working memory, and two measures of episodic memory. Data were analyzed via correlational and multiple regression analysis. Results indicate that the variance in new learning abilities in this sample was primarily associated with processing speed, with working memory exerting much less of an influence. Results are discussed in terms of the role of cognitive rehabilitation of new learning and memory abilities in persons with MS.

  18. KAMO: towards automated data processing for microcrystals.

    PubMed

    Yamashita, Keitaro; Hirata, Kunio; Yamamoto, Masaki

    2018-05-01

    In protein microcrystallography, radiation damage often hampers complete and high-resolution data collection from a single crystal, even under cryogenic conditions. One promising solution is to collect small wedges of data (5-10°) separately from multiple crystals. The data from these crystals can then be merged into a complete reflection-intensity set. However, data processing of multiple small-wedge data sets is challenging. Here, a new open-source data-processing pipeline, KAMO, which utilizes existing programs, including the XDS and CCP4 packages, has been developed to automate whole data-processing tasks in the case of multiple small-wedge data sets. Firstly, KAMO processes individual data sets and collates those indexed with equivalent unit-cell parameters. The space group is then chosen and any indexing ambiguity is resolved. Finally, clustering is performed, followed by merging with outlier rejections, and a report is subsequently created. Using synthetic and several real-world data sets collected from hundreds of crystals, it was demonstrated that merged structure-factor amplitudes can be obtained in a largely automated manner using KAMO, which greatly facilitated the structure analyses of challenging targets that only produced microcrystals.

  19. Enhancing multiple disciplinary teamwork.

    PubMed

    Weaver, Terri E

    2008-01-01

    Multiple disciplinary research provides an opportunity to bring together investigators across disciplines to provide new views and develop innovative approaches to important questions. Through this shared experience, novel paradigms are formed, original frameworks are developed, and new language is generated. Integral to the successful construction of effective cross-disciplinary teams is the recognition of antecedent factors that affect the development of the team such as intrapersonal, social, physical environmental, organizational, and institutional influences. Team functioning is enhanced with well-developed behavioral, affective, interpersonal, and intellectual processes. Outcomes of effective multiple disciplinary research teams include novel ideas, integrative models, new training programs, institutional change, and innovative policies that can also influence the degree to which antecedents and processes contribute to team performance. Ongoing evaluation of team functioning and achievement of designated outcomes ensures the continued development of the multiple disciplinary team and confirmation of this approach as important to the advancement of science.

  20. PFLOTRAN: Reactive Flow & Transport Code for Use on Laptops to Leadership-Class Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan

    PFLOTRAN, a next-generation reactive flow and transport code for modeling subsurface processes, has been designed from the ground up to run efficiently on machines ranging from leadership-class supercomputers to laptops. Based on an object-oriented design, the code is easily extensible to incorporate additional processes. It can interface seamlessly with Fortran 9X, C and C++ codes. Domain decomposition parallelism is employed, with the PETSc parallel framework used to manage parallel solvers, data structures and communication. Features of the code include a modular input file, implementation of high-performance I/O using parallel HDF5, ability to perform multiple realization simulations with multiple processors per realization in a seamless manner, and multiple modes for multiphase flow and multicomponent geochemical transport. Chemical reactions currently implemented in the code include homogeneous aqueous complexing reactions and heterogeneous mineral precipitation/dissolution, ion exchange, surface complexation and a multirate kinetic sorption model. PFLOTRAN has demonstrated petascale performance using 2^17 processor cores with over 2 billion degrees of freedom. Accomplishments achieved to date include applications to the Hanford 300 Area and modeling CO2 sequestration in deep geologic formations.

  1. [Neural Mechanisms Underlying the Processing of Temporal Information in Episodic Memory and Its Disturbance].

    PubMed

    Iwata, Saeko; Tsukiura, Takashi

    2017-11-01

    Episodic memory is defined as memory for personally experienced events, and includes memory content and contextual information of time and space. Previous neuroimaging and neuropsychological studies have demonstrated three possible roles of the temporal context in episodic memory. First, temporal information contributes to the arrangement of temporal order for sequential events in episodic memory, and this process is involved in the lateral prefrontal cortex. The second possible role of temporal information in episodic memory is the segregation between memories of multiple events, which are segregated by cues of different time information. The role of segregation is associated with the orbitofrontal regions including the orbitofrontal cortex and basal forebrain region. Third, temporal information in episodic memory plays an important role in the integration of multiple components into a coherent episodic memory, in which episodic components in the different modalities are combined by temporal information as an index. The role of integration is mediated by the medial temporal lobe including the hippocampus and parahippocampal gyrus. Thus, temporal information in episodic memory could be represented in multiple stages, which are involved in a network of the lateral prefrontal, orbitofrontal, and medial temporal lobe regions.

  2. Noise reduction methods for nucleic acid and macromolecule sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuller, Ivan K.; Di Ventra, Massimiliano; Balatsky, Alexander

    Methods, systems, and devices are disclosed for processing macromolecule sequencing data with substantial noise reduction. In one aspect, a method for reducing noise in a sequential measurement of a macromolecule comprising serial subunits includes cross-correlating multiple measured signals of a physical property of subunits of interest of the macromolecule, the multiple measured signals including the time data associated with the measurement of the signal, to remove or at least reduce signal noise that is not in the same frequency and in phase with the systematic signal contribution of the measured signals.
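
    A toy sketch with synthetic data of the idea behind the claim: cross-correlating two repeated measurements of the same trace retains the component that is common in frequency and phase, while noise that is independent between the measurements decorrelates. The signal, noise levels, and sampling are invented for illustration.

        # Cross power spectrum of two repeated noisy measurements: the shared,
        # phase-coherent component dominates while independent noise averages down.
        import numpy as np

        rng = np.random.default_rng(3)
        t = np.linspace(0, 1, 2000)
        signal = np.sin(2 * np.pi * 25 * t)                 # systematic contribution

        trace1 = signal + rng.normal(0, 1.0, t.size)        # measurement 1
        trace2 = signal + rng.normal(0, 1.0, t.size)        # measurement 2

        cross = np.fft.rfft(trace1) * np.conj(np.fft.rfft(trace2))
        freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
        print("dominant frequency:", freqs[np.argmax(np.abs(cross[1:])) + 1], "Hz")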

  3. Developing dimensions for a multicomponent multidisciplinary approach to obesity management: a qualitative study.

    PubMed

    Cochrane, Anita J; Dick, Bob; King, Neil A; Hills, Andrew P; Kavanagh, David J

    2017-10-16

    There have been consistent recommendations for multicomponent and multidisciplinary approaches for obesity management. However, there is no clear agreement on the components, disciplines or processes to be considered within such an approach. In this study, we explored multicomponent and multidisciplinary approaches through an examination of knowledge, skills, beliefs, and recommendations of stakeholders involved in obesity management. These stakeholders included researchers, practitioners, educators, and patients. We used qualitative action research methods, including convergent interviewing and observation, to assist the process of inquiry. The consensus was that a multicomponent and multidisciplinary approach should be based on four central meta-components (patient, practitioner, process, and environmental factors), and specific components of these factors were identified. Psychologists, dieticians, exercise physiologists and general practitioners were nominated as key practitioners to be included. A complex condition like obesity requires that multiple components be addressed, and that both patients and multiple disciplines are involved in developing solutions. Implementing cycles of continuous improvement to deal with complexity, instead of trying to control for it, offers an effective way to deal with complex, changing multisystem problems like obesity.

  4. The relative contributions of processing speed and cognitive load to working memory accuracy in multiple sclerosis.

    PubMed

    Leavitt, Victoria M; Lengenfelder, Jean; Moore, Nancy B; Chiaravalloti, Nancy D; DeLuca, John

    2011-06-01

    Cognitive symptoms of multiple sclerosis (MS) include processing-speed deficits and working memory impairment. The precise manner in which these deficits interact in individuals with MS remains to be explicated. We hypothesized that providing more time on a complex working memory task would result in performance benefits for individuals with MS relative to healthy controls. Fifty-three individuals with clinically definite MS and 36 matched healthy controls performed a computerized task that systematically manipulated cognitive load. The interval between stimuli presentations was manipulated to provide increasing processing time. The results confirmed that individuals with MS who have processing-speed deficits significantly improve in performance accuracy when given additional time to process the information in working memory. Implications of these findings for developing appropriate cognitive rehabilitation interventions are discussed.

  5. A Quasi-2D Delta-growth Model Accounting for Multiple Avulsion Events, Validated by Robust Data from the Yellow River Delta, China

    NASA Astrophysics Data System (ADS)

    Moodie, A. J.; Nittrouer, J. A.; Ma, H.; Carlson, B.; Parker, G.

    2016-12-01

    The autogenic "life cycle" of a lowland fluvial channel building a deltaic lobe typically follows a temporal sequence that includes: channel initiation, progradation and aggradation, and abandonment via avulsion. In terms of modeling these processes, it is possible to use a one-dimensional (1D) morphodynamic scheme to capture the magnitude of the prograding and aggrading processes. These models can include algorithms to predict the timing and location of avulsions for a channel lobe. However, this framework falls short in its ability to evaluate the deltaic system beyond the time scale of a single channel, and assess sedimentation processes occurring on the floodplain, which is important for lobe building. Herein, we adapt a 1D model to explicitly account for multiple avulsions and therefore replicate a deltaic system that includes many lobe cycles. Following an avulsion, sediment on the floodplain and beyond the radially-averaged shoreline is redistributed across the delta topset and along the shoreline, respectively, simultaneously prograding and aggrading the delta. Over time this framework produces net shoreline progradation and forward-stepping of subsequent avulsions. Testing this model using modern systems is inherently difficult due to a lack of data: most modern delta lobes are active for timescales of centuries to millennia, and so observing multiple iterations of the channel-lobe cycle is impossible. However, the Yellow River delta (China) is unique because the lobe cycles here occur within years to decades. Therefore it is possible to measure shoreline evolution through multiple lobe cycles, based on satellite imagery and historical records. These data are used to validate the model outcomes. Our findings confirm that the explicit accounting of avulsion processes in a quasi-2D model framework is capable of capturing shoreline development patterns that otherwise are not resolvable based on previously published delta building models.

  6. Gamma ray spectroscopy employing divalent europium-doped alkaline earth halides and digital readout for accurate histogramming

    DOEpatents

    Cherepy, Nerine Jane; Payne, Stephen Anthony; Drury, Owen B.; Sturm, Benjamin W.

    2016-02-09

    According to one embodiment, a scintillator radiation detector system includes a scintillator, and a processing device for processing pulse traces corresponding to light pulses from the scintillator, where the processing device is configured to: process each pulse trace over at least two temporal windows and to use pulse digitization to improve energy resolution of the system. According to another embodiment, a scintillator radiation detector system includes a processing device configured to: fit digitized scintillation waveforms to an algorithm, perform a direct integration of fit parameters, process multiple integration windows for each digitized scintillation waveform to determine a correction factor, and apply the correction factor to each digitized scintillation waveform.

  7. Portable multiplicity counter

    DOEpatents

    Newell, Matthew R [Los Alamos, NM; Jones, David Carl [Los Alamos, NM

    2009-09-01

    A portable multiplicity counter has signal input circuitry, processing circuitry and a user/computer interface disposed in a housing. The processing circuitry, which can comprise a microcontroller integrated circuit operably coupled to shift register circuitry implemented in a field programmable gate array, is configured to be operable via the user/computer interface to count input signal pulses receivable at said signal input circuitry and record time correlations thereof in a total counting mode, coincidence counting mode and/or a multiplicity counting mode. The user/computer interface can be, for example, an LCD display/keypad and/or a USB interface. The counter can include a battery pack for powering the counter and low/high voltage power supplies for biasing external detectors, so that the counter can be configured as a hand-held device for counting neutron events.

  8. Perspectives on Dichotic Listening and the Corpus Callosum

    ERIC Educational Resources Information Center

    Musiek, Frank E.; Weihing, Jeffrey

    2011-01-01

    The present review summarizes historic and recent research which has investigated the role of the corpus callosum in dichotic processing within the context of audiology. Examination of performance by certain clinical groups, including split brain patients, multiple sclerosis cases, and other types of neurological lesions is included. Maturational,…

  9. Gifted Children and Divorce

    ERIC Educational Resources Information Center

    Dudley, John; Karnes, Frances A.

    2011-01-01

    Divorce is often a contentious process with multiple issues to decide, especially in cases in which there are children involved. Divorce raises several legal issues when considering the well-being of children, including those who are gifted. In this article, the authors discuss these issues which include school choice, child support, and custody…

  10. The Legacy of Hobbs and Gray: Research on the Development and Prevention of Conduct Problems.

    ERIC Educational Resources Information Center

    Dodge, Kenneth A.

    1996-01-01

    Describes research on the development of chronic conduct problems in childhood and adolescence, examining a multiple risk-factor model that includes biological predispositions, ecological context, family processes, peer influences, academic performance, and social information processing as factors leading to conduct problems. The paper describes a…

  11. Digital communications study

    NASA Technical Reports Server (NTRS)

    Boorstyn, R. R.

    1973-01-01

    Research is reported dealing with problems of digital data transmission and computer communications networks. The results of four individual studies are presented which include: (1) signal processing with finite state machines, (2) signal parameter estimation from discrete-time observations, (3) digital filtering for radar signal processing applications, and (4) multiple server queues where all servers are not identical.

  12. Level of Processing Modulates the Neural Correlates of Emotional Memory Formation

    ERIC Educational Resources Information Center

    Ritchey, Maureen; LaBar, Kevin S.; Cabeza, Roberto

    2011-01-01

    Emotion is known to influence multiple aspects of memory formation, including the initial encoding of the memory trace and its consolidation over time. However, the neural mechanisms whereby emotion impacts memory encoding remain largely unexplored. The present study used a levels-of-processing manipulation to characterize the impact of emotion on…

  13. Energy requirement for the production of silicon solar arrays

    NASA Technical Reports Server (NTRS)

    Lindmayer, J.; Wihl, M.; Scheinine, A.; Morrison, A.

    1977-01-01

    An assessment of potential changes and alternative technologies which could impact the photovoltaic manufacturing process is presented. Topics discussed include: a multiple wire saw, ribbon growth techniques, silicon casting, and a computer model for a large-scale solar power plant. Emphasis is placed on reducing the energy demands of the manufacturing process.

  14. Methods of natural gas liquefaction and natural gas liquefaction plants utilizing multiple and varying gas streams

    DOEpatents

    Wilding, Bruce M; Turner, Terry D

    2014-12-02

    A method of natural gas liquefaction may include cooling a gaseous NG process stream to form a liquid NG process stream. The method may further include directing the first tail gas stream out of a plant at a first pressure and directing a second tail gas stream out of the plant at a second pressure. An additional method of natural gas liquefaction may include separating CO.sub.2 from a liquid NG process stream and processing the CO.sub.2 to provide a CO.sub.2 product stream. Another method of natural gas liquefaction may include combining a marginal gaseous NG process stream with a secondary substantially pure NG stream to provide an improved gaseous NG process stream. Additionally, a NG liquefaction plant may include a first tail gas outlet, and at least a second tail gas outlet, the at least a second tail gas outlet separate from the first tail gas outlet.

  15. DMA engine for repeating communication patterns

    DOEpatents

    Chen, Dong; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Steinmacher-Burow, Burkhard; Vranas, Pavlos

    2010-09-21

    A parallel computer system is constructed as a network of interconnected compute nodes to operate a global message-passing application for performing communications across the network. Each of the compute nodes includes one or more individual processors with memories which run local instances of the global message-passing application operating at each compute node to carry out local processing operations independent of processing operations carried out at other compute nodes. Each compute node also includes a DMA engine constructed to interact with the application via Injection FIFO Metadata describing multiple Injection FIFOs, where each Injection FIFO may contain an arbitrary number of message descriptors, in order to process messages with a fixed processing overhead irrespective of the number of message descriptors included in the Injection FIFO.

  16. Simultaneous reconstruction of multiple depth images without off-focus points in integral imaging using a graphics processing unit.

    PubMed

    Yi, Faliu; Lee, Jieun; Moon, Inkyu

    2014-05-01

    The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
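
    A CPU-side NumPy sketch (not the paper's GPU implementation) of the variance test described above: the samples gathered from the elemental images agree at in-focus surface points and scatter at off-focus points, so a variance threshold separates them. The synthetic samples and the threshold value are placeholders.

        # Variance-based focus classification for one reconstructed depth slice.
        import numpy as np

        rng = np.random.default_rng(7)
        n_points, n_elemental = 10_000, 49          # points in one depth slice, lenslet count

        # Synthetic per-point samples: the first half behaves like in-focus surface
        # points (consistent intensity), the second half like off-focus free-space points.
        samples = np.empty((n_points, n_elemental))
        samples[: n_points // 2] = 0.6 + rng.normal(0, 0.02, (n_points // 2, n_elemental))
        samples[n_points // 2 :] = rng.uniform(0, 1, (n_points // 2, n_elemental))

        variance = samples.var(axis=1)
        in_focus = variance < 0.01                  # placeholder threshold
        depth_slice = np.where(in_focus, samples.mean(axis=1), np.nan)  # drop off-focus points
        print("kept", int(in_focus.sum()), "of", n_points, "points")

    Because each point's test is independent, both the depth reconstruction and this classification parallelize naturally, which is what motivates the GPU implementation in the abstract.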

  17. Human Exploration Framework Team: Strategy and Status

    NASA Technical Reports Server (NTRS)

    Muirhead, Brian K.; Sherwood, Brent; Olson, John

    2011-01-01

    Human Exploration Framework Team (HEFT) was formulated to create a decision framework for human space exploration that drives out the knowledge, capabilities and infrastructure NASA needs to send people to explore multiple destinations in the Solar System in an efficient, sustainable way. The specific goal is to generate an initial architecture that can evolve into a long term, enterprise-wide architecture that is the basis for a robust human space flight enterprise. This paper will discuss the initial HEFT activity which focused on starting up the cross-agency team, getting it functioning, developing a comprehensive development and analysis process and conducting multiple iterations of the process. The outcome of this process will be discussed including initial analysis of capabilities and missions for at least two decades, keeping Mars as the ultimate destination. Details are provided on strategies that span a broad technical and programmatic trade space, are analyzed against design reference missions and evaluated against a broad set of figures of merit including affordability, operational complexity, and technical and programmatic risk.

  18. Multiple Myeloma: Patient Handbook

    MedlinePlus

    ... incidence of deep vein thrombosis and pulmonary embolism. • Patients must be pre-medicated with dexamethasone, antihistamine, ... foods that include processed sugars and artificial trans fats. Caution should be used in two areas: • Vitamin ...

  19. Proteus: a reconfigurable computational network for computer vision

    NASA Astrophysics Data System (ADS)

    Haralick, Robert M.; Somani, Arun K.; Wittenbrink, Craig M.; Johnson, Robert; Cooper, Kenneth; Shapiro, Linda G.; Phillips, Ihsin T.; Hwang, Jenq N.; Cheung, William; Yao, Yung H.; Chen, Chung-Ho; Yang, Larry; Daugherty, Brian; Lorbeski, Bob; Loving, Kent; Miller, Tom; Parkins, Larye; Soos, Steven L.

    1992-04-01

    The Proteus architecture is a highly parallel MIMD (multiple-instruction, multiple-data) machine, optimized for large-granularity tasks such as machine vision and image processing. The system can achieve 20 Giga-flops (80 Giga-flops peak). It accepts data via multiple serial links at a rate of up to 640 megabytes/second. The system employs a hierarchical reconfigurable interconnection network, with the highest level being a circuit-switched Enhanced Hypercube serial interconnection network for internal data transfers. The system is designed to use 256 to 1,024 RISC processors. The processors use one-megabyte external Read/Write Allocating Caches for reduced multiprocessor contention. The system detects, locates, and replaces faulty subsystems using redundant hardware to facilitate fault tolerance. The parallelism is directly controllable through an advanced software system for partitioning, scheduling, and development. System software includes a translator for the INSIGHT language, a parallel debugger, low- and high-level simulators, and a message passing system for all control needs. Image processing application software includes a variety of point operators, neighborhood operators, convolution, and the mathematical morphology operations of binary and gray-scale dilation, erosion, opening, and closing.

  20. Constraints, Approach, and Status of Mars Surveyor 2001 Landing Site Selection

    NASA Technical Reports Server (NTRS)

    Golombek, M.; Bridges, N.; Briggs, G.; Gilmore, M.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; Smith, J.; Soderblom, L.

    1999-01-01

    There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough and defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities. Additional information is contained in the original extended abstract.

  1. Specific Etiologies Associated With the Multiple Organ Dysfunction Syndrome in Children: Part 2.

    PubMed

    Upperman, Jeffrey S; Bucuvalas, John C; Williams, Felicia N; Cairns, Bruce A; Cox, Charles S; Doctor, Allan; Tamburro, Robert F

    2017-03-01

    To describe a number of conditions and therapies associated with multiple organ dysfunction syndrome presented as part of the Eunice Kennedy Shriver National Institute of Child Health and Human Development Multiple Organ Dysfunction Workshop (March 26-27, 2015). In addition, the relationship between burn injuries and multiple organ dysfunction syndrome is also included although it was not discussed at the workshop. Literature review, research data, and expert opinion. Not applicable. Moderated by an expert from the field, issues relevant to the association of multiple organ dysfunction syndrome with a variety of conditions and therapies were presented, discussed, and debated with a focus on identifying knowledge gaps and research priorities. Summary of presentations and discussion supported and supplemented by relevant literature. Sepsis and trauma are the two conditions most commonly associated with multiple organ dysfunction syndrome in both children and adults. However, many other pathophysiologic processes may result in multiple organ dysfunction syndrome. In this article, we discuss conditions such as liver failure and pancreatitis, pathophysiologic processes such as ischemia and hypoxia, and injuries such as trauma and burns. Additionally, therapeutic interventions such as medications, blood transfusions, and transplantation may also precipitate and contribute to multiple organ dysfunction syndrome. The purpose of this article is to describe the association of multiple organ dysfunction syndrome with a variety of conditions and therapies in an attempt to identify similarities, differences, and opportunities for therapeutic intervention.

  2. Space-Time Processing for Tactical Mobile Ad Hoc Networks

    DTIC Science & Technology

    2010-05-01

    Spatial Diversity and Imperfect Channel Estimation on Wideband MC-DS-CDMA and MC-CDMA," IEEE Transactions on Communications, Vol. 57, No. 10, pp. 2988... include direct sequence code division multiple access (DS-CDMA), Frequency Hopped (FH) CDMA and Orthogonal Frequency Division Multiple Access (OFDMA)... capability, LPD/LPI, and operability in non-continuous spectrum. In addition, FH-CDMA is robust to the near-far problem, while DS-CDMA requires

  3. Systems Biophysics of Gene Expression

    PubMed Central

    Vilar, Jose M.G.; Saiz, Leonor

    2013-01-01

    Gene expression is a process central to any form of life. It involves multiple temporal and functional scales that extend from specific protein-DNA interactions to the coordinated regulation of multiple genes in response to intracellular and extracellular changes. This diversity in scales poses fundamental challenges to the use of traditional approaches to fully understand even the simplest gene expression systems. Recent advances in computational systems biophysics have provided promising avenues to reliably integrate the molecular detail of biophysical processes into the system behavior. Here, we review recent advances in the description of gene regulation as a system of biophysical processes that extend from specific protein-DNA interactions to the combinatorial assembly of nucleoprotein complexes. There is now basic mechanistic understanding of how promoters controlled by multiple, local and distal, DNA binding sites for transcription factors can actively control transcriptional noise, cell-to-cell variability, and other properties of gene regulation, including precision and flexibility of the transcriptional responses. PMID:23790365

  4. Localization of multiple defects using the compact phased array (CPA) method

    NASA Astrophysics Data System (ADS)

    Senyurek, Volkan Y.; Baghalian, Amin; Tashakori, Shervin; McDaniel, Dwayne; Tansel, Ibrahim N.

    2018-01-01

    Array systems of transducers have found numerous applications in detection and localization of defects in structural health monitoring (SHM) of plate-like structures. Different types of array configurations and analysis algorithms have been used to improve the process of localization of defects. For accurate and reliable monitoring of large structures by array systems, a high number of actuator and sensor elements are often required. In this study, a compact phased array system consisting of only three piezoelectric elements is used in conjunction with an updated total focusing method (TFM) for localization of single and multiple defects in an aluminum plate. The accuracy of the localization process was greatly improved by including wave propagation information in TFM. Results indicated that the proposed CPA approach can locate single and multiple defects with high accuracy while decreasing the processing costs and the number of required transducers. This method can be utilized in critical applications such as aerospace structures where the use of a large number of transducers is not desirable.
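    To make the delay-and-sum idea behind the total focusing method concrete, the sketch below images a grid of pixels from all transmit-receive pairs of a small array. It is a hedged illustration only: the element positions, wave speed, and signal format are placeholders, and the authors' updated TFM with wave-propagation corrections is not reproduced.

    ```python
    # Hedged sketch of a basic delay-and-sum total focusing method (TFM) image.
    import numpy as np

    def tfm_image(signals, fs, elements, grid_x, grid_y, c):
        """signals[(tx, rx)] -> 1-D time trace sampled at fs; elements: (N, 2) positions."""
        image = np.zeros((len(grid_y), len(grid_x)))
        for iy, y in enumerate(grid_y):
            for ix, x in enumerate(grid_x):
                p = np.array([x, y])
                acc = 0.0
                for tx in range(len(elements)):
                    for rx in range(len(elements)):
                        # time of flight: transmitter -> pixel -> receiver
                        tof = (np.linalg.norm(p - elements[tx]) +
                               np.linalg.norm(p - elements[rx])) / c
                        k = int(round(tof * fs))
                        trace = signals[(tx, rx)]
                        if k < len(trace):
                            acc += trace[k]
                image[iy, ix] = abs(acc)          # coherent sum -> pixel intensity
        return image
    ```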

  5. Mass media in health promotion: an analysis using an extended information-processing model.

    PubMed

    Flay, B R; DiTecco, D; Schlegel, R P

    1980-01-01

    The information-processing model of the attitude and behavior change process was critically examined and extended from six to 12 levels for a better analysis of change due to mass media campaigns. Findings from social psychology and communications research, and from evaluations of mass media health promotion programs, were reviewed to determine how source, message, channel, receiver, and destination variables affect each of the levels of change of major interest (knowledge, beliefs, attitudes, intentions and behavior). Factors found most likely to induce permanent attitude and behavior change (most important in health promotion) were: presentation and repetition over long time periods, via multiple sources, at different times (including "prime" or high-exposure times), in novel and involving ways, with appeals to multiple motives, development of social support, and provision of appropriate behavioral skills, alternatives, and reinforcement (preferably in ways that secure the active participation of the audience). Suggestions for evaluation of mass media programs that take account of this complexity were advanced.

  6. Integration of multiple theories for the simulation of laser interference lithography processes

    NASA Astrophysics Data System (ADS)

    Lin, Te-Hsun; Yang, Yin-Kuang; Fu, Chien-Chung

    2017-11-01

    The periodic structure of laser interference lithography (LIL) fabrication is superior to that of other lithography technologies. In contrast to traditional lithography, LIL has the advantages of being a simple optical system with no mask requirements, low cost, high depth of focus, and a large patterning area in a single exposure. Generally, a simulation pattern for the periodic structure is obtained through optical interference prior to its fabrication through LIL. However, the LIL process is complex and combines the fields of optics and polymer materials; thus, a single simulation theory cannot reflect the real situation. Therefore, this research integrates multiple theories, including those of optical interference, standing waves, and photoresist characteristics, to create a mathematical model for the LIL process. The mathematical model can accurately estimate the exposure time, reducing the trial and error otherwise required in the LIL process.
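    For orientation, the optical-interference component of such a model typically starts from the standard two-beam relations sketched below; the notation is generic and is not necessarily the paper's.

    ```latex
    % Hedged sketch: intensity of two coherent plane waves meeting at half-angle \theta,
    % and the resulting fringe (grating) period \Lambda. Symbols are generic.
    I(x) = I_1 + I_2 + 2\sqrt{I_1 I_2}\,
           \cos\!\left(\frac{4\pi \sin\theta}{\lambda}\, x\right),
    \qquad
    \Lambda = \frac{\lambda}{2\sin\theta}
    ```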

  7. Integration of multiple theories for the simulation of laser interference lithography processes.

    PubMed

    Lin, Te-Hsun; Yang, Yin-Kuang; Fu, Chien-Chung

    2017-11-24

    The periodic structure of laser interference lithography (LIL) fabrication is superior to that of other lithography technologies. In contrast to traditional lithography, LIL has the advantages of being a simple optical system with no mask requirements, low cost, high depth of focus, and a large patterning area in a single exposure. Generally, a simulation pattern for the periodic structure is obtained through optical interference prior to its fabrication through LIL. However, the LIL process is complex and combines the fields of optics and polymer materials; thus, a single simulation theory cannot reflect the real situation. Therefore, this research integrates multiple theories, including those of optical interference, standing waves, and photoresist characteristics, to create a mathematical model for the LIL process. The mathematical model can accurately estimate the exposure time, reducing the trial and error otherwise required in the LIL process.

  8. TEAMS (Tele-Exercise and Multiple Sclerosis), a Tailored Telerehabilitation mHealth App: Participant-Centered Development and Usability Study

    PubMed Central

    Rimmer, James H; Johnson, George; Wilroy, Jereme; Young, Hui-Ju; Mehta, Tapan; Lai, Byron

    2018-01-01

    Background People with multiple sclerosis face varying levels of disability and symptoms, thus requiring highly trained therapists and/or exercise trainers to design personalized exercise programs. However, for people living in geographically isolated communities, access to such trained professionals can be challenging due to a number of barriers associated with cost, access to transportation, and travel distance. Generic mobile health exercise apps often fall short of what people with multiple sclerosis need to become physically active (ie, exercise content that has been adapted to accommodate a wide range of functional limitations). Objective This usability study describes the development process of the TEAMS (Tele-Exercise and Multiple Sclerosis) app, which is being used by people with multiple sclerosis in a large randomized controlled trial to engage in home-based telerehabilitation. Methods Twenty-one participants with disabilities (10 people with multiple sclerosis) were involved in the double iterative design, which included the simultaneous development of the app features and exercise content (exercise videos and articles). Framed within a user-centered design approach, the development process included 2 stages: ground-level creation (focus group followed by early stage evaluations and developments), and proof of concept through 2 usability tests. Usability (effectiveness, usefulness, and satisfaction) was evaluated using a mixed-methods approach. Results During testing of the app’s effectiveness, the second usability test resulted in an average of 1 problem per participant, a decrease of 53% compared to the initial usability test. Five themes were constructed from the qualitative data that related to app usefulness and satisfaction, namely: high perceived confidence for app usability, positive perceptions of exercise videos, viable exercise option at home, orientation and familiarity required for successful participation, and app issues. Participants acknowledged that the final app was ready to be delivered to the public after minor revisions. After including these revisions, the project team released the final app that is being used in the randomized controlled trial. Conclusions A multi-level user-centered development process resulted in the development of an inclusive exercise program for people with multiple sclerosis operated through an easy-to-use app. The promotion of exercise through self-regulated mHealth programs requires a stakeholder-driven approach to app development. This ensures that app and content match the preferences and functional abilities of the end user (ie, people with varying levels of multiple sclerosis). PMID:29798832

  9. TEAMS (Tele-Exercise and Multiple Sclerosis), a Tailored Telerehabilitation mHealth App: Participant-Centered Development and Usability Study.

    PubMed

    Thirumalai, Mohanraj; Rimmer, James H; Johnson, George; Wilroy, Jereme; Young, Hui-Ju; Mehta, Tapan; Lai, Byron

    2018-05-24

    People with multiple sclerosis face varying levels of disability and symptoms, thus requiring highly trained therapists and/or exercise trainers to design personalized exercise programs. However, for people living in geographically isolated communities, access to such trained professionals can be challenging due to a number of barriers associated with cost, access to transportation, and travel distance. Generic mobile health exercise apps often fall short of what people with multiple sclerosis need to become physically active (ie, exercise content that has been adapted to accommodate a wide range of functional limitations). This usability study describes the development process of the TEAMS (Tele-Exercise and Multiple Sclerosis) app, which is being used by people with multiple sclerosis in a large randomized controlled trial to engage in home-based telerehabilitation. Twenty-one participants with disabilities (10 people with multiple sclerosis) were involved in the double iterative design, which included the simultaneous development of the app features and exercise content (exercise videos and articles). Framed within a user-centered design approach, the development process included 2 stages: ground-level creation (focus group followed by early stage evaluations and developments), and proof of concept through 2 usability tests. Usability (effectiveness, usefulness, and satisfaction) was evaluated using a mixed-methods approach. During testing of the app's effectiveness, the second usability test resulted in an average of 1 problem per participant, a decrease of 53% compared to the initial usability test. Five themes were constructed from the qualitative data that related to app usefulness and satisfaction, namely: high perceived confidence for app usability, positive perceptions of exercise videos, viable exercise option at home, orientation and familiarity required for successful participation, and app issues. Participants acknowledged that the final app was ready to be delivered to the public after minor revisions. After including these revisions, the project team released the final app that is being used in the randomized controlled trial. A multi-level user-centered development process resulted in the development of an inclusive exercise program for people with multiple sclerosis operated through an easy-to-use app. The promotion of exercise through self-regulated mHealth programs requires a stakeholder-driven approach to app development. This ensures that app and content match the preferences and functional abilities of the end user (ie, people with varying levels of multiple sclerosis). ©Mohanraj Thirumalai, James H Rimmer, George Johnson, Jereme Wilroy, Hui-Ju Young, Tapan Mehta, Byron Lai. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 24.05.2018.

  10. 45 CFR 1308.8 - Eligibility criteria: Emotional/behavioral disorders.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... or emotional functioning in multiple settings. (c) The evaluation process must include a review of the child's regular Head Start physical examination to eliminate the possibility of misdiagnosis due to an underlying physical condition. ...

  11. Neural Correlates of Restricted, Repetitive Behaviors in Autism Spectrum Disorders

    DTIC Science & Technology

    2014-12-01

    ventral precuneus has been associated with self-referential processing. • To better understand the relation of the connectivity of this region with RRBs... acquisition for 89 participants (53 with ASD and 36 controls). After performing qualitative and quantitative quality control and pre-processing of the... data, we have been actively processing and analyzing the rich dataset from multiple perspectives. This includes investigation of how altered functional

  12. A Grounded Theory of Preservice Music Educators' Lesson Planning Processes within Field Experience Methods Courses

    ERIC Educational Resources Information Center

    Parker, Elizabeth Cassidy; Bond, Vanessa L.; Powell, Sean R.

    2017-01-01

    The purpose of this grounded theory study was to understand the process of field experience lesson planning for preservice music educators enrolled in choral, general, and instrumental music education courses within three university contexts. Data sources included multiple interviews, written responses, and field texts from 42 participants. Four…

  13. Balancing multiple roles among a group of urban midlife American Indian working women.

    PubMed

    Napholz, L

    2000-06-01

    Presented are the results of a secondary analysis of group data from a study of a six-week role conflict reduction intervention among a group of urban American Indian women (n = 8). The specific aim of this researcher was to understand the process of balancing multiple roles as expressed in the participants' daily lived experiences as mothers, wives, and workers. A construction of the process of balancing multiple roles was accomplished through the use of narratives. Balancing multiple roles represented a major current attempt on the part of the participants to integrate and balance traditional and contemporary feminine strengths in a positive, culturally consistent manner. The study themes included: traditional sex role expectation conflicts, family guilt, guilt management, transitioning inner conflict and stress, breaking the silence-learning to say no, and healing the spirit to reclaim the self. Further support for retraditionalization of roles for this group of Indian women was maintained as an effective means of balancing roles and achieving Indian self-determination.

  14. Nonlinear coherent optical image processing using logarithmic transmittance of bacteriorhodopsin films

    NASA Astrophysics Data System (ADS)

    Downie, John D.

    1995-08-01

    The transmission properties of some bacteriorhodopsin-film spatial light modulators are uniquely suited to allow nonlinear optical image-processing operations to be applied to images with multiplicative noise characteristics. A logarithmic amplitude-transmission characteristic of the film permits the conversion of multiplicative noise to additive noise, which may then be linearly filtered out in the Fourier plane of the transformed image. I present experimental results demonstrating the principle and the capability for several different image and noise situations, including deterministic noise and speckle. The bacteriorhodopsin film studied here displays the logarithmic transmission response for write intensities spanning a dynamic range greater than 2 orders of magnitude.
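    The underlying signal-processing idea can be sketched digitally as homomorphic filtering: take the logarithm so multiplicative noise becomes additive, low-pass filter in the Fourier plane, then exponentiate back. The code below is a hedged illustration of that principle only; it does not model the bacteriorhodopsin film or the optical setup.

    ```python
    # Hedged sketch of homomorphic filtering for multiplicative (speckle-like) noise.
    import numpy as np

    def homomorphic_denoise(image, keep_fraction=0.1):
        log_img = np.log(image + 1e-6)                  # multiplicative -> additive noise
        spectrum = np.fft.fftshift(np.fft.fft2(log_img))
        ny, nx = image.shape
        Y, X = np.ogrid[:ny, :nx]
        radius = np.hypot(Y - ny / 2, X - nx / 2)
        mask = radius < keep_fraction * min(ny, nx)     # simple low-pass "Fourier plane" filter
        filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask)).real
        return np.exp(filtered)                         # back to the intensity domain

    # Example: clean image corrupted by unit-mean multiplicative noise.
    rng = np.random.default_rng(1)
    clean = np.outer(np.hanning(128), np.hanning(128)) + 0.5
    noisy = clean * rng.gamma(shape=10, scale=0.1, size=clean.shape)
    recovered = homomorphic_denoise(noisy)
    ```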

  15. Multiple Parton Interactions in p-pbar Collisions in the D0 Experiment at the Tevatron Collider (in Russian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golovanov, Georgy

    The thesis is devoted to the study of processes with multiple parton interactions (MPI) in ppbar collisions collected by the D0 detector at the Fermilab Tevatron collider at sqrt(s) = 1.96 TeV. The study includes measurements of the MPI event fraction and the effective cross section, a process-independent parameter related to the effective interaction region inside the nucleon. The measurements are made using events with a photon and three hadronic jets in the final state. The measured effective cross section is used to estimate the background from MPI to WH production at Tevatron energies.
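    For context, the effective cross section is conventionally defined through the double-parton-scattering ansatz sketched below; the notation is the generic one used in the literature and may differ from the thesis.

    ```latex
    % Hedged sketch: standard double-parton-scattering ansatz defining \sigma_eff.
    \sigma^{AB}_{\mathrm{DP}} \;=\; \frac{m}{2}\,
      \frac{\sigma_{A}\,\sigma_{B}}{\sigma_{\mathrm{eff}}},
    \qquad
    m = \begin{cases} 1 & \text{if } A = B \\ 2 & \text{if } A \neq B \end{cases}
    ```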

  16. Sharing the skies: the Gemini Observatory international time allocation process

    NASA Astrophysics Data System (ADS)

    Margheim, Steven J.

    2016-07-01

    Gemini Observatory serves a diverse community of four partner countries (United States, Canada, Brazil, and Argentina), two hosts (Chile and the University of Hawaii), and limited-term partnerships (currently Australia and the Republic of Korea). Observing time is available via multiple opportunities including Large and Long Programs, Fast-turnaround programs, and regular semester queue programs. The slate of programs for observation each semester must be created by merging programs from these multiple, conflicting sources. This paper describes the time allocation process used to schedule the overall science program for the semester, with emphasis on the International Time Allocation Committee and the software applications used.

  17. Nonlinear Coherent Optical Image Processing Using Logarithmic Transmittance of Bacteriorhodopsin Films

    NASA Technical Reports Server (NTRS)

    Downie, John D.

    1995-01-01

    The transmission properties of some bacteriorhodopsin-film spatial light modulators are uniquely suited to allow nonlinear optical image-processing operations to be applied to images with multiplicative noise characteristics. A logarithmic amplitude-transmission characteristic of the film permits the conversion of multiplicative noise to additive noise, which may then be linearly filtered out in the Fourier plane of the transformed image. I present experimental results demonstrating the principle and the capability for several different image and noise situations, including deterministic noise and speckle. The bacteriorhodopsin film studied here displays the logarithmic transmission response for write intensities spanning a dynamic range greater than 2 orders of magnitude.

  18. Implementation and benefits of advanced process control for lithography CD and overlay

    NASA Astrophysics Data System (ADS)

    Zavyalova, Lena; Fu, Chong-Cheng; Seligman, Gary S.; Tapp, Perry A.; Pol, Victor

    2003-05-01

    Due to rapidly shrinking imaging process windows and increasingly stringent device overlay requirements, sub-130 nm lithography processes are more severely impacted than ever by systematic faults. Limits on critical dimension (CD) and overlay capability further challenge the operational effectiveness of a mix-and-match environment using multiple lithography tools, as such a mode additionally consumes the available error budgets. Therefore, a focus on advanced process control (APC) methodologies is key to gaining control in the lithographic modules for critical device levels, which in turn translates to accelerated yield learning, achieving time-to-market lead, and ultimately a higher return on investment. This paper describes the implementation and unique challenges of a closed-loop CD and overlay control solution in high-volume manufacturing of leading-edge devices. A particular emphasis has been placed on developing a flexible APC application capable of managing a wide range of control aspects such as process and tool drifts, single and multiple lot excursions, referential overlay control, 'special lot' handling, advanced model hierarchy, and automatic model seeding. Specific integration cases, including the multiple-reticle complementary phase shift lithography process, are discussed. A continuous improvement in overlay and CD Cpk performance as well as the rework rate has been observed through the implementation of this system, and the results are studied.
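    As a hedged illustration of the kind of run-to-run loop such APC applications build on, the sketch below implements a simple EWMA controller; the linear process model, gain, and all numbers are invented placeholders, not the production system described in the paper.

    ```python
    # Hedged sketch of a run-to-run EWMA controller, a common APC building block.
    def ewma_step(d_hat, measured, setting, target, gain=1.0, lam=0.3):
        """Update the disturbance estimate and compute the next process setting.

        Assumed process model: measured = gain * setting + disturbance.
        """
        d_hat = lam * (measured - gain * setting) + (1 - lam) * d_hat
        next_setting = (target - d_hat) / gain
        return d_hat, next_setting

    # Toy simulation: a process with an unknown +3 nm CD offset is steered to target.
    target_cd, true_offset = 130.0, 3.0
    setting, d_hat = 130.0, 0.0
    for lot in range(5):
        measured = setting + true_offset          # fictitious metrology result
        d_hat, setting = ewma_step(d_hat, measured, setting, target_cd)
        print(f"lot {lot}: measured {measured:.2f} nm, next setting {setting:.2f} nm")
    ```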

  19. More Than Just Accuracy: A Novel Method to Incorporate Multiple Test Attributes in Evaluating Diagnostic Tests Including Point of Care Tests.

    PubMed

    Thompson, Matthew; Weigl, Bernhard; Fitzpatrick, Annette; Ide, Nicole

    2016-01-01

    Current frameworks for evaluating diagnostic tests are constrained by a focus on diagnostic accuracy, and assume that all aspects of the testing process and test attributes are discrete and equally important. Determining the balance between the benefits and harms associated with new or existing tests has been overlooked. Yet, this is critically important information for stakeholders involved in developing, testing, and implementing tests. This is particularly important for point of care tests (POCTs) where tradeoffs exist between numerous aspects of the testing process and test attributes. We developed a new model that multiple stakeholders (e.g., clinicians, patients, researchers, test developers, industry, regulators, and health care funders) can use to visualize the multiple attributes of tests, the interactions that occur between these attributes, and their impacts on health outcomes. We use multiple examples to illustrate interactions between test attributes (test availability, test experience, and test results) and outcomes, including several POCTs. The model could be used to prioritize research and development efforts, and inform regulatory submissions for new diagnostics. It could potentially provide a way to incorporate the relative weights that various subgroups or clinical settings might place on different test attributes. Our model provides a novel way that multiple stakeholders can use to visualize test attributes, their interactions, and impacts on individual and population outcomes. We anticipate that this will facilitate more informed decision making around diagnostic tests.

  20. B cell biology: implications for treatment of systemic lupus erythematosus.

    PubMed

    Anolik, J H

    2013-04-01

    B cells are critical players in the orchestration of properly regulated immune responses, normally providing protective immunity without autoimmunity. Balance in the B cell compartment is achieved through the finely regulated participation of multiple B cell populations with different antibody-dependent and independent functions. Both types of functions allow B cells to modulate other components of the innate and adaptive immune system. Autoantibody-independent B cell functions include antigen presentation, T cell activation and polarization, and dendritic cell modulation. Several of these functions are mediated by the ability of B cells to produce immunoregulatory cytokines and chemokines and by their critical contribution to lymphoid tissue development and organization including the development of ectopic tertiary lymphoid tissue. Additionally, the functional versatility of B cells enables them to play either protective or pathogenic roles in autoimmunity. In turn, B cell dysfunction has been critically implicated in the pathophysiology of systemic lupus erythematosus (SLE), a complex disease characterized by the production of autoantibodies and heterogeneous clinical involvement. Thus, the breakdown of B cell tolerance is a defining and early event in the disease process and may occur by multiple pathways, including alterations in factors that affect B cell activation thresholds, B cell longevity, and apoptotic cell processing. Once tolerance is broken, autoantibodies contribute to autoimmunity by multiple mechanisms including immune-complex mediated Type III hypersensitivity reactions, type II antibody-dependent cytotoxicity, and by instructing innate immune cells to produce pathogenic cytokines including IFNα, TNF and IL-1. The complexity of B cell functions has been highlighted by the variable success of B cell-targeted therapies in multiple autoimmune diseases, including those conventionally viewed as T cell-mediated conditions. Given the widespread utilization of B cell depletion therapy in autoimmune diseases and the need for new therapeutic approaches in SLE, a better understanding of human B cell subsets and the balance of pathogenic and regulatory functions is of the essence.

  1. A Technical Approach to Expedited Processing of NTPR Radiation Dose Assessments

    DTIC Science & Technology

    2011-10-01

    [Tabular excerpt from the dose-assessment methodology: organ/surrogate assignments (e.g., pharynx and oral cavity, pineal gland, pituitary gland mapped to surrogate organ codes) and cancer risk groupings, including cancers of other endocrine glands, cancers of other and ill-defined sites, lymphoma and multiple myeloma, and leukemia excluding CLL; risk depends on age at exposure.]

  2. Automation of Ocean Product Metrics

    DTIC Science & Technology

    2008-09-30

    Presented in: Ocean Sciences 2008 Conf., 5 Mar 2008. Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented in 2008 EGU General Assembly, 14 April 2008. ...processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be... developed and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data

  3. Taxation of oil and gas revenues: Norway

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stauffer, T.R.

    1982-04-01

    Fiscalization of petroleum in Norway is a multidimensional process, which includes the conventional components of explicit taxation but also involves implicit nontax economic burdens. The latter are often even more important than the taxes themselves. The multidimensional fiscal structure reflects the multiple purposes of petroleum taxation in Norway, of which revenue collection appears to be but one. Given the multiple objectives, it is therefore not surprising that the components are partly inconsistent and contradictory.

  4. Optical Interconnection Via Computer-Generated Holograms

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang; Zhou, Shaomin

    1995-01-01

    Method of free-space optical interconnection developed for data-processing applications like parallel optical computing, neural-network computing, and switching in optical communication networks. In method, multiple optical connections between multiple sources of light in one array and multiple photodetectors in another array made via computer-generated holograms in electrically addressed spatial light modulators (ESLMs). Offers potential advantages of massive parallelism, high space-bandwidth product, high time-bandwidth product, low power consumption, low cross talk, and low time skew. Also offers advantage of programmability with flexibility of reconfiguration, including variation of strengths of optical connections in real time.

  5. The Overgrid Interface for Computational Simulations on Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    Computational simulations using overset grids typically involve multiple steps and a variety of software modules. A graphical interface called OVERGRID has been specially designed for such purposes. Data required and created by the different steps include geometry, grids, domain connectivity information, and flow solver input parameters. The interface provides a unified environment for the visualization, processing, generation, and diagnosis of such data. General modules are available for the manipulation of structured grids and unstructured surface triangulations. Modules more specific to the overset approach include surface curve generators, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, Cartesian box grid generators, and domain connectivity pre-processing tools. An interface provides automatic selection and viewing of flow solver boundary conditions, and various other flow solver inputs. For problems involving multiple components in relative motion, a module is available to build the component/grid relationships and to prescribe and animate the dynamics of the different components.

  6. Response and adaptation of photosynthesis, respiration, and antioxidant systems to elevated CO2 with environmental stress in plants

    PubMed Central

    Xu, Zhenzhu; Jiang, Yanling; Zhou, Guangsheng

    2015-01-01

    It is well known that plant photosynthesis and respiration are two fundamental and crucial physiological processes, while the critical role of the antioxidant system in response to abiotic factors is still a focus point for investigating physiological stress. Although one key metabolic process and its response to climatic change have already been reported and reviewed, an integrative review, including several biological processes at multiple scales, has not been well reported. The current review will present a synthesis focusing on the underlying mechanisms in the responses to elevated CO2 at multiple scales, including molecular, cellular, biochemical, physiological, and individual aspects, particularly, for these biological processes under elevated CO2 with other key abiotic stresses, such as heat, drought, and ozone pollution, as well as nitrogen limitation. The present comprehensive review may add timely and substantial information about the topic in recent studies, while it presents what has been well established in previous reviews. First, an outline of the critical biological processes, and an overview of their roles in environmental regulation, is presented. Second, the research advances with regard to the individual subtopics are reviewed, including the response and adaptation of the photosynthetic capacity, respiration, and antioxidant system to CO2 enrichment alone, and its combination with other climatic change factors. Finally, the potential applications for plant responses at various levels to climate change are discussed. The above issue is currently of crucial concern worldwide, and this review may help in a better understanding of how plants deal with elevated CO2 using other mainstream abiotic factors, including molecular, cellular, biochemical, physiological, and whole individual processes, and the better management of the ecological environment, climate change, and sustainable development. PMID:26442017

  7. Hereditary Angioedema Attacks: Local Swelling at Multiple Sites.

    PubMed

    Hofman, Zonne L M; Relan, Anurag; Hack, C Erik

    2016-02-01

    Hereditary angioedema (HAE) patients experience recurrent local swelling in various parts of the body including painful swelling of the intestine and life-threatening laryngeal oedema. Most HAE literature is about attacks located in one anatomical site, though it is mentioned that HAE attacks may also involve multiple anatomical sites simultaneously. A detailed description of such multi-location attacks is currently lacking. This study investigated the occurrence, severity and clinical course of HAE attacks with multiple anatomical locations. HAE patients included in a clinical database of recombinant human C1-inhibitor (rhC1INH) studies were evaluated. Visual analog scale scores filled out by the patients for various symptoms at various locations and investigator symptoms scores during the attack were analysed. Data of 219 eligible attacks in 119 patients was analysed. Thirty-three patients (28%) had symptoms at multiple locations in anatomically unrelated regions at the same time during their first attack. Up to five simultaneously affected locations were reported. The observation that severe HAE attacks often affect multiple sites in the body suggests that HAE symptoms result from a systemic rather than from a local process as is currently believed.

  8. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, Darrell; Azevado, Stephen

    1986-06-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
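    To make the class of operations concrete, the hedged sketch below reproduces two of them (Butterworth filtering and spectral density estimation) using SciPy; it is an independent illustration, not SIG code, and the signal is synthetic.

    ```python
    # Hedged sketch: low-pass Butterworth filtering followed by an auto spectral
    # density estimate, two of the operations listed in the SIG description.
    import numpy as np
    from scipy.signal import butter, filtfilt, welch

    fs = 1000.0                                    # sampling rate, Hz
    t = np.arange(0, 2.0, 1 / fs)
    x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

    b, a = butter(4, 100 / (fs / 2), btype="low")  # 4th-order low-pass at 100 Hz
    y = filtfilt(b, a, x)                          # zero-phase digital filtering

    f, pxx = welch(y, fs=fs, window="hamming", nperseg=512)  # auto spectral density
    ```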

  9. SIG. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  10. A practical theoretical formalism for atomic multielectron processes: direct multiple ionization by a single auger decay or by impact of a single electron or photon

    NASA Astrophysics Data System (ADS)

    Liu, Pengfei; Zeng, Jiaolong; Yuan, Jianmin

    2018-04-01

    Multiple electron processes occur widely in atoms, molecules, clusters, and condensed matter when they are interacting with energetic particles or intense laser fields. Direct multielectron processes (DMEP) are the most complicated among the general multiple electron processes and are the most difficult to describe theoretically. In this work, a unified and accurate theoretical formalism is proposed for the DMEP of atoms, including multiple Auger decay and multiple ionization by the impact of a single electron or a single photon, based on atomic collision theory described by a correlated many-body Green's function. Such a practical treatment is made possible by taking into consideration the different coherence features of the atoms (matter waves) in the initial and final states. We first explain how the coherence characteristics of the ejected continuum electrons are largely destroyed, taking the electron impact direct double ionization process as an example. The direct double ionization process is completely different from single ionization, where complete interference can be maintained. Detailed expressions are obtained for the energy correlations among the continuum electrons and the energy-resolved differential and integral cross sections according to the separation of knock-out (KO) and shake-off (SO) mechanisms for electron impact direct double ionization, direct double and triple Auger decay, and double and triple photoionization (TPI) processes. Extension to DMEP of higher order than triple ionization is straightforward by adding the contributions of the corresponding KO and SO processes. The approach is applied to investigate the electron impact double ionization of C+, N+, and O+, the direct double and triple Auger decay of the K-shell excited states of C+ 1s2s²2p² ²D and ²P, and the double and triple photoionization of lithium. Comparisons with experimental and other theoretical investigations, wherever available in the literature, show that our theoretical formalism is accurate and effective in treating atomic multielectron processes.

  11. Analytical group decision making in natural resources: Methodology and application

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
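    The core AHP calculation the methodology relies on, deriving group priorities as the normalized principal eigenvector of a pairwise-comparison matrix, can be sketched as follows; the comparison values are illustrative, and the random-index constant assumes a 3x3 matrix.

    ```python
    # Hedged sketch of the analytic hierarchy process (AHP) priority computation.
    import numpy as np

    A = np.array([[1,   3,   5  ],       # pairwise judgments for 3 alternatives
                  [1/3, 1,   2  ],
                  [1/5, 1/2, 1  ]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    priorities = w / w.sum()              # priority vector (sums to 1)

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)  # consistency index
    cr = ci / 0.58                        # Saaty's random index for n = 3 is ~0.58
    print(priorities, cr)                 # judgments are acceptable if cr < 0.1
    ```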

  12. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing, ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.
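    For reference, the canonical Hodgkin-Huxley membrane equation referred to above has the form sketched below (standard textbook notation; parameter values are preparation-specific).

    ```latex
    % Hedged sketch: Hodgkin-Huxley membrane equation and gating-variable kinetics.
    C_m \frac{dV}{dt} =
      - \bar{g}_{\mathrm{Na}}\, m^{3} h\,(V - E_{\mathrm{Na}})
      - \bar{g}_{\mathrm{K}}\,  n^{4}\,(V - E_{\mathrm{K}})
      - g_{L}\,(V - E_{L}) + I_{\mathrm{ext}},
    \qquad
    \frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x, \quad x \in \{m, h, n\}
    ```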

  13. Feature extraction from multiple data sources using genetic programming

    NASA Astrophysics Data System (ADS)

    Szymanski, John J.; Brumby, Steven P.; Pope, Paul A.; Eads, Damian R.; Esch-Mosher, Diana M.; Galassi, Mark C.; Harvey, Neal R.; McCulloch, Hersey D.; Perkins, Simon J.; Porter, Reid B.; Theiler, James P.; Young, Aaron C.; Bloch, Jeffrey J.; David, Nancy A.

    2002-08-01

    Feature extraction from imagery is an important and long-standing problem in remote sensing. In this paper, we report on work using genetic programming to perform feature extraction simultaneously from multispectral and digital elevation model (DEM) data. We use the GENetic Imagery Exploitation (GENIE) software for this purpose, which produces image-processing software that inherently combines spatial and spectral processing. GENIE is particularly useful in exploratory studies of imagery, such as one often does in combining data from multiple sources. The user trains the software by painting the feature of interest with a simple graphical user interface. GENIE then uses genetic programming techniques to produce an image-processing pipeline. Here, we demonstrate evolution of image processing algorithms that extract a range of land cover features including towns, wildfire burnscars, and forest. We use imagery from the DOE/NNSA Multispectral Thermal Imager (MTI) spacecraft, fused with USGS 1:24000 scale DEM data.

  14. Using Photovoice to Include People with Profound and Multiple Learning Disabilities in Inclusive Research

    ERIC Educational Resources Information Center

    Cluley, Victoria

    2017-01-01

    Background: It is now expected that projects addressing the lives of people with learning disabilities include people with learning disabilities in the research process. In the past, such research often excluded people with learning disabilities, favouring the opinions of family members, carers and professionals. The inclusion of the voices of…

  15. Melatonin, The Pineal Gland and Circadian Rhythms

    DTIC Science & Technology

    1992-04-30

    physiological rhythms including locomotion, sleep/wake, thermoregulation, cardiovascular function and many endocrine processes. Among the rhythms under SCN...control of a wide array of behavioral and physiological rhythms including locomotion, sleep/wake, thermoregulation, cardiovascular function and many... reptiles and birds, overt rhythmicity results from the integration of multiple circadian oscillators within the pineal gland, eyes and the presumed

  16. The Inclusion of Science Process Skills in Multiple Choice Questions: Are We Getting Any Better?

    ERIC Educational Resources Information Center

    Elmas, Ridvan; Bodner, George M.; Aydogdu, Bulent; Saban, Yakup

    2018-01-01

    The goal of this study was to analyze the science and technology questions with respect to science process skills (SPS) included in the "Transition from Primary to Secondary Education" (TEOG) examination developed for use with 8th-grade students in Turkey. The 12 TEOG exams administered in the course of three academic years from 2014…

  17. Software system for data management and distributed processing of multichannel biomedical signals.

    PubMed

    Franaszczuk, P J; Jouny, C C

    2004-01-01

    The presented software is designed for efficient utilization of a cluster of PC computers for signal analysis of multichannel physiological data. The system consists of three main components: 1) a library of input and output procedures, 2) a database storing additional information about location in a storage system, 3) a user interface for selecting data for analysis, choosing programs for analysis, and distributing computing and output data on cluster nodes. The system allows for processing multichannel time series data in multiple binary formats. Descriptions of the data format, channels, and time of recording are included in separate text files. Definition and selection of multiple channel montages is possible. Epochs for analysis can be selected both manually and automatically. Implementation of new signal processing procedures is possible with minimal programming overhead for the input/output processing and user interface. The number of nodes in the cluster used for computations and the amount of storage can be changed with no major modification to the software. Current implementations include the time-frequency analysis of multiday, multichannel recordings of intracranial EEG of epileptic patients as well as evoked response analyses of repeated cognitive tasks.
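    A minimal sketch of the general pattern described above, distributing per-epoch analysis of multichannel data across worker processes, is shown below; it uses Python's multiprocessing module as a stand-in for the cluster middleware and is not the authors' system.

    ```python
    # Hedged sketch: distribute per-epoch spectral analysis of multichannel data
    # across worker processes (one worker per cluster node/core in spirit).
    import numpy as np
    from multiprocessing import Pool

    def analyze_epoch(epoch):
        """Per-epoch analysis; here simply a power spectrum per channel."""
        return np.abs(np.fft.rfft(epoch, axis=1)) ** 2   # shape: (channels, freqs)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        epochs = [rng.standard_normal((64, 1024)) for _ in range(32)]  # 64-channel epochs
        with Pool(processes=4) as pool:
            spectra = pool.map(analyze_epoch, epochs)
    ```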

  18. A new cognitive rehabilitation programme for patients with multiple sclerosis: the 'MS-line! Project'.

    PubMed

    Gich, Jordi; Freixenet, Jordi; Garcia, Rafael; Vilanova, Joan Carles; Genís, David; Silva, Yolanda; Montalban, Xavier; Ramió-Torrentà, Lluís

    2015-09-01

    Cognitive rehabilitation is often delayed in multiple sclerosis (MS). To develop a free and specific cognitive rehabilitation programme for MS patients to be used from early stages that does not interfere with daily living activities. MS-line!, cognitive rehabilitation materials consisting of written, manipulative and computer-based materials with difficulty levels developed by a multidisciplinary team. Mathematical, problem-solving and word-based exercises were designed. Physical materials included spatial, coordination and reasoning games. Computer-based material included logic and reasoning, working memory and processing speed games. Cognitive rehabilitation exercises that are specific for MS patients have been successfully developed. © The Author(s), 2014.

  19. Combined fluorescence and reflectance spectroscopy for in vivo quantification of cancer biomarkers in low- and high-grade glioma surgery

    PubMed Central

    Valdés, Pablo A.; Kim, Anthony; Leblond, Frederic; Conde, Olga M.; Harris, Brent T.; Paulsen, Keith D.; Wilson, Brian C.; Roberts, David W.

    2011-01-01

    Biomarkers are indicators of biological processes and hold promise for the diagnosis and treatment of disease. Gliomas represent a heterogeneous group of brain tumors with marked intra- and inter-tumor variability. The extent of surgical resection is a significant factor influencing post-surgical recurrence and prognosis. Here, we used fluorescence and reflectance spectral signatures for in vivo quantification of multiple biomarkers during glioma surgery, with fluorescence contrast provided by exogenously-induced protoporphyrin IX (PpIX) following administration of 5-aminolevulinic acid. We performed light-transport modeling to quantify multiple biomarkers indicative of tumor biological processes, including the local concentration of PpIX and associated photoproducts, total hemoglobin concentration, oxygen saturation, and optical scattering parameters. We developed a diagnostic algorithm for intra-operative tissue delineation that accounts for the combined tumor-specific predictive capabilities of these quantitative biomarkers. Tumor tissue delineation achieved accuracies of up to 94% (specificity = 94%, sensitivity = 94%) across a range of glioma histologies beyond current state-of-the-art optical approaches, including state-of-the-art fluorescence image guidance. This multiple biomarker strategy opens the door to optical methods for surgical guidance that use quantification of well-established neoplastic processes. Future work would seek to validate the predictive power of this proof-of-concept study in a separate larger cohort of patients. PMID:22112112
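    As a hedged illustration of the spectral-unmixing step such quantification pipelines rely on, the sketch below fits a measured fluorescence spectrum as a non-negative combination of basis spectra; the Gaussian basis shapes and wavelengths are placeholders, not the authors' light-transport model.

    ```python
    # Hedged sketch: non-negative unmixing of a fluorescence spectrum into
    # illustrative PpIX, photoproduct, and autofluorescence components.
    import numpy as np
    from scipy.optimize import nnls

    wavelengths = np.linspace(600, 720, 121)             # nm

    def gaussian(center, width):
        return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

    basis = np.column_stack([gaussian(635, 10),           # PpIX main peak (illustrative)
                             gaussian(675, 15),           # photoproduct (illustrative)
                             gaussian(650, 40)])          # broad autofluorescence

    true_conc = np.array([2.0, 0.5, 1.0])
    measured = basis @ true_conc + 0.01 * np.random.randn(wavelengths.size)

    concentrations, _ = nnls(basis, measured)              # per-component contributions
    ```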

  20. Combined fluorescence and reflectance spectroscopy for in vivo quantification of cancer biomarkers in low- and high-grade glioma surgery

    NASA Astrophysics Data System (ADS)

    Valdés, Pablo A.; Kim, Anthony; Leblond, Frederic; Conde, Olga M.; Harris, Brent T.; Paulsen, Keith D.; Wilson, Brian C.; Roberts, David W.

    2011-11-01

    Biomarkers are indicators of biological processes and hold promise for the diagnosis and treatment of disease. Gliomas represent a heterogeneous group of brain tumors with marked intra- and inter-tumor variability. The extent of surgical resection is a significant factor influencing post-surgical recurrence and prognosis. Here, we used fluorescence and reflectance spectral signatures for in vivo quantification of multiple biomarkers during glioma surgery, with fluorescence contrast provided by exogenously-induced protoporphyrin IX (PpIX) following administration of 5-aminolevulinic acid. We performed light-transport modeling to quantify multiple biomarkers indicative of tumor biological processes, including the local concentration of PpIX and associated photoproducts, total hemoglobin concentration, oxygen saturation, and optical scattering parameters. We developed a diagnostic algorithm for intra-operative tissue delineation that accounts for the combined tumor-specific predictive capabilities of these quantitative biomarkers. Tumor tissue delineation achieved accuracies of up to 94% (specificity = 94%, sensitivity = 94%) across a range of glioma histologies beyond current state-of-the-art optical approaches, including state-of-the-art fluorescence image guidance. This multiple biomarker strategy opens the door to optical methods for surgical guidance that use quantification of well-established neoplastic processes. Future work would seek to validate the predictive power of this proof-of-concept study in a separate larger cohort of patients.

  1. Introduction to acoustic emission

    NASA Technical Reports Server (NTRS)

    Possa, G.

    1983-01-01

    Typical acoustic emission signal characteristics are described and techniques which localize the signal source by processing the acoustic delay data from multiple sensors are discussed. The instrumentation, which includes sensors, amplifiers, pulse counters, a minicomputer and output devices is examined. Applications are reviewed.
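    A minimal sketch of the source-localization technique described above, estimating a source position from arrival-time delays at multiple sensors by least squares, is shown below; the sensor layout, wave speed, and delays are illustrative assumptions.

    ```python
    # Hedged sketch: locate an acoustic-emission source from arrival-time
    # differences (relative to the first sensor) by nonlinear least squares.
    import numpy as np
    from scipy.optimize import least_squares

    sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # m
    c = 5000.0                                     # assumed wave speed, m/s

    def residuals(p, arrival_deltas):
        """Mismatch between predicted and measured delays relative to sensor 0."""
        d = np.linalg.norm(sensors - p, axis=1)
        predicted = (d - d[0]) / c
        return predicted[1:] - arrival_deltas

    true_source = np.array([0.3, 0.7])
    d = np.linalg.norm(sensors - true_source, axis=1)
    measured_deltas = (d - d[0])[1:] / c           # synthetic measured delays

    fit = least_squares(residuals, x0=[0.5, 0.5], args=(measured_deltas,))
    print(fit.x)                                   # estimated source position
    ```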

  2. Dynamo Catalogue: Geometrical tools and data management for particle picking in subtomogram averaging of cryo-electron tomograms.

    PubMed

    Castaño-Díez, Daniel; Kudryashev, Mikhail; Stahlberg, Henning

    2017-02-01

    Cryo electron tomography allows macromolecular complexes within vitrified, intact, thin cells or sections thereof to be visualized, and structural analysis to be performed in situ by averaging over multiple copies of the same molecules. Image processing for subtomogram averaging is specific and cumbersome, due to the large amount of data, its three dimensional nature, and its anisotropic resolution. Here, we streamline data processing for subtomogram averaging by introducing an archiving system, Dynamo Catalogue. This system manages tomographic data from multiple tomograms and allows visual feedback during all processing steps, including particle picking, extraction, alignment and classification. The file structure of a processing project includes logfiles of performed operations, and can be backed up and shared between users. Command line commands, database queries and a set of GUIs give the user versatile control over the process. Here, we introduce a set of geometric tools that streamline particle picking from simple (filaments, spheres, tubes, vesicles) and complex geometries (arbitrary 2D surfaces, rare instances on proteins with geometric restrictions, and 2D and 3D crystals). Advanced functionality, such as manual alignment and subboxing, is useful when initial templates are generated for alignment and for project customization. Dynamo Catalogue is part of the open source package Dynamo and includes tools to ensure format compatibility with the subtomogram averaging functionalities of other packages, such as Jsubtomo, PyTom, PEET, EMAN2, XMIPP and Relion. Copyright © 2016. Published by Elsevier Inc.

  3. Prospective relations between family conflict and adolescent maladjustment: security in the family system as a mediating process.

    PubMed

    Cummings, E Mark; Koss, Kalsea J; Davies, Patrick T

    2015-04-01

    Conflict in specific family systems (e.g., interparental, parent-child) has been implicated in the development of a host of adjustment problems in adolescence, but little is known about the impact of family conflict involving multiple family systems. Furthermore, questions remain about the effects of family conflict on symptoms of specific disorders and adjustment problems and the processes mediating these effects. The present study prospectively examines the impact of family conflict and emotional security about the family system on adolescent symptoms of specific disorders and adjustment problems, including the development of symptoms of anxiety, depression, conduct problems, and peer problems. Security in the family system was examined as a mediator of these relations. Participants included 295 mother-father-adolescent families (149 girls) participating across three annual time points (grades 7-9). Including auto-regressive controls for initial levels of emotional insecurity and multiple adjustment problems (T1), higher-order emotional insecurity about the family system (T2) mediated relations between T1 family conflict and T3 peer problems, anxiety, and depressive symptoms. Further analyses supported specific patterns of emotional security/insecurity (i.e., security, disengagement, preoccupation) as mediators between family conflict and specific domains of adolescent adjustment. Family conflict was thus found to prospectively predict the development of symptoms of multiple specific adjustment problems, including symptoms of depression, anxiety, conduct problems, and peer problems, by elevating adolescents' emotional insecurity about the family system. The clinical implications of these findings are considered.

  4. Exploring asynchronous brainstorming in large groups: a field comparison of serial and parallel subgroups.

    PubMed

    de Vreede, Gert-Jan; Briggs, Robert O; Reiter-Palmon, Roni

    2010-04-01

    The aim of this study was to compare the results of two different modes of using multiple groups (instead of one large group) to identify problems and develop solutions. Many of the complex problems facing organizations today require the use of very large groups or collaborations of groups from multiple organizations. There are many logistical problems associated with the use of such large groups, including the ability to bring everyone together at the same time and location. A field study involving two different organizations compared the productivity and satisfaction of groups using two approaches: (a) multiple small groups, each completing the entire process from start to end and combining the results at the end (parallel mode); and (b) multiple subgroups, each building on the work provided by previous subgroups (serial mode). Groups using the serial mode produced more elaborations compared with parallel groups, whereas parallel groups produced more unique ideas compared with serial groups. No significant differences were found related to satisfaction with process and outcomes between the two modes. The preferred mode depends on the type of task facing the group. Parallel groups are more suited for tasks for which a variety of new ideas are needed, whereas serial groups are best suited when elaboration and in-depth thinking on the solution are required. Results of this research can guide the development of facilitated sessions of large groups or "teams of teams."

  5. Coordinating an Autonomous Earth-Observing Sensorweb

    NASA Technical Reports Server (NTRS)

    Sherwood, Robert; Cichy, Benjamin; Tran, Daniel; Chien, Steve; Rabideau, Gregg; Davies, Ashley; Castano, Rebecca; Frye, Stuart; Mandl, Dan; Shulman, Seth

    2006-01-01

    A system of software has been developed to coordinate the operation of an autonomous Earth-observing sensorweb. Sensorwebs are collections of sensor units scattered over large regions to gather data on spatial and temporal patterns of physical, chemical, or biological phenomena in those regions. Each sensor unit is a node in a data-gathering/data-communication network that spans a region of interest. In this case, the region is the entire Earth, and the sensorweb includes multiple terrestrial and spaceborne sensor units. In addition to acquiring data for scientific study, the sensorweb is required to give timely notice of volcanic eruptions, floods, and other hazardous natural events. In keeping with the inherently modular nature of the sensory, communication, and data-processing hardware, the software features a flexible, modular architecture that facilitates expansion of the network, customization of conditions that trigger alarms of hazardous natural events, and customization of responses to alarms. The software facilitates access to multiple sources of data on an event of scientific interest, enables coordinated use of multiple sensors in rapid reaction to detection of an event, and facilitates the tracking of spacecraft operations, including tracking of the acquisition, processing, and downlinking of requested data.

  6. Generalized Landauer equation: Absorption-controlled diffusion processes

    NASA Astrophysics Data System (ADS)

    Godoy, Salvador; García-Colín, L. S.; Micenmacher, Victor

    1999-05-01

    The exact expression of the one-dimensional Boltzmann multiple-scattering coefficients, for the passage of particles through a slab of a given material, is obtained in terms of the single-scattering cross section of the material, including absorption. The remarkable feature of the result is that for multiple scattering in a metal, free from absorption, one recovers the well-known Landauer result for conduction electrons. In the case of particles, such as neutrons, moving through a weakly absorbing medium, Landauer's formula is modified due to the absorption cross section. For photons, in a strongly absorbing medium, one recovers the Lambert-Beer equation. In this latter case one may therefore speak of absorption-controlled diffusive processes.
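
    For reference, the two limiting cases recovered by the general slab result can be written in their familiar textbook forms; the standard expressions below are quoted only for orientation and are not the generalized formula derived in the paper.

        % Landauer's result for conduction electrons in a slab without absorption,
        % expressed through the total transmission probability T of the slab:
        G = \frac{2e^{2}}{h}\,\frac{T}{1 - T}

        % Lambert-Beer attenuation recovered for photons in a strongly absorbing medium,
        % with absorption coefficient \mu and slab thickness x:
        I(x) = I_{0}\, e^{-\mu x}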

  7. Calculating with light using a chip-scale all-optical abacus.

    PubMed

    Feldmann, J; Stegmaier, M; Gruhler, N; Ríos, C; Bhaskaran, H; Wright, C D; Pernice, W H P

    2017-11-02

    Machines that simultaneously process and store multistate data at one and the same location can provide a new class of fast, powerful and efficient general-purpose computers. We demonstrate the central element of an all-optical calculator, a photonic abacus, which provides multistate compute-and-store operation by integrating functional phase-change materials with nanophotonic chips. With picosecond optical pulses we perform the fundamental arithmetic operations of addition, subtraction, multiplication, and division, including a carryover into multiple cells. This basic processing unit is embedded into a scalable phase-change photonic network and addressed optically through a two-pulse random access scheme. Our framework provides first steps towards light-based non-von Neumann arithmetic.
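
    The carry-over into multiple cells mentioned above behaves like ordinary positional arithmetic: each phase-change cell holds one base-N digit, and when an operation saturates a cell the overflow is carried into the next one. A minimal software analogue of that bookkeeping (the base and cell contents are illustrative, not values from the device) is:

        def add_to_abacus(cells, value, base=10):
            """Add an integer to a little-endian list of base-`base` digit cells,
            propagating carries into higher cells (extending the list if needed)."""
            carry = value
            i = 0
            while carry:
                if i == len(cells):
                    cells.append(0)
                total = cells[i] + carry
                cells[i] = total % base     # digit kept in this cell
                carry = total // base       # overflow carried to the next cell
                i += 1
            return cells

        # 97 + 8 with decimal cells: [7, 9] -> [5, 0, 1], i.e. 105
        print(add_to_abacus([7, 9], 8))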

  8. Baseband-processed SS-TDMA communication system architecture and design concepts

    NASA Technical Reports Server (NTRS)

    Attwood, S.; Sabourin, D.

    1982-01-01

    The architecture and system design for a commercial satellite communications system planned for the 1990's was developed by Motorola for NASA's Lewis Research Center. The system provides data communications between individual users via trunking and customer premises service terminals utilizing a central switching satellite operating in a time-division multiple-access (TDMA) mode. The major elements of the design incorporating baseband processing include: demand-assigned multiple access reservation protocol, spectral utilization, system synchronization, modulation technique and forward error control implementation. Motorola's baseband processor design, which is being proven in a proof-of-concept advanced technology development, will perform data regeneration and message routing for individual users on-board the spacecraft.

  9. Accurate Micro-Tool Manufacturing by Iterative Pulsed-Laser Ablation

    NASA Astrophysics Data System (ADS)

    Warhanek, Maximilian; Mayr, Josef; Dörig, Christian; Wegener, Konrad

    2017-12-01

    Iterative processing solutions, including multiple cycles of material removal and measurement, are capable of achieving higher geometric accuracy by compensating for most deviations manifesting directly on the workpiece. Remaining error sources are the measurement uncertainty and the repeatability of the material-removal process including clamping errors. Due to the lack of processing forces, process fluids and wear, pulsed-laser ablation has proven high repeatability and can be realized directly on a measuring machine. This work takes advantage of this possibility by implementing an iterative, laser-based correction process for profile deviations registered directly on an optical measurement machine. This way efficient iterative processing is enabled, which is precise, applicable for all tool materials including diamond and eliminates clamping errors. The concept is proven by a prototypical implementation on an industrial tool measurement machine and a nanosecond fibre laser. A number of measurements are performed on both the machine and the processed workpieces. Results show production deviations within 2 μm diameter tolerance.
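
    The iterative cycle described above can be summarised as a simple closed loop: measure the profile deviation on the measuring machine, ablate a fraction of it, and repeat until the deviation falls within tolerance. The sketch below is a conceptual illustration only; the gain, tolerance, and function names are hypothetical rather than taken from the implementation.

        def iterative_correction(measure, ablate, tolerance_um=2.0, gain=0.8, max_cycles=10):
            """Generic measure/ablate loop for iterative pulsed-laser correction.

            measure()     -> current signed profile deviation in micrometres
            ablate(depth) -> removes approximately `depth` micrometres of material
            gain          -> fraction of the measured deviation targeted per cycle
                             (kept below 1 to avoid overshoot from process scatter)
            """
            for cycle in range(max_cycles):
                deviation = measure()
                if abs(deviation) <= tolerance_um:
                    return cycle, deviation      # converged within tolerance
                ablate(gain * deviation)         # remove part of the measured excess
            return max_cycles, measure()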

  10. Exploring change in a group-based psychological intervention for multiple sclerosis patients.

    PubMed

    Borghi, Martina; Bonino, Silvia; Graziano, Federica; Calandri, Emanuela

    2018-07-01

    The study is focused on a group-based cognitive behavioral intervention aimed at promoting the quality of life and psychological well-being of multiple sclerosis patients. The study investigates how the group intervention promoted change among participants and fostered their adjustment to the illness. The intervention involved six groups of patients (a total of 41 patients) and included four consecutive sessions and a 6-month follow-up. To explore change, verbatim transcripts of the intervention sessions were analyzed using a mixed-methods content analysis with qualitative data combined with descriptive statistics. The categories of resistance and openness to change were used to describe the process of change. Resistance and openness to change coexisted during the intervention. Only in the first session did resistance prevail over openness to change; thereafter, openness to change gradually increased and stabilized over time, and openness to change was then always stronger than resistance. The study builds on previous research on the effectiveness of group-based psychological interventions for multiple sclerosis patients and gives methodological and clinical suggestions to health care professionals working with multiple sclerosis patients. Implications for rehabilitation: The study suggests that a group-based cognitive behavioral intervention for multiple sclerosis patients focused on the promotion of identity redefinition, a sense of coherence and self-efficacy in dealing with multiple sclerosis fosters the process of change and may be effective in promoting patients' adjustment to their illness. Health care professionals leading group-based psychological interventions for multiple sclerosis patients should be aware that resistance and openness to change coexist in the process of change. The study suggests that the duration of the intervention is a crucial factor: a minimum of three sessions appears to be necessary for group participants to develop greater openness to change and follow-up sessions should be implemented to maintain positive changes among participants. The use of qualitative instruments to evaluate group interventions captures the complexity of processes and gives useful indications to health professionals to improve rehabilitation programs.

  11. Subvocal articulatory rehearsal during verbal working memory in multiple sclerosis.

    PubMed

    Sweet, Lawrence H; Vanderhill, Susan D; Jerskey, Beth A; Gordon, Norman M; Paul, Robert H; Cohen, Ronald A

    2010-10-01

    This study was designed to examine verbal working memory (VWM) components among multiple sclerosis (MS) patients and determine the influence of information processing speed. Of two frequently studied VWM sub-components, subvocal rehearsal was expected to be more affected by MS than short-term memory buffering. Furthermore, worse subvocal rehearsal was predicted to be specifically related to slower cognitive processing. Fifteen MS patients were administered a neuropsychological battery assessing VWM, processing speed, mood, fatigue, and disability. Participants performed a 2-Back VWM task with modified nested conditions designed to increase subvocal rehearsal (via inter-stimulus interval) and short-term memory buffering demands (via phonological similarity). Performance during these 2-Back conditions did not significantly differ and both exhibited strong positive correlations with disability. However, only scores on the subvocal rehearsal 2-Back were significantly related to performance on the remaining test battery, including processing speed and depressive symptoms. Findings suggest that performance during increased subvocal rehearsal demands is specifically influenced by cognitive processing speed and depressive symptoms.

  12. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
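
    The reduced-order models mentioned above, 'trained' on CFD results and then evaluated directly inside a flowsheet, can be approximated by a simple regression surrogate. The sketch below fits a quadratic response surface to hypothetical CFD samples using only NumPy; the variable names and data are illustrative and are not part of APECS.

        import numpy as np

        # Hypothetical CFD training data: inlet velocity [m/s] vs. computed outlet temperature [K]
        inlet_velocity = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        outlet_temp = np.array([650.0, 615.0, 590.0, 572.0, 560.0])

        # "Train" the ROM: least-squares fit of a quadratic response surface
        rom_coeffs = np.polyfit(inlet_velocity, outlet_temp, deg=2)

        def rom_outlet_temp(v):
            """Fast surrogate for the CFD model, cheap enough to call from a flowsheet solver."""
            return np.polyval(rom_coeffs, v)

        print(rom_outlet_temp(2.5))   # interpolated prediction, no CFD run required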

  13. Method for Fabricating Composite Structures Using Pultrusion Processing

    NASA Technical Reports Server (NTRS)

    Farley, Gary L. (Inventor)

    2000-01-01

    A method for fabricating composite structures at a low-cost, moderate-to-high production rate. A first embodiment of the method includes employing a continuous press forming fabrication process. A second embodiment of the method includes employing a pultrusion process for obtaining composite structures. The methods include coating yarns with matrix material, weaving the yarn into fabric to produce a continuous fabric supply and feeding multiple layers of net-shaped fabrics having optimally oriented fibers into a debulking tool to form an undebulked preform. The continuous press forming fabrication process includes partially debulking the preform, cutting the partially debulked preform and debulking the partially debulked preform to form a net-shape. An electron-beam or similar technique then cures the structure. The pultrusion fabric process includes feeding the undebulked preform into a heated die and gradually debulking the undebulked preform. The undebulked preform in the heated die changes dimension until a desired cross-sectional dimension is achieved. This process further includes obtaining a net-shaped infiltrated uncured preform, cutting the uncured preform to a desired length and electron-beam curing (or similar technique) the uncured preform. These fabrication methods produce superior structures formed at higher production rates, resulting in lower cost and high structural performance.

  14. Method for Fabricating Composite Structures Using Pultrusion Processing

    NASA Technical Reports Server (NTRS)

    Farley, Gary L. (Inventor)

    2000-01-01

    A method for fabricating composite structures at a low-cost, moderate-to-high production rate. A first embodiment of the method includes employing a continuous press forming fabrication process. A second embodiment of the method includes employing a pultrusion process for obtaining composite structures. The methods include coating yarns with matrix material, weaving the yarn into fabric to produce a continuous fabric supply and feeding multiple layers of net-shaped fabrics having optimally oriented fibers into a debulking tool to form an undebulked preform. The continuous press forming fabrication process includes partially debulking the preform, cutting the partially debulked preform and debulking the partially debulked preform to form a net-shape. An electron-beam or similar technique then cures the structure. The pultrusion fabric process includes feeding the undebulked preform into a heated die and gradually debulking the undebulked preform. The undebulked preform in the heated die changes dimension until a desired cross-sectional dimension is achieved. This process further includes obtaining a net-shaped infiltrated uncured preform, cutting the uncured preform to a desired length and electron-beam curing (or similar technique) the uncured preform. These fabrication methods produce superior structures formed at higher production rates, resulting in lower cost and high structural performance.

  15. A Prescription for List-Mode Data Processing Conventions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beddingfield, David H.; Swinhoe, Martyn Thomas; Huszti, Jozsef

    There are a variety of algorithmic approaches available to process list-mode pulse streams to produce multiplicity histograms for subsequent analysis. In the development of the INCC v6.0 code to include the processing of this data format, we have noted inconsistencies in the “processed time” between the various approaches. The processed time, tp, is the time interval over which the recorded pulses are analyzed to construct multiplicity histograms. This is the time interval that is used to convert measured counts into count rates. The observed inconsistencies in tp impact the reported count rate information and the determination of the error values associated with the derived singles, doubles, and triples counting rates. This issue is particularly important in low count-rate environments. In this report we will present a prescription for the processing of list-mode counting data that produces values that are both correct and consistent with traditional shift-register technologies. It is our objective to define conventions for list-mode data processing to ensure that the results are physically valid and numerically aligned with the results from shift-register electronics.
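
    To make the role of the processed time tp concrete: once a list-mode stream has been histogrammed, every reported rate is a count divided by tp, so an inconsistent tp between algorithms shifts both the rates and their uncertainties. The sketch below is a generic illustration of building a multiplicity histogram with a fixed coincidence gate and converting counts to rates; it is not the INCC v6.0 algorithm, and the gate width and data are invented.

        import numpy as np

        def multiplicity_histogram(timestamps_us, gate_us=64.0):
            """For each trigger pulse, count the pulses that follow it within the
            coincidence gate and return the histogram of those multiplicities."""
            t = np.sort(np.asarray(timestamps_us, dtype=float))
            # pulses (excluding the trigger) in the window [t_i, t_i + gate)
            mult = np.searchsorted(t, t + gate_us, side="left") - np.arange(1, len(t) + 1)
            return np.bincount(mult)

        timestamps = np.sort(np.random.uniform(0.0, 1.0e6, size=5000))   # ~1 s of pulses, in microseconds
        hist = multiplicity_histogram(timestamps)
        tp_s = (timestamps[-1] - timestamps[0]) * 1e-6                   # processed time in seconds
        singles_rate = len(timestamps) / tp_s                            # reported rate depends directly on tp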

  16. Requirements Analysis for Large Ada Programs: Lessons Learned on CCPDS- R

    DTIC Science & Technology

    1989-12-01

    ...when the design had matured and the SRS role was to be the tester's contract. This approach was not optimal from the formal testing standpoint... on the software development process is the necessity to include sufficient testing... CPU processing load constraints primarily affect algorithm allocations, and timing requirements are by-products of the software design process when multiple CSCIs are executed within...

  17. Should Secondary Schools Buy Local Area Networks?

    ERIC Educational Resources Information Center

    Hyde, Hartley

    1986-01-01

    The advantages of microcomputer networks include resource sharing, multiple user communications, and integrating data processing and office automation. This article nonetheless favors stand-alone computers for Australian secondary school classrooms because of unreliable hardware, software design, and copyright problems, and individual progress…

  18. Measurements of gluconeogenesis and glycogenolysis: A methodological review

    USDA-ARS?s Scientific Manuscript database

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to deter...

  19. The micronutrient genomics project: a community-driven knowledge base for micronutrient research

    USDA-ARS?s Scientific Manuscript database

    Micronutrients influence multiple metabolic pathways including oxidative and inflammatory processes. Optimum micronutrient supply is important for the maintenance of homeostasis in metabolism and, ultimately, for maintaining good health. With advances in systems biology and genomics technologies, it...

  20. Roundhouse Diagrams.

    ERIC Educational Resources Information Center

    Ward, Robin E.; Wandersee, James

    2000-01-01

    Students must understand key concepts through reasoning, searching out related concepts, and making connections within multiple systems to learn science. The Roundhouse diagram was developed to be a concise, holistic, graphic representation of a science topic, process, or activity. Includes sample Roundhouse diagrams, a diagram checklist, and…

  1. The use of process mapping in healthcare quality improvement projects.

    PubMed

    Antonacci, Grazia; Reed, Julie E; Lennox, Laura; Barlow, James

    2018-05-01

    Introduction: Process mapping provides insight into systems and processes in which improvement interventions are introduced and is seen as useful in healthcare quality improvement projects. There is little empirical evidence on the use of process mapping in healthcare practice. This study advances understanding of the benefits and success factors of process mapping within quality improvement projects. Methods: Eight quality improvement projects were purposively selected from different healthcare settings within the UK's National Health Service. Data were gathered from multiple data sources, including interviews exploring participants' experience of using process mapping in their projects and perceptions of benefits and challenges related to its use. These were analysed using inductive analysis. Results: Eight key benefits related to process mapping use were reported by participants (gathering a shared understanding of the reality; identifying improvement opportunities; engaging stakeholders in the project; defining the project's objectives; monitoring project progress; learning; increased empathy; simplicity of the method) and five factors related to successful process mapping exercises (simple and appropriate visual representation, information gathered from multiple stakeholders, facilitator's experience and soft skills, basic training, iterative use of process mapping throughout the project). Conclusions: Findings highlight the benefits and versatility of process mapping and provide practical suggestions to improve its use in practice.

  2. Two-step emulsification process for water-in-oil-in-water multiple emulsions stabilized by lamellar liquid crystals.

    PubMed

    Ito, Toshifumi; Tsuji, Yukitaka; Aramaki, Kenji; Tonooka, Noriaki

    2012-01-01

    Multiple emulsions, also called complex emulsions or multiphase emulsions, include water-in-oil-in-water (W/O/W)-type and oil-in-water-in-oil (O/W/O)-type emulsions. W/O/W-type multiple emulsions, obtained by utilizing lamellar liquid crystal with a layer structure showing optical anisotropy at the periphery of emulsion droplets, are superior in stability to O/W/O-type emulsions. In this study, we investigated a two-step emulsification process for a W/O/W-type multiple emulsion utilizing liquid crystal emulsification. We found that a W/O/W-type multiple emulsion containing lamellar liquid crystal can be prepared by mixing a W/O-type emulsion (prepared by primary emulsification) with a lamellar liquid crystal obtained from poly(oxyethylene) stearyl ether, cetyl alcohol, and water, and by dispersing and emulsifying the mixture in an outer aqueous phase. When poly(oxyethylene) stearyl ether and cetyl alcohol are each used in a given amount and the amount of water added is varied from 0 to 15 g (total amount of emulsion, 100 g), a W/O/W-type multiple emulsion is efficiently prepared. When the W/O/W-type multiple emulsion was held in a thermostatic bath at 25°C, the droplet size distribution showed no change 0, 30, or 60 days after preparation. Moreover, the W/O/W-type multiple emulsion strongly encapsulated Uranine in the inner aqueous phase as compared with emulsions prepared by one-step emulsification.

  3. Neural network-based multiple robot simultaneous localization and mapping.

    PubMed

    Saeedi, Sajad; Paull, Liam; Trentini, Michael; Li, Howard

    2011-12-01

    In this paper, a decentralized platform for simultaneous localization and mapping (SLAM) with multiple robots is developed. Each robot performs single robot view-based SLAM using an extended Kalman filter to fuse data from two encoders and a laser ranger. To extend this approach to multiple robot SLAM, a novel occupancy grid map fusion algorithm is proposed. Map fusion is achieved through a multistep process that includes image preprocessing, map learning (clustering) using neural networks, relative orientation extraction using norm histogram cross correlation and a Radon transform, relative translation extraction using matching norm vectors, and then verification of the results. The proposed map learning method is a process based on the self-organizing map. In the learning phase, the obstacles of the map are learned by clustering the occupied cells of the map into clusters. The learning is an unsupervised process which can be done on the fly without any need to have output training patterns. The clusters represent the spatial form of the map and make further analyses of the map easier and faster. Also, clusters can be interpreted as features extracted from the occupancy grid map so the map fusion problem becomes a task of matching features. Results of the experiments from tests performed on a real environment with multiple robots prove the effectiveness of the proposed solution.
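
    The map-learning step described above, clustering the occupied cells of an occupancy grid with a self-organizing map, can be sketched with a tiny NumPy-only SOM. This is a generic illustration of the idea rather than the authors' implementation; the map size, learning rate, and iteration count are arbitrary choices.

        import numpy as np

        def som_cluster(points, n_nodes=16, iters=2000, lr0=0.5, sigma0=3.0, seed=0):
            """Cluster 2-D occupied-cell coordinates with a 1-D self-organizing map;
            returns the node (cluster centre) positions after unsupervised training."""
            rng = np.random.default_rng(seed)
            nodes = points[rng.choice(len(points), n_nodes)].astype(float)
            for t in range(iters):
                x = points[rng.integers(len(points))]
                lr = lr0 * np.exp(-t / iters)                            # decaying learning rate
                sigma = sigma0 * np.exp(-t / iters)                      # shrinking neighbourhood
                best = np.argmin(np.linalg.norm(nodes - x, axis=1))      # best-matching unit
                dist = np.abs(np.arange(n_nodes) - best)                 # distance along the map
                h = np.exp(-(dist ** 2) / (2 * sigma ** 2))              # neighbourhood weighting
                nodes += lr * h[:, None] * (x - nodes)                   # pull nodes toward the sample
            return nodes

        # occupied = np.argwhere(occupancy_grid > 0.65)   # hypothetical occupied cells
        # centres = som_cluster(occupied.astype(float))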

  4. Quantitative assessment of cervical vertebral maturation using cone beam computed tomography in Korean girls.

    PubMed

    Byun, Bo-Ram; Kim, Yong-Il; Yamaguchi, Tetsutaro; Maki, Koutaro; Son, Woo-Sung

    2015-01-01

    This study aimed to examine the correlation between skeletal maturation status and parameters from the odontoid process/body of the second cervical vertebra and the bodies of the third and fourth cervical vertebrae, and to build multiple regression models for estimating skeletal maturation status in Korean girls. Hand-wrist radiographs and cone beam computed tomography (CBCT) images were obtained from 74 Korean girls (6-18 years of age). CBCT-generated cervical vertebral maturation (CVM) was used to demarcate the odontoid process and the body of the second cervical vertebra, based on the dentocentral synchondrosis. Correlation coefficient analysis and multiple linear regression analysis were used for each parameter of the cervical vertebrae (P < 0.05). Forty-seven of 64 parameters from CBCT-generated CVM (independent variables) exhibited statistically significant correlations (P < 0.05). The multiple regression model with the greatest R² had six parameters (PH2/W2, UW2/W2, (OH+AH2)/LW2, UW3/LW3, D3, and H4/W4) as independent variables, each with a variance inflation factor (VIF) of <2. CBCT-generated CVM was able to include parameters from the second cervical vertebral body and odontoid process, respectively, in the multiple regression models. This suggests that quantitative analysis might be used to estimate skeletal maturation status.
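
    The model-building step reported above, a multiple linear regression whose predictors are screened with a variance inflation factor below 2, can be reproduced generically with statsmodels. The column names stand in for the vertebral parameters and the data are simulated, not the study's measurements.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.stats.outliers_influence import variance_inflation_factor

        # Simulated predictor table (vertebral ratios) and skeletal-maturation score
        rng = np.random.default_rng(1)
        X = pd.DataFrame(rng.normal(size=(74, 3)), columns=["PH2_W2", "UW2_W2", "H4_W4"])
        y = 2.0 * X["PH2_W2"] + 1.0 * X["H4_W4"] + rng.normal(scale=0.5, size=74)

        Xc = sm.add_constant(X)
        model = sm.OLS(y, Xc).fit()
        print(model.rsquared)

        # Variance inflation factors for the non-constant predictors (screen for VIF < 2)
        vif = [variance_inflation_factor(Xc.values, i) for i in range(1, Xc.shape[1])]
        print(dict(zip(X.columns, vif)))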

  5. Pedagogical Affordances of Multiple External Representations in Scientific Processes

    NASA Astrophysics Data System (ADS)

    Wu, Hsin-Kai; Puntambekar, Sadhana

    2012-12-01

    Multiple external representations (MERs) have been widely used in science teaching and learning. Theories such as dual coding theory and cognitive flexibility theory have been developed to explain why the use of MERs is beneficial to learning, but they do not provide much information on pedagogical issues such as how and in what conditions MERs could be introduced and used to support students' engagement in scientific processes and develop competent scientific practices (e.g., asking questions, planning investigations, and analyzing data). Additionally, little is understood about complex interactions among scientific processes and affordances of MERs. Therefore, this article focuses on pedagogical affordances of MERs in learning environments that engage students in various scientific processes. By reviewing literature in science education and cognitive psychology and integrating multiple perspectives, this article aims at exploring (1) how MERs can be integrated with science processes due to their different affordances, and (2) how student learning with MERs can be scaffolded, especially in a classroom situation. We argue that pairing representations and scientific processes in a principled way based on the affordances of the representations and the goals of the activities is a powerful way to use MERs in science education. Finally, we outline types of scaffolding that could help effective use of MERs including dynamic linking, model progression, support in instructional materials, teacher support, and active engagement.

  6. BOREAS RSS-8 BIOME-BGC Model Simulations at Tower Flux Sites in 1994

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Kimball, John

    2000-01-01

    BIOME-BGC is a general ecosystem process model designed to simulate biogeochemical and hydrologic processes across multiple scales (Running and Hunt, 1993). In this investigation, BIOME-BGC was used to estimate daily water and carbon budgets for the BOREAS tower flux sites for 1994. Carbon variables estimated by the model include gross primary production (i.e., net photosynthesis), maintenance and heterotrophic respiration, net primary production, and net ecosystem carbon exchange. Hydrologic variables estimated by the model include snowcover, evaporation, transpiration, evapotranspiration, soil moisture, and outflow. The information provided by the investigation includes input initialization and model output files for various sites in tabular ASCII format.

  7. Pollination and seed dispersal are the most threatened processes of plant regeneration

    NASA Astrophysics Data System (ADS)

    Neuschulz, Eike Lena; Mueller, Thomas; Schleuning, Matthias; Böhning-Gaese, Katrin

    2016-07-01

    Plant regeneration is essential for maintaining forest biodiversity and ecosystem functioning, which are globally threatened by human disturbance. Here we present the first integrative meta-analysis on how forest disturbance affects multiple ecological processes of plant regeneration including pollination, seed dispersal, seed predation, recruitment and herbivory. We analysed 408 pairwise comparisons of these processes between near-natural and disturbed forests. Human impacts overall reduced plant regeneration. Importantly, only processes early in the regeneration cycle that often depend on plant-animal interactions, i.e. pollination and seed dispersal, were negatively affected. Later processes, i.e. seed predation, recruitment and herbivory, showed overall no significant response to human disturbance. Conserving pollination and seed dispersal, including the animals that provide these services to plants, should become a priority in forest conservation efforts globally.

  8. AgMIP: Next Generation Models and Assessments

    NASA Astrophysics Data System (ADS)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6 that involves the key modeling groups from around the world including North America, Europe, South America, Sub-Saharan Africa, South Asia, East Asia, and Australia and Oceania. This community process will lead to mutually agreed protocols for coordinated global and regional assessments.

  9. Integrated Main Propulsion System Performance Reconstruction Process/Models

    NASA Technical Reports Server (NTRS)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  10. Measuring Radiofrequency and Microwave Radiation from Varying Signal Strengths

    NASA Technical Reports Server (NTRS)

    Davis, Bette; Gaul, W. C.

    2007-01-01

    This viewgraph presentation discusses the process of measuring radiofrequency and microwave radiation from various signal strengths. The topics include: 1) Limits and Guidelines; 2) Typical Variable Standard (IEEE) Frequency Dependent; 3) FCC Standard 47 CFR 1.1310; 4) Compliance Follows Unity Rule; 5) Multiple Sources Contribute; 6) Types of RF Signals; 7) Interfering Radiations; 8) Different Frequencies Different Powers; 9) Power Summing - Peak Power; 10) Contribution from Various Single Sources; 11) Total Power from Multiple Sources; 12) Are You Out of Compliance?; and 13) In Compliance.
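
    The "unity rule" compliance check listed among the topics sums each source's measured exposure as a fraction of its frequency-dependent limit; with multiple simultaneous sources the total fraction must not exceed 1. The sketch below illustrates the rule with made-up readings and limits, not FCC or IEEE values.

        def unity_rule_compliant(sources):
            """Each source is (measured_power_density, exposure_limit) in the same units.
            With multiple sources at different frequencies, the sum of measured/limit
            fractions must be <= 1.0 for the environment to be in compliance."""
            total_fraction = sum(measured / limit for measured, limit in sources)
            return total_fraction <= 1.0, total_fraction

        # Hypothetical readings in mW/cm^2 paired with their respective limits
        sources = [(0.2, 1.0), (0.5, 5.0), (0.05, 0.2)]
        print(unity_rule_compliant(sources))   # (True, 0.55)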

  11. Multiple exciton generation in chiral carbon nanotubes: Density functional theory based computation

    NASA Astrophysics Data System (ADS)

    Kryjevski, Andrei; Mihaylov, Deyan; Kilina, Svetlana; Kilin, Dmitri

    2017-10-01

    We use a Boltzmann transport equation (BE) to study time evolution of a photo-excited state in a nanoparticle including phonon-mediated exciton relaxation and the multiple exciton generation (MEG) processes, such as exciton-to-biexciton multiplication and biexciton-to-exciton recombination. BE collision integrals are computed using Kadanoff-Baym-Keldysh many-body perturbation theory based on density functional theory simulations, including exciton effects. We compute internal quantum efficiency (QE), which is the number of excitons generated from an absorbed photon in the course of the relaxation. We apply this approach to chiral single-wall carbon nanotubes (SWCNTs), such as (6,2) and (6,5). We predict efficient MEG in the (6,2) and (6,5) SWCNTs within the solar spectrum range starting at the 2Eg energy threshold and with QE reaching ˜1.6 at about 3Eg, where Eg is the electronic gap.

  12. Multiple exciton generation in chiral carbon nanotubes: Density functional theory based computation.

    PubMed

    Kryjevski, Andrei; Mihaylov, Deyan; Kilina, Svetlana; Kilin, Dmitri

    2017-10-21

    We use a Boltzmann transport equation (BE) to study time evolution of a photo-excited state in a nanoparticle including phonon-mediated exciton relaxation and the multiple exciton generation (MEG) processes, such as exciton-to-biexciton multiplication and biexciton-to-exciton recombination. BE collision integrals are computed using Kadanoff-Baym-Keldysh many-body perturbation theory based on density functional theory simulations, including exciton effects. We compute internal quantum efficiency (QE), which is the number of excitons generated from an absorbed photon in the course of the relaxation. We apply this approach to chiral single-wall carbon nanotubes (SWCNTs), such as (6,2) and (6,5). We predict efficient MEG in the (6,2) and (6,5) SWCNTs within the solar spectrum range starting at the 2Eg energy threshold and with QE reaching ∼1.6 at about 3Eg, where Eg is the electronic gap.

  13. Information Retrieval: A Sequential Learning Process.

    ERIC Educational Resources Information Center

    Bookstein, Abraham

    1983-01-01

    Presents decision-theoretic models which intrinsically include retrieval of multiple documents whereby system responds to request by presenting documents to patron in sequence, gathering feedback, and using information to modify future retrievals. Document independence model, set retrieval model, sequential retrieval model, learning model,…

  14. Organotypic three-dimensional culture model of mesenchymal and epithelial cells to examine tissue fusion events.

    EPA Science Inventory

    Tissue fusion during early mammalian development requires coordination of multiple cell types, the extracellular matrix, and complex signaling pathways. Fusion events during processes including heart development, neural tube closure, and palatal fusion are dependent on signaling ...

  15. IPL Processing of the Viking Orbiter Images of Mars

    NASA Technical Reports Server (NTRS)

    Ruiz, R. M.; Elliott, D. A.; Yagi, G. M.; Pomphrey, R. B.; Power, M. A.; Farrell, W., Jr.; Lorre, J. J.; Benton, W. D.; Dewar, R. E.; Cullen, L. E.

    1977-01-01

    The Viking orbiter cameras returned over 9000 images of Mars during the 6-month nominal mission. Digital image processing was required to produce products suitable for quantitative and qualitative scientific interpretation. Processing included the production of surface elevation data using computer stereophotogrammetric techniques, crater classification based on geomorphological characteristics, and the generation of color products using multiple black-and-white images recorded through spectral filters. The Image Processing Laboratory of the Jet Propulsion Laboratory was responsible for the design, development, and application of the software required to produce these 'second-order' products.
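
    The color-product step mentioned above, combining black-and-white frames recorded through different spectral filters, amounts to stacking the co-registered frames into the red, green, and blue display channels. A minimal NumPy sketch of that idea (file names and the contrast stretch are illustrative, not the IPL procedure) is:

        import numpy as np

        def color_composite(red_frame, green_frame, blue_frame):
            """Stack three co-registered single-filter images into an RGB composite,
            stretching each channel independently to the 0-1 display range."""
            channels = []
            for frame in (red_frame, green_frame, blue_frame):
                f = frame.astype(float)
                lo, hi = f.min(), f.max()
                channels.append((f - lo) / (hi - lo) if hi > lo else np.zeros_like(f))
            return np.stack(channels, axis=-1)

        # rgb = color_composite(np.load("red_filter.npy"),
        #                       np.load("green_filter.npy"),
        #                       np.load("violet_filter.npy"))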

  16. Optimizing the post-graduate institutional program evaluation process.

    PubMed

    Lypson, Monica L; Prince, Mark E P; Kasten, Steven J; Osborne, Nicholas H; Cohan, Richard H; Kowalenko, Terry; Dougherty, Paul J; Reynolds, R Kevin; Spires, M Catherine; Kozlow, Jeffrey H; Gitlin, Scott D

    2016-02-17

    Reviewing program educational efforts is an important component of postgraduate medical education program accreditation. The post-graduate review process has evolved over time to include centralized oversight based on accreditation standards. The institutional review process and the impact on participating faculty are topics not well described in the literature. We conducted multiple Plan-Do-Study-Act (PDSA) cycles to identify and implement areas for change to improve productivity in our institutional program review committee. We also conducted one focus group and six in-person interviews with 18 committee members to explore their perspectives on the committee's evolution. One author (MLL) reviewed the transcripts and performed the initial thematic coding with a PhD level research associate and identified and categorized themes. These themes were confirmed by all participating committee members upon review of a detailed summary. Emergent themes were triangulated with the University of Michigan Medical School's Admissions Executive Committee (AEC). We present an overview of adopted new practices to the educational program evaluation process at the University of Michigan Health System that includes standardization of meetings, inclusion of resident members, development of area content experts, solicitation of committed committee members, transition from paper to electronic committee materials, and focus on continuous improvement. Faculty and resident committee members identified multiple improvement areas including the ability to provide high quality reviews of training programs, personal and professional development, and improved feedback from program trainees. A standing committee that utilizes the expertise of a group of committed faculty members and which includes formal resident membership has significant advantages over ad hoc or other organizational structures for program evaluation committees.

  17. A multi-scale metrics approach to forest fragmentation for Strategic Environmental Impact Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Eunyoung, E-mail: eykim@kei.re.kr; Song, Wonkyong, E-mail: wksong79@gmail.com; Lee, Dongkun, E-mail: dklee7@snu.ac.kr

    Forests are becoming severely fragmented as a result of land development. South Korea has responded to changing community concerns about environmental issues. The nation has developed and is extending a broad range of tools for use in environmental management. Although legally mandated environmental compliance requirements in South Korea have been implemented to predict and evaluate the impacts of land-development projects, these legal instruments are often insufficient to assess the subsequent impact of development on the surrounding forests. It is especially difficult to examine impacts on multiple (e.g., regional and local) scales in detail. Forest configuration and size, including forest fragmentation by land development, are considered on a regional scale. Moreover, forest structure and composition, including biodiversity, are considered on a local scale in the Environmental Impact Assessment process. Recently, the government amended the Environmental Impact Assessment Act, including the SEA, EIA, and small-scale EIA, to require an integrated approach. Therefore, the purpose of this study was to establish an impact assessment system that minimizes the impacts of land development using an approach that is integrated across multiple scales. This study focused on forest fragmentation due to residential development and road construction sites in selected Congestion Restraint Zones (CRZs) in the Greater Seoul Area of South Korea. Based on a review of multiple-scale impacts, this paper integrates models that assess the impacts of land development on forest ecosystems. The applicability of the integrated model for assessing impacts on forest ecosystems through the SEIA process is considered. On a regional scale, it is possible to evaluate the location and size of a land-development project by considering aspects of forest fragmentation, such as the stability of the forest structure and the degree of fragmentation. On a local scale, land-development projects should consider the distances at which impacts occur in the vicinity of the forest ecosystem, and these considerations should include the impacts on forest vegetation and bird species. Impacts can be mitigated by considering the distances at which these influences occur. In particular, this paper presents an integrated environmental impact assessment system to be applied in the SEIA process. The integrated assessment system permits the assessment of the cumulative impacts of land development on multiple scales. -- Highlights: • The model is to assess the impact of forest fragmentation across multiple scales. • The paper suggests the type of forest fragmentation on a regional scale. • The type can be used to evaluate the location and size of a land development. • The paper shows the influence distance of land development on a local scale. • The distance can be used to mitigate the impact at an EIA process.

  18. The neuroscience of placebo effects: connecting context, learning and health

    PubMed Central

    Wager, Tor D.; Atlas, Lauren Y.

    2018-01-01

    Placebo effects are beneficial effects that are attributable to the brain–mind responses to the context in which a treatment is delivered rather than to the specific actions of the drug. They are mediated by diverse processes — including learning, expectations and social cognition — and can influence various clinical and physiological outcomes related to health. Emerging neuroscience evidence implicates multiple brain systems and neurochemical mediators, including opioids and dopamine. We present an empirical review of the brain systems that are involved in placebo effects, focusing on placebo analgesia, and a conceptual framework linking these findings to the mind–brain processes that mediate them. This framework suggests that the neuropsychological processes that mediate placebo effects may be crucial for a wide array of therapeutic approaches, including many drugs. PMID:26087681

  19. Policy-based secure communication with automatic key management for industrial control and automation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chernoguzov, Alexander; Markham, Thomas R.; Haridas, Harshal S.

    A method includes generating at least one access vector associated with a specified device in an industrial process control and automation system. The specified device has one of multiple device roles. The at least one access vector is generated based on one or more communication policies defining communications between one or more pairs of device roles in the industrial process control and automation system, where each pair of device roles includes the device role of the specified device. The method also includes providing the at least one access vector to at least one of the specified device and one or more other devices in the industrial process control and automation system in order to control communications to or from the specified device.
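
    The access-vector generation in the claim can be pictured as a lookup over role-pair policies: for a device with a given role, collect the policy entry for every pair that includes that role and emit the collection as the device's access vector. The role names and policy fields below are hypothetical and only illustrate the idea.

        # Communication policies between pairs of device roles (hypothetical entries)
        policies = {
            ("controller", "field_device"): {"protocol": "secure_udp", "key_rotation_h": 24},
            ("controller", "historian"): {"protocol": "tls", "key_rotation_h": 12},
            ("historian", "field_device"): {"protocol": "deny"},
        }

        def generate_access_vector(device_role):
            """Collect every policy whose role pair includes the device's role,
            keyed by the peer role, to form the device's access vector."""
            vector = {}
            for (role_a, role_b), policy in policies.items():
                if device_role == role_a:
                    vector[role_b] = policy
                elif device_role == role_b:
                    vector[role_a] = policy
            return vector

        print(generate_access_vector("controller"))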

  20. Wafer hotspot prevention using etch aware OPC correction

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Power, Dave; Salama, Mohamed; Chen, Ao

    2016-03-01

    As technology development advances into deep-sub-wavelength nodes, multiple patterning is becoming more essential to achieve the technology shrink requirements. Recently, Optical Proximity Correction (OPC) technology has proposed simultaneous correction of multiple mask patterns to enable multiple-patterning awareness during OPC correction. This is essential to prevent inter-layer hot-spots during the final pattern transfer. In the state-of-the-art literature, multi-layer awareness is achieved using simultaneous resist-contour simulations to predict and correct for hot-spots during mask generation. However, this approach assumes a uniform etch shrink response for all patterns independent of their proximity, which is not sufficient for full prevention of inter-exposure hot-spots, for example different color-space violations post-etch or via coverage/enclosure post-etch. In this paper, we explain the need to include the etch component during multiple-patterning OPC. We also introduce a novel approach for etch-aware simultaneous multiple-patterning OPC, where we calibrate and verify a lumped model that includes the combined resist and etch responses. Adding this extra simulation condition during OPC is suitable for full-chip processing from a computation-intensity point of view. Also, using this model during OPC to predict and correct inter-exposure hot-spots is similar to previously proposed multiple-patterning OPC, yet our proposed approach more accurately corrects post-etch defects too.

  1. Method for Fabricating Composite Structures Including Continuous Press Forming and Pultrusion Processing

    NASA Technical Reports Server (NTRS)

    Farley, Gary L. (Inventor)

    1995-01-01

    A method for fabricating composite structures at a low-cost, moderate-to-high production rate is disclosed. A first embodiment of the method includes employing a continuous press forming fabrication process. A second embodiment of the method includes employing a pultrusion process for obtaining composite structures. The methods include coating yarns with matrix material, weaving the yarn into fabric to produce a continuous fabric supply, and feeding multiple layers of net-shaped fabrics having optimally oriented fibers into a debulking tool to form an undebulked preform. The continuous press forming fabrication process includes partially debulking the preform, cutting the partially debulked preform, and debulking the partially debulked preform to form a net-shape. An electron-beam or similar technique then cures the structure. The pultrusion fabric process includes feeding the undebulked preform into a heated die and gradually debulking the undebulked preform. The undebulked preform in the heated die changes dimension until a desired cross-sectional dimension is achieved. This process further includes obtaining a net-shaped infiltrated uncured preform, cutting the uncured preform to a desired length, and electron-beam curing (or similar technique) the uncured preform. These fabrication methods produce superior structures formed at higher production rates, resulting in lower cost and high structural performance.

  2. Multiple Export Mechanisms for mRNAs

    PubMed Central

    Delaleau, Mildred; Borden, Katherine L. B.

    2015-01-01

    Nuclear mRNA export plays an important role in gene expression. We describe the mechanisms of mRNA export including the importance of mRNP assembly, docking with the nuclear basket of the nuclear pore complex (NPC), transit through the central channel of the NPC and cytoplasmic release. We describe multiple mechanisms of mRNA export including NXF1 and CRM1 mediated pathways. Selective groups of mRNAs can be preferentially transported in order to respond to cellular stimuli. RNAs can be selected based on the presence of specific cis-acting RNA elements and binding of specific adaptor proteins. The role that dysregulation of this process plays in human disease is also discussed. PMID:26343730

  3. Correcting for multiple-testing in multi-arm trials: is it necessary and is it done?

    PubMed

    Wason, James M S; Stecher, Lynne; Mander, Adrian P

    2014-09-17

    Multi-arm trials enable the evaluation of multiple treatments within a single trial. They provide a way of substantially increasing the efficiency of the clinical development process. However, since multi-arm trials test multiple hypotheses, some regulators require that a statistical correction be made to control the chance of making a type-1 error (false-positive). Several conflicting viewpoints are expressed in the literature regarding the circumstances in which a multiple-testing correction should be used. In this article we discuss these conflicting viewpoints and review the frequency with which correction methods are currently used in practice. We identified all multi-arm clinical trials published in 2012 by four major medical journals. Summary data on several aspects of the trial design were extracted, including whether the trial was exploratory or confirmatory, whether a multiple-testing correction was applied and, if one was used, what type it was. We found that almost half (49%) of published multi-arm trials report using a multiple-testing correction. The percentage that corrected was higher for trials in which the experimental arms included multiple doses or regimens of the same treatments (67%). The percentage that corrected was higher in exploratory than confirmatory trials, although this is explained by a greater proportion of exploratory trials testing multiple doses and regimens of the same treatment. A sizeable proportion of published multi-arm trials do not correct for multiple-testing. Clearer guidance about whether multiple-testing correction is needed for multi-arm trials that test separate treatments against a common control group is required.
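
    For a multi-arm trial that does apply a correction, the adjustment itself is mechanical once the per-comparison p-values are available; statsmodels implements the common procedures. The p-values below are invented purely for illustration.

        from statsmodels.stats.multitest import multipletests

        # p-values from comparing each experimental arm against the shared control arm
        p_values = [0.012, 0.049, 0.180]

        for method in ("bonferroni", "holm"):
            reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method=method)
            print(method, list(reject), [round(p, 3) for p in p_adjusted])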

  4. Natural health products that inhibit angiogenesis: a potential source for investigational new agents to treat cancer—Part 1

    PubMed Central

    Sagar, S.M.; Yance, D.; Wong, R.K.

    2006-01-01

    An integrative approach for managing a patient with cancer should target the multiple biochemical and physiologic pathways that support tumour development and minimize normal-tissue toxicity. Angiogenesis is a key process in the promotion of cancer. Many natural health products that inhibit angiogenesis also manifest other anticancer activities. The present article focuses on products that have a high degree of anti-angiogenic activity, but it also describes some of the many other actions of these agents that can inhibit tumour progression and reduce the risk of metastasis. Natural health products target molecular pathways other than angiogenesis, including epidermal growth factor receptor, the HER2/neu gene, the cyclooxygenase-2 enzyme, the nuclear factor kappa-B transcription factor, the protein kinases, the Bcl-2 protein, and coagulation pathways. The herbs that are traditionally used for anticancer treatment and that are anti-angiogenic through multiple interdependent processes (including effects on gene expression, signal processing, and enzyme activities) include Artemisia annua (Chinese wormwood), Viscum album (European mistletoe), Curcuma longa (curcumin), Scutellaria baicalensis (Chinese skullcap), resveratrol and proanthocyanidin (grape seed extract), Magnolia officinalis (Chinese magnolia tree), Camellia sinensis (green tea), Ginkgo biloba, quercetin, Poria cocos, Zingiber officinalis (ginger), Panax ginseng, Rabdosia rubescens hora (Rabdosia), and Chinese destagnation herbs. Quality assurance of appropriate extracts is essential prior to embarking upon clinical trials. More data are required on dose–response, appropriate combinations, and potential toxicities. Given the multiple effects of these agents, their future use for cancer therapy probably lies in synergistic combinations. During active cancer therapy, they should generally be evaluated in combination with chemotherapy and radiation. In this role, they act as modifiers of biologic response or as adaptogens, potentially enhancing the efficacy of the conventional therapies. PMID:17576437

  5. Quality and efficiency successes leveraging IT and new processes.

    PubMed

    Chaiken, Barry P; Christian, Charles E; Johnson, Liz

    2007-01-01

    Today, healthcare annually invests billions of dollars in information technology, including clinical systems, electronic medical records and interoperability platforms. While continued investment and parallel development of standards are critical to secure exponential benefits from clinical information technology, intelligent and creative redesign of processes through path innovation is necessary to deliver meaningful value. Reports from two organizations included in this report review the steps taken to reinvent clinical processes that best leverage information technology to deliver safer and more efficient care. Good Samaritan Hospital, Vincennes, Indiana, implemented electronic charting, point-of-care bar coding of medications prior to administration, and integrated clinical documentation for nursing, laboratory, radiology and pharmacy. Tenet Healthcare, during its implementation and deployment of multiple clinical systems across several hospitals, focused on planning that included team-based process redesign. In addition, Tenet constructed valuable and measurable metrics that link outcomes with its strategic goals.

  6. A System for Distributing Real-Time Customized (NEXRAD-Radar) Geosciences Data

    NASA Astrophysics Data System (ADS)

    Singh, Satpreet; McWhirter, Jeff; Krajewski, Witold; Kruger, Anton; Goska, Radoslaw; Seo, Bongchul; Domaszczynski, Piotr; Weber, Jeff

    2010-05-01

    Hydrometeorologists and hydrologists can benefit from (weather) radar-derived rain products, including rain rates and accumulations. The Hydro-NEXRAD system (HNX1) has been in operation since 2006 at IIHR-Hydroscience and Engineering at The University of Iowa. It provides rapid and user-friendly access to such user-customized products, generated using archived Weather Surveillance Doppler Radar (WSR-88D) data from the NEXRAD weather radar network in the United States. HNX1 allows researchers to deal directly with radar-derived rain products, without the burden of the details of radar data collection, quality control, processing, and format conversion. A number of hydrologic applications can benefit from a continuous real-time feed of customized radar-derived rain products. We are currently developing such a system, Hydro-NEXRAD 2 (HNX2). HNX2 collects real-time, unprocessed data from multiple NEXRAD radars as they become available, processes them through a user-configurable pipeline of data-processing modules, and then publishes processed products at regular intervals. Modules in the data processing pipeline encapsulate algorithms such as non-meteorological echo detection, range correction, radar-reflectivity-rain rate (Z-R) conversion, advection correction, merging products from multiple radars, and grid transformations. HNX2's implementation presents significant challenges, including quality-control, error-handling, time-synchronization of data from multiple asynchronous sources, generation of multiple-radar metadata products, distribution of products to a user base with diverse needs and constraints, and scalability. For content management and distribution, HNX2 uses RAMADDA (Repository for Archiving, Managing and Accessing Diverse Data), developed by the UCAR/Unidata Program Center in the United States. RAMADDA allows HNX2 to publish products through automation and gives users multiple access methods to the published products, including simple web-browser based access, and OpenDAP access. The latter allows a user to set up automation at his/her end, and fetch new data from HNX2 at regular intervals. HNX2 uses a two-dimensional metadata structure called a mosaic for managing metadata of the rain products. Currently, HNX2 is in pre-production state and is serving near real-time rain-rate map data-products for individual radars and merged data-products from seven radars covering the state of Iowa in the United States. These products then drive a rainfall-runoff model called CUENCAS, which is used as part of the Iowa Flood Center (housed at The University of Iowa) real-time flood forecasting system. We are currently developing a generalized scalable framework that will run on inexpensive hardware and will provide products for basins anywhere in the continental United States.
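
    One of the pipeline modules named above, radar-reflectivity-rain rate (Z-R) conversion, is a one-line power-law inversion. The sketch below uses the conventional NEXRAD relation Z = 300 R^1.4 as a plausible default; HNX2's coefficients are user-configurable and are not given in the abstract.

        import numpy as np

        def reflectivity_to_rain_rate(dbz, a=300.0, b=1.4):
            """Invert the power law Z = a * R**b to get rain rate R [mm/h]
            from reflectivity in dBZ, where Z = 10**(dBZ/10) in mm^6/m^3."""
            z_linear = 10.0 ** (np.asarray(dbz, dtype=float) / 10.0)
            return (z_linear / a) ** (1.0 / b)

        print(reflectivity_to_rain_rate([20.0, 35.0, 50.0]))   # light, moderate, heavy rain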

  7. A three-dimensional code for muon propagation through the rock: MUSIC

    NASA Astrophysics Data System (ADS)

    Antonioli, P.; Ghetti, C.; Korolkova, E. V.; Kudryavtsev, V. A.; Sartorelli, G.

    1997-10-01

    We present a new three-dimensional Monte-Carlo code MUSIC (MUon SImulation Code) for muon propagation through the rock. All processes of muon interaction with matter with high energy loss (including the knock-on electron production) are treated as stochastic processes. The angular deviation and lateral displacement of muons due to multiple scattering, as well as bremsstrahlung, pair production and inelastic scattering are taken into account. The code has been applied to obtain the energy distribution and angular and lateral deviations of single muons at different depths underground. The muon multiplicity distributions obtained with MUSIC and CORSIKA (Extensive Air Shower simulation code) are also presented. We discuss the systematic uncertainties of the results due to different muon bremsstrahlung cross-sections.

  8. Nonlinear Optical Image Processing with Bacteriorhodopsin Films

    NASA Technical Reports Server (NTRS)

    Downie, John D.; Deiss, Ron (Technical Monitor)

    1994-01-01

    The transmission properties of some bacteriorhodopsin film spatial light modulators are uniquely suited to allow nonlinear optical image processing operations to be applied to images with multiplicative noise characteristics. A logarithmic amplitude transmission feature of the film permits the conversion of multiplicative noise to additive noise, which may then be linearly filtered out in the Fourier plane of the transformed image. The bacteriorhodopsin film displays the logarithmic amplitude response for write beam intensities spanning a dynamic range greater than 2.0 orders of magnitude. We present experimental results demonstrating the principle and capability for several different image and noise situations, including deterministic noise and speckle. Using the bacteriorhodopsin film, we successfully filter out image noise from the transformed image that cannot be removed from the original image.
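    The principle can also be sketched numerically. The following is a minimal illustration of the homomorphic idea (logarithmic transform, linear Fourier-plane filtering, exponentiation) on synthetic data; the gamma-distributed noise model, filter radius, and test image are assumptions for the example, and the film of course performs the logarithmic step optically rather than in software.

```python
# Illustration of the homomorphic idea: a log transform converts multiplicative
# (speckle-like) noise into additive noise, which is then suppressed by linear
# filtering in the Fourier plane. All model choices here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic smooth image corrupted by multiplicative noise with mean ~1.
clean = np.outer(np.hanning(128), np.hanning(128)) + 0.1
noise = rng.gamma(shape=10.0, scale=0.1, size=clean.shape)
noisy = clean * noise

# Log transform: multiplicative noise becomes additive.
log_img = np.log(noisy)

# Simple low-pass filter in the Fourier plane (the noise is broadband, the image is not).
F = np.fft.fftshift(np.fft.fft2(log_img))
yy, xx = np.indices(F.shape)
r = np.hypot(yy - F.shape[0] / 2, xx - F.shape[1] / 2)
F_filtered = F * (r < 12)                      # keep only low spatial frequencies

# Inverse transform and exponentiate to return to the intensity domain.
restored = np.exp(np.real(np.fft.ifft2(np.fft.ifftshift(F_filtered))))
print(float(np.mean(np.abs(restored - clean))), float(np.mean(np.abs(noisy - clean))))
```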

  9. Method for Fabricating Composite Structures Using Continuous Press Forming

    NASA Technical Reports Server (NTRS)

    Farley, Gary L. (Inventor)

    1997-01-01

    A method for fabricating composite structures at a low-cost, moderate-to-high production rate. A first embodiment of the method includes employing a continuous press forming fabrication process. A second embodiment of the method includes employing a pultrusion process for obtaining composite structures. The methods include coating yarns with matrix material, weaving the yarn into fabric to produce a continuous fabric supply and feeding multiple layers of net-shaped fabrics having optimally oriented fibers into a debulking tool to form an undebulked preform. The continuous press forming fabrication process includes partially debulking the preform, cutting the partially debulked preform and debulking the partially debulked preform to form a net-shape. An electron-beam or similar technique then cures the structure. The pultrusion fabrication process includes feeding the undebulked preform into a heated die and gradually debulking the undebulked preform. The undebulked preform in the heated die changes dimension until a desired cross-sectional dimension is achieved. This process further includes obtaining a net-shaped infiltrated uncured preform, cutting the uncured preform to a desired length and electron-beam curing (or similar technique) the uncured preform. These fabrication methods produce superior structures formed at higher production rates, resulting in lower cost and high structural performance.

  10. Development and implementation of a residency project advisory board.

    PubMed

    Dagam, Julie K; Iglar, Arlene; Kindsfater, Julie; Loeb, Al; Smith, Chad; Spexarth, Frank; Brierton, Dennis; Woller, Thomas

    2017-06-15

    The development and implementation of a residency project advisory board (RPAB) to manage multiple pharmacy residents' yearlong projects across several residency programs are described. Preceptor and resident feedback during our annual residency program review and strategic planning sessions suggested the implementation of a more-coordinated approach to the identification, selection, and oversight of all components of the residency project process. A panel of 7 department leaders actively engaged in residency training and performance improvement was formed to evaluate the residency project process and provide recommendations for change. These 7 individuals would eventually constitute the RPAB. The primary objective of the RPAB at Aurora Health Care is to provide oversight and a structured framework for the selection and execution of multiple residents' yearlong projects across all residency programs within our organization. Key roles of the RPAB include developing expectations, coordinating residency project ideas, and providing oversight and feedback. The development and implementation of the RPAB resulted in a significant overhaul of our entire yearlong resident project process. Trends toward success were realized after the first year of implementation, including consistent expectations, increased clarity and engagement in resident project ideas, and more projects meeting anticipated endpoints. The development and implementation of an RPAB have provided a framework to optimize the organization, progression, and outcomes of multiple pharmacy resident yearlong projects in all residency programs across our pharmacy enterprise. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  11. Prospective Relations between Family Conflict and Adolescent Maladjustment: Security in the Family System as a Mediating Process

    PubMed Central

    Cummings, E. Mark; Koss, Kalsea J.; Davies, Patrick T.

    2018-01-01

    Conflict in specific family systems (e.g., interparental, parent-child) has been implicated in the development of a host of adjustment problems in adolescence, but little is known about the impact of family conflict involving multiple family systems. Furthermore, questions remain about the effects of family conflict on symptoms of specific disorders and adjustment problems and the processes mediating these effects. The present study prospectively examines the impact of family conflict and emotional security about the family system on adolescent symptoms of specific disorders and adjustment problems, including the development of symptoms of anxiety, depression, conduct problems, and peer problems. Security in the family system was examined as a mediator of these relations. Participants included 295 mother-father-adolescent families (149 girls) participating across three annual time points (grades 7–9). Including auto-regressive controls for initial levels of emotional insecurity and multiple adjustment problems (T1), higher-order emotional insecurity about the family system (T2) mediated relations between T1 family conflict and T3 peer problems, anxiety, and depressive symptoms. Further analyses supported specific patterns of emotional security/insecurity (i.e., security, disengagement, preoccupation) as mediators between family conflict and specific domains of adolescent adjustment. Family conflict was thus found to prospectively predict the development of symptoms of multiple specific adjustment problems, including symptoms of depression, anxiety, conduct problems, and peer problems, by elevating adolescents' emotional insecurity about the family system. The clinical implications of these findings are considered. PMID:25131144

  12. A Robust and Resilient Network Design Paradigm for Region-Based Faults Inflicted by WMD Attack

    DTIC Science & Technology

    2016-04-01

    We investigated big data processing of PMU measurements for grid monitoring and control against possible WMD attacks. Big data processing and analytics of synchrophasor measurements, collected from multiple locations of power grids...

  13. Disposable world-to-chip interface for digital microfluidics

    DOEpatents

    Van Dam, R. Michael; Shah, Gaurav; Keng, Pei-Yuin

    2017-05-16

    The present disclosure sets forth microfluidic chip interfaces for use with digital microfluidic processes. Methods and devices according to the present disclosure utilize compact, integrated platforms that interface with a chip upstream and downstream of the reaction, as well as between intermediate reaction steps if needed. In some embodiments these interfaces are automated, including automation of a multiple-reagent process. Various reagent delivery systems and methods are also disclosed.

  14. Neurocomputation by Reaction Diffusion

    NASA Astrophysics Data System (ADS)

    Liang, Ping

    1995-08-01

    This Letter demonstrates the possible role nonsynaptic diffusion neurotransmission may play in neurocomputation using an artificial neural network model. A reaction-diffusion neural network model with field-based information-processing mechanisms is proposed. The advantages of nonsynaptic field neurotransmission from a computational viewpoint demonstrated in this Letter include long-range inhibition using only local interaction, nonhardwired and changeable (target specific) long-range communication pathways, and multiple simultaneous spatiotemporal organization processes in the same medium.

  15. The Heterogeneity in Retrieved Relations between the Personality Trait ‘Harm Avoidance’ and Gray Matter Volumes Due to Variations in the VBM and ROI Labeling Processing Settings

    PubMed Central

    Van Schuerbeek, Peter; Baeken, Chris; De Mey, Johan

    2016-01-01

    Concerns are rising about the large variability in reported correlations between gray matter morphology and affective personality traits such as 'Harm Avoidance' (HA). A recent review study (Mincic 2015) stipulated that this variability could stem from methodological differences between studies. In order to achieve more robust results by standardizing the data processing procedure, as a first step, we repeatedly analyzed data from healthy females while changing the processing settings (voxel-based morphometry (VBM) or region-of-interest (ROI) labeling, smoothing filter width, nuisance parameters included in the regression model, brain atlas, and multiple-comparisons correction method). The heterogeneity in the obtained results clearly illustrates the dependency of the study outcome on the chosen analysis settings. Based on our results and the existing literature, we recommend the use of VBM over ROI labeling for whole-brain analyses, with a small or intermediate smoothing filter (5-8 mm) and a model variable selection step included in the processing procedure. Additionally, it is recommended that ROI labeling should only be used in combination with a clear hypothesis and that authors are encouraged to report their results uncorrected for multiple comparisons as supplementary material to aid review studies. PMID:27096608

  16. Use of multi-node wells in the Groundwater-Management Process of MODFLOW-2005 (GWM-2005)

    USGS Publications Warehouse

    Ahlfeld, David P.; Barlow, Paul M.

    2013-01-01

    Many groundwater wells are open to multiple aquifers or to multiple intervals within a single aquifer. These types of wells can be represented in numerical simulations of groundwater flow by use of the Multi-Node Well (MNW) Packages developed for the U.S. Geological Survey’s MODFLOW model. However, previous versions of the Groundwater-Management (GWM) Process for MODFLOW did not allow the use of multi-node wells in groundwater-management formulations. This report describes modifications to the MODFLOW–2005 version of the GWM Process (GWM–2005) to provide for such use with the MNW2 Package. Multi-node wells can be incorporated into a management formulation as flow-rate decision variables for which optimal withdrawal or injection rates will be determined as part of the GWM–2005 solution process. In addition, the heads within multi-node wells can be used as head-type state variables, and, in that capacity, be included in the objective function or constraint set of a management formulation. Simple head bounds also can be defined to constrain water levels at multi-node wells. The report provides instructions for including multi-node wells in the GWM–2005 data-input files and a sample problem that demonstrates use of multi-node wells in a typical groundwater-management problem.

  17. Robotics technology discipline

    NASA Technical Reports Server (NTRS)

    Montemerlo, Melvin D.

    1990-01-01

    Viewgraphs on robotics technology discipline for Space Station Freedom are presented. Topics covered include: mechanisms; sensors; systems engineering processes for integrated robotics; man/machine cooperative control; 3D-real-time machine perception; multiple arm redundancy control; manipulator control from a movable base; multi-agent reasoning; and surfacing evolution technologies.

  18. A MULTIPLE-PURPOSE DESIGN APPROACH TO THE EVALUATION OF RISKS FROM COMPLEX MIXTURES OF DISINFECTION BY-PRODUCTS

    EPA Science Inventory

    Drinking water disinfection has effectively eliminated much of the morbidity and mortality associated with waterborne infectious diseases in the United States. Various disinfection processes, however, produce certain types and amounts of disinfection by-products (DBPs), including...

  19. A Methodology for Distributing the Corporate Database.

    ERIC Educational Resources Information Center

    McFadden, Fred R.

    The trend to distributed processing is being fueled by numerous forces, including advances in technology, corporate downsizing, increasing user sophistication, and acquisitions and mergers. Increasingly, the trend in corporate information systems (IS) departments is toward sharing resources over a network of multiple types of processors, operating…

  20. Engineering Design Thinking

    ERIC Educational Resources Information Center

    Lammi, Matthew; Becker, Kurt

    2013-01-01

    Engineering design thinking is "a complex cognitive process" including divergence-convergence, a systems perspective, ambiguity, and collaboration (Dym, Agogino, Eris, Frey, & Leifer, 2005, p. 104). Design is often complex, involving multiple levels of interacting components within a system that may be nested within or connected to other systems.…

  1. Multiple-Parameter, Low-False-Alarm Fire-Detection Systems

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.; Greensburg, Paul; McKnight, Robert; Xu, Jennifer C.; Liu, C. C.; Dutta, Prabir; Makel, Darby; Blake, D.; Sue-Antillio, Jill

    2007-01-01

    Fire-detection systems incorporating multiple sensors that measure multiple parameters are being developed for use in storage depots, cargo bays of ships and aircraft, and other locations not amenable to frequent, direct visual inspection. These systems are intended to improve upon conventional smoke detectors, now used in such locations, that reliably detect fires but also frequently generate false alarms: for example, conventional smoke detectors based on the blockage of light by smoke particles are also affected by dust particles and water droplets and, thus, are often susceptible to false alarms. In contrast, by utilizing multiple parameters associated with fires, i.e., not only obscuration by smoke particles but also concentrations of multiple chemical species that are commonly generated in combustion, false alarms can be significantly decreased while still detecting fires as reliably as older smoke-detector systems do. The present development includes fabrication of sensors that have, variously, micrometer- or nanometer-sized features so that such multiple sensors can be integrated into arrays that have sizes, weights, and power demands smaller than those of older macroscopic sensors. The sensors include resistors, electrochemical cells, and Schottky diodes that exhibit different sensitivities to the various airborne chemicals of interest. In a system of this type, the sensor readings are digitized and processed by advanced signal-processing hardware and software to extract such chemical indications of fires as abnormally high concentrations of CO and CO2, possibly in combination with H2 and/or hydrocarbons. The system also includes a microelectromechanical systems (MEMS)-based particle detector and classifier device to increase the reliability of measurements of chemical species and particulates. In parallel research, software for modeling the evolution of a fire within an aircraft cargo bay has been developed. The model implemented in the software can describe the concentrations of chemical species and of particulate matter as functions of time. A system of the present developmental type and a conventional fire detector were tested under both fire and false-alarm conditions in a Federal Aviation Administration cargo-compartment-testing facility. Both systems consistently detected fires. However, the conventional fire detector consistently generated false alarms, whereas the developmental system did not generate any false alarms.
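    The false-alarm-reduction logic can be sketched as a simple fusion rule requiring agreement between the particulate and chemical channels. This is only an illustrative sketch: the threshold values and the SensorReadings fields are hypothetical, not the system's actual alarm criteria.

```python
# Illustrative multi-parameter alarm fusion: an alarm requires both particulate
# obscuration and chemical signatures of combustion, so dust or water droplets
# alone do not trigger it. All threshold values below are hypothetical.
from dataclasses import dataclass

@dataclass
class SensorReadings:
    obscuration_pct_per_m: float   # smoke-detector style light obscuration
    co_ppm: float                  # carbon monoxide concentration
    co2_ppm: float                 # carbon dioxide concentration
    h2_ppm: float                  # hydrogen concentration

def fire_alarm(r: SensorReadings) -> bool:
    particles = r.obscuration_pct_per_m > 2.0
    combustion_gases = (r.co_ppm > 40.0 and r.co2_ppm > 800.0) or r.h2_ppm > 100.0
    # Require agreement between the particulate and chemical channels.
    return particles and combustion_gases

# Dust cloud: obscuration only, so no alarm. Smoldering fire: both channels, so alarm.
print(fire_alarm(SensorReadings(5.0, 2.0, 450.0, 0.0)))     # False
print(fire_alarm(SensorReadings(5.0, 120.0, 1500.0, 0.0)))  # True
```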

  2. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    NASA Astrophysics Data System (ADS)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transductions including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI) play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing time. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce the process time and sample volume because this system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we will present an automated mMAPS including an integrated microfluidic device, automated stage, and electrical relay for high-throughput clinical screening. Based on this result, we estimated that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application to analyze tissue samples in a clinical setting.

  3. System Safety in an IT Service Organization

    NASA Astrophysics Data System (ADS)

    Parsons, Mike; Scutt, Simon

    Within Logica UK, over 30 IT service projects are considered safety-related. These include operational IT services for airports, railway infrastructure asset management, nationwide radiation monitoring and hospital medical records services. A recent internal audit examined the processes and documents used to manage system safety on these services and made a series of recommendations for improvement. This paper looks at the changes and the challenges of introducing them, especially where the service is provided by multiple units supporting both safety and non-safety related services from multiple locations around the world. The recommendations include improvements to service agreements, improved process definitions, routine safety assessment of changes, enhanced call logging, improved staff competency and training, and increased safety awareness. Progress to date is reported, together with a road map for implementation of the improvements to the service safety management system. A proposal for service assurance levels (SALs) is discussed as a way forward to cover the wide variety of services and associated safety risks.

  4. Working memory span in mild cognitive impairment. Influence of processing speed and cognitive reserve.

    PubMed

    Facal, David; Juncos-Rabadán, Onésimo; Pereiro, Arturo X; Lojo-Seoane, Cristina

    2014-04-01

    Mild cognitive impairment (MCI) often includes episodic memory impairment, but can also involve other types of cognitive decline. Although previous studies have shown poorer performance of MCI patients in working memory (WM) span tasks, different MCI subgroups were not studied. In the present exploratory study, 145 participants underwent extensive cognitive evaluation, which included three different WM span tasks, and were classified into the following groups: multiple-domain amnestic MCI (mda-MCI), single-domain amnestic MCI (sda-MCI), and controls. A general linear model was conducted with the WM span tasks as the within-subject factor; the group (mda-MCI, sda-MCI, and controls) as the between-subject factor; and processing speed, vocabulary, and age as covariates. Multiple linear regression models were also used to test the influence of processing speed, vocabulary, and other cognitive reserve (CR) proxies. Results indicate different levels of impairment of WM, with more severe impairment in mda-MCI patients. The differences were still present when processing resources and CR were controlled. Between-group differences can be understood as a manifestation of the greater severity and more widespread memory impairment in mda-MCI patients and may contribute to a better understanding of the continuum from normal controls to mda-MCI patients. Processing speed and CR have a limited influence on WM scores, reducing but not removing differences between groups.

  5. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
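    In the spirit of the index described above, a toy Monte Carlo sketch follows: the variance of the conditional mean of the model output, taken over the recharge process (its competing models and their parameters), is divided by the total output variance. The two recharge models, the two conductivity parameterizations, and all distributions below are invented for illustration and are not the study's actual models.

```python
# Hedged sketch of a variance-based process sensitivity index that accounts for
# both process-model and parameter uncertainty. All models here are toy choices.
import numpy as np

rng = np.random.default_rng(1)

def sample_recharge():
    # Process "recharge": two competing models with equal prior weight,
    # each driven by a random precipitation value (mm/yr).
    precip = rng.gamma(shape=25.0, scale=20.0)        # mean ~500 mm/yr
    if rng.random() < 0.5:
        return 0.20 * precip                          # linear-fraction model
    return 0.05 * precip ** 1.2                       # nonlinear model

def sample_geology():
    # Process "geology": two competing hydraulic-conductivity parameterizations.
    if rng.random() < 0.5:
        return rng.lognormal(mean=0.0, sigma=0.3)     # homogeneous
    return rng.lognormal(mean=0.2, sigma=0.6)         # heterogeneous

def model_output(recharge, conductivity):
    # Toy system response combining the two processes.
    return recharge * conductivity

# Total output variance with everything (models and parameters) varying.
total = np.array([model_output(sample_recharge(), sample_geology())
                  for _ in range(20_000)])
var_total = total.var()

# Process sensitivity of recharge: variance (over recharge models and their
# parameters) of the conditional mean of the output, normalized by var_total.
def cond_mean_given_recharge(n_inner=200):
    r = sample_recharge()                             # fix the recharge process
    return np.mean([model_output(r, sample_geology()) for _ in range(n_inner)])

cond_means = np.array([cond_mean_given_recharge() for _ in range(2_000)])
print(f"process sensitivity index for recharge ~ {cond_means.var() / var_total:.2f}")
```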

  6. Incorporating medication indications into the prescribing process.

    PubMed

    Kron, Kevin; Myers, Sara; Volk, Lynn; Nathan, Aaron; Neri, Pamela; Salazar, Alejandra; Amato, Mary G; Wright, Adam; Karmiy, Sam; McCord, Sarah; Seoane-Vazquez, Enrique; Eguale, Tewodros; Rodriguez-Monguio, Rosa; Bates, David W; Schiff, Gordon

    2018-04-19

    The incorporation of medication indications into the prescribing process to improve patient safety is discussed. Currently, most prescriptions lack a key piece of information needed for safe medication use: the patient-specific drug indication. Integrating indications could pave the way for safer prescribing in multiple ways, including avoiding look-alike/sound-alike errors, facilitating selection of drugs of choice, aiding in communication among the healthcare team, bolstering patient understanding and adherence, and organizing medication lists to facilitate medication reconciliation. Although strongly supported by pharmacists, multiple prior attempts to encourage prescribers to include the indication on prescriptions have not been successful. We convened 6 expert panels to consult high-level stakeholders on system design considerations and requirements necessary for building and implementing an indications-based computerized prescriber order-entry (CPOE) system. We summarize our findings from the 6 expert stakeholder panels, including rationale, literature findings, potential benefits, and challenges of incorporating indications into the prescribing process. Based on this stakeholder input, design requirements for a new CPOE interface and workflow have been identified. The emergence of universal electronic prescribing and content knowledge vendors has laid the groundwork for incorporating indications into the CPOE prescribing process. As medication prescribing moves in the direction of inclusion of the indication, it is imperative to design CPOE systems to efficiently and effectively incorporate indications into prescriber workflows and optimize ways this can best be accomplished. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  7. SIG. Signal Processing, Analysis, & Display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG: a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
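    As an illustration of two of the operations listed above (auto and cross spectral density, transfer function), the following short example, which is not SIG itself but an equivalent computation in Python, estimates the transfer function of a low-pass filter from input and output records.

```python
# Illustrative example (not the SIG program) of spectral-density and
# transfer-function estimation between an input signal and a filtered output.
import numpy as np
from scipy import signal

fs = 1_000.0                                   # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

x = rng.standard_normal(t.size)                # broadband input
b, a = signal.butter(4, 100.0, fs=fs)          # 4th-order Butterworth low-pass at 100 Hz
y = signal.lfilter(b, a, x) + 0.01 * rng.standard_normal(t.size)

f, Pxx = signal.welch(x, fs=fs, nperseg=1024)  # auto spectral density of the input
_, Pxy = signal.csd(x, y, fs=fs, nperseg=1024) # cross spectral density
H = Pxy / Pxx                                  # H1 transfer-function estimate

# First frequency where the estimated gain drops below 0.5 (just above the cutoff).
print(f[np.argmax(np.abs(H) < 0.5)])
```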

  9. Hearing the voices of service user researchers in collaborative qualitative data analysis: the case for multiple coding.

    PubMed

    Sweeney, Angela; Greenwood, Kathryn E; Williams, Sally; Wykes, Til; Rose, Diana S

    2013-12-01

    Health research is frequently conducted in multi-disciplinary teams, with these teams increasingly including service user researchers. Whilst it is common for service user researchers to be involved in data collection--most typically interviewing other service users--it is less common for service user researchers to be involved in data analysis and interpretation. This means that a unique and significant perspective on the data is absent. This study aims to use an empirical report of a study on Cognitive Behavioural Therapy for psychosis (CBTp) to demonstrate the value of multiple coding in enabling service users' voices to be heard in team-based qualitative data analysis. The CBTp study employed multiple coding to analyse service users' discussions of CBTp from the perspectives of a service user researcher, a clinical researcher and a psychology assistant. Multiple coding was selected to enable multiple perspectives to analyse and interpret data, to understand and explore differences and to build multi-disciplinary consensus. Multiple coding enabled the team to understand where our views were commensurate and incommensurate and to discuss and debate differences. Through the process of multiple coding, we were able to build strong consensus about the data from multiple perspectives, including that of the service user researcher. Multiple coding is an important method for understanding and exploring multiple perspectives on data and building team consensus. This can be contrasted with inter-rater reliability, which is only appropriate in limited circumstances. We conclude that multiple coding is an appropriate and important means of hearing service users' voices in qualitative data analysis. © 2012 John Wiley & Sons Ltd.

  10. Cognitive processing speed is related to fall frequency in older adults with multiple sclerosis.

    PubMed

    Sosnoff, Jacob J; Balantrapu, Swathi; Pilutti, Lara A; Sandroff, Brian M; Morrison, Steven; Motl, Robert W

    2013-08-01

    To examine mobility, balance, fall risk, and cognition in older adults with multiple sclerosis (MS) as a function of fall frequency. Retrospective, cross-sectional design. University research laboratory. Community-dwelling persons with MS (N=27) aged between 50 and 75 years were divided into 2 groups on the basis of fall history: single-time fallers (n=11) and recurrent fallers (n=16; >2 falls/12 mo). Not applicable. Mobility was assessed using a variety of measures including the Multiple Sclerosis Walking Scale-12, walking speed (Timed 25-Foot Walk test), endurance (6-Minute Walk test), and functional mobility (Timed Up and Go test). Balance was assessed with the Berg Balance Scale, posturography, and self-reported balance confidence. Fall risk was assessed with the Physiological Profile Assessment. Cognitive processing speed was quantified with the Symbol Digit Modalities Test and the Paced Auditory Serial Addition Test. Recurrent fallers had slower cognitive processing speed than single-time fallers (P≤.01). There was no difference in mobility, balance, or fall risk between recurrent and single-time fallers (P>.05). Results indicated that cognitive processing speed is associated with fall frequency and may have implications for fall prevention strategies targeting recurrent fallers with MS. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  11. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process-level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  13. Integrated Optoelectronics for Parallel Microbioanalysis

    NASA Technical Reports Server (NTRS)

    Stirbl, Robert; Moynihan, Philip; Bearman, Gregory; Lane, Arthur

    2003-01-01

    Miniature, relatively inexpensive microbioanalytical systems ("laboratory-on-a-chip" devices) have been proposed for the detection of hazardous microbes and toxic chemicals. Each system of this type would include optoelectronic sensors and sensor-output-processing circuitry that would simultaneously look for the optical change, fluorescence, delayed fluorescence, or phosphorescence signatures from multiple redundant sites that have interacted with the test biomolecules, in order to detect which one(s) was present in a given situation. These systems could be used in a variety of settings that could include doctors' offices, hospitals, hazardous-material laboratories, biological-research laboratories, military operations, and chemical-processing plants.

  14. The use of multiple representations and visualizations in student learning of introductory physics: An example from work and energy

    NASA Astrophysics Data System (ADS)

    Zou, Xueli

    In the past three decades, physics education research has primarily focused on student conceptual understanding; little work has been conducted to investigate student difficulties in problem solving. In cognitive science and psychology, however, extensive studies have explored the differences in problem solving between experts and naive students. A major finding indicates that experts often apply qualitative representations in problem solving, but that novices use an equation-centered method. This dissertation describes investigations into the use of multiple representations and visualizations in student understanding and problem solving with the concepts of work and energy. A multiple-representation strategy was developed to help students acquire expertise in solving work-energy problems. In this approach, a typical work-energy problem is considered as a physical process. The process is first described in words-the verbal representation of the process. Next, a sketch or a picture, called a pictorial representation, is used to represent the process. This is followed by work-energy bar charts-a physical representation of the same processes. Finally, this process is represented mathematically by using a generalized work-energy equation. In terms of the multiple representations, the goal of solving a work-energy problem is to represent the physical process using the more intuitive pictorial and diagrammatic physical representations. Ongoing assessment of student learning indicates that this multiple-representation technique is more effective than standard instruction methods in student problem solving. To help students visualize this difficult-to-understand concept, a guided-inquiry learning activity using a pair of model carts and an experiment problem using a sandbag were developed. Assessment results have shown that these research-based materials are effective in helping students visualize this concept and give a pictorial idea of "where the kinetic energy goes" during inelastic collisions. The research and curriculum development was conducted in the context of the introductory calculus-based physics course. Investigations were carried out using common physics education research tools, including open-ended surveys, written test questions, and individual student interviews.
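    For reference, a common textbook form of the generalized work-energy relation underlying the bar-chart representation is given below; this is a generic statement of energy accounting, not necessarily the exact equation used in the dissertation.

```latex
% External work on a system equals the change in its kinetic, potential, and
% internal energies; the internal-energy term captures "where the kinetic
% energy goes" in an inelastic collision.
W_{\mathrm{ext}} = \Delta K + \Delta U + \Delta E_{\mathrm{int}}
```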

  15. A microfluidic device integrating dual CMOS polysilicon nanowire sensors for on-chip whole blood processing and simultaneous detection of multiple analytes.

    PubMed

    Kuan, Da-Han; Wang, I-Shun; Lin, Jiun-Rue; Yang, Chao-Han; Huang, Chi-Hsien; Lin, Yen-Hung; Lin, Chih-Ting; Huang, Nien-Tsu

    2016-08-02

    The hemoglobin-A1c test, measuring the ratio of glycated hemoglobin (HbA1c) to hemoglobin (Hb) levels, has been a standard assay in diabetes diagnosis that removes the day-to-day glucose level variation. Currently, the HbA1c test is restricted to hospitals and central laboratories due to the laborious, time-consuming whole blood processing and bulky instruments. In this paper, we have developed a microfluidic device integrating dual CMOS polysilicon nanowire sensors (MINS) for on-chip whole blood processing and simultaneous detection of multiple analytes. The micromachined polymethylmethacrylate (PMMA) microfluidic device consisted of a serpentine microchannel with multiple dam structures designed for non-lysed cells or debris trapping, uniform plasma/buffer mixing and dilution. The CMOS-fabricated polysilicon nanowire sensors integrated with the microfluidic device were designed for the simultaneous, label-free electrical detection of multiple analytes. Our study first measured the Hb and HbA1c levels in 11 clinical samples via these nanowire sensors. The results were compared with those of standard Hb and HbA1c measurement methods (Hb: the sodium lauryl sulfate hemoglobin detection method; HbA1c: cation-exchange high-performance liquid chromatography) and showed comparable outcomes. Finally, we successfully demonstrated the efficacy of the MINS device's on-chip whole blood processing followed by simultaneous Hb and HbA1c measurement in a clinical sample. Compared to current Hb and HbA1c sensing instruments, the MINS platform is compact and can simultaneously detect two analytes with only 5 μL of whole blood, which corresponds to a 300-fold blood volume reduction. The total assay time, including the in situ sample processing and analyte detection, was just 30 minutes. Based on its on-chip whole blood processing and simultaneous multiple analyte detection functionalities with a lower sample volume requirement and shorter process time, the MINS device can be effectively applied to real-time diabetes diagnostics and monitoring in point-of-care settings.

  16. Distinct representations of subtraction and multiplication in the neural systems for numerosity and language

    PubMed Central

    Prado, Jérôme; Mutreja, Rachna; Zhang, Hongchuan; Mehta, Rucha; Desroches, Amy S.; Minas, Jennifer E.; Booth, James R.

    2010-01-01

    It has been proposed that recent cultural inventions such as symbolic arithmetic recycle evolutionary older neural mechanisms. A central assumption of this hypothesis is that the degree to which a pre-existing mechanism is recycled depends upon the degree of similarity between its initial function and the novel task. To test this assumption, we investigated whether the brain region involved in magnitude comparison in the intraparietal sulcus (IPS), localized by a numerosity comparison task, is recruited to a greater degree by arithmetic problems that involve number comparison (single-digit subtractions) than by problems that involve retrieving facts from memory (single-digit multiplications). Our results confirmed that subtractions are associated with greater activity in the IPS than multiplications, whereas multiplications elicit greater activity than subtractions in regions involved in verbal processing including the middle temporal gyrus and inferior frontal gyrus that were localized by a phonological processing task. Pattern analyses further indicated that the neural mechanisms more active for subtraction than multiplication in the IPS overlap with those involved in numerosity comparison, and that the strength of this overlap predicts inter-individual performance in the subtraction task. These findings provide novel evidence that elementary arithmetic relies on the co-option of evolutionary older neural circuits. PMID:21246667

  17. Vps15 is required for stress induced and developmentally triggered autophagy and salivary gland protein secretion in Drosophila.

    PubMed

    Anding, A L; Baehrecke, E H

    2015-03-01

    Autophagy is a catabolic process used to deliver cellular material to the lysosome for degradation. The core Vps34/class III phosphatidylinositol 3-kinase (PI3K) complex, consisting of Atg6, Vps15, and Vps34, is highly conserved throughout evolution, critical for recruiting autophagy-related proteins to the preautophagosomal structure and for other vesicular trafficking processes, including vacuolar protein sorting. Atg6 and Vps34 have been well characterized, but the Vps15 kinase remains poorly characterized with most studies focusing on nutrient deprivation-induced autophagy. Here, we investigate the function of Vps15 in different cellular contexts and find that it is necessary for both stress-induced and developmentally programmed autophagy in various tissues in Drosophila melanogaster. Vps15 is required for autophagy that is induced by multiple forms of stress, including nutrient deprivation, hypoxia, and oxidative stress. Furthermore, autophagy that is triggered by physiological stimuli during development in the fat body, intestine, and salivary gland also require the function of Vps15. In addition, we show that Vps15 is necessary for efficient salivary gland protein secretion. These data illustrate the broad importance of Vps15 in multiple forms of autophagy in different animal cells, and also highlight the pleiotropic function of this kinase in multiple vesicle-trafficking pathways.

  18. Imaging synthetic aperture radar

    DOEpatents

    Burns, Bryan L.; Cordaro, J. Thomas

    1997-01-01

    A linear-FM SAR imaging radar method and apparatus to produce a real-time image by first arranging the returned signals into a plurality of subaperture arrays, the columns of each subaperture array having samples of dechirped baseband pulses, and further including a processing of each subaperture array to obtain coarse resolution in azimuth, then fine resolution in range, and lastly, to combine the processed subapertures to obtain the final fine resolution in azimuth. Greater efficiency is achieved because both the transmitted signal and a local oscillator signal mixed with the returned signal can be varied on a pulse-to-pulse basis as a function of radar motion. Moreover, a novel circuit can adjust the sampling location and the A/D sample rate of the combined dechirped baseband signal, which greatly reduces processing time and hardware. The processing steps include implementing a window function, stabilizing either a central reference point and/or all other points of a subaperture with respect to Doppler frequency and/or range as a function of radar motion, and sorting and compressing the signals using standard Fourier transforms. The stabilization of each processing part is accomplished with vector multiplication using waveforms generated as a function of radar motion, wherein these waveforms may be synthesized in integrated circuits. Stabilization of range migration as a function of Doppler frequency by simple vector multiplication is a particularly useful feature of the invention, as is stabilization of azimuth migration by correcting for spatially varying phase errors prior to the application of an autofocus process.
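    The vector-multiplication stabilization step can be sketched as follows. This is a rough illustration under assumed values (the carrier frequency, motion model, and array sizes are hypothetical), not the patented processing chain: each pulse of dechirped subaperture data is multiplied by a motion-derived phase factor before the azimuth Fourier transform.

```python
# Sketch of phase stabilization by vector multiplication: dechirped pulse
# samples are multiplied by exp(-j*phi(motion)) so that the central reference
# point stays at a constant position across the subaperture. The motion model
# and constants here are hypothetical placeholders.
import numpy as np

c = 3.0e8                      # speed of light (m/s)
fc = 10.0e9                    # radar carrier frequency (Hz), assumed
wavelength = c / fc

n_pulses, n_samples = 64, 256
subaperture = (np.random.randn(n_pulses, n_samples)
               + 1j * np.random.randn(n_pulses, n_samples))   # stand-in for dechirped data

# Range from the radar to the central reference point for each pulse,
# derived from (hypothetical) measured platform motion.
range_to_crp = 10_000.0 + 0.05 * np.arange(n_pulses) ** 2      # metres

# Motion-compensation waveform: one complex factor per pulse.
phase = 4.0 * np.pi * range_to_crp / wavelength
stabilized = subaperture * np.exp(-1j * phase)[:, np.newaxis]

# Coarse azimuth compression of the stabilized subaperture (FFT across pulses).
azimuth_compressed = np.fft.fft(stabilized, axis=0)
print(azimuth_compressed.shape)
```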

  19. [Comorbidity in multiple sclerosis and its therapeutic approach].

    PubMed

    Estruch, Bonaventura Casanova

    2014-12-01

    Multiple sclerosis (MS) is a long-term chronic disease, in which intercurrent processes develop three times more frequently in affected individuals than in persons without MS. Knowledge of the comorbidity of MS, its definition and measurement (Charlson index) improves patient management. Acting on comorbid conditions delays the progression of disability, which is intimately linked to the number of concurrent processes and with health states and habits. Moreover, the presence of comorbidities delays the diagnosis of MS, which in turn delays the start of treatment. The main comorbidity found in MS includes other autoimmune diseases (thyroiditis, systemic lupus erythematosus, or pemphigus) but can also include general diseases, such as asthma or osteomuscular alterations, and, in particular, psychiatric disturbances. All these alterations should be evaluated with multidimensional scales (Disability Expectancy Table, DET), which allow more accurate determination of the patient's real clinical course and quality of life. These scales also allow identification of how MS, concurrent and intercurrent processes occurring during the clinical course, and the treatment provided affect patients with MS. An overall approach to patients' health status helps to improve quality of life. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.

  20. Recent advances in heterocycle generation using the efficient Ugi multiple-component condensation reaction.

    PubMed

    Tempest, Paul A

    2005-11-01

    The current trend of rising research spending and falling numbers of novel chemical entities continues to drive efforts aimed at increasing efficiency in the drug discovery process. Strategic issues, such as assigning resources to poorly validated targets have been implicated in the declining productivity of recent years. Tactical approaches employed to improve this situation include attempts to speed the discovery process toward decision points in a timely manner. Accelerating the optimization of high-throughput screening hits is a goal in streamlining the discovery process, and the use of multiple-component condensation (MCC) reactions have proved useful toward this end. MCC reactions are powerful and efficient tools for the generation of diverse compound sets. Collections of compounds can be synthesized with all of the required diversity elements included in a single synthetic step. One of the most widely investigated MCC reactions is the Ugi four-component condensation. This review highlights disclosures of the Ugi reaction published over the past two years (2003 to 2005) in three areas: (i) Ugi reaction in conjunction with post-condensation cyclization; (ii) bifunctional condensations leading to heterocyclic cores; and (iii) general findings relating to linear products or interesting improvements in the basic Ugi reaction.

  1. High resolution crustal image of South California Continental Borderland: Reverse time imaging including multiples

    NASA Astrophysics Data System (ADS)

    Bian, A.; Gantela, C.

    2014-12-01

    Strong multiples were observed in marine seismic data from the Los Angeles Regional Seismic Experiment (LARSE). It is crucial to eliminate these multiples in conventional ray-based or one-way wave-equation based depth imaging methods. Because multiples carry information about the target zone along their travel paths, it is possible to use them as signal to improve the illumination coverage and thus enhance the image quality of structural boundaries. Reverse time migration including multiples is a two-way wave-equation based prestack depth imaging method that uses both primaries and multiples to map structural boundaries. Several factors, including the source wavelet, velocity model, background noise, data acquisition geometry and preprocessing workflow, may influence the quality of the image. The source wavelet is estimated from the direct arrival of the marine seismic data. The migration velocity model is derived from an integrated model-building workflow, and the sharp velocity interfaces near the sea bottom need to be preserved in order to generate multiples in the forward and backward propagation steps. The strong-amplitude, low-frequency marine background noise needs to be removed before the final imaging process. High-resolution reverse time image sections of LARSE Line 1 and Line 2 show five interfaces: the depth of the sea bottom, the base of sedimentary basins, the top of the Catalina Schist, a deep layer and a possible pluton boundary. The Catalina Schist shows highs in the San Clemente ridge, Emery Knoll, Catalina Ridge, under Catalina Basin on both lines, and a minor high under Avalon Knoll. The high of the anticlinal fold in Line 1 is under the north edge of Emery Knoll and under the San Clemente fault zone. An area devoid of any reflection features is interpreted as the side of an igneous pluton.

  2. Systems-level mechanisms of action of Panax ginseng: a network pharmacological approach.

    PubMed

    Park, Sa-Yoon; Park, Ji-Hun; Kim, Hyo-Su; Lee, Choong-Yeol; Lee, Hae-Jeung; Kang, Ki Sung; Kim, Chang-Eop

    2018-01-01

    Panax ginseng has been used since ancient times based on traditional Asian medicine theory and clinical experience, and currently it is one of the most popular herbs in the world. To date, most of the studies concerning P. ginseng have focused on specific mechanisms of action of individual constituents. However, in spite of many studies on the molecular mechanisms of P. ginseng, it still remains unclear how multiple active ingredients of P. ginseng interact with multiple targets simultaneously, giving multidimensional effects on various conditions and diseases. In order to decipher the systems-level mechanism of the multiple ingredients of P. ginseng, a novel approach is needed beyond conventional reductive analysis. We aim to review the systems-level mechanism of P. ginseng by adopting a novel analytical framework, network pharmacology. Here, we constructed a compound-target network of P. ginseng using experimentally validated and machine learning-based prediction results. The targets of the network were analyzed in terms of related biological processes, pathways, and diseases. The majority of targets were found to be related with primary metabolic process, signal transduction, nitrogen compound metabolic process, blood circulation, immune system process, cell-cell signaling, biosynthetic process, and neurological system process. In pathway enrichment analysis of targets, mainly the terms related with neural activity showed significant enrichment and formed a cluster. Finally, relative degree analysis of the target-disease associations of P. ginseng revealed several categories of related diseases, including respiratory, psychiatric, and cardiovascular diseases.
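    The compound-target network construction and degree analysis can be sketched in a few lines. The compound and target names below are made-up placeholders rather than curated P. ginseng data; the sketch only shows the shape of the analysis.

```python
# Hedged sketch of a network-pharmacology style analysis: build a bipartite
# compound-target network and rank targets by degree. Edge list is illustrative.
import networkx as nx

edges = [
    ("ginsenoside_Rb1", "TNF"), ("ginsenoside_Rb1", "AKT1"),
    ("ginsenoside_Rg1", "AKT1"), ("ginsenoside_Rg1", "NOS3"),
    ("ginsenoside_Re",  "TNF"), ("ginsenoside_Re",  "IL6"),
]

G = nx.Graph()
G.add_nodes_from({c for c, _ in edges}, bipartite="compound")
G.add_nodes_from({t for _, t in edges}, bipartite="target")
G.add_edges_from(edges)

# Relative degree of each target: how many compounds hit it, a simple proxy
# for the "multiple ingredients, multiple targets" pattern.
targets = [n for n, d in G.nodes(data=True) if d["bipartite"] == "target"]
for target in sorted(targets, key=G.degree, reverse=True):
    print(target, G.degree(target))
```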

  3. The Experience of Persons With Multiple Sclerosis Using MS INFoRm: An Interactive Fatigue Management Resource.

    PubMed

    Pétrin, Julie; Akbar, Nadine; Turpin, Karen; Smyth, Penelope; Finlayson, Marcia

    2018-04-01

    We aimed to understand participants' experiences with a self-guided fatigue management resource, Multiple Sclerosis: An Interactive Fatigue Management Resource (MS INFoRm), and the extent to which they found its contents relevant and useful to their daily lives. We recruited 35 persons with MS experiencing mild to moderate fatigue, provided them with MS INFoRm, and then conducted semistructured interviews 3 weeks and 3 months after they received the resource. Interpretive description guided the analysis process. Findings indicate that participants' experience of using MS INFoRm could be understood as a process of change, influenced by their initial reactions to the resource. They reported experiencing a shift in knowledge, expectations, and behaviors with respect to fatigue self-management. These shifts led to multiple positive outcomes, including increased levels of self-confidence and improved quality of life. These findings suggest that MS INFoRm may have a place in the continuum of fatigue management interventions for people with MS.

  4. Free Energy Landscape and Multiple Folding Pathways of an H-Type RNA Pseudoknot

    PubMed Central

    Bian, Yunqiang; Zhang, Jian; Wang, Jun; Wang, Jihua; Wang, Wei

    2015-01-01

    How RNA sequences fold to specific tertiary structures is one of the key problems for understanding their dynamics and functions. Here, we study the folding process of an H-type RNA pseudoknot by performing a large-scale all-atom MD simulation and bias-exchange metadynamics. The folding free energy landscapes are obtained and several folding intermediates are identified. It is suggested that the folding occurs via multiple mechanisms, including a step-wise mechanism starting either from the first helix or the second, and a cooperative mechanism with both helices forming simultaneously. Despite the multiple-mechanism nature of the folding, the ensemble folding kinetics estimated from a Markov state model is single-exponential. It is also found that the correlation between folding and binding of metal ions is significant, and the bound ions mediate long-range interactions in the intermediate structures. Non-native interactions are found to be dominant in the unfolded state and also present in some intermediates, possibly hindering the folding process of the RNA. PMID:26030098

  5. Collaboration process for integrated social and health care strategy implementation.

    PubMed

    Korpela, Jukka; Elfvengren, Kalle; Kaarna, Tanja; Tepponen, Merja; Tuominen, Markku

    2012-01-01

    To present a collaboration process for creating a roadmap for the implementation of a strategy for integrated health and social care. The developed collaboration process includes multiple phases and uses electronic group decision support system technology (GDSS). A case study done in the South Karelia District of Social and Health Services in Finland during 2010-2011. An expert panel of 13 participants was used in the planning process of the strategy implementation. The participants were interviewed and observed during the case study. As a practical result, a roadmap for integrated health and social care strategy implementation has been developed. The strategic roadmap includes detailed plans of several projects which are needed for successful integration strategy implementation. As an academic result, a collaboration process to create such a roadmap has been developed. The collaboration process and technology seem to suit the planning process well. The participants of the meetings were satisfied with the collaboration process and the GDSS technology. The strategic roadmap was accepted by the participants, which indicates satisfaction with the developed process.

  6. Including Multiple Voices in Collaboratively Designing a Teacher Education Program

    ERIC Educational Resources Information Center

    Konecki, Loretta R.; Sturdivant, Robika L.; King, Caryn M.; Melin, Jacquelyn A.; Lancaster, Paula E.

    2012-01-01

    This narrative case study describes the collaborative processes employed by a midwestern university as it designed and implemented a clinically based, postbaccalaureate teacher preparation program for science, technology, engineering, and mathematics (STEM) graduates committed to teaching in high need secondary schools. The program development…

  7. Microbiological Impact on Carbon Capture and Sequestration: Biotic Processes in Natural CO2 Analogue

    EPA Science Inventory

    Multiple ground-water based microbial community analyses including membrane lipids assays for phospholipid fatty acid and DNA analysis were performed from hydraulically isolated zones. DGGE results from DNA extracts from vertical profiling of the entire depth of aquifer sampled a...

  8. Use of a quality trait index to increase the reliability of phenotypic evaluations in broccoli

    USDA-ARS?s Scientific Manuscript database

    Selection of superior broccoli hybrids involves multiple considerations, including optimization of head quality traits. Quality assessment of broccoli heads is often confounded by relatively subjective human preferences for optimal appearance of heads. To assist the selection process, we assessed fi...

  9. Managing Curriculum Change and "Ontological Uncertainty" in Tertiary Education

    ERIC Educational Resources Information Center

    Keesing-Styles, Linda; Nash, Simon; Ayres, Robert

    2014-01-01

    Curriculum reform at institutional level is a challenging endeavour. Those charged with leading this process will encounter both enthusiasm and multiple obstacles to teacher engagement including the particularly complex issue of confronting existing teacher identities. At Unitec Institute of Technology (Unitec), the "Living Curriculum"…

  10. Concordance in Genomic Changes Between Mouse Lungs and Human Airway Epithelial Cells Exposed to Diesel Exhaust Particles

    EPA Science Inventory

    Human and animal toxicity studies have shown that exposure to diesel exhaust particles (DEP) or their constituents affect multiple biological processes including immune and inflammatory pathways, mutagenesis and in some cases carcinogenesis. This study compared genomic changes by...

  11. Effects of Cognitive Demand on Word Encoding in Adults Who Stutter

    ERIC Educational Resources Information Center

    Tsai, Pei-Tzu

    2011-01-01

    The etiology of persistent stuttering is unknown, but stuttering has been attributed to multiple potential factors, including difficulty in processing language-related information, but findings remain inconclusive regarding any "specific" linguistic deficit potentially causing stuttering. One particular challenge in drawing conclusions is the…

  12. A Software Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Donald J.; Martin, Richard E.; Seebo, Jeff P.; Trinh, Long B.; Walker, James L.; Winfree, William P.

    2007-01-01

    Ultrasonic, microwave, and terahertz nondestructive evaluation imaging systems generally require the acquisition of waveforms at each scan point to form an image. For such systems, signal and image processing methods are commonly needed to extract information from the waves and improve resolution of, and highlight, defects in the image. Since all waveform-based NDE methods share some similarity, a common software platform containing multiple signal and image processing techniques for processing the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. This presentation describes NASA Glenn Research Center's approach in developing a common software platform for processing waveform-based NDE signals and images. This platform is currently in use at NASA Glenn and at Lockheed Martin Michoud Assembly Facility for processing of pulsed terahertz and ultrasonic data. Highlights of the software operation will be given. A case study will be shown for use with terahertz data. The authors also invite scientists and engineers interested in sharing customized signal and image processing algorithms to contribute to this effort; the authors will code up and include these algorithms in future releases.

  13. Quantitative Assessment of Cervical Vertebral Maturation Using Cone Beam Computed Tomography in Korean Girls

    PubMed Central

    Byun, Bo-Ram; Kim, Yong-Il; Maki, Koutaro; Son, Woo-Sung

    2015-01-01

    This study aimed to examine the correlation between skeletal maturation status and parameters from the odontoid process/body of the second cervical vertebra and the bodies of the third and fourth cervical vertebrae, and to build multiple regression models for estimating skeletal maturation status in Korean girls. Hand-wrist radiographs and cone beam computed tomography (CBCT) images were obtained from 74 Korean girls (6–18 years of age). CBCT-generated cervical vertebral maturation (CVM) was used to demarcate the odontoid process and the body of the second cervical vertebra, based on the dentocentral synchondrosis. Correlation coefficient analysis and multiple linear regression analysis were used for each parameter of the cervical vertebrae (P < 0.05). Forty-seven of 64 parameters from CBCT-generated CVM (independent variables) exhibited statistically significant correlations (P < 0.05). The multiple regression model with the greatest R² had six parameters (PH2/W2, UW2/W2, (OH+AH2)/LW2, UW3/LW3, D3, and H4/W4) as independent variables, each with a variance inflation factor (VIF) of <2. CBCT-generated CVM was thus able to contribute parameters from both the second cervical vertebral body and the odontoid process to the multiple regression models. This suggests that quantitative analysis might be used to estimate skeletal maturation status. PMID:25878721
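
    The regression-plus-collinearity-check procedure described above can be sketched as follows. This is a hedged illustration using synthetic data, not the authors' code; the column names merely echo the ratio parameters quoted in the abstract.

      # Multiple linear regression with a variance inflation factor (VIF) check,
      # mirroring the reported criterion that all retained predictors keep VIF < 2.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.stats.outliers_influence import variance_inflation_factor

      rng = np.random.default_rng(0)
      df = pd.DataFrame(rng.normal(size=(74, 6)),
                        columns=["PH2_W2", "UW2_W2", "OH_AH2_LW2", "UW3_LW3", "D3", "H4_W4"])
      df["maturation"] = df.sum(axis=1) + rng.normal(scale=0.5, size=74)  # synthetic outcome

      X = sm.add_constant(df.drop(columns="maturation"))
      model = sm.OLS(df["maturation"], X).fit()

      vifs = {col: variance_inflation_factor(X.values, i)
              for i, col in enumerate(X.columns) if col != "const"}
      print(model.rsquared, vifs)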

  14. Toshiba TDF-500 High Resolution Viewing And Analysis System

    NASA Astrophysics Data System (ADS)

    Roberts, Barry; Kakegawa, M.; Nishikawa, M.; Oikawa, D.

    1988-06-01

    A high resolution, operator interactive, medical viewing and analysis system has been developed by Toshiba and Bio-Imaging Research. This system provides many advanced features including high resolution displays, a very large image memory and advanced image processing capability. In particular, the system provides CRT frame buffers capable of update in one frame period, an array processor capable of image processing at operator interactive speeds, and a memory system capable of updating multiple frame buffers at frame rates whilst supporting multiple array processors. The display system provides 1024 x 1536 display resolution at 40Hz frame and 80Hz field rates. In particular, the ability to provide whole or partial update of the screen at the scanning rate is a key feature. This allows multiple viewports or windows in the display buffer with both fixed and cine capability. To support image processing features such as windowing, pan, zoom, minification, filtering, ROI analysis, multiplanar and 3D reconstruction, a high performance CPU is integrated into the system. This CPU is an array processor capable of up to 400 million instructions per second. To support the instantaneous high memory bandwidth requirements of the multiple viewers and array processors, an ultra fast memory system is used. This memory system has a bandwidth capability of 400MB/sec and a total capacity of 256MB. This bandwidth is more than adequate to support several high resolution CRT's and also the fast processing unit. This fully integrated approach allows effective real time image processing. The integrated design of the viewing system, memory system and array processor is key to the imaging system. This paper describes the architecture of the imaging system.
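
    A quick back-of-the-envelope check, added here as my own arithmetic rather than a figure from the paper, makes the bandwidth claim concrete; it assumes 1-byte (8-bit) pixels, which is an assumption, not a stated specification.

      # Refresh bandwidth per display vs. the quoted 400 MB/s memory bandwidth.
      pixels_per_frame = 1024 * 1536
      frame_rate_hz = 40                 # full-frame update rate quoted above
      bytes_per_pixel = 1                # assumed 8-bit grayscale depth

      per_display_MBps = pixels_per_frame * frame_rate_hz * bytes_per_pixel / 1e6
      print(per_display_MBps)            # roughly 63 MB/s per display
      print(400 / per_display_MBps)      # about six displays' worth of refresh traffic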

  15. A model of human decision making in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1982-01-01

    Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many multiple process monitoring situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he feels events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
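
    The probability-estimation step can be illustrated with a small discriminant-analysis sketch. The data are synthetic stand-ins for process observations; this is not the paper's implementation, only the elementary pattern-recognition idea it names.

      # Linear discriminant analysis used to estimate P(event) for monitored processes.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(1)
      # Training windows: label 1 = an event occurred, 0 = normal operation.
      X = np.vstack([rng.normal(0.0, 1.0, size=(200, 3)),
                     rng.normal(1.5, 1.0, size=(200, 3))])
      y = np.array([0] * 200 + [1] * 200)

      lda = LinearDiscriminantAnalysis().fit(X, y)

      # For new observations from each process, estimate the event probability;
      # attention would then be allocated to the processes with the highest estimates.
      new_obs = rng.normal(0.8, 1.0, size=(4, 3))
      print(lda.predict_proba(new_obs)[:, 1])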

  16. Tactical resource allocation and elective patient admission planning in care processes.

    PubMed

    Hulshof, Peter J H; Boucherie, Richard J; Hans, Erwin W; Hurink, Johann L

    2013-06-01

    Tactical planning of resources in hospitals concerns elective patient admission planning and the intermediate term allocation of resource capacities. Its main objectives are to achieve equitable access for patients, to meet production targets/to serve the strategically agreed number of patients, and to use resources efficiently. This paper proposes a method to develop a tactical resource allocation and elective patient admission plan. These tactical plans allocate available resources to various care processes and determine the selection of patients to be served that are at a particular stage of their care process. Our method is developed in a Mixed Integer Linear Programming (MILP) framework and copes with multiple resources, multiple time periods and multiple patient groups with various uncertain treatment paths through the hospital, thereby integrating decision making for a chain of hospital resources. Computational results indicate that our method leads to a more equitable distribution of resources and provides control of patient access times, the number of patients served and the fraction of allocated resource capacity. Our approach is generic, as the base MILP and the solution approach allow for including various extensions to both the objective criteria and the constraints. Consequently, the proposed method is applicable in various settings of tactical hospital management.
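
    To make the MILP framing tangible, here is a deliberately tiny allocation model. The numbers and structure are invented for illustration (using the PuLP solver) and do not reproduce the formulation from the paper.

      # Toy tactical allocation: serve waiting patient queues within per-period capacity.
      import pulp

      periods = [1, 2]
      groups = ["queueA", "queueB", "queueC"]
      waiting = {"queueA": 30, "queueB": 20, "queueC": 25}   # patients waiting
      capacity = {1: 40, 2: 35}                              # servable patients per period

      prob = pulp.LpProblem("tactical_allocation", pulp.LpMinimize)
      serve = pulp.LpVariable.dicts("serve", (groups, periods), lowBound=0, cat="Integer")

      # Objective: minimize patients still waiting at the end of the horizon.
      prob += pulp.lpSum(waiting[g] - pulp.lpSum(serve[g][t] for t in periods) for g in groups)
      for t in periods:                                      # per-period capacity limit
          prob += pulp.lpSum(serve[g][t] for g in groups) <= capacity[t]
      for g in groups:                                       # cannot serve more than are waiting
          prob += pulp.lpSum(serve[g][t] for t in periods) <= waiting[g]

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      print({g: [serve[g][t].value() for t in periods] for g in groups})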

  17. Context-based automated defect classification system using multiple morphological masks

    DOEpatents

    Gleason, Shaun S.; Hunt, Martin A.; Sari-Sarraf, Hamed

    2002-01-01

    Detection of defects during the fabrication of semiconductor wafers is largely automated, but the classification of those defects is still performed manually by technicians. This invention includes novel digital image analysis techniques that generate unique feature vector descriptions of semiconductor defects as well as classifiers that use these descriptions to automatically categorize the defects into one of a set of pre-defined classes. Feature extraction techniques based on multiple-focus images, multiple-defect mask images, and segmented semiconductor wafer images are used to create unique feature-based descriptions of the semiconductor defects. These feature-based defect descriptions are subsequently classified by a defect classifier into categories that depend on defect characteristics and defect contextual information, that is, the semiconductor process layer(s) with which the defect comes in contact. At the heart of the system is a knowledge database that stores and distributes historical semiconductor wafer and defect data to guide the feature extraction and classification processes. In summary, this invention takes as its input a set of images containing semiconductor defect information, and generates as its output a classification for the defect that describes not only the defect itself, but also the location of that defect with respect to the semiconductor process layers.

  18. [Combined application of multiple fluorescence in research on the degradation of fluoranthene by potassium ferrate].

    PubMed

    Li, Si; Yu, Dan-Ni; Ji, Fang-Ying; Zhou, Guang-Ming; He, Qiang

    2012-11-01

    The degradation of fluoranthene by potassium ferrate was investigated using a combination of fluorescence techniques, including emission, synchronous, excitation emission matrix (EEM), time-scan, and photometric measurements. The characteristics of the degradation and the molecular changes of fluoranthene during the process were discussed on the basis of the information provided by all of these fluorescence spectra. Equations describing the degradation of fluoranthene by potassium ferrate were obtained by fitting the time-scan fluorescence curves at different times, and the degradation kinetics were inferred accordingly. The different fluorescence data consistently indicated the same degradation rate at a given reaction time: about 55% at t = 10 s, 81% at t = 25 s, and 91% at t = 40 s. No new fluorescent characteristic was observed at any stage of the degradation. The reaction stage at t ≤ 20 s was crucial; during this stage the degradation was closest to a linear relationship. After this initial stage, the linear relationship deviated gradually as the degradation progressed. The degradation of fluoranthene by potassium ferrate approximately followed first-order kinetics.
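
    Using only the three degradation percentages quoted above, a first-order fit can be checked directly. This is my own consistency check, not an analysis from the paper.

      # Fit C(t)/C0 = exp(-k*t) to the reported fractions remaining.
      import numpy as np
      from scipy.optimize import curve_fit

      t = np.array([10.0, 25.0, 40.0])           # s
      remaining = np.array([0.45, 0.19, 0.09])   # 1 - reported degradation rate

      def first_order(t, k):
          return np.exp(-k * t)

      (k_fit,), _ = curve_fit(first_order, t, remaining, p0=[0.05])
      print(k_fit)                                # apparent rate constant, 1/s
      print(first_order(t, k_fit))                # compare with the observed fractions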

  19. High temperature superconducting composite conductor and method for manufacturing the same

    DOEpatents

    Holesinger, Terry G.; Bingert, John F.

    2002-01-01

    A high temperature superconducting composite conductor is provided including a high temperature superconducting material surrounded by a noble metal layer, the high temperature superconducting composite conductor characterized as having a fill factor of greater than about 40. Additionally, the conductor can be further characterized as containing multiple cores of high temperature superconducting material surrounded by a noble metal layer, said multiple cores characterized as having substantially uniform geometry in the cross-sectional dimensions. Processes of forming such a high temperature superconducting composite conductor are also provided.

  20. Exploring Contextual Models in Chemical Patent Search

    NASA Astrophysics Data System (ADS)

    Urbain, Jay; Frieder, Ophir

    We explore the development of probabilistic retrieval models for integrating term statistics with entity search using multiple levels of document context to improve the performance of chemical patent search. A distributed indexing model was developed to enable efficient named entity search and aggregation of term statistics at multiple levels of patent structure including individual words, sentences, claims, descriptions, abstracts, and titles. The system can be scaled to an arbitrary number of compute instances in a cloud computing environment to support concurrent indexing and query processing operations on large patent collections.
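
    A much-simplified sketch of aggregating term statistics at several structural levels is shown below; it is illustrative only, with hypothetical claim text, and does not reproduce the authors' distributed indexing model.

      # Count term occurrences at the word, sentence, and claim levels of a patent.
      from collections import Counter

      claims = [
          "A compound comprising benzene and a halogen substituent.",
          "The compound of claim 1 wherein the halogen substituent is chlorine.",
      ]

      term_stats = {"word": Counter(), "sentence": Counter(), "claim": Counter()}
      for claim in claims:
          claim_terms = set()
          for sentence in claim.split("."):
              words = [w.strip(",;").lower() for w in sentence.split()]
              term_stats["word"].update(words)           # raw term frequency
              term_stats["sentence"].update(set(words))  # sentence-level occurrence
              claim_terms.update(words)
          term_stats["claim"].update(claim_terms)        # claim-level occurrence

      print(term_stats["word"]["halogen"], term_stats["claim"]["halogen"])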

  1. A scalable parallel algorithm for multiple objective linear programs

    NASA Technical Reports Server (NTRS)

    Wiecek, Malgorzata M.; Zhang, Hong

    1994-01-01

    This paper presents an ADBASE-based parallel algorithm for solving multiple objective linear programs (MOLP's). Job balance, speedup and scalability are of primary interest in evaluating efficiency of the new algorithm. Implementation results on Intel iPSC/2 and Paragon multiprocessors show that the algorithm significantly speeds up the process of solving MOLP's, which is understood as generating all or some efficient extreme points and unbounded efficient edges. The algorithm gives especially good results for large and very large problems. Motivation and justification for solving such large MOLP's are also included.

  2. Photoinitiated grafting of porous polymer monoliths and thermoplastic polymers for microfluidic devices

    DOEpatents

    Frechet, Jean M. J. [Oakland, CA; Svec, Frantisek [Alameda, CA; Rohr, Thomas [Leiden, NL

    2008-10-07

    A microfluidic device preferably made of a thermoplastic polymer that includes a channel or a multiplicity of channels whose surfaces are modified by photografting. The device further includes a porous polymer monolith prepared via UV initiated polymerization within the channel, and functionalization of the pore surface of the monolith using photografting. Processes for making such surface modifications of thermoplastic polymers and porous polymer monoliths are set forth.

  3. Pediatric Multiple Sclerosis: Genes, Environment, and a Comprehensive Therapeutic Approach.

    PubMed

    Cappa, Ryan; Theroux, Liana; Brenton, J Nicholas

    2017-10-01

    Pediatric multiple sclerosis is an increasingly recognized and studied disorder that accounts for 3% to 10% of all patients with multiple sclerosis. The risk for pediatric multiple sclerosis is thought to reflect a complex interplay between environmental and genetic risk factors. Environmental exposures, including sunlight (ultraviolet radiation, vitamin D levels), infections (Epstein-Barr virus), passive smoking, and obesity, have been identified as potential risk factors in youth. Genetic predisposition contributes to the risk of multiple sclerosis, and the major histocompatibility complex on chromosome 6 makes the single largest contribution to susceptibility to multiple sclerosis. With the use of large-scale genome-wide association studies, other non-major histocompatibility complex alleles have been identified as independent risk factors for the disease. The bridge between environment and genes likely lies in the study of epigenetic processes, which are environmentally-influenced mechanisms through which gene expression may be modified. This article will review these topics to provide a framework for discussion of a comprehensive approach to counseling and ultimately treating the pediatric patient with multiple sclerosis. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. [Being cared for and caring: living with multiple chronic diseases (Leila)-a qualitative study about APN contributions to integrated care].

    PubMed

    Müller-Staub, Maria; Zigan, Nicole; Händler-Schuster, Daniela; Probst, Sebastian; Monego, Renate; Imhof, Lorenz

    2015-04-01

    Living with multiple chronic diseases is complex and leads to enhanced care needs. To foster integrated care, a project called "Living with chronic disease" (Leila) was initiated. The aim was to develop an Advanced Practice Nursing (APN) service in collaboration with medical centers for persons who are living with multiple chronic diseases. The following research questions were addressed: 1. What are the experiences of patients, referring physicians and APNs with the Leila service? 2. How are referral processes performed? 3. How do the involved groups experience collaboration and APN role development? A qualitative approach according to the grounded theory of Corbin and Strauss was used to explore the experiences with the Leila project and the interaction of the persons involved. 38 interviews were conducted with patients who are living with multiple chronic diseases, their APNs and the referring physicians. The findings revealed "Being cared for and caring" as the main category. The data demonstrated how patients responded to their involvement in care and that they were taken seriously as partners in the care process. The category "organizing everyday life" describes how patients learned to cope with the consequences of living with multiple chronic diseases. Another category, "using all resources", demonstrates how capabilities and strengths were adopted. The results of the cooperation and allocation processes showed that APN recognition and APN role performance have to be negotiated. Prospective APN services for this patient population should be integrated along with physician networks and other service providers, including community health nursing.

  5. Future Directions in Vulnerability to Depression among Youth: Integrating Risk Factors and Processes across Multiple Levels of Analysis

    PubMed Central

    Hankin, Benjamin L.

    2014-01-01

    Depression is a developmental phenomenon. Considerable progress has been made in describing the syndrome, establishing its prevalence and features, providing clues as to its etiology, and developing evidence-based treatment and prevention options. Despite considerable headway in distinct lines of vulnerability research, there is an explanatory gap in the field's ability to more comprehensively explain and predict who is likely to become depressed, when, and why. Still, despite clear success in predicting moderate variance for future depression, especially with empirically rigorous methods and designs, the heterogeneous and multi-determined nature of depression suggests that additional etiologies need to be included to advance knowledge on developmental pathways to depression. This paper advocates for a multiple levels of analysis approach to investigating vulnerability to depression across the lifespan and to providing a more comprehensive understanding of its etiology. One example of a multiple levels of analysis model of vulnerabilities to depression is provided that integrates the most accessible, observable factors (e.g., cognitive and temperament risks), intermediate processes and endophenotypes (e.g., information processing biases, biological stress physiology, and neural activation and connectivity), and genetic influences (e.g., candidate genes and epigenetics). Evidence for each of these factors as well as their cross-level integration is provided. Methodological and conceptual considerations important for conducting integrative, multiple levels of depression vulnerability research are discussed. Finally, translational implications for how a multiple levels of analysis perspective may confer additional leverage to reduce the global burden of depression and improve care are considered. PMID:22900513

  6. Functional characterization of the Drosophila MRP (mitochondrial RNA processing) RNA gene.

    PubMed

    Schneider, Mary D; Bains, Anupinder K; Rajendra, T K; Dominski, Zbigniew; Matera, A Gregory; Simmonds, Andrew J

    2010-11-01

    MRP RNA is a noncoding RNA component of RNase mitochondrial RNA processing (MRP), a multi-protein eukaryotic endoribonuclease reported to function in multiple cellular processes, including ribosomal RNA processing, mitochondrial DNA replication, and cell cycle regulation. A recent study predicted a potential Drosophila ortholog of MRP RNA (CR33682) by computer-based genome analysis. We have confirmed the expression of this gene and characterized the phenotype associated with this locus. Flies with mutations that specifically affect MRP RNA show defects in growth and development that begin in the early larval period and end in larval death during the second instar stage. We present several lines of evidence demonstrating a role for Drosophila MRP RNA in rRNA processing. The nuclear fraction of Drosophila MRP RNA localizes to the nucleolus. Further, a mutant strain shows defects in rRNA processing that include a defect in 5.8S rRNA processing, typical of MRP RNA mutants in other species, as well as defects in early stages of rRNA processing.

  7. Enhancement of ORR catalytic activity by multiple heteroatom-doped carbon materials.

    PubMed

    Kim, Dae-wook; Li, Oi Lun; Saito, Nagahiro

    2015-01-07

    Heteroatom-doped carbon matrices have been attracting significant attention due to their superior electrochemical stability, light weight and low cost. Hence, in this study, carbon matrices with various types of heteroatom dopant, including single dopants of N, B and P and multiple dopants of B-N and P-N, were synthesized by an innovative method named the solution plasma process. The heteroatom was doped into the carbon matrix during the discharge process by continuous dissociation and recombination of precursors. The chemical bonding structure, ORR activity and electrochemical performance were compared in detail for each single dopant and multiple dopant combination. According to the Raman spectra, the carbon structures were deformed by the doped heteroatoms in the carbon matrix. In comparison with N-doped structures (NCNS), the ORR potential of PN-doped structures (PNCNS) was positively shifted from -0.27 V to -0.24 V. It was observed that doping with N decreased the bonding between P and C in the matrix. The multiple doping induced additional active sites for ORR, which further enhanced ORR activity and stability. Therefore, PNCNS is a promising metal-free catalyst for ORR at the cathode in a fuel cell.

  8. Understanding and Managing the Assessment Process

    Treesearch

    Gene Lessard; Scott Archer; John R. Probst; Sandra Clark

    1999-01-01

    Taking an ecological approach to management, or ecosystem management, is a developing approach for managing natural resources within the context of large geographic scales and over multiple time frames. Recently, the Council on Environmental Quality (CEQ) (IEMTF 1995) defined an ecosystem as "...an interconnected community of living things, including humans, and...

  9. Development of a Rubric to Improve Critical Thinking

    ERIC Educational Resources Information Center

    Hildenbrand, Kasee J.; Schultz, Judy A.

    2012-01-01

    Context: Health care professionals, including athletic trainers are confronted daily with multiple complex problems that require critical thinking. Objective: This research attempts to develop a reliable process to assess students' critical thinking in a variety of athletic training and kinesiology courses. Design: Our first step was to create a…

  10. Interactive Video Usage on Autism Spectrum Disorder Training in Medical Education

    ERIC Educational Resources Information Center

    Taslibeyaz, Elif; Dursun, Onur Burak; Karaman, Selcuk

    2017-01-01

    This study aimed to compare the effects of interactive and non-interactive videos concerning the autism spectrum disorder on medical students' achievement. It also evaluated the relation between the interactive videos' interactivity and the students' decision-making process. It used multiple methods, including quantitative and qualitative methods.…

  11. Self-Assessment as a Process for Inclusion

    ERIC Educational Resources Information Center

    Bourke, Roseanna; Mentis, Mandia

    2013-01-01

    There are multiple ways that assessment is positioned within education: as a method for accountability, a strategy to attract funding and an approach to support learning. Different assessment practices portray students in different ways and can serve to include or exclude them in their learning and assessment. Students are often categorised and…

  12. Multimodal Resemiotization and Authorial Agency in an L2 Writing Classroom

    ERIC Educational Resources Information Center

    Cimasko, Tony; Shin, Dong-shin

    2017-01-01

    This study examines the composing process and authorial agency of a college ESL writer as she remediated an argumentative essay into a multimodal digital video. Employing principles of sociosemiotic ethnography, and drawing on the concepts of resemiotization and recontextualization, the study investigated multiple types of data, including an…

  13. The Kindergarten Path Effect Revisited: Children's Use of Context in Processing Structural Ambiguities

    ERIC Educational Resources Information Center

    Weighall, Anna R.

    2008-01-01

    Research with adults has shown that ambiguous spoken sentences are resolved efficiently, exploiting multiple cues--including referential context--to select the intended meaning. Paradoxically, children appear to be insensitive to referential cues when resolving ambiguous sentences, relying instead on statistical properties intrinsic to the…

  14. Bicultural Team Teaching: Experiences from an Emerging Business School.

    ERIC Educational Resources Information Center

    Napier, Nancy K.; Hang, Ngo Minh; Mai, Nyugen Thi Tuyet; Thang, Nyugen Van; Tuan, Vu Van

    2002-01-01

    A new graduate business course in Vietnam team taught by American and Vietnamese instructors illustrates issues in bicultural team teaching, including team formation, sharing workloads in and out of class, and evaluation/grading. The process made the class more relevant, exposed students to multiple perspectives, and helped participants appreciate…

  15. Nutrient transport in runoff as affected by diet, tillage and manure application rate

    USDA-ARS?s Scientific Manuscript database

    Including distillers grains in feedlot finishing diets may increase feedlot profitability. However, the nutrients in the by-products are concentrated about threefold during the distillation process. Manure can be applied to meet single or multiple year crop nutrient requirements. The water quality eff...

  16. Nursing Admission Practices to Discern "Fit": A Case Study Exemplar

    ERIC Educational Resources Information Center

    Sinutko, Jaime M.

    2014-01-01

    Admission to a baccalaureate nursing school in the United States is currently a challenging proposition for a variety of reasons. This research explored a holistic nursing school admission process at a small, private, baccalaureate college using a retrospective, mixed-method, approach. The holistic method included multiple admission criteria, both…

  17. Principals and Teachers "Craft Coherence" among Accountability Policies

    ERIC Educational Resources Information Center

    Stosich, Elizabeth Leisy

    2018-01-01

    Purpose: The purpose of this paper is to examine how US school leaders and teachers make sense of multiple accountability policies, including the Common Core State Standards and teacher evaluation, and how this process relates to school priorities and classroom practice. Design/methodology/approach: This study uses a comparative case study…

  18. Behavioral Modeling of Adversaries with Multiple Objectives in Counterterrorism.

    PubMed

    Mazicioglu, Dogucan; Merrick, Jason R W

    2018-05-01

    Attacker/defender models have primarily assumed that each decisionmaker optimizes the cost of the damage inflicted and its economic repercussions from their own perspective. Two streams of recent research have sought to extend such models. One stream suggests that it is more realistic to consider attackers with multiple objectives, but this research has not included the adaptation of the terrorist with multiple objectives to defender actions. The other stream builds off experimental studies that show that decisionmakers deviate from optimal rational behavior. In this article, we extend attacker/defender models to incorporate multiple objectives that a terrorist might consider in planning an attack. This includes the tradeoffs that a terrorist might consider and their adaptation to defender actions. However, we must also consider experimental evidence of deviations from the rationality assumed in the commonly used expected utility model in determining such adaptation. Thus, we model the attacker's behavior using multiattribute prospect theory to account for the attacker's multiple objectives and deviations from rationality. We evaluate our approach by considering an attacker with multiple objectives who wishes to smuggle radioactive material into the United States and a defender who has the option to implement a screening process to hinder the attacker. We discuss the problems with implementing such an approach, but argue that research in this area must continue to avoid misrepresenting terrorist behavior in determining optimal defensive actions. © 2017 Society for Risk Analysis.

  19. RCT of a Psychological Intervention for Patients With Cancer: I. Mechanisms of Change

    PubMed Central

    Andersen, Barbara L.; Shelby, Rebecca A.; Golden-Kreutz, Deanna M.

    2008-01-01

    Little is known about the therapeutic processes contributing to efficacy of psychological interventions for patients with cancer. Data from a randomized clinical trial yielding robust biobehavioral and health effects (B. L. Andersen et al., 2004, 2007) were used to examine associations between process variables, treatment utilization, and outcomes. Novel findings emerged. Patients were highly satisfied with the treatment, but their higher levels of felt support (group cohesion) covaried with lower distress and fewer symptoms. Also, specific treatment strategies were associated with specific outcomes, including lower distress, improved dietary habits, reduced symptomatology, and higher chemotherapy dose intensity. These data provide a comprehensive test of multiple therapeutic processes and mechanisms for biobehavioral change with an intervention including both intensive and maintenance phases. PMID:18085909

  20. Planning assistance for the NASA 30/20 GHz program. Network control architecture study.

    NASA Technical Reports Server (NTRS)

    Inukai, T.; Bonnelycke, B.; Strickland, S.

    1982-01-01

    The network control architecture for a 30/20 GHz flight experiment system operating in the Time Division Multiple Access (TDMA) mode was studied. Architecture development, identification of processing functions, and performance requirements for the Master Control Station (MCS), diversity trunking stations, and Customer Premises Service (CPS) stations are covered. Preliminary hardware and software processing requirements as well as budgetary cost estimates for the network control system are given. For the trunking system control, areas covered include on board SS-TDMA switch organization, frame structure, acquisition and synchronization, channel assignment, fade detection and adaptive power control, on board oscillator control, and terrestrial network timing. For the CPS control, they include on board processing and adaptive forward error correction control.

  1. Primary care research conducted in networks: getting down to business.

    PubMed

    Mold, James W

    2012-01-01

    This seventh annual practice-based research theme issue of the Journal of the American Board of Family Medicine highlights primary care research conducted in practice-based research networks (PBRNs). The issue includes discussion of (1) theoretical and methodological research, (2) health care research (studies addressing primary care processes), (3) clinical research (studies addressing the impact of primary care on patients), and (4) health systems research (studies of health system issues impacting primary care including the quality improvement process). We had a noticeable increase in submissions from PBRN collaborations, that is, studies that involved multiple networks. As PBRNs cooperate to recruit larger and more diverse patient samples, greater generalizability and applicability of findings lead to improved primary care processes.

  2. Metabolism and function of phenazines in bacteria: impacts on the behavior of bacteria in the environment and biotechnological processes

    PubMed Central

    Pierson, Elizabeth A.

    2010-01-01

    Phenazines constitute a large group of nitrogen-containing heterocyclic compounds produced by a diverse range of bacteria. Both natural and synthetic phenazine derivatives are studied due to their impacts on bacterial interactions and biotechnological processes. Phenazines serve as electron shuttles to alternate terminal acceptors, modify cellular redox states, act as cell signals that regulate patterns of gene expression, contribute to biofilm formation and architecture, and enhance bacterial survival. Phenazines have diverse effects on eukaryotic hosts and host tissues, including the modification of multiple host cellular responses. In plants, phenazines also may influence growth and elicit induced systemic resistance. Here, we discuss emerging evidence that phenazines play multiple roles for the producing organism and contribute to their behavior and ecological fitness. PMID:20352425

  3. Apollo-Soyuz pamphlet no. 8: Zero-g technology. [experimental design, space processing and aerospace engineering]

    NASA Technical Reports Server (NTRS)

    Page, L. W.; From, T. P.

    1977-01-01

    The behavior of liquids in zero gravity environments is discussed with emphasis on foams, wetting, and wicks. A multipurpose electric furnace (MA-010) for the high temperature processing of metals and salts in zero-g is described. Experiments discussed include: monotectic and syntectic alloys (MA-041); multiple material melting point (MA-150); zero-g processing of metals (MA-070); surface tension induced convection (MA-041); halide eutectic growth; interface markings in crystals (MA-060); crystal growth from the vapor phase (MA-085); and photography of crystal growth (MA-028).

  4. Tectonic Inversion Along the Algerian and Ligurian Margins: On the Insight Provided By Latest Seismic Processing Techniques Applied to Recent and Vintage 2D Offshore Multichannel Seismic Data

    NASA Astrophysics Data System (ADS)

    Schenini, L.; Beslier, M. O.; Sage, F.; Badji, R.; Galibert, P. Y.; Lepretre, A.; Dessa, J. X.; Aidi, C.; Watremez, L.

    2014-12-01

    Recent studies on the Algerian and the North-Ligurian margins in the Western Mediterranean have evidenced inversion-related superficial structures, such as folds and asymmetric sedimentary perched basins whose geometry hints at deep compressive structures dipping towards the continent. Deep seismic imaging of these margins is difficult due to steep slope and superficial multiples, and, in the Mediterranean context, to the highly diffractive Messinian evaporitic series in the basin. During the Algerian-French SPIRAL survey (2009, R/V Atalante), 2D marine multi-channel seismic (MCS) reflection data were collected along the Algerian Margin using a 4.5 km, 360 channel digital streamer and a 3040 cu. in. air-gun array. An advanced processing workflow has been laid out using Geocluster CGG software, which includes noise attenuation, 2D SRME multiple attenuation, surface consistent deconvolution, Kirchhoff pre-stack time migration. This processing produces satisfactory seismic images of the whole sedimentary cover, and of southward dipping reflectors in the acoustic basement along the central part of the margin offshore Great Kabylia, that are interpreted as inversion-related blind thrusts as part of flat-ramp systems. We applied this successful processing workflow to old 2D marine MCS data acquired on the North-Ligurian Margin (Malis survey, 1995, R/V Le Nadir), using a 2.5 km, 96 channel streamer and a 1140 cu. in. air-gun array. Particular attention was paid to multiple attenuation in adapting our workflow. The resulting reprocessed seismic images, interpreted with a coincident velocity model obtained by wide-angle data tomography, provide (1) enhanced imaging of the sedimentary cover down to the top of the acoustic basement, including the base of the Messinian evaporites and the sub-salt Miocene series, which appear to be tectonized as far as in the mid-basin, and (2) new evidence of deep crustal structures in the margin which the initial processing had failed to reveal.

  5. Co-Creativity and Interactive Repair: Commentary on Berta Bornstein's "The Analysis of a Phobic Child".

    PubMed

    Harrison, Alexandra

    2014-01-01

    My comments focus on a consideration of three issues central to child psychoanalysis stimulated by rereading the classic paper by Berta Bornstein, "The Analysis of a Phobic Child: Some Problems of Theory and Technique in Child Analysis": (1) the importance of "co-creativity" and its use in analysis to repair disruptions in the mother-child relationship; (2) working analytically with the "inner world of the child"; and (3) the fundamental importance of multiple simultaneous meaning-making processes. I begin with a discussion of current thinking about the importance of interactive processes in developmental and therapeutic change and then lead to the concepts of "co-creativity" and interactive repair, elements that are missing in the "Frankie" paper. The co-creative process that I outline includes multiple contributions that Frankie and his caregivers brought to their relationships--his mother, his father, his nurse, and even his analyst. I then address the question of how child analysts can maintain a central focus on the inner world of the child while still taking into account the complex nature of co-creativity in the change process. Finally, I discuss insights into the multiple simultaneous meaning-making processes in the analytic relationship to effect therapeutic change, including what I call the "sandwich model," an attempt to organize this complexity so that it is more accessible to the practicing clinician. In terms of the specific case of Frankie, my reading of the case suggests that failure to repair disruptions in the mother-child relationship from infancy through the time of the analytic treatment was central to Frankie's problems. My hypothesis is that, rather than the content of his analyst's interpretations, what was helpful to Frankie in the analysis was the series of attempts at interactive repair in the analytic process. Unfortunately, the case report does not offer data to test this hypothesis. Indeed, one concluding observation from my reading of this classic case is how useful it would be for the contemporary analyst to pay attention to the multifaceted co-creative process in order to explain and foster the therapeutic change that can occur in analysis.

  6. Natural Resource Information System. Volume 1: Overall description

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A prototype computer-based Natural Resource Information System was designed which could store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, the use of remote sensing as a data source, and it is useful at multiple management levels. A survey established current decision making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and processing requirements were established. Processing software was constructed and a data base established using high-altitude imagery and map coverage of selected areas of SE Arizona. Finally a demonstration of system processing functions was conducted utilizing material from the data base.

  7. Alternative Splicing of sept9a and sept9b in Zebrafish Produces Multiple mRNA Transcripts Expressed Throughout Development

    PubMed Central

    Hannibal, Mark C.; Kimelman, David

    2010-01-01

    Background: Septins are involved in a number of cellular processes including cytokinesis and organization of the cytoskeleton. Alterations in human septin-9 (SEPT9) levels have been linked to multiple cancers, whereas mutations in SEPT9 cause the episodic neuropathy, hereditary neuralgic amyotrophy (HNA). Despite its important function in human health, the in vivo role of SEPT9 is unknown. Methodology/Principal Findings: Here we utilize zebrafish to study the role of SEPT9 in early development. We show that zebrafish possess two genes, sept9a and sept9b that, like humans, express multiple transcripts. Knockdown or overexpression of sept9a transcripts results in specific developmental alterations including circulation defects and aberrant epidermal development. Conclusions/Significance: Our work demonstrates that sept9 plays an important role in zebrafish development, and establishes zebrafish as a valuable model organism for the study of SEPT9. PMID:20502708

  8. Multiple-hypothesis multiple-model line tracking

    NASA Astrophysics Data System (ADS)

    Pace, Donald W.; Owen, Mark W.; Cox, Henry

    2000-07-01

    Passive sonar signal processing generally includes tracking of narrowband and/or broadband signature components observed on a Lofargram or on a Bearing-Time-Record (BTR) display. Fielded line tracking approaches to date have been recursive, single-hypothesis-oriented Kalman or alpha-beta filters, with no mechanism for considering tracking alternatives beyond the most recent scan of measurements. While adaptivity is often built into the filter to handle changing track dynamics, these approaches are still extensions of single target tracking solutions to the multiple target tracking environment. This paper describes an application of multiple-hypothesis, multiple target tracking technology to the sonar line tracking problem. A Multiple Hypothesis Line Tracker (MHLT) is developed which retains the recursive minimum-mean-square-error tracking behavior of a Kalman filter in a maximum-a-posteriori delayed-decision multiple hypothesis context. Multiple line track filter states are developed and maintained using the interacting multiple model (IMM) state representation. Further, the data association and assignment problem is enhanced by considering line attribute information (line bandwidth and SNR) in addition to beam/bearing and frequency fit. MHLT results on real sonar data are presented to demonstrate the benefits of the multiple hypothesis approach. The utility of the system in cluttered environments and particularly in crossing line situations is shown.
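
    To ground the discussion, a single-line, single-hypothesis recursive tracker of the kind the MHLT extends can be written in a few lines. The model and noise values below are invented for illustration and are not from the paper.

      # Scalar Kalman filter tracking the frequency of a single narrowband line.
      import numpy as np

      rng = np.random.default_rng(2)
      true_freq = 50.0                               # Hz
      measurements = true_freq + rng.normal(0.0, 0.5, size=20)

      x, P = measurements[0], 1.0                    # state estimate and its variance
      Q, R = 0.01, 0.25                              # process and measurement noise variances

      for z in measurements[1:]:
          P = P + Q                                  # predict (frequency assumed constant)
          K = P / (P + R)                            # Kalman gain
          x = x + K * (z - x)                        # update with the new measurement
          P = (1.0 - K) * P

      print(x)                                       # smoothed frequency estimate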

  9. On the Role of Multi-Scale Processes in CO2 Storage Security and Integrity

    NASA Astrophysics Data System (ADS)

    Pruess, K.; Kneafsey, T. J.

    2008-12-01

    Consideration of multiple scales in subsurface processes is usually referred to the spatial domain, where we may attempt to relate process descriptions and parameters from pore and bench (Darcy) scale to much larger field and regional scales. However, multiple scales occur also in the time domain, and processes extending over a broad range of time scales may be very relevant to CO2 storage and containment. In some cases, such as in the convective instability induced by CO2 dissolution in saline waters, space and time scales are coupled in the sense that perturbations induced by CO2 injection will grow concurrently over many orders of magnitude in both space and time. In other cases, CO2 injection may induce processes that occur on short time scales, yet may affect large regions. Possible examples include seismicity that may be triggered by CO2 injection, or hypothetical release events such as "pneumatic eruptions" that may discharge substantial amounts of CO2 over a short time period. This paper will present recent advances in our experimental and modeling studies of multi-scale processes. Specific examples that will be discussed include (1) the process of CO2 dissolution-diffusion-convection (DDC), that can greatly accelerate the rate at which free-phase CO2 is stored as aqueous solute; (2) self- enhancing and self-limiting processes during CO2 leakage through faults, fractures, or improperly abandoned wells; and (3) porosity and permeability reduction from salt precipitation near CO2 injection wells, and mitigation of corresponding injectivity loss. This work was supported by the Office of Basic Energy Sciences and by the Zero Emission Research and Technology project (ZERT) under Contract No. DE-AC02-05CH11231 with the U.S. Department of Energy.

  10. Simulation of deleterious processes in a static-cell diode pumped alkali laser

    NASA Astrophysics Data System (ADS)

    Oliker, Benjamin Q.; Haiducek, John D.; Hostutler, David A.; Pitz, Greg A.; Rudolph, Wolfgang; Madden, Timothy J.

    2014-02-01

    The complex interactions in a diode pumped alkali laser (DPAL) gain cell provide opportunities for multiple deleterious processes to occur. Effects that may be attributable to deleterious processes have been observed experimentally in a cesium static-cell DPAL at the United States Air Force Academy [B.V. Zhdanov, J. Sell, R.J. Knize, "Multiple laser diode array pumped Cs laser with 48 W output power," Electronics Letters, 44, 9 (2008)]. The power output in the experiment was seen to go through a "roll-over"; the maximum power output was obtained with about 70 W of pump power, then power output decreased as the pump power was increased beyond this point. Research to determine the deleterious processes that caused this result has been done at the Air Force Research Laboratory utilizing physically detailed simulation. The simulations utilized coupled computational fluid dynamics (CFD) and optics solvers, which were three-dimensional and time-dependent. The CFD code used a cell-centered, conservative, finite-volume discretization of the integral form of the Navier-Stokes equations. It included thermal energy transport and mass conservation, which accounted for chemical reactions and state kinetics. Optical models included pumping, lasing, and fluorescence. The deleterious effects investigated were: alkali number density decrease in high temperature regions, convective flow, pressure broadening and shifting of the absorption lineshape including hyperfine structure, radiative decay, quenching, energy pooling, off-resonant absorption, Penning ionization, photoionization, radiative recombination, three-body recombination due to free electron and buffer gas collisions, ambipolar diffusion, thermal aberration, dissociative recombination, multi-photon ionization, alkali-hydrocarbon reactions, and electron impact ionization.

  11. [Axonopathy in the pathogenesis of multiple sclerosis, peripheral diffuse and local motor neuropathies and motor neuron disease].

    PubMed

    Merkulov, Iu A; Merkulova, D M; Iosifova, O A; Zavalishin, I A

    2010-01-01

    Two hundred and seventy-six patients, including 43 patients with multiple sclerosis, 24 with acute inflammatory demyelinating polyneuropathy (AIDP), 144 with chronic inflammatory demyelinating polyneuropathy (CIDP), 27 with motor multifocal neuropathy (MMN), and 38 with lateral amyotrophic sclerosis (LAS), have been examined. Symptoms of axonal degeneration, manifested as denervation phenomena in both clinical and instrumental studies (electromyography, transcranial magnetic stimulation, MRT), were revealed in all groups of patients. The formation of excitation conduction blocks is a universal pathophysiological mechanism of axonopathy development in AIDP, CIDP, MMN and LAS. Symptoms of axonopathy and peripheral demyelinization in patients with multiple sclerosis and LAS suggest the possibility of transformation of the immunopathological process from the central nervous system to the peripheral one.

  12. Method and apparatus for production of subsea hydrocarbon formations

    DOEpatents

    Blandford, Joseph W.

    1995-01-01

    A system for controlling, separating, processing and exporting well fluids produced from subsea hydrocarbon formations is disclosed. The subsea well tender system includes a surface buoy supporting one or more decks above the water surface for accommodating equipment to process oil, gas and water recovered from the subsea hydrocarbon formation. The surface buoy includes a surface-piercing central flotation column connected to one or more external flotation tanks located below the water surface. The surface buoy is secured to the seabed by one or more tendons which are anchored to a foundation with piles embedded in the seabed. The system accommodates multiple versions of the surface buoy configuration.

  13. Ground Test of the Urine Processing Assembly for Accelerations and Transfer Functions

    NASA Technical Reports Server (NTRS)

    Houston, Janice; Almond, Deborah F. (Technical Monitor)

    2001-01-01

    This viewgraph presentation gives an overview of the ground test of the urine processing assembly for accelerations and transfer functions. Details are given on the test setup, test data, data analysis, analytical results, and microgravity assessment. The conclusions of the tests include the following: (1) the single input/multiple output method is useful if the data is acquired by tri-axial accelerometers and inputs can be considered uncorrelated; (2) tying coherence with the matrix yields higher confidence in results; (3) the WRS#2 rack ORUs need to be isolated; and (4) future work includes a plan for characterizing performance of isolation materials.

  14. Stress corrosion in titanium alloys and other metallic materials

    NASA Technical Reports Server (NTRS)

    Harkins, C. G. (Editor); Brotzen, F. R.; Hightower, J. W.; Mclellan, R. B.; Roberts, J. M.; Rudee, M. L.; Leith, I. R.; Basu, P. K.; Salama, K.; Parris, D. P.

    1971-01-01

    Multiple physical and chemical techniques including mass spectroscopy, atomic absorption spectroscopy, gas chromatography, electron microscopy, optical microscopy, electronic spectroscopy for chemical analysis (ESCA), infrared spectroscopy, nuclear magnetic resonance (NMR), X-ray analysis, conductivity, and isotopic labeling were used in investigating the atomic interactions between organic environments and titanium and titanium oxide surfaces. Key anhydrous environments studied included alcohols, which contain hydrogen; carbon tetrachloride, which does not contain hydrogen; and mixtures of alcohols and halocarbons. Effects of dissolved salts in alcohols were also studied. This program emphasized experiments designed to delineate the conditions necessary rather than sufficient for initiation processes and for propagation processes in Ti SCC.

  15. Microarray profiling analysis uncovers common molecular mechanisms of rubella virus, human cytomegalovirus, and herpes simplex virus type 2 infections in ECV304 cells.

    PubMed

    Mo, X; Xu, L; Yang, Q; Feng, H; Peng, J; Zhang, Y; Yuan, W; Wang, Y; Li, Y; Deng, Y; Wan, Y; Chen, Z; Li, F; Wu, X

    2011-08-01

    To study the common molecular mechanisms of various virus infections that might result in congenital cardiovascular diseases in the perinatal period, changes in mRNA expression levels of ECV304 cells infected by rubella virus (RUBV), human cytomegalovirus (HCMV), and herpes simplex virus type 2 (HSV-2) were analyzed using a microarray system representing 18,716 human genes. 99 genes were found to exhibit differential expression (80 up-regulated and 19 down-regulated). Biological process analysis showed that 33 signaling pathways including 22 genes were significantly relevant to RUBV, HCMV and HSV-2 infections. Of these 33 biological processes, 28 are one-gene biological processes and 5 are multiple-gene biological processes. Gene annotation indicated that the 5 multiple-gene biological processes, including regulation of cell growth, collagen fibril organization, mRNA transport, cell adhesion and regulation of cell shape, and seven down- or up-regulated genes [CRIM1 (cysteine rich transmembrane BMP regulator 1), WISP2 (WNT1 inducible signaling pathway protein 2), COL12A1 (collagen, type XII, alpha 1), COL11A2 (collagen, type XI, alpha 2), CNTN5 (contactin 5), DDR1 (discoidin domain receptor tyrosine kinase 1), VEGF (vascular endothelial growth factor precursor)], are significantly correlated with RUBV, HCMV and HSV-2 infections in ECV304 cells. The results obtained in this study suggest common molecular mechanisms of virus infections that might result in congenital cardiovascular diseases.

  16. Bridging paradigms: hybrid mechanistic-discriminative predictive models.

    PubMed

    Doyle, Orla M; Tsaneva-Atanasova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa

    2013-03-01

    Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.

  17. PLEXOS Input Data Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files, and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.
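
    As a rough illustration of the kind of transformation described (reading a generalized tabular input and writing an Excel workbook ready for import), the sketch below uses pandas on a hypothetical CSV file. The file name, column handling, and sheet name are assumptions; this is not the PIDG implementation or its actual interface.

      # Illustrative sketch only: mirrors the CSV-to-Excel step described above.
      # "generators.csv", the column normalization, and the "Objects" sheet name
      # are hypothetical; this is not the actual PIDG code.
      import pandas as pd

      def csv_to_excel(csv_path: str, xlsx_path: str) -> None:
          records = pd.read_csv(csv_path)  # generalized tabular input
          # Light normalization before export, e.g. consistent column names.
          records.columns = [c.strip().lower() for c in records.columns]
          # Write a single-sheet workbook that a modeler could import manually.
          records.to_excel(xlsx_path, sheet_name="Objects", index=False)

      if __name__ == "__main__":
          csv_to_excel("generators.csv", "plexos_input.xlsx")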

  18. Lateralized hippocampal oscillations underlie distinct aspects of human spatial memory and navigation.

    PubMed

    Miller, Jonathan; Watrous, Andrew J; Tsitsiklis, Melina; Lee, Sang Ah; Sheth, Sameer A; Schevon, Catherine A; Smith, Elliot H; Sperling, Michael R; Sharan, Ashwini; Asadi-Pooya, Ali Akbar; Worrell, Gregory A; Meisenhelter, Stephen; Inman, Cory S; Davis, Kathryn A; Lega, Bradley; Wanda, Paul A; Das, Sandhitsu R; Stein, Joel M; Gorniak, Richard; Jacobs, Joshua

    2018-06-21

    The hippocampus plays a vital role in various aspects of cognition including both memory and spatial navigation. To understand electrophysiologically how the hippocampus supports these processes, we recorded intracranial electroencephalographic activity from 46 neurosurgical patients as they performed a spatial memory task. We measure signals from multiple brain regions, including both left and right hippocampi, and we use spectral analysis to identify oscillatory patterns related to memory encoding and navigation. We show that in the left but not right hippocampus, the amplitude of oscillations in the 1-3-Hz "low theta" band increases when viewing subsequently remembered object-location pairs. In contrast, in the right but not left hippocampus, low-theta activity increases during periods of navigation. The frequencies of these hippocampal signals are slower than task-related signals in the neocortex. These results suggest that the human brain includes multiple lateralized oscillatory networks that support different aspects of cognition.

  19. Regulation of vesicular trafficking and leukocyte function by Rab27 GTPases and their effectors

    PubMed Central

    Catz, Sergio Daniel

    2013-01-01

    The Rab27 family of GTPases regulates the efficiency and specificity of exocytosis in hematopoietic cells, including neutrophils, CTLs, NK cells, and mast cells. However, the mechanisms regulated by Rab27 GTPases are cell-specific, as they depend on the differential expression and function of particular effector molecules that are recruited by the GTPases. In addition, Rab27 GTPases participate in multiple steps of the regulation of the secretory process, including priming, tethering, docking, and fusion, through sequential interaction with multiple effector molecules. Finally, recent reports suggest that Rab27 GTPases and their effectors regulate vesicular trafficking mechanisms other than exocytosis, including endocytosis and phagocytosis. This review focuses on the latest discoveries on the function of Rab27 GTPases and their effectors Munc13-4 and Slp1 in neutrophil function, compared with their functions in other leukocytes. PMID:23378593

  20. Making assessments while taking repeated risks: a pattern of multiple response pathways.

    PubMed

    Pleskac, Timothy J; Wershbale, Avishai

    2014-02-01

    Beyond simply a decision process, repeated risky decisions also require a number of cognitive processes including learning, search and exploration, and attention. In this article, we examine how multiple response pathways develop over repeated risky decisions. Using the Balloon Analogue Risk Task (BART) as a case study, we show that 2 different response pathways emerge over the course of the task. The assessment pathway is a slower, more controlled pathway where participants deliberate over taking a risk. The 2nd pathway is a faster, more automatic process where no deliberation occurs. Results imply that the slower assessment pathway is taken as choice conflict increases and that the faster automatic response is a learned response. Based on these results, we modify an existing formal cognitive model of decision making during the BART to account for these dual response pathways. The slower, more deliberative response process is modeled with a sequential sampling process where evidence is accumulated to a threshold, while the other response is given automatically. We show that adolescents with conduct disorder and substance use disorder symptoms not only evaluate risks differently during the BART but also differ in the rate at which they develop the more automatic response. More broadly, our results suggest cognitive models of judgment and decision making need to transition from treating observed decisions as the result of a single response pathway to the result of multiple response pathways that change and develop over time.
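
    To make the dual-pathway description concrete, here is a minimal, hypothetical simulation in the spirit of the model sketched above: a deliberative route modeled as a random-walk evidence accumulator that stops at a threshold, and an automatic route that responds immediately, with routing determined by choice conflict. It is a generic sketch of the sequential sampling idea, not the authors' fitted model, and every parameter value is arbitrary.

      # Minimal sketch of a dual-pathway response model (illustrative parameters only).
      import random

      def deliberative_rt(drift=0.1, threshold=2.0, noise=1.0, dt=0.01):
          """Random-walk evidence accumulation to a threshold (slower, controlled route)."""
          evidence, t = 0.0, 0.0
          while abs(evidence) < threshold:
              evidence += drift * dt + random.gauss(0.0, noise) * dt ** 0.5
              t += dt
          return t, evidence > 0  # decision time and choice (e.g., pump vs. stop)

      def automatic_rt(base_rt=0.25):
          """Learned, automatic response issued without deliberation (faster route)."""
          return base_rt, True

      def respond(conflict, conflict_cutoff=0.5):
          """Route selection: higher choice conflict favors the deliberative pathway."""
          return deliberative_rt() if conflict > conflict_cutoff else automatic_rt()

      if __name__ == "__main__":
          for c in (0.2, 0.8):
              rt, choice = respond(c)
              print(f"conflict={c}: RT={rt:.3f}s, pump={choice}")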

  1. Examination of the Effects of Dimensionality on Cognitive Processing in Science: A Computational Modeling Experiment Comparing Online Laboratory Simulations and Serious Educational Games

    ERIC Educational Resources Information Center

    Lamb, Richard L.

    2016-01-01

    Within the last 10 years, new tools for assisting in the teaching and learning of academic skills and content within the context of science have arisen. These new tools include multiple types of computer software and hardware to include (video) games. The purpose of this study was to examine and compare the effect of computer learning games in the…

  2. Upscaling of U (VI) desorption and transport from decimeter‐scale heterogeneity to plume‐scale modeling

    USGS Publications Warehouse

    Curtis, Gary P.; Kohler, Matthias; Kannappan, Ramakrishnan; Briggs, Martin A.; Day-Lewis, Frederick D.

    2015-01-01

    Scientifically defensible predictions of field scale U(VI) transport in groundwater require an understanding of key processes at multiple scales. These scales range from smaller than the sediment grain scale (less than 10 μm) to as large as the field scale, which can extend over several kilometers. The key processes that need to be considered include both geochemical reactions in solution and at sediment surfaces as well as physical transport processes including advection, dispersion, and pore-scale diffusion. The research summarized in this report includes both experimental and modeling results in batch, column and tracer tests. The objectives of this research were to: (1) quantify the rates of U(VI) desorption from sediments acquired from a uranium contaminated aquifer in batch experiments; (2) quantify rates of U(VI) desorption in column experiments with variable chemical conditions; and (3) quantify nonreactive tracer and U(VI) transport in field tests.

  3. Lipid-associated Oral Delivery: Mechanisms and Analysis of Oral Absorption Enhancement

    PubMed Central

    Rezhdo, Oljora; Speciner, Lauren; Carrier, Rebecca L.

    2016-01-01

    The majority of newly discovered oral drugs are poorly water soluble, and co-administration with lipids has proven effective in significantly enhancing bioavailability of some compounds with low aqueous solubility. Yet, lipid-based delivery technologies have not been widely employed in commercial oral products. Lipids can impact drug transport and fate in the gastrointestinal (GI) tract through multiple mechanisms including enhancement of solubility and dissolution kinetics, enhancement of permeation through the intestinal mucosa, and triggering drug precipitation upon lipid emulsion depletion (e.g., by digestion). The effect of lipids on drug absorption is currently not quantitatively predictable, in part due to the multiple complex dynamic processes that can be impacted by lipids. Quantitative mechanistic analysis of the processes significant to lipid system function and overall impact on drug absorption can aid understanding of drug-lipid interactions in the GI tract and exploitation of such interactions to achieve optimal lipid-based drug delivery. In this review, we discuss the impact of co-delivered lipids and lipid digestion on drug dissolution, partitioning, and absorption in the context of the experimental tools and associated kinetic expressions used to study and model these processes. The potential benefit of a systems-based consideration of the concurrent multiple dynamic processes occurring upon co-dosing lipids and drugs to predict the impact of lipids on drug absorption and enable rational design of lipid-based delivery systems is presented. PMID:27520734

  4. Methods and Tools to Align Curriculum to the Skills and Competencies Needed by the Workforce - an Example from Geospatial Science and Technology

    NASA Astrophysics Data System (ADS)

    Johnson, A. B.

    2012-12-01

    Geospatial science and technology (GST), including geographic information systems, remote sensing, global positioning systems and mobile applications, are valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporally and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly and it has been difficult to create effective curriculum as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation supported grant, approved the Geospatial Technology Competency Model (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curriculum. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that helped educators define which competency should be included in a specific course and the depth of instruction for that competency. This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used to create customized curriculum integrating geospatial science and technology into geoscience programs.

  5. Functional Connectivity in Multiple Cortical Networks Is Associated with Performance Across Cognitive Domains in Older Adults.

    PubMed

    Shaw, Emily E; Schultz, Aaron P; Sperling, Reisa A; Hedden, Trey

    2015-10-01

    Intrinsic functional connectivity MRI has become a widely used tool for measuring integrity in large-scale cortical networks. This study examined multiple cortical networks using Template-Based Rotation (TBR), a method that applies a priori network and nuisance component templates defined from an independent dataset to test datasets of interest. A priori templates were applied to a test dataset of 276 older adults (ages 65-90) from the Harvard Aging Brain Study to examine the relationship between multiple large-scale cortical networks and cognition. Factor scores derived from neuropsychological tests represented processing speed, executive function, and episodic memory. Resting-state BOLD data were acquired in two 6-min acquisitions on a 3-Tesla scanner and processed with TBR to extract individual-level metrics of network connectivity in multiple cortical networks. All results controlled for data quality metrics, including motion. Connectivity in multiple large-scale cortical networks was positively related to all cognitive domains, with a composite measure of general connectivity positively associated with general cognitive performance. Controlling for the correlations between networks, the frontoparietal control network (FPCN) and executive function demonstrated the only significant association, suggesting specificity in this relationship. Further analyses found that the FPCN mediated the relationships of the other networks with cognition, suggesting that this network may play a central role in understanding individual variation in cognition during aging.

  6. A Qualitative Analysis of Physician Perspectives on Missed and Delayed Outpatient Diagnosis: The Focus on System-Related Factors.

    PubMed

    Sarkar, Urmimala; Simchowitz, Brett; Bonacum, Doug; Strull, William; Lopez, Andrea; Rotteau, Leahora; Shojania, Kaveh G

    2014-10-01

    Delayed and missed diagnoses lead to significant patient harm. Because physician actions are fundamental to the outpatient diagnostic process, a study was conducted to explore physician perspectives on diagnosis. As part of a quality improvement initiative, an integrated health system conducted six physician focus groups in 2004 and 2005. The focus groups included questions about the process of diagnosis, specific factors contributing to missed diagnosis, use of guidelines, atypical vs. typical presentations of disease, diagnostic tools, and follow-up, all with regard to delays in the diagnostic process. The interviews were analyzed (1) deductively, with application of the Systems Engineering Initiative for Patient Safety (SEIPS) model, which addresses systems design, quality management, job design, and technology implementations that affect safety-related patient and organizational and/or staff outcomes, and (2) inductively, with identification of novel themes using content analysis. A total of 25 physicians participated in the six focus groups, which yielded 12 hours of discussion. Providers identified multiple barriers to timely and accurate diagnosis, including organizational culture, information availability, and communication factors. Multiple themes emerged relating to each of the participants in the diagnostic process: the health system, the provider, and the patient. Concerns about health system structure and providers' interactions with one another and with patients far exceeded discussion of the cognitive factors that might affect the diagnostic process. The results suggest that, at least in physicians' views, improving the diagnostic process requires attention to the organization of the health system in addition to the cognitive aspects of diagnosis.

  7. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver for the 2D volume integral equation in the forward computation. The inversion technique with CSI combines the efficient FFT algorithm, which speeds up the matrix-vector multiplication, with the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method is capable of effective quantitative conductivity image reconstruction for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples are presented to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.
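
    Where an integral-equation kernel is translation invariant, the dense matrix-vector product inside each solver iteration reduces to a convolution, which an FFT evaluates in O(n log n) rather than O(n^2) time. The sketch below only illustrates that principle; it is not the BCGS-FFT solver or the CSI algorithm, and the kernel and problem size are arbitrary assumptions.

      # Sketch: FFT-accelerated matrix-vector product for a convolutional kernel.
      # Illustrates the speed-up principle only; not the BCGS-FFT implementation.
      import numpy as np

      n = 256
      kernel = 1.0 / (1.0 + np.arange(n))   # hypothetical translation-invariant kernel g[|i-j|]
      x = np.random.rand(n)                 # contrast-source vector

      # Direct dense product: O(n^2).
      G = np.array([[kernel[abs(i - j)] for j in range(n)] for i in range(n)])
      y_direct = G @ x

      # FFT product: embed the Toeplitz matrix in a 2n circulant and convolve, O(n log n).
      c = np.concatenate([kernel, [0.0], kernel[1:][::-1]])   # first column of the circulant
      x_padded = np.concatenate([x, np.zeros(n)])
      y_fft = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x_padded)).real[:n]

      print(np.allclose(y_direct, y_fft))   # True: both give the same matrix-vector product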

  8. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    NASA Technical Reports Server (NTRS)

    Kautzmann, Frank N., III

    1988-01-01

    Expert systems that support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes are being developed to demonstrate the feasibility of automating the process of systems engineering, design and configuration, and diagnosis and fault management. Such a study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it will involve models of description and explanation for each level. This multiple model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert system. The broadest possible category of interacting expert systems is described along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  9. The role of processing difficulty in the predictive utility of working memory span.

    PubMed

    Bunting, Michael

    2006-12-01

    Storage-plus-processing working memory span tasks (e.g., operation span [OSPAN]) are strong predictors of higher order cognition, including general fluid intelligence. This is due, in part, to the difficulty of the processing component. When the processing component prevents only articulatory rehearsal, but not executive attentional control, the predictive utility is attenuated. Participants in one experiment (N = 59) completed Raven's Advanced Progressive Matrices (RAPM) and multiple versions of OSPAN and probed recall (PR). A distractor task (high or low difficulty) was added to PR, and OSPAN's processing component was manipulated for difficulty. OSPAN and PR correlated with RAPM when the processing component required executive attentional control. These results are suggestive of resource sharing between processing and storage.

  10. Instantiating the multiple levels of analysis perspective in a program of study on externalizing behavior

    PubMed Central

    Beauchaine, Theodore P.; Gatzke-Kopp, Lisa M.

    2014-01-01

    During the last quarter century, developmental psychopathology has become increasingly inclusive and now spans disciplines ranging from psychiatric genetics to primary prevention. As a result, developmental psychopathologists have extended traditional diathesis–stress and transactional models to include causal processes at and across all relevant levels of analysis. Such research is embodied in what is known as the multiple levels of analysis perspective. We describe how multiple levels of analysis research has informed our current thinking about antisocial and borderline personality development among trait impulsive and therefore vulnerable individuals. Our approach extends the multiple levels of analysis perspective beyond simple Biology × Environment interactions by evaluating impulsivity across physiological systems (genetic, autonomic, hormonal, neural), psychological constructs (social, affective, motivational), developmental epochs (preschool, middle childhood, adolescence, adulthood), sexes (male, female), and methods of inquiry (self-report, informant report, treatment outcome, cardiovascular, electrophysiological, neuroimaging). By conducting our research using any and all available methods across these levels of analysis, we have arrived at a developmental model of trait impulsivity that we believe confers a greater understanding of this highly heritable trait and captures at least some heterogeneity in key behavioral outcomes, including delinquency and suicide. PMID:22781868

  11. Design of multiple representations e-learning resources based on a contextual approach for the basic physics course

    NASA Astrophysics Data System (ADS)

    Bakri, F.; Muliyati, D.

    2018-05-01

    This research aims to design e-learning resources with multiple representations based on a contextual approach for the Basic Physics Course. The research uses research and development methods in accordance with the Dick & Carey strategy. The development was carried out in the digital laboratory of the Physics Education Department, Mathematics and Science Faculty, Universitas Negeri Jakarta. The product development process following the Dick & Carey strategy produced an e-learning design for the Basic Physics Course presented in multiple representations within a contextual learning syntax. The representations used in the design of basic physics learning include: concept maps, videos, figures, data tables of experimental results, charts of the data tables, verbal explanations, mathematical equations, example problems and solutions, and exercises. The multiple representations are presented in the form of contextual learning through the stages: relating, experiencing, applying, transferring, and cooperating.

  12. Pain medication management processes used by oncology outpatients and family caregivers part I: health systems contexts.

    PubMed

    Schumacher, Karen L; Plano Clark, Vicki L; West, Claudia M; Dodd, Marylin J; Rabow, Michael W; Miaskowski, Christine

    2014-11-01

    Oncology patients with persistent pain treated in outpatient settings and their family caregivers have significant responsibility for managing pain medications. However, little is known about their practical day-to-day experiences with pain medication management. The aim was to describe day-to-day pain medication management from the perspectives of oncology outpatients and their family caregivers who participated in a randomized clinical trial of a psychoeducational intervention called the Pro-Self(©) Plus Pain Control Program. In this article, we focus on pain medication management by patients and family caregivers in the context of multiple complex health systems. We qualitatively analyzed audio-recorded intervention sessions that included extensive dialogue between patients, family caregivers, and nurses about pain medication management during the 10-week intervention. The health systems context for pain medication management included multiple complex systems for clinical care, reimbursement, and regulation of analgesic prescriptions. Pain medication management processes particularly relevant to this context were getting prescriptions and obtaining medications. Responsibilities that fell primarily to patients and family caregivers included facilitating communication and coordination among multiple clinicians, overcoming barriers to access, and serving as a final safety checkpoint. Significant effort was required of patients and family caregivers to ensure safe and effective pain medication management. Health systems issues related to access to needed analgesics, medication safety in outpatient settings, and the effort expended by oncology patients and their family caregivers require more attention in future research and health-care reform initiatives. Copyright © 2014 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  13. Complementary Roles for Amygdala and Periaqueductal Gray in Temporal-Difference Fear Learning

    ERIC Educational Resources Information Center

    Cole, Sindy; McNally, Gavan P.

    2009-01-01

    Pavlovian fear conditioning is not a unitary process. At the neurobiological level multiple brain regions and neurotransmitters contribute to fear learning. At the behavioral level many variables contribute to fear learning including the physical salience of the events being learned about, the direction and magnitude of predictive error, and the…

  14. Emotional and Cognitive Self-Regulation following Academic Shame

    ERIC Educational Resources Information Center

    Turner, Jeannine E.; Husman, Jenefer

    2008-01-01

    In the face of shame, students may need to turn the global focus of their failures into more discrete behaviors that they can control. Instructors can facilitate this process by informing students of specific behaviors they can enact to support successful achievement, including study and volitional strategies. Students' use of multiple study and…

  15. Visualizing the Complex Process for Deep Learning with an Authentic Programming Project

    ERIC Educational Resources Information Center

    Peng, Jun; Wang, Minhong; Sampson, Demetrios

    2017-01-01

    Project-based learning (PjBL) has been increasingly used to connect abstract knowledge and authentic tasks in educational practice, including computer programming education. Despite its promising effects on improving learning in multiple aspects, PjBL remains a struggle due to its complexity. Completing an authentic programming project involves a…

  16. The Measurement and Cost of Removing Unexplained Gender Differences in Faculty Salaries.

    ERIC Educational Resources Information Center

    Becker, William E.; Toutkoushian, Robert K.

    1995-01-01

    In assessing sex-discrimination suit damages, debate rages over the type and number of variables included in a single-equation model of the salary-determination process. This article considers single- and multiple-equation models, providing 36 different damage calculations. For University of Minnesota data, equalization cost hinges on the…

  17. How Does Sexual Minority Stigma "Get under the Skin"? A Psychological Mediation Framework

    ERIC Educational Resources Information Center

    Hatzenbuehler, Mark L.

    2009-01-01

    Sexual minorities are at increased risk for multiple mental health burdens compared with heterosexuals. The field has identified 2 distinct determinants of this risk, including group-specific minority stressors and general psychological processes that are common across sexual orientations. The goal of the present article is to develop a…

  18. Energy budget closure observed in paired Eddy Covariance towers with increased and continuous daily turbulence

    USDA-ARS?s Scientific Manuscript database

    The lack of energy closure has been a longstanding issue with Eddy Covariance (EC). Multiple mechanisms have been proposed to explain the discrepancies in energy balance including diurnal energy storage changes, advection of energy, and larger scale turbulent processes that cannot be resolved by fi...

  19. Observed Hierarchy of Student Proficiency with Period, Frequency, and Angular Frequency

    ERIC Educational Resources Information Center

    Young, Nicholas T.; Heckler, Andrew F.

    2018-01-01

    In the context of a generic harmonic oscillator, we investigated students' accuracy in determining the period, frequency, and angular frequency from mathematical and graphical representations. In a series of studies including interviews, free response tests, and multiple-choice tests developed in an iterative process, we assessed students in both…

  20. Calcium and aluminum impacts on sugar maple physiology in a northern hardwood forest

    Treesearch

    Joshua M. Halman; Paul G. Schaberg; Gary J. Hawley; Linda H. Pardo; Timothy J. Fahey

    2013-01-01

    Forests of northeastern North America have been exposed to anthropogenic acidic inputs for decades, resulting in altered cation relations and disruptions to associated physiological processes in multiple tree species, including sugar maple (Acer saccharum Marsh.). In the current study, the impacts of calcium (Ca) and aluminum (Al) additions on mature...

  1. Mediatizing Higher Education Policies: Discourses about Quality Education in the Media

    ERIC Educational Resources Information Center

    Cabalin, Cristian

    2015-01-01

    This article presents a critical-political discourse analysis of the media debate over quality assurance in higher education, which occurred in Chile after the 2011 student movement. Students criticized the privatization of higher education and the multiple flaws of this sector, which included corruption scandals during the process of quality…

  2. Using Behavioral Interventions to Assist Children with Type 1 Diabetes Manage Blood Glucose Levels

    ERIC Educational Resources Information Center

    Lasecki, Kim; Olympia, Daniel; Clark, Elaine; Jenson, William; Heathfield, Lora Tuesday

    2008-01-01

    Treatment and management of chronic disease processes in children occurs across multiple settings, placing demands for consultation and expertise on school personnel, including school psychologists. One such chronic condition in children is type I diabetes. Children with type I insulin dependent diabetes mellitus exhibit high rates of…

  3. Incorporating Computer-Aided Language Sample Analysis into Clinical Practice

    ERIC Educational Resources Information Center

    Price, Lisa Hammett; Hendricks, Sean; Cook, Colleen

    2010-01-01

    Purpose: During the evaluation of language abilities, the needs of the child are best served when multiple types and sources of data are included in the evaluation process. Current educational policies and practice guidelines further dictate the use of authentic assessment data to inform diagnosis and treatment planning. Language sampling and…

  4. Resilience after Maltreatment: The Importance of Social Services as Facilitators of Positive Adaptation

    ERIC Educational Resources Information Center

    Ungar, Michael

    2013-01-01

    This practice note will show that resilience among children who have been maltreated is the result of multiple protective factors, including the quality of the services provided to children exposed to chronic adversity. This social ecological perspective of resilience suggests that resilience is a process resulting from interactions between…

  5. The Genre of Instructor Feedback in Doctoral Programs: A Corpus Linguistic Analysis

    ERIC Educational Resources Information Center

    Walters, Kelley Jo; Henry, Patricia; Vinella, Michael; Wells, Steve; Shaw, Melanie; Miller, James

    2015-01-01

    Providing transparent written feedback to doctoral students is essential to the learning process and preparation for the capstone. The purpose of this study was to conduct a qualitative exploration of faculty feedback on benchmark written assignments across multiple, online doctoral programs. The Corpus for this analysis included 236 doctoral…

  6. Large-scale analysis of antisense transcription in wheat using the Affymetrix GeneChip Wheat Genome Array

    USDA-ARS?s Scientific Manuscript database

    Natural antisense transcripts (NATs) are transcripts of the opposite DNA strand to the sense-strand either at the same locus (cis-encoded) or a different locus (trans-encoded). They can affect gene expression at multiple stages including transcription, RNA processing and transport, and translation....

  7. The Program Evaluator's Role in Cross-Project Pollination.

    ERIC Educational Resources Information Center

    Yasgur, Bruce J.

    An expanded duties role of the multiple-program evaluator as an integral part of the ongoing decision-making process in all projects served is defended. Assumptions discussed included the need for projects with related objectives to pool resources and avoid duplication of effort, and the evaluator's unique ability to provide an objective…

  8. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., replacement of multiple parts due to wear, and reassembly, and also may include the removal of the engine from..., emissions-related codes or signals from on-board monitoring systems may not be erased or reset without... signals may not be rendered inoperative during the rebuilding process. (d) When conducting a rebuild...

  9. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., replacement of multiple parts due to wear, and reassembly, and also may include the removal of the engine from..., emissions-related codes or signals from on-board monitoring systems may not be erased or reset without... signals may not be rendered inoperative during the rebuilding process. (d) When conducting a rebuild...

  10. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., replacement of multiple parts due to wear, and reassembly, and also may include the removal of the engine from..., emissions-related codes or signals from on-board monitoring systems may not be erased or reset without... signals may not be rendered inoperative during the rebuilding process. (d) When conducting a rebuild...

  11. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., replacement of multiple parts due to wear, and reassembly, and also may include the removal of the engine from..., emissions-related codes or signals from on-board monitoring systems may not be erased or reset without... signals may not be rendered inoperative during the rebuilding process. (d) When conducting a rebuild...

  12. A Multi-criterial Decision Support System for Forest Management

    Treesearch

    Donald Nute; Geneho Kim; Walter D. Potter; Mark J. Twery; H. Michael Rauscher; Scott Thomasma; Deborah Bennett; Peter Kollasch

    1999-01-01

    We describe a research project that has as its goal development of a full-featured decision support system for managing forested land to satisfy multiple criteria represented as timber, wildlife, water, and ecological objectives. The decision process proposed for what was originally conceived of as a Northeast Decision Model (NED) includes data acquisition,...

  13. 12 CFR 584.2-1 - Prescribed services and activities of savings and loan holding companies.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... warehousing of such real estate loans, except that such a company or subsidiary shall not invest in a loan... installed therein), including brokerage and warehousing of such chattel paper; (iii) Loans, with or without... other multiple holding companies and affiliates thereof: (i) Data processing; (ii) Credit information...

  14. 12 CFR 584.2-1 - Prescribed services and activities of savings and loan holding companies.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... warehousing of such real estate loans, except that such a company or subsidiary shall not invest in a loan... installed therein), including brokerage and warehousing of such chattel paper; (iii) Loans, with or without... other multiple holding companies and affiliates thereof: (i) Data processing; (ii) Credit information...

  15. 12 CFR 584.2-1 - Prescribed services and activities of savings and loan holding companies.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... warehousing of such real estate loans, except that such a company or subsidiary shall not invest in a loan... installed therein), including brokerage and warehousing of such chattel paper; (iii) Loans, with or without... other multiple holding companies and affiliates thereof: (i) Data processing; (ii) Credit information...

  16. Multiple wavelength light collimator and monitor

    NASA Technical Reports Server (NTRS)

    Gore, Warren J. (Inventor)

    2011-01-01

    An optical system for receiving and collimating light and for transporting and processing light received in each of N wavelength ranges, including near-ultraviolet, visible, near-infrared and mid-infrared wavelengths, to determine a fraction of light received, and associated dark current, in each wavelength range in each of a sequence of time intervals.

  17. The Application of Magnesium(I) Compounds to Energy Storage Materials - Phase 2

    DTIC Science & Technology

    2013-06-24

    currently exploring the use of 10 as a catalyst for a number of other processes, e.g. the selective hydroboration of ketones, aldehydes and nitriles...hydroboration of ketones; (vi) a variety of related results, including the preparation of an iron(I) dimer with the shortest Fe-Fe multiple bond

  18. SU-F-T-475: An Evaluation of the Overlap Between the Acceptance Testing and Commissioning Processes for Conventional Medical Linear Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrow, A; Rangaraj, D; Perez-Andujar, A

    2016-06-15

    Purpose: This work’s objective is to determine the overlap of processes, in terms of sub-processes and time, between acceptance testing and commissioning of a conventional medical linear accelerator and to evaluate the time saved by consolidating the two processes. Method: A process map for acceptance testing for medical linear accelerators was created from vendor documentation (Varian and Elekta). Using AAPM TG-106 and in-house commissioning procedures, a process map was created for commissioning of said accelerators. The time to complete each sub-process in each process map was evaluated. Redundancies in the processes were found and the time spent on each was calculated. Results: Mechanical testing significantly overlaps between the two processes - redundant work here amounts to 9.5 hours. Many non-scanning beam dosimetry tests overlap, resulting in another 6 hours of overlap. Beam scanning overlaps somewhat - acceptance tests include evaluating PDDs and multiple profiles for only one field size, while commissioning beam scanning includes multiple field sizes and depths of profiles. This overlap results in another 6 hours of rework. Absolute dosimetry, field outputs, and end-to-end tests are not done at all in acceptance testing. Finally, all imaging tests done in acceptance are repeated in commissioning, resulting in about 8 hours of rework. The total time overlap between the two processes is about 30 hours. Conclusion: The process mapping done in this study shows that there are no tests done in acceptance testing that are not also recommended for commissioning. This results in about 30 hours of redundant work when preparing a conventional linear accelerator for clinical use. Considering these findings in the context of the 5000 linacs in the United States, consolidating acceptance testing and commissioning would have allowed for the treatment of an additional 25000 patients using no additional resources.

  19. Tribbles in normal and malignant haematopoiesis.

    PubMed

    Stein, Sarah J; Mack, Ethan A; Rome, Kelly S; Pear, Warren S

    2015-10-01

    The tribbles protein family, an evolutionarily conserved group of pseudokinases, have been shown to regulate multiple cellular events including those involved in normal and malignant haematopoiesis. The three mammalian Tribbles homologues, Trib1, Trib2 and Trib3 are characterized by conserved motifs, including a pseudokinase domain and a C-terminal E3 ligase-binding domain. In this review, we focus on the role of Trib (mammalian Tribbles homologues) proteins in mammalian haematopoiesis and leukaemia. The Trib proteins show divergent expression in haematopoietic cells, probably indicating cell-specific functions. The roles of the Trib proteins in oncogenesis are also varied and appear to be tissue-specific. Finally, we discuss the potential mechanisms by which the Trib proteins preferentially regulate these processes in multiple cell types. © 2015 Authors; published by Portland Press Limited.

  20. Multifocal Neuropathy: Expanding the Scope of Double Crush Syndrome.

    PubMed

    Cohen, Brian H; Gaspar, Michael P; Daniels, Alan H; Akelman, Edward; Kane, Patrick M

    2016-12-01

    Double crush syndrome (DCS), as it is classically defined, is a clinical condition composed of neurological dysfunction due to compressive pathology at multiple sites along a single peripheral nerve. The traditional definition of DCS is narrow in scope because many systemic pathologic processes, such as diabetes mellitus, drug-induced neuropathy, vascular disease and autoimmune neuronal damage, can have deleterious effects on nerve function. Multifocal neuropathy is a more appropriate term describing the multiple etiologies (including compressive lesions) that may synergistically contribute to nerve dysfunction and clinical symptoms. This paper examines the history of DCS and multifocal neuropathy, including the epidemiology and pathophysiology in addition to principles of evaluation and management. Copyright © 2016 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  1. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
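
    The following sketch illustrates the kind of scripted analysis described (dimension reduction, clustering, and correlation across many trials) using scikit-learn on synthetic timing data. It does not call the actual PerfExplorer Python API; the data layout and parameter choices are assumptions made only for illustration.

      # Illustrative sketch of automated performance analysis (not the PerfExplorer API).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      # Hypothetical data: rows = processor ranks, columns = per-event timings.
      rng = np.random.default_rng(0)
      timings = rng.gamma(shape=2.0, scale=1.0, size=(1024, 50))

      # Dimension reduction to a few components capturing most timing variance.
      reduced = PCA(n_components=3).fit_transform(timings)

      # Cluster ranks by behavior to expose, e.g., load imbalance between groups.
      labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(reduced)

      # Correlate each event's timing with total runtime to rank likely bottlenecks.
      total = timings.sum(axis=1)
      corr = [np.corrcoef(timings[:, j], total)[0, 1] for j in range(timings.shape[1])]
      print("cluster sizes:", np.bincount(labels))
      print("most correlated event index:", int(np.argmax(corr)))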

  2. ALFA: The new ALICE-FAIR software framework

    NASA Astrophysics Data System (ADS)

    Al-Turany, M.; Buncic, P.; Hristov, P.; Kollegger, T.; Kouzinopoulos, C.; Lebedev, A.; Lindenstruth, V.; Manafov, A.; Richter, M.; Rybalchenko, A.; Vande Vyvre, P.; Winckler, N.

    2015-12-01

    The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system, which balances reliability and ease of development with performance using multi-processing and multithreading. A message-based approach has been adopted; such an approach will support the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication and reliance on other processes. Such a design will add horizontal scaling (multiple processes) to vertical scaling provided by multiple threads to meet computing and throughput demands. ALFA does not dictate any application protocols. Potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
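
    As a toy illustration of the horizontal-scaling idea (independent processes that interact only through messages), the sketch below uses Python's multiprocessing queues. ALFA itself is a C++ framework with its own message transport, so this only mirrors the design principle, not the framework's API or performance characteristics.

      # Sketch of horizontal scaling: worker processes that communicate only via messages.
      # Mirrors the design principle only; it is not ALFA's actual transport layer.
      from multiprocessing import Process, Queue

      def worker(inbox: Queue, outbox: Queue) -> None:
          # Each process assumes limited reliance on its peers: it only reads from
          # its input queue and writes results to its output queue.
          while True:
              msg = inbox.get()
              if msg is None:          # sentinel: shut down cleanly
                  break
              outbox.put(msg * 2)      # stand-in for real data processing

      if __name__ == "__main__":
          inbox, outbox = Queue(), Queue()
          workers = [Process(target=worker, args=(inbox, outbox)) for _ in range(4)]
          for w in workers:
              w.start()
          for i in range(20):          # fan work out to whichever process is free
              inbox.put(i)
          for _ in workers:
              inbox.put(None)
          results = [outbox.get() for _ in range(20)]
          for w in workers:
              w.join()
          print(sorted(results))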

  3. Rehabilitation in multiple sclerosis.

    PubMed

    Kubsik-Gidlewska, Anna M; Klimkiewicz, Paulina; Klimkiewicz, Robert; Janczewska, Katarzyna; Woldańska-Okońska, Marta

    2017-07-01

    The aim of the study is to present a strategy of rehabilitation in multiple sclerosis on the basis of the latest developments in the field of physiotherapy. The publications on the problem discuss a wide range of methods of physiotherapy that can be used in order to reduce the degree of disability and alleviate the symptoms associated with the disease. The complexity of the disease, the difficulty in determining the appropriate treatment and a wide range of symptoms require a comprehensive approach to the patient, which would include both pharmacology and neurorehabilitation. Rehabilitation, which includes psychotherapy and symptomatic therapy, is regarded nowadays as the best form of treatment for multiple sclerosis. An in-depth diagnostic assessment of functional status and prognosis should be carried out before the start of the rehabilitation process. The prognosis should take into account the mental state, the neurological status and the awareness of the patient. The kinesiotherapy program in multiple sclerosis is based on a gradation of physiotherapy which assumes a gradual transition from basic movements to more complex ones until global functions are obtained. The most appropriate form of treatment is functional rehabilitation combined with physical procedures. Recent reports indicate the need for aerobic training to be included in the rehabilitation program. The introduction of physical activities, regardless of the severity of the disease, will reduce the negative effects of akinesia, and thus increase the functional capabilities of all body systems.

  4. Proteomic Profiling of Cranial (Superior) Cervical Ganglia Reveals Beta-Amyloid and Ubiquitin Proteasome System Perturbations in an Equine Multiple System Neuropathy.

    PubMed

    McGorum, Bruce C; Pirie, R Scott; Eaton, Samantha L; Keen, John A; Cumyn, Elizabeth M; Arnott, Danielle M; Chen, Wenzhang; Lamont, Douglas J; Graham, Laura C; Llavero Hurtado, Maica; Pemberton, Alan; Wishart, Thomas M

    2015-11-01

    Equine grass sickness (EGS) is an acute, predominantly fatal, multiple system neuropathy of grazing horses with reported incidence rates of ∼2%. An apparently identical disease occurs in multiple species, including but not limited to cats, dogs, and rabbits. Although the precise etiology remains unclear, ultrastructural findings have suggested that the primary lesion lies in the glycoprotein biosynthetic pathway of specific neuronal populations. The goal of this study was therefore to identify the molecular processes underpinning neurodegeneration in EGS. Here, we use a bottom-up approach beginning with the application of modern proteomic tools to the analysis of cranial (superior) cervical ganglion (CCG, a consistently affected tissue) from EGS-affected patients and appropriate control cases postmortem. In what appears to be the first application of modern proteomic tools to equine neuronal tissues and/or to an inherent neurodegenerative disease of large animals (not a model of human disease), we identified 2,311 proteins in CCG extracts, with 320 proteins increased and 186 decreased by greater than 20% relative to controls. Further examination of selected proteomic candidates by quantitative fluorescent Western blotting (QFWB) and subcellular expression profiling by immunohistochemistry highlighted a previously unreported dysregulation in proteins commonly associated with protein misfolding/aggregation responses seen in a myriad of human neurodegenerative conditions, including but not limited to amyloid precursor protein (APP), microtubule associated protein (Tau), and multiple components of the ubiquitin proteasome system (UPS). Differentially expressed proteins eligible for in silico pathway analysis clustered predominantly into the following biofunctions: (1) diseases and disorders, including neurological disease and skeletal and muscular disorders, and (2) molecular and cellular functions, including cellular assembly and organization, cell-to-cell signaling and interaction (including epinephrine, dopamine, and adrenergic signaling and receptor function), and small molecule biochemistry. Interestingly, while the biofunctions identified in this study may represent pathways underpinning EGS-induced neurodegeneration, this is also the first demonstration of potential molecular conservation (including previously unreported dysregulation of the UPS and APP) spanning the degenerative cascades from an apparently unrelated condition of large animals, to small animal models with altered neuronal vulnerability, and human neurological conditions. Importantly, this study highlights the feasibility and benefits of applying modern proteomic techniques to veterinary investigations of neurodegenerative processes in diseases of large animals. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  5. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
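
    The sketch below illustrates, in schematic form, the Monte Carlo propagation and regression-based sensitivity ranking described above. The "transport model" is a placeholder response function and the parameter distributions are arbitrary assumptions; none of this reflects the actual UGTA models or data.

      # Sketch of Monte Carlo uncertainty propagation and a regression-based
      # sensitivity ranking. The plume_volume function is a stand-in, not the
      # UGTA flow-and-transport models described above.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 2000

      # Sample uncertain inputs (distributions are illustrative only).
      porosity = rng.uniform(0.01, 0.3, n)      # effective porosity
      aperture = rng.lognormal(-8.0, 0.5, n)    # fracture aperture
      kd = rng.lognormal(0.0, 1.0, n)           # sorption coefficient

      def plume_volume(porosity, aperture, kd):
          # Placeholder response surface standing in for the particle-tracking /
          # convolution-integral calculation of in situ concentrations.
          return aperture * 1e6 / (porosity * (1.0 + kd))

      volume = plume_volume(porosity, aperture, kd)

      # Rank parameter influence by linear regression on standardized inputs.
      X = np.column_stack([porosity, aperture, kd])
      Xs = (X - X.mean(axis=0)) / X.std(axis=0)
      ys = (volume - volume.mean()) / volume.std()
      coeffs, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
      for name, c in zip(["porosity", "aperture", "kd"], coeffs):
          print(f"{name}: standardized coefficient = {c:+.2f}")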

  6. Revealing stable processing products from ribosome-associated small RNAs by deep-sequencing data analysis.

    PubMed

    Zywicki, Marek; Bakowska-Zywicka, Kamilla; Polacek, Norbert

    2012-05-01

    The exploration of the non-protein-coding RNA (ncRNA) transcriptome is currently focused on profiling of microRNA expression and detection of novel ncRNA transcription units. However, recent studies suggest that RNA processing can be a multi-layer process leading to the generation of ncRNAs of diverse functions from a single primary transcript. To date, no methodology has been presented to distinguish stable functional RNA species from rapidly degraded side products of nucleases. Thus the correct assessment of widespread RNA processing events is one of the major obstacles in transcriptome research. Here, we present a novel automated computational pipeline, named APART, providing a complete workflow for the reliable detection of RNA processing products from next-generation sequencing data. The major features include efficient handling of non-unique reads, detection of novel stable ncRNA transcripts and processing products, and annotation of known transcripts based on multiple sources of information. To disclose the potential of APART, we have analyzed a cDNA library derived from small ribosome-associated RNAs in Saccharomyces cerevisiae. By employing the APART pipeline, we were able to detect, and confirm by independent experimental methods, multiple novel stable RNA molecules differentially processed from well known ncRNAs, like rRNAs, tRNAs or snoRNAs, in a stress-dependent manner.
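
    As a schematic of how stable processing products might be distinguished from degradation noise, the sketch below flags contiguous blocks of high read coverage within a known ncRNA locus as candidates. It is an illustration of the general idea only, not the APART pipeline; the read data and thresholds are invented.

      # Sketch: flag candidate stable processing products as contiguous blocks of
      # high read coverage within a known ncRNA locus. Illustration only; not APART.
      def coverage_blocks(read_starts, read_len, locus_len, min_cov=10):
          cov = [0] * locus_len
          for s in read_starts:                     # pile up aligned reads
              for p in range(s, min(s + read_len, locus_len)):
                  cov[p] += 1
          blocks, start = [], None
          for p, c in enumerate(cov + [0]):         # sentinel closes a trailing block
              if c >= min_cov and start is None:
                  start = p
              elif c < min_cov and start is not None:
                  blocks.append((start, p))         # half-open interval of stable coverage
                  start = None
          return blocks

      if __name__ == "__main__":
          # Hypothetical reads concentrated over two sub-regions of a 200-nt transcript.
          reads = [5] * 30 + [120] * 25 + list(range(0, 200, 40))
          print(coverage_blocks(reads, read_len=25, locus_len=200))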

  7. Comparative Evaluation of Financing Programs: Insights From California’s Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deason, Jeff

    Berkeley Lab examines criteria for a comparative assessment of multiple financing programs for energy efficiency, developed through a statewide public process in California. The state legislature directed the California Alternative Energy and Advanced Transportation Financing Authority (CAEATFA) to develop these criteria. CAEATFA's report to the legislature, an invaluable reference for other jurisdictions considering these topics, discusses the proposed criteria and the rationales behind them in detail. Berkeley Lab's brief focuses on several salient issues that emerged during the criteria development and discussion process. Many of these issues are likely to arise in other states that plan to evaluate the impacts of energy efficiency financing programs, whether for a single program or multiple programs. Issues discussed in the brief include: the stakeholder process to develop the proposed assessment criteria; attribution of outcomes (such as energy savings) to financing programs vs. other drivers; choosing the outcome metric of primary interest (program take-up levels vs. savings); the use of net benefits vs. benefit-cost ratios for cost-effectiveness evaluation; non-energy factors; consumer protection factors; market transformation impacts; accommodating varying program goals in a multi-program evaluation; accounting for costs and risks borne by various parties, including taxpayers and utility customers, in cost-effectiveness analysis; and how to account for potential synergies among programs in a multi-program evaluation.

  8. Implementation of Systematic Review Tools in IRIS | Science ...

    EPA Pesticide Factsheets

    Currently, the number of chemicals present in the environment exceeds the ability of public health scientists to efficiently screen the available data in order to produce well-informed human health risk assessments in a timely manner. For this reason, the US EPA's Integrated Risk Information System (IRIS) program has started implementing new software tools into the hazard characterization workflow. These automated tools aid in multiple phases of the systematic review process, including scoping and problem formulation, literature search, and identification and screening of available published studies. The increased availability of these tools lays the foundation for automating or semi-automating multiple phases of the systematic review process. Some of these software tools include modules to facilitate a structured approach to study quality evaluation of human and animal data, although approaches are generally lacking for assessing complex mechanistic information, in particular "omics"-based evidence; tools are starting to become available to evaluate these types of studies. We will highlight how new software programs, online tools, and approaches for assessing study quality can be better integrated to allow for a more efficient and transparent workflow of the risk assessment process, as well as identify tool gaps that would benefit future risk assessments. Disclaimer: The views expressed here are those of the authors and do not necessarily represent the view

  9. Green Tea Extracts Epigallocatechin-3-gallate for Different Treatments

    PubMed Central

    Chu, Chenyu; Deng, Jia

    2017-01-01

    Epigallocatechin-3-gallate (EGCG), a component extracted from green tea, has been shown to have multiple effects on human pathological and physiological processes, and its mechanisms differ among cancer, the vasculature, bone regeneration, and the nervous system. Although multiple benefits are associated with EGCG, significant challenges remain; for example, EGCG shows low bioactivity via oral administration. This review focuses on the effects of EGCG, including its anti-cancer, antioxidant, anti-inflammatory, anticollagenase, and antifibrosis effects, to highlight the potential of EGCG and the need for further studies in this field. PMID:28884125

  10. Effects of filler type and content on mechanical properties of photopolymerizable composites measured across two-dimensional combinatorial arrays.

    PubMed

    Lin-Gibson, Sheng; Sung, Lipiin; Forster, Aaron M; Hu, Haiqing; Cheng, Yajun; Lin, Nancy J

    2009-07-01

    Multicomponent formulations coupled with complex processing conditions govern the final properties of photopolymerizable dental composites. In this study, a single test substrate was fabricated to support multiple formulations with a gradient in degree of conversion (DC), allowing the evaluation of multiple processing conditions and formulations on one specimen. Mechanical properties and damage response were evaluated as a function of filler type/content and irradiation. DC, surface roughness, modulus, hardness, scratch deformation and cytotoxicity were quantified using techniques including near-infrared spectroscopy, laser scanning confocal microscopy, depth-sensing indentation, scratch testing and cell viability assays. Scratch parameters (depth, width, percent recovery) were correlated to composite modulus and hardness. Total filler content, nanofiller content and irradiation time/intensity all affected the final properties, with the dominant factor for improved properties being a higher DC. This combinatorial platform accelerates the screening of dental composites through the direct comparison of properties and processing conditions across the same sample.

  11. Coactivation of response initiation processes with redundant signals.

    PubMed

    Maslovat, Dana; Hajj, Joëlle; Carlsen, Anthony N

    2018-05-14

    During reaction time (RT) tasks, participants respond faster to multiple stimuli from different modalities as compared to a single stimulus, a phenomenon known as the redundant signal effect (RSE). Explanations for this effect typically include coactivation arising from the multiple stimuli, which results in enhanced processing of one or more response production stages. The current study compared empirical RT data with the predictions of a model in which initiation-related activation arising from each stimulus is additive. Participants performed a simple wrist extension RT task following either a visual go-signal, an auditory go-signal, or both stimuli with the auditory stimulus delayed between 0 and 125 ms relative to the visual stimulus. Results showed statistical equivalence between the predictions of an additive initiation model and the observed RT data, providing novel evidence that the RSE can be explained via a coactivation of initiation-related processes. It is speculated that activation summation occurs at the thalamus, leading to the observed facilitation of response initiation. Copyright © 2018 Elsevier B.V. All rights reserved.
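
    A rough sketch of the additive-initiation idea is given below (the deterministic ramps, rates, and threshold are hypothetical simplifications, not the authors' model): activation triggered by each go-signal accumulates, the contributions add, and the response is initiated when the total crosses a threshold, so even a delayed auditory signal shortens the predicted RT relative to the visual-only condition.

      # Minimal sketch (hypothetical parameters, not the authors' model): activation from
      # each go-signal rises linearly after its own onset; the activations add, and the
      # response is initiated once the sum reaches a fixed threshold.
      def predicted_rt(visual_onset=0.0, auditory_delay=None,
                       rate_v=1.0, rate_a=1.2, threshold=150.0, dt=0.1):
          t, total = 0.0, 0.0
          while total < threshold:
              t += dt
              total += rate_v * dt if t > visual_onset else 0.0
              if auditory_delay is not None and t > visual_onset + auditory_delay:
                  total += rate_a * dt
          return t  # predicted RT in ms after the visual onset

      print(predicted_rt())                      # visual signal alone (slowest)
      print(predicted_rt(auditory_delay=0.0))    # redundant, simultaneous (fastest)
      print(predicted_rt(auditory_delay=50.0))   # redundant, auditory delayed 50 ms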

  12. UCXp camera imaging principle and key technologies of data post-processing

    NASA Astrophysics Data System (ADS)

    Yuan, Fangyan; Li, Guoqing; Zuo, Zhengli; Liu, Jianmin; Wu, Liang; Yu, Xiaoping; Zhao, Haitao

    2014-03-01

    The large format digital aerial camera product UCXp was introduced into the Chinese market in 2008; its images consist of 17310 columns and 11310 rows with a pixel size of 6 μm. The UCXp camera has many advantages compared with other cameras of the same generation, with multiple lenses exposed almost at the same time and no oblique lens. The camera has a complex imaging process whose principle will be detailed in this paper. In addition, the UCXp image post-processing method, including data pre-processing and orthophoto production, will be emphasized in this article. Based on data from new Beichuan County, this paper describes the data processing workflow and its effects.

  13. A wireless body measurement system to study fatigue in multiple sclerosis.

    PubMed

    Yu, Fei; Bilberg, Arne; Stenager, Egon; Rabotti, Chiara; Zhang, Bin; Mischi, Massimo

    2012-12-01

    Fatigue is reported as the most common symptom by patients with multiple sclerosis (MS). The physiological and functional parameters related to fatigue in MS patients are currently not well established. A new wearable wireless body measurement system, named the Fatigue Monitoring System (FAMOS), was developed to study fatigue in MS. It continuously measures the electrocardiogram, body-skin temperature, the electromyogram and foot motion. The goal of this study is to test whether FAMOS can distinguish fatigued MS patients from healthy subjects. This paper presents the realization of the measurement system, including the design of both the hardware and dedicated signal processing algorithms. Twenty-six participants, including 17 MS patients with fatigue and 9 sex- and age-matched healthy controls, were included in the study for continuous 24 h monitoring. The preliminary results show significant differences between fatigued MS patients and healthy controls. In conclusion, the FAMOS enables continuous data acquisition and estimation of multiple physiological and functional parameters. It provides a new, flexible and objective approach to study fatigue in MS, which can distinguish between fatigued MS patients and healthy controls. The usability and reliability of the FAMOS should, however, be further improved and validated through larger clinical trials.

  14. Model assessment using a multi-metric ranking technique

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and identifying adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as root mean square difference and rank correlation, were also explored but removed when the information was found to be generally duplicative of other metrics. While equal weights are applied, the weights could be altered to favor preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but were found useful in an independent context, and will be briefly reported.
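
    A minimal sketch of such a weighted rank-tally is shown below (model names, metric values, and weights are hypothetical; this is not the authors' implementation): each model is ranked on every metric, the ranks are weighted and summed, and the lowest total wins.

      # Illustrative weighted rank-tally across heterogeneous metrics.
      def rank_tally(scores, lower_is_better, weights=None):
          """scores: {metric: {model: value}}; lower_is_better: {metric: bool}."""
          models = list(next(iter(scores.values())))
          weights = weights or {m: 1.0 for m in scores}
          totals = {model: 0.0 for model in models}
          for metric, vals in scores.items():
              ordered = sorted(vals, key=vals.get, reverse=not lower_is_better[metric])
              for rank, model in enumerate(ordered, start=1):
                  totals[model] += weights[metric] * rank   # rank 1 = best
          return sorted(totals.items(), key=lambda kv: kv[1])

      scores = {"abs_error": {"model_A": 1.2, "model_B": 0.9},
                "correlation": {"model_A": 0.80, "model_B": 0.85}}
      lower = {"abs_error": True, "correlation": False}
      print(rank_tally(scores, lower))   # model_B ranks first on both metrics here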

  15. Planning as an Iterative Process

    NASA Technical Reports Server (NTRS)

    Smith, David E.

    2012-01-01

    Activity planning for missions such as the Mars Exploration Rover mission presents many technical challenges, including oversubscription, consideration of time, concurrency, resources, preferences, and uncertainty. These challenges have all been addressed by the research community to varying degrees, but significant technical hurdles still remain. In addition, the integration of these capabilities into a single planning engine remains largely unaddressed. However, I argue that there is a deeper set of issues that needs to be considered, namely the integration of planning into an iterative process that begins before the goals, objectives, and preferences are fully defined. This introduces a number of technical challenges for planning, including the ability to more naturally specify and utilize constraints on the planning process, the ability to generate multiple qualitatively different plans, and the ability to provide deep explanation of plans.

  16. Roles of mTOR Signaling in Brain Development.

    PubMed

    Lee, Da Yong

    2015-09-01

    mTOR is a serine/threonine kinase composed of multiple protein components. Intracellular signaling of mTOR complexes is involved in many physiological functions, including cell survival, proliferation and differentiation, through the regulation of protein synthesis in multiple cell types. During brain development, the mTOR-mediated signaling pathway plays a crucial role in the process of neuronal and glial differentiation and the maintenance of the stemness of neural stem cells. Abnormalities in the activity of mTOR and its downstream signaling molecules in neural stem cells result in severe defects in brain developmental processes, causing a significant number of brain disorders, such as pediatric brain tumors, autism, seizure, learning disability and mental retardation. Understanding the role of mTOR activity in neural stem cells could provide important clues for the development of future therapies for brain developmental disorders.

  17. SKIRT: Hybrid parallelization of radiative transfer simulations

    NASA Astrophysics Data System (ADS)

    Verstocken, S.; Van De Putte, D.; Camps, P.; Baes, M.

    2017-07-01

    We describe the design, implementation and performance of the new hybrid parallelization scheme in our Monte Carlo radiative transfer code SKIRT, which has been used extensively for modelling the continuum radiation of dusty astrophysical systems including late-type galaxies and dusty tori. The hybrid scheme combines distributed memory parallelization, using the standard Message Passing Interface (MPI) to communicate between processes, and shared memory parallelization, providing multiple execution threads within each process to avoid duplication of data structures. The synchronization between multiple threads is accomplished through atomic operations without high-level locking (also called lock-free programming). This improves the scaling behaviour of the code and substantially simplifies the implementation of the hybrid scheme. The result is an extremely flexible solution that adjusts to the number of available nodes, processors and memory, and consequently performs well on a wide variety of computing architectures.

  18. An efficient, reliable and inexpensive device for the rapid homogenization of multiple tissue samples by centrifugation.

    PubMed

    Ilyin, S E; Plata-Salamán, C R

    2000-02-15

    Homogenization of tissue samples is a common first step in the majority of current protocols for RNA, DNA, and protein isolation. This report describes a simple device for centrifugation-mediated homogenization of tissue samples. The method presented is applicable to RNA, DNA, and protein isolation, and we show examples where high quality total cell RNA, DNA, and protein were obtained from brain and other tissue samples. The advantages of the approach presented include: (1) a significant reduction in time investment relative to hand-driven or individual motorized-driven pestle homogenization; (2) easy construction of the device from inexpensive parts available in any laboratory; (3) high replicability in the processing; and (4) the capacity for the parallel processing of multiple tissue samples, thus allowing higher efficiency, reliability, and standardization.

  19. Impaired neurosteroid synthesis in multiple sclerosis

    PubMed Central

    Noorbakhsh, Farshid; Ellestad, Kristofor K.; Maingat, Ferdinand; Warren, Kenneth G.; Han, May H.; Steinman, Lawrence; Baker, Glen B.

    2011-01-01

    High-throughput technologies have led to advances in the recognition of disease pathways and their underlying mechanisms. To investigate the impact of micro-RNAs on the disease process in multiple sclerosis, a prototypic inflammatory neurological disorder, we examined cerebral white matter from patients with or without the disease by micro-RNA profiling, together with confirmatory reverse transcription–polymerase chain reaction analysis, immunoblotting and gas chromatography-mass spectrometry. These observations were verified using the in vivo multiple sclerosis model, experimental autoimmune encephalomyelitis. Brains of patients with or without multiple sclerosis demonstrated differential expression of multiple micro-RNAs, but expression of three neurosteroid synthesis enzyme-specific micro-RNAs (miR-338, miR-155 and miR-491) showed a bias towards induction in patients with multiple sclerosis (P < 0.05). Analysis of the neurosteroidogenic pathways targeted by micro-RNAs revealed suppression of enzyme transcript and protein levels in the white matter of patients with multiple sclerosis (P < 0.05). This was confirmed by firefly/Renilla luciferase micro-RNA target knockdown experiments (P < 0.05) and detection of specific micro-RNAs by in situ hybridization in the brains of patients with or without multiple sclerosis. Levels of important neurosteroids, including allopregnanolone, were suppressed in the white matter of patients with multiple sclerosis (P < 0.05). Induction of the murine micro-RNAs, miR-338 and miR-155, accompanied by diminished expression of neurosteroidogenic enzymes and allopregnanolone, was also observed in the brains of mice with experimental autoimmune encephalomyelitis (P < 0.05). Allopregnanolone treatment of the experimental autoimmune encephalomyelitis mouse model limited the associated neuropathology, including neuroinflammation, myelin and axonal injury and reduced neurobehavioral deficits (P < 0.05). These multi-platform studies point to impaired neurosteroidogenesis in both multiple sclerosis and experimental autoimmune encephalomyelitis. The findings also indicate that allopregnanolone and perhaps other neurosteroid-like compounds might represent potential biomarkers or therapies for multiple sclerosis. PMID:21908875

  20. Update on conjunctival pathology

    PubMed Central

    Mudhar, Hardeep Singh

    2017-01-01

    Conjunctival biopsies constitute a fairly large number of cases in a typical busy ophthalmic pathology practice. They range from a single biopsy to multiple mapping biopsies taken to assess the extent of a particular pathological process. Like most anatomical sites, the conjunctiva is subject to a very wide range of pathological processes. This article will cover key, commonly encountered nonneoplastic and neoplastic entities. Where relevant, sections will include recommendations on how best to submit specimens to the ophthalmic pathology laboratory and the relevance of up-to-date molecular techniques. PMID:28905821

  1. Mercury's exosphere: observations during MESSENGER's First Mercury flyby.

    PubMed

    McClintock, William E; Bradley, E Todd; Vervack, Ronald J; Killen, Rosemary M; Sprague, Ann L; Izenberg, Noam R; Solomon, Sean C

    2008-07-04

    During MESSENGER's first Mercury flyby, the Mercury Atmospheric and Surface Composition Spectrometer measured Mercury's exospheric emissions, including those from the antisunward sodium tail, calcium and sodium close to the planet, and hydrogen at high altitudes on the dayside. Spatial variations indicate that multiple source and loss processes generate and maintain the exosphere. Energetic processes connected to the solar wind and magnetospheric interaction with the planet likely played an important role in determining the distributions of exospheric species during the flyby.

  2. Systems Engineering in NASA's R&TD Programs

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    Systems engineering is largely the analysis and planning that support the design, development, and operation of systems. The most common application of systems engineering is in guiding systems development projects that use a phased process of requirements, specifications, design, and development. This paper investigates how systems engineering techniques should be applied in research and technology development programs for advanced space systems. These programs should include anticipatory engineering of future space flight systems and a project portfolio selection process, as well as systems engineering for multiple development projects.

  3. Thermal Studies of Ammonium Cyanide Reactions: A Model for Thermal Alteration of Prebiotic Compounds in Meteorite Parent Bodies

    NASA Technical Reports Server (NTRS)

    Hammer, P. G.; Locke, D. R.; Burton, A. S.; Callahan, M. P.

    2017-01-01

    Organic compounds in carbonaceous chondrites were likely transformed by a variety of parent body processes including thermal and aqueous processing. Here, we analyzed ammonium cyanide reactions that were heated at different temperatures and times by multiple analytical techniques. The goal of this study is to better understand the effect of hydrothermal alteration on cyanide chemistry, which is believed to be responsible for the abiotic synthesis of purine nucleobases and their structural analogs detected in carbonaceous chondrites.

  4. Magnetic lens apparatus for use in high-resolution scanning electron microscopes and lithographic processes

    DOEpatents

    Crewe, Albert V.

    2000-01-01

    Disclosed are lens apparatus in which a beam of charged particles is brought to a focus by means of a magnetic field, the lens being situated behind the target position. In illustrative embodiments, a lens apparatus is employed in a scanning electron microscope as the sole lens for high-resolution focusing of an electron beam, and in particular, an electron beam having an accelerating voltage of from about 10 to about 30,000 V. In one embodiment, the lens apparatus comprises an electrically-conducting coil arranged around the axis of the beam and a magnetic pole piece extending along the axis of the beam at least within the space surrounded by the coil. In other embodiments, the lens apparatus comprises a magnetic dipole or virtual magnetic monopole fabricated from a variety of materials, including permanent magnets, superconducting coils, and magnetizable spheres and needles contained within an energy-conducting coil. Multiple-array lens apparatus are also disclosed for simultaneous and/or consecutive imaging of multiple images on single or multiple specimens. The invention further provides apparatus, methods, and devices useful in focusing charged particle beams for lithographic processes.

  5. Calcium as a signal integrator in developing epithelial tissues.

    PubMed

    Brodskiy, Pavel A; Zartman, Jeremiah J

    2018-05-16

    Decoding how tissue properties emerge across multiple spatial and temporal scales from the integration of local signals is a grand challenge in quantitative biology. For example, the collective behavior of epithelial cells is critical for shaping developing embryos. Understanding how epithelial cells interpret a diverse range of local signals to coordinate tissue-level processes requires a systems-level understanding of development. Integration of multiple signaling pathways that specify cell signaling information requires second messengers such as calcium ions. Increasingly, specific roles have been uncovered for calcium signaling throughout development. Calcium signaling regulates many processes including division, migration, death, and differentiation. However, the pleiotropic and ubiquitous nature of calcium signaling implies that many additional functions remain to be discovered. Here we review a selection of recent studies to highlight important insights into how multiple signals are transduced by calcium transients in developing epithelial tissues. Quantitative imaging and computational modeling have provided important insights into how calcium signaling integration occurs. Reverse-engineering the conserved features of signal integration mediated by calcium signaling will enable novel approaches in regenerative medicine and synthetic control of morphogenesis.

  6. Distinct organization of the candidate tumor suppressor gene RFP2 in human and mouse: multiple mRNA isoforms in both species- and human-specific antisense transcript RFP2OS.

    PubMed

    Baranova, Ancha; Hammarsund, Marianne; Ivanov, Dmitry; Skoblov, Mikhail; Sangfelt, Olle; Corcoran, Martin; Borodina, Tatiana; Makeeva, Natalia; Pestova, Anna; Tyazhelova, Tatiana; Nazarenko, Svetlana; Gorreta, Francesco; Alsheddi, Tariq; Schlauch, Karen; Nikitin, Eugene; Kapanadze, Bagrat; Shagin, Dmitry; Poltaraus, Andrey; Ivanovich Vorobiev, Andrey; Zabarovsky, Eugene; Lukianov, Sergey; Chandhoke, Vikas; Ibbotson, Rachel; Oscier, David; Einhorn, Stefan; Grander, Dan; Yankovsky, Nick

    2003-12-04

    In the present study, we describe the human and mouse RFP2 gene structure, multiple RFP2 mRNA isoforms in the two species that have different 5' UTRs and a human-specific antisense transcript RFP2OS. Since the human RFP2 5' UTR is not conserved in mouse, these findings might indicate a different regulation of RFP2 in the two species. The predicted human and mouse RFP2 proteins are shown to contain a tripartite RING finger-B-box-coiled-coil domain (RBCC), also known as a TRIM domain, and therefore belong to a subgroup of RING finger proteins that are often involved in developmental and tumorigenic processes. Because homozygous deletions of chromosomal region 13q14.3 are found in a number of malignancies, including chronic lymphocytic leukemia (CLL) and multiple myeloma (MM), we suggest that RFP2 might be involved in tumor development. This study provides necessary information for evaluation of the role of RFP2 in malignant transformation and other biological processes.

  7. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the datasets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables the computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
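
    As one illustration of what a component-level reliability index can look like (a standard first-order index for normally distributed resistance and load effect, with made-up numbers; not necessarily the formulation used in this framework):

      # Hedged illustration: first-order reliability index for a component with
      # normally distributed resistance R and load effect S.
      from math import sqrt
      from statistics import NormalDist

      def reliability_index(mu_R, sigma_R, mu_S, sigma_S):
          beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
          p_failure = NormalDist().cdf(-beta)    # Pr[R - S < 0]
          return beta, p_failure

      beta, pf = reliability_index(mu_R=120.0, sigma_R=15.0, mu_S=70.0, sigma_S=20.0)
      print(f"beta = {beta:.2f}, Pr(failure) = {pf:.2e}")   # beta = 2.00, Pr(failure) ~ 2.3e-02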

  8. Weak ergodicity of population evolution processes.

    PubMed

    Inaba, H

    1989-10-01

    The weak ergodic theorems of mathematical demography state that the age distribution of a closed population is asymptotically independent of the initial distribution. In this paper, we provide a new proof of the weak ergodic theorem of the multistate population model with continuous time. The main tool to attain this purpose is a theory of multiplicative processes, which was mainly developed by Garrett Birkhoff, who showed that ergodic properties generally hold for an appropriate class of multiplicative processes. First, we construct a general theory of multiplicative processes on a Banach lattice. Next, we formulate a dynamical model of a multistate population and show that its evolution operator forms a multiplicative process on the state space of the population. Subsequently, we investigate a sufficient condition that guarantees the weak ergodicity of the multiplicative process. Finally, we prove the weak and strong ergodic theorems for the multistate population and resolve the consistency problem.
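
    Schematically, the weak ergodicity property can be written as follows (generic notation; not the paper's exact formulation): states reached from different admissible initial distributions, once normalized, become indistinguishable as time passes.

      % Schematic statement of weak ergodicity. Let U(t,0) be the evolution operator of
      % the population model and \hat{p}_i(t) = U(t,0) p_i / \| U(t,0) p_i \| the
      % normalized state reached from the initial distribution p_i. Then
      \[
        \lim_{t \to \infty} \bigl\| \hat{p}_1(t) - \hat{p}_2(t) \bigr\| = 0
        \qquad \text{for all admissible initial distributions } p_1, p_2 ,
      \]
      % i.e. the asymptotic normalized age/state distribution is independent of the
      % initial condition, which is the sense in which the population "forgets" its history.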

  9. Learning to write like a scientist: Coauthoring as an enculturation task

    NASA Astrophysics Data System (ADS)

    Florence, Marilyn K.; Yore, Larry D.

    2004-08-01

    This multiple case study examined the coauthorship process in research laboratories of different university departments. The study focused on two cases comprising five writing teams, one in biochemistry and microbiology and four in earth and ocean sciences. The role of the research supervisor, the role of the student (graduate and postgraduate), the interaction of the supervisor and the student, the activities and processes inherent in the coauthorship process, and the student's beliefs, expertise, scientific writing, and entry into an academic discourse community were documented utilizing multiple sources of data and methods. Several activities and processes were found to be common across all coauthorship teams, including aspects of planning, drafting, and revising. Elements of scientific and writing expertise, facets of enculturation into scientific research and discourse communities, academic civility, and the dynamics of collaborative groups also were apparent. There was healthy tension and mutual respect in the research groups as they attempted to make sense of science, report their results clearly and persuasively, and share the responsibilities of expertise. The novice scientists came to appreciate that the writing, editing, and revising process influenced the quality of the science as well as the writing.

  10. Seeing beyond monitors-Critical care nurses' multiple skills in patient observation: Descriptive qualitative study.

    PubMed

    Alastalo, Mika; Salminen, Leena; Lakanmaa, Riitta-Liisa; Leino-Kilpi, Helena

    2017-10-01

    The aim of this study was to provide a comprehensive description of multiple skills in patient observation in critical care nursing. Data from semi-structured interviews were analysed using thematic analysis. Participants were experienced critical care nurses (n=20) from three intensive care units in two university hospitals in Finland. Patient observation skills consist of: information gaining skills, information processing skills, decision-making skills and co-operation skills. The first three skills are integrated in the patient observation process, in which gaining information is a prerequisite for processing information that precedes making decisions. Co-operation has a special role as it occurs throughout the process. This study provided a comprehensive description of patient observation skills related to the three-phased patient observation process. The findings help clarify this aspect of critical care nursing competence. The description of patient observation skills may be applied in both clinical practice and education as it may serve as a framework for orientation, ensuring clinical skills and designing learning environments. Based on this study, patient observation skills can be recommended for inclusion in critical care nursing education, orientation and critical care nurses' competence evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Considerations for multiple hypothesis correlation on tactical platforms

    NASA Astrophysics Data System (ADS)

    Thomas, Alan M.; Turpen, James E.

    2013-05-01

    Tactical platforms benefit greatly from the fusion of tracks from multiple sources in terms of increased situation awareness. As a necessary precursor to this track fusion, track-to-track association, or correlation, must first be performed. The related measurement-to-track fusion problem has been well studied, with multiple hypothesis tracking and multiple frame assignment methods showing the most success. The track-to-track problem differs from this one in that measurements themselves are not available but rather track state update reports from the measuring sensors. Multiple hypothesis, multiple frame correlation systems have previously been considered; however, their practical implementation under the constraints imposed by tactical platforms is daunting. The situation is further exacerbated by the inconvenient nature of reports from legacy sensor systems on bandwidth-limited communications networks. In this paper, consideration is given to the special difficulties encountered when attempting the correlation of tracks from legacy sensors on tactical aircraft. Those difficulties include the following: covariance information from reporting sensors is frequently absent or incomplete; system latencies can create temporal uncertainty in data; and computational processing is severely limited by hardware and architecture. Moreover, consideration is given to practical solutions for dealing with these problems in a multiple hypothesis correlator.
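
    A common building block in this setting is a chi-square gate on the difference between two reported track states; the sketch below is illustrative only (it is not the paper's correlator) and also shows the kind of fallback needed when a legacy report arrives without covariance information.

      # Illustrative chi-square gate for track-to-track association.
      import numpy as np

      def gate(x1, P1, x2, P2, threshold=9.21):   # 9.21 ~ chi-square, 2 dof, 99%
          d = np.asarray(x1, float) - np.asarray(x2, float)
          S = np.asarray(P1, float) + np.asarray(P2, float)
          d2 = float(d @ np.linalg.solve(S, d))   # squared Mahalanobis distance
          return d2, d2 <= threshold

      ASSUMED_P = np.diag([400.0, 400.0])          # conservative fallback covariance (m^2)
      x_local, P_local = [1000.0, 2000.0], np.diag([100.0, 100.0])
      x_remote, P_remote = [1015.0, 1990.0], None  # legacy report without covariance
      d2, ok = gate(x_local, P_local, x_remote,
                    P_remote if P_remote is not None else ASSUMED_P)
      print(d2, ok)   # small normalized distance -> the tracks are candidates for association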

  12. Bringing functions together with fusion enzymes--from nature's inventions to biotechnological applications.

    PubMed

    Elleuche, Skander

    2015-02-01

    It is a mammoth task to develop a modular protein toolbox enabling the production of posttranslationally organized multifunctional enzymes that catalyze reactions in complex pathways. However, nature has always guided scientists to mimic evolutionary inventions in the laboratory and, nowadays, versatile methods have been established to experimentally connect enzymatic activities with multiple advantages. Among the oldest known natural examples is the linkage of two or more juxtaposed proteins catalyzing consecutive, non-consecutive, or opposing reactions by a native peptide bond. There are multiple reasons for the artificial construction of such fusion enzymes, including improved catalytic activities, substrate channelling enabled by the proximity of biocatalysts, higher stabilities, and cheaper production processes. To produce fused proteins, it is possible either to genetically fuse coding open reading frames or to connect proteins in a posttranslational process. Molecular biology techniques that have been established for the production of end-to-end or insertional fusions include overlap extension polymerase chain reaction, cloning, and recombination approaches. Depending on their flexibility and applicability, these methods offer various advantages to produce fusion genes in high throughput, in different orientations, and including linker sequences to maximize the flexibility and performance of fusion partners. In this review, practical techniques to fuse genes are highlighted, enzymatic parameters to choose adequate enzymes for fusion approaches are summarized, and examples with biotechnological relevance are presented, including a focus on plant biomass-degrading glycosyl hydrolases.

  13. Transactional Pathways of Transgender Identity Development in Transgender and Gender Nonconforming Youth and Caregivers from the Trans Youth Family Study

    PubMed Central

    Katz-Wise, Sabra L.; Budge, Stephanie L.; Fugate, Ellen; Flanagan, Kaleigh; Touloumtzis, Currie; Rood, Brian; Perez-Brumer, Amaya; Leibowitz, Scott

    2017-01-01

    Background: A growing body of research has examined transgender identity development, but no studies have investigated developmental pathways as a transactional process between youth and caregivers, incorporating perspectives from multiple family members. The aim of this study was to conceptualize pathways of transgender identity development using narratives from both transgender and gender nonconforming (TGN) youth and their cisgender (non-transgender) caregivers. Methods: The sample included 16 families, with 16 TGN youth, ages 7–18 years, and 29 cisgender caregivers (N = 45 family members). TGN youth represented multiple gender identities, including trans boy (n = 9), trans girl (n = 5), gender fluid boy (n = 1), and girlish boy (n = 1). Caregivers included mothers (n = 17), fathers (n = 11), and one grandmother. Participants were recruited from LGBTQ community organizations and support networks for families with transgender youth in the Midwest, Northeast, and South regions of the United States. Each family member completed a one-time in-person semi-structured qualitative interview that included questions about transgender identity development. Results: Analyses revealed seven overarching themes of transgender identity development, which were organized into a conceptual model: trans identity development, sociocultural influences/societal discourse, biological influences, family adjustment/impact, stigma/cisnormativity, support/resources, and gender affirmation/actualization. Conclusions: Findings underscore the importance of assessing developmental processes among TGN youth as transactional, impacting both youth and their caregivers. PMID:29527139

  14. Development of the Fray-Farthing-Chen Cambridge Process: Towards the Sustainable Production of Titanium and Its Alloys

    NASA Astrophysics Data System (ADS)

    Hu, Di; Dolganov, Aleksei; Ma, Mingchan; Bhattacharya, Biyash; Bishop, Matthew T.; Chen, George Z.

    2018-02-01

    The Kroll process has been employed for titanium extraction since the 1950s. It is a labour- and energy-intensive, multi-step, semi-batch process. The post-extraction processes for making the raw titanium into alloys and products are also excessive, including multiple remelting steps. Invented in the late 1990s, the Fray-Farthing-Chen (FFC) Cambridge process extracts titanium from solid oxides at lower energy consumption via electrochemical reduction in molten salts. Its ability to produce alloys and powders, while retaining the cathode shape, also promises energy- and material-efficient manufacturing. Focusing on titanium and its alloys, this article reviews the recent development of the FFC-Cambridge process in two aspects: (1) resource and process sustainability and (2) advanced post-extraction processing.

  15. Information processing. [in human performance

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).

  16. Information processing in the primate visual system - An integrated systems perspective

    NASA Technical Reports Server (NTRS)

    Van Essen, David C.; Anderson, Charles H.; Felleman, Daniel J.

    1992-01-01

    The primate visual system contains dozens of distinct areas in the cerebral cortex and several major subcortical structures. These subdivisions are extensively interconnected in a distributed hierarchical network that contains several intertwined processing streams. A number of strategies are used for efficient information processing within this hierarchy. These include linear and nonlinear filtering, passage through information bottlenecks, and coordinated use of multiple types of information. In addition, dynamic regulation of information flow within and between visual areas may provide the computational flexibility needed for the visual system to perform a broad spectrum of tasks accurately and at high resolution.

  17. Communicating nutraceuticals: A multi-stakeholder perspective from a developing nation.

    PubMed

    Jain, Varsha; Roy, Subhadip; Damle, Neha; Jagani, Khyati

    2016-01-01

    Nutraceuticals, a combination of nutrition and pharmaceutical, have grown rapidly as a product category globally. Nutraceuticals can be advertised directly to consumers as well as prescribed, and thus involve multiple stakeholders in the marketing communication process. The present study investigates the marketing communication aspects of nutraceuticals using 216 semistructured in-depth interviews covering all stakeholders in the process, such as the company/brand, physicians, pharmacists, and consumers. The findings bring out the role of each participant in the communication process and a comprehensive picture of the same. The insights would help nutraceutical brands understand and implement effective marketing communication strategies.

  18. Multi-discipline Waste Acceptance Process at the Nevada National Security Site - 13573

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carilli, Jhon T.; Krenzien, Susan K.

    2013-07-01

    The Nevada National Security Site low-level radioactive waste disposal facility acceptance process requires multiple disciplines to ensure the protection of workers, the public, and the environment. These disciplines, which include waste acceptance, nuclear criticality, safety, permitting, operations, and performance assessment, combine into the overall waste acceptance process to assess low-level radioactive waste streams for disposal at the Area 5 Radioactive Waste Management Site. Four waste streams recently highlighted the integration of these disciplines: the Oak Ridge Radioisotope Thermoelectric Generators and Consolidated Edison Uranium Solidification Project material, West Valley Melter, and classified waste. (authors)

  19. Experimental realization of entanglement in multiple degrees of freedom between two quantum memories.

    PubMed

    Zhang, Wei; Ding, Dong-Sheng; Dong, Ming-Xin; Shi, Shuai; Wang, Kai; Liu, Shi-Long; Li, Yan; Zhou, Zhi-Yuan; Shi, Bao-Sen; Guo, Guang-Can

    2016-11-14

    Entanglement in multiple degrees of freedom has many benefits over entanglement in a single one. The former enables quantum communication with higher channel capacity and more efficient quantum information processing and is compatible with diverse quantum networks. Establishing multi-degree-of-freedom entangled memories is not only vital for high-capacity quantum communication and computing, but also promising for enhanced violations of nonlocality in quantum systems. However, there have as yet been no reports of the experimental realization of multi-degree-of-freedom entangled memories. Here we experimentally established hyper- and hybrid entanglement in multiple degrees of freedom, including path (K-vector) and orbital angular momentum, between two separated atomic ensembles by using quantum storage. The results are promising for achieving quantum communication and computing with many degrees of freedom.

  20. An Authentication Protocol for Future Sensor Networks.

    PubMed

    Bilal, Muhammad; Kang, Shin-Gak

    2017-04-28

    Authentication is one of the essential security services in Wireless Sensor Networks (WSNs) for ensuring secure data sessions. Sensor node authentication ensures the confidentiality and validity of data collected by the sensor node, whereas user authentication guarantees that only legitimate users can access the sensor data. In a mobile WSN, sensor and user nodes move across the network and exchange data with multiple nodes, thus experiencing the authentication process multiple times. The integration of WSNs with Internet of Things (IoT) brings forth a new kind of WSN architecture along with stricter security requirements; for instance, a sensor node or a user node may need to establish multiple concurrent secure data sessions. With concurrent data sessions, the frequency of the re-authentication process increases in proportion to the number of concurrent connections. Moreover, to establish multiple data sessions, it is essential that a protocol participant have the capability of running multiple instances of the protocol run, which makes the security issue even more challenging. The currently available authentication protocols were designed for the autonomous WSN and do not account for the above requirements. Hence, ensuring a lightweight and efficient authentication protocol has become more crucial. In this paper, we present a novel, lightweight and efficient key exchange and authentication protocol suite called the Secure Mobile Sensor Network (SMSN) Authentication Protocol. In the SMSN a mobile node goes through an initial authentication procedure and receives a re-authentication ticket from the base station. Later a mobile node can use this re-authentication ticket when establishing multiple data exchange sessions and/or when moving across the network. This scheme reduces the communication and computational complexity of the authentication process. We proved the strength of our protocol with rigorous security analysis (including formal analysis using the BAN-logic) and simulated the SMSN and previously proposed schemes in an automated protocol verifier tool. Finally, we compared the computational complexity and communication cost against well-known authentication protocols.
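
    The re-authentication ticket idea can be illustrated with a minimal sketch (hypothetical field names and key handling; this is not the SMSN protocol itself): after a full authentication, the base station issues a MAC-protected, time-limited ticket, and later sessions are admitted by verifying the ticket rather than repeating the full procedure.

      # Conceptual sketch of ticket-based re-authentication (illustrative only).
      import hmac, hashlib, json, time, os

      BS_KEY = os.urandom(32)                      # base-station secret (hypothetical)

      def issue_ticket(node_id, session_key, lifetime_s=3600):
          body = {"node": node_id, "key": session_key.hex(),
                  "exp": int(time.time()) + lifetime_s}
          blob = json.dumps(body, sort_keys=True).encode()
          tag = hmac.new(BS_KEY, blob, hashlib.sha256).hexdigest()
          return {"body": body, "tag": tag}

      def verify_ticket(ticket):
          blob = json.dumps(ticket["body"], sort_keys=True).encode()
          expected = hmac.new(BS_KEY, blob, hashlib.sha256).hexdigest()
          fresh = ticket["body"]["exp"] > time.time()
          return hmac.compare_digest(expected, ticket["tag"]) and fresh

      ticket = issue_ticket("sensor-17", os.urandom(16))
      print(verify_ticket(ticket))   # True: further sessions skip the full authentication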

  1. An Authentication Protocol for Future Sensor Networks

    PubMed Central

    Bilal, Muhammad; Kang, Shin-Gak

    2017-01-01

    Authentication is one of the essential security services in Wireless Sensor Networks (WSNs) for ensuring secure data sessions. Sensor node authentication ensures the confidentiality and validity of data collected by the sensor node, whereas user authentication guarantees that only legitimate users can access the sensor data. In a mobile WSN, sensor and user nodes move across the network and exchange data with multiple nodes, thus experiencing the authentication process multiple times. The integration of WSNs with Internet of Things (IoT) brings forth a new kind of WSN architecture along with stricter security requirements; for instance, a sensor node or a user node may need to establish multiple concurrent secure data sessions. With concurrent data sessions, the frequency of the re-authentication process increases in proportion to the number of concurrent connections. Moreover, to establish multiple data sessions, it is essential that a protocol participant have the capability of running multiple instances of the protocol run, which makes the security issue even more challenging. The currently available authentication protocols were designed for the autonomous WSN and do not account for the above requirements. Hence, ensuring a lightweight and efficient authentication protocol has become more crucial. In this paper, we present a novel, lightweight and efficient key exchange and authentication protocol suite called the Secure Mobile Sensor Network (SMSN) Authentication Protocol. In the SMSN a mobile node goes through an initial authentication procedure and receives a re-authentication ticket from the base station. Later a mobile node can use this re-authentication ticket when establishing multiple data exchange sessions and/or when moving across the network. This scheme reduces the communication and computational complexity of the authentication process. We proved the strength of our protocol with rigorous security analysis (including formal analysis using the BAN-logic) and simulated the SMSN and previously proposed schemes in an automated protocol verifier tool. Finally, we compared the computational complexity and communication cost against well-known authentication protocols. PMID:28452937

  2. Cognitive Processes in the Production of Multiple-Goal Messages: Evidence from the Temporal Characteristics of Speech.

    ERIC Educational Resources Information Center

    Greene, John O.; And Others

    1993-01-01

    Finds that the increased cognitive load accompanying multiple-goal messages arises from demands on time and processing capacity associated with assembling incompatible message features and that multiple-goal messages are characterized by heavier demand on processing capacity associated with maintaining more complex message-relevant specifications…

  3. Characterizing multiple timescales of stream and storage zone interaction that affect solute fate and transport in streams

    USGS Publications Warehouse

    Choi, Jungyill; Harvey, Judson W.; Conklin, Martha H.

    2000-01-01

    The fate of contaminants in streams and rivers is affected by exchange and biogeochemical transformation in slowly moving or stagnant flow zones that interact with rapid flow in the main channel. In a typical stream, there are multiple types of slowly moving flow zones in which exchange and transformation occur, such as stagnant or recirculating surface water as well as subsurface hyporheic zones. However, most investigators use transport models with just a single storage zone in their modeling studies, which assumes that the effects of multiple storage zones can be lumped together. Our study addressed the following question: Can a single‐storage zone model reliably characterize the effects of physical retention and biogeochemical reactions in multiple storage zones? We extended an existing stream transport model with a single storage zone to include a second storage zone. With the extended model we generated 500 data sets representing transport of nonreactive and reactive solutes in stream systems that have two different types of storage zones with variable hydrologic conditions. The one storage zone model was tested by optimizing the lumped storage parameters to achieve a best fit for each of the generated data sets. Multiple storage processes were categorized as possessing I, additive; II, competitive; or III, dominant storage zone characteristics. The classification was based on the goodness of fit of generated data sets, the degree of similarity in mean retention time of the two storage zones, and the relative distributions of exchange flux and storage capacity between the two storage zones. For most cases (>90%) the one storage zone model described either the effect of the sum of multiple storage processes (category I) or the dominant storage process (category III). Failure of the one storage zone model occurred mainly for category II, that is, when one of the storage zones had a much longer mean retention time (ts ratio > 5.0) and when the dominance of storage capacity and exchange flux occurred in different storage zones. We also used the one storage zone model to estimate a “single” lumped rate constant representing the net removal of a solute by biogeochemical reactions in multiple storage zones. For most cases the lumped rate constant that was optimized by one storage zone modeling estimated the flux‐weighted rate constant for multiple storage zones. Our results explain how the relative hydrologic properties of multiple storage zones (retention time, storage capacity, exchange flux, and biogeochemical reaction rate constant) affect the reliability of lumped parameters determined by a one storage zone transport model. We conclude that stream transport models with a single storage compartment will in most cases reliably characterize the dominant physical processes of solute retention and biogeochemical reactions in streams with multiple storage zones.
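
    For orientation, a generic two-storage-zone transient-storage formulation takes the following form (symbols are illustrative and may differ from the notation used in the study):

      \begin{align}
        \frac{\partial C}{\partial t} &= -\frac{Q}{A}\frac{\partial C}{\partial x}
          + \frac{1}{A}\frac{\partial}{\partial x}\!\left(A D \frac{\partial C}{\partial x}\right)
          + \alpha_1 \,(C_{s1} - C) + \alpha_2 \,(C_{s2} - C), \\
        \frac{d C_{s1}}{d t} &= \alpha_1 \frac{A}{A_{s1}} \,(C - C_{s1}) - \lambda_1 C_{s1}, \\
        \frac{d C_{s2}}{d t} &= \alpha_2 \frac{A}{A_{s2}} \,(C - C_{s2}) - \lambda_2 C_{s2},
      \end{align}
      % where C is the main-channel concentration, C_{si} the storage-zone concentrations,
      % Q the discharge, A and A_{si} the channel and storage-zone cross-sectional areas,
      % D the dispersion coefficient, \alpha_i the exchange coefficients, and \lambda_i
      % first-order reaction rate constants; the mean retention time of zone i is
      % t_{s,i} = A_{si} / (\alpha_i A). Dropping the second zone recovers the
      % single-storage-zone model that is tested against the generated data sets.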

  4. ODMSummary: A Tool for Automatic Structured Comparison of Multiple Medical Forms Based on Semantic Annotation with the Unified Medical Language System.

    PubMed

    Storck, Michael; Krumm, Rainer; Dugas, Martin

    2016-01-01

    Medical documentation is applied in various settings including patient care and clinical research. Since procedures of medical documentation are heterogeneous and continue to evolve, secondary use of medical data is complicated. Development of medical forms, merging of data from different sources and meta-analyses of different data sets are currently a predominantly manual process and therefore difficult and cumbersome. Available applications to automate these processes are limited. In particular, tools to compare multiple documentation forms are missing. The objective of this work is to design, implement and evaluate the new system ODMSummary for comparison of multiple forms with a high number of semantically annotated data elements and a high level of usability. System requirements are the capability to summarize and compare a set of forms, to estimate the documentation effort, to track changes in different versions of forms and to find comparable items in different forms. Forms are provided in Operational Data Model format with semantic annotations from the Unified Medical Language System. 12 medical experts were invited to participate in a 3-phase evaluation of the tool regarding usability. ODMSummary (available at https://odmtoolbox.uni-muenster.de/summary/summary.html) provides a structured overview of multiple forms and their documentation fields. This comparison enables medical experts to assess multiple forms or whole datasets for secondary use. System usability was optimized based on expert feedback. The evaluation demonstrates that feedback from domain experts is needed to identify usability issues. In conclusion, this work shows that automatic comparison of multiple forms is feasible and that the results are usable for medical experts.
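
    A toy sketch of the "find comparable items" requirement is shown below (this is not ODMSummary's algorithm; the item names and concept codes are illustrative): items from two forms are paired whenever their semantic annotations share a concept code.

      # Illustrative matching of items across two forms by overlapping concept codes.
      def comparable_items(form_a, form_b):
          """form_x: {item_name: set of concept codes}. Returns pairs sharing a code."""
          matches = []
          for name_a, codes_a in form_a.items():
              for name_b, codes_b in form_b.items():
                  shared = codes_a & codes_b
                  if shared:
                      matches.append((name_a, name_b, sorted(shared)))
          return matches

      # Hypothetical forms annotated with (illustrative) concept codes.
      form1 = {"Systolic blood pressure": {"C001"}, "Heart rate": {"C002"}}
      form2 = {"BP systolic": {"C001"}, "Body temperature": {"C003"}}
      print(comparable_items(form1, form2))
      # -> [('Systolic blood pressure', 'BP systolic', ['C001'])]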

  5. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques, to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps as well as identifying article relevance to biosurveillance (e.g., relevance algorithm) and article feature extraction (who, what, where, why, how, and when).
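
    A minimal sketch of the article-relevance step is shown below (the training examples are hypothetical, scikit-learn is assumed to be available, and this is not the pipeline's actual implementation): a TF-IDF representation feeds a simple classifier that scores new articles for biosurveillance relevance.

      # Illustrative relevance scorer for news articles.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Tiny hypothetical training set: 1 = relevant to biosurveillance, 0 = not.
      texts = ["Officials report a cluster of avian influenza cases at a poultry farm",
               "Hospital admissions spike after a norovirus outbreak on a cruise ship",
               "Local team wins the regional football championship",
               "New smartphone model released with a larger display"]
      labels = [1, 1, 0, 0]

      clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
      clf.fit(texts, labels)
      print(clf.predict_proba(["Measles outbreak reported in two counties"])[:, 1])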

  6. Guiding Principles for a Pediatric Neurology ICU (neuroPICU) Bedside Multimodal Monitor

    PubMed Central

    Eldar, Yonina C.; Gopher, Daniel; Gottlieb, Amihai; Lammfromm, Rotem; Mangat, Halinder S; Peleg, Nimrod; Pon, Steven; Rozenberg, Igal; Schiff, Nicholas D; Stark, David E; Yan, Peter; Pratt, Hillel; Kosofsky, Barry E

    2016-01-01

    Background: Physicians caring for children with serious acute neurologic disease must process overwhelming amounts of physiological and medical information. Strategies to optimize real time display of this information are understudied. Objectives: Our goal was to engage clinical and engineering experts to develop guiding principles for creating a pediatric neurology intensive care unit (neuroPICU) monitor that integrates and displays data from multiple sources in an intuitive and informative manner. Methods: To accomplish this goal, an international group of physicians and engineers communicated regularly for one year. We integrated findings from clinical observations, interviews, a survey, signal processing, and visualization exercises to develop a concept for a neuroPICU display. Results: Key conclusions from our efforts include: (1) A neuroPICU display should support (a) rapid review of retrospective time series (i.e. cardiac, pulmonary, and neurologic physiology data), (b) rapidly modifiable formats for viewing that data according to the specialty of the reviewer, and (c) communication of the degree of risk of clinical decline. (2) Specialized visualizations of physiologic parameters can highlight abnormalities in multivariable temporal data. Examples include 3-D stacked spider plots and color coded time series plots. (3) Visual summaries of EEG with spectral tools (i.e. hemispheric asymmetry and median power) can highlight seizures via patient-specific “fingerprints.” (4) Intuitive displays should emphasize subsets of physiology and processed EEG data to provide a rapid gestalt of the current status and medical stability of a patient. Conclusions: A well-designed neuroPICU display must present multiple datasets in dynamic, flexible, and informative views to accommodate clinicians from multiple disciplines in a variety of clinical scenarios. PMID:27437048

  7. Guiding Principles for a Pediatric Neurology ICU (neuroPICU) Bedside Multimodal Monitor: Findings from an International Working Group.

    PubMed

    Grinspan, Zachary M; Eldar, Yonina C; Gopher, Daniel; Gottlieb, Amihai; Lammfromm, Rotem; Mangat, Halinder S; Peleg, Nimrod; Pon, Steven; Rozenberg, Igal; Schiff, Nicholas D; Stark, David E; Yan, Peter; Pratt, Hillel; Kosofsky, Barry E

    2016-01-01

    Physicians caring for children with serious acute neurologic disease must process overwhelming amounts of physiological and medical information. Strategies to optimize real time display of this information are understudied. Our goal was to engage clinical and engineering experts to develop guiding principles for creating a pediatric neurology intensive care unit (neuroPICU) monitor that integrates and displays data from multiple sources in an intuitive and informative manner. To accomplish this goal, an international group of physicians and engineers communicated regularly for one year. We integrated findings from clinical observations, interviews, a survey, signal processing, and visualization exercises to develop a concept for a neuroPICU display. Key conclusions from our efforts include: (1) A neuroPICU display should support (a) rapid review of retrospective time series (i.e. cardiac, pulmonary, and neurologic physiology data), (b) rapidly modifiable formats for viewing that data according to the specialty of the reviewer, and (c) communication of the degree of risk of clinical decline. (2) Specialized visualizations of physiologic parameters can highlight abnormalities in multivariable temporal data. Examples include 3-D stacked spider plots and color coded time series plots. (3) Visual summaries of EEG with spectral tools (i.e. hemispheric asymmetry and median power) can highlight seizures via patient-specific "fingerprints." (4) Intuitive displays should emphasize subsets of physiology and processed EEG data to provide a rapid gestalt of the current status and medical stability of a patient. A well-designed neuroPICU display must present multiple datasets in dynamic, flexible, and informative views to accommodate clinicians from multiple disciplines in a variety of clinical scenarios.

  8. Assessing uncertainty and sensitivity of model parameterizations and parameters in WRF affecting simulated surface fluxes and land-atmosphere coupling over the Amazon region

    NASA Astrophysics Data System (ADS)

    Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.

    2016-12-01

    This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign together with satellite and reanalysis data are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multi-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and first-order effects dominate compared with the interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for improving the model physics parameterizations.
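    As an illustration of the inter- versus intra-group variance decomposition described in this abstract, the following hedged sketch computes the share of ensemble variance explained by each physics group for a hypothetical output variable; the table layout, column names, and numbers are invented, not the study's data.

```python
# Hedged sketch of the kind of inter- vs intra-group (scheme) variance
# decomposition the abstract describes for a factorial physics ensemble.
# The ensemble table, column names, and values are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Hypothetical 120-member ensemble: one row per run, with the scheme index
# used in each physics group and a simulated latent heat flux (W m^-2).
ens = pd.DataFrame({
    "microphysics": rng.integers(0, 6, 120),
    "convection":   rng.integers(0, 3, 120),
    "pbl":          rng.integers(0, 6, 120),
    "land_surface": rng.integers(0, 3, 120),
    "latent_heat":  rng.normal(100.0, 15.0, 120),
})

total_var = ens["latent_heat"].var(ddof=0)
for group in ["microphysics", "convection", "pbl", "land_surface"]:
    group_means = ens.groupby(group)["latent_heat"].mean()
    counts = ens.groupby(group)["latent_heat"].count()
    # Between-scheme (main-effect) variance contributed by this physics group.
    between = np.average((group_means - ens["latent_heat"].mean()) ** 2,
                         weights=counts)
    print(f"{group}: {100 * between / total_var:.1f}% of total variance")
```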

  9. Method and apparatus for production of subsea hydrocarbon formations

    DOEpatents

    Blandford, J.W.

    1995-01-17

    A system for controlling, separating, processing and exporting well fluids produced from subsea hydrocarbon formations is disclosed. The subsea well tender system includes a surface buoy supporting one or more decks above the water surface for accommodating equipment to process oil, gas and water recovered from the subsea hydrocarbon formation. The surface buoy includes a surface-piercing central flotation column connected to one or more external flotation tanks located below the water surface. The surface buoy is secured to the sea bed by one or more tendons which are anchored to a foundation with piles imbedded in the sea bed. The system accommodates multiple variations of the surface buoy configuration. 20 figures.

  10. Perspectives on Home Care Quality

    PubMed Central

    Kane, Rosalie A.; Kane, Robert L.; Illston, Laurel H.; Eustis, Nancy N.

    1994-01-01

    Home care quality assurance (QA) must consider features inherent in home care, including: multiple goals, limited provider control, and unique family roles. Successive panels of stakeholders were asked to rate the importance of selected home care outcomes. Most highly rated outcomes were freedom from exploitation, satisfaction with care, physical safety, affordability, and physical functioning. Panelists preferred outcome indicators to process and structure, and all groups emphasized “enabling” criteria. Themes highlighted included: interpersonal components of care; normalizing life for clientele; balancing quality of life with safety; developing flexible, negotiated care plans; mechanisms for accountability and case management. These themes were formulated differently according to the stakeholders' role. Providers preferred intermediate outcomes, akin to process. PMID:10140158

  11. 40 CFR 63.1329 - Process contact cooling towers provisions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... viscosity multiple end finisher process that utilizes a process contact cooling tower shall comply with... high viscosity multiple end finisher process and who is subject or becomes subject to 40 CFR part 60...

  12. 40 CFR 63.1329 - Process contact cooling towers provisions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... viscosity multiple end finisher process that utilizes a process contact cooling tower shall comply with... high viscosity multiple end finisher process, and who is subject or becomes subject to 40 CFR part 60...

  13. The Cerefy Neuroradiology Atlas: a Talairach-Tournoux atlas-based tool for analysis of neuroimages available over the internet.

    PubMed

    Nowinski, Wieslaw L; Belov, Dmitry

    2003-09-01

    The article introduces an atlas-assisted method and a tool called the Cerefy Neuroradiology Atlas (CNA), available over the Internet for neuroradiology and human brain mapping. The CNA contains an enhanced, extended, and fully segmented and labeled electronic version of the Talairach-Tournoux brain atlas, including parcellated gyri and Brodmann's areas. To the best of our knowledge, this is the first online, publicly available application with the Talairach-Tournoux atlas. The process of atlas-assisted neuroimage analysis is done in five steps: image data loading, Talairach landmark setting, atlas normalization, image data exploration and analysis, and result saving. Neuroimage analysis is supported by a near-real-time, atlas-to-data warping based on the Talairach transformation. The CNA runs on multiple platforms; is able to process simultaneously multiple anatomical and functional data sets; and provides functions for a rapid atlas-to-data registration, interactive structure labeling and annotating, and mensuration. It is also empowered with several unique features, including interactive atlas warping facilitating fine tuning of atlas-to-data fit, navigation on the triplanar formed by the image data and the atlas, multiple-images-in-one display with interactive atlas-anatomy-function blending, multiple label display, and saving of labeled and annotated image data. The CNA is useful for fast atlas-assisted analysis of neuroimage data sets. It increases accuracy and reduces time in localization analysis of activation regions; facilitates communication of information on the interpreted scans from the neuroradiologist to other clinicians and medical students; increases the neuroradiologist's confidence in terms of anatomy and spatial relationships; and serves as a user-friendly, public domain tool for neuroeducation. At present, more than 700 users from five continents have subscribed to the CNA.

  14. Enhanced peripheral visual processing in congenitally deaf humans is supported by multiple brain regions, including primary auditory cortex.

    PubMed

    Scott, Gregory D; Karns, Christina M; Dow, Mark W; Stevens, Courtney; Neville, Helen J

    2014-01-01

    Brain reorganization associated with altered sensory experience clarifies the critical role of neuroplasticity in development. An example is enhanced peripheral visual processing associated with congenital deafness, but the neural systems supporting this have not been fully characterized. A gap in our understanding of deafness-enhanced peripheral vision is the contribution of primary auditory cortex. Previous studies of auditory cortex that use anatomical normalization across participants were limited by inter-subject variability of Heschl's gyrus. In addition to reorganized auditory cortex (cross-modal plasticity), a second gap in our understanding is the contribution of altered modality-specific cortices (visual intramodal plasticity in this case), as well as supramodal and multisensory cortices, especially when target detection is required across contrasts. Here we address these gaps by comparing fMRI signal change for peripheral vs. perifoveal visual stimulation (11-15° vs. 2-7°) in congenitally deaf and hearing participants in a blocked experimental design with two analytical approaches: a Heschl's gyrus region of interest analysis and a whole brain analysis. Our results using individually-defined primary auditory cortex (Heschl's gyrus) indicate that fMRI signal change for more peripheral stimuli was greater than perifoveal in deaf but not in hearing participants. Whole-brain analyses revealed differences between deaf and hearing participants for peripheral vs. perifoveal visual processing in extrastriate visual cortex including primary auditory cortex, MT+/V5, superior-temporal auditory, and multisensory and/or supramodal regions, such as posterior parietal cortex (PPC), frontal eye fields, anterior cingulate, and supplementary eye fields. Overall, these data demonstrate the contribution of neuroplasticity in multiple systems including primary auditory cortex, supramodal, and multisensory regions, to altered visual processing in congenitally deaf adults.

  15. The Role of Language in Religious Identity Making: A Case of a Caribbean-Chinese Youth

    ERIC Educational Resources Information Center

    Skerrett, Allison

    2017-01-01

    This article explores the processes of religious identity development in a Caribbean-Chinese adolescent who is from a multifaith, multilingual home. Findings include (1) the youth developed a Christian religious identity through his multiple situatedness within home and school worlds that privileged that faith and the dominant language of English…

  16. College Student Perceptions and Learning Points from the Formal University Judicial Process: A Multiple Case Study

    ERIC Educational Resources Information Center

    Lucas, Christopher M.

    2009-01-01

    For educators in the field of higher education and judicial affairs, issues are growing. Campus adjudicators must somehow maximize every opportunity for student education and development in the context of declining resources and increasing expectations of public accountability. Numbers of student misconduct cases, including matters of violence and…

  17. Integration and Testing Challenges of Small, Multiple Satellite Missions: Experiences From The Space Technology 5 Project

    NASA Technical Reports Server (NTRS)

    Sauerwein, Timothy A.; Gostomski, Thomas

    2007-01-01

    This brief presentation describes the mechanical and electrical integration activities and environmental testing challenges of the Space Technology 5 (ST5) Project. Lessons learned during this process are highlighted, including performing mechanical activities serially to gain efficiency through repetition and performing electrical activities based on the level of subsystem expertise available.

  18. Modeling How, When, and What Is Learned in a Simple Fault-Finding Task

    ERIC Educational Resources Information Center

    Ritter, Frank E.; Bibby, Peter A.

    2008-01-01

    We have developed a process model that learns in multiple ways while finding faults in a simple control panel device. The model predicts human participants' learning through its own learning. The model's performance was systematically compared to human learning data, including the time course and specific sequence of learned behaviors. These…

  19. Write Now! Using Reflective Writing beyond the Humanities and Social Sciences

    ERIC Educational Resources Information Center

    Cannady, Rachel E.; Gallo, Kasia Z.

    2016-01-01

    Writing is an important teaching and learning tool that fosters active and critical thinking. There are multiple pressures for disciplines outside the humanities and social sciences to integrate writing in their courses. The shift from teaching solely discipline-specific skills to including writing in a meaningful way can be a daunting process. An…

  20. Validating Alternative Modes of Scoring for Coloured Progressive Matrices.

    ERIC Educational Resources Information Center

    Razel, Micha; Eylon, Bat-Sheva

    Conventional scoring of the Coloured Progressive Matrices (CPM) was compared with three methods of multiple weight scoring. The methods include: (1) theoretical weighting in which the weights were based on a theory of cognitive processing; (2) judged weighting in which the weights were given by a group of nine adult expert judges; and (3)…

  1. Behavioral Relaxation Training for Parkinson's Disease Related Dyskinesia and Comorbid Social Anxiety

    ERIC Educational Resources Information Center

    Lundervold, Duane A.; Pahwa, Rajesh; Lyons, Kelly E.

    2013-01-01

    Effects of brief Behavioral Relaxation Training (BRT) on anxiety and dyskinesia of a 57-year-old female, with an 11-year history of Parkinson's disease (PD) and 18-months post-deep brain stimulation of the subthalamic nucleus, were evaluated. Multiple process and outcome measures were used including the Clinical Anxiety Scale (CAS), Subjective…

  2. The Evaluation of a North Carolina Truancy Intervention Program: Is It Culturally Sensitive?

    ERIC Educational Resources Information Center

    Carson, Deborah

    2013-01-01

    Truancy issues are longstanding and the desires to develop programs that are effective have been just as enduring. More striking is the lack of evidence of a truancy program evaluation process that includes collaboration with the clients that it serves. Truancy intervention programs have a wide range of effects on participants from multiple ethnic…

  3. Developing tools for investigating the multiple roles of ethylene: Identification and mapping genes for ethylene biosynthesis and reception in barley

    USDA-ARS?s Scientific Manuscript database

    The plant hormone ethylene is important to many plant processes from germination through senescence, including responses to in vitro growth and plant regeneration. Knowledge of the number of genes, and of their function, that are involved in ethylene biosynthesis and reception is necessary to determ...

  4. Adaptive Coping under Conditions of Extreme Stress: Multilevel Influences on the Determinants of Resilience in Maltreated Children

    ERIC Educational Resources Information Center

    Cicchetti, Dante; Rogosch, Fred A.

    2009-01-01

    The study of resilience in maltreated children reveals the possibility of coping processes and resources on multiple levels of analysis as children strive to adapt under conditions of severe stress. In a maltreating context, aspects of self-organization, including self-esteem, self-reliance, emotion regulation, and adaptable yet reserved…

  5. Engaging Adolescents through Participatory and Qualitative Research Methods to Develop a Digital Communication Intervention to Reduce Adolescent Obesity

    ERIC Educational Resources Information Center

    Livingood, William C.; Monticalvo, David; Bernhardt, Jay M.; Wells, Kelli T.; Harris, Todd; Kee, Kadra; Hayes, Johnathan; George, Donald; Woodhouse, Lynn D.

    2017-01-01

    Background: The complexity of the childhood obesity epidemic requires the application of community-based participatory research (CBPR) in a manner that can transcend multiple communities of stakeholders, including youth, the broader community, and the community of health care providers. Aim: To (a) describe participatory processes for engaging…

  6. MANGO: a new approach to multiple sequence alignment.

    PubMed

    Zhang, Zefeng; Lin, Hao; Li, Ming

    2007-01-01

    Multiple sequence alignment is a classical and challenging task for biological sequence analysis. The problem is NP-hard. The full dynamic programming takes too much time. The progressive alignment heuristics adopted by most state-of-the-art multiple sequence alignment programs suffer from the 'once a gap, always a gap' phenomenon. Is there a radically new way to do multiple sequence alignment? This paper introduces a novel and orthogonal multiple sequence alignment method, using multiple optimized spaced seeds and new algorithms to handle these seeds efficiently. Our new algorithm processes information of all sequences as a whole, avoiding problems caused by the popular progressive approaches. Because the optimized spaced seeds are provably significantly more sensitive than the consecutive k-mers, the new approach promises to be more accurate and reliable. To validate our new approach, we have implemented MANGO: Multiple Alignment with N Gapped Oligos. Experiments were carried out on large 16S RNA benchmarks showing that MANGO compares favorably, in both accuracy and speed, against state-of-the-art multiple sequence alignment methods, including ClustalW 1.83, MUSCLE 3.6, MAFFT 5.861, Prob-ConsRNA 1.11, Dialign 2.2.1, DIALIGN-T 0.2.1, T-Coffee 4.85, POA 2.0 and Kalign 2.0.
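    The spaced-seed idea at the core of the method can be illustrated with a short sketch; the seed pattern and sequences below are invented examples, not MANGO's optimized seeds or its seeding algorithm.

```python
# Illustrative sketch (not MANGO itself) of matching with a spaced seed:
# positions marked '1' must match, '0' positions are wildcards. The seed
# pattern below is just an example, not one of MANGO's optimized seeds.
def spaced_seed_hits(a: str, b: str, seed: str = "1101011"):
    """Yield offsets where sequences a and b agree on all '1' positions."""
    care = [i for i, c in enumerate(seed) if c == "1"]
    span = len(seed)
    for i in range(min(len(a), len(b)) - span + 1):
        if all(a[i + j] == b[i + j] for j in care):
            yield i

# The two sequences differ only at the wildcard positions (indices 2 and 4),
# so the spaced seed still hits while a consecutive 7-mer would miss.
a = "ACGTAGC"
b = "ACCTGGC"
print(list(spaced_seed_hits(a, b)))   # -> [0]
```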

  7. Advanced bulk processing of lightweight materials for utilization in the transportation sector

    NASA Astrophysics Data System (ADS)

    Milner, Justin L.

    The overall objective of this research is to develop the microstructure of metallic lightweight materials via multiple advanced processing techniques with potential for industrial utilization on a large scale to meet the demands of the aerospace and automotive sectors. This work focused on (i) refining the grain structure to increase the strength, (ii) controlling the texture to increase formability and (iii) directly reducing processing/production cost of lightweight material components. Advanced processing is conducted on a bulk scale by several severe plastic deformation techniques including accumulative roll bonding, isolated shear rolling and friction stir processing to achieve the multiple targets of this research. Development and validation of the processing techniques is achieved through wide-ranging experiments along with detailed mechanical and microstructural examination of the processed material. On a broad level, this research will make advancements in processing of bulk lightweight materials facilitating industrial-scale implementation. Accumulative roll bonding and isolated shear rolling, already feasible on an industrial scale, process bulk sheet materials capable of replacing more expensive grades of alloys and enabling low-temperature and high-strain-rate formability. Furthermore, friction stir processing of lightweight tubes made from magnesium alloys has the potential to increase the utilization of these materials in the automotive and aerospace sectors for high-strength, high-formability applications. Increased utilization of these advanced processing techniques will significantly reduce the cost associated with lightweight materials for many applications in the transportation sector.

  8. Multiple functions of caprylic acid-induced impurity precipitation for process intensification in monoclonal antibody purification.

    PubMed

    Trapp, Anja; Faude, Alexander; Hörold, Natalie; Schubert, Sven; Faust, Sabine; Grob, Thilo; Schmidt, Stefan

    2018-05-02

    New emerging technologies delivering benefits in terms of process robustness and economy are an inevitable prerequisite for intensification of monoclonal antibody purification processes. Caprylic acid was proven to be an effective precipitating agent enabling efficient precipitation of product- and process-related impurities while leaving the antibody in solution. This purification step at mild acidic pH was therefore introduced in generic antibody platform approaches after Protein A capture and evaluated for its impact regarding process robustness and antibody stability. Comparison of 13 different monoclonal antibodies showed significant differences in antibody recovery, between 65% and 95%, during caprylic acid-induced impurity precipitation. Among six compared physicochemical properties, the isoelectric point of the antibody domains was found to correlate with yield. Antibodies with a mildly acidic pI of the light chain were significantly susceptible to caprylic acid-induced precipitation, resulting in lower yields. Virus clearance studies revealed that caprylic acid provided complete virus inactivation of an enveloped virus. Multiple process-relevant factors such as pH range, caprylic acid concentration and antibody stability were investigated in this study to enable an intensified purification process including caprylic acid precipitation for HCP removal of up to 2 log10 reduction values at mAb yields >90% while also contributing to the virus safety of the process. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Pre-orthographic character string processing and parietal cortex: a role for visual attention in reading?

    PubMed

    Lobier, Muriel; Peyrin, Carole; Le Bas, Jean-François; Valdois, Sylviane

    2012-07-01

    The visual front-end of reading is most often associated with orthographic processing. The left ventral occipito-temporal cortex seems to be preferentially tuned for letter string and word processing. In contrast, little is known of the mechanisms responsible for pre-orthographic processing: the processing of character strings regardless of character type. While the superior parietal lobule has been shown to be involved in multiple letter processing, further data is necessary to extend these results to non-letter characters. The purpose of this study is to identify the neural correlates of pre-orthographic character string processing independently of character type. Fourteen skilled adult readers carried out multiple and single element visual categorization tasks with alphanumeric (AN) and non-alphanumeric (nAN) characters under fMRI. The role of parietal cortex in multiple element processing was further probed with a priori defined anatomical regions of interest (ROIs). Participants activated posterior parietal cortex more strongly for multiple than single element processing. ROI analyses showed that bilateral SPL/BA7 was more strongly activated for multiple than single element processing, regardless of character type. In contrast, no multiple element specific activity was found in inferior parietal lobules. These results suggest that parietal mechanisms are involved in pre-orthographic character string processing. We argue that in general, attentional mechanisms are involved in visual word recognition, as an early step of word visual analysis. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. TAMU: A New Space Mission Operations Paradigm

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Ruszkowski, James; Haensly, Jean; Pennington, Granvil A.; Hogle, Charles

    2011-01-01

    The Transferable, Adaptable, Modular and Upgradeable (TAMU) Flight Production Process (FPP) is a model-centric System of Systems (SoS) framework which cuts across multiple organizations and their associated facilities that are, in the most general case, in geographically diverse locations, to develop the architecture and associated workflow processes for a broad range of mission operations. Further, TAMU FPP envisions the simulation, automatic execution and re-planning of orchestrated workflow processes as they become operational. This paper provides the vision for the TAMU FPP paradigm. This includes a complete, coherent technique, process and tool set that result in an infrastructure that can be used for full lifecycle design and decision making during any flight production process. A flight production process is the process of developing all products that are necessary for flight.

  11. Rigorous ILT optimization for advanced patterning and design-process co-optimization

    NASA Astrophysics Data System (ADS)

    Selinidis, Kosta; Kuechler, Bernd; Cai, Howard; Braam, Kyle; Hoppe, Wolfgang; Domnenko, Vitaly; Poonawala, Amyn; Xiao, Guangming

    2018-03-01

    Despite the large difficulties involved in extending 193i multiple patterning and the slow ramp of EUV lithography to full manufacturing readiness, the pace of development for new technology node variations has been accelerating. Multiple new variations of new and existing technology nodes have been introduced for a range of device applications; each variation with at least a few new process integration methods, layout constructs and/or design rules. This has led to a strong increase in the demand for predictive technology tools which can be used to quickly guide important patterning and design co-optimization decisions. In this paper, we introduce a novel hybrid predictive patterning method combining two patterning technologies which have each individually been widely used for process tuning, mask correction and process-design cooptimization. These technologies are rigorous lithography simulation and inverse lithography technology (ILT). Rigorous lithography simulation has been extensively used for process development/tuning, lithography tool user setup, photoresist hot-spot detection, photoresist-etch interaction analysis, lithography-TCAD interactions/sensitivities, source optimization and basic lithography design rule exploration. ILT has been extensively used in a range of lithographic areas including logic hot-spot fixing, memory layout correction, dense memory cell optimization, assist feature (AF) optimization, source optimization, complex patterning design rules and design-technology co-optimization (DTCO). The combined optimization capability of these two technologies will therefore have a wide range of useful applications. We investigate the benefits of the new functionality for a few of these advanced applications including correction for photoresist top loss and resist scumming hotspots.

  12. Multiple attenuation to reflection seismic data using Radon filter and Wave Equation Multiple Rejection (WEMR) method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erlangga, Mokhammad Puput

    Separation between signal and noise, incoherent or coherent, is important in seismic data processing. Even after processing, coherent noise remains mixed with the primary signal. Multiple reflections are a kind of coherent noise. In this research, we processed seismic data to attenuate multiple reflections in both synthetic and real seismic data from Mentawai. There are several methods to attenuate multiple reflections; one of them is the Radon filter method, which discriminates between primary and multiple reflections in the τ-p domain based on the moveout difference between them. However, in cases where the moveout difference is too small, the Radon filter method is not sufficient to attenuate the multiple reflections. The Radon filter also produces artifacts in the gathers. Besides the Radon filter method, we also use the Wave Equation Multiple Rejection (WEMR) method to attenuate the long-period multiple reflections. The WEMR method attenuates the long-period multiple reflections based on wave-equation inversion. From the inversion of the wave equation and the magnitude of the seismic wave amplitude observed at the free surface, we obtain the water-bottom reflectivity, which is used to eliminate the multiple reflections. The WEMR method does not depend on the moveout difference to attenuate the long-period multiple reflections; it can therefore be applied to seismic data with small moveout differences, such as the Mentawai data. The small moveout difference in the Mentawai seismic data is caused by the limited far offset, which is only 705 meters. We compared the multiple-free stacked data after processing with the Radon filter and with the WEMR process. The conclusion is that the WEMR method attenuates the long-period multiple reflections more effectively than the Radon filter method on the real (Mentawai) seismic data.
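    For readers unfamiliar with the τ-p domain mentioned above, the following is a minimal slant-stack (linear Radon) sketch on a toy gather; practical multiple attenuation would typically use a least-squares or parabolic Radon transform, so this illustrates only the domain change, not the authors' processing flow.

```python
# Minimal slant-stack (linear tau-p) sketch to illustrate the Radon-domain
# separation by moveout; production multiple attenuation would normally use
# a least-squares or parabolic Radon transform instead of this toy version.
import numpy as np

def slant_stack(gather, offsets, dt, slownesses):
    """gather: (n_traces, n_samples); returns tau-p panel (n_p, n_samples)."""
    n_traces, n_samples = gather.shape
    panel = np.zeros((len(slownesses), n_samples))
    for ip, p in enumerate(slownesses):
        for tr in range(n_traces):
            shift = int(round(p * offsets[tr] / dt))   # moveout in samples
            if 0 <= shift < n_samples:
                panel[ip, : n_samples - shift] += gather[tr, shift:]
    return panel

# Tiny synthetic gather: a flat primary plus a linearly dipping multiple.
dt, offsets = 0.004, np.arange(0, 600, 50.0)          # s, metres
gather = np.zeros((len(offsets), 500))
gather[:, 100] = 1.0                                   # primary (p = 0)
for tr, x in enumerate(offsets):                       # multiple, p = 0.4 ms/m
    gather[tr, 200 + int(round(0.0004 * x / dt))] = 1.0

panel = slant_stack(gather, offsets, dt, np.linspace(0, 0.0008, 41))
print(panel.shape, np.unravel_index(panel.argmax(), panel.shape))
```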

  13. A time-driven, activity-based costing methodology for determining the costs of red blood cell transfusion in patients with beta thalassaemia major.

    PubMed

    Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M

    2018-04-10

    To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
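    The time-driven activity-based costing arithmetic reduces to multiplying each mapped activity's time by the capacity cost rate of the resource that performs it; the sketch below illustrates only that mechanism, with entirely hypothetical activities, durations, and rates rather than the study's process maps.

```python
# Hedged sketch of the time-driven activity-based costing arithmetic the
# abstract describes: each mapped activity consumes minutes of a resource
# whose capacity cost rate is a cost per minute. All figures are hypothetical.
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    minutes: float
    resource: str

# Hypothetical capacity cost rates ($ per available minute of each resource).
cost_per_minute = {"nurse": 1.10, "lab_scientist": 1.40, "courier": 0.60}

process_map = [
    Activity("collect crossmatch sample", 10, "nurse"),
    Activity("perform crossmatch",        25, "lab_scientist"),
    Activity("deliver RBC unit",          15, "courier"),
    Activity("administer transfusion",    45, "nurse"),
]

total = sum(a.minutes * cost_per_minute[a.resource] for a in process_map)
for a in process_map:
    print(f"{a.name}: ${a.minutes * cost_per_minute[a.resource]:.2f}")
print(f"Activity cost per RBC unit (excluding product cost): ${total:.2f}")
```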

  14. A Framework for Assessing the Value of Investments in Nonclinical Prevention

    PubMed Central

    Roehrig, Charles; Russo, Pamela

    2015-01-01

    We present a high-level framework to show the process by which an investment in primary prevention produces value. We define primary prevention broadly to include investments in any of the determinants of health. Although it builds on previously developed frameworks, ours incorporates several additional features. It distinguishes direct and upstream determinants of health, a distinction that can help identify, describe, and track the impact of a policy or program on health and health care costs. It recognizes multiple dimensions of value, including the need to establish the nonhealth value of investments whose objectives are not limited to improvements in health (and whose costs should not be attributed solely to the health benefits). Finally, it emphasizes the need to describe value from the perspectives of the multiple stakeholders that can influence such investments. PMID:26652216

  15. A Framework for Assessing the Value of Investments in Nonclinical Prevention.

    PubMed

    Miller, George; Roehrig, Charles; Russo, Pamela

    2015-12-10

    We present a high-level framework to show the process by which an investment in primary prevention produces value. We define primary prevention broadly to include investments in any of the determinants of health. Although it builds on previously developed frameworks, ours incorporates several additional features. It distinguishes direct and upstream determinants of health, a distinction that can help identify, describe, and track the impact of a policy or program on health and health care costs. It recognizes multiple dimensions of value, including the need to establish the nonhealth value of investments whose objectives are not limited to improvements in health (and whose costs should not be attributed solely to the health benefits). Finally, it emphasizes the need to describe value from the perspectives of the multiple stakeholders that can influence such investments.

  16. A dynamic bead-based microarray for parallel DNA detection

    NASA Astrophysics Data System (ADS)

    Sochol, R. D.; Casavant, B. P.; Dueck, M. E.; Lee, L. P.; Lin, L.

    2011-05-01

    A microfluidic system has been designed and constructed by means of micromachining processes to integrate both microfluidic mixing of mobile microbeads and hydrodynamic microbead arraying capabilities on a single chip to simultaneously detect multiple bio-molecules. The prototype system has four parallel reaction chambers, which include microchannels of 18 × 50 µm2 cross-sectional area and a microfluidic mixing section of 22 cm length. Parallel detection of multiple DNA oligonucleotide sequences was achieved via molecular beacon probes immobilized on polystyrene microbeads of 16 µm diameter. Experimental results show quantitative detection of three distinct DNA oligonucleotide sequences from the Hepatitis C viral (HCV) genome with single base-pair mismatch specificity. Our dynamic bead-based microarray offers an effective microfluidic platform to increase parallelization of reactions and improve microbead handling for various biological applications, including bio-molecule detection, medical diagnostics and drug screening.

  17. Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes

    PubMed Central

    Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin

    2012-01-01

    Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
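    A minimal sketch of the generic per-pixel processing chain enumerated in this review (spatial filtering, temporal filtering, baseline-drift removal) is given below; the filter choices and parameters are illustrative assumptions, not recommendations from the review.

```python
# Hedged sketch of a generic optical-mapping processing chain (spatial
# filtering, temporal filtering, baseline-drift removal) applied to a
# synthetic fluorescence stack of shape (frames, rows, cols).
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import butter, filtfilt

fs = 500.0                                   # frames per second
stack = np.random.rand(1000, 64, 64)         # stand-in for fluorescence data

# 1. Spatial filtering: smooth each frame with a small Gaussian kernel.
stack = gaussian_filter(stack, sigma=(0, 1.5, 1.5))

# 2. Temporal filtering: low-pass each pixel's time series (here 100 Hz).
b, a = butter(3, 100.0 / (fs / 2.0), btype="low")
stack = filtfilt(b, a, stack, axis=0)

# 3. Baseline drift removal: subtract a slow polynomial fit per pixel.
t = np.arange(stack.shape[0])
flat = stack.reshape(stack.shape[0], -1)
coeffs = np.polynomial.polynomial.polyfit(t, flat, deg=2)
drift = np.polynomial.polynomial.polyval(t, coeffs).T
stack = (flat - drift).reshape(stack.shape)
print(stack.shape)
```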

  18. Cognitive status in patients with multiple sclerosis in Lanzarote.

    PubMed

    Pérez-Martín, María Yaiza; Eguia-Del Río, Pablo; González-Platas, Montserrat; Jiménez-Sosa, Alejandro

    2016-01-01

    Cognitive impairment is a common feature in multiple sclerosis affecting ~43%-72% of patients, which involves cognitive functions such as memory, processing speed, attention, and executive function. The aim of this study was to describe the extent and pattern of the involvement of cognitive impairment and psychological status in all patients with multiple sclerosis on a small Spanish island. In all, 70 patients and 56 healthy controls were included in the study between February 2013 and May 2013. All participants were assessed using the Brief Repeatable Battery of Neuropsychological Test. The patients also completed instruments to evaluate the presence of fatigue, perceived cognitive dysfunction, and symptoms of anxiety and depression. All procedures were performed in a single session. Cognitive impairment, defined as a score <1.5 standard deviation on two subtests of the battery, was present in 35% of the participants. The most frequently affected domain was working memory, followed by verbal memory and processing speed. Disease duration showed a moderate correlation with visuospatial memory and processing speed. The Expanded Disability Status Scale score correlated with verbal and processing speed. Verbal memory was correlated with depression symptoms and fatigue. Cognitive impairment was present in 35% of the study population. The most affected domains were working memory and verbal memory. Working memory and verbal fluency deficit are independent factors of disease evolution. Cognitive decline is related to clinical variables and psychological measures such as fatigue or depression but not to anxiety.

  19. Cognitive status in patients with multiple sclerosis in Lanzarote

    PubMed Central

    Pérez-Martín, María Yaiza; Eguia-del Río, Pablo; González-Platas, Montserrat; Jiménez-Sosa, Alejandro

    2016-01-01

    Objectives Cognitive impairment is a common feature in multiple sclerosis affecting ~43%–72% of patients, which involves cognitive functions such as memory, processing speed, attention, and executive function. The aim of this study was to describe the extent and pattern of the involvement of cognitive impairment and psychological status in all patients with multiple sclerosis on a small Spanish island. Patients and methods In all, 70 patients and 56 healthy controls were included in the study between February 2013 and May 2013. All participants were assessed using the Brief Repeatable Battery of Neuropsychological Test. The patients also completed instruments to evaluate the presence of fatigue, perceived cognitive dysfunction, and symptoms of anxiety and depression. All procedures were performed in a single session. Results Cognitive impairment, defined as a score <1.5 standard deviation on two subtests of the battery, was present in 35% of the participants. The most frequently affected domain was working memory, followed by verbal memory and processing speed. Disease duration showed a moderate correlation with visuospatial memory and processing speed. The Expanded Disability Status Scale score correlated with verbal and processing speed. Verbal memory was correlated with depression symptoms and fatigue. Conclusion Cognitive impairment was present in 35% of the study population. The most affected domains were working memory and verbal memory. Working memory and verbal fluency deficit are independent factors of disease evolution. Cognitive decline is related to clinical variables and psychological measures such as fatigue or depression but not to anxiety. PMID:27418825

  20. Lipid-associated oral delivery: Mechanisms and analysis of oral absorption enhancement.

    PubMed

    Rezhdo, Oljora; Speciner, Lauren; Carrier, Rebecca

    2016-10-28

    The majority of newly discovered oral drugs are poorly water soluble, and co-administration with lipids has proven effective in significantly enhancing bioavailability of some compounds with low aqueous solubility. Yet, lipid-based delivery technologies have not been widely employed in commercial oral products. Lipids can impact drug transport and fate in the gastrointestinal (GI) tract through multiple mechanisms including enhancement of solubility and dissolution kinetics, enhancement of permeation through the intestinal mucosa, and triggering drug precipitation upon lipid emulsion depletion (e.g., by digestion). The effect of lipids on drug absorption is currently not quantitatively predictable, in part due to the multiple complex dynamic processes that can be impacted by lipids. Quantitative mechanistic analysis of the processes significant to lipid system function and overall impact on drug absorption can aid in the understanding of drug-lipid interactions in the GI tract and exploitation of such interactions to achieve optimal lipid-based drug delivery. In this review, we discuss the impact of co-delivered lipids and lipid digestion on drug dissolution, partitioning, and absorption in the context of the experimental tools and associated kinetic expressions used to study and model these processes. The potential benefit of a systems-based consideration of the concurrent multiple dynamic processes occurring upon co-dosing lipids and drugs to predict the impact of lipids on drug absorption and enable rational design of lipid-based delivery systems is presented. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Multiple resolution chirp reflectometry for fault localization and diagnosis in a high voltage cable in automotive electronics

    NASA Astrophysics Data System (ADS)

    Chang, Seung Jin; Lee, Chun Ku; Shin, Yong-June; Park, Jin Bae

    2016-12-01

    A multiple chirp reflectometry system with a fault estimation process is proposed to obtain multiple resolution and to measure the degree of fault in a target cable. A multiple resolution algorithm has the ability to localize faults, regardless of fault location. The time delay information, which is derived from the normalized cross-correlation between the incident signal and bandpass filtered reflected signals, is converted to a fault location and cable length. The in-phase and quadrature components are obtained by lowpass filtering of the mixed signal of the incident signal and the reflected signal. Based on the in-phase and quadrature components, the reflection coefficient is estimated by the proposed fault estimation process including the mixing and filtering procedure. Also, the measurement uncertainty for this experiment is analyzed according to the Guide to the Expression of Uncertainty in Measurement. To verify the performance of the proposed method, we conduct comparative experiments to detect and measure faults under different conditions. Considering the installation environment of the high voltage cable used in an actual vehicle, the target cable length and fault position are designed accordingly. To simulate the degree of fault, a variety of termination impedances (10 Ω, 30 Ω, 50 Ω, and 1 kΩ) is used and estimated by the proposed method in this experiment. The proposed method demonstrates advantages in that it has multiple resolution to overcome the blind spot problem, and can assess the state of the fault.
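    The delay-estimation step can be sketched as a cross-correlation between the incident chirp and the reflected signal, with the lag of the correlation peak converted to a fault distance; the sketch below uses plain (rather than normalized) correlation, and the sampling rate, chirp, and propagation velocity are illustrative values, not the authors' experimental setup.

```python
# Hedged sketch of the delay-estimation idea: cross-correlate the reflected
# signal with the incident chirp and convert the peak lag to a distance.
# All parameter values are illustrative, not the paper's experimental setup.
import numpy as np

fs = 1e9                                       # 1 GS/s sampling (assumed)
t = np.arange(0, 2e-6, 1 / fs)
k = 50e6 / 2e-6                                # chirp rate (Hz per second)
incident = np.sin(2 * np.pi * (1e6 + 0.5 * k * t) * t)   # 1-51 MHz up-chirp

true_delay = 300e-9                            # round-trip delay to the fault
reflected = np.roll(incident, int(true_delay * fs)) * 0.4
reflected += 0.05 * np.random.randn(reflected.size)

xcorr = np.correlate(reflected, incident, mode="full")
lag = xcorr.argmax() - (incident.size - 1)     # delay in samples
v_p = 2e8                                      # assumed propagation velocity, m/s
print("fault at ~%.1f m" % (0.5 * v_p * lag / fs))
```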

  2. Evaluator-blinded trial evaluating nurse-led immunotherapy DEcision Coaching In persons with relapsing-remitting Multiple Sclerosis (DECIMS) and accompanying process evaluation: study protocol for a cluster randomised controlled trial.

    PubMed

    Rahn, Anne Christin; Köpke, Sascha; Kasper, Jürgen; Vettorazzi, Eik; Mühlhauser, Ingrid; Heesen, Christoph

    2015-03-21

    Multiple sclerosis is a chronic neurological condition usually starting in early adulthood and regularly leading to severe disability. Immunotherapy options are growing in number and complexity, while costs of treatments are high and adherence rates remain low. Therefore, treatment decision-making has become more complex for patients. Structured decision coaching, based on the principles of evidence-based patient information and shared decision-making, has the potential to facilitate participation of individuals in the decision-making process. This cluster randomised controlled trial follows the assumption that decision coaching by trained nurses, using evidence-based patient information and preference elicitation, will facilitate informed choices and induce higher decision quality, as well as better decisional adherence. The decision coaching programme will be evaluated through an evaluator-blinded superiority cluster randomised controlled trial, including 300 patients with suspected or definite relapsing-remitting multiple sclerosis, facing an immunotherapy decision. The clusters are 12 multiple sclerosis outpatient clinics in Germany. Further, the trial will be accompanied by a mixed-methods process evaluation and a cost-effectiveness study. Nurses in the intervention group will be trained in shared decision-making, coaching, and evidence-based patient information principles. Patients who meet the inclusion criteria will receive decision coaching (intervention group) with up to three face-to-face coaching sessions with a trained nurse (decision coach) or counselling as usual (control group). Patients in both groups will be given access to an evidence-based online information tool. The primary outcome is 'informed choice' after six months, assessed with the multi-dimensional measure of informed choice including the sub-dimensions risk knowledge (questionnaire), attitude concerning immunotherapy (questionnaire), and immunotherapy uptake (telephone survey). Secondary outcomes include decisional conflict, adherence to immunotherapy decisions, autonomy preference, planned behaviour, coping self-efficacy, and perceived involvement in coaching and decisional encounters. Safety outcomes are comprised of anxiety and depression and disease-specific quality of life. This trial will assess the effectiveness of a new model of patient decision support concerning MS-immunotherapy options. The delegation of treatment information provision from physicians to trained nurses bears the potential to change current doctor-focused practice in Germany. Current Controlled Trials (identifier: ISRCTN37929939 ), May 27, 2014.

  3. Image Processing, Coding, and Compression with Multiple-Point Impulse Response Functions.

    NASA Astrophysics Data System (ADS)

    Stossel, Bryan Joseph

    1995-01-01

    Aspects of image processing, coding, and compression with multiple-point impulse response functions are investigated. Topics considered include characterization of the corresponding random-walk transfer function, image recovery for images degraded by the multiple-point impulse response, and the application of the blur function to image coding and compression. It is found that although the zeros of the real and imaginary parts of the random-walk transfer function occur in continuous, closed contours, the zeros of the transfer function occur at isolated spatial frequencies. Theoretical calculations of the average number of zeros per area are in excellent agreement with experimental results obtained from computer counts of the zeros. The average number of zeros per area is proportional to the standard deviations of the real part of the transfer function as well as the first partial derivatives. Statistical parameters of the transfer function are calculated including the mean, variance, and correlation functions for the real and imaginary parts of the transfer function and their corresponding first partial derivatives. These calculations verify the assumptions required in the derivation of the expression for the average number of zeros. Interesting results are found for the correlations of the real and imaginary parts of the transfer function and their first partial derivatives. The isolated nature of the zeros in the transfer function and its characteristics at high spatial frequencies result in largely reduced reconstruction artifacts and excellent reconstructions are obtained for distributions of impulses consisting of 25 to 150 impulses. The multiple-point impulse response obscures original scenes beyond recognition. This property is important for secure transmission of data on many communication systems. The multiple-point impulse response enables the decoding and restoration of the original scene with very little distortion. Images prefiltered by the random-walk transfer function yield greater compression ratios than are obtained for the original scene. The multiple-point impulse response decreases the bit rate approximately 40-70% and affords near distortion-free reconstructions. Due to the lossy nature of transform-based compression algorithms, noise reduction measures must be incorporated to yield acceptable reconstructions after decompression.
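    The encode/decode idea studied in this dissertation can be sketched briefly: blur a scene with a random scatter of impulses, then invert the resulting transfer function; because the zeros of that transfer function are isolated points, regularized inverse filtering recovers the scene well. The image size, impulse count, and regularization below are illustrative assumptions, not the dissertation's parameters.

```python
# Hedged sketch of the encode/decode idea: blur an image with a random
# multiple-point impulse response (a scatter of delta functions) and recover
# it by regularized inverse filtering of the transfer function.
import numpy as np

rng = np.random.default_rng(1)
N, n_impulses = 128, 50

# Random multiple-point impulse response and its transfer function.
psf = np.zeros((N, N))
ys, xs = rng.integers(0, N, n_impulses), rng.integers(0, N, n_impulses)
psf[ys, xs] = 1.0 / n_impulses
H = np.fft.fft2(psf)

# Blur a simple test scene (a bright square) by multiplication in frequency.
scene = np.zeros((N, N))
scene[40:80, 40:80] = 1.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * H))

# Regularized inverse filter; the isolated zeros of H keep artifacts small,
# so the reconstruction approximately recovers the original scene.
eps = 1e-3
recovered = np.real(np.fft.ifft2(np.fft.fft2(blurred) * np.conj(H) /
                                 (np.abs(H) ** 2 + eps)))
print("max abs error:", np.abs(recovered - scene).max())
```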

  4. PROCAMS - A second generation multispectral-multitemporal data processing system for agricultural mensuration

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Nalepka, R. F.

    1976-01-01

    PROCAMS (Prototype Classification and Mensuration System) has been designed for the classification and mensuration of agricultural crops (specifically small grains including wheat, rye, oats, and barley) through the use of data provided by Landsat. The system includes signature extension as a major feature and incorporates multitemporal as well as early season unitemporal approaches for using multiple training sites. Also addressed are partial cloud cover and cloud shadows, bad data points and lines, as well as changing sun angle and atmospheric state variations.

  5. Characterization and production of multifunctional cationic peptides derived from rice proteins.

    PubMed

    Taniguchi, Masayuki; Ochiai, Akihito

    2017-04-01

    Food proteins have been identified as a source of bioactive peptides. These peptides are inactive within the sequence of the parent protein and must be released during gastrointestinal digestion, fermentation, or food processing. Of bioactive peptides, multifunctional cationic peptides are more useful than other peptides that have specific activity in promotion of health and/or the treatment of diseases. We have identified and characterized cationic peptides from rice enzymes and proteins that possess multiple functions, including antimicrobial, endotoxin-neutralizing, arginine gingipain-inhibitory, and/or angiogenic activities. In particular, we have elucidated the contribution of cationic amino acids (arginine and lysine) in the peptides to their bioactivities. Further, we have discussed the critical parameters, particularly proteinase preparations and fractionation or purification, in the enzymatic hydrolysis process for producing bioactive peptides from food proteins. Using an ampholyte-free isoelectric focusing (autofocusing) technique as a tool for fractionation, we successfully prepared fractions containing cationic peptides with multiple functions.

  6. Water Isotopes in the GISS GCM: History, Applications and Potential

    NASA Astrophysics Data System (ADS)

    Schmidt, G. A.; LeGrande, A. N.; Field, R. D.; Nusbaumer, J. M.

    2017-12-01

    Water isotopes have been incorporated in the GISS GCMs since the pioneering work of Jean Jouzel in the 1980s. Since 2005, this functionality has been maintained within the master branch of the development code and has been usable (and used) in all subsequent versions. This has allowed a wide variety of applications, across multiple time-scales and interests, to be tackled coherently. Water isotope tracers have been used to debug the atmospheric model code, tune parameterisations of moist processes, assess the isotopic fingerprints of multiple climate drivers, produce forward models for remotely sensed isotope products, and validate paleo-climate interpretations from the last millennium to the Eocene. We will present an overview of recent results involving isotope tracers, including improvements in models for the isotopic fractionation processes themselves, and demonstrate the potential for using these tracers and models more systematically in paleo-climate reconstructions and investigations of the modern hydrological cycle.

  7. Passing in Command Line Arguments and Parallel Cluster/Multicore Batching in R with batch.

    PubMed

    Hoffmann, Thomas J

    2011-03-01

    It is often useful to rerun a command line R script with some slight change in the parameters used to run it - a new set of parameters for a simulation, a different dataset to process, etc. The R package batch provides a means to pass in multiple command line options, including vectors of values in the usual R format, easily into R. The same script can be setup to run things in parallel via different command line arguments. The R package batch also provides a means to simplify this parallel batching by allowing one to use R and an R-like syntax for arguments to spread a script across a cluster or local multicore/multiprocessor computer, with automated syntax for several popular cluster types. Finally it provides a means to aggregate the results together of multiple processes run on a cluster.
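    The package itself is written for R; as a language-neutral illustration of the pattern it supports (run parameters passed on the command line, the same script fanned out across local worker processes), here is a rough Python analogue. It is not the batch package's API, and the flags and helper names are invented for the sketch.

```python
# Not the R 'batch' package: a rough Python analogue of the pattern the
# abstract describes, i.e. passing run parameters on the command line and
# fanning the same script out across local worker processes.
import argparse
import subprocess
import sys
from multiprocessing import Pool

def run_one(seed: int) -> str:
    # Re-invoke this script with a different --seed, as a batch system would.
    out = subprocess.run([sys.executable, __file__, "--seed", str(seed)],
                         capture_output=True, text=True)
    return out.stdout.strip()

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--seed", type=int, default=None)
    parser.add_argument("--spawn", type=int, default=0,
                        help="number of child runs to launch in parallel")
    args = parser.parse_args()
    if args.spawn:                       # parent: spread runs over processes
        with Pool(processes=4) as pool:
            print(pool.map(run_one, range(args.spawn)))
    else:                                # child: one simulation replicate
        print(f"simulation finished with seed={args.seed}")
```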

  8. Perceptions and Cost-Analysis of a Multiple Mini-Interview in a Pharmacy School Admissions Process.

    PubMed

    Corelli, Robin L; Muchnik, Michael A; Beechinor, Ryan J; Fong, Gary; Vogt, Eleanor M; Cocohoba, Jennifer M; Tsourounis, Candy; Hudmon, Karen Suchanek

    2015-11-25

    To improve the quality of admissions interviews for a doctor of pharmacy program, using a multiple mini-interview (MMI) in place of the standard interview. Stakeholders completed an anonymous web-based survey. This study characterized perceptions of the MMI format across 3 major stakeholders (candidates, interviewers, admissions committee members) and included comparative cost estimates. Costs were estimated using human and facility resources from the 2012 cycle (standard format) and the 2013 cycle (MMI format). Most candidates (65%), interviewers (86%), and admissions committee members (79%) perceived the MMI format as effective for evaluating applicants, and most (59% of candidates, 84% of interviewers, 77% of committee members) agreed that the MMI format should be continued. Cost per candidate interviewed was $136.34 (standard interview) vs $75.30 (MMI). Perceptions of the MMI process were favorable across stakeholder groups, and this format was less costly per candidate interviewed.

  9. Inter-individual cognitive variability in children with Asperger's syndrome

    PubMed Central

    Gonzalez-Gadea, Maria Luz; Tripicchio, Paula; Rattazzi, Alexia; Baez, Sandra; Marino, Julian; Roca, Maria; Manes, Facundo; Ibanez, Agustin

    2014-01-01

    Multiple studies have tried to establish the distinctive profile of individuals with Asperger's syndrome (AS). However, recent reports suggest that adults with AS feature heterogeneous cognitive profiles. The present study explores inter-individual variability in children with AS through group comparison and multiple case series analysis. All participants completed an extended battery including measures of fluid and crystallized intelligence, executive functions, theory of mind, and classical neuropsychological tests. Significant group differences were found in theory of mind and other domains related to global information processing. However, the AS group showed high inter-individual variability (both sub- and supra-normal performance) on most cognitive tasks. Furthermore, high fluid intelligence correlated with less general cognitive impairment, high cognitive flexibility, and speed of motor processing. In light of these findings, we propose that children with AS are characterized by a distinct, uneven pattern of cognitive strengths and weaknesses. PMID:25132817

  10. Biosynthesis of Jasmine Lactone in Tea ( Camellia sinensis) Leaves and Its Formation in Response to Multiple Stresses.

    PubMed

    Zeng, Lanting; Zhou, Ying; Fu, Xiumin; Liao, Yinyin; Yuan, Yunfei; Jia, Yongxia; Dong, Fang; Yang, Ziyin

    2018-04-18

    Jasmine lactone has a potent odor that contributes to the fruity, sweet floral aroma of tea ( Camellia sinensis). Our previous study demonstrated that jasmine lactone was mostly accumulated at the turnover stage of the oolong tea manufacturing process. This study investigates the previously unknown mechanism of formation of jasmine lactone in tea leaves exposed to multiple stresses occurring during the growth and manufacturing processes. Both continuous mechanical damage and the dual stress of low temperature and mechanical damage enhanced jasmine lactone accumulation in tea leaves. In addition, only one pathway, via hydroperoxy fatty acids from unsaturated fatty acid, including linoleic acid and α-linolenic acid, under the action of lipoxygenases (LOXs), especially CsLOX1, was significantly affected by these stresses. This is the first evidence of the mechanism of jasmine lactone formation in tea leaves and is a characteristic example of plant volatile formation in response to dual stress.

  11. Effect of Pore Size and Pore Connectivity on Unidirectional Capillary Penetration Kinetics in 3-D Porous Media using Direct Numerical Simulation

    NASA Astrophysics Data System (ADS)

    Fu, An; Palakurthi, Nikhil; Konangi, Santosh; Comer, Ken; Jog, Milind

    2017-11-01

    The physics of capillary flow is used widely in multiple fields. The Lucas-Washburn equation is derived for a capillary tube of a single pore size with continuous pore connection. Although this equation has been extended to describe penetration kinetics in porous media, multiple studies have indicated that it does not accurately predict flow patterns in real porous media. In this study, the penetration kinetics, including the effects of pore size and pore connectivity, are closely examined, since these are expected to be the key factors affecting the penetration process. The liquid wicking process is studied from converging and diverging capillary tubes to complex virtual 3-D porous structures with Direct Numerical Simulation (DNS) using the Volume-Of-Fluid (VOF) method within the OpenFOAM CFD solver. Additionally, porous-medium properties such as permeability (k) and tortuosity (τ) are also analyzed.
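    The single-capillary baseline against which such studies compare is the Lucas-Washburn relation, l(t) = sqrt(γ r cos θ · t / (2 μ)); a small sketch evaluating it follows, with illustrative fluid properties corresponding roughly to water at room temperature.

```python
# The classical Lucas-Washburn relation the abstract takes as its starting
# point: penetration length l(t) = sqrt(gamma * r * cos(theta) * t / (2 * mu))
# for a single straight capillary of radius r. Property values below are
# illustrative (roughly water at room temperature), not the study's inputs.
import numpy as np

def washburn_length(t, r, gamma=0.072, theta=0.0, mu=1.0e-3):
    """Penetration length (m) at time t (s) for capillary radius r (m)."""
    return np.sqrt(gamma * r * np.cos(theta) * t / (2.0 * mu))

t = np.linspace(0, 1.0, 5)                    # seconds
for radius in (10e-6, 50e-6):                 # 10 and 50 micron pores
    print(radius, washburn_length(t, radius))
```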

  12. Rab protein evolution and the history of the eukaryotic endomembrane system

    PubMed Central

    Brighouse, Andrew; Dacks, Joel B.

    2010-01-01

    Spectacular increases in the quantity of genome sequence data have facilitated major advances in eukaryotic comparative genomics. By exploiting homology with classical model organisms, it is now possible to predict pathways and cellular functions that would otherwise be impossible to address in experimentally intractable organisms. Echoing the realization that core metabolic processes were established very early following the evolution of life on earth, it is now emerging that many eukaryotic cellular features, including the endomembrane system, are ancient and organized around near-universal principles. Rab proteins are key mediators of vesicle transport and specificity, and, via the presence of multiple paralogues, alterations in interaction specificity, and modification of pathways, contribute greatly to the evolution of complexity of membrane transport. Understanding system-level contributions of Rab proteins to evolutionary history provides insight into the multiple processes sculpting cellular transport pathways and the exciting challenges that we face in delving further into the origins of membrane trafficking specificity. PMID:20582450

  13. Linguistic measures of the referential process in psychodynamic treatment: the English and Italian versions.

    PubMed

    Mariani, Rachele; Maskit, Bernard; Bucci, Wilma; De Coro, Alessandra

    2013-01-01

    The referential process is defined in the context of Bucci's multiple code theory as the process by which nonverbal experience is connected to language. The English computerized measures of the referential process, which have been applied in psychotherapy research, include the Weighted Referential Activity Dictionary (WRAD), and measures of Reflection, Affect and Disfluency. This paper presents the development of the Italian version of the IWRAD by modeling Italian texts scored by judges, and shows the application of the IWRAD and other Italian measures in three psychodynamic treatments evaluated for personality change using the Shedler-Westen Assessment Procedure (SWAP-200). Clinical predictions based on applications of the English measures were supported.

  14. Double-observer line transect surveys with Markov-modulated Poisson process models for animal availability.

    PubMed

    Borchers, D L; Langrock, R

    2015-12-01

    We develop maximum likelihood methods for line transect surveys in which animals go undetected at distance zero, either because they are stochastically unavailable while within view or because they are missed when they are available. These incorporate a Markov-modulated Poisson process model for animal availability, allowing more clustered availability events than is possible with Poisson availability models. They include a mark-recapture component arising from the independent-observer survey, leading to more accurate estimation of detection probability given availability. We develop models for situations in which (a) multiple detections of the same individual are possible and (b) some or all of the availability process parameters are estimated from the line transect survey itself, rather than from independent data. We investigate estimator performance by simulation, and compare the multiple-detection estimators with estimators that use only initial detections of individuals, and with a single-observer estimator. Simultaneous estimation of detection function parameters and availability model parameters is shown to be feasible from the line transect survey alone with multiple detections and double-observer data but not with single-observer data. Recording multiple detections of individuals improves estimator precision substantially when estimating the availability model parameters from survey data, and we recommend that these data be gathered. We apply the methods to estimate detection probability from a double-observer survey of North Atlantic minke whales, and find that double-observer data greatly improve estimator precision here too. © 2015 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
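
    As a rough illustration of the availability model described above (a sketch only; the two-state structure and all parameter values are assumptions, not taken from the paper), a Markov-modulated Poisson process can be simulated by switching the Poisson event rate whenever a hidden Markov state changes:

```python
# Minimal sketch of a two-state Markov-modulated Poisson process for animal
# availability: the animal switches between a "shallow" state with frequent
# surfacings and a "deep" state with rare ones, producing clustered events.
import numpy as np

rng = np.random.default_rng(1)

switch_rate = {"shallow": 0.5, "deep": 0.2}   # rate of leaving each state (per minute)
avail_rate = {"shallow": 2.0, "deep": 0.1}    # surfacing events per minute in each state

def simulate_mmpp(t_end=60.0, state="shallow"):
    """Return surfacing times on [0, t_end] from the two-state MMPP."""
    t, events = 0.0, []
    while t < t_end:
        dwell = rng.exponential(1.0 / switch_rate[state])  # time until the state switches
        t_next = min(t + dwell, t_end)
        # Poisson number of surfacings during the dwell period at this state's rate
        n = rng.poisson(avail_rate[state] * (t_next - t))
        events.extend(np.sort(rng.uniform(t, t_next, n)))
        t = t_next
        state = "deep" if state == "shallow" else "shallow"
    return np.array(events)

print(simulate_mmpp()[:10])
```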

  15. Signal Processing Methods for Liquid Rocket Engine Combustion Stability Assessments

    NASA Technical Reports Server (NTRS)

    Kenny, R. Jeremy; Lee, Erik; Hulka, James R.; Casiano, Matthew

    2011-01-01

    The J2X Gas Generator engine design specifications include dynamic, spontaneous, and broadband combustion stability requirements. These requirements are verified empirically based on high-frequency chamber pressure measurements and analyses. Dynamic stability is determined from the dynamic pressure response due to an artificial perturbation of the combustion chamber pressure (bomb testing), and spontaneous and broadband stability are determined from the dynamic pressure responses during steady operation starting at specified power levels. J2X Workhorse Gas Generator testing included bomb tests with multiple hardware configurations and operating conditions, including a configuration used explicitly for the engine verification test series. This work covers signal processing techniques developed at Marshall Space Flight Center (MSFC) to help assess engine design stability requirements. Dynamic stability assessments were performed following both the CPIA 655 guidelines and an MSFC in-house-developed statistics-based approach. The statistical approach was developed to better verify when the dynamic pressure amplitudes corresponding to a particular frequency returned to pre-bomb characteristics. This was accomplished by first determining the statistical characteristics of the pre-bomb dynamic levels. The pre-bomb statistical characterization provided 95% coverage bounds; these bounds were used as a quantitative measure to determine when the post-bomb signal returned to pre-bomb conditions. The time for post-bomb levels to acceptably return to pre-bomb levels was compared to the dominant frequency-dependent time recommended by CPIA 655. Results for multiple test configurations, including stable and unstable configurations, were reviewed. Spontaneous stability was assessed using two processes: 1) characterization of the ratio of the peak response amplitudes to the excited chamber acoustic mode amplitudes and 2) characterization of the variability of the peak response's frequency over the test duration. This characterization process assists in evaluating the discreteness of a signal as well as the stability of the chamber response. Broadband stability was assessed using a running root-mean-square evaluation. These techniques were also employed, in a comparative analysis, on available Fastrac data, and those results are presented here.
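
    A minimal sketch of the statistical recovery-time idea described above is given below; the percentile choice, hold time, and interface are illustrative assumptions, not the MSFC implementation.

```python
# Illustrative sketch: characterize the pre-bomb band-passed pressure amplitude,
# form a 95% coverage bound from it, and report how long after the bomb the
# signal takes to return within that bound and stay there.
import numpy as np

def recovery_time(amplitude, time, t_bomb, hold=0.050):
    """amplitude: filtered dynamic-pressure amplitude envelope for one mode,
    time: matching time vector (s), t_bomb: bomb detonation time (s),
    hold: how long levels must stay within bounds to count as recovered (s)."""
    pre = amplitude[time < t_bomb]
    upper = np.percentile(pre, 95.0)          # one-sided 95% coverage bound from pre-bomb data
    post = time >= t_bomb
    inside = amplitude <= upper
    # first post-bomb instant after which the signal stays inside the bound for `hold` seconds
    for i in np.where(post & inside)[0]:
        window = (time >= time[i]) & (time <= time[i] + hold)
        if inside[window].all():
            return time[i] - t_bomb
    return np.nan
```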

  16. Distributed Circuit Plasticity: New Clues for the Cerebellar Mechanisms of Learning.

    PubMed

    D'Angelo, Egidio; Mapelli, Lisa; Casellato, Claudia; Garrido, Jesus A; Luque, Niceto; Monaco, Jessica; Prestori, Francesca; Pedrocchi, Alessandra; Ros, Eduardo

    2016-04-01

    The cerebellum is involved in learning and memory of sensory motor skills. However, the way this process takes place in local microcircuits is still unclear. The initial proposal, cast into the Motor Learning Theory, suggested that learning had to occur at the parallel fiber-Purkinje cell synapse under the supervision of climbing fibers. However, the uniqueness of this mechanism has been questioned, and multiple forms of long-term plasticity have been revealed at various locations in the cerebellar circuit, including synapses and neurons in the granular layer, molecular layer and deep cerebellar nuclei. At present, more than 15 forms of plasticity have been reported. There has been a long debate on which form of plasticity is most relevant to specific aspects of learning, but this question turned out to be hard to answer using physiological analysis alone. Recent experiments and models making use of closed-loop robotic simulations are revealing a radically new view: one single form of plasticity is insufficient, while altogether, the different forms of plasticity can explain the multiplicity of properties characterizing cerebellar learning. These include multi-rate acquisition and extinction, reversibility, self-scalability, and generalization. Moreover, when the circuit embeds multiple forms of plasticity, it can easily cope with multiple behaviors, therefore endowing the cerebellum with the properties needed to operate as an effective generalized forward controller.

  17. Information Commons for Rice (IC4R)

    PubMed Central

    2016-01-01

    Rice is the most important staple food for a large part of the world's human population and also a key model organism for plant research. Here, we present Information Commons for Rice (IC4R; http://ic4r.org), a rice knowledgebase featuring an extensible and sustainable architecture that integrates multiple omics data through community-contributed modules. Each module is developed and maintained by different committed groups, deals with data collection, processing and visualization, and delivers data on-demand via web services. In the current version, IC4R incorporates a variety of rice data through multiple committed modules, including genome-wide expression profiles derived entirely from RNA-Seq data, genomic variations obtained from re-sequencing data of thousands of rice varieties, plant homologous genes covering multiple diverse plant species, post-translational modifications, rice-related literature and gene annotations contributed by the rice research community. Unlike extant related databases, IC4R is designed for scalability and sustainability and thus also features collaborative integration of rice data and low costs for database update and maintenance. Future directions of IC4R include incorporation of other omics data and association of multiple omics data with agronomically important traits, with the aim of building IC4R into a valuable knowledgebase for both basic and translational research in rice. PMID:26519466

  18. Multisensor data fusion across time and space

    NASA Astrophysics Data System (ADS)

    Villeneuve, Pierre V.; Beaven, Scott G.; Reed, Robert A.

    2014-06-01

    Field measurement campaigns typically deploy numerous sensors having different sampling characteristics in the spatial, temporal, and spectral domains. Data analysis and exploitation are made more difficult and time consuming as the sample data grids between sensors do not align. This report summarizes our recent effort to demonstrate the feasibility of a processing chain capable of "fusing" image data from multiple independent and asynchronous sensors into a form amenable to analysis and exploitation using commercially-available tools. Two important technical issues were addressed in this work: 1) image spatial registration onto a common pixel grid, and 2) image temporal interpolation onto a common time base. The first step leverages existing image matching and registration algorithms. The second step relies upon a new and innovative use of optical flow algorithms to perform accurate temporal upsampling of slower frame rate imagery. Optical flow field vectors were first derived from high-frame-rate, high-resolution imagery, and then used as a basis for temporal upsampling of the slower frame rate sensor's imagery. Optical flow field values are computed using a multi-scale image pyramid, thus allowing for more extreme object motion. This involves preprocessing imagery to varying resolution scales and initializing new vector flow estimates using those from the previous coarser-resolution image. Overall performance of this processing chain is demonstrated using sample data involving complex motion observed by multiple sensors mounted to the same base. Multiple sensors were used, ranging from a high-speed visible camera to a coarser-resolution LWIR camera.
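
    The temporal-upsampling step can be illustrated with a short sketch (an assumed pipeline using OpenCV's Farneback flow, not the authors' code): dense flow is estimated on the fast sensor's imagery and then used to warp a slow sensor's frame to an intermediate time.

```python
# Sketch of optical-flow-based temporal interpolation: estimate dense flow on
# co-registered high-frame-rate imagery and warp a slow-sensor frame to a
# fractional time alpha between two of its frames.
import cv2
import numpy as np

def interpolate_frame(frame0, fast0_gray, fast1_gray, alpha=0.5):
    # Dense Farneback flow computed on the high-frame-rate imagery
    flow = cv2.calcOpticalFlowFarneback(fast0_gray, fast1_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = fast0_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                                 np.arange(h, dtype=np.float32))
    # Backward-mapping approximation: sample frame0 at positions displaced by -alpha*flow
    map_x = grid_x - alpha * flow[..., 0]
    map_y = grid_y - alpha * flow[..., 1]
    return cv2.remap(frame0, map_x, map_y, cv2.INTER_LINEAR)
```

    The backward-mapping step is only an approximation of true forward warping, but it conveys the basic idea of reusing fast-sensor flow fields to upsample a slower sensor's frames onto a common time base.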

  19. Properties of heuristic search strategies

    NASA Technical Reports Server (NTRS)

    Vanderbrug, G. J.

    1973-01-01

    A directed graph is used to model the search space of a state space representation with single input operators, an AND/OR graph is used for problem reduction representations, and a theorem proving graph is used for state space representations with multiple input operators. These three graph models and heuristic strategies for searching them are surveyed. The completeness, admissibility, and optimality properties of search strategies which use the evaluation function f = (1 - omega)*g + omega*h are presented and interpreted using a representation of the search process in the plane. The use of multiple output operators to imply dependent successors, and thus obtain a formalism which includes all three types of representations, is discussed.
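
    A compact sketch of best-first search under this weighted evaluation function follows (the graph interface and weight are illustrative; this is not code from the report). Setting omega = 0 gives uniform-cost search, omega = 1 gives greedy best-first search, and omega = 0.5 orders nodes the same way as A*.

```python
# Best-first search with the weighted evaluation f = (1 - w)*g + w*h.
import heapq

def weighted_search(start, goal, neighbors, h, w=0.5):
    """neighbors(n) -> iterable of (successor, edge_cost); h(n) -> heuristic estimate."""
    frontier = [((1 - w) * 0 + w * h(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for succ, cost in neighbors(node):
            g2 = g + cost
            if g2 < best_g.get(succ, float("inf")):
                best_g[succ] = g2
                heapq.heappush(frontier,
                               ((1 - w) * g2 + w * h(succ), g2, succ, path + [succ]))
    return None, float("inf")
```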

  20. Multiple sclerosis pathogenesis: missing pieces of an old puzzle.

    PubMed

    Rahmanzadeh, Reza; Brück, Wolfgang; Minagar, Alireza; Sahraian, Mohammad Ali

    2018-06-08

    Traditionally, multiple sclerosis (MS) was considered to be a CD4 T cell-mediated CNS autoimmune disease, compatible with the experimental autoimmune encephalitis model, and characterized by focal lesions in the white matter. However, studies of recent decades have revealed several missing pieces of the MS puzzle and showed that MS pathogenesis is more complex than the traditional view and may include the following: a primary degenerative process (e.g. oligodendroglial pathology), generalized abnormality of normal-appearing brain tissue, pronounced gray matter pathology, involvement of innate immunity, and CD8 T cells and B cells. Here, we review these findings and discuss their implications in MS pathogenesis.

  1. Acoustical Detection Of Leakage In A Combustor

    NASA Technical Reports Server (NTRS)

    Puster, Richard L.; Petty, Jeffrey L.

    1993-01-01

    Abnormal combustion excites a characteristic standing wave. An acoustical leak-detection system gives early warning of failure, enabling operating personnel to stop the combustion process and repair the spray bar before a leak grows large enough to cause damage. Applicable to engines, gas turbines, furnaces, and other machines in which acoustic emissions at known frequencies signify onset of damage. Bearings in rotating machines can be monitored for the emergence of characteristic frequencies shown in previous tests to be associated with incipient failure. Also possible to monitor for signs of trouble at multiple frequencies by feeding the output of the transducer simultaneously to multiple band-pass filters and associated circuitry, including a separate trigger circuit set to an appropriate level for each frequency.
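
    The multi-frequency monitoring scheme can be sketched in software as a simple filter bank; the band edges, filter order, and trip levels below are placeholders, not values from the cited work.

```python
# Feed one transducer signal through several band-pass filters and raise an
# alarm when the RMS level in any band exceeds its preset threshold.
import numpy as np
from scipy.signal import butter, sosfilt

def band_monitor(x, fs, bands, thresholds):
    """x: transducer samples, fs: sample rate (Hz),
    bands: list of (f_low, f_high) in Hz, thresholds: per-band RMS trip levels."""
    alarms = []
    for (f_lo, f_hi), trip in zip(bands, thresholds):
        sos = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs, output="sos")
        y = sosfilt(sos, x)
        rms = np.sqrt(np.mean(y ** 2))
        alarms.append(rms > trip)
    return alarms
```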

  2. Thermo-elasto-plastic simulations of femtosecond laser-induced multiple-cavity in fused silica

    NASA Astrophysics Data System (ADS)

    Beuton, R.; Chimier, B.; Breil, J.; Hébert, D.; Mishchik, K.; Lopez, J.; Maire, P. H.; Duchateau, G.

    2018-04-01

    The formation and interaction of multiple cavities induced by tightly focused femtosecond laser pulses are studied using a newly developed numerical tool that includes the thermo-elasto-plastic material response. Simulations are performed in fused silica for cases of one, two, and four spots of laser energy deposition. The relaxation of the heated matter, which launches shock waves into the surrounding cold material, leads to cavity formation and the emergence of areas where cracks may be induced. Results show that the shape of the laser-induced structure depends on the energy-deposition configuration and demonstrate the potential of the numerical tool for designing desired structures and the associated technological processes.

  3. Speckle imaging through turbulent atmosphere based on adaptable pupil segmentation

    NASA Astrophysics Data System (ADS)

    Loktev, Mikhail; Soloviev, Oleg; Savenko, Svyatoslav; Vdovin, Gleb

    2011-07-01

    We report on the first results to our knowledge obtained with adaptable multiaperture imaging through turbulence on a horizontal atmospheric path. We show that the resolution can be improved by adaptively matching the size of the subaperture to the characteristic size of the turbulence. Further improvement is achieved by the deconvolution of a number of subimages registered simultaneously through multiple subapertures. Different implementations of multiaperture geometry, including pupil multiplication, pupil image sampling, and a plenoptic telescope, are considered. Resolution improvement has been demonstrated on a ˜550m horizontal turbulent path, using a combination of aperture sampling, speckle image processing, and, optionally, frame selection.

  4. Floating point only SIMD instruction set architecture including compare, select, Boolean, and alignment operations

    DOEpatents

    Gschwind, Michael K [Chappaqua, NY

    2011-03-01

    Mechanisms for implementing a floating point only single instruction multiple data instruction set architecture are provided. A processor is provided that comprises an issue unit, an execution unit coupled to the issue unit, and a vector register file coupled to the execution unit. The execution unit has logic that implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA). The floating point vector registers of the vector register file store both scalar and floating point values as vectors having a plurality of vector elements. The processor may be part of a data processing system.

  5. Smith predictor-based multiple periodic disturbance compensation for long dead-time processes

    NASA Astrophysics Data System (ADS)

    Tan, Fang; Li, Han-Xiong; Shen, Ping

    2018-05-01

    Many disturbance rejection methods have been proposed for processes with dead-time, but these existing methods may not work well under multiple periodic disturbances. In this paper, a multiple periodic disturbance rejection scheme is proposed under the Smith predictor configuration for processes with long dead-time. One feedback loop is added to compensate for periodic disturbances while retaining the advantage of the Smith predictor. With information on the disturbance spectrum, the added feedback loop can remove multiple periodic disturbances effectively. Robust stability can be easily maintained, as shown through rigorous analysis. Finally, simulation examples demonstrate the effectiveness and robustness of the proposed method for processes with long dead-time.

  6. Multiphase flow microfluidics for the production of single or multiple emulsions for drug delivery.

    PubMed

    Zhao, Chun-Xia

    2013-11-01

    Considerable effort has been directed towards developing novel drug delivery systems. Microfluidics, capable of generating monodisperse single and multiple emulsion droplets and executing precise control and operations on these droplets, is a powerful tool for fabricating complex systems (microparticles, microcapsules, microgels) with uniform size, narrow size distribution and desired properties, which have great potential in drug delivery applications. This review presents an overview of the state-of-the-art multiphase flow microfluidics for the production of single emulsions or multiple emulsions for drug delivery. The review starts with a brief introduction of the approaches for making single and multiple emulsions, followed by presentation of some potential drug delivery systems (microparticles, microcapsules and microgels) fabricated in microfluidic devices using single or multiple emulsions as templates. The design principles, manufacturing processes and properties of these drug delivery systems are also discussed and compared. Furthermore, drug encapsulation and drug release (including passive and active controlled release) are reviewed and compared, highlighting some key findings and insights. Finally, site-targeting delivery using multiphase flow microfluidics is also briefly introduced. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. “The Relationship between Executive Functioning, Processing Speed and White Matter Integrity in Multiple Sclerosis”

    PubMed Central

    Genova, Helen M.; DeLuca, John; Chiaravalloti, Nancy; Wylie, Glenn

    2014-01-01

    The primary purpose of the current study was to examine the relationship between performance on executive tasks and white matter integrity, assessed by diffusion tensor imaging (DTI) in Multiple Sclerosis (MS). A second aim was to examine how processing speed affects the relationship between executive functioning and FA. This relationship was examined in two executive tasks that rely heavily on processing speed: the Color-Word Interference Test and Trail-Making Test (Delis-Kaplan Executive Function System). It was hypothesized that reduced fractional anisotropy (FA) is related to poor performance on executive tasks in MS, but that this relationship would be affected by the statistical correction of processing speed from the executive tasks. 15 healthy controls and 25 persons with MS participated. Regression analyses were used to examine the relationship between executive functioning and FA, both before and after processing speed was removed from the executive scores. Before processing speed was removed from the executive scores, reduced FA was associated with poor performance on Color-Word Interference Test and Trail-Making Test in a diffuse network including corpus callosum and superior longitudinal fasciculus. However, once processing speed was removed, the relationship between executive functions and FA was no longer significant on the Trail Making test, and significantly reduced and more localized on the Color-Word Interference Test. PMID:23777468

  8. Hierarchical Robot Control System and Method for Controlling Select Degrees of Freedom of an Object Using Multiple Manipulators

    NASA Technical Reports Server (NTRS)

    Platt, Robert (Inventor); Wampler, II, Charles W. (Inventor); Abdallah, Muhammad E. (Inventor)

    2013-01-01

    A robotic system includes a robot having manipulators for grasping an object using one of a plurality of grasp types during a primary task, and a controller. The controller controls the manipulators during the primary task using a multiple-task control hierarchy, and automatically parameterizes the internal forces of the system for each grasp type in response to an input signal. The primary task is defined at an object-level of control, e.g., using a closed-chain transformation, such that only select degrees of freedom are commanded for the object. A control system for the robotic system has a host machine and an algorithm for controlling the manipulators using the above hierarchy. A method for controlling the system includes receiving and processing the input signal using the host machine, including defining the primary task at the object-level of control, e.g., using a closed-chain definition, and parameterizing the internal forces for each grasp type.

  9. Safeguarding the provision of ecosystem services in catchment systems.

    PubMed

    Everard, Mark

    2013-04-01

    A narrow technocentric focus on a few favored ecosystem services (generally provisioning services) has led to ecosystem degradation globally, including catchment systems and their capacities to support human well-being. Increasing recognition of the multiple benefits provided by ecosystems is slowly being translated into policy and some areas of practice, although there remains a significant shortfall in the incorporation of a systemic perspective into operational management and decision-making tools. Nevertheless, a range of ecosystem-based solutions to issues as diverse as flooding and green space provision in the urban environment offers hope for improving habitats and optimizing beneficial services. The value of catchment ecosystem processes and their associated services is also being increasingly recognized and internalized by the water industry, improving water quality and quantity through catchment land management rather than at greater expense in the treatment costs of contaminated water abstracted lower in catchments. Parallel recognition of the value of working with natural processes, rather than "defending" built assets when catchment hydrology is adversely affected by unsympathetic upstream development, is being progressively incorporated into flood risk management policy. This focus on wider catchment processes also yields a range of cobenefits for fishery, wildlife, amenity, flood risk, and other interests, which may be optimized if multiple stakeholders and their diverse value systems are included in decision-making processes. Ecosystem services, particularly implemented as a central element of the ecosystem approach, provide an integrated framework for building in these different perspectives and values, many of them formerly excluded, into commercial and resource management decision-making processes, thereby making tractable the integrative aspirations of sustainable development. This can help redress deeply entrenched inherited assumptions, habits, and vested interests, replacing them in many management situations with wider recognition of the multiple values of ecosystems and their services. Global interest in taking an ecosystem approach is promoting novel scientific and policy thinking, yet there is a shortfall in its translation into practical management tools. Professional associations may have key roles to play in breaking down barriers to the "mainstreaming" of systemic perspectives into common practice, particularly through joining up the different sectors of society essential to their implementation and ongoing adaptive management. Copyright © 2012 SETAC.

  10. Upscaling of U(VI) Desorption and Transport from Decimeter-Scale Heterogeneity to Plume-Scale Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Gary P.; Kohler, Matthias; Kannappan, Ramakrishnan

    2015-02-24

    Scientifically defensible predictions of field scale U(VI) transport in groundwater require an understanding of key processes at multiple scales. These scales range from smaller than the sediment grain scale (less than 10 μm) to as large as the field scale, which can extend over several kilometers. The key processes that need to be considered include both geochemical reactions in solution and at sediment surfaces as well as physical transport processes including advection, dispersion, and pore-scale diffusion. The research summarized in this report includes both experimental and modeling results in batch, column and tracer tests. The objectives of this research were to: (1) quantify the rates of U(VI) desorption from sediments acquired from a uranium-contaminated aquifer in batch experiments; (2) quantify rates of U(VI) desorption in column experiments with variable chemical conditions; and (3) quantify nonreactive tracer and U(VI) transport in field tests.
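
    As a toy illustration of quantifying a desorption rate from batch data (the kinetic form, data values, and units below are invented for the example and are not results from this project):

```python
# Fit a simple first-order desorption model C(t) = C_eq * (1 - exp(-k*t)) to
# synthetic batch data with nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c_eq, k):
    return c_eq * (1.0 - np.exp(-k * t))

t_obs = np.array([0.5, 1, 2, 4, 8, 16, 32])             # hours (hypothetical)
c_obs = np.array([0.8, 1.4, 2.3, 3.3, 4.1, 4.5, 4.6])   # dissolved U(VI), umol/L (hypothetical)

(c_eq, k), _ = curve_fit(first_order, t_obs, c_obs, p0=[5.0, 0.1])
print(f"fitted equilibrium concentration {c_eq:.2f} umol/L, rate constant {k:.3f} 1/h")
```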

  11. Experimental realization of entanglement in multiple degrees of freedom between two quantum memories

    PubMed Central

    Zhang, Wei; Ding, Dong-Sheng; Dong, Ming-Xin; Shi, Shuai; Wang, Kai; Liu, Shi-Long; Li, Yan; Zhou, Zhi-Yuan; Shi, Bao-Sen; Guo, Guang-Can

    2016-01-01

    Entanglement in multiple degrees of freedom has many benefits over entanglement in a single one. The former enables quantum communication with higher channel capacity and more efficient quantum information processing and is compatible with diverse quantum networks. Establishing multi-degree-of-freedom entangled memories is not only vital for high-capacity quantum communication and computing, but also promising for enhanced violations of nonlocality in quantum systems. However, there have been yet no reports of the experimental realization of multi-degree-of-freedom entangled memories. Here we experimentally established hyper- and hybrid entanglement in multiple degrees of freedom, including path (K-vector) and orbital angular momentum, between two separated atomic ensembles by using quantum storage. The results are promising for achieving quantum communication and computing with many degrees of freedom. PMID:27841274

  12. Multiple Exciton Generation in Colloidal Nanocrystals

    PubMed Central

    Smith, Charles; Binks, David

    2013-01-01

    In a conventional solar cell, the energy of an absorbed photon in excess of the band gap is rapidly lost as heat, and this is one of the main reasons that the theoretical efficiency is limited to ~33%. However, an alternative process, multiple exciton generation (MEG), can occur in colloidal quantum dots. Here, some or all of the excess energy is instead used to promote one or more additional electrons to the conduction band, potentially increasing the photocurrent of a solar cell and thereby its output efficiency. This review will describe the development of this field over the decade since the first experimental demonstration of multiple exciton generation, including the controversies over experimental artefacts, comparison with similar effects in bulk materials, and the underlying mechanisms. We will also describe the current state-of-the-art and outline promising directions for further development. PMID:28348283

  13. A practical approach to pancreatic cancer immunotherapy using resected tumor lysate vaccines processed to express α-gal epitopes

    PubMed Central

    Miyoshi, Eiji; Eguchi, Hidetoshi; Nagano, Hiroaki; Matsunami, Katsuyoshi; Nagaoka, Satoshi; Yamada, Daisaku; Asaoka, Tadafumi; Noda, Takehiro; Wada, Hiroshi; Kawamoto, Koichi; Goto, Kunihito; Taniyama, Kiyomi; Mori, Masaki; Doki, Yuichiro

    2017-01-01

    Objectives: Single-agent immunotherapy is ineffective against poorly immunogenic cancers, including pancreatic ductal adenocarcinoma (PDAC). The aims of this study were to demonstrate the feasibility of production of novel autologous tumor lysate vaccines from resected PDAC tumors, and verify vaccine safety and efficacy. Methods: Fresh surgically resected tumors obtained from human patients were processed to enzymatically synthesize α-gal epitopes on the carbohydrate chains of membrane glycoproteins. Processed membranes were analyzed for the expression of α-gal epitopes and the binding of anti-Gal, and vaccine efficacy was assessed in vitro and in vivo. Results: Effective synthesis of α-gal epitopes was demonstrated after processing of PDAC tumor lysates from 10 different patients, and tumor lysates readily bound an anti-Gal monoclonal antibody. α-gal(+) PDAC tumor lysate vaccines elicited strong antibody production against multiple tumor-associated antigens and activated multiple tumor-specific T cells. The lysate vaccines stimulated a robust immune response in animal models, resulting in tumor suppression and a significant improvement in survival without any adverse events. Conclusions: Our data suggest that α-gal(+) PDAC tumor lysate vaccination may be a practical and effective new immunotherapeutic approach for treating pancreatic cancer. PMID:29077749

  14. Role Clarification Processes for Better Integration of Nurse Practitioners into Primary Healthcare Teams: A Multiple-Case Study

    PubMed Central

    D'Amour, Danielle; Contandriopoulos, Damien; Chouinard, Véronique; Dubois, Carl-Ardy

    2014-01-01

    Role clarity is a crucial issue for effective interprofessional collaboration. Poorly defined roles can become a source of conflict in clinical teams and reduce the effectiveness of care and services delivered to the population. Our objective in this paper is to outline processes for clarifying professional roles when a new role is introduced into clinical teams, that of the primary healthcare nurse practitioner (PHCNP). To support our empirical analysis we used the Canadian National Interprofessional Competency Framework, which defines the essential components for role clarification among professionals. A qualitative multiple-case study was conducted on six cases in which the PHCNP role was introduced into primary care teams. Data collection included 34 semistructured interviews with key informants involved in the implementation of the PHCNP role. Our results revealed that the best performing primary care teams were those that used a variety of organizational and individual strategies to carry out role clarification processes. From this study, we conclude that role clarification is both an organizational process to be developed and a competency that each member of the primary care team must mobilize to ensure effective interprofessional collaboration. PMID:25692039

  15. Role clarification processes for better integration of nurse practitioners into primary healthcare teams: a multiple-case study.

    PubMed

    Brault, Isabelle; Kilpatrick, Kelley; D'Amour, Danielle; Contandriopoulos, Damien; Chouinard, Véronique; Dubois, Carl-Ardy; Perroux, Mélanie; Beaulieu, Marie-Dominique

    2014-01-01

    Role clarity is a crucial issue for effective interprofessional collaboration. Poorly defined roles can become a source of conflict in clinical teams and reduce the effectiveness of care and services delivered to the population. Our objective in this paper is to outline processes for clarifying professional roles when a new role is introduced into clinical teams, that of the primary healthcare nurse practitioner (PHCNP). To support our empirical analysis we used the Canadian National Interprofessional Competency Framework, which defines the essential components for role clarification among professionals. A qualitative multiple-case study was conducted on six cases in which the PHCNP role was introduced into primary care teams. Data collection included 34 semistructured interviews with key informants involved in the implementation of the PHCNP role. Our results revealed that the best performing primary care teams were those that used a variety of organizational and individual strategies to carry out role clarification processes. From this study, we conclude that role clarification is both an organizational process to be developed and a competency that each member of the primary care team must mobilize to ensure effective interprofessional collaboration.

  16. A practical approach to pancreatic cancer immunotherapy using resected tumor lysate vaccines processed to express α-gal epitopes.

    PubMed

    Furukawa, Kenta; Tanemura, Masahiro; Miyoshi, Eiji; Eguchi, Hidetoshi; Nagano, Hiroaki; Matsunami, Katsuyoshi; Nagaoka, Satoshi; Yamada, Daisaku; Asaoka, Tadafumi; Noda, Takehiro; Wada, Hiroshi; Kawamoto, Koichi; Goto, Kunihito; Taniyama, Kiyomi; Mori, Masaki; Doki, Yuichiro

    2017-01-01

    Single-agent immunotherapy is ineffective against poorly immunogenic cancers, including pancreatic ductal adenocarcinoma (PDAC). The aims of this study were to demonstrate the feasibility of production of novel autologous tumor lysate vaccines from resected PDAC tumors, and verify vaccine safety and efficacy. Fresh surgically resected tumors obtained from human patients were processed to enzymatically synthesize α-gal epitopes on the carbohydrate chains of membrane glycoproteins. Processed membranes were analyzed for the expression of α-gal epitopes and the binding of anti-Gal, and vaccine efficacy was assessed in vitro and in vivo. Effective synthesis of α-gal epitopes was demonstrated after processing of PDAC tumor lysates from 10 different patients, and tumor lysates readily bound an anti-Gal monoclonal antibody. α-gal(+) PDAC tumor lysate vaccines elicited strong antibody production against multiple tumor-associated antigens and activated multiple tumor-specific T cells. The lysate vaccines stimulated a robust immune response in animal models, resulting in tumor suppression and a significant improvement in survival without any adverse events. Our data suggest that α-gal(+) PDAC tumor lysate vaccination may be a practical and effective new immunotherapeutic approach for treating pancreatic cancer.

  17. Emotional pictures and sounds: a review of multimodal interactions of emotion cues in multiple domains

    PubMed Central

    Gerdes, Antje B. M.; Wieser, Matthias J.; Alpers, Georg W.

    2014-01-01

    In everyday life, multiple sensory channels jointly trigger emotional experiences and one channel may alter processing in another channel. For example, seeing an emotional facial expression and hearing the voice's emotional tone will jointly create the emotional experience. This example, where auditory and visual input is related to social communication, has gained considerable attention from researchers. However, interactions of visual and auditory emotional information are not limited to social communication but can extend to much broader contexts including human, animal, and environmental cues. In this article, we review current research on audiovisual emotion processing beyond face-voice stimuli to develop a broader perspective on multimodal interactions in emotion processing. We argue that current concepts of multimodality should be extended to consider an ecologically valid variety of stimuli in audiovisual emotion processing. Therefore, we provide an overview of studies in which emotional sounds and interactions with complex pictures of scenes were investigated. In addition to behavioral studies, we focus on neuroimaging, electrophysiological, and peripheral-physiological findings. Furthermore, we integrate these findings and identify similarities or differences. We conclude with suggestions for future research. PMID:25520679

  18. Central Office Supports for Data-Driven Talent Management Decisions: Evidence from the Implementation of New Systems for Measuring Teacher Effectiveness

    ERIC Educational Resources Information Center

    Grissom, Jason A.; Rubin, Mollie; Neumerski, Christine M.; Cannata, Marisa; Drake, Timothy A.; Goldring, Ellen; Schuermann, Patrick

    2017-01-01

    School districts increasingly push school leaders to utilize multiple measures of teacher effectiveness, such as observation ratings or value-added scores, in making talent management decisions, including teacher hiring, assignment, support, and retention, but we know little about the local conditions that promote or impede these processes. We…

  19. Developmental and Cognitive Perspectives on Humans' Sense of the Times of Past and Future Events

    ERIC Educational Resources Information Center

    Friedman, W.J.

    2005-01-01

    Mental time travel in human adults includes a sense of when past events occurred and future events are expected to occur. Studies with adults and children reveal that a number of distinct psychological processes contribute to a temporally differentiated sense of the past and future. Adults possess representations of multiple time patterns, and…

  20. Workshop Report: Joint Requirements. Oversight Council Process.

    DTIC Science & Technology

    1996-02-28

    provides media for professional exchange and peer criticism among students, theoreticians, practitioners, and users of military operations research. These... exchange of ideas and methods... involvement in the annual Joint Warfare Interoperability Demonstrations (JWID)... Subsequent efforts could include multiple forums for exchange of ideas at the working level, but studies and analysis opportunities as well... clear, visible relations between the JWCAs need to...

  1. Role of Working Memory and Strategy-Use in Feedback Effects on Children's Progression in Analogy Solving:an Explanatory Item Response Theory Account

    ERIC Educational Resources Information Center

    Stevenson, Claire E.

    2017-01-01

    This study contrasted the effects of tutoring, multiple try and no feedback on children's progression in analogy solving and examined individual differences herein. Feedback that includes additional hints or explanations leads to the greatest learning gains in adults. However, children process feedback differently from adults and effective…

  2. Developing Students' Creative Self-Efficacy Based on Design-Thinking: Evaluation of an Elective University Course

    ERIC Educational Resources Information Center

    Ohly, Sandra; Plückthun, Laura; Kissel, Dorothea

    2017-01-01

    The development of novel and useful ideas is a process that can be described in multiple steps, including information gathering, generating ideas and evaluating ideas. We evaluated a university course that was developed based on design thinking principles which employ similar steps. Our results show that the course was not effective in enhancing…

  3. Listening to Early Career Teachers: How Can Elementary Mathematics Methods Courses Better Prepare Them to Utilize Standards-Based Practices in Their Classrooms?

    ERIC Educational Resources Information Center

    Coester, Lee Anne

    2010-01-01

    This study was designed to gather input from early career elementary teachers with the goal of finding ways to improve elementary mathematics methods courses. Multiple areas were explored including the degree to which respondents' elementary mathematics methods course focused on the NCTM Process Standards, the teachers' current standards-based…

  4. Workforce Skills and Innovation: An Overview of Major Themes in the Literature. OECD Education Working Papers, No. 55

    ERIC Educational Resources Information Center

    Toner, Phillip

    2011-01-01

    This paper provides an account of the main approaches, debates and evidence in the literature on the role of workforce skills in the innovation process in developed economies. It draws on multiple sources including the innovation studies discipline, neoclassical Human Capital theory, institutionalist labour market studies and the work organisation…

  5. Population genetic structure of Bromus tectorum in the mountains of western North America

    Treesearch

    Spencer Arnesen; Craig E. Coleman; Susan E. Meyer

    2017-01-01

    PREMISE OF THE STUDY: Invasive species are often initially restricted to a narrow range and may then expand through any of multiple mechanisms including phenotypic plasticity, in situ evolution, or selection on traits preadapted for new habitats. Our study used population genetics to explore possible processes by which the highly selfing invasive annual grass Bromus...

  6. The Effects of Modified Games on the Development of Gross Motor Skill in Preschoolers

    ERIC Educational Resources Information Center

    Lestari, Indah; Ratnaningsih, Tri

    2016-01-01

    Gross motor skills in children must be developed early, since they play an important role not only in children's social interaction but also in supporting multiple other areas of development. One means of developing a child's motor skills is to provide innovative games, i.e., modified games involving changes to game format, game timing, and game sequence. The…

  7. Root Cause Investigation of Lead-Free Solder Joint Interfacial Failures After Multiple Reflows

    NASA Astrophysics Data System (ADS)

    Li, Yan; Hatch, Olen; Liu, Pilin; Goyal, Deepak

    2017-03-01

    Solder joint interconnects in three-dimensional (3D) packages with package stacking configurations typically must undergo multiple reflow cycles during the assembly process. In this work, interfacial open joint failures between the bulk solder and the intermetallic compound (IMC) layer were found in Sn-Ag-Cu (SAC) solder joints connecting a small package to a large package after multiple reflow reliability tests. Systematic progressive 3D x-ray computed tomography experiments were performed on both incoming and assembled parts to reveal the initiation and evolution of the open failures in the same solder joints before and after the reliability tests. Characterization studies, including focused ion beam cross-sections, scanning electron microscopy, and energy-dispersive x-ray spectroscopy, were conducted to determine the correlation between IMC phase transformation and failure initiation in the solder joints. A comprehensive failure mechanism, along with solution paths for the solder joint interfacial failures after multiple reflow cycles, is discussed in detail.

  8. A health impact assessment of proposed public transportation service cuts and fare increases in Boston, Massachusetts (U.S.A.).

    PubMed

    James, Peter; Ito, Kate; Buonocore, Jonathan J; Levy, Jonathan I; Arcaya, Mariana C

    2014-08-07

    Transportation decisions have health consequences that are often not incorporated into policy-making processes. Health Impact Assessment (HIA) is a process that can be used to evaluate health effects of transportation policy. We present a rapid HIA, conducted over eight weeks, evaluating health and economic effects of proposed fare increases and service cuts to Boston, Massachusetts' public transportation system. We used transportation modeling in concert with tools allowing for quantification and monetization of multiple pathways. We estimated health and economic costs of proposed public transportation system changes to be hundreds of millions of dollars per year, exceeding the budget gap the public transportation authority was required to close. Significant health pathways included crashes, air pollution, and physical activity. The HIA enabled stakeholders to advocate for more modest fare increases and service cuts, which were eventually adopted by decision makers. This HIA was among the first to quantify and monetize multiple pathways linking transportation decisions with health and economic outcomes, using approaches that could be applied in different settings. Including health costs in transportation decisions can lead to policy choices with both economic and public health benefits.

  9. A Health Impact Assessment of Proposed Public Transportation Service Cuts and Fare Increases in Boston, Massachusetts (U.S.A.)

    PubMed Central

    James, Peter; Ito, Kate; Buonocore, Jonathan J.; Levy, Jonathan I.; Arcaya, Mariana C.

    2014-01-01

    Transportation decisions have health consequences that are often not incorporated into policy-making processes. Health Impact Assessment (HIA) is a process that can be used to evaluate health effects of transportation policy. We present a rapid HIA, conducted over eight weeks, evaluating health and economic effects of proposed fare increases and service cuts to Boston, Massachusetts’ public transportation system. We used transportation modeling in concert with tools allowing for quantification and monetization of multiple pathways. We estimated health and economic costs of proposed public transportation system changes to be hundreds of millions of dollars per year, exceeding the budget gap the public transportation authority was required to close. Significant health pathways included crashes, air pollution, and physical activity. The HIA enabled stakeholders to advocate for more modest fare increases and service cuts, which were eventually adopted by decision makers. This HIA was among the first to quantify and monetize multiple pathways linking transportation decisions with health and economic outcomes, using approaches that could be applied in different settings. Including health costs in transportation decisions can lead to policy choices with both economic and public health benefits. PMID:25105550

  10. Fluence-field modulated x-ray CT using multiple aperture devices

    NASA Astrophysics Data System (ADS)

    Stayman, J. Webster; Mathews, Aswin; Zbijewski, Wojciech; Gang, Grace; Siewerdsen, Jeffrey; Kawamoto, Satomi; Blevis, Ira; Levinson, Reuven

    2016-03-01

    We introduce a novel strategy for fluence field modulation (FFM) in x-ray CT using multiple aperture devices (MADs). MAD filters permit FFM by blocking or transmitting the x-ray beam on a fine (0.1-1 mm) scale. The filters have a number of potential advantages over other beam modulation strategies, including the potential for a highly compact design, modest actuation speed and acceleration requirements, and spectrally neutral filtration due to their essentially binary action. In this work, we present the underlying MAD filtration concept, including a design process to achieve a specific class of FFM patterns. A set of MAD filters is fabricated using a tungsten laser sintering process and integrated into an x-ray CT test bench. A characterization of the MAD filters is conducted and compared to traditional attenuating bowtie filters, and the ability to flatten the fluence profile for a 32 cm acrylic phantom is demonstrated. MAD-filtered tomographic data were acquired on the CT test bench and reconstructed without artifacts associated with the MAD filter. These initial studies suggest that MAD-based FFM is appropriate for integration in clinical CT systems to create patient-specific fluence field profiles and reduce radiation exposure.
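
    One simple way to picture how a binary open/blocked aperture pattern can approximate a smooth fluence profile is 1-D error diffusion, sketched below; this is only an illustration of the concept and is not the MAD design process used by the authors.

```python
# Approximate a desired 1-D transmission profile with a binary open/blocked
# aperture pattern using error diffusion (the same basic idea as halftoning).
import numpy as np

def binary_aperture(target):
    """target: desired transmission in [0, 1] at each aperture position."""
    pattern = np.zeros_like(target, dtype=int)
    err = 0.0
    for i, t in enumerate(target):
        want = t + err
        pattern[i] = 1 if want >= 0.5 else 0   # 1 = open, 0 = blocked
        err = want - pattern[i]                # carry the quantization error forward
    return pattern

# Example: a bowtie-like profile with higher transmission at the center of the field
x = np.linspace(-1, 1, 32)
profile = 0.3 + 0.7 * np.exp(-2 * x**2)
print(binary_aperture(profile))
```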

  11. A Federated Network for Translational Cancer Research Using Clinical Data and Biospecimens

    PubMed Central

    Becich, Michael J.; Bollag, Roni J.; Chavan, Girish; Corrigan, Julia; Dhir, Rajiv; Feldman, Michael D.; Gaudioso, Carmelo; Legowski, Elizabeth; Maihle, Nita J.; Mitchell, Kevin; Murphy, Monica; Sakthivel, Mayur; Tseytlin, Eugene; Weaver, JoEllen

    2015-01-01

    Advances in cancer research and personalized medicine will require significant new bridging infrastructures, including more robust biorepositories that link human tissue to clinical phenotypes and outcomes. In order to meet that challenge, four cancer centers formed the TIES Cancer Research Network, a federated network that facilitates data and biospecimen sharing among member institutions. Member sites can access pathology data that is de-identified and processed with the TIES natural language processing system, which creates a repository of rich phenotype data linked to clinical biospecimens. TIES incorporates multiple security and privacy best practices that, combined with legal agreements, network policies and procedures, enable regulatory compliance. The TIES Cancer Research Network now provides integrated access to investigators at all member institutions, where multiple investigator-driven pilot projects are underway. Examples of federated search across the network illustrate the potential impact on translational research, particularly for studies involving rare cancers, rare phenotypes, and specific biologic behaviors. The network satisfies several key desiderata including local control of data and credentialing, inclusion of rich phenotype information, and applicability to diverse research objectives. The TIES Cancer Research Network presents a model for a national data and biospecimen network. PMID:26670560

  12. Information processing of earth resources data

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  13. Patient safety - the role of human factors and systems engineering.

    PubMed

    Carayon, Pascale; Wood, Kenneth E

    2010-01-01

    Patient safety is a global challenge that requires knowledge and skills in multiple areas, including human factors and systems engineering. In this chapter, numerous conceptual approaches and methods for analyzing, preventing and mitigating medical errors are described. Given the complexity of healthcare work systems and processes, we emphasize the need for increasing partnerships between the health sciences and human factors and systems engineering to improve patient safety. Those partnerships will be able to develop and implement the system redesigns that are necessary to improve healthcare work systems and processes for patient safety.

  14. Use of Mueller and non-Mueller matrices to describe polarization properties of telescope-based polarimeters

    NASA Astrophysics Data System (ADS)

    Seagraves, P. H.; Elmore, David F.

    1994-09-01

    Systems using optical elements such as linear polarizers, retarders, and mirrors can be represented by Mueller matrices. Some polarimeters include elements with time-varying polarization properties, multiple light beams, light detectors, and signal processing equipment. Standard Mueller matrix forms describing time-varying retarders and beam splitters are presented, as well as non-Mueller matrices which describe detection and signal processing. These matrices provide a compact and intuitive mathematical description of polarimeter response which can aid in refining instrument designs.
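
    As one standard example of such Mueller-matrix forms (a textbook matrix, not one specific to the instrument described), an ideal linear polarizer with a horizontal transmission axis acting on the Stokes vector (I, Q, U, V)ᵀ is represented by

```latex
M = \frac{1}{2}
\begin{pmatrix}
1 & 1 & 0 & 0 \\
1 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{pmatrix}
```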

  15. Pervaporation of phenols

    DOEpatents

    Boddeker, K.W.

    1989-02-21

    Aqueous phenolic solutions are separated by pervaporation to yield a phenol-depleted retentate and a phenol-enriched permeate. The separation effect is enhanced by phase segregation into two immiscible phases, "phenol in water" (approximately 10% phenol) and "water in phenol" (approximately 70% phenol). Membranes capable of enriching phenols by pervaporation include elastomeric polymers and anion exchange membranes, membrane selection and process design being guided by pervaporation performance and chemical stability towards phenolic solutions. Single- and multiple-stage processes are disclosed, both for the enrichment of phenols and for purification of water from phenolic contamination. 8 figs.

  16. Multiple wavelength silicon photonic 200 mm R+D platform for 25Gb/s and above applications

    NASA Astrophysics Data System (ADS)

    Szelag, B.; Blampey, B.; Ferrotti, T.; Reboud, V.; Hassan, K.; Malhouitre, S.; Grand, G.; Fowler, D.; Brision, S.; Bria, T.; Rabillé, G.; Brianceau, P.; Hartmann, J. M.; Hugues, V.; Myko, A.; Elleboode, F.; Gays, F.; Fédéli, J. M.; Kopp, C.

    2016-05-01

    A silicon photonics platform that uses a CMOS foundry line is described. The fabrication process follows a modular integration scheme, which leads to a flexible platform allowing different device combinations. A complete device library is demonstrated for 1310 nm applications with state-of-the-art performance. A PDK that includes specific photonic features and is compatible with commercial EDA tools has been developed, allowing an MPW shuttle service. Finally, platform evolutions such as extending the device offering to 1550 nm and introducing new process modules are presented.

  17. Fiber tractography using machine learning.

    PubMed

    Neher, Peter F; Côté, Marc-Alexandre; Houde, Jean-Christophe; Descoteaux, Maxime; Maier-Hein, Klaus H

    2017-09-01

    We present a fiber tractography approach based on a random forest classification and voting process, guiding each step of the streamline progression by directly processing raw diffusion-weighted signal intensities. For comparison to the state-of-the-art, i.e. tractography pipelines that rely on mathematical modeling, we performed a quantitative and qualitative evaluation with multiple phantom and in vivo experiments, including a comparison to the 96 submissions of the ISMRM tractography challenge 2015. The results demonstrate the vast potential of machine learning for fiber tractography. Copyright © 2017 Elsevier Inc. All rights reserved.
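
    A conceptual sketch of the classification-and-voting idea follows; the feature layout, number of candidate directions, and training data below are placeholders, and the actual method is the one described in the cited paper.

```python
# A random forest maps raw diffusion-weighted intensities sampled around the
# current streamline position to one of a discrete set of candidate step
# directions; class probabilities act as votes for the next step.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

n_dirs = 20                                   # discretized candidate step directions
directions = np.random.randn(n_dirs, 3)
directions /= np.linalg.norm(directions, axis=1, keepdims=True)

# Synthetic training data: raw DWI intensities in a small neighborhood -> direction label
X_train = np.random.rand(500, 7 * 64)         # e.g. 7 voxels x 64 gradient directions
y_train = np.tile(np.arange(n_dirs), 25)      # ensures every direction class is present

forest = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

def next_step(features, prev_dir, step=0.5):
    votes = forest.predict_proba(features.reshape(1, -1))[0]
    votes[directions @ prev_dir < 0] = 0      # forbid reversing along the streamline
    return step * directions[np.argmax(votes)]
```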

  18. Electromagnetic Dissociation Cross Sections for High LET Fragments

    NASA Technical Reports Server (NTRS)

    Norbury, John

    2016-01-01

    Nuclear interaction cross sections are used in space radiation transport codes to calculate the probability of fragment emission in high energy nucleus-nucleus collisions. Strong interactions usually dominate in these collisions, but electromagnetic (EM) interactions can also sometimes be important. Strong interactions typically occur when the projectile nucleus hits a target nucleus, with a small impact parameter. For impact parameters larger than the sum of the nuclear radii, EM reactions dominate and the process is called electromagnetic dissociation (EMD) if one of the nuclei undergo fragmentation. Previous models of EMD have been used to calculate single proton (p) production, single neutron (n) production or light ion production, where a light ion is defined as an isotope of hydrogen (H) or helium (He), such as a deuteron (2H), a triton (3H), a helion (3He) or an alpha particle (4He). A new model is described which can also account for multiple nucleon production, such as 2p, 2n, 1p1n, 2p1n, 2p2n, etc. in addition to light ion production. Such processes are important to include for the following reasons. Consider, for example, the EMD reaction 56Fe + Al --> 52Cr + X + Al, for a 56Fe projectile impacting Al, which produces the high linear energy transfer (LET) fragment 52Cr. In this reaction, the most probable particles representing X are either 2p2n or 4He. Therefore, production of the high LET fragment 52Cr, must include the multiple nucleon production of 2p2n in addition to the light ion production of 4He. Previous models, such as the NUCFRG3 model, could only account for the 4He production process in this reaction and could not account for 2p2n. The new EMD model presented in this work accounts for both the light ion and multiple nucleon processes, and is therefore able to correctly account for the production of high LET products such as 52Cr. The model will be described and calculations will be presented that show the importance of light ion and multiple nucleon production. The work will also show that EMD reactions contribute most to those fragments with the highest LET.

  19. Modeling the Development of Audiovisual Cue Integration in Speech Perception

    PubMed Central

    Getz, Laura M.; Nordeen, Elke R.; Vrabic, Sarah C.; Toscano, Joseph C.

    2017-01-01

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues. PMID:28335558
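
    As an illustration of the statistical-learning mechanism described above, the following hedged sketch fits a two-component Gaussian mixture to a synthetic joint distribution of one auditory and one visual cue; the cue distributions and category means are invented and are not the authors' simulation parameters.

```python
# Illustrative sketch (not the authors' code): learning two phonological
# categories from the joint distribution of an auditory cue (e.g., VOT) and a
# visual cue (e.g., degree of lip aperture) with a Gaussian mixture model.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Synthetic tokens for two categories: columns = [auditory cue, visual cue].
cat_a = rng.normal(loc=[0.0, 0.2], scale=[1.0, 0.5], size=(500, 2))
cat_b = rng.normal(loc=[4.0, 1.8], scale=[1.0, 0.5], size=(500, 2))
tokens = np.vstack([cat_a, cat_b])

# Unsupervised statistical learning: fit a 2-component GMM to the cue space.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=1).fit(tokens)

# "Audiovisual integration": posterior category probabilities for a new token,
# including a mismatched token (auditory cue says one category, visual the other).
matched = np.array([[0.2, 0.3]])
mismatched = np.array([[0.2, 1.8]])
print(gmm.predict_proba(matched).round(2))
print(gmm.predict_proba(mismatched).round(2))
```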

  20. Modeling the Development of Audiovisual Cue Integration in Speech Perception.

    PubMed

    Getz, Laura M; Nordeen, Elke R; Vrabic, Sarah C; Toscano, Joseph C

    2017-03-21

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues.

  1. Validity of the Symbol Digit Modalities Test as a cognition performance outcome measure for multiple sclerosis

    PubMed Central

    Benedict, Ralph HB; DeLuca, John; Phillips, Glenn; LaRocca, Nicholas; Hudson, Lynn D; Rudick, Richard

    2017-01-01

    Cognitive and motor performance measures are commonly employed in multiple sclerosis (MS) research, particularly when the purpose is to determine the efficacy of treatment. The increasing focus of new therapies on slowing progression or reversing neurological disability makes the utilization of sensitive, reproducible, and valid measures essential. Processing speed is a basic elemental cognitive function that likely influences downstream processes such as memory. The Multiple Sclerosis Outcome Assessments Consortium (MSOAC) includes representatives from advocacy organizations, Food and Drug Administration (FDA), European Medicines Agency (EMA), National Institute of Neurological Disorders and Stroke (NINDS), academic institutions, and industry partners along with persons living with MS. Among the MSOAC goals is acceptance and qualification by regulators of performance outcomes that are highly reliable and valid, practical, cost-effective, and meaningful to persons with MS. A critical step for these neuroperformance metrics is elucidation of clinically relevant benchmarks, well-defined degrees of disability, and gradients of change that are deemed clinically meaningful. This topical review provides an overview of research on one particular cognitive measure, the Symbol Digit Modalities Test (SDMT), recognized as being particularly sensitive to slowed processing of information that is commonly seen in MS. The research in MS clearly supports the reliability and validity of this test and recently has supported a responder definition of SDMT change approximating 4 points or 10% in magnitude. PMID:28206827

  2. Validity of the Symbol Digit Modalities Test as a cognition performance outcome measure for multiple sclerosis.

    PubMed

    Benedict, Ralph Hb; DeLuca, John; Phillips, Glenn; LaRocca, Nicholas; Hudson, Lynn D; Rudick, Richard

    2017-04-01

    Cognitive and motor performance measures are commonly employed in multiple sclerosis (MS) research, particularly when the purpose is to determine the efficacy of treatment. The increasing focus of new therapies on slowing progression or reversing neurological disability makes the utilization of sensitive, reproducible, and valid measures essential. Processing speed is a basic elemental cognitive function that likely influences downstream processes such as memory. The Multiple Sclerosis Outcome Assessments Consortium (MSOAC) includes representatives from advocacy organizations, Food and Drug Administration (FDA), European Medicines Agency (EMA), National Institute of Neurological Disorders and Stroke (NINDS), academic institutions, and industry partners along with persons living with MS. Among the MSOAC goals is acceptance and qualification by regulators of performance outcomes that are highly reliable and valid, practical, cost-effective, and meaningful to persons with MS. A critical step for these neuroperformance metrics is elucidation of clinically relevant benchmarks, well-defined degrees of disability, and gradients of change that are deemed clinically meaningful. This topical review provides an overview of research on one particular cognitive measure, the Symbol Digit Modalities Test (SDMT), recognized as being particularly sensitive to slowed processing of information that is commonly seen in MS. The research in MS clearly supports the reliability and validity of this test and recently has supported a responder definition of SDMT change approximating 4 points or 10% in magnitude.

  3. Next Generation Space Surveillance System-of-Systems

    NASA Astrophysics Data System (ADS)

    McShane, B.

    2014-09-01

    International economic and military dependence on space assets is pervasive and ever-growing in an environment that is now congested, contested, and competitive. There are a number of natural and man-made risks that need to be monitored and characterized to protect and preserve the space environment and the assets within it. Unfortunately, today's space surveillance network (SSN) has gaps in coverage, is not resilient, and has a growing number of objects that get lost. Risks can be efficiently and effectively mitigated, gaps closed, resiliency improved, and performance increased within a next-generation space surveillance network implemented as a system-of-systems with modern information architectures and analytic techniques. This also includes consideration for the newest SSN sensors (e.g. Space Fence), which are born net-centric out of the box and able to seamlessly interface with the JSpOC Mission System, the global information grid, and future unanticipated users. Significant opportunity exists to integrate legacy, traditional, and non-traditional sensors into a larger space system-of-systems (including command and control centers) for multiple clients through low-cost sustainment, modification, and modernization efforts. Clients include operations centers (e.g. JSpOC, USSTRATCOM, CANSPOC), intelligence centers (e.g. NASIC), space surveillance sensor sites (e.g. AMOS, GEODSS), international governments (e.g. Germany, UK), space agencies (e.g. NASA), and academic institutions. Each has differing priorities, networks, data needs, timeliness, security, accuracy requirements, and formats. Enabling processes and technologies include: standardized and type-accredited methods for secure connections to multiple networks; machine-to-machine interfaces for near real-time data sharing and tip-and-cue activities; common data models for analytical processing across multiple radar and optical sensor types; an efficient way to automatically translate between differing client and sensor formats; a data warehouse of time-based space events; secure collaboration tools for international coalition space operations; and shared concepts of operations, tactics, techniques, and procedures.

  4. Identification and analysis of chemical constituents and rat serum metabolites in Suan-Zao-Ren granule using ultra high performance liquid chromatography quadrupole time-of-flight mass spectrometry combined with multiple data processing approaches.

    PubMed

    Du, Yiyang; He, Bosai; Li, Qing; He, Jiao; Wang, Di; Bi, Kaishun

    2017-07-01

    Suan-Zao-Ren granule is widely used to treat insomnia in China. However, because of the complexity and diversity of the chemical compositions in traditional Chinese medicine formulas, the comprehensive analysis of constituents in vitro and in vivo is rather difficult. In our study, an ultra high performance liquid chromatography with quadrupole time-of-flight mass spectrometry method, combined with the PeakView® software and multiple data processing approaches including product ion filtering, neutral loss filtering, and mass defect filtering, was developed to characterize the ingredients and rat serum metabolites in Suan-Zao-Ren granule. A total of 101 constituents were detected in vitro. Under the same analysis conditions, 68 constituents were characterized in rat serum, including 35 prototype components and 33 metabolites. The metabolic pathways of the main components were also illustrated. Among them, the metabolic pathways of timosaponin AI were revealed for the first time. The bioactive compounds mainly underwent phase I metabolic pathways including hydroxylation, oxidation, and hydrolysis, and phase II metabolic pathways including sulfate conjugation, glucuronide conjugation, cysteine conjugation, acetylcysteine conjugation, and glutathione conjugation. In conclusion, our results showed that this analysis approach was extremely useful for the in-depth pharmacological research of Suan-Zao-Ren granule and provided a chemical basis for its rational use. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
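
    One of the listed post-acquisition strategies, the mass defect filter, can be illustrated with a short sketch. This is a generic reading of the technique, not the PeakView® implementation; the template mass and window width below are invented.

```python
# Hedged illustration of a simple mass defect filter: keep only features whose
# fractional mass falls in a window around the mass defect of a template
# compound. Values are invented for the example.
def mass_defect(mz: float) -> float:
    """Fractional part of the measured m/z (nominal-mass convention)."""
    return mz - round(mz)

def mass_defect_filter(mz_values, template_mz, window=0.05):
    """Keep m/z values whose mass defect lies within +/- window of the template's."""
    target = mass_defect(template_mz)
    return [mz for mz in mz_values if abs(mass_defect(mz) - target) <= window]

features = [417.1325, 563.2108, 742.9981, 871.4532]
print(mass_defect_filter(features, template_mz=417.1325, window=0.04))
```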

  5. Real-time processing for full-range Fourier-domain optical-coherence tomography with zero-filling interpolation using multiple graphic processing units.

    PubMed

    Watanabe, Yuuki; Maeno, Seiya; Aoshima, Kenji; Hasegawa, Haruyuki; Koseki, Hitoshi

    2010-09-01

    The real-time display of full-range, 2048 axial pixel × 1024 lateral pixel, Fourier-domain optical-coherence tomography (FD-OCT) images is demonstrated. The required speed was achieved by using dual graphic processing units (GPUs) with many stream processors to realize highly parallel processing. We used a zero-filling technique, including a forward Fourier transform, a zero padding to increase the axial data-array size to 8192, an inverse Fourier transform back to the spectral domain, a linear interpolation from wavelength to wavenumber, a lateral Hilbert transform to obtain the complex spectrum, a Fourier transform to obtain the axial profiles, and a log scaling. The data-transfer time of the frame grabber was 15.73 ms, and the processing time, which includes the data transfer between the GPU memory and the host computer, was 14.75 ms, for a total time shorter than the 36.70 ms frame-interval time using a line-scan CCD camera operated at 27.9 kHz. That is, our OCT system achieved a processed-image display rate of 27.23 frames/s.
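
    The processing chain enumerated above can be prototyped on the CPU with NumPy/SciPy, as in the hedged sketch below. Array sizes follow the abstract, but the wavelength grid and the synthetic spectra are invented, and no GPU parallelization is shown.

```python
# CPU-only sketch of the zero-filling chain described above (the published
# system runs an equivalent chain on two GPUs). Data are synthetic.
import numpy as np
from scipy.signal import hilbert

n_lateral, n_spectral, n_pad = 1024, 2048, 8192
rng = np.random.default_rng(2)

# Simulated raw interference spectra, sampled uniformly in wavelength.
spectra = rng.normal(size=(n_lateral, n_spectral))
wavelength = np.linspace(800e-9, 880e-9, n_spectral)      # hypothetical source band

# 1) Zero-filling interpolation: FFT, zero-pad the transform, inverse FFT.
axial = np.fft.fft(spectra, axis=1)
padded = np.zeros((n_lateral, n_pad), dtype=complex)
padded[:, :n_spectral // 2] = axial[:, :n_spectral // 2]
padded[:, -n_spectral // 2:] = axial[:, -n_spectral // 2:]
dense_spectra = np.fft.ifft(padded, axis=1).real * (n_pad / n_spectral)

# 2) Linear interpolation from an even wavelength grid to an even wavenumber grid.
k_nonuniform = 2 * np.pi / np.linspace(wavelength[0], wavelength[-1], n_pad)
k_uniform = np.linspace(k_nonuniform.min(), k_nonuniform.max(), n_pad)
resampled = np.array([np.interp(k_uniform, k_nonuniform[::-1], row[::-1])
                      for row in dense_spectra])

# 3) Hilbert transform along the lateral direction to build the complex spectrum
#    (this is what removes the mirror image and gives the full axial range).
complex_spectra = hilbert(resampled, axis=0)

# 4) FFT along the spectral axis to get axial profiles, then log-scale for display.
bscan_db = 20 * np.log10(np.abs(np.fft.fft(complex_spectra, axis=1)) + 1e-12)
print(bscan_db.shape)   # (1024, 8192)
```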

  6. ODMSummary: A Tool for Automatic Structured Comparison of Multiple Medical Forms Based on Semantic Annotation with the Unified Medical Language System

    PubMed Central

    Krumm, Rainer; Dugas, Martin

    2016-01-01

    Introduction: Medical documentation is applied in various settings including patient care and clinical research. Since procedures of medical documentation are heterogeneous and continue to evolve, secondary use of medical data is complicated. Development of medical forms, merging of data from different sources, and meta-analyses of different data sets are currently a predominantly manual process and therefore difficult and cumbersome. Available applications to automate these processes are limited. In particular, tools to compare multiple documentation forms are missing. The objective of this work is to design, implement, and evaluate the new system ODMSummary for comparison of multiple forms with a high number of semantically annotated data elements and a high level of usability. Methods: System requirements are the capability to summarize and compare a set of forms, to estimate the documentation effort, to track changes in different versions of forms, and to find comparable items in different forms. Forms are provided in Operational Data Model format with semantic annotations from the Unified Medical Language System. Twelve medical experts were invited to participate in a 3-phase evaluation of the tool regarding usability. Results: ODMSummary (available at https://odmtoolbox.uni-muenster.de/summary/summary.html) provides a structured overview of multiple forms and their documentation fields. This comparison enables medical experts to assess multiple forms or whole datasets for secondary use. System usability was optimized based on expert feedback. Discussion: The evaluation demonstrates that feedback from domain experts is needed to identify usability issues. In conclusion, this work shows that automatic comparison of multiple forms is feasible and the results are usable for medical experts. PMID:27736972

  7. Integration of health management and support systems is key to achieving cost reduction and operational concept goals of the 2nd generation reusable launch vehicle

    NASA Astrophysics Data System (ADS)

    Koon, Phillip L.; Greene, Scott

    2002-07-01

    Our aerospace customers are demanding that we drastically reduce the cost of operating and supporting our products. Our space customer in particular is looking for the next generation of reusable launch vehicle systems to support more aircraft like operation. To achieve this goal requires more than an evolution in materials, processes and systems, what is required is a paradigm shift in the design of the launch vehicles and the processing systems that support the launch vehicles. This paper describes the Automated Informed Maintenance System (AIM) we are developing for NASA's Space Launch Initiative (SLI) Second Generation Reusable Launch Vehicle (RLV). Our system includes an Integrated Health Management (IHM) system for the launch vehicles and ground support systems, which features model based diagnostics and prognostics. Health Management data is used by our AIM decision support and process aids to automatically plan maintenance, generate work orders and schedule maintenance activities along with the resources required to execute these processes. Our system will automate the ground processing for a spaceport handling multiple RLVs executing multiple missions. To accomplish this task we are applying the latest web based distributed computing technologies and application development techniques.

  8. Predicting dual-task performance with the Multiple Resources Questionnaire (MRQ).

    PubMed

    Boles, David B; Bursk, Jonathan H; Phillips, Jeffrey B; Perdelwitz, Jason R

    2007-02-01

    The objective was to assess the validity of the Multiple Resources Questionnaire (MRQ) in predicting dual-task interference. Subjective workload measures such as the Subjective Workload Assessment Technique (SWAT) and NASA Task Load Index are sensitive to single-task parameters and dual-task loads but have not attempted to measure workload in particular mental processes. An alternative is the MRQ. In Experiment 1, participants completed simple laboratory tasks and the MRQ after each. Interference between tasks was then correlated to three different task similarity metrics: profile similarity, based on r² between ratings; overlap similarity, based on summed minima; and overall demand, based on summed ratings. Experiment 2 used similar methods but more complex computer-based games. In Experiment 1 the MRQ moderately predicted interference (r = +.37), with no significant difference between metrics. In Experiment 2 the metric effect was significant, with overlap similarity excelling in predicting interference (r = +.83). Mean ratings showed high diagnosticity in identifying specific mental processing bottlenecks. The MRQ shows considerable promise as a cognitive-process-sensitive workload measure. Potential applications of the MRQ include the identification of dual-processing bottlenecks as well as process overloads in single tasks, preparatory to redesign in areas such as air traffic management, advanced flight displays, and medical imaging.
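
    The three similarity metrics named above can be computed directly from a pair of rating profiles, as in the following sketch. The formulas are paraphrased from the description in the abstract (the exact MRQ scoring procedure may differ), and the rating vectors are invented.

```python
# Small sketch of the three task-similarity metrics, computed from two
# hypothetical MRQ rating profiles (one rating per resource dimension).
import numpy as np

task_a = np.array([4, 3, 0, 2, 1, 0, 3])   # hypothetical MRQ ratings, task A
task_b = np.array([3, 4, 1, 0, 2, 0, 3])   # hypothetical MRQ ratings, task B

# Profile similarity: squared correlation (r^2) between the two rating profiles.
profile_similarity = np.corrcoef(task_a, task_b)[0, 1] ** 2

# Overlap similarity: sum of the element-wise minima of the two profiles.
overlap_similarity = np.minimum(task_a, task_b).sum()

# Overall demand: sum of all ratings across both tasks.
overall_demand = task_a.sum() + task_b.sum()

print(profile_similarity, overlap_similarity, overall_demand)
```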

  9. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY

    PubMed Central

    Somogyi, Endre; Hagar, Amit; Glazier, James A.

    2017-01-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models. PMID:29282379

  10. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY.

    PubMed

    Somogyi, Endre; Hagar, Amit; Glazier, James A

    2016-12-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models.

  11. Efficient multitasking: parallel versus serial processing of multiple tasks

    PubMed Central

    Fischer, Rico; Plessow, Franziska

    2015-01-01

    In the context of performance optimizations in multitasking, a central debate has unfolded in multitasking research around whether cognitive processes related to different tasks proceed only sequentially (one at a time), or can operate in parallel (simultaneously). This review features a discussion of theoretical considerations and empirical evidence regarding parallel versus serial task processing in multitasking. In addition, we highlight how methodological differences and theoretical conceptions determine the extent to which parallel processing in multitasking can be detected, to guide their employment in future research. Parallel and serial processing of multiple tasks are not mutually exclusive. Therefore, questions focusing exclusively on either task-processing mode are too simplified. We review empirical evidence and demonstrate that shifting between more parallel and more serial task processing critically depends on the conditions under which multiple tasks are performed. We conclude that efficient multitasking is reflected by the ability of individuals to adjust multitasking performance to environmental demands by flexibly shifting between different processing strategies of multiple task-component scheduling. PMID:26441742

  12. Efficient multitasking: parallel versus serial processing of multiple tasks.

    PubMed

    Fischer, Rico; Plessow, Franziska

    2015-01-01

    In the context of performance optimizations in multitasking, a central debate has unfolded in multitasking research around whether cognitive processes related to different tasks proceed only sequentially (one at a time), or can operate in parallel (simultaneously). This review features a discussion of theoretical considerations and empirical evidence regarding parallel versus serial task processing in multitasking. In addition, we highlight how methodological differences and theoretical conceptions determine the extent to which parallel processing in multitasking can be detected, to guide their employment in future research. Parallel and serial processing of multiple tasks are not mutually exclusive. Therefore, questions focusing exclusively on either task-processing mode are too simplified. We review empirical evidence and demonstrate that shifting between more parallel and more serial task processing critically depends on the conditions under which multiple tasks are performed. We conclude that efficient multitasking is reflected by the ability of individuals to adjust multitasking performance to environmental demands by flexibly shifting between different processing strategies of multiple task-component scheduling.

  13. BowMapCL: Burrows-Wheeler Mapping on Multiple Heterogeneous Accelerators.

    PubMed

    Nogueira, David; Tomas, Pedro; Roma, Nuno

    2016-01-01

    The computational demand of exact-search procedures has pressed the exploitation of parallel processing accelerators to reduce the execution time of many applications. However, this often imposes strict restrictions in terms of the problem size and implementation efforts, mainly due to their possibly distinct architectures. To circumvent this limitation, a new exact-search alignment tool (BowMapCL) based on the Burrows-Wheeler Transform and FM-Index is presented. Contrasting to other alternatives, BowMapCL is based on a unified implementation using OpenCL, allowing the exploitation of multiple and possibly different devices (e.g., NVIDIA, AMD/ATI, and Intel GPUs/APUs). Furthermore, to efficiently exploit such heterogeneous architectures, BowMapCL incorporates several techniques to promote its performance and scalability, including multiple buffering, work-queue task-distribution, and dynamic load-balancing, together with index partitioning, bit-encoding, and sampling. When compared with state-of-the-art tools, the attained results showed that BowMapCL (using a single GPU) is 2 × to 7.5 × faster than mainstream multi-threaded CPU BWT-based aligners, like Bowtie, BWA, and SOAP2; and up to 4 × faster than the best performing state-of-the-art GPU implementations (namely, SOAP3 and HPG-BWT). When multiple and completely distinct devices are considered, BowMapCL efficiently scales the offered throughput, ensuring a convenient load-balance of the involved processing in the several distinct devices.
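
    The exact-search core that BowMapCL parallelizes, the Burrows-Wheeler Transform plus FM-index backward search, can be illustrated with a toy single-threaded sketch; none of the tool's index sampling, bit-encoding, or OpenCL work distribution is shown.

```python
# Toy illustration of BWT construction and FM-index backward search for exact
# pattern counting. Educational only; real aligners use suffix-array based
# construction and sampled Occ/SA structures.
def bwt(text: str) -> str:
    text += "$"
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(rot[-1] for rot in rotations)

def fm_index(bwt_str: str):
    """Return C (first-occurrence table) and Occ (cumulative symbol counts)."""
    alphabet = sorted(set(bwt_str))
    counts = {c: bwt_str.count(c) for c in alphabet}
    C, total = {}, 0
    for c in alphabet:
        C[c] = total
        total += counts[c]
    occ = {c: [0] * (len(bwt_str) + 1) for c in alphabet}
    for i, ch in enumerate(bwt_str):
        for c in alphabet:
            occ[c][i + 1] = occ[c][i] + (ch == c)
    return C, occ

def count_occurrences(pattern: str, C, occ, n: int) -> int:
    """Backward search: how many times does pattern occur in the reference?"""
    lo, hi = 0, n
    for ch in reversed(pattern):
        if ch not in C:
            return 0
        lo = C[ch] + occ[ch][lo]
        hi = C[ch] + occ[ch][hi]
        if lo >= hi:
            return 0
    return hi - lo

reference = "ACGTACGTGACG"
b = bwt(reference)
C, occ = fm_index(b)
print(count_occurrences("ACG", C, occ, len(b)))   # 3
```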

  14. Differential Effects of Alcohol on Working Memory: Distinguishing Multiple Processes

    PubMed Central

    Saults, J. Scott; Cowan, Nelson; Sher, Kenneth J.; Moreno, Matthew V.

    2008-01-01

    We assessed effects of alcohol consumption on different types of working memory (WM) tasks in an attempt to characterize the nature of alcohol effects on cognition. The WM tasks varied in two properties of materials to be retained in a two-stimulus comparison procedure. Conditions included (1) spatial arrays of colors, (2) temporal sequences of colors, (3) spatial arrays of spoken digits, and (4) temporal sequences of spoken digits. Alcohol consumption impaired memory for auditory and visual sequences, but not memory for simultaneous arrays of auditory or visual stimuli. These results suggest that processes needed to encode and maintain stimulus sequences, such as rehearsal, are more sensitive to alcohol intoxication than other WM mechanisms needed to maintain multiple concurrent items, such as focusing attention on them. These findings help to resolve disparate findings from prior research into alcohol’s effect on WM and on divided attention. The results suggest that moderate doses of alcohol impair WM by affecting certain mnemonic strategies and executive processes rather than by shrinking the basic holding capacity of WM. PMID:18179311

  15. Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments

    PubMed Central

    Chung, Lisa M.; Colangelo, Christopher M.; Zhao, Hongyu

    2014-01-01

    Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially data pre‑processing where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre‑processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets. PMID:24905083
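
    A compact, hypothetical sketch of the kinds of pre-processing steps listed above (quality assessment across replicates, outlier flagging, normalization) applied to a small table of log-transformed transition intensities; the thresholds and data are invented and do not reproduce the authors' pipeline.

```python
# Hedged sketch of MRM-style pre-processing on a synthetic intensity table:
# rows = transitions, columns = replicate runs, values = log2 peak areas.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
data = pd.DataFrame(rng.normal(loc=20, scale=2, size=(8, 4)),
                    index=[f"transition_{i}" for i in range(8)],
                    columns=[f"run_{j}" for j in range(4)])
data.iloc[2, 1] += 8          # plant one aberrant measurement

# 1) Quality assessment: per-transition coefficient of variation across runs.
cv = data.std(axis=1) / data.mean(axis=1)

# 2) Outlier detection: robust z-score against the per-transition median.
mad = (data.sub(data.median(axis=1), axis=0)).abs().median(axis=1)
robust_z = data.sub(data.median(axis=1), axis=0).div(1.4826 * mad, axis=0)
outliers = robust_z.abs() > 3.5

# 3) Normalization: align run medians to remove run-level intensity shifts.
normalized = data.sub(data.median(axis=0), axis=1) + data.median(axis=0).mean()

print(cv.round(3).to_dict())
print(outliers.sum().to_dict())
print(normalized.median(axis=0).round(2).to_dict())
```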

  16. Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments.

    PubMed

    Chung, Lisa M; Colangelo, Christopher M; Zhao, Hongyu

    2014-06-05

    Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially data pre‑processing where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre‑processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets.

  17. Under-sampling in a Multiple-Channel Laser Vibrometry System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corey, Jordan

    2007-03-01

    Laser vibrometry is a technique used to detect vibrations on objects using the interference of coherent light with itself. Most vibrometry systems process only one target location at a time, but processing multiple locations simultaneously provides improved detection capabilities. Traditional laser vibrometry systems employ oversampling to sample the incoming modulated-light signal; however, as the number of channels increases in these systems, certain issues arise, such as higher computational cost, excessive heat, increased power requirements, and increased component cost. This thesis describes a novel approach to laser vibrometry that utilizes undersampling to control the undesirable issues associated with over-sampled systems. Undersampling allows for significantly fewer samples to represent the modulated-light signals, which offers several advantages in the overall system design. These advantages include an improvement in thermal efficiency, lower processing requirements, and a higher immunity to the relative intensity noise inherent in laser vibrometry applications. A unique feature of this implementation is the use of a parallel architecture to increase the overall system throughput. This parallelism is realized using a hierarchical multi-channel architecture based on off-the-shelf programmable logic devices (PLDs).
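
    The core undersampling idea can be demonstrated numerically: a narrowband signal centered far above the sample rate aliases to a predictable low frequency, so its modulation survives with far fewer samples. The frequencies in the sketch below are invented and unrelated to the thesis hardware.

```python
# Numerical illustration of bandpass (under)sampling: the carrier folds down to
# |f_carrier - k * fs|, so the slow modulation is still recoverable.
import numpy as np

f_carrier = 40e6        # hypothetical modulated-light beat frequency (Hz)
f_mod = 1e3             # vibration signature of interest (Hz)
fs = 9e6                # deliberate under-sampling: fs << 2 * f_carrier

n = 4096
t = np.arange(n) / fs
signal = np.cos(2 * np.pi * f_mod * t) * np.cos(2 * np.pi * f_carrier * t)

# Expected alias frequency for the nearest integer k.
k = round(f_carrier / fs)
f_alias_expected = abs(f_carrier - k * fs)

spectrum = np.abs(np.fft.rfft(signal))
f_axis = np.fft.rfftfreq(n, d=1 / fs)
f_alias_measured = f_axis[np.argmax(spectrum)]
print(f_alias_expected, f_alias_measured)   # both close to 4 MHz
```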

  18. MABEL at IPAC: managing address books and email lists at the Infrared Processing and Analysis Center

    NASA Astrophysics Data System (ADS)

    Crane, Megan; Brinkworth, Carolyn; Gelino, Dawn; O'Leary, Ellen

    2012-09-01

    The Infrared Processing and Analysis Center (IPAC), located on the campus of the California Institute of Technology, is NASA's multi-mission data center for infrared astrophysics. Some of IPAC's services include administering data analysis funding awards to the astronomical community, organizing conferences and workshops, and soliciting and selecting fellowship and observing proposals. As most of these services are repeated annually or biannually, it becomes necessary to maintain multiple lists of email contacts associated with each service. MABEL is a PHP/MySQL web database application designed to facilitate this process. It serves as an address book containing up-to-date contact information for thousands of recipients. Recipients may be assigned to any number of email lists categorized by IPAC project and team. Lists may be public (viewable by all project members) or private (viewable only by team members). MABEL can also be used to send HTML or plain-text emails to multiple lists at once and prevents duplicate emails to a single recipient. This work was performed at the California Institute of Technology under contract to the National Aeronautics and Space Administration.
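
    The list-membership and de-duplication behavior described above can be sketched in a few lines. This is an illustrative Python stand-in, not MABEL's PHP/MySQL code; the addresses and list names are invented.

```python
# Recipients belong to several project/team lists; a send targeting multiple
# lists collapses duplicates so each address is mailed exactly once.
address_book = {
    "alice@example.edu": {"lists": {"spitzer_prop", "fellowship_2012"}},
    "bob@example.org":   {"lists": {"spitzer_prop"}},
    "carol@example.edu": {"lists": {"fellowship_2012", "workshop"}},
}

def recipients_for(target_lists):
    """Union of all recipients across the requested lists, deduplicated by address."""
    return sorted(addr for addr, rec in address_book.items()
                  if rec["lists"] & set(target_lists))

# Sending to two overlapping lists still yields each address once.
print(recipients_for(["spitzer_prop", "fellowship_2012"]))
```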

  19. Differential effects of alcohol on working memory: distinguishing multiple processes.

    PubMed

    Saults, J Scott; Cowan, Nelson; Sher, Kenneth J; Moreno, Matthew V

    2007-12-01

    The authors assessed effects of alcohol consumption on different types of working memory (WM) tasks in an attempt to characterize the nature of alcohol effects on cognition. The WM tasks varied in 2 properties of materials to be retained in a 2-stimulus comparison procedure. Conditions included (a) spatial arrays of colors, (b) temporal sequences of colors, (c) spatial arrays of spoken digits, and (d) temporal sequences of spoken digits. Alcohol consumption impaired memory for auditory and visual sequences but not memory for simultaneous arrays of auditory or visual stimuli. These results suggest that processes needed to encode and maintain stimulus sequences, such as rehearsal, are more sensitive to alcohol intoxication than other WM mechanisms needed to maintain multiple concurrent items, such as focusing attention on them. These findings help to resolve disparate findings from prior research on alcohol's effect on WM and on divided attention. The results suggest that moderate doses of alcohol impair WM by affecting certain mnemonic strategies and executive processes rather than by shrinking the basic holding capacity of WM. (c) 2008 APA, all rights reserved.

  20. Interventions aimed at improving the ability to use everyday technology in work after brain injury.

    PubMed

    Kassberg, Ann-Charlotte; Prellwitz, Maria; Malinowsky, Camilla; Larsson-Lund, Maria

    2016-01-01

    The aim of this study was to explore and describe how client-centred occupational therapy interventions may support and improve the ability to use everyday technology (ET) in work tasks in people with acquired brain injury (ABI). A qualitative, descriptive multiple-case study was designed, and occupation-based interventions were provided to three working-age participants with ABI. Multiple sources were used to collect data throughout the three intervention processes, including assessments, field notes, and interviews. The Canadian Occupational Performance Measure and the Management of Everyday Technology Assessment were administered before the interventions, after the interventions and at a follow-up session 2-3 months subsequent to the interventions. The three intervention processes initially consisted of similar actions, but subsequently the actions took on a different focus and intensity for each case. All of the goals in each of the three case processes were achieved, and both perceived and observed abilities to use ET in work tasks improved. Client-centred occupational therapy interventions might have the potential to improve the ability to use ET in work tasks in people with ABI.

  1. Monitoring and predicting cognitive state and performance via physiological correlates of neuronal signals.

    PubMed

    Russo, Michael B; Stetz, Melba C; Thomas, Maria L

    2005-07-01

    Judgment, decision making, and situational awareness are higher-order mental abilities critically important to operational cognitive performance. Higher-order mental abilities rely on intact functioning of multiple brain regions, including the prefrontal, thalamus, and parietal areas. Real-time monitoring of individuals for cognitive performance capacity via an approach based on sampling multiple neurophysiologic signals and integrating those signals with performance prediction models potentially provides a method of supporting warfighters' and commanders' decision making and other operationally relevant mental processes and is consistent with the goals of augmented cognition. Cognitive neurophysiological assessments that directly measure brain function and subsequent cognition include positron emission tomography, functional magnetic resonance imaging, mass spectroscopy, near-infrared spectroscopy, magnetoencephalography, and electroencephalography (EEG); however, most direct measures are not practical to use in operational environments. More practical, albeit indirect measures that are generated by, but removed from the actual neural sources, are movement activity, oculometrics, heart rate, and voice stress signals. The goal of the papers in this section is to describe advances in selected direct and indirect cognitive neurophysiologic monitoring techniques as applied for the ultimate purpose of preventing operational performance failures. These papers present data acquired in a wide variety of environments, including laboratory, simulator, and clinical arenas. The papers discuss cognitive neurophysiologic measures such as digital signal processing wrist-mounted actigraphy; oculometrics including blinks, saccadic eye movements, pupillary movements, the pupil light reflex; and high-frequency EEG. These neurophysiological indices are related to cognitive performance as measured through standard test batteries and simulators with conditions including sleep loss, time on task, and aviation flight-induced fatigue.

  2. Role of Multiple Atmospheric Reflections in Formation of Electron Distribution Function in the Diffuse Aurora Region. Chapter 9

    NASA Technical Reports Server (NTRS)

    Khazanov, George V.; Himwich, Elizabeth W.; Glocer, Alex; Sibeck, David G.

    2015-01-01

    The precipitation of high-energy magnetospheric electrons (E greater than 500-600 electronvolts) in the diffuse aurora contributes significant energy flux into Earth's ionosphere. In the diffuse aurora, precipitating electrons initially injected from the plasmasheet via wave-particle interaction processes degrade in the atmosphere toward lower energies and produce secondary electrons via impact ionization of the neutral atmosphere. These initially precipitating electrons of magnetospheric origin can be additionally reflected back into the magnetosphere by the two magnetically conjugated atmospheres, leading to a series of multiple reflections that can greatly influence the initially precipitating flux at the upper ionospheric boundary (700-800 kilometers) and the resultant population of secondary electrons and electrons cascading toward lower energies. We present the solution of the Boltzmann-Landau kinetic equation that uniformly describes the entire electron distribution function in the diffuse aurora, including the affiliated production of secondary electrons (E is less than or equal to 600 electronvolts) and their energy interplay in the magnetosphere and two conjugated ionospheres. This solution takes into account the role of multiple atmospheric reflections of the precipitated electrons that were initially moved into the loss cone via wave-particle interaction processes in Earth's plasmasheet.

  3. Slum Upgrading and Health Equity.

    PubMed

    Corburn, Jason; Sverdlik, Alice

    2017-03-24

    Informal settlement upgrading is widely recognized for enhancing shelter and promoting economic development, yet its potential to improve health equity is usually overlooked. Almost one in seven people on the planet are expected to reside in urban informal settlements, or slums, by 2030. Slum upgrading is the process of delivering place-based environmental and social improvements to the urban poor, including land tenure, housing, infrastructure, employment, health services and political and social inclusion. The processes and products of slum upgrading can address multiple environmental determinants of health. This paper reviewed urban slum upgrading evaluations from cities across Asia, Africa and Latin America and found that few captured the multiple health benefits of upgrading. With the Sustainable Development Goals (SDGs) focused on improving well-being for billions of city-dwellers, slum upgrading should be viewed as a key strategy to promote health, equitable development and reduce climate change vulnerabilities. We conclude with suggestions for how slum upgrading might more explicitly capture its health benefits, such as through the use of health impact assessment (HIA) and adopting an urban health in all policies (HiAP) framework. Urban slum upgrading must be more explicitly designed, implemented and evaluated to capture its multiple global environmental health benefits.

  4. Slum Upgrading and Health Equity

    PubMed Central

    Corburn, Jason; Sverdlik, Alice

    2017-01-01

    Informal settlement upgrading is widely recognized for enhancing shelter and promoting economic development, yet its potential to improve health equity is usually overlooked. Almost one in seven people on the planet are expected to reside in urban informal settlements, or slums, by 2030. Slum upgrading is the process of delivering place-based environmental and social improvements to the urban poor, including land tenure, housing, infrastructure, employment, health services and political and social inclusion. The processes and products of slum upgrading can address multiple environmental determinants of health. This paper reviewed urban slum upgrading evaluations from cities across Asia, Africa and Latin America and found that few captured the multiple health benefits of upgrading. With the Sustainable Development Goals (SDGs) focused on improving well-being for billions of city-dwellers, slum upgrading should be viewed as a key strategy to promote health, equitable development and reduce climate change vulnerabilities. We conclude with suggestions for how slum upgrading might more explicitly capture its health benefits, such as through the use of health impact assessment (HIA) and adopting an urban health in all policies (HiAP) framework. Urban slum upgrading must be more explicitly designed, implemented and evaluated to capture its multiple global environmental health benefits. PMID:28338613

  5. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  6. Method and apparatus for signal processing in a sensor system for use in spectroscopy

    DOEpatents

    O'Connor, Paul [Bellport, NY; DeGeronimo, Gianluigi [Nesconset, NY; Grosholz, Joseph [Natrona Heights, PA

    2008-05-27

    A method for processing pulses arriving randomly in time on at least one channel using multiple peak detectors includes asynchronously selecting a non-busy peak detector (PD) in response to a pulse-generated trigger signal, connecting the channel to the selected PD in response to the trigger signal, and detecting a pulse peak amplitude. Amplitude and time-of-arrival data are output in first-in first-out (FIFO) sequence. An apparatus includes trigger comparators to generate the trigger signal for the pulse-receiving channel, PDs, a switch for connecting the channel to the selected PD, and logic circuitry which maintains the write pointer. Also included are time-to-amplitude converters (TACs), which convert time of arrival to an analog voltage, and an analog multiplexer, which provides FIFO output. A multi-element sensor system for spectroscopy includes detector elements, channels, trigger comparators, PDs, a switch, and a logic circuit with an asynchronous write pointer. The system includes TACs, a multiplexer, and an analog-to-digital converter.
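
    The claimed scheme can be illustrated behaviorally with a short simulation, as below: randomly arriving pulses trigger selection of a non-busy peak detector, and amplitude/time-of-arrival pairs are read out in FIFO order. The timing constants and detector count are invented, and this is a behavioral sketch, not the patented circuit.

```python
# Behavioral simulation: assign randomly arriving pulses to whichever peak
# detector is currently free, then read out (time-of-arrival, amplitude) pairs
# in first-in first-out order. All constants are illustrative.
import random

random.seed(0)
N_DETECTORS = 4
BUSY_TIME = 5.0          # hypothetical time a detector needs to settle and be read

free_at = [0.0] * N_DETECTORS        # time at which each peak detector becomes free
fifo = []                            # (arrival_time, amplitude) in FIFO order

# Poisson-like arrival of pulses with random amplitudes.
t = 0.0
for _ in range(20):
    t += random.expovariate(0.5)
    amplitude = random.uniform(0.1, 1.0)
    idle = [i for i, f in enumerate(free_at) if f <= t]   # non-busy detectors
    if not idle:
        continue                                          # pulse lost: all detectors busy
    pd = idle[0]                                          # select a free PD
    free_at[pd] = t + BUSY_TIME                           # detector busy until read out
    fifo.append((t, amplitude))                           # TOA and peak amplitude queued

# FIFO readout: events leave in arrival order.
for toa, amp in fifo:
    print(f"t = {toa:6.2f}  amplitude = {amp:.3f}")
```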

  7. Standardised Embedded Data framework for Drones [SEDD]

    NASA Astrophysics Data System (ADS)

    Wyngaard, J.; Barbieri, L.; Peterson, F. S.

    2015-12-01

    A number of barriers to entry remain for UAS use in science. One in particular is that of implementing an experiment- and UAS-specific software stack. Currently this stack is most often developed in-house and customised for a particular UAS-sensor pairing, limiting its reuse. Alternatively, when adaptable, a suitable commercial package may be used, but such systems are both costly and usually suboptimal. In order to address this challenge the Standardised Embedded Data framework for Drones [SEDD] is being developed in μpython. SEDD provides an open source, reusable, and scientist-accessible drop-in solution for drone data capture and triage. Targeted at embedded hardware, and offering easy access to standard I/O interfaces, SEDD provides an easy solution for simply capturing data from a sensor. However, the intention is rather to enable more complex systems of multiple sensors, computer hardware, and feedback loops, via three primary components. A data asset manager ensures data assets are associated with appropriate metadata as they are captured. Thereafter, the asset is easily archived or otherwise redirected, possibly to onboard storage, onboard compute resources for processing, an interface for transmission, another sensor control system, remote storage and processing (such as EarthCube's CHORDS), or any combination of the above. A service workflow manager enables easy implementation of complex onboard systems via dedicated control of multiple continuous and periodic services. Such services will include the housekeeping chores of operating a UAS and multiple sensors, but will also permit a scientist to drop in initial scientific data processing code utilising on-board compute resources beyond the autopilot. Having such capabilities firstly enables easy creation of real-time feedback to the human- or auto-pilot, or to other sensors, on data quality or needed flight-path changes. Secondly, compute hardware provides the opportunity to carry out real-time data triage, for the purposes of conserving on-board storage space or transmission bandwidth in inherently poor connectivity environments. A compute manager is finally included. Depending on system complexity, and given the need for power-efficient parallelism, it can quickly become necessary to provide a scheduling service for multiple workflows.
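
    A hedged, minimal illustration of the data asset manager idea described above: each captured sample is wrapped with metadata at capture time and fanned out to registered sinks (archive, telemetry link, onboard triage). Class and field names are invented; SEDD itself targets embedded MicroPython, which this desktop-Python sketch only approximates.

```python
# Illustrative data asset manager: wrap each capture with metadata and route it
# to every registered sink. Names and fields are hypothetical.
import time

class DataAsset:
    def __init__(self, sensor_id, payload):
        self.metadata = {"sensor": sensor_id, "timestamp": time.time(), "units": "raw"}
        self.payload = payload

class DataAssetManager:
    def __init__(self):
        self.sinks = []                       # callables: archive, radio link, triage...

    def register_sink(self, sink):
        self.sinks.append(sink)

    def capture(self, sensor_id, payload):
        asset = DataAsset(sensor_id, payload)
        for sink in self.sinks:               # fan the asset out to every registered sink
            sink(asset)
        return asset

manager = DataAssetManager()
manager.register_sink(lambda a: print("archive:", a.metadata, a.payload))
manager.register_sink(lambda a: print("triage :", len(a.payload), "samples"))
manager.capture("methane_sensor_1", [1.2, 1.3, 1.1])
```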

  8. Dovetail spoke internal permanent magnet machine

    DOEpatents

    Alexander, James Pellegrino [Ballston Lake, NY; EL-Refaie, Ayman Mohamed Fawzi [Niskayuna, NY; Lokhandwalla, Murtuza [Clifton Park, NY; Shah, Manoj Ramprasad [Latham, NY; VanDam, Jeremy Daniel [West Coxsackie, NY

    2011-08-23

    An internal permanent magnet (IPM) machine is provided. The IPM machine includes a stator assembly and a stator core. The stator core also includes multiple stator teeth. The stator assembly is further configured with stator windings to generate a stator magnetic field when excited with alternating currents and extends along a longitudinal axis with an inner surface defining a cavity. The IPM machine also includes a rotor assembly and a rotor core. The rotor core is disposed inside the cavity and configured to rotate about the longitudinal axis. The rotor assembly further includes a shaft. The shaft further includes multiple protrusions alternately arranged relative to multiple bottom structures provided on the shaft. The rotor assembly also includes multiple stacks of laminations disposed on the protrusions and dovetailed circumferentially around the shaft. The rotor assembly further includes multiple pairs of permanent magnets for generating a magnetic field, which interacts with the stator magnetic field to produce a torque. The multiple pairs of permanent magnets are disposed between the stacks. The rotor assembly also includes multiple middle wedges mounted between each pair of the multiple permanent magnets.

  9. Self-interaction of NPM1 modulates multiple mechanisms of liquid-liquid phase separation.

    PubMed

    Mitrea, Diana M; Cika, Jaclyn A; Stanley, Christopher B; Nourse, Amanda; Onuchic, Paulo L; Banerjee, Priya R; Phillips, Aaron H; Park, Cheon-Gil; Deniz, Ashok A; Kriwacki, Richard W

    2018-02-26

    Nucleophosmin (NPM1) is an abundant, oligomeric protein in the granular component of the nucleolus with roles in ribosome biogenesis. Pentameric NPM1 undergoes liquid-liquid phase separation (LLPS) via heterotypic interactions with nucleolar components, including ribosomal RNA (rRNA) and proteins which display multivalent arginine-rich linear motifs (R-motifs), and is integral to the liquid-like nucleolar matrix. Here we show that NPM1 can also undergo LLPS via homotypic interactions between its polyampholytic intrinsically disordered regions, a mechanism that opposes LLPS via heterotypic interactions. Using a combination of biophysical techniques, including confocal microscopy, SAXS, analytical ultracentrifugation, and single-molecule fluorescence, we describe how conformational changes within NPM1 control valency and switching between the different LLPS mechanisms. We propose that this newly discovered interplay between multiple LLPS mechanisms may influence the direction of vectorial pre-ribosomal particle assembly within, and exit from the nucleolus as part of the ribosome biogenesis process.

  10. Mediators and mechanisms of herpes simplex virus entry into ocular cells.

    PubMed

    Farooq, Asim V; Valyi-Nagy, Tibor; Shukla, Deepak

    2010-06-01

    The entry of herpes simplex virus into cells was once thought to be a general process. It is now understood that the virus is able to use multiple mechanisms for entry and spread, including the use of receptors and co-receptors that have been determined to be cell-type specific. This is certainly true for ocular cell types, which is important as the virus may use different mechanisms to gain access to multiple anatomic structures in close proximity, leading to various ocular diseases. There are some patterns that may be utilized by the virus in the eye and elsewhere, including surfing along filopodia in moving from cell to cell. There are common themes as well as intriguing differences in the entry mechanisms of herpes simplex virus into ocular cells. We discuss these issues in the context of conjunctivitis, keratitis, acute retinal necrosis, and other ocular diseases.

  11. Mediators and Mechanisms of Herpes Simplex Virus Entry into Ocular Cells

    PubMed Central

    Farooq, Asim V.; Valyi-Nagy, Tibor; Shukla, Deepak

    2010-01-01

    The entry of herpes simplex virus (HSV) into cells was once thought to be a general process. It is now understood that the virus is able to use multiple mechanisms for entry and spread, including the use of receptors and co-receptors that have been determined to be cell-type specific. This is certainly true for ocular cell types, which is important as the virus may use different mechanisms to gain access to multiple anatomic structures in close proximity, leading to various ocular diseases. There are some patterns that may be utilized by the virus in the eye and elsewhere, including surfing along filopodia in moving from cell to cell. There are common themes as well as intriguing differences in the entry mechanisms of HSV into ocular cells. We discuss these issues in the context of conjunctivitis, keratitis, acute retinal necrosis and other ocular diseases. PMID:20465436

  12. Chromosome catastrophes involve replication mechanisms generating complex genomic rearrangements

    PubMed Central

    Liu, Pengfei; Erez, Ayelet; Sreenath Nagamani, Sandesh C.; Dhar, Shweta U.; Kołodziejska, Katarzyna E.; Dharmadhikari, Avinash V.; Cooper, M. Lance; Wiszniewska, Joanna; Zhang, Feng; Withers, Marjorie A.; Bacino, Carlos A.; Campos-Acevedo, Luis Daniel; Delgado, Mauricio R.; Freedenberg, Debra; Garnica, Adolfo; Grebe, Theresa A.; Hernández-Almaguer, Dolores; Immken, LaDonna; Lalani, Seema R.; McLean, Scott D.; Northrup, Hope; Scaglia, Fernando; Strathearn, Lane; Trapane, Pamela; Kang, Sung-Hae L.; Patel, Ankita; Cheung, Sau Wai; Hastings, P. J.; Stankiewicz, Paweł; Lupski, James R.; Bi, Weimin

    2011-01-01

    Complex genomic rearrangements (CGR) consisting of two or more breakpoint junctions have been observed in genomic disorders. Recently, a chromosome catastrophe phenomenon termed chromothripsis, in which numerous genomic rearrangements are apparently acquired in one single catastrophic event, was described in multiple cancers. Here we show that constitutionally acquired CGRs share similarities with cancer chromothripsis. In the 17 CGR cases investigated we observed localization and multiple copy number changes including deletions, duplications and/or triplications, as well as extensive translocations and inversions. Genomic rearrangements involved varied in size and complexities; in one case, array comparative genomic hybridization revealed 18 copy number changes. Breakpoint sequencing identified characteristic features, including small templated insertions at breakpoints and microhomology at breakpoint junctions, which have been attributed to replicative processes. The resemblance between CGR and chromothripsis suggests similar mechanistic underpinnings. Such chromosome catastrophic events appear to reflect basic DNA metabolism operative throughout an organism’s life cycle. PMID:21925314

  13. The World Hypertension League: where now and where to in salt reduction

    PubMed Central

    Lackland, Daniel T.; Lisheng, Liu; Zhang, Xin-Hua; Nilsson, Peter M.; Niebylski, Mark L.

    2015-01-01

    High dietary salt is a leading risk for death and disability, largely by causing increased blood pressure. Other associated health risks include gastric and renal cell cancers, osteoporosis, renal stones, increased disease activity in multiple sclerosis, headache, increased body fat and Meniere’s disease. The World Hypertension League (WHL) has prioritized advocacy for salt reduction. WHL resources and actions include a non-governmental organization policy statement, a dietary salt fact sheet, development of standardized nomenclature, a call for quality research, collaboration in a weekly salt science update, development of a process to set recommended dietary salt research standards, regular literature reviews, development of adoptable PowerPoint slide sets to support WHL positions and resources, and critiques of weak research studies on dietary salt. The WHL plans to continue to work with multiple governmental and non-governmental organizations to promote dietary salt reduction towards the World Health Organization (WHO) recommendations. PMID:26090335

  14. Optical cavity furnace for semiconductor wafer processing

    DOEpatents

    Sopori, Bhushan L.

    2014-08-05

    An optical cavity furnace 10 having multiple optical energy sources 12 associated with an optical cavity 18 of the furnace. The multiple optical energy sources 12 may be lamps or other devices suitable for producing an appropriate level of optical energy. The optical cavity furnace 10 may also include one or more reflectors 14 and one or more walls 16 associated with the optical energy sources 12 such that the reflectors 14 and walls 16 define the optical cavity 18. The walls 16 may have any desired configuration or shape to enhance operation of the furnace as an optical cavity 18. The optical energy sources 12 may be positioned at any location with respect to the reflectors 14 and walls defining the optical cavity. The optical cavity furnace 10 may further include a semiconductor wafer transport system 22 for transporting one or more semiconductor wafers 20 through the optical cavity.

  15. Hybrid colored noise process with space-dependent switching rates

    NASA Astrophysics Data System (ADS)

    Bressloff, Paul C.; Lawley, Sean D.

    2017-07-01

    A fundamental issue in the theory of continuous stochastic processes is the interpretation of multiplicative white noise, often referred to as the Itô-Stratonovich dilemma. From a physical perspective, this reflects the need to introduce additional constraints in order to specify the nature of the noise, whereas from a mathematical perspective it reflects an ambiguity in the formulation of stochastic differential equations (SDEs). Recently, we identified a mechanism for obtaining an Itô SDE based on a form of temporal disorder. Motivated by switching processes in molecular biology, we considered a Brownian particle that randomly switches between two distinct conformational states with different diffusivities. In each state, the particle undergoes normal diffusion (additive noise), so there is no ambiguity in the interpretation of the noise. However, if the switching rates depend on position, then in the fast-switching limit one obtains Brownian motion with a space-dependent diffusivity of the Itô form. In this paper, we extend our theory to include colored additive noise. We show that the nature of the effective multiplicative noise process obtained by taking both the white-noise limit (κ → 0) and the fast-switching limit (ε → 0) depends on the order in which the two limits are taken. If the white-noise limit is taken first, we obtain Itô; if the fast-switching limit is taken first, we obtain Stratonovich. Moreover, the form of the effective diffusion coefficient differs in the two cases. The latter result holds even in the case of space-independent transition rates, where one obtains additive noise processes with different diffusion coefficients. Finally, we show that yet another form of multiplicative noise is obtained in the simultaneous limit ε, κ → 0 with ε/κ² fixed.
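
    A minimal simulation sketch (an illustration added here, not material from the record): a particle that switches between two conformational states with diffusivities D0 and D1 and position-dependent switching rates, in the spirit of the setup described above. The rate functions, parameter values, and the effective-diffusivity comparison noted in the comments are assumptions chosen for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    D = np.array([0.1, 1.0])  # diffusivities in states 0 and 1 (illustrative values)

    def rates(x):
        # Position-dependent switching rates: alpha (0 -> 1) and beta (1 -> 0); illustrative choice.
        alpha = 1.0 + 0.5 * np.tanh(x)
        beta = 1.0 - 0.5 * np.tanh(x)
        return alpha, beta

    def simulate(x0=0.0, T=5.0, dt=1e-3, eps=2e-2):
        # Euler-Maruyama for the position; thinned Markov jumps for the discrete state.
        # eps scales the switching time scale, so small eps corresponds to the fast-switching regime.
        x, s = x0, 0
        for _ in range(int(T / dt)):
            alpha, beta = rates(x)
            if s == 0 and rng.random() < alpha * dt / eps:
                s = 1
            elif s == 1 and rng.random() < beta * dt / eps:
                s = 0
            # Within a state the noise is additive, so this update needs no Ito/Stratonovich choice.
            x += np.sqrt(2.0 * D[s] * dt) * rng.standard_normal()
        return x

    # Ensemble statistics in the fast-switching regime can then be compared against an Ito SDE with the
    # occupancy-weighted diffusivity D_eff(x) = (beta(x)*D[0] + alpha(x)*D[1]) / (alpha(x) + beta(x)),
    # stated here as an assumption about the expected fast-switching behavior rather than a result quoted
    # from the record.
    samples = np.array([simulate() for _ in range(200)])
    print(samples.mean(), samples.var())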

  16. Data quality and processing for decision making: divergence between corporate strategy and manufacturing processes

    NASA Astrophysics Data System (ADS)

    McNeil, Ronald D.; Miele, Renato; Shaul, Dennis

    2000-10-01

    Information technology is driving improvements in manufacturing systems; the results are higher productivity and quality. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders, including employees, managers, executives, stockholders, boards, suppliers, and customers. It is also driven by information about competitors and emerging technology. Much of this information is based on the processing of data and the resulting biases of the processors; stakeholders can therefore base their inputs on faulty perceptions that are not grounded in reality. Prior to processing, the data used may themselves be inaccurate. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology, and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements in strategy formulation. The paper explores data collection, processing, and analysis from secondary and primary sources, information generation, and report presentation for strategy formulation, and contrasts this with the data and information used to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of possible divergence in the quality of decisions at the corporate level on IT-driven, quality-manufacturing processes based on measurable outcomes is significant. Recommendations for IT improvements at the corporate strategy level are given.

  17. Optical implementation of systolic array processing

    NASA Technical Reports Server (NTRS)

    Caulfield, H. J.; Rhodes, W. T.; Foster, M. J.; Horvitz, S.

    1981-01-01

    Algorithms for matrix-vector multiplication are implemented using acousto-optic cells for multiplication and input data transfer, and charge-coupled device (CCD) detector arrays for accumulation and output of the results. No two-dimensional matrix mask is required; matrix changes are implemented electronically. A system for multiplying a 50-component nonnegative real vector by a 50 x 50 nonnegative real matrix is described. Modifications for bipolar real and complex-valued processing are possible, as are extensions to matrix-matrix multiplication and multiplication of a vector by multiple matrices.
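
    A minimal digital analogue (an illustration added here, not a description of the optical hardware): at each time step one vector component effectively illuminates one column of the matrix, and the detector array integrates the products, so after all steps each accumulator holds one component of y = A x. The matrix, vector, and sizes below are assumptions chosen to mirror the 50 x 50 system described above.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 50
    A = rng.random((n, n))   # nonnegative real matrix, as in the 50 x 50 system
    x = rng.random(n)        # nonnegative real 50-component vector

    y = np.zeros(n)          # accumulators playing the role of the CCD detector bins
    for k in range(n):       # one "time step" per vector component fed to the acousto-optic cell
        y += A[:, k] * x[k]  # each bin integrates the product it receives at this step

    assert np.allclose(y, A @ x)  # accumulated result equals the matrix-vector product
    print(y[:5])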

  18. The emergence of coherence over the course of decision making.

    PubMed

    Simon, D; Pham, L B; Le, Q A; Holyoak, K J

    2001-09-01

    Previous research has indicated that decision making is accompanied by an increase in the coherence of assessments of the factors related to the decision alternatives. In the present study, the authors investigated whether this coherence shift is obtained before people commit to a decision, and whether it is obtained in the course of a number of other processing tasks. College students were presented with a complex legal case involving multiple conflicting arguments. Participants rated agreement with the individual arguments in isolation before seeing the case and after processing it under various initial sets, including playing the role of a judge assigned to decide the case. Coherence shifts were observed when participants were instructed to delay making the decision (Experiment 1), to memorize the case (Experiment 2), and to comprehend the case (Experiment 3). The findings support the hypothesis that a coherence-generating mechanism operates in a variety of processing tasks, including decision making.

  19. The Effect of Multiple Surface Treatments on Biological Properties of Ti-6Al-4V Alloy

    NASA Astrophysics Data System (ADS)

    Parsikia, Farhang; Amini, Pupak; Asgari, Sirous

    2014-09-01

    In this research, the effect of various surface treatments, including laser processing, grit blasting, and anodizing, on the chemical structure, surface topography, and bioactivity of Ti-6Al-4V was investigated. Six groups of samples were prepared by combinations of two alternative laser processes, grit blasting, and anodizing. Selected samples were first evaluated using microanalysis techniques and contact roughness testing and were then exposed to an in vitro environment. Scanning electron microscopy was used to characterize the corresponding final surface morphologies. Weight measurement and atomic absorption tests were employed to determine the bioactivity limits of the different surface conditions. Based on the data obtained in this study, low-energy laser processing generally yields a better biological response. The maximum bioactivity was attained in those samples exposed to a three-step treatment consisting of low-energy laser treatment followed by grit blasting and anodizing.

  20. MicroRNAs and the metabolic hallmarks of aging.

    PubMed

    Victoria, Berta; Nunez Lopez, Yury O; Masternak, Michal M

    2017-11-05

    Aging, the natural process of growing older, is characterized by a progressive deterioration of physiological homeostasis at the cellular, tissue, and organismal levels. Metabolically, the aging process is characterized by extensive changes in body composition, multi-tissue/multi-organ insulin resistance, and physiological declines in multiple signaling pathways, including growth hormone, insulin/insulin-like growth factor 1, and sex steroid regulation. With this review, we intend to consolidate published information about microRNAs that regulate critical metabolic processes relevant to aging. In certain instances we uncover relationships likely relevant to aging that have not been directly described before, such as the miR-451/AMPK axis. We have also included a provocative section highlighting the potential role in aging of a newly designated class of miRNAs, namely fecal miRNAs, recently discovered to regulate the intestinal microbiota in mammals. Copyright © 2016. Published by Elsevier B.V.
