Sample records for single comprehensive model

  1. The Comprehension and Validation of Social Information.

    ERIC Educational Resources Information Center

    Wyer, Robert S., Jr.; Radvansky, Gabriel A.

    1999-01-01

    Proposes a theory of social cognition to account for the comprehension and verification of social information. The theory views comprehension as a process of constructing situation models of new information on the basis of previously formed models about its referents. The comprehension of both single statements and multiple pieces of information…

  2. Prologue: Reading Comprehension Is Not a Single Ability.

    PubMed

    Catts, Hugh W; Kamhi, Alan G

    2017-04-20

    In this initial article of the clinical forum on reading comprehension, we argue that reading comprehension is not a single ability that can be assessed by one or more general reading measures or taught by a small set of strategies or approaches. We present evidence for a multidimensional view of reading comprehension that demonstrates how it varies as a function of reader ability, text, and task. The implications of this view for instruction of reading comprehension are considered. Reading comprehension is best conceptualized with a multidimensional model. The multidimensionality of reading comprehension means that instruction will be more effective when tailored to student performance with specific texts and tasks.

  3. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China

    PubMed Central

    Liu, Dong-jun; Li, Li

    2015-01-01

    For the issue of haze-fog, PM2.5 is the main influence factor of haze-fog pollution in China. The trend of PM2.5 concentration was analyzed from a qualitative point of view based on mathematical models and simulation in this study. The comprehensive forecasting model (CFM) was developed based on the combination forecasting ideas. Autoregressive Integrated Moving Average Model (ARIMA), Artificial Neural Networks (ANNs) model and Exponential Smoothing Method (ESM) were used to predict the time series data of PM2.5 concentration. The results of the comprehensive forecasting model were obtained by combining the results of three methods based on the weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou China was quantitatively forecasted based on the comprehensive forecasting model. The results were compared with those of three single models, and PM2.5 concentration values in the next ten days were predicted. The comprehensive forecasting model balanced the deviation of each single prediction method, and had better applicability. It broadens a new prediction method for the air quality forecasting field. PMID:26110332
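
    A minimal sketch of the entropy-weighting combination step described above (illustrative only, not the authors' code): the error matrix, forecast values, and the exact entropy-weight recipe are assumptions chosen just to show the mechanics of weighting ARIMA-, ANN- and ESM-type forecasts and summing them.

      import numpy as np

      def entropy_weights(errors):
          """Entropy weights for m forecasting methods from their absolute errors.

          errors: (n_samples, m) array, one column of absolute forecast errors per method.
          Returns an (m,) array of weights summing to 1.
          """
          p = (errors + 1e-12) / (errors + 1e-12).sum(axis=0)   # normalize each column
          n = errors.shape[0]
          entropy = -(p * np.log(p)).sum(axis=0) / np.log(n)     # entropy of each method, in [0, 1]
          divergence = 1.0 - entropy                             # degree of divergence
          return divergence / divergence.sum()

      # Hypothetical absolute errors of three methods (stand-ins for ARIMA, ANN, ESM)
      # over a short validation window of PM2.5 concentrations.
      errors = np.array([[4.0, 6.0, 5.0],
                         [3.5, 7.0, 4.5],
                         [5.0, 6.5, 5.5]])
      w = entropy_weights(errors)

      # Comprehensive forecast = entropy-weighted sum of the individual forecasts.
      forecasts = np.array([52.0, 58.0, 55.0])   # hypothetical next-day predictions
      print(w, float(w @ forecasts))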

  4. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China.

    PubMed

    Liu, Dong-jun; Li, Li

    2015-06-23

    For the issue of haze-fog, PM2.5 is the main influence factor of haze-fog pollution in China. The trend of PM2.5 concentration was analyzed from a qualitative point of view based on mathematical models and simulation in this study. The comprehensive forecasting model (CFM) was developed based on the combination forecasting ideas. Autoregressive Integrated Moving Average Model (ARIMA), Artificial Neural Networks (ANNs) model and Exponential Smoothing Method (ESM) were used to predict the time series data of PM2.5 concentration. The results of the comprehensive forecasting model were obtained by combining the results of three methods based on the weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou China was quantitatively forecasted based on the comprehensive forecasting model. The results were compared with those of three single models, and PM2.5 concentration values in the next ten days were predicted. The comprehensive forecasting model balanced the deviation of each single prediction method, and had better applicability. It broadens a new prediction method for the air quality forecasting field.

  5. Deliverable 2.4.4 -- Evaluation and single-well models for the demonstration wells, Class 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deo, Milind; Morgan, Craig D.

    2000-07-12

    Two single-well models were developed for Michelle Ute and Malnar Pike wells. The perforated intervals span thousands of feet in both the wells. Geological properties were calculated for all the perforated beds. The information was used to develop models for these two wells. These were comprehensive models since they took into account all the perforated beds.

  6. Case Study of an Institutionalized Urban Comprehensive School Physical Activity Program

    ERIC Educational Resources Information Center

    Doolittle, Sarah A.; Rukavina, Paul B.

    2014-01-01

    This single case study (Yin, 2009) compares an established urban physical education/ sport/physical activity program with two models: Comprehensive School Physical Activity Program/CSPAP (AAHPERD, 2013; CDC, 2013); and Lawson's propositions (2005) for sport, exercise and physical education for empowerment and community development to determine…

  7. Does Multimedia Support Individual Differences?--EFL Learners' Listening Comprehension and Cognitive Load

    ERIC Educational Resources Information Center

    Yang, Hui-Yu

    2014-01-01

    The present study examines how display model, English proficiency and cognitive preference affect English as a Foreign Language (EFL) learners' listening comprehension of authentic videos and cognitive load degree. EFL learners were randomly assigned to one of two groups. The control group received single coding and the experimental group received…

  8. Time-Dependent Traveling Wave Tube Model for Intersymbol Interference Investigations

    DTIC Science & Technology

    2001-06-01

    band is 5.7 degrees. C. Differences between broadband and single-tone excitations The TWT characteristics are compared when excited by single-tones...direct description of the effects of the TWT on modulated digital signals. The TWT model comprehensively takes into account the effects of frequency...of the high power amplifier and the operational digital signal. This method promises superior predictive fidelity compared to methods using TWT

  9. Linking population viability, habitat suitability, and landscape simulation models for conservation planning

    Treesearch

    Michael A. Larson; Frank R. Thompson III; Joshua J. Millspaugh; William D. Dijak; Stephen R. Shifley

    2004-01-01

    Methods for habitat modeling based on landscape simulations and population viability modeling based on habitat quality are well developed, but no published study of which we are aware has effectively joined them in a single, comprehensive analysis. We demonstrate the application of a population viability model for ovenbirds (Seiurus aurocapillus)...

  10. Building a comprehensive team for the longitudinal care of single ventricle heart defects: Building blocks and initial results.

    PubMed

    Texter, Karen; Davis, Jo Ann M; Phelps, Christina; Cheatham, Sharon; Cheatham, John; Galantowicz, Mark; Feltes, Timothy F

    2017-07-01

    With increasing survival of children with HLHS and other single ventricle lesions, the complexity of medical care for these patients is substantial. Establishing and adhering to best practice models may improve outcome, but requires careful coordination and monitoring. In 2013 our Heart Center began a process to build a comprehensive Single Ventricle Team designed to target these difficult issues. The comprehensive Single Ventricle Team was launched in 2014 to standardize care for children with single ventricle heart defects from diagnosis to adulthood within our institution. The team is a multidisciplinary group of providers committed to improving outcomes and quality of life for children with single ventricle heart defects, all functioning within the medical home of our heart center. Standards of care were developed and implemented in five target areas to standardize medical management and patient and family support. The team has cared for 100 patients, and since 2014 a decrease in interstage mortality for HLHS has been seen. Using a team approach and the tools of Quality Improvement, the team has reached high protocol compliance in each of these areas. This article describes the process of building a successful Single Ventricle Team, our initial results, and lessons learned. Additional study is ongoing to demonstrate the effects of these interventions on patient outcomes. © 2017 Wiley Periodicals, Inc.

  11. The Effects of an Intervention Combining Peer Tutoring with Story Mapping on the Text Comprehension of Struggling Readers: A Case Report

    ERIC Educational Resources Information Center

    Grünke, Matthias; Leidig, Tatjana

    2017-01-01

    This single-case study tested a peer tutoring model using a visualizing strategy (story mapping) to teach struggling students better text comprehension. Three teams each consisting of a tutor and a tutee attending a fourth-grade general education classroom participated in the experiment. A short series of observations was carried out before and…

  12. Extending Single-Molecule Microscopy Using Optical Fourier Processing

    PubMed Central

    2015-01-01

    This article surveys the recent application of optical Fourier processing to the long-established but still expanding field of single-molecule imaging and microscopy. A variety of single-molecule studies can benefit from the additional image information that can be obtained by modulating the Fourier, or pupil, plane of a widefield microscope. After briefly reviewing several current applications, we present a comprehensive and computationally efficient theoretical model for simulating single-molecule fluorescence as it propagates through an imaging system. Furthermore, we describe how phase/amplitude-modulating optics inserted in the imaging pathway may be modeled, especially at the Fourier plane. Finally, we discuss selected recent applications of Fourier processing methods to measure the orientation, depth, and rotational mobility of single fluorescent molecules. PMID:24745862

  13. Extending single-molecule microscopy using optical Fourier processing.

    PubMed

    Backer, Adam S; Moerner, W E

    2014-07-17

    This article surveys the recent application of optical Fourier processing to the long-established but still expanding field of single-molecule imaging and microscopy. A variety of single-molecule studies can benefit from the additional image information that can be obtained by modulating the Fourier, or pupil, plane of a widefield microscope. After briefly reviewing several current applications, we present a comprehensive and computationally efficient theoretical model for simulating single-molecule fluorescence as it propagates through an imaging system. Furthermore, we describe how phase/amplitude-modulating optics inserted in the imaging pathway may be modeled, especially at the Fourier plane. Finally, we discuss selected recent applications of Fourier processing methods to measure the orientation, depth, and rotational mobility of single fluorescent molecules.

  14. Comprehensive model of a hermetic reciprocating compressor

    NASA Astrophysics Data System (ADS)

    Yang, B.; Ziviani, D.; Groll, E. A.

    2017-08-01

    A comprehensive simulation model is presented to predict the performance of a hermetic reciprocating compressor and to reveal the underlying mechanisms when the compressor is running. The presented model is composed of sub-models simulating the in-cylinder compression process, piston ring/journal bearing frictional power loss, single phase induction motor and the overall compressor energy balance among different compressor components. The valve model, leakage through piston ring model and in-cylinder heat transfer model are also incorporated into the in-cylinder compression process model. A numerical algorithm solving the model is introduced. The predicted results of the compressor mass flow rate and input power consumption are compared to the published compressor map values. Future work will focus on detailed experimental validation of the model and parametric studies investigating the effects of structural parameters, including the stroke-to-bore ratio, on the compressor performance.
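
    The core of such a model is the in-cylinder compression sub-model. A minimal closed-cylinder sketch, assuming polytropic compression and slider-crank kinematics with hypothetical geometry (the full model adds valve, leakage, heat-transfer and motor sub-models; the polytropic exponent here is a placeholder):

      import numpy as np

      bore, stroke, conrod = 0.034, 0.022, 0.060    # m (hypothetical geometry)
      clearance_frac = 0.03                          # clearance volume / swept volume
      p_suction, n_poly = 1.0e5, 1.15                # suction pressure [Pa], polytropic exponent

      area = np.pi * bore**2 / 4
      v_swept = area * stroke
      v_clear = clearance_frac * v_swept
      r = stroke / 2

      theta = np.linspace(np.pi, 2 * np.pi, 181)     # crank angle, BDC -> TDC
      # Piston travel from TDC via slider-crank kinematics.
      x = r * (1 - np.cos(theta)) + conrod - np.sqrt(conrod**2 - (r * np.sin(theta))**2)
      volume = v_clear + area * x
      pressure = p_suction * (volume[0] / volume) ** n_poly   # p * V^n = const

      # In a full model a discharge-valve sub-model would cap this pressure at the
      # discharge line pressure well before TDC; the closed-cylinder peak is printed
      # only to show this sub-model's output.
      print(f"closed-cylinder peak pressure ~ {pressure.max() / 1e5:.1f} bar")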

  15. Training Inference Making Skills Using a Situation Model Approach Improves Reading Comprehension

    PubMed Central

    Bos, Lisanne T.; De Koning, Bjorn B.; Wassenburg, Stephanie I.; van der Schoot, Menno

    2016-01-01

    This study aimed to enhance third and fourth graders’ text comprehension at the situation model level. Therefore, we tested a reading strategy training developed to target inference making skills, which are widely considered to be pivotal to situation model construction. The training was grounded in contemporary literature on situation model-based inference making and addressed the source (text-based versus knowledge-based), type (necessary versus unnecessary for (re-)establishing coherence), and depth of an inference (making single lexical inferences versus combining multiple lexical inferences), as well as the type of searching strategy (forward versus backward). Results indicated that, compared to a control group (n = 51), children who followed the experimental training (n = 67) improved their inference making skills supportive to situation model construction. Importantly, our training also resulted in increased levels of general reading comprehension and motivation. In sum, this study showed that a ‘level of text representation’-approach can provide a useful framework to teach inference making skills to third and fourth graders. PMID:26913014

  16. "All in the Family" in Context: A Unified Model of Media Studies Applied to Television Criticism.

    ERIC Educational Resources Information Center

    Timberg, Bernard

    Proposing the use of a single comprehensive communications model, the Circles of Context Model, for all forms of communication, this paper shows how the model can be used to identify different kinds of criticism of the television comedy series "All in the Family" and the ways in which that criticism shifted during the show's nine-year…

  17. Computational modeling of the human auditory periphery: Auditory-nerve responses, evoked potentials and hearing loss.

    PubMed

    Verhulst, Sarah; Altoè, Alessandro; Vasilkov, Viacheslav

    2018-03-01

    Models of the human auditory periphery range from very basic functional descriptions of auditory filtering to detailed computational models of cochlear mechanics, inner-hair cell (IHC), auditory-nerve (AN) and brainstem signal processing. It is challenging to include detailed physiological descriptions of cellular components into human auditory models because single-cell data stems from invasive animal recordings while human reference data only exists in the form of population responses (e.g., otoacoustic emissions, auditory evoked potentials). To embed physiological models within a comprehensive human auditory periphery framework, it is important to capitalize on the success of basic functional models of hearing and render their descriptions more biophysical where possible. At the same time, comprehensive models should capture a variety of key auditory features, rather than fitting their parameters to a single reference dataset. In this study, we review and improve existing models of the IHC-AN complex by updating their equations and expressing their fitting parameters into biophysical quantities. The quality of the model framework for human auditory processing is evaluated using recorded auditory brainstem response (ABR) and envelope-following response (EFR) reference data from normal and hearing-impaired listeners. We present a model with 12 fitting parameters from the cochlea to the brainstem that can be rendered hearing impaired to simulate how cochlear gain loss and synaptopathy affect human population responses. The model description forms a compromise between capturing well-described single-unit IHC and AN properties and human population response features. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Comprehending expository texts: the dynamic neurobiological correlates of building a coherent text representation

    PubMed Central

    Swett, Katherine; Miller, Amanda C.; Burns, Scott; Hoeft, Fumiko; Davis, Nicole; Petrill, Stephen A.; Cutting, Laurie E.

    2013-01-01

    Little is known about the neural correlates of expository text comprehension. In this study, we sought to identify neural networks underlying expository text comprehension, how those networks change over the course of comprehension, and whether information central to the overall meaning of the text is functionally distinct from peripheral information. Seventeen adult subjects read expository passages while being scanned using functional magnetic resonance imaging (fMRI). By convolving phrase onsets with the hemodynamic response function (HRF), we were able to identify regions that increase and decrease in activation over the course of passage comprehension. We found that expository text comprehension relies on the co-activation of the semantic control network and regions in the posterior midline previously associated with mental model updating and integration [posterior cingulate cortex (PCC) and precuneus (PCU)]. When compared to single word comprehension, left PCC and left Angular Gyrus (AG) were activated only for discourse-level comprehension. Over the course of comprehension, reliance on the same regions in the semantic control network increased, while a parietal region associated with attention [intraparietal sulcus (IPS)] decreased. These results parallel previous findings in narrative comprehension that the initial stages of mental model building require greater visuospatial attention processes, while maintenance of the model increasingly relies on semantic integration regions. Additionally, we used an event-related analysis to examine phrases central to the text's overall meaning vs. peripheral phrases. It was found that central ideas are functionally distinct from peripheral ideas, showing greater activation in the PCC and PCU, while over the course of passage comprehension, central and peripheral ideas increasingly recruit different parts of the semantic control network. The finding that central information elicits greater response in mental model updating regions than peripheral ideas supports previous behavioral models on the cognitive importance of distinguishing textual centrality. PMID:24376411

  19. Professional School Counseling: A Handbook of Theories, Programs, and Practices. Third Edition

    ERIC Educational Resources Information Center

    Erford, Bradley T., Ed.

    2016-01-01

    "Professional School Counseling" is a comprehensive, single source for information about the critical issues facing school counselors today. This third edition of the Handbook integrates and expands on the changes brought about by the ASCA National Model. Revisions to each chapter reflect the influence of the model. Several new chapters…

  20. Single stage queueing/manufacturing system model that involves emission variable

    NASA Astrophysics Data System (ADS)

    Murdapa, P. S.; Pujawan, I. N.; Karningsih, P. D.; Nasution, A. H.

    2018-04-01

    Queueing occurs commonly in every industry, and the basic models of queueing theory give a foundation for modeling a manufacturing system. Carbon emission is now an important and inevitable issue because of its large environmental impact, yet existing queueing models applied to the analysis of single-stage manufacturing systems have not taken carbon emissions into consideration; applying them in a manufacturing context may therefore lead to improper decisions. Taking emission variables into account in queueing models not only makes the models more comprehensive but also creates awareness of the issue among the parties involved in the system. This paper discusses a single-stage M/M/1 queueing model that incorporates an emission variable, with the main objective of determining how carbon emissions can fit into basic queueing theory; it is intended as a starting point for more complex models. Incorporating emission variables transforms the traditional single-stage queue model into a model for calculating the production lot quantity allowed per period.
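
    A minimal sketch of the underlying M/M/1 measures with an emissions term attached (illustrative only; the per-job and per-waiting-time emission factors, and the way they enter, are assumptions rather than the paper's formulation):

      def mm1_with_emissions(lam, mu, e_proc=0.0, e_wait=0.0):
          """Standard M/M/1 measures plus a simple emissions account.

          lam    : arrival rate (jobs per unit time), must satisfy lam < mu
          mu     : service rate (jobs per unit time)
          e_proc : assumed emission per job processed (e.g., kg CO2)
          e_wait : assumed emission per waiting job per unit time (e.g., holding WIP)
          """
          if lam >= mu:
              raise ValueError("unstable queue: require lam < mu")
          rho = lam / mu                    # utilization
          L = rho / (1 - rho)               # mean number in system
          Lq = rho**2 / (1 - rho)           # mean number in queue
          W = 1 / (mu - lam)                # mean time in system
          Wq = rho / (mu - lam)             # mean time in queue
          emission_rate = lam * e_proc + Lq * e_wait   # emissions per unit time (hypothetical accounting)
          return {"rho": rho, "L": L, "Lq": Lq, "W": W, "Wq": Wq, "emissions": emission_rate}

      print(mm1_with_emissions(lam=8, mu=10, e_proc=1.2, e_wait=0.05))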

  1. Joint Models of Longitudinal and Time-to-Event Data with More Than One Event Time Outcome: A Review.

    PubMed

    Hickey, Graeme L; Philipson, Pete; Jorgensen, Andrea; Kolamunnage-Dona, Ruwanthi

    2018-01-31

    Methodological development and clinical application of joint models of longitudinal and time-to-event outcomes have grown substantially over the past two decades. However, much of this research has concentrated on a single longitudinal outcome and a single event time outcome. In clinical and public health research, patients who are followed up over time may often experience multiple, recurrent, or a succession of clinical events. Models that utilise such multivariate event time outcomes are quite valuable in clinical decision-making. We comprehensively review the literature for implementation of joint models involving more than a single event time per subject. We consider the distributional and modelling assumptions, including the association structure, estimation approaches, software implementations, and clinical applications. Research into this area is proving highly promising, but to date remains in its infancy.
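
    For orientation, the basic single-outcome joint model that such reviews generalize links a linear mixed longitudinal sub-model to a proportional-hazards sub-model through shared random effects (textbook form, not notation taken from the review):

      y_i(t) = x_i(t)^\top \beta + z_i(t)^\top b_i + \varepsilon_i(t),
          \qquad b_i \sim N(0, D), \quad \varepsilon_i(t) \sim N(0, \sigma^2)

      h_i(t) = h_0(t) \exp\{ w_i^\top \gamma + \alpha\, m_i(t) \},
          \qquad m_i(t) = x_i(t)^\top \beta + z_i(t)^\top b_i

    The multivariate extensions reviewed replace the single hazard h_i(t) with several event-specific or recurrent-event hazards that are associated through the shared random effects b_i.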

  2. Prologue: Reading Comprehension Is Not a Single Ability

    ERIC Educational Resources Information Center

    Catts, Hugh W.; Kamhi, Alan G.

    2017-01-01

    Purpose: In this initial article of the clinical forum on reading comprehension, we argue that reading comprehension is not a single ability that can be assessed by one or more general reading measures or taught by a small set of strategies or approaches. Method: We present evidence for a multidimensional view of reading comprehension that…

  3. Pre-Service Teachers Learn to Teach Geography: A Suggested Course Model

    ERIC Educational Resources Information Center

    Mitchell, Jerry T.

    2018-01-01

    How to improve geography education via teacher preparation programs has been a concern for nearly three decades, but few examples of a single, comprehensive university-level course exist. The purpose of this article is to share the model of a pre-service geography education methods course. Within the course, geography content (physical and social)…

  4. Continuous Evaluation of Fast Processes in Climate Models Using ARM Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhijin; Sha, Feng; Liu, Yangang

    2016-02-02

    This five-year award supports the project “Continuous Evaluation of Fast Processes in Climate Models Using ARM Measurements (FASTER)”. The goal of this project is to produce accurate, consistent and comprehensive data sets for initializing both single column models (SCMs) and cloud resolving models (CRMs) using data assimilation. A multi-scale three-dimensional variational data assimilation scheme (MS-3DVAR) has been implemented. This MS-3DVAR system is built on top of WRF/GSI. The Community Gridpoint Statistical Interpolation (GSI) system is an operational data assimilation system at the National Centers for Environmental Prediction (NCEP) and has been implemented in the Weather Research and Forecast (WRF) model. This MS-3DVAR is further enhanced by the incorporation of a land surface 3DVAR scheme and a comprehensive aerosol 3DVAR scheme. The data assimilation implementation focuses on the ARM SGP region. ARM measurements are assimilated along with other available satellite and radar data. Reanalyses are then generated for a few selected periods of time. This comprehensive data assimilation system has also been employed for other ARM-related applications.
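
    For reference, GSI-type systems obtain the analysis by minimizing the standard 3DVAR cost function below (textbook form; the multi-scale MS-3DVAR enhancement, which decomposes the analysis over successive scales, is not shown):

      J(\mathbf{x}) = \tfrac{1}{2} (\mathbf{x} - \mathbf{x}_b)^\top \mathbf{B}^{-1} (\mathbf{x} - \mathbf{x}_b)
                    + \tfrac{1}{2} \big( H(\mathbf{x}) - \mathbf{y} \big)^\top \mathbf{R}^{-1} \big( H(\mathbf{x}) - \mathbf{y} \big)

    where x_b is the background state, y the observations, H the observation operator, and B and R the background- and observation-error covariance matrices.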

  5. Comprehending Comprehension: Selected Possibilities for Clinical Practice Within a Multidimensional Model.

    PubMed

    Wallach, Geraldine P; Ocampo, Alaine

    2017-04-20

    In this discussion as part of a response to Catts and Kamhi's "Prologue: Reading Comprehension Is Not a Single Activity" (2017), the authors provide selected examples from 4th-, 5th-, and 6th-grade texts to demonstrate, in agreement with Catts and Kamhi, that reading comprehension is a multifaceted and complex ability. The authors were asked to provide readers with evidence-based practices that lend support to applications of a multidimensional model of comprehension. We present examples from the reading comprehension literature that support the notion that reading is a complex set of abilities that include a reader's ability, especially background knowledge; the type of text the reader is being asked to comprehend; and the task or technique used in assessment or intervention paradigms. An intervention session from 6th grade serves to demonstrate how background knowledge, a text's demands, and tasks may come together in the real world as clinicians and educators aim to help students comprehend complex material. The authors agree with the conceptual framework proposed by Catts and Kamhi that clinicians and educators should consider the multidimensional nature of reading comprehension (an interaction of reader, text, and task) when creating assessment and intervention programs. The authors might depart slightly by considering, more closely, those reading comprehension strategies that might facilitate comprehension across texts and tasks with an understanding of students' individual needs at different points in time.

  6. AdapChem

    NASA Technical Reports Server (NTRS)

    Oluwole, Oluwayemisi O.; Wong, Hsi-Wu; Green, William

    2012-01-01

    AdapChem software enables high efficiency, low computational cost, and enhanced accuracy on computational fluid dynamics (CFD) numerical simulations used for combustion studies. The software dynamically allocates smaller, reduced chemical models instead of the larger, full chemistry models to evolve the calculation while ensuring the same accuracy to be obtained for steady-state CFD reacting flow simulations. The software enables detailed chemical kinetic modeling in combustion CFD simulations. AdapChem adapts the reaction mechanism used in the CFD to the local reaction conditions. Instead of a single, comprehensive reaction mechanism throughout the computation, a dynamic distribution of smaller, reduced models is used to capture accurately the chemical kinetics at a fraction of the cost of the traditional single-mechanism approach.
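
    The adaptive idea can be illustrated with a short sketch (not the AdapChem implementation; the mechanism library and validity regions below are hypothetical): each cell is assigned the smallest reduced mechanism whose validity region covers the cell's local state, with the full mechanism as the fallback.

      # Hypothetical library of reduced mechanisms: (name, valid T range [K], max fuel mass fraction).
      REDUCED_MODELS = [
          ("inert_mix",   (300.0, 900.0),   1e-6),
          ("ignition_18", (900.0, 1500.0),  0.10),
          ("flame_45",    (1200.0, 2600.0), 1.00),
      ]
      FULL_MODEL = "full_mech"

      def select_mechanism(temperature, y_fuel):
          """Pick the cheapest reduced mechanism valid for the local (T, fuel fraction) state."""
          for name, (t_lo, t_hi), y_max in REDUCED_MODELS:
              if t_lo <= temperature <= t_hi and y_fuel <= y_max:
                  return name
          return FULL_MODEL

      # Hypothetical cell states: (temperature [K], fuel mass fraction).
      cells = [(350.0, 0.0), (1000.0, 0.05), (1800.0, 0.02), (2900.0, 0.01)]
      print([select_mechanism(T, y) for T, y in cells])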

  7. Comprehensive Model of Single Particle Pulverized Coal Combustion Extended to Oxy-Coal Conditions

    DOE PAGES

    Holland, Troy; Fletcher, Thomas H.

    2017-02-22

    Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive CFD simulations are valuable tools in evaluating and deploying oxy-fuel and other carbon capture technologies, either as retrofit technologies or for new construction, but accurate predictive simulations require physically realistic submodels with low computational requirements. In particular, comprehensive char oxidation and gasification models have been developed that describe multiple reaction and diffusion processes. Our work extends a comprehensive char conversion code (CCK), which treats surface oxidation and gasification reactions as well as processes such as film diffusion, pore diffusion, ash encapsulation, and annealing. In this work several submodels in the CCK code were updated with more realistic physics or otherwise extended to function in oxy-coal conditions. Improved submodels include the annealing model, the swelling model, the mode of burning parameter, and the kinetic model, as well as the addition of the chemical percolation devolatilization (CPD) model. We compare our results of the char combustion model to oxy-coal data, and further compare to parallel data sets near conventional conditions. A potential method to apply the detailed code in CFD work is given.
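
    At its simplest, the kind of char consumption rate that such comprehensive codes resolve can be summarized by a resistances-in-series expression (generic textbook form with first-order surface kinetics, not the CCK equations themselves):

      q = \frac{P_{\mathrm{O_2},\infty}}{\dfrac{1}{k_d} + \dfrac{1}{\eta\, k_s}}

    where q is the burning rate per unit external particle surface, P_O2,infinity the bulk oxidizer partial pressure, k_d the film (boundary-layer) mass-transfer coefficient, k_s the intrinsic surface rate coefficient, and eta an effectiveness factor for pore diffusion; CCK layers gasification reactions, ash encapsulation, swelling and annealing on top of this basic picture.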

  8. Comprehensive Model of Single Particle Pulverized Coal Combustion Extended to Oxy-Coal Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Troy; Fletcher, Thomas H.

    Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive CFD simulations are valuable tools in evaluating and deploying oxy-fuel and other carbon capture technologies, either as retrofit technologies or for new construction, but accurate predictive simulations require physically realistic submodels with low computational requirements. In particular, comprehensive char oxidation and gasification models have been developed that describe multiple reaction and diffusion processes. Our work extends a comprehensive char conversion code (CCK), which treats surface oxidation and gasification reactions as well as processes such as film diffusion, pore diffusion, ash encapsulation, and annealing. In this work several submodels in the CCK code were updated with more realistic physics or otherwise extended to function in oxy-coal conditions. Improved submodels include the annealing model, the swelling model, the mode of burning parameter, and the kinetic model, as well as the addition of the chemical percolation devolatilization (CPD) model. We compare our results of the char combustion model to oxy-coal data, and further compare to parallel data sets near conventional conditions. A potential method to apply the detailed code in CFD work is given.

  9. Multiple commodities in statistical microeconomics: Model and market

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Yu, Miao; Du, Xin

    2016-11-01

    A statistical generalization of microeconomics has been made in Baaquie (2013). In Baaquie et al. (2015), the market behavior of single commodities was analyzed and it was shown that market data provide strong support for the statistical microeconomic description of commodity prices. The case of multiple commodities is studied here, and a parsimonious generalization of the single-commodity model is made for the multiple-commodities case. Market data show that the generalization can accurately model the simultaneous correlation functions of up to four commodities. To accurately model five or more commodities, further terms have to be included in the model. This study shows that the statistical microeconomics approach is a comprehensive and complete formulation of microeconomics that is independent of the mainstream formulation.

  10. Improved word comprehension in Global aphasia using a modified semantic feature analysis treatment.

    PubMed

    Munro, Philippa; Siyambalapitiya, Samantha

    2017-01-01

    Limited research has investigated treatment of single word comprehension in people with aphasia, despite numerous studies examining treatment of naming deficits. This study employed a single case experimental design to examine efficacy of a modified semantic feature analysis (SFA) therapy in improving word comprehension in an individual with Global aphasia, who presented with a semantically based comprehension impairment. Ten treatment sessions were conducted over a period of two weeks. Following therapy, the participant demonstrated improved comprehension of treatment items and generalisation to control items, measured by performance on a spoken word picture matching task. Improvements were also observed on other language assessments (e.g. subtests of WAB-R; PALPA subtest 47) and were largely maintained over a period of 12 weeks without further therapy. This study provides support for the efficacy of a modified SFA therapy in remediating single word comprehension in individuals with aphasia with a semantically based comprehension deficit.

  11. A Study on the Employee Turnover Antecedents in ITES/BPO Sector

    ERIC Educational Resources Information Center

    Sree Rekha, K. R.; Kamalanabhan, T. J.

    2010-01-01

    This paper aims to test a conceptual model connecting variables of the internal and external work environment to ITES/BPO employee turnover. It is motivated by gaps identified in the literature: no single model explains in a comprehensive way why people choose to leave, and turnover studies on call centers located in India are lacking.…

  12. A Model for Determining Teaching Efficacy through the Use of Qualitative Single Subject Design, Student Learning Outcomes and Associative Statistics

    ERIC Educational Resources Information Center

    Osler, James Edward, II; Mansaray, Mahmud

    2014-01-01

    Many universities and colleges are increasingly concerned about enhancing the comprehension and knowledge of their students, particularly in the classroom. One method of enhancing student success is teaching effectiveness. The objective of this research paper is to propose a novel research model which examines the relationship between…

  13. Improving Student Services in Secondary Schools.

    ERIC Educational Resources Information Center

    Maddy-Bernstein, Carolyn; Cunanan, Esmeralda S.

    1995-01-01

    No single comprehensive student services delivery model exists, and "student services" terminology remains problematic. The Office of Student Services has defined student services as those services provided by educational institutions to facilitate learning and the successful transition from school to work, military, or more education. To be…

  14. What propels sexual murderers: a proposed integrated theory of social learning and routine activities theories.

    PubMed

    Chan, Heng Choon Oliver; Heide, Kathleen M; Beauregard, Eric

    2011-04-01

    Despite the great interest in the study of sexual homicide, little is known about the processes involved in an individual's becoming motivated to sexually kill, deciding to sexually kill, and acting on that desire, intention, and opportunity. To date, no comprehensive model of sexual murdering from the offending perspective has been proposed in the criminological literature. This article incorporates the works of Akers and Cohen and Felson regarding their social learning theory and routine activities theory, respectively, to construct an integrated conceptual offending framework in sexual homicide. This integrated model produces a stronger and more comprehensive explanation of sexual murder than any single theory currently available.

  15. ECOSYSTEM SERVICES AND BEYOND: INTEGRATION OF ECOSYSTEM SCIENCE AND MULTIMEDIA EXPOSURE MODELING FOR ENVIRONMENTAL PROTECTION

    EPA Science Inventory

    Decision-making for ecosystem protection and resource management requires an integrative science and technology applied with a sufficiently comprehensive systems approach. Single media (e.g., air, soil and water) approaches that evaluate aspects of an ecosystem in a stressor-by-...

  16. Arrhythmias Following Comprehensive Stage II Surgical Palliation in Single Ventricle Patients.

    PubMed

    Wilhelm, Carolyn M; Paulus, Diane; Cua, Clifford L; Kertesz, Naomi J; Cheatham, John P; Galantowicz, Mark; Fernandez, Richard P

    2016-03-01

    Post-operative arrhythmias are common in pediatric patients following cardiac surgery. Following hybrid palliation in single ventricle patients, a comprehensive stage II palliation is performed. The incidence of arrhythmias in patients following comprehensive stage II palliation is unknown. The purpose of this study is to determine the incidence of arrhythmias following comprehensive stage II palliation. A single-center retrospective chart review was performed on all single ventricle patients undergoing a comprehensive stage II palliation from January 2010 to May 2014. Pre-operative, operative, and post-operative data were collected. A clinically significant arrhythmia was defined as an arrhythmia which led to cardiopulmonary resuscitation or required treatment with either pacing or antiarrhythmic medication. Statistical analysis was performed with Wilcoxon rank-sum test and Fisher's exact test with p < 0.05 significant. Forty-eight single ventricle patients were reviewed (32 hypoplastic left heart syndrome, 16 other single ventricle variants). Age at surgery was 185 ± 56 days. Cardiopulmonary bypass time was 259 ± 45 min. Average vasoactive-inotropic score was 5.97 ± 7.58. Six patients (12.5 %) had clinically significant arrhythmias: four sinus bradycardia, one 2:1 atrioventricular block, and one slow junctional rhythm. No tachyarrhythmias were documented for this patient population. Presence of arrhythmia was associated with elevated lactate (p = 0.04) and cardiac arrest (p = 0.002). Following comprehensive stage II palliation, single ventricle patients are at low risk for development of tachyarrhythmias. The most frequent arrhythmia seen in these patients was sinus bradycardia associated with respiratory compromise.

  17. Development of simplified external control techniques for broad area semiconductor lasers

    NASA Technical Reports Server (NTRS)

    Davis, Christopher C.

    1993-01-01

    The goal of this project was to injection lock a 500 mW broad area laser diode (BAL) with a single mode low power laser diode with injection beam delivery through a single mode optical fiber (SMF). This task was completed successfully with the following significant accomplishments: (1) injection locking of a BAL through a single-mode fiber using a master oscillator and integrated miniature optics; (2) generation of a single-lobed, high-power far-field pattern from the injection-locked BAL that steers with drive current; and (3) a comprehensive theoretical analysis of a model that describes the observed behavior of the injection locked oscillator.

  18. Regulation of cell fate determination by single-repeat R3 MYB transcription factors in Arabidopsis

    PubMed Central

    Wang, Shucai; Chen, Jin-Gui

    2014-01-01

    MYB transcription factors regulate multiple aspects of plant growth and development. Among the large family of MYB transcription factors, single-repeat R3 MYBs are characterized by their short sequence (<120 amino acids) consisting largely of the single MYB DNA-binding repeat. In the model plant Arabidopsis, R3 MYBs mediate lateral inhibition during epidermal patterning and are best characterized for their regulatory roles in trichome and root hair development. R3 MYBs act as negative regulators for trichome formation but as positive regulators for root hair development. In this article, we provide a comprehensive review on the role of R3 MYBs in the regulation of cell type specification in the model plant Arabidopsis. PMID:24782874

  19. Regulation of Cell Fate Determination by Single-Repeat R3 MYB Transcription Factors in Arabidopsis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Shucai; Chen, Jay

    2014-01-01

    MYB transcription factors regulate multiple aspects of plant growth and development. Among the large family of MYB transcription factors, single-repeat R3 MYBs are characterized by their short sequence (<120 amino acids) consisting largely of the single MYB DNA-binding repeat. In the model plant Arabidopsis, R3 MYBs mediate lateral inhibition during epidermal patterning and are best characterized for their regulatory roles in trichome and root hair development. R3 MYBs act as negative regulators for trichome formation but as positive regulators for root hair development. In this article, we provide a comprehensive review on the role of R3 MYBs in the regulation of cell type specification in the model plant Arabidopsis.

  20. An experimentally validated network of nine haematopoietic transcription factors reveals mechanisms of cell state stability

    PubMed Central

    Schütte, Judith; Wang, Huange; Antoniou, Stella; Jarratt, Andrew; Wilson, Nicola K; Riepsaame, Joey; Calero-Nieto, Fernando J; Moignard, Victoria; Basilico, Silvia; Kinston, Sarah J; Hannah, Rebecca L; Chan, Mun Chiang; Nürnberg, Sylvia T; Ouwehand, Willem H; Bonzanni, Nicola; de Bruijn, Marella FTR; Göttgens, Berthold

    2016-01-01

    Transcription factor (TF) networks determine cell-type identity by establishing and maintaining lineage-specific expression profiles, yet reconstruction of mammalian regulatory network models has been hampered by a lack of comprehensive functional validation of regulatory interactions. Here, we report comprehensive ChIP-Seq, transgenic and reporter gene experimental data that have allowed us to construct an experimentally validated regulatory network model for haematopoietic stem/progenitor cells (HSPCs). Model simulation coupled with subsequent experimental validation using single cell expression profiling revealed potential mechanisms for cell state stabilisation, and also how a leukaemogenic TF fusion protein perturbs key HSPC regulators. The approach presented here should help to improve our understanding of both normal physiological and disease processes. DOI: http://dx.doi.org/10.7554/eLife.11469.001 PMID:26901438

  1. SSD for R: A Comprehensive Statistical Package to Analyze Single-System Data

    ERIC Educational Resources Information Center

    Auerbach, Charles; Schudrich, Wendy Zeitlin

    2013-01-01

    The need for statistical analysis in single-subject designs presents a challenge, as analytical methods that are applied to group comparison studies are often not appropriate in single-subject research. "SSD for R" is a robust set of statistical functions with wide applicability to single-subject research. It is a comprehensive package…

  2. Why does working memory capacity predict variation in reading comprehension? On the influence of mind wandering and executive attention.

    PubMed

    McVay, Jennifer C; Kane, Michael J

    2012-05-01

    Some people are better readers than others, and this variation in comprehension ability is predicted by measures of working memory capacity (WMC). The primary goal of this study was to investigate the mediating role of mind-wandering experiences in the association between WMC and normal individual differences in reading comprehension, as predicted by the executive-attention theory of WMC (e.g., Engle & Kane, 2004). We used a latent-variable, structural-equation-model approach, testing skilled adult readers on 3 WMC span tasks, 7 varied reading-comprehension tasks, and 3 attention-control tasks. Mind wandering was assessed using experimenter-scheduled thought probes during 4 different tasks (2 reading, 2 attention-control). The results support the executive-attention theory of WMC. Mind wandering across the 4 tasks loaded onto a single latent factor, reflecting a stable individual difference. Most important, mind wandering was a significant mediator in the relationship between WMC and reading comprehension, suggesting that the WMC-comprehension correlation is driven, in part, by attention control over intruding thoughts. We discuss implications for theories of WMC, attention control, and reading comprehension.

  3. Moving beyond the priming of single-language sentences: A proposal for a comprehensive model to account for linguistic representation in bilinguals.

    PubMed

    Kootstra, Gerrit Jan; Rossi, Eleonora

    2017-01-01

    In their target article, Branigan & Pickering (B&P) briefly discuss bilingual language representation, focusing primarily on cross-language priming between single-language sentences. We follow up on this discussion by showing how structural priming drives real-life phenomena of bilingual language use beyond the priming of unilingual sentences and by arguing that B&P's account should be extended with a representation for language membership.

  4. Neurobiological bases of reading comprehension: Insights from neuroimaging studies of word level and text level processing in skilled and impaired readers

    PubMed Central

    Landi, Nicole; Frost, Stephen J.; Menc, W. Einar; Sandak, Rebecca; Pugh, Kenneth R.

    2012-01-01

    For accurate reading comprehension, readers must first learn to map letters to their corresponding speech sounds and meaning and then they must string the meanings of many words together to form a representation of the text. Furthermore, readers must master the complexities involved in parsing the relevant syntactic and pragmatic information necessary for accurate interpretation. Failure in this process can occur at multiple levels and cognitive neuroscience has been helpful in identifying the underlying causes of success and failure in reading single words and in reading comprehension. In general, neurobiological studies of skilled reading comprehension indicate a highly overlapping language circuit for single word reading, reading comprehension and listening comprehension with largely quantitative differences in a number of reading and language related areas. This paper reviews relevant research from studies employing neuroimaging techniques to study reading with a focus on the relationship between reading skill, single word reading, and text comprehension. PMID:23662034

  5. Measurement of fatigue: Comparison of the reliability and validity of single-item and short measures to a comprehensive measure.

    PubMed

    Kim, Hee-Ju; Abraham, Ivo

    2017-01-01

    Evidence is needed on the clinicometric properties of single-item or short measures as alternatives to comprehensive measures. We examined whether two single-item fatigue measures (i.e., Likert scale, numeric rating scale) or a short fatigue measure were comparable to a comprehensive measure in reliability (i.e., internal consistency and test-retest reliability) and validity (i.e., convergent, concurrent, and predictive validity) in Korean young adults. For this quantitative study, we selected the Functional Assessment of Chronic Illness Therapy-Fatigue for the comprehensive measure and the Profile of Mood States-Brief, Fatigue subscale for the short measure; and constructed two single-item measures. A total of 368 students from four nursing colleges in South Korea participated. We used Cronbach's alpha and item-total correlation for internal consistency reliability and intraclass correlation coefficient for test-retest reliability. We assessed Pearson's correlation with a comprehensive measure for convergent validity, with perceived stress level and sleep quality for concurrent validity and the receiver operating characteristic curve for predictive validity. The short measure was comparable to the comprehensive measure in internal consistency reliability (Cronbach's alpha=0.81 vs. 0.88); test-retest reliability (intraclass correlation coefficient=0.66 vs. 0.61); convergent validity (r with comprehensive measure=0.79); concurrent validity (r with perceived stress=0.55, r with sleep quality=0.39) and predictive validity (area under curve=0.88). Single-item measures were not comparable to the comprehensive measure. A short fatigue measure exhibited similar levels of reliability and validity to the comprehensive measure in Korean young adults. Copyright © 2016 Elsevier Ltd. All rights reserved.
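
    Of the clinicometric statistics named above, internal consistency is the most easily reproduced by hand; a minimal sketch of Cronbach's alpha with made-up item scores (illustrative only, not the study's data):

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)        # variance of each item
          total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
          return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

      # Hypothetical responses of five people to a 4-item fatigue scale (higher = more fatigued).
      scores = [[2, 3, 2, 3],
                [4, 4, 5, 4],
                [1, 2, 1, 2],
                [3, 3, 4, 3],
                [5, 4, 5, 5]]
      print(round(cronbach_alpha(scores), 2))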

  6. New insights into the complex regulation of the glycolytic pathway in Lactococcus lactis. I. Construction and diagnosis of a comprehensive dynamic model.

    PubMed

    Dolatshahi, Sepideh; Fonseca, Luis L; Voit, Eberhard O

    2016-01-01

    This article and the companion paper use computational systems modeling to decipher the complex coordination of regulatory signals controlling the glycolytic pathway in the dairy bacterium Lactococcus lactis. In this first article, the development of a comprehensive kinetic dynamic model is described. The model is based on in vivo NMR data that consist of concentration trends in key glycolytic metabolites and cofactors. The model structure and parameter values are identified with a customized optimization strategy that uses as its core the method of dynamic flux estimation. For the first time, a dynamic model with a single parameter set fits all available glycolytic time course data under anaerobic operation. The model captures observations that had not been addressed so far and suggests the existence of regulatory effects that had been observed in other species, but not in L. lactis. The companion paper uses this model to analyze details of the dynamic control of glycolysis under aerobic and anaerobic conditions.

  7. Commentary on "Reading Comprehension Is Not a Single Ability": Implications for Child Language Intervention

    ERIC Educational Resources Information Center

    Ukrainetz, Teresa A.

    2017-01-01

    Purpose: This commentary responds to the implications for child language intervention of Catts and Kamhi's (2017) call to move from viewing reading comprehension as a single ability to recognizing it as a complex constellation of reader, text, and activity. Method: Reading comprehension, as Catts and Kamhi explain, is very complicated. In this…

  8. How accurate are lexile text measures?

    PubMed

    Stenner, A Jackson; Burdick, Hal; Sanford, Eleanor E; Burdick, Donald S

    2006-01-01

    The Lexile Framework for Reading models comprehension as the difference between a reader measure and a text measure. Uncertainty in comprehension rates results from unreliability in reader measures and inaccuracy in text readability measures. Whole-text processing eliminates sampling error in text measures. However, Lexile text measures are imperfect due to misspecification of the Lexile theory. The standard deviation component associated with theory misspecification is estimated at 64L for a standard-length passage (approximately 125 words). A consequence is that standard errors for longer texts (2,500 to 150,000 words) are measured on the Lexile scale with uncertainties in the single digits. Uncertainties in expected comprehension rates are largely due to imprecision in reader ability and not inaccuracies in text readabilities.
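
    The reader-minus-text logic can be sketched as a Rasch-style logistic (illustrative only: the scale constant below is a placeholder, not the published Lexile calibration, and is chosen so that a zero difference forecasts 75% comprehension):

      import math

      def forecast_comprehension(reader_lexile, text_lexile, scale=225.0):
          """Forecast comprehension rate from the reader-text difference (placeholder scale)."""
          d = (reader_lexile - text_lexile) / scale
          return 1.0 / (1.0 + math.exp(-(d + math.log(3.0))))

      print(round(forecast_comprehension(1000, 1000), 2))   # 0.75 by construction
      print(round(forecast_comprehension(1000, 1250), 2))   # harder text -> lower forecast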

  9. On the progenitors of Type Ia supernovae

    NASA Astrophysics Data System (ADS)

    Livio, Mario; Mazzali, Paolo

    2018-03-01

    We review all the models proposed for the progenitor systems of Type Ia supernovae and discuss the strengths and weaknesses of each scenario when confronted with observations. We show that all scenarios encounter at least a few serious difficulties, if taken to represent a comprehensive model for the progenitors of all Type Ia supernovae (SNe Ia). Consequently, we tentatively conclude that there is probably more than one channel leading to SNe Ia. While the single-degenerate scenario (in which a single white dwarf accretes mass from a normal stellar companion) has been studied in some detail, the other scenarios will need a similar level of scrutiny before any firm conclusions can be drawn.

  10. North Atlantic Coast Comprehensive Study Phase I: Statistical Analysis of Historical Extreme Water Levels with Sea Level Change

    DTIC Science & Technology

    2014-09-01

    The U.S. North Atlantic coast is subject to coastal flooding as a result of both severe extratropical storms (e.g., Nor’easters...Products and Services, excluding any kind of high-resolution hydrodynamic modeling. Tropical and extratropical storms were treated as a single...joint probability analysis and high-fidelity modeling of tropical and extratropical storms

  11. Developing a Comprehensive Model of Risk and Protective Factors That Can Predict Spelling at Age Seven: Findings from a Community Sample of Victorian Children

    ERIC Educational Resources Information Center

    Serry, Tanya Anne; Castles, Anne; Mensah, Fiona K.; Bavin, Edith L.; Eadie, Patricia; Pezic, Angela; Prior, Margot; Bretherton, Lesley; Reilly, Sheena

    2015-01-01

    The paper reports on a study designed to develop a risk model that best predicts single-word spelling in seven-year-old children from measures taken when they were aged 4 and 5. Test measures, personal characteristics and environmental influences were all considered as variables from a community sample of 971 children. Strong concurrent correlations were found…

  12. Cues, quantification, and agreement in language comprehension.

    PubMed

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  13. Indications of comprehensiveness in the pedagogical relationship: a design to be constructed in nursing education.

    PubMed

    Lima, Margarete Maria de; Reibnitz, Kenya Schmidt; Kloh, Daiana; Martini, Jussara Gue; Backes, Vania Marli Schubert

    2017-11-27

    To analyze how the indications of comprehensiveness translate into the teaching-learning process in a nursing undergraduate course. Qualitative case study carried out with professors of a Nursing Undergraduate Course. Data collection occurred through documentary analysis, non-participant observation and individual interviews. Data analysis was guided by an analytical matrix following the steps of the operative proposal. Eight professors participated in the study. Some indications of comprehensiveness, such as dialogue, listening, mutual respect, bonding and welcoming, are present in the daily life of some professors, and these professors apply them in the pedagogical relationship. The results point to the comprehensiveness of teaching-learning in a single- and double-loop model, in which the professor and the student assume an open posture toward new possibilities in the teaching-learning process. When recognized as a pedagogical principle, comprehensiveness allows the disruption of professor-centered teaching and advances collective learning, enabling professor and student to create their own design, anchored in a reflective process about their practices and the reality found in the health services.

  14. New Directions in U.S. National Security Strategy, Defense Plans, and Diplomacy -- A Review of Official Strategic Documents

    DTIC Science & Technology

    2011-07-01

    demand capabilities, a force-generation model that provides sufficient strategic depth, and a comprehensive study on the future balance between Active...career, and use of bonuses and credits to reward critical specialties and outstanding performance. They also include a continuum-of-service model that...development projects (for instance, the F–22) typically try to produce major leaps in technology and performance in a single step. A better model, it

  15. Ecosystem Risk Assessment Using the Comprehensive Assessment of Risk to Ecosystems (CARE) Tool

    NASA Astrophysics Data System (ADS)

    Battista, W.; Fujita, R.; Karr, K.

    2016-12-01

    Effective Ecosystem Based Management requires a localized understanding of the health and functioning of a given system as well as of the various factors that may threaten the ongoing ability of the system to support the provision of valued services. Several risk assessment models are available that can provide a scientific basis for understanding these factors and for guiding management action, but these models focus mainly on single species and evaluate only the impacts of fishing in detail. We have developed a new ecosystem risk assessment model - the Comprehensive Assessment of Risk to Ecosystems (CARE) - that allows analysts to consider the cumulative impact of multiple threats, interactions among multiple threats that may result in synergistic or antagonistic impacts, and the impacts of a suite of threats on whole-ecosystem productivity and functioning, as well as on specific ecosystem services. The CARE model was designed to be completed in as little as two hours, and uses local and expert knowledge where data are lacking. The CARE tool can be used to evaluate risks facing a single site; to compare multiple sites for the suitability or necessity of different management options; or to evaluate the effects of a proposed management action aimed at reducing one or more risks. This analysis can help users identify which threats are the most important at a given site, and therefore where limited management resources should be targeted. CARE can be applied to virtually any system, and can be modified as knowledge is gained or to better match different site characteristics. CARE builds on previous ecosystem risk assessment tools to provide a comprehensive assessment of fishing and non-fishing threats that can be used to inform environmental management decisions across a broad range of systems.

  16. Ecosystem Risk Assessment Using the Comprehensive Assessment of Risk to Ecosystems (CARE) Tool

    NASA Astrophysics Data System (ADS)

    Battista, W.; Fujita, R.; Karr, K.

    2016-02-01

    Effective Ecosystem Based Management requires a localized understanding of the health and functioning of a given system as well as of the various factors that may threaten the ongoing ability of the system to support the provision of valued services. Several risk assessment models are available that can provide a scientific basis for understanding these factors and for guiding management action, but these models focus mainly on single species and evaluate only the impacts of fishing in detail. We have developed a new ecosystem risk assessment model - the Comprehensive Assessment of Risk to Ecosystems (CARE) - that allows analysts to consider the cumulative impact of multiple threats, interactions among multiple threats that may result in synergistic or antagonistic impacts, and the impacts of a suite of threats on whole-ecosystem productivity and functioning, as well as on specific ecosystem services. The CARE model was designed to be completed in as little as two hours, and uses local and expert knowledge where data are lacking. The CARE tool can be used to evaluate risks facing a single site; to compare multiple sites for the suitability or necessity of different management options; or to evaluate the effects of a proposed management action aimed at reducing one or more risks. This analysis can help users identify which threats are the most important at a given site, and therefore where limited management resources should be targeted. CARE can be applied to virtually any system, and can be modified as knowledge is gained or to better match different site characteristics. CARE builds on previous ecosystem risk assessment tools to provide a comprehensive assessment of fishing and non-fishing threats that can be used to inform environmental management decisions across a broad range of systems.

  17. Mind wandering in text comprehension under dual-task conditions.

    PubMed

    Dixon, Peter; Li, Henry

    2013-01-01

    In two experiments, subjects responded to on-task probes while reading under dual-task conditions. The secondary task was to monitor the text for occurrences of the letter e. In Experiment 1, reading comprehension was assessed with a multiple-choice recognition test; in Experiment 2, subjects recalled the text. In both experiments, the secondary task replicated the well-known "missing-letter effect" in which detection of e's was less effective for function words and the word "the." Letter detection was also more effective when subjects were on task, but this effect did not interact with the missing-letter effect. Comprehension was assessed in both the dual-task conditions and in control single-task conditions. In the single-task conditions, both recognition (Experiment 1) and recall (Experiment 2) were better when subjects were on task, replicating previous research on mind wandering. Surprisingly, though, comprehension under dual-task conditions only showed an effect of being on task when measured with recall; there was no effect on recognition performance. Our interpretation of this pattern of results is that subjects generate responses to on-task probes on the basis of a retrospective assessment of the contents of working memory. Further, we argue that under dual-task conditions, the contents of working memory are not closely related to the reading processes required for accurate recognition performance. These conclusions have implications for models of text comprehension and for the interpretation of on-task probe responses.

  18. Mind wandering in text comprehension under dual-task conditions

    PubMed Central

    Dixon, Peter; Li, Henry

    2013-01-01

    In two experiments, subjects responded to on-task probes while reading under dual-task conditions. The secondary task was to monitor the text for occurrences of the letter e. In Experiment 1, reading comprehension was assessed with a multiple-choice recognition test; in Experiment 2, subjects recalled the text. In both experiments, the secondary task replicated the well-known “missing-letter effect” in which detection of e's was less effective for function words and the word “the.” Letter detection was also more effective when subjects were on task, but this effect did not interact with the missing-letter effect. Comprehension was assessed in both the dual-task conditions and in control single-task conditions. In the single-task conditions, both recognition (Experiment 1) and recall (Experiment 2) were better when subjects were on task, replicating previous research on mind wandering. Surprisingly, though, comprehension under dual-task conditions only showed an effect of being on task when measured with recall; there was no effect on recognition performance. Our interpretation of this pattern of results is that subjects generate responses to on-task probes on the basis of a retrospective assessment of the contents of working memory. Further, we argue that under dual-task conditions, the contents of working memory are not closely related to the reading processes required for accurate recognition performance. These conclusions have implications for models of text comprehension and for the interpretation of on-task probe responses. PMID:24101909

  19. Public injury prevention system in the Italian manufacturing sector: What types of inspection are more effective?

    PubMed

    Farina, Elena; Bena, Antonella; Fedeli, Ugo; Mastrangelo, Giuseppe; Veronese, Michela; Agnesi, Roberto

    2016-04-01

    Literature suggests that more research is needed to clarify the effect of workplace inspections by governmental officers on injury rates. This paper aims to compare comprehensive and partial inspections in Italian manufacturing companies. Survival analysis was applied to the period free from injuries following inspection by means of the Kaplan-Meier method and of Cox models. Kaplan-Meier curves show that, compared to companies with a partial inspection, companies which had a comprehensive inspection had a higher survival through the entire period. Adjusting for confounders, the Cox model confirms a significant preventive effect of comprehensive inspection for companies with 10-30 employees, but not for those with >30 employees. The results suggest that the effect on injuries is greater if all aspects of safety are addressed during the inspection instead of focusing on a single aspect. These findings are interesting because they can help in planning effective prevention activities. © 2016 Wiley Periodicals, Inc.
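
    For readers unfamiliar with the two survival methods named above, a minimal sketch using the Python lifelines package is given below; the column names and toy data are assumptions, not the study's dataset.

        # Toy Kaplan-Meier and Cox proportional-hazards sketch with lifelines.
        # Data are fabricated for illustration; variable names are assumptions.
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        df = pd.DataFrame({
            "months_to_injury": [6, 14, 24, 30, 8, 11, 26, 36, 5, 19, 22, 28],  # follow-up time
            "injured":          [1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0],           # 1 = injury observed
            "comprehensive":    [0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1],           # inspection type
            "employees":        [12, 25, 18, 40, 15, 33, 22, 10, 28, 16, 35, 20],
        })

        # Kaplan-Meier curve for the comprehensively inspected companies.
        kmf = KaplanMeierFitter()
        mask = df["comprehensive"] == 1
        kmf.fit(df.loc[mask, "months_to_injury"], df.loc[mask, "injured"],
                label="comprehensive inspection")
        print(kmf.survival_function_)

        # Cox model adjusting for company size (the study adjusted for further
        # confounders not reproduced here).
        cph = CoxPHFitter()
        cph.fit(df, duration_col="months_to_injury", event_col="injured")
        cph.print_summary()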

  20. Why Does Working Memory Capacity Predict Variation in Reading Comprehension? On the Influence of Mind Wandering and Executive Attention

    PubMed Central

    McVay, Jennifer C.; Kane, Michael J.

    2012-01-01

    Some people are better readers than others, and this variation in comprehension ability is predicted by measures of working memory capacity (WMC). The primary goal of this study was to investigate the mediating role of mind wandering experiences in the association between WMC and normal individual differences in reading comprehension, as predicted by the executive-attention theory of WMC (e.g., Engle & Kane, 2004). We used a latent-variable, structural-equation-model approach, testing skilled adult readers on three WMC span tasks, seven varied reading comprehension tasks, and three attention-control tasks. Mind wandering was assessed using experimenter-scheduled thought probes during four different tasks (two reading, two attention-control tasks). The results support the executive-attention theory of WMC. Mind wandering across the four tasks loaded onto a single latent factor, reflecting a stable individual difference. Most importantly, mind wandering was a significant mediator in the relationship between WMC and reading comprehension, suggesting that the WMC-comprehension correlation is driven, in part, by attention control over intruding thoughts. We discuss implications for theories of WMC, attention control, and reading comprehension. PMID:21875246
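
    As a rough illustration of the mediation logic (not the latent-variable SEM actually fit in the study), the product-of-coefficients approach can be sketched with two ordinary regressions; the variable names and simulated data are assumptions.

        # Toy product-of-coefficients mediation sketch:
        # WMC -> mind wandering -> reading comprehension.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        wmc = rng.normal(size=n)                                        # working memory capacity (z-scored)
        mind_wandering = -0.5 * wmc + rng.normal(scale=0.8, size=n)     # probe-caught off-task thoughts
        comprehension = 0.3 * wmc - 0.4 * mind_wandering + rng.normal(scale=0.8, size=n)

        # Path a: WMC -> mind wandering.
        a_model = sm.OLS(mind_wandering, sm.add_constant(wmc)).fit()
        # Path b and direct effect c': WMC + mind wandering -> comprehension.
        X = sm.add_constant(np.column_stack([wmc, mind_wandering]))
        b_model = sm.OLS(comprehension, X).fit()

        a = a_model.params[1]          # effect of WMC on mind wandering
        b = b_model.params[2]          # effect of mind wandering on comprehension
        c_prime = b_model.params[1]    # direct effect of WMC on comprehension
        print(f"indirect effect a*b = {a*b:.3f}, direct effect c' = {c_prime:.3f}")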

  1. Intraseasonal Variability in the Atmosphere-Ocean Climate System. Second Edition

    NASA Technical Reports Server (NTRS)

    Lau, William K. M.; Waliser, Duane E.

    2011-01-01

    Understanding and predicting the intraseasonal variability (ISV) of the ocean and atmosphere is crucial to improving long-range environmental forecasts and the reliability of climate change projections through climate models. This updated, comprehensive and authoritative second edition has a balance of observation, theory and modeling and provides a single source of reference for all those interested in this important multi-faceted natural phenomenon and its relation to major short-term climatic variations.

  2. Theoretical and material studies on thin-film electroluminescent devices

    NASA Technical Reports Server (NTRS)

    Summers, C. J.; Brennan, K. F.

    1986-01-01

    Electroluminescent materials and device technology were assessed. The evaluation strongly suggests the need for a comprehensive theoretical and experimental study of both materials and device structures, particularly in the following areas: carrier generation and multiplication; radiative and nonradiative processes of luminescent centers; device modeling; new device concepts; and single crystal materials growth and characterization. Modeling of transport properties of hot electrons in ZnSe and the generation of device concepts were initiated.

  3. SNP discovery and chromosome anchoring provide the first physically-anchored hexaploid oat map and reveal synteny with model species

    USDA-ARS?s Scientific Manuscript database

    For the first time in many years a comprehensive genome map for cultivated oat has been constructed using a combination of single nucleotide polymorphism (SNP) markers and validated with a collection of cytogenetically defined germplasm lines. The markers were able to help distinguish the three geno...

  4. Commentary on "Reading Comprehension Is Not a Single Ability": Implications for Child Language Intervention.

    PubMed

    Ukrainetz, Teresa A

    2017-04-20

    This commentary responds to the implications for child language intervention of Catts and Kamhi's (2017) call to move from viewing reading comprehension as a single ability to recognizing it as a complex constellation of reader, text, and activity. Reading comprehension, as Catts and Kamhi explain, is very complicated. In this commentary, I consider how comprehension has been taught and the directions in which it is moving. I consider how speech-language pathologists (SLPs), with their distinctive expertise and resources, can contribute to effective reading comprehension instruction. I build from Catts and Kamhi's emphasis on the importance of context and knowledge, using the approaches of staying on topic, close reading, and incorporating quality features of intervention. I consider whether and how SLPs should treat language skills and comprehension strategies to achieve noticeable changes in their students' reading comprehension. Within this multidimensional view of reading comprehension, SLPs can make strategic, meaningful contributions to improving the reading comprehension of students with language impairments.

  5. Versatile Analysis of Single-Molecule Tracking Data by Comprehensive Testing against Monte Carlo Simulations

    PubMed Central

    Wieser, Stefan; Axmann, Markus; Schütz, Gerhard J.

    2008-01-01

    We propose here an approach for the analysis of single-molecule trajectories which is based on a comprehensive comparison of an experimental data set with multiple Monte Carlo simulations of the diffusion process. It allows quantitative data analysis, particularly whenever analytical treatment of a model is infeasible. Simulations are performed on a discrete parameter space and compared with the experimental results by a nonparametric statistical test. The method provides a matrix of p-values that assess the probability for having observed the experimental data at each setting of the model parameters. We show the testing approach for three typical situations observed in the cellular plasma membrane: i), free Brownian motion of the tracer, ii), hop diffusion of the tracer in a periodic meshwork of squares, and iii), transient binding of the tracer to slowly diffusing structures. By plotting the p-value as a function of the model parameters, one can easily identify the most consistent parameter settings but also recover mutual dependencies and ambiguities which are difficult to determine by standard fitting routines. Finally, we used the test to reanalyze previous data obtained on the diffusion of the glycosylphosphatidylinositol-protein CD59 in the plasma membrane of the human T24 cell line. PMID:18805933
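
    The testing scheme described above can be sketched in a few lines: simulate trajectories over a discrete grid of candidate diffusion coefficients and compare a summary statistic of each simulated set against the experimental data with a nonparametric test, yielding one p-value per parameter setting. The statistic (squared step length), the grid, and the parameter values below are illustrative choices, not the paper's exact protocol.

        # Sketch of comparing experimental single-molecule steps against Monte
        # Carlo simulations on a discrete parameter grid (free Brownian motion only).
        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(1)
        dt = 0.01          # s, time lag between observations (assumed)
        D_true = 0.1       # um^2/s, "experimental" diffusion coefficient (assumed)

        def squared_steps(D, n=2000):
            """Squared 2D displacements for free Brownian motion with coefficient D."""
            dx = rng.normal(scale=np.sqrt(2 * D * dt), size=(n, 2))
            return (dx ** 2).sum(axis=1)

        experimental = squared_steps(D_true)

        # Discrete parameter grid; each setting gets a p-value from a KS test.
        D_grid = np.linspace(0.02, 0.3, 15)
        p_values = np.array([ks_2samp(experimental, squared_steps(D)).pvalue
                             for D in D_grid])

        for D, p in zip(D_grid, p_values):
            print(f"D = {D:.3f} um^2/s  ->  p = {p:.3f}")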

  6. Neural Basis of Action Understanding: Evidence from Sign Language Aphasia.

    PubMed

    Rogalsky, Corianne; Raphel, Kristin; Tomkovicz, Vivian; O'Grady, Lucinda; Damasio, Hanna; Bellugi, Ursula; Hickok, Gregory

    2013-01-01

    The neural basis of action understanding is a hotly debated issue. The mirror neuron account holds that motor simulation in fronto-parietal circuits is critical to action understanding including speech comprehension, while others emphasize the ventral stream in the temporal lobe. Evidence from speech strongly supports the ventral stream account, but on the other hand, evidence from manual gesture comprehension (e.g., in limb apraxia) has led to contradictory findings. Here we present a lesion analysis of sign language comprehension. Sign language is an excellent model for studying mirror system function in that it bridges the gap between the visual-manual system in which mirror neurons are best characterized and language systems which have represented a theoretical target of mirror neuron research. Twenty-one lifelong deaf signers with focal cortical lesions performed two tasks: one involving the comprehension of individual signs and the other involving comprehension of signed sentences (commands). Participants' lesions, as indicated on MRI or CT scans, were mapped onto a template brain to explore the relationship between lesion location and sign comprehension measures. Single sign comprehension was not significantly affected by left hemisphere damage. Sentence sign comprehension impairments were associated with left temporal-parietal damage. We found that damage to mirror-system-related regions in the left frontal lobe was not associated with deficits on either of these comprehension tasks. We conclude that the mirror system is not critically involved in action understanding.

  7. A comprehensive model of ion diffusion and charge exchange in the cold Io torus

    NASA Technical Reports Server (NTRS)

    Barbosa, D. D.; Moreno, M. A.

    1988-01-01

    A comprehensive analytic model of radial diffusion in the cold Io torus is developed. The model involves a generalized molecular cloud theory of SO2 and its dissociation fragments SO, O2, S, and O, which are formed at a relatively large rate by solar UV photodissociation of SO2. The key component of the new theory is SO, which can react with S(+) through a near-resonant charge exchange process that is exothermic. This provides a mechanism for the rapid depletion of singly ionized sulfur in the cold torus and can account for the large decrease in the total flux tube content inward of Io's orbit. The model is used to demonstrate quantitatively the effects of radial diffusion in a charge exchange environment that acts as a combined source and sink for ions in various charge states. A detailed quantitative explanation for the O(2+) component of the cold torus is given, and insight is derived into the workings of the so-called plasma 'ribbon'.

  8. Development of a comprehensive model for stakeholder management in mental healthcare.

    PubMed

    Bierbooms, Joyce; Van Oers, Hans; Rijkers, Jeroen; Bongers, Inge

    2016-06-20

    Purpose - Stakeholder management is not yet incorporated into the standard practice of most healthcare providers. The purpose of this paper is to assess the applicability of a comprehensive model for stakeholder management in a mental healthcare organization for more evidence-based (stakeholder) management. Design/methodology/approach - The assessment was performed in two research parts: the steps described in the model were executed in a single case study at a mental healthcare organization in the Netherlands; and a process and effect evaluation was done to find the supporting and impeding factors with regard to the applicability of the model. Interviews were held with managers and directors to evaluate the effectiveness of the model with a view to stakeholder management. Findings - The stakeholder analysis resulted in the identification of eight stakeholder groups. Different expectations were identified for each of these groups. The analysis of performance gaps revealed that stakeholders generally find the collaboration with a mental healthcare provider "sufficient." Finally a prioritization showed that five stakeholder groups were seen as "definite" stakeholders by the organization. Practical implications - The assessment of the model showed that it generated useful knowledge for more evidence-based (stakeholder) management. Adaptation of the model is needed to increase its feasibility in practice. Originality/value - Provided that the model is properly adapted for the specific field, the analysis can provide more knowledge on stakeholders and can help integrate stakeholder management as a comprehensive process in policy planning.

  9. The Effects of Visual Attention Span and Phonological Decoding in Reading Comprehension in Dyslexia: A Path Analysis.

    PubMed

    Chen, Chen; Schneps, Matthew H; Masyn, Katherine E; Thomson, Jennifer M

    2016-11-01

    Increasing evidence has shown visual attention span to be a factor, distinct from phonological skills, that explains single-word identification (pseudo-word/word reading) performance in dyslexia. Yet, little is known about how well visual attention span explains text comprehension. Observing reading comprehension in a sample of 105 high school students with dyslexia, we used a pathway analysis to examine the direct and indirect path between visual attention span and reading comprehension while controlling for other factors such as phonological awareness, letter identification, short-term memory, IQ and age. Integrating phonemic decoding efficiency skills in the analytic model, this study aimed to disentangle how visual attention span and phonological skills work together in reading comprehension for readers with dyslexia. We found visual attention span to have a significant direct effect on more difficult reading comprehension but not on an easier level. It also had a significant direct effect on pseudo-word identification but not on word identification. In addition, we found that visual attention span indirectly explains reading comprehension through pseudo-word reading and word reading skills. This study supports the hypothesis that at least part of the dyslexic profile can be explained by visual attention abilities. Copyright © 2016 John Wiley & Sons, Ltd.

  10. The core legion object model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, M.; Grimshaw, A.

    1996-12-31

    The Legion project at the University of Virginia is an architecture for designing and building system services that provide the illusion of a single virtual machine to users, a virtual machine that provides secure shared object and shared name spaces, application-adjustable fault-tolerance, improved response time, and greater throughput. Legion targets wide area assemblies of workstations, supercomputers, and parallel supercomputers. Legion tackles problems not solved by existing workstation-based parallel processing tools; the system will enable fault-tolerance, wide area parallel processing, inter-operability, heterogeneity, a single global name space, protection, security, efficient scheduling, and comprehensive resource management. This paper describes the core Legion object model, which specifies the composition and functionality of Legion's core objects: those objects that cooperate to create, locate, manage, and remove objects in the Legion system. The object model facilitates a flexible, extensible implementation, provides a single global name space, grants site autonomy to participating organizations, and scales to millions of sites and trillions of objects.

  11. The Genetic Architecture of Oral Language, Reading Fluency, and Reading Comprehension: A Twin Study From 7 to 16 Years

    PubMed Central

    2017-01-01

    This study examines the genetic and environmental etiology underlying the development of oral language and reading skills, and the relationship between them, over a long period of developmental time spanning middle childhood and adolescence. It focuses particularly on the differential relationship between language and two different aspects of reading: reading fluency and reading comprehension. Structural equation models were applied to language and reading data at 7, 12, and 16 years from the large-scale TEDS twin study. A series of multivariate twin models show a clear patterning of oral language with reading comprehension, as distinct from reading fluency: significant but moderate genetic overlap between oral language and reading fluency (genetic correlation rg = .46–.58 at 7, 12, and 16) contrasts with very substantial genetic overlap between oral language and reading comprehension (rg = .81–.87, at 12 and 16). This pattern is even clearer in a latent factors model, fit to the data aggregated across ages, in which a single factor representing oral language and reading comprehension is correlated with—but distinct from—a second factor representing reading fluency. A distinction between oral language and reading fluency is also apparent in different developmental trajectories: While the heritability of oral language increases over the period from 7 to 12 to 16 years (from h2 = .27 to .47 to .55), the heritability of reading fluency is high and largely stable over the same period of time (h2 = .73 to .71 to .64). PMID:28541066
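
    For orientation, the heritabilities reported above can be contrasted with the back-of-the-envelope Falconer estimate from twin correlations, h² = 2(rMZ − rDZ); the sketch below uses invented correlations and is not the structural-equation twin model the study actually fit.

        # Falconer's approximation of ACE variance components from twin correlations.
        # The correlations below are invented for illustration only.
        def falconer(r_mz: float, r_dz: float) -> dict:
            """Return rough ACE variance components from MZ and DZ twin correlations."""
            h2 = 2 * (r_mz - r_dz)          # additive genetic variance (A)
            c2 = r_mz - h2                  # shared environment (C)
            e2 = 1 - r_mz                   # non-shared environment (E)
            return {"h2": h2, "c2": c2, "e2": e2}

        print(falconer(r_mz=0.80, r_dz=0.45))   # hypothetical reading-fluency twin correlations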

  12. The genetic architecture of oral language, reading fluency, and reading comprehension: A twin study from 7 to 16 years.

    PubMed

    Tosto, Maria G; Hayiou-Thomas, Marianna E; Harlaar, Nicole; Prom-Wormley, Elizabeth; Dale, Philip S; Plomin, Robert

    2017-06-01

    This study examines the genetic and environmental etiology underlying the development of oral language and reading skills, and the relationship between them, over a long period of developmental time spanning middle childhood and adolescence. It focuses particularly on the differential relationship between language and two different aspects of reading: reading fluency and reading comprehension. Structural equation models were applied to language and reading data at 7, 12, and 16 years from the large-scale TEDS twin study. A series of multivariate twin models show a clear patterning of oral language with reading comprehension, as distinct from reading fluency: significant but moderate genetic overlap between oral language and reading fluency (genetic correlation rg = .46-.58 at 7, 12, and 16) contrasts with very substantial genetic overlap between oral language and reading comprehension (rg = .81-.87, at 12 and 16). This pattern is even clearer in a latent factors model, fit to the data aggregated across ages, in which a single factor representing oral language and reading comprehension is correlated with-but distinct from-a second factor representing reading fluency. A distinction between oral language and reading fluency is also apparent in different developmental trajectories: While the heritability of oral language increases over the period from 7 to 12 to 16 years (from h² = .27 to .47 to .55), the heritability of reading fluency is high and largely stable over the same period of time (h² = .73 to .71 to .64). (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brechenmacher, Laurent; Nguyen, Tran H.; Hixson, Kim K.

    Root hairs are a terminally differentiated single cell type, mainly involved in water and nutrient uptake from the soil. The soybean root hair cell represents an excellent model for the study of single cell systems biology. In this study, we identified 5702 proteins, with at least two peptides, from soybean root hairs using an accurate mass and time tag approach, establishing the most comprehensive proteome reference map of this single cell type. We also showed that trypsin is the most appropriate enzyme for soybean proteomic studies by performing an in silico digestion of the soybean proteome database using different proteases. Although the majority of proteins identified in this study are involved in basal metabolism, the functions of others are more related to root hair formation/function and include proteins involved in nutrient uptake (transporters) or vesicular trafficking (cytoskeleton and RAB proteins). Interestingly, some of these proteins appear to be specifically expressed in root hairs and constitute very good candidates for further studies to elucidate unique features of this single cell model.
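
    In silico tryptic digestion of the kind mentioned above follows a simple rule (cleave C-terminal to K or R, except before P); a minimal sketch is shown below, with an assumed peptide-length filter standing in for whatever criteria the study used.

        # Minimal in silico trypsin digestion: cut after K or R unless followed by P.
        def trypsin_digest(sequence: str, min_len: int = 6, max_len: int = 30):
            peptides, start = [], 0
            for i, aa in enumerate(sequence):
                cleave = aa in "KR" and (i + 1 == len(sequence) or sequence[i + 1] != "P")
                if cleave:
                    peptides.append(sequence[start:i + 1])
                    start = i + 1
            if start < len(sequence):
                peptides.append(sequence[start:])
            # Keep only peptides in a mass-spec-friendly length window (assumed filter).
            return [p for p in peptides if min_len <= len(p) <= max_len]

        demo = "MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFKDLGE"   # arbitrary toy sequence
        print(trypsin_digest(demo))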

  14. The Effect of Educational Software, Video Modelling and Group Discussion on Social-Skill Acquisition among Students with Mild Intellectual Disabilities

    ERIC Educational Resources Information Center

    Hetzroni, Orit E.; Banin, Irit

    2017-01-01

    Background: People with intellectual and developmental disabilities (IDD) often demonstrate difficulties in social skills. The purpose of this study was to examine the effects of a comprehensive intervention program on the acquisition of social skills among students with mild IDD. Method: Single subject multiple baseline design across situations…

  15. Weather impacts on single-vehicle truck crash injury severity.

    PubMed

    Naik, Bhaven; Tung, Li-Wei; Zhao, Shanshan; Khattak, Aemal J

    2016-09-01

    The focus of this paper is on illustrating the feasibility of aggregating data from disparate sources to investigate the relationship between single-vehicle truck crash injury severity and detailed weather conditions. Specifically, this paper presents: (a) a methodology that combines detailed 15-min weather station data with crash and roadway data, and (b) an empirical investigation of the effects of weather on crash-related injury severities of single-vehicle truck crashes. Random parameters ordinal and multinomial regression models were used to investigate crash injury severity under different weather conditions, taking into account the individual unobserved heterogeneity. The adopted methodology allowed consideration of environmental, roadway, and climate-related variables in single-vehicle truck crash injury severity. Results showed that wind speed, rain, humidity, and air temperature were linked with single-vehicle truck crash injury severity. Greater recorded wind speed added to the severity of injuries in single-vehicle truck crashes in general. Rain and warmer air temperatures were linked to more severe crash injuries in single-vehicle truck crashes while higher levels of humidity were linked to less severe injuries. Random parameters ordered logit and multinomial logit, respectively, revealed some individual heterogeneity in the data and showed that integrating comprehensive weather data with crash data provided useful insights into factors associated with single-vehicle truck crash injury severity. The research provided a practical method that combined comprehensive 15-min weather station data with crash and roadway data, thereby providing useful insights into crash injury severity of single-vehicle trucks. Those insights are useful for future truck driver educational programs and for truck safety in different weather conditions. Copyright © 2016 Elsevier Ltd and National Safety Council. All rights reserved.
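
    As a simplified illustration of the modelling approach (a fixed-parameters multinomial logit rather than the random-parameters specifications used in the study), the sketch below regresses a three-level injury severity outcome on weather covariates; the data and variable names are invented.

        # Toy multinomial logit for injury severity (0 = no injury, 1 = injury, 2 = severe).
        # Simplification: the study used random-parameters ordered/multinomial models.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 500
        wind = rng.gamma(2.0, 3.0, n)            # wind speed
        temp = rng.normal(15, 10, n)             # air temperature
        rain = rng.binomial(1, 0.3, n)           # rain indicator

        # Fabricated latent propensity for more severe injuries.
        score = 0.05 * wind + 0.02 * temp + 0.4 * rain + rng.logistic(size=n)
        severity = np.digitize(score, [0.8, 1.8])   # maps to categories 0, 1, 2

        X = sm.add_constant(np.column_stack([wind, temp, rain]))
        model = sm.MNLogit(severity, X).fit(disp=False)
        print(model.summary())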

  16. ULTRA: Universal Grammar as a Universal Parser

    PubMed Central

    Medeiros, David P.

    2018-01-01

    A central concern of generative grammar is the relationship between hierarchy and word order, traditionally understood as two dimensions of a single syntactic representation. A related concern is directionality in the grammar. Traditional approaches posit process-neutral grammars, embodying knowledge of language, put to use with infinite facility both for production and comprehension. This has crystallized in the view of Merge as the central property of syntax, perhaps its only novel feature. A growing number of approaches explore grammars with different directionalities, often with more direct connections to performance mechanisms. This paper describes a novel model of universal grammar as a one-directional, universal parser. Mismatch between word order and interpretation order is pervasive in comprehension; in the present model, word order is language-particular and interpretation order (i.e., hierarchy) is universal. These orders are not two dimensions of a unified abstract object (e.g., precedence and dominance in a single tree); rather, both are temporal sequences, and UG is an invariant real-time procedure (based on Knuth's stack-sorting algorithm) transforming word order into hierarchical order. This shift in perspective has several desirable consequences. It collapses linearization, displacement, and composition into a single performance process. The architecture provides a novel source of brackets (labeled unambiguously and without search), which are understood not as part-whole constituency relations, but as storage and retrieval routines in parsing. It also explains why neutral word order within single syntactic cycles avoids 213-like permutations. The model identifies cycles as extended projections of lexical heads, grounding the notion of phase. This is achieved with a universal processor, dispensing with parameters. The empirical focus is word order in noun phrases. This domain provides some of the clearest evidence for 213-avoidance as a cross-linguistic word order generalization. Importantly, recursive phrase structure “bottoms out” in noun phrases, which are typically a single cycle (though further cycles may be embedded, e.g., relative clauses). By contrast, a simple transitive clause plausibly involves two cycles (vP and CP), embedding further nominal cycles. In the present theory, recursion is fundamentally distinct from structure-building within a single cycle, and different word order restrictions might emerge in larger domains like clauses. PMID:29497394
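
    Knuth's stack-sorting procedure, which the ULTRA model builds on, can be sketched in a few lines: elements are pushed onto a stack in input (word) order and popped to the output whenever the incoming element is larger than the top of the stack, so a single pass recovers sorted (hierarchical) order exactly when the input avoids the forbidden permutation pattern. The sketch below is a generic textbook version, not the paper's linguistic implementation.

        # One pass of classic stack-sorting: push each element, popping smaller
        # stack tops to the output before a larger element arrives.
        def stack_sort(perm):
            stack, output = [], []
            for x in perm:
                # Pop everything smaller than the incoming element to the output.
                while stack and stack[-1] < x:
                    output.append(stack.pop())
                stack.append(x)
            while stack:
                output.append(stack.pop())
            return output

        print(stack_sort([3, 1, 2]))   # -> [1, 2, 3]: fully sorted in one pass
        print(stack_sort([2, 3, 1]))   # -> [2, 1, 3]: the forbidden pattern blocks sorting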

  17. Single-Event Effects in High-Frequency Linear Amplifiers: Experiment and Analysis

    NASA Astrophysics Data System (ADS)

    Zeinolabedinzadeh, Saeed; Ying, Hanbin; Fleetwood, Zachary E.; Roche, Nicolas J.-H.; Khachatrian, Ani; McMorrow, Dale; Buchner, Stephen P.; Warner, Jeffrey H.; Paki-Amouzou, Pauline; Cressler, John D.

    2017-01-01

    The single-event transient (SET) response of two different silicon-germanium (SiGe) X-band (8-12 GHz) low noise amplifier (LNA) topologies is fully investigated in this paper. The two LNAs were designed and implemented in a 130 nm SiGe HBT BiCMOS process technology. Two-photon absorption (TPA) laser pulses were utilized to induce transients within various devices in these LNAs. Impulse response theory is identified as a useful tool for predicting the settling behavior of the LNAs subjected to heavy ion strikes. Comprehensive device- and circuit-level modeling and simulations were performed to accurately simulate the behavior of the circuits under ion strikes. The simulations agree well with TPA measurements. The simulation, modeling and analysis presented in this paper can be applied to other circuit topologies for SET modeling and prediction.
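
    A rough sense of how impulse response theory predicts SET settling can be had from a discrete convolution: an assumed double-exponential photocurrent pulse convolved with a hypothetical first-order amplifier impulse response gives the output transient. Both waveforms below are illustrative, not the measured LNA characteristics.

        # Illustrative settling prediction: output = convolution of an assumed SET
        # current pulse with a hypothetical single-pole amplifier impulse response.
        import numpy as np

        dt = 1e-12                                  # 1 ps time step
        t = np.arange(0, 5e-9, dt)                  # 5 ns window

        # Double-exponential SET pulse (rise/fall constants are assumptions).
        pulse = np.exp(-t / 200e-12) - np.exp(-t / 20e-12)

        # First-order (single-pole) impulse response with an assumed 100 ps time constant.
        tau = 100e-12
        h = np.exp(-t / tau) / tau

        output = np.convolve(pulse, h)[: t.size] * dt   # discrete convolution, scaled by dt

        print(f"peak output {output.max():.3e} at t = {t[output.argmax()]*1e9:.2f} ns")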

  18. Desertification in the south Junggar Basin, 2000-2009: Part II. Model development and trend analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Miao; Lin, Yi

    2018-07-01

    The principal objective of desertification monitoring is to derive its development trend, which facilitates making policies in advance to handle its potential influences. Aiming at this goal, previous studies have proposed a large number of remote sensing (RS) based methods to retrieve multifold indicators, as reviewed in Part I. However, most of these indicators, each capable of characterizing only a single aspect of land attributes (e.g., albedo quantifying land surface reflectivity), cannot show a full picture of desertification processes, and few comprehensive RS-based models have been published. To fill this gap, this Part II was dedicated to developing an RS information model for comprehensively characterizing desertification and deriving its trend, based on the indicators retrieved in Part I for the same case of the south Junggar Basin, China over the last decade (2000-2009). The proposed model has three dominant component modules, i.e., the vegetation-relevant sub-model, the soil-relevant sub-model, and the water-relevant sub-model, which synthesize all of the retrieved indicators to reflect the processes of desertification integrally; based on the model-output indices, the desertification trends were derived using a least absolute deviation fitting algorithm. Tests indicated that the proposed model works and that the study area shows different development tendencies for different desertification levels. Overall, this Part II established a new comprehensive RS information model for desertification risk assessment and trend derivation, and the whole study, comprising Part I and Part II, advanced a relatively standard framework for RS-based desertification monitoring.
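
    The least absolute deviation fitting mentioned above can be sketched by minimizing the sum of absolute residuals of a linear trend; the yearly desertification-index values below are invented for illustration.

        # Least absolute deviation (LAD) fit of a linear trend to a yearly index series.
        import numpy as np
        from scipy.optimize import minimize

        years = np.arange(2000, 2010)
        index = np.array([0.42, 0.44, 0.43, 0.47, 0.49, 0.48, 0.52, 0.55, 0.54, 0.58])  # invented

        def lad_loss(params):
            slope, intercept = params
            return np.abs(index - (slope * (years - 2000) + intercept)).sum()

        # Initialize from an ordinary least-squares fit, then refine with Nelder-Mead.
        ols = np.polyfit(years - 2000, index, 1)
        res = minimize(lad_loss, x0=ols, method="Nelder-Mead")
        slope, intercept = res.x
        print(f"LAD trend: {slope:.4f} per year (intercept {intercept:.3f})")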

  19. FINAL REPORT (DE-FG02-97ER62338): Single-column modeling, GCM parameterizations, and ARM data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard C. J. Somerville

    2009-02-27

    Our overall goal is the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data at all three ARM sites, and the implementation and testing of these parameterizations in global models. To test recently developed prognostic parameterizations based on detailed cloud microphysics, we have compared SCM (single-column model) output with ARM observations at the SGP, NSA and TWP sites. We focus on the predicted cloud amounts and on a suite of radiative quantities strongly dependent on clouds, such as downwelling surface shortwave radiation. Our results demonstrate the superiority of parameterizations based on comprehensive treatments of cloud microphysics and cloud-radiative interactions. At the SGP and NSA sites, the SCM results simulate the ARM measurements well and are demonstrably more realistic than typical parameterizations found in conventional operational forecasting models. At the TWP site, the model performance depends strongly on details of the scheme, and the results of our diagnostic tests suggest ways to develop improved parameterizations better suited to simulating cloud-radiation interactions in the tropics generally. These advances have made it possible to take the next step and build on this progress, by incorporating our parameterization schemes in state-of-the-art three-dimensional atmospheric models, and diagnosing and evaluating the results using independent data. Because the improved cloud-radiation results have been obtained largely via implementing detailed and physically comprehensive cloud microphysics, we anticipate that improved predictions of hydrologic cycle components, and hence of precipitation, may also be achievable.

  20. Non-symmetric approach to single-screw expander and compressor modeling

    NASA Astrophysics Data System (ADS)

    Ziviani, Davide; Groll, Eckhard A.; Braun, James E.; Horton, W. Travis; De Paepe, M.; van den Broek, M.

    2017-08-01

    Single-screw type volumetric machines are employed both as compressors in refrigeration systems and, more recently, as expanders in organic Rankine cycle (ORC) applications. The single-screw machine is characterized by having a central grooved rotor and two mating toothed starwheels that isolate the working chambers. One of the main features of such a machine is related to the simultaneous occurrence of the compression or expansion processes on both sides of the main rotor, which results in a more balanced loading on the main shaft bearings compared with twin-screw machines. However, the meshing between starwheels and main rotor is a critical aspect as it heavily affects the volumetric performance of the machine. To allow flow interactions between the two sides of the rotor, a non-symmetric modelling approach has been established to obtain a more comprehensive model of the single-screw machine. The resulting mechanistic model includes in-chamber governing equations, leakage flow models, heat transfer mechanisms, viscous and mechanical losses. Forces and moments balances are used to estimate the loads on the main shaft bearings as well as on the starwheel bearings. An 11 kWe single-screw expander (SSE) adapted from an air compressor operating with R245fa as working fluid is used to validate the model. A total of 60 steady-state points at four different rotational speeds have been collected to characterize the performance of the machine. The maximum electrical power output and overall isentropic efficiency measured were 7.31 kW and 51.91%, respectively.

  1. Modeling, Analysis, and Impedance Design of Battery Energy Stored Single-Phase Quasi-Z Source Photovoltaic Inverter System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xue, Yaosuo

    The battery energy stored quasi-Z-source (BES-qZS) based photovoltaic (PV) power generation system combines advantages of the qZS inverter and the battery energy storage system. However, the second-harmonic (2ω) power ripple will degrade the system's performance and affect the system's design. An accurate model for analyzing the 2ω ripple is therefore very important. Existing models do not consider the battery and assume L1=L2 and C1=C2, which leads to a non-optimized design of the impedance parameters of the qZS network. This paper proposes a comprehensive model for the single-phase BES-qZS-PV inverter system, in which the battery is considered and no restriction is placed on L1, L2, C1, and C2. A BES-qZS impedance design method based on the built model is proposed to mitigate the 2ω ripple. Simulation and experimental results verify the proposed 2ω ripple model and design method.
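
    The double-frequency ripple discussed above comes straight from single-phase power flow: with v(t) = V·sin(ωt) and i(t) = I·sin(ωt − φ), the instantaneous power is p(t) = (VI/2)[cos φ − cos(2ωt − φ)], i.e., a constant term plus a component at 2ω. The short numeric check below uses arbitrary ratings, not the paper's hardware parameters.

        # Numeric check that single-phase instantaneous power contains a 2*omega ripple.
        import numpy as np

        V, I, phi = 325.0, 20.0, 0.2          # peak volts, peak amps, power-factor angle (assumed)
        f = 50.0                              # grid frequency, Hz (assumed)
        omega = 2 * np.pi * f
        t = np.linspace(0, 0.04, 4000)        # two fundamental cycles

        p = V * np.sin(omega * t) * I * np.sin(omega * t - phi)   # instantaneous power

        avg = p.mean()
        ripple_peak = (p - avg).max()
        print(f"average power  {avg:.1f} W")                          # ~ (V*I/2)*cos(phi)
        print(f"2nd-harmonic ripple amplitude ~ {ripple_peak:.1f} W")  # ~ V*I/2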

  2. A Successful Model for a Comprehensive Patient Flow Management Center at an Academic Health System.

    PubMed

    Lovett, Paris B; Illg, Megan L; Sweeney, Brian E

    2016-05-01

    This article reports on an innovative approach to managing patient flow at a multicampus academic health system, integrating multiple services into a single, centralized Patient Flow Management Center that manages supply and demand for inpatient services across the system. Control of bed management was centralized across 3 campuses and key services were integrated, including bed management, case management, environmental services, patient transport, ambulance and helicopter dispatch, and transfer center. A single technology platform was introduced, round-the-clock patient placement by critical care nurses was provided, and medical directors were added. Daily bed meetings with nurse managers and charge nurses drive action plans. This article reports immediate improvements in the first year of operations in emergency department walkouts, emergency department boarding, ambulance diversion, growth in transfer volume, reduction in lost transfers, reduction in time to bed assignment, and bed turnover time. The authors believe theirs is the first institution to integrate services and centralize bed management so comprehensively. © The Author(s) 2014.

  3. Comprehensive Genome Profiling of Single Sperm Cells by Multiple Annealing and Looping-Based Amplification Cycles and Next-Generation Sequencing from Carriers of Robertsonian Translocation.

    PubMed

    Sha, Yanwei; Sha, Yankun; Ji, Zhiyong; Ding, Lu; Zhang, Qing; Ouyang, Honggen; Lin, Shaobin; Wang, Xu; Shao, Lin; Shi, Chong; Li, Ping; Song, Yueqiang

    2017-03-01

    Robertsonian translocation (RT) is a common cause for male infertility, recurrent pregnancy loss, and birth defects. Studying meiotic recombination in RT-carrier patients helps decipher the mechanism and improve the clinical management of infertility and birth defects caused by RT. Here we present a new method to study spermatogenesis on a single-gamete basis from two RT carriers. By using a combined single-cell whole-genome amplification and sequencing protocol, we comprehensively profiled the chromosomal copy number of 88 single sperms from two RT-carrier patients. With the profiled information, chromosomal aberrations were identified on a whole-genome, per-sperm basis. We found that the previously reported interchromosomal effect might not exist with RT carriers. It is suggested that single-cell genome sequencing enables comprehensive chromosomal aneuploidy screening and provides a powerful tool for studying gamete generation from patients carrying chromosomal diseases. © 2017 John Wiley & Sons Ltd/University College London.

  4. Excess mortality in persons with severe mental disorders: a multilevel intervention framework and priorities for clinical practice, policy and research agendas

    PubMed Central

    Liu, Nancy H.; Daumit, Gail L.; Dua, Tarun; Aquila, Ralph; Charlson, Fiona; Cuijpers, Pim; Druss, Benjamin; Dudek, Kenn; Freeman, Melvyn; Fujii, Chiyo; Gaebel, Wolfgang; Hegerl, Ulrich; Levav, Itzhak; Munk Laursen, Thomas; Ma, Hong; Maj, Mario; Elena Medina‐Mora, Maria; Nordentoft, Merete; Prabhakaran, Dorairaj; Pratt, Karen; Prince, Martin; Rangaswamy, Thara; Shiers, David; Susser, Ezra; Thornicroft, Graham; Wahlbeck, Kristian; Fekadu Wassie, Abe; Whiteford, Harvey; Saxena, Shekhar

    2017-01-01

    Excess mortality in persons with severe mental disorders (SMD) is a major public health challenge that warrants action. The number and scope of truly tested interventions in this area remain limited, and strategies for implementation and scaling up of programmes with a strong evidence base are scarce. Furthermore, the majority of available interventions focus on a single or an otherwise limited number of risk factors. Here we present a multilevel model highlighting risk factors for excess mortality in persons with SMD at the individual, health system and socio‐environmental levels. Informed by that model, we describe a comprehensive framework that may be useful for designing, implementing and evaluating interventions and programmes to reduce excess mortality in persons with SMD. This framework includes individual‐focused, health system‐focused, and community level and policy‐focused interventions. Incorporating lessons learned from the multilevel model of risk and the comprehensive intervention framework, we identify priorities for clinical practice, policy and research agendas. PMID:28127922

  5. Wernicke's aphasia reflects a combination of acoustic-phonological and semantic control deficits: a case-series comparison of Wernicke's aphasia, semantic dementia and semantic aphasia.

    PubMed

    Robson, Holly; Sage, Karen; Ralph, Matthew A Lambon

    2012-01-01

    Wernicke's aphasia (WA) is the classical neurological model of comprehension impairment and, as a result, the posterior temporal lobe is assumed to be critical to semantic cognition. This conclusion is potentially confused by (a) the existence of patient groups with semantic impairment following damage to other brain regions (semantic dementia and semantic aphasia) and (b) an ongoing debate about the underlying causes of comprehension impairment in WA. By directly comparing these three patient groups for the first time, we demonstrate that the comprehension impairment in Wernicke's aphasia is best accounted for by dual deficits in acoustic-phonological analysis (associated with pSTG) and semantic cognition (associated with pMTG and angular gyrus). The WA group were impaired on both nonverbal and verbal comprehension assessments consistent with a generalised semantic impairment. This semantic deficit was most similar in nature to that of the semantic aphasia group suggestive of a disruption to semantic control processes. In addition, only the WA group showed a strong effect of input modality on comprehension, with accuracy decreasing considerably as acoustic-phonological requirements increased. These results deviate from traditional accounts which emphasise a single impairment and, instead, implicate two deficits underlying the comprehension disorder in WA. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Introduction to the Clinical Forum: Reading Comprehension Is Not a Single Ability.

    PubMed

    Gray, Shelley

    2017-04-20

    In this introduction to the clinical forum on reading comprehension, the Editor-in-Chief of Language, Speech, and Hearing Services in Schools provides data on our national reading comprehension problem, resources for increasing our understanding of reading comprehension, and a call to action for speech-language pathologists to work with educational teams to address poor reading comprehension in school-age children.

  7. The internationalization of health care: the UZ Brussel model for international partnerships.

    PubMed

    Noppen, Marc

    2012-01-01

    Globalization of health care, flat medicine, cross-border health care, and medical tourism are all terms describing some, but not all, aspects of a growing trend: patients seeking health care provision abroad, and health care providers travelling abroad for temporary or permanent health care delivery services. This trend is a complex, bilateral and multifaceted phenomenon which, in our opinion, cannot be captured in a single, comprehensive description. Individual hospitals have the unique opportunity to develop a model for appropriate action. The specific model created by the university hospital UZ Brussel is presented here.

  8. One-Tube-Only Standardized Site-Directed Mutagenesis: An Alternative Approach to Generate Amino Acid Substitution Collections

    PubMed Central

    Mingo, Janire; Erramuzpe, Asier; Luna, Sandra; Aurtenetxe, Olaia; Amo, Laura; Diez, Ibai; Schepens, Jan T. G.; Hendriks, Wiljan J. A. J.; Cortés, Jesús M.; Pulido, Rafael

    2016-01-01

    Site-directed mutagenesis (SDM) is a powerful tool to create defined collections of protein variants for experimental and clinical purposes, but effectiveness is compromised when a large number of mutations is required. We present here a one-tube-only standardized SDM approach that generates comprehensive collections of amino acid substitution variants, including scanning- and single site-multiple mutations. The approach combines unified mutagenic primer design with the mixing of multiple distinct primer pairs and/or plasmid templates to increase the yield of a single inverse-PCR mutagenesis reaction. Also, a user-friendly program for automatic design of standardized primers for Ala-scanning mutagenesis is made available. Experimental results were compared with a modeling approach together with stochastic simulation data. For single site-multiple mutagenesis purposes and for simultaneous mutagenesis in different plasmid backgrounds, combination of primer sets and/or plasmid templates in a single reaction tube yielded the distinct mutations in a stochastic fashion. For scanning mutagenesis, we found that a combination of overlapping primer sets in a single PCR reaction allowed the yield of different individual mutations, although this yield did not necessarily follow a stochastic trend. Double mutants were generated when the overlap of primer pairs was below 60%. Our results illustrate that one-tube-only SDM effectively reduces the number of reactions required in large-scale mutagenesis strategies, facilitating the generation of comprehensive collections of protein variants suitable for functional analysis. PMID:27548698

  9. Measuring Individual Performance with Comprehensive Bibliometric Reports as an Alternative to h-Index Values

    PubMed Central

    2018-01-01

    The h-index is frequently used to measure the performance of single scientists in Korea (and beyond). No single indicator alone, however, is able to provide a stable and complete assessment of performance. The Stata command bibrep.ado is introduced which automatically produces bibliometric reports for single researchers (senior researchers working in the natural or life sciences). The user of the command receives a comprehensive bibliometric report which can be used in research evaluation instead of the h-index. PMID:29713257
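
    Since the h-index is the yardstick being compared against, its definition is worth stating concretely: a researcher has index h if h of their papers have at least h citations each. A small sketch with an invented citation list follows; the bibrep.ado command itself is a Stata program and is not reproduced here.

        # h-index from a list of per-paper citation counts (invented example data).
        def h_index(citations):
            """Largest h such that at least h papers have >= h citations."""
            ranked = sorted(citations, reverse=True)
            h = 0
            for rank, c in enumerate(ranked, start=1):
                if c >= rank:
                    h = rank
                else:
                    break
            return h

        print(h_index([42, 17, 9, 8, 6, 3, 3, 1, 0]))   # -> 5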

  10. Comprehension and production of nouns and verbs in temporal lobe epilepsy.

    PubMed

    Yurchenko, Anna; Golovteev, Alexander; Kopachev, Dmitry; Dragoy, Olga

    2017-10-01

    Previous research on linguistic performance at the single-word level in patients with temporal lobe epilepsy (TLE) has mostly been limited to the comprehension and production of nouns, and findings have been inconsistent. Results are likewise limited and controversial regarding the lateralization of the epileptogenic focus. The present study investigates comprehension and production of nouns and verbs in patients with left and right TLE (12 in each group). We designed a comprehension (word-picture matching) test and a production (naming) test, matched on a range of psycholinguistic parameters for the two word classes. The results showed impaired verb comprehension in patients with left TLE and impaired noun and verb production in both groups of patients compared to the control group. Patients with left and right TLE differed significantly on verb comprehension and noun production, whereas verb production was equally impaired in the two groups of patients. These findings suggest difficulties with single-word processing in patients with both left and right TLE, which are more prominent for verbs than for nouns in patients with left TLE. The verb production (action naming) test turned out to be the most effective tool for assessing linguistic difficulties at the single-word level in patients with TLE. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Economic assessment of single-walled carbon nanotube processes

    NASA Astrophysics Data System (ADS)

    Isaacs, J. A.; Tanwani, A.; Healy, M. L.; Dahlben, L. J.

    2010-02-01

    The carbon nanotube market is steadily growing and projected to reach $1.9 billion by 2010. This study examines the economics of manufacturing single-walled carbon nanotubes (SWNT) using process-based cost models developed for arc, CVD, and HiPco processes. Using assumed input parameters, manufacturing costs are calculated for 1 g SWNT for arc, CVD, and HiPco, totaling $1,906, $1,706, and $485, respectively. For each SWNT process, the synthesis and filtration steps showed the highest costs, with direct labor as a primary cost driver. Reductions in production costs are calculated for increased working hours per day and for increased synthesis reaction yield (SRY) in each process. The process-based cost models offer a means for exploring opportunities for cost reductions, and provide a structured system for comparisons among alternative SWNT manufacturing processes. Further, the models can be used to comprehensively evaluate additional scenarios on the economics of environmental, health, and safety best manufacturing practices.

  12. Lane-changing behavior and its effect on energy dissipation using full velocity difference model

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Ding, Jian-Xun; Shi, Qin; Kühne, Reinhart D.

    2016-07-01

    In real urban traffic, roadways are usually multilane with lane-specific velocity limits. Most previous research derives from single-lane car-following theory, which has been extensively investigated and applied in past years. In this paper, we extend the continuous single-lane car-following model (full velocity difference model) to simulate lane-changing behavior on an urban roadway consisting of three lanes. To meet incentive and security requirements, a comprehensive lane-changing rule set is constructed, taking safety distance and velocity difference into consideration and setting a lane-specific speed restriction for each lane. We also investigate the effect of lane-changing behavior on the distribution of cars, velocity, headway, the fundamental diagram of traffic, and energy dissipation. Simulation results demonstrate an asymmetric lane-changing "attraction" on a roadway with lane-specific speed limits, which leads to dramatically increased energy dissipation.
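
    The full velocity difference (FVD) model referenced above updates each follower's acceleration as a(t) = κ[V(Δx) − v] + λ·Δv, where V(Δx) is an optimal-velocity function of the headway and Δv is the speed difference to the leader. A single-lane sketch is given below; the parameter values and optimal-velocity function are common textbook choices, not necessarily those used in this paper.

        # Single-lane full velocity difference (FVD) car-following update.
        import numpy as np

        kappa, lam = 0.41, 0.5          # sensitivity parameters (assumed)
        v_max, h_c = 15.0, 25.0         # optimal-velocity parameters (assumed)

        def optimal_velocity(headway):
            """Common tanh-form optimal-velocity function of the headway (m)."""
            return (v_max / 2) * (np.tanh(headway - h_c) + np.tanh(h_c))

        def fvd_step(x, v, dt=0.1):
            """Advance positions x and speeds v of a platoon by one time step."""
            headway = np.diff(x)                          # distance to the leader
            dv = np.diff(v)                               # speed difference to the leader
            a = kappa * (optimal_velocity(headway) - v[:-1]) + lam * dv
            v_new = v.copy()
            v_new[:-1] = np.clip(v[:-1] + a * dt, 0, None)
            return x + v_new * dt, v_new                  # lead car keeps its speed

        # Five cars, 30 m apart; followers start slower than the lead car (last entry).
        x = np.arange(0, 150, 30.0)
        v = np.array([10.0, 10.0, 10.0, 10.0, 14.0])
        for _ in range(200):
            x, v = fvd_step(x, v)
        print(np.round(v, 2))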

  13. Using the Job Burden-Capital Model of Occupational Stress to Predict Depression and Well-Being among Electronic Manufacturing Service Employees in China

    PubMed Central

    Wang, Chao; Li, Shuang; Li, Tao; Yu, Shanfa; Dai, Junming; Liu, Xiaoman; Zhu, Xiaojun; Ji, Yuqing; Wang, Jin

    2016-01-01

    Background: This study aimed to identify the association between occupational stress and depression-well-being by proposing a comprehensive and flexible job burden-capital model with its corresponding hypotheses. Methods: For this research, 1618 valid samples were gathered from the electronic manufacturing service industry in Hunan Province, China; self-rated questionnaires were administered to participants for data collection after obtaining their written consent. The proposed model was fitted and tested through structural equation model analysis. Results: Single-factor correlation analysis results indicated that coefficients between all items and dimensions had statistical significance. The final model demonstrated satisfactory global goodness of fit (CMIN/DF = 5.37, AGFI = 0.915, NNFI = 0.945, IFI = 0.952, RMSEA = 0.052). Both the measurement and structural models showed acceptable path loadings. Job burden and capital were directly associated with depression and well-being or indirectly related to them through personality. Multi-group structural equation model analyses indicated general applicability of the proposed model to basic features of such a population. Gender, marriage and education led to differences in the relation between occupational stress and health outcomes. Conclusions: The job burden-capital model of occupational stress-depression and well-being was found to be more systematic and comprehensive than previous models. PMID:27529267

  14. Using the Job Burden-Capital Model of Occupational Stress to Predict Depression and Well-Being among Electronic Manufacturing Service Employees in China.

    PubMed

    Wang, Chao; Li, Shuang; Li, Tao; Yu, Shanfa; Dai, Junming; Liu, Xiaoman; Zhu, Xiaojun; Ji, Yuqing; Wang, Jin

    2016-08-12

    This study aimed to identify the association between occupational stress and depression-well-being by proposing a comprehensive and flexible job burden-capital model with its corresponding hypotheses. For this research, 1618 valid samples were gathered from the electronic manufacturing service industry in Hunan Province, China; self-rated questionnaires were administered to participants for data collection after obtaining their written consent. The proposed model was fitted and tested through structural equation model analysis. Single-factor correlation analysis results indicated that coefficients between all items and dimensions had statistical significance. The final model demonstrated satisfactory global goodness of fit (CMIN/DF = 5.37, AGFI = 0.915, NNFI = 0.945, IFI = 0.952, RMSEA = 0.052). Both the measurement and structural models showed acceptable path loadings. Job burden and capital were directly associated with depression and well-being or indirectly related to them through personality. Multi-group structural equation model analyses indicated general applicability of the proposed model to basic features of such a population. Gender, marriage and education led to differences in the relation between occupational stress and health outcomes. The job burden-capital model of occupational stress-depression and well-being was found to be more systematic and comprehensive than previous models.

  15. The Impact of Gloss Types on Iranian EFL Students' Reading Comprehension and Lexical Retention

    ERIC Educational Resources Information Center

    Farvardin, Mohammad Taghi; Biria, Reza

    2012-01-01

    Research has shown that the effect of marginal glosses on reading comprehension and vocabulary retention is a controversial issue. The purpose of this study was to investigate this issue among Iranian university EFL students. Three types of glosses were applied in this study: single gloss in participants' first language (SL1G), single gloss in…

  16. Comprehensive Fuel Spray Modeling and Impacts on Chamber Acoustics in Combustion Dynamics Simulations

    DTIC Science & Technology

    2013-05-01

    multiple swirler configurations and fuel injector locations at atmospheric pressure conditions. Both single-element and multiple-element LDI...the swirl number, Reynolds' number and injector location in the LDI element. Besides the multi-phase flow characteristics, several experimental...region downstream of the fuel injector on account of a stable and compact precessing vortex core. Recent experiments conducted by the Purdue group have

  17. Comprehensive Modeling and Analysis of Rotorcraft Variable Speed Propulsion System With Coupled Engine/Transmission/Rotor Dynamics

    NASA Technical Reports Server (NTRS)

    DeSmidt, Hans A.; Smith, Edward C.; Bill, Robert C.; Wang, Kon-Well

    2013-01-01

    This project develops comprehensive modeling and simulation tools for analysis of variable rotor speed helicopter propulsion system dynamics. The Comprehensive Variable-Speed Rotorcraft Propulsion Modeling (CVSRPM) tool developed in this research is used to investigate coupled rotor/engine/fuel control/gearbox/shaft/clutch/flight control system dynamic interactions for several variable rotor speed mission scenarios. In this investigation, a prototypical two-speed Dual-Clutch Transmission (DCT) is proposed and designed to achieve 50 percent rotor speed variation. The comprehensive modeling tool developed in this study is utilized to analyze the two-speed shift response of both a conventional single rotor helicopter and a tiltrotor drive system. In the tiltrotor system, both a Parallel Shift Control (PSC) strategy and a Sequential Shift Control (SSC) strategy for constant and variable forward speed mission profiles are analyzed. Under the PSC strategy, selecting clutch shift-rate results in a design tradeoff between transient engine surge margins and clutch frictional power dissipation. In the case of SSC, clutch power dissipation is drastically reduced in exchange for the necessity to disengage one engine at a time which requires a multi-DCT drive system topology. In addition to comprehensive simulations, several sections are dedicated to detailed analysis of driveline subsystem components under variable speed operation. In particular an aeroelastic simulation of a stiff in-plane rotor using nonlinear quasi-steady blade element theory was conducted to investigate variable speed rotor dynamics. It was found that 2/rev and 4/rev flap and lag vibrations were significant during resonance crossings with 4/rev lagwise loads being directly transferred into drive-system torque disturbances. To capture the clutch engagement dynamics, a nonlinear stick-slip clutch torque model is developed. Also, a transient gas-turbine engine model based on first principles mean-line compressor and turbine approximations is developed. Finally an analysis of high frequency gear dynamics including the effect of tooth mesh stiffness variation under variable speed operation is conducted including experimental validation. Through exploring the interactions between the various subsystems, this investigation provides important insights into the continuing development of variable-speed rotorcraft propulsion systems.

  18. Single and double production of the Higgs boson at hadron and lepton colliders in minimal composite Higgs models

    NASA Astrophysics Data System (ADS)

    Kanemura, Shinya; Kaneta, Kunio; Machida, Naoki; Odori, Shinya; Shindou, Tetsuo

    2016-07-01

    In composite Higgs models, originally proposed by Georgi and Kaplan, the Higgs boson is a pseudo Nambu-Goldstone boson (pNGB) arising from the spontaneous breaking of a global symmetry. In the minimal version of such models, a global SO(5) symmetry is spontaneously broken to SO(4), and the pNGBs form an isospin doublet field, which corresponds to the Higgs doublet in the Standard Model (SM). Predicted coupling constants of the Higgs boson can in general deviate from the SM predictions, depending on the compositeness parameter. The deviation pattern is also determined by the details of the matter sector. We comprehensively study how the model can be tested by measuring single and double production processes of the Higgs boson at the LHC and future electron-positron colliders. The possibility of distinguishing the matter sector among the minimal composite Higgs models is also discussed. In addition, we point out differences in the cross section of double Higgs boson production from the predictions of other new physics models.
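
    For orientation, the coupling deviations referred to above are commonly parametrized by the compositeness parameter ξ; in the usual conventions (which the paper may refine) the hVV coupling modifier is common to the minimal models, while the fermion coupling modifier depends on the matter embedding (e.g., MCHM4 versus MCHM5):

```latex
\xi \equiv \frac{v^{2}}{f^{2}}, \qquad
\kappa_{V} \equiv \frac{g_{hVV}}{g_{hVV}^{\mathrm{SM}}} = \sqrt{1-\xi}, \qquad
\kappa_{F}^{\mathrm{MCHM4}} = \sqrt{1-\xi}, \qquad
\kappa_{F}^{\mathrm{MCHM5}} = \frac{1-2\xi}{\sqrt{1-\xi}} .
```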

  19. Multi-model inference for incorporating trophic and climate uncertainty into stock assessments

    NASA Astrophysics Data System (ADS)

    Ianelli, James; Holsman, Kirstin K.; Punt, André E.; Aydin, Kerim

    2016-12-01

    Ecosystem-based fisheries management (EBFM) approaches allow a broader and more extensive consideration of objectives than is typically possible with conventional single-species approaches. Ecosystem linkages may include trophic interactions and climate change effects on productivity for the relevant species within the system. Presently, models are evolving to include a comprehensive set of fishery and ecosystem information to address these broader management considerations. The increased scope of EBFM approaches is accompanied with a greater number of plausible models to describe the systems. This can lead to harvest recommendations and biological reference points that differ considerably among models. Model selection for projections (and specific catch recommendations) often occurs through a process that tends to adopt familiar, often simpler, models without considering those that incorporate more complex ecosystem information. Multi-model inference provides a framework that resolves this dilemma by providing a means of including information from alternative, often divergent models to inform biological reference points and possible catch consequences. We apply an example of this approach to data for three species of groundfish in the Bering Sea: walleye pollock, Pacific cod, and arrowtooth flounder using three models: 1) an age-structured "conventional" single-species model, 2) an age-structured single-species model with temperature-specific weight at age, and 3) a temperature-specific multi-species stock assessment model. The latter two approaches also include consideration of alternative future climate scenarios, adding another dimension to evaluate model projection uncertainty. We show how Bayesian model-averaging methods can be used to incorporate such trophic and climate information to broaden single-species stock assessments by using an EBFM approach that may better characterize uncertainty.
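
    A minimal sketch of the kind of Bayesian model averaging described, assuming three candidate models that each produce an estimate of a biological reference point together with an (assumed) log marginal likelihood; the numbers are illustrative only, and a full treatment would also propagate within-model uncertainty.

```python
import numpy as np

# Hypothetical per-model outputs: a reference-point estimate and an assumed
# log marginal likelihood for each of the three models (values illustrative).
estimates = np.array([0.42, 0.38, 0.51])
log_marglik = np.array([-1201.3, -1198.7, -1199.9])

# Posterior model probabilities under equal prior model weights
w = np.exp(log_marglik - log_marglik.max())
w /= w.sum()

bma_estimate = np.sum(w * estimates)
between_model_var = np.sum(w * (estimates - bma_estimate) ** 2)
print(w, bma_estimate, between_model_var)
```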

  20. Single-process versus multiple-strategy models of decision making: evidence from an information intrusion paradigm.

    PubMed

    Söllner, Anke; Bröder, Arndt; Glöckner, Andreas; Betsch, Tilmann

    2014-02-01

    When decision makers are confronted with different problems and situations, do they use a uniform mechanism as assumed by single-process models (SPMs) or do they choose adaptively from a set of available decision strategies as multiple-strategy models (MSMs) imply? Both frameworks of decision making have gathered a lot of support, but only rarely have they been contrasted with each other. Employing an information intrusion paradigm for multi-attribute decisions from givens, SPM and MSM predictions on information search, decision outcomes, attention, and confidence judgments were derived and tested against each other in two experiments. The results consistently support the SPM view: Participants seemingly using a "take-the-best" (TTB) strategy do not ignore TTB-irrelevant information as MSMs would predict, but adapt the amount of information searched, choose alternative choice options, and show varying confidence judgments contingent on the quality of the "irrelevant" information. The uniformity of these findings underlines the adequacy of the novel information intrusion paradigm and comprehensively promotes the notion of a uniform decision making mechanism as assumed by single-process models. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
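
    For concreteness, the "take-the-best" strategy against which the single-process view is contrasted can be sketched as a lexicographic rule; the cue coding and names below are illustrative assumptions.

```python
def take_the_best(option_a, option_b, cue_order):
    """Lexicographic TTB choice: inspect cues in validity order and stop at the
    first discriminating cue. option_* map cue name -> 0/1 cue value."""
    for cue in cue_order:                 # cues sorted by validity, best first
        a, b = option_a[cue], option_b[cue]
        if a != b:
            return "A" if a > b else "B"  # pick the option favoured by that cue
    return "guess"                        # no cue discriminates

choice = take_the_best({"c1": 1, "c2": 0}, {"c1": 1, "c2": 1}, ["c1", "c2"])
```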

  1. Performance metrics and variance partitioning reveal sources of uncertainty in species distribution models

    USGS Publications Warehouse

    Watling, James I.; Brandt, Laura A.; Bucklin, David N.; Fujisaki, Ikuko; Mazzotti, Frank J.; Romañach, Stephanie; Speroterra, Carolina

    2015-01-01

    Species distribution models (SDMs) are widely used in basic and applied ecology, making it important to understand sources and magnitudes of uncertainty in SDM performance and predictions. We analyzed SDM performance and partitioned variance among prediction maps for 15 rare vertebrate species in the southeastern USA using all possible combinations of seven potential sources of uncertainty in SDMs: algorithms, climate datasets, model domain, species presences, variable collinearity, CO2 emissions scenarios, and general circulation models. The choice of modeling algorithm was the greatest source of uncertainty in SDM performance and prediction maps, with some additional variation in performance associated with the comprehensiveness of the species presences used for modeling. Other sources of uncertainty that have received attention in the SDM literature such as variable collinearity and model domain contributed little to differences in SDM performance or predictions in this study. Predictions from different algorithms tended to be more variable at northern range margins for species with more northern distributions, which may complicate conservation planning at the leading edge of species' geographic ranges. The clear message emerging from this work is that researchers should use multiple algorithms for modeling rather than relying on predictions from a single algorithm, invest resources in compiling a comprehensive set of species presences, and explicitly evaluate uncertainty in SDM predictions at leading range margins.
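
    A minimal sketch of the kind of variance partitioning described, assuming a small factorial table of model performance (AUC) values crossed by algorithm and climate dataset; the numbers and factors are illustrative, whereas the study's full design crosses seven sources of uncertainty.

```python
import pandas as pd

# Hypothetical factorial results: one AUC value per factor combination
df = pd.DataFrame({
    "algorithm": ["GLM", "GLM", "RF", "RF"],
    "climate":   ["A", "B", "A", "B"],
    "auc":       [0.71, 0.74, 0.83, 0.85],
})

grand_mean = df["auc"].mean()
total_ss = ((df["auc"] - grand_mean) ** 2).sum()

# Share of variation attributable to each factor (main effects only)
for factor in ["algorithm", "climate"]:
    group_means = df.groupby(factor)["auc"].transform("mean")
    factor_ss = ((group_means - grand_mean) ** 2).sum()
    print(factor, round(factor_ss / total_ss, 2))
```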

  2. Computer simulation of a single pilot flying a modern high-performance helicopter

    NASA Technical Reports Server (NTRS)

    Zipf, Mark E.; Vogt, William G.; Mickle, Marlin H.; Hoelzeman, Ronald G.; Kai, Fei; Mihaloew, James R.

    1988-01-01

    Presented is a computer simulation of a human response pilot model able to execute operational flight maneuvers and vehicle stabilization of a modern high-performance helicopter. Low-order, single-variable, human response mechanisms, integrated to form a multivariable pilot structure, provide a comprehensive operational control over the vehicle. Evaluations of the integrated pilot were performed by direct insertion into a nonlinear, total-force simulation environment provided by NASA Lewis. Comparisons between the integrated pilot structure and single-variable pilot mechanisms are presented. Static and dynamically alterable configurations of the pilot structure are introduced to simulate pilot activities during vehicle maneuvers. These configurations, in conjunction with higher level, decision-making processes, are considered for use where guidance and navigational procedures, operational mode transfers, and resource sharing are required.

  3. Comprehensive analyses of core-shell InGaN/GaN single nanowire photodiodes

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Guan, N.; Piazza, V.; Kapoor, A.; Bougerol, C.; Julien, F. H.; Babichev, A. V.; Cavassilas, N.; Bescond, M.; Michelini, F.; Foldyna, M.; Gautier, E.; Durand, C.; Eymery, J.; Tchernycheva, M.

    2017-12-01

    Single nitride nanowire core/shell n-p photodetectors are fabricated and analyzed. Nanowires consisting of an n-doped GaN stem, a radial InGaN/GaN multiple quantum well system and a p-doped GaN external shell were grown by catalyst-free metal-organic vapour phase epitaxy on sapphire substrates. Single nanowires were dispersed and the core and the shell regions were contacted with a metal and an ITO deposition, respectively, defined using electron beam lithography. The single wire photodiodes present a response in the visible to UV spectral range under zero external bias. The detector operation speed has been analyzed under different bias conditions. Under zero bias, the  -3 dB cut-off frequency is ~200 Hz for small light modulations. The current generation was modeled using non-equilibrium Green function formalism, which evidenced the importance of phonon scattering for carrier extraction from the quantum wells.

  4. Double-Line-Frequency Ripple Model, Analysis & Impedance Design for Energy Stored Single-Phase Quasi-Z Source Photovoltaic System

    DOE PAGES

    Liang, Weihua; Liu, Yushan; Ge, Baoming; ...

    2017-09-08

    The battery energy stored quasi-Z-source (BESqZS) based photovoltaic (PV) power generation system combines advantages of the qZS inverter and the battery energy storage system. But, the second harmonic (2ω) power ripple degrades the system’s performance and affects the system’s design. An accurate model to analyze the 2ω ripple is very important. The existing models did not consider the battery, or assumed a symmetric qZS network with L1 = L2 and C1 = C2, which limits the design freedom and causes oversized impedance parameters. Our paper proposes a comprehensive model for the single-phase BES-qZS-PV inverter system, where the battery is considered and there is no restriction of L1 = L2 and C1 = C2. Based on the built model, a BES-qZS impedance design method is proposed to mitigate the 2ω ripple with asymmetric qZS network. Simulation and experimental results verify the proposed 2ω ripple model and impedance design method.
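
    As background for why a 2ω ripple arises at all in a single-phase system, the textbook instantaneous-power identity is (a generic derivation, not the paper's full BES-qZS network model):

```latex
v(t) = \sqrt{2}\,V\sin(\omega t), \quad
i(t) = \sqrt{2}\,I\sin(\omega t - \varphi)
\;\Rightarrow\;
p(t) = v(t)\,i(t) = V I \cos\varphi - V I \cos(2\omega t - \varphi),
```

    so the DC side must supply or absorb a power component oscillating at twice the line frequency, which is the ripple the impedance design aims to mitigate.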

  5. Double-Line-Frequency Ripple Model, Analysis & Impedance Design for Energy Stored Single-Phase Quasi-Z Source Photovoltaic System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Weihua; Liu, Yushan; Ge, Baoming

    The battery energy stored quasi-Z-source (BESqZS) based photovoltaic (PV) power generation system combines advantages of the qZS inverter and the battery energy storage system. But, the second harmonic (2ω) power ripple degrades the system’s performance and affects the system’s design. An accurate model to analyze the 2ω ripple is very important. The existing models did not consider the battery, or assumed a symmetric qZS network with L1 = L2 and C1 = C2, which limits the design freedom and causes oversized impedance parameters. Our paper proposes a comprehensive model for the single-phase BES-qZS-PV inverter system, where the battery is considered and there is no restriction of L1 = L2 and C1 = C2. Based on the built model, a BES-qZS impedance design method is proposed to mitigate the 2ω ripple with asymmetric qZS network. Simulation and experimental results verify the proposed 2ω ripple model and impedance design method.

  6. The initial establishment and epithelial morphogenesis of the esophagus: a new model of tracheal–esophageal separation and transition of simple columnar into stratified squamous epithelium in the developing esophagus

    PubMed Central

    Que, Jianwen

    2016-01-01

    The esophagus and trachea are tubular organs that initially share a single common lumen in the anterior foregut. Several models have been proposed to explain how this single-lumen developmental intermediate generates two tubular organs. However, new evidence suggests that these models are not comprehensive. I will first briefly review these models and then propose a novel ‘splitting and extension’ model based on our in vitro modeling of the foregut separation process. Signaling molecules (e.g., SHHs, WNTs, BMPs) and transcription factors (e.g., NKX2.1 and SOX2) are critical for the separation of the foregut. Intriguingly, some of these molecules continue to play essential roles during the transition of simple columnar into stratified squamous epithelium in the developing esophagus, and they are also closely involved in epithelial maintenance in the adults. Alterations in the levels of these molecules have been associated with the initiation and progression of several esophageal diseases and cancer in adults. PMID:25727889

  7. Determining Pain Detection and Tolerance Thresholds Using an Integrated, Multi-Modal Pain Task Battery.

    PubMed

    Hay, Justin L; Okkerse, Pieter; van Amerongen, Guido; Groeneveld, Geert Jan

    2016-04-14

    Human pain models are useful in assessing the analgesic effect of drugs, providing information about a drug's pharmacology and identifying potentially suitable therapeutic populations. The need to use a comprehensive battery of pain models is highlighted by studies in which only a single pain model, thought to relate to the clinical situation, demonstrates a lack of efficacy. No single experimental model can mimic the complex nature of clinical pain. The integrated, multi-modal pain task battery presented here encompasses the electrical stimulation task, pressure stimulation task, cold pressor task, the UVB inflammatory model (which includes a thermal task) and a paradigm for inhibitory conditioned pain modulation. These human pain models have been tested for predictive validity and reliability both in their own right and in combination, and can be used repeatedly, quickly, in short succession, with minimal burden for the subject and with a modest quantity of equipment. This allows a drug to be fully characterized and profiled for analgesic effect, which is especially useful for drugs with a novel or untested mechanism of action.

  8. The comprehensive care of sickle cell disease.

    PubMed

    Okpala, Iheanyi; Thomas, Veronica; Westerdale, Neil; Jegede, Tina; Raj, Kavita; Daley, Sadie; Costello-Binger, Hilda; Mullen, Jean; Rochester-Peart, Collis; Helps, Sarah; Tulloch, Emense; Akpala, Mary; Dick, Moira; Bewley, Susan; Davies, Mark; Abbs, Ian

    2002-03-01

    Millions of people across the world have sickle cell disease (SCD). Although the true prevalence of SCD in Europe is not certain, London (UK) alone had an estimated 9000 people with the disorder in 1997. People affected by SCD are best managed by a multidisciplinary team of professionals who deliver comprehensive care: a model of healthcare based on interaction of medical and non-medical services with the affected persons. The components of comprehensive care include patient/parent information, genetic counselling, social services, prevention of infections, dietary advice and supplementation, psychotherapy, renal and other specialist medical care, maternal and child health, orthopaedic and general surgery, pain control, physiotherapy, dental and eye care, drug dependency services and specialist sickle cell nursing. The traditional role of haematologists remains to co-ordinate overall management and liaise with other specialities as necessary. Co-operation from the affected persons is indispensable to the delivery of comprehensive care. Working in partnership with the hospital or community health service administration and voluntary agencies enhances the success of the multidisciplinary team. Holistic care improves the quality of life of people affected by SCD, and reduces the number as well as the length of hospital admissions. Disease-related morbidity is reduced by early detection and treatment of chronic complications. Comprehensive care promotes awareness of SCD among affected persons, who are encouraged to take greater control of their own lives, and achieves better patient management than the solo efforts of any single group of professionals. This cost-effective model of care is an option for taking haemoglobinopathy services forward in the new millennium.

  9. Single Case Design Elements in Text Comprehension Research for Students with Developmental Disabilities

    ERIC Educational Resources Information Center

    Snyder, Sara M.; Knight, Victoria F.; Ayres, Kevin M.; Mims, Pamela J.; Sartini, Emily C.

    2017-01-01

    Recently researchers have begun exploring the efficacy of interventions designed to improve text comprehension skills for students with developmental disabilities (DD). Text comprehension is essential for understanding academic content as students with disabilities make progress in the general education curriculum. This article focuses on single…

  10. Evaluation of Attention Training and Metacognitive Facilitation to Improve Reading Comprehension in Aphasia

    ERIC Educational Resources Information Center

    Lee, Jaime B.; Sohlberg, McKay Moore

    2013-01-01

    Purpose: This pilot study investigated the impact of direct attention training combined with metacognitive facilitation on reading comprehension in individuals with aphasia. Method: A single-subject, multiple baseline design was employed across 4 participants to evaluate potential changes in reading comprehension resulting from an 8-week…

  11. In Vivo Control of CpG and Non-CpG DNA Methylation by DNA Methyltransferases

    PubMed Central

    Arand, Julia; Spieler, David; Karius, Tommy; Branco, Miguel R.; Meilinger, Daniela; Meissner, Alexander; Jenuwein, Thomas; Xu, Guoliang; Leonhardt, Heinrich; Wolf, Verena; Walter, Jörn

    2012-01-01

    The enzymatic control of the setting and maintenance of symmetric and non-symmetric DNA methylation patterns in a particular genome context is not well understood. Here, we describe a comprehensive analysis of DNA methylation patterns generated by high resolution sequencing of hairpin-bisulfite amplicons of selected single copy genes and repetitive elements (LINE1, B1, IAP-LTR-retrotransposons, and major satellites). The analysis unambiguously identifies a substantial amount of regional incomplete methylation maintenance, i.e. hemimethylated CpG positions, to varying degrees among cell types. Moreover, non-CpG cytosine methylation is confined to ESCs and exclusively catalysed by Dnmt3a and Dnmt3b. This sequence position-, cell type-, and region-dependent non-CpG methylation is strongly linked to neighboring CpG methylation and requires the presence of Dnmt3L. The generation of a comprehensive data set of 146,000 CpG dyads was used to apply and develop parameter-estimated hidden Markov models (HMMs) to calculate the relative contribution of DNA methyltransferases (Dnmts) to de novo and maintenance DNA methylation. The comparative modelling included wild-type ESCs and mutant ESCs deficient for Dnmt1, Dnmt3a, Dnmt3b, or Dnmt3a/3b, respectively. The HMM analysis identifies a considerable de novo methylation activity for Dnmt1 at certain repetitive elements and single copy sequences. Dnmt3a and Dnmt3b contribute de novo function. However, both enzymes are also essential to maintain symmetrical CpG methylation at distinct repetitive and single copy sequences in ESCs. PMID:22761581

  12. Discrimination of speech stimuli based on neuronal response phase patterns depends on acoustics but not comprehension.

    PubMed

    Howard, Mary F; Poeppel, David

    2010-11-01

    Speech stimuli give rise to neural activity in the listener that can be observed as waveforms using magnetoencephalography. Although waveforms vary greatly from trial to trial due to activity unrelated to the stimulus, it has been demonstrated that spoken sentences can be discriminated based on theta-band (3-7 Hz) phase patterns in single-trial response waveforms. Furthermore, manipulations of the speech signal envelope and fine structure that reduced intelligibility were found to produce correlated reductions in discrimination performance, suggesting a relationship between theta-band phase patterns and speech comprehension. This study investigates the nature of this relationship, hypothesizing that theta-band phase patterns primarily reflect cortical processing of low-frequency (<40 Hz) modulations present in the acoustic signal and required for intelligibility, rather than processing exclusively related to comprehension (e.g., lexical, syntactic, semantic). Using stimuli that are quite similar to normal spoken sentences in terms of low-frequency modulation characteristics but are unintelligible (i.e., their time-inverted counterparts), we find that discrimination performance based on theta-band phase patterns is equal for both types of stimuli. Consistent with earlier findings, we also observe that whereas theta-band phase patterns differ across stimuli, power patterns do not. We use a simulation model of the single-trial response to spoken sentence stimuli to demonstrate that phase-locked responses to low-frequency modulations of the acoustic signal can account not only for the phase but also for the power results. The simulation offers insight into the interpretation of the empirical results with respect to phase-resetting and power-enhancement models of the evoked response.

  13. Development and test of different methods to improve the description and NOx emissions in staged combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brink, A.; Kilpinen, P.; Hupa, M.

    1996-01-01

    Two methods to improve the modeling of NOx emissions in numerical flow simulation of combustion are investigated. The models used are a reduced mechanism for nitrogen chemistry in methane combustion and a new model based on regression analysis of perfectly stirred reactor simulations using detailed comprehensive reaction kinetics. The applicability of the methods to numerical flow simulation of practical furnaces, especially in the near-burner region, is tested against experimental data from a pulverized coal fired single burner furnace. The results are also compared to those obtained using a commonly used description for the overall reaction rate of NO.

  14. Phonological and semantic processing during comprehension in Wernicke's aphasia: An N400 and Phonological Mapping Negativity Study.

    PubMed

    Robson, Holly; Pilkington, Emma; Evans, Louise; DeLuca, Vincent; Keidel, James L

    2017-06-01

    Comprehension impairments in Wernicke's aphasia are thought to result from a combination of impaired phonological and semantic processes. However, the relationship between these cognitive processes and language comprehension has only been inferred through offline neuropsychological tasks. This study used ERPs to investigate phonological and semantic processing during online single word comprehension. EEG was recorded in a group of Wernicke's aphasia participants (n = 8) and control participants (n = 10) while they performed a word-picture verification task. The N400 and Phonological Mapping Negativity/Phonological Mismatch Negativity (PMN) event-related potential components were investigated as indices of semantic and phonological processing, respectively. Individuals with Wernicke's aphasia displayed reduced and inconsistent N400 and PMN effects in comparison to control participants. Reduced N400 effects in the Wernicke's aphasia group were simulated in the control group by artificially degrading speech perception. Correlation analyses in the Wernicke's aphasia group found that PMN but not N400 amplitude was associated with behavioural word-picture verification performance. The results confirm impairments at both phonological and semantic stages of comprehension in Wernicke's aphasia. However, reduced N400 responses in Wernicke's aphasia are at least partially attributable to earlier phonological processing impairments. The results provide further support for the traditional model of Wernicke's aphasia, which claims a causative link between phonological processing and language comprehension impairments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. When to Make Mountains out of Molehills: The Pros and Cons of Simple and Complex Model Calibration Procedures

    NASA Astrophysics Data System (ADS)

    Smith, K. A.; Barker, L. J.; Harrigan, S.; Prudhomme, C.; Hannaford, J.; Tanguy, M.; Parry, S.

    2017-12-01

    Earth and environmental models are relied upon to investigate system responses that cannot otherwise be examined. In simulating physical processes, models have adjustable parameters which may, or may not, have a physical meaning. Determining the values to assign to these model parameters is an enduring challenge for earth and environmental modellers. Selecting different error metrics by which the model's results are compared to observations will lead to different sets of calibrated model parameters, and thus different model results. Furthermore, models may exhibit 'equifinal' behaviour, where multiple combinations of model parameters lead to equally acceptable model performance against observations. These decisions in model calibration introduce uncertainty that must be considered when model results are used to inform environmental decision-making. This presentation focusses on the uncertainties that derive from the calibration of a four-parameter lumped catchment hydrological model (GR4J). The GR models contain an inbuilt automatic calibration algorithm that can satisfactorily calibrate against four error metrics in only a few seconds. However, a single, deterministic model result does not provide information on parameter uncertainty. Furthermore, a modeller interested in extreme events, such as droughts, may wish to calibrate against more low-flow-specific error metrics. In a comprehensive assessment, the GR4J model has been run with 500,000 Latin Hypercube Sampled parameter sets across 303 catchments in the United Kingdom. These parameter sets have been assessed against six error metrics, including two drought-specific metrics. This presentation compares the two approaches and demonstrates that the inbuilt automatic calibration can outperform the Latin Hypercube approach in single-metric performance. However, it is also shown that there are many merits to the more comprehensive assessment, which allows for probabilistic model results, multi-objective optimisation, and better tailoring of the calibration to specific applications such as drought event characterisation. Modellers and decision-makers may be constrained in their choice of calibration method, so it is important that they recognise the strengths and limitations of their chosen approach.
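
    A minimal sketch of the Latin Hypercube calibration experiment described, using scipy's quasi-Monte Carlo sampler and a Nash-Sutcliffe efficiency metric; the parameter bounds, sample size, and the `run_model` placeholder are assumptions, not the presenters' actual GR4J setup.

```python
import numpy as np
from scipy.stats import qmc

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, one of several possible error metrics."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def run_model(params, forcing):
    """Placeholder for a four-parameter hydrological model such as GR4J."""
    raise NotImplementedError

# Latin Hypercube sample of the 4-dimensional parameter space
sampler = qmc.LatinHypercube(d=4, seed=1)
unit = sampler.random(n=100_000)                 # sample size for illustration
lower = np.array([1.0, -5.0, 1.0, 0.5])          # assumed parameter bounds
upper = np.array([3000.0, 5.0, 1000.0, 10.0])
param_sets = qmc.scale(unit, lower, upper)

# scores = [nse(run_model(p, forcing), observed_flow) for p in param_sets]
# behavioural = param_sets[np.argsort(scores)[-1000:]]   # keep the best sets
```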

  16. Single sources in the low-frequency gravitational wave sky: properties and time to detection by pulsar timing arrays

    NASA Astrophysics Data System (ADS)

    Kelley, Luke Zoltan; Blecha, Laura; Hernquist, Lars; Sesana, Alberto; Taylor, Stephen R.

    2018-06-01

    We calculate the properties, occurrence rates and detection prospects of individually resolvable `single sources' in the low-frequency gravitational wave (GW) spectrum. Our simulations use the population of galaxies and massive black hole binaries from the Illustris cosmological hydrodynamic simulations, coupled to comprehensive semi-analytic models of the binary merger process. Using mock pulsar timing arrays (PTA) with, for the first time, varying red-noise models, we calculate plausible detection prospects for GW single sources and the stochastic GW background (GWB). Contrary to previous results, we find that single sources are at least as detectable as the GW background. Using mock PTA, we find that these `foreground' sources (also `deterministic'/`continuous') are likely to be detected with ˜20 yr total observing baselines. Detection prospects, and indeed the overall properties of single sources, are only moderately sensitive to binary evolution parameters - namely eccentricity and environmental coupling, which can lead to differences of ˜5 yr in times to detection. Red noise has a stronger effect, roughly doubling the time to detection of the foreground between a white-noise only model (˜10-15 yr) and severe red noise (˜20-30 yr). The effect of red noise on the GWB is even stronger, suggesting that single source detections may be more robust. We find that typical signal-to-noise ratios for the foreground peak near f = 0.1 yr-1, and are much less sensitive to the continued addition of new pulsars to PTA.

  17. Analysis and evaluation of the applicability of green energy technology

    NASA Astrophysics Data System (ADS)

    Xu, Z. J.; Song, Y. K.

    2017-11-01

    As environmental problems become more serious and resources grow scarcer, the applicability of green energy technology has received increasing attention from scholars in different fields. However, current research often takes a single perspective and uses simple methods. Drawing on the Theory of Applicable Technology, this paper analyzes and defines green energy technology and its applicability from the all-around perspective of the symbiosis of economy, society, environment, and science and technology, and constructs a corresponding evaluation index system. The paper further applies Fuzzy Comprehensive Evaluation to the assessment of this applicability, discusses the evaluation models and methods in depth, and explains them in detail with an example. The author holds that the applicability of green energy technology involves many aspects of economy, society, environment, and science and technology, and can be evaluated comprehensively by an index system composed of a number of independent indexes. The evaluation is multi-objective, multi-factor, multi-level, and fuzzy-comprehensive, and the Fuzzy Comprehensive Evaluation method is an effective and feasible way to carry it out. Comprehensively understanding and evaluating the applicability of green energy technology is of vital theoretical and practical significance for the rational development and utilization of green energy technology and for the better promotion of sustainable development of humans and nature.
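
    A minimal sketch of a Fuzzy Comprehensive Evaluation step of the kind described: an index weight vector is composed with a membership matrix, and the grade is chosen by the maximum-membership principle. The indexes, grades, weights, and membership values are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Membership matrix R: rows = evaluation indexes (e.g., economic, social,
# environmental, technical), columns = grades (e.g., high / medium / low).
R = np.array([
    [0.5, 0.3, 0.2],
    [0.4, 0.4, 0.2],
    [0.6, 0.3, 0.1],
    [0.3, 0.5, 0.2],
])

# Index weight vector W (assumed; could come from AHP or entropy weighting)
W = np.array([0.3, 0.2, 0.3, 0.2])

B = W @ R                                   # weighted-average fuzzy operator
grade = ["high", "medium", "low"][int(np.argmax(B))]   # maximum membership
print(B, grade)
```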

  18. A critical question for NEC researchers: Can we create a consensus definition of NEC that facilitates research progress?

    PubMed

    Gordon, Phillip V; Swanson, Jonathan R; MacQueen, Brianna C; Christensen, Robert D

    2017-02-01

    In recent decades the reported incidence of preterm necrotizing enterocolitis (NEC) has been declining, in large part due to the implementation of comprehensive NEC prevention initiatives, including breast milk feeding, standardized feeding protocols, transfusion guidelines, and antibiotic stewardship, and to improved rigor in excluding non-NEC cases from NEC data. However, after more than 60 years of NEC research in animal models, the promise of a "magic bullet" to prevent NEC has yet to materialize. There are also serious issues involving clinical NEC research. There is no common, comprehensive definition of NEC. Each national dataset has its own definition and staging criteria. Even within academia, randomized trials and single-center studies use widely disparate definitions. This makes NEC metadata of very limited value. The world of neonatology needs a comprehensive, universal, consensus definition of NEC. It also needs a de-identified, international data warehouse. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

    The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A single seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from >53 observatories. This data is continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and the actual software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000-1963, M ≥ 5.8), including analysis tools. - Data from three one-year OBS deployments at three sites, Atlantic, Ionian and Ligurian Sea, within the general SEED format, thus creating the core integrated database for ocean-, sea- and land-based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools, currently being adapted to a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecast testing and model validation approach, and the core hazard portal developed with the same technologies as the NERIES data portal. - Implemented homogeneous shakemap estimation tools at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization, with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology, as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010-2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES also provided the proof of concept for the ESFRI2008 initiative, the European Plate Observing System (EPOS), whose preparatory phase (2010-2014) is also funded by the EC.

  20. NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2010-01-01

    Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.

  1. 41 CFR 302-1.100 - What is a comprehensive, automated relocation management system?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    A comprehensive, automated relocation management system is a system that integrates into a single...

  2. 41 CFR 302-1.100 - What is a comprehensive, automated relocation management system?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    A comprehensive, automated relocation management system is a system that integrates into a single...

  3. 41 CFR 302-1.100 - What is a comprehensive, automated relocation management system?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    A comprehensive, automated relocation management system is a system that integrates into a single...

  4. 41 CFR 302-1.100 - What is a comprehensive, automated relocation management system?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    A comprehensive, automated relocation management system is a system that integrates into a single...

  5. Epilogue: Reading Comprehension Is Not a Single Ability--Implications for Assessment and Instruction

    ERIC Educational Resources Information Center

    Kamhi, Alan G.; Catts, Hugh W.

    2017-01-01

    Purpose: In this epilogue, we review the 4 response articles and highlight the implications of a multidimensional view of reading for the assessment and instruction of reading comprehension. Method: We reiterate the problems with standardized tests of reading comprehension and discuss the advantages and disadvantages of recently developed…

  6. Comprehension of Infrequent Subject-Verb Agreement Forms: Evidence from French-Learning Children

    ERIC Educational Resources Information Center

    Legendre, Geraldine; Barriere, Isabelle; Goyet, Louise; Nazzi, Thierry

    2010-01-01

    Two comprehension experiments were conducted to investigate whether young French-learning children (N = 76) are able to use a single number cue in subject-verb agreement contexts and match a visually dynamic scene with a corresponding verbal stimulus. Results from both preferential looking and pointing demonstrated significant comprehension in…

  7. Modelling dynamics in protein crystal structures by ensemble refinement

    PubMed Central

    Burnley, B Tom; Afonine, Pavel V; Adams, Paul D; Gros, Piet

    2012-01-01

    Single-structure models derived from X-ray data do not adequately account for the inherent, functionally important dynamics of protein molecules. We generated ensembles of structures by time-averaged refinement, where local molecular vibrations were sampled by molecular-dynamics (MD) simulation whilst global disorder was partitioned into an underlying overall translation–libration–screw (TLS) model. Modeling of 20 protein datasets at 1.1–3.1 Å resolution reduced cross-validated Rfree values by 0.3–4.9%, indicating that ensemble models fit the X-ray data better than single structures. The ensembles revealed that, while most proteins display a well-ordered core, some proteins exhibit a ‘molten core’ likely supporting functionally important dynamics in ligand binding, enzyme activity and protomer assembly. Order–disorder changes in HIV protease indicate a mechanism of entropy compensation for ordering the catalytic residues upon ligand binding by disordering specific core residues. Thus, ensemble refinement extracts dynamical details from the X-ray data that allow a more comprehensive understanding of structure–dynamics–function relationships. DOI: http://dx.doi.org/10.7554/eLife.00311.001 PMID:23251785

  8. Examinations of the Chemical Step in Enzyme Catalysis.

    PubMed

    Singh, P; Islam, Z; Kohen, A

    2016-01-01

    Advances in computational and experimental methods in enzymology have aided comprehension of enzyme-catalyzed chemical reactions. The main difficulty in comparing computational findings to rate measurements is that the first examines a single energy barrier, while the second frequently reflects a combination of many microscopic barriers. We present here intrinsic kinetic isotope effects and their temperature dependence as a useful experimental probe of a single chemical step in a complex kinetic cascade. Computational predictions are tested by this method for two model enzymes: dihydrofolate reductase and thymidylate synthase. The description highlights the significance of collaboration between experimentalists and theoreticians to develop a better understanding of enzyme-catalyzed chemical conversions. © 2016 Elsevier Inc. All rights reserved.

  9. Complete Proteomic-Based Enzyme Reaction and Inhibition Kinetics Reveal How Monolignol Biosynthetic Enzyme Families Affect Metabolic Flux and Lignin in Populus trichocarpa[W

    PubMed Central

    Wang, Jack P.; Naik, Punith P.; Chen, Hsi-Chuan; Shi, Rui; Lin, Chien-Yuan; Liu, Jie; Shuford, Christopher M.; Li, Quanzi; Sun, Ying-Hsuan; Tunlaya-Anukit, Sermsawat; Williams, Cranos M.; Muddiman, David C.; Ducoste, Joel J.; Sederoff, Ronald R.; Chiang, Vincent L.

    2014-01-01

    We established a predictive kinetic metabolic-flux model for the 21 enzymes and 24 metabolites of the monolignol biosynthetic pathway using Populus trichocarpa secondary differentiating xylem. To establish this model, a comprehensive study was performed to obtain the reaction and inhibition kinetic parameters of all 21 enzymes based on functional recombinant proteins. A total of 104 Michaelis-Menten kinetic parameters and 85 inhibition kinetic parameters were derived from these enzymes. Through mass spectrometry, we obtained the absolute quantities of all 21 pathway enzymes in the secondary differentiating xylem. This extensive experimental data set, generated from a single tissue specialized in wood formation, was used to construct the predictive kinetic metabolic-flux model to provide a comprehensive mathematical description of the monolignol biosynthetic pathway. The model was validated using experimental data from transgenic P. trichocarpa plants. The model predicts how pathway enzymes affect lignin content and composition, explains a long-standing paradox regarding the regulation of monolignol subunit ratios in lignin, and reveals novel mechanisms involved in the regulation of lignin biosynthesis. This model provides an explanation of the effects of genetic and transgenic perturbations of the monolignol biosynthetic pathway in flowering plants. PMID:24619611
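
    For readers unfamiliar with the kinetic building blocks of such a flux model, the sketch below shows the Michaelis-Menten rate law and one common inhibition form (competitive); the parameter values are illustrative and are not the measured constants reported for the 21 pathway enzymes.

```python
def mm_rate(s, vmax, km):
    """Michaelis-Menten rate: v = Vmax*S / (Km + S)."""
    return vmax * s / (km + s)

def mm_rate_competitive(s, i, vmax, km, ki):
    """Rate with a competitive inhibitor: v = Vmax*S / (Km*(1 + I/Ki) + S)."""
    return vmax * s / (km * (1.0 + i / ki) + s)

# Illustrative numbers only (not the measured parameters from the paper)
print(mm_rate(50.0, vmax=10.0, km=25.0))
print(mm_rate_competitive(50.0, i=5.0, vmax=10.0, km=25.0, ki=2.0))
```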

  10. Teaching Modern Foreign Languages in Single-Sex Classes in a Co-Educational Context--Review of a Project in a North Yorkshire Comprehensive School

    ERIC Educational Resources Information Center

    Chambers, Gary

    2005-01-01

    A co-educational comprehensive school in North Yorkshire, concerned at the gap between boys' and girls' performance in French and German at GCSE, opted to teach Year 8 languages classes as single-sex groups. 2003-04 was to be a pilot year, at the end of which pupils' performance, motivation and attitude, as well as the experiences and views of…

  11. The Effect of Educational Software, Video Modelling and Group Discussion on Social-Skill Acquisition Among Students with Mild Intellectual Disabilities.

    PubMed

    Hetzroni, Orit E; Banin, Irit

    2017-07-01

    People with intellectual and developmental disabilities (IDD) often demonstrate difficulties in social skills. The purpose of this study was to examine the effects of a comprehensive intervention program on the acquisition of social skills among students with mild IDD. A single-subject multiple-baseline design across situations was used to teach five school-age children with mild IDD social skills embedded in school-based situations. Results demonstrate that the intervention program, which included video modelling and games embedded with group discussions and simulations, increased the level and use of adequate social behaviours within the school's natural environment. Results demonstrate the unique contribution of a comprehensive interactive program to the acquisition and transfer of participants' social skills, such as language pragmatics and social rules, within the school environment. Group discussions and simulations were beneficial and enabled both group and personalized instruction through the unique application of the program designed for the study. © 2016 John Wiley & Sons Ltd.

  12. One-step fabrication of porous GaN crystal membrane and its application in energy storage

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Wang, Shouzhi; Shao, Yongliang; Wu, Yongzhong; Sun, Changlong; Huo, Qin; Zhang, Baoguo; Hu, Haixiao; Hao, Xiaopeng

    2017-03-01

    Single-crystal gallium nitride (GaN) membranes have great potential for a variety of applications. However, fabrication of single-crystalline GaN membranes remains a challenge owing to the material's chemical inertness and mechanical hardness. This study prepares large-area, free-standing, single-crystalline porous GaN membranes using a one-step high-temperature annealing technique for the first time. A promising separation model is proposed through a comprehensive study that combines thermodynamic analysis and experiments. The porous GaN crystal membrane is processed into supercapacitors, which exhibit stable cycling life, high-rate capability, and ultrahigh power density, as a proof-of-concept demonstration of a new energy storage application. Our results advance the study of GaN crystal membranes into a new stage related to electrochemical energy storage applications.

  13. Probing the target search of DNA-binding proteins in mammalian cells using TetR as model searcher

    NASA Astrophysics Data System (ADS)

    Normanno, Davide; Boudarène, Lydia; Dugast-Darzacq, Claire; Chen, Jiji; Richter, Christian; Proux, Florence; Bénichou, Olivier; Voituriez, Raphaël; Darzacq, Xavier; Dahan, Maxime

    2015-07-01

    Many cellular functions rely on DNA-binding proteins finding and associating to specific sites in the genome. Yet the mechanisms underlying the target search remain poorly understood, especially in the case of the highly organized mammalian cell nucleus. Using as a model Tet repressors (TetRs) searching for a multi-array locus, we quantitatively analyse the search process in human cells with single-molecule tracking and single-cell protein-DNA association measurements. We find that TetRs explore the nucleus and reach their target by 3D diffusion interspersed with transient interactions with non-cognate sites, consistent with the facilitated diffusion model. Remarkably, nonspecific binding times are broadly distributed, underlining a lack of clear delimitation between specific and nonspecific interactions. However, the search kinetics is not determined by diffusive transport but by the low association rate to nonspecific sites. Altogether, our results provide a comprehensive view of the recruitment dynamics of proteins at specific loci in mammalian cells.

  14. Employment of single-diode model to elucidate the variations in photovoltaic parameters under different electrical and thermal conditions

    PubMed Central

    Hameed, Shilan S.; Aziz, Fakhra; Sulaiman, Khaulah; Ahmad, Zubair

    2017-01-01

    In this research work, numerical simulations are performed to correlate the photovoltaic parameters with various internal and external factors influencing the performance of solar cells. A single-diode modeling approach is utilized for this purpose, and the theoretical investigations are compared with reported experimental evidence for organic and inorganic solar cells under various electrical and thermal conditions. Electrical parameters include the parasitic resistances (Rs and Rp) and the ideality factor (n), while the thermal condition is defined by the cell's temperature (T). A comprehensive analysis concerning broad spectral variations in the short circuit current (Isc), open circuit voltage (Voc), fill factor (FF) and efficiency (η) is presented and discussed. It was generally concluded that there is good agreement between the simulated results and experimental findings. Nevertheless, the controversial effect of temperature on the performance of organic solar cells necessitates the development of a complementary model capable of properly simulating the temperature impact on these devices' performance. PMID:28793325
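
    A minimal sketch of the single-diode model referred to above, solving the implicit current-voltage equation numerically; the parameter values are illustrative and not fitted to any device in the study.

```python
import numpy as np
from scipy.optimize import brentq

K_B, Q_E = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, elementary charge

def diode_current(v, i_ph, i_0, r_s, r_p, n, temp_k):
    """Solve the implicit single-diode equation for the terminal current I:
    I = Iph - I0*[exp((V + I*Rs)/(n*Vt)) - 1] - (V + I*Rs)/Rp, with Vt = kT/q."""
    v_t = K_B * temp_k / Q_E

    def residual(i):
        return (i_ph - i_0 * (np.exp((v + i * r_s) / (n * v_t)) - 1.0)
                - (v + i * r_s) / r_p - i)

    return brentq(residual, -2.0 * i_ph, 2.0 * i_ph)

# Illustrative parameters only (not fitted to any device in the study)
iv_curve = [diode_current(v, i_ph=5.0, i_0=1e-9, r_s=0.02, r_p=500.0,
                          n=1.3, temp_k=300.0)
            for v in np.linspace(0.0, 0.7, 8)]
```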

  15. SEMIPARAMETRIC QUANTILE REGRESSION WITH HIGH-DIMENSIONAL COVARIATES

    PubMed Central

    Zhu, Liping; Huang, Mian; Li, Runze

    2012-01-01

    This paper is concerned with quantile regression for a semiparametric regression model, in which both the conditional mean and conditional variance function of the response given the covariates admit a single-index structure. This semiparametric regression model enables us to reduce the dimension of the covariates and simultaneously retains the flexibility of nonparametric regression. Under mild conditions, we show that the simple linear quantile regression offers a consistent estimate of the index parameter vector. This is a surprising and interesting result because the single-index model is possibly misspecified under the linear quantile regression. With a root-n consistent estimate of the index vector, one may employ a local polynomial regression technique to estimate the conditional quantile function. This procedure is computationally efficient, which is very appealing in high-dimensional data analysis. We show that the resulting estimator of the quantile function performs asymptotically as efficiently as if the true value of the index vector were known. The methodologies are demonstrated through comprehensive simulation studies and an application to a real dataset. PMID:24501536
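
    One common formalization of the single-index structure described (the paper's exact assumptions may differ) is a location-scale model in which both the conditional mean and the conditional variance depend on the covariates only through a single linear index, so every conditional quantile is likewise a function of that index; the linear quantile regression estimator then minimizes the usual check loss:

```latex
Y = m(\beta_0^{\top}X) + \sigma(\beta_0^{\top}X)\,\varepsilon
\;\Rightarrow\;
Q_{\tau}(Y \mid X) = m(\beta_0^{\top}X) + \sigma(\beta_0^{\top}X)\,q_{\tau}(\varepsilon),
\qquad
\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} \rho_{\tau}\!\left(Y_i - X_i^{\top}\beta\right),
\quad
\rho_{\tau}(u) = u\left(\tau - \mathbf{1}\{u<0\}\right).
```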

  16. Reading Skill and the Minimum Distance Principle: A Comparison of Sentence Comprehension in Context and in Isolation.

    ERIC Educational Resources Information Center

    Goldman, Susan R.

    The comprehension of the Minimum Distance Principle was examined in three experiments, using the "tell/promise" sentence construction. Experiment one compared the listening and reading comprehension of singly presented sentences, e.g. "John tells Bill to bake the cake" and "John promises Bill to bake the cake." The…

  17. Enhancing Literacy Skills of Students with Congenital and Profound Hearing Impairment in Nigeria Using Babudoh's Comprehension Therapy

    ERIC Educational Resources Information Center

    Babudoh, Gladys B.

    2014-01-01

    This study reports the effect of a treatment tool called "Babudoh's comprehension therapy" in enhancing the comprehension and writing skills of 10 junior secondary school students with congenital and profound hearing impairment in Plateau State, Nigeria. The study adopted the single group pretest-posttest quasi-experimental research…

  18. A kernel regression approach to gene-gene interaction detection for case-control studies.

    PubMed

    Larson, Nicholas B; Schaid, Daniel J

    2013-11-01

    Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
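
    A minimal sketch of a kernel-machine variance-component score statistic of the general kind described, assuming a binary case-control outcome, a simple linear kernel over a gene's SNPs, and a logistic null model with covariates; the paper's actual tests use pairwise gene-gene interaction kernels under a generalized linear mixed model, and significance assessment (e.g., via a mixture of chi-square distributions) is omitted here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def linear_kernel(G):
    """Gene-level similarity from a genotype matrix G (n samples x m SNPs)."""
    return G @ G.T

def score_statistic(y, X, K):
    """Variance-component score statistic Q = (y - mu_hat)' K (y - mu_hat),
    with mu_hat fitted under the null logistic model with covariates X only."""
    mu_hat = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
    r = y - mu_hat
    return float(r @ K @ r)

# Illustrative random data: 200 samples, 10 SNPs in the gene, 2 covariates
rng = np.random.default_rng(0)
G = rng.binomial(2, 0.3, size=(200, 10)).astype(float)
X = rng.normal(size=(200, 2))
y = rng.binomial(1, 0.5, size=200)
Q = score_statistic(y, X, linear_kernel(G))
```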

  19. A flexible computational framework for detecting, characterizing, and interpreting statistical patterns of epistasis in genetic studies of human disease susceptibility.

    PubMed

    Moore, Jason H; Gilbert, Joshua C; Tsai, Chia-Ti; Chiang, Fu-Tien; Holden, Todd; Barney, Nate; White, Bill C

    2006-07-21

    Detecting, characterizing, and interpreting gene-gene interactions or epistasis in studies of human disease susceptibility is both a mathematical and a computational challenge. To address this problem, we have previously developed a multifactor dimensionality reduction (MDR) method for collapsing high-dimensional genetic data into a single dimension (i.e. constructive induction) thus permitting interactions to be detected in relatively small sample sizes. In this paper, we describe a comprehensive and flexible framework for detecting and interpreting gene-gene interactions that utilizes advances in information theory for selecting interesting single-nucleotide polymorphisms (SNPs), MDR for constructive induction, machine learning methods for classification, and finally graphical models for interpretation. We illustrate the usefulness of this strategy using artificial datasets simulated from several different two-locus and three-locus epistasis models. We show that the accuracy, sensitivity, specificity, and precision of a naïve Bayes classifier are significantly improved when SNPs are selected based on their information gain (i.e. class entropy removed) and reduced to a single attribute using MDR. We then apply this strategy to detecting, characterizing, and interpreting epistatic models in a genetic study (n = 500) of atrial fibrillation and show that both classification and model interpretation are significantly improved.
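
    The pipeline of information-gain filtering, MDR-style constructive induction, and naive Bayes classification can be mimicked in a few lines. The sketch below is a loose illustration on made-up data; the genotype coding, the high/low-risk labelling rule, and the classifier settings are assumptions, and a faithful MDR implementation would label risk cells within cross-validation folds rather than on the full sample.

        # Rank SNPs by information gain, collapse the top pair into one MDR-style
        # attribute, and classify with naive Bayes (illustrative only).
        import numpy as np
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.naive_bayes import CategoricalNB
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        n, m = 400, 20
        X = rng.integers(0, 3, size=(n, m))              # SNP genotypes coded 0/1/2
        y = ((X[:, 3] + X[:, 7]) % 2 == 0).astype(int)   # toy epistatic signal
        y ^= rng.binomial(1, 0.1, size=n)                # label noise

        # Step 1: information gain (mutual information) between each SNP and status.
        ig = mutual_info_classif(X, y, discrete_features=True, random_state=0)
        top2 = np.argsort(ig)[-2:]

        # Step 2: constructive induction -- label each of the 9 two-SNP genotype
        # combinations high- or low-risk from its case/control ratio (one attribute).
        combo = X[:, top2[0]] * 3 + X[:, top2[1]]
        risk = np.zeros(9, dtype=int)
        for c in range(9):
            mask = combo == c
            if mask.any() and y[mask].mean() > y.mean():
                risk[c] = 1
        X_mdr = risk[combo].reshape(-1, 1)

        # Step 3: naive Bayes on the single constructed attribute.
        print(cross_val_score(CategoricalNB(), X_mdr, y, cv=5).mean())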

  20. Risk assessment of flood disaster and forewarning model at different spatial-temporal scales

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Jin, Juliang; Xu, Jinchao; Guo, Qizhong; Hang, Qingfeng; Chen, Yaqian

    2018-05-01

    Aiming at reducing losses from flood disasters, a risk assessment and forewarning model for flood disaster is studied. The model is built upon risk indices in the flood disaster system, proceeding from the whole structure and its parts at different spatial-temporal scales. On the one hand, the study establishes a long-term forewarning model for the surface area with three levels: prediction, evaluation, and forewarning. A structure-adaptive back-propagation neural network with peak identification is used to simulate indices in the prediction sub-model. Set pair analysis is employed to calculate the connection degrees of a single index, a comprehensive index, and systematic risk through the multivariate connection number, and the comprehensive assessment is made with assessment matrices in the evaluation sub-model. A comparative judging method is adopted in the forewarning sub-model to assign warning degrees of flood disaster by matching the comprehensive risk assessment index against forewarning standards, yielding the long-term local conditions for proposing planning schemes. On the other hand, the study sets up a real-time forewarning model for the spot, which introduces real-time Kalman filter correction of a hydrological model with a forewarning index, yielding the real-time local conditions for presenting an emergency plan. This study takes the Tunxi area, Huangshan City, China, as an example. After establishing and applying the risk assessment and forewarning model at different spatial-temporal scales to actual and simulated data from 1989 to 2008, the forewarning results show that the flood disaster risk trend declines on the whole from 2009 to 2013, despite a rise in 2011. At the macroscopic level, project and non-project measures are advanced, while at the microscopic level, the time, place, and method are listed. The results suggest that the proposed model is feasible in theory and application, thus offering a way to assess and forewarn of flood disaster risk.

  1. DIMM-SC: a Dirichlet mixture model for clustering droplet-based single cell transcriptomic data.

    PubMed

    Sun, Zhe; Wang, Ting; Deng, Ke; Wang, Xiao-Feng; Lafyatis, Robert; Ding, Ying; Hu, Ming; Chen, Wei

    2018-01-01

    Single cell transcriptome sequencing (scRNA-Seq) has become a revolutionary tool to study cellular and molecular processes at single cell resolution. Among existing technologies, the recently developed droplet-based platform enables efficient parallel processing of thousands of single cells with direct counting of transcript copies using Unique Molecular Identifier (UMI). Despite the technology advances, statistical methods and computational tools are still lacking for analyzing droplet-based scRNA-Seq data. Particularly, model-based approaches for clustering large-scale single cell transcriptomic data are still under-explored. We developed DIMM-SC, a Dirichlet Mixture Model for clustering droplet-based Single Cell transcriptomic data. This approach explicitly models UMI count data from scRNA-Seq experiments and characterizes variations across different cell clusters via a Dirichlet mixture prior. We performed comprehensive simulations to evaluate DIMM-SC and compared it with existing clustering methods such as K-means, CellTree and Seurat. In addition, we analyzed public scRNA-Seq datasets with known cluster labels and in-house scRNA-Seq datasets from a study of systemic sclerosis with prior biological knowledge to benchmark and validate DIMM-SC. Both simulation studies and real data applications demonstrated that overall, DIMM-SC achieves substantially improved clustering accuracy and much lower clustering variability compared to other existing clustering methods. More importantly, as a model-based approach, DIMM-SC is able to quantify the clustering uncertainty for each single cell, facilitating rigorous statistical inference and biological interpretations, which are typically unavailable from existing clustering methods. DIMM-SC has been implemented in a user-friendly R package with a detailed tutorial available on www.pitt.edu/∼wec47/singlecell.html. wei.chen@chp.edu or hum@ccf.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
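
    The idea of clustering UMI count vectors with a Dirichlet prior over cluster-specific gene proportions can be conveyed with a stripped-down MAP-EM for a multinomial mixture. This is not the DIMM-SC algorithm (which performs full inference under a Dirichlet mixture and quantifies per-cell uncertainty); the cluster number, prior strength, and simulated counts below are assumptions for illustration.

        # Simplified MAP-EM for a multinomial mixture with a symmetric Dirichlet
        # prior on gene proportions (illustrative, not the DIMM-SC implementation).
        import numpy as np

        rng = np.random.default_rng(3)
        cells, genes, K, alpha = 200, 50, 3, 1.1
        true_theta = rng.dirichlet(np.ones(genes), size=K)
        z = rng.integers(0, K, size=cells)
        X = np.vstack([rng.multinomial(1000, true_theta[k]) for k in z])  # UMI counts

        theta = rng.dirichlet(np.ones(genes), size=K)    # initial gene proportions
        pi = np.full(K, 1.0 / K)                         # mixing weights
        for _ in range(50):
            # E-step: responsibilities from multinomial log-likelihoods.
            logp = X @ np.log(theta).T + np.log(pi)      # shape (cells, K)
            logp -= logp.max(axis=1, keepdims=True)
            r = np.exp(logp)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: MAP update of proportions under Dirichlet(alpha); update weights.
            counts = r.T @ X + (alpha - 1.0)
            theta = counts / counts.sum(axis=1, keepdims=True)
            pi = r.mean(axis=0)

        labels = r.argmax(axis=1)
        print(np.bincount(labels))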

  2. MECHANISMS OF HYPNOSIS:

    PubMed Central

    Jensen, Mark P.; Adachi, Tomonori; Tomé-Pires, Catarina; Lee, Jikwan; Osman, Zubaidah Jamil; Miró, Jordi

    2014-01-01

    Evidence supports the efficacy of hypnotic treatments, but there remain many unresolved questions regarding how hypnosis produces its beneficial effects. Most theoretical models focus more or less on biological, psychological, and social factors. This scoping review summarizes the empirical findings regarding the associations between specific factors in each of these domains and response to hypnosis. The findings indicate that: (1) no single factor appears primary; (2) different factors may contribute more or less to outcomes in different subsets of individuals or for different conditions; and (3) comprehensive models of hypnosis that incorporate factors from all 3 domains may ultimately prove to be more useful than more restrictive models that focus on 1 or a very few factors. PMID:25365127

  3. [Association between single-person households and ambulatory treatment of endocrine and metabolic disease in Japan: analysis of the Comprehensive Survey of Living Conditions].

    PubMed

    Tsukinoki, Rumi; Murakami, Yoshitaka

    2014-01-01

    We examined the association between single-person households and ambulatory treatment of endocrine and metabolic disease in Japan. We used random sample data from the Comprehensive Survey of Living Conditions in 2003. The study included 11,928 participants aged ≥20 years, excluding inpatients and nursing home residents. Household status was categorized in terms of two groups: single-person household or multi-person household. Three age categories were used: 20-49, 50-64, and ≥65 years. Endocrine and metabolic disease was defined as the prevalence of diabetes, obesity, hyperlipidemia, and thyroid diseases. Men and women were analyzed separately. Logistic regression models were used to estimate the odds ratios (ORs) after adjusting for employment status, marital status, disability in activities of daily living, and smoking. The association between age, household, and ambulatory care for endocrine and metabolic disease was examined by a likelihood ratio test. There were 443 male and 529 female outpatients with endocrine and metabolic disease. In male outpatients from single-person households, the ORs for endocrine and metabolic disease were higher than for multi-person households across all age groups [single-person household, 1.62 (95% confidence interval: 1.03-2.56)]. The ORs for outpatients with endocrine and metabolic disease increased with age, and for those aged ≥65 years, these ORs increased gradually. There were no significant associations between age, households, and ambulatory care for endocrine and metabolic disease in men (for the interaction P=0.986). Furthermore, there was no significant association between single-person households and ambulatory care for endocrine and metabolic disease in women. The data from the national survey suggest that single-person households are a risk factor for endocrine and metabolic disease in Japanese men. Our findings indicate the need for management of endocrine and metabolic disease across all age groups.
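
    The kind of adjusted odds ratio reported above comes from exponentiating logistic regression coefficients. A minimal sketch follows; the column names and simulated data are hypothetical and are not the survey's actual fields or estimates.

        # Adjusted odds ratio for single-person household status from a logistic model.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n = 2000
        df = pd.DataFrame({
            "treated":  rng.binomial(1, 0.05, n),        # ambulatory treatment (outcome)
            "single":   rng.binomial(1, 0.3, n),         # single-person household
            "employed": rng.binomial(1, 0.6, n),
            "married":  rng.binomial(1, 0.5, n),
            "smoker":   rng.binomial(1, 0.25, n),
            "agegrp":   rng.choice(["20-49", "50-64", "65+"], n),
        })
        fit = smf.logit("treated ~ single + employed + married + smoker + C(agegrp)",
                        data=df).fit(disp=0)
        ors = np.exp(fit.params).rename("OR")
        ci = np.exp(fit.conf_int())
        ci.columns = ["2.5%", "97.5%"]
        print(pd.concat([ors, ci], axis=1).round(2))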

  4. An MPI-CUDA approach for hypersonic flows with detailed state-to-state air kinetics using a GPU cluster

    NASA Astrophysics Data System (ADS)

    Bonelli, Francesco; Tuttafesta, Michele; Colonna, Gianpiero; Cutrone, Luigi; Pascazio, Giuseppe

    2017-10-01

    This paper describes the most advanced results obtained in the context of fluid dynamic simulations of high-enthalpy flows using detailed state-to-state air kinetics. Thermochemical non-equilibrium, typical of supersonic and hypersonic flows, was modeled by using both the accurate state-to-state approach and the multi-temperature model proposed by Park. The accuracy of the two thermochemical non-equilibrium models was assessed by comparing the results with experimental findings, showing better predictions provided by the state-to-state approach. To overcome the huge computational cost of the state-to-state model, a multi-node GPU implementation, based on an MPI-CUDA approach, was employed and a comprehensive code performance analysis is presented. Both the pure MPI-CPU and the MPI-CUDA implementations exhibit excellent scalability performance. GPUs outperform CPUs, especially when the state-to-state approach is employed, showing single-GPU versus single-core-CPU speed-ups larger than 100 with both one MPI process and multiple MPI processes.
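
    The general MPI-CUDA pattern, one MPI rank per GPU with device-local kernels and message passing for global operations, can be sketched with mpi4py and CuPy. This is only a schematic of the parallelization idea under assumed tooling; it is unrelated to the authors' CFD solver and requires an MPI launcher and CUDA-capable GPUs to run.

        # One MPI rank per GPU; local work on the device, host-staged MPI reduction.
        # Run with, e.g.:  mpirun -np 4 python mpi_cuda_sketch.py
        from mpi4py import MPI
        import cupy as cp
        import numpy as np

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        cp.cuda.Device(rank % cp.cuda.runtime.getDeviceCount()).use()  # pin rank to a GPU

        # Each rank relaxes its own slab of the domain on its GPU (toy update).
        local = cp.full((1000, 1000), float(rank))
        for _ in range(10):
            local = 0.25 * (cp.roll(local, 1, 0) + cp.roll(local, -1, 0)
                            + cp.roll(local, 1, 1) + cp.roll(local, -1, 1))

        # Stage through host memory for a portable global reduction (no CUDA-aware MPI
        # assumed); a real solver would exchange ghost layers the same way.
        local_sum = np.array([float(local.sum())])
        global_sum = np.zeros(1)
        comm.Allreduce(local_sum, global_sum, op=MPI.SUM)
        if rank == 0:
            print("global sum:", global_sum[0])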

  5. Marine atmospheric effects on electro-optical systems performance

    NASA Astrophysics Data System (ADS)

    Richter, Juergen H.; Hughes, Herbert G.

    1990-09-01

    For the past twelve years, a coordinated tri-service effort has been underway in the United States Department of Defense to provide an atmospheric effects assessment capability for existing and planned electro-optical (EO) systems. This paper reviews the exploratory development effort in the US Navy. A key responsibility for the Navy was the development of marine aerosol models. An initial model, the Navy Aerosol Model (NAM), was developed, tested, and transitioned into LOWTRAN 6. A more comprehensive model, the Navy Oceanic Vertical Aerosol Model (NOVAM), has been formulated and is presently undergoing comprehensive evaluation and testing. Marine aerosols and their extinction properties are only one important factor in EO systems performance assessment. For many EO systems applications, an accurate knowledge of marine background radiances is required in addition to considering the effects of the intervening atmosphere. Accordingly, a capability was developed to estimate the apparent sea surface radiance for different sea states and meteorological conditions. Also, an empirical relationship was developed which directly relates apparent mean sea temperature to calculated mean sky temperature. In situ measurements of relevant environmental parameters are essential for real-time EO systems performance assessment. Direct measurement of slant path extinction would be most desirable. This motivated a careful investigation of lidar (light detection and ranging) techniques including improvements to single-ended lidar profile inversion algorithms and development of new lidar techniques such as double-ended and dual-angle configurations. It was concluded that single-ended, single-frequency lidars cannot be used to infer slant path extinction with the accuracy necessary to make meaningful performance assessments. Other lidar configurations may find limited application in model validation and research efforts. No technique has emerged yet which could be considered ready for shipboard implementation. A shipboard real-time performance assessment system was developed and named PREOS (Performance and Range for EO Systems). PREOS has been incorporated into the Navy's Tactical Environmental Support System (TESS). The present version of PREOS is a first step in accomplishing the complex task of real-time systems performance assessment. Improved target and background models are under development and will be incorporated into TESS when tested and validated. A reliable assessment capability can be used to develop Tactical Decision Aids (TDAs). TDAs permit optimum selection or combination of sensors and estimation of a ship's own vulnerability against hostile systems.

  6. Adaptation in Tunably Rugged Fitness Landscapes: The Rough Mount Fuji Model

    PubMed Central

    Neidhart, Johannes; Szendro, Ivan G.; Krug, Joachim

    2014-01-01

    Much of the current theory of adaptation is based on Gillespie’s mutational landscape model (MLM), which assumes that the fitness values of genotypes linked by single mutational steps are independent random variables. On the other hand, a growing body of empirical evidence shows that real fitness landscapes, while possessing a considerable amount of ruggedness, are smoother than predicted by the MLM. In the present article we propose and analyze a simple fitness landscape model with tunable ruggedness based on the rough Mount Fuji (RMF) model originally introduced by Aita et al. in the context of protein evolution. We provide a comprehensive collection of results pertaining to the topographical structure of RMF landscapes, including explicit formulas for the expected number of local fitness maxima, the location of the global peak, and the fitness correlation function. The statistics of single and multiple adaptive steps on the RMF landscape are explored mainly through simulations, and the results are compared to the known behavior in the MLM model. Finally, we show that the RMF model can explain the large number of second-step mutations observed on a highly fit first-step background in a recent evolution experiment with a microvirid bacteriophage. PMID:25123507
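
    The Rough Mount Fuji construction itself is simple enough to simulate directly: fitness is an additive slope toward a reference genotype plus an independent random component at every genotype. The parameter values below are arbitrary choices for illustration, not those analyzed in the article.

        # Enumerate a small Rough Mount Fuji landscape and count local fitness maxima.
        import itertools
        import numpy as np

        rng = np.random.default_rng(5)
        L, c, sigma = 10, 0.5, 1.0                       # loci, additive slope, roughness
        ref = np.zeros(L, dtype=int)                     # reference genotype

        genotypes = np.array(list(itertools.product([0, 1], repeat=L)))
        dist = (genotypes != ref).sum(axis=1)            # Hamming distance to reference
        fitness = -c * dist + sigma * rng.normal(size=len(genotypes))

        # A genotype is a local maximum if it beats all L single-mutant neighbours.
        index = {tuple(g): i for i, g in enumerate(genotypes)}
        flips = np.eye(L, dtype=int)
        n_maxima = 0
        for i, g in enumerate(genotypes):
            neighbours = [index[tuple(np.bitwise_xor(g, flips[j]))] for j in range(L)]
            if all(fitness[i] > fitness[j] for j in neighbours):
                n_maxima += 1
        print("local maxima:", n_maxima, "of", 2 ** L)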

  7. Revisiting the flocculation kinetics of destabilized asphaltenes.

    PubMed

    Vilas Bôas Fávero, Cláudio; Maqbool, Tabish; Hoepfner, Michael; Haji-Akbari, Nasim; Fogler, H Scott

    2017-06-01

    A comprehensive review of the recently published work on asphaltene destabilization and flocculation kinetics is presented. Four different experimental techniques were used to study asphaltenes undergoing flocculation process in crude oils and model oils. The asphaltenes were destabilized by different n-alkanes and a geometric population balance with the Smoluchowski collision kernel was used to model the asphaltene aggregation process. Additionally, by postulating a relation between the aggregation collision efficiency and the solubility parameter of asphaltenes and the solution, a unified model of asphaltene aggregation model was developed. When the aggregation model is applied to the experimental data obtained from several different crude oil and model oils, the detection time curves collapsed onto a universal single line, indicating that the model successfully captures the underlying physics of the observed process. Copyright © 2016 Elsevier B.V. All rights reserved.
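
    The population-balance idea can be reduced to the classical discrete Smoluchowski equations, with a collision efficiency multiplying the kernel. The constant kernel, efficiency, and initial condition below are placeholders for illustration; the paper's model uses a geometric (sectional) population balance and an efficiency tied to solubility parameters.

        # Discrete Smoluchowski coagulation with a constant kernel scaled by a
        # collision efficiency (minimal sketch, truncated at K_MAX aggregate sizes).
        import numpy as np
        from scipy.integrate import solve_ivp

        K_MAX, beta, eff = 30, 1e-3, 0.1                 # sizes, kernel, efficiency

        def smoluchowski(t, n):
            dn = np.zeros_like(n)
            total = n.sum()
            for k in range(K_MAX):
                birth = 0.5 * sum(n[i] * n[k - 1 - i] for i in range(k))  # sizes i+j = k+1
                death = n[k] * total
                dn[k] = eff * beta * (birth - death)
            return dn

        n0 = np.zeros(K_MAX)
        n0[0] = 1000.0                                   # start from primary particles
        sol = solve_ivp(smoluchowski, (0.0, 50.0), n0, t_eval=[0, 10, 25, 50])
        print(np.round(sol.y[:5, -1], 2))                # five smallest size classes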

  8. Drug target inference through pathway analysis of genomics data

    PubMed Central

    Ma, Haisu; Zhao, Hongyu

    2013-01-01

    Statistical modeling coupled with bioinformatics is commonly used for drug discovery. Although there exist many approaches for single-target-based drug design and target inference, recent years have seen a paradigm shift to system-level pharmacological research. Pathway analysis of genomics data represents one promising direction for computational inference of drug targets. This article aims at providing a comprehensive review on the evolving issues in this field, covering methodological developments, their pros and cons, as well as future research directions. PMID:23369829

  9. Comprehensiveness of care from the patient perspective: comparison of primary healthcare evaluation instruments.

    PubMed

    Haggerty, Jeannie L; Beaulieu, Marie-Dominique; Pineault, Raynald; Burge, Frederick; Lévesque, Jean-Frédéric; Santor, Darcy A; Bouharaoui, Fatima; Beaulieu, Christine

    2011-12-01

    Comprehensiveness relates both to scope of services offered and to a whole-person clinical approach. Comprehensive services are defined as "the provision, either directly or indirectly, of a full range of services to meet most patients' healthcare needs"; whole-person care is "the extent to which a provider elicits and considers the physical, emotional and social aspects of a patient's health and considers the community context in their care." Among instruments that evaluate primary healthcare, two had subscales that mapped to comprehensive services and to the community component of whole-person care: the Primary Care Assessment Tool - Short Form (PCAT-S) and the Components of Primary Care Index (CPCI, a limited measure of whole-person care). To examine how well comprehensiveness is captured in validated instruments that evaluate primary healthcare from the patient's perspective. 645 adults with at least one healthcare contact in the previous 12 months responded to six instruments that evaluate primary healthcare. Scores were normalized for descriptive comparison. Exploratory and confirmatory (structural equation modelling) factor analysis examined fit to operational definition, and item response theory analysis examined item performance on common constructs. Over one-quarter of respondents had missing responses on services offered or doctor's knowledge of the community. The subscales did not load on a single factor; comprehensive services and community orientation were examined separately. The community orientation subscales did not perform satisfactorily. The three comprehensive services subscales fit very modestly onto two factors: (1) most healthcare needs (from one provider) (CPCI Comprehensive Care, PCAT-S First-Contact Utilization) and (2) range of services (PCAT-S Comprehensive Services Available). Individual item performance revealed several problems. Measurement of comprehensiveness is problematic, making this attribute a priority for measure development. Range of services offered is best obtained from providers. Whole-person care is not addressed as a separate construct, but some dimensions are covered by attributes such as interpersonal communication and relational continuity.

  10. Modeling of abnormal mechanical properties of nickel-based single crystal superalloy by three-dimensional discrete dislocation dynamics

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Li, Zhenhuan; Huang, Minsheng

    2014-12-01

    Unlike common single crystals, the nickel-based single crystal superalloy shows surprisingly anomalous flow strength (i.e. with the increase of temperature, the yield strength first increases to a peak value and then decreases) and tension-compression (TC) asymmetry. A comprehensive three-dimensional discrete dislocation dynamics (3D-DDD) procedure was developed to model these abnormal mechanical properties. For this purpose, a series of complicated dynamic evolution details of Kear-Wilsdorf (KW) locks, which are closely related to the flow strength anomaly and TC asymmetry, were incorporated into this 3D-DDD framework. Moreover, the activation of the cubic slip system, which is the origin of the decrease in yield strength with increasing temperature at relatively high temperatures, was especially taken into account by introducing a competition criterion between the unlocking of the KW locks and the activation of the cubic slip system. To test our framework, a series of 3D-DDD simulations were performed on a representative volume cell model with a cuboidal Ni3Al precipitate phase embedded in a nickel matrix. Results show that the present 3D-DDD procedure can successfully capture the dynamic evolution of KW locks, the flow strength anomaly and TC asymmetry. Then, the underlying dislocation mechanisms leading to these abnormal mechanical responses were investigated and discussed in detail. Finally, a cyclic deformation of the nickel-based single crystal superalloy was modeled by using the present DDD model, with a special focus on the influence of KW locks on the Bauschinger effect and cyclic softening.

  11. RFA Guardian: Comprehensive Simulation of Radiofrequency Ablation Treatment of Liver Tumors.

    PubMed

    Voglreiter, Philip; Mariappan, Panchatcharam; Pollari, Mika; Flanagan, Ronan; Blanco Sequeiros, Roberto; Portugaller, Rupert Horst; Fütterer, Jurgen; Schmalstieg, Dieter; Kolesnik, Marina; Moche, Michael

    2018-01-15

    The RFA Guardian is a comprehensive application for high-performance patient-specific simulation of radiofrequency ablation of liver tumors. We address a wide range of usage scenarios. These include pre-interventional planning, sampling of the parameter space for uncertainty estimation, treatment evaluation and, in the worst case, failure analysis. The RFA Guardian is the first of its kind that exhibits sufficient performance for simulating treatment outcomes during the intervention. We achieve this by combining a large number of high-performance image processing, biomechanical simulation and visualization techniques into a generalized technical workflow. Further, we wrap the feature set into a single, integrated application, which exploits all available resources of standard consumer hardware, including massively parallel computing on graphics processing units. This allows us to predict or reproduce treatment outcomes on a single personal computer with high computational performance and high accuracy. The resulting low demand for infrastructure enables easy and cost-efficient integration into the clinical routine. We present a number of evaluation cases from the clinical practice where users performed the whole technical workflow from patient-specific modeling to final validation and highlight the opportunities arising from our fast, accurate prediction techniques.

  12. Diffraction-geometry refinement in the DIALS framework

    DOE PAGES

    Waterman, David G.; Winter, Graeme; Gildea, Richard J.; ...

    2016-03-30

    Rapid data collection and modern computing resources provide the opportunity to revisit the task of optimizing the model of diffraction geometry prior to integration. A comprehensive description is given of new software that builds upon established methods by performing a single global refinement procedure, utilizing a smoothly varying model of the crystal lattice where appropriate. This global refinement technique extends to multiple data sets, providing useful constraints to handle the problem of correlated parameters, particularly for small wedges of data. Examples of advanced uses of the software are given and the design is explained in detail, with particular emphasis on the flexibility and extensibility it entails.

  13. Broadband microwave photonic fully tunable filter using a single heterogeneously integrated III-V/SOI-microdisk-based phase shifter.

    PubMed

    Lloret, Juan; Morthier, Geert; Ramos, Francisco; Sales, Salvador; Van Thourhout, Dries; Spuesens, Thijs; Olivier, Nicolas; Fédéli, Jean-Marc; Capmany, José

    2012-05-07

    A broadband microwave photonic phase shifter based on a single III-V microdisk resonator heterogeneously integrated on and coupled to a nanophotonic silicon-on-insulator waveguide is reported. The phase shift tunability is accomplished by modifying the effective index through carrier injection. A comprehensive semi-analytical model aiming at predicting its behavior is formulated and confirmed by measurements. Quasi-linear and continuously tunable 2π phase shifts at radiofrequencies greater than 18 GHz are experimentally demonstrated. The phase shifter performance is also evaluated when used as a key element in tunable filtering schemes. Distortion-free and wideband filtering responses with a tuning range of ~100% over the free spectral range are obtained.

  14. A Three-Dimensional Variational Assimilation Scheme for Satellite AOD

    NASA Astrophysics Data System (ADS)

    Liang, Y.; Zang, Z.; You, W.

    2018-04-01

    A three-dimensional variational data assimilation scheme is designed for satellite AOD based on the IMPROVE (Interagency Monitoring of Protected Visual Environments) equation. The observation operator that simulates AOD from the control variables is established by the IMPROVE equation. All of the 16 control variables in the assimilation scheme are the mass concentrations of aerosol species from the Model for Simulating Aerosol Interactions and Chemistry scheme, so as to take advantage of this scheme in providing comprehensive analyses of species concentrations and size distributions while remaining computationally efficient. The assimilation scheme can save computational resources as the IMPROVE equation is a quadratic equation. A single-point observation experiment shows that the information from the single-point AOD is effectively spread horizontally and vertically.
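
    Schemes of this kind minimize the standard three-dimensional variational cost function; the notation below is the usual convention and is not quoted from the paper:

        J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
                      + \tfrac{1}{2}\,\bigl[\mathbf{y}-H(\mathbf{x})\bigr]^{\mathsf T}\mathbf{R}^{-1}\bigl[\mathbf{y}-H(\mathbf{x})\bigr]

    where x is the vector of aerosol control variables, x_b the background state, B and R the background and observation error covariances, y the observed AOD, and H the observation operator, here given by the IMPROVE equation, whose quadratic form keeps the minimization inexpensive.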

  15. Ultimate patterning limits for EUV at 5nm node and beyond

    NASA Astrophysics Data System (ADS)

    Ali, Rehab Kotb; Hamed Fatehy, Ahmed; Lafferty, Neal; Word, James

    2018-03-01

    The 5nm technology node introduces more aggressive geometries than previous nodes. In this paper, we present a comprehensive study examining the patterning limits of EUV at 0.33 NA. The study is divided into two main approaches: (A) exploring the patterning limits of a single-exposure EUV cut/block mask in a Self-Aligned Multi-Patterning (SAMP) process, and (B) exploring the patterning limits of single-exposure EUV printing of metal layers. The printability of the resulting OPC masks is checked through a model-based manufacturing flow for the two patterning approaches. The final manufactured patterns are quantified by Edge Placement Error (EPE), Process Variation Band (PVBand), soft/hard bridging and pinching, Image Log Slope (ILS), and Common Depth of Focus (CDOF).

  16. Perinatal nursing education for single-room maternity care: an evaluation of a competency-based model.

    PubMed

    Janssen, Patricia A; Keen, Lois; Soolsma, Jetty; Seymour, Laurie C; Harris, Susan J; Klein, Michael C; Reime, Birgit

    2005-01-01

    To evaluate the success of a competency-based nursing orientation programme for a single-room maternity care unit by measuring improvement in self-reported competency after six months. Single-room maternity care has challenged obstetrical nurses to provide comprehensive nursing care during all phases of the in-hospital birth experience. In this model, nurses provide intrapartum, postpartum and newborn care in one room. To date, an evaluation of nursing education for single-room maternity care has not been published. A prospective cohort design comparing self-reported competencies prior to starting work in the single-room maternity care and six months after. Nurses completed a competency-based education programme in which they could select from a menu of learning methods and content areas according to their individual needs. Learning methods included classroom lectures, self-paced learning packages, and preceptorships in the clinical area. Competencies were measured by a standardized perinatal self-efficacy tool and a tool developed by the authors for this study, the Single-Room Maternity Care Competency Tool. A paired analysis was undertaken to take into account the paired (before and after) nature of the design. Scores on the perinatal self-efficacy scale and the single-room maternity care competency tool were improved. These differences were statistically significant. Improvements in perinatal and single-room maternity care-specific competencies suggest that our education programme was successful in preparing nurses for their new role in the single-room maternity care setting. This conclusion is supported by reported increases in nursing and patient satisfaction in the single-room maternity care compared with the traditional labour/delivery and postpartum settings. An education programme tailored to the learning needs of experienced clinical nurses contributes to improvements in nursing competencies and patient care.

  17. Academic Achievement of Deaf and Hard-of-Hearing Students in an ASL/English Bilingual Program

    PubMed Central

    Wilbur, Ronnie B.

    2016-01-01

    There has been a scarcity of studies exploring the influence of students’ American Sign Language (ASL) proficiency on their academic achievement in ASL/English bilingual programs. The aim of this study was to determine the effects of ASL proficiency on reading comprehension skills and academic achievement of 85 deaf or hard-of-hearing signing students. Two subgroups, differing in ASL proficiency, were compared on the Northwest Evaluation Association Measures of Academic Progress and the reading comprehension subtest of the Stanford Achievement Test, 10th edition. Findings suggested that students highly proficient in ASL outperformed their less proficient peers in nationally standardized measures of reading comprehension, English language use, and mathematics. Moreover, a regression model consisting of 5 predictors including variables regarding education, hearing devices, and secondary disabilities as well as ASL proficiency and home language showed that ASL proficiency was the single variable significantly predicting results on all outcome measures. This study calls for a paradigm shift in thinking about deaf education by focusing on characteristics shared among successful deaf signing readers, specifically ASL fluency. PMID:26864688

  18. Oscillation mechanics of the respiratory system.

    PubMed

    Bates, Jason H T; Irvin, Charles G; Farré, Ramon; Hantos, Zoltán

    2011-07-01

    The mechanical impedance of the respiratory system defines the pressure profile required to drive a unit of oscillatory flow into the lungs. Impedance is a function of oscillation frequency, and is measured using the forced oscillation technique. Digital signal processing methods, most notably the Fourier transform, are used to calculate impedance from measured oscillatory pressures and flows. Impedance is a complex function of frequency, having both real and imaginary parts that vary with frequency in ways that can be used empirically to distinguish normal lung function from a variety of different pathologies. The most useful diagnostic information is gained when anatomically based mathematical models are fit to measurements of impedance. The simplest such model consists of a single flow-resistive conduit connecting to a single elastic compartment. Models of greater complexity may have two or more compartments, and provide more accurate fits to impedance measurements over a variety of different frequency ranges. The model that currently enjoys the widest application in studies of animal models of lung disease consists of a single airway serving an alveolar compartment comprising tissue with a constant-phase impedance. This model has been shown to fit very accurately to a wide range of impedance data, yet contains only four free parameters, and as such is highly parsimonious. The measurement of impedance in human patients is also now rapidly gaining acceptance, and promises to provide a more comprehensible assessment of lung function than parameters derived from conventional spirometry. © 2011 American Physiological Society.
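
    The constant-phase tissue model referred to above is commonly written, for angular frequency ω, as

        Z(\omega) = R_{\mathrm{N}} + j\omega I_{\mathrm{aw}} + \frac{G - jH}{\omega^{\alpha}},
        \qquad \alpha = \frac{2}{\pi}\arctan\!\left(\frac{H}{G}\right)

    where the four free parameters are the Newtonian (airway) resistance R_N, the airway inertance I_aw, the tissue damping G, and the tissue elastance H; this is the standard formulation and is given here for orientation rather than quoted from the article.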

  19. Hospital daily outpatient visits forecasting using a combinatorial model based on ARIMA and SES models.

    PubMed

    Luo, Li; Luo, Le; Zhang, Xinli; He, Xiaoli

    2017-07-10

    Accurate forecasting of hospital outpatient visits is beneficial for the reasonable planning and allocation of healthcare resources to meet medical demand. Given the multiple attributes of daily outpatient visits, such as randomness, cyclicity, and trend, time series methods such as ARIMA can be a good choice for outpatient visit forecasting. On the other hand, hospital outpatient visits are also affected by doctors' scheduling, and these effects are not purely random. Considering this mixed character, this paper presents a new forecasting model that takes cyclicity and the day-of-the-week effect into consideration. We formulate a seasonal ARIMA (SARIMA) model on the daily time series and a single exponential smoothing (SES) model on the day-of-the-week time series, and finally establish a combinatorial model by modifying them. The models are applied to 1 year of daily visit data for urban outpatients in two internal medicine departments of a large hospital in Chengdu, to forecast daily outpatient visits about 1 week ahead. The proposed model is applied to forecast cross-sectional data for 7 consecutive days of daily outpatient visits over an 8-week period, based on 43 weeks of observation data during 1 year. The results show that the two single traditional models and the combinatorial model are simple to implement and computationally light, whilst being appropriate for short-term forecast horizons. Furthermore, the combinatorial model captures the comprehensive features of the time series data better. The combinatorial model achieves better prediction performance than either single model, with lower residual variance and a small mean residual error, which will be optimized further in the next stage of research.
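
    A minimal sketch of the SARIMA-plus-SES combination follows, using statsmodels; the model orders, combination weight, and simulated visit series are illustrative assumptions, not the paper's fitted values.

        # SARIMA on the daily series, SES per day of the week, weighted combination.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX
        from statsmodels.tsa.holtwinters import SimpleExpSmoothing

        rng = np.random.default_rng(6)
        idx = pd.date_range("2015-01-01", periods=301, freq="D")
        visits = pd.Series(200 + 30 * np.sin(2 * np.pi * idx.dayofweek / 7)
                           + rng.normal(0, 10, len(idx)), index=idx)
        train, horizon = visits.iloc[:-7], 7

        # Component 1: weekly-seasonal ARIMA on the daily series.
        sarima = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 7)).fit(disp=0)
        f_sarima = np.asarray(sarima.forecast(horizon))

        # Component 2: one simple exponential smoothing model per day of the week.
        f_ses = []
        for day in visits.index[-7:]:
            series = train[train.index.dayofweek == day.dayofweek].to_numpy()
            f_ses.append(float(SimpleExpSmoothing(series).fit().forecast(1)[0]))
        f_ses = np.asarray(f_ses)

        # Combination with a fixed weight (could instead be chosen by validation error).
        w = 0.6
        print(np.round(w * f_sarima + (1 - w) * f_ses, 1))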

  20. Comprehensive analysis of information dissemination in disasters

    NASA Astrophysics Data System (ADS)

    Zhang, N.; Huang, H.; Su, Boni

    2016-11-01

    China is a country that experiences a large number of disasters. The number of deaths caused by large-scale disasters and accidents in the past 10 years is around 900,000. More than 92.8 percent of these deaths could have been avoided if an effective pre-warning system had been deployed. Knowledge of the information dissemination characteristics of different information media, taking into consideration governmental assistance (information published by a government) during disasters in urban areas, plays a critical role in increasing the time available to respond and reducing the number of deaths and economic losses. In this paper we develop a comprehensive information dissemination model to optimize the efficiency of pre-warning mechanisms. This model can also be used to disseminate information to evacuees making real-time evacuation plans. We analyzed individual information dissemination models for pre-warning in disasters by considering 14 media: short message service (SMS), phone, television, radio, news portals, WeChat, microblogs, email, newspapers, loudspeaker vehicles, loudspeakers, oral communication, and passive information acquisition via visual and auditory senses. Since governmental assistance is very useful in a disaster, we calculated the sensitivity of the governmental assistance ratio. The results provide useful references for information dissemination during disasters in urban areas.

  1. The Problem of Size in Robust Design

    NASA Technical Reports Server (NTRS)

    Koch, Patrick N.; Allen, Janet K.; Mistree, Farrokh; Mavris, Dimitri

    1997-01-01

    To facilitate the effective solution of multidisciplinary, multiobjective complex design problems, a departure from the traditional parametric design analysis and single objective optimization approaches is necessary in the preliminary stages of design. A necessary tradeoff becomes one of efficiency vs. accuracy as approximate models are sought to allow fast analysis and effective exploration of a preliminary design space. In this paper we apply a general robust design approach for efficient and comprehensive preliminary design to a large complex system: a high speed civil transport (HSCT) aircraft. Specifically, we investigate the HSCT wing configuration design, incorporating life cycle economic uncertainties to identify economically robust solutions. The approach is built on the foundation of statistical experimentation and modeling techniques and robust design principles, and is specialized through incorporation of the compromise Decision Support Problem for multiobjective design. For large problems, however, as in the HSCT example, this robust design approach developed for efficient and comprehensive design breaks down with the problem of size - a combinatorial explosion in experimentation and model building with the number of variables - and both efficiency and accuracy are sacrificed. Our focus in this paper is on identifying and discussing the implications and open issues associated with the problem of size for the preliminary design of large complex systems.

  2. Comparing Students With and Without Reading Difficulties on Reading Comprehension Assessments: A Meta-Analysis.

    PubMed

    Collins, Alyson A; Lindström, Esther R; Compton, Donald L

    Researchers have increasingly investigated sources of variance in reading comprehension test scores, particularly with students with reading difficulties (RD). The purpose of this meta-analysis was to determine if the achievement gap between students with RD and typically developing (TD) students varies as a function of different reading comprehension response formats (e.g., multiple choice, cloze). A systematic literature review identified 82 eligible studies. All studies administered reading comprehension assessments to students with RD and TD students in Grades K-12. Hedges' g standardized mean difference effect sizes were calculated, and random-effects robust variance estimation techniques were used to aggregate average weighted effect sizes for each response format. Results indicated that the achievement gap between students with RD and TD students was larger for some response formats (e.g., picture selection ES g = -1.80) than others (e.g., retell ES g = -0.60). Moreover, for multiple-choice, cloze, and open-ended question response formats, single-predictor metaregression models explored potential moderators of heterogeneity in effect sizes. No clear patterns, however, emerged in regard to moderators of heterogeneity in effect sizes across response formats. Findings suggest that the use of different response formats may lead to variability in the achievement gap between students with RD and TD students.
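
    The effect size used here, Hedges' g, is Cohen's d with a small-sample bias correction; a tiny helper makes the computation concrete (the group statistics below are invented, not taken from the meta-analysis).

        # Hedges' g for one study comparing an RD group with a TD group.
        import numpy as np

        def hedges_g(m1, sd1, n1, m2, sd2, n2):
            """Bias-corrected standardized mean difference (group 1 minus group 2)."""
            sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
            d = (m1 - m2) / sp                     # Cohen's d
            j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)    # small-sample correction factor
            return j * d

        # Hypothetical RD group scoring below a TD group on a comprehension test:
        print(round(hedges_g(82.0, 12.0, 40, 95.0, 11.0, 45), 2))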

  3. [Evaluation of comprehensive capacity of resources and environments in Poyang Lake Eco-economic Zone].

    PubMed

    Song, Yan-Chun; Yu, Dan

    2014-10-01

    With the development of society and the economy, the contradictions among population, resources, and environment are becoming increasingly severe. As a result, the capacity of resources and environment has become one of the focal issues for many countries and regions. Through investigating and analyzing the present situation and existing problems of resources and environment in the Poyang Lake Eco-economic Zone, seven factors were chosen as the evaluation criterion layer, namely land resources, water resources, biological resources, mineral resources, ecological-geological environment, water environment, and atmospheric environment. Based on the single-factor evaluation results and with the county as the evaluation unit, the comprehensive capacity of resources and environment in the Poyang Lake Eco-economic Zone was evaluated using the state space method. The results showed that the zone boasts abundant biological resources, a good atmospheric and water environment, and a relatively stable geological environment, while being restricted by land, water, and mineral resources. Although the comprehensive capacity of resources and environment in the Poyang Lake Eco-economic Zone is currently not overloaded as a whole, it is overloaded in some counties/districts. The state space model, with clear indication and high accuracy, could serve as another approach to evaluating the comprehensive capacity of regional resources and environment.

  4. Revisiting the Incremental Effects of Context on Word Processing: Evidence from Single-Word Event-Related Brain Potentials

    PubMed Central

    Payne, Brennan R.; Lee, Chia-Lin; Federmeier, Kara D.

    2015-01-01

    The amplitude of the N400— an event-related potential (ERP) component linked to meaning processing and initial access to semantic memory— is inversely related to the incremental build-up of semantic context over the course of a sentence. We revisited the nature and scope of this incremental context effect, adopting a word-level linear mixed-effects modeling approach, with the goal of probing the continuous and incremental effects of semantic and syntactic context on multiple aspects of lexical processing during sentence comprehension (i.e., effects of word frequency and orthographic neighborhood). First, we replicated the classic word position effect at the single-word level: open-class words showed reductions in N400 amplitude with increasing word position in semantically congruent sentences only. Importantly, we found that accruing sentence context had separable influences on the effects of frequency and neighborhood on the N400. Word frequency effects were reduced with accumulating semantic context. However, orthographic neighborhood was unaffected by accumulating context, showing robust effects on the N400 across all words, even within congruent sentences. Additionally, we found that N400 amplitudes to closed-class words were reduced with incrementally constraining syntactic context in sentences that provided only syntactic constraints. Taken together, our findings indicate that modeling word-level variability in ERPs reveals mechanisms by which different sources of information simultaneously contribute to the unfolding neural dynamics of comprehension. PMID:26311477

  5. Revisiting the incremental effects of context on word processing: Evidence from single-word event-related brain potentials.

    PubMed

    Payne, Brennan R; Lee, Chia-Lin; Federmeier, Kara D

    2015-11-01

    The amplitude of the N400-an event-related potential (ERP) component linked to meaning processing and initial access to semantic memory-is inversely related to the incremental buildup of semantic context over the course of a sentence. We revisited the nature and scope of this incremental context effect, adopting a word-level linear mixed-effects modeling approach, with the goal of probing the continuous and incremental effects of semantic and syntactic context on multiple aspects of lexical processing during sentence comprehension (i.e., effects of word frequency and orthographic neighborhood). First, we replicated the classic word-position effect at the single-word level: Open-class words showed reductions in N400 amplitude with increasing word position in semantically congruent sentences only. Importantly, we found that accruing sentence context had separable influences on the effects of frequency and neighborhood on the N400. Word frequency effects were reduced with accumulating semantic context. However, orthographic neighborhood was unaffected by accumulating context, showing robust effects on the N400 across all words, even within congruent sentences. Additionally, we found that N400 amplitudes to closed-class words were reduced with incrementally constraining syntactic context in sentences that provided only syntactic constraints. Taken together, our findings indicate that modeling word-level variability in ERPs reveals mechanisms by which different sources of information simultaneously contribute to the unfolding neural dynamics of comprehension. © 2015 Society for Psychophysiological Research.
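
    The word-level mixed-effects approach amounts to regressing single-trial amplitudes on word position, congruency, frequency, and neighborhood with by-subject random effects. The sketch below is a minimal random-intercept version with hypothetical column names and simulated data; the published analyses used richer model specifications.

        # Word-level linear mixed-effects model for single-trial N400 amplitudes.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n_subj, n_words = 20, 200
        df = pd.DataFrame({
            "subject":   np.repeat(np.arange(n_subj), n_words),
            "word_pos":  np.tile(np.arange(n_words) % 10 + 1, n_subj),
            "congruent": rng.binomial(1, 0.5, n_subj * n_words),
            "log_freq":  rng.normal(3, 1, n_subj * n_words),
            "neighbors": rng.poisson(5, n_subj * n_words),
        })
        subj_int = rng.normal(0, 1, n_subj)[df["subject"].to_numpy()]
        df["n400"] = (-2.0 + 0.1 * df["word_pos"] * df["congruent"]
                      + 0.2 * df["log_freq"] - 0.1 * df["neighbors"]
                      + subj_int + rng.normal(0, 1, len(df)))

        # Random intercepts by subject; crossed random item effects would need more.
        m = smf.mixedlm("n400 ~ word_pos * congruent + log_freq + neighbors",
                        data=df, groups="subject").fit()
        print(m.summary())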

  6. Cortico-striatal language pathways dynamically adjust for syntactic complexity: A computational study.

    PubMed

    Szalisznyó, Krisztina; Silverstein, David; Teichmann, Marc; Duffau, Hugues; Smits, Anja

    2017-01-01

    A growing body of literature supports a key role of fronto-striatal circuits in language perception. It is now known that the striatum plays a role in engaging attentional resources and linguistic rule computation while also serving phonological short-term memory capabilities. The ventral semantic and the dorsal phonological stream dichotomy assumed for spoken language processing also seems to play a role in cortico-striatal perception. Based on recent studies that correlate deep Broca-striatal pathways with complex syntax performance, we used a previously developed computational model of frontal-striatal syntax circuits and hypothesized that different parallel language pathways may contribute to canonical and non-canonical sentence comprehension separately. We modified and further analyzed a thematic role assignment task and the corresponding reservoir computing model of language circuits, as previously developed by Dominey and coworkers. We examined the model's performance under various parameter regimes, by influencing how fast the presented language input decays and altering the temporal dynamics of activated word representations. This enabled us to quantify canonical and non-canonical sentence comprehension abilities. The modeling results suggest that separate cortico-cortical and cortico-striatal circuits may be recruited differently for processing syntactically more difficult and less complicated sentences. Alternatively, a single circuit would need to dynamically and adaptively adjust to syntactic complexity. Copyright © 2016. Published by Elsevier Inc.

  7. The advancement of the built environment research through employment of structural equation modeling (SEM)

    NASA Astrophysics Data System (ADS)

    Wasilah, S.; Fahmyddin, T.

    2018-03-01

    The employment of structural equation modeling (SEM) in research has attracted increasing attention among researchers in the built environment. There is a gap in understanding the attributes, application, and importance of this approach to data analysis in built environment research. This paper intends to provide a fundamental comprehension of the SEM method in data analysis, unveiling its attributes, employment, and significance, and presents cases to assess associations among variables and constructs. The study uses key literature to grasp the essence of SEM with regard to built environment research. A better acknowledgment of this analytical tool may assist researchers in the built environment in analyzing data under complex research questions and in testing multivariate models in a single study.

  8. A high power, pulsed, microwave amplifier for a synthetic aperture radar electrical model. Phase 1: Design

    NASA Astrophysics Data System (ADS)

    Atkinson, J. E.; Barker, G. G.; Feltham, S. J.; Gabrielson, S.; Lane, P. C.; Matthews, V. J.; Perring, D.; Randall, J. P.; Saunders, J. W.; Tuck, R. A.

    1982-05-01

    An electrical model klystron amplifier was designed. Its features include a gridded gun, a single stage depressed collector, a rare earth permanent magnet focusing system, an input loop, six rugged tuners and a coaxial line output section incorporating a coaxial-to-waveguide transducer and a pillbox window. At each stage of the design, the thermal and mechanical aspects were investigated and optimized within the framework of the RF specification. Extensive use was made of data from the preliminary design study and from RF measurements on the breadboard model. In an additional study, a comprehensive draft tube specification has been produced. Great emphasis has been laid on a second additional study on space-qualified materials and processes.

  9. Cross-Cultural Evaluation of Antonovsky's Orientation to Life Questionnaire: Comparison Between Australian, Finnish, and Turkish Young Adults.

    PubMed

    Lajunen, Timo

    2018-01-01

    Antonovsky's concept "sense of coherence" (SOC) and the related measurement instrument "The Orientation to Life Questionnaire" (OLQ) have been widely applied in studies on health and well-being. The purpose of the present study is to investigate the cultural differences in factor structures and psychometric properties as well as mean scores of the 13-item form of Antonovsky's OLQ among Australian (n = 201), Finnish (n = 203), and Turkish (n = 152) students. Three models of factor structure were studied by using confirmatory factor analysis: a single-factor model, a first-order correlated three-factor model, and a second-order three-factor model. Results obtained in all three countries suggest that the first- and second-order three-factor models fitted the data better than the single-factor model. Hence, the OLQ scoring based on comprehensibility, manageability, and meaningfulness scales was supported. Scale reliabilities and inter-correlations were in line with those reported in earlier studies. Two-way analyses of variance (gender × nationality) with age as a covariate showed no cultural differences in SOC scale scores. Women scored higher on the meaningfulness scale than men, and age was positively related to all SOC scale scores, indicating that SOC increases in early adulthood. The results support the three-factor model of the OLQ, which thus should be used in Australia, Finland, and Turkey instead of a single-factor model. The need for cross-cultural studies taking into account cultural correlates of SOC and its relation to health and well-being indicators, as well as studies on gender differences in the OLQ, is emphasized.

  10. Comprehensive stroke units: a review of comparative evidence and experience.

    PubMed

    Chan, Daniel K Y; Cordato, Dennis; O'Rourke, Fintan; Chan, Daniel L; Pollack, Michael; Middleton, Sandy; Levi, Chris

    2013-06-01

    Stroke unit care offers significant benefits in survival and dependency when compared to care on a general medical ward. Most stroke units are either acute or rehabilitation units, but the comprehensive (combined acute and rehabilitation) model (comprehensive stroke unit) is less common. To examine different levels of evidence for the comprehensive stroke unit compared to other organized inpatient stroke care and to share local experience of comprehensive stroke units. Cochrane Library and Medline (1980 to December 2010) review of English-language articles comparing stroke units to alternative forms of stroke care delivery, different types of stroke unit models, and differences in processes of care within different stroke unit models. Different levels of evidence comparing comprehensive stroke units to other stroke unit models are collected. There are no randomized controlled trials directly comparing comprehensive stroke units to other stroke unit models (either acute or rehabilitation). Comprehensive stroke units are associated with reduced length of stay and the greatest reduction in combined death and dependency in a meta-analysis when compared to other stroke unit models. Comprehensive stroke units also have better length of stay and functional outcome when compared to acute or rehabilitation stroke unit models in a cross-sectional study, and better length of stay in a 'before-and-after' comparative study. Components of stroke unit care that improve outcome are multifactorial and most probably include early mobilization. A comprehensive stroke unit model has been successfully implemented in metropolitan and rural hospital settings. Comprehensive stroke units are associated with reductions in length of stay and combined death and dependency and improved functional outcomes compared to other stroke unit models. A comprehensive stroke unit model is worth considering as the preferred model of stroke unit care in the planning and delivery of metropolitan and rural stroke services. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.

  11. WWC Review of the Report “Improving Reading Comprehension and Social Studies Knowledge in Middle School.” What Works Clearinghouse Single Study Review

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2013

    2013-01-01

    The study reviewed in this paper examined the effects of the instructional practice “Promoting Acceleration of Comprehension and Content Through Text” (“PACT”), an approach that aims to improve social studies content knowledge and reading comprehension. This study took place in two middle schools in a near-urban district in Texas. Study authors…

  12. Single-cell transcriptomics for microbial eukaryotes.

    PubMed

    Kolisko, Martin; Boscaro, Vittorio; Burki, Fabien; Lynn, Denis H; Keeling, Patrick J

    2014-11-17

    One of the greatest hindrances to a comprehensive understanding of microbial genomics, cell biology, ecology, and evolution is that most microbial life is not in culture. Solutions to this problem have mainly focused on whole-community surveys like metagenomics, but these analyses inevitably lose information and present particular challenges for eukaryotes, which are relatively rare and possess large, gene-sparse genomes. Single-cell analyses present an alternative solution that allows for specific species to be targeted, while retaining information on cellular identity, morphology, and partitioning of activities within microbial communities. Single-cell transcriptomics, pioneered in medical research, offers particular potential advantages for uncultivated eukaryotes, but the efficiency and biases have not been tested. Here we describe a simple and reproducible method for single-cell transcriptomics using manually isolated cells from five model ciliate species; we examine impacts of amplification bias and contamination, and compare the efficacy of gene discovery to traditional culture-based transcriptomics. Gene discovery using single-cell transcriptomes was found to be comparable to mass-culture methods, suggesting single-cell transcriptomics is an efficient entry point into genomic data from the vast majority of eukaryotic biodiversity. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. CSRQ Center Report on Elementary School Comprehensive School Reform Models: Educator's Summary

    ERIC Educational Resources Information Center

    Center for Data-Driven Reform in Education (NJ3), 2008

    2008-01-01

    Which comprehensive school reform programs have evidence of positive effects on elementary school achievement? To find out, this review summarizes evidence on comprehensive school reform (CSR) models in elementary schools, grades K-6. Comprehensive school reform models are programs used schoolwide to improve student achievement. They typically…

  14. Quantum interference of independently generated telecom-band single photons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, Monika; Altepeter, Joseph B.; Huang, Yu-Ping

    We report on high-visibility quantum interference of independently generated telecom O-band (1310 nm) single photons using standard single-mode fibers. The experimental data are shown to agree well with the results of simulations using a comprehensive quantum multimode theory without the need for any fitting parameter.

  15. Hybrid quantum-classical modeling of quantum dot devices

    NASA Astrophysics Data System (ADS)

    Kantner, Markus; Mittnenzweig, Markus; Koprucki, Thomas

    2017-11-01

    The design of electrically driven quantum dot devices for quantum optical applications asks for modeling approaches combining classical device physics with quantum mechanics. We connect the well-established fields of semiclassical semiconductor transport theory and the theory of open quantum systems to meet this requirement. By coupling the van Roosbroeck system with a quantum master equation in Lindblad form, we introduce a new hybrid quantum-classical modeling approach, which provides a comprehensive description of quantum dot devices on multiple scales: it enables the calculation of quantum optical figures of merit and the spatially resolved simulation of the current flow in realistic semiconductor device geometries in a unified way. We construct the interface between both theories in such a way, that the resulting hybrid system obeys the fundamental axioms of (non)equilibrium thermodynamics. We show that our approach guarantees the conservation of charge, consistency with the thermodynamic equilibrium and the second law of thermodynamics. The feasibility of the approach is demonstrated by numerical simulations of an electrically driven single-photon source based on a single quantum dot in the stationary and transient operation regime.
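
    The quantum master equation in Lindblad form mentioned above has the standard structure

        \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
          + \sum_k \gamma_k \Bigl( L_k\,\rho\,L_k^{\dagger} - \tfrac{1}{2}\bigl\{ L_k^{\dagger}L_k,\,\rho \bigr\} \Bigr)

    with ρ the density matrix of the quantum dot system, H its Hamiltonian, L_k the jump operators, and γ_k the corresponding rates; in hybrid approaches of this kind the coupling to the van Roosbroeck (drift-diffusion) equations typically enters through carrier-density-dependent rates and a matching term in the carrier continuity equations, the details of which are specific to the cited work.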

  16. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.

  17. Comparing Binaural Pre-processing Strategies III

    PubMed Central

    Warzybok, Anna; Ernst, Stephan M. A.

    2015-01-01

    A comprehensive evaluation of eight signal pre-processing strategies, including directional microphones, coherence filters, single-channel noise reduction, binaural beamformers, and their combinations, was undertaken with normal-hearing (NH) and hearing-impaired (HI) listeners. Speech reception thresholds (SRTs) were measured in three noise scenarios (multitalker babble, cafeteria noise, and single competing talker). Predictions of three common instrumental measures were compared with the general perceptual benefit caused by the algorithms. The individual SRTs measured without pre-processing and individual benefits were objectively estimated using the binaural speech intelligibility model. Ten listeners with NH and 12 HI listeners participated. The participants varied in age and pure-tone threshold levels. Although HI listeners required a better signal-to-noise ratio to obtain 50% intelligibility than listeners with NH, no differences in SRT benefit from the different algorithms were found between the two groups. With the exception of single-channel noise reduction, all algorithms showed an improvement in SRT of between 2.1 dB (in cafeteria noise) and 4.8 dB (in the single competing talker condition). Model predictions with the binaural speech intelligibility model explained 83% of the measured variance of the individual SRTs in the no pre-processing condition. Regarding the benefit from the algorithms, the instrumental measures were not able to predict the perceptual data in all tested noise conditions. The comparable benefit observed for both groups suggests a possible application of noise reduction schemes for listeners with different hearing status. Although the model can predict the individual SRTs without pre-processing, further development is necessary to predict the benefits obtained from the algorithms at an individual level. PMID:26721922

  18. Pulmonary artery pressure-guided heart failure management: US cost-effectiveness analyses using the results of the CHAMPION clinical trial.

    PubMed

    Martinson, Melissa; Bharmi, Rupinder; Dalal, Nirav; Abraham, William T; Adamson, Philip B

    2017-05-01

    Haemodynamic-guided heart failure (HF) management effectively reduces decompensation events and need for hospitalizations. The economic benefit of clinical improvement requires further study. An estimate of the cost-effectiveness of haemodynamic-guided HF management was made based on observations published in the randomized, prospective single-blinded CHAMPION trial. A comprehensive analysis was performed including healthcare utilization event rates, survival, and quality of life demonstrated in the randomized portion of the trial (18 months). Markov modelling with Monte Carlo simulation was used to approximate comprehensive costs and quality-adjusted life years (QALYs) from a payer perspective. Unit costs were estimated using the Truven Health MarketScan database from April 2008 to March 2013. Over a 5-year horizon, patients in the Treatment group had average QALYs of 2.56 with a total cost of US$56 974; patients in the Control group had QALYs of 2.16 with a total cost of US$52 149. The incremental cost-effectiveness ratio (ICER) was US$12 262 per QALY. Using comprehensive cost modelling, including all anticipated costs of HF and non-HF hospitalizations, physician visits, prescription drugs, long-term care, and outpatient hospital visits over 5 years, the Treatment group had a total cost of US$212 004 and the Control group had a total cost of US$200 360. The ICER was US$29 593 per QALY. Standard economic modelling suggests that pulmonary artery pressure-guided management of HF using the CardioMEMS™ HF System is cost-effective from the US-payer perspective. This analysis provides the background for further modelling in specific country healthcare systems and cost structures. © 2016 The Authors. European Journal of Heart Failure published by John Wiley & Sons Ltd on behalf of European Society of Cardiology.
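
    The incremental cost-effectiveness ratio quoted above is simply the between-arm cost difference divided by the QALY difference; the sketch below recomputes it from the rounded figures quoted in the abstract (it reproduces the reported US$12 262 and US$29 593 per QALY only approximately, since the published analysis used unrounded model outputs):

    ```python
    def icer(cost_treat, cost_control, qaly_treat, qaly_control):
        """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
        return (cost_treat - cost_control) / (qaly_treat - qaly_control)

    # Rounded 5-year figures quoted in the abstract (base-case analysis).
    print(icer(56974, 52149, 2.56, 2.16))    # ~12,060 US$/QALY
    # Comprehensive-cost analysis from the same abstract.
    print(icer(212004, 200360, 2.56, 2.16))  # ~29,110 US$/QALY
    ```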

  19. Quality Assurance in the Presence of Variability

    NASA Astrophysics Data System (ADS)

    Lauenroth, Kim; Metzger, Andreas; Pohl, Klaus

    Software Product Line Engineering (SPLE) is a reuse-driven development paradigm that has been applied successfully in information system engineering and other domains. Quality assurance of the reusable artifacts of the product line (e.g. requirements, design, and code artifacts) is essential for successful product line engineering. As those artifacts are reused in several products, a defect in a reusable artifact can affect several products of the product line. A central challenge for quality assurance in product line engineering is how to consider product line variability. Since the reusable artifacts contain variability, quality assurance techniques from single-system engineering cannot directly be applied to those artifacts. Therefore, different strategies and techniques have been developed for quality assurance in the presence of variability. In this chapter, we describe those strategies and discuss in more detail one of them, the so-called comprehensive strategy. The comprehensive strategy aims at checking the quality of all possible products of the product line and thus offers the highest benefits, since it is able to uncover defects in all possible products of the product line. However, the central challenge for applying the comprehensive strategy is the complexity that results from the product line variability and the large number of potential products of a product line. In this chapter, we present one concrete technique that we have developed to implement the comprehensive strategy and address this challenge. The technique is based on model checking technology and allows for a comprehensive verification of domain artifacts against temporal logic properties.

  20. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  1. Supporting user-defined granularities in a spatiotemporal conceptual model

    USGS Publications Warehouse

    Khatri, V.; Ram, S.; Snodgrass, R.T.; O'Brien, G. M.

    2002-01-01

    Granularities are integral to spatial and temporal data. A large number of applications require storage of facts along with their temporal and spatial context, which needs to be expressed in terms of appropriate granularities. For many real-world applications, a single granularity in the database is insufficient. In order to support any type of spatial or temporal reasoning, the semantics related to granularities needs to be embedded in the database. Specifying granularities related to facts is an important part of conceptual database design because under-specifying the granularity can restrict an application, affect the relative ordering of events and impact the topological relationships. Closely related to granularities is indeterminacy, i.e., an occurrence time or location associated with a fact that is not known exactly. In this paper, we present an ontology for spatial granularities that is a natural analog of temporal granularities. We propose an upward-compatible, annotation-based spatiotemporal conceptual model that can comprehensively capture the semantics related to spatial and temporal granularities, and indeterminacy without requiring new spatiotemporal constructs. We specify the formal semantics of this spatiotemporal conceptual model via translation to a conventional conceptual model. To underscore the practical focus of our approach, we describe an on-going case study. We apply our approach to a hydrogeologic application at the United States Geologic Survey and demonstrate that our proposed granularity-based spatiotemporal conceptual model is straightforward to use and is comprehensive.

  2. Updating during Reading Comprehension: Why Causality Matters

    ERIC Educational Resources Information Center

    Kendeou, Panayiota; Smith, Emily R.; O'Brien, Edward J.

    2013-01-01

    The present set of 7 experiments systematically examined the effectiveness of adding causal explanations to simple refutations in reducing or eliminating the impact of outdated information on subsequent comprehension. The addition of a single causal-explanation sentence to a refutation was sufficient to eliminate any measurable disruption in…

  3. Crystal Genetics, Inc.

    PubMed

    Kermani, Bahram G

    2016-07-01

    Crystal Genetics, Inc. is an early-stage genetic test company, focused on achieving the highest possible clinical-grade accuracy and comprehensiveness for detecting germline (e.g., in hereditary cancer) and somatic (e.g., in early cancer detection) mutations. Crystal's mission is to significantly improve the health status of the population, by providing high accuracy, comprehensive, flexible and affordable genetic tests, primarily in cancer. Crystal's philosophy is that when it comes to detecting mutations that are strongly correlated with life-threatening diseases, the detection accuracy of every single mutation counts: a single false-positive error could cause severe anxiety for the patient. And, more importantly, a single false-negative error could potentially cost the patient's life. Crystal's objective is to eliminate both of these error types.

  4. RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.

    PubMed

    Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z

    2017-04-01

    We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses the maximum likelihood estimation method to obtain parameter estimates under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
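
    RAD-ADAPT itself is an R interface to the ADAPT system, but the estimation approach described, Poisson-distributed colony counts with a linear-quadratic survival model, can be sketched independently; the example below is a stand-alone illustration with invented data, not the package's code:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical clonogenic-assay data: dose (Gy), cells plated, colonies counted.
    dose   = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
    plated = np.array([200, 200, 400, 1000, 4000, 10000])
    counts = np.array([152, 118, 160, 190, 210, 95])

    def neg_log_lik(params):
        """Poisson negative log-likelihood (up to a constant) for the linear-quadratic
        model: expected colonies = plated * PE * exp(-(alpha*D + beta*D^2))."""
        log_pe, alpha, beta = params
        lam = plated * np.exp(log_pe) * np.exp(-(alpha * dose + beta * dose**2))
        return np.sum(lam - counts * np.log(lam))

    fit = minimize(neg_log_lik, x0=[np.log(0.7), 0.2, 0.02], method="Nelder-Mead")
    log_pe, alpha, beta = fit.x
    print(f"PE={np.exp(log_pe):.2f}, alpha={alpha:.3f}/Gy, beta={beta:.4f}/Gy^2")
    ```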

  5. Concurrent processing of vehicle lane keeping and speech comprehension tasks.

    PubMed

    Cao, Shi; Liu, Yili

    2013-10-01

    With the growing prevalence of using in-vehicle devices and mobile devices while driving, a major concern is their impact on driving performance and safety. However, the effects of cognitive load such as conversation on driving performance are still controversial and not well understood. In this study, an experiment was conducted to investigate the concurrent performance of vehicle lane keeping and speech comprehension tasks with improved experimental control of the confounding factors identified in previous studies. The results showed that the standard deviation of lane position (SDLP) was increased when the driving speed was faster (0.30 m at 36 km/h; 0.36 m at 72 km/h). The concurrent comprehension task had no significant effect on SDLP (0.34 m on average) or the standard deviation of steering wheel angle (SDSWA; 5.20° on average). The correct rate of the comprehension task was reduced in the dual-task condition (from 93.4% to 91.3%) compared with the comprehension single-task condition. Mental workload was significantly higher in the dual-task condition compared with the single-task conditions. Implications for driving safety were discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Proverb and idiom comprehension in Alzheimer disease.

    PubMed

    Kempler, D; Van Lancker, D; Read, S

    1988-01-01

    Twenty-nine patients diagnosed with Probable Alzheimer Disease were administered tests of word, familiar phrase (idiom and proverb), and novel phrase comprehension. From the early stage of the disease, patients performed worse at understanding familiar phrases than single words or novel phrases. The results uphold common observations that AD patients have difficulty interpreting abstract meanings. Cognitive variables responsible for poor idiom/proverb comprehension and the clinical implications of this new protocol are discussed.

  7. Thermal Protection System of the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Cleland, John; Iannetti, Francesco

    1989-01-01

    The Thermal Protection System (TPS), introduced by NASA, continues to incorporate many of the advances in materials over the past two decades. A comprehensive, single-volume summary of the TPS is provided, including system design rationales, key design features, and broad descriptions of the TPS subsystems (e.g., reusable surface insulation, leading-edge structural, and penetration subsystems). Details of all elements of TPS development and application are covered (materials properties, manufacturing, modeling, testing, installation, and inspection). Disclosures and inventions are listed, and potential commercial applications of TPS-related technology are discussed.

  8. Experimental investigation of shock-cell noise reduction for single-stream nozzles in simulated flight, comprehensive data report. Volume 2: Laser velocimeter data

    NASA Technical Reports Server (NTRS)

    Yamamoto, K.; Brausch, J. F.; Janardan, B. A.; Hoerst, D. J.; Price, A. O.; Knott, P. R.

    1984-01-01

    Mean velocity (axial component) and turbulent velocity (axial component) measurements for thirty-one selected flow conditions of six models were performed employing the Laser Doppler Velocimeter (LV). Aerodynamic conditions that define the test points are given. Tabulations that explain the scope of the mean velocity traverses and turbulence histogram measurements are also presented. The actual LV position, the type of traverse, and the measured mean and turbulent velocities are provided, along with copies of the LV mean velocity traces.

  9. A computerized data base of nitrate concentrations in Indiana ground water

    USGS Publications Warehouse

    Risch, M.R.; Cohen, D.A.

    1995-01-01

    The nitrate data base was compiled from numerous data sets that were readily accessible in electronic format. The uses of these data may be limited because they were neither comprehensive nor of a single statistical design. Nonetheless, the nitrate data can be used in several ways: (1) to identify geographic areas with and without nitrate data; (2) to evaluate assumptions, models, and maps of ground-water-contamination potential; and (3) to investigate the relation between environmental factors, land-use types, and the occurrence of nitrate.

  10. Are shame and self-esteem risk factors in prolonged grief after death of a spouse?

    PubMed

    Dellmann, Thomas

    2018-07-01

    Although many single factors of prolonged grief have been identified in the literature, a comprehensive understanding of predictors is still lacking. Based on a review of correlational studies, this article argues that shame and low self-esteem are risk factors for prolonged grief after spousal loss. Using a practitioner-scientist approach, a developmental model of shame as a core factor in prolonged grief is proposed, outlining the progression from childhood relational trauma to insecure attachment, shame, self-esteem contingent on spousal approval, and eventual prolonged grief.

  11. Equivalent ZF precoding scheme for downlink indoor MU-MIMO VLC systems

    NASA Astrophysics Data System (ADS)

    Fan, YangYu; Zhao, Qiong; Kang, BoChao; Deng, LiJun

    2018-01-01

    In indoor visible light communication (VLC) systems, the channels of the photo detectors (PDs) at one user are highly correlated, which determines the choice of a spatial diversity model for individual users. In a spatial diversity model, the signals received by the PDs belonging to one user carry the same information and can be combined directly. Based on the above, we propose an equivalent zero-forcing (ZF) precoding scheme for multiple-user multiple-input multiple-output (MU-MIMO) VLC systems by transforming an indoor MU-MIMO VLC system into an indoor multiple-user multiple-input single-output (MU-MISO) VLC system through simple processing. The power constraints of the light emitting diodes (LEDs) are also taken into account. Comprehensive computer simulations in three scenarios indicate that our scheme can not only reduce the computational complexity but also guarantee the system performance. Furthermore, the proposed scheme does not require noise information in the calculation of the precoding weights, and it places no restrictions on the numbers of APs and PDs.
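
    As a rough sketch of the general idea, not the authors' exact formulation, the example below collapses each user's correlated PD channels into one effective MU-MISO channel row and applies a pseudo-inverse zero-forcing precoder with a crude per-LED scaling (all channel values are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_leds, n_users, pds_per_user = 4, 3, 2

    # Hypothetical channel gains: one row per PD, one column per LED.
    H_pd = rng.uniform(0.1, 1.0, size=(n_users * pds_per_user, n_leds))

    # Equivalent MU-MISO channel: combine the PD rows belonging to each user,
    # since the PDs of one user receive the same information and add directly.
    H = H_pd.reshape(n_users, pds_per_user, n_leds).sum(axis=1)

    # Zero-forcing precoder W = H^H (H H^H)^{-1}, so that H @ W is diagonal
    # (no inter-user interference).
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)

    # Crude scalar scaling so every LED drive signal stays within a unit amplitude
    # budget (a stand-in for the per-LED dynamic-range constraints discussed above).
    W /= np.abs(W).sum(axis=1, keepdims=True).max()

    print(np.round(H @ W, 6))  # ~ scaled identity: users are decoupled
    ```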

  12. Single-cell transcriptome of early embryos and cultured embryonic stem cells of cynomolgus monkeys

    PubMed Central

    Nakamura, Tomonori; Yabuta, Yukihiro; Okamoto, Ikuhiro; Sasaki, Kotaro; Iwatani, Chizuru; Tsuchiya, Hideaki; Saitou, Mitinori

    2017-01-01

    In mammals, the development of pluripotency and the specification of primordial germ cells (PGCs) have been studied predominantly using mice as a model organism. However, divergences among mammalian species for such processes have begun to be recognized. Between humans and mice, pre-implantation development appears relatively similar, but the manner and morphology of post-implantation development are significantly different. Nevertheless, embryogenesis just after implantation in primates, including the specification of PGCs, has remained unexplored due to the difficulties in analyzing embryos at the relevant developmental stages. Here, we present a comprehensive single-cell transcriptome dataset of pre- and early post-implantation embryo cells, PGCs, and embryonic stem cells (ESCs) of cynomolgus monkeys as a model of higher primates. The identity of each transcriptome was also validated rigorously by other means, such as immunofluorescence analysis. The information reported here will serve as a foundation for our understanding of a wide range of processes in the developmental biology of primates, including humans. PMID:28649393

  13. How do typographical factors affect reading text and comprehension performance in Arabic?

    PubMed

    Ganayim, Deia; Ibrahim, Raphiq

    2013-04-01

    The objective of this study was to establish basic reading performance data that could lead to useful design recommendations for print display text formats and layouts, with the aim of improving reading and comprehension performance for printed Arabic text such as academic writings, books, and newspapers. Readability of English print text has been shown to be influenced by a number of typographical variables, including interline spacing, column setting, and line length. Therefore, it is very important to improve the reading efficiency and satisfaction of print text reading and comprehension by following simple design guidelines. Most existing research on readability of print text is oriented to building guidelines for designing English texts rather than Arabic. However, guidelines built for English script cannot simply be applied to Arabic script because of orthographic differences. In the current study, manipulating interline spacing, column setting, and line length generated nine text layouts. The reading and comprehension performance of 210 native Arab students assigned randomly to the different text layouts was compared. Results showed that the use of a multicolumn setting (with medium or short line length) affected comprehension achievement but not reading and comprehension speed. Participants' comprehension scores were better for the single-column setting (with long line length) than for the multicolumn setting. However, no effect was found for interline spacing. Based on these objective measures, the recommended print text format and layout for Arabic is a single-column layout with a long line length; interline spacing had no measurable effect on reading or comprehension performance.

  14. Improving participant comprehension in the informed consent process.

    PubMed

    Cohn, Elizabeth; Larson, Elaine

    2007-01-01

    To critically analyze studies published within the past decade about participants' comprehension of informed consent in clinical research and to identify promising intervention strategies. Integrative review of literature. The Cumulative Index of Nursing and Allied Health Literature (CINAHL), PubMed, and the Cochrane Database of Systematic Reviews and Cochrane Central Register of Controlled Trials were searched. Inclusion criteria included studies (a) published between January 1, 1996 and January 1, 2007, (b) designed as descriptive or interventional studies of comprehension of informed consent for clinical research, (c) conducted in nonpsychiatric adult populations who were either patients or volunteer participants, (d) written in English, and (e) published in peer-reviewed journals. Of the 980 studies identified, 319 abstracts were screened, 154 studies were reviewed, and 23 met the inclusion criteria. Thirteen studies (57%) were descriptive, and 10 (43%) were interventional. Interventions tested included simplified written consent documents, multimedia approaches, and the use of a trained professional (consent educator) to assist in the consent process. Collectively, no single intervention strategy was consistently associated with improved comprehension. Studies also varied in regard to the definition of comprehension and the tools used to measure it. Despite increasing regulatory scrutiny, deficiencies still exist in participant comprehension of the research in which they participate, as well as differences in how comprehension is measured and assessed. No single intervention was identified as consistently successful for improving participant comprehension, and results indicated that any successful consent process should at a minimum include various communication modes and is likely to require one-to-one interaction with someone knowledgeable about the study.

  15. Do prominent quality measurement surveys capture the concerns of persons with disability?

    PubMed

    Iezzoni, Lisa I; Marsella, Sarah A; Lopinsky, Tiffany; Heaphy, Dennis; Warsett, Kimberley S

    2017-04-01

    Demonstration programs nationwide aim to control costs and improve care for people dually-eligible for Medicare and Medicaid, including many persons with disability. Ensuring these initiatives maintain or improve care quality requires comprehensive evaluation of quality of care. To examine whether the common quality measures being used to evaluate the Massachusetts One Care duals demonstration program comprehensively address the concerns of persons with disability. Drawing upon existing conceptual frameworks, we developed a model of interrelationships of personal, health care, and environmental factors for achieving wellness for persons with disability. Based on this model, we specified a scheme to code individual quality measurement items and coded the items contained in 12 measures being used to assess Massachusetts One Care, which exclusively enrolls non-elderly adults with disability. Across these 12 measures, we assigned 376 codes to 302 items; some items received two codes. Taken together, the 12 measures contain items addressing most factors in our conceptual model that affect health care quality for persons with disability, including long-term services and supports. Some important gaps exist. No items examine sexual or reproductive health care, peer support, housing security, disability stigmatization, or specific services obtained outside the home, such as adult day care. Certain key concepts are covered by only one or a few of the 12 quality measures. Common quality metrics cover most, although not all, health care quality concerns of persons with disability. However, multiple different quality measures are required for this comprehensive coverage, raising questions about respondent burden. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Bioerodible System for Sequential Release of Multiple Drugs

    PubMed Central

    Sundararaj, Sharath C.; Thomas, Mark V.; Dziubla, Thomas D.; Puleo, David A.

    2013-01-01

    Because many complex physiological processes are controlled by multiple biomolecules, comprehensive treatment of certain disease conditions may be more effectively achieved by administration of more than one type of drug. Thus, the objective of the present research was to develop a multilayered, polymer-based system for sequential delivery of multiple drugs. The polymers used were cellulose acetate phthalate (CAP) complexed with Pluronic F-127 (P). After evaluating morphology of the resulting CAPP system, in vitro release of small molecule drugs and a model protein was studied from both single and multilayered devices. Drug release from single-layered CAPP films followed zero-order kinetics related to surface erosion of the association polymer. Release studies from multilayered CAPP devices showed the possibility of achieving intermittent release of one type of drug as well as sequential release of more than one type of drug. Mathematical modeling accurately predicted the release profiles for both single layer and multilayered devices. The present CAPP association polymer-based multilayer devices can be used for localized, sequential delivery of multiple drugs for the possible treatment of complex disease conditions, and perhaps for tissue engineering applications, that require delivery of more than one type of biomolecule. PMID:24096151
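
    Zero-order (surface-erosion-controlled) release, mentioned above, means the cumulative amount released grows linearly in time, independent of the drug remaining; in generic textbook form (not taken from the paper):

    ```latex
    \frac{dQ}{dt} = k_0 \quad\Longrightarrow\quad Q(t) = k_0\, t,
    \qquad\text{in contrast to first-order release } \frac{dQ}{dt} = k\,(Q_\infty - Q).
    ```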

  17. Two states or not two states: Single-molecule folding studies of protein L

    NASA Astrophysics Data System (ADS)

    Aviram, Haim Yuval; Pirchi, Menahem; Barak, Yoav; Riven, Inbal; Haran, Gilad

    2018-03-01

    Experimental tools of increasing sophistication have been employed in recent years to study protein folding and misfolding. Folding is considered a complex process, and one way to address it is by studying small proteins, which seemingly possess a simple energy landscape with essentially only two stable states, either folded or unfolded. The B1-IgG binding domain of protein L (PL) is considered a model two-state folder, based on measurements using a wide range of experimental techniques. We applied single-molecule fluorescence resonance energy transfer (FRET) spectroscopy in conjunction with a hidden Markov model analysis to fully characterize the energy landscape of PL and to extract the kinetic properties of individual molecules of the protein. Surprisingly, our studies revealed the existence of a third state, hidden under the two-state behavior of PL due to its small population, ~7%. We propose that this minority intermediate involves partial unfolding of the two C-terminal β strands of PL. Our work demonstrates that single-molecule FRET spectroscopy can be a powerful tool for a comprehensive description of the folding dynamics of proteins, capable of detecting and characterizing relatively rare metastable states that are difficult to observe in ensemble studies.
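
    As a rough illustration of the hidden-Markov step, and not of the authors' photon-by-photon analysis, the sketch below fits a three-state Gaussian HMM to a synthetic binned FRET-efficiency trace using the third-party hmmlearn package (assumed available):

    ```python
    import numpy as np
    from hmmlearn.hmm import GaussianHMM  # third-party package, assumed available

    # Hypothetical binned FRET-efficiency trajectory of one molecule (values in [0, 1]).
    rng = np.random.default_rng(1)
    true_means = [0.25, 0.55, 0.85]        # e.g. unfolded, intermediate, folded (illustrative)
    states = rng.integers(0, 3, size=500)
    efret = rng.normal([true_means[s] for s in states], 0.05)[:, None]

    # Fit a 3-state Gaussian HMM; recover state FRET levels and transition probabilities.
    model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=200, random_state=0)
    model.fit(efret)
    print(np.sort(model.means_.ravel()))   # estimated FRET levels of the three states
    print(model.transmat_)                 # per-bin transition probabilities
    ```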

  18. Does the Component Processes Task Assess Text-Based Inferences Important for Reading Comprehension? A Path Analysis in Primary School Children

    PubMed Central

    Wassenburg, Stephanie I.; de Koning, Björn B.; de Vries, Meinou H.; van der Schoot, Menno

    2016-01-01

    Using a component processes task (CPT) that differentiates between higher-level cognitive processes of reading comprehension provides important advantages over commonly used general reading comprehension assessments. The present study contributes to further development of the CPT by evaluating the relative contributions of its components (text memory, text inferencing, and knowledge integration) and working memory to general reading comprehension within a single study using path analyses. Participants were 173 third- and fourth-grade children. As hypothesized, knowledge integration was the only component of the CPT that directly contributed to reading comprehension, indicating that the text-inferencing component did not assess inferential processes related to reading comprehension. Working memory was a significant predictor of reading comprehension over and above the component processes. Future research should focus on finding ways to ensure that the text-inferencing component taps into processes important for reading comprehension. PMID:27378989

  19. The Wernicke conundrum and the anatomy of language comprehension in primary progressive aphasia

    PubMed Central

    Thompson, Cynthia K.; Weintraub, Sandra; Rogalski, Emily J.

    2015-01-01

    Wernicke’s aphasia is characterized by severe word and sentence comprehension impairments. The location of the underlying lesion site, known as Wernicke’s area, remains controversial. Questions related to this controversy were addressed in 72 patients with primary progressive aphasia who collectively displayed a wide spectrum of cortical atrophy sites and language impairment patterns. Clinico-anatomical correlations were explored at the individual and group levels. These analyses showed that neuronal loss in temporoparietal areas, traditionally included within Wernicke’s area, leave single word comprehension intact and cause inconsistent impairments of sentence comprehension. The most severe sentence comprehension impairments were associated with a heterogeneous set of cortical atrophy sites variably encompassing temporoparietal components of Wernicke’s area, Broca’s area, and dorsal premotor cortex. Severe comprehension impairments for single words, on the other hand, were invariably associated with peak atrophy sites in the left temporal pole and adjacent anterior temporal cortex, a pattern of atrophy that left sentence comprehension intact. These results show that the neural substrates of word and sentence comprehension are dissociable and that a circumscribed cortical area equally critical for word and sentence comprehension is unlikely to exist anywhere in the cerebral cortex. Reports of combined word and sentence comprehension impairments in Wernicke’s aphasia come almost exclusively from patients with cerebrovascular accidents where brain damage extends into subcortical white matter. The syndrome of Wernicke’s aphasia is thus likely to reflect damage not only to the cerebral cortex but also to underlying axonal pathways, leading to strategic cortico-cortical disconnections within the language network. The results of this investigation further reinforce the conclusion that the left anterior temporal lobe, a region ignored by classic aphasiology, needs to be inserted into the language network with a critical role in the multisynaptic hierarchy underlying word comprehension and object naming. PMID:26112340

  20. The Wernicke conundrum and the anatomy of language comprehension in primary progressive aphasia.

    PubMed

    Mesulam, M-Marsel; Thompson, Cynthia K; Weintraub, Sandra; Rogalski, Emily J

    2015-08-01

    Wernicke's aphasia is characterized by severe word and sentence comprehension impairments. The location of the underlying lesion site, known as Wernicke's area, remains controversial. Questions related to this controversy were addressed in 72 patients with primary progressive aphasia who collectively displayed a wide spectrum of cortical atrophy sites and language impairment patterns. Clinico-anatomical correlations were explored at the individual and group levels. These analyses showed that neuronal loss in temporoparietal areas, traditionally included within Wernicke's area, leave single word comprehension intact and cause inconsistent impairments of sentence comprehension. The most severe sentence comprehension impairments were associated with a heterogeneous set of cortical atrophy sites variably encompassing temporoparietal components of Wernicke's area, Broca's area, and dorsal premotor cortex. Severe comprehension impairments for single words, on the other hand, were invariably associated with peak atrophy sites in the left temporal pole and adjacent anterior temporal cortex, a pattern of atrophy that left sentence comprehension intact. These results show that the neural substrates of word and sentence comprehension are dissociable and that a circumscribed cortical area equally critical for word and sentence comprehension is unlikely to exist anywhere in the cerebral cortex. Reports of combined word and sentence comprehension impairments in Wernicke's aphasia come almost exclusively from patients with cerebrovascular accidents where brain damage extends into subcortical white matter. The syndrome of Wernicke's aphasia is thus likely to reflect damage not only to the cerebral cortex but also to underlying axonal pathways, leading to strategic cortico-cortical disconnections within the language network. The results of this investigation further reinforce the conclusion that the left anterior temporal lobe, a region ignored by classic aphasiology, needs to be inserted into the language network with a critical role in the multisynaptic hierarchy underlying word comprehension and object naming. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Language Comprehension and Performance.

    ERIC Educational Resources Information Center

    Tanaka, Masako N.; Massad, Carolyn E.

    The effectiveness of the CIRCUS language instruments for determining language comprehension and performance in the 4- and 5-year-old child is discussed. In these instruments, the use of content words is primarily studied through the use of single-word measures, such as a picture vocabulary test and an auditory discrimination test, whereas the use…

  2. A Single Question to Examine the Prevalence and Protective Effect of Seroadaptive Strategies Among Men Who Have Sex With Men.

    PubMed

    Khosropour, Christine M; Dombrowski, Julia C; Katz, David A; Golden, Matthew R

    2017-11-01

    Seroadaptive behaviors among men who have sex with men (MSM) are common, but ascertaining behavioral information is challenging in clinical settings. To address this, we developed a single seroadaptive behavior question. Men who have sex with men 18 years or older attending a sexually transmitted disease clinic in Seattle, WA, from 2013 to 2015, were eligible for this cross-sectional study. Respondents completed a comprehensive seroadaptive behavior questionnaire which included a single question that asked HIV-negative MSM to indicate which of 12 strategies they used in the past year to reduce their HIV risk. HIV testing was performed per routine clinical care. We used the κ statistic to examine agreement between the comprehensive questionnaire and the single question. We enrolled HIV-negative MSM at 3341 (55%) of 6105 eligible visits. The agreement between the full questionnaire and single question for 5 behaviors was fair to moderate (κ values of 0.34-0.59). From the single question, the most commonly reported behaviors were as follows: avoiding sex with HIV-positive (66%) or unknown-status (52%) men and using condoms with unknown-status partners (53%); 8% of men reported no seroadaptive behavior. Men tested newly HIV positive at 38 (1.4%) of 2741 visits. HIV test positivity for the most commonly reported behaviors ranged from 0.8% to 1.3%. Men reporting no seroadaptive strategy had a significantly higher HIV test positivity (3.5%) compared with men who reported at least 1 strategy (1.3%; P = 0.02). The single question performed relatively well against a comprehensive seroadaptive behaviors assessment and may be useful in clinical settings to identify men at greatest risk for HIV.
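
    The agreement statistic referred to above is Cohen's kappa, which corrects the observed agreement between the single question and the full questionnaire for agreement expected by chance (a standard definition, not specific to this study):

    ```latex
    \kappa = \frac{p_o - p_e}{1 - p_e},
    \qquad
    p_e = \sum_{c} p_{c}^{\text{(question)}}\; p_{c}^{\text{(questionnaire)}}
    ```

    where p_o is the observed proportion of agreement and p_e the chance agreement computed from the marginal proportions; the reported values of 0.34-0.59 fall in the conventional fair-to-moderate range.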

  3. Developmental Relations Between Vocabulary Knowledge and Reading Comprehension: A Latent Change Score Modeling Study

    PubMed Central

    Quinn, Jamie M.; Wagner, Richard K.; Petscher, Yaacov; Lopez, Danielle

    2014-01-01

    The present study followed a sample of first grade students (N = 316, mean age = 7.05 at first test) through fourth grade to evaluate dynamic developmental relations between vocabulary knowledge and reading comprehension. Using latent change score modeling, competing models were fit to the repeated measurements of vocabulary knowledge and reading comprehension to test for the presence of leading and lagging influences. Univariate models indicated growth in vocabulary knowledge and reading comprehension was determined by two parts: constant yearly change and change proportional to the previous level of the variable. Bivariate models indicated previous levels of vocabulary knowledge acted as leading indicators of reading comprehension growth, but the reverse relation was not found. Implications for theories of developmental relations between vocabulary and reading comprehension are discussed. PMID:25201552

  4. Recognition and source memory as multivariate decision processes.

    PubMed

    Banks, W P

    2000-07-01

    Recognition memory, source memory, and exclusion performance are three important domains of study in memory, each with its own findings, its specific theoretical developments, and its separate research literature. It is proposed here that results from all three domains can be treated with a single analytic model. This article shows how to generate a comprehensive memory representation based on multidimensional signal detection theory and how to make predictions for each of these paradigms using decision axes drawn through the space. The detection model is simpler than the comparable multinomial model, it is more easily generalizable, and it does not make threshold assumptions. An experiment using the same memory set for all three tasks demonstrates the analysis and tests the model. The results show that some seemingly complex relations between the paradigms derive from an underlying simplicity of structure.

  5. Physical and Relativistic Numerical Cosmology.

    PubMed

    Anninos, Peter

    1998-01-01

    In order to account for the observable Universe, any comprehensive theory or model of cosmology must draw from many disciplines of physics, including gauge theories of strong and weak interactions, the hydrodynamics and microphysics of baryonic matter, electromagnetic fields, and spacetime curvature, for example. Although it is difficult to incorporate all these physical elements into a single complete model of our Universe, advances in computing methods and technologies have contributed significantly towards our understanding of cosmological models, the Universe, and astrophysical processes within them. A sample of numerical calculations addressing specific issues in cosmology is reviewed in this article: from the Big Bang singularity dynamics to the fundamental interactions of gravitational waves; from the quark-hadron phase transition to the large scale structure of the Universe. The emphasis, although not exclusive, is on those calculations designed to test different models of cosmology against the observed Universe.

  6. Endpoint Model of Exclusive Processes

    NASA Astrophysics Data System (ADS)

    Dagaonkar, Sumeet; Jain, Pankaj; Ralston, John P.

    2018-07-01

    The endpoint model explains the scaling laws observed in exclusive hadronic reactions at large momentum transfer in all experimentally important regimes. The model, originally conceived by Feynman and others, assumes a single valence quark carries most of the hadron momentum. The quark wave function is directly related to the momentum transfer dependence of the reaction. Once the momentum dependence of the quark wave function has been extracted from one process, the model explains all the others. Endpoint quark-counting rules relate the number of quarks in a hadron to the power-law. A universal linear endpoint behavior explains the proton electromagnetic form factors F1 and F2, proton-proton scattering at fixed angle, the t-dependence of proton-proton scattering at large s >> t, and Compton scattering at fixed t. The model appears to be the only comprehensive mechanism consistent with all experimental information.

  7. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis

    PubMed Central

    Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon

    2018-01-01

    Background: With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. Objective: This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. Methods: We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. Results: The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. Conclusions: In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last Sentence (BMKC_LS) (together referred to as BioMedical Knowledge Comprehension) using the PubMed corpus. The experimental results showed that the performance of our model is much higher than that of humans. We observed that our model performed consistently better regardless of the degree of difficulty of a text, whereas humans have difficulty when performing biomedical literature comprehension tasks that require expert level knowledge. PMID:29305341

  8. Tracking real-time neural activation of conceptual knowledge using single-trial event-related potentials.

    PubMed

    Amsel, Ben D

    2011-04-01

    Empirically derived semantic feature norms categorized into different types of knowledge (e.g., visual, functional, auditory) can be summed to create number-of-feature counts per knowledge type. Initial evidence suggests several such knowledge types may be recruited during language comprehension. The present study provides a more detailed understanding of the timecourse and intensity of influence of several such knowledge types on real-time neural activity. A linear mixed-effects model was applied to single trial event-related potentials for 207 visually presented concrete words measured on total number of features (semantic richness), imageability, and number of visual motion, color, visual form, smell, taste, sound, and function features. Significant influences of multiple feature types occurred before 200 ms, suggesting parallel neural computation of word form and conceptual knowledge during language comprehension. Function and visual motion features most prominently influenced neural activity, underscoring the importance of action-related knowledge in computing word meaning. The dynamic time courses and topographies of these effects are most consistent with a flexible conceptual system wherein temporally dynamic recruitment of representations in modal and supramodal cortex is a crucial element of the constellation of processes constituting word meaning computation in the brain. Copyright © 2011 Elsevier Ltd. All rights reserved.
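
    A hedged sketch of what a single-trial mixed-effects analysis of this general kind can look like, using statsmodels; the predictor set, random-effects structure, and data here are illustrative assumptions, not the author's actual model:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical single-trial data: one row per (subject, word) trial with the ERP
    # amplitude in a time window of interest and per-word feature counts as predictors.
    rng = np.random.default_rng(2)
    n = 600
    df = pd.DataFrame({
        "subject": rng.integers(0, 20, n).astype(str),
        "amplitude": rng.normal(0, 5, n),
        "n_features": rng.poisson(12, n),      # semantic richness
        "visual_motion": rng.poisson(2, n),
        "function_feats": rng.poisson(3, n),
    })

    # Random intercept per subject; fixed effects for the feature-type counts.
    model = smf.mixedlm("amplitude ~ n_features + visual_motion + function_feats",
                        df, groups=df["subject"])
    print(model.fit().summary())
    ```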

  9. Genetic architecture of plant stress resistance: multi-trait genome-wide association mapping.

    PubMed

    Thoen, Manus P M; Davila Olivas, Nelson H; Kloth, Karen J; Coolen, Silvia; Huang, Ping-Ping; Aarts, Mark G M; Bac-Molenaar, Johanna A; Bakker, Jaap; Bouwmeester, Harro J; Broekgaarden, Colette; Bucher, Johan; Busscher-Lange, Jacqueline; Cheng, Xi; Fradin, Emilie F; Jongsma, Maarten A; Julkowska, Magdalena M; Keurentjes, Joost J B; Ligterink, Wilco; Pieterse, Corné M J; Ruyter-Spira, Carolien; Smant, Geert; Testerink, Christa; Usadel, Björn; van Loon, Joop J A; van Pelt, Johan A; van Schaik, Casper C; van Wees, Saskia C M; Visser, Richard G F; Voorrips, Roeland; Vosman, Ben; Vreugdenhil, Dick; Warmerdam, Sonja; Wiegers, Gerrie L; van Heerwaarden, Joost; Kruijer, Willem; van Eeuwijk, Fred A; Dicke, Marcel

    2017-02-01

    Plants are exposed to combinations of various biotic and abiotic stresses, but stress responses are usually investigated for single stresses only. Here, we investigated the genetic architecture underlying plant responses to 11 single stresses and several of their combinations by phenotyping 350 Arabidopsis thaliana accessions. A set of 214 000 single nucleotide polymorphisms (SNPs) was screened for marker-trait associations in genome-wide association (GWA) analyses using tailored multi-trait mixed models. Stress responses that share phytohormonal signaling pathways also share genetic architecture underlying these responses. After removing the effects of general robustness, for the 30 most significant SNPs, average quantitative trait locus (QTL) effect sizes were larger for dual stresses than for single stresses. Plants appear to deploy broad-spectrum defensive mechanisms influencing multiple traits in response to combined stresses. Association analyses identified QTLs with contrasting and with similar responses to biotic vs abiotic stresses, and below-ground vs above-ground stresses. Our approach allowed for an unprecedented comprehensive genetic analysis of how plants deal with a wide spectrum of stress conditions. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.

  10. Models of Jovian decametric radiation. [astronomical models of decametric waves

    NASA Technical Reports Server (NTRS)

    Smith, R. A.

    1975-01-01

    A critical review is presented of theoretical models of Jovian decametric radiation, with particular emphasis on the Io-modulated emission. The problem is divided into three broad aspects: (1) the mechanism coupling Io's orbital motion to the inner exosphere, (2) the consequent instability mechanism by which electromagnetic waves are amplified, and (3) the subsequent propagation of the waves in the source region and the Jovian plasmasphere. At present there exists no comprehensive theory that treats all of these aspects quantitatively within a single framework. Acceleration of particles by plasma sheaths near Io is proposed as an explanation for the coupling mechanism, while most of the properties of the emission may be explained in the context of cyclotron instability of a highly anisotropic distribution of streaming particles.

  11. A comprehensive review of the sub-axial ligaments of the vertebral column: part I anatomy and function.

    PubMed

    Butt, Asma Mian; Gill, Clarence; Demerdash, Amin; Watanabe, Koichi; Loukas, Marios; Rozzelle, Curtis J; Tubbs, R Shane

    2015-07-01

    As important as the vertebral ligaments are in maintaining the integrity of the spinal column and protecting the contents of the spinal canal, a single detailed review of their anatomy and function is missing in the literature. A literature search using online search engines was conducted. Single comprehensive reviews of the spinal ligaments are not found in the extant medical literature. This review will be useful to those who treat patients with pathology of the spine or who interpret imaging or investigate the anatomy of the ligaments of the vertebral column.

  12. A Framework for Considering Comprehensibility in Modeling

    PubMed Central

    Gleicher, Michael

    2016-01-01

    Comprehensibility in modeling is the ability of stakeholders to understand relevant aspects of the modeling process. In this article, we provide a framework to help guide exploration of the space of comprehensibility challenges. We consider facets organized around key questions: Who is comprehending? Why are they trying to comprehend? Where in the process are they trying to comprehend? How can we help them comprehend? How do we measure their comprehension? With each facet we consider the broad range of options. We discuss why taking a broad view of comprehensibility in modeling is useful in identifying challenges and opportunities for solutions. PMID:27441712

  13. Single Plant Root System Modeling under Soil Moisture Variation

    NASA Astrophysics Data System (ADS)

    Yabusaki, S.; Fang, Y.; Chen, X.; Scheibe, T. D.

    2016-12-01

    A prognostic Virtual Plant-Atmosphere-Soil System (vPASS) model is being developed that integrates comprehensively detailed mechanistic single plant modeling with microbial, atmospheric, and soil system processes in its immediate environment. Three broad areas of process module development are targeted: (1) incorporating models for root growth and function, rhizosphere interactions with bacteria and other organisms, and litter decomposition and soil respiration into established porous media flow and reactive transport models; (2) incorporating root/shoot transport, growth, photosynthesis, and carbon allocation process models into an integrated plant physiology model; and (3) incorporating transpiration, volatile organic compound (VOC) emission, particulate deposition, and local atmospheric processes into a coupled plant/atmosphere model. The integrated plant ecosystem simulation capability is being developed as open source process modules and associated interfaces under a modeling framework. The initial focus addresses the coupling of root growth, vascular transport system, and soil under drought scenarios. Two types of root water uptake modeling approaches are tested: continuous root distribution and constitutive root system architecture. The continuous root distribution models are based on spatially averaged root development process parameters, which are relatively straightforward to accommodate in the continuum soil flow and reactive transport module. Conversely, the constitutive root system architecture models use root growth rates, root growth direction, and root branching to evolve explicit root geometries. The branching topologies require more complex data structures and additional input parameters. Preliminary results are presented for root model development and the vascular response to temporal and spatial variations in soil conditions.

  14. A Layered Decision Model for Cost-Effective System Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Huaqiang; Alves-Foss, James; Soule, Terry

    System security involves decisions in at least three areas: identification of well-defined security policies, selection of cost-effective defence strategies, and implementation of real-time defence tactics. Although choices made in each of these areas affect the others, existing decision models typically handle these three decision areas in isolation. There is no comprehensive tool that can integrate them to provide a single efficient model for safeguarding a network. In addition, there is no clear way to determine which particular combinations of defence decisions result in cost-effective solutions. To address these problems, this paper introduces a Layered Decision Model (LDM) for use in deciding how to address defence decisions based on their cost-effectiveness. To validate the LDM and illustrate how it is used, we used simulation to test model rationality and applied the LDM to the design of system security for an e-commercial business case.

  15. Theoretical Models of Comprehension Skills Tested through a Comprehension Assessment Battery for Primary School Children

    ERIC Educational Resources Information Center

    Tobia, Valentina; Ciancaleoni, Matteo; Bonifacci, Paola

    2017-01-01

    In this study, two alternative theoretical models were compared, in order to analyze which of them best explains primary school children's text comprehension skills. The first one was based on the distinction between two types of answers requested by the comprehension test: local or global. The second model involved texts' input modality: written…

  16. Analyzing and Integrating Models of Multiple Text Comprehension

    ERIC Educational Resources Information Center

    List, Alexandra; Alexander, Patricia A.

    2017-01-01

    We introduce a special issue featuring four theoretical models of multiple text comprehension. We present a central framework for conceptualizing the four models in this special issue. Specifically, we chart the models according to how they consider learner, texts, task, and context factors in explaining multiple text comprehension. In addition,…

  17. Bayesian Model Development for Analysis of Open Source Information to Support the Assessment of Nuclear Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.

    2013-07-15

    Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
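
    To make the modeling style concrete, here is a minimal, self-contained sketch of a two-indicator Bayesian network evaluated by direct enumeration. The variables, structure, and conditional probability values are invented for illustration and are not taken from the PNNL models.

      # Hypothetical two-indicator Bayesian network: a latent "program" state
      # influences two observable indicators; we query P(program | evidence).
      P_PROGRAM = {True: 0.1, False: 0.9}                       # prior on the latent state
      P_IND1 = {True: {True: 0.7, False: 0.3},                  # P(ind1 | program)
                False: {True: 0.2, False: 0.8}}
      P_IND2 = {True: {True: 0.6, False: 0.4},                  # P(ind2 | program)
                False: {True: 0.1, False: 0.9}}

      def posterior(ind1: bool, ind2: bool) -> float:
          """P(program=True | ind1, ind2) by enumerating the latent variable."""
          joint = {p: P_PROGRAM[p] * P_IND1[p][ind1] * P_IND2[p][ind2]
                   for p in (True, False)}
          return joint[True] / sum(joint.values())

      print(f"P(program | both indicators observed) = {posterior(True, True):.3f}")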

  18. Nondestructive detection of pork quality based on dual-band VIS/NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Wang, Wenxiu; Peng, Yankun; Li, Yongyu; Tang, Xiuying; Liu, Yuanyuan

    2015-05-01

    With rising living standards and changing dietary structures, consumer demand for better meat quality continues to grow. Colour, pH value, and cooking loss are important quality attributes when evaluating meat. Simultaneous, nondestructive detection of multiple meat quality parameters is in demand in the production and processing of meat and meat products. The objectives of this research were to compare the effectiveness of two bands for rapid, nondestructive, and simultaneous detection of pork quality attributes. Reflectance spectra of 60 chilled pork samples were collected from a dual-band visible/near-infrared spectroscopy system covering 350-1100 nm and 1000-2600 nm. Colour, pH value, and cooking loss were then determined by standard methods as reference values. The standard normal variate transform (SNVT) was employed to eliminate spectral noise. A spectrum connection method was put forward to effectively integrate the dual-band spectrum and make full use of the available information. Partial least squares regression (PLSR) and principal component analysis (PCA) were applied to establish prediction models based on the single-band and dual-band spectra. The experimental results showed that the PLSR model based on dual-band spectral information was superior to the models based on single-band spectral information, with lower root mean square error (RMSE) and higher accuracy. The PLSR model based on the dual-band spectrum (using the overlapping part of the first band) yielded the best prediction results, with correlation coefficients of validation (Rv) of 0.9469, 0.9495, 0.9180, 0.9054, and 0.8789 for L*, a*, b*, pH value, and cooking loss, respectively. This is mainly because the dual-band spectrum provides more comprehensive information reflecting the quality attributes. Data fusion from the dual-band spectrum could significantly improve the prediction performance for pork quality parameters. The research also indicated that multi-band spectral information fusion has the potential to comprehensively evaluate other quality and safety attributes of pork.
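
    As a rough illustration of the dual-band modeling step, the sketch below concatenates two spectral bands and fits a partial least squares regression with scikit-learn. The synthetic data, the SNV helper, and the number of latent variables are illustrative assumptions; they do not reproduce the paper's calibration.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      def snv(spectra):
          """Standard normal variate transform: center and scale each spectrum."""
          return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

      rng = np.random.default_rng(0)
      n_samples = 60
      band1 = rng.normal(size=(n_samples, 200))   # stand-in for 350-1100 nm reflectance
      band2 = rng.normal(size=(n_samples, 150))   # stand-in for 1000-2600 nm reflectance
      y = band1[:, :5].sum(axis=1) + 0.1 * rng.normal(size=n_samples)  # fake quality attribute

      X = np.hstack([snv(band1), snv(band2)])     # dual-band "spectrum connection"
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      pls = PLSRegression(n_components=8)
      pls.fit(X_train, y_train)
      print(f"validation R^2: {pls.score(X_test, y_test):.3f}")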

  19. Academic Achievement of Deaf and Hard-of-Hearing Students in an ASL/English Bilingual Program.

    PubMed

    Hrastinski, Iva; Wilbur, Ronnie B

    2016-04-01

    There has been a scarcity of studies exploring the influence of students' American Sign Language (ASL) proficiency on their academic achievement in ASL/English bilingual programs. The aim of this study was to determine the effects of ASL proficiency on reading comprehension skills and academic achievement of 85 deaf or hard-of-hearing signing students. Two subgroups, differing in ASL proficiency, were compared on the Northwest Evaluation Association Measures of Academic Progress and the reading comprehension subtest of the Stanford Achievement Test, 10th edition. Findings suggested that students highly proficient in ASL outperformed their less proficient peers in nationally standardized measures of reading comprehension, English language use, and mathematics. Moreover, a regression model consisting of 5 predictors including variables regarding education, hearing devices, and secondary disabilities as well as ASL proficiency and home language showed that ASL proficiency was the single variable significantly predicting results on all outcome measures. This study calls for a paradigm shift in thinking about deaf education by focusing on characteristics shared among successful deaf signing readers, specifically ASL fluency. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. No Association of Coronary Artery Disease with X-Chromosomal Variants in Comprehensive International Meta-Analysis.

    PubMed

    Loley, Christina; Alver, Maris; Assimes, Themistocles L; Bjonnes, Andrew; Goel, Anuj; Gustafsson, Stefan; Hernesniemi, Jussi; Hopewell, Jemma C; Kanoni, Stavroula; Kleber, Marcus E; Lau, King Wai; Lu, Yingchang; Lyytikäinen, Leo-Pekka; Nelson, Christopher P; Nikpay, Majid; Qu, Liming; Salfati, Elias; Scholz, Markus; Tukiainen, Taru; Willenborg, Christina; Won, Hong-Hee; Zeng, Lingyao; Zhang, Weihua; Anand, Sonia S; Beutner, Frank; Bottinger, Erwin P; Clarke, Robert; Dedoussis, George; Do, Ron; Esko, Tõnu; Eskola, Markku; Farrall, Martin; Gauguier, Dominique; Giedraitis, Vilmantas; Granger, Christopher B; Hall, Alistair S; Hamsten, Anders; Hazen, Stanley L; Huang, Jie; Kähönen, Mika; Kyriakou, Theodosios; Laaksonen, Reijo; Lind, Lars; Lindgren, Cecilia; Magnusson, Patrik K E; Marouli, Eirini; Mihailov, Evelin; Morris, Andrew P; Nikus, Kjell; Pedersen, Nancy; Rallidis, Loukianos; Salomaa, Veikko; Shah, Svati H; Stewart, Alexandre F R; Thompson, John R; Zalloua, Pierre A; Chambers, John C; Collins, Rory; Ingelsson, Erik; Iribarren, Carlos; Karhunen, Pekka J; Kooner, Jaspal S; Lehtimäki, Terho; Loos, Ruth J F; März, Winfried; McPherson, Ruth; Metspalu, Andres; Reilly, Muredach P; Ripatti, Samuli; Sanghera, Dharambir K; Thiery, Joachim; Watkins, Hugh; Deloukas, Panos; Kathiresan, Sekar; Samani, Nilesh J; Schunkert, Heribert; Erdmann, Jeanette; König, Inke R

    2016-10-12

    In recent years, genome-wide association studies have identified 58 independent risk loci for coronary artery disease (CAD) on the autosomes. However, due to the sex-specific data structure of the X chromosome, it has been excluded from most of these analyses. While females have two copies of chromosome X, males have only one. Also, one of the female X chromosomes may be inactivated. Therefore, special test statistics and quality control procedures are required. Thus, little is known about the role of X-chromosomal variants in CAD. To fill this gap, we conducted a comprehensive X-chromosome-wide meta-analysis including more than 43,000 CAD cases and 58,000 controls from 35 international study cohorts. For quality control, sex-specific filters were used to adequately take the special structure of X-chromosomal data into account. For single-study analyses, several logistic regression models were calculated allowing for inactivation of one female X chromosome, adjusting for sex, and investigating interactions between sex and genetic variants. Then, meta-analyses including all 35 studies were conducted using random effects models. None of the investigated models revealed genome-wide significant associations for any variant. Although we analyzed the largest sample to date, currently available methods were not able to detect any associations of X-chromosomal variants with CAD.
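
    For readers unfamiliar with the final pooling step, this is a minimal sketch of an inverse-variance random-effects meta-analysis (DerSimonian-Laird) of a single variant's log odds ratios across studies. The per-study effect sizes and standard errors are made up for illustration and are not from this study.

      import numpy as np

      def random_effects_meta(beta, se):
          """DerSimonian-Laird random-effects pooling of per-study log odds ratios."""
          beta, se = np.asarray(beta), np.asarray(se)
          w = 1.0 / se**2                                   # fixed-effect (inverse-variance) weights
          beta_fe = np.sum(w * beta) / np.sum(w)
          q = np.sum(w * (beta - beta_fe) ** 2)             # Cochran's Q
          df = len(beta) - 1
          c = np.sum(w) - np.sum(w**2) / np.sum(w)
          tau2 = max(0.0, (q - df) / c)                     # between-study variance estimate
          w_re = 1.0 / (se**2 + tau2)
          beta_re = np.sum(w_re * beta) / np.sum(w_re)
          se_re = np.sqrt(1.0 / np.sum(w_re))
          return beta_re, se_re, tau2

      # Hypothetical per-study log odds ratios and standard errors for one variant.
      beta_hat, se_hat, tau2 = random_effects_meta(
          beta=[0.05, -0.02, 0.08, 0.01], se=[0.04, 0.05, 0.06, 0.03])
      print(f"pooled logOR = {beta_hat:.3f} +/- {se_hat:.3f}, tau^2 = {tau2:.4f}")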

  1. A comprehensive dynamic model of double-row spherical roller bearing—Model development and case studies on surface defects, preloads, and radial clearance

    NASA Astrophysics Data System (ADS)

    Cao, M.; Xiao, J.

    2008-02-01

    Bearing excitation is one of the most important mechanical sources of vibration and noise generation in machine systems across a broad range of industries. Although extensively investigated, accurately predicting the vibration/acoustic behavior of bearings remains a challenging task because of their complicated nonlinear behavior. While some groundwork has been laid on single-row deep-groove ball (DGB) bearings, a comprehensive modeling effort on the spherical roller bearing (SRB) has yet to be carried out. This is mainly because the SRB system carries an extra degree of freedom (DOF) on the moving race (either the inner or the outer race) and in general has more rolling elements than a DGB bearing. In this study, a comprehensive SRB excitation source model is developed. In addition to the vertical and horizontal displacements considered in previous investigations, the impacts of axial displacement/load are addressed by introducing a DOF in the axial shaft direction. Hence, instead of being treated as pre-assumed constants, the roller-inner/outer race contact angles are formulated as functions of the axial displacement of the moving race to reflect their dependence on the axial movement. The approach presented in this paper accounts for the point contacts between rollers and inner/outer races, as well as line contacts when the loads on individual rollers exceed the limit for point contact. A detailed contact-damping model reflecting the influences of the surface profiles and the speeds of both contacting elements is developed and applied in the SRB model. Waviness of all the contact surfaces (including inner race, outer race, and rollers) is included and compared in this analysis. Extensive case studies are carried out to reveal the impacts of surface waviness, radial clearance, surface defects, and loading conditions on the force and displacement responses of the SRB system. System design guidelines are recommended based on the simulation results. This model is also applicable to bearing health monitoring, as demonstrated by the numerical case studies showing the frequency response of the system with moderate-to-large point defects on both the inner and outer races, as well as the rollers. Comparisons between the simulation results and generally accepted conclusions available in the open literature serve as a first-hand partial validation of the developed model. Future validation efforts and further improvement directions are also provided. The comprehensive model developed in this investigation is a useful tool for machine system design, optimization, and performance evaluation.

  2. Temperature induced phase transformations and negative electrocaloric effect in (Pb,La)(Zr,Sn,Ti)O3 antiferroelectric single crystal

    NASA Astrophysics Data System (ADS)

    Zhuo, Fangping; Li, Qiang; Yan, Qingfeng; Zhang, Yiling; Wu, Hong-Hui; Xi, Xiaoqing; Chu, Xiangcheng; Cao, Wenwu

    2017-10-01

    Temperature-induced phase transitions and the electrocaloric effect (ECE) of (Pb,La)(Zr,Sn,Ti)O3 (PLZST) single crystals have been comprehensively studied. Based on the in situ evolution of domain structures and the dielectric properties of the PLZST crystals, the phase transitions during heating follow the sequence orthorhombic antiferroelectric → rhombohedral ferroelectric → cubic paraelectric. Coexistence of negative and positive ECEs has been achieved in the PLZST single crystals. A negative ECE value of -1.26 °C and an enhanced electrocaloric strength of -0.21 K mm/kV near the Curie temperature have been obtained. A modified Landau model gives a satisfactory description of the experimentally observed unusual ECE. Moreover, a temperature-electric field phase diagram is also established based on theoretical analysis. Our results contribute to a better understanding of the electrocaloric family, particularly the negative and/or positive effects in antiferroelectrics and ferroelectrics.

  3. Single-mode SOA-based 1kHz-linewidth dual-wavelength random fiber laser.

    PubMed

    Xu, Yanping; Zhang, Liang; Chen, Liang; Bao, Xiaoyi

    2017-07-10

    Narrow-linewidth multi-wavelength fiber lasers are of significant interest for fiber-optic sensors, spectroscopy, optical communications, and microwave generation. A novel narrow-linewidth dual-wavelength random fiber laser with single-mode operation, based on semiconductor optical amplifier (SOA) gain, is achieved in this work for the first time, to the best of our knowledge. A simplified theoretical model is established to characterize this kind of random fiber laser. The inhomogeneous gain in the SOA significantly mitigates the mode competition and alleviates the laser instability that are frequently encountered in multi-wavelength fiber lasers with Erbium-doped fiber gain. The enhanced random distributed feedback from a 5 km non-uniform fiber provides coherent feedback, acting as a mode selection element to ensure single-mode operation with a narrow linewidth of ~1 kHz. The laser noise is also comprehensively investigated, showing that the proposed random fiber laser has suppressed intensity and frequency noise.

  4. A comprehensive approach for the simulation of the Urban Heat Island effect with the WRF/SLUCM modeling system: The case of Athens (Greece)

    NASA Astrophysics Data System (ADS)

    Giannaros, Christos; Nenes, Athanasios; Giannaros, Theodore M.; Kourtidis, Konstantinos; Melas, Dimitrios

    2018-03-01

    This study presents a comprehensive modeling approach for simulating the spatiotemporal distribution of urban air temperatures with a modeling system that includes the Weather Research and Forecasting (WRF) model and the Single-Layer Urban Canopy Model (SLUCM) with a modified treatment of the impervious surface temperature. The model was applied to simulate a 3-day summer heat wave event over the city of Athens, Greece. The simulation, using default SLUCM parameters, is capable of capturing the observed diurnal variation of urban temperatures and the Urban Heat Island (UHI) in the greater Athens Area (GAA), albeit with systematic biases that are prominent during nighttime hours. These biases are particularly evident over low-intensity residential areas, and they are associated with the surface and urban canopy properties representing the urban environment. A series of sensitivity simulations reveals the importance of the sub-grid urban fraction parameter, surface albedo, and street canyon geometry in the overall causation and development of the UHI effect. The sensitivities are then used to determine optimal values of the street canyon geometry that reproduce the observed temperatures throughout the simulation domain. The optimal parameters, apart from considerably improving model performance (reductions in mean temperature bias ranging from 0.30 °C to 1.58 °C), are also consistent with actual city building characteristics, which gives confidence that the model set-up is robust and can be used to study the UHI in the GAA under the anticipated warmer conditions of the future.

  5. Double-row vs single-row rotator cuff repair: a review of the biomechanical evidence.

    PubMed

    Wall, Lindley B; Keener, Jay D; Brophy, Robert H

    2009-01-01

    A review of the current literature will show a difference between the biomechanical properties of double-row and single-row rotator cuff repairs. Rotator cuff tears commonly necessitate surgical repair; however, the optimal technique for repair continues to be investigated. Recently, double-row repairs have been considered an alternative to single-row repair, allowing a greater coverage area for healing and a possibly stronger repair. We reviewed all biomechanical studies comparing double-row vs single-row repair techniques. Inclusion criteria were studies using cadaveric, animal, or human models that directly compared double-row vs single-row repair techniques, written in the English language, and published in peer-reviewed journals. Identified articles were reviewed to provide a comprehensive conclusion about the biomechanical strength and integrity of the repair techniques. Fifteen studies were identified and reviewed. Nine studies showed a statistically significant advantage for double-row repair with regard to biomechanical strength, failure, and gap formation. Three studies produced results that did not show any statistical advantage. Five studies that directly compared footprint reconstruction all demonstrated that the double-row repair was superior to a single-row repair in restoring anatomy. The current literature reveals that the biomechanical properties of a double-row rotator cuff repair are superior to those of a single-row repair. Basic Science Study, SRH = Single vs. Double Row RCR.

  6. Findings across Practitioner Training Studies in Special Education: A Comprehensive Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Brock, Matthew E.; Cannella-Malone, Helen I.; Seaman, Rachel L.; Andzik, Natalie R.; Schaefer, John M.; Page, E. Justin; Barczak, Mary A.; Dueker, Scott A.

    2017-01-01

    Existing reviews address important questions about subsets of practitioner training studies in special education but leave important questions about the broader literature unanswered. In this comprehensive review, we identified 118 peer-reviewed single-case-design studies in which researchers tested the efficacy of practitioner training on…

  7. Findings across Practitioner Training Studies in Special Education: A Comprehensive Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Brock, Matthew E.; Cannella-Malone, Helen I.; Seaman, Rachel L.; Andzik, Natalie R.; Schaefer, John M.; Page, E. Justin; Barczak, Mary A.; Dueker, Scott A.

    2017-01-01

    Existing reviews answer important questions about subsets of practitioner training studies in special education, but leave important questions about the broader literature unanswered. In this comprehensive review, we identified 118 peer-reviewed single-case design studies in which researchers tested the efficacy of practitioner training on…

  8. Using Repeated Reading to Improve Reading Speed and Comprehension in Students with Visual Impairments

    ERIC Educational Resources Information Center

    Savaiano, Mackenzie E.; Hatton, Deborah D.

    2013-01-01

    Introduction: This study evaluated whether children with visual impairments who receive repeated reading instruction exhibit an increase in their oral reading rate and comprehension and a decrease in oral reading error rates. Methods: A single-subject, changing-criterion design replicated across three participants was used to demonstrate the…

  9. Handbook of Research on Science Teaching and Learning Project.

    ERIC Educational Resources Information Center

    Gabel, Dorothy L., Ed.

    This uniquely comprehensive and current survey of the research in science education has been compiled by the most prominent experts in the field. More than a summary of findings, the content of this comprehensive single volume provides an assessment of the significance of research; evaluates new developments; and examines current conflicts,…

  10. Ameliorating the English Reading Comprehension of Spanish-Speaking ELLs through a Reciprocal Teaching Intervention

    ERIC Educational Resources Information Center

    Ramos, Jose A.

    2012-01-01

    Through a single-subject multiple-baseline across-participants design, the present study examined the effects of Reciprocal Teaching (RT) instruction and Spanish use on the cognitive strategy use and English reading comprehension of four, 4th grade Spanish-speaking bilingual students that are "good" decoders but "poor"…

  11. A REVIEW OF SINGLE SPECIES TOXICITY TESTS: ARE THE TESTS RELIABLE PREDICTORS OF AQUATIC ECOSYSTEM COMMUNITY RESPONSES?

    EPA Science Inventory

    This document provides a comprehensive review to evaluate the reliability of indicator species toxicity test results in predicting aquatic ecosystem impacts, also called the ecological relevance of laboratory single species toxicity tests.

  12. Trends In Susceptibility To Single-Event Upset

    NASA Technical Reports Server (NTRS)

    Nichols, Donald K.; Price, William E.; Kolasinski, Wojciech A.; Koga, Rukotaro; Waskiewicz, Alvin E.; Pickel, James C.; Blandford, James T.

    1989-01-01

    Report provides nearly comprehensive body of data on single-event upsets due to irradiation by heavy ions. Combines new test data and previously published data from governmental and industrial laboratories. Clear trends emerge from data useful in predicting future performances of devices.

  13. On Reconstructing School Segregation: The Efficacy and Equity of Single-Sex Schooling

    ERIC Educational Resources Information Center

    Billger, Sherrilyn M.

    2009-01-01

    A change to Title IX has spurred new single-sex public schooling in the US. Until recently, nearly all gender-segregated schools were private, and comprehensive data for public school comparisons are not yet available. To investigate the effects of single-sex education, I focus on within private sector comparisons, and additionally address…

  14. Levels of text comprehension in children with autism spectrum disorders (ASD): the influence of language phenotype.

    PubMed

    Lucas, Rebecca; Norbury, Courtenay Frazier

    2014-11-01

    Many children with autism spectrum disorders (ASD) have reading comprehension difficulties, but the level of processing at which comprehension is most vulnerable and the influence of language phenotype on comprehension skill is currently unclear. We explored comprehension at sentence and passage levels across language phenotypes. Children with ASD and age-appropriate language skills (n = 25) demonstrated similar syntactic and semantic facilitation to typically developing peers. In contrast, few children with ASD and language impairments (n = 25) could read beyond the single word level. Those who could read sentences benefited from semantic coherence, but were less sensitive to syntactic coherence. At the passage level, the strongest predictor of comprehension was vocabulary knowledge. This emphasizes that the intimate relationship between language competence and both decoding skill and comprehension is evident at the sentence, as well as the passage level, for children with ASD.

  15. For US Students, L2 Reading Comprehension Is Hard Because L2 Listening Comprehension Is Hard, Too

    ERIC Educational Resources Information Center

    Sparks, Richard; Patton, Jon; Luebbers, Julie

    2018-01-01

    The Simple View of Reading (SVR) model posits that reading is the product of word decoding and language comprehension and that oral language (listening) comprehension is the best predictor of reading comprehension once word-decoding skill has been established. The SVR model also proposes that there are good readers and three types of poor…

  16. Single-Word Recognition Need Not Depend on Single-Word Features: Narrative Coherence Counteracts Effects of Single-Word Features That Lexical Decision Emphasizes

    ERIC Educational Resources Information Center

    Teng, Dan W.; Wallot, Sebastian; Kelty-Stephen, Damian G.

    2016-01-01

    Research on reading comprehension of connected text emphasizes reliance on single-word features that organize a stable, mental lexicon of words and that speed or slow the recognition of each new word. However, the time needed to recognize a word might not actually be as fixed as previous research indicates, and the stability of the mental lexicon…

  17. Applying a Multiple Group Causal Indicator Modeling Framework to the Reading Comprehension Skills of Third, Seventh, and Tenth Grade Students

    PubMed Central

    Tighe, Elizabeth L.; Wagner, Richard K.; Schatschneider, Christopher

    2015-01-01

    This study demonstrates the utility of applying a causal indicator modeling framework to investigate important predictors of reading comprehension in third, seventh, and tenth grade students. The results indicated that a 4-factor multiple-indicator multiple-cause (MIMIC) model of reading comprehension provided adequate fit at each grade level. This model included latent predictor constructs of decoding, verbal reasoning, nonverbal reasoning, and working memory and accounted for a large portion of the reading comprehension variance (73% to 87%) across grade levels. Verbal reasoning contributed the most unique variance to reading comprehension at all grade levels. In addition, we fit a multiple-group 4-factor MIMIC model to investigate the relative stability (or variability) of the predictor contributions to reading comprehension across development (i.e., grade levels). The results revealed that the contributions of verbal reasoning, nonverbal reasoning, and working memory to reading comprehension were stable across the three grade levels. Decoding was the only predictor that could not be constrained to be equal across grade levels. The contribution of decoding skills to reading comprehension was higher in third grade and then remained relatively stable between seventh and tenth grade. These findings illustrate the feasibility of using MIMIC models to explain individual differences in reading comprehension across the development of reading skills. PMID:25821346

  18. Calculation of the Aerodynamic Behavior of the Tilt Rotor Aeroacoustic Model (TRAM) in the DNW

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2001-01-01

    Comparisons of measured and calculated aerodynamic behavior of a tiltrotor model are presented. The test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. The calculations were performed using the rotorcraft comprehensive analysis CAMRAD II. Presented are comparisons of measured and calculated performance and airloads for helicopter mode operation, as well as calculated induced and profile power. An aerodynamic and wake model and calculation procedure that reflects the unique geometry and phenomena of tiltrotors has been developed. There are major differences between this model and the corresponding aerodynamic and wake model that has been established for helicopter rotors. In general, good correlation between measured and calculated performance and airloads behavior has been shown. Two aspects of the analysis that clearly need improvement are the stall delay model and the trailed vortex formation model.

  19. Reversal of autism-like behaviors and metabolism in adult mice with single-dose antipurinergic therapy

    PubMed Central

    Naviaux, J C; Schuchbauer, M A; Li, K; Wang, L; Risbrough, V B; Powell, S B; Naviaux, R K

    2014-01-01

    Autism spectrum disorders (ASDs) now affect 1–2% of the children born in the United States. Hundreds of genetic, metabolic and environmental factors are known to increase the risk of ASD. Similar factors are known to influence the risk of schizophrenia and bipolar disorder; however, a unifying mechanistic explanation has remained elusive. Here we used the maternal immune activation (MIA) mouse model of neurodevelopmental and neuropsychiatric disorders to study the effects of a single dose of the antipurinergic drug suramin on the behavior and metabolism of adult animals. We found that disturbances in social behavior, novelty preference and metabolism are not permanent but are treatable with antipurinergic therapy (APT) in this model of ASD and schizophrenia. A single dose of suramin (20 mg kg−1 intraperitoneally (i.p.)) given to 6-month-old adults restored normal social behavior, novelty preference and metabolism. Comprehensive metabolomic analysis identified purine metabolism as the key regulatory pathway. Correction of purine metabolism normalized 17 of 18 metabolic pathways that were disturbed in the MIA model. Two days after treatment, the suramin concentration in the plasma and brainstem was 7.64 μM (±0.50) and 5.15 pmol mg−1 (±0.49), respectively. These data show good uptake of suramin into the central nervous system at the level of the brainstem. Most of the improvements associated with APT were lost after 5 weeks of drug washout, consistent with the 1-week plasma half-life of suramin in mice. Our results show that purine metabolism is a master regulator of behavior and metabolism in the MIA model, and that single-dose APT with suramin acutely reverses these abnormalities, even in adults. PMID:24937094

  20. A Comprehensive evaluation of groundwater vulnerability to saltwater up-coning and sea water intrusion in a coastal aquifer (case study: Ghaemshahr-juybar aquifer)

    NASA Astrophysics Data System (ADS)

    Motevalli, Alireza; Moradi, Hamid Reza; Javadi, Saman

    2018-02-01

    Aquifer salinization has recently increased significantly due to human activity and has caused irreparable environmental and economic damage. In this research, a new method is proposed for modeling the vulnerability to salinity of the Ghaemshahr-juybar aquifer. Specifically, the GALDIT (seawater intrusion) and TAWLBIC (saltwater up-coning) indices were combined to produce a vulnerability map (the Comprehensive Salinity Index, or CSI) covering seawater intrusion in the region near the coast and saltwater up-coning away from the coast. Single-parameter and layer-removal sensitivity analyses were performed to identify the sensitive parameters and to derive optimal weights (through the single-parameter method) for the contributing factors in all three methods. The three optimized methods produced were GALDIT-Opt, TAWLBIC-Opt, and CSI-Opt. To assess the accuracy of the original and optimized maps, the Pearson correlation was used. Results indicated that the Pearson correlations of the optimized GALDIT, TAWLBIC, and CSI models were better than those of GALDIT, TAWLBIC, and CSI. The correlations with EC (electrical conductivity), TDS (total dissolved solids), and SAR (sodium adsorption ratio) improved from 0.64, 0.56, and 0.68 for the GALDIT model to 0.81, 0.88, and 0.91 for the CSI-Opt model, respectively. The highest EC, with a value of 7050 μS/cm, was sampled in the east and northwest of the Ghaemshahr-juybar aquifer, areas classified in the CSI-Opt model as having high and very high vulnerability levels. The highest concentrations of TDS and SAR were found in the east, northwest, and northeast of the Ghaemshahr-juybar aquifer, with values of 4724 ppm for TDS and 14 mg/l for SAR, in areas modeled in the CSI-Opt index as highly vulnerable. CSI mapping can thus be used as an efficient tool for prioritizing areas by their vulnerability to aquifer salinity and for guiding remediation, recharge, and adaptation policies.
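
    Indices of this family are typically computed as weighted sums of rated parameters, and the single-parameter sensitivity analysis compares each parameter's effective weight with its assigned weight. The sketch below illustrates that arithmetic with invented ratings and weights; it does not use the GALDIT or TAWLBIC parameter sets.

      import numpy as np

      # Hypothetical ratings (1-10) for four parameters at three grid cells,
      # and the weights assigned to those parameters.
      ratings = np.array([[9.0, 5.0, 7.0, 2.0],
                          [3.0, 8.0, 4.0, 6.0],
                          [6.0, 6.0, 9.0, 9.0]])
      weights = np.array([4.0, 3.0, 2.0, 1.0])

      index = ratings @ weights                    # weighted-overlay vulnerability index per cell
      print("vulnerability index per cell:", index)

      # Single-parameter sensitivity: effective weight (%) of each parameter per cell.
      effective_weight = 100.0 * ratings * weights / index[:, None]
      print("mean effective weights (%):", effective_weight.mean(axis=0).round(1))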

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurt, Christopher J.; Freels, James D.; Hobbs, Randy W.

    There has been a considerable effort over the previous few years to demonstrate and optimize the production of plutonium-238 (238Pu) at the High Flux Isotope Reactor (HFIR). This effort has involved resources from multiple divisions and facilities at the Oak Ridge National Laboratory (ORNL) to demonstrate the fabrication, irradiation, and chemical processing of targets containing neptunium-237 (237Np) dioxide (NpO2)/aluminum (Al) cermet pellets. A critical preliminary step to irradiation at the HFIR is to demonstrate the safety of the target under irradiation via documented experiment safety analyses. The steady-state thermal safety analyses of the target are simulated in a finite element model with the COMSOL Multiphysics code that determines, among other crucial parameters, the limiting maximum temperature in the target. Safety analysis efforts for this model discussed in the present report include: (1) initial modeling of single and reduced-length pellet capsules in order to generate an experimental knowledge base incorporating initial non-linear contact heat transfer and fission gas equations, (2) modeling efforts for prototypical designs of partially loaded and fully loaded targets using limited available knowledge of fabrication and irradiation characteristics, and (3) the most recent and comprehensive modeling effort of a fully coupled thermo-mechanical approach over the entire fully loaded target domain incorporating burn-up dependent irradiation behavior and measured target and pellet properties, hereafter referred to as the production model. These models are used to conservatively determine several important steady-state parameters including target stresses and temperatures, the limiting condition of which is the maximum temperature with respect to the melting point. The single pellet model results provide a basis for the safety of the irradiations, followed by parametric analyses in the initial prototypical designs that were necessary due to the limited fabrication and irradiation data available. The calculated parameters in the final production target model are the most accurate and comprehensive, while still conservative. Over 210 permutations in irradiation time and position were evaluated, supported by the most recent inputs and highest fidelity methodology. The results of these analyses show that the models presented in this report provide a robust and reliable basis for previous, current, and future experiment safety analyses. In addition, they reveal an evolving knowledge of the steady-state behavior of the NpO2/Al pellets under irradiation for a variety of target encapsulations and potential conditions.

  2. Application of the IRT and TRT Models to a Reading Comprehension Test

    ERIC Educational Resources Information Center

    Kim, Weon H.

    2017-01-01

    The purpose of the present study is to apply the item response theory (IRT) and testlet response theory (TRT) models to a reading comprehension test. This study applied the TRT models and the traditional IRT model to a seventh-grade reading comprehension test (n = 8,815) with eight testlets. These three models were compared to determine the best…

  3. A comprehensive mechanistic model for upward two-phase flow in wellbores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sylvester, N.D.; Sarica, C.; Shoham, O.

    1994-05-01

    A comprehensive model is formulated to predict the flow behavior for upward two-phase flow. This model is composed of a model for flow-pattern prediction and a set of independent mechanistic models for predicting such flow characteristics as holdup and pressure drop in bubble, slug, and annular flow. The comprehensive model is evaluated by using a well data bank made up of 1,712 well cases covering a wide variety of field data. Model performance is also compared with six commonly used empirical correlations and the Hasan-Kabir mechanistic model. Overall model performance is in good agreement with the data. In comparison with other methods, the comprehensive model performed the best.

  4. Beyond Comprehension Strategy Instruction: What's Next?

    PubMed

    Elleman, Amy M; Compton, Donald L

    2017-04-20

    In this article, we respond to Catts and Kamhi's (2017) argument that reading comprehension is not a single ability. We provide a brief review of the impact of strategy instruction, the importance of knowledge in reading comprehension, and possible avenues for future research and practice. We agree with Catts and Kamhi's argument that reading comprehension is a complex endeavor and that current recommended practices do not reflect the complexity of the construct. Knowledge building, despite its important role in comprehension, has been relegated to a back seat in reading comprehension instruction. In the final section of the article, we outline possible avenues for research and practice (e.g., generative language instruction, dialogic approaches to knowledge building, analogical reasoning and disciplinary literacy, the use of graphics and media, inference instruction) for improving reading-comprehension outcomes. Reading comprehension is a complex ability, and comprehension instruction should reflect this complexity. If we want to have an impact on long-term growth in reading comprehension, we will need to expand our current repertoire of instructional methods to include approaches that support the acquisition and integration of knowledge across a variety of texts and topics.

  5. Centrality in earthquake multiplex networks

    NASA Astrophysics Data System (ADS)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
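
    The per-layer centrality computation is straightforward to reproduce; the sketch below builds one small network per time window (layer) and tracks eigenvector centrality across layers with networkx. The toy edge lists stand in for the cell-to-cell transition networks that would be derived from an earthquake catalog.

      import networkx as nx

      # Hypothetical layers: each layer is the network of successive-event
      # transitions between spatial cells during one time window.
      layers = [
          [("A", "B"), ("B", "C"), ("C", "A"), ("C", "D")],
          [("A", "B"), ("B", "D"), ("D", "C"), ("C", "B")],
      ]

      for t, edges in enumerate(layers):
          g = nx.Graph()
          g.add_edges_from(edges)
          centrality = nx.eigenvector_centrality(g, max_iter=1000)
          top = max(centrality, key=centrality.get)
          print(f"layer {t}: most central cell = {top}, centrality = {centrality[top]:.3f}")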

  6. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis.

    PubMed

    Kim, Seongsoon; Park, Donghyeon; Choi, Yonghwa; Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon; Kang, Jaewoo

    2018-01-05

    With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last Sentence (BMKC_LS) (together referred to as BioMedical Knowledge Comprehension) using the PubMed corpus. The experimental results showed that the performance of our model is much higher than that of humans. We observed that our model performed consistently better regardless of the degree of difficulty of a text, whereas humans have difficulty when performing biomedical literature comprehension tasks that require expert level knowledge. ©Seongsoon Kim, Donghyeon Park, Yonghwa Choi, Kyubum Lee, Byounggun Kim, Minji Jeon, Jihye Kim, Aik Choon Tan, Jaewoo Kang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.01.2018.

  7. Aviation accident forensic assessment : comprehensive single-extraction urine screening procedure : final report.

    DOT National Transportation Integrated Search

    1996-05-01

    This paper describes a new single extraction screening procedure that was developed to identify as many drugs as possible in urine, with minimal effort and cost. Urine specimens are hydrolyzed and the specimen is then extracted using commercially pur...

  8. Coastal single-beam bathymetry data collected in 2015 from the Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Stalk, Chelsea A.; DeWitt, Nancy T.; Bernier, Julie C.; Kindinger, Jack G.; Flocks, James G.; Miselis, Jennifer L.; Locker, Stanley D.; Kelso, Kyle W.; Tuten, Thomas M.

    2017-02-23

    As part of the Louisiana Coastal Protection and Restoration Authority (CPRA) Barrier Island Comprehensive Monitoring Program, scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a single-beam bathymetry survey around the Chandeleur Islands, Louisiana, in June 2015. The goal of the program is to provide long-term data on Louisiana’s barrier islands and use this data to plan, design, evaluate, and maintain current and future barrier island restoration projects. The data described in this report, along with (1) USGS bathymetry data collected in 2013 as a part of the Barrier Island Evolution Research project covering the northern Chandeleur Islands, and (2) data collected in 2014 in collaboration with the Louisiana CPRA Barrier Island Comprehensive Monitoring Program around Breton Island, will be used to assess bathymetric change since 2006‒2007 as well as serve as a bathymetric control in supporting modeling of future changes in response to restoration and storm impacts. The survey area encompasses approximately 435 square kilometers of nearshore and back-barrier environments around Hewes Point, the Chandeleur Islands, and Curlew and Grand Gosier Shoals. This Data Series serves as an archive of processed single-beam bathymetry data, collected in the nearshore of the Chandeleur Islands, Louisiana, from June 17‒24, 2015, during USGS Field Activity Number 2015-317-FA. Geographic information system data products include a 200-meter-cell-size interpolated bathymetry grid, trackline maps, and xyz point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  9. A comprehensive and sensitive method for hair analysis in drug-facilitated crimes and incorporation of zolazepam and tiletamine into hair after a single exposure.

    PubMed

    Kim, Jihyun; Yum, Hyesun; Jang, Moonhee; Shin, Ilchung; Yang, Wonkyung; Baeck, Seungkyung; Suh, Joon Hyuk; Lee, Sooyeun; Han, Sang Beom

    2016-01-01

    Hair is a highly relevant specimen that is used to verify drug exposure in victims of drug-facilitated crime (DFC) cases. In the present study, a new analytical method involving ultrahigh-performance liquid chromatography-tandem mass spectrometry was developed for determining the presence of model drugs, including zolazepam and tiletamine and their metabolites, in hair specimens from DFCs. The incorporation of zolazepam and tiletamine into hair after a single exposure was investigated in Long-Evans rats with the ratio of the hair concentration to the area under the curve. For rapid and simple sample preparation, methanol extraction and protein precipitation were performed for hair and plasma, respectively. No interference was observed in drug-free hair or plasma, except for hair-derived diphenhydramine in blank hair. The coefficients of variation of the matrix effects were below 12%, and the recoveries of the analytes exceeded 70% in all of the matrices. The precision and accuracy results were satisfactory. The limits of quantification ranged from 20 to 50 pg in 10 mg of hair. The drug incorporation rates were 0.03 ± 0.01% for zolazepam and 2.09 ± 0.51% for tiletamine in pigmented hair. We applied the present method to real hair samples in order to determine the drug that was used in seven cases. These results suggest that this comprehensive and sensitive hair analysis method can successfully verify a drug after a single exposure in crimes and can be applied in forensic and clinical toxicology laboratories.

  10. Evaluating methods of inferring gene regulatory networks highlights their lack of performance for single cell gene expression data.

    PubMed

    Chen, Shuonan; Mar, Jessica C

    2018-06-19

    A fundamental fact in biology states that genes do not operate in isolation, and yet methods that infer regulatory networks for single cell gene expression data have been slow to emerge. With single cell sequencing methods now becoming accessible, general network inference algorithms that were initially developed for data collected from bulk samples may not be suitable for single cells. Meanwhile, although methods that are specific to single cell data are now emerging, whether they have improved performance over general methods is unknown. In this study, we evaluate the applicability of five general methods and three single cell methods for inferring gene regulatory networks from both experimental single cell gene expression data and in silico simulated data. Standard evaluation metrics using ROC curves and Precision-Recall curves against reference sets sourced from the literature demonstrated that most of the methods performed poorly when applied to either experimental or simulated single cell data, demonstrating their lack of performance for this task. Using default settings, network methods were applied to the same datasets. Comparisons of the learned networks highlighted the uniqueness of some predicted edges for each method. The fact that different methods infer networks that vary substantially reflects the underlying mathematical rationale and assumptions that distinguish network methods from each other. This study provides a comprehensive evaluation of network modeling algorithms applied to experimental single cell gene expression data and in silico simulated datasets where the network structure is known. Comparisons demonstrate that most of the assessed network methods are not able to predict network structures from single cell expression data accurately, even if they were specifically developed for single cell data. Also, single cell methods, which usually depend on more elaborate algorithms, in general have less similarity to each other in the sets of edges detected. The results from this study emphasize the importance of developing more accurate, optimized network modeling methods that are compatible with single cell data. Newly developed single cell methods may uniquely capture particular features of potential gene-gene relationships, and caution should be taken when interpreting these results.
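
    The evaluation protocol described here boils down to scoring every candidate edge and comparing the ranking against a reference edge set. Below is a minimal sketch of that step with scikit-learn; the random scores and the tiny reference network are placeholders, not data from the study.

      import numpy as np
      from sklearn.metrics import roc_auc_score, average_precision_score

      rng = np.random.default_rng(1)
      n_genes = 20
      genes = [f"g{i}" for i in range(n_genes)]

      # Reference network from the literature: a set of true regulatory edges.
      reference = {("g0", "g1"), ("g0", "g2"), ("g3", "g4"), ("g5", "g9")}

      # Hypothetical confidence scores produced by a network inference method
      # for every ordered gene pair (excluding self-edges).
      pairs = [(a, b) for a in genes for b in genes if a != b]
      scores = rng.random(len(pairs))
      labels = np.array([1 if p in reference else 0 for p in pairs])

      print(f"AUROC = {roc_auc_score(labels, scores):.3f}")
      print(f"AUPR  = {average_precision_score(labels, scores):.3f}")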

  11. Guided Comprehension in the Primary Grades.

    ERIC Educational Resources Information Center

    McLaughlin, Maureen

    Intended as a response to recent developments in reading research and a demand by primary-grade teachers for a comprehension-based instructional framework, this book adapts the Guided Comprehension Model introduced in the author/educator's book "Guided Comprehension: A Teaching Model for Grades 3-8." According to the book, the Guided…

  12. Comprehensive School Reform and Achievement: A Meta-Analysis. Educator's Summary

    ERIC Educational Resources Information Center

    Center for Data-Driven Reform in Education (NJ3), 2008

    2008-01-01

    Which comprehensive school reform programs have been proven to help elementary and secondary students achieve? To find out, this review summarizes evidence on comprehensive school reform (CSR) models in elementary and secondary schools. Comprehensive school reform models are programs used schoolwide to improve student achievement. They typically…

  13. Computer simulation of multiple pilots flying a modern high performance helicopter

    NASA Technical Reports Server (NTRS)

    Zipf, Mark E.; Vogt, William G.; Mickle, Marlin H.; Hoelzeman, Ronald G.; Kai, Fei; Mihaloew, James R.

    1988-01-01

    A computer simulation of a human response pilot mechanism within the flight control loop of a high-performance modern helicopter is presented. A human response mechanism, implemented by a low-order, linear transfer function, is used in a decoupled single-variable configuration that exploits the dominant vehicle characteristics by associating cockpit controls and instrumentation with specific vehicle dynamics. Low-order helicopter models obtained from evaluations of the time and frequency domain responses of a nonlinear simulation model, provided by NASA Lewis Research Center, are presented and considered in the discussion of the pilot development. Pilot responses and reactions to test maneuvers are presented and discussed. Higher-level implementations using the pilot mechanisms are discussed and considered for their use in a comprehensive control structure.
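
    As an illustration of a low-order, linear human-response element, the sketch below simulates a simple gain-plus-delay-plus-lag pilot model (with the reaction delay approximated by a first-order Pade term) responding to a step command using scipy. The gain, delay, and lag values are arbitrary illustrative choices, not parameters from the NASA study.

      import numpy as np
      from scipy import signal

      # Hypothetical pilot describing-function parameters.
      K, tau, T_lag = 2.0, 0.3, 0.2        # gain, reaction delay (s), neuromuscular lag (s)

      # First-order Pade approximation of the delay: e^{-tau s} ~ (1 - tau/2 s) / (1 + tau/2 s)
      num = K * np.polymul([-tau / 2.0, 1.0], [1.0])
      den = np.polymul([tau / 2.0, 1.0], [T_lag, 1.0])
      pilot = signal.TransferFunction(num, den)

      t, y = signal.step(pilot, T=np.linspace(0.0, 3.0, 300))
      print(f"initial response: {y[0]:.2f}, steady-state response: {y[-1]:.2f} (expected {K})")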

  14. Systems biology of embryonic development: Prospects for a complete understanding of the Caenorhabditis elegans embryo.

    PubMed

    Murray, John Isaac

    2018-05-01

    The convergence of developmental biology and modern genomics tools brings the potential for a comprehensive understanding of developmental systems. This is especially true for the Caenorhabditis elegans embryo because its small size, invariant developmental lineage, and powerful genetic and genomic tools provide the prospect of a cellular resolution understanding of messenger RNA (mRNA) expression and regulation across the organism. We describe here how a systems biology framework might allow large-scale determination of the embryonic regulatory relationships encoded in the C. elegans genome. This framework consists of two broad steps: (a) defining the "parts list"-all genes expressed in all cells at each time during development and (b) iterative steps of computational modeling and refinement of these models by experimental perturbation. Substantial progress has been made towards defining the parts list through imaging methods such as large-scale green fluorescent protein (GFP) reporter analysis. Imaging results are now being augmented by high-resolution transcriptome methods such as single-cell RNA sequencing, and it is likely the complete expression patterns of all genes across the embryo will be known within the next few years. In contrast, the modeling and perturbation experiments performed so far have focused largely on individual cell types or genes, and improved methods will be needed to expand them to the full genome and organism. This emerging comprehensive map of embryonic expression and regulatory function will provide a powerful resource for developmental biologists, and would also allow scientists to ask questions not accessible without a comprehensive picture. This article is categorized under: Invertebrate Organogenesis > Worms Technologies > Analysis of the Transcriptome Gene Expression and Transcriptional Hierarchies > Gene Networks and Genomics. © 2018 Wiley Periodicals, Inc.

  15. A Comprehensive Atlas of the Adult Mouse Penis

    PubMed Central

    Phillips, Tiffany R.; Wright, David K.; Gradie, Paul E.; Johnston, Leigh A.; Pask, Andrew J.

    2016-01-01

    Mice are routinely used to study the development of the external genitalia and, in particular, the process of male urethral closure. This is because misplacement of the male penile urethra, or hypospadias, is amongst the most common birth defects reported in humans. While mice present a tractable model to study penile development, several structures differ between mice and humans, and there is a lack of consensus in the literature on their annotation and developmental origins. Defining the ontology of the mouse prepuce is especially important for the relevance and interpretation of mouse models of hypospadias to human conditions. We have developed a detailed annotation of the adult mouse penis that addresses these differences and enables an accurate comparison of murine and human hypospadias phenotypes. Through MRI data, gross morphology and section histology, we define the origin of the mouse external and internal prepuces, their relationship to the single human foreskin as well as provide a comprehensive view of the various structures of the mouse penis and their associated muscle attachments within the body. These data are combined to annotate structures in a novel 3D adult penis atlas that can be downloaded, viewed at any angle, and manipulated to examine the relationship of various structures. PMID:26112156

  16. Variations on Debris Disks. IV. An Improved Analytical Model for Collisional Cascades

    NASA Astrophysics Data System (ADS)

    Kenyon, Scott J.; Bromley, Benjamin C.

    2017-04-01

    We derive a new analytical model for the evolution of a collisional cascade in a thin annulus around a single central star. In this model, r_max, the size of the largest object, changes with time as r_max ∝ t^{-γ}, with γ ≈ 0.1-0.2. Compared to standard models where r_max is constant in time, this evolution results in a more rapid decline of M_d, the total mass of solids in the annulus, and L_d, the luminosity of small particles in the annulus: M_d ∝ t^{-(γ+1)} and L_d ∝ t^{-(γ/2+1)}. We demonstrate that the analytical model provides an excellent match to a comprehensive suite of numerical coagulation simulations for annuli at 1 au and at 25 au. If the evolution of real debris disks follows the predictions of the analytical or numerical models, the observed luminosities for evolved stars require up to a factor of two more mass than predicted by previous analytical models.
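
    The scaling relations can be evaluated directly; the snippet below compares the decline of disk mass and luminosity for an evolving largest-object size against the classical fixed-size cascade. The normalization and the γ value are arbitrary illustrative choices, not fits from the paper.

      import numpy as np

      gamma = 0.15                      # illustrative slope within the quoted 0.1-0.2 range
      t = np.logspace(0, 3, 4)          # time in units of the collisional timescale

      m_evolving = t ** (-(gamma + 1.0))        # M_d with r_max evolving as t^-gamma
      l_evolving = t ** (-(gamma / 2.0 + 1.0))  # L_d with r_max evolving as t^-gamma
      m_standard = t ** -1.0                    # classical cascade with fixed r_max
      l_standard = t ** -1.0

      for ti, me, ms, le, ls in zip(t, m_evolving, m_standard, l_evolving, l_standard):
          print(f"t = {ti:7.1f}: M_d ratio (evolving/fixed) = {me / ms:.2f}, "
                f"L_d ratio = {le / ls:.2f}")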

  17. A Bayesian Poisson-lognormal Model for Count Data for Multiple-Trait Multiple-Environment Genomic-Enabled Prediction

    PubMed Central

    Montesinos-López, Osval A.; Montesinos-López, Abelardo; Crossa, José; Toledo, Fernando H.; Montesinos-López, José C.; Singh, Pawan; Juliana, Philomin; Salinas-Ruiz, Josafhat

    2017-01-01

    When a plant scientist wishes to make genomic-enabled predictions of multiple traits measured in multiple individuals in multiple environments, the most common strategy for performing the analysis is to use a single trait at a time taking into account genotype × environment interaction (G × E), because there is a lack of comprehensive models that simultaneously take into account the correlated count traits and G × E. For this reason, in this study we propose a multiple-trait and multiple-environment model for count data. The proposed model was developed under the Bayesian paradigm, for which we developed a Markov Chain Monte Carlo (MCMC) algorithm with noninformative priors. This allows all required full conditional distributions of the parameters to be obtained, leading to an exact Gibbs sampler for the posterior distribution. Our model was tested with simulated data and a real data set. Results show that the proposed multi-trait, multi-environment model is an attractive alternative for modeling multiple count traits measured in multiple environments. PMID:28364037

  18. Computational Cosmology: From the Early Universe to the Large Scale Structure.

    PubMed

    Anninos, Peter

    2001-01-01

    In order to account for the observable Universe, any comprehensive theory or model of cosmology must draw from many disciplines of physics, including gauge theories of strong and weak interactions, the hydrodynamics and microphysics of baryonic matter, electromagnetic fields, and spacetime curvature, for example. Although it is difficult to incorporate all these physical elements into a single complete model of our Universe, advances in computing methods and technologies have contributed significantly towards our understanding of cosmological models, the Universe, and astrophysical processes within them. A sample of numerical calculations (and numerical methods) applied to specific issues in cosmology are reviewed in this article: from the Big Bang singularity dynamics to the fundamental interactions of gravitational waves; from the quark-hadron phase transition to the large scale structure of the Universe. The emphasis, although not exclusively, is on those calculations designed to test different models of cosmology against the observed Universe.

  19. Computational Cosmology: from the Early Universe to the Large Scale Structure.

    PubMed

    Anninos, Peter

    1998-01-01

    In order to account for the observable Universe, any comprehensive theory or model of cosmology must draw from many disciplines of physics, including gauge theories of strong and weak interactions, the hydrodynamics and microphysics of baryonic matter, electromagnetic fields, and spacetime curvature, for example. Although it is difficult to incorporate all these physical elements into a single complete model of our Universe, advances in computing methods and technologies have contributed significantly towards our understanding of cosmological models, the Universe, and astrophysical processes within them. A sample of numerical calculations addressing specific issues in cosmology are reviewed in this article: from the Big Bang singularity dynamics to the fundamental interactions of gravitational waves; from the quark-hadron phase transition to the large scale structure of the Universe. The emphasis, although not exclusively, is on those calculations designed to test different models of cosmology against the observed Universe.

  20. A comprehensive method for preliminary design optimization of axial gas turbine stages. II - Code verification

    NASA Technical Reports Server (NTRS)

    Jenkins, R. M.

    1983-01-01

    The present effort represents an extension of previous work wherein a calculation model for performing rapid pitchline optimization of axial gas turbine geometry, including blade profiles, is developed. The model requires no specification of geometric constraints. Output includes aerodynamic performance (adiabatic efficiency), hub-tip flow-path geometry, blade chords, and estimates of blade shape. Presented herein is a verification of the aerodynamic performance portion of the model, whereby detailed turbine test-rig data, including rig geometry, is input to the model to determine whether tested performance can be predicted. An array of seven (7) NASA single-stage axial gas turbine configurations is investigated, ranging in size from 0.6 kg/s to 63.8 kg/s mass flow and in specific work output from 153 J/g to 558 J/g at design (hot) conditions; stage loading factor ranges from 1.15 to 4.66.

  1. Dissimilarity based Partial Least Squares (DPLS) for genomic prediction from SNPs.

    PubMed

    Singh, Priyanka; Engel, Jasper; Jansen, Jeroen; de Haan, Jorn; Buydens, Lutgarde Maria Celina

    2016-05-04

    Genomic prediction (GP) allows breeders to select plants and animals based on their breeding potential for desirable traits, without lengthy and expensive field trials or progeny testing. We have proposed to use Dissimilarity-based Partial Least Squares (DPLS) for GP. As a case study, we use the DPLS approach to predict Bacterial wilt (BW) in tomatoes using SNPs as predictors. The DPLS approach was compared with the Genomic Best-Linear Unbiased Prediction (GBLUP) and single-SNP regression with SNP as a fixed effect to assess the performance of DPLS. Eight genomic distance measures were used to quantify relationships between the tomato accessions from the SNPs. Subsequently, each of these distance measures was used to predict the BW using the DPLS prediction model. The DPLS model was found to be robust to the choice of distance measures; similar prediction performances were obtained for each distance measure. DPLS greatly outperformed the single-SNP regression approach, showing that BW is a comprehensive trait dependent on several loci. Next, the performance of the DPLS model was compared to that of GBLUP. Although GBLUP and DPLS are conceptually very different, the prediction quality (PQ) of the DPLS models was similar to the prediction statistics obtained from GBLUP. A considerable advantage of DPLS is that the genotype-phenotype relationship can easily be visualized in a 2-D scatter plot. This so-called score plot provides breeders with insight for selecting candidates for their future breeding program. DPLS is a highly appropriate method for GP. The model prediction performance was similar to that of GBLUP and far better than the single-SNP approach. The proposed method can be used in combination with a wide range of genomic dissimilarity measures and genotype representations such as allele-count, haplotypes or allele-intensity values. Additionally, the data can be insightfully visualized by the DPLS model, allowing for selection of desirable candidates from the breeding experiments. In this study, we have assessed the DPLS performance on a single trait.
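
    A hedged sketch of the DPLS idea on synthetic data follows, assuming a Euclidean genomic distance and scikit-learn's PLSRegression as a stand-in for whatever PLS implementation the authors used; none of it reproduces their code or data.

      # Sketch: build a genomic distance matrix from SNP allele counts and use its
      # columns as the predictor block in a PLS regression of the phenotype.
      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      snps = rng.integers(0, 3, size=(100, 500))   # 100 accessions, 500 SNPs (0/1/2)
      phenotype = rng.normal(size=100)             # e.g. a bacterial wilt score

      D = squareform(pdist(snps, metric="euclidean"))  # one of several possible distances
      pls = PLSRegression(n_components=5).fit(D, phenotype)
      scores = pls.transform(D)                        # 2-D score plot uses scores[:, :2]
      print(scores.shape)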

  2. Girls' Participation in Physics in Single Sex Classes in Mixed Schools in Relation to Confidence and Achievement.

    ERIC Educational Resources Information Center

    Gillibrand, Eileen; Robinson, Peter; Brawn, Richard; Osborn, Albert

    1999-01-01

    Reports the findings from a three-year longitudinal case study of two single-sex General Certificate of Secondary Education (GCSE) physics classes in a mixed comprehensive school in England. Results indicate that girls who elected to study physics in single-sex classes gain confidence in the subject. This gain in confidence is associated with…

  3. Development of a Theoretically Based Treatment for Sentence Comprehension Deficits in Individuals with Aphasia

    ERIC Educational Resources Information Center

    Kiran, Swathi; Caplan, David; Sandberg, Chaleece; Levy, Joshua; Berardino, Alex; Ascenso, Elsa; Villard, Sarah; Tripodis, Yorghos

    2012-01-01

    Purpose: Two new treatments, 1 based on sentence to picture matching (SPM) and the other on object manipulation (OM), that train participants on the thematic roles of sentences using pictures or by manipulating objects were piloted. Method: Using a single-subject multiple-baseline design, sentence comprehension was trained on the affected sentence…

  4. Halting in Single Word Production: A Test of the Perceptual Loop Theory of Speech Monitoring

    ERIC Educational Resources Information Center

    Slevc, L. Robert; Ferreira, Victor S.

    2006-01-01

    The "perceptual loop theory" of speech monitoring (Levelt, 1983) claims that inner and overt speech are monitored by the comprehension system, which detects errors by comparing the comprehension of formulated utterances to originally intended utterances. To test the perceptual loop monitor, speakers named pictures and sometimes attempted to halt…

  5. Vestibular Stimulation for ADHD: Randomized Controlled Trial of Comprehensive Motion Apparatus

    ERIC Educational Resources Information Center

    Clark, David L.; Arnold, L. Eugene; Crowl, Lindsay; Bozzolo, Hernan; Peruggia, Mario; Ramadan, Yaser; Bornstein, Robert; Hollway, Jill A.; Thompson, Susan; Malone, Krista; Hall, Kristy L.; Shelton, Sara B.; Bozzolo, Dawn R.; Cook, Amy

    2008-01-01

    Objective: This research evaluates effects of vestibular stimulation by Comprehensive Motion Apparatus (CMA) in ADHD. Method: Children ages 6 to 12 (48 boys, 5 girls) with ADHD were randomized to thrice-weekly 30-min treatments for 12 weeks with CMA, stimulating otoliths and semicircular canals, or a single-blind control of equal duration and…

  6. Use of an E-Reader as a Compensatory Strategy among University Students with Reading Disabilities

    ERIC Educational Resources Information Center

    Tanners, Adam

    2010-01-01

    This study investigated the impact of a Kindle e-book reader on the reading rate, comprehension and e-reader acceptance of five postsecondary students with reading disabilities. A single-case Alternating Treatments Design was employed to measure reading rates and reading comprehension. Students were exposed to a series of controlled reading…

  7. Sifting noisy data for truths about noisy systems. Comment on "Extracting physics of life at the molecular level: A review of single-molecule data analyses" by W. Colomb and S.K. Sarkar

    NASA Astrophysics Data System (ADS)

    Flyvbjerg, Henrik; Mortensen, Kim I.

    2015-06-01

    With each new aspect of nature that becomes accessible to quantitative science, new needs arise for data analysis and mathematical modeling. The classical example is Tycho Brahe's accurate and comprehensive observations of planets, which made him hire Kepler for his mathematical skills to assist with the data analysis. We all learned what that led to: Kepler's three laws of planetary motion, phenomenology in purely mathematical form. Newton built on this, and the scientific revolution was over, completed.

  8. The Attentional Field Revealed by Single-Voxel Modeling of fMRI Time Courses

    PubMed Central

    DeYoe, Edgar A.

    2015-01-01

    The spatial topography of visual attention is a distinguishing and critical feature of many theoretical models of visuospatial attention. Previous fMRI-based measurements of the topography of attention have typically been too crude to adequately test the predictions of different competing models. This study demonstrates a new technique to make detailed measurements of the topography of visuospatial attention from single-voxel, fMRI time courses. Briefly, this technique involves first estimating a voxel's population receptive field (pRF) and then “drifting” attention through the pRF such that the modulation of the voxel's fMRI time course reflects the spatial topography of attention. The topography of the attentional field (AF) is then estimated using a time-course modeling procedure. Notably, we are able to make these measurements in many visual areas including smaller, higher order areas, thus enabling a more comprehensive comparison of attentional mechanisms throughout the full hierarchy of human visual cortex. Using this technique, we show that the AF scales with eccentricity and varies across visual areas. We also show that voxels in multiple visual areas exhibit suppressive attentional effects that are well modeled by an AF having an enhancing Gaussian center with a suppressive surround. These findings provide extensive, quantitative neurophysiological data for use in modeling the psychological effects of visuospatial attention. PMID:25810532
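
    A minimal sketch of the centre-surround attentional field idea described above, assuming Gaussian forms for both the voxel's pRF and the AF; the parameter values are arbitrary illustrations, not fitted estimates from the study.

      # Sketch: an attentional field (AF) with an enhancing Gaussian centre and a
      # suppressive surround, sampled through a voxel's Gaussian pRF.
      import numpy as np

      def gaussian2d(x, y, x0, y0, sigma):
          return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

      x, y = np.meshgrid(np.linspace(-10, 10, 201), np.linspace(-10, 10, 201))
      prf = gaussian2d(x, y, 3.0, 0.0, 1.5)                       # voxel pRF
      af = gaussian2d(x, y, 0.0, 0.0, 2.0) - 0.4 * gaussian2d(x, y, 0.0, 0.0, 6.0)

      modulation = (prf * af).sum() / prf.sum()  # predicted attentional gain for this voxel
      print(modulation)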

  9. An Evaluation of a Testing Model for Listening Comprehension.

    ERIC Educational Resources Information Center

    Kangli, Ji

    A model for testing listening comprehension in English as a Second Language is discussed and compared with the Test for English Majors (TEM). The model in question incorporates listening for: (1) understanding factual information; (2) comprehension and interpretation; (3) detailed and selective information; (4) global ideas; (5) on-line tasks…

  10. Testing and Refining the Direct and Inferential Mediation Model of Reading Comprehension

    ERIC Educational Resources Information Center

    Cromley, Jennifer G.; Azevedo, Roger

    2007-01-01

    A significant proportion of American high school students struggle with reading comprehension. Theoretical models of reading comprehension might help researchers understand these difficulties, because they can point to variables that make the largest contributions to comprehension. On the basis of an extensive review of the literature, we created…

  11. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  12. An integrative model of the impairments in insight in schizophrenia: emerging research on causal factors and treatments.

    PubMed

    Vohs, Jenifer L; George, Sunita; Leonhardt, Bethany L; Lysaker, Paul H

    2016-10-01

    Poor insight, or unawareness of some major aspect of mental illness, is a major barrier to wellness when it interferes with persons seeking out treatment or forming their own understanding of the challenges they face. One barrier to addressing impaired insight is the absence of a comprehensive model of how poor insight develops. To explore this issue we review how poor insight is the result of multiple phenomena which interfere with the construction of narrative accounts of psychiatric challenges, rather than a single social or biological cause. Expert commentary: We propose an integrative model of poor insight in schizophrenia which involves the interaction of symptoms, deficits in neurocognition, social cognition, metacognition, and stigma. Emerging treatments for poor insight including therapies which focus on the development of metacognition are discussed.

  13. Contact Forces between Single Metal Oxide Nanoparticles in Gas-Phase Applications and Processes.

    PubMed

    Salameh, Samir; van der Veen, Monique A; Kappl, Michael; van Ommen, J Ruud

    2017-03-14

    In this work we present a comprehensive experimental study to determine the contact forces between individual metal oxide nanoparticles in the gas-phase using atomic force microscopy. In addition, we determined the amount of physisorbed water for each type of particle surface. By comparing our results with mathematical models of the interaction forces, we could demonstrate that classical continuum models of van der Waals and capillary forces alone cannot sufficiently describe the experimental findings. Rather, the discrete nature of the molecules has to be considered, which leads to ordering at the interface and the occurrence of solvation forces. We demonstrate that inclusion of solvation forces in the model leads to quantitative agreement with experimental data and that tuning of the molecular order by addition of isopropanol vapor allows us to control the interaction forces between the nanoparticles.

  14. Can discrete event simulation be of use in modelling major depression?

    PubMed Central

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-01-01

    Background Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression as well as its economic implications as closely as possible. Objectives In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. Methods We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). Conclusion DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes. PMID:17147790
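
    To make the contrast with Markov cohort models concrete, here is a minimal, assumption-laden sketch of a discrete event simulation in which individual patients carry attributes and a full event history; the rates and the 60-month horizon are placeholders, not values from the review.

      # Sketch: patients move between events in time order via an event queue,
      # and relapse risk depends on each patient's own history.
      import heapq, random

      random.seed(0)
      events = []  # (time in months, patient_id, event_type)
      patients = {i: {"age": random.randint(20, 60), "history": []} for i in range(3)}

      for pid in patients:
          heapq.heappush(events, (random.expovariate(1 / 6.0), pid, "episode"))

      while events:
          t, pid, kind = heapq.heappop(events)
          if t > 60:             # simulate a 60-month horizon
              continue
          patients[pid]["history"].append((round(t, 1), kind))
          if kind == "episode":  # illustrative: relapse rate rises with prior episodes
              rate = 1 / max(1.0, 12.0 - 2 * len(patients[pid]["history"]))
              heapq.heappush(events, (t + random.expovariate(rate), pid, "episode"))

      print(patients[0]["history"])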

  15. Can discrete event simulation be of use in modelling major depression?

    PubMed

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-12-05

    Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression as well as its economic implications as closely as possible. In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.

  16. Predicting Longitudinal Change in Language Production and Comprehension in Individuals with Down Syndrome: Hierarchical Linear Modeling.

    ERIC Educational Resources Information Center

    Chapman, Robin S.; Hesketh, Linda J.; Kistler, Doris J.

    2002-01-01

    Longitudinal change in syntax comprehension and production skill, measured over six years, was modeled in 31 individuals (ages 5-20) with Down syndrome. The best fitting Hierarchical Linear Modeling model of comprehension uses age and visual and auditory short-term memory as predictors of initial status, and age for growth trajectory. (Contains…

  17. Postpolio Syndrome: Using a Single Case Study

    ERIC Educational Resources Information Center

    Obringer, S. John; Elrod, G. Franklin

    2004-01-01

    The purpose of this study was to identify the major characteristics of postpolio syndrome (PPS), investigate physical and psychological limitations, and comprehensively review current medical interventions through a single subject design. The study addresses the symptoms and characteristics, the effect on life style, and the current recommended…

  18. A weak-scattering model for turbine-tone haystacking

    NASA Astrophysics Data System (ADS)

    McAlpine, A.; Powles, C. J.; Tester, B. J.

    2013-08-01

    Noise and emissions are critical technical issues in the development of aircraft engines. This necessitates the development of accurate models to predict the noise radiated from aero-engines. Turbine tones radiated from the exhaust nozzle of a turbofan engine propagate through turbulent jet shear layers which causes scattering of sound. In the far-field, measurements of the tones may exhibit spectral broadening, where owing to scattering, the tones are no longer narrow band peaks in the spectrum. This effect is known colloquially as 'haystacking'. In this article a comprehensive analytical model to predict spectral broadening for a tone radiated through a circular jet, for an observer in the far field, is presented. This model extends previous work by the authors which considered the prediction of spectral broadening at far-field observer locations outside the cone of silence. The modelling uses high-frequency asymptotic methods and a weak-scattering assumption. A realistic shear layer velocity profile and turbulence characteristics are included in the model. The mathematical formulation which details the spectral broadening, or haystacking, of a single-frequency, single azimuthal order turbine tone is outlined. In order to validate the model, predictions are compared with experimental results, albeit only at polar angle equal to 90°. A range of source frequencies from 4 to 20kHz, and jet velocities from 20 to 60ms-1, are examined for validation purposes. The model correctly predicts how the spectral broadening is affected when the source frequency and jet velocity are varied.

  19. The CESM Large Ensemble Project: Inspiring New Ideas and Understanding

    NASA Astrophysics Data System (ADS)

    Kay, J. E.; Deser, C.

    2016-12-01

    While internal climate variability is known to affect climate projections, its influence is often underappreciated and confused with model error. Why? In general, modeling centers contribute a small number of realizations to international climate model assessments [e.g., phase 5 of the Coupled Model Intercomparison Project (CMIP5)]. As a result, model error and internal climate variability are difficult, and at times impossible, to disentangle. In response, the Community Earth System Model (CESM) community designed the CESM Large Ensemble (CESM-LE) with the explicit goal of enabling assessment of climate change in the presence of internal climate variability. All CESM-LE simulations use a single CMIP5 model (CESM with the Community Atmosphere Model, version 5). The core simulations replay the twentieth to twenty-first century (1920-2100) 40+ times under historical and representative concentration pathway 8.5 external forcing with small initial condition differences. Two companion 2000+-yr-long preindustrial control simulations (fully coupled, prognostic atmosphere and land only) allow assessment of internal climate variability in the absence of climate change. Comprehensive outputs, including many daily fields, are available as single-variable time series on the Earth System Grid for anyone to use. Examples of scientists and stakeholders who are using the CESM-LE outputs to help interpret the observational record, to understand projection spread and to plan for a range of possible futures influenced by both internal climate variability and forced climate change will be highlighted in the presentation.

  20. Building Comprehensive Career Guidance Programs for Secondary Schools: A Handbook of Programs, Practices, and Models. Research and Development Series No. 147.

    ERIC Educational Resources Information Center

    Campbell, Robert E.; And Others

    This handbook presents management techniques, program ideas, and student activities for building comprehensive secondary career guidance programs. Part 1 (chapter 1) traces the history of guidance to set the stage for the current emphasis on comprehensive programs, summarizes four representative models for designing comprehensive programs, and…

  1. Reconsidering the Simple View of Reading in an Intriguing Case of Equivalent Models: Commentary on Tunmer and Chapman (2012)

    ERIC Educational Resources Information Center

    Wagner, Richard K.; Herrera, Sarah K.; Spencer, Mercedes; Quinn, Jamie M.

    2015-01-01

    Recently, Tunmer and Chapman provided an alternative model of how decoding and listening comprehension affect reading comprehension that challenges the simple view of reading. They questioned the simple view's fundamental assumption that oral language comprehension and decoding make independent contributions to reading comprehension by arguing…

  2. Epilogue: Reading Comprehension Is Not a Single Ability-Implications for Assessment and Instruction.

    PubMed

    Kamhi, Alan G; Catts, Hugh W

    2017-04-20

    In this epilogue, we review the 4 response articles and highlight the implications of a multidimensional view of reading for the assessment and instruction of reading comprehension. We reiterate the problems with standardized tests of reading comprehension and discuss the advantages and disadvantages of recently developed authentic tests of reading comprehension. In the "Instruction" section, we review the benefits and limitations of strategy instruction and highlight suggestions from the response articles to improve content and language knowledge. We argue that the only compelling reason to administer a standardized test of reading comprehension is when these tests are necessary to qualify students for special education services. Instruction should be focused on content knowledge, language knowledge, and specific task and learning requirements. This instruction may entail the use of comprehension strategies, particularly those that are specific to the task and focus on integrating new knowledge with prior knowledge.

  3. Oral Language and Listening Comprehension: Same or Different Constructs?

    PubMed

    2017-05-24

    The purpose of this study was to add to our understanding of the dimensionality of oral language in children and to determine whether oral language and listening comprehension are separate constructs in children enrolled in preschool (PK) through 3rd grade. In the spring of the school year, children from 4 states (N = 1,869) completed multiple measures of oral language (i.e., expressive and receptive vocabulary and grammar) and listening comprehension as part of a larger study of the language bases of reading comprehension. Initial confirmatory factor analysis found evidence that measures of oral language and listening comprehension loaded on two separate factors in PK through 3rd grade; however, these factors were highly correlated at all grades. These results suggest that oral language and listening comprehension are best characterized as a single oral language construct in PK through 3rd grade. The implications for early identification and intervention are discussed.

  4. A Causal Model of Sentence Recall: Effects of Familiarity, Concreteness, Comprehensibility, and Interestingness.

    ERIC Educational Resources Information Center

    Sadoski, Mark; And Others

    1993-01-01

    Presents and tests a theoretically derived causal model of the recall of sentences. Notes that the causal model identifies familiarity and concreteness as causes of comprehensibility; familiarity, concreteness, and comprehensibility as causes of interestingness; and all the identified variables as causes of both immediate and delayed recall.…

  5. Comprehensive multilevel in vivo and in vitro analysis of heart rate fluctuations in mice by ECG telemetry and electrophysiology.

    PubMed

    Fenske, Stefanie; Pröbstle, Rasmus; Auer, Franziska; Hassan, Sami; Marks, Vanessa; Pauza, Danius H; Biel, Martin; Wahl-Schott, Christian

    2016-01-01

    The normal heartbeat slightly fluctuates around a mean value; this phenomenon is called physiological heart rate variability (HRV). It is well known that altered HRV is a risk factor for sudden cardiac death. The availability of genetic mouse models makes it possible to experimentally dissect the mechanism of pathological changes in HRV and its relation to sudden cardiac death. Here we provide a protocol that allows for a comprehensive multilevel analysis of heart rate (HR) fluctuations. The protocol comprises a set of techniques that include in vivo telemetry and in vitro electrophysiology of intact sinoatrial network preparations or isolated single sinoatrial node (SAN) cells. In vitro preparations can be completed within a few hours, with data acquisition within 1 d. In vivo telemetric ECG requires 1 h for surgery and several weeks for data acquisition and analysis. This protocol is of interest to researchers investigating cardiovascular physiology and the pathophysiology of sudden cardiac death.

  6. Comparing theories' performance in predicting violence.

    PubMed

    Haas, Henriette; Cusson, Maurice

    2015-01-01

    The stakes of choosing the best theory as a basis for violence prevention and offender rehabilitation are high. However, no single theory of violence has ever been universally accepted by a majority of established researchers. Psychiatry, psychology and sociology are each subdivided into different schools relying upon different premises. All theories can produce empirical evidence for their validity, some of them stating the opposite of each other. Calculating different models with multivariate logistic regression on a dataset of N = 21,312 observations and ninety-two influences allowed a direct comparison of the performance of operationalizations of some of the most important schools. The psychopathology model ranked as the best model in terms of predicting violence, right after the comprehensive interdisciplinary model. Next came the rational choice and lifestyle model, and third the differential association and learning theory model. Other models, namely the control theory model, the childhood-trauma model and the social conflict and reaction model, turned out to have low sensitivities for predicting violence. Nevertheless, all models produced acceptable results in predictions of a non-violent outcome. Copyright © 2015. Published by Elsevier Ltd.
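
    A hedged sketch of the comparison strategy on synthetic data, assuming each "theory" is operationalised as a separate predictor block in a logistic regression and judged by its sensitivity on held-out data; nothing here reproduces the authors' dataset, variables or model specifications.

      # Sketch: fit one logistic regression per theory-specific predictor block and
      # compare sensitivity (recall for the violent outcome) on a held-out set.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import recall_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(2)
      X = rng.normal(size=(2000, 10))                  # synthetic "influences"
      y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=2000) > 1.5).astype(int)

      theory_blocks = {"psychopathology": [0, 1, 2], "lifestyle": [3, 4, 5], "control": [6, 7]}
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      for name, cols in theory_blocks.items():
          model = LogisticRegression().fit(X_tr[:, cols], y_tr)
          sens = recall_score(y_te, model.predict(X_te[:, cols]))  # sensitivity
          print(name, round(sens, 3))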

  7. Zuni Comprehensive Development Plan. For a Better Zuni by '75. Volumes One and Two.

    ERIC Educational Resources Information Center

    Pueblo of Zuni, NM.

    The Zuni comprehensive development plan encompasses a variety of projects designed to achieve major development goals on the Zuni Indian reservation in New Mexico. The single overall planning objective is to raise the level of living for residents of the Zuni reservation to equal or to exceed the average for all United States citizens. Major…

  8. Analyzing the Effects of Story Mapping on the Reading Comprehension of Children with Low Intellectual Abilities

    ERIC Educational Resources Information Center

    Grünke, Matthias; Wilbert, Jürgen; Stegemann, Kim Calder

    2013-01-01

    This single-case study examined the effects of a graphic organizing strategy on the ability of children to improve their text comprehension abilities. Participants were six students between ten and fourteen years old with major problems in understanding what they read. The intervention intended to teach them to visually highlight key elements of a…

  9. Comprehension and Time Expended for a Doctoral Student with a Learning Disability when Reading with and without an Accommodation

    ERIC Educational Resources Information Center

    Tanners, Adam; McDougall, Dennis; Skouge, Jim; Narkon, Drue

    2012-01-01

    The purpose of this alternating treatment, single-case research study was to compare reading comprehension and time expended reading, of a doctoral student with learning disabilities, under two reading conditions. In condition one, the student used a self-discovered accommodation, that is, listening, on an iPod, to an audiobook version…

  10. Using Text-to-Speech Reading Support for an Adult with Mild Aphasia and Cognitive Impairment

    ERIC Educational Resources Information Center

    Harvey, Judy; Hux, Karen; Snell, Jeffry

    2013-01-01

    This single case study served to examine text-to-speech (TTS) effects on reading rate and comprehension in an individual with mild aphasia and cognitive impairment. Findings showed faster reading, given TTS presented at a normal speaking rate, but no significant comprehension changes. TTS may support reading in people with aphasia when time…

  11. The effects of antecedent color on reading for students with learning disabilities and co-occurring attention-deficit/ hyperactivity disorder.

    PubMed

    Belfiore, P J; Grskovic, J A; Murphy, A M; Zentall, S S

    1996-07-01

    The effects of color on the reading recognition and comprehension of 3 students with learning disabilities and attention-deficit/hyperactivity disorder were assessed in a single-subject design. Color did not enhance sight-word learning; for longer reading comprehension tasks, color had an immediate effect across and within sessions.

  12. Implementation of a Single Comprehensive Function-Based Intervention across Multiple Classrooms for a High School Student

    ERIC Educational Resources Information Center

    Whitford, Denise K.; Liaupsin, Carl J.; Umbreit, John; Ferro, Jolenea B.

    2013-01-01

    A comprehensive function-based intervention was developed to address the chronic, high levels of off-task behavior by a 15-year-old ninth grade Caucasian male with learning disabilities and ADHD. A descriptive FBA identified that the student's off-task behavior was reinforced by peer attention and task avoidance. Intervention involved the…

  13. Recent Official Policy and Concepts of Reading Comprehension and Inference: The Case of England's Primary Curriculum

    ERIC Educational Resources Information Center

    Williams, Jazz C.

    2014-01-01

    This article engages with recent policy on reading comprehension. It argues that the construct of inference has been treated as a single entity despite research and literature to the contrary, and this is perpetuated in the National Curriculum for 2014. It explores the limitations of conceptualising inference as a unitary construct and…

  14. A predictive study of reading comprehension in third-grade Spanish students.

    PubMed

    López-Escribano, Carmen; Elosúa de Juan, María Rosa; Gómez-Veiga, Isabel; García-Madruga, Juan Antonio

    2013-01-01

    The study of the contribution of language and cognitive skills to reading comprehension is an important goal of current reading research. However, reading comprehension is not easily assessed by a single instrument, as different comprehension tests vary in the type of tasks used and in the cognitive demands required. This study examines the contribution of basic language and cognitive skills (decoding, word recognition, reading speed, verbal and nonverbal intelligence and working memory) to reading comprehension, assessed by two tests utilizing various tasks that require different skill sets in third-grade Spanish-speaking students. Linguistic and cognitive abilities predicted reading comprehension. A measure of reading speed (the reading time of pseudo-words) was the best predictor of reading comprehension when assessed by the PROLEC-R test. However, measures of word recognition (the orthographic choice task) and verbal working memory were the best predictors of reading comprehension when assessed by means of the DARC test. These results show, on the one hand, that reading speed and word recognition are better predictors of Spanish language comprehension than reading accuracy. On the other hand, the reading comprehension test applied here serves as a critical variable when analyzing and interpreting results regarding this topic.

  15. Hierarchies of Models: Toward Understanding Planetary Nebulae

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Hajian, Arsen R.; Clancy, Daniel (Technical Monitor)

    2003-01-01

    Stars like our sun (initial masses between 0.8 to 8 solar masses) end their lives as swollen red giants surrounded by cool extended atmospheres. The nuclear reactions in their cores create carbon, nitrogen and oxygen, which are transported by convection to the outer envelope of the stellar atmosphere. As the star finally collapses to become a white dwarf, this envelope is expelled from the star to form a planetary nebula (PN) rich in organic molecules. The physics, dynamics, and chemistry of these nebulae are poorly understood and have implications not only for our understanding of the stellar life cycle but also for organic astrochemistry and the creation of prebiotic molecules in interstellar space. We are working toward generating three-dimensional models of planetary nebulae (PNe), which include the size, orientation, shape, expansion rate and mass distribution of the nebula. Such a reconstruction of a PN is a challenging problem for several reasons. First, the data consist of images obtained over time from the Hubble Space Telescope (HST) and spectra obtained from Kitt Peak National Observatory (KPNO) and Cerro Tololo Inter-American Observatory (CTIO). These images are of course taken from a single viewpoint in space, which amounts to a very challenging tomographic reconstruction. Second, the fact that we have two disparate and orthogonal data types requires that we utilize a method that allows these data to be used together to obtain a solution. To address these first two challenges we employ Bayesian model estimation using a parameterized physical model that incorporates much prior information about the known physics of the PN. In our previous works we have found that the forward problem of the comprehensive model is extremely time consuming. To address this challenge, we explore the use of a set of hierarchical models, which allow us to estimate increasingly more detailed sets of model parameters. These hierarchical models of increasing complexity are akin to scientific theories of increasing sophistication, with each new model/theory being a refinement of a previous one by either incorporating additional prior information or by introducing a new set of parameters to model an entirely new phenomenon. We apply these models to both a simulated and a real ellipsoidal PN to initially estimate the position, angular size, and orientation of the nebula as a two-dimensional object and use these estimates to later examine its three-dimensional properties. The efficiency/accuracy tradeoffs of the techniques are studied to determine the advantages and disadvantages of employing a set of hierarchical models over a single comprehensive model.

  16. Hierarchies of Models: Toward Understanding Planetary Nebulae

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Hajian, Arsen R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Stars like our sun (initial masses between 0.8 to 8 solar masses) end their lives as swollen red giants surrounded by cool extended atmospheres. The nuclear reactions in their cores create carbon, nitrogen and oxygen, which are transported by convection to the outer envelope of the stellar atmosphere. As the star finally collapses to become a white dwarf, this envelope is expelled from the star to form a planetary nebula (PN) rich in organic molecules. The physics, dynamics, and chemistry of these nebulae are poorly understood and have implications not only for our understanding of the stellar life cycle but also for organic astrochemistry and the creation of prebiotic molecules in interstellar space. We are working toward generating three-dimensional models of planetary nebulae (PNe), which include the size, orientation, shape, expansion rate and mass distribution of the nebula. Such a reconstruction of a PN is a challenging problem for several reasons. First, the data consist of images obtained over time from the Hubble Space Telescope (HST) and spectra obtained from Kitt Peak National Observatory (KPNO) and Cerro Tololo Inter-American Observatory (CTIO). These images are of course taken from a single viewpoint in space, which amounts to a very challenging tomographic reconstruction. Second, the fact that we have two disparate and orthogonal data types requires that we utilize a method that allows these data to be used together to obtain a solution. To address these first two challenges we employ Bayesian model estimation using a parameterized physical model that incorporates much prior information about the known physics of the PN. In our previous works we have found that the forward problem of the comprehensive model is extremely time consuming. To address this challenge, we explore the use of a set of hierarchical models, which allow us to estimate increasingly more detailed sets of model parameters. These hierarchical models of increasing complexity are akin to scientific theories of increasing sophistication, with each new model/theory being a refinement of a previous one by either incorporating additional prior information or by introducing a new set of parameters to model an entirely new phenomenon. We apply these models to both a simulated and a real ellipsoidal PN to initially estimate the position, angular size, and orientation of the nebula as a two-dimensional object and use these estimates to later examine its three-dimensional properties. The efficiency/accuracy tradeoffs of the techniques are studied to determine the advantages and disadvantages of employing a set of hierarchical models over a single comprehensive model.

  17. Advances in understanding tumour evolution through single-cell sequencing.

    PubMed

    Kuipers, Jack; Jahn, Katharina; Beerenwinkel, Niko

    2017-04-01

    The mutational heterogeneity observed within tumours poses additional challenges to the development of effective cancer treatments. A thorough understanding of a tumour's subclonal composition and its mutational history is essential to open up the design of treatments tailored to individual patients. Comparative studies on a large number of tumours permit the identification of mutational patterns which may refine forecasts of cancer progression, response to treatment and metastatic potential. The composition of tumours is shaped by evolutionary processes. Recent advances in next-generation sequencing offer the possibility to analyse the evolutionary history and accompanying heterogeneity of tumours at an unprecedented resolution, by sequencing single cells. New computational challenges arise when moving from bulk to single-cell sequencing data, leading to the development of novel modelling frameworks. In this review, we present the state of the art methods for understanding the phylogeny encoded in bulk or single-cell sequencing data, and highlight future directions for developing more comprehensive and informative pictures of tumour evolution. This article is part of a Special Issue entitled: Evolutionary principles - heterogeneity in cancer?, edited by Dr. Robert A. Gatenby. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Mapping human pluripotent stem cell differentiation pathways using high throughput single-cell RNA-sequencing.

    PubMed

    Han, Xiaoping; Chen, Haide; Huang, Daosheng; Chen, Huidong; Fei, Lijiang; Cheng, Chen; Huang, He; Yuan, Guo-Cheng; Guo, Guoji

    2018-04-05

    Human pluripotent stem cells (hPSCs) provide powerful models for studying cellular differentiations and unlimited sources of cells for regenerative medicine. However, a comprehensive single-cell level differentiation roadmap for hPSCs has not been achieved. We use high throughput single-cell RNA-sequencing (scRNA-seq), based on optimized microfluidic circuits, to profile early differentiation lineages in the human embryoid body system. We present a cellular-state landscape for hPSC early differentiation that covers multiple cellular lineages, including neural, muscle, endothelial, stromal, liver, and epithelial cells. Through pseudotime analysis, we construct the developmental trajectories of these progenitor cells and reveal the gene expression dynamics in the process of cell differentiation. We further reprogram primed H9 cells into naïve-like H9 cells to study the cellular-state transition process. We find that genes related to hemogenic endothelium development are enriched in naïve-like H9. Functionally, naïve-like H9 show higher potency for differentiation into hematopoietic lineages than primed cells. Our single-cell analysis reveals the cellular-state landscape of hPSC early differentiation, offering new insights that can be harnessed for optimization of differentiation protocols.

  19. Identifying Evidence-Based Special Education Interventions from Single-Subject Research

    ERIC Educational Resources Information Center

    Freeman, Jennifer; Sugai, George

    2013-01-01

    Special educators are required to use evidence-based academic and behavioral interventions in their classrooms (U.S. Department of Education, 2010). No rigorous and comprehensive database currently exists to support educators. Within the field of special education, single-subject research is the primary research methodology (Horner, Carr, Halle,…

  20. 76 FR 67192 - Administration on Children, Youth and Families Announces the Award of a Single-Source Program...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-31

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Administration for Children and Families Administration on Children, Youth and Families Announces the Award of a Single-Source Program Expansion Supplement Grant to..., comprehensive outreach and increased capacity through technology improvements. STATUTORY AUTHORITY: Part C...

  1. Crash problem definition and safety benefits methodology for stability control for single-unit medium and heavy trucks and large-platform buses

    DOT National Transportation Integrated Search

    2009-10-01

    This report presents the findings of a comprehensive engineering analysis of electronic stability control (ESC) and roll stability control (RSC) systems for single-unit medium and heavy trucks and large-platform buses. This report details the applica...

  2. Spectral irradiance of singly and doubly ionized zinc in low-intensity laser-plasma ultraviolet light sources

    NASA Astrophysics Data System (ADS)

    Szilagyi, John; Parchamy, Homaira; Masnavi, Majid; Richardson, Martin

    2017-01-01

    The absolute spectral irradiances of laser-plasmas produced from planar zinc targets are determined over a wavelength region of 150 to 250 nm. Strong spectral radiation is generated using 60 ns full-width-at-half-maximum, 1.0 μm wavelength laser pulses with incident laser intensities as low as ~5 × 10^8 W cm^-2. A typical radiation conversion efficiency of ~2%/2π sr is measured. Numerical calculations using a comprehensive radiation-hydrodynamics model reveal the strong experimental spectra to originate mainly from the 3d^9 4s4p-3d^9 4s^2 and 3d^9 4s4d-3d^9 4s4p unresolved-transition arrays in singly ionized zinc and from the 3d^9 4p-3d^9 4s and 3d^9 4d-3d^9 4p arrays in doubly ionized zinc.

  3. The Zeeman splitting of bulk 2H-MoTe2 single crystal in high magnetic field

    NASA Astrophysics Data System (ADS)

    Sun, Yan; Zhang, Junpei; Ma, Zongwei; Chen, Cheng; Han, Junbo; Chen, Fangchu; Luo, Xuan; Sun, Yuping; Sheng, Zhigao

    2017-03-01

    A high magnetic field magneto-optical spectrum is utilized to study the A exciton of bulk 2H-MoTe2 single crystal. A clear Zeeman splitting of the A exciton is observed under high magnetic fields up to 41.68 T, and the g-factor (-2.09 ± 0.08) is deduced. Moreover, a high magnetic field enables us to obtain the quadratic diamagnetic shifts of the A exciton (0.486 μeV T^-2). Accordingly, the binding energy, reduced mass, and radius of the A exciton were obtained by using both two and three dimensional models. Compared with other transition metal dichalcogenides (TMDs), the A exciton of bulk 2H-MoTe2 has a relatively small binding energy and larger exciton radius, which provide fundamental parameters for comprehensive understanding of excitons in TMDs as well as their future applications.
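
    Back-of-the-envelope arithmetic with the quoted numbers, assuming the usual linear Zeeman term ΔE = g·μ_B·B and a quadratic diamagnetic shift σ·B²; the sign convention and any excitonic subtleties are assumptions here, not taken from the paper.

      # Sketch: Zeeman splitting and diamagnetic shift at the maximum quoted field.
      MU_B = 5.7883818060e-5   # Bohr magneton in eV/T
      g = -2.09                # quoted exciton g-factor
      sigma = 0.486e-6         # quoted diamagnetic coefficient in eV/T^2
      B = 41.68                # maximum field in tesla

      zeeman = g * MU_B * B    # ≈ -5.0 meV splitting at 41.68 T
      diamag = sigma * B ** 2  # ≈ 0.84 meV quadratic shift
      print(f"Zeeman splitting: {zeeman*1e3:.2f} meV, diamagnetic shift: {diamag*1e3:.2f} meV")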

  4. Simulating single word processing in the classic aphasia syndromes based on the Wernicke-Lichtheim-Geschwind theory.

    PubMed

    Weems, Scott A; Reggia, James A

    2006-09-01

    The Wernicke-Lichtheim-Geschwind (WLG) theory of the neurobiological basis of language is of great historical importance, and it continues to exert a substantial influence on most contemporary theories of language in spite of its widely recognized limitations. Here, we suggest that neurobiologically grounded computational models based on the WLG theory can provide a deeper understanding of which of its features are plausible and where the theory fails. As a first step in this direction, we created a model of the interconnected left and right neocortical areas that are most relevant to the WLG theory, and used it to study visual-confrontation naming, auditory repetition, and auditory comprehension performance. No specific functionality is assigned a priori to model cortical regions, other than that implicitly present due to their locations in the cortical network and a higher learning rate in left hemisphere regions. Following learning, the model successfully simulates confrontation naming and word repetition, and acquires a unique internal representation in parietal regions for each named object. Simulated lesions to the language-dominant cortical regions produce patterns of single word processing impairment reminiscent of those postulated historically in the classic aphasia syndromes. These results indicate that WLG theory, instantiated as a simple interconnected network of model neocortical regions familiar to any neuropsychologist/neurologist, captures several fundamental "low-level" aspects of neurobiological word processing and their impairment in aphasia.

  5. The Implementation of C-ID, R2D2 Model on Learning Reading Comprehension

    ERIC Educational Resources Information Center

    Rayanto, Yudi Hari; Rusmawan, Putu Ngurah

    2016-01-01

    The purposes of this research are to find out: (1) whether the C-ID, R2D2 model is effective when implemented in learning reading comprehension, (2) college students' activity during the implementation of the C-ID, R2D2 model in learning reading comprehension, and (3) college students' learning achievement during the implementation of the C-ID, R2D2 model on…

  6. Comprehensive Oral Health Care to Reduce the Incidence of Severe Early Childhood Caries (s-ECC) in Urban China.

    PubMed

    Si, Yan; Guo, Yan; Yuan, Chao; Xu, Tao; Zheng, Shu Guo

    2016-03-01

    To explore the effectiveness of comprehensive oral health care to reduce the caries incidence for children with severe early childhood caries (s-ECC) in an urban area in China. A total of 357 children aged 3 to 4 years old and diagnosed with s-ECC were recruited in this randomised controlled, single-blinded clinical trial for 1 year. Children of two different kindergarten classes were enrolled in this study and randomly divided into a test group (205 children) and a control group (152 children). The test group received comprehensive oral health care, which included: oral health examination, oral health education, topical fluoride application and dental treatment, and the children in the control group only received the oral health examination. The evaluation of the oral health questionnaire for parents was also performed. An evaluation was carried out at the time of recruitment and 1 year later to explore the effectiveness of the comprehensive oral health care model. The differences in decayed teeth (dt), decayed tooth surfaces (ds), filled teeth (ft), filled tooth surfaces (fs) and the ratio of ft /(dt + ft) between the two groups were statistically significant (P < 0.001) at 1 year. The incidence of caries in the control group was higher than that of the test group (P = 0.02). The rate of awareness of oral health knowledge (P = 0.01) and the practice of good diet habits (P = 0.02) by parents in the test group were significantly higher than those in the control group. The present study demonstrated that the comprehensive oral health care program reduces and prevents caries amongst children with s-ECC.

  7. No Association of Coronary Artery Disease with X-Chromosomal Variants in Comprehensive International Meta-Analysis

    PubMed Central

    Loley, Christina; Alver, Maris; Assimes, Themistocles L.; Bjonnes, Andrew; Goel, Anuj; Gustafsson, Stefan; Hernesniemi, Jussi; Hopewell, Jemma C.; Kanoni, Stavroula; Kleber, Marcus E.; Lau, King Wai; Lu, Yingchang; Lyytikäinen, Leo-Pekka; Nelson, Christopher P.; Nikpay, Majid; Qu, Liming; Salfati, Elias; Scholz, Markus; Tukiainen, Taru; Willenborg, Christina; Won, Hong-Hee; Zeng, Lingyao; Zhang, Weihua; Anand, Sonia S.; Beutner, Frank; Bottinger, Erwin P.; Clarke, Robert; Dedoussis, George; Do, Ron; Esko, Tõnu; Eskola, Markku; Farrall, Martin; Gauguier, Dominique; Giedraitis, Vilmantas; Granger, Christopher B.; Hall, Alistair S.; Hamsten, Anders; Hazen, Stanley L.; Huang, Jie; Kähönen, Mika; Kyriakou, Theodosios; Laaksonen, Reijo; Lind, Lars; Lindgren, Cecilia; Magnusson, Patrik K. E.; Marouli, Eirini; Mihailov, Evelin; Morris, Andrew P.; Nikus, Kjell; Pedersen, Nancy; Rallidis, Loukianos; Salomaa, Veikko; Shah, Svati H.; Stewart, Alexandre F. R.; Thompson, John R.; Zalloua, Pierre A.; Chambers, John C.; Collins, Rory; Ingelsson, Erik; Iribarren, Carlos; Karhunen, Pekka J.; Kooner, Jaspal S.; Lehtimäki, Terho; Loos, Ruth J. F.; März, Winfried; McPherson, Ruth; Metspalu, Andres; Reilly, Muredach P.; Ripatti, Samuli; Sanghera, Dharambir K.; Thiery, Joachim; Watkins, Hugh; Deloukas, Panos; Kathiresan, Sekar; Samani, Nilesh J.; Schunkert, Heribert; Erdmann, Jeanette; König, Inke R.

    2016-01-01

    In recent years, genome-wide association studies have identified 58 independent risk loci for coronary artery disease (CAD) on the autosome. However, due to the sex-specific data structure of the X chromosome, it has been excluded from most of these analyses. While females have 2 copies of chromosome X, males have only one. Also, one of the female X chromosomes may be inactivated. Therefore, special test statistics and quality control procedures are required. Thus, little is known about the role of X-chromosomal variants in CAD. To fill this gap, we conducted a comprehensive X-chromosome-wide meta-analysis including more than 43,000 CAD cases and 58,000 controls from 35 international study cohorts. For quality control, sex-specific filters were used to adequately take the special structure of X-chromosomal data into account. For single study analyses, several logistic regression models were calculated allowing for inactivation of one female X-chromosome, adjusting for sex and investigating interactions between sex and genetic variants. Then, meta-analyses including all 35 studies were conducted using random effects models. None of the investigated models revealed genome-wide significant associations for any variant. Although we analyzed the largest-to-date sample, currently available methods were not able to detect any associations of X-chromosomal variants with CAD. PMID:27731410

  8. Single-Column Modeling, GCM Parameterizations and Atmospheric Radiation Measurement Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somerville, R.C.J.; Iacobellis, S.F.

    2005-03-18

    Our overall goal is identical to that of the Atmospheric Radiation Measurement (ARM) Program: the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data at all three ARM sites, and the implementation and testing of these parameterizations in global and regional models. To test recently developed prognostic parameterizations based on detailed cloud microphysics, we have first compared single-column model (SCM) output with ARM observations at the Southern Great Plains (SGP), North Slope of Alaska (NSA) and Tropical Western Pacific (TWP) sites. We focus on the predicted cloud amounts and on a suite of radiative quantities strongly dependent on clouds, such as downwelling surface shortwave radiation. Our results demonstrate the superiority of parameterizations based on comprehensive treatments of cloud microphysics and cloud-radiative interactions. At the SGP and NSA sites, the SCM results simulate the ARM measurements well and are demonstrably more realistic than typical parameterizations found in conventional operational forecasting models. At the TWP site, the model performance depends strongly on details of the scheme, and the results of our diagnostic tests suggest ways to develop improved parameterizations better suited to simulating cloud-radiation interactions in the tropics generally. These advances have made it possible to take the next step and build on this progress, by incorporating our parameterization schemes in state-of-the-art 3D atmospheric models, and diagnosing and evaluating the results using independent data. Because the improved cloud-radiation results have been obtained largely via implementing detailed and physically comprehensive cloud microphysics, we anticipate that improved predictions of hydrologic cycle components, and hence of precipitation, may also be achievable. We are currently testing the performance of our ARM-based parameterizations in state-of-the-art global and regional models. One fruitful strategy for evaluating advances in parameterizations has turned out to be using short-range numerical weather prediction as a test-bed within which to implement and improve parameterizations for modeling and predicting climate variability. The global models we have used to date are the CAM atmospheric component of the National Center for Atmospheric Research (NCAR) CCSM climate model as well as the National Centers for Environmental Prediction (NCEP) numerical weather prediction model, thus allowing testing in both climate simulation and numerical weather prediction modes. We present detailed results of these tests, demonstrating the sensitivity of model performance to changes in parameterizations.

  9. [Language comprehension in late talkers].

    PubMed

    Sachse, S; von Suchodoletz, W

    2013-11-01

    Late talkers (LTs) show very different courses of language development. The aim of this study was to examine whether subgrouping LTs in terms of language comprehension could allow the identification of specific subtypes with different prognoses. Amongst other assessment strategies, standardized language tests (SETK-2, SETK 3-5), tests of general nonverbal development (MFED, SON-R 2½-7) and hearing tests (TOAE) were used to examine 48 LTs at the ages of 25 and 37 months. Deficits in language comprehension were recorded for 38% of the LTs. LTs with and without impaired language comprehension differed only slightly in terms of their further language and nonverbal development, as well as in terms of anamnestic data. Comprehension of words but not of sentences proved to be a predictor of later speech impairments. Classification of LTs based on the comprehension of single words, but not of sentences or general language comprehension, at the age of 25 months can define subgroups of children with different prognoses. However, this only leads to marginal improvements in the predicted development of LTs, since substantial impairment of word comprehension is rarely observed.

  10. A comprehensive study of the delay vector variance method for quantification of nonlinearity in dynamical systems

    PubMed Central

    Mandic, D. P.; Ryan, K.; Basu, B.; Pakrashi, V.

    2016-01-01

    Although vibration monitoring is a popular method for assessing dynamic structures, quantification of the linearity or nonlinearity of the dynamic responses remains a challenging problem. We investigate the delay vector variance (DVV) method in a comprehensive manner to establish the degree to which a change in signal nonlinearity can be related to system nonlinearity and how a change in system parameters affects the nonlinearity in the dynamic response of the system. A wide range of theoretical situations are considered using a single degree of freedom (SDOF) system to obtain numerical benchmarks. A number of experiments are then carried out using a physical SDOF model in the laboratory. Finally, a composite wind turbine blade is tested for different excitations and the dynamic responses are measured at a number of points to extend the investigation to continuum structures. The dynamic responses were measured using accelerometers, strain gauges and a Laser Doppler vibrometer. This comprehensive study creates a numerical and experimental benchmark for structurally dynamical systems where output-only information is typically available, especially in the context of DVV. The study also allows for comparative analysis between different systems driven by similar inputs. PMID:26909175
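    As a rough illustration of the core DVV computation referenced above (the normalized target variance as a function of standardized distance between delay vectors), a minimal sketch is given below. The embedding dimension, distance span, and minimum neighbour count are illustrative choices, and the full DVV analysis additionally compares the resulting curve against surrogate series, which is omitted here.

```python
# Minimal sketch of the DVV target-variance curve: normalized variance of the
# targets of delay vectors that fall within a given (standardized) distance.
import numpy as np

def dvv_curve(x, m=3, n_d=25, n_span=3.0):
    x = np.asarray(x, dtype=float)
    N = len(x) - m
    dv = np.array([x[k:k + m] for k in range(N)])   # delay vectors
    tgt = x[m:m + N]                                 # corresponding targets

    # Pairwise distances between delay vectors, standardized by mean/std
    d = np.sqrt(((dv[:, None, :] - dv[None, :, :]) ** 2).sum(-1))
    iu = np.triu_indices(N, k=1)
    mu, sd = d[iu].mean(), d[iu].std()

    var_x = tgt.var()
    rds, sig2 = [], []
    for rd in np.linspace(mu - n_span * sd, mu + n_span * sd, n_d):
        if rd <= 0:
            continue
        vals = []
        for k in range(N):
            nbrs = tgt[d[k] <= rd]
            if len(nbrs) >= 30:              # require enough neighbours
                vals.append(nbrs.var())
        if vals:
            rds.append(rd)
            sig2.append(np.mean(vals) / var_x)   # normalized target variance
    return np.array(rds), np.array(sig2)

# Example on a noisy sinusoid (a largely linear signal)
t = np.linspace(0, 20 * np.pi, 1000)
x = np.sin(t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
rd, s2 = dvv_curve(x)
print(s2[:5])
```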

  11. Prioritizing Congenital Syphilis Control in South China: A Decision Analytic Model to Inform Policy Implementation

    PubMed Central

    Tan, Nicholas X.; Rydzak, Chara; Yang, Li-Gang; Vickerman, Peter; Yang, Bin; Peeling, Rosanna W.; Hawkes, Sarah; Chen, Xiang-Sheng; Tucker, Joseph D.

    2013-01-01

    Background Syphilis is a major public health problem in many regions of China, with increases in congenital syphilis (CS) cases causing concern. The Chinese Ministry of Health recently announced a comprehensive 10-y national syphilis control plan focusing on averting CS. The decision analytic model presented here quantifies the impact of the planned strategies to determine whether they are likely to meet the goals laid out in the control plan. Methods and Findings Our model incorporated data on age-stratified fertility, female adult syphilis cases, and empirical syphilis transmission rates to estimate the number of CS cases associated with prenatal syphilis infection on a yearly basis. Guangdong Province was the focus of this analysis because of the availability of high-quality demographic and public health data. Each model outcome was simulated 1,000 times to incorporate uncertainty in model inputs. The model was validated using data from a CS intervention program among 477,656 women in China. Sensitivity analyses were performed to identify which variables are likely to be most influential in achieving Chinese and international policy goals. Increasing prenatal screening coverage was the single most effective strategy for reducing CS cases. An incremental increase in prenatal screening from the base case of 57% coverage to 95% coverage was associated with 106 (95% CI: 101, 111) CS cases averted per 100,000 live births (58% decrease). The policy strategies laid out in the national plan led to an outcome that fell short of the target, while a four-pronged comprehensive syphilis control strategy consisting of increased prenatal screening coverage, increased treatment completion, earlier prenatal screening, and improved syphilis test characteristics was associated with 157 (95% CI: 154, 160) CS cases averted per 100,000 live births (85% decrease). Conclusions The Chinese national plan provides a strong foundation for syphilis control, but more comprehensive measures that include earlier and more extensive screening are necessary for reaching policy goals. Please see later in the article for the Editors' Summary PMID:23349624
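    To make the structure of such a decision analytic model concrete, a minimal Monte Carlo sketch in the same spirit is given below; the prevalence, transmission, and treatment parameters and their distributions are hypothetical placeholders rather than the study's inputs, and only screening coverage is varied.

```python
# Illustrative Monte Carlo sketch of a decision-analytic comparison of
# prenatal syphilis screening coverage levels. All parameter values and
# distributions are hypothetical placeholders, not the study's inputs.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 1000                      # simulations per scenario, as in the model

def cs_cases_per_100k(coverage, n_sim, rng):
    """Simulated congenital syphilis cases per 100,000 live births."""
    prev = rng.normal(0.005, 0.001, n_sim).clip(0)        # maternal syphilis prevalence
    p_transmit = rng.normal(0.5, 0.1, n_sim).clip(0, 1)   # untreated transmission prob.
    p_treat_ok = rng.normal(0.9, 0.05, n_sim).clip(0, 1)  # treatment completion prob.
    # Cases arise from unscreened mothers plus screened-but-untreated mothers
    p_untreated = (1 - coverage) + coverage * (1 - p_treat_ok)
    return prev * p_untreated * p_transmit * 100_000

base = cs_cases_per_100k(0.57, n_sim, rng)   # base-case screening coverage
high = cs_cases_per_100k(0.95, n_sim, rng)   # expanded screening coverage
averted = base - high
lo, hi = np.percentile(averted, [2.5, 97.5])
print(f"cases averted per 100k: {averted.mean():.0f} (95% interval {lo:.0f}-{hi:.0f})")
```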

  12. Synergistic effect of temperature and point defect on the mechanical properties of single layer and bi-layer graphene

    NASA Astrophysics Data System (ADS)

    Debroy, Sanghamitra; Pavan Kumar, V.; Vijaya Sekhar, K.; Acharyya, Swati Ghosh; Acharyya, Amit

    2017-10-01

    The present study reports a comprehensive molecular dynamics simulation of the effect of a) temperature (300-1073 K at intervals of every 100 K) and b) point defects on the mechanical behaviour of single-layer (armchair and zigzag directions) and bilayer graphene (AA and AB stacking). The adaptive intermolecular reactive bond order (AIREBO) potential function was used to describe the many-body short-range interatomic interactions for the single layer graphene sheet. Moreover, a Lennard-Jones model was considered for bilayer graphene to incorporate the van der Waals interactions among the interlayers of graphene. The effect of temperature on the strain energy of single layer and bilayer graphene was studied in order to understand the difference in mechanical behaviour of the two systems. The strength of the pristine single layer graphene was found to be higher than that of bilayer AA stacked graphene at all temperatures. It was observed that, at 1073 K and in the presence of a vacancy defect, the strength of the single layer armchair sheet falls by 30% and that of the bilayer armchair sheet by 33% compared with the pristine sheets at 300 K. The AB stacked graphene sheet was found to have a two-step rupture process. The strength of the pristine AB sheet was found to decrease by 22% on increasing the temperature from 300 K to 1073 K.

  13. Extracorporeal Membrane Oxygenation Outcomes After the Comprehensive Stage II Procedure in Patients With Single Ventricles.

    PubMed

    Gomez, Daniel; Duffy, Vicky; Hersey, Diane; Backes, Carl; Rycus, Peter; McConnell, Patrick; Voss, Jordan; Galantowicz, Mark; Cua, Clifford L

    2017-01-01

    Outcomes for extracorporeal membrane oxygenation (ECMO) have been described for patients with single ventricle physiology (SVP) undergoing cavopulmonary connection (Glenn procedure). An alternative surgical pathway for patients with SVP consists of an initial hybrid procedure followed by a comprehensive Stage II procedure. No data exist describing the outcomes of patients requiring ECMO after the comprehensive Stage II procedure. The goal of this study is to describe the outcomes for patients who required ECMO after the comprehensive Stage II procedure. Data from the Extracorporeal Life Support Organization (ELSO) registry from 2001 to 2015 for children undergoing the comprehensive Stage II procedure older than 3 months of age were retrospectively analyzed. Demographics and ECMO characteristics were recorded. A total of six children required ECMO support after the comprehensive Stage II procedure (2 males, 4 females). Four patients had the diagnosis of hypoplastic left heart syndrome and two patients had the diagnosis of an unbalanced atrioventricular septal defect. Bypass time was 242.8 ± 110.9 min and cross-clamp time was 91.2 ± 46.2 min for the surgical procedure. Weight was 5.8 ± 1.3 kg and age was 150.2 ± 37.9 days at time of ECMO. ECMO duration was 276.0 ± 218.1 h. Complications during the ECMO run included hemorrhage in four patients (67%), renal dysfunction in two patients (33%), and neurologic injury in two patients (33%). Four patients (67%) were discharged alive after ECMO decannulation. Despite being a much more extensive surgical procedure, the morbidity and mortality after ECMO in patients undergoing the comprehensive Stage II procedure are similar to those in patients undergoing the Glenn procedure. If needed, ECMO support is reasonable for patients after the comprehensive Stage II procedure. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  14. Statistical sensitivity analysis of a simple nuclear waste repository model

    NASA Astrophysics Data System (ADS)

    Ronen, Y.; Lucius, J. L.; Blow, E. M.

    1980-06-01

    This work is a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The parameters found to be important for the sample problem were the thermal loading, the distance between the spent fuel canisters, and the canister radius. Other important parameters were those related to salt properties at a point of interest in the repository.

  15. A powerful and robust test in genetic association studies.

    PubMed

    Cheng, Kuang-Fu; Lee, Jen-Yu

    2014-01-01

    There are several well-known single-SNP tests presented in the literature for detecting gene-disease association signals. Having in place an efficient and robust testing process across all genetic models would allow a more comprehensive approach to analysis. Although some studies have shown that it is possible to construct such a test when the variants are common and the genetic model satisfies certain conditions, the model conditions are too restrictive and in general difficult to verify. In this paper, we propose a powerful and robust test without assuming any model restrictions. Our test is based on selected 2 × 2 tables derived from the usual 2 × 3 table. Using signals from these tables, we show through simulations across a wide range of allele frequencies and genetic models that this approach may produce a test which is almost uniformly most powerful in the analysis of low- and high-frequency variants. Two cancer studies are used to demonstrate applications of the proposed test. © 2014 S. Karger AG, Basel.
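    The idea of deriving 2 × 2 tables from the genotype-based 2 × 3 table can be illustrated with the generic sketch below, which collapses a case-control genotype table under dominant, recessive, and allelic codings and inspects the largest chi-square (a MAX-type approach). This only illustrates the general strategy with made-up counts; it is not the authors' specific test statistic or table-selection rule.

```python
# Generic "robust to genetic model" illustration: collapse the usual 2x3
# case-control genotype table into dominant, recessive, and allelic 2x2 tables
# and compare their chi-square statistics. The genotype counts are made up.
import numpy as np
from scipy.stats import chi2_contingency

# rows: cases, controls; columns: genotype counts for AA, Aa, aa
geno = np.array([[120, 240, 140],
                 [180, 250, 100]])

def two_by_two_tables(g):
    dom = np.c_[g[:, 0], g[:, 1] + g[:, 2]]                      # AA vs (Aa + aa)
    rec = np.c_[g[:, 0] + g[:, 1], g[:, 2]]                      # (AA + Aa) vs aa
    alle = np.c_[2 * g[:, 0] + g[:, 1], g[:, 1] + 2 * g[:, 2]]   # allele counts
    return {"dominant": dom, "recessive": rec, "allelic": alle}

stats_by_model = {name: chi2_contingency(tab, correction=False)[0]
                  for name, tab in two_by_two_tables(geno).items()}
print(stats_by_model)
# A robust MAX-type test would take max(stats_by_model.values()) and calibrate
# its p-value by permutation or simulation, since the statistics are correlated.
```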

  16. The chronic mild stress (CMS) model of depression: History, evaluation and usage.

    PubMed

    Willner, Paul

    2017-02-01

    Now 30 years old, the chronic mild stress (CMS) model of depression has been used in >1300 published studies, with a year-on-year increase rising to >200 papers in 2015. Data from a survey of users show that while a variety of names are in use (chronic mild/unpredictable/varied stress), these describe essentially the same procedure. This paper provides an update on the validity and reliability of the CMS model, and reviews recent data on the neurobiological basis of CMS effects and the mechanisms of antidepressant action: the volume of this research may be unique in providing a comprehensive account of antidepressant action within a single model. Also discussed is the use of CMS in drug discovery, with particular reference to hippocampal and extra-hippocampal targets. The high translational potential of the CMS model means that the neurobiological mechanisms described may be of particular relevance to human depression and mechanisms of clinical antidepressant action.

  17. Variance-based selection may explain general mating patterns in social insects.

    PubMed

    Rueppell, Olav; Johnson, Nels; Rychtár, Jan

    2008-06-23

    Female mating frequency is one of the key parameters of social insect evolution. Several hypotheses have been suggested to explain multiple mating and considerable empirical research has led to conflicting results. Building on several earlier analyses, we present a simple general model that links the number of queen matings to variance in colony performance and this variance to average colony fitness. The model predicts selection for multiple mating if the average colony succeeds in a focal task, and selection for single mating if the average colony fails, irrespective of the proximate mechanism that links genetic diversity to colony fitness. Empirical support comes from interspecific comparisons, e.g. between the bee genera Apis and Bombus, and from data on several ant species, but more comprehensive empirical tests are needed.

  18. First principles molecular dynamics of molten NaCl

    NASA Astrophysics Data System (ADS)

    Galamba, N.; Costa Cabral, B. J.

    2007-03-01

    First principles Hellmann-Feynman molecular dynamics (HFMD) results for molten NaCl at a single state point are reported. The effect of induction forces on the structure and dynamics of the system is studied by comparison of the partial radial distribution functions and the velocity and force autocorrelation functions with those calculated from classical MD based on rigid-ion and shell-model potentials. The first principles results reproduce the main structural features of the molten salt observed experimentally, whereas they are incorrectly described by both rigid-ion and shell-model potentials. Moreover, HFMD Green-Kubo self-diffusion coefficients are in closer agreement with experimental data than those predicted by classical MD. A comprehensive discussion of MD results for molten NaCl based on different ab initio parametrized polarizable interionic potentials is also given.
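    As a generic illustration of the Green-Kubo self-diffusion estimate mentioned above, a minimal numpy sketch is given below; the velocities are synthetic stand-ins for MD output, and the time step, units, and correlation length are illustrative assumptions.

```python
# Minimal sketch of a Green-Kubo self-diffusion estimate from a velocity
# trajectory: D = (1/3) * integral of the velocity autocorrelation function.
import numpy as np

rng = np.random.default_rng(1)
dt = 1.0e-3                              # time step (ps), illustrative
n_steps, n_atoms = 5000, 64
# Ornstein-Uhlenbeck-like synthetic velocities (stand-in for MD data)
v = np.zeros((n_steps, n_atoms, 3))
for t in range(1, n_steps):
    v[t] = 0.99 * v[t - 1] + 0.1 * rng.normal(size=(n_atoms, 3))

def vacf(v, max_lag):
    """Velocity autocorrelation <v(0).v(t)>, averaged over atoms and time origins."""
    n = v.shape[0]
    c = np.empty(max_lag)
    for lag in range(max_lag):
        prod = np.sum(v[:n - lag] * v[lag:], axis=-1)   # per-atom dot products
        c[lag] = prod.mean()
    return c

c = vacf(v, max_lag=500)
# Trapezoid-rule integral of the VACF, divided by 3 (Green-Kubo relation)
D = (c[0] / 2 + c[1:-1].sum() + c[-1] / 2) * dt / 3.0
print(f"estimated D = {D:.4e} (length^2 / ps, in the units of v)")
```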

  19. International challenge to model the long-range transport of radioxenon released from medical isotope production to six Comprehensive Nuclear-Test-Ban Treaty monitoring stations

    DOE PAGES

    Maurer, Christian; Baré, Jonathan; Kusmierczyk-Michulec, Jolanta; ...

    2018-03-08

    After performing a first multi-model exercise in 2015, a comprehensive and technically more demanding atmospheric transport modelling challenge was organized in 2016. Release data were provided by the Australian Nuclear Science and Technology Organization radiopharmaceutical facility in Sydney (Australia) for a one-month period. Measured samples for the same time frame were gathered from six International Monitoring System stations in the Southern Hemisphere with distances to the source ranging between 680 km (Melbourne) and about 17,000 km (Tristan da Cunha). Participants were prompted to work with unit emissions in pre-defined emission intervals (daily, half-daily, 3-hourly and hourly emission segment lengths) and, in order to perform a blind test, actual emission values were not provided to them. Despite the quite different settings of the two atmospheric transport modelling challenges, there is common evidence that for long-range atmospheric transport, using temporally highly resolved emissions and highly space-resolved meteorological input fields has no significant advantage compared to using lower resolved ones. Likewise, an uncertainty of up to 20% in the daily stack emission data turns out to be acceptable for the purpose of a study like this. Model performance at individual stations is quite diverse, depending largely on successfully capturing boundary layer processes. No single model-meteorology combination performs best for all stations. Moreover, the station statistics do not depend on the distance between the source and the individual stations. Finally, it became more evident how future exercises need to be designed. Set-up parameters like the meteorological driver or the output grid resolution should be prescribed in order to enhance diversity as well as comparability among model runs.

  20. International challenge to model the long-range transport of radioxenon released from medical isotope production to six Comprehensive Nuclear-Test-Ban Treaty monitoring stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maurer, Christian; Baré, Jonathan; Kusmierczyk-Michulec, Jolanta

    After performing a first multi-model exercise in 2015, a comprehensive and technically more demanding atmospheric transport modelling challenge was organized in 2016. Release data were provided by the Australian Nuclear Science and Technology Organization radiopharmaceutical facility in Sydney (Australia) for a one-month period. Measured samples for the same time frame were gathered from six International Monitoring System stations in the Southern Hemisphere with distances to the source ranging between 680 km (Melbourne) and about 17,000 km (Tristan da Cunha). Participants were prompted to work with unit emissions in pre-defined emission intervals (daily, half-daily, 3-hourly and hourly emission segment lengths) and, in order to perform a blind test, actual emission values were not provided to them. Despite the quite different settings of the two atmospheric transport modelling challenges, there is common evidence that for long-range atmospheric transport, using temporally highly resolved emissions and highly space-resolved meteorological input fields has no significant advantage compared to using lower resolved ones. Likewise, an uncertainty of up to 20% in the daily stack emission data turns out to be acceptable for the purpose of a study like this. Model performance at individual stations is quite diverse, depending largely on successfully capturing boundary layer processes. No single model-meteorology combination performs best for all stations. Moreover, the station statistics do not depend on the distance between the source and the individual stations. Finally, it became more evident how future exercises need to be designed. Set-up parameters like the meteorological driver or the output grid resolution should be prescribed in order to enhance diversity as well as comparability among model runs.

  1. Restorative dentistry productivity of senior students engaged in comprehensive care.

    PubMed

    Blalock, John S; Callan, Richard S; Lazarchik, David A; Frank Caughman, W; Looney, Stephen

    2012-12-01

    In dental education, various clinical delivery models are used to educate dental students. The quantitative and qualitative measures used to assess the outcomes of these models are varied. Georgia Health Sciences University College of Dental Medicine has adopted a version of a general dentistry comprehensive care dental education hybrid model. Outcome assessments were developed to evaluate the effectiveness of this delivery model. The aim of this study was to compare the number of restorative procedures performed by senior dental students under a discipline-based model with the productivity of senior students engaged in comprehensive care as part of a hybrid model. Senior students' productivity in performing various restorative procedures was tracked over four years and compared. In the first two years, the seniors operated in a discipline-based model, while in the last two years the seniors operated in a comprehensive care hybrid model. The results showed a significant increase in student productivity in terms of direct and indirect restorations. This increase suggests that the comprehensive care model may be more productive, thereby enhancing clinical experiences for the students, improving operating efficiency for the schools, and ultimately increasing clinical income.

  2. Comprehensive genetic testing for female and male infertility using next-generation sequencing.

    PubMed

    Patel, Bonny; Parets, Sasha; Akana, Matthew; Kellogg, Gregory; Jansen, Michael; Chang, Chihyu; Cai, Ying; Fox, Rebecca; Niknazar, Mohammad; Shraga, Roman; Hunter, Colby; Pollock, Andrew; Wisotzkey, Robert; Jaremko, Malgorzata; Bisignano, Alex; Puig, Oscar

    2018-05-19

    The objective was to develop a comprehensive genetic test for female and male infertility in support of medical decisions during assisted reproductive technology (ART) protocols. We developed a next-generation sequencing (NGS) gene panel consisting of 87 genes including promoters, 5' and 3' untranslated regions, exons, and selected introns. In addition, sex chromosome aneuploidies and Y chromosome microdeletions were analyzed concomitantly using the same panel. The NGS panel was analytically validated by retrospective analysis of 118 genomic DNA samples with known variants in loci representative of female and male infertility. Our results showed analytical accuracy of > 99%, with > 98% sensitivity for single-nucleotide variants (SNVs) and > 91% sensitivity for insertions/deletions (indels). Clinical sensitivity was assessed with samples containing variants representative of male and female infertility, and it was 100% for SNVs/indels, CFTR IVS8-5T variants, sex chromosome aneuploidies, and copy number variants (CNVs) and > 93% for Y chromosome microdeletions. Cost analysis shows potential savings when comparing this single NGS assay with the standard approach, which includes multiple assays. A single, comprehensive, NGS panel can simplify the ordering process for healthcare providers, reduce turnaround time, and lower the overall cost of testing for genetic assessment of infertility in females and males, while maintaining accuracy.

  3. Applying a Multiple Group Causal Indicator Modeling Framework to the Reading Comprehension Skills of Third, Seventh, and Tenth Grade Students

    ERIC Educational Resources Information Center

    Tighe, Elizabeth L.; Wagner, Richard K.; Schatschneider, Christopher

    2015-01-01

    This study demonstrates the utility of applying a causal indicator modeling framework to investigate important predictors of reading comprehension in third, seventh, and tenth grade students. The results indicated that a 4-factor multiple indicator multiple cause (MIMIC) model of reading comprehension provided adequate fit at each grade…

  4. Individualized Prediction of Reading Comprehension Ability Using Gray Matter Volume.

    PubMed

    Cui, Zaixu; Su, Mengmeng; Li, Liangjie; Shu, Hua; Gong, Gaolang

    2018-05-01

    Reading comprehension is a crucial reading skill for learning and putatively contains 2 key components: reading decoding and linguistic comprehension. Current understanding of the neural mechanism underlying these reading comprehension components is lacking, and whether and how neuroanatomical features can be used to predict these 2 skills remain largely unexplored. In the present study, we analyzed a large sample from the Human Connectome Project (HCP) dataset and successfully built multivariate predictive models for these 2 skills using whole-brain gray matter volume features. The results showed that these models effectively captured individual differences in these 2 skills and were able to significantly predict these components of reading comprehension for unseen individuals. The strict cross-validation using the HCP cohort and another independent cohort of children demonstrated the model generalizability. The identified gray matter regions contributing to the skill prediction consisted of a wide range of regions covering the putative reading, cerebellum, and subcortical systems. Interestingly, there were gender differences in the predictive models, with the female-specific model overestimating the males' abilities. Moreover, the identified contributing gray matter regions for the female-specific and male-specific models exhibited considerable differences, supporting a gender-dependent neuroanatomical substrate for reading comprehension.

  5. A Team-Based Process for Designing Comprehensive, Integrated, Three-Tiered (CI3T) Models of Prevention: How Does My School-Site Leadership Team Design a CI3T Model?

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy Peia; Jenkins, Abbie; Menzies, Holly Mariah; Kalberg, Jemma Robertson

    2014-01-01

    Comprehensive, integrated, three-tiered models are context specific and developed by school-site teams according to the core values held by the school community. In this article, the authors provide a step-by-step, team-based process for designing comprehensive, integrated, three-tiered models of prevention that integrate academic, behavioral, and…

  6. Team-Based Models for End-of-Life Care: An Evidence-Based Analysis

    PubMed Central

    2014-01-01

    Background End of life refers to the period when people are living with advanced illness that will not stabilize and from which they will not recover and will eventually die. It is not limited to the period immediately before death. Multiple services are required to support people and their families during this time period. The model of care used to deliver these services can affect the quality of the care they receive. Objectives Our objective was to determine whether an optimal team-based model of care exists for service delivery at end of life. In systematically reviewing such models, we considered their core components: team membership, services offered, modes of patient contact, and setting. Data Sources A literature search was performed on October 14, 2013, using Ovid MEDLINE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Ovid Embase, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), and EBM Reviews, for studies published from January 1, 2000, to October 14, 2013. Review Methods Abstracts were reviewed by a single reviewer and full-text articles were obtained that met the inclusion criteria. Studies were included if they evaluated a team model of care compared with usual care in an end-of-life adult population. A team was defined as having at least 2 health care disciplines represented. Studies were limited to English publications. A meta-analysis was completed to obtain pooled effect estimates where data permitted. The GRADE quality of the evidence was evaluated. Results Our literature search located 10 randomized controlled trials which, among them, evaluated the following 6 team-based models of care: hospital, direct contact; home, direct contact; home, indirect contact; comprehensive, indirect contact; comprehensive, direct contact; and comprehensive, direct and early contact. Direct contact is when team members see the patient; indirect contact is when they advise another health care practitioner (e.g., a family doctor) who sees the patient. A “comprehensive” model is one that provides continuity of service across inpatient and outpatient settings, e.g., in hospital and then at home. All teams consisted of a nurse and physician at minimum, at least one of whom had a specialty in end-of-life health care. More than 50% of the teams offered services that included symptom management, psychosocial care, development of patient care plans, end-of-life care planning, and coordination of care. We found moderate-quality evidence that the use of a comprehensive direct contact model initiated up to 9 months before death improved informal caregiver satisfaction and the odds of having a home death, and decreased the odds of dying in a nursing home. We found moderate-quality evidence that the use of a comprehensive, direct, and early (up to 24 months before death) contact model improved patient quality of life, symptom management, and patient satisfaction. We did not find that using a comprehensive team-based model had an impact on hospital admissions or length of stay. We found low-quality evidence that the use of a home team-based model increased the odds of having a home death. Limitations Heterogeneity in data reporting across studies limited the ability to complete a meta-analysis on many of the outcome measures. Missing data were not managed well within the studies.
Conclusions Moderate-quality evidence shows that a comprehensive, direct-contact, team-based model of care provides the following benefits for end-of-life patients with an estimated survival of up to 9 months: it improves caregiver satisfaction and increases the odds of dying at home while decreasing the odds of dying in a nursing home. Moderate-quality evidence also shows that improvement in patient quality of life, symptom management, and patient satisfaction occur when end-of-life care via this model is provided early (up to 24 months before death). However, using this model to deliver end-of-life care does not impact hospital admissions or hospital length of stay. Team membership includes at minimum a physician and nurse, with at least one having specialist training and/or experience in end-of-life care. Team services include symptom management, psychosocial care, development of patient care plans, end-of-life care planning, and coordination of care. PMID:26356140

  7. Reading comprehension and reading related abilities in adolescents with reading disabilities and attention-deficit/hyperactivity disorder.

    PubMed

    Ghelani, Karen; Sidhu, Robindra; Jain, Umesh; Tannock, Rosemary

    2004-11-01

    Reading comprehension is a very complex task that requires different cognitive processes and reading abilities over the life span. There are fewer studies of reading comprehension relative to investigations of word reading abilities. Reading comprehension difficulties, however, have been identified in two common and frequently overlapping childhood disorders: reading disability (RD) and attention-deficit/hyperactivity disorder (ADHD). The nature of reading comprehension difficulties in these groups remains unclear. The performance of four groups of adolescents (RD, ADHD, comorbid ADHD and RD, and normal controls) was compared on reading comprehension tasks as well as on reading rate and accuracy tasks. Adolescents with RD showed difficulties across most reading tasks, although their comprehension scores were average. Adolescents with ADHD exhibited adequate single word reading abilities. Subtle difficulties were observed, however, on measures of text reading rate and accuracy as well as on silent reading comprehension, but scores remained in the average range. The comorbid group demonstrated similar difficulties to the RD group on word reading accuracy and on reading rate but experienced problems on only silent reading comprehension. Implications for reading interventions are outlined, as well as the clinical relevance for diagnosis.

  8. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    NASA Astrophysics Data System (ADS)

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; Mancuso, Adrian P.

    2016-04-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design.

  9. Quantum Darwinism: Entanglement, branches, and the emergent classicality of redundantly stored quantum information

    NASA Astrophysics Data System (ADS)

    Blume-Kohout, Robin; Zurek, Wojciech H.

    2006-06-01

    We lay a comprehensive foundation for the study of redundant information storage in decoherence processes. Redundancy has been proposed as a prerequisite for objectivity, the defining property of classical objects. We consider two ensembles of states for a model universe consisting of one system and many environments: the first consisting of arbitrary states, and the second consisting of “singly branching” states consistent with a simple decoherence model. Typical states from the random ensemble do not store information about the system redundantly, but information stored in branching states has a redundancy proportional to the environment’s size. We compute the specific redundancy for a wide range of model universes, and fit the results to a simple first-principles theory. Our results show that the presence of redundancy divides information about the system into three parts: classical (redundant); purely quantum; and the borderline, undifferentiated or “nonredundant,” information.

  10. Modification of transparent materials with ultrashort laser pulses: What is energetically and mechanically meaningful?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bulgakova, Nadezhda M., E-mail: nadezhda.bulgakova@hilase.cz; Institute of Thermophysics SB RAS, 1 Lavrentyev Ave., 630090 Novosibirsk; Zhukov, Vladimir P.

    A comprehensive analysis of laser-induced modification of bulk glass by single ultrashort laser pulses is presented, based on the combination of optical Maxwell-based modeling with thermoelastoplastic simulations of the post-irradiation behavior of matter. A controversial question about the free electron density generated inside bulk glass by ultrashort laser pulses in modification regimes is addressed on energy balance grounds. Spatiotemporal dynamics of laser beam propagation in fused silica have been elucidated for the regimes used for direct laser writing in bulk glass. 3D thermoelastoplastic modeling of material relocation dynamics under laser-induced stresses has been performed up to the microsecond timescale when all motions in the material decay. The final modification structure is found to be imprinted into the material matrix already at the sub-nanosecond timescale. Modeling results agree well with available experimental data on laser light transmission through the sample and the final modification structure.

  11. Coupled-channel model for K ¯ N scattering in the resonant region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernández-Ramírez, Cesar; Danilkin, Igor V.; Manley, D. Mark

    2016-02-18

    Here, we present a multichannel model for K̄N scattering in the resonance region that fulfills unitarity. It has the correct analytical properties for the amplitudes once they are extended to the complex-s plane, and the partial waves have the right threshold behavior. In order to determine the parameters of the model, we have fitted single-energy partial waves up to J = 7/2 and up to 2.15 GeV of energy in the center-of-mass reference frame, obtaining the poles of the Λ* and Σ* resonances, which are compared to previous analyses. Furthermore, we provide the most comprehensive picture of the S = –1 hyperon spectrum to date. Important differences are found between the available analyses, making the gathering of further experimental information on K̄N scattering mandatory to make progress in the assessment of the hyperon spectrum.

  12. Whole Atmosphere Simulation of Anthropogenic Climate Change

    NASA Astrophysics Data System (ADS)

    Solomon, Stanley C.; Liu, Han-Li; Marsh, Daniel R.; McInerney, Joseph M.; Qian, Liying; Vitt, Francis M.

    2018-02-01

    We simulated anthropogenic global change through the entire atmosphere, including the thermosphere and ionosphere, using the Whole Atmosphere Community Climate Model-eXtended. The basic result was that even as the lower atmosphere gradually warms, the upper atmosphere rapidly cools. The simulations employed constant low solar activity conditions, to remove the effects of variable solar and geomagnetic activity. Global mean annual mean temperature increased at a rate of +0.2 K/decade at the surface and +0.4 K/decade in the upper troposphere but decreased by about -1 K/decade in the stratosphere-mesosphere and -2.8 K/decade in the thermosphere. Near the mesopause, temperature decreases were small compared to the interannual variation, so trends in that region are uncertain. Results were similar to previous modeling confined to specific atmospheric levels and compared favorably with available measurements. These simulations demonstrate the ability of a single comprehensive numerical model to characterize global change throughout the atmosphere.

  13. Simulation Research on Vehicle Active Suspension Controller Based on G1 Method

    NASA Astrophysics Data System (ADS)

    Li, Gen; Li, Hang; Zhang, Shuaiyang; Luo, Qiuhui

    2017-09-01

    Based on the order relation analysis method (G1 method), an optimal linear controller for a vehicle active suspension is designed. First, the active and passive suspension system of a single-wheel vehicle is modeled and the system input signal model is determined. Secondly, the state-space equation of the system motion is established from kinetic knowledge, and the optimal linear controller design is completed with optimal control theory. The weighting coefficients of the performance index for the suspension are determined by the order relation analysis method. Finally, the model is simulated in Simulink. The simulation results show that, with the optimal weights determined by the order relation analysis method under the given road conditions, the vehicle body acceleration, suspension stroke and tire motion displacement are optimized, improving the comprehensive performance of the vehicle, while the active control force is kept within the requirements.
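    A minimal sketch of an optimal linear (LQR) state-feedback design for a quarter-car active suspension of this kind is given below; the masses, stiffnesses, damping, and the Q/R weights are generic illustrative values, not the paper's model or its G1-derived weights.

```python
# Illustrative LQR design for a quarter-car active suspension model.
import numpy as np
from scipy.linalg import solve_continuous_are

ms, mu = 320.0, 40.0          # sprung / unsprung mass (kg)
ks, kt = 18_000.0, 200_000.0  # suspension / tire stiffness (N/m)
cs = 1_000.0                  # passive suspension damping (N s/m)

# States: [suspension deflection, body velocity, tire deflection, wheel velocity]
A = np.array([[0.0, 1.0, 0.0, -1.0],
              [-ks / ms, -cs / ms, 0.0, cs / ms],
              [0.0, 0.0, 0.0, 1.0],
              [ks / mu, cs / mu, -kt / mu, -cs / mu]])
B = np.array([[0.0], [1.0 / ms], [0.0], [-1.0 / mu]])

# Weights on suspension stroke, body velocity, tire deflection, wheel velocity
Q = np.diag([1.0e4, 1.0e2, 1.0e5, 1.0])
R = np.array([[1.0e-4]])      # weight on actuator force

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # optimal state-feedback gain, u = -K x
print("LQR gain K =", K.round(2))
```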

  14. Influence of Wake Models on Calculated Tiltrotor Aerodynamics

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2001-01-01

    The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.

  15. Contact Forces between Single Metal Oxide Nanoparticles in Gas-Phase Applications and Processes

    PubMed Central

    2017-01-01

    In this work we present a comprehensive experimental study to determine the contact forces between individual metal oxide nanoparticles in the gas-phase using atomic force microscopy. In addition, we determined the amount of physisorbed water for each type of particle surface. By comparing our results with mathematical models of the interaction forces, we could demonstrate that classical continuum models of van der Waals and capillary forces alone cannot sufficiently describe the experimental findings. Rather, the discrete nature of the molecules has to be considered, which leads to ordering at the interface and the occurrence of solvation forces. We demonstrate that inclusion of solvation forces in the model leads to quantitative agreement with experimental data and that tuning of the molecular order by addition of isopropanol vapor allows us to control the interaction forces between the nanoparticles. PMID:28186771

  16. Modeling the Relations Among Morphological Awareness Dimensions, Vocabulary Knowledge, and Reading Comprehension in Adult Basic Education Students

    PubMed Central

    Tighe, Elizabeth L.; Schatschneider, Christopher

    2016-01-01

    This study extended the findings of Tighe and Schatschneider (2015) by investigating the predictive utility of separate dimensions of morphological awareness as well as vocabulary knowledge to reading comprehension in adult basic education (ABE) students. We competed two- and three-factor structural equation models of reading comprehension. A three-factor model of real word morphological awareness, pseudoword morphological awareness, and vocabulary knowledge emerged as the best fit and accounted for 79% of the reading comprehension variance. The results indicated that the constructs contributed jointly to reading comprehension; however, vocabulary knowledge was the only potentially unique predictor (p = 0.052), accounting for an additional 5.6% of the variance. This study demonstrates the feasibility of applying a latent variable modeling approach to examine individual differences in the reading comprehension skills of ABE students. Further, this study replicates the findings of Tighe and Schatschneider (2015) on the importance of differentiating among dimensions of morphological awareness in this population. PMID:26869981

  17. Modeling the Relations Among Morphological Awareness Dimensions, Vocabulary Knowledge, and Reading Comprehension in Adult Basic Education Students.

    PubMed

    Tighe, Elizabeth L; Schatschneider, Christopher

    2016-01-01

    This study extended the findings of Tighe and Schatschneider (2015) by investigating the predictive utility of separate dimensions of morphological awareness as well as vocabulary knowledge to reading comprehension in adult basic education (ABE) students. We competed two- and three-factor structural equation models of reading comprehension. A three-factor model of real word morphological awareness, pseudoword morphological awareness, and vocabulary knowledge emerged as the best fit and accounted for 79% of the reading comprehension variance. The results indicated that the constructs contributed jointly to reading comprehension; however, vocabulary knowledge was the only potentially unique predictor (p = 0.052), accounting for an additional 5.6% of the variance. This study demonstrates the feasibility of applying a latent variable modeling approach to examine individual differences in the reading comprehension skills of ABE students. Further, this study replicates the findings of Tighe and Schatschneider (2015) on the importance of differentiating among dimensions of morphological awareness in this population.

  18. Hydrologic Model Development and Calibration: Contrasting a Single- and Multi-Objective Approach for Comparing Model Performance

    NASA Astrophysics Data System (ADS)

    Asadzadeh, M.; Maclean, A.; Tolson, B. A.; Burn, D. H.

    2009-05-01

    Hydrologic model calibration aims to find a set of parameters that adequately simulates observations of watershed behavior, such as streamflow, or a state variable, such as snow water equivalent (SWE). There are different metrics for evaluating calibration effectiveness that involve quantifying prediction errors, such as the Nash-Sutcliffe (NS) coefficient and bias evaluated for the entire calibration period, on a seasonal basis, for low flows, or for high flows. Many of these metrics are conflicting such that the set of parameters that maximizes the high flow NS differs from the set of parameters that maximizes the low flow NS. Conflicting objectives are very likely when different calibration objectives are based on different fluxes and/or state variables (e.g., NS based on streamflow versus SWE). One of the most popular ways to balance different metrics is to aggregate them based on their importance and find the set of parameters that optimizes a weighted sum of the efficiency metrics. Comparing alternative hydrologic models (e.g., assessing model improvement when a process or more detail is added to the model) based on the aggregated objective might be misleading since it represents one point on the tradeoff of desired error metrics. To derive a more comprehensive model comparison, we solved a bi-objective calibration problem to estimate the tradeoff between two error metrics for each model. Although this approach is computationally more expensive than the aggregation approach, it results in a better understanding of the effectiveness of selected models at each level of every error metric and therefore provides a better rationale for judging relative model quality. The two alternative models used in this study are two MESH hydrologic models (version 1.2) of the Wolf Creek Research basin that differ in their watershed spatial discretization (a single Grouped Response Unit, GRU, versus multiple GRUs). The MESH model, currently under development by Environment Canada, is a coupled land-surface and hydrologic model. Results will demonstrate the conclusions a modeller might make regarding the value of additional watershed spatial discretization under both an aggregated (single-objective) and multi-objective model comparison framework.
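    To make the contrast concrete, the sketch below computes the Nash-Sutcliffe efficiency for two objectives (streamflow and SWE), a weighted-sum aggregate, and a simple non-domination check between two candidate parameter sets; the observations, simulations, and weights are synthetic placeholders, not MESH output.

```python
# Minimal sketch contrasting weighted-sum aggregation of two Nash-Sutcliffe
# efficiencies with a Pareto (non-domination) comparison of candidates.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(0)
q_obs = 10 + 5 * np.sin(np.linspace(0, 6, 365)) + rng.normal(0, 1, 365)   # streamflow
swe_obs = np.clip(50 * np.cos(np.linspace(0, 6, 365)), 0, None)           # SWE

# Two candidate parameter sets -> two simulated series each (placeholders)
candidates = {
    "model_A": (q_obs + rng.normal(0, 2, 365), swe_obs + rng.normal(0, 8, 365)),
    "model_B": (q_obs + rng.normal(0, 3, 365), swe_obs + rng.normal(0, 4, 365)),
}

scores = {name: (nse(q_obs, q_sim), nse(swe_obs, s_sim))
          for name, (q_sim, s_sim) in candidates.items()}

w_q, w_swe = 0.6, 0.4                      # aggregation weights (a modelling choice)
for name, (nq, ns) in scores.items():
    print(f"{name}: NSE_Q={nq:.3f}, NSE_SWE={ns:.3f}, weighted={w_q*nq + w_swe*ns:.3f}")

def dominates(a, b):
    """True if score pair a is at least as good in both objectives and better in one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

a, b = scores["model_A"], scores["model_B"]
print("A dominates B:", dominates(a, b), "| B dominates A:", dominates(b, a))
```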

  19. The Zebrafish Model Organism Database: new support for human disease models, mutation details, gene expression phenotypes and searching

    PubMed Central

    Howe, Douglas G.; Bradford, Yvonne M.; Eagle, Anne; Fashena, David; Frazer, Ken; Kalita, Patrick; Mani, Prita; Martin, Ryan; Moxon, Sierra Taylor; Paddock, Holly; Pich, Christian; Ramachandran, Sridhar; Ruzicka, Leyla; Schaper, Kevin; Shao, Xiang; Singer, Amy; Toro, Sabrina; Van Slyke, Ceri; Westerfield, Monte

    2017-01-01

    The Zebrafish Model Organism Database (ZFIN; http://zfin.org) is the central resource for zebrafish (Danio rerio) genetic, genomic, phenotypic and developmental data. ZFIN curators provide expert manual curation and integration of comprehensive data involving zebrafish genes, mutants, transgenic constructs and lines, phenotypes, genotypes, gene expressions, morpholinos, TALENs, CRISPRs, antibodies, anatomical structures, models of human disease and publications. We integrate curated, directly submitted, and collaboratively generated data, making these data available to the zebrafish research community. Among the vertebrate model organisms, zebrafish are superbly suited for rapid generation of sequence-targeted mutant lines, characterization of phenotypes including gene expression patterns, and generation of human disease models. The recent rapid adoption of zebrafish as human disease models is making management of these data particularly important to both the research and clinical communities. Here, we describe recent enhancements to ZFIN including use of the zebrafish experimental conditions ontology, ‘Fish’ records in the ZFIN database, support for gene expression phenotypes, models of human disease, mutation details at the DNA, RNA and protein levels, and updates to the ZFIN single box search. PMID:27899582

  20. A comprehensive evaluation of input data-induced uncertainty in nonpoint source pollution modeling

    NASA Astrophysics Data System (ADS)

    Chen, L.; Gong, Y.; Shen, Z.

    2015-11-01

    Watershed models have been used extensively for quantifying nonpoint source (NPS) pollution, but few studies have been conducted on the error transitivity from different input data sets to NPS modeling. In this paper, the effects of four types of input data, including rainfall, digital elevation models (DEMs), land use maps, and the amount of fertilizer, on NPS simulation were quantified and compared. The systematic input-induced uncertainty was investigated using a watershed model for phosphorus load prediction. Based on the results, the rain gauge density resulted in the largest model uncertainty, followed by DEMs, whereas land use and fertilizer amount exhibited limited impacts. The mean coefficients of variation for errors induced by the single rain gauge, multiple rain gauge, ASTER GDEM, NFGIS DEM, land use, and fertilizer amount information were 0.390, 0.274, 0.186, 0.073, 0.033 and 0.005, respectively. The use of specific input information, such as key gauges, is also highlighted to achieve the required model accuracy. In this sense, these results provide valuable information to other model-based studies for the control of prediction uncertainty.

  1. Airloads and Wake Geometry Calculations for an Isolated Tiltrotor Model in a Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2001-01-01

    Comparisons of measured and calculated aerodynamic behavior of a tiltrotor model are presented. The test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 0.25-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. The calculations were performed using the rotorcraft comprehensive analysis CAMRAD II. Presented are comparisons of measured and calculated performance for hover and helicopter mode operation, and airloads for helicopter mode. Calculated induced power, profile power, and wake geometry provide additional information about the aerodynamic behavior. An aerodynamic and wake model and calculation procedure that reflects the unique geometry and phenomena of tiltrotors has been developed. There are major differences between this model and the corresponding aerodynamic and wake model that has been established for helicopter rotors. In general, good correlation between measured and calculated performance and airloads behavior has been shown. Two aspects of the analysis that clearly need improvement are the stall delay model and the trailed vortex formation model.

  2. The Economics of Solar Heating

    NASA Technical Reports Server (NTRS)

    Forney, J. A.

    1982-01-01

    SHCOST program assesses economic feasibility of solar energy for single-family residences and light commercial applications. Program analyzes life-cycle costs as well as sensitivity studies to aid designer in selecting most economically attractive solar system for single-family residence or light commercial application. SHCOST includes fairly comprehensive list of cost elements from which user may select.

  3. Reading Comprehension Interventions for Students with Autism Spectrum Disorders: A Synthesis of Research

    ERIC Educational Resources Information Center

    El Zein, Farah; Solis, Michael; Vaughn, Sharon; McCulley, Lisa

    2014-01-01

    The authors synthesized reading intervention studies conducted between 1980 and 2012 with K-12 students identified with autism spectrum disorders (ASD). Nine single-subject design studies, one quasi-experimental study, and two single-group design studies met the criteria for inclusion. Findings from the studies indicate that modifying…

  4. Neuroanatomically Separable Effects of Imageability and Grammatical Class during Single-Word Comprehension

    ERIC Educational Resources Information Center

    Bedny, Marina; Thompson-Schill, Sharon L.

    2006-01-01

    The present study characterizes the neural correlates of noun and verb imageability and addresses the question of whether components of the neural network supporting word recognition can be separately modified by variations in grammatical class and imageability. We examined the effect of imageability on BOLD signal during single-word comprehension…

  5. Proposed Performance Standards for Comprehensive Support Services and Vocational Equity Grants.

    ERIC Educational Resources Information Center

    Lewis, Morgan V.

    Activities to develop proposed performance standards and measures for programs receiving funds authorized by the Carl D. Perkins Vocational and Applied Technology Education Act are described in this report. Two sections of the act are considered: Section 221 authorizes programs for single parents, displaced homemakers, and single pregnant women;…

  6. Real-Time Monitoring and Prediction of the Pilot Vehicle System (PVS) Closed-Loop Stability

    NASA Astrophysics Data System (ADS)

    Mandal, Tanmay Kumar

    Understanding human control behavior is an important step for improving the safety of future aircraft. Considerable resources are invested during the design phase of an aircraft to ensure that the aircraft has desirable handling qualities. However, human pilots exhibit a wide range of control behaviors that are a function of external stimulus, aircraft dynamics, and human psychological properties (such as workload, stress factor, confidence, and sense of urgency factor). This variability is difficult to address comprehensively during the design phase and may lead to undesirable pilot-aircraft interaction, such as pilot-induced oscillations (PIO). This creates the need to track human pilot performance in real time to monitor the stability of the pilot vehicle system (PVS). This work focused on studying human pilot behavior for the longitudinal axis of a remotely controlled research aircraft and on using human-in-the-loop (HuIL) simulations to obtain information about human controlled system (HCS) stability. The work in this dissertation is divided into two main parts: PIO analysis and human control model parameter estimation. To replicate different flight conditions, the experiments included time delay and elevator rate limiting phenomena typical of actuator dynamics. To study human control behavior, this study employed the McRuer model for single-input single-output manual compensatory tasks. The McRuer model is a lead-lag controller with time delay that has been shown to adequately model manual compensatory tasks. This dissertation presents a novel technique to estimate McRuer model parameters in real time and associated validation using HuIL simulations to correctly predict HCS stability. The McRuer model parameters were estimated in real time using a Kalman filter approach. The estimated parameters were then used to analyze the stability of the closed-loop HCS and verify them against the experimental data. Therefore, the main contribution of this dissertation is the design of an unscented Kalman filter-based algorithm to estimate McRuer model parameters in real time, and a framework to validate this algorithm for single-input single-output manual compensatory tasks to predict instabilities.
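    A small sketch of the McRuer lead-lag-with-time-delay pilot model referenced above, combined with an illustrative plant to inspect the open-loop crossover frequency and phase margin, is shown below; the gains, time constants, delay, and plant transfer function are assumed values, and the unscented Kalman filter estimation step is not reproduced here.

```python
# McRuer pilot describing function Yp(jw) = Kp (TL jw + 1)/(TI jw + 1) exp(-jw tau),
# evaluated over frequency with an illustrative rate-command plant Yc(jw) = K / jw.
import numpy as np

def mcruer(w, Kp=2.0, TL=0.4, TI=0.1, tau=0.25):
    """Pilot lead-lag model with pure time delay (illustrative parameter values)."""
    jw = 1j * w
    return Kp * (TL * jw + 1) / (TI * jw + 1) * np.exp(-jw * tau)

def plant(w, K=1.0):
    """Simple rate-command plant (an assumption, not the research aircraft model)."""
    return K / (1j * w)

w = np.logspace(-1, 1.5, 400)
L = mcruer(w) * plant(w)                     # open-loop pilot-vehicle response
mag_db = 20 * np.log10(np.abs(L))
phase_deg = np.degrees(np.unwrap(np.angle(L)))

# Crossover frequency (|L| = 1) and the corresponding phase margin
i_c = np.argmin(np.abs(mag_db))
print(f"crossover ~ {w[i_c]:.2f} rad/s, phase margin ~ {180 + phase_deg[i_c]:.1f} deg")
```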

  7. A One-Pot/Single-Analysis Approach to Substrate Scope Investigations Using Comprehensive Two-Dimensional Gas Chromatography (GC×GC).

    PubMed

    O'Neil, Gregory W; Nelson, Robert K; Wright, Alicia M; Reddy, Christopher M

    2016-05-06

    A representative substrate scope investigation for an enantioselective catalytic ketone-reduction has been performed as a single reaction on a mixture containing equimolar amounts of nine (9) prototypical compounds. The resulting analyte pool containing 18 potential products from nine different reactions could all be completely resolved in a single chromatographic injection using comprehensive two-dimensional gas chromatography (GC×GC) with time-of-flight mass spectrometry, allowing for simultaneous determination of percent conversion and enantiomeric excess for each substrate. The results obtained for an enantioselective iron-catalyzed asymmetric transfer hydrogenation using this one-pot/single-analysis approach were similar to those reported for the individualized reactions, demonstrating the utility of this strategy for streamlining substrate scope investigations. Moreover, for this particular catalyst, activity and selectivity were not greatly affected by the presence of other ketones or enantioenriched reduced products. This approach allows for faster and greener analyses that are central to new reaction development, as well as an opportunity to gain further insights into other established transformations.

  8. Dissecting hematopoietic and renal cell heterogeneity in adult zebrafish at single-cell resolution using RNA sequencing.

    PubMed

    Tang, Qin; Iyer, Sowmya; Lobbardi, Riadh; Moore, John C; Chen, Huidong; Lareau, Caleb; Hebert, Christine; Shaw, McKenzie L; Neftel, Cyril; Suva, Mario L; Ceol, Craig J; Bernards, Andre; Aryee, Martin; Pinello, Luca; Drummond, Iain A; Langenau, David M

    2017-10-02

    Recent advances in single-cell, transcriptomic profiling have provided unprecedented access to investigate cell heterogeneity during tissue and organ development. In this study, we used massively parallel, single-cell RNA sequencing to define cell heterogeneity within the zebrafish kidney marrow, constructing a comprehensive molecular atlas of definitive hematopoiesis and functionally distinct renal cells found in adult zebrafish. Because our method analyzed blood and kidney cells in an unbiased manner, our approach was useful in characterizing immune-cell deficiencies within DNA-protein kinase catalytic subunit (prkdc), interleukin-2 receptor γ a (il2rga), and double-homozygous-mutant fish, identifying blood cell losses in T, B, and natural killer cells within specific genetic mutants. Our analysis also uncovered novel cell types, including two classes of natural killer immune cells, classically defined and erythroid-primed hematopoietic stem and progenitor cells, mucin-secreting kidney cells, and kidney stem/progenitor cells. In total, our work provides the first, comprehensive, single-cell, transcriptomic analysis of kidney and marrow cells in the adult zebrafish. © 2017 Tang et al.

  9. Dissecting hematopoietic and renal cell heterogeneity in adult zebrafish at single-cell resolution using RNA sequencing

    PubMed Central

    Iyer, Sowmya; Lobbardi, Riadh; Chen, Huidong; Hebert, Christine; Shaw, McKenzie L.; Neftel, Cyril; Suva, Mario L.; Bernards, Andre; Aryee, Martin; Drummond, Iain A.

    2017-01-01

    Recent advances in single-cell, transcriptomic profiling have provided unprecedented access to investigate cell heterogeneity during tissue and organ development. In this study, we used massively parallel, single-cell RNA sequencing to define cell heterogeneity within the zebrafish kidney marrow, constructing a comprehensive molecular atlas of definitive hematopoiesis and functionally distinct renal cells found in adult zebrafish. Because our method analyzed blood and kidney cells in an unbiased manner, our approach was useful in characterizing immune-cell deficiencies within DNA–protein kinase catalytic subunit (prkdc), interleukin-2 receptor γ a (il2rga), and double-homozygous–mutant fish, identifying blood cell losses in T, B, and natural killer cells within specific genetic mutants. Our analysis also uncovered novel cell types, including two classes of natural killer immune cells, classically defined and erythroid-primed hematopoietic stem and progenitor cells, mucin-secreting kidney cells, and kidney stem/progenitor cells. In total, our work provides the first, comprehensive, single-cell, transcriptomic analysis of kidney and marrow cells in the adult zebrafish. PMID:28878000

  10. Sorption Modeling and Verification for Off-Gas Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tavlarides, Lawrence; Yiacoumi, Sotira; Tsouris, Costas

    2016-12-20

    This project was successfully executed to provide valuable adsorption data and improve a comprehensive model developed in previous work by the authors. Data obtained were used in an integrated computer program to predict the behavior of adsorption columns. The model is supported by experimental data and has been shown to predict capture of off-gas similar to that evolving during the reprocessing of nuclear waste. The computer program structure contains (a) equilibrium models of off-gases with the adsorbate; (b) mass-transfer models to describe off-gas mass transfer to a particle, diffusion through the pores of the particle, and adsorption on the active sites of the particle; and (c) incorporation of these models into fixed bed adsorption modeling, which includes advection through the bed. These models are being connected with the MOOSE (Multiphysics Object-Oriented Simulation Environment) software developed at the Idaho National Laboratory through DGOSPREY (Discontinuous Galerkin Off-gas SeParation and REcoverY) computer codes developed in this project. Experiments for iodine and water adsorption have been conducted on reduced silver mordenite (Ag0Z) for single layered particles. Adsorption apparatuses have been constructed to execute these experiments over a useful range of conditions for temperatures ranging from ambient to 250°C and water dew points ranging from -69 to 19°C. Experimental results were analyzed to determine mass transfer and diffusion of these gases into the particles and to determine which models best describe the single and binary component mass transfer and diffusion processes. The experimental results were also used to demonstrate the capabilities of the comprehensive models developed to predict single-particle adsorption and transients of the adsorption-desorption processes in fixed beds. Models for adsorption and mass transfer have been developed to mathematically describe adsorption kinetics and transport via diffusion and advection processes. These models were built on a numerical framework for solving conservation law problems in one-dimensional geometries such as spheres, cylinders, and lines. Coupled with the framework are specific models for adsorption in commercial adsorbents, such as zeolites and mordenites. Utilizing this modeling approach, the authors were able to accurately describe and predict adsorption kinetic data obtained from experiments at a variety of different temperatures and gas phase concentrations. A demonstration of how these models and framework can be used to simulate adsorption in fixed-bed columns is provided. The CO2 absorption work involved modeling with supportive experimental information. A dynamic model was developed to simulate CO2 absorption using high-alkaline-content water solutions. The model is based upon transient mass and energy balances for chemical species commonly present in CO2 absorption. A computer code was developed to implement CO2 absorption with a chemical reaction model. Experiments were conducted in a laboratory-scale column to determine the model parameters. The influence of geometric parameters and operating variables on CO2 absorption was studied over a wide range of conditions. Continuing work could employ the model to control column operation and predict the absorption behavior under various input conditions and other prescribed experimental perturbations.
The value of the validated models and numerical frameworks developed in this project is that they can be used to predict the sorption behavior of off-gas evolved during the reprocessing of nuclear waste and thus reduce the cost of the experiments. They can also be used to design sorption processes based on concentration limits and flow-rates determined at the plant level.
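
    To make the particle-scale mass-transfer modeling concrete, the sketch below (a minimal Python illustration, not the DGOSPREY code) integrates a linear-driving-force uptake model, dq/dt = k·(q* − q), toward a Langmuir equilibrium loading q*; all parameter values are assumed for illustration only.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative parameters (not fitted values from the project)
        k_ldf  = 5e-3      # 1/s, lumped mass-transfer coefficient
        q_max  = 2.0       # mol/kg, Langmuir saturation capacity
        K_lang = 50.0      # m^3/mol, Langmuir affinity constant
        c_gas  = 1e-3      # mol/m^3, constant bulk gas-phase concentration

        def ldf(t, q):
            """Linear-driving-force uptake toward the Langmuir equilibrium loading."""
            q_eq = q_max * K_lang * c_gas / (1.0 + K_lang * c_gas)
            return k_ldf * (q_eq - q)

        sol = solve_ivp(ldf, (0.0, 2000.0), [0.0], t_eval=np.linspace(0, 2000, 5))
        for t, q in zip(sol.t, sol.y[0]):
            print(f"t = {t:6.0f} s   loading q = {q:.4f} mol/kg")

    A fixed-bed simulation couples many such particle balances to an advection equation along the column, which is the role of the MOOSE/DGOSPREY framework described above.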

  11. High-Energy Passive Mode-Locking of Fiber Lasers

    PubMed Central

    Ding, Edwin; Renninger, William H.; Wise, Frank W.; Grelu, Philippe; Shlizerman, Eli; Kutz, J. Nathan

    2012-01-01

    Mode-locking refers to the generation of ultrashort optical pulses in laser systems. A comprehensive study of achieving high-energy pulses in a ring cavity fiber laser that is passively mode-locked by a series of waveplates and a polarizer is presented in this paper. Specifically, it is shown that the multipulsing instability can be circumvented in favor of bifurcating to higher-energy single pulses by appropriately adjusting the group velocity dispersion in the fiber and the waveplate/polarizer settings in the saturable absorber. The findings may be used as practical guidelines for designing high-power lasers since the theoretical model relates directly to the experimental settings. PMID:22866059

  12. Fundamental reform of payment for adult primary care: comprehensive payment for comprehensive care.

    PubMed

    Goroll, Allan H; Berenson, Robert A; Schoenbaum, Stephen C; Gardner, Laurence B

    2007-03-01

    Primary care is essential to the effective and efficient functioning of health care delivery systems, yet there is an impending crisis in the field due in part to a dysfunctional payment system. We present a fundamentally new model of payment for primary care, replacing encounter-based imbursement with comprehensive payment for comprehensive care. Unlike former iterations of primary care capitation (which simply bundled inadequate fee-for-service payments), our comprehensive payment model represents new investment in adult primary care, with substantial increases in payment over current levels. The comprehensive payment is directed to practices to include support for the modern systems and teams essential to the delivery of comprehensive, coordinated care. Income to primary physicians is increased commensurate with the high level of responsibility expected. To ensure optimal allocation of resources and the rewarding of desired outcomes, the comprehensive payment is needs/risk-adjusted and performance-based. Our model establishes a new social contract with the primary care community, substantially increasing payment in return for achieving important societal health system goals, including improved accessibility, quality, safety, and efficiency. Attainment of these goals should help offset and justify the costs of the investment. Field tests of this and other new models of payment for primary care are urgently needed.

  13. Fundamental Reform of Payment for Adult Primary Care: Comprehensive Payment for Comprehensive Care

    PubMed Central

    Berenson, Robert A.; Schoenbaum, Stephen C.; Gardner, Laurence B.

    2007-01-01

    Primary care is essential to the effective and efficient functioning of health care delivery systems, yet there is an impending crisis in the field due in part to a dysfunctional payment system. We present a fundamentally new model of payment for primary care, replacing encounter-based imbursement with comprehensive payment for comprehensive care. Unlike former iterations of primary care capitation (which simply bundled inadequate fee-for-service payments), our comprehensive payment model represents new investment in adult primary care, with substantial increases in payment over current levels. The comprehensive payment is directed to practices to include support for the modern systems and teams essential to the delivery of comprehensive, coordinated care. Income to primary physicians is increased commensurate with the high level of responsibility expected. To ensure optimal allocation of resources and the rewarding of desired outcomes, the comprehensive payment is needs/risk-adjusted and performance-based. Our model establishes a new social contract with the primary care community, substantially increasing payment in return for achieving important societal health system goals, including improved accessibility, quality, safety, and efficiency. Attainment of these goals should help offset and justify the costs of the investment. Field tests of this and other new models of payment for primary care are urgently needed. PMID:17356977

  14. Comprehensive simulation-enhanced training curriculum for an advanced minimally invasive procedure: a randomized controlled trial.

    PubMed

    Zevin, Boris; Dedy, Nicolas J; Bonrath, Esther M; Grantcharov, Teodor P

    2017-05-01

    There is no comprehensive simulation-enhanced training curriculum to address cognitive, psychomotor, and nontechnical skills for an advanced minimally invasive procedure. (1) To develop and provide evidence of validity for a comprehensive simulation-enhanced training (SET) curriculum for an advanced minimally invasive procedure; (2) to demonstrate transfer of acquired psychomotor skills from a simulation laboratory to a live porcine model; and (3) to compare training outcomes of the SET curriculum group and a chief resident group. University. This prospective single-blinded, randomized, controlled trial allocated 20 intermediate-level surgery residents to receive either conventional training (control) or SET curriculum training (intervention). The SET curriculum consisted of cognitive, psychomotor, and nontechnical training modules. Psychomotor skills in a live anesthetized porcine model in the OR were the primary outcome. Knowledge of advanced minimally invasive and bariatric surgery and nontechnical skills in a simulated OR crisis scenario were the secondary outcomes. Residents in the SET curriculum group went on to perform a laparoscopic jejunojejunostomy in the OR. Cognitive, psychomotor, and nontechnical skills of the SET curriculum group were also compared to a group of 12 chief surgery residents. The SET curriculum group demonstrated superior psychomotor skills in a live porcine model (56 [47-62] versus 44 [38-53], P<.05) and superior nontechnical skills (41 [38-45] versus 31 [24-40], P<.01) compared with the conventional training group. The SET curriculum group and conventional training group demonstrated equivalent knowledge (14 [12-15] versus 13 [11-15], P = .47). The SET curriculum group demonstrated equivalent psychomotor skills in the live porcine model and in the OR in a human patient (56 [47-62] versus 63 [61-68]; P = .21). The SET curriculum group demonstrated inferior knowledge (13 [11-15] versus 16 [14-16]; P<.05), equivalent psychomotor skill (63 [61-68] versus 68 [62-74]; P = .50), and superior nontechnical skills (41 [38-45] versus 34 [27-35], P<.01) compared with the chief resident group. Completion of the SET curriculum resulted in superior training outcomes, compared with conventional surgery training. Implementation of the SET curriculum can standardize training for an advanced minimally invasive procedure and can ensure that comprehensive proficiency milestones are met before exposure to patient care. Copyright © 2017 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.

  15. Reconciling Time, Space and Function: A New Dorsal-Ventral Stream Model of Sentence Comprehension

    ERIC Educational Resources Information Center

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2013-01-01

    We present a new dorsal-ventral stream framework for language comprehension which unifies basic neurobiological assumptions (Rauschecker & Scott, 2009) with a cross-linguistic neurocognitive sentence comprehension model (eADM; Bornkessel & Schlesewsky, 2006). The dissociation between (time-dependent) syntactic structure-building and…

  16. Learning Oriented Region-based Convolutional Neural Networks for Building Detection in Satellite Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Chen, C.; Gong, W.; Hu, Y.; Chen, Y.; Ding, Y.

    2017-05-01

    The automated detection of buildings in aerial images is a fundamental problem in aerial and satellite image analysis. Recently, thanks to advances in feature description, the Region-based CNN (R-CNN) model for object detection has been receiving increasing attention. Despite its excellent performance in object detection, it is problematic to directly leverage the features of the R-CNN model for building detection in a single aerial image. Aerial images are captured in a vertical view, and buildings possess a significant directional feature; however, in the R-CNN model, the direction of the building is ignored and the detection results are represented by horizontal rectangles. For this reason, detection results given as horizontal rectangles cannot describe buildings precisely. To address this problem, in this paper we propose a novel model with a key feature related to orientation, namely, Oriented R-CNN (OR-CNN). Our contributions are mainly in the following two aspects: 1) introducing a new oriented layer network for detecting the rotation angle of a building, on the basis of the successful VGG-net R-CNN model; 2) proposing the oriented rectangle to leverage the powerful R-CNN for remote-sensing building detection. In the experiments, we establish a complete, brand-new data set for training our oriented R-CNN model and comprehensively evaluate the proposed method on a publicly available building detection data set. We demonstrate state-of-the-art results compared with the previous baseline methods.
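
    To illustrate what an oriented (rotated) detection box adds over an axis-aligned one, the short Python sketch below converts a box parameterized as (cx, cy, w, h, angle) into its four corner points; it is a generic geometric illustration, not the authors' OR-CNN implementation.

        import numpy as np

        def oriented_box_corners(cx, cy, w, h, angle_deg):
            """Corners of a rotated rectangle given its center, size, and rotation angle."""
            theta = np.radians(angle_deg)
            rot = np.array([[np.cos(theta), -np.sin(theta)],
                            [np.sin(theta),  np.cos(theta)]])
            half = np.array([[-w / 2, -h / 2], [ w / 2, -h / 2],
                             [ w / 2,  h / 2], [-w / 2,  h / 2]])
            return half @ rot.T + np.array([cx, cy])

        # A 40 x 20 pixel building footprint rotated by 30 degrees about its center
        print(oriented_box_corners(100.0, 80.0, 40.0, 20.0, 30.0))

    An axis-aligned box around the same footprint would have to enclose all four rotated corners, which is why horizontal rectangles overestimate the extent of rotated buildings.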

  17. Ecosystem approach to fisheries: Exploring environmental and trophic effects on Maximum Sustainable Yield (MSY) reference point estimates

    PubMed Central

    Kumar, Rajeev; Pitcher, Tony J.; Varkey, Divya A.

    2017-01-01

    We present a comprehensive analysis of estimation of fisheries Maximum Sustainable Yield (MSY) reference points using an ecosystem model built for Mille Lacs Lake, the second largest lake within Minnesota, USA. Data from single-species modelling output, extensive annual sampling of species abundances, annual catch surveys, stomach-content analysis of predator-prey interactions, and expert opinion were brought together within the framework of an Ecopath with Ecosim (EwE) ecosystem model. An increase in the lake water temperature was observed in the last few decades; therefore, we also incorporated a temperature forcing function in the EwE model to capture the influences of changing temperature on the species composition and food web. The EwE model was fitted to abundance and catch time-series for the period 1985 to 2006. Using the ecosystem model, we estimated reference points for most of the fished species in the lake at single-species as well as ecosystem levels with and without considering the influence of temperature change; therefore, our analysis investigated the trophic and temperature effects on the reference points. The paper concludes that reference points such as MSY are not stationary, but change when (1) environmental conditions alter species productivity and (2) fishing on predators alters the compensatory response of their prey. Thus, it is necessary for management to re-estimate or re-evaluate the reference points when changes in environmental conditions and/or major shifts in species abundance or community structure are observed. PMID:28957387
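
    For readers unfamiliar with MSY reference points, a minimal single-species illustration (a Schaefer surplus-production model, far simpler than the EwE ecosystem model used in the study) is sketched below in Python; the growth rate r and carrying capacity K are assumed values.

        # Schaefer surplus-production model: dB/dt = r*B*(1 - B/K) - F*B
        # Under this simple model, MSY = r*K/4, attained at F_MSY = r/2 and B_MSY = K/2.
        r, K = 0.4, 100_000.0   # illustrative intrinsic growth rate (1/yr) and carrying capacity (t)

        F_msy = r / 2.0
        B_msy = K / 2.0
        MSY   = r * K / 4.0
        print(f"F_MSY = {F_msy:.2f} /yr, B_MSY = {B_msy:.0f} t, MSY = {MSY:.0f} t/yr")

    The ecosystem-level point of the paper is precisely that such closed-form values shift once temperature forcing and predator-prey interactions are taken into account.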

  18. A Global User-Driven Model for Tile Prefetching in Web Geographical Information Systems.

    PubMed

    Pan, Shaoming; Chong, Yanwen; Zhang, Hang; Tan, Xicheng

    2017-01-01

    A web geographical information system is a typical service-intensive application. Tile prefetching and cache replacement can improve cache hit ratios by proactively fetching tiles from storage and replacing the appropriate tiles from the high-speed cache buffer without waiting for a client's requests, which reduces disk latency and improves system access performance. Most popular prefetching strategies consider only the relative tile popularities to predict which tile should be prefetched or consider only a single individual user's access behavior to determine which neighbor tiles need to be prefetched. Some studies show that comprehensively considering all users' access behaviors and all tiles' relationships in the prediction process can achieve more significant improvements. Thus, this work proposes a new global user-driven model for tile prefetching and cache replacement. First, based on all users' access behaviors, a type of expression method for tile correlation is designed and implemented. Then, a conditional prefetching probability can be computed based on the proposed correlation expression method. Thus, some tiles to be prefetched can be found by computing and comparing the conditional prefetching probability from the uncached tiles set and, similarly, some replacement tiles can be found in the cache buffer according to multi-step prefetching. Finally, some experiments are provided comparing the proposed model with other global user-driven models, other single user-driven models, and other client-side prefetching strategies. The results show that the proposed model can achieve a prefetching hit rate approximately 10.6% to 110.5% higher than the compared methods.
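
    As a rough illustration of the global user-driven idea, the Python sketch below estimates a conditional prefetching probability P(tile_j | tile_i) from tile co-occurrence counts accumulated over all users' access sessions; the data structure and toy sessions are assumptions for illustration, not the paper's exact correlation expression.

        from collections import defaultdict
        from itertools import permutations

        # Toy access sessions from all users (each session is a set of requested tile ids)
        sessions = [["t1", "t2", "t3"], ["t1", "t2"], ["t2", "t3"], ["t1", "t3"]]

        pair_counts = defaultdict(int)
        tile_counts = defaultdict(int)
        for s in sessions:
            for t in set(s):
                tile_counts[t] += 1
            for a, b in permutations(set(s), 2):
                pair_counts[(a, b)] += 1

        def prefetch_probability(current_tile, candidate_tile):
            """P(candidate requested in a session | current tile requested)."""
            if tile_counts[current_tile] == 0:
                return 0.0
            return pair_counts[(current_tile, candidate_tile)] / tile_counts[current_tile]

        print(prefetch_probability("t1", "t2"))   # 2/3 in this toy data

    Tiles whose conditional probability exceeds a chosen threshold would be prefetched, and the lowest-probability tiles in the cache buffer become candidates for replacement.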

  19. The Structure of Oral Language and Reading and Their Relation to Comprehension in Kindergarten through Grade 2

    PubMed Central

    Foorman, Barbara R.; Herrera, Sarah; Petscher, Yaacov; Mitchell, Alison; Truckenmiller, Adrea

    2016-01-01

    This study examined the structure of oral language and reading and their relation to comprehension from a latent variable modeling perspective in Kindergarten, Grade 1, and Grade 2. Participants were students in Kindergarten (n = 218), Grade 1 (n = 372), and Grade 2 (n = 273), attending Title 1 schools. Students were administered phonological awareness, syntax, vocabulary, listening comprehension, and decoding fluency measures in mid-year. Outcome measures included a listening comprehension measure in Kindergarten and a reading comprehension test in Grades 1 and 2. In Kindergarten, oral language (consisting of listening comprehension, syntax, and vocabulary) shared variance with phonological awareness in predicting a listening comprehension outcome. However, in Grades 1 and 2, phonological awareness was no longer predictive of reading comprehension when decoding fluency and oral language were included in the model. In Grades 1 and 2, oral language and decoding fluency were significant predictors of reading comprehension. PMID:27660395

  20. In Vitro Evaluation of Glycoengineered RSV-F in the Human Artificial Lymph Node Reactor.

    PubMed

    Radke, Lars; Sandig, Grit; Lubitz, Annika; Schließer, Ulrike; von Horsten, Hans Henning; Blanchard, Veronique; Keil, Karolin; Sandig, Volker; Giese, Christoph; Hummel, Michael; Hinderlich, Stephan; Frohme, Marcus

    2017-08-15

    Subunit vaccines often require adjuvants to elicit sustained immune activity. Here, a method is described to evaluate the efficacy of single vaccine candidates in the preclinical stage based on cytokine and gene expression analysis. As a model, the recombinant human respiratory syncytial virus (RSV) fusion protein (RSV-F) was produced in CHO cells. For comparison, wild-type and glycoengineered, afucosylated RSV-F were established. Both glycoprotein vaccines were tested in a commercial Human Artificial Lymph Node in vitro model (HuALN®). The analysis of six key cytokines in cell culture supernatants showed well-balanced immune responses for the afucosylated RSV-F, while the immune response of wild-type RSV-F was more Th1-accentuated. In particular, stronger and specific secretion of interleukin-4 after each round of re-stimulation underlined higher potency and efficacy of the afucosylated vaccine candidate. Comprehensive gene expression analysis by nCounter gene expression assay confirmed the stronger onset of the immunologic reaction in stimulation experiments with the afucosylated vaccine in comparison to wild-type RSV-F and particularly revealed prominent activation of Th17-related genes, innate immunity, and comprehensive activation of humoral immunity. We, therefore, show that our method is suited to distinguish the potency of two vaccine candidates with minor structural differences.

  1. Comprehensive analysis of mouse retinal mononuclear phagocytes.

    PubMed

    Lückoff, Anika; Scholz, Rebecca; Sennlaub, Florian; Xu, Heping; Langmann, Thomas

    2017-06-01

    The innate immune system is activated in a number of degenerative and inflammatory retinal disorders such as age-related macular degeneration (AMD). Retinal microglia, choroidal macrophages, and recruited monocytes, collectively termed 'retinal mononuclear phagocytes', are critical determinants of ocular disease outcome. Many publications have described the presence of these cells in mouse models for retinal disease; however, only limited aspects of their behavior have been uncovered, typically using a single detection method. The workflow presented here describes a comprehensive analysis strategy that allows characterization of retinal mononuclear phagocytes in vivo and in situ. We present standardized working steps for scanning laser ophthalmoscopy of microglia from MacGreen reporter mice (mice expressing the macrophage colony-stimulating factor receptor GFP transgene throughout the mononuclear phagocyte system), quantitative analysis of Iba1-stained retinal sections and flat mounts, CD11b-based retinal flow cytometry, and qRT-PCR analysis of key microglia markers. The protocol can be completed within 3 d, and we present data from retinas treated with laser-induced choroidal neovascularization (CNV), bright white-light exposure, and Fam161a-associated inherited retinal degeneration. The assays can be applied to any of the existing mouse models for retinal disorders and may be valuable for documenting immune responses in studies of immunomodulatory therapies.

  2. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zuwei; Zhao, Haibo, E-mail: klinsmannzhb@163.com; Zheng, Chuguang

    2015-01-15

    This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of particle size distribution with low statistical noise over the full size range and as far as possible to reduce the number of time loopings. Here three coagulation rules are highlighted and it is found that constructing an appropriate coagulation rule provides a route to attain a compromise between accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering the two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates being used for acceptance–rejection processes by single-looping over all particles, and meanwhile the mean time-step of the coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly to be proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by multiple cores on a GPU that can implement the massively threaded data-parallel tasks to obtain a remarkable speedup ratio (compared with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches of PBMC are demonstrated in a physically realistic Brownian coagulation case. The computational accuracy is validated against the benchmark solution of the discrete-sectional method. The simulation results show that the comprehensive approach can attain very favorable improvement in cost without sacrificing computational accuracy.
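
    The acceptance-rejection idea behind the majorant kernel can be sketched in a few lines of Python: a candidate particle pair is accepted with probability K(v_i, v_j)/K_maj, where K_maj is an upper bound on the kernel over the current population, so expensive double loops over all pairs are avoided. The kernel and parameters below are illustrative, not the weighted PBMC implementation of the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        volumes = rng.uniform(1.0, 2.0, size=1000)   # toy particle volumes

        def kernel(v1, v2):
            """Toy coagulation kernel (constant plus sum term)."""
            return 1.0 + 0.1 * (v1 + v2)

        # Majorant: an upper bound on the kernel over the current population
        k_maj = 1.0 + 0.1 * 2.0 * volumes.max()

        def try_coagulation_event():
            """Pick a random pair and accept it with probability K/K_maj."""
            i, j = rng.choice(len(volumes), size=2, replace=False)
            if rng.random() < kernel(volumes[i], volumes[j]) / k_maj:
                return i, j          # accepted: this pair would coagulate
            return None              # rejected: no event this trial

        accepted = sum(try_coagulation_event() is not None for _ in range(10_000))
        print(f"acceptance fraction = {accepted / 10_000:.2f}")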

  3. Research on a Frame-Based Model of Reading Comprehension. Final Report.

    ERIC Educational Resources Information Center

    Goldstein, Ira

    This report summarizes computational investigations of language comprehension based on Marvin Minsky's theory of frames, a recent advance in artificial intelligence theories about the representation of knowledge. The investigations discussed explored frame theory as a basis for text comprehension by implementing models of the theory and developing…

  4. WWC Review of the Report "The Impact of Collaborative Strategic Reading on the Reading Comprehension of Grade 5 Students in Linguistically Diverse Schools." What Works Clearinghouse Single Study Review

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2013

    2013-01-01

    The study reviewed in this paper examined the impact of "Collaborative Strategic Reading" ("CSR"), a set of instructional strategies used to build reading proficiency, on the reading comprehension of fifth-grade students. The analysis included 1,355 students from 74 social studies classrooms within 26 linguistically diverse…

  5. A Bayesian Poisson-lognormal Model for Count Data for Multiple-Trait Multiple-Environment Genomic-Enabled Prediction.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Crossa, José; Toledo, Fernando H; Montesinos-López, José C; Singh, Pawan; Juliana, Philomin; Salinas-Ruiz, Josafhat

    2017-05-05

    When a plant scientist wishes to make genomic-enabled predictions of multiple traits measured in multiple individuals in multiple environments, the most common strategy for performing the analysis is to use a single trait at a time taking into account genotype × environment interaction (G × E), because there is a lack of comprehensive models that simultaneously take into account the correlated counting traits and G × E. For this reason, in this study we propose a multiple-trait and multiple-environment model for count data. The proposed model was developed under the Bayesian paradigm for which we developed a Markov Chain Monte Carlo (MCMC) with noninformative priors. This allows obtaining all required full conditional distributions of the parameters leading to an exact Gibbs sampler for the posterior distribution. Our model was tested with simulated data and a real data set. Results show that the proposed multi-trait, multi-environment model is an attractive alternative for modeling multiple count traits measured in multiple environments. Copyright © 2017 Montesinos-López et al.
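
    The Poisson-lognormal construction underlying the model is easy to illustrate by simulation: counts are Poisson draws whose log-rates are Gaussian, here with a genotype effect and an environment effect (a single trait is shown for brevity). Everything below is simulated toy data in Python with assumed variances, not the authors' Bayesian Gibbs sampler.

        import numpy as np

        rng = np.random.default_rng(1)
        n_geno, n_env = 50, 3

        # Gaussian effects on the log-rate scale (illustrative variances)
        geno_eff = rng.normal(0.0, 0.5, size=n_geno)
        env_eff  = rng.normal(0.0, 0.3, size=n_env)
        mu       = 1.0                                  # overall log mean

        # Poisson-lognormal counts: y_ij ~ Poisson(exp(mu + g_i + e_j + noise))
        log_rate = mu + geno_eff[:, None] + env_eff[None, :] + rng.normal(0.0, 0.2, (n_geno, n_env))
        counts   = rng.poisson(np.exp(log_rate))

        print(counts.shape, counts.mean().round(2))

    Inference in the paper runs in the opposite direction: given observed counts, the Gibbs sampler recovers the posterior of the genotype, environment, and G × E effects on the latent log-rate scale.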

  6. Towards a comprehensive city emission function (CCEF)

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav

    2018-01-01

    The comprehensive city emission function (CCEF) is developed for heterogeneous light-emitting or light-blocking urban environments, embracing any combination of input parameters that characterize linear dimensions in the system (size and distances between buildings or luminaires), properties of light-emitting elements (such as luminous building façades and street lighting), ground reflectance and total uplight fraction, all of these defined for an arbitrarily sized 2D area. The analytical formula obtained is not restricted to a single model class as it can capture any specific light-emission feature for a wide range of cities. The CCEF method is numerically fast in contrast to what can be expected of other probabilistic approaches that rely on repeated random sampling. Hence the present solution has great potential in light-pollution modeling and can be included in larger numerical models. Our theoretical findings promise great progress in light-pollution modeling as this is the first time an analytical solution to the city emission function (CEF) has been developed that depends on the statistical mean size and height of city buildings, inter-building separation, prevailing heights of light fixtures, lighting density, and other factors such as luminaire light output and light distribution, including the amount of uplight, and representative city size. The model is validated for sensitivity and specificity pertinent to combinations of input parameters in order to test its behavior under various conditions, including those that can occur in complex urban environments. It is demonstrated that the solution model succeeds in reproducing a light emission peak at some elevated zenith angles and is consistent with reduced rather than enhanced emission in directions nearly parallel to the ground.

  7. Comprehensive two-dimensional liquid chromatography for polyphenol analysis in foodstuffs.

    PubMed

    Cacciola, Francesco; Farnetti, Sara; Dugo, Paola; Marriott, Philip John; Mondello, Luigi

    2017-01-01

    Polyphenols are a class of plant secondary metabolites that are recently drawing a special interest because of their broad spectrum of pharmacological effects. As they are characterized by an enormous structural variability, the identification of these molecules in food samples is a difficult task, and sometimes having only a limited number of commercially available reference materials is not of great help. One-dimensional liquid chromatography is the most widely applied analytical approach for their analysis. In particular, the hyphenation of liquid chromatography to mass spectrometry has come to play an influential role by allowing relatively fast tentative identification and accurate quantification of polyphenolic compounds at trace levels in vegetable media. However, when dealing with very complex real-world food samples, a single separation system often does not provide sufficient resolving power for attaining rewarding results. Comprehensive two-dimensional liquid chromatography is a technique of great analytical impact, since it offers much higher peak capacities than separations in a single dimension. In the present review, we describe applications in the field of comprehensive two-dimensional liquid chromatography for polyphenol analysis in real-world food samples. Comprehensive two-dimensional liquid chromatography applications to nonfood matrices fall outside the scope of the current report and will not be discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Attention Process Training-3 to improve reading comprehension in mild aphasia: A single-case experimental design study.

    PubMed

    Lee, Jaime B; Sohlberg, McKay Moore; Harn, Beth; Horner, Robert; Cherney, Leora R

    2018-06-04

    People with aphasia frequently present with nonlinguistic deficits, in addition to their compromised language abilities, which may contribute to their problems with reading comprehension. Treatment of attention, working memory and executive control may improve reading comprehension in individuals with aphasia, particularly those with mild reading problems. This single-case experimental design study evaluated the efficacy of Attention Process Training-3, an intervention combining direct attention training and metacognitive facilitation, for improving reading comprehension in individuals with mild aphasia. A multiple baseline design across six participants was used to evaluate treatment effects. The primary outcome measure was a maze reading task. Cognitive measures were administered pre- and post-treatment. Visual inspection of graphed maze reading performance data indicated a basic effect between APT-3 and improved maze reading for three of the six participants. Quantitative analyses, using Tau-U, corroborated findings identified through visual analysis. The overall effect size was significant (Tau = .48, p = .01). Results suggest that APT-3 has the potential to improve reading in individuals with aphasia, but that it may be more efficacious under certain conditions. Treatment and participant variables, including intensity of treatment and metacognitive strategy usage, are discussed as potential influences on participants' responsiveness to APT-3.
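
    For readers unfamiliar with the effect-size metric used here, the Python sketch below computes a basic between-phase nonoverlap Tau, which is only one ingredient of the full Tau-U (Tau-U additionally corrects for baseline trend); the scores are invented toy numbers, not the study's data.

        def phase_nonoverlap_tau(baseline, treatment):
            """Basic nonoverlap Tau between phases:
            (improving pairs - deteriorating pairs) / (nA * nB)."""
            pos = sum(b > a for a in baseline for b in treatment)
            neg = sum(b < a for a in baseline for b in treatment)
            return (pos - neg) / (len(baseline) * len(treatment))

        # Toy maze-reading scores (illustrative numbers only)
        baseline  = [10, 12, 11, 13]
        treatment = [14, 15, 13, 16, 17]
        print(phase_nonoverlap_tau(baseline, treatment))   # 0.95 for this toy data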

  9. Effects of word frequency and modality on sentence comprehension impairments in people with aphasia.

    PubMed

    DeDe, Gayle

    2012-05-01

    It is well known that people with aphasia have sentence comprehension impairments. The present study investigated whether lexical factors contribute to sentence comprehension impairments in both the auditory and written modalities using online measures of sentence processing. People with aphasia and non brain-damaged controls participated in the experiment (n = 8 per group). Twenty-one sentence pairs containing high- and low-frequency words were presented in self-paced listening and reading tasks. The sentences were syntactically simple and differed only in the critical words. The dependent variables were response times for critical segments of the sentence and accuracy on the comprehension questions. The results showed that word frequency influences performance on measures of sentence comprehension in people with aphasia. The accuracy data on the comprehension questions suggested that people with aphasia have more difficulty understanding sentences containing low-frequency words in the written compared to auditory modality. Both group and single-case analyses of the response time data also indicated that people with aphasia experience more difficulty with reading than listening. Sentence comprehension in people with aphasia is influenced by word frequency and presentation modality.

  10. The Bilingual Language Interaction Network for Comprehension of Speech

    PubMed Central

    Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can examine how cross-linguistic interaction affects language processing in a controlled, simulated environment. Here we present a connectionist model of bilingual language processing, the Bilingual Language Interaction Network for Comprehension of Speech (BLINCS), wherein interconnected levels of processing are created using dynamic, self-organizing maps. BLINCS can account for a variety of psycholinguistic phenomena, including cross-linguistic interaction at and across multiple levels of processing, cognate facilitation effects, and audio-visual integration during speech comprehension. The model also provides a way to separate two languages without requiring a global language-identification system. We conclude that BLINCS serves as a promising new model of bilingual spoken language comprehension. PMID:24363602

  11. On the validity of the arithmetic-geometric mean method to locate the optimal solution in a supply chain system

    NASA Astrophysics Data System (ADS)

    Chung, Kun-Jen

    2012-08-01

    Cardenas-Barron [Cardenas-Barron, L.E. (2010) 'A Simple Method to Compute Economic order Quantities: Some Observations', Applied Mathematical Modelling, 34, 1684-1688] indicates that there are several functions in which the arithmetic-geometric mean method (AGM) does not give the minimum. This article presents another situation to reveal that the AGM inequality to locate the optimal solution may be invalid for Teng, Chen, and Goyal [Teng, J.T., Chen, J., and Goyal S.K. (2009), 'A Comprehensive Note on: An Inventory Model under Two Levels of Trade Credit and Limited Storage Space Derived without Derivatives', Applied Mathematical Modelling, 33, 4388-4396], Teng and Goyal [Teng, J.T., and Goyal S.K. (2009), 'Comment on 'Optimal Inventory Replenishment Policy for the EPQ Model under Trade Credit Derived without Derivatives', International Journal of Systems Science, 40, 1095-1098] and Hsieh, Chang, Weng, and Dye [Hsieh, T.P., Chang, H.J., Weng, M.W., and Dye, C.Y. (2008), 'A Simple Approach to an Integrated Single-vendor Single-buyer Inventory System with Shortage', Production Planning and Control, 19, 601-604]. So, the main purpose of this article is to adopt the calculus approach not only to overcome shortcomings of the arithmetic-geometric mean method of Teng et al. (2009), Teng and Goyal (2009) and Hsieh et al. (2008), but also to develop the complete solution procedures for them.
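
    The core issue can be checked numerically: for a cost of the form A/Q + B·Q, the arithmetic-geometric mean inequality yields the minimizer Q* = sqrt(A/B), but once an extra term breaks that two-term structure the AGM bound need not be attained, and a calculus-based or numerical minimization is required. The Python cost functions below are illustrative stand-ins, not the inventory models of the cited papers.

        import numpy as np
        from scipy.optimize import minimize_scalar

        A, B, C = 800.0, 2.0, 0.05

        cost_simple = lambda q: A / q + B * q              # AGM applies: Q* = sqrt(A/B)
        cost_extra  = lambda q: A / q + B * q + C * q**2   # extra term: the AGM argument is no longer valid

        q_agm = np.sqrt(A / B)
        q_num_simple = minimize_scalar(cost_simple, bounds=(1e-3, 1e3), method="bounded").x
        q_num_extra  = minimize_scalar(cost_extra,  bounds=(1e-3, 1e3), method="bounded").x

        print(f"simple cost: AGM Q* = {q_agm:.2f}, numerical Q* = {q_num_simple:.2f}")
        print(f"extra term : numerical Q* = {q_num_extra:.2f} (differs from sqrt(A/B))")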

  12. Volume-Of-Fluid Simulation for Predicting Two-Phase Cooling in a Microchannel

    NASA Astrophysics Data System (ADS)

    Gorle, Catherine; Parida, Pritish; Houshmand, Farzad; Asheghi, Mehdi; Goodson, Kenneth

    2014-11-01

    Two-phase flow in microfluidic geometries has applications of increasing interest for next generation electronic and optoelectronic systems, telecommunications devices, and vehicle electronics. While there has been progress on comprehensive simulation of two-phase flows in compact geometries, validation of the results in different flow regimes should be considered to determine the predictive capabilities. In the present study we use the volume-of-fluid method to model the flow through a single micro channel with cross section 100 × 100 μm and length 10 mm. The channel inlet mass flux and the heat flux at the lower wall result in a subcooled boiling regime in the first 2.5 mm of the channel and a saturated flow regime further downstream. A conservation equation for the vapor volume fraction, and a single set of momentum and energy equations with volume-averaged fluid properties are solved. A reduced-physics phase change model represents the evaporation of the liquid and the corresponding heat loss, and the surface tension is accounted for by a source term in the momentum equation. The phase change model used requires the definition of a time relaxation parameter, which can significantly affect the solution since it determines the rate of evaporation. The results are compared to experimental data available from literature, focusing on the capability of the reduced-physics phase change model to predict the correct flow pattern, temperature profile and pressure drop.

  13. Wide coverage biomedical event extraction using multiple partially overlapping corpora

    PubMed Central

    2013-01-01

    Background: Biomedical events are key to understanding physiological processes and disease, and wide coverage extraction is required for comprehensive automatic analysis of statements describing biomedical systems in the literature. In turn, the training and evaluation of extraction methods requires manually annotated corpora. However, as manual annotation is time-consuming and expensive, any single event-annotated corpus can only cover a limited number of semantic types. Although combined use of several such corpora could potentially allow an extraction system to achieve broad semantic coverage, there has been little research into learning from multiple corpora with partially overlapping semantic annotation scopes. Results: We propose a method for learning from multiple corpora with partial semantic annotation overlap, and implement this method to improve our existing event extraction system, EventMine. An evaluation using seven event annotated corpora, including 65 event types in total, shows that learning from overlapping corpora can produce a single, corpus-independent, wide coverage extraction system that outperforms systems trained on single corpora and exceeds previously reported results on two established event extraction tasks from the BioNLP Shared Task 2011. Conclusions: The proposed method allows the training of a wide-coverage, state-of-the-art event extraction system from multiple corpora with partial semantic annotation overlap. The resulting single model makes broad-coverage extraction straightforward in practice by removing the need to either select a subset of compatible corpora or semantic types, or to merge results from several models trained on different individual corpora. Multi-corpus learning also allows annotation efforts to focus on covering additional semantic types, rather than aiming for exhaustive coverage in any single annotation effort, or extending the coverage of semantic types annotated in existing corpora. PMID:23731785

  14. Integrating System Dynamics and Bayesian Networks with Application to Counter-IED Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, Kenneth D.; Brothers, Alan J.; Whitney, Paul D.

    2010-06-06

    The practice of choosing a single modeling paradigm for predictive analysis can limit the scope and relevance of predictions and their utility to decision-making processes. Considering multiple modeling methods simultaneously may improve this situation, but a better solution provides a framework for directly integrating different, potentially complementary modeling paradigms to enable more comprehensive modeling and predictions, and thus better-informed decisions. The primary challenges of this kind of model integration are to bridge language and conceptual gaps between modeling paradigms, and to determine whether natural and useful linkages can be made in a formal mathematical manner. To address these challenges in the context of two specific modeling paradigms, we explore mathematical and computational options for linking System Dynamics (SD) and Bayesian network (BN) models and incorporating data into the integrated models. We demonstrate that integrated SD/BN models can naturally be described as either state space equations or Dynamic Bayes Nets, which enables the use of many existing computational methods for simulation and data integration. To demonstrate, we apply our model integration approach to techno-social models of insurgent-led attacks and security force counter-measures centered on improvised explosive devices.
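
    The claim that an integrated SD/BN model can be written in state-space form can be illustrated with a toy Python example: a one-stock System Dynamics model advanced by Euler integration with process noise and observed through a noisy measurement, which is exactly the transition/observation structure a Dynamic Bayes Net encodes. The model and parameters below are illustrative assumptions, not the counter-IED models from the report.

        import numpy as np

        rng = np.random.default_rng(4)

        # One-stock SD model as a discrete state-space equation:
        #   x_{t+1} = x_t + dt*(inflow - outflow_rate*x_t) + process noise
        #   y_t     = x_t + measurement noise
        dt, inflow, outflow_rate = 1.0, 5.0, 0.1
        proc_std, obs_std = 0.5, 2.0

        x = 20.0
        for t in range(5):
            x = x + dt * (inflow - outflow_rate * x) + proc_std * rng.normal()
            y = x + obs_std * rng.normal()
            print(f"t={t}: stock x = {x:6.2f}, observed y = {y:6.2f}")

    Writing the stock-flow dynamics this way is what allows standard filtering and data-assimilation machinery for Dynamic Bayes Nets to be applied to the integrated model.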

  15. Asymmetric transmission and reflection spectra of FBG in single-multi-single mode fiber structure.

    PubMed

    Chai, Quan; Liu, Yanlei; Zhang, Jianzhong; Yang, Jun; Chen, Yujin; Yuan, Libo; Peng, Gang-Ding

    2015-05-04

    We give a comprehensive theoretical analysis and simulation of an FBG in a single-multi-single mode fiber structure (FBG-in-SMS), based on coupled-mode analysis and mode-interference analysis. This enables us to explain the experimental observations: asymmetric transmission and reflection spectra with similar temperature responses near the spectral range of the Bragg wavelength. The transmission spectrum shift during the FBG writing process is observed and discussed. The analysis results are useful in the design of SMS-structure-based sensors and filters.

  16. Single-cell Transcriptome Study as Big Data

    PubMed Central

    Yu, Pingjian; Lin, Wei

    2016-01-01

    The rapid growth of single-cell RNA-seq studies (scRNA-seq) demands efficient data storage, processing, and analysis. Big-data technology provides a framework that facilitates the comprehensive discovery of biological signals from inter-institutional scRNA-seq datasets. The strategies to solve the stochastic and heterogeneous single-cell transcriptome signal are discussed in this article. After extensively reviewing the available big-data applications of next-generation sequencing (NGS)-based studies, we propose a workflow that accounts for the unique characteristics of scRNA-seq data and primary objectives of single-cell studies. PMID:26876720

  17. Direct and mediated effects of language and cognitive skills on comprehension of oral narrative texts (listening comprehension) for children.

    PubMed

    Kim, Young-Suk Grace

    2016-01-01

    We investigated component language and cognitive skills of oral language comprehension of narrative texts (i.e., listening comprehension). Using the construction-integration model of text comprehension as an overarching theoretical framework, we examined direct and mediated relations of foundational cognitive skills (working memory and attention), foundational language skills (vocabulary and grammatical knowledge), and higher-order cognitive skills (inference, theory of mind, and comprehension monitoring) to listening comprehension. A total of 201 first grade children in South Korea participated in the study. Structural equation modeling results showed that listening comprehension is directly predicted by working memory, grammatical knowledge, inference, and theory of mind and is indirectly predicted by attention, vocabulary, and comprehension monitoring. The total effects were .46 for working memory, .07 for attention, .30 for vocabulary, .49 for grammatical knowledge, .31 for inference, .52 for theory of mind, and .18 for comprehension monitoring. These results suggest that multiple language and cognitive skills make contributions to listening comprehension, and their contributions are both direct and indirect. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. The Relation between Executive Functioning, Reaction Time, Naming Speed, and Single Word Reading in Children with Typical Development and Language Impairments

    ERIC Educational Resources Information Center

    Messer, David; Henry, Lucy A.; Nash, Gilly

    2016-01-01

    Background: Few investigations have examined the relationship between a comprehensive range of executive functioning (EF) abilities and reading. Aims: Our investigation identified components of EF that independently predicted single word reading, and determined whether their predictive role remained when additional variables were included in the…

  19. The Ins and Outs of Evaluating Web-Scale Discovery Services

    ERIC Educational Resources Information Center

    Hoeppner, Athena

    2012-01-01

    Librarians are familiar with the single-line form, the consolidated index, which represents a very large portion of a library's print and online collection. Their end users are familiar with the idea of a single search across a comprehensive index that produces a large, relevancy-ranked results list. Even though most patrons would not recognize…

  20. EFL Learners' Multiple Documents Literacy: Effects of a Strategy-Directed Intervention Program

    ERIC Educational Resources Information Center

    Karimi, Mohammad Nabi

    2015-01-01

    There is a substantial body of L2 research documenting the central role of strategy instruction in reading comprehension. However, this line of research has been conducted mostly within the single text paradigm of reading research. With reading literacy undergoing a marked shift from single source reading to multiple documents literacy, little is…

  1. A Comprehensive Model of Cancer-Related Information Seeking Applied to Magazines.

    ERIC Educational Resources Information Center

    Johnson, J. David; Meischke, Hendrika

    1993-01-01

    Examines a comprehensive model of information seeking resulting from the synthesis of three theoretical research streams: the health belief model, uses and gratifications research, and a model of media exposure. Suggests that models of information seeking from mass media should focus on purely communicative factors. (RS)

  2. Building Comprehensive High School Guidance Programs through the Smaller Learning Communities Model

    ERIC Educational Resources Information Center

    Harper, Geralyn

    2013-01-01

    Despite many reform initiatives, including the federally funded initiative titled the Smaller Learning Communities' (SLC) Model, many students are still underexposed to comprehensive guidance programs. The purpose of this mixed method project study was to examine which components in a comprehensive guidance program for the learning academies at a…

  3. High-recovery visual identification and single-cell retrieval of circulating tumor cells for genomic analysis using a dual-technology platform integrated with automated immunofluorescence staining.

    PubMed

    Campton, Daniel E; Ramirez, Arturo B; Nordberg, Joshua J; Drovetto, Nick; Clein, Alisa C; Varshavskaya, Paulina; Friemel, Barry H; Quarre, Steve; Breman, Amy; Dorschner, Michael; Blau, Sibel; Blau, C Anthony; Sabath, Daniel E; Stilwell, Jackie L; Kaldjian, Eric P

    2015-05-06

    Circulating tumor cells (CTCs) are malignant cells that have migrated from solid cancers into the blood, where they are typically present in rare numbers. There is great interest in using CTCs to monitor response to therapies, to identify clinically actionable biomarkers, and to provide a non-invasive window on the molecular state of a tumor. Here we characterize the performance of the AccuCyte®--CyteFinder® system, a comprehensive, reproducible and highly sensitive platform for collecting, identifying and retrieving individual CTCs from microscopic slides for molecular analysis after automated immunofluorescence staining for epithelial markers. All experiments employed a density-based cell separation apparatus (AccuCyte) to separate nucleated cells from the blood and transfer them to microscopic slides. After staining, the slides were imaged using a digital scanning microscope (CyteFinder). Precisely counted model CTCs (mCTCs) from four cancer cell lines were spiked into whole blood to determine recovery rates. Individual mCTCs were removed from slides using a single-cell retrieval device (CytePicker™) for whole genome amplification and subsequent analysis by PCR and Sanger sequencing, whole exome sequencing, or array-based comparative genomic hybridization. Clinical CTCs were evaluated in blood samples from patients with different cancers in comparison with the CellSearch® system. AccuCyte--CyteFinder presented high-resolution images that allowed identification of mCTCs by morphologic and phenotypic features. Spike-in mCTC recoveries were between 90 and 91%. More than 80% of single-digit spike-in mCTCs were identified and even a single cell in 7.5 mL could be found. Analysis of single SKBR3 mCTCs identified presence of a known TP53 mutation by both PCR and whole exome sequencing, and confirmed the reported karyotype of this cell line. Patient sample CTC counts matched or exceeded CellSearch CTC counts in a small feasibility cohort. The AccuCyte--CyteFinder system is a comprehensive and sensitive platform for identification and characterization of CTCs that has been applied to the assessment of CTCs in cancer patient samples as well as the isolation of single cells for genomic analysis. It thus enables accurate non-invasive monitoring of CTCs and evolving cancer biology for personalized, molecularly-guided cancer treatment.

  4. Coding stimulus amplitude by correlated neural activity

    NASA Astrophysics Data System (ADS)

    Metzen, Michael G.; Ávila-Åkerberg, Oscar; Chacron, Maurice J.

    2015-04-01

    While correlated activity is observed ubiquitously in the brain, its role in neural coding has remained controversial. Recent experimental results have demonstrated that correlated but not single-neuron activity can encode the detailed time course of the instantaneous amplitude (i.e., envelope) of a stimulus. These studies have furthermore demonstrated that such coding required and was optimal for a nonzero level of neural variability. However, a theoretical understanding of these results is still lacking. Here we provide a comprehensive theoretical framework explaining these experimental findings. Specifically, we use linear response theory to derive an expression relating the correlation coefficient to the instantaneous stimulus amplitude, which takes into account key single-neuron properties such as firing rate and variability as quantified by the coefficient of variation. The theoretical prediction was in excellent agreement with numerical simulations of various integrate-and-fire type neuron models for various parameter values. Further, we demonstrate a form of stochastic resonance: optimal coding of stimulus variance by correlated activity occurs at a nonzero noise intensity. Thus, our results provide a theoretical explanation of the phenomenon by which correlated but not single-neuron activity can code for stimulus amplitude and how key single-neuron properties such as firing rate and variability influence such coding. Coding by correlated but not single-neuron activity is thus predicted to be a ubiquitous feature of sensory processing for neurons responding to weak input.
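
    The basic intuition can be conveyed with a toy linear-response sketch in Python, far simpler than the integrate-and-fire simulations used in the paper: two responses share a stimulus-driven component of amplitude A plus independent noise of standard deviation σ, so their correlation coefficient ρ = A²·Var(s) / (A²·Var(s) + σ²) grows with the instantaneous stimulus amplitude. All numbers below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 200_000

        def pair_correlation(signal_amp, noise_std=1.0):
            """Correlation between two noisy responses sharing a common signal."""
            s = rng.standard_normal(n)                               # shared stimulus fluctuations
            r1 = signal_amp * s + noise_std * rng.standard_normal(n)  # response of neuron 1
            r2 = signal_amp * s + noise_std * rng.standard_normal(n)  # response of neuron 2
            return np.corrcoef(r1, r2)[0, 1]

        for amp in (0.25, 0.5, 1.0, 2.0):
            print(f"signal amplitude {amp:4.2f}: pairwise correlation = {pair_correlation(amp):.2f}")

    In this toy, each single response alone looks like noise of a fixed variance budget, while the pairwise correlation tracks the stimulus amplitude, which is the qualitative effect the paper formalizes with linear response theory.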

  5. Single cell isolation process with laser induced forward transfer.

    PubMed

    Deng, Yu; Renaud, Philippe; Guo, Zhongning; Huang, Zhigang; Chen, Ying

    2017-01-01

    A viable single cell is crucial for studies of single cell biology. In this paper, laser-induced forward transfer (LIFT) was used to isolate individual cells in a closed chamber designed to avoid contamination and maintain humidity. HeLa cells were used to study the impact of laser pulse energy, laser spot size, sacrificial layer thickness and working distance. The size distribution, number and proliferation ratio of separated cells were statistically evaluated. Glycerol was used to increase the viscosity of the medium and alginate was introduced to soften the landing process. The role of laser pulse energy, spot size and titanium thickness in energy absorption during the LIFT process was theoretically analyzed with the Lambert-Beer law and a thermal conduction model. After comprehensive analysis, mechanical damage was found to be the dominant factor affecting the size and proliferation ratio of the isolated cells. An orthogonal experiment was conducted, and the optimal conditions were determined as: laser pulse energy, 9 μJ; spot size, 60 μm; thickness of titanium, 12 nm; working distance, 700 μm; glycerol, 2%; and alginate depth, greater than 1 μm. With these conditions, along with continuous incubation, a single cell could be transferred by LIFT with one shot, with limited effect on cell size and viability. LIFT conducted in a closed chamber under optimized conditions is a promising method for reliably isolating single cells.
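
    The Lambert-Beer part of that analysis is straightforward to sketch: neglecting reflection, the fraction of pulse energy absorbed in a titanium layer of thickness d is 1 − exp(−α·d). In the Python sketch below the absorption coefficient is an assumed order-of-magnitude value, not the one used in the paper.

        import numpy as np

        alpha_ti = 5.0e7          # 1/m, assumed absorption coefficient of titanium at the laser wavelength
        pulse_energy = 9e-6       # J, laser pulse energy from the reported optimal setting

        for d_nm in (6, 12, 24):
            d = d_nm * 1e-9
            absorbed_fraction = 1.0 - np.exp(-alpha_ti * d)
            print(f"d = {d_nm:3d} nm  absorbed fraction = {absorbed_fraction:.2f}  "
                  f"absorbed energy = {absorbed_fraction * pulse_energy * 1e6:.1f} uJ")

    Such a calculation shows why the sacrificial layer thickness trades off energy coupling against the mechanical impulse delivered to the cell, which the paper identifies as the dominant damage mechanism.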

  6. A Comprehensive Model of the Meteoroids Environment Around Mercury

    NASA Astrophysics Data System (ADS)

    Pokorny, P.; Sarantos, M.; Janches, D.

    2018-05-01

    We present a comprehensive dynamical model for the meteoroid environment around Mercury, comprising meteoroids originating from asteroids and from short- and long-period comets. Our model is fully calibrated and provides predictions for different values of Mercury's true anomaly angle (TAA).

  7. Monitoring and Assessment of Youshui River Water Quality in Youyang

    NASA Astrophysics Data System (ADS)

    Wang, Xue-qin; Wen, Juan; Chen, Ping-hua; Liu, Na-na

    2018-02-01

    By monitoring the water quality of Youshui River from January 2016 to December 2016, and according to the indicator grading and water quality assessment standards, formulas for three types of water quality index are established. These three indexes, the single indicator index Ai, the single moment index Ak and the comprehensive water quality index A, were used to quantitatively evaluate individual indicators, the overall water quality, and the change of water quality over time. The results show that the total phosphorus and fecal coliform indicators exceeded the standard, while the other 16 measured indicators met it. The comprehensive water quality index of Youshui River is 0.93 and the assessment grade is level 2, which indicates that the water quality of Youshui River is good, with room for further improvement. To this end, several protection measures for Youshui River environmental management and pollution treatment are proposed.
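
    The paper's exact index formulas are not reproduced in the record above, so the following sketch only illustrates one common construction consistent with the description: each measurement is scored against its standard limit, then averaged per indicator (Ai), per sampling moment (Ak), and overall (A). All concentrations and limits are fabricated illustrative numbers.

    import numpy as np

    indicators = ["total_phosphorus", "fecal_coliform", "ammonia_N"]
    limits = np.array([0.2, 10000.0, 1.0])          # assumed class limits per indicator

    # rows = monthly sampling moments, columns = indicators (fabricated demo data)
    conc = np.array([
        [0.25, 12000.0, 0.40],
        [0.18,  9000.0, 0.35],
        [0.22, 11000.0, 0.50],
    ])

    scores = conc / limits                          # >1 means the standard is exceeded
    A_i = scores.mean(axis=0)                       # single indicator index, per column
    A_k = scores.mean(axis=1)                       # single moment index, per row
    A = scores.mean()                               # comprehensive index

    for name, ai in zip(indicators, A_i):
        flag = "exceeds standard" if ai > 1 else "within standard"
        print(f"A_i[{name}] = {ai:.2f} ({flag})")
    print("A_k per moment:", np.round(A_k, 2))
    print("comprehensive A =", round(float(A), 2))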

  8. Rapid detection of single bacteria in unprocessed blood using Integrated Comprehensive Droplet Digital Detection

    PubMed Central

    Kang, Dong-Ku; Ali, M. Monsur; Zhang, Kaixiang; Huang, Susan S.; Peterson, Ellena; Digman, Michelle A.; Gratton, Enrico; Zhao, Weian

    2014-01-01

    Blood stream infection or sepsis is a major health problem worldwide, with extremely high mortality, which is partly due to the inability to rapidly detect and identify bacteria in the early stages of infection. Here we present a new technology termed ‘Integrated Comprehensive Droplet Digital Detection’ (IC 3D) that can selectively detect bacteria directly from milliliters of diluted blood at single-cell sensitivity in a one-step, culture- and amplification-free process within 1.5–4 h. The IC 3D integrates real-time, DNAzyme-based sensors, droplet microencapsulation and a high-throughput 3D particle counter system. Using Escherichia coli as a target, we demonstrate that the IC 3D can provide absolute quantification of both stock and clinical isolates of E. coli in spiked blood within a broad range of extremely low concentration from 1 to 10,000 bacteria per ml with exceptional robustness and limit of detection in the single digit regime. PMID:25391809

  9. Photogrammetry Applied to Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Liu, Tian-Shu; Cattafesta, L. N., III; Radeztsky, R. H.; Burner, A. W.

    2000-01-01

    In image-based measurements, quantitative image data must be mapped to three-dimensional object space. Analytical photogrammetric methods, which may be used to accomplish this task, are discussed from the viewpoint of experimental fluid dynamicists. The Direct Linear Transformation (DLT) for camera calibration, as used in pressure-sensitive paint measurements, is summarized. An optimization method for camera calibration is developed that can be used to determine the camera calibration parameters, including those describing lens distortion, from a single image. Combined with the DLT method, this method allows a rapid and comprehensive in-situ camera calibration and therefore is particularly useful for quantitative flow visualization and other measurements such as model attitude and deformation in production wind tunnels. The paper also includes a brief description of typical photogrammetric applications to temperature- and pressure-sensitive paint measurements and model deformation measurements in wind tunnels.
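
    The DLT step summarized above amounts to solving a homogeneous linear system for the 3 x 4 projection matrix from known target points and their image coordinates. The sketch below recovers a synthetic camera from noiseless correspondences via SVD; it omits the lens-distortion terms that the paper's optimization method adds, and the camera and point data are made up for illustration.

    import numpy as np

    def dlt(object_pts, image_pts):
        """object_pts: (N,3) world coords; image_pts: (N,2) pixel coords; N >= 6."""
        rows = []
        for (X, Y, Z), (u, v) in zip(object_pts, image_pts):
            rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
            rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
        A = np.asarray(rows, dtype=float)
        _, _, vt = np.linalg.svd(A)
        return vt[-1].reshape(3, 4)           # right singular vector of smallest value

    # Synthetic check: project random points with a known camera, then recover it.
    rng = np.random.default_rng(1)
    P_true = np.array([[800, 0, 320, 10.0], [0, 800, 240, 5.0], [0, 0, 1, 2.0]])
    X = np.c_[rng.uniform(-1, 1, (12, 2)), rng.uniform(2, 4, 12)]
    x_h = (P_true @ np.c_[X, np.ones(12)].T).T
    uv = x_h[:, :2] / x_h[:, 2:3]

    P_est = dlt(X, uv)
    P_est *= P_true[2, 3] / P_est[2, 3]       # fix the arbitrary overall scale
    print("max abs error in recovered P:", np.abs(P_est - P_true).max())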

  10. Bayesian Networks Predict Neuronal Transdifferentiation.

    PubMed

    Ainsworth, Richard I; Ai, Rizi; Ding, Bo; Li, Nan; Zhang, Kai; Wang, Wei

    2018-05-30

    We employ the language of Bayesian networks to systematically construct gene-regulation topologies from deep-sequencing single-nucleus RNA-Seq data for human neurons. From the perspective of the cell-state potential landscape, we identify attractors that correspond closely to different neuron subtypes. Attractors are also recovered for cell states from an independent data set, confirming our model's accurate description of global genetic regulations across differing cell types of the neocortex (not included in the training data). Our model recovers experimentally confirmed genetic regulations, and community analysis reveals genetic associations in common pathways. Via a comprehensive scan of all theoretical three-gene knockout and overexpression perturbations, we discover novel neuronal transdifferentiation recipes (including perturbations of SATB2, GAD1, POU6F2 and ADARB2) for excitatory projection neuron and inhibitory interneuron subtypes. Copyright © 2018, G3: Genes, Genomes, Genetics.

  11. Positional estimation techniques for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Nandhakumar, N.; Aggarwal, J. K.

    1990-01-01

    Techniques for positional estimation of a mobile robot navigating in an indoor environment are described. A comprehensive review of the various positional estimation techniques studied in the literature is first presented. The techniques are divided into four different types, and each of them is discussed briefly. Two different kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is assumed to be given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.

  12. Total generalized variation-regularized variational model for single image dehazing

    NASA Astrophysics Data System (ADS)

    Shu, Qiao-Ling; Wu, Chuan-Sheng; Zhong, Qiu-Xiang; Liu, Ryan Wen

    2018-04-01

    Imaging quality is often significantly degraded under hazy weather conditions. The purpose of this paper is to recover the latent sharp image from its hazy version. It is well known that accurate estimation of depth information can assist in improving dehazing performance. In this paper, a detail-preserving variational model is proposed to simultaneously estimate the haze-free image and the depth map. In particular, total variation (TV) and total generalized variation (TGV) regularizers are introduced to regularize the haze-free image and the depth map, respectively. The resulting nonsmooth optimization problem is efficiently solved using the alternating direction method of multipliers (ADMM). Comprehensive experiments have been conducted on realistic datasets to compare the proposed method with several state-of-the-art dehazing methods. Results illustrate the superior performance of the proposed method in terms of visual quality.

  13. High-resolution age modelling of peat bog profiles using pre and post-bomb 14C, 210Pb and cryptotephra data from six Albertan peat bogs

    NASA Astrophysics Data System (ADS)

    Davies, L. J.; Froese, D. G.; Appleby, P.; van Bellen, S.; Magnan, G.; Mullan-Boudreau, G.; Noernberg, T.; Shotyk, W.; Zaccone, C.

    2016-12-01

    Age modelling of recent peat profiles is frequently undertaken for high-resolution modern studies, but the most common techniques applied (e.g. 14C, 210Pb, cryptotephra) are rarely combined and used for testing and inter-comparison. Here, we integrate three age-dating approaches to produce a single age model to comprehensively investigate variations in the chronometers and individual site histories since 1900. OxCal's P_Sequence function is used to model dates produced using 14C (pre- and post-bomb), 210Pb (corroborated with 137Cs and 241Am) from six peat bogs in central and northern Alberta. Physical and chemical characteristics of the cores (e.g. macrofossils, humification, ash content, dry density) provide important constraints for the model by highlighting periods with significant changes in accumulation rate (e.g. fire events, permafrost development, prolonged surficial drying). Sub-cm resolution output shows there are consistent differences in how the 14C and 210Pb signals are preserved in peat profiles, with 14C commonly showing a slight bias toward older ages at the same depth relative to 210Pb data. These methods can successfully be combined in a Bayesian model and used to produce a single age model that more accurately accounts for the uncertainties inherent in each method. Understanding these differences and combining the results of these methods results in a stronger chronology at each site investigated here despite observed differences in ecological setting, accumulation rates, fire events/frequency and permafrost development.

  14. The Little State That Couldn't Could? The Politics of "Single-Payer" Health Coverage in Vermont.

    PubMed

    Fox, Ashley M; Blanchet, Nathan J

    2015-06-01

    In May 2011, a year after the passage of the Affordable Care Act (ACA), Vermont became the first state to lay the groundwork for a single-payer health care system, known as Green Mountain Care. What can other states learn from the Vermont experience? This article summarizes the findings from interviews with nearly 120 stakeholders as part of a study to inform the design of the health reform legislation. Comparing Vermont's failed effort to adopt single-payer legislation in 1994 to present efforts, we find that Vermont faced similar challenges but greater opportunities in 2010 that enabled reform. A closely contested gubernatorial election and a progressive social movement opened a window of opportunity to advance legislation to design three comprehensive health reform options for legislative consideration. With a unified Democratic government under the leadership of a single-payer proponent, a high-profile policy proposal, and relatively weak opposition, a framework for a single-payer system was adopted by the legislature - though with many details and political battles to be fought in the future. Other states looking to reform their health systems more comprehensively than national reform can learn from Vermont's design and political strategy. Copyright © 2015 by Duke University Press.

  15. Examining General and Specific Factors in the Dimensionality of Oral Language and Reading in 4th–10th Grades

    PubMed Central

    Foorman, Barbara R.; Koon, Sharon; Petscher, Yaacov; Mitchell, Alison; Truckenmiller, Adrea

    2015-01-01

    The objective of this study was to explore dimensions of oral language and reading and their influence on reading comprehension in a relatively understudied population—adolescent readers in 4th through 10th grades. The current study employed latent variable modeling of decoding fluency, vocabulary, syntax, and reading comprehension so as to represent these constructs with minimal error and to examine whether residual variance unaccounted for by oral language can be captured by specific factors of syntax and vocabulary. A 1-, 3-, 4-, and bifactor model were tested with 1,792 students in 18 schools in 2 large urban districts in the Southeast. Students were individually administered measures of expressive and receptive vocabulary, syntax, and decoding fluency in mid-year. At the end of the year students took the state reading test as well as a group-administered, norm-referenced test of reading comprehension. The bifactor model fit the data best in all 7 grades and explained 72% to 99% of the variance in reading comprehension. The specific factors of syntax and vocabulary explained significant unique variance in reading comprehension in 1 grade each. The decoding fluency factor was significantly correlated with the reading comprehension and oral language factors in all grades, but, in the presence of the oral language factor, was not significantly associated with the reading comprehension factor. Results support a bifactor model of lexical knowledge rather than the 3-factor model of the Simple View of Reading, with the vast amount of variance in reading comprehension explained by a general oral language factor. PMID:26346839

  16. Single column comprehensive analysis of pharmaceutical preparations using dual-injection mixed-mode (ion-exchange and reversed-phase) and hydrophilic interaction liquid chromatography.

    PubMed

    Kazarian, Artaches A; Taylor, Mark R; Haddad, Paul R; Nesterenko, Pavel N; Paull, Brett

    2013-12-01

    The comprehensive separation and detection of hydrophobic and hydrophilic active pharmaceutical ingredients (APIs), their counter-ions (organic, inorganic) and excipients, using a single mixed-mode chromatographic column, and a dual injection approach is presented. Using a mixed-mode Thermo Fisher Acclaim Trinity P1 column, APIs, their counter-ions and possible degradants were first separated using a combination of anion-exchange, cation-exchange and hydrophobic interactions, using a mobile phase consisting of a dual organic modifier/salt concentration gradient. A complementary method was also developed using the same column for the separation of hydrophilic bulk excipients, using hydrophilic interaction liquid chromatography (HILIC) under high organic solvent mobile phase conditions. These two methods were then combined within a single gradient run using dual sample injection, with the first injection at the start of the applied gradient (mixed-mode retention of solutes), followed by a second sample injection at the end of the gradient (HILIC retention of solutes). Detection using both ultraviolet absorbance and refractive index enabled the sensitive detection of APIs and UV-absorbing counter-ions, together with quantitative determination of bulk excipients. The developed approach was applied successfully to the analysis of dry powder inhalers (Flixotide®, Spiriva®), enabling comprehensive quantification of all APIs and excipients in the sample. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Time-lapse electrical impedance spectroscopy for monitoring the cell cycle of single immobilized S. pombe cells.

    PubMed

    Zhu, Zhen; Frey, Olivier; Haandbaek, Niels; Franke, Felix; Rudolf, Fabian; Hierlemann, Andreas

    2015-11-26

    As a complement and alternative to optical methods, wide-band electrical impedance spectroscopy (EIS) enables multi-parameter, label-free and real-time detection of cellular and subcellular features. We report on a microfluidics-based system designed to reliably capture single rod-shaped Schizosaccharomyces pombe cells by applying suction through orifices in a channel wall. The system enables subsequent culturing of immobilized cells in an upright position, while dynamic changes in cell-cycle state and morphology were continuously monitored through EIS over a broad frequency range. Besides measuring cell growth, clear impedance signals for nuclear division have been obtained. The EIS system has been characterized with respect to sensitivity and detection limits. The spatial resolution in measuring cell length was 0.25 μm, which corresponds to approximately a 5-min interval of cell growth under standard conditions. The comprehensive impedance data sets were also used to determine the occurrence of nuclear division and cytokinesis. The obtained results have been validated through concurrent confocal imaging and plausibilized through comparison with finite-element modeling data. The possibility to monitor cellular and intracellular features of single S. pombe cells during the cell cycle at high spatiotemporal resolution renders the presented microfluidics-based EIS system a suitable tool for dynamic single-cell investigations.

  18. Time-lapse electrical impedance spectroscopy for monitoring the cell cycle of single immobilized S. pombe cells

    PubMed Central

    Zhu, Zhen; Frey, Olivier; Haandbaek, Niels; Franke, Felix; Rudolf, Fabian; Hierlemann, Andreas

    2015-01-01

    As a complement and alternative to optical methods, wide-band electrical impedance spectroscopy (EIS) enables multi-parameter, label-free and real-time detection of cellular and subcellular features. We report on a microfluidics-based system designed to reliably capture single rod-shaped Schizosaccharomyces pombe cells by applying suction through orifices in a channel wall. The system enables subsequent culturing of immobilized cells in an upright position, while dynamic changes in cell-cycle state and morphology were continuously monitored through EIS over a broad frequency range. Besides measuring cell growth, clear impedance signals for nuclear division have been obtained. The EIS system has been characterized with respect to sensitivity and detection limits. The spatial resolution in measuring cell length was 0.25 μm, which corresponds to approximately a 5-min interval of cell growth under standard conditions. The comprehensive impedance data sets were also used to determine the occurrence of nuclear division and cytokinesis. The obtained results have been validated through concurrent confocal imaging and plausibilized through comparison with finite-element modeling data. The possibility to monitor cellular and intracellular features of single S. pombe cells during the cell cycle at high spatiotemporal resolution renders the presented microfluidics-based EIS system a suitable tool for dynamic single-cell investigations. PMID:26608589

  19. Entropy-Based Performance Analysis of Jet Engines; Methodology and Application to a Generic Single-Spool Turbojet

    NASA Astrophysics Data System (ADS)

    Abbas, Mohammad

    Recently developed methodology that provides the direct assessment of traditional thrust-based performance of aerospace vehicles in terms of entropy generation (i.e., exergy destruction) is modified for stand-alone jet engines. This methodology is applied to a specific single-spool turbojet engine configuration. A generic compressor performance map, along with modeled engine component performance characterizations, is utilized in order to provide comprehensive traditional engine performance results (engine thrust, mass capture, and RPM) for on- and off-design engine operation. Details of exergy losses in engine components, across the entire engine, and in the engine wake are provided, and the engine performance penalties associated with these losses are discussed. Results are provided across the engine operating envelope as defined by operational ranges of flight Mach number, altitude, and fuel throttle setting. The exergy destruction that occurs in the engine wake is shown to be dominant with respect to other losses, including all exergy losses that occur inside the engine. Specifically, the ratio of the exergy destruction rate in the wake to the exergy destruction rate inside the engine itself ranges from 1 to 2.5 across the operational envelope of the modeled engine.
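
    As a toy illustration of the entropy/exergy bookkeeping used in this kind of analysis, the sketch below evaluates the exergy destruction across a single adiabatic compressor stage from its isentropic efficiency, using the Gouy-Stodola relation (exergy destruction = T0 times entropy generation). Air is treated as an ideal gas with constant properties, and all operating numbers are assumed for illustration rather than taken from the thesis.

    import math

    cp, R = 1005.0, 287.0          # J/(kg K), air
    T0 = 288.15                    # dead-state (ambient) temperature, K
    T1, P1 = 288.15, 101325.0      # compressor inlet
    pr, eta_c = 8.0, 0.85          # pressure ratio and isentropic efficiency
    m_dot = 20.0                   # mass flow, kg/s

    gamma = cp / (cp - R)
    T2s = T1 * pr ** ((gamma - 1) / gamma)          # ideal (isentropic) exit temperature
    T2 = T1 + (T2s - T1) / eta_c                    # actual exit temperature

    ds = cp * math.log(T2 / T1) - R * math.log(pr)  # specific entropy rise, J/(kg K)
    exergy_destroyed = T0 * m_dot * ds              # Gouy-Stodola theorem, W

    print(f"actual exit temperature: {T2:.1f} K")
    print(f"entropy generation rate: {m_dot*ds/1e3:.2f} kW/K")
    print(f"exergy destruction rate: {exergy_destroyed/1e3:.1f} kW")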

  20. COMPREHENSIVE PBPK MODELING APPROACH USING THE EXPOSURE RELATED DOSE ESTIMATING MODEL (ERDEM)

    EPA Science Inventory

    ERDEM, a complex PBPK modeling system, is the result of the implementation of a comprehensive PBPK modeling approach. ERDEM provides a scalable and user-friendly environment that enables researchers to focus on data input values rather than writing program code. It efficiently ...

  1. The Soldier Fitness Tracker: global delivery of Comprehensive Soldier Fitness.

    PubMed

    Fravell, Mike; Nasser, Katherine; Cornum, Rhonda

    2011-01-01

    Carefully implemented technology strategies are vital to the success of large-scale initiatives such as the U.S. Army's Comprehensive Soldier Fitness (CSF) program. Achieving the U.S. Army's vision for CSF required a robust information technology platform that was scaled to millions of users and that leveraged the Internet to enable global reach. The platform needed to be agile, provide powerful real-time reporting, and have the capacity to quickly transform to meet emerging requirements. Existing organizational applications, such as "Single Sign-On," and authoritative data sources were exploited to the maximum extent possible. Development of the "Soldier Fitness Tracker" is the most recent, and possibly the best, demonstration of the potential benefits possible when existing organizational capabilities are married to new, innovative applications. Combining the capabilities of the extant applications with the newly developed applications expedited development, eliminated redundant data collection, resulted in the exceeding of program objectives, and produced a comfortable experience for the end user, all in less than six months. This is a model for future technology integration. (c) 2010 APA, all rights reserved.

  2. Toward a Model of Text Comprehension and Production.

    ERIC Educational Resources Information Center

    Kintsch, Walter; Van Dijk, Teun A.

    1978-01-01

    Described is the system of mental operations occurring in text comprehension and in recall and summarization. A processing model is outlined: 1) the meaning elements of a text become organized into a coherent whole, 2) the full meaning of the text is condensed into its gist, and 3) new texts are generated from the comprehension processes.…

  3. The Flipped Classroom Model to Develop Egyptian EFL Students' Listening Comprehension

    ERIC Educational Resources Information Center

    Ahmad, Samah Zakareya

    2016-01-01

    The present study aimed at investigating the effect of the flipped classroom model on Egyptian EFL students' listening comprehension. A one-group pre-posttest design was adopted. Thirty-four 3rd-year EFL students at the Faculty of Education, Suez University, were pretested on listening comprehension before the experiment and then posttested after…

  4. Validation of a Cognitive Diagnostic Model across Multiple Forms of a Reading Comprehension Assessment

    ERIC Educational Resources Information Center

    Clark, Amy K.

    2013-01-01

    The present study sought to fit a cognitive diagnostic model (CDM) across multiple forms of a passage-based reading comprehension assessment using the attribute hierarchy method. Previous research on CDMs for reading comprehension assessments served as a basis for the attributes in the hierarchy. The two attribute hierarchies were fit to data from…

  5. A Longitudinal Study of Reading Comprehension Achievement from Grades 3 to 10: Investigating Models of Stability, Cumulative Growth, and Compensation

    ERIC Educational Resources Information Center

    Kwiatkowska-White, Bozena; Kirby, John R.; Lee, Elizabeth A.

    2016-01-01

    This longitudinal study of 78 Canadian English-speaking students examined the applicability of the stability, cumulative, and compensatory models in reading comprehension development. Archival government-mandated assessments of reading comprehension at Grades 3, 6, and 10, and the Canadian Test of Basic Skills measure of reading comprehension…

  6. Coherence Threshold and the Continuity of Processing: The RI-Val Model of Comprehension

    ERIC Educational Resources Information Center

    O'Brien, Edward J.; Cook, Anne E.

    2016-01-01

    Common to all models of reading comprehension is the assumption that a reader's level of comprehension is heavily influenced by their standards of coherence (van den Broek, Risden, & Husbye-Hartman, 1995). Our discussion focuses on a subcomponent of the readers' standards of coherence: the coherence threshold. We situate this discussion within…

  7. The Role of Elaboration in the Comprehension and Retention of Prose: A Critical Review.

    ERIC Educational Resources Information Center

    Reder, Lynne M.

    1980-01-01

    Recent research in the area of prose comprehension is reviewed, including factors that affect amount of recall, representations of text structures, and use of world knowledge to aid comprehension. The need for more information processing models of comprehension is emphasized. Elaboration is considered important for comprehension and retention.…

  8. Direct and Mediated Effects of Language and Cognitive Skills on Comprehension of Oral Narrative Texts (Listening Comprehension) for Children

    ERIC Educational Resources Information Center

    Kim, Young-Suk Grace

    2016-01-01

    We investigated component language and cognitive skills of oral language comprehension of narrative texts (i.e., listening comprehension). Using the construction--integration model of text comprehension as an overarching theoretical framework, we examined direct and mediated relations of foundational cognitive skills (working memory and…

  9. Deep phenotyping of human induced pluripotent stem cell-derived atrial and ventricular cardiomyocytes.

    PubMed

    Cyganek, Lukas; Tiburcy, Malte; Sekeres, Karolina; Gerstenberg, Kathleen; Bohnenberger, Hanibal; Lenz, Christof; Henze, Sarah; Stauske, Michael; Salinas, Gabriela; Zimmermann, Wolfram-Hubertus; Hasenfuss, Gerd; Guan, Kaomei

    2018-06-21

    Generation of homogeneous populations of subtype-specific cardiomyocytes (CMs) derived from human induced pluripotent stem cells (iPSCs) and their comprehensive phenotyping is crucial for a better understanding of the subtype-related disease mechanisms and as tools for the development of chamber-specific drugs. The goals of this study were to apply a simple and efficient method for differentiation of iPSCs into defined functional CM subtypes in feeder-free conditions and to obtain a comprehensive understanding of the molecular, cell biological, and functional properties of atrial and ventricular iPSC-CMs on both the single-cell and engineered heart muscle (EHM) level. By a stage-specific activation of retinoic acid signaling in monolayer-based and well-defined culture, we showed that cardiac progenitors can be directed towards a highly homogeneous population of atrial CMs. By combining the transcriptome and proteome profiling of the iPSC-CM subtypes with functional characterizations via optical action potential and calcium imaging, and with contractile analyses in EHM, we demonstrated that atrial and ventricular iPSC-CMs and -EHM highly correspond to the atrial and ventricular heart muscle, respectively. This study provides a comprehensive understanding of the molecular and functional identities characteristic of atrial and ventricular iPSC-CMs and -EHM and supports their suitability in disease modeling and chamber-specific drug screening.

  10. LipidPedia: a comprehensive lipid knowledgebase.

    PubMed

    Kuo, Tien-Chueh; Tseng, Yufeng Jane

    2018-04-10

    Lipids are divided into fatty acyls, glycerolipids, glycerophospholipids, sphingolipids, saccharolipids, sterols, prenol lipids and polyketides. Fatty acyls and glycerolipids are commonly used as energy storage, whereas glycerophospholipids, sphingolipids, sterols and saccharolipids are commonly used as components of cell membranes. Lipids in the fatty acyl, glycerophospholipid, sphingolipid and sterol classes play important roles in signaling. Although more than 36 million lipids can be identified or computationally generated, no single lipid database provides comprehensive information on lipids. Furthermore, the complex systematic or common names of lipids make the discovery of related information challenging. Here, we present LipidPedia, a comprehensive lipid knowledgebase. The content of this database is derived from integrating annotation data with full-text mining of 3,923 lipids and more than 400,000 annotations of associated diseases, pathways, functions, and locations that are essential for interpreting lipid functions and mechanisms from over 1,400,000 scientific publications. Each lipid in LipidPedia also has its own entry containing a text summary curated from the most frequently cited diseases, pathways, genes, locations, functions, lipids and experimental models in the biomedical literature. LipidPedia aims to provide an overall synopsis of lipids to summarize lipid annotations and provide a detailed listing of references for understanding complex lipid functions and mechanisms. LipidPedia is available at http://lipidpedia.cmdm.tw. yjtseng@csie.ntu.edu.tw. Supplementary data are available at Bioinformatics online.

  11. The P3C2R+GIRD Paradigm of Creating a Reading Comprehension Lesson for EFL Students: From Conceptual Model to Model Lesson

    ERIC Educational Resources Information Center

    Karanjakwut, Chalermsup

    2017-01-01

    This academic article aims at creating a reading comprehension lesson with a new paradigm called the P3C2R+GIRD model, developed by an author with nine years of experience in teaching English reading skills who has found that one of the problems of EFL students in learning English is a lack of reading comprehension, which is an important skill…

  12. Collective Dynamics for Heterogeneous Networks of Theta Neurons

    NASA Astrophysics Data System (ADS)

    Luke, Tanushree

    Collective behavior in neural networks has often been used as an indicator of communication between different brain areas. These collective synchronization and desynchronization patterns are also considered an important feature in understanding normal and abnormal brain function. To understand the emergence of these collective patterns, I create an analytic model that identifies all such macroscopic steady-states attainable by a network of Type-I neurons. This network, whose basic unit is the model "theta" neuron, contains a mixture of excitable and spiking neurons coupled via a smooth pulse-like synapse. Applying the Ott-Antonsen reduction method in the thermodynamic limit, I obtain a low-dimensional evolution equation that describes the asymptotic dynamics of the macroscopic mean field of the network. This model can be used as the basis for understanding more complicated neuronal networks when additional dynamical features are included. From this reduced dynamical equation for the mean field, I show that the network exhibits three collective attracting steady-states. The first two are equilibrium states that both reflect partial synchronization in the network, whereas the third is a limit cycle in which the degree of network synchronization oscillates in time. In addition to a comprehensive identification of all possible attracting macro-states, this analytic model permits a complete bifurcation analysis of the collective behavior of the network with respect to three key network features: the degree of excitability of the neurons, the heterogeneity of the population, and the overall coupling strength. The network typically tends towards the two macroscopic equilibrium states when the neuron's intrinsic dynamics and the network interactions reinforce each other. In contrast, the limit cycle state, bifurcations, and multistability tend to occur when there is competition between these network features. I also outline here an extension of the above model where the neurons' excitability now varies in time sinusoidally, thus simulating a parabolic bursting network. This time-varying excitability can lead to the emergence of macroscopic chaos and multistability in the collective behavior of the network. Finally, I expand the single population model described above to examine a two-population neuronal network where each population has its own unique mixture of excitable and spiking neurons, as well as its own coupling strength (either excitatory or inhibitory in nature). Specifically, I consider the situation where the first population is only allowed to influence the second population without any feedback, thus effectively creating a feed-forward "driver-response" system. In this special arrangement, the driver's asymptotic macroscopic dynamics are fully explored in the comprehensive analysis of the single population. Then, in the presence of an influence from the driver, the modified dynamics of the second population, which now acts as a response population, can also be fully analyzed. As in the time-varying model, these modifications give rise to richer dynamics in the response population than those found in the single-population formalism, including multi-periodicity and chaos.
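
    A brute-force simulation gives a feel for the kind of network analyzed above before any mean-field reduction is applied. The sketch below integrates a population of theta neurons with Lorentzian-distributed excitability and a pulse-like coupling, and tracks the Kuramoto order parameter as the macroscopic observable; all parameter values are illustrative, and the dissertation's Ott-Antonsen reduction is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(2)
    N, dt, steps = 2000, 1e-3, 30000
    eta0, delta, k = 0.2, 0.1, 1.5           # excitability center, spread, coupling
    # Lorentzian-distributed excitability (clipped so the Euler step stays stable).
    eta = np.clip(eta0 + delta * np.tan(np.pi * (rng.random(N) - 0.5)), -10, 10)
    theta = rng.uniform(-np.pi, np.pi, N)

    order = []
    for step in range(steps):
        # Unnormalized pulse-like synaptic drive, peaked at the firing phase theta = pi.
        pulse = (1 - np.cos(theta)) ** 2
        I_syn = k * pulse.mean()
        dtheta = (1 - np.cos(theta)) + (1 + np.cos(theta)) * (eta + I_syn)
        theta = theta + dt * dtheta
        if step % 100 == 0:
            order.append(abs(np.exp(1j * theta).mean()))   # Kuramoto order parameter

    print("final |z| (degree of synchrony):", round(order[-1], 3))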

  13. Acoustic Predictions of Manned and Unmanned Rotorcraft Using the Comprehensive Analytical Rotorcraft Model for Acoustics (CARMA) Code System

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.; Burley, Casey L.; Conner, David A.

    2005-01-01

    The Comprehensive Analytical Rotorcraft Model for Acoustics (CARMA) is being developed under the Quiet Aircraft Technology Project within the NASA Vehicle Systems Program. The purpose of CARMA is to provide analysis tools for the design and evaluation of efficient low-noise rotorcraft, as well as support the development of safe, low-noise flight operations. The baseline prediction system of CARMA is presented and current capabilities are illustrated for a model rotor in a wind tunnel, a rotorcraft in flight and for a notional coaxial rotor configuration; however, a complete validation of the CARMA system capabilities with respect to a variety of measured databases is beyond the scope of this work. For the model rotor illustration, predicted rotor airloads and acoustics for a BO-105 model rotor are compared to test data from HART-II. For the flight illustration, acoustic data from an MD-520N helicopter flight test, which was conducted at Eglin Air Force Base in September 2003, are compared with CARMA full vehicle flight predictions. Predicted acoustic metrics at three microphone locations are compared for limited level flight and descent conditions. Initial acoustic predictions using CARMA for a notional coaxial rotor system are made. The effect of increasing the vertical separation between the rotors on the predicted airloads and acoustic results are shown for both aerodynamically non-interacting and aerodynamically interacting rotors. The sensitivity of including the aerodynamic interaction effects of each rotor on the other, especially when the rotors are in close proximity to one another is initially examined. The predicted coaxial rotor noise is compared to that of a conventional single rotor system of equal thrust, where both are of reasonable size for an unmanned aerial vehicle (UAV).

  14. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
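
    The variogram idea at the core of VARS can be sketched in a few lines: for each parameter, estimate the directional variogram of the model response at several perturbation scales and integrate it to obtain an IVARS-like score. The toy response surface, brute-force sampling and scale range below are illustrative assumptions; the actual toolbox uses star-based sampling (e.g., PLHS), parameter grouping and bootstrapping.

    import numpy as np

    def model(x):
        """Toy 3-parameter response surface on [0, 1]^3."""
        return np.sin(2 * np.pi * x[..., 0]) + 0.5 * x[..., 1] ** 2 + 0.1 * x[..., 2]

    rng = np.random.default_rng(3)
    n_base, hs = 500, np.linspace(0.02, 0.3, 15)     # base points and perturbation scales
    X = rng.random((n_base, 3))

    ivars = np.zeros(3)
    for i in range(3):
        gammas = []
        for h in hs:
            Xh = X.copy()
            Xh[:, i] = np.clip(X[:, i] + h, 0, 1)    # shift parameter i by h
            gamma = 0.5 * np.mean((model(Xh) - model(X)) ** 2)   # directional variogram
            gammas.append(gamma)
        ivars[i] = np.trapz(gammas, hs)              # integrate variogram over scales

    for i, v in enumerate(ivars):
        print(f"parameter x{i+1}: IVARS-like score = {v:.4f}")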

  15. Cross-disciplinary links in environmental systems science: Current state and claimed needs identified in a meta-review of process models.

    PubMed

    Ayllón, Daniel; Grimm, Volker; Attinger, Sabine; Hauhs, Michael; Simmer, Clemens; Vereecken, Harry; Lischeid, Gunnar

    2018-05-01

    Terrestrial environmental systems are characterised by numerous feedback links between their different compartments. However, scientific research is organized into disciplines that focus on processes within the respective compartments rather than on interdisciplinary links. Major feedback mechanisms between compartments might therefore have been systematically overlooked so far. Without identifying these gaps, initiatives on future comprehensive environmental monitoring schemes and experimental platforms might fail. We performed a comprehensive overview of feedbacks between compartments currently represented in environmental sciences and explored to what degree missing links have already been acknowledged in the literature. We focused on process models as they can be regarded as repositories of scientific knowledge that compile findings of numerous single studies. In total, 118 simulation models from 23 model types were analysed. Missing processes linking different environmental compartments were identified based on a meta-review of 346 published reviews, model intercomparison studies, and model descriptions. Eight disciplines of environmental sciences were considered and 396 linking processes were identified and ascribed to the physical, chemical or biological domain. There were significant differences between model types and scientific disciplines regarding implemented interdisciplinary links. The most widespread interdisciplinary links were between physical processes in meteorology, hydrology and soil science that drive or set the boundary conditions for other processes (e.g., ecological processes). In contrast, most chemical and biological processes were restricted to links within the same compartment. Integration of multiple environmental compartments and interdisciplinary knowledge was scarce in most model types. There was a strong bias of suggested future research foci and model extensions towards reinforcing existing interdisciplinary knowledge rather than opening up new interdisciplinary pathways. No clear pattern across disciplines exists with respect to suggested future research efforts. There is no evidence that environmental research would clearly converge towards more integrated approaches or towards an overarching environmental systems theory. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Understanding and Changing Food Consumption Behavior Among Children: The Comprehensive Child Consumption Patterns Model.

    PubMed

    Jeffries, Jayne K; Noar, Seth M; Thayer, Linden

    2015-01-01

    Current theoretical models attempting to explain diet-related weight status among children center around three individual-level theories. Alone, these theories fail to explain why children are engaging or not engaging in health-promoting eating behaviors. Our Comprehensive Child Consumption Patterns model takes a comprehensive approach and was developed specifically to help explain child food consumption behavior and addresses many of the theoretical gaps found in previous models, including integration of the life course trajectory, key influencers, perceived behavioral control, and self-regulation. The Comprehensive Child Consumption Patterns model highlights multiple levels of the socioecological model to explain child food consumption, illustrating how negative influences at multiple levels can lead to caloric imbalance and contribute to child overweight and obesity. Recognizing the necessity for multi-level and system-based interventions, this model serves as a template for holistic, integrated interventions to improve child eating behavior, ultimately impacting life course health development. © The Author(s) 2015.

  17. Superconducting and charge density wave transition in single crystalline LaPt2Si2

    NASA Astrophysics Data System (ADS)

    Gupta, Ritu; Dhar, S. K.; Thamizhavel, A.; Rajeev, K. P.; Hossain, Z.

    2017-06-01

    We present results of our comprehensive studies on single-crystalline LaPt2Si2. Pronounced anomalies in the electrical resistivity and heat capacity confirm the bulk nature of the superconductivity (SC) and charge density wave (CDW) transitions in the single crystals. While the charge density wave transition temperature is lower, the superconducting transition temperature is higher in the single crystal than in the polycrystalline sample. This result confirms the competing nature of CDW and SC. Another important finding is the anomalous temperature dependence of the upper critical field HC2(T). We also report the anisotropy in the transport and magnetic measurements of the single crystal.

  18. A COMPREHENSIVE APPROACH FOR PHYSIOLOGICALLY BASED PHARMACOKINETIC (PBPK) MODELS USING THE EXPOSURE RELATED DOSE ESTIMATING MODEL (ERDEM) SYSTEM

    EPA Science Inventory

    The implementation of a comprehensive PBPK modeling approach resulted in ERDEM, a complex PBPK modeling system. ERDEM provides a scalable and user-friendly environment that enables researchers to focus on data input values rather than writing program code. ERDEM efficiently m...

  19. Comprehensive Career Guidance. Postsecondary & Adult. Programs and Model.

    ERIC Educational Resources Information Center

    Moore, Earl J.; Miller, Thomas B.

    Divided into four parts, this document describes a comprehensive career guidance model for postsecondary and adult programs. In part 1, the rationale for extending career guidance and counseling into the lifelong learning perspective is explained, the Georgia Life Career Development Model is described, and the components of a process model for…

  20. MOSAIC : Model Of Sustainability And Integrated Corridors, phase 3 : comprehensive model calibration and validation and additional model enhancement.

    DOT National Transportation Integrated Search

    2015-02-01

    The Maryland State Highway Administration (SHA) has initiated major planning efforts to improve transportation : efficiency, safety, and sustainability on critical highway corridors through its Comprehensive Highway Corridor : (CHC) program. This pro...

  1. A map of single nucleotide polymorphisms of the date palm (Phoenix dactylifera) based on whole genome sequencing of 62 varieties

    USDA-ARS?s Scientific Manuscript database

    Date palm is one of the few crop species that thrive in arid environments and is the most significant fruit crop in the Middle East and North Africa, but it lacks genomic resources that could accelerate breeding efforts. Here, we present the first comprehensive catalogue of ~12 million common single nuc...

  2. Single balloon versus double balloon bipedicular kyphoplasty: a systematic review and meta-analysis.

    PubMed

    Jing, Zehao; Dong, Jianli; Li, Zhengwei; Nan, Feng

    2018-06-19

    Kyphoplasty has been widely used to treat vertebral compression fractures (VCFs). In the standard kyphoplasty procedure, two balloons are inserted into the vertebral body through both pedicles and inflated simultaneously, while using a single balloon twice is also a common method in the clinic to lessen the financial burden on patients. However, the efficacy and safety of single balloon versus double balloon bipedicular kyphoplasty remain controversial. In this systematic review and meta-analysis, eligible studies were identified through a comprehensive literature search of PubMed, the Cochrane Library, EMBASE, Web of Science, Wanfang, CNKI, VIP and CBM up to January 1, 2018. Results from individual studies were pooled using a random- or fixed-effects model. Seven articles were included in the systematic review and five studies in the meta-analysis. We observed no significant difference between single balloon and double balloon bipedicular kyphoplasty in visual analog scale (VAS) scores, angle (kyphotic angle and Cobb angle), consumption (operation time, cement volume and volume of bleeding), vertebral height (anterior, medium and posterior height) or complications (cement leakage and new VCFs), while the cost of single balloon bipedicular kyphoplasty is lower than that of double balloon bipedicular kyphoplasty. The results of our meta-analysis also demonstrated that a single balloon can significantly improve the VAS scores, angle and vertebral height of patients suffering from VCFs. This systematic review and meta-analysis collectively concludes that single balloon bipedicular kyphoplasty is as effective as double balloon bipedicular kyphoplasty in improving the clinical symptoms, deformity and complications of VCFs, while being less expensive. These slides can be retrieved under Electronic Supplementary Material.
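
    The pooling step behind a meta-analysis like this one can be sketched with inverse-variance weighting. The per-study effect sizes and standard errors below are fabricated for illustration; the sketch shows a fixed-effect pooled estimate with Cochran's Q and I^2 as heterogeneity checks, whereas the published analysis chose between random- and fixed-effects models per outcome.

    import numpy as np

    # per-study effect estimates (e.g., mean difference in VAS) and their standard errors
    effects = np.array([0.10, -0.05, 0.20, 0.00, 0.08])
    se = np.array([0.15, 0.20, 0.25, 0.18, 0.22])

    w = 1.0 / se**2                                   # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

    Q = np.sum(w * (effects - pooled) ** 2)           # Cochran's Q
    df = len(effects) - 1
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

    print(f"pooled mean difference: {pooled:.3f}  (95% CI {ci[0]:.3f} to {ci[1]:.3f})")
    print(f"heterogeneity: Q = {Q:.2f} on {df} df, I^2 = {I2:.0f}%")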

  3. SCOUP: a probabilistic model based on the Ornstein-Uhlenbeck process to analyze single-cell expression data during differentiation.

    PubMed

    Matsumoto, Hirotaka; Kiryu, Hisanori

    2016-06-08

    Single-cell technologies make it possible to quantify the comprehensive states of individual cells, and have the power to shed light on cellular differentiation in particular. Although several methods have been developed to fully analyze single-cell expression data, there is still room for improvement in the analysis of differentiation. In this paper, we propose a novel method, SCOUP, to elucidate the differentiation process. Unlike previous dimension reduction-based approaches, SCOUP describes the dynamics of gene expression throughout differentiation directly, including the degree of differentiation of a cell (in pseudo-time) and cell fate. SCOUP is superior to previous methods with respect to pseudo-time estimation, especially for single-cell RNA-seq. SCOUP also estimates cell lineage more accurately than previous methods, especially for cells at an early stage of bifurcation. In addition, SCOUP can be applied to various downstream analyses. As an example, we propose a novel correlation calculation method for elucidating regulatory relationships among genes. We apply this method to single-cell RNA-seq data and detect a candidate key regulator of differentiation, as well as clusters in a correlation network that are not detected with conventional correlation analysis. We develop a stochastic process-based method, SCOUP, to analyze single-cell expression data throughout differentiation. SCOUP can estimate pseudo-time and cell lineage more accurately than previous methods. We also propose a novel correlation calculation method based on SCOUP. SCOUP is a promising approach for further single-cell analysis and is available at https://github.com/hmatsu1226/SCOUP.
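
    The Ornstein-Uhlenbeck building block underlying SCOUP can be illustrated directly: along pseudo-time, a gene's expression relaxes toward a lineage-specific attractor level with some rate and noise. The two lineages, parameter values and Euler-Maruyama step below are illustrative assumptions rather than fitted quantities.

    import numpy as np

    rng = np.random.default_rng(4)

    def ou_path(x0, theta_attr, alpha, sigma, dt=0.01, steps=500):
        """Euler-Maruyama simulation of dX = alpha*(theta_attr - X)dt + sigma dW."""
        x = np.empty(steps + 1)
        x[0] = x0
        for i in range(steps):
            x[i+1] = x[i] + alpha * (theta_attr - x[i]) * dt \
                     + sigma * np.sqrt(dt) * rng.standard_normal()
        return x

    # Two lineages branching from a common progenitor expression level.
    progenitor_level = 1.0
    lineage_A = ou_path(progenitor_level, theta_attr=3.0, alpha=0.8, sigma=0.3)
    lineage_B = ou_path(progenitor_level, theta_attr=0.2, alpha=0.8, sigma=0.3)

    print("late pseudo-time mean, lineage A:", round(lineage_A[-100:].mean(), 2))
    print("late pseudo-time mean, lineage B:", round(lineage_B[-100:].mean(), 2))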

  4. Aerodynamic preliminary analysis system 2. Part 1: Theory

    NASA Technical Reports Server (NTRS)

    Bonner, E.; Clever, W.; Dunn, K.

    1981-01-01

    A subsonic/supersonic/hypersonic aerodynamic analysis was developed by integrating the Aerodynamic Preliminary Analysis System (APAS), and the inviscid force calculation modules of the Hypersonic Arbitrary Body Program. APAS analysis was extended for nonlinear vortex forces using a generalization of the Polhamus analogy. The interactive system provides appropriate aerodynamic models for a single input geometry data base and has a run/output format similar to a wind tunnel test program. The user's manual was organized to cover the principle system activities of a typical application, geometric input/editing, aerodynamic evaluation, and post analysis review/display. Sample sessions are included to illustrate the specific task involved and are followed by a comprehensive command/subcommand dictionary used to operate the system.

  5. BioServices: a common Python package to access biological Web Services programmatically.

    PubMed

    Cokelaer, Thomas; Pultz, Dennis; Harder, Lea M; Serra-Musach, Jordi; Saez-Rodriguez, Julio

    2013-12-15

    Web interfaces provide access to numerous biological databases. Many can be accessed programmatically thanks to Web Services. Building applications that combine several of them would benefit from a single framework. BioServices is a comprehensive Python framework that provides programmatic access to major bioinformatics Web Services (e.g. KEGG, UniProt, BioModels, ChEMBLdb). Wrapping additional Web Services based either on Representational State Transfer or Simple Object Access Protocol/Web Services Description Language technologies is eased by the use of object-oriented programming. BioServices releases and documentation are available at http://pypi.python.org/pypi/bioservices under a GPL-v3 license.
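
    A short usage sketch of the kind of access the framework provides is given below. The KEGG and UniProt wrapper names follow the services listed in the abstract, but the specific method calls and query strings shown here are assumptions that may differ between BioServices releases, so treat this as an illustration rather than a version-pinned recipe.

    from bioservices import KEGG, UniProt

    # Retrieve a KEGG entry (hsa:7535 is an assumed example gene identifier)
    # through the wrapped REST API and show the start of the flat-file record.
    kegg = KEGG()
    entry = kegg.get("hsa:7535")
    print(str(entry)[:200])

    # Free-text search of UniProt; the result is returned as tabular text.
    uniprot = UniProt()
    hits = uniprot.search("ZAP70")
    print(str(hits)[:200])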

  6. Smoking cessation treatment and outcomes patterns simulation: a new framework for evaluating the potential health and economic impact of smoking cessation interventions.

    PubMed

    Getsios, Denis; Marton, Jenő P; Revankar, Nikhil; Ward, Alexandra J; Willke, Richard J; Rublee, Dale; Ishak, K Jack; Xenakis, James G

    2013-09-01

    Most existing models of smoking cessation treatments have considered a single quit attempt when modelling long-term outcomes. To develop a model to simulate smokers over their lifetimes accounting for multiple quit attempts and relapses which will allow for prediction of the long-term health and economic impact of smoking cessation strategies. A discrete event simulation (DES) that models individuals' life course of smoking behaviours, attempts to quit, and the cumulative impact on health and economic outcomes was developed. Each individual is assigned one of the available strategies used to support each quit attempt; the outcome of each attempt, time to relapses if abstinence is achieved, and time between quit attempts is tracked. Based on each individual's smoking or abstinence patterns, the risk of developing diseases associated with smoking (chronic obstructive pulmonary disease, lung cancer, myocardial infarction and stroke) is determined and the corresponding costs, changes to mortality, and quality of life assigned. Direct costs are assessed from the perspective of a comprehensive US healthcare payer ($US, 2012 values). Quit attempt strategies that can be evaluated in the current simulation include unassisted quit attempts, brief counselling, behavioural modification therapy, nicotine replacement therapy, bupropion, and varenicline, with the selection of strategies and time between quit attempts based on equations derived from survey data. Equations predicting the success of quit attempts as well as the short-term probability of relapse were derived from five varenicline clinical trials. Concordance between the five trials and predictions from the simulation on abstinence at 12 months was high, indicating that the equations predicting success and relapse in the first year following a quit attempt were reliable. Predictions allowing for only a single quit attempt versus unrestricted attempts demonstrate important differences, with the single quit attempt simulation predicting 19 % more smoking-related diseases and 10 % higher costs associated with smoking-related diseases. Differences are most prominent in predictions of the time that individuals abstain from smoking: 13.2 years on average over a lifetime allowing for multiple quit attempts, versus only 1.2 years with single quit attempts. Differences in abstinence time estimates become substantial only 5 years into the simulation. In the multiple quit attempt simulations, younger individuals survived longer, yet had lower lifetime smoking-related disease and total costs, while the opposite was true for those with high levels of nicotine dependence. By allowing for multiple quit attempts over the course of individuals' lives, the simulation can provide more reliable estimates on the health and economic impact of interventions designed to increase abstinence from smoking. Furthermore, the individual nature of the simulation allows for evaluation of outcomes in populations with different baseline profiles. DES provides a framework for comprehensive and appropriate predictions when applied to smoking cessation over smoker lifetimes.
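
    The central modelling choice discussed above, allowing repeated quit attempts rather than a single one, can be caricatured in a few lines of simulation. The sketch below lets each simulated smoker make attempts at random intervals over a fixed horizon, with a fixed success probability and exponentially distributed abstinence spells; all probabilities and time constants are illustrative assumptions, and disease, mortality and costs are omitted.

    import numpy as np

    rng = np.random.default_rng(5)

    def lifetime_abstinence(horizon_years=40.0, p_success=0.07,
                            mean_gap=1.5, mean_abstinence=6.0, single_attempt=False):
        """Return total abstinent years for one simulated smoker."""
        t, abstinent = 0.0, 0.0
        while t < horizon_years:
            t += rng.exponential(mean_gap)              # wait until the next quit attempt
            if t >= horizon_years:
                break
            if rng.random() < p_success:                # the attempt succeeds
                length = rng.exponential(mean_abstinence)
                abstinent += min(length, horizon_years - t)
                t += length                             # relapse (or horizon) ends the spell
            if single_attempt:
                break                                   # restrict to one lifetime attempt
        return abstinent

    multi = np.mean([lifetime_abstinence() for _ in range(20000)])
    single = np.mean([lifetime_abstinence(single_attempt=True) for _ in range(20000)])
    print(f"mean abstinent years, multiple attempts: {multi:.1f}")
    print(f"mean abstinent years, single attempt:    {single:.1f}")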

  7. Effects of Word Frequency and Modality on Sentence Comprehension Impairments in People with Aphasia

    PubMed Central

    DeDe, Gayle

    2014-01-01

    Purpose It is well known that people with aphasia have sentence comprehension impairments. The present study investigated whether lexical factors contribute to sentence comprehension impairments in both the auditory and written modalities using on-line measures of sentence processing. Methods People with aphasia and non-brain-damaged controls participated in the experiment (n=8 per group). Twenty-one sentence pairs containing high and low frequency words were presented in self-paced listening and reading tasks. The sentences were syntactically simple and differed only in the critical words. The dependent variables were response times for critical segments of the sentence and accuracy on the comprehension questions. Results The results showed that word frequency influences performance on measures of sentence comprehension in people with aphasia. The accuracy data on the comprehension questions suggested that people with aphasia have more difficulty understanding sentences containing low frequency words in the written compared to auditory modality. Both group and single case analyses of the response time data also pointed to more difficulty with reading than listening. Conclusions The results show that sentence comprehension in people with aphasia is influenced by word frequency and presentation modality. PMID:22294411

  8. Why the Simple View of Reading Is Not Simplistic: Unpacking Component Skills of Reading Using a Direct and Indirect Effect Model of Reading (DIER)

    ERIC Educational Resources Information Center

    Kim, Young-Suk Grace

    2017-01-01

    Pathways of relations of language, cognitive, and literacy skills (i.e., working memory, vocabulary, grammatical knowledge, inference, comprehension monitoring, word reading, and listening comprehension) to reading comprehension were examined by comparing four variations of direct and indirect effects model of reading. Results from 350…

  9. Cooperative Learning Model toward a Reading Comprehensions on the Elementary School

    ERIC Educational Resources Information Center

    Murtono

    2015-01-01

    The purposes of this research are: (1) to describe the reading skills of students taught with the CIRC, Jigsaw, and STAD learning models; (2) to determine the effectiveness of cooperative learning models on reading comprehension for students with high and low language logic; and (3) to find out…

  10. Updating during reading comprehension: why causality matters.

    PubMed

    Kendeou, Panayiota; Smith, Emily R; O'Brien, Edward J

    2013-05-01

    The present set of 7 experiments systematically examined the effectiveness of adding causal explanations to simple refutations in reducing or eliminating the impact of outdated information on subsequent comprehension. The addition of a single causal-explanation sentence to a refutation was sufficient to eliminate any measurable disruption in comprehension caused by the outdated information (Experiment 1) but was not sufficient to eliminate its reactivation (Experiment 2). However, a 3 sentence causal-explanation addition to a refutation eliminated both any measurable disruption in comprehension (Experiment 3) and the reactivation of the outdated information (Experiment 4). A direct comparison between the 1 and 3 causal-explanation conditions provided converging evidence for these findings (Experiment 5). Furthermore, a comparison of the 3 sentence causal-explanation condition with a 3 sentence qualified-elaboration condition demonstrated that even though both conditions were sufficient to eliminate any measurable disruption in comprehension (Experiment 6), only the causal-explanation condition was sufficient to eliminate the reactivation of the outdated information (Experiment 7). These results establish a boundary condition under which outdated information will influence comprehension; they also have broader implications for both the updating process and knowledge revision in general.

  11. A Global User-Driven Model for Tile Prefetching in Web Geographical Information Systems

    PubMed Central

    Pan, Shaoming; Chong, Yanwen; Zhang, Hang; Tan, Xicheng

    2017-01-01

    A web geographical information system is a typical service-intensive application. Tile prefetching and cache replacement can improve cache hit ratios by proactively fetching tiles from storage and replacing the appropriate tiles from the high-speed cache buffer without waiting for a client’s requests, which reduces disk latency and improves system access performance. Most popular prefetching strategies consider only the relative tile popularities to predict which tile should be prefetched or consider only a single individual user's access behavior to determine which neighbor tiles need to be prefetched. Some studies show that comprehensively considering all users’ access behaviors and all tiles’ relationships in the prediction process can achieve more significant improvements. Thus, this work proposes a new global user-driven model for tile prefetching and cache replacement. First, based on all users’ access behaviors, a type of expression method for tile correlation is designed and implemented. Then, a conditional prefetching probability can be computed based on the proposed correlation expression method. Thus, some tiles to be prefetched can be found by computing and comparing the conditional prefetching probability from the uncached tiles set and, similarly, some replacement tiles can be found in the cache buffer according to multi-step prefetching. Finally, some experiments are provided comparing the proposed model with other global user-driven models, other single user-driven models, and other client-side prefetching strategies. The results show that the proposed model achieves a prefetching hit rate approximately 10.6% to 110.5% higher than that of the compared methods. PMID:28085937
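    A minimal Python sketch of the prefetching decision (the co-occurrence-count correlation measure and all names are illustrative assumptions, not the paper's exact expression):

        # Sketch of a global user-driven prefetching decision. Correlation between
        # tiles is approximated here by request co-occurrence counts pooled over
        # all users; the paper's own correlation expression is not reproduced.
        from collections import defaultdict

        def build_correlation(access_logs):
            """access_logs: list of per-user request sequences, e.g. [['t1', 't2'], ...]."""
            pair_counts = defaultdict(int)   # times tile j followed tile i (any user)
            tile_counts = defaultdict(int)   # times tile i appeared as the current tile
            for seq in access_logs:
                for cur, nxt in zip(seq, seq[1:]):
                    pair_counts[(cur, nxt)] += 1
                    tile_counts[cur] += 1
            return pair_counts, tile_counts

        def conditional_prefetch_prob(pair_counts, tile_counts, current, candidate):
            """Estimated P(candidate requested next | current requested)."""
            if tile_counts[current] == 0:
                return 0.0
            return pair_counts[(current, candidate)] / tile_counts[current]

        def select_prefetch(pair_counts, tile_counts, current, uncached, k=3):
            """Pick the k uncached tiles with the highest conditional probability."""
            ranked = sorted(uncached,
                            key=lambda t: conditional_prefetch_prob(pair_counts, tile_counts, current, t),
                            reverse=True)
            return ranked[:k]

        logs = [['a', 'b', 'c'], ['a', 'b', 'd'], ['b', 'c', 'a']]
        pc, tc = build_correlation(logs)
        print(select_prefetch(pc, tc, current='b', uncached={'c', 'd', 'e'}, k=2))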

  12. Estimating the cost of blood: past, present, and future directions.

    PubMed

    Shander, Aryeh; Hofmann, Axel; Gombotz, Hans; Theusinger, Oliver M; Spahn, Donat R

    2007-06-01

    Understanding the costs associated with blood products requires sophisticated knowledge about transfusion medicine and is attracting the attention of clinical and administrative healthcare sectors worldwide. To improve outcomes, blood usage must be optimized and expenditures controlled so that resources may be channeled toward other diagnostic, therapeutic, and technological initiatives. Estimating blood costs, however, is a complex undertaking, surpassing simple supply versus demand economics. Shrinking donor availability and application of a precautionary principle to minimize transfusion risks are factors that continue to drive the cost of blood products upward. Recognizing that historical accounting attempts to determine blood costs have varied in scope, perspective, and methodology, new approaches have been initiated to identify all potential cost elements related to blood and blood product administration. Activities are also under way to tie these elements together in a comprehensive and practical model that will be applicable to all single-donor blood products without regard to practice type (e.g., academic, private, multi- or single-center clinic). These initiatives, their rationale, importance, and future directions are described.

  13. Epitaxy: Programmable Atom Equivalents Versus Atoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Mary X.; Seo, Soyoung E.; Gabrys, Paul A.

    The programmability of DNA makes it an attractive structure-directing ligand for the assembly of nanoparticle superlattices in a manner that mimics many aspects of atomic crystallization. However, the synthesis of multilayer single crystals of defined size remains a challenge. Though previous studies considered lattice mismatch as the major limiting factor for multilayer assembly, thin film growth depends on many interlinked variables. Here, a more comprehensive approach is taken to study fundamental elements, such as the growth temperature and the thermodynamics of interfacial energetics, to achieve epitaxial growth of nanoparticle thin films. Under optimized equilibrium conditions, single crystal, multilayer thin films can be synthesized over 500 × 500 μm² areas on lithographically patterned templates. Importantly, these superlattices follow the same patterns of crystal growth demonstrated in thin film atomic deposition, allowing for these processes to be understood in the context of well-studied atomic epitaxy, and potentially enabling a nanoscale model to study fundamental crystallization processes.

  14. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  15. Optimizing estimation of hemispheric dominance for language using magnetic source imaging

    PubMed Central

    Passaro, Antony D.; Rezaie, Roozbeh; Moser, Dana C.; Li, Zhimin; Dias, Nadeeka; Papanicolaou, Andrew C.

    2011-01-01

    The efficacy of magnetoencephalography (MEG) as an alternative to invasive methods for investigating the cortical representation of language has been explored in several studies. Recent studies comparing MEG to the gold-standard Wada procedure have found inconsistent and often less-than-accurate estimates of laterality. Here we attempted to address this issue among normal right-handed adults (N=12) by supplementing a well-established MEG protocol involving word recognition and the single dipole method with a sentence comprehension task and a beamformer approach localizing neural oscillations. Beamformer analysis of word recognition and sentence comprehension tasks revealed a desynchronization in the 10–18 Hz range, localized to the temporo-parietal cortices. Inspection of individual profiles of localized desynchronization (10–18 Hz) revealed left hemispheric dominance in 91.7% and 83.3% of individuals during the word recognition and sentence comprehension tasks, respectively. In contrast, single dipole analysis yielded lower estimates, such that activity in temporal language regions was left-lateralized in 66.7% and 58.3% of individuals during word recognition and sentence comprehension, respectively. The results obtained from the word recognition task and localization of oscillatory activity using a beamformer appear to be in line with general estimates of left hemispheric dominance for language in normal right-handed individuals. Furthermore, the current findings support the growing notion that changes in neural oscillations underlie critical components of linguistic processing. PMID:21890118
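    Hemispheric dominance calls of this kind are conventionally summarized with a laterality index; the short Python sketch below uses the standard LI = (L - R)/(L + R) form with an assumed decision threshold and invented activity values, neither of which is reported in the study:

        # Conventional laterality index; an individual is called left-dominant when
        # LI exceeds a threshold (the 0.1 threshold here is an assumption).
        def laterality_index(left_activity, right_activity):
            total = left_activity + right_activity
            return 0.0 if total == 0 else (left_activity - right_activity) / total

        def percent_left_dominant(pairs, threshold=0.1):
            calls = [laterality_index(l, r) > threshold for l, r in pairs]
            return 100.0 * sum(calls) / len(calls)

        # e.g. desynchronization power (10-18 Hz) in left vs. right temporo-parietal regions
        subjects = [(8.2, 3.1), (5.0, 4.9), (7.4, 2.2), (3.0, 6.5)]
        print(percent_left_dominant(subjects))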

  16. A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis

    PubMed Central

    Padula, Matthew P.; Berry, Iain J.; O'Rourke, Matthew B.; Raymond, Benjamin B.A.; Santos, Jerran; Djordjevic, Steven P.

    2017-01-01

    Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O’Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single ‘spots’ in a polyacrylamide gel, allowing the quantitation of changes in a proteoform’s abundance to ascertain changes in an organism’s phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the ‘Top-Down’. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O’Farrell’s paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism’s proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer. PMID:28387712

  17. A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis.

    PubMed

    Padula, Matthew P; Berry, Iain J; O Rourke, Matthew B; Raymond, Benjamin B A; Santos, Jerran; Djordjevic, Steven P

    2017-04-07

    Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer.

  18. Comprehensive Aspectual UML approach to support AspectJ.

    PubMed

    Magableh, Aws; Shukur, Zarina; Ali, Noorazean Mohd

    2014-01-01

    Unified Modeling Language is the most popular and widely used Object-Oriented modelling language in the IT industry. This study focuses on investigating the ability to expand UML to some extent to model crosscutting concerns (Aspects) to support AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available Aspect-Oriented UML modelling approaches and find that the existing Aspect-Oriented Design Modelling approaches using UML cannot be considered to provide a framework for a comprehensive Aspectual UML modelling approach and also that there is a lack of adequate Aspect-Oriented tool support. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a "good design" criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioral diagrams, which are designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs.

  19. Comprehensive Aspectual UML Approach to Support AspectJ

    PubMed Central

    Magableh, Aws; Shukur, Zarina; Mohd. Ali, Noorazean

    2014-01-01

    Unified Modeling Language is the most popular and widely used Object-Oriented modelling language in the IT industry. This study focuses on investigating the ability to expand UML to some extent to model crosscutting concerns (Aspects) to support AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available Aspect-Oriented UML modelling approaches and find that the existing Aspect-Oriented Design Modelling approaches using UML cannot be considered to provide a framework for a comprehensive Aspectual UML modelling approach and also that there is a lack of adequate Aspect-Oriented tool support. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a “good design” criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioral diagrams, which are designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs. PMID:25136656

  20. A Closed-Form Error Model of Straight Lines for Improved Data Association and Sensor Fusing

    PubMed Central

    2018-01-01

    Linear regression is a basic tool in mobile robotics, since it enables accurate estimation of straight lines from range-bearing scans or in digital images, which is a prerequisite for reliable data association and sensor fusing in the context of feature-based SLAM. This paper discusses, extends, and compares existing algorithms for line fitting that remain applicable in the case of strong covariances between the coordinates at each single data point, which must not be neglected if range-bearing sensors are used. In particular, the determination of the covariance matrix is also considered, which is required for stochastic modeling. The main contribution is a new error model of straight lines in closed form for calculating quickly and reliably the covariance matrix dependent on just a few comprehensible and easily-obtainable parameters. The model can be applied widely in any case when a line is fitted from a number of distinct points also without a priori knowledge of the specific measurement noise. By means of extensive simulations, the performance and robustness of the new model in comparison to existing approaches are shown. PMID:29673205
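    The paper's closed-form covariance expression is not given in the abstract; as a baseline for comparison, the Python sketch below shows the textbook least-squares line fit and its parameter covariance under independent noise on y only, which is the simple special case that a model handling per-point coordinate covariances generalizes:

        # Least-squares fit of y = a*x + b with the textbook parameter covariance
        # Cov = sigma^2 * (X^T X)^(-1). Assumes i.i.d. noise on y; it does NOT
        # reproduce the paper's closed-form model for correlated range-bearing noise.
        import numpy as np

        def fit_line_with_covariance(x, y):
            X = np.column_stack([x, np.ones_like(x)])        # design matrix for [a, b]
            params, residuals, *_ = np.linalg.lstsq(X, y, rcond=None)
            dof = len(x) - 2
            sigma2 = float(residuals[0]) / dof if residuals.size else 0.0
            cov = sigma2 * np.linalg.inv(X.T @ X)             # 2x2 covariance of [a, b]
            return params, cov

        x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        y = np.array([0.1, 1.9, 4.2, 5.8, 8.1])
        params, cov = fit_line_with_covariance(x, y)
        print(params, cov)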

  1. A comprehensive study of extended tetrathiafulvalene cruciform molecules for molecular electronics: synthesis and electrical transport measurements.

    PubMed

    Parker, Christian R; Leary, Edmund; Frisenda, Riccardo; Wei, Zhongming; Jennum, Karsten S; Glibstrup, Emil; Abrahamsen, Peter Bæch; Santella, Marco; Christensen, Mikkel A; Della Pia, Eduardo Antonio; Li, Tao; Gonzalez, Maria Teresa; Jiang, Xingbin; Morsing, Thorbjørn J; Rubio-Bollinger, Gabino; Laursen, Bo W; Nørgaard, Kasper; van der Zant, Herre; Agrait, Nicolas; Nielsen, Mogens Brøndsted

    2014-11-26

    Cruciform-like molecules with two orthogonally placed π-conjugated systems have in recent years attracted significant interest for their potential use as molecular wires in molecular electronics. Here we present synthetic protocols for a large selection of cruciform molecules based on oligo(phenyleneethynylene) (OPE) and tetrathiafulvalene (TTF) scaffolds, end-capped with acetyl-protected thiolates as electrode anchoring groups. The molecules were subjected to a comprehensive study of their conducting properties as well as their photophysical and electrochemical properties in solution. The complex nature of the molecules and their possible binding in different configurations in junctions called for different techniques of conductance measurements: (1) conducting-probe atomic force microscopy (CP-AFM) measurements on self-assembled monolayers (SAMs), (2) mechanically controlled break-junction (MCBJ) measurements, and (3) scanning tunneling microscopy break-junction (STM-BJ) measurements. The CP-AFM measurements showed structure-property relationships from SAMs of series of OPE3 and OPE5 cruciform molecules; the conductance of the SAM increased with the number of dithiafulvene (DTF) units (0, 1, 2) along the wire, and it increased when substituting two arylethynyl end groups of the OPE3 backbone with two DTF units. The MCBJ and STM-BJ studies on single molecules both showed that DTFs decreased the junction formation probability, but, in contrast, no significant influence on the single-molecule conductance was observed. We suggest that the origins of the difference between SAM and single-molecule measurements lie in the nature of the molecule-electrode interface as well as in effects arising from molecular packing in the SAMs. This comprehensive study shows that for complex molecules care should be taken when directly comparing single-molecule measurements and measurements of SAMs and solid-state devices thereof.

  2. Bidirectional Relations between Text Reading Prosody and Reading Comprehension in the Upper Primary School Grades: A Longitudinal Perspective

    PubMed Central

    Veenendaal, Nathalie J.; Groen, Margriet A.; Verhoeven, Ludo

    2016-01-01

    The purpose of this study was to examine the directionality of the relationship between text reading prosody and reading comprehension in the upper grades of primary school. We compared three theoretical possibilities: Two unidirectional relations from text reading prosody to reading comprehension and from reading comprehension to text reading prosody and a bidirectional relation between text reading prosody and reading comprehension. Further, we controlled for autoregressive effects and included decoding efficiency as a measure of general reading skill. Participants were 99 Dutch children, followed longitudinally, from fourth- to sixth-grade. Structural equation modeling showed that the bidirectional relation provided the best fitting model. In fifth-grade, text reading prosody was related to prior decoding and reading comprehension, whereas in sixth-grade, reading comprehension was related to prior text reading prosody. As such, the results suggest that the relation between text reading prosody and reading comprehension is reciprocal, but dependent on grade level. PMID:27667916

  3. Components and context: exploring sources of reading difficulties for language minority learners and native English speakers in urban schools.

    PubMed

    Kieffer, Michael J; Vukovic, Rose K

    2012-01-01

    Drawing on the cognitive and ecological domains within the componential model of reading, this longitudinal study explores heterogeneity in the sources of reading difficulties for language minority learners and native English speakers in urban schools. Students (N = 150) were followed from first through third grade and assessed annually on standardized English language and reading measures. Structural equation modeling was used to investigate the relative contributions of code-related and linguistic comprehension skills in first and second grade to third grade reading comprehension. Linguistic comprehension and the interaction between linguistic comprehension and code-related skills each explained substantial variation in reading comprehension. Among students with low reading comprehension, more than 80% demonstrated weaknesses in linguistic comprehension alone, whereas approximately 15% demonstrated weaknesses in both linguistic comprehension and code-related skills. Results were remarkably similar for the language minority learners and native English speakers, suggesting the importance of their shared socioeconomic backgrounds and schooling contexts.

  4. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    PubMed Central

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; Mancuso, Adrian P.

    2016-01-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design. PMID:27109208

  5. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. Furthermore, we demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design.

  6. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    DOE PAGES

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; ...

    2016-04-25

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. Furthermore, we demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design.

  7. Direct generation of linearly polarized single photons with a deterministic axis in quantum dots

    NASA Astrophysics Data System (ADS)

    Wang, Tong; Puchtler, Tim J.; Patra, Saroj K.; Zhu, Tongtong; Ali, Muhammad; Badcock, Tom J.; Ding, Tao; Oliver, Rachel A.; Schulz, Stefan; Taylor, Robert A.

    2017-07-01

    We report the direct generation of linearly polarized single photons with a deterministic polarization axis in self-assembled quantum dots (QDs), achieved by the use of non-polar InGaN without complex device geometry engineering. Here, we present a comprehensive investigation of the polarization properties of these QDs and their origin with statistically significant experimental data and rigorous k·p modeling. The experimental study of 180 individual QDs allows us to compute an average polarization degree of 0.90, with a standard deviation of only 0.08. When coupled with theoretical insights, we show that these QDs are highly insensitive to size differences, shape anisotropies, and material content variations. Furthermore, 91% of the studied QDs exhibit a polarization axis along the crystal [1-100] axis, with the other 9% polarized orthogonal to this direction. These features give non-polar InGaN QDs unique advantages in polarization control over other materials, such as conventional polar nitride, InAs, or CdSe QDs. Hence, the ability to generate single photons with polarization control makes non-polar InGaN QDs highly attractive for quantum cryptography protocols.

  8. Towards A Complete Model Of Photopic Visual Threshold Performance

    NASA Astrophysics Data System (ADS)

    Overington, I.

    1982-02-01

    Based on a wide variety of fragmentary evidence taken from psycho-physics, neurophysiology and electron microscopy, it has been possible to put together a very widely applicable conceptual model of photopic visual threshold performance. Such a model is so complex that a single comprehensive mathematical version is excessively cumbersome. It is, however, possible to set up a suite of related mathematical models, each of limited application but strictly known envelope of usage. Such models may be used for assessment of a variety of facets of visual performance when using display imagery, including effects and interactions of image quality, random and discrete display noise, viewing distance, image motion, etc., both for foveal interrogation tasks and for visual search tasks. The specific model may be selected from the suite according to the assessment task in hand. The paper discusses in some depth the major facets of preperceptual visual processing and their interaction with instrumental image quality and noise. It then highlights the statistical nature of visual performance before going on to consider a number of specific mathematical models of partial visual function. Where appropriate, these are compared with widely popular empirical models of visual function.

  9. From good ideas to actions: a model-driven community collaborative to prevent childhood obesity.

    PubMed

    Huberty, Jennifer L; Balluff, Mary; O'Dell, Molly; Peterson, Kerri

    2010-01-01

    Activate Omaha Kids, a community collaborative, was designed, implemented, and evaluated with the aim of preventing childhood obesity in the Omaha community. Activate Omaha Kids brought together key stakeholders and community leaders to create a community coalition. The coalition's aim was to oversee a long-term sustainable approach to preventing obesity. Following a planning phase, a business plan was developed that prioritized best practices to be implemented in Omaha. The business plan was developed using the Ecological Model, Health Policy Model, and Robert Wood Johnson Foundation Active Living by Design 5P model. The three models helped the community identify target populations and activities that then created a single model for sustainable change. Twenty-four initiatives were identified, over one million dollars in funding was secured, and evaluation strategies were identified. By using the models from the initial steps through evaluation, a clear facilitation of the process was possible, and the result was a comprehensive, feasible plan. The use of the models to design a strategic plan was pivotal in building a sustainable coalition to achieve measurable improvements in the health of children and prove replicable over time.

  10. International challenge to predict the impact of radioxenon releases from medical isotope production on a comprehensive nuclear test ban treaty sampling station.

    PubMed

    Eslinger, Paul W; Bowyer, Ted W; Achim, Pascal; Chai, Tianfeng; Deconninck, Benoit; Freeman, Katie; Generoso, Sylvia; Hayes, Philip; Heidmann, Verena; Hoffman, Ian; Kijima, Yuichi; Krysta, Monika; Malo, Alain; Maurer, Christian; Ngan, Fantine; Robins, Peter; Ross, J Ole; Saunier, Olivier; Schlosser, Clemens; Schöppner, Michael; Schrom, Brian T; Seibert, Petra; Stein, Ariel F; Ungar, Kurt; Yi, Jing

    2016-06-01

    The International Monitoring System (IMS) is part of the verification regime for the Comprehensive Nuclear-Test-Ban-Treaty Organization (CTBTO). At entry-into-force, half of the 80 radionuclide stations will be able to measure concentrations of several radioactive xenon isotopes produced in nuclear explosions, and the full network may be populated with xenon monitoring afterward. An understanding of natural and man-made radionuclide backgrounds can be used in accordance with the provisions of the treaty (such as event screening criteria in Annex 2 to the Protocol of the Treaty) for the effective implementation of the verification regime. Fission-based production of (99)Mo for medical purposes also generates nuisance radioxenon isotopes that are usually vented to the atmosphere. One way to account for the effect that emissions from medical isotope production have on radionuclide samples from the IMS is to use stack monitoring data, if they are available, together with atmospheric transport modeling. Recently, individuals from seven nations participated in a challenge exercise that used atmospheric transport modeling to predict the time-history of (133)Xe concentration measurements at the IMS radionuclide station in Germany using stack monitoring data from a medical isotope production facility in Belgium. Participants received only stack monitoring data and used the atmospheric transport model and meteorological data of their choice. Some of the models predicted the highest measured concentrations quite well. A model comparison rank and ensemble analysis suggests that combining multiple models may provide more accurate predicted concentrations than any single model. None of the submissions based only on the stack monitoring data predicted the small measured concentrations very well. Modeling of sources by other nuclear facilities with smaller releases than medical isotope production facilities may be important in understanding how to discriminate those releases from releases from a nuclear explosion. Published by Elsevier Ltd.
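    A toy Python sketch of the ensemble idea, comparing individual model predictions and their ensemble mean against the measured time series by RMSE (all numbers are invented for illustration; the exercise itself used more elaborate rank statistics):

        # Average several models' predicted (133)Xe concentration time series and
        # score each model, and the ensemble mean, against the measurements.
        import numpy as np

        def rmse(pred, obs):
            pred, obs = np.asarray(pred, float), np.asarray(obs, float)
            return float(np.sqrt(np.mean((pred - obs) ** 2)))

        measurements = [0.2, 0.5, 1.4, 0.9, 0.3]            # mBq/m^3, illustrative only
        model_predictions = {
            "model_A": [0.1, 0.6, 1.1, 1.0, 0.2],
            "model_B": [0.3, 0.4, 1.8, 0.7, 0.4],
            "model_C": [0.2, 0.7, 0.9, 1.2, 0.1],
        }

        ensemble = np.mean(list(model_predictions.values()), axis=0)
        scores = {name: rmse(p, measurements) for name, p in model_predictions.items()}
        scores["ensemble_mean"] = rmse(ensemble, measurements)
        for name, s in sorted(scores.items(), key=lambda kv: kv[1]):
            print(f"{name}: RMSE = {s:.3f}")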

  11. Modeling: A Direct Instruction Model for Programming Reading Comprehension.

    ERIC Educational Resources Information Center

    Wise, Beth S.

    Modeling the behaviors they expect students to exhibit is one way teachers can teach comprehension skills. Teachers need to give multiple examples wherein the teacher models every behavior students should exhibit, giving the answer to the question, and giving the line of reasoning followed to arrive at the answer. To teach each separate…

  12. Retention modelling of polychlorinated biphenyls in comprehensive two-dimensional gas chromatography.

    PubMed

    D'Archivio, Angelo Antonio; Incani, Angela; Ruggieri, Fabrizio

    2011-01-01

    In this paper, we use a quantitative structure-retention relationship (QSRR) method to predict the retention times of polychlorinated biphenyls (PCBs) in comprehensive two-dimensional gas chromatography (GC×GC). We analyse the GC×GC retention data taken from the literature by comparing the predictive capability of different regression methods. The various models are generated using 70 out of 209 PCB congeners in the calibration stage, while their predictive performance is evaluated on the remaining 139 compounds. The two-dimensional chromatogram is initially estimated by separately modelling retention times of PCBs in the first and in the second column ((1)tR and (2)tR, respectively). In particular, multilinear regression (MLR) combined with genetic algorithm (GA) variable selection is performed to extract two small subsets of predictors for (1)tR and (2)tR from a large set of theoretical molecular descriptors provided by the popular software Dragon, which after removal of highly correlated or almost constant variables consists of 237 structure-related quantities. Based on GA-MLR analysis, a four-dimensional and a five-dimensional relationship modelling (1)tR and (2)tR, respectively, are identified. Single-response partial least squares (PLS-1) regression is alternatively applied to independently model (1)tR and (2)tR without the need for preliminary GA variable selection. Further, we explore the possibility of predicting the two-dimensional chromatogram of PCBs in a single calibration procedure by using a two-response PLS (PLS-2) model or a feed-forward artificial neural network (ANN) with two output neurons. In the first case, regression is carried out on the full set of 237 descriptors, while the variables previously selected by GA-MLR are initially considered as ANN inputs and subjected to a sensitivity analysis to remove the redundant ones. Results show that PLS-1 regression exhibits a noticeably better descriptive and predictive performance than the other investigated approaches. The observed values of determination coefficients for (1)tR and (2)tR in calibration (0.9999 and 0.9993, respectively) and prediction (0.9987 and 0.9793, respectively) provided by PLS-1 demonstrate that GC×GC behaviour of PCBs is properly modelled. In particular, the predicted two-dimensional GC×GC chromatogram of 139 PCBs not involved in the calibration stage closely resembles the experimental one. Based on the above lines of evidence, the proposed approach ensures accurate simulation of the whole GC×GC chromatogram of PCBs using experimentally determined retention data for only one third of the congeners.
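    A schematic Python illustration of the PLS-1 step: one regression model per retention dimension, fitted on a calibration subset and evaluated on held-out congeners. The dimensions mirror the study (209 congeners, 237 descriptors, 70/139 split), but all values below are random stand-ins, not Dragon descriptors or measured retention times:

        # One PLSRegression model per retention dimension, calibrated on 70 congeners
        # and evaluated on the remaining 139 (synthetic data for illustration only).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(209, 237))                  # 209 congeners x 237 descriptors
        t1 = X[:, :4] @ np.array([2.0, -1.0, 0.5, 0.3]) + rng.normal(scale=0.1, size=209)
        t2 = X[:, 4:9] @ np.array([1.0, 0.8, -0.4, 0.2, 0.1]) + rng.normal(scale=0.1, size=209)

        cal, pred = slice(0, 70), slice(70, 209)         # calibration / prediction split
        for name, t in (("1tR", t1), ("2tR", t2)):
            model = PLSRegression(n_components=5).fit(X[cal], t[cal])
            print(name, "R2 prediction:", round(r2_score(t[pred], model.predict(X[pred]).ravel()), 3))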

  13. Surrogate Reservoir Model

    NASA Astrophysics Data System (ADS)

    Mohaghegh, Shahab

    2010-05-01

    Surrogate Reservoir Model (SRM) is a new solution for fast-track, comprehensive reservoir analysis (solving both direct and inverse problems) using existing reservoir simulation models. SRM is defined as a replica of the full field reservoir simulation model that runs and provides accurate results in real time (one simulation run takes only a fraction of a second). SRM mimics the capabilities of a full field model with high accuracy. Reservoir simulation is the industry standard for reservoir management. It is used in all phases of field development in the oil and gas industry. The routine of simulation studies calls for integration of static and dynamic measurements into the reservoir model. Full field reservoir simulation models have become the major source of information for analysis, prediction and decision making. Large prolific fields usually go through several versions (updates) of their model. Each new version usually is a major improvement over the previous version. The updated model includes the latest available information incorporated along with adjustments that usually are the result of single-well or multi-well history matching. As the number of reservoir layers (thickness of the formations) increases, the number of cells representing the model approaches several millions. As the reservoir models grow in size, so does the time required for each run. Schemes such as grid computing and parallel processing help to a certain degree but do not provide the required speed for tasks such as field development strategies using comprehensive reservoir analysis, solving the inverse problem for injection/production optimization, quantifying uncertainties associated with the geological model, and real-time optimization and decision making. These types of analyses require hundreds or thousands of runs. Furthermore, with the new push for smart fields in the oil and gas industry, which is a natural outgrowth of smart completions and smart wells, the need for real-time reservoir modeling becomes more pronounced. SRM is developed using the state of the art in neural computing and fuzzy pattern recognition to address the ever-growing need in the oil and gas industry to perform accurate but high-speed simulation and modeling. Unlike conventional geo-statistical approaches (response surfaces, proxy models …) that require hundreds of simulation runs for development, SRM is developed with only a few (10 to 30) simulation runs. SRM can be developed regularly (as new versions of the full field model become available) off-line and can be put online for real-time processing to guide important decisions. SRM has proven its value in the field. An SRM was developed for a giant oil field in the Middle East. The model included about one million grid blocks with more than 165 horizontal wells and took ten hours for a single run on 12 parallel CPUs. Using only 10 simulation runs, an SRM was developed that was able to accurately mimic the behavior of the reservoir simulation model. By performing a comprehensive reservoir analysis that included millions of SRM runs, wells in the field were divided into five clusters. It was predicted that wells in clusters one and two are the best candidates for rate relaxation with minimal long-term water production, while wells in clusters four and five are susceptible to high water cuts. Two and a half years and 20 wells later, rate relaxation results from the field proved that all the predictions made by the SRM analysis were correct. While incremental oil production increased in all wells (wells in cluster 1 produced the most, followed by wells in clusters 2, 3, …), the percent change in average monthly water cut for wells in each cluster clearly demonstrated the analytic power of SRM. As correctly predicted, wells in clusters 1 and 2 actually experienced a reduction in water cut, while a substantial increase in water cut was observed in wells classified into clusters 4 and 5. Performing these analyses would have been impossible using the original full field simulation model.
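    A toy Python illustration of the surrogate principle: a cheap data-driven proxy is fitted to a small number of expensive simulator runs and then queried many times. The "simulator" function and the MLP proxy below are stand-ins, not the neural/fuzzy system described above:

        # Fit a fast proxy to ~20 expensive runs, then query it for many scenarios.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def expensive_simulator(x):
            # stand-in for a multi-hour full-field run: x = (choke setting, injection rate)
            return np.sin(3 * x[0]) + 0.5 * x[1] ** 2

        rng = np.random.default_rng(1)
        X_train = rng.uniform(0, 1, size=(20, 2))          # ~20 full-model runs
        y_train = np.array([expensive_simulator(x) for x in X_train])

        srm = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=1)
        srm.fit(X_train, y_train)

        X_query = rng.uniform(0, 1, size=(100000, 2))      # what-if scenarios
        predictions = srm.predict(X_query)                  # fractions of a second, not hours
        print(predictions[:5])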

  14. Working memory, situation models, and synesthesia

    DOE PAGES

    Radvansky, Gabriel A.; Gibson, Bradley S.; McNerney, M. Windy

    2013-03-04

    Research on language comprehension suggests a strong relationship between working memory span measures and language comprehension. However, there is also evidence that this relationship weakens at higher levels of comprehension, such as the situation model level. The current study explored this relationship by comparing 10 grapheme–color synesthetes who have additional color experiences when they read words that begin with different letters and 48 normal controls on a number of tests of complex working memory capacity and processing at the situation model level. On all tests of working memory capacity, the synesthetes outperformed the controls. Importantly, there was no carryover benefit for the synesthetes for processing at the situation model level. This reinforces the idea that although some aspects of language comprehension are related to working memory span scores, this applies less directly to situation model levels. As a result, this suggests that theories of working memory must take into account this limitation, and the working memory processes that are involved in situation model construction and processing must be derived.

  15. Working memory, situation models, and synesthesia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radvansky, Gabriel A.; Gibson, Bradley S.; McNerney, M. Windy

    Research on language comprehension suggests a strong relationship between working memory span measures and language comprehension. However, there is also evidence that this relationship weakens at higher levels of comprehension, such as the situation model level. The current study explored this relationship by comparing 10 grapheme–color synesthetes who have additional color experiences when they read words that begin with different letters and 48 normal controls on a number of tests of complex working memory capacity and processing at the situation model level. On all tests of working memory capacity, the synesthetes outperformed the controls. Importantly, there was no carryover benefit for the synesthetes for processing at the situation model level. This reinforces the idea that although some aspects of language comprehension are related to working memory span scores, this applies less directly to situation model levels. As a result, this suggests that theories of working memory must take into account this limitation, and the working memory processes that are involved in situation model construction and processing must be derived.

  16. Promoting Different Reading Comprehension Levels through Online Annotations

    ERIC Educational Resources Information Center

    Tseng, Sheng-Shiang; Yeh, Hui-Chin; Yang, Shih-hsien

    2015-01-01

    Previous studies have evaluated reading comprehension as the general understanding of reading texts. However, this broad and generic assessment of reading comprehension overlooks the specific aspects and processes that students need to develop. This study adopted Kintsch's Construction-Integration model to tap into reading comprehension at…

  17. Are Models Easier to Understand than Code? An Empirical Study on Comprehension of Entity-Relationship (ER) Models vs. Structured Query Language (SQL) Code

    ERIC Educational Resources Information Center

    Sanchez, Pablo; Zorrilla, Marta; Duque, Rafael; Nieto-Reyes, Alicia

    2011-01-01

    Models in Software Engineering are considered as abstract representations of software systems. Models highlight relevant details for a certain purpose, whereas irrelevant ones are hidden. Models are supposed to make system comprehension easier by reducing complexity. Therefore, models should play a key role in education, since they would ease the…

  18. Fuzzy comprehensive evaluation for grid-connected performance of integrated distributed PV-ES systems

    NASA Astrophysics Data System (ADS)

    Lv, Z. H.; Li, Q.; Huang, R. W.; Liu, H. M.; Liu, D.

    2016-08-01

    Based on a discussion of the topology structures of integrated distributed photovoltaic (PV) power generation and energy storage (ES) systems, in single or mixed configurations, this paper analyzes the grid-connected performance of integrated distributed photovoltaic and energy storage (PV-ES) systems and proposes a comprehensive evaluation index system. A multi-level fuzzy comprehensive evaluation method based on grey correlation degree is then proposed, and the calculations of the weight matrix and the fuzzy matrix are presented step by step. Finally, a distributed integrated PV-ES power generation system connected to a 380 V low-voltage distribution network is taken as an example, and some suggestions are made based on the evaluation results.
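    A single-level Python sketch of the fuzzy comprehensive evaluation step, B = W · R (in the paper the weights come from grey correlation degrees and the evaluation is multi-level; the weights, indices, and membership values below are fixed, illustrative assumptions):

        # Composite grade membership B = W * R for one level of the evaluation.
        import numpy as np

        weights = np.array([0.4, 0.35, 0.25])      # e.g. power quality, efficiency, reliability
        grades = ["excellent", "good", "fair", "poor"]
        R = np.array([                              # membership of each index in each grade
            [0.5, 0.3, 0.2, 0.0],
            [0.2, 0.5, 0.2, 0.1],
            [0.1, 0.4, 0.4, 0.1],
        ])

        B = weights @ R                             # composite membership vector
        B = B / B.sum()                             # normalise
        print(dict(zip(grades, np.round(B, 3))))
        print("overall grade:", grades[int(np.argmax(B))])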

  19. National Response Framework (NRF)

    EPA Pesticide Factsheets

    The NRF establishes a single, comprehensive approach to domestic incident management to prevent, prepare for, respond to, and recover from terrorist attacks, major disasters, and other emergencies. Built on the National Incident Management System template.

  20. Hospital-Based Comprehensive Care Programs for Children With Special Health Care Needs

    PubMed Central

    Cohen, Eyal; Jovcevska, Vesna; Kuo, Dennis Z.; Mahant, Sanjay

    2014-01-01

    Objective To examine the effectiveness of hospital-based comprehensive care programs in improving the quality of care for children with special health care needs. Data Sources A systematic review was conducted using Ovid MEDLINE, CINAHL, EMBASE, PsycINFO, Sociological Abstracts SocioFile, and Web of Science. Study Selection Evaluations of comprehensive care programs for categorical (those with a single disease) and noncategorical groups of children with special health care needs were included. Selected articles were reviewed independently by 2 raters. Data Extraction Models of care focused on comprehensive care based at least partially in a hospital setting. The main outcome measures were the proportions of studies demonstrating improvement in the Institute of Medicine’s quality-of-care domains (effectiveness of care, efficiency of care, patient or family centeredness, patient safety, timeliness of care, and equity of care). Data Synthesis Thirty-three unique programs were included, 13 (39%) of which were randomized controlled trials. Improved outcomes most commonly reported were efficiency of care (64% [49 of 76 outcomes]), effectiveness of care (60% [57 of 95 outcomes]), and patient or family centeredness (53% [10 of 19 outcomes]). Outcomes less commonly evaluated were patient safety (9% [3 of 33 programs]), timeliness of care (6% [2 of 33 programs]), and equity of care (0%). Randomized controlled trials occurred more frequently in studies evaluating categorical vs noncategorical disease populations (11 of 17 [65%] vs 2 of 16 [17%], P = .008). Conclusions Although positive, the evidence supporting comprehensive hospital-based programs for children with special health care needs is restricted primarily to nonexperimental studies of children with categorical diseases and is limited by inadequate outcome measures. Additional high-quality evidence with appropriate comparative groups and broad outcomes is necessary to justify continued development and growth of programs for broad groups of children with special health care needs. PMID:21646589

  1. A Comprehensive Experiment for Molecular Biology: Determination of Single Nucleotide Polymorphism in Human REV3 Gene Using PCR-RFLP

    ERIC Educational Resources Information Center

    Zhang, Xu; Shao, Meng; Gao, Lu; Zhao, Yuanyuan; Sun, Zixuan; Zhou, Liping; Yan, Yongmin; Shao, Qixiang; Xu, Wenrong; Qian, Hui

    2017-01-01

    Laboratory exercise is helpful for medical students to understand the basic principles of molecular biology and to learn about the practical applications of molecular biology. We have designed a lab course on molecular biology about the determination of single nucleotide polymorphism (SNP) in human REV3 gene, the product of which is a subunit of…

  2. Military Operating Room of the Future

    DTIC Science & Technology

    2012-10-01

    more representative than any single source of data, and a more comprehensive systems analysis than has ever been attempted before. Systems Redesign...Madigan due their Trauma site survey that occurred in June 2011 and the Chief of Surgery, our primary contact, COL Rush, was deployed in Afghanistan from...and nature of flow disruptions allows for the development of evidence-based interventions (Wiegmann, 2006). Flow disruptions collected in a single

  3. Genome-wide single nucleotide polymorphisms (SNPs) for a model invasive ascidian Botryllus schlosseri.

    PubMed

    Gao, Yangchun; Li, Shiguo; Zhan, Aibin

    2018-04-01

    Invasive species cause huge damage to ecology, the environment, and the economy globally. A comprehensive understanding of invasion mechanisms, particularly the genetic bases of micro-evolutionary processes responsible for invasion success, is essential for reducing potential damage caused by invasive species. The golden star tunicate, Botryllus schlosseri, has become a model species in invasion biology, mainly owing to its highly invasive nature and small, well-sequenced genome. However, genome-wide genetic markers have not been well developed in this highly invasive species, thus limiting the comprehensive understanding of the genetic mechanisms of invasion success. Using restriction site-associated DNA (RAD) tag sequencing, here we developed a high-quality resource of 14,119 out of 158,821 SNPs for B. schlosseri. These SNPs were relatively evenly distributed across the chromosomes. SNP annotations showed that the majority of SNPs (63.20%) were located in intergenic regions, and 21.51% and 14.58% were located in introns and exons, respectively. In addition, the potential use of the developed SNPs for population genomics studies was primarily assessed, such as estimates of observed heterozygosity (Ho), expected heterozygosity (He), nucleotide diversity (π), Wright's inbreeding coefficient (FIS), and effective population size (Ne). The developed SNP resource provides genome-wide genetic markers for future genetic and genomic investigations, such as studies of the genetic bases of micro-evolutionary processes responsible for invasion success.
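    For a single biallelic SNP, the quantities named above follow standard single-locus formulas; a minimal Python sketch (the genotype counts are invented for illustration):

        # Observed heterozygosity Ho, expected heterozygosity He = 2pq, and Wright's
        # inbreeding coefficient FIS = 1 - Ho/He from biallelic genotype counts.
        def snp_statistics(n_AA, n_Aa, n_aa):
            n = n_AA + n_Aa + n_aa
            p = (2 * n_AA + n_Aa) / (2 * n)      # frequency of allele A
            q = 1.0 - p
            Ho = n_Aa / n
            He = 2 * p * q
            Fis = 1.0 - Ho / He if He > 0 else 0.0
            return {"p": p, "Ho": Ho, "He": He, "Fis": Fis}

        print(snp_statistics(n_AA=40, n_Aa=45, n_aa=15))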

  4. Memory mechanisms supporting syntactic comprehension.

    PubMed

    Caplan, David; Waters, Gloria

    2013-04-01

    Efforts to characterize the memory system that supports sentence comprehension have historically drawn extensively on short-term memory as a source of mechanisms that might apply to sentences. The focus of these efforts has changed significantly in the past decade. As a result of changes in models of short-term working memory (ST-WM) and developments in models of sentence comprehension, the effort to relate entire components of an ST-WM system, such as those in the model developed by Baddeley (Nature Reviews Neuroscience 4: 829-839, 2003), to sentence comprehension has largely been replaced by an effort to relate more specific mechanisms found in modern models of ST-WM to memory processes that support one aspect of sentence comprehension--the assignment of syntactic structure (parsing) and its use in determining sentence meaning (interpretation) during sentence comprehension. In this article, we present the historical background to recent studies of the memory mechanisms that support parsing and interpretation and review recent research into this relation. We argue that the results of this research do not converge on a set of mechanisms derived from ST-WM that apply to parsing and interpretation. We argue that the memory mechanisms supporting parsing and interpretation have features that characterize another memory system that has been postulated to account for skilled performance: long-term working memory. We propose a model of the relation of different aspects of parsing and interpretation to ST-WM and long-term working memory.

  5. Teaching Comprehension and Study Strategies through Modeling and Thinking Aloud.

    ERIC Educational Resources Information Center

    Nist, Sherrie L.; Kirby, Kate

    1986-01-01

    Focuses on three ideas pertaining to modeling and thinking aloud, presents examples of how the processes can be applied to teaching both text comprehension and study strategies to college developmental readers, and discusses reasons for using modeling and thinking aloud in the classroom. (FL)

  6. Representation of deforestation impacts on climate, water, and nutrient cycles in the ACME earth system model

    NASA Astrophysics Data System (ADS)

    Cai, X.; Riley, W. J.; Zhu, Q.

    2017-12-01

    Deforestation causes a series of changes to the climate, water, and nutrient cycles. Employing a state-of-the-art earth system model—ACME (Accelerated Climate Modeling for Energy), we comprehensively investigate the impacts of deforestation on these processes. We first assess the performance of the ACME Land Model (ALM) in simulating runoff, evapotranspiration, albedo, and plant productivity at 42 FLUXNET sites. The single column mode of ACME is then used to examine climate effects (temperature cooling/warming) and responses of runoff, evapotranspiration, and nutrient fluxes to deforestation. This approach separates local effects of deforestation from global circulation effects. To better understand the deforestation effects in a global context, we use the coupled (atmosphere, land, and slab ocean) mode of ACME to demonstrate the impacts of deforestation on global climate, water, and nutrient fluxes. Preliminary results showed that the land component of ACME has advantages in simulating these processes and that local deforestation has potentially large impacts on runoff and atmospheric processes.

  7. Simulation of VSPT Experimental Cascade Under High and Low Free-Stream Turbulence Conditions

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.; Giel, Paul W.; Flegel, Ashlie B.

    2014-01-01

    Variable-Speed Power Turbines (VSPT) for rotorcraft applications operate at low Reynolds number and over a wide range in incidence associated with shaft speed change. A comprehensive linear cascade data set, which includes the effects of Reynolds number, free-stream turbulence, and incidence, is available, and this paper concerns itself with the presentation and numerical simulation of conditions corresponding to a selected set of those data. As such, post-dictions of blade pressure loading, total-pressure loss and exit flow angles under conditions of high and low turbulence intensity for a single Reynolds number are presented. Analyses are performed with the three-equation turbulence models of Walters-Leylek and Walters and Cokljat. Transition, loading, total-pressure loss and exit angle variations are presented and comparisons are made with experimental data as available. It is concluded that at the low freestream turbulence conditions the Walters-Cokljat model is better suited to predictions while for high freestream conditions the two models generate similar predictions that are generally satisfactory.

  8. Quantitative Analysis of the Efficiency of OLEDs.

    PubMed

    Sim, Bomi; Moon, Chang-Ki; Kim, Kwon-Hyeon; Kim, Jang-Joo

    2016-12-07

    We present a comprehensive model for the quantitative analysis of factors influencing the efficiency of organic light-emitting diodes (OLEDs) as a function of the current density. The model takes into account the contributions made by charge carrier imbalance, quenching processes, and optical design loss of the device arising from various optical effects, including the cavity structure, location and profile of the excitons, effective radiative quantum efficiency, and out-coupling efficiency. Quantitative analysis of the efficiency can be performed with an optical simulation using material parameters and experimental measurements of the exciton profile in the emission layer and the lifetime of the exciton as a function of the current density. This method was applied to three phosphorescent OLEDs based on a single host, mixed host, and exciplex-forming cohost. The three factors (charge carrier imbalance, quenching processes, and optical design loss) were influential in different ways, depending on the device. The proposed model can potentially be used to optimize OLED configurations on the basis of an analysis of the underlying physical processes.
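    A schematic Python sketch of the efficiency factorization as a function of current density (the functional form of the quenching-induced roll-off and all parameter values below are assumptions for illustration; the paper extracts these terms from measured exciton profiles and lifetimes):

        # External quantum efficiency as a product of charge balance, spin fraction,
        # effective radiative efficiency (with an assumed roll-off), and out-coupling.
        def eqe(J, gamma=0.95, eta_spin=1.0, q_r0=0.9, eta_out=0.25, J0=50.0):
            q_eff = q_r0 / (1.0 + J / J0)      # assumed quenching-induced roll-off
            return gamma * eta_spin * q_eff * eta_out

        for J in (1.0, 10.0, 100.0):            # current density in mA/cm^2
            print(f"J = {J:6.1f} mA/cm^2 -> EQE = {100 * eqe(J):.1f} %")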

  9. Simulating Sources of Superstorm Plasmas

    NASA Technical Reports Server (NTRS)

    Fok, Mei-Ching

    2008-01-01

    We evaluated the contributions to magnetospheric pressure (ring current) of the solar wind, polar wind, auroral wind, and plasmaspheric wind, with the surprising result that the main phase pressure is dominated by plasmaspheric protons. We used global simulation fields from the LFM single-fluid ideal MHD model. We embedded the Comprehensive Ring Current Model (CRCM) within it, driven by the LFM transpolar potential and supplied with plasmas at its boundary, including solar wind protons, polar wind protons, auroral wind O+, and plasmaspheric protons. We included auroral outflows and acceleration driven by the LFM ionospheric boundary condition, including parallel ion acceleration driven by upward currents. Our plasmasphere model runs within the CRCM and is driven by it. Ionospheric sources were treated using our Global Ion Kinetics code based on full equations of motion. This treatment neglects inertial loading and pressure exerted by the ionospheric plasmas, and will be superseded by multifluid simulations that include those effects. However, these simulations provide new insights into the respective roles of ionospheric sources in storm-time magnetospheric dynamics.

  10. Simulation of VSPT Experimental Cascade Under High and Low Free-Stream Turbulence Conditions

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.; Giel, Paul W.; Flegel, Ashlie B.

    2015-01-01

    Variable-Speed Power Turbines (VSPT) for rotorcraft applications operate at low Reynolds number and over a wide range of incidence associated with shaft speed change. A comprehensive linear cascade data set that includes the effects of Reynolds number, free-stream turbulence, and incidence is available, and this paper presents the numerical simulation of conditions corresponding to a selected subset of those data. Post-dictions of blade pressure loading, total-pressure loss, and exit flow angle under conditions of high and low turbulence intensity at a single Reynolds number are presented. Analyses are performed with the three-equation turbulence models of Walters-Leylek and Walters-Cokljat. Transition, loading, total-pressure loss, and exit angle variations are presented, and comparisons are made with experimental data where available. It is concluded that at low free-stream turbulence conditions the Walters-Cokljat model is better suited for prediction, while for high free-stream conditions the two models generate similar predictions that are generally satisfactory.

  11. Pathogenesis of Proteus mirabilis Infection

    PubMed Central

    Armbruster, Chelsie E.; Mobley, Harry L. T.; Pearson, Melanie M.

    2017-01-01

    Proteus mirabilis, a Gram-negative rod-shaped bacterium most noted for its swarming motility and urease activity, frequently causes catheter-associated urinary tract infections (CAUTI) that are often polymicrobial. These infections may be accompanied by urolithiasis, the development of bladder or kidney stones due to alkalinization of urine from urease-catalyzed urea hydrolysis. Adherence of the bacterium to epithelial and catheter surfaces is mediated by 17 different fimbriae, most notably MR/P fimbriae. Repressors of motility are often encoded by these fimbrial operons. Motility is mediated by flagella encoded on a single contiguous 54 kb chromosomal sequence. On agar plates, P. mirabilis undergoes a morphological conversion to a filamentous swarmer cell expressing hundreds of flagella. When swarms from different strains meet, a line of demarcation, a “Dienes line”, develops due to the killing action of each strain’s type VI secretion system. During infection, histological damage is caused by cytotoxins including hemolysin and a variety of proteases, some autotransported. The pathogenesis of infection, including the assessment of individual genes and global screens for virulence or fitness factors, has been studied in murine models of ascending UTI or CAUTI using both single-species and polymicrobial models. Global gene expression studies carried out in culture and in the murine model have revealed the unique metabolism of this bacterium. Vaccines, using MR/P fimbria and its adhesin, MrpH, have been shown to be efficacious in the murine model. A comprehensive review of factors associated with urinary tract infection is presented, encompassing both historical perspectives and current advances. PMID:29424333

  12. RNAi High-Throughput Screening of Single- and Multi-Cell-Type Tumor Spheroids: A Comprehensive Analysis in Two and Three Dimensions.

    PubMed

    Fu, Jiaqi; Fernandez, Daniel; Ferrer, Marc; Titus, Steven A; Buehler, Eugen; Lal-Nag, Madhu A

    2017-06-01

    The widespread use of two-dimensional (2D) monolayer cultures for high-throughput screening (HTS) to identify targets in drug discovery has led to attrition in the number of drug targets being validated. Solid tumors are complex, aberrantly growing microenvironments that harness structural components from stroma, nutrients fed through vasculature, and immunosuppressive factors. Increasing evidence of stromally derived signaling broadens the complexity of our understanding of the tumor microenvironment while stressing the importance of developing better models that reflect these interactions. Three-dimensional (3D) models may be more sensitive to certain gene-silencing events than 2D models because of their components of hypoxia, nutrient gradients, and increased dependence on cell-cell interactions, and therefore are more representative of in vivo interactions. Colorectal cancer (CRC) and breast cancer (BC) models composed of epithelial cells only, termed single-cell-type tumor spheroids (SCTS), and multi-cell-type tumor spheroids (MCTS) containing fibroblasts were developed for RNAi HTS in 384-well microplates with flat-bottom wells for 2D screening and round-bottom, ultra-low-attachment wells for 3D screening. We describe the development of a high-throughput assay platform that can assess physiologically relevant phenotypic differences between screening 2D versus 3D SCTS, and 3D SCTS versus MCTS, in the context of different cancer subtypes. This assay platform represents a paradigm shift in how we approach drug discovery that can reduce the attrition rate of drugs that enter the clinic.

  13. Adsorption of the compounds encountered in monosaccharide dehydration in zeolite beta.

    PubMed

    León, Marta; Swift, T Dallas; Nikolakis, Vladimiros; Vlachos, Dionisios G

    2013-06-04

    A comprehensive study of the adsorption of the compounds involved in the dehydration of fructose to 5-hydroxymethyl furfural (HMF) on the zeolite H-BEA with SiO2/Al2O3 = 18 has been carried out. Furthermore, a method for estimating the real adsorption loading from the experimentally measured excess adsorption is developed and applied to calculate the adsorption isotherms for both single-solute and multisolute mixtures. It was found that zeolite H-BEA adsorbs HMF and levulinic acid from water mixtures to a greater extent than sugars and formic acid, which prefer to partition into the aqueous phase. The HMF and levulinic acid adsorption isotherms could be fitted with a Redlich-Peterson isotherm model, while the adsorption of formic acid is better fitted using the Freundlich model and that of the sugars via the Henry model. Adsorption loadings decreased with increasing temperature (0, 25, and 40 °C), which is characteristic of an exothermic process. From the temperature dependence of the isotherms, the limiting heat of adsorption at zero coverage was determined using the van't Hoff equation. Given the importance and complexity of multicomponent systems, several adsorption experiments with multisolute solutions were carried out. In most cases, the ideal adsorbed solution theory (IAST) was shown to satisfactorily predict adsorption from multisolute mixtures using the single-solute isotherms as input.
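    A minimal sketch of a single-solute Redlich-Peterson fit of the kind mentioned above, assuming synthetic concentration-loading data and illustrative starting guesses rather than the paper's measurements; only the isotherm form q = K·C/(1 + a·C^g) is taken from the model named in the abstract.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def redlich_peterson(c, k, a, g):
        """Redlich-Peterson isotherm: q = K*C / (1 + a*C**g)."""
        return k * c / (1.0 + a * np.power(c, g))

    # Hypothetical equilibrium concentrations (mol/L) and loadings (mol/kg zeolite).
    c_eq = np.array([0.01, 0.05, 0.10, 0.20, 0.40, 0.80])
    q_obs = np.array([0.12, 0.45, 0.70, 0.95, 1.15, 1.28])

    # Least-squares fit of the three isotherm parameters from the synthetic data.
    popt, _ = curve_fit(redlich_peterson, c_eq, q_obs, p0=[15.0, 10.0, 0.9], maxfev=10000)
    k_fit, a_fit, g_fit = popt
    print(f"K = {k_fit:.2f}, a = {a_fit:.2f}, g = {g_fit:.2f}")
    ```

    The Freundlich and Henry fits mentioned in the abstract would follow the same pattern with different model functions passed to the fitter.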

  14. Developing a Self-Scoring Comprehensive Instrument to Measure Rest's Four-Component Model of Moral Behavior: The Moral Skills Inventory.

    PubMed

    Chambers, David W

    2011-01-01

    One of the most extensively studied constructs in dental education is the four-component model of moral behavior proposed by James Rest and the set of instruments for measuring it developed by Rest, Muriel Bebeau, and others. Although significant associations have been identified between the four components Rest proposed (called here Moral Sensitivity, Moral Reasoning, Moral Integrity, and Moral Courage) and dental ethics courses and practitioners with disciplined licenses, there is no single instrument that measures all four components, and existing single-component instruments require professional scoring. This article describes the development and validation of a short, self-scoring instrument, the Moral Skills Inventory, that measures all four components. Evidence of face validity, test/retest reliability, and concurrent convergent and divergent predictive validity is demonstrated in three populations: dental students, clinical dental faculty members, and regents and officers of the American College of Dentists. Significant issues remain in developing the Rest four-component model for use in dental education and practice. Specifically, further construct validation research is needed to understand the nature of the components. In particular, it remains undetermined whether moral constructs are characteristics of individuals that drive behavior in specific situations or whether particular patterns of moral behavior learned and used in response to individual circumstances are summarized by researchers and then imputed to practitioners.

  15. Atomistic Free Energy Model for Nucleic Acids: Simulations of Single-Stranded DNA and the Entropy Landscape of RNA Stem-Loop Structures.

    PubMed

    Mak, Chi H

    2015-11-25

    While single-stranded (ss) segments of DNAs and RNAs are ubiquitous in biology, details about their structures have only recently begun to emerge. To study ssDNA and RNAs, we have developed a new Monte Carlo (MC) simulation using a free energy model for nucleic acids that has the atomistic accuracy to capture fine molecular details of the sugar-phosphate backbone. Formulated on the basis of a first-principles calculation of the conformational entropy of the nucleic acid chain, this free energy model correctly reproduced both the long and short length-scale structural properties of ssDNA and RNAs in a rigorous comparison against recent data from fluorescence resonance energy transfer, small-angle X-ray scattering, force spectroscopy, and fluorescence correlation transport measurements on sequences up to ∼100 nucleotides long. With this new MC algorithm, we conducted a comprehensive investigation of the entropy landscape of small RNA stem-loop structures. From a simulated ensemble of ∼10^6 equilibrium conformations, the entropy for the initiation of different size RNA hairpin loops was computed and compared against thermodynamic measurements. Starting from seeded hairpin loops, constrained MC simulations were then used to estimate the entropic costs associated with propagation of the stem. The numerical results provide new direct molecular insights into thermodynamic measurements from macroscopic calorimetry and melting experiments.
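    A minimal sketch of the generic Metropolis Monte Carlo acceptance step that underlies equilibrium conformational sampling of the kind described above; the one-dimensional harmonic "energy" is a stand-in, not the authors' nucleic acid free energy model, and all parameters are illustrative.

    ```python
    import math
    import random

    def energy(x):
        # Illustrative harmonic potential (in kT units); a real application would
        # evaluate the chain's free energy model here instead.
        return 0.5 * x * x

    def metropolis_chain(n_steps=100_000, step=0.5, kT=1.0, seed=0):
        """Sample an equilibrium ensemble with the Metropolis acceptance rule."""
        rng = random.Random(seed)
        x, samples = 0.0, []
        for _ in range(n_steps):
            x_new = x + rng.uniform(-step, step)
            d_e = energy(x_new) - energy(x)
            if d_e <= 0 or rng.random() < math.exp(-d_e / kT):
                x = x_new  # accept the trial move
            samples.append(x)
        return samples

    samples = metropolis_chain()
    print(f"<x^2> = {sum(s * s for s in samples) / len(samples):.2f} (expected 1.0 for kT = 1)")
    ```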

  16. Pyrolysis of reinforced polymer composites: Parameterizing a model for multiple compositions

    NASA Astrophysics Data System (ADS)

    Martin, Geraldine E.

    A single set of material properties was developed to describe the pyrolysis of fiberglass-reinforced polyester composites at multiple composition ratios. Milligram-scale testing was performed on the unsaturated polyester (UP) resin using thermogravimetric analysis (TGA) coupled with differential scanning calorimetry (DSC) to establish and characterize an effective semi-global reaction mechanism consisting of three consecutive first-order reactions. Radiation-driven gasification experiments were conducted on UP resin and the fiberglass composites at compositions ranging from 41 to 54 wt% resin at external heat fluxes from 30 to 70 kW m⁻². The back surface temperature was recorded with an infrared camera and used as the target for inverse analysis to determine the thermal conductivity of the systematically isolated constituent species. Manual parameter iterations were performed in a comprehensive pyrolysis model, ThermaKin. The complete set of properties was validated for its ability to reproduce the mass loss rate during gasification testing.
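    A minimal sketch of a semi-global mechanism of three consecutive first-order reactions integrated over a constant-heating-rate TGA ramp, as described above; the Arrhenius parameters, solid yields, and heating rate are illustrative assumptions, not the fitted ThermaKin property set.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    R = 8.314            # gas constant, J/(mol K)
    BETA = 10.0 / 60.0   # heating rate, K/s (10 K/min)
    T0 = 300.0           # initial temperature, K

    # Illustrative kinetics per step: (pre-exponential A [1/s], activation energy E [J/mol], solid yield nu)
    STEPS = [(1.0e12, 1.6e5, 0.9), (5.0e11, 1.8e5, 0.6), (1.0e10, 2.0e5, 0.2)]

    def rhs(t, y):
        """Mass balances for solid species A -> B -> C -> char under a linear temperature ramp."""
        temp = T0 + BETA * t
        k = [a * np.exp(-e / (R * temp)) for a, e, _ in STEPS]
        d_a = -k[0] * y[0]
        d_b = STEPS[0][2] * k[0] * y[0] - k[1] * y[1]
        d_c = STEPS[1][2] * k[1] * y[1] - k[2] * y[2]
        d_char = STEPS[2][2] * k[2] * y[2]
        return [d_a, d_b, d_c, d_char]

    # Integrate from 300 K to ~1300 K; LSODA handles the stiffness of the fast steps.
    sol = solve_ivp(rhs, (0.0, 6000.0), [1.0, 0.0, 0.0, 0.0], method="LSODA", max_step=5.0)
    print(f"Residual mass fraction at {T0 + BETA * sol.t[-1]:.0f} K: {sol.y[:, -1].sum():.3f}")
    ```

    Subtracting the total solid mass from unity at each time step gives a synthetic TGA mass-loss curve that such a scheme is normally calibrated against.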

  17. Probing Sizes and Shapes of Nobelium Isotopes by Laser Spectroscopy

    NASA Astrophysics Data System (ADS)

    Raeder, S.; Ackermann, D.; Backe, H.; Beerwerth, R.; Berengut, J. C.; Block, M.; Borschevsky, A.; Cheal, B.; Chhetri, P.; Düllmann, Ch. E.; Dzuba, V. A.; Eliav, E.; Even, J.; Ferrer, R.; Flambaum, V. V.; Fritzsche, S.; Giacoppo, F.; Götz, S.; Heßberger, F. P.; Huyse, M.; Kaldor, U.; Kaleja, O.; Khuyagbaatar, J.; Kunz, P.; Laatiaoui, M.; Lautenschläger, F.; Lauth, W.; Mistry, A. K.; Minaya Ramirez, E.; Nazarewicz, W.; Porsev, S. G.; Safronova, M. S.; Safronova, U. I.; Schuetrumpf, B.; Van Duppen, P.; Walther, T.; Wraith, C.; Yakushev, A.

    2018-06-01

    Until recently, ground-state nuclear moments of the heaviest nuclei could only be inferred from nuclear spectroscopy, where model assumptions are required. Laser spectroscopy in combination with modern atomic structure calculations is now able to probe these moments directly, in a comprehensive and nuclear-model-independent way. Here we report on unique access to the differential mean-square charge radii of 252,253,254No, and therefore to changes in nuclear size and shape. State-of-the-art nuclear density functional calculations describe well the changes in nuclear charge radii in the region of the heavy actinides, indicating an appreciable central depression in the deformed proton density distribution in 252,254No isotopes. Finally, the hyperfine splitting of 253No was evaluated, enabling a complementary measure of its (quadrupole) deformation, as well as an insight into the neutron single-particle wave function via the nuclear spin and magnetic moment.

  18. Using formal methods to scope performance challenges for Smart Manufacturing Systems: focus on agility.

    PubMed

    Jung, Kiwook; Morris, K C; Lyons, Kevin W; Leong, Swee; Cho, Hyunbo

    2015-12-01

    Smart Manufacturing Systems (SMS) need to be agile to adapt to new situations by using detailed, precise, and appropriate data for intelligent decision-making. The intricacy of the relationship of strategic goals to operational performance across the many levels of a manufacturing system inhibits the realization of SMS. This paper proposes a method for identifying what aspects of a manufacturing system should be addressed to respond to changing strategic goals. The method uses standard modeling techniques in specifying a manufacturing system and the relationship between strategic goals and operational performance metrics. Two existing reference models related to manufacturing operations are represented formally and harmonized to support the proposed method. The method is illustrated for a single scenario using agility as a strategic goal. By replicating the proposed method for other strategic goals and with multiple scenarios, a comprehensive set of performance challenges can be identified.

  19. The Commercial Open Source Business Model

    NASA Astrophysics Data System (ADS)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  20. An extended algebraic variational multiscale-multigrid-multifractal method (XAVM4) for large-eddy simulation of turbulent two-phase flow

    NASA Astrophysics Data System (ADS)

    Rasthofer, U.; Wall, W. A.; Gravemeier, V.

    2018-04-01

    A novel and comprehensive computational method, referred to as the eXtended Algebraic Variational Multiscale-Multigrid-Multifractal Method (XAVM4), is proposed for large-eddy simulation of the particularly challenging problem of turbulent two-phase flow. The XAVM4 involves multifractal subgrid-scale modeling as well as a Nitsche-type extended finite element method as an approach for two-phase flow. The application of an advanced structural subgrid-scale modeling approach in conjunction with a sharp representation of the discontinuities at the interface between the two bulk fluids promises high-fidelity large-eddy simulation of turbulent two-phase flow. The high potential of the XAVM4 is demonstrated for large-eddy simulation of turbulent two-phase bubbly channel flow, that is, turbulent channel flow carrying a single large bubble of the size of the channel half-width in this particular application.

  1. The multiple deficit model of dyslexia: what does it mean for identification and intervention?

    PubMed

    Ring, Jeremiah; Black, Jeffrey L

    2018-04-24

    Research demonstrates that phonological skills provide the basis of reading acquisition and are a primary processing deficit in dyslexia. This consensus has led to the development of effective methods of reading intervention. However, a single phonological deficit is not sufficient to account for the heterogeneity of individuals with dyslexia, and recent research provides evidence that supports a multiple-deficit model of reading disorders. Two studies are presented that investigate (1) the prevalence of phonological and cognitive processing deficit profiles in children with significant reading disability and (2) the effects of those same phonological and cognitive processing skills on reading development in a sample of children that received treatment for dyslexia. The results are discussed in the context of implications for identification and an intervention approach that accommodates multiple deficits within a comprehensive skills-based reading program.

  2. A Review of Hypothesized Determinants Associated with Bighorn Sheep (Ovis canadensis) Die-Offs

    PubMed Central

    Miller, David S.; Hoberg, Eric; Weiser, Glen; Aune, Keith; Atkinson, Mark; Kimberling, Cleon

    2012-01-01

    Multiple determinants have been hypothesized to cause or favor disease outbreaks among free-ranging bighorn sheep (Ovis canadensis) populations. This paper considered direct and indirect causes of mortality, as well as potential interactions among proposed environmental, host, and agent determinants of disease. A clear, invariant relationship between a single agent and field outbreaks has not yet been documented, in part due to methodological limitations and practical challenges associated with developing rigorous study designs. Therefore, although there is a need to develop predictive models for outbreaks and validated mitigation strategies, uncertainty remains as to whether outbreaks are due to endemic or recently introduced agents. Consequently, absence of established and universal explanations for outbreaks contributes to conflict among wildlife and livestock stakeholders over land use and management practices. This example illustrates the challenge of developing comprehensive models for understanding and managing wildlife diseases in complex biological and sociological environments. PMID:22567546

  3. Using formal methods to scope performance challenges for Smart Manufacturing Systems: focus on agility

    PubMed Central

    Jung, Kiwook; Morris, KC; Lyons, Kevin W.; Leong, Swee; Cho, Hyunbo

    2016-01-01

    Smart Manufacturing Systems (SMS) need to be agile to adapt to new situations by using detailed, precise, and appropriate data for intelligent decision-making. The intricacy of the relationship of strategic goals to operational performance across the many levels of a manufacturing system inhibits the realization of SMS. This paper proposes a method for identifying what aspects of a manufacturing system should be addressed to respond to changing strategic goals. The method uses standard modeling techniques in specifying a manufacturing system and the relationship between strategic goals and operational performance metrics. Two existing reference models related to manufacturing operations are represented formally and harmonized to support the proposed method. The method is illustrated for a single scenario using agility as a strategic goal. By replicating the proposed method for other strategic goals and with multiple scenarios, a comprehensive set of performance challenges can be identified. PMID:27141209

  4. Comprehensive T-matrix Reference Database: A 2009-2011 Update

    NASA Technical Reports Server (NTRS)

    Zakharova, Nadezhda T.; Videen, G.; Khlebtsov, Nikolai G.

    2012-01-01

    The T-matrix method is one of the most versatile and efficient theoretical techniques widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper presents an update to the comprehensive database of peer-reviewed T-matrix publications compiled by us previously and includes the publications that appeared since 2009. It also lists several earlier publications not included in the original database.

  5. Cardiac rehabilitation: a comprehensive review

    PubMed Central

    Lear, Scott A; Ignaszewski, Andrew

    2001-01-01

    Cardiac rehabilitation (CR) is a commonly used treatment for men and women with cardiovascular disease. To date, no single study has conclusively demonstrated a comprehensive benefit of CR. Numerous individual studies, however, have demonstrated beneficial effects such as improved risk-factor profile, slower disease progression, decreased morbidity, and decreased mortality. This paper will review the evidence for the use of CR and discuss the implications and limitations of these studies. The safety, relevance to special populations, challenges, and future directions of CR will also be reviewed. PMID:11806801

  6. The Coherence Formation Model of Illustrated Text Comprehension: A Path Model of Attention to Multimedia Text

    ERIC Educational Resources Information Center

    Fitzhugh, Shannon Leigh

    2012-01-01

    The study reported here tests a model that includes several factors thought to contribute to the comprehension of static multimedia learning materials (i.e. background knowledge, working memory, attention to components as measured with eye movement measures). The model examines the effects of working memory capacity, domain specific (biology) and…

  7. A Mid-Layer Model for Human Reliability Analysis: Understanding the Cognitive Causes of Human Failure Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stacey M. L. Hendrickson; April M. Whaley; Ronald L. Boring

    The Office of Nuclear Regulatory Research (RES) is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency or guidance for the use of multiple methods. As part of this effort an attempt to develop a comprehensive HRA qualitative approach is being pursued. This paper presents a draft of the method’s middle layer, a part of the qualitative analysis phase that links failure mechanisms to performance shaping factors. Starting with a Crew Response Tree (CRT) that has identified human failure events, analysts identify potential failure mechanisms using the mid-layer model. The mid-layer model presented in this paper traces the identification of the failure mechanisms using the Information-Diagnosis/Decision-Action (IDA) model and cognitive models from the psychological literature. Each failure mechanism is grouped according to a phase of IDA. Under each phase of IDA, the cognitive models help identify the relevant performance shaping factors for the failure mechanism. The use of IDA and cognitive models can be traced through fault trees, which provide a detailed complement to the CRT.

  8. A mid-layer model for human reliability analysis : understanding the cognitive causes of human failure events.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Song-Hua; Chang, James Y. H.; Boring,Ronald L.

    2010-03-01

    The Office of Nuclear Regulatory Research (RES) at the US Nuclear Regulatory Commission (USNRC) is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency or guidance for the use of multiple methods. As part of this effort an attempt to develop a comprehensive HRA qualitative approach is being pursued. This paper presents a draft of the method's middle layer, a part of the qualitative analysis phase that links failure mechanisms to performance shaping factors. Starting with a Crew Response Tree (CRT) that has identified human failure events, analysts identify potential failure mechanisms using the mid-layer model. The mid-layer model presented in this paper traces the identification of the failure mechanisms using the Information-Diagnosis/Decision-Action (IDA) model and cognitive models from the psychological literature. Each failure mechanism is grouped according to a phase of IDA. Under each phase of IDA, the cognitive models help identify the relevant performance shaping factors for the failure mechanism. The use of IDA and cognitive models can be traced through fault trees, which provide a detailed complement to the CRT.

  9. Cognitive aging and hearing acuity: modeling spoken language comprehension.

    PubMed

    Wingfield, Arthur; Amichetti, Nicole M; Lash, Amanda

    2015-01-01

    The comprehension of spoken language has been characterized by a number of "local" theories that have focused on specific aspects of the task: models of word recognition, models of selective attention, accounts of thematic role assignment at the sentence level, and so forth. The ease of language understanding (ELU) model (Rönnberg et al., 2013) stands as one of the few attempts to offer a fully encompassing framework for language understanding. In this paper we discuss interactions between perceptual, linguistic, and cognitive factors in spoken language understanding. Central to our presentation is an examination of aspects of the ELU model that apply especially to spoken language comprehension in adult aging, where speed of processing, working memory capacity, and hearing acuity are often compromised. We discuss, in relation to the ELU model, conceptions of working memory and its capacity limitations, the use of linguistic context to aid in speech recognition and the importance of inhibitory control, and language comprehension at the sentence level. Throughout this paper we offer a constructive look at the ELU model: where it is strong and where there are gaps to be filled.

  10. Pharmaceutical pictograms: a model for development and testing for comprehension and utility.

    PubMed

    Montagne, Michael

    2013-01-01

    With concerns about the medication literacy skills of patients comes the need to develop various types of information materials that will enhance understanding and drug use. To review pictogram development projects and to propose a model for pharmaceutical pictogram development and testing for comprehension and use. Previous efforts in developing specific types of pictograms in engineering and safety as well as in health care and pharmacy are collected and summarized in terms of level of comprehension and recall. The impact of pictogram-enhanced medication information materials on knowledge acquisition, information retention, and adherence is assessed. Pictograms are a key component in re-designing medication information to improve comprehension, recall, and adherence. Many types of pictograms still produce low levels of comprehension and the impact of pictograms on medication knowledge is inconsistent. Prior training through patient counseling on the intended meaning and use of pictograms greatly increases their effectiveness. A model for the development and testing of pictograms and pictogram sequences for comprehension and use in medication information is presented and discussed. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Assembly and diploid architecture of an individual human genome via single-molecule technologies

    PubMed Central

    Pendleton, Matthew; Sebra, Robert; Pang, Andy Wing Chun; Ummat, Ajay; Franzen, Oscar; Rausch, Tobias; Stütz, Adrian M; Stedman, William; Anantharaman, Thomas; Hastie, Alex; Dai, Heng; Fritz, Markus Hsi-Yang; Cao, Han; Cohain, Ariella; Deikus, Gintaras; Durrett, Russell E; Blanchard, Scott C; Altman, Roger; Chin, Chen-Shan; Guo, Yan; Paxinos, Ellen E; Korbel, Jan O; Darnell, Robert B; McCombie, W Richard; Kwok, Pui-Yan; Mason, Christopher E; Schadt, Eric E; Bashir, Ali

    2015-01-01

    We present the first comprehensive analysis of a diploid human genome that combines single-molecule sequencing with single-molecule genome maps. Our hybrid assembly markedly improves upon the contiguity observed from traditional shotgun sequencing approaches, with scaffold N50 values approaching 30 Mb, and we identified complex structural variants (SVs) missed by other high-throughput approaches. Furthermore, by combining Illumina short-read data with long reads, we phased both single-nucleotide variants and SVs, generating haplotypes with over 99% consistency with previous trio-based studies. Our work shows that it is now possible to integrate single-molecule and high-throughput sequence data to generate de novo assembled genomes that approach reference quality. PMID:26121404

  12. Assembly and diploid architecture of an individual human genome via single-molecule technologies.

    PubMed

    Pendleton, Matthew; Sebra, Robert; Pang, Andy Wing Chun; Ummat, Ajay; Franzen, Oscar; Rausch, Tobias; Stütz, Adrian M; Stedman, William; Anantharaman, Thomas; Hastie, Alex; Dai, Heng; Fritz, Markus Hsi-Yang; Cao, Han; Cohain, Ariella; Deikus, Gintaras; Durrett, Russell E; Blanchard, Scott C; Altman, Roger; Chin, Chen-Shan; Guo, Yan; Paxinos, Ellen E; Korbel, Jan O; Darnell, Robert B; McCombie, W Richard; Kwok, Pui-Yan; Mason, Christopher E; Schadt, Eric E; Bashir, Ali

    2015-08-01

    We present the first comprehensive analysis of a diploid human genome that combines single-molecule sequencing with single-molecule genome maps. Our hybrid assembly markedly improves upon the contiguity observed from traditional shotgun sequencing approaches, with scaffold N50 values approaching 30 Mb, and we identified complex structural variants (SVs) missed by other high-throughput approaches. Furthermore, by combining Illumina short-read data with long reads, we phased both single-nucleotide variants and SVs, generating haplotypes with over 99% consistency with previous trio-based studies. Our work shows that it is now possible to integrate single-molecule and high-throughput sequence data to generate de novo assembled genomes that approach reference quality.
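    A minimal sketch of how the scaffold N50 statistic quoted above is computed; the scaffold lengths are illustrative, not taken from the reported assembly.

    ```python
    # N50 is the length L such that scaffolds of length >= L cover at least
    # half of the total assembly size.

    def n50(lengths):
        total = sum(lengths)
        running = 0
        for length in sorted(lengths, reverse=True):
            running += length
            if running >= total / 2:
                return length
        return 0

    # Hypothetical scaffold lengths (bp), for illustration only.
    scaffolds = [30_000_000, 25_000_000, 12_000_000, 5_000_000, 1_000_000]
    print(f"Assembly size: {sum(scaffolds):,} bp, N50: {n50(scaffolds):,} bp")
    ```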

  13. Applicability of the Compensatory Encoding Model in Foreign Language Reading: An Investigation with Chinese College English Language Learners

    PubMed Central

    Han, Feifei

    2017-01-01

    While some first language (L1) reading models suggest that inefficient word recognition and small working memory tend to inhibit higher-level comprehension processes, the Compensatory Encoding Model maintains that slow word recognition and small working memory do not normally hinder reading comprehension, as readers are able to operate metacognitive strategies to compensate for inefficient word recognition and working memory limitations as long as they process a reading task without time constraint. Although empirical evidence has accumulated in support of the Compensatory Encoding Model in L1 reading, there is a lack of research testing the Compensatory Encoding Model in foreign language (FL) reading. This research empirically tested the Compensatory Encoding Model in English reading among Chinese college English language learners (ELLs). Two studies were conducted. Study one tested whether reading conditions that vary in time constraint affect the relationship between word recognition, working memory, and reading comprehension. Students were tested on a computerized English word recognition test, a computerized Operation Span task, and reading comprehension under time-constrained and non-time-constrained reading conditions. The correlation and regression analyses showed that the association between word recognition, working memory, and reading comprehension was much stronger in the time-constrained than in the non-time-constrained reading condition. Study two examined whether FL readers were able to operate metacognitive reading strategies to compensate for inefficient word recognition and working memory limitations in non-time-constrained reading. The participants were tested on the same computerized English word recognition test and Operation Span test. They were required to think aloud while reading and to complete the comprehension questions. The think-aloud protocols were coded for concurrent use of reading strategies, classified into language-oriented strategies, content-oriented strategies, re-reading, pausing, and meta-comment. The correlation analyses showed that word recognition and working memory were significantly related only to the frequency of language-oriented strategies, re-reading, and pausing, but not to reading comprehension. Jointly viewed, the results of the two studies, complementing each other, supported the applicability of the Compensatory Encoding Model in FL reading with Chinese college ELLs. PMID:28522984

  14. Applicability of the Compensatory Encoding Model in Foreign Language Reading: An Investigation with Chinese College English Language Learners.

    PubMed

    Han, Feifei

    2017-01-01

    While some first language (L1) reading models suggest that inefficient word recognition and small working memory tend to inhibit higher-level comprehension processes, the Compensatory Encoding Model maintains that slow word recognition and small working memory do not normally hinder reading comprehension, as readers are able to operate metacognitive strategies to compensate for inefficient word recognition and working memory limitations as long as they process a reading task without time constraint. Although empirical evidence has accumulated in support of the Compensatory Encoding Model in L1 reading, there is a lack of research testing the Compensatory Encoding Model in foreign language (FL) reading. This research empirically tested the Compensatory Encoding Model in English reading among Chinese college English language learners (ELLs). Two studies were conducted. Study one tested whether reading conditions that vary in time constraint affect the relationship between word recognition, working memory, and reading comprehension. Students were tested on a computerized English word recognition test, a computerized Operation Span task, and reading comprehension under time-constrained and non-time-constrained reading conditions. The correlation and regression analyses showed that the association between word recognition, working memory, and reading comprehension was much stronger in the time-constrained than in the non-time-constrained reading condition. Study two examined whether FL readers were able to operate metacognitive reading strategies to compensate for inefficient word recognition and working memory limitations in non-time-constrained reading. The participants were tested on the same computerized English word recognition test and Operation Span test. They were required to think aloud while reading and to complete the comprehension questions. The think-aloud protocols were coded for concurrent use of reading strategies, classified into language-oriented strategies, content-oriented strategies, re-reading, pausing, and meta-comment. The correlation analyses showed that word recognition and working memory were significantly related only to the frequency of language-oriented strategies, re-reading, and pausing, but not to reading comprehension. Jointly viewed, the results of the two studies, complementing each other, supported the applicability of the Compensatory Encoding Model in FL reading with Chinese college ELLs.

  15. Historical Text Comprehension Reflective Tutorial Dialogue System

    ERIC Educational Resources Information Center

    Grigoriadou, Maria; Tsaganou, Grammatiki; Cavoura, Theodora

    2005-01-01

    The Reflective Tutorial Dialogue System (ReTuDiS) is a system for modelling learners' historical text comprehension through reflective dialogue. The system infers learners' cognitive profiles and constructs their learner models. Based on the learner model the system plans the appropriate--personalized for learners--reflective tutorial dialogue in…

  16. Accelerated Change in Reading Instruction: The Arkansas Comprehensive School Reform Model.

    ERIC Educational Resources Information Center

    Balkman, Jami Ann

    2001-01-01

    Describes the Arkansas Comprehensive School Reform Model, which focuses on staff development and a collaborative support system for teaching reading in the elementary grades. Reports that preliminary results indicate an average increase of at least 20% on standardized testing scores for students in model classrooms. (NB)

  17. A Comprehensive Expectancy Motivation Model: Implications for Adult Education and Training.

    ERIC Educational Resources Information Center

    Howard, Kenneth W.

    1989-01-01

    The Comprehensive Expectancy Motivation Model is based on valence-instrumentality-expectancy theory. It describes expectancy motivation as part of a larger process that includes past experience, motivation, effort, performance, reward, and need satisfaction. The model has significant implications for the design, marketing, and delivery of adult…

  18. Comprehensive School Reform Models: A Study Guide for Comparing CSR Models (and How Well They Meet Minnesota's Learning Standards).

    ERIC Educational Resources Information Center

    St. John, Edward P.; Loescher, Siri; Jacob, Stacy; Cekic, Osman; Kupersmith, Leigh; Musoba, Glenda Droogsma

    A growing number of schools are exploring the prospect of applying for funding to implement a Comprehensive School Reform (CSR) model. But the process of selecting a CSR model can be complicated because it frequently involves self-study and a review of models to determine which models best meet the needs of the school. This study guide is intended…

  19. [Evaluation on effectiveness of comprehensive control model for soil-transmitted nematodiasis].

    PubMed

    Hong-Chun, Tian; Meng, Tang; Hong, Xie; Han-Gang, Li; Xiao-Ke, Zhou; Chang-Hua, Liu; De-Fu, Zheng; Zhong-Jiu, Tang; Ming-Hui, Li; Cheng-Yu, Wu; Yi-Zhu, Ren

    2011-10-01

    To evaluate the effect of a comprehensive control model for soil-transmitted nematodiasis, Danling County was selected as a demonstration county carrying out the comprehensive prevention model centering on health education, deworming treatment, and improvement of drinking water and lavatories, while Hejiang County was selected as the control. The effects were evaluated by comparing indicators such as the infection rates of soil-transmitted nematodes. The infection rates of soil-transmitted nematodes declined obviously from 2006 to 2009 in the demonstration county: the infection rates of Ascaris lumbricoides, hookworms, and Trichuris trichiura decreased by 91.14%, 81.65%, and 65.77%, respectively. In the control county, those rates showed no downward tendency. In 2006, the rates in the demonstration county were higher than those in the control county, but by 2009 they were lower. After three years of comprehensive prevention, the infection rates of soil-transmitted nematodes declined obviously in the demonstration county, indicating that the epidemic of soil-transmitted nematodiasis can be controlled effectively by the comprehensive prevention model.

  20. The Conceptualization, Development and Implementation of a Comprehensive Guidance Model. [Georgia Comprehensive K-14 Career Guidance Project.] Final Report, July 1, 1975 through June 30, 1977.

    ERIC Educational Resources Information Center

    Vail, Paul

    A project was conducted to develop, test, and implement a comprehensive program for Georgia school systems, grades K-14. The target population included regular students, students with special needs, out-of-school youth, and adults experiencing career problems. Project objectives were to develop a K-14 guidance model, develop a state/local…
