Sample records for ideas underlying quantification

  1. A preliminary study for fully automated quantification of psoriasis severity using image mapping

    NASA Astrophysics Data System (ADS)

    Mukai, Kazuhiro; Iyatomi, Hitoshi

    2014-03-01

    Psoriasis is a common chronic skin disease that seriously detracts from patients' quality of life (QoL). Since no permanent cure is known, keeping the disease appropriately controlled is necessary, and quantification of its severity is therefore important. In clinical practice, the psoriasis area and severity index (PASI) is commonly used for this purpose; however, it is often subjective and troublesome. A fully automatic computer-assisted area and severity index (CASI) was previously proposed for objective quantification of skin disease. It assesses the size and density of erythema by digital image analysis, but it does not account for the confounding effects of differing geometrical conditions during clinical follow-up (i.e., variability in the direction and distance between camera and patient). In this study, we proposed an image alignment method for clinical images and investigated quantification of psoriasis severity under clinical follow-up combined with the idea of CASI. The proposed method finds geometrically corresponding points on the patient's body (the ROI) between images using the Scale-Invariant Feature Transform (SIFT) and applies an affine transform to map pixel values from one image onto the other. Clinical follow-up images from 7 patients with psoriasis lesions on the trunk were used; in each series, our algorithm aligns the images to the geometry of the first image. The proposed method aligned images appropriately on visual assessment, and we confirmed that psoriasis areas were properly extracted using the CASI approach. Although PASI and CASI cannot be compared directly because they define the ROI differently, we confirmed a strong correlation between the two scores under our image quantification method.
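
    A minimal sketch of the alignment step described above (SIFT keypoint matching followed by an affine warp), assuming OpenCV; the image paths, Lowe's ratio-test threshold, and the use of RANSAC are illustrative choices, not details taken from the paper.

    ```python
    # Sketch of SIFT matching + affine mapping between two follow-up images.
    # Requires opencv-python and numpy; file names are placeholders.
    import cv2
    import numpy as np

    baseline = cv2.imread("visit_01.png", cv2.IMREAD_GRAYSCALE)
    followup = cv2.imread("visit_02.png", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(baseline, None)
    kp2, des2 = sift.detectAndCompute(followup, None)

    # Match descriptors and keep good correspondences (Lowe's ratio test).
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des2, des1, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]

    src = np.float32([kp2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # Estimate an affine transform (RANSAC) and warp the follow-up image
    # onto the geometry of the first visit, as in the alignment step above.
    A, inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)
    aligned = cv2.warpAffine(followup, A, (baseline.shape[1], baseline.shape[0]))
    ```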

  2. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying the uncertainty, the most important tasks are to analyze how the uncertainties arise and propagate, and how the simulations develop from benchmark models to new models. Based on the practical needs of engineering and on verification & validation technology, a framework for quantification of uncertainty (QU) is put forward for the case in which simulation of a detonation system is used for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.

  3. Learning and Information Approaches for Inference in Dynamic Data-Driven Geophysical Applications

    NASA Astrophysics Data System (ADS)

    Ravela, S.

    2015-12-01

    Many geophysical inference problems are characterized by non-linear processes, high-dimensional models and complex uncertainties. A dynamic coupling between models, estimation, and sampling is typically sought to efficiently characterize and reduce uncertainty. This process, however, is fraught with difficulties, chief among them dealing with model error and achieving effective uncertainty quantification and data assimilation. In this presentation, we present three key ideas from learning and intelligent systems theory and apply them to two geophysical applications. The first idea is the use of Ensemble Learning to compensate for model error, the second is to develop tractable Information Theoretic Learning to deal with non-Gaussianity in inference, and the third is a Manifold Resampling technique for effective uncertainty quantification. We apply these methods first to the development of a cooperative autonomous observing system using sUAS for studying coherent structures, and second to the problem of quantifying risk from hurricanes and storm surges in a changing climate. Results indicate that learning approaches can enable new effectiveness in cases where standard approaches to model reduction, uncertainty quantification and data assimilation fail.

  4. Partition resampling and extrapolation averaging: approximation methods for quantifying gene expression in large numbers of short oligonucleotide arrays.

    PubMed

    Goldstein, Darlene R

    2006-10-01

    Studies of gene expression using high-density short oligonucleotide arrays have become a standard in a variety of biological contexts. Of the expression measures that have been proposed to quantify expression in these arrays, multi-chip-based measures have been shown to perform well. As gene expression studies increase in size, however, utilizing multi-chip expression measures is more challenging in terms of computing memory requirements and time. A strategic alternative to exact multi-chip quantification on a full large chip set is to approximate expression values based on subsets of chips. This paper introduces an extrapolation method, Extrapolation Averaging (EA), and a resampling method, Partition Resampling (PR), to approximate expression in large studies. An examination of properties indicates that subset-based methods can perform well compared with exact expression quantification. The focus is on short oligonucleotide chips, but the same ideas apply equally well to any array type for which expression is quantified using an entire set of arrays, rather than for only a single array at a time. Software implementing Partition Resampling and Extrapolation Averaging is under development as an R package for the BioConductor project.
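
    A schematic of the subset idea behind Partition Resampling, shown as a sketch only: quantify expression on disjoint random partitions of the chip set and average the per-partition estimates. The `summarize` function, data, and sizes are invented stand-ins for a real multi-chip expression measure.

    ```python
    # Toy illustration: partition chips, summarize each partition, average.
    import numpy as np

    rng = np.random.default_rng(0)
    n_genes, n_chips, n_per_part = 500, 120, 20
    intensities = rng.lognormal(mean=6.0, sigma=1.0, size=(n_genes, n_chips))

    def summarize(chip_block):
        """Placeholder multi-chip expression measure (log-scale median)."""
        return np.median(np.log2(chip_block), axis=1)

    order = rng.permutation(n_chips)
    parts = [order[i:i + n_per_part] for i in range(0, n_chips, n_per_part)]

    # Average per-partition estimates to approximate the full-set estimate.
    approx = np.mean([summarize(intensities[:, p]) for p in parts], axis=0)
    exact = summarize(intensities)  # full-set reference, for comparison
    print(np.corrcoef(approx, exact)[0, 1])
    ```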

  5. Exploring the boundary of a specialist service for adults with intellectual disabilities using a Delphi study: a quantification of stakeholder participation.

    PubMed

    Hempe, Eva-Maria; Morrison, Cecily; Holland, Anthony

    2015-10-01

    There are arguments that a specialist service for adults with intellectual disabilities is needed to address the health inequalities that this group experiences. The boundary of such a specialist service, however, is unclear, and definition is difficult given the varying experiences of the multiple stakeholder groups. The study reported here quantitatively investigates divergence in stakeholders' views of what constitutes a good specialist service for people with intellectual disabilities. It is the first step of a larger project that aims to investigate the purpose, function and design of such a specialist service. The results are intended to support policy and service development. A Delphi study was carried out to elicit the requirements of this new specialist service from stakeholder groups. It consisted of three panels (carers; frontline health professionals; researchers and policymakers) and had three rounds. The quantification of stakeholder participation covers the number of unique ideas per panel, the value of these ideas as determined by the other panels and the level of agreement within and between panels. There is some overlap of ideas about what should constitute this specialist service, but both carers and frontline health professionals contributed unique ideas. Many of these were valued by the researchers and policymakers. Interestingly, carers generated more ideas regarding how to deliver services than what services to deliver. Regarding whether ideas are considered appropriate, the variation both within and between groups is small. On the other hand, the feasibility of solutions is much more contested, with large variations among carers. This study provides a quantified representation of the diversity of ideas among stakeholder groups regarding where the boundary of a specialist service for adults with learning disabilities should sit. The results can be used as a starting point for the design process. The study also offers one way to measure the impact of participation for those interested in participation as a mechanism for service improvement. © 2013 The Authors. Health Expectations. Published by John Wiley & Sons Ltd.

  6. Uncertainty Quantification using Epi-Splines and Soft Information

    DTIC Science & Technology

    2012-06-01

    …use of the Kullback-Leibler divergence measure. The Kullback-Leibler … to illustrate the application of soft information related to the Kullback-Leibler (KL) divergence discussed in Chapter 2. The idea behind applying … information for the estimation of system performance density functions in order to quantify uncertainty. We conduct empirical testing of …
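
    The excerpt above is fragmentary, but it centers on the Kullback-Leibler divergence; for concreteness, here is a minimal discrete form, KL(p || q) = sum_i p_i log(p_i / q_i). This is a generic sketch, not code from the cited report.

    ```python
    # Discrete Kullback-Leibler divergence between two probability vectors.
    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        """KL(p || q) in nats; eps guards against zero entries."""
        p = np.asarray(p, dtype=float) + eps
        q = np.asarray(q, dtype=float) + eps
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log(p / q)))

    print(kl_divergence([0.5, 0.3, 0.2], [0.4, 0.4, 0.2]))  # ~0.025 nats
    ```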

  7. [Exposome: from an intuition to a mandatory research field in occupational and environmental medicine].

    PubMed

    Paganelli, Matteo; De Palma, Giuseppe; Apostoli, Pietro

    2017-11-01

    Just as genomics aims at the collective characterization and quantification of genes, exposomics refers to the totality of lifetime environmental exposures and constitutes a novel approach to studying the role of the environment in human disease. The aim is to assess all human environmental and occupational exposures in order to better understand their contribution to human diseases. The "omics" revolution in fact mostly concerns the underlying method: scientific knowledge is expected to come from the analysis of increasingly extensive databases. The primary focus is on air pollution and water contaminants, but all the determinants of human exposure are conceptually part of the idea of the exposome, including physical and psychological factors. Using 'omic' techniques, the collected exposure data can be linked to biochemical and molecular changes in our bodies. Since the first formulation of the exposome idea, many efforts have been made to translate the concept into research; in particular, two important studies have been started in Europe. We suggest here that occupational medicine, thanks to the methods and knowledge that are part of its background, could be a valuable contributor to the growth of exposure science, including its omic side. Copyright© by Aracne Editrice, Roma, Italy.

  8. Quantification of scaling exponent with Crossover type phenomena for different types of forcing in DC glow discharge plasma

    NASA Astrophysics Data System (ADS)

    Saha, Debajyoti; Shaw, Pankaj Kumar; Ghosh, Sabuj; Janaki, M. S.; Sekar Iyengar, A. N.

    2018-01-01

    We have carried out a detailed study of scaling regions using a detrended fractal analysis test, applying different kinds of forcing (noise, sinusoidal, and square) to the floating potential fluctuations acquired under different pressures in a DC glow discharge plasma. The transition in the dynamics is observed through recurrence plot techniques, an efficient method for observing critical regime transitions in dynamics. The complexity of the nonlinear fluctuations is revealed with the help of recurrence quantification analysis, a suitable tool for investigating recurrence, a ubiquitous feature that provides deep insight into the dynamics of real dynamical systems. An informal test for stationarity, which checks the compatibility of nonlinear approximations to the dynamics made in different segments of a time series, is also proposed. For sinusoidal, noise, and square forcing applied to the fluctuations acquired at P = 0.12 mbar, only one dominant scaling region is observed, whereas for forcing applied to the fluctuations acquired at P = 0.04 mbar, two prominent scaling regions are reliably found for different forcing amplitudes, indicating the signature of a crossover phenomenon. Furthermore, persistent long-range behavior is observed in one of these scaling regions. A comprehensive study of the quantification of the scaling exponents has been carried out as the amplitude and frequency of the sinusoidal and square forcing increase. The scaling exponent can be envisaged as the roughness of the time series; the method provides a single quantitative measure of it, quantifying the correlation properties of a signal.
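
    A bare-bones sketch of a detrended fluctuation estimate of the scaling exponent, in the spirit of the detrended analysis described above; the window sizes, linear detrending, and white-noise test signal are illustrative assumptions, not the authors' implementation.

    ```python
    # Estimate a scaling exponent: slope of log F(n) vs log n, where F(n) is
    # the RMS fluctuation of the linearly detrended integrated profile.
    import numpy as np

    def scaling_exponent(x, window_sizes):
        y = np.cumsum(x - np.mean(x))              # integrated profile
        fluct = []
        for n in window_sizes:
            f2 = []
            for i in range(len(y) // n):
                seg = y[i * n:(i + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                f2.append(np.mean((seg - trend) ** 2))
            fluct.append(np.sqrt(np.mean(f2)))
        # A crossover appears as two distinct slopes over different n ranges.
        return np.polyfit(np.log(window_sizes), np.log(fluct), 1)[0]

    rng = np.random.default_rng(1)
    print(scaling_exponent(rng.standard_normal(4096), [16, 32, 64, 128, 256]))
    # ~0.5 for uncorrelated white noise; persistent signals give > 0.5
    ```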

  9. Multi-fidelity methods for uncertainty quantification in transport problems

    NASA Astrophysics Data System (ADS)

    Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.

    2016-12-01

    We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, the re-scaled Multi Level Monte Carlo (rMLMC) method, based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss the advantages of each approach.
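
    A toy sketch of the two-level telescoping estimator that MLMC-style methods build on: E[Q_H] is approximated by the mean of Q_L over many cheap samples plus the mean of (Q_H - Q_L) over a few coupled samples. Both model functions are invented stand-ins, and rMLMC's re-scaling is not reproduced here.

    ```python
    # Two-level Monte Carlo: many low-fidelity samples + coupled correction.
    import numpy as np

    rng = np.random.default_rng(2)

    def q_low(theta):   # cheap low-resolution model (illustrative)
        return np.sin(theta)

    def q_high(theta):  # expensive high-resolution model (illustrative)
        return np.sin(theta) + 0.05 * theta ** 2

    theta_low = rng.normal(size=10000)   # many cheap evaluations
    theta_high = rng.normal(size=100)    # few expensive evaluations

    # E[Q_H] ~ E[Q_L] + E[Q_H - Q_L], with the correction term computed on
    # the *same* random inputs so that its variance stays small.
    estimate = q_low(theta_low).mean() + (
        q_high(theta_high) - q_low(theta_high)).mean()
    print(estimate)
    ```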

  10. Urban land teleconnections and sustainability

    PubMed Central

    Seto, Karen C.; Reenberg, Anette; Boone, Christopher G.; Fragkias, Michail; Haase, Dagmar; Langanke, Tobias; Marcotullio, Peter; Munroe, Darla K.; Olah, Branislav; Simon, David

    2012-01-01

    This paper introduces urban land teleconnections as a conceptual framework that explicitly links land changes to underlying urbanization dynamics. We illustrate how three key themes that are currently addressed separately in the urban sustainability and land change literatures can lead to incorrect conclusions and misleading results when they are not examined jointly: the traditional system of land classification that is based on discrete categories and reinforces the false idea of a rural–urban dichotomy; the spatial quantification of land change that is based on place-based relationships, ignoring the connections between distant places, especially between urban functions and rural land uses; and the implicit assumptions about path dependency and sequential land changes that underlie current conceptualizations of land transitions. We then examine several environmental “grand challenges” and discuss how urban land teleconnections could help research communities frame scientific inquiries. Finally, we point to existing analytical approaches that can be used to advance development and application of the concept. PMID:22550174

  11. Hybrid time-variant reliability estimation for active control structures under aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui

    2018-04-01

    Considering that multi-source uncertainties, arising from inherent nature as well as from the external environment, are unavoidable and severely affect controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, uncertainty quantification analysis and time-variant reliability estimation for closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the boundary laws of controlled response histories are first confirmed, with specific treatment of the random terms. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Inspired by the first-passage model in random process theory as well as by static probabilistic reliability ideas, a new definition of hybrid time-variant reliability measurement is provided for vibration control systems, and the related solution details are further expounded. Two engineering examples are presented to demonstrate the validity and applicability of the developed methodology.

  12. Quantification of shape and cell polarity reveals a novel mechanism underlying malformations resulting from related FGF mutations during facial morphogenesis

    PubMed Central

    Li, Xin; Young, Nathan M.; Tropp, Stephen; Hu, Diane; Xu, Yanhua; Hallgrímsson, Benedikt; Marcucio, Ralph S.

    2013-01-01

    Fibroblast growth factor (FGF) signaling mutations are a frequent contributor to craniofacial malformations, including midfacial anomalies and craniosynostosis. FGF signaling has been shown to control cellular mechanisms that contribute to facial morphogenesis and growth, such as proliferation, survival, migration and differentiation. We hypothesized that FGF signaling not only controls the magnitude of growth during facial morphogenesis but also regulates the direction of growth via cell polarity. To test this idea, we infected migrating neural crest cells of chicken embryos with replication-competent avian sarcoma virus expressing either FgfR2(C278F), a receptor mutation found in Crouzon syndrome, or the ligand Fgf8. Treated embryos exhibited craniofacial malformations resembling the facial dysmorphologies of craniosynostosis syndromes. Consistent with our hypothesis, ectopic activation of FGF signaling resulted in decreased cell proliferation, increased expression of the Sprouty class of FGF signaling inhibitors, and repressed phosphorylation of ERK/MAPK. Furthermore, quantification of cell polarity in facial mesenchymal cells showed that while the orientation of the Golgi body matches the direction of facial prominence outgrowth in normal cells, in FGF-treated embryos this direction is randomized, consistent with the aberrant growth that we observed. Together, these data demonstrate that FGF signaling regulates cell proliferation and cell polarity and that these cell processes contribute to facial morphogenesis. PMID:23906837

  13. 75 FR 23254 - Office of Special Education and Rehabilitative Services; Overview Information; Training and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-03

    ... assessments, and the development of individualized education programs under Part B of IDEA and individualized... students with disabilities to understand their rights and responsibilities under IDEA, including those under section 615(m) of IDEA upon the student's reaching the age of majority (as appropriate under State...

  14. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    PubMed

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but share the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. In an evaluation on a real data set, the transcript expression estimated by Strawberry has the highest correlation with Nanostring probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
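
    A toy version of the EM idea in the quantification module: reads compatible with several transcripts are fractionally assigned in the E-step and abundances re-estimated in the M-step. The compatibility sets and counts below are invented; Strawberry's actual latent class model (splicing-graph nodes, bias correction) is richer.

    ```python
    # EM for assigning ambiguous read counts to transcripts (toy example).
    import numpy as np

    # Read groups -> indices of the transcripts each group is compatible with.
    compat = [[0], [0, 1], [1, 2], [2]]
    counts = np.array([30.0, 50.0, 40.0, 20.0])   # reads per group
    theta = np.full(3, 1 / 3)                     # initial abundances

    for _ in range(100):
        expected = np.zeros(3)
        for group, c in zip(compat, counts):
            w = theta[group] / theta[group].sum()  # E-step: responsibilities
            expected[group] += c * w
        theta = expected / expected.sum()          # M-step: renormalize
    print(theta)  # converged transcript abundance estimates
    ```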

  15. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. The innovative or value-based idea is realized on the basis of the stakeholder requirements. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, in view of these SIQ problems, this research contributes a new SIQ framework called 'StakeMeter', which is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure, unlike the other methods. It addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation, and it helps in the selection of highly critical stakeholders for VBS systems with less judgmental error.

  16. 48 CFR 15.602 - Policy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Government to encourage the submission of new and innovative ideas in response to Broad Agency Announcements... new and innovative ideas do not fall under topic areas publicized under those programs or techniques, the ideas may be submitted as unsolicited proposals. ...

  17. Evaluation of a mass-balance approach to determine consumptive water use in northeastern Illinois

    USGS Publications Warehouse

    Mills, Patrick C.; Duncker, James J.; Over, Thomas M.; Domanski, Marian; Engel, Frank

    2014-01-01

    Under ideal conditions, accurate quantification of consumptive use at the sewershed scale by the described mass-balance approach might be possible. Under most prevailing conditions, quantification likely would be more costly and time consuming than that of the present study, given the freely contributed technical support of the host community and relatively appropriate conditions of the study area. Essentials to quantification of consumptive use are a fully cooperative community, storm and sanitary sewers that are separate, and newer sewer infrastructure and (or) a robust program for limiting infiltration, exfiltration, and inflow.

  18. 34 CFR 200.29 - Consolidation of funds in a schoolwide program.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (3) Special education. (i) The school may consolidate funds received under part B of the IDEA. (ii... IDEA for that fiscal year, divided by the number of children with disabilities in the jurisdiction of... under part B of IDEA or section 8003(d) of the ESEA may use those funds for any activities under its...

  19. Uncertainty quantification and optimal decisions

    PubMed Central

    2017-01-01

    A mathematical model can be analysed to construct policies for action that are close to optimal for the model. If the model is accurate, such policies will be close to optimal when implemented in the real world. In this paper, the different aspects of an ideal workflow are reviewed: modelling, forecasting, evaluating forecasts, data assimilation and constructing control policies for decision-making. The example of the oil industry is used to motivate the discussion, and other examples, such as weather forecasting and precision agriculture, are used to argue that the same mathematical ideas apply in different contexts. Particular emphasis is placed on (i) uncertainty quantification in forecasting and (ii) how decisions are optimized and made robust to uncertainty in models and judgements. This necessitates full use of the relevant data and, by balancing costs and benefits into the long term, may suggest policies quite different from those relevant to the short term. PMID:28484343

  1. Molecules and elements for quantitative bioanalysis: The allure of using electrospray, MALDI, and ICP mass spectrometry side-by-side.

    PubMed

    Linscheid, Michael W

    2018-03-30

    To understand biological processes, not only reliable identification but also quantification of the constituents plays a pivotal role. This is especially true for the proteome: protein quantification must follow protein identification, since sometimes minute changes in abundance tell the real tale. To obtain quantitative data, many sophisticated strategies using electrospray and MALDI mass spectrometry (MS) have been developed in recent years. All of them have advantages and limitations. Several years ago, we started to work on strategies that are in principle capable of overcoming some of these limits. The fundamental idea is to use elemental signals as a measure for quantities. We began by replacing the radioactive 32P with the "cold" natural 31P to quantify modified nucleotides and phosphorylated peptides and proteins, and later used tagging strategies for more general quantification of proteins. To do this, we introduced inductively coupled plasma mass spectrometry (ICP-MS) into the bioanalytical workflows, allowing not only reliable and sensitive detection but also quantification based on isotope dilution, i.e., absolute measurements using poly-isotopic elements. The detection capability of ICP-MS becomes particularly attractive with heavy metals. The covalently bound protein tags developed in our group are based on the well-known DOTA chelate complex (1,4,7,10-tetraazacyclododecane-N,N',N″,N‴-tetraacetic acid) carrying lanthanoide ions as the metal core. In this review, I outline the development of this mutual assistance between molecular and elemental mass spectrometry and discuss the scope and limitations, particularly for peptide and protein quantification. The lanthanoide tags provide low detection limits and offer multiplexing capabilities thanks to the number of very similar lanthanoides and their isotopes. With isotope dilution comes previously unknown accuracy. Separation techniques such as electrophoresis and HPLC were used with only slightly adapted workflows already in use for quantification in bioanalysis. Imaging mass spectrometry (MSI) with MALDI and laser ablation ICP-MS has complemented the range of applications in recent years. © 2018 Wiley Periodicals, Inc.
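
    A hedged sketch of the basic single-spike isotope-dilution relation that underlies the accuracy claim above: spiking the sample with an isotopically distinct standard and measuring the blended isotope ratio yields an absolute amount. The function and its example numbers are illustrative, not taken from the review.

    ```python
    # Single isotope dilution from a two-isotope mass balance:
    #   n2_sample = n2_spike * (r_spike - r_blend) / (r_blend - r_sample)
    # where r_* are isotope-1/isotope-2 amount ratios of sample, spike, blend.
    def isotope_dilution(n2_spike, r_sample, r_spike, r_blend):
        """Moles of the reference isotope in the sample, from measured ratios."""
        return n2_spike * (r_spike - r_blend) / (r_blend - r_sample)

    # Example: natural ratio 0.1, enriched spike ratio 10, blend measured 1.5.
    print(isotope_dilution(n2_spike=1e-9, r_sample=0.1, r_spike=10.0,
                           r_blend=1.5))  # ~6.1e-9 mol
    ```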

  2. 34 CFR 200.6 - Inclusion of all students.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... all students in the grades assessed in accordance with this section. (a) Students eligible under IDEA...— (A) For each student with a disability, as defined under section 602(3) of the IDEA, appropriate... Act (IDEA) whom the child's IEP team determines cannot participate in all or part of the State...

  3. Metering error quantification under voltage and current waveform distortion

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because of the negative effects on metering accuracy and fairness, the combined energy-metering error is an important subject of study. In this paper, after comparing theoretical metering values with actually recorded values under different meter modes for linear and nonlinear loads, a method for quantifying the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a method for quantifying the metering accuracy error is also proposed. By analyzing the mode error and the accuracy error, a comprehensive error analysis method is presented that is suitable for new energy sources and nonlinear loads. The proposed method has been verified by simulation.

  4. Model-Unified Planning and Execution for Distributed Autonomous System Control

    NASA Technical Reports Server (NTRS)

    Aschwanden, Pascal; Baskaran, Vijay; Bernardini, Sara; Fry, Chuck; Moreno, Maria; Muscettola, Nicola; Plaunt, Chris; Rijsman, David; Tompkins, Paul

    2006-01-01

    The Intelligent Distributed Execution Architecture (IDEA) is a real-time architecture that exploits artificial intelligence planning as the core reasoning engine for interacting autonomous agents. Rather than enforcing separate deliberation and execution layers, IDEA unifies them under a single planning technology. Deliberative and reactive planners reason about, and act according to, a single representation of the past, present and future domain state. The domain state evolves according to the rules dictated by a declarative model of the subsystem to be controlled, the internal processes of the IDEA controller, and interactions with other agents. We present the IDEA concepts (modeling, the IDEA core architecture, and the unification of deliberation and reaction under planning) and illustrate their use in a simple example. Finally, we present several real-world applications of IDEA and compare IDEA to other high-level control approaches.

  5. 76 FR 74779 - List of Correspondence

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... Individuals with Disabilities Education Act (IDEA). Under section 607(f) of the IDEA, the Secretary is... interpretations of the Department of the IDEA or the regulations that implement the IDEA. This list and the... redacted, as appropriate, can be found at: http://www2.ed.gov/policy/speced/guid/idea/index.html . FOR...

  6. IEP Goals. Alliance Action Information Sheets

    ERIC Educational Resources Information Center

    Technical Assistance ALLIANCE for Parent Centers, 2007

    2007-01-01

    IDEA is the nation's special education law. Under IDEA, if a child is found to be a "child with a disability," he or she is eligible for special education and related services. If your child has a disability, under IDEA, a team of people will gather to talk about what special instruction and services your child needs. This team includes…

  7. Network representation of protein interactions: Theory of graph description and analysis.

    PubMed

    Kurzbach, Dennis

    2016-09-01

    A methodological framework is presented for the graph theoretical interpretation of NMR data of protein interactions. The proposed analysis generalizes the idea of network representations of protein structures by expanding it to protein interactions. This approach is based on regularization of residue-resolved NMR relaxation times and chemical shift data and subsequent construction of an adjacency matrix that represents the underlying protein interaction as a graph or network. The network nodes represent protein residues. Two nodes are connected if two residues are functionally correlated during the protein interaction event. The analysis of the resulting network enables the quantification of the importance of each amino acid of a protein for its interactions. Furthermore, the determination of the pattern of correlations between residues yields insights into the functional architecture of an interaction. This is of special interest for intrinsically disordered proteins, since the structural (three-dimensional) architecture of these proteins and their complexes is difficult to determine. The power of the proposed methodology is demonstrated with the example of the interaction between the intrinsically disordered protein osteopontin and its natural ligand heparin. © 2016 The Protein Society.
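
    A schematic of the graph construction described above: threshold a residue-by-residue correlation matrix into an adjacency matrix and rank residues by degree. The data and the 0.5 threshold are synthetic assumptions; the paper derives its correlations from regularized NMR relaxation and chemical-shift data and analyzes richer graph properties.

    ```python
    # Residue network: correlations -> adjacency matrix -> degree ranking.
    import numpy as np

    rng = np.random.default_rng(3)
    n_res = 40
    features = rng.standard_normal((n_res, 25))   # per-residue feature vectors
    corr = np.corrcoef(features)                  # residue-residue correlations

    adjacency = (np.abs(corr) > 0.5).astype(int)  # connect correlated residues
    np.fill_diagonal(adjacency, 0)                # no self-loops

    # Degree as a simple proxy for a residue's importance to the interaction.
    degree = adjacency.sum(axis=1)
    print(np.argsort(degree)[::-1][:5])           # five most connected residues
    ```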

  8. Recent advances in stable isotope labeling based techniques for proteome relative quantification.

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2014-10-24

    The large-scale relative quantification of all proteins expressed in biological samples under different states is of great importance for discovering proteins with important biological functions, as well as for screening disease-related biomarkers and drug targets. Therefore, the accurate quantification of proteins at the proteome level has become one of the key issues in protein science. Herein, recent advances in stable isotope labeling based techniques for relative proteome quantification are reviewed, covering metabolic labeling, chemical labeling and enzyme-catalyzed labeling. Furthermore, future research directions in this field are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. 76 FR 9338 - List of Correspondence

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-17

    ... the Individuals with Disabilities Education Act (IDEA). Under section 607(f) of the IDEA, the... that describes the interpretations of the Department of the IDEA or the regulations that implement the IDEA. FOR FURTHER INFORMATION CONTACT: Laura Duos or Mary Louise Dirrigl. Telephone: (202) 245-7468. If...

  10. Categories of Disability under IDEA

    ERIC Educational Resources Information Center

    National Dissemination Center for Children with Disabilities, 2012

    2012-01-01

    Every year, under the federal law known as the Individuals with Disabilities Education Act (IDEA), millions of children with disabilities receive special services designed to meet their unique needs. Early intervention services are provided through the state to infants and toddlers with disabilities under three years of age and their families. For…

  11. 34 CFR 403.112 - How does a State allocate funds under the Secondary School Vocational Education Program to local...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... individualized education programs under section 614(a)(5) of the IDEA served by the LEA in the fiscal or program... individualized education programs under section 614(a)(5) of the IDEA in the preceding fiscal year. Of that total...

  12. Study the effect of reservoir spatial heterogeneity on CO2 sequestration under an uncertainty quantification (UQ) software framework

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.

    2011-12-01

    In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, focusing on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas/approaches: 1) we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the number of required forward calculations while exploring the parameter space and quantifying the input uncertainty; 2) we use eSTOMP as the forward modeling simulator. eSTOMP is implemented with the Global Arrays toolkit (GA), which is based on one-sided inter-processor communication, supports a shared-memory programming style on distributed-memory platforms, and provides highly scalable performance. It uses a data model that partitions most of the large-scale data structures into a relatively small number of distinct classes, and its lower-level simulator infrastructure (e.g., meshing support, associated data structures, and data mapping to processors) is separated from the higher-level physics and chemistry algorithmic routines through a grid component interface; and 3) besides the faster model and more efficient algorithms that speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to allocate system resources optimally for performance, and to integrate the software packages and data that compose carbon sequestration simulation, computation, analysis, estimation and visualization. We demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.
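
    A minimal illustration of one of the three sampling strategies named above (quasi-Monte Carlo), with a throwaway stand-in for an eSTOMP forward run; the parameter names, ranges, and model are invented for the sketch.

    ```python
    # Quasi-Monte Carlo (Sobol) propagation of parameter uncertainty.
    import numpy as np
    from scipy.stats import qmc

    def forward_model(params):
        """Placeholder quantity of interest, e.g. a CO2 plume metric."""
        log_perm, porosity = params
        return np.exp(0.3 * log_perm) * (1.0 - porosity)

    sampler = qmc.Sobol(d=2, scramble=True, seed=4)
    unit_samples = sampler.random_base2(m=7)        # 128 low-discrepancy points
    # Map the unit hypercube onto (invented) physical parameter ranges.
    params = qmc.scale(unit_samples, [-14.0, 0.1], [-12.0, 0.3])

    qoi = np.array([forward_model(p) for p in params])
    print(qoi.mean(), qoi.std())                    # output-uncertainty summary
    ```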

  13. The Direct Methanol Liquid-Feed Fuel Cell

    NASA Technical Reports Server (NTRS)

    Halpert, Gerald

    1997-01-01

    Until the early 1990's, the idea of a practical direct methanol fuel cell for transportation and other applications was just that, an idea. Several types of fuel cells that operate under near-ambient conditions were under development.

  14. 34 CFR Appendix A to Part 300 - Excess Costs Calculation

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... total expenditures amounts spent for: (1) IDEA, Part B allocation, (2) ESEA, Title I, Part A allocation... last year: (1) From funds under IDEA, Part B allocation $ 200,000 (2) From funds under ESEA, Title I...

  15. 34 CFR Appendix A to Part 300 - Excess Costs Calculation

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... total expenditures amounts spent for: (1) IDEA, Part B allocation, (2) ESEA, Title I, Part A allocation... last year: (1) From funds under IDEA, Part B allocation $ 200,000 (2) From funds under ESEA, Title I...

  16. 21 Ideas: A 42-Year Search to Understand the Nature of Giftedness

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2018-01-01

    In this article, I describe the 21 ideas underlying a 42-year search to understand giftedness. I present the ideas roughly chronologically, in the order in which they arose, and discuss how in a career as in science, progress means supplementing or even superseding one idea with the next. In terms of the 21 ideas, I start with a discussion of how…

  17. Postsecondary Transition under IDEA 2004: A Legal Update

    ERIC Educational Resources Information Center

    Prince, Angela M. T.; Katsiyannis, Antonis; Farmer, Jennie

    2013-01-01

    Postsecondary transition planning for students with disabilities first entered the Individuals with Disabilities Education Act (IDEA) in 1990. The required provisions for transition planning were updated with the amendments to IDEA in 1997 and its reauthorization in 2004. Since IDEA 2004 took effect in July 2005, 11 court cases have been decided…

  18. Universal absolute quantification of biomolecules using element mass spectrometry and generic standards.

    PubMed

    Calderón-Celis, Francisco; Sanz-Medel, Alfredo; Encinar, Jorge Ruiz

    2018-01-23

    We present a novel and highly sensitive ICP-MS approach for absolute quantification of all important target biomolecules containing P, S, Se, As, Br, and/or I (e.g., proteins and phosphoproteins, metabolites, pesticides, drugs), under the same simple instrumental conditions and without requiring any specific and/or isotopically enriched standard.

  19. The characterisation of blood rotation in a human heart chamber based on statistical analysis of vorticity maps

    NASA Astrophysics Data System (ADS)

    Wong, Kelvin K. L.; Kelso, Richard M.; Worthley, Stephen G.; Sanders, Prashanthan; Mazumdar, Jagannath; Abbott, Derek

    2008-12-01

    Modelling of non-stationary cardiac structures is complicated by the complexity of their intrinsic and extrinsic motion. The first known study of haemodynamics due to the beating heart was made by Leonardo da Vinci, who introduced the idea of fluid-solid interaction by describing how vortices develop during cardiac structural interaction with the blood. Heart morphology affects changes in cardiac dynamics during the systolic and diastolic phases. Using flow-imaging techniques such as phase-contrast magnetic resonance imaging, vortices are found to exist in a chamber of the heart as the result of the unique morphological changes of the cardiac chamber wall. The first part of this paper attempts to quantify vortex characteristics by computing vorticity numerically and devising two-dimensional vortical flow maps. The technique relies on determining the properties of vorticity using a statistical quantification of the flow maps and comparing these quantities across different scenarios. As the characteristics of our vorticity maps vary depending on the phase of the cardiac cycle, a robust quantification method is needed to analyse vorticity. In the second part of the paper, the approach is utilised to examine vortices within the human right atrium. Our study shows that a proper quantification of vorticity for the flow field can indicate the strength and number of vortices within a heart chamber.
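
    A sketch of the vorticity-map computation that the statistical analysis above builds on, applied to a synthetic 2D velocity field rather than phase-contrast MRI data: the out-of-plane vorticity is w = dv/dx - du/dy.

    ```python
    # Vorticity of a 2D velocity field via finite differences.
    import numpy as np

    x = np.linspace(-1.0, 1.0, 64)
    y = np.linspace(-1.0, 1.0, 64)
    X, Y = np.meshgrid(x, y)

    u, v = -Y, X                        # solid-body rotation (toy flow)
    dx, dy = x[1] - x[0], y[1] - y[0]

    dv_dx = np.gradient(v, dx, axis=1)
    du_dy = np.gradient(u, dy, axis=0)
    vorticity = dv_dx - du_dy           # ~2 everywhere for this flow

    # Simple statistical quantification of the map, in the spirit above.
    print(vorticity.mean(), vorticity.std())
    ```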

  20. A Previously Unknown Path to Corpuscularism in the Seventeenth Century: Santorio’s Marginalia to the Commentaria in Primam Fen Primi Libri Canonis Avicennae (1625)

    PubMed Central

    Bigotti, Fabrizio

    2017-01-01

    This paper presents some of Santorio's marginalia to his Commentaria in primam fen primi libri Canonis Avicennae (Venice, 1625), which I identified in the Sloane Collection of the British Library in 2016, as well as the evidence for their authorship. The name of the Venetian physician Santorio Santori (1561–1636) is linked with the introduction of quantification in medicine and with the invention of precision instruments that, displayed for the first time in this work, laid the foundations for what we today understand as evidence-based medicine. But Santorio's monumental opus also contains evidence of many quantified experiments and displays his ideas on mixtures, the structure of matter and corpuscles, which are in many cases clarified and completed by the new marginalia. These ideas testify to an early interest in chemistry within the Medical School of Padua which predates both Galileo and Sennert and which has hitherto been unknown. PMID:28350287

  1. Rethinking the Default Construction of Multimodel Climate Ensembles

    DOE PAGES

    Rauser, Florian; Gleckler, Peter; Marotzke, Jochem

    2015-07-21

    Here, we discuss the current code of practice in the climate sciences to routinely create climate model ensembles as ensembles of opportunity from the newest phase of the Coupled Model Intercomparison Project (CMIP). We give a two-step argument to rethink this process. First, the differences between generations of ensembles corresponding to different CMIP phases in key climate quantities are not large enough to warrant an automatic separation into generational ensembles for CMIP3 and CMIP5. Second, we suggest that climate model ensembles cannot continue to be mere ensembles of opportunity but should always be based on a transparent scientific decision process. If ensembles can be constrained by observation, then they should be constructed as target ensembles that are specifically tailored to a physical question. If model ensembles cannot be constrained by observation, then they should be constructed as cross-generational ensembles, including all available model data to enhance structural model diversity and to better sample the underlying uncertainties. To facilitate this, CMIP should guide the necessarily ongoing process of updating experimental protocols for the evaluation and documentation of coupled models. Finally, with an emphasis on easy access to model data and facilitating the filtering of climate model data across all CMIP generations and experiments, our community could return to the underlying idea of using model data ensembles to improve uncertainty quantification, evaluation, and cross-institutional exchange.

  2. Book of Ideas in Business Education. Activities and Ideas to Motivate Students toward Improved Business Education.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Education, Oklahoma City.

    Developed by Oklahoma Business Education Teachers under the auspices of the Oklahoma State Department of Education, this book of ideas contains short, one-to-three-paragraph descriptions of activities and ideas to motivate students toward improved business education. The business education content areas included in this document are divided into…

  3. How to Improve Your Impact Factor: Questioning the Quantification of Academic Quality

    ERIC Educational Resources Information Center

    Smeyers, Paul; Burbules, Nicholas C.

    2011-01-01

    A broad-scale quantification of the measure of quality for scholarship is under way. This trend has fundamental implications for the future of academic publishing and employment. In this essay we want to raise questions about these burgeoning practices, particularly how they affect philosophy of education and similar sub-disciplines. First,…

  4. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher-specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis-generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide-level quantification information to support protein-level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein-level inference. PMID:23710359
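
    A hedged sketch of one common peptide-to-protein rollup for MS1 label-free data: summarize each protein by the median of its peptides' log ratios. This stands in generically for the post-processing methods the paper evaluates, not its exact procedure; the table below is invented.

    ```python
    # Median peptide log-ratio per protein (robust to one aberrant peptide).
    import numpy as np
    import pandas as pd

    peptides = pd.DataFrame({
        "protein": ["P1", "P1", "P1", "P2", "P2"],
        "intensity_a": [1.0e6, 8.0e5, 1.2e6, 3.0e5, 2.5e5],
        "intensity_b": [2.0e6, 1.7e6, 2.3e6, 3.1e5, 2.6e5],
    })
    peptides["log2_ratio"] = np.log2(peptides["intensity_b"]
                                     / peptides["intensity_a"])

    protein_ratios = peptides.groupby("protein")["log2_ratio"].median()
    print(protein_ratios)  # P1 ~ +1 (doubled), P2 ~ 0 (unchanged)
    ```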

  5. Mapping proteins in the presence of paralogs using units of coevolution

    PubMed Central

    2013-01-01

    Background: We study the problem of mapping proteins between two protein families in the presence of paralogs. This problem occurs as a difficult subproblem in coevolution-based computational approaches for protein-protein interaction prediction. Results: Similar to prior approaches, our method is based on the idea that coevolution implies equal rates of sequence evolution among the interacting proteins, and we provide a first attempt to quantify this notion in a formal statistical manner. We call the units that are central to this quantification scheme the units of coevolution. A unit consists of two mapped protein pairs and its score quantifies the coevolution of the pairs. This quantification allows us to provide a maximum likelihood formulation of the paralog mapping problem and to cast it into a binary quadratic programming formulation. Conclusion: CUPID, our software tool based on a Lagrangian relaxation of this formulation, makes it, for the first time, possible to compute state-of-the-art quality pairings in a few minutes of runtime. In summary, we suggest a novel alternative to the earlier available approaches, which is statistically sound and computationally feasible. PMID:24564758

  6. Optical spectral imaging of degeneration of articular cartilage

    NASA Astrophysics Data System (ADS)

    Kinnunen, Jussi; Jurvelin, Jukka S.; Mäkitalo, Jaana; Hauta-Kasari, Markku; Vahimaa, Pasi; Saarakkala, Simo

    2010-07-01

    Osteoarthritis (OA) is a common musculoskeletal disorder often diagnosed during arthroscopy. In OA, visual color changes of the articular cartilage surface are typically observed. We demonstrate in vitro the potential of visible-light spectral imaging (420 to 720 nm) to quantify these color changes. Intact bovine articular cartilage samples (n=26) were degraded both enzymatically, using collagenase, and mechanically, using emery paper (P60 grit, 269 μm particle size). Spectral images are analyzed using standard CIELAB color coordinates and principal component analysis (PCA). After collagenase digestion, changes in the CIELAB coordinates and in the projection of the spectra onto the PCA eigenvector are statistically significant (p<0.05). After mechanical degradation, the grinding tracks could not be visualized in the RGB presentation, i.e., in the visual appearance of the sample to the naked eye under D65 illumination. However, after projection onto the chosen eigenvector, the grinding tracks are revealed. The tracks can also be seen using only one wavelength (469 nm); however, the contrast in the projection image is 1.6 to 2.5 times higher. Our results support the idea that spectral imaging can be used to evaluate the integrity of the cartilage surface.
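
    A minimal sketch of the PCA projection step described above: project per-pixel spectra onto the leading eigenvector and image the scores. The spectra here are simulated; the study works with 420-720 nm spectral images of cartilage.

    ```python
    # Project a spectral image cube onto its first principal component.
    import numpy as np

    rng = np.random.default_rng(5)
    h, w, bands = 32, 32, 31                     # image size x spectral bands
    cube = rng.normal(size=(h, w, bands)) + np.linspace(0.0, 1.0, bands)

    spectra = cube.reshape(-1, bands)
    spectra = spectra - spectra.mean(axis=0)     # center before PCA

    cov = np.cov(spectra, rowvar=False)          # spectral covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    pc1 = eigvecs[:, -1]                         # eigh sorts eigenvalues ascending

    score_image = (spectra @ pc1).reshape(h, w)  # per-pixel projection image
    print(score_image.shape, score_image.std())
    ```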

  7. Automated quantification of myocardial perfusion SPECT using simplified normal limits.

    PubMed

    Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido

    2005-01-01

    To simplify the development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and with visual scoring. The receiver operating characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 +/- 0.02) was higher than by visual scoring (0.83 +/- 0.03) (P = .039) or standard quantification (0.82 +/- 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 +/- 0.02) than for standard quantification (0.85 +/- 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.

  8. Reflection enhances creativity: Beneficial effects of idea evaluation on idea generation.

    PubMed

    Hao, Ning; Ku, Yixuan; Liu, Meigui; Hu, Yi; Bodner, Mark; Grabner, Roland H; Fink, Andreas

    2016-03-01

    The present study aimed to explore the neural correlates underlying the effects of idea evaluation on idea generation in creative thinking. Participants were required to generate original uses of conventional objects (alternative uses task) during EEG recording. A reflection task (mentally evaluating the generated ideas) or a distraction task (object characteristics task) was inserted into the course of idea generation. Behavioral results revealed that participants generated ideas with higher originality after evaluating the generated ideas than after performing the distraction task. The EEG results revealed that idea evaluation was accompanied with upper alpha (10-13 Hz) synchronization, most prominent at frontal cortical sites. Moreover, upper alpha activity in frontal cortices during idea generation was enhanced after idea evaluation. These findings indicate that idea evaluation may elicit a state of heightened internal attention or top-down activity that facilitates efficient retrieval and integration of internal memory representations. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. In-Gel Stable-Isotope Labeling (ISIL): a strategy for mass spectrometry-based relative quantification.

    PubMed

    Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C

    2006-01-01

    Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.

  10. dPCR: A Technology Review

    PubMed Central

    Quan, Phenix-Lan; Sauzade, Martin

    2018-01-01

    Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods. PMID:29677144
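
    The Poisson arithmetic that digital quantification hinges on is compact enough to state directly. A sketch, assuming n equal-volume partitions with a known per-partition volume (the delta-method confidence interval is an illustrative addition):

        import math

        def dpcr_concentration(n_total, n_positive, v_partition_ul):
            # fraction of negatives gives the Poisson mean per partition:
            # P(negative) = exp(-lam)  =>  lam = -ln(p_neg)
            p_neg = (n_total - n_positive) / n_total
            lam = -math.log(p_neg)
            conc = lam / v_partition_ul  # copies per microliter
            # approximate 95% CI on lam via the delta method
            se = math.sqrt((1.0 - p_neg) / (n_total * p_neg))
            return (conc,
                    (lam - 1.96 * se) / v_partition_ul,
                    (lam + 1.96 * se) / v_partition_ul)

        # e.g. 300 positives out of 1200 partitions of 9.6 nL (0.0096 uL) each
        print(dpcr_concentration(1200, 300, 0.0096))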

  11. Classification-based reasoning

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Segami, Carlos

    1991-01-01

    A representation formalism for N-ary relations, quantification, and definition of concepts is described. Three types of conditions are associated with the concepts: (1) necessary and sufficient properties, (2) contingent properties, and (3) necessary properties. Also explained is how complex chains of inferences can be accomplished by representing existentially quantified sentences, and concepts denoted by restrictive relative clauses as classification hierarchies. The representation structures that make possible the inferences are explained first, followed by the reasoning algorithms that draw the inferences from the knowledge structures. All the ideas explained have been implemented and are part of the information retrieval component of a program called Snowy. An appendix contains a brief session with the program.

  12. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    PubMed

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often results in consumer distrust. Therefore methods for quantification of undeclared species are needed. Targeting mitochondrial DNA, e.g. the CYTB gene, for species quantification is unsuitable, because a fivefold inter-tissue variation in mtDNA content per cell results in either an underestimation (-70%) or overestimation (+160%) of species DNA contents. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing a limit of quantification (LOQ) of 0.01% and a limit of detection (LOD) of 0.001% in different meat products. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems.

  13. Family-Directed Child Evaluation and Assessment under IDEA: Lessons from Families and Programs.

    ERIC Educational Resources Information Center

    Berman, Carol; Shaw, Evelyn

    This report discusses policies and practices for family-directed child evaluation and assessment under the Individuals with Disabilities Education Act (IDEA). The scope of the report includes practices across the early childhood spectrum, from birth through 5 years. Commonly used terminology is defined. Issues discussed include: the primacy of…

  14. Annotated Bibliography of Strategies for Infusing Transition Skills into Academic Instruction

    ERIC Educational Resources Information Center

    Holzberg, Debra G.; Rusher, Dana E.

    2017-01-01

    Since 1990, transition planning has been a requirement under the Individuals with Disabilities Education Act (IDEA). Students receiving services under IDEA must have an individualized education program (IEP) with goals aligned to grade-level content standards. In addition, the IEP must ensure the student has the supports necessary, including…

  15. Occupational Therapy Services for Children and Youth under the Individuals with Disabilities Education Act (IDEA).

    ERIC Educational Resources Information Center

    American Occupational Therapy Association, Rockville, MD.

    This handbook is designed to provide registered occupational therapists and certified occupational therapy assistants with guidance in serving children with disabilities and their families under the auspices of the Individuals with Disabilities Education Act (IDEA). The first chapter provides an overview of provisions in the Individuals with…

  16. National Longitudinal Transition Study 2012: Design Documentation. NCEE 2017-4021

    ERIC Educational Resources Information Center

    Burghardt, John; Haimson, Joshua; Lipscomb, Stephen; Liu, Albert Y.; Potter, Frank; Waits, Tiffany; Wang, Sheng

    2017-01-01

    The National Longitudinal Transition Study 2012 (NLTS 2012) is the third in the series of NLTS studies sponsored by the U.S. Department of Education to examine youth with disabilities receiving services under the Individuals with Disabilities Education Act (IDEA), a long-standing federal law last updated in 2004. Under IDEA, youth with…

  17. Classroom Notes Plus: A Quarterly of Teaching Ideas, 2000-2001.

    ERIC Educational Resources Information Center

    Classroom Notes Plus, 2001

    2001-01-01

    This 18th volume of "Classroom Notes Plus" contains descriptions of original, unpublished teaching practices, and of adapted ideas. Under the Ideas from the Classroom section, the August 2000 issue contains the following materials: "The Thought Pot" (Andrew R. West); "Seeing Is Reading: 'The Hollow Men'" (James Penha);…

  18. A Bulldog Mobile Is Born

    ERIC Educational Resources Information Center

    Groff, Suzy

    2015-01-01

    Sometimes great initiatives in education start with just a glimmer of an idea and a belief that building a foundation under that idea can effect change. That glimmer of an idea came to Bandera Independent School District (BISD) from a middle school English teacher who attended an International Reading Association Conference and heard about…

  19. 78 FR 46858 - Proposed Waiver and Extension of the Project Period for the Individuals With Disabilities...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... the Individuals With Disabilities Education Act (IDEA) Partnership Project AGENCY: Office of Special... Assistance (CFDA) Number: 84.326A.] SUMMARY: For the currently funded IDEA Partnership Project (Partnership..., authorized under section 663 of IDEA. The Partnership Project is intended to provide opportunities for...

  20. Time: Assessing Understanding of Core Ideas

    ERIC Educational Resources Information Center

    Thomas, Margaret; McDonough, Andrea; Clarkson, Philip; Clarke, Doug

    2016-01-01

    Although an understanding of time is crucial in our society, curriculum documents have an undue emphasis on reading time and little emphasis on core underlying ideas. Given this context, a one-to-one assessment interview, based on a new framework, was developed and administered to investigate students' understanding of core ideas undergirding the…

  1. Source separation on hyperspectral cube applied to dermatology

    NASA Astrophysics Data System (ADS)

    Mitra, J.; Jolivot, R.; Vabres, P.; Marzani, F. S.

    2010-03-01

    This paper proposes a method for quantifying the components underlying the human skin that are supposed to be responsible for the effective reflectance spectrum of the skin over the visible wavelength range. The method is based on independent component analysis, assuming that the epidermal melanin and the dermal haemoglobin absorbance spectra are independent of each other. The method extracts the source spectra that correspond to the ideal absorbance spectra of melanin and haemoglobin. The noisy melanin spectrum is corrected using a polynomial fit, and the quantifications associated with it are re-estimated. The results produce feasible quantifications of each source component in the examined skin patch.
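
    A minimal sketch of this kind of pipeline with scikit-learn's FastICA, assuming the hyperspectral patch has been flattened to an absorbance matrix (array shapes and the polynomial degree are assumptions, not the authors' implementation):

        import numpy as np
        from sklearn.decomposition import FastICA

        def separate_chromophores(reflectance, n_sources=2):
            # reflectance: (n_wavelengths, n_pixels); absorbance A = -log10(R)
            absorbance = -np.log10(np.clip(reflectance, 1e-6, None))
            ica = FastICA(n_components=n_sources, random_state=0)
            spectra = ica.fit_transform(absorbance)  # (n_wavelengths, n_sources)
            quantities = ica.mixing_                 # (n_pixels, n_sources)
            return absorbance, spectra, quantities

        def refit_after_polynomial_fix(absorbance, spectra, noisy_idx, deg=6):
            # smooth the noisy source spectrum, then re-estimate the quantities
            wl = np.arange(spectra.shape[0])
            spectra = spectra.copy()
            spectra[:, noisy_idx] = np.polyval(
                np.polyfit(wl, spectra[:, noisy_idx], deg), wl)
            quantities, *_ = np.linalg.lstsq(spectra, absorbance, rcond=None)
            return spectra, quantities.T             # (n_pixels, n_sources)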

  2. Quantification is Neither Necessary Nor Sufficient for Measurement

    NASA Astrophysics Data System (ADS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-09-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  3. Classroom Notes Plus: A Quarterly of Teaching Ideas, 2003-2004

    ERIC Educational Resources Information Center

    National Council of Teachers of English, 2004

    2004-01-01

    This issue of "Classroom Notes Plus" contains descriptions of original, unpublished teaching practices, and of adapted ideas. Under the "Ideas from the Classroom" section, the August 2003 issue (v21 n1) contains the following materials: Reading Poetry with Wright's "Black Boy" (David Fuder); Finding Poetry Lost in Translation (James Penha); "Lord…

  4. 25 CFR 39.105 - Are additional funds available for special education?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... with Disabilities Education Act (IDEA). To obtain part B funds, the school must submit an application to OIEP. IDEA funds are available only if the school demonstrates that funds reserved under § 39.104...) The Bureau will facilitate the delivery of IDEA part B funding by: (1) Providing technical assistance...

  5. Classroom Notes Plus: A Quarterly of Teaching Ideas, 2001-2002.

    ERIC Educational Resources Information Center

    Classroom Notes Plus, 2002

    2002-01-01

    This 19th issue of "Notes Plus" contains descriptions of original, unpublished teaching practices, and of adapted ideas. Under the Ideas from the Classroom section, the August 2001 issue contains the following materials: "Imitation: The Sincerest Form of Flattery" (Anna M. Parks); "Stories That Make Us Who We Are"…

  6. Change in Time Utilization by Occupational Therapy and Physical Therapy Service Providers in Schools

    ERIC Educational Resources Information Center

    Goodrich, Elizabeth

    2010-01-01

    Occupational therapy (OT) and physical therapy (PT) are related services that are provided under the Individuals with Disabilities Education Improvement Act of 2004 (IDEA, 20 U.S.C. 1400 et seq.). Related services are provided under the IDEA to assist children with disabilities to benefit from special education. Nationally, there is a critical…

  7. State and Jurisdictional Eligibility Definitions for Infants and Toddlers with Disabilities under IDEA. NECTAC Notes.

    ERIC Educational Resources Information Center

    Shackelford, Jo

    Under Part C of the Individuals with Disabilities Education Act (IDEA), participating states and jurisdictions must provide services to children who are either experiencing developmental delays, or who have a diagnosed mental or physical condition that has a high probability of resulting in developmental delay. Additionally, states may choose to…

  8. State and Jurisdictional Eligibility Definitions for Infants and Toddlers with Disabilities under IDEA. NECTAC Notes No. 16

    ERIC Educational Resources Information Center

    Shackelford, Jo

    2004-01-01

    Under Part C of the Individuals with Disabilities Education Act (IDEA), participating states and jurisdictions must provide services to children who are either experiencing developmental delays, or who have a diagnosed mental or physical condition that has a high probability of resulting in developmental delay. Additionally, states may choose to…

  9. Bayesian calibration of coarse-grained forces: Efficiently addressing transferability

    NASA Astrophysics Data System (ADS)

    Patrone, Paul N.; Rosch, Thomas W.; Phelan, Frederick R.

    2016-04-01

    Generating and calibrating forces that are transferable across a range of state-points remains a challenging task in coarse-grained (CG) molecular dynamics. In this work, we present a coarse-graining workflow, inspired by ideas from uncertainty quantification and numerical analysis, to address this problem. The key idea behind our approach is to introduce a Bayesian correction algorithm that uses functional derivatives of CG simulations to rapidly and inexpensively recalibrate initial estimates f0 of forces anchored by standard methods such as force-matching. Taking density-temperature relationships as a running example, we demonstrate that this algorithm, in concert with various interpolation schemes, can be used to efficiently compute physically reasonable force curves on a fine grid of state-points. Importantly, we show that our workflow is robust to several choices available to the modeler, including the interpolation schemes and tools used to construct f0. In a related vein, we also demonstrate that our approach can speed up coarse-graining by reducing the number of atomistic simulations needed as inputs to standard methods for generating CG forces.
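
    As a toy illustration of the correction step (not the authors' actual algorithm), the linear-Gaussian case can be written in a few lines: a sensitivity matrix J standing in for the functional derivative of the observables with respect to the tabulated force values, a Gaussian prior centered on the force-matched estimate f0, and a Kalman-style posterior-mean update. All names and noise levels here are assumptions:

        import numpy as np

        def bayesian_force_correction(f0, J, obs_model, obs_target,
                                      sigma_obs=0.01, sigma_prior=0.1):
            # f0: (m,) initial force table; J: (k, m) d(observables)/d(forces);
            # obs_model: (k,) observables from a CG run with f0;
            # obs_target: (k,) target observables (e.g. densities)
            P = sigma_prior**2 * np.eye(f0.size)        # prior covariance
            R = sigma_obs**2 * np.eye(obs_target.size)  # observation noise
            K = P @ J.T @ np.linalg.inv(J @ P @ J.T + R)
            return f0 + K @ (obs_target - obs_model)    # posterior-mean forces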

  10. The Legal Meaning of Specific Learning Disability for IDEA Eligibility: The Latest Case Law

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2013-01-01

    Specific learning disability (SLD), although moderately declining in recent years, continues to be the largest of the eligibility classifications under the Individuals with Disabilities Education Act (IDEA; NCES, 2012). The recognition of response to intervention (RTI) in the 2004 amendments of the IDEA as an approach for identifying students with…

  11. 34 CFR 99.2 - What is the purpose of these regulations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Education Act (IDEA). 34 CFR 303.402 and 303.460 identify the confidentiality of information requirements..., services, or other benefits under Part C of IDEA. 34 CFR 300.610 through 300.627 contain the... collected or maintained pursuant to Part B of the IDEA. [53 FR 11943, Apr. 11, 1988, as amended at 61 FR...

  12. 34 CFR 99.2 - What is the purpose of these regulations?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Education Act (IDEA). 34 CFR 303.402 and 303.460 identify the confidentiality of information requirements..., services, or other benefits under Part C of IDEA. 34 CFR 300.610 through 300.627 contain the... collected or maintained pursuant to Part B of the IDEA. [53 FR 11943, Apr. 11, 1988, as amended at 61 FR...

  13. Investigating a Learning Progression for Energy Ideas from upper Elementary through High School

    ERIC Educational Resources Information Center

    Herrmann-Abell, Cari F.; DeBoer, George E.

    2018-01-01

    This study tests a hypothesized learning progression for the concept of energy. It looks at 14 specific ideas under the categories of (i) Energy Forms and Transformations; (ii) Energy Transfer; (iii) Energy Dissipation and Degradation; and (iv) Energy Conservation. It then examines students' growth of understanding within each of these ideas at…

  14. Investigating a Learning Progression for Energy Ideas from Upper Elementary through High School

    ERIC Educational Resources Information Center

    Herrmann-Abell, Cari F.; DeBoer, George E.

    2018-01-01

    This study tests a hypothesized learning progression for the concept of energy. It looks at 14 specific ideas under the categories of (i) Energy Forms and Transformations; (ii) Energy Transfer; (iii) Energy Dissipation and Degradation; and (iv) Energy Conservation. It then examines students' growth of understanding within each of these ideas at…

  15. The "Red Flags" for Child Find under the IDEA: Separating the Law from the Lore

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2015-01-01

    A comprehensive search identified 42 court decisions from late 1996 to early 2014 concerning the primary modern meaning of child find under the Individuals with Disabilities Education Act (IDEA)--whether the district had reasonable suspicion of eligibility and yet did not evaluate the child. The findings from a systematic analysis of these court…

  16. Measuring Schools' Efforts to Partner with Parents of Children Served under IDEA: Scaling and Standard Setting for Accountability Reporting

    ERIC Educational Resources Information Center

    Elbaum, Batya; Fisher, William P., Jr.; Coulter, W. Alan

    2011-01-01

    Indicator 8 of the State Performance Plan (SPP), developed under the 2004 reauthorization of the Individuals with Disabilities Education Act (IDEA 2004, Public Law 108-446) requires states to collect data and report findings related to schools' facilitation of parent involvement. The Schools' Efforts to Partner with Parents Scale (SEPPS) was…

  17. State and Jurisdictional Eligibility Definitions for Infants and Toddlers with Disabilities under IDEA. Nectas Notes, Number 5. Revised.

    ERIC Educational Resources Information Center

    Shackelford, Jo

    Under Part C of the Individuals with Disabilities Education Act (IDEA), participating states and jurisdictions must provide services to children who are either experiencing developmental delays, or who have a diagnosed condition that carries with it a high risk of developmental delay. Eligibility criteria used by the states influence the numbers…

  18. State and Jurisdictional Eligibility Definitions for Infants and Toddlers with Disabilities under IDEA. NECTAC Notes Issue No. 14

    ERIC Educational Resources Information Center

    Shackelford, Jo

    2004-01-01

    Under Part C of the Individuals with Disabilities Education Act (IDEA), participating states and jurisdictions must provide services to children who are either experiencing developmental delays, or who have a diagnosed mental or physical condition that has a high probability of resulting in developmental delay. Additionally, states may choose to…

  19. A Study of States' Monitoring and Improvement Practices under the Individuals with Disabilities Education Act. NCSER 2011-3001

    ERIC Educational Resources Information Center

    Bollmer, Julie; Cronin, Roberta; Brauen, Marsha; Howell, Bethany; Fletcher, Philip; Gonin, Rene; Jenkins, Frank

    2010-01-01

    The Study of Monitoring and Improvement Practices under the Individuals with Disabilities Education Act (IDEA) examined how states monitored the implementation of IDEA by local special education and early intervention services programs. State monitoring and improvement practices in 2004-05 and 2006-07 were the focus of the study. Prior to the…

  20. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (256 intensity levels, 0-255), the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
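
    A minimal sketch of colour thresholding in HSV space, assuming an RGB image scaled to [0, 1] and a hypothetical hue/saturation window for the chromogen of interest:

        import numpy as np
        from matplotlib.colors import rgb_to_hsv

        def stained_fraction(rgb, h_lo=0.55, h_hi=0.75, s_min=0.25, v_min=0.15):
            # rgb: (H, W, 3) float image in [0, 1]
            hsv = rgb_to_hsv(rgb)
            h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
            mask = (h >= h_lo) & (h <= h_hi) & (s >= s_min) & (v >= v_min)
            return mask.mean()  # stained area fraction, comparable across samples

    A grey-level densitometer collapses hue, saturation, and value into one intensity, which is why closely related hues become inseparable in monochrome systems.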

  1. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    NASA Astrophysics Data System (ADS)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface-enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS has a low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development, but most often assay protocols are redesigned around the use of SERS as a quantification method, ultimately complicating existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, because miRNAs commonly exist at relatively low concentrations, amplification methods (e.g., PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  2. 34 CFR 668.231 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (IDEA) (20 U.S.C. 1401), including a student who was determined eligible for special education or related services under the IDEA but was home-schooled or attended private school. (Authority: 20 U.S.C...

  3. 34 CFR 668.231 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (IDEA) (20 U.S.C. 1401), including a student who was determined eligible for special education or related services under the IDEA but was home-schooled or attended private school. (Authority: 20 U.S.C...

  4. Disease quantification on PET/CT images without object delineation

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.

    2017-03-01

    The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions, where establishing ground truth is questionable, and smaller deviations for larger lesions, where ground-truth setup is more reliable. We are currently exploring extensions of the approach to fully automated body-wide DQ, to CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and to other functional forms of disease maps.

  5. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    ERIC Educational Resources Information Center

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  6. Recent developments from the OPEnS Lab

    NASA Astrophysics Data System (ADS)

    Selker, J. S.; Good, S. P.; Higgins, C. W.; Sayde, C.; Buskirk, B.; Lopez, M.; Nelke, M.; Udell, C.

    2016-12-01

    The Openly Published Environmental Sensing (OPEnS) lab is a facility open to all from around the world to use (http://agsci.oregonstate.edu/open-sensing). With 3-D CAD, electronics benches, 3-D printers and laser cutters, and a complete precision metal shop, the lab can build just about anything. Electronic platforms such as the Arduino are combined with cutting-edge sensors and packaged in rugged housings to address critical environmental sensing needs. The results are published in GitHub and in the AGU journal Earth and Space Sciences under the special theme of "Environmental Sensing." In this poster we present advancements including: an ultra-precise isotopic sampler for rainfall; an isotopic sampler for soil gas; a data-logging wind vane that can be mounted on the tether of a balloon; a rain-gage calibrator with three rates of constant application; a <$20 dissolved O2 probe for water; and a stream-bed permeameter that gives rapid quantification of permeability. You can use the OPEnS lab! Just sketch your idea on a white board and send it in. The conversation is started, and your prototype can be ready in a few weeks. We have a staff of three engineers ready to help, whether you are working remotely or decide to spend some time with the team in Corvallis.

  7. 34 CFR 200.1 - State responsibilities for developing challenging academic standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... with Disabilities Education Act (IDEA) who meet the State's criteria under paragraph (e)(2) of this... modified academic achievement standards may be from any of the disability categories listed in the IDEA...

  8. Spectral Analysis of Dynamic PET Studies: A Review of 20 Years of Method Developments and Applications.

    PubMed

    Veronese, Mattia; Rizzo, Gaia; Bertoldo, Alessandra; Turkheimer, Federico E

    2016-01-01

    In Positron Emission Tomography (PET), spectral analysis (SA) allows the quantification of dynamic data by relating the radioactivity measured by the scanner over time to the underlying physiological processes of the system under investigation. Among the different approaches to the quantification of PET data, SA is based on the linear solution of the Laplace transform inversion, whereby the measured arterial and tissue time-activity curves of a radiotracer are used to calculate the input response function of the tissue. In recent years SA has been used with a large number of PET tracers in brain and nonbrain applications, demonstrating that it is a very flexible and robust method for PET data analysis. Unlike the most common PET quantification approaches, which adopt standard nonlinear estimation of compartmental models or some linear simplifications, SA can be applied without defining any specific model configuration and has demonstrated very good sensitivity to the underlying kinetics. This characteristic makes it useful as an investigative tool, especially for the analysis of novel PET tracers. The purpose of this work is to offer an overview of SA, to discuss advantages and limitations of the methodology, and to inform about its applications in the PET field.
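
    The underlying model writes the tissue curve as the plasma input convolved with a non-negative sum of exponentials, and SA recovers the spectrum of amplitudes on a fixed grid of decay rates. A sketch using non-negative least squares, assuming uniformly sampled curves (grid bounds and size are assumptions):

        import numpy as np
        from scipy.optimize import nnls

        def spectral_analysis(t, cp, ct, n_beta=64, beta_min=1e-4, beta_max=1.0):
            # model: ct(t) = cp(t) * sum_j alpha_j exp(-beta_j t), alpha_j >= 0
            dt = t[1] - t[0]
            betas = np.logspace(np.log10(beta_min), np.log10(beta_max), n_beta)
            basis = np.stack([np.convolve(cp, np.exp(-b * t))[: t.size] * dt
                              for b in betas], axis=1)
            alphas, _ = nnls(basis, ct)  # sparse non-negative spectrum, as in SA
            return betas, alphas, basis @ alphas  # grid, spectrum, fitted curve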

  9. INVESTIGATING SURFACE WATER QUALITY IMPACTS ON GROUNDWATER QUALITY UNDER VARYING FLOW CONDITIONS IN THE BARTON SPRINGS SEGMENT OF THE EDWARDS AQUIFER, CENTRAL TEXAS

    EPA Science Inventory

    The expected results from this research include: i) the quantification of the proportion of surface water comprising spring discharge under varying flow conditions; ii) the characterization of surface watersheds under varying antecedent moisture conditions, and evaluation of ...

  10. 34 CFR 200.44 - Public school choice.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Rehabilitation Act of 1973 (Section 504). For students with disabilities under the IDEA and students covered... that term is defined in section 602(8) of the IDEA or 34 CFR 104.33, respectively. (Authority: 20 U.S.C...

  11. 34 CFR 200.44 - Public school choice.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Rehabilitation Act of 1973 (Section 504). For students with disabilities under the IDEA and students covered... that term is defined in section 602(8) of the IDEA or 34 CFR 104.33, respectively. (Authority: 20 U.S.C...

  12. 34 CFR 403.190 - What are the requirements for receiving a subgrant or contract?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... students with individualized education programs developed under the IDEA; (2) Provide assurances that— (i... requirement of section 626 of the IDEA; (2) Assess the special needs of students participating in projects...

  13. From ideas to studies: how to get ideas and sharpen them into research questions.

    PubMed

    Vandenbroucke, Jan P; Pearce, Neil

    2018-01-01

    Where do new research questions come from? This is at best only partially taught in courses or textbooks about clinical or epidemiological research. Methods are taught under the assumption that a researcher already knows the research question and knows which methods will fit that question. Similarly, the real complexity of the thought processes that lead to a scientific undertaking is almost never described in published papers. In this paper, we first discuss how to get an idea that is worth researching. We describe sources of new ideas and how to foster a creative attitude by "cultivating your thoughts". Only a few of these ideas will make it into a study. Next, we describe how to sharpen and focus a research question so that a study becomes feasible and a valid test of the underlying idea. To do this, the idea needs to be "pruned". Pruning a research question means cutting away anything that is unnecessary, so that only the essence remains. This includes determining both the latent and the stated objectives, specific pruning questions, and the use of specific schemes to structure reasoning. After this, the following steps include preparation of a brief protocol, conduct of a pilot study, and writing a draft of the paper including draft tables. Then you are ready to carry out your research.

  14. From ideas to studies: how to get ideas and sharpen them into research questions

    PubMed Central

    Vandenbroucke, Jan P; Pearce, Neil

    2018-01-01

    Where do new research questions come from? This is at best only partially taught in courses or textbooks about clinical or epidemiological research. Methods are taught under the assumption that a researcher already knows the research question and knows which methods will fit that question. Similarly, the real complexity of the thought processes that lead to a scientific undertaking is almost never described in published papers. In this paper, we first discuss how to get an idea that is worth researching. We describe sources of new ideas and how to foster a creative attitude by “cultivating your thoughts”. Only a few of these ideas will make it into a study. Next, we describe how to sharpen and focus a research question so that a study becomes feasible and a valid test of the underlying idea. To do this, the idea needs to be “pruned”. Pruning a research question means cutting away anything that is unnecessary, so that only the essence remains. This includes determining both the latent and the stated objectives, specific pruning questions, and the use of specific schemes to structure reasoning. After this, the following steps include preparation of a brief protocol, conduct of a pilot study, and writing a draft of the paper including draft tables. Then you are ready to carry out your research. PMID:29563838

  15. Multiscale recurrence quantification analysis of order recurrence plots

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian; Lin, Aijing

    2017-03-01

    In this paper, we propose a new method of multiscale recurrence quantification analysis (MSRQA) to analyze the structure of order recurrence plots. MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), MSRQA reveals richer and more recognizable information on the local characteristics of diverse systems, successfully describing their recurrence properties. Both synthetic series and stock market indexes exhibit recurrence properties at large time scales that differ markedly from those at a single time scale. Some systems present more accurate recurrence patterns at large time scales. This demonstrates that the new approach is effective for distinguishing three similar stock market systems and revealing some of their inherent differences.
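
    A sketch of the ingredients, assuming a 1-D series: ordinal patterns from embedded windows, the recurrence rate of the order recurrence plot, and coarse-graining across scales (the embedding dimension and scales are assumptions, not the paper's settings):

        import numpy as np

        def order_recurrence_rate(x, m=3):
            # order pattern = rank tuple of each m-length window
            idx = np.arange(x.size - m + 1)[:, None] + np.arange(m)
            patterns = [tuple(np.argsort(w)) for w in x[idx]]
            counts = {}
            for p in patterns:
                counts[p] = counts.get(p, 0) + 1
            n = len(patterns)
            # fraction of index pairs (i, j) sharing the same order pattern
            return sum(c * c for c in counts.values()) / (n * n)

        def msrqa(x, m=3, scales=(1, 2, 4, 8)):
            out = {}
            for s in scales:
                n = (x.size // s) * s
                xs = x[:n].reshape(-1, s).mean(axis=1)  # coarse-grained series
                out[s] = order_recurrence_rate(xs, m)
            return out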

  16. Impedimetric quantification of the formation process and the chemosensitivity of cancer cell colonies suspended in 3D environment.

    PubMed

    Lei, Kin Fong; Wu, Zong-Ming; Huang, Chia-Hao

    2015-12-15

    In cancer research, the colony formation assay is a gold standard for the investigation of the development of early tumors and the effects of cytotoxic agents on tumors in vitro. Quantification of cancer cell colonies suspended in hydrogel is currently achieved by manual counting under a microscope. It is challenging to microscopically quantify the colony number and size without subjective bias. In this work, impedimetric quantification of cancer cell colonies suspended in hydrogel was successfully developed; it provides a quantitative and objective method to describe the colony formation process and the development of colony size during the culture course. A biosensor embedded with a pair of parallel plate electrodes was fabricated for the impedimetric quantification. Cancer cells (cell line: Huh-7) were encapsulated in methyl cellulose hydrogel and cultured to gradually form cancer cell colonies suspended in a 3D environment. At pre-set time points during the culture course, a small volume (50 μL) of colonies/MC hydrogel was collected, mixed with measurement hydrogel, and loaded into the biosensor for measurement. Hence, the colony formation process could be quantitatively represented by a colony index and a colony size index calculated from electrical impedance. Based on these developments, the chemosensitivity of cancer cell colonies under different concentrations of an anti-cancer drug, i.e., doxorubicin, was quantitatively investigated to study the efficacy of the drug. Also, a dose-response curve was constructed to calculate the IC50 value, which is an important indicator for chemosensitivity assays. These results show that impedimetric quantification is a promising technique for the colony formation assay.

  17. Uncertainty Quantification and Certification Prediction of Low-Boom Supersonic Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Reuter, Bryan W.; Walker, Eric L.; Kleb, Bil; Park, Michael A.

    2014-01-01

    The primary objective of this work was to develop and demonstrate a process for accurate and efficient uncertainty quantification and certification prediction of low-boom, supersonic, transport aircraft. High-fidelity computational fluid dynamics models of multiple low-boom configurations were investigated, including the Lockheed Martin SEEB-ALR body of revolution, the NASA 69 Delta Wing, and the Lockheed Martin 1021-01 configuration. A nonintrusive polynomial chaos surrogate modeling approach was used to reduce the computational cost of propagating mixed inherent (aleatory) and model-form (epistemic) uncertainty from both the computational fluid dynamics model and the near-field to ground-level propagation model. A methodology was also introduced to quantify the plausibility that a design passes certification under uncertainty. Results of this study include the analysis of each of the three configurations of interest under inviscid and fully turbulent flow assumptions. A comparison of the uncertainty outputs and sensitivity analyses between the configurations is also given. The results of this study illustrate the flexibility and robustness of the developed framework as a tool for uncertainty quantification and certification prediction of low-boom, supersonic aircraft.
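
    A minimal sketch of the nonintrusive polynomial chaos step for a single standard-normal input: fit Hermite coefficients by least-squares regression on model samples, then read the mean and variance off the coefficients (the toy response is purely illustrative):

        import numpy as np
        from numpy.polynomial.hermite_e import hermevander

        def fit_pce(xi, y, degree=3):
            # regression on probabilists' Hermite polynomials He_k(xi)
            Psi = hermevander(xi, degree)
            coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)
            # orthogonality: E[He_k^2] = k!, so mean = c_0, var = sum k! c_k^2
            fact = np.cumprod(np.arange(1, degree + 1))
            return coef, coef[0], float(np.sum(fact * coef[1:] ** 2))

        rng = np.random.default_rng(0)
        xi = rng.standard_normal(200)
        y = 1.0 + 0.5 * xi + 0.1 * (xi**2 - 1.0)  # toy surrogate target
        coef, mean, var = fit_pce(xi, y)          # mean ~ 1.0, var ~ 0.27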

  18. Some scale-free networks could be robust under selective node attacks

    NASA Astrophysics Data System (ADS)

    Zheng, Bojin; Huang, Dan; Li, Deyi; Chen, Guisheng; Lan, Wenfei

    2011-04-01

    It is a mainstream idea that scale-free networks are fragile under selective attacks. The Internet is a typical real-world scale-free network, yet it never collapses under the selective attacks of computer viruses and hackers. This phenomenon contradicts the deduction above, because that deduction assumes the same cost to delete an arbitrary node. Hence this paper discusses the behavior of scale-free networks under selective node attacks with differing costs. Through experiments on five complex networks, we show that a scale-free network can be robust under selective node attacks: the more compact the network and the larger its average degree, the more robust the network is; at equal average degree, the more compact network is the more robust. This result enriches the theory of network invulnerability, can be used to build robust social, technological, and biological networks, and also has the potential to help find drug targets.
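
    The cost-aware attack is easy to simulate. A sketch with networkx, assuming removal of a node costs proportional to its degree and the attacker has a fixed budget (all parameter values are illustrative, not the paper's experiments):

        import networkx as nx

        def selective_attack(G, budget, cost=lambda d: d):
            # repeatedly delete the highest-degree node while the budget allows
            H = G.copy()
            spent = 0.0
            while H.number_of_nodes() > 1:
                node, deg = max(H.degree, key=lambda nd: nd[1])
                if spent + cost(deg) > budget:
                    break
                spent += cost(deg)
                H.remove_node(node)
            giant = max(nx.connected_components(H), key=len)
            return len(giant) / G.number_of_nodes()  # surviving giant fraction

        G = nx.barabasi_albert_graph(2000, 3, seed=1)
        print(selective_attack(G, 500))                    # degree-proportional cost
        print(selective_attack(G, 500, cost=lambda d: 1))  # unit cost: classic fragility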

  19. Monte Carlo Modeling-Based Digital Loop-Mediated Isothermal Amplification on a Spiral Chip for Absolute Quantification of Nucleic Acids.

    PubMed

    Xia, Yun; Yan, Shuangqian; Zhang, Xian; Ma, Peng; Du, Wei; Feng, Xiaojun; Liu, Bi-Feng

    2017-03-21

    Digital loop-mediated isothermal amplification (dLAMP) is an attractive approach for absolute quantification of nucleic acids with high sensitivity and selectivity. Theoretical and numerical analysis of dLAMP provides necessary guidance for the design and analysis of dLAMP devices. In this work, a mathematical model was proposed on the basis of the Monte Carlo method and the theories of Poisson statistics and chemometrics. To examine the established model, we fabricated a spiral chip with 1200 uniform and discrete reaction chambers (9.6 nL) for absolute quantification of pathogenic DNA samples by dLAMP. Under the optimized conditions, dLAMP analysis on the spiral chip realized quantification of nucleic acids spanning over 4 orders of magnitude in concentration, with sensitivity as low as 8.7 × 10⁻² copies/μL in 40 min. The experimental results were consistent with the proposed mathematical model, which can provide useful guidance for the future development of dLAMP devices.
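
    The Monte Carlo side of such a model is straightforward to reproduce: distribute molecules uniformly at random over the chambers and compare the positive-chamber count with the Poisson prediction. A sketch (chip dimensions follow the paper; everything else is illustrative):

        import numpy as np

        def simulate_positive_chambers(n_chambers=1200, copies=300,
                                       n_runs=2000, seed=0):
            rng = np.random.default_rng(seed)
            pos = np.empty(n_runs)
            for i in range(n_runs):
                chambers = rng.integers(0, n_chambers, size=copies)
                pos[i] = np.unique(chambers).size   # chambers with >= 1 copy
            poisson_expected = n_chambers * (1.0 - np.exp(-copies / n_chambers))
            return pos.mean(), poisson_expected     # both ~265 here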

  20. Targeted Feature Detection for Data-Dependent Shotgun Proteomics

    PubMed Central

    2017-01-01

    Label-free quantification of shotgun LC–MS/MS data is the prevailing approach in quantitative proteomics but remains computationally nontrivial. The central data analysis step is the detection of peptide-specific signal patterns, called features. Peptide quantification is facilitated by associating signal intensities in features with peptide sequences derived from MS2 spectra; however, missing values due to imperfect feature detection are a common problem. A feature detection approach that directly targets identified peptides (minimizing missing values) but also offers robustness against false-positive features (by assigning meaningful confidence scores) would thus be highly desirable. We developed a new feature detection algorithm within the OpenMS software framework, leveraging ideas and algorithms from the OpenSWATH toolset for DIA/SRM data analysis. Our software, FeatureFinderIdentification (“FFId”), implements a targeted approach to feature detection based on information from identified peptides. This information is encoded in an MS1 assay library, based on which ion chromatogram extraction and detection of feature candidates are carried out. Significantly, when analyzing data from experiments comprising multiple samples, our approach distinguishes between “internal” and “external” (inferred) peptide identifications (IDs) for each sample. On the basis of internal IDs, two sets of positive (true) and negative (decoy) feature candidates are defined. A support vector machine (SVM) classifier is then trained to discriminate between the sets and is subsequently applied to the “uncertain” feature candidates from external IDs, facilitating selection and confidence scoring of the best feature candidate for each peptide. This approach also enables our algorithm to estimate the false discovery rate (FDR) of the feature selection step. We validated FFId based on a public benchmark data set, comprising a yeast cell lysate spiked with protein standards that provide a known ground-truth. The algorithm reached almost complete (>99%) quantification coverage for the full set of peptides identified at 1% FDR (PSM level). Compared with other software solutions for label-free quantification, this is an outstanding result, which was achieved at competitive quantification accuracy and reproducibility across replicates. The FDR for the feature selection was estimated at a low 1.5% on average per sample (3% for features inferred from external peptide IDs). The FFId software is open-source and freely available as part of OpenMS (www.openms.org). PMID:28673088

  1. Targeted Feature Detection for Data-Dependent Shotgun Proteomics.

    PubMed

    Weisser, Hendrik; Choudhary, Jyoti S

    2017-08-04

    Label-free quantification of shotgun LC-MS/MS data is the prevailing approach in quantitative proteomics but remains computationally nontrivial. The central data analysis step is the detection of peptide-specific signal patterns, called features. Peptide quantification is facilitated by associating signal intensities in features with peptide sequences derived from MS2 spectra; however, missing values due to imperfect feature detection are a common problem. A feature detection approach that directly targets identified peptides (minimizing missing values) but also offers robustness against false-positive features (by assigning meaningful confidence scores) would thus be highly desirable. We developed a new feature detection algorithm within the OpenMS software framework, leveraging ideas and algorithms from the OpenSWATH toolset for DIA/SRM data analysis. Our software, FeatureFinderIdentification ("FFId"), implements a targeted approach to feature detection based on information from identified peptides. This information is encoded in an MS1 assay library, based on which ion chromatogram extraction and detection of feature candidates are carried out. Significantly, when analyzing data from experiments comprising multiple samples, our approach distinguishes between "internal" and "external" (inferred) peptide identifications (IDs) for each sample. On the basis of internal IDs, two sets of positive (true) and negative (decoy) feature candidates are defined. A support vector machine (SVM) classifier is then trained to discriminate between the sets and is subsequently applied to the "uncertain" feature candidates from external IDs, facilitating selection and confidence scoring of the best feature candidate for each peptide. This approach also enables our algorithm to estimate the false discovery rate (FDR) of the feature selection step. We validated FFId based on a public benchmark data set, comprising a yeast cell lysate spiked with protein standards that provide a known ground-truth. The algorithm reached almost complete (>99%) quantification coverage for the full set of peptides identified at 1% FDR (PSM level). Compared with other software solutions for label-free quantification, this is an outstanding result, which was achieved at competitive quantification accuracy and reproducibility across replicates. The FDR for the feature selection was estimated at a low 1.5% on average per sample (3% for features inferred from external peptide IDs). The FFId software is open-source and freely available as part of OpenMS (www.openms.org).
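
    The classify-then-score step generalizes well beyond this tool. A sketch with scikit-learn, assuming feature matrices of candidate descriptors (columns such as peak shape or mass error are hypothetical; this is not FFId's exact implementation):

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def score_external_candidates(X_target, X_decoy, X_external):
            # train on internal-ID candidates: targets (1) vs. decoys (0)
            X = np.vstack([X_target, X_decoy])
            y = np.r_[np.ones(len(X_target)), np.zeros(len(X_decoy))]
            clf = make_pipeline(StandardScaler(),
                                SVC(kernel="rbf", probability=True))
            clf.fit(X, y)
            # probability-like confidence for the uncertain external candidates;
            # downstream, keep the best-scoring candidate per peptide
            return clf.predict_proba(X_external)[:, 1]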

  2. Compilation of Projects Addressing the Early Childhood Provisions of IDEA: Discretionary Projects Supported by the Office of Special Education Programs under the Individuals with Disabilities Education Act, Fiscal Year 2002.

    ERIC Educational Resources Information Center

    Danaher, Joan; Armijo, Caroline; Kraus, Robert; Festa, Cathy

    This directory describes approximately 300 discretionary projects addressing the early childhood provisions of the Individuals with Disabilities Education Act (IDEA). It was compiled from four volumes separately published by the ERIC/OSEP Special Project. The discretionary grants and contracts authorized by the 1997 Amendments to the IDEA are…

  3. Compilation of Projects Addressing the Early Childhood Provisions of IDEA. Discretionary Projects Supported by the Office of Special Education Programs under the Individuals with Disabilities Education Act, Fiscal Year 2001.

    ERIC Educational Resources Information Center

    Danaher, Joan; Armijo, Caroline; Kraus, Robert; Festa, Cathy

    This directory describes approximately 300 discretionary projects addressing the early childhood provisions of the Individuals with Disabilities Education Act (IDEA). It was compiled from four volumes separately published by the ERIC/OSEP Special Project. The discretionary grants and contracts authorized by the 1997 Amendments to the IDEA are…

  4. Mapping students' ideas to understand learning in a collaborative programming environment

    NASA Astrophysics Data System (ADS)

    Harlow, Danielle Boyd; Leak, Anne Emerson

    2014-07-01

    Recent studies in learning programming have largely focused on high school and college students; less is known about how young children learn to program. From video data of 20 students using a graphical programming interface, we identified ideas that were shared and evolved through an elementary school classroom. In mapping these ideas and their resulting changes in programs and outputs, we were able to identify the contextual features which contributed to how ideas moved through the classroom as students learned. We suggest this process of idea mapping in visual programming environments as a viable method for understanding collaborative, constructivist learning as well as a context under which experiences can be developed to improve student learning.

  5. Developing "Personality" Taxonomies: Metatheoretical and Methodological Rationales Underlying Selection Approaches, Methods of Data Generation and Reduction Principles.

    PubMed

    Uher, Jana

    2015-12-01

    Taxonomic "personality" models are widely used in research and applied fields. This article applies the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) to scrutinise the three methodological steps that are required for developing comprehensive "personality" taxonomies: 1) the approaches used to select the phenomena and events to be studied, 2) the methods used to generate data about the selected phenomena and events and 3) the reduction principles used to extract the "most important" individual-specific variations for constructing "personality" taxonomies. Analyses of some currently popular taxonomies reveal frequent mismatches between the researchers' explicit and implicit metatheories about "personality" and the abilities of previous methodologies to capture the particular kinds of phenomena toward which they are targeted. Serious deficiencies that preclude scientific quantifications are identified in standardised questionnaires, psychology's established standard method of investigation. These mismatches and deficiencies derive from the lack of an explicit formulation and critical reflection on the philosophical and metatheoretical assumptions being made by scientists and from the established practice of radically matching the methodological tools to researchers' preconceived ideas and to pre-existing statistical theories rather than to the particular phenomena and individuals under study. These findings raise serious doubts about the ability of previous taxonomies to appropriately and comprehensively reflect the phenomena towards which they are targeted and the structures of individual-specificity occurring in them. The article elaborates and illustrates with empirical examples methodological principles that allow researchers to appropriately meet the metatheoretical requirements and that are suitable for comprehensively exploring individuals' "personality".

  6. Detection of multiple damages employing best achievable eigenvectors under Bayesian inference

    NASA Astrophysics Data System (ADS)

    Prajapat, Kanta; Ray-Chaudhuri, Samit

    2018-05-01

    A novel approach is presented in this work to simultaneously localize multiple damaged elements in a structure and to estimate the damage severity of each. For detection of damaged elements, a formulation based on best achievable eigenvectors has been derived. To deal with noisy data, Bayesian inference is employed in the formulation, wherein the likelihood of the Bayesian algorithm is formed from the errors between the best achievable eigenvectors and the measured modes. In this approach, the most probable damage locations are evaluated under Bayesian inference by generating combinations of various possible damaged elements. Once damage locations are identified, damage severities are estimated using Bayesian inference with Markov chain Monte Carlo simulation. The efficiency of the proposed approach has been demonstrated by carrying out a numerical study involving a 12-story shear building. It has been found from this study that damage scenarios involving as little as 10% loss of stiffness in multiple elements are accurately determined (localized and severities quantified) even when 2% noise-contaminated modal data are utilized. Further, this study introduces a term, parameter impact (evaluated based on the sensitivity of modal parameters to structural parameters), to decide the suitability of selecting a particular mode when some idea about the damaged elements is available. It has been demonstrated here that the accuracy and efficiency of the Bayesian quantification algorithm increase if damage localization is carried out a priori. An experimental study involving a laboratory-scale shear building and different stiffness modification scenarios shows that the proposed approach is efficient enough to localize the stories with stiffness modification.
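
    Once the damaged elements are localized, the severity step reduces to posterior sampling over a handful of stiffness-reduction factors. A toy Metropolis sketch, where mode_error is a user-supplied, model-specific residual between measured and predicted modes (all names and noise levels are assumptions, not the paper's formulation):

        import numpy as np

        def metropolis_damage(theta0, mode_error, n_iter=20000,
                              step=0.02, sigma=0.02, seed=0):
            rng = np.random.default_rng(seed)
            theta = np.asarray(theta0, dtype=float)

            def log_like(th):
                r = mode_error(th)               # residual between mode shapes
                return -0.5 * float(r @ r) / sigma**2

            ll, samples = log_like(theta), []
            for _ in range(n_iter):
                prop = np.clip(theta + step * rng.standard_normal(theta.size),
                               0.0, 1.0)
                ll_prop = log_like(prop)
                if np.log(rng.random()) < ll_prop - ll:  # Metropolis accept
                    theta, ll = prop, ll_prop
                samples.append(theta.copy())
            return np.asarray(samples)  # posterior mean ~ severity estimates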

  7. Driving under the influence of cannabis: pitfalls, validation, and quality control of a UPLC-MS/MS method for the quantification of tetrahydrocannabinol in oral fluid collected with StatSure, Quantisal, or Certus collector.

    PubMed

    Wille, Sarah M R; Di Fazio, Vincent; Ramírez-Fernandez, Maria del Mar; Kummer, Natalie; Samyn, Nele

    2013-02-01

    "Driving under the influence of drugs" (DUID) has a large impact on the worldwide mortality risk. Therefore, DUID legislations based on impairment or analytical limits are adopted. Drug detection in oral fluid is of interest due to the ease of sampling during roadside controls. The prevalence of Δ9-tetrahydrocannabinol (THC) in seriously injured drivers ranges from 0.5% to 7.6% in Europe. For these reasons, the quantification of THC in oral fluid collected with 3 alternative on-site collectors is presented and discussed in this publication. An ultra-performance liquid chromatography-mass spectrometric quantification method for THC in oral fluid samples collected with the StatSure (Diagnostic Systems), Quantisal (Immunalysis), and Certus (Concateno) devices was validated according to the international guidelines. Small sample volumes of 100-200 μL were extracted using hexane. Special attention was paid to factors such as matrix effects, THC adsorption onto the collector, and stability in the collection fluid. A relatively high-throughput analysis was developed and validated according to ISO 17025 requirements. Although the effects of the matrix on the quantification could be minimized using a deuterated internal standard, and stability was acceptable according the validation data, adsorption of THC onto the collectors was a problem. For the StatSure device, THC was totally recovered from the collector pad after storage for 24 hours at room temperature or 7 days at 4°C. A loss of 15%-25% was observed for the Quantisal collector, whereas the recovery from the Certus device was irreproducible (relative standard deviation, 44%-85%) and low (29%-80%). During the roadside setting, a practical problem arose: small volumes of oral fluid (eg, 300 μL) were collected. However, THC was easily detected and concentrations ranged from 8 to 922 ng/mL in neat oral fluid. A relatively high-throughput analysis (40 samples in 4 hours) adapted for routine DUID analysis was developed and validated for THC quantification in oral fluid samples collected from drivers under the influence of cannabis.

  8. Kant and the scientific study of consciousness.

    PubMed

    Sturm, Thomas; Wunderlich, Falk

    2010-01-01

    We argue that Kant's views about consciousness, the mind-body problem and the status of psychology as a science all differ drastically from the way in which these topics are conjoined in present debates about the prominent idea of a science of consciousness. Kant never used the concept of consciousness in the now dominant sense of phenomenal qualia; his discussions of the mind-body problem center not on the reducibility of mental properties but of substances; and his views about the possibility of psychology as a science did not employ the requirement of a mechanistic explanation, but of a quantification of phenomena. This shows strikingly how deeply philosophical problems and conceptions can change even if they look similar on the surface.

  9. [Clinical and analytical toxicology of opiate, cocaine and amphetamine].

    PubMed

    Feliu, Catherine; Fouley, Aurélie; Millart, Hervé; Gozalo, Claire; Marty, Hélène; Djerada, Zoubir

    2015-01-01

    In several circumstances, the determination and quantification of illicit drugs in biological fluids are decisive. Contexts vary: driving under the influence, traffic accidents, clinical and forensic toxicology, doping analysis, and drug-facilitated crime. Whole blood is the preferred matrix for the quantification of illicit drugs. Gas chromatography coupled with mass spectrometry (GC-MS) is the gold standard for these analyses, and any newly developed method must perform at least as well. Nowadays, new technologies are available to biologists and clinicians: liquid chromatography coupled with mass spectrometry (LC/MS) or with a tandem mass spectrometer (LC/MS/MS). The aim of this paper is to describe the state of the art regarding techniques of confirmation by mass spectrometry used for the quantification of conventional drugs except cannabis.

  10. PET Quantification of the Norepinephrine Transporter in Human Brain with (S,S)-18F-FMeNER-D2.

    PubMed

    Moriguchi, Sho; Kimura, Yasuyuki; Ichise, Masanori; Arakawa, Ryosuke; Takano, Harumasa; Seki, Chie; Ikoma, Yoko; Takahata, Keisuke; Nagashima, Tomohisa; Yamada, Makiko; Mimura, Masaru; Suhara, Tetsuya

    2017-07-01

    Norepinephrine transporter (NET) in the brain plays important roles in human cognition and the pathophysiology of psychiatric disorders. Two radioligands, (S,S)-11C-MRB and (S,S)-18F-FMeNER-D2, have been used for imaging NETs in the thalamus and midbrain (including the locus coeruleus) using PET in humans. However, NET density in the equally important cerebral cortex has not been well quantified because of unfavorable kinetics with (S,S)-11C-MRB and defluorination with (S,S)-18F-FMeNER-D2, which can complicate NET quantification in the cerebral cortex adjacent to the skull containing defluorinated 18F radioactivity. In this study, we established analysis methods for quantification of NET density in the brain, including the cerebral cortex, using (S,S)-18F-FMeNER-D2 PET. Methods: We analyzed our previous (S,S)-18F-FMeNER-D2 PET data of 10 healthy volunteers dynamically acquired for 240 min with arterial blood sampling. The effects of defluorination on NET quantification in the superficial cerebral cortex were evaluated by establishing the time stability of NET density estimations with an arterial input 2-tissue-compartment model, which guided the less-invasive reference tissue model and area under the time-activity curve methods to accurately quantify NET density in all brain regions including the cerebral cortex. Results: Defluorination of (S,S)-18F-FMeNER-D2 became prominent toward the latter half of the 240-min scan. Total distribution volumes in the superficial cerebral cortex increased with scan durations beyond 120 min. We verified that 90-min dynamic scans provided a sufficient amount of data for quantification of NET density unaffected by defluorination. Reference tissue model binding potential values from the 90-min scan data and area under the time-activity curve ratios of 70- to 90-min data allowed for the accurate quantification of NET density in the cerebral cortex. Conclusion: We have established methods for quantification of NET densities in the brain, including the cerebral cortex, unaffected by defluorination using (S,S)-18F-FMeNER-D2. These results suggest that we can accurately quantify NET density with a 90-min (S,S)-18F-FMeNER-D2 scan in broad brain areas. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
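
    The area-under-the-curve ratio mentioned above reduces to simple numerical integration of regional time-activity curves. A minimal sketch with hypothetical numbers (all activity values and regions are assumed, not taken from the study):

    ```python
    import numpy as np

    # Hypothetical decay-corrected time-activity curves (kBq/mL), 70-90 min frames.
    t = np.array([70.0, 75.0, 80.0, 85.0, 90.0])
    tac_target = np.array([4.1, 3.9, 3.8, 3.6, 3.5])      # e.g., a NET-rich region
    tac_reference = np.array([3.0, 2.8, 2.7, 2.6, 2.5])   # region with low NET density

    def auc(y, t):
        """Trapezoidal area under a sampled time-activity curve."""
        return float(((y[1:] + y[:-1]) / 2 * np.diff(t)).sum())

    ratio = auc(tac_target, t) / auc(tac_reference, t)
    print(f"AUC(70-90 min) target/reference ratio = {ratio:.2f}")  # >1: specific binding
    ```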

  11. A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy

    NASA Astrophysics Data System (ADS)

    Bennun, Leonardo

    2017-07-01

    A new smoothing method for improving the identification and quantification of spectral functions, based on prior knowledge of the signals that are expected to be quantified, is presented. These signals are used as weighting coefficients in the smoothing algorithm. The smoothing method was conceived for atomic and nuclear spectroscopies, particularly techniques where net counts are proportional to acquisition time, such as particle induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. The algorithm, when properly applied, distorts neither the form nor the intensity of the signal, so it is well suited to all kinds of spectroscopic techniques. The method is extremely effective at reducing high-frequency noise in the signal, much more so than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in their accuracy. We still have to evaluate the improvement in the quality of the results when this method is applied to real experimental data. We expect better characterization of the net-area quantification of the peaks and smaller detection and quantification limits. We have applied this method to signals that obey Poisson statistics, but with the same ideas and criteria it could be applied to time series. In the general case, when this algorithm is applied to experimental results, the sought characteristic functions required for this weighted smoothing method should be obtained from a system with strong stability. If the sought signals are not perfectly clean, this method should be applied with care.
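
    One plausible reading of the weighted-smoothing idea, sketched below under assumed parameters: the known line shape of the expected signal is used as a unit-area convolution kernel over a Poisson-counting spectrum, so net counts are preserved while high-frequency noise is suppressed. This is an illustration of the technique class, not the author's exact algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.arange(512)

    def peak(center, amp, width):
        return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

    # Toy Poisson-counting spectrum: flat background plus two characteristic lines.
    spectrum = rng.poisson(20 + peak(200, 300, 4) + peak(350, 120, 4)).astype(float)

    # Weighting kernel built from the known (assumed Gaussian) line shape;
    # normalized to unit area so the smoothing preserves net counts.
    half = 12
    kernel = np.exp(-0.5 * (np.arange(-half, half + 1) / 4.0) ** 2)
    kernel /= kernel.sum()

    smoothed = np.convolve(spectrum, kernel, mode="same")
    ```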

  12. Solar prediction and intelligent machines

    NASA Technical Reports Server (NTRS)

    Johnson, Gordon G.

    1987-01-01

    The solar prediction program is aimed at reducing or eliminating the need to thoroughly understand the previously developed process while still being able to produce a prediction. Substantial progress was made in identifying the procedures to be coded, as well as in testing some of the work already coded. Another project involves developing ideas and software that should result in a machine capable of learning as well as carrying on an intelligent conversation over a wide range of topics. The underlying idea is to use primitive ideas and construct higher-order ideas from these, which can then be easily related to one another.

  13. Compilation of Projects Addressing the Early Childhood Provisions of IDEA: Discretionary Projects Supported by the Office of Special Education Programs under the Individuals with Disabilities Education Act, Fiscal Year, 2003.

    ERIC Educational Resources Information Center

    Danaher, Joan; Armijo, Caroline; Hipps, Cherie; Kraus, Robert

    2004-01-01

    This directory contains 262 discretionary projects addressing the early childhood provisions of the Individuals with Disabilities Education Act (IDEA). It was compiled from four volumes separately published by the ERIC/OSEP Special Project. The discretionary grants and contracts authorized by the 1997 Amendments to the IDEA are administered by the…

  14. Urine biomarkers informative of human kidney allograft rejection and tolerance.

    PubMed

    Nissaisorakarn, Voravech; Lee, John Richard; Lubetzky, Michelle; Suthanthiran, Manikkam

    2018-05-01

    We developed urinary cell messenger RNA (mRNA) profiling to monitor the in vivo status of human kidney allografts, based on our conceptualization that the kidney allograft may function as an in vivo flow cell sorter allowing graft-infiltrating cells access to the glomerular ultrafiltrate, so that interrogation of urinary cells is informative of allograft status. For profiling urinary cells, we developed two-step preamplification-enhanced real-time quantitative PCR (RT-QPCR) assays with a customized amplicon, the preamplification compensating for the low RNA yield from urine and the customized amplicon facilitating absolute quantification of mRNA and overcoming the inherent limitations of the relative quantification widely used in RT-QPCR assays. Herein, we review our discovery and validation of urinary cell mRNAs as noninvasive biomarkers prognostic and diagnostic of acute cellular rejection (ACR) in kidney allografts. We summarize our results reflecting the utility of urinary cell mRNA profiling for predicting reversal of ACR with anti-rejection therapy; differential diagnosis of kidney allograft dysfunction; and noninvasive diagnosis and prognosis of BK virus nephropathy. Messenger RNA profiles associated with human kidney allograft tolerance are also summarized in this review. Altogether, data supporting the idea that urinary cell mRNA profiles are informative of kidney allograft status and tolerance are reviewed in this report. Copyright © 2018. Published by Elsevier Inc.
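
    The contrast with relative quantification can be made concrete: the conventional route to absolute copy numbers in RT-QPCR is a standard curve of Ct versus log copy number. A minimal sketch with hypothetical standards (not the authors' customized-amplicon assay):

    ```python
    import numpy as np

    # Standard curve: known copy numbers and measured Ct values (hypothetical).
    copies_std = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
    ct_std = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

    # Fit Ct = slope * log10(copies) + intercept.
    slope, intercept = np.polyfit(np.log10(copies_std), ct_std, 1)
    efficiency = 10 ** (-1 / slope) - 1      # ~1.0 corresponds to 100% PCR efficiency

    def copies_from_ct(ct):
        """Invert the standard curve to get absolute copy number."""
        return 10 ** ((ct - intercept) / slope)

    print(f"efficiency = {efficiency:.2f}; Ct 25.0 -> {copies_from_ct(25.0):.0f} copies")
    ```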

  15. Japan and Iraq: A Comparison

    DTIC Science & Technology

    2007-05-10

    Power of Ideas Democracy, freedom of speech, equal protection under the law, gender equality, freedom of religion, and free markets are extremely... freedom of speech and religion, personal property rights, etc… are extremely powerful ideas that aren't well known in most of the areas that the US

  16. Curiosity & Connections

    ERIC Educational Resources Information Center

    Lim, Kien H.

    2014-01-01

    Retaining mathematical knowledge is difficult for many students, especially for those who learn facts and procedures without understanding the meanings underlying the symbols and operations. Repeated practice may be necessary for developing skills but is unlikely to make conceptual ideas stick. An idea is more likely to stick if students are…

  17. Idea Bank: Steps to Visibility.

    ERIC Educational Resources Information Center

    Music Educators Journal, 1983

    1983-01-01

    Unique ideas about how to maintain interest in musicals, concerts, and other music performances are described. For example, Project Parent Awareness encouraged parent participation in children's music education and the Akron (Ohio) All-City Festivals of Music provided students with performing opportunities under well-known conductors. (CS)

  18. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002; the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.
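
    The paper's specific figure of merit is not reproduced here; as a simple illustration of quantifying superposition in a fixed reference basis, the l1-type quantity familiar from coherence theory vanishes exactly on basis states:

    ```python
    import numpy as np

    def l1_superposition(psi):
        """Sum of |off-diagonal| elements of |psi><psi| in the reference basis;
        zero iff psi is a basis state (i.e., carries no superposition)."""
        psi = psi / np.linalg.norm(psi)
        rho = np.outer(psi, psi.conj())
        return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

    print(l1_superposition(np.array([1.0, 0.0])))   # 0.0: basis state
    print(l1_superposition(np.array([1.0, 1.0])))   # 1.0: maximal for a qubit
    ```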

  19. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS.

    PubMed

    van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A

    2017-10-17

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and the structural features of the molecules involved. Current quantification of lignin is, however, largely based on nonspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. To this end, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples and directly and simultaneously provides insight into the structural features of lignin.
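
    The core of internal-standard quantification is a ratio calculation: each monitored pyrolysis product yields an estimate of sample lignin from its 12C/13C peak-area ratio, its RRF, and the known amount of spiked 13C lignin. A simplified sketch with invented areas and RRFs (the real method aggregates many more products and sums areas before ratioing):

    ```python
    is_added_mg = 0.100  # mg of 13C lignin IS spiked into the sample (assumed)

    # (12C area, 13C area, relative response factor) per monitored pyrolysis
    # product; all numbers are hypothetical.
    products = {
        "guaiacol":      (1.8e6, 2.1e6, 1.05),
        "4-vinylphenol": (3.2e6, 3.0e6, 0.97),
        "syringol":      (0.9e6, 1.1e6, 1.10),
    }

    # Each product gives an estimate of lignin mass; average across products.
    estimates = [a12 / a13 / rrf * is_added_mg for a12, a13, rrf in products.values()]
    lignin_mg = sum(estimates) / len(estimates)
    print(f"lignin = {lignin_mg:.3f} mg")
    ```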

  20. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS

    PubMed Central

    2017-01-01

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and the structural features of the molecules involved. Current quantification of lignin is, however, largely based on nonspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. To this end, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples and directly and simultaneously provides insight into the structural features of lignin. PMID:28926698

  1. 75 FR 13109 - Office of Special Education and Rehabilitative Services; List of Correspondence

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-18

    ... DEPARTMENT OF EDUCATION Office of Special Education and Rehabilitative Services; List of Correspondence AGENCY: Department of Education. ACTION: List of Correspondence from July 1, 2009 through... the Individuals with Disabilities Education Act (IDEA). Under section 607(f) of the IDEA, the...

  2. Ideas for Intercultural Education

    ERIC Educational Resources Information Center

    Marginson, Simon; Sawir, Erlenawati

    2011-01-01

    Written by a cross-cultural pair of authors, "Ideas for Intercultural Education" takes a critical look at present approaches to international education, focusing on the intercultural potential that it offers but mostly fails to deliver. The underlying premise of this profound, engaging book is that international education can be a transforming…

  3. Which Disability Classifications Are Not Particularly Litigious under the IDEA?

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2011-01-01

    A previous exploratory analysis revealed that students with autism were notably overrepresented in published court decisions concerning the IDEA's core concepts of "free appropriate public education" (FAPE) and "least restrictive environment" (LRE). More specifically, for the period 1993 to 2006, the proportion of this…

  4. Quantification of Na+,K+ pumps and their transport rate in skeletal muscle: Functional significance

    PubMed Central

    2013-01-01

    During excitation, muscle cells gain Na+ and lose K+, leading to a rise in extracellular K+ ([K+]o), depolarization, and loss of excitability. Recent studies support the idea that these events are important causes of muscle fatigue and that full use of the Na+,K+-ATPase (also known as the Na+,K+ pump) is often essential for adequate clearance of extracellular K+. As a result of their electrogenic action, Na+,K+ pumps also help reverse depolarization arising during excitation, hyperkalemia, and anoxia, or from cell damage resulting from exercise, rhabdomyolysis, or muscle diseases. The ability to evaluate Na+,K+-pump function and the capacity of the Na+,K+ pumps to fill these needs require quantification of the total content of Na+,K+ pumps in skeletal muscle. Inhibition of Na+,K+-pump activity, or a decrease in their content, reduces muscle contractility. Conversely, stimulation of the Na+,K+-pump transport rate or increasing the content of Na+,K+ pumps enhances muscle excitability and contractility. Measurements of [3H]ouabain binding to skeletal muscle in vivo or in vitro have enabled the reproducible quantification of the total content of Na+,K+ pumps in molar units in various animal species, and in both healthy people and individuals with various diseases. In contrast, measurements of 3-O-methylfluorescein phosphatase activity associated with the Na+,K+-ATPase may show inconsistent results. Measurements of Na+ and K+ fluxes in intact isolated muscles show that, after Na+ loading or intense excitation, all the Na+,K+ pumps are functional, allowing calculation of the maximum Na+,K+-pumping capacity, expressed in molar units/g muscle/min. The activity and content of Na+,K+ pumps are regulated by exercise, inactivity, K+ deficiency, fasting, age, and several hormones and pharmaceuticals. Studies on the α-subunit isoforms of the Na+,K+-ATPase have detected a relative increase in their number in response to exercise and the glucocorticoid dexamethasone but have not involved their quantification in molar units. Determination of ATPase activity in homogenates and plasma membranes obtained from muscle has shown ouabain-suppressible stimulatory effects of Na+ and K+. PMID:24081980

  5. Educators' evaluations of children's ideas on the social exclusion of classmates with intellectual and learning disabilities.

    PubMed

    Nowicki, Elizabeth A; Brown, Jason D; Dare, Lynn

    2018-01-01

    Reasons underlying the social exclusion of children with intellectual or learning disabilities are not entirely understood. Although it is important to heed the voices of children on this issue, it is also important to consider the degree to which their ideas are informed. The present authors invited educators to evaluate the content of children's ideas on the causes of social exclusion. Educators thematically sorted and rated children's ideas on why classmates with intellectual or learning disabilities are socially excluded. Sorted data were analysed with multidimensional scaling and hierarchical cluster analysis. Six thematic clusters were identified, differing in content from those provided by children in an earlier study. Educators generally rated children's ideas as somewhat uninformed about why social exclusion occurs and indicated that children need to be better informed about intellectual and learning disabilities. Limitations and implications are discussed. © 2017 John Wiley & Sons Ltd.

  6. Quantification of allantoin in various Zea mays L. hybrids by RP-HPLC with UV detection.

    PubMed

    Maksimović, Z; Malenović, A; Jancić, B; Kovacević, N

    2004-07-01

    An RP-HPLC method for the quantification of allantoin in the silk of fifteen maize hybrids (Zea mays L., Poaceae) is described. Following extraction of the plant material with an acetone-water (7:3, V/V) mixture, filtration, and dilution, the extracts were analyzed without prior chemical derivatization. Separation and quantification were achieved using an Alltech Econosil C18 column under isocratic conditions at 40 degrees C. The mobile phase (20% methanol-80% water with 5 mM sodium laurylsulfate, adjusted to pH 2.5 with 85% orthophosphoric acid; the pH of the aqueous phase was finally adjusted to 6.0 by addition of triethylamine) was delivered at 1.0 mL/min. Column effluent was monitored at 235 nm. This simple procedure afforded efficient separation and quantification of allantoin in plant material, without interference from polyphenols or other plant constituents of medium to high polarity or similar UV absorption. Our study revealed that the silk of all investigated maize hybrids can be considered relatively rich in allantoin, covering a concentration range between 215 and 289 mg per 100 g of dry plant material.

  7. Interferences in the direct quantification of bisphenol S in paper by means of thermochemolysis.

    PubMed

    Becerra, Valentina; Odermatt, Jürgen

    2013-02-01

    This article analyses the interferences in the quantification of traces of bisphenol S in paper by applying the direct analytical method "analytical pyrolysis gas chromatography mass spectrometry" (Py-GC/MS) in conjunction with on-line derivatisation with tetramethylammonium hydroxide (TMAH). As the analytes are analysed simultaneously with the matrix, the interferences derive from the matrix. The investigated interferences are found in the analysis of paper samples that include bisphenol S derivative compounds. As free bisphenol S is the hydrolysis product of the bisphenol S derivative compounds, the detected amount of bisphenol S in the sample may be overestimated. It is found that the formation of free bisphenol S from the bisphenol S derivative compounds is enhanced in the presence of tetramethylammonium hydroxide (TMAH) under pyrolytic conditions. To avoid this formation of bisphenol S, trimethylsulphonium hydroxide (TMSH) is introduced instead. Different parameters were optimised in the development of the quantification method with TMSH. The quantification method based on TMSH thermochemolysis has been validated in terms of reproducibility and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, owing to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least-squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes in Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.
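
    A minimal sketch of this modeling pipeline, with every dataset and hyperparameter assumed: an LS-SVM regressor (solved as a linear system with an RBF kernel) maps the three damage-sensitive features to crack size, and a tiny elitist GA tunes the kernel width and regularization by cross-validation. This illustrates the technique class, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def rbf(A, B, gamma):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def lssvm_fit(X, y, gamma, C):
        """Solve the LS-SVM regression linear system for (alpha, b)."""
        n = len(y)
        K = rbf(X, X, gamma) + np.eye(n) / C
        H = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                      [np.ones((n, 1)), K]])
        sol = np.linalg.solve(H, np.concatenate([[0.0], y]))
        return sol[1:], sol[0]

    def lssvm_predict(X, Xtr, alpha, b, gamma):
        return rbf(X, Xtr, gamma) @ alpha + b

    # Hypothetical training data: 3 damage-sensitive features -> crack size (mm).
    X = rng.random((40, 3))
    y = 2.0 * X[:, 0] + 1.5 * X[:, 1] - 0.5 * X[:, 2] + 0.05 * rng.standard_normal(40)

    def cv_error(log_params):
        gamma, C = np.exp(log_params)
        err = 0.0
        for k in range(5):                       # 5-fold cross-validation
            test = np.arange(len(X)) % 5 == k
            a, b = lssvm_fit(X[~test], y[~test], gamma, C)
            err += ((lssvm_predict(X[test], X[~test], a, b, gamma) - y[test]) ** 2).mean()
        return err / 5

    # Tiny GA over (log gamma, log C): elitist selection + Gaussian mutation.
    pop = rng.uniform(-3, 3, size=(20, 2))
    for gen in range(30):
        fitness = np.array([cv_error(p) for p in pop])
        elite = pop[np.argsort(fitness)[:5]]                  # keep the 5 best
        children = elite[rng.integers(0, 5, 15)] + 0.3 * rng.standard_normal((15, 2))
        pop = np.vstack([elite, children])
    best = pop[np.argmin([cv_error(p) for p in pop])]
    print("best (gamma, C):", np.exp(best).round(3))
    ```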

  9. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, owing to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least-squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes in Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  10. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS), a coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
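
    The three correlation measures are easy to compute from an ensemble of sampled inputs and the resulting figure of merit; partial correlation correlates the residuals left after regressing the other inputs out of both variables. A sketch on synthetic data (the inputs, coefficients, and response model are all assumed, not VERA-CS output):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 200
    # Hypothetical sampled inputs (normalized): inlet temperature, power, flow.
    X = rng.standard_normal((n, 3))
    names = ["inlet_T", "power", "flow"]
    # Hypothetical figure of merit (e.g., MDNBR), driven mainly by inlet_T.
    y = -0.8 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2] + 0.1 * rng.standard_normal(n)

    for j, name in enumerate(names):
        pearson = stats.pearsonr(X[:, j], y)[0]
        spearman = stats.spearmanr(X[:, j], y)[0]
        # Partial correlation: regress the other inputs (plus intercept) out of
        # both X_j and y, then correlate the residuals.
        others = np.column_stack([np.delete(X, j, axis=1), np.ones(n)])
        rx = X[:, j] - others @ np.linalg.lstsq(others, X[:, j], rcond=None)[0]
        ry = y - others @ np.linalg.lstsq(others, y, rcond=None)[0]
        partial = stats.pearsonr(rx, ry)[0]
        print(f"{name}: pearson={pearson:+.2f} spearman={spearman:+.2f} partial={partial:+.2f}")
    ```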

  11. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents experimental and modeling work on damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate (PZT) piezoelectric ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclic loading. A multi-feature integration method is developed to quantify the crack size using the signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.

  12. Development and community-based validation of the IDEA study Instrumental Activities of Daily Living (IDEA-IADL) questionnaire.

    PubMed

    Collingwood, Cecilia; Paddick, Stella-Maria; Kisoli, Aloyce; Dotchin, Catherine L; Gray, William K; Mbowe, Godfrey; Mkenda, Sarah; Urasa, Sarah; Mushi, Declare; Chaote, Paul; Walker, Richard W

    2014-01-01

    The dementia diagnosis gap in sub-Saharan Africa (SSA) is large, partly because of difficulties in assessing function, an essential step in diagnosis. The aim, as part of the Identification and Intervention for Dementia in Elderly Africans (IDEA) study, was to develop, pilot, and validate an Instrumental Activities of Daily Living (IADL) questionnaire for use in a rural Tanzanian population to assist in the identification of people with dementia alongside cognitive screening. The questionnaire was developed at a workshop for rural primary healthcare workers, based on culturally appropriate roles and usual activities of elderly people in this community. It was piloted in 52 individuals under follow-up from a dementia prevalence study. Validation subsequently took place during a community dementia-screening programme. Construct validation against gold-standard clinical dementia diagnosis using DSM-IV criteria was carried out on a stratified sample of the cohort, and validity was assessed using area under the receiver operating characteristic (AUROC) curve analysis. An 11-item questionnaire (IDEA-IADL) was developed after pilot testing. During formal validation in 130 community-dwelling elderly people who presented for screening, the AUROC was 0.896 for DSM-IV dementia when the questionnaire was used in isolation and 0.937 when used in conjunction with the IDEA cognitive screen, previously validated in Tanzania. The internal consistency was 0.959. Performance on the IDEA-IADL was not biased with regard to age, gender, or education level. The IDEA-IADL questionnaire appears to be a useful aid to dementia screening in this setting. Further validation in other healthcare settings in SSA is required.
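
    AUROC itself is straightforward to compute from screening scores and gold-standard labels via the rank-sum (Mann-Whitney) identity; the sketch below uses made-up scores, not IDEA study data, and ignores ties for simplicity:

    ```python
    import numpy as np

    def auroc(scores, labels):
        """Area under the ROC curve via the rank-sum identity (no tie handling)."""
        order = np.argsort(scores)
        ranks = np.empty(len(scores))
        ranks[order] = np.arange(1, len(scores) + 1)
        pos = labels == 1
        n1, n0 = pos.sum(), (~pos).sum()
        return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

    # Hypothetical IADL scores (higher = more impairment) vs dementia labels.
    scores = np.array([2, 5, 3, 9, 8, 1, 7, 6, 4, 10], dtype=float)
    labels = np.array([0, 0, 0, 1, 1, 0, 1, 0, 0, 1])
    print(f"AUROC = {auroc(scores, labels):.2f}")
    ```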

  13. Essays and Explorations: Studies in Ideas, Language, and Literature.

    ERIC Educational Resources Information Center

    Bloomfield, Morton W.

    Seventeen reprinted essays and an unpublished one are contained in this collection and organized under five headings: History of Ideas, Approaches to Medieval Literature, Chaucer and Fourteenth-Century English Literature, Language and Linguistics, and Essay-Reviews. Topics discussed include the origin of the concept of the Seven Cardinal Sins;…

  14. Special Education Law: Illustrative Basics and Nuances of Key IDEA Components

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2015-01-01

    Intended as professional development for both new and experienced special educators, this article provides both the basic requirements and nuanced issues for foundational, successive, and overlapping key components under the Individuals With Disabilities Education Act (IDEA): (a) child find, (b) eligibility, and (c) free appropriate public…

  15. Assessing Associative Distance among Ideas Elicited by Tests of Divergent Thinking

    ERIC Educational Resources Information Center

    Acar, Selcuk; Runco, Mark A.

    2014-01-01

    Tests of divergent thinking represent the most commonly used assessment of creative potential. Typically they are scored for total ideational output (fluency), ideational originality, and, sometimes, ideational flexibility. That scoring system provides little information about the underlying process and about the associations among ideas. It also…

  16. RTI and Other Approaches to SLD Identification under the IDEA: A Legal Update

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2017-01-01

    This article provides a concise and objective synthesis of the federal legislation, regulations, and agency policy interpretations; state laws; and case law, including hearing officer and complaint investigation decisions, concerning specific learning disability (SLD) identification since the 2006 IDEA regulations. The results reveal wide latitude…

  17. Age-Related Changes in Creative Thinking

    ERIC Educational Resources Information Center

    Roskos-Ewoldsen, Beverly; Black, Sheila R.; Mccown, Steven M.

    2008-01-01

    Age-related differences in cognitive processes were used to understand age-related declines in creativity. According to the Geneplore model (Finke, Ward, & Smith, 1992), there are two phases of creativity--generating an idea and exploring the implications of the idea--each with different underlying cognitive processes. These two phases are…

  18. Unfolding Montessori's Ideas in Today's Society.

    ERIC Educational Resources Information Center

    Loeffler, Margaret H.

    1998-01-01

    Asserts that as Montessorians enter the 21st century, they could benefit from developing an openness toward other educators' ideas and from undertaking a reexamination of their own understandings and practices as well as Montessori's underlying principles and methods, such as the role of materials, the terminology, and aspects of Montessori that…

  19. The economics of ideas and intellectual property.

    PubMed

    Boldrin, Michele; Levine, David K

    2005-01-25

    Innovation and the adoption of new ideas is fundamental to economic progress. Here we examine the underlying economics of the market for ideas. From a positive perspective, we examine how such markets function with and without government intervention. From a normative perspective, we examine the pitfalls of existing institutions, and how they might be improved. We highlight recent research by us and others challenging the notion that government awards of monopoly through patents and copyright are "the way" to provide appropriate incentives for innovation.

  20. The economics of ideas and intellectual property

    PubMed Central

    Boldrin, Michele; Levine, David K.

    2005-01-01

    Innovation and the adoption of new ideas is fundamental to economic progress. Here we examine the underlying economics of the market for ideas. From a positive perspective, we examine how such markets function with and without government intervention. From a normative perspective, we examine the pitfalls of existing institutions, and how they might be improved. We highlight recent research by us and others challenging the notion that government awards of monopoly through patents and copyright are “the way” to provide appropriate incentives for innovation. PMID:15657138

  1. Simultaneous quantification of eight organic acid components in Artemisia capillaris Thunb (Yinchen) extract using high-performance liquid chromatography coupled with diode array detection and high-resolution mass spectrometry.

    PubMed

    Yu, Fangjun; Qian, Hao; Zhang, Jiayu; Sun, Jie; Ma, Zhiguo

    2018-04-01

    We aim to determine the chemical constituents of Yinchen extract and Yinchen herbs using high-performance liquid chromatography coupled with diode array detection and high-resolution mass spectrometry. The method was developed to analyze eight organic acid components of Yinchen extract (neochlorogenic acid, chlorogenic acid, cryptochlorogenic acid, caffeic acid, 1,3-dicaffeoylquinic acid, 3,4-dicaffeoylquinic acid, 3,5-dicaffeoylquinic acid, and 4,5-dicaffeoylquinic acid). The separation was conducted on an Agilent TC-C18 column with acetonitrile and 0.2% formic acid solution as the mobile phase under gradient elution. The analytical method was fully validated in terms of linearity, sensitivity, precision, repeatability, and recovery, and was subsequently applied to the quantitative assessment of Yinchen extracts and Yinchen herbs. In addition, the changes in the selected markers when Yinchen herbs were decocted in water were studied, and isomerization among the chlorogenic acids was observed. The proposed method enables both qualitative and quantitative analyses and could be developed as a new tool for the quality evaluation of Yinchen extract and Yinchen herbs. The changes in the selected markers during water decoction could provide novel ideas for studying the link between substances and drug efficacy. Copyright © 2017. Published by Elsevier B.V.

  2. Noninvasive quantification of cerebral metabolic rate for glucose in rats using 18F-FDG PET and standard input function

    PubMed Central

    Hori, Yuki; Ihara, Naoki; Teramoto, Noboru; Kunimi, Masako; Honda, Manabu; Kato, Koichi; Hanakawa, Takashi

    2015-01-01

    Measurement of the arterial input function (AIF) for quantitative positron emission tomography (PET) studies is technically challenging. The present study aimed to develop a method based on a standard arterial input function (SIF) to estimate the input function without blood sampling. We performed 18F-fluorodeoxyglucose studies accompanied by continuous blood sampling for measurement of the AIF in 11 rats. The standard arterial input function was calculated by averaging AIFs from eight anesthetized rats, after normalization with body mass (BM) and injected dose (ID). Then, the individual input function was estimated using two types of SIF: (1) SIF calibrated by the individual's BM and ID (estimated individual input function, EIFNS) and (2) SIF calibrated by a single blood sample as proposed previously (EIF1S). No significant differences in the area under the curve (AUC) or cerebral metabolic rate for glucose (CMRGlc) were found across the AIF-, EIFNS-, and EIF1S-based methods using repeated-measures analysis of variance. In the correlation analysis, AUC or CMRGlc derived from EIFNS was highly correlated with those derived from AIF and EIF1S. A preliminary comparison between AIF and EIFNS in three awake rats supported the idea that the method might be applicable to behaving animals. The present study suggests that the EIFNS method might serve as a noninvasive substitute for individual AIF measurement. PMID:25966947
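
    The SIF idea reduces to averaging dose- and mass-normalized curves and rescaling them for a new subject. A toy sketch in which the AIF shape, body masses, and doses are all invented for illustration:

    ```python
    import numpy as np

    t = np.linspace(0, 60, 61)                   # minutes, common time grid
    rng = np.random.default_rng(0)

    def toy_aif(scale):
        """Invented gamma-variate-like input function shape (kBq/mL)."""
        return scale * t ** 1.5 * np.exp(-t / 4.0)

    # Hypothetical reference animals used to build the SIF.
    bm = rng.uniform(250, 350, 8)                # body mass (g)
    dose = rng.uniform(20, 40, 8)                # injected dose (MBq)
    aifs = np.array([toy_aif(d / m) for d, m in zip(dose, bm)])

    # SIF: average of AIFs after normalization by injected dose / body mass.
    sif = (aifs * (bm / dose)[:, None]).mean(axis=0)

    # EIF_NS for a new animal: rescale the SIF by its own ID and BM.
    id_new, bm_new = 30.0, 300.0                 # assumed values
    eif = sif * (id_new / bm_new)
    ```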

  3. Noninvasive quantification of cerebral metabolic rate for glucose in rats using (18)F-FDG PET and standard input function.

    PubMed

    Hori, Yuki; Ihara, Naoki; Teramoto, Noboru; Kunimi, Masako; Honda, Manabu; Kato, Koichi; Hanakawa, Takashi

    2015-10-01

    Measurement of the arterial input function (AIF) for quantitative positron emission tomography (PET) studies is technically challenging. The present study aimed to develop a method based on a standard arterial input function (SIF) to estimate the input function without blood sampling. We performed (18)F-fluorodeoxyglucose studies accompanied by continuous blood sampling for measurement of the AIF in 11 rats. The standard arterial input function was calculated by averaging AIFs from eight anesthetized rats, after normalization with body mass (BM) and injected dose (ID). Then, the individual input function was estimated using two types of SIF: (1) SIF calibrated by the individual's BM and ID (estimated individual input function, EIF(NS)) and (2) SIF calibrated by a single blood sample as proposed previously (EIF(1S)). No significant differences in the area under the curve (AUC) or cerebral metabolic rate for glucose (CMRGlc) were found across the AIF-, EIF(NS)-, and EIF(1S)-based methods using repeated-measures analysis of variance. In the correlation analysis, AUC or CMRGlc derived from EIF(NS) was highly correlated with those derived from AIF and EIF(1S). A preliminary comparison between AIF and EIF(NS) in three awake rats supported the idea that the method might be applicable to behaving animals. The present study suggests that the EIF(NS) method might serve as a noninvasive substitute for individual AIF measurement.

  4. NASA/ASEE Summer Faculty Fellowship Program

    NASA Technical Reports Server (NTRS)

    Hosler, E. Ramon (Editor); Valdes, Carol (Editor); Brown, Tom (Editor)

    1993-01-01

    This document is a collection of technical reports on research conducted by the participants in the 1993 NASA/ASEE Summer Faculty Fellowship Program at KSC. The basic common objectives of the Program are: to further the professional knowledge of qualified engineering and science faculty members; to stimulate an exchange of ideas between participants and NASA; to enrich and refresh the research and teaching activities of participants' institutions; and to contribute to the research objectives of the NASA centers. 1993 topics include wide band fiber optic communications, a prototype expert/information system for examining environmental risks of KSC activities, alternatives to premise wiring using ATM and microcellular technologies, rack insertion end effector (RIEE) automation, FTIR quantification of industrial hydraulic fluids in perchloroethylene, switch configuration for migration to optical fiber network, and more.

  5. Dynamic Risk Quantification and Management: Core needs and strategies for adapting water resources systems to a changing environment (Invited)

    NASA Astrophysics Data System (ADS)

    Lall, U.

    2009-12-01

    The concern with anthropogenic climate change has spurred significant interest in strategies for climate change adaptation in water resources planning and management. The thesis of this talk is that such adaptation is a subset of the broader challenge of sustainably designing and operating structural and non-structural systems for managing resources in a changing environment. Even with respect to a changing climate, the largest opportunity for immediate adaptation may be provided by an improved understanding and prediction capability for seasonal-to-interannual and decadal climate variability. I shall lay out some ideas as to how this can be done and provide an example for reservoir water allocation and management, and one for flood risk management.

  6. Quantum non-Markovianity: characterization, quantification and detection

    NASA Astrophysics Data System (ADS)

    Rivas, Ángel; Huelga, Susana F.; Plenio, Martin B.

    2014-09-01

    We present a comprehensive and up-to-date review of the concept of quantum non-Markovianity, a central theme in the theory of open quantum systems. We introduce the concept of a quantum Markovian process as a generalization of the classical definition of Markovianity via the so-called divisibility property and relate this notion to the intuitive idea that links non-Markovianity with the persistence of memory effects. A detailed comparison with other definitions presented in the literature is provided. We then discuss several existing proposals to quantify the degree of non-Markovianity of quantum dynamics and to witness non-Markovian behavior, the latter providing sufficient conditions to detect deviations from strict Markovianity. Finally, we conclude by enumerating some timely open problems in the field and provide an outlook on possible research directions.
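
    One of the quantification proposals reviewed in this literature measures memory effects as the total increase of the trace distance between two evolving states (information backflow), which is zero for any Markovian evolution. A toy sketch for a qubit dephasing channel with an assumed, deliberately non-monotonic decoherence function:

    ```python
    import numpy as np

    def trace_distance(r1, r2):
        """Half the sum of |eigenvalues| of the (Hermitian) difference."""
        return 0.5 * np.abs(np.linalg.eigvalsh(r1 - r2)).sum()

    def evolve(rho, t):
        """Toy dephasing channel: off-diagonals scaled by c(t); the damped
        oscillation makes |c(t)| non-monotonic, i.e., non-Markovian."""
        c = np.exp(-0.2 * t) * np.cos(2.0 * t)
        out = rho.copy()
        out[0, 1] *= c
        out[1, 0] *= c
        return out

    plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)     # |+><+|
    minus = np.array([[0.5, -0.5], [-0.5, 0.5]], dtype=complex)  # |-><-|

    ts = np.linspace(0, 10, 1000)
    D = np.array([trace_distance(evolve(plus, t), evolve(minus, t)) for t in ts])
    measure = np.sum(np.clip(np.diff(D), 0, None))  # sum of trace-distance increases
    print(f"memory-effect measure ≈ {measure:.3f} (0 for Markovian dynamics)")
    ```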

  7. Quantum non-Markovianity: characterization, quantification and detection.

    PubMed

    Rivas, Ángel; Huelga, Susana F; Plenio, Martin B

    2014-09-01

    We present a comprehensive and up-to-date review of the concept of quantum non-Markovianity, a central theme in the theory of open quantum systems. We introduce the concept of a quantum Markovian process as a generalization of the classical definition of Markovianity via the so-called divisibility property and relate this notion to the intuitive idea that links non-Markovianity with the persistence of memory effects. A detailed comparison with other definitions presented in the literature is provided. We then discuss several existing proposals to quantify the degree of non-Markovianity of quantum dynamics and to witness non-Markovian behavior, the latter providing sufficient conditions to detect deviations from strict Markovianity. Finally, we conclude by enumerating some timely open problems in the field and provide an outlook on possible research directions.

  8. Large differences in land use emission quantifications implied by definition discrepancies

    NASA Astrophysics Data System (ADS)

    Stocker, B. D.; Joos, F.

    2015-03-01

    The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process-modelling) but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review the conceptual differences of eLUC quantification methods and apply an Earth System Model to demonstrate that what is claimed to represent total eLUC differs by up to ~20% when quantified from ESM vs. offline vegetation models. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate component fluxes of eLUC arising from the replacement of potential C sinks/sources and the land use feedback and show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively not identical to either, nor their sum. Therefore, we argue that synthesis studies and global carbon budget accountings should resort to the "least common denominator" of different methods, following the bookkeeping approach where only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.
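
    For contrast with the process-model results, the bookkeeping approach referred to above can be caricatured in a few lines: primary land use emissions follow from conversion rates and fixed carbon densities, with environmental conditions held constant. All numbers below are invented, and the instantaneous release is a deliberate simplification of the multi-decade response functions real bookkeeping models use:

    ```python
    import numpy as np

    years = np.arange(2000, 2011)
    converted = np.full(len(years), 5.0)      # assumed forest->cropland, Mha/yr
    c_forest, c_crop = 150.0, 5.0             # assumed carbon densities, tC/ha

    # Primary eLUC: committed carbon loss from each year's conversion,
    # released instantaneously here for simplicity.
    eluc_PgC = converted * 1e6 * (c_forest - c_crop) * 1e-9   # Mha*tC/ha -> PgC
    print(eluc_PgC)   # ~0.725 PgC/yr in every year of this toy scenario
    ```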

  9. Quantifying differences in land use emission estimates implied by definition discrepancies

    NASA Astrophysics Data System (ADS)

    Stocker, B. D.; Joos, F.

    2015-11-01

    The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process-modelling) but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review differences of eLUC quantification methods and apply an Earth System Model (ESM) of Intermediate Complexity to quantify them. We find that the magnitude of effects due to merely conceptual differences between ESM and offline vegetation model-based quantifications is ~ 20 % for today. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate secondary component fluxes of eLUC arising from the replacement of potential C sinks/sources and the land use feedback and show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively not identical to either, nor their sum. Therefore, we argue that synthesis studies should resort to the "least common denominator" of different methods, following the bookkeeping approach where only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.

  10. Optically transmitted and inductively coupled electric reference to access in vivo concentrations for quantitative proton-decoupled ¹³C magnetic resonance spectroscopy.

    PubMed

    Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke

    2012-01-01

    This report describes our efforts toward quantification of tissue metabolite concentrations in mM by nuclear Overhauser-enhanced and proton-decoupled 13C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal, transmitted through an optical fiber and inductively coupled into a transmit/receive coil, represents a reliable reference standard for in vivo 1H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser-enhanced and proton-decoupled in vivo 13C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards on human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months, including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing a difference of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acid between ERETIC and creatine-based quantification, whereas the deviations between external reference and creatine-based quantification were 6.95 ± 9.52% and 3.19 ± 2.60%, respectively. Copyright © 2011 Wiley Periodicals, Inc.
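
    Reference-signal quantification of this kind is, at its core, a ratio against a signal of known concentration equivalent. A schematic sketch with invented peak areas and calibration value (real use involves corrections, e.g., for coil loading, that are omitted here):

    ```python
    # Hypothetical integrated signal areas from the 13C spectrum.
    area_metabolite = 8.4e5     # e.g., a glycogen resonance (assumed)
    area_eretic = 2.1e5         # synthetic ERETIC pseudo-signal (assumed)
    eretic_equiv_mM = 25.0      # concentration equivalent assigned to the ERETIC
                                # signal during a one-time phantom calibration

    conc_mM = area_metabolite / area_eretic * eretic_equiv_mM
    print(f"metabolite ≈ {conc_mM:.0f} mM")   # schematic; corrections omitted
    ```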

  11. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st-century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations by quantifying and bounding their uncertainties. This powerful capability will yield new insights into scientific predictions (e.g., climate) of great impact in both national and international arenas, allow informed decisions on the design of critical experiments (e.g., ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g., nuclear weapons design). In this talk I will discuss a major new Strategic Initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification, focusing in particular on (a) the research and development of new UQ algorithms and methodologies as applied to multi-physics, multi-scale codes; (b) the incorporation of these advancements into a global UQ Pipeline (i.e., a computational superstructure) that will simplify user access to sophisticated tools for UQ studies and act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms; and (c) the use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advancements on several of these UQ grand challenges, I will focus in this talk on the following three research areas of our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "curse of high dimensionality"; and development of an advanced UQ computational pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g., exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).

  12. Improving health care, Part 4: Concepts for improving any clinical process.

    PubMed

    Batalden, P B; Mohr, J J; Nelson, E C; Plume, S K

    1996-10-01

    One promising method for streamlining the generation of "good ideas" is to formulate what are sometimes called change concepts: general notions or approaches to change that have been found useful in developing specific ideas for changes that lead to improvement. For example, in current efforts to reduce health care costs by discounting provider charges, the underlying generic concept is "reducing health care costs," and the specific idea is "discounting provider charges." Short-term gains in health care cost reduction can occur by pursuing discounts. After some time, however, limits to such cost reduction are reached. Persevering down the "discounting provider charges" path is then less likely to produce further substantial improvement than returning to the basic concept of "reducing health care costs." An interdisciplinary team aiming to reduce costs while improving quality of care for patients in need of hip joint replacement generated ideas for changing "what's done (process) to get better results." After team members wrote down their improvement ideas, they deduced the underlying change concepts and used them to generate even more ideas for improvement. Such change concepts include reordering the sequence of steps (preadmission physical therapy "certification"), eliminating failures at hand-offs between steps (transfer of information from physician's office to hospital), and eliminating a step (epidural pain control). Learning about making change, encouraging change, managing change within and across organizations, and learning from the changes tested will characterize the sustainable, thriving health systems of the future.

  13. Validated reverse transcription droplet digital PCR serves as a higher order method for absolute quantification of Potato virus Y strains.

    PubMed

    Mehle, Nataša; Dobnik, David; Ravnikar, Maja; Pompe Novak, Maruša

    2018-05-03

    RNA viruses have a great potential for high genetic variability and rapid evolution, generated by mutation and recombination under selection pressure. This is also the case for Potato virus Y (PVY), which comprises a high diversity of different recombinant and non-recombinant strains. Consequently, it is hard to develop a reverse transcription real-time quantitative PCR (RT-qPCR) assay with the same amplification efficiency for all PVY strains, which would enable their balanced quantification; this is especially needed in mixed infections and other studies of pathogenesis. To achieve this, we initially transferred the PVY universal RT-qPCR assay to a reverse transcription droplet digital PCR (RT-ddPCR) format. RT-ddPCR is an absolute quantification method in which a calibration curve is not needed, and it is less prone to inhibitors. The RT-ddPCR developed and validated in this study achieved a dynamic range of quantification over five orders of magnitude, and in terms of sensitivity it was comparable to, or even better than, RT-qPCR. RT-ddPCR also showed lower measurement variability. We have shown that RT-ddPCR can be used as a reference tool for the evaluation of different RT-qPCR assays. In addition, it can be used for quantification of RNA based on in-house reference materials that can then be used as calibrators in diagnostic laboratories.
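
    Digital PCR owes its calibration-free absolute quantification to Poisson statistics on droplet occupancy. A minimal sketch with hypothetical droplet counts and a nominal droplet volume:

    ```python
    import numpy as np

    # Hypothetical droplet counts from one RT-ddPCR well.
    n_total = 15000           # accepted droplets
    n_positive = 4200         # droplets showing amplification
    v_droplet_uL = 0.85e-3    # nominal droplet volume (~0.85 nL, assumed)

    # Poisson correction: mean copies per droplet lambda = -ln(fraction negative).
    lam = -np.log(1 - n_positive / n_total)
    copies_per_uL = lam / v_droplet_uL
    print(f"{copies_per_uL:.0f} copies/uL")   # absolute, no standard curve needed
    ```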

  14. Accelerated T1ρ acquisition for knee cartilage quantification using compressed sensing and data-driven parallel imaging: A feasibility study.

    PubMed

    Pandit, Prachi; Rivoire, Julien; King, Kevin; Li, Xiaojuan

    2016-03-01

    Quantitative T1ρ imaging is beneficial for early detection of osteoarthritis but has seen limited clinical use due to long scan times. In this study, we evaluated the feasibility of accelerated T1ρ mapping for knee cartilage quantification using a combination of compressed sensing (CS) and data-driven parallel imaging (ARC: Autocalibrating Reconstruction for Cartesian sampling). A sequential combination of ARC and CS, both during data acquisition and reconstruction, was used to accelerate the acquisition of T1ρ maps. Phantom, ex vivo (porcine knee), and in vivo (human knee) imaging was performed on a GE 3T MR750 scanner. T1ρ quantification after CS-accelerated acquisition was compared with non-CS-accelerated acquisition for various cartilage compartments. Accelerating image acquisition using CS did not introduce major deviations in quantification. The coefficient of variation for the root mean squared error increased with increasing acceleration, but for in vivo measurements it stayed under 5% for a net acceleration factor up to 2, where the acquisition was 25% faster than the reference (ARC only). To the best of our knowledge, this is the first implementation of CS for in vivo T1ρ quantification. These early results show that this technique holds great promise in making quantitative imaging techniques more accessible for clinical applications. © 2015 Wiley Periodicals, Inc.
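
    The CS step in pipelines like this one solves a sparsity-regularized least-squares problem. A toy sketch of the underlying L1 recovery via iterative soft-thresholding (ISTA), offered only as an illustration of the principle and not as the paper's actual ARC+CS reconstruction:

    ```python
    import numpy as np

    def ista(A, y, lam=0.01, step=None, iters=500):
        """Iterative soft-thresholding for min ||Ax - y||^2 + lam*||x||_1.
        A stands in for the undersampled measurement operator, x for the
        sparse coefficient vector being recovered."""
        if step is None:
            step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/Lipschitz constant
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            z = x - step * (A.T @ (A @ x - y))       # gradient step
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # shrink
        return x

    rng = np.random.default_rng(0)
    x_true = np.zeros(200)
    x_true[rng.choice(200, 10, replace=False)] = rng.normal(size=10)
    A = rng.normal(size=(80, 200)) / np.sqrt(80)     # 2.5x undersampling
    x_hat = ista(A, A @ x_true)
    print("relative recovery error:",
          np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
    ```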

  15. Scientifically Based Research and Peer-Reviewed Research under the IDEA: The Legal Definitions, Applications, and Implications

    ERIC Educational Resources Information Center

    Zirkel, Perry A.; Rose, Tessie

    2009-01-01

    A systematic analysis of the references to "scientifically based research" (SBR) and closely related terms, such as "peer-reviewed research" (PRR), in the Individuals with Disabilities Education Act (IDEA) legislation, regulations, commentary, and case law reveals that SBR and its primary variants apply largely to state support…

  16. Know the Law: Reimbursement under the IDEA

    ERIC Educational Resources Information Center

    Osborne, Allan G., Jr.; Rehberg, Megan L.

    2009-01-01

    When school boards fail to provide the free appropriate public education (FAPE) guaranteed in the Individuals with Disabilities Education Act (IDEA), students with disabilities and their parents can be compensated in various ways. One of the more common remedies is to reimburse parents for tuition and other costs they may have incurred in…

  17. Criteria for Private Placement Reimbursement: Jefferson County School District R-1 v. Elizabeth E.

    ERIC Educational Resources Information Center

    Katsiyannis, Antonis; Losinski, Mickey; Ennis, Robin; Lane, Jessica

    2014-01-01

    Under the Individuals with Disabilities Education Act of 2004 (IDEA), children with identified disabilities are entitled to a free appropriate public education (FAPE). As it relates to FAPE, IDEA requires that each identified student receive special education and related services specifically designed to confer educational benefit to that individual.…

  18. Individuals with Disabilities Education Act: Reauthorization Overview. CRS Report for Congress.

    ERIC Educational Resources Information Center

    Aleman, Steven R.

    This report provides a review of programs authorized under the Individuals with Disabilities Education Act (IDEA) and an overview of potential reauthorization issues, as the second session of the 103rd Congress considers revisions to these programs. The Infants and Toddlers Program (Part H of IDEA) provides formula grants to participating States…

  19. National Evaluation of the IDEA Technical Assistance & Dissemination Program. NCEE 2014-4000

    ERIC Educational Resources Information Center

    Daley, Tamara C.; Fiore, Thomas A.; Bollmer, Julie; Nimkoff, Tamara; Lysy, Chris

    2013-01-01

    Under the Individuals with Disabilities Education Act (IDEA), the Technical Assistance and Dissemination (TA&D) Program is the U.S. Department of Education's (ED) primary vehicle for providing technical assistance (TA) to individuals and organizations responsible for serving children with disabilities and their families. The evaluation is part…

  20. At a Glance: ADHD and IDEA 1997. A Guide for State and Local Policymakers. Policy Briefs.

    ERIC Educational Resources Information Center

    Gregg, Soleil

    This policy brief summarizes the literature and identifies responsibilities of state and local policymakers in meeting legal obligations to provide educational services for students with attention deficit hyperactivity disorder (ADHD) under the Individuals with Disabilities Education Act (IDEA). Presented in a general question-and-answer format,…

  1. Trends in Special Education Case Law: Frequency and Outcomes of Published Court Decisions 1998-2012

    ERIC Educational Resources Information Center

    Karanxha, Zorka; Zirkel, Perry A.

    2014-01-01

    The Individuals with Disabilities Education Act (IDEA) obligates school districts to identify students with disabilities and provide them with a free and appropriate public education (FAPE), which includes specially designed instruction. Identification, FAPE, least restrictive environment (LRE), and various other issues under the IDEA sometimes…

  2. Night Shift: Ideas and Strategies for Homework. Pathfinder 20. A CILT Series for Language Teachers.

    ERIC Educational Resources Information Center

    Buckland, David; Short, Mike

    A variety of ideas and strategies for homework assignments that can be stimulating and useful to second language learners are presented. Underlying principles are that homework can: give control; develop confidence; promote creativity; support differentiation by task and outcome; encourage pupil independence; support parent-school communication;…

  3. Creating Multisensory Environments: Practical Ideas for Teaching and Learning. David Fulton/Nasen

    ERIC Educational Resources Information Center

    Davies, Christopher

    2011-01-01

    Multi-sensory environments in the classroom provide a wealth of stimulating learning experiences for all young children whose senses are still under development. "Creating Multisensory Environments: Practical Ideas for Teaching and Learning" is a highly practical guide to low-cost, easy-to-assemble multi-sensory environments. With a…

  4. The Role of the DSM in IDEA Case Law

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2011-01-01

    The school psychologist plays a central role in eligibility and other determinations under the Individuals with Disabilities Education Act (IDEA) not only at the school level but also, upon formal disputes, at the successive adjudicative levels of impartial hearing officers and courts. One of the sources of professional confusion that requires…

  5. The Power of Protocols: An Educator's Guide to Better Practice. The Series on School Reform.

    ERIC Educational Resources Information Center

    McDonald, Joseph P.; Mohr, Nancy; Dichter, Alan; McDonald, Elizabeth C.

    This book describes nearly 30 protocols or "scripts" for conducting meetings, conversations, and other learning experiences among educators. Chapter 1, "The Basic Ideas," explains the basic ideas underlying the rest of the book, discussing why educators should educate themselves and making the case for exploring student work as…

  6. Musical Example to Visualize Abstract Quantum Mechanical Ideas

    ERIC Educational Resources Information Center

    Eagle, Forrest W.; Seaney, Kyser D.; Grubb, Michael P.

    2017-01-01

    Quantum mechanics is a notoriously difficult subject to learn, due to a lack of real-world analogies that might help provide an intuitive grasp of the underlying ideas. Discrete energy levels and absorption and emission wavelengths in atoms are sometimes described as uniquely quantum phenomena, but are actually general to spatially confined waves…

  7. Other Health Impairment. NICHCY Disability Fact Sheet #15

    ERIC Educational Resources Information Center

    National Dissemination Center for Children with Disabilities, 2012

    2012-01-01

    "Other Health Impairment" is one of the 14 categories of disability listed in the nation's special education law, the Individuals with Disabilities Education Act (IDEA). Under IDEA, a child who has an "other health impairment" is likely to be eligible for special services to help the child address his or her educational,…

  8. Between Private and Public: Recognition, Revolution and Political Renewal

    ERIC Educational Resources Information Center

    Stillwaggon, James

    2011-01-01

    This paper deals with some issues underlying the role of education in the preparation of students for democratic participation. Throughout, I maintain two basic ideas: first, that a political action undertaken to obtain practical ends reflects a set of privately held values whose recognition is therefore essential to any idea of the political;…

  9. 2006 Annual Report to Congress on the "Individuals with Disabilities Education Act," Part D

    ERIC Educational Resources Information Center

    Office of Special Education and Rehabilitative Services, US Department of Education, 2008

    2008-01-01

    The purpose of this report is to provide an annual overview of activities funded under the "Individuals with Disabilities Education Act" ("IDEA"), Part D (National Activities to Improve Education of Children with Disabilities), subparts 2 and 3 (P.L. 108-446). "IDEA", Part D, includes programs that support personnel…

  10. 2007 Annual Report to Congress on the "Individuals with Disabilities Education Act," Part D

    ERIC Educational Resources Information Center

    Office of Special Education and Rehabilitative Services, US Department of Education, 2009

    2009-01-01

    The purpose of this report is to provide an annual overview of activities funded under the "Individuals with Disabilities Education Act" ("IDEA"), Part D (National Activities to Improve Education of Children with Disabilities), subparts 2 and 3 (P.L. 108-446). "IDEA", Part D, includes programs that support…

  11. 76 FR 17846 - Objective Merit Review of Discretionary Financial Assistance and Other Transaction Authority...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-31

    ... initiative, and the idea, method or approach would be ineligible for assistance under a recent, current, or... strengths and weaknesses of the applications. ii. An overall consensus rating will be determined for each... unsolicited proposal and represents a unique or innovative idea, method, or approach which would not be...

  12. Is "the Posthuman" Educable? On the Convergence of Educational Philosophy, Animal Studies, and Posthumanist Theory

    ERIC Educational Resources Information Center

    Pedersen, Helena

    2010-01-01

    Formal education in Western society is firmly rooted in humanist ideals. "Becoming human" by cultivating certain cognitive, social, and moral abilities has even symbolised the idea of education as such in Enlightenment philosophical traditions. These ideas are increasingly coming under scrutiny by posthumanist theorists, who are addressing…

  13. Sensitive Technology Assessment of ACOT.

    ERIC Educational Resources Information Center

    Baker, Eva L.

    This paper explores the ideas and the model underlying the evaluation of the Apple Classroom of Tomorrow project (ACOT), a 2-year-old research and development project incorporating at least seven grade levels at five school sites in four states. The major features of ACOT are identified as the ideas of computer…

  14. 75 FR 30005 - Office of Special Education and Rehabilitative Services; List of Correspondence

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-28

    ... Topic Addressed: Maintenance of Effort [cir] Letter dated October 29, 2009 to Learning Disabilities... the Individuals with Disabilities Education Act (IDEA). Under section 607(f) of the IDEA, the... (FRS), toll free, at 1-800-877-8339. Individuals with disabilities can obtain a copy of this notice in...

  15. Enhancing Teachers' Curriculum Ownership via Teacher Engagement in State-Based Curriculum-Making: The Estonian Case

    ERIC Educational Resources Information Center

    Mikser, Rain; Kärner, Anita; Krull, Edgar

    2016-01-01

    Teachers' curriculum ownership is increasingly gaining attention in many countries. It is particularly important that under the conditions of centralized curriculum-making, teachers as final implementers of curricular ideas identify themselves with these ideas. This study investigates Estonian upper secondary school teachers' views on the impact…

  16. Circles and the Lines That Intersect Them

    ERIC Educational Resources Information Center

    Clay, Ellen L.; Rhee, Katherine L.

    2014-01-01

    In this article, Clay and Rhee use the mathematics topic of circles and the lines that intersect them to introduce the idea of looking at the single mathematical idea of relationships--in this case, between angles and arcs--across a group of problems. They introduce the mathematics that underlies these relationships, beginning with the questions…

  17. Outcomes for Children Served through IDEA's Early Childhood Programs: 2014-15

    ERIC Educational Resources Information Center

    Early Childhood Technical Assistance Center, 2016

    2016-01-01

    In 2014-2015, children with delays or disabilities who received services under the Individuals with Disabilities Education Act (IDEA) showed greater than expected developmental progress. Many children exited the program functioning within age expectations, and most made progress. States' Part C and Part B Preschool programs report data annually on three…

  18. Neuropsychiatry of creativity.

    PubMed

    Mula, Marco; Hermann, Bruce; Trimble, Michael R

    2016-04-01

    In this paper, we review in brief the development of ideas that over time have tried to explain why some individuals are more creative than others and what may be the neurobiological links underlying artistic creativity. We note associations with another unique human idea, that of genius. In particular, we discuss frontotemporal dementia and bipolar, cyclothymic mood disorder as clinical conditions that are helping to unravel the underlying neuroanatomy and neurochemistry of human creativity. This article is part of a Special Issue entitled "Epilepsy, Art, and Creativity". Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Evaluation of the potential use of hybrid LC-MS/MS for active drug quantification applying the 'free analyte QC concept'.

    PubMed

    Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F

    2017-11-01

    Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target-capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate for determining optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.

  20. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    PubMed

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

    MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs, MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows fast, parallel calibration for several metabolites. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant
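
    The record does not detail MetaQuant's calibration internals, but calibration-curve quantification generally amounts to fitting one line per metabolite from standards and inverting it for samples. A hypothetical sketch of that idea (metabolite names and numbers invented):

    ```python
    import numpy as np

    # Hypothetical calibration data: peak areas measured for known
    # standard concentrations of each metabolite (arbitrary units).
    standards = {
        "alanine": ([1, 5, 10, 50], [120, 580, 1190, 6020]),
        "glucose": ([1, 5, 10, 50], [ 80, 410,  790, 4100]),
    }

    def fit_calibrations(standards):
        """Least-squares line (area = a*conc + b) per metabolite."""
        return {name: np.polyfit(conc, area, 1)
                for name, (conc, area) in standards.items()}

    def quantify(curves, sample_areas):
        """Invert each calibration line to recover concentrations."""
        return {name: (sample_areas[name] - b) / a
                for name, (a, b) in curves.items()}

    curves = fit_calibrations(standards)
    print(quantify(curves, {"alanine": 2400.0, "glucose": 1650.0}))
    ```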

  1. Rapid capillary electrophoresis approach for the quantification of ewe milk adulteration with cow milk.

    PubMed

    Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico

    2017-10-13

    The substitution of ewe milk with more economical cow milk is a common fraud. Here we present a capillary electrophoresis (CE) method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows for the rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for human blood and urine protein analysis, which achieved separation of skimmed milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizable and distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on the ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision and accuracy, and a minimum amount of detectable fraudulent cow milk equal to 5%. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative-free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
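
    A minimal illustration of the simulation-based DFO idea: the optimizer needs only function values from the black-box model, no derivatives. The cost function and variable names below are invented stand-ins for a flowsheet simulation, not part of FOQUS:

    ```python
    from scipy.optimize import minimize

    def capture_cost(x):
        """Stand-in black-box process model: returns a cost for a
        two-variable design (say, sorbent flow and regenerator
        temperature). A real FOQUS run would invoke a full flowsheet
        simulation here instead of this toy expression."""
        flow, temp = x
        return (flow - 3.0) ** 2 + 0.5 * (temp - 420.0) ** 2 / 100.0 + 7.0

    # Nelder-Mead is derivative-free, matching the black-box setting.
    res = minimize(capture_cost, x0=[1.0, 400.0], method="Nelder-Mead")
    print("optimal design:", res.x, "cost:", res.fun)
    ```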

  3. Development and community-based validation of the IDEA study Instrumental Activities of Daily Living (IDEA-IADL) questionnaire

    PubMed Central

    Collingwood, Cecilia; Paddick, Stella-Maria; Kisoli, Aloyce; Dotchin, Catherine L.; Gray, William K.; Mbowe, Godfrey; Mkenda, Sarah; Urasa, Sarah; Mushi, Declare; Chaote, Paul; Walker, Richard W.

    2014-01-01

    Background: The dementia diagnosis gap in sub-Saharan Africa (SSA) is large, partly due to difficulties in assessing function, an essential step in diagnosis.
    Objectives: As part of the Identification and Intervention for Dementia in Elderly Africans (IDEA) study, to develop, pilot, and validate an Instrumental Activities of Daily Living (IADL) questionnaire for use in a rural Tanzanian population to assist in the identification of people with dementia alongside cognitive screening.
    Design: The questionnaire was developed at a workshop for rural primary healthcare workers, based on culturally appropriate roles and usual activities of elderly people in this community. It was piloted in 52 individuals under follow-up from a dementia prevalence study. Validation subsequently took place during a community dementia-screening programme. Construct validation against gold standard clinical dementia diagnosis using DSM-IV criteria was carried out on a stratified sample of the cohort, and validity was assessed using area under the receiver operating characteristic (AUROC) curve analysis.
    Results: An 11-item questionnaire (IDEA-IADL) was developed after pilot testing. During formal validation on 130 community-dwelling elderly people who presented for screening, the AUROC curve was 0.896 for DSM-IV dementia when used in isolation and 0.937 when used in conjunction with the IDEA cognitive screen, previously validated in Tanzania. The internal consistency was 0.959. Performance on the IDEA-IADL was not biased with regard to age, gender or education level.
    Conclusions: The IDEA-IADL questionnaire appears to be a useful aid to dementia screening in this setting. Further validation in other healthcare settings in SSA is required. PMID:25537940
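
    Construct validation of this kind reduces to computing the AUROC of questionnaire scores against the gold-standard diagnosis and, if desired, picking a cut-point. A hedged sketch with invented data (the scores and labels below are not from the study):

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # Hypothetical validation data: questionnaire scores and gold-standard
    # DSM-IV dementia diagnoses (1 = dementia, 0 = no dementia).
    scores    = np.array([3, 5, 8, 11, 14, 17, 19, 22, 25, 28])
    diagnosis = np.array([0, 0, 0,  0,  1,  0,  1,  1,  1,  1])

    auc = roc_auc_score(diagnosis, scores)
    fpr, tpr, thresholds = roc_curve(diagnosis, scores)
    # Youden's J picks the cut-point maximizing sensitivity + specificity - 1
    best = np.argmax(tpr - fpr)
    print(f"AUROC = {auc:.3f}, suggested cut-point = {thresholds[best]}")
    ```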

  4. Athlete’s Retention of a Coach’s Instruction Before a Judo Competition

    PubMed Central

    Mesquita, Isabel; Rosado, Antonio; Januário, Nuno; Barroja, Elsa

    2008-01-01

    The aim of the present study was to analyze the instruction given by judo coaches immediately before competition, during the preparation for fights, in order to (1) study the coherency between the information the coach transmits and that which the athlete retains; (2) identify the correlation between the coherency, the extension, and the number of ideas conveyed by the coach; and (3) determine whether retention varies with the form and nature of the information, as well as with the gender and practice category of the athletes. The participants were 11 coaches and 58 athletes in 3 categories, under-15, under-17 and under-20, of both genders. One hundred and sixteen (116) instructional episodes were observed, corresponding to four hundred and six (406) information units conveyed by the coaches. The coaches' instructions given before the competition were recorded in audio and video. After the coaches' instruction, the athletes were approached by the investigator and interviewed. To determine whether retention varies with the form and nature of the information and with the gender and practice category of the athletes, the non-parametric Mann-Whitney U and Kruskal-Wallis tests were used. Spearman correlation was applied to verify the degree of association between the coherency, the extension, and the number of ideas conveyed by the coach. The results showed that a substantial part of the information was not retained by the athletes and that information coherency was inversely related to the number of transmitted ideas. The coaches were mainly prescriptive, and the form of the information was not important for retention. Gender was a differentiating variable, as the girls showed more coherency between the retained ideas and the ideas transmitted by the coach. Key points:
    - The instructions given by the coach are optimized if the athletes retain and understand them well, and should be carefully analyzed by researchers and coaches.
    - The ratio of concordant ideas between coach and athlete (coherency) increased when the number of ideas decreased, which raises the question of the adequacy of the instructional strategies used by coaches.
    - The prescriptive information showed that athletes were able to express a larger number of ideas in fewer words (greater density), while the combined information caused athletes to use more words to reproduce what the coach said.
    - Gender was a differentiating variable, as the girls showed more coherency between the retained ideas and the ideas transmitted by the coach. These results indicate a possible tendency for girls to be more attentive when the coach is giving information; however, to confirm this assumption more research is needed. PMID:24149909
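
    The statistical workflow described above maps directly onto standard library calls. A sketch with hypothetical retention data (all numbers below are illustrative only, not the study's data):

    ```python
    from scipy.stats import mannwhitneyu, kruskal, spearmanr

    # Hypothetical retention scores (% of the coach's ideas retained)
    girls = [62, 71, 58, 80, 67, 74]
    boys  = [49, 55, 61, 52, 66, 45]
    u15, u17, u20 = [50, 62, 58], [55, 70, 64], [60, 72, 68]

    print(mannwhitneyu(girls, boys, alternative="two-sided"))  # gender effect
    print(kruskal(u15, u17, u20))                              # category effect

    # Association between number of ideas conveyed and coherency
    ideas     = [2, 3, 4, 5, 6, 8]
    coherency = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]
    print(spearmanr(ideas, coherency))  # expect a negative correlation
    ```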

  5. Selective determination of ertapenem in the presence of its degradation product

    NASA Astrophysics Data System (ADS)

    Hassan, Nagiba Y.; Abdel-Moety, Ezzat M.; Elragehy, Nariman A.; Rezk, Mamdouh R.

    2009-06-01

    Stability-indicating determination of ertapenem (ERTM) in the presence of its β-lactam open-ring degradation product, which is also its metabolite, is investigated. The degradation product has been isolated via acid degradation, characterized, and elucidated. Selective quantification of ERTM, singly in bulk form, in pharmaceutical formulations, and/or in the presence of its major degradant, is demonstrated. Stability indication was assessed under conditions likely to be encountered during normal storage. Among the spectrophotometric methods adopted for quantification are first derivative (1D), first derivative of ratio spectra (1DD) and bivariate analysis.

  6. Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.

    PubMed

    Heredia, Nicholas J

    2018-01-01

    Digital PCR is a valuable tool to quantify next-generation sequencing (NGS) libraries precisely and accurately. Accurately quantifying NGS libraries enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing under- and overloading error. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity of the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification in addition to size quality assessment, enabling users to QC their sequencing libraries with confidence.

  7. To create or to recall? Neural mechanisms underlying the generation of creative new ideas

    PubMed Central

    Benedek, Mathias; Jauk, Emanuel; Fink, Andreas; Koschutnig, Karl; Reishofer, Gernot; Ebner, Franz; Neubauer, Aljoscha C.

    2014-01-01

    This fMRI study investigated brain activation during creative idea generation using a novel approach allowing spontaneous self-paced generation and expression of ideas. Specifically, we addressed the fundamental question of what brain processes are relevant for the generation of genuinely new creative ideas, in contrast to the mere recollection of old ideas from memory. In general, creative idea generation (i.e., divergent thinking) was associated with extended activations in the left prefrontal cortex and the right medial temporal lobe, and with deactivation of the right temporoparietal junction. The generation of new ideas, as opposed to the retrieval of old ideas, was associated with stronger activation in the left inferior parietal cortex which is known to be involved in mental simulation, imagining, and future thought. Moreover, brain activation in the orbital part of the inferior frontal gyrus was found to increase as a function of the creativity (i.e., originality and appropriateness) of ideas pointing to the role of executive processes for overcoming dominant but uncreative responses. We conclude that the process of idea generation can be generally understood as a state of focused internally-directed attention involving controlled semantic retrieval. Moreover, left inferior parietal cortex and left prefrontal regions may subserve the flexible integration of previous knowledge for the construction of new and creative ideas. PMID:24269573

  8. Final priority; Technical Assistance on State Data Collection--IDEA Data Management Center. Final priority.

    PubMed

    2014-08-05

    The Assistant Secretary for the Office of Special Education and Rehabilitative Services (OSERS) announces a priority under the Technical Assistance on State Data Collection program. The Assistant Secretary may use this priority for competitions in fiscal year (FY) 2014 and later years. We take this action to fund a cooperative agreement to establish and operate an IDEA Data Management Center (Center) that will provide technical assistance (TA) to improve the capacity of States to meet the data collection requirements of the Individuals with Disabilities Education Act (IDEA).

  9. Multimodal Imaging and Lighting Bias Correction for Improved μPAD-based Water Quality Monitoring via Smartphones

    NASA Astrophysics Data System (ADS)

    McCracken, Katherine E.; Angus, Scott V.; Reynolds, Kelly A.; Yoon, Jeong-Yeol

    2016-06-01

    Smartphone image-based sensing of microfluidic paper analytical devices (μPADs) offers low-cost and mobile evaluation of water quality. However, consistent quantification is a challenge due to variable environmental, paper, and lighting conditions, especially across large multi-target μPADs. Compensations must be made for variations between images to achieve reproducible results without a separate lighting enclosure. We thus developed a simple method using triple-reference-point normalization and a fast Fourier transform (FFT)-based pre-processing scheme to quantify consistent reflected light intensity signals under variable lighting and channel conditions. This technique was evaluated using various light sources, lighting angles, imaging backgrounds, and imaging heights. Further testing evaluated its handling of absorbance, quenching, and relative scattering intensity measurements from assays detecting four water contaminants - Cr(VI), total chlorine, caffeine, and E. coli K12 - at similar wavelengths using the green channel of RGB images. Between assays, this algorithm reduced error from μPAD surface inconsistencies and cross-image lighting gradients. Although the algorithm could not completely remove the anomalies arising from point shadows within channels or some non-uniform background reflections, it still afforded order-of-magnitude quantification and stable assay specificity under these conditions, offering one route toward improving smartphone quantification of μPAD assays for in-field water quality monitoring.
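
    The record does not specify the FFT pre-processing in detail; one plausible reading is that a slowly varying illumination field is estimated from the lowest spatial frequencies of the image and divided out. A sketch under that assumption (keep_frac is an invented tuning parameter, not a value from the paper):

    ```python
    import numpy as np

    def remove_lighting_gradient(green, keep_frac=0.02):
        """Estimate a slowly varying illumination field by keeping only
        the lowest spatial frequencies of the image's FFT, then divide
        it out to flatten cross-image lighting gradients."""
        F = np.fft.fft2(green)
        h, w = green.shape
        kh, kw = max(1, int(h * keep_frac)), max(1, int(w * keep_frac))
        mask = np.zeros_like(F, dtype=bool)
        mask[:kh, :kw] = mask[:kh, -kw:] = True   # low-frequency corners
        mask[-kh:, :kw] = mask[-kh:, -kw:] = True
        illumination = np.real(np.fft.ifft2(np.where(mask, F, 0)))
        illumination = np.clip(illumination, 1e-6, None)
        return green / illumination * illumination.mean()

    rng = np.random.default_rng(1)
    img = rng.uniform(0.4, 0.6, (128, 128)) + np.linspace(0, 0.3, 128)
    flat = remove_lighting_gradient(img)
    print(img.std(), flat.std())  # corrected image varies less
    ```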

  10. Early Childhood Education for Exceptional Children: A Handbook of Ideas and Exemplary Practices.

    ERIC Educational Resources Information Center

    Jordan, June B., Ed.; And Others

    Intended as a guide for educators and researchers, the volume provides ideas and program descriptions in the field of education for young exceptional children. An introductory chapter (J. De Weerd) presents an overview of education for handicapped children and describes the establishment under the Bureau of Education for the Handicapped (BEH) of…

  11. Students with Prader-Willi Syndrome: Case Law under the IDEA

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2017-01-01

    Prader-Willi Syndrome (PWS) is one of the low-incidence physical disabilities that the literature has not addressed in relation to the Individuals with Disabilities Education Act and its case law applications. To help fill the gap, this relatively brief article provides (a) an introduction of PWS from legal sources; (b) an overview of the IDEA,…

  12. 76 FR 13526 - Reducing Regulatory Burden; Retrospective Review Under Executive Order 13563

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... participate using an existing social media account such as Facebook or Twitter. For further information, see... either SPAM/Inappropriate or Duplicate (log-in required); (6) Share ideas through a Twitter feed or on your Facebook page (log-in required for IdeaScale, as well as an active Facebook and/or Twitter account...

  13. Effectiveness of Inquiry-Based Lessons Using Particulate Level Models to Develop High School Students' Understanding of Conceptual Stoichiometry

    ERIC Educational Resources Information Center

    Kimberlin, Stephanie; Yezierski, Ellen

    2016-01-01

    Students' inaccurate ideas about what is represented by chemical equations and concepts underlying stoichiometry are well documented; however, there are few classroom-ready instructional solutions to help students build scientifically accurate ideas about these topics central to learning chemistry. An intervention (two inquiry-based activities)…

  14. 75 FR 43922 - Interim Guidance for Determining Subject Matter Eligibility for Process Claims in View of Bilski...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ..., who do not routinely encounter claims that implicate the abstract idea exception. Under the principles... principles: Laws of nature, physical phenomena, and abstract ideas. See id. The Office has been using the so... marketing a product, comprising: Developing a shared marketing force, said shared marketing force including...

  15. What Do I Do When...: The Answer Book on Special Education Law. Second Edition.

    ERIC Educational Resources Information Center

    Gorn, Susan

    This book presents, in a question-and-answer format, a comprehensive guide to special education law, especially the Individuals with Disabilities Education Act (IDEA). Following a table of questions and suggestions on using the book, individual chapters cover law on the following topics: (1) eligibility (e.g., under IDEA and other laws, specific…

  16. For the Future of Chinese Universities: Three Conversations from the Past

    ERIC Educational Resources Information Center

    Pickus, David

    2016-01-01

    This article argues that ideas from the ancient past supply insight about the future of Chinese universities. I make this case by outlining three claims about the nature and purpose of education in Homer, Plato, and Augustine. I propose that conversations based on these ideas illuminate central underlying problems facing Chinese higher education…

  17. Developing "Algebraic Thinking": Two Key Ways to Establish Some Early Algebraic Ideas in Primary Classrooms

    ERIC Educational Resources Information Center

    Ormond, Christine

    2012-01-01

    Primary teachers play a key role in their students' future mathematical success in the early secondary years. While the word "algebra" may make some primary teachers feel uncomfortable or worried, the basic arithmetic ideas underlying algebra are vitally important for older primary students as they are increasingly required to use "algebraic…

  18. Behavioral Intervention Plans: Legal and Practical Considerations for Students with Emotional and Behavioral Disorders

    ERIC Educational Resources Information Center

    Maag, John W.; Katsiyannis, Antonis

    2006-01-01

    Reauthorization of the Individuals with Disabilities Education Act (IDEA) specifies that a behavioral intervention plan (BIP) must be developed for students with disabilities under certain disciplinary exclusions. IDEA, however, does not provide details as to what should be included in BIPs, and this lack of specific guidance often results in…

  19. 2005 Annual Report to Congress on the "Individuals with Disabilities Education Act," Part D

    ERIC Educational Resources Information Center

    Office of Special Education and Rehabilitative Services, US Department of Education, 2007

    2007-01-01

    The purpose of this report is to provide an overview of national activities to improve the education of children with disabilities funded in fiscal year (FY) 2005 under the "Individuals with Disabilities Education Act" ("IDEA"), Part D, Subparts 2 and 3 (P.L. 108-446). "IDEA", Part D (National Activities) includes…

  20. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
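
    Of the three methods, Monte Carlo propagation is the most direct to sketch: sample the uncertain design variables, push each sample through the analysis code, and read statistics off the outputs. A toy illustration (the weight model and input sigmas below are invented, not from the paper):

    ```python
    import numpy as np

    def wing_weight(span, chord):
        """Stand-in aircraft analysis: weight as a nonlinear function of
        two configuration design variables (purely illustrative)."""
        return 150.0 + 2.1 * span ** 1.5 + 35.0 * chord ** 2

    rng = np.random.default_rng(42)
    n = 100_000
    # Input uncertainties: normal around nominal values (assumed sigmas)
    span  = rng.normal(30.0, 0.5, n)
    chord = rng.normal(3.0, 0.05, n)

    w = wing_weight(span, chord)
    print(f"mean = {w.mean():.1f}, std = {w.std():.1f}")
    print(f"95th percentile = {np.percentile(w, 95):.1f}")  # design margin
    ```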

  1. Strain and Torsion Quantification in Mouse Hearts under Dobutamine Stimulation using 2D Multi-Phase MR DENSE

    PubMed Central

    Zhong, Jia; Yu, Xin

    2010-01-01

    In the current study, a 2D multi-phase MR displacement encoding with stimulated echoes (DENSE) imaging and analysis method was developed for direct quantification of Lagrangian strain in the mouse heart. Using the proposed method, a temporal resolution of less than 10 ms and an in-plane resolution of 0.56 mm were achieved. A validation study that compared strain calculation by DENSE and by MR tagging showed high correlation between the two methods (R2 > 0.80). Regional ventricular wall strain and twist were characterized in mouse hearts at baseline and under dobutamine stimulation. Dobutamine stimulation induced significant increases in radial and circumferential strains and torsion at peak systole. Rapid untwisting was also observed during early diastole. This work demonstrates the capability of characterizing the cardiac functional response to dobutamine stimulation in the mouse heart using 2D multi-phase MR DENSE. PMID:20740659
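
    Lagrangian strain quantification from DENSE displacement fields follows the standard continuum-mechanics recipe E = ½(FᵀF − I) with F = I + ∇u. A sketch of that computation (the grid and displacements below are synthetic, not DENSE data):

    ```python
    import numpy as np

    def green_lagrange_strain(ux, uy, dx=1.0, dy=1.0):
        """Compute 2D Green-Lagrange strain E = 0.5*(F^T F - I) at every
        pixel from displacement components ux, uy (as DENSE provides).
        Returns the Exx, Eyy, Exy component arrays."""
        dux_dy, dux_dx = np.gradient(ux, dy, dx)
        duy_dy, duy_dx = np.gradient(uy, dy, dx)
        # Deformation gradient F = I + grad(u)
        Fxx, Fxy = 1.0 + dux_dx, dux_dy
        Fyx, Fyy = duy_dx, 1.0 + duy_dy
        Exx = 0.5 * (Fxx ** 2 + Fyx ** 2 - 1.0)
        Eyy = 0.5 * (Fxy ** 2 + Fyy ** 2 - 1.0)
        Exy = 0.5 * (Fxx * Fxy + Fyx * Fyy)
        return Exx, Eyy, Exy

    # Uniform 5% stretch in x: Exx should be 0.5*(1.05**2 - 1) ~ 0.051
    X, Y = np.meshgrid(np.arange(32.0), np.arange(32.0))
    Exx, Eyy, Exy = green_lagrange_strain(0.05 * X, np.zeros_like(Y))
    print(Exx.mean(), Eyy.mean(), Exy.mean())
    ```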

  2. You have an idea, now what?

    PubMed

    Gertner, Michael

    2006-11-01

    The innovation process is often more important than the original idea, particularly when the ultimate goal is to improve patient care through technologically advanced products. Many physicians have great ideas; unfortunately, many of these great ideas are never translated to patient care improvements because of a misunderstanding of "the next step." In many cases, the next step is a step backward to understand the real clinical problem: "the clinical need." With the clinical need in hand, the most efficient path to a product for improved patient care can then be derived. Often, the most efficient pathway involves an appreciation of many issues, including intellectual property, regulatory pathways, finance, and clinical trial strategies. The integration of these issues underlies innovation in biomedical technology.

  3. Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT

    PubMed Central

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2014-01-01

    Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal.
    Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance.
    Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat.
    Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354

  4. Computer-aided assessment of regional abdominal fat with food residue removal in CT.

    PubMed

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2013-11-01

    Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. Published by Elsevier Inc.
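
    The Dice similarity coefficient used for validation in both versions of this record is straightforward to compute; a minimal sketch with toy masks:

    ```python
    import numpy as np

    def dice_score(seg, ref):
        """Dice similarity coefficient between binary masks,
        DSC = 2|A ∩ B| / (|A| + |B|), as used to compare automated
        fat segmentations against semimanual delineations."""
        seg, ref = seg.astype(bool), ref.astype(bool)
        denom = seg.sum() + ref.sum()
        return 2.0 * np.logical_and(seg, ref).sum() / denom if denom else 1.0

    # Toy example: two overlapping square "fat" masks
    a = np.zeros((10, 10), dtype=bool); a[2:8, 2:8] = True
    b = np.zeros((10, 10), dtype=bool); b[3:9, 3:9] = True
    print(f"DSC = {dice_score(a, b):.3f}")
    ```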

  5. Selection and evaluation of reference genes for expression studies with quantitative PCR in the model fungus Neurospora crassa under different environmental conditions in continuous culture.

    PubMed

    Cusick, Kathleen D; Fitzgerald, Lisa A; Pirlo, Russell K; Cockrell, Allison L; Petersen, Emily R; Biffinger, Justin C

    2014-01-01

    Neurospora crassa has served as a model organism for studying circadian pathways and has more recently gained attention in the biofuel industry due to its enhanced capacity for cellulase production. However, in order to optimize N. crassa for biotechnological applications, metabolic pathways during growth under different environmental conditions must be addressed. Reverse-transcription quantitative PCR (RT-qPCR) is a technique that provides a high-throughput platform from which to measure the expression of a large set of genes over time. The selection of a suitable reference gene is critical for gene expression studies using relative quantification, as this strategy is based on normalization of target gene expression to a reference gene whose expression is stable under the experimental conditions. This study evaluated twelve candidate reference genes for use with N. crassa when grown in continuous culture bioreactors under different light and temperature conditions. Based on combined stability values from the NormFinder and BestKeeper software packages, the following are the most appropriate reference genes under conditions of: (1) light/dark cycling: btl, asl, and vma1; (2) all-dark growth: btl, tbp, vma1, and vma2; (3) temperature flux: btl, vma1, act, and asl; (4) all conditions combined: vma1, vma2, tbp, and btl. Since N. crassa exists as different cell types (uni- or multi-nucleated), expression changes in a subset of the candidate genes were further assessed using absolute quantification. A strong negative correlation was found to exist between ratio and threshold cycle (CT) values, demonstrating that CT changes serve as a reliable reflection of transcript, and not gene copy number, fluctuations. The results of this study identified genes that are appropriate for use as reference genes in RT-qPCR studies with N. crassa and demonstrated that, even with the presence of different cell types, relative quantification is an acceptable method for measuring gene expression changes during growth in bioreactors.
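
    Once a stable reference gene is chosen, relative quantification commonly proceeds via the Livak 2^-ΔΔCT method. A sketch with hypothetical CT values (the numbers are invented; the method assumes near-100% amplification efficiency):

    ```python
    def fold_change(ct_target_treated, ct_ref_treated,
                    ct_target_control, ct_ref_control):
        """Livak 2^-ddCT relative quantification: target expression is
        normalized to a stable reference gene (e.g., vma1 from the list
        above), then to the control condition. Assumes ~100% efficiency."""
        d_ct_treated = ct_target_treated - ct_ref_treated
        d_ct_control = ct_target_control - ct_ref_control
        dd_ct = d_ct_treated - d_ct_control
        return 2.0 ** (-dd_ct)

    # Hypothetical CT values: light/dark cycling vs. all-dark control
    print(fold_change(22.1, 18.4, 24.0, 18.5))  # ~3.5-fold up-regulation
    ```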

  6. Applying a World-City Network Approach to Globalizing Higher Education: Conceptualization, Data Collection and the Lists of World Cities

    ERIC Educational Resources Information Center

    Chow, Alice S. Y.; Loo, Becky P. Y.

    2015-01-01

    Both the commercial and education sectors experience an increase in inter-city exchanges in the forms of goods, capital, commands, people and information/knowledge under globalization. The quantification of flows and structural relations among cities in globalizing education is under-researched compared to the well-established world/global cities…

  7. Quantification of the vertical translocation rate of soil solid-phase material by the magnetic tracer method

    NASA Astrophysics Data System (ADS)

    Zhidkin, A. P.; Gennadiev, A. N.

    2016-07-01

    Approaches to the quantification of the vertical translocation rate of soil solid-phase material by the magnetic tracer method have been developed; the tracer penetration depth and rate have been determined, as well as the radial distribution of the tracer in chernozems (Chernozems) and dark gray forest soils (Luvisols) of Belgorod oblast under natural steppe and forest vegetation and in arable lands under agricultural use of different durations. It has been found that the penetration depth of spherical magnetic particles (SMPs) during their 150-year occurrence in soils of a forest plot is 68 cm under forest, 58 cm on a 100-year-old plowland, and only 49 cm on a 150-year-old plowland. In the chernozems of the steppe plot, the penetration depth of SMPs exceeds the studied depth of 70 cm both under natural vegetation and on the plowlands. The penetration rates of SMPs deep into the soil vary significantly among the key plots: 0.92-1.32 mm/year on the forest plot and 1.47-1.63 mm/year on the steppe plot, probably because of the more active recent turbation activity of soil animals.

  8. Cooperative strings and glassy interfaces

    PubMed Central

    Salez, Thomas; Salez, Justin; Dalnoki-Veress, Kari; Raphaël, Elie; Forrest, James A.

    2015-01-01

    We introduce a minimal theory of glass formation based on the ideas of molecular crowding and resultant string-like cooperative rearrangement, and address the effects of free interfaces. In the bulk case, we obtain a scaling expression for the number of particles taking part in cooperative strings, and we recover the Adam–Gibbs description of glassy dynamics. Then, by including thermal dilatation, the Vogel–Fulcher–Tammann relation is derived. Moreover, the random and string-like characters of the cooperative rearrangement allow us to predict a temperature-dependent expression for the cooperative length ξ of bulk relaxation. Finally, we explore the influence of sample boundaries when the system size becomes comparable to ξ. The theory is in agreement with measurements of the glass-transition temperature of thin polymer films, and allows quantification of the temperature-dependent thickness h_m of the interfacial mobile layer. PMID:26100908
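
    For reference, the Vogel-Fulcher-Tammann relation mentioned above is conventionally written as follows (standard form; the paper's own parameterization may differ):

    ```latex
    % tau_0: microscopic attempt time; A: activation parameter;
    % T_0: extrapolated divergence temperature below the glass transition.
    \tau(T) = \tau_0 \exp\!\left(\frac{A}{T - T_0}\right), \qquad T > T_0
    ```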

  9. Music in film and animation: experimental semiotics applied to visual, sound and musical structures

    NASA Astrophysics Data System (ADS)

    Kendall, Roger A.

    2010-02-01

    The relationship of music to film has only recently received the attention of experimental psychologists and quantificational musicologists. This paper outlines theory, semiotical analysis, and experimental results using relations among variables of temporally organized visuals and music. 1. A comparison and contrast is developed among the ideas in semiotics and experimental research, including historical and recent developments. 2. Musicological Exploration: The resulting multidimensional structures of associative meanings, iconic meanings, and embodied meanings are applied to the analysis and interpretation of a range of film with music. 3. Experimental Verification: A series of experiments testing the perceptual fit of musical and visual patterns layered together in animations determined goodness of fit between all pattern combinations, results of which confirmed aspects of the theory. However, exceptions were found when the complexity of the stratified stimuli resulted in cognitive overload.

  10. Capillary electrophoresis with contactless conductivity detection for the quantification of fluoride in lithium ion battery electrolytes and in ionic liquids-A comparison to the results gained with a fluoride ion-selective electrode.

    PubMed

    Pyschik, Marcelina; Klein-Hitpaß, Marcel; Girod, Sabrina; Winter, Martin; Nowak, Sascha

    2017-02-01

    In this study, an optimized method using capillary electrophoresis (CE) with contactless conductivity detection (C4D) is presented for a new application field: the quantification of fluoride in commonly used lithium ion battery (LIB) electrolytes based on LiPF6 in organic carbonate solvents, and in ionic liquids (ILs) after contact with Li metal. Method development, covering buffer selection and suitable CE conditions for the quantification of fluoride, is described. The fluoride concentrations measured in different LIB electrolyte samples were compared to the results from a fluoride ion-selective electrode (ISE). The relative standard deviations (RSDs) and recovery rates for fluoride were obtained with very high accuracy by both methods, and the fluoride concentrations in the LIB electrolytes were in very good agreement between the two methods. In addition, the limit of detection (LOD) and limit of quantification (LOQ) were determined for the CE method. The CE method was also applied to the quantification of fluoride in ILs. In the fresh IL sample, the concentration of fluoride was below the LOD. Another sample of the IL mixed with Li metal was investigated as well, and it was possible to quantify the fluoride concentration in this sample. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
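
    The record does not state how the LOD and LOQ were computed; a common ICH-style estimate uses the residual standard deviation of the calibration regression, LOD = 3.3σ/S and LOQ = 10σ/S with S the slope. A sketch with invented calibration data:

    ```python
    import numpy as np

    # Hypothetical fluoride calibration: concentration (ug/mL) vs. peak area
    conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    area = np.array([0.52, 1.05, 1.98, 5.10, 9.95])

    slope, intercept = np.polyfit(conc, area, 1)
    residuals = area - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)  # residual std of the regression

    lod = 3.3 * sigma / slope      # ICH-style limit of detection
    loq = 10.0 * sigma / slope     # ICH-style limit of quantification
    print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
    ```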

  11. From the Teachers Professional Ethics to the Personal Professional Responsibility

    ERIC Educational Resources Information Center

    Seghedin, Elena

    2014-01-01

    Following the idea of the civic responsibility of all adults for the new generation, we have tried, in different previous studies, to demonstrate that teaching involves many moral principles and values. The aim of the present article is to present a part of our research about teaching ethics under the idea of being a stable dimension of teaching…

  12. An Issue Hiding in Plain Sight: When Are Speech-Language Pathologists Special Educators Rather than Related Services Providers?

    ERIC Educational Resources Information Center

    Giangreco, Michael F.; Prelock, Patricia A.; Turnbull, H. Rutherford, III

    2010-01-01

    Purpose: Under the Individuals With Disabilities Education Act (IDEA; as amended, 2004), speech-language pathology services may be either special education or a related service. Given the absence of guidance documents or research on this issue, the purposes of this clinical exchange are to (a) present and analyze the IDEA definitions related to…

  13. The Impact of Professional Development on the Quality of the Transition Components of IEPs

    ERIC Educational Resources Information Center

    Flannery, K. Brigid; Lombardi, Allison; Kato, Mimi McGrath

    2015-01-01

    Under the Individuals With Disabilities Education Act (IDEA), transition needs and services are to be discussed as part of the Individual Education Program (IEP) planning process, and decisions based on students' future goals are to be documented in the IEP. These transition requirements were included in IDEA in order to plan with the student,…

  14. Data Sharing Agreement Checklist for IDEA Part C and Part B 619 Agencies and Programs

    ERIC Educational Resources Information Center

    Center for IDEA Early Childhood Data Systems (DaSy), 2014

    2014-01-01

    This 2014 document is an adaptation of the 2012 release of "Data Sharing Agreement Checklist" intended for K-12 audiences. Presented as a checklist, the document summarizes the requirements for the written agreements under the audit or evaluation exception that is specified in FERPA and that also applies to the IDEA for Part C early…

  15. MacIntyre's Revolutionary Aristotelian Philosophy and His Idea of an Educated Public Revisited

    ERIC Educational Resources Information Center

    Macallister, James

    2016-01-01

    In this article I revisit MacIntyre's lecture on the idea of an educated public. I argue that the full significance of MacIntyre's views on the underlying purposes of universities only become clear when his lecture on the educated public is situated in the context of his wider 'revolutionary Aristotelian' philosophical project. I claim that for…

  16. American Curriculum Theory and the Problem of Social Control, 1918-1938.

    ERIC Educational Resources Information Center

    Franklin, Barry M.

    Curriculum as a field of study emerged in an intellectual climate in which the idea of social control was dominant. The intent of this paper is to look historically at the integration of the idea of social control into curriculum discourse, to indicate its dominant position as the underlying assumption of most early curriculum work, and to suggest…

  17. Why good ideas and good science do not always make it into the marketplace

    Treesearch

    Charles R. Frihart

    2007-01-01

    Good ideas and good science are not sufficient in and of themselves for successful commercialization of new technology. Understanding the barriers to commercialization so that ways around, under, over, or through them can be found is also crucial to success. Barriers can include market needs, technology push versus market pull, availability of a window of opportunity,...

  18. What Is the Nature of the Principal's Leadership in Elementary Schools Where Response to Intervention Has Been Implemented?

    ERIC Educational Resources Information Center

    Roberts, Jennifer M.

    2014-01-01

    The revised Individuals with Disabilities Education Improvement Act (IDEA, 2004) has offered a change of practice regarding the identification of students with a learning disability. Under IDEA (2004) educators are encouraged to use Response to Intervention (RTI) as a method to determine eligibility for special education services. In an RTI…

  19. Quantification and purification of mangiferin from Chinese Mango (Mangifera indica L.) cultivars and its protective effect on human umbilical vein endothelial cells under H(2)O(2)-induced stress.

    PubMed

    Luo, Fenglei; Lv, Qiang; Zhao, Yuqin; Hu, Guibing; Huang, Guodi; Zhang, Jiukai; Sun, Chongde; Li, Xian; Chen, Kunsong

    2012-01-01

    Mangiferin is a natural xanthonoid with various biological activities. Quantification of mangiferin in fruit peel, pulp, and seed kernel was carried out in 11 Chinese mango (Mangifera indica L.) cultivars. The highest mangiferin content was found in the peel of Lvpimang (LPM) fruit (7.49 mg/g DW). Efficient purification of mangiferin from mango fruit peel was then established for the first time by combining macroporous HPD100 resin chromatography with optimized high-speed counter-current chromatography (HSCCC). Purified mangiferin was identified by both HPLC and LC-MS, and it showed higher DPPH• free-radical scavenging capacity and ferric reducing ability of plasma (FRAP) than l-ascorbic acid (Vc) or Trolox. In addition, it showed significant protective effects on human umbilical vein endothelial cells (HUVEC) under H2O2-induced stress: cells treated with mangiferin showed significantly enhanced survival under H2O2 stress. Therefore, mangiferin from mango fruit provides a promising prospect for the prevention of oxidative stress-associated diseases.

  20. Modeling particulate matter emissions during mineral loading process under weak wind simulation.

    PubMed

    Zhang, Xiaochun; Chen, Weiping; Ma, Chun; Zhan, Shuifen

    2013-04-01

    The quantification of particulate matter emissions from mineral handling is an important problem for the quantification of global emissions on industrial sites. Mineral particulate matter emissions can adversely impact environmental quality in mining regions, transport regions, and even on a global scale. Mineral loading is an important process contributing to mineral particulate matter emissions, especially under weak wind conditions. Mathematical models are effective ways to evaluate particulate matter emissions during the mineral loading process. The currently used empirical models, based on the form of a power function, do not predict particulate matter emissions accurately under weak wind conditions: they overestimate emission factors at low emission levels and underestimate them at high levels. We conducted wind tunnel experiments to evaluate the particulate matter emission factors for the mineral loading process. A new approach based on the mathematical form of a logistic function was developed and tested. It provided a realistic depiction of the particulate matter emissions during the mineral loading process, accounting for the fraction of fine mineral particles, the dropping height, and the wind velocity. Copyright © 2013 Elsevier B.V. All rights reserved.
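
    The contrast between the two functional forms is easy to make concrete. Below is a minimal Python sketch of the idea described above; the parameter names and values are illustrative assumptions, not the coefficients fitted in the study.

    ```python
    import numpy as np

    # Power-law form used by earlier empirical models: it grows without bound,
    # overestimating at low emission levels and underestimating at high ones
    # under weak winds.
    def ef_power(u, a=0.1, b=1.3):
        return a * u**b  # u: wind velocity (m/s); returns an emission factor

    # Logistic form proposed in the paper: bounded between 0 and ef_max, so it
    # can rise steeply at mid-range winds yet saturate at high emissions.
    def ef_logistic(u, ef_max=1.0, k=2.0, u0=1.5):
        return ef_max / (1.0 + np.exp(-k * (u - u0)))
    ```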

  1. Quantification and Purification of Mangiferin from Chinese Mango (Mangifera indica L.) Cultivars and Its Protective Effect on Human Umbilical Vein Endothelial Cells under H2O2-induced Stress

    PubMed Central

    Luo, Fenglei; Lv, Qiang; Zhao, Yuqin; Hu, Guibing; Huang, Guodi; Zhang, Jiukai; Sun, Chongde; Li, Xian; Chen, Kunsong

    2012-01-01

    Mangiferin is a natural xanthonoid with various biological activities. Quantification of mangiferin in fruit peel, pulp, and seed kernel was carried out in 11 Chinese mango (Mangifera indica L.) cultivars. The highest mangiferin content was found in the peel of Lvpimang (LPM) fruit (7.49 mg/g DW). Efficient purification of mangiferin from mango fruit peel was then established for the first time by combining macroporous HPD100 resin chromatography with optimized high-speed counter-current chromatography (HSCCC). Purified mangiferin was identified by both HPLC and LC-MS, and it showed higher DPPH• free-radical scavenging capacity and ferric reducing ability of plasma (FRAP) than l-ascorbic acid (Vc) or Trolox. In addition, it showed significant protective effects on human umbilical vein endothelial cells (HUVEC) under H2O2-induced stress: cells treated with mangiferin showed significantly enhanced survival under H2O2 stress. Therefore, mangiferin from mango fruit provides a promising prospect for the prevention of oxidative stress-associated diseases. PMID:23109851

  2. Rapid determination of quinolones in cosmetic products by ultra high performance liquid chromatography with tandem mass spectrometry.

    PubMed

    Liu, Shao-Ying; Huang, Xi-Hui; Wang, Xiao-Fang; Jin, Quan; Zhu, Guo-Nian

    2014-05-01

    This study developed an improved analytical method for the simultaneous quantification of 13 quinolones in cosmetics by ultra-high performance liquid chromatography combined with ESI triple-quadrupole MS/MS in multiple reaction monitoring mode. The analytes were extracted and purified using an SPE cartridge. The limits of quantification ranged from 0.03 to 3.02 μg/kg. The precision for determining the quinolones was <19.39%. The proposed method was successfully applied to the determination of quinolones in real cosmetic samples. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Clinical prediction and the idea of a population.

    PubMed

    Armstrong, David

    2017-04-01

    Using an analysis of the British Medical Journal over the past 170 years, this article describes how changes in the idea of a population have informed new technologies of medical prediction. These approaches have largely replaced older ideas of clinical prognosis based on understanding the natural histories of the underlying pathologies. The 19th-century idea of a population, which provided a denominator for medical events such as births and deaths, was constrained in its predictive power by its method of enumerating individual bodies. During the 20th century, populations were increasingly constructed through inferential techniques based on patient groups and samples seen to possess variable characteristics. The emergence of these new virtual populations created the conditions for the emergence of predictive algorithms that are used to foretell our medical futures.

  4. The Use of the Internet to Support General Aviation Research

    NASA Technical Reports Server (NTRS)

    Rowbottom, James H.

    1995-01-01

    For the past few years, innovation in the field of General Aviation (GA) has declined. The reason for this decline has not been a lack of ideas, but rather a lack of the funds necessary to convert these ideas into reality. NASA implemented the Small Business Innovative Research (SBIR) program in an effort to promote new technology in General Aviation. Under this program, small businesses with good ideas present them to NASA, which reviews them and determines their potential value in the GA market. If a company's idea proves worthy, NASA subsidizes its research in three phases that include the research, testing, development, and production of the product. The purpose of my internship this summer was to use the Internet to promote the work of SBIR companies globally to prospective investors.

  5. Quantitative determination of polycyclic aromatic hydrocarbons in barbecued meat sausages by gas chromatography coupled to mass spectrometry.

    PubMed

    Mottier, P; Parisod, V; Turesky, R J

    2000-04-01

    A method is described for the analysis of the 16 polycyclic aromatic hydrocarbons (PAHs) prioritized by the U.S. EPA in meat sausages grilled under common barbecue practices. Quantification was done by GC-MS using perdeuterated internal standards (IS). Validation was done by spiking the matrix at the 0.5 and 1.0 μg/kg levels. Average recoveries relative to expected values ranged from 60 to 134% (median 84%) at the 0.5 μg/kg level and from 69 to 121% (median 96%) at the 1.0 μg/kg level. The medians of the limits of detection and quantification were 0.06 and 0.20 μg/kg, respectively, for a 4-g test portion. The carcinogenic PAHs were below the quantification limit in all products except one lamb sausage. Comparison of estimates when either 1, 5, or 16 perdeuterated PAHs were used as IS showed that the most accurate determination of PAHs required that each compound be quantified against its corresponding perdeuterated analogue.
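
    As a hedged illustration of the internal-standard arithmetic implied by the last sentence, the sketch below quantifies one PAH against its perdeuterated analogue; the relative response factor and spiking level are hypothetical placeholders, not values from the paper.

    ```python
    def pah_ug_per_kg(area_pah, area_is, ng_is_spiked, rrf, sample_mass_g=4.0):
        # Internal-standard quantification: the analyte/IS peak-area ratio,
        # corrected by a relative response factor (RRF) from calibration,
        # scales the known amount of perdeuterated IS spiked into the sample.
        ng_pah = (area_pah / area_is) * ng_is_spiked / rrf
        return ng_pah / sample_mass_g  # ng/g is numerically equal to ug/kg

    # Example with made-up numbers:
    print(pah_ug_per_kg(5.2e4, 4.8e4, 2.0, 1.05))
    ```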

  6. Thermal lift generation and drag reduction in rarefied aerodynamics

    NASA Astrophysics Data System (ADS)

    Pekardan, Cem; Alexeenko, Alina

    2016-11-01

    With the advent of new technologies in low-pressure environments, such as Hyperloop and helicopters designed for Martian applications, understanding the aerodynamic behavior of airfoils in rarefied environments is becoming more crucial. In this paper, verification of a rarefied ES-BGK solver and ideas such as prediction of thermally induced lift and drag reduction in rarefied aerodynamics are investigated. Validation of the rarefied ES-BGK solver, using a Runge-Kutta discontinuous Galerkin method, against experiments in the transonic regime at a Reynolds number of 73 showed that the ES-BGK solver is the most suitable in the near-slip transonic regime. For the quantification of lift generation, a NACA 0012 airfoil with a heated bottom surface is studied for different Knudsen numbers. It was seen that, for lower velocities, the continuum solver underpredicts lift generation at a Knudsen number of 0.00129 because local velocity gradients reach the slip regime, whereas the lift coefficient is higher in the Boltzmann ES-BGK solutions. In the second part, the feasibility of using thermal transpiration for drag reduction is studied. The initial drag-reduction study applies a thermal gradient to the upper surface of a NACA 0012 airfoil near the trailing edge, at a 12-degree angle of attack and 5 Pa pressure. Drag was reduced by 4 percent, and the vortex shedding frequency decreased because the temperature gradient introduces asymmetry into the flow, with thermal transpiration causing reverse flow.
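
    Since the flow regimes discussed above are classified by Knudsen number, a small sketch may help; the hard-sphere molecular diameter is an assumed illustrative value for air, and the chord length is hypothetical.

    ```python
    import math

    def knudsen(temp_k, pressure_pa, chord_m, d_molecule_m=3.7e-10):
        # Kn = mean free path / characteristic length; the mean free path
        # follows from hard-sphere kinetic theory.
        k_b = 1.380649e-23  # Boltzmann constant, J/K
        mfp = k_b * temp_k / (math.sqrt(2) * math.pi * d_molecule_m**2 * pressure_pa)
        return mfp / chord_m

    # At 5 Pa and room temperature, a 0.1 m chord sits in the slip regime:
    print(knudsen(300.0, 5.0, 0.1))  # ~0.014
    ```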

  7. Sludge quantification at water treatment plant and its management scenario.

    PubMed

    Ahmad, Tarique; Ahmad, Kafeel; Alam, Mehtab

    2017-08-15

    A large volume of sludge is generated at water treatment plants during the purification of surface water for potable supplies. Handling and disposal of sludge require careful attention from civic bodies, plant operators, and environmentalists. Quantification of the sludge produced at the treatment plants is important for developing suitable management strategies for its economical and environmentally friendly disposal. The present study deals with the quantification of sludge using an empirical relation between turbidity, suspended solids, and coagulant dosing. Seasonal variation has a significant effect on the raw-water quality received at the water treatment plants, so sludge generation also varies. Yearly production of sludge at a water treatment plant in Ghaziabad, India, is estimated to be 29,700 tons. Sustainable disposal of such a quantity of sludge is a challenging task under stringent environmental legislation. Several beneficial reuses of sludge in civil engineering and construction work have been identified globally, such as raw material in manufacturing cement, bricks, and artificial aggregates, as cementitious material, and as a sand substitute in preparing concrete and mortar. About 54 to 60% sand, 24 to 28% silt, and 16% clay constitute the sludge generated at the water treatment plant under investigation. The characteristics of the sludge make it suitable for potential utilization as locally available construction material, enabling safe disposal. An overview of a sustainable management scenario involving beneficial reuses of the sludge is also presented.
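
    The empirical relation itself is not given in the abstract; the sketch below only illustrates the generic linear form such sludge-production relations usually take, with wholly hypothetical coefficients a and b.

    ```python
    def daily_sludge_kg(flow_megalitres_per_day, turbidity_ntu, coagulant_mg_per_l,
                        a=1.5, b=0.44):
        # Generic linear relation: dry solids per megalitre treated are taken as
        # proportional to raw-water turbidity plus a coagulant contribution.
        # The coefficients a and b are illustrative placeholders only.
        kg_per_megalitre = a * turbidity_ntu + b * coagulant_mg_per_l
        return kg_per_megalitre * flow_megalitres_per_day
    ```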

  8. Assessment of SCAR markers to design real-time PCR primers for rhizosphere quantification of Azospirillum brasilense phytostimulatory inoculants of maize.

    PubMed

    Couillerot, O; Poirier, M-A; Prigent-Combaret, C; Mavingui, P; Caballero-Mellado, J; Moënne-Loccoz, Y

    2010-08-01

    To assess the applicability of sequence-characterized amplified region (SCAR) markers obtained from BOX, ERIC, and RAPD fragments to design primers for real-time PCR quantification of the phytostimulatory maize inoculants Azospirillum brasilense UAP-154 and CFN-535 in the rhizosphere. Primers were designed based on strain-specific SCAR markers and were screened for successful amplification of the target strain and absence of cross-reaction with other Azospirillum strains. The specificity of the primers thus selected was verified under real-time PCR conditions using genomic DNA from a strain collection and DNA from rhizosphere samples. The detection limit was 60 fg DNA with pure cultures and 4 × 10^3 (for UAP-154) or 4 × 10^4 CFU g^-1 (for CFN-535) in the maize rhizosphere. Inoculant quantification was effective from 10^4 to 10^8 CFU g^-1 soil. BOX-based SCAR markers were useful for finding primers for strain-specific real-time PCR quantification of each A. brasilense inoculant in the maize rhizosphere. Effective root colonization is a prerequisite for successful Azospirillum phytostimulation, but cultivation-independent monitoring methods were lacking. The real-time PCR methods developed here will help in understanding the effect of environmental conditions on root colonization and phytostimulation by A. brasilense UAP-154 and CFN-535.
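
    A minimal sketch of the standard-curve back-calculation that underlies this kind of real-time PCR quantification; the slope and intercept are hypothetical calibration values, not those of the published assays.

    ```python
    def cfu_per_g(ct, slope=-3.32, intercept=38.0):
        # A qPCR standard curve is linear in log10 units:
        #   Ct = slope * log10(CFU/g) + intercept
        # A slope of -3.32 corresponds to 100% amplification efficiency.
        return 10 ** ((ct - intercept) / slope)

    print(cfu_per_g(24.7))  # ~1e4 CFU/g with these placeholder values
    ```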

  9. Electrochemical sensors and biosensors for the analysis of antineoplastic drugs.

    PubMed

    Lima, Handerson Rodrigues Silva; da Silva, Josany Saibrosa; de Oliveira Farias, Emanuel Airton; Teixeira, Paulo Ronaldo Sousa; Eiras, Carla; Nunes, Lívio César Cunha

    2018-06-15

    Cancer is a leading cause of death worldwide, often being treated with antineoplastic drugs that have high potential for toxicity to humans and the environment, even at very low concentrations. Therefore, monitoring these drugs is of utmost importance. Among the techniques used to detect substances at low concentrations, electrochemical sensors and biosensors have been noted for their practicality and low cost. This review brings, for the first time, a simplified outline of the main electrochemical sensors and biosensors developed for the analysis of antineoplastic drugs. The drugs analyzed and the methodology used for electrochemical sensing are described, as are the techniques used for drug quantification and the analytical performance of each sensor, highlighting the limit of detection (LOD), as well as the linear range of quantification (LR) for each system. Finally, we present a technological prospection on the development and use of electrochemical sensors and biosensors in the quantification of antineoplastic drugs. A search of international patent databases revealed no patents currently submitted under this topic, suggesting this is an area to be further explored. We also show that the use of these systems has been gaining prominence in recent years, and that the quantification of antineoplastic drugs using electrochemical techniques could bring great financial and health benefits. Copyright © 2018. Published by Elsevier B.V.

  10. Volumetric adsorptive microsampling-liquid chromatography tandem mass spectrometry assay for the simultaneous quantification of four antibiotics in human blood: Method development, validation and comparison with dried blood spot.

    PubMed

    Barco, Sebastiano; Castagnola, Elio; Moscatelli, Andrea; Rudge, James; Tripodi, Gino; Cangemi, Giuliana

    2017-10-25

    In this paper we show the development and validation of a volumetric absorptive microsampling (VAMS™)-LC-MS/MS method for the simultaneous quantification of four antibiotics (piperacillin-tazobactam, meropenem, linezolid, and ceftazidime) in 10 μL human blood. The novel VAMS-LC-MS/MS method has been compared with a dried blood spot (DBS)-based method in terms of the impact of hematocrit (HCT) on accuracy, reproducibility, recovery, and matrix effect. Antibiotics were extracted from VAMS and DBS by protein precipitation with methanol after a re-hydration step at 37 °C for 10 min. LC-MS/MS was carried out on a Thermo Scientific™ TSQ Quantum™ Access MAX triple quadrupole coupled to an Accela™ UHPLC system. The VAMS-LC-MS/MS method is selective, precise, and reproducible. In contrast to DBS, it allows accurate quantification without any HCT influence. It has been applied to samples from pediatric patients under therapy. VAMS is a valid alternative sampling strategy for the quantification of antibiotics and is valuable in support of clinical PK/PD studies and, consequently, therapeutic drug monitoring (TDM) in pediatrics. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Impact of the Developing Mathematical Ideas Professional Development Program on Grade 4 Students' and Teachers' Understanding of Fractions. REL 2017-256

    ERIC Educational Resources Information Center

    Jayanthi, Madhavi; Gersten, Russell; Taylor, Mary Jo; Smolkowski, Keith; Dimino, Joseph

    2017-01-01

    Contemporary state math standards emphasize that students must demonstrate an understanding of the mathematical ideas underlying the computations that have typically been the core of the elementary school math curriculum. The standards have put an increased emphasis on the study of fractions in upper elementary grades, which are the years during…

  12. Pot Addiction and Parental Friction: Emotional Disturbance, Conduct Disorders and Unilateral Placements for Drug-Abusing Students Under the IDEA.

    ERIC Educational Resources Information Center

    Doty, David S.

    This paper is part of a collection of 54 papers from the 48th annual conference of the Education Law Association held in November 2002. It addresses the Individuals with Disabilities Education Act (IDEA). Specifically, the paper examines unilateral placements for drug-abusing and delinquent students. Following the introduction, the next section of…

  13. A Review of Research on the Educational Benefits of the Inclusive Model of Education for Special Education Students

    ERIC Educational Resources Information Center

    Hicks-Monroe, Sherry L.

    2011-01-01

    The practice of inclusion is not a new idea in the educational setting; only the term is newer. Before No Child Left Behind, students with disabilities were mainstreamed into the general education population during the 1970s under Public Law 94-142. Public Law 94-142, later renamed the Individuals with Disabilities Education Act (IDEA), required…

  14. Cavitation synthesis of carbon nanostructures

    NASA Astrophysics Data System (ADS)

    Voropaev, S.

    2011-04-01

    The idea of producing diamonds by hydrodynamic cavitation was originally proposed by academician E. M. Galimov, who suggested that natural diamonds may form during bubble collapse as magma flows rapidly through kimberlite pipes. This hypothesis involves a number of processes that had not been considered before: cavitation under high pressure; the growth, stability, and evolution of gas and vapor bubbles; and the corresponding physical and chemical processes inside them. An experimental setup was created to reproduce the high-pressure, high-temperature reaction centers by means of cavitation, following the above idea. A few crystalline nanocarbon forms were successfully recovered after treatment of benzene (C6H6).

  15. Landscape metrics, scales of resolution

    Treesearch

    Samuel A. Cushman; Kevin McGarigal

    2008-01-01

    Effective implementation of the "multiple path" approach to managing green landscapes depends fundamentally on rigorous quantification of the composition and structure of the landscapes of concern at present, modelling landscape structure trajectories under alternative management paths, and monitoring landscape structure into the future to confirm...

  16. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    PubMed

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest in reliable techniques for the quantification of the genetically modified component(s) of food matrixes has arisen from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy-to-use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are available for only a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single-copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. In fact, experimental evidence shows that the use of reference genes adds variability to the measurement system because a second independent real-time PCR-based measurement must be performed. Moreover, for some reference genes insufficient information on copy number in and among genomes of different lines is available, making adequate quantification difficult. Once developed, the method was validated according to IUPAC and ISO 5725 guidelines. Thirteen laboratories from 8 EU countries participated in the trial. Eleven laboratories provided results complying with the predefined study requirements. Repeatability (RSDr) values ranged from 8.7 to 15.9%, with a mean value of 12%. Reproducibility (RSDR) values ranged from 16.3 to 25.5%, with a mean value of 21%. Following Codex Alimentarius Committee guidelines, both the limit of detection and the limit of quantitation were determined to be <0.1%.
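
    For readers unfamiliar with the RSDr/RSDR figures quoted above, the underlying computation is a relative standard deviation; a minimal sketch with made-up replicate values:

    ```python
    import statistics

    def rsd_percent(replicates):
        # Relative standard deviation, the measure behind the ISO 5725
        # repeatability (RSDr) and reproducibility (RSDR) figures.
        return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

    print(rsd_percent([0.98, 1.05, 0.91, 1.10, 1.02]))  # illustrative GMO % values
    ```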

  17. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on analysis of the density information provided by CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical changes associated with the progression of emphysema is flattening of the diaphragm, due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm can provide information about emphysema from CT. We therefore propose a new, non-density-based measure of diaphragm curvature that allows robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed, using the ratios of lung height and diaphragm width to diaphragm height as curvature estimates, with the emphysema index for comparison. Pearson correlation coefficients showed a strong trend for several of the proposed diaphragm curvature measures to correlate more highly (up to r = 0.57) with DLCO% and VA than the emphysema index did. Furthermore, the emphysema index had only a 0.27 correlation with the proposed measures, indicating that the proposed measures evaluate a different aspect of the disease.
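
    A sketch of the ratio-based curvature estimates described above, assuming the three lengths have already been measured from the segmented CT volume:

    ```python
    def curvature_estimates(lung_height_mm, diaphragm_width_mm, diaphragm_height_mm):
        # Flatter diaphragms (more advanced emphysema) have a smaller dome
        # height, so both ratios grow as the diaphragm flattens.
        return (lung_height_mm / diaphragm_height_mm,
                diaphragm_width_mm / diaphragm_height_mm)
    ```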

  18. Putting more ‘modern’ in modern physics education: a Knowledge Building approach using student questions and ideas about the universe

    NASA Astrophysics Data System (ADS)

    Wagner, Glenn

    2017-03-01

    Student-generated questions and ideas about our universe are the start of a rich and highly motivating learning environment. Using their curiosity-driven questions and ideas, students form Knowledge Building groups or ‘communities’ where they plan, set goals, design questions for research, and assess the progress of their work, tasks that were once under the control of the teacher. With the understanding that all knowledge and ideas are treated as improvable, students work collaboratively at their level of competency to share the knowledge, ideas and understandings gained from authoritative sources and laboratory activities. Over time, students work collectively to improve the knowledge and ideas of others, resulting in advances in understanding that benefit not only the individual but the community as a whole. Learning outcomes reported in this paper demonstrate that a Knowledge Building environment applied to introductory cosmology produced gains in knowledge and understanding of foundational concepts similar to those achieved in teacher-centred learning environments. Aside from new knowledge and understanding, students develop important skills and competencies such as question-asking, idea development, communication, and collaboration that are becoming ever more important for 21st century living and working. Finally, the process of planning and initiating a Knowledge Building environment that produced the results reported in this paper is outlined.

  19. Unleashing creativity: The role of left temporoparietal regions in evaluating and inhibiting the generation of creative ideas.

    PubMed

    Mayseless, Naama; Aharon-Peretz, Judith; Shamay-Tsoory, Simone

    2014-11-01

    Human creativity is thought to entail two processes. One is idea generation, whereby ideas emerge in an associative manner, and the other is idea evaluation, whereby generated ideas are evaluated and screened. Thus far, neuroimaging studies have identified several brain regions as being involved in creativity, yet only a handful of studies have examined the neural basis underlying these two processes. We found that an individual with left temporoparietal hemorrhage who had no previous experience as an artist developed remarkable artistic creativity, which diminished as the hemorrhage receded. We thus hypothesized that damage to the evaluation network of creativity during the initial hematoma had a releasing effect on creativity by "freeing" the idea generation system. In line with this hypothesis, we conducted a subsequent fMRI study showing that decreased left temporal and parietal activations among healthy individuals as they evaluated creative ideas selectively predicted higher creativity. The current studies provide converging multi-method evidence suggesting that the left temporoparietal area is part of a neural network involved in evaluating creativity, and that as such it may act as an inhibitor of creativity. We propose an explanatory model of creativity centered upon the key role of the left temporoparietal regions in evaluating and inhibiting creativity. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Operational CryoSat Product Quality Assessment

    NASA Astrophysics Data System (ADS)

    Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine

    2013-12-01

    The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.

  1. The Chimera of Proportionality: Institutionalising Limits on Punishment in Contemporary Social and Political Systems.

    PubMed

    Lacey, Nicola; Pickard, Hanna

    2015-03-01

    The concept of proportionality has been central to the retributive revival in penal theory, and underlies desert theory's normative and practical commitment to limiting punishment. Theories of punishment combining desert-based and consequentialist considerations also appeal to proportionality as a limiting condition. In this paper we argue that these claims are founded on an exaggerated idea of what proportionality can offer, and in particular fail properly to consider the institutional conditions needed to foster robust limits on the state's power to punish. The idea that appeals to proportionality as an abstract ideal can help to limit punishment is, we argue, a chimera: what has been thought of as proportionality is not a naturally existing relationship, but a product of political and social construction, cultural meaning-making, and institution-building. Drawing on evolutionary psychology and comparative political economy, we argue that philosophers and social scientists need to work together to understand how the appeal of the idea of proportionality can best be realised through substantive institutional frameworks under particular conditions.

  2. 48 CFR 15.602 - Policy.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., Small Business Innovation Research topics, Small Business Technology Transfer Research topics, Program Research and Development Announcements, or any other Government-initiated solicitation or program. When the new and innovative ideas do not fall under topic areas publicized under those programs or techniques...

  3. 48 CFR 15.602 - Policy.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., Small Business Innovation Research topics, Small Business Technology Transfer Research topics, Program Research and Development Announcements, or any other Government-initiated solicitation or program. When the new and innovative ideas do not fall under topic areas publicized under those programs or techniques...

  4. 48 CFR 15.602 - Policy.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., Small Business Innovation Research topics, Small Business Technology Transfer Research topics, Program Research and Development Announcements, or any other Government-initiated solicitation or program. When the new and innovative ideas do not fall under topic areas publicized under those programs or techniques...

  5. Standards and Methodologies for Characterizing Radiobiological Impact of High-Z Nanoparticles

    PubMed Central

    Subiel, Anna; Ashmore, Reece; Schettino, Giuseppe

    2016-01-01

    Research on the application of high-Z nanoparticles (NPs) in cancer treatment and diagnosis has recently been the subject of growing interest, with much promise being shown with regards to a potential transition into clinical practice. In spite of numerous publications related to the development and application of nanoparticles for use with ionizing radiation, the literature is lacking coherent and systematic experimental approaches to fully evaluate the radiobiological effectiveness of NPs, validate mechanistic models and allow direct comparison of the studies undertaken by various research groups. The lack of standards and established methodology is commonly recognised as a major obstacle for the transition of innovative research ideas into clinical practice. This review provides a comprehensive overview of radiobiological techniques and quantification methods used in in vitro studies on high-Z nanoparticles and aims to provide recommendations for future standardization for NP-mediated radiation research. PMID:27446499

  6. Biogeochemistry and ecology of terrestrial ecosystems of Amazonia

    NASA Astrophysics Data System (ADS)

    Malhi, Yadvinder; Davidson, Eric A.

    The last decade of research associated with the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) has led to substantial advances in our understanding of the biogeochemistry and ecology of Amazonian forests and savannas, in particular in relation to the carbon cycle of Amazonia. In this chapter, we present a synthesis of results and ideas that are presented in more detail in subsequent chapters, drawing together evidence from studies of forest ecology, ecophysiology, trace gas fluxes and atmospheric flux towers, large-scale rainfall manipulation experiments and soil surveys, satellite remote sensing, and quantification of carbon and nutrient stocks and flows. The studies have demonstrated the variability of the functioning and biogeochemistry of Amazonian forests at a range of spatial and temporal scales, and they provide clues as to how Amazonia will respond to ongoing direct pressure and global atmospheric change. We conclude by highlighting key questions for the next decade of research to address.

  7. Brief history of intermolecular and intersurface forces in complex fluid systems.

    PubMed

    Israelachvili, Jacob; Ruths, Marina

    2013-08-06

    We review the developments of ideas, concepts, and theories of intermolecular and intersurface forces and how these were influenced (or ignored) by observations of nature and, later, systematic experimentation. The emphasis of this review is on the way things gradually changed: experimentation replaced rhetoric, measurement and quantification replaced hand waving, energy replaced force in calculations, discrete atoms replaced the (continuum) aether, thermodynamics replaced mechanistic models, randomness and probability replaced certainty, and delicate experiments on the subnanoscale revealed fascinating self-assembling structures and complex behavior of even the simplest systems. We conclude by discussing today's unresolved challenges: how complex "dynamic" multicomponent--especially living biological--systems that receive a continuous supply of energy can be far from equilibrium and not even in any steady state. Such systems, never static but evolving in both space and time, are still far from being understood both experimentally and theoretically.

  8. Model Update of a Micro Air Vehicle (MAV) Flexible Wing Frame with Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Reaves, Mercedes C.; Horta, Lucas G.; Waszak, Martin R.; Morgan, Benjamin G.

    2004-01-01

    This paper describes a procedure to update parameters in the finite element model of a Micro Air Vehicle (MAV) to improve displacement predictions under aerodynamic loads. Because of fabrication, material, and geometric uncertainties, a statistical approach combined with Multidisciplinary Design Optimization (MDO) is used to modify key model parameters. Static test data collected using photogrammetry are used to correlate with model predictions. Results show significant improvements in model predictions after the parameters are updated; however, the computed probability values indicate low confidence in the updated values and/or errors in the model structure. Lessons learned in the areas of wing design, test procedures, modeling approaches with geometric nonlinearities, and uncertainty quantification are all documented.

  9. Simplified 2,4-dinitrophenylhydrazine spectrophotometric assay for quantification of carbonyls in oxidized proteins.

    PubMed

    Mesquita, Cristina S; Oliveira, Raquel; Bento, Fátima; Geraldo, Dulce; Rodrigues, João V; Marcos, João C

    2014-08-01

    This work proposes a modification of the 2,4-dinitrophenylhydrazine (DNPH) spectrophotometric assay commonly used to evaluate the concentration of carbonyl groups in oxidized proteins. In this approach NaOH is added to the protein solution after the addition of DNPH, shifting the maximum absorbance wavelength of the derivatized protein from 370 to 450 nm. This reduces the interference of DNPH and allows direct quantification in the sample solution without the need for the precipitation, washing, and resuspension steps that are carried out in the traditional DNPH method. The two methods were compared under various conditions and are statistically equivalent. Copyright © 2014 Elsevier Inc. All rights reserved.
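
    A sketch of the Beer-Lambert arithmetic such an assay relies on; the molar absorptivity below is an assumed placeholder for the hydrazone chromophore, not necessarily the value validated in the paper.

    ```python
    def carbonyl_nmol_per_mg(a450, epsilon=22000.0, path_cm=1.0, protein_mg_per_ml=1.0):
        # Beer-Lambert: concentration (mol/L) = A / (epsilon * path);
        # epsilon (L mol^-1 cm^-1) is an assumed illustrative value.
        molar = a450 / (epsilon * path_cm)
        # mol/L -> nmol/mL is a factor of 1e6; divide by protein to get nmol/mg.
        return molar * 1e6 / protein_mg_per_ml
    ```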

  10. Quantification of Toxic Effects for Water Concentration-based Aquatic Life Criteria -Part B

    EPA Science Inventory

    Erickson et al. (1991) conducted a series of experiments on the toxicity of pentachloroethane (PCE) to juvenile fathead minnows. These experiments included evaluations of bioaccumulation kinetics, the time-course of mortality under both constant and time-variable exposures, the r...

  11. Challenges in soil erosion research and prediction model development

    USDA-ARS?s Scientific Manuscript database

    Quantification of soil erosion has been traditionally considered as a surface hydrologic process with equations for soil detachment and sediment transport derived from the mechanics and hydraulics of the rainfall and surface flow. Under the current erosion modeling framework, the soil has a constant...

  12. Radiochemical ageing of EPDM elastomers. 2. Identification and quantification of chemical changes in EPDM and EPR films γ-irradiated under oxygen atmosphere

    NASA Astrophysics Data System (ADS)

    Rivaton, A.; Cambon, S.; Gardette, J.-L.

    2005-01-01

    This paper is devoted to the identification and quantification of the main chemical changes resulting from the radiochemical ageing, under an oxygen atmosphere, of ethylene-propylene-diene monomer (EPDM) and ethylene-propylene rubber (EPR) films containing the same molar ratio of ethylene/propylene. IR and UV-Vis analysis showed that radiooxidation produces a complex mixture of different products and consumes the diene double bond. The radiochemical yields of formation of ketones, carboxylic acids, hydroperoxides, and alcohols were determined by combining IR analysis with derivatisation reactions and chemical titration. The contributions of secondary and tertiary structures to these two latter types of -OH groups were separated. Esters and γ-lactones were formed in low concentrations. The distribution of oxidation products in the irradiated films was determined by micro-FTIR spectroscopy. Crosslinking was evaluated by gel-fraction methods. In addition, the gas-phase composition was analysed by mass spectrometry.

  13. In vivo Magnetic Resonance Spectroscopy of cerebral glycogen metabolism in animals and humans.

    PubMed

    Khowaja, Ameer; Choi, In-Young; Seaquist, Elizabeth R; Öz, Gülin

    2015-02-01

    Glycogen serves as an important energy reservoir in the human body. Despite the abundance of glycogen in the liver and skeletal muscles, its concentration in the brain is relatively low, hence its significance has been questioned. A major challenge in studying brain glycogen metabolism has been the lack of availability of non-invasive techniques for quantification of brain glycogen in vivo. Invasive methods for brain glycogen quantification such as post mortem extraction following high energy microwave irradiation are not applicable in the human brain. With the advent of 13C Magnetic Resonance Spectroscopy (MRS), it has been possible to measure brain glycogen concentrations and turnover in physiological conditions, as well as under the influence of stressors such as hypoglycemia and visual stimulation. This review presents an overview of the principles of the 13C MRS methodology and its applications in both animals and humans to further our understanding of glycogen metabolism under normal physiological and pathophysiological conditions such as hypoglycemia unawareness.

  14. Jellyfish collagen scaffolds for cartilage tissue engineering.

    PubMed

    Hoyer, Birgit; Bernhardt, Anne; Lode, Anja; Heinemann, Sascha; Sewing, Judith; Klinger, Matthias; Notbohm, Holger; Gelinsky, Michael

    2014-02-01

    Porous scaffolds were engineered from refibrillized collagen of the jellyfish Rhopilema esculentum for potential application in cartilage regeneration. The influence of collagen concentration, salinity and temperature on fibril formation was evaluated by turbidity measurements and quantification of fibrillized collagen. The formation of collagen fibrils with a typical banding pattern was confirmed by atomic force microscopy and transmission electron microscopy analysis. Porous scaffolds from jellyfish collagen, refibrillized under optimized conditions, were fabricated by freeze-drying and subsequent chemical cross-linking. Scaffolds possessed an open porosity of 98.2%. The samples were stable under cyclic compression and displayed an elastic behavior. Cytotoxicity tests with human mesenchymal stem cells (hMSCs) did not reveal any cytotoxic effects of the material. Chondrogenic markers SOX9, collagen II and aggrecan were upregulated in direct cultures of hMSCs upon chondrogenic stimulation. The formation of typical extracellular matrix components was further confirmed by quantification of sulfated glycosaminoglycans. Copyright © 2013 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  15. In vivo Magnetic Resonance Spectroscopy of cerebral glycogen metabolism in animals and humans

    PubMed Central

    Khowaja, Ameer; Choi, In-Young; Seaquist, Elizabeth R.; Öz, Gülin

    2015-01-01

    Glycogen serves as an important energy reservoir in the human body. Despite the abundance of glycogen in the liver and skeletal muscles, its concentration in the brain is relatively low, hence its significance has been questioned. A major challenge in studying brain glycogen metabolism has been the lack of availability of non-invasive techniques for quantification of brain glycogen in vivo. Invasive methods for brain glycogen quantification such as post mortem extraction following high energy microwave irradiation are not applicable in the human brain. With the advent of 13C Magnetic Resonance Spectroscopy (MRS), it has been possible to measure brain glycogen concentrations and turnover in physiological conditions, as well as under the influence of stressors such as hypoglycemia and visual stimulation. This review presents an overview of the principles of the 13C MRS methodology and its applications in both animals and humans to further our understanding of glycogen metabolism under normal physiological and pathophysiological conditions such as hypoglycemia unawareness. PMID:24676563

  16. Photon path distribution and optical responses of turbid media: theoretical analysis based on the microscopic Beer-Lambert law.

    PubMed

    Tsuchiya, Y

    2001-08-01

    A concise theoretical treatment has been developed to describe the optical responses of a highly scattering inhomogeneous medium using functions of the photon path distribution (PPD). The treatment is based on the microscopic Beer-Lambert law and has been found to yield a complete set of optical responses for time- and frequency-domain measurements. The PPD is defined for possible photons having a total zigzag pathlength of l between the points of light input and detection. Such a distribution is independent of the absorption properties of the medium and can be uniquely determined for the medium under quantification. Therefore, the PPD can be calculated with an imaginary reference medium having the same optical properties as the medium under quantification except for the absence of absorption. One of the advantages of this method is that the optical responses (the total attenuation, the mean pathlength, etc.) are expressed as functions of the PPD and the absorption distribution.
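
    The structure of the treatment can be summarised schematically in LaTeX; this rendering uses s(l) for the PPD and \mu_a for the absorption coefficient and is a paraphrase of the stated relations, not the paper's exact notation.

    ```latex
    % Detected intensity as an absorption-weighted sum over photon pathlengths
    % (microscopic Beer-Lambert law), with s(l) the photon path distribution:
    I = I_0 \int_0^{\infty} s(l)\, e^{-\mu_a l}\, dl
    % The total attenuation and the mean pathlength then follow as
    A = -\ln \int_0^{\infty} s(l)\, e^{-\mu_a l}\, dl ,
    \qquad \langle L \rangle = \frac{\partial A}{\partial \mu_a}
    ```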

  17. Implications of prospective payment under DRGs for hospital marketing.

    PubMed

    Folland, S; Ziegenfuss, J T; Chao, P

    1988-12-01

    The authors analyze the hospital marketing implications of Medicare's prospective payment system under DRGs. They take an appropriately broad view of marketing and consider related impacts on culture, structure, and management, in addition to the traditional marketing mix. The article serves to present in one place the ideas and discussions on the subject from a wide and disparate literature. The aim throughout is to identify those marketing responses warranting recommendation. The recommendations are assembled in a concluding section. Though many of the ideas are not new, they have yet to be widely adopted by hospitals. Hence they represent untapped marketing opportunities attributable to the advent of the DRG system. The authors conclude with suggestions for future research.

  18. Efficient load rating and quantification of life-cycle damage of Indiana bridges due to overweight loads.

    DOT National Transportation Integrated Search

    2016-02-01

    In this study, a computational approach for conducting durability analysis of bridges using detailed finite element models is developed. The underlying approach adopted is based on the hypothesis that the two main factors affecting the life of a brid...

  19. Quantification of error associated with stormwater and wastewater flow measurement devices

    EPA Science Inventory

    A novel flow testbed has been designed to evaluate the performance of flumes as flow measurement devices. The newly constructed testbed produces both steady and unsteady flows ranging from 10 to 1500 gpm. Two types of flumes (Parshall and trapezoidal) are evaluated under differen...

  20. Systems design and comparative analysis of large antenna concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.; Ferebee, M. J., Jr.

    1983-01-01

    Conceptual designs are evaluated and comparative analyses conducted for several large antenna spacecraft for Land Mobile Satellite System (LMSS) communications missions. Structural configurations include truss, hoop-and-column, and radial-rib designs. The study was conducted using the Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system. The current capabilities, development status, and near-term plans for the IDEAS system are reviewed, and its overall capabilities are highlighted. IDEAS is an integrated system of computer-aided design and analysis software used to rapidly evaluate system concepts and technology needs for future advanced spacecraft such as large antennas, platforms, and space stations. The system was developed at Langley to meet a need for rapid, cost-effective, labor-saving approaches to the design and analysis of the numerous missions and total spacecraft system options under consideration. IDEAS consists of about 40 technical modules, an efficient executive, database and file-management software, and interactive graphics display capabilities.

  1. Fully automated gynecomastia quantification from low-dose chest CT

    NASA Astrophysics Data System (ADS)

    Liu, Shuang; Sonnenblick, Emily B.; Azour, Lea; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2018-02-01

    Gynecomastia is characterized by the enlargement of male breasts, a common and sometimes distressing condition found in over half of adult men over the age of 44. Although the majority of gynecomastia is physiologic or idiopathic, its occurrence may also be associated with an extensive variety of underlying systemic diseases or drug toxicity. With the recent large-scale implementation of annual lung cancer screening using low-dose chest CT (LDCT), gynecomastia is believed to be a frequent incidental finding on LDCT. A fully automated system for gynecomastia quantification from LDCT is presented in this paper. The whole breast region is first segmented using an anatomy-orientated approach based on the propagation of pectoral muscle fronts in the vertical direction. The subareolar region is then localized, and the fibroglandular tissue within it is measured for the assessment of gynecomastia. The presented system was validated using 454 breast regions from non-contrast LDCT scans of 227 adult men. The ground truth was established by an experienced radiologist, who classified each breast into one of five categorical scores. The automated measurements achieved promising performance for gynecomastia diagnosis, with an area under the ROC curve (AUC) of 0.86, and showed a statistically significant Spearman correlation (r = 0.70, p < 0.001) with the reference categorical grades. These encouraging results demonstrate the feasibility of fully automated gynecomastia quantification from LDCT, which may aid the early detection as well as the treatment of both gynecomastia and any underlying medical problems that cause it.

  2. Quantitative characterization of fatty liver disease using x-ray scattering

    NASA Astrophysics Data System (ADS)

    Elsharkawy, Wafaa B.; Elshemey, Wael M.

    2013-11-01

    Nonalcoholic fatty liver disease (NAFLD) is a dynamic condition in which fat abnormally accumulates within the hepatocytes. It is believed to be a marker of risk of later chronic liver diseases, such as liver cirrhosis and carcinoma. The fat content in liver biopsies determines their validity for liver transplantation: transplantation of livers with severe NAFLD is associated with a high risk of primary non-function. Moreover, NAFLD is recognized as a clinically important feature that influences patient morbidity and mortality after hepatic resection. Unfortunately, a precise, reliable, and reproducible method for the quantification of NAFLD has been lacking. This work suggests a method for the quantification of NAFLD. The method is based on the fact that fatty liver tissue has a characteristic x-ray scattering profile, with a relatively intense fat peak at a momentum transfer value of 1.1 nm^-1 compared to a soft-tissue peak at 1.6 nm^-1. The fat content in normal and fatty liver is plotted against three profile characterization parameters (ratio of peak intensities, ratio of areas under the peaks, and ratio of the area under the fat peak to the total profile area) for measured and Monte Carlo-simulated x-ray scattering profiles. Results show a highly linear dependence (R^2 > 0.9) of the characterization parameters on the liver fat content, with a high correlation coefficient (>0.9) between measured and simulated data. These results indicate that the current method can offer reliable quantification of fatty liver disease.
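
    A sketch of the three profile-characterisation parameters described above, assuming a measured scattering profile sampled on a momentum-transfer grid; the +/-0.2 nm^-1 integration windows are illustrative choices, not the paper's.

    ```python
    import numpy as np

    def characterization_params(q, i):
        # q: momentum transfer (nm^-1); i: scattered intensity.
        peak_ratio = i[np.argmin(np.abs(q - 1.1))] / i[np.argmin(np.abs(q - 1.6))]
        w = 0.2  # illustrative half-width of the integration windows
        fat_win = (q > 1.1 - w) & (q < 1.1 + w)
        soft_win = (q > 1.6 - w) & (q < 1.6 + w)
        fat_area = np.trapz(i[fat_win], q[fat_win])
        area_ratio = fat_area / np.trapz(i[soft_win], q[soft_win])
        fat_fraction = fat_area / np.trapz(i, q)
        return peak_ratio, area_ratio, fat_fraction
    ```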

  3. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    PubMed

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance to the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events by exploiting fast transcript quantification. SUPPA's accuracy is comparable, and sometimes superior, to that of standard methods on simulated as well as real RNA-sequencing data, benchmarked against experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as those obtained with quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
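
    The relative inclusion value (PSI) that such transcript-based approaches compute reduces to a simple abundance ratio; a minimal sketch with hypothetical transcript IDs and TPM values:

    ```python
    def psi(tpm, inclusion_transcripts, all_transcripts):
        # PSI = abundance of transcripts that include the event, divided by
        # the abundance of all transcripts of the gene compatible with it.
        num = sum(tpm[t] for t in inclusion_transcripts)
        den = sum(tpm[t] for t in all_transcripts)
        return num / den if den > 0 else float("nan")

    tpm = {"tx1": 12.0, "tx2": 4.0, "tx3": 0.5}      # hypothetical TPM estimates
    print(psi(tpm, ["tx1"], ["tx1", "tx2", "tx3"]))  # ~0.73
    ```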

  4. Aster leafhopper survival and reproduction, and Aster yellows transmission under static and fluctuating temperatures, using ddPCR for phytoplasma quantification.

    PubMed

    Bahar, Md H; Wist, Tyler J; Bekkaoui, Diana R; Hegedus, Dwayne D; Olivier, Chrystel Y

    2018-01-10

    Aster yellows (AY) is an important disease of Brassica crops; it is caused by Candidatus Phytoplasma asteris and transmitted by the insect vector, the Aster leafhopper (Macrosteles quadrilineatus). Phytoplasma-infected Aster leafhoppers were incubated at various constant and fluctuating temperatures ranging from 0 to 35 °C with the reproductive host plant barley (Hordeum vulgare). At 0 °C, leafhopper adults survived for 18 days but failed to reproduce, whereas at 35 °C insects died within 18 days but successfully reproduced before dying. Temperature fluctuation increased thermal tolerance in leafhoppers at 25 °C and increased fecundity at 5 and 20 °C. Leafhopper adults successfully infected and produced AY symptoms in canola plants after incubating for 18 days at 0-20 °C on barley, indicating that AY-phytoplasma maintains its virulence in this temperature range. The presence and number of AY-phytoplasma in insects and plants were confirmed by droplet digital PCR (ddPCR) quantification. The number of phytoplasma in leafhoppers increased over time but did not differ among temperatures. The temperatures associated with a typical crop growing season on the Canadian Prairies will therefore not limit the spread of AY disease by its predominant insect vector. ddPCR quantification is also a useful tool for early detection and accurate quantification of phytoplasma in plants and insects.
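
    The Poisson correction at the heart of ddPCR quantification is compact enough to sketch; the droplet volume below is a nominal value for common droplet generators and is an assumption here.

    ```python
    import math

    def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_ul=0.00085):
        # Droplets are partitions; the fraction with no template follows a
        # Poisson distribution, so mean copies per droplet = -ln(1 - p).
        p = n_positive / n_total
        return -math.log(1.0 - p) / droplet_volume_ul

    print(ddpcr_copies_per_ul(3200, 15000))  # copies per microlitre of reaction
    ```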

  5. Quantification of birefringence readily measures the level of muscle damage in zebrafish

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berger, Joachim, E-mail: Joachim.Berger@Monash.edu; Sztal, Tamar; Currie, Peter D.

    2012-07-13

    Highlights:
    • Report of an unbiased quantification of the birefringence of muscle in fish larvae.
    • The quantification method readily identifies the level of overall muscle damage.
    • Zebrafish muscle mutants are compared for level of phenotype severity.
    • Proposed as a tool to survey treatments that aim to ameliorate muscular dystrophy.
    Abstract: Muscular dystrophies are a group of genetic disorders that progressively weaken and degenerate muscle. Many zebrafish models for human muscular dystrophies have been generated and analysed, including the dystrophin-deficient zebrafish mutant dmd, which models Duchenne Muscular Dystrophy. Under polarised light the zebrafish muscle can be detected as a bright area in an otherwise dark background. This light effect, called birefringence, results from the diffraction of polarised light through the pseudo-crystalline array of the muscle sarcomeres. Muscle damage, as seen in zebrafish models for muscular dystrophies, can readily be detected by a reduction in the birefringence. Therefore, birefringence is a very sensitive indicator of overall muscle integrity within larval zebrafish. Unbiased documentation of the birefringence followed by densitometric measurement enables the quantification of the birefringence of zebrafish larvae. Thereby, the overall level of muscle integrity can be detected, allowing the identification and categorisation of zebrafish muscle mutants. In addition, we propose that the established protocol can be used to analyse treatments aimed at ameliorating dystrophic zebrafish models.
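
    The densitometric step reduces to averaging pixel intensities over the musculature; a minimal sketch, assuming a grayscale image and a precomputed region-of-interest mask:

    ```python
    import numpy as np

    def birefringence_score(gray_image, roi_mask):
        # Mean gray value of the trunk musculature imaged between crossed
        # polarisers; damaged muscle diffracts less light, lowering the score.
        return float(gray_image[roi_mask].mean())

    # Mutants can then be ranked by score relative to wild-type siblings.
    ```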

  6. Quantification In Situ of Crystalline Cholesterol and Calcium Phosphate Hydroxyapatite in Human Atherosclerotic Plaques by Solid-State Magic Angle Spinning NMR

    PubMed Central

    Guo, Wen; Morrisett, Joel D.; DeBakey, Michael E.; Lawrie, Gerald M.; Hamilton, James A.

    2010-01-01

    Because of renewed interest in the progression, stabilization, and regression of atherosclerotic plaques, it has become important to develop methods for characterizing structural features of plaques in situ and noninvasively. We present a nondestructive method for ex vivo quantification of 2 solid-phase components of plaques: crystalline cholesterol and calcium phosphate salts. Magic angle spinning (MAS) nuclear magnetic resonance (NMR) spectra of human carotid endarterectomy plaques revealed 13C resonances of crystalline cholesterol monohydrate and a 31P resonance of calcium phosphate hydroxyapatite (CPH). The spectra were obtained under conditions in which there was little or no interference from other chemical components and were suitable for quantification in situ of the crystalline cholesterol and CPH. Carotid atherosclerotic plaques showed a wide variation in their crystalline cholesterol content. The calculated molar ratio of liquid-crystalline cholesterol to phospholipid ranged from 1.1 to 1.7, demonstrating different capabilities of the phospholipids to reduce crystallization of cholesterol. The spectral properties of the phosphate groups in CPH in carotid plaques were identical to those of CPH in bone. 31P MAS NMR is a simple, rapid method for quantification of calcium phosphate salts in tissue without extraction and time-consuming chemical analysis. Crystalline phases in intact atherosclerotic plaques (ex vivo) can be quantified accurately by solid-state 13C and 31P MAS NMR spectroscopy. PMID:10845882

  7. Methods to Detect Nitric Oxide and its Metabolites in Biological Samples

    PubMed Central

    Bryan, Nathan S.; Grisham, Matthew B.

    2007-01-01

    Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes, including regulation of blood pressure, immune response and neural communication. Therefore, its accurate detection and quantification are critical to understanding health and disease. Due to the extremely short physiological half-life of this gaseous free radical, alternative strategies have been developed for the detection of the reaction products of NO biochemistry. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The methods described in this review are not an exhaustive or comprehensive discussion of all methods available for the detection of NO, but rather a description of the most commonly used and practical methods which allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129

  8. Digital Protocol for Chemical Analysis at Ultralow Concentrations by Surface-Enhanced Raman Scattering.

    PubMed

    de Albuquerque, Carlos Diego L; Sobral-Filho, Regivaldo G; Poppi, Ronei J; Brolo, Alexandre G

    2018-01-16

    Single-molecule surface-enhanced Raman spectroscopy (SM-SERS) has the potential to revolutionize quantitative analysis at ultralow concentrations (less than 1 nM). However, there are no established protocols to generalize the application of this technique in analytical chemistry. Here, a protocol for quantification at ultralow concentrations using SM-SERS is proposed. The approach takes advantage of the stochastic nature of the single-molecule regime to achieve lower limits of quantification (LOQ). Two emerging contaminants commonly found in aquatic environments, enrofloxacin (ENRO) and ciprofloxacin (CIPRO), were chosen as nonresonant molecular probes. The methodology involves a multivariate curve-resolution method, non-negative matrix factorization with an alternating least-squares algorithm (NMF-ALS), to resolve spectral overlaps. The key element of the quantification is the realization that, under SM-SERS conditions, the Raman intensity generated by a molecule adsorbed on a "hotspot" can be digitalized. Therefore, the number of SERS event counts (rather than SERS intensities) was shown to be proportional to the solution concentration. This allowed the determination of both ENRO and CIPRO with high accuracy and precision even in the ultralow-concentration regime. The LOQ for both ENRO and CIPRO was 2.8 pM. The digital SERS protocol suggested here is a roadmap for the implementation of SM-SERS as a routine tool for quantification at ultralow concentrations.
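
    The digital read-out can be sketched in a few lines: count acquisitions whose peak intensity exceeds a blank-derived threshold, and regress the counts on concentration. This is a minimal sketch of the counting idea only; the threshold rule, array shapes, and calibration numbers are assumptions (the published protocol additionally uses NMF-ALS to resolve the ENRO/CIPRO spectral overlap).

```python
import numpy as np

def count_sers_events(spectra, threshold):
    """Digital SERS: one acquisition per row; an event is a spectrum whose
    maximum intensity exceeds a threshold estimated from blank acquisitions."""
    return int((spectra.max(axis=1) > threshold).sum())

# Hypothetical calibration: in the single-molecule regime the event count,
# not the mean intensity, grows linearly with concentration.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])   # pM standards (made up)
events = np.array([11, 23, 45, 118, 240])     # event counts (made up)
slope, intercept = np.polyfit(conc, events, 1)
```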

  9. Final priority; technical assistance to improve state data capacity--National Technical Assistance Center to improve state capacity to accurately collect and report IDEA data. Final priority.

    PubMed

    2013-05-20

    The Assistant Secretary for Special Education and Rehabilitative Services announces a priority under the Technical Assistance to Improve State Data Capacity program. The Assistant Secretary may use this priority for competitions in fiscal year (FY) 2013 and later years. We take this action to focus attention on an identified national need to provide technical assistance (TA) to States to improve their capacity to meet the data collection and reporting requirements of the Individuals with Disabilities Education Act (IDEA). We intend this priority to establish a TA center to improve State capacity to accurately collect and report IDEA data (Data Center).

  10. Strategies for Reduced-Order Models in Uncertainty Quantification of Complex Turbulent Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Qi, Di

    Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable, due both to an imperfect understanding of the underlying physical processes and to the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced-order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, built on new global statistical energy conservation principles combined with statistical equilibrium fidelity, is designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. Effects in complicated flow systems, including strong nonlinear and non-Gaussian interactions, are considered, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. Most importantly, empirical information theory and statistical linear response theory are applied in the training phase for calibrating model errors to achieve optimal imperfect model parameters; and total statistical energy dynamics are introduced to improve the model sensitivity in the prediction phase, especially when strong external perturbations are exerted. The validity of reduced-order models for predicting statistical responses and intermittency is demonstrated on a series of instructive models with increasing complexity, including the stochastic triad model, the Lorenz '96 model, and models for barotropic and baroclinic turbulence. The skillful low-order modeling methods developed here should also be useful for other applications such as efficient algorithms for data assimilation.
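
    Since the abstract names the Lorenz '96 model as one of its test beds, a minimal sketch of the kind of "truth" statistics a reduced-order model must reproduce is given below: integrate the model and estimate its equilibrium mean and variance. The grid size, forcing, step size, and run length are illustrative assumptions, not the dissertation's settings.

```python
import numpy as np

def l96_rhs(x, forcing=8.0):
    # Lorenz '96: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.005):
    k1 = l96_rhs(x)
    k2 = l96_rhs(x + 0.5 * dt * k1)
    k3 = l96_rhs(x + 0.5 * dt * k2)
    k4 = l96_rhs(x + dt * k3)
    return x + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0

rng = np.random.default_rng(1)
x = 8.0 + 0.1 * rng.standard_normal(40)   # 40 grid points, perturbed equilibrium
samples = []
for step in range(40000):
    x = rk4_step(x)
    if step > 10000:                      # discard the transient
        samples.append(x.copy())
samples = np.asarray(samples)
# Equilibrium ("climate") statistics that a statistically accurate ROM must match.
print(samples.mean(), samples.var())
```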

  11. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration. Projecting analysis dates for the IDEA collaboration.

    PubMed

    Renfro, Lindsay A; Grothey, Axel M; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J

    2014-12-01

    Clinical trials are expensive and lengthy, and the success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information, and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in the projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Trial-specific predicted event totals were close to the actual number of events per trial at the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events off by only eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Real-time projection of the final analysis time during a trial, or of the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required.
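
    The projection logic can be illustrated with a toy Monte Carlo: simulate accrual at a constant rate, attach event times from a parametric survival model, and read off when the target event count is reached. An exponential disease-free-survival model is used here purely as a stand-in for the ACCENT-based parametric model; all rates and counts are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def projected_analysis_month(accrual_per_month, n_total, median_dfs_months,
                             events_needed, n_sims=2000):
    """Mean calendar month at which the target event count is reached (toy model)."""
    hazard = np.log(2.0) / median_dfs_months   # exponential stand-in for DFS
    times = np.empty(n_sims)
    for i in range(n_sims):
        entry = np.sort(rng.uniform(0.0, n_total / accrual_per_month, n_total))
        event = entry + rng.exponential(1.0 / hazard, n_total)
        times[i] = np.sort(event)[events_needed - 1]
    return times.mean()

# Hypothetical trial: 60 patients/month, 2,000 patients, median DFS 60 months,
# final analysis at 700 events.
print(projected_analysis_month(60, 2000, 60, 700))
```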

  12. [Development and validation of an HPLC method for the quantification of vitamin A in human milk. Its application to a rural population in Argentina].

    PubMed

    López, Laura B; Baroni, Andrea V; Rodríguez, Viviana G; Greco, Carola B; de Costa, Sara Macías; de Ferrer, Patricia Ronayne; Rodríguez de Pece, Silvia

    2005-06-01

    A methodology for the quantification of vitamin A in human milk was developed and validated. Vitamin A levels were assessed in 223 samples corresponding to the 5th, 6th and 7th postpartum months, obtained in the province of Santiago del Estero, Argentina. The samples (500 microL) were saponified with potassium hydroxide/ethanol, extracted with hexane, evaporated to dryness and reconstituted with methanol. An RP-C18 column, a methanol/water mobile phase (91:9, v/v) and a fluorescence detector (λ excitation 330 nm and λ emission 470 nm) were used for the separation and quantification of vitamin A. The analytical parameters of linearity (r2: 0.9995), detection limit (0.010 microg/mL), quantification limit (0.025 microg/mL), precision (relative standard deviation, RSD = 9.0% within a day and RSD = 8.9% among days) and accuracy (recovery = 83.8%) demonstrate that the developed method allows the quantification of vitamin A in an efficient way. The mean values ± standard deviation (SD) obtained for the analyzed samples were 0.60 +/- 0.32, 0.65 +/- 0.33 and 0.61 +/- 0.26 microg/mL for the 5th, 6th and 7th postpartum months, respectively. There were no significant differences among the three months studied, and the values found were similar to those in the literature. Considering the whole population under study, 19.3% showed vitamin A levels below 0.40 microg/mL, which represents a risk to the children in this group since at least 0.50 microg/mL is necessary to meet the infant's daily needs.
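
    For readers who want the validation arithmetic spelled out, the sketch below estimates detection and quantification limits from a calibration line using the common ICH-style rules LOD = 3.3·s/slope and LOQ = 10·s/slope. The paper may have used a different convention, and the standards and signals here are made up.

```python
import numpy as np

def lod_loq(conc, signal):
    """LOD/LOQ from the residual standard deviation of a calibration line."""
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope * conc + intercept)
    s = resid.std(ddof=2)                 # ddof=2: slope and intercept estimated
    return 3.3 * s / slope, 10.0 * s / slope

conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8])          # microg/mL standards (made up)
signal = np.array([10.2, 20.5, 40.1, 81.0, 160.3])   # peak areas (made up)
print(lod_loq(conc, signal))
```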

  13. Predominant Lactobacillus species types of vaginal microbiota in pregnant Korean women: quantification of the five Lactobacillus species and two anaerobes.

    PubMed

    Kim, Jeong Hyun; Yoo, Seung Min; Sohn, Yong Hak; Jin, Chan Hee; Yang, Yun Suk; Hwang, In Taek; Oh, Kwan Young

    2017-10-01

    To investigate the predominant Lactobacillus species types (LSTs) of vaginal microbiota in pregnant Korean women by quantifying five Lactobacillus species and two anaerobes. In all, 168 pregnant Korean women under antenatal care at Eulji University Hospital and local clinics were enrolled in the prospective cohort study during pregnancy (10-14 weeks). Vaginal samples were collected with ESwab for quantitative polymerase chain reaction (qPCR) and stored in a -80 °C freezer. qPCR was performed for five Lactobacillus species and two anaerobes. To identify the predominant LSTs, quantifications were analyzed with the Cluster and TreeView programs of the Eisen lab. The quantifications were also compared among the classified groups. L. crispatus and L. iners were most commonly found in pregnant Korean women, followed by L. gasseri and L. jensenii; L. vaginalis was nearly absent. Five types (four predominant LSTs and one predominant-anaerobe type without a predominant Lactobacillus species) were classified. Five predominant LSTs were identified in the vaginal microbiota of pregnant Korean women. The L. crispatus and L. iners predominant types comprised a large proportion.

  14. Quantification of menadione from plasma and urine by a novel cysteamine-derivatization based UPLC-MS/MS method.

    PubMed

    Yuan, Teng-Fei; Wang, Shao-Ting; Li, Yan

    2017-09-15

    Menadione, as a crucial member of the vitamin K family, possesses significant nutritional and clinical value. However, to date there has been a lack of reliable quantification strategies for it. For improvement, a novel cysteamine-derivatization-based UPLC-MS/MS method is presented in this work. The derivatization reaction proved to be non-toxic, easy to handle and highly efficient, and it enabled MS detection of menadione under positive mode. Benefitting from the excellent sensitivity of the derivatized product as well as the introduction of the stable isotope dilution technique, quantification could be achieved in the range of 0.05-50.0 ng/mL for plasma and urine matrixes with satisfactory accuracy and precision. After analysis of samples from healthy volunteers following oral administration of menadione sodium bisulfite tablets, urinary free menadione was quantified for the very first time. We believe the progress in this work could largely promote the exploration of the metabolic mechanism of vitamin K in vivo. Copyright © 2017 Elsevier B.V. All rights reserved.
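
    The stable-isotope-dilution step reduces to a peak-area ratio. The sketch below shows that arithmetic only; the function name, the response factor of 1.0, and the example numbers are assumptions, not values from the paper.

```python
def sid_concentration(area_analyte, area_labelled_std, conc_labelled_std,
                      response_factor=1.0):
    """Stable isotope dilution: concentration from the analyte/standard area ratio.

    response_factor is the relative response calibrated with standard mixtures.
    """
    return (area_analyte / area_labelled_std) * conc_labelled_std / response_factor

# Example (made-up areas): peak-area ratio 0.42 against 10 ng/mL labelled standard.
print(sid_concentration(4.2e5, 1.0e6, 10.0))
```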

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, C. S.; Zhang, Hongbin

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS), a coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
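
    A correlation-based sensitivity ranking of the kind described is easy to reproduce on sampled input/output pairs. The sketch below ranks inputs by Pearson and Spearman correlation against one figure of merit; the input names and the synthetic data are assumptions, and partial correlations (also used in the study) are omitted for brevity.

```python
import numpy as np
from scipy import stats

def rank_inputs(X, y, names):
    """Rank uncertain inputs by |Pearson r| against a figure of merit (e.g. MDNBR)."""
    rows = [(name, stats.pearsonr(X[:, j], y)[0], stats.spearmanr(X[:, j], y)[0])
            for j, name in enumerate(names)]
    return sorted(rows, key=lambda r: abs(r[1]), reverse=True)

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))   # sampled uncertain inputs (synthetic)
y = 2.0 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(scale=0.1, size=200)
print(rank_inputs(X, y, ["inlet_temperature", "gap_conductance", "rod_power"]))
```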

  16. Detection and quantification of cocaine and benzoylecgonine in meconium using solid phase extraction and UPLC/MS/MS.

    PubMed

    Gunn, Josh; Kriger, Scott; Terrell, Andrea R

    2010-01-01

    The simultaneous determination and quantification of cocaine and its major metabolite, benzoylecgonine, in meconium using UPLC-MS/MS is described. Ultra-performance liquid chromatography (UPLC) is an emerging analytical technique which draws upon the principles of chromatography to run separations at higher flow rates for increased speed, while simultaneously achieving superior resolution and sensitivity. Extraction of cocaine and benzoylecgonine from the homogenized meconium matrix was achieved with a preliminary protein precipitation or protein 'crash' employing cold acetonitrile, followed by a mixed mode solid phase extraction (SPE). Following elution from the SPE cartridge, eluents were dried down under nitrogen, reconstituted in 200 microL of DI water:acetonitrile (ACN) (75:25), and injected onto the UPLC/MS/MS for analysis. The increased speed and separation efficiency afforded by UPLC, allowed for the separation and subsequent quantification of both analytes in less than 2 min. Analytes were quantified using multiple reaction monitoring (MRM) and six-point calibration curves constructed in negative blood. Limits of detection for both analytes were 3 ng/g and the lower limit of quantitation (LLOQ) was 30 ng/g.

  17. A novel quantification-driven proteomic strategy identifies an endogenous peptide of pleiotrophin as a new biomarker of Alzheimer's disease.

    PubMed

    Skillbäck, Tobias; Mattsson, Niklas; Hansson, Karl; Mirgorodskaya, Ekaterina; Dahlén, Rahil; van der Flier, Wiesje; Scheltens, Philip; Duits, Floor; Hansson, Oskar; Teunissen, Charlotte; Blennow, Kaj; Zetterberg, Henrik; Gobom, Johan

    2017-10-17

    We present a new, quantification-driven proteomic approach to identifying biomarkers. In contrast to the identification-driven approach, which is limited in scope to peptides identified by database searching in the first step, all MS data are considered when selecting biomarker candidates. The endopeptidome of cerebrospinal fluid from 40 Alzheimer's disease (AD) patients, 40 subjects with mild cognitive impairment, and 40 controls with subjective cognitive decline was analyzed using multiplex isobaric labeling. Spectral clustering was used to match MS/MS spectra. The top biomarker candidate cluster (215% higher in AD compared to controls, area under ROC curve = 0.96) was identified as a fragment of pleiotrophin located near the protein's C-terminus. Analysis of another cohort (n = 60 over four clinical groups) verified that the biomarker was increased in AD patients, while no change was observed in controls or in Parkinson's disease or progressive supranuclear palsy. The identification of the novel biomarker pleiotrophin 151-166 demonstrates that our quantification-driven proteomic approach is a promising method for biomarker discovery, which may be universally applicable in clinical proteomics.
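
    The reported area under the ROC curve has a simple rank-based form (the Mann-Whitney U statistic scaled to [0, 1]). A minimal sketch follows; the peptide abundances are made up and ties are ignored for simplicity.

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Rank-based AUC: probability a positive case outscores a negative one."""
    ranks = np.concatenate([scores_pos, scores_neg]).argsort().argsort() + 1
    n1, n2 = len(scores_pos), len(scores_neg)
    u = ranks[:n1].sum() - n1 * (n1 + 1) / 2
    return u / (n1 * n2)

ad = np.array([5.1, 4.8, 6.0, 5.5, 4.9])     # hypothetical AD abundances
ctrl = np.array([2.1, 2.9, 3.0, 2.4, 3.3])   # hypothetical control abundances
print(roc_auc(ad, ctrl))                      # 1.0 for perfectly separated groups
```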

  18. Polyaniline-graphene oxide nanocomposite sensor for quantification of calcium channel blocker levamlodipine.

    PubMed

    Jain, Rajeev; Sinha, Ankita; Khan, Ab Lateef

    2016-08-01

    A novel polyaniline-graphene oxide nanocomposite (PANI/GO/GCE) sensor has been fabricated for quantification of the calcium channel blocker drug levamlodipine (LAMP). The fabricated sensor has been characterized by electrochemical impedance spectroscopy, square-wave and cyclic voltammetry, Raman spectroscopy and Fourier transform infrared (FTIR) spectroscopy. The developed PANI/GO/GCE sensor has excellent analytical performance towards electrocatalytic oxidation compared with PANI/GCE, GO/GCE and bare GCE. Under optimized experimental conditions, the fabricated sensor exhibits a linear response for LAMP oxidation over a concentration range from 1.25 μg mL(-1) to 13.25 μg mL(-1), with a correlation coefficient of 0.9950 (r(2)), a detection limit of 1.07 ng mL(-1) and a quantification limit of 3.57 ng mL(-1). The sensor shows excellent performance for detecting LAMP, with a reproducibility of 2.78% relative standard deviation (RSD). The proposed method has been successfully applied for LAMP determination in a pharmaceutical formulation with a recovery from 99.88% to 101.75%. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Simple detection of residual enrofloxacin in meat products using microparticles and biochips.

    PubMed

    Ha, Mi-Sun; Chung, Myung-Sub; Bae, Dong-Ho

    2016-05-01

    A simple and sensitive method for detecting enrofloxacin, a major veterinary fluoroquinolone, was developed. Monoclonal antibody specific for enrofloxacin was immobilised on a chip and fluorescent dye-labelled microparticles were covalently bound to the enrofloxacin molecules. Enrofloxacin in solution competes with the microparticle-immobilised enrofloxacin (enroMPs) to bind to the antibody on the chip. The presence of enrofloxacin was verified by detecting the fluorescence of enrofloxacin-bound microparticles. Under optimum conditions, a high dynamic range was achieved at enrofloxacin concentrations ranging from 1 to 1000 μg kg(-1). The limits of detection and quantification for standard solutions were 5 and 20 μg kg(-1) respectively, which are markedly lower than the maximum residue limit. Using simple extraction methods, recoveries from fortified beef, pork and chicken samples were 43.4-62.3%. This novel method also enabled approximate quantification of enrofloxacin concentration: the enroMP signal intensity decreased with increasing enrofloxacin concentration. Because of its sensitivity, specificity, simplicity and rapidity, the method described herein will facilitate the detection and approximate quantification of enrofloxacin residues in foods in a high-throughput manner.

  20. Carotenoid profiling of leaves of selected eggplant accessions subjected to drought stress

    USDA-ARS?s Scientific Manuscript database

    This study focused on the quantification of carotenoids of the leaves of African eggplants commonly consumed as leafy and fruit vegetables. The results gave comparative profiles of carotenoids at different growth and developmental stages and under drought stress. Stress was achieved by limiting irri...

  1. Neuroimaging Research: from Null-Hypothesis Falsification to Out-Of-Sample Generalization

    ERIC Educational Resources Information Center

    Bzdok, Danilo; Varoquaux, Gaël; Thirion, Bertrand

    2017-01-01

    Brain-imaging technology has boosted the quantification of neurobiological phenomena underlying human mental operations and their disturbances. Since its inception, drawing inference on neurophysiological effects hinged on classical statistical methods, especially, the general linear model. The tens of thousands of variables per brain scan were…

  2. Quantification and fragment analysis of soil and cotton root-associated fungal and bacterial populations under different tillage managements

    USDA-ARS?s Scientific Manuscript database

    Background: Conservation tillage is a common management practice utilized in the hopes of reducing soil erosion and increasing soil carbon. Evidence suggests that conservation tillage may lead to habitat improvement for soil microorganisms, in particular rhizospheric bacteria and arbuscular mycorrhi...

  3. Quantification of soil surface roughness evolution under simulated rainfall

    USDA-ARS?s Scientific Manuscript database

    Soil surface roughness is commonly identified as one of the dominant factors governing runoff and interrill erosion. The objective of this study was to compare several existing soil surface roughness indices and to test the Revised Triangular Prism surface area Method (RTPM) as a new approach to cal...

  4. The enigma of energy: A philosophical inquiry

    NASA Astrophysics Data System (ADS)

    Todaro-Franceschi, Vidette

    1998-06-01

    A philosophical inquiry was undertaken to examine the enigma of energy in an attempt to clarify and further illuminate the basic ideas of energy. Beginning with the origin of the concept-Aristotle's conceptualization of energeia-and continuing through to the present day with an overview of the historical conceptual development of energy in Western science, an analysis and interpretation of the scientific and philosophic literature was performed. Literature regarding aspects of human sentience was also examined for underlying ideas of energy. And, finally, selected medical and nursing science theoretical frameworks were analyzed with the hope of further grasping the philosophical underpinnings related to the phenomenon of human energy. Certain ideas of energy became evident. Energy can be viewed as a process and this view works well within the physical science domain. When energy is viewed as a process it falls within the mechanistic tradition: things are viewed as particulate, and cause and effect related. However, energy can also be viewed as a phenomenon, a thing. As a phenomenon, energy is continually transforming and actualizing inherent potentials in a communal process. When energy is recognized as the sole phenomenon responsible for everything in existence, it becomes evident that all is essentially one. In addition, when energy is viewed in this manner it becomes increasingly difficult to deny the purposive character underlying all nature. It is argued that the mystery ultimately leads to something far beyond what we know exists. One of the intuitive feelings of this researcher was that there were at least two different ideas of energy in the sciences of medicine and nursing, which, while different, shared some common elements as well. An examination of Hippocrates', Nightingale's, Selye's, Levine's, and Rogers' ideas, as well as the basic tenets of alternative health care, revealed two distinct worldviews regarding human energy which are congruent with the ideas of energy as process and as a phenomenon. Both ideas, energy as process, and energy as a real entity, originated in Aristotle's work (384-322 BC) and both ways of viewing energy are still prevalent as we approach the 21 st century.

  5. Can nanotechnology improve cancer diagnosis through miRNA detection?

    PubMed

    Fiammengo, Roberto

    2017-01-01

    miRNAs are key regulators of gene expression, and alterations in their expression levels correlate with the onset and progression of cancer. Although miRNAs have been proposed as biomarkers for cancer diagnosis, their application in routine clinical praxis is yet to come. Current quantification strategies have limitations, and there is great interest in developing innovative ones. In recent years, nanotechnology-based approaches for miRNA quantification have been emerging at a fast pace, but there is an urgent need to go beyond the proof-of-concept stage. Nanotechnology will have a strong impact on cancer diagnosis through miRNA detection only if it is demonstrated that the newly developed approaches indeed work on 'real-world' samples under standardized conditions.

  6. Detection and quantification of long chain fatty acids in liquid and solid samples and its relevance to understand anaerobic digestion of lipids.

    PubMed

    Neves, L; Pereira, M A; Mota, M; Alves, M M

    2009-01-01

    A method for long chain fatty acid (LCFA) extraction, identification and quantification by gas chromatography was developed, and its application to liquid and solid samples collected from anaerobic digesters was demonstrated. After validation, the usefulness of this method was demonstrated in a cow manure digester receiving pulses of an industrial effluent with a high lipid content. The LCFA analysis data showed that the conversion of oleic acid, the main LCFA fed to the reactor, by the adapted biomass became faster and more effective over the successive pulses. Conversely, the accumulation of palmitic acid in the solid phase suggests that degradation of this LCFA, under these conditions, is less effective.

  7. Simultaneous quantification of five major active components in capsules of the traditional Chinese medicine ‘Shu-Jin-Zhi-Tong’ by high performance liquid chromatography

    PubMed Central

    Yang, Xing-Xin; Zhang, Xiao-Xia; Chang, Rui-Miao; Wang, Yan-Wei; Li, Xiao-Ni

    2011-01-01

    A simple and reliable high performance liquid chromatography (HPLC) method has been developed for the simultaneous quantification of five major bioactive components in ‘Shu-Jin-Zhi-Tong’ capsules (SJZTC), for the purposes of quality control of this commonly prescribed traditional Chinese medicine. Under the optimum conditions, excellent separation was achieved, and the assay was fully validated in terms of linearity, precision, repeatability, stability and accuracy. The validated method was applied successfully to the determination of the five compounds in SJZTC samples from different production batches. The HPLC method can be used as a valid analytical method to evaluate the intrinsic quality of SJZTC. PMID:29403711

  8. A fast, reliable, ultra high performance liquid chromatography method for the simultaneous determination of amino acids, biogenic amines and ammonium ions in cheese, using diethyl ethoxymethylenemalonate as a derivatising agent.

    PubMed

    Redruello, Begoña; Ladero, Victor; Cuesta, Isabel; Álvarez-Buylla, Jorge R; Martín, María Cruz; Fernández, María; Alvarez, Miguel A

    2013-08-15

    Derivatisation treatment with diethyl ethoxymethylenemalonate followed by ultra-HPLC allowed the simultaneous quantification of 22 amino acids, 7 biogenic amines and ammonium ions in cheese samples in under 10 min. This is the fastest elution time ever reported for such a resolution. The proposed method shows good linearity (R(2)>0.995) and sensitivity (detection limit 0.08-3.91 μM; quantification limit <13.02 μM). Intra- and inter-day repeatability ranged from 0.35% to 1.25% and from 0.85% to 5.2%, respectively. No significant effect of the cheese matrix was observed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. The Chimera of Proportionality: Institutionalising Limits on Punishment in Contemporary Social and Political Systems

    PubMed Central

    Lacey, Nicola; Pickard, Hanna

    2015-01-01

    The concept of proportionality has been central to the retributive revival in penal theory, and underlies desert theory’s normative and practical commitment to limiting punishment. Theories of punishment combining desert-based and consequentialist considerations also appeal to proportionality as a limiting condition. In this paper we argue that these claims are founded on an exaggerated idea of what proportionality can offer, and in particular fail properly to consider the institutional conditions needed to foster robust limits on the state’s power to punish. The idea that appeals to proportionality as an abstract ideal can help to limit punishment is, we argue, a chimera: what has been thought of as proportionality is not a naturally existing relationship, but a product of political and social construction, cultural meaning-making, and institution-building. Drawing on evolutionary psychology and comparative political economy, we argue that philosophers and social scientists need to work together to understand how the appeal of the idea of proportionality can best be realised through substantive institutional frameworks under particular conditions. PMID:25937675

  10. NMR high-resolution magic angle spinning rotor design for quantification of metabolic concentrations

    NASA Astrophysics Data System (ADS)

    Holly, R.; Damyanovich, A.; Peemoeller, H.

    2006-05-01

    A new high-resolution magic angle spinning nuclear magnetic resonance technique is presented to obtain absolute metabolite concentrations of solutions. The magnetic resonance spectrum of the sample under investigation and an internal reference are acquired simultaneously, ensuring both spectra are obtained under the same experimental conditions. The robustness of the technique is demonstrated using a solution of creatine, and it is shown that the technique can obtain solution concentrations to within 7% or better.

  11. Noise Propagation and Uncertainty Quantification in Hybrid Multiphysics Models: Initiation and Reaction Propagation in Energetic Materials

    DTIC Science & Technology

    2016-05-23

    ... (i) the lack of a general model for heterogeneous granular media under compaction and (ii) the lack of a reliable multiscale discrete-to-continuum framework for ... dynamics. These include a continuum-discrete model of heat dissipation/diffusion and a continuum-discrete model of compaction of a granular material with ...

  12. Bearing performance degradation assessment based on time-frequency code features and SOM network

    NASA Astrophysics Data System (ADS)

    Zhang, Yan; Tang, Baoping; Han, Yan; Deng, Lei

    2017-04-01

    Bearing performance degradation assessment and prognostics are extremely important in supporting maintenance decisions and guaranteeing the system's reliability. To achieve this goal, this paper proposes a novel feature extraction method for the degradation assessment and prognostics of bearings. Features of time-frequency codes (TFCs) are extracted from the time-frequency distribution using a hybrid procedure based on short-time Fourier transform (STFT) and non-negative matrix factorization (NMF) theory. An alternative way to design the health indicator is investigated by quantifying the similarity between feature vectors using a self-organizing map (SOM) network. On the basis of this idea, a new health indicator called the time-frequency code quantification error (TFCQE) is proposed to assess the performance degradation of the bearing. This indicator is constructed from the bearing's real-time behavior and a SOM model trained beforehand with only the TFC vectors acquired under the normal condition. Vibration signals collected from bearing run-to-failure tests are used to validate the developed method. The comparison results demonstrate the superiority of the proposed TFCQE indicator over many other traditional features in terms of feature quality metrics, incipient degradation identification and prediction accuracy. Highlights: • Time-frequency codes are extracted to reflect the signals' characteristics. • A SOM network serves as a tool to quantify the similarity between feature vectors. • A new health indicator is proposed to describe the whole course of degradation development. • The method is useful for extracting degradation features and detecting incipient degradation. • The superiority of the proposed method is verified using experimental data.
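
    The core of the TFCQE indicator, the distance from a feature vector to its best-matching SOM unit, can be sketched in a few lines once a codebook exists. The codebook below is random rather than SOM-trained, and all shapes and names are assumptions; the point is only that a feature vector drifting away from the healthy-condition codebook yields a growing quantification error.

```python
import numpy as np

def tfc_quantification_error(feature, codebook):
    """Distance to the best-matching unit of a SOM trained on healthy data only."""
    return np.linalg.norm(codebook - feature, axis=1).min()

rng = np.random.default_rng(3)
codebook = rng.normal(size=(64, 20))   # stand-in for an 8x8 SOM over 20-dim TFCs
healthy = rng.normal(size=20)
degraded = healthy + 2.5               # drifted feature vector (simulated damage)
print(tfc_quantification_error(healthy, codebook),
      tfc_quantification_error(degraded, codebook))
```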

  13. Thymosin β4 overexpression regulates neuron production and spatial distribution in the developing avian optic tectum.

    PubMed

    Lever, Mael; Theiss, Carsten; Morosan-Puopolo, Gabriela; Brand-Saberi, Beate

    2017-05-01

    Thymosin β4 (Tβ4), the principal G-actin regulating entity in eukaryotic cells, also has multiple intra- and extracellular functions related to tissue regeneration and healing. While its effects in adult organs are being widely investigated, little is currently known about its influence on embryonic tissues such as the developing nervous system. The importance of Tβ4 for neural stem cell proliferation in the embryonic chicken optic tectum (OT) was previously shown by us for the first time. In the present study, using in ovo electroporation, we carried out a quantification of the effects of Tβ4 overexpression on the developing chicken OT between E4 and E6 at the hemisphere as well as the cellular level. We precisely examined tissue growth and characterized cells arising from the elevated mitotic activity of progenitor cells. By using spinning-disk confocal laser scanning microscopy, we were able to visualize these effects across whole OT sections. Our experiments now demonstrate more clearly that the overexpression of Tβ4 leads to a tangential expansion of the treated OT hemisphere and that, under these circumstances, the overall density of tectal and in particular of postmitotic neuronal cells is increased. Thanks to this new quantitative approach, the present results extend our previous findings that Tβ4 is important for the proliferation of progenitor cells, neurogenesis, tangential expansion, and tissue growth in the young embryonic chicken optic tectum. Taken together, our results further illustrate and support the current idea that Tβ4 is widely implicated in the shaping and maintenance of the nervous system.

  14. Derechos Educacionales de los Padres: Una Explicacion de los Procedimientos de Seguridad para los Padres de Ninos con Discapacidades. Bajo la Clausula del Acta de Educacion para Individuos con Discapacidades (IDEA) y las Reglas para la Administracion del Acta de Educacion para Ninos Excepcionales (Educational Rights of Parents: An Explanation of Procedural Safeguards Available to Parents of Children with Disabilities. Under Provisions of the Individuals with Disabilities Education Act (IDEA) and the Rules for the Administration of the Exceptional Children's Educational Act [ECEA]).

    ERIC Educational Resources Information Center

    Mountain Plains Regional Resource Center, Des Moines, IA.

    This pamphlet, in Spanish, describes Colorado parents' educational rights under federal and state special education rules and regulations. It addresses: (1) free appropriate public education and termination of services; (2) required prior notice to parents if there is a proposed change or refusal to change a child's special education program; (3)…

  15. Investigating Elementary Teachers' Thinking About and Learning to Notice Students' Science Ideas

    NASA Astrophysics Data System (ADS)

    Luna, Melissa Jo

    Children naturally use observations and everyday thinking to construct explanations as to why phenomena happen in the world. Science instruction can benefit by starting with these ideas to help children build coherent scientific understandings of how the physical world works. To do so, science teaching must involve attending to students' ideas so that those ideas become the basis for learning. Yet while science education reform requires teachers to pay close attention to their students' ideas, we know little about what teachers think this means in practice. To examine this issue, my dissertation research is two-fold. First, I examine teacher thinking by investigating how teachers understand what it means to pay attention to students' science ideas. Specifically, using new digital technology, three participating teachers captured moments of student thinking in the midst of instruction. Analysis of these moments reveals that teachers capture many different kinds of moments containing students' ideas and think about students' science ideas in different ways at different times. In particular, these three teachers most often think about students' ideas as being (a) from authority, (b) from experience, and (c) under construction. Second, I examine teacher learning through the development of an innovative science teaching video club model. The model differs from previous research on video clubs in several key ways in an attempt to focus teachers on student thinking in a sustained way. I investigate the ways in which this model was effective for engaging teachers in noticing and making sense of their students' science ideas during one implementation. Results indicate that teachers talked about student thinking early, often, and in meaningful ways. Science education leaders have recognized the potential of science teaching video clubs as a form of professional development, and the model presented in this work promotes the conditions for successful teacher learning. This work contributes to research on teacher cognition by advancing what we know about teachers' understanding of attending to students' science ideas. In addition, it provides practical information concerning the design of teacher professional development supporting their learning to attend closely to the ideas students raise about scientific phenomena.

  16. Systems study for an Integrated Digital-Electric Aircraft (IDEA)

    NASA Technical Reports Server (NTRS)

    Tagge, G. E.; Irish, L. A.; Bailey, A. R.

    1985-01-01

    The results of the Integrated Digital/Electric Aircraft (IDEA) Study are presented. Airplanes with advanced systems were defined and evaluated as a means of identifying potential high-payoff research tasks. A baseline airplane was defined for comparison, typical of a 1990's airplane with advanced active controls, propulsion, aerodynamics, and structures technology. Trade studies led to the definition of an IDEA airplane, with extensive digital systems and electric secondary power distribution. This airplane showed an improvement of 3% in fuel use and 1.8% in DOC relative to the baseline configuration. An alternate configuration, an advanced technology turboprop, was also evaluated, with greater improvement supported by digital electric systems. Recommended research programs were defined for high-risk, high-payoff areas appropriate for implementation under NASA leadership.

  17. Carbon stocks quantification in agricultural systems employing succession and rotation of crops in Rio Grande do Sul State, Brazil.

    NASA Astrophysics Data System (ADS)

    Walter, Michele K. C.; Marinho, Mara de A.; Denardin, José E.; Zullo, Jurandir, Jr.; Paz-González, Antonio

    2013-04-01

    Soil and vegetation constitute, respectively, the third and fourth terrestrial reservoirs of carbon (C) on Earth. C sequestration in these reservoirs includes the capture of CO2 from the atmosphere by photosynthesis and its storage as organic C. Consequently, changes in land use and agricultural practices directly affect greenhouse gas emissions and C sequestration. Several studies have already demonstrated that conservation agriculture, and particularly zero tillage (ZT), has a positive effect on soil C sequestration. The Brazilian federal program ABC (Agriculture of Low Carbon Emission) was conceived to promote agricultural production with environmental protection and represents an instrument to achieve voluntary mitigation targets or NAMAs (Nationally Appropriate Mitigation Actions). With financial resources of about US$1.0 billion until 2020, the ABC Program has a target of expanding ZT over 8 million hectares of land, with a reduction of 16 to 20 million tons of CO2eq. Our objective was to quantify the C stocks in soil, plants and litter of representative grain crop systems under ZT in Rio Grande do Sul State, Brazil. Two treatments of a long-term experimental trial (> 20 years) were evaluated: 1) crop succession with wheat (Triticum aestivum L.)/soybean (Glycine max (L.) Merril); 2) crop rotation with wheat/soybean (1st year), vetch (Vicia sativa L.)/soybean (2nd year), and white oat (Avena sativa L.)/sorghum (Sorghum bicolor L.) (3rd year). C quantification in plants and in litter was performed using the direct method of biomass quantification. The soil type evaluated was a Humic Rhodic Hapludox, and C quantification was carried out using the method referred to as "C mass by unit area". Results showed that soybean plants under crop succession presented a greater C stock (4.31 MgC ha-1) than soybean plants cultivated under crop rotation (3.59 MgC ha-1). For wheat, however, a greater C stock was quantified in plants under rotation than under succession (4.95 and 4.14 MgC ha-1, respectively). No differences between succession and rotation (1st year) or succession and rotation (3rd year) were found for litter. Differences in litter C stock were found only when comparing succession (2.42 MgC ha-1) with rotation (2nd year) (3.44 MgC ha-1). Average soil C stocks at 0-30 cm depth under succession (67.79 MgC ha-1) and rotation (64.83 MgC ha-1) do not differ between treatments. These values, in comparison with others determined for similar soil-climate conditions for soils under native forest (60.83 MgC ha-1) and under conventional tillage (60.68 MgC ha-1), reveal a beneficial effect of ZT on soil C stock. Finally, the C stocks determined for plants and litter, representing only 4.0% and 6.4% of that determined for soil, confirm the relevance of soil as a terrestrial C reservoir. Acknowledgments: The authors express thanks for the financial support and technical facilities received from Embrapa Trigo, CEPAGRI/UNICAMP, and FAEPEX/UNICAMP. CAPES/GOV.BRAZIL is also acknowledged by Dr. Michele K. C. Walter for the granted scholarship.

  18. 36 CFR 72.42 - Expansion and new development.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., Rehabilitation and Innovation § 72.42 Expansion and new development. (a) Expansion. Because the UPARR Program is... development will not be assisted under a rehabilitation grant. (2) Innovation. New development may be allowed under an Innovation grant when it is directly related to a specific innovative idea or technique...

  19. 36 CFR 72.42 - Expansion and new development.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Rehabilitation and Innovation § 72.42 Expansion and new development. (a) Expansion. Because the UPARR Program is... development will not be assisted under a rehabilitation grant. (2) Innovation. New development may be allowed under an Innovation grant when it is directly related to a specific innovative idea or technique...

  20. 36 CFR 72.42 - Expansion and new development.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Rehabilitation and Innovation § 72.42 Expansion and new development. (a) Expansion. Because the UPARR Program is... development will not be assisted under a rehabilitation grant. (2) Innovation. New development may be allowed under an Innovation grant when it is directly related to a specific innovative idea or technique...

  1. 36 CFR 72.42 - Expansion and new development.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., Rehabilitation and Innovation § 72.42 Expansion and new development. (a) Expansion. Because the UPARR Program is... development will not be assisted under a rehabilitation grant. (2) Innovation. New development may be allowed under an Innovation grant when it is directly related to a specific innovative idea or technique...

  2. Autism Litigation under the IDEA: A New Meaning of "Disproportionality"?

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2011-01-01

    Children with autism accounted for almost one third of a comprehensive sample of published court decisions concerning the core concepts of free appropriate public education (FAPE) and least restrictive environment (LRE) under the Individuals With Disabilities Education Act. The other major, and more significant, finding was that when comparing…

  3. Drugged driving in Louisiana : quantification of its impact on public health and implications for legislation, enforcement and prosecution.

    DOT National Transportation Integrated Search

    2015-07-20

    Drugged driving, i.e., driving under the influence of drugs, is considered a rising public health issue in the US and the rest of the world, yet due to underreporting and limitations of existing data, not much is known about the frequency of drugged ...

  4. Association between Measures of Academic Performance and Psychosocial Adjustment for Asian/Pacific-Islander Adolescents.

    ERIC Educational Resources Information Center

    Hishinuma, Earl S.; Foster, Judy E.; Miyamoto, Robin H.; Nishimura, Stephanie T.; Andrade, Naleen N.; Nahulu, Linda B.; Goebert, Deborah A.; Yuen, Noelle Y. C.; Makini, George K., Jr.; Kim, S. Peter; Carlton, Barry S.

    2001-01-01

    Examines the association between different measures of academic performance and psychological adjustment for a sample of under-researched Asian/Pacific Islander adolescents from Hawaii. Results support the use of the actual quantification of academic performance (i.e. cumulative grade point average or self reported evaluation) in predicting…

  5. White Mountain Apache Tribe Water Rights Quantification Act of 2010

    THOMAS, 111th Congress

    Rep. Kirkpatrick, Ann [D-AZ-1

    2009-02-13

    Senate - 03/26/2010 Read twice. Placed on Senate Legislative Calendar under General Orders. Calendar No. 340. Notes: For further action, see H.R.4783, which became Public Law 111-291 on 12/8/2010. Tracker: This bill has the status Passed House.

  6. White Mountain Apache Tribe Water Rights Quantification Act of 2009

    THOMAS, 111th Congress

    Sen. Kyl, Jon [R-AZ

    2009-01-26

    Senate - 01/21/2010 Placed on Senate Legislative Calendar under General Orders. Calendar No. 260. Notes: For further action, see H.R.4783, which became Public Law 111-291 on 12/8/2010. Tracker: This bill has the status Introduced.

  7. Quantification of Ethanol's Anti-Punishment Effect in Humans Using the Generalized Matching Equation

    ERIC Educational Resources Information Center

    Rasmussen, Erin B.; Newland, M. Christopher

    2009-01-01

    Increases in rates of punished behavior by the administration of drugs with anxiolytic effects (called antipunishment effects) are well established in animals but not humans. The present study examined antipunishment effects of ethanol in humans using a choice procedure. The behavior of 5 participants was placed under six concurrent…

  8. Target-Driven Reforms: Education for All and the Translations of Equity and Inclusion in India

    ERIC Educational Resources Information Center

    Mukhopadhyay, Rahul; Sriprakash, Arathi

    2013-01-01

    This paper critically examines the ways in which inclusion and equity are constituted through education development policies in India. Programmes implemented under global and national Education for All (EFA) policies have largely involved the quantification of "equity" whereby schooling processes are measured against broad targets for…

  9. Research and development of ultrasonic tomography technology for three-dimensional imaging of internal rail flaws : modeling and simulation.

    DOT National Transportation Integrated Search

    2013-04-01

    This report covers the work performed under the FRA High-Speed BAA 2010-2011 program to demonstrate the technology of ultrasonic tomography for 3-D imaging of internal rail flaws. There is a need to develop new technologies that are able to quantif...

  10. Severity Levels and Symptoms Complexes for Acute Radiation Sickness -- Description and Quantification

    DTIC Science & Technology

    1985-11-30

    of military task performance levels. This effort was performed under the guidance and direction of DNA staff members Dr. David Auton and Dr. Robert W...

  11. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  12. Chemical Quantification of Atomic-Scale EDS Maps under Thin Specimen Conditions

    DOE PAGES

    Lu, Ping; Romero, Eric; Lee, Shinbuhm; ...

    2014-10-13

    We report our effort to quantify atomic-scale chemical maps obtained by collecting energy-dispersive X-ray spectra (EDS) using scanning transmission electron microscopy (STEM) (STEM-EDS). Under a thin-specimen condition, and when the EDS scattering potential is localized, the X-ray counts from atomic columns can be properly counted by fitting Gaussian peaks at the atomic columns, and can then be used for site-by-site chemical quantification. The effects of specimen thickness and X-ray energy on the Gaussian peak-width are investigated using SrTiO3 (STO) as a model specimen. The relationship between the peak-width and the spatial resolution of an EDS map is also studied. Furthermore, the method developed in this work is applied to study a Sm-doped STO thin film and antiphase boundaries present within the STO film. We find that Sm atoms occupy both Sr and Ti sites but preferably the Sr sites, and that Sm atoms are relatively depleted at the antiphase boundaries, likely due to the effect of strain.
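
    The column-counting step, fitting a Gaussian at an atomic column and taking its integrated area as the column's X-ray counts, can be sketched in one dimension with SciPy. The profile, initial guesses and peak width below are synthetic assumptions, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, amp, mu, sigma, offset):
    return amp * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) + offset

def column_counts(x, counts, mu_guess):
    """Integrated area (amp * |sigma| * sqrt(2*pi)) of one fitted column peak."""
    p0 = (counts.max() - counts.min(), mu_guess, 0.05, counts.min())
    (amp, mu, sigma, offset), _ = curve_fit(gauss, x, counts, p0=p0)
    return amp * abs(sigma) * np.sqrt(2.0 * np.pi)

# Synthetic line profile: one atomic column at 0.2 nm on a flat background.
x = np.linspace(0.0, 0.4, 81)
counts = gauss(x, 120.0, 0.2, 0.04, 5.0) + np.random.default_rng(4).normal(0, 2, x.size)
print(column_counts(x, counts, mu_guess=0.2))
```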

  13. Dotette: Programmable, high-precision, plug-and-play droplet pipetting.

    PubMed

    Fan, Jinzhen; Men, Yongfan; Hao Tseng, Kuo; Ding, Yi; Ding, Yunfeng; Villarreal, Fernando; Tan, Cheemeng; Li, Baoqing; Pan, Tingrui

    2018-05-01

    Manual micropipettes are the most heavily used liquid handling devices in biological and chemical laboratories; however, they suffer from low precision for volumes under 1 μl and inevitable human errors. For a manual device, the human errors introduced pose potential risks of failed experiments, inaccurate results, and financial costs. Meanwhile, low precision under 1 μl can cause severe quantification errors and high heterogeneity of outcomes, becoming a bottleneck of reaction miniaturization for quantitative research in biochemical labs. Here, we report Dotette, a programmable, plug-and-play microfluidic pipetting device based on nanoliter liquid printing. With automated control, protocols designed on computers can be directly downloaded into Dotette, enabling programmable operation processes. Utilizing continuous nanoliter droplet dispensing, the precision of the volume control has been successfully improved from the traditional 20%-50% to less than 5% in the range of 100 nl to 1000 nl. Such a highly automated, plug-and-play add-on to existing pipetting devices not only improves precise quantification in low-volume liquid handling and reduces chemical consumption but also facilitates and automates a variety of biochemical and biological operations.

  14. The regulation by phenolic compounds of soil organic matter dynamics under a changing environment.

    PubMed

    Min, Kyungjin; Freeman, Chris; Kang, Hojeong; Choi, Sung-Uk

    2015-01-01

    Phenolics are the most abundant plant metabolites and are believed to decompose slowly in soils compared to other soil organic matter (SOM). Thus, they have often been considered as a slow carbon (C) pool in soil dynamics models. Here, however, we review changes in our concept about the turnover rate of phenolics and quantification of different types of phenolics in soils. Also, we synthesize current research on the degradation of phenolics and their regulatory effects on decomposition. Environmental changes, such as elevated CO2, warming, nitrogen (N) deposition, and drought, could influence the production and form of phenolics, leading to a change in SOM dynamics, and thus we also review the fate of phenolics under environmental disturbances. Finally, we propose the use of phenolics as a tool to control rates of SOM decomposition to stabilize organic carbon in ecosystems. Further studies to clarify the role of phenolics in SOM dynamics should include improving quantification methods, elucidating the relationship between phenolics and soil microorganisms, and determining the interactive effects of combinations of environmental changes on the phenolics production and degradation and subsequent impact on SOM processing.

  15. Development and application of damage assessment modeling: example assessment for the North Cape oil spill.

    PubMed

    McCay, Deborah French

    2003-01-01

    Natural resource damage assessment (NRDA) models for oil spills have been under development since 1984. Generally applicable (simplified) versions with built-in data sets are included in US government regulations for NRDAs in US waters. The most recent version of these models is SIMAP (Spill Impact Model Application Package), which contains oil fates and effects models that may be applied to any spill event and location in marine or freshwater environments. It is often not cost-effective or even possible to quantify spill impacts using field data collections. Modeling allows quantification of spill impacts using as much site-specific data as available, either as input or as validation of model results. SIMAP was used for the North Cape oil spill in Rhode Island (USA) in January 1996, for injury quantification in the first and largest NRDA case to be performed under the 1996 Oil Pollution Act NRDA regulations. The case was successfully settled in 1999. This paper, which contains a description of the model and application to the North Cape spill, delineates and demonstrates the approach.

  16. Stochastic output error vibration-based damage detection and assessment in structures under earthquake excitation

    NASA Astrophysics Data System (ADS)

    Sakellariou, J. S.; Fassois, S. D.

    2006-11-01

    A stochastic output error (OE) vibration-based methodology for damage detection and assessment (localization and quantification) in structures under earthquake excitation is introduced. The methodology is intended for assessing the state of a structure following potential damage occurrence by exploiting vibration signal measurements produced by low-level earthquake excitations. It is based upon (a) stochastic OE model identification, (b) statistical hypothesis testing procedures for damage detection, and (c) a geometric method (GM) for damage assessment. The methodology's advantages include the effective use of the non-stationary and limited duration earthquake excitation, the handling of stochastic uncertainties, the tackling of the damage localization and quantification subproblems, the use of "small" size, simple and partial (in both the spatial and frequency bandwidth senses) identified OE-type models, and the use of a minimal number of measured vibration signals. Its feasibility and effectiveness are assessed via Monte Carlo experiments employing a simple simulation model of a 6 storey building. It is demonstrated that damage levels of 5% and 20% reduction in a storey's stiffness characteristics may be properly detected and assessed using noise-corrupted vibration signals.
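
    A minimal sketch of the detection step (not the authors' exact OE formulation): a simple AR model is identified on vibration data from the healthy structure, and damage is flagged when the residual variance of newly acquired data exceeds an F-test bound. The model order, test level, and synthetic signals are illustrative assumptions.

        import numpy as np
        from scipy import stats

        def fit_ar(y, order):
            """Least-squares AR(order) fit; returns coefficients and residuals."""
            X = np.column_stack([y[order - k - 1:len(y) - k - 1]
                                 for k in range(order)])
            coeffs, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
            return coeffs, y[order:] - X @ coeffs

        def damage_test(y_healthy, y_test, order=4, alpha=0.05):
            """F-test on residual variances under the healthy-state model."""
            coeffs, res_h = fit_ar(y_healthy, order)
            X = np.column_stack([y_test[order - k - 1:len(y_test) - k - 1]
                                 for k in range(order)])
            res_t = y_test[order:] - X @ coeffs
            f_stat = res_t.var(ddof=1) / res_h.var(ddof=1)
            bound = stats.f.ppf(1 - alpha, len(res_t) - 1, len(res_h) - 1)
            return f_stat, f_stat > bound

        rng = np.random.default_rng(1)
        healthy = np.convolve(rng.standard_normal(2000), [1, 0.6, 0.3], "same")
        damaged = np.convolve(rng.standard_normal(2000), [1, 0.8, 0.1], "same")
        print(damage_test(healthy, damaged))   # (F statistic, damage flag)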

  17. Highly sensitive simultaneous quantification of estrogenic tamoxifen metabolites and steroid hormones by LC-MS/MS.

    PubMed

    Johänning, Janina; Heinkele, Georg; Precht, Jana C; Brauch, Hiltrud; Eichelbaum, Michel; Schwab, Matthias; Schroth, Werner; Mürdter, Thomas E

    2015-09-01

    Tamoxifen is a mainstay in the treatment of estrogen receptor-positive breast cancer and is metabolized to more than 30 different compounds. Little is known about in vivo concentrations of the estrogenic metabolites E-metabolite E, Z-metabolite E, and bisphenol and their relevance for tamoxifen efficacy. Therefore, we developed a highly sensitive HPLC-ESI-MS/MS quantification method for the tamoxifen metabolites bisphenol, E-metabolite E, and Z-metabolite E as well as for the sex steroid hormones estradiol, estrone, testosterone, androstenedione, and progesterone. Plasma samples were subjected to protein precipitation followed by solid phase extraction. Upon derivatization with 3-[(N-succinimide-1-yl)oxycarbonyl]-1-methylpyridinium iodide, all analytes were separated on a sub-2-μm column with a gradient of acetonitrile in water with 0.1% formic acid. Analytes were detected on a triple-quadrupole mass spectrometer with positive electrospray ionization in the multiple reaction monitoring mode. Our method demonstrated high sensitivity, accuracy, and precision. The lower limits of quantification were 12, 8, and 25 pM for bisphenol, E-metabolite E, and Z-metabolite E, respectively; 4 pM for estradiol and estrone; 50 pM for testosterone and androstenedione; and 25 pM for progesterone. The method was applied to plasma samples of postmenopausal patients taken at baseline and under tamoxifen therapy. Graphical Abstract: Sample preparation and derivatization for highly sensitive quantification of estrogenic tamoxifen metabolites and steroid hormones by HPLC-MS/MS.

  18. Suppression of anomalous synchronization and nonstationary behavior of neural network under small-world topology

    NASA Astrophysics Data System (ADS)

    Boaretto, B. R. R.; Budzinski, R. C.; Prado, T. L.; Kurths, J.; Lopes, S. R.

    2018-05-01

    It is known that neural networks under small-world topology can present anomalous synchronization and nonstationary behavior for weak coupling regimes. Here, we propose methods to suppress the anomalous synchronization and also to diminish the nonstationary behavior occurring in weakly coupled neural networks under small-world topology. We consider a network of 2000 thermally sensitive identical neurons, based on the Hodgkin-Huxley model, in a small-world topology with the probability of adding a nonlocal connection equal to p = 0.001. Based on experimental protocols to suppress anomalous synchronization, as well as nonstationary behavior of the neural network dynamics, we make use of (i) an external stimulus (pulsed current); (ii) changes to biological parameters (neuron membrane conductance); and (iii) body temperature changes. Quantification analysis to evaluate phase synchronization makes use of the Kuramoto order parameter, while recurrence quantification analysis, particularly the determinism, computed over the easily accessible mean field of the network, the local field potential (LFP), is used to evaluate nonstationary states. We show that the proposed methods can control the anomalous synchronization and nonstationarity occurring for weak coupling parameters without any effect on the individual neuron dynamics, nor on the expected asymptotic synchronized states occurring for large values of the coupling parameter.
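
    For reference, the Kuramoto order parameter used here is r(t) = |(1/N) Σ_j exp(iθ_j(t))|, which is near 1 for phase-synchronized neurons and near 0 for incoherent ones. A minimal sketch with synthetic phases (not the Hodgkin-Huxley network itself):

        import numpy as np

        def kuramoto_order(phases):
            """phases: array (n_neurons, n_times); returns r(t) in [0, 1]."""
            return np.abs(np.exp(1j * phases).mean(axis=0))

        rng = np.random.default_rng(2)
        t = np.linspace(0, 10, 500)
        base = 2 * np.pi * 1.0 * t                       # common 1 Hz rhythm
        offsets = rng.uniform(0, 2 * np.pi, (2000, 1))   # per-neuron offsets
        print(kuramoto_order(base + 0.1 * offsets).mean())  # ~1: synchronized
        print(kuramoto_order(base + 1.0 * offsets).mean())  # ~0: incoherent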

  19. A General Uncertainty Quantification Methodology for Cloud Microphysical Property Retrievals

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Xie, S.; Chen, X.; Zhao, C.

    2014-12-01

    The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program provides long-term (~20 years) ground-based cloud remote sensing observations. However, there are large uncertainties in the retrieval products of cloud microphysical properties based on the active and/or passive remote-sensing measurements. To address this uncertainty issue, a DOE Atmospheric System Research scientific focus study, Quantification of Uncertainties in Cloud Retrievals (QUICR), has been formed. In addition to an overview of recent progress of QUICR, we will demonstrate the capacity of an observation-based general uncertainty quantification (UQ) methodology via the ARM Climate Research Facility baseline cloud microphysical properties (MICROBASE) product. This UQ method utilizes the Karhunen-Loève expansion (KLE) and the Central Limit Theorem (CLT) to quantify the retrieval uncertainties from observations and algorithm parameters. The input perturbations are imposed on major modes to take into account the cross correlations between input data, which greatly reduces the dimension of random variables (up to a factor of 50) and quantifies vertically resolved full probability distribution functions of retrieved quantities. Moreover, this KLE/CLT approach has the capability of attributing the uncertainties in the retrieval output to individual uncertainty sources and thus sheds light on improving the retrieval algorithm and observations. We will present the results of a case study for the ice water content at the Southern Great Plains during an intensive observing period on March 9, 2000. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
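
    A hedged sketch of the dimension-reduction idea behind the KLE step: correlated input profiles are expanded in a Karhunen-Loève (eigenvector) basis of their covariance, and perturbations are imposed only on the leading modes. The toy covariance and the 99% energy cutoff are our assumptions, not the MICROBASE inputs.

        import numpy as np

        def kl_modes(samples, energy=0.99):
            """Leading KL modes capturing `energy` of the variance.
            samples: (n_realizations, n_levels)."""
            mean = samples.mean(axis=0)
            vals, vecs = np.linalg.eigh(np.cov(samples, rowvar=False))
            order = np.argsort(vals)[::-1]
            vals, vecs = vals[order], vecs[:, order]
            k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), energy)) + 1
            return mean, vals[:k], vecs[:, :k]

        def perturb(mean, vals, vecs, n, rng):
            """Draw n perturbed profiles from the truncated KL expansion."""
            xi = rng.standard_normal((n, len(vals)))
            return mean + (xi * np.sqrt(vals)) @ vecs.T

        rng = np.random.default_rng(3)
        z = np.linspace(0, 1, 60)
        cov = np.exp(-np.abs(z[:, None] - z[None, :]) / 0.2)
        data = rng.multivariate_normal(np.zeros(60), cov, size=500)
        mean, vals, vecs = kl_modes(data)
        print(len(vals), "modes retained out of 60")   # strong reduction
        profiles = perturb(mean, vals, vecs, 10, rng)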

  20. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration. Projecting analysis dates for the IDEA collaboration

    PubMed Central

    Renfro, Lindsay A.; Grothey, Axel M.; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J.

    2015-01-01

    Purpose Clinical trials are expensive and lengthy, and the success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. Patients and Methods In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Results Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events off by only eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Conclusions Real-time projection of the final analysis time during a trial, or the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required. PMID:26989447
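
    The projection logic can be illustrated with a small simulation (the rates, targets, and exponential event-time model below are invented for illustration; IDEA used ACCENT-based parametric models):

        import numpy as np

        def projected_analysis_time(accrual_rate, n_total, median_dfs,
                                    n_events, n_sims=2000, seed=4):
            """Median simulated time (months) until n_events events occur."""
            rng = np.random.default_rng(seed)
            hazard = np.log(2) / median_dfs
            times = np.empty(n_sims)
            for i in range(n_sims):
                entry = np.sort(rng.uniform(0, n_total / accrual_rate, n_total))
                event = entry + rng.exponential(1 / hazard, n_total)
                times[i] = np.sort(event)[n_events - 1]
            return np.median(times)

        # e.g. 100 patients/month, 2000 patients, median DFS 60 months,
        # analysis triggered at 800 disease-free-survival events
        print(projected_analysis_time(100, 2000, 60, 800))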

  1. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch; Gallati, Sabina, E-mail: sabina.gallati@insel.ch; Schaller, Andre, E-mail: andre.schaller@insel.ch

    2012-07-06

    Highlights: • Serial qPCR accurately determines the fragmentation state of any given DNA sample. • Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genomes. • Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. • Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be characterized more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of the measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA), we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time the different degradation impact of the two genomes has been demonstrated and the impact of DNA degradation on the quantification of mtDNA copy number systematically evaluated.
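
    The extrapolation idea can be sketched as follows: copy numbers measured with increasing amplicon sizes are fit to N(L) = N0 * exp(-λL) per genome, and a degradation-corrected ratio is read off at L = 0. The amplicon sizes and copy numbers below are invented; only the fitting scheme reflects the paper's approach.

        import numpy as np
        from scipy.optimize import curve_fit

        def decay_fit(amplicon_bp, copies):
            """Fit N(L) = N0 * exp(-lam * L); returns (N0, lam)."""
            model = lambda L, n0, lam: n0 * np.exp(-lam * L)
            popt, _ = curve_fit(model, amplicon_bp, copies,
                                p0=(copies[0], 1e-3))
            return popt

        rng = np.random.default_rng(5)
        bp = np.array([80.0, 150.0, 300.0, 600.0, 1000.0])
        mt = 5e4 * np.exp(-0.0008 * bp) * (1 + 0.05 * rng.standard_normal(5))
        nuc = 1e2 * np.exp(-0.0025 * bp) * (1 + 0.05 * rng.standard_normal(5))
        (n0_mt, lam_mt), (n0_n, lam_n) = decay_fit(bp, mt), decay_fit(bp, nuc)
        print("decay constants:", lam_mt, lam_n)          # mtDNA vs nDNA
        print("corrected mtDNA/nDNA ratio:", n0_mt / n0_n)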

  2. Quantification of OH and HO2 radicals during the low-temperature oxidation of hydrocarbons by Fluorescence Assay by Gas Expansion technique

    PubMed Central

    Blocquet, Marion; Schoemaecker, Coralie; Amedro, Damien; Herbinet, Olivier; Battin-Leclerc, Frédérique; Fittschen, Christa

    2013-01-01

    •OH and •HO2 radicals are known to be the key species in the development of ignition. A direct measurement of these radicals under low-temperature oxidation conditions (T = 550–1,000 K) has been achieved by coupling a technique named fluorescence assay by gas expansion, an experimental technique designed for the quantification of these radicals in the free atmosphere, to a jet-stirred reactor, an experimental device designed for the study of low-temperature combustion chemistry. Calibration allows conversion of relative fluorescence signals to absolute mole fractions. Such radical mole fraction profiles will serve as a benchmark for testing chemical models developed to improve the understanding of combustion processes. PMID:24277836

  3. Scalable Methods for Uncertainty Quantification, Data Assimilation and Target Accuracy Assessment for Multi-Physics Advanced Simulation of Light Water Reactors

    NASA Astrophysics Data System (ADS)

    Khuwaileh, Bassam

    High fidelity simulation of nuclear reactors entails large scale applications characterized by high dimensionality and tremendous complexity, where various physics models are integrated in the form of coupled models (e.g. neutronics with thermal-hydraulic feedback). Each of the coupled modules represents a high fidelity formulation of the first principles governing the physics of interest. Therefore, new developments in high fidelity multi-physics simulation and the corresponding sensitivity/uncertainty quantification analysis are paramount to the development and competitiveness of reactors, achieved through enhanced understanding of the design and safety margins. Accordingly, this dissertation introduces efficient and scalable algorithms for performing Uncertainty Quantification (UQ), Data Assimilation (DA) and Target Accuracy Assessment (TAA) for large scale, multi-physics reactor design and safety problems. This dissertation builds upon previous efforts for adaptive core simulation and reduced order modeling algorithms and extends them towards coupled multi-physics models with feedback. The core idea is to recast the reactor physics analysis in terms of reduced order models. This can be achieved by identifying the important/influential degrees of freedom (DoF) via subspace analysis, such that the required analysis can be recast in terms of the important DoF only. In this dissertation, efficient algorithms for lower dimensional subspace construction have been developed for single-physics and multi-physics applications with feedback. The reduced subspace is then used to solve realistic, large scale forward (UQ) and inverse (DA and TAA) problems. Once the elite set of DoF is determined, the uncertainty/sensitivity/target accuracy assessment and data assimilation analysis can be performed accurately and efficiently for large scale, high dimensional multi-physics nuclear engineering applications. Hence, in this work a Karhunen-Loeve (KL) based algorithm previously developed to quantify the uncertainty of single-physics models is extended to large scale multi-physics coupled problems with feedback effects. Moreover, a non-linear surrogate-based UQ approach is developed, used, and compared with the performance of the KL approach and a brute-force Monte Carlo (MC) approach. On the other hand, an efficient Data Assimilation (DA) algorithm is developed to assess information about the model's parameters: nuclear data cross sections and thermal-hydraulic parameters. Two improvements are introduced to enable DA on high dimensional problems. First, a goal-oriented surrogate model can be used to replace the original models in the depletion sequence (MPACT, COBRA-TF, ORIGEN). Second, approximating the complex and high dimensional solution space with a lower dimensional subspace makes the sampling process necessary for DA tractable for high dimensional problems. Moreover, safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. Accordingly, an inverse problem can be defined and solved to assess the contributions from sources of uncertainty, and experimental effort can subsequently be directed to further improve the uncertainty associated with these sources.
In this dissertation a subspace-based, gradient-free and nonlinear algorithm for inverse uncertainty quantification, namely Target Accuracy Assessment (TAA), has been developed and tested. The ideas proposed in this dissertation were first validated using lattice physics applications simulated with the SCALE6.1 package (Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) lattice models). Ultimately, the algorithms proposed here were applied to perform UQ and DA for assembly-level (CASL Progression Problem Number 6) and core-wide problems representing Watts Bar Nuclear 1 (WBN1) for cycle 1 of depletion (CASL Progression Problem Number 9), modeled using VERA-CS, which consists of several coupled multi-physics models. The analysis and algorithms developed in this dissertation were encoded and implemented in a newly developed toolkit, the Reduced Order Modeling based Uncertainty/Sensitivity Estimator (ROMUSE).
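
    The subspace-construction step can be illustrated with a randomized range finder (a simplified stand-in for the dissertation's algorithms): snapshots of model responses are compressed into an orthonormal basis, after which UQ and DA operate on a handful of reduced coordinates instead of the full state.

        import numpy as np

        def reduced_basis(snapshots, r, oversample=10, seed=7):
            """Orthonormal (m x r) basis approximating range(snapshots)."""
            rng = np.random.default_rng(seed)
            sketch = snapshots @ rng.standard_normal(
                (snapshots.shape[1], r + oversample))
            q, _ = np.linalg.qr(sketch)
            return q[:, :r]

        rng = np.random.default_rng(8)
        core = rng.standard_normal((5000, 20))             # true rank ~ 20
        snaps = core @ rng.standard_normal((20, 300))      # 300 model runs
        basis = reduced_basis(snaps, 20)
        resid = snaps - basis @ (basis.T @ snaps)
        print(np.linalg.norm(resid) / np.linalg.norm(snaps))  # ~ 0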

  4. Identification of potential internal control genes for real-time PCR analysis during stress response in Pyropia haitanensis

    NASA Astrophysics Data System (ADS)

    Wang, Xia; Feng, Jianhua; Huang, Aiyou; He, Linwen; Niu, Jianfeng; Wang, Guangce

    2017-11-01

    Pyropia haitanensis has prominent stress-resistance characteristics and is endemic to China. Studies of the stress responses in these algae could provide valuable information on the stress-response mechanisms in the intertidal Rhodophyta. Here, the effects of salinity and light intensity on the quantum yield of photosystem II in Py. haitanensis were investigated using pulse-amplitude-modulation fluorometry. Total RNA and genomic DNA of the samples under different stress conditions were isolated. By normalizing to the genomic DNA quantity, the RNA content in each sample was evaluated. The cDNA was synthesized and the expression levels of seven potential internal control genes were evaluated using the qRT-PCR method. Then, we used geNorm, a common statistical algorithm, to analyze the qRT-PCR data of the seven reference genes. Potential genes that may be constantly expressed under different conditions were selected; these genes showed stable expression levels in samples under a salinity treatment, while tubulin, glyceraldehyde-3-phosphate dehydrogenase and actin showed stability in samples stressed by strong light. Based on the results of the pulse-amplitude-modulation fluorometry, an absolute quantification was performed to obtain gene copy numbers in certain stress-treated samples. The stably expressed genes as determined by the absolute quantification in certain samples conformed to the results of the geNorm screening. Based on the results of the software analysis and absolute quantification, we proposed that elongation factor 3 and 18S ribosomal RNA could be used as internal control genes when the Py. haitanensis blades were subjected to salinity stress, and that α-tubulin and 18S ribosomal RNA could be used as the internal control genes when the stress was from strong light. In general, our findings provide a convenient reference for the selection of internal control genes when designing experiments related to stress responses in Py. haitanensis.
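
    For reference, the geNorm stability measure M that underlies this screening can be computed in a few lines: for gene j, M_j is the average, over all other genes k, of the standard deviation across samples of log2(expression_j / expression_k); lower M means a more stable reference. The expression values below are synthetic.

        import numpy as np

        def genorm_m(expr):
            """expr: (n_samples, n_genes) relative quantities; M per gene."""
            log_expr = np.log2(expr)
            m = np.empty(expr.shape[1])
            for j in range(expr.shape[1]):
                ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
                m[j] = ratios.std(axis=0, ddof=1).mean()
            return m

        rng = np.random.default_rng(9)
        stable = 2 ** rng.normal(0, 0.1, (12, 3))     # stable candidates
        variable = 2 ** rng.normal(0, 1.0, (12, 2))   # condition-dependent
        print(genorm_m(np.hstack([stable, variable])))  # lowest M = best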

  5. Ultrasensitive HIV-1 p24 Assay Detects Single Infected Cells and Differences in Reservoir Induction by Latency Reversal Agents.

    PubMed

    Passaes, Caroline Pereira Bittencourt; Bruel, Timothée; Decalf, Jérémie; David, Annie; Angin, Mathieu; Monceaux, Valerie; Muller-Trutwin, Michaela; Noel, Nicolas; Bourdic, Katia; Lambotte, Olivier; Albert, Matthew L; Duffy, Darragh; Schwartz, Olivier; Sáez-Cirión, Asier

    2017-03-15

    The existence of HIV reservoirs in infected individuals under combined antiretroviral therapy (cART) represents a major obstacle toward cure. Viral reservoirs are assessed by quantification of HIV nucleic acids, a method which does not discriminate between infectious and defective viruses, or by viral outgrowth assays, which require large numbers of cells and long-term cultures. Here, we used an ultrasensitive p24 digital assay, which we report to be 1,000-fold more sensitive than classical enzyme-linked immunosorbent assays (ELISAs), in the quantification of HIV-1 Gag p24 production in samples from HIV-infected individuals. Results from ultrasensitive p24 assays were compared to those from conventional viral RNA reverse transcription-quantitative PCR (RT-qPCR)-based assays and from outgrowth assays read out by flow cytometry. Using serial dilutions and flow-based single-cell sorting, we show that viral proteins produced by a single infected cell can be detected by the ultrasensitive p24 assay. This unique sensitivity allowed the early (as soon as day 1 in 43% of cases) and more efficient detection and quantification of p24 in phytohemagglutinin-L (PHA)-stimulated CD4+ T cells from individuals under effective cART. When seven different classes of latency reversal agents (LRA) were tested in resting CD4+ T cells from HIV-infected individuals, the ultrasensitive p24 assay revealed differences in the extent of HIV reactivation. Of note, HIV RNA production was infrequently accompanied by p24 protein production (19%). Among the drugs tested, prostratin showed a superior capacity in inducing viral protein production. In summary, the ultrasensitive p24 assay allows the detection and quantification of p24 produced by single infected CD4+ T cells and provides a unique tool to assess early reactivation of infectious virus from reservoirs in HIV-infected individuals. IMPORTANCE The persistence of HIV reservoirs in infected individuals under effective antiretroviral treatment represents a major obstacle toward cure. Different methods to estimate HIV reservoirs exist, but there is currently no optimal assay to measure HIV reservoirs in HIV eradication interventions. In the present study, we report an ultrasensitive digital ELISA platform for quantification of the HIV-1 protein p24. This method was employed to assess the early reactivation of infectious virus from reservoirs in HIV-1-infected individuals. We found that viral proteins produced by a single infected cell can be detected by an ultrasensitive p24 assay. This unprecedented resolution showed major advantages in comparison to other techniques currently used to assess viral replication in reactivation studies. In addition, such a highly sensitive assay allows discrimination of drug-induced reactivation of productive HIV based on protein expression. The present study heralds new opportunities to evaluate the HIV reservoir and the efficacy of drugs used to target it. Copyright © 2017 American Society for Microbiology.
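
    Digital assays of this kind are generally quantified with Poisson statistics: molecules distribute randomly over many partitions (beads), so the mean occupancy follows from the positive fraction f as λ = -ln(1 - f). The sketch below shows this general calculation; the bead count, positive count, and partition volume are illustrative, not the assay's actual calibration.

        import numpy as np

        def concentration_molar(n_positive, n_total, partition_volume_l):
            """Molar concentration from a digital (single-molecule) readout."""
            avogadro = 6.02214076e23
            lam = -np.log(1 - n_positive / n_total)  # mean molecules/partition
            return lam / (partition_volume_l * avogadro)

        # e.g. 1,900 positive out of 200,000 partitions of 50 fL each
        print(concentration_molar(1900, 200000, 50e-15))  # ~3e-13 M (0.3 pM)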

  6. Between the Under-Labourer and the Master-Builder: Observations on Bunge's Method

    NASA Astrophysics Data System (ADS)

    Agassi, Joseph

    2012-10-01

    Mario Bunge has repeatedly discussed contributions to philosophy and to science that are worthless at best and dangerous at worst, especially cases of pseudo-science. He clearly gives his reason in his latest essay on this matter: "The fact that science can be faked to the point of deceiving science lovers suggests the need for a rigorous sifting device". Moreover, this sifting has its rewards, as "sometimes intellectual gold comes mixed with muck". Furthermore, the sifting device is a demarcation of science, which answers interesting questions: what is valuable in science and what makes it tick? The question is under dispute. So before coming to it we should admit a few preliminary ideas that are more difficult to contest than ideas that purport to demarcate science.

  7. Features of Knowledge Building in Biology: Understanding Undergraduate Students' Ideas about Molecular Mechanisms.

    PubMed

    Southard, Katelyn; Wince, Tyler; Meddleton, Shanice; Bolger, Molly S

    2016-01-01

    Research has suggested that teaching and learning in molecular and cellular biology (MCB) is difficult. We used a new lens to understand undergraduate reasoning about molecular mechanisms: the knowledge-integration approach to conceptual change. Knowledge integration is the dynamic process by which learners acquire new ideas, develop connections between ideas, and reorganize and restructure prior knowledge. Semistructured, clinical think-aloud interviews were conducted with introductory and upper-division MCB students. Interviews included a written conceptual assessment, a concept-mapping activity, and an opportunity to explain the biomechanisms of DNA replication, transcription, and translation. Student reasoning patterns were explored through mixed-method analyses. Results suggested that students must sort mechanistic entities into appropriate mental categories that reflect the nature of MCB mechanisms and that conflation between these categories is common. We also showed how connections between molecular mechanisms and their biological roles are part of building an integrated knowledge network as students develop expertise. We observed differences in the nature of connections between ideas related to different forms of reasoning. Finally, we provide a tentative model for MCB knowledge integration and suggest its implications for undergraduate learning. © 2016 K. Southard et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  8. Beware the next big thing.

    PubMed

    Birkinshaw, Julian

    2014-05-01

    Innovative management ideas that bubble up in other companies pose a perennial quandary for leaders: Should you attempt to borrow new ideas, and if so, which ones and how? Even the most promising practices can be disastrous if they're transplanted into the wrong company, writes Julian Birkinshaw of London Business School. Broadly speaking, there are two ways to borrow from innovative companies, he argues. The first, observe and apply, is the most commonly used approach for adopting new management ideas. It can and does work well, but only under limited sets of circumstances: when the observed practice easily stands alone or involves just a small constellation of supporting behaviors (think of GE's well-regarded succession-planning process) and when a company's management model or way of thinking is very similar to the originator's (think of two software firms that both use the Agile development approach). The second method is to extract a management practice's essential principle-its underlying logic-and ask a series of questions to determine if it is right for your firm, including: How is your company different from the originating firm? Are the goals of the practice important to your organization? Many management innovations are launched with great fanfare, only to fade in popularity. With careful analysis, you can avoid falling prey to this hype cycle. And even if it turns out that a borrowed idea isn't right for you, the analysis will help you better understand your own management models and sharpen your practices.

  9. Quantification of localized vertebral deformities using a sparse wavelet-based shape model.

    PubMed

    Zewail, R; Elsafi, A; Durdle, N

    2008-01-01

    Medical experts often examine hundreds of spine x-ray images to determine the existence of various pathologies. Common pathologies of interest are anterior osteophytes, disc space narrowing, and wedging. By careful inspection of the outline shapes of the vertebral bodies, experts are able to identify and assess vertebral abnormalities with respect to the pathology under investigation. In this paper, we present a novel method for quantification of vertebral deformation using a sparse shape model. Using wavelets and independent component analysis (ICA), we construct a sparse shape model that benefits from the approximation power of wavelets and the capability of ICA to capture higher order statistics in wavelet space. The new model is able to capture localized pathology-related shape deformations, hence it allows for quantification of vertebral shape variations. We investigate the capability of the model to predict localized pathology-related deformations. Next, using support-vector machines, we demonstrate the diagnostic capabilities of the method through the discrimination of anterior osteophytes in lumbar vertebrae. Experiments were conducted using a set of 150 contours from digital x-ray images of the lumbar spine. Each vertebra is labeled as normal or abnormal. Results reported in this work focus on anterior osteophytes as the pathology of interest.
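
    A rough sketch of this pipeline under stated assumptions: vertebral outlines arrive as fixed-length contour vectors, wavelet coefficients feed ICA to obtain sparse components, and a support-vector machine separates normal from abnormal shapes. The synthetic "localized bump" stands in for an anterior osteophyte; none of this is the authors' code.

        import numpy as np
        import pywt
        from sklearn.decomposition import FastICA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(10)
        t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
        normal = np.sin(t) + 0.05 * rng.standard_normal((75, 128))
        bump = 0.4 * np.exp(-((t - 1.0) ** 2) / 0.05)   # localized deformity
        abnormal = np.sin(t) + bump + 0.05 * rng.standard_normal((75, 128))
        contours = np.vstack([normal, abnormal])
        labels = np.repeat([0, 1], 75)

        coeffs = np.array([np.concatenate(pywt.wavedec(c, "db4", level=4))
                           for c in contours])
        features = FastICA(n_components=10, random_state=0).fit_transform(coeffs)
        print(cross_val_score(SVC(kernel="rbf"), features, labels, cv=5).mean())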

  10. Characterization of the Rotating Exercise Quantification System (REQS), a novel Drosophila exercise quantification apparatus

    PubMed Central

    Watanabe, Louis Patrick

    2017-01-01

    Obesity is a disease that has reached epidemic proportions in the United States and has prompted international legislation in an attempt to curtail its prevalence. Despite the fact that one of the most prescribed treatment options for obesity is exercise, the genetic mechanisms underlying exercise response in individuals are still largely unknown. The fruit fly Drosophila melanogaster is a promising new model for studying exercise genetics. Currently, the lack of an accurate method to quantify the amount of exercise performed by the animals is limiting the utility of the Drosophila model for exercise genetics research. To address this limitation, we developed the Rotational Exercise Quantification System (REQS), a novel apparatus that is able to simultaneously induce exercise in flies while recording their activity levels. Thus, the REQS provides a method to standardize Drosophila exercise and ensure that all animals irrespective of genotype and sex experience the same level of exercise. Here, we provide a basic characterization of the REQS, validate its measurements using video-tracking technology, illustrate its potential use by presenting a comparison of two different exercise regimes, and demonstrate that it can be used to detect genotype-dependent variation in activity levels. PMID:29016615

  11. Adaptive quantification and longitudinal analysis of pulmonary emphysema with a hidden Markov measure field model.

    PubMed

    Hame, Yrjo; Angelini, Elsa D; Hoffman, Eric A; Barr, R Graham; Laine, Andrew F

    2014-07-01

    The extent of pulmonary emphysema is commonly estimated from CT scans by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols, and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the presented model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was applied on a longitudinal data set with 87 subjects and a total of 365 scans acquired with varying imaging protocols. The resulting emphysema estimates had very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. The generated emphysema delineations promise advantages for regional analysis of emphysema extent and progression.
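
    For contrast, the standard proportional-area baseline that this method improves upon is nearly a one-liner: the percentage of lung voxels below a fixed attenuation threshold (commonly -950 HU at full inspiration). The toy CT volume below is synthetic.

        import numpy as np

        def percent_emphysema(hu, lung_mask, threshold=-950):
            """Proportional volume of emphysema-like voxels in the lung."""
            lung = hu[lung_mask]
            return 100.0 * np.count_nonzero(lung < threshold) / lung.size

        rng = np.random.default_rng(11)
        ct = rng.normal(-870, 60, (64, 64, 64))   # toy parenchymal HU values
        mask = np.ones(ct.shape, dtype=bool)      # stand-in lung segmentation
        print(percent_emphysema(ct, mask))        # % of voxels below -950 HU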

  12. Computerized image analysis for quantitative neuronal phenotyping in zebrafish.

    PubMed

    Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C

    2006-06-15

    An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.

  13. Spectrometric microbiological analyzer

    NASA Astrophysics Data System (ADS)

    Schlager, Kenneth J.; Meissner, Ken E.

    1996-04-01

    Currently, there are four general approaches to microbiological analysis, i.e., the detection, identification and quantification of micro-organisms: (1) Traditional culturing and staining procedures, metabolic fermentations and visual morphological characteristics; (2) Immunological approaches employing microbe-specific antibodies; (3) Biotechnical techniques employing DNA probes and related genetic engineering methods; and (4) Physical measurement techniques based on the biophysical properties of micro-organisms. This paper describes an instrumentation development in the fourth of the above categories, physical measurement, that uses a combination of fluorometric and light scatter spectra to detect and identify micro-organisms at the species level. A major advantage of this approach is the rapid turnaround possible in medical diagnostic or water testing applications. Fluorometric spectra serve to define the biochemical characteristics of the microbe, and light scatter spectra the size and shape morphology. Together, the two spectra define a 'fingerprint' for each species of microbe for detection, identification and quantification purposes. A prototype instrument has been developed and tested under NASA sponsorship based on fluorometric spectra alone. This instrument demonstrated identification and quantification capabilities at the species level. The paper reports on test results using this instrument, and the benefits of employing a combination of fluorometric and light scatter spectra.

  14. Characterization of the Rotating Exercise Quantification System (REQS), a novel Drosophila exercise quantification apparatus.

    PubMed

    Watanabe, Louis Patrick; Riddle, Nicole C

    2017-01-01

    Obesity is a disease that has reached epidemic proportions in the United States and has prompted international legislation in an attempt to curtail its prevalence. Despite the fact that one of the most prescribed treatment options for obesity is exercise, the genetic mechanisms underlying exercise response in individuals are still largely unknown. The fruit fly Drosophila melanogaster is a promising new model for studying exercise genetics. Currently, the lack of an accurate method to quantify the amount of exercise performed by the animals is limiting the utility of the Drosophila model for exercise genetics research. To address this limitation, we developed the Rotational Exercise Quantification System (REQS), a novel apparatus that is able to simultaneously induce exercise in flies while recording their activity levels. Thus, the REQS provides a method to standardize Drosophila exercise and ensure that all animals irrespective of genotype and sex experience the same level of exercise. Here, we provide a basic characterization of the REQS, validate its measurements using video-tracking technology, illustrate its potential use by presenting a comparison of two different exercise regimes, and demonstrate that it can be used to detect genotype-dependent variation in activity levels.

  15. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS was explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
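
    As we read it, the selection idea can be sketched as follows (the correlation cutoff and data are our assumptions, not the paper's parameters): keep the proteotypic peptides whose intensity responses are mutually linearly correlated across runs, and quantify from those.

        import numpy as np

        def select_linear_peptides(intensities, r_min=0.98):
            """intensities: (n_runs, n_peptides). Keep peptides whose median
            pairwise Pearson correlation with the others exceeds r_min."""
            r = np.corrcoef(intensities, rowvar=False)
            np.fill_diagonal(r, np.nan)
            return np.flatnonzero(np.nanmedian(r, axis=0) >= r_min)

        rng = np.random.default_rng(12)
        load = np.linspace(1, 5, 8)[:, None]          # 8 runs, rising load
        good = load * rng.uniform(0.5, 2.0, (1, 6))   # 6 linear responders
        bad = rng.uniform(1, 10, (8, 2))              # 2 erratic peptides
        print(select_linear_peptides(np.hstack([good, bad])))  # -> 0..5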

  16. [Lauritz Toft and DUMEX--the man, the idea and the way to its realization].

    PubMed

    Overø, Knud

    2009-01-01

    Lauritz Toft (1920-1991), also known as Lau, graduated with an MSc degree in pharmacy from the Royal Danish School of Pharmacy in 1944. During his education and parallel activities Lau had shown gifts for intuition and improvisation, together with special talents for leadership and large-scale working. In 1945, within the organization of the East Asiatic Company Ltd., he got the idea to sell the best products of the Danish pharmaceutical companies in India under a common trademark: DUMEX (Danish United Medical Export). The article describes Lau's difficulties and problems in realizing this idea. The adventure peaked in the mid-1950s with about 40 pharmacists in DUMEX departments in India as well as Thailand, Malaysia, Indonesia, China etc.

  17. Special Education for Children with Attention Deficit Disorder: Current Issues. CRS Report for Congress.

    ERIC Educational Resources Information Center

    Aleman, Steven R.

    This paper examines issues concerning the eligibility of children with attention deficit disorder (ADD) for special education and related services under the Individuals with Disabilities Education Act (IDEA). A policy memorandum was issued by the Department of Education in September 1991, identifying those circumstances under which such children…

  18. Monsters under the Bed: Critically Investigating Early Years Writing

    ERIC Educational Resources Information Center

    Melrose, Andrew

    2012-01-01

    "Monsters Under the Bed" is an essential text focussing on critical and contemporary issues surrounding writing for "early years" children. Containing a critically creative and a creatively critical investigation of the cult and culture of the child and childhood in fiction and non-fictional writing, it also contains a wealth of ideas and critical…

  19. Global composites of surface wind speeds in tropical cyclones based on a 12 year scatterometer database

    NASA Astrophysics Data System (ADS)

    Klotz, Bradley W.; Jiang, Haiyan

    2016-10-01

    A 12 year global database of rain-corrected satellite scatterometer surface winds for tropical cyclones (TCs) is used to produce composites of TC surface wind speed distributions relative to vertical wind shear and storm motion directions in each TC-prone basin and various TC intensity stages. These composites corroborate ideas presented in earlier studies, where maxima are located right of motion in the Earth-relative framework. The entire TC surface wind asymmetry is down motion left for all basins and for lower strength TCs after removing the motion vector. Relative to the shear direction, the motion-removed composites indicate that the surface wind asymmetry is located down shear left for the outer region of all TCs, but for the inner-core region it varies from left of shear to down shear right for different basin and TC intensity groups. Quantification of the surface wind asymmetric structure in further stratifications is a necessary next step for this scatterometer data set.

  20. Size-dependent forced PEG partitioning into channels: VDAC, OmpC, and α-hemolysin

    DOE PAGES

    Aksoyoglu, M. Alphan; Podgornik, Rudolf; Bezrukov, Sergey M.; ...

    2016-07-27

    Nonideal polymer mixtures of PEGs of different molecular weights partition differently into nanosize protein channels. Here, we assess the validity of the recently proposed theoretical approach of forced partitioning for three structurally different β-barrel channels: voltage-dependent anion channel from outer mitochondrial membrane VDAC, bacterial porin OmpC (outer membrane protein C), and bacterial channel-forming toxin α-hemolysin. Our interpretation is based on the idea that relatively less-penetrating polymers push the more easily penetrating ones into nanosize channels in excess of their bath concentration. Comparison of the theory with experiments is excellent for VDAC. Polymer partitioning data for the other two channels are consistent with theory if additional assumptions regarding the energy penalty of pore penetration are included. In conclusion, the obtained results demonstrate that the general concept of "polymers pushing polymers" is helpful in understanding and quantification of concrete examples of size-dependent forced partitioning of polymers into protein nanopores.

  1. Size-dependent forced PEG partitioning into channels: VDAC, OmpC, and α-hemolysin

    PubMed Central

    Aksoyoglu, M. Alphan; Podgornik, Rudolf; Bezrukov, Sergey M.; Gurnev, Philip A.; Muthukumar, Murugappan; Parsegian, V. Adrian

    2016-01-01

    Nonideal polymer mixtures of PEGs of different molecular weights partition differently into nanosize protein channels. Here, we assess the validity of the recently proposed theoretical approach of forced partitioning for three structurally different β-barrel channels: voltage-dependent anion channel from outer mitochondrial membrane VDAC, bacterial porin OmpC (outer membrane protein C), and bacterial channel-forming toxin α-hemolysin. Our interpretation is based on the idea that relatively less-penetrating polymers push the more easily penetrating ones into nanosize channels in excess of their bath concentration. Comparison of the theory with experiments is excellent for VDAC. Polymer partitioning data for the other two channels are consistent with theory if additional assumptions regarding the energy penalty of pore penetration are included. The obtained results demonstrate that the general concept of “polymers pushing polymers” is helpful in understanding and quantification of concrete examples of size-dependent forced partitioning of polymers into protein nanopores. PMID:27466408

  2. Digital Inequalities in the Use of Self-Tracking Diet and Fitness Apps: Interview Study on the Influence of Social, Economic, and Cultural Factors

    PubMed Central

    Chauvel, Louis

    2018-01-01

    Background Digital devices are driving economic and social transformations, but assessing the uses, perceptions, and impact of these new technologies on diet and physical activity remains a major societal challenge. Objective We aimed to determine under which social, economic, and cultural conditions individuals in France were more likely to be actively invested in the use of self-tracking diet and fitness apps for better health behaviors. Methods Existing users of 3 diet and fitness self-tracking apps (Weight Watchers, MyFitnessPal, and sport apps) were recruited from 3 regions of France. We interviewed 79 individuals (Weight Watchers, n=37; MyFitnessPal, n=20; sport apps, n=22). In-depth semistructured interviews were conducted with each participant, using open-ended questions about their use of diet and fitness apps. A triangulation of methods (content, textual, and quantitative analyses) was performed. Results We found 3 clusters of interviewees who differed by social background and curative goal linked to use under constraint versus preventive goal linked to chosen use, and intensity of their self-quantification efforts and participation in social networks. Interviewees used the apps for a diversity of uses, including measurement, tracking, quantification, and participation in digital communities. A digital divide was highlighted, comprising a major social gap. Social conditions for appropriation of self-tracking devices included sociodemographic factors, life course stages, and cross-cutting factors of heterogeneity. Conclusions Individuals from affluent or intermediate social milieus were most likely to use the apps and to participate in the associated online social networks. These interviewees also demonstrated a preventive approach to a healthy lifestyle. Individuals from lower milieus were more reluctant to use digital devices relating to diet and physical activity or to participate in self-quantification. The results of the study have major implications for public health: the digital self-quantification device is intrinsically less important than the way the individual uses it, in terms of adoption of successful health behaviors. PMID:29678807

  3. Digital Inequalities in the Use of Self-Tracking Diet and Fitness Apps: Interview Study on the Influence of Social, Economic, and Cultural Factors.

    PubMed

    Régnier, Faustine; Chauvel, Louis

    2018-04-20

    Digital devices are driving economic and social transformations, but assessing the uses, perceptions, and impact of these new technologies on diet and physical activity remains a major societal challenge. We aimed to determine under which social, economic, and cultural conditions individuals in France were more likely to be actively invested in the use of self-tracking diet and fitness apps for better health behaviors. Existing users of 3 diet and fitness self-tracking apps (Weight Watchers, MyFitnessPal, and sport apps) were recruited from 3 regions of France. We interviewed 79 individuals (Weight Watchers, n=37; MyFitnessPal, n=20; sport apps, n=22). In-depth semistructured interviews were conducted with each participant, using open-ended questions about their use of diet and fitness apps. A triangulation of methods (content, textual, and quantitative analyses) was performed. We found 3 clusters of interviewees who differed by social background and curative goal linked to use under constraint versus preventive goal linked to chosen use, and intensity of their self-quantification efforts and participation in social networks. Interviewees used the apps for a diversity of uses, including measurement, tracking, quantification, and participation in digital communities. A digital divide was highlighted, comprising a major social gap. Social conditions for appropriation of self-tracking devices included sociodemographic factors, life course stages, and cross-cutting factors of heterogeneity. Individuals from affluent or intermediate social milieus were most likely to use the apps and to participate in the associated online social networks. These interviewees also demonstrated a preventive approach to a healthy lifestyle. Individuals from lower milieus were more reluctant to use digital devices relating to diet and physical activity or to participate in self-quantification. The results of the study have major implications for public health: the digital self-quantification device is intrinsically less important than the way the individual uses it, in terms of adoption of successful health behaviors. ©Faustine Régnier, Louis Chauvel. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 20.04.2018.

  4. Single-shot EPI with Nyquist ghost compensation: Interleaved Dual-Echo with Acceleration (IDEA) EPI

    PubMed Central

    Poser, Benedikt A; Barth, Markus; Goa, Pål-Erik; Deng, Weiran; Stenger, V Andrew

    2012-01-01

    Echo planar imaging is most commonly used for BOLD fMRI, owing to its sensitivity and acquisition speed. A major problem with EPI is Nyquist (N/2) ghosting, most notably at high field. EPI data are acquired under an oscillating readout gradient and hence vulnerable to gradient imperfections such as eddy current delays and off-resonance effects, as these cause inconsistencies between odd and even k-space lines after time reversal. We propose a straightforward and pragmatic method herein termed Interleaved Dual Echo with Acceleration (IDEA) EPI: Two k-spaces (echoes) are acquired under the positive and negative readout lobes, respectively, by performing phase blips only before alternate readout gradients. From these two k-spaces, two almost entirely ghost free images per shot can be constructed, without need for phase correction. The doubled echo train length can be compensated by parallel imaging and/or partial Fourier acquisition. The two k-spaces can either be complex-averaged during reconstruction, which results in near-perfect cancellation of residual phase errors, or reconstructed into separate images. We demonstrate the efficacy of IDEA EPI and show phantom and in vivo images at both 3 and 7 Tesla. PMID:22411762
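
    The complex-averaging step can be seen in a toy model: the two single-polarity images carry approximately opposite phase errors, so averaging them as complex data cancels the error to first order. The linear phase below stands in for eddy-current and off-resonance effects; it is not a full EPI simulation.

        import numpy as np

        image = np.zeros((64, 64))
        image[20:44, 24:40] = 1.0                      # toy object
        theta = 0.3 * np.linspace(-1, 1, 64)[None, :]  # phase error (rad)

        img_pos = image * np.exp(1j * theta)    # echo, positive readout lobes
        img_neg = image * np.exp(-1j * theta)   # echo, negative readout lobes
        combined = 0.5 * (img_pos + img_neg)    # = image * cos(theta)

        print(np.abs(img_pos - image).max())    # ~|theta|: first-order error
        print(np.abs(combined - image).max())   # ~theta**2/2: residual only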

  5. Drugged driving in Louisiana: quantification of its impact on public health and implications for legislation, enforcement, and prosecution : final report.

    DOT National Transportation Integrated Search

    2017-02-01

    Drugged driving, i.e., driving under the influence of drugs, is considered a rising public health issue in the US and the rest of the world, yet due to underreporting and limitations of existing data, not much is known about the frequency of drugged ...

  6. Optimized co-extraction and quantification of DNA from enteric pathogens in surface water samples near produce fields in California

    USDA-ARS?s Scientific Manuscript database

    Pathogen contamination of surface water is a health hazard in agricultural environments primarily due to the potential for contamination of crops. Furthermore, pathogen levels in surface water are often unreported or under reported due to difficulty with culture of the bacteria. The pathogens are of...

  7. Molecular Mechanisms of Insulin Secretion and Insulin Action.

    ERIC Educational Resources Information Center

    Flatt, Peter R.; Bailey, Clifford J.

    1991-01-01

    Information and current ideas on the factors regulating insulin secretion, the mechanisms underlying the secretion and biological actions of insulin, and the main characteristics of diabetes mellitus are presented. (Author)

  8. Integrated Design Engineering Analysis (IDEA) Environment - Aerodynamics, Aerothermodynamics, and Thermal Protection System Integration Module

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2011-01-01

    This report documents the work performed from March 2010 to October 2011. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative, object-oriented, multidisciplinary, distributed environment using the Adaptive Modeling Language (AML) as the underlying framework. This report focuses on describing the work done in the area of extending the aerodynamics and aerothermodynamics module using S/HABP, CBAERO, PREMIN and LANMIN. It also details the work done integrating EXITS as the TPS sizing tool.

  9. Cybernetics of Brief Family Therapy.

    ERIC Educational Resources Information Center

    Keeney, Bradford P.; Ross, Jeffrey M.

    1983-01-01

    Presents a cybernetic view of brief family therapy. Includes a historical discussion of the key ideas underlying brief family therapy, a cybernetic model of therapeutic change, and a clinical case for exemplification. (Author/JAC)

  10. Genital hiatus size is associated with and predictive of apical vaginal support loss.

    PubMed

    Lowder, Jerry L; Oliphant, Sallie S; Shepherd, Jonathan P; Ghetti, Chiara; Sutkin, Gary

    2016-06-01

    Recognition and assessment of apical vaginal support defects remains a significant challenge in the evaluation and management of prolapse. There are several likely reasons for this: (1) although the Pelvic Organ Prolapse-Quantification examination is the standard prolapse staging system used in the Female Pelvic Medicine and Reconstructive Surgery field for reporting outcomes, this assessment is not used commonly in clinical care outside the subspecialty; (2) no clinically useful and accepted definition of apical support loss exists; and (3) no consensus or guidelines address the degree of apical support loss at which an apical support procedure should be performed routinely. The purpose of this study was to identify a simple screening measure for significant loss of apical vaginal support. This was an analysis of women with Pelvic Organ Prolapse-Quantification stage 0-IV prolapse. Women with total vaginal length of ≥7 cm were included to define a population with "normal" vaginal length. Univariable and linear regression analyses were used to identify Pelvic Organ Prolapse-Quantification points that were associated with 3 definitions of apical support loss: the International Consultation on Incontinence, the Pelvic Floor Disorders Network revised eCARE, and a Pelvic Organ Prolapse-Quantification point C cut-point developed by Dietz et al. Linear and logistic regression models were created to assess predictors of overall apical support loss according to these definitions. Receiver operator characteristic curves were generated to determine test characteristics of the predictor variables, and the areas under the curves were calculated. Of 469 women, 453 met the inclusion criterion. The median Pelvic Organ Prolapse-Quantification stage was III, and the median leading edge of prolapse was +2 cm (range, -3 to 12 cm). By stage of prolapse (0-IV), mean genital hiatus size (genital hiatus; mid urethra to posterior fourchette) increased: 2.0 ± 0.5, 3.0 ± 0.5, 4.0 ± 1.0, 5.0 ± 1.0, and 6.5 ± 1.5 cm, respectively (P < .01). Pelvic Organ Prolapse-Quantification points B anterior, B posterior, and genital hiatus had moderate-to-strong associations with overall apical support loss and all definitions of apical support loss. Linear regression models predicting overall apical support loss and logistic regression models predicting apical support loss as defined by the International Continence Society, eCARE, and the point C cut-point definitions were fit with points B anterior, B posterior, and genital hiatus; these 3 points explained more than one-half of the model variance. Receiver operator characteristic analysis for all definitions of apical support loss found that genital hiatus >3.75 cm was highly predictive of apical support loss (area under the curve, >0.8 in all models). Increasing genital hiatus size is highly associated with and predictive of apical vaginal support loss. Specifically, the Pelvic Organ Prolapse-Quantification measurement genital hiatus of ≥3.75 cm is highly predictive of apical support loss by all study definitions. This simple measurement can be used to screen for apical support loss and the need for further evaluation of apical vaginal support before planning a hysterectomy or prolapse surgery. Copyright © 2015 Elsevier Inc. All rights reserved.
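
    The receiver-operating-characteristic analysis reported above can be reproduced in outline on synthetic data (the distributions below are invented; only the 3.75 cm cut-point comes from the study):

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(14)
        gh_intact = rng.normal(3.0, 0.8, 300)    # GH (cm), no apical loss
        gh_loss = rng.normal(5.0, 1.2, 150)      # GH (cm), apical loss
        gh = np.concatenate([gh_intact, gh_loss])
        loss = np.concatenate([np.zeros(300), np.ones(150)])

        print("AUC:", roc_auc_score(loss, gh))   # expect > 0.8
        positive = gh >= 3.75                    # screening cut-point
        print("sensitivity:", positive[loss == 1].mean())
        print("specificity:", (~positive[loss == 0]).mean())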

  11. MPQ-cytometry: a magnetism-based method for quantification of nanoparticle-cell interactions

    NASA Astrophysics Data System (ADS)

    Shipunova, V. O.; Nikitin, M. P.; Nikitin, P. I.; Deyev, S. M.

    2016-06-01

    Precise quantification of interactions between nanoparticles and living cells is among the imperative tasks for research in nanobiotechnology, nanotoxicology and biomedicine. To meet this challenge, a rapid method called MPQ-cytometry has been developed, which measures the integral non-linear response produced by magnetically labeled nanoparticles in a cell sample with an original magnetic particle quantification (MPQ) technique. MPQ-cytometry provides a sensitivity limit of 0.33 ng of nanoparticles and is devoid of the background signal present in many label-based assays. Each measurement takes only a few seconds, and no complicated sample preparation or data processing is required. The capabilities of the method have been demonstrated by quantification of interactions of iron oxide nanoparticles with eukaryotic cells. The total amount of targeted nanoparticles that specifically recognized the HER2/neu oncomarker on the human cancer cell surface was successfully measured, the specificity of interaction permitting the detection of HER2/neu-positive cells in a cell mixture. Moreover, it has been shown that MPQ-cytometry analysis of a HER2/neu-specific iron oxide nanoparticle interaction with six cell lines of different tissue origins quantitatively reflects the HER2/neu status of the cells. High correlation of MPQ-cytometry data with those obtained by three other methods commonly used in molecular and cell biology supports consideration of this method as a prospective alternative for both quantifying cell-bound nanoparticles and estimating the expression level of cell surface antigens. The proposed method does not require expensive sophisticated equipment or highly skilled personnel, and it can be easily applied for rapid diagnostics, especially under field conditions. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr03507h
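
    The MPQ readout itself is not specified in this record; as a minimal sketch, assume the instrument returns a signal proportional to the mass of magnetic label and that a standard dilution series is available. All numbers and the function name below are hypothetical.

      import numpy as np

      # Hypothetical calibration series: nanoparticle mass (ng) vs MPQ signal (a.u.)
      mass_ng = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
      signal = np.array([0.9, 1.8, 3.7, 9.2, 18.5, 37.1])

      slope, intercept = np.polyfit(mass_ng, signal, 1)

      def cell_bound_mass(s):
          """Invert the linear calibration to estimate the mass of
          cell-bound, magnetically labeled nanoparticles."""
          return (s - intercept) / slope

      print(f"{cell_bound_mass(12.4):.2f} ng of nanoparticles")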

  12. Pulsed glow discharge enables direct mass spectrometric measurement of fluorine in crystal materials - Fluorine quantification and depth profiling in fluorine doped potassium titanyl phosphate

    NASA Astrophysics Data System (ADS)

    Bodnar, Victoria; Ganeev, Alexander; Gubal, Anna; Solovyev, Nikolay; Glumov, Oleg; Yakobson, Viktor; Murin, Igor

    2018-07-01

    A pulsed direct current glow discharge time-of-flight mass spectrometry (GD TOF MS) method for the quantification of fluorine in insoluble crystal materials has been proposed, with fluorine-doped potassium titanyl phosphate (KTP) KTiOPO4:KF as an example. The following parameters were optimized: repelling pulse delay, discharge duration, discharge voltage, and pressure in the discharge cell. Effective ionization of fluorine in the space between sampler and skimmer under short repelling pulse delay, related to the high-energy electron impact at the discharge front, has been demonstrated. A combination of instrumental and mathematical correction approaches was used to correct for the interferences of 38Ar2+ and 1H316O+ on 19F+. To maintain surface conductivity in the dielectric KTP crystals and ensure their effective sputtering in the combined hollow cathode cell, a silver suspension applied by dip-coating was employed. Fluorine quantification was performed using relative sensitivity factors. Analysis of a reference material and scanning electron microscopy-energy dispersive X-ray spectroscopy were used for validation. The fluorine limit of detection by pulsed direct current GD TOF MS was 0.01 mass%. Real sample analysis showed that fluorine appears to be inhomogeneously distributed in the crystals; depth profiling of F, K, O, and P was therefore performed to evaluate the crystals' non-stoichiometry. The approaches designed allow fluorine quantification in insoluble dielectric materials with minimal sample preparation and sample destruction, as well as depth profiling to assess crystal non-stoichiometry.
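
    In its simplest textbook form, quantification by relative sensitivity factors scales an analyte/reference ion-intensity ratio by an empirically determined factor. The sketch below uses that generic form with made-up intensities; it is not the paper's actual calibration.

      def conc_by_rsf(i_analyte, i_ref, c_ref, rsf):
          """Simplest RSF quantification: analyte concentration from the
          analyte/reference ion-intensity ratio, the reference element
          concentration, and an RSF from a certified reference material."""
          return rsf * (i_analyte / i_ref) * c_ref

      # Hypothetical interference-corrected 19F+ and matrix-ion intensities
      c_F = conc_by_rsf(i_analyte=1.2e4, i_ref=3.0e6, c_ref=13.3, rsf=2.1)
      print(f"F concentration ~ {c_F:.3f} mass%")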

  13. Induced nanoparticle aggregation for short nucleic acid quantification by depletion isotachophoresis.

    PubMed

    Marczak, Steven; Senapati, Satyajyoti; Slouka, Zdenek; Chang, Hsueh-Chia

    2016-12-15

    A rapid (<20 min) gel-membrane biochip platform for the detection and quantification of short nucleic acids is presented, based on a sandwich assay with probe-functionalized gold nanoparticles and their separation into concentrated bands by depletion-generated gel isotachophoresis. The platform sequentially exploits the enrichment and depletion phenomena of an ion-selective cation-exchange membrane created under an applied electric field. Enrichment is used to concentrate the nanoparticles and targets at a localized position at the gel-membrane interface for rapid hybridization. The depletion generates an isotachophoretic zone without the need for different conductivity buffers, and is used to separate linked nanoparticles from isolated ones, first in the gel medium and then by field-enhanced aggregation of only the linked particles at the depletion front. The selective field-induced aggregation of the linked nanoparticles during the subsequent depletion step produces two lateral-flow-like bands within 1 cm for easy visualization and quantification, as the aggregates have negligible electrophoretic mobility in the gel and the isolated nanoparticles are isotachophoretically packed against the migrating depletion front. The detection limit for 69-base single-stranded DNA targets is 10 pM (about 10 million copies for our sample volume) with high selectivity against nontargets and a three-decade linear range for quantification. The selectivity and signal intensity are maintained in heterogeneous mixtures where the nontargets outnumber the targets 10,000 to 1. The selective field-induced aggregation of DNA-linked nanoparticles at the ion depletion front is attributed to their trailing position at the isotachophoretic front with a large field gradient. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. PEGylated human serum albumin (HSA) nanoparticles: preparation, characterization and quantification of the PEGylation extent

    NASA Astrophysics Data System (ADS)

    Fahrländer, E.; Schelhaas, S.; Jacobs, A. H.; Langer, K.

    2015-04-01

    Modification with poly(ethylene glycol) (PEG) is a widely used method for prolonging the plasma half-life of colloidal carrier systems such as nanoparticles prepared from human serum albumin (HSA). However, the quantification of the PEGylation extent is still challenging. Moreover, the influence of the different PEG derivatives commonly used for nanoparticle conjugation has not been investigated so far. The objective of the present study was to develop a method for the quantification of PEG and to monitor the influence of diverse PEG reagents on the amount of PEG linked to the surface of HSA nanoparticles. A size exclusion chromatography method with refractive index detection was established, which enabled the quantification of unreacted PEG in the supernatant. The results were confirmed using a fluorescent PEG derivative, which was detected by photometry and fluorimetry. Additionally, PEGylated HSA nanoparticles were enzymatically digested and the linked amount of fluorescently active PEG was directly determined. All the analytical methods confirmed that under optimized PEGylation conditions a PEGylation efficiency of up to 0.5 mg PEG per mg nanoparticle could be achieved. Model calculations made a ‘brush’ conformation of the PEG chains on the particle surface very likely. Incubation of the nanoparticles with fetal bovine serum demonstrated reduced adsorption of serum proteins on PEGylated compared with non-PEGylated HSA nanoparticles, as shown by sodium dodecylsulfate polyacrylamide gel electrophoresis. Finally, the positive effect of PEGylation on plasma half-life was demonstrated in an in vivo study in mice: compared with unmodified nanoparticles, PEGylation led to a fourfold longer plasma half-life.
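
    The 'brush' inference can be reproduced from first principles by comparing the spacing between grafted chains with the Flory radius of a free PEG coil; a brush regime is expected when the spacing is smaller. Every parameter below (particle size and density, PEG molar mass, monomer length) is an assumption for illustration, not a value from the study.

      import math

      # Assumed parameters: 200 nm HSA particle, density 1.3 g/cm3,
      # 5 kDa PEG, 0.5 mg PEG per mg nanoparticle (the reported efficiency).
      r_nm, rho_g_cm3, peg_mw, peg_per_np = 100.0, 1.3, 5000.0, 0.5
      NA = 6.022e23

      particle_mass_g = (4 / 3) * math.pi * (r_nm * 1e-7) ** 3 * rho_g_cm3
      n_chains = peg_per_np * particle_mass_g / (peg_mw / NA)
      area_per_chain_nm2 = 4 * math.pi * r_nm ** 2 / n_chains
      D = math.sqrt(area_per_chain_nm2)   # mean distance between grafts, nm

      N = peg_mw / 44.0                   # ethylene oxide monomers per chain
      R_F = 0.35 * N ** 0.6               # Flory radius, nm (a = 0.35 nm)

      print(f"D = {D:.2f} nm, R_F = {R_F:.1f} nm ->",
            "brush" if D < R_F else "mushroom")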

  15. Simultaneous quantification and splenocyte-proliferating activities of nucleosides and bases in Cervi cornu Pantotrichum

    PubMed Central

    Zong, Ying; Wang, Yu; Li, Hang; Li, Na; Zhang, Hui; Sun, Jiaming; Niu, Xiaohui; Gao, Xiaochen

    2014-01-01

    Background: Cervi Cornu Pantotrichum, the young horn of Cervus nippon Temminck (Hualurong: HLR), is a well-known traditional Chinese medicine. At present, the methods used for the quality control of Cervi Cornu Pantotrichum show low specificity. Objective: To describe a holistic method based on chemical characteristics and splenocyte-proliferating activities to evaluate the quality of HLR. Materials and Methods: The nucleosides and bases from HLR were identified by high performance liquid chromatography electrospray ionization mass spectrometry (HPLC-ESI-MS), and six of them were chosen for simultaneous HPLC quantification according to the results of proliferation of mouse splenocytes in vitro. Results: In this study, eight nucleosides and bases were identified, of which uracil, hypoxanthine, uridine, inosine, guanosine, and adenosine were chosen for simultaneous HPLC quantification. Simultaneous quantification of these six substances was performed on ten groups of HLR using a TIANHE Kromasil C18 column (5 μm, 4.6 mm × 250 mm i.d.) and gradient elution with water and acetonitrile. Among the ten groups, the HLR with the highest total nucleoside content (TNC; sum of adenosine and uracil; 0.412 mg/g) also displayed the strongest splenocyte-proliferating activities. Conclusion: These results suggest that the TNC of HLR (in particular the highly abundant adenosine and uracil) correlates with splenocyte-proliferating activity and may be used as a quality-control marker for HLR. This comprehensive method could be applied to other traditional Chinese medicines to improve their quality control. PMID:25422536

  16. High-Throughput HPLC-MS/MS Method for Quantification of Ibuprofen Enantiomers in Human Plasma: Focus on Investigation of Metabolite Interference.

    PubMed

    Nakov, Natalija; Bogdanovska, Liljana; Acevska, Jelena; Tonic-Ribarska, Jasmina; Petkovska, Rumenka; Dimitrovska, Aneta; Kasabova, Lilia; Svinarov, Dobrin

    2016-11-01

    In this research, as part of the development of a fast and reliable HPLC-MS/MS method for the quantification of ibuprofen (IBP) enantiomers in human plasma, the possibility of IBP acylglucuronide (IBP-Glu) back-conversion was assessed. This involved investigation of in-source and in vitro back-conversion. The separation of the IBP enantiomers, the metabolite, and rac-IBP-d3 (internal standard) was achieved within 6 min using a Chiracel OJ-RH chromatographic column (150 × 2.1 mm, 5 μm). The monitored selected reaction monitoring transitions for IBP-Glu (m/z 381.4 → 205.4, m/z 381.4 → 161.4 and m/z 205.4 → 161.4) showed that, under the optimized electrospray ionization parameters, in-source back-conversion of IBP-Glu was insignificant. The results obtained after liquid-liquid extraction of plasma samples spiked with IBP-Glu revealed that the amount of IBP enantiomers generated by IBP-Glu back-conversion was far below 20% of the lower limit of quantification sample. These results indicate that the presence of IBP-Glu in real samples will not affect the quantification of the IBP enantiomers, thereby improving the reliability of the method. An additional advantage of the method is its short analysis time, which makes it suitable for large numbers of samples. The method was fully validated according to the EMA guideline and was shown to meet all requirements to be applied in a pharmacokinetic study. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
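
    The acceptance logic for metabolite back-conversion can be written down compactly. The 20%-of-LLOQ fraction follows the EMA bioanalytical guideline cited above; the numeric values are placeholders.

      def back_conversion_ok(measured_ibp, lloq, limit_fraction=0.20):
          """Check that IBP generated from IBP-Glu in a spiked sample stays
          below 20% of the lower limit of quantification (EMA criterion).
          Concentrations in any common unit, e.g. ng/mL."""
          return measured_ibp < limit_fraction * lloq

      # Placeholder values for an IBP-Glu-spiked plasma sample after extraction
      print(back_conversion_ok(measured_ibp=4.0, lloq=50.0))  # True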

  17. Visualization of LC-MS/MS proteomics data in MaxQuant.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Carlson, Arthur; Sinitcyn, Pavel; Mann, Matthias; Cox, Juergen

    2015-04-01

    Modern software platforms enable the analysis of shotgun proteomics data in an automated fashion, resulting in high-quality identification and quantification results. Additional understanding of the underlying data can be gained with the help of advanced visualization tools that allow easy navigation through large LC-MS/MS datasets potentially consisting of terabytes of raw data. The updated MaxQuant version has a map navigation component that steers users through mass- and retention time-dependent mass spectrometric signals. It can be used to monitor a peptide feature used in label-free quantification over many LC-MS runs and visualize it with advanced 3D graphic models. An expert annotation system aids the interpretation of the MS/MS spectra used for the identification of these peptide features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Categories and Underlying Processes, or Representative Behavior Samples and S-R Analysis: Opposing Strategies.

    ERIC Educational Resources Information Center

    Staats, Arthur W.

    Psychological researchers should deal with the concrete stimulus-response principles of learning on which behavior is based, and study behaviors that are representative of real life behaviors. The present research strategy has come from two faulty ideas: first, a concern with underlying, inferred mental processes, rather than with actual tasks or…

  19. Service Animals for Students with Disabilities under IDEA and Section 504 of the Rehabilitation Act of 1973

    ERIC Educational Resources Information Center

    Berry, Jennifer; Katsiyannis, Antonis

    2012-01-01

    In 2007, services under the Individuals with Disabilities Education Improvement Act were provided to almost 6 million students with disabilities (Data Accountability Center, 2011). By virtue of their eligibility, these students were entitled to a "free and appropriate public education" (FAPE). To ensure that students receive FAPE,…

  20. Convex geometry of quantum resource quantification

    NASA Astrophysics Data System (ADS)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, and a measure of k-coherence which generalises the…
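
    To make the pure-state simplification concrete, one closed form reported in the coherence literature is the l1-norm of coherence of a pure state with amplitudes c_i, which equals (sum_i |c_i|)^2 - 1 and, for pure states, coincides with the robustness of coherence. A minimal sketch (our illustration, not code from the paper):

      import numpy as np

      def l1_coherence_pure(c):
          """l1-norm coherence of a pure state with amplitudes c in a fixed
          basis: sum_{i != j} |c_i c_j| = (sum_i |c_i|)**2 - 1 when normalized."""
          c = np.asarray(c, dtype=complex)
          c = c / np.linalg.norm(c)
          return np.abs(c).sum() ** 2 - 1

      print(l1_coherence_pure([1, 1]))      # maximally coherent qubit: 1.0
      print(l1_coherence_pure([1, 1, 1]))   # maximally coherent qutrit: 2.0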

  1. Deathcore, creativity, and scientific thinking

    USGS Publications Warehouse

    Angeler, David G.; Sundstrom, Shana M.; Allen, Craig R.

    2016-01-01

    Background: Major scientific breakthroughs are generally the result of materializing creative ideas, the result of an inductive process that sometimes spontaneously and unexpectedly generates a link between thoughts and/or objects that did not exist before. Creativity is the cornerstone of scientific thinking, but scientists in academia are judged by metrics of quantification that often leave little room for creative thinking. In many scientific fields, reductionist approaches are rewarded and new ideas viewed skeptically. As a result, scientific inquiry is often confined to narrow but safe disciplinary ivory towers, effectively preventing profoundly creative explorations that could yield unexpected benefits. New information: This paper argues that apparently unrelated fields, specifically music and belief systems, can be combined in a provocative allegory to provide novel perspectives regarding patterns in nature, thereby potentially inspiring innovation in the natural, social and other sciences. The merger between basic human tensions such as those embodied by religion and music, for example the heavy metal genre of deathcore, may be perceived as controversial, challenging, and uncomfortable. However, it is an example of moving the thinking process out of unconsciously established comfort zones, through the connection of apparently unrelated entities. We argue that music, as an auditory art form, has the potential to enlighten and boost creative thinking in science. Metal, as a fast evolving and diversifying extreme form of musical art, may be particularly suitable to trigger surprising associations in scientific inquiry. This may pave the way for dealing with questions about what we don't know that we don't know on a fast-changing planet.

  2. Lead users' ideas on core features to support physical activity in rheumatoid arthritis: a first step in the development of an internet service using participatory design.

    PubMed

    Revenäs, Åsa; Opava, Christina H; Åsenlöf, Pernilla

    2014-03-22

    Despite the growing evidence of the benefits of physical activity (PA) in individuals with rheumatoid arthritis (RA), the majority are not physically active enough. An innovative strategy is to engage lead users in the development of PA interventions provided over the internet. The aim was to explore lead users' ideas and prioritization of core features in a future internet service targeting adoption and maintenance of healthy PA in people with RA. Six focus group interviews were performed with a purposively selected sample of 26 individuals with RA. Data were analyzed with qualitative content analysis and quantification of participants' prioritization of the most important content. Six categories were identified as core features for a future internet service: up-to-date and evidence-based information and instructions, self-regulation tools, social interaction, personalized set-up, attractive design and content, and access to the internet service. The categories represented four themes, or core aspects, important to consider in the design of the future service: (1) content, (2) customized options, (3) user interface and (4) access and implementation. This is, to the best of our knowledge, the first study involving people with RA in the development of an internet service to support the adoption and maintenance of PA. Participants helped identify core features and aspects important to consider and further explore during the next phase of development. We hypothesize that involvement of lead users will make the transfer from theory to service more adequate and user-friendly, and will therefore be an effective means to facilitate PA behavior change.

  3. Managing expectations when publishing tools and methods for computational proteomics.

    PubMed

    Martens, Lennart; Kohlbacher, Oliver; Weintraub, Susan T

    2015-05-01

    Computational tools are pivotal in proteomics because they are crucial for identification, quantification, and statistical assessment of data. The gateway to finding the best choice of a tool or approach for a particular problem is frequently journal articles, yet there is often an overwhelming variety of options that makes it hard to decide on the best solution. This is particularly difficult for nonexperts in bioinformatics. The maturity, reliability, and performance of tools can vary widely because publications may appear at different stages of development. A novel idea might merit early publication despite only offering proof-of-principle, while it may take years before a tool can be considered mature, and by that time it might be difficult for a new publication to be accepted because of a perceived lack of novelty. After discussions with members of the computational mass spectrometry community, we describe here proposed recommendations for organization of informatics manuscripts as a way to set the expectations of readers (and reviewers) through three different manuscript types that are based on existing journal designations. Brief Communications are short reports describing novel computational approaches where the implementation is not necessarily production-ready. Research Articles present both a novel idea and mature implementation that has been suitably benchmarked. Application Notes focus on a mature and tested tool or concept and need not be novel but should offer advancement from improved quality, ease of use, and/or implementation. Organizing computational proteomics contributions into these three manuscript types will facilitate the review process and will also enable readers to identify the maturity and applicability of the tool for their own workflows.

  4. Did dinosaurs have megakaryocytes? New ideas about platelets and their progenitors.

    PubMed

    Brass, Lawrence F

    2005-12-01

    Biological evolution has struggled to produce mechanisms that can limit blood loss following injury. In humans and other mammals, control of blood loss (hemostasis) is achieved through a combination of plasma proteins, most of which are made in the liver, and platelets, anucleate blood cells that are produced in the bone marrow by megakaryocytes. Much has been learned about the underlying mechanisms, but much remains to be determined. The articles in this series review current ideas about the production of megakaryocytes from undifferentiated hematopoietic precursors, the steps by which megakaryocytes produce platelets, and the molecular mechanisms within platelets that make hemostasis possible. The underlying theme that connects the articles is the intense investigation of a complex system that keeps humans from bleeding to death, but at the same time exposes us to increased risk of thrombosis and vascular disease.

  5. Promoting proximal formative assessment with relational discourse

    NASA Astrophysics Data System (ADS)

    Scherr, Rachel E.; Close, Hunter G.; McKagan, Sarah B.

    2012-02-01

    The practice of proximal formative assessment - the continual, responsive attention to students' developing understanding as it is expressed in real time - depends on students' sharing their ideas with instructors and on teachers' attending to them. Rogerian psychology presents an account of the conditions under which proximal formative assessment may be promoted or inhibited: (1) Normal classroom conditions, characterized by evaluation and attention to learning targets, may present threats to students' sense of their own competence and value, causing them to conceal their ideas and reducing the potential for proximal formative assessment. (2) In contrast, discourse patterns characterized by positive anticipation and attention to learner ideas increase the potential for proximal formative assessment and promote self-directed learning. We present an analysis methodology based on these principles and demonstrate its utility for understanding episodes of university physics instruction.

  6. International MODIS and AIRS Processing Package (IMAPP) Implementation of Infusion of Satellite Data into Environmental Applications-International (IDEA-I) for Air Quality Forecasts using Suomi-NPP, Terra and Aqua Aerosol Retrievals

    NASA Astrophysics Data System (ADS)

    Davies, J. E.; Strabala, K.; Pierce, R. B.; Huang, A.

    2016-12-01

    Fine mode aerosols play a significant role in public health through their impact on respiratory and cardiovascular disease. IDEA-I (Infusion of Satellite Data into Environmental Applications-International) is a real-time system for trajectory-based forecasts of aerosol dispersion that can assist in the prediction of poor air quality events. We released a direct broadcast version of IDEA-I for aerosol trajectory forecasts in June 2012 under the International MODIS and AIRS Processing Package (IMAPP). In January 2014 we updated this application with website software to display multi-satellite products. Now we have added VIIRS aerosols from the Suomi National Polar-orbiting Partnership (S-NPP). IMAPP is a NASA-funded and freely distributed software package developed at the Space Science and Engineering Center of the University of Wisconsin-Madison that has over 2,300 registered users worldwide. With IMAPP, any ground station capable of receiving direct broadcast from Terra or Aqua can produce calibrated and geolocated radiances and a suite of environmental products. These products include the MODIS AOD required for IDEA-I. VIIRS AOD for IDEA-I can be generated by the Community Satellite Processing Package (CSPP) VIIRS EDR Version 2.0 Software for Suomi NPP. CSPP is also developed and distributed by the Space Science and Engineering Center. This presentation describes our updated IMAPP implementation of IDEA-I through an example of its operation in a region known for episodic poor air quality events.

  7. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
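
    The way a direct quantification result feeds a direct STR amplification can be sketched as a simple volume calculation; the 1 ng target and the volume cap below are hypothetical parameters, not kit specifications.

      def str_input_volume(conc_ng_per_ul, target_ng=1.0, max_ul=15.0):
          """Volume of quantified lysate to add to a direct STR reaction so
          that the DNA input approaches the target quantity; capped at the
          maximum volume the reaction accepts (both values illustrative)."""
          if conc_ng_per_ul <= 0:
              return max_ul            # undetected: add the maximum volume
          return min(target_ng / conc_ng_per_ul, max_ul)

      print(str_input_volume(0.005))   # low-level touch sample -> 15.0 (cap)
      print(str_input_volume(0.5))     # blood sample -> 2.0 µL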

  8. An ultrahigh-performance liquid chromatography method with electrospray ionization tandem mass spectrometry for simultaneous quantification of five phytohormones in medicinal plant Glycyrrhiza uralensis under abscisic acid stress.

    PubMed

    Xiang, Yu; Song, Xiaona; Qiao, Jing; Zang, Yimei; Li, Yanpeng; Liu, Yong; Liu, Chunsheng

    2015-07-01

    An efficient simplified method was developed to determine multiple classes of phytohormones simultaneously in the medicinal plant Glycyrrhiza uralensis. Ultrahigh-performance liquid chromatography electrospray ionization tandem mass spectrometry (UPLC/ESI-MS/MS) with multiple reaction monitoring (MRM) in negative mode was used for quantification. The five studied phytohormones are gibberellic acid (GA3), abscisic acid (ABA), jasmonic acid (JA), indole-3-acetic acid, and salicylic acid (SA). Only 100 mg of fresh leaves was needed, with one purification step based on C18 solid-phase extraction. Cinnamic acid was chosen as the internal standard instead of isotope-labeled internal standards. Under the optimized conditions, the five phytohormones with internal standard were separated within 4 min, with good linearities and high sensitivity. The validated method was applied to monitor the spatial and temporal changes of the five phytohormones in G. uralensis under ABA stress. The levels of GA3, ABA, JA, and SA in leaves of G. uralensis were increased at different times and with different tendencies in the reported stress mode. These changes in phytohormone levels are discussed in the context of a possible feedback regulation mechanism. Understanding this mechanism will provide a good chance of revealing the mutual interplay between different biosynthetic routes, which could further help elucidate the mechanisms of effective composition accumulation in medicinal plants.
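
    With cinnamic acid as a shared internal standard, quantification reduces to fitting analyte/internal-standard response ratios against concentration and inverting the fit; the calibration points below are invented to show the generic workflow.

      import numpy as np

      # Invented calibration for one phytohormone (e.g., ABA):
      # analyte/internal-standard peak-area ratios at known concentrations.
      conc_ng_ml = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
      area_ratio = np.array([0.021, 0.105, 0.209, 1.04, 2.08])

      slope, intercept = np.polyfit(conc_ng_ml, area_ratio, 1)

      def quantify(sample_ratio):
          """Concentration from the analyte/IS response ratio."""
          return (sample_ratio - intercept) / slope

      print(f"{quantify(0.62):.1f} ng/mL")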

  9. Exploring Students' Ideas About Cosmological Concepts

    NASA Astrophysics Data System (ADS)

    Bailey, Janelle M.

    2012-03-01

    As scientists seek to understand the nature of our Universe, we can also explore our students' understanding of cosmological concepts. What ideas about the origin, evolution, and fate of our Universe do students bring with them to the classroom? In this talk, I will describe an ongoing study in which students' preinstructional ideas are examined. Topics under investigation include the age of the universe; structure and composition, including dark matter and dark energy; the Big Bang; and how astronomers come to understand these topics. Approximately 1000 students have responded to open-ended questions at the start of their introductory astronomy courses. Analysis of the responses, through an iterative process of identifying self-emergent themes, suggests that students have a number of common ideas. For example, students frequently conflate structure terms such as solar system, galaxy, and universe or do not understand the relationship between the terms; believe the universe to be infinitely old; and may not be aware of dark matter or dark energy. Additional themes, as well as the frequencies of typical responses, will be discussed, along with future research efforts.

  10. Fostering Under-represented Minority Student Success and Interest in the Geosciences: Outcomes of the UNC-Chapel Hill Increasing Diversity and Enhancing Academia (IDEA) Program

    NASA Astrophysics Data System (ADS)

    Hughes, M. H.; Gray, K.; Drostin, M.

    2016-12-01

    For under-represented minority (URM) students, opportunities to meaningfully participate in academic communities and develop supportive relationships with faculty and peers influence persistence in STEM majors (Figueroa, Hurtado, & Wilkins, 2015; PCAST, 2012; Tsui, 2007). Creating such opportunities is even more important in the geosciences, where a lower percentage of post-secondary degrees are awarded to URM students than in other STEM fields (NSF, 2015; O'Connell & Holmes, 2011; NSF, 2011). Since 2011, Increasing Diversity and Enhancing Academia (IDEA), a program of the UNC-Chapel Hill Institute for the Environment (UNC-IE), has provided 39 undergraduates (predominantly URM and female students) with career-relevant research experiences and professional development opportunities, including a culminating experience of presenting their research at a campus-wide research symposium. External evaluation data have helped to characterize the effectiveness of the IDEA program. These data included pre- and post-surveys assessing students' interest in geosciences, knowledge of career pathways, and perceptions of their abilities related to a specific set of scientific research skills. Additionally, progress towards degrees and dissemination outcomes were tracked. In this presentation, we will share quantitative and qualitative data demonstrating that participation in the IDEA program has influenced students' interest and persistence in geosciences research and careers. These data range from self-reported competencies in a variety of scientific skills (such as organizing and interpreting data and reading and interpreting science literature) to documentation of student participation in geoscience study and professions. About 69% of participants continued their internship research beyond the internship period, and about 38% pursued graduate degrees and secured jobs in geoscience and other STEM fields (nearly half are still in school). Overall, these evaluation data have shown that the IDEA research experience, combined with program elements focused on professional development, reinforces students' sense of their science abilities, connects them to a network of supportive students and professionals, and contributes to their sense of belonging within the geosciences.

  11. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biros, George

    Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model as well as the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a central challenge in UQ, especially for large-scale models. We propose to develop the mathematical tools to address these challenges in the context of extreme-scale problems. 4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we will create OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges. Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT-Austin’s own 10 petaflops Stampede system, ANL’s Mira system, and ORNL’s Titan system. While our focus is on fundamental mathematical/computational methods and algorithms, we will assess our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.
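
    Of the methods listed under item 1, the randomized maximum likelihood idea is easiest to illustrate in a linear-Gaussian toy problem, where each minimization of a randomly perturbed objective yields an exact posterior sample; the setup below is our own illustration, far from the project's extreme-scale setting.

      import numpy as np

      rng = np.random.default_rng(1)

      # Toy linear inverse problem: d = G m + noise
      n_m, n_d = 5, 20
      G = rng.standard_normal((n_d, n_m))
      sig_noise, sig_prior = 0.1, 1.0
      d_obs = G @ rng.standard_normal(n_m) + sig_noise * rng.standard_normal(n_d)

      H = G.T @ G / sig_noise**2 + np.eye(n_m) / sig_prior**2  # posterior precision

      def rml_sample():
          """Minimize the least-squares objective with perturbed data and a
          prior draw; for a linear forward map this is an exact posterior sample."""
          d_pert = d_obs + sig_noise * rng.standard_normal(n_d)
          m_pert = sig_prior * rng.standard_normal(n_m)        # prior mean 0
          rhs = G.T @ d_pert / sig_noise**2 + m_pert / sig_prior**2
          return np.linalg.solve(H, rhs)

      samples = np.array([rml_sample() for _ in range(2000)])
      print("posterior mean estimate:", samples.mean(axis=0).round(2))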

  12. Patterns of mercury dispersion from local and regional emission sources, rural Central Wisconsin, USA

    USGS Publications Warehouse

    Kolker, A.; Olson, M.L.; Krabbenhoft, D.P.; Tate, M.T.; Engle, M.A.

    2010-01-01

    Simultaneous real-time changes in mercury (Hg) speciation, namely reactive gaseous Hg (RGM), elemental Hg (Hg0), and fine particulate Hg (Hg-PM2.5), were determined from June to November 2007, in ambient air at three locations in rural Central Wisconsin. Known Hg emission sources within the airshed of the monitoring sites include: 1) a 1114 megawatt (MW) coal-fired electric utility generating station; 2) a Hg-bed chlor-alkali plant; and 3) a smaller (465 MW) coal-burning electric utility. Monitoring sites, showing sporadic elevation of RGM, Hg0 and Hg-PM2.5, were positioned at distances of 25, 50 and 100 km northward of the larger electric utility. A series of RGM events were recorded at each site. The largest, on 23 September, occurred under prevailing southerly winds, with a maximum RGM value (56.8 pg m-3) measured at the 100 km site, and corresponding elevated SO2 (10.41 ppbv; measured at the 50 km site). The finding that RGM, Hg0, and Hg-PM2.5 are not always highest at the 25 km site, closest to the large generating station, contradicts the idea that RGM decreases with distance from a large point source. This may be explained if: 1) the 100 km site was influenced by emissions from the chlor-alkali facility or by RGM from regional urban sources; 2) the emission stack height of the larger power plant promoted plume transport at an elevation where the Hg is carried over the closest site; or 3) RGM was being generated in the plume through oxidation of Hg0. Operational changes at each emitter since 2007 should reduce their Hg output, potentially allowing quantification of the environmental benefit in future studies.

  13. Probabilistic Weather Information Tailored to the Needs of Transmission System Operators

    NASA Astrophysics Data System (ADS)

    Alberts, I.; Stauch, V.; Lee, D.; Hagedorn, R.

    2014-12-01

    Reliable and accurate forecasts for wind and photovoltaic (PV) power production are essential for stable transmission systems. A high potential for improving the wind and PV power forecasts lies in optimizing the weather forecasts, since these energy sources are highly weather dependent. For this reason the main objective of the German research project EWeLiNE is to improve the quality of the underlying numerical weather predictions for energy operations. In this project, the German Meteorological Service (DWD), the Fraunhofer Institute for Wind Energy and Energy System Technology, and three of the German transmission system operators (TSOs) are working together to improve the weather and power forecasts. Probabilistic predictions are of particular interest, as the quantification of uncertainties provides an important tool for risk management. Theoretical considerations suggest that it can be advantageous to use probabilistic information to represent and respond to the remaining uncertainties in the forecasts. However, it remains a challenge to integrate this information into the decision-making processes related to market participation and power systems operations. The project is planned and carried out in close cooperation with the involved TSOs in order to ensure the usability of the products developed. It will conclude with a demonstration phase, in which the improved models and newly developed products are combined into a process chain and used to provide information to TSOs in a real-time decision support tool. The use of a web-based development platform enables short development cycles and agile adaptation to evolving user needs. This contribution will present the EWeLiNE project and discuss ideas on how to incorporate probabilistic information into the users' current decision-making processes.

  14. Using airborne laser altimetry to determine fuel models for estimating fire behavior

    Treesearch

    Carl A. Seielstad; Lloyd P. Queen

    2003-01-01

    Airborne laser altimetry provides an unprecedented view of the forest floor in timber fuel types and is a promising new tool for fuels assessments. It can be used to resolve two fuel models under closed canopies and may be effective for estimating coarse woody debris loads. A simple metric - obstacle density - provides the necessary quantification of fuel bed roughness...

  15. Quantification and mapping of surface residue cover and tillage practices for maize and soybean fields in south central Nebraska-USA using Landsat imagery

    USDA-ARS?s Scientific Manuscript database

    The area cultivated under conservation tillage practices such as no-till and minimal tillage has recently increased in south central Nebraska (NE). Consequently, changes in some of the impacts of cropping systems on soil such as enhancing soil and water quality, improving soil structures and infiltr...

  16. Quantification of net annual C input in terrestrial ecosystems of the Italian Peninsula under different land-uses

    USDA-ARS?s Scientific Manuscript database

    Soil organic matter (SOM) is a very important compartment of the biosphere: it represents the largest dynamic carbon (C) pool where the C is stored for the longest time period. Root inputs, as exudates and root slush, represent a major, where not the largest, annual contribution to soil C input. Roo...

  17. Absolute quantification of regional cerebral glucose utilization in mice by 18F-FDG small animal PET scanning and 2-14C-DG autoradiography.

    PubMed

    Toyama, Hiroshi; Ichise, Masanori; Liow, Jeih-San; Modell, Kendra J; Vines, Douglass C; Esaki, Takanori; Cook, Michelle; Seidel, Jurgen; Sokoloff, Louis; Green, Michael V; Innis, Robert B

    2004-08-01

    The purpose of this study was to evaluate the feasibility of absolute quantification of regional cerebral glucose utilization (rCMR(glc)) in mice by use of (18)F-FDG and a small animal PET scanner. rCMR(glc) determined with (18)F-FDG PET was compared with values determined simultaneously by the autoradiographic 2-(14)C-DG method. In addition, we compared the rCMR(glc) values under isoflurane anesthesia, ketamine/xylazine anesthesia, and awake states. Immediately after injection of (18)F-FDG and 2-(14)C-DG into mice, timed arterial samples were drawn over 45 min to determine the time courses of (18)F-FDG and 2-(14)C-DG. Animals were euthanized at 45 min and their brains were imaged with the PET scanner. The brains were then processed for 2-(14)C-DG autoradiography. Regions of interest were manually placed over cortical regions on corresponding coronal (18)F-FDG PET and 2-(14)C-DG autoradiographic images. rCMR(glc) values were calculated for both tracers by the autoradiographic 2-(14)C-DG method with modifications for the different rate and lumped constants of the 2 tracers. Average rCMR(glc) values in cerebral cortex with (18)F-FDG PET under normoglycemic conditions (isoflurane and awake) were generally lower (by 8.3%) than, but strongly correlated with, those of 2-(14)C-DG (r(2) = 0.95). On the other hand, under hyperglycemic conditions (ketamine/xylazine) average cortical rCMR(glc) values with (18)F-FDG PET were higher (by 17.3%) than those with 2-(14)C-DG. Values for rCMR(glc) and uptake (percentage injected dose per gram [%ID/g]) with (18)F-FDG PET were significantly lower under both isoflurane and ketamine/xylazine anesthesia than in the awake mice. However, the reductions of rCMR(glc) were markedly greater under isoflurane (by 57%) than under ketamine/xylazine (by 19%), whereas more marked reductions of %ID/g were observed with ketamine/xylazine (by 54%) than with isoflurane (by 37%). These opposite differences between isoflurane and ketamine/xylazine may be due to competition between (18)F-FDG and glucose for uptake into the brain under hyperglycemia. We were able to obtain accurate absolute quantification of rCMR(glc) with mouse (18)F-FDG PET imaging, as confirmed by concurrent use of the autoradiographic 2-(14)C-DG method. Underestimation of rCMR(glc) by (18)F-FDG under normoglycemic conditions may be due to partial-volume effects. Computation of rCMR(glc) from (18)F-FDG data in hyperglycemic animals may require, however, alternative rate and lumped constants for (18)F-FDG.
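
    The %ID/g uptake metric quoted above is a simple decay-corrected ratio; a sketch with hypothetical numbers (the 18F half-life of about 109.8 min is standard):

      import math

      def percent_id_per_g(tissue_kbq_per_g, injected_mbq, t_min,
                           half_life_min=109.77):
          """Percent injected dose per gram for 18F-FDG, decay-correcting
          the tissue measurement back to the time of injection."""
          corrected = tissue_kbq_per_g * math.exp(math.log(2) * t_min / half_life_min)
          return 100.0 * corrected / (injected_mbq * 1000.0)   # MBq -> kBq

      # Hypothetical mouse brain measurement 45 min after injection
      print(f"{percent_id_per_g(350.0, 7.4, 45):.2f} %ID/g")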

  18. Fuzzy logic

    NASA Technical Reports Server (NTRS)

    Zadeh, Lofti A.

    1988-01-01

    The author presents a condensed exposition of some basic ideas underlying fuzzy logic and describes some representative applications. The discussion covers basic principles; meaning representation and inference; basic rules of inference; and the linguistic variable and its application to fuzzy control.
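
    A toy Mamdani-style controller makes the linguistic-variable idea concrete: triangular membership functions, min for rule firing, max for aggregation, and centroid defuzzification. The rules and numbers are generic illustrations, not drawn from the source exposition.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function rising from a, peaking at b,
          falling to c."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      def fan_speed(temp_c):
          """One-input fuzzy controller: IF temp is cool THEN speed is low;
          IF temp is warm THEN speed is high (Mamdani, centroid defuzzified)."""
          fire_cool = tri(temp_c, 0.0, 15.0, 25.0)
          fire_warm = tri(temp_c, 20.0, 30.0, 45.0)
          speed = np.linspace(0.0, 100.0, 501)
          out = np.maximum(np.minimum(fire_cool, tri(speed, 0.0, 20.0, 50.0)),
                           np.minimum(fire_warm, tri(speed, 50.0, 80.0, 100.0)))
          return (speed * out).sum() / out.sum()

      print(f"{fan_speed(27.0):.1f} % fan speed")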

  19. Quantification of incisal tooth wear in upper anterior teeth: conventional vs new method using toolmakers microscope and a three-dimensional measuring technique.

    PubMed

    Al-Omiri, Mahmoud K; Sghaireen, Mohd G; Alzarea, Bader K; Lynch, Edward

    2013-12-01

    This study aimed to quantify tooth wear in upper anterior teeth using a new CAD-CAM laser scanning machine, a toolmaker's microscope, and a conventional tooth wear index. Fifty participants (25 males and 25 females, mean age = 25 ± 4 years) were assessed for incisal tooth wear of upper anterior teeth using the Smith and Knight clinical tooth wear index (TWI) on two occasions, the study baseline and 1 year later. Stone dies for each tooth were prepared and scanned using the CAD-CAM laser Cercon System. Scanned images were printed and examined under a toolmaker's microscope to quantify tooth wear, and the dies were then directly assessed under the microscope to measure tooth wear. The Wilcoxon signed-rank test was used to analyze the data. TWI scores for incisal edges were 0-3 and were similar on both occasions. Score 4 was not detected. Wear values measured by directly assessing the dies under the toolmaker's microscope (range = 113-150 μm, mean = 130 ± 20 μm) were significantly greater than those measured from Cercon Digital Machine images (range = 52-80 μm, mean = 68 ± 23 μm), and both showed significant differences between the two occasions. Wear progression in upper anterior teeth was effectively detected by directly measuring the dies or the images of dies under the toolmaker's microscope. Measuring the dies of worn dentition directly under the toolmaker's microscope enabled detection of wear progression more accurately than measuring die images obtained with the Cercon Digital Machine. The conventional method was the least sensitive for tooth wear quantification and was unable to identify wear progression in most cases. Copyright © 2013 Elsevier Ltd. All rights reserved.
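
    The paired comparison between the two measurement routes can be reproduced with a standard Wilcoxon signed-rank test; the wear values below are invented within the ranges quoted above purely to show the mechanics.

      import numpy as np
      from scipy.stats import wilcoxon

      rng = np.random.default_rng(2)

      # Invented paired wear measurements (µm) within the reported ranges
      direct_die = rng.uniform(113, 150, 50)   # dies under toolmaker's microscope
      scanned = rng.uniform(52, 80, 50)        # Cercon digital images

      stat, p = wilcoxon(direct_die, scanned)
      print(f"Wilcoxon W = {stat:.0f}, p = {p:.2g}")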

  20. Numerical Error Estimation with UQ

    NASA Astrophysics Data System (ADS)

    Ackmann, Jan; Korn, Peter; Marotzke, Jochem

    2014-05-01

    Ocean models are still in need of means to quantify model errors, which are inevitably made when running numerical experiments. The total model error can formally be decomposed into two parts, the formulation error and the discretization error. The formulation error arises from the continuous formulation of the model not fully describing the studied physical process. The discretization error arises from having to solve a discretized model instead of the continuously formulated model. Our work on error estimation is concerned with the discretization error. Given a solution of a discretized model, our general problem statement is to find a way to quantify the uncertainties due to discretization in physical quantities of interest (diagnostics), which are frequently used in Geophysical Fluid Dynamics. The approach we use to tackle this problem is called the "Goal Error Ensemble method". The basic idea of the Goal Error Ensemble method is that errors in diagnostics can be translated into a weighted sum of local model errors, which makes it conceptually based on the Dual Weighted Residual method from Computational Fluid Dynamics. In contrast to the Dual Weighted Residual method, these local model errors are not considered deterministically but are interpreted as local model uncertainty and described stochastically by a random process. The parameters of the random process are tuned with high-resolution near-initial model information. However, the original Goal Error Ensemble method, introduced in [1], was evaluated successfully only in the case of inviscid flows without lateral boundaries in a shallow-water framework and is hence of limited use in a numerical ocean model. Our work consists in extending the method to bounded, viscous flows in a shallow-water framework. As our numerical model, we use the ICON-Shallow-Water model. In viscous flows our high-resolution information depends on the viscosity parameter, making our uncertainty measures viscosity-dependent. We will show that a sensible parameter can be chosen by using the Reynolds number as a criterion. Another topic we will discuss is the choice of the underlying distribution of the random process, which is especially important in the presence of lateral boundaries. We will present resulting error estimates for different height- and velocity-based diagnostics applied to the Munk gyre experiment. References: [1] F. Rauser: Error Estimation in Geophysical Fluid Dynamics through Learning; PhD Thesis, IMPRS-ESM, Hamburg, 2010. [2] F. Rauser, J. Marotzke, P. Korn: Ensemble-type numerical uncertainty quantification from single model integrations; SIAM/ASA Journal on Uncertainty Quantification, submitted.
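
    The central construction, a diagnostic error expressed as a weighted sum of stochastically modeled local errors, can be sketched generically. The weights stand in for adjoint-like sensitivities and the per-cell standard deviations for the tuned random process; both are invented here.

      import numpy as np

      rng = np.random.default_rng(3)

      n_cells, n_members = 1000, 500
      # Stand-ins: sensitivity weights of the diagnostic to local model
      # errors, and a tuned standard deviation for each cell's error.
      weights = rng.standard_normal(n_cells) / n_cells
      sigma_local = 1e-3 * (1.0 + rng.random(n_cells))

      # Each ensemble member draws local errors from the random process and
      # sums them with the sensitivity weights to get a diagnostic error.
      local_errors = sigma_local * rng.standard_normal((n_members, n_cells))
      ensemble = (weights * local_errors).sum(axis=1)
      print(f"diagnostic error: {ensemble.mean():.2e} +/- {ensemble.std():.2e}")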

  1. The impact of conservative discourses in family policies, population politics, and gender rights in Poland and Turkey.

    PubMed

    Korkut, Umut; Eslen-Ziya, Hande

    2011-01-01

    This article uses childcare as a case study to test the impact of ideas that embody a traditional understanding of gender relations, and that regard increasing female labor market participation as a cause of decreasing fertility, on the functioning of a set of general policies to increase fertility rates. It looks into the Polish and Turkish contexts for empirical evidence. The Polish context shows a highly institutionalized system of family policies, in contrast to almost nonexistent institutions in Turkey. Formally, the labor market participation of women is much lower in Turkey than in Poland. Yet, given the size of the informal market in Turkey, women's labor participation is obviously higher than what appears in the statistics. Bearing in mind this divergence, the article suggests Poland and Turkey as two typologies for studying population politics in contexts where socially conservative ideas regarding gender remain paramount. We qualify ideas as conservative if they enforce a traditional understanding of gender relations in care-giving and underline women's role in the labor market as an element of declining fertility. In order to delineate ideational impact, this article looks into how ideas (a) supplant and (b) substitute formal institutions. Therefore, we argue that there are two mechanisms pertaining to the dominance of conservative conventions: conservative ideas may either supplant the institutional impact on family policies, or substitute for them thanks to a superior reasoning which societies assign to them. Furthermore, conservative conventions prevail alongside women's customary unpaid work as care-givers regardless of the level of their formal workforce participation. We propose as our major finding for the literature of population politics that ideas, as ubiquitous belief systems, are more powerful than institutions, since they provide what is perceived as legitimate, acceptable, and good for the societies under study. In the end, irrespective of the presence of institutions, socially conservative ideas prevail.

  2. MPN- and Real-Time-Based PCR Methods for the Quantification of Alkane Monooxygenase Homologous Genes (alkB) in Environmental Samples

    NASA Astrophysics Data System (ADS)

    Pérez-de-Mora, Alfredo; Schulz, Stephan; Schloter, Michael

    Hydrocarbons are major contaminants of soil ecosystems as a result of uncontrolled oil spills and waste disposal into the environment. Ecological risk assessment and remediation of affected sites are often constrained by the lack of suitable prognostic and diagnostic tools that provide information on the abiotic-biotic interactions occurring between contaminants and biological targets. Therefore, the identification and quantification of genes involved in the degradation of hydrocarbons may play a crucial role in evaluating the natural attenuation potential of contaminated sites and in the development of successful bioremediation strategies. Among other gene clusters, the alk operon has been identified as a major player in alkane degradation in different soils. An oxygenase gene (alkB) codes for the initial step of the degradation of aliphatic alkanes under aerobic conditions. In this work, we present an MPN- and a real-time PCR method for the quantification of the bacterial gene alkB (coding for rubredoxin-dependent alkane monooxygenase) in environmental samples. Both approaches enable a rapid culture-independent screening of the alkB gene in the environment, which can be used to assess the intrinsic natural attenuation potential of a site or to follow the ongoing progress of bioremediation assays.
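
    Real-time PCR quantification of alkB rests on a standard curve relating threshold cycle to the logarithm of gene copy number; the sketch below inverts such a curve using typical but invented calibration values.

      import numpy as np

      # Invented standard curve: Ct for 10-fold dilutions of an alkB
      # plasmid standard (1e2 .. 1e7 copies per reaction).
      log_copies = np.arange(2, 8, dtype=float)
      ct = np.array([33.1, 29.8, 26.4, 23.1, 19.8, 16.5])

      slope, intercept = np.polyfit(log_copies, ct, 1)   # ~ -3.3 per decade
      efficiency = 10 ** (-1 / slope) - 1

      def copies_from_ct(sample_ct):
          """alkB copies per reaction from a measured threshold cycle."""
          return 10 ** ((sample_ct - intercept) / slope)

      print(f"PCR efficiency ~ {efficiency:.0%}")
      print(f"{copies_from_ct(24.7):.2e} copies/reaction")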

  3. Identification and quantification of volatile organic compounds using systematic single-ion chromatograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsuchiya, Yoshio; Kanabus-Kaminska, J.M.

    1996-12-31

    In order to determine the background level of volatile organic compounds (VOCs) in Canadian indoor air, a method of identification and quantification at a level of 0.3 µg/m³ using systematic single-ion chromatograms (SICs) has been developed. The compounds selected for measurement included several halogenated compounds, oxygen compounds, terpenes, and C8 to C16 n-alkanes. Air samples were taken in 3-layered sorbent tubes and trapped compounds were thermally desorbed into the helium stream of a gas chromatograph/mass spectrometer (GC/MS) analytical system. Total quantities of volatile organic compounds (TVOCs) were measured using a flame ionization detector (FID). Individual compounds were analyzed by GC/MS. For the identification of compounds in the main stream GC effluent, both the specific GC retention and mass spectra were used. About 50 selected SICs were routinely extracted from a total ion chromatogram (TIC) to detect and quantify compounds. For each compound, a single representative ion was selected. The specific retention was calculated from the elution time on the SIC. For quantification, ion counts under a peak in the SIC were measured. The single-ion MS response factor for some of the compounds was experimentally determined using a dynamic reference procedure.

  4. Solid-phase microextraction method development for headspace analysis of volatile flavor compounds.

    PubMed

    Roberts, D D; Pollien, P; Milo, C

    2000-06-01

Solid-phase microextraction (SPME) fibers were evaluated for their ability to adsorb volatile flavor compounds under various conditions with coffee and aqueous flavored solutions. Experiments comparing different fibers showed that poly(dimethylsiloxane)/divinylbenzene had the highest overall sensitivity. Carboxen/poly(dimethylsiloxane) was the most sensitive to small molecules and acids. As the concentrations of compounds increased, the quantitative linear range was exceeded, as shown by competition effects with 2-isobutyl-3-methoxypyrazine at concentrations above 1 ppm. A method based on short-time sampling of the headspace (1 min) was shown to better represent the equilibrium headspace concentration. Analysis of coffee brew with a 1-min headspace adsorption time was verified to be within the linear range for most compounds and is thus appropriate for relative headspace quantification. Absolute quantification of volatiles, using isotope dilution assays (IDA), is not subject to the biases caused by excess compound concentrations or complex matrices. The degradation of coffee aroma volatiles during storage was followed by relative headspace measurements and absolute quantifications. Both methods gave similar values for 3-methylbutanal, 4-ethylguaiacol, and 2,3-pentanedione. Acetic acid, however, gave higher values in the relative headspace measurements during storage, owing to a concurrent pH decrease; this bias was not seen with IDA.
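    The arithmetic that makes isotope dilution robust to matrix effects is compact; here is a minimal sketch, with a hypothetical response ratio and illustrative peak areas (none of these numbers come from the study).

```python
# Isotope dilution quantification in essence: a known amount of an
# isotopically labeled analogue is spiked into the sample, and the
# analyte amount follows from the analyte/label peak-area ratio. The
# response_ratio corrects for any difference in detector response
# between the two isotopologues.

def ida_amount(area_analyte, area_label, spike_amount_ng, response_ratio=1.0):
    """Amount of analyte (ng) from an isotope dilution assay."""
    return (area_analyte / area_label) * spike_amount_ng / response_ratio

# e.g. analyte/label areas of 5.2e5/4.0e5 with a 100 ng labeled spike
print(ida_amount(5.2e5, 4.0e5, 100.0))  # -> 130 ng
```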

  5. Estimation of the quantification uncertainty from flow injection and liquid chromatography transient signals in inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Laborda, Francisco; Medrano, Jesús; Castillo, Juan R.

    2004-06-01

The quality of the quantitative results obtained from transient signals in high-performance liquid chromatography-inductively coupled plasma mass spectrometry (HPLC-ICPMS) and flow injection-inductively coupled plasma mass spectrometry (FI-ICPMS) was investigated under multielement conditions. Quantification methods were based on multiple-point calibration by simple and weighted linear regression, and on double-point calibration (measurement of the baseline and one standard). An uncertainty model, which includes the main sources of uncertainty in FI-ICPMS and HPLC-ICPMS (signal measurement, sample flow rate and injection volume), was developed to estimate peak area uncertainties and the statistical weights used in weighted linear regression. The behaviour of the ICPMS instrument was characterized so that it could be incorporated into the model, leading to the conclusion that the instrument works as a concentration detector when used to monitor transient signals from flow injection or chromatographic separations. Proper quantification by the three calibration methods was achieved when compared to reference materials; notably, double-point calibration gave results of the same quality as multiple-point calibration while shortening the calibration time. Relative expanded uncertainties ranged from 10-20% for concentrations around the LOQ to 5% for concentrations higher than 100 times the LOQ.
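    A minimal sketch of the weighted multiple-point calibration described, with weights taken as inverse variances of the peak areas; the concentrations, areas, and modelled uncertainties below are illustrative assumptions, not the paper's data.

```python
import numpy as np

conc = np.array([0.0, 1.0, 5.0, 10.0, 50.0])           # ug/L standards
area = np.array([120., 980., 4870., 9750., 48600.])    # peak areas
u_area = np.array([15., 40., 150., 300., 1500.])       # modelled uncertainties

# Weighted least squares for area = b0 + b1 * conc, weights = 1/u^2.
w = 1.0 / u_area**2
X = np.column_stack([np.ones_like(conc), conc])
W = np.diag(w)
b0, b1 = np.linalg.solve(X.T @ W @ X, X.T @ W @ area)

def quantify(sample_area):
    """Concentration of an unknown from its peak area."""
    return (sample_area - b0) / b1

print(f"slope {b1:.1f}, intercept {b0:.1f}, sample -> {quantify(24500.):.2f} ug/L")
```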

  6. Trace quantification of selected sulfonamides in aqueous media by implementation of a new dispersive solid-phase extraction method using a nanomagnetic titanium dioxide graphene-based sorbent and HPLC-UV.

    PubMed

    Izanloo, Maryam; Esrafili, Ali; Behbahani, Mohammad; Ghambarian, Mahnaz; Reza Sobhi, Hamid

    2018-02-01

Herein, a new dispersive solid-phase extraction method using a nanomagnetic titanium dioxide graphene-based sorbent in conjunction with high-performance liquid chromatography and ultraviolet detection was successfully developed. The method proved to be simple, sensitive, and highly efficient for the trace quantification of sulfacetamide, sulfathiazole, sulfamethoxazole, and sulfadiazine in relatively large volumes of aqueous media. Initially, the nanomagnetic titanium dioxide graphene-based sorbent was synthesized and subsequently characterized by scanning electron microscopy and X-ray diffraction. The sorbent was then used for the sorption and extraction of the selected sulfonamides, mainly through π-π stacking and hydrophobic interactions. Under the established conditions, the calibration curves were linear over the concentration range of 1-200 μg/L. The limit of quantification (precision of 20%, and accuracy of 80-120%) for the detection of each sulfonamide by the proposed method was 1.0 μg/L. To test the extraction efficiency, the method was applied to various fortified real water samples. The average relative recoveries obtained from the fortified samples varied between 90 and 108%, with relative standard deviations of 5.3-10.7%. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Quantification of epithelial cells in coculture with fibroblasts by fluorescence image analysis.

    PubMed

    Krtolica, Ana; Ortiz de Solorzano, Carlos; Lockett, Stephen; Campisi, Judith

    2002-10-01

    To demonstrate that senescent fibroblasts stimulate the proliferation and neoplastic transformation of premalignant epithelial cells (Krtolica et al.: Proc Natl Acad Sci USA 98:12072-12077, 2001), we developed methods to quantify the proliferation of epithelial cells cocultured with fibroblasts. We stained epithelial-fibroblast cocultures with the fluorescent DNA-intercalating dye 4,6-diamidino-2-phenylindole (DAPI), or expressed green fluorescent protein (GFP) in the epithelial cells, and then cultured them with fibroblasts. The cocultures were photographed under an inverted microscope with appropriate filters, and the fluorescent images were captured with a digital camera. We modified an image analysis program to selectively recognize the smaller, more intensely fluorescent epithelial cell nuclei in DAPI-stained cultures and used the program to quantify areas with DAPI fluorescence generated by epithelial nuclei or GFP fluorescence generated by epithelial cells in each field. Analysis of the image areas with DAPI and GFP fluorescences produced nearly identical quantification of epithelial cells in coculture with fibroblasts. We confirmed these results by manual counting. In addition, GFP labeling permitted kinetic studies of the same coculture over multiple time points. The image analysis-based quantification method we describe here is an easy and reliable way to monitor cells in coculture and should be useful for a variety of cell biological studies. Copyright 2002 Wiley-Liss, Inc.
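    The modified image analysis program itself is not reproduced in the abstract, but its core idea (classify pixels above an intensity threshold as epithelial nuclei and use the fluorescent area per field as the readout) can be sketched minimally; the synthetic image and threshold below are assumptions for illustration only.

```python
import numpy as np

# Epithelial nuclei are smaller and more intensely fluorescent than
# fibroblast nuclei in DAPI images, so a high intensity threshold
# isolates them; the above-threshold area per field then serves as a
# proxy for epithelial cell abundance.

rng = np.random.default_rng(0)
field = rng.normal(20, 5, size=(512, 512))   # synthetic background
field[100:140, 200:240] += 200               # one bright "epithelial" nucleus

def epithelial_area_fraction(image, intensity_threshold=150.0):
    """Fraction of the field covered by above-threshold pixels."""
    return (image > intensity_threshold).mean()

print(f"epithelial area fraction: {epithelial_area_fraction(field):.4f}")
```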

  8. HPLC-ESI-MS/MS validated method for simultaneous quantification of zopiclone and its metabolites, N-desmethyl zopiclone and zopiclone-N-oxide in human plasma.

    PubMed

    Mistri, Hiren N; Jangid, Arvind G; Pudage, Ashutosh; Shrivastav, Pranav

    2008-03-15

A simple, selective and sensitive isocratic HPLC method with triple quadrupole mass spectrometry detection has been developed and validated for simultaneous quantification of zopiclone and its metabolites in human plasma. The analytes were extracted using solid phase extraction, separated on a Symmetry Shield RP8 column (150 mm x 4.6 mm i.d., 3.5 μm particle size) and detected by tandem mass spectrometry with a turbo ion spray interface. Metaxalone was used as an internal standard. The method had a chromatographic run time of 4.5 min and linear calibration curves over the concentration range of 0.5-150 ng/mL for both zopiclone and N-desmethyl zopiclone and 1-150 ng/mL for zopiclone-N-oxide. The intra-batch and inter-batch accuracy and precision evaluated at the lower limit of quantification and at quality control levels were within 89.5-109.1% and 3.0-14.7%, respectively, for all the analytes. The recoveries calculated for the analytes and internal standard were ≥90% from spiked plasma samples. The validated method was successfully employed for a comparative bioavailability study after oral administration of 7.5 mg zopiclone (test and reference) to 16 healthy volunteers under fasting conditions.

  9. High Court Case Could Rein in Private Placements under IDEA

    ERIC Educational Resources Information Center

    Walsh, Mark

    2007-01-01

    This article reports on starkly contrasting portraits of special education that the justices are sure to hear on the first day of the new U.S. Supreme Court term. In a case from New York City, the 1.1 million-student district argues that school officials made every attempt to provide an appropriate education plan under the federal Individuals with…

  10. "Cedar Rapids Community School District v. Garret F.": School Districts Must Pay for Nursing Services under the IDEA.

    ERIC Educational Resources Information Center

    Russo, Charles J.

    1999-01-01

    In "Cedar Rapids Community School District v. Garrett F." (1999), the U.S. Supreme Court decided that continuous nursing constitutes a "related service" under the Individuals with Disabilities Education Act. The case involved a 16-year-old who has been paralyzed since early childhood. Cost per student could be $20,000 to…

  11. Bridging CAGD knowledge into CAD/CG applications: Mathematical theories as stepping stones of innovations

    NASA Astrophysics Data System (ADS)

    Gobithaasan, R. U.; Miura, Kenjiro T.; Hassan, Mohamad Nor

    2014-07-01

Computer Aided Geometric Design (CAGD), which provides the mathematical theories underlying Computer Aided Design (CAD) and Computer Graphics (CG), has been taught in a number of Malaysian universities under the umbrella of Mathematical Sciences faculties/departments. CAD/CG, on the other hand, is taught in either Engineering or Computer Science faculties. Even though CAGD researchers, educators and students (denoted here as contributors) have been enriching this field of study through article and journal publication, many fail to convert their ideas into constructive innovation because of the gap between CAGD contributors and practitioners (engineers, product designers, architects, artists). This paper addresses the issue by advocating a number of technologies that can be used to turn CAGD contributors into innovators whose work has immediate practical impact for CAD/CG practitioners. The underlying principle of solving this issue is twofold: the first part is to expose CAGD contributors to ways of turning mathematical ideas into plug-ins, and the second is to impart relevant CAGD theories to CAD/CG practitioners. Both cases are discussed in detail, and the final section shows examples illustrating the importance of turning mathematical knowledge into innovations.
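    As a toy illustration of the paper's first point (a classroom CAGD formula turned into working code that could sit behind a plug-in interface), here is de Casteljau's algorithm for evaluating a Bézier curve; the paper does not give this example, so it is offered only as a plausible instance.

```python
# De Casteljau's algorithm: evaluate a Bezier curve by repeated linear
# interpolation of its control points. Wrapped in a plug-in API, the
# same routine becomes a CAD/CG tool.

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1]."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Midpoint of a cubic Bezier segment
print(de_casteljau([(0, 0), (1, 2), (3, 2), (4, 0)], 0.5))  # -> (2.0, 1.5)
```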

  12. Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana

    2018-01-01

The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines), and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogenous reference gene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
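    The absolute quantification underlying both protocols rests on Poisson statistics: from the fraction of negative droplets, the mean copies per droplet is λ = −ln(n_neg/n_total), and concentration follows from the droplet volume. A minimal sketch; the 0.85 nL droplet volume is a typical figure, not one taken from the chapter.

```python
import numpy as np

def ddpcr_concentration(n_positive, n_total, droplet_volume_nl=0.85):
    """Target concentration in copies/uL from droplet counts."""
    lam = -np.log((n_total - n_positive) / n_total)  # copies per droplet
    return lam / (droplet_volume_nl * 1e-3)          # nL -> uL

# e.g. 4500 positive droplets out of 15000
print(f"{ddpcr_concentration(4500, 15000):.1f} copies/uL")
```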

  13. The Constitution of the Human Embryo as Substantial Change

    PubMed Central

    Alvargonzález, David

    2016-01-01

    This paper analyzes the transformation from the human zygote to the implanted embryo under the prism of substantial change. After a brief introduction, it vindicates the Aristotelian ideas of substance and accident, and those of substantial and accidental change. It then claims that the transformation from the multicelled zygote to the implanted embryo amounts to a substantial change. Pushing further, it contends that this substantial change cannot be explained following patterns of genetic reductionism, emergence, and self-organization, and proposes Gustavo Bueno’s idea of anamorphosis as a means to encapsulate criticism against such positions. PMID:26850033

  14. The aging-disease false dichotomy: understanding senescence as pathology

    PubMed Central

    Gems, David

    2015-01-01

    From a biological perspective aging (senescence) appears to be a form of complex disease syndrome, though this is not the traditional view. This essay aims to foster a realistic understanding of aging by scrutinizing ideas old and new. The conceptual division between aging-related diseases and an underlying, non-pathological aging process underpins various erroneous traditional ideas about aging. Among biogerontologists, another likely error involves the aspiration to treat the entire aging process, which recent advances suggest is somewhat utopian. It also risks neglecting a more modest but realizable goal: to develop preventative treatments that partially protect against aging. PMID:26136770

  15. Doran-Harder-Thompson Conjecture via SYZ Mirror Symmetry: Elliptic Curves

    NASA Astrophysics Data System (ADS)

    Kanazawa, Atsushi

    2017-04-01

    We prove the Doran-Harder-Thompson conjecture in the case of elliptic curves by using ideas from SYZ mirror symmetry. The conjecture claims that when a Calabi-Yau manifold X degenerates to a union of two quasi-Fano manifolds (Tyurin degeneration), a mirror Calabi-Yau manifold of X can be constructed by gluing the two mirror Landau-Ginzburg models of the quasi-Fano manifolds. The two crucial ideas in our proof are to obtain a complex structure by gluing the underlying affine manifolds and to construct the theta functions from the Landau-Ginzburg superpotentials.

  16. Prospective study of one million deaths in India: rationale, design, and validation results.

    PubMed

    Jha, Prabhat; Gajalakshmi, Vendhan; Gupta, Prakash C; Kumar, Rajesh; Mony, Prem; Dhingra, Neeraj; Peto, Richard

    2006-02-01

    Over 75% of the annual estimated 9.5 million deaths in India occur in the home, and the large majority of these do not have a certified cause. India and other developing countries urgently need reliable quantification of the causes of death. They also need better epidemiological evidence about the relevance of physical (such as blood pressure and obesity), behavioral (such as smoking, alcohol, HIV-1 risk taking, and immunization history), and biological (such as blood lipids and gene polymorphisms) measurements to the development of disease in individuals or disease rates in populations. We report here on the rationale, design, and implementation of the world's largest prospective study of the causes and correlates of mortality. We will monitor nearly 14 million people in 2.4 million nationally representative Indian households (6.3 million people in 1.1 million households in the 1998-2003 sample frame and 7.6 million people in 1.3 million households in the 2004-2014 sample frame) for vital status and, if dead, the causes of death through a well-validated verbal autopsy (VA) instrument. About 300,000 deaths from 1998-2003 and some 700,000 deaths from 2004-2014 are expected; of these about 850,000 will be coded by two physicians to provide causes of death by gender, age, socioeconomic status, and geographical region. Pilot studies will evaluate the addition of physical and biological measurements, specifically dried blood spots. Preliminary results from over 35,000 deaths suggest that VA can ascertain the leading causes of death, reduce the misclassification of causes, and derive the probable underlying cause of death when it has not been reported. VA yields broad classification of the underlying causes in about 90% of deaths before age 70. In old age, however, the proportion of classifiable deaths is lower. By tracking underlying demographic denominators, the study permits quantification of absolute mortality rates. Household case-control, proportional mortality, and nested case-control methods permit quantification of risk factors. This study will reliably document not only the underlying cause of child and adult deaths but also key risk factors (behavioral, physical, environmental, and eventually, genetic). It offers a globally replicable model for reliably estimating cause-specific mortality using VA and strengthens India's flagship mortality monitoring system. Despite the misclassification that is still expected, the new cause-of-death data will be substantially better than that available previously.

  17. Epistemological beliefs and epistemological practices in elementary science

    NASA Astrophysics Data System (ADS)

    Kittleson, Julie M.

In this study, I examined the reciprocal relationship between third graders' epistemological beliefs and practices in the context of science instruction. Epistemological beliefs describe students' ideas about the nature of knowledge. Epistemological practices describe how students interact with dimensions of scientific knowledge. Examining the intersection between beliefs and practices involves describing how participating in science learning activities influences and is influenced by ideas about science. To examine beliefs and practices, I used interviews and classroom observations. Interview data were analyzed to ascertain students' ideas about the purpose of science and the justification, certainty, and structure/coherence of scientific knowledge. Additionally, lessons in the FOSS Human Body unit and the STC Chemical Tests unit were videotaped. These data were analyzed to examine epistemological practices. Interview and classroom data were used in combination to explore the intersection between beliefs and practices. Students held multifaceted ideas about science. They indicated that science involves description, but they also indicated that science involves generating evidence and drawing conclusions. Students indicated that ideas can change in relation to new evidence. Epistemological practices, in contrast, revealed that the investigation strategies invoked in these units underestimated students' ideas about science. Students used matching strategies to complete investigations. In the Chemical Tests unit, the teacher helped students move beyond matching by introducing the idea of molecules. Students discussed molecules in relation to their empirical investigations, indicating that when elementary students are provided with appropriate scaffolds they can expand their range of practices, which also potentially expands their beliefs. Students approached science as a repertoire of tests. They recalled ideas about the purpose of a test in one context and applied those ideas to another context. Additionally, they suggested that certain tests are appropriate for certain situations. Although students understood the purpose of the tests, they did not seem to recognize the full range of purposes underlying scientific investigations. This study highlights the challenge of designing learning environments that scaffold productive epistemological beliefs. This study also highlights the complexity of the relationship between beliefs and practices, particularly in terms of understanding the role instruction might play in mediating this relationship.

  18. JPRS Report, Arms Control

    DTIC Science & Technology

    1990-02-09

[Report contents fragments; recoverable items:] "FRG Stand on Inclusion in NATO Criticized" (Adam; BERLINER ZEITUNG, 9 Jan), p. 33; "FRG Military Policy Comes Under Criticism" (W. Wolf; NEUES…), p. 34; "Under the Powerful Leadership of the Party, Strive for New Victories in the 1990's With Full Confidence" (text excerpt on building up and modernizing the army under the management of Comrade Deng Xiaoping).

  19. Linearization of the bradford protein assay.

    PubMed

    Ernst, Orna; Zor, Tsaffrir

    2010-04-12

    Determination of microgram quantities of protein in the Bradford Coomassie brilliant blue assay is accomplished by measurement of absorbance at 590 nm. This most common assay enables rapid and simple protein quantification in cell lysates, cellular fractions, or recombinant protein samples, for the purpose of normalization of biochemical measurements. However, an intrinsic nonlinearity compromises the sensitivity and accuracy of this method. It is shown that under standard assay conditions, the ratio of the absorbance measurements at 590 nm and 450 nm is strictly linear with protein concentration. This simple procedure increases the accuracy and improves the sensitivity of the assay about 10-fold, permitting quantification down to 50 ng of bovine serum albumin. Furthermore, the interference commonly introduced by detergents that are used to create the cell lysates is greatly reduced by the new protocol. A linear equation developed on the basis of mass action and Beer's law perfectly fits the experimental data.
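    The linearization lends itself to a two-line calculation; a minimal sketch, with illustrative absorbance values rather than the published data.

```python
import numpy as np

# The A590/A450 ratio, rather than A590 alone, is fitted linearly
# against protein amount, per the linearized Bradford protocol.
protein_ng = np.array([0., 250., 500., 1000., 2000.])   # BSA per well
a590 = np.array([0.32, 0.41, 0.49, 0.62, 0.78])
a450 = np.array([0.80, 0.73, 0.67, 0.57, 0.45])

slope, intercept = np.polyfit(protein_ng, a590 / a450, 1)

def protein_from_ratio(a590_sample, a450_sample):
    """Protein amount (ng) of an unknown from its absorbance ratio."""
    return (a590_sample / a450_sample - intercept) / slope

print(f"unknown ~ {protein_from_ratio(0.55, 0.62):.0f} ng protein")
```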

  20. Non-perturbative Quantification of Ionic Charge Transfer through Nm-Scale Protein Pores Using Graphene Microelectrodes

    NASA Astrophysics Data System (ADS)

    Ping, Jinglei; Johnson, A. T. Charlie; A. T. Charlie Johnson Team

Conventional electrical methods for detecting charge transfer through protein pores perturb the electrostatic condition of the solution and the chemical reactivity of the pore, and are not suitable for complex biofluids. We developed a non-perturbative methodology (femtowatt-scale input power) for quantifying trans-pore electrical current and detecting the pore status (i.e., open vs. closed) via graphene microelectrodes. Ferritin was used as a model protein featuring a large interior compartment, well separated from the exterior solution, with discrete pores as charge-commuting channels. The charge flowing through the ferritin pores transfers into the graphene microelectrode and is recorded by an electrometer. As an example, our methodology enables the quantification of an inorganic nanoparticle-protein nanopore interaction in complex biofluids. The authors acknowledge the support of the Defense Advanced Research Projects Agency (DARPA) and the U.S. Army Research Office under Grant Number W911NF1010093.

  1. Observations on autoregulation in skeletal muscle - The effects of arterial hypoxia

    NASA Technical Reports Server (NTRS)

    Pohost, G. M.; Newell, J. B.; Hamlin, N. P.; Powell, W. J., Jr.

    1976-01-01

    An experimental study was carried out on 25 mongrel dogs of both sexes to re-evaluate autoregulation of blood flow in skeletal muscle, with particular reference to the steady-state resistance and transient response in muscle blood flow following a square wave increase in arterial perfusion pressure and to the examination of the effect of arterial hypoxia on this transient response. The data emphasize the importance of considering the transient changes in blood flow in evaluating the autoregulatory response in skeletal muscle. For quantification purposes, a parameter termed alpha is introduced which represents the ratio between the increase in blood flow from baseline to peak and the return of blood flow from the peak to the new steady-state. Such a quantification of the transient response in flow with step increases in perfusion pressure demonstrates substantial transient responses under conditions of normal oxygenation and progressive attenuation of flow transients with increasing hypoxia.
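    Read as an equation, the verbal definition of alpha corresponds to (our notational reading; the abstract states the definition only in words):

```latex
\alpha \;=\; \frac{F_{\text{peak}} - F_{\text{baseline}}}{F_{\text{peak}} - F_{\text{steady}}}
```

    where F denotes muscle blood flow; alpha equals 1 when flow returns fully to its baseline level, and grows as the autoregulatory return toward the new steady state becomes less complete.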

  2. In vivo quantification of amyloid burden in TTR-related cardiac amyloidosis

    PubMed Central

    Kollikowski, Alexander Marco; Kahles, Florian; Kintsler, Svetlana; Hamada, Sandra; Reith, Sebastian; Knüchel, Ruth; Röcken, Christoph; Mottaghy, Felix Manuel; Marx, Nikolaus; Burgmaier, Mathias

    2017-01-01

Cardiac transthyretin-related (ATTR) amyloidosis is a severe cardiomyopathy for which therapeutic approaches are currently under development. Because non-invasive imaging techniques such as cardiac magnetic resonance imaging and echocardiography are non-specific, the diagnosis of ATTR amyloidosis is still based on myocardial biopsy. Thus, diagnosis of ATTR amyloidosis is difficult in patients refusing myocardial biopsy. Furthermore, myocardial biopsy does not allow 3D-mapping and quantification of myocardial ATTR amyloid. In this report we describe a 99mTc-DPD-based molecular imaging technique for non-invasive single-step diagnosis, three-dimensional mapping and semiquantification of cardiac ATTR amyloidosis in a patient with suspected amyloid heart disease who initially rejected myocardial biopsy. This report underlines the clinical value of SPECT-based nuclear medicine imaging to enable non-invasive diagnosis of cardiac ATTR amyloidosis, particularly in patients rejecting biopsy. PMID:29259858

  3. A new catalytic-spectrophotometric method for quantification of trace amounts of nitrite in fruit juice samples.

    PubMed

    Sobhanardakani, S; Farmany, A; Abbasi, S; Cheraghi, J; Hushmandfar, R

    2013-03-01

A new kinetic method has been developed for the determination of nitrite in fruit juice samples. The method is based on the catalytic effect of nitrite on the oxidation of Nile Blue A (NBA) by KBrO3 in sulfuric acid medium. The optimized conditions are 1.2 mM sulfuric acid, 0.034 mM NBA, 2.8 × 10⁻³ M KBrO3, a reaction temperature of 20 °C, and a reaction time of 100 s, with absorbance monitored at 595.5 nm. Under the optimized conditions, the method allowed the quantification of nitrite in the range of 0.2-800 μg/mL with a detection limit of 0.02 μg/mL. The method was applied to the determination of nitrite in 15 brands of fruit juice samples.

  4. Quantification of Saturn and Enceladus tidal dissipation by astrometry after Cassini

    NASA Astrophysics Data System (ADS)

    Lainey, V.

    2017-12-01

Enceladus is the smallest moon known to harbor a global ocean under its crust. While the existence of liquid water in such quantity on so small an object is exciting from an exobiological perspective, how such an ocean arose and has been maintained over time has been much debated. The discovery of strong, largely unexpected tidal dissipation inside Saturn has turned out to be a major factor in sustaining Enceladus' ocean and geyser activity. In particular, the interior evolutions of Enceladus and Saturn appear closely related. In this talk we present how tidal mechanisms occurring inside Saturn are currently tested using astrometry. Since tidal friction may occur both in the core and in the atmosphere, examining the frequency dependence of the tidal parameters is required to assess the magnitude of both processes. Expected results using the whole Cassini data set, including a possible global quantification of Enceladus' tidal dissipation, are discussed.

  5. Uncertainty Quantification in Aeroelasticity

    NASA Astrophysics Data System (ADS)

    Beran, Philip; Stanford, Bret; Schrock, Christopher

    2017-01-01

    Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.

  6. Overview of Brain Microdialysis

    PubMed Central

    Chefer, Vladimir I.; Thompson, Alexis C.; Zapata, Agustin; Shippenberg, Toni S.

    2010-01-01

    The technique of microdialysis enables sampling and collecting of small-molecular-weight substances from the interstitial space. It is a widely used method in neuroscience and is one of the few techniques available that permits quantification of neurotransmitters, peptides, and hormones in the behaving animal. More recently, it has been used in tissue preparations for quantification of neurotransmitter release. This unit provides a brief review of the history of microdialysis and its general application in the neurosciences. The authors review the theoretical principles underlying the microdialysis process, methods available for estimating extracellular concentration from dialysis samples (i.e., relative recovery), the various factors that affect the estimate of in vivo relative recovery, and the importance of determining in vivo relative recovery to data interpretation. Several areas of special note, including impact of tissue trauma on the interpretation of microdialysis results, are discussed. Step-by-step instructions for the planning and execution of conventional and quantitative microdialysis experiments are provided. PMID:19340812
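    The central quantity reviewed, relative recovery, can be written compactly (these are standard definitions, not formulas specific to this unit):

```latex
\mathrm{RR} \;=\; \frac{C_{\text{dialysate}}}{C_{\text{ecf}}}
\qquad\Longrightarrow\qquad
\hat{C}_{\text{ecf}} \;=\; \frac{C_{\text{dialysate}}}{\mathrm{RR}}
```

    so an in vivo estimate of RR (for instance from a no-net-flux calibration) is what converts a measured dialysate concentration into an estimate of the extracellular concentration.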

  7. Covalent functionalization of single-walled carbon nanotubes with polytyrosine: Characterization and analytical applications for the sensitive quantification of polyphenols.

    PubMed

    Eguílaz, Marcos; Gutiérrez, Alejandro; Gutierrez, Fabiana; González-Domínguez, Jose Miguel; Ansón-Casaos, Alejandro; Hernández-Ferrer, Javier; Ferreyra, Nancy F; Martínez, María T; Rivas, Gustavo

    2016-02-25

This work reports the synthesis and characterization of single-walled carbon nanotubes (SWCNT) covalently functionalized with polytyrosine (Polytyr); the critical analysis of the experimental conditions to obtain an efficient dispersion of the modified carbon nanotubes; and the analytical performance of glassy carbon electrodes (GCE) modified with the dispersion (GCE/SWCNT-Polytyr) for the highly sensitive quantification of polyphenols. Under the optimal conditions, the calibration plot for the amperometric response of gallic acid (GA) shows a linear range between 5.0 × 10⁻⁷ and 1.7 × 10⁻⁴ M, with a sensitivity of (518 ± 5) mA M⁻¹ cm⁻², and a detection limit of 8.8 nM. The proposed sensor was successfully used for the determination of total polyphenolic content in tea extracts. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Uncertainty quantification in Eulerian-Lagrangian models for particle-laden flows

    NASA Astrophysics Data System (ADS)

    Fountoulakis, Vasileios; Jacobs, Gustaaf; Udaykumar, Hs

    2017-11-01

    A common approach to ameliorate the computational burden in simulations of particle-laden flows is to use a point-particle based Eulerian-Lagrangian model, which traces individual particles in their Lagrangian frame and models particles as mathematical points. The particle motion is determined by Stokes drag law, which is empirically corrected for Reynolds number, Mach number and other parameters. The empirical corrections are subject to uncertainty. Treating them as random variables renders the coupled system of PDEs and ODEs stochastic. An approach to quantify the propagation of this parametric uncertainty to the particle solution variables is proposed. The approach is based on averaging of the governing equations and allows for estimation of the first moments of the quantities of interest. We demonstrate the feasibility of our proposed methodology of uncertainty quantification of particle-laden flows on one-dimensional linear and nonlinear Eulerian-Lagrangian systems. This research is supported by AFOSR under Grant FA9550-16-1-0008.
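    A Monte Carlo illustration of the parametric uncertainty described: the empirical correction to Stokes drag is treated as a random variable and propagated through a simple particle-velocity ODE to the first two moments of the solution. The Schiller-Naumann-type form, the coefficient spread, and the Re scaling are assumptions for illustration; the paper itself estimates moments by averaging the governing equations rather than by sampling.

```python
import numpy as np

rng = np.random.default_rng(1)
tau_p, u_gas = 1e-3, 10.0      # particle response time (s), gas speed (m/s)
dt, n_steps = 1e-5, 100        # integrate up to t = tau_p

def particle_velocity(c):
    """Particle speed at t = tau_p for drag correction f = 1 + c*Re^0.687."""
    v = 0.0
    for _ in range(n_steps):
        re = 6.0 * abs(u_gas - v)            # illustrative Re scaling
        f = 1.0 + c * re ** 0.687            # empirical drag correction
        v += dt * f * (u_gas - v) / tau_p    # corrected Stokes drag
    return v

# Sample the uncertain coefficient (10 % relative spread, assumed).
samples = [particle_velocity(c) for c in rng.normal(0.15, 0.015, 500)]
print(f"mean v = {np.mean(samples):.3f} m/s, std = {np.std(samples):.4f} m/s")
```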

  9. AGScan: a pluggable microarray image quantification software based on the ImageJ library.

    PubMed

    Cathelin, R; Lopez, F; Klopp, Ch

    2007-01-15

Many different programs are available to analyze microarray images. Most programs are commercial packages; some are free. In the latter group, only a few propose automatic grid alignment and a batch mode. More often than not, a program implements only one quantification algorithm. AGScan is an open source program that works on all major platforms. It is based on the ImageJ library [Rasband (1997-2006)] and offers a plug-in extension system to add new functions to manipulate images, align grids and quantify spots. It is appropriate for daily laboratory use and also as a framework for new algorithms. The program is freely distributed under the X11 Licence. The install instructions can be found in the user manual. The software can be downloaded from http://mulcyber.toulouse.inra.fr/projects/agscan/. Questions and plug-ins can be sent to the contact listed below.

  10. Plato's Ideas and the Theories of Modern Particle Physics: Amazing Parallels

    NASA Astrophysics Data System (ADS)

    Machleidt, Ruprecht

    2006-05-01

It is generally known that the question, "What are the most elementary particles that all matter is made from?", was already posed in antiquity. The Greek natural philosophers Leucippus and Democritus were the first to suggest that all matter was made from atoms. Therefore, most people perceive them as the ancient fathers of elementary particle physics. However, this perception is wrong. Modern particle physics is not just a simple atomism. The characteristic point of modern particle theory is that it is concerned with the symmetries underlying the particles we discover in experiment. More than 2000 years ago, a similar idea was advanced by the Greek philosopher Plato in his dialogue Timaeus: geometric symmetries generate the atoms from just a few even more elementary items. Plato's vision is amazingly close to the ideas of modern particle theory. This fact, which is unfortunately little known, has been pointed out repeatedly by Werner Heisenberg.

  11. Data on eye behavior during idea generation and letter-by-letter reading.

    PubMed

    Walcher, Sonja; Körner, Christof; Benedek, Mathias

    2017-12-01

This article describes data from an idea generation task (alternate uses task; Guilford, 1967 [1]) and a letter-by-letter reading task under two background brightness conditions with healthy adults, as well as a baseline measurement and questionnaire data (SIPI, Huba et al., 1981 [2]; DDFS, Singer and Antrobus, 1972 [3]; RIBS, Runco et al., 2001 [4]). Data are hosted at the Open Science Framework (OSF): https://osf.io/fh66g/ (Walcher et al., 2017) [5]. There you will find eye tracking data, task performance data, questionnaire data, analysis scripts (in R; R Core Team, 2017 [6]), the eye tracking paradigms (in Experiment Builder; SR Research Ltd. [7]), and graphs of pupil and angle-of-eye-vergence dynamics. The data are interpreted and discussed in the article 'Looking for ideas: Eye behavior during goal-directed internally focused cognition' (Walcher et al., 2017) [8].

  12. Determining Need for School-Based Physical Therapy Under IDEA: Commonalities Across Practice Guidelines.

    PubMed

    Vialu, Carlo; Doyle, Maura

    2017-10-01

The Individuals with Disabilities Education Act (IDEA) includes physical therapy (PT) as a related service that may be provided to help students with disabilities benefit from their education. However, the IDEA does not provide specific guidance for the provision of school-based PT, resulting in variations in practice across the United States. The authors examined 22 state and local education agency guidelines available online to find commonalities related to the determination of a student's need for PT. Seven commonalities were found: educational benefit; team decision; need for PT expertise; establishment of Individualized Education Program (IEP) goals before determining the need for PT; the distinction between medical and educational PT; whether the student's disability adversely affects education; and the student's potential for improvement. These commonalities are discussed in relation to the current PT and special education literature. The article suggests applying these commonalities as procedural requirements and as questions for discussion during an IEP team meeting.

  13. Did dinosaurs have megakaryocytes? New ideas about platelets and their progenitors

    PubMed Central

    Brass, Lawrence F.

    2005-01-01

    Biological evolution has struggled to produce mechanisms that can limit blood loss following injury. In humans and other mammals, control of blood loss (hemostasis) is achieved through a combination of plasma proteins, most of which are made in the liver, and platelets, anucleate blood cells that are produced in the bone marrow by megakaryocytes. Much has been learned about the underlying mechanisms, but much remains to be determined. The articles in this series review current ideas about the production of megakaryocytes from undifferentiated hematopoietic precursors, the steps by which megakaryocytes produce platelets, and the molecular mechanisms within platelets that make hemostasis possible. The underlying theme that connects the articles is the intense investigation of a complex system that keeps humans from bleeding to death, but at the same time exposes us to increased risk of thrombosis and vascular disease. PMID:16322776

  14. Bad is freer than good: Positive-negative asymmetry in attributions of free will.

    PubMed

    Feldman, Gilad; Wong, Kin Fai Ellick; Baumeister, Roy F

    2016-05-01

Recent findings support the idea that the belief in free will serves as the basis for moral responsibility, thus promoting the punishment of immoral agents. We theorized that free will extends beyond morality to serve as the basis for accountability and the capacity for change more broadly, not only for others but also for the self. Five experiments showed that people attributed higher freedom of will to negative than to positive valence, regardless of morality or intent, for both self and others. In recalling everyday life situations and in classical decision-making paradigms, negative actions, negative outcomes, and negative framing were attributed higher free will than positive ones. Free will attributions were mainly driven by action or outcome valence, but not intent. These findings show consistent support for the idea that free will underlies laypersons' sense-making for accountability and change under negative circumstances. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Origins of the historiography of modern Greek science.

    PubMed

    Patiniotis, Manolis

    2008-01-01

The purpose of the paper is to examine how Greek historians account for the presence of modern scientific ideas in the intellectual environment of eighteenth-century Greek-speaking society. It also discusses the function of the history of modern Greek science in the context of Greek national historiography. As will be shown, the history of modern Greek science spent most of its life under the shadow of the history of ideas. Despite its seemingly secondary role, however, it occupied a distinctive place within national historiography because it formed the ground upon which different perceptions of the country's European identity converged. In this respect, one of the main goals of this paper is to outline the particular ideological presumptions which shaped the historiography of modern Greek science under different historical circumstances. At the end, an attempt is made to articulate a viewpoint more in line with recent methodological developments in the history of science.

  16. Simultaneous Quantification of Seven Bioactive Flavonoids in Citri Reticulatae Pericarpium by Ultra-Fast Liquid Chromatography Coupled with Tandem Mass Spectrometry.

    PubMed

    Zhao, Lian-Hua; Zhao, Hong-Zheng; Zhao, Xue; Kong, Wei-Jun; Hu, Yi-Chen; Yang, Shi-Hai; Yang, Mei-Hua

    2016-05-01

Citri Reticulatae Pericarpium (CRP) is a commonly-used traditional Chinese medicine with flavonoids as the major bioactive components. Nevertheless, the contents of the flavonoids in CRP of different sources may vary significantly, affecting their therapeutic effects. Thus, setting up a reliable and comprehensive quality assessment method for flavonoids in CRP is necessary. The objective was to set up a rapid and sensitive ultra-fast liquid chromatography coupled with tandem mass spectrometry (UFLC-MS/MS) method for simultaneous quantification of seven bioactive flavonoids in CRP. A UFLC-MS/MS method coupled to ultrasound-assisted extraction was developed for simultaneous separation and quantification of seven flavonoids including hesperidin, neohesperidin, naringin, narirutin, tangeretin, nobiletin and sinensetin in 16 batches of CRP samples from different sources in China. The established method showed good linearity for all analytes with correlation coefficients (R) over 0.9980, together with satisfactory accuracy, precision and reproducibility. Furthermore, the recoveries at the three spiked levels were higher than 89.71% with relative standard deviations (RSDs) lower than 5.19%. The results indicated that the contents of the seven bioactive flavonoids in CRP varied significantly among different sources. Hesperidin showed the highest content in all 16 samples, ranging from 27.50 to 86.30 mg/g (27.50 mg/g in CRP-15 and 86.30 mg/g in CRP-9), while the amount of narirutin was too low to be measured in some samples. This study revealed that the developed UFLC-MS/MS method is simple, sensitive and reliable for simultaneous quantification of multiple components in CRP, with potential for the quality control of complex matrices. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Validation of a simple and fast method to quantify in vitro mineralization with fluorescent probes used in molecular imaging of bone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moester, Martiene J.C.; Schoeman, Monique A.E.; Oudshoorn, Ineke B.

    2014-01-03

Highlights: • We validate a simple and fast method for quantification of in vitro mineralization. • Fluorescently labeled agents can detect calcium deposits in the mineralized matrix of cell cultures. • Fluorescent signals of the probes correlated with Alizarin Red S staining. -- Abstract: Alizarin Red S staining is the standard method to indicate and quantify matrix mineralization during differentiation of osteoblast cultures. KS483 cells are multipotent mouse mesenchymal progenitor cells that can differentiate into chondrocytes, adipocytes and osteoblasts and are a well-characterized model for the study of bone formation. Matrix mineralization is the last step of differentiation of bone cells and is therefore a very important outcome measure in bone research. Fluorescently labelled calcium chelating agents, e.g. BoneTag and OsteoSense, are currently used for in vivo imaging of bone. The aim of the present study was to validate these probes for fast and simple detection and quantification of in vitro matrix mineralization by KS483 cells, thus enabling high-throughput screening experiments. KS483 cells were cultured under osteogenic conditions in the presence of compounds that either stimulate or inhibit osteoblast differentiation and thereby matrix mineralization. After 21 days of differentiation, fluorescence of stained cultures was quantified with a near-infrared imager and compared to Alizarin Red S quantification. Fluorescence of both probes closely correlated with Alizarin Red S staining under both inhibiting and stimulating conditions. In addition, both compounds displayed specificity for mineralized nodules. We therefore conclude that this method of quantification of bone mineralization using fluorescent compounds is a good alternative to Alizarin Red S staining.

  18. New approach for the quantification of metallic species in healthcare products based on optical switching of a Schiff base possessing ONO donor set.

    PubMed

    Singh, Jaswant; Parkash, Jyoti; Kaur, Varinder; Singh, Raghubir

    2017-10-05

A new method is reported for the quantification of some metallic components of healthcare products utilizing a Schiff base chelator derived from 2-hydroxyacetophenone and ethanolamine. The Schiff base chelator recognizes metallic species such as iron, copper and zinc (important components of some healthcare products), and cadmium (a common contaminant in healthcare products), giving a colorimetric/fluorimetric response. It coordinates with Fe²⁺/Fe³⁺ and Cu²⁺ ions via its ONO donor set and switches the colour to bright red, green and orange, respectively. Similarly, it switches 'ON' a fluorimetric response when coordinating with Zn²⁺ and Cd²⁺ ions. The colorimetric and fluorimetric responses of the ONO Schiff base are investigated in detail. Job plots for the complexation of the ONO switch with the various metal ions suggested the formation of 1:1 (metal:chelator) complexes with Fe²⁺, Fe³⁺, and Cu²⁺, and 1:2 (metal:chelator) complexes with Zn²⁺ and Cd²⁺ ions. The limits of detection are 6.73, 18.0, 25.0, 0.65, and 1.10 μM, and the limits of quantification are 27.0, 72.0, 100.0, 2.60, and 4.40 μM for Fe²⁺, Fe³⁺, Cu²⁺, Zn²⁺ and Cd²⁺ ions, respectively. Under the optimized conditions, the chelator was used for the quantification of the important metals present in healthcare products, via direct dissolution and furnace treatment during sample preparation. The results were precise and accurate for both sample preparation techniques. Copyright © 2017 Elsevier B.V. All rights reserved.
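    The reported detection and quantification limits are consistent with the common calibration-based definitions LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the blank (or residual) standard deviation and S the calibration slope. A minimal sketch with illustrative numbers, not the paper's raw data:

```python
def lod_loq(sigma_blank, slope):
    """Calibration-based limits of detection and quantification."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# e.g. blank noise 0.004 a.u., slope 2.0e-3 a.u./uM
lod, loq = lod_loq(sigma_blank=0.004, slope=2.0e-3)
print(f"LOD ~ {lod:.2f} uM, LOQ ~ {loq:.2f} uM")
```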

  19. UFLC-ESI-MS/MS analysis of multiple mycotoxins in medicinal and edible Areca catechu.

    PubMed

    Liu, Hongmei; Luo, Jiaoyang; Kong, Weijun; Liu, Qiutao; Hu, Yichen; Yang, Meihua

    2016-05-01

A robust, sensitive and reliable ultra-fast liquid chromatography combined with electrospray ionization tandem mass spectrometry (UFLC-ESI-MS/MS) method was optimized and validated for simultaneous identification and quantification of eleven mycotoxins in medicinal and edible Areca catechu, based on one-step extraction without any further clean-up. Separation and quantification were performed in both positive and negative modes under multiple reaction monitoring (MRM) in a single run, with zearalanone (ZAN) as internal standard. The chromatographic conditions and MS/MS parameters were carefully optimized. Matrix-matched calibration was recommended to reduce matrix effects and improve accuracy, showing good linearity within wide concentration ranges. Limits of quantification (LOQ) were lower than 50 μg kg⁻¹, while limits of detection (LOD) were in the range of 0.1-20 μg kg⁻¹. The accuracy of the developed method was validated through recoveries, ranging from 85% to 115% with relative standard deviation (RSD) ≤14.87% at the low level, from 75% to 119% with RSD ≤14.43% at the medium level, and from 61% to 120% with RSD ≤13.18% at the high level. Finally, the developed multi-mycotoxin method was applied for screening of these mycotoxins in 24 commercial samples. Only aflatoxin B2 and zearalenone were found, in 2 samples. This is the first report on the application of UFLC-ESI(+/-)-MS/MS for multi-class mycotoxins in A. catechu. The developed method, with the advantages of simple pretreatment, rapid determination and high sensitivity, is a proposed candidate for large-scale detection and quantification of multiple mycotoxins in other complex matrices. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Magnetic resonance fingerprinting using echo-planar imaging: Joint quantification of T1 and T2∗ relaxation times.

    PubMed

    Rieger, Benedikt; Zimmer, Fabian; Zapp, Jascha; Weingärtner, Sebastian; Schad, Lothar R

    2017-11-01

To develop an implementation of the magnetic resonance fingerprinting (MRF) paradigm for quantitative imaging using echo-planar imaging (EPI) for simultaneous assessment of T1 and T2*. The proposed MRF method (MRF-EPI) is based on the acquisition of 160 gradient-spoiled EPI images with rapid, parallel-imaging accelerated, Cartesian readout and a measurement time of 10 s per slice. Contrast variation is induced using an initial inversion pulse, and by varying the flip angles, echo times, and repetition times throughout the sequence. Joint quantification of T1 and T2* is performed using dictionary matching with integrated B1+ correction. The quantification accuracy of the method was validated in phantom scans and in vivo in 6 healthy subjects. Joint T1 and T2* parameter maps acquired with MRF-EPI in phantoms are in good agreement with reference measurements, showing deviations under 5% and 4% for T1 and T2*, respectively. In vivo baseline images were visually free of artifacts. In vivo relaxation times are in good agreement with gold-standard techniques (deviation T1: 4 ± 2%, T2*: 4 ± 5%). The visual quality was comparable to the in vivo gold standard, despite substantially shortened scan times. The proposed MRF-EPI method provides fast and accurate T1 and T2* quantification. This approach offers a rapid supplement to the non-Cartesian MRF portfolio, with potentially increased usability and robustness. Magn Reson Med 78:1724-1733, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
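    A minimal sketch of the dictionary-matching step in MRF: the measured signal evolution is compared against precomputed evolutions (one per (T1, T2*) pair) and the best normalized inner-product match gives the estimate. The two-parameter signal model below is a crude placeholder for the true Bloch simulation of the MRF-EPI sequence (which would also encode the varying flip angles and the B1+ correction); the timing grids are illustrative assumptions.

```python
import numpy as np

t = np.linspace(0.05, 10.0, 160)    # time after inversion (s), illustrative
te = np.linspace(0.01, 0.08, 160)   # varying echo times (s), illustrative

def simulate(t1, t2s):
    """Placeholder signal model: inversion recovery times T2* decay."""
    return (1.0 - 2.0 * np.exp(-t / t1)) * np.exp(-te / t2s)

grid = [(t1, t2s) for t1 in np.arange(0.2, 3.0, 0.05)
                  for t2s in np.arange(0.01, 0.20, 0.005)]
dictionary = np.array([simulate(*p) for p in grid])
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

# A noisy "measurement" with T1 = 1.20 s, T2* = 0.050 s.
measured = simulate(1.20, 0.050)
measured += np.random.default_rng(0).normal(0.0, 0.01, measured.size)
best = np.argmax(np.abs(dictionary @ (measured / np.linalg.norm(measured))))
print(f"matched T1 = {grid[best][0]:.2f} s, T2* = {grid[best][1]:.3f} s")
```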

  1. Picoliter Well Array Chip-Based Digital Recombinase Polymerase Amplification for Absolute Quantification of Nucleic Acids.

    PubMed

    Li, Zhao; Liu, Yong; Wei, Qingquan; Liu, Yuanjie; Liu, Wenwen; Zhang, Xuelian; Yu, Yude

    2016-01-01

Absolute, precise quantification methods expand the scope of nucleic acids research and have many practical applications. Digital polymerase chain reaction (dPCR) is a powerful method for nucleic acid detection and absolute quantification. However, it requires thermal cycling and accurate temperature control, which are difficult in resource-limited conditions. Accordingly, isothermal methods, such as recombinase polymerase amplification (RPA), are more attractive. We developed a picoliter well array (PWA) chip with 27,000 consistently sized picoliter reactions (314 pL) for isothermal DNA quantification using digital RPA (dRPA) at 39°C. Sample loading using a scraping liquid blade was simple, fast, and required small reagent volumes (i.e., <20 μL). Passivating the chip surface using a methoxy-PEG-silane agent effectively eliminated cross-contamination during dRPA. Our creative optical design enabled wide-field fluorescence imaging in situ and both end-point and real-time analyses of picoliter wells in a 6-cm² area. It was not necessary to use scan shooting and stitch serial small images together. Using this method, we quantified serial dilutions of a Listeria monocytogenes gDNA stock solution from 9 × 10⁻¹ to 4 × 10⁻³ copies per well with an average error of less than 11% (N = 15). Overall dRPA-on-chip processing required less than 30 min, a 4-fold decrease compared to dPCR, which requires approximately 2 h. dRPA on the PWA chip provides a simple and highly sensitive method to quantify nucleic acids without thermal cycling or precise micropump/microvalve control. It has applications in fast field analysis and critical clinical diagnostics under resource-limited settings.

  3. Game Theory and Uncertainty Quantification for Cyber Defense Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna

    Cyber-system defenders face the challenging task of protecting critical assets and information continually against multiple types of malicious attackers. Defenders typically operate within resource constraints while attackers operate at relatively low costs. As a result, design and development of resilient cyber-systems that can support mission goals under attack while accounting for the dynamics between attackers and defenders is an important research problem.

  4. Flow cytometric immunofluorescence of rat anterior pituitary cells

    NASA Technical Reports Server (NTRS)

    Hatfield, J. Michael; Hymer, W. C.

    1985-01-01

    A flow cytometric immunofluorescence technique was developed for the quantification of growth hormone, prolactin, and luteinizing hormone producing cells. The procedure is based on indirect-immunofluorescence of intracellular hormone using an EPICS V cell sorter and can objectively count 50,000 cells in about 3 minutes. It can be used to study the dynamics of pituitary cell populations under various physiological and pharmacological conditions.

  5. Structural Health Monitoring 2007: Quantification, Validation, and Implementation

    DTIC Science & Technology

    2007-11-30

[Conference program fragments; recoverable items:] "A Novel MEMS Strain Sensor for Structural Health Monitoring Applications under Harsh Environmental Conditions" (Matthew Malkin…, p. 121); session "Wave Propagation Models in Damage Assessment" (chair: Wieslaw Ostachowicz, Polish Academy of Sciences; W. Ostachowicz and P. Kudela); "Low Impact Damage Detection and Analysis with Thin Film Piezo-electric Sensors" (Samuel…, University of Dayton Research Institute, p. 1064).

  6. Validation of the Identification and Intervention for Dementia in Elderly Africans (IDEA) cognitive screen in Nigeria and Tanzania.

    PubMed

    Paddick, Stella-Maria; Gray, William K; Ogunjimi, Luqman; Lwezuala, Bingileki; Olakehinde, Olaide; Kisoli, Aloyce; Kissima, John; Mbowe, Godfrey; Mkenda, Sarah; Dotchin, Catherine L; Walker, Richard W; Mushi, Declare; Collingwood, Cecilia; Ogunniyi, Adesola

    2015-04-25

    We have previously described the development of the Identification and Intervention for Dementia in Elderly Africans (IDEA) cognitive screen for use in populations with low levels of formal education. The IDEA cognitive screen was developed and field-tested in an elderly, community-based population in rural Tanzania with a relatively high prevalence of cognitive impairment. The aim of this study was to validate the IDEA cognitive screen as an assessment of major cognitive impairment in hospital settings in Nigeria and Tanzania. In Nigeria, 121 consecutive elderly medical clinic outpatients reviewed at the University College Hospital, Ibadan were screened using the IDEA cognitive screen. In Tanzania, 97 consecutive inpatients admitted to Mawenzi Regional Hospital (MRH), Moshi, and 108 consecutive medical clinic outpatients attending the geriatric medicine clinic at MRH were screened. Inter-rater reliability was assessed in Tanzanian outpatients attending St Joseph's Hospital in Moshi using three raters. A diagnosis of dementia or delirium (DSM-IV criteria) was classified as major cognitive impairment and was provided independently by a physician blinded to the results of the screening assessment. The area under the receiver operating characteristic (AUROC) curve in Nigerian outpatients, Tanzanian outpatients and Tanzanian inpatients was 0.990, 0.919 and 0.917, respectively. Inter-rater reliability was good (intra-class correlation coefficient 0.742 to 0.791). In regression models, the cognitive screen did not appear to be educationally biased. The IDEA cognitive screen performed well in these populations and should prove useful in screening for dementia and delirium in other areas of sub-Saharan Africa.
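
    For readers unfamiliar with the reported statistic, the AUROC simply measures how well screen scores rank impaired above non-impaired patients. A toy sketch (entirely hypothetical data; it assumes lower IDEA scores indicate impairment):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical data: 0/1 physician diagnosis (DSM-IV major cognitive
# impairment) and the IDEA screen score for ten outpatients.
diagnosis = np.array([1, 1, 0, 0, 1, 0, 0, 1, 0, 0])
idea_score = np.array([3, 4, 9, 11, 5, 10, 4, 2, 12, 7])

# Lower IDEA scores suggest impairment, so negate the score before
# computing the AUROC (which expects higher values for positives).
print(f"AUROC = {roc_auc_score(diagnosis, -idea_score):.3f}")
```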

  7. Satellite Re-entry Modeling and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Horsley, M.

    2012-09-01

    LEO trajectory modeling is a fundamental aerospace capability and has applications in many areas of aerospace, such as maneuver planning, sensor scheduling, re-entry prediction, collision avoidance, risk analysis, and formation flying. Somewhat surprisingly, modeling the trajectory of an object in low Earth orbit is still a challenging task. This is primarily due to the large uncertainty in the upper atmospheric density, about 15-20% (1-sigma) for most thermosphere models. Other contributions come from our inability to precisely model future solar and geomagnetic activities, the potentially unknown shape, material construction and attitude history of the satellite, and intermittent, noisy tracking data. Current methods to predict a satellite's re-entry trajectory typically involve making a single prediction, with the uncertainty dealt with in an ad hoc manner, usually based on past experience. However, due to the extreme speed of a LEO satellite, even small uncertainties in the re-entry time translate into a very large uncertainty in the location of the re-entry event. Currently, most methods simply update the re-entry estimate on a regular basis. This results in a wide range of estimates that are spread over the entire globe. With no understanding of the underlying distribution of potential impact points, the sequence of impact points predicted by the current methodology is largely useless until just a few hours before re-entry. This paper will discuss the development of a set of High Performance Computing (HPC)-based capabilities to support near real-time quantification of the uncertainty inherent in uncontrolled satellite re-entries. An appropriate management of the uncertainties is essential for a rigorous treatment of the re-entry/LEO trajectory problem. The development of HPC-based tools for re-entry analysis is important as it will allow a rigorous and robust approach to risk assessment by decision makers in an operational setting. Uncertainty quantification results from the recent uncontrolled re-entry of the Phobos-Grunt satellite will be presented and discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
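
    As an illustration of why density uncertainty dominates, a toy Monte Carlo sketch (not the paper's HPC pipeline; the model and all numbers are assumptions) shows how a ~17.5% 1-sigma density error smears the predicted impact point around the final orbit:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Toy model (not the paper's): remaining orbital lifetime scales inversely
# with atmospheric density; the impact point is the re-entry time wrapped
# onto one ~92-minute orbit.
nominal_lifetime_hr = 48.0
density_scale = rng.normal(1.0, 0.175, n)        # ~17.5% 1-sigma density error
density_scale = np.clip(density_scale, 0.4, None)
lifetime_hr = nominal_lifetime_hr / density_scale

orbit_period_hr = 92.0 / 60.0
phase = (lifetime_hr % orbit_period_hr) / orbit_period_hr  # position along final orbit

print(f"re-entry time: {lifetime_hr.mean():.1f} +/- {lifetime_hr.std():.1f} hr")
# A std near 0.29 (that of a uniform variable on [0, 1]) means the impact
# point is effectively smeared around the entire ground track.
print(f"ground-track phase std: {phase.std():.2f}")
```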

  8. Computer aided system engineering for space construction

    NASA Technical Reports Server (NTRS)

    Racheli, Ugo

    1989-01-01

    This viewgraph presentation covers the following topics. Construction activities envisioned for the assembly of large platforms in space (as well as interplanetary spacecraft and bases on extraterrestrial surfaces) require computational tools that exceed the capability of conventional construction management programs. The Center for Space Construction is investigating the requirements for new computational tools and, at the same time, suggesting the expansion of graduate and undergraduate curricula to include proficiency in Computer Aided Engineering (CAE) through design courses and individual or team projects in advanced space systems design. In the center's research, special emphasis is placed on problems of constructability and of the interruptability of planned activity sequences to be carried out by crews operating under hostile environmental conditions. The departure point for the planned work is the acquisition of the MCAE I-DEAS software, developed by the Structural Dynamics Research Corporation (SDRC), and its expansion to the level of capability denoted by the acronym IDEAS**2, currently used for configuration maintenance on Space Station Freedom. In addition to improving proficiency in the use of I-DEAS and IDEAS**2, it is contemplated that new software modules will be developed to expand the architecture of IDEAS**2. Such modules will deal with those analyses that require the integration of a space platform's configuration with a breakdown of planned construction activities and with a failure modes analysis to support computer aided system engineering (CASE) applied to space construction.

  9. Holding On; Being Held; Letting Go: The Relevance of Bion's Thinking for Psychoanalytic Work with Parents, Infants and Children under Five

    ERIC Educational Resources Information Center

    Emanuel, Louise

    2012-01-01

    This paper attempts to convey how the ideas of Klein, Bion and Bick underpin psychoanalytically based interventions with parents, babies and young children in the Camden Under Fives' Service, Tavistock Clinic. As the title suggests, my focus is on ways in which anxiety relating to separation and loss can be contained through the transformative…

  10. Collective European Security Forces: An Idea Whose Time Has Come

    DTIC Science & Technology

    1990-04-01

    Report excerpts: units undergoing equipment conversion (Nike); units to relocate under the new air defense concept (CRCs, Patriot and IHawk units); units converting by personnel... "Eastern Europe is traditionally one of the most volatile parts of the world. It has remained a volatile region under the Soviet empire. Gorba..." ...levels of general purpose forces with offense-oriented capabilities, namely tanks, artillery, armoured troop carriers (ATCs), strike aircraft and

  11. Generic method for the absolute quantification of glutathione S-conjugates: Application to the conjugates of acetaminophen, clozapine and diclofenac.

    PubMed

    den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M

    2017-03-01

    Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates are often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although ¹H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in quantification of GSH-conjugates, which generally are formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen; quantification was consistent with ¹H NMR, but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.

  12. Selected Aspects of Soil Science History in the USA - Prehistory to the 1970s

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.; Fenton, Thomas E.; Homburg, Jeffrey A.

    2017-04-01

    Interest in understanding America's soils originated in prehistory with Native Americans. Following European settlement, notable individuals such as Thomas Jefferson and Lewis and Clark made observations of soil resources. Moving into the 1800s, state geological surveys became involved in soil work and E.W. Hilgard started to formulate ideas similar to those that would eventually lead to V.V. Dokuchaev being recognized as the father of modern soil science. However, Hilgard's advanced ideas on soil genesis were not accepted by the wider American soil science community at the time. Moving into the 1900s, the National Cooperative Soil Survey, the first nationally organized detailed soil survey in the world, was founded under the direction of M. Whitney. Initial soil classification ideas were heavily based in geology, but over time Russian ideas of soil genesis and classification moved into the American soil science community, mainly due to the influence of C.F. Marbut. Early American efforts in scientific study of soil erosion and soil fertility were also initiated in the 1910s and university programs to educate soil scientists started. Soil erosion studies took on high priority in the 1930s as the USA was impacted by the Dust Bowl. Soil Taxonomy, one of the most widely utilized soil classification systems in the world, was developed from the 1950s through the 1970s under the guidance of G.D. Smith and with administrative support from C.E. Kellogg. American soil scientists, such as H. Jenny, R.W. Simonson, D.L. Johnson, and D. Watson-Stegner, developed influential models of soil genesis during the 20th Century, and the use of soil information expanded beyond agriculture to include issues such as land-use planning, soil geomorphology, and interactions between soils and human health.

  13. Toward High School Biology: Helping Middle School Students Understand Chemical Reactions and Conservation of Mass in Nonliving and Living Systems.

    PubMed

    Herrmann-Abell, Cari F; Koppal, Mary; Roseman, Jo Ellen

    2016-01-01

    Modern biology has become increasingly molecular in nature, requiring students to understand basic chemical concepts. Studies show, however, that many students fail to grasp ideas about atom rearrangement and conservation during chemical reactions or the application of these ideas to biological systems. To help provide students with a better foundation, we used research-based design principles and collaborated in the development of a curricular intervention that applies chemistry ideas to living and nonliving contexts. Six eighth grade teachers and their students participated in a test of the unit during the Spring of 2013. Two of the teachers had used an earlier version of the unit the previous spring. The other four teachers were randomly assigned either to implement the unit or to continue teaching the same content using existing materials. Pre- and posttests were administered, and the data were analyzed using Rasch modeling and hierarchical linear modeling. The results showed that, when controlling for pretest score, gender, language, and ethnicity, students who used the curricular intervention performed better on the posttest than the students using existing materials. Additionally, students who participated in the intervention held fewer misconceptions. These results demonstrate the unit's promise in improving students' understanding of the targeted ideas. © 2016 C. F. Herrmann-Abell et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  14. The use of Interferometric Microscopy to assess 3D modifications of deteriorated medieval glass.

    NASA Astrophysics Data System (ADS)

    Gentaz, L.; Lombardo, T.; Chabas, A.

    2012-04-01

    Due to its low durability, Northern European medieval glass undergoes the action of the atmospheric environment, leading in some cases to a state of dramatic deterioration. Modification features vary from a simple loss of transparency to severe material loss. In order to understand the underlying mechanisms and preserve this heritage, fundamental research is also necessary. To this end, field exposure of analogues and original stained glass was carried out to study the early stages of glass weathering. Model glass and original stained glass (after removal of deterioration products) were exposed in real conditions at an urban site (Paris) for 48 months. Regular withdrawal of samples allowed a follow-up of short-term glass evolution. Morphological modifications of the exposed samples were investigated through conventional and non-destructive microscopy, using respectively a Scanning Electron Microscope (SEM) and an Interferometric Microscope (IM). The latter allows a 3D quantification of the object with no sample preparation. For all glasses, both surface recession and build-up of deposits were observed as a consequence of a leaching process (interdiffusion of protons and glass cations). The build-up of a deposit comes from the reaction between the extracted glass cations and atmospheric gases. Surface recession, instead, is due mainly to the formation of a brittle layer of altered glass at the sub-surface, where a fracture network can appear, leading to the scaling of parts of this modified glass. Finally, dissolution of the glass takes place, inducing the formation of pits and craters. The arithmetic roughness (Ra) was used as an indicator of weathering, in order to evaluate the deterioration state. For instance, Ra grew from a few tens of nm for pristine glass to thousands of nm for scaled areas. This technique also allowed a precise quantification of the dimensions (height, depth and width) of deposits and pits, and the estimation of their overall distribution. Finally, IM allows the quantification of the volume of matter lost through scaling, by studying the surface coverage of the different depths. These preliminary studies show that IM is a very effective, non-destructive and non-intrusive technique for the quantification of glass weathering, to be used complementarily to other investigations. Further applications, especially to the description of the deposits, are currently under way.
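
    The quantity driving the weathering assessment is simple to compute. A minimal sketch of the arithmetic roughness Ra over a height map (synthetic data, with magnitudes chosen only to echo the pristine vs. scaled values above):

```python
import numpy as np

def arithmetic_roughness(height_map_nm: np.ndarray) -> float:
    """Arithmetic roughness Ra: mean absolute deviation of surface
    heights from the mean plane, over a 2D interferometric field of view."""
    z = height_map_nm - height_map_nm.mean()
    return float(np.abs(z).mean())

# Illustrative surfaces: near-pristine glass vs. a heavily scaled area.
pristine = np.random.default_rng(1).normal(0.0, 40.0, (256, 256))  # tens of nm
scaled = pristine + 3000.0 * (np.random.default_rng(2).random((256, 256)) < 0.3)
print(arithmetic_roughness(pristine), arithmetic_roughness(scaled))
```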

  15. Geometric steering criterion for two-qubit states

    NASA Astrophysics Data System (ADS)

    Yu, Bai-Chu; Jia, Zhih-Ahn; Wu, Yu-Chun; Guo, Guang-Can

    2018-01-01

    According to the geometric characterization of measurement assemblages and local hidden state (LHS) models, we propose a steering criterion which is both necessary and sufficient for two-qubit states under arbitrary measurement sets. A quantity is introduced to describe the local resources required to reconstruct a measurement assemblage for two-qubit states. We show that the quantity can be regarded as a quantification of steerability and can be used to find optimal LHS models. Finally, we propose a method to generate unsteerable states, and construct some two-qubit states which are entangled but unsteerable under all projective measurements.

  16. The Legal Ethical Backbone of Conscientious Refusal.

    PubMed

    Munthe, Christian; Nielsen, Morten Ebbe Juul

    2017-01-01

    This article analyzes the idea of a legal right to conscientious refusal for healthcare professionals from a basic legal ethical standpoint, using refusal to perform tasks related to legal abortion (in cases of voluntary employment) as a case in point. The idea of a legal right to conscientious refusal is distinguished from ideas regarding moral rights or reasons related to conscientious refusal, and none of the latter are found to support the notion of a legal right. Reasons for allowing some sort of room for conscientious refusal for healthcare professionals based on the importance of cultural identity and the fostering of a critical atmosphere might provide some support, if no countervailing factors apply. One such factor is that a legal right to healthcare professionals' conscientious refusal must comply with basic legal ethical tenets regarding the rule of law and equal treatment, and this requirement is found to create serious problems for those wishing to defend the idea under consideration. We conclude that the notion of a legal right to conscientious refusal for any profession is either fundamentally incompatible with elementary legal ethical requirements, or implausible because it undermines the functioning of a related professional sector (healthcare) or even of society as a whole.

  17. The Role of Intuition in the Generation and Evaluation Stages of Creativity

    PubMed Central

    Pétervári, Judit; Osman, Magda; Bhattacharya, Joydeep

    2016-01-01

    Both intuition and creativity are associated with knowledge creation, yet a clear link between them has not been adequately established. First, the available empirical evidence for an underlying relationship between intuition and creativity is sparse. Further, this evidence is arguable, as the concepts are diversely operationalized and the measures adopted are often insufficiently validated. Combined, these issues make the findings from various studies examining the link between intuition and creativity difficult to replicate. Nevertheless, the role of intuition in creativity should not be neglected, as it is often reported to be a core component of the idea generation process, which, in conjunction with idea evaluation, are crucial phases of creative cognition. We review the prior research findings with respect to idea generation and idea evaluation from the view that intuition can be construed as the gradual accumulation of cues to coherence. Thus, we summarize the literature on what role intuitive processes play in the main stages of the creative problem-solving process and outline a conceptual framework of the interaction between intuition and creativity. Finally, we discuss the main challenges of measuring intuition as well as possible directions for future research. PMID:27703439

  18. Good Models Gone Bad: Quantifying and Predicting Parameter-Induced Climate Model Simulation Failures

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Brandon, S.; Covey, C. C.; Domyancic, D.; Ivanova, D. P.

    2012-12-01

    Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Statistical analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation failures of the Parallel Ocean Program (POP2). About 8.5% of our POP2 runs failed for numerical reasons at certain combinations of parameter values. We apply support vector machine (SVM) classification from the fields of pattern recognition and machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. The SVM classifiers readily predict POP2 failures in an independent validation ensemble, and are subsequently used to determine the causes of the failures via a global sensitivity analysis. Four parameters related to ocean mixing and viscosity are identified as the major sources of POP2 failures. Our method can be used to improve the robustness of complex scientific models to parameter perturbations and to better steer UQ ensembles. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
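
    A minimal sketch of the approach (not the study's code; the crash region and parameter meanings are invented stand-ins) using a scikit-learn SVM to predict failure probability from parameter vectors:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_runs, n_params = 1000, 18                 # ensemble size and POP2-like parameter count
X = rng.uniform(0.0, 1.0, (n_runs, n_params))

# Synthetic stand-in for the failure region: runs crash when two
# "mixing/viscosity-like" parameters are jointly extreme (~7-8% of runs).
failed = ((X[:, 0] > 0.8) & (X[:, 1] < 0.35)).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(X[:800], failed[:800])

# Predicted failure probability for held-out parameter combinations,
# usable to steer a UQ ensemble away from crash-prone regions.
p_fail = clf.predict_proba(X[800:])[:, 1]
print(f"validation accuracy: {clf.score(X[800:], failed[800:]):.3f}")
print(f"max predicted failure probability: {p_fail.max():.2f}")
```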

  19. Normalized Quantitative Western Blotting Based on Standardized Fluorescent Labeling.

    PubMed

    Faden, Frederik; Eschen-Lippold, Lennart; Dissmeyer, Nico

    2016-01-01

    Western blot (WB) analysis is the most widely used method to monitor expression of proteins of interest in protein extracts of high complexity derived from diverse experimental setups. WB allows the rapid and specific detection of a target protein, such as non-tagged endogenous proteins as well as protein-epitope tag fusions, depending on the availability of specific antibodies. To generate quantitative data from independent samples within one experiment and to allow accurate inter-experimental quantification, a reliable and reproducible method to standardize and normalize WB data is indispensable. To date, it is a standard procedure to normalize individual bands of immunodetected proteins of interest from a WB lane to other individual bands of so-called housekeeping proteins of the same sample lane. These are usually detected by an independent antibody or by colorimetric detection and do not reflect the real total protein of a sample. Housekeeping proteins, assumed to be constitutively expressed largely independent of developmental and environmental states, can in fact differ greatly in their expression under such conditions. Therefore, they do not represent a reliable reference for normalizing the target protein's abundance to the total amount of protein contained in each lane of a blot. Here, we demonstrate the Smart Protein Layers (SPL) technology, a combination of fluorescent standards and a stain-free fluorescence-based visualization of total protein in gels and after transfer via WB. SPL allows rapid and highly sensitive protein visualization and quantification, with a sensitivity comparable to conventional silver staining but a 1000-fold higher dynamic range. For normalization, standardization and quantification of protein gels and WBs, a sample-dependent bi-fluorescent standard reagent is applied and, for accurate quantification of data derived from different experiments, a second calibration standard is used. Together, these facilitate precise quantification of protein expression by lane-to-lane, gel-to-gel, and blot-to-blot comparisons, especially for experiments in the area of proteostasis, which deal with highly variable protein levels and involve protein degradation mutants and treatments modulating protein abundance.
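
    The normalization logic reduces to a per-lane ratio plus a blot-to-blot calibration factor. A minimal sketch with hypothetical fluorescence readouts (an illustration of the general scheme, not vendor code for the SPL reagents):

```python
import numpy as np

# Hypothetical fluorescence readouts per lane: target-band signal and
# total-protein signal from the stain-free visualization, plus a
# bi-fluorescent standard to put different blots on a common scale.
target_band = np.array([1520.0, 980.0, 2410.0, 1130.0])
total_protein = np.array([8.1e4, 5.3e4, 7.9e4, 8.4e4])
standard_this_blot, standard_reference = 410.0, 395.0

# Normalize each target band to its lane's total protein, then calibrate
# blot-to-blot with the standard's ratio.
normalized = (target_band / total_protein) * (standard_reference / standard_this_blot)
print(normalized / normalized[0])  # relative expression vs. lane 1
```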

  20. Transient and residual stresses in a pressable glass-ceramic before and after resin-cement coating determined using profilometry.

    PubMed

    Isgró, G; Addison, O; Fleming, G J P

    2011-05-01

    The effect of heat-pressing and subsequent pre-cementation (acid-etching) and resin-cementation operative techniques on the development of transient and residual stresses in different thicknesses of a lithium disilicate glass-ceramic was characterised using profilometry prior to biaxial flexure strength (BFS) determination. 60 IPS e.max Press discs were pressed and divested under controlled conditions. The discs were polished on one surface to thicknesses of 0.61±0.05, 0.84±0.08, and 1.06±0.07 mm (Groups A-C, respectively). The mean of the maximum deflection following acid-etching and resin-cement coating was determined using high-resolution profilometry prior to BFS testing. Paired sample t-tests were performed (p<0.05) on the 20 individual samples in each group (Groups A-C) for each comparison. Differences between the baseline quantification and resin-cement coating deflection values and BFS values for Groups A-C were determined using a one-way ANOVA with post hoc Tukey tests (p<0.05). Baseline quantification for Groups A-C identified no significant differences between the group means of the maximum deflection values (p=0.341). Following HF acid-etching, a significant increase in deflection for all groups (p<0.001) was identified compared with the baseline quantification. Additionally, resin-cement coating significantly increased deflection for Group A (p<0.001), Group B (p<0.001) and Group C (p=0.001) specimens for the individual groups. The increased deflection from baseline quantification to resin-cement coating was significantly different (p<0.001) for the three specimen thicknesses, although the BFS values were not. The lower reported baseline quantification range of the mean of the maximum deflection for the IPS e.max® Press specimens was predominantly the result of the specimen polishing regime inducing a tensile stress state across the surface defect integral, which accounted for the observed surface convexity. Acid-etching and resin-cementation had a significant impact on the development and magnitude of the transient and residual stresses in the lithium disilicate glass-ceramic investigated. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Relative quantification of N(epsilon)-(Carboxymethyl)lysine, imidazolone A, and the Amadori product in glycated lysozyme by MALDI-TOF mass spectrometry.

    PubMed

    Kislinger, Thomas; Humeny, Andreas; Peich, Carlo C; Zhang, Xiaohong; Niwa, Toshimitsu; Pischetsrieder, Monika; Becker, Cord-Michael

    2003-01-01

    The nonenzymatic glycation of proteins by reducing sugars, also known as the Maillard reaction, has received increasing recognition from nutritional science and medical research. In this study, we applied matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) to perform relative and simultaneous quantification of the Amadori product, which is an early glycation product, and of N(epsilon)-(carboxymethyl)lysine and imidazolone A, two important advanced glycation end products. To this end, native lysozyme was incubated with D-glucose for increasing periods of time (1, 4, 8, and 16 weeks) in phosphate-buffered saline pH 7.8 at 50 degrees C. After enzymatic digestion with endoproteinase Glu-C, the N-terminal peptide fragment (m/z 838; amino acid sequence KVFGRCE) and the C-terminal peptide fragment (m/z 1202; amino acid sequence VQAWIRGCRL) were used for relative quantification of the three Maillard products. Amadori product, N(epsilon)-(carboxymethyl)lysine, and imidazolone A were the main glycation products formed under these conditions. Their formation was dependent on glucose concentration and reaction time. The kinetics were similar to those obtained by competitive ELISA, an established method for quantification of N(epsilon)-(carboxymethyl)lysine and imidazolone A. Inhibition experiments showed that coincubation with N(alpha)-acetylarginine suppressed formation of imidazolone A but not of the Amadori product or N(epsilon)-(carboxymethyl)lysine. The presence of N(alpha)-acetyllysine resulted in the inhibition of lysine modifications but in higher concentrations of imidazolone A. o-Phenylenediamine decreased the yield of the Amadori product and completely inhibited the formation of N(epsilon)-(carboxymethyl)lysine and imidazolone A. MALDI-TOF-MS proved to be a new analytical tool for the simultaneous, relative quantification of specific products of the Maillard reaction. For the first time, kinetic data of defined products on specific sites of glycated protein could be measured. This characterizes MALDI-TOF-MS as a valuable method for monitoring the Maillard reaction in the course of food processing.

  2. Recent advances in hopanoids analysis: Quantification protocols overview, main research targets and selected problems of complex data exploration.

    PubMed

    Zarzycki, Paweł K; Portka, Joanna K

    2015-09-01

    Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids, together with a battery of related non-polar and low-molecular-mass compounds, may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems due to the different compositions of the analytical matrix and interfering compounds; therefore, proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that were recently applied for quantification of hopanoid-like compounds from different samples. Main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low-resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, mainly based on the experimental papers published within the last two years, during which a significant increase in hopanoids research was noticed. The second aim of this review is to describe the latest research trends concerning determination of hopanoids and related low-molecular-mass lipids analyzed in various samples including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach to complex data analysis. Data interpretation involves a number of physicochemical parameters and hopanoid quantities or given biomarker mass ratios derived from high-throughput separation and detection systems, typically GC-MS and HPLC-MS. Based on quantitative data reported in recently published experimental works, it has been demonstrated that multivariate data analysis using e.g. principal components computations may significantly extend our knowledge concerning proper biomarker selection and sample classification by means of hopanoids and related non-polar compounds. Copyright © 2015 Elsevier Ltd. All rights reserved.
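
    As a concrete instance of the multivariate approach the review advocates, a minimal PCA sketch over a hypothetical sample-by-biomarker table (synthetic data; real inputs would be hopanoid quantities and mass ratios from GC-MS or HPLC-MS):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical feature table: rows = samples (sediments, coals, crude
# oils, ...), columns = hopanoid quantities and biomarker mass ratios.
X = rng.normal(0.0, 1.0, (30, 8))
X[:15] += 1.5  # pretend the first 15 samples share a common source

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))

# Samples that cluster in PC space share similar biomarker signatures;
# the explained variance shows how much structure two components capture.
print(pca.explained_variance_ratio_)
print(scores[:3])
```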

  3. Dispersal and individual quality in a long lived species

    USGS Publications Warehouse

    Cam, E.; Monnat, J.-Y.; Royle, J. Andrew

    2004-01-01

    The idea of differences in individual quality has been put forward in numerous long-term studies in long-lived species to explain differences in lifetime production among individuals. Despite the important role of individual heterogeneity in vital rates in demography, population dynamics and life history theory, the idea of 'individual quality' is elusive. It is sometimes assumed to be a static or dynamic individual characteristic. When considered as a dynamic trait, it is sometimes assumed to vary deterministically or stochastically, or to be confounded with the characteristics of the habitat. We addressed heterogeneity in reproductive performance among individuals established in higher-quality habitat in a long-lived seabird species. We used approaches to statistical inference based on individual random effects permitting quantification of heterogeneity in populations and assessment of individual variation from the population mean. We found evidence of heterogeneity in breeding probability, not success probability. We assessed the influence of dispersal on individual reproductive potential. Dispersal is likely to be destabilizing in species with high site and mate fidelity. We detected heterogeneity after dispersal, not before. Individuals may perform well regardless of quality before destabilization, including those that recruited in higher-quality habitat by chance, but only higher-quality individuals may be able to overcome the consequences of dispersal. Importantly, results differed when accounting for individual heterogeneity (an increase in mean breeding probability when individuals dispersed), or not (a decrease in mean breeding probability). In the latter case, the decrease in mean breeding probability may result from a substantial decrease in breeding probability in a few individuals and a slight increase in others. In other words, the pattern observed at the population mean level may not reflect what happens in the majority of individuals.

  4. Lead users’ ideas on core features to support physical activity in rheumatoid arthritis: a first step in the development of an internet service using participatory design

    PubMed Central

    2014-01-01

    Background Despite the growing evidence of the benefits of physical activity (PA) in individuals with rheumatoid arthritis (RA), the majority is not physically active enough. An innovative strategy is to engage lead users in the development of PA interventions provided over the internet. The aim was to explore lead users’ ideas and prioritization of core features in a future internet service targeting adoption and maintenance of healthy PA in people with RA. Methods Six focus group interviews were performed with a purposively selected sample of 26 individuals with RA. Data were analyzed with qualitative content analysis and quantification of participants’ prioritization of the most important content. Results Six categories were identified as core features for a future internet service: up-to-date and evidence-based information and instructions, self-regulation tools, social interaction, personalized set-up, attractive design and content, and access to the internet service. The categories represented four themes, or core aspects, important to consider in the design of the future service: (1) content, (2) customized options, (3) user interface and (4) access and implementation. Conclusions This is, to the best of our knowledge, the first study involving people with RA in the development of an internet service to support the adoption and maintenance of PA. Participants helped identify core features and aspects important to consider and further explore during the next phase of development. We hypothesize that involvement of lead users will make the transfer from theory to service more adequate and user-friendly and will therefore be an effective means of facilitating PA behavior change. PMID:24655757

  5. Quantifying Volcanic Emissions of Trace Elements to the Atmosphere: Ideas Based on Past Studies

    NASA Astrophysics Data System (ADS)

    Rose, W. I.

    2003-12-01

    Extensive data exist from volcanological and geochemical studies about exotic elemental enrichments in volcanic emissions to the atmosphere, but quantitative data are quite rare. Advanced, highly sensitive techniques of analysis are needed to detect low concentrations of some minor elements, especially during major eruptions. I will present data from studies done during low levels of activity (incrustations and silica-tube sublimates at high-temperature fumaroles, SEM studies of particle samples collected in volcanic plumes and volcanic clouds, geochemical analysis of volcanic gas condensates, and analysis of treated particle and gas filter packs) and a much smaller number of datasets that could reflect explosive activity (fresh ashfall leachate geochemistry, and thermodynamic codes modeling volatile emissions from magma). These data describe a highly variable pattern of elemental enrichments which are difficult to quantify, generalize and understand. Sampling in a routine way is difficult, and work in active craters has heightened our awareness of danger, which appropriately inhibits some sampling. There are numerous localized enrichments of minor elements that can be documented, and others can be expected or inferred. There is a lack of systematic tools to measure minor element abundances in volcanic emissions. The careful combination of several of the methodologies listed above for the same volcanic vents can provide redundant data on multiple elements, which could lead to overall quantification of minor element fluxes, but there are challenging issues regarding detection. For quiescent plumes we can design combinations of measurements to quantify minor element emission rates. Developing a comparable methodology that succeeds in measuring minor element fluxes for significant eruptions will require new strategies and/or ideas.
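
    For a quiescent plume, the usual flux-scaling arithmetic is straightforward: pair a measured SO2 emission rate with an element-to-sulfur ratio from condensates or filter packs. A minimal sketch with assumed numbers (not measurements from the talk):

```python
# Minimal sketch of the flux-scaling approach with hypothetical values.
so2_flux_t_per_day = 1500.0    # e.g., from UV spectroscopy of a quiescent plume
cd_to_s_mass_ratio = 2.0e-6    # Cd/S mass ratio in plume samples (assumed)

s_flux = so2_flux_t_per_day * (32.06 / 64.06)  # convert SO2 flux to S flux
cd_flux_kg_per_day = s_flux * cd_to_s_mass_ratio * 1000.0
print(f"estimated Cd flux: {cd_flux_kg_per_day:.2f} kg/day")
```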

  6. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    PubMed

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
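
    The core of element-based absolute quantification is stoichiometric arithmetic. A minimal sketch under assumed values (using, for example, sulfur from cysteine and methionine residues as the ICPMS-detected element):

```python
# Minimal sketch (assumed example, not a specific published protocol):
# measure an element with a known copy number per protein by ICPMS,
# then divide by the stoichiometry to get the protein concentration.
sulfur_measured_uM = 12.0   # ICPMS-determined S in the purified protein fraction
s_atoms_per_protein = 6     # Cys + Met residues per molecule (from sequence)

protein_uM = sulfur_measured_uM / s_atoms_per_protein
print(f"protein concentration: {protein_uM:.2f} uM")
```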

  7. Identifying the local and regional travel effects of activity centers in the Austin, Texas area.

    DOT National Transportation Integrated Search

    2015-02-01

    Metropolitan planning organizations (MPOs) have become increasingly interested in incorporating innovative land use planning and design into transportation plan-making. Many design ideas are recommended under the umbrella of the New Urbanism; yet ...

  8. What Does the Law Say?

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2009-01-01

    In this article the author provides legal explanations to the following concerns: (1) Free Appropriate Public Education (FAPE) and Individualized Education Programs (IEPs); (2) Functional Behavior Assessments (FBAs) and Behavior Intervention Plans (BIPs); (3) Manifestation Determinations under Individuals with Disabilities Education Act (IDEA);…

  9. Optimization of Typological Requirements for Low-Cost Detached Houses

    NASA Astrophysics Data System (ADS)

    Kuráň, Jozef

    2017-09-01

    The presented paper deals with an analysis of the legislative, hygienic, functional and operational requirements for the design of detached houses and individual dwellings in terms of typological requirements. The article also presents a sociological survey about the preferences and subjective requirements of relevant public group segments regarding living in a detached house or an individual dwelling. The aim of the paper is to define the possibilities for the optimization of typological requirements. The optimization methods are based on principles already applied to contemporary detached house preferences and trends. The main idea is to reduce the amount of floor space, thus lowering construction and operating costs. The goal is to design an optimized floor plan while preserving the hygienic criteria for individual residential dwellings. Applying these optimization methods yields a so-called rationalized and conditioned floor plan for an individual dwelling that can be compared quantitatively against a reference model. The significant sources for this research are the legislative and normative requirements in the field of house construction in Slovakia, the Czech Republic and abroad.

  10. Metabolic scaling and biodiversity of forests

    NASA Astrophysics Data System (ADS)

    Banavar, Jayanth

    Forests are biologically diverse and play a critical role in the dynamics of earth-climate systems. A forest is a tremendously complex system comprising co-existing rooted trees of many species and many sizes and utilizing resources from the environment. The trees interact with each other and with their environment and the interactions are not precisely known. Using scaling ideas, we will present a theoretical framework for understanding the role of geometry in determining the metabolic rate of a tree and of a forest. The quantification of tropical tree biodiversity and their abundances is still an open and challenging problem. Using a global-scale compilation, we will present a method that allows one to predict, from local censuses, the biodiversity and patterns of species abundance at the whole forest scale. The method allows one to quantify the minimum percentage cover of the forest that should be sampled in order to have a precise prediction of the estimates of biodiversity and species abundances. Collaborators: Amos Maritan, Tommaso Anfodillo, Sandro Azaele, Marco Favretti, Marco Formentin, Jacopo Grilli, Samir Suweis, Anna Tovo, Igor Volkov.
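
    For context, a canonical pair of scaling relations from this literature reads as follows (illustrative of the general framework; the talk's exact results may differ):

```latex
B \propto M^{3/4},
\qquad
\frac{\mathrm{d}N}{\mathrm{d}r} \propto r^{-2}
```

    Here B is the metabolic rate of a tree of mass M, and dN/dr is the number of trees per unit stem radius r in a steady-state forest, so that resource use is roughly invariant across size classes.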

  11. Spatially Resolved Quantification of the Surface Reactivity of Solid Catalysts.

    PubMed

    Huang, Bing; Xiao, Li; Lu, Juntao; Zhuang, Lin

    2016-05-17

    A new property is reported that accurately quantifies and spatially describes the chemical reactivity of solid surfaces. The core idea is to create a reactivity weight function peaking at the Fermi level, thereby determining a weighted summation of the density of states of a solid surface. When such a weight function is defined as the derivative of the Fermi-Dirac distribution function at a certain non-zero temperature, the resulting property is the finite-temperature chemical softness, termed Fermi softness (S_F), which turns out to be an accurate descriptor of the surface reactivity. The spatial image of S_F maps the reactive domain of a heterogeneous surface and even portrays morphological details of the reactive sites. S_F analyses reveal that the reactive zones on a Pt3Y(111) surface are the platinum sites rather than the seemingly active yttrium sites, and the reactivity of the S-dimer edge of MoS2 is spatially anisotropic. Our finding is of fundamental and technological significance to heterogeneous catalysis and industrial processes demanding rational design of solid catalysts. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
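
    In symbols, the construction described above can be written as (notation mine, following the abstract's description):

```latex
S_F(\mathbf{r}) = \int g(\mathbf{r}, E)\, w(E)\, \mathrm{d}E,
\qquad
w(E) = -\frac{\partial f(E)}{\partial E}
     = \frac{1}{k_B T}\,
       \frac{e^{(E - E_F)/k_B T}}{\left(e^{(E - E_F)/k_B T} + 1\right)^{2}}
```

    Here g(r, E) is the local density of states of the surface, f is the Fermi-Dirac distribution at a chosen non-zero temperature T, and the weight w(E) peaks at the Fermi level E_F.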

  12. Quantitative brain tissue oximetry, phase spectroscopy and imaging the range of homeostasis in piglet brain.

    PubMed

    Chance, Britton; Ma, Hong Yan; Nioka, Shoko

    2003-01-01

    The quantification of tissue oxygen by frequency or time domain methods has been discussed in a number of prior publications, where the meaning of the tissue hemoglobin oxygen saturation was unclear and where CW instruments were unsuitable for proper quantitative measurements [1, 2]. The development of the IQ Phase Meter has greatly simplified, and made reliable, the difficult determination of precise phase and amplitude signals from the brain. This contribution reports on the calibration of the instrument in model systems and the use of the instrument to measure tissue saturation (StO2) in a small animal model. In addition, a global interpretation of the meaning of tissue oxygen has been formulated, based on the idea that autoregulation will maintain tissue oxygen at a fixed value across the range of arterial and venous oxygen values over which autoregulation operates. Beyond that range, the tissue oxygen is still correctly measured but, as expected, approaches the arterial saturation at low metabolic rates and the venous saturation at high metabolic rates of mitochondria.

  13. A Quantification of the 3D Modeling Capabilities of the Kinectfusion Algorithm

    DTIC Science & Technology

    2014-03-27

    Based on the documentation, this is based on the idea that the majority of GPU cards on the market will not have the memory required to support more than... (remainder of excerpt: tabular benchmark data whose column structure could not be recovered)

  15. The use of self-quantification systems for personal health information: big data management activities and prospects

    PubMed Central

    2015-01-01

    Background Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. Objectives In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. Method We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. Findings We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Conclusions Self-quantification in personal health maintenance appears promising and exciting. However, more studies are needed to support its use in this field. The proposed model will in the future lead to developing a measure for assessing the effectiveness of interventions to support using SQS for health self-management (e.g., assessing the complexity of self-quantification activities, and activation of the individuals). PMID:26019809

  16. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  17. J.A. Schumpeter and T.B. Veblen on economic evolution: the dichotomy between statics and dynamics

    PubMed Central

    Schütz, Marlies; Rainer, Andreas

    2016-01-01

    Abstract At present, the discussion on the dichotomy between statics and dynamics is resolved by concentrating on its mathematical meaning. Yet, a simple formalisation masks the underlying methodological discussion. Overcoming this limitation, the paper discusses Schumpeter's and Veblen's viewpoints on dynamic economic systems as systems generating change from within. It contributes to an understanding of their ideas of how economics could become an evolutionary science and of their contributions to elaborating an evolutionary economics. It confronts Schumpeter's with Veblen's perspective on evolutionary economics and provides insight into their evolutionary economic theorising by discussing their ideas on the evolution of capitalism. PMID:28057981

  18. Eigenspace perturbations for structural uncertainty estimation of turbulence closure models

    NASA Astrophysics Data System (ADS)

    Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable resolution approaches would be RANS models in two-equation closures, and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at diametric levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolution. Thence, using benchmark flows, along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).
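
    Schematically, the eigenvalue portion of such a framework can be written as follows (my notation; the exact formulation in the talk may differ):

```latex
R_{ij} = 2k\left(\frac{\delta_{ij}}{3} + b_{ij}\right),
\qquad
b_{ij} = v_{in}\,\Lambda_{nl}\,v_{jl},
\qquad
\Lambda^{*} = \Lambda + \Delta_B\left(\Lambda^{\mathrm{lim}} - \Lambda\right)
```

    Here k is the turbulent kinetic energy, b the Reynolds stress anisotropy tensor with eigenvectors v and eigenvalues Λ, and Λ^lim the eigenvalues of a limiting (one-, two-, or three-component) turbulence state; propagating solutions for a few such perturbed states through the solver brackets the model-form uncertainty.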

  19. Multi-fidelity numerical simulations of shock/turbulent-boundary layer interaction with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bermejo-Moreno, Ivan; Campo, Laura; Larsson, Johan; Emory, Mike; Bodart, Julien; Palacios, Francisco; Iaccarino, Gianluca; Eaton, John

    2013-11-01

    We study the interaction between an oblique shock wave and the turbulent boundary layers inside a nearly-square duct by combining wall-modeled LES with 2D and 3D RANS simulations, targeting the experiment of Campo, Helmer & Eaton, 2012 (nominal conditions: M = 2.05, Re_θ = 6,500). A primary objective is to quantify the effect of aleatory and epistemic uncertainties on the STBLI. Aleatory uncertainties considered include the inflow conditions (Mach number of the incoming air stream and thickness of the boundary layers) and perturbations of the duct geometry upstream of the interaction. The epistemic uncertainty under consideration focuses on the RANS turbulence model form by injecting perturbations in the Reynolds stress anisotropy in regions of the flow where the model assumptions (in particular, the Boussinesq eddy-viscosity hypothesis) may be invalid. These perturbations are then propagated through the flow solver into the solution. The uncertainty quantification (UQ) analysis is done through 2D and 3D RANS simulations, assessing the importance of the three-dimensional effects imposed by the nearly-square duct geometry. Wall-modeled LES are used to verify elements of the UQ methodology and to explore the flow features and physics of the STBLI for multiple shock strengths. Financial support from the United States Department of Energy under the PSAAP program is gratefully acknowledged.

  20. Experimental design-based isotope-dilution SPME-GC/MS method development for the analysis of smoke flavouring products.

    PubMed

    Giri, Anupam; Zelinkova, Zuzana; Wenzl, Thomas

    2017-12-01

    For the implementation of Regulation (EC) No 2065/2003, related to smoke flavourings used or intended for use in or on foods, a method based on solid-phase micro extraction (SPME) GC/MS was developed for the characterisation of liquid smoke products. A statistically based experimental design (DoE) was used for method optimisation. The best general conditions for quantitative analysis of the liquid smoke compounds were obtained with a polydimethylsiloxane/divinylbenzene (PDMS/DVB) fibre, 60°C extraction temperature, 30 min extraction time, 250°C desorption temperature, 180 s desorption time, 15 s agitation time, and 250 rpm agitation speed. Under the optimised conditions, 119 wood pyrolysis products, including furan/pyran derivatives, phenols, guaiacol, syringol, benzenediol and their derivatives, cyclic ketones, and several other heterocyclic compounds, were identified. The proposed method was repeatable (RSD% < 5) and the calibration functions were linear for all compounds under study. Nine isotopically labelled internal standards were used to improve quantification of analytes by compensating for matrix effects that might affect headspace equilibrium and the extractability of compounds. The optimised isotope-dilution SPME-GC/MS analytical method proved to be fit for purpose, allowing the rapid identification and quantification of volatile compounds in liquid smoke flavourings.
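
    A DoE of this kind is easy to enumerate programmatically. A minimal sketch of a two-level full factorial over four SPME factors (the factor names and levels are chosen for illustration, not the study's actual design):

```python
from itertools import product

# Hypothetical two-level screening design over four SPME factors.
factors = {
    "extraction_temp_C": (40, 60),
    "extraction_time_min": (10, 30),
    "desorption_temp_C": (200, 250),
    "agitation_rpm": (250, 500),
}

# Every combination of factor levels, one dict per experimental run.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(runs)} runs in the full factorial")  # 2**4 = 16
print(runs[0])
```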

  1. Understanding of the Geomorphological Elements in Discrimination of Typical Mediterranean Land Cover Types

    NASA Astrophysics Data System (ADS)

    Elhag, Mohamed; Boteva, Silvena

    2017-12-01

    Quantification of geomorphometric features is the central concern of the current study. The quantification was based on a statistical approach, in terms of multivariate analysis of local topographic features. The implemented algorithm utilizes the Digital Elevation Model (DEM) to categorize and extract the geomorphometric features embedded in the topographic dataset. The morphological settings were evaluated at the central pixel of a pre-defined 3x3 convolution kernel, assessing the surrounding pixels under the eight-directional pour point model (D8) of azimuth viewpoints. An unsupervised classification algorithm, the Iterative Self-Organizing Data Analysis Technique (ISODATA), was applied to the ASTER GDEM within the boundary of the designated study area to distinguish 10 morphometric classes. The morphometric classes expressed spatial distribution variation across the study area. The adopted methodology successfully captures the spatial distribution of the geomorphometric features under investigation. The results support superimposing the delineated geomorphometric elements over a given remote sensing image for further analysis. A robust relationship between different Land Cover types and the geomorphological elements was established in the context of the study area, and the dominance and relative association of different Land Cover types with their corresponding geomorphological elements were demonstrated.
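    For concreteness, a minimal sketch of the D8 evaluation on one 3x3 window is shown below: the central pixel drains toward the steepest-descent neighbour among the eight azimuth directions. The power-of-two direction codes are a common GIS convention, assumed here rather than taken from the paper.

```python
import numpy as np

# Neighbour offsets (row, col) paired with conventional D8 codes.
D8 = [((-1, 0), 64), ((-1, 1), 128), ((0, 1), 1), ((1, 1), 2),
      ((1, 0), 4), ((1, -1), 8), ((0, -1), 16), ((-1, -1), 32)]

def d8_direction(window, cellsize=1.0):
    """D8 pour-point direction for the centre pixel of a 3x3 DEM window."""
    z0 = window[1, 1]
    best_code, best_slope = 0, 0.0      # 0 means a pit / no downslope cell
    for (dr, dc), code in D8:
        dist = cellsize * (np.sqrt(2.0) if dr != 0 and dc != 0 else 1.0)
        slope = (z0 - window[1 + dr, 1 + dc]) / dist  # drop per distance
        if slope > best_slope:
            best_code, best_slope = code, slope
    return best_code

# Example: the centre of this window drains east (code 1).
print(d8_direction(np.array([[5.0, 4.5, 4.0],
                             [5.0, 4.5, 3.0],
                             [6.0, 5.0, 4.0]])))
```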

  2. In vivo quantification of white matter microstructure for use in aging: A focus on two emerging techniques

    PubMed Central

    Lamar, Melissa; Zhou, Xiaohong Joe; Charlton, Rebecca A.; Dean, Douglas; Little, Deborah; Deoni, Sean C

    2013-01-01

    Human brain imaging has seen many advances in the quantification of white matter in vivo. For example, these advances have revealed the association between white matter damage and vascular disease, as well as their impact on the risk for and development of dementia and depression in an aging population. Current neuroimaging methods to quantify white matter damage provide a foundation for understanding such age-related neuropathology; however, these methods are not as adept at determining the underlying microstructural abnormalities signaling at-risk tissue or driving white matter damage in the aging brain. This review will begin with a brief overview of the use of diffusion tensor imaging (DTI) in understanding white matter alterations in aging before focusing in more detail on select advances in both diffusion-based methods and multi-component relaxometry techniques for imaging white matter microstructural integrity within myelin sheaths and the axons they encase. While DTI greatly extended the field of white matter interrogation, these more recent technological advances will add clarity to the underlying microstructural mechanisms that contribute to white matter damage. More specifically, the methods highlighted in this review may prove more sensitive (and specific) for determining the contribution of myelin versus axonal integrity to the aging of white matter in the brain. PMID:24080382

  3. Mass Median Plume Angle: A novel approach to characterize plume geometry in solution based pMDIs.

    PubMed

    Moraga-Espinoza, Daniel; Eshaghian, Eli; Smyth, Hugh D C

    2018-05-30

    High-speed laser imaging (HSLI) is the preferred technique to characterize the geometry of the plume in pressurized metered dose inhalers (pMDIs). However, current methods do not allow for simulation of inhalation airflow and do not use drug mass quantification to determine plume angles. To address these limitations, a Plume Induction Port Evaluator (PIPE) was designed to characterize the plume geometry based on mass deposition patterns. The method is easily adaptable to current pMDI characterization methodologies, uses similar calculation methods, and can be used under airflow. The effects of airflow and formulation on the plume geometry were evaluated using PIPE and HSLI. Deposition patterns in PIPE were highly reproducible and log-normally distributed. The Mass Median Plume Angle (MMPA) is introduced as a new characterization parameter to describe the effective angle of the droplets deposited in the induction port. Plume angles determined by mass showed a significant decrease as ethanol content increased, which correlates with the decrease in the vapor pressure of the formulation. Additionally, airflow significantly decreased the angle of the plumes when the cascade impactor was operated under flow. PIPE is an alternative to laser-based characterization methods for evaluating the plume angle of pMDIs based on reliable drug quantification while simulating patient inhalation. Copyright © 2018. Published by Elsevier B.V.

  4. Stability-Indicating TLC-Densitometric Assay for Methyltestosterone and Quantum Chemical Calculations.

    PubMed

    Musharraf, Syed Ghulam; Ul Arfeen, Qamar; Ul Haq, Faraz; Khatoon, Aliya; Azher Ali, Rahat

    2017-10-01

    Methyltestosterone is a synthetic testosterone derivative commonly used for the treatment of testosterone deficiency in males and one of the anabolic steroids whose use is banned by the World Anti-Doping Agency (WADA). This study presents a simple, cost-effective and rapid stability-indicating assay for densitometric quantification of methyltestosterone in pharmaceutical formulations. The developed method employed pre-coated TLC plates with a hexane:acetone (6.5:3.5 v/v) mobile phase. The limit of detection and limit of quantitation were found to be 2.06 and 6.24 ng/spot, respectively. A stress degradation study of methyltestosterone was conducted by applying various stress conditions, such as hydrolysis under acidic, basic and neutral conditions, heating under anhydrous conditions, and exposure to light. Methyltestosterone was found to be susceptible to photodegradation and to acidic and basic hydrolysis. Degraded products were well resolved, with significantly different Rf values. The acid-degraded product was identified as 17,17-dimethyl-18-norandrosta-4,13(14)-dien-3-one through spectroscopic methods. The reactivity of methyltestosterone under the applied stress conditions was also explained by quantum chemical calculations. The developed method was found to be repeatable, selective and accurate for the quantification of methyltestosterone and can be employed for routine analysis. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. A review of optimization and quantification techniques for chemical exchange saturation transfer (CEST) MRI toward sensitive in vivo imaging

    PubMed Central

    Guo, Yingkun; Zheng, Hairong; Sun, Phillip Zhe

    2015-01-01

    Chemical exchange saturation transfer (CEST) MRI is a versatile imaging method that probes the chemical exchange between bulk water and exchangeable protons. CEST imaging indirectly detects dilute labile protons via bulk water signal changes following selective saturation of exchangeable protons, which offers substantial sensitivity enhancement and has sparked numerous biomedical applications. Over the past decade, CEST imaging techniques have rapidly evolved due to contributions from multiple domains, including the development of CEST mathematical models, innovative contrast agent designs, sensitive data acquisition schemes, efficient field inhomogeneity correction algorithms, and quantitative CEST (qCEST) analysis. The CEST system that underlies the apparent CEST-weighted effect, however, is complex. The experimentally measurable CEST effect depends not only on parameters such as CEST agent concentration, pH and temperature, but also on relaxation rate, magnetic field strength and more importantly, experimental parameters including repetition time, RF irradiation amplitude and scheme, and image readout. Thorough understanding of the underlying CEST system using qCEST analysis may augment the diagnostic capability of conventional imaging. In this review, we provide a concise explanation of CEST acquisition methods and processing algorithms, including their advantages and limitations, for optimization and quantification of CEST MRI experiments. PMID:25641791
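    Among the quantification approaches the review surveys, the simplest CEST-weighted metric is the asymmetry analysis of the Z-spectrum. A minimal sketch is given below for orientation; it is generic textbook arithmetic, not a method specific to this paper, and the example offset is an assumption.

```python
import numpy as np

def mtr_asym(offsets_ppm, z_spectrum, label_ppm):
    """MTR asymmetry at the labile-proton offset.

    offsets_ppm : saturation offsets relative to water (0 ppm),
                  assumed sorted in ascending order for np.interp
    z_spectrum  : normalized signal S/S0 at each offset
    label_ppm   : exchangeable-proton shift (e.g. ~3.5 ppm for amides)
    """
    s_label = np.interp(label_ppm, offsets_ppm, z_spectrum)   # S(+dw)/S0
    s_ref = np.interp(-label_ppm, offsets_ppm, z_spectrum)    # S(-dw)/S0
    return s_ref - s_label    # MTRasym = [S(-dw) - S(+dw)] / S0
```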

  6. STEM VQ Method, Using Scanning Transmission Electron Microscopy (STEM) for Accurate Virus Quantification

    DTIC Science & Technology

    2017-02-02

    Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron... provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ...consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy – Virus Quantification (STEM-VQ), which simplifies

  7. Highly sensitive and simple liquid chromatography assay with ion-pairing extraction and visible detection for quantification of gold from nanoparticles.

    PubMed

    Pallotta, Arnaud; Philippe, Valentin; Boudier, Ariane; Leroy, Pierre; Clarot, Igor

    2018-03-01

    A simple isocratic HPLC method using visible detection was developed and validated for the quantification of gold in nanoparticles (AuNP). After a first step of oxidation of the nanoparticles, an ion pair between the tetrachloroaurate anion and the cationic dye Rhodamine B was formed and extracted from the aqueous media with the help of an organic solvent. The corresponding Rhodamine B was finally quantified by reversed-phase liquid chromatography using a Nucleosil C18 (150 mm × 4.6 mm, 3 µm) column, with a mobile phase containing acetonitrile and 0.1% trifluoroacetic acid aqueous solution (25/75, V/V) at 1.0 mL min-1 and detection at a wavelength of 555 nm. The method was validated using the methodology described by the International Conference on Harmonization and was shown to be specific, precise (RSD < 11%), accurate and linear in the range of 0.1-30.0 µM, with a lower limit of quantification (LLOQ) of 0.1 µM. The method was first applied to AuNP quality control after synthesis. It was then used to assess the absence of gold leakage (either as AuNP or in gold salt form) from nanostructured multilayered polyelectrolyte films under shear stress. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Quantitative analysis of pyoluteorin in anti-fungal fermentation liquor of Pseudomonas species by capillary zone electrophoresis with UV-vis detector.

    PubMed

    Wang, Qiu-Ling; Zhang, Xue-Hong; Fan, Liu-Yin; Zhang, Wei; Xu, Yu-Qian; Hu, Hong-Bo; Cao, Cheng-Xi

    2005-11-05

    This paper investigated the potential utility of capillary zone electrophoresis (CZE) for succinct but robust quantitative analysis of pyoluteorin (Plt) in the anti-fungal fermentation liquor of Pseudomonas species. The experimental conditions for the separation and quantification of Plt were optimized first. The optimized conditions are: 80 mmol/L pH 8.40 Gly-NaOH buffer; a 51 cm total length (42 cm effective), 75 µm I.D. capillary; 230 nm detection wavelength; 25 kV; 13 mbar, 10 s pressure sample injection; and 24 °C air-cooling. Under the optimized conditions, the migration times of Plt and the internal standard phenobarbital are 2.09 and 2.49 min, respectively. The linear response of Plt concentration ranges from 5.0 to 1000 µg/mL with a high correlation coefficient (r = 0.99977, n = 9), and the limits of detection (LOD) and quantification (LOQ) for Plt are 0.66 and 2.2 µg/mL, respectively. The intra- and inter-day precision values (expressed as R.S.D.) are 1.19-1.94% and 1.55-6.21%, respectively, and the recoveries of Plt at the three concentration levels of 750, 250 and 50 µg/mL range from 90.31% to 98.96%. The developed method can be readily used for the quantification of Plt in the fermentation liquor.

  9. In-line monitoring of cocrystallization process and quantification of carbamazepine-nicotinamide cocrystal using Raman spectroscopy and chemometric tools.

    PubMed

    Soares, Frederico L F; Carneiro, Renato L

    2017-06-05

    A cocrystallization process may involve several molecular species, which are generally solid under ambient conditions. Thus, accurate monitoring of the different components that might appear during the reaction is necessary, as well as quantification of the final product. This work reports for the first time the synthesis of carbamazepine-nicotinamide cocrystal in aqueous media with full conversion. The reactions were monitored by Raman spectroscopy coupled with Multivariate Curve Resolution - Alternating Least Squares (MCR-ALS), and the quantification of the final product among its coformers was performed using Raman spectroscopy and Partial Least Squares (PLS) regression. The slurry reaction was carried out under four different conditions: room temperature, 40°C, 60°C and 80°C. The slurry reaction at 80°C enabled full conversion of the initial substrates into the cocrystal form, using water as solvent for a greener method. MCR-ALS coupled with Raman spectroscopy made it possible to observe the main steps of the reactions, such as drug dissolution, nucleation and crystallization of the cocrystal. The PLS models gave mean errors of cross-validation around 2.0 (% wt/wt) and errors of validation between 2.5 and 8.2 (% wt/wt) for all components. These are good results, since the spectra of the cocrystals and the physical mixture of the coformers present some similar peaks. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Organ-specific SPECT activity calibration using 3D printed phantoms for molecular radiotherapy dosimetry.

    PubMed

    Robinson, Andrew P; Tipping, Jill; Cullen, David M; Hamilton, David; Brown, Richard; Flynn, Alex; Oldfield, Christopher; Page, Emma; Price, Emlyn; Smith, Andrew; Snee, Richard

    2016-12-01

    Patient-specific absorbed dose calculations for molecular radiotherapy require accurate activity quantification. This is commonly derived from Single-Photon Emission Computed Tomography (SPECT) imaging using a calibration factor relating detected counts to known activity in a phantom insert. A series of phantom inserts, based on the mathematical models underlying many clinical dosimetry calculations, have been produced using 3D printing techniques. SPECT/CT data for the phantom inserts have been used to calculate new organ-specific calibration factors for (99m)Tc and (177)Lu. The measured calibration factors are compared to predicted values from calculations using a Gaussian kernel. Measured SPECT calibration factors for 3D printed organs display a clear dependence on organ shape for (99m)Tc and (177)Lu. The observed variation in calibration factor is reproduced using a Gaussian kernel-based calculation over two orders of magnitude change in insert volume for (99m)Tc and (177)Lu. These new organ-specific calibration factors show a 24%, 11% and 8% reduction in absorbed dose for the liver, spleen and kidneys, respectively. Non-spherical calibration factors from 3D printed phantom inserts can significantly improve the accuracy of whole-organ activity quantification for molecular radiotherapy, providing a crucial step towards individualised activity quantification and patient-specific dosimetry. 3D printed inserts are found to provide a cost-effective and efficient way for clinical centres to access more realistic phantom data.
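    The calibration arithmetic itself is straightforward; the organ-shape dependence reported above enters only through which phantom-derived factor is applied. A minimal sketch, with illustrative names, numbers and units:

```python
def calibration_factor(phantom_counts, known_activity_mbq):
    """Counts per MBq measured from a 3D printed organ-shaped insert."""
    return phantom_counts / known_activity_mbq

def organ_activity(patient_counts, organ_specific_factor):
    """Recover organ activity (MBq) with the matching organ's factor."""
    return patient_counts / organ_specific_factor

# Using a liver-shaped insert's factor for a liver VOI, rather than a
# sphere-derived factor, is what drives the reported dose differences.
liver_cf = calibration_factor(phantom_counts=2.4e6, known_activity_mbq=150.0)
print(organ_activity(patient_counts=1.8e6, organ_specific_factor=liver_cf))
```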

  11. Fully Automated Quantification of the Striatal Uptake Ratio of [99mTc]-TRODAT with SPECT Imaging: Evaluation of the Diagnostic Performance in Parkinson's Disease and the Temporal Regression of Striatal Tracer Uptake

    PubMed Central

    Fang, Yu-Hua Dean; Chiu, Shao-Chieh; Lu, Chin-Song; Weng, Yi-Hsin

    2015-01-01

    Purpose. We aimed at improving the existing methods for the fully automatic quantification of striatal uptake of [99mTc]-TRODAT with SPECT imaging. Procedures. A normal [99mTc]-TRODAT template was first formed based on 28 healthy controls. Images from PD patients (n = 365) and nPD subjects (28 healthy controls and 33 essential tremor patients) were spatially normalized to the normal template. We performed an inverse transform on the predefined striatal and reference volumes of interest (VOIs) and applied the transformed VOIs to the original image data to calculate the striatal-to-reference ratio (SRR). The diagnostic performance of the SRR was determined through receiver operating characteristic (ROC) analysis. Results. The SRR measured with our new and automatic method demonstrated excellent diagnostic performance with 92% sensitivity, 90% specificity, 92% accuracy, and an area under the curve (AUC) of 0.94. For the evaluation of the mean SRR and the clinical duration, a quadratic function fit the data with R² = 0.84. Conclusions. We developed and validated a fully automatic method for the quantification of the SRR in a large study sample. This method has an excellent diagnostic performance and exhibits a strong correlation between the mean SRR and the clinical duration in PD patients. PMID:26366413

  12. Fully Automated Quantification of the Striatal Uptake Ratio of [(99m)Tc]-TRODAT with SPECT Imaging: Evaluation of the Diagnostic Performance in Parkinson's Disease and the Temporal Regression of Striatal Tracer Uptake.

    PubMed

    Fang, Yu-Hua Dean; Chiu, Shao-Chieh; Lu, Chin-Song; Yen, Tzu-Chen; Weng, Yi-Hsin

    2015-01-01

    We aimed at improving the existing methods for the fully automatic quantification of striatal uptake of [(99m)Tc]-TRODAT with SPECT imaging. A normal [(99m)Tc]-TRODAT template was first formed based on 28 healthy controls. Images from PD patients (n = 365) and nPD subjects (28 healthy controls and 33 essential tremor patients) were spatially normalized to the normal template. We performed an inverse transform on the predefined striatal and reference volumes of interest (VOIs) and applied the transformed VOIs to the original image data to calculate the striatal-to-reference ratio (SRR). The diagnostic performance of the SRR was determined through receiver operating characteristic (ROC) analysis. The SRR measured with our new and automatic method demonstrated excellent diagnostic performance with 92% sensitivity, 90% specificity, 92% accuracy, and an area under the curve (AUC) of 0.94. For the evaluation of the mean SRR and the clinical duration, a quadratic function fit the data with R² = 0.84. We developed and validated a fully automatic method for the quantification of the SRR in a large study sample. This method has an excellent diagnostic performance and exhibits a strong correlation between the mean SRR and the clinical duration in PD patients.

  13. In vivo quantification of brain metabolites by 1H-MRS using water as an internal standard.

    PubMed

    Christiansen, P; Henriksen, O; Stubgaard, M; Gideon, P; Larsson, H B

    1993-01-01

    The reliability of absolute quantification of average metabolite concentrations in the human brain in vivo by 1H-MRS, using the fully relaxed water signal as an internal standard, was tested in a number of in vitro as well as in vivo measurements. The experiments were carried out on a SIEMENS HELICON SP 63/84 whole-body MR scanner operating at 1.5 T using a STEAM sequence. In vitro studies indicate a very high correlation between metabolite signals (area under peaks) and concentration, R = 0.99, as well as between metabolite signals and the volume of the selected voxel, R = 1.00. The error in quantification of N-acetyl aspartate (NAA) concentration was about 1-2 mM (6-12%). In vivo, good linearity between the water signal and the selected voxel size was likewise seen, and the same was true for the studied metabolites: N-acetyl aspartate (NAA), creatine/phosphocreatine (Cr/PCr), and choline (Cho). Calculated average concentrations of NAA, Cr/PCr, and Cho in the occipital lobe of the brain in five healthy volunteers were (mean +/- 1 SD) 11.6 +/- 1.3 mM, 7.6 +/- 1.4 mM, and 1.7 +/- 0.5 mM. The results indicate that the method presented offers reasonable estimation of metabolite concentrations in the brain in vivo and is therefore useful in clinical research.
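    The water-scaling arithmetic behind this approach is compact enough to sketch. The version below assumes fully relaxed signals (so relaxation corrections drop out, as in the idealized case) and an illustrative brain water concentration; neither the default value nor the names come from the paper.

```python
def metabolite_concentration(area_met, area_water, n_protons_met,
                             water_mM=43300.0):
    """Water-referenced metabolite concentration in mM.

    area_met      : integrated metabolite peak area
    area_water    : fully relaxed water peak area from the same voxel
    n_protons_met : protons behind the metabolite peak (3 for NAA CH3)
    water_mM      : assumed tissue water concentration (~55,500 mM pure
                    water scaled by typical brain water content)
    """
    # Water contributes 2 protons per molecule, hence the factor 2.
    return (area_met / area_water) * (2.0 / n_protons_met) * water_mM
```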

  14. In-line monitoring of cocrystallization process and quantification of carbamazepine-nicotinamide cocrystal using Raman spectroscopy and chemometric tools

    NASA Astrophysics Data System (ADS)

    Soares, Frederico L. F.; Carneiro, Renato L.

    2017-06-01

    A cocrystallization process may involve several molecular species, which are generally solid under ambient conditions. Thus, accurate monitoring of the different components that might appear during the reaction is necessary, as well as quantification of the final product. This work reports for the first time the synthesis of carbamazepine-nicotinamide cocrystal in aqueous media with full conversion. The reactions were monitored by Raman spectroscopy coupled with Multivariate Curve Resolution - Alternating Least Squares (MCR-ALS), and the quantification of the final product among its coformers was performed using Raman spectroscopy and Partial Least Squares (PLS) regression. The slurry reaction was carried out under four different conditions: room temperature, 40 °C, 60 °C and 80 °C. The slurry reaction at 80 °C enabled full conversion of the initial substrates into the cocrystal form, using water as solvent for a greener method. MCR-ALS coupled with Raman spectroscopy made it possible to observe the main steps of the reactions, such as drug dissolution, nucleation and crystallization of the cocrystal. The PLS models gave mean errors of cross-validation around 2.0 (% wt/wt) and errors of validation between 2.5 and 8.2 (% wt/wt) for all components. These are good results, since the spectra of the cocrystals and the physical mixture of the coformers present some similar peaks.

  15. New solid surface fluorescence methodology for lead traces determination using rhodamine B as fluorophore and coacervation scheme: Application to lead quantification in e-cigarette refill liquids.

    PubMed

    Talio, María C; Zambrano, Karen; Kaplan, Marcos; Acosta, Mariano; Gil, Raúl A; Luconi, Marta O; Fernández, Liliana P

    2015-10-01

    A new environmentally friendly methodology, based on the fluorescence signal enhancement of rhodamine B dye, is proposed for the quantification of Pb(II) traces using a preconcentration step based on the coacervation phenomenon. A cationic surfactant (cetyltrimethylammonium bromide, CTAB) and potassium iodide were chosen for this aim. The coacervate phase was collected on a filter paper disk and the solid surface fluorescence signal was determined in a spectrofluorometer. Experimental variables that influence the preconcentration step and the fluorimetric sensitivity were optimized using univariate assays. The calibration graph using zeroth-order regression was linear from 7.4×10⁻⁴ to 3.4 μg L⁻¹ with a correlation coefficient of 0.999. Under the optimal conditions, a limit of detection of 2.2×10⁻⁴ μg L⁻¹ and a limit of quantification of 7.4×10⁻⁴ μg L⁻¹ were obtained. The method showed good sensitivity and adequate selectivity, with good tolerance to foreign ions, and was applied to the determination of trace amounts of Pb(II) in refill solutions for e-cigarettes, with satisfactory results validated by ICP-MS. The proposed method represents an innovative application of coacervation processes and of paper filters to solid surface fluorescence methodology. Copyright © 2015 Elsevier B.V. All rights reserved.
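    The abstract does not state how its detection and quantification limits were derived; for orientation, a common ICH-style computation from the slope and residual scatter of a linear calibration is sketched below.

```python
import numpy as np

def lod_loq(conc, signal):
    """ICH-style limits: LOD = 3.3*s/m and LOQ = 10*s/m, with m the
    calibration slope and s the residual standard deviation."""
    conc, signal = np.asarray(conc, float), np.asarray(signal, float)
    m, b = np.polyfit(conc, signal, 1)           # linear calibration fit
    s = np.std(signal - (m * conc + b), ddof=2)  # 2 fitted parameters
    return 3.3 * s / m, 10.0 * s / m
```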

  16. Detection and Quantification of Graphene-Family Nanomaterials in the Environment.

    PubMed

    Goodwin, David G; Adeleye, Adeyemi S; Sung, Lipiin; Ho, Kay T; Burgess, Robert M; Petersen, Elijah J

    2018-04-17

    An increase in the production of commercial products containing graphene-family nanomaterials (GFNs) has led to concern over their release into the environment. The fate and potential ecotoxicological effects of GFNs in the environment are currently unclear, partially due to the limited analytical methods for GFN measurements. In this review, the unique properties of GFNs that are useful for their detection and quantification are discussed. The capacity of several classes of techniques to identify and/or quantify GFNs in different environmental matrices (water, soil, sediment, and organisms), after environmental transformations, and after release from the polymer matrix of a product is evaluated. Extraction methods and strategies to combine methods for more accurate discrimination of GFNs from environmental interferences, as well as from other carbonaceous nanomaterials, are recommended. Overall, a comprehensive review of the techniques available to detect and quantify GFNs is systematically presented to inform the state of the science, guide researchers in selecting the best technique for the system under investigation, and enable further development of GFN metrology in environmental matrices. Two case studies are described to provide practical examples of choosing which techniques to utilize for detection or quantification of GFNs in specific scenarios. Because the available quantitative techniques are somewhat limited, more research is required to distinguish GFNs from other carbonaceous materials and to improve the accuracy and detection limits of GFNs at more environmentally relevant concentrations.

  17. An alternative method for the analysis of melanin production in Cryptococcus neoformans sensu lato and Cryptococcus gattii sensu lato.

    PubMed

    Brilhante, Raimunda S N; España, Jaime D A; de Alencar, Lucas P; Pereira, Vandbergue S; Castelo-Branco, Débora de S C M; Pereira-Neto, Waldemiro de A; Cordeiro, Rossana de A; Sidrim, José J C; Rocha, Marcos F G

    2017-10-01

    Melanin is an important virulence factor for several microorganisms, including Cryptococcus neoformans sensu lato and Cryptococcus gattii sensu lato; thus, the assessment of melanin production and its quantification may contribute to the understanding of microbial pathogenesis. The objective of this study was to standardise an alternative method for the production and indirect quantification of melanin in C. neoformans sensu lato and C. gattii sensu lato. Eight C. neoformans sensu lato and three C. gattii sensu lato strains, identified through URA5 methodology, together with Candida parapsilosis ATCC 22019 (negative control) and one Hortaea werneckii strain (positive control), were inoculated on minimal medium agar with or without L-DOPA, in duplicate, and incubated at 35°C for 7 days. Pictures were taken from the third to the seventh day, under standardised conditions in a photographic chamber. The photographs were then analysed using grayscale images. All Cryptococcus spp. strains produced melanin after growth on minimal medium agar containing L-DOPA. C. parapsilosis ATCC 22019 did not produce melanin on medium containing L-DOPA, while H. werneckii presented the strongest pigmentation. This new method allows the indirect analysis of melanin production through pixel quantification in grayscale images, enabling the study of substances that can modulate melanin production. © 2017 Blackwell Verlag GmbH.

  18. Pain mechanisms: a commentary on concepts and issues.

    PubMed

    Perl, Edward R

    2011-06-01

    This commentary on ideas about the neural mechanisms underlying pain is aimed at providing perspective for a reader who does not work in the field of mammalian somatic sensation. It is not a comprehensive review of the literature. The organization is historical, to chronicle the evolution of ideas. The aim is to call attention to the sources of concepts and how various ideas have fared over time. One difficulty in relating concepts about pain is that the term is used to refer to human and animal reactions ranging from protective spinal reflexes to complex affective behaviors. As a result, the spectrum of "pain"-related neural organization extends to the operation of multiple neuronal arrangements. Thinking about pain has shadowed progress in understanding biological mechanisms, in particular the manner of function of nervous systems. This essay concentrates on the evolution of information and concepts from the early 19th century to the present. Topics include the assumptions underlying currently active theories about pain mechanisms. At the end, brief consideration is given to present-day issues, e.g., chronic pain, central pain, and the view of pain as an emotion rather than a sensation. The conceptual progression shows that current controversies have old roots and that failed concepts often resurface after seemingly having been put to rest by argument and evidence. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Crossing the Threshold: Bringing Biological Variation to the Foreground.

    PubMed

    Batzli, Janet M; Knight, Jennifer K; Hartley, Laurel M; Maskiewicz, April Cordero; Desy, Elizabeth A

    2016-01-01

    Threshold concepts have been referred to as "jewels in the curriculum": concepts that are key to competency in a discipline but not taught explicitly. In biology, researchers have proposed the idea of threshold concepts that include such topics as variation, randomness, uncertainty, and scale. In this essay, we explore how the notion of threshold concepts can be used alongside other frameworks meant to guide instructional and curricular decisions, and we examine the proposed threshold concept of variation and how it might influence students' understanding of core concepts in biology focused on genetics and evolution. Using dimensions of scientific inquiry, we outline a schema that may allow students to experience and apply the idea of variation in such a way that it transforms their future understanding and learning of genetics and evolution. We encourage others to consider the idea of threshold concepts alongside the Vision and Change core concepts to provide a lens for targeted instruction and as an integrative bridge between concepts and competencies. © 2016 J. M. Batzli et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  20. Mechanistic evaluation of the pros and cons of digital RT-LAMP for HIV-1 viral load quantification on a microfluidic device and improved efficiency via a two-step digital protocol.

    PubMed

    Sun, Bing; Shen, Feng; McCalla, Stephanie E; Kreutz, Jason E; Karymov, Mikhail A; Ismagilov, Rustem F

    2013-02-05

    Here we used a SlipChip microfluidic device to evaluate the performance of digital reverse transcription-loop-mediated isothermal amplification (dRT-LAMP) for quantification of HIV viral RNA. Tests are needed for monitoring HIV viral load to control the emergence of drug resistance and to diagnose acute HIV infections. In resource-limited settings, in vitro measurement of HIV viral load in a simple format is especially needed, and single-molecule counting using a digital format could provide a potential solution. We showed here that when one-step dRT-LAMP is used for quantification of HIV RNA, the digital count is lower than expected and is limited by the yield of desired cDNA. We were able to overcome the limitations by developing a microfluidic protocol to manipulate many single molecules in parallel through a two-step digital process. In the first step we compartmentalize the individual RNA molecules (based on Poisson statistics) and perform reverse transcription on each RNA molecule independently to produce DNA. In the second step, we perform the LAMP amplification on all individual DNA molecules in parallel. Using this new protocol, we increased the absolute efficiency (the ratio between the concentration calculated from the actual count and the expected concentration) of dRT-LAMP 10-fold, from ∼2% to ∼23%, by (i) using a more efficient reverse transcriptase, (ii) introducing RNase H to break up the DNA:RNA hybrid, and (iii) adding only the BIP primer during the RT step. We also used this two-step method to quantify HIV RNA purified from four patient samples and found that in some cases, the quantification results were highly sensitive to the sequence of the patient's HIV RNA. We learned the following three lessons from this work: (i) digital amplification technologies, including dLAMP and dPCR, may give adequate dilution curves and yet have low efficiency, thereby providing quantification values that underestimate the true concentration. Careful validation is essential before a method is considered to provide absolute quantification; (ii) the sensitivity of dLAMP to the sequence of the target nucleic acid necessitates additional validation with patient samples carrying the full spectrum of mutations; (iii) for multistep digital amplification chemistries, such as a combination of reverse transcription with amplification, microfluidic devices may be used to decouple these steps from one another and to perform them under different, individually optimized conditions for improved efficiency.
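    The digital counting arithmetic underlying both the quantification and the efficiency figures above is worth making explicit. The sketch below applies the standard Poisson correction to the fraction of positive wells and compares the result with the expected input concentration; the well counts, volume, and names are illustrative assumptions, not values from the paper.

```python
import numpy as np

def digital_concentration(n_positive, n_total, well_volume_nl):
    """Copies per microliter from a digital (yes/no) readout.

    The Poisson correction lambda = -ln(1 - p) accounts for wells
    that received more than one template molecule.
    """
    p = n_positive / n_total              # fraction of positive wells
    lam = -np.log(1.0 - p)                # mean molecules per well
    return lam / (well_volume_nl * 1e-3)  # nL -> uL

# Absolute efficiency in the paper's sense: measured over expected.
measured = digital_concentration(n_positive=230, n_total=1280,
                                 well_volume_nl=3.0)
expected = 1000.0                         # assumed input, copies/uL
print(measured / expected)
```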

  1. Improved quantification of microbial CH4 oxidation efficiency in arctic wetland soils using carbon isotope fractionation

    NASA Astrophysics Data System (ADS)

    Preuss, I.; Knoblauch, C.; Gebert, J.; Pfeiffer, E.-M.

    2013-04-01

    Permafrost-affected tundra soils are significant sources of the climate-relevant trace gas methane (CH4). The observed accelerated warming of the arctic will cause deeper permafrost thawing, followed by increased carbon mineralization and CH4 formation in water-saturated tundra soils, thus creating a positive feedback to climate change. Aerobic CH4 oxidation is regarded as the key process reducing CH4 emissions from wetlands, but quantification of turnover rates has remained difficult so far. The application of carbon stable isotope fractionation enables the in situ quantification of CH4 oxidation efficiency in arctic wetland soils. The aim of the current study is to quantify CH4 oxidation efficiency in permafrost-affected tundra soils in Russia's Lena River delta based on stable isotope signatures of CH4. Therefore, depth profiles of CH4 concentrations and δ13CH4 signatures were measured and the fractionation factors for the processes of oxidation (αox) and diffusion (αdiff) were determined. Most previous studies employing stable isotope fractionation for the quantification of CH4 oxidation in soils of other habitats (such as landfill cover soils) have assumed a gas transport dominated by advection (αtrans = 1). In tundra soils, however, diffusion is the main gas transport mechanism and diffusive stable isotope fractionation should be considered alongside oxidative fractionation. For the first time, the stable isotope fractionation of CH4 diffusion through water-saturated soils was determined with an αdiff = 1.001 ± 0.000 (n = 3). CH4 stable isotope fractionation during diffusion through air-filled pores of the investigated polygonal tundra soils was αdiff = 1.013 ± 0.003 (n = 18). Furthermore, it was found that αox differs widely between sites and horizons (mean αox = 1.017 ± 0.009) and needs to be determined on a case by case basis. The impact of both fractionation factors on the quantification of CH4 oxidation was analyzed by considering both the potential diffusion rate under saturated and unsaturated conditions and potential oxidation rates. For a submerged, organic-rich soil, the data indicate a CH4 oxidation efficiency of 50% at the anaerobic-aerobic interface in the upper horizon. The improved in situ quantification of CH4 oxidation in wetlands enables a better assessment of current and potential CH4 sources and sinks in permafrost-affected ecosystems and their potential strengths in response to global warming.
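    The quantification step itself reduces to a compact relation once the fractionation factors are in hand. A sketch of the commonly used open-system form is given below; the δ13C values in the example are invented for illustration, while the fractionation factors echo the means reported above.

```python
def ch4_oxidized_fraction(delta_emitted, delta_produced, alpha_ox, alpha_trans):
    """Open-system estimate of the fraction of CH4 oxidized.

    delta_emitted, delta_produced : d13C-CH4 (per mil) of CH4 leaving the
                                    soil and of CH4 produced at depth
    alpha_ox, alpha_trans         : fractionation factors for oxidation
                                    and transport (diffusion here)
    """
    return (delta_emitted - delta_produced) / ((alpha_ox - alpha_trans) * 1000.0)

# Hypothetical deltas with the reported mean alpha_ox and the
# water-saturated alpha_diff give the ~50% efficiency scale of the paper:
print(ch4_oxidized_fraction(-55.0, -63.0, alpha_ox=1.017, alpha_trans=1.001))
# -> 0.5
```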

  2. Improved quantification of microbial CH4 oxidation efficiency in Arctic wetland soils using carbon isotope fractionation

    NASA Astrophysics Data System (ADS)

    Preuss, I.; Knoblauch, C.; Gebert, J.; Pfeiffer, E.-M.

    2012-12-01

    Permafrost-affected tundra soils are significant sources of the climate-relevant trace gas methane (CH4). The observed accelerated warming of the Arctic will cause deeper permafrost thawing, followed by increased carbon mineralization and CH4 formation in water-saturated tundra soils, which might cause a positive feedback to climate change. Aerobic CH4 oxidation is regarded as the key process reducing CH4 emissions from wetlands, but quantification of turnover rates has remained difficult so far. The application of carbon stable isotope fractionation enables the in situ quantification of CH4 oxidation efficiency in arctic wetland soils. The aim of the current study is to quantify CH4 oxidation efficiency in permafrost-affected tundra soils in Russia's Lena River Delta based on stable isotope signatures of CH4. Therefore, depth profiles of CH4 concentrations and δ13CH4 signatures were measured and the fractionation factors for the processes of oxidation (αox) and diffusion (αdiff) were determined. Most previous studies employing stable isotope fractionation for the quantification of CH4 oxidation in soils of other habitats (e.g. landfill cover soils) have assumed a gas transport dominated by advection (αtrans = 1). In tundra soils, however, diffusion is the main gas transport mechanism, aside from ebullition. Hence, diffusive stable isotope fractionation has to be considered. For the first time, the stable isotope fractionation of CH4 diffusion through water-saturated soils was determined, with αdiff = 1.001 ± 0.000 (n = 3). CH4 stable isotope fractionation during diffusion through air-filled pores of the investigated polygonal tundra soils was αdiff = 1.013 ± 0.003 (n = 18). Furthermore, it was found that αox differs widely between sites and horizons (mean αox = 1.017 ± 0.009) and needs to be determined individually. The impact of both fractionation factors on the quantification of CH4 oxidation was analyzed by considering both the potential diffusion rate under saturated and unsaturated conditions and potential oxidation rates. For a submerged, organic-rich soil, the data indicate a CH4 oxidation efficiency of 50% at the anaerobic-aerobic interface in the upper horizon. The improved in situ quantification of CH4 oxidation in wetlands enables a better assessment of current and potential CH4 sources and sinks in permafrost-affected ecosystems and their potential strengths in response to global warming.

  3. Capital Ideas for Facilities Management.

    ERIC Educational Resources Information Center

    Golding, Stephen T.; Gordon, Janet; Gravina, Arthur

    2001-01-01

    Asserting that just like chief financial officers, higher education facilities specialists must maximize the long-term performance of assets under their care, describes strategies for strategic facilities management. Discusses three main approaches to facilities management (insourcing, cosourcing, and outsourcing) and where boards of trustees fit…

  4. Discipline under IDEA.

    ERIC Educational Resources Information Center

    Horton, Janet L.

    1999-01-01

    Discretion and regulatory flexibility must be managed without violating the amended, reauthorized Individuals with Disabilities Education Act. Misbehaving disabled students may be removed from their educational placements for 10 consecutive days or less--the same punishment meted out to regular students. Longer-term placements need IEP team…

  5. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL, and we describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  6. Simultaneous quantification of acetaminophen and five acetaminophen metabolites in human plasma and urine by high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry: Method validation and application to a neonatal pharmacokinetic study.

    PubMed

    Cook, Sarah F; King, Amber D; van den Anker, John N; Wilkins, Diana G

    2015-12-15

    Drug metabolism plays a key role in acetaminophen (paracetamol)-induced hepatotoxicity, and quantification of acetaminophen metabolites provides critical information about factors influencing susceptibility to acetaminophen-induced hepatotoxicity in clinical and experimental settings. The aims of this study were to develop, validate, and apply high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) methods for simultaneous quantification of acetaminophen, acetaminophen-glucuronide, acetaminophen-sulfate, acetaminophen-glutathione, acetaminophen-cysteine, and acetaminophen-N-acetylcysteine in small volumes of human plasma and urine. In the reported procedures, acetaminophen-d4 and acetaminophen-d3-sulfate were utilized as internal standards (IS). Analytes and IS were recovered from human plasma (10μL) by protein precipitation with acetonitrile. Human urine (10μL) was prepared by fortification with IS followed only by sample dilution. Calibration concentration ranges were tailored to literature values for each analyte in each biological matrix. Prepared samples from plasma and urine were analyzed under the same HPLC-ESI-MS/MS conditions, and chromatographic separation was achieved through use of an Agilent Poroshell 120 EC-C18 column with a 20-min run time per injected sample. The analytes could be accurately and precisely quantified over 2.0-3.5 orders of magnitude. Across both matrices, mean intra- and inter-assay accuracies ranged from 85% to 112%, and intra- and inter-assay imprecision did not exceed 15%. Validation experiments included tests for specificity, recovery and ionization efficiency, inter-individual variability in matrix effects, stock solution stability, and sample stability under a variety of storage and handling conditions (room temperature, freezer, freeze-thaw, and post-preparative). The utility and suitability of the reported procedures were illustrated by analysis of pharmacokinetic samples collected from neonates receiving intravenous acetaminophen. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Identification and quantification of cardiac glycosides in blood and urine samples by HPLC/MS/MS.

    PubMed

    Guan, F; Ishii, A; Seno, H; Watanabe-Suzuki, K; Kumazawa, T; Suzuki, O

    1999-09-15

    Cardiac glycosides (CG) are of forensic importance because of their toxicity and the fact that very limited methods are available for the identification of CG in biological samples. In this study, we have developed an identification and quantification method for digoxin, digitoxin, deslanoside, digoxigenin, and digitoxigenin by high-performance liquid chromatography tandem mass spectrometry (HPLC/MS/MS). CG formed abundant [M + NH4]+ ions and much less abundant [M + H]+ ions, as observed with an electrospray ionization (ESI) source and ammonium formate buffer. Under mild conditions for collision-induced dissociation (CID), each [M + NH4]+ ion fragmented to produce a dominant daughter ion, which was essential to the sensitive selected reaction monitoring (SRM) quantification of CG achieved in this study. SRM was compared with selected ion monitoring (SIM) regarding the effects of sample matrices on the methodology. SRM produced lower detection limits with biological samples than SIM, while both methods produced equal detection limits with CG standards. On the basis of the HPLC/MS/MS results for CG, we propose some generalized points for conducting sensitive SRM measurements, in view of the properties of the analytes as well as instrumental conditions such as the type of HPLC/MS interface and the CID parameters. Analytes whose molecular ion can produce one abundant daughter ion with high yield under CID conditions may be sensitively measured by SRM. ESI is the softest ionization source developed so far and affords formation of the fragile molecular ions that are necessary for sensitive SRM detection. Mild CID conditions, such as low collision energy and low collision gas pressure, favor production of an abundant daughter ion, which is essential to sensitive SRM detection. This knowledge may provide some guidelines for conducting sensitive SRM measurements of very low concentrations of drugs or toxicants in biological samples.

  8. Method development and validation for simultaneous quantification of 15 drugs of abuse and prescription drugs and 7 of their metabolites in whole blood relevant in the context of driving under the influence of drugs--usefulness of multi-analyte calibration.

    PubMed

    Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas

    2014-11-01

    In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study, we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an ABSciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines, including selectivity, recovery, matrix effects, accuracy and precision, stabilities, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM), with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW; however, CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except MAM. Systematic investigation of the accuracy determined for QC MED in a multi-analyte mixture, compared to samples containing only single analytes, revealed no relevant differences for any analyte, indicating that multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples to a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  9. Digital isothermal quantification of nucleic acids via simultaneous chemical initiation of recombinase polymerase amplification reactions on SlipChip.

    PubMed

    Shen, Feng; Davydova, Elena K; Du, Wenbin; Kreutz, Jason E; Piepenburg, Olaf; Ismagilov, Rustem F

    2011-05-01

    In this paper, digital quantitative detection of nucleic acids was achieved at the single-molecule level by chemical initiation of over one thousand sequence-specific, nanoliter isothermal amplification reactions in parallel. Digital polymerase chain reaction (digital PCR), a method used for quantification of nucleic acids, counts the presence or absence of amplification of individual molecules. However, it still requires temperature cycling, which is undesirable under resource-limited conditions. This makes isothermal methods for nucleic acid amplification, such as recombinase polymerase amplification (RPA), more attractive. A microfluidic digital RPA SlipChip is described here for simultaneous initiation of over one thousand nL-scale RPA reactions by adding a chemical initiator to each reaction compartment with a simple slipping step after instrument-free pipet loading. Two designs of the SlipChip, two-step slipping and one-step slipping, were validated using digital RPA. By using the digital RPA SlipChip, false-positive results from preinitiation of the RPA amplification reaction before incubation were eliminated. End point fluorescence readout was used for "yes or no" digital quantification. The performance of digital RPA in a SlipChip was validated by amplifying and counting single molecules of the target nucleic acid, methicillin-resistant Staphylococcus aureus (MRSA) genomic DNA. The digital RPA on SlipChip was also tolerant to fluctuations of the incubation temperature (37-42 °C), and its performance was comparable to digital PCR on the same SlipChip design. The digital RPA SlipChip provides a simple method to quantify nucleic acids without requiring thermal cycling or kinetic measurements, with potential applications in diagnostics and environmental monitoring under resource-limited settings. The ability to initiate thousands of chemical reactions in parallel on the nanoliter scale using solvent-resistant glass devices is likely to be useful for a broader range of applications.

  10. Digital Isothermal Quantification of Nucleic Acids via Simultaneous Chemical Initiation of Recombinase Polymerase Amplification Reactions on SlipChip

    PubMed Central

    Shen, Feng; Davydova, Elena K.; Du, Wenbin; Kreutz, Jason E.; Piepenburg, Olaf; Ismagilov, Rustem F.

    2011-01-01

    In this paper, digital quantitative detection of nucleic acids was achieved at the single-molecule level by chemical initiation of over one thousand sequence-specific, nanoliter, isothermal amplification reactions in parallel. Digital polymerase chain reaction (digital PCR), a method used for quantification of nucleic acids, counts the presence or absence of amplification of individual molecules. However, it still requires temperature cycling, which is undesirable under resource-limited conditions. This makes isothermal methods for nucleic acid amplification, such as recombinase polymerase amplification (RPA), more attractive. A microfluidic digital RPA SlipChip is described here for simultaneous initiation of over one thousand nL-scale RPA reactions by adding a chemical initiator to each reaction compartment with a simple slipping step after instrument-free pipette loading. Two designs of the SlipChip, two-step slipping and one-step slipping, were validated using digital RPA. By using the digital RPA SlipChip, false-positive results from pre-initiation of the RPA amplification reaction before incubation were eliminated. End-point fluorescence readout was used for “yes or no” digital quantification. The performance of digital RPA in a SlipChip was validated by amplifying and counting single molecules of the target nucleic acid, methicillin-resistant Staphylococcus aureus (MRSA) genomic DNA. The digital RPA on SlipChip was also tolerant to fluctuations of the incubation temperature (37–42 °C), and its performance was comparable to digital PCR on the same SlipChip design. The digital RPA SlipChip provides a simple method to quantify nucleic acids without requiring thermal cycling or kinetic measurements, with potential applications in diagnostics and environmental monitoring under resource-limited settings. The ability to initiate thousands of chemical reactions in parallel on the nanoliter scale using solvent-resistant glass devices is likely to be useful for a broader range of applications. PMID:21476587

  11. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    PubMed

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system identifies the renal tissue structures through knowledge-based rules employing colour space transformations and structural feature extraction from the images; in particular, glomerulus identification is based on multiscale textural feature analysis and a support vector machine. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A 40-image ground truth dataset was manually prepared in consultation with an experienced pathologist to validate the segmentation algorithms. Experiments involving experienced pathologists demonstrated an average error of 9 percentage points between the automated system's quantification and the pathologists' visual evaluation. Experiments investigating variability among pathologists, involving samples from 70 kidney patients, also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. The correlation between different pathologists' estimates of the interstitial fibrosis area improved significantly, demonstrating the effectiveness of the quantification system as a diagnostic aide. Copyright © 2017 Elsevier B.V. All rights reserved.
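    The elimination step at the core of the system reduces, once the structures are segmented, to simple mask arithmetic. A minimal sketch under that assumption (mask names are illustrative):

```python
import numpy as np

def fibrosis_percentage(tissue_mask, structure_masks):
    """Interstitial fibrosis as a percentage of the biopsy area.

    tissue_mask     : boolean array, True inside the biopsy tissue
    structure_masks : boolean arrays flagging segmented non-fibrosis
                      structures (glomeruli, tubules, vessels, ...)
    """
    non_fibrosis = np.zeros_like(tissue_mask, dtype=bool)
    for mask in structure_masks:
        non_fibrosis |= mask                 # union of known structures
    fibrosis = tissue_mask & ~non_fibrosis   # what remains is fibrosis
    return 100.0 * fibrosis.sum() / tissue_mask.sum()
```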

  12. Sometimes it hurts when supervisors don't listen: the antecedents and consequences of safety voice among young workers.

    PubMed

    Tucker, Sean; Turner, Nick

    2015-01-01

    We examined the relationship among having ideas about how to improve occupational safety, speaking up about them (safety voice), and future work-related injuries. One hundred fifty-five employed teenagers completed 3 surveys with a 1-month lag between each survey. We found that participants who were more likely to have ideas about how to improve occupational safety and had high affective commitment to the organization reported the highest level of safety voice. In turn, supervisor openness to voice moderated the relationship between safety voice and future work-related injuries. Specifically, future work-related injuries were most frequent when high levels of safety voice were combined with low supervisor openness to voice. The tested model clarifies the conditions under which workers share safety-related ideas with a supervisor and the real consequences of speaking up about them. We discuss the implications of these findings for safety management. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  13. Optional IDEA Alternative Dispute Resolution. inForum

    ERIC Educational Resources Information Center

    Henderson, Kelly

    2008-01-01

    Though most interactions between parents and school personnel about students with disabilities are positive and productive, disagreements can arise. Disputes may range in intensity from minor miscommunications to significant conflicts that trigger the use of procedural safeguards available under federal law. The Individuals with Disabilities…

  14. Owning Corporate Texts.

    ERIC Educational Resources Information Center

    Winsor, Dorothy A.

    1993-01-01

    Examines the relationship between technical writers and their texts. Suggests the amount of ownership any writer has varies depending on context; therefore, in technical writing, the more a document represents an organization, the less likely the words and ideas are to be solely under the control of the writer. (NH)

  15. Nanotube News

    ERIC Educational Resources Information Center

    Journal of College Science Teaching, 2005

    2005-01-01

    Smaller, faster computers, bullet-proof t-shirts, and itty-bitty robots--such are the promises of nanotechnology and the cylinder-shaped collection of carbon molecules known as nanotubes. But for these exciting ideas to become realities, scientists must understand how these miracle molecules perform under all sorts of conditions. This brief…

  16. Nonparametric Regression and the Parametric Bootstrap for Local Dependence Assessment.

    ERIC Educational Resources Information Center

    Habing, Brian

    2001-01-01

    Discusses ideas underlying nonparametric regression and the parametric bootstrap with an overview of their application to item response theory and the assessment of local dependence. Illustrates the use of the method in assessing local dependence that varies with examinee trait levels. (SLD)
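
    The parametric-bootstrap idea mentioned above can be illustrated compactly: simulate response data from a fitted item response model, under which local independence holds by construction, and compare an observed item-pair statistic against the simulated reference distribution. In the Python sketch below, the Rasch parameters and the raw inter-item correlation are illustrative stand-ins, not the specific statistics used in the article.

        # Parametric bootstrap for local dependence under a Rasch model.
        import numpy as np

        rng = np.random.default_rng(0)

        def rasch_sim(thetas, betas):
            """Simulate a 0/1 response matrix under the Rasch model."""
            p = 1.0 / (1.0 + np.exp(-(thetas[:, None] - betas[None, :])))
            return (rng.random(p.shape) < p).astype(int)

        def pair_stat(X, i, j):
            """Raw inter-item correlation as a crude local-dependence statistic."""
            return np.corrcoef(X[:, i], X[:, j])[0, 1]

        # Hypothetical calibrated parameters (would come from fitting real data).
        thetas = rng.normal(size=500)           # examinee trait levels
        betas = np.linspace(-1.5, 1.5, 10)      # item difficulties

        observed = rasch_sim(thetas, betas)     # stand-in for the real data matrix
        stat_obs = pair_stat(observed, 0, 1)

        # Reference distribution under local independence.
        boot = [pair_stat(rasch_sim(thetas, betas), 0, 1) for _ in range(1000)]
        p_value = np.mean(np.abs(boot) >= abs(stat_obs))
        print(f"observed stat = {stat_obs:.3f}, bootstrap p = {p_value:.3f}")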

  17. Whitewater Kayaking Instruction: Skills and Techniques.

    ERIC Educational Resources Information Center

    Poff, Raymond; Stuessy, Tom

    This paper briefly presents ideas and techniques that can facilitate effective whitewater kayaking instruction. Instructors often focus so much on the mechanics of specific skills that they overlook less obvious, but equally important, aspects of instruction. These aspects include the underlying purposes and guiding principles of kayaking…

  18. Preface

    Treesearch

    David DeYoe

    1999-01-01

    The idea for this workshop began in 1996 when the folks in Quebec, faced with some unappealing fiscal realities that promised to devastate their provincial forest research capacity, held a meeting that convened representatives from research organizations around the world to share their experiences under similar circumstances. The meeting gathered senior research...

  19. Family Counseling Interventions: Understanding Family Systems and the Referral Process.

    ERIC Educational Resources Information Center

    McWhirter, Ellen Hawley; And Others

    1993-01-01

    This article describes concepts underlying the idea of the "family as a system"; compares and contrasts four approaches to family therapy (those of Virginia Satir, Jay Haley, Murray Bowen, and Salvador Minuchin); and offers suggestions to teachers referring parents for family counseling. (DB)

  20. Sparse Reconstruction Techniques in MRI: Methods, Applications, and Challenges to Clinical Adoption

    PubMed Central

    Yang, Alice Chieh-Yu; Kretzler, Madison; Sudarski, Sonja; Gulani, Vikas; Seiberlich, Nicole

    2016-01-01

    The family of sparse reconstruction techniques, including the recently introduced compressed sensing framework, has been extensively explored to reduce scan times in Magnetic Resonance Imaging (MRI). While there are many different methods that fall under the general umbrella of sparse reconstructions, they all rely on the idea that a priori information about the sparsity of MR images can be employed to reconstruct full images from undersampled data. This review describes the basic ideas behind sparse reconstruction techniques, how they could be applied to improve MR imaging, and the open challenges to their general adoption in a clinical setting. The fundamental principles underlying different classes of sparse reconstruction techniques are examined, and the requirements that each makes on the undersampled data are outlined. Applications that could potentially benefit from the accelerations that sparse reconstructions could provide are described, and clinical studies using sparse reconstructions are reviewed. Lastly, technical and clinical challenges to widespread implementation of sparse reconstruction techniques, including optimization, reconstruction times, artifact appearance, and comparison with current gold standards, are discussed. PMID:27003227
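
    The core idea, that a full image can be recovered from undersampled data when it is sparse in some transform, can be demonstrated in a few lines. The toy Python sketch below applies iterative soft thresholding (ISTA) to randomly undersampled k-space of a synthetic image that is sparse in the image domain; real CS-MRI would use a wavelet or similar sparsifying transform, and all sizes and parameters here are illustrative.

        # Toy compressed-sensing recovery from undersampled k-space via ISTA.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 64

        # Synthetic sparse "image" and a random ~30% k-space sampling mask.
        x_true = np.zeros((n, n))
        x_true[rng.integers(0, n, 40), rng.integers(0, n, 40)] = 1.0
        mask = rng.random((n, n)) < 0.3

        y = mask * np.fft.fft2(x_true)          # undersampled measurements

        def soft(z, t):
            """Soft-thresholding operator (the proximal map of the l1 norm)."""
            return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

        x = np.zeros((n, n))
        lam = 0.05                              # sparsity threshold
        for _ in range(200):
            # Gradient step on the data-consistency term ||mask*F(x) - y||^2,
            # followed by a sparsity-enforcing shrinkage step.
            grad = np.fft.ifft2(mask * np.fft.fft2(x) - y).real
            x = soft(x - grad, lam)

        err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
        print(f"relative recovery error: {err:.3f}")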
