Sample records for previous results based

  1. A voxel-based approach to gray matter asymmetries.

    PubMed

    Luders, E; Gaser, C; Jancke, L; Schlaug, G

    2004-06-01

    Voxel-based morphometry (VBM) was used to analyze gray matter (GM) asymmetries in a large sample (n = 60) of male and female professional musicians with and without absolute pitch (AP). We chose to examine these particular groups because previous studies using traditional region-of-interest (ROI) analyses have shown differences in hemispheric asymmetry related to AP and gender. Voxel-based methods may have advantages over traditional ROI-based methods since the analysis can be performed across the whole brain with minimal user bias. After determining that the VBM method was sufficiently sensitive for the detection of differences in GM asymmetries between groups, we found that male AP musicians were more leftward lateralized in the anterior region of the planum temporale (PT) than male non-AP musicians. This confirmed the results of previous studies using ROI-based methods that showed an association between PT asymmetry and the AP phenotype. We further observed that male non-AP musicians revealed an increased leftward GM asymmetry in the postcentral gyrus compared to female non-AP musicians, again corroborating results of a previously published study using ROI-based methods. By analyzing hemispheric GM differences across our entire sample, we were able to partially confirm findings of previous studies using traditional morphometric techniques, as well as more recent, voxel-based analyses. In addition, we found some unusually pronounced GM asymmetries in our musician sample not previously detected in subjects unselected for musical training. Since we were able to validate gender- and AP-related brain asymmetries previously described using traditional ROI-based morphometric techniques, the results of our analyses support the use of VBM for examinations of GM asymmetries.
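    For readers unfamiliar with how hemispheric GM asymmetries are quantified voxel by voxel, the sketch below computes a standard asymmetry index between a gray-matter map and its left-right mirror. This is a generic illustration, not the authors' SPM-based pipeline; the array name `gm`, its orientation, and the absence of smoothing are all assumptions.

```python
# Minimal sketch of a voxel-wise gray-matter asymmetry index, as commonly
# used in VBM asymmetry analyses (not the authors' exact pipeline).
# Assumes `gm` is a 3D numpy array of normalized GM probability values
# whose first axis is the left-right axis.
import numpy as np

def asymmetry_index(gm: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Return (original - mirrored) / (original + mirrored) per voxel."""
    mirrored = gm[::-1, :, :]              # flip along the left-right axis
    return (gm - mirrored) / (gm + mirrored + eps)

# Example: a random GM map; positive values mark voxels with more GM than
# their homologue in the opposite hemisphere.
ai = asymmetry_index(np.random.rand(91, 109, 91))
print(ai.shape, float(ai.mean()))
```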

  2. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs

    PubMed Central

    Wright, A.; Krousel-Wood, M.; Thomas, E. J.; McCoy, J. A.; Sittig, D. F.

    2015-01-01

    Background: Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. Objective: We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. Methods: We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. Results: The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. Conclusions: We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes. PMID:26171079
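    The abstract describes a concrete pipeline: count problem-medication co-occurrences (link frequency), normalize them (link ratio), keep pairs above a clinician-chosen threshold, then score recall and precision against gold-standard indications. The sketch below illustrates those steps under assumed definitions; the function names, the link-ratio definition, and the 0.1 threshold are illustrative, not taken from the paper.

```python
# Hedged sketch of the knowledge-base construction and evaluation steps the
# abstract describes. All names and the example threshold are illustrative.
from collections import Counter

def build_knowledge_base(cooccurrences, threshold=0.1):
    """cooccurrences: iterable of (problem, medication) pairs observed in the EHR."""
    link_freq = Counter(cooccurrences)                  # pair co-occurrence count
    med_totals = Counter(med for _, med in cooccurrences)
    link_ratio = {pair: n / med_totals[pair[1]] for pair, n in link_freq.items()}
    return {pair for pair, r in link_ratio.items() if r >= threshold}

def recall_precision(kb_pairs, gold_pairs):
    true_pos = len(kb_pairs & gold_pairs)
    recall = true_pos / len(gold_pairs) if gold_pairs else 0.0
    precision = true_pos / len(kb_pairs) if kb_pairs else 0.0
    return recall, precision

kb = build_knowledge_base([("hypertension", "lisinopril"),
                           ("hypertension", "lisinopril"),
                           ("cough", "lisinopril")])
print(recall_precision(kb, {("hypertension", "lisinopril")}))
```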

  3. Integrated Planning Model (IPM) Base Case v.4.10

    EPA Pesticide Factsheets

    Learn about EPA's IPM Base Case v.4.10, including Proposed Transport Rule results, documentation, the National Electric Energy Data System (NEEDS) database and user's guide, and run results using previous base cases.

  4. Microcomputer Assisted Interpretative Reporting of Sequential Creatine Kinase (CK) and Lactate Dehydrogenase (LDH) Isoenzyme Determination

    PubMed Central

    Talamo, Thomas S.; Losos, Frank J.; Mercer, Donald W.

    1984-01-01

    We have developed a microcomputer based system for interpretative reporting of creatine kinase (CK) and lactate dehydrogenase (LDH) isoenzyme studies. Patient demographic data and test results (total CK, CK-MB, LD-1, and LD-2) are entered manually through the keyboard. The test results are compared with normal range values and an interpretative report is generated. This report consists of all pertinent demographic information with a graphic display of up to 12 previous CK and LDH isoenzyme determinations. Diagnostic interpretative statements are printed beneath the graphic display following analysis of previously entered test results. The combination of graphic data display and interpretations based on analysis of up to 12 previous specimens provides useful and accurate information to the cardiologist.

  5. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs.

    PubMed

    McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F

    2015-01-01

    Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes.

  6. A Comprehensive Test of General Strain Theory: Key Strains, Situational- and Trait-Based Negative Emotions, Conditioning Factors, and Delinquency

    ERIC Educational Resources Information Center

    Moon, Byongook; Morash, Merry; McCluskey, Cynthia Perez; Hwang, Hye-Won

    2009-01-01

    Using longitudinal data on South Korean youth, the authors addressed limitations of previous tests of general strain theory (GST), focusing on the relationships among key strains, situational- and trait-based negative emotions, conditioning factors, and delinquency. Eight types of strain previously shown most likely to result in delinquency,…

  7. Enhanced thermoelectric performance of graphene nanoribbon-based devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hossain, Md Sharafat, E-mail: hossain@student.unimelb.edu.au; Huynh, Duc Hau; Nguyen, Phuong Duc

    There have been numerous theoretical studies on exciting thermoelectric properties of graphene nano-ribbons (GNRs); however, most of these studies are mainly based on simulations. In this work, we measure and characterize the thermoelectric properties of GNRs and compare the results with theoretical predictions. Our experimental results verify that nano-structuring and patterning graphene into nano-ribbons significantly enhance its thermoelectric power, confirming previous predictions. Although patterning results in lower conductance (G), the overall power factor (S²G) increases for nanoribbons. We demonstrate that edge roughness plays an important role in achieving such an enhanced performance and support it through first principles simulations. We show that uncontrolled edge roughness, which is considered detrimental in GNR-based electronic devices, leads to enhanced thermoelectric performance of GNR-based thermoelectric devices. The result validates previously reported theoretical studies of GNRs and demonstrates the potential of GNRs for the realization of highly efficient thermoelectric devices.
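    For reference, the quantity the abstract writes as S²G is the thermoelectric power factor:

```latex
% Power factor as written in the abstract above: S is the Seebeck
% coefficient (thermopower) and G the electrical conductance.
\[
  \mathrm{PF} = S^{2} G
\]
% The abstract's claim is that patterning raises S enough that S^2 G grows
% even though G itself drops.
```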

  8. Roadway lighting and safety : phase II--monitoring quality, durability and efficiency.

    DOT National Transportation Integrated Search

    2011-10-01

    This Phase II project follows a previous project titled Strategies to Address Nighttime Crashes at Rural, Unsignalized Intersections. Based on the results of the previous study, the Iowa Highway Research Board (IHRB) indicated interest in pursuing fu...

  9. Display-based communications for advanced transport aircraft

    NASA Technical Reports Server (NTRS)

    Lee, Alfred T.

    1989-01-01

    The next generation of civil transport aircraft will depend increasingly upon ground-air-ground and satellite data link for information critical to safe and efficient air transportation. Previous studies which examined the concept of display-based communications in addition to, or in lieu of, conventional voice transmissions are reviewed. A full-mission flight simulation comparing voice and display-based communication modes in an advanced transport aircraft is also described. The results indicate that a display-based mode of information transfer does not result in significantly increased aircrew workload, but does result in substantially increased message acknowledgment times when compared to conventional voice transmissions. User acceptance of the display-based communication system was generally high, replicating the findings of previous studies. However, most pilots tested expressed concern over the potential loss of information available from frequency monitoring which might result from the introduction of discrete address communications. Concern was expressed by some pilots for the reduced time available to search for conflicting traffic when using the communications display system. The implications of the findings for the design of display-based communications are discussed.

  10. Effects of linking a soil-water-balance model with a groundwater-flow model

    USGS Publications Warehouse

    Stanton, Jennifer S.; Ryter, Derek W.; Peterson, Steven M.

    2013-01-01

    A previously published regional groundwater-flow model in north-central Nebraska was sequentially linked with the recently developed soil-water-balance (SWB) model to analyze effects to groundwater-flow model parameters and calibration results. The linked models provided a more detailed spatial and temporal distribution of simulated recharge based on hydrologic processes, improvement of simulated groundwater-level changes and base flows at specific sites in agricultural areas, and a physically based assessment of the relative magnitude of recharge for grassland, nonirrigated cropland, and irrigated cropland areas. Root-mean-squared (RMS) differences between the simulated and estimated or measured target values for the previously published model and linked models were relatively similar and did not improve for all types of calibration targets. However, without any adjustment to the SWB-generated recharge, the RMS difference between simulated and estimated base-flow target values for the groundwater-flow model was slightly smaller than for the previously published model, possibly indicating that the volume of recharge simulated by the SWB code was closer to actual hydrogeologic conditions than the previously published model provided. Groundwater-level and base-flow hydrographs showed that temporal patterns of simulated groundwater levels and base flows were more accurate for the linked models than for the previously published model at several sites, particularly in agricultural areas.
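    For reference, the RMS difference used above to compare simulated values with calibration targets has the usual form (a generic definition; any target weighting used in the report is not shown):

```latex
% Generic root-mean-squared (RMS) difference between n simulated values s_i
% and their estimated or measured calibration targets o_i.
\[
  \mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} \left( s_i - o_i \right)^{2}}
\]
```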

  11. Interim 2001-based national population projections for the United Kingdom and constituent countries.

    PubMed

    Shaw, Chris

    2003-01-01

    This article describes new 2001-based national population projections which were carried out following the publication in September 2002 of the first results of the 2001 Census. These "interim" projections, carried out by the Government Actuary in consultation with the Registrars General, take preliminary account of the results of the Census which showed that the base population used in previous projections was overestimated. The interim projections also incorporate a reduced assumption of net international migration to the United Kingdom, informed by the first results of the 2001 Census and taking account of more recent migration information. The population of the United Kingdom is now projected to increase from an estimated 58.8 million in 2001 to reach 63.2 million by 2026. The projected population at 2026 is about 1.8 million (2.8 per cent) lower than in the previous (2000-based) projections.

  12. Boosting with Averaged Weight Vectors

    NASA Technical Reports Server (NTRS)

    Oza, Nikunj C.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    AdaBoost is a well-known ensemble learning algorithm that constructs its constituent or base models in sequence. A key step in AdaBoost is constructing a distribution over the training examples to create each base model. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence. The idea is to make the next base model's errors uncorrelated with those of the previous model. Some researchers have pointed out the intuition that it is probably better to construct a distribution that is orthogonal to the mistake vectors of all the previous base models, but that this is not always possible. We present an algorithm that attempts to come as close as possible to this goal in an efficient manner. We present experimental results demonstrating significant improvement over AdaBoost and the Totally Corrective boosting algorithm, which also attempts to satisfy this goal.
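    The distribution-update step the abstract refers to is the standard AdaBoost reweighting, sketched below in NumPy. The paper's averaged-weight-vector variant (approximating orthogonality to all previous mistake vectors) is not reproduced here, only the baseline it modifies.

```python
# Standard AdaBoost.M1-style distribution update: after each base model,
# example weights are scaled by that model's mistakes and renormalized.
import numpy as np

def adaboost_update(dist, y_true, y_pred):
    """dist: current distribution over examples; y_true/y_pred in {-1, +1}."""
    err = np.sum(dist * (y_true != y_pred))
    err = np.clip(err, 1e-12, 1 - 1e-12)
    alpha = 0.5 * np.log((1 - err) / err)          # base-model weight
    new_dist = dist * np.exp(-alpha * y_true * y_pred)
    return new_dist / new_dist.sum(), alpha

d0 = np.full(4, 0.25)
d1, a1 = adaboost_update(d0, np.array([1, 1, -1, -1]), np.array([1, -1, -1, -1]))
print(np.round(d1, 3), round(a1, 3))   # misclassified example gets more weight
```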

  13. Depolarization Lidar Determination Of Cloud-Base Microphysical Properties

    NASA Astrophysics Data System (ADS)

    Donovan, D. P.; Klein Baltink, H.; Henzing, J. S.; de Roode, S.; Siebesma, A. P.

    2016-06-01

    The links between multiple-scattering induced depolarization and cloud microphysical properties (e.g. cloud particle number density, effective radius, water content) have long been recognised. Previous efforts to use depolarization information in a quantitative manner to retrieve cloud microphysical properties have also been undertaken but with limited scope and, arguably, success. In this work we present a retrieval procedure applicable to liquid stratus clouds with (quasi-)linear LWC profiles and (quasi-)constant number density profiles in the cloud-base region. This set of assumptions allows us to employ a fast and robust inversion procedure based on a lookup-table approach applied to extensive lidar Monte-Carlo multiple-scattering calculations. An example validation case is presented where the results of the inversion procedure are compared with simultaneous cloud radar observations. In non-drizzling conditions it was found, in general, that the lidar-only inversion results can be used to predict the radar reflectivity within the radar calibration uncertainty (2-3 dBZ). Results of a comparison between ground-based aerosol number concentration and lidar-derived cloud-base number concentrations are also presented. The observed relationship between the two quantities is seen to be consistent with the results of previous studies based on aircraft-based in situ measurements.
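    The lookup-table inversion idea can be illustrated with a generic nearest-profile match, sketched below; in the paper the table comes from lidar Monte-Carlo multiple-scattering calculations, which are only mocked here, and the candidate grid and noise level are invented.

```python
# Generic lookup-table inversion sketch: pre-computed depolarization profiles
# are matched to an observed profile by least squares over (N, r_eff).
import numpy as np

rng = np.random.default_rng(0)
n_bins = 20
# Mock LUT: rows are candidate (number density, effective radius) pairs,
# columns the simulated depolarization profile near cloud base.
candidates = [(N, r) for N in (50, 100, 200, 400) for r in (4.0, 6.0, 8.0, 10.0)]
lut = np.array([np.linspace(0, 0.01 * r, n_bins) * np.log10(N) for N, r in candidates])

observed = lut[7] + rng.normal(0, 1e-3, n_bins)   # pretend measurement

best = int(np.argmin(np.sum((lut - observed) ** 2, axis=1)))
print("retrieved (N, r_eff):", candidates[best])
```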

  14. GOSAT CO2 retrieval results using TANSO-CAI aerosol information over East Asia

    NASA Astrophysics Data System (ADS)

    KIM, M.; Kim, W.; Jung, Y.; Lee, S.; Kim, J.; Lee, H.; Boesch, H.; Goo, T. Y.

    2015-12-01

    In satellite remote sensing of CO2, incorrect aerosol information can induce large errors, as previous studies have suggested. Many factors, such as aerosol type, the wavelength dependency of AOD, and aerosol polarization effects, have been the main error sources. Due to these aerosol effects, a large number of retrievals are screened out in quality control, or retrieval errors tend to increase if they are not screened out, especially in East Asia where aerosol concentrations are fairly high. To reduce these aerosol-induced errors, a CO2 retrieval algorithm using simultaneous TANSO-CAI aerosol information was developed. This algorithm adopts AOD and aerosol type information as a priori information from the CAI aerosol retrieval algorithm. The CO2 retrieval algorithm is based on the optimal estimation method and VLIDORT, a vector discrete ordinate radiative transfer model. The CO2 algorithm, developed with various state vectors to find accurate CO2 concentrations, shows reasonable results when compared with other datasets. This study concentrates on the validation of the retrieved results against ground-based TCCON measurements in East Asia and on comparison with previous retrievals from ACOS, NIES, and UoL. Although the retrieved CO2 concentrations are lower than previous results at the ppm level, they show similar trends and high correlation with previous results. Retrieved data and TCCON measurements are compared at three stations (Tsukuba, Saga, and Anmyeondo) in East Asia, with collocation criteria of ±2° in latitude/longitude and ±1 hour of GOSAT passing time. The compared results also show similar trends with good correlation. Based on the TCCON comparison results, a bias correction equation is calculated and applied to the East Asia data.
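    The bias correction mentioned at the end can be illustrated with a simple linear fit against collocated TCCON values, as sketched below; the study's actual correction equation and coefficients are not given in the abstract, so all numbers here are invented.

```python
# Illustrative linear bias correction: fit retrieved XCO2 against collocated
# TCCON XCO2 and invert the fit to correct new retrievals.
import numpy as np

tccon     = np.array([395.1, 396.4, 397.0, 398.2, 399.5])   # ppm (made up)
retrieved = np.array([393.0, 394.5, 395.2, 396.1, 397.6])   # ppm, biased low

slope, intercept = np.polyfit(tccon, retrieved, 1)           # retrieved ≈ a*tccon + b

def bias_correct(x):
    return (x - intercept) / slope                           # map back to the TCCON scale

print(np.round(bias_correct(retrieved), 2))
```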

  15. PZT Active Frequency Based Wind Blade Fatigue to Failure Testing Results for Various Blade Designs

    DTIC Science & Technology

    2011-09-01

    R. J. Werlink... Abstract: This paper summarizes NASA PZT Health Monitoring System results previously reported for 9 meter blade fatigue loading to failure.

  16. Spoof Detection for Finger-Vein Recognition System Using NIR Camera.

    PubMed

    Nguyen, Dat Tien; Yoon, Hyo Sik; Pham, Tuyen Danh; Park, Kang Ryoung

    2017-10-01

    Finger-vein recognition, a new and advanced biometrics recognition method, is attracting the attention of researchers because of its advantages such as high recognition performance and lesser likelihood of theft and inaccuracies occurring on account of skin condition defects. However, as reported by previous researchers, it is possible to attack a finger-vein recognition system by using presentation attack (fake) finger-vein images. As a result, spoof detection, named as presentation attack detection (PAD), is necessary in such recognition systems. Previous attempts to establish PAD methods primarily focused on designing feature extractors by hand (handcrafted feature extractor) based on the observations of the researchers about the difference between real (live) and presentation attack finger-vein images. Therefore, the detection performance was limited. Recently, the deep learning framework has been successfully applied in computer vision and delivered superior results compared to traditional handcrafted methods on various computer vision applications such as image-based face recognition, gender recognition and image classification. In this paper, we propose a PAD method for near-infrared (NIR) camera-based finger-vein recognition system using convolutional neural network (CNN) to enhance the detection ability of previous handcrafted methods. Using the CNN method, we can derive a more suitable feature extractor for PAD than the other handcrafted methods using a training procedure. We further process the extracted image features to enhance the presentation attack finger-vein image detection ability of the CNN method using principal component analysis method (PCA) for dimensionality reduction of feature space and support vector machine (SVM) for classification. Through extensive experimental results, we confirm that our proposed method is adequate for presentation attack finger-vein image detection and it can deliver superior detection results compared to CNN-based methods and other previous handcrafted methods.
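    The post-processing stage described above (PCA on CNN features, then an SVM) can be sketched with scikit-learn as below; the CNN feature extractor is assumed to exist and is replaced here by random feature vectors, and the feature dimension and component count are illustrative.

```python
# Sketch of the PAD post-processing stage: reduce pre-extracted CNN features
# with PCA, then classify real vs. presentation-attack images with an SVM.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 4096))          # stand-in for CNN fc-layer outputs
labels = rng.integers(0, 2, size=200)            # 0 = attack, 1 = live

pad_classifier = make_pipeline(StandardScaler(), PCA(n_components=50), SVC(kernel="rbf"))
pad_classifier.fit(features[:150], labels[:150])
print("held-out accuracy:", pad_classifier.score(features[150:], labels[150:]))
```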

  17. Spoof Detection for Finger-Vein Recognition System Using NIR Camera

    PubMed Central

    Nguyen, Dat Tien; Yoon, Hyo Sik; Pham, Tuyen Danh; Park, Kang Ryoung

    2017-01-01

    Finger-vein recognition, a new and advanced biometrics recognition method, is attracting the attention of researchers because of its advantages such as high recognition performance and lesser likelihood of theft and inaccuracies occurring on account of skin condition defects. However, as reported by previous researchers, it is possible to attack a finger-vein recognition system by using presentation attack (fake) finger-vein images. As a result, spoof detection, named as presentation attack detection (PAD), is necessary in such recognition systems. Previous attempts to establish PAD methods primarily focused on designing feature extractors by hand (handcrafted feature extractor) based on the observations of the researchers about the difference between real (live) and presentation attack finger-vein images. Therefore, the detection performance was limited. Recently, the deep learning framework has been successfully applied in computer vision and delivered superior results compared to traditional handcrafted methods on various computer vision applications such as image-based face recognition, gender recognition and image classification. In this paper, we propose a PAD method for near-infrared (NIR) camera-based finger-vein recognition system using convolutional neural network (CNN) to enhance the detection ability of previous handcrafted methods. Using the CNN method, we can derive a more suitable feature extractor for PAD than the other handcrafted methods using a training procedure. We further process the extracted image features to enhance the presentation attack finger-vein image detection ability of the CNN method using principal component analysis method (PCA) for dimensionality reduction of feature space and support vector machine (SVM) for classification. Through extensive experimental results, we confirm that our proposed method is adequate for presentation attack finger-vein image detection and it can deliver superior detection results compared to CNN-based methods and other previous handcrafted methods. PMID:28974031

  18. The development of intention-based sociomoral judgment and distribution behavior from a third-party stance.

    PubMed

    Li, Jing; Tomasello, Michael

    2018-03-01

    The current study investigated children's intention-based sociomoral judgments and distribution behavior from a third-party stance. An actor puppet showed either positive or negative intention toward a target puppet, which had previously performed a prosocial or antisocial action toward others (i.e., children witnessed various types of indirect reciprocity). Children (3- and 5-year-olds) were asked to make sociomoral judgments and to distribute resources to the actor puppet. Results showed that 5-year-olds were more likely than 3-year-olds to be influenced by intention when they made their judgment and distributed resources. The target's previous actions affected only 5-year-olds' intent-based social preference. These results suggest that children's judgments about intent-based indirect reciprocity develop from ages 3 to 5 years. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Understanding the Association Between Negative Symptoms and Performance on Effort-Based Decision-Making Tasks: The Importance of Defeatist Performance Beliefs.

    PubMed

    Reddy, L Felice; Horan, William P; Barch, Deanna M; Buchanan, Robert W; Gold, James M; Marder, Stephen R; Wynn, Jonathan K; Young, Jared; Green, Michael F

    2017-11-13

    Effort-based decision-making paradigms are increasingly utilized to gain insight into the nature of motivation deficits. Research has shown associations between effort-based decision making and experiential negative symptoms; however, the associations are not consistent. The current study had two primary goals. First, we aimed to replicate previous findings of a deficit in effort-based decision making among individuals with schizophrenia on a test of cognitive effort. Second, in a large sample combined from the current and a previous study, we sought to examine the association between negative symptoms and effort by including the related construct of defeatist beliefs. The results replicated previous findings of impaired cognitive effort-based decision making in schizophrenia. Defeatist beliefs significantly moderated the association between negative symptoms and effort-based decision making such that there was a strong association between high negative symptoms and deficits in effort-based decision making, but only among participants with high levels of defeatist beliefs. Thus, our findings suggest the relationship between negative symptoms and effort performance may be understood by taking into account the role of defeatist beliefs, a finding that might explain discrepancies in previous studies. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center 2017.

  20. Feature-Based versus Category-Based Induction with Uncertain Categories

    ERIC Educational Resources Information Center

    Griffiths, Oren; Hayes, Brett K.; Newell, Ben R.

    2012-01-01

    Previous research has suggested that when feature inferences have to be made about an instance whose category membership is uncertain, feature-based inductive reasoning is used to the exclusion of category-based induction. These results contrast with the observation that people can and do use category-based induction when category membership is…

  1. Feature-based versus category-based induction with uncertain categories.

    PubMed

    Griffiths, Oren; Hayes, Brett K; Newell, Ben R

    2012-05-01

    Previous research has suggested that when feature inferences have to be made about an instance whose category membership is uncertain, feature-based inductive reasoning is used to the exclusion of category-based induction. These results contrast with the observation that people can and do use category-based induction when category membership is known. The present experiments examined the conditions that drive feature-based and category-based strategies in induction under category uncertainty. Specifically, 2 experiments investigated whether reliance on feature-based inductive strategies is a product of the lack of coherence in the categories used in previous research or is due to the use of a decision-only induction procedure. Experiment 1 found that feature-based reasoning remained the preferred strategy even when categories with relatively high internal coherence were used. Experiment 2 found a shift toward category-based reasoning when participants were trained to classify category members prior to feature induction. Together, these results suggest that an appropriate conceptual representation must be formed through experience with a category before it is likely to be used as a basis for feature induction. (c) 2012 APA, all rights reserved.

  2. Serology and longevity of immunity against Echinococcus granulosus in sheep and llama induced by an oil-based EG95 vaccine.

    PubMed

    Poggio, T V; Jensen, O; Mossello, M; Iriarte, J; Avila, H G; Gertiser, M L; Serafino, J J; Romero, S; Echenique, M A; Dominguez, D E; Barrios, J R; Heath, D

    2016-08-01

    An oil-based formulation of the EG95 vaccine to protect grazing animals against infection with Echinococcus granulosus was formulated in Argentina. The efficacy of the vaccine was monitored by serology in sheep and llama (Lama glama) and was compared to the serology in sheep previously published using a QuilA-adjuvanted vaccine. Long-term efficacy was also tested in sheep by challenging with E. granulosus eggs of the G1 strain 4 years after the beginning of the trial. The serological results for both sheep and llama were similar to those described previously, except that there was a more rapid response after the first vaccination. A third vaccination given after 1 year resulted in a transient boost in serology that lasted for about 12 months, which was similar to results previously described. Sheep challenged after 4 years with three vaccinations presented 84·2% reduction of live cysts counts compared with control group, and after a fourth vaccination prior to challenge, this reduction was 94·7%. The oil-based vaccine appeared to be bio-equivalent to the QuilA vaccine. © 2016 John Wiley & Sons Ltd.

  3. Differentiation and Exploration of Model MACP for HE VER 1.0 on Prototype Performance Measurement Application for Higher Education

    NASA Astrophysics Data System (ADS)

    El Akbar, R. Reza; Anshary, Muhammad Adi Khairul; Hariadi, Dennis

    2018-02-01

    Model MACP for HE Ver.1 is a model that describes how to perform measurement and monitoring of performance for higher education. Based on a review of the research related to the model, several parts of the model's components should be developed in further research, so this research has four main objectives. The first objective is to differentiate the CSF (critical success factor) components in the previous model; the second is to explore the KPIs (key performance indicators) in the previous model; the third, building on the previous objectives, is to design a new and more detailed model. The final objective is to design a prototype application for performance measurement in higher education based on the newly created model. The method used is an explorative research method, with the application designed using the prototype method. The results of this study are, first, a more detailed new model for measurement and monitoring of performance in higher education, obtained through differentiation and exploration of the Model MACP for HE Ver.1. The second result is a dictionary of college performance measurement compiled by re-evaluating the existing indicators. The third result is the design of a prototype application for performance measurement in higher education.

  4. Structural and catalytic effects of an invariant purine substitution in the hammerhead ribozyme: implications for the mechanism of acid–base catalysis

    PubMed Central

    Schultz, Eric P.; Vasquez, Ernesto E.; Scott, William G.

    2014-01-01

    The hammerhead ribozyme catalyzes RNA cleavage via acid–base catalysis. Whether it does so by general acid–base catalysis, in which the RNA itself donates and abstracts protons in the transition state, as is typically assumed, or by specific acid–base catalysis, in which the RNA plays a structural role and proton transfer is mediated by active-site water molecules, is unknown. Previous biochemical and crystallographic experiments implicate an invariant purine in the active site, G12, as the general base. However, G12 may play a structural role consistent with specific base catalysis. To better understand the role of G12 in the mechanism of hammerhead catalysis, a 2.2 Å resolution crystal structure of a hammerhead ribozyme from Schistosoma mansoni with a purine substituted for G12 in the active site of the ribozyme was obtained. Comparison of this structure (PDB entry 3zd4), in which A12 is substituted for G, with three previously determined structures that now serve as important experimental controls, allows the identification of structural perturbations that are owing to the purine substitution itself. Kinetic measurements for G12 purine-substituted schistosomal hammerheads confirm a previously observed dependence of rate on the pKa of the substituted purine; in both cases inosine, which is similar to G in pKa and hydrogen-bonding properties, is unexpectedly inactive. Structural comparisons indicate that this may primarily be owing to the lack of the exocyclic 2-amino group in the G12A and G12I substitutions and its structural effect upon both the nucleotide base and phosphate of A9. The latter involves the perturbation of a previously identified and well characterized metal ion-binding site known to be catalytically important in both minimal and full-length hammerhead ribozyme sequences. The results permit it to be suggested that G12 plays an important role in stabilizing the active-site structure. This result, although not inconsistent with the potential role of G12 as a general base, indicates that an alternative hammerhead cleavage mechanism involving specific base catalysis may instead explain the observed rate dependence upon purine substitutions at G12. The crystallographic results, contrary to previous assumptions, therefore cannot be interpreted to favor the general base catalysis mechanism over the specific base catalysis mechanism. Instead, both of these mutually exclusive mechanistic alternatives must be considered in light of the current structural and biochemical data. PMID:25195740

  5. Structural and catalytic effects of an invariant purine substitution in the hammerhead ribozyme: implications for the mechanism of acid-base catalysis.

    PubMed

    Schultz, Eric P; Vasquez, Ernesto E; Scott, William G

    2014-09-01

    The hammerhead ribozyme catalyzes RNA cleavage via acid-base catalysis. Whether it does so by general acid-base catalysis, in which the RNA itself donates and abstracts protons in the transition state, as is typically assumed, or by specific acid-base catalysis, in which the RNA plays a structural role and proton transfer is mediated by active-site water molecules, is unknown. Previous biochemical and crystallographic experiments implicate an invariant purine in the active site, G12, as the general base. However, G12 may play a structural role consistent with specific base catalysis. To better understand the role of G12 in the mechanism of hammerhead catalysis, a 2.2 Å resolution crystal structure of a hammerhead ribozyme from Schistosoma mansoni with a purine substituted for G12 in the active site of the ribozyme was obtained. Comparison of this structure (PDB entry 3zd4), in which A12 is substituted for G, with three previously determined structures that now serve as important experimental controls, allows the identification of structural perturbations that are owing to the purine substitution itself. Kinetic measurements for G12 purine-substituted schistosomal hammerheads confirm a previously observed dependence of rate on the pKa of the substituted purine; in both cases inosine, which is similar to G in pKa and hydrogen-bonding properties, is unexpectedly inactive. Structural comparisons indicate that this may primarily be owing to the lack of the exocyclic 2-amino group in the G12A and G12I substitutions and its structural effect upon both the nucleotide base and phosphate of A9. The latter involves the perturbation of a previously identified and well characterized metal ion-binding site known to be catalytically important in both minimal and full-length hammerhead ribozyme sequences. The results permit it to be suggested that G12 plays an important role in stabilizing the active-site structure. This result, although not inconsistent with the potential role of G12 as a general base, indicates that an alternative hammerhead cleavage mechanism involving specific base catalysis may instead explain the observed rate dependence upon purine substitutions at G12. The crystallographic results, contrary to previous assumptions, therefore cannot be interpreted to favor the general base catalysis mechanism over the specific base catalysis mechanism. Instead, both of these mutually exclusive mechanistic alternatives must be considered in light of the current structural and biochemical data.

  6. Efficiency in Rule- vs. Plan-Based Movements Is Modulated by Action-Mode.

    PubMed

    Scheib, Jean P P; Stoll, Sarah; Thürmer, J Lukas; Randerath, Jennifer

    2018-01-01

    The rule/plan motor cognition (RPMC) paradigm elicits visually indistinguishable motor outputs, resulting from either plan- or rule-based action-selection, using a combination of essentially interchangeable stimuli. Previous implementations of the RPMC paradigm have used pantomimed movements to compare plan- vs. rule-based action-selection. In the present work we attempt to determine the generalizability of previous RPMC findings to real object interaction by use of a grasp-to-rotate task. In the plan task, participants had to use prospective planning to achieve a comfortable post-handle rotation hand posture. The rule task used implementation intentions (if-then rules) leading to the same comfortable end-state. In Experiment A, we compare RPMC performance of 16 healthy participants in pantomime and real object conditions of the experiment, within-subjects. Higher processing efficiency of rule- vs. plan-based action-selection was supported by diffusion model analysis. Results show a significant response-time increase in the pantomime condition compared to the real object condition and a greater response-time advantage of rule-based vs. plan-based actions in the pantomime compared to the real object condition. In Experiment B, 24 healthy participants performed the real object RPMC task in a task switching vs. a blocked condition. Results indicate that plan-based action-selection leads to longer response-times and less efficient information processing than rule-based action-selection in line with previous RPMC findings derived from the pantomime action-mode. Particularly in the task switching mode, responses were faster in the rule compared to the plan task suggesting a modulating influence of cognitive load. Overall, results suggest an advantage of rule-based action-selection over plan-based action-selection; whereby differential mechanisms appear to be involved depending on the action-mode. We propose that cognitive load is a factor that modulates the advantageous effect of implementation intentions in motor cognition on different levels as illustrated by the varying speed advantages and the variation in diffusion parameters per action-mode or condition, respectively.

  7. Effect of BrU on the transition between wobble Gua-Thy and tautomeric Gua-Thy base-pairs: ab initio molecular orbital calculations

    NASA Astrophysics Data System (ADS)

    Nomura, Kazuya; Hoshino, Ryota; Hoshiba, Yasuhiro; Danilov, Victor I.; Kurita, Noriyuki

    2013-04-01

    We investigated the transition states (TS) between the wobble Guanine-Thymine (wG-T) and tautomeric G-T base-pairs, as well as Br-containing base-pairs, by MP2 and density functional theory (DFT) calculations. The TS obtained between wG-T and G*-T (the asterisk denotes an enol form of the base) differs from the TS obtained by the previous DFT calculation. The activation energy (17.9 kcal/mol) evaluated by our calculation is significantly smaller than that (39.21 kcal/mol) obtained by the previous calculation, indicating that our TS is more favorable. In contrast, the TS and activation energy obtained between wG-T and G-T* are similar to those obtained by the previous DFT calculation. We furthermore found that the activation energy between wG-BrU and tautomeric G-BrU is smaller than that between wG-T and tautomeric G-T. This result indicates that replacing the CH3 group of T with Br increases the probability of the transition reaction producing the enol-form G* and T* bases. Because G* prefers to bind to T rather than to C, and T* prefers to bind to G rather than to A, our calculated results reveal that spontaneous mutation from C to T or from A to G is accelerated by the introduction of the wG-BrU base-pair.

  8. Theoretical foundations for quantitative paleogenetics. III - The molecular divergence of nucleic acids and proteins for the case of genetic events of unequal probability

    NASA Technical Reports Server (NTRS)

    Holmquist, R.; Pearl, D.

    1980-01-01

    Theoretical equations are derived for molecular divergence with respect to gene and protein structure in the presence of genetic events with unequal probabilities: amino acid and base compositions, the frequencies of nucleotide replacements, the usage of degenerate codons, the distribution of fixed base replacements within codons and the distribution of fixed base replacements among codons. Results are presented in the form of tables relating the probabilities of given numbers of codon base changes with respect to the original codon for the alpha hemoglobin, beta hemoglobin, myoglobin, cytochrome c and parvalbumin group gene families. Application of the calculations to the rabbit alpha and beta hemoglobin mRNAs and proteins indicates that the genes are separated by about 425 fixed base replacements distributed over 114 codon sites, which is a factor of two greater than previous estimates. The theoretical results also suggest that many more base replacements are required to effect a given gene or protein structural change than previously believed.

  9. Merge of Five Previous Catalogues Into the Ground Truth Catalogue and Registration Based on MOLA Data with THEMIS-DIR, MDIM and MOC Data-Sets

    NASA Astrophysics Data System (ADS)

    Salamuniccar, G.; Loncaric, S.

    2008-03-01

    The Catalogue from our previous work was merged with the data of Barlow, Rodionova, Boyce, and Kuzmin. The resulting ground truth catalogue with 57,633 craters was registered, using MOLA data, with THEMIS-DIR, MDIM, and MOC data-sets.

  10. Late electrophysiological modulations of feature-based attention to object shapes.

    PubMed

    Stojanoski, Bobby Boge; Niemeier, Matthias

    2014-03-01

    Feature-based attention has been shown to aid object perception. Our previous ERP effects revealed temporally late feature-based modulation in response to objects relative to motion. The aim of the current study was to confirm the timing of feature-based influences on object perception while cueing within the feature dimension of shape. Participants were told to expect either "pillow" or "flower" objects embedded among random white and black lines. Participants more accurately reported the object's main color for valid compared to invalid shapes. ERPs revealed modulation from 252-502 ms, from occipital to frontal electrodes. Our results are consistent with previous findings examining the time course for processing similar stimuli (illusory contours). Our results provide novel insights into how attending to features of higher complexity aids object perception presumably via feed-forward and feedback mechanisms along the visual hierarchy. Copyright © 2014 Society for Psychophysiological Research.

  11. Self-testing produces superior recall of both familiar and unfamiliar muscle information.

    PubMed

    Dobson, John L; Linderholm, Tracy; Yarbrough, Mary Beth

    2015-12-01

    Dozens of studies have found learning strategies based on the "testing effect" promote greater recall than those that rely solely on reading; however, the advantages of testing are often only observed after a delay (e.g., 2-7 days later). In contrast, our research, which has focused on kinesiology students learning kinesiology information that is generally familiar to them, has consistently demonstrated that testing-based strategies produce greater recall both immediately and after a delay. In an attempt to understand the discrepancies in the literature, the purpose of the present study was to determine if the time-related advantages of a testing-based learning strategy vary with one's familiarity with the to-be-learned information. Participants used both read-only and testing-based strategies to repeatedly study three different sets of information: 1) previously studied human muscle information (familiar information), 2) a mix of previously studied and previously unstudied human muscle information (mixed information), and 3) previously unstudied muscle information that is unique to sharks (unfamiliar information). Learning was evaluated via free recall assessments administered immediately after studying and again after a 1-wk delay and a 3-wk delay. Across those three assessments, the read-only strategy resulted in mean scores of 29.26 ± 1.43, 15.17 ± 1.29, and 5.33 ± 0.77 for the familiar, mixed, and unfamiliar information, respectively, whereas the testing-based strategy produced scores of 34.57 ± 1.58, 16.90 ± 1.31, and 8.33 ± 0.95, respectively. The results indicate that the testing-based strategy produced greater recall immediately and up through the 3-wk delay regardless of the participants' level of familiarity with the muscle information. Copyright © 2015 The American Physiological Society.

  12. Predicting recovery criteria for threatened and endangered plant species on the basis of past abundances and biological traits.

    PubMed

    Neel, Maile C; Che-Castaldo, Judy P

    2013-04-01

    Recovery plans for species listed under the U.S. Endangered Species Act are required to specify measurable criteria that can be used to determine when the species can be delisted. For the 642 listed endangered and threatened plant species that have recovery plans, we applied recursive partitioning methods to test whether the number of individuals or populations required for delisting can be predicted on the basis of distributional and biological traits, previous abundance at multiple time steps, or a combination of traits and previous abundances. We also tested listing status (threatened or endangered) and the year the recovery plan was written as predictors of recovery criteria. We analyzed separately recovery criteria that were stated as number of populations and as number of individuals (population-based and individual-based criteria, respectively). Previous abundances alone were relatively good predictors of population-based recovery criteria. Fewer populations, but a greater proportion of historically known populations, were required to delist species that had few populations at listing compared with species that had more populations at listing. Previous abundances were also good predictors of individual-based delisting criteria when models included both abundances and traits. The physiographic division in which the species occur was also a good predictor of individual-based criteria. Our results suggest managers are relying on previous abundances and patterns of decline as guidelines for setting recovery criteria. This may be justifiable in that previous abundances inform managers of the effects of both intrinsic traits and extrinsic threats that interact and determine extinction risk. © 2013 Society for Conservation Biology.
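    A minimal sketch of recursive partitioning in this spirit, using a shallow regression tree to predict a population-based recovery criterion from previous abundances and a trait, is shown below; the data and column choices are invented purely to show the mechanics, not drawn from the study.

```python
# Recursive-partitioning sketch: predict a population-based recovery criterion
# from previous abundances and a trait. All values below are illustrative.
from sklearn.tree import DecisionTreeRegressor

# columns: populations at listing, historically known populations, annual plant (0/1)
X = [[3, 10, 1], [5, 12, 0], [20, 25, 0], [40, 60, 1], [8, 9, 1], [30, 45, 0]]
y = [6, 8, 15, 25, 7, 22]        # number of populations required for delisting

tree = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)
print(tree.predict([[10, 15, 0]]))
```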

  13. Emergency Airway Response Team Simulation Training: A Nursing Perspective.

    PubMed

    Crimlisk, Janet T; Krisciunas, Gintas P; Grillone, Gregory A; Gonzalez, R Mauricio; Winter, Michael R; Griever, Susan C; Fernandes, Eduarda; Medzon, Ron; Blansfield, Joseph S; Blumenthal, Adam

    Simulation-based education is an important tool in the training of professionals in the medical field, especially for low-frequency, high-risk events. An interprofessional simulation-based training program was developed to enhance Emergency Airway Response Team (EART) knowledge, team dynamics, and personnel confidence. This quality improvement study evaluated the EART simulation training results of nurse participants. Twenty-four simulation-based classes of 4-hour sessions were conducted during a 12-week period. Sixty-three nurses from the emergency department (ED) and the intensive care units (ICUs) completed the simulation. Participants were evaluated before and after the simulation program with a knowledge-based test and a team dynamics and confidence questionnaire. Additional comparisons were made between ED and ICU nurses and between nurses with previous EART experience and those without previous EART experience. Comparison of presimulation (presim) and postsimulation (postsim) results indicated a statistically significant gain in both team dynamics and confidence and Knowledge Test scores (P < .01). There were no differences in scores between ED and ICU groups in presim or postsim scores; nurses with previous EART experience demonstrated significantly higher presim scores than nurses without EART experience, but there were no differences between these nurse groups at postsim. This project supports the use of simulation training to increase nurses' knowledge, confidence, and team dynamics in an EART response. Importantly, nurses with no previous experience achieved outcome scores similar to nurses who had experience, suggesting that emergency airway simulation is an effective way to train both new and experienced nurses.

  14. Parkinson's disease and occupation: differences in associations by case identification method suggest referral bias.

    PubMed

    Teschke, Kay; Marion, Stephen A; Tsui, Joseph K C; Shen, Hui; Rugbjerg, Kathrine; Harris, M Anne

    2014-02-01

    We used a population-based sample of 403 Parkinson's disease cases and 405 controls to examine risks by occupation. Results were compared to a previous clinic-based analysis. With censoring of jobs held within 10 years of diagnosis, the following had significantly or strongly increased risks: social science, law and library jobs (OR = 1.8); farming and horticulture jobs (OR = 2.0); gas station jobs (OR = 2.6); and welders (OR = 3.0). The following had significantly decreased risks: management and administration jobs (OR = 0.70); and other health care jobs (OR = 0.44). These results were consistent with other findings for social science and farming occupations. Risks for teaching, medicine and health occupations were not elevated, unlike our previous clinic-based study. This underscores the value of population-based over clinic-based samples. Occupational studies may be particularly susceptible to referral bias because social networks may spread preferentially via jobs. © 2013 Wiley Periodicals, Inc.

  15. Tests of an ATCRBS Based Trilateration Sensor at Logan International Airport

    DOT National Transportation Integrated Search

    1979-11-01

    Field test results of accuracy and coverage for an ATCRBS based surface trilateration sensor at Logan International Airport are described. This sensor was previously tested at NAFEC for feasibility and because of a lack of sufficient aircraft traffic...

  16. Enhancement of CFD validation exercise along the roof profile of a low-rise building

    NASA Astrophysics Data System (ADS)

    Deraman, S. N. C.; Majid, T. A.; Zaini, S. S.; Yahya, W. N. W.; Abdullah, J.; Ismail, M. A.

    2018-04-01

    The aim of this study is to enhance the validation of a CFD exercise along the roof profile of a low-rise building. An isolated gabled-roof house with a 26.6° roof pitch was simulated to obtain the pressure coefficients around the house. Validation of a CFD analysis against experimental data requires many input parameters. This study performed the CFD simulation based on data from a previous study; where the input parameters were not clearly stated, new input parameters were established from the open literature. The numerical simulations were performed in FLUENT 14.0 by applying the Computational Fluid Dynamics (CFD) approach based on the steady RANS equations together with the RNG k-ɛ model. The results from the CFD were then analysed using a quantitative test (statistical analysis) and compared with the CFD results from the previous study. The statistical analysis results from the ANOVA test and error measures showed that the CFD results from the current study produced good agreement and exhibited the closest error compared to the previous study. All the input data used in this study can be extended to other types of CFD simulation involving wind flow over an isolated single-storey house.

  17. The social-cognitive basis of infants' reference to absent entities.

    PubMed

    Bohn, Manuel; Zimmermann, Luise; Call, Josep; Tomasello, Michael

    2018-04-06

    Recent evidence suggests that infants as young as 12 months of age use pointing to communicate about absent entities. The tacit assumption underlying these studies is that infants do so based on tracking what their interlocutor experienced in a previous shared interaction. The present study addresses this assumption empirically. In three experiments, 12-month-old infants could request additional desired objects by pointing to the location in which these objects were previously located. We systematically varied whether the adult from whom infants were requesting had previously experienced the former content of the location with the infant. Infants systematically adjusted their pointing to the now empty location to what they experienced with the adult previously. These results suggest that infants' ability to communicate about absent referents is based on an incipient form of common ground. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Agent-based modeling of the spread of the 1918-1919 flu in three Canadian fur trading communities.

    PubMed

    O'Neil, Caroline A; Sattenspiel, Lisa

    2010-01-01

    Previous attempts to study the 1918-1919 flu in three small communities in central Manitoba have used both three-community population-based and single-community agent-based models. These studies identified critical factors influencing epidemic spread, but they also left important questions unanswered. The objective of this project was to design a more realistic agent-based model that would overcome limitations of earlier models and provide new insights into these outstanding questions. The new model extends the previous agent-based model to three communities so that results can be compared to those from the population-based model. Sensitivity testing was conducted, and the new model was used to investigate the influence of seasonal settlement and mobility patterns, the geographic heterogeneity of the observed 1918-1919 epidemic in Manitoba, and other questions addressed previously. Results confirm outcomes from the population-based model that suggest that (a) social organization and mobility strongly influence the timing and severity of epidemics and (b) the impact of the epidemic would have been greater if it had arrived in the summer rather than the winter. New insights from the model suggest that the observed heterogeneity among communities in epidemic impact was not unusual and would have been the expected outcome given settlement structure and levels of interaction among communities. Application of an agent-based computer simulation has helped to better explain observed patterns of spread of the 1918-1919 flu epidemic in central Manitoba. Contrasts between agent-based and population-based models illustrate the advantages of agent-based models for the study of small populations. © 2010 Wiley-Liss, Inc.
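    A toy agent-based epidemic of the kind described above, with a few communities, occasional travel, and local transmission, is sketched below; the parameters and structure are illustrative only, not the authors' calibrated 1918-1919 model.

```python
# Toy agent-based epidemic: agents belong to one of three communities,
# occasionally travel, and transmit infection locally.
import random

random.seed(1)
COMMUNITIES = 3
N_PER_COMMUNITY = 100
P_TRAVEL, P_INFECT, DAYS_INFECTIOUS = 0.02, 0.03, 5

agents = [{"home": c, "loc": c, "state": "S", "timer": 0}
          for c in range(COMMUNITIES) for _ in range(N_PER_COMMUNITY)]
agents[0]["state"], agents[0]["timer"] = "I", DAYS_INFECTIOUS   # seed case

for day in range(120):
    for a in agents:                                   # mobility between communities
        a["loc"] = random.randrange(COMMUNITIES) if random.random() < P_TRAVEL else a["home"]
    infectious_by_loc = {c: sum(a["state"] == "I" and a["loc"] == c for a in agents)
                         for c in range(COMMUNITIES)}
    for a in agents:                                   # local transmission
        if a["state"] == "S":
            contacts = infectious_by_loc[a["loc"]]
            if random.random() < 1 - (1 - P_INFECT) ** contacts:
                a["state"], a["timer"] = "I", DAYS_INFECTIOUS
    for a in agents:                                   # recovery
        if a["state"] == "I":
            a["timer"] -= 1
            if a["timer"] == 0:
                a["state"] = "R"

# final number ever infected, per home community
print({c: sum(a["state"] == "R" and a["home"] == c for a in agents)
       for c in range(COMMUNITIES)})
```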

  19. Method: automatic segmentation of mitochondria utilizing patch classification, contour pair classification, and automatically seeded level sets

    PubMed Central

    2012-01-01

    Background: While progress has been made to develop automatic segmentation techniques for mitochondria, there remains a need for more accurate and robust techniques to delineate mitochondria in serial blockface scanning electron microscopic data. Previously developed texture based methods are limited for solving this problem because texture alone is often not sufficient to identify mitochondria. This paper presents a new three-step method, the Cytoseg process, for automated segmentation of mitochondria contained in 3D electron microscopic volumes generated through serial block face scanning electron microscopic imaging. The method consists of three steps. The first is a random forest patch classification step operating directly on 2D image patches. The second step consists of contour-pair classification. At the final step, we introduce a method to automatically seed a level set operation with output from previous steps. Results: We report accuracy of the Cytoseg process on three types of tissue and compare it to a previous method based on Radon-Like Features. At step 1, we show that the patch classifier identifies mitochondria texture but creates many false positive pixels. At step 2, our contour processing step produces contours and then filters them with a second classification step, helping to improve overall accuracy. We show that our final level set operation, which is automatically seeded with output from previous steps, helps to smooth the results. Overall, our results show that use of contour pair classification and level set operations improve segmentation accuracy beyond patch classification alone. We show that the Cytoseg process performs well compared to another modern technique based on Radon-Like Features. Conclusions: We demonstrated that texture based methods for mitochondria segmentation can be enhanced with multiple steps that form an image processing pipeline. While we used a random-forest based patch classifier to recognize texture, it would be possible to replace this with other texture identifiers, and we plan to explore this in future work. PMID:22321695
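    The first Cytoseg step (random-forest classification of 2D image patches) can be sketched as below; the patches and labels are random placeholders for real serial block-face EM data, and the patch size and forest settings are assumptions.

```python
# Sketch of random-forest classification operating directly on flattened
# 2D image patches (mitochondrion texture vs. not).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
patches = rng.random((500, 15, 15))              # 15x15-pixel grayscale patches
labels = rng.integers(0, 2, size=500)            # 1 = mitochondrion texture

X = patches.reshape(len(patches), -1)            # flatten each patch to a feature vector
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:400], labels[:400])
print("held-out accuracy:", clf.score(X[400:], labels[400:]))
```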

  20. Comparing ensemble learning methods based on decision tree classifiers for protein fold recognition.

    PubMed

    Bardsiri, Mahshid Khatibi; Eftekhari, Mahdi

    2014-01-01

    In this paper, several methods for ensemble learning of protein fold recognition based on decision trees (DT) are compared and contrasted over three datasets taken from the literature. Following previously reported studies, the features of the datasets are divided into several groups. Then, for each of these groups, three ensemble classifiers, namely random forest, rotation forest, and AdaBoost.M1, are employed. Several fusion methods are then introduced for combining the ensemble classifiers obtained in the previous step, producing three combined classifiers, one per ensemble type (random forest, rotation forest, and AdaBoost.M1). Finally, these three classifiers are combined to make an overall classifier. Experimental results show that the overall classifier obtained with the genetic algorithm (GA) weighting fusion method is the best one in comparison to previously applied methods in terms of classification accuracy.
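
    A hedged sketch of this kind of comparison, using scikit-learn's random forest and AdaBoost (standing in for AdaBoost.M1, with its default decision-tree base learner) plus a simple weighted soft-vote fusion on synthetic data; rotation forest and the GA weight search are omitted because they have no standard scikit-learn implementation, and the fixed fusion weights are purely illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for one protein-fold feature group (the real datasets are multi-class).
X, y = make_classification(n_samples=600, n_features=30, n_informative=10,
                           n_classes=3, random_state=1)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=1)

ensembles = {
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=1),
    "adaboost": AdaBoostClassifier(n_estimators=100, random_state=1),  # decision-tree stumps
}
probas = {}
for name, model in ensembles.items():
    model.fit(Xtr, ytr)
    probas[name] = model.predict_proba(Xte)
    print(name, "accuracy:", round(accuracy_score(yte, model.predict(Xte)), 3))

# Weighted soft-vote fusion; a genetic algorithm would search these weights instead.
weights = {"random_forest": 0.6, "adaboost": 0.4}
fused = sum(weights[n] * probas[n] for n in probas)
print("fused accuracy:", round(accuracy_score(yte, fused.argmax(axis=1)), 3))
```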

  1. Real-time generation of infrared ocean scene based on GPU

    NASA Astrophysics Data System (ADS)

    Jiang, Zhaoyi; Wang, Xun; Lin, Yun; Jin, Jianqiu

    2007-12-01

    Infrared (IR) image synthesis for ocean scenes has become increasingly important, especially for remote sensing and military applications. Although a number of works present ready-to-use simulations, those techniques cover only a few of the possible ways in which water interacts with the environment, and the detailed calculation of ocean temperature has rarely been considered by previous investigators. With the advance of programmable features on graphics cards, many algorithms previously limited to offline processing have become feasible for real-time use. In this paper, we propose an efficient algorithm for real-time rendering of infrared ocean scenes using the newest features of programmable graphics processors (GPUs). It differs from previous work in three aspects: adaptive GPU-based ocean surface tessellation, a more sophisticated thermal-balance equation for the ocean surface, and GPU-based rendering of the infrared ocean scene. Finally, some infrared image results are shown, which are in good accordance with real images.

  2. Continuous Indoor Positioning Fusing WiFi, Smartphone Sensors and Landmarks

    PubMed Central

    Deng, Zhi-An; Wang, Guofeng; Qin, Danyang; Na, Zhenyu; Cui, Yang; Chen, Juan

    2016-01-01

    To exploit the complementary strengths of WiFi positioning, pedestrian dead reckoning (PDR), and landmarks, we propose a novel fusion approach based on an extended Kalman filter (EKF). For WiFi positioning, unlike previous fusion approaches that set measurement noise parameters empirically, we deploy a kernel density estimation-based model to adaptively measure the related measurement noise statistics. Furthermore, a trusted area of WiFi positioning, defined by the fusion results of the previous step, and WiFi signal outlier detection are exploited to reduce computational cost and improve WiFi positioning accuracy. For PDR, we integrate a gyroscope, an accelerometer, and a magnetometer to determine the user heading based on another EKF model. To reduce the accumulation error of PDR and enable continuous indoor positioning, not only the positioning results but also the heading estimations are recalibrated by indoor landmarks. Experimental results in a realistic indoor environment show that the proposed fusion approach achieves a substantial positioning accuracy improvement over the individual positioning approaches, including PDR and WiFi positioning. PMID:27608019
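
    A minimal sketch of the fusion idea, assuming a plain 2-D Kalman update that combines a PDR-style displacement prediction (step length and heading) with a noisy WiFi position fix; the fixed process and measurement covariances stand in for the paper's kernel-density noise model, and the landmark recalibration is not shown.

```python
import numpy as np

def fuse_step(x, P, step_len, heading, wifi_pos, Q_var=0.04, R_var=4.0):
    # Predict: advance the position estimate by the PDR displacement.
    x_pred = x + step_len * np.array([np.cos(heading), np.sin(heading)])
    P_pred = P + Q_var * np.eye(2)              # process noise from step/heading error
    # Update: correct the prediction with the WiFi fix (measurement matrix H = I).
    R = R_var * np.eye(2)                       # WiFi measurement noise
    K = P_pred @ np.linalg.inv(P_pred + R)      # Kalman gain
    x_new = x_pred + K @ (np.array(wifi_pos) - x_pred)
    P_new = (np.eye(2) - K) @ P_pred
    return x_new, P_new

x, P = np.zeros(2), np.eye(2)
# Each tuple: (step length in m, heading in rad, WiFi fix in m).
track = [(0.7, 0.0, (0.9, 0.3)), (0.7, 0.1, (1.6, 0.2)), (0.7, 0.2, (2.1, 0.5))]
for step_len, heading, wifi in track:
    x, P = fuse_step(x, P, step_len, heading, wifi)
    print("fused position:", np.round(x, 2))
```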

  3. Continuous Indoor Positioning Fusing WiFi, Smartphone Sensors and Landmarks.

    PubMed

    Deng, Zhi-An; Wang, Guofeng; Qin, Danyang; Na, Zhenyu; Cui, Yang; Chen, Juan

    2016-09-05

    To exploit the complementary strengths of WiFi positioning, pedestrian dead reckoning (PDR), and landmarks, we propose a novel fusion approach based on an extended Kalman filter (EKF). For WiFi positioning, unlike previous fusion approaches that set measurement noise parameters empirically, we deploy a kernel density estimation-based model to adaptively measure the related measurement noise statistics. Furthermore, a trusted area of WiFi positioning, defined by the fusion results of the previous step, and WiFi signal outlier detection are exploited to reduce computational cost and improve WiFi positioning accuracy. For PDR, we integrate a gyroscope, an accelerometer, and a magnetometer to determine the user heading based on another EKF model. To reduce the accumulation error of PDR and enable continuous indoor positioning, not only the positioning results but also the heading estimations are recalibrated by indoor landmarks. Experimental results in a realistic indoor environment show that the proposed fusion approach achieves a substantial positioning accuracy improvement over the individual positioning approaches, including PDR and WiFi positioning.

  4. A new lizard species of the Phymaturus patagonicus group (Squamata: Liolaemini) from northern Patagonia, Neuquén, Argentina.

    PubMed

    Marín, Andrea González; Pérez, Cristian Hernán Fulvio; Minoli, Ignacio; Morando, Mariana; Avila, Luciano Javier

    2016-06-10

    The integrative taxonomy framework allows the development of robust hypotheses of species limits based on the integration of results from different data sets and analytical methods. In this work, we test a candidate species hypothesis previously suggested on the basis of molecular data, using geometric and traditional morphometric analyses (multivariate and univariate). This new lizard species is part of the Phymaturus patagonicus group (payuniae clade), which is distributed in Neuquén and Mendoza provinces (Argentina). Our results showed that Phymaturus rahuensis sp. nov. differs from the other species of the payuniae clade by a higher number of midbody scales, and fewer supralabial scales, finger lamellae and toe lamellae. Also, its multidimensional spaces, based on both continuous linear variables and geometric morphometric (shape) characters, do not overlap with those of the other species in this clade. The results of the morphometric and geometric morphometric analyses presented here, coupled with previously published molecular data, represent three independent lines of evidence that support the diagnosis of this new taxon.

  5. Automated railroad reconstruction from remote sensing image based on texture filter

    NASA Astrophysics Data System (ADS)

    Xiao, Jie; Lu, Kaixia

    2018-03-01

    Techniques of remote sensing have improved remarkably in recent years, and very accurate results and high resolution images can now be acquired, making it possible to use such data to reconstruct railroads. In this paper, an automated railroad reconstruction method from remote sensing images based on the Gabor filter is proposed. The method is divided into three steps. First, edge-oriented railroad characteristics (such as line features) in a remote sensing image are detected using a Gabor filter. Second, two response images with filtering orientations perpendicular to each other are fused to suppress noise and acquire a long, smooth stripe region of railroad. Third, a set of smooth regions is extracted by computing a global threshold for the fused image using Otsu's method and then converting it to a binary image based on that threshold. This workflow was tested on a set of remote sensing images and was found to deliver very accurate results in a quick and highly automated manner.
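
    A hedged sketch of the three steps using standard OpenCV primitives: Gabor responses at two perpendicular orientations, a simple pixel-wise fusion, and Otsu thresholding. The kernel parameters, the maximum-based fusion, and the synthetic "rail" image are illustrative assumptions, not the paper's settings.

```python
import cv2
import numpy as np

# Synthetic grey image with two bright vertical stripes standing in for rails.
img = np.full((128, 128), 60, dtype=np.uint8)
img[:, 60:64] = 200
img[:, 70:74] = 200

def gabor_response(gray, theta):
    kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                lambd=10.0, gamma=0.5, psi=0)
    return cv2.filter2D(gray.astype(np.float32), cv2.CV_32F, kernel)

# Steps 1-2: responses at two perpendicular orientations, fused (here: pixel-wise maximum).
resp = np.maximum(gabor_response(img, 0), gabor_response(img, np.pi / 2))
resp = cv2.normalize(resp, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

# Step 3: global Otsu threshold on the fused response, then binarisation.
thresh, binary = cv2.threshold(resp, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
print("Otsu threshold:", thresh, "candidate railroad pixels:", int((binary > 0).sum()))
```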

  6. Opportunistic pathology-based screening for diabetes

    PubMed Central

    Simpson, Aaron J; Krowka, Renata; Kerrigan, Jennifer L; Southcott, Emma K; Wilson, J Dennis; Potter, Julia M; Nolan, Christopher J; Hickman, Peter E

    2013-01-01

    Objective To determine the potential of opportunistic glycated haemoglobin (HbA1c) testing of pathology samples to detect previously unknown diabetes. Design Pathology samples from participants collected for other reasons and suitable for HbA1c testing were utilised for opportunistic diabetes screening. HbA1c was measured with a Biorad Variant II turbo analyser and HbA1c levels of ≥6.5% (48 mmol/mol) were considered diagnostic for diabetes. Confirmation of previously unknown diabetes status was obtained by a review of hospital medical records and phone calls to general practitioners. Setting Hospital pathology laboratory receiving samples from hospital-based and community-based (CB) settings. Participants Participants were identified based on the blood sample collection location in the CB, emergency department (ED) and inpatient (IP) groups. Exclusions prior to testing were made based on the electronic patient history of: age <18 years, previous diabetes diagnosis, query for diabetes status in the past 12 months, evidence of pregnancy and sample collected postsurgery or transfusion. Only one sample per individual participant was tested. Results Of the 22 396 blood samples collected, 4505 (1142 CB, 1113 ED, 2250 IP) were tested, of which 327 (7.3%) had HbA1c levels ≥6.5% (48 mmol/mol). Of these, 120 (2.7%) were determined to have previously unknown diabetes (11 (1%) CB, 21 (1.9%) ED, 88 (3.9%) IP). The prevalence of previously unknown diabetes was substantially higher (5.4%) in hospital-based (ED and IP) participants aged over 54 years. Conclusions Opportunistic testing of referred pathology samples can be an effective method of screening for diabetes, especially in hospital-based and older persons. PMID:24065696

  7. Efficiency in Rule- vs. Plan-Based Movements Is Modulated by Action-Mode

    PubMed Central

    Scheib, Jean P. P.; Stoll, Sarah; Thürmer, J. Lukas; Randerath, Jennifer

    2018-01-01

    The rule/plan motor cognition (RPMC) paradigm elicits visually indistinguishable motor outputs, resulting from either plan- or rule-based action-selection, using a combination of essentially interchangeable stimuli. Previous implementations of the RPMC paradigm have used pantomimed movements to compare plan- vs. rule-based action-selection. In the present work we attempt to determine the generalizability of previous RPMC findings to real object interaction by use of a grasp-to-rotate task. In the plan task, participants had to use prospective planning to achieve a comfortable post-handle rotation hand posture. The rule task used implementation intentions (if-then rules) leading to the same comfortable end-state. In Experiment A, we compare RPMC performance of 16 healthy participants in pantomime and real object conditions of the experiment, within-subjects. Higher processing efficiency of rule- vs. plan-based action-selection was supported by diffusion model analysis. Results show a significant response-time increase in the pantomime condition compared to the real object condition and a greater response-time advantage of rule-based vs. plan-based actions in the pantomime compared to the real object condition. In Experiment B, 24 healthy participants performed the real object RPMC task in a task switching vs. a blocked condition. Results indicate that plan-based action-selection leads to longer response-times and less efficient information processing than rule-based action-selection in line with previous RPMC findings derived from the pantomime action-mode. Particularly in the task switching mode, responses were faster in the rule compared to the plan task suggesting a modulating influence of cognitive load. Overall, results suggest an advantage of rule-based action-selection over plan-based action-selection; whereby differential mechanisms appear to be involved depending on the action-mode. We propose that cognitive load is a factor that modulates the advantageous effect of implementation intentions in motor cognition on different levels as illustrated by the varying speed advantages and the variation in diffusion parameters per action-mode or condition, respectively. PMID:29593612

  8. Language and the Medial Temporal Lobe: Evidence from H. M.'s Spontaneous Discourse

    ERIC Educational Resources Information Center

    Skotko, Brian G.; Andrews, Edna; Einstein, Gillian

    2005-01-01

    Previous researchers have found it challenging to disentangle the memory and language capabilities of the famous amnesic patient H. M. Here, we present an original linguistic analysis of H. M. based on empirical data drawing upon novel spoken discourse with him. The results did not uncover the language deficits noted previously. Instead, H. M.'s…

  9. Gold-standard evaluation of a folksonomy-based ontology learning model

    NASA Astrophysics Data System (ADS)

    Djuana, E.

    2018-03-01

    Folksonomy, as one result of the collaborative tagging process, has been acknowledged for its potential to improve the categorization and searching of web resources. However, folksonomy contains ambiguities such as synonymy and polysemy, as well as problems of differing abstraction levels or generality. To maximize its potential, methods for associating folksonomy tags with semantics and structural relationships have been proposed, such as ontology learning. This paper evaluates our previous work in ontology learning using a gold-standard evaluation approach, in comparison to a notable state-of-the-art work and several baselines. The results show that our method is comparable to the state-of-the-art work, which further validates our approach, previously validated using a task-based evaluation approach.

  10. Risk analysis theory applied to fishing operations: A new approach on the decision-making problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cunha, J.C.S.

    1994-12-31

    In the past, decisions concerning whether to continue or interrupt a fishing operation were based primarily on the operator's previous experience. This procedure often led to wrong decisions and unnecessary loss of money and time. This paper describes a decision-making method based on risk analysis theory and previous operation results from a field under study. The method leads to more accurate decisions on a daily basis, allowing the operator to verify on each day of the operation whether the decision being carried out is the one with the highest probability of leading to the best economic result. An example of the method's application is provided at the end of the paper.

  11. Atmospheric Dispersion Modeling of the February 2014 Waste Isolation Pilot Plant Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nasstrom, John; Piggott, Tom; Simpson, Matthew

    2015-07-22

    This report presents the results of a simulation of the atmospheric dispersion and deposition of radioactivity released from the Waste Isolation Pilot Plant (WIPP) site in New Mexico in February 2014. These simulations were made by the National Atmospheric Release Advisory Center (NARAC) at Lawrence Livermore National Laboratory (LLNL), and supersede NARAC simulation results published in a previous WIPP report (WIPP, 2014). The results presented in this report use additional, more detailed data from WIPP on the specific radionuclides released, radioactivity release amounts and release times. Compared to the previous NARAC simulations, the new simulation results in this report are based on more detailed modeling of the winds, turbulence, and particle dry deposition. In addition, the initial plume rise from the exhaust vent was considered in the new simulations, but not in the previous NARAC simulations. The new model results show some small differences compared to previous results, but do not change the conclusions in the WIPP (2014) report. Presented are the data and assumptions used in these model simulations, as well as the model-predicted dose and deposition on and near the WIPP site. A comparison of predicted and measured radionuclide-specific air concentrations is also presented.

  12. Positional priming of pop-out is nested in visuospatial context.

    PubMed

    Gokce, Ahu; Müller, Hermann J; Geyer, Thomas

    2013-11-26

    The present study investigated facilitatory and inhibitory positional priming using a variant of Maljkovic and Nakayama's (1996) priming of pop-out task. Here, the singleton target and the distractors could be presented in different visuospatial contexts-but identical screen locations-across trials, permitting positional priming based on individual locations to be disentangled from priming based on interitem configural relations. The results revealed both significant facilitatory priming, i.e., faster reaction times (RTs) to target presented at previous target relative to previously empty locations, and inhibitory priming, i.e., slower RTs to target at previous distractor relative to previously empty locations. However, both effects were contingent on repetitions versus changes of stimulus arrangement: While facilitation of target locations was dependent on the repetition of the exact item configuration (e.g., T-type followed by T-type stimulus arrangement), the inhibitory effect was more "tolerant," being influenced by repetitions versus changes of the item's visuospatial category (T-type followed by Z-type pattern; cf. Garner & Clement, 1963). The results suggest that facilitatory and inhibitory priming are distinct phenomena (Finke et al., 2009) and that both effects are sensitive to subtle information about the arrangement of the display items (Geyer, Zehetleitner, & Müller, 2010). The results are discussed with respect to the stage(s) of visual pop-out search that are influenced by positional priming.

  13. A historical reconstruction of ships' fuel consumption and emissions

    NASA Astrophysics Data System (ADS)

    Endresen, Øyvind; Sørgârd, Eirik; Behrens, Hanna Lee; Brett, Per Olaf; Isaksen, Ivar S. A.

    2007-06-01

    Shipping activity has increased considerably over the last century and currently represents a significant contribution to the global emissions of pollutants and greenhouse gases. Despite this, information about the historical development of fuel consumption and emissions is generally limited, with little data published pre-1950 and large deviations reported for estimates covering the last 3 decades. To better understand the historical development in ship emissions and the uncertainties associated with the estimates, we present fuel-based CO2 and SO2 emission inventories from 1925 up to 2002 and activity-based estimates from 1970 up to 2000. The global CO2 emissions from ships in 1925 have been estimated to 229 Tg (CO2), growing to about 634 Tg (CO2) in 2002. The corresponding SO2 emissions are about 2.5 Tg (SO2) and 8.5 Tg (SO2), respectively. Our activity-based estimates of fuel consumption from 1970 to 2000, covering all oceangoing civil ships above or equal to 100 gross tonnage (GT), are lower compared to previous activity-based studies. We have applied a more detailed model approach, which includes variation in the demand for sea transport, as well as operational and technological changes of the past. This study concludes that the main reason for the large deviations found in reported inventories is the applied number of days at sea. Moreover, our modeling indicates that the ship size and the degree of utilization of the fleet, combined with the shift to diesel engines, have been the major factors determining yearly fuel consumption. Interestingly, the model results from around 1973 suggest that the fleet growth is not necessarily followed by increased fuel consumption, as technical and operational characteristics have changed. Results from this study indicate that reported sales over the last 3 decades seems not to be significantly underreported as previous simplified activity-based studies have suggested. The results confirm our previously reported modeling estimates for year 2000. Previous activity-based studies have not considered ships less than 100 GT (e.g., today some 1.3 million fishing vessels), and we suggest that this fleet could account for an important part of the total fuel consumption (˜10%).

  14. Guidance for Reviewing OCSPP 850.2100 Avian Oral Toxicity Studies Conducted with Passerine Birds

    EPA Pesticide Factsheets

    Guidance based on comparison of results from the TG223 validation studies to results from avian acute oral studies previously submitted to EPA for two test chemicals following EPA's 850.2100 (public draft) guidelines.

  15. Guidance for Use When Regurgitation is Observed in Avian Acute Toxicity Studies with Passerine Species

    EPA Pesticide Factsheets

    Guidance based on comparison of results from the TG223 validation studies to results from avian acute oral studies previously submitted to EPA for two test chemicals following EPA's 850.2100 (public draft) guidelines.

  16. Efficiency of chemotherapy coupled with thermotherapy against citrus HLB

    USDA-ARS?s Scientific Manuscript database

    Six independent experiments were carried out to evaluate the effectiveness of chemotherapy coupled with thermotherapy on potted HLB-affected plants, based on our previous results from graft-based methods. Three-year-old potted HLB-affected citrus plants were exposed to 4 thermotherapy ...

  17. Covariance-based direction-of-arrival estimation of wideband coherent chirp signals via sparse representation.

    PubMed

    Sha, Zhichao; Liu, Zhengmeng; Huang, Zhitao; Zhou, Yiyu

    2013-08-29

    This paper addresses the problem of direction-of-arrival (DOA) estimation for multiple wideband coherent chirp signals, and a new method is proposed. The new method is based on signal component analysis of the array output covariance, instead of the complicated time-frequency analysis used in the previous literature, and is thus more compact and effectively avoids possible signal energy loss during such processing. Moreover, a priori knowledge of the number of signals is no longer a necessity for DOA estimation in the new method. Simulation results demonstrate the performance superiority of the new method over previous ones.

  18. Simulation optimization of PSA-threshold based prostate cancer screening policies

    PubMed Central

    Zhang, Jingyu; Denton, Brian T.; Shah, Nilay D.; Inman, Brant A.

    2013-01-01

    We describe a simulation optimization method to design PSA screening policies based on expected quality adjusted life years (QALYs). Our method integrates a simulation model in a genetic algorithm which uses a probabilistic method for selection of the best policy. We present computational results about the efficiency of our algorithm. The best policy generated by our algorithm is compared to previously recommended screening policies. Using the policies determined by our model, we present evidence that patients should be screened more aggressively but for a shorter length of time than previously published guidelines recommend. PMID:22302420
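
    A hedged skeleton of the simulation-optimization loop described above: a genetic algorithm whose fitness is the expected-QALY value returned by a simulator. The simulator below is a stand-in stub (a smooth function of a single PSA threshold plus noise), not the authors' patient-level model, and the policy encoding and GA settings are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulated_qalys(threshold):
    """Stub simulator: pretend expected QALYs peak near a PSA threshold of ~4 ng/mL."""
    return -0.05 * (threshold - 4.0) ** 2 + rng.normal(0, 0.01)

def genetic_search(pop_size=20, generations=30, mutation_sd=0.3):
    pop = rng.uniform(1.0, 10.0, pop_size)                   # candidate PSA thresholds
    for _ in range(generations):
        fitness = np.array([simulated_qalys(t) for t in pop])
        parents = pop[np.argsort(fitness)[-pop_size // 2:]]  # keep the better half
        children = (rng.choice(parents, pop_size - parents.size)
                    + rng.normal(0, mutation_sd, pop_size - parents.size))
        pop = np.clip(np.concatenate([parents, children]), 1.0, 10.0)
    return pop[np.argmax([simulated_qalys(t) for t in pop])]

print("best PSA threshold found:", round(float(genetic_search()), 2))
```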

  19. Time series regression-based pairs trading in the Korean equities market

    NASA Astrophysics Data System (ADS)

    Kim, Saejoon; Heo, Jun

    2017-07-01

    Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising on low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define this rule, which has previously been specified with fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to those obtained by previous approaches on large capitalisation stocks in the Korean equities market.
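
    A minimal sketch of a regression-based trigger rule, assuming one leg is regressed on the other over a rolling window and positions open or close on the z-score of the regression residual; the synthetic prices, window length, and entry/exit levels are illustrative, not the rules evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, window, entry_z, exit_z = 500, 60, 2.0, 0.5
common = np.cumsum(rng.normal(0, 1, n))                 # shared driver of the pair
price_a = 100 + common + rng.normal(0, 1, n)
price_b = 50 + 0.5 * common + rng.normal(0, 1, n)

signals = []
for t in range(window, n):
    ya, yb = price_a[t - window:t], price_b[t - window:t]
    beta, alpha = np.polyfit(yb, ya, 1)                 # time series regression: a ~ b
    resid_sd = (ya - (alpha + beta * yb)).std()
    z = (price_a[t] - (alpha + beta * price_b[t])) / resid_sd
    if abs(z) > entry_z:
        signals.append((t, "open", round(float(z), 2)))   # short the rich leg, long the cheap one
    elif abs(z) < exit_z:
        signals.append((t, "close", round(float(z), 2)))

print(signals[:5])
```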

  20. From the front line, report from a near paperless hospital: mixed reception among health care professionals.

    PubMed

    Lium, Jan-Tore; Laerum, Hallvard; Schulz, Tom; Faxvaag, Arild

    2006-01-01

    Many Norwegian hospitals that are equipped with an electronic medical record (EMR) system have now proceeded to withdraw the paper-based medical record from clinical workflow. In two previous survey-based studies on the effect of removing the paper-based medical record on the work of physicians, nurses and medical secretaries, we concluded that scanning and eliminating the paper-based record was feasible, but that the medical secretaries were the group that reported benefiting the most from the change. To further explore the effects of removing the paper-based record, especially with regard to medical personnel, we have now conducted a follow-up study of a hospital that has scanned and eliminated its paper-based record. A survey of 27 physicians, 60 nurses and 30 medical secretaries was conducted. The results were compared with those from a previous study conducted three years earlier at the same department. The questionnaire (see online Appendix) covered the frequency of use of the EMR system for specific tasks by physicians, nurses and medical secretaries, the ease of performing these tasks compared to previous routines, user satisfaction and computer literacy. Both physicians and nurses displayed increased use of the EMR compared to the previous study, while medical secretaries reported generally unchanged but high use. The increase in use was not accompanied by a similar change in factors such as computer literacy or technical changes, suggesting that these typical success factors are necessary but not sufficient.

  1. Loran Automatic Vehicle Monitoring System, Phase I : Volume 1. Test Results.

    DOT National Transportation Integrated Search

    1977-08-01

    Presents results of the evaluation phase of a two phase program to develop an Automatic Vehicle Monitoring (AVM) system for the Southern California Rapid Transit District in Los Angeles, California. Tests were previously conducted on a Loran based lo...

  2. A new NIST primary standardization of 18F.

    PubMed

    Fitzgerald, R; Zimmerman, B E; Bergeron, D E; Cessna, J C; Pibida, L; Moreira, D S

    2014-02-01

    A new primary standardization of (18)F by NIST is reported. The standard is based on live-timed beta-gamma anticoincidence counting with confirmatory measurements by three other methods: (i) liquid scintillation (LS) counting using CIEMAT/NIST (3)H efficiency tracing; (ii) triple-to-double coincidence ratio (TDCR) counting; and (iii) NaI integral counting and HPGe γ-ray spectrometry. The results are reported as calibration factors for NIST-maintained ionization chambers (including some "dose calibrators"). The LS-based methods reveal evidence for cocktail instability for one LS cocktail. Using an ionization chamber to link this work with previous NIST results, the new value differs from the previous reports by about 4%, but appears to be in good agreement with the key comparison reference value (KCRV) of 2005. © 2013 Published by Elsevier Ltd.

  3. Evaluation of Contextual Variability in Prediction of Reinforcer Effectiveness

    ERIC Educational Resources Information Center

    Pino, Olimpia; Dazzi, Carla

    2005-01-01

    Previous research has shown that stimulus preference assessments based on caregiver-opinion did not coincide with results of a more systematic method of assessing reinforcing value unless stimuli that were assessed to represent preferences were also preferred on paired stimulus presentation format, and that the relative preference based on the…

  4. Update on Parametric Cost Models for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2011-01-01

    Since the June 2010 Astronomy Conference, an independent review of our cost database discovered some inaccuracies and inconsistencies which can modify our previously reported results. This paper will review changes to the database, our confidence in those changes, and their effect on various parametric cost models.

  5. Standards-Based Accountability Systems. Policy Brief.

    ERIC Educational Resources Information Center

    Stapleman, Jan

    This policy brief summarizes research results and provides guidance regarding decisions associated with school accountability. Unlike previous notions of accountability, a standards-based system examines outputs, such as student performance and graduation rates, as well as inputs like the amount of instructional time or the number of books in the…

  6. Reversible Data Hiding Based on DNA Computing

    PubMed Central

    Xie, Yingjie

    2017-01-01

    Biocomputing, and especially DNA computing, has developed rapidly and is widely used in information security. In this paper, a novel algorithm for reversible data hiding based on DNA computing is proposed. Inspired by the histogram modification algorithm, a classical approach to reversible data hiding, we combine it with DNA computing to realize the algorithm with biological technology. Compared with previous results, our experimental results show a significantly improved ER (embedding rate). Furthermore, the PSNR (peak signal-to-noise ratio) of some test images is also improved. The experimental results show that the method is suitable for protecting the copyright of the cover image in DNA-based information security. PMID:28280504
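
    A hedged sketch of the plain histogram-shifting embedding that histogram-modification schemes build on; the DNA-coding layer described above is omitted, and the peak/zero bin handling assumes a (near-)empty bin lies to the right of the peak.

```python
import numpy as np

def embed(image, bits):
    """Histogram-shifting embedding: shift bins between the peak and an empty bin, then
    encode one payload bit per peak-valued pixel (0 -> peak, 1 -> peak + 1)."""
    hist = np.bincount(image.ravel(), minlength=256)
    peak = int(hist.argmax())
    zero = peak + 1 + int(hist[peak + 1:].argmin())   # emptiest bin to the right of the peak
    marked = image.astype(np.int32)
    marked[(marked > peak) & (marked < zero)] += 1    # free bin peak+1 for embedding
    payload = iter(bits)
    flat = marked.ravel()
    for i in np.where(flat == peak)[0]:
        bit = next(payload, None)
        if bit is None:
            break
        flat[i] = peak + bit
    return marked.astype(np.uint8), peak, zero

rng = np.random.default_rng(0)
cover = rng.integers(0, 200, (64, 64), dtype=np.uint8)
stego, peak, zero = embed(cover, [1, 0, 1, 1, 0, 0, 1])
print("peak/zero bins:", peak, zero, "changed pixels:", int((stego != cover).sum()))
```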

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krityakierne, Tipaluck; Akhtar, Taimoor; Shoemaker, Christine A.

    This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points for which the expensive function has been previously evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers from which many candidate points are generated by random perturbations. Based on surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are tabu with a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
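
    A minimal sketch of the center-selection ingredient described above, assuming two objectives per previously evaluated point (its function value, to be minimized, and its minimum distance to the other evaluated points, to be maximized) and computing the first non-dominated front; the test points and the quadratic stand-in objective are invented.

```python
import numpy as np

def pareto_front(values, min_dists):
    """Indices of points not dominated in (low function value, high minimum distance)."""
    idx = []
    for i in range(len(values)):
        dominated = any(values[j] <= values[i] and min_dists[j] >= min_dists[i]
                        and (values[j] < values[i] or min_dists[j] > min_dists[i])
                        for j in range(len(values)))
        if not dominated:
            idx.append(i)
    return idx

rng = np.random.default_rng(0)
pts = rng.uniform(-2, 2, (30, 2))
values = (pts ** 2).sum(axis=1)                       # stand-in "expensive" objective
dists = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
np.fill_diagonal(dists, np.inf)
min_dists = dists.min(axis=1)                         # distance to nearest evaluated neighbour

centers = pareto_front(values, min_dists)
print("non-dominated center candidates:", centers)
```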

  8. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  9. A Depolarisation Lidar Based Method for the Determination of Liquid-Cloud Microphysical Properties.

    NASA Astrophysics Data System (ADS)

    Donovan, D. P.; Klein Baltink, H.; Henzing, J. S.; De Roode, S. R.; Siebesma, P.

    2014-12-01

    The fact that polarisation lidars measure a multiple-scattering induced depolarisation signal in liquid clouds is well-known. The depolarisation signal depends on the lidar characteristics (e.g. wavelength and field-of-view) as well as the cloud properties (e.g. liquid water content (LWC) and cloud droplet number concentration (CDNC)). Previous efforts seeking to use depolarisation information in a quantitative manner to retrieve cloud properties have been undertaken with, arguably, limited practical success. In this work we present a retrieval procedure applicable to clouds with (quasi-)linear LWC profiles and (quasi-)constant CDNC in the cloud base region. Limiting the applicability of the procedure in this manner allows us to reduce the cloud variables to two parameters (namely the liquid water content lapse-rate and the CDNC). This simplification, in turn, allows us to employ a robust optimal-estimation inversion using pre-computed look-up-tables produced using lidar Monte-Carlo multiple-scattering simulations. Here, we describe the theory behind the inversion procedure and apply it to simulated observations based on large-eddy simulation model output. The inversion procedure is then applied to actual depolarisation lidar data covering a range of cases taken from the Cabauw measurement site in the central Netherlands. The lidar results were then used to predict the corresponding cloud-base region radar reflectivities. In non-drizzling conditions, it was found that the lidar inversion results can be used to predict the observed radar reflectivities with an accuracy within the radar calibration uncertainty (2-3 dBZ). This result strongly supports the accuracy of the lidar inversion results. Results of a comparison between ground-based aerosol number concentration and lidar-derived CDNC are also presented. The results are seen to be consistent with previous studies based on aircraft-based in situ measurements.

  10. Multiple cueing dissociates location- and feature-based repetition effects

    PubMed Central

    Hu, Kesong; Zhan, Junya; Li, Bingzhao; He, Shuchang; Samuel, Arthur G.

    2014-01-01

    There is an extensive literature on the phenomenon of inhibition of return (IOR): When attention is drawn to a peripheral location and then removed, response time is delayed if a target appears in the previously inspected location. Recent research suggests that non-spatial attribute repetition (i.e., if a target shares a feature like color with the earlier, cueing, stimulus) can have a similar inhibitory effect, at least when the target appears in the previously cued location. What remains unknown is whether location- and feature-based inhibitory effects can be dissociated. In the present study, we used a multiple cueing approach to investigate the properties of location- and feature-based repetition effects. In two experiments (detection, and discrimination), location-based IOR was absent but feature-based inhibition was consistently observed. Thus, the present results indicate that feature- and location-based inhibitory effects are dissociable. The results also provide support for the view that the attentional consequences of multiple cues reflect the overall center of gravity of the cues. We suggest that the repetition costs associated with feature and location repetition may be best understood as a consequence of the pattern of activation for object files associated with the stimuli present in the displays. PMID:24907677

  11. Guidance for Classifying Studies Conducted Using the OECD Test Guideline 223 (TG223) (Acute Avian Oral Sequential Dose Study)

    EPA Pesticide Factsheets

    Guidance based on comparison of results from the TG223 validation studies to results from avian acute oral studies previously submitted to EPA for two test chemicals following EPA's 850.2100 (public draft) guidelines.

  12. Transforming The Munitions And Missile Maintenance Officer Career Field

    DTIC Science & Technology

    2016-04-01

    is a United States Air Force Officer currently attending Air and Space Command College at Maxwell Air Force Base, AL. Maj Edington was previously... INTRODUCTION Since the unauthorized transportation of nuclear warheads from Minot Air Force Base (MAFB) to Barksdale Air Force Base (BAFB) and the mistaken... base.66 These factors, coupled with risk-aversion for anything but perfect results during inspections (i.e., zero defects), and the well-established

  13. Deriving all p-brane superalgebras via integrability

    NASA Astrophysics Data System (ADS)

    Grasso, D. T.; McArthur, I. N.

    2018-03-01

    In previous work we demonstrated that the enlarged super-Poincare algebras which underlie p-brane and D-brane actions in superstring theory can be directly determined based on the integrability of supersymmetry transformations assigned to fields appearing in Wess-Zumino terms. In that work we derived p-brane superalgebras for p = 2 and 3. Here we extend our previous results and give a compact expression for superalgebras for all valid p.

  14. Territory choice during the breeding tenure of male sedge warblers.

    PubMed

    Zając, Tadeusz; Bielański, Wojciech; Solarz, Wojciech

    2011-12-01

    A territorial male can shift the location of its territory from year to year in order to increase its quality. The male can base its decision on environmental cues or else on its breeding experiences (when territory shift is caused by breeding failure in previous seasons). We tested these possible mechanisms of territory choice in the sedge warbler (Acrocephalus schoenobaenus), a territorial migrating passerine that occupies wetlands. This species bases its territory choices on an environmental cue: tall wetland vegetation cover. We found that the magnitude of territory quality improvement between seasons (measured as the area of tall wetland vegetation) increased throughout the early stages of a male's breeding career as a result of territory shifts dependent on the earliness of arrival. The distance the territory was shifted between seasons depended negatively on the previous year's territory quality and, less clearly, on the previous year's mating success. On the other hand, previous mating or nesting success had no influence on territory quality improvement between seasons as measured in terms of vegetation. The results imply that tall wetland vegetation is a long-term, effective environmental cue and that a preference for territories in which this type of landcover prevails has evolved into a rigid behavioral mechanism, supplemented by short-term individual experiences of breeding failure.

  15. Evaluating real-time Java for mission-critical large-scale embedded systems

    NASA Technical Reports Server (NTRS)

    Sharp, D. C.; Pla, E.; Luecke, K. R.; Hassan, R. J.

    2003-01-01

    This paper describes benchmarking results on an RT JVM. This paper extends previously published results by including additional tests, by being run on a recently available pre-release version of the first commercially supported RTSJ implementation, and by assessing results based on our experience with avionics systems in other languages.

  16. Tensor scale-based fuzzy connectedness image segmentation

    NASA Astrophysics Data System (ADS)

    Saha, Punam K.; Udupa, Jayaram K.

    2003-05-01

    Tangible solutions to image segmentation are vital in many medical imaging applications. Toward this goal, a framework based on fuzzy connectedness was developed in our laboratory. A fundamental notion called "affinity" - a local fuzzy hanging togetherness relation on voxels - determines the effectiveness of this segmentation framework in real applications. In this paper, we introduce the notion of "tensor scale" - a recently developed local morphometric parameter - into the affinity definition and study its effectiveness. Although our previous notion of "local scale" using the spherical model successfully incorporated local structure size into affinity and resulted in measurable improvements in segmentation results, a major limitation of the previous approach was that it ignored local structural orientation and anisotropy. The current approach of using tensor scale in affinity computation allows an effective utilization of local size, orientation, and anisotropy in a unified manner. Tensor scale is used for computing both the homogeneity- and object-feature-based components of affinity. Preliminary results of the proposed method on several medical images and computer generated phantoms of realistic shapes are presented. Further extensions of this work are discussed.

  17. New decision criteria for selecting delta check methods based on the ratio of the delta difference to the width of the reference range can be generally applicable for each clinical chemistry test item.

    PubMed

    Park, Sang Hyuk; Kim, So-Young; Lee, Woochang; Chun, Sail; Min, Won-Ki

    2012-09-01

    Many laboratories use 4 delta check methods: delta difference, delta percent change, rate difference, and rate percent change. However, guidelines regarding decision criteria for selecting delta check methods have not yet been provided. We present new decision criteria for selecting delta check methods for each clinical chemistry test item. We collected 811,920 and 669,750 paired (present and previous) test results for 27 clinical chemistry test items from inpatients and outpatients, respectively. We devised new decision criteria for the selection of delta check methods based on the ratio of the delta difference to the width of the reference range (DD/RR). Delta check methods based on these criteria were compared with those based on the CV% of the absolute delta difference (ADD) as well as those reported in 2 previous studies. The delta check methods suggested by new decision criteria based on the DD/RR ratio corresponded well with those based on the CV% of the ADD except for only 2 items each in inpatients and outpatients. Delta check methods based on the DD/RR ratio also corresponded with those suggested in the 2 previous studies, except for 1 and 7 items in inpatients and outpatients, respectively. The DD/RR method appears to yield more feasible and intuitive selection criteria and can easily explain changes in the results by reflecting both the biological variation of the test item and the clinical characteristics of patients in each laboratory. We suggest this as a measure to determine delta check methods.
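
    A small worked example of the four delta check quantities and the DD/RR ratio used as the selection criterion; the paired glucose values, the time between samples, and the reference range are invented for illustration.

```python
# Hypothetical paired results for one test item (e.g., glucose in mg/dL).
previous, current = 95.0, 140.0
days_between = 2.0
ref_low, ref_high = 70.0, 100.0          # reference range for the test item

delta_difference = current - previous
delta_percent_change = 100.0 * delta_difference / previous
rate_difference = delta_difference / days_between
rate_percent_change = delta_percent_change / days_between

dd_rr = delta_difference / (ref_high - ref_low)   # ratio guiding delta check method selection

print(f"delta difference     : {delta_difference:.1f}")
print(f"delta percent change : {delta_percent_change:.1f} %")
print(f"rate difference      : {rate_difference:.1f} per day")
print(f"rate percent change  : {rate_percent_change:.1f} % per day")
print(f"DD/RR ratio          : {dd_rr:.2f}")
```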

  18. An improved cellular automaton method to model multispecies biofilms.

    PubMed

    Tang, Youneng; Valocchi, Albert J

    2013-10-01

    Biomass-spreading rules used in previous cellular automaton methods to simulate multispecies biofilm introduced extensive mixing between different biomass species or resulted in spatially discontinuous biomass concentration and distribution; this caused results based on the cellular automaton methods to deviate from experimental results and those from the more computationally intensive continuous method. To overcome the problems, we propose new biomass-spreading rules in this work: Excess biomass spreads by pushing a line of grid cells that are on the shortest path from the source grid cell to the destination grid cell, and the fractions of different biomass species in the grid cells on the path change due to the spreading. To evaluate the new rules, three two-dimensional simulation examples are used to compare the biomass distribution computed using the continuous method and three cellular automaton methods, one based on the new rules and the other two based on rules presented in two previous studies. The relationship between the biomass species is syntrophic in one example and competitive in the other two examples. Simulation results generated using the cellular automaton method based on the new rules agree much better with the continuous method than do results using the other two cellular automaton methods. The new biomass-spreading rules are no more complex to implement than the existing rules. Copyright © 2013 Elsevier Ltd. All rights reserved.
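
    A hedged sketch of the pushing rule as summarized above: when a cell exceeds its capacity, find the shortest path to the nearest empty cell (breadth-first search) and shift the contents of the cells on that path one step toward the destination, so the freed neighbour receives the excess. The grid, single-species biomass, and empty-destination simplification are assumptions, not the published multispecies rule.

```python
from collections import deque
import numpy as np

def path_to_nearest_empty(grid, start):
    """Breadth-first search from the overloaded cell to the nearest empty cell."""
    prev, seen, queue = {}, {start}, deque([start])
    while queue:
        cell = queue.popleft()
        if grid[cell] == 0 and cell != start:
            path = [cell]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return path[::-1]                              # start ... empty destination
        for nb in ((cell[0] + 1, cell[1]), (cell[0] - 1, cell[1]),
                   (cell[0], cell[1] + 1), (cell[0], cell[1] - 1)):
            if 0 <= nb[0] < grid.shape[0] and 0 <= nb[1] < grid.shape[1] and nb not in seen:
                seen.add(nb)
                prev[nb] = cell
                queue.append(nb)
    return None

def push_excess(grid, start, capacity=1.0):
    """Move one cell's excess biomass by pushing the line of cells on the shortest path."""
    if grid[start] <= capacity:
        return grid
    path = path_to_nearest_empty(grid, start)
    if path is None:
        return grid
    excess, grid[start] = grid[start] - capacity, capacity
    for dst, src in zip(path[::-1], path[-2::-1]):         # shift contents toward the far end
        grid[dst] = excess if src == start else grid[src]
    return grid

grid = np.array([[1.8, 1.0, 1.0, 0.0, 0.0]])               # overloaded cell followed by full cells
print(push_excess(grid, (0, 0)))                            # -> [[1.0, 0.8, 1.0, 1.0, 0.0]]
```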

  19. Grey matter, an endophenotype for schizophrenia? A voxel-based morphometry study in siblings of patients with schizophrenia.

    PubMed

    van der Velde, Jorien; Gromann, Paula M; Swart, Marte; de Haan, Lieuwe; Wiersma, Durk; Bruggeman, Richard; Krabbendam, Lydia; Aleman, André

    2015-05-01

    Grey matter, both volume and concentration, has been proposed as an endophenotype for schizophrenia given a number of reports of grey matter abnormalities in relatives of patients with schizophrenia. However, previous studies on grey matter abnormalities in relatives have produced inconsistent results. The aim of the present study was to examine grey matter differences between controls and siblings of patients with schizophrenia and to examine whether the age, genetic loading or subclinical psychotic symptoms of selected individuals could explain the previously reported inconsistencies. We compared the grey matter volume and grey matter concentration of healthy siblings of patients with schizophrenia and healthy controls matched for age, sex and education using voxel-based morphometry (VBM). Furthermore, we selected subsamples based on age (< 30 yr), genetic loading and subclinical psychotic symptoms to examine whether this would lead to different results. We included 89 siblings and 69 controls in our study. The results showed that siblings and controls did not differ significantly on grey matter volume or concentration. Furthermore, specifically selecting participants based on age, genetic loading or subclinical psychotic symptoms did not alter these findings. The main limitation was that subdividing the sample resulted in smaller samples for the subanalyses. Furthermore, we used MRI data from 2 different scanner sites. These results indicate that grey matter measured through VBM might not be a suitable endophenotype for schizophrenia.

  20. A revised timescale for human evolution based on ancient mitochondrial genomes

    PubMed Central

    Johnson, Philip L.F.; Bos, Kirsten; Lari, Martina; Bollongino, Ruth; Sun, Chengkai; Giemsch, Liane; Schmitz, Ralf; Burger, Joachim; Ronchitelli, Anna Maria; Martini, Fabio; Cremonesi, Renata G.; Svoboda, Jiří; Bauer, Peter; Caramelli, David; Castellano, Sergi; Reich, David; Pääbo, Svante; Krause, Johannes

    2016-01-01

    Summary Background Recent analyses of de novo DNA mutations in modern humans have suggested a nuclear substitution rate that is approximately half that of previous estimates based on fossil calibration. This result has led to suggestions that major events in human evolution occurred far earlier than previously thought. Result Here we use mitochondrial genome sequences from 10 securely dated ancient modern humans spanning 40,000 years as calibration points for the mitochondrial clock, thus yielding a direct estimate of the mitochondrial substitution rate. Our clock yields mitochondrial divergence times that are in agreement with earlier estimates based on calibration points derived from either fossils or archaeological material. In particular, our results imply a separation of non-Africans from the most closely related sub-Saharan African mitochondrial DNAs (haplogroup L3) of less than 62,000-95,000 years ago. Conclusion Though single loci like mitochondrial DNA (mtDNA) can only provide biased estimates of population split times, they can provide valid upper bounds; our results exclude most of the older dates for African and non-African split times recently suggested by de novo mutation rate estimates in the nuclear genome. PMID:23523248

  1. How Prevalent Is Object-Based Attention?

    PubMed Central

    Pilz, Karin S.; Roggeveen, Alexa B.; Creighton, Sarah E.; Bennett, Patrick J.; Sekuler, Allison B.

    2012-01-01

    Previous research suggests that visual attention can be allocated to locations in space (space-based attention) and to objects (object-based attention). The cueing effects associated with space-based attention tend to be large and are found consistently across experiments. Object-based attention effects, however, are small and found less consistently across experiments. In three experiments we address the possibility that variability in object-based attention effects across studies reflects low incidence of such effects at the level of individual subjects. Experiment 1 measured space-based and object-based cueing effects for horizontal and vertical rectangles in 60 subjects comparing commonly used target detection and discrimination tasks. In Experiment 2 we ran another 120 subjects in a target discrimination task in which rectangle orientation varied between subjects. Using parametric statistical methods, we found object-based effects only for horizontal rectangles. Bootstrapping methods were used to measure effects in individual subjects. Significant space-based cueing effects were found in nearly all subjects in both experiments, across tasks and rectangle orientations. However, only a small number of subjects exhibited significant object-based cueing effects. Experiment 3 measured only object-based attention effects using another common paradigm and again, using bootstrapping, we found only a small number of subjects that exhibited significant object-based cueing effects. Our results show that object-based effects are more prevalent for horizontal rectangles, which is in accordance with the theory that attention may be allocated more easily along the horizontal meridian. The fact that so few individuals exhibit a significant object-based cueing effect presumably is why previous studies of this effect might have yielded inconsistent results. The results from the current study highlight the importance of considering individual subject data in addition to commonly used statistical methods. PMID:22348018

  2. Dose and detectability for a cone-beam C-arm CT system revisited

    PubMed Central

    Ganguly, Arundhuti; Yoon, Sungwon; Fahrig, Rebecca

    2010-01-01

    Purpose: The authors had previously published measurements of the detectability of disk-shaped contrast objects in images obtained from a C-arm CT system. A simple approach based on Rose's criterion was used to scale the data, assuming the threshold for the smallest diameter detected should be inversely proportional to (dose)^(1/2). A more detailed analysis based on recent theoretical modeling of C-arm CT images is presented in this work. Methods: The signal and noise propagations in a C-arm based CT system have been formulated by other authors using cascaded systems analysis. They established a relationship between detectability and the noise equivalent quanta. Based on this model, the authors obtained a relation between x-ray dose and the diameter of the smallest disks detected. A closed-form solution was established by assuming no rebinning and no resampling of data, with low additive noise and using a ramp filter. For the case when no such assumptions were made, a numerically calculated solution using previously reported imaging and reconstruction parameters was obtained. The detection probabilities for a range of dose and kVp values had been measured previously. These probabilities were normalized to a single dose of 56.6 mGy using the Rose-criterion-based relation to obtain a universal curve. Normalizations based on the new numerically calculated relationship were compared to the measured results. Results: The theoretical and numerical calculations have similar results and predict the detected diameter size to be inversely proportional to (dose)^(1/3) and (dose)^(1/2.8), respectively. The normalized experimental curves and the associated universal plot using the new relation were not significantly different from those obtained using the Rose-criterion-based normalization. Conclusions: From numerical simulations, the authors found that the diameter of detected disks depends inversely on the cube root of the dose. For observer studies for disks larger than 4 mm, the cube root as well as square root relations appear to give similar results when used for normalization. PMID:20527560
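
    A small arithmetic illustration of the two normalizations being compared, assuming a threshold diameter measured at one dose is rescaled to the 56.6 mGy reference dose under the Rose (square-root) and cascaded-systems (cube-root) exponents; the example diameter and dose are invented.

```python
# If d ~ dose^(-p), then d(D_ref) = d(D) * (D / D_ref)^p.
measured_diameter_mm = 3.0
measured_dose_mGy = 20.0
reference_dose_mGy = 56.6

for label, exponent in [("Rose  d ~ dose^(-1/2)", 1 / 2), ("model d ~ dose^(-1/3)", 1 / 3)]:
    normalized = measured_diameter_mm * (measured_dose_mGy / reference_dose_mGy) ** exponent
    print(f"{label}: {normalized:.2f} mm at 56.6 mGy")
```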

  3. A Comparison of Topography-based and Selection-based Verbal Behavior in Typically Developed Children and Developmentally Disabled Persons with Autism

    PubMed Central

    Vignes, Tore

    2007-01-01

    This study is a replication of Sundberg and Sundberg (1990) that compared topography-based verbal behavior with selection-based verbal behavior in terms of acquisition, accuracy, and testing for the emergence of a new verbal relation. Participants were three typical children and three developmentally disabled persons with autism. The study sought to determine which paradigm (topography-based or selection-based) resulted in more rapid acquisition of tacts and intraverbals, which was associated with the fewest errors, and which paradigm resulted in the emergence of the highest number of new verbal relations. The results of the study showed that the six participants performed quite differently from one another. Most importantly, the results from the person with autism contradicted previous findings favoring selection-based verbal behavior over topography-based approaches for teaching verbal behavior to low-functioning individuals. PMID:22477385

  4. ITS evaluation -- phase 3 (2010)

    DOT National Transportation Integrated Search

    2011-05-01

    This report documents the results of applying a previously developed, standardized approach for : evaluating intelligent transportation systems (ITS) projects to 17 ITS earmark projects. The evaluation : approach was based on a questionnaire to inves...

  5. Pre-configured polyhedron based protection against multi-link failures in optical mesh networks.

    PubMed

    Huang, Shanguo; Guo, Bingli; Li, Xin; Zhang, Jie; Zhao, Yongli; Gu, Wanyi

    2014-02-10

    This paper focuses on protection against random multi-link failures in optical mesh networks, instead of the single, dual, or sequential failures considered in previous studies. Spare resource efficiency and failure robustness are major concerns in designing a link protection strategy, and a k-regular, k-edge-connected structure is proved to be one of the optimal solutions for a link protection network. Based on this, a novel pre-configured polyhedron based protection structure is proposed, which can provide protection for both simultaneous and sequential random link failures with improved spare resource efficiency. Its performance is evaluated in terms of spare resource consumption, recovery rate and average recovery path length, and is compared with ring-based and subgraph protection under probabilistic link failure scenarios. Results show that the proposed link protection approach performs better than previous works.

  6. Mediators of Telephone-Based Continuing Care for Alcohol and Cocaine Dependence

    ERIC Educational Resources Information Center

    Mensinger, Janell Lynn; Lynch, Kevin G.; Tenhave, Thomas R.; McKay, James R.

    2007-01-01

    A previous randomized trial with 224 alcohol and/or cocaine addicts who had completed an initial phase of treatment indicated that 12 weeks of telephone-based continuing care yielded higher abstinence rates over 24 months than did group counseling continuing care. The current study examined mediators of this treatment effect. Results suggested…

  7. Analyzing Population Genetics Using the Mitochondrial Control Region and Bioinformatics

    ERIC Educational Resources Information Center

    Sato, Takumi; Phillips, Bonnie; Latourelle, Sandra M.; Elwess, Nancy L.

    2010-01-01

    The 14-base pair hypervariable region in mitochondrial DNA (mtDNA) of Asian populations, specifically Japanese and Chinese students at Plattsburgh State University, was examined. Previous research on this 14-base pair region showed it to be susceptible to mutations and as a result indicated direct correlation with specific ethnic populations.…

  8. Asteroid mass estimation with Markov-chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Siltala, L.; Granvik, M.

    2017-09-01

    We have developed a new Markov-chain Monte Carlo-based algorithm for asteroid mass estimation based on mutual encounters and tested it for several different asteroids. Our results are in line with previous literature values but suggest that uncertainties of prior estimates may be misleading as a consequence of using linearized methods.

  9. Comparing Student Success and Understanding in Introductory Statistics under Consensus and Simulation-Based Curricula

    ERIC Educational Resources Information Center

    Hldreth, Laura A.; Robison-Cox, Jim; Schmidt, Jade

    2018-01-01

    This study examines the transferability of results from previous studies of simulation-based curriculum in introductory statistics using data from 3,500 students enrolled in an introductory statistics course at Montana State University from fall 2013 through spring 2016. During this time, four different curricula, a traditional curriculum and…

  10. Preparation Of Strong, Dense Potassium Beta''-Alumina Ceramic

    NASA Technical Reports Server (NTRS)

    Williams, Roger M.; Jeffries-Nakamura, Barbara; Ryan, Margaret A.; O'Connor, Dennis E.; Kisor, Adam; Kikkert, Stanley J.; Losey, Robert; Suitor, Jerry W.

    1995-01-01

    Improved process for making mechanically strong, dense, phase-pure potassium beta''-alumina solid electrolyte (K-BASE) results in material superior to all previous K-BASE preparations and similar to commercial Na-BASE in strength, phase purity and high-temperature ionic conductivity. Potassium-based alkali-metal thermal-to-electric conversion (AMTEC) cells expected to operate efficiently at lower heat-input temperatures and lower rejection temperatures than sodium-based AMTEC cells, making them appropriate for somewhat different applications.

  11. RESEARCH: An Ecoregional Approach to the Economic Valuation of Land- and Water-Based Recreation in the United States

    PubMed

    Bhat; Bergstrom; Teasley; Bowker; Cordell

    1998-01-01

    This paper describes a framework for estimating the economic value of outdoor recreation across different ecoregions. Ten ecoregions in the continental United States were defined based on similarly functioning ecosystem characters. The individual travel cost method was employed to estimate recreation demand functions for activities such as motor boating and waterskiing, developed and primitive camping, coldwater fishing, sightseeing and pleasure driving, and big game hunting for each ecoregion. While our ecoregional approach differs conceptually from previous work, our results appear consistent with the previous travel cost method valuation studies. KEY WORDS: Recreation; Ecoregion; Travel cost method; Truncated Poisson model

  12. Genomic regions associated with bovine milk fatty acids in both summer and winter milk samples

    PubMed Central

    2012-01-01

    Background In this study we perform a genome-wide association study (GWAS) for bovine milk fatty acids from summer milk samples. This study replicates a previous study where we performed a GWAS for bovine milk fatty acids based on winter milk samples from the same population. Fatty acids from summer and winter milk are genetically similar traits, and we therefore compare the regions detected in summer milk to the regions previously detected in the winter milk GWAS to discover regions that explain genetic variation in both summer and winter milk. Results The GWAS of summer milk samples resulted in 51 regions associated with one or more milk fatty acids. Results are in agreement with most associations that were previously detected in a GWAS of fatty acids from winter milk samples, including eight ‘new’ regions that were not considered in the individual studies. The high correlation between the -log10(P-values) and effects of SNPs that were found significant in both GWAS implies that the effects of the SNPs were similar for winter and summer milk fatty acids. Conclusions The GWAS of fatty acids based on summer milk samples was in agreement with most of the associations detected in the GWAS of fatty acids based on winter milk samples. Associations that were in agreement between both GWAS are more likely to be involved in fatty acid synthesis compared to regions detected in only one GWAS and are therefore worthwhile to pursue in fine-mapping studies. PMID:23107417

  13. Nomarski differential interference contrast microscopy for surface slope measurements: an examination of techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, J.S.; Gordon, R.L.; Lessor, D.L.

    1981-08-01

    Alternate measurement and data analysis procedures are discussed and compared for the application of reflective Nomarski differential interference contrast microscopy for the determination of surface slopes. The discussion includes the interpretation of a previously reported iterative procedure using the results of a detailed optical model and the presentation of a new procedure based on measured image intensity extrema. Surface slope determinations from these procedures are presented and compared with results from a previously reported curve fit analysis of image intensity data. The accuracy and advantages of the different procedures are discussed.

  14. Combined Effects of Supersaturation Rates and Doses on the Kinetic-Solubility Profiles of Amorphous Solid Dispersions Based on Water-Insoluble Poly(2-hydroxyethyl methacrylate) Hydrogels.

    PubMed

    Schver, Giovanna C R M; Lee, Ping I

    2018-05-07

    Under nonsink dissolution conditions, the kinetic-solubility profiles of amorphous solid dispersions (ASDs) based on soluble carriers typically exhibit so-called "spring-and-parachute" concentration-time behaviors. However, the kinetic-solubility profiles of ASDs based on insoluble carriers (including hydrogels) are known to show sustained supersaturation during nonsink dissolution through a matrix-regulated diffusion mechanism by which the supersaturation of the drug is built up gradually and sustained over an extended period without any dissolved polymers acting as crystallization inhibitors. Despite previous findings demonstrating the interplay between supersaturation rates and total doses on the kinetic-solubility profiles of soluble amorphous systems (including ASDs based on dissolution-regulated releases from soluble polymer carriers), the combined effects of supersaturation rates and doses on the kinetic-solubility profiles of ASDs based on diffusion-regulated releases from water-insoluble carriers have not been investigated previously. Thus, the objective of this study is to examine the impacts of total doses and supersaturation-generation rates on the resulting kinetic-solubility profiles of ASDs based on insoluble hydrogel carriers. We employed a previously established ASD-carrier system based on water-insoluble-cross-linked-poly(2-hydroxyethyl methacrylate) (PHEMA)-hydrogel beads and two poorly water soluble model drugs: the weakly acidic indomethacin (IND) and the weakly basic posaconazole (PCZ). Our results show clearly for the first time that by using the smallest-particle-size fraction and a high dose (i.e., above the critical dose), it is indeed possible to significantly shorten the duration of sustained supersaturation in the kinetic-solubility profile of an ASD based on a water-insoluble hydrogel carrier, such that it resembles the spring-and-parachute dissolution profiles normally associated with ASDs based on soluble carriers. This generates sufficiently rapid initial supersaturation buildup above the critical supersaturation, resulting in more rapid precipitation. Above this smallest-particle-size range, the matrix-diffusion-regulated nonlinear rate of drug release gets slower, which results in a more modest rate of supersaturation buildup, leading to a maximum supersaturation below the critical-supersaturation level without appreciable precipitation. The area-under-the-curve (AUC) values of the in vitro kinetic-solubility concentration-time profiles were used to correlate the corresponding trends in dissolution enhancement. There are observed monotonic increases in AUC values with increasing particle sizes for high-dose ASDs based on water-insoluble hydrogel matrixes, as opposed to the previously reported AUC maxima at some intermediate supersaturation rates or doses in soluble amorphous systems, whereas in the case of low-dose ASDs (i.e., below the critical dose levels), crystallization would be negligible, leading to sustained supersaturation with all particle sizes (i.e., eventually reaching the same maximum supersaturation) and the smallest particle size reaching the maximum supersaturation the fastest. As a result, the smallest particle sizes yield the largest AUC values in the case of low-dose ASDs based on water-insoluble hydrogel matrixes. 
In addition to probing the interplay between the supersaturation-generation rates and total doses in ASDs based on insoluble hydrogel carriers, our results further indicate that drug precipitation can be avoided, and sustained supersaturation maintained, by either increasing the hydrogel-particle size or lowering the total dose so that the maximum supersaturation remains below the critical-supersaturation level.

  15. Grey matter, an endophenotype for schizophrenia? A voxel-based morphometry study in siblings of patients with schizophrenia

    PubMed Central

    van der Velde, Jorien; Gromann, Paula M.; Swart, Marte; de Haan, Lieuwe; Wiersma, Durk; Bruggeman, Richard; Krabbendam, Lydia; Aleman, André

    2015-01-01

    Background Grey matter, both volume and concentration, has been proposed as an endophenotype for schizophrenia given a number of reports of grey matter abnormalities in relatives of patients with schizophrenia. However, previous studies on grey matter abnormalities in relatives have produced inconsistent results. The aim of the present study was to examine grey matter differences between controls and siblings of patients with schizophrenia and to examine whether the age, genetic loading or subclinical psychotic symptoms of selected individuals could explain the previously reported inconsistencies. Methods We compared the grey matter volume and grey matter concentration of healthy siblings of patients with schizophrenia and healthy controls matched for age, sex and education using voxel-based morphometry (VBM). Furthermore, we selected subsamples based on age (< 30 yr), genetic loading and subclinical psychotic symptoms to examine whether this would lead to different results. Results We included 89 siblings and 69 controls in our study. The results showed that siblings and controls did not differ significantly on grey matter volume or concentration. Furthermore, specifically selecting participants based on age, genetic loading or subclinical psychotic symptoms did not alter these findings. Limitations The main limitation was that subdividing the sample resulted in smaller samples for the subanalyses. Furthermore, we used MRI data from 2 different scanner sites. Conclusion These results indicate that grey matter measured through VBM might not be a suitable endophenotype for schizophrenia. PMID:25768029

  16. Comparison of alternative weight recalibration methods for diagnosis-related groups

    PubMed Central

    Rogowski, Jeannette Roskamp; Byrne, Daniel J.

    1990-01-01

    In this article, alternative methodologies for recalibration of the diagnosis-related group (DRG) weights are examined. Based on 1984 data, cost and charge-based weights are less congruent than those calculated with 1981 data. Previous studies using 1981 data demonstrated that cost- and charge-based weights were not very different. Charge weights result in higher payments to surgical DRGs and lower payments to medical DRGs, relative to cost weights. At the provider level, charge weights result in higher payments to large urban hospitals and teaching hospitals, relative to cost weights. PMID:10113568

  17. Mesoscopic modeling of DNA denaturation rates: Sequence dependence and experimental comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlen, Oda, E-mail: oda.dahlen@ntnu.no; Erp, Titus S. van, E-mail: titus.van.erp@ntnu.no

    Using rare event simulation techniques, we calculated DNA denaturation rate constants for a range of sequences and temperatures for the Peyrard-Bishop-Dauxois (PBD) model with two different parameter sets. We studied a larger variety of sequences compared to previous studies that only consider DNA homopolymers and DNA sequences containing an equal amount of weak AT- and strong GC-base pairs. Our results show that, contrary to previous findings, an even distribution of the strong GC-base pairs does not always result in the fastest possible denaturation. In addition, we applied an adaptation of the PBD model to study hairpin denaturation for which experimental data are available. This is the first quantitative study in which dynamical results from the mesoscopic PBD model have been compared with experiments. Our results show that present parameterized models, although giving good results regarding thermodynamic properties, overestimate denaturation rates by orders of magnitude. We believe that our dynamical approach is, therefore, an important tool for verifying DNA models and for developing next generation models that have higher predictive power than present ones.

  18. 50 CFR 224.101 - Enumeration of endangered marine and anadromous species.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... institutions) and which are identified as fish belonging to the NYB DPS based on genetics analyses, previously... genetics analyses, previously applied tags, previously applied marks, or documentation to verify that the... Carolina DPS based on genetics analyses, previously applied tags, previously applied marks, or...

  19. 50 CFR 224.101 - Enumeration of endangered marine and anadromous species.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... institutions) and which are identified as fish belonging to the NYB DPS based on genetics analyses, previously... genetics analyses, previously applied tags, previously applied marks, or documentation to verify that the... Carolina DPS based on genetics analyses, previously applied tags, previously applied marks, or...

  20. Effective Sequential Classifier Training for SVM-Based Multitemporal Remote Sensing Image Classification

    NASA Astrophysics Data System (ADS)

    Guo, Yiqing; Jia, Xiuping; Paull, David

    2018-06-01

    The explosive availability of remote sensing images has challenged supervised classification algorithms such as Support Vector Machines (SVM), as training samples tend to be highly limited due to the expensive and laborious task of ground truthing. The temporal correlation and spectral similarity between multitemporal images have opened up an opportunity to alleviate this problem. In this study, a SVM-based Sequential Classifier Training (SCT-SVM) approach is proposed for multitemporal remote sensing image classification. The approach leverages the classifiers of previous images to reduce the required number of training samples for the classifier training of an incoming image. For each incoming image, a rough classifier is firstly predicted based on the temporal trend of a set of previous classifiers. The predicted classifier is then fine-tuned into a more accurate position with current training samples. This approach can be applied progressively to sequential image data, with only a small number of training samples being required from each image. Experiments were conducted with Sentinel-2A multitemporal data over an agricultural area in Australia. Results showed that the proposed SCT-SVM achieved better classification accuracies compared with two state-of-the-art model transfer algorithms. When training data are insufficient, the overall classification accuracy of the incoming image was improved from 76.18% to 94.02% with the proposed SCT-SVM, compared with those obtained without the assistance from previous images. These results demonstrate that the leverage of a priori information from previous images can provide advantageous assistance for later images in multitemporal image classification.
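
    One way to picture the SCT-SVM idea is to extrapolate the weight vectors of linear classifiers trained on earlier images and use the prediction to initialize, and then fine-tune, the classifier for the incoming image. The sketch below does this with scikit-learn's SGDClassifier on synthetic drifting data; the linear extrapolation, the classifier choice, and all data are illustrative assumptions rather than the published implementation.

      import numpy as np
      from sklearn.linear_model import SGDClassifier

      rng = np.random.default_rng(0)

      def make_image(t, n=200, d=5):
          """Synthetic 2-class data whose separating direction drifts with time t."""
          w_true = np.array([1.0, 0.5 + 0.1 * t, -0.3, 0.2, 0.1 * t])
          X = rng.normal(size=(n, d))
          y = (X @ w_true > 0).astype(int)
          return X, y

      # Train classifiers on a sequence of previous images.
      prev_coefs, prev_intercepts = [], []
      for t in range(3):
          X, y = make_image(t)
          clf = SGDClassifier(loss="hinge", random_state=0).fit(X, y)
          prev_coefs.append(clf.coef_.ravel())
          prev_intercepts.append(clf.intercept_[0])

      # Predict the next classifier by linear extrapolation of each parameter over time.
      ts = np.arange(len(prev_coefs))
      coef_pred = np.array([np.polyval(np.polyfit(ts, c, 1), len(ts)) for c in np.array(prev_coefs).T])
      icpt_pred = np.polyval(np.polyfit(ts, prev_intercepts, 1), len(ts))

      # Fine-tune the predicted classifier with only a few labelled samples from the new image.
      X_new, y_new = make_image(len(ts), n=20)
      clf_new = SGDClassifier(loss="hinge", random_state=0)
      clf_new.fit(X_new, y_new,
                  coef_init=coef_pred.reshape(1, -1),
                  intercept_init=np.array([icpt_pred]))

      X_test, y_test = make_image(len(ts), n=500)
      print("accuracy with predicted + fine-tuned classifier:", clf_new.score(X_test, y_test))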

  1. Early syntactic creativity: a usage-based approach.

    PubMed

    Lieven, Elena; Behrens, Heike; Speares, Jennifer; Tomasello, Michael

    2003-05-01

    The aim of the current study was to determine the degree to which a sample of one child's creative utterances related to utterances that the child previously produced. The utterances to be accounted for were all of the intelligible, multi-word utterances produced by the child in a single hour of interaction with her mother early in her third year of life (at age 2;1.11). We used a high-density database consisting of 5 hours of recordings per week together with a maternal diary for the previous 6 weeks. Of the 295 multi-word utterances on tape, 37% were 'novel' in the sense that they had not been said in their entirety before. Using a morpheme-matching method, we identified the way(s) in which each novel utterance differed from its closest match in the preceding corpus. In 74% of the cases we required only one operation to match the previous utterance and the great majority of these consisted of the substitution of a word (usually a noun) into a previous utterance or schema. Almost all the other single-operation utterances involved adding a word onto the beginning or end of a previous utterance. 26% of the novel, multi-word utterances required more than one operation to match the closest previous utterance, although many of these only involved a combination of the two operations seen for the single-operation utterances. Some others were, however, more complex to match. The results suggest that the relatively high degree of creativity in early English child language could be at least partially based upon entrenched schemas and a small number of simple operations to modify them. We discuss the implications of these results for the interplay in language production between strings registered in memory and categorial knowledge.
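
    The single-operation matching described above can be approximated by counting word-level edit operations against the closest previously produced utterance. The sketch below uses Python's difflib for both the matching and the operation listing; the toy corpus and the operation labels are illustrative stand-ins for the authors' morpheme-matching scheme.

      from difflib import SequenceMatcher

      previous_corpus = [
          "more juice",
          "daddy gone",
          "where teddy go",
          "I want juice",
      ]

      def closest_match(utterance, corpus):
          """Return the previous utterance with the highest word-level similarity."""
          words = utterance.split()
          return max(corpus, key=lambda u: SequenceMatcher(None, u.split(), words).ratio())

      def operations(novel, match):
          """List the word-level edit operations needed to turn `match` into `novel`."""
          ops = SequenceMatcher(None, match.split(), novel.split()).get_opcodes()
          return [op for op in ops if op[0] != "equal"]

      novel_utterance = "I want milk"
      match = closest_match(novel_utterance, previous_corpus)
      ops = operations(novel_utterance, match)

      print("closest previous utterance:", match)
      for tag, i1, i2, j1, j2 in ops:
          print(f"{tag}: {match.split()[i1:i2]} -> {novel_utterance.split()[j1:j2]}")
      print("number of operations:", len(ops))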

  2. A New Perspective in the Etiology, Treatment, Prevention and Prediction of Space Motion Sickness

    DTIC Science & Technology

    1988-12-01

    Anticonvulsant options: Dilantin (first choice based on previous ground-based efficacy), Dextromethorphan\Dilantin, Carbamazepine, Dextromethorphan\Carbamazepine.

  3. Torso-Tank Validation of High-Resolution Electrogastrography (EGG): Forward Modelling, Methodology and Results.

    PubMed

    Calder, Stefan; O'Grady, Greg; Cheng, Leo K; Du, Peng

    2018-04-27

    Electrogastrography (EGG) is a non-invasive method for measuring gastric electrical activity. Recent simulation studies have attempted to extend the current clinical utility of the EGG, in particular by providing a theoretical framework for distinguishing specific gastric slow wave dysrhythmias. In this paper we implement an experimental setup called a 'torso-tank' with the aim of expanding and experimentally validating these previous simulations. The torso-tank was developed using an adult male torso phantom with 190 electrodes embedded throughout the torso. The gastric slow waves were reproduced using an artificial current source capable of producing 3D electrical fields. Multiple gastric dysrhythmias were reproduced based on high-resolution mapping data from cases of human gastric dysfunction (gastric re-entry, conduction blocks and ectopic pacemakers) in addition to normal test data. Each case was recorded and compared to the previously-presented simulated results. Qualitative and quantitative analyses were performed to define the accuracy showing [Formula: see text] 1.8% difference, [Formula: see text] 0.99 correlation, and [Formula: see text] 0.04 normalised RMS error between experimental and simulated findings. These results reaffirm previous findings and these methods in unison therefore present a promising morphological-based methodology for advancing the understanding and clinical applications of EGG.
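
    The agreement measures quoted above (percentage difference, correlation, and normalised RMS error) are straightforward to compute from paired simulated and measured waveforms. The snippet below shows one plausible way to do so with NumPy on synthetic slow-wave signals; the exact normalisation conventions used in the paper may differ.

      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0, 60, 600)                       # 60 s of synthetic slow-wave signal
      simulated = np.sin(2 * np.pi * 0.05 * t)          # roughly 3 cycles per minute
      measured  = simulated + 0.02 * rng.normal(size=t.size)

      def percent_difference(a, b):
          return 100.0 * np.mean(np.abs(a - b)) / np.mean(np.abs(b))

      def normalised_rmse(a, b):
          return np.sqrt(np.mean((a - b) ** 2)) / (b.max() - b.min())

      corr = np.corrcoef(simulated, measured)[0, 1]

      print(f"percent difference : {percent_difference(measured, simulated):.2f} %")
      print(f"correlation        : {corr:.3f}")
      print(f"normalised RMS err : {normalised_rmse(measured, simulated):.3f}")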

  4. Venus - Global gravity and topography

    NASA Technical Reports Server (NTRS)

    Mcnamee, J. B.; Borderies, N. J.; Sjogren, W. L.

    1993-01-01

    A new gravity field determination has been produced that combines both the Pioneer Venus Orbiter (PVO) and the Magellan Doppler radio data. Comparisons between this estimate, a spherical harmonic model of degree and order 21, and previous models show that significant improvements have been made. Results are displayed as gravity contours overlaying a topographic map. We also calculate a new spherical harmonic model of topography based on Magellan altimetry, with PVO altimetry included where gaps exist in the Magellan data. This model is also of degree and order 21, so in conjunction with the gravity model, Bouguer and isostatic anomaly maps can be produced. These results are very consistent with previous results, but reveal more spatial resolution in the higher latitudes.

  5. SOP: parallel surrogate global optimization with Pareto center selection for computationally expensive single objective problems

    DOE PAGES

    Krityakierne, Tipaluck; Akhtar, Taimoor; Shoemaker, Christine A.

    2016-02-02

    This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points for which the expensive function has been previously evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers from which many candidate points are generated by random perturbations. Based on surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are made tabu for a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
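
    The center-selection step described above is essentially a non-dominated sort over two objectives per evaluated point: its expensive function value (to be minimised) and its minimum distance to the other evaluated points (to be maximised). A minimal sketch of that sort is shown below; the synthetic objective, the value of P, and the omission of the tabu and surrogate steps are simplifications of the published SOP algorithm.

      import numpy as np

      rng = np.random.default_rng(2)

      # Previously evaluated points and their expensive function values (synthetic).
      X = rng.uniform(-5, 5, size=(30, 2))
      f = np.sum(X ** 2, axis=1)                         # stand-in for the expensive objective

      # Second objective: minimum distance to the other evaluated points (larger is better).
      dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
      np.fill_diagonal(dists, np.inf)
      min_dist = dists.min(axis=1)

      def non_dominated(front_f, front_d):
          """Indices of points not dominated in (minimise f, maximise min-distance)."""
          idx = []
          for i in range(len(front_f)):
              dominated = np.any((front_f <= front_f[i]) & (front_d >= front_d[i]) &
                                 ((front_f < front_f[i]) | (front_d > front_d[i])))
              if not dominated:
                  idx.append(i)
          return idx

      P = 4
      remaining = list(range(len(f)))
      centers = []
      while len(centers) < P and remaining:
          sub_f, sub_d = f[remaining], min_dist[remaining]
          front = [remaining[i] for i in non_dominated(sub_f, sub_d)]
          centers.extend(front)
          remaining = [i for i in remaining if i not in front]

      centers = centers[:P]
      print("selected center indices:", centers)
      print("their objective values :", f[centers])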

  6. Seasonally Transported Aerosol Layers Over Southeast Atlantic are Closer to Underlying Clouds than Previously Reported

    NASA Technical Reports Server (NTRS)

    Rajapakshe, Chamara; Zhang, Zhibo; Yorks, John E.; Yu, Hongbin; Tan, Qian; Meyer, Kerry; Platnick, Steven; Winker, David M.

    2017-01-01

    From June to October, low-level clouds in the southeast (SE) Atlantic often underlie seasonal aerosol layers transported from the African continent. Previously, Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) 532 nm lidar observations have been used to estimate the relative vertical location of the above-cloud aerosols (ACA) with respect to the underlying clouds. Here we show new observations from NASA's Cloud-Aerosol Transport System (CATS) lidar. Two seasons of CATS 1064 nm observations reveal that the bottom of the ACA layer is much lower than previously estimated from CALIPSO 532 nm observations. For about 60% of CATS nighttime ACA scenes, the aerosol layer base is within 360 m of the top of the underlying cloud. Our results are important for future studies of the microphysical indirect and semidirect effects of ACA in the SE Atlantic region.

  7. Gaze data reveal distinct choice processes underlying model-based and model-free reinforcement learning

    PubMed Central

    Konovalov, Arkady; Krajbich, Ian

    2016-01-01

    Organisms appear to learn and make decisions using different strategies known as model-free and model-based learning; the former is mere reinforcement of previously rewarded actions and the latter is a forward-looking strategy that involves evaluation of action-state transition probabilities. Prior work has used neural data to argue that both model-based and model-free learners implement a value comparison process at trial onset, but model-based learners assign more weight to forward-looking computations. Here using eye-tracking, we report evidence for a different interpretation of prior results: model-based subjects make their choices prior to trial onset. In contrast, model-free subjects tend to ignore model-based aspects of the task and instead seem to treat the decision problem as a simple comparison process between two differentially valued items, consistent with previous work on sequential-sampling models of decision making. These findings illustrate a problem with assuming that experimental subjects make their decisions at the same prescribed time. PMID:27511383

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saboia, A.; Toscano, F.; Walborn, S. P.

    We derive a family of entanglement criteria for continuous-variable systems based on the Renyi entropy of complementary distributions. We show that these entanglement witnesses can be more sensitive than those based on second-order moments, as well as previous tests involving the Shannon entropy [Phys. Rev. Lett. 103, 160505 (2009)]. We extend our results to include the case of discrete sampling. We provide several numerical results which show that our criteria can be used to identify entanglement in a number of experimentally relevant quantum states.

  9. The risk of revision in total knee arthroplasty is not affected by previous high tibial osteotomy

    PubMed Central

    Badawy, Mona; Fenstad, Anne M; Indrekvam, Kari; Havelin, Leif I; Furnes, Ove

    2015-01-01

    Background and purpose — Previous studies have found different outcomes after revision of knee arthroplasties performed after high tibial osteotomy (HTO). We evaluated the risk of revision of total knee arthroplasty with or without previous HTO in a large registry material. Patients and methods — 31,077 primary TKAs were compared with 1,399 TKAs after HTO, using Kaplan-Meier 10-year survival percentages and adjusted Cox regression analysis. Results — The adjusted survival analyses showed similar survival in the 2 groups. The Kaplan-Meier 10-year survival was 93.8% in the primary TKA group and 92.6% in the TKA-post-HTO group. Adjusted RR was 0.97 (95% CI: 0.77–1.21; p = 0.8). Interpretation — In this registry-based study, previous high tibial osteotomy did not appear to compromise the results regarding risk of revision after total knee arthroplasty compared to primary knee arthroplasty. PMID:26058747
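
    For readers unfamiliar with the tools named above, the comparison rests on Kaplan-Meier survival curves and an adjusted Cox proportional-hazards model. The sketch below sets up such an analysis with the lifelines package on fabricated registry-style data; the covariates, event rates, and follow-up times are invented, and the code is not the registry's actual analysis.

      import numpy as np
      import pandas as pd
      from lifelines import KaplanMeierFitter, CoxPHFitter

      rng = np.random.default_rng(3)
      n = 2000

      # Fabricated registry-style data: follow-up time (years), revision event, prior-HTO flag.
      prior_hto = rng.integers(0, 2, size=n)
      time = rng.exponential(scale=40, size=n).clip(max=10)      # administrative censoring at 10 years
      event = (rng.uniform(size=n) < 0.07).astype(int)           # ~7% revised within follow-up

      df = pd.DataFrame({"time": time, "revised": event, "prior_hto": prior_hto,
                         "age": rng.normal(68, 9, size=n)})

      # Kaplan-Meier survival per group.
      km = KaplanMeierFitter()
      for flag, group in df.groupby("prior_hto"):
          km.fit(group["time"], group["revised"], label=f"prior_hto={flag}")
          print(km.median_survival_time_)   # often inf when few events occur

      # Adjusted Cox regression (hazard of revision for prior HTO, adjusted for age).
      cph = CoxPHFitter().fit(df, duration_col="time", event_col="revised")
      print(cph.summary)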

  10. Stability and Change in Farming Plans: Results from a Longitudinal Study of Young Adults.

    ERIC Educational Resources Information Center

    Lyson, Thomas A.

    To examine 2 shortcomings of previous cross-sectional farm recruitment research, the study population of the National Longitudinal Survey of the High School Class of 1972, consisting of 14,112 individuals who completed the base year questionnaire and 3 subsequent follow-up questionnaires, was divided into analytic sub-groups based on senior year…

  11. Entrepreneurship in the Engineering Curriculum: Some Initial Results of PUC-Rio's Experiment.

    ERIC Educational Resources Information Center

    Aranha, Jose Alberto S.; Pimenta-Bueno, J. A.; Scavarda do Carmo, Luiz Carlos; da Silveira, Marcos A.

    The ideal of the entrepreneurial spirit has played a key role in shaping the current reform of engineering education at the Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio). The previous paradigm of a science-based conceptual engineer has given place to what may be termed a science-based entrepreneurial engineer. This paper discusses…

  12. Food Hygiene Education in UK Secondary Schools: A Nationwide Survey of Teachers' Views

    ERIC Educational Resources Information Center

    Egan, M. B.; Bielby, G.; Eves, A.; Lumbers, M. L.; Raats, M. M.; Adams, M. R.

    2008-01-01

    Objective: A nationwide survey of teachers investigated the teaching of food hygiene in UK secondary schools. Previous studies have focused on effective strategies in consumer food hygiene training but there is little research focusing on school-based education. Design: The questionnaire was developed based on the results of in-depth interviews…

  13. Can You Skype Me Now? Developing Teachers' Classroom Management Practices through Virtual Coaching

    ERIC Educational Resources Information Center

    Rock, Marcia L.; Schoenfeld, Naomi; Zigmond, Naomi; Gable, Robert A.; Gregg, Madeleine; Ploessl, Donna M.; Salter, Ashley

    2013-01-01

    In this article, situated within the context of a larger ongoing study on the efficacy of Web-based virtual coaching, these authors describe a virtual coaching model for maximizing pre- and in-service teachers' effective use of evidence-based classroom management practices. They also provide a brief summary of previous results obtained…

  14. Re-Exploring Game-Assisted Learning Research: The Perspective of Learning Theoretical Bases

    ERIC Educational Resources Information Center

    Wu, Wen-Hsiung; Chiou, Wen-Bin; Kao, Hao-Yun; Hu, Chung-Hsing Alex; Huang, Sih-Han

    2012-01-01

    Previous literature reviews or meta-analysis based studies on game-assisted learning have provided important results, but few studies have considered the importance of learning theory, and coverage of papers after 2007 is scant. This study presents a systematic review of the literature using a meta-analysis approach to provide a more comprehensive…

  15. Effect of Belief Bias on the Development of Undergraduate Students' Reasoning about Inference

    ERIC Educational Resources Information Center

    Kaplan, Jennifer K.

    2009-01-01

    Psychologists have discovered a phenomenon called "Belief Bias" in which subjects rate the strength of arguments based on the believability of the conclusions. This paper reports the results of a small qualitative pilot study of undergraduate students who had previously taken an algebra-based introduction to statistics class. The subjects in this…

  16. Legacies of precipitation fluctuations on primary production: theory and data synthesis.

    PubMed

    Sala, Osvaldo E; Gherardi, Laureano A; Reichmann, Lara; Jobbágy, Esteban; Peters, Debra

    2012-11-19

    Variability of above-ground net primary production (ANPP) of arid to sub-humid ecosystems displays a closer association with precipitation when considered across space (based on multiyear averages for different locations) than through time (based on year-to-year change at single locations). Here, we propose a theory of controls of ANPP, based on four hypotheses about legacies of wet and dry years, that explains space versus time differences in ANPP-precipitation relationships. We tested the hypotheses using 16 long-term series of ANPP. We found that legacies, revealed by the association of current- versus previous-year conditions through the temporal series, occur across all ecosystem types from deserts to mesic grasslands. Therefore, previous-year precipitation and ANPP control a significant fraction of current-year production. We developed unified models for the controls of ANPP through space and time. The relative importance of current- versus previous-year precipitation changes along a gradient of mean annual precipitation: the importance of current-year PPT decreases, whereas the importance of previous-year PPT remains constant, as mean annual precipitation increases. Finally, our results suggest that ANPP will respond to climate-change-driven alterations in water availability and, more importantly, that the magnitude of the response will increase with time.
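
    In its simplest form, the legacy model described above is a regression of current-year ANPP on both current- and previous-year precipitation. The sketch below fits that two-predictor model by ordinary least squares to synthetic data; the generating coefficients and noise level are illustrative assumptions, not values estimated from the 16 long-term series.

      import numpy as np

      rng = np.random.default_rng(4)
      years = 40

      ppt = rng.gamma(shape=4.0, scale=100.0, size=years)          # annual precipitation (mm)

      # Synthetic ANPP with a legacy term: current- and previous-year precipitation both matter.
      anpp = 0.5 * ppt[1:] + 0.2 * ppt[:-1] + rng.normal(0, 30, size=years - 1)

      # Design matrix: intercept, current-year PPT, previous-year PPT.
      X = np.column_stack([np.ones(years - 1), ppt[1:], ppt[:-1]])
      beta, *_ = np.linalg.lstsq(X, anpp, rcond=None)

      print("intercept, current-year PPT, previous-year PPT coefficients:")
      print(np.round(beta, 3))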

  17. Modified optimal control pilot model for computer-aided design and analysis

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Schmidt, David K.

    1992-01-01

    This paper presents the theoretical development of a modified optimal control pilot model based upon the optimal control model (OCM) of the human operator developed by Kleinman, Baron, and Levison. This model is input compatible with the OCM and retains other key aspects of the OCM, such as a linear quadratic solution for the pilot gains with inclusion of control rate in the cost function, a Kalman estimator, and the ability to account for attention allocation and perception threshold effects. An algorithm designed for easy implementation in current dynamic systems analysis and design software is presented. Example results based upon the analysis of a tracking task using three basic dynamic systems are compared with measured results and with similar analyses performed with the OCM and two previously proposed simplified optimal pilot models. The pilot frequency responses and error statistics obtained with this modified optimal control model are shown to compare more favorably to the measured experimental results than those of the other previously proposed simplified models evaluated.

  18. Thermal Rate Coefficients for the Astrochemical Process C + CH+ → C2+ + H by Ring Polymer Molecular Dynamics.

    PubMed

    Rampino, Sergio; Suleimanov, Yury V

    2016-12-22

    Thermal rate coefficients for the astrochemical reaction C + CH+ → C2+ + H were computed in the temperature range 20-300 K by using a novel rate theory based on ring polymer molecular dynamics (RPMD) on a recently published bond-order based potential energy surface, and compared with previous Langevin capture model (LCM) and quasi-classical trajectory (QCT) calculations. Results show that there is a significant discrepancy between the RPMD rate coefficients and the previous theoretical results, which can lead to overestimation of the rate coefficients for the title reaction by several orders of magnitude at very low temperatures. We argue that this can be attributed to a very challenging energy profile along the reaction coordinate for the title reaction, not taken into account in extenso by either the LCM or the QCT approximation. In the absence of any rigorous quantum mechanical or experimental results, the computed RPMD rate coefficients represent state-of-the-art estimates to be included in astrochemical databases and kinetic networks.

  19. Q Values of the Superallowed {beta} Emitters {sup 26}Al{sup m}, {sup 42}Sc, and {sup 46}V and Their Impact on V{sub ud} and the Unitarity of the Cabibbo-Kobayashi-Maskawa Matrix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eronen, T.; Elomaa, V.; Hager, U.

    2006-12-08

    The {beta}-decay Q{sub EC} values of the superallowed beta emitters {sup 26}Al{sup m}, {sup 42}Sc, and {sup 46}V have been measured with a Penning trap to a relative precision of better than 8x10{sup -9}. Our result for {sup 46}V, 7052.72(31) keV, confirms a recent measurement that differed from the previously accepted reaction-based Q{sub EC} value. However, our results for {sup 26}Al{sup m} and {sup 42}Sc, 4232.83(13) keV and 6426.13(21) keV, are consistent with previous reaction-based values. By eliminating the possibility of a systematic difference between the two techniques, this result demonstrates that no significant shift in the deduced value of V{sub ud} should be anticipated.

  20. Application of Renyi entropy for ultrasonic molecular imaging.

    PubMed

    Hughes, M S; Marsh, J N; Arbeit, J M; Neumann, R G; Fuhrhop, R W; Wallace, K D; Thomas, L; Smith, J; Agyem, K; Lanza, G M; Wickline, S A; McCarthy, J E

    2009-05-01

    Previous work has demonstrated that a signal receiver based on a limiting form of the Shannon entropy is, in certain settings, more sensitive to subtle changes in scattering architecture than conventional energy-based signal receivers [M. S. Hughes et al., J. Acoust. Soc. Am. 121, 3542-3557 (2007)]. In this paper new results are presented demonstrating further improvements in sensitivity using a signal receiver based on the Renyi entropy.
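
    For reference, the Renyi entropy of order q of a discrete distribution p is H_q = log(sum_i p_i^q) / (1 - q), which tends to the Shannon entropy as q approaches 1. The sketch below estimates it from a histogram of signal samples and contrasts it with signal energy; the test signals, binning, and choice of q = 2 are illustrative and do not reproduce the receiver described in the paper.

      import numpy as np

      def renyi_entropy(samples, q=2.0, bins=64):
          """Histogram-based Renyi entropy estimate of order q (q != 1)."""
          counts, _ = np.histogram(samples, bins=bins)
          p = counts[counts > 0] / counts.sum()
          return np.log(np.sum(p ** q)) / (1.0 - q)

      rng = np.random.default_rng(5)
      baseline = rng.normal(0.0, 1.0, size=4096)                               # stand-in backscatter
      perturbed = baseline + 0.05 * np.sin(np.linspace(0, 40 * np.pi, 4096))   # subtle structural change

      for name, sig in [("baseline", baseline), ("perturbed", perturbed)]:
          print(name, "energy:", np.round(np.mean(sig ** 2), 4),
                "H_2:", np.round(renyi_entropy(sig), 4))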

  1. Evidence-based guidelines for the pharmacological treatment of postmenopausal osteoporosis: a consensus document by the Belgian Bone Club

    PubMed Central

    Body, J.-J.; Bergmann, P.; Boonen, S.; Boutsen, Y.; Devogelaer, J.-P.; Goemaere, S.; Kaufman, J.-M.; Rozenberg, S.

    2010-01-01

    Several drugs are available for the management of postmenopausal osteoporosis. This may, in daily practice, confuse the clinician. This manuscript offers an evidence-based update of previous treatment guidelines, with a critical assessment of the currently available efficacy data on all new chemical entities which were granted a marketing authorization. Osteoporosis is widely recognized as a major public health concern. The availability of new therapeutic agents makes clinical decision-making in osteoporosis more complex. Nation-specific guidelines are needed to take into consideration the specificities of each and every health care environment. The present manuscript is the result of a National Consensus, based on a systematic review and a critical appraisal of the currently available literature. It offers an evidence-based update of previous treatment guidelines, with the aim of providing clinicians with an unbiased assessment of osteoporosis treatment effect. PMID:20480148

  2. Common Criteria Related Security Design Patterns for Intelligent Sensors—Knowledge Engineering-Based Implementation

    PubMed Central

    Bialas, Andrzej

    2011-01-01

    Intelligent sensors experience security problems very similar to those inherent to other kinds of IT products or systems. The assurance for these products or systems creation methodologies, like Common Criteria (ISO/IEC 15408) can be used to improve the robustness of the sensor systems in high risk environments. The paper presents the background and results of the previous research on patterns-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process dealing with the sensor example. The contribution of the paper concerns the application of the knowledge engineering methodology to the previously developed Common Criteria compliant and pattern-based method for intelligent sensor security development. The issue presented in the paper has a broader significance in terms that it can solve information security problems in many application domains. PMID:22164064

  3. Common criteria related security design patterns for intelligent sensors--knowledge engineering-based implementation.

    PubMed

    Bialas, Andrzej

    2011-01-01

    Intelligent sensors experience security problems very similar to those inherent to other kinds of IT products or systems. The assurance for these products or systems creation methodologies, like Common Criteria (ISO/IEC 15408) can be used to improve the robustness of the sensor systems in high risk environments. The paper presents the background and results of the previous research on patterns-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process dealing with the sensor example. The contribution of the paper concerns the application of the knowledge engineering methodology to the previously developed Common Criteria compliant and pattern-based method for intelligent sensor security development. The issue presented in the paper has a broader significance in terms that it can solve information security problems in many application domains.

  4. Image restoration for three-dimensional fluorescence microscopy using an orthonormal basis for efficient representation of depth-variant point-spread functions

    PubMed Central

    Patwary, Nurmohammed; Preza, Chrysanthe

    2015-01-01

    A depth-variant (DV) image restoration algorithm for wide field fluorescence microscopy, using an orthonormal basis decomposition of DV point-spread functions (PSFs), is investigated in this study. The efficient PSF representation is based on a previously developed principal component analysis (PCA), which is computationally intensive. We present an approach developed to reduce the number of DV PSFs required for the PCA computation, thereby making the PCA-based approach computationally tractable for thick samples. Restoration results from both synthetic and experimental images show consistency and that the proposed algorithm addresses efficiently depth-induced aberration using a small number of principal components. Comparison of the PCA-based algorithm with a previously-developed strata-based DV restoration algorithm demonstrates that the proposed method improves performance by 50% in terms of accuracy and simultaneously reduces the processing time by 64% using comparable computational resources. PMID:26504634
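
    The efficient representation described above expands each depth-variant PSF in a small orthonormal basis obtained by principal component analysis. A minimal sketch using an SVD of vectorised synthetic Gaussian PSFs is shown below; the PSF model, image size, and number of retained components are illustrative assumptions, not the microscope PSFs used in the study.

      import numpy as np

      def gaussian_psf(size, sigma):
          ax = np.arange(size) - size // 2
          xx, yy = np.meshgrid(ax, ax)
          psf = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
          return psf / psf.sum()

      # Synthetic depth-variant PSFs: blur grows with depth.
      size, depths = 33, np.linspace(1.0, 4.0, 20)
      psfs = np.stack([gaussian_psf(size, s).ravel() for s in depths])   # (n_depths, size*size)

      # PCA via SVD of mean-centred PSFs.
      mean_psf = psfs.mean(axis=0)
      U, S, Vt = np.linalg.svd(psfs - mean_psf, full_matrices=False)

      k = 3                                  # number of principal components retained
      coeffs = U[:, :k] * S[:k]              # depth-dependent expansion coefficients
      approx = mean_psf + coeffs @ Vt[:k]    # PSFs reconstructed from k components

      err = np.linalg.norm(approx - psfs) / np.linalg.norm(psfs)
      print(f"relative reconstruction error with {k} components: {err:.2e}")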

  5. In preparation of the nationwide dissemination of the school-based obesity prevention program DOiT: stepwise development applying the intervention mapping protocol.

    PubMed

    van Nassau, Femke; Singh, Amika S; van Mechelen, Willem; Brug, Johannes; Chin A Paw, Mai J M

    2014-08-01

    The school-based Dutch Obesity Intervention in Teenagers (DOiT) program is an evidence-based obesity prevention program. In preparation for dissemination throughout the Netherlands, this study aimed to adapt the initial program and to develop an implementation strategy and materials. We revisited the Intervention Mapping (IM) protocol, using results of the previous process evaluation and additional focus groups and interviews with students, parents, teachers, and professionals. The adapted 2-year DOiT program consists of a classroom, an environmental and a parental component. The year 1 lessons aim to increase awareness and knowledge of healthy behaviors. The lessons in year 2 focus on the influence of the (obesogenic) environment. The stepwise development of the implementation strategy resulted in objectives that support teachers' implementation. We developed a 7-step implementation strategy and supporting materials by translating the objectives into essential elements and practical strategies. This study illustrates how revisiting the IM protocol resulted in an adapted program and tailored implementation strategy based on previous evaluations as well as input from different stakeholders. The stepwise development of DOiT can serve as an example for other evidence-based programs in preparation for wider dissemination. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  6. A revised load estimation procedure for the Susquehanna, Potomac, Patuxent, and Choptank rivers

    USGS Publications Warehouse

    Yochum, Steven E.

    2000-01-01

    The U.S. Geological Survey's Chesapeake Bay River Input Program has updated the nutrient and suspended-sediment load data base for the Susquehanna, Potomac, Patuxent, and Choptank Rivers using a multiple-window, center-estimate regression methodology. The revised method optimizes the seven-parameter regression approach that has been used historically by the program. The revised method estimates load using the fifth or center year of a sliding 9-year window. Each year a new model is run for each site and constituent, the most recent year is added, and the previous 4 years of estimates are updated. The fifth year in the 9-year window is considered the best estimate and is kept in the data base. The last year of estimation shows the most change from the previous year's estimate and this change approaches a minimum at the fifth year. Differences between loads computed using this revised methodology and the loads populating the historical data base have been noted but the load estimates do not typically change drastically. The data base resulting from the application of this revised methodology is populated by annual and monthly load estimates that are known with greater certainty than in the previous load data base.
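
    The windowing logic described above can be illustrated with a short script: a model is refit for each sliding 9-year window, and only the estimate for the centre (fifth) year is retained in the data base. The sketch below uses a trivial mean-based placeholder in place of the seven-parameter load regression purely to show the bookkeeping; the synthetic daily loads and years are invented.

      import numpy as np

      rng = np.random.default_rng(6)
      years = np.arange(1985, 2000)
      annual_obs = {y: rng.lognormal(mean=10, sigma=0.3, size=365) for y in years}  # synthetic daily loads

      WINDOW = 9
      CENTER = WINDOW // 2   # index 4 -> fifth year of the window

      def fit_and_estimate(window_years):
          """Placeholder for the regression model: estimate the centre year's annual load."""
          pooled = np.concatenate([annual_obs[y] for y in window_years])   # 'calibration' data
          centre_year = window_years[CENTER]
          return centre_year, pooled.mean() * 365.0                        # annual load estimate

      best_estimates = {}
      for start in range(len(years) - WINDOW + 1):
          window_years = years[start:start + WINDOW]
          centre_year, load = fit_and_estimate(window_years)
          best_estimates[centre_year] = load        # only the centre-year estimate is kept

      for y, load in best_estimates.items():
          print(y, f"{load:.3e}")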

  7. Motion compensated shape error concealment.

    PubMed

    Schuster, Guido M; Katsaggelos, Aggelos K

    2006-02-01

    The introduction of Video Objects (VOs) is one of the innovations of MPEG-4. The alpha-plane of a VO defines its shape at a given instance in time and hence determines the boundary of its texture. In packet-based networks, shape, motion, and texture are subject to loss. While there has been considerable attention paid to the concealment of texture and motion errors, little has been done in the field of shape error concealment. In this paper we propose a post-processing shape error concealment technique that uses the motion compensated boundary information of the previously received alpha-plane. The proposed approach is based on matching received boundary segments in the current frame to the boundary in the previous frame. This matching is achieved by finding a maximally smooth motion vector field. After the current boundary segments are matched to the previous boundary, the missing boundary pieces are reconstructed by motion compensation. Experimental results demonstrating the performance of the proposed motion compensated shape error concealment method, and comparing it with the previously proposed weighted side matching method are presented.

  8. Symbol signal-to-noise ratio loss in square-wave subcarrier downconversion

    NASA Technical Reports Server (NTRS)

    Feria, Y.; Statman, J.

    1993-01-01

    This article presents the simulated results of the signal-to-noise ratio (SNR) loss in the process of a square-wave subcarrier down conversion. In a previous article, the SNR degradation was evaluated at the output of the down converter based on the signal and noise power change. Unlike in the previous article, the SNR loss is defined here as the difference between the actual and theoretical symbol SNR's for the same symbol-error rate at the output of the symbol matched filter. The results show that an average SNR loss of 0.3 dB can be achieved with tenth-order infinite impulse response (IIR) filters. This loss is a 0.2-dB increase over the SNR degradation in the previous analysis where neither the signal distortion nor the symbol detector was considered.

  9. PharmDock: a pharmacophore-based docking program

    PubMed Central

    2014-01-01

    Background Protein-based pharmacophore models are enriched with the information of potential interactions between ligands and the protein target. We have shown in a previous study that protein-based pharmacophore models can be applied for ligand pose prediction and pose ranking. In this publication, we present a new pharmacophore-based docking program PharmDock that combines pose sampling and ranking based on optimized protein-based pharmacophore models with local optimization using an empirical scoring function. Results Tests of PharmDock on ligand pose prediction, binding affinity estimation, compound ranking and virtual screening yielded comparable or better performance to existing and widely used docking programs. The docking program comes with an easy-to-use GUI within PyMOL. Two features have been incorporated in the program suite that allow for user-defined guidance of the docking process based on previous experimental data. Docking with those features demonstrated superior performance compared to unbiased docking. Conclusion A protein pharmacophore-based docking program, PharmDock, has been made available with a PyMOL plugin. PharmDock and the PyMOL plugin are freely available from http://people.pharmacy.purdue.edu/~mlill/software/pharmdock. PMID:24739488

  10. A mathematical programming method for formulating a fuzzy regression model based on distance criterion.

    PubMed

    Chen, Liang-Hsuan; Hsueh, Chan-Ching

    2007-06-01

    Fuzzy regression models are useful to investigate the relationship between explanatory and response variables with fuzzy observations. Different from previous studies, this correspondence proposes a mathematical programming method to construct a fuzzy regression model based on a distance criterion. The objective of the mathematical programming is to minimize the sum of distances between the estimated and observed responses on the X axis, such that the fuzzy regression model constructed has the minimal total estimation error in distance. Only several alpha-cuts of fuzzy observations are needed as inputs to the mathematical programming model; therefore, the applications are not restricted to triangular fuzzy numbers. Three examples, adopted in the previous studies, and a larger example, modified from the crisp case, are used to illustrate the performance of the proposed approach. The results indicate that the proposed model has better performance than those in the previous studies based on either distance criterion or Kim and Bishu's criterion. In addition, the efficiency and effectiveness for solving the larger example by the proposed model are also satisfactory.

  11. Research on Vehicle-Based Driver Status/Performance Monitoring, Part III

    DOT National Transportation Integrated Search

    1996-09-01

    A driver drowsiness detection/alarm/countermeasures system was specified, tested and evaluated, resulting in the development of revised algorithms for the detection of driver drowsiness. Previous algorithms were examined in a test and evaluation stud...

  12. Research On Vehicle-Based Driver Status/Performance Monitoring, Part I

    DOT National Transportation Integrated Search

    1996-09-01

    A driver drowsiness detection/alarm/countermeasures system was specified, tested and evaluated, resulting in the development of revised algorithms for the detection of driver drowsiness. Previous algorithms were examined in a test and evaluation stud...

  13. Effectiveness of Abstinence-Based Incentives: Interaction with Intake Stimulant Test Results

    ERIC Educational Resources Information Center

    Stitzer, Maxine L.; Petry, Nancy; Peirce, Jessica; Kirby, Kimberly; Killeen, Therese; Roll, John; Hamilton, John; Stabile, Patricia Q.; Sterling, Robert; Brown, Chanda; Kolodner, Ken; Li, Rui

    2007-01-01

    Intake urinalysis test result (drug positive vs. negative) has been previously identified as a strong predictor of drug abuse treatment outcome, but there is little information about how this prognostic factor may interact with the type of treatment delivered. The authors used data from a multisite study of abstinence incentives for stimulant…

  14. Ribosomal DNA intergenic spacer sequence in foxtail millet, Setaria italica (L.) P. Beauv. and its characterization and application to typing of foxtail millet landraces.

    PubMed

    Fukunaga, Kenji; Ichitani, Katsuyuki; Taura, Satoru; Sato, Muneharu; Kawase, Makoto

    2005-02-01

    We determined the sequence of ribosomal DNA (rDNA) intergenic spacer (IGS) of foxtail millet isolated in our previous study, and identified subrepeats in the polymorphic region. We also developed a PCR-based method for identifying rDNA types based on sequence information and assessed 153 accessions of foxtail millet. Results were congruent with our previous works. This study provides new findings regarding the geographical distribution of rDNA variants. This new method facilitates analyses of numerous foxtail millet accessions. It is helpful for typing of foxtail millet germplasms and elucidating the evolution of this millet.

  15. Impact of a disability management program on employee productivity in a petrochemical company.

    PubMed

    Skisak, Christopher M; Bhojani, Faiyaz; Tsai, Shan P

    2006-05-01

    An inhouse disability management program was implemented to reduce nonoccupational absences in a petrochemical corporation. The program was administered by full-time certified, corporate-based case managers and nine manufacturing location nurses. Employees were required to report all absences on the first day and again on the fourth workday of absence. A medical certification form was required for absences of 4 or more working days. Extended absences were actively managed. An Internet-based case management tool, Medgate, was used as a primary management tool. Results were compared with the previous year among the target population and with company business units not participating in the program. The program resulted in a 10% reduction in total absence days per employee (6.9 to 6.2) compared with the previous year, whereas business units not using the program had an 8% increase (5.5 to 5.9). This disability management program resulted in a more than four to one return on investment based on direct expenditures and cost savings in terms of reduced absence days. The inhouse disability management program was successful by absence duration, employee satisfaction, and return on investment criteria.

  16. Random walks based multi-image segmentation: Quasiconvexity results and GPU-based solutions

    PubMed Central

    Collins, Maxwell D.; Xu, Jia; Grady, Leo; Singh, Vikas

    2012-01-01

    We recast the Cosegmentation problem using Random Walker (RW) segmentation as the core segmentation algorithm, rather than the traditional MRF approach adopted in the literature so far. Our formulation is similar to previous approaches in the sense that it also permits Cosegmentation constraints (which impose consistency between the extracted objects from ≥ 2 images) using a nonparametric model. However, several previous nonparametric cosegmentation methods have the serious limitation that they require adding one auxiliary node (or variable) for every pair of pixels that are similar (which effectively limits such methods to describing only those objects that have high entropy appearance models). In contrast, our proposed model completely eliminates this restrictive dependence –the resulting improvements are quite significant. Our model further allows an optimization scheme exploiting quasiconvexity for model-based segmentation with no dependence on the scale of the segmented foreground. Finally, we show that the optimization can be expressed in terms of linear algebra operations on sparse matrices which are easily mapped to GPU architecture. We provide a highly specialized CUDA library for Cosegmentation exploiting this special structure, and report experimental results showing these advantages. PMID:25278742

  17. A revised timescale for human evolution based on ancient mitochondrial genomes.

    PubMed

    Fu, Qiaomei; Mittnik, Alissa; Johnson, Philip L F; Bos, Kirsten; Lari, Martina; Bollongino, Ruth; Sun, Chengkai; Giemsch, Liane; Schmitz, Ralf; Burger, Joachim; Ronchitelli, Anna Maria; Martini, Fabio; Cremonesi, Renata G; Svoboda, Jiří; Bauer, Peter; Caramelli, David; Castellano, Sergi; Reich, David; Pääbo, Svante; Krause, Johannes

    2013-04-08

    Recent analyses of de novo DNA mutations in modern humans have suggested a nuclear substitution rate that is approximately half that of previous estimates based on fossil calibration. This result has led to suggestions that major events in human evolution occurred far earlier than previously thought. Here, we use mitochondrial genome sequences from ten securely dated ancient modern humans spanning 40,000 years as calibration points for the mitochondrial clock, thus yielding a direct estimate of the mitochondrial substitution rate. Our clock yields mitochondrial divergence times that are in agreement with earlier estimates based on calibration points derived from either fossils or archaeological material. In particular, our results imply a separation of non-Africans from the most closely related sub-Saharan African mitochondrial DNAs (haplogroup L3) that occurred less than 62-95 kya. Though single loci like mitochondrial DNA (mtDNA) can only provide biased estimates of population divergence times, they can provide valid upper bounds. Our results exclude most of the older dates for African and non-African population divergences recently suggested by de novo mutation rate estimates in the nuclear genome. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. An Investigation of State-Space Model Fidelity for SSME Data

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2008-01-01

    In previous studies, a variety of unsupervised anomaly detection techniques were applied to SSME (Space Shuttle Main Engine) data. The observed results indicated that the identification of certain anomalies was specific to the algorithmic method under consideration. For this reason, one of the follow-on goals of those investigations was to build an architecture that supports the best capabilities of all algorithms. We address that goal here by investigating a cascade, serial architecture for the best-performing and most suitable candidates from previous studies. As a precursor to a formal ROC (Receiver Operating Characteristic) curve analysis for validation of the resulting anomaly detection algorithms, our primary focus here is to investigate model fidelity as measured by variants of the AIC (Akaike Information Criterion) for state-space based models. We show that placing constraints on a state-space model during or after the training of the model introduces a modest level of suboptimality. Furthermore, we compare the fidelity of all candidate models, including those embodying the cascade, serial architecture. We make recommendations on the most suitable candidates for application to subsequent anomaly detection studies as measured by AIC-based criteria.
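
    As a generic illustration of ranking model fidelity by AIC (AIC = 2k - 2 ln L), the sketch below fits autoregressive models of several orders to a toy signal and compares their AIC values; the report's state-space models, constraint handling, and SSME data are not reproduced, and all numbers here are invented.

```python
# Generic AIC-based model comparison (AIC = 2k - 2 ln L), illustrated with
# least-squares autoregressive fits of different order to a toy signal.
# Illustrative only; not the report's state-space models or SSME data.
import numpy as np

def ar_aic(x, order):
    """Fit AR(order) by least squares; return AIC under a Gaussian residual model."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coeffs
    n = len(y)
    sigma2 = np.mean(resid ** 2)                      # ML estimate of noise variance
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    k = order + 1                                     # AR coefficients plus the variance
    return 2 * k - 2 * log_lik

rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(2, 500):                               # the true process is AR(2)
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal(0, 0.1)

for order in (1, 2, 4, 8):
    print(f"AR({order}): AIC = {ar_aic(x, order):.1f}")
```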

  19. Shear-wave velocity-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    USGS Publications Warehouse

    Kayen, R.; Moss, R.E.S.; Thompson, E.M.; Seed, R.B.; Cetin, K.O.; Der Kiureghian, A.; Tanaka, Y.; Tokimatsu, K.

    2013-01-01

    Shear-wave velocity (Vs) offers a means to determine the seismic resistance of soil to liquefaction by a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy those previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.

  20. Generating human-like movements on an anthropomorphic robot using an interior point method

    NASA Astrophysics Data System (ADS)

    Costa e Silva, E.; Araújo, J. P.; Machado, D.; Costa, M. F.; Erlhagen, W.; Bicho, E.

    2013-10-01

    In previous work we have presented a model for generating human-like arm and hand movements on an anthropomorphic robot involved in human-robot collaboration tasks. This model was inspired by the Posture-Based Motion-Planning Model of human movements. Numerical results and simulations for reach-to-grasp movements with two different grip types have been presented previously. In this paper we extend our model in order to address the generation of more complex movement sequences which are challenged by scenarios cluttered with obstacles. The numerical results were obtained using the IPOPT solver, which was integrated in our MATLAB simulator of an anthropomorphic robot.

  1. Mach-Zehnder Interferometer Refractive Index Sensor Based on a Plasmonic Channel Waveguide

    PubMed Central

    Lee, Da Eun; Lee, Young Jin; Shin, Eunso; Kwon, Soon-Hong

    2017-01-01

    A Mach-Zehnder interferometer based on a plasmonic channel waveguide is proposed for refractive index sensing. The structure, with a small physical footprint of 20 × 120 μm2, achieved a high figure of merit of 294. The cut-off frequency behaviour in the plasmonic channel waveguide resulted in a flat dispersion curve, which induces a 1.8 times larger change of the propagation constant for the given refractive index change compared with previously reported results. PMID:29120381

  2. Some anomalies between wind tunnel and flight transition results

    NASA Technical Reports Server (NTRS)

    Harvey, W. D.; Bobbitt, P. J.

    1981-01-01

    A review of environmental disturbance influences and boundary layer transition measurements from a large collection of reference sharp-cone tests in wind tunnels, together with recent transonic-supersonic cone flight results, has previously demonstrated the dominance of the free-stream disturbance level on the transition process from beginning to end. The variation with Mach number of the ratio of transition Reynolds number at onset to that at the end of transition has been shown to be consistently different between flight and wind tunnels. Previous correlations of the end of transition with disturbance level give good results for flight and for a large number of tunnels; however, anomalies occur for similar correlations based on transition onset. The present cone results with a tunnel sonic throat reduced the disturbance level by an order of magnitude, with transition values comparable to flight.

  3. [Profile of adolescents with repeated pregnancies attended at a prenatal clinic].

    PubMed

    Persona, Lia; Shimo, Antonieta Keiko Kakuda; Tarallo, Maria Celina

    2004-01-01

    This study identified the biopsychosocial profile of adolescents with repeated pregnancies who were attended at a prenatal clinic. Data were collected through patient records and interviews and were subjected to quantitative analysis. Based on the results obtained and in accordance with the literature, factors strongly associated with the occurrence of repeat pregnancy were identified in the adolescents' profiles. These are: early menarche; first sexual intercourse shortly after menarche; school repetition; school dropout; non-remunerated occupation; low family income; involvement with older partners; living with the partner; consensual union with the partner; a single partner; low condom use; family history of adolescent pregnancy; father's absence because of death or abandonment; positive family reaction to the previous pregnancy; previous abortion; the adolescent's positive views of the previous delivery; and absence from previous postpartum consultations.

  4. Volatility and correlation-based systemic risk measures in the US market

    NASA Astrophysics Data System (ADS)

    Civitarese, Jamil

    2016-10-01

    This paper deals with the problem of how to use simple systemic risk measures to assess portfolio risk characteristics. Using three simple examples taken from the previous literature, one based on raw and partial correlations, another based on the eigenvalue decomposition of the covariance matrix, and the last based on an eigenvalue entropy, a Granger-causation analysis revealed that some of them are not always good measures of risk in the S&P 500 and in the VIX. The selected measures do not Granger-cause the VIX index in all of the windows examined; therefore, in the sense of risk as volatility, the indicators are not always suitable. Nevertheless, their results with respect to returns are similar to those of previous works that accept them. A deeper analysis showed, however, that any symmetric measure based on the eigenvalue decomposition of correlation matrices is not useful as a measure of "correlation" risk. The empirical analysis accompanying this proposition showed that negative correlations are usually small and, therefore, do not heavily distort the behavior of the indicator.
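
    As a hedged sketch of one of the indicators mentioned above, the snippet below computes the entropy of the eigenvalue spectrum of a return-correlation matrix for synthetic data; the tickers, estimation windows, and Granger-causality tests of the paper are not reproduced.

```python
# Eigenvalue entropy of a correlation matrix, computed on synthetic returns.
# Illustrative only; not the paper's data or its Granger-causality analysis.
import numpy as np

def eigenvalue_entropy(returns):
    """Shannon entropy of the normalized eigenvalues of the correlation matrix."""
    corr = np.corrcoef(returns, rowvar=False)        # assets in columns
    eigvals = np.linalg.eigvalsh(corr)
    p = eigvals / eigvals.sum()                      # normalize to a distribution
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(2)
# Toy returns: 10 assets over 250 days, driven partly by a common market factor.
market = rng.normal(0, 0.01, size=(250, 1))
idiosyncratic = rng.normal(0, 0.01, size=(250, 10))
returns = 0.7 * market + idiosyncratic
print("eigenvalue entropy:", eigenvalue_entropy(returns))
```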

  5. Gryphon: A Hybrid Agent-Based Modeling and Simulation Platform for Infectious Diseases

    NASA Astrophysics Data System (ADS)

    Yu, Bin; Wang, Jijun; McGowan, Michael; Vaidyanathan, Ganesh; Younger, Kristofer

    In this paper we present Gryphon, a hybrid agent-based stochastic modeling and simulation platform developed for characterizing the geographic spread of infectious diseases and the effects of interventions. We study both local and non-local transmission dynamics of stochastic simulations based on the published parameters and data for SARS. The results suggest that the expected numbers of infections and the timeline of control strategies predicted by our stochastic model are in reasonably good agreement with previous studies. These preliminary results indicate that Gryphon is able to characterize other future infectious diseases and identify endangered regions in advance.

  6. Modeling the Response of Primary Production and Sedimentation to Variable Nitrate Loading in the Mississippi River Plume

    DTIC Science & Technology

    2008-03-06

    ...oped based on previous observational studies in the MRP. Our ... annual variations in hypoxic zone size and resulted in suggestions ... model was developed by ... nitrate loading. The nitrogen-based model consisted of nine compartments (nitrate, ammonium, labile dissolved organic nitrogen, bacteria, small ... independent dataset of primary production measurements for different riverine NO3 loads. Based on simulations over the range of observed springtime NO3 ...

  7. Application of Renyi entropy for ultrasonic molecular imaging

    PubMed Central

    Hughes, M. S.; Marsh, J. N.; Arbeit, J. M.; Neumann, R. G.; Fuhrhop, R. W.; Wallace, K. D.; Thomas, L.; Smith, J.; Agyem, K.; Lanza, G. M.; Wickline, S. A.; McCarthy, J. E.

    2009-01-01

    Previous work has demonstrated that a signal receiver based on a limiting form of the Shannon entropy is, in certain settings, more sensitive to subtle changes in scattering architecture than conventional energy-based signal receivers [M. S. Hughes et al., J. Acoust. Soc. Am. 121, 3542–3557 (2007)]. In this paper new results are presented demonstrating further improvements in sensitivity using a signal receiver based on the Renyi entropy. PMID:19425656
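
    A minimal, generic sketch of a Renyi-entropy statistic computed from the amplitude histogram of a windowed signal is given below; the receiver design, ultrasound imaging pipeline, and parameter choices of the paper are not reproduced, and the signals here are synthetic.

```python
# Renyi entropy H_alpha = log(sum p_i^alpha) / (1 - alpha), estimated from an
# amplitude histogram. Illustrative only; not the paper's receiver or data.
import numpy as np

def renyi_entropy(signal, alpha=3.0, bins=64):
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

rng = np.random.default_rng(3)
background = rng.normal(0, 1.0, 2048)                   # diffuse scattering only
with_target = background + 0.3 * np.sin(np.linspace(0, 40 * np.pi, 2048))
print("background H_3:", renyi_entropy(background))
print("with target H_3:", renyi_entropy(with_target))
```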

  8. Flight Test Comparison of Different Adaptive Augmentations for Fault Tolerant Control Laws for a Modified F-15 Aircraft

    NASA Technical Reports Server (NTRS)

    Burken, John J.; Hanson, Curtis E.; Lee, James A.; Kaneshige, John T.

    2009-01-01

    This report describes the improvements and enhancements to a neural network based approach for directly adapting to aerodynamic changes resulting from damage or failures. This research is a follow-on effort to flight tests performed on the NASA F-15 aircraft as part of the Intelligent Flight Control System research effort. Previous flight test results demonstrated the potential for performance improvement under destabilizing damage conditions. Little or no improvement was provided under simulated control surface failures, however, and the adaptive system was prone to pilot-induced oscillations. An improved controller was designed to reduce the occurrence of pilot-induced oscillations and increase robustness to failures in general. This report presents an analysis of the neural networks used in the previous flight test, the improved adaptive controller, and the baseline case with no adaptation. Flight test results demonstrate significant improvement in performance by using the new adaptive controller compared with the previous adaptive system and the baseline system for control surface failures.

  9. [The effect of self-reflection on depression mediated by hardiness].

    PubMed

    Nakajima, Miho; Hattori, Yosuke; Tanno, Yoshihiko

    2015-10-01

    Previous studies have shown that two types of private self-consciousness have opposing effects on depression: self-rumination, which has a maladaptive effect, and self-reflection, which has an adaptive effect. Although a number of studies have examined the mechanism of the maladaptive effect of self-rumination, only a few have examined the mechanism of the adaptive effect of self-reflection. The present study examined the process by which self-reflection affects depression adaptively. Based on previous findings, we proposed a hypothetical model assuming that hardiness acts as a mediator of self-reflection. To test the validity of the model, a structural equation modeling analysis was performed with cross-sectional data from 155 undergraduate students. The results suggest that the hypothetical model is valid. According to the present results and previous findings, self-reflection is associated with low levels of depression, mediated by "rich commitment", one component of hardiness.

  10. Ultrastructure of spermatozoa of spider crabs, family Mithracidae (Crustacea, Decapoda, Brachyura): Integrative analyses based on morphological and molecular data.

    PubMed

    Assugeni, Camila de O; Magalhães, Tatiana; Bolaños, Juan A; Tudge, Christopher C; Mantelatto, Fernando L; Zara, Fernando J

    2017-12-01

    Recent studies based on morphological and molecular data provide a new perspective concerning taxonomic aspects of the brachyuran family Mithracidae. These studies proposed a series of nominal changes and indicated that the family actually comprises a different number and composition of genera than previously thought. Here, we provide a comparative description of the ultrastructure of spermatozoa and spermatophores of some species of Mithracidae in a phylogenetic context. The ultrastructure of the spermatozoa and spermatophore was observed by scanning and transmission electron microscopy. The most informative morphological characters analysed were the thickness of the operculum, the shape of the perforatorial chamber, and the shape and thickness of the inner acrosomal zone. As a framework, we used a topology based on a phylogenetic analysis using mitochondrial data obtained here and from previous studies. Our results indicate that closely related species share a series of morphological characteristics of the spermatozoa. A thick operculum, for example, is a feature observed in species of the genera Amphithrax, Teleophrys, and Omalacantha, in contrast to the slender operculum observed in Mithraculus and Mithrax. Amphithrax and Teleophrys have a rhomboid perforatorial chamber, while Mithraculus, Mithrax, and Omalacantha show a wider, deltoid morphology. Furthermore, our results are in agreement with recently proposed taxonomic changes including the separation of the genera Mithrax (previously Damithrax), Amphithrax (previously Mithrax) and Mithraculus, and the synonymy of Mithrax caribbaeus with Mithrax hispidus. Overall, the spermiotaxonomy of these species of Mithracidae represents a novel set of data that corroborates the most recent taxonomic revision of the family and can be used in future taxonomic and phylogenetic studies within this family. © 2017 Wiley Periodicals, Inc.

  11. A new method for the prediction of combustion instability

    NASA Astrophysics Data System (ADS)

    Flanagan, Steven Meville

    This dissertation presents a new approach to the prediction of combustion instability in solid rocket motors. Previous attempts at developing computational tools to solve this problem have been largely unsuccessful, showing very poor agreement with experimental results and having little or no predictive capability. This is due primarily to deficiencies in the linear stability theory upon which these efforts have been based. Recent advances in linear instability theory by Flandro have demonstrated the importance of including unsteady rotational effects, previously considered negligible. Previous versions of the theory also neglected corrections to the unsteady flow field of the first order in the mean flow Mach number. This research explores the stability implications of extending the solution to include these corrections. Also, the corrected linear stability theory based upon a rotational unsteady flow field extended to first order in mean flow Mach number has been implemented in two computer programs developed for the Macintosh platform. A quasi one-dimensional version of the program has been developed which is based upon an approximate solution to the cavity acoustics problem. The three-dimensional program applies Green's Function Discretization (GFD) to the solution for the acoustic mode shapes and frequency. GFD is a recently developed numerical method for finding fully three-dimensional solutions for this class of problems. The analysis of complex motor geometries, previously a tedious and time-consuming task, has also been greatly simplified through the development of a drawing package designed specifically to facilitate the specification of typical motor geometries. The combination of the drawing package, improved acoustic solutions, and new analysis results in a tool that is capable of producing more accurate and meaningful predictions than have been possible in the past.

  12. A Novel WA-BPM Based on the Generalized Multistep Scheme in the Propagation Direction in the Waveguide

    NASA Astrophysics Data System (ADS)

    Ji, Yang; Chen, Hong; Tang, Hongwu

    2017-06-01

    A highly accurate wide-angle scheme, based on the generalized multistep scheme in the propagation direction, is developed for the finite-difference beam propagation method (FD-BPM). Compared with the previously presented method, simulations show that our method yields a more accurate solution and permits a much larger step size.

  13. A Randomized Controlled Trial of Acceptance-Based Behavior Therapy and Cognitive Therapy for Test Anxiety: A Pilot Study

    ERIC Educational Resources Information Center

    Brown, Lily A.; Forman, Evan M.; Herbert, James D.; Hoffman, Kimberly L.; Yuen, Erica K.; Goetter, Elizabeth M.

    2011-01-01

    Many university students suffer from test anxiety that is severe enough to impair performance. Given mixed efficacy results of previous cognitive-behavior therapy (CBT) trials and a theoretically driven rationale, an acceptance-based behavior therapy (ABBT) approach was compared to traditional CBT (i.e., Beckian cognitive therapy; CT) for the…

  14. The Impact of Trial Stage, Developer Involvement and International Transferability on Universal Social and Emotional Learning Programme Outcomes: A Meta-Analysis

    ERIC Educational Resources Information Center

    Wigelsworth, M.; Lendrum, A.; Oldfield, J.; Scott, A.; ten Bokkel, I.; Tate, K.; Emery, C.

    2016-01-01

    This study expands upon the extant prior meta-analytic literature by exploring previously theorised reasons for the failure of school-based, universal social and emotional learning (SEL) programmes to produce expected results. Eighty-nine studies reporting the effects of school-based, universal SEL programmes were examined for differential effects…

  15. The Interactive Effects of the Availability of Objectives and/or Rules on Computer-Based Learning: A Replication.

    ERIC Educational Resources Information Center

    Merrill, Paul F.; And Others

    To replicate and extend the results of a previous study, this project investigated the effects of behavioral objectives and/or rules on computer-based learning task performance. The 133 subjects were randomly assigned to an example-only, objective-example, rule example, or objective-rule example group. The availability of rules and/or objectives…

  16. Revisiting tests for neglected nonlinearity using artificial neural networks.

    PubMed

    Cho, Jin Seo; Ishida, Isao; White, Halbert

    2011-05-01

    Tests for neglected nonlinearity in regression based on artificial neural networks (ANNs) have so far been studied by separately analyzing the two ways in which the null of regression linearity can hold. This implies that the asymptotic behavior of general ANN-based tests for neglected nonlinearity is still an open question. Here we analyze a convenient ANN-based quasi-likelihood ratio statistic for testing neglected nonlinearity, paying careful attention to both components of the null. We derive the asymptotic null distribution under each component separately and analyze their interaction. Somewhat remarkably, it turns out that the previously known asymptotic null distribution for the type 1 case still applies, but under somewhat stronger conditions than previously recognized. We present Monte Carlo experiments corroborating our theoretical results and showing that standard methods can yield misleading inference when our new, stronger regularity conditions are violated.

  17. Separation of large DNA molecules by applying pulsed electric field to size exclusion chromatography-based microchip

    NASA Astrophysics Data System (ADS)

    Azuma, Naoki; Itoh, Shintaro; Fukuzawa, Kenji; Zhang, Hedong

    2018-02-01

    Through electrophoresis driven by a pulsed electric field, we succeeded in separating large DNA molecules with an electrophoretic microchip based on size exclusion chromatography (SEC), which was proposed in our previous study. The conditions of the pulsed electric field required to achieve the separation were determined by numerical analyses using our originally proposed separation model. From the numerical results, we succeeded in separating large DNA molecules (λ DNA and T4 DNA) within 1600 s, which was approximately half of that achieved under a direct electric field in our previous study. Our SEC-based electrophoresis microchip will be one of the effective tools to meet the growing demand of faster and more convenient separation of large DNA molecules, especially in the field of epidemiological research of infectious diseases.

  18. Common path in-line holography using enhanced joint object reference digital interferometers

    PubMed Central

    Kelner, Roy; Katz, Barak; Rosen, Joseph

    2014-01-01

    Joint object reference digital interferometer (JORDI) is a recently developed system capable of recording holograms of various types [Opt. Lett. 38(22), 4719 (2013)]. Presented here is a new enhanced system design that is based on the previous JORDI. While the previous JORDI has been based purely on diffractive optical elements, displayed on spatial light modulators, the present design incorporates an additional refractive objective lens, thus enabling hologram recording with improved resolution and increased system applicability. Experimental results demonstrate successful hologram recording for various types of objects, including transmissive, reflective, three-dimensional, phase and highly scattering objects. The resolution limit of the system is analyzed and experimentally validated. Finally, the suitability of JORDI for microscopic applications is verified as a microscope objective based configuration of the system is demonstrated. PMID:24663838

  19. A new graph-based method for pairwise global network alignment

    PubMed Central

    Klau, Gunnar W

    2009-01-01

    Background In addition to component-based comparative approaches, network alignments provide the means to study conserved network topology such as common pathways and more complex network motifs. Yet, unlike in classical sequence alignment, the comparison of networks becomes computationally more challenging, as most meaningful assumptions instantly lead to NP-hard problems. Most previous algorithmic work on network alignments is heuristic in nature. Results We introduce the graph-based maximum structural matching formulation for pairwise global network alignment. We relate the formulation to previous work and prove NP-hardness of the problem. Based on the new formulation we build upon recent results in computational structural biology and present a novel Lagrangian relaxation approach that, in combination with a branch-and-bound method, computes provably optimal network alignments. The Lagrangian algorithm alone is a powerful heuristic method, which produces solutions that are often near-optimal and – unlike those computed by pure heuristics – come with a quality guarantee. Conclusion Computational experiments on the alignment of protein-protein interaction networks and on the classification of metabolic subnetworks demonstrate that the new method is reasonably fast and has advantages over pure heuristics. Our software tool is freely available as part of the LISA library. PMID:19208162

  20. Retrieval evaluation and distance learning from perceived similarity between endomicroscopy videos.

    PubMed

    André, Barbara; Vercauteren, Tom; Buchner, Anna M; Wallace, Michael B; Ayache, Nicholas

    2011-01-01

    Evaluating content-based retrieval (CBR) is challenging because it requires an adequate ground-truth. When the available ground-truth is limited to textual metadata such as pathological classes, retrieval results can only be evaluated indirectly, for example in terms of classification performance. In this study we first present a tool to generate perceived similarity ground-truth that enables direct evaluation of endomicroscopic video retrieval. This tool uses a four-point Likert scale and collects subjective pairwise similarities perceived by multiple expert observers. We then evaluate against the generated ground-truth a previously developed dense bag-of-visual-words method for endomicroscopic video retrieval. Confirming the results of previous indirect evaluation based on classification, our direct evaluation shows that this method significantly outperforms several other state-of-the-art CBR methods. In a second step, we propose to improve the CBR method by learning an adjusted similarity metric from the perceived similarity ground-truth. By minimizing a margin-based cost function that differentiates similar and dissimilar video pairs, we learn a weight vector applied to the visual word signatures of videos. Using cross-validation, we demonstrate that the learned similarity distance is significantly better correlated with the perceived similarity than the original visual-word-based distance.

  1. A Direct Method to Extract Transient Sub-Gap Density of State (DOS) Based on Dual Gate Pulse Spectroscopy

    NASA Astrophysics Data System (ADS)

    Dai, Mingzhi; Khan, Karim; Zhang, Shengnan; Jiang, Kemin; Zhang, Xingye; Wang, Weiliang; Liang, Lingyan; Cao, Hongtao; Wang, Pengjun; Wang, Peng; Miao, Lijing; Qin, Haiming; Jiang, Jun; Xue, Lixin; Chu, Junhao

    2016-06-01

    Sub-gap density of states (DOS) is a key parameter affecting the electrical characteristics of semiconductor-based transistors in integrated circuits. Previous spectroscopy methodologies for DOS extraction include static methods, temperature-dependent spectroscopy, and photonic spectroscopy; however, they may introduce many assumptions, calculations, and temperature or optical effects into the intrinsic DOS distribution along the bandgap of the material. A direct and simpler method is developed to extract the DOS distribution of amorphous oxide-based thin-film transistors (TFTs) based on dual gate pulse spectroscopy (GPS), introducing fewer extrinsic factors, such as temperature effects and laborious numerical analysis, than conventional methods. From this direct measurement, the sub-gap DOS distribution shows a peak at the band-gap edge, with values on the order of 10^17-10^21 cm^-3 eV^-1, which is consistent with previous results. The results can be described by a model involving both Gaussian and exponential components. This tool is useful as a diagnostic for the electrical properties of oxide materials, and this study will benefit their modeling and the improvement of their electrical properties, and thus broaden their applications.

  2. Comparing DNS and Experiments of Subcritical Flow Past an Isolated Surface Roughness Element

    NASA Astrophysics Data System (ADS)

    Doolittle, Charles; Goldstein, David

    2009-11-01

    Results are presented from computational and experimental studies of subcritical roughness within a Blasius boundary layer. This work stems from discrepancies presented by Stephani and Goldstein (AIAA Paper 2009-585) where DNS results did not agree with hot-wire measurements. The near wake regions of cylindrical surface roughness elements corresponding to roughness-based Reynolds numbers Rek of about 202 are of specific concern. Laser-Doppler anemometry and flow visualization in water, as well as the same spectral DNS code used by Stephani and Goldstein are used to obtain both quantitative and qualitative comparisons with previous results. Conclusions regarding previous studies will be presented alongside discussion of current work including grid resolution studies and an examination of vorticity dynamics.

  3. Improved optical design of nontracking concentrators

    NASA Astrophysics Data System (ADS)

    Kwan, B. M.; Bannerot, R. B.

    1984-08-01

    Optical designs based on a two reflections or less criterion have been developed for one and two-facet trapezoidal concentrators. Collector designs resulting from this criterion have been evaluated with the aid of a ray-trace computer simulation which includes the effects of nonideal reflectors. Results indicate a marked increase in performance, particularly for the one-facet designs, as compared to the collectors previously designed with the one reflection or less criterion. A significant result is that when a proper accounting is made for the actual acceptance angle for the concentrators, the performances of the optimal one and two-facet designs become nearly identical, indicating that the previously held contention that improved performance could be achieved with multifaceted reflectors (geometrically approaching the compound parabolic shape) may be incorrect.

  4. A comparison in a youth population between those with and without a history of concussion using biomechanical reconstruction.

    PubMed

    Post, Andrew; Hoshizaki, T Blaine; Gilchrist, Michael D; Koncan, David; Dawson, Lauren; Chen, Wesley; Ledoux, Andrée-Anne; Zemek, Roger

    2017-04-01

    OBJECTIVE Concussion is a common topic of research as a result of the short- and long-term effects it can have on the affected individual. Of particular interest is whether previous concussions can lead to a biomechanical susceptibility, or vulnerability, to incurring further head injuries, particularly for youth populations. The purpose of this research was to compare the impact biomechanics of a concussive event in terms of acceleration and brain strains of 2 groups of youths: those who had incurred a previous concussion and those who had not. It was hypothesized that the youths with a history of concussion would have lower-magnitude biomechanical impact measures than those who had never suffered a previous concussion. METHODS Youths who had suffered a concussion were recruited from emergency departments across Canada. This pool of patients was then separated into 2 categories based on their history of concussion: those who had incurred 1 or more previous concussions, and those who had never suffered a concussion. The impact event that resulted in the brain injury was reconstructed biomechanically using computational, physical, and finite element modeling techniques. The output of the events was measured in biomechanical parameters such as energy, force, acceleration, and brain tissue strain to determine if those patients who had a previous concussion sustained a brain injury at lower magnitudes than those who had no previously reported concussion. RESULTS There was no biomechanical variable that could distinguish between the groups with and without a history of concussion. CONCLUSIONS The results suggest that there is no measurable biomechanical vulnerability to head impact related to a history of concussions in this youth population. This may be a reflection of the long time between the previous concussion and the one reconstructed in the laboratory, where such a long period has been associated with recovery from injury.

  5. Further study on Physaloptera clausa Rudolphi, 1819 (Spirurida: Physalopteridae) from the Amur hedgehog Erinaceus amurensis Schrenk (Eulipotyphla: Erinaceidae).

    PubMed

    Chen, Hui-Xia; Ju, Hui-Dong; Li, Yang; Li, Liang

    2017-12-20

    In the present study, light and scanning electron microscopy (SEM) were used to further study the detailed morphology of Physaloptera clausa Rudolphi, 1819, based on the material collected from the Amur hedgehog E. amurensis Schrenk in China. The results revealed a few previously unreported morphological features and some morphological and morphometric variability between our specimens and the previous studies. The present supplementary morphological characters and morphometric data could help us to recognize this species more accurately.

  6. On the usefulness of gradient information in multi-objective deformable image registration using a B-spline-based dual-dynamic transformation model: comparison of three optimization algorithms

    NASA Astrophysics Data System (ADS)

    Pirpinia, Kleopatra; Bosman, Peter A. N.; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja

    2015-03-01

    The use of gradient information is well known to be highly useful in single-objective optimization-based image registration methods. However, its usefulness has not yet been investigated for deformable image registration from a multi-objective optimization perspective. To this end, within a previously introduced multi-objective optimization framework, we use a smooth B-spline-based dual-dynamic transformation model that allows us to derive gradient information analytically, while still being able to account for large deformations. Within the multi-objective framework, we previously employed a powerful evolutionary algorithm (EA) that computes and advances multiple outcomes at once, resulting in a set of solutions (a so-called Pareto front) that represents efficient trade-offs between the objectives. With the addition of the B-spline-based transformation model, we studied the usefulness of gradient information in multi-objective deformable image registration using three different optimization algorithms: the (gradient-less) EA, a gradient-only algorithm, and a hybridization of these two. We evaluated the algorithms by registering highly deformed images: 2D MRI slices of the breast in prone and supine positions. Results demonstrate that gradient-based multi-objective optimization significantly speeds up the initial stages of optimization. However, given sufficient computational resources, better results could still be obtained with the EA. Ultimately, the hybrid EA found the best overall approximation of the optimal Pareto front, further indicating that adding gradient-based optimization to multi-objective deformable image registration can indeed be beneficial.

  7. Factors affecting and affected by user acceptance of computer-based nursing documentation: results of a two-year study.

    PubMed

    Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald

    2003-01-01

    The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items that had to be developed for the special purpose of this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: the previous acceptance of the nursing process and the previous amount of self-confidence when using computers. On one ward, the diverse acceptance scores heavily declined after the introduction of the nursing documentation system. Explorative qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.

  8. Support vector machine-based facial-expression recognition method combining shape and appearance

    NASA Astrophysics Data System (ADS)

    Han, Eun Jung; Kang, Byung Jun; Park, Kang Ryoung; Lee, Sangyoun

    2010-11-01

    Facial expression recognition can be widely used for various applications, such as emotion-based human-machine interaction, intelligent robot interfaces, face recognition robust to expression variation, etc. Previous studies have been classified as either shape- or appearance-based recognition. The shape-based method has the disadvantage that individual variance of the facial feature points exists even across similar expressions, which can reduce recognition accuracy. The appearance-based method has the limitation that the textural information of the face is very sensitive to variations in illumination. To overcome these problems, a new facial-expression recognition method is proposed, which combines both shape and appearance information, based on the support vector machine (SVM). This research is novel in the following three ways as compared to previous works. First, the facial feature points are automatically detected by using an active appearance model. From these, the shape-based recognition is performed by using the ratios between the facial feature points based on the facial-action coding system. Second, an SVM, which is trained to recognize same- and different-expression classes, is proposed to combine the two matching scores obtained from the shape- and appearance-based recognitions. Finally, a single SVM is trained to discriminate four different expressions: neutral, smile, anger, and scream. By determining the expression of the input facial image as the one whose SVM output is at a minimum, the accuracy of the expression recognition is much enhanced. The experimental results showed that the recognition accuracy of the proposed method was better than that of previous studies and other fusion methods.
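
    Below is a toy, hedged sketch of the score-fusion idea described above: an SVM is trained on pairs of (shape score, appearance score) to separate "same expression" from "different expression" pairs. The active appearance model features, FACS ratios, and datasets of the paper are not reproduced; the scores here are synthetic.

```python
# Toy sketch of SVM-based fusion of two matching scores (shape, appearance)
# into a same/different-expression decision; scores and labels are synthetic.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(8)
# Pretend same-expression pairs yield lower matching scores on both axes.
same_pairs = rng.normal([0.3, 0.3], 0.1, size=(200, 2))
diff_pairs = rng.normal([0.7, 0.7], 0.1, size=(200, 2))
X = np.vstack([same_pairs, diff_pairs])
y = np.array([0] * 200 + [1] * 200)        # 0 = same expression, 1 = different

clf = SVC(kernel='rbf').fit(X, y)

# A new pair of (shape score, appearance score) to classify.
query = np.array([[0.35, 0.40]])
print("predicted class:", int(clf.predict(query)[0]))
```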

  9. Adaptive Baseline Enhances EM-Based Policy Search: Validation in a View-Based Positioning Task of a Smartphone Balancer

    PubMed Central

    Wang, Jiexin; Uchibe, Eiji; Doya, Kenji

    2017-01-01

    EM-based policy search methods estimate a lower bound of the expected return from the histories of episodes and iteratively update the policy parameters using the maximum of that lower bound, which makes gradient calculation and learning-rate tuning unnecessary. Previous algorithms such as Policy learning by Weighting Exploration with the Returns, Fitness Expectation Maximization, and EM-based Policy Hyperparameter Exploration implemented mechanisms to discard useless low-return episodes either implicitly or using a fixed baseline determined by the experimenter. In this paper, we propose an adaptive baseline method to discard worse samples from the reward history and examine different baselines, including the mean and multiples of SDs from the mean. Simulation results on benchmark tasks of pendulum swing-up and cart-pole balancing, and on standing up and balancing of a two-wheeled smartphone robot, showed improved performance. We further implemented the adaptive baseline with the mean in our two-wheeled smartphone robot hardware to test its performance in the standing-up-and-balancing task and a view-based approaching task. Our results showed that with the adaptive baseline, the method outperformed the previous algorithms and achieved faster and more precise behaviors at a higher success rate. PMID:28167910
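
    The following is a simplified, self-contained sketch of the adaptive-baseline idea: episodes whose return falls below mean + c*SD of the current batch are discarded before a reward-weighted update of the policy-parameter mean. The toy quadratic task, batch size, and constants are invented for illustration and are not the paper's algorithms or robot tasks.

```python
# Simplified sketch of the adaptive-baseline idea: discard episodes whose
# return falls below mean + c*SD of the current batch, then perform a
# reward-weighted update of the policy-parameter mean. Toy problem only;
# not the paper's EM algorithms or robot tasks.
import numpy as np

rng = np.random.default_rng(4)
target = np.array([1.0, -2.0])       # unknown optimum of a toy quadratic task
theta = np.zeros(2)                  # mean of the exploring policy's parameters
sigma = 0.5                          # exploration noise
c = 0.0                              # baseline offset in SDs (c = 0 keeps above-mean episodes)

for _ in range(50):
    samples = theta + sigma * rng.normal(size=(30, 2))        # sampled parameter vectors
    returns = -np.sum((samples - target) ** 2, axis=1)        # episode returns
    baseline = returns.mean() + c * returns.std()             # adaptive baseline
    keep = returns > baseline                                 # discard worse samples
    weights = returns[keep] - baseline                        # non-negative weights
    theta = np.sum(weights[:, None] * samples[keep], axis=0) / weights.sum()

print("estimated optimum:", theta, "true optimum:", target)
```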

  10. Is the northern high latitude land-based CO2 sink weakening?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mcguire, David; Kicklighter, David W.; Gurney, Kevin R

    2011-01-01

    Studies indicate that, historically, terrestrial ecosystems of the northern high latitude region may have been responsible for up to 60% of the global net land-based sink for atmospheric CO2. However, these regions have recently experienced remarkable modification of the major driving forces of the carbon cycle, including surface air temperature warming that is significantly greater than the global average and associated increases in the frequency and severity of disturbances. Whether arctic tundra and boreal forest ecosystems will continue to sequester atmospheric CO2 in the face of these dramatic changes is unknown. Here we show the results of model simulations that estimate a 41 Tg C yr-1 sink in the boreal land regions from 1997 to 2006, which represents a 73% reduction in the strength of the sink estimated for previous decades in the late 20th Century. Our results suggest that CO2 uptake by the region in previous decades may not be as strong as previously estimated. The recent decline in sink strength is the combined result of 1) weakening sinks due to warming-induced increases in soil organic matter decomposition and 2) strengthening sources from pyrogenic CO2 emissions as a result of the substantial area of boreal forest burned in wildfires across the region in recent years. Such changes create positive feedbacks to the climate system that accelerate global warming, putting further pressure on emission reductions to achieve atmospheric stabilization targets.

  11. Is the northern high-latitude land-based CO2 sink weakening?

    USGS Publications Warehouse

    Hayes, D.J.; McGuire, A.D.; Kicklighter, D.W.; Gurney, K.R.; Burnside, T.J.; Melillo, J.M.

    2011-01-01

    Studies indicate that, historically, terrestrial ecosystems of the northern high-latitude region may have been responsible for up to 60% of the global net land-based sink for atmospheric CO2. However, these regions have recently experienced remarkable modification of the major driving forces of the carbon cycle, including surface air temperature warming that is significantly greater than the global average and associated increases in the frequency and severity of disturbances. Whether Arctic tundra and boreal forest ecosystems will continue to sequester atmospheric CO2 in the face of these dramatic changes is unknown. Here we show the results of model simulations that estimate a 41 Tg C yr-1 sink in the boreal land regions from 1997 to 2006, which represents a 73% reduction in the strength of the sink estimated for previous decades in the late 20th century. Our results suggest that CO2 uptake by the region in previous decades may not be as strong as previously estimated. The recent decline in sink strength is the combined result of (1) weakening sinks due to warming-induced increases in soil organic matter decomposition and (2) strengthening sources from pyrogenic CO2 emissions as a result of the substantial area of boreal forest burned in wildfires across the region in recent years. Such changes create positive feedbacks to the climate system that accelerate global warming, putting further pressure on emission reductions to achieve atmospheric stabilization targets. Copyright 2011 by the American Geophysical Union.

  12. Shallow structure of the Somma Vesuvius volcano from 3D inversion of gravity data

    NASA Astrophysics Data System (ADS)

    Cella, Federico; Fedi, Maurizio; Florio, Giovanni; Grimaldi, Marino; Rapolla, Antonio

    2007-04-01

    A gravity investigation was carried out in the Somma-Vesuvius complex area (Campania, Italy) based on a dataset recently enlarged with new measurements. These cover the top of the volcano and fill some other important spatial gaps in previous surveys. Besides the new gravity map of Vesuvius, we also present the results of a 3D inverse modelling carried out using constraints from deep exploration wells and seismic reflection surveys. The resulting density model provides a complete reconstruction of the top of the carbonate basement. This is most relevant on the western side of the survey area, where no significant information was previously available. Other new information concerns the Somma-Vesuvius structure itself: an annular volume of rocks around the volcanic vent that extends down to the carbonate basement. This volume is denser than the surrounding sedimentary cover of the Campanian Plain and than the material located along the central axis of the volcanic structure. The coherence between these features and other geophysical evidence from previous studies is discussed together with the other results of this research.

  13. Water quality assessment with hierarchical cluster analysis based on Mahalanobis distance.

    PubMed

    Du, Xiangjun; Shao, Fengjing; Wu, Shunyao; Zhang, Hanlin; Xu, Si

    2017-07-01

    Water quality assessment is crucial for the assessment of marine eutrophication, the prediction of harmful algal blooms, and environmental protection. Previous studies have developed many numerical modeling methods and data-driven approaches for water quality assessment. Cluster analysis, an approach widely used for grouping data, has also been employed. However, there are complex correlations between water quality variables, which play important roles in water quality assessment but have largely been overlooked. In this paper, we analyze correlations between water quality variables and propose an alternative method for water quality assessment using hierarchical cluster analysis based on Mahalanobis distance. Further, we cluster water quality data collected from coastal waters of the Bohai Sea and the North Yellow Sea of China, and apply the clustering results to evaluate their water quality. To evaluate its validity, we also cluster the water quality data with cluster analysis based on Euclidean distance, which is widely adopted in previous studies. The results show that our method is more suitable for water quality assessment with many correlated water quality variables. To our knowledge, this is the first attempt to apply Mahalanobis distance to coastal water quality assessment.
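
    A hedged sketch of the clustering step using SciPy is shown below; the synthetic two-variable data stand in for the paper's coastal water-quality measurements, which are not reproduced.

```python
# Hedged sketch: hierarchical clustering with Mahalanobis distance via SciPy.
# The data here are synthetic; the paper's coastal water-quality dataset and
# its assessment step are not reproduced.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
# Toy samples with two correlated "water quality" variables.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
group_a = rng.multivariate_normal([0.0, 0.0], cov, size=30)
group_b = rng.multivariate_normal([3.0, 3.0], cov, size=30)
X = np.vstack([group_a, group_b])

# Mahalanobis distance uses the inverse covariance matrix, so correlated
# variables are not double-counted as they are with Euclidean distance.
VI = np.linalg.inv(np.cov(X, rowvar=False))
condensed = pdist(X, metric='mahalanobis', VI=VI)
Z = linkage(condensed, method='average')
clusters = fcluster(Z, t=2, criterion='maxclust')
print("cluster sizes:", np.bincount(clusters)[1:])
```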

  14. Optimal Growth in Hypersonic Boundary Layers

    NASA Technical Reports Server (NTRS)

    Paredes, Pedro; Choudhari, Meelan M.; Li, Fei; Chang, Chau-Lyan

    2016-01-01

    The linear form of the parabolized linear stability equations is used in a variational approach to extend the previous body of results for the optimal, nonmodal disturbance growth in boundary-layer flows. This paper investigates the optimal growth characteristics in the hypersonic Mach number regime without any high-enthalpy effects. The influence of wall cooling is studied, with particular emphasis on the role of the initial disturbance location and the value of the spanwise wave number that leads to the maximum energy growth up to a specified location. Unlike previous predictions that used a basic state obtained from a self-similar solution to the boundary-layer equations, mean flow solutions based on the full Navier-Stokes equations are used in select cases to help account for the viscous-inviscid interaction near the leading edge of the plate and for the weak shock wave emanating from that region. Using the full Navier-Stokes mean flow is shown to result in further reduction with Mach number in the magnitude of optimal growth relative to the predictions based on the self-similar approximation to the base flow.

  15. Memory and disgust: Effects of appearance-congruent and appearance-incongruent information on source memory for food.

    PubMed

    Mieth, Laura; Bell, Raoul; Buchner, Axel

    2016-01-01

    The present study was stimulated by previous findings showing that people preferentially remember person descriptions that violate appearance-based first impressions. Given that until now all studies used faces as stimuli, these findings can be explained by referring to a content-specific module for social information processing that facilitates social orientation within groups via stereotyping and counter-stereotyping. The present study tests whether the same results can be obtained with fitness-relevant stimuli from another domain--pictures of disgusting-looking or tasty-looking food, paired with tasty and disgusting descriptions. A multinomial model was used to disentangle item memory, guessing and source memory. There was an old-new recognition advantage for disgusting-looking food. People had a strong tendency towards guessing that disgusting-looking food had been previously associated with a disgusting description. Source memory was enhanced for descriptions that disconfirmed these negative, appearance-based impressions. These findings parallel the results from the social domain. Heuristic processing of stimuli based on visual appearance may be complemented by intensified processing of incongruent information that invalidates these first impressions.

  16. Wavelet based detection of manatee vocalizations

    NASA Astrophysics Data System (ADS)

    Gur, Berke M.; Niezrecki, Christopher

    2005-04-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of watercraft collisions in Florida's coastal waterways. Several boater warning systems, based upon manatee vocalizations, have been proposed to reduce the number of collisions. Three detection methods based on the Fourier transform (threshold, harmonic content and autocorrelation methods) were previously suggested and tested. In the last decade, the wavelet transform has emerged as an alternative to the Fourier transform and has been successfully applied in various fields of science and engineering including the acoustic detection of dolphin vocalizations. As of yet, no prior research has been conducted in analyzing manatee vocalizations using the wavelet transform. Within this study, the wavelet transform is used as an alternative to the Fourier transform in detecting manatee vocalizations. The wavelet coefficients are analyzed and tested against a specified criterion to determine the existence of a manatee call. The performance of the method presented is tested on the same data previously used in the prior studies, and the results are compared. Preliminary results indicate that using the wavelet transform as a signal processing technique to detect manatee vocalizations shows great promise.
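
    As a generic, hedged sketch of the wavelet-based detection idea, the snippet below decomposes a frame with a discrete wavelet transform (via PyWavelets, an assumed dependency) and flags it when the energy of the highest-frequency detail coefficients exceeds a noise-calibrated threshold; the wavelet family, band, threshold, and synthetic "call" are illustrative choices, not the study's detector or data.

```python
# Generic wavelet-based detection sketch: flag a frame when the energy of
# its highest-frequency detail coefficients exceeds a noise-calibrated
# threshold. Wavelet family, band, and threshold are illustrative choices,
# not the study's actual detector.
import numpy as np
import pywt

def band_energy(frame, wavelet='db4', level=4):
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    return float(np.sum(coeffs[-1] ** 2))   # cD1: roughly 2-4 kHz at fs = 8 kHz

rng = np.random.default_rng(6)
fs = 8000
t = np.arange(fs) / fs
noise = rng.normal(0, 0.1, fs)                                  # noise-only frame
call = noise + 0.5 * np.sin(2 * np.pi * 3000 * t) * ((t > 0.4) & (t < 0.6))

threshold = 2.0 * band_energy(noise)       # calibrated on the noise-only frame
print("noise frame flagged:", band_energy(noise) > threshold)
print("call frame flagged:", band_energy(call) > threshold)
```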

  17. Sickle-Cell Disease in Nigerian Children: Parental Knowledge and Laboratory Results.

    PubMed

    Obaro, Stephen K; Daniel, Yvonne; Lawson, Juliana O; Hsu, Wei-Wei; Dada, John; Essen, Uduak; Ibrahim, Khalid; Akindele, Adebayo; Brooks, Kevin; Olanipekun, Grace; Ajose, Theresa; Stewart, Claire E; Inusa, Baba P D

    2016-01-01

    Sickle-cell disease (SCD) is the most common inherited genetic disorder in sub-Saharan Africa, and it is associated with early mortality and lifelong morbidity. Early diagnosis is essential for instituting appropriate care and preventive therapy. The aim was to compare parental knowledge or perception of their offspring's hemoglobin phenotype prior to testing with actual validated laboratory test results. In a prospective community-based survey, we assessed parental knowledge of their children's hemoglobin phenotype and corroborated this with the results from a laboratory confirmatory test determined by high-performance liquid chromatography. We screened 10,126 children aged less than 5 years. A total of 163 (1.6%) parents indicated that their offspring had been previously tested and had knowledge of the child's hemoglobin genotype. However, 51 (31.2%) of 163 parents of children who had been previously tested did not know the result of their offspring's test, and 18 (35.3%) of these 51 children were found to have SCD. Of those who claimed previous knowledge, 25 (15.3%) of 163 reported incorrect results. Overall, we identified 272 (2.76%) new cases from 9,963 children who had not been previously tested. There is a need to promote public awareness of SCD and the benefits of early diagnosis, quality assurance in laboratory diagnosis, and the institution of sustainable patient care pathways. © 2016 S. Karger AG, Basel.

  18. Variables Affecting Pharmacy Students’ Patient Care Interventions during Advanced Pharmacy Practice Experiences

    PubMed Central

    Patterson, Brandon J.; Sen, Sanchita; Bingham, Angela L.; Bowen, Jane F.; Ereshefsky, Benjamin; Siemianowski, Laura A.

    2016-01-01

    Objective. To identify the temporal effect and factors associated with student pharmacist self-initiation of interventions during acute patient care advanced pharmacy practice experiences (APPE). Methods. During the APPE, student pharmacists at an academic medical center recorded their therapeutic interventions and who initiated the intervention throughout clinical rotations. At the end of the APPE student pharmacists completed a demographic survey. Results. Sixty-two student pharmacists were included. Factors associated with lower rates of self-initiated interventions were infectious diseases and pediatrics APPEs and an intention to pursue a postgraduate residency. Timing of the APPE, previous specialty elective course completion, and previous hospital experience did not result in any significant difference in self-initiated recommendations. Conclusion. Preceptors should not base practice experience expectations for self-initiated interventions on previous student experience or future intentions. Additionally, factors leading to lower rates of self-initiated interventions on infectious diseases or pediatrics APPEs should be explored. PMID:27756924

  19. The systematic position and structure of the genus Leyogonimus Ginetsinskaya, 1948 (Platyhelminthes: Digenea) with comments on the taxonomy of the superfamily Microphalloidea Ward, 1901.

    PubMed

    Kanarek, Gerard; Zaleśny, Grzegorz; Sitko, Jiljí; Tkach, Vasyl V

    2017-09-26

    The systematic position, phylogenetic relationships and composition of the genus Leyogonimus Ginetsinskaya, 1948 have always been uncertain. In the present study, we investigate the taxonomic position and phylogenetic relationships between the type-species L. polyoon (Linstow, 1887) and L. postgonoporus (Neiland, 1951) (previously classified as Macyella), based on newly obtained partial sequences of the nuclear large ribosomal subunit DNA. To test some of the previously proposed systematic arrangements, we have also sequenced specimens of Stomylotrema vicarium Braun, 1901 and Phaneropsolus sp. Our results clearly demonstrate that both L. polyoon and L. postgonoporus belong to the family Pleurogenidae Looss, 1899 within the superfamily Microphalloidea. Thus, the Leyogonimidae Dollfus, 1951 should be recognized as a synonym of the Pleurogenidae. Leyogonimus polyoon clearly constitutes a separate, sister branch to the clade consisting of Collyricloides massanae Vaucher, 1969 and L. postgonoporus. Based on these results, we resurrect the genus Macyella Neiland, 1951 with type-species M. postgonoporus. In addition, Collyricloides Vaucher, 1968 is synonymized with Macyella, resulting in the new combination Macyella massanae (Vaucher, 1968) comb. nov. Molecular phylogenetic analysis has demonstrated the lack of a close phylogenetic relationship between Stomylotrema vicarium and Leyogonimus, both previously placed by several authors into the family Stomylotrematidae Poche, 1925. The status of the Phaneropsolidae Mehra, 1935 as an independent family was confirmed with the addition of the newly sequenced Phaneropsolus sp. from China.

  20. The exact solution of the monoenergetic transport equation for critical cylinders

    NASA Technical Reports Server (NTRS)

    Westfall, R. M.; Metcalf, D. R.

    1972-01-01

    An analytic solution for the critical, monoenergetic, bare, infinite cylinder is presented. The solution is obtained by modifying a previous development based on a neutron density transform and Case's singular eigenfunction method. Numerical results for critical radii and the neutron density as a function of position are included and compared with the results of other methods.

  1. In vitro fertilization (IVF) from low or high antral follicle count pubertal beef heifers using semi-defined culture conditions

    USDA-ARS?s Scientific Manuscript database

    Antral follicle counts (AFC) vary among pubertal beef heifers. Our objective was to compare the in vitro maturation and fertilization of oocytes collected from low and high AFC heifers. Previously we reported results using serum-based IVF media and in this study report results using semi-defined m...

  2. Estimating Past Temperature Change in Antarctica Based on Ice Core Stable Water Isotope Diffusion

    NASA Astrophysics Data System (ADS)

    Kahle, E. C.; Markle, B. R.; Holme, C.; Jones, T. R.; Steig, E. J.

    2017-12-01

    The magnitude of the last glacial-interglacial transition is a key target for constraining climate sensitivity on long timescales. Ice core proxy records and general circulation models (GCMs) both provide insight on the magnitude of climate change through the last glacial-interglacial transition, but appear to provide different answers. In particular, the magnitude of the glacial-interglacial temperature change reconstructed from East Antarctic ice-core water-isotope records is greater (~9 degrees C) than that from most GCM simulations (~6 degrees C). A possible source of this difference is error in the linear-scaling of water isotopes to temperature. We employ a novel, nonlinear temperature-reconstruction technique using the physics of water-isotope diffusion to infer past temperature. Based on new ice-core data from the South Pole, this diffusion technique suggests East Antarctic temperature change was smaller than previously thought. We are able to confirm this result using a simple, water-isotope fractionation model to nonlinearly reconstruct temperature change at ice core locations across Antarctica based on combined oxygen and hydrogen isotope ratios. Both methods produce a temperature change of ~6 degrees C for South Pole, agreeing with GCM results for East Antarctica. Furthermore, both produce much larger changes in West Antarctica, also in agreement with GCM results and independent borehole thermometry. These results support the fidelity of GCMs in simulating last glacial maximum climate, and contradict the idea, based on previous work, that the climate sensitivity of current GCMs is too low.

  3. Loran Automatic Vehicle Monitoring System, Phase I : Volume 2. Appendices.

    DOT National Transportation Integrated Search

    1977-08-01

    Presents results of the evaluation phase of a two phase program to develop an Automatic Vehicle Monitoring (AVM) system for the Southern California Rapid Transit District in Los Angeles, California. Tests were previously conducted on a Loran based lo...

  4. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers from many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the necessity of strengthening and revising theory with empirical data.
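
    The decision rule behind this approach is simple: a theory-derived point prediction for an effect size counts as confirmed when it falls inside the confidence interval estimated from the data. The sketch below illustrates only that check under stated assumptions (a correlation-type effect size, a Fisher z interval, and placeholder numbers); it is not the authors' actual procedure or data.

    ```python
    import math

    def confirms_prediction(predicted_r, observed_r, n):
        """Check whether a predicted correlation-type effect size falls inside
        the 95% confidence interval around the observed effect (Fisher z method)."""
        z_obs = math.atanh(observed_r)      # Fisher z-transform of the observed r
        se = 1.0 / math.sqrt(n - 3)         # standard error of z for sample size n
        lo, hi = z_obs - 1.96 * se, z_obs + 1.96 * se
        lo_r, hi_r = math.tanh(lo), math.tanh(hi)   # back-transform to the r scale
        return lo_r <= predicted_r <= hi_r, (lo_r, hi_r)

    # Hypothetical example: predicted r = 0.28, observed r = 0.27 in N = 5407 participants
    confirmed, ci = confirms_prediction(0.28, 0.27, 5407)
    print(confirmed, ci)
    ```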

  5. Synchronization stability of memristor-based complex-valued neural networks with time delays.

    PubMed

    Liu, Dan; Zhu, Song; Ye, Er

    2017-12-01

    This paper focuses on the dynamical property of a class of memristor-based complex-valued neural networks (MCVNNs) with time delays. By constructing the appropriate Lyapunov functional and utilizing the inequality technique, sufficient conditions are proposed to guarantee exponential synchronization of the coupled systems based on drive-response concept. The proposed results are very easy to verify, and they also extend some previous related works on memristor-based real-valued neural networks. Meanwhile, the obtained sufficient conditions of this paper may be conducive to qualitative analysis of some complex-valued nonlinear delayed systems. A numerical example is given to demonstrate the effectiveness of our theoretical results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Structural Mechanism behind Distinct Efficiency of Oct4/Sox2 Proteins in Differentially Spaced DNA Complexes

    PubMed Central

    Yesudhas, Dhanusha; Anwar, Muhammad Ayaz; Panneerselvam, Suresh; Durai, Prasannavenkatesh; Shah, Masaud; Choi, Sangdun

    2016-01-01

    The octamer-binding transcription factor 4 (Oct4) and sex-determining region Y (SRY)-box 2 (Sox2) proteins induce various transcriptional regulators to maintain cellular pluripotency. Most Oct4/Sox2 complexes have either 0 base pairs (Oct4/Sox2-0bp) or 3 base pairs (Oct4/Sox2-3bp) separation between their DNA-binding sites. Results from previous biochemical studies have shown that the complexes separated by 0 base pairs are associated with a higher pluripotency rate than those separated by 3 base pairs. Here, we performed molecular dynamics (MD) simulations and calculations to determine the binding free energy and per-residue free energy for the Oct4/Sox2-0bp and Oct4/Sox2-3bp complexes to identify structural differences that contribute to differences in induction rate. Our MD simulation results showed substantial differences in Oct4/Sox2 domain movements, as well as secondary-structure changes in the Oct4 linker region, suggesting a potential reason underlying the distinct efficiencies of these complexes during reprogramming. Moreover, we identified key residues and hydrogen bonds that potentially facilitate protein-protein and protein-DNA interactions, in agreement with previous experimental findings. Consequently, our results suggest that differential spacing of the Oct4/Sox2 DNA binding sites can determine the magnitude of transcription of the targeted genes during reprogramming. PMID:26790000

  7. VHDL-AMS modelling and simulation of a planar electrostatic micromotor

    NASA Astrophysics Data System (ADS)

    Endemaño, A.; Fourniols, J. Y.; Camon, H.; Marchese, A.; Muratet, S.; Bony, F.; Dunnigan, M.; Desmulliez, M. P. Y.; Overton, G.

    2003-09-01

    System level simulation results of a planar electrostatic micromotor, based on analytical models of the static and dynamic torque behaviours, are presented. A planar variable capacitance (VC) electrostatic micromotor designed, fabricated and tested at LAAS (Toulouse) in 1995 is simulated using the high level language VHDL-AMS (VHSIC (very high speed integrated circuits) hardware description language-analog mixed signal). The analytical torque model is obtained by first calculating the overlaps and capacitances between different electrodes based on a conformal mapping transformation. Capacitance values on the order of 10^-16 F and torque values on the order of 10^-11 N m have been calculated in agreement with previous measurements and simulations from this type of motor. A dynamic model has been developed for the motor by calculating the inertia coefficient and estimating the friction coefficient based on values calculated previously for other similar devices. Starting voltage results obtained from experimental measurement are in good agreement with our proposed simulation model. Simulation results of starting voltage values, step response, switching response and continuous operation of the micromotor, based on the dynamic model of the torque, are also presented. Four VHDL-AMS blocks were created, validated and simulated for power supply, excitation control, micromotor torque creation and micromotor dynamics. These blocks can be considered as the initial phase towards the creation of intellectual property (IP) blocks for microsystems in general and electrostatic micromotors in particular.

  8. Coherent lidar wind measurements from the Space Station base using 1.5 m all-reflective optics

    NASA Technical Reports Server (NTRS)

    Bilbro, J. W.; Beranek, R. G.

    1987-01-01

    This paper discusses the space-based measurement of atmospheric winds from the point of view of the requirements of the optical system of a coherent CO2 lidar. A brief description of the measurement technique is given and a discussion of previous study results provided. The telescope requirements for a Space Station based lidar are arrived at through discussions of the desired system sensitivity and the need for lag angle compensation.

  9. Use of A-Train Aerosol Observations to Constrain Direct Aerosol Radiative Effects (DARE) Comparisons with Aerocom Models and Uncertainty Assessments

    NASA Technical Reports Server (NTRS)

    Redemann, J.; Shinozuka, Y.; Kacenelenbogen, M.; Segal-Rozenhaimer, M.; LeBlanc, S.; Vaughan, M.; Stier, P.; Schutgens, N.

    2017-01-01

    We describe a technique for combining multiple A-Train aerosol data sets, namely MODIS spectral AOD (aerosol optical depth), OMI AAOD (absorption aerosol optical depth) and CALIOP aerosol backscatter retrievals (hereafter referred to as MOC retrievals) to estimate full spectral sets of aerosol radiative properties, and ultimately to calculate the 3-D distribution of direct aerosol radiative effects (DARE). We present MOC results using almost two years of data collected in 2007 and 2008, and show comparisons of the aerosol radiative property estimates to collocated AERONET retrievals. Use of the MODIS Collection 6 AOD data derived with the dark target and deep blue algorithms has extended the coverage of the MOC retrievals towards higher latitudes. The MOC aerosol retrievals agree better with AERONET in terms of the single scattering albedo (ssa) at 441 nm than ssa calculated from OMI and MODIS data alone, indicating that CALIOP aerosol backscatter data contains information on aerosol absorption. We compare the spatio-temporal distribution of the MOC retrievals and MOC-based calculations of seasonal clear-sky DARE to values derived from four models that participated in the Phase II AeroCom model intercomparison initiative. Overall, the MOC-based calculations of clear-sky DARE at TOA over land are smaller (less negative) than previous model or observational estimates due to the inclusion of more absorbing aerosol retrievals over brighter surfaces, not previously available for observationally-based estimates of DARE. MOC-based DARE estimates at the surface over land and total (land and ocean) DARE estimates at TOA are in between previous model and observational results. Comparisons of seasonal aerosol properties to AeroCom Phase II results show generally good agreement; the best agreement with forcing results at TOA is found with GMI-MerraV3. We discuss sampling issues that affect the comparisons and the major challenges in extending our clear-sky DARE results to all-sky conditions. We present estimates of clear-sky and all-sky DARE and show uncertainties that stem from the assumptions in the spatial extrapolation and accuracy of aerosol and cloud properties, in the diurnal evolution of these properties, and in the radiative transfer calculations.

  10. Temperature and heat flux changes at the base of Laurentide ice sheet inferred from geothermal data (evidence from province of Alberta, Canada)

    NASA Astrophysics Data System (ADS)

    Demezhko, Dmitry; Gornostaeva, Anastasia; Majorowicz, Jacek; Šafanda, Jan

    2018-01-01

    Using a previously published temperature log of the 2363-m-deep borehole Hunt well (Alberta, Canada) and the results of its previous interpretation, new reconstructions of ground surface temperature and surface heat flux histories for the last 30 ka have been obtained. Two ways to adjust the timescale of geothermal reconstructions are discussed, namely the traditional method based on a priori data on the thermal diffusivity value, and an alternative one based on orbital tuning of the surface heat flux to the Earth's insolation changes. It is shown that the second approach provides better agreement between geothermal reconstructions and proxy evidence of deglaciation chronology in the studied region.

  11. On the dynamic nature of response criterion in recognition memory: effects of base rate, awareness, and feedback.

    PubMed

    Rhodes, Matthew G; Jacoby, Larry L

    2007-03-01

    The authors examined whether participants can shift their criterion for recognition decisions in response to the probability that an item was previously studied. Participants in 3 experiments were given recognition tests in which the probability that an item was studied was correlated with its location during the test. Results from all 3 experiments indicated that participants' response criteria were sensitive to the probability that an item was previously studied and that shifts in criterion were robust. In addition, awareness of the bases for criterion shifts and feedback on performance were key factors contributing to the observed shifts in decision criteria. These data suggest that decision processes can operate in a dynamic fashion, shifting from item to item.

  12. HLPI-Ensemble: Prediction of human lncRNA-protein interactions based on ensemble strategy.

    PubMed

    Hu, Huan; Zhang, Li; Ai, Haixin; Zhang, Hui; Fan, Yetian; Zhao, Qi; Liu, Hongsheng

    2018-03-27

    LncRNAs play important roles in many biological processes and in disease progression by binding to related proteins. However, the experimental methods for studying lncRNA-protein interactions are time-consuming and expensive. Although a few models have been designed to predict ncRNA-protein interactions, they all have some common drawbacks that limit their predictive performance. In this study, we present a model called HLPI-Ensemble designed specifically for human lncRNA-protein interactions. HLPI-Ensemble adopts an ensemble strategy based on three mainstream machine learning algorithms, Support Vector Machines (SVM), Random Forests (RF) and Extreme Gradient Boosting (XGB), to generate HLPI-SVM Ensemble, HLPI-RF Ensemble and HLPI-XGB Ensemble, respectively. The results of 10-fold cross-validation show that HLPI-SVM Ensemble, HLPI-RF Ensemble and HLPI-XGB Ensemble achieved AUCs of 0.95, 0.96 and 0.96, respectively, in the test dataset. Furthermore, we compared the performance of the HLPI-Ensemble models with the previous models on an external validation dataset. The results show that the false positives (FPs) of the HLPI-Ensemble models are much lower than those of the previous models, and other evaluation indicators of the HLPI-Ensemble models are also higher than those of the previous models. These results further show that the HLPI-Ensemble models are superior to previous models in predicting human lncRNA-protein interactions. The HLPI-Ensemble is publicly available at: http://ccsipb.lnu.edu.cn/hlpiensemble/ .
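
    The model-level recipe described here, three mainstream classifiers each scored with 10-fold cross-validated AUC, can be sketched generically. The snippet below uses placeholder feature and label arrays and a simple soft-voting combination; it is an illustration of that recipe, not the authors' HLPI-Ensemble code or their lncRNA-protein feature encoding.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.model_selection import cross_val_score
    from xgboost import XGBClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))        # placeholder lncRNA-protein pair features
    y = rng.integers(0, 2, size=500)      # placeholder interaction labels

    base_models = [
        ("svm", SVC(probability=True)),                                      # HLPI-SVM analogue
        ("rf", RandomForestClassifier(n_estimators=300, random_state=0)),    # HLPI-RF analogue
        ("xgb", XGBClassifier(n_estimators=300, eval_metric="logloss")),     # HLPI-XGB analogue
    ]

    for name, model in base_models:
        auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc").mean()
        print(f"{name}: 10-fold CV AUC = {auc:.3f}")

    # One simple way to combine the three learners: probability averaging (soft voting)
    ensemble = VotingClassifier(estimators=base_models, voting="soft")
    print("ensemble:", cross_val_score(ensemble, X, y, cv=10, scoring="roc_auc").mean())
    ```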

  13. Quasispecies dynamics on a network of interacting genotypes and idiotypes: formulation of the model

    NASA Astrophysics Data System (ADS)

    Barbosa, Valmir C.; Donangelo, Raul; Souza, Sergio R.

    2015-01-01

    A quasispecies is the stationary state of a set of interrelated genotypes that evolve according to the usual principles of selection and mutation. Quasispecies studies have for the most part concentrated on the possibility of errors during genotype replication and their role in promoting either the survival or the demise of the quasispecies. In a previous work, we introduced a network model of quasispecies dynamics, based on a single probability parameter (p) and capable of addressing several plausibility issues of previous models. Here we extend that model by pairing its network with another one aimed at modeling the dynamics of the immune system when confronted with the quasispecies. The new network is based on the idiotypic-network model of immunity and, together with the previous one, constitutes a network model of interacting genotypes and idiotypes. The resulting model requires further parameters and as a consequence leads to a vast phase space. We have focused on a particular niche in which it is possible to observe the trade-offs involved in the quasispecies' survival or destruction. Within this niche, we give simulation results that highlight some key preconditions for quasispecies survival. These include a minimum initial abundance of genotypes relative to that of the idiotypes and a minimum value of p. The latter, in particular, is to be contrasted with the stand-alone quasispecies network of our previous work, in which arbitrarily low values of p constitute a guarantee of quasispecies survival.

  14. Design of Composite Structures Using Knowledge-Based and Case Based Reasoning

    NASA Technical Reports Server (NTRS)

    Lambright, Jonathan Paul

    1996-01-01

    A method of using knowledge based and case based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues into the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A Knowledge Based Reasoning system was developed using the CLIPS (C Language Integrated Production System) environment and a Case Based Reasoning System was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge based and case based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs and warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem solving capability beyond the existence of limited well defined rules. The findings indicated that the technique is most effective when used as a design aid and not as a tool to totally automate the composites design process. Other areas of application and implications for future research are discussed.

  15. Synchrotron Protein Footprinting Supports Substrate Translocation by ClpA via ATP-Induced Movements of the D2 Loop

    PubMed Central

    Bohon, Jen; Jennings, Laura D.; Phillips, Christine M.; Licht, Stuart; Chance, Mark R.

    2010-01-01

    Synchrotron x-ray protein footprinting is used to study structural changes upon formation of the ClpA hexamer. Comparative solvent accessibilities between ClpA monomer and ClpA hexamer samples are in agreement throughout most of the sequence with calculations based on two previously proposed hexameric models. The data differ substantially from the proposed models in two parts of the structure: the D1 sensor 1 domain and the D2 loop region. The results suggest that these two regions can access alternate conformations in which their solvent protection is greater than in the structural models based on crystallographic data. In combination with previously reported structural data, the footprinting data provide support for a revised model in which the D2 loop contacts the D1 sensor 1 domain in the ATP-bound form of the complex. These data provide the first direct experimental support for the nucleotide-dependent D2 loop conformational change previously proposed to mediate substrate translocation. PMID:18682217

  16. Previous antiretroviral therapy for prevention of mother-to-child transmission of HIV does not hamper the initial response to PI-based multitherapy during subsequent pregnancy.

    PubMed

    Briand, Nelly; Mandelbrot, Laurent; Blanche, Stéphane; Tubiana, Roland; Faye, Albert; Dollfus, Catherine; Le Chenadec, Jérôme; Benhammou, Valérie; Rouzioux, Christine; Warszawski, Josiane

    2011-06-01

    Few data are available on the possible long-term negative effects of a short exposure to antiretroviral therapy (ART) for prevention of mother-to-child transmission (PMTCT). To determine whether ART for PMTCT, discontinued after delivery, affects the virological response to highly active antiretroviral therapy (HAART) administered during subsequent pregnancies. All current pregnancies of HIV-1-infected women enrolled in the French Perinatal Cohort (ANRS CO-01 EPF) between 2005 and 2009 and not receiving ART at the time of conception were eligible. We studied the association between history of exposure to ART during a previous pregnancy and detectable viral load (VL) under multitherapy at current delivery (VL ≥ 50 copies/mL). Among 1116 eligible women, 869 were ART naive and 247 had received PMTCT during a previous pregnancy. Previous ART was protease inhibitor (PI)-based HAART in 48%, non-PI-based HAART in 4%, nucleoside reverse transcriptase inhibitor bitherapy in 19% and zidovudine monotherapy in 29% of the women. At current pregnancy, women with or without prior exposure to ART had similar CD4 cell counts and VL before ART initiation. PI-based HAART was initiated in 90% of the women. VL was undetectable (<50 copies/mL) at delivery in 65% of previously ART-naive women, 72% of women previously exposed to HAART, 62% previously exposed to bitherapy, and 67% previously exposed to monotherapy for prophylaxis (P = 0.42). Detectable VL was not associated with previous exposure in multivariate analysis (adjusted OR for previous versus no previous exposure to ART: 0.92; 95% confidence interval: 0.59 to 1.44). Efficacy of PI-based combinations is not decreased in women previously exposed to various regimens of antiretroviral PMTCT.

  17. Putting the Learning in Case Learning? The Effects of Case-Based Approaches on Student Knowledge, Attitudes, and Engagement

    ERIC Educational Resources Information Center

    Krain, Matthew

    2016-01-01

    This study revisits case learning's effects on student engagement and assesses student learning as a result of the use of case studies and problem-based learning. The author replicates a previous study that used indirect assessment techniques to get at case learning's impact, and then extends the analysis using a pre- and post-test experimental…

  18. Here's What You Must Think about Nuclear Power: Grappling with the Spiritual Ground of Children's Judgement inside and outside Steiner Waldorf Education

    ERIC Educational Resources Information Center

    Ashley, Martin

    2008-01-01

    The author has previously argued against "early closure"--the tendency to close down children's curiosity through an over-zealous approach to issues-based education. Indoctrination might be a result but "burn-out," a potentially permanent attitude change that sets in before puberty, is more likely. This article is based on the…

  19. The Effect of Motion Analysis Activities in a Video-Based Laboratory in Students' Understanding of Position, Velocity and Frames of Reference

    ERIC Educational Resources Information Center

    Koleza, Eugenia; Pappas, John

    2008-01-01

    In this article, we present the results of a qualitative research project on the effect of motion analysis activities in a Video-Based Laboratory (VBL) on students' understanding of position, velocity and frames of reference. The participants in our research were 48 pre-service teachers enrolled in Education Departments with no previous strong…

  20. Space vehicle engine and heat shield environment review. Volume 1: Engineering analysis

    NASA Technical Reports Server (NTRS)

    Mcanelly, W. B.; Young, C. T. K.

    1973-01-01

    Methods for predicting the base heating characteristics of a multiple rocket engine installation are discussed. The environmental data are applied to the design of an adequate protection system for the engine components. The methods for predicting the base region thermal environment are categorized as: (1) scale model testing, (2) extrapolation of previous and related flight test results, and (3) semiempirical analytical techniques.

  1. The Effect of Dynamic Assessment on Adult Learners of Arabic: A Mixed-Method Study at the Defense Language Institute Foreign Language Center

    ERIC Educational Resources Information Center

    Fahmy, Mohsen M.

    2013-01-01

    Dynamic assessment (DA) is based on Vygotsky's (1978) sociocultural theory and his Zone of Proximal Development (ZPD). ZPD is the range of abilities bordered by the learner's assisted and independent performances. Previous studies showed promising results for DA in tutoring settings. However, they did not use proficiency-based rubrics to measure…

  2. An Examination of Alternate Assessment Durations when Assessing Multiple-Skill Computational Fluency: The Generalizability and Dependability of Curriculum-Based Outcomes within the Context of Educational Decisions

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Johnson-Gros, Kristin H.

    2005-01-01

    The current study extended previous research on curriculum-based measurement in mathematics (M-CBM) assessments. The purpose was to examine the generalizability and dependability of multiple-skill M-CBM computation assessments across various assessment durations (1, 2, 3, 4, 5, and 6 minutes). Results of generalizability and dependability studies…

  3. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. Interim Report.

    ERIC Educational Resources Information Center

    1968

    The present report proposes a central computing facility and presents the preliminary specifications for such a system. It is based, in part, on the results of earlier studies by two previous contractors on behalf of the U.S. Office of Education. The recommendations are based upon the present contractor's considered evaluation of the earlier…

  4. Ground-based radiometric calibration of the Landsat 8 Operational Land Imager (OLI) using in situ techniques

    NASA Astrophysics Data System (ADS)

    Czapla-Myers, J.

    2013-12-01

    Landsat 8 was successfully launched from Vandenberg Air Force Base in California on 11 February 2013, and was placed into the orbit previously occupied by Landsat 5. Landsat 8 is the latest platform in the 40-year history of the Landsat series of satellites, and it contains two instruments that operate in the solar-reflective and the thermal infrared regimes. The Operational Land Imager (OLI) is a pushbroom sensor that contains eight multispectral bands ranging from 400-2300 nm, and one panchromatic band. The spatial resolution of the multispectral bands is 30 m, which is similar to previous Landsat sensors, and the panchromatic band has a 15-m spatial resolution, which is also similar to previous Landsat sensors. The 12-bit radiometric resolution of OLI improves upon the 8-bit resolution of the Enhanced Thematic Mapper Plus (ETM+) onboard Landsat 7. An important requirement for the Landsat program is the long-term radiometric continuity of its sensors. Ground-based vicarious techniques have been used for over 20 years to determine the absolute radiometric calibration of sensors that encompass a wide variety of spectral and spatial characteristics. This work presents the early radiometric calibration results of Landsat 8 OLI that were obtained using the traditional reflectance-based approach. University of Arizona personnel used five sites in Arizona, California, and Nevada to collect ground-based data. In addition, a unique set of in situ data were collected in March 2013, when Landsat 7 and Landsat 8 were observing the same site within minutes of each other. The tandem overfly schedule occurred while Landsat 8 was shifting to the WRS-2 orbital grid, and lasted only a few days. The ground-based data also include results obtained using the University of Arizona's Radiometric Calibration Test Site (RadCaTS), which is an automated suite of instruments located at Railroad Valley, Nevada. The results presented in this work include a comparison to the L1T at-sensor spectral radiance and the top-of-atmosphere reflectance, both of which are standard products available from the US Geological Survey.

  5. Identifying and ranking implicit leadership strategies to promote evidence-based practice implementation in addiction health services.

    PubMed

    Guerrero, Erick G; Padwa, Howard; Fenwick, Karissa; Harris, Lesley M; Aarons, Gregory A

    2016-05-14

    Despite a solid research base supporting evidence-based practices (EBPs) for addiction treatment such as contingency management and medication-assisted treatment, these services are rarely implemented and delivered in community-based addiction treatment programs in the USA. As a result, many clients do not benefit from the most current and efficacious treatments, resulting in reduced quality of care and compromised treatment outcomes. Previous research indicates that addiction program leaders play a key role in supporting EBP adoption and use. The present study expanded on this previous work to identify strategies that addiction treatment program leaders report using to implement new practices. We relied on a staged and iterative mixed-methods approach to achieve the following four goals: (a) collect data using focus groups and semistructured interviews and conduct analyses to identify implicit managerial strategies for implementation, (b) use surveys to quantitatively rank strategy effectiveness, (c) determine how strategies fit with existing theories of organizational management and change, and (d) use a consensus group to corroborate and expand on the results of the previous three stages. Each goal corresponded to a methodological phase, which included data collection and analytic approaches to identify and evaluate leadership interventions that facilitate EBP implementation in community-based addiction treatment programs. Findings show that the top-ranked strategies involved the recruitment and selection of staff members receptive to change, offering support and requesting feedback during the implementation process, and offering in vivo and hands-on training. Most strategies corresponded to emergent implementation leadership approaches that also utilize principles of transformational and transactional leadership styles. Leadership behaviors represented orientations such as being proactive to respond to implementation needs, supportive to assist staff members during the uptake of new practices, knowledgeable to properly guide the implementation process, and perseverant to address ongoing barriers that are likely to stall implementation efforts. These findings emphasize how leadership approaches are leveraged to facilitate the implementation and delivery of EBPs in publicly funded addiction treatment programs. Findings have implications for the content and structure of leadership interventions needed in community-based addiction treatment programs and the development of leadership interventions in these and other service settings.

  6. Species delimitation in plants using the Qinghai–Tibet Plateau endemic Orinus (Poaceae: Tridentinae) as an example

    PubMed Central

    Su, Xu; Wu, Guili; Li, Lili; Liu, Jianquan

    2015-01-01

    Background and Aims: Accurate identification of species is essential for the majority of biological studies. However, defining species objectively and consistently remains a challenge, especially for plants distributed in remote regions where there is often a lack of sufficient previous specimens. In this study, multiple approaches and lines of evidence were used to determine species boundaries for plants occurring in the Qinghai–Tibet Plateau, using the genus Orinus (Poaceae) as a model system for an integrative approach to delimiting species. Methods: A total of 786 individuals from 102 populations of six previously recognized species were collected for niche, morphological and genetic analyses. Three plastid DNA regions (matK, rbcL and trnH-psbA) and one nuclear DNA region [internal transcribed spacer (ITS)] were sequenced. Key Results: Whereas six species had been previously recognized, statistical analyses based on character variation, molecular data and niche differentiation identified only two well-delimited clusters, together with a third possibly originating from relatively recent hybridization between, or historical introgression from, the other two. Conclusions: Based on a principle of integrative species delimitation to reconcile different sources of data, the results provide compelling evidence that the six previously recognized species of the genus Orinus that were examined should be reduced to two, with new circumscriptions, and a third, identified in this study, should be described as a new species. This empirical study highlights the value of applying genetic differentiation, morphometric statistics and ecological niche modelling in an integrative approach to re-circumscribing species boundaries. The results produce relatively objective, operational and unbiased taxonomic classifications of plants occurring in remote regions. PMID:25987712

  7. A novel iterative mixed model to remap three complex orthopedic traits in dogs

    PubMed Central

    Huang, Meng; Hayward, Jessica J.; Corey, Elizabeth; Garrison, Susan J.; Wagner, Gabriela R.; Krotscheck, Ursula; Hayashi, Kei; Schweitzer, Peter A.; Lust, George; Boyko, Adam R.; Todhunter, Rory J.

    2017-01-01

    Hip dysplasia (HD), elbow dysplasia (ED), and rupture of the cranial (anterior) cruciate ligament (RCCL) are the most common complex orthopedic traits of dogs and all result in debilitating osteoarthritis. We reanalyzed previously reported data: the Norberg angle (a quantitative measure of HD) in 921 dogs, ED in 113 cases and 633 controls, and RCCL in 271 cases and 399 controls and their genotypes at ~185,000 single nucleotide polymorphisms. A novel fixed and random model with a circulating probability unification (FarmCPU) function, with marker-based principal components and a kinship matrix to correct for population stratification, was used. A Bonferroni correction at p<0.01 resulted in a P < 6.96 × 10^-8. Six loci were identified; three for HD and three for RCCL. An associated locus at CFA28:34,369,342 for HD was described previously in the same dogs using a conventional mixed model. No loci were identified for RCCL in the previous report but the two loci for ED in the previous report did not reach genome-wide significance using the FarmCPU model. These results were supported by simulation which demonstrated that the FarmCPU held no power advantage over the linear mixed model for the ED sample but provided additional power for the HD and RCCL samples. Candidate genes for HD and RCCL are discussed. When using FarmCPU software, we recommend a resampling test, that a positive control be used to determine the optimum pseudo quantitative trait nucleotide-based covariate structure of the model, and a negative control be used consisting of permutation testing and the identical resampling test as for the non-permuted phenotypes. PMID:28614352
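
    The genome-wide threshold quoted above is plain Bonferroni arithmetic: the experiment-wise alpha divided by the number of markers tested. The back-calculated marker count below (~143,700) is an inference from the reported numbers, presumably reflecting markers retained after quality control rather than the full ~185,000 genotyped SNPs.

    ```python
    alpha = 0.01                       # experiment-wise error rate used in the study
    reported_threshold = 6.96e-8       # per-marker P-value cutoff reported above

    # Bonferroni: per-marker threshold = alpha / number_of_tests
    implied_tests = alpha / reported_threshold
    print(f"implied number of tested markers: {implied_tests:,.0f}")   # ~143,700

    # For comparison, using the full genotyping panel instead:
    print(f"threshold with 185,000 markers: {alpha / 185_000:.2e}")    # ~5.41e-08
    ```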

  8. Total hip arthroplasty after a previous pelvic osteotomy: A systematic review and meta-analysis.

    PubMed

    Shigemura, T; Yamamoto, Y; Murata, Y; Sato, T; Tsuchiya, R; Wada, Y

    2018-06-01

    There are several reports regarding total hip arthroplasty (THA) after a previous pelvic osteotomy (PO). However, to our knowledge, until now there has been no formal systematic review and meta-analysis published to summarize the clinical results of THA after a previous PO. Therefore, we conducted a systematic review and meta-analysis of results of THA after a previous PO. We focused on the following question: does a previous PO affect the results of subsequent THA, such as clinical outcomes, operative time, operative blood loss, and radiological parameters? Using PubMed, Web of Science, and Cochrane Library, we searched for relevant original papers. The pooling of data was performed using RevMan software (version 5.3, Cochrane Collaboration, Oxford, UK). A p-value<0.05 was judged as significant. Standardized mean differences (SMD) were calculated for continuous data, and a 95% confidence interval (CI) was reported. Statistical heterogeneity was assessed based on I² using the standard χ² test. When I²>50%, significant heterogeneity was assumed and a random-effects model was applied for the meta-analysis. A fixed-effects model was applied in the absence of significant heterogeneity. Eleven studies were included in this meta-analysis. The pooled results indicated that there was no significant difference in postoperative Merle D'Aubigne-Postel score (I²=0%, SMD=-0.15, 95% CI: -0.36 to 0.06, p=0.17), postoperative Harris hip score (I²=60%, SMD=-0.23, 95% CI: -0.50 to 0.05, p=0.10), operative time (I²=86%, SMD=0.37, 95% CI: -0.09 to 0.82, p=0.11), operative blood loss (I²=82%, SMD=0.23, 95% CI: -0.17 to 0.63, p=0.25), and cup abduction angle (I²=43%, SMD=-0.08, 95% CI: -0.25 to 0.09, p=0.38) between THA with and without a previous PO. However, cup anteversion angle of THA with a previous PO was significantly smaller than that without a previous PO (I²=77%, SMD=-0.63, 95% CI: -1.13 to -0.13, p=0.01). Systematic review and meta-analysis of results of THA after a previous PO was performed. A previous PO did not affect the results of subsequent THA, except for cup anteversion. Because of the low quality evidence currently available, high-quality randomized controlled trials are required. Level III, meta-analysis of case-control studies. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
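
    The pooling logic summarized here, inverse-variance weighting, I² to gauge heterogeneity, and a switch from a fixed-effects to a random-effects model when I² exceeds 50%, can be written out directly with standard formulas (Cochran's Q and the DerSimonian-Laird tau² estimator). The effect sizes and variances below are placeholders, not data from the eleven included studies.

    ```python
    import math

    def pool(effects, variances):
        """Inverse-variance pooling with an I^2-based fixed/random-effects switch."""
        k = len(effects)
        w = [1.0 / v for v in variances]
        fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)

        # Cochran's Q and I^2 (share of variation attributable to heterogeneity)
        q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
        df = k - 1
        i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

        if i2 > 50:   # substantial heterogeneity: random effects (DerSimonian-Laird)
            c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
            tau2 = max(0.0, (q - df) / c)
            w = [1.0 / (v + tau2) for v in variances]

        pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
        se = math.sqrt(1.0 / sum(w))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

    # Placeholder standardized mean differences and their variances
    smd, ci, i2 = pool([-0.2, 0.1, -0.4, -0.1], [0.04, 0.05, 0.03, 0.06])
    print(f"SMD = {smd:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {i2:.0f}%")
    ```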

  9. Early-stage gastric cancers represented as dysplasia in a previous forceps biopsy: the importance of clinical management.

    PubMed

    Park, Jin Seok; Hong, Su Jin; Han, Jae Pil; Kang, Myung Soo; Kim, Hee Kyung; Kwak, Jeong Ja; Ko, Bong Min; Cho, Joo Young; Lee, Joon Seong; Lee, Moon Sung

    2013-02-01

    Because histological examination of gastric lesions by forceps biopsy is of limited accuracy, management on the basis of histological results is occasionally controversial. We examined the characteristics of early gastric cancers that presented as dysplasia resulting from a previous forceps biopsy. Between April 2007 and December 2010, 341 gastric adenocarcinoma lesions from 330 patients previously diagnosed histologically via endoscopic submucosal dissection were examined. We retrospectively assessed the characteristics of early gastric cancer according to their initial forceps biopsy results. In total, 183 EGCs were diagnosed as dysplasia (53.7%; 89 low-grade and 94 high-grade) and 158 (46.3%) as carcinoma by forceps biopsy before endoscopic submucosal dissection. Significant differences were noted with respect to histologic differentiation of carcinomas, Lauren histologic type, depth of invasion, lymphovascular invasion, and en bloc resection between the dysplastic group and carcinoma group, based on forceps biopsy results. A forceps biopsy result is not fully representative of the entire lesion and, thus, endoscopic submucosal dissection should be considered for lesions diagnosed as dysplasia via forceps biopsy in order to avoid the risk of missed carcinomas. Copyright © 2012 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  10. Impaired self-agency inferences in schizophrenia: The role of cognitive capacity and causal reasoning style.

    PubMed

    Prikken, M; van der Weiden, A; Kahn, R S; Aarts, H; van Haren, N E M

    2018-01-01

    The sense of self-agency, i.e., experiencing oneself as the cause of one's own actions, is impaired in patients with schizophrenia. Normally, inferences of self-agency are enhanced when actual outcomes match with pre-activated outcome information, where this pre-activation can result from explicitly set goals (i.e., goal-based route) or implicitly primed outcome information (i.e., prime-based route). Previous research suggests that patients show specific impairments in the prime-based route, implicating that they do not rely on matches between implicitly available outcome information and actual action-outcomes when inferring self-agency. The question remains: Why? Here, we examine whether neurocognitive functioning and self-serving bias (SSB) may explain abnormalities in patients' agency inferences. Thirty-six patients and 36 healthy controls performed a commonly used agency inference task to measure goal- and prime-based self-agency inferences. Neurocognitive functioning was assessed with the Brief Assessment of Cognition in Schizophrenia (BACS) and the SSB was assessed with the Internal Personal and Situational Attributions Questionnaire. Results showed a substantial smaller effect of primed outcome information on agency experiences in patients compared with healthy controls. Whereas patients and controls differed on BACS and marginally on SSB scores, these differences were not related to patients' impairments in prime-based agency inferences. Patients showed impairments in prime-based agency inferences, thereby replicating previous studies. This finding could not be explained by cognitive dysfunction or SSB. Results are discussed in the context of the recent surge to understand and examine deficits in agency experiences in schizophrenia. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  11. AOTV bow shock location

    NASA Technical Reports Server (NTRS)

    Desautel, D.

    1985-01-01

    Hypersonic bow-shock location and geometry are of central importance to the aerodynamics and aerothermodynamics of aeroassisted orbital transfer vehicles (AOTVs), but they are difficult to predict for a given vehicle configuration. This paper reports experimental measurements of shock standoff distance for the 70 deg cone AOTV configuration in shock-tunnel-test flows at Mach numbers of 3.8 to 7.9 and for angles of attack from 0 deg to 20 deg. The controlling parameter for hypersonic bow-shock standoff distance (for a given forebody shape) is the mean normal-shock density ratio. Values for this parameter in the tests reported are in the same range as those of the drag-brake AOTV perigee regime. Results for standoff distance are compared with those previously reported in the literature for this AOTV configuration. It is concluded that the AOTV shock standoff distance for the conical configuration, based on frustum (base) radius, is equivalent to that of a sphere with a radius about 35 percent greater than that of the cone; the distance is, therefore, much less than reported in previous studies. Some reasons for the discrepancies between the present and previous results are advanced. The smaller standoff distance determined here implies there will be less radiative heat transfer than was previously expected.

  12. A meta-analysis but not a systematic review: an evaluation of the Global BMI Mortality Collaboration.

    PubMed

    Flegal, Katherine M; Ioannidis, John P A

    2017-08-01

    Meta-analyses of individual participant data (MIPDs) offer many advantages and are considered the highest level of evidence. However, MIPDs can be seriously compromised when they are not solidly founded upon a systematic review. These data-intensive collaborative projects may be led by experts who already have deep knowledge of the literature in the field and of the results of published studies and how these results vary based on different analytical approaches. If investigators tailor the searches, eligibility criteria, and analysis plan of the MIPD, they run the risk of reaching foregone conclusions. We exemplify this potential bias in a MIPD on the association of body mass index with mortality conducted by a collaboration of outstanding and extremely knowledgeable investigators. Contrary to a previous meta-analysis of group data that used a systematic review approach, the MIPD did not seem to use a formal search: it considered 239 studies, of which the senior author was previously aware of at least 238, and it violated its own listed eligibility criteria to include those studies and exclude other studies. It also preferred an analysis plan that was also known to give a specific direction of effects in already published results of most of the included evidence. MIPDs where results of constituent studies are already largely known need safeguards to their validity. These may include careful systematic searches, adherence to the Preferred Reporting Items for Systematic Review and Meta-Analyses of individual participant data guidelines, and exploration of the robustness of results with different analyses. They should also avoid selective emphasis on foregone conclusions based on previously known results with specific analytical choices. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Asteroid mass estimation with Markov-chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Siltala, Lauri; Granvik, Mikael

    2017-10-01

    Estimates for asteroid masses are based on their gravitational perturbations on the orbits of other objects such as Mars, spacecraft, or other asteroids and/or their satellites. In the case of asteroid-asteroid perturbations, this leads to a 13-dimensional inverse problem at minimum where the aim is to derive the mass of the perturbing asteroid and six orbital elements for both the perturbing asteroid and the test asteroid by fitting their trajectories to their observed positions. The fitting has typically been carried out with linearized methods such as the least-squares method. These methods need to make certain assumptions regarding the shape of the probability distributions of the model parameters. This is problematic as these assumptions have not been validated. We have developed a new Markov-chain Monte Carlo method for mass estimation which does not require an assumption regarding the shape of the parameter distribution. Recently, we have implemented several upgrades to our MCMC method including improved schemes for handling observational errors and outlier data alongside the option to consider multiple perturbers and/or test asteroids simultaneously. These upgrades promise significantly improved results: based on two separate results for (19) Fortuna with different test asteroids we previously hypothesized that simultaneous use of both test asteroids would lead to an improved result similar to the average literature value for (19) Fortuna with substantially reduced uncertainties. Our upgraded algorithm indeed finds a result essentially equal to the literature value for this asteroid, confirming our previous hypothesis. Here we show these new results for (19) Fortuna and other example cases, and compare our results to previous estimates. Finally, we discuss our plans to improve our algorithm further, particularly in connection with Gaia.
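
    The stated advantage of the MCMC approach is that it samples the posterior directly instead of assuming a Gaussian shape around a least-squares solution. A generic random-walk Metropolis sampler illustrates that point; the toy one-parameter "mass" likelihood below is purely illustrative and has nothing to do with the actual 13-dimensional orbit-fitting problem or the authors' implementation.

    ```python
    import math
    import random

    def log_likelihood(mass):
        """Toy, deliberately non-Gaussian log-likelihood standing in for the
        real orbit-fit residuals; illustration only."""
        return -abs(mass - 8.6e18) / 2.0e17 - 0.5 * ((mass - 8.6e18) / 6.0e17) ** 2

    def metropolis(n_steps, start, step_size):
        """Random-walk Metropolis: no assumption about the posterior's shape."""
        samples, current, current_ll = [], start, log_likelihood(start)
        for _ in range(n_steps):
            proposal = current + random.gauss(0.0, step_size)
            proposal_ll = log_likelihood(proposal)
            # Accept with probability min(1, exp(delta log-likelihood))
            if random.random() < math.exp(min(0.0, proposal_ll - current_ll)):
                current, current_ll = proposal, proposal_ll
            samples.append(current)
        return samples

    chain = metropolis(50_000, start=1.0e19, step_size=2.0e17)
    burned = chain[5_000:]   # discard burn-in
    print("posterior mean mass:", sum(burned) / len(burned))
    ```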

  14. Unclothed firewalls

    NASA Astrophysics Data System (ADS)

    Chen, Pisin; Ong, Yen Chin; Page, Don Nelson; Sasaki, Misao; Yeom, Dong-Han

    2016-07-01

    We have previously argued that fluctuations of the Hawking emission rate can cause a black hole event horizon to fluctuate inside the location of a putative firewall, rendering the firewall naked. This assumes that the firewall is located near where the horizon would be expected based on the past evolution of the spacetime. Here, we expand our previous results by defining two new estimates for where the firewall might be that have more smooth temporal behavior than our original estimate. Our results continue to contradict the usual assumption that the firewall should not be observable except by infalling observers. This casts doubt about the idea that firewalls are the ‘most conservative’ solution to the information loss paradox.

  15. An optimized method to calculate error correction capability of tool influence function in frequency domain

    NASA Astrophysics Data System (ADS)

    Wang, Jia; Hou, Xi; Wan, Yongjian; Shi, Chunyan

    2017-10-01

    An optimized method to calculate the error correction capability of a tool influence function (TIF) under certain polishing conditions is proposed based on a smoothing spectral function. The basic mathematical model for this method is established in theory. A set of experimental polishing data obtained with a rigid conformal tool is used to validate the optimized method. The calculated results can quantitatively indicate the error correction capability of the TIF for different spatial-frequency errors under certain polishing conditions. A comparative analysis shows that the optimized method is simpler in form and achieves the same accuracy with less computation time than the previous method.
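
    One generic way to express a TIF's correction capability per spatial frequency is through the magnitude of its Fourier transform: spatial frequencies at which the normalized transfer amplitude is small cannot be corrected efficiently by that tool. The sketch below shows only that generic idea for an assumed, sampled 1-D TIF profile; it is not the optimized smoothing-spectral-function method proposed in this record.

    ```python
    import numpy as np

    def tif_transfer(tif_profile, dx):
        """Normalized frequency response of a sampled tool influence function.

        tif_profile : 1-D removal profile of the TIF (arbitrary units)
        dx          : sample spacing in mm
        Returns spatial frequencies (1/mm) and |TIF(f)| normalized to its DC value.
        """
        spectrum = np.abs(np.fft.rfft(tif_profile))
        freqs = np.fft.rfftfreq(len(tif_profile), d=dx)
        return freqs, spectrum / spectrum[0]

    # Gaussian-like TIF, about 20 mm wide, sampled every 0.5 mm (placeholder numbers)
    x = np.arange(-10, 10, 0.5)
    tif = np.exp(-0.5 * (x / 3.0) ** 2)
    f, h = tif_transfer(tif, dx=0.5)

    # Frequencies with low transfer amplitude are hard for this tool to correct
    for fi, hi in zip(f, h):
        if hi < 0.1:
            print(f"correction capability drops below 10% near {fi:.2f} 1/mm")
            break
    ```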

  16. Anisotropic models of the upper mantle

    NASA Technical Reports Server (NTRS)

    Regan, J.; Anderson, D. L.

    1983-01-01

    Long period Rayleigh wave and Love wave dispersion data, particularly for oceanic areas, were not simultaneously satisfied by an isotropic structure. Available phase and group velocity data are inverted by a procedure which includes the effects of transverse anisotropy, anelastic dispersion, sphericity, and gravity. The resulting models, for the average Earth, average ocean and oceanic regions divided according to the age of the ocean floor, are quite different from previous results which ignore the above effects. The models show a low velocity zone with age dependent anisotropy and velocities higher than those derived in previous surface wave studies. The correspondence between the anisotropy variation with age and a physical model based on flow aligned olivine is suggested.

  17. Incorporating exposure information into the toxicological prioritization index decision support framework

    EPA Science Inventory

    The Toxicological Prioritization Index (ToxPi) decision support framework was previously developed to facilitate incorporation of diverse data to prioritize chemicals based on potential hazard. This ToxPi index was demonstrated by considering results of bioprofiling related to po...

  18. Rhythmic patterning in Malaysian and Singapore English.

    PubMed

    Tan, Rachel Siew Kuang; Low, Ee-Ling

    2014-06-01

    Previous work on the rhythm of Malaysian English has been based on impressionistic observations. This paper utilizes acoustic analysis to measure the rhythmic patterns of Malaysian English. Recordings of the read speech and spontaneous speech of 10 Malaysian English speakers were analyzed and compared with recordings of an equivalent sample of Singaporean English speakers. Analysis was done using two rhythmic indexes, the PVI and VarcoV. It was found that although the rhythm of read speech of the Singaporean speakers was syllable-based as described by previous studies, the rhythm of the Malaysian speakers was even more syllable-based. Analysis of the syllables in specific utterances showed that Malaysian speakers did not reduce vowels as much as Singaporean speakers in cases of syllables in utterances. Results of the spontaneous speech confirmed the findings for the read speech; that is, the same rhythmic patterning was found which normally triggers vowel reductions.
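
    Both rhythm indexes named here reduce to short formulas over a sequence of measured vocalic-interval durations: the normalized Pairwise Variability Index (nPVI) averages the absolute duration difference of successive intervals relative to their mean, and VarcoV is the standard deviation of the durations expressed as a percentage of their mean. A minimal sketch with placeholder durations in milliseconds:

    ```python
    from statistics import mean, pstdev

    def npvi(durations):
        """Normalized Pairwise Variability Index over successive vocalic intervals."""
        pairs = list(zip(durations, durations[1:]))
        return 100 * mean(abs(a - b) / ((a + b) / 2) for a, b in pairs)

    def varco_v(durations):
        """VarcoV: standard deviation of vocalic durations as a % of their mean."""
        return 100 * pstdev(durations) / mean(durations)

    vowel_durations_ms = [85, 120, 90, 150, 70, 95, 140]   # placeholder measurements
    print(f"nPVI   = {npvi(vowel_durations_ms):.1f}")
    print(f"VarcoV = {varco_v(vowel_durations_ms):.1f}")
    ```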

  19. Design of membrane actuators based on ferromagnetic shape memory alloy composite for the synthetic jet actuator

    NASA Astrophysics Data System (ADS)

    Liang, Yuanchang; Taya, Minoru; Kuga, Yasuo

    2004-07-01

    A new membrane actuator based on our previous diaphragm actuator was designed and constructed to improve the dynamic performance. The finite element analysis was used to estimate the frequency response of the composite membrane which will be driven close to its resonance to obtain a large stroke. The membrane is made of ferromagnetic shape memory alloy (FSMA) composite including a ferromagnetic soft iron pad and a superelastic grade of NiTi shape memory alloy (SMA). The actuation mechanism for the FSMA composite membrane of the actuator is the hybrid mechanism that we proposed previously. This membrane actuator is designed for a new synthetic jet actuator package that will be used for active flow control technology on airplane wings. Based on the FEM results, the new membrane actuator system was assembled and its static and dynamic performance was experimentally evaluated including the dynamic magnetic response of the hybrid magnet.

  20. ECG Sensor Card with Evolving RBP Algorithms for Human Verification.

    PubMed

    Tseng, Kuo-Kun; Huang, Huang-Nan; Zeng, Fufu; Tu, Shu-Yi

    2015-08-21

    It is known that cardiac and respiratory rhythms in electrocardiograms (ECGs) are highly nonlinear and non-stationary. As a result, most traditional time-domain algorithms are inadequate for characterizing the complex dynamics of the ECG. This paper proposes a new ECG sensor card and a statistics-based ECG algorithm, with the aid of a reduced binary pattern (RBP), with the aim of achieving faster ECG human identity recognition with high accuracy. The proposed algorithm has one advantage that previous ECG algorithms lack: waveform complex information and de-noising preprocessing can be bypassed; therefore, it is more suitable for non-stationary ECG signals. Experimental results tested on two public ECG databases (MIT-BIH) from MIT University confirm that the proposed scheme is feasible with excellent accuracy, low complexity, and speedy processing. To be more specific, the advanced RBP algorithm achieves high accuracy in human identity recognition and is executed at least nine times faster than previous algorithms. Moreover, based on the test results from a long-term ECG database, the evolving RBP algorithm also demonstrates superior capability in handling long-term and non-stationary ECG signals.
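
    The reduced-binary-pattern idea can be pictured as converting the raw ECG into a stream of rise/fall bits and summarizing it as a histogram of short bit-words, which is then compared between recordings; no waveform delineation or de-noising is required. The sketch below is an assumption about that general scheme (word length, similarity measure, and the synthetic signals are all made up), not the authors' exact RBP or evolving RBP algorithm.

    ```python
    import numpy as np

    def rbp_histogram(signal, word_bits=8):
        """Histogram of short binary words built from sample-to-sample rises/falls."""
        bits = (np.diff(signal) > 0).astype(int)            # 1 = rising, 0 = falling
        n_words = len(bits) - word_bits + 1
        words = [int("".join(map(str, bits[i:i + word_bits])), 2) for i in range(n_words)]
        hist = np.bincount(words, minlength=2 ** word_bits).astype(float)
        return hist / hist.sum()                            # normalized feature vector

    def match_score(hist_a, hist_b):
        """Simple histogram-intersection similarity between two RBP features."""
        return float(np.minimum(hist_a, hist_b).sum())

    # Two synthetic "recordings" of the same underlying beat pattern plus noise
    t = np.linspace(0, 10, 3600)
    ecg_a = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
    ecg_b = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(2).normal(size=t.size)
    print("similarity:", match_score(rbp_histogram(ecg_a), rbp_histogram(ecg_b)))
    ```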

  1. Response Monitoring and Adjustment: Differential Relations with Psychopathic Traits

    PubMed Central

    Bresin, Konrad; Finy, M. Sima; Sprague, Jenessa; Verona, Edelyn

    2014-01-01

    Studies on the relation between psychopathy and cognitive functioning often show mixed results, partially because different factors of psychopathy have not been considered fully. Based on previous research, we predicted divergent results based on a two-factor model of psychopathy (interpersonal-affective traits and impulsive-antisocial traits). Specifically, we predicted that the unique variance of interpersonal-affective traits would be related to increased monitoring (i.e., error-related negativity) and adjusting to errors (i.e., post-error slowing), whereas impulsive-antisocial traits would be related to reductions in these processes. Three studies using a diverse selection of assessment tools, samples, and methods are presented to identify response monitoring correlates of the two main factors of psychopathy. In Studies 1 (undergraduates), 2 (adolescents), and 3 (offenders), interpersonal-affective traits were related to increased adjustment following errors and, in Study 3, to enhanced monitoring of errors. Impulsive-antisocial traits were not consistently related to error adjustment across the studies, although these traits were related to a deficient monitoring of errors in Study 3. The results may help explain previous mixed findings and advance implications for etiological models of psychopathy. PMID:24933282

  2. Development of Genetically Stable Escherichia coli Strains for Poly(3-Hydroxypropionate) Production

    PubMed Central

    Gao, Yongqiang; Liu, Changshui; Ding, Yamei; Sun, Chao; Zhang, Rubing; Xian, Mo; Zhao, Guang

    2014-01-01

    Poly(3-hydroxypropionate) (P3HP) is a biodegradable and biocompatible thermoplastic. In our previous study, a pathway for P3HP production was constructed in recombinant Escherichia coli. Seven exogenous genes in the P3HP synthesis pathway were carried by two plasmid vectors. However, P3HP production was severely suppressed by strain instability due to plasmid loss. In this paper, two strategies, chromosomal gene integration and a plasmid addiction system (PAS) based on amino acid anabolism, were applied to construct a genetically stable strain. Finally, a combination of the two methods gave the best results. The resultant strain carried a portion of the P3HP synthesis genes on the chromosome and the others on a plasmid, and also incorporated a tyrosine-auxotrophy-based PAS. In aerobic fed-batch fermentation, this strain produced 25.7 g/L P3HP from glycerol, about 2.5 times higher than that of the previous two-plasmid strain. To the best of our knowledge, this is the highest P3HP production from inexpensive carbon sources. PMID:24837211

  3. An aggregate urine analysis tool to detect acute dehydration.

    PubMed

    Hahn, Robert G; Waldréus, Nana

    2013-08-01

    Urine sampling has previously been evaluated for detecting dehydration in young male athletes. The present study investigated whether urine analysis can serve as a measure of dehydration in men and women of a wide age span. Urine sampling and body weight measurement were undertaken before and after recreational physical exercise (median time: 90 min) in 57 volunteers aged 17-69 years (mean age: 42). Urine analysis included urine color, osmolality, specific gravity, and creatinine. The volunteers' body weight decreased 1.1% (mean) while they exercised. There were strong correlations between all 4 urinary markers of dehydration (r = .73-.84, p < .001). Researchers constructed a composite dehydration index graded from 1 to 6 based on these markers. This index changed from 2.70 before exercising to 3.55 after exercising, which corresponded to dehydration of 1.0% as given by a preliminary reference curve based on seven previous studies in athletes. Men were slightly dehydrated at baseline (mean: 1.9%) compared with women (mean: 0.7%; p < .001), though age had no influence on the results. A final reference curve that considered both the present results and the 7 previous studies was constructed, in which exercise-induced weight loss (x) was predicted from the dehydration index (DI) by the exponential equation x = 0.20 × DI^1.86. Urine sampling can be used to estimate weight loss due to dehydration in adults up to age 70. A robust dehydration index based on four indicators reduces the influence of confounders.
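
    The final reference curve quoted above can be applied directly; the short Python sketch below simply evaluates it for a measured composite index (the example value is illustrative, not a datum from the study).

      def predicted_weight_loss(dehydration_index):
          """Exercise-induced weight loss (% of body weight) predicted from the
          composite dehydration index via x = 0.20 * DI**1.86."""
          return 0.20 * dehydration_index ** 1.86

      # Example: an index of 3.0 predicts roughly 1.5% weight loss.
      print(round(predicted_weight_loss(3.0), 2))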

  4. Robust Tracking of Small Displacements with a Bayesian Estimator

    PubMed Central

    Dumont, Douglas M.; Byram, Brett C.

    2016-01-01

    Radiation-force-based elasticity imaging describes a group of techniques that use acoustic radiation force (ARF) to displace tissue in order to obtain qualitative or quantitative measurements of tissue properties. Because ARF-induced displacements are on the order of micrometers, tracking these displacements in vivo can be challenging. Previously, it has been shown that Bayesian-based estimation can overcome some of the limitations of a traditional displacement estimator like normalized cross-correlation (NCC). In this work, we describe a Bayesian framework that combines a generalized Gaussian-Markov random field (GGMRF) prior with an automated method for selecting the prior’s width. We then evaluate its performance in the context of tracking the micrometer-order displacements encountered in an ARF-based method like acoustic radiation force impulse (ARFI) imaging. The results show that bias, variance, and mean-square error performance vary with prior shape and width, and that an almost one order-of-magnitude reduction in mean-square error can be achieved by the estimator at the automatically-selected prior width. Lesion simulations show that the proposed estimator has a higher contrast-to-noise ratio but lower contrast than NCC, median-filtered NCC, and the previous Bayesian estimator, with a non-Gaussian prior shape having better lesion-edge resolution than a Gaussian prior. In vivo results from a cardiac, radiofrequency ablation ARFI imaging dataset show quantitative improvements in lesion contrast-to-noise ratio over NCC as well as the previous Bayesian estimator. PMID:26529761

  5. [Prediction of the total Japanese cedar pollen counts based on male flower-setting conditions of standard trees].

    PubMed

    Yuta, Atsushi; Ukai, Kotaro; Sakakura, Yasuo; Tani, Hideshi; Matsuda, Fukiko; Yang, Tian-qun; Majima, Yuichi

    2002-07-01

    We predicted the Japanese cedar (Cryptomeria japonica) pollen counts at Tsu city based on the male flower-setting conditions of standard trees. Sixty-nine standard trees from 23 clones, planted at the Mie Prefecture Science and Technology Promotion Center (Hakusan, Mie) in 1964, were selected. Male flower-setting conditions for 276 faces (69 trees x 4 points of the compass) were scored from 0 to 3. The average scores and total pollen counts from 1988 to 2000 were analyzed. The average scores from the standard trees and the total pollen counts, excluding the two mass pollen-scattering years of 1995 and 2000, showed a positive linear correlation (r = 0.914). In the mass pollen-scattering years, pollen counts were influenced by the previous year. Therefore, the score of the present year minus that of the previous year was used for analysis. The average flower-setting scores and the pollen counts had a strong positive correlation (r = 0.994) when these previous-year-adjusted positive scores were analyzed. We conclude that prediction of pollen counts is possible based on the male flower-setting conditions of standard trees.
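
    As an illustration of the kind of linear relation described above, the Python sketch below fits a line between the previous-year-adjusted flower-setting score and the total pollen count and reports the correlation; the numbers are placeholders, not the study's data.

      import numpy as np

      # Hypothetical (adjusted score, total pollen count) pairs standing in for 1988-2000.
      scores = np.array([0.4, 0.8, 1.1, 1.5, 2.0, 2.4])
      pollen = np.array([900.0, 2100.0, 3000.0, 4200.0, 5600.0, 6800.0])

      slope, intercept = np.polyfit(scores, pollen, 1)       # least-squares line
      r = np.corrcoef(scores, pollen)[0, 1]                  # correlation coefficient

      new_score = 1.8                                        # score observed this year
      print(f"r = {r:.3f}, predicted count = {slope * new_score + intercept:.0f}")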

  6. Small-signal model for the series resonant converter

    NASA Technical Reports Server (NTRS)

    King, R. J.; Stuart, T. A.

    1985-01-01

    The results of a previous discrete-time model of the series resonant dc-dc converter are reviewed and from these a small signal dynamic model is derived. This model is valid for low frequencies and is based on the modulation of the diode conduction angle for control. The basic converter is modeled separately from its output filter to facilitate the use of these results for design purposes. Experimental results are presented.

  7. Detecting Pulsing Denial-of-Service Attacks with Nondeterministic Attack Intervals

    NASA Astrophysics Data System (ADS)

    Luo, Xiapu; Chan, Edmond W. W.; Chang, Rocky K. C.

    2009-12-01

    This paper addresses the important problem of detecting pulsing denial-of-service (PDoS) attacks, which send a sequence of attack pulses to reduce TCP throughput. Unlike previous works, which focused on a restricted form of attacks, we consider a very broad class of attacks. In particular, our attack model admits any attack interval between two adjacent pulses, whether deterministic or not. It also includes the traditional flooding-based attacks as a limiting case (i.e., zero attack interval). Our main contribution is Vanguard, a new anomaly-based detection scheme for this class of PDoS attacks. The Vanguard detection is based on three traffic anomalies induced by the attacks, and it detects them using a CUSUM algorithm. We have prototyped Vanguard and evaluated it on a testbed. The experimental results show that Vanguard is more effective than the previous methods that are based on other traffic anomalies (after a transformation using wavelet transform, Fourier transform, and autocorrelation) and detection algorithms (e.g., dynamic time warping).
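
    The abstract states only that Vanguard applies a CUSUM algorithm to three traffic anomalies; as a rough sketch of that detection step, the Python function below runs a one-sided CUSUM on a generic per-interval anomaly statistic. The drift and threshold values are illustrative parameters, not the paper's settings.

      def cusum_alarm(samples, target, drift=0.5, threshold=5.0):
          """One-sided CUSUM: accumulate positive deviations of an anomaly
          statistic from its expected value and raise an alarm when the running
          sum exceeds a threshold. Returns the index of the first alarm, or None."""
          s = 0.0
          for i, x in enumerate(samples):
              s = max(0.0, s + (x - target) - drift)
              if s > threshold:
                  return i
          return None

      # Example: a statistic that jumps when attack pulses start degrading TCP throughput.
      trace = [0.1, 0.2, 0.0, 0.3, 2.5, 2.8, 3.1, 2.9]
      print(cusum_alarm(trace, target=0.2))  # alarms two samples after the change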

  8. Evolutionary Optimization of Yagi-Uda Antennas

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Kraus, William F.; Linden, Derek S.; Colombano, Silvano P.

    2001-01-01

    Yagi-Uda antennas are known to be difficult to design and optimize due to their sensitivity at high gain, and the inclusion of numerous parasitic elements. We present a genetic algorithm-based automated antenna optimization system that uses a fixed Yagi-Uda topology and a byte-encoded antenna representation. The fitness calculation allows the implicit relationship between power gain and sidelobe/backlobe loss to emerge naturally, a technique that is less complex than previous approaches. The genetic operators used are also simpler. Our results include Yagi-Uda antennas that have excellent bandwidth and gain properties with very good impedance characteristics. Results exceeded previous Yagi-Uda antennas produced via evolutionary algorithms by at least 7.8% in mainlobe gain. We also present encouraging preliminary results where a coevolutionary genetic algorithm is used.
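
    The sketch below outlines, in Python, the kind of byte-encoded genetic-algorithm loop the abstract describes: one-point crossover, random byte mutation, and a fitness that trades main-lobe gain against sidelobe level. The antenna evaluation itself would require an electromagnetic solver, so simulate_antenna here is a hypothetical stand-in, and all parameter values are illustrative.

      import random

      def simulate_antenna(genome):
          """Placeholder for an electromagnetic solver that would decode the byte
          string into element lengths/spacings and return (gain_dBi, sidelobe_dB).
          The values returned here are synthetic so the loop is runnable."""
          return sum(genome) / (len(genome) * 32.0), 10.0

      def fitness(genome):
          gain, sidelobe = simulate_antenna(genome)
          return gain - 0.1 * sidelobe          # implicit gain/sidelobe trade-off

      def evolve(n_bytes=12, pop_size=30, generations=50, p_mut=0.2):
          pop = [[random.randrange(256) for _ in range(n_bytes)] for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=fitness, reverse=True)
              parents = pop[:pop_size // 2]
              children = []
              while len(parents) + len(children) < pop_size:
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, n_bytes)
                  child = a[:cut] + b[cut:]                    # one-point crossover
                  if random.random() < p_mut:                  # random byte mutation
                      child[random.randrange(n_bytes)] = random.randrange(256)
                  children.append(child)
              pop = parents + children
          return max(pop, key=fitness)

      best = evolve()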

  9. Detached Eddy Simulation of Flap Side-Edge Flow

    NASA Technical Reports Server (NTRS)

    Balakrishnan, Shankar K.; Shariff, Karim R.

    2016-01-01

    Detached Eddy Simulation (DES) of flap side-edge flow was performed with a wing and half-span flap configuration used in previous experimental and numerical studies. The focus of the study is the unsteady flow features responsible for the production of far-field noise. The simulation was performed at a Reynolds number (based on the main wing chord) of 3.7 million. Reynolds Averaged Navier-Stokes (RANS) simulations were performed as a precursor to the DES. The results of these precursor simulations match previous experimental and RANS results closely. Although the present DES simulations have not yet reached statistical stationarity, some unsteady features of the developing flap side-edge flowfield are presented. In the final paper it is expected that statistically stationary results will be presented, including comparisons of surface pressure spectra with experimental data.

  10. Validation of light water reactor calculation methods and JEF-1-based data libraries by TRX and BAPL critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Pelloni, S.; Grimm, P.

    1991-04-01

    This paper analyzes the capability of various code systems and JEF-1-based nuclear data libraries to compute light water reactor lattices by comparing calculations with results from thermal reactor benchmark experiments TRX and BAPL and with previously published values. With the JEF-1 evaluation, eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and all methods give reasonable results for the measured reaction rate ratios within, or not too far from, the experimental uncertainty.

  11. Gender, Attitudes Toward War, and Masculinities in Japan.

    PubMed

    Morinaga, Yasuko; Sakamoto, Yuiri; Nakashima, Ken'ichiro

    2017-06-01

    Previous studies have argued that masculinity is linked to war. We conducted a web-based survey to examine relationships between gender, attitudes toward war, and masculinities within a sample of Japanese adults of both sexes (N = 366). Our results indicated that while men were more likely than women to accept war, the relationship between attitudes toward war and masculinities was inconclusive. Moreover, the results suggested that favorable attitudes toward war among men could be attenuated by interpersonal orientations. Based on our findings, we recommend a reexamination of attitudes toward war within the Japanese population.

  12. Using the e-Chasqui, web-based information system, to determine laboratory guidelines and data available to clinical staff.

    PubMed

    Blaya, Joaquin A; Yagui, Martin; Contreras, Carmen C; Palma, Betty; Shin, Sonya S; Yale, Gloria; Suarez, Carmen; Fraser, Hamish S F

    2008-11-06

    13% of all drug susceptibility tests (DSTs) performed at a public laboratory in Peru were duplicates. To determine the reasons for duplicate requests, an online survey was implemented in the e-Chasqui laboratory information system. Results showed that 59.6% of tests were ordered because clinical staff was unaware of ordering guidelines or of a previous result. This shows a benefit of using a web-based system and highlights the lack of laboratory information available to clinical staff in Peru.

  13. New approaches for cement-based prophylactic augmentation of the osteoporotic proximal femur provide enhanced reinforcement as predicted by non-linear finite element simulations.

    PubMed

    Varga, Peter; Inzana, Jason A; Schwiedrzik, Jakob; Zysset, Philippe K; Gueorguiev, Boyko; Blauth, Michael; Windolf, Markus

    2017-05-01

    High incidence and increased mortality related to secondary, contralateral proximal femoral fractures may justify invasive prophylactic augmentation that reinforces the osteoporotic proximal femur to reduce fracture risk. Bone cement-based approaches (femoroplasty) may deliver the required strengthening effect; however, the significant variation in the results of previous studies calls for a systematic analysis and optimization of this method. Our hypothesis was that efficient generalized augmentation strategies can be identified via computational optimization. This study investigated, by means of finite element analysis, the effect of cement location and volume on the biomechanical properties of fifteen proximal femora in a sideways fall configuration. Novel cement cloud locations were developed using the principles of bone remodeling and compared to the "single central" location that was previously reported to be optimal. The new augmentation strategies provided significantly greater biomechanical benefits compared to the "single central" cement location. Augmenting with approximately 12 ml of cement in the newly identified location achieved increases of 11% in stiffness, 64% in yield force, 156% in yield energy and 59% in maximum force, on average, compared to the non-augmented state. The weaker bones experienced a greater biomechanical benefit from augmentation than stronger bones. The effect of cement volume on the biomechanical properties was approximately linear. Results of the "single central" model showed good agreement with previous experimental studies. These findings indicate enhanced potential of cement-based prophylactic augmentation using the newly developed cementing strategy. Future studies should determine the required level of strengthening and confirm these numerical results experimentally. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. An attachment-based model of complicated grief including the role of avoidance

    PubMed Central

    Monk, Timothy; Houck, Patricia; Melhem, Nadine; Frank, Ellen; Reynolds, Charles; Sillowash, Russell

    2009-01-01

    Introduction: Complicated grief (CG) is a prolonged grief disorder with elements of a stress response syndrome. We have previously proposed a biobehavioral model showing the pathway to complicated grief. Avoidance is a component that can be difficult to assess and pivotal to treatment. Therefore we developed an avoidance questionnaire to characterize avoidance among patients with CG. Methods: We further explain our complicated grief model and provide results of a study of 128 participants in a treatment study of CG who completed a 15-item Grief-related Avoidance Questionnaire (GRAQ). Results of Avoidance Assessment: Mean (SD) GRAQ score was 25.0 ± 12.5 with a range of 0–60. Cronbach's alpha was 0.87 and test-retest correlation was 0.88. Correlation analyses showed good convergent and discriminant validity. Avoidance of reminders of the loss contributed to functional impairment after controlling for other symptoms of complicated grief. Discussion: In this paper we extend our previously described attachment-based biobehavioral model of CG. We envision CG as a stress response syndrome that results from failure to integrate information about death of an attachment figure into an effectively functioning secure base schema and/or to effectively re-engage the exploratory system in a world without the deceased. Avoidance is a key element of the model. PMID:17629727

  15. Choline acetyltransferase in the hippocampus is associated with learning strategy preference in adult male rats.

    PubMed

    Hawley, Wayne R; Witty, Christine F; Daniel, Jill M; Dohanich, Gary P

    2015-08-01

    One principle of the multiple memory systems hypothesis posits that the hippocampus-based and striatum-based memory systems compete for control over learning. Consistent with this notion, previous research indicates that the cholinergic system of the hippocampus plays a role in modulating the preference for a hippocampus-based place learning strategy over a striatum-based stimulus-response learning strategy. Interestingly, in the hippocampus, greater activity and higher protein levels of choline acetyltransferase (ChAT), the enzyme that synthesizes acetylcholine, are associated with better performance on hippocampus-based learning and memory tasks. With this in mind, the primary aim of the current study was to determine if higher levels of ChAT and the high-affinity choline uptake transporter (CHT) in the hippocampus were associated with a preference for a hippocampus-based place learning strategy on a task that also could be solved by relying on a striatum-based stimulus-response learning strategy. Results confirmed that levels of ChAT in the dorsal region of the hippocampus were associated with a preference for a place learning strategy on a water maze task that could also be solved by adopting a stimulus-response learning strategy. Consistent with previous studies, the current results support the hypothesis that the cholinergic system of the hippocampus plays a role in balancing competition between memory systems that modulate learning strategy preference. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. New Concepts in Electromagnetic Materials and Antennas

    DTIC Science & Technology

    2013-09-01

    a metasurface, which can be considered as a two-dimensional structure that can have a tailored response to electromagnetic waves. This is different from ... electronic band-gap (EBG) based structures, as the performance for metasurfaces ... is not ... gain should be achievable. Current efforts underway include the application of metasurface results as well as previous results from virtual aperture

  17. Convolutional coding results for the MVM '73 X-band telemetry experiment

    NASA Technical Reports Server (NTRS)

    Layland, J. W.

    1978-01-01

    Results of simulation of several short-constraint-length convolutional codes using a noisy symbol stream obtained via the turnaround ranging channels of the MVM'73 spacecraft are presented. First operational use of this coding technique is on the Voyager mission. The relative performance of these codes in this environment is as previously predicted from computer-based simulations.

  18. Molecular phylogenetics of Ruscaceae sensu lato and related families (Asparagales) based on plastid and nuclear DNA sequences

    PubMed Central

    Kim, Joo-Hwan; Kim, Dong-Kap; Forest, Felix; Fay, Michael F.; Chase, Mark W.

    2010-01-01

    Background: Previous phylogenetic studies of Asparagales, although extensive and generally well supported, have left several sets of taxa unclearly placed and have not addressed all relationships within certain clades thoroughly (some clades were relatively sparsely sampled). One of the most important of these is sampling within and placement of Nolinoideae (Ruscaceae s.l.) of Asparagaceae sensu Angiosperm Phylogeny Group (APG) III, which subfamily includes taxa previously referred to Convallariaceae, Dracaenaceae, Eriospermaceae, Nolinaceae and Ruscaceae. Methods: A phylogenetic analysis of a combined data set for 126 taxa of Ruscaceae s.l. and related groups in Asparagales based on three nuclear and plastid DNA coding genes, 18S rDNA (1796 bp), rbcL (1338 bp) and matK (1668 bp), representing a total of approx. 4.8 kb, is presented. Parsimony and Bayesian inference analyses were conducted to elucidate relationships of Ruscaceae s.l. and related groups, and parsimony bootstrap analysis was performed to assess support of clades. Key Results: The combination of the three genes results in the most highly resolved and strongly supported topology yet obtained for Asparagales including Ruscaceae s.l. Asparagales relationships are nearly congruent with previous combined gene analyses, which were reflected in the APG III classification. Parsimony and Bayesian analyses yield identical relationships except for some slight variation among the core asparagoid families, which nevertheless form a strongly supported group in both types of analyses. In core asparagoids, five major clades are identified: (1) Alliaceae s.l. (sensu APG III, Amaryllidaceae–Agapanthaceae–Alliaceae); (2) Asparagaceae–Laxmanniaceae–Ruscaceae s.l.; (3) Themidaceae; (4) Hyacinthaceae; (5) Anemarrhenaceae–Behniaceae–Herreriaceae–Agavaceae (clades 2–5 collectively Asparagaceae s.l. sensu APG III). The position of Aphyllanthes is labile, but it is sister to Themidaceae in the combined maximum-parsimony tree and sister to Anemarrhenaceae in the Bayesian analysis. The highly supported clade of Xanthorrhoeaceae s.l. (sensu APG III, including Asphodelaceae and Hemerocallidaceae) is sister to the core asparagoids. Ruscaceae s.l. are a well-supported group. Asparagaceae s.s. are sister to Ruscaceae s.l., even though the clade of the two families is weakly supported; Laxmanniaceae are strongly supported as sister to Ruscaceae s.l. and Asparagaceae. Ruscaceae s.l. include six principal clades that often reflect previously named groups: (1) tribe Polygonateae (excluding Disporopsis); (2) tribe Ophiopogoneae; (3) tribe Convallarieae (excluding Theropogon); (4) Ruscaceae s.s. + Dracaenaceae + Theropogon + Disporopsis + Comospermum; (5) Nolinaceae; (6) Eriospermum. Conclusions: The analyses here were largely conducted with new data collected for the same loci as in previous studies, but in this case from different species/DNA accessions and with greater sampling in many cases than in previously published analyses; nonetheless, the results largely mirror those of previously conducted studies. This demonstrates the robustness of these results and answers questions often raised about reproducibility of DNA results, given the often sparse sampling of taxa in some studies, particularly the earliest ones. The results also provide a clear set of patterns on which to base a new classification of the subfamilies of Asparagaceae s.l., particularly Ruscaceae s.l. (= Nolinoideae of Asparagaceae s.l.), and examine other putatively important characters of Asparagales. PMID:20929900

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, Peter

    An improved microscopic cleavage model, based on a Morse-type and Lennard-Jones-type interaction instead of the previously employed half-sine function, is used to determine the maximum cleavage strength for the brittle materials diamond, tungsten, molybdenum, silicon, GaAs, silica, and graphite. The results of both interaction potentials are in much better agreement with the theoretical strength values obtained by ab initio calculations for diamond, tungsten, molybdenum, and silicon than the previous model. Reasonable estimates of the intrinsic strength are presented for GaAs, silica, and graphite, where first principles values are not available.

  20. Direct measurement of additional Ar-H2O vibration-rotation-tunneling bands in the millimeter-submillimeter range

    NASA Astrophysics Data System (ADS)

    Zou, Luyao; Widicus Weaver, Susanna L.

    2016-06-01

    Three new weak bands of the Ar-H2O vibration-rotation-tunneling spectrum have been measured in the millimeter wavelength range. These bands were predicted from combination differences based on previously measured bands in the submillimeter region. Two previously reported submillimeter bands were also remeasured with higher frequency resolution. These new measurements allow us to obtain accurate information on the Coriolis interaction between the 1_01 and 1_10 rotational states. Here we report these results and the associated improved molecular constants.

  1. The Decay of Motor Memories Is Independent of Context Change Detection

    PubMed Central

    Brennan, Andrew E.; Smith, Maurice A.

    2015-01-01

    When the error signals that guide human motor learning are withheld following training, recently-learned motor memories systematically regress toward untrained performance. It has previously been hypothesized that this regression results from an intrinsic volatility in these memories, resulting in an inevitable decay in the absence of ongoing error signals. However, a recently-proposed alternative posits that even recently-acquired motor memories are intrinsically stable, decaying only if a change in context is detected. This new theory, the context-dependent decay hypothesis, makes two key predictions: (1) after error signals are withheld, decay onset should be systematically delayed until the context change is detected; and (2) manipulations that impair detection by masking context changes should result in prolonged delays in decay onset and reduced decay amplitude at any given time. Here we examine the decay of motor adaptation following the learning of novel environmental dynamics in order to carefully evaluate this hypothesis. To account for potential issues in previous work that supported the context-dependent decay hypothesis, we measured decay using a balanced and baseline-referenced experimental design that allowed for direct comparisons between analogous masked and unmasked context changes. Using both an unbiased variant of the previous decay onset analysis and a novel highly-powered group-level version of this analysis, we found no evidence for systematically delayed decay onset nor for the masked context change affecting decay amplitude or its onset time. We further show how previous estimates of decay onset latency can be substantially biased in the presence of noise, and even more so with correlated noise, explaining the discrepancy between the previous results and our findings. Our results suggest that the decay of motor memories is an intrinsic feature of error-based learning that does not depend on context change detection. PMID:26111244

  2. Model-Based Speech Signal Coding Using Optimized Temporal Decomposition for Storage and Broadcasting Applications

    NASA Astrophysics Data System (ADS)

    Athaudage, Chandranath R. N.; Bradley, Alan B.; Lech, Margaret

    2003-12-01

    A dynamic programming-based optimization strategy for a temporal decomposition (TD) model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL) TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance on the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology of incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%-60% compression of speech spectral information with negligible degradation in the decoded speech quality.

  3. Reading Rate and Comprehension for Text Presented on Tablet and Paper: Evidence from Arabic.

    PubMed

    Hermena, Ehab W; Sheen, Mercedes; AlJassmi, Maryam; AlFalasi, Khulood; AlMatroushi, Maha; Jordan, Timothy R

    2017-01-01

    The effectiveness of tablet computers to supplement or replace paper-based text in everyday life has yet to be fully revealed. Previous investigations comparing reading performance using tablets and paper have, however, reported inconsistent results. Furthermore, the interpretability of some previous findings is limited by lack of experimental control over variables like text display conditions. In the current study, we investigated reading performance for text presented on tablet and paper. Crucially, the levels of luminance and contrast were matched precisely across tablet and paper. The study used Arabic text which differs substantially from the languages used previously to investigate effects of tablet and paper on reading, thus offering a distinctive test of the influence of these two media on reading performance. The results suggest that when text display conditions are well-matched, there is no reliable difference in reading performance between the two media. Also, neither the order of medium (reading from tablet or paper first), nor familiarity with using a tablet significantly influence reading performance. These results call into question previous suggestions that reading from tablets is linked to poorer reading performance, and demonstrate the benefits of controlling text display conditions. These findings are of interest to reading scientists and educators.

  4. Hurricane Harvey Rainfall, Did It Exceed PMP and What are the Implications?

    NASA Astrophysics Data System (ADS)

    Kappel, B.; Hultstrand, D.; Muhlestein, G.

    2017-12-01

    Rainfall resulting from Hurricane Harvey reached historic levels over the coastal regions of Texas and Louisiana during the last week of August 2017. Although extreme rainfall from landfalling tropical systems is not uncommon in the region, Harvey was unique in that it persisted over the same general location for several days, producing volumes of rainfall not previously observed in the United States. Devastating flooding and severe stress to infrastructure in the region were the result. Coincidentally, Applied Weather Associates had recently completed an updated statewide Probable Maximum Precipitation (PMP) study for Texas. This storm proved to be a real-time test of the adequacy of those values. AWA calculates PMP following a storm-based approach. This same approach was used in the HMRs. Therefore, inclusion of all PMP-type storms is critically important to ensuring that appropriate PMP values are produced. This presentation will discuss the analysis of the Harvey rainfall using the Storm Precipitation Analysis System (SPAS) program, which is used to analyze all storms included in PMP development; compare the results of the Harvey rainfall analysis against previous similar storms; and provide comparisons of the Harvey rainfall against previous and current PMP depths. Discussion will be included regarding the implications of the storm for previous and future PMP estimates, dam safety design, and infrastructure vulnerable to extreme flooding.

  5. Reading Rate and Comprehension for Text Presented on Tablet and Paper: Evidence from Arabic

    PubMed Central

    Hermena, Ehab W.; Sheen, Mercedes; AlJassmi, Maryam; AlFalasi, Khulood; AlMatroushi, Maha; Jordan, Timothy R.

    2017-01-01

    The effectiveness of tablet computers to supplement or replace paper-based text in everyday life has yet to be fully revealed. Previous investigations comparing reading performance using tablets and paper have, however, reported inconsistent results. Furthermore, the interpretability of some previous findings is limited by lack of experimental control over variables like text display conditions. In the current study, we investigated reading performance for text presented on tablet and paper. Crucially, the levels of luminance and contrast were matched precisely across tablet and paper. The study used Arabic text which differs substantially from the languages used previously to investigate effects of tablet and paper on reading, thus offering a distinctive test of the influence of these two media on reading performance. The results suggest that when text display conditions are well-matched, there is no reliable difference in reading performance between the two media. Also, neither the order of medium (reading from tablet or paper first), nor familiarity with using a tablet significantly influence reading performance. These results call into question previous suggestions that reading from tablets is linked to poorer reading performance, and demonstrate the benefits of controlling text display conditions. These findings are of interest to reading scientists and educators. PMID:28270791

  6. Generalization of Selection Test Validity.

    ERIC Educational Resources Information Center

    Colbert, G. A.; Taylor, L. R.

    1978-01-01

    This is part three of a three-part series concerned with the empirical development of homogeneous families of insurance company jobs based on data from the Position Analysis Questionnaire (PAQ). This part involves validity generalizations within the job families which resulted from the previous research. (Editor/RK)

  7. Gene expression profiles of auxin metabolism in maturing apple fruit

    USDA-ARS's Scientific Manuscript database

    Variation exists among apple genotypes in fruit maturation and ripening patterns that influences at-harvest fruit firmness and postharvest storability. Based on the results from our previous large-scale transcriptome profiling on apple fruit maturation and well-documented auxin-ethylene crosstalk, t...

  8. Elevated rates of gold mining in the Amazon revealed through high-resolution monitoring.

    PubMed

    Asner, Gregory P; Llactayo, William; Tupayachi, Raul; Luna, Ernesto Ráez

    2013-11-12

    Gold mining has rapidly increased in western Amazonia, but the rates and ecological impacts of mining remain poorly known and potentially underestimated. We combined field surveys, airborne mapping, and high-resolution satellite imaging to assess road- and river-based gold mining in the Madre de Dios region of the Peruvian Amazon from 1999 to 2012. In this period, the geographic extent of gold mining increased 400%. The average annual rate of forest loss as a result of gold mining tripled in 2008 following the global economic recession, closely associated with increased gold prices. Small clandestine operations now comprise more than half of all gold mining activities throughout the region. These rates of gold mining are far higher than previous estimates that were based on traditional satellite mapping techniques. Our results prove that gold mining is growing more rapidly than previously thought, and that high-resolution monitoring approaches are required to accurately quantify human impacts on tropical forests.

  9. Parallel, exhaustive processing underlies logarithmic search functions: Visual search with cortical magnification.

    PubMed

    Wang, Zhiyuan; Lleras, Alejandro; Buetti, Simona

    2018-04-17

    Our lab recently found evidence that efficient visual search (with a fixed target) is characterized by logarithmic Reaction Time (RT) × Set Size functions whose steepness is modulated by the similarity between target and distractors. To determine whether this pattern of results was based on low-level visual factors uncontrolled by previous experiments, we minimized the possibility of crowding effects in the display, compensated for the cortical magnification factor by magnifying search items based on their eccentricity, and compared search performance on such displays to performance on displays without magnification compensation. In both cases, the RT × Set Size functions were found to be logarithmic, and the modulation of the log slopes by target-distractor similarity was replicated. Consistent with previous results in the literature, cortical magnification compensation eliminated most target eccentricity effects. We conclude that the log functions and their modulation by target-distractor similarity relations reflect a parallel exhaustive processing architecture for early vision.
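
    As a small illustration of the logarithmic RT × Set Size functions referred to above, the Python sketch below fits RT = a + b * ln(set size); in this account, a steeper log slope b corresponds to higher target-distractor similarity. The data points are placeholders, not values from the experiments.

      import numpy as np

      set_sizes = np.array([1, 2, 4, 8, 16, 32])
      rts = np.array([420.0, 455.0, 490.0, 520.0, 555.0, 590.0])   # hypothetical mean RTs (ms)

      # Fit RT = a + b * ln(set size); b is the "log slope" discussed in the abstract.
      b, a = np.polyfit(np.log(set_sizes), rts, 1)
      print(f"intercept = {a:.0f} ms, log slope = {b:.1f} ms per log unit")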

  10. Elevated rates of gold mining in the Amazon revealed through high-resolution monitoring

    PubMed Central

    Asner, Gregory P.; Llactayo, William; Tupayachi, Raul; Luna, Ernesto Ráez

    2013-01-01

    Gold mining has rapidly increased in western Amazonia, but the rates and ecological impacts of mining remain poorly known and potentially underestimated. We combined field surveys, airborne mapping, and high-resolution satellite imaging to assess road- and river-based gold mining in the Madre de Dios region of the Peruvian Amazon from 1999 to 2012. In this period, the geographic extent of gold mining increased 400%. The average annual rate of forest loss as a result of gold mining tripled in 2008 following the global economic recession, closely associated with increased gold prices. Small clandestine operations now comprise more than half of all gold mining activities throughout the region. These rates of gold mining are far higher than previous estimates that were based on traditional satellite mapping techniques. Our results prove that gold mining is growing more rapidly than previously thought, and that high-resolution monitoring approaches are required to accurately quantify human impacts on tropical forests. PMID:24167281

  11. Upgrade Summer Severe Weather Tool in MIDDS

    NASA Technical Reports Server (NTRS)

    Wheeler, Mark M.

    2010-01-01

    The goal of this task was to upgrade the severe weather database from the previous phase by adding weather observations from the years 2004-2009, re-analyze the data to determine the important parameters, make adjustments to the index weights depending on the analysis results, and update the MIDDS GUI. The added data increased the period of record from 15 to 21 years. Data sources included local forecast rules, archived sounding data, surface and upper air maps, and two severe weather event databases covering east-central Florida. Four of the stability indices showed improved prediction of severe weather. The Total Threat Score (TTS) of the previous work was verified for the warm season of 2009 with very good skill: the TTS Probability of Detection (POD) was 88% and the False Alarm Rate (FAR) was 8%. Based on the results of the analyses, the MIDDS Severe Weather Worksheet GUI was updated to assist the duty forecaster by providing a level of objective guidance based on the analysis of the stability parameters and synoptic-scale dynamics.

  12. Efficient path-based computations on pedigree graphs with compact encodings

    PubMed Central

    2012-01-01

    A pedigree is a diagram of family relationships, and it is often used to determine the mode of inheritance (dominant, recessive, etc.) of genetic diseases. Along with rapidly growing knowledge of genetics and accumulation of genealogy information, pedigree data is becoming increasingly important. In large pedigree graphs, path-based methods for efficiently computing genealogical measurements, such as inbreeding and kinship coefficients of individuals, depend on efficient identification and processing of paths. In this paper, we propose a new compact path encoding scheme on large pedigrees, accompanied by an efficient algorithm for identifying paths. We demonstrate the utilization of our proposed method by applying it to the inbreeding coefficient computation. We present time and space complexity analysis, and also manifest the efficiency of our method for evaluating inbreeding coefficients as compared to previous methods by experimental results using pedigree graphs with real and synthetic data. Both theoretical and experimental results demonstrate that our method is more scalable and efficient than previous methods in terms of time and space requirements. PMID:22536898
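
    The compact path encoding itself is not described in the abstract; as background for what such path-based methods compute, the Python sketch below evaluates kinship and inbreeding coefficients with the standard recursive definition on a toy pedigree. The dictionary-of-parents representation and the example pedigree are illustrative only, not the paper's data structures.

      from functools import lru_cache

      # Pedigree as child -> (father, mother); None marks a founder.
      PARENTS = {
          "A": (None, None), "B": (None, None),
          "C": ("A", "B"),   "D": ("A", "B"),
          "E": ("C", "D"),   # E's parents are full siblings
      }
      ORDER = {name: i for i, name in enumerate(PARENTS)}  # founders listed first

      @lru_cache(maxsize=None)
      def kinship(x, y):
          """Probability that alleles drawn at random from x and y are identical by descent."""
          if x is None or y is None:
              return 0.0
          if x == y:
              father, mother = PARENTS[x]
              return 0.5 * (1.0 + kinship(father, mother))
          if ORDER[x] < ORDER[y]:          # always expand the later-listed (younger) individual
              x, y = y, x
          father, mother = PARENTS[x]
          return 0.5 * (kinship(father, y) + kinship(mother, y))

      def inbreeding(x):
          father, mother = PARENTS[x]
          return kinship(father, mother)

      print(inbreeding("E"))  # 0.25 for the offspring of full siblings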

  13. Eye pupil detection system using an ensemble of regression forest and fast radial symmetry transform with a near infrared camera

    NASA Astrophysics Data System (ADS)

    Jeong, Mira; Nam, Jae-Yeal; Ko, Byoung Chul

    2017-09-01

    In this paper, we focus on pupil center detection in video sequences that include varying head poses and changes in illumination. To detect the pupil center, we first find four eye landmarks in each eye using cascade local regression based on a regression forest. Based on the rough pupil location obtained in this way, a fast radial symmetry transform is applied to refine the pupil center. As the final step, the pupil displacement between the previous frame and the current frame is estimated to maintain accuracy when a false localization occurs in a particular frame. We generated a new face dataset, called Keimyung University pupil detection (KMUPD), with an infrared camera. The proposed method was successfully applied to the KMUPD dataset, and the results indicate that its pupil center detection capability is better than that of other methods, with a shorter processing time.

  14. Potential standards support for activity-based GeoINT

    NASA Astrophysics Data System (ADS)

    Antonisse, Jim

    2012-06-01

    The Motion Imagery Standards Board (MISB) is engaged in multiple initiatives that may provide support for Activity-Based GeoINT (ABG). This paper describes a suite of approaches based on previous MISB work on a standards-based architecture for tracking. It focuses on ABG in the context of standardized tracker results, and shows how the MISB tracker formulation can formalize important components of the ABG problem. The paper proposes a grammar-based formalism for the reporting of activities within a stream of FMV or wide-area surveillance data. Such a grammar would potentially provide an extensible descriptive language for ABG across the community.

  15. Predicting the required number of training samples. [for remotely sensed image data based on covariance matrix estimate quality criterion of normal distribution

    NASA Technical Reports Server (NTRS)

    Kalayeh, H. M.; Landgrebe, D. A.

    1983-01-01

    A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109

  16. High-temperature fabricable nickel-iron aluminides

    DOEpatents

    Liu, Chain T.

    1988-02-02

    Nickel-iron aluminides are described that are based on Ni3Al and have significant iron content, to which additions of hafnium, boron, carbon and cerium are made, resulting in Ni3Al-base alloys that can be fabricated at higher temperatures than similar alloys previously developed. Further addition of molybdenum improves oxidation and cracking resistance. These alloys possess the advantages of ductility, hot fabricability, strength, and oxidation resistance.

  17. Curriculum-Based Measurement of Oral Reading: An Evaluation of Growth Rates and Seasonal Effects among Students Served in General and Special Education

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Silberglitt, Benjamin; Yeo, Seungsoo; Cormier, Damien

    2010-01-01

    Curriculum-based measurement of oral reading (CBM-R) is often used to benchmark growth in the fall, winter, and spring. CBM-R is also used to set goals and monitor student progress between benchmarking occasions. The results of previous research establish an expectation that weekly growth on CBM-R tasks is consistently linear throughout the…

  18. The Development of Intention-Based Morality: The Influence of Intention Salience and Recency, Negligence, and Outcome on Children's and Adults' Judgments

    ERIC Educational Resources Information Center

    Nobes, Gavin; Panagiotaki, Georgia; Engelhardt, Paul E.

    2017-01-01

    Two experiments were conducted to investigate the influences on 4-8 year-olds' and adults' moral judgments. In both, participants were told stories from previous studies that had indicated that children's judgments are largely outcome-based. Building on recent research in which one change to these studies' methods resulted in substantially more…

  19. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-08-31

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Based on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can be different, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make this information public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of a user's head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.

  20. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User’s Head Movement

    PubMed Central

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-01-01

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Based on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can be different, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make this information public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of a user's head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest. PMID:27589768

  1. Towards an international taxonomy of integrated primary care: a Delphi consensus approach.

    PubMed

    Valentijn, Pim P; Vrijhoef, Hubertus J M; Ruwaard, Dirk; Boesveld, Inge; Arends, Rosa Y; Bruijnzeels, Marc A

    2015-05-22

    Developing integrated service models in a primary care setting is considered an essential strategy for establishing a sustainable and affordable health care system. The Rainbow Model of Integrated Care (RMIC) describes the theoretical foundations of integrated primary care. The aim of this study is to refine the RMIC by developing a consensus-based taxonomy of key features. First, the appropriateness of previously identified key features was retested by conducting an international Delphi study that was built on the results of a previous national Delphi study. Second, categorisation of the features among the RMIC integrated care domains was assessed in a second international Delphi study. Finally, a taxonomy was constructed by the researchers based on the results of the three Delphi studies. The final taxonomy consists of 21 key features distributed over eight integration domains which are organised into three main categories: scope (person-focused vs. population-based), type (clinical, professional, organisational and system) and enablers (functional vs. normative) of an integrated primary care service model. The taxonomy provides a crucial differentiation that clarifies and supports implementation, policy formulation and research regarding the organisation of integrated primary care. Further research is needed to develop instruments based on the taxonomy that can reveal the realm of integrated primary care in practice.

  2. Equivalent charge source model based iterative maximum neighbor weight for sparse EEG source localization.

    PubMed

    Xu, Peng; Tian, Yin; Lei, Xu; Hu, Xiao; Yao, Dezhong

    2008-12-01

    How to localize neural electrical activity within the brain effectively and precisely from scalp electroencephalogram (EEG) recordings is a critical issue for current work in clinical neurology and cognitive neuroscience. In this paper, based on the charge source model and an iterative re-weighting strategy, we propose a new maximum-neighbor-weight iterative sparse source imaging method, termed CMOSS (Charge source model based Maximum neighbOr weight Sparse Solution). Unlike the weight used in the focal underdetermined system solver (FOCUSS), where the weight for each point in the discrete solution space is updated independently at each iteration, the newly designed weight for each point in each iteration is determined by the previous iteration's source solution at both the point and its neighbors. Using such a weight, the next iteration has a better chance of correcting the local source-location bias present in the previous solution. Simulation studies comparing CMOSS with FOCUSS and LORETA for various source configurations were conducted on a realistic three-shell head model, and the results confirmed the validity of CMOSS for sparse EEG source localization. Finally, CMOSS was applied to localize sources elicited in a visual stimulation experiment, and the result was consistent with the source areas involved in visual processing reported in previous studies.
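
    Only the weighting idea of CMOSS is summarized above; as a rough Python sketch of the family of methods it belongs to, the function below performs a FOCUSS-style iteratively reweighted minimum-norm solve in which each new weight is the maximum solution magnitude over a point and its neighbors. The neighbor structure, regularization, and iteration count are illustrative assumptions rather than the published algorithm.

      import numpy as np

      def reweighted_sparse_solve(A, b, neighbors, n_iter=20, lam=1e-3):
          """Iteratively reweighted minimum-norm estimate of x in A x = b.

          Each iteration solves a weighted minimum-norm problem, then builds the
          next weights from the current solution, taking for every source point
          the maximum magnitude over the point and its neighbors (a stand-in for
          the maximum-neighbor-weight update described in the abstract).
          """
          n_sensors, n_sources = A.shape
          w = np.ones(n_sources)
          x = np.zeros(n_sources)
          for _ in range(n_iter):
              G = A * w                                    # column-weighted lead field
              x = w * (G.T @ np.linalg.solve(G @ G.T + lam * np.eye(n_sensors), b))
              w = np.array([max(abs(x[j]) for j in [i] + list(neighbors[i]))
                            for i in range(n_sources)])
              w = w / (w.max() + 1e-12)                    # normalize to avoid blow-up
          return x

    With empty neighbor lists, this reduces to the plain FOCUSS-style reweighting in which each weight is simply the magnitude of that point's solution from the previous iteration.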

  3. A Bacterial Analysis Platform: An Integrated System for Analysing Bacterial Whole Genome Sequencing Data for Clinical Diagnostics and Surveillance.

    PubMed

    Thomsen, Martin Christen Frølund; Ahrenfeldt, Johanne; Cisneros, Jose Luis Bellod; Jurtz, Vanessa; Larsen, Mette Voldby; Hasman, Henrik; Aarestrup, Frank Møller; Lund, Ole

    2016-01-01

    Recent advances in whole genome sequencing have made the technology available for routine use in microbiological laboratories. However, a major obstacle for using this technology is the availability of simple and automatic bioinformatics tools. Based on previously published and already available web-based tools we developed a single pipeline for batch uploading of whole genome sequencing data from multiple bacterial isolates. The pipeline will automatically identify the bacterial species and, if applicable, assemble the genome, identify the multilocus sequence type, plasmids, virulence genes and antimicrobial resistance genes. A short printable report for each sample will be provided and an Excel spreadsheet containing all the metadata and a summary of the results for all submitted samples can be downloaded. The pipeline was benchmarked using datasets previously used to test the individual services. The reported results enable a rapid overview of the major results, and comparing that to the previously found results showed that the platform is reliable and able to correctly predict the species and find most of the expected genes automatically. In conclusion, a combined bioinformatics platform was developed and made publicly available, providing easy-to-use automated analysis of bacterial whole genome sequencing data. The platform may be of immediate relevance as a guide for investigators using whole genome sequencing for clinical diagnostics and surveillance. The platform is freely available at: https://cge.cbs.dtu.dk/services/CGEpipeline-1.1 and it is the intention that it will continue to be expanded with new features as these become available.

  4. Smoking-related deaths averted due to three years of policy progress

    PubMed Central

    Ellis, Jennifer A; Mays, Darren; Huang, An-Tsun

    2013-01-01

    Objective: To evaluate the global impact of adopting highest-level MPOWER tobacco control policies in different countries and territories from 2007 to 2010. Methods: Policy effect sizes based on previously-validated SimSmoke models were applied to determine the reduction in the number of smokers as a result of policy adoption during this period. Based on previous research suggesting that half of all smokers die from smoking, we also derived the estimated smoking-attributable deaths (SADs) averted due to MPOWER policy implementation. The results from use of this simple yet powerful method are consistent with those predicted by using previously validated SimSmoke models. Findings: In total, 41 countries adopted at least one highest-level MPOWER policy between 2007 and 2010. As a result of all policies adopted during this period, the number of smokers is estimated to have dropped by 14.8 million, with a total of 7.4 million SADs averted. The largest number of SADs was averted as a result of increased cigarette taxes (3.5 million), smoke-free air laws (2.5 million), health warnings (700 000), cessation treatments (380 000), and bans on tobacco marketing (306 000). Conclusion: From 2007 to 2010, 41 countries and territories took action that will collectively prevent nearly 7.5 million smoking-related deaths globally. These findings demonstrate the magnitude of the actions already taken by countries and underscore the potential for millions of additional lives to be saved with continued adoption of MPOWER policies. PMID:23825878

  5. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
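
    The "simple combination rule" is not spelled out in the abstract beyond being based on a normalization of each method's results; one plausible reading is sketched below in Python, where each hazard map is min-max normalized and the three maps are then averaged with equal weights. The equal weighting is an assumption made only for illustration.

      import numpy as np

      def combine_hazard(maps):
          """Normalize each method's hazard map to [0, 1] and average them.

          `maps` holds the heuristic, landslide-index, and weights-of-evidence
          results as same-shaped arrays; the equal-weight average is only one
          possible combination rule, not necessarily the one used in the study.
          """
          normed = []
          for m in maps:
              m = np.asarray(m, dtype=float)
              normed.append((m - m.min()) / (m.max() - m.min() + 1e-12))
          return np.mean(normed, axis=0)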

  6. Exponential Stability of Almost Periodic Solutions for Memristor-Based Neural Networks with Distributed Leakage Delays.

    PubMed

    Xu, Changjin; Li, Peiluan; Pang, Yicheng

    2016-12-01

    In this letter, we deal with a class of memristor-based neural networks with distributed leakage delays. By applying a new Lyapunov function method, we obtain some sufficient conditions that ensure the existence, uniqueness, and global exponential stability of almost periodic solutions of the networks. We then apply these results to establish the existence and stability of periodic solutions for the delayed neural network with periodic coefficients. We also provide an example to illustrate the effectiveness of the theoretical results. Our results are completely new and complement the previous studies of Chen, Zeng, and Jiang (2014) and Jiang, Zeng, and Chen (2015).

  7. Scene-based nonuniformity correction with reduced ghosting using a gated LMS algorithm.

    PubMed

    Hardie, Russell C; Baxley, Frank; Brys, Brandon; Hytla, Patrick

    2009-08-17

    In this paper, we present a scene-based nonuniformity correction (NUC) method using a modified adaptive least mean square (LMS) algorithm with a novel gating operation on the updates. The gating is designed to significantly reduce ghosting artifacts produced by many scene-based NUC algorithms by halting updates when temporal variation is lacking. We define the algorithm and present a number of experimental results to demonstrate the efficacy of the proposed method in comparison to several previously published methods, including other LMS and constant-statistics-based methods. The experimental results include simulated imagery and a real infrared image sequence. We show that the proposed method significantly reduces ghosting artifacts, but has a slightly longer convergence time. (c) 2009 Optical Society of America
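
    The abstract names the key ingredients of the method, an adaptive LMS update of per-pixel correction terms and a gate that halts updates where temporal variation is lacking, without the full algorithm. The sketch below is a generic gated-LMS nonuniformity correction in that spirit; the desired-frame estimate (a global mean), the step size, and the motion threshold are assumptions, not the published formulation.

    import numpy as np

    # Hedged sketch of gated LMS nonuniformity correction: per-pixel gain/offset are
    # nudged toward a nonuniformity-free estimate of the scene, but only at pixels
    # where the frame-to-frame change exceeds a threshold (the "gate"), which is what
    # suppresses ghosting when the scene is static.
    def gated_lms_nuc(frames, mu=1e-3, motion_thresh=2.0):
        gain = np.ones_like(frames[0], dtype=float)
        offset = np.zeros_like(frames[0], dtype=float)
        prev = frames[0].astype(float)
        corrected = []
        for raw in frames:
            raw = raw.astype(float)
            y = gain * raw + offset                       # corrected frame
            desired = np.mean(y)                          # crude stand-in for a smoothed scene estimate
            err = desired - y
            gate = np.abs(raw - prev) > motion_thresh     # update only where the scene changed
            gain += mu * err * raw * gate
            offset += mu * err * gate
            prev = raw
            corrected.append(y)
        return corrected

    # usage: corrected = gated_lms_nuc([np.random.rand(64, 64) * 100 for _ in range(20)])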

  8. Aqueous chloride stress corrosion cracking of titanium - A comparison with environmental hydrogen embrittlement

    NASA Technical Reports Server (NTRS)

    Nelson, H. G.

    1974-01-01

    The physical characteristics of stress corrosion cracking of titanium in an aqueous chloride environment are compared with those of embrittlement of titanium by a gaseous hydrogen environment in an effort to help contribute to the understanding of the possible role of hydrogen in the complex stress corrosion cracking process. Based on previous studies, the two forms of embrittlement are shown to be similar at low hydrogen pressures (100 N/sq m) but dissimilar at higher hydrogen pressures. In an effort to quantify this comparison, tests were conducted in an aqueous chloride solution using the same material and test techniques as had previously been employed in a gaseous hydrogen environment. The results of these tests strongly support models based on hydrogen as the embrittling species in an aqueous chloride environment.

  9. IMPACT OF NEW GAMOW–TELLER STRENGTHS ON EXPLOSIVE TYPE IA SUPERNOVA NUCLEOSYNTHESIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mori, Kanji; Famiano, Michael A.; Kajino, Toshitaka

    2016-12-20

    Recent experimental results have confirmed a possible reduction in the Gamow–Teller (GT+) strengths of pf-shell nuclei. These proton-rich nuclei are of relevance in the deflagration and explosive burning phases of SNe Ia. While prior GT strengths result in nucleosynthesis predictions with a lower-than-expected electron fraction, a reduction in the GT+ strength can result in a slightly increased electron fraction compared to previous shell model predictions, though the enhancement is not as large as previous enhancements in going from rates computed by Fuller, Fowler, and Newman based on an independent particle model. A shell model parametrization has been developed that more closely matches experimental GT strengths. The resultant electron-capture rates are used in nucleosynthesis calculations for carbon deflagration and explosion phases of SNe Ia, and the final mass fractions are compared to those obtained using more commonly used rates.

  10. Impact of New Gamow-Teller Strengths on Explosive Type Ia Supernova Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Mori, Kanji; Famiano, Michael A.; Kajino, Toshitaka; Suzuki, Toshio; Hidaka, Jun; Honma, Michio; Iwamoto, Koichi; Nomoto, Ken'ichi; Otsuka, Takaharu

    2016-12-01

    Recent experimental results have confirmed a possible reduction in the Gamow-Teller (GT+) strengths of pf-shell nuclei. These proton-rich nuclei are of relevance in the deflagration and explosive burning phases of SNe Ia. While prior GT strengths result in nucleosynthesis predictions with a lower-than-expected electron fraction, a reduction in the GT+ strength can result in a slightly increased electron fraction compared to previous shell model predictions, though the enhancement is not as large as previous enhancements in going from rates computed by Fuller, Fowler, and Newman based on an independent particle model. A shell model parametrization has been developed that more closely matches experimental GT strengths. The resultant electron-capture rates are used in nucleosynthesis calculations for carbon deflagration and explosion phases of SNe Ia, and the final mass fractions are compared to those obtained using more commonly used rates.

  11. The Influence of Atmosphere-Ocean Interaction on MJO Development and Propagation

    DTIC Science & Technology

    2014-09-30

    evaluate modeling results and process studies. The field phase of this project is associated with DYNAMO, which is the US contribution to the...influence on ocean temperature 4. Extended run for DYNAMO with high vertical resolution NCOM RESULTS Summary of project results The work funded...model experiments of the November 2011 MJO – the strongest MJO episode observed during DYNAMO. The previous conceptual model that was based on TOGA

  12. Discovering discovery patterns with Predication-based Semantic Indexing.

    PubMed

    Cohen, Trevor; Widdows, Dominic; Schvaneveldt, Roger W; Davies, Peter; Rindflesch, Thomas C

    2012-12-01

    In this paper we utilize methods of hyperdimensional computing to mediate the identification of therapeutically useful connections for the purpose of literature-based discovery. Our approach, named Predication-based Semantic Indexing, is utilized to identify empirically sequences of relationships known as "discovery patterns", such as "drug x INHIBITS substance y, substance y CAUSES disease z" that link pharmaceutical substances to diseases they are known to treat. These sequences are derived from semantic predications extracted from the biomedical literature by the SemRep system, and subsequently utilized to direct the search for known treatments for a held out set of diseases. Rapid and efficient inference is accomplished through the application of geometric operators in PSI space, allowing for both the derivation of discovery patterns from a large set of known TREATS relationships, and the application of these discovered patterns to constrain search for therapeutic relationships at scale. Our results include the rediscovery of discovery patterns that have been constructed manually by other authors in previous research, as well as the discovery of a set of previously unrecognized patterns. The application of these patterns to direct search through PSI space results in better recovery of therapeutic relationships than is accomplished with models based on distributional statistics alone. These results demonstrate the utility of efficient approximate inference in geometric space as a means to identify therapeutic relationships, suggesting a role of these methods in drug repurposing efforts. In addition, the results provide strong support for the utility of the discovery pattern approach pioneered by Hristovski and his colleagues. Copyright © 2012 Elsevier Inc. All rights reserved.
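
    PSI itself relies on particular hyperdimensional binding operators and a trained semantic space, which the abstract does not spell out. As a generic illustration of the underlying idea, searching relational structure by binding and unbinding high-dimensional vectors, the sketch below uses a simple bipolar vector-symbolic scheme (elementwise multiplication as a self-inverse binding); it is not the PSI implementation, and all vectors are random placeholders.

    import numpy as np

    # Hedged vector-symbolic sketch: relations and concepts are random bipolar vectors,
    # binding is elementwise multiplication, superposition is addition, and similarity
    # is a normalised dot product. Unbinding a relation from the "memory" recovers a
    # noisy copy of the concept it was bound to.
    rng = np.random.default_rng(0)
    D = 10_000

    def rand_vec():
        return rng.choice([-1.0, 1.0], size=D)

    def sim(a, b):
        return float(a @ b) / D

    INHIBITS, CAUSES = rand_vec(), rand_vec()
    substance, disease = rand_vec(), rand_vec()

    # Encode two known facts (objects bound to their relations) in one memory vector.
    memory = INHIBITS * substance + CAUSES * disease

    probe = INHIBITS * memory        # unbind the INHIBITS relation
    print(sim(probe, substance))     # close to 1: the inhibited substance is recovered
    print(sim(probe, disease))       # close to 0: the unrelated concept is not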

  13. Discovering discovery patterns with predication-based Semantic Indexing

    PubMed Central

    Cohen, Trevor; Widdows, Dominic; Schvaneveldt, Roger W.; Davies, Peter; Rindflesch, Thomas C.

    2012-01-01

    In this paper we utilize methods of hyperdimensional computing to mediate the identification of therapeutically useful connections for the purpose of literature-based discovery. Our approach, named Predication-based Semantic Indexing, is utilized to identify empirically sequences of relationships known as “discovery patterns”, such as “drug x INHIBITS substance y, substance y CAUSES disease z” that link pharmaceutical substances to diseases they are known to treat. These sequences are derived from semantic predications extracted from the biomedical literature by the SemRep system, and subsequently utilized to direct the search for known treatments for a held out set of diseases. Rapid and efficient inference is accomplished through the application of geometric operators in PSI space, allowing for both the derivation of discovery patterns from a large set of known TREATS relationships, and the application of these discovered patterns to constrain search for therapeutic relationships at scale. Our results include the rediscovery of discovery patterns that have been constructed manually by other authors in previous research, as well as the discovery of a set of previously unrecognized patterns. The application of these patterns to direct search through PSI space results in better recovery of therapeutic relationships than is accomplished with models based on distributional statistics alone. These results demonstrate the utility of efficient approximate inference in geometric space as a means to identify therapeutic relationships, suggesting a role of these methods in drug repurposing efforts. In addition, the results provide strong support for the utility of the discovery pattern approach pioneered by Hristovski and his colleagues. PMID:22841748

  14. Evaluation of Alternative Altitude Scaling Methods for Thermal Ice Protection System in NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Addy, Harold; Broeren, Andy P.; Orchard, David M.

    2017-01-01

    A test was conducted at the NASA Icing Research Tunnel to evaluate altitude scaling methods for a thermal ice protection system. Two scaling methods based on the Weber number were compared against a method based on the Reynolds number. The results generally agreed with the previous set of tests conducted in the NRCC Altitude Icing Wind Tunnel. The Weber-number-based scaling methods resulted in smaller runback ice mass than the Reynolds-number-based scaling method, and the ice accretions from the Weber-number-based scaling methods also formed farther upstream. However, there were large differences in accreted ice mass between the two Weber-number-based scaling methods, and the difference became greater as speed increased. This indicated that there may be some Reynolds number effects that are not fully accounted for, which warrants further study.
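
    For readers unfamiliar with the two similarity parameters being compared, the sketch below computes generic Weber and Reynolds numbers and shows why matching one across conditions generally mismatches the other: We grows with the square of speed while Re grows linearly. The property values, the characteristic length, and the particular forms of We and Re used here are illustrative assumptions; the test's actual scaling laws may use different reference quantities.

    # Hedged illustration of the speed dependence of the two similarity parameters.
    #   We = rho * v**2 * L / sigma   (inertial vs. surface-tension forces)
    #   Re = rho * v * L / mu         (inertial vs. viscous forces)
    def weber(rho, v, L, sigma):
        return rho * v**2 * L / sigma

    def reynolds(rho, v, L, mu):
        return rho * v * L / mu

    rho_air, mu_air = 1.0, 1.7e-5    # kg/m^3, Pa*s (illustrative, altitude-dependent)
    sigma_water = 0.076              # N/m (illustrative)
    L = 0.5                          # m, characteristic length (assumed)

    for v in (50.0, 100.0):          # m/s
        print(v, weber(rho_air, v, L, sigma_water), reynolds(rho_air, v, L, mu_air))
    # Doubling v quadruples We but only doubles Re, so the two scaling approaches
    # diverge more as speed increases, consistent with the trend reported above.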

  15. Characterization of Developmental Disability in Children's Fiction

    ERIC Educational Resources Information Center

    Dyches, Tina Taylor; Prater, Mary Anne

    2005-01-01

    Based on the Dyches and Prater (2000) guidelines, characterizations and plots in 34 eligible children's books published during 1999-2003 were evaluated; 36 characterizations are discussed in detail in terms of each guideline. Results showed that, compared to a previous study (Dyches, Prater, & Cramer, 2001), characters with developmental…

  16. Shaping Approach Responses as Intervention for Specific Phobia in a Child with Autism

    ERIC Educational Resources Information Center

    Ricciardi, Joseph N.; Luiselli, James K.; Camare, Marianne

    2006-01-01

    We evaluated contact desensitization (reinforcing approach responses) as intervention for specific phobia with a child diagnosed with autism. During hospital-based intervention, the boy was able to encounter previously avoided stimuli. Parental report suggested that results were maintained postdischarge. (Contains 1 figure.)

  17. Discriminating Dysarthria Type from Envelope Modulation Spectra

    ERIC Educational Resources Information Center

    Liss, Julie M.; LeGendre, Sue; Lotto, Andrew J.

    2010-01-01

    Purpose: Previous research demonstrated the ability of temporally based rhythm metrics to distinguish among dysarthrias with different prosodic deficit profiles (J. M. Liss et al., 2009). The authors examined whether comparable results could be obtained by an automated analysis of speech envelope modulation spectra (EMS), which quantifies the…

  18. Middle-term Metropolitan Water Availability Index Assessment Based on Synergistic Potentials of Multi-sensor Data

    EPA Science Inventory

    The impact of recent drought and water pollution episodes results in an acute need to project future water availability to assist water managers in water utility infrastructure management within many metropolitan regions. Separate drought and water quality indices previously deve...

  19. Leveraging Failure in Design Research

    ERIC Educational Resources Information Center

    Lobato, Joanne; Walters, C. David; Hohensee, Charles; Gruver, John; Diamond, Jaime Marie

    2015-01-01

    Even in the resource-rich, more ideal conditions of many design-based classroom interventions, unexpected events can lead to disappointing results in student learning. However, if later iterations in a design research study are more successful, the previous failures can provide opportunities for comparisons to reveal subtle differences in…

  20. Closing the loop: from paper to protein annotation using supervised Gene Ontology classification.

    PubMed

    Gobeill, Julien; Pasche, Emilie; Vishnyakova, Dina; Ruch, Patrick

    2014-01-01

    Gene function curation of the literature with Gene Ontology (GO) concepts is a particularly time-consuming task in genomics, and help from bioinformatics is in high demand to keep up with the flow of publications. In 2004, the first BioCreative challenge already included a task of automatic GO concept assignment from full text. At that time, results were judged far from the performance required by real curation workflows. In particular, supervised approaches produced the most disappointing results because of a lack of training data. Ten years later, the available curation data have grown massively. In 2013, the BioCreative IV GO task revisited automatic GO assignment. For this task, we investigated the power of our supervised classifier, GOCat. GOCat computes similarities between an input text and already-curated instances contained in a knowledge base to infer GO concepts. Subtask A consisted of selecting GO evidence sentences for a relevant gene in a full text; for this, we designed a state-of-the-art supervised statistical approach using a naïve Bayes classifier and the official training set, and obtained fair results. Subtask B consisted of predicting GO concepts from the previous output; for this, we applied GOCat and achieved leading results, with hierarchical recall of up to 65% in the top 20 returned concepts. Contrary to previous competitions, machine learning this time outperformed standard dictionary-based approaches. Thanks to BioCreative IV, we were able to design a complete curation workflow: given a gene name and a full text, the system selects evidence sentences for curation and delivers highly relevant GO concepts. Observed performance is sufficient for use in a real semi-automatic curation workflow. GOCat is available at http://eagl.unige.ch/GOCat/ and http://eagl.unige.ch/GOCat4FT/. © The Author(s) 2014. Published by Oxford University Press.
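
    GOCat's core step, scoring an input text against already-curated instances and inferring GO concepts from the most similar ones, resembles a k-nearest-neighbour text classifier. The sketch below shows that general flavour with TF-IDF vectors and cosine similarity; the tiny "knowledge base", the GO identifiers, and the similarity-weighted voting are placeholders, not GOCat's actual implementation.

    from collections import Counter
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hedged sketch of similarity-based GO concept assignment: vote concepts from the
    # curated texts most similar to the input sentence. Toy data only.
    kb_texts = [
        "protein kinase activity phosphorylates serine residues",
        "transcription factor binds DNA at the promoter region",
        "membrane transporter moves glucose across the plasma membrane",
    ]
    kb_concepts = [["GO:0004672"], ["GO:0003700", "GO:0003677"], ["GO:0022857"]]

    vectorizer = TfidfVectorizer()
    kb_matrix = vectorizer.fit_transform(kb_texts)

    def predict_go(text, top_k=2):
        sims = cosine_similarity(vectorizer.transform([text]), kb_matrix).ravel()
        votes = Counter()
        for idx in sims.argsort()[::-1][:top_k]:
            for concept in kb_concepts[idx]:
                votes[concept] += sims[idx]
        return votes.most_common()

    print(predict_go("this kinase phosphorylates a serine residue in the substrate"))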

  1. Shipborne LF-VLF oceanic lightning observations and modeling

    NASA Astrophysics Data System (ADS)

    Zoghzoghy, F. G.; Cohen, M. B.; Said, R. K.; Lehtinen, N. G.; Inan, U. S.

    2015-10-01

    Approximately 90% of natural lightning occurs over land, but recent observations, using Global Lightning Detection (GLD360) geolocation peak current estimates and satellite optical data, suggested that cloud-to-ground flashes are on average stronger over the ocean. We present initial statistics from a novel experiment using a Low Frequency (LF) magnetic field receiver system installed aboard the National Oceanic and Atmospheric Administration (NOAA) research vessel Ronald H. Brown that allowed the detection of impulsive radio emissions from deep-oceanic discharges at short distances. Thousands of LF waveforms were recorded, facilitating the comparison of oceanic waveforms to their land counterparts. A computationally efficient electromagnetic radiation model that accounts for propagation over lossy and curved ground is constructed and compared with previously published models. We include the effects of Earth curvature on LF ground wave propagation and quantify the effects of channel-base current risetime, channel-base current falltime, and return stroke speed on the radiated LF waveforms observed at a given distance. We compare simulation results to data and conclude that previously reported larger GLD360 peak current estimates over the ocean are unlikely to fully result from differences in channel-base current risetime, falltime, or return stroke speed between ocean and land flashes.

  2. The value and impact of information provided through library services for patient care: developing guidance for best practice.

    PubMed

    Weightman, Alison; Urquhart, Christine; Spink, Siân; Thomas, Rhian

    2009-03-01

    Previous impact tool-kits for UK health libraries required updating to reflect recent evidence and changes in library services. The National Knowledge Service funded development of updated guidance. Survey tools were developed based on previous impact studies and a systematic review. The resulting draft questionnaire survey was tested at four sites, and the interview schedule was investigated in a fifth area. A literature search in ASSIA, Google Scholar, INTUTE, LISA, LISTA, SCIRUS, Social Sciences Citation Index (Web of Knowledge), and the major UK University and National Libraries Catalogue (COPAC) identified ways to improve response rates. Other expert advice contributed to the guidance. The resulting guidance contains evidence-based advice and a planning pathway for conducting an impact survey as a service audit. The survey tools (critical incident questionnaire and interview schedule) are available online. The evidence-based advice recommends personalizing the request, assuring confidentiality, and using follow-up reminders. Questionnaires should be brief, and small incentives, such as a lottery draw, should be considered. Bias is minimized if the survey is conducted and analysed by independent researchers. The guidance is a starting point for a pragmatic survey to assess the impact of health library services.

  3. An ``Alternating-Curvature'' Model for the Nanometer-scale Structure of the Nafion Ionomer, Based on Backbone Properties Detected by NMR

    NASA Astrophysics Data System (ADS)

    Schmidt-Rohr, Klaus; Chen, Q.

    2006-03-01

    The perfluorinated ionomer, Nafion, which consists of a (-CF2-)n backbone and charged side branches, is useful as a proton exchange membrane in H2/O2 fuel cells. A modified model of the nanometer-scale structure of hydrated Nafion will be presented. It features hydrated ionic clusters familiar from some previous models, but is based most prominently on pronounced backbone rigidity between branch points and limited orientational correlation of local chain axes. These features have been revealed by solid-state NMR measurements, which take advantage of fast rotations of the backbones around their local axes. The resulting alternating curvature of the backbones towards the hydrated clusters also better satisfies the requirement of dense space filling in solids. Simulations based on this ``alternating curvature'' model reproduce orientational correlation data from NMR, as well as scattering features such as the ionomer peak and the I(q) ~ 1/q power law at small q values, which can be attributed to modulated cylinders resulting from the chain stiffness. The shortcomings of previous models, including Gierke's cluster model and more recent lamellar or bundle models, in matching all requirements imposed by the experimental data will be discussed.

  4. Analysis of Learning Tools in the study of Developmental of Interactive Multimedia Based Physic Learning Charged in Problem Solving

    NASA Astrophysics Data System (ADS)

    Manurung, Sondang; Demonta Pangabean, Deo

    2017-05-01

    The main purpose of this study is to produce a needs analysis, a literature review, and learning tools for the development of interactive multimedia-based physics learning oriented toward problem solving, in order to improve the thinking ability of prospective physics students. The first-year result of the study is a draft based on a needs analysis of conditions in the field, the existing learning situation, and literature studies, followed by the design of learning devices and instruments and the development of the media. The second-stage result is an interactive multimedia-based physics learning device oriented toward problem solving, in the form of textbooks and scientific publications. The learning models were first tested on a limited sample and then evaluated and revised. The product of the research also has economic value because (1) the virtual laboratory offered by this research provides an alternative to purchasing expensive physics laboratory equipment; (2) it addresses the shortage of physics teachers in remote areas, since the learning tool can be accessed both offline and online; and (3) it reduces consumable materials, as tutorials can be completed online. The target of the first year of the research is a physics learning storyboard delivered in web form on CD (compact disc), together with interactive multimedia for the kinetic theory of gases concept.

  5. Reexamining Sample Size Requirements for Multivariate, Abundance-Based Community Research: When Resources are Limited, the Research Does Not Have to Be.

    PubMed

    Forcino, Frank L; Leighton, Lindsey R; Twerdy, Pamela; Cahill, James F

    2015-01-01

    Community ecologists commonly perform multivariate techniques (e.g., ordination, cluster analysis) to assess patterns and gradients of taxonomic variation. A critical requirement for a meaningful statistical analysis is accurate information on the taxa found within an ecological sample. However, oversampling (too many individuals counted per sample) also comes at a cost, particularly for ecological systems in which identification and quantification is substantially more resource consuming than the field expedition itself. In such systems, an increasingly larger sample size will eventually result in diminishing returns in improving any pattern or gradient revealed by the data, but will also lead to continually increasing costs. Here, we examine 396 datasets: 44 previously published and 352 created datasets. Using meta-analytic and simulation-based approaches, the research within the present paper seeks (1) to determine minimal sample sizes required to produce robust multivariate statistical results when conducting abundance-based, community ecology research. Furthermore, we seek (2) to determine the dataset parameters (i.e., evenness, number of taxa, number of samples) that require larger sample sizes, regardless of resource availability. We found that in the 44 previously published and the 220 created datasets with randomly chosen abundances, a conservative estimate of a sample size of 58 produced the same multivariate results as all larger sample sizes. However, this minimal number varies as a function of evenness, where increased evenness resulted in increased minimal sample sizes. Sample sizes as small as 58 individuals are sufficient for a broad range of multivariate abundance-based research. In cases when resource availability is the limiting factor for conducting a project (e.g., small university, time to conduct the research project), statistically viable results can still be obtained with less of an investment.
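
    The simulation logic described, subsampling individuals and checking when the multivariate pattern stabilises, can be sketched as follows: rarefy each site to a target number of individuals, recompute a Bray-Curtis distance matrix, and correlate it with the full-data matrix. The synthetic community matrix, the sample sizes tested, and the use of a plain correlation as the stability criterion are assumptions for illustration, not the paper's exact protocol.

    import numpy as np
    from scipy.spatial.distance import pdist

    # Hedged sketch: how closely does the distance structure of rarefied samples
    # track that of the full data as the per-site sample size grows?
    rng = np.random.default_rng(1)
    full = rng.poisson(rng.uniform(0.5, 8, size=(10, 25)))   # 10 sites x 25 taxa (toy data)

    def subsample_counts(counts, n):
        """Draw n individuals without replacement from one site's abundance vector."""
        pool = np.repeat(np.arange(counts.size), counts)
        picked = rng.choice(pool, size=min(n, pool.size), replace=False)
        return np.bincount(picked, minlength=counts.size)

    full_dist = pdist(full, metric="braycurtis")
    for n in (10, 25, 58, 100):
        sub = np.array([subsample_counts(row, n) for row in full])
        r = np.corrcoef(pdist(sub, metric="braycurtis"), full_dist)[0, 1]
        print(n, round(r, 3))   # correlations near 1 indicate the pattern has stabilised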

  6. AveBoost2: Boosting for Noisy Data

    NASA Technical Reports Server (NTRS)

    Oza, Nikunj C.

    2004-01-01

    AdaBoost is a well-known ensemble learning algorithm that constructs its constituent or base models in sequence. A key step in AdaBoost is constructing a distribution over the training examples to create each base model. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence. The idea is to make the next base model's errors uncorrelated with those of the previous model. In previous work, we developed an algorithm, AveBoost, that constructed distributions orthogonal to the mistake vectors of all the previous models, and then averaged them to create the next base model's distribution. Our experiments demonstrated the superior accuracy of our approach. In this paper, we slightly revise our algorithm to allow us to obtain non-trivial theoretical results: bounds on the training error and generalization error (difference between training and test error). Our averaging process has a regularizing effect which, as expected, leads to a worse training error bound for our algorithm than for AdaBoost but a superior generalization error bound. For this paper, we experimented with the same datasets both as originally supplied and with added label noise, in which a small fraction of the data has its original label changed. Noisy data are notoriously difficult for AdaBoost to learn. Our algorithm's performance improvement over AdaBoost is even greater on the noisy data than on the original data.
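
    The distinguishing step described above is that the distribution used to train each new base model reflects all previous rounds rather than only the last one. The sketch below shows that idea by training stumps on the running average of AdaBoost-style distributions; the weak learner, the error clipping, and the exact update are simplified assumptions, not the published AveBoost2 algorithm or its theoretical bounds.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Hedged sketch in the spirit of AveBoost: average the per-round distributions and
    # train the next stump on that average, instead of on the latest distribution alone.
    def aveboost_sketch(X, y, n_rounds=10):        # y must be labelled +1 / -1
        n = len(y)
        d = np.full(n, 1.0 / n)                    # current distribution
        d_sum = np.zeros(n)                        # running sum of distributions
        models, alphas = [], []
        for t in range(n_rounds):
            d_sum += d
            d_train = d_sum / (t + 1)              # averaged distribution (the key twist)
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=d_train)
            pred = stump.predict(X)
            err = np.clip(np.sum(d_train * (pred != y)), 1e-10, 0.4999)
            alpha = 0.5 * np.log((1 - err) / err)
            d = d_train * np.exp(-alpha * y * pred)    # AdaBoost-style reweighting
            d /= d.sum()
            models.append(stump)
            alphas.append(alpha)
        return models, alphas

    # usage on a toy problem:
    X = np.random.randn(200, 5)
    y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
    models, alphas = aveboost_sketch(X, y)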

  7. Complexity in Acid–Base Titrations: Multimer Formation Between Phosphoric Acids and Imines

    PubMed Central

    Malm, Christian; Kim, Heejae; Wagner, Manfred

    2017-01-01

    Abstract Solutions of Brønsted acids with bases in aprotic solvents are not only common model systems to study the fundamentals of proton transfer pathways but are also highly relevant to Brønsted acid catalysis. Despite their importance the light nature of the proton makes characterization of acid–base aggregates challenging. Here, we track such acid–base interactions over a broad range of relative compositions between diphenyl phosphoric acid and the base quinaldine in dichloromethane, by using a combination of dielectric relaxation and NMR spectroscopy. In contrast to what one would expect for an acid–base titration, we find strong deviations from quantitative proton transfer from the acid to the base. Even for an excess of the base, multimers consisting of one base and at least two acid molecules are formed, in addition to the occurrence of proton transfer from the acid to the base and simultaneous formation of ion pairs. For equimolar mixtures such multimers constitute about one third of all intermolecular aggregates. Quantitative analysis of our results shows that the acid‐base association constant is only around six times larger than that for the acid binding to an acid‐base dimer, that is, to an already protonated base. Our findings have implications for the interpretation of previous studies of reactive intermediates in organocatalysis and provide a rationale for previously observed nonlinear effects in phosphoric acid catalysis. PMID:28597513

  8. Infant polysomnography: reliability and validity of infant arousal assessment.

    PubMed

    Crowell, David H; Kulp, Thomas D; Kapuniai, Linda E; Hunt, Carl E; Brooks, Lee J; Weese-Mayer, Debra E; Silvestri, Jean; Ward, Sally Davidson; Corwin, Michael; Tinsley, Larry; Peucker, Mark

    2002-10-01

    Infant arousal scoring based on the Atlas Task Force definition of transient EEG arousal was evaluated to determine (1) whether transient arousals can be identified and assessed reliably in infants and (2) whether arousal and no-arousal epochs scored previously by trained raters can be validated reliably by independent sleep experts. Phase I for inter- and intrarater reliability scoring was based on two datasets of sleep epochs selected randomly from nocturnal polysomnograms of healthy full-term, preterm, idiopathic apparent life-threatening event cases, and siblings of Sudden Infant Death Syndrome infants of 35 to 64 weeks postconceptional age. After training, test set 1 reliability was assessed and discrepancies identified. After retraining, test set 2 was scored by the same raters to determine interrater reliability. Later, three raters from the trained group rescored test set 2 to assess inter- and intrarater reliabilities. Interrater and intrarater reliability kappas, with 95% confidence intervals, ranged from substantial to almost perfect levels of agreement. Interrater reliabilities for spontaneous arousals were initially moderate and then substantial. During the validation phase, 315 previously scored epochs were presented to four sleep experts to rate as containing arousal or no-arousal events. Interrater expert agreements were diverse and considered noninterpretable. Concordance in sleep experts' agreements, based on identification of the previously sampled arousal and no-arousal epochs, was used as a secondary evaluative technique. Results showed agreement by two or more experts on 86% of the Collaborative Home Infant Monitoring Evaluation Study arousal-scored events. Conversely, only 1% of the Collaborative Home Infant Monitoring Evaluation Study-scored no-arousal epochs were rated as an arousal. In summary, this study presents an empirically tested model with procedures and criteria for attaining improved reliability in transient EEG arousal assessments in infants using the modified Atlas Task Force standards. With training based on specific criteria, substantial inter- and intrarater agreement in identifying infant arousals was demonstrated. Corroborative validation results were too disparate for meaningful interpretation. Alternate evaluation based on concordance agreements supports reliance on infant EEG criteria for assessment. Results mandate additional confirmatory validation studies with specific training on infant EEG arousal assessment criteria.

  9. Environmental analysis of plastic production processes: comparing petroleum-based polypropylene and polyethylene with biologically-based poly-beta-hydroxybutyric acid using life cycle analysis.

    PubMed

    Harding, K G; Dennis, J S; von Blottnitz, H; Harrison, S T L

    2007-05-31

    Polymers based on olefins have wide commercial applicability. However, they are made from non-renewable resources and are characterised by difficulty in disposal where recycle and re-use is not feasible. Poly-beta-hydroxybutyric acid (PHB) provides one example of a polymer made from renewable resources. Before motivating its widespread use, the advantages of a renewable polymer must be weighed against the environmental aspects of its production. Previous studies relating the environmental impacts of petroleum-based and bio-plastics have centred on the impact categories of global warming and fossil fuel depletion. Cradle-to-grave studies report equivalent or reduced global warming impacts, in comparison to equivalent polyolefin processes. This stems from a perceived CO(2) neutral status of the renewable resource. Indeed, no previous work has reported the results of a life cycle assessment (LCA) giving the environmental impacts in all major categories. This study investigates a cradle-to-gate LCA of PHB production taking into account net CO(2) generation and all major impact categories. It compares the findings with similar studies of polypropylene (PP) and polyethylene (PE). It is found that, in all of the life cycle categories, PHB is superior to PP. Energy requirements are slightly lower than previously observed and significantly lower than those for polyolefin production. PE impacts are lower than PHB values in acidification and eutrophication.

  10. A global analysis of Spitzer and new HARPS data confirms the loneliness and metal-richness of GJ 436 b

    NASA Astrophysics Data System (ADS)

    Lanotte, A. A.; Gillon, M.; Demory, B.-O.; Fortney, J. J.; Astudillo, N.; Bonfils, X.; Magain, P.; Delfosse, X.; Forveille, T.; Lovis, C.; Mayor, M.; Neves, V.; Pepe, F.; Queloz, D.; Santos, N.; Udry, S.

    2014-12-01

    Context. GJ 436b is one of the few transiting warm Neptunes for which a detailed characterisation of the atmosphere is possible, whereas its non-negligible orbital eccentricity calls for further investigation. Independent analyses of several individual datasets obtained with Spitzer have led to contradicting results attributed to the different techniques used to treat the instrumental effects. Aims: We aim at investigating these previous controversial results and developing our knowledge of the system based on the full Spitzer photometry dataset combined with new Doppler measurements obtained with the HARPS spectrograph. We also want to search for additional planets. Methods: We optimise aperture photometry techniques and the photometric deconvolution algorithm DECPHOT to improve the data reduction of the Spitzer photometry spanning wavelengths from 3-24 μm. Adding the high-precision HARPS radial velocity data, we undertake a Bayesian global analysis of the system considering both instrumental and stellar effects on the flux variation. Results: We present a refined radius estimate of RP = 4.10 ± 0.16 R⊕, mass MP = 25.4 ± 2.1 M⊕, and eccentricity e = 0.162 ± 0.004 for GJ 436b. Our measured transit depths remain constant in time and wavelength, in disagreement with the results of previous studies. In addition, we find that the post-occultation flare-like structure at 3.6 μm that led to divergent results on the occultation depth measurement is spurious. We obtain occultation depths at 3.6, 5.8, and 8.0 μm that are shallower than in previous works, in particular at 3.6 μm. However, these depths still appear consistent with a metal-rich atmosphere depleted in methane and enhanced in CO/CO2, although perhaps less than previously thought. We could not detect a significant orbital modulation in the 8 μm phase curve. We find no evidence of a potential planetary companion, stellar activity, or a stellar spin-orbit misalignment. Conclusions: Recent theoretical models invoking high-metallicity atmospheres for warm Neptunes are a reasonable match to our results, but we encourage new modelling efforts based on our revised data. Future observations covering a wide wavelength range of GJ 436b and other Neptune-class exoplanets will further illuminate their atmosphere properties, whilst future accurate radial velocity measurements might explain the eccentricity. Based on observations made with the HARPS spectrograph on the 3.6 m ESO telescope at the ESO La Silla Observatory, Chile. Table 2 and Figs. 5-7 are available in electronic form at http://www.aanda.org

  11. Color Perception in Pediatric Patient Room Design: American versus Korean Pediatric Patients.

    PubMed

    Phillip Park, Jin Gyu; Park, Changbae

    2013-01-01

    This study simultaneously addresses the issues of the scarcity of information about pediatric patient color preferences, conflicting findings about the impact of culture on color preferences, and limitations of previous research instruments. Effects of culture and gender on color preferences were investigated using American and Korean pediatric patients. Much of the existing research in environmental design has focused on environments for healthy children and adults, but those findings cannot be confidently applied to environments for pediatric patients. In previous studies, the impact of culture on color preferences has been suggested, though the effects appear to vary. Moreover, the results of previous studies were typically based on perceptions of small color chips, which are different from seeing a color on wall surfaces. Previous studies also failed to control for confounding variables such as color attributes and light sources. Instead of using color chips, this study used physical model simulation to investigate environmental color preferences in real contexts. A cultural difference was found for white; other than white, no significant cultural differences were found. Gender differences were found across both of the groups. Korean pediatric patients showed significantly higher preference scores for white than Americans did. Other than white, both groups reported blue and green as their most preferred colors; white was the least preferred. Both groups reported similar gender effects. Overall, male patients reported significantly lower preference scores for red and purple than female patients did. These results can help healthcare providers and professionals better understand appropriate colors for pediatric populations. Keywords: evidence-based design, healing environment, patients, pediatric, satisfaction.

  12. Factors influencing the robustness of P-value measurements in CT texture prognosis studies

    NASA Astrophysics Data System (ADS)

    McQuaid, Sarah; Scuffham, James; Alobaidli, Sheaka; Prakash, Vineet; Ezhil, Veni; Nisbet, Andrew; South, Christopher; Evans, Philip

    2017-07-01

    Several studies have recently reported on the value of CT texture analysis in predicting survival, although the topic remains controversial, with further validation needed in order to consolidate the evidence base. The aim of this study was to investigate the effect of varying the input parameters in the Kaplan-Meier analysis, to determine whether the resulting P-value can be considered to be a robust indicator of the parameter’s prognostic potential. A retrospective analysis of the CT-based normalised entropy of 51 patients with lung cancer was performed and overall survival data for these patients were collected. A normalised entropy cut-off was chosen to split the patient cohort into two groups and log-rank testing was performed to assess the survival difference of the two groups. This was repeated for varying normalised entropy cut-offs and varying follow-up periods. Our findings were also compared with previously published results to assess robustness of this parameter in a multi-centre patient cohort. The P-value was found to be highly sensitive to the choice of cut-off value, with small changes in cut-off producing substantial changes in P. The P-value was also sensitive to follow-up period, with particularly noisy results at short follow-up periods. Using matched conditions to previously published results, a P-value of 0.162 was obtained. Survival analysis results can be highly sensitive to the choice in texture cut-off value in dichotomising patients, which should be taken into account when performing such studies to avoid reporting false positive results. Short follow-up periods also produce unstable results and should therefore be avoided to ensure the results produced are reproducible. Previously published findings that indicated the prognostic value of normalised entropy were not replicated here, but further studies with larger patient numbers would be required to determine the cause of the different outcomes.
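
    The sensitivity scan described, dichotomising patients at a sweep of texture cutoffs and recording the log-rank P-value at each, can be sketched as below using the lifelines library. The survival times, censoring indicators, and entropy values are synthetic placeholders, and the quantile grid for the cutoffs is an assumption; with real data, plotting P against the cutoff makes the instability visible.

    import numpy as np
    from lifelines.statistics import logrank_test

    # Hedged sketch of the cutoff-sensitivity scan on synthetic data.
    rng = np.random.default_rng(42)
    n = 51
    entropy = rng.normal(1.0, 0.1, n)             # "normalised entropy" per patient (toy)
    time = rng.exponential(20, n)                 # follow-up time in months (toy)
    event = rng.integers(0, 2, n).astype(bool)    # True = death observed, False = censored

    for cut in np.quantile(entropy, np.linspace(0.2, 0.8, 13)):
        low, high = entropy <= cut, entropy > cut
        res = logrank_test(time[low], time[high],
                           event_observed_A=event[low], event_observed_B=event[high])
        print(f"cutoff={cut:.3f}  p={res.p_value:.3f}")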

  13. Charging Characteristics of an Insulating Hollow Cylinder in Vacuum

    NASA Astrophysics Data System (ADS)

    Yamamoto, Osamu; Hayashi, Hirotaka; Wadahama, Toshihiko; Takeda, Daisuke; Hamada, Shoji; Ohsawa, Yasuharu

    This paper deals with the charging characteristics of the inner surface of an insulating hollow cylinder in vacuum. We measured the electric field strength near the triple points on the cathode using an electrostatic probe, and we also carried out a computer simulation of charging based on the Secondary Electron Emission Avalanche (SEEA) mechanism. These results are compared with those obtained previously for solid cylinders. We find that hollow cylinders acquire a larger surface charge than solid cylinders. We also find that the charge-controlling effect of roughening the inner surface, which our previous studies showed to be effective in suppressing charging on the surface of solid cylinders, is limited for hollow cylinders.

  14. Biologically-inspired hexapod robot design and simulation

    NASA Technical Reports Server (NTRS)

    Espenschied, Kenneth S.; Quinn, Roger D.

    1994-01-01

    The design and construction of a biologically-inspired hexapod robot is presented. A previously developed simulation is modified to include models of the DC drive motors, the motor driver circuits, and their transmissions. The application of this simulation to the design and development of the robot is discussed. The mechanisms thought to be responsible for the leg coordination of the walking stick insect were previously applied to control the straight-line locomotion of a robot. We generalized these rules for a robot walking on a plane. This biologically-inspired control strategy is used to control the robot in simulation. Numerical results show that the general body motion and performance of the simulated robot are similar to those of the physical robot, based on our preliminary experimental results.

  15. Improved maximum average correlation height filter with adaptive log base selection for object recognition

    NASA Astrophysics Data System (ADS)

    Tehsin, Sara; Rehman, Saad; Awan, Ahmad B.; Chaudry, Qaiser; Abbas, Muhammad; Young, Rupert; Asif, Afia

    2016-04-01

    Sensitivity to the variations in the reference image is a major concern when recognizing target objects. A combinational framework of correlation filters and logarithmic transformation has been previously reported to resolve this issue alongside catering for scale and rotation changes of the object in the presence of distortion and noise. In this paper, we have extended the work to include the influence of different logarithmic bases on the resultant correlation plane. The meaningful changes in correlation parameters along with contraction/expansion in the correlation plane peak have been identified under different scenarios. Based on our research, we propose some specific log bases to be used in logarithmically transformed correlation filters for achieving suitable tolerance to different variations. The study is based upon testing a range of logarithmic bases for different situations and finding an optimal logarithmic base for each particular set of distortions. Our results show improved correlation and target detection accuracies.

  16. Predicting chemical bioavailability using microarray gene expression data and regression modeling: A tale of three explosive compounds.

    PubMed

    Gong, Ping; Nan, Xiaofei; Barker, Natalie D; Boyd, Robert E; Chen, Yixin; Wilkins, Dawn E; Johnson, David R; Suedel, Burton C; Perkins, Edward J

    2016-03-08

    Chemical bioavailability is an important dose metric in environmental risk assessment. Although many approaches have been used to evaluate bioavailability, no single approach is free from limitations. Previously, we developed a new genomics-based approach that integrated microarray technology and regression modeling for predicting bioavailability (tissue residue) of explosive compounds in exposed earthworms. In the present study, we further compared 18 different regression models and performed variable selection simultaneously with parameter estimation. This refined approach was applied to both previously collected and newly acquired earthworm microarray gene expression datasets for three explosive compounds. Our results demonstrate that a prediction accuracy of R(2) = 0.71-0.82 was achievable at relatively low model complexity, with as few as 3-10 predictor genes per model. These results are much more encouraging than our previous ones. This study has demonstrated that our approach is promising for bioavailability measurement, which warrants further studies of mixed contamination scenarios in field settings.
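
    The modelling step described, variable selection performed simultaneously with parameter estimation across a panel of regression models, can be illustrated with one member of that family, a cross-validated lasso. The expression matrix and tissue-residue response below are synthetic placeholders, and the lasso is only one plausible choice among the 18 models compared, not necessarily the study's best performer.

    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.model_selection import cross_val_score

    # Hedged sketch: the lasso penalty drives most gene coefficients to zero, so
    # fitting the model selects predictor genes and estimates their weights at once.
    rng = np.random.default_rng(0)
    n_worms, n_genes = 60, 500
    X = rng.normal(size=(n_worms, n_genes))                   # toy gene expression
    true_coef = np.zeros(n_genes)
    true_coef[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]               # a few informative genes
    y = X @ true_coef + rng.normal(scale=0.5, size=n_worms)   # toy tissue residue

    model = LassoCV(cv=5).fit(X, y)
    selected = np.flatnonzero(model.coef_)                    # genes retained by the model
    r2 = cross_val_score(LassoCV(cv=5), X, y, cv=5, scoring="r2").mean()
    print(f"{selected.size} genes selected, cross-validated R(2) = {r2:.2f}")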

  17. Leishmania species identification using FTA card sampling directly from patients' cutaneous lesions in the state of Lara, Venezuela.

    PubMed

    Kato, Hirotomo; Watanabe, Junko; Mendoza Nieto, Iraida; Korenaga, Masataka; Hashiguchi, Yoshihisa

    2011-10-01

    A molecular epidemiological study was performed using FTA card materials directly sampled from lesions of patients with cutaneous leishmaniasis (CL) in the state of Lara, Venezuela, where causative agents have been identified as Leishmania (Viannia) braziliensis and L. (Leishmania) venezuelensis in previous studies. Of the 17 patients diagnosed with CL, Leishmania spp. were successfully identified in 16 patients based on analysis of the cytochrome b gene and rRNA internal transcribed spacer sequences. Consistent with previous findings, seven of the patients were infected with L. (V.) braziliensis. However, parasites from the other nine patients were genetically identified as L. (L.) mexicana, which differed from results of previous enzymatic and antigenic analyses. These results strongly suggest that L. (L.) venezuelensis is a variant of L. (L.) mexicana and that the classification of L. (L.) venezuelensis should be reconsidered. Copyright © 2011 Royal Society of Tropical Medicine and Hygiene. Published by Elsevier Ltd. All rights reserved.

  18. Measurement of electron density using reactance cutoff probe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    You, K. H.; Seo, B. H.; Kim, J. H.

    2016-05-15

    This paper proposes a new method for measuring electron density using the reactance spectrum of the plasma in the cutoff probe system instead of the transmission spectrum. The highly accurate reactance spectrum of the plasma-cutoff probe system, as expected from previous circuit simulations [Kim et al., Appl. Phys. Lett. 99, 131502 (2011)], was measured using the full two-port error correction and automatic port extension methods of the network analyzer. The electron density can be obtained from the analysis of the measured reactance spectrum, based on circuit modeling. According to the circuit simulation results, the reactance cutoff probe can measure the electron density more precisely than the previous cutoff probe at low densities or at higher pressure. The obtained results for the electron density are presented and discussed for a wide range of experimental conditions, and this method is compared with previous methods (a cutoff probe using the transmission spectrum and a single Langmuir probe).
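
    Whatever spectrum is analysed, the cutoff-probe principle rests on the plasma-frequency relation, so a measured cutoff frequency converts directly to an electron density. The worked conversion below shows that underlying physics; the 2.8 GHz cutoff is an illustrative value, and the reactance-spectrum analysis in the record involves a fuller circuit model than this single formula.

    import math

    # n_e = 4 * pi^2 * eps0 * m_e * f_p^2 / e^2, from f_p = (1/2pi) * sqrt(n_e e^2 / (eps0 m_e)).
    EPS0, M_E, Q_E = 8.854e-12, 9.109e-31, 1.602e-19   # SI units

    def electron_density(f_cutoff_hz):
        return 4 * math.pi**2 * EPS0 * M_E * f_cutoff_hz**2 / Q_E**2   # m^-3

    n_e = electron_density(2.8e9)                      # illustrative 2.8 GHz cutoff
    print(f"n_e = {n_e:.2e} m^-3 = {n_e * 1e-6:.2e} cm^-3")   # roughly 1e11 cm^-3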

  19. Incongruence between trauma center social workers' beliefs about substance use interventions and intentions to intervene.

    PubMed

    Davis, Dana; Hawk, Mary

    2015-01-01

    This study explored trauma center social workers' beliefs regarding four evidence-based interventions for patients presenting with substance abuse issues. Previous research has indicated that health care providers' beliefs have prevented them from implementing non-abstinence-based interventions. Study results indicated that the majority of social workers believed in the 12-step approach and were least comfortable with the harm reduction approach. However, results showed that in some cases, social workers may hold negative personal beliefs regarding non-abstinence-based interventions but do not let those beliefs get in the way of using these interventions when they are viewed as appropriate for the client's situation.

  20. Assessing the Formation of Experience-Based Gender Expectations in an Implicit Learning Scenario

    PubMed Central

    Öttl, Anton; Behne, Dawn M.

    2017-01-01

    The present study investigates the formation of new word-referent associations in an implicit learning scenario, using a gender-coded artificial language with spoken words and visual referents. Previous research has shown that when participants are explicitly instructed about the gender-coding system underlying an artificial lexicon, they monitor the frequency of exposure to male vs. female referents within this lexicon, and subsequently use this probabilistic information to predict the gender of an upcoming referent. In an explicit learning scenario, the auditory and visual gender cues are necessarily highlighted prior to acquisition, and the effects previously observed may therefore depend on participants' overt awareness of these cues. To assess whether the formation of experience-based expectations is dependent on explicit awareness of the underlying coding system, we present data from an experiment in which gender-coding was acquired implicitly, thereby reducing the likelihood that visual and auditory gender cues are used strategically during acquisition. Results show that even if the gender coding system was not perfectly mastered (as reflected in the number of gender coding errors), participants develop frequency-based expectations comparable to those previously observed in an explicit learning scenario. In line with previous findings, participants are quicker at recognizing a referent whose gender is consistent with an induced expectation than one whose gender is inconsistent with an induced expectation. At the same time, however, eye-tracking data suggest that these expectations may surface earlier in an implicit learning scenario. These findings suggest that experience-based expectations are robust against manner of acquisition, and contribute to understanding why similar expectations observed in the activation of stereotypes during the processing of natural language stimuli are difficult or impossible to suppress. PMID:28936186

  1. Comparing electrochemical performance of transition metal silicate cathodes and chevrel phase Mo6S8 in the analogous rechargeable Mg-ion battery system

    NASA Astrophysics Data System (ADS)

    Chen, Xinzhi; Bleken, Francesca L.; Løvvik, Ole Martin; Vullum-Bruer, Fride

    2016-07-01

    Polyanion-based silicate materials, MgMSiO4 (M = Fe, Mn, Co), previously reported to be promising cathode materials for Mg-ion batteries, have been re-examined. Both the sol-gel and molten salt methods are employed to synthesize MgMSiO4 composites. Mo6S8 is synthesized by a molten salt method combined with Cu leaching and investigated in the equivalent electrochemical system as a benchmark. Electrochemical measurements for Mo6S8 performed using the 2nd generation electrolyte show similar results to those reported in the literature. Electrochemical performance of the silicate materials, on the other hand, does not show the promising results previously reported. A thorough study of these published results is presented here and compared to the current experimental data on the same material system. It appears that there are certain inconsistencies in the published results which cannot be explained. To further corroborate the present experimental results, atomic-scale calculations from first principles are performed, demonstrating that the barriers for Mg diffusion in MgMSiO4 are very high. In conclusion, MgMSiO4 (M = Fe, Mn, Co) olivine materials do not seem to be as good candidates for cathode materials in Mg-ion batteries as previously reported.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onoufriou, T.; Simpson, R.J.; Protopapas, M.

    This paper presents the development and application of reliability-based inspection planning techniques for floaters. Based on previous experience from jacket structure applications, optimized inspection planning (OIP) techniques for floaters are developed. The differences between floaters and jacket structures in relation to fatigue damage, redundancy levels, and inspection practice are examined and reflected in the proposed methodology. The application and benefits of these techniques are demonstrated through representative analyses, and important trends are highlighted through the results of a parametric sensitivity study.

  3. Studies of thermal wave phenomena on the Jovian planets

    NASA Technical Reports Server (NTRS)

    Deming, Drake

    1991-01-01

    Ground-based and Voyager observations of Jupiter provided evidence that the tropospheric temperature shows global-scale longitudinal variations which are often wavelike in character. The investigation presented here is directed toward obtaining additional ground-based data in IR spectral bands whose contribution functions are optimized for specific atmospheric regions, in order to confirm the previous results and to identify the nature and physical significance of wavelike longitudinal temperature fluctuations on the Jovian planets.

  4. Examining the Impact of a Video Case-Based Mathematics Methods Course on Secondary Pre-Service Teachers' Skills at Analysing Students' Strategies

    ERIC Educational Resources Information Center

    Martinez, Mara Vanina; Superfine, Alison Castro; Carlton, Theresa; Dasgupta, Chandan

    2015-01-01

    This paper focuses on results from a study conducted with two cohorts of pre-service teachers (PSTs) in a video case-based mathematics methods course at a large Midwestern university in the US. The motivation for this study was to look beyond whether or not PSTs pay attention to mathematical thinking of students, as shown by previous studies when…

  5. United States Air Force F-35A Operational Basing Environmental Impact Statement. Volume 1

    DTIC Science & Technology

    2013-09-01

    Evaluation (FDE) program and Weapons School (WS) beddown, the F-22 designator was used. Subsequent testing, development, and deployment resulted in...Initial F-35A Operational Basing EIS Final, September 2013 contract to develop the JSF (designated the F-35 Lightning II). Since then, testing of F...of the aircraft even with system failures. Throughout the design and testing process, safety initiatives took previous best practices for single

  6. New Mexico’s forest resources, 2008-2014

    Treesearch

    Sara A. Goeking; Jim Menlove

    2017-01-01

    This report presents a summary of the most recent inventory of New Mexico’s forests based on field data collected between 2008 and 2014. The results presented here summarize a complete cycle of New Mexico’s forest inventory, or 10 years’ worth of data collection, whereas the previous report was based only on 9 years’ worth of data collected under an accelerated...

  7. Color encryption scheme based on adapted quantum logistic map

    NASA Astrophysics Data System (ADS)

    Zaghloul, Alaa; Zhang, Tiejun; Amin, Mohamed; Abd El-Latif, Ahmed A.

    2014-04-01

    This paper presents a new color image encryption scheme based on a quantum chaotic system. In this scheme, an intermediate chaotic key stream is generated with the help of a quantum chaotic logistic map, and each pixel is then encrypted using the cipher value of the previous pixel and the adapted quantum logistic map. The results show that the proposed scheme provides adequate security for the confidentiality of color images.
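
    The chaining structure described, each cipher pixel depending on a chaotic keystream value and on the previous cipher pixel, can be sketched as below. The classical logistic map here is only a stand-in for the adapted quantum logistic map (a coupled multi-variable recurrence), and the XOR-based diffusion step, seed, and parameter are assumptions for illustration, not the published scheme.

    import numpy as np

    # Hedged sketch: chaotic keystream + chaining on the previous cipher pixel.
    def logistic_keystream(length, x0=0.3141, r=3.9999):
        x, out = x0, np.empty(length, dtype=np.uint8)
        for i in range(length):
            x = r * x * (1.0 - x)
            out[i] = int(x * 256) % 256
        return out

    def encrypt(img):
        flat = img.reshape(-1).astype(np.uint8)
        ks = logistic_keystream(flat.size)
        cipher = np.empty_like(flat)
        prev = 0
        for i, p in enumerate(flat):
            cipher[i] = (int(p) ^ int(ks[i]) ^ prev) & 0xFF   # chain on previous cipher pixel
            prev = int(cipher[i])
        return cipher.reshape(img.shape)

    # usage: c = encrypt(np.random.randint(0, 256, (8, 8, 3), dtype=np.uint8))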

  8. Validating internet research: a test of the psychometric equivalence of internet and in-person samples.

    PubMed

    Meyerson, Paul; Tryon, Warren W

    2003-11-01

    This study evaluated the psychometric equivalency of Web-based research. The Sexual Boredom Scale was presented via the World-Wide Web along with five additional scales used to validate it. A subset of 533 participants that matched a previously published sample (Watt & Ewing, 1996) on age, gender, and race was identified. An 8 x 8 correlation matrix from the matched Internet sample was compared via structural equation modeling with a similar 8 x 8 correlation matrix from the previously published study. The Internet and previously published samples were psychometrically equivalent. Coefficient alpha values calculated on the matched Internet sample yielded reliability coefficients almost identical to those for the previously published sample. Factors such as computer administration and uncontrollable administration settings did not appear to affect the results. Demographic data indicated an overrepresentation of males by about 6% and Caucasians by about 13% relative to the U.S. Census (2000). A total of 2,230 participants were obtained in about 8 months without remuneration. These results suggest that data collection on the Web is (1) reliable, (2) valid, (3) reasonably representative, (4) cost effective, and (5) efficient.

  9. A Bridge from Optical to Infrared Galaxies: Explaining Local Properties and Predicting Galaxy Counts and the Cosmic Background Radiation

    NASA Astrophysics Data System (ADS)

    Totani, Tomonori; Takeuchi, Tsutomu T.

    2002-05-01

    We give an explanation for the origin of various properties observed in local infrared galaxies and make predictions for galaxy counts and cosmic background radiation (CBR) using a new model extended from that for optical/near-infrared galaxies. Important new characteristics of this study are that (1) mass scale dependence of dust extinction is introduced based on the size-luminosity relation of optical galaxies and that (2) the large-grain dust temperature Tdust is calculated based on a physical consideration for energy balance rather than by using the empirical relation between Tdust and total infrared luminosity LIR found in local galaxies, which has been employed in most previous works. Consequently, the local properties of infrared galaxies, i.e., optical/infrared luminosity ratios, LIR-Tdust correlation, and infrared luminosity function are outputs predicted by the model, while these have been inputs in a number of previous models. Our model indeed reproduces these local properties reasonably well. Then we make predictions for faint infrared counts (in 15, 60, 90, 170, 450, and 850 μm) and CBR using this model. We found results considerably different from those of most previous works based on the empirical LIR-Tdust relation; especially, it is shown that the dust temperature of starbursting primordial elliptical galaxies is expected to be very high (40-80 K), as often seen in starburst galaxies or ultraluminous infrared galaxies in the local and high-z universe. This indicates that intense starbursts of forming elliptical galaxies should have occurred at z~2-3, in contrast to the previous results that significant starbursts beyond z~1 tend to overproduce the far-infrared (FIR) CBR detected by COBE/FIRAS. On the other hand, our model predicts that the mid-infrared (MIR) flux from warm/nonequilibrium dust is relatively weak in such galaxies making FIR CBR, and this effect reconciles the prima facie conflict between the upper limit on MIR CBR from TeV gamma-ray observations and the COBE detections of FIR CBR. The intergalactic optical depth of TeV gamma rays based on our model is also presented.

  10. Suicide after release from prison - a population-based cohort study from Sweden

    PubMed Central

    Haglund, Axel; Tidemalm, Dag; Jokinen, Jussi; Långström, Niklas; Liechtenstein, Paul; Fazel, Seena; Runeson, Bo

    2015-01-01

    Objective Released prisoners have high suicide rates compared with the general population, but little is known about risk factors and possible causal pathways. We conducted a population-based cohort study to investigate rates and risk factors for suicide in people previously imprisoned. Methods We identified individuals released from prison in Sweden between January 1, 2005 and December 31, 2009 through linkage of national population-based registers. Released prisoners were followed from the day of release until death, emigration, new incarceration, or December 31, 2009. Survival analyses were conducted to compare incidence rates and psychiatric morbidity with non-convicted population controls matched on gender and year of birth. Results We identified 38,995 releases among 26,953 prisoners (7.6% females) during 2005-2009. Overall, 127 suicides occurred, accounting for 14% of all deaths after release (n=920). The mean suicide rate was 204 per 100,000 person years yielding an incidence rate ratio of 18.2 (95% CI 13.9-23.8) compared with general population controls. Previous substance use disorder (Hazard Ratio [HR]=2.1, 1.4-3.2), suicide attempt (HR=2.5, 1.7-3.7), and being born in Sweden vs. abroad (HR=2.1, 1.2-3.6) were independent risk factors for suicide after release. Conclusions Released prisoners are at high suicide risk and with a slightly different pattern of psychiatric risk factors for suicide compared with the general population. Results suggest appropriate allocation of resources to facilitate transition to life outside prison and increased attention to prisoners with both a previous suicide attempt and substance use disorder. PMID:25373114

  11. Genetic burden associated with varying degrees of disease severity in endometriosis

    PubMed Central

    Sapkota, Yadav; Attia, John; Gordon, Scott D.; Henders, Anjali K.; Holliday, Elizabeth G.; Rahmioglu, Nilufer; MacGregor, Stuart; Martin, Nicholas G.; McEvoy, Mark; Morris, Andrew P.; Scott, Rodney J.; Zondervan, Krina T.; Montgomery, Grant W.; Nyholt, Dale R.

    2015-01-01

    Endometriosis is primarily characterized by the presence of tissue resembling endometrium outside the uterine cavity and is usually diagnosed by laparoscopy. The most commonly used classification of disease, the revised American Fertility Society (rAFS) system to grade endometriosis into different stages based on disease severity (I to IV), has been questioned as it does not correlate well with underlying symptoms, posing issues in diagnosis and choice of treatment. Using two independent European genome-wide association (GWA) datasets and top-level classification of the endometriosis cases based on rAFS [minimal or mild (Stage A) and moderate-to-severe (Stage B) disease], we previously showed that Stage B endometriosis has greater contribution of common genetic variation to its aetiology than Stage A disease. Herein, we extend our previous analysis to four endometriosis stages [minimal (Stage I), mild (Stage II), moderate (Stage III) and severe (Stage IV) disease] based on the rAFS classification system and compared the genetic burden across stages. Our results indicate that genetic burden increases from minimal to severe endometriosis. For the minimal disease, genetic factors may contribute to a lesser extent than other disease categories. Mild and moderate endometriosis appeared genetically similar, making it difficult to tease them apart. Consistent with our previous reports, moderate and severe endometriosis showed greater genetic burden than minimal or mild disease. Overall, our results provide new insights into the genetic architecture of endometriosis and further investigation in larger samples may help to understand better the aetiology of varying degrees of endometriosis, enabling improved diagnostic and treatment modalities. PMID:25882541

  12. puma: a Bioconductor package for propagating uncertainty in microarray analysis.

    PubMed

    Pearson, Richard D; Liu, Xuejun; Sanguinetti, Guido; Milo, Marta; Lawrence, Neil D; Rattray, Magnus

    2009-07-09

    Most analyses of microarray data are based on point estimates of expression levels and ignore the uncertainty of such estimates. By determining uncertainties from Affymetrix GeneChip data and propagating these uncertainties to downstream analyses it has been shown that we can improve results of differential expression detection, principal component analysis and clustering. Previously, implementations of these uncertainty propagation methods have only been available as separate packages, written in different languages. Previous implementations have also suffered from being very costly to compute, and in the case of differential expression detection, have been limited in the experimental designs to which they can be applied. puma is a Bioconductor package incorporating a suite of analysis methods for use on Affymetrix GeneChip data. puma extends the differential expression detection methods of previous work from the 2-class case to the multi-factorial case. puma can be used to automatically create design and contrast matrices for typical experimental designs, which can be used both within the package itself but also in other Bioconductor packages. The implementation of differential expression detection methods has been parallelised leading to significant decreases in processing time on a range of computer architectures. puma incorporates the first R implementation of an uncertainty propagation version of principal component analysis, and an implementation of a clustering method based on uncertainty propagation. All of these techniques are brought together in a single, easy-to-use package with clear, task-based documentation. For the first time, the puma package makes a suite of uncertainty propagation methods available to a general audience. These methods can be used to improve results from more traditional analyses of microarray data. puma also offers improvements in terms of scope and speed of execution over previously available methods. puma is recommended for anyone working with the Affymetrix GeneChip platform for gene expression analysis and can also be applied more generally.

  13. A Bridge from Optical to Infrared Galaxies: Explaining Local Properties, Predicting Galaxy Counts and the Cosmic Background Radiation

    NASA Astrophysics Data System (ADS)

    Totani, T.; Takeuchi, T. T.

    2001-12-01

    A new model of infrared galaxy counts and the cosmic background radiation (CBR) is developed by extending a model for optical/near-infrared galaxies. Important new characteristics of this model are that mass scale dependence of dust extinction is introduced based on the size-luminosity relation of optical galaxies, and that the big grain dust temperature Tdust is calculated based on a physical consideration for energy balance, rather than using the empirical relation between Tdust and total infrared luminosity LIR found in local galaxies, which has been employed in most previous works. Consequently, the local properties of infrared galaxies, i.e., optical/infrared luminosity ratios, LIR-Tdust correlation, and infrared luminosity function are outputs predicted by the model, while these have been inputs in a number of previous models. Our model indeed reproduces these local properties reasonably well. Then we make predictions for faint infrared counts (in 15, 60, 90, 170, 450, and 850 μm) and CBR using this model. We found considerably different results from most previous works based on the empirical LIR-Tdust relation; especially, it is shown that the dust temperature of starbursting primordial elliptical galaxies is expected to be very high (40-80 K). This indicates that intense starbursts of forming elliptical galaxies should have occurred at z ~ 2-3, in contrast to the previous results that significant starbursts beyond z ~ 1 tend to overproduce the far-infrared (FIR) CBR detected by COBE/FIRAS. On the other hand, our model predicts that the mid-infrared (MIR) flux from warm/nonequilibrium dust is relatively weak in such galaxies making FIR CBR, and this effect reconciles the prima facie conflict between the upper limit on MIR CBR from TeV gamma-ray observations and the COBE detections of FIR CBR. The authors thank the Japan Society for Promotion of Science for financial support.

  14. POLANYI-BASED MODELS FOR THE COMPETITIVE SORPTION OF LOW-POLARITY ORGANIC CONTAMINANTS ON A NATURAL SORBENT. (R825406)

    EPA Science Inventory

    The development of appropriate equilibrium sorption relationships for anthropogenic organic contaminants with soils and sediments is essential to predicting the extents and rates of solid-water interactions in the environment. In this context, we previously reported results that ...

  15. Crossing the Threshold of Rocket Mail: E-Mail Use by U.S. Humanities Faculty.

    ERIC Educational Resources Information Center

    Bridges, Anne E.; Clement, Russell T.

    1997-01-01

    Discusses the use of e-mail by humanities faculty based on an e-mail survey of faculty at Brigham Young University (Utah) and the University of Tennessee at Knoxville. Results indicate that humanities faculty are significantly heavier e-mail users than previously reported. (LRW)

  16. Berry fruit differentially improves age-related decrements in behavior based on baseline status

    USDA-ARS?s Scientific Manuscript database

    Aging and neurodegenerative diseases are thought to be the results of prolonged effects of oxidative stress and inflammation. Previously, we have shown that daily supplementation of berry fruits, such as blueberry or raspberry, was able to reverse age-related deficits in behavioral and neuronal func...

  17. An Observation-based investigation of nudging in WRF for downscaling surface climate information to 12-km Grid Spacing

    EPA Science Inventory

    Previous research has demonstrated the ability to use the Weather Research and Forecast (WRF) model and contemporary dynamical downscaling methods to refine global climate modeling results to a horizontal resolution of 36 km. Environmental managers and urban planners have expre...

  18. The Gatekeepers of Business Education Research: An Institutional Analysis

    ERIC Educational Resources Information Center

    Urbancic, Frank R.

    2011-01-01

    The author ranked the academic standing of universities based on faculty representation to the editorial boards of business education journals. Previous studies that ranked institutions for editorial board representation focused on journals that primarily favor publication of basic and applied research contributions. As a result, prior research…

  19. Building on Strength: Positive Youth Development in Juvenile Justice Programs

    ERIC Educational Resources Information Center

    Barton, William H.; Butts, Jeffrey A.

    2008-01-01

    This report describes the results of an exploratory study of juvenile justice programs where managers and practitioners are attempting to build youth interventions with strength-based, positive youth development principles. Previous researchers have not adequately documented how such reforms take place, let alone whether they produce effective…

  20. Gradiency and Visual Context in Syntactic Garden-Paths

    ERIC Educational Resources Information Center

    Farmer, Thomas A.; Anderson, Sarah E.; Spivey, Michael J.

    2007-01-01

    Through recording the streaming x- and y-coordinates of computer-mouse movements, we report evidence that visual context provides an immediate constraint on the resolution of syntactic ambiguity in the visual-world paradigm. This finding converges with previous eye-tracking results that support a constraint-based account of sentence processing, in…

  1. [Occupational risk as a criterion determining economic responsibility of employers].

    PubMed

    Subbotin, V V; Tkachev, V V

    2003-01-01

    The authors propose a new method for calculating discounts and surcharges and the size of the insurance contribution, based on differentiation among insurers rather than among economic branches. The occupational risk class should be set according to previous results, taking into account the work safety parameters described in the article.

  2. Military Base Closures: Observations on Prior and Current BRAC Rounds

    EPA Pesticide Factsheets

    DOD indicates that recommendations from the previous BRAC rounds were implemented within the 6-year period mandated by law. As a result, DOD estimated that it reduced its domestic infrastructure by about 20 percent; about 90 percent of unneeded BRAC property is now available for reuse.

  3. Academic Capitalism in the Pasteur's Quadrant

    ERIC Educational Resources Information Center

    Mendoza, Pilar

    2009-01-01

    Based on previous empirical studies, in this work the author presents an analysis of the role of context in academic capitalism. In particular, she argues that the literature on academic capitalism fails to properly acknowledge disciplinary and institutional differences, which results in an oversimplification of the effects of industry-academia…

  4. Readiness Matters! 2014-2015 Kindergarten Readiness Assessment

    ERIC Educational Resources Information Center

    Maryland State Department of Education, 2015

    2015-01-01

    Maryland's demanding new Kindergarten Readiness Assessment was administered statewide for the first time. Its results are revealing and sobering. Many states do not even check in any systematic way on their children's readiness for kindergarten, and in previous years, Maryland used metrics based on modest expectations, outdated standards, and…

  5. Assessing Quality of Critical Thought in Online Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa; Baltes, Beate; Lynn, Laura Knight

    2009-01-01

    Purpose: The purpose of this paper is to describe a theoretically based coding framework for an integrated analysis and assessment of critical thinking in online discussion. Design/methodology/approach: The critical thinking assessment framework (TAF) is developed through review of theory and previous research, verified by comparing results to…

  6. 44 CFR 64.3 - Flood Insurance Maps.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... flood hazard that results from the decertification of a previously accredited flood protection system that is determined to be in the process of being restored to provide base flood protection; V: Area of... tidal floods (coastal high hazard area); V1-30, VE: Area of special flood hazards, with water surface...

  7. Using Technology to Facilitate Differentiated High School Science Instruction

    ERIC Educational Resources Information Center

    Maeng, Jennifer L.

    2017-01-01

    This qualitative investigation explored the beliefs and practices of one secondary science teacher, Diane, who differentiated instruction and studied how technology facilitated her differentiation. Diane was selected based on the results of a previous study, in which data indicated that Diane understood how to design and implement proactively…

  8. Chapter 10: Management recommendations

    Treesearch

    Deborah M. Finch; Janie Agyagos; Tracy McCarthey; Robert M. Marshall; Scott H. Stoleson; Mary J. Whitfield

    2000-01-01

    This chapter was developed over a series of meetings using a group-consensus process. Our recommendations are based on published results, on information compiled in the previous chapters, on expert opinion, and on unpublished data of conservation team members. This chapter is available as temporary guidance until the Recovery Plan for the southwestern willow flycatcher...

  9. Stories in Different Domains of Child Development

    ERIC Educational Resources Information Center

    Gnjatovic, Dragana

    2015-01-01

    This article is based on the results gained from the research about the perception teachers have about stories. The study was conducted in Sweden and the main purpose was to partially fulfil the requirements for Erasmus Mundus joint degree "International Master of Early Childhood Education and Care". In accordance with previous research…

  10. Genetic mapping of 15 human X chromosomal forensic short tandem repeat (STR) loci by means of multi-core parallelization.

    PubMed

    Diegoli, Toni Marie; Rohde, Heinrich; Borowski, Stefan; Krawczak, Michael; Coble, Michael D; Nothnagel, Michael

    2016-11-01

    Typing of X chromosomal short tandem repeat (X STR) markers has become a standard element of human forensic genetic analysis. Joint consideration of many X STR markers at a time increases their discriminatory power but, owing to physical linkage, requires inter-marker recombination rates to be accurately known. We estimated the recombination rates between 15 well established X STR markers using genotype data from 158 families (1041 individuals) and following a previously proposed likelihood-based approach that allows for single-step mutations. To meet the computational requirements of this family-based type of analysis, we modified a previous implementation so as to allow multi-core parallelization on a high-performance computing system. While we obtained recombination rate estimates larger than zero for all but one pair of adjacent markers within the four previously proposed linkage groups, none of the three X STR pairs defining the junctions of these groups yielded a recombination rate estimate of 0.50. Corroborating previous studies, our results therefore argue against a simple model of independent X chromosomal linkage groups. Moreover, the refined recombination fraction estimates obtained in our study will facilitate the appropriate joint consideration of all 15 investigated markers in forensic analysis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
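
    The abstract describes farming a family-based likelihood computation out across multiple cores. The sketch below illustrates that general pattern with Python's multiprocessing pool; the per-family likelihood is a simple binomial placeholder, not the pedigree likelihood with single-step mutations used in the study:

    ```python
    import numpy as np
    from multiprocessing import Pool

    def family_log_likelihood(args):
        """Placeholder per-family log-likelihood of a recombination fraction
        theta (a binomial term standing in for the pedigree-based likelihood)."""
        theta, recombinant, total = args
        return recombinant * np.log(theta) + (total - recombinant) * np.log(1 - theta)

    def total_log_likelihood(theta, families, processes=4):
        """Sum per-family contributions, evaluated in parallel across cores."""
        with Pool(processes) as pool:
            return sum(pool.map(family_log_likelihood,
                                [(theta, r, n) for r, n in families]))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # 158 hypothetical families: (recombinant count, informative meioses)
        families = [(rng.binomial(10, 0.2), 10) for _ in range(158)]
        grid = np.linspace(0.01, 0.5, 50)
        ll = [total_log_likelihood(t, families) for t in grid]
        print("grid MLE of the recombination fraction:",
              round(grid[int(np.argmax(ll))], 3))
    ```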

  11. Genetic demographic networks: Mathematical model and applications.

    PubMed

    Kimmel, Marek; Wojdyła, Tomasz

    2016-10-01

    Recent improvement in the quality of genetic data obtained from extinct human populations and their ancestors encourages searching for answers to basic questions regarding human population history. The most common and successful are model-based approaches, in which genetic data are compared to the data obtained from the assumed demography model. Using such an approach, it is possible to either validate or adjust assumed demography. Model fit to data can be obtained based on reverse-time coalescent simulations or forward-time simulations. In this paper we introduce a computational method, based on a mathematical equation, that allows obtaining joint distributions of pairs of individuals under a specified demography model, each of them characterized by a genetic variant at a chosen locus. The two individuals are randomly sampled from either the same or two different populations. The model assumes three types of demographic events (split, merge and migration). Populations evolve according to the time-continuous Moran model with drift and Markov-process mutation. This latter process is described by the Lyapunov-type equation introduced by O'Brien and generalized in our previous works. Application of this equation constitutes an original contribution. In the results section of the paper we present sample applications of our model to both simulated and literature-based demographies. Among others, we include a study of the Slavs-Balts-Finns genetic relationship, in which we model split and migrations between the Balts and Slavs. We also include another example that involves the migration rates between farmers and hunter-gatherers, based on modern and ancient DNA samples. This latter process was previously studied using coalescent simulations. Our results are in general agreement with the previous method, which provides validation of our approach. Although our model is not an alternative to simulation methods in the practical sense, it provides an algorithm to compute pairwise distributions of alleles, in the case of haploid non-recombining loci such as mitochondrial and Y-chromosome loci in humans. Copyright © 2016 Elsevier Inc. All rights reserved.
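
    As a toy illustration of the population dynamics assumed in the abstract, here is a minimal simulation of the neutral two-allele Moran model with mutation; the paper works with an equation-based, time-continuous formulation rather than simulation, and all parameters below are arbitrary:

    ```python
    import numpy as np

    def moran_step(pop, mu, rng):
        """One Moran event: a random individual reproduces (with possible
        mutation between the two allelic states) and replaces a random individual."""
        child = pop[rng.integers(pop.size)]
        if rng.random() < mu:                  # Markov-process mutation
            child = 1 - child
        pop[rng.integers(pop.size)] = child

    rng = np.random.default_rng(4)
    N, mu = 200, 0.01
    pop = rng.integers(0, 2, size=N)           # two-allele haploid population
    for _ in range(200_000):                   # let drift and mutation act
        moran_step(pop, mu, rng)

    p = pop.mean()                             # frequency of allele 1
    print("P(two sampled individuals carry the same allele):",
          round(p**2 + (1 - p)**2, 3))
    ```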

  12. Has universal screening with Xpert® MTB/RIF increased the proportion of multidrug-resistant tuberculosis cases diagnosed in a routine operational setting?

    PubMed Central

    Dunbar, Rory; Caldwell, Judy; Lombard, Carl; Beyers, Nulda

    2017-01-01

    Setting Primary health services in Cape Town, South Africa where the introduction of Xpert® MTB/RIF (Xpert) enabled simultaneous screening for tuberculosis (TB) and drug susceptibility in all presumptive cases. Study aim To compare the proportion of TB cases with drug susceptibility tests undertaken and multidrug-resistant tuberculosis (MDR-TB) diagnosed pre-treatment and during the course of 1st line treatment in the previous smear/culture and the newly introduced Xpert-based algorithms. Methods TB cases identified in a previous stepped-wedge study of TB yield in five sub-districts over seven one-month time-points prior to, during and after the introduction of the Xpert-based algorithm were analysed. We used a combination of patient identifiers to identify all drug susceptibility tests undertaken from electronic laboratory records. Differences in the proportions of DST undertaken and MDR-TB cases diagnosed between algorithms were estimated using a binomial regression model. Results Pre-treatment, the probability of having a DST undertaken (RR = 1.82)(p<0.001) and being diagnosed with MDR-TB (RR = 1.42)(p<0.001) was higher in the Xpert-based algorithm than in the smear/culture-based algorithm. For cases evaluated during the course of 1st-line TB treatment, there was no significant difference in the proportion with DST undertaken (RR = 1.02)(p = 0.848) or MDR-TB diagnosed (RR = 1.12)(p = 0.678) between algorithms. Conclusion Universal screening for drug susceptibility in all presumptive TB cases in the Xpert-based algorithm resulted in a higher overall proportion of MDR-TB cases being diagnosed and is an important strategy in reducing transmission. The previous strategy of only screening new TB cases when 1st line treatment failed did not compensate for cases missed pre-treatment. PMID:28199375

  13. Crystallization of bovine insulin on a flow-free droplet-based platform

    NASA Astrophysics Data System (ADS)

    Chen, Fengjuan; Du, Guanru; Yin, Di; Yin, Ruixue; Zhang, Hongbo; Zhang, Wenjun; Yang, Shih-Mo

    2017-03-01

    Crystallization is an important process in the pharmaceutical manufacturing industry. In this work, we report a study to create zinc-free crystals of bovine insulin on a flow-free droplet-based platform we previously developed. The benefit of this platform is its promise to create a single crystal type in a simpler and more stable environment and with high throughput. The experimental results show that the bovine insulin crystals form a rhombic dodecahedral shape and that the coefficient of variation (CV) in crystal size is less than 5%. These results are very promising for insulin production.

  14. A novel maximum power point tracking system employing phase comparison techniques

    NASA Astrophysics Data System (ADS)

    Avaritsiotis, J. N.; Tsitomeneas, S.; Caroubalos, C.

    A new MPPT design is presented that is based on the comparison of the phase of a perturbing signal with that of the signal which is the result of the perturbation. More specifically, a voltage ripple is induced on the power loop of the P/V system and its phase is compared to the phase of the resulting ripple on the electric power P = I x V, where I and V are the current and voltage respectively of the P/V generator. A prototype MPPT based on the previous principle has been designed, constructed, and its performance has been studied.
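
    A rough numerical sketch of the phase-comparison idea: estimate the phase of the injected voltage ripple and of the resulting power ripple at the perturbation frequency, then step the operating voltage up when they are in phase (dP/dV > 0) and down otherwise. The P/V curve, lock-in style ripple extraction, and step size are illustrative assumptions, not the prototype's analog implementation:

    ```python
    import numpy as np

    def ripple_phase(signal, t, f_ripple):
        """Phase of the component of `signal` at the perturbation frequency,
        extracted with a single-frequency (lock-in style) correlation."""
        ref_cos = np.cos(2 * np.pi * f_ripple * t)
        ref_sin = np.sin(2 * np.pi * f_ripple * t)
        return np.arctan2(signal @ ref_sin, signal @ ref_cos)

    def mppt_step(v_ripple, p_ripple, t, f_ripple, v_op, dv=0.1):
        """Move the operating voltage toward the MPP based on the phase
        difference between the voltage perturbation and the power ripple."""
        dphi = ripple_phase(p_ripple, t, f_ripple) - ripple_phase(v_ripple, t, f_ripple)
        dphi = np.angle(np.exp(1j * dphi))       # wrap to (-pi, pi]
        # in phase -> dP/dV > 0 -> increase V; out of phase -> decrease V
        return v_op + dv if abs(dphi) < np.pi / 2 else v_op - dv

    # toy P/V curve P(V) = V*(10 - V) with the maximum power point at V = 5
    t = np.linspace(0, 1, 1000, endpoint=False)
    f = 50.0
    v_op = 3.0
    for _ in range(40):
        v = v_op + 0.05 * np.cos(2 * np.pi * f * t)   # injected voltage ripple
        p = v * (10 - v)                              # resulting power ripple
        v_op = mppt_step(v - v_op, p - p.mean(), t, f, v_op)
    print("converged operating voltage:", round(v_op, 2))
    ```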

  15. Application of a Scalable, Parallel, Unstructured-Grid-Based Navier-Stokes Solver

    NASA Technical Reports Server (NTRS)

    Parikh, Paresh

    2001-01-01

    A parallel version of an unstructured-grid based Navier-Stokes solver, USM3Dns, previously developed for efficient operation on a variety of parallel computers, has been enhanced to incorporate upgrades made to the serial version. The resultant parallel code has been extensively tested on a variety of problems of aerospace interest and on two sets of parallel computers to understand and document its characteristics. An innovative grid renumbering construct and use of non-blocking communication are shown to produce superlinear computing performance. Preliminary results from parallelization of a recently introduced "porous surface" boundary condition are also presented.

  16. Compression of real time volumetric echocardiographic data using modified SPIHT based on the three-dimensional wavelet packet transform.

    PubMed

    Hang, X; Greenberg, N L; Shiota, T; Firstenberg, M S; Thomas, J D

    2000-01-01

    Real-time three-dimensional echocardiography has been introduced to provide improved quantification and description of cardiac function. Data compression is desired to allow efficient storage and improve data transmission. Previous work has suggested improved results utilizing wavelet transforms in the compression of medical data, including 2D echocardiograms. Set partitioning in hierarchical trees (SPIHT) was extended to compress volumetric echocardiographic data by modifying the algorithm based on the three-dimensional wavelet packet transform. A compression ratio of at least 40:1 resulted in preserved image quality.
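
    A toy 3-D wavelet compression sketch using PyWavelets: it applies a 3-D DWT and keeps only the largest coefficients, which shows where the compression gain comes from but replaces SPIHT bit-plane coding (and the wavelet packet decomposition) with simple thresholding on a synthetic volume:

    ```python
    import numpy as np
    import pywt

    def compress_volume(volume, wavelet="db2", level=2, keep=0.025):
        """Keep only the largest `keep` fraction of 3-D wavelet coefficients
        (simple thresholding rather than SPIHT), then reconstruct."""
        coeffs = pywt.wavedecn(volume, wavelet, level=level)     # 3-D DWT
        arr, slices = pywt.coeffs_to_array(coeffs)
        threshold = np.quantile(np.abs(arr), 1.0 - keep)
        arr_c = np.where(np.abs(arr) >= threshold, arr, 0.0)     # drop small coeffs
        coeffs_c = pywt.array_to_coeffs(arr_c, slices, output_format="wavedecn")
        return pywt.waverecn(coeffs_c, wavelet)

    # synthetic 32x32x32 volume standing in for one 3-D echo frame
    rng = np.random.default_rng(1)
    vol = rng.normal(size=(32, 32, 32)).cumsum(axis=0)           # smooth-ish data
    rec = compress_volume(vol)
    err = np.linalg.norm(rec[:32, :32, :32] - vol) / np.linalg.norm(vol)
    print(f"relative reconstruction error at ~40:1 coefficient retention: {err:.3f}")
    ```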

  17. Finite-time synchronization control of a class of memristor-based recurrent neural networks.

    PubMed

    Jiang, Minghui; Wang, Shuangtao; Mei, Jun; Shen, Yanjun

    2015-03-01

    This paper presents a global and local finite-time synchronization control law for memristor neural networks. By utilizing the drive-response concept, differential inclusion theory, and the Lyapunov functional method, we establish several sufficient conditions for finite-time synchronization between the master and the corresponding slave memristor-based neural network with the designed controller. In comparison with existing results, the proposed stability conditions are new, and the obtained results extend some previous work on conventional recurrent neural networks. Two numerical examples are provided to illustrate the effectiveness of the design method. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
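
    As a reminder of the multigrid concept the study implements, here is a minimal V-cycle for the 1-D Poisson problem with a weighted-Jacobi smoother; it is a textbook illustration, unrelated to the Proteus code or its compressible-flow equations:

    ```python
    import numpy as np

    def jacobi(u, f, h, sweeps=3, w=2/3):
        """Weighted-Jacobi smoother for -u'' = f with zero Dirichlet BCs."""
        for _ in range(sweeps):
            u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
        return u

    def residual(u, f, h):
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
        return r

    def v_cycle(u, f, h):
        """One multigrid V-cycle: smooth, restrict residual, recurse, correct."""
        u = jacobi(u, f, h)
        if u.size <= 5:                        # coarsest grid: just smooth more
            return jacobi(u, f, h, sweeps=50)
        r = residual(u, f, h)
        r_c = r[::2].copy()                    # injection restriction
        e_c = v_cycle(np.zeros_like(r_c), r_c, 2 * h)
        e = np.zeros_like(u)
        e[::2] = e_c                           # prolongation: copy coarse points...
        e[1::2] = 0.5 * (e_c[:-1] + e_c[1:])   # ...and interpolate in between
        return jacobi(u + e, f, h)

    n = 129                                    # 2^k + 1 grid points
    x = np.linspace(0.0, 1.0, n)
    h = x[1] - x[0]
    f = np.pi ** 2 * np.sin(np.pi * x)         # exact solution is sin(pi x)
    u = np.zeros(n)
    for _ in range(10):
        u = v_cycle(u, f, h)
    print("max error:", np.abs(u - np.sin(np.pi * x)).max())
    ```

    Each V-cycle reduces the error by a roughly grid-independent factor, which is the property exploited for convergence acceleration.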

  19. Improving frequencies range measurement of vibration sensor based on Fiber Bragg Grating (FBG)

    NASA Astrophysics Data System (ADS)

    Qomaruddin; Setiono, A.; Afandi, M. I.

    2017-04-01

    This research aimed to develop a vibration sensor based on a Fiber Bragg Grating (FBG). The design was realized mainly by attaching the FBG to a cantilever. The free end of the cantilever was tied to a vibration source in order to increase the measurement range of vibration frequencies. The results indicated that the developed sensor was capable of detecting a wide range of frequencies (10 to 1700 Hz). The results also showed both good stability and repeatability. The measured frequency range was 566 times greater than the range obtained in previous work.

  20. Attachment styles and contingencies of self-worth.

    PubMed

    Park, Lora E; Crocker, Jennifer; Mickelson, Kristin D

    2004-10-01

    Previous research on attachment theory has focused on mean differences in level of self-esteem among people with different attachment styles. The present study examines the associations between attachment styles and different bases of self-esteem, or contingencies of self-worth, among a sample of 795 college students. Results showed that attachment security was related to basing self-worth on family support. Both the preoccupied attachment style and fearful attachment style were related to basing self-worth on physical attractiveness. The dismissing attachment style was related to basing self-worth less on others' approval, family support, and God's love.

  1. Factors associated with Chlamydia trachomatis testing in a high school based screening and previously in clinical practice: a cross-sectional study in Norway

    PubMed Central

    2013-01-01

    Background High school based chlamydia screening has been shown to increase uptake and detect hidden infections among sexually active adolescents. Our study aimed to: i) examine the proportions of 15–20 year-olds tested in a high school based screening and previously in clinical practice, ii) determine chlamydia prevalence according to testing pattern, and iii) examine factors associated with testing in the two settings. Methods A population based cross-sectional study was conducted in 5 high schools in Norway in 2009, using web-questionnaires and Chlamydia trachomatis PCR in first-void urine (800 girls/818 boys, mean age 17.2 years). Only sexually active participants at risk for chlamydia infections were included in the analyses. Crude and multivariable logistic regression models were applied with ‘clinic based testing’ and ‘school based screening’ as outcome variables. Results 56% of girls and 21% of boys reported previous clinic based testing. In the school based screening, 93% were tested with no gender difference. 42% of girls and 74% of boys were tested for the first time at school (‘school-only test’). Both girls with clinic based testing and girls with school-only test had high chlamydia prevalence (7.3% vs 7.2%). Boys with clinic based testing had twice the prevalence of those with school-only test (6.2% vs 3.0%, p = 0.01). Half of infections were detected in participants with school-only test. One-fifth were repeat infections. In multivariable analysis of girls and boys combined, female gender, older age, early sexual debut, no condom use at first and last intercourse, steady relationship, and higher number of lifetime partners increased the odds of clinic based testing. The odds of school based screening increased with male gender, academic affiliation, later sexual debut, condom use at first intercourse, and current urogenital symptoms in multivariable analysis. Conclusions More than half the girls had been tested prior to the school based screening and had high prevalence independent of previous clinic based testing. School screening was mostly associated with factors unknown to increase chlamydia infection risk, while clinic based testing was associated with traditional risk factors. The unusually high and equal participation between genders and the detection of a large chlamydia reservoir confirms the value of school based screening suggesting this approach to be further explored in Norway. PMID:23915415

  2. The disruptive effects of methamphetamine on delayed-matching-to-sample performance reflect proactive interference and are reduced by SCH23390.

    PubMed

    Macaskill, Anne C; Harrow, Catherine C; Harper, David N

    2015-01-01

    Different drugs produce different patterns of impairment on delayed matching-to-sample tasks. For example, (+/-)3,4-methylenedioxymethamphetamine (MDMA) produces an increase in proactive interference. That is, subjects are less accurate when they are required to make a response different to the one they made on the immediately previous trial. The current study assessed whether methamphetamine also produces this particular pattern of disruption in delayed matching-to-sample performance in rats. Methamphetamine primarily reduced accuracy on trials where the correct response differed from the one made on the previous trial. Thus methamphetamine, like MDMA and other stimulant-based drugs of abuse, increased proactive interference. This impairment was reduced by prior administration of the dopamine D1 antagonist SCH23390. These results further extend a general conclusion that a range of stimulant-based drugs may disrupt working memory function indirectly via a tendency to repeat previously made responses and that this disruption is related to D1 receptor activity. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. The convergent and concurrent validity of trait-based prototype assessment of personality disorder categories in homeless persons.

    PubMed

    Samuel, Douglas B; Connolly, Adrian J; Ball, Samuel A

    2012-09-01

    The DSM-5 proposal indicates that personality disorders (PDs) be defined as collections of maladaptive traits but does not provide a specific diagnostic method. However, researchers have previously suggested that PD constructs can be assessed by comparing individuals' trait profiles with those prototypic of PDs and evidence from the five-factor model (FFM) suggests that these prototype matching scores converge moderately with traditional PD instruments. The current study investigates the convergence of FFM PD prototypes with interview-assigned PD diagnoses in a sample of 99 homeless individuals. This sample had very high rates of PDs, which extends previous research on samples with more modest prevalence rates. Results indicated that diagnostic agreement between these methods was generally low but consistent with the agreement previously observed between explicit PD measures. Furthermore, trait-based and diagnostic interview scores evinced similar relationships with clinically important indicators such as abuse history and past suicide attempts. These findings demonstrate the validity of prototype methods and suggest their consideration for assessing trait-defined PD types within DSM-5.
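
    A minimal sketch of the prototype-matching idea mentioned above: score an individual by correlating their trait profile with an expert-rated PD prototype. The profiles below are randomly generated placeholders, not FFM data from the study:

    ```python
    import numpy as np

    # Hypothetical 30-facet prototype for one PD and one client's profile,
    # both on the same 1-5 scale (values here are illustrative only).
    rng = np.random.default_rng(7)
    prototype = rng.uniform(1, 5, size=30)
    client = prototype + rng.normal(scale=0.8, size=30)   # similar but noisy profile

    def prototype_match(profile, prototype):
        """Prototype-matching score: Pearson correlation between an individual's
        facet profile and the expert-rated PD prototype."""
        return np.corrcoef(profile, prototype)[0, 1]

    print(f"prototype similarity: {prototype_match(client, prototype):.2f}")
    ```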

  4. Quantifying uncertainties in streamflow predictions through signature based inference of hydrological model parameters

    NASA Astrophysics Data System (ADS)

    Fenicia, Fabrizio; Reichert, Peter; Kavetski, Dmitri; Albert, Carlo

    2016-04-01

    The calibration of hydrological models based on signatures (e.g. Flow Duration Curves - FDCs) is often advocated as an alternative to model calibration based on the full time series of system responses (e.g. hydrographs). Signature-based calibration is motivated by various arguments. From a conceptual perspective, calibration on signatures is a way to filter out errors that are difficult to represent when calibrating on the full time series. Such errors may for example occur when observed and simulated hydrographs are shifted, either on the "time" axis (i.e. left or right), or on the "streamflow" axis (i.e. above or below). These shifts may be due to errors in the precipitation input (time or amount), and if not properly accounted for in the likelihood function, may cause biased parameter estimates (e.g. estimated model parameters that do not reproduce the recession characteristics of a hydrograph). From a practical perspective, signature-based calibration is seen as a possible solution for making predictions in ungauged basins. Where streamflow data are not available, it may in fact be possible to reliably estimate streamflow signatures. Previous research has for example shown how FDCs can be reliably estimated at ungauged locations based on climatic and physiographic influence factors. Typically, the goal of signature-based calibration is not the prediction of the signatures themselves, but the prediction of the system responses. Ideally, the prediction of system responses should be accompanied by a reliable quantification of the associated uncertainties. Previous approaches for signature-based calibration, however, do not allow reliable estimates of streamflow predictive distributions. Here, we illustrate how the Bayesian approach can be employed to obtain reliable streamflow predictive distributions based on signatures. A case study is presented, where a hydrological model is calibrated on FDCs and additional signatures. We propose an approach where the likelihood function for the signatures is derived from the likelihood for streamflow (rather than using an "ad-hoc" likelihood for the signatures as done in previous approaches). This likelihood is not easily tractable analytically and we therefore cannot apply "simple" MCMC methods. This numerical problem is solved using Approximate Bayesian Computation (ABC). Our results indicate that the proposed approach is suitable for producing reliable streamflow predictive distributions based on calibration to signature data. Moreover, our results provide indications on which signatures are more appropriate to represent the information content of the hydrograph.
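
    A toy illustration of signature-based calibration with rejection ABC: a hypothetical one-parameter reservoir model is calibrated to a flow duration curve by keeping the parameter draws whose simulated FDC lies closest to the observed one. This is a simplification of the paper's approach, which derives the signature likelihood from the streamflow likelihood rather than using a distance threshold:

    ```python
    import numpy as np

    def toy_model(k, rain, s0=10.0):
        """Toy linear-reservoir rainfall-runoff model: storage s, outflow k*s."""
        s, q = s0, []
        for p in rain:
            s += p
            out = k * s
            s -= out
            q.append(out)
        return np.array(q)

    def fdc(q, probs):
        """Flow duration curve evaluated at exceedance probabilities `probs`."""
        return np.quantile(q, 1.0 - probs)

    rng = np.random.default_rng(0)
    rain = rng.exponential(2.0, size=365)
    probs = np.linspace(0.05, 0.95, 10)
    obs_signature = fdc(toy_model(0.3, rain) + rng.normal(0, 0.05, 365), probs)

    # Rejection ABC: simulate under prior draws of k, keep the draws whose
    # simulated FDC is closest to the observed one (here the closest 1%).
    draws = rng.uniform(0.05, 0.9, size=5000)
    dists = np.array([np.linalg.norm(fdc(toy_model(k, rain), probs) - obs_signature)
                      for k in draws])
    accepted = draws[dists <= np.quantile(dists, 0.01)]
    print(f"ABC posterior mean of k: {accepted.mean():.3f} (true value 0.3)")
    ```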

  5. Development of the Mathematical Model for Ingot Quality Forecasting with Consideration of Thermal and Physical Characteristics of Mould Powder

    NASA Astrophysics Data System (ADS)

    Anisimov, K. N.; Loginov, A. M.; Gusev, M. P.; Zarubin, S. V.; Nikonov, S. V.; Krasnov, A. V.

    2017-12-01

    This paper presents the results of physical modelling of the mould powder skull in the gap between an ingot and the mould. Based on the results obtained from this and previous works, the mathematical model of mould powder behaviour in the gap and its influence on formation of surface defects was developed. The results of modelling satisfactorily conform to the industrial data on ingot surface defects.

  6. Mapping antigenic motifs in the trypomastigote small surface antigen from Trypanosoma cruzi.

    PubMed

    Balouz, Virginia; Cámara, María de Los Milagros; Cánepa, Gaspar E; Carmona, Santiago J; Volcovich, Romina; Gonzalez, Nicolás; Altcheh, Jaime; Agüero, Fernán; Buscaglia, Carlos A

    2015-03-01

    The trypomastigote small surface antigen (TSSA) is a mucin-like molecule from Trypanosoma cruzi, the etiological agent of Chagas disease, which displays amino acid polymorphisms in parasite isolates. TSSA expression is restricted to the surface of infective cell-derived trypomastigotes, where it functions as an adhesin and engages surface receptors on the host cell as a prerequisite for parasite internalization. Previous results have established TSSA-CL, the isoform encoded by the CL Brener clone, as an appealing candidate for use in serology-based diagnostics for Chagas disease. Here, we used a combination of peptide- and recombinant protein-based tools to map the antigenic structure of TSSA-CL at maximal resolution. Our results indicate the presence of different partially overlapping B-cell epitopes clustering in the central portion of TSSA-CL, which contains most of the polymorphisms found in parasite isolates. Based on these results, we assessed the serodiagnostic performance of a 21-amino-acid-long peptide that spans TSSA-CL major antigenic determinants, which was similar to the performance of the previously validated glutathione S-transferase (GST)-TSSA-CL fusion molecule. Furthermore, the tools developed for the antigenic characterization of the TSSA antigen were also used to explore other potential diagnostic applications of the anti-TSSA humoral response in Chagasic patients. Overall, our present results provide additional insights into the antigenic structure of TSSA-CL and support this molecule as an excellent target for molecular intervention in Chagas disease. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  7. Mapping Antigenic Motifs in the Trypomastigote Small Surface Antigen from Trypanosoma cruzi

    PubMed Central

    Balouz, Virginia; Cámara, María de los Milagros; Cánepa, Gaspar E.; Carmona, Santiago J.; Volcovich, Romina; Gonzalez, Nicolás; Altcheh, Jaime; Agüero, Fernán

    2015-01-01

    The trypomastigote small surface antigen (TSSA) is a mucin-like molecule from Trypanosoma cruzi, the etiological agent of Chagas disease, which displays amino acid polymorphisms in parasite isolates. TSSA expression is restricted to the surface of infective cell-derived trypomastigotes, where it functions as an adhesin and engages surface receptors on the host cell as a prerequisite for parasite internalization. Previous results have established TSSA-CL, the isoform encoded by the CL Brener clone, as an appealing candidate for use in serology-based diagnostics for Chagas disease. Here, we used a combination of peptide- and recombinant protein-based tools to map the antigenic structure of TSSA-CL at maximal resolution. Our results indicate the presence of different partially overlapping B-cell epitopes clustering in the central portion of TSSA-CL, which contains most of the polymorphisms found in parasite isolates. Based on these results, we assessed the serodiagnostic performance of a 21-amino-acid-long peptide that spans TSSA-CL major antigenic determinants, which was similar to the performance of the previously validated glutathione S-transferase (GST)-TSSA-CL fusion molecule. Furthermore, the tools developed for the antigenic characterization of the TSSA antigen were also used to explore other potential diagnostic applications of the anti-TSSA humoral response in Chagasic patients. Overall, our present results provide additional insights into the antigenic structure of TSSA-CL and support this molecule as an excellent target for molecular intervention in Chagas disease. PMID:25589551

  8. Pre-Nursing Students Perceptions of Traditional and Inquiry Based Chemistry Laboratories

    NASA Astrophysics Data System (ADS)

    Rogers, Jessica

    This paper describes a process that attempted to meet the needs of undergraduate students in a pre-nursing chemistry class. The laboratory was taught in a traditional verification style, and students were surveyed to assess their perceptions of the educational goals of the laboratory. A literature review led to an inquiry-based method, and an analysis of the needs of nurses led to more application-based activities. This new inquiry format was implemented the next semester; the students were surveyed at the end of the semester, and the results were compared to those of the previous method. Student and instructor response to the change in format was positive. Students in the traditional format placed goals concerning technique above critical thinking and felt the lab was easy to understand and carry out. Students in the inquiry-based lab felt they learned more critical-thinking skills and enjoyed the independence of designing experiments and answering their own questions.

  9. Artificial intelligence systems based on texture descriptors for vaccine development.

    PubMed

    Nanni, Loris; Brahnam, Sheryl; Lumini, Alessandra

    2011-02-01

    The aim of this work is to analyze and compare several feature extraction methods for peptide classification that are based on the calculation of texture descriptors starting from a matrix representation of the peptide. This texture-based representation of the peptide is then used to train a support vector machine classifier. In our experiments, the best results are obtained using local binary pattern variants and the discrete cosine transform with selected coefficients. These results are better than those previously reported that employed texture descriptors for peptide representation. In addition, we perform experiments that combine standard approaches based on amino acid sequence. The experimental section reports several tests performed on a vaccine dataset for the prediction of peptides that bind human leukocyte antigens and on a human immunodeficiency virus (HIV-1) dataset. Experimental results confirm the usefulness of our novel descriptors. The MATLAB implementation of our approaches is available at http://bias.csr.unibo.it/nanni/TexturePeptide.zip.
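
    A generic sketch of the texture-descriptor pipeline described above: build a matrix representation of each peptide, extract a local binary pattern histogram from it, and train an SVM. The peptide-to-matrix encoding and the toy dataset are illustrative assumptions, not the descriptors or data used in the paper:

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    AA = "ACDEFGHIKLMNPQRSTVWY"

    def peptide_to_matrix(peptide):
        """Hypothetical matrix representation: counts of adjacent residue pairs
        (a crude stand-in for the matrix encodings used in the paper)."""
        m = np.zeros((20, 20), dtype=int)
        for a, b in zip(peptide[:-1], peptide[1:]):
            m[AA.index(a), AA.index(b)] += 1
        return m

    def lbp_descriptor(matrix, p=8, r=1.0):
        """Uniform local binary pattern histogram of the peptide matrix."""
        codes = local_binary_pattern(matrix, P=p, R=r, method="uniform")
        hist, _ = np.histogram(codes, bins=p + 2, range=(0, p + 2), density=True)
        return hist

    rng = np.random.default_rng(3)
    def random_peptide(binder, length=15):
        w = np.ones(20)
        w[:5] += 2.0 * binder                 # toy "binders" favor A, C, D, E, F
        return "".join(rng.choice(list(AA), size=length, p=w / w.sum()))

    peptides = [random_peptide(i % 2) for i in range(200)]
    labels = np.array([i % 2 for i in range(200)])
    X = np.array([lbp_descriptor(peptide_to_matrix(p)) for p in peptides])
    print("5-fold CV accuracy:",
          cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
    ```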

  10. Theoretical results on the tandem junction solar cell based on its Ebers-Moll transistor model

    NASA Technical Reports Server (NTRS)

    Goradia, C.; Vaughn, J.; Baraona, C. R.

    1980-01-01

    A one-dimensional theoretical model of the tandem junction solar cell (TJC) with base resistivity greater than about 1 ohm-cm and under low-level injection has been derived. This model extends a previously published conceptual model which treats the TJC as an npn transistor. The model gives theoretical expressions for each of the Ebers-Moll type currents of the illuminated TJC and allows for the calculation of the spectral response, Isc, Voc, FF, and efficiency (η) under variation of one or more of the geometrical and material parameters and 1-MeV electron fluence. Results of computer calculations based on this model are presented and discussed. These results indicate that, for space applications, both a high beginning-of-life efficiency (greater than 15% AM0) and high radiation tolerance can be achieved only with thin (less than 50 microns) TJCs with high base resistivity (greater than 10 ohm-cm).
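
    For reference, the standard (dark) Ebers-Moll transistor relations that such a model builds on are shown below; the paper's expressions additionally include photogenerated current terms for the illuminated cell, which are not reproduced here:

    ```latex
    \begin{aligned}
    I_E &= I_{ES}\left(e^{V_{BE}/V_T} - 1\right) - \alpha_R\, I_{CS}\left(e^{V_{BC}/V_T} - 1\right),\\
    I_C &= \alpha_F\, I_{ES}\left(e^{V_{BE}/V_T} - 1\right) - I_{CS}\left(e^{V_{BC}/V_T} - 1\right),
    \end{aligned}
    \qquad \alpha_F I_{ES} = \alpha_R I_{CS}.
    ```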

  11. Exploring the effects of transducer models when training convolutional neural networks to eliminate reflection artifacts in experimental photoacoustic images

    NASA Astrophysics Data System (ADS)

    Allman, Derek; Reiter, Austin; Bell, Muyinatu

    2018-02-01

    We previously proposed a method of removing reflection artifacts in photoacoustic images that uses deep learning. Our approach generally relies on using simulated photoacoustic channel data to train a convolutional neural network (CNN) that is capable of distinguishing sources from artifacts based on unique differences in their spatial impulse responses (manifested as depth-based differences in wavefront shapes). In this paper, we directly compare a CNN trained with our previous continuous transducer model to a CNN trained with an updated discrete acoustic receiver model that more closely matches an experimental ultrasound transducer. These two CNNs were trained with simulated data and tested on experimental data. The CNN trained using the continuous receiver model correctly classified 100% of sources and 70.3% of artifacts in the experimental data. In contrast, the CNN trained using the discrete receiver model correctly classified 100% of sources and 89.7% of artifacts in the experimental images. The 19.4% increase in artifact classification accuracy indicates that an acoustic receiver model that closely mimics the experimental transducer plays an important role in improving the classification of artifacts in experimental photoacoustic data. Results are promising for developing a method to display CNN-based images that remove artifacts in addition to only displaying network-identified sources as previously proposed.
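
    A minimal CNN of the kind described, classifying toy channel-data patches as sources or reflection artifacts based on wavefront depth; the architecture, patch simulator, and training loop are generic placeholders rather than the network or data used in the paper:

    ```python
    import torch
    import torch.nn as nn

    class SourceArtifactCNN(nn.Module):
        """Small CNN labelling a patch as 'source' (0) or 'artifact' (1);
        a generic stand-in, not the paper's network."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(16 * 16 * 16, 2)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(start_dim=1))

    def toy_patch(is_artifact, size=64):
        """Parabolic 'wavefront' whose apex depth differs for artifacts."""
        y, x = torch.meshgrid(torch.arange(size), torch.arange(size), indexing="ij")
        depth = 40 if is_artifact else 20
        wave = torch.exp(-((y - depth - 0.01 * (x - size / 2) ** 2) ** 2) / 8.0)
        return (wave + 0.1 * torch.randn(size, size)).unsqueeze(0)

    X = torch.stack([toy_patch(i % 2 == 1) for i in range(64)])
    y = torch.tensor([i % 2 for i in range(64)])
    model = SourceArtifactCNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(20):                        # short full-batch training loop
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    print("training accuracy:", (model(X).argmax(1) == y).float().mean().item())
    ```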

  12. 'Getting back to normal': the added value of an art-based programme in promoting 'recovery' for common but chronic mental health problems.

    PubMed

    Makin, Sally; Gask, Linda

    2012-03-01

    OBJECTIVES. The aim of this project was to explore the added value of participation in an Arts on Prescription (AoP) programme to aid the process of recovery in people with common but chronic mental health problems who had already undergone a psychological 'talking'-based therapy. METHODS. The study utilized qualitative in-depth interviews with 15 clients with persistent anxiety and depression who had attended an AoP service and had previously received psychological therapy. RESULTS AND DISCUSSION. Attending AoP aided the process of recovery, which was perceived by participants as 'returning to normality' through enjoying life again, returning to previous activities, setting goals and stopping dwelling on the past. Most were positive about the benefits they had previously gained from talking therapies. However, these alone were not perceived as having been sufficient to achieve recovery. The AoP offered some specific opportunities in this regard, mediated by the therapeutic effect of absorption in an activity, the specific creative potential of art, and the social aspects of attending the programme. CONCLUSIONS. For some people who experience persistent or relapsing common mental health problems, participation in an arts-based programme provides 'added value' in aiding recovery in ways not facilitated by talking therapies alone.

  13. The significance and robustness of a plasma free amino acid (PFAA) profile-based multiplex function for detecting lung cancer

    PubMed Central

    2013-01-01

    Background We have recently reported on the changes in plasma free amino acid (PFAA) profiles in lung cancer patients and the efficacy of a PFAA-based, multivariate discrimination index for the early detection of lung cancer. In this study, we aimed to verify the usefulness and robustness of PFAA profiling for detecting lung cancer using new test samples. Methods Plasma samples were collected from 171 lung cancer patients and 3849 controls without apparent cancer. PFAA levels were measured by high-performance liquid chromatography (HPLC)–electrospray ionization (ESI)–mass spectrometry (MS). Results High reproducibility was observed for both the change in the PFAA profiles in the lung cancer patients and the discriminating performance for lung cancer patients compared to previously reported results. Furthermore, multivariate discriminating functions obtained in previous studies clearly distinguished the lung cancer patients from the controls based on the area under the receiver-operator characteristics curve (AUC of ROC = 0.731 ~ 0.806), strongly suggesting the robustness of the methodology for clinical use. Moreover, the results suggested that the combinatorial use of this classifier and tumor markers improves the clinical performance of tumor markers. Conclusions These findings suggest that PFAA profiling, which involves a relatively simple plasma assay and imposes a low physical burden on subjects, has great potential for improving early detection of lung cancer. PMID:23409863
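
    A small sketch of how a fixed, previously derived multivariate PFAA index could be re-evaluated on new samples by computing the ROC AUC of its scores; the weights and amino acid data below are synthetic placeholders, not the published discriminant:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Hypothetical fixed weights of a previously derived PFAA index (six amino
    # acids); applying them unchanged to new samples tests robustness.
    weights = np.array([0.8, -0.5, 0.6, -0.4, 0.3, -0.7])

    rng = np.random.default_rng(42)
    controls = rng.normal(loc=100, scale=15, size=(500, 6))               # toy µmol/L values
    patients = rng.normal(loc=100, scale=15, size=(120, 6)) + 8 * weights  # shifted along the index

    scores = np.concatenate([controls, patients]) @ weights
    labels = np.r_[np.zeros(len(controls)), np.ones(len(patients))]
    print(f"AUC of the fixed PFAA index on new samples: {roc_auc_score(labels, scores):.3f}")
    ```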

  14. Optimal back-extrapolation method for estimating plasma volume in humans using the indocyanine green dilution method

    PubMed Central

    2014-01-01

    Background The indocyanine green dilution method is one of the methods available to estimate plasma volume, although some researchers have questioned the accuracy of this method. Methods We developed a new, physiologically based mathematical model of indocyanine green kinetics that more accurately represents indocyanine green kinetics during the first few minutes postinjection than what is assumed when using the traditional mono-exponential back-extrapolation method. The mathematical model is used to develop an optimal back-extrapolation method for estimating plasma volume based on simulated indocyanine green kinetics obtained from the physiological model. Results Results from a clinical study using the indocyanine green dilution method in 36 subjects with type 2 diabetes indicate that the estimated plasma volumes are considerably lower when using the traditional back-extrapolation method than when using the proposed back-extrapolation method (mean (standard deviation) plasma volume = 26.8 (5.4) mL/kg for the traditional method vs 35.1 (7.0) mL/kg for the proposed method). The results obtained using the proposed method are more consistent with previously reported plasma volume values. Conclusions Based on the more physiological representation of indocyanine green kinetics and greater consistency with previously reported plasma volume values, the new back-extrapolation method is proposed for use when estimating plasma volume using the indocyanine green dilution method. PMID:25052018
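
    For contrast with the proposed method, the traditional mono-exponential back-extrapolation step can be sketched as follows; the fit window, dose, and synthetic indocyanine green decay data are illustrative assumptions:

    ```python
    import numpy as np

    def plasma_volume_backextrap(t, conc, dose_mg, fit_window=(2.0, 5.0)):
        """Mono-exponential back-extrapolation: fit ln(C) vs t over an early
        post-mixing window, extrapolate to t = 0, and take PV = dose / C(0).
        (The traditional method; the paper's optimal method adjusts the window
        and extrapolation based on a physiological kinetic model.)"""
        mask = (t >= fit_window[0]) & (t <= fit_window[1])
        slope, intercept = np.polyfit(t[mask], np.log(conc[mask]), 1)
        c0 = np.exp(intercept)                 # back-extrapolated concentration at t=0
        return dose_mg / c0                    # plasma volume in litres if C is mg/L

    # synthetic ICG decay: 25 mg dose into ~3 L plasma, ~20%/min clearance
    t = np.arange(1.0, 10.5, 0.5)              # minutes after injection
    true_pv = 3.0
    noise = 1 + 0.02 * np.random.default_rng(5).normal(size=t.size)
    conc = (25.0 / true_pv) * np.exp(-0.2 * t) * noise
    print(f"estimated plasma volume: {plasma_volume_backextrap(t, conc, 25.0):.2f} L")
    ```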

  15. Analysis of Dynamic Fracture Compliance Based on Poroelastic Theory - Part II: Results of Numerical and Experimental Tests

    NASA Astrophysics Data System (ADS)

    Wang, Ding; Ding, Pin-bo; Ba, Jing

    2018-03-01

    In Part I, a dynamic fracture compliance model (DFCM) was derived based on the poroelastic theory. The normal compliance of fractures is frequency-dependent and closely associated with the connectivity of porous media. In this paper, we first compare the DFCM with previous fractured media theories in the literature in a full frequency range. Furthermore, experimental tests are performed on synthetic rock specimens, and the DFCM is compared with the experimental data in the ultrasonic frequency band. Synthetic rock specimens saturated with water have more realistic mineral compositions and pore structures relative to previous works in comparison with natural reservoir rocks. The fracture/pore geometrical and physical parameters can be controlled to replicate approximately those of natural rocks. P- and S-wave anisotropy characteristics with different fracture and pore properties are calculated and numerical results are compared with experimental data. Although the measurement frequency is relatively high, the results of DFCM are appropriate for explaining the experimental data. The characteristic frequency of fluid pressure equilibration calculated based on the specimen parameters is not substantially less than the measurement frequency. In the dynamic fracture model, the wave-induced fluid flow behavior is an important factor for the fracture-wave interaction process, which differs from the models at the high-frequency limits, for instance, Hudson's un-relaxed model.

  16. Interval-based reconstruction for uncertainty quantification in PET

    NASA Astrophysics Data System (ADS)

    Kucharczak, Florentin; Loquin, Kevin; Buvat, Irène; Strauss, Olivier; Mariano-Goulart, Denis

    2018-02-01

    A new directed interval-based tomographic reconstruction algorithm, called non-additive interval based expectation maximization (NIBEM), is presented. It uses non-additive modeling of the forward operator that provides intervals instead of single-valued projections. The detailed approach is an extension of the maximum likelihood expectation maximization (MLEM) algorithm based on intervals. The main motivation for this extension is that the resulting intervals have appealing properties for estimating the statistical uncertainty associated with the reconstructed activity values. After reviewing previously published theoretical concepts related to interval-based projectors, this paper describes the NIBEM algorithm and gives examples that highlight the properties and advantages of this interval-valued reconstruction.
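
    For orientation, the single-valued MLEM baseline that the interval-based algorithm extends can be sketched as follows (toy system matrix and data; the interval-valued projector itself is not reproduced here):

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50, eps=1e-12):
        """Classical MLEM reconstruction with the multiplicative update
        x <- x / (A^T 1) * A^T (y / (A x))."""
        x = np.ones(A.shape[1])
        sens = A.T @ np.ones(A.shape[0])           # sensitivity image A^T 1
        for _ in range(n_iter):
            proj = A @ x                           # forward projection
            x *= (A.T @ (y / (proj + eps))) / (sens + eps)
        return x

    # tiny toy system: 40 detector bins, 25 image pixels, random forward operator
    rng = np.random.default_rng(2)
    A = rng.uniform(size=(40, 25))
    x_true = rng.uniform(size=25)
    y = rng.poisson(A @ x_true * 50) / 50.0        # noisy projection data
    x_hat = mlem(A, y)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
    ```

    The interval-based version replaces the single-valued forward projection with interval-valued projections, so each reconstructed voxel carries an activity interval rather than a point estimate.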

  17. Two Strain Dengue Model with Temporary Cross Immunity and Seasonality

    NASA Astrophysics Data System (ADS)

    Aguiar, Maíra; Ballesteros, Sebastien; Stollenwerk, Nico

    2010-09-01

    Models on dengue fever epidemiology have previously shown critical fluctuations with power law distributions and also deterministic chaos in some parameter regions due to the multi-strain structure of the disease pathogen. In our first model including well known biological features, we found a rich dynamical structure including limit cycles, symmetry breaking bifurcations, torus bifurcations, coexisting attractors including isola solutions and deterministic chaos (as indicated by positive Lyapunov exponents) in a much larger parameter region, which is also biologically more plausible than the previous results of other researchers. Based on these findings we will investigate the model structures further, including seasonality.
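    For readers who want a concrete starting point, below is a minimal sketch of a generic two-strain model with temporary cross-immunity (rate alpha), a secondary-infection scaling factor (phi) and seasonal forcing. The compartment structure is a common textbook form and the parameter values are purely illustrative; this is not the calibrated model analyzed in this record.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (per year): transmission, recovery, demographic turnover,
# waning of temporary cross-immunity, and a secondary-infection scaling phi.
beta0, eta = 104.0, 0.1        # baseline transmission rate and seasonal amplitude
gamma, mu = 52.0, 1.0 / 65.0   # recovery rate, birth/death rate
alpha, phi = 2.0, 0.9          # cross-immunity waning, secondary-infection factor
N = 100.0                      # population size (arbitrary units)

def rhs(t, y):
    S, I1, I2, R1, R2, S1, S2, I12, I21, R = y
    beta = beta0 * (1.0 + eta * np.cos(2.0 * np.pi * t))   # seasonal forcing
    f1 = beta / N * (I1 + phi * I21)    # force of infection, strain 1
    f2 = beta / N * (I2 + phi * I12)    # force of infection, strain 2
    return [
        mu * (N - S) - (f1 + f2) * S,       # fully susceptible
        f1 * S - (gamma + mu) * I1,         # primary infections
        f2 * S - (gamma + mu) * I2,
        gamma * I1 - (alpha + mu) * R1,     # temporarily cross-immune
        gamma * I2 - (alpha + mu) * R2,
        alpha * R1 - f2 * S1 - mu * S1,     # susceptible to the other strain
        alpha * R2 - f1 * S2 - mu * S2,
        f2 * S1 - (gamma + mu) * I12,       # secondary infections
        f1 * S2 - (gamma + mu) * I21,
        gamma * (I12 + I21) - mu * R,       # fully recovered
    ]

y0 = [N - 1.0, 0.5, 0.5, 0, 0, 0, 0, 0, 0, 0]
sol = solve_ivp(rhs, (0.0, 20.0), y0, rtol=1e-8, atol=1e-10)
print(sol.y[1, -1], sol.y[2, -1])   # strain-1 and strain-2 prevalence at t = 20
```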

  18. Performance of Renormalization Group Algebraic Turbulence Model on Boundary Layer Transition Simulation

    NASA Technical Reports Server (NTRS)

    Ahn, Kyung H.

    1994-01-01

    The RNG-based algebraic turbulence model, with a new method of solving the cubic equation and applying new length scales, is introduced. An analysis is made of the RNG length scale which was previously reported and the resulting eddy viscosity is compared with those from other algebraic turbulence models. Subsequently, a new length scale is introduced which actually uses the two previous RNG length scales in a systematic way to improve the model performance. The performance of the present RNG model is demonstrated by simulating the boundary layer flow over a flat plate and the flow over an airfoil.

  19. Two Strain Dengue Model with Temporary Cross Immunity and Seasonality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguiar, Maira; Ballesteros, Sebastien; Stollenwerk, Nico

    Models on dengue fever epidemiology have previously shown critical fluctuations with power law distributions and also deterministic chaos in some parameter regions due to the multi-strain structure of the disease pathogen. In our first model including well known biological features, we found a rich dynamical structure including limit cycles, symmetry breaking bifurcations, torus bifurcations, coexisting attractors including isola solutions and deterministic chaos (as indicated by positive Lyapunov exponents) in a much larger parameter region, which is also biologically more plausible than the previous results of other researchers. Based on these findings we will investigate the model structures further, including seasonality.

  20. Discriminative Projection Selection Based Face Image Hashing

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
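    A hedged sketch of the user-dependent selection idea follows: rank the rows of a random projection matrix by a Fisher-like ratio computed from one user's features versus impostor features, and keep only the most discriminative rows. The synthetic data, the simple sign quantization, and the helper name select_discriminative_rows are illustrative assumptions; the paper's bimodal Gaussian mixture quantization step is not reproduced.

```python
import numpy as np

def select_discriminative_rows(R, user_feats, impostor_feats, n_keep):
    """Keep the rows of R whose projections best separate the user from impostors,
    scored by a Fisher-like ratio of squared mean difference to pooled variance."""
    pu = user_feats @ R.T              # projections of the user's samples
    pi = impostor_feats @ R.T          # projections of impostor samples
    fisher = (pu.mean(0) - pi.mean(0)) ** 2 / (pu.var(0) + pi.var(0) + 1e-12)
    keep = np.argsort(fisher)[::-1][:n_keep]
    return R[keep]

rng = np.random.default_rng(1)
R = rng.standard_normal((128, 64))            # random projection matrix
user = rng.standard_normal((10, 64)) + 0.5    # synthetic user feature vectors
others = rng.standard_normal((200, 64))       # synthetic impostor feature vectors
R_sel = select_discriminative_rows(R, user, others, n_keep=32)
hash_bits = (user @ R_sel.T > 0).astype(int)  # naive sign quantization for illustration
print(hash_bits.shape)
```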

  1. Shape and texture fused recognition of flying targets

    NASA Astrophysics Data System (ADS)

    Kovács, Levente; Utasi, Ákos; Kovács, Andrea; Szirányi, Tamás

    2011-06-01

    This paper presents visual detection and recognition of flying targets (e.g. planes, missiles) based on automatically extracted shape and object texture information, for application areas like alerting, recognition and tracking. Targets are extracted based on robust background modeling and a novel contour extraction approach, and object recognition is done by comparisons to shape and texture based query results on a previously gathered real life object dataset. Application areas involve passive defense scenarios, including automatic object detection and tracking with cheap commodity hardware components (CPU, camera and GPS).

  2. Evaluation and simplification of the occupational slip, trip and fall risk-assessment test

    PubMed Central

    NAKAMURA, Takehiro; OYAMA, Ichiro; FUJINO, Yoshihisa; KUBO, Tatsuhiko; KADOWAKI, Koji; KUNIMOTO, Masamizu; ODOI, Haruka; TABATA, Hidetoshi; MATSUDA, Shinya

    2016-01-01

    Objective: The purpose of this investigation is to evaluate the efficacy of the occupational slip, trip and fall (STF) risk assessment test developed by the Japan Industrial Safety and Health Association (JISHA). We further intended to simplify the test to improve efficiency. Methods: A previous cohort study was performed using 540 employees aged ≥50 years who took the JISHA’s STF risk assessment test. We conducted multivariate analysis using these previous results as baseline values and answers to questionnaire items or score on physical fitness tests as variables. The screening efficiency of each model was evaluated based on the obtained receiver operating characteristic (ROC) curve. Results: The area under the ROC obtained in multivariate analysis was 0.79 when using all items. Six of the 25 questionnaire items were selected for stepwise analysis, giving an area under the ROC curve of 0.77. Conclusion: Based on the results of follow-up performed one year after the initial examination, we successfully determined the usefulness of the STF risk assessment test. Administering a questionnaire alone is sufficient for screening subjects at risk of STF during the subsequent one-year period. PMID:27021057
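    The screening-efficiency evaluation described above rests on the area under the ROC curve of a multivariate model built from questionnaire items. The snippet below shows the general recipe on synthetic data; the logistic model, the six-item design and the outcome definition are placeholders, not the JISHA test items or the study cohort.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-in: six questionnaire items (X) and an STF outcome (y).
rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(540, 6)).astype(float)
y = (X.sum(axis=1) + rng.normal(0, 3, 540) > 14).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)    # multivariate model
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"Area under the ROC curve: {auc:.2f}")          # the paper reports 0.77 with 6 items
```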

  3. Theory of chemical bonds in metalloenzymes XXI. Possible mechanisms of water oxidation in oxygen evolving complex of photosystem II

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Kizashi; Shoji, Mitsuo; Isobe, Hiroshi; Yamanaka, Shusuke; Kawakami, Takashi; Yamada, Satoru; Katouda, Michio; Nakajima, Takahito

    2018-03-01

    Possible mechanisms for water cleavage in the oxygen evolving complex (OEC) of photosystem II (PSII) have been investigated based on broken-symmetry (BS) hybrid DFT (HDFT)/def2 TZVP calculations in combination with available XRD, XFEL, EXAFS, XES and EPR results. The BS HDFT and experimental results have provided basic concepts for understanding the chemical bonds of the CaMn4O5 cluster in the catalytic site of the OEC of PSII and for elucidating the mechanism of photosynthetic water cleavage. The scope and applicability of the HDFT methods have been examined in relation to the relative stabilities of nine possible intermediates, such as Mn-hydroxide, Mn-oxo, Mn-peroxo and Mn-superoxo, in order to understand O-O (O-OH) bond formation in the S3 and/or S4 states of the OEC of PSII. The relative stabilities among these intermediates are variable, depending on the weight of the Hartree-Fock exchange term of HDFT. The Mn-hydroxide, Mn-oxo and Mn-superoxo intermediates are found to be preferable in the weak, intermediate and strong electron correlation regimes, respectively. Recent, mutually differing serial femtosecond X-ray (SFX) results in the S3 state are investigated based on the proposed basic concepts under the assumption of different water-insertion steps for water cleavage in the Kok cycle. The observation of water insertion in the S3 state is compatible with previous large-scale QM/MM results and a previous theoretical proposal of a chemical equilibrium mechanism in the S3 state. On the other hand, the non-detection of water insertion in the S3 state in other SFX results is consistent with the previous proposal of O-OH (or O-O) bond formation in the S4 state. Radical coupling and non-adiabatic one-electron transfer (NA-OET) mechanisms for O-O bond formation are examined using energy diagrams from QM calculations and from QM(UB3LYP)/MM calculations. Possible reaction pathways for the O-O and O-OH bond formations are also investigated based on two water-inlet pathways for oxygen evolution in the OEC of PSII. Future perspectives are discussed in relation to post-HDFT calculations of the energy diagrams for elucidating the mechanism of water oxidation in the OEC of PSII.

  4. The use of research‐based theatre in a project related to metastatic breast cancer

    PubMed Central

    Gray, Ross; Sinding, Chris; Ivonoffski, Vrenia; Fitch, Margaret; Hampson, Ann; Greenberg, Marlene

    2008-01-01

    Research‐based theatre represents an innovative approach to disseminating the results of qualitative studies. In this paper, we provide a rationale for the importance of research‐based theatre and also review previous work that has been done in the area. We then describe our experience in transforming research data into a dramatic production, Handle with Care? This production was based on two studies – one with women with metastatic breast cancer, and the other with medical oncologists treating breast cancer patients. Results from ongoing assessment of the project are reported. We discuss some of the factors related to the success of Handle with Care? and reflect on what has been learned about the process of developing dramatic pieces related to serious illness. PMID:11281920

  5. Assessing the Effectiveness of Studio Physics in Introductory-Level Courses at Georgia State University

    NASA Astrophysics Data System (ADS)

    Upton, Brianna; Evans, John; Morrow, Cherilynn; Thoms, Brian

    2009-11-01

    Previous studies have shown that many students have misconceptions about basic concepts in physics. Moreover, it has been concluded that one of the challenges lies in the teaching methodology. To address this, Georgia State University has begun teaching studio algebra-based physics. Although many institutions have implemented studio physics, most have done so in calculus-based sequences. The effectiveness of the studio approach in an algebra-based introductory physics course needs further investigation. A 3-semester study assessing the effectiveness of studio physics in an algebra-based physics sequence has been performed. This study compares the results of student pre- and post-tests using the Force Concept Inventory. Using the results from this assessment tool, we will discuss the effectiveness of the studio approach to teaching physics at GSU.

  6. Agent-based real-time signal coordination in congested networks.

    DOT National Transportation Integrated Search

    2014-01-01

    This study is the continuation of a previous NEXTRANS study on agent-based reinforcement : learning methods for signal coordination in congested networks. In the previous study, the : formulation of a real-time agent-based traffic signal control in o...

  7. Surface Stabilized InP/GaP/ZnS Quantum Dots with Mg Ions for WLED Application.

    PubMed

    Park, Joong Pill; Kim, Sang-Wook

    2016-05-01

    InP-based quantum dots (QDs), one of the most highlighted families of cadmium-free QDs, have seen steady improvement in their optical properties. However, InP-based QDs have some practical drawbacks compared with CdSe-based QDs, for example their stability. The poor stability of InP-based QDs causes critical problems, such as agglomeration and photoluminescence quenching in light emitting diodes (LEDs). This has to be solved for applications, and most research has focused on thick outer shells as an effective solution. We introduced magnesium cations to improve the stability of InP-based QDs. We applied very small amounts of Mg cations as surface stabilizers and, as a result, the stability of the QDs is clearly improved. The QD-based LED chips also yield improved figures, including a color rendering index Ra of 84.4, a CCT of 3799 K, and a luminous efficiency of 129.57 lm/W, which are substantially better than our previous results.

  8. Classroom-Based Science Research at the Introductory Level: Changes in Career Choices and Attitude

    PubMed Central

    Harrison, Melinda; Dunbar, David; Ratmansky, Lisa; Lopatto, David

    2011-01-01

    Our study, focused on classroom-based research at the introductory level and using the Phage Genomics course as the model, shows evidence that first-year students doing research learn the process of science as well as how scientists practice science. A preliminary but notable outcome of our work, which is based on a small sample, is the change in student interest in considering different career choices such as graduate education and science in general. This is particularly notable, as previous research has described research internships as clarifying or confirming rather than changing undergraduates’ decisions to pursue graduate education. We hypothesize that our results differ from previous studies of the impact of engaging in research because the students in our study are still in the early stages of their undergraduate careers. Our work builds upon the classroom-based research movement and should be viewed as encouraging to the Vision and Change in Undergraduate Biology Education movement advocated by the American Association for the Advancement of Science, the National Science Foundation, and other undergraduate education stakeholders. PMID:21885824

  9. Regulation Effects by Programmed Molecules for Transcription-Based Diagnostic Automata towards Therapeutic Use

    NASA Astrophysics Data System (ADS)

    Hirabayashi, Miki; Ohashi, Hirotada; Kubo, Tai

    We present an experimental analysis of the controllability of our transcription-based diagnostic biomolecular automata by programmed molecules. Focusing on noninvasive transcriptome diagnosis using salivary mRNAs, we previously proposed the novel concept of a diagnostic device based on DNA computation. This system consists of a main computational element with a stem-shaped promoter region and a pseudo-loop-shaped read-only memory region for transcription regulation through the conformation change caused by the recognition of disease-related biomarkers. We utilize the transcription of a malachite green aptamer sequence triggered by target recognition to observe detection. This algorithm makes it possible to release RNA-aptamer drugs repeatedly for in-vivo use, unlike the previously proposed digestion-based systems that rely on a restriction enzyme; however, the controllability of aptamer release was insufficient at that stage. In this paper, we verify the regulation effect of programmed molecules on aptamer transcription under basic conditions, working towards the development of therapeutic automata. These results bring us one step closer to the realization of new intelligent diagnostic and therapeutic automata based on molecular circuits.

  10. Classroom-based science research at the introductory level: changes in career choices and attitude.

    PubMed

    Harrison, Melinda; Dunbar, David; Ratmansky, Lisa; Boyd, Kimberly; Lopatto, David

    2011-01-01

    Our study, focused on classroom-based research at the introductory level and using the Phage Genomics course as the model, shows evidence that first-year students doing research learn the process of science as well as how scientists practice science. A preliminary but notable outcome of our work, which is based on a small sample, is the change in student interest in considering different career choices such as graduate education and science in general. This is particularly notable, as previous research has described research internships as clarifying or confirming rather than changing undergraduates' decisions to pursue graduate education. We hypothesize that our results differ from previous studies of the impact of engaging in research because the students in our study are still in the early stages of their undergraduate careers. Our work builds upon the classroom-based research movement and should be viewed as encouraging to the Vision and Change in Undergraduate Biology Education movement advocated by the American Association for the Advancement of Science, the National Science Foundation, and other undergraduate education stakeholders.

  11. Gene discovery in the hamster: a comparative genomics approach for gene annotation by sequencing of hamster testis cDNAs

    PubMed Central

    Oduru, Sreedhar; Campbell, Janee L; Karri, SriTulasi; Hendry, William J; Khan, Shafiq A; Williams, Simon C

    2003-01-01

    Background Complete genome annotation will likely be achieved through a combination of computer-based analysis of available genome sequences combined with direct experimental characterization of expressed regions of individual genomes. We have utilized a comparative genomics approach involving the sequencing of randomly selected hamster testis cDNAs to begin to identify genes not previously annotated on the human, mouse, rat and Fugu (pufferfish) genomes. Results 735 distinct sequences were analyzed for their relatedness to known sequences in public databases. Eight of these sequences were derived from previously unidentified genes and expression of these genes in testis was confirmed by Northern blotting. The genomic locations of each sequence were mapped in human, mouse, rat and pufferfish, where applicable, and the structure of their cognate genes was derived using computer-based predictions, genomic comparisons and analysis of uncharacterized cDNA sequences from human and macaque. Conclusion The use of a comparative genomics approach resulted in the identification of eight cDNAs that correspond to previously uncharacterized genes in the human genome. The proteins encoded by these genes included a new member of the kinesin superfamily, a SET/MYND-domain protein, and six proteins for which no specific function could be predicted. Each gene was expressed primarily in testis, suggesting that they may play roles in the development and/or function of testicular cells. PMID:12783626

  12. Use of New Wood Material for Pallets, Containers is Stagnant to Declining

    Treesearch

    Robert J. Bush; Philip A. Araman

    1997-01-01

    In 1994, the authors reported in the Pallet Enterprise on their study of new and recovered wood use for pallets and containers. In this article they report on the results of a new survey in 1996 of new wood use by the pallet and container industry, comparing the latest results to previous studies. Their research is based on a study of 2,600 wooden pallet and container...

  13. Unconventional Constraints on Nitrogen Chemistry using DC3 Observations and Trajectory-based Chemical Modeling

    NASA Astrophysics Data System (ADS)

    Shu, Q.; Henderson, B. H.

    2017-12-01

    Chemical transport models underestimate nitrogen dioxide observations in the upper troposphere (UT). Previous research in the UT succeeded in combining model predictions with field campaign measurements to demonstrate that the nitric acid formation rate (HO + NO2 → HNO3 (R1)) is overestimated by 22% (Henderson et al., 2012). A subsequent publication (Seltzer et al., 2015) demonstrated that this single chemical constraint alters ozone and aerosol formation/composition. This work attempts to replicate previous chemical constraints with newer observations and a different modeling framework. We apply the previously successful constraint framework to Deep Convection Clouds and Chemistry (DC3), a more recent field campaign where simulated nitrogen imbalances still exist. Freshly convected air parcels identified in the DC3 dataset serve as initial coordinates for Lagrangian trajectories. Along each trajectory, we simulate the air parcel chemical state. Samples along the trajectories form ensembles that represent possible realizations of UT air parcels. We then apply Bayesian inference to constrain nitrogen chemistry and compare results to the existing literature. We anticipate that the results will confirm the overestimation of the HNO3 formation rate found in previous work and provide further constraints on other nitrogen reaction rate coefficients that affect terminal products from NOx. We will particularly focus on organic nitrate chemistry that the laboratory literature has yet to fully address. The results will provide useful insights into nitrogen chemistry that affects climate and human health.
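    As an illustration of the kind of Bayesian constraint described above, the sketch below performs a grid-based posterior update on a scale factor applied to the HO + NO2 -> HNO3 rate coefficient, using a toy forward model and a single synthetic NO2 value. All numbers and the placeholder function simulated_no2 are assumptions, not DC3 results.

```python
import numpy as np

s_grid = np.linspace(0.5, 1.5, 201)          # candidate scalings of the nominal rate
prior = np.ones_like(s_grid) / s_grid.size   # flat prior over the scaling factor

def simulated_no2(s):
    # Placeholder forward model: simulated NO2 decreases as the HNO3 sink strengthens.
    return 120.0 / (1.0 + 0.8 * s)           # pptv, illustrative only

obs, sigma = 80.0, 10.0                      # synthetic "observed" NO2 and uncertainty
likelihood = np.exp(-0.5 * ((simulated_no2(s_grid) - obs) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum()
print("posterior mean rate scaling:", (s_grid * posterior).sum())
```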

  14. Measurement of high-energy neutron flux above ground utilizing a spallation based multiplicity technique

    DOE PAGES

    Roecker, Caleb; Bernstein, Adam; Marleau, Peter; ...

    2016-11-14

    Cosmogenic high-energy neutrons are a ubiquitous, difficult to shield, poorly measured background. Above ground, the high-energy neutron energy-dependent flux has been measured, with significantly varying results. Below ground, high-energy neutron fluxes are largely unmeasured. Here we present a reconstruction algorithm to unfold the incident neutron energy-dependent flux measured using the Multiplicity and Recoil Spectrometer (MARS), simulate test cases to verify the algorithm, and provide a new measurement of the above ground high-energy neutron energy-dependent flux with a detailed systematic uncertainty analysis. Uncertainty estimates are provided based upon the measurement statistics, the incident angular distribution, the surrounding environment of the Monte Carlo model, and the MARS triggering efficiency. The quantified systematic uncertainty is dominated by the assumed incident neutron angular distribution and the surrounding environment of the Monte Carlo model. The energy-dependent neutron flux between 90 MeV and 400 MeV is reported. Between 90 MeV and 250 MeV the MARS results are comparable to previous Bonner sphere measurements. Over the total energy regime measured, the MARS results lie within the span of previous measurements. Lastly, these results demonstrate the feasibility of future below ground measurements with MARS.

  15. Measurement of high-energy neutron flux above ground utilizing a spallation based multiplicity technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roecker, Caleb; Bernstein, Adam; Marleau, Peter

    Cosmogenic high-energy neutrons are a ubiquitous, difficult to shield, poorly measured background. Above ground, the high-energy neutron energy-dependent flux has been measured, with significantly varying results. Below ground, high-energy neutron fluxes are largely unmeasured. Here we present a reconstruction algorithm to unfold the incident neutron energy-dependent flux measured using the Multiplicity and Recoil Spectrometer (MARS), simulate test cases to verify the algorithm, and provide a new measurement of the above ground high-energy neutron energy-dependent flux with a detailed systematic uncertainty analysis. Uncertainty estimates are provided based upon the measurement statistics, the incident angular distribution, the surrounding environment of the Monte Carlo model, and the MARS triggering efficiency. The quantified systematic uncertainty is dominated by the assumed incident neutron angular distribution and the surrounding environment of the Monte Carlo model. The energy-dependent neutron flux between 90 MeV and 400 MeV is reported. Between 90 MeV and 250 MeV the MARS results are comparable to previous Bonner sphere measurements. Over the total energy regime measured, the MARS results lie within the span of previous measurements. Lastly, these results demonstrate the feasibility of future below ground measurements with MARS.

  16. Structure of the SnO2(110 ) -(4 ×1 ) Surface

    NASA Astrophysics Data System (ADS)

    Merte, Lindsay R.; Jørgensen, Mathias S.; Pussi, Katariina; Gustafson, Johan; Shipilin, Mikhail; Schaefer, Andreas; Zhang, Chu; Rawle, Jonathan; Nicklin, Chris; Thornton, Geoff; Lindsay, Robert; Hammer, Bjørk; Lundgren, Edvin

    2017-09-01

    Using surface x-ray diffraction (SXRD), quantitative low-energy electron diffraction (LEED), and density-functional theory (DFT) calculations, we have determined the structure of the (4 ×1 ) reconstruction formed by sputtering and annealing of the SnO2(110 ) surface. We find that the reconstruction consists of an ordered arrangement of Sn3O3 clusters bound atop the bulk-terminated SnO2(110 ) surface. The model was found by application of a DFT-based evolutionary algorithm with surface compositions based on SXRD, and shows excellent agreement with LEED and with previously published scanning tunneling microscopy measurements. The model proposed previously consisting of in-plane oxygen vacancies is thus shown to be incorrect, and our result suggests instead that Sn(II) species in interstitial positions are the more relevant features of reduced SnO2(110 ) surfaces.

  17. Model-Based IN SITU Parameter Estimation of Ultrasonic Guided Waves in AN Isotropic Plate

    NASA Astrophysics Data System (ADS)

    Hall, James S.; Michaels, Jennifer E.

    2010-02-01

    Most ultrasonic systems employing guided waves for flaw detection require information such as dispersion curves, transducer locations, and expected propagation loss. Degraded system performance may result if assumed parameter values do not accurately reflect the actual environment. By characterizing the propagating environment in situ at the time of test, potentially erroneous a priori estimates are avoided and performance of ultrasonic guided wave systems can be improved. A four-part model-based algorithm is described in the context of previous work that estimates model parameters whereby an assumed propagation model is used to describe the received signals. This approach builds upon previous work by demonstrating the ability to estimate parameters for the case of single mode propagation. Performance is demonstrated on signals obtained from theoretical dispersion curves, finite element modeling, and experimental data.

  18. Full cycle trigonometric function on Intel Quartus II Verilog

    NASA Astrophysics Data System (ADS)

    Mustapha, Muhazam; Zulkarnain, Nur Antasha

    2018-02-01

    This paper discusses an improvement of previous research on hardware-based trigonometric calculations. The tangent function is also implemented to complete the set. The functions have been simulated in Quartus II and the results are compared with the previous work. The number of bits has also been extended for each trigonometric function. The design is based on RTL due to its resource-efficient nature. At an earlier stage, a technology-independent test bench simulation was conducted in ModelSim, whose convenience in capturing simulation data allows accuracy information to be obtained. In the second stage, Intel/Altera Quartus II is used to simulate on a technology-dependent platform, namely Intel/Altera's own. Real data on the number of logic elements used and on the propagation delay have also been obtained.

  19. Web 2.0-Based Crowdsourcing for High-Quality Gold Standard Development in Clinical Natural Language Processing

    PubMed Central

    Deleger, Louise; Li, Qi; Kaiser, Megan; Stoutenborough, Laura

    2013-01-01

    Background A high-quality gold standard is vital for supervised, machine learning-based, clinical natural language processing (NLP) systems. In clinical NLP projects, expert annotators traditionally create the gold standard. However, traditional annotation is expensive and time-consuming. To reduce the cost of annotation, general NLP projects have turned to crowdsourcing based on Web 2.0 technology, which involves submitting smaller subtasks to a coordinated marketplace of workers on the Internet. Many studies have been conducted in the area of crowdsourcing, but only a few have focused on tasks in the general NLP field and only a handful in the biomedical domain, usually based upon very small pilot sample sizes. In addition, the quality of the crowdsourced biomedical NLP corpora was never exceptional when compared to traditionally-developed gold standards. The previously reported results on a medical named entity annotation task showed an F-measure-based agreement of 0.68 between crowdsourced and traditionally-developed corpora. Objective Building upon previous work from the general crowdsourcing research, this study investigated the usability of crowdsourcing in the clinical NLP domain with special emphasis on achieving high agreement between crowdsourced and traditionally-developed corpora. Methods To build the gold standard for evaluating the crowdsourcing workers' performance, 1042 clinical trial announcements (CTAs) from the ClinicalTrials.gov website were randomly selected and double annotated for medication names, medication types, and linked attributes. For the experiments, we used CrowdFlower, an Amazon Mechanical Turk-based crowdsourcing platform. We calculated sensitivity, precision, and F-measure to evaluate the quality of the crowd's work and tested the statistical significance (P<.001, chi-square test) to detect differences between the crowdsourced and traditionally-developed annotations. Results The agreement between the crowd's annotations and the traditionally-generated corpora was high for: (1) annotations (0.87, F-measure for medication names; 0.73, medication types), (2) correction of previous annotations (0.90, medication names; 0.76, medication types), and excellent for (3) linking medications with their attributes (0.96). Simple voting provided the best judgment aggregation approach. There was no statistically significant difference between the crowd and traditionally-generated corpora. Our results showed a 27.9% improvement over previously reported results on the medication named entity annotation task. Conclusions This study offers three contributions. First, we proved that crowdsourcing is a feasible, inexpensive, fast, and practical approach to collect high-quality annotations for clinical text (when protected health information was excluded). We believe that well-designed user interfaces and a rigorous quality control strategy for entity annotation and linking were critical to the success of this work. Second, as a further contribution to the Internet-based crowdsourcing field, we will publicly release the JavaScript and CrowdFlower Markup Language infrastructure code that is necessary to utilize CrowdFlower's quality control and crowdsourcing interfaces for named entity annotations. Finally, to spur future research, we will release the CTA annotations that were generated by traditional and crowdsourced approaches. PMID:23548263
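    For concreteness, the agreement figures quoted above are of the precision/recall/F-measure kind, computed between the crowdsourced and the traditionally developed annotation sets. Below is a minimal sketch that treats the traditional corpus as the reference; the tuple format and the toy annotations are assumptions, not the CTA corpus.

```python
def agreement_f_measure(crowd, gold):
    """Span-level agreement between two annotation sets; the 'gold' set is the
    traditionally developed reference. Annotations are (doc_id, start, end, label)."""
    crowd, gold = set(crowd), set(gold)
    tp = len(crowd & gold)
    precision = tp / len(crowd) if crowd else 0.0
    recall = tp / len(gold) if gold else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f

crowd = {("cta1", 10, 18, "medication"), ("cta1", 30, 41, "medication"),
         ("cta2", 5, 12, "medication")}
gold = {("cta1", 10, 18, "medication"), ("cta2", 5, 12, "medication"),
        ("cta2", 50, 57, "medication")}
print(agreement_f_measure(crowd, gold))   # roughly (0.67, 0.67, 0.67) for this toy example
```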

  20. In vivo dosimetry for total body irradiation: five‐year results and technique comparison

    PubMed Central

    Warry, Alison J.; Eaton, David J.; Collis, Christopher H.; Rosenberg, Ivan

    2014-01-01

    The aim of this work is to establish if the new CT‐based total body irradiation (TBI) planning techniques used at University College London Hospital (UCLH) and Royal Free Hospital (RFH) are comparable to the previous technique at the Middlesex Hospital (MXH) by analyzing predicted and measured diode results. TBI aims to deliver a homogeneous dose to the entire body, typically using extended SSD fields with beam modulation to limit doses to organs at risk. In vivo dosimetry is used to verify the accuracy of delivered doses. In 2005, when the Middlesex Hospital was decommissioned and merged with UCLH, both UCLH and the RFH introduced updated CT‐planned TBI techniques, based on the old MXH technique. More CT slices and in vivo measurement points were used by both; UCLH introduced a beam modulation technique using MLC segments, while RFH updated to a combination of lead compensators and bolus. Semiconductor diodes were used to measure entrance and exit doses in several anatomical locations along the entire body. Diode results from both centers for over five years of treatments were analyzed and compared to the previous MXH technique for accuracy and precision of delivered doses. The most stable location was the field center with standard deviations of 4.1% (MXH), 3.7% (UCLH), and 1.7% (RFH). The least stable position was the ankles. Mean variation with fraction number was within 1.5% for all three techniques. In vivo dosimetry can be used to verify complex modulated CT‐planned TBI, and demonstrate improvements and limitations in techniques. The results show that the new UCLH technique is no worse than the previous MXH one and comparable to the current RFH technique. PACS numbers: 87.55.Qr, 87.56.N‐ PMID:25207423

  1. In vivo dosimetry for total body irradiation: five-year results and technique comparison.

    PubMed

    Patel, Reshma P; Warry, Alison J; Eaton, David J; Collis, Christopher H; Rosenberg, Ivan

    2014-07-08

    The aim of this work is to establish if the new CT-based total body irradiation (TBI) planning techniques used at University College London Hospital (UCLH) and Royal Free Hospital (RFH) are comparable to the previous technique at the Middlesex Hospital (MXH) by analyzing predicted and measured diode results. TBI aims to deliver a homogeneous dose to the entire body, typically using extended SSD fields with beam modulation to limit doses to organs at risk. In vivo dosimetry is used to verify the accuracy of delivered doses. In 2005, when the Middlesex Hospital was decommissioned and merged with UCLH, both UCLH and the RFH introduced updated CT-planned TBI techniques, based on the old MXH technique. More CT slices and in vivo measurement points were used by both; UCLH introduced a beam modulation technique using MLC segments, while RFH updated to a combination of lead compensators and bolus. Semiconductor diodes were used to measure entrance and exit doses in several anatomical locations along the entire body. Diode results from both centers for over five years of treatments were analyzed and compared to the previous MXH technique for accuracy and precision of delivered doses. The most stable location was the field center with standard deviations of 4.1% (MXH), 3.7% (UCLH), and 1.7% (RFH). The least stable position was the ankles. Mean variation with fraction number was within 1.5% for all three techniques. In vivo dosimetry can be used to verify complex modulated CT-planned TBI, and demonstrate improvements and limitations in techniques. The results show that the new UCLH technique is no worse than the previous MXH one and comparable to the current RFH technique.

  2. Risk of placenta previa in second birth after first birth cesarean section: a population-based study and meta-analysis

    PubMed Central

    2011-01-01

    Background Objective: To compare the risk of placenta previa at second birth among women who had a cesarean section (CS) at first birth with women who delivered vaginally. Methods Retrospective cohort study of 399,674 women who gave birth to a singleton first and second baby between April 2000 and February 2009 in England. Multiple logistic regression was used to adjust the estimates for maternal age, ethnicity, deprivation, placenta previa at first birth, inter-birth interval and pregnancy complications. In addition, we conducted a meta-analysis of the reported results in peer-reviewed articles since 1980. Results The rate of placenta previa at second birth for women with vaginal first births was 4.4 per 1000 births, compared to 8.7 per 1000 births for women with CS at first birth. After adjustment, CS at first birth remained associated with an increased risk of placenta previa (odds ratio = 1.60; 95% CI 1.44 to 1.76). In the meta-analysis of 37 previously published studies from 21 countries, the overall pooled random-effects odds ratio was 2.20 (95% CI 1.96-2.46). Our results from the current study are consistent with those of the meta-analysis, as the pooled odds ratio for the six population-based cohort studies that analyzed second births only was 1.51 (95% CI 1.39-1.65). Conclusions There is an increased risk of placenta previa in the subsequent pregnancy after CS delivery at first birth, but the risk is lower than previously estimated. Given the placenta previa rate in England and the adjusted effect of previous CS, 359 deliveries by CS at first birth would result in one additional case of placenta previa in the next pregnancy. PMID:22103697
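    The pooled random-effects odds ratio reported above is the kind of quantity produced by DerSimonian-Laird pooling of per-study log odds ratios. The sketch below shows that calculation on made-up study estimates, not the 37 studies analyzed in the record.

```python
import numpy as np

def dersimonian_laird(log_or, se):
    """Random-effects pooling of study log odds ratios (DerSimonian-Laird).
    Returns the pooled OR and its approximate 95% confidence interval."""
    w = 1.0 / se**2                                   # fixed-effect weights
    theta_fe = np.sum(w * log_or) / np.sum(w)
    q = np.sum(w * (log_or - theta_fe) ** 2)          # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log_or) - 1)) / c)      # between-study variance
    w_re = 1.0 / (se**2 + tau2)
    theta = np.sum(w_re * log_or) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(theta), np.exp(theta - 1.96 * se_re), np.exp(theta + 1.96 * se_re)

# Illustrative (made-up) study log odds ratios and standard errors.
log_or = np.log(np.array([1.6, 2.1, 2.5, 1.9, 3.0]))
se = np.array([0.15, 0.20, 0.25, 0.18, 0.30])
print(dersimonian_laird(log_or, se))
```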

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waldhoff, Stephanie T.; Anthoff, David; Rose, Steven K.

    We use FUND 3.8 to estimate the social cost of four greenhouse gases: carbon dioxide, methane, nitrous oxide, and sulphur hexafluoride emissions. The damage potential for each gas—the ratio of the social cost of the non-carbon dioxide greenhouse gas to the social cost of carbon dioxide—is also estimated. The damage potentials are compared to several metrics, focusing in particular on the global warming potentials, which are frequently used to measure the trade-off between gases in the form of carbon dioxide equivalents. We find that damage potentials could be significantly higher than global warming potentials. This finding implies that previous papers have underestimated the relative importance of reducing non-carbon dioxide greenhouse gas emissions from an economic damage perspective. We show results for a range of sensitivity analyses: carbon dioxide fertilization on agriculture productivity, terrestrial feedbacks, climate sensitivity, discounting, equity weighting, and socioeconomic and emissions scenarios. The sensitivity of the results to carbon dioxide fertilization is a primary focus as it is an important element of climate change that has not been considered in much of the previous literature. We estimate that carbon dioxide fertilization has a large positive impact that reduces the social cost of carbon dioxide with a much smaller effect on the other greenhouse gases. As a result, our estimates of the damage potentials of methane and nitrous oxide are much higher compared to estimates that ignore carbon dioxide fertilization. Consequently, our base estimates of the damage potential for methane and nitrous oxide that include carbon dioxide fertilization are twice their respective global warming potentials. Our base estimate of the damage potential of sulphur hexafluoride is similar to the one previous estimate, both almost three times the global warming potential.

  4. Transient Three-Dimensional Side Load Analysis of Out-of-Round Film Cooled Nozzles

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2010-01-01

    The objective of this study is to investigate the effect of nozzle out-of-roundness on the transient startup side loads. The out-of-roundness could be the result of asymmetric loads induced by hardware attached to the nozzle, asymmetric internal stresses induced by previous tests and/or deformation, such as creep, from previous tests. The rocket engine studied encompasses a regeneratively cooled thrust chamber and a film cooled nozzle extension with film coolant distributed from a turbine exhaust manifold. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and a transient inlet history based on an engine system simulation. Transient startup computations were performed with the out-of-roundness achieved by four degrees of ovalization of the nozzle: one perfectly round, one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The computed side load physics caused by the nozzle out-of-roundness and its effect on nozzle side load are reported and discussed.

  5. The impact of multiple information on coupled awareness-epidemic dynamics in multiplex networks

    NASA Astrophysics Data System (ADS)

    Pan, Yaohui; Yan, Zhijun

    2018-02-01

    Growing interest has emerged in the study of the interplay between awareness and epidemics in multiplex networks. However, previous studies on this issue usually assume that all aware individuals take the same level of precautions, ignoring individual heterogeneity. In this paper, we investigate the coupled awareness-epidemic dynamics in multiplex networks considering individual heterogeneity. Here, the precaution levels are heterogeneous and depend on three types of information: contact information and local and global prevalence information. The results show that contact-based precautions can decrease the epidemic prevalence and augment the epidemic threshold, but prevalence-based precautions, regardless of local or global information, can only decrease the epidemic prevalence. Moreover, unlike previous studies in single-layer networks, we do not find a greater impact of local prevalence information on the epidemic prevalence compared to global prevalence information. In addition, we find that the altruistic behaviors of infected individuals can effectively suppress epidemic spreading, especially when the level of contact-based precaution is high.

  6. Probabilistic representation in syllogistic reasoning: A theory to integrate mental models and heuristics.

    PubMed

    Hattori, Masasi

    2016-12-01

    This paper presents a new theory of syllogistic reasoning. The proposed model assumes there are probabilistic representations of given signature situations. Instead of conducting an exhaustive search, the model constructs an individual-based "logical" mental representation that expresses the most probable state of affairs, and derives a necessary conclusion that is not inconsistent with the model using heuristics based on informativeness. The model is a unification of previous influential models. Its descriptive validity has been evaluated against existing empirical data and two new experiments, and by qualitative analyses based on previous empirical findings, all of which supported the theory. The model's behavior is also consistent with findings in other areas, including working memory capacity. The results indicate that people assume the probabilities of all target events mentioned in a syllogism to be almost equal, which suggests links between syllogistic reasoning and other areas of cognition. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.

  7. A new respirometric endpoint-based biosensor to assess the relative toxicity of chemicals on immobilized human cells.

    PubMed

    Dragone, Roberto; Frazzoli, Chiara; Grappelli, Claudio; Campanella, Luigi

    2009-01-01

    Several functional and biochemical parameters have been proposed as biomarkers of the effects of environmental pollutants. A rapid biosensor working with immobilized human U-937 cells was developed and applied to environmentally relevant chemicals with different structures and toxicological pathways, i.e. benzalkonium chloride, clofibric acid, diclofenac, mercury nitrate, ofloxacin, and sodium dodecyl sulphate. Respiration of the cells was relied upon as a comprehensive biochemical effect for screening purposes. The analytical parameter (ΔppmO2) and the toxicological index (respiratory inhibition, Δ%) measured after 1 h of exposure were utilized for the dose-response relationship study. Results (toxicity rating scales based on Δ50% and steepness) were compared with those obtained by the same approach previously optimized on Saccharomyces cerevisiae. The toxicity rating scale obtained by the biomarker based on human mitochondrial and cell metabolic activities compared well with the previous scale obtained on yeast cells and with available in-vivo acute toxicity indexes; respiration was confirmed as a toxicological endpoint reliably measurable by the biosensor.

  8. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method, the other implementing a symbolic analysis method into a unified event-based decision analysis software system for realtime detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAMs), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.

  9. Model-Based Engine Control Architecture with an Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The non-linear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.
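    As a reference point for the EKF discussion above, here is a textbook predict/update cycle in which the nonlinear state-transition and measurement models are linearized on the fly through their Jacobians. The function signature and the tiny 1D usage are assumptions for illustration; this is not the MBEC or C-MAPSS40k implementation.

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
    """One extended Kalman filter cycle: predict with the nonlinear model f,
    then update with measurement z through the nonlinear measurement model h,
    using Jacobians F_jac and H_jac for covariance propagation."""
    x_pred = f(x, u)
    F = F_jac(x, u)
    P_pred = F @ P @ F.T + Q
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Minimal 1D usage: track a slowly decaying state from noisy measurements.
f = lambda x, u: 0.95 * x + u
h = lambda x: x
F_jac = lambda x, u: np.array([[0.95]])
H_jac = lambda x: np.array([[1.0]])
x, P = np.array([0.0]), np.array([[1.0]])
for z in [1.0, 0.9, 0.85, 0.8]:
    x, P = ekf_step(x, P, np.array([0.0]), np.array([z]), f, h, F_jac, H_jac,
                    Q=np.array([[0.01]]), R=np.array([[0.1]]))
print(x)
```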

  10. Model-Based Engine Control Architecture with an Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The nonlinear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.

  11. Generalized Detectability for Discrete Event Systems

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable in solving many practical problems. PMID:21691432

  12. A resonance shift prediction based on the Boltzmann-Ehrenfest principle for cylindrical cavities with a rigid sphere.

    PubMed

    Santillan, Arturo O; Cutanda-Henríquez, Vicente

    2008-11-01

    An investigation of the resonance frequency shift produced by a rigid sphere for a plane-wave mode in a cylindrical cavity is reported in this paper. This change of the resonance frequency has previously been considered a cause of oscillational instabilities in single-mode acoustic levitation devices. It is shown that the use of the Boltzmann-Ehrenfest principle of adiabatic invariance allows the derivation of an expression for the resonance frequency shift in a simpler and more direct way than a method based on a Green's function reported in the literature. The position of the sphere can be any point along the axis of the cavity. Predictions of the resonance frequency shift obtained with the deduced equation agree quite well with numerical simulations based on the boundary element method. The results are also confirmed by experiments. The equation derived from the Boltzmann-Ehrenfest principle appears to be more general, and for large spheres it gives a better approximation than the equation previously reported.
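    The core of the adiabatic-invariance argument mentioned above can be stated compactly. The relation below is the general Boltzmann-Ehrenfest statement, not the paper's final closed-form expression for the sphere-induced shift.

```latex
% For a slowly (adiabatically) perturbed resonator mode, the ratio of mode
% energy to angular frequency is invariant:
\[
  \frac{E}{\omega} = \text{const}
  \quad\Longrightarrow\quad
  \frac{\delta\omega}{\omega} = \frac{\delta E}{E},
\]
% so the resonance shift caused by introducing a small rigid sphere follows from
% the change \(\delta E\) in time-averaged acoustic energy, evaluated with the
% unperturbed mode shape at the sphere's axial position, divided by the total
% mode energy E.
```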

  13. Partial least squares based identification of Duchenne muscular dystrophy specific genes.

    PubMed

    An, Hui-bo; Zheng, Hua-cheng; Zhang, Li; Ma, Lin; Liu, Zheng-yan

    2013-11-01

    Large-scale parallel gene expression analysis has made it much easier to investigate the underlying mechanisms of Duchenne muscular dystrophy (DMD). Previous studies typically implemented variance/regression analysis, which would be fundamentally flawed when unaccounted sources of variability in the arrays existed. Here we aim to identify genes that contribute to the pathology of DMD using partial least squares (PLS) based analysis. We carried out PLS-based analysis with two datasets downloaded from the Gene Expression Omnibus (GEO) database to identify genes contributing to the pathology of DMD. In addition to genes related to inflammation, muscle regeneration and extracellular matrix (ECM) remodeling, we found some genes with high fold change that had not been identified by previous studies, such as SRPX, GPNMB, SAT1, and LYZ. In addition, downregulation of the fatty acid metabolism pathway was found, which may be related to the progressive muscle wasting process. Our results provide a better understanding of the downstream mechanisms of DMD.
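    To make the PLS-based analysis concrete, here is a minimal sketch that fits a two-component PLS regression of group membership on a synthetic expression matrix and ranks genes by the magnitude of their PLS weights. The importance score is a simple placeholder rather than the exact statistic used in the study, and the data are random.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic expression matrix: 30 samples x 500 genes; 20 genes are shifted in cases.
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 500))
y = np.r_[np.zeros(15), np.ones(15)]          # 0 = control, 1 = DMD-like group
X[y == 1, :20] += 1.5                         # artificial "disease-associated" genes

pls = PLSRegression(n_components=2).fit(X, y)
scores = np.abs(pls.x_weights_).sum(axis=1)   # crude per-gene importance score
top = np.argsort(scores)[::-1][:10]
print("top-ranked gene indices:", top)        # mostly indices 0..19 by construction
```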

  14. Treatment of obsessive compulsive disorder in a nationwide survey of office-based physician practice

    PubMed Central

    Patel, Sapana R; Humensky, Jennifer L; Olfson, Mark; Simpson, Helen Blair; Myers, Robert; Dixon, Lisa B.

    2014-01-01

    Objective To examine the treatment of obsessive compulsive disorder (OCD) in office-based physician practices. Methods Data from the 2003–2010 National Ambulatory Medical Care Survey, a nationally representative survey of visits to office-based physicians in the United States, were used to examine treatment of adult outpatient visits with a diagnosis of OCD. Results In most visits with a diagnosis of OCD (N=316), the patient had been seen previously by the same physician (96%), usually a psychiatrist (86%), ≥6 times (56%) within the previous year. Most visits included psychotropic medications (84%), most commonly a serotonin reuptake inhibitor (SRI) (69%), and less commonly included any psychotherapy (39%). Conclusions OCD is predominantly treated by psychiatrists using SRI medications, despite the prevalence of OCD and SRI prescribing practices in primary care. Given the potential shift in OCD treatment practice patterns after health care reform, future research on the treatment of OCD in primary care is warranted. PMID:24585056

  15. Reconciling estimates of the ratio of heat and salt fluxes at the ice-ocean interface

    NASA Astrophysics Data System (ADS)

    Keitzl, T.; Mellado, J. P.; Notz, D.

    2016-12-01

    The heat exchange between floating ice and the underlying ocean is determined by the interplay of diffusive fluxes directly at the ice-ocean interface and turbulent fluxes away from it. In this study, we examine this interplay through direct numerical simulations of free convection. Our results show that an estimation of the interface flux ratio based on direct measurements of the turbulent fluxes can be difficult because the flux ratio varies with depth. As an alternative, we present a consistent evaluation of the flux ratio based on the total heat and salt fluxes across the boundary layer. This approach allows us to reconcile previous estimates of the ice-ocean interface conditions. We find that the ratio of heat and salt fluxes directly at the interface is 83-100 rather than 33 as determined by previous turbulence measurements in the outer layer. This can cause errors in the estimated ice-ablation rate from field measurements of up to 40% if they are based on the three-equation formulation.

  16. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    PubMed

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline from a set of sampled basis vectors obtained from PCA applied over a previously composed continuous-spectra learning matrix. The parametric method, however, uses an ANN to filter out the baseline. Previous studies have demonstrated that this method is one of the most effective for baseline removal. The evaluation of both methods was carried out using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. In addition, to demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
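    A hedged sketch of the non-parametric idea follows: the baseline of a measured spectrum is estimated as its projection onto the leading principal components of a baseline-only learning matrix and then subtracted. The function name pca_baseline, the synthetic polynomial baselines, and the narrow Gaussian "emission peak" are illustrative assumptions, not the benchmark database used in the study.

```python
import numpy as np

def pca_baseline(spectrum, baseline_library, n_components=2):
    """Estimate a continuous baseline as the projection of the spectrum onto the
    leading principal components of a baseline-only learning matrix."""
    mean = baseline_library.mean(axis=0)
    _, _, Vt = np.linalg.svd(baseline_library - mean, full_matrices=False)
    basis = Vt[:n_components]                 # sampled basis vectors from PCA
    coeffs = basis @ (spectrum - mean)        # least-squares coefficients
    baseline = mean + coeffs @ basis
    return spectrum - baseline, baseline

# Synthetic demonstration: smooth polynomial baselines plus one narrow peak.
x = np.linspace(0.0, 1.0, 400)
rng = np.random.default_rng(0)
library = np.array([a * x + b * x**2 for a, b in rng.random((50, 2))])
measured = 0.4 * x + 0.3 * x**2 + np.exp(-((x - 0.6) / 0.01) ** 2)
corrected, est_baseline = pca_baseline(measured, library)
print(corrected.max())                        # close to the peak height of 1
```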

  17. Refinement of Objective Motion Cueing Criteria Investigation Based on Three Flight Tasks

    NASA Technical Reports Server (NTRS)

    Zaal, Petrus M. T.; Schroeder, Jeffery A.; Chung, William W.

    2017-01-01

    The objective of this paper is to refine objective motion cueing criteria for commercial transport simulators based on pilots' performance in three flying tasks. Actuator hardware and software algorithms determine motion cues. Today, during a simulator qualification, engineers objectively evaluate only the hardware. Pilot inspectors subjectively assess the overall motion cueing system (i.e., hardware plus software); however, it is acknowledged that pinpointing any deficiencies that might arise to either hardware or software is challenging. ICAO 9625 has an Objective Motion Cueing Test (OMCT), which is now a required test in the FAA's part 60 regulations for new devices, evaluating the software and hardware together; however, it lacks accompanying fidelity criteria. Hosman has documented OMCT results for a statistical sample of eight simulators which is useful, but having validated criteria would be an improvement. In a previous experiment, we developed initial objective motion cueing criteria that this paper is trying to refine. Sinacori suggested simple criteria which are in reasonable agreement with much of the literature. These criteria often necessitate motion displacements greater than most training simulators can provide. While some of the previous work has used transport aircraft in their studies, the majority used fighter aircraft or helicopters. Those that used transport aircraft considered degraded flight characteristics. As a result, earlier criteria lean more towards being sufficient, rather than necessary, criteria for typical transport aircraft training applications. Considering the prevalence of 60-inch, six-legged hexapod training simulators, a relevant question is "what are the necessary criteria that can be used with the ICAO 9625 diagnostic?" This study adds to the literature as follows. First, it examines well-behaved transport aircraft characteristics, but in three challenging tasks. The tasks are equivalent to the ones used in our previous experiment, allowing us to directly compare the results and add to the previous data. Second, it uses the Vertical Motion Simulator (VMS), the world's largest vertical displacement simulator. This allows inclusion of relatively large motion conditions, much larger than a typical training simulator can provide. Six new motion configurations were used that explore the motion responses between the initial objective motion cueing boundaries found in a previous experiment and what current hexapod simulators typically provide. Finally, a sufficiently large pilot pool added statistical reliability to the results.

  18. Base opening in RNA and DNA duplexes: implication for RNA stability.

    PubMed

    Chen, Y Z; Mohan, V; Griffey, R H

    2000-05-01

    The energetics of a low-energy single base opening in several RNA duplex crystal structures has been calculated and compared to DNA duplexes. Base opening in RNA appears to have an overall preference towards the major groove, similar to results previously reported for B-DNA. Movement of each of the adenine, uracil, and cytosine bases into the minor groove is blocked by a high-energy barrier due to severe close contact with neighboring bases. Guanine bases are able to open towards both grooves because of the unique orientation of the base that avoids steric clash along the opening pathway. RNA bases are found to have a substantially smaller major groove opening extent than that of their B-DNA counterparts. A comparison with base opening behavior of A-DNA duplexes suggests that this difference results from helix constraint associated with A-form backbone conformation. The reduced opening extent correlates with the RNA duplex stability and is consistent with observed slower imino proton exchange rates in RNA duplexes.

  19. Exact finite elements for conduction and convection

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.; Dechaumphai, P.; Tamma, K. K.

    1981-01-01

    An approach for developing exact one-dimensional conduction-convection finite elements is presented. Exact interpolation functions are derived based on solutions to the governing differential equations by employing a nodeless parameter. Exact interpolation functions are presented for combined heat transfer in several solids of different shapes, and for combined heat transfer in a flow passage. Numerical results demonstrate that exact one-dimensional elements offer advantages over elements based on approximate interpolation functions. Previously announced in STAR as N81-31507

  20. A new algorithm for reducing the workload of experts in performing systematic reviews.

    PubMed

    Matwin, Stan; Kouznetsov, Alexandre; Inkpen, Diana; Frunza, Oana; O'Blenis, Peter

    2010-01-01

    To determine whether a factorized version of the complement naïve Bayes (FCNB) classifier can reduce the time spent by experts reviewing journal articles for inclusion in systematic reviews of drug class efficacy for disease treatment. The proposed classifier was evaluated on a test collection built from 15 systematic drug class reviews used in previous work. The FCNB classifier was constructed to classify each article as containing high-quality, drug class-specific evidence or not. Weight engineering (WE) techniques were added to reduce underestimation for Medical Subject Headings (MeSH)-based and Publication Type (PubType)-based features. Cross-validation experiments were performed to evaluate the classifier's parameters and performance. Work saved over sampling (WSS) at no less than a 95% recall was used as the main measure of performance. The minimum workload reduction for a systematic review for one topic, achieved with a FCNB/WE classifier, was 8.5%; the maximum was 62.2% and the average over the 15 topics was 33.5%. This is 15.0% higher than the average workload reduction obtained using a voting perceptron-based automated citation classification system. The FCNB/WE classifier is simple, easy to implement, and produces significantly better results in reducing the workload than previously achieved. The results support it being a useful algorithm for machine-learning-based automation of systematic reviews of drug class efficacy for disease treatment.
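
    The headline measure, work saved over sampling at no less than 95% recall (WSS@95), can be computed from a ranked list of classifier scores. The following is a generic sketch of that metric on synthetic labels and scores, not the authors' implementation.

    ```python
    # Illustrative computation of Work Saved over Sampling at 95% recall (WSS@95).
    # Generic sketch; the labels and scores below are synthetic demo data.
    import numpy as np

    def wss_at_recall(y_true, scores, recall_level=0.95):
        """WSS@R = (TN + FN) / N - (1 - R), evaluated at the score threshold
        that first reaches the requested recall on the ranked documents."""
        y_true = np.asarray(y_true)
        order = np.argsort(-np.asarray(scores))        # screen highest-scoring first
        ranked = y_true[order]
        n_pos = ranked.sum()
        cum_pos = np.cumsum(ranked)
        # Smallest number of screened documents recovering >= recall_level of positives.
        k = np.searchsorted(cum_pos, np.ceil(recall_level * n_pos)) + 1
        tn_plus_fn = len(y_true) - k                   # documents that need not be read
        return tn_plus_fn / len(y_true) - (1.0 - recall_level)

    # Example: 1000 abstracts, 50 relevant, a noisy ranking.
    rng = np.random.default_rng(0)
    y = np.zeros(1000, int); y[:50] = 1
    s = y * 2.0 + rng.normal(size=1000)
    print(round(wss_at_recall(y, s), 3))
    ```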

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yuche; Young, Stanley; Gonder, Jeff

    This study estimates the range of fuel and emissions impact of an automated-vehicle (AV) based transit system that services campus-based developments, termed an automated mobility district (AMD). The study develops a framework to quantify the fuel consumption and greenhouse gas (GHG) emission impacts of a transit system comprised of AVs, taking into consideration average vehicle fleet composition, fuel consumption/GHG emission of vehicles within specific speed bins, and the average occupancy of passenger vehicles and transit vehicles. The framework is exercised using a previous mobility analysis of a personal rapid transit (PRT) system, a system which shares many attributes with envisioned AV-based transit systems. Total fuel consumption and GHG emissions with and without an AMD are estimated, providing a range of potential system impacts on sustainability. The results of a previous case study based on a proposed implementation of PRT on the Kansas State University (KSU) campus in Manhattan, Kansas, serve as the basis to estimate personal miles traveled supplanted by an AMD at varying levels of service. The results show that an AMD has the potential to reduce total system fuel consumption and GHG emissions, but the amount is largely dependent on operating and ridership assumptions. The study points to the need to better understand ride-sharing scenarios and calls for future research on sustainability benefits of an AMD system at both vehicle and system levels.

  2. Reinforcement learning design-based adaptive tracking control with less learning parameters for nonlinear discrete-time MIMO systems.

    PubMed

    Liu, Yan-Jun; Tang, Li; Tong, Shaocheng; Chen, C L Philip; Li, Dong-Juan

    2015-01-01

    Based on a neural network (NN) approximator, an online reinforcement learning algorithm is proposed for a class of affine multiple-input multiple-output (MIMO) nonlinear discrete-time systems with unknown functions and disturbances. In the design procedure, two networks are provided: one is an action network that generates an optimal control signal, and the other is a critic network that approximates the cost function. An optimal control signal and adaptation laws can be generated based on the two NNs. In previous approaches, the weights of the critic and action networks are updated based on the gradient descent rule, and the estimates of the optimal weight vectors are directly adjusted in the design. Consequently, compared with existing results, the main contributions of this paper are: 1) only two parameters need to be adjusted, and thus the number of adaptation laws is smaller than in previous results; and 2) the updating parameters do not depend on the number of subsystems of the MIMO system, and the tuning rules are replaced by adjusting the norms of the optimal weight vectors in both the action and critic networks. It is proven that the tracking errors, the adaptation laws, and the control inputs are uniformly bounded using the Lyapunov analysis method. Simulation examples are employed to illustrate the effectiveness of the proposed algorithm.

  3. Study on Na layer response to geomagnetic activities based on Odin/OSIRIS Na density data

    NASA Astrophysics Data System (ADS)

    Tsuda, Takuo; Nakamura, Takuji; Hedin, Jonas; Gumbel, Jorg; Hosokawa, Keisuke; Ejiri, Mitsumu K.; Nishiyama, Takanori; Takahashi, Toru

    2016-07-01

    The Na layer normally extends from 80 to 110 km, and this height range corresponds to the ionospheric D and E regions. In the polar region, energetic particles precipitating from the magnetosphere can often penetrate into the E region and even into the D region. Thus, the influence of energetic particles on the Na layer is of interest with respect to the atmospheric composition changes that accompany auroral activity. There are several previous studies on this issue. For example, we recently reported an initial result on a clear relationship between the electron density increase (due to the energetic particles) and the Na density decrease, based on observational data sets obtained with a Na lidar, the EISCAT VHF radar, and optical instruments at Tromsoe, Norway on 24-25 January 2012. However, all of the previous studies were case studies based on ground-based lidar observations. In this study, we have performed, for the first time, a statistical analysis using Na density data from 2004 to 2009 obtained with the Optical Spectrograph and InfraRed Imager System (OSIRIS) onboard the Odin satellite. In the presentation, we will show the relationship between the Na density and geomagnetic activities, and its latitudinal variation. Based on these results, the Na layer response to the energetic particles will be discussed.

  4. Investigating a continuous shear strain function for depth-dependent properties of native and tissue engineering cartilage using pixel-size data.

    PubMed

    Motavalli, Mostafa; Whitney, G Adam; Dennis, James E; Mansour, Joseph M

    2013-12-01

    A previously developed novel imaging technique for determining the depth-dependent properties of cartilage in simple shear is implemented. Shear displacement is determined from images of deformed lines photobleached on a sample, and shear strain is obtained from the derivative of the displacement. We investigated the feasibility of an alternative systematic approach to numerical differentiation for computing the shear strain that is based on fitting a continuous function to the shear displacement. Three models for a continuous shear displacement function are evaluated: polynomials, cubic splines, and non-parametric locally weighted scatter plot curves. Four independent approaches are then applied to identify the best-fit model and the accuracy of the first derivative. One approach is based on the Akaike Information Criterion and the Bayesian Information Criterion. The second is based on a method developed to smooth and differentiate digitized data from human motion. The third method is based on photobleaching a predefined circular area with a specific radius. Finally, we integrate the shear strain and compare it with the total shear deflection of the sample measured experimentally. Results show that 6th and 7th order polynomials are the best models for the shear displacement and its first derivative. In addition, failure of tissue-engineered cartilage, consistent with previous results, demonstrates the qualitative value of this imaging approach. © 2013 Elsevier Ltd. All rights reserved.
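
    The polynomial route described above amounts to fitting the depth-dependent shear displacement with a low-order polynomial and differentiating it analytically. A minimal sketch with synthetic data is shown below; the displacement profile and noise level are assumptions, not the study's measurements.

    ```python
    # Sketch of the polynomial-fit approach to shear strain: fit a continuous
    # function to the depth-dependent shear displacement, then differentiate it.
    # Synthetic data stand in for the photobleached-line displacements.
    import numpy as np

    depth = np.linspace(0.0, 1.0, 60)                          # normalized tissue depth
    true_disp = 0.1 * depth ** 2 + 0.05 * np.sin(3 * depth)    # hypothetical displacement
    disp = true_disp + np.random.normal(0, 0.002, depth.size)  # pixel-level noise

    # Fit a 6th-order polynomial (one of the best-fit models reported) and
    # differentiate it analytically to obtain the shear strain profile.
    coeffs = np.polyfit(depth, disp, deg=6)
    strain = np.polyval(np.polyder(coeffs), depth)

    # Consistency check used in the abstract: integrating the strain over depth
    # should recover the total shear deflection of the sample (trapezoid rule).
    total_deflection = np.sum(0.5 * (strain[:-1] + strain[1:]) * np.diff(depth))
    print(total_deflection, disp[-1] - disp[0])
    ```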

  5. Manipulative interplay of two adozelesin molecules with d(ATTAAT)₂ achieving ligand-stacked Watson-Crick and Hoogsteen base-paired duplex adducts.

    PubMed

    Hopton, Suzanne R; Thompson, Andrew S

    2011-05-17

    Previous structural studies of the cyclopropapyrroloindole (CPI) antitumor antibiotics have shown that these ligands bind covalently edge-on into the minor groove of double-stranded DNA. Reversible covalent modification of the DNA via N3 of adenine occurs in a sequence-specific fashion. Early nuclear magnetic resonance and molecular modeling studies with both mono- and bis-alkylating ligands indicated that the ligands fit tightly within the minor groove, causing little distortion of the helix. In this study, we propose a new binding model for several of the CPI-based analogues, in which the aromatic secondary rings form π-stacked complexes within the minor groove. One of the adducts, formed with adozelesin and the d(ATTAAT)(2) sequence, also demonstrates the ability of these ligands to manipulate the DNA of the binding site, resulting in a Hoogsteen base-paired adduct. Although this type of base pairing has been previously observed with the bisfunctional CPI analogue bizelesin, this is the first time that such an observation has been made with a monoalkylating nondimeric analogue. Together, these results provide a new model for the design of CPI-based antitumor antibiotics, which also has a significant bearing on other structurally related and structurally unrelated minor groove-binding ligands. They indicate the dynamic nature of ligand-DNA interactions, demonstrating both DNA conformational flexibility and the ability of two DNA-bound ligands to interact to form stable covalent modified complexes.

  6. Analyzing the Effectiveness of the Self-organized Public-Key Management System on MANETs under the Lack of Cooperation and the Impersonation Attacks

    NASA Astrophysics Data System (ADS)

    da Silva, Eduardo; Dos Santos, Aldri Luiz; Lima, Michele N.; Albini, Luiz Carlos Pessoa

    Among the key management schemes for MANETs, the Self-Organized Public-Key Management System (PGP-Like) is the main chaining-based key management scheme. It is fully self-organized and does not require any certificate authority. Two kinds of misbehavior attacks are considered to be great threats to PGP-Like: lack of cooperation and impersonation attacks. This work quantifies the impact of such attacks on PGP-Like. Simulation results show that PGP-Like was able to maintain its effectiveness when subjected to the lack of cooperation attack, contradicting previous theoretical results. It works correctly even in the presence of more than 60% of misbehaving nodes, although the convergence time is affected with only 20% of misbehaving nodes. On the other hand, PGP-Like is completely vulnerable to the impersonation attack. Its functionality is affected with just 5% of misbehaving nodes, confirming previous theoretical results.

  7. Quantum molecular dynamics of warm dense iron and a five-phase equation of state

    NASA Astrophysics Data System (ADS)

    Sjostrom, Travis; Crockett, Scott

    2018-05-01

    Through quantum molecular dynamics (QMD), utilizing both Kohn-Sham (orbital-based) and orbital-free density functional theory, we calculate the equation of state of warm dense iron in the density range 7 -30 g/cm 3 and temperatures from 1 to 100 eV. A critical examination of the iron pseudopotential is made, from which we find a significant improvement at high pressure to the previous QMD calculations of Wang et al. [Phys. Rev. E 89, 023101 (2014), 10.1103/PhysRevE.89.023101]. Our results also significantly extend the ranges of density and temperature that were attempted in that prior work. We calculate the shock Hugoniot and find very good agreement with experimental results to pressures over 20 TPa. These results are then incorporated with previous studies to generate a five-phase equation of state for iron.

  8. Theoretical gain optimization studies in 10.6 μm CO2-N2 gasdynamic lasers. IV. Further results of parametric study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, K.P.J.; Reddy, N.M.

    1984-01-01

    Based on a method proposed by Reddy and Shanmugasundaram, similar solutions have been obtained for the steady inviscid quasi-one-dimensional nonreacting flow in the supersonic nozzle of CO2-N2-H2O and CO2-N2-He gasdynamic laser systems. Instead of using the correlations of a nonsimilar function N_S for pure N2 gas, as is done in previous publications, the N_S correlations are computed here for the actual gas mixtures used in the gasdynamic lasers. Optimum small-signal optical gain and the corresponding optimum values of the operating parameters like reservoir pressure and temperature and nozzle area ratio are computed using these correlations. The present results are compared with the previous results and the main differences are discussed.

  9. Improved inter-layer prediction for light field content coding with display scalability

    NASA Astrophysics Data System (ADS)

    Conti, Caroline; Ducla Soares, Luís.; Nunes, Paulo

    2016-09-01

    Light field imaging based on microlens arrays - also known as plenoptic, holoscopic and integral imaging - has recently risen as a feasible and prospective technology due to its ability to support functionalities not straightforwardly available in conventional imaging systems, such as post-production refocusing and depth-of-field changing. However, to gradually reach the consumer market and to provide interoperability with current 2D and 3D representations, a display scalable coding solution is essential. In this context, this paper proposes an improved display scalable light field codec comprising a three-layer hierarchical coding architecture (previously proposed by the authors) that provides interoperability with 2D (Base Layer) and 3D stereo and multiview (First Layer) representations, while the Second Layer supports the complete light field content. To further improve the compression performance, novel exemplar-based inter-layer coding tools are proposed here for the Second Layer, namely: (i) an inter-layer reference picture construction relying on an exemplar-based optimization algorithm for texture synthesis, and (ii) a direct prediction mode based on exemplar texture samples from lower layers. Experimental results show that the proposed solution performs better than the tested benchmark solutions, including the authors' previous scalable codec.

  10. Evaluation of questionnaire-based information on previous physical work loads. Stockholm MUSIC 1 Study Group. Musculoskeletal Intervention Center.

    PubMed

    Torgén, M; Winkel, J; Alfredsson, L; Kilbom, A

    1999-06-01

    The principal aim of the present study was to evaluate questionnaire-based information on past physical work loads (6-year recall). Effects of memory difficulties on reproducibility were evaluated for 82 subjects by comparing previously reported results on current work loads (test-retest procedure) with the same items recalled 6 years later. Validity was assessed by comparing self-reports in 1995, regarding work loads in 1989, with worksite measurements performed in 1989. Six-year reproducibility, calculated as weighted kappa coefficients (k(w)), varied between 0.36 and 0.86, with the highest values for proportion of the workday spent sitting and for perceived general exertion and the lowest values for trunk and neck flexion. The six-year reproducibility results were similar to previously reported test-retest results for these items; this finding indicates that memory difficulties were a minor problem. The validity of the questionnaire responses, expressed as rank correlations (r(s)) between the questionnaire responses and workplace measurements, varied between -0.16 and 0.78. The highest values were obtained for the items sitting and repetitive work, and the lowest and "unacceptable" values were for head rotation and neck flexion. Misclassification of exposure did not appear to be differential with regard to musculoskeletal symptom status, as judged by the calculated risk estimates. The validity of some of these self-administered questionnaire items appears sufficient for a crude assessment of physical work loads in the past in epidemiologic studies of the general population with predominantly low levels of exposure.
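
    Reproducibility of ordinal questionnaire items of this kind is typically summarized with a weighted kappa coefficient. The snippet below is an illustrative computation on synthetic responses (not the study's data), using scikit-learn's linearly weighted kappa.

    ```python
    # Illustrative weighted kappa for test-retest agreement on ordinal items
    # (synthetic data only; not the Stockholm MUSIC 1 responses).
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(3)
    baseline = rng.integers(1, 5, size=82)              # ordinal responses at baseline
    noise = rng.integers(-1, 2, size=82)                # some answers shift by one category
    recall_6yr = np.clip(baseline + noise, 1, 4)        # responses recalled 6 years later

    kappa_w = cohen_kappa_score(baseline, recall_6yr, weights="linear")
    print(round(kappa_w, 2))
    ```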

  11. Probing the Cosmic Gamma-Ray Burst Rate with Trigger Simulations of the Swift Burst Alert Telescope

    NASA Technical Reports Server (NTRS)

    Lien, Amy; Sakamoto, Takanori; Gehrels, Neil; Palmer, David M.; Barthelmy, Scott D.; Graziani, Carlo; Cannizzo, John K.

    2013-01-01

    The gamma-ray burst (GRB) rate is essential for revealing the connection between GRBs, supernovae and stellar evolution. Additionally, the GRB rate at high redshift provides a strong probe of star formation history in the early universe. While hundreds of GRBs are observed by Swift, it remains difficult to determine the intrinsic GRB rate due to the complex trigger algorithm of Swift. Current studies of the GRB rate usually approximate the Swift trigger algorithm by a single detection threshold. However, unlike previously flown GRB instruments, Swift has over 500 trigger criteria based on photon count rate and an additional image threshold for localization. To investigate possible systematic biases and explore the intrinsic GRB properties, we develop a program that is capable of simulating all the rate trigger criteria and mimicking the image threshold. Our simulations show that adopting the complex trigger algorithm of Swift increases the detection rate of dim bursts. As a result, our simulations suggest bursts need to be dimmer than previously expected to avoid over-producing the number of detections and to match the Swift observations. Moreover, our results indicate that these dim bursts are more likely to be high-redshift events than low-luminosity GRBs. This would imply an even higher cosmic GRB rate at large redshifts than previous expectations based on star-formation rate measurements, unless other factors, such as luminosity evolution, are taken into account. The GRB rate from our best result gives a total number of 4568 (+825/-1429) GRBs per year that are beamed toward us in the whole universe.

  12. Attribute and topology based change detection in a constellation of previously detected objects

    DOEpatents

    Paglieroni, David W.; Beer, Reginald N.

    2016-01-19

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
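
    A minimal sketch of the attribute-matching idea follows: a new detection is flagged as a change when no object in the constellation of previously detected objects matches it in location and size. The class, thresholds, and matching rule are illustrative assumptions, not taken from the patent.

    ```python
    # Hypothetical sketch of attribute-based change detection against a
    # "constellation" of previously detected objects; names and thresholds
    # are illustrative, not the patented implementation.
    from dataclasses import dataclass
    from math import hypot

    @dataclass
    class DetectedObject:
        x: float            # location along the scan (e.g., meters)
        y: float
        size: float         # characteristic size
        orientation: float  # degrees

    def is_change(new_obj, constellation, max_dist=2.0, max_size_diff=0.5):
        """Flag a new detection as a change if no previously detected object
        in the constellation matches it in location and size."""
        for prev in constellation:
            close = hypot(new_obj.x - prev.x, new_obj.y - prev.y) <= max_dist
            similar = abs(new_obj.size - prev.size) <= max_size_diff
            if close and similar:
                return False        # matches an existing object: no change
        return True                 # unmatched detection: report as change

    previous_scan = [DetectedObject(10.0, 5.0, 1.2, 30.0),
                     DetectedObject(42.0, 7.5, 0.8, 90.0)]
    latest = DetectedObject(10.3, 5.1, 1.1, 28.0)
    print(is_change(latest, previous_scan))   # False: consistent with a known object
    ```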

  13. Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation

    NASA Astrophysics Data System (ADS)

    Sleesongsom, S.; Bureerat, S.

    2018-03-01

    This paper proposes an extension of a new concept for path generation from our previous work by adding a new constraint handling technique. The proposed technique was initially designed for problems without prescribed timing by avoiding the timing constraint, while the remaining constraints are handled with a new constraint handling technique, which is a kind of penalty technique. In the comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning based optimization (SAP-TLBO) and the original TLBO. Two traditional path generation test problems are used to test the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original one.

  14. Mars Global Geologic Mapping: Amazonian Results

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Dohm, J. M.; Irwin, R.; Kolb, E. J.; Skinner, J. A., Jr.; Hare, T. M.

    2008-01-01

    We are in the second year of a five-year effort to map the geology of Mars using mainly Mars Global Surveyor, Mars Express, and Mars Odyssey imaging and altimetry datasets. Previously, we have reported on details of project management, mapping datasets (local and regional), initial and anticipated mapping approaches, and tactics of map unit delineation and description [1-2]. For example, we have seen how the multiple types and huge quantity of image data as well as more accurate and detailed altimetry data now available allow for broader and deeper geologic perspectives, based largely on improved landform perception, characterization, and analysis. Here, we describe early mapping results, which include updating of previous northern plains mapping [3], including delineation of mainly Amazonian units and regional fault mapping, as well as other advances.

  15. Attributed community mining using joint general non-negative matrix factorization with graph Laplacian

    NASA Astrophysics Data System (ADS)

    Chen, Zigang; Li, Lixiang; Peng, Haipeng; Liu, Yuhong; Yang, Yixian

    2018-04-01

    Community mining for complex social networks with link and attribute information plays an important role in meeting different application needs. In this paper, based on the general non-negative matrix factorization (GNMF) algorithm without dimension matching constraints proposed in our previous work, we propose the joint GNMF with graph Laplacian (LJGNMF) to implement community mining of complex social networks with link and attribute information according to different application needs. Theoretical derivation shows that the proposed LJGNMF is fully compatible with previous methods of integrating traditional NMF and symmetric NMF. In addition, experimental results show that the proposed LJGNMF can meet different community mining needs by adjusting its parameters, and that it outperforms traditional NMF in terms of the entropy of community vertex attributes.

  16. Multi-hierarchical movements in self-avoiding walks

    NASA Astrophysics Data System (ADS)

    Sakiyama, Tomoko; Gunji, Yukio-Pegio

    2017-07-01

    A self-avoiding walk (SAW) is a series of moves on a lattice that visit the same place only once. Several studies have reported that repellent reactions of foragers to previously visited sites induce power-law-tailed SAWs in animals. In this paper, we show that modelling the agent's multi-avoidance reactions to its trails enables it to exhibit ballistic movements, which result in heavy-tailed movements. There is no literature showing emergent ballistic movements in SAWs. While following SAWs, the agent in our model changed its reactions to marked patches (visited sites) by considering global trail patterns based on local trail patterns when the agent was surrounded by previously visited sites. As a result, we succeeded in producing ballistic walks by agents that exhibited emergent power-law-tailed movements.
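
    For context, the baseline process that the multi-avoidance rule extends is an ordinary self-avoiding walk on a lattice. The sketch below implements only that baseline (candidate moves exclude previously visited sites); the hierarchical re-evaluation of marked patches described in the abstract is not reproduced.

    ```python
    # Minimal self-avoiding walk on a square lattice (baseline only, for context).
    import random

    def self_avoiding_walk(max_steps=1000, seed=None):
        rng = random.Random(seed)
        pos = (0, 0)
        visited = {pos}
        path = [pos]
        moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
        for _ in range(max_steps):
            # Candidate moves to sites never visited before (the avoidance rule).
            options = [(pos[0] + dx, pos[1] + dy) for dx, dy in moves
                       if (pos[0] + dx, pos[1] + dy) not in visited]
            if not options:          # walker is trapped by its own trail
                break
            pos = rng.choice(options)
            visited.add(pos)
            path.append(pos)
        return path

    walk = self_avoiding_walk(max_steps=500, seed=1)
    print(len(walk) - 1, "steps before trapping or reaching the step limit")
    ```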

  17. Review of the dynamic behaviour of sports balls during normal and oblique impacts

    NASA Astrophysics Data System (ADS)

    Haron, Muhammad Adli; Jailani, Azrol; Abdullah, Nik Ahmad Faris Nik; Ismail, Rafis Suizwan; Rahim, Shayfull Zamree Abd; Ghazali, Mohd Fathullah

    2017-09-01

    This paper reviews impact experiments that study the dynamic behaviour of sports balls during oblique and normal impacts. In previous studies, the dynamic behaviour of a sports ball during oblique and normal impacts was investigated from experimental, numerical, and theoretical viewpoints. The experimental results are analysed and compared with the theories in order to understand the dynamic behaviours based on the phenomena observed. Across previous experimental studies, researchers have examined dynamic behaviours such as the coefficient of restitution, tangential coefficient, local deformation, dynamic impact force, contact time, angle of impact (inbound and rebound), spin rate of the ball, ball stiffness, and damping coefficient, all of which depend on the initial or impact velocity.

  18. Relationships between large-scale circulation patterns and carbon dioxide exchange by a deciduous forest

    NASA Astrophysics Data System (ADS)

    Zhang, Jingyong; Wu, Lingyun; Huang, Gang; Notaro, Michael

    2011-02-01

    In this study, we focus on a deciduous forest in central Massachusetts and investigate the relationships between global climate indices and CO2 exchange using eddy-covariance flux measurements from 1992 to 2007. Results suggest that large-scale circulation patterns influence the annual CO2 exchange in the forest through their effects on the local surface climate. Annual gross ecosystem exchange (GEE) in the forest is closely associated with spring El Niño-Southern Oscillation (ENSO) and Pacific Decadal Oscillation (PDO), previous fall Atlantic Multidecadal Oscillation (AMO), and previous winter East Pacific-North Pacific (EP-NP) pattern. Annual net ecosystem exchange (NEE) responds to previous fall AMO and PDO, while annual respiration (R) is impacted by previous fall ENSO and Pacific/North American Oscillation (PNA). Regressions based on these relationships are developed to simulate the annual GEE, NEE, and R. To avoid problems of multicollinearity, we compute a "Composite Index for GEE (CIGEE)" based on a linear combination of spring ENSO and PDO, fall AMO, and winter EP-NP and a "Composite Index for R (CIR)" based on a linear combination of fall ENSO and PNA. CIGEE, CIR, and fall AMO and PDO can explain 41, 27, and 40% of the variance of the annual GEE, R, and NEE, respectively. We further apply the methodology to two other northern midlatitude forests and find that interannual variabilities in NEE of the two forests are largely controlled by large-scale circulation patterns. This study suggests that global climate indices provide the potential for predicting CO2 exchange variability in the northern midlatitude forests.
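
    The composite indices are, in essence, linear combinations of standardized climate indices fitted against the annual flux record. The sketch below illustrates that construction with ordinary least squares on synthetic data; the index choices, coefficients, and data are placeholders, not the study's values.

    ```python
    # Hedged sketch of building a composite climate index as a linear combination
    # of large-scale indices fitted to an annual flux record (synthetic data only).
    import numpy as np

    rng = np.random.default_rng(42)
    years = 16                                   # 1992-2007 in the study
    # Hypothetical standardized indices: spring ENSO, spring PDO, fall AMO, winter EP-NP.
    X = rng.normal(size=(years, 4))
    gee = X @ np.array([0.6, -0.3, 0.4, 0.2]) + rng.normal(0, 0.5, years)  # synthetic GEE

    # Fit the linear combination (with intercept) by ordinary least squares.
    A = np.column_stack([np.ones(years), X])
    beta, *_ = np.linalg.lstsq(A, gee, rcond=None)
    composite_index = A @ beta                   # a CIGEE-style composite predictor

    # Fraction of interannual GEE variance explained by the composite index.
    r2 = 1 - np.sum((gee - composite_index) ** 2) / np.sum((gee - gee.mean()) ** 2)
    print(round(r2, 2))
    ```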

  19. LU60645GT and MA132843GT Catalogues of Lunar and Martian Impact Craters Developed Using a Crater Shape-based Interpolation Crater Detection Algorithm for Topography Data

    NASA Technical Reports Server (NTRS)

    Salamuniccar, Goran; Loncaric, Sven; Mazarico, Erwan Matias

    2012-01-01

    For Mars, 57,633 craters from the manually assembled catalogues and 72,668 additional craters identified using several crater detection algorithms (CDAs) have been merged into the MA130301GT catalogue. By contrast, for the Moon the most complete previous catalogue contains only 14,923 craters. Two recent missions provided higher-quality digital elevation maps (DEMs): SELENE (in 1/16° resolution) and Lunar Reconnaissance Orbiter (we used up to 1/512°). This was the main motivation for work on the new Crater Shape-based interpolation module, which improves previous CDA as follows: (1) it decreases the number of false-detections for the required number of true detections; (2) it improves detection capabilities for very small craters; and (3) it provides more accurate automated measurements of craters' properties. The results are: (1) LU60645GT, which is currently the most complete (up to D>=8 km) catalogue of Lunar craters; and (2) MA132843GT catalogue of Martian craters complete up to D>=2 km, which is the extension of the previous MA130301GT catalogue. As previously achieved for Mars, LU60645GT provides all properties that were provided by the previous Lunar catalogues, plus: (1) correlation between morphological descriptors from used catalogues; (2) correlation between manually assigned attributes and automated measurements; (3) average errors and their standard deviations for manually and automatically assigned attributes such as position coordinates, diameter, depth/diameter ratio, etc; and (4) a review of positional accuracy of used datasets. Additionally, surface dating could potentially be improved with the exhaustiveness of this new catalogue. The accompanying results are: (1) the possibility of comparing a large number of Lunar and Martian craters, of e.g. depth/diameter ratio and 2D profiles; (2) utilisation of a method for re-projection of datasets and catalogues, which is very useful for craters that are very close to poles; and (3) the extension of the previous framework for evaluation of CDAs with datasets and ground-truth catalogue for the Moon.

  20. Kernel-based least squares policy iteration for reinforcement learning.

    PubMed

    Xu, Xin; Hu, Dewen; Lu, Xicheng

    2007-07-01

    In this paper, we present a kernel-based least squares policy iteration (KLSPI) algorithm for reinforcement learning (RL) in large or continuous state spaces, which can be used to realize adaptive feedback control of uncertain dynamic systems. By using KLSPI, near-optimal control policies can be obtained without much a priori knowledge on the dynamic models of control plants. In KLSPI, Mercer kernels are used in the policy evaluation of a policy iteration process, where a new kernel-based least squares temporal-difference algorithm called KLSTD-Q is proposed for efficient policy evaluation. To keep the sparsity and improve the generalization ability of KLSTD-Q solutions, a kernel sparsification procedure based on approximate linear dependency (ALD) is performed. Compared to previous works on approximate RL methods, KLSPI makes two advances that eliminate the main difficulties of existing results. One is the better convergence and (near) optimality guarantee obtained by using the KLSTD-Q algorithm for policy evaluation with high precision. The other is the automatic feature selection using ALD-based kernel sparsification. Therefore, the KLSPI algorithm provides a general RL method with generalization performance and convergence guarantees for large-scale Markov decision problems (MDPs). Experimental results on a typical RL task for a stochastic chain problem demonstrate that KLSPI can consistently achieve better learning efficiency and policy quality than the previous least squares policy iteration (LSPI) algorithm. Furthermore, the KLSPI method was also evaluated on two nonlinear feedback control problems, including a ship heading control problem and the swing-up control of a double-link underactuated pendulum called the acrobot. Simulation results illustrate that the proposed method can optimize controller performance using little a priori information about uncertain dynamic systems. It is also demonstrated that KLSPI can be applied to online learning control by incorporating an initial controller to ensure online performance.
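
    The ALD-based sparsification step keeps a kernel dictionary element only when it cannot be well approximated, in feature space, by the elements already retained. Below is a generic implementation sketch of that test with an RBF kernel; the kernel width, threshold, and data are illustrative, not the authors' settings.

    ```python
    # Illustrative approximate linear dependency (ALD) test for kernel
    # sparsification of the kind used in KLSPI-style methods (generic sketch).
    import numpy as np

    def rbf(x, y, gamma=1.0):
        return np.exp(-gamma * np.sum((x - y) ** 2))

    def ald_dictionary(samples, nu=0.1, gamma=1.0):
        """Keep a sample only if it cannot be approximated (in feature space)
        by a linear combination of the samples already in the dictionary."""
        dictionary = [samples[0]]
        for x in samples[1:]:
            K = np.array([[rbf(a, b, gamma) for b in dictionary] for a in dictionary])
            k_vec = np.array([rbf(a, x, gamma) for a in dictionary])
            coeffs = np.linalg.solve(K + 1e-8 * np.eye(len(dictionary)), k_vec)
            delta = rbf(x, x, gamma) - k_vec @ coeffs     # ALD residual
            if delta > nu:                                # not well represented: add it
                dictionary.append(x)
        return np.array(dictionary)

    states = np.random.default_rng(0).uniform(-1, 1, size=(200, 2))
    D = ald_dictionary(states, nu=0.05, gamma=5.0)
    print(f"{len(D)} dictionary elements retained from {len(states)} samples")
    ```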

  1. Global Kalman filter approaches to estimate absolute angles of lower limb segments.

    PubMed

    Nogueira, Samuel L; Lambrecht, Stefan; Inoue, Roberto S; Bortole, Magdo; Montagnoli, Arlindo N; Moreno, Juan C; Rocon, Eduardo; Terra, Marco H; Siqueira, Adriano A G; Pons, Jose L

    2017-05-16

    In this paper we propose the use of global Kalman filters (KFs) to estimate absolute angles of lower limb segments. Standard approaches adopt KFs to improve the performance of inertial sensors based on individual link configurations. In consequence, for a multi-body system like a lower limb exoskeleton, the inertial measurements of one link (e.g., the shank) are not taken into account in other link angle estimations (e.g., the foot). Global KF approaches, on the other hand, correlate the collective contribution of all signals from lower limb segments observed in the state-space model through the filtering process. We present a novel global KF (matricial global KF) relying only on inertial sensor data, and validate both this KF and a previously presented global KF (Markov Jump Linear Systems, MJLS-based KF), which fuses data from inertial sensors and encoders from an exoskeleton. We furthermore compare both methods to the commonly used local KF. The results indicate that the global KFs performed significantly better than the local KF, with an average root mean square error (RMSE) of 0.942° for the MJLS-based KF, 1.167° for the matricial global KF, and 1.202° for the local KFs. Including the data from the exoskeleton encoders also resulted in a significant increase in performance. The results indicate that the current practice of using KFs based on local models is suboptimal. Both the presented KF based on inertial sensor data, as well as our previously presented global approach fusing inertial sensor data with data from exoskeleton encoders, were superior to local KFs. We therefore recommend the use of global KFs for gait analysis and exoskeleton control.
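
    For orientation, the commonly used local KF baseline can be sketched as a two-state filter per segment that integrates the gyroscope rate while correcting drift with an accelerometer-derived inclination. The code below is such a minimal local-filter sketch with assumed noise parameters; the paper's global and MJLS-based formulations, which couple all segments, are not reproduced here.

    ```python
    # Minimal local Kalman filter for one segment's absolute angle, fusing a
    # gyroscope rate (prediction) with an accelerometer inclination (update).
    # Noise parameters and data are illustrative assumptions.
    import numpy as np

    def local_kf(gyro_rate, accel_angle, dt=0.01, q=1e-4, r=1e-2):
        """State x = [angle, gyro bias]; returns the filtered angle estimates."""
        x = np.zeros(2)
        P = np.eye(2)
        F = np.array([[1.0, -dt], [0.0, 1.0]])       # angle integrates (rate - bias)
        B = np.array([dt, 0.0])
        H = np.array([[1.0, 0.0]])                   # accelerometer observes the angle
        Q = q * np.eye(2)
        estimates = []
        for w, z in zip(gyro_rate, accel_angle):
            # Predict with the gyroscope rate as control input.
            x = F @ x + B * w
            P = F @ P @ F.T + Q
            # Update with the accelerometer-derived inclination.
            y = z - H @ x
            S = H @ P @ H.T + r
            K = P @ H.T / S
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
            estimates.append(x[0])
        return np.array(estimates)

    # Toy usage: a sinusoidal segment angle, a biased gyro, and a noisy accelerometer.
    t = np.arange(0, 5, 0.01)
    true_angle = 10 * np.sin(t)
    gyro = np.gradient(true_angle, t) + 0.5           # rate with a constant bias
    accel = true_angle + np.random.normal(0, 1.0, t.size)
    filtered = local_kf(gyro, accel)
    ```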

  2. Vision Based Localization in Urban Environments

    NASA Technical Reports Server (NTRS)

    McHenry, Michael; Cheng, Yang; Matthies, Larry

    2005-01-01

    As part of DARPA's MARS2020 program, the Jet Propulsion Laboratory developed a vision-based system for localization in urban environments that requires neither GPS nor active sensors. System hardware consists of a pair of small FireWire cameras and a standard Pentium-based computer. The inputs to the software system consist of: 1) a crude grid-based map describing the positions of buildings, 2) an initial estimate of robot location and 3) the video streams produced by each camera. At each step during the traverse the system: captures new image data, finds image features hypothesized to lie on the outside of a building, computes the range to those features, determines an estimate of the robot's motion since the previous step and combines that data with the map to update a probabilistic representation of the robot's location. This probabilistic representation allows the system to simultaneously represent multiple possible locations. For our testing, we derived the a priori map manually using non-orthorectified overhead imagery, although this process could be automated. The software system consists of two primary components. The first is the vision system, which uses binocular stereo ranging together with a set of heuristics to identify features likely to be part of building exteriors and to compute an estimate of the robot's motion since the previous step. The resulting visual features and the associated range measurements are then fed to the second primary software component, a particle-filter based localization system. This system uses the map and the most recent results from the vision system to update the estimate of the robot's location. This report summarizes the design of both the hardware and software and includes the results of applying the system to the global localization of a robot over an approximately half-kilometer traverse across JPL's Pasadena campus.
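
    A hedged sketch of the particle-filter update described above follows: particles are propagated with the vision-based motion estimate, weighted by how well ranges predicted from the map match the stereo-measured ranges, and then resampled. The map model, likelihood, and noise levels are simplified placeholders, not JPL's implementation.

    ```python
    # Generic particle-filter localization update (illustrative placeholders only).
    import numpy as np

    rng = np.random.default_rng(0)

    def pf_update(particles, weights, motion, ranges_pred_fn, measured_ranges, sigma=1.0):
        # 1. Motion update: apply the estimated (dx, dy, dtheta) with added noise.
        noise = rng.normal(0, [0.1, 0.1, 0.02], size=particles.shape)
        particles = particles + np.asarray(motion) + noise
        # 2. Measurement update: weight by agreement between ranges predicted from
        #    the map at each particle and the stereo-measured building ranges.
        for i, p in enumerate(particles):
            predicted = ranges_pred_fn(p)
            err = np.asarray(measured_ranges) - predicted
            weights[i] *= np.exp(-0.5 * np.dot(err, err) / sigma ** 2)
        weights /= weights.sum()
        # 3. Resample particles in proportion to their weights.
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], np.full(len(particles), 1.0 / len(particles))

    # Toy usage with a fake map that predicts ranges to two building walls.
    particles = rng.normal([0, 0, 0], [1, 1, 0.1], size=(500, 3))
    weights = np.full(500, 1 / 500)
    particles, weights = pf_update(
        particles, weights,
        motion=(0.5, 0.0, 0.01),
        ranges_pred_fn=lambda p: np.array([10.0 - p[0], 8.0 - p[1]]),
        measured_ranges=[9.5, 8.0])
    ```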

  3. Complexity in Acid-Base Titrations: Multimer Formation Between Phosphoric Acids and Imines.

    PubMed

    Malm, Christian; Kim, Heejae; Wagner, Manfred; Hunger, Johannes

    2017-08-10

    Solutions of Brønsted acids with bases in aprotic solvents are not only common model systems to study the fundamentals of proton transfer pathways but are also highly relevant to Brønsted acid catalysis. Despite their importance, the light nature of the proton makes characterization of acid-base aggregates challenging. Here, we track such acid-base interactions over a broad range of relative compositions between diphenyl phosphoric acid and the base quinaldine in dichloromethane, by using a combination of dielectric relaxation and NMR spectroscopy. In contrast to what one would expect for an acid-base titration, we find strong deviations from quantitative proton transfer from the acid to the base. Even for an excess of the base, multimers consisting of one base and at least two acid molecules are formed, in addition to the occurrence of proton transfer from the acid to the base and the simultaneous formation of ion pairs. For equimolar mixtures such multimers constitute about one third of all intermolecular aggregates. Quantitative analysis of our results shows that the acid-base association constant is only around six times larger than that for the acid binding to an acid-base dimer, that is, to an already protonated base. Our findings have implications for the interpretation of previous studies of reactive intermediates in organocatalysis and provide a rationale for previously observed nonlinear effects in phosphoric acid catalysis. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  4. Mice plan decision strategies based on previously learned time intervals, locations, and probabilities

    PubMed Central

    Tosun, Tuğçe; Gür, Ezgi; Balcı, Fuat

    2016-01-01

    Animals can shape their timed behaviors based on experienced probabilistic relations in a nearly optimal fashion. On the other hand, it is not clear if they adopt these timed decisions by making computations based on previously learnt task parameters (time intervals, locations, and probabilities) or if they gradually develop their decisions based on trial and error. To address this question, we tested mice in the timed-switching task, which required them to anticipate when (after a short or long delay) and at which of the two delay locations a reward would be presented. The probability of short trials differed between test groups in two experiments. Critically, we first trained mice on relevant task parameters by signaling the active trial with a discriminative stimulus and delivered the corresponding reward after the associated delay without any response requirement (without inducing switching behavior). During the test phase, both options were presented simultaneously to characterize the emergence and temporal characteristics of the switching behavior. Mice exhibited timed-switching behavior starting from the first few test trials, and their performance remained stable throughout testing in the majority of the conditions. Furthermore, as the probability of the short trial increased, mice waited longer before switching from the short to long location (experiment 1). These behavioral adjustments were in directions predicted by reward maximization. These results suggest that rather than gradually adjusting their time-dependent choice behavior, mice abruptly adopted temporal decision strategies by directly integrating their previous knowledge of task parameters into their timed behavior, supporting the model-based representational account of temporal risk assessment. PMID:26733674

  5. Wavelet-based clustering of resting state MRI data in the rat.

    PubMed

    Medda, Alessio; Hoffmann, Lukas; Magnuson, Matthew; Thompson, Garth; Pan, Wen-Ju; Keilholz, Shella

    2016-01-01

    While functional connectivity has typically been calculated over the entire length of the scan (5-10min), interest has been growing in dynamic analysis methods that can detect changes in connectivity on the order of cognitive processes (seconds). Previous work with sliding window correlation has shown that changes in functional connectivity can be observed on these time scales in the awake human and in anesthetized animals. This exciting advance creates a need for improved approaches to characterize dynamic functional networks in the brain. Previous studies were performed using sliding window analysis on regions of interest defined based on anatomy or obtained from traditional steady-state analysis methods. The parcellation of the brain may therefore be suboptimal, and the characteristics of the time-varying connectivity between regions are dependent upon the length of the sliding window chosen. This manuscript describes an algorithm based on wavelet decomposition that allows data-driven clustering of voxels into functional regions based on temporal and spectral properties. Previous work has shown that different networks have characteristic frequency fingerprints, and the use of wavelets ensures that both the frequency and the timing of the BOLD fluctuations are considered during the clustering process. The method was applied to resting state data acquired from anesthetized rats, and the resulting clusters agreed well with known anatomical areas. Clusters were highly reproducible across subjects. Wavelet cross-correlation values between clusters from a single scan were significantly higher than the values from randomly matched clusters that shared no temporal information, indicating that wavelet-based analysis is sensitive to the relationship between areas. Copyright © 2015 Elsevier Inc. All rights reserved.
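
    A minimal sketch of the wavelet-based clustering idea follows: each voxel time series is decomposed with a discrete wavelet transform, a feature vector of per-scale energies is formed, and voxels are clustered on those features. The wavelet, decomposition level, and clustering choices (PyWavelets, k-means) are illustrative assumptions, not the paper's pipeline.

    ```python
    # Hedged sketch of wavelet-based clustering of voxel time series (synthetic data).
    import numpy as np
    import pywt
    from sklearn.cluster import KMeans

    def wavelet_features(timeseries, wavelet="db4", level=4):
        """Per-voxel feature vector of normalized wavelet energies per scale."""
        feats = []
        for ts in timeseries:
            coeffs = pywt.wavedec(ts, wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            feats.append(energies / energies.sum())
        return np.array(feats)

    # Synthetic stand-in: 200 "voxels", two groups fluctuating at different frequencies.
    t = np.linspace(0, 300, 600)
    slow = np.sin(2 * np.pi * 0.02 * t)
    fast = np.sin(2 * np.pi * 0.10 * t)
    voxels = np.vstack([slow + 0.5 * np.random.randn(100, t.size),
                        fast + 0.5 * np.random.randn(100, t.size)])
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(wavelet_features(voxels))
    ```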

  6. Three-dimensional marginal separation

    NASA Technical Reports Server (NTRS)

    Duck, Peter W.

    1988-01-01

    The three dimensional marginal separation of a boundary layer along a line of symmetry is considered. The key equation governing the displacement function is derived, and found to be a nonlinear integral equation in two space variables. This is solved iteratively using a pseudo-spectral approach, based partly in double Fourier space, and partly in physical space. Qualitatively, the results are similar to previously reported two dimensional results (which are also computed to test the accuracy of the numerical scheme); however quantitatively the three dimensional results are much different.

  7. A Parametric Approach to Numerical Modeling of TKR Contact Forces

    PubMed Central

    Lundberg, Hannah J.; Foucher, Kharma C.; Wimmer, Markus A.

    2009-01-01

    In vivo knee contact forces are difficult to determine using numerical methods because there are more unknown forces than equilibrium equations available. We developed parametric methods for computing contact forces across the knee joint during the stance phase of level walking. Three-dimensional contact forces were calculated at two points of contact between the tibia and the femur, one on the lateral aspect of the tibial plateau, and one on the medial side. Muscle activations were parametrically varied over their physiologic range resulting in a solution space of contact forces. The obtained solution space was reasonably small and the resulting force pattern compared well to a previous model from the literature for kinematics and external kinetics from the same patient. Peak forces of the parametric model and the previous model were similar for the first half of the stance phase, but differed for the second half. The previous model did not take into account the transverse external moment about the knee and could not calculate muscle activation levels. Ultimately, the parametric model will result in more accurate contact force inputs for total knee simulators, as current inputs are not generally based on kinematics and kinetics inputs from TKR patients. PMID:19155015

  8. Design or "Design"--Envisioning a Future Design Education

    ERIC Educational Resources Information Center

    Sless, David

    2012-01-01

    Challenging the common grand vision of Design, this article considers "design" as a humble re-forming process based on evidence to substantiate its results. The designer is likened to a tinker who respects previous iterations of a design and seeks to retain what is useful while improving its performance. A design process is offered,…

  9. Information Processing Versus Social Cognitive Mediators of Weight Loss in a Podcast-Delivered Health Intervention

    ERIC Educational Resources Information Center

    Ko, Linda K.; Turner-McGrievy, Gabrielle M.; Campbell, Marci K.

    2014-01-01

    Podcasting is an emerging technology, and previous interventions have shown promising results using theory-based podcast for weight loss among overweight and obese individuals. This study investigated whether constructs of social cognitive theory and information processing theories (IPTs) mediate the effect of a podcast intervention on weight loss…

  10. Successful Strategies Used with ADHD Students: Is an ADHD Classroom a Possibility?

    ERIC Educational Resources Information Center

    Furtick, Kari C.

    2010-01-01

    Behaviors exhibited by children diagnosed with Attention Deficit Hyperactivity Disorder tend to be disruptive and straining on all individuals in the classroom. As a result, several research-based best practices have been developed through previous studies in order to facilitate learning in students with ADHD. A guiding principle in these…

  11. Three Insights About Change

    NASA Technical Reports Server (NTRS)

    Little, Terry

    2002-01-01

    Managers need to develop credibility, and need to base it upon new managerial accomplishments rather than previous ones. New and exciting jobs in management are challenging, but can lead to personal growth. Personnel who are afraid of potential negative consequences resulting from administrative changes are a hindrance to projects. An approval-seeking management style almost always fails.

  12. SEASONAL NH 3 EMISSIONS FOR THE CONTINENTAL UNITED STATES: INVERSE MODEL ESTIMATION AND EVALUATION

    EPA Science Inventory

    An inverse modeling study has been conducted here to evaluate a prior estimate of seasonal ammonia (NH3) emissions. The prior estimates were based on a previous inverse modeling study and two other bottom-up inventory studies. The results suggest that the prior estim...

  13. Cross reactive cellular immune responses in chickens previously exposed to low pathogenic avian influenza

    USDA-ARS?s Scientific Manuscript database

    Avian influenza (AI) infection in poultry can result in high morbidity and mortality, and negatively affect international trade. Because most AI vaccines used for poultry are inactivated, our knowledge of immunity against AI is based largely on humoral immune responses. In fact, little is known abo...

  14. The search for red AGN with 2MASS

    NASA Technical Reports Server (NTRS)

    Cutri, R. M.; Nelson, B. O.; Kirkpatrick, J. D.; Huchra, J. P.; Smith, P. S.

    2001-01-01

    We present the results of a simple, highly efficient 2MASS color-based survey that has already discovered 140 previously unknown red AGN and QSOs. These objects are near-infrared-bright and relatively nearby; the median redshift of the sample is z=0.25, and all but two have z<0.7.

  15. Analysis of rosen piezoelectric transformers with a varying cross-section.

    PubMed

    Xue, H; Yang, J; Hu, Y

    2008-07-01

    We study the effects of a varying cross-section on the performance of Rosen piezoelectric transformers operating with length extensional modes of rods. A theoretical analysis is performed using an extended version of a one-dimensional model developed in a previous paper. Numerical results based on the theoretical analysis are presented.

  16. Water Use and Growth of Two Woody Taxa Produced in Varying Indigenous Douglas-Fir Based Soilless Substrates

    USDA-ARS?s Scientific Manuscript database

    In the Pacific Northwest (PNW) container crops are grown in soilless substrates that contain different percentages of Douglas-fir bark (DFB), sphagnum peat moss and pumice. Previous research conducted by Gabriel et al. found varying combinations and ratios of these components result in differing phy...

  17. How Learners Use Automated Computer-Based Feedback to Produce Revised Drafts of Essays

    ERIC Educational Resources Information Center

    Laing, Jonny; El Ebyary, Khaled; Windeatt, Scott

    2012-01-01

    Our previous results suggest that the use of "Criterion", an automatic writing evaluation (AWE) system, is particularly successful in encouraging learners to produce amended drafts of their essays, and that those amended drafts generally represent an improvement on the original submission. Our analysis of the submitted essays and the…

  18. Violence Directed against Teachers: Results from a National Survey

    ERIC Educational Resources Information Center

    Mcmahon, Susan D.; Martinez, Andrew; Espelage, Dorothy; Rose, Chad; Reddy, Linda A.; Lane, Kathleen; Anderman, Eric M.; Reynolds, Cecil R.; Jones, Abraham; Brown, Veda

    2014-01-01

    Teachers in U.S. schools report high rates of victimization, yet previous studies focus on select types of victimization and student perpetrators, which may underestimate the extent of the problem. This national study was based on work conducted by the American Psychological Association Classroom Violence Directed Against Teachers Task Force and…

  19. Identifying Students with Dyslexia in Higher Education

    ERIC Educational Resources Information Center

    Tops, Wim; Callens, Maaike; Lammertyn, Jan; Van Hees, Valerie; Brysbaert, Marc

    2012-01-01

    An increasing number of students with dyslexia enter higher education. As a result, there is a growing need for standardized diagnosis. Previous research has suggested that a small number of tests may suffice to reliably assess students with dyslexia, but these studies were based on post hoc discriminant analysis, which tends to overestimate the…

  20. Developments in mercuric iodide gamma ray imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patt, B.E.; Beyerle, A.G.; Dolin, R.C.

    A mercuric iodide gamma-ray imaging array and camera system previously described has been characterized for spatial and energy resolution. Based on this data, a new camera is being developed to more fully exploit the potential of the array. Characterization results and design criteria for the new camera will be presented. 2 refs., 7 figs.

  1. 40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... analysis based on accepted chemical engineering principles, measurable process parameters, or physical or... minute, at a temperature of 20 °C. (g) Engineering assessment may be used to determine the TOC emission...) Engineering assessment includes, but is not limited to, the following: (i) Previous test results provided the...

  2. 40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... analysis based on accepted chemical engineering principles, measurable process parameters, or physical or... minute, at a temperature of 20 °C. (g) Engineering assessment may be used to determine the TOC emission...) Engineering assessment includes, but is not limited to, the following: (i) Previous test results provided the...

  3. 40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... analysis based on accepted chemical engineering principles, measurable process parameters, or physical or... minute, at a temperature of 20 °C. (g) Engineering assessment may be used to determine the TOC emission...) Engineering assessment includes, but is not limited to, the following: (i) Previous test results provided the...

  4. Reasoning Maps: A Generally Applicable Method for Characterizing Hypothesis-Testing Behaviour. Research Report

    ERIC Educational Resources Information Center

    White, Brian

    2004-01-01

    This paper presents a generally applicable method for characterizing subjects' hypothesis-testing behaviour based on a synthesis that extends on previous work. Beginning with a transcript of subjects' speech and videotape of their actions, a Reasoning Map is created that depicts the flow of their hypotheses, tests, predictions, results, and…

  5. Optimal Stratification of Item Pools in a-Stratified Computerized Adaptive Testing.

    ERIC Educational Resources Information Center

    Chang, Hua-Hua; van der Linden, Wim J.

    2003-01-01

    Developed a method based on 0-1 linear programming to stratify an item pool optimally for use in alpha-stratified adaptive testing. Applied the method to a previous item pool from the computerized adaptive test of the Graduate Record Examinations. Results show the new method performs well in practical situations. (SLD)

  6. Correlation of Self-Assessment with Attendance in an Evidence-Based Medicine Course

    ERIC Educational Resources Information Center

    Ramirez, Beatriz U.

    2015-01-01

    In previous studies, correlations between attendance and grades in lectures have given variable results and, when statistically significant, the correlation has been weak. In some studies, a sex effect has been reported. Lectures are a teacher-centered learning activity. Therefore, it appeared interesting to evaluate if a stronger correlation…

  7. Determinants of Literacy Proficiency: A Lifelong-Lifewide Learning Perspective

    ERIC Educational Resources Information Center

    Desjardins, Richard

    2003-01-01

    The aim of this article is to investigate the predictive capacity of major determinants of literacy proficiency that are associated with a variety of contexts including school, home, work, community and leisure. An identical structural model based on previous research is fitted to data for 18 countries. The results show that even after accounting…

  8. The Effects of Arts Integration on Long-Term Retention of Academic Content

    ERIC Educational Resources Information Center

    Hardiman, Mariale; Rinne, Luke; Yarmolinskaya, Julia

    2014-01-01

    Previous correlational and quasi-experimental studies of arts integration--the pedagogical practice of "teaching through the arts"--suggest its value for enhancing cognitive, academic, and social skills. This study reports the results of a small, preliminary classroom-based experiment that tested effects of arts integration on long-term…

  9. Identifying Predictors of Achievement in the Newly Defined Information Literacy: A Neural Network Analysis

    ERIC Educational Resources Information Center

    Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.

    2009-01-01

    Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…

  10. Estimating Critical Values for Strength of Alignment among Curriculum, Assessments, and Instruction

    ERIC Educational Resources Information Center

    Fulmer, Gavin W.

    2010-01-01

    School accountability decisions based on standardized tests hinge on the degree of alignment of the test with a state's standards. Yet no established criteria were available for judging strength of alignment. Previous studies of alignment among tests, standards, and teachers' instruction have yielded mixed results that are difficult to interpret…

  11. Estimating Critical Values for Strength of Alignment among Curriculum, Assessments, and Instruction

    ERIC Educational Resources Information Center

    Fulmer, Gavin W.

    2011-01-01

    School accountability decisions based on standardized tests hinge on the degree of alignment of the test with the state's standards documents. Yet, there exist no established criteria for judging strength of alignment. Previous measures of alignment among tests, standards, and teachers' instruction have yielded mixed results that are difficult to…

  12. The study of human venous system dynamics using hybrid computer modeling

    NASA Technical Reports Server (NTRS)

    Snyder, M. F.; Rideout, V. C.

    1972-01-01

    A computer-based model of the cardiovascular system was created emphasizing effects on the systemic venous system. Certain physiological aspects were emphasized: effects of heart rate, tilting, changes in respiration, and leg muscular contractions. The results from the model showed close correlation with findings previously reported in the literature.

  13. Migration, Remittances and Educational Outcomes: The Case of Haiti

    ERIC Educational Resources Information Center

    Bredl, Sebastian

    2011-01-01

    This paper empirically investigates how migration and the receipt of remittances affect educational outcomes in Haiti. Based on a theoretical approach it tries to disentangle the effects of both phenomena that have mostly been jointly modeled in previous literature. The results suggest that remittances play an important role for poor households in…

  14. A high-oil castor cultivar developed through recurrent selection

    USDA-ARS?s Scientific Manuscript database

    The purpose of this paper is to present and interpret the data obtained from field-grown castor seeds. Under greenhouse conditions, a previous recurrent selection for high-oil castor seeds from a base population resulted in a new population with an increased mean oil content from 50.33 to 54.47% dry...

  15. High-Rate Field Demonstration of Large-Alphabet Quantum Key Distribution

    DTIC Science & Technology

    2016-12-13

    Figure 4 caption (excerpt): Comparison of our P&M DO-QKD results to previously published QKD system records, chosen to represent either secure… record for continuous-variable QKD (33). BBM92: secure throughput record for two-dimensional entanglement-based QKD (34). COW: distance record for QKD (19).

  16. High-Rate Field Demonstration of Large-Alphabet Quantum Key Distribution

    DTIC Science & Technology

    2016-10-12

    FIG. 4 caption (excerpt): Comparison of our P&M DO-QKD results to previously published QKD system records, chosen to represent either… distance record for continuous-variable QKD [29]. BBM92: secure throughput record for two-dimensional entanglement-based QKD [30]. COW: distance record for…

  17. High-Rate Field Demonstration of Large-Alphabet Quantum Key Distribution

    DTIC Science & Technology

    2016-12-08

    Figure 4 caption (excerpt): Comparison of our P&M DO-QKD results to previously published QKD system records, chosen to represent either secure… record for continuous-variable QKD (33). BBM92: secure throughput record for two-dimensional entanglement-based QKD (34). COW: distance record for QKD (19).

  18. An Online Learning Community for Beginning In-Service Teachers

    ERIC Educational Resources Information Center

    Taranto, Gregory A.

    2011-01-01

    The purpose of this study was to design, implement, and evaluate the effectiveness of incorporating an online learning community as part of a comprehensive new teacher induction program. First, the researcher created an online learning community model based on the results of a comprehensive review of the literature and on the previous year's…

  19. Timber growth, mortality, and change

    Treesearch

    Roger C. Conner; Michael T. Thompson

    2009-01-01

    The previous section discussed trends in timber volume. Changes in volume often result from land-use change; that is, land entering or removed from the timber base. On those acres remaining forested, tree growth and mortality are the primary factors for volume change. Annual rates of growth and mortality often differ by species group, ownership, and geographic region....

  20. Conditional Covariance-Based Subtest Selection for DIMTEST

    ERIC Educational Resources Information Center

    Froelich, Amy G.; Habing, Brian

    2008-01-01

    DIMTEST is a nonparametric hypothesis-testing procedure designed to test the assumptions of a unidimensional and locally independent item response theory model. Several previous Monte Carlo studies have found that using linear factor analysis to select the assessment subtest for DIMTEST results in a moderate to severe loss of power when the exam…

  1. An Information Retrieval Approach for Robust Prediction of Road Surface States.

    PubMed

    Park, Jae-Hyung; Kim, Kwanho

    2017-01-28

    Recently, due to the increasing importance of reducing severe vehicle accidents on roads (especially on highways), the automatic identification of road surface conditions, and the provisioning of such information to drivers in advance, have been gaining significant momentum as a proactive solution to decrease the number of vehicle accidents. In this paper, we first propose an information retrieval approach that aims to identify road surface states by combining conventional machine-learning techniques and moving average methods. Specifically, when signal information is received from a radar system, our approach attempts to estimate the current state of the road surface based on similar instances observed previously, using a given similarity function. Next, the estimated state is calibrated by using the recently estimated states to yield both effective and robust prediction results. To validate the performance of the proposed approach, we established a real-world experimental setting on a section of actual highway in South Korea and conducted a comparison with the conventional approaches in terms of accuracy. The experimental results show that the proposed approach successfully outperforms the previously developed methods.
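
    As an illustration of the retrieval-plus-calibration idea described in this abstract, the following is a minimal Python sketch; the class name, the Euclidean similarity function, and the majority-vote calibration window are assumptions made for illustration, not the authors' implementation.

    ```python
    from collections import Counter, deque
    import numpy as np

    class RetrievalRoadStateEstimator:
        """Retrieve similar past radar observations, then smooth the per-step estimates."""

        def __init__(self, k=3, window=5):
            self.k = k                           # number of similar past instances retrieved
            self.history = deque(maxlen=window)  # recent raw estimates used for calibration
            self.X, self.y = None, None

        def fit(self, X, y):
            self.X = np.asarray(X, dtype=float)  # stored radar feature vectors
            self.y = list(y)                     # stored surface-state labels (e.g. dry/wet/icy)
            return self

        def _estimate(self, x):
            # similarity function: negative Euclidean distance (an assumed choice)
            d = np.linalg.norm(self.X - x, axis=1)
            nearest = [self.y[i] for i in np.argsort(d)[: self.k]]
            return Counter(nearest).most_common(1)[0][0]

        def predict(self, x):
            raw = self._estimate(np.asarray(x, dtype=float))
            self.history.append(raw)
            # calibration: majority vote over recent raw estimates (a moving-average analogue)
            return Counter(self.history).most_common(1)[0][0]

    # toy usage
    est = RetrievalRoadStateEstimator().fit(
        X=[[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]],
        y=["dry", "dry", "icy", "icy"],
    )
    print(est.predict([0.85, 0.85]))  # -> "icy"
    ```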

  2. Force production of a hovering hummingbird

    NASA Astrophysics Data System (ADS)

    Luo, Haoxiang; Song, Jialei; Hedrick, Tyson

    2013-11-01

    A three-dimensional numerical study is performed for a hovering Ruby-throated hummingbird (Archilochus colubris) based on an immersed-boundary method. To accurately model the unsteady aerodynamics, realistic 3D wing kinematics is reconstructed from high-speed images of the wing motion filmed at 1000 frames per second, resulting in 25 frames per flapping cycle. A high-resolution grid is employed to resolve the vortices shed from the wing. The results are validated by comparing the spanwise vorticity and circulation with the previous PIV data and also by calculating the average lift. The force production shows significant asymmetry, with the downstroke producing lift 2.6 times as high as the upstroke, despite a nearly horizontal stroke plane. The total power consumption is around 55 W/kg, which is twice the previous estimate. In this presentation, we will discuss several mechanisms that lead to the force asymmetry, including the drag-based lift and the leading-edge vortex behavior. We will also address the role of wing-wake interaction, which appears to be different for the hummingbird than for some insects such as fruit flies. Supported by NSF (No. CBET-0954381).

  3. Interface tension in the improved Blume-Capel model

    NASA Astrophysics Data System (ADS)

    Hasenbusch, Martin

    2017-09-01

    We study interfaces with periodic boundary conditions in the low-temperature phase of the improved Blume-Capel model on the simple cubic lattice. The interface free energy is defined by the difference of the free energy of a system with antiperiodic boundary conditions in one of the directions and that of a system with periodic boundary conditions in all directions. It is obtained by integration of differences of the corresponding internal energies over the inverse temperature. These differences can be computed efficiently by using a variance reduced estimator that is based on the exchange cluster algorithm. The interface tension is obtained from the interface free energy by using predictions based on effective interface models. By using our numerical results for the interface tension σ and the correlation length ξ obtained in previous work, we determine the universal amplitude ratios R_{2nd,+} = σ_0 f_{2nd,+}² = 0.3863(6), R_{2nd,-} = σ_0 f_{2nd,-}² = 0.1028(1), and R_{exp,-} = σ_0 f_{exp,-}² = 0.1077(3). Our results are consistent with those obtained previously for the three-dimensional Ising model, confirming the universality hypothesis.

  4. Reversible stress softening of collagen based networks from the jumbo squid mantle (Dosidicus gigas).

    PubMed

    Torres, F G; Troncoso, O P; Rivas, E R; Gomez, C G; Lopez, D

    2014-04-01

    Dosidicus gigas is the largest and one of the most abundant jumbo squids in the eastern Pacific Ocean. In this paper we have studied the muscle of the mantle of D. gigas (DGM). Morphological, thermal and rheological properties were assessed by means of atomic force microscopy, scanning electron microscopy, energy-dispersive X-ray spectroscopy, differential scanning calorimetry, thermogravimetry and oscillatory rheometry. This study allowed us to assess the morphological and rheological properties of a collagen based network occurring in nature. The results showed that the DGM network displays a nonlinear effect called reversible stress softening (RSS) that has been previously described for other types of biological structures such as naturally occurring cellulose networks and actin networks. We propose that the RSS could play a key role on the way jumbo squids withstand hydrostatic pressure. The results presented here confirm that this phenomenon occurs in a wider number of materials than previously thought, all of them exhibiting different size scales as well as physical conformation. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Phenomenological and molecular-level Petri net modeling and simulation of long-term potentiation.

    PubMed

    Hardy, S; Robillard, P N

    2005-10-01

    Petri net-based modeling methods have been used in many research projects to represent biological systems. Among these, the hybrid functional Petri net (HFPN) was developed especially for biological modeling in order to provide biologists with a more intuitive Petri net-based method. In the literature, HFPNs are used to represent kinetic models at the molecular level. We present two models of long-term potentiation previously represented by differential equations which we have transformed into HFPN models: a phenomenological synapse model and a molecular-level model of the CaMKII regulation pathway. Through simulation, we obtained results similar to those of previous studies using these models. Our results open the way to a new type of modeling for systems biology where HFPNs are used to combine different levels of abstraction within one model. This approach can be useful in fully modeling a system at the molecular level when kinetic data is missing or when a full study of a system at the molecular level is not within the scope of the research.

  6. An Information Retrieval Approach for Robust Prediction of Road Surface States

    PubMed Central

    Park, Jae-Hyung; Kim, Kwanho

    2017-01-01

    Recently, due to the increasing importance of reducing severe vehicle accidents on roads (especially on highways), the automatic identification of road surface conditions, and the provisioning of such information to drivers in advance, have been gaining significant momentum as a proactive solution to decrease the number of vehicle accidents. In this paper, we first propose an information retrieval approach that aims to identify road surface states by combining conventional machine-learning techniques and moving average methods. Specifically, when signal information is received from a radar system, our approach attempts to estimate the current state of the road surface based on similar instances observed previously, using a given similarity function. Next, the estimated state is calibrated by using the recently estimated states to yield both effective and robust prediction results. To validate the performance of the proposed approach, we established a real-world experimental setting on a section of actual highway in South Korea and conducted a comparison with the conventional approaches in terms of accuracy. The experimental results show that the proposed approach successfully outperforms the previously developed methods. PMID:28134859

  7. Evaluation of Alternative Altitude Scaling Methods for Thermal Ice Protection System in NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Addy, Harold E. Jr.; Broeren, Andy P.; Orchard, David M.

    2017-01-01

    A test was conducted at the NASA Icing Research Tunnel to evaluate altitude scaling methods for thermal ice protection systems. Two new scaling methods based on the Weber number were compared against a method based on the Reynolds number. The results generally agreed with the previous set of tests conducted in the NRCC Altitude Icing Wind Tunnel, where the three methods of scaling were also tested and compared along with reference (altitude) icing conditions. In those tests, the Weber number-based scaling methods yielded results much closer to those observed at the reference icing conditions than the Reynolds number-based scaling did. The test in the NASA IRT used a much larger, asymmetric airfoil with an ice protection system that more closely resembled designs used in commercial aircraft. Following the trends observed during the AIWT tests, the Weber number-based scaling methods resulted in smaller runback ice than the Reynolds number-based scaling, and the ice formed farther upstream. The results show that the new Weber number-based scaling methods, particularly the Weber number with water loading scaling, continue to show promise for ice protection system development and evaluation in atmospheric icing tunnels.

  8. Finite-time synchronization of fractional-order memristor-based neural networks with time delays.

    PubMed

    Velmurugan, G; Rakkiyappan, R; Cao, Jinde

    2016-01-01

    In this paper, we consider the problem of finite-time synchronization of a class of fractional-order memristor-based neural networks (FMNNs) with time delays. By using the Laplace transform, the generalized Gronwall's inequality, Mittag-Leffler functions and a linear feedback control technique, some new sufficient conditions are derived to ensure the finite-time synchronization of the addressed FMNNs with fractional order α satisfying 1 < α < 2 or 0 < α < 1. Results from the theory of fractional-order differential equations with discontinuous right-hand sides are used to investigate the problem under consideration. The derived results extend some previous related work on memristor-based neural networks. Finally, three numerical examples are presented to show the effectiveness of our proposed theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Trace elements in early phase type 2 diabetes mellitus-A population-based study. The HUNT study in Norway.

    PubMed

    Hansen, Ailin Falkmo; Simić, Anica; Åsvold, Bjørn Olav; Romundstad, Pål Richard; Midthjell, Kristian; Syversen, Tore; Flaten, Trond Peder

    2017-03-01

    Differences in trace element levels between individuals with type 2 diabetes and controls have been reported in several studies in various body fluids and tissues, but results have been inconsistent. In order to examine trace element levels in the early phase of type 2 diabetes, we investigated the association between whole blood levels of 26 trace elements and the prevalence of previously undiagnosed, screening-detected type 2 diabetes. The study was conducted as a case-control study nested within the third survey of the population-based Nord-Trøndelag Health Study (HUNT3 Survey). Among participants without previously known diabetes, 128 cases of type 2 diabetes were diagnosed in people with a high diabetes risk score (FINDRISC≥15), and frequency-matched for age and sex with 755 controls. Blood samples were analyzed by high resolution inductively coupled plasma mass spectrometry. Associations between trace element levels and the prevalence of previously undiagnosed type 2 diabetes were evaluated with multivariable conditional logistic regression controlling for age, sex, body mass index, waist-to-hip ratio, education, income, smoking and family history of diabetes. The prevalence of previously undiagnosed type 2 diabetes increased across tertiles/quartiles for cadmium, chromium, iron, nickel, silver and zinc, and decreased with increasing quartiles of bromine (P trend <0.05). After corrections for multiple testing, associations for chromium remained significant (Q trend <0.05), while associations for iron and silver were borderline significant. No associations were found for arsenic, boron, calcium, cesium, copper, gallium, gold, indium, lead, magnesium, manganese, mercury, molybdenum, rubidium, selenium, strontium, tantalum, thallium and tin. Our results suggest a possible role of bromine, cadmium, chromium, iron, nickel, silver and zinc in the development of type 2 diabetes. Copyright © 2016 Elsevier GmbH. All rights reserved.

  10. Comprehensive Genomic Profiling Identifies Frequent Drug-Sensitive EGFR Exon 19 Deletions in NSCLC not Identified by Prior Molecular Testing.

    PubMed

    Schrock, Alexa B; Frampton, Garrett M; Herndon, Dana; Greenbowe, Joel R; Wang, Kai; Lipson, Doron; Yelensky, Roman; Chalmers, Zachary R; Chmielecki, Juliann; Elvin, Julia A; Wollner, Mira; Dvir, Addie; Soussan-Gutman, Lior; Bordoni, Rodolfo; Peled, Nir; Braiteh, Fadi; Raez, Luis; Erlich, Rachel; Ou, Sai-Hong Ignatius; Mohamed, Mohamed; Ross, Jeffrey S; Stephens, Philip J; Ali, Siraj M; Miller, Vincent A

    2016-07-01

    Reliable detection of drug-sensitive activating EGFR mutations is critical in the care of advanced non-small cell lung cancer (NSCLC), but such testing is commonly performed using a wide variety of platforms, many of which lack rigorous analytic validation. A large pool of NSCLC cases was assayed with well-validated, hybrid capture-based comprehensive genomic profiling (CGP) at the request of the individual treating physicians in the course of clinical care for the purpose of making therapy decisions. From these, 400 cases harboring EGFR exon 19 deletions (Δex19) were identified, and available clinical history was reviewed. Pathology reports were available for 250 consecutive cases with classical EGFR Δex19 (amino acids 743-754) and were reviewed to assess previous non-hybrid capture-based EGFR testing. Twelve of 71 (17%) cases with EGFR testing results available were negative by previous testing, including 8 of 46 (17%) cases for which the same biopsy was analyzed. Independently, five of six (83%) cases harboring C-helical EGFR Δex19 were previously negative. In a subset of these patients with available clinical outcome information, robust benefit from treatment with EGFR inhibitors was observed. CGP identifies drug-sensitive EGFR Δex19 in NSCLC cases that have undergone prior EGFR testing and returned negative results. Given the proven benefit in progression-free survival conferred by EGFR tyrosine kinase inhibitors in patients with these alterations, CGP should be considered in the initial presentation of advanced NSCLC and when previous testing for EGFR mutations or other driver alterations is negative. Clin Cancer Res; 22(13); 3281-5. ©2016 AACR. ©2016 American Association for Cancer Research.

  11. Species delimitation in plants using the Qinghai-Tibet Plateau endemic Orinus (Poaceae: Tridentinae) as an example.

    PubMed

    Su, Xu; Wu, Guili; Li, Lili; Liu, Jianquan

    2015-07-01

    Accurate identification of species is essential for the majority of biological studies. However, defining species objectively and consistently remains a challenge, especially for plants distributed in remote regions where there is often a lack of sufficient previous specimens. In this study, multiple approaches and lines of evidence were used to determine species boundaries for plants occurring in the Qinghai-Tibet Plateau, using the genus Orinus (Poaceae) as a model system for an integrative approach to delimiting species. A total of 786 individuals from 102 populations of six previously recognized species were collected for niche, morphological and genetic analyses. Three plastid DNA regions (matK, rbcL and trnH-psbA) and one nuclear DNA region [internal transcribed spacer (ITS)] were sequenced. Whereas six species had been previously recognized, statistical analyses based on character variation, molecular data and niche differentiation identified only two well-delimited clusters, together with a third possibly originating from relatively recent hybridization between, or historical introgression from, the other two. Based on a principle of integrative species delimitation to reconcile different sources of data, the results provide compelling evidence that the six previously recognized species of the genus Orinus that were examined should be reduced to two, with new circumscriptions, and a third, identified in this study, should be described as a new species. This empirical study highlights the value of applying genetic differentiation, morphometric statistics and ecological niche modelling in an integrative approach to re-circumscribing species boundaries. The results produce relatively objective, operational and unbiased taxonomic classifications of plants occurring in remote regions. © The Author 2015. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved.

  12. Optical information encryption based on incoherent superposition with the help of the QR code

    NASA Astrophysics Data System (ADS)

    Qin, Yi; Gong, Qiong

    2014-01-01

    In this paper, a novel optical information encryption approach is proposed with the help of the QR code. This method is based on the concept of incoherent superposition, which we introduce for the first time. The information to be encrypted is first transformed into the corresponding QR code, and thereafter the QR code is further encrypted into two phase-only masks analytically by use of the intensity superposition of two diffraction wave fields. The proposed method has several advantages over the previous interference-based method, such as a higher security level, better robustness against noise attacks, and more relaxed working conditions. Numerical simulation results and results collected with an actual smartphone are shown to validate our proposal.

  13. Parametric Covariance Model for Horizon-Based Optical Navigation

    NASA Technical Reports Server (NTRS)

    Hikes, Jacob; Liounis, Andrew J.; Christian, John A.

    2016-01-01

    This Note presents an entirely parametric version of the covariance for horizon-based optical navigation measurements. The covariance can be written as a function of only the spacecraft position, two sensor design parameters, the illumination direction, the size of the observed planet, the size of the lit arc to be used, and the total number of observed horizon points. As a result, one may now more clearly understand the sensitivity of horizon-based optical navigation performance as a function of these key design parameters, which is insight that was obscured in previous (and nonparametric) versions of the covariance. Finally, the new parametric covariance is shown to agree with both the nonparametric analytic covariance and results from a Monte Carlo analysis.

  14. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    PubMed

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRC using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0 written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within the MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free of charge software. © 2017, National Ground Water Association.
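
    A minimal Python sketch of the segment-overlapping idea described above (translating each recession segment horizontally so its vertex lands on the preceding curve); the interpolation details and function names are assumptions, and the published tool is an Excel VBA implementation rather than this sketch.

    ```python
    import numpy as np

    def build_mrc(segments):
        """segments: list of (t, h) arrays, each h strictly decreasing (a recession limb)."""
        t_m, h_m = [np.asarray(a, float) for a in segments[0]]
        for t_s, h_s in segments[1:]:
            t_s, h_s = np.asarray(t_s, float), np.asarray(h_s, float)
            vertex = h_s[0]                      # highest recorded value of this segment
            # time on the current master curve where the level equals the vertex;
            # np.interp needs increasing x, so interpolate on the reversed (h, t) pairs
            t_star = np.interp(vertex, h_m[::-1], t_m[::-1])
            shift = t_star - t_s[0]              # horizontal translation of the segment
            t_m = np.concatenate([t_m, t_s + shift])
            h_m = np.concatenate([h_m, h_s])
        order = np.argsort(t_m)                  # return the merged, time-ordered MRC
        return t_m[order], h_m[order]

    # toy usage: two exponential-like recession limbs
    seg1 = (np.array([0, 1, 2, 3]), np.array([10.0, 8.0, 6.5, 5.5]))
    seg2 = (np.array([0, 1, 2]), np.array([7.0, 6.0, 5.2]))
    t_mrc, h_mrc = build_mrc([seg1, seg2])
    ```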

  15. Weighted bi-prediction for light field image coding

    NASA Astrophysics Data System (ADS)

    Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.

    2017-09-01

    Light field imaging based on a single-tier camera equipped with a microlens array - also known as integral, holoscopic, and plenoptic imaging - has recently emerged as a practical and promising approach for future visual applications and services. However, successfully deploying actual light field imaging applications and services will require developing adequate coding solutions to efficiently handle the massive amount of data involved in these systems. In this context, self-similarity compensated prediction is a non-local spatial prediction scheme based on block matching that has been shown to achieve high efficiency for light field image coding based on the High Efficiency Video Coding (HEVC) standard. As previously shown by the authors, this is possible by simply averaging two predictor blocks that are jointly estimated from a causal search window in the current frame itself, referred to as self-similarity bi-prediction. However, theoretical analyses for motion compensated bi-prediction have suggested that it is still possible to achieve further rate-distortion performance improvements by adaptively estimating the weighting coefficients of the two predictor blocks. Therefore, this paper presents a comprehensive study of the rate-distortion performance for HEVC-based light field image coding when using different sets of weighting coefficients for self-similarity bi-prediction. Experimental results demonstrate that it is possible to extend the previous theoretical conclusions to light field image coding and show that the proposed adaptive weighting coefficient selection leads to up to 5% bit savings compared to the previous self-similarity bi-prediction scheme.
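
    A minimal Python sketch of weighted bi-prediction over a small candidate set of weight pairs, with SAD used as a stand-in for the full HEVC rate-distortion cost; the candidate weights and the distortion measure are illustrative assumptions, not the codec's actual decision process.

    ```python
    import numpy as np

    def weighted_biprediction(block, pred0, pred1,
                              candidates=((0.5, 0.5), (0.25, 0.75), (0.75, 0.25))):
        best = None
        for w0, w1 in candidates:
            pred = w0 * pred0 + w1 * pred1          # weighted combination of the two predictors
            sad = np.abs(block - pred).sum()        # distortion surrogate (SAD)
            if best is None or sad < best[0]:
                best = (sad, (w0, w1), pred)
        return best[1], best[2]                     # selected weights and prediction

    # toy usage on an 8x8 block with two imperfect predictors
    rng = np.random.default_rng(0)
    blk = rng.random((8, 8))
    (w0, w1), pred = weighted_biprediction(blk, blk * 0.9, blk * 1.2)
    ```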

  16. Optimization and large scale computation of an entropy-based moment closure

    NASA Astrophysics Data System (ADS)

    Kristopher Garrett, C.; Hauck, Cory; Hill, Judith

    2015-12-01

    We present computational advances and results in the implementation of an entropy-based moment closure, M_N, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as P_N, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load balancing issues in scaling the M_N algorithm that do not appear for the P_N algorithm. We also observe that in weak scaling tests, the ratio in time to solution of M_N to P_N decreases.

  17. Optimization and large scale computation of an entropy-based moment closure

    DOE PAGES

    Hauck, Cory D.; Hill, Judith C.; Garrett, C. Kristopher

    2015-09-10

    We present computational advances and results in the implementation of an entropy-based moment closure, M_N, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as P_N, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication bound simulations, we present timing results at the largest computational scales currently available. Lastly, these results show, in particular, load balancing issues in scaling the M_N algorithm that do not appear for the P_N algorithm. We also observe that in weak scaling tests, the ratio in time to solution of M_N to P_N decreases.

  18. Evaluation Of Sludge Heel Dissolution Efficiency With Oxalic Acid Cleaning At Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudduth, Christie; Vitali, Jason; Keefer, Mark

    The chemical cleaning process baseline strategy at the Savannah River Site was revised to improve efficiency during future execution of the process based on lessons learned during previous bulk oxalic acid cleaning activities and to account for operational constraints imposed by safety basis requirements. These improvements were also intended to overcome the difficulties that arise from waste removal in higher rheological yield stress sludge tanks. Tank 12 implemented this improved strategy and the bulk oxalic acid cleaning efforts concluded in July 2013. The Tank 12 radiological removal results were similar to previous bulk oxalic acid cleaning campaigns despite the fact that Tank 12 contained higher rheological yield stress sludge that would make removal more difficult than the sludge treated in previous cleaning campaigns. No appreciable oxalate precipitation occurred during the cleaning process in Tank 12 compared to previous campaigns, which aided in the net volume reduction of 75-80%. Overall, the controls established for Tank 12 provide a template for an improved cleaning process.

  19. A cosmology-independent calibration of type Ia supernovae data

    NASA Astrophysics Data System (ADS)

    Hauret, C.; Magain, P.; Biernaux, J.

    2018-06-01

    Recently, the common methodology used to transform type Ia supernovae (SNe Ia) into genuine standard candles has been suffering criticism. Indeed, it assumes a particular cosmological model (namely the flat ΛCDM) to calibrate the standardisation correction parameters, i.e. the dependency of the supernova peak absolute magnitude on its colour, post-maximum decline rate and host galaxy mass. As a result, this assumption could make the data compliant to the assumed cosmology and thus nullify all works previously conducted on model comparison. In this work, we verify the viability of these hypotheses by developing a cosmology-independent approach to standardise SNe Ia data from the recent JLA compilation. Our resulting corrections turn out to be very close to the ΛCDM-based corrections. Therefore, even if a ΛCDM-based calibration is questionable from a theoretical point of view, the potential compliance of SNe Ia data does not happen in practice for the JLA compilation. Previous works of model comparison based on these data do not have to be called into question. However, as this cosmology-independent standardisation method has the same degree of complexity as the model-dependent one, it is worth using it in future works, especially if smaller samples are considered, such as the superluminous type Ic supernovae.
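
    For context, a hedged sketch of the conventional SALT2/Tripp-style standardisation relation that such correction parameters enter; the host-mass step form and the parameter names are standard conventions assumed here, not necessarily the exact model used in the paper.

    ```python
    def distance_modulus(m_B, x1, c, host_logmass,
                         alpha, beta, M_B, delta_M, mass_step=10.0):
        """Corrected distance modulus of one supernova (Tripp-style relation).

        m_B: peak apparent magnitude; x1: stretch (decline-rate) parameter;
        c: colour; alpha, beta: standardisation coefficients;
        M_B: fiducial absolute magnitude; delta_M: host-mass step amplitude.
        """
        # assumed step-function dependence on host stellar mass
        M_eff = M_B + (delta_M if host_logmass >= mass_step else 0.0)
        return m_B - M_eff + alpha * x1 - beta * c

    # illustrative call with plausible (not fitted) parameter values
    mu = distance_modulus(23.1, 0.5, -0.02, 10.3,
                          alpha=0.14, beta=3.1, M_B=-19.1, delta_M=-0.06)
    ```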

  20. Investigation of micromixing by acoustically oscillated sharp-edges

    PubMed Central

    Nama, Nitesh; Huang, Po-Hsun; Huang, Tony Jun; Costanzo, Francesco

    2016-01-01

    Recently, acoustically oscillated sharp-edges have been utilized to achieve rapid and homogeneous mixing in microchannels. Here, we present a numerical model to investigate acoustic mixing inside a sharp-edge-based micromixer in the presence of a background flow. We extend our previously reported numerical model to include the mixing phenomena by using perturbation analysis and the Generalized Lagrangian Mean (GLM) theory in conjunction with the convection-diffusion equation. We divide the flow variables into zeroth-order, first-order, and second-order variables. This results in three sets of equations representing the background flow, acoustic response, and the time-averaged streaming flow, respectively. These equations are then solved successively to obtain the mean Lagrangian velocity which is combined with the convection-diffusion equation to predict the concentration profile. We validate our numerical model via a comparison of the numerical results with the experimentally obtained values of the mixing index for different flow rates. Further, we employ our model to study the effect of the applied input power and the background flow on the mixing performance of the sharp-edge-based micromixer. We also suggest potential design changes to the previously reported sharp-edge-based micromixer to improve its performance. Finally, we investigate the generation of a tunable concentration gradient by a linear arrangement of the sharp-edge structures inside the microchannel. PMID:27158292

  1. The Context of Demarcation in Nature of Science Teaching: The Case of Astrology

    NASA Astrophysics Data System (ADS)

    Turgut, Halil

    2011-05-01

    The aim of developing students' understanding of the nature of science [NOS] has been considered an important aspect of science education. However, the results of previous research indicate that students of various ages and even teachers possess both inaccurate and inappropriate views of the NOS. Such a failure has been explained by the view that perceptions about the NOS are well assimilated into mental structures and resistant to change. Further, the popularization of pseudoscience by the media and the assimilation of pseudoscience into previously established scientific fields have been presented as possible reasons for erroneous popular perceptions of science. Any teaching intervention designed to teach the NOS should first provoke individuals to expose their current ideas in order to provide them with the chance to revise or replace these conceptual frameworks. Based on these assumptions, the aim of this study was to determine whether a teaching context based on the issue of demarcation would provide a suitable opportunity for exposing and further developing the NOS understandings of individuals enrolled in a teacher education course. Results indicate that a learning intervention based on the issue of demarcation of science from pseudoscience (in the specific case of astrology) proved an effective instructional strategy, which a majority of teacher candidates claimed to plan to use in their future teaching.

  2. Marker-Based Estimates Reveal Significant Non-additive Effects in Clonally Propagated Cassava (Manihot esculenta): Implications for the Prediction of Total Genetic Value and the Selection of Varieties.

    PubMed

    Wolfe, Marnin D; Kulakow, Peter; Rabbi, Ismail Y; Jannink, Jean-Luc

    2016-08-31

    In clonally propagated crops, non-additive genetic effects can be effectively exploited by the identification of superior genetic individuals as varieties. Cassava (Manihot esculenta Crantz) is a clonally propagated staple food crop that feeds hundreds of millions. We quantified the amount and nature of non-additive genetic variation for three key traits in a breeding population of cassava from sub-Saharan Africa using additive and non-additive genome-wide marker-based relationship matrices. We then assessed the accuracy of genomic prediction for total (additive plus non-additive) genetic value. We confirmed previous findings based on diallel populations, that non-additive genetic variation is significant for key cassava traits. Specifically, we found that dominance is particularly important for root yield and epistasis contributes strongly to variation in CMD resistance. Further, we showed that total genetic value predicted observed phenotypes more accurately than additive only models for root yield but not for dry matter content, which is mostly additive or for CMD resistance, which has high narrow-sense heritability. We address the implication of these results for cassava breeding and put our work in the context of previous results in cassava, and other plant and animal species. Copyright © 2016 Author et al.

  3. Optimal back-extrapolation method for estimating plasma volume in humans using the indocyanine green dilution method.

    PubMed

    Polidori, David; Rowley, Clarence

    2014-07-22

    The indocyanine green dilution method is one of the methods available to estimate plasma volume, although some researchers have questioned the accuracy of this method. We developed a new, physiologically based mathematical model of indocyanine green kinetics that more accurately represents indocyanine green kinetics during the first few minutes postinjection than what is assumed when using the traditional mono-exponential back-extrapolation method. The mathematical model is used to develop an optimal back-extrapolation method for estimating plasma volume based on simulated indocyanine green kinetics obtained from the physiological model. Results from a clinical study using the indocyanine green dilution method in 36 subjects with type 2 diabetes indicate that the estimated plasma volumes are considerably lower when using the traditional back-extrapolation method than when using the proposed back-extrapolation method (mean (standard deviation) plasma volume = 26.8 (5.4) mL/kg for the traditional method vs 35.1 (7.0) mL/kg for the proposed method). The results obtained using the proposed method are more consistent with previously reported plasma volume values. Based on the more physiological representation of indocyanine green kinetics and greater consistency with previously reported plasma volume values, the new back-extrapolation method is proposed for use when estimating plasma volume using the indocyanine green dilution method.
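
    A hedged Python sketch of the traditional mono-exponential back-extrapolation estimate that the paper re-examines; the fitting window and variable names are illustrative, and the paper's proposed optimal method replaces this extrapolation step with one derived from their physiological model.

    ```python
    import numpy as np

    def plasma_volume_backextrap(t_min, conc_mg_per_L, dose_mg, fit_window=(2.0, 5.0)):
        t = np.asarray(t_min, float)
        c = np.asarray(conc_mg_per_L, float)
        mask = (t >= fit_window[0]) & (t <= fit_window[1])
        # linear fit of ln(C) vs t over the fitting window (mono-exponential decay)
        slope, intercept = np.polyfit(t[mask], np.log(c[mask]), 1)
        c0 = np.exp(intercept)            # back-extrapolated concentration at injection time
        return dose_mg / c0               # plasma volume in litres

    # toy usage: 25 mg dose, synthetic decay sampled every 30 s
    t = np.arange(0.5, 6.0, 0.5)
    c = 6.0 * np.exp(-0.12 * t)
    print(plasma_volume_backextrap(t, c, dose_mg=25.0))  # about 25/6 ≈ 4.2 L
    ```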

  4. Multiscale QM/MM molecular dynamics study on the first steps of guanine damage by free hydroxyl radicals in solution.

    PubMed

    Abolfath, Ramin M; Biswas, P K; Rajnarayanam, R; Brabec, Thomas; Kodym, Reinhard; Papiez, Lech

    2012-04-19

    Understanding the damage of DNA bases from hydrogen abstraction by free OH radicals is of particular importance to understanding the indirect effect of ionizing radiation. Previous studies address the problem with truncated DNA bases as ab initio quantum simulations required to study such electronic-spin-dependent processes are computationally expensive. Here, for the first time, we employ a multiscale and hybrid quantum mechanical-molecular mechanical simulation to study the interaction of OH radicals with a guanine-deoxyribose-phosphate DNA molecular unit in the presence of water, where all of the water molecules and the deoxyribose-phosphate fragment are treated with the simplistic classical molecular mechanical scheme. Our result illustrates that the presence of water strongly alters the hydrogen-abstraction reaction as the hydrogen bonding of OH radicals with water restricts the relative orientation of the OH radicals with respect to the DNA base (here, guanine). This results in an angular anisotropy in the chemical pathway and a lower efficiency in the hydrogen-abstraction mechanisms than previously anticipated for identical systems in vacuum. The method can easily be extended to single- and double-stranded DNA without any appreciable computational cost as these molecular units can be treated in the classical subsystem, as has been demonstrated here. © 2012 American Chemical Society

  5. The overlapping brain region accounting for the relationship between procrastination and impulsivity: A voxel-based morphometry study.

    PubMed

    Liu, Peiwei; Feng, Tingyong

    2017-09-30

    Procrastination is a prevalent problematic behavior that brings serious consequences, such as lower levels of health, wealth, and well-being. Previous research has verified that impulsivity is one of the traits most strongly correlated with procrastination. However, little is known about why there is a tight behavioral relationship between them. To address this question, we used voxel-based morphometry (VBM) to explore the common neural substrates between procrastination and impulsivity. In line with previous findings, the behavioral results showed a strong behavioral correlation between procrastination and impulsivity. Neuroimaging results showed impulsivity and procrastination shared the common neurobiological underpinnings in the dorsolateral prefrontal cortex (DLPFC) based on the data from 85 participants (sample 1). Furthermore, the mediation analysis revealed that impulsivity mediated the impact of gray matter (GM) volumes of this overlapping region in the DLPFC on procrastination in another independent sample of 84 participants (sample 2). In conclusion, the overlapping brain region in the DLPFC would be responsible for the close relationship between procrastination and impulsivity. As a whole, the present study extends our knowledge on procrastination, and provides a novel perspective to explain the tight impulsivity-procrastination relationship. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.

  6. Detection of food intake from swallowing sequences by supervised and unsupervised methods.

    PubMed

    Lopez-Meyer, Paulo; Makeyev, Oleksandr; Schuckers, Stephanie; Melanson, Edward L; Neuman, Michael R; Sazonov, Edward

    2010-08-01

    Studies of food intake and ingestive behavior in free-living conditions most often rely on self-reporting-based methods that can be highly inaccurate. Methods of Monitoring of Ingestive Behavior (MIB) rely on objective measures derived from chewing and swallowing sequences and thus can be used for unbiased study of food intake under free-living conditions. Our previous study demonstrated accurate detection of food intake in simple models relying on observation of both chewing and swallowing. This article investigates methods that achieve comparable accuracy of food intake detection using only the time series of swallows, thus eliminating the need for the chewing sensor. The classification is performed for each individual swallow rather than for previously used time slices and thus will lead to higher accuracy in mass prediction models relying on counts of swallows. Performance of a group model based on a supervised method (SVM) is compared to performance of individual models based on an unsupervised method (K-means), with results indicating better performance of the unsupervised, self-adapting method. Overall, the results demonstrate that highly accurate detection of intake of foods with substantially different physical properties is possible by an unsupervised system that relies on the information provided by the swallowing alone.
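
    A hedged Python sketch contrasting the two modelling strategies described above: a supervised group model (SVM) pooled across subjects versus an unsupervised, per-subject model (K-means); the feature choice and the rule for labelling the "intake" cluster are assumptions, not the authors' exact pipeline.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.cluster import KMeans

    def group_model(train_features, train_labels):
        # supervised group model trained on swallow features pooled across subjects
        return SVC(kernel="rbf").fit(train_features, train_labels)

    def individual_model(subject_features):
        # unsupervised, self-adapting model: two clusters fitted per subject
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(subject_features)
        # assumed heuristic: the cluster with the higher mean feature value
        # (e.g. instantaneous swallowing rate) is labelled "food intake"
        intake_cluster = int(np.argmax(km.cluster_centers_.mean(axis=1)))
        return (km.labels_ == intake_cluster).astype(int)

    # toy usage for the unsupervised per-subject model
    rng = np.random.default_rng(1)
    feats = np.vstack([rng.normal(0.2, 0.05, (50, 2)),   # non-intake swallows
                       rng.normal(0.8, 0.05, (30, 2))])  # intake swallows
    labels_unsup = individual_model(feats)
    ```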

  7. Detection of Food Intake from Swallowing Sequences by Supervised and Unsupervised Methods

    PubMed Central

    Lopez-Meyer, Paulo; Makeyev, Oleksandr; Schuckers, Stephanie; Melanson, Edward L.; Neuman, Michael R.; Sazonov, Edward

    2010-01-01

    Studies of food intake and ingestive behavior in free-living conditions most often rely on self-reporting-based methods that can be highly inaccurate. Methods of Monitoring of Ingestive Behavior (MIB) rely on objective measures derived from chewing and swallowing sequences and thus can be used for unbiased study of food intake under free-living conditions. Our previous study demonstrated accurate detection of food intake in simple models relying on observation of both chewing and swallowing. This article investigates methods that achieve comparable accuracy of food intake detection using only the time series of swallows, thus eliminating the need for the chewing sensor. The classification is performed for each individual swallow rather than for previously used time slices and thus will lead to higher accuracy in mass prediction models relying on counts of swallows. Performance of a group model based on a supervised method (SVM) is compared to performance of individual models based on an unsupervised method (K-means), with results indicating better performance of the unsupervised, self-adapting method. Overall, the results demonstrate that highly accurate detection of intake of foods with substantially different physical properties is possible by an unsupervised system that relies on the information provided by the swallowing alone. PMID:20352335

  8. Detecting the borders between coding and non-coding DNA regions in prokaryotes based on recursive segmentation and nucleotide doublets statistics

    PubMed Central

    2012-01-01

    Background: Detecting the borders between coding and non-coding regions is an essential step in genome annotation, and information entropy measures are useful for describing signals in genome sequences. However, the accuracy of previous methods for finding borders based on entropy segmentation still needs to be improved. Methods: In this study, we first applied a new recursive entropic segmentation method on DNA sequences to get preliminary significant cuts. A 22-symbol alphabet is used to capture the differential composition of nucleotide doublets and stop codon patterns along three phases in both DNA strands. This process requires no prior training datasets. Results: Compared with previous segmentation methods, the experimental results on three bacterial genomes, Rickettsia prowazekii, Borrelia burgdorferi and E. coli, show that our approach improves the accuracy of finding the borders between coding and non-coding regions in DNA sequences. Conclusions: This paper presents a new segmentation method for prokaryotes based on the Jensen-Rényi divergence with a 22-symbol alphabet. For the three bacterial genomes, compared to the A12_JR method, our method improved the accuracy of finding the borders between protein-coding and non-coding regions in DNA sequences. PMID:23282225
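
    A hedged Python sketch of one recursive-segmentation step: scanning for the cut that maximizes the Jensen-Rényi divergence between the symbol distributions of the two resulting subsequences. The 22-symbol doublet/stop-codon encoding used in the paper is not reproduced here, and the divergence order and minimum segment length are assumptions.

    ```python
    import numpy as np

    def renyi_entropy(p, alpha=2.0):
        # Renyi entropy of order alpha for a discrete distribution p
        p = p[p > 0]
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def jensen_renyi(p, q, w1, w2, alpha=2.0):
        # divergence between two distributions with mixture weights w1, w2 (w1 + w2 = 1)
        mix = w1 * p + w2 * q
        return renyi_entropy(mix, alpha) - (w1 * renyi_entropy(p, alpha) +
                                            w2 * renyi_entropy(q, alpha))

    def best_cut(seq, alphabet, alpha=2.0, min_len=20):
        # one segmentation step: keep the cut with the largest divergence;
        # a recursive scheme would re-apply this to each resulting half
        idx = {s: i for i, s in enumerate(alphabet)}
        codes = np.array([idx[s] for s in seq])
        n = len(codes)
        best_div, best_pos = -np.inf, None
        for cut in range(min_len, n - min_len):
            left = np.bincount(codes[:cut], minlength=len(alphabet)) / cut
            right = np.bincount(codes[cut:], minlength=len(alphabet)) / (n - cut)
            d = jensen_renyi(left, right, cut / n, 1.0 - cut / n, alpha)
            if d > best_div:
                best_div, best_pos = d, cut
        return best_pos, best_div
    ```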

  9. Investigation of micromixing by acoustically oscillated sharp-edges.

    PubMed

    Nama, Nitesh; Huang, Po-Hsun; Huang, Tony Jun; Costanzo, Francesco

    2016-03-01

    Recently, acoustically oscillated sharp-edges have been utilized to achieve rapid and homogeneous mixing in microchannels. Here, we present a numerical model to investigate acoustic mixing inside a sharp-edge-based micromixer in the presence of a background flow. We extend our previously reported numerical model to include the mixing phenomena by using perturbation analysis and the Generalized Lagrangian Mean (GLM) theory in conjunction with the convection-diffusion equation. We divide the flow variables into zeroth-order, first-order, and second-order variables. This results in three sets of equations representing the background flow, acoustic response, and the time-averaged streaming flow, respectively. These equations are then solved successively to obtain the mean Lagrangian velocity which is combined with the convection-diffusion equation to predict the concentration profile. We validate our numerical model via a comparison of the numerical results with the experimentally obtained values of the mixing index for different flow rates. Further, we employ our model to study the effect of the applied input power and the background flow on the mixing performance of the sharp-edge-based micromixer. We also suggest potential design changes to the previously reported sharp-edge-based micromixer to improve its performance. Finally, we investigate the generation of a tunable concentration gradient by a linear arrangement of the sharp-edge structures inside the microchannel.

  10. High correlations between MRI brain volume measurements based on NeuroQuant® and FreeSurfer.

    PubMed

    Ross, David E; Ochs, Alfred L; Tate, David F; Tokac, Umit; Seabaugh, John; Abildskov, Tracy J; Bigler, Erin D

    2018-05-30

    NeuroQuant® (NQ) and FreeSurfer (FS) are commonly used computer-automated programs for measuring MRI brain volume. Previously they were reported to have high intermethod reliabilities but often large intermethod effect size differences. We hypothesized that linear transformations could be used to reduce the large effect sizes. This study was an extension of our previously reported study. We performed NQ and FS brain volume measurements on 60 subjects (including normal controls, patients with traumatic brain injury, and patients with Alzheimer's disease). We used two statistical approaches in parallel to develop methods for transforming FS volumes into NQ volumes: traditional linear regression, and Bayesian linear regression. For both methods, we used regression analyses to develop linear transformations of the FS volumes to make them more similar to the NQ volumes. The FS-to-NQ transformations based on traditional linear regression resulted in effect sizes which were small to moderate. The transformations based on Bayesian linear regression resulted in all effect sizes being trivially small. To our knowledge, this is the first report describing a method for transforming FS to NQ data so as to achieve high reliability and low effect size differences. Machine learning methods like Bayesian regression may be more useful than traditional methods. Copyright © 2018 Elsevier B.V. All rights reserved.
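
    A hedged Python sketch of fitting per-structure linear transformations from FS to NQ volumes, using ordinary least squares and Bayesian ridge regression as stand-ins for the two regression approaches described; the exact Bayesian model used in the paper may differ.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression, BayesianRidge

    def fit_fs_to_nq(fs_vol, nq_vol, bayesian=False):
        """fs_vol, nq_vol: 1-D arrays of one structure's volume across subjects."""
        X = np.asarray(fs_vol, float).reshape(-1, 1)
        y = np.asarray(nq_vol, float)
        model = BayesianRidge() if bayesian else LinearRegression()
        return model.fit(X, y)

    def transform(model, fs_vol):
        # apply the fitted linear transformation to new FS volumes
        return model.predict(np.asarray(fs_vol, float).reshape(-1, 1))

    # toy usage: hypothetical hippocampal volumes (cm^3) mapped onto the NQ scale
    fs = np.array([3.9, 4.2, 3.5, 4.8, 4.1])
    nq = 1.08 * fs + 0.15
    print(transform(fit_fs_to_nq(fs, nq, bayesian=True), fs))
    ```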

  11. Sixth-grade students' reasoning on the order relation of integers as influenced by prior experience: an inferentialist analysis

    NASA Astrophysics Data System (ADS)

    Schindler, Maike; Hußmann, Stephan; Nilsson, Per; Bakker, Arthur

    2017-12-01

    Negative numbers are among the first formalizations students encounter in their mathematics learning that clearly differ from out-of-school experiences. What has not sufficiently been addressed in previous research is the question of how students draw on their prior experiences when reasoning on negative numbers and how they infer from these experiences. This article presents results from an empirical study investigating sixth-grade students' reasoning and inferring from school-based and out-of-school experiences. In particular, it addresses the order relation, which deals with students' very first encounters with negative numbers. Here, students can reason in different ways, depending on the experiences they draw on. We study how students reason before a lesson series and how their reasoning is influenced through this lesson series where the number line and the context debts-and-assets are predominant. For grasping the reasoning's inferential and social nature and conducting in-depth analyses of two students' reasoning, we use an epistemological framework that is based on the philosophical theory of inferentialism. The results illustrate how the students infer their reasoning from out-of-school and from school-based experiences both before and after the lesson series. They reveal interesting phenomena not previously analyzed in the research on the order relation for integers.

  12. WE-DE-201-12: Thermal and Dosimetric Properties of a Ferrite-Based Thermo-Brachytherapy Seed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warrell, G; Shvydka, D; Parsai, E I

    Purpose: The novel thermo-brachytherapy (TB) seed provides a simple means of adding hyperthermia to LDR prostate permanent implant brachytherapy. The high blood perfusion rate (BPR) within the prostate motivates the use of the ferrite and conductive outer layer design for the seed cores. We describe the results of computational analyses of the thermal properties of this ferrite-based TB seed in modelled patient-specific anatomy, as well as studies of the interseed and scatter (ISA) effect. Methods: The anatomies (including the thermophysical properties of the main tissue types) and seed distributions of 6 prostate patients who had been treated with LDR brachytherapy seeds were modelled in the finite element analysis software COMSOL, using ferrite-based TB and additional hyperthermia-only (HT-only) seeds. The resulting temperature distributions were compared to those computed for patient-specific seed distributions, but in uniform anatomy with a constant blood perfusion rate. The ISA effect was quantified in the Monte Carlo software package MCNP5. Results: Compared with temperature distributions calculated in modelled uniform tissue, temperature distributions in the patient-specific anatomy were higher and more heterogeneous. Moreover, the maximum temperature to the rectal wall was typically ∼1 °C greater for patient-specific anatomy than for uniform anatomy. The ISA effect of the TB and HT-only seeds caused a reduction in D90 similar to that found for previously-investigated NiCu-based seeds, but of a slightly smaller magnitude. Conclusion: The differences between temperature distributions computed for uniform and patient-specific anatomy for ferrite-based seeds are significant enough that heterogeneous anatomy should be considered. Both types of modelling indicate that ferrite-based seeds provide sufficiently high and uniform hyperthermia to the prostate, without excessively heating surrounding tissues. The ISA effect of these seeds is slightly less than that for the previously-presented NiCu-based seeds.

  13. Improving the accuracy of ultrafast ligand-based screening: incorporating lipophilicity into ElectroShape as an extra dimension.

    PubMed

    Armstrong, M Stuart; Finn, Paul W; Morris, Garrett M; Richards, W Graham

    2011-08-01

    In a previous paper, we presented the ElectroShape method, which we used to achieve successful ligand-based virtual screening. It extended classical shape-based methods by applying them to the four-dimensional shape of the molecule where partial charge was used as the fourth dimension to capture electrostatic information. This paper extends the approach by using atomic lipophilicity (alogP) as an additional molecular property and validates it using the improved release 2 of the Directory of Useful Decoys (DUD). When alogP replaced partial charge, the enrichment results were slightly below those of ElectroShape, though still far better than purely shape-based methods. However, when alogP was added as a complement to partial charge, the resulting five-dimensional enrichments show a clear improvement in performance. This demonstrates the utility of extending the ElectroShape virtual screening method by adding other atom-based descriptors.
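
    A hedged Python sketch of a USR/ElectroShape-style descriptor extended to five dimensions (coordinates, scaled partial charge, scaled atomic logP); the reference-point construction, scaling factors, and moment choices are simplified assumptions, not the published parameterization.

    ```python
    import numpy as np

    def electroshape5d(coords, charges, alogp, charge_scale=25.0, logp_scale=5.0):
        # build 5-D points: (x, y, z, scaled partial charge, scaled atomic logP)
        pts = np.column_stack([np.asarray(coords, float),
                               charge_scale * np.asarray(charges, float),
                               logp_scale * np.asarray(alogp, float)])
        ctd = pts.mean(axis=0)                                   # centroid
        cst = pts[np.argmin(np.linalg.norm(pts - ctd, axis=1))]  # closest atom to centroid
        fct = pts[np.argmax(np.linalg.norm(pts - ctd, axis=1))]  # farthest atom from centroid
        ftf = pts[np.argmax(np.linalg.norm(pts - fct, axis=1))]  # farthest atom from fct
        desc = []
        for ref in (ctd, cst, fct, ftf):
            d = np.linalg.norm(pts - ref, axis=1)
            mean, sd = d.mean(), d.std()
            skew = ((d - mean) ** 3).mean() / (sd ** 3 + 1e-12)
            desc.extend([mean, sd, skew])    # first three moments per reference point
        return np.array(desc)                # 12-element shape/charge/lipophilicity vector
    ```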

  14. Existence and global exponential stability of periodic solution of memristor-based BAM neural networks with time-varying delays.

    PubMed

    Li, Hongfei; Jiang, Haijun; Hu, Cheng

    2016-03-01

    In this paper, we investigate a class of memristor-based BAM neural networks with time-varying delays. Under the framework of Filippov solutions, boundedness and ultimate boundedness of solutions of memristor-based BAM neural networks are guaranteed by means of the chain rule and inequality techniques. Moreover, a new method involving a Yoshizawa-like theorem is employed to establish the existence of a periodic solution. By applying the theory of set-valued maps and functional differential inclusions, a suitable Lyapunov functional and some new testable algebraic criteria are derived to ensure the uniqueness and global exponential stability of the periodic solution of memristor-based BAM neural networks. The obtained results expand and complement previous work on memristor-based BAM neural networks. Finally, a numerical example is provided to show the applicability and effectiveness of our theoretical results.
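    The abstract does not reproduce the model, but memristor-based BAM networks with time-varying delays are typically written in this literature as a pair of coupled delayed systems with state-dependent (memristive) connection weights, for instance (notation illustrative; the paper's exact model may differ):

```latex
\begin{aligned}
\dot{x}_i(t) &= -a_i x_i(t) + \sum_{j=1}^{m} p_{ji}\bigl(x_i(t)\bigr) f_j\bigl(y_j(t)\bigr)
              + \sum_{j=1}^{m} q_{ji}\bigl(x_i(t)\bigr) f_j\bigl(y_j(t-\tau_{ji}(t))\bigr) + I_i,\\
\dot{y}_j(t) &= -b_j y_j(t) + \sum_{i=1}^{n} c_{ij}\bigl(y_j(t)\bigr) g_i\bigl(x_i(t)\bigr)
              + \sum_{i=1}^{n} d_{ij}\bigl(y_j(t)\bigr) g_i\bigl(x_i(t-\sigma_{ij}(t))\bigr) + J_j.
\end{aligned}
```

    Because the memristive weights p, q, c, d switch discontinuously with the state, solutions are understood in the Filippov sense, which is why the analysis proceeds through set-valued maps and functional differential inclusions before a Lyapunov functional is constructed.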

  15. BER Analysis of Coherent Free-Space Optical Communication Systems with a Focal-Plane-Based Wavefront Sensor

    NASA Astrophysics Data System (ADS)

    Cao, Jingtai; Zhao, Xiaohui; Liu, Wei; Gu, Haijun

    2018-03-01

    A wavefront sensor is one of the most important units of an adaptive optics system. Building on our previous work, in this paper we discuss the bit-error-rate (BER) performance of coherent free-space optical communication systems with a focal-plane-based wavefront sensor. First, the theory of the focal-plane-based wavefront sensor is given. Then the relationship between the BER and the mixing efficiency of a homodyne receiver is discussed on the basis of binary phase-shift keying (BPSK) modulation. Finally, numerical simulation results show that the BER decreases markedly after aberration correction with the focal-plane-based wavefront sensor. In addition, the BER decreases as the number of photons received within a single bit increases. These results provide a reference for the design of coherent free-space optical communication (FSOC) systems.
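    The abstract does not state the BER expression used, but for a shot-noise-limited homodyne BPSK receiver the textbook relation BER = (1/2) erfc(sqrt(2 eta N)), with eta the mixing efficiency and N the number of photons received per bit, captures both trends reported above. The sketch below evaluates this standard relation only as an illustrative stand-in for the paper's analysis.

```python
# Shot-noise-limited BER of homodyne BPSK vs. photons per bit, for a few mixing
# efficiencies. BER = 0.5 * erfc(sqrt(2 * eta * N)) is a standard textbook relation
# used here only as an illustrative stand-in for the paper's analysis.
import numpy as np
from scipy.special import erfc

def ber_homodyne_bpsk(photons_per_bit, mixing_efficiency=1.0):
    return 0.5 * erfc(np.sqrt(2.0 * mixing_efficiency * photons_per_bit))

photons = np.array([5, 10, 20, 50])
for eta in (0.4, 0.7, 1.0):   # mixing efficiency before/after correction (illustrative values)
    bers = ber_homodyne_bpsk(photons, eta)
    print(f"eta = {eta:.1f}: " + ", ".join(f"N={n}: {b:.2e}" for n, b in zip(photons, bers)))
```

    Raising the mixing efficiency (as aberration correction does) or the photon number per bit drives the BER down, consistent with the simulation trends described in the abstract.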

  16. Initial Single Event Effects Testing of the Xilinx Virtex-4 Field Programmable Gate Array

    NASA Technical Reports Server (NTRS)

    Allen, Gregory R.; Swift, Gary M.; Carmichael, C.; Tseng, C.

    2007-01-01

    We present initial results for the thin epitaxial Xilinx Virtex-4 Field Programmable Gate Array (FPGA), and compare them to previous results obtained for the Virtex-II and Virtex-II Pro. The data presented were acquired through a consortium-based effort with the common goal of providing the space community with data and mitigation methods for the use of Xilinx FPGAs in space.

  17. Actual waste demonstration of the nitric-glycolic flowsheet for sludge batch 9 qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newell, D.; Pareizs, J.; Martino, C.

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs qualification testing to demonstrate that the sludge batch is processable. Based on the results of this actual-waste qualification and previous simulant studies, SRNL recommends implementation of the nitric-glycolic acid flowsheet in DWPF. Other recommendations resulting from this demonstration are reported in section 5.0.

  18. Kernel Partial Least Squares for Nonlinear Regression and Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper summarizes recent results on applying the method of partial least squares (PLS) in a reproducing kernel Hilbert space (RKHS). A previously proposed kernel PLS regression model was proven to be competitive with other regularized regression methods in RKHS. The family of nonlinear kernel-based PLS models is extended by considering the kernel PLS method for discrimination. Theoretical and experimental results on a two-class discrimination problem indicate the usefulness of the method.
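    For readers unfamiliar with the method, the following is a compact NIPALS-style kernel PLS sketch in the spirit of such models: score vectors are extracted iteratively from a centred Gram matrix, which is deflated after each component. It is an illustrative sketch under these assumptions, not the paper's implementation.

```python
# Compact NIPALS-style kernel PLS sketch (illustrative, not the paper's implementation).
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_pls_fit(K, Y, n_components=2, tol=1e-10, max_iter=500):
    """K: (n, n) centred training Gram matrix; Y: (n, q) centred targets."""
    n = K.shape[0]
    Kd, Yd = K.copy(), Y.copy()                 # deflated working copies
    T, U = [], []
    for _ in range(n_components):
        u = Yd[:, [0]]
        t_old = np.zeros((n, 1))
        for _ in range(max_iter):
            t = Kd @ u
            t /= np.linalg.norm(t)
            c = Yd.T @ t
            u = Yd @ c
            u /= np.linalg.norm(u)
            if np.linalg.norm(t - t_old) < tol:
                break
            t_old = t
        T.append(t); U.append(u)
        P = np.eye(n) - t @ t.T                 # deflate with the extracted score vector
        Kd = P @ Kd @ P
        Yd = Yd - t @ (t.T @ Yd)
    T, U = np.hstack(T), np.hstack(U)
    # dual regression coefficients: predictions on training data are K @ alpha
    return U @ np.linalg.solve(T.T @ K @ U, T.T @ Y)

# toy usage: fit a small synthetic regression problem and check the training residual
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
Y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(40, 1))
K = rbf_kernel(X, X, gamma=0.5)
H = np.eye(40) - np.ones((40, 40)) / 40         # centring matrix
Kc, Yc = H @ K @ H, Y - Y.mean(0)
alpha = kernel_pls_fit(Kc, Yc, n_components=3)
print("relative training residual:", np.linalg.norm(Kc @ alpha - Yc) / np.linalg.norm(Yc))
```

    For discrimination, the same machinery applies with Y encoding class membership, which is the direction explored in the paper.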

  19. Molecular phylogenetics of subfamily Ornithogaloideae (Hyacinthaceae) based on nuclear and plastid DNA regions, including a new taxonomic arrangement

    PubMed Central

    Martínez-Azorín, Mario; Crespo, Manuel B.; Juan, Ana; Fay, Michael F.

    2011-01-01

    Background and Aims The taxonomic arrangement within subfamily Ornithogaloideae (Hyacinthaceae) has been a matter of controversy in recent decades: several new taxonomic treatments have been proposed, based exclusively on plastid DNA sequences, and these have resulted in classifications which are to a great extent contradictory. Some authors have recognized only a single genus Ornithogalum for the whole subfamily, including 250–300 species of variable morphology, whereas others have recognized many genera. In the latter case, the genera are inevitably much smaller and they are better defined morphologically. However, some are not monophyletic as circumscribed. Methods Phylogenetic analyses of Ornithogaloideae were based on nucleotide sequences of four plastid regions (trnL intron, trnL-F spacer, rbcL and matK) and a nuclear region (ITS). Eighty species covering all relevant taxonomic groups previously recognized in the subfamily were sampled. Parsimony and Bayesian analyses were performed. The molecular data were compared with a matrix of 34 morphological characters. Key Results Combinations of plastid and nuclear data yielded phylogenetic trees which are better resolved than those obtained with any plastid region alone or plastid regions in combination. Three main clades are found, corresponding to the previously recognized tribes Albuceae, Dipcadieae and Ornithogaleae. In these, up to 19 clades are described which are definable by morphology and biogeography. These mostly correspond to previously described taxa, though some need recircumscription. Morphological characters are assessed for their diagnostic value for taxonomy in the subfamily. Conclusions On the basis of the phylogenetic analyses, 19 monophyletic genera are accepted within Ornithogaloideae: Albuca, Avonsera, Battandiera, Cathissa, Coilonox, Dipcadi, Eliokarmos, Elsiea, Ethesia, Galtonia, Honorius, Loncomelos, Melomphis, Neopatersonia, Nicipe, Ornithogalum, Pseudogaltonia, Stellarioides and Trimelopter. Each of these has a particular syndrome of morphological characters. As a result, 105 new combinations are made and two new names are proposed to accommodate the taxa studied in the new arrangement. A short morphological diagnosis, synonymy, details of distribution and an identification key are presented. PMID:21163815

  20. Greenhouse gas emissions from the waste sector in Argentina in business-as-usual and mitigation scenarios.

    PubMed

    Santalla, Estela; Córdoba, Verónica; Blanco, Gabriel

    2013-08-01

    The objective of this work was the application of the 2006 Intergovernmental Panel on Climate Change (IPCC) Guidelines to the estimation of methane and nitrous oxide emissions from the waste sector in Argentina, as a preliminary exercise for greenhouse gas (GHG) inventory development and for comparison with previous inventories based on the 1996 IPCC Guidelines. Emission projections to 2030 were evaluated under two scenarios, business as usual (BAU) and mitigation, and the calculations were done using the IPCC software developed for this purpose. According to local activity data, in the BAU scenario methane emissions from solid waste disposal will increase by 73% by 2030 with respect to emissions in the year 2000. In the mitigation scenario, based on the recorded trend of methane captured in landfills, a decrease of 50% from the BAU scenario should be achieved by 2030. In the BAU scenario, GHG emissions from domestic wastewater will increase 63% from 2000 to 2030. Methane emissions from industrial wastewater, calculated from activity data for the dairy, swine, slaughterhouse, citrus, sugar, and wine sectors, will increase by 58% from 2000 to 2030, while methane emissions from domestic wastewater will increase 74% over the same period. Results show that GHG emissions calculated with the 2006 IPCC Guidelines are lower than those reported in previous national inventories for the solid waste disposal and domestic wastewater categories, while they are 18% higher for industrial wastewater. The implementation of the 2006 IPCC Guidelines for National Greenhouse Gas Inventories is now being considered by the UNFCCC for non-Annex I countries in order to enhance the compilation of inventories based on comparable good-practice methods. This work constitutes the first estimation of GHG emissions from the waste sector of Argentina applying the 2006 IPCC Guidelines and the accompanying software. It will contribute to identifying the main differences between the models applied in the estimation of methane emissions for the key categories of waste emission sources and to comparing results with previous inventories based on the 1996 IPCC Guidelines.
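    The 2006 Guidelines estimate landfill methane with a first-order decay (FOD) model; the sketch below reproduces that bookkeeping for a single aggregated waste stream (decomposable carbon deposited, decayed with rate constant k, converted to CH4, then reduced by recovery and oxidation). Parameter values are default-style figures shown purely for illustration; they are not Argentina's national data.

```python
# Simplified single-stream first-order-decay (FOD) bookkeeping in the spirit of the
# 2006 IPCC Guidelines waste model. Parameter values are illustrative defaults only.
import math

def fod_methane(waste_deposited,     # tonnes of MSW landfilled, one entry per year
                doc=0.15,            # degradable organic carbon fraction of the waste
                doc_f=0.5,           # fraction of DOC that actually decomposes
                mcf=1.0,             # methane correction factor (managed, anaerobic site)
                k=0.09,              # decay rate constant, 1/yr
                f_ch4=0.5,           # fraction of CH4 in the generated landfill gas
                recovery=0.0,        # tonnes of CH4 recovered per year
                ox=0.1):             # oxidation factor of the cover soil
    """Return estimated CH4 emissions (tonnes/yr), one entry per deposition year."""
    ddocm_stock = 0.0                # decomposable carbon accumulated in the landfill
    emissions = []
    for w in waste_deposited:
        decomposed = ddocm_stock * (1.0 - math.exp(-k))      # carbon decayed this year
        ddocm_stock = ddocm_stock - decomposed + w * doc * doc_f * mcf
        ch4_generated = decomposed * f_ch4 * 16.0 / 12.0     # convert carbon mass to CH4 mass
        emissions.append(max(ch4_generated - recovery, 0.0) * (1.0 - ox))
    return emissions

# constant disposal of 1 Mt/yr: emissions grow year by year toward an equilibrium set by k
print([round(e) for e in fod_methane([1_000_000] * 10)])
```

    Projection scenarios such as BAU and mitigation then differ only in the waste streams, recovery rates, and other parameters fed into this kind of calculation.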
